Compare commits


77 Commits

Author SHA1 Message Date
Conrad Irwin
8557251e41 Supermaven index fixes 2024-09-23 10:11:09 -06:00
Conrad Irwin
a36706aed6 Fix up/down project_id confusion (#18099)
Release Notes:

- ssh remoting: Fix LSP queries run over collab
2024-09-23 09:11:58 -06:00
Nathan Lovato
35a80f07e0 docs: Split vim mode documentation into two pages, edit for clarity (#17614)
Closes #17215

Release Notes:

- N/A

---

This PR builds upon the vim mode documentation page and aims to bring the
following improvements:

- Separate vim mode-specific configuration from introducing vim mode.
- Reformat some lists of provided commands and keymaps from code blocks
to sub-sections containing tables.
- Flesh out the text a little bit to make it more explicit in some
parts.
- Generally format notes and a couple of other things closer to some
other docs pages.

Checking the diff doesn't give a good idea of the changes, so here are
some before/after images as quick examples of the kinds of changes
brought by this PR.

**Introducing the key differences of Zed's vim mode**

Before


![2024-09-09_22-12](https://github.com/user-attachments/assets/447418cb-a6e6-4f9c-8d4b-6d941126979e)

After


![2024-09-09_22-16](https://github.com/user-attachments/assets/be69f2d9-c3ae-4b34-978a-344130bee37c)

---

**Zed-specific vim key bindings**

Before


![2024-09-09_22-17](https://github.com/user-attachments/assets/88fdc512-a50b-487d-85d1-5988f15c2a6f)

After


![2024-09-09_22-18](https://github.com/user-attachments/assets/3b77c2f6-0ffa-4afc-a86d-1210ac706c8c)
2024-09-23 09:01:32 -06:00
jvmncs
2ff8dde925 Use fenix toolchain in nix shell (#18227)
In #17974 we made the nix devShell explicitly depend on rustc/cargo; however,
the fenix overlay that contains the latest stable versions was not being
applied to that shell. This led to the shell inheriting whatever rustc/cargo
was on nixos-unstable from nixpkgs, which sometimes lags behind. This change
fixes that, and also restructures the flake to ensure that all outputs rely
on the overlaid `pkgs`.

Release Notes:

- N/A
2024-09-23 10:16:15 -04:00
Charlie Egan
d784e72027 docs: Add Rego language (#18217)
Release Notes:

- N/A

---------

Signed-off-by: Charlie Egan <charlieegan3@users.noreply.github.com>
Co-authored-by: Charlie Egan <charlieegan3@users.noreply.github.com>
Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-09-23 09:38:54 -04:00
moshyfawn
8a36278c95 docs: Fix long code blocks overflow (#18208)
Closes #18207

Release Notes:

- N/A

| Before | After |
|--------|-------|
| <img width="1640" alt="image" src="https://github.com/user-attachments/assets/b7c76ec9-85bd-4001-b19f-6a8e26d04254"> | <img width="1640" alt="image" src="https://github.com/user-attachments/assets/cd932e33-0429-43b1-9b4b-25df5890a3cf"> |
2024-09-23 09:59:45 -03:00
Kirill Bulatov
05d18321db Resolve completions properly (#18212)
Related to https://github.com/rust-lang/rust-analyzer/pull/18167

* Declare more completion item fields in the client completion resolve
capabilities
* Do resolve completions even if their docs are present
* Instead, do not resolve completions that could not be resolved when
handling the remote client resolve requests
* Do replace the old lsp completion data with the resolved one

Release Notes:

- Improved completion resolve mechanism
2024-09-23 12:53:57 +03:00
Junseong Park
bb7d9d3525 docs: Remove default_dock_anchor in configuring-zed.md (#18210)
Removed the deprecated option `default_dock_anchor` in
`configuring-zed.md`

Note: https://zed.dev/blog/new-panel-system

Release Notes:

- N/A
2024-09-23 06:26:01 +03:00
CharlesChen0823
75cb199a54 project: Fix typo that caused removing a worktree to not stop its LSP (#18198)
Release Notes:

- N/A
2024-09-22 19:50:51 +03:00
Junseong Park
0f4ebdfbca docs: Add missing ui_font_size option in configuring-zed.md (#18189)
Added `ui_font_size`, an option that works in the editor but is missing
from the documentation.

Release Notes:

- N/A
2024-09-22 12:15:13 +03:00
Junseong Park
37c93d8fea docs: Add missing base_keymap option in configuring-zed.md (#18190)
Added `base_keymap`, an option that works in the editor but is missing
from the documentation.

Release Notes:

- N/A
2024-09-22 12:09:35 +03:00
Junseong Park
e7fcf83ce8 docs: Fix misordered headings (#18192)
1. Raised the `Indent Guides` heading to level 2, since it is completely
unrelated to `Git`.
2. The `Git` heading now only contains `Git Gutter` and `Inline Git
Blame` as subheadings.
3. The `Indent Guides` heading is now located directly after the `Git`
heading.

Release Notes:

- N/A
2024-09-22 11:48:52 +03:00
Junseong Park
1f35c8d09d Fix tooltip of always_treat_brackets_as_autoclosed (#18191)
Fixed a bug where the `always_treat_brackets_as_autoclosed` option would
not display the message in the tooltip that appears when hovering.

Release Notes:

- N/A
2024-09-22 11:47:07 +03:00
Junseong Park
3ca18af40b docs: Fix typo in configuring-zed.md (#18178)
Fix typo in `configuring-zed.md`

Release Notes:

- N/A
2024-09-21 15:01:29 +03:00
Conrad Irwin
4f227fd3bf Use LanguageServerName in more places (#18167)
This pushes the new LanguageServerName type to more places.

As both languages and language servers were identified by `Arc<str>`, it was
sometimes hard to tell which was intended.

Release Notes:

- N/A
2024-09-20 18:51:34 -06:00
Max Brunsfeld
743feb98bc Add the ability to propose changes to a set of buffers (#18170)
This PR introduces functionality for creating *branches* of buffers that
can be used to preview and edit change sets that haven't yet been
applied to the buffers themselves.

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
Co-authored-by: Marshall <marshall@zed.dev>
2024-09-20 18:28:50 -04:00
Max Brunsfeld
e309fbda2a Add a slash command for automatically retrieving relevant context (#17972)
* [x] put this slash command behind a feature flag until we release
embedding access to the general population
* [x] choose a name for this slash command and name the rust module to
match

Release Notes:

- N/A

---------

Co-authored-by: Jason <jason@zed.dev>
Co-authored-by: Richard <richard@zed.dev>
Co-authored-by: Jason Mancuso <7891333+jvmncs@users.noreply.github.com>
Co-authored-by: Richard Feldman <oss@rtfeldman.com>
2024-09-20 18:09:18 -04:00
Roy Williams
5905fbb9ac Allow Anthropic custom models to override temperature (#18160)
Release Notes:

- Allow Anthropic custom models to override "temperature"

This also centralized the defaulting of "temperature" to be inside of
each model's `into_x` call instead of being sprinkled around the code.
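
For reference, a minimal settings sketch of what such an override might look like. The `default_temperature` field comes from this change; the surrounding `language_models.anthropic.available_models` location and the model values are assumptions for illustration:

```json
{
  "language_models": {
    "anthropic": {
      "available_models": [
        {
          "name": "claude-3-5-sonnet-20240620",
          "max_tokens": 200000,
          // Overrides the built-in default temperature of 1.0 for this custom model.
          "default_temperature": 0.7
        }
      ]
    }
  }
}
```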
2024-09-20 14:59:12 -06:00
CharlesChen0823
7d62fda5a3 file_finder: Notify user when picking a non-UTF-8 file (#18136)
Notify the user when the file finder picks a file that cannot be opened.

Release Notes:

- N/A
2024-09-20 13:49:40 -06:00
Conrad Irwin
45388805ad vim: gq (#18156)
Closes #ISSUE

Release Notes:

- vim: Added gq/gw for rewrapping lines
2024-09-20 13:02:39 -06:00
Daste
7dac5594cd file_finder: Display file icons (#18091)
This PR adds file icons (like in tabs, the project panel and tab
switcher) to the file finder popup.

It's similar to [tab_switcher
icons](https://github.com/zed-industries/zed/pull/17115), but simpler,
because we're only dealing with actual files.

Release Notes:

- Added icons to the file finder.

Screenshot:

![image](https://github.com/user-attachments/assets/bd6a54c1-cdbd-415a-9a82-0cc7a0bb6ca2)

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-09-20 14:44:13 -04:00
Bennet Bo Fenner
5d12e3ce3a preview tabs: Toggle preview tab when saving (#18158)
Release Notes:

- Saving a preview tab will now mark it as a permanent tab
2024-09-20 20:43:26 +02:00
Joseph T. Lyons
601090511b Remove system_id from all events but editor_events (#18154)
Release Notes:

- N/A
2024-09-20 13:25:06 -04:00
Marshall Bowers
8bd624b5db editor: Remove unneeded blank lines in rewrap test cases (#18152)
This PR removes some unneeded blank lines from some of the test cases
for `editor::Rewrap`.

These weren't meaningful to the test, and their presence could be
confusing.

Release Notes:

- N/A
2024-09-20 13:06:43 -04:00
jvmncs
9f6ff29a54 Reuse OpenAI low_speed_timeout setting for zed.dev provider (#18144)
Release Notes:

- N/A
2024-09-20 12:57:35 -04:00
jvmncs
d97427f69e chore: Update flake inputs (#18150)
Release Notes:

- N/A
2024-09-20 12:48:48 -04:00
狐狸
99bef27300 Add escape string highlights to JSON and JSONC files (#18138)
Release Notes:

- Added escape string highlights to JSON and JSONC files
2024-09-20 12:20:14 -04:00
Peter Tripp
f8195c41e0 docs: Switch proxy example to socks5h not socks5 (#18142)
You very rarely want local DNS resolution when a SOCKS proxy is configured.
`socks5` resolves DNS locally; `socks5h` resolves it on the proxy (remote DNS).
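
A minimal settings sketch of the recommended form (host and port are placeholders):

```json
{
  // socks5h: DNS is resolved by the proxy rather than locally.
  "proxy": "socks5h://localhost:10808"
}
```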
2024-09-20 11:52:57 -04:00
Marshall Bowers
759646e0a3 editor: Improve rewrapping when working with comments at different indentation levels (#18146)
This PR improves the `editor::Rewrap` command when working with comments
that were not all at the same indentation level.

We now use a heuristic of finding the most common indentation level for
each line, using the deepest indent in the event of a tie.
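
A minimal sketch of that heuristic, not the actual implementation, assuming indentation is measured as leading whitespace columns:

```rust
use std::collections::HashMap;

/// Picks the most common indentation among the given lines,
/// preferring the deepest indent when counts tie.
fn dominant_indent(lines: &[&str]) -> usize {
    let mut counts: HashMap<usize, usize> = HashMap::new();
    for line in lines {
        let indent = line.len() - line.trim_start().len();
        *counts.entry(indent).or_insert(0) += 1;
    }
    counts
        .into_iter()
        // Compare by count first, then by indent depth to break ties.
        .max_by_key(|&(indent, count)| (count, indent))
        .map(|(indent, _)| indent)
        .unwrap_or(0)
}

fn main() {
    let lines = ["    // a", "    // b", "        // c"];
    assert_eq!(dominant_indent(&lines), 4);
}
```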

It also removes an `.unwrap()` that would previously lead to a panic in
this case. Instead of unwrapping we now log an error to the logs and
skip rewrapping for that selection.

Release Notes:

- Improved the behavior of `editor: rewrap` when working with a
selection that contained comments at different indentation levels.
2024-09-20 11:45:03 -04:00
Marshall Bowers
ab1d466c5f Remove replica_id from MultiBuffers (#18141)
This PR removes the `replica_id` field from the `MultiBuffer` struct.

We were only ever referencing this field to pass when constructing a
`MultiBuffer`, and never used it outside of that.

Release Notes:

- N/A
2024-09-20 10:48:27 -04:00
Richard Feldman
5f1046b3cd Make evals handle failures more gracefully (#18082)
Now when an individual project eval fails, instead of panicking we add
it to a list of failures that we collect and report at the end (and make
the exit code nonzero).
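
A rough sketch of the pattern; the names and error text are illustrative, not the eval crate's actual API:

```rust
fn main() {
    // Hypothetical list of evals; in the real binary these come from the project.
    let evals = vec!["eval_a", "eval_b", "eval_c"];
    let mut failures: Vec<(String, String)> = Vec::new();

    for name in evals {
        // run_eval stands in for running a single project eval.
        if let Err(error) = run_eval(name) {
            // Record the failure instead of panicking, then keep going.
            failures.push((name.to_string(), error));
        }
    }

    if !failures.is_empty() {
        eprintln!("{} eval(s) failed:", failures.len());
        for (name, error) in &failures {
            eprintln!("  {name}: {error}");
        }
        std::process::exit(1);
    }
}

fn run_eval(name: &str) -> Result<(), String> {
    // Placeholder failure so the example exercises the reporting path.
    if name == "eval_b" {
        Err("simulated failure".into())
    } else {
        Ok(())
    }
}
```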

Release Notes:

- N/A
2024-09-20 10:28:22 -04:00
Peter Tripp
d6c184b494 Detect 'MD' extension as Markdown (#18135) 2024-09-20 09:23:11 -04:00
Marshall Bowers
16d2afc662 ci: Bump nightly tag on scheduled Nightly builds (#18134)
This PR makes it so after a scheduled Nightly build we also update the
`nightly` tag to keep things in sync.

It's safe to bump the tag within this Action, as it won't trigger
another Nightly build due to GitHub's recursive Action protections:

> When you use the repository's `GITHUB_TOKEN` to perform tasks, events
triggered by the `GITHUB_TOKEN`, with the exception of
`workflow_dispatch` and `repository_dispatch`, will not create a new
workflow run. This prevents you from accidentally creating recursive
workflow runs. For example, if a workflow run pushes code using the
repository's `GITHUB_TOKEN`, a new workflow will not run even when the
repository contains a workflow configured to run when `push` events
occur.
>
> —
[source](https://docs.github.com/en/actions/security-for-github-actions/security-guides/automatic-token-authentication#using-the-github_token-in-a-workflow)

Release Notes:

- N/A
2024-09-20 08:46:23 -04:00
Thorsten Ball
90a12f5564 ssh remoting: Do not double-register LspAdapters (#18132)
This fixes the bug with hover tooltips appearing multiple times.

It turns out that every time we received the `CreateLanguageServer` message
we'd add a new adapter but only have a single server running for all of them.

And we send a `CreateLanguageServer` message every time you open a
buffer.

What this does is to only add a new adapter if it hasn't already been
registered, which is also what we do locally.
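
A simplified sketch of the guard, assuming adapters are keyed by language server name (the real code lives in the ssh remoting layer):

```rust
use std::collections::HashMap;

struct Adapter;

struct Registry {
    adapters: HashMap<String, Adapter>,
}

impl Registry {
    /// Only registers an adapter the first time a given server name is seen,
    /// so repeated CreateLanguageServer messages become no-ops.
    fn register_if_absent(&mut self, server_name: &str) {
        self.adapters
            .entry(server_name.to_string())
            .or_insert_with(|| Adapter);
    }
}

fn main() {
    let mut registry = Registry { adapters: HashMap::new() };
    registry.register_if_absent("rust-analyzer");
    registry.register_if_absent("rust-analyzer"); // second call does nothing
    assert_eq!(registry.adapters.len(), 1);
}
```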


Release Notes:

- N/A
2024-09-20 14:35:45 +02:00
Marshall Bowers
ca033e6475 Revert "Update nightly tag every night (#17879)" (#18133)
This PR reverts #17879, as it wasn't working.

When a GitHub Action pushes a tag, it does not trigger workflows for
push events for that tag:

> When you use the repository's `GITHUB_TOKEN` to perform tasks, events
triggered by the `GITHUB_TOKEN`, with the exception of
`workflow_dispatch` and `repository_dispatch`, will not create a new
workflow run. This prevents you from accidentally creating recursive
workflow runs. For example, if a workflow run pushes code using the
repository's `GITHUB_TOKEN`, a new workflow will not run even when the
repository contains a workflow configured to run when `push` events
occur.
>
> —
[source](https://docs.github.com/en/actions/security-for-github-actions/security-guides/automatic-token-authentication#using-the-github_token-in-a-workflow)

This reverts commit 761129e373.

Release Notes:

- N/A
2024-09-20 08:35:13 -04:00
Thorsten Ball
97708fdf43 settings: Follow-up fix to show more errors (#18123)
The condition added in #18122 was too strict.


Release Notes:

- N/A
2024-09-20 11:10:19 +02:00
Thorsten Ball
ace4d5185d settings: Show notification when user/project settings fail to parse (#18122)
Closes #16876

We only ever showed parsing errors, but not if something failed to
deserialize.

Basically, if you had a stray `,` somewhere, we'd show a notification
for user errors, but only squiggly lines if you had a `[]` instead of a
`{}`.

The squiggly lines would only show up when there were schema errors.

In the case of `formatter` settings, for example, if someone put in a
`{}` instead of `[]`, we'd never show anything.

With this change we always show a notification if parsing user or
project settings fails.

(Right now, the error message might still be bad, but that's a separate
change)


Release Notes:

- Added a notification to warn users if their user settings or
project-local settings failed to deserialize.

Demo:


https://github.com/user-attachments/assets/e5c48165-f2f7-4b5c-9c6d-6ea74f678683
2024-09-20 10:53:06 +02:00
Thorsten Ball
93730983dd ssh remoting: Restore items/buffers when opening SSH project (#18083)
Demo:


https://github.com/user-attachments/assets/ab79ed0d-13a6-4ae7-8e76-6365fc322ec4



Release Notes:

- N/A

Co-authored-by: Bennet <bennet@zed.dev>
2024-09-20 08:04:49 +02:00
Thorsten Ball
579267f399 docs: Update JavaScript docs and remove TBDs (#17989)
Release Notes:

- N/A
2024-09-20 08:04:26 +02:00
Stanislav Alekseev
8103ac12bf ssh-remoting: Tidy up the code a bit after #18094 (#18102)
Release Notes:

- N/A
2024-09-19 21:36:50 -06:00
Antonio Scandurra
15b4130fa5 Introduce the ability to cycle between alternative inline assists (#18098)
Release Notes:

- Added a new `assistant.inline_alternatives` setting to configure
additional models that will be used to perform inline assists in
parallel.
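
A minimal settings sketch, assuming each alternative is given as a provider/model pair (which is how the new setting is read in this change); the model names are placeholders:

```json
{
  "assistant": {
    // Models tried in parallel for inline assists, in addition to the default model.
    "inline_alternatives": [
      { "provider": "anthropic", "model": "claude-3-5-sonnet-20240620" },
      { "provider": "openai", "model": "gpt-4o" }
    ]
  }
}
```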

---------

Co-authored-by: Nathan <nathan@zed.dev>
Co-authored-by: Roy <roy@anthropic.com>
Co-authored-by: Adam <wolffiex@anthropic.com>
2024-09-19 17:50:00 -06:00
Nathan Sobo
ae34872f73 Fix prompt reloading in dev mode (#18095)
I think I nulled out the repo path to test the non-dev-mode case and
then forgot to re-enable it 🤦‍♂️.

Release Notes:

- N/A
2024-09-19 17:49:22 -06:00
Joseph T. Lyons
740803d745 Bump release_notes to v2 endpoint (#18108)
Partially addresses https://github.com/zed-industries/zed/issues/17527

<img width="1608" alt="SCR-20240919-rcik"
src="https://github.com/user-attachments/assets/25057731-7da6-4b36-b51b-021c67e8736b">

Release Notes:

- Enhanced the `auto update: view release notes locally` feature to
display release notes for each patch version associated with the
installed minor version.
2024-09-19 19:43:32 -04:00
Conrad Irwin
edf2c19250 Hide GPU problems from Slack (#18087)
Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-09-19 17:28:30 -04:00
Peter Tripp
82e6b1e0e5 docs: Update glibc requirements for current binaries (#18101) 2024-09-19 17:22:11 -04:00
Roy Williams
28a54ce122 Add diagnostic information to context of inline assistant (#18096)
Release Notes:

- Added Diagnostic information to inline assistant. This enables users
to just say "Fix this" and have the model know what the errors are.
2024-09-19 14:16:01 -06:00
Conrad Irwin
fbbf0393cb ssh-remoting: Fix go to definition out of worktree (#18094)
Release Notes:

- ssh-remoting: Fixed go to definition outside of worktree

---------

Co-authored-by: Mikayla <mikayla@zed.dev>
2024-09-19 14:04:46 -06:00
David Soria Parra
00b1c81c9f context_servers: Remove context_type from ResourceContent (#18097)
This was removed from the protocol.

Release Notes:

- N/A
2024-09-19 15:51:48 -04:00
Joseph T. Lyons
27c1106fad Fix bug where copying from assistant panel appends extra newline to clipboard (#18090)
Closes https://github.com/zed-industries/zed/issues/17661

Release Notes:

- Fixed a bug where copying from the assistant panel appended an
additional newline to the end of the clipboard contents.
2024-09-19 13:26:14 -04:00
Marshall Bowers
1fc391f696 Make Buffer::apply_ops infallible (#18089)
This PR makes the `Buffer::apply_ops` method infallible for
`text::Buffer` and `language::Buffer`.

We discovered that `text::Buffer::apply_ops` was only fallible due to
`apply_undo`, which didn't actually need to be fallible.

Release Notes:

- N/A
2024-09-19 13:14:15 -04:00
Nate Butler
8074fba76b Update List to support UI Density (#18079)
Tracking issue: #18078

Improve UI Density support for List.

UI density is an unstable feature. You can read more about it in the
above issue!

| Before Normal - Before Dense - After Normal - After Dense |
|--------------------------------------------------------|
| ![Group 8](https://github.com/user-attachments/assets/bb896fcf-e4a6-4776-9308-1405906d2dbe) |

| Before Normal - Before Dense - After Normal - After Dense |
|--------------------------------------------------------|
| ![Group 9](https://github.com/user-attachments/assets/00815a1b-071b-4d02-96bc-36bf37b5ae8b) |

Release Notes:

- N/A
2024-09-19 12:31:40 -04:00
Junkui Zhang
ac0d5d3152 windows: Fix regional indicator symbols broken (#18053)
Closes #18027


Unlike on macOS, not all glyphs in color fonts are color glyphs; for example,
`🇩🇪` in `Segoe UI Emoji` is not. As a result, attempting to retrieve color
information for these glyphs can cause an error, preventing the glyph
from being rendered.

This PR addresses the issue by setting the `is_emoji` variable to
`false` for non-color glyphs within color fonts.



Release Notes:

- N/A
2024-09-19 10:19:13 -06:00
renovate[bot]
c3bdc1c178 Update Rust crate ignore to v0.4.23 (#18044)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [ignore](https://redirect.github.com/BurntSushi/ripgrep/tree/master/crates/ignore) ([source](https://redirect.github.com/BurntSushi/ripgrep/tree/HEAD/crates/ignore)) | workspace.dependencies | patch | `0.4.22` -> `0.4.23` |

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-09-19 10:18:14 -06:00
renovate[bot]
ce4f07bd3c Update Rust crate globset to v0.4.15 (#18042)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [globset](https://redirect.github.com/BurntSushi/ripgrep/tree/master/crates/globset) ([source](https://redirect.github.com/BurntSushi/ripgrep/tree/HEAD/crates/globset)) | workspace.dependencies | patch | `0.4.14` -> `0.4.15` |

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-09-19 10:16:31 -06:00
renovate[bot]
157c57aa8d Update Rust crate clap to v4.5.17 (#18041)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [clap](https://redirect.github.com/clap-rs/clap) | workspace.dependencies | patch | `4.5.16` -> `4.5.17` |

---

### Release Notes

<details>
<summary>clap-rs/clap (clap)</summary>

### [`v4.5.17`](https://redirect.github.com/clap-rs/clap/blob/HEAD/CHANGELOG.md#4517---2024-09-04)

[Compare Source](https://redirect.github.com/clap-rs/clap/compare/v4.5.16...v4.5.17)

##### Fixes

-   *(help)* Style required argument groups
-   *(derive)* Improve error messages when unsupported fields are used

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-09-19 10:15:46 -06:00
renovate[bot]
6670c9eb3b Update Rust crate backtrace to v0.3.74 (#18039)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [backtrace](https://redirect.github.com/rust-lang/backtrace-rs) | dependencies | patch | `0.3.73` -> `0.3.74` |
| [backtrace](https://redirect.github.com/rust-lang/backtrace-rs) | dev-dependencies | patch | `0.3.73` -> `0.3.74` |

---

### Release Notes

<details>
<summary>rust-lang/backtrace-rs (backtrace)</summary>

### [`v0.3.74`](https://redirect.github.com/rust-lang/backtrace-rs/releases/tag/0.3.74)

[Compare Source](https://redirect.github.com/rust-lang/backtrace-rs/compare/0.3.73...0.3.74)

#### What's Changed

- QNX Neutrino 7.0 support, thanks to
[@&#8203;nyurik](https://redirect.github.com/nyurik) in
[https://github.com/rust-lang/backtrace-rs/pull/648](https://redirect.github.com/rust-lang/backtrace-rs/pull/648)
- Cleaned up our Android support. This should massively improve
backtraces for ones with the API level sufficient to ship with
libunwind, etc. Unfortunately, it comes at the cost of dropping support
for older ones! Thanks to
[@&#8203;fengys](https://redirect.github.com/fengys) in
[https://github.com/rust-lang/backtrace-rs/pull/656](https://redirect.github.com/rust-lang/backtrace-rs/pull/656)
- Made PrintFmt, which was using the `Enum::__NonExhaustiveVariant`
pattern, use `#[non_exhaustive]` for real. Don't @&#8203; me if you were
matching on that! Thanks to
[@&#8203;nyurik](https://redirect.github.com/nyurik) in
[https://github.com/rust-lang/backtrace-rs/pull/651](https://redirect.github.com/rust-lang/backtrace-rs/pull/651)
- Massively cleaned up the windows code! We moved from winapi to
windows-sys with windows-targets thanks to
[@&#8203;CraftSpider](https://redirect.github.com/CraftSpider) and
[@&#8203;ChrisDenton](https://redirect.github.com/ChrisDenton) in
- Don't cast HANDLE to usize and back by
[@&#8203;CraftSpider](https://redirect.github.com/CraftSpider) in
[https://github.com/rust-lang/backtrace-rs/pull/635](https://redirect.github.com/rust-lang/backtrace-rs/pull/635)
- Switch from `winapi` to `windows-sys` by
[@&#8203;CraftSpider](https://redirect.github.com/CraftSpider) in
[https://github.com/rust-lang/backtrace-rs/pull/641](https://redirect.github.com/rust-lang/backtrace-rs/pull/641)
- Update windows bindings and use windows-targets by
[@&#8203;ChrisDenton](https://redirect.github.com/ChrisDenton) in
[https://github.com/rust-lang/backtrace-rs/pull/653](https://redirect.github.com/rust-lang/backtrace-rs/pull/653)
- A bunch of updated dependencies. Thanks
[@&#8203;djc](https://redirect.github.com/djc) and
[@&#8203;khuey](https://redirect.github.com/khuey)!
- Sorry if you were testing this code in miri! It started yelling about
sussy casts. A lot. We did a bunch of internal cleanups that should make
it quiet down, thanks to
[@&#8203;workingjubilee](https://redirect.github.com/workingjubilee) in
[https://github.com/rust-lang/backtrace-rs/pull/641](https://redirect.github.com/rust-lang/backtrace-rs/pull/641)
- Uhhh we had to tweak `dl_iterate_phdr` in
[https://github.com/rust-lang/backtrace-rs/pull/660](https://redirect.github.com/rust-lang/backtrace-rs/pull/660)
after Android revealed it was... kind of unsound actually and not doing
things like checking for null pointers before making slices! WHOOPS!
Thanks to [@&#8203;saethlin](https://redirect.github.com/saethlin) for
implementing detection for precisely that in rustc! It's really hard to
find soundness issues in inherited codebases like this one...

#### New Contributors

- [@&#8203;CraftSpider](https://redirect.github.com/CraftSpider) made
their first contribution in
[https://github.com/rust-lang/backtrace-rs/pull/635](https://redirect.github.com/rust-lang/backtrace-rs/pull/635)
- [@&#8203;fengys1996](https://redirect.github.com/fengys1996) made
their first contribution in
[https://github.com/rust-lang/backtrace-rs/pull/656](https://redirect.github.com/rust-lang/backtrace-rs/pull/656)
- [@&#8203;djc](https://redirect.github.com/djc) made their first
contribution in
[https://github.com/rust-lang/backtrace-rs/pull/657](https://redirect.github.com/rust-lang/backtrace-rs/pull/657)

**Full Changelog**: https://github.com/rust-lang/backtrace-rs/compare/0.3.73...0.3.74

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-09-19 10:15:31 -06:00
renovate[bot]
3986bcf9dc Update Rust crate async-trait to v0.1.82 (#18038)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [async-trait](https://redirect.github.com/dtolnay/async-trait) | workspace.dependencies | patch | `0.1.81` -> `0.1.82` |

---

### Release Notes

<details>
<summary>dtolnay/async-trait (async-trait)</summary>

### [`v0.1.82`](https://redirect.github.com/dtolnay/async-trait/releases/tag/0.1.82)

[Compare Source](https://redirect.github.com/dtolnay/async-trait/compare/0.1.81...0.1.82)

- Prevent elided_named_lifetimes lint being produced in generated code
([#&#8203;276](https://redirect.github.com/dtolnay/async-trait/issues/276))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-09-19 10:14:37 -06:00
Conrad Irwin
713b39bac0 Auto deploy collab staging daily (#18085)
This should keep us from breaking the collab build and not noticing for a
month.

Release Notes:

- N/A
2024-09-19 10:13:55 -06:00
Peter Tripp
3fd690ade4 docs: Update lsp.settings examples for yaml-language-server (#18081) 2024-09-19 12:00:13 -04:00
Thorsten Ball
e9f2e72ff0 Workspace persistence for SSH projects (#17996)
TODOs:

- [x] Add tests to `workspace/src/persistence.rs`
- [x] Add an icon for ssh projects
- [x] Fix all `TODO` comments
- [x] Use `port` if it's passed in the ssh connection options

In next PRs:
- Make sure unsaved buffers are persisted/restored, along with other
items/layout
- Handle multiple paths/worktrees correctly


Release Notes:

- N/A

---------

Co-authored-by: Bennet Bo Fenner <bennet@zed.dev>
2024-09-19 17:51:28 +02:00
Marshall Bowers
7d0a7541bf ci: Fix collab deploys (#18077)
This PR fixes issues with deploying collab.

We reverted 4882a75971abafa89467e779466749086d7d3f96—as the DigitalOcean
runners are gone now—and moved back to BuildJet.

We needed to make some changes to the deployment jobs to setup `doctl`.

This PR also adds an automatic bump of the `collab-staging` tag on
merges to `main`. This should help catch issues with collab deploys
earlier.

Release Notes:

- N/A

---------

Co-authored-by: Conrad <conrad@zed.dev>
2024-09-19 11:45:06 -04:00
Joseph T Lyons
a944bb2f24 v0.155.x dev 2024-09-19 11:02:44 -04:00
Piotr Osiewicz
d2894ce9c9 pane: Do not autopin new item created as a neighbour of pinned tab (#18072)
When I used editor::NewFile or ProjectSearch from a pinned tab, the
resulting new tab would be pinned (and the last pinned tab would be
pushed off). This PR fixes it by always storing new tabs outside of the
pinned area if there's no destination index for the new tab.

Release Notes:

- Fixed tab bar not preserving pinned tab state when an editor::NewFile
action is executed.
2024-09-19 17:00:26 +02:00
CharlesChen0823
d91e62524f assistant: Fix offset calculation not in char boundary (#18069)
Closes #17825 

Release Notes:

- N/A
2024-09-19 08:41:42 -06:00
Marshall Bowers
3d5c023fda ci: Move collab deploys back to DigitalOcean runners (#18071)
This PR moves the collab deployment steps in CI back to the DigitalOcean
runners temporarily, so that we can deploy collab.

Release Notes:

- N/A
2024-09-19 09:55:51 -04:00
Casey Watson
4338ff6be4 terminal: Add ability to open file from Git diff (#17446)
- strip "a/" and "b/" prefix for potential paths.

Release Notes:

- Allow clicking on filepaths when using `git diff` inside the built-in
terminal
2024-09-19 15:01:28 +02:00
Thorsten Ball
23e1faa485 assistant panel: Fix copying code when trailing newline is missing (#18067)
Follow-up to #17853.

Apparently tree-sitter-md extends the range of the content node to
include the backticks when there is no newline.

Release Notes:

- N/A

Co-authored-by: Bennet <bennet@zed.dev>
2024-09-19 14:43:56 +02:00
thataboy
1723713dc2 Add ability to copy assistant code block to clipboard or insert into editor, without manual selection (#17853)
Some notes:

- You can put the cursor on the starting or ending line of triple backticks;
it doesn't actually have to be inside the block.
- Placing the cursor outside of a code block does nothing.
- Code blocks are determined by counting pairs of triple backticks from
either the start or end of the buffer, and nothing else (see the sketch
after this list).
- If you manually select something, the selection takes precedence over
any code blocks.
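
A self-contained sketch of that counting approach, operating on plain text rather than the tree-sitter-md syntax tree the editor actually uses:

```rust
/// Returns the (start_line, end_line) of the fenced code block containing
/// `cursor_line`, counting fence lines from the start of the buffer.
/// The cursor may sit on either fence line itself.
fn surrounding_code_block(text: &str, cursor_line: usize) -> Option<(usize, usize)> {
    let mut open: Option<usize> = None;
    for (i, line) in text.lines().enumerate() {
        if line.trim_start().starts_with("```") {
            match open {
                None => open = Some(i),
                Some(start) => {
                    if cursor_line >= start && cursor_line <= i {
                        return Some((start, i));
                    }
                    open = None;
                }
            }
        }
    }
    None
}

fn main() {
    let text = "intro\n```rust\nlet x = 1;\n```\noutro";
    assert_eq!(surrounding_code_block(text, 2), Some((1, 3)));
    assert_eq!(surrounding_code_block(text, 0), None);
}
```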

Release Notes:

- Added the ability to copy surrounding code blocks in the assistant
panel into the clipboard, or inserting them directly into the editor,
without manually selecting. Place cursor anywhere in a code block
(marked by triple backticks) and use the `assistant::CopyCode` action
(`cmd-k c` / `ctrl-k c`) to copy to the clipboard, or the
`assistant::InsertIntoEditor` action (`cmd-<` / `ctrl-<`) to insert into
editor.

---------

Co-authored-by: Thorsten Ball <mrnugget@gmail.com>
Co-authored-by: Bennet <bennet@zed.dev>
2024-09-19 13:43:49 +02:00
Joseph T. Lyons
ca4980df02 Add system_id (#18040)
This PR adds `system_id` to telemetry, which is contained within a new
`global` database (accessible by any release channel of Zed on a single
system). This will help us get a more accurate understanding of user
count, instead of relying on `installation_id`, which is different per
release channel. This doesn't solve the problem of a user with multiple
machines, but it gets us closer.

Release Notes:

- N/A
2024-09-19 07:20:27 -04:00
Danilo Leal
5e6d1814e5 Add stray UI tweaks on the task picker (#18059)
This PR adds tiny UI tweaks to the task picker. Just making sure it is
consistent with other pickers throughout Zed.

| Before | After |
|--------|--------|
| <img width="577" alt="Screenshot 2024-09-19 at 12 07 44 PM" src="https://github.com/user-attachments/assets/c0f010a3-6e08-47ee-9997-9df2b203977c"> | <img width="577" alt="Screenshot 2024-09-19 at 12 07 09 PM" src="https://github.com/user-attachments/assets/74baf191-4dd9-4765-b2fd-2390d4cb31c6"> |

Release Notes:

- N/A
2024-09-19 07:22:10 -03:00
Piotr Osiewicz
1b612108ba linux: Fix invalid check for denylisted dependencies (#18050)
Closes #ISSUE

Release Notes:

- N/A
2024-09-19 11:40:01 +02:00
hekmyr
c3f47b8040 vim: Fix increment/decrement command (#17644)
Improves the vim increment and decrement commands.

Closes: #16672

## Release Notes:

- vim: Improved edge-case handling for ctrl-a/ctrl-x

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2024-09-18 18:28:31 -06:00
Piotr Osiewicz
43e005e936 chore: Remove commented out code following 15446 (#18047)
Closes #ISSUE

Release Notes:

- N/A
2024-09-19 02:19:58 +02:00
Conrad Irwin
b43b800a54 More assistant events (#18032)
Release Notes:

- N/A
2024-09-18 18:07:39 -06:00
Marshall Bowers
eef44aff7f extension: Re-enable test_extension_store_with_test_extension test (#18046)
The `test_extension_store_with_test_extension` test was disabled in
#15446, which got merged before re-enabling the test.

This PR re-enables that test.

Release Notes:

- N/A
2024-09-18 19:48:34 -04:00
Max Brunsfeld
106ca5076f Fix leak of LMDB connection in semantic index (#17992)
Apparently, to close LMDB's file descriptors when using the `heed`
library, you need to explicitly call `prepare_for_closing`.
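
A minimal sketch of the pattern under heed's API as I understand it; treat the exact method names and signatures as an assumption:

```rust
use heed::Env;

/// Explicitly releases the LMDB file descriptors held by `env`.
/// Dropping the Env alone is not enough; prepare_for_closing must be called.
fn close_lmdb(env: Env) {
    // Consumes the Env and returns an event that can be waited on until
    // every clone of the environment has actually been closed.
    let closing_event = env.prepare_for_closing();
    closing_event.wait();
}
```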

Release Notes:

- N/A

---------

Co-authored-by: Richard Feldman <oss@rtfeldman.com>
Co-authored-by: Jason <jason@zed.dev>
2024-09-18 16:43:59 -07:00
Marshall Bowers
2cd9a88f53 Clean up after isahc_http_client introduction (#18045)
This PR does some cleanup after #15446.

Release Notes:

- N/A
2024-09-18 19:39:15 -04:00
174 changed files with 5894 additions and 2143 deletions

View File

@@ -1,12 +1,12 @@
name: Update Nightly Tag
name: Bump collab-staging Tag
on:
schedule:
# Fire every day at 7:00am UTC (Roughly before EU workday and after US workday)
- cron: "0 7 * * *"
# Fire every day at 16:00 UTC (At the start of the US workday)
- cron: "0 16 * * *"
jobs:
update-nightly-tag:
update-collab-staging-tag:
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
steps:
@@ -15,9 +15,9 @@ jobs:
with:
fetch-depth: 0
- name: Update nightly tag
- name: Update collab-staging tag
run: |
git config user.name github-actions
git config user.email github-actions@github.com
git tag -f nightly
git push origin nightly --force
git tag -f collab-staging
git push origin collab-staging --force

View File

@@ -8,7 +8,6 @@ on:
env:
DOCKER_BUILDKIT: 1
DIGITALOCEAN_ACCESS_TOKEN: ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}
jobs:
style:
@@ -63,8 +62,10 @@ jobs:
runs-on:
- buildjet-16vcpu-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> $GITHUB_PATH
- name: Install doctl
uses: digitalocean/action-doctl@v2
with:
token: ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}
- name: Sign into DigitalOcean docker registry
run: doctl registry login
@@ -91,6 +92,16 @@ jobs:
- buildjet-16vcpu-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4
with:
clean: false
- name: Install doctl
uses: digitalocean/action-doctl@v2
with:
token: ${{ secrets.DIGITALOCEAN_ACCESS_TOKEN }}
- name: Sign into Kubernetes
run: doctl kubernetes cluster kubeconfig save --expiry-seconds 600 ${{ secrets.CLUSTER_NAME }}

View File

@@ -1,6 +1,9 @@
name: Release Nightly
on:
schedule:
# Fire every day at 7:00am UTC (Roughly before EU workday and after US workday)
- cron: "0 7 * * *"
push:
tags:
- "nightly"
@@ -168,3 +171,28 @@ jobs:
- name: Upload Zed Nightly
run: script/upload-nightly linux-targz
update-nightly-tag:
name: Update nightly tag
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
needs:
- bundle-mac
- bundle-linux-x86
- bundle-linux-arm
steps:
- name: Checkout repo
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4
with:
fetch-depth: 0
- name: Update nightly tag
run: |
if [ "$(git rev-parse nightly)" = "$(git rev-parse HEAD)" ]; then
echo "Nightly tag already points to current commit. Skipping tagging."
exit 0
fi
git config user.name github-actions
git config user.email github-actions@github.com
git tag -f nightly
git push origin nightly --force

Cargo.lock generated
View File

@@ -21,11 +21,11 @@ dependencies = [
[[package]]
name = "addr2line"
version = "0.22.0"
version = "0.24.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6e4503c46a5c0c7844e948c9a4d6acd9f50cccb4de1c48eb9e291ea17470c678"
checksum = "f5fb1d8e4442bd405fdfd1dacb42792696b0cf9cb15882e5d097b742a676d375"
dependencies = [
"gimli",
"gimli 0.31.0",
]
[[package]]
@@ -402,6 +402,7 @@ dependencies = [
"indoc",
"language",
"language_model",
"languages",
"log",
"markdown",
"menu",
@@ -436,6 +437,7 @@ dependencies = [
"text",
"theme",
"toml 0.8.19",
"tree-sitter-md",
"ui",
"unindent",
"util",
@@ -892,9 +894,9 @@ dependencies = [
[[package]]
name = "async-trait"
version = "0.1.81"
version = "0.1.82"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6e0c28dcc82d7c8ead5cb13beb15405b57b8546e93215673ff8ca0349a028107"
checksum = "a27b8a3a6e1a44fa4c8baf1f653e4172e81486d4941f2237e20dc2d0cf4ddff1"
dependencies = [
"proc-macro2",
"quote",
@@ -1491,17 +1493,17 @@ dependencies = [
[[package]]
name = "backtrace"
version = "0.3.73"
version = "0.3.74"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5cc23269a4f8976d0a4d2e7109211a419fe30e8d88d677cd60b6bc79c5732e0a"
checksum = "8d82cb332cdfaed17ae235a638438ac4d4839913cc2af585c3c6746e8f8bee1a"
dependencies = [
"addr2line",
"cc",
"cfg-if",
"libc",
"miniz_oxide 0.7.4",
"miniz_oxide 0.8.0",
"object",
"rustc-demangle",
"windows-targets 0.52.6",
]
[[package]]
@@ -2280,9 +2282,9 @@ dependencies = [
[[package]]
name = "clap"
version = "4.5.16"
version = "4.5.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ed6719fffa43d0d87e5fd8caeab59be1554fb028cd30edc88fc4369b17971019"
checksum = "3e5a21b8495e732f1b3c364c9949b201ca7bae518c502c80256c96ad79eaf6ac"
dependencies = [
"clap_builder",
"clap_derive",
@@ -2290,9 +2292,9 @@ dependencies = [
[[package]]
name = "clap_builder"
version = "4.5.15"
version = "4.5.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "216aec2b177652e3846684cbfe25c9964d18ec45234f0f5da5157b207ed1aab6"
checksum = "8cf2dd12af7a047ad9d6da2b6b249759a22a7abc0f474c1dae1777afa4b21a73"
dependencies = [
"anstream",
"anstyle",
@@ -3081,7 +3083,7 @@ dependencies = [
"cranelift-control",
"cranelift-entity",
"cranelift-isle",
"gimli",
"gimli 0.29.0",
"hashbrown 0.14.5",
"log",
"regalloc2",
@@ -4324,6 +4326,7 @@ dependencies = [
"ctor",
"editor",
"env_logger",
"file_icons",
"futures 0.3.30",
"fuzzy",
"gpui",
@@ -4331,7 +4334,9 @@ dependencies = [
"menu",
"picker",
"project",
"schemars",
"serde",
"serde_derive",
"serde_json",
"settings",
"text",
@@ -4871,6 +4876,12 @@ dependencies = [
"stable_deref_trait",
]
[[package]]
name = "gimli"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "32085ea23f3234fc7846555e85283ba4de91e21016dc0455a16286d87a292d64"
[[package]]
name = "git"
version = "0.1.0"
@@ -4938,9 +4949,9 @@ checksum = "d2fabcfbdc87f4758337ca535fb41a6d701b65693ce38287d856d1674551ec9b"
[[package]]
name = "globset"
version = "0.4.14"
version = "0.4.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "57da3b9b5b85bd66f31093f8c408b90a74431672542466497dcbdfdc02034be1"
checksum = "15f1ce686646e7f1e19bf7d5533fe443a45dbfb990e00629110797578b42fb19"
dependencies = [
"aho-corasick",
"bstr",
@@ -5680,9 +5691,9 @@ dependencies = [
[[package]]
name = "ignore"
version = "0.4.22"
version = "0.4.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b46810df39e66e925525d6e38ce1e7f6e1d208f72dc39757880fcb66e2c58af1"
checksum = "6d89fd380afde86567dfba715db065673989d6253f42b88179abd3eae47bda4b"
dependencies = [
"crossbeam-deque",
"globset",
@@ -6277,6 +6288,7 @@ dependencies = [
"http_client",
"image",
"inline_completion_button",
"isahc",
"language",
"log",
"menu",
@@ -6466,7 +6478,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4979f22fdb869068da03c9f7528f8297c6fd2606bc3a4affe42e6a823fdb8da4"
dependencies = [
"cfg-if",
"windows-targets 0.52.6",
"windows-targets 0.48.5",
]
[[package]]
@@ -7043,7 +7055,6 @@ dependencies = [
"ctor",
"env_logger",
"futures 0.3.30",
"git",
"gpui",
"itertools 0.13.0",
"language",
@@ -13106,7 +13117,7 @@ dependencies = [
"cranelift-frontend",
"cranelift-native",
"cranelift-wasm",
"gimli",
"gimli 0.29.0",
"log",
"object",
"target-lexicon",
@@ -13126,7 +13137,7 @@ dependencies = [
"cpp_demangle",
"cranelift-bitset",
"cranelift-entity",
"gimli",
"gimli 0.29.0",
"indexmap 2.4.0",
"log",
"object",
@@ -13240,7 +13251,7 @@ checksum = "2a25199625effa4c13dd790d64bd56884b014c69829431bfe43991c740bd5bc1"
dependencies = [
"anyhow",
"cranelift-codegen",
"gimli",
"gimli 0.29.0",
"object",
"target-lexicon",
"wasmparser 0.215.0",
@@ -13520,7 +13531,7 @@ version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cf221c93e13a30d793f7645a0e7762c55d169dbb0a49671918a2319d289b10bb"
dependencies = [
"windows-sys 0.59.0",
"windows-sys 0.48.0",
]
[[package]]
@@ -13537,7 +13548,7 @@ checksum = "073efe897d9ead7fc609874f94580afc831114af5149b6a90ee0a3a39b497fe0"
dependencies = [
"anyhow",
"cranelift-codegen",
"gimli",
"gimli 0.29.0",
"regalloc2",
"smallvec",
"target-lexicon",
@@ -14094,6 +14105,7 @@ dependencies = [
"parking_lot",
"postage",
"project",
"remote",
"schemars",
"serde",
"serde_json",
@@ -14373,7 +14385,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.154.0"
version = "0.155.0"
dependencies = [
"activity_indicator",
"anyhow",

View File

@@ -174,9 +174,6 @@ members = [
default-members = ["crates/zed"]
[workspace.dependencies]
#
# Workspace member crates
#
@@ -216,7 +213,6 @@ file_icons = { path = "crates/file_icons" }
fs = { path = "crates/fs" }
fsevent = { path = "crates/fsevent" }
fuzzy = { path = "crates/fuzzy" }
isahc_http_client = { path = "crates/isahc_http_client" }
git = { path = "crates/git" }
git_hosting_providers = { path = "crates/git_hosting_providers" }
go_to_line = { path = "crates/go_to_line" }
@@ -231,6 +227,7 @@ image_viewer = { path = "crates/image_viewer" }
indexed_docs = { path = "crates/indexed_docs" }
inline_completion_button = { path = "crates/inline_completion_button" }
install_cli = { path = "crates/install_cli" }
isahc_http_client = { path = "crates/isahc_http_client" }
journal = { path = "crates/journal" }
language = { path = "crates/language" }
language_model = { path = "crates/language_model" }

View File

@@ -166,6 +166,7 @@
{
"context": "AssistantPanel",
"bindings": {
"ctrl-k c": "assistant::CopyCode",
"ctrl-g": "search::SelectNextMatch",
"ctrl-shift-g": "search::SelectPrevMatch",
"alt-m": "assistant::ToggleModelSelector",
@@ -519,6 +520,13 @@
"alt-enter": "editor::Newline"
}
},
{
"context": "PromptEditor",
"bindings": {
"ctrl-[": "assistant::CyclePreviousInlineAssist",
"ctrl-]": "assistant::CycleNextInlineAssist"
}
},
{
"context": "ProjectSearchBar && !in_replace",
"bindings": {

View File

@@ -188,6 +188,7 @@
{
"context": "AssistantPanel",
"bindings": {
"cmd-k c": "assistant::CopyCode",
"cmd-g": "search::SelectNextMatch",
"cmd-shift-g": "search::SelectPrevMatch",
"alt-m": "assistant::ToggleModelSelector",
@@ -526,6 +527,13 @@
"ctrl-enter": "assistant::InlineAssist"
}
},
{
"context": "PromptEditor",
"bindings": {
"ctrl-[": "assistant::CyclePreviousInlineAssist",
"ctrl-]": "assistant::CycleNextInlineAssist"
}
},
{
"context": "ProjectSearchBar && !in_replace",
"bindings": {

View File

@@ -124,7 +124,6 @@
"g i": "vim::InsertAtPrevious",
"g ,": "vim::ChangeListNewer",
"g ;": "vim::ChangeListOlder",
"g q": "editor::Rewrap",
"shift-h": "vim::WindowTop",
"shift-m": "vim::WindowMiddle",
"shift-l": "vim::WindowBottom",
@@ -240,6 +239,8 @@
"g shift-u": ["vim::PushOperator", "Uppercase"],
"g ~": ["vim::PushOperator", "OppositeCase"],
"\"": ["vim::PushOperator", "Register"],
"g q": ["vim::PushOperator", "Rewrap"],
"g w": ["vim::PushOperator", "Rewrap"],
"q": "vim::ToggleRecord",
"shift-q": "vim::ReplayLastRecording",
"@": ["vim::PushOperator", "ReplayRegister"],
@@ -301,6 +302,7 @@
"i": ["vim::PushOperator", { "Object": { "around": false } }],
"a": ["vim::PushOperator", { "Object": { "around": true } }],
"g c": "vim::ToggleComments",
"g q": "vim::Rewrap",
"\"": ["vim::PushOperator", "Register"],
// tree-sitter related commands
"[ x": "editor::SelectLargerSyntaxNode",
@@ -428,6 +430,15 @@
"~": "vim::CurrentLine"
}
},
{
"context": "vim_operator == gq",
"bindings": {
"g q": "vim::CurrentLine",
"q": "vim::CurrentLine",
"g w": "vim::CurrentLine",
"w": "vim::CurrentLine"
}
},
{
"context": "vim_operator == y",
"bindings": {

View File

@@ -47,6 +47,17 @@ And here's the section to rewrite based on that prompt again for reference:
<rewrite_this>
{{{rewrite_section}}}
</rewrite_this>
{{#if diagnostic_errors}}
{{#each diagnostic_errors}}
<diagnostic_error>
<line_number>{{line_number}}</line_number>
<error_message>{{error_message}}</error_message>
<code_content>{{code_content}}</code_content>
</diagnostic_error>
{{/each}}
{{/if}}
{{/if}}
Only make changes that are necessary to fulfill the prompt, leave everything else as-is. All surrounding {{content_type}} will be preserved.

View File

@@ -0,0 +1,8 @@
A software developer is asking a question about their project. The source files in their project have been indexed into a database of semantic text embeddings.
Your task is to generate a list of 4 diverse search queries that can be run on this embedding database, in order to retrieve a list of code snippets
that are relevant to the developer's question. Redundant search queries will be heavily penalized, so only include another query if it's sufficiently
distinct from previous ones.
Here is the question that's been asked, together with context that the developer has added manually:
{{{context_buffer}}}

View File

@@ -15,9 +15,11 @@
// text editor:
//
// 1. "VSCode"
// 2. "JetBrains"
// 3. "SublimeText"
// 4. "Atom"
// 2. "Atom"
// 3. "JetBrains"
// 4. "None"
// 5. "SublimeText"
// 6. "TextMate"
"base_keymap": "VSCode",
// Features that can be globally enabled or disabled
"features": {
@@ -496,6 +498,11 @@
// Whether a preview tab gets replaced when code navigation is used to navigate away from the tab.
"enable_preview_from_code_navigation": false
},
// Settings related to the file finder.
"file_finder": {
// Whether to show file icons in the file finder.
"file_icons": true
},
// Whether or not to remove any trailing whitespace from lines of a buffer
// before saving it.
"remove_trailing_whitespace_on_save": true,
@@ -1029,7 +1036,7 @@
// environment variables.
//
// Examples:
// - "proxy": "socks5://localhost:10808"
// - "proxy": "socks5h://localhost:10808"
// - "proxy": "http://127.0.0.1:10809"
"proxy": null,
// Set to configure aliases for the command palette.

View File

@@ -19,7 +19,10 @@ use workspace::{item::ItemHandle, StatusItemView, Workspace};
actions!(activity_indicator, [ShowErrorMessage]);
pub enum Event {
ShowError { lsp_name: Arc<str>, error: String },
ShowError {
lsp_name: LanguageServerName,
error: String,
},
}
pub struct ActivityIndicator {
@@ -123,7 +126,7 @@ impl ActivityIndicator {
self.statuses.retain(|status| {
if let LanguageServerBinaryStatus::Failed { error } = &status.status {
cx.emit(Event::ShowError {
lsp_name: status.name.0.clone(),
lsp_name: status.name.clone(),
error: error.clone(),
});
false

View File

@@ -49,6 +49,7 @@ pub enum Model {
/// Indicates whether this custom model supports caching.
cache_configuration: Option<AnthropicModelCacheConfiguration>,
max_output_tokens: Option<u32>,
default_temperature: Option<f32>,
},
}
@@ -124,6 +125,19 @@ impl Model {
}
}
pub fn default_temperature(&self) -> f32 {
match self {
Self::Claude3_5Sonnet
| Self::Claude3Opus
| Self::Claude3Sonnet
| Self::Claude3Haiku => 1.0,
Self::Custom {
default_temperature,
..
} => default_temperature.unwrap_or(1.0),
}
}
pub fn tool_model_id(&self) -> &str {
if let Self::Custom {
tool_override: Some(tool_override),

View File

@@ -94,9 +94,11 @@ editor = { workspace = true, features = ["test-support"] }
env_logger.workspace = true
language = { workspace = true, features = ["test-support"] }
language_model = { workspace = true, features = ["test-support"] }
languages = { workspace = true, features = ["test-support"] }
log.workspace = true
project = { workspace = true, features = ["test-support"] }
rand.workspace = true
serde_json_lenient.workspace = true
text = { workspace = true, features = ["test-support"] }
tree-sitter-md.workspace = true
unindent.workspace = true

View File

@@ -41,9 +41,10 @@ use semantic_index::{CloudEmbeddingProvider, SemanticDb};
use serde::{Deserialize, Serialize};
use settings::{update_settings_file, Settings, SettingsStore};
use slash_command::{
auto_command, context_server_command, default_command, delta_command, diagnostics_command,
docs_command, fetch_command, file_command, now_command, project_command, prompt_command,
search_command, symbols_command, tab_command, terminal_command, workflow_command,
auto_command, cargo_workspace_command, context_server_command, default_command, delta_command,
diagnostics_command, docs_command, fetch_command, file_command, now_command, project_command,
prompt_command, search_command, symbols_command, tab_command, terminal_command,
workflow_command,
};
use std::path::PathBuf;
use std::sync::Arc;
@@ -58,6 +59,7 @@ actions!(
[
Assist,
Split,
CopyCode,
CycleMessageRole,
QuoteSelection,
InsertIntoEditor,
@@ -68,6 +70,8 @@ actions!(
ConfirmCommand,
NewContext,
ToggleModelSelector,
CycleNextInlineAssist,
CyclePreviousInlineAssist
]
);
@@ -358,8 +362,19 @@ fn update_active_language_model_from_settings(cx: &mut AppContext) {
let settings = AssistantSettings::get_global(cx);
let provider_name = LanguageModelProviderId::from(settings.default_model.provider.clone());
let model_id = LanguageModelId::from(settings.default_model.model.clone());
let inline_alternatives = settings
.inline_alternatives
.iter()
.map(|alternative| {
(
LanguageModelProviderId::from(alternative.provider.clone()),
LanguageModelId::from(alternative.model.clone()),
)
})
.collect::<Vec<_>>();
LanguageModelRegistry::global(cx).update(cx, |registry, cx| {
registry.select_active_model(&provider_name, &model_id, cx);
registry.select_inline_alternative_models(inline_alternatives, cx);
});
}
@@ -370,20 +385,33 @@ fn register_slash_commands(prompt_builder: Option<Arc<PromptBuilder>>, cx: &mut
slash_command_registry.register_command(delta_command::DeltaSlashCommand, true);
slash_command_registry.register_command(symbols_command::OutlineSlashCommand, true);
slash_command_registry.register_command(tab_command::TabSlashCommand, true);
slash_command_registry.register_command(project_command::ProjectSlashCommand, true);
slash_command_registry
.register_command(cargo_workspace_command::CargoWorkspaceSlashCommand, true);
slash_command_registry.register_command(prompt_command::PromptSlashCommand, true);
slash_command_registry.register_command(default_command::DefaultSlashCommand, false);
slash_command_registry.register_command(terminal_command::TerminalSlashCommand, true);
slash_command_registry.register_command(now_command::NowSlashCommand, false);
slash_command_registry.register_command(diagnostics_command::DiagnosticsSlashCommand, true);
slash_command_registry.register_command(fetch_command::FetchSlashCommand, false);
if let Some(prompt_builder) = prompt_builder {
slash_command_registry.register_command(
workflow_command::WorkflowSlashCommand::new(prompt_builder.clone()),
true,
);
cx.observe_flag::<project_command::ProjectSlashCommandFeatureFlag, _>({
let slash_command_registry = slash_command_registry.clone();
move |is_enabled, _cx| {
if is_enabled {
slash_command_registry.register_command(
project_command::ProjectSlashCommand::new(prompt_builder.clone()),
true,
);
}
}
})
.detach();
}
slash_command_registry.register_command(fetch_command::FetchSlashCommand, false);
cx.observe_flag::<auto_command::AutoSlashCommandFeatureFlag, _>({
let slash_command_registry = slash_command_registry.clone();
@@ -421,10 +449,12 @@ fn update_slash_commands_from_settings(cx: &mut AppContext) {
slash_command_registry.unregister_command(docs_command::DocsSlashCommand);
}
if settings.project.enabled {
slash_command_registry.register_command(project_command::ProjectSlashCommand, true);
if settings.cargo_workspace.enabled {
slash_command_registry
.register_command(cargo_workspace_command::CargoWorkspaceSlashCommand, true);
} else {
slash_command_registry.unregister_command(project_command::ProjectSlashCommand);
slash_command_registry
.unregister_command(cargo_workspace_command::CargoWorkspaceSlashCommand);
}
}

View File

@@ -12,11 +12,11 @@ use crate::{
slash_command_picker,
terminal_inline_assistant::TerminalInlineAssistant,
Assist, CacheStatus, ConfirmCommand, Content, Context, ContextEvent, ContextId, ContextStore,
ContextStoreEvent, CycleMessageRole, DeployHistory, DeployPromptLibrary, InlineAssistId,
InlineAssistant, InsertDraggedFiles, InsertIntoEditor, Message, MessageId, MessageMetadata,
MessageStatus, ModelPickerDelegate, ModelSelector, NewContext, PendingSlashCommand,
PendingSlashCommandStatus, QuoteSelection, RemoteContextMetadata, SavedContextMetadata, Split,
ToggleFocus, ToggleModelSelector, WorkflowStepResolution,
ContextStoreEvent, CopyCode, CycleMessageRole, DeployHistory, DeployPromptLibrary,
InlineAssistId, InlineAssistant, InsertDraggedFiles, InsertIntoEditor, Message, MessageId,
MessageMetadata, MessageStatus, ModelPickerDelegate, ModelSelector, NewContext,
PendingSlashCommand, PendingSlashCommandStatus, QuoteSelection, RemoteContextMetadata,
SavedContextMetadata, Split, ToggleFocus, ToggleModelSelector, WorkflowStepResolution,
};
use anyhow::{anyhow, Result};
use assistant_slash_command::{SlashCommand, SlashCommandOutputSection};
@@ -45,7 +45,8 @@ use gpui::{
};
use indexed_docs::IndexedDocsStore;
use language::{
language_settings::SoftWrap, Capability, LanguageRegistry, LspAdapterDelegate, Point, ToOffset,
language_settings::SoftWrap, BufferSnapshot, Capability, LanguageRegistry, LspAdapterDelegate,
ToOffset,
};
use language_model::{
provider::cloud::PROVIDER_ID, LanguageModelProvider, LanguageModelProviderId,
@@ -56,6 +57,7 @@ use multi_buffer::MultiBufferRow;
use picker::{Picker, PickerDelegate};
use project::lsp_store::LocalLspAdapterDelegate;
use project::{Project, Worktree};
use rope::Point;
use search::{buffer_search::DivRegistrar, BufferSearchBar};
use serde::{Deserialize, Serialize};
use settings::{update_settings_file, Settings};
@@ -81,9 +83,10 @@ use util::{maybe, ResultExt};
use workspace::{
dock::{DockPosition, Panel, PanelEvent},
item::{self, FollowableItem, Item, ItemHandle},
notifications::NotificationId,
pane::{self, SaveIntent},
searchable::{SearchEvent, SearchableItem},
DraggedSelection, Pane, Save, ShowConfiguration, ToggleZoom, ToolbarItemEvent,
DraggedSelection, Pane, Save, ShowConfiguration, Toast, ToggleZoom, ToolbarItemEvent,
ToolbarItemLocation, ToolbarItemView, Workspace,
};
use workspace::{searchable::SearchableItemHandle, DraggedTab};
@@ -105,6 +108,7 @@ pub fn init(cx: &mut AppContext) {
.register_action(AssistantPanel::inline_assist)
.register_action(ContextEditor::quote_selection)
.register_action(ContextEditor::insert_selection)
.register_action(ContextEditor::copy_code)
.register_action(ContextEditor::insert_dragged_files)
.register_action(AssistantPanel::show_configuration)
.register_action(AssistantPanel::create_new_context);
@@ -2810,9 +2814,8 @@ impl ContextEditor {
} else {
// If there are multiple buffers or suggestion groups, create a multibuffer
let multibuffer = cx.new_model(|cx| {
let replica_id = project.read(cx).replica_id();
let mut multibuffer = MultiBuffer::new(replica_id, Capability::ReadWrite)
.with_title(resolved_step.title.clone());
let mut multibuffer =
MultiBuffer::new(Capability::ReadWrite).with_title(resolved_step.title.clone());
for (buffer, groups) in &resolved_step.suggestion_groups {
let excerpt_ids = multibuffer.push_excerpts(
buffer.clone(),
@@ -3100,6 +3103,49 @@ impl ContextEditor {
});
}
/// Returns either the selected text, or the content of the Markdown code
/// block surrounding the cursor.
fn get_selection_or_code_block(
context_editor_view: &View<ContextEditor>,
cx: &mut ViewContext<Workspace>,
) -> Option<(String, bool)> {
const CODE_FENCE_DELIMITER: &'static str = "```";
let context_editor = context_editor_view.read(cx).editor.read(cx);
if context_editor.selections.newest::<Point>(cx).is_empty() {
let snapshot = context_editor.buffer().read(cx).snapshot(cx);
let (_, _, snapshot) = snapshot.as_singleton()?;
let head = context_editor.selections.newest::<Point>(cx).head();
let offset = snapshot.point_to_offset(head);
let surrounding_code_block_range = find_surrounding_code_block(snapshot, offset)?;
let mut text = snapshot
.text_for_range(surrounding_code_block_range)
.collect::<String>();
// If there is no newline trailing the closing three-backticks, then
// tree-sitter-md extends the range of the content node to include
// the backticks.
if text.ends_with(CODE_FENCE_DELIMITER) {
text.drain((text.len() - CODE_FENCE_DELIMITER.len())..);
}
(!text.is_empty()).then_some((text, true))
} else {
let anchor = context_editor.selections.newest_anchor();
let text = context_editor
.buffer()
.read(cx)
.read(cx)
.text_for_range(anchor.range())
.collect::<String>();
(!text.is_empty()).then_some((text, false))
}
}
fn insert_selection(
workspace: &mut Workspace,
_: &InsertIntoEditor,
@@ -3118,17 +3164,7 @@ impl ContextEditor {
return;
};
let context_editor = context_editor_view.read(cx).editor.read(cx);
let anchor = context_editor.selections.newest_anchor();
let text = context_editor
.buffer()
.read(cx)
.read(cx)
.text_for_range(anchor.range())
.collect::<String>();
// If nothing is selected, don't delete the current selection; instead, be a no-op.
if !text.is_empty() {
if let Some((text, _)) = Self::get_selection_or_code_block(&context_editor_view, cx) {
active_editor_view.update(cx, |editor, cx| {
editor.insert(&text, cx);
editor.focus(cx);
@@ -3136,6 +3172,36 @@ impl ContextEditor {
}
}
fn copy_code(workspace: &mut Workspace, _: &CopyCode, cx: &mut ViewContext<Workspace>) {
let result = maybe!({
let panel = workspace.panel::<AssistantPanel>(cx)?;
let context_editor_view = panel.read(cx).active_context_editor(cx)?;
Self::get_selection_or_code_block(&context_editor_view, cx)
});
let Some((text, is_code_block)) = result else {
return;
};
cx.write_to_clipboard(ClipboardItem::new_string(text));
struct CopyToClipboardToast;
workspace.show_toast(
Toast::new(
NotificationId::unique::<CopyToClipboardToast>(),
format!(
"{} copied to clipboard.",
if is_code_block {
"Code block"
} else {
"Selection"
}
),
)
.autohide(),
cx,
);
}
fn insert_dragged_files(
workspace: &mut Workspace,
action: &InsertDraggedFiles,
@@ -3466,7 +3532,9 @@ impl ContextEditor {
for chunk in context.buffer().read(cx).text_for_range(range) {
text.push_str(chunk);
}
text.push('\n');
if message.offset_range.end < selection.range().end {
text.push('\n');
}
}
}
}
@@ -4215,6 +4283,48 @@ impl ContextEditor {
}
}
/// Returns the contents of the *outermost* fenced code block that contains the given offset.
fn find_surrounding_code_block(snapshot: &BufferSnapshot, offset: usize) -> Option<Range<usize>> {
const CODE_BLOCK_NODE: &'static str = "fenced_code_block";
const CODE_BLOCK_CONTENT: &'static str = "code_fence_content";
let layer = snapshot.syntax_layers().next()?;
let root_node = layer.node();
let mut cursor = root_node.walk();
// Go to the first child for the given offset
while cursor.goto_first_child_for_byte(offset).is_some() {
// If we're at the end of the node, go to the next one.
// Example: if you have a fenced-code-block, and you're on the start of the line
// right after the closing ```, you want to skip the fenced-code-block and
// go to the next sibling.
if cursor.node().end_byte() == offset {
cursor.goto_next_sibling();
}
if cursor.node().start_byte() > offset {
break;
}
// We found the fenced code block.
if cursor.node().kind() == CODE_BLOCK_NODE {
// Now we need to find the child node that contains the code.
cursor.goto_first_child();
loop {
if cursor.node().kind() == CODE_BLOCK_CONTENT {
return Some(cursor.node().byte_range());
}
if !cursor.goto_next_sibling() {
break;
}
}
}
}
None
}
fn render_fold_icon_button(
editor: WeakView<Editor>,
icon: IconName,
@@ -5497,3 +5607,85 @@ fn configuration_error(cx: &AppContext) -> Option<ConfigurationError> {
None
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::{AppContext, Context};
use language::Buffer;
use unindent::Unindent;
#[gpui::test]
fn test_find_code_blocks(cx: &mut AppContext) {
let markdown = languages::language("markdown", tree_sitter_md::LANGUAGE.into());
let buffer = cx.new_model(|cx| {
let text = r#"
line 0
line 1
```rust
fn main() {}
```
line 5
line 6
line 7
```go
func main() {}
```
line 11
```
this is plain text code block
```
```go
func another() {}
```
line 19
"#
.unindent();
let mut buffer = Buffer::local(text, cx);
buffer.set_language(Some(markdown.clone()), cx);
buffer
});
let snapshot = buffer.read(cx).snapshot();
let code_blocks = vec![
Point::new(3, 0)..Point::new(4, 0),
Point::new(9, 0)..Point::new(10, 0),
Point::new(13, 0)..Point::new(14, 0),
Point::new(17, 0)..Point::new(18, 0),
]
.into_iter()
.map(|range| snapshot.point_to_offset(range.start)..snapshot.point_to_offset(range.end))
.collect::<Vec<_>>();
let expected_results = vec![
(0, None),
(1, None),
(2, Some(code_blocks[0].clone())),
(3, Some(code_blocks[0].clone())),
(4, Some(code_blocks[0].clone())),
(5, None),
(6, None),
(7, None),
(8, Some(code_blocks[1].clone())),
(9, Some(code_blocks[1].clone())),
(10, Some(code_blocks[1].clone())),
(11, None),
(12, Some(code_blocks[2].clone())),
(13, Some(code_blocks[2].clone())),
(14, Some(code_blocks[2].clone())),
(15, None),
(16, Some(code_blocks[3].clone())),
(17, Some(code_blocks[3].clone())),
(18, Some(code_blocks[3].clone())),
(19, None),
];
for (row, expected) in expected_results {
let offset = snapshot.point_to_offset(Point::new(row, 0));
let range = find_surrounding_code_block(&snapshot, offset);
assert_eq!(range, expected, "unexpected result on row {:?}", row);
}
}
}

View File

@@ -59,6 +59,7 @@ pub struct AssistantSettings {
pub default_width: Pixels,
pub default_height: Pixels,
pub default_model: LanguageModelSelection,
pub inline_alternatives: Vec<LanguageModelSelection>,
pub using_outdated_settings_version: bool,
}
@@ -236,6 +237,7 @@ impl AssistantSettingsContent {
})
}
}),
inline_alternatives: None,
},
VersionedAssistantSettingsContent::V2(settings) => settings.clone(),
},
@@ -254,6 +256,7 @@ impl AssistantSettingsContent {
.id()
.to_string(),
}),
inline_alternatives: None,
},
}
}
@@ -369,6 +372,7 @@ impl Default for VersionedAssistantSettingsContent {
default_width: None,
default_height: None,
default_model: None,
inline_alternatives: None,
})
}
}
@@ -397,6 +401,8 @@ pub struct AssistantSettingsContentV2 {
default_height: Option<f32>,
/// The default model to use when creating new contexts.
default_model: Option<LanguageModelSelection>,
/// Additional models with which to generate alternatives when performing inline assists.
inline_alternatives: Option<Vec<LanguageModelSelection>>,
}
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema, PartialEq)]
@@ -517,10 +523,8 @@ impl Settings for AssistantSettings {
&mut settings.default_height,
value.default_height.map(Into::into),
);
merge(
&mut settings.default_model,
value.default_model.map(Into::into),
);
merge(&mut settings.default_model, value.default_model);
merge(&mut settings.inline_alternatives, value.inline_alternatives);
// merge(&mut settings.infer_context, value.infer_context); TODO re-enable this once we ship context inference
}
@@ -574,6 +578,7 @@ mod tests {
provider: "test-provider".into(),
model: "gpt-99".into(),
}),
inline_alternatives: None,
enabled: None,
button: None,
dock: None,

View File

@@ -46,7 +46,7 @@ use std::{
sync::Arc,
time::{Duration, Instant},
};
use telemetry_events::AssistantKind;
use telemetry_events::{AssistantKind, AssistantPhase};
use text::BufferSnapshot;
use util::{post_inc, ResultExt, TryFutureExt};
use uuid::Uuid;
@@ -683,7 +683,7 @@ impl Context {
buffer.set_text(saved_context.text.as_str(), cx)
});
let operations = saved_context.into_ops(&this.buffer, cx);
this.apply_ops(operations, cx).unwrap();
this.apply_ops(operations, cx);
this
}
@@ -756,7 +756,7 @@ impl Context {
&mut self,
ops: impl IntoIterator<Item = ContextOperation>,
cx: &mut ModelContext<Self>,
) -> Result<()> {
) {
let mut buffer_ops = Vec::new();
for op in ops {
match op {
@@ -765,10 +765,8 @@ impl Context {
}
}
self.buffer
.update(cx, |buffer, cx| buffer.apply_ops(buffer_ops, cx))?;
.update(cx, |buffer, cx| buffer.apply_ops(buffer_ops, cx));
self.flush_ops(cx);
Ok(())
}
fn flush_ops(&mut self, cx: &mut ModelContext<Context>) {
@@ -1008,9 +1006,12 @@ impl Context {
cx: &mut ModelContext<Self>,
) {
match event {
language::BufferEvent::Operation(operation) => cx.emit(ContextEvent::Operation(
ContextOperation::BufferOperation(operation.clone()),
)),
language::BufferEvent::Operation {
operation,
is_local: true,
} => cx.emit(ContextEvent::Operation(ContextOperation::BufferOperation(
operation.clone(),
))),
language::BufferEvent::Edited => {
self.count_remaining_tokens(cx);
self.reparse(cx);
@@ -1969,8 +1970,9 @@ impl Context {
}
pub fn assist(&mut self, cx: &mut ModelContext<Self>) -> Option<MessageAnchor> {
let provider = LanguageModelRegistry::read_global(cx).active_provider()?;
let model = LanguageModelRegistry::read_global(cx).active_model()?;
let model_registry = LanguageModelRegistry::read_global(cx);
let provider = model_registry.active_provider()?;
let model = model_registry.active_model()?;
let last_message_id = self.get_last_valid_message_id(cx)?;
if !provider.is_authenticated(cx) {
@@ -2134,6 +2136,7 @@ impl Context {
telemetry.report_assistant_event(
Some(this.id.0.clone()),
AssistantKind::Panel,
AssistantPhase::Response,
model.telemetry_id(),
response_latency,
error_message,
@@ -2181,7 +2184,7 @@ impl Context {
messages: Vec::new(),
tools: Vec::new(),
stop: Vec::new(),
temperature: 1.0,
temperature: None,
};
for message in self.messages(cx) {
if message.status != MessageStatus::Done {

View File

@@ -1166,9 +1166,7 @@ async fn test_random_context_collaboration(cx: &mut TestAppContext, mut rng: Std
);
network.lock().broadcast(replica_id, ops_to_send);
context
.update(cx, |context, cx| context.apply_ops(ops_to_receive, cx))
.unwrap();
context.update(cx, |context, cx| context.apply_ops(ops_to_receive, cx));
} else if rng.gen_bool(0.1) && replica_id != 0 {
log::info!("Context {}: disconnecting", context_index);
network.lock().disconnect_peer(replica_id);
@@ -1180,9 +1178,7 @@ async fn test_random_context_collaboration(cx: &mut TestAppContext, mut rng: Std
.map(ContextOperation::from_proto)
.collect::<Result<Vec<_>>>()
.unwrap();
context
.update(cx, |context, cx| context.apply_ops(ops, cx))
.unwrap();
context.update(cx, |context, cx| context.apply_ops(ops, cx));
}
}
}

View File

@@ -223,7 +223,7 @@ impl ContextStore {
if let Some(context) = this.loaded_context_for_id(&context_id, cx) {
let operation_proto = envelope.payload.operation.context("invalid operation")?;
let operation = ContextOperation::from_proto(operation_proto)?;
context.update(cx, |context, cx| context.apply_ops([operation], cx))?;
context.update(cx, |context, cx| context.apply_ops([operation], cx));
}
Ok(())
})?
@@ -394,7 +394,7 @@ impl ContextStore {
.collect::<Result<Vec<_>>>()
})
.await?;
context.update(&mut cx, |context, cx| context.apply_ops(operations, cx))??;
context.update(&mut cx, |context, cx| context.apply_ops(operations, cx))?;
this.update(&mut cx, |this, cx| {
if let Some(existing_context) = this.loaded_context_for_id(&context_id, cx) {
existing_context
@@ -531,7 +531,7 @@ impl ContextStore {
.collect::<Result<Vec<_>>>()
})
.await?;
context.update(&mut cx, |context, cx| context.apply_ops(operations, cx))??;
context.update(&mut cx, |context, cx| context.apply_ops(operations, cx))?;
this.update(&mut cx, |this, cx| {
if let Some(existing_context) = this.loaded_context_for_id(&context_id, cx) {
existing_context

File diff suppressed because it is too large

View File

@@ -796,7 +796,7 @@ impl PromptLibrary {
}],
tools: Vec::new(),
stop: Vec::new(),
temperature: 1.,
temperature: None,
},
cx,
)

View File

@@ -4,13 +4,20 @@ use fs::Fs;
use futures::StreamExt;
use gpui::AssetSource;
use handlebars::{Handlebars, RenderError};
use language::{BufferSnapshot, LanguageName};
use language::{BufferSnapshot, LanguageName, Point};
use parking_lot::Mutex;
use serde::Serialize;
use std::{ops::Range, path::PathBuf, sync::Arc, time::Duration};
use text::LineEnding;
use util::ResultExt;
#[derive(Serialize)]
pub struct ContentPromptDiagnosticContext {
pub line_number: usize,
pub error_message: String,
pub code_content: String,
}
#[derive(Serialize)]
pub struct ContentPromptContext {
pub content_type: String,
@@ -20,6 +27,7 @@ pub struct ContentPromptContext {
pub document_content: String,
pub user_prompt: String,
pub rewrite_section: Option<String>,
pub diagnostic_errors: Vec<ContentPromptDiagnosticContext>,
}
#[derive(Serialize)]
@@ -32,6 +40,11 @@ pub struct TerminalAssistantPromptContext {
pub user_prompt: String,
}
#[derive(Serialize)]
pub struct ProjectSlashCommandPromptContext {
pub context_buffer: String,
}
/// Context required to generate a workflow step resolution prompt.
#[derive(Debug, Serialize)]
pub struct StepResolutionContext {
@@ -82,10 +95,9 @@ impl PromptBuilder {
/// and application context.
/// * `handlebars` - An `Arc<Mutex<Handlebars>>` for registering and updating templates.
fn watch_fs_for_template_overrides(
mut params: PromptLoadingParams,
params: PromptLoadingParams,
handlebars: Arc<Mutex<Handlebars<'static>>>,
) {
params.repo_path = None;
let templates_dir = paths::prompt_overrides_dir(params.repo_path.as_deref());
params.cx.background_executor()
.spawn(async move {
@@ -220,7 +232,8 @@ impl PromptBuilder {
let before_range = 0..range.start;
let truncated_before = if before_range.len() > MAX_CTX {
is_truncated = true;
range.start - MAX_CTX..range.start
let start = buffer.clip_offset(range.start - MAX_CTX, text::Bias::Right);
start..range.start
} else {
before_range
};
@@ -228,7 +241,8 @@ impl PromptBuilder {
let after_range = range.end..buffer.len();
let truncated_after = if after_range.len() > MAX_CTX {
is_truncated = true;
range.end..range.end + MAX_CTX
let end = buffer.clip_offset(range.end + MAX_CTX, text::Bias::Left);
range.end..end
} else {
after_range
};
@@ -259,6 +273,17 @@ impl PromptBuilder {
} else {
None
};
let diagnostics = buffer.diagnostics_in_range::<_, Point>(range, false);
let diagnostic_errors: Vec<ContentPromptDiagnosticContext> = diagnostics
.map(|entry| {
let start = entry.range.start;
ContentPromptDiagnosticContext {
line_number: (start.row + 1) as usize,
error_message: entry.diagnostic.message.clone(),
code_content: buffer.text_for_range(entry.range.clone()).collect(),
}
})
.collect();
let context = ContentPromptContext {
content_type: content_type.to_string(),
@@ -268,8 +293,8 @@ impl PromptBuilder {
document_content,
user_prompt,
rewrite_section,
diagnostic_errors,
};
self.handlebars.lock().render("content_prompt", &context)
}
@@ -297,4 +322,14 @@ impl PromptBuilder {
pub fn generate_workflow_prompt(&self) -> Result<String, RenderError> {
self.handlebars.lock().render("edit_workflow", &())
}
pub fn generate_project_slash_command_prompt(
&self,
context_buffer: String,
) -> Result<String, RenderError> {
self.handlebars.lock().render(
"project_slash_command",
&ProjectSlashCommandPromptContext { context_buffer },
)
}
}

View File

@@ -18,8 +18,8 @@ use std::{
};
use ui::ActiveTheme;
use workspace::Workspace;
pub mod auto_command;
pub mod cargo_workspace_command;
pub mod context_server_command;
pub mod default_command;
pub mod delta_command;

View File

@@ -216,7 +216,7 @@ async fn commands_for_summaries(
}],
tools: Vec::new(),
stop: Vec::new(),
temperature: 1.0,
temperature: None,
};
while let Some(current_summaries) = stack.pop() {

View File

@@ -0,0 +1,153 @@
use super::{SlashCommand, SlashCommandOutput};
use anyhow::{anyhow, Context, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use fs::Fs;
use gpui::{AppContext, Model, Task, WeakView};
use language::{BufferSnapshot, LspAdapterDelegate};
use project::{Project, ProjectPath};
use std::{
fmt::Write,
path::Path,
sync::{atomic::AtomicBool, Arc},
};
use ui::prelude::*;
use workspace::Workspace;
pub(crate) struct CargoWorkspaceSlashCommand;
impl CargoWorkspaceSlashCommand {
async fn build_message(fs: Arc<dyn Fs>, path_to_cargo_toml: &Path) -> Result<String> {
let buffer = fs.load(path_to_cargo_toml).await?;
let cargo_toml: cargo_toml::Manifest = toml::from_str(&buffer)?;
let mut message = String::new();
writeln!(message, "You are in a Rust project.")?;
if let Some(workspace) = cargo_toml.workspace {
writeln!(
message,
"The project is a Cargo workspace with the following members:"
)?;
for member in workspace.members {
writeln!(message, "- {member}")?;
}
if !workspace.default_members.is_empty() {
writeln!(message, "The default members are:")?;
for member in workspace.default_members {
writeln!(message, "- {member}")?;
}
}
if !workspace.dependencies.is_empty() {
writeln!(
message,
"The following workspace dependencies are installed:"
)?;
for dependency in workspace.dependencies.keys() {
writeln!(message, "- {dependency}")?;
}
}
} else if let Some(package) = cargo_toml.package {
writeln!(
message,
"The project name is \"{name}\".",
name = package.name
)?;
let description = package
.description
.as_ref()
.and_then(|description| description.get().ok().cloned());
if let Some(description) = description.as_ref() {
writeln!(message, "It describes itself as \"{description}\".")?;
}
if !cargo_toml.dependencies.is_empty() {
writeln!(message, "The following dependencies are installed:")?;
for dependency in cargo_toml.dependencies.keys() {
writeln!(message, "- {dependency}")?;
}
}
}
Ok(message)
}
fn path_to_cargo_toml(project: Model<Project>, cx: &mut AppContext) -> Option<Arc<Path>> {
let worktree = project.read(cx).worktrees(cx).next()?;
let worktree = worktree.read(cx);
let entry = worktree.entry_for_path("Cargo.toml")?;
let path = ProjectPath {
worktree_id: worktree.id(),
path: entry.path.clone(),
};
Some(Arc::from(
project.read(cx).absolute_path(&path, cx)?.as_path(),
))
}
}
impl SlashCommand for CargoWorkspaceSlashCommand {
fn name(&self) -> String {
"cargo-workspace".into()
}
fn description(&self) -> String {
"insert project workspace metadata".into()
}
fn menu_text(&self) -> String {
"Insert Project Workspace Metadata".into()
}
fn complete_argument(
self: Arc<Self>,
_arguments: &[String],
_cancel: Arc<AtomicBool>,
_workspace: Option<WeakView<Workspace>>,
_cx: &mut WindowContext,
) -> Task<Result<Vec<ArgumentCompletion>>> {
Task::ready(Err(anyhow!("this command does not require an argument")))
}
fn requires_argument(&self) -> bool {
false
}
fn run(
self: Arc<Self>,
_arguments: &[String],
_context_slash_command_output_sections: &[SlashCommandOutputSection<language::Anchor>],
_context_buffer: BufferSnapshot,
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
let output = workspace.update(cx, |workspace, cx| {
let project = workspace.project().clone();
let fs = workspace.project().read(cx).fs().clone();
let path = Self::path_to_cargo_toml(project, cx);
let output = cx.background_executor().spawn(async move {
let path = path.with_context(|| "Cargo.toml not found")?;
Self::build_message(fs, &path).await
});
cx.foreground_executor().spawn(async move {
let text = output.await?;
let range = 0..text.len();
Ok(SlashCommandOutput {
text,
sections: vec![SlashCommandOutputSection {
range,
icon: IconName::FileTree,
label: "Project".into(),
metadata: None,
}],
run_commands_in_text: false,
})
})
});
output.unwrap_or_else(|error| Task::ready(Err(error)))
}
}

View File

@@ -1,90 +1,39 @@
use super::{SlashCommand, SlashCommandOutput};
use anyhow::{anyhow, Context, Result};
use super::{
create_label_for_command, search_command::add_search_result_section, SlashCommand,
SlashCommandOutput,
};
use crate::PromptBuilder;
use anyhow::{anyhow, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use fs::Fs;
use gpui::{AppContext, Model, Task, WeakView};
use language::{BufferSnapshot, LspAdapterDelegate};
use project::{Project, ProjectPath};
use feature_flags::FeatureFlag;
use gpui::{AppContext, Task, WeakView, WindowContext};
use language::{Anchor, CodeLabel, LspAdapterDelegate};
use language_model::{LanguageModelRegistry, LanguageModelTool};
use schemars::JsonSchema;
use semantic_index::SemanticDb;
use serde::Deserialize;
pub struct ProjectSlashCommandFeatureFlag;
impl FeatureFlag for ProjectSlashCommandFeatureFlag {
const NAME: &'static str = "project-slash-command";
}
use std::{
fmt::Write,
path::Path,
fmt::Write as _,
ops::DerefMut,
sync::{atomic::AtomicBool, Arc},
};
use ui::prelude::*;
use ui::{BorrowAppContext as _, IconName};
use workspace::Workspace;
pub(crate) struct ProjectSlashCommand;
pub struct ProjectSlashCommand {
prompt_builder: Arc<PromptBuilder>,
}
impl ProjectSlashCommand {
async fn build_message(fs: Arc<dyn Fs>, path_to_cargo_toml: &Path) -> Result<String> {
let buffer = fs.load(path_to_cargo_toml).await?;
let cargo_toml: cargo_toml::Manifest = toml::from_str(&buffer)?;
let mut message = String::new();
writeln!(message, "You are in a Rust project.")?;
if let Some(workspace) = cargo_toml.workspace {
writeln!(
message,
"The project is a Cargo workspace with the following members:"
)?;
for member in workspace.members {
writeln!(message, "- {member}")?;
}
if !workspace.default_members.is_empty() {
writeln!(message, "The default members are:")?;
for member in workspace.default_members {
writeln!(message, "- {member}")?;
}
}
if !workspace.dependencies.is_empty() {
writeln!(
message,
"The following workspace dependencies are installed:"
)?;
for dependency in workspace.dependencies.keys() {
writeln!(message, "- {dependency}")?;
}
}
} else if let Some(package) = cargo_toml.package {
writeln!(
message,
"The project name is \"{name}\".",
name = package.name
)?;
let description = package
.description
.as_ref()
.and_then(|description| description.get().ok().cloned());
if let Some(description) = description.as_ref() {
writeln!(message, "It describes itself as \"{description}\".")?;
}
if !cargo_toml.dependencies.is_empty() {
writeln!(message, "The following dependencies are installed:")?;
for dependency in cargo_toml.dependencies.keys() {
writeln!(message, "- {dependency}")?;
}
}
}
Ok(message)
}
fn path_to_cargo_toml(project: Model<Project>, cx: &mut AppContext) -> Option<Arc<Path>> {
let worktree = project.read(cx).worktrees(cx).next()?;
let worktree = worktree.read(cx);
let entry = worktree.entry_for_path("Cargo.toml")?;
let path = ProjectPath {
worktree_id: worktree.id(),
path: entry.path.clone(),
};
Some(Arc::from(
project.read(cx).absolute_path(&path, cx)?.as_path(),
))
pub fn new(prompt_builder: Arc<PromptBuilder>) -> Self {
Self { prompt_builder }
}
}
@@ -93,12 +42,20 @@ impl SlashCommand for ProjectSlashCommand {
"project".into()
}
fn label(&self, cx: &AppContext) -> CodeLabel {
create_label_for_command("project", &[], cx)
}
fn description(&self) -> String {
"insert project metadata".into()
"Generate semantic searches based on the current context".into()
}
fn menu_text(&self) -> String {
"Insert Project Metadata".into()
"Project Context".into()
}
fn requires_argument(&self) -> bool {
false
}
fn complete_argument(
@@ -108,46 +65,126 @@ impl SlashCommand for ProjectSlashCommand {
_workspace: Option<WeakView<Workspace>>,
_cx: &mut WindowContext,
) -> Task<Result<Vec<ArgumentCompletion>>> {
Task::ready(Err(anyhow!("this command does not require argument")))
}
fn requires_argument(&self) -> bool {
false
Task::ready(Ok(Vec::new()))
}
fn run(
self: Arc<Self>,
_arguments: &[String],
_context_slash_command_output_sections: &[SlashCommandOutputSection<language::Anchor>],
_context_buffer: BufferSnapshot,
_context_slash_command_output_sections: &[SlashCommandOutputSection<Anchor>],
context_buffer: language::BufferSnapshot,
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
let output = workspace.update(cx, |workspace, cx| {
let project = workspace.project().clone();
let fs = workspace.project().read(cx).fs().clone();
let path = Self::path_to_cargo_toml(project, cx);
let output = cx.background_executor().spawn(async move {
let path = path.with_context(|| "Cargo.toml not found")?;
Self::build_message(fs, &path).await
});
let model_registry = LanguageModelRegistry::read_global(cx);
let current_model = model_registry.active_model();
let prompt_builder = self.prompt_builder.clone();
cx.foreground_executor().spawn(async move {
let text = output.await?;
let range = 0..text.len();
Ok(SlashCommandOutput {
text,
sections: vec![SlashCommandOutputSection {
range,
icon: IconName::FileTree,
label: "Project".into(),
let Some(workspace) = workspace.upgrade() else {
return Task::ready(Err(anyhow::anyhow!("workspace was dropped")));
};
let project = workspace.read(cx).project().clone();
let fs = project.read(cx).fs().clone();
let Some(project_index) =
cx.update_global(|index: &mut SemanticDb, cx| index.project_index(project, cx))
else {
return Task::ready(Err(anyhow::anyhow!("no project indexer")));
};
cx.spawn(|mut cx| async move {
let current_model = current_model.ok_or_else(|| anyhow!("no model selected"))?;
let prompt =
prompt_builder.generate_project_slash_command_prompt(context_buffer.text())?;
let search_queries = current_model
.use_tool::<SearchQueries>(
language_model::LanguageModelRequest {
messages: vec![language_model::LanguageModelRequestMessage {
role: language_model::Role::User,
content: vec![language_model::MessageContent::Text(prompt)],
cache: false,
}],
tools: vec![],
stop: vec![],
temperature: None,
},
cx.deref_mut(),
)
.await?
.search_queries;
let results = project_index
.read_with(&cx, |project_index, cx| {
project_index.search(search_queries.clone(), 25, cx)
})?
.await?;
let results = SemanticDb::load_results(results, &fs, &cx).await?;
cx.background_executor()
.spawn(async move {
let mut output = "Project context:\n".to_string();
let mut sections = Vec::new();
for (ix, query) in search_queries.into_iter().enumerate() {
let start_ix = output.len();
writeln!(&mut output, "Results for {query}:").unwrap();
let mut has_results = false;
for result in &results {
if result.query_index == ix {
add_search_result_section(result, &mut output, &mut sections);
has_results = true;
}
}
if has_results {
sections.push(SlashCommandOutputSection {
range: start_ix..output.len(),
icon: IconName::MagnifyingGlass,
label: query.into(),
metadata: None,
});
output.push('\n');
} else {
output.truncate(start_ix);
}
}
sections.push(SlashCommandOutputSection {
range: 0..output.len(),
icon: IconName::Book,
label: "Project context".into(),
metadata: None,
}],
run_commands_in_text: false,
});
Ok(SlashCommandOutput {
text: output,
sections,
run_commands_in_text: true,
})
})
})
});
output.unwrap_or_else(|error| Task::ready(Err(error)))
.await
})
}
}
#[derive(JsonSchema, Deserialize)]
struct SearchQueries {
/// An array of semantic search queries.
///
/// These queries will be used to search the user's codebase.
/// The function can only accept 4 queries, otherwise it will error.
/// As such, it's important that you limit the length of the search_queries array to 4 queries or fewer.
search_queries: Vec<String>,
}
impl LanguageModelTool for SearchQueries {
fn name() -> String {
"search_queries".to_string()
}
fn description() -> String {
"Generate semantic search queries based on context".to_string()
}
}
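The derives above are what let the tool call round-trip. A rough sketch of the expected payload shape, assuming serde_json is available as a dev-dependency (the test name and sample queries are illustrative only, not part of the change):
#[cfg(test)]
mod search_queries_shape_tests {
    use super::SearchQueries;
    #[test]
    fn search_queries_deserializes_from_tool_output() {
        // The model's tool call is expected to decode through the
        // `Deserialize` derive on `SearchQueries` shown above.
        let raw = r#"{ "search_queries": ["buffer diff hunks", "toast notifications"] }"#;
        let queries: SearchQueries = serde_json::from_str(raw).unwrap();
        assert_eq!(queries.search_queries.len(), 2);
    }
}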

View File

@@ -7,7 +7,7 @@ use anyhow::Result;
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use feature_flags::FeatureFlag;
use gpui::{AppContext, Task, WeakView};
use language::{CodeLabel, LineEnding, LspAdapterDelegate};
use language::{CodeLabel, LspAdapterDelegate};
use semantic_index::{LoadedSearchResult, SemanticDb};
use std::{
fmt::Write,
@@ -101,7 +101,7 @@ impl SlashCommand for SearchSlashCommand {
cx.spawn(|cx| async move {
let results = project_index
.read_with(&cx, |project_index, cx| {
project_index.search(query.clone(), limit.unwrap_or(5), cx)
project_index.search(vec![query.clone()], limit.unwrap_or(5), cx)
})?
.await?;
@@ -112,31 +112,8 @@ impl SlashCommand for SearchSlashCommand {
.spawn(async move {
let mut text = format!("Search results for {query}:\n");
let mut sections = Vec::new();
for LoadedSearchResult {
path,
range,
full_path,
file_content,
row_range,
} in loaded_results
{
let section_start_ix = text.len();
text.push_str(&codeblock_fence_for_path(
Some(&path),
Some(row_range.clone()),
));
let mut excerpt = file_content[range].to_string();
LineEnding::normalize(&mut excerpt);
text.push_str(&excerpt);
writeln!(text, "\n```\n").unwrap();
let section_end_ix = text.len() - 1;
sections.push(build_entry_output_section(
section_start_ix..section_end_ix,
Some(&full_path),
false,
Some(row_range.start() + 1..row_range.end() + 1),
));
for loaded_result in &loaded_results {
add_search_result_section(loaded_result, &mut text, &mut sections);
}
let query = SharedString::from(query);
@@ -159,3 +136,35 @@ impl SlashCommand for SearchSlashCommand {
})
}
}
pub fn add_search_result_section(
loaded_result: &LoadedSearchResult,
text: &mut String,
sections: &mut Vec<SlashCommandOutputSection<usize>>,
) {
let LoadedSearchResult {
path,
full_path,
excerpt_content,
row_range,
..
} = loaded_result;
let section_start_ix = text.len();
text.push_str(&codeblock_fence_for_path(
Some(&path),
Some(row_range.clone()),
));
text.push_str(&excerpt_content);
if !text.ends_with('\n') {
text.push('\n');
}
writeln!(text, "```\n").unwrap();
let section_end_ix = text.len() - 1;
sections.push(build_entry_output_section(
section_start_ix..section_end_ix,
Some(&full_path),
false,
Some(row_range.start() + 1..row_range.end() + 1),
));
}

View File

@@ -10,9 +10,9 @@ pub struct SlashCommandSettings {
/// Settings for the `/docs` slash command.
#[serde(default)]
pub docs: DocsCommandSettings,
/// Settings for the `/project` slash command.
/// Settings for the `/cargo-workspace` slash command.
#[serde(default)]
pub project: ProjectCommandSettings,
pub cargo_workspace: CargoWorkspaceCommandSettings,
}
/// Settings for the `/docs` slash command.
@@ -23,10 +23,10 @@ pub struct DocsCommandSettings {
pub enabled: bool,
}
/// Settings for the `/project` slash command.
/// Settings for the `/cargo-workspace` slash command.
#[derive(Deserialize, Serialize, Debug, Default, Clone, JsonSchema)]
pub struct ProjectCommandSettings {
/// Whether `/project` is enabled.
pub struct CargoWorkspaceCommandSettings {
/// Whether `/cargo-workspace` is enabled.
#[serde(default)]
pub enabled: bool,
}

View File

@@ -284,7 +284,7 @@ impl TerminalInlineAssistant {
messages,
tools: Vec::new(),
stop: Vec::new(),
temperature: 1.0,
temperature: None,
})
}
@@ -1066,6 +1066,7 @@ impl Codegen {
telemetry.report_assistant_event(
None,
telemetry_events::AssistantKind::Inline,
telemetry_events::AssistantPhase::Response,
model_telemetry_id,
response_latency,
error_message,

View File

@@ -268,7 +268,7 @@ fn view_release_notes_locally(workspace: &mut Workspace, cx: &mut ViewContext<Wo
let client = client::Client::global(cx).http_client();
let url = client.build_url(&format!(
"/api/release_notes/{}/{}",
"/api/release_notes/v2/{}/{}",
release_channel.dev_name(),
version
));

View File

@@ -66,7 +66,7 @@ impl ChannelBuffer {
let capability = channel_store.read(cx).channel_capability(channel.id);
language::Buffer::remote(buffer_id, response.replica_id as u16, capability, base_text)
})?;
buffer.update(&mut cx, |buffer, cx| buffer.apply_ops(operations, cx))??;
buffer.update(&mut cx, |buffer, cx| buffer.apply_ops(operations, cx))?;
let subscription = client.subscribe_to_entity(channel.id.0)?;
@@ -151,7 +151,7 @@ impl ChannelBuffer {
cx.notify();
this.buffer
.update(cx, |buffer, cx| buffer.apply_ops(ops, cx))
})??;
})?;
Ok(())
}
@@ -175,7 +175,10 @@ impl ChannelBuffer {
cx: &mut ModelContext<Self>,
) {
match event {
language::BufferEvent::Operation(operation) => {
language::BufferEvent::Operation {
operation,
is_local: true,
} => {
if *ZED_ALWAYS_ACTIVE {
if let language::Operation::UpdateSelections { selections, .. } = operation {
if selections.is_empty() {

View File

@@ -1007,7 +1007,7 @@ impl ChannelStore {
.into_iter()
.map(language::proto::deserialize_operation)
.collect::<Result<Vec<_>>>()?;
buffer.apply_ops(incoming_operations, cx)?;
buffer.apply_ops(incoming_operations, cx);
anyhow::Ok(outgoing_operations)
})
.log_err();

View File

@@ -1621,6 +1621,10 @@ impl ProtoClient for Client {
fn message_handler_set(&self) -> &parking_lot::Mutex<ProtoMessageHandlerSet> {
&self.handler_set
}
fn is_via_collab(&self) -> bool {
true
}
}
#[derive(Serialize, Deserialize)]

View File

@@ -16,9 +16,9 @@ use std::io::Write;
use std::{env, mem, path::PathBuf, sync::Arc, time::Duration};
use sysinfo::{CpuRefreshKind, Pid, ProcessRefreshKind, RefreshKind, System};
use telemetry_events::{
ActionEvent, AppEvent, AssistantEvent, AssistantKind, CallEvent, CpuEvent, EditEvent,
EditorEvent, Event, EventRequestBody, EventWrapper, ExtensionEvent, InlineCompletionEvent,
MemoryEvent, ReplEvent, SettingEvent,
ActionEvent, AppEvent, AssistantEvent, AssistantKind, AssistantPhase, CallEvent, CpuEvent,
EditEvent, EditorEvent, Event, EventRequestBody, EventWrapper, ExtensionEvent,
InlineCompletionEvent, MemoryEvent, ReplEvent, SettingEvent,
};
use tempfile::NamedTempFile;
#[cfg(not(debug_assertions))]
@@ -37,9 +37,10 @@ pub struct Telemetry {
struct TelemetryState {
settings: TelemetrySettings,
metrics_id: Option<Arc<str>>, // Per logged-in user
system_id: Option<Arc<str>>, // Per system
installation_id: Option<Arc<str>>, // Per app installation (different for dev, nightly, preview, and stable)
session_id: Option<String>, // Per app launch
metrics_id: Option<Arc<str>>, // Per logged-in user
release_channel: Option<&'static str>,
architecture: &'static str,
events_queue: Vec<EventWrapper>,
@@ -191,9 +192,10 @@ impl Telemetry {
settings: *TelemetrySettings::get_global(cx),
architecture: env::consts::ARCH,
release_channel,
system_id: None,
installation_id: None,
metrics_id: None,
session_id: None,
metrics_id: None,
events_queue: Vec::new(),
flush_events_task: None,
log_file: None,
@@ -283,11 +285,13 @@ impl Telemetry {
pub fn start(
self: &Arc<Self>,
system_id: Option<String>,
installation_id: Option<String>,
session_id: String,
cx: &mut AppContext,
) {
let mut state = self.state.lock();
state.system_id = system_id.map(|id| id.into());
state.installation_id = installation_id.map(|id| id.into());
state.session_id = Some(session_id);
state.app_version = release_channel::AppVersion::global(cx).to_string();
@@ -391,6 +395,7 @@ impl Telemetry {
self: &Arc<Self>,
conversation_id: Option<String>,
kind: AssistantKind,
phase: AssistantPhase,
model: String,
response_latency: Option<Duration>,
error_message: Option<String>,
@@ -398,6 +403,7 @@ impl Telemetry {
let event = Event::Assistant(AssistantEvent {
conversation_id,
kind,
phase,
model: model.to_string(),
response_latency,
error_message,
@@ -635,9 +641,10 @@ impl Telemetry {
let state = this.state.lock();
let request_body = EventRequestBody {
system_id: state.system_id.as_deref().map(Into::into),
installation_id: state.installation_id.as_deref().map(Into::into),
metrics_id: state.metrics_id.as_deref().map(Into::into),
session_id: state.session_id.clone(),
metrics_id: state.metrics_id.as_deref().map(Into::into),
is_staff: state.is_staff,
app_version: state.app_version.clone(),
os_name: state.os_name.clone(),
@@ -709,6 +716,7 @@ mod tests {
Utc.with_ymd_and_hms(1990, 4, 12, 12, 0, 0).unwrap(),
));
let http = FakeHttpClient::with_200_response();
let system_id = Some("system_id".to_string());
let installation_id = Some("installation_id".to_string());
let session_id = "session_id".to_string();
@@ -716,7 +724,7 @@ mod tests {
let telemetry = Telemetry::new(clock.clone(), http, cx);
telemetry.state.lock().max_queue_size = 4;
telemetry.start(installation_id, session_id, cx);
telemetry.start(system_id, installation_id, session_id, cx);
assert!(is_empty_state(&telemetry));
@@ -794,13 +802,14 @@ mod tests {
Utc.with_ymd_and_hms(1990, 4, 12, 12, 0, 0).unwrap(),
));
let http = FakeHttpClient::with_200_response();
let system_id = Some("system_id".to_string());
let installation_id = Some("installation_id".to_string());
let session_id = "session_id".to_string();
cx.update(|cx| {
let telemetry = Telemetry::new(clock.clone(), http, cx);
telemetry.state.lock().max_queue_size = 4;
telemetry.start(installation_id, session_id, cx);
telemetry.start(system_id, installation_id, session_id, cx);
assert!(is_empty_state(&telemetry));

View File

@@ -9,6 +9,8 @@ use std::{
pub use system_clock::*;
pub const LOCAL_BRANCH_REPLICA_ID: u16 = u16::MAX;
/// A unique identifier for each distributed node.
pub type ReplicaId = u16;
@@ -25,7 +27,10 @@ pub struct Lamport {
/// A [vector clock](https://en.wikipedia.org/wiki/Vector_clock).
#[derive(Clone, Default, Hash, Eq, PartialEq)]
pub struct Global(SmallVec<[u32; 8]>);
pub struct Global {
values: SmallVec<[u32; 8]>,
local_branch_value: u32,
}
impl Global {
pub fn new() -> Self {
@@ -33,41 +38,51 @@ impl Global {
}
pub fn get(&self, replica_id: ReplicaId) -> Seq {
self.0.get(replica_id as usize).copied().unwrap_or(0) as Seq
if replica_id == LOCAL_BRANCH_REPLICA_ID {
self.local_branch_value
} else {
self.values.get(replica_id as usize).copied().unwrap_or(0) as Seq
}
}
pub fn observe(&mut self, timestamp: Lamport) {
if timestamp.value > 0 {
let new_len = timestamp.replica_id as usize + 1;
if new_len > self.0.len() {
self.0.resize(new_len, 0);
}
if timestamp.replica_id == LOCAL_BRANCH_REPLICA_ID {
self.local_branch_value = cmp::max(self.local_branch_value, timestamp.value);
} else {
let new_len = timestamp.replica_id as usize + 1;
if new_len > self.values.len() {
self.values.resize(new_len, 0);
}
let entry = &mut self.0[timestamp.replica_id as usize];
*entry = cmp::max(*entry, timestamp.value);
let entry = &mut self.values[timestamp.replica_id as usize];
*entry = cmp::max(*entry, timestamp.value);
}
}
}
pub fn join(&mut self, other: &Self) {
if other.0.len() > self.0.len() {
self.0.resize(other.0.len(), 0);
if other.values.len() > self.values.len() {
self.values.resize(other.values.len(), 0);
}
for (left, right) in self.0.iter_mut().zip(&other.0) {
for (left, right) in self.values.iter_mut().zip(&other.values) {
*left = cmp::max(*left, *right);
}
self.local_branch_value = cmp::max(self.local_branch_value, other.local_branch_value);
}
pub fn meet(&mut self, other: &Self) {
if other.0.len() > self.0.len() {
self.0.resize(other.0.len(), 0);
if other.values.len() > self.values.len() {
self.values.resize(other.values.len(), 0);
}
let mut new_len = 0;
for (ix, (left, right)) in self
.0
.values
.iter_mut()
.zip(other.0.iter().chain(iter::repeat(&0)))
.zip(other.values.iter().chain(iter::repeat(&0)))
.enumerate()
{
if *left == 0 {
@@ -80,7 +95,8 @@ impl Global {
new_len = ix + 1;
}
}
self.0.resize(new_len, 0);
self.values.resize(new_len, 0);
self.local_branch_value = cmp::min(self.local_branch_value, other.local_branch_value);
}
pub fn observed(&self, timestamp: Lamport) -> bool {
@@ -88,34 +104,44 @@ impl Global {
}
pub fn observed_any(&self, other: &Self) -> bool {
self.0
self.values
.iter()
.zip(other.0.iter())
.zip(other.values.iter())
.any(|(left, right)| *right > 0 && left >= right)
|| (other.local_branch_value > 0 && self.local_branch_value >= other.local_branch_value)
}
pub fn observed_all(&self, other: &Self) -> bool {
let mut rhs = other.0.iter();
self.0.iter().all(|left| match rhs.next() {
let mut rhs = other.values.iter();
self.values.iter().all(|left| match rhs.next() {
Some(right) => left >= right,
None => true,
}) && rhs.next().is_none()
&& self.local_branch_value >= other.local_branch_value
}
pub fn changed_since(&self, other: &Self) -> bool {
self.0.len() > other.0.len()
self.values.len() > other.values.len()
|| self
.0
.values
.iter()
.zip(other.0.iter())
.zip(other.values.iter())
.any(|(left, right)| left > right)
|| self.local_branch_value > other.local_branch_value
}
pub fn iter(&self) -> impl Iterator<Item = Lamport> + '_ {
self.0.iter().enumerate().map(|(replica_id, seq)| Lamport {
replica_id: replica_id as ReplicaId,
value: *seq,
})
self.values
.iter()
.enumerate()
.map(|(replica_id, seq)| Lamport {
replica_id: replica_id as ReplicaId,
value: *seq,
})
.chain((self.local_branch_value > 0).then_some(Lamport {
replica_id: LOCAL_BRANCH_REPLICA_ID,
value: self.local_branch_value,
}))
}
}
@@ -192,6 +218,9 @@ impl fmt::Debug for Global {
}
write!(f, "{}: {}", timestamp.replica_id, timestamp.value)?;
}
if self.local_branch_value > 0 {
write!(f, "<branch>: {}", self.local_branch_value)?;
}
write!(f, "}}")
}
}
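A minimal sketch of how the split between ordinary replica entries and the local-branch slot behaves, assuming the Lamport fields shown above are public (the test itself is illustrative, not part of the change):
#[test]
fn local_branch_value_is_tracked_separately() {
    let mut global = Global::new();
    // Observing a normal replica grows the `values` vector as before.
    global.observe(Lamport {
        replica_id: 0,
        value: 3,
    });
    assert_eq!(global.get(0), 3);
    // Observing the local-branch replica only bumps `local_branch_value`,
    // so the vector never has to grow to `u16::MAX` entries.
    global.observe(Lamport {
        replica_id: LOCAL_BRANCH_REPLICA_ID,
        value: 7,
    });
    assert_eq!(global.get(LOCAL_BRANCH_REPLICA_ID), 7);
    assert_eq!(global.get(0), 3);
}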

View File

@@ -36,8 +36,8 @@ envy = "0.4.2"
futures.workspace = true
google_ai.workspace = true
hex.workspace = true
isahc_http_client.workspace = true
http_client.workspace = true
isahc_http_client.workspace = true
jsonwebtoken.workspace = true
live_kit_server.workspace = true
log.workspace = true

View File

@@ -18,8 +18,8 @@ use sha2::{Digest, Sha256};
use std::sync::{Arc, OnceLock};
use telemetry_events::{
ActionEvent, AppEvent, AssistantEvent, CallEvent, CpuEvent, EditEvent, EditorEvent, Event,
EventRequestBody, EventWrapper, ExtensionEvent, InlineCompletionEvent, MemoryEvent, ReplEvent,
SettingEvent,
EventRequestBody, EventWrapper, ExtensionEvent, InlineCompletionEvent, MemoryEvent, Panic,
ReplEvent, SettingEvent,
};
use uuid::Uuid;
@@ -149,7 +149,8 @@ pub async fn post_crash(
installation_id = %installation_id,
description = %description,
backtrace = %summary,
"crash report");
"crash report"
);
if let Some(slack_panics_webhook) = app.config.slack_panics_webhook.clone() {
let payload = slack::WebhookBody::new(|w| {
@@ -295,10 +296,11 @@ pub async fn post_panic(
version = %panic.app_version,
os_name = %panic.os_name,
os_version = %panic.os_version.clone().unwrap_or_default(),
installation_id = %panic.installation_id.unwrap_or_default(),
installation_id = %panic.installation_id.clone().unwrap_or_default(),
description = %panic.payload,
backtrace = %panic.backtrace.join("\n"),
"panic report");
"panic report"
);
let backtrace = if panic.backtrace.len() > 25 {
let total = panic.backtrace.len();
@@ -316,6 +318,11 @@ pub async fn post_panic(
} else {
panic.backtrace.join("\n")
};
if !report_to_slack(&panic) {
return Ok(());
}
let backtrace_with_summary = panic.payload + "\n" + &backtrace;
if let Some(slack_panics_webhook) = app.config.slack_panics_webhook.clone() {
@@ -356,6 +363,23 @@ pub async fn post_panic(
Ok(())
}
fn report_to_slack(panic: &Panic) -> bool {
if panic.os_name == "Linux" {
if panic.payload.contains("ERROR_SURFACE_LOST_KHR") {
return false;
}
if panic
.payload
.contains("GPU has crashed, and no debug information is available")
{
return false;
}
}
true
}
pub async fn post_events(
Extension(app): Extension<Arc<AppState>>,
TypedHeader(ZedChecksumHeader(checksum)): TypedHeader<ZedChecksumHeader>,
@@ -627,7 +651,9 @@ where
#[derive(Serialize, Debug, clickhouse::Row)]
pub struct EditorEventRow {
system_id: String,
installation_id: String,
session_id: Option<String>,
metrics_id: String,
operation: String,
app_version: String,
@@ -647,7 +673,6 @@ pub struct EditorEventRow {
historical_event: bool,
architecture: String,
is_staff: Option<bool>,
session_id: Option<String>,
major: Option<i32>,
minor: Option<i32>,
patch: Option<i32>,
@@ -677,9 +702,10 @@ impl EditorEventRow {
os_name: body.os_name.clone(),
os_version: body.os_version.clone().unwrap_or_default(),
architecture: body.architecture.clone(),
system_id: body.system_id.clone().unwrap_or_default(),
installation_id: body.installation_id.clone().unwrap_or_default(),
metrics_id: body.metrics_id.clone().unwrap_or_default(),
session_id: body.session_id.clone(),
metrics_id: body.metrics_id.clone().unwrap_or_default(),
is_staff: body.is_staff,
time: time.timestamp_millis(),
operation: event.operation,
@@ -699,6 +725,7 @@ impl EditorEventRow {
#[derive(Serialize, Debug, clickhouse::Row)]
pub struct InlineCompletionEventRow {
installation_id: String,
session_id: Option<String>,
provider: String,
suggestion_accepted: bool,
app_version: String,
@@ -713,7 +740,6 @@ pub struct InlineCompletionEventRow {
city: String,
time: i64,
is_staff: Option<bool>,
session_id: Option<String>,
major: Option<i32>,
minor: Option<i32>,
patch: Option<i32>,
@@ -834,6 +860,7 @@ pub struct AssistantEventRow {
// AssistantEventRow
conversation_id: String,
kind: String,
phase: String,
model: String,
response_latency_in_ms: Option<i64>,
error_message: Option<String>,
@@ -866,6 +893,7 @@ impl AssistantEventRow {
time: time.timestamp_millis(),
conversation_id: event.conversation_id.unwrap_or_default(),
kind: event.kind.to_string(),
phase: event.phase.to_string(),
model: event.model,
response_latency_in_ms: event
.response_latency
@@ -878,6 +906,7 @@ impl AssistantEventRow {
#[derive(Debug, clickhouse::Row, Serialize)]
pub struct CpuEventRow {
installation_id: Option<String>,
session_id: Option<String>,
is_staff: Option<bool>,
usage_as_percentage: f32,
core_count: u32,
@@ -886,7 +915,6 @@ pub struct CpuEventRow {
os_name: String,
os_version: String,
time: i64,
session_id: Option<String>,
// pub normalized_cpu_usage: f64, MATERIALIZED
major: Option<i32>,
minor: Option<i32>,

View File

@@ -689,9 +689,7 @@ impl Database {
}
let mut text_buffer = text::Buffer::new(0, text::BufferId::new(1).unwrap(), base_text);
text_buffer
.apply_ops(operations.into_iter().filter_map(operation_from_wire))
.unwrap();
text_buffer.apply_ops(operations.into_iter().filter_map(operation_from_wire));
let base_text = text_buffer.text();
let epoch = buffer.epoch + 1;

View File

@@ -96,16 +96,14 @@ async fn test_channel_buffers(db: &Arc<Database>) {
text::BufferId::new(1).unwrap(),
buffer_response_b.base_text,
);
buffer_b
.apply_ops(buffer_response_b.operations.into_iter().map(|operation| {
let operation = proto::deserialize_operation(operation).unwrap();
if let language::Operation::Buffer(operation) = operation {
operation
} else {
unreachable!()
}
}))
.unwrap();
buffer_b.apply_ops(buffer_response_b.operations.into_iter().map(|operation| {
let operation = proto::deserialize_operation(operation).unwrap();
if let language::Operation::Buffer(operation) = operation {
operation
} else {
unreachable!()
}
}));
assert_eq!(buffer_b.text(), "hello, cruel world");

View File

@@ -956,7 +956,6 @@ impl Server {
tracing::info!("connection opened");
let user_agent = format!("Zed Server/{}", env!("CARGO_PKG_VERSION"));
let http_client = match IsahcHttpClient::builder().default_header("User-Agent", user_agent).build() {
Ok(http_client) => Arc::new(IsahcHttpClient::from(http_client)),

View File

@@ -289,7 +289,7 @@ async fn test_basic_following(
.get_open_buffer(&(worktree_id, "2.txt").into(), cx)
.unwrap()
});
let mut result = MultiBuffer::new(0, Capability::ReadWrite);
let mut result = MultiBuffer::new(Capability::ReadWrite);
result.push_excerpts(
buffer_a1,
[ExcerptRange {

View File

@@ -102,7 +102,7 @@ async fn test_sharing_an_ssh_remote_project(
all_language_settings(file, cx)
.language(Some(&("Rust".into())))
.language_servers,
["override-rust-analyzer".into()]
["override-rust-analyzer".to_string()]
)
});

View File

@@ -239,7 +239,6 @@ pub struct Resource {
pub struct ResourceContent {
pub uri: Url,
pub mime_type: Option<String>,
pub content_type: String,
pub text: Option<String>,
pub data: Option<String>,
}

View File

@@ -767,7 +767,7 @@ mod tests {
let buffer_1 = cx.new_model(|cx| Buffer::local("a = 1\nb = 2\n", cx));
let buffer_2 = cx.new_model(|cx| Buffer::local("c = 3\nd = 4\n", cx));
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, language::Capability::ReadWrite);
let mut multibuffer = MultiBuffer::new(language::Capability::ReadWrite);
multibuffer.push_excerpts(
buffer_1.clone(),
[ExcerptRange {
@@ -1018,7 +1018,7 @@ mod tests {
.unwrap();
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, language::Capability::ReadWrite);
let mut multibuffer = MultiBuffer::new(language::Capability::ReadWrite);
multibuffer.push_excerpts(
private_buffer.clone(),
[ExcerptRange {

View File

@@ -11,16 +11,14 @@ pub use smol;
pub use sqlez;
pub use sqlez_macros;
use release_channel::ReleaseChannel;
pub use release_channel::RELEASE_CHANNEL;
use sqlez::domain::Migrator;
use sqlez::thread_safe_connection::ThreadSafeConnection;
use sqlez_macros::sql;
use std::env;
use std::future::Future;
use std::path::Path;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::LazyLock;
use std::sync::{atomic::Ordering, LazyLock};
use std::{env, sync::atomic::AtomicBool};
use util::{maybe, ResultExt};
const CONNECTION_INITIALIZE_QUERY: &str = sql!(
@@ -47,16 +45,12 @@ pub static ALL_FILE_DB_FAILED: LazyLock<AtomicBool> = LazyLock::new(|| AtomicBoo
/// This will retry a couple times if there are failures. If opening fails once, the db directory
/// is moved to a backup folder and a new one is created. If that fails, a shared in memory db is created.
/// In either case, static variables are set so that the user can be notified.
pub async fn open_db<M: Migrator + 'static>(
db_dir: &Path,
release_channel: &ReleaseChannel,
) -> ThreadSafeConnection<M> {
pub async fn open_db<M: Migrator + 'static>(db_dir: &Path, scope: &str) -> ThreadSafeConnection<M> {
if *ZED_STATELESS {
return open_fallback_db().await;
}
let release_channel_name = release_channel.dev_name();
let main_db_dir = db_dir.join(Path::new(&format!("0-{}", release_channel_name)));
let main_db_dir = db_dir.join(format!("0-{}", scope));
let connection = maybe!(async {
smol::fs::create_dir_all(&main_db_dir)
@@ -118,7 +112,7 @@ pub async fn open_test_db<M: Migrator>(db_name: &str) -> ThreadSafeConnection<M>
/// Implements a basic DB wrapper for a given domain
#[macro_export]
macro_rules! define_connection {
(pub static ref $id:ident: $t:ident<()> = $migrations:expr;) => {
(pub static ref $id:ident: $t:ident<()> = $migrations:expr; $($global:ident)?) => {
pub struct $t($crate::sqlez::thread_safe_connection::ThreadSafeConnection<$t>);
impl ::std::ops::Deref for $t {
@@ -139,18 +133,23 @@ macro_rules! define_connection {
}
}
use std::sync::LazyLock;
#[cfg(any(test, feature = "test-support"))]
pub static $id: LazyLock<$t> = LazyLock::new(|| {
pub static $id: std::sync::LazyLock<$t> = std::sync::LazyLock::new(|| {
$t($crate::smol::block_on($crate::open_test_db(stringify!($id))))
});
#[cfg(not(any(test, feature = "test-support")))]
pub static $id: LazyLock<$t> = LazyLock::new(|| {
$t($crate::smol::block_on($crate::open_db($crate::database_dir(), &$crate::RELEASE_CHANNEL)))
pub static $id: std::sync::LazyLock<$t> = std::sync::LazyLock::new(|| {
let db_dir = $crate::database_dir();
let scope = if false $(|| stringify!($global) == "global")? {
"global"
} else {
$crate::RELEASE_CHANNEL.dev_name()
};
$t($crate::smol::block_on($crate::open_db(db_dir, scope)))
});
};
(pub static ref $id:ident: $t:ident<$($d:ty),+> = $migrations:expr;) => {
(pub static ref $id:ident: $t:ident<$($d:ty),+> = $migrations:expr; $($global:ident)?) => {
pub struct $t($crate::sqlez::thread_safe_connection::ThreadSafeConnection<( $($d),+, $t )>);
impl ::std::ops::Deref for $t {
@@ -178,7 +177,13 @@ macro_rules! define_connection {
#[cfg(not(any(test, feature = "test-support")))]
pub static $id: std::sync::LazyLock<$t> = std::sync::LazyLock::new(|| {
$t($crate::smol::block_on($crate::open_db($crate::database_dir(), &$crate::RELEASE_CHANNEL)))
let db_dir = $crate::database_dir();
let scope = if false $(|| stringify!($global) == "global")? {
"global"
} else {
$crate::RELEASE_CHANNEL.dev_name()
};
$t($crate::smol::block_on($crate::open_db(db_dir, scope)))
});
};
}
@@ -225,7 +230,11 @@ mod tests {
.prefix("DbTests")
.tempdir()
.unwrap();
let _bad_db = open_db::<BadDB>(tempdir.path(), &release_channel::ReleaseChannel::Dev).await;
let _bad_db = open_db::<BadDB>(
tempdir.path(),
&release_channel::ReleaseChannel::Dev.dev_name(),
)
.await;
}
/// Test that DB exists but corrupted (causing recreate)
@@ -262,13 +271,19 @@ mod tests {
.tempdir()
.unwrap();
{
let corrupt_db =
open_db::<CorruptedDB>(tempdir.path(), &release_channel::ReleaseChannel::Dev).await;
let corrupt_db = open_db::<CorruptedDB>(
tempdir.path(),
&release_channel::ReleaseChannel::Dev.dev_name(),
)
.await;
assert!(corrupt_db.persistent());
}
let good_db =
open_db::<GoodDB>(tempdir.path(), &release_channel::ReleaseChannel::Dev).await;
let good_db = open_db::<GoodDB>(
tempdir.path(),
&release_channel::ReleaseChannel::Dev.dev_name(),
)
.await;
assert!(
good_db.select_row::<usize>("SELECT * FROM test2").unwrap()()
.unwrap()
@@ -311,8 +326,11 @@ mod tests {
.unwrap();
{
// Setup the bad database
let corrupt_db =
open_db::<CorruptedDB>(tempdir.path(), &release_channel::ReleaseChannel::Dev).await;
let corrupt_db = open_db::<CorruptedDB>(
tempdir.path(),
&release_channel::ReleaseChannel::Dev.dev_name(),
)
.await;
assert!(corrupt_db.persistent());
}
@@ -323,7 +341,7 @@ mod tests {
let guard = thread::spawn(move || {
let good_db = smol::block_on(open_db::<GoodDB>(
tmp_path.as_path(),
&release_channel::ReleaseChannel::Dev,
&release_channel::ReleaseChannel::Dev.dev_name(),
));
assert!(
good_db.select_row::<usize>("SELECT * FROM test2").unwrap()()

View File

@@ -60,3 +60,33 @@ mod tests {
assert_eq!(db.read_kvp("key-1").unwrap(), None);
}
}
define_connection!(pub static ref GLOBAL_KEY_VALUE_STORE: GlobalKeyValueStore<()> =
&[sql!(
CREATE TABLE IF NOT EXISTS kv_store(
key TEXT PRIMARY KEY,
value TEXT NOT NULL
) STRICT;
)];
global
);
impl GlobalKeyValueStore {
query! {
pub fn read_kvp(key: &str) -> Result<Option<String>> {
SELECT value FROM kv_store WHERE key = (?)
}
}
query! {
pub async fn write_kvp(key: String, value: String) -> Result<()> {
INSERT OR REPLACE INTO kv_store(key, value) VALUES ((?), (?))
}
}
query! {
pub async fn delete_kvp(key: String) -> Result<()> {
DELETE FROM kv_store WHERE key = (?)
}
}
}
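A minimal usage sketch, assuming the query! wrappers return anyhow-compatible results as elsewhere in this crate (the key and value here are made up): because of the trailing `global` marker, the macro expands with the "global" scope instead of the release channel's dev name, so Stable, Preview, and Dev all open the same `0-global` database directory and share these key-value pairs.
async fn stash_and_read_back() -> anyhow::Result<Option<String>> {
    // The write goes through the shared, channel-independent connection.
    GLOBAL_KEY_VALUE_STORE
        .write_kvp("example-key".into(), "example-value".into())
        .await?;
    // A read from any release channel observes the same row.
    Ok(GLOBAL_KEY_VALUE_STORE.read_kvp("example-key")?)
}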

View File

@@ -156,12 +156,7 @@ impl ProjectDiagnosticsEditor {
cx.on_focus_out(&focus_handle, |this, _event, cx| this.focus_out(cx))
.detach();
let excerpts = cx.new_model(|cx| {
MultiBuffer::new(
project_handle.read(cx).replica_id(),
project_handle.read(cx).capability(),
)
});
let excerpts = cx.new_model(|cx| MultiBuffer::new(project_handle.read(cx).capability()));
let editor = cx.new_view(|cx| {
let mut editor =
Editor::for_multibuffer(excerpts.clone(), Some(project_handle.clone()), false, cx);

View File

@@ -273,6 +273,7 @@ gpui::actions!(
NextScreen,
OpenExcerpts,
OpenExcerptsSplit,
OpenProposedChangesEditor,
OpenFile,
OpenPermalinkToLine,
OpenUrl,

View File

@@ -1671,7 +1671,7 @@ mod tests {
let mut excerpt_ids = Vec::new();
let multi_buffer = cx.new_model(|cx| {
let mut multi_buffer = MultiBuffer::new(0, Capability::ReadWrite);
let mut multi_buffer = MultiBuffer::new(Capability::ReadWrite);
excerpt_ids.extend(multi_buffer.push_excerpts(
buffer1.clone(),
[ExcerptRange {

View File

@@ -35,6 +35,7 @@ mod lsp_ext;
mod mouse_context_menu;
pub mod movement;
mod persistence;
mod proposed_changes_editor;
mod rust_analyzer_ext;
pub mod scroll;
mod selections_collection;
@@ -46,7 +47,7 @@ mod signature_help;
#[cfg(any(test, feature = "test-support"))]
pub mod test;
use ::git::diff::{DiffHunk, DiffHunkStatus};
use ::git::diff::DiffHunkStatus;
use ::git::{parse_git_remote_url, BuildPermalinkParams, GitHostingProviderRegistry};
pub(crate) use actions::*;
use aho_corasick::AhoCorasick;
@@ -98,6 +99,7 @@ use language::{
};
use language::{point_to_lsp, BufferRow, CharClassifier, Runnable, RunnableRange};
use linked_editing_ranges::refresh_linked_ranges;
use proposed_changes_editor::{ProposedChangesBuffer, ProposedChangesEditor};
use similar::{ChangeTag, TextDiff};
use task::{ResolvedTask, TaskTemplate, TaskVariables};
@@ -113,7 +115,9 @@ pub use multi_buffer::{
Anchor, AnchorRangeExt, ExcerptId, ExcerptRange, MultiBuffer, MultiBufferSnapshot, ToOffset,
ToPoint,
};
use multi_buffer::{ExpandExcerptDirection, MultiBufferPoint, MultiBufferRow, ToOffsetUtf16};
use multi_buffer::{
ExpandExcerptDirection, MultiBufferDiffHunk, MultiBufferPoint, MultiBufferRow, ToOffsetUtf16,
};
use ordered_float::OrderedFloat;
use parking_lot::{Mutex, RwLock};
use project::project_settings::{GitGutterSetting, ProjectSettings};
@@ -2155,10 +2159,6 @@ impl Editor {
});
}
pub fn replica_id(&self, cx: &AppContext) -> ReplicaId {
self.buffer.read(cx).replica_id()
}
pub fn leader_peer_id(&self) -> Option<PeerId> {
self.leader_peer_id
}
@@ -4758,8 +4758,6 @@ impl Editor {
title: String,
mut cx: AsyncWindowContext,
) -> Result<()> {
let replica_id = this.update(&mut cx, |this, cx| this.replica_id(cx))?;
let mut entries = transaction.0.into_iter().collect::<Vec<_>>();
cx.update(|cx| {
entries.sort_unstable_by_key(|(buffer, _)| {
@@ -4802,8 +4800,7 @@ impl Editor {
let mut ranges_to_highlight = Vec::new();
let excerpt_buffer = cx.new_model(|cx| {
let mut multibuffer =
MultiBuffer::new(replica_id, Capability::ReadWrite).with_title(title);
let mut multibuffer = MultiBuffer::new(Capability::ReadWrite).with_title(title);
for (buffer_handle, transaction) in &entries {
let buffer = buffer_handle.read(cx);
ranges_to_highlight.extend(
@@ -6159,7 +6156,7 @@ impl Editor {
pub fn prepare_revert_change(
revert_changes: &mut HashMap<BufferId, Vec<(Range<text::Anchor>, Rope)>>,
multi_buffer: &Model<MultiBuffer>,
hunk: &DiffHunk<MultiBufferRow>,
hunk: &MultiBufferDiffHunk,
cx: &AppContext,
) -> Option<()> {
let buffer = multi_buffer.read(cx).buffer(hunk.buffer_id)?;
@@ -6712,6 +6709,10 @@ impl Editor {
}
pub fn rewrap(&mut self, _: &Rewrap, cx: &mut ViewContext<Self>) {
self.rewrap_impl(true, cx)
}
pub fn rewrap_impl(&mut self, only_text: bool, cx: &mut ViewContext<Self>) {
let buffer = self.buffer.read(cx).snapshot(cx);
let selections = self.selections.all::<Point>(cx);
let mut selections = selections.iter().peekable();
@@ -6732,7 +6733,7 @@ impl Editor {
continue;
}
let mut should_rewrap = false;
let mut should_rewrap = !only_text;
if let Some(language_scope) = buffer.language_scope_at(selection.head()) {
match language_scope.language_name().0.as_ref() {
@@ -6743,9 +6744,31 @@ impl Editor {
}
}
let row = selection.head().row;
let indent_size = buffer.indent_size_for_line(MultiBufferRow(row));
let indent_end = Point::new(row, indent_size.len);
// Since not all lines in the selection may be at the same indent
// level, choose the indent size that is the most common between all
// of the lines.
//
// If there is a tie, we use the deepest indent.
let (indent_size, indent_end) = {
let mut indent_size_occurrences = HashMap::default();
let mut rows_by_indent_size = HashMap::<IndentSize, Vec<u32>>::default();
for row in start_row..=end_row {
let indent = buffer.indent_size_for_line(MultiBufferRow(row));
rows_by_indent_size.entry(indent).or_default().push(row);
*indent_size_occurrences.entry(indent).or_insert(0) += 1;
}
let indent_size = indent_size_occurrences
.into_iter()
.max_by_key(|(indent, count)| (*count, indent.len))
.map(|(indent, _)| indent)
.unwrap_or_default();
let row = rows_by_indent_size[&indent_size][0];
let indent_end = Point::new(row, indent_size.len);
(indent_size, indent_end)
};
let mut line_prefix = indent_size.chars().collect::<String>();
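
A standalone sketch (not Zed code) of the selection rule described in the comment above — the most common indent wins, and a tie is broken by the deeper indent; std's HashMap stands in for the editor's collections type:

use std::collections::HashMap;

// Pick the indent width that occurs on the most lines; on a tie, prefer the deeper indent.
fn dominant_indent(line_indents: &[u32]) -> u32 {
    let mut occurrences: HashMap<u32, usize> = HashMap::new();
    for &indent in line_indents {
        *occurrences.entry(indent).or_insert(0) += 1;
    }
    occurrences
        .into_iter()
        .max_by_key(|&(indent, count)| (count, indent))
        .map(|(indent, _)| indent)
        .unwrap_or(0)
}

// dominant_indent(&[4, 4, 8]) == 4  (most common)
// dominant_indent(&[4, 8])    == 8  (tie broken by depth)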
@@ -6795,10 +6818,22 @@ impl Editor {
let start = Point::new(start_row, 0);
let end = Point::new(end_row, buffer.line_len(MultiBufferRow(end_row)));
let selection_text = buffer.text_for_range(start..end).collect::<String>();
let unwrapped_text = selection_text
let Some(lines_without_prefixes) = selection_text
.lines()
.map(|line| line.strip_prefix(&line_prefix).unwrap())
.join(" ");
.map(|line| {
line.strip_prefix(&line_prefix)
.or_else(|| line.trim_start().strip_prefix(&line_prefix.trim_start()))
.ok_or_else(|| {
anyhow!("line did not start with prefix {line_prefix:?}: {line:?}")
})
})
.collect::<Result<Vec<_>, _>>()
.log_err()
else {
continue;
};
let unwrapped_text = lines_without_prefixes.join(" ");
let wrap_column = buffer
.settings_at(Point::new(start_row, 0), cx)
.preferred_line_length as usize;
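
A standalone sketch (not Zed code) of the more forgiving prefix handling introduced above, which lets unaligned comment lines unwrap instead of panicking on the old unwrap():

// Accept either the exact line prefix or, failing that, the prefix with the
// leading indentation of both sides ignored.
fn strip_comment_prefix<'a>(line: &'a str, prefix: &str) -> Option<&'a str> {
    line.strip_prefix(prefix)
        .or_else(|| line.trim_start().strip_prefix(prefix.trim_start()))
}

// strip_comment_prefix("    // text", "    // ") == Some("text")
// strip_comment_prefix("  // text",   "    // ") == Some("text")  (indent differs)
// strip_comment_prefix("plain text",  "    // ") == None          (line is skipped, not a panic)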
@@ -9307,7 +9342,7 @@ impl Editor {
snapshot: &DisplaySnapshot,
initial_point: Point,
is_wrapped: bool,
hunks: impl Iterator<Item = DiffHunk<MultiBufferRow>>,
hunks: impl Iterator<Item = MultiBufferDiffHunk>,
cx: &mut ViewContext<Editor>,
) -> bool {
let display_point = initial_point.to_display_point(snapshot);
@@ -9610,7 +9645,6 @@ impl Editor {
})
})
} else if !definitions.is_empty() {
let replica_id = self.replica_id(cx);
cx.spawn(|editor, mut cx| async move {
let (title, location_tasks, workspace) = editor
.update(&mut cx, |editor, cx| {
@@ -9663,9 +9697,7 @@ impl Editor {
};
let opened = workspace
.update(&mut cx, |workspace, cx| {
Self::open_locations_in_multibuffer(
workspace, locations, replica_id, title, split, cx,
)
Self::open_locations_in_multibuffer(workspace, locations, title, split, cx)
})
.ok();
@@ -9762,7 +9794,6 @@ impl Editor {
}
let (buffer, head) = multi_buffer.text_anchor_for_position(head, cx)?;
let replica_id = self.replica_id(cx);
let workspace = self.workspace()?;
let project = workspace.read(cx).project().clone();
let references = project.update(cx, |project, cx| project.references(&buffer, head, cx));
@@ -9803,9 +9834,7 @@ impl Editor {
)
})
.unwrap();
Self::open_locations_in_multibuffer(
workspace, locations, replica_id, title, false, cx,
);
Self::open_locations_in_multibuffer(workspace, locations, title, false, cx);
Navigated::Yes
})
}))
@@ -9815,7 +9844,6 @@ impl Editor {
pub fn open_locations_in_multibuffer(
workspace: &mut Workspace,
mut locations: Vec<Location>,
replica_id: ReplicaId,
title: String,
split: bool,
cx: &mut ViewContext<Workspace>,
@@ -9827,7 +9855,7 @@ impl Editor {
let capability = workspace.project().read(cx).capability();
let excerpt_buffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(replica_id, capability);
let mut multibuffer = MultiBuffer::new(capability);
while let Some(location) = locations.next() {
let buffer = location.buffer.read(cx);
let mut ranges_for_buffer = Vec::new();
@@ -11861,6 +11889,52 @@ impl Editor {
self.searchable
}
fn open_proposed_changes_editor(
&mut self,
_: &OpenProposedChangesEditor,
cx: &mut ViewContext<Self>,
) {
let Some(workspace) = self.workspace() else {
cx.propagate();
return;
};
let buffer = self.buffer.read(cx);
let mut new_selections_by_buffer = HashMap::default();
for selection in self.selections.all::<usize>(cx) {
for (buffer, mut range, _) in
buffer.range_to_buffer_ranges(selection.start..selection.end, cx)
{
if selection.reversed {
mem::swap(&mut range.start, &mut range.end);
}
let mut range = range.to_point(buffer.read(cx));
range.start.column = 0;
range.end.column = buffer.read(cx).line_len(range.end.row);
new_selections_by_buffer
.entry(buffer)
.or_insert(Vec::new())
.push(range)
}
}
let proposed_changes_buffers = new_selections_by_buffer
.into_iter()
.map(|(buffer, ranges)| ProposedChangesBuffer { buffer, ranges })
.collect::<Vec<_>>();
let proposed_changes_editor = cx.new_view(|cx| {
ProposedChangesEditor::new(proposed_changes_buffers, self.project.clone(), cx)
});
cx.window_context().defer(move |cx| {
workspace.update(cx, |workspace, cx| {
workspace.active_pane().update(cx, |pane, cx| {
pane.add_item(Box::new(proposed_changes_editor), true, true, None, cx);
});
});
});
}
fn open_excerpts_in_split(&mut self, _: &OpenExcerptsSplit, cx: &mut ViewContext<Self>) {
self.open_excerpts_common(true, cx)
}
@@ -12375,7 +12449,7 @@ impl Editor {
fn hunks_for_selections(
multi_buffer_snapshot: &MultiBufferSnapshot,
selections: &[Selection<Anchor>],
) -> Vec<DiffHunk<MultiBufferRow>> {
) -> Vec<MultiBufferDiffHunk> {
let buffer_rows_for_selections = selections.iter().map(|selection| {
let head = selection.head();
let tail = selection.tail();
@@ -12394,7 +12468,7 @@ fn hunks_for_selections(
pub fn hunks_for_rows(
rows: impl Iterator<Item = Range<MultiBufferRow>>,
multi_buffer_snapshot: &MultiBufferSnapshot,
) -> Vec<DiffHunk<MultiBufferRow>> {
) -> Vec<MultiBufferDiffHunk> {
let mut hunks = Vec::new();
let mut processed_buffer_rows: HashMap<BufferId, HashSet<Range<text::Anchor>>> =
HashMap::default();
@@ -12406,14 +12480,14 @@ pub fn hunks_for_rows(
// when the caret is just above or just below the deleted hunk.
let allow_adjacent = hunk_status(&hunk) == DiffHunkStatus::Removed;
let related_to_selection = if allow_adjacent {
hunk.associated_range.overlaps(&query_rows)
|| hunk.associated_range.start == query_rows.end
|| hunk.associated_range.end == query_rows.start
hunk.row_range.overlaps(&query_rows)
|| hunk.row_range.start == query_rows.end
|| hunk.row_range.end == query_rows.start
} else {
// `selected_multi_buffer_rows` are inclusive (e.g. [2..2] means 2nd row is selected)
// `hunk.associated_range` is exclusive (e.g. [2..3] means 2nd row is selected)
hunk.associated_range.overlaps(&selected_multi_buffer_rows)
|| selected_multi_buffer_rows.end == hunk.associated_range.start
// `hunk.row_range` is exclusive (e.g. [2..3] means 2nd row is selected)
hunk.row_range.overlaps(&selected_multi_buffer_rows)
|| selected_multi_buffer_rows.end == hunk.row_range.start
};
if related_to_selection {
if !processed_buffer_rows
@@ -13714,10 +13788,10 @@ impl RowRangeExt for Range<DisplayRow> {
}
}
fn hunk_status(hunk: &DiffHunk<MultiBufferRow>) -> DiffHunkStatus {
fn hunk_status(hunk: &MultiBufferDiffHunk) -> DiffHunkStatus {
if hunk.diff_base_byte_range.is_empty() {
DiffHunkStatus::Added
} else if hunk.associated_range.is_empty() {
} else if hunk.row_range.is_empty() {
DiffHunkStatus::Removed
} else {
DiffHunkStatus::Modified

View File

@@ -2822,7 +2822,7 @@ fn test_indent_outdent_with_excerpts(cx: &mut TestAppContext) {
Buffer::local("const c: usize = 3;\n", cx).with_language(rust_language, cx)
});
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
multibuffer.push_excerpts(
toml_buffer.clone(),
[ExcerptRange {
@@ -4249,6 +4249,78 @@ async fn test_rewrap(cx: &mut TestAppContext) {
cx.update_editor(|e, cx| e.rewrap(&Rewrap, cx));
cx.assert_editor_state(wrapped_text);
}
// Test rewrapping unaligned comments in a selection.
{
let language = Arc::new(Language::new(
LanguageConfig {
line_comments: vec!["// ".into(), "/// ".into()],
..LanguageConfig::default()
},
Some(tree_sitter_rust::LANGUAGE.into()),
));
cx.update_buffer(|buffer, cx| buffer.set_language(Some(language), cx));
let unwrapped_text = indoc! {"
fn foo() {
if true {
« // Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus mollis elit purus, a ornare lacus gravida vitae.
// Praesent semper egestas tellus id dignissim.ˇ»
do_something();
} else {
//
}
}
"};
let wrapped_text = indoc! {"
fn foo() {
if true {
// Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus
// mollis elit purus, a ornare lacus gravida vitae. Praesent semper
// egestas tellus id dignissim.ˇ
do_something();
} else {
//
}
}
"};
cx.set_state(unwrapped_text);
cx.update_editor(|e, cx| e.rewrap(&Rewrap, cx));
cx.assert_editor_state(wrapped_text);
let unwrapped_text = indoc! {"
fn foo() {
if true {
«ˇ // Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus mollis elit purus, a ornare lacus gravida vitae.
// Praesent semper egestas tellus id dignissim.»
do_something();
} else {
//
}
}
"};
let wrapped_text = indoc! {"
fn foo() {
if true {
// Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus
// mollis elit purus, a ornare lacus gravida vitae. Praesent semper
// egestas tellus id dignissim.ˇ
do_something();
} else {
//
}
}
"};
cx.set_state(unwrapped_text);
cx.update_editor(|e, cx| e.rewrap(&Rewrap, cx));
cx.assert_editor_state(wrapped_text);
}
}
#[gpui::test]
@@ -6671,7 +6743,7 @@ async fn test_multibuffer_format_during_save(cx: &mut gpui::TestAppContext) {
.unwrap();
let multi_buffer = cx.new_model(|cx| {
let mut multi_buffer = MultiBuffer::new(0, ReadWrite);
let mut multi_buffer = MultiBuffer::new(ReadWrite);
multi_buffer.push_excerpts(
buffer_1.clone(),
[
@@ -8614,7 +8686,7 @@ fn test_editing_disjoint_excerpts(cx: &mut TestAppContext) {
let buffer = cx.new_model(|cx| Buffer::local(sample_text(3, 4, 'a'), cx));
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
multibuffer.push_excerpts(
buffer.clone(),
[
@@ -8698,7 +8770,7 @@ fn test_editing_overlapping_excerpts(cx: &mut TestAppContext) {
});
let buffer = cx.new_model(|cx| Buffer::local(initial_text, cx));
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
multibuffer.push_excerpts(buffer, excerpt_ranges, cx);
multibuffer
});
@@ -8757,7 +8829,7 @@ fn test_refresh_selections(cx: &mut TestAppContext) {
let buffer = cx.new_model(|cx| Buffer::local(sample_text(3, 4, 'a'), cx));
let mut excerpt1_id = None;
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
excerpt1_id = multibuffer
.push_excerpts(
buffer.clone(),
@@ -8842,7 +8914,7 @@ fn test_refresh_selections_while_selecting_with_mouse(cx: &mut TestAppContext) {
let buffer = cx.new_model(|cx| Buffer::local(sample_text(3, 4, 'a'), cx));
let mut excerpt1_id = None;
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
excerpt1_id = multibuffer
.push_excerpts(
buffer.clone(),
@@ -9230,7 +9302,7 @@ async fn test_following_with_multiple_excerpts(cx: &mut gpui::TestAppContext) {
let cx = &mut VisualTestContext::from_window(*workspace.deref(), cx);
let leader = pane.update(cx, |_, cx| {
let multibuffer = cx.new_model(|_| MultiBuffer::new(0, ReadWrite));
let multibuffer = cx.new_model(|_| MultiBuffer::new(ReadWrite));
cx.new_view(|cx| build_editor(multibuffer.clone(), cx))
});
@@ -10685,7 +10757,7 @@ async fn test_multibuffer_reverts(cx: &mut gpui::TestAppContext) {
diff_every_buffer_row(&buffer_3, sample_text_3.clone(), cols, cx);
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
multibuffer.push_excerpts(
buffer_1.clone(),
[
@@ -10825,7 +10897,7 @@ async fn test_mutlibuffer_in_navigation_history(cx: &mut gpui::TestAppContext) {
let buffer_3 = cx.new_model(|cx| Buffer::local(sample_text_3.clone(), cx));
let multi_buffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
multibuffer.push_excerpts(
buffer_1.clone(),
[
@@ -11764,7 +11836,7 @@ async fn test_toggle_diff_expand_in_multi_buffer(cx: &mut gpui::TestAppContext)
});
let multi_buffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
multibuffer.push_excerpts(
buffer_1.clone(),
[

View File

@@ -346,6 +346,7 @@ impl EditorElement {
register_action(view, cx, Editor::toggle_code_actions);
register_action(view, cx, Editor::open_excerpts);
register_action(view, cx, Editor::open_excerpts_in_split);
register_action(view, cx, Editor::open_proposed_changes_editor);
register_action(view, cx, Editor::toggle_soft_wrap);
register_action(view, cx, Editor::toggle_tab_bar);
register_action(view, cx, Editor::toggle_line_numbers);
@@ -1931,7 +1932,7 @@ impl EditorElement {
content_origin: gpui::Point<Pixels>,
cx: &mut WindowContext,
) -> SmallVec<[AnyElement; 1]> {
let mut inline_elements = SmallVec::new();
let mut line_elements = SmallVec::new();
for (ix, line) in line_layouts.iter_mut().enumerate() {
let row = start_row + DisplayRow(ix as u32);
line.prepaint(
@@ -1939,11 +1940,11 @@ impl EditorElement {
scroll_pixel_position,
row,
content_origin,
&mut inline_elements,
&mut line_elements,
cx,
);
}
inline_elements
line_elements
}
#[allow(clippy::too_many_arguments)]
@@ -3478,24 +3479,10 @@ impl EditorElement {
whitespace_setting,
invisible_display_ranges,
cx,
);
let line_height = layout.position_map.line_height;
let line_y = line_height
* (row.as_f32() - layout.position_map.scroll_pixel_position.y / line_height);
cx.paint_quad(PaintQuad {
bounds: Bounds::new(
layout.hitbox.bounds.upper_right()
+ point(px(-20.) - self.style.scrollbar_width, line_y),
size(px(20.), line_height - px(1.)),
),
background: Hsla::red(),
..Default::default()
});
)
}
for line_element in &mut layout.inline_elements {
for line_element in &mut layout.line_elements {
line_element.paint(cx);
}
}
@@ -3724,11 +3711,11 @@ impl EditorElement {
)
.map(|hunk| {
let start_display_row =
MultiBufferPoint::new(hunk.associated_range.start.0, 0)
MultiBufferPoint::new(hunk.row_range.start.0, 0)
.to_display_point(&snapshot.display_snapshot)
.row();
let mut end_display_row =
MultiBufferPoint::new(hunk.associated_range.end.0, 0)
MultiBufferPoint::new(hunk.row_range.end.0, 0)
.to_display_point(&snapshot.display_snapshot)
.row();
if end_display_row != start_display_row {
@@ -4344,7 +4331,7 @@ pub(crate) struct LineWithInvisibles {
#[allow(clippy::large_enum_variant)]
enum LineFragment {
Text(ShapedLine),
InlineElement {
Element {
element: Option<AnyElement>,
size: Size<Pixels>,
len: usize,
@@ -4355,7 +4342,7 @@ impl fmt::Debug for LineFragment {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match self {
LineFragment::Text(shaped_line) => f.debug_tuple("Text").field(shaped_line).finish(),
LineFragment::InlineElement { size, len, .. } => f
LineFragment::Element { size, len, .. } => f
.debug_struct("Element")
.field("size", size)
.field("len", len)
@@ -4440,7 +4427,7 @@ impl LineWithInvisibles {
width += size.width;
len += highlighted_chunk.text.len();
fragments.push(LineFragment::InlineElement {
fragments.push(LineFragment::Element {
element: Some(element),
size,
len: highlighted_chunk.text.len(),
@@ -4556,7 +4543,7 @@ impl LineWithInvisibles {
LineFragment::Text(line) => {
fragment_origin.x += line.width;
}
LineFragment::InlineElement { element, size, .. } => {
LineFragment::Element { element, size, .. } => {
let mut element = element
.take()
.expect("you can't prepaint LineWithInvisibles twice");
@@ -4595,7 +4582,7 @@ impl LineWithInvisibles {
line.paint(fragment_origin, line_height, cx).log_err();
fragment_origin.x += line.width;
}
LineFragment::InlineElement { size, .. } => {
LineFragment::Element { size, .. } => {
fragment_origin.x += size.width;
}
}
@@ -4729,7 +4716,7 @@ impl LineWithInvisibles {
fragment_start_x += shaped_line.width;
fragment_start_index = fragment_end_index;
}
LineFragment::InlineElement { len, size, .. } => {
LineFragment::Element { len, size, .. } => {
let fragment_end_index = fragment_start_index + len;
if index < fragment_end_index {
return fragment_start_x;
@@ -4759,7 +4746,7 @@ impl LineWithInvisibles {
fragment_start_x = fragment_end_x;
fragment_start_index += shaped_line.len;
}
LineFragment::InlineElement { len, size, .. } => {
LineFragment::Element { len, size, .. } => {
let fragment_end_x = fragment_start_x + size.width;
if x < fragment_end_x {
return Some(fragment_start_index);
@@ -4785,7 +4772,7 @@ impl LineWithInvisibles {
}
fragment_start_index = fragment_end_index;
}
LineFragment::InlineElement { len, .. } => {
LineFragment::Element { len, .. } => {
let fragment_end_index = fragment_start_index + len;
if index < fragment_end_index {
return None;
@@ -5339,7 +5326,7 @@ impl Element for EditorElement {
}
});
let inline_elements = self.prepaint_lines(
let line_elements = self.prepaint_lines(
start_row,
&mut line_layouts,
line_height,
@@ -5610,7 +5597,7 @@ impl Element for EditorElement {
highlighted_ranges,
highlighted_gutter_ranges,
redacted_ranges,
inline_elements,
line_elements,
line_numbers,
blamed_display_rows,
inline_blame,
@@ -5748,7 +5735,7 @@ pub struct EditorLayout {
visible_display_row_range: Range<DisplayRow>,
active_rows: BTreeMap<DisplayRow, bool>,
highlighted_rows: BTreeMap<DisplayRow, Hsla>,
inline_elements: SmallVec<[AnyElement; 1]>,
line_elements: SmallVec<[AnyElement; 1]>,
line_numbers: Vec<Option<ShapedLine>>,
display_hunks: Vec<(DisplayDiffHunk, Option<Hitbox>)>,
blamed_display_rows: Option<Vec<AnyElement>>,

View File

@@ -2,9 +2,9 @@ pub mod blame;
use std::ops::Range;
use git::diff::{DiffHunk, DiffHunkStatus};
use git::diff::DiffHunkStatus;
use language::Point;
use multi_buffer::{Anchor, MultiBufferRow};
use multi_buffer::{Anchor, MultiBufferDiffHunk};
use crate::{
display_map::{DisplaySnapshot, ToDisplayPoint},
@@ -49,25 +49,25 @@ impl DisplayDiffHunk {
}
pub fn diff_hunk_to_display(
hunk: &DiffHunk<MultiBufferRow>,
hunk: &MultiBufferDiffHunk,
snapshot: &DisplaySnapshot,
) -> DisplayDiffHunk {
let hunk_start_point = Point::new(hunk.associated_range.start.0, 0);
let hunk_start_point_sub = Point::new(hunk.associated_range.start.0.saturating_sub(1), 0);
let hunk_start_point = Point::new(hunk.row_range.start.0, 0);
let hunk_start_point_sub = Point::new(hunk.row_range.start.0.saturating_sub(1), 0);
let hunk_end_point_sub = Point::new(
hunk.associated_range
hunk.row_range
.end
.0
.saturating_sub(1)
.max(hunk.associated_range.start.0),
.max(hunk.row_range.start.0),
0,
);
let status = hunk_status(hunk);
let is_removal = status == DiffHunkStatus::Removed;
let folds_start = Point::new(hunk.associated_range.start.0.saturating_sub(2), 0);
let folds_end = Point::new(hunk.associated_range.end.0 + 2, 0);
let folds_start = Point::new(hunk.row_range.start.0.saturating_sub(2), 0);
let folds_end = Point::new(hunk.row_range.end.0 + 2, 0);
let folds_range = folds_start..folds_end;
let containing_fold = snapshot.folds_in_range(folds_range).find(|fold| {
@@ -87,7 +87,7 @@ pub fn diff_hunk_to_display(
} else {
let start = hunk_start_point.to_display_point(snapshot).row();
let hunk_end_row = hunk.associated_range.end.max(hunk.associated_range.start);
let hunk_end_row = hunk.row_range.end.max(hunk.row_range.start);
let hunk_end_point = Point::new(hunk_end_row.0, 0);
let multi_buffer_start = snapshot.buffer_snapshot.anchor_after(hunk_start_point);
@@ -195,7 +195,7 @@ mod tests {
cx.background_executor.run_until_parked();
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, ReadWrite);
let mut multibuffer = MultiBuffer::new(ReadWrite);
multibuffer.push_excerpts(
buffer_1.clone(),
[
@@ -288,7 +288,7 @@ mod tests {
assert_eq!(
snapshot
.git_diff_hunks_in_range(MultiBufferRow(0)..MultiBufferRow(12))
.map(|hunk| (hunk_status(&hunk), hunk.associated_range))
.map(|hunk| (hunk_status(&hunk), hunk.row_range))
.collect::<Vec<_>>(),
&expected,
);
@@ -296,7 +296,7 @@ mod tests {
assert_eq!(
snapshot
.git_diff_hunks_in_range_rev(MultiBufferRow(0)..MultiBufferRow(12))
.map(|hunk| (hunk_status(&hunk), hunk.associated_range))
.map(|hunk| (hunk_status(&hunk), hunk.row_range))
.collect::<Vec<_>>(),
expected
.iter()

View File

@@ -4,11 +4,12 @@ use std::{
};
use collections::{hash_map, HashMap, HashSet};
use git::diff::{DiffHunk, DiffHunkStatus};
use git::diff::DiffHunkStatus;
use gpui::{Action, AppContext, CursorStyle, Hsla, Model, MouseButton, Subscription, Task, View};
use language::Buffer;
use multi_buffer::{
Anchor, AnchorRangeExt, ExcerptRange, MultiBuffer, MultiBufferRow, MultiBufferSnapshot, ToPoint,
Anchor, AnchorRangeExt, ExcerptRange, MultiBuffer, MultiBufferDiffHunk, MultiBufferRow,
MultiBufferSnapshot, ToPoint,
};
use settings::SettingsStore;
use text::{BufferId, Point};
@@ -190,9 +191,9 @@ impl Editor {
.buffer_snapshot
.git_diff_hunks_in_range(MultiBufferRow::MIN..MultiBufferRow::MAX)
.filter(|hunk| {
let hunk_display_row_range = Point::new(hunk.associated_range.start.0, 0)
let hunk_display_row_range = Point::new(hunk.row_range.start.0, 0)
.to_display_point(&snapshot.display_snapshot)
..Point::new(hunk.associated_range.end.0, 0)
..Point::new(hunk.row_range.end.0, 0)
.to_display_point(&snapshot.display_snapshot);
let row_range_end =
display_rows_with_expanded_hunks.get(&hunk_display_row_range.start.row());
@@ -203,7 +204,7 @@ impl Editor {
fn toggle_hunks_expanded(
&mut self,
hunks_to_toggle: Vec<DiffHunk<MultiBufferRow>>,
hunks_to_toggle: Vec<MultiBufferDiffHunk>,
cx: &mut ViewContext<Self>,
) {
let previous_toggle_task = self.expanded_hunks.hunk_update_tasks.remove(&None);
@@ -274,8 +275,8 @@ impl Editor {
});
for remaining_hunk in hunks_to_toggle {
let remaining_hunk_point_range =
Point::new(remaining_hunk.associated_range.start.0, 0)
..Point::new(remaining_hunk.associated_range.end.0, 0);
Point::new(remaining_hunk.row_range.start.0, 0)
..Point::new(remaining_hunk.row_range.end.0, 0);
hunks_to_expand.push(HoveredHunk {
status: hunk_status(&remaining_hunk),
multi_buffer_range: remaining_hunk_point_range
@@ -705,7 +706,7 @@ impl Editor {
fn to_diff_hunk(
hovered_hunk: &HoveredHunk,
multi_buffer_snapshot: &MultiBufferSnapshot,
) -> Option<DiffHunk<MultiBufferRow>> {
) -> Option<MultiBufferDiffHunk> {
let buffer_id = hovered_hunk
.multi_buffer_range
.start
@@ -716,9 +717,8 @@ fn to_diff_hunk(
let point_range = hovered_hunk
.multi_buffer_range
.to_point(multi_buffer_snapshot);
Some(DiffHunk {
associated_range: MultiBufferRow(point_range.start.row)
..MultiBufferRow(point_range.end.row),
Some(MultiBufferDiffHunk {
row_range: MultiBufferRow(point_range.start.row)..MultiBufferRow(point_range.end.row),
buffer_id,
buffer_range,
diff_base_byte_range: hovered_hunk.diff_base_byte_range.clone(),
@@ -764,7 +764,7 @@ fn editor_with_deleted_text(
let parent_editor = cx.view().downgrade();
let editor = cx.new_view(|cx| {
let multi_buffer =
cx.new_model(|_| MultiBuffer::without_headers(0, language::Capability::ReadOnly));
cx.new_model(|_| MultiBuffer::without_headers(language::Capability::ReadOnly));
multi_buffer.update(cx, |multi_buffer, cx| {
multi_buffer.push_excerpts(
diff_base_buffer,
@@ -868,7 +868,7 @@ fn editor_with_deleted_text(
fn buffer_diff_hunk(
buffer_snapshot: &MultiBufferSnapshot,
row_range: Range<Point>,
) -> Option<DiffHunk<MultiBufferRow>> {
) -> Option<MultiBufferDiffHunk> {
let mut hunks = buffer_snapshot.git_diff_hunks_in_range(
MultiBufferRow(row_range.start.row)..MultiBufferRow(row_range.end.row),
);

View File

@@ -2607,7 +2607,7 @@ pub mod tests {
.await
.unwrap();
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, Capability::ReadWrite);
let mut multibuffer = MultiBuffer::new(Capability::ReadWrite);
multibuffer.push_excerpts(
buffer_1.clone(),
[
@@ -2957,7 +2957,7 @@ pub mod tests {
})
.await
.unwrap();
let multibuffer = cx.new_model(|_| MultiBuffer::new(0, Capability::ReadWrite));
let multibuffer = cx.new_model(|_| MultiBuffer::new(Capability::ReadWrite));
let (buffer_1_excerpts, buffer_2_excerpts) = multibuffer.update(cx, |multibuffer, cx| {
let buffer_1_excerpts = multibuffer.push_excerpts(
buffer_1.clone(),

View File

@@ -68,7 +68,6 @@ impl FollowableItem for Editor {
unreachable!()
};
let replica_id = project.read(cx).replica_id();
let buffer_ids = state
.excerpts
.iter()
@@ -92,7 +91,7 @@ impl FollowableItem for Editor {
if state.singleton && buffers.len() == 1 {
multibuffer = MultiBuffer::singleton(buffers.pop().unwrap(), cx)
} else {
multibuffer = MultiBuffer::new(replica_id, project.read(cx).capability());
multibuffer = MultiBuffer::new(project.read(cx).capability());
let mut excerpts = state.excerpts.into_iter().peekable();
while let Some(excerpt) = excerpts.peek() {
let Ok(buffer_id) = BufferId::new(excerpt.buffer_id) else {
@@ -1087,10 +1086,14 @@ impl SerializableItem for Editor {
let workspace_id = workspace.database_id()?;
let buffer = self.buffer().read(cx).as_singleton()?;
let path = buffer
.read(cx)
.file()
.map(|file| file.full_path(cx))
.and_then(|full_path| project.read(cx).find_project_path(&full_path, cx))
.and_then(|project_path| project.read(cx).absolute_path(&project_path, cx));
let is_dirty = buffer.read(cx).is_dirty();
let local_file = buffer.read(cx).file().and_then(|file| file.as_local());
let path = local_file.map(|file| file.abs_path(cx));
let mtime = buffer.read(cx).saved_mtime();
let snapshot = buffer.read(cx).snapshot();

View File

@@ -928,7 +928,7 @@ mod tests {
let buffer = cx.new_model(|cx| Buffer::local("abc\ndefg\nhijkl\nmn", cx));
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, Capability::ReadWrite);
let mut multibuffer = MultiBuffer::new(Capability::ReadWrite);
multibuffer.push_excerpts(
buffer.clone(),
[

View File

@@ -0,0 +1,125 @@
use crate::{Editor, EditorEvent};
use collections::HashSet;
use futures::{channel::mpsc, future::join_all};
use gpui::{AppContext, EventEmitter, FocusableView, Model, Render, Subscription, Task, View};
use language::{Buffer, BufferEvent, Capability};
use multi_buffer::{ExcerptRange, MultiBuffer};
use project::Project;
use smol::stream::StreamExt;
use std::{ops::Range, time::Duration};
use text::ToOffset;
use ui::prelude::*;
use workspace::Item;
pub struct ProposedChangesEditor {
editor: View<Editor>,
_subscriptions: Vec<Subscription>,
_recalculate_diffs_task: Task<Option<()>>,
recalculate_diffs_tx: mpsc::UnboundedSender<Model<Buffer>>,
}
pub struct ProposedChangesBuffer<T> {
pub buffer: Model<Buffer>,
pub ranges: Vec<Range<T>>,
}
impl ProposedChangesEditor {
pub fn new<T: ToOffset>(
buffers: Vec<ProposedChangesBuffer<T>>,
project: Option<Model<Project>>,
cx: &mut ViewContext<Self>,
) -> Self {
let mut subscriptions = Vec::new();
let multibuffer = cx.new_model(|_| MultiBuffer::new(Capability::ReadWrite));
for buffer in buffers {
let branch_buffer = buffer.buffer.update(cx, |buffer, cx| buffer.branch(cx));
subscriptions.push(cx.subscribe(&branch_buffer, Self::on_buffer_event));
multibuffer.update(cx, |multibuffer, cx| {
multibuffer.push_excerpts(
branch_buffer,
buffer.ranges.into_iter().map(|range| ExcerptRange {
context: range,
primary: None,
}),
cx,
);
});
}
let (recalculate_diffs_tx, mut recalculate_diffs_rx) = mpsc::unbounded();
Self {
editor: cx
.new_view(|cx| Editor::for_multibuffer(multibuffer.clone(), project, true, cx)),
recalculate_diffs_tx,
_recalculate_diffs_task: cx.spawn(|_, mut cx| async move {
let mut buffers_to_diff = HashSet::default();
while let Some(buffer) = recalculate_diffs_rx.next().await {
buffers_to_diff.insert(buffer);
loop {
cx.background_executor()
.timer(Duration::from_millis(250))
.await;
let mut had_further_changes = false;
while let Ok(next_buffer) = recalculate_diffs_rx.try_next() {
buffers_to_diff.insert(next_buffer?);
had_further_changes = true;
}
if !had_further_changes {
break;
}
}
join_all(buffers_to_diff.drain().filter_map(|buffer| {
buffer
.update(&mut cx, |buffer, cx| buffer.recalculate_diff(cx))
.ok()?
}))
.await;
}
None
}),
_subscriptions: subscriptions,
}
}
fn on_buffer_event(
&mut self,
buffer: Model<Buffer>,
event: &BufferEvent,
_cx: &mut ViewContext<Self>,
) {
if let BufferEvent::Edited = event {
self.recalculate_diffs_tx.unbounded_send(buffer).ok();
}
}
}
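
A standalone, std-only sketch (not Zed code) of the debounce strategy that _recalculate_diffs_task implements above: keep absorbing edited buffers until a 250ms quiet period passes, then recalculate diffs once for the whole batch. The real task does the same thing with smol's timer, an unbounded channel, and an awaited recalculate_diff per branch buffer.

use std::collections::HashSet;
use std::sync::mpsc::Receiver;
use std::thread;
use std::time::Duration;

fn debounce_loop(rx: Receiver<u64>) {
    let mut pending: HashSet<u64> = HashSet::new();
    while let Ok(buffer_id) = rx.recv() {
        pending.insert(buffer_id);
        loop {
            // Wait for a quiet period; if more edits arrived meanwhile, keep waiting.
            thread::sleep(Duration::from_millis(250));
            let mut had_further_changes = false;
            while let Ok(next_id) = rx.try_recv() {
                pending.insert(next_id);
                had_further_changes = true;
            }
            if !had_further_changes {
                break;
            }
        }
        println!("recalculating diffs for {} buffer(s)", pending.len());
        pending.clear();
    }
}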
impl Render for ProposedChangesEditor {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
self.editor.clone()
}
}
impl FocusableView for ProposedChangesEditor {
fn focus_handle(&self, cx: &AppContext) -> gpui::FocusHandle {
self.editor.focus_handle(cx)
}
}
impl EventEmitter<EditorEvent> for ProposedChangesEditor {}
impl Item for ProposedChangesEditor {
type Event = EditorEvent;
fn tab_icon(&self, _cx: &ui::WindowContext) -> Option<Icon> {
Some(Icon::new(IconName::Pencil))
}
fn tab_content_text(&self, _cx: &WindowContext) -> Option<SharedString> {
Some("Proposed changes".into())
}
}

View File

@@ -108,16 +108,16 @@ pub fn editor_hunks(
.buffer_snapshot
.git_diff_hunks_in_range(MultiBufferRow::MIN..MultiBufferRow::MAX)
.map(|hunk| {
let display_range = Point::new(hunk.associated_range.start.0, 0)
let display_range = Point::new(hunk.row_range.start.0, 0)
.to_display_point(snapshot)
.row()
..Point::new(hunk.associated_range.end.0, 0)
..Point::new(hunk.row_range.end.0, 0)
.to_display_point(snapshot)
.row();
let (_, buffer, _) = editor
.buffer()
.read(cx)
.excerpt_containing(Point::new(hunk.associated_range.start.0, 0), cx)
.excerpt_containing(Point::new(hunk.row_range.start.0, 0), cx)
.expect("no excerpt for expanded buffer's hunk start");
let diff_base = buffer
.read(cx)

View File

@@ -75,7 +75,7 @@ impl EditorTestContext {
cx: &mut gpui::TestAppContext,
excerpts: [&str; COUNT],
) -> EditorTestContext {
let mut multibuffer = MultiBuffer::new(0, language::Capability::ReadWrite);
let mut multibuffer = MultiBuffer::new(language::Capability::ReadWrite);
let buffer = cx.new_model(|cx| {
for excerpt in excerpts.into_iter() {
let (text, ranges) = marked_text_ranges(excerpt, false);

View File

@@ -12,13 +12,16 @@ use language::LanguageRegistry;
use node_runtime::FakeNodeRuntime;
use open_ai::OpenAiEmbeddingModel;
use project::Project;
use semantic_index::{OpenAiEmbeddingProvider, ProjectIndex, SemanticDb, Status};
use semantic_index::{
EmbeddingProvider, OpenAiEmbeddingProvider, ProjectIndex, SemanticDb, Status,
};
use serde::{Deserialize, Serialize};
use settings::SettingsStore;
use smol::channel::bounded;
use smol::io::AsyncReadExt;
use smol::Timer;
use std::ops::RangeInclusive;
use std::path::PathBuf;
use std::time::Duration;
use std::{
fs,
@@ -237,6 +240,14 @@ async fn fetch_code_search_net_resources(http_client: &dyn HttpClient) -> Result
Ok(())
}
#[derive(Default, Debug)]
struct Counts {
covered_results: usize,
overlapped_results: usize,
covered_files: usize,
total_results: usize,
}
async fn run_evaluation(
only_repo: Option<String>,
executor: &BackgroundExecutor,
@@ -297,12 +308,11 @@ async fn run_evaluation(
cx.update(|cx| languages::init(language_registry.clone(), node_runtime.clone(), cx))
.unwrap();
let mut covered_result_count = 0;
let mut overlapped_result_count = 0;
let mut covered_file_count = 0;
let mut total_result_count = 0;
let mut counts = Counts::default();
eprint!("Running evals.");
let mut failures = Vec::new();
for evaluation_project in evaluations {
if only_repo
.as_ref()
@@ -314,27 +324,24 @@ async fn run_evaluation(
eprint!("\r\x1B[2K");
eprint!(
"Running evals. {}/{} covered. {}/{} overlapped. {}/{} files captured. Project: {}...",
covered_result_count,
total_result_count,
overlapped_result_count,
total_result_count,
covered_file_count,
total_result_count,
counts.covered_results,
counts.total_results,
counts.overlapped_results,
counts.total_results,
counts.covered_files,
counts.total_results,
evaluation_project.repo
);
let repo_db_path =
db_path.join(format!("{}.db", evaluation_project.repo.replace('/', "_")));
let mut semantic_index = SemanticDb::new(repo_db_path, embedding_provider.clone(), cx)
.await
.unwrap();
let repo_dir = repos_dir.join(&evaluation_project.repo);
if !repo_dir.exists() || repo_dir.join(SKIP_EVAL_PATH).exists() {
eprintln!("Skipping {}: directory not found", evaluation_project.repo);
continue;
}
let repo_db_path =
db_path.join(format!("{}.db", evaluation_project.repo.replace('/', "_")));
let project = cx
.update(|cx| {
Project::local(
@@ -349,116 +356,193 @@ async fn run_evaluation(
})
.unwrap();
let (worktree, _) = project
.update(cx, |project, cx| {
project.find_or_create_worktree(repo_dir, true, cx)
})?
.await?;
let repo = evaluation_project.repo.clone();
if let Err(err) = run_eval_project(
evaluation_project,
&user_store,
repo_db_path,
&repo_dir,
&mut counts,
project,
embedding_provider.clone(),
fs.clone(),
cx,
)
.await
{
eprintln!("{repo} eval failed with error: {:?}", err);
worktree
.update(cx, |worktree, _| {
worktree.as_local().unwrap().scan_complete()
})
.unwrap()
.await;
let project_index = cx
.update(|cx| semantic_index.create_project_index(project.clone(), cx))
.unwrap();
wait_for_indexing_complete(&project_index, cx, Some(Duration::from_secs(120))).await;
for query in evaluation_project.queries {
let results = cx
.update(|cx| {
let project_index = project_index.read(cx);
project_index.search(query.query.clone(), SEARCH_RESULT_LIMIT, cx)
})
.unwrap()
.await
.unwrap();
let results = SemanticDb::load_results(results, &fs.clone(), &cx)
.await
.unwrap();
let mut project_covered_result_count = 0;
let mut project_overlapped_result_count = 0;
let mut project_covered_file_count = 0;
let mut covered_result_indices = Vec::new();
for expected_result in &query.expected_results {
let mut file_matched = false;
let mut range_overlapped = false;
let mut range_covered = false;
for (ix, result) in results.iter().enumerate() {
if result.path.as_ref() == Path::new(&expected_result.file) {
file_matched = true;
let start_matched =
result.row_range.contains(&expected_result.lines.start());
let end_matched = result.row_range.contains(&expected_result.lines.end());
if start_matched || end_matched {
range_overlapped = true;
}
if start_matched && end_matched {
range_covered = true;
covered_result_indices.push(ix);
break;
}
}
}
if range_covered {
project_covered_result_count += 1
};
if range_overlapped {
project_overlapped_result_count += 1
};
if file_matched {
project_covered_file_count += 1
};
}
let outcome_repo = evaluation_project.repo.clone();
let query_results = EvaluationQueryOutcome {
repo: outcome_repo,
query: query.query,
total_result_count: query.expected_results.len(),
covered_result_count: project_covered_result_count,
overlapped_result_count: project_overlapped_result_count,
covered_file_count: project_covered_file_count,
expected_results: query.expected_results,
actual_results: results
.iter()
.map(|result| EvaluationSearchResult {
file: result.path.to_string_lossy().to_string(),
lines: result.row_range.clone(),
})
.collect(),
covered_result_indices,
};
overlapped_result_count += query_results.overlapped_result_count;
covered_result_count += query_results.covered_result_count;
covered_file_count += query_results.covered_file_count;
total_result_count += query_results.total_result_count;
println!("{}", serde_json::to_string(&query_results).unwrap());
failures.push((repo, err));
}
}
eprint!(
"Running evals. {}/{} covered. {}/{} overlapped. {}/{} files captured.",
covered_result_count,
total_result_count,
overlapped_result_count,
total_result_count,
covered_file_count,
total_result_count,
eprintln!(
"Running evals. {}/{} covered. {}/{} overlapped. {}/{} files captured. {} failed.",
counts.covered_results,
counts.total_results,
counts.overlapped_results,
counts.total_results,
counts.covered_files,
counts.total_results,
failures.len(),
);
Ok(())
if failures.is_empty() {
Ok(())
} else {
eprintln!("Failures:\n");
for (index, (repo, failure)) in failures.iter().enumerate() {
eprintln!("Failure #{} - {repo}\n{:?}", index + 1, failure);
}
Err(anyhow::anyhow!("Some evals failed."))
}
}
#[allow(clippy::too_many_arguments)]
async fn run_eval_project(
evaluation_project: EvaluationProject,
user_store: &Model<UserStore>,
repo_db_path: PathBuf,
repo_dir: &Path,
counts: &mut Counts,
project: Model<Project>,
embedding_provider: Arc<dyn EmbeddingProvider>,
fs: Arc<dyn Fs>,
cx: &mut AsyncAppContext,
) -> Result<(), anyhow::Error> {
let mut semantic_index = SemanticDb::new(repo_db_path, embedding_provider, cx).await?;
let (worktree, _) = project
.update(cx, |project, cx| {
project.find_or_create_worktree(repo_dir, true, cx)
})?
.await?;
worktree
.update(cx, |worktree, _| {
worktree.as_local().unwrap().scan_complete()
})?
.await;
let project_index = cx.update(|cx| semantic_index.create_project_index(project.clone(), cx))?;
wait_for_indexing_complete(&project_index, cx, Some(Duration::from_secs(120))).await;
for query in evaluation_project.queries {
let results = {
// Retry search up to 3 times in case of timeout, network failure, etc.
let mut retries_remaining = 3;
let mut result;
loop {
match cx.update(|cx| {
let project_index = project_index.read(cx);
project_index.search(vec![query.query.clone()], SEARCH_RESULT_LIMIT, cx)
}) {
Ok(task) => match task.await {
Ok(answer) => {
result = Ok(answer);
break;
}
Err(err) => {
result = Err(err);
}
},
Err(err) => {
result = Err(err);
}
}
if retries_remaining > 0 {
eprintln!(
"Retrying search after it failed on query {:?} with {:?}",
query, result
);
retries_remaining -= 1;
} else {
eprintln!(
"Ran out of retries; giving up on search which failed on query {:?} with {:?}",
query, result
);
break;
}
}
SemanticDb::load_results(result?, &fs.clone(), &cx).await?
};
let mut project_covered_result_count = 0;
let mut project_overlapped_result_count = 0;
let mut project_covered_file_count = 0;
let mut covered_result_indices = Vec::new();
for expected_result in &query.expected_results {
let mut file_matched = false;
let mut range_overlapped = false;
let mut range_covered = false;
for (ix, result) in results.iter().enumerate() {
if result.path.as_ref() == Path::new(&expected_result.file) {
file_matched = true;
let start_matched = result.row_range.contains(&expected_result.lines.start());
let end_matched = result.row_range.contains(&expected_result.lines.end());
if start_matched || end_matched {
range_overlapped = true;
}
if start_matched && end_matched {
range_covered = true;
covered_result_indices.push(ix);
break;
}
}
}
if range_covered {
project_covered_result_count += 1
};
if range_overlapped {
project_overlapped_result_count += 1
};
if file_matched {
project_covered_file_count += 1
};
}
let outcome_repo = evaluation_project.repo.clone();
let query_results = EvaluationQueryOutcome {
repo: outcome_repo,
query: query.query,
total_result_count: query.expected_results.len(),
covered_result_count: project_covered_result_count,
overlapped_result_count: project_overlapped_result_count,
covered_file_count: project_covered_file_count,
expected_results: query.expected_results,
actual_results: results
.iter()
.map(|result| EvaluationSearchResult {
file: result.path.to_string_lossy().to_string(),
lines: result.row_range.clone(),
})
.collect(),
covered_result_indices,
};
counts.overlapped_results += query_results.overlapped_result_count;
counts.covered_results += query_results.covered_result_count;
counts.covered_files += query_results.covered_file_count;
counts.total_results += query_results.total_result_count;
println!("{}", serde_json::to_string(&query_results)?);
}
user_store.update(cx, |_, _| {
drop(semantic_index);
drop(project);
drop(worktree);
drop(project_index);
})
}
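
A standalone sketch (not Zed code) of the bounded retry pattern used for the semantic search call above; the helper name and signature are illustrative only:

async fn with_retries<T, E, Fut>(
    mut op: impl FnMut() -> Fut,
    max_retries: usize,
) -> Result<T, E>
where
    Fut: std::future::Future<Output = Result<T, E>>,
    E: std::fmt::Debug,
{
    let mut retries_remaining = max_retries;
    loop {
        match op().await {
            Ok(value) => return Ok(value),
            Err(err) if retries_remaining > 0 => {
                // Log and retry, mirroring the eprintln! in the eval loop.
                eprintln!("retrying after error: {err:?}");
                retries_remaining -= 1;
            }
            Err(err) => return Err(err),
        }
    }
}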
async fn wait_for_indexing_complete(
@@ -515,7 +599,7 @@ async fn fetch_eval_repos(
let evaluations = fs::read(&evaluations_path).expect("failed to read evaluations.json");
let evaluations: Vec<EvaluationProject> = serde_json::from_slice(&evaluations).unwrap();
eprint!("Fetching evaluation repositories...");
eprintln!("Fetching evaluation repositories...");
executor
.scoped(move |scope| {

View File

@@ -827,6 +827,7 @@ impl ExtensionStore {
let mut extension_manifest =
ExtensionManifest::load(fs.clone(), &extension_source_path).await?;
let extension_id = extension_manifest.id.clone();
if !this.update(&mut cx, |this, cx| {
match this.outstanding_operations.entry(extension_id.clone()) {
btree_map::Entry::Occupied(_) => return false,
@@ -850,6 +851,7 @@ impl ExtensionStore {
.ok();
}
});
cx.background_executor()
.spawn({
let extension_source_path = extension_source_path.clone();
@@ -880,8 +882,10 @@ impl ExtensionStore {
bail!("extension {extension_id} is already installed");
}
}
fs.create_symlink(output_path, extension_source_path)
.await?;
this.update(&mut cx, |this, cx| this.reload(None, cx))?
.await;
Ok(())

View File

@@ -457,8 +457,6 @@ async fn test_extension_store(cx: &mut TestAppContext) {
});
}
// TODO remove
#[ignore]
#[gpui::test]
async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
init_test(cx);
@@ -564,10 +562,6 @@ async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
let mut encoder = GzipEncoder::new(BufReader::new(bytes.as_slice()));
encoder.read_to_end(&mut gzipped_bytes).await.unwrap();
Ok(Response::new(gzipped_bytes.into()))
// } else if uri == WASI_ADAPTER_URL {
// let binary_contents =
// include_bytes!("wasi_snapshot_preview1.reactor.wasm").as_slice();
// Ok(Response::new(binary_contents.into()))
} else {
Ok(Response::builder().status(404).body("not found".into())?)
}

View File

@@ -9,10 +9,10 @@ use futures::{io::BufReader, FutureExt as _};
use futures::{lock::Mutex, AsyncReadExt};
use indexed_docs::IndexedDocsDatabase;
use isahc::config::{Configurable, RedirectPolicy};
use language::LanguageName;
use language::{
language_settings::AllLanguageSettings, LanguageServerBinaryStatus, LspAdapterDelegate,
};
use language::{LanguageName, LanguageServerName};
use project::project_settings::ProjectSettings;
use semantic_version::SemanticVersion;
use std::{
@@ -366,7 +366,7 @@ impl ExtensionImports for WasmState {
.and_then(|key| {
ProjectSettings::get(location, cx)
.lsp
.get(&Arc::<str>::from(key))
.get(&LanguageServerName(key.into()))
})
.cloned()
.unwrap_or_default();

View File

@@ -9,10 +9,10 @@ use futures::{io::BufReader, FutureExt as _};
use futures::{lock::Mutex, AsyncReadExt};
use indexed_docs::IndexedDocsDatabase;
use isahc::config::{Configurable, RedirectPolicy};
use language::LanguageName;
use language::{
language_settings::AllLanguageSettings, LanguageServerBinaryStatus, LspAdapterDelegate,
};
use language::{LanguageName, LanguageServerName};
use project::project_settings::ProjectSettings;
use semantic_version::SemanticVersion;
use std::{
@@ -412,7 +412,7 @@ impl ExtensionImports for WasmState {
.and_then(|key| {
ProjectSettings::get(location, cx)
.lsp
.get(&Arc::<str>::from(key))
.get(&LanguageServerName::from_proto(key))
})
.cloned()
.unwrap_or_default();

View File

@@ -44,8 +44,8 @@ const FEEDBACK_SUBMISSION_ERROR_TEXT: &str =
struct FeedbackRequestBody<'a> {
feedback_text: &'a str,
email: Option<String>,
metrics_id: Option<Arc<str>>,
installation_id: Option<Arc<str>>,
metrics_id: Option<Arc<str>>,
system_specs: SystemSpecs,
is_staff: bool,
}
@@ -296,16 +296,16 @@ impl FeedbackModal {
}
let telemetry = zed_client.telemetry();
let metrics_id = telemetry.metrics_id();
let installation_id = telemetry.installation_id();
let metrics_id = telemetry.metrics_id();
let is_staff = telemetry.is_staff();
let http_client = zed_client.http_client();
let feedback_endpoint = http_client.build_url("/api/feedback");
let request = FeedbackRequestBody {
feedback_text,
email,
metrics_id,
installation_id,
metrics_id,
system_specs,
is_staff: is_staff.unwrap_or(false),
};

View File

@@ -16,14 +16,17 @@ doctest = false
anyhow.workspace = true
collections.workspace = true
editor.workspace = true
file_icons.workspace = true
futures.workspace = true
fuzzy.workspace = true
gpui.workspace = true
menu.workspace = true
picker.workspace = true
project.workspace = true
schemars.workspace = true
settings.workspace = true
serde.workspace = true
serde_derive.workspace = true
text.workspace = true
theme.workspace = true
ui.workspace = true

View File

@@ -1,11 +1,14 @@
#[cfg(test)]
mod file_finder_tests;
mod file_finder_settings;
mod new_path_prompt;
mod open_path_prompt;
use collections::HashMap;
use editor::{scroll::Autoscroll, Bias, Editor};
use file_finder_settings::FileFinderSettings;
use file_icons::FileIcons;
use fuzzy::{CharBag, PathMatch, PathMatchCandidate};
use gpui::{
actions, rems, Action, AnyElement, AppContext, DismissEvent, EventEmitter, FocusHandle,
@@ -28,7 +31,7 @@ use std::{
use text::Point;
use ui::{prelude::*, HighlightedLabel, ListItem, ListItemSpacing};
use util::{paths::PathWithPosition, post_inc, ResultExt};
use workspace::{item::PreviewTabsSettings, ModalView, Workspace};
use workspace::{item::PreviewTabsSettings, notifications::NotifyResultExt, ModalView, Workspace};
actions!(file_finder, [SelectPrev]);
@@ -39,7 +42,12 @@ pub struct FileFinder {
init_modifiers: Option<Modifiers>,
}
pub fn init_settings(cx: &mut AppContext) {
FileFinderSettings::register(cx);
}
pub fn init(cx: &mut AppContext) {
init_settings(cx);
cx.observe_new_views(FileFinder::register).detach();
cx.observe_new_views(NewPathPrompt::register).detach();
cx.observe_new_views(OpenPathPrompt::register).detach();
@@ -1003,7 +1011,7 @@ impl PickerDelegate for FileFinderDelegate {
let finder = self.file_finder.clone();
cx.spawn(|_, mut cx| async move {
let item = open_task.await.log_err()?;
let item = open_task.await.notify_async_err(&mut cx)?;
if let Some(row) = row {
if let Some(active_editor) = item.downcast::<Editor>() {
active_editor
@@ -1041,12 +1049,14 @@ impl PickerDelegate for FileFinderDelegate {
selected: bool,
cx: &mut ViewContext<Picker<Self>>,
) -> Option<Self::ListItem> {
let settings = FileFinderSettings::get_global(cx);
let path_match = self
.matches
.get(ix)
.expect("Invalid matches state: no element for index {ix}");
let icon = match &path_match {
let history_icon = match &path_match {
Match::History { .. } => Icon::new(IconName::HistoryRerun)
.color(Color::Muted)
.size(IconSize::Small)
@@ -1059,10 +1069,17 @@ impl PickerDelegate for FileFinderDelegate {
let (file_name, file_name_positions, full_path, full_path_positions) =
self.labels_for_match(path_match, cx, ix);
let file_icon = if settings.file_icons {
FileIcons::get_icon(Path::new(&file_name), cx).map(Icon::from_path)
} else {
None
};
Some(
ListItem::new(ix)
.spacing(ListItemSpacing::Sparse)
.end_slot::<AnyElement>(Some(icon))
.start_slot::<Icon>(file_icon)
.end_slot::<AnyElement>(history_icon)
.inset(true)
.selected(selected)
.child(

View File

@@ -0,0 +1,27 @@
use anyhow::Result;
use schemars::JsonSchema;
use serde_derive::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
#[derive(Deserialize, Debug, Clone, Copy, PartialEq)]
pub struct FileFinderSettings {
pub file_icons: bool,
}
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema, Debug)]
pub struct FileFinderSettingsContent {
/// Whether to show file icons in the file finder.
///
/// Default: true
pub file_icons: Option<bool>,
}
impl Settings for FileFinderSettings {
const KEY: Option<&'static str> = Some("file_finder");
type FileContent = FileFinderSettingsContent;
fn load(sources: SettingsSources<Self::FileContent>, _: &mut gpui::AppContext) -> Result<Self> {
sources.json_merge()
}
}

View File

@@ -1,7 +1,7 @@
use rope::Rope;
use std::{iter, ops::Range};
use sum_tree::SumTree;
use text::{Anchor, BufferId, BufferSnapshot, OffsetRangeExt, Point};
use text::{Anchor, BufferSnapshot, OffsetRangeExt, Point};
pub use git2 as libgit;
use libgit::{DiffLineType as GitDiffLineType, DiffOptions as GitOptions, Patch as GitPatch};
@@ -13,29 +13,30 @@ pub enum DiffHunkStatus {
Removed,
}
/// A diff hunk, representing a range of consequent lines in a singleton buffer, associated with a generic range.
/// A diff hunk resolved to rows in the buffer.
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct DiffHunk<T> {
/// E.g. a range in multibuffer, that has an excerpt added, singleton buffer for which has this diff hunk.
/// Consider a singleton buffer with 10 lines, all of them are modified — so a corresponding diff hunk would have a range 0..10.
/// And a multibuffer with the excerpt of lines 2-6 from the singleton buffer.
/// If the multibuffer is searched for diff hunks, the associated range would be multibuffer rows, corresponding to rows 2..6 from the singleton buffer.
/// But the hunk range would be 0..10, same for any other excerpts from the same singleton buffer.
pub associated_range: Range<T>,
/// Singleton buffer ID this hunk belongs to.
pub buffer_id: BufferId,
/// A consequent range of lines in the singleton buffer, that were changed and produced this diff hunk.
pub struct DiffHunk {
/// The buffer range, expressed in terms of rows.
pub row_range: Range<u32>,
/// The range in the buffer to which this hunk corresponds.
pub buffer_range: Range<Anchor>,
/// Original singleton buffer text before the change, that was instead of the `buffer_range`.
/// The range in the buffer's diff base text to which this hunk corresponds.
pub diff_base_byte_range: Range<usize>,
}
impl sum_tree::Item for DiffHunk<Anchor> {
/// We store [`InternalDiffHunk`]s internally so we don't need to store the additional row range.
#[derive(Debug, Clone)]
struct InternalDiffHunk {
buffer_range: Range<Anchor>,
diff_base_byte_range: Range<usize>,
}
impl sum_tree::Item for InternalDiffHunk {
type Summary = DiffHunkSummary;
fn summary(&self) -> Self::Summary {
DiffHunkSummary {
buffer_range: self.associated_range.clone(),
buffer_range: self.buffer_range.clone(),
}
}
}
@@ -64,7 +65,7 @@ impl sum_tree::Summary for DiffHunkSummary {
#[derive(Debug, Clone)]
pub struct BufferDiff {
last_buffer_version: Option<clock::Global>,
tree: SumTree<DiffHunk<Anchor>>,
tree: SumTree<InternalDiffHunk>,
}
impl BufferDiff {
@@ -79,11 +80,12 @@ impl BufferDiff {
self.tree.is_empty()
}
#[cfg(any(test, feature = "test-support"))]
pub fn hunks_in_row_range<'a>(
&'a self,
range: Range<u32>,
buffer: &'a BufferSnapshot,
) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
) -> impl 'a + Iterator<Item = DiffHunk> {
let start = buffer.anchor_before(Point::new(range.start, 0));
let end = buffer.anchor_after(Point::new(range.end, 0));
@@ -94,7 +96,7 @@ impl BufferDiff {
&'a self,
range: Range<Anchor>,
buffer: &'a BufferSnapshot,
) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
) -> impl 'a + Iterator<Item = DiffHunk> {
let mut cursor = self
.tree
.filter::<_, DiffHunkSummary>(buffer, move |summary| {
@@ -109,11 +111,8 @@ impl BufferDiff {
})
.flat_map(move |hunk| {
[
(
&hunk.associated_range.start,
hunk.diff_base_byte_range.start,
),
(&hunk.associated_range.end, hunk.diff_base_byte_range.end),
(&hunk.buffer_range.start, hunk.diff_base_byte_range.start),
(&hunk.buffer_range.end, hunk.diff_base_byte_range.end),
]
.into_iter()
});
@@ -129,10 +128,9 @@ impl BufferDiff {
}
Some(DiffHunk {
associated_range: start_point.row..end_point.row,
row_range: start_point.row..end_point.row,
diff_base_byte_range: start_base..end_base,
buffer_range: buffer.anchor_before(start_point)..buffer.anchor_after(end_point),
buffer_id: buffer.remote_id(),
})
})
}
@@ -141,7 +139,7 @@ impl BufferDiff {
&'a self,
range: Range<Anchor>,
buffer: &'a BufferSnapshot,
) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
) -> impl 'a + Iterator<Item = DiffHunk> {
let mut cursor = self
.tree
.filter::<_, DiffHunkSummary>(buffer, move |summary| {
@@ -154,7 +152,7 @@ impl BufferDiff {
cursor.prev(buffer);
let hunk = cursor.item()?;
let range = hunk.associated_range.to_point(buffer);
let range = hunk.buffer_range.to_point(buffer);
let end_row = if range.end.column > 0 {
range.end.row + 1
} else {
@@ -162,10 +160,9 @@ impl BufferDiff {
};
Some(DiffHunk {
associated_range: range.start.row..end_row,
row_range: range.start.row..end_row,
diff_base_byte_range: hunk.diff_base_byte_range.clone(),
buffer_range: hunk.buffer_range.clone(),
buffer_id: hunk.buffer_id,
})
})
}
@@ -196,7 +193,7 @@ impl BufferDiff {
}
#[cfg(test)]
fn hunks<'a>(&'a self, text: &'a BufferSnapshot) -> impl 'a + Iterator<Item = DiffHunk<u32>> {
fn hunks<'a>(&'a self, text: &'a BufferSnapshot) -> impl 'a + Iterator<Item = DiffHunk> {
let start = text.anchor_before(Point::new(0, 0));
let end = text.anchor_after(Point::new(u32::MAX, u32::MAX));
self.hunks_intersecting_range(start..end, text)
@@ -229,7 +226,7 @@ impl BufferDiff {
hunk_index: usize,
buffer: &text::BufferSnapshot,
buffer_row_divergence: &mut i64,
) -> DiffHunk<Anchor> {
) -> InternalDiffHunk {
let line_item_count = patch.num_lines_in_hunk(hunk_index).unwrap();
assert!(line_item_count > 0);
@@ -284,11 +281,9 @@ impl BufferDiff {
let start = Point::new(buffer_row_range.start, 0);
let end = Point::new(buffer_row_range.end, 0);
let buffer_range = buffer.anchor_before(start)..buffer.anchor_before(end);
DiffHunk {
associated_range: buffer_range.clone(),
InternalDiffHunk {
buffer_range,
diff_base_byte_range,
buffer_id: buffer.remote_id(),
}
}
}
@@ -302,17 +297,16 @@ pub fn assert_hunks<Iter>(
diff_base: &str,
expected_hunks: &[(Range<u32>, &str, &str)],
) where
Iter: Iterator<Item = DiffHunk<u32>>,
Iter: Iterator<Item = DiffHunk>,
{
let actual_hunks = diff_hunks
.map(|hunk| {
(
hunk.associated_range.clone(),
hunk.row_range.clone(),
&diff_base[hunk.diff_base_byte_range],
buffer
.text_for_range(
Point::new(hunk.associated_range.start, 0)
..Point::new(hunk.associated_range.end, 0),
Point::new(hunk.row_range.start, 0)..Point::new(hunk.row_range.end, 0),
)
.collect::<String>(),
)

View File

@@ -1063,7 +1063,7 @@ impl IDWriteTextRenderer_Impl for TextRenderer_Impl {
// This `cast()` action here should never fail since we are running on Win10+, and
// `IDWriteFontFace3` requires Win10
let font_face = &font_face.cast::<IDWriteFontFace3>().unwrap();
let Some((font_identifier, font_struct, is_emoji)) =
let Some((font_identifier, font_struct, color_font)) =
get_font_identifier_and_font_struct(font_face, &self.locale)
else {
return Ok(());
@@ -1084,6 +1084,8 @@ impl IDWriteTextRenderer_Impl for TextRenderer_Impl {
context
.index_converter
.advance_to_utf16_ix(context.utf16_index);
let is_emoji = color_font
&& is_color_glyph(font_face, id, &context.text_system.components.factory);
glyphs.push(ShapedGlyph {
id,
position: point(px(context.width), px(0.0)),
@@ -1446,6 +1448,44 @@ fn get_render_target_property(
}
}
// One would think the newer DirectWrite method IDWriteFontFace4::GetGlyphImageFormats would cover this,
// but it doesn't seem to work for some glyphs, say ❤
fn is_color_glyph(
font_face: &IDWriteFontFace3,
glyph_id: GlyphId,
factory: &IDWriteFactory5,
) -> bool {
let glyph_run = DWRITE_GLYPH_RUN {
fontFace: unsafe { std::mem::transmute_copy(font_face) },
fontEmSize: 14.0,
glyphCount: 1,
glyphIndices: &(glyph_id.0 as u16),
glyphAdvances: &0.0,
glyphOffsets: &DWRITE_GLYPH_OFFSET {
advanceOffset: 0.0,
ascenderOffset: 0.0,
},
isSideways: BOOL(0),
bidiLevel: 0,
};
unsafe {
factory.TranslateColorGlyphRun(
D2D_POINT_2F::default(),
&glyph_run as _,
None,
DWRITE_GLYPH_IMAGE_FORMATS_COLR
| DWRITE_GLYPH_IMAGE_FORMATS_SVG
| DWRITE_GLYPH_IMAGE_FORMATS_PNG
| DWRITE_GLYPH_IMAGE_FORMATS_JPEG
| DWRITE_GLYPH_IMAGE_FORMATS_PREMULTIPLIED_B8G8R8A8,
DWRITE_MEASURING_MODE_NATURAL,
None,
0,
)
}
.is_ok()
}
const DEFAULT_LOCALE_NAME: PCWSTR = windows::core::w!("en-US");
const BRUSH_COLOR: D2D1_COLOR_F = D2D1_COLOR_F {
r: 1.0,

View File

@@ -9,6 +9,13 @@ use util::arc_cow::ArcCow;
#[derive(Deref, DerefMut, Eq, PartialEq, PartialOrd, Ord, Hash, Clone)]
pub struct SharedString(ArcCow<'static, str>);
impl SharedString {
/// creates a static SharedString
pub const fn new_static(s: &'static str) -> Self {
Self(ArcCow::Borrowed(s))
}
}
impl Default for SharedString {
fn default() -> Self {
Self(ArcCow::Owned(Arc::default()))

View File

@@ -4929,7 +4929,7 @@ impl From<(&'static str, u32)> for ElementId {
/// A rectangle to be rendered in the window at the given position and size.
/// Passed as an argument [`WindowContext::paint_quad`].
#[derive(Clone, Default)]
#[derive(Clone)]
pub struct PaintQuad {
/// The bounds of the quad within the window.
pub bounds: Bounds<Pixels>,

View File

@@ -15,8 +15,8 @@ test-support = []
path = "src/isahc_http_client.rs"
[dependencies]
anyhow.workspace = true
futures.workspace = true
http_client.workspace = true
isahc.workspace = true
futures.workspace = true
anyhow.workspace = true
util.workspace = true

View File

@@ -21,8 +21,8 @@ use async_watch as watch;
pub use clock::ReplicaId;
use futures::channel::oneshot;
use gpui::{
AnyElement, AppContext, EventEmitter, HighlightStyle, ModelContext, Pixels, Task, TaskLabel,
WindowContext,
AnyElement, AppContext, Context as _, EventEmitter, HighlightStyle, Model, ModelContext,
Pixels, Task, TaskLabel, WindowContext,
};
use lsp::LanguageServerId;
use parking_lot::Mutex;
@@ -84,11 +84,17 @@ pub enum Capability {
pub type BufferRow = u32;
#[derive(Clone)]
enum BufferDiffBase {
Git(Rope),
PastBufferVersion(Model<Buffer>, BufferSnapshot),
}
/// An in-memory representation of a source code file, including its text,
/// syntax trees, git status, and diagnostics.
pub struct Buffer {
text: TextBuffer,
diff_base: Option<Rope>,
diff_base: Option<BufferDiffBase>,
git_diff: git::diff::BufferDiff,
file: Option<Arc<dyn File>>,
/// The mtime of the file when this buffer was last loaded from
@@ -121,6 +127,7 @@ pub struct Buffer {
/// Memoize calls to has_changes_since(saved_version).
/// The contents of a cell are (self.version, has_changes) at the time of a last call.
has_unsaved_edits: Cell<(clock::Global, bool)>,
_subscriptions: Vec<gpui::Subscription>,
}
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
@@ -144,7 +151,7 @@ pub struct BufferSnapshot {
/// The kind and amount of indentation in a particular line. For now,
/// assumes that indentation is all the same character.
#[derive(Clone, Copy, Debug, PartialEq, Eq, Default)]
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash, Default)]
pub struct IndentSize {
/// The number of bytes that comprise the indentation.
pub len: u32,
@@ -153,7 +160,7 @@ pub struct IndentSize {
}
/// A whitespace character that's used for indentation.
#[derive(Clone, Copy, Debug, PartialEq, Eq, Default)]
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash, Default)]
pub enum IndentKind {
/// An ASCII space character.
#[default]
@@ -308,7 +315,10 @@ pub enum Operation {
pub enum BufferEvent {
/// The buffer was changed in a way that must be
/// propagated to its other replicas.
Operation(Operation),
Operation {
operation: Operation,
is_local: bool,
},
/// The buffer was edited.
Edited,
/// The buffer's `dirty` bit changed.
@@ -644,7 +654,7 @@ impl Buffer {
id: self.remote_id().into(),
file: self.file.as_ref().map(|f| f.to_proto(cx)),
base_text: self.base_text().to_string(),
diff_base: self.diff_base.as_ref().map(|h| h.to_string()),
diff_base: self.diff_base().as_ref().map(|h| h.to_string()),
line_ending: proto::serialize_line_ending(self.line_ending()) as i32,
saved_version: proto::serialize_version(&self.saved_version),
saved_mtime: self.saved_mtime.map(|time| time.into()),
@@ -734,12 +744,10 @@ impl Buffer {
was_dirty_before_starting_transaction: None,
has_unsaved_edits: Cell::new((buffer.version(), false)),
text: buffer,
diff_base: diff_base
.map(|mut raw_diff_base| {
LineEnding::normalize(&mut raw_diff_base);
raw_diff_base
})
.map(Rope::from),
diff_base: diff_base.map(|mut raw_diff_base| {
LineEnding::normalize(&mut raw_diff_base);
BufferDiffBase::Git(Rope::from(raw_diff_base))
}),
diff_base_version: 0,
git_diff,
file,
@@ -759,6 +767,7 @@ impl Buffer {
completion_triggers_timestamp: Default::default(),
deferred_ops: OperationQueue::new(),
has_conflict: false,
_subscriptions: Vec::new(),
}
}
@@ -782,6 +791,52 @@ impl Buffer {
}
}
pub fn branch(&mut self, cx: &mut ModelContext<Self>) -> Model<Self> {
let this = cx.handle();
cx.new_model(|cx| {
let mut branch = Self {
diff_base: Some(BufferDiffBase::PastBufferVersion(
this.clone(),
self.snapshot(),
)),
language: self.language.clone(),
has_conflict: self.has_conflict,
has_unsaved_edits: Cell::new(self.has_unsaved_edits.get_mut().clone()),
_subscriptions: vec![cx.subscribe(&this, |branch: &mut Self, _, event, cx| {
if let BufferEvent::Operation { operation, .. } = event {
branch.apply_ops([operation.clone()], cx);
branch.diff_base_version += 1;
}
})],
..Self::build(
self.text.branch(),
None,
self.file.clone(),
self.capability(),
)
};
if let Some(language_registry) = self.language_registry() {
branch.set_language_registry(language_registry);
}
branch
})
}
pub fn merge(&mut self, branch: &Model<Self>, cx: &mut ModelContext<Self>) {
let branch = branch.read(cx);
let edits = branch
.edits_since::<usize>(&self.version)
.map(|edit| {
(
edit.old,
branch.text_for_range(edit.new).collect::<String>(),
)
})
.collect::<Vec<_>>();
self.edit(edits, None, cx);
}
#[cfg(test)]
pub(crate) fn as_text_snapshot(&self) -> &text::BufferSnapshot {
&self.text
@@ -961,20 +1016,23 @@ impl Buffer {
/// Returns the current diff base, see [Buffer::set_diff_base].
pub fn diff_base(&self) -> Option<&Rope> {
self.diff_base.as_ref()
match self.diff_base.as_ref()? {
BufferDiffBase::Git(rope) => Some(rope),
BufferDiffBase::PastBufferVersion(_, buffer_snapshot) => {
Some(buffer_snapshot.as_rope())
}
}
}
/// Sets the text that will be used to compute a Git diff
/// against the buffer text.
pub fn set_diff_base(&mut self, diff_base: Option<String>, cx: &mut ModelContext<Self>) {
self.diff_base = diff_base
.map(|mut raw_diff_base| {
LineEnding::normalize(&mut raw_diff_base);
raw_diff_base
})
.map(Rope::from);
self.diff_base = diff_base.map(|mut raw_diff_base| {
LineEnding::normalize(&mut raw_diff_base);
BufferDiffBase::Git(Rope::from(raw_diff_base))
});
self.diff_base_version += 1;
if let Some(recalc_task) = self.git_diff_recalc(cx) {
if let Some(recalc_task) = self.recalculate_diff(cx) {
cx.spawn(|buffer, mut cx| async move {
recalc_task.await;
buffer
@@ -992,14 +1050,21 @@ impl Buffer {
self.diff_base_version
}
/// Recomputes the Git diff status.
pub fn git_diff_recalc(&mut self, cx: &mut ModelContext<Self>) -> Option<Task<()>> {
let diff_base = self.diff_base.clone()?;
/// Recomputes the diff.
pub fn recalculate_diff(&mut self, cx: &mut ModelContext<Self>) -> Option<Task<()>> {
let diff_base_rope = match self.diff_base.as_mut()? {
BufferDiffBase::Git(rope) => rope.clone(),
BufferDiffBase::PastBufferVersion(base_buffer, base_buffer_snapshot) => {
let new_base_snapshot = base_buffer.read(cx).snapshot();
*base_buffer_snapshot = new_base_snapshot;
base_buffer_snapshot.as_rope().clone()
}
};
let snapshot = self.snapshot();
let mut diff = self.git_diff.clone();
let diff = cx.background_executor().spawn(async move {
diff.update(&diff_base, &snapshot).await;
diff.update(&diff_base_rope, &snapshot).await;
diff
});
@@ -1169,7 +1234,7 @@ impl Buffer {
lamport_timestamp,
};
self.apply_diagnostic_update(server_id, diagnostics, lamport_timestamp, cx);
self.send_operation(op, cx);
self.send_operation(op, true, cx);
}
fn request_autoindent(&mut self, cx: &mut ModelContext<Self>) {
@@ -1743,6 +1808,7 @@ impl Buffer {
lamport_timestamp,
cursor_shape,
},
true,
cx,
);
self.non_text_state_update_count += 1;
@@ -1889,7 +1955,7 @@ impl Buffer {
}
self.end_transaction(cx);
self.send_operation(Operation::Buffer(edit_operation), cx);
self.send_operation(Operation::Buffer(edit_operation), true, cx);
Some(edit_id)
}
@@ -1972,7 +2038,7 @@ impl Buffer {
&mut self,
ops: I,
cx: &mut ModelContext<Self>,
) -> Result<()> {
) {
self.pending_autoindent.take();
let was_dirty = self.is_dirty();
let old_version = self.version.clone();
@@ -1991,14 +2057,16 @@ impl Buffer {
}
})
.collect::<Vec<_>>();
self.text.apply_ops(buffer_ops)?;
for operation in buffer_ops.iter() {
self.send_operation(Operation::Buffer(operation.clone()), false, cx);
}
self.text.apply_ops(buffer_ops);
self.deferred_ops.insert(deferred_ops);
self.flush_deferred_ops(cx);
self.did_edit(&old_version, was_dirty, cx);
// Notify independently of whether the buffer was edited as the operations could include a
// selection update.
cx.notify();
Ok(())
}
fn flush_deferred_ops(&mut self, cx: &mut ModelContext<Self>) {
@@ -2115,8 +2183,16 @@ impl Buffer {
}
}
fn send_operation(&mut self, operation: Operation, cx: &mut ModelContext<Self>) {
cx.emit(BufferEvent::Operation(operation));
fn send_operation(
&mut self,
operation: Operation,
is_local: bool,
cx: &mut ModelContext<Self>,
) {
cx.emit(BufferEvent::Operation {
operation,
is_local,
});
}
/// Removes the selections for a given peer.
@@ -2131,7 +2207,7 @@ impl Buffer {
let old_version = self.version.clone();
if let Some((transaction_id, operation)) = self.text.undo() {
self.send_operation(Operation::Buffer(operation), cx);
self.send_operation(Operation::Buffer(operation), true, cx);
self.did_edit(&old_version, was_dirty, cx);
Some(transaction_id)
} else {
@@ -2148,7 +2224,7 @@ impl Buffer {
let was_dirty = self.is_dirty();
let old_version = self.version.clone();
if let Some(operation) = self.text.undo_transaction(transaction_id) {
self.send_operation(Operation::Buffer(operation), cx);
self.send_operation(Operation::Buffer(operation), true, cx);
self.did_edit(&old_version, was_dirty, cx);
true
} else {
@@ -2168,7 +2244,7 @@ impl Buffer {
let operations = self.text.undo_to_transaction(transaction_id);
let undone = !operations.is_empty();
for operation in operations {
self.send_operation(Operation::Buffer(operation), cx);
self.send_operation(Operation::Buffer(operation), true, cx);
}
if undone {
self.did_edit(&old_version, was_dirty, cx)
@@ -2182,7 +2258,7 @@ impl Buffer {
let old_version = self.version.clone();
if let Some((transaction_id, operation)) = self.text.redo() {
self.send_operation(Operation::Buffer(operation), cx);
self.send_operation(Operation::Buffer(operation), true, cx);
self.did_edit(&old_version, was_dirty, cx);
Some(transaction_id)
} else {
@@ -2202,7 +2278,7 @@ impl Buffer {
let operations = self.text.redo_to_transaction(transaction_id);
let redone = !operations.is_empty();
for operation in operations {
self.send_operation(Operation::Buffer(operation), cx);
self.send_operation(Operation::Buffer(operation), true, cx);
}
if redone {
self.did_edit(&old_version, was_dirty, cx)
@@ -2219,6 +2295,7 @@ impl Buffer {
triggers,
lamport_timestamp: self.completion_triggers_timestamp,
},
true,
cx,
);
cx.notify();
@@ -2298,7 +2375,7 @@ impl Buffer {
let ops = self.text.randomly_undo_redo(rng);
if !ops.is_empty() {
for op in ops {
self.send_operation(Operation::Buffer(op), cx);
self.send_operation(Operation::Buffer(op), true, cx);
self.did_edit(&old_version, was_dirty, cx);
}
}
@@ -3639,12 +3716,12 @@ impl BufferSnapshot {
!self.git_diff.is_empty()
}
/// Returns all the Git diff hunks intersecting the given
/// row range.
/// Returns all the Git diff hunks intersecting the given row range.
#[cfg(any(test, feature = "test-support"))]
pub fn git_diff_hunks_in_row_range(
&self,
range: Range<BufferRow>,
) -> impl '_ + Iterator<Item = git::diff::DiffHunk<u32>> {
) -> impl '_ + Iterator<Item = git::diff::DiffHunk> {
self.git_diff.hunks_in_row_range(range, self)
}
@@ -3653,7 +3730,7 @@ impl BufferSnapshot {
pub fn git_diff_hunks_intersecting_range(
&self,
range: Range<Anchor>,
) -> impl '_ + Iterator<Item = git::diff::DiffHunk<u32>> {
) -> impl '_ + Iterator<Item = git::diff::DiffHunk> {
self.git_diff.hunks_intersecting_range(range, self)
}
@@ -3662,7 +3739,7 @@ impl BufferSnapshot {
pub fn git_diff_hunks_intersecting_range_rev(
&self,
range: Range<Anchor>,
) -> impl '_ + Iterator<Item = git::diff::DiffHunk<u32>> {
) -> impl '_ + Iterator<Item = git::diff::DiffHunk> {
self.git_diff.hunks_intersecting_range_rev(range, self)
}
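Two themes in this file are worth calling out: `diff_base` becomes a `BufferDiffBase` enum so a branch buffer can diff against a snapshot of its base buffer, and every emitted operation now says whether it originated locally. The sketch below (simplified stand-in types, not the real `language::Buffer`/gpui plumbing) shows why the `is_local` flag matters to subscribers that rebroadcast operations:

```rust
#[derive(Clone, Debug)]
enum Operation {
    Edit(String),
}

#[derive(Clone, Debug)]
enum BufferEvent {
    Operation { operation: Operation, is_local: bool },
    Edited,
}

// Only locally produced operations are forwarded to other replicas; operations
// that were just applied from a remote peer are ignored so they don't echo.
fn broadcast_if_local(event: &BufferEvent, outgoing: &mut Vec<Operation>) {
    if let BufferEvent::Operation { operation, is_local: true } = event {
        outgoing.push(operation.clone());
    }
}

fn main() {
    let local = BufferEvent::Operation {
        operation: Operation::Edit("abc".into()),
        is_local: true,
    };
    let remote = BufferEvent::Operation {
        operation: Operation::Edit("xyz".into()),
        is_local: false,
    };
    let _edited = BufferEvent::Edited;

    let mut outgoing = Vec::new();
    broadcast_if_local(&local, &mut outgoing);
    broadcast_if_local(&remote, &mut outgoing);
    assert_eq!(outgoing.len(), 1);
}
```

This is also why `apply_ops` no longer returns a `Result` and re-emits incoming operations with `is_local: false`: a branch buffer subscribes to its base and applies whatever it hears, without those applied operations being rebroadcast.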

View File

@@ -6,6 +6,7 @@ use crate::Buffer;
use clock::ReplicaId;
use collections::BTreeMap;
use futures::FutureExt as _;
use git::diff::assert_hunks;
use gpui::{AppContext, BorrowAppContext, Model};
use gpui::{Context, TestAppContext};
use indoc::indoc;
@@ -275,13 +276,19 @@ fn test_edit_events(cx: &mut gpui::AppContext) {
|buffer, cx| {
let buffer_1_events = buffer_1_events.clone();
cx.subscribe(&buffer1, move |_, _, event, _| match event.clone() {
BufferEvent::Operation(op) => buffer1_ops.lock().push(op),
BufferEvent::Operation {
operation,
is_local: true,
} => buffer1_ops.lock().push(operation),
event => buffer_1_events.lock().push(event),
})
.detach();
let buffer_2_events = buffer_2_events.clone();
cx.subscribe(&buffer2, move |_, _, event, _| {
buffer_2_events.lock().push(event.clone())
cx.subscribe(&buffer2, move |_, _, event, _| match event.clone() {
BufferEvent::Operation {
is_local: false, ..
} => {}
event => buffer_2_events.lock().push(event),
})
.detach();
@@ -308,7 +315,7 @@ fn test_edit_events(cx: &mut gpui::AppContext) {
// Incorporating a set of remote ops emits a single edited event,
// followed by a dirty changed event.
buffer2.update(cx, |buffer, cx| {
buffer.apply_ops(buffer1_ops.lock().drain(..), cx).unwrap();
buffer.apply_ops(buffer1_ops.lock().drain(..), cx);
});
assert_eq!(
mem::take(&mut *buffer_1_events.lock()),
@@ -332,7 +339,7 @@ fn test_edit_events(cx: &mut gpui::AppContext) {
// Incorporating the remote ops again emits a single edited event,
// followed by a dirty changed event.
buffer2.update(cx, |buffer, cx| {
buffer.apply_ops(buffer1_ops.lock().drain(..), cx).unwrap();
buffer.apply_ops(buffer1_ops.lock().drain(..), cx);
});
assert_eq!(
mem::take(&mut *buffer_1_events.lock()),
@@ -2274,13 +2281,11 @@ fn test_serialization(cx: &mut gpui::AppContext) {
.block(buffer1.read(cx).serialize_ops(None, cx));
let buffer2 = cx.new_model(|cx| {
let mut buffer = Buffer::from_proto(1, Capability::ReadWrite, state, None).unwrap();
buffer
.apply_ops(
ops.into_iter()
.map(|op| proto::deserialize_operation(op).unwrap()),
cx,
)
.unwrap();
buffer.apply_ops(
ops.into_iter()
.map(|op| proto::deserialize_operation(op).unwrap()),
cx,
);
buffer
});
assert_eq!(buffer2.read(cx).text(), "abcDF");
@@ -2372,6 +2377,118 @@ async fn test_find_matching_indent(cx: &mut TestAppContext) {
);
}
#[gpui::test]
fn test_branch_and_merge(cx: &mut TestAppContext) {
cx.update(|cx| init_settings(cx, |_| {}));
let base_buffer = cx.new_model(|cx| Buffer::local("one\ntwo\nthree\n", cx));
// Create a remote replica of the base buffer.
let base_buffer_replica = cx.new_model(|cx| {
Buffer::from_proto(
1,
Capability::ReadWrite,
base_buffer.read(cx).to_proto(cx),
None,
)
.unwrap()
});
base_buffer.update(cx, |_buffer, cx| {
cx.subscribe(&base_buffer_replica, |this, _, event, cx| {
if let BufferEvent::Operation {
operation,
is_local: true,
} = event
{
this.apply_ops([operation.clone()], cx);
}
})
.detach();
});
// Create a branch, which initially has the same state as the base buffer.
let branch_buffer = base_buffer.update(cx, |buffer, cx| buffer.branch(cx));
branch_buffer.read_with(cx, |buffer, _| {
assert_eq!(buffer.text(), "one\ntwo\nthree\n");
});
// Edits to the branch are not applied to the base.
branch_buffer.update(cx, |buffer, cx| {
buffer.edit(
[(Point::new(1, 0)..Point::new(1, 0), "ONE_POINT_FIVE\n")],
None,
cx,
)
});
branch_buffer.read_with(cx, |branch_buffer, cx| {
assert_eq!(base_buffer.read(cx).text(), "one\ntwo\nthree\n");
assert_eq!(branch_buffer.text(), "one\nONE_POINT_FIVE\ntwo\nthree\n");
});
// Edits to the base are applied to the branch.
base_buffer.update(cx, |buffer, cx| {
buffer.edit([(Point::new(0, 0)..Point::new(0, 0), "ZERO\n")], None, cx)
});
branch_buffer.read_with(cx, |branch_buffer, cx| {
assert_eq!(base_buffer.read(cx).text(), "ZERO\none\ntwo\nthree\n");
assert_eq!(
branch_buffer.text(),
"ZERO\none\nONE_POINT_FIVE\ntwo\nthree\n"
);
});
assert_diff_hunks(&branch_buffer, cx, &[(2..3, "", "ONE_POINT_FIVE\n")]);
// Edits to any replica of the base are applied to the branch.
base_buffer_replica.update(cx, |buffer, cx| {
buffer.edit(
[(Point::new(2, 0)..Point::new(2, 0), "TWO_POINT_FIVE\n")],
None,
cx,
)
});
branch_buffer.read_with(cx, |branch_buffer, cx| {
assert_eq!(
base_buffer.read(cx).text(),
"ZERO\none\ntwo\nTWO_POINT_FIVE\nthree\n"
);
assert_eq!(
branch_buffer.text(),
"ZERO\none\nONE_POINT_FIVE\ntwo\nTWO_POINT_FIVE\nthree\n"
);
});
// Merging the branch applies all of its changes to the base.
base_buffer.update(cx, |base_buffer, cx| {
base_buffer.merge(&branch_buffer, cx);
assert_eq!(
base_buffer.text(),
"ZERO\none\nONE_POINT_FIVE\ntwo\nTWO_POINT_FIVE\nthree\n"
);
});
}
fn assert_diff_hunks(
buffer: &Model<Buffer>,
cx: &mut TestAppContext,
expected_hunks: &[(Range<u32>, &str, &str)],
) {
buffer
.update(cx, |buffer, cx| buffer.recalculate_diff(cx).unwrap())
.detach();
cx.executor().run_until_parked();
buffer.read_with(cx, |buffer, _| {
let snapshot = buffer.snapshot();
assert_hunks(
snapshot.git_diff_hunks_intersecting_range(Anchor::MIN..Anchor::MAX),
&snapshot,
&buffer.diff_base().unwrap().to_string(),
expected_hunks,
);
});
}
#[gpui::test(iterations = 100)]
fn test_random_collaboration(cx: &mut AppContext, mut rng: StdRng) {
let min_peers = env::var("MIN_PEERS")
@@ -2401,20 +2518,23 @@ fn test_random_collaboration(cx: &mut AppContext, mut rng: StdRng) {
.block(base_buffer.read(cx).serialize_ops(None, cx));
let mut buffer =
Buffer::from_proto(i as ReplicaId, Capability::ReadWrite, state, None).unwrap();
buffer
.apply_ops(
ops.into_iter()
.map(|op| proto::deserialize_operation(op).unwrap()),
cx,
)
.unwrap();
buffer.apply_ops(
ops.into_iter()
.map(|op| proto::deserialize_operation(op).unwrap()),
cx,
);
buffer.set_group_interval(Duration::from_millis(rng.gen_range(0..=200)));
let network = network.clone();
cx.subscribe(&cx.handle(), move |buffer, _, event, _| {
if let BufferEvent::Operation(op) = event {
network
.lock()
.broadcast(buffer.replica_id(), vec![proto::serialize_operation(op)]);
if let BufferEvent::Operation {
operation,
is_local: true,
} = event
{
network.lock().broadcast(
buffer.replica_id(),
vec![proto::serialize_operation(operation)],
);
}
})
.detach();
@@ -2523,14 +2643,12 @@ fn test_random_collaboration(cx: &mut AppContext, mut rng: StdRng) {
None,
)
.unwrap();
new_buffer
.apply_ops(
old_buffer_ops
.into_iter()
.map(|op| deserialize_operation(op).unwrap()),
cx,
)
.unwrap();
new_buffer.apply_ops(
old_buffer_ops
.into_iter()
.map(|op| deserialize_operation(op).unwrap()),
cx,
);
log::info!(
"New replica {} text: {:?}",
new_buffer.replica_id(),
@@ -2539,10 +2657,14 @@ fn test_random_collaboration(cx: &mut AppContext, mut rng: StdRng) {
new_buffer.set_group_interval(Duration::from_millis(rng.gen_range(0..=200)));
let network = network.clone();
cx.subscribe(&cx.handle(), move |buffer, _, event, _| {
if let BufferEvent::Operation(op) = event {
if let BufferEvent::Operation {
operation,
is_local: true,
} = event
{
network.lock().broadcast(
buffer.replica_id(),
vec![proto::serialize_operation(op)],
vec![proto::serialize_operation(operation)],
);
}
})
@@ -2570,7 +2692,7 @@ fn test_random_collaboration(cx: &mut AppContext, mut rng: StdRng) {
ops
);
new_buffer.update(cx, |new_buffer, cx| {
new_buffer.apply_ops(ops, cx).unwrap();
new_buffer.apply_ops(ops, cx);
});
}
}
@@ -2598,7 +2720,7 @@ fn test_random_collaboration(cx: &mut AppContext, mut rng: StdRng) {
ops.len(),
ops
);
buffer.update(cx, |buffer, cx| buffer.apply_ops(ops, cx).unwrap());
buffer.update(cx, |buffer, cx| buffer.apply_ops(ops, cx));
}
}
_ => {}
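The new `test_branch_and_merge` above exercises `Buffer::branch` and `Buffer::merge`: edits flow from the base (and its replicas) into the branch automatically, and merging replays the branch's accumulated edits back onto the base. A rough, string-based sketch of that merge step, using a hypothetical `apply_edits` helper rather than the real anchor-based `edits_since`/`edit` machinery:

```rust
use std::ops::Range;

// Apply (range-in-old-text, replacement) pairs from back to front so earlier
// byte offsets stay valid while the string is mutated.
fn apply_edits(base: &str, mut edits: Vec<(Range<usize>, String)>) -> String {
    edits.sort_by_key(|(range, _)| range.start);
    let mut merged = base.to_string();
    for (range, new_text) in edits.into_iter().rev() {
        merged.replace_range(range, &new_text);
    }
    merged
}

fn main() {
    let base = "one\ntwo\nthree\n";
    // The branch inserted a line after "one", as in the test above.
    let branch_edits = vec![(4..4, "ONE_POINT_FIVE\n".to_string())];
    assert_eq!(apply_edits(base, branch_edits), "one\nONE_POINT_FIVE\ntwo\nthree\n");
}
```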

View File

@@ -139,11 +139,52 @@ pub trait ToLspPosition {
/// A name of a language server.
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub struct LanguageServerName(pub Arc<str>);
pub struct LanguageServerName(pub SharedString);
impl std::fmt::Display for LanguageServerName {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
std::fmt::Display::fmt(&self.0, f)
}
}
impl AsRef<str> for LanguageServerName {
fn as_ref(&self) -> &str {
self.0.as_ref()
}
}
impl AsRef<OsStr> for LanguageServerName {
fn as_ref(&self) -> &OsStr {
self.0.as_ref().as_ref()
}
}
impl JsonSchema for LanguageServerName {
fn schema_name() -> String {
"LanguageServerName".into()
}
fn json_schema(_: &mut SchemaGenerator) -> Schema {
SchemaObject {
instance_type: Some(InstanceType::String.into()),
..Default::default()
}
.into()
}
}
impl LanguageServerName {
pub const fn new_static(s: &'static str) -> Self {
Self(SharedString::new_static(s))
}
pub fn from_proto(s: String) -> Self {
Self(Arc::from(s))
Self(s.into())
}
}
impl<'a> From<&'a str> for LanguageServerName {
fn from(str: &'a str) -> LanguageServerName {
LanguageServerName(str.to_string().into())
}
}
@@ -202,8 +243,8 @@ impl CachedLspAdapter {
})
}
pub fn name(&self) -> Arc<str> {
self.adapter.name().0.clone()
pub fn name(&self) -> LanguageServerName {
self.adapter.name().clone()
}
pub async fn get_language_server_command(
@@ -594,7 +635,7 @@ pub struct LanguageConfig {
pub block_comment: Option<(Arc<str>, Arc<str>)>,
/// A list of language servers that are allowed to run on subranges of a given language.
#[serde(default)]
pub scope_opt_in_language_servers: Vec<String>,
pub scope_opt_in_language_servers: Vec<LanguageServerName>,
#[serde(default)]
pub overrides: HashMap<String, LanguageConfigOverride>,
/// A list of characters that Zed should treat as word characters for the
@@ -658,7 +699,7 @@ pub struct LanguageConfigOverride {
#[serde(default)]
pub word_characters: Override<HashSet<char>>,
#[serde(default)]
pub opt_into_language_servers: Vec<String>,
pub opt_into_language_servers: Vec<LanguageServerName>,
}
#[derive(Clone, Deserialize, Debug, Serialize, JsonSchema)]
@@ -1479,9 +1520,9 @@ impl LanguageScope {
pub fn language_allowed(&self, name: &LanguageServerName) -> bool {
let config = &self.language.config;
let opt_in_servers = &config.scope_opt_in_language_servers;
if opt_in_servers.iter().any(|o| *o == *name.0) {
if opt_in_servers.iter().any(|o| *o == *name) {
if let Some(over) = self.config_override() {
over.opt_into_language_servers.iter().any(|o| *o == *name.0)
over.opt_into_language_servers.iter().any(|o| *o == *name)
} else {
false
}

View File

@@ -326,13 +326,43 @@ impl LanguageRegistry {
Some(load_lsp_adapter())
}
pub fn register_lsp_adapter(&self, language_name: LanguageName, adapter: Arc<dyn LspAdapter>) {
pub fn register_lsp_adapter(
&self,
language_name: LanguageName,
adapter: Arc<dyn LspAdapter>,
) -> Arc<CachedLspAdapter> {
let cached = CachedLspAdapter::new(adapter);
self.state
.write()
.lsp_adapters
.entry(language_name)
.or_default()
.push(CachedLspAdapter::new(adapter));
.push(cached.clone());
cached
}
pub fn get_or_register_lsp_adapter(
&self,
language_name: LanguageName,
server_name: LanguageServerName,
build_adapter: impl FnOnce() -> Arc<dyn LspAdapter> + 'static,
) -> Arc<CachedLspAdapter> {
let registered = self
.state
.write()
.lsp_adapters
.entry(language_name.clone())
.or_default()
.iter()
.find(|cached_adapter| cached_adapter.name == server_name)
.cloned();
if let Some(found) = registered {
found
} else {
let adapter = build_adapter();
self.register_lsp_adapter(language_name, adapter)
}
}
/// Register a fake language server and adapter
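`register_lsp_adapter` now returns the `Arc<CachedLspAdapter>` it registered, which is what makes `get_or_register_lsp_adapter` possible: look the server up under the language first and only build it when missing, so the `build_adapter` closure runs at most once. A plain-collections sketch of that get-or-register shape (hypothetical types, not the real registry state):

```rust
use std::collections::HashMap;
use std::sync::Arc;

// `adapters` stands in for the registry's per-language adapter lists; the
// values here are just server names rather than Arc<CachedLspAdapter>.
fn get_or_register(
    adapters: &mut HashMap<String, Vec<Arc<str>>>,
    language: &str,
    server_name: &str,
    build: impl FnOnce() -> Arc<str>,
) -> Arc<str> {
    let entry = adapters.entry(language.to_string()).or_default();
    if let Some(found) = entry.iter().find(|existing| existing.as_ref() == server_name) {
        return found.clone();
    }
    let adapter = build();
    entry.push(adapter.clone());
    adapter
}

fn main() {
    let mut adapters = HashMap::new();
    let first = get_or_register(&mut adapters, "Rust", "rust-analyzer", || Arc::from("rust-analyzer"));
    // The second call finds the existing entry, so the builder is never invoked.
    let second = get_or_register(&mut adapters, "Rust", "rust-analyzer", || unreachable!());
    assert!(Arc::ptr_eq(&first, &second));
}
```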

View File

@@ -99,7 +99,7 @@ pub struct LanguageSettings {
/// special tokens:
/// - `"!<language_server_id>"` - A language server ID prefixed with a `!` will be disabled.
/// - `"..."` - A placeholder to refer to the **rest** of the registered language servers for this language.
pub language_servers: Vec<Arc<str>>,
pub language_servers: Vec<String>,
/// Controls whether inline completions are shown immediately (true)
/// or manually by triggering `editor::ShowInlineCompletion` (false).
pub show_inline_completions: bool,
@@ -137,22 +137,24 @@ impl LanguageSettings {
}
pub(crate) fn resolve_language_servers(
configured_language_servers: &[Arc<str>],
configured_language_servers: &[String],
available_language_servers: &[LanguageServerName],
) -> Vec<LanguageServerName> {
let (disabled_language_servers, enabled_language_servers): (Vec<Arc<str>>, Vec<Arc<str>>) =
configured_language_servers.iter().partition_map(
|language_server| match language_server.strip_prefix('!') {
Some(disabled) => Either::Left(disabled.into()),
None => Either::Right(language_server.clone()),
},
);
let (disabled_language_servers, enabled_language_servers): (
Vec<LanguageServerName>,
Vec<LanguageServerName>,
) = configured_language_servers.iter().partition_map(
|language_server| match language_server.strip_prefix('!') {
Some(disabled) => Either::Left(LanguageServerName(disabled.to_string().into())),
None => Either::Right(LanguageServerName(language_server.clone().into())),
},
);
let rest = available_language_servers
.iter()
.filter(|&available_language_server| {
!disabled_language_servers.contains(&available_language_server.0)
&& !enabled_language_servers.contains(&available_language_server.0)
!disabled_language_servers.contains(&available_language_server)
&& !enabled_language_servers.contains(&available_language_server)
})
.cloned()
.collect::<Vec<_>>();
@@ -160,10 +162,10 @@ impl LanguageSettings {
enabled_language_servers
.into_iter()
.flat_map(|language_server| {
if language_server.as_ref() == Self::REST_OF_LANGUAGE_SERVERS {
if language_server.0.as_ref() == Self::REST_OF_LANGUAGE_SERVERS {
rest.clone()
} else {
vec![LanguageServerName(language_server.clone())]
vec![language_server.clone()]
}
})
.collect::<Vec<_>>()
@@ -295,7 +297,7 @@ pub struct LanguageSettingsContent {
///
/// Default: ["..."]
#[serde(default)]
pub language_servers: Option<Vec<Arc<str>>>,
pub language_servers: Option<Vec<String>>,
/// Controls whether inline completions are shown immediately (true)
/// or manually by triggering `editor::ShowInlineCompletion` (false).
///
@@ -323,11 +325,11 @@ pub struct LanguageSettingsContent {
///
/// Default: true
pub use_auto_surround: Option<bool>,
// Controls how the editor handles the autoclosed characters.
// When set to `false`(default), skipping over and auto-removing of the closing characters
// happen only for auto-inserted characters.
// Otherwise(when `true`), the closing characters are always skipped over and auto-removed
// no matter how they were inserted.
/// Controls how the editor handles the autoclosed characters.
/// When set to `false`(default), skipping over and auto-removing of the closing characters
/// happen only for auto-inserted characters.
/// Otherwise(when `true`), the closing characters are always skipped over and auto-removed
/// no matter how they were inserted.
///
/// Default: false
pub always_treat_brackets_as_autoclosed: Option<bool>,
@@ -1152,13 +1154,20 @@ mod tests {
);
}
#[test]
fn test_formatter_deserialization_invalid() {
let raw_auto = "{\"formatter\": {}}";
let result: Result<LanguageSettingsContent, _> = serde_json::from_str(raw_auto);
assert!(result.is_err());
}
#[test]
pub fn test_resolve_language_servers() {
fn language_server_names(names: &[&str]) -> Vec<LanguageServerName> {
names
.iter()
.copied()
.map(|name| LanguageServerName(name.into()))
.map(|name| LanguageServerName(name.to_string().into()))
.collect::<Vec<_>>()
}

View File

@@ -32,6 +32,7 @@ futures.workspace = true
google_ai = { workspace = true, features = ["schemars"] }
gpui.workspace = true
http_client.workspace = true
isahc.workspace = true
inline_completion_button.workspace = true
log.workspace = true
menu.workspace = true

View File

@@ -51,6 +51,7 @@ pub struct AvailableModel {
/// Configuration of Anthropic's caching API.
pub cache_configuration: Option<LanguageModelCacheConfiguration>,
pub max_output_tokens: Option<u32>,
pub default_temperature: Option<f32>,
}
pub struct AnthropicLanguageModelProvider {
@@ -200,6 +201,7 @@ impl LanguageModelProvider for AnthropicLanguageModelProvider {
}
}),
max_output_tokens: model.max_output_tokens,
default_temperature: model.default_temperature,
},
);
}
@@ -375,8 +377,11 @@ impl LanguageModel for AnthropicModel {
request: LanguageModelRequest,
cx: &AsyncAppContext,
) -> BoxFuture<'static, Result<BoxStream<'static, Result<LanguageModelCompletionEvent>>>> {
let request =
request.into_anthropic(self.model.id().into(), self.model.max_output_tokens());
let request = request.into_anthropic(
self.model.id().into(),
self.model.default_temperature(),
self.model.max_output_tokens(),
);
let request = self.stream_completion(request, cx);
let future = self.request_limiter.stream(async move {
let response = request.await.map_err(|err| anyhow!(err))?;
@@ -405,6 +410,7 @@ impl LanguageModel for AnthropicModel {
) -> BoxFuture<'static, Result<BoxStream<'static, Result<String>>>> {
let mut request = request.into_anthropic(
self.model.tool_model_id().into(),
self.model.default_temperature(),
self.model.max_output_tokens(),
);
request.tool_choice = Some(anthropic::ToolChoice::Tool {

View File

@@ -19,6 +19,7 @@ use gpui::{
Subscription, Task,
};
use http_client::{AsyncBody, HttpClient, Method, Response};
use isahc::config::Configurable;
use schemars::JsonSchema;
use serde::{de::DeserializeOwned, Deserialize, Serialize};
use serde_json::value::RawValue;
@@ -27,6 +28,7 @@ use smol::{
io::{AsyncReadExt, BufReader},
lock::{RwLock, RwLockUpgradableReadGuard, RwLockWriteGuard},
};
use std::time::Duration;
use std::{
future,
sync::{Arc, LazyLock},
@@ -56,6 +58,7 @@ fn zed_cloud_provider_additional_models() -> &'static [AvailableModel] {
#[derive(Default, Clone, Debug, PartialEq)]
pub struct ZedDotDevSettings {
pub available_models: Vec<AvailableModel>,
pub low_speed_timeout: Option<Duration>,
}
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]
@@ -84,6 +87,8 @@ pub struct AvailableModel {
pub tool_override: Option<String>,
/// Indicates whether this custom model supports caching.
pub cache_configuration: Option<LanguageModelCacheConfiguration>,
/// The default temperature to use for this model.
pub default_temperature: Option<f32>,
}
pub struct CloudLanguageModelProvider {
@@ -252,6 +257,7 @@ impl LanguageModelProvider for CloudLanguageModelProvider {
min_total_token: config.min_total_token,
}
}),
default_temperature: model.default_temperature,
max_output_tokens: model.max_output_tokens,
}),
AvailableProvider::OpenAi => CloudModel::OpenAi(open_ai::Model::Custom {
@@ -380,6 +386,7 @@ impl CloudLanguageModel {
client: Arc<Client>,
llm_api_token: LlmApiToken,
body: PerformCompletionParams,
low_speed_timeout: Option<Duration>,
) -> Result<Response<AsyncBody>> {
let http_client = &client.http_client();
@@ -387,7 +394,11 @@ impl CloudLanguageModel {
let mut did_retry = false;
let response = loop {
let request = http_client::Request::builder()
let mut request_builder = http_client::Request::builder();
if let Some(low_speed_timeout) = low_speed_timeout {
request_builder = request_builder.low_speed_timeout(100, low_speed_timeout);
};
let request = request_builder
.method(Method::POST)
.uri(http_client.build_zed_llm_url("/completion", &[])?.as_ref())
.header("Content-Type", "application/json")
@@ -501,11 +512,18 @@ impl LanguageModel for CloudLanguageModel {
fn stream_completion(
&self,
request: LanguageModelRequest,
_cx: &AsyncAppContext,
cx: &AsyncAppContext,
) -> BoxFuture<'static, Result<BoxStream<'static, Result<LanguageModelCompletionEvent>>>> {
let openai_low_speed_timeout =
AllLanguageModelSettings::try_read_global(cx, |s| s.openai.low_speed_timeout.unwrap());
match &self.model {
CloudModel::Anthropic(model) => {
let request = request.into_anthropic(model.id().into(), model.max_output_tokens());
let request = request.into_anthropic(
model.id().into(),
model.default_temperature(),
model.max_output_tokens(),
);
let client = self.client.clone();
let llm_api_token = self.llm_api_token.clone();
let future = self.request_limiter.stream(async move {
@@ -519,6 +537,7 @@ impl LanguageModel for CloudLanguageModel {
&request,
)?)?,
},
None,
)
.await?;
Ok(map_to_language_model_completion_events(Box::pin(
@@ -542,6 +561,7 @@ impl LanguageModel for CloudLanguageModel {
&request,
)?)?,
},
openai_low_speed_timeout,
)
.await?;
Ok(open_ai::extract_text_from_events(response_lines(response)))
@@ -569,6 +589,7 @@ impl LanguageModel for CloudLanguageModel {
&request,
)?)?,
},
None,
)
.await?;
Ok(google_ai::extract_text_from_events(response_lines(
@@ -599,6 +620,7 @@ impl LanguageModel for CloudLanguageModel {
&request,
)?)?,
},
None,
)
.await?;
Ok(open_ai::extract_text_from_events(response_lines(response)))
@@ -627,8 +649,11 @@ impl LanguageModel for CloudLanguageModel {
match &self.model {
CloudModel::Anthropic(model) => {
let mut request =
request.into_anthropic(model.tool_model_id().into(), model.max_output_tokens());
let mut request = request.into_anthropic(
model.tool_model_id().into(),
model.default_temperature(),
model.max_output_tokens(),
);
request.tool_choice = Some(anthropic::ToolChoice::Tool {
name: tool_name.clone(),
});
@@ -650,6 +675,7 @@ impl LanguageModel for CloudLanguageModel {
&request,
)?)?,
},
None,
)
.await?;
@@ -694,6 +720,7 @@ impl LanguageModel for CloudLanguageModel {
&request,
)?)?,
},
None,
)
.await?;
@@ -741,6 +768,7 @@ impl LanguageModel for CloudLanguageModel {
&request,
)?)?,
},
None,
)
.await?;

View File

@@ -235,7 +235,7 @@ impl OllamaLanguageModel {
options: Some(ChatOptions {
num_ctx: Some(self.model.max_tokens),
stop: Some(request.stop),
temperature: Some(request.temperature),
temperature: request.temperature.or(Some(1.0)),
..Default::default()
}),
tools: vec![],

View File

@@ -76,6 +76,7 @@ impl Global for GlobalLanguageModelRegistry {}
pub struct LanguageModelRegistry {
active_model: Option<ActiveModel>,
providers: BTreeMap<LanguageModelProviderId, Arc<dyn LanguageModelProvider>>,
inline_alternatives: Vec<Arc<dyn LanguageModel>>,
}
pub struct ActiveModel {
@@ -229,6 +230,37 @@ impl LanguageModelRegistry {
pub fn active_model(&self) -> Option<Arc<dyn LanguageModel>> {
self.active_model.as_ref()?.model.clone()
}
/// Selects and sets the inline alternatives for language models based on
/// provider name and id.
pub fn select_inline_alternative_models(
&mut self,
alternatives: impl IntoIterator<Item = (LanguageModelProviderId, LanguageModelId)>,
cx: &mut ModelContext<Self>,
) {
let mut selected_alternatives = Vec::new();
for (provider_id, model_id) in alternatives {
if let Some(provider) = self.providers.get(&provider_id) {
if let Some(model) = provider
.provided_models(cx)
.iter()
.find(|m| m.id() == model_id)
{
selected_alternatives.push(model.clone());
}
}
}
self.inline_alternatives = selected_alternatives;
}
/// The models to use for inline assists. Returns the union of the active
/// model and all inline alternatives. When there are multiple models, the
/// user will be able to cycle through results.
pub fn inline_alternative_models(&self) -> &[Arc<dyn LanguageModel>] {
&self.inline_alternatives
}
}
#[cfg(test)]
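`LanguageModelRegistry` now tracks a list of inline-assist alternatives alongside the active model. Selection is essentially a lookup of each `(provider id, model id)` pair against the registered providers, keeping only the models that actually exist. A plain-collections sketch of that lookup (hypothetical types, not the real registry or provider traits):

```rust
use std::collections::BTreeMap;

// `providers` maps a provider id to the model ids it offers; alternatives that
// name an unknown provider or model are silently dropped.
fn select_inline_alternatives(
    providers: &BTreeMap<String, Vec<String>>,
    alternatives: &[(String, String)],
) -> Vec<(String, String)> {
    let mut selected = Vec::new();
    for (provider_id, model_id) in alternatives {
        if let Some(models) = providers.get(provider_id) {
            if models.iter().any(|m| m == model_id) {
                selected.push((provider_id.clone(), model_id.clone()));
            }
        }
    }
    selected
}

fn main() {
    let mut providers = BTreeMap::new();
    providers.insert("anthropic".to_string(), vec!["claude-3-5-sonnet".to_string()]);
    let alternatives = vec![
        ("anthropic".to_string(), "claude-3-5-sonnet".to_string()),
        ("openai".to_string(), "gpt-4o".to_string()),
    ];
    assert_eq!(select_inline_alternatives(&providers, &alternatives).len(), 1);
}
```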

View File

@@ -236,7 +236,7 @@ pub struct LanguageModelRequest {
pub messages: Vec<LanguageModelRequestMessage>,
pub tools: Vec<LanguageModelRequestTool>,
pub stop: Vec<String>,
pub temperature: f32,
pub temperature: Option<f32>,
}
impl LanguageModelRequest {
@@ -262,7 +262,7 @@ impl LanguageModelRequest {
.collect(),
stream,
stop: self.stop,
temperature: self.temperature,
temperature: self.temperature.unwrap_or(1.0),
max_tokens: max_output_tokens,
tools: Vec::new(),
tool_choice: None,
@@ -290,7 +290,7 @@ impl LanguageModelRequest {
candidate_count: Some(1),
stop_sequences: Some(self.stop),
max_output_tokens: None,
temperature: Some(self.temperature as f64),
temperature: self.temperature.map(|t| t as f64).or(Some(1.0)),
top_p: None,
top_k: None,
}),
@@ -298,7 +298,12 @@ impl LanguageModelRequest {
}
}
pub fn into_anthropic(self, model: String, max_output_tokens: u32) -> anthropic::Request {
pub fn into_anthropic(
self,
model: String,
default_temperature: f32,
max_output_tokens: u32,
) -> anthropic::Request {
let mut new_messages: Vec<anthropic::Message> = Vec::new();
let mut system_message = String::new();
@@ -400,7 +405,7 @@ impl LanguageModelRequest {
tool_choice: None,
metadata: None,
stop_sequences: Vec::new(),
temperature: Some(self.temperature),
temperature: self.temperature.or(Some(default_temperature)),
top_k: None,
top_p: None,
}
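`LanguageModelRequest::temperature` is now optional and `into_anthropic` takes the model's `default_temperature`, so the effective value is the per-request temperature when set, otherwise the model's default, with the other conversions in this diff falling back to 1.0. A tiny sketch of that precedence (the numbers are illustrative only):

```rust
// Per-request temperature wins; otherwise use the model's configured default;
// otherwise fall back to 1.0, like the OpenAI/Google/Ollama paths in this diff.
fn effective_temperature(request: Option<f32>, model_default: Option<f32>) -> f32 {
    request.or(model_default).unwrap_or(1.0)
}

fn main() {
    assert_eq!(effective_temperature(Some(0.2), Some(0.7)), 0.2);
    assert_eq!(effective_temperature(None, Some(0.7)), 0.7);
    assert_eq!(effective_temperature(None, None), 1.0);
}
```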

View File

@@ -99,6 +99,7 @@ impl AnthropicSettingsContent {
tool_override,
cache_configuration,
max_output_tokens,
default_temperature,
} => Some(provider::anthropic::AvailableModel {
name,
display_name,
@@ -112,6 +113,7 @@ impl AnthropicSettingsContent {
},
),
max_output_tokens,
default_temperature,
}),
_ => None,
})
@@ -231,6 +233,7 @@ pub struct GoogleSettingsContent {
#[derive(Default, Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
pub struct ZedDotDevSettingsContent {
available_models: Option<Vec<cloud::AvailableModel>>,
pub low_speed_timeout_in_seconds: Option<u64>,
}
#[derive(Default, Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
@@ -333,6 +336,14 @@ impl settings::Settings for AllLanguageModelSettings {
.as_ref()
.and_then(|s| s.available_models.clone()),
);
if let Some(low_speed_timeout_in_seconds) = value
.zed_dot_dev
.as_ref()
.and_then(|s| s.low_speed_timeout_in_seconds)
{
settings.zed_dot_dev.low_speed_timeout =
Some(Duration::from_secs(low_speed_timeout_in_seconds));
}
merge(
&mut settings.google.api_url,

View File

@@ -236,7 +236,7 @@ impl LogStore {
));
this.add_language_server(
LanguageServerKind::Global {
name: LanguageServerName(Arc::from("copilot")),
name: LanguageServerName::new_static("copilot"),
},
server.server_id(),
Some(server.clone()),

View File

@@ -13,13 +13,13 @@ use util::{fs::remove_matching, maybe, ResultExt};
pub struct CLspAdapter;
impl CLspAdapter {
const SERVER_NAME: &'static str = "clangd";
const SERVER_NAME: LanguageServerName = LanguageServerName::new_static("clangd");
}
#[async_trait(?Send)]
impl super::LspAdapter for CLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName(Self::SERVER_NAME.into())
Self::SERVER_NAME.clone()
}
async fn check_if_user_installed(
@@ -28,7 +28,8 @@ impl super::LspAdapter for CLspAdapter {
cx: &AsyncAppContext,
) -> Option<LanguageServerBinary> {
let configured_binary = cx.update(|cx| {
language_server_settings(delegate, Self::SERVER_NAME, cx).and_then(|s| s.binary.clone())
language_server_settings(delegate, &Self::SERVER_NAME, cx)
.and_then(|s| s.binary.clone())
});
match configured_binary {

View File

@@ -33,7 +33,7 @@ fn server_binary_arguments() -> Vec<OsString> {
pub struct GoLspAdapter;
impl GoLspAdapter {
const SERVER_NAME: &'static str = "gopls";
const SERVER_NAME: LanguageServerName = LanguageServerName::new_static("gopls");
}
static GOPLS_VERSION_REGEX: LazyLock<Regex> =
@@ -46,7 +46,7 @@ static GO_ESCAPE_SUBTEST_NAME_REGEX: LazyLock<Regex> = LazyLock::new(|| {
#[async_trait(?Send)]
impl super::LspAdapter for GoLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName(Self::SERVER_NAME.into())
Self::SERVER_NAME.clone()
}
async fn fetch_latest_server_version(
@@ -71,7 +71,8 @@ impl super::LspAdapter for GoLspAdapter {
cx: &AsyncAppContext,
) -> Option<LanguageServerBinary> {
let configured_binary = cx.update(|cx| {
language_server_settings(delegate, Self::SERVER_NAME, cx).and_then(|s| s.binary.clone())
language_server_settings(delegate, &Self::SERVER_NAME, cx)
.and_then(|s| s.binary.clone())
});
match configured_binary {

View File

@@ -1,6 +1,7 @@
(comment) @comment
(string) @string
(escape_sequence) @string.escape
(pair
key: (string) @property.json_key)

View File

@@ -1,6 +1,7 @@
(comment) @comment
(string) @string
(escape_sequence) @string.escape
(pair
key: (string) @property.json_key)

View File

@@ -1,6 +1,6 @@
name = "Markdown"
grammar = "markdown"
path_suffixes = ["md", "mdx", "mdwn", "markdown"]
path_suffixes = ["md", "mdx", "mdwn", "markdown", "MD"]
word_characters = ["-"]
brackets = [
{ start = "{", end = "}", close = true, newline = true },

View File

@@ -30,7 +30,7 @@ pub struct PythonLspAdapter {
}
impl PythonLspAdapter {
const SERVER_NAME: &'static str = "pyright";
const SERVER_NAME: LanguageServerName = LanguageServerName::new_static("pyright");
pub fn new(node: Arc<dyn NodeRuntime>) -> Self {
PythonLspAdapter { node }
@@ -40,7 +40,7 @@ impl PythonLspAdapter {
#[async_trait(?Send)]
impl LspAdapter for PythonLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName(Self::SERVER_NAME.into())
Self::SERVER_NAME.clone()
}
async fn fetch_latest_server_version(
@@ -49,7 +49,7 @@ impl LspAdapter for PythonLspAdapter {
) -> Result<Box<dyn 'static + Any + Send>> {
Ok(Box::new(
self.node
.npm_package_latest_version(Self::SERVER_NAME)
.npm_package_latest_version(Self::SERVER_NAME.as_ref())
.await?,
) as Box<_>)
}
@@ -62,16 +62,23 @@ impl LspAdapter for PythonLspAdapter {
) -> Result<LanguageServerBinary> {
let latest_version = latest_version.downcast::<String>().unwrap();
let server_path = container_dir.join(SERVER_PATH);
let package_name = Self::SERVER_NAME;
let should_install_language_server = self
.node
.should_install_npm_package(package_name, &server_path, &container_dir, &latest_version)
.should_install_npm_package(
Self::SERVER_NAME.as_ref(),
&server_path,
&container_dir,
&latest_version,
)
.await;
if should_install_language_server {
self.node
.npm_install_packages(&container_dir, &[(package_name, latest_version.as_str())])
.npm_install_packages(
&container_dir,
&[(Self::SERVER_NAME.as_ref(), latest_version.as_str())],
)
.await?;
}
@@ -182,7 +189,7 @@ impl LspAdapter for PythonLspAdapter {
cx: &mut AsyncAppContext,
) -> Result<Value> {
cx.update(|cx| {
language_server_settings(adapter.as_ref(), Self::SERVER_NAME, cx)
language_server_settings(adapter.as_ref(), &Self::SERVER_NAME, cx)
.and_then(|s| s.settings.clone())
.unwrap_or_default()
})

View File

@@ -25,13 +25,13 @@ use util::{fs::remove_matching, maybe, ResultExt};
pub struct RustLspAdapter;
impl RustLspAdapter {
const SERVER_NAME: &'static str = "rust-analyzer";
const SERVER_NAME: LanguageServerName = LanguageServerName::new_static("rust-analyzer");
}
#[async_trait(?Send)]
impl LspAdapter for RustLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName(Self::SERVER_NAME.into())
Self::SERVER_NAME.clone()
}
async fn check_if_user_installed(
@@ -41,7 +41,7 @@ impl LspAdapter for RustLspAdapter {
) -> Option<LanguageServerBinary> {
let configured_binary = cx
.update(|cx| {
language_server_settings(delegate, Self::SERVER_NAME, cx)
language_server_settings(delegate, &Self::SERVER_NAME, cx)
.and_then(|s| s.binary.clone())
})
.ok()?;
@@ -60,7 +60,7 @@ impl LspAdapter for RustLspAdapter {
path_lookup: None,
..
}) => {
let path = delegate.which(Self::SERVER_NAME.as_ref()).await;
let path = delegate.which("rust-analyzer".as_ref()).await;
let env = delegate.shell_env().await;
if let Some(path) = path {

View File

@@ -32,7 +32,8 @@ pub struct TailwindLspAdapter {
}
impl TailwindLspAdapter {
const SERVER_NAME: &'static str = "tailwindcss-language-server";
const SERVER_NAME: LanguageServerName =
LanguageServerName::new_static("tailwindcss-language-server");
pub fn new(node: Arc<dyn NodeRuntime>) -> Self {
TailwindLspAdapter { node }
@@ -42,7 +43,7 @@ impl TailwindLspAdapter {
#[async_trait(?Send)]
impl LspAdapter for TailwindLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName(Self::SERVER_NAME.into())
Self::SERVER_NAME.clone()
}
async fn check_if_user_installed(
@@ -52,7 +53,7 @@ impl LspAdapter for TailwindLspAdapter {
) -> Option<LanguageServerBinary> {
let configured_binary = cx
.update(|cx| {
language_server_settings(delegate, Self::SERVER_NAME, cx)
language_server_settings(delegate, &Self::SERVER_NAME, cx)
.and_then(|s| s.binary.clone())
})
.ok()??;
@@ -152,7 +153,7 @@ impl LspAdapter for TailwindLspAdapter {
cx: &mut AsyncAppContext,
) -> Result<Value> {
let tailwind_user_settings = cx.update(|cx| {
language_server_settings(delegate.as_ref(), Self::SERVER_NAME, cx)
language_server_settings(delegate.as_ref(), &Self::SERVER_NAME, cx)
.and_then(|s| s.settings.clone())
.unwrap_or_default()
})?;

View File

@@ -71,7 +71,8 @@ pub struct TypeScriptLspAdapter {
impl TypeScriptLspAdapter {
const OLD_SERVER_PATH: &'static str = "node_modules/typescript-language-server/lib/cli.js";
const NEW_SERVER_PATH: &'static str = "node_modules/typescript-language-server/lib/cli.mjs";
const SERVER_NAME: &'static str = "typescript-language-server";
const SERVER_NAME: LanguageServerName =
LanguageServerName::new_static("typescript-language-server");
pub fn new(node: Arc<dyn NodeRuntime>) -> Self {
TypeScriptLspAdapter { node }
}
@@ -97,7 +98,7 @@ struct TypeScriptVersions {
#[async_trait(?Send)]
impl LspAdapter for TypeScriptLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName(Self::SERVER_NAME.into())
Self::SERVER_NAME.clone()
}
async fn fetch_latest_server_version(
@@ -239,7 +240,7 @@ impl LspAdapter for TypeScriptLspAdapter {
cx: &mut AsyncAppContext,
) -> Result<Value> {
let override_options = cx.update(|cx| {
language_server_settings(delegate.as_ref(), Self::SERVER_NAME, cx)
language_server_settings(delegate.as_ref(), &Self::SERVER_NAME, cx)
.and_then(|s| s.settings.clone())
})?;
if let Some(options) = override_options {
@@ -304,7 +305,7 @@ impl EsLintLspAdapter {
const GITHUB_ASSET_KIND: AssetKind = AssetKind::Zip;
const SERVER_PATH: &'static str = "vscode-eslint/server/out/eslintServer.js";
const SERVER_NAME: &'static str = "eslint";
const SERVER_NAME: LanguageServerName = LanguageServerName::new_static("eslint");
const FLAT_CONFIG_FILE_NAMES: &'static [&'static str] =
&["eslint.config.js", "eslint.config.mjs", "eslint.config.cjs"];
@@ -331,7 +332,7 @@ impl LspAdapter for EsLintLspAdapter {
let workspace_root = delegate.worktree_root_path();
let eslint_user_settings = cx.update(|cx| {
language_server_settings(delegate.as_ref(), Self::SERVER_NAME, cx)
language_server_settings(delegate.as_ref(), &Self::SERVER_NAME, cx)
.and_then(|s| s.settings.clone())
.unwrap_or_default()
})?;
@@ -403,7 +404,7 @@ impl LspAdapter for EsLintLspAdapter {
}
fn name(&self) -> LanguageServerName {
LanguageServerName(Self::SERVER_NAME.into())
Self::SERVER_NAME.clone()
}
async fn fetch_latest_server_version(

View File

@@ -48,11 +48,11 @@ struct TypeScriptVersions {
server_version: String,
}
const SERVER_NAME: &str = "vtsls";
const SERVER_NAME: LanguageServerName = LanguageServerName::new_static("vtsls");
#[async_trait(?Send)]
impl LspAdapter for VtslsLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName(SERVER_NAME.into())
SERVER_NAME.clone()
}
async fn fetch_latest_server_version(
@@ -74,7 +74,7 @@ impl LspAdapter for VtslsLspAdapter {
cx: &AsyncAppContext,
) -> Option<LanguageServerBinary> {
let configured_binary = cx.update(|cx| {
language_server_settings(delegate, SERVER_NAME, cx).and_then(|s| s.binary.clone())
language_server_settings(delegate, &SERVER_NAME, cx).and_then(|s| s.binary.clone())
});
match configured_binary {
@@ -267,7 +267,7 @@ impl LspAdapter for VtslsLspAdapter {
cx: &mut AsyncAppContext,
) -> Result<Value> {
let override_options = cx.update(|cx| {
language_server_settings(delegate.as_ref(), SERVER_NAME, cx)
language_server_settings(delegate.as_ref(), &SERVER_NAME, cx)
.and_then(|s| s.settings.clone())
})?;

View File

@@ -30,7 +30,7 @@ pub struct YamlLspAdapter {
}
impl YamlLspAdapter {
const SERVER_NAME: &'static str = "yaml-language-server";
const SERVER_NAME: LanguageServerName = LanguageServerName::new_static("yaml-language-server");
pub fn new(node: Arc<dyn NodeRuntime>) -> Self {
YamlLspAdapter { node }
}
@@ -39,7 +39,7 @@ impl YamlLspAdapter {
#[async_trait(?Send)]
impl LspAdapter for YamlLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName(Self::SERVER_NAME.into())
Self::SERVER_NAME.clone()
}
async fn check_if_user_installed(
@@ -49,7 +49,7 @@ impl LspAdapter for YamlLspAdapter {
) -> Option<LanguageServerBinary> {
let configured_binary = cx
.update(|cx| {
language_server_settings(delegate, Self::SERVER_NAME, cx)
language_server_settings(delegate, &Self::SERVER_NAME, cx)
.and_then(|s| s.binary.clone())
})
.ok()??;
@@ -145,7 +145,7 @@ impl LspAdapter for YamlLspAdapter {
let mut options = serde_json::json!({"[yaml]": {"editor.tabSize": tab_size}});
let project_options = cx.update(|cx| {
language_server_settings(delegate.as_ref(), Self::SERVER_NAME, cx)
language_server_settings(delegate.as_ref(), &Self::SERVER_NAME, cx)
.and_then(|s| s.settings.clone())
})?;
if let Some(override_options) = project_options {

View File

@@ -615,8 +615,14 @@ impl LanguageServer {
snippet_support: Some(true),
resolve_support: Some(CompletionItemCapabilityResolveSupport {
properties: vec![
"documentation".to_string(),
"additionalTextEdits".to_string(),
"command".to_string(),
"detail".to_string(),
"documentation".to_string(),
"filterText".to_string(),
"labelDetails".to_string(),
"tags".to_string(),
"textEdit".to_string(),
],
}),
insert_replace_support: Some(true),

Some files were not shown because too many files have changed in this diff.