Compare commits

..

43 Commits

Author SHA1 Message Date
Peter Tripp
76ab2d7a5a Switch to ubuntu24.04 2025-08-21 12:41:11 -04:00
Peter Tripp
77c3a8c808 ci: Switch from ubuntu-latest to namespace-profile-2x4-ubuntu-2204 2025-08-21 12:37:09 -04:00
Piotr Osiewicz
132daef9f6 lsp: Add basic test for server tree toolchain use (#36692)
Closes #ISSUE

Release Notes:

- N/A
2025-08-21 17:52:17 +02:00
Agus Zubiaga
4bee06e507 acp: Use ResourceLink for agents that don't support embedded context (#36687)
The completion provider was already limiting the mention kinds according
to `acp::PromptCapabilities`. However, it was still using
`ContentBlock::EmbeddedResource` when
`acp::PromptCapabilities::embedded_context` was `false`. We now use
`ResourceLink` in that case, making it more compliant with the
specification.

Release Notes:

- N/A
2025-08-21 14:57:46 +00:00
Ryan Drew
f23314bef4 editor: Use editorconfig's max_line_length for hard wrap (#36426)
PR #20198, "Do not alter soft wrap based on .editorconfig contents",
removed support for setting line lengths for both soft and hard wrap,
not just soft wrap. This caused the `max_line_length` property within a
`.editorconfig` file to be ignored by Zed. This commit restores the
ability to set hard wrap limits via `max_line_length` without impacting
soft wrap limits, by merging the `max_line_length` property from an
editorconfig file into Zed's `preferred_line_length` setting.

Release Notes:

- Added support for .editorconfig's `max_line_length` property

Signed-off-by: Ryan Drew <git@ry4n.me>
2025-08-21 17:55:43 +03:00
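The restored behavior can be exercised with a minimal `.editorconfig` (values are illustrative):

```ini
root = true

[*.rs]
# Merged into Zed's preferred_line_length, so hard wrap
# (and only hard wrap) honors this limit.
max_line_length = 100
```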
Smit Barmase
697a39c251 Fix issue where renaming a file would not update imports in related files if they are not open (#36681)
Closes #34445

We now open a multi-buffer consisting of the buffers whose imports were
updated for the renamed file.

Only local projects are handled, for now.

Release Notes:

- Fixed an issue where renaming a file would not update imports in
related files if they are not already open.
2025-08-21 20:19:17 +05:30
Conrad Irwin
d9ea97ee9c acp: Detect gemini auth errors and show a button (#36641)
Closes #ISSUE

Release Notes:

- N/A
2025-08-21 08:44:04 -06:00
Conrad Irwin
d8fc779a67 acp: Hide history unless in native agent (#36644)
Release Notes:

- N/A
2025-08-21 08:43:57 -06:00
Bennet Bo Fenner
001ec97c0e acp: Use file icons for edit tool cards when ToolCallLocation is known (#36684)
Release Notes:

- N/A
2025-08-21 14:18:22 +00:00
Marshall Bowers
2781a30971 collab: Add Orb subscription status and period to billing_subscriptions table (#36682)
This PR adds the following new columns to the `billing_subscriptions`
table:

- `orb_subscription_status`
- `orb_current_billing_period_start_date`
- `orb_current_billing_period_end_date`

Release Notes:

- N/A
2025-08-21 13:59:18 +00:00
David Kleingeld
e0613cbd0f Add Rodio audio pipeline as alternative to current LiveKit pipeline (#36607)
Rodio parts are well tested and need less configuration than the LiveKit
parts. I suspect there is a bug in the LiveKit configuration regarding
resampling. Rather than investigating that, it seemed faster and easier
to swap in Rodio.

This opens the door to using other Rodio parts like:
 - Decibel based volume control
 - Limiter (prevents sound from becoming too loud)
 - Automatic gain control

To use this, add the following to your settings:
```
  "audio": {
    "experimental.rodio_audio": true
  }
```

Release Notes:

- N/A

Co-authored-by: Mikayla <mikayla@zed.dev>
Co-authored-by: Antonio Scandurra <me@as-cii.com>
2025-08-21 15:56:16 +02:00
Cole Miller
1dd237139c Fix more improper uses of the buffer_id field of Anchor (#36636)
Follow-up to #36524 

Release Notes:

- N/A
2025-08-21 09:24:34 -04:00
Cole Miller
f63d8e4c53 Show excerpt dividers in without_headers multibuffers (#36647)
Release Notes:

- Fixed diff cards in agent threads not showing dividers between
disjoint edited regions.
2025-08-21 13:23:56 +00:00
Bennet Bo Fenner
ad64a71f04 acp: Allow collapsing edit file tool calls (#36675)
Release Notes:

- N/A
2025-08-21 11:05:41 +00:00
Antonio Scandurra
f435af2fde acp: Use unstaged style for diffs (#36674)
Release Notes:

- N/A
2025-08-21 10:59:51 +00:00
Julia Ryan
c5ee3f3e2e Avoid suspending panicking thread while crashing (#36645)
On the latest build @maxbrunsfeld got a panic that hung zed. It appeared
that the hang occurred after the minidump had been successfully written,
so our theory on what happened is that the `suspend_all_other_threads`
call in the crash handler suspended the panicking thread (due to the
signal from simulate_exception being received on a different thread),
and then when the crash handler returned everything was suspended so the
panic hook never made it to the `process::abort`.

This change makes the crash handler avoid _both_ the current and the
panicking thread which should avoid that scenario.

Release Notes:

- N/A
2025-08-21 10:33:45 +00:00
Piotr Osiewicz
7f1bd2f15e remote: Fix toolchain RPC messages not being handled because of the entity getting dropped (#36665)
Release Notes:

- N/A
2025-08-21 09:37:45 +00:00
Bennet Bo Fenner
62f2ef86dc agent2: Allow expanding terminals individually (#36670)
Release Notes:

- N/A
2025-08-21 11:25:00 +02:00
Antonio Scandurra
fda6eda3c2 Fix @-mentioning threads when their summary isn't ready yet (#36664)
Release Notes:

- N/A
2025-08-21 08:57:28 +00:00
Kirill Bulatov
ed84767c9d Fix overlooked Clippy lints (#36659)
Follow-up of https://github.com/zed-industries/zed/pull/36557 that is
needed after https://github.com/zed-industries/zed/pull/36652

Release Notes:

- N/A
2025-08-21 06:48:04 +00:00
Kirill Bulatov
cde0a5dd27 Add a non-style lint exclusion (#36658)
Follow-up of https://github.com/zed-industries/zed/pull/36651
Restores https://github.com/zed-industries/zed/pull/35955 footgun guard.

Release Notes:

- N/A
2025-08-21 06:36:57 +00:00
Sachith Shetty
68f97d6069 editor: Use highlight_text to highlight matching brackets, fix unnecessary inlay hint highlighting (#36540)
Closes #35981

Release Notes:

- Fixed bracket highlights overly including parts of inlays when
highlighting

Before: https://github.com/user-attachments/assets/8e6b5ed8-f133-4867-8352-ed93441fbd8b

After: https://github.com/user-attachments/assets/1314e54e-ecf9-4280-9d53-eed6e96e393f
2025-08-21 09:27:41 +03:00
Kirill Bulatov
5dcb90858e Stop waiting for part of LSP responses on remote Collab clients' part (#36557)
Instead of holding a connection for potentially long LSP queries (e.g.
rust-analyzer might take minutes to look up a definition), disconnect
right after sending the initial request and handle the follow-up
responses later.

As a bonus, this allows previously sent requests to be cancelled on the
local Collab clients' side: instead of holding and serving the old
connection, local clients can now stop previous requests if needed.

This PR does not convert all LSP requests to the new paradigm, only the
problematic ones, deprecating `MultiLspQuery` and moving all of its
requests to the new paradigm.

Release Notes:

- Improved resource usage when querying LSP over Collab

---------

Co-authored-by: David Kleingeld <git@davidsk.dev>
Co-authored-by: Mikayla Maki <mikayla@zed.dev>
Co-authored-by: David Kleingeld <davidsk@zed.dev>
2025-08-21 09:24:34 +03:00
Conrad Irwin
c731bb6d91 Re-add redundant clone (#36652)
Although I said I'd do this, I actually didn't...

Updates #36651

Release Notes:

- N/A
2025-08-20 21:08:49 -06:00
Conrad Irwin
4b03d791b5 Remove style lints for now (#36651)
Closes #36577

Release Notes:

- N/A
2025-08-20 20:38:30 -06:00
Agus Zubiaga
9a3e4c47d0 acp: Suggest upgrading to preview instead of latest (#36648)
A previous PR changed the install command from `@latest` to `@preview`,
but the upgrade command kept suggesting `@latest`.

Release Notes:

- N/A
2025-08-21 00:52:38 +00:00
Ben Brandt
568e1d0a42 acp: Add e2e test support for NativeAgent (#36635)
Release Notes:

- N/A
2025-08-21 00:36:50 +00:00
Agus Zubiaga
6f242772cc acp: Update to 0.0.30 (#36643)
See: https://github.com/zed-industries/agent-client-protocol/pull/20

Release Notes:

- N/A
2025-08-21 00:10:36 +00:00
张小白
8ef9ecc91f windows: Fix RevealInFileManager (#36592)
Closes #36314

This PR takes inspiration from [Electron’s
implementation](dd54e84a58/shell/common/platform_util_win.cc (L268-L314)).

Before and after:

https://github.com/user-attachments/assets/53eec5d3-23c7-4ee1-8477-e524b0538f60
Release Notes:

- N/A
2025-08-21 08:08:54 +08:00
Ben Kunkle
3dd362978a docs: Add table of all actions (#36642)
Closes #ISSUE

Release Notes:

- N/A
2025-08-20 23:41:06 +00:00
Agus Zubiaga
74c0ba980b acp: Reliably suppress gemini abort error (#36640)
https://github.com/zed-industries/zed/pull/36633 relied on the prompt
request responding before cancel, but that's not guaranteed.

Release Notes:

- N/A
2025-08-20 23:32:17 +00:00
Marshall Bowers
c20233e0b4 agent_ui: Fix signed-in check in Zed provider configuration (#36639)
This PR fixes the check for whether the user is signed in in the Agent
panel configuration.

Supersedes https://github.com/zed-industries/zed/pull/36634.

Release Notes:

- Fixed the user's plan badge near the Zed provider in the Agent panel
not showing despite being signed in.
2025-08-20 23:09:09 +00:00
Agus Zubiaga
ffb995181e acp: Suppress gemini aborted errors (#36633)
This PR adds a temporary workaround to suppress "Aborted" errors from
Gemini when cancelling generation. This won't be needed once
https://github.com/google-gemini/gemini-cli/pull/6656 is generally
available.

Release Notes:

- N/A
2025-08-20 22:30:25 +00:00
Conrad Irwin
5120b6b7f9 acp: Handle Gemini Auth Better (#36631)
Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-08-20 16:12:41 -06:00
Julia Ryan
c9c708ff08 nix: Re-enable nightly builds (#36632)
Release Notes:

- N/A
2025-08-20 21:43:53 +00:00
Agus Zubiaga
9e34bb3f05 acp: Hide feedback buttons for external agents (#36630)
Release Notes:

- N/A
2025-08-20 21:35:48 +00:00
Cole Miller
595cf1c6c3 acp: Rename assistant::QuoteSelection and support it in agent2 threads (#36628)
Release Notes:

- N/A
2025-08-20 21:31:25 +00:00
Agus Zubiaga
d1820b183a acp: Suggest installing gemini@preview instead of latest (#36629)
Release Notes:

- N/A
2025-08-20 21:26:07 +00:00
Danilo Leal
fb7edbfb46 thread_view: Add recent history entries & adjust empty state (#36625)
Release Notes:

- N/A
2025-08-20 18:01:22 -03:00
Agus Zubiaga
02dabbb9fa acp thread view: Do not go into editing mode if unsupported (#36623)
Release Notes:

- N/A
2025-08-20 20:05:53 +00:00
Joseph T. Lyons
fa8bef1496 Bump Zed to v0.202 (#36622)
Release Notes:

- N/A
2025-08-20 20:05:30 +00:00
Cole Miller
739e4551da Fix typo in Excerpt::contains (#36621)
Follow-up to #36524 

Release Notes:

- N/A
2025-08-20 19:30:11 +00:00
Ben Brandt
b0bef3a9a2 agent2: Clean up tool descriptions (#36619)
schemars was passing along the newlines from the doc comments. This
should make the descriptions closer to the markdown-file versions we had
in the old agent.

Release Notes:

- N/A
2025-08-20 19:17:07 +00:00
54 changed files with 2191 additions and 1019 deletions

View File

@@ -19,11 +19,12 @@ self-hosted-runner:
- namespace-profile-16x32-ubuntu-2004-arm
- namespace-profile-32x64-ubuntu-2004-arm
# Namespace Ubuntu 22.04 (Everything else)
- namespace-profile-2x4-ubuntu-2204
- namespace-profile-4x8-ubuntu-2204
- namespace-profile-8x16-ubuntu-2204
- namespace-profile-16x32-ubuntu-2204
- namespace-profile-32x64-ubuntu-2204
# Namespace Ubuntu 24.04 (like ubuntu-latest)
- namespace-profile-2x4-ubuntu-2404
# Namespace Limited Preview
- namespace-profile-8x16-ubuntu-2004-arm-m4
- namespace-profile-8x32-ubuntu-2004-arm-m4

View File

@@ -8,7 +8,7 @@ on:
jobs:
update-collab-staging-tag:
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: Checkout repository
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4

View File

@@ -37,7 +37,7 @@ jobs:
run_nix: ${{ steps.filter.outputs.run_nix }}
run_actionlint: ${{ steps.filter.outputs.run_actionlint }}
runs-on:
- ubuntu-latest
- namespace-profile-2x4-ubuntu-2404
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
@@ -237,7 +237,7 @@ jobs:
uses: ./.github/actions/build_docs
actionlint:
runs-on: ubuntu-latest
runs-on: namespace-profile-2x4-ubuntu-2404
if: github.repository_owner == 'zed-industries' && needs.job_spec.outputs.run_actionlint == 'true'
needs: [job_spec]
steps:
@@ -458,7 +458,7 @@ jobs:
tests_pass:
name: Tests Pass
runs-on: ubuntu-latest
runs-on: namespace-profile-2x4-ubuntu-2404
needs:
- job_spec
- style

View File

@@ -12,7 +12,7 @@ on:
jobs:
danger:
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4

View File

@@ -206,9 +206,6 @@ jobs:
runs-on: github-8vcpu-ubuntu-2404
needs: tests
name: Build Zed on FreeBSD
# env:
# MYTOKEN : ${{ secrets.MYTOKEN }}
# MYTOKEN2: "value2"
steps:
- uses: actions/checkout@v4
- name: Build FreeBSD remote-server
@@ -243,7 +240,6 @@ jobs:
bundle-nix:
name: Build and cache Nix package
if: false
needs: tests
secrets: inherit
uses: ./.github/workflows/nix.yml
@@ -294,7 +290,7 @@ jobs:
update-nightly-tag:
name: Update nightly tag
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
runs-on: namespace-profile-2x4-ubuntu-2404
needs:
- bundle-mac
- bundle-linux-x86

View File

@@ -12,7 +12,7 @@ jobs:
shellcheck:
name: "ShellCheck Scripts"
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4

Cargo.lock (generated)
View File

@@ -1379,10 +1379,11 @@ version = "0.1.0"
dependencies = [
"anyhow",
"collections",
"derive_more 0.99.19",
"gpui",
"parking_lot",
"rodio",
"schemars",
"serde",
"settings",
"util",
"workspace-hack",
]
@@ -9621,6 +9622,7 @@ version = "0.1.0"
dependencies = [
"anyhow",
"async-trait",
"audio",
"collections",
"core-foundation 0.10.0",
"core-video",
@@ -9643,6 +9645,7 @@ dependencies = [
"scap",
"serde",
"serde_json",
"settings",
"sha2",
"simplelog",
"smallvec",
@@ -20391,7 +20394,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.201.2"
version = "0.202.0"
dependencies = [
"activity_indicator",
"agent",

View File

@@ -802,147 +802,32 @@ unexpected_cfgs = { level = "allow" }
dbg_macro = "deny"
todo = "deny"
# Motivation: We use `vec![a..b]` a lot when dealing with ranges in text, so
# warning on this rule produces a lot of noise.
single_range_in_vec_init = "allow"
redundant_clone = "warn"
# This is not a style lint, see https://github.com/rust-lang/rust-clippy/pull/15454
# Remove when the lint gets promoted to `suspicious`.
declare_interior_mutable_const = "deny"
# These are all of the rules that currently have violations in the Zed
# codebase.
#
# We'll want to drive this list down by either:
# 1. fixing violations of the rule and begin enforcing it
# 2. deciding we want to allow the rule permanently, at which point
# we should codify that separately above.
#
# This list shouldn't be added to; it should only get shorter.
# =============================================================================
redundant_clone = "deny"
# There are a bunch of rules currently failing in the `style` group, so
# allow all of those, for now.
# We currently do not restrict any style rules
# as it slows down shipping code to Zed.
#
# Running ./script/clippy can take several minutes, and so it's
# common to skip that step and let CI do it. Any unexpected failures
# (which also take minutes to discover) thus require switching back
# to an old branch, manual fixing, and re-pushing.
#
# In the future we could improve this by either making sure
# Zed can surface clippy errors in diagnostics (in addition to the
# rust-analyzer errors), or by having CI fix style nits automatically.
style = { level = "allow", priority = -1 }
# Temporary list of style lints that we've fixed so far.
# Progress is being tracked in #36577
blocks_in_conditions = "warn"
bool_assert_comparison = "warn"
borrow_interior_mutable_const = "warn"
box_default = "warn"
builtin_type_shadow = "warn"
bytes_nth = "warn"
chars_next_cmp = "warn"
cmp_null = "warn"
collapsible_else_if = "warn"
collapsible_if = "warn"
comparison_to_empty = "warn"
default_instead_of_iter_empty = "warn"
disallowed_macros = "warn"
disallowed_methods = "warn"
disallowed_names = "warn"
disallowed_types = "warn"
doc_lazy_continuation = "warn"
doc_overindented_list_items = "warn"
duplicate_underscore_argument = "warn"
err_expect = "warn"
fn_to_numeric_cast = "warn"
fn_to_numeric_cast_with_truncation = "warn"
for_kv_map = "warn"
implicit_saturating_add = "warn"
implicit_saturating_sub = "warn"
inconsistent_digit_grouping = "warn"
infallible_destructuring_match = "warn"
inherent_to_string = "warn"
init_numbered_fields = "warn"
into_iter_on_ref = "warn"
io_other_error = "warn"
items_after_test_module = "warn"
iter_cloned_collect = "warn"
iter_next_slice = "warn"
iter_nth = "warn"
iter_nth_zero = "warn"
iter_skip_next = "warn"
just_underscores_and_digits = "warn"
len_zero = "warn"
let_and_return = "warn"
main_recursion = "warn"
manual_bits = "warn"
manual_dangling_ptr = "warn"
manual_is_ascii_check = "warn"
manual_is_finite = "warn"
manual_is_infinite = "warn"
manual_map = "warn"
manual_next_back = "warn"
manual_non_exhaustive = "warn"
manual_ok_or = "warn"
manual_pattern_char_comparison = "warn"
manual_rotate = "warn"
manual_slice_fill = "warn"
manual_while_let_some = "warn"
map_clone = "warn"
map_collect_result_unit = "warn"
match_like_matches_macro = "warn"
match_overlapping_arm = "warn"
mem_replace_option_with_none = "warn"
mem_replace_option_with_some = "warn"
missing_enforced_import_renames = "warn"
missing_safety_doc = "warn"
mixed_attributes_style = "warn"
mixed_case_hex_literals = "warn"
module_inception = "warn"
must_use_unit = "warn"
mut_mutex_lock = "warn"
needless_borrow = "warn"
needless_doctest_main = "warn"
needless_else = "warn"
needless_parens_on_range_literals = "warn"
needless_pub_self = "warn"
needless_return = "warn"
needless_return_with_question_mark = "warn"
non_minimal_cfg = "warn"
ok_expect = "warn"
owned_cow = "warn"
print_literal = "warn"
print_with_newline = "warn"
println_empty_string = "warn"
ptr_eq = "warn"
question_mark = "warn"
redundant_closure = "warn"
redundant_field_names = "warn"
redundant_pattern_matching = "warn"
redundant_static_lifetimes = "warn"
result_map_or_into_option = "warn"
self_named_constructors = "warn"
single_match = "warn"
tabs_in_doc_comments = "warn"
to_digit_is_some = "warn"
toplevel_ref_arg = "warn"
unnecessary_fold = "warn"
unnecessary_map_or = "warn"
unnecessary_mut_passed = "warn"
unnecessary_owned_empty_strings = "warn"
unneeded_struct_pattern = "warn"
unsafe_removed_from_name = "warn"
unused_unit = "warn"
unusual_byte_groupings = "warn"
while_let_on_iterator = "warn"
write_literal = "warn"
write_with_newline = "warn"
writeln_empty_string = "warn"
wrong_self_convention = "warn"
zero_ptr = "warn"
# Individual rules that have violations in the codebase:
type_complexity = "allow"
# We often return trait objects from `new` functions.
new_ret_no_self = { level = "allow" }
# We have a few `next` functions that differ in lifetimes
# compared to Iterator::next. Yet, clippy complains about those.
should_implement_trait = { level = "allow" }
let_underscore_future = "allow"
# It doesn't make sense to implement `Default` unilaterally.
new_without_default = "allow"
# Motivation: We use `vec![a..b]` a lot when dealing with ranges in text, so
# warning on this rule produces a lot of noise.
single_range_in_vec_init = "allow"
# in Rust it can be very tedious to reduce argument count without
# running afoul of the borrow checker.
@@ -951,10 +836,6 @@ too_many_arguments = "allow"
# We often have large enum variants yet we rarely actually bother with splitting them up.
large_enum_variant = "allow"
# `enum_variant_names` fires for all enums, even when they derive serde traits.
# Adhering to this lint would be a breaking change.
enum_variant_names = "allow"
[workspace.metadata.cargo-machete]
ignored = [
"bindgen",

View File

@@ -1715,7 +1715,7 @@ impl SemanticsProvider for SlashCommandSemanticsProvider {
buffer: &Entity<Buffer>,
position: text::Anchor,
cx: &mut App,
) -> Option<Task<Vec<project::Hover>>> {
) -> Option<Task<Option<Vec<project::Hover>>>> {
let snapshot = buffer.read(cx).snapshot();
let offset = position.to_offset(&snapshot);
let (start, end) = self.range.get()?;
@@ -1723,14 +1723,14 @@ impl SemanticsProvider for SlashCommandSemanticsProvider {
return None;
}
let range = snapshot.anchor_after(start)..snapshot.anchor_after(end);
Some(Task::ready(vec![project::Hover {
Some(Task::ready(Some(vec![project::Hover {
contents: vec![project::HoverBlock {
text: "Slash commands are not supported".into(),
kind: project::HoverBlockKind::PlainText,
}],
range: Some(range),
language: None,
}]))
}])))
}
fn inline_values(
@@ -1780,7 +1780,7 @@ impl SemanticsProvider for SlashCommandSemanticsProvider {
_position: text::Anchor,
_kind: editor::GotoDefinitionKind,
_cx: &mut App,
) -> Option<Task<Result<Vec<project::LocationLink>>>> {
) -> Option<Task<Result<Option<Vec<project::LocationLink>>>>> {
None
}

View File

@@ -192,7 +192,7 @@ impl AgentConfiguration {
let is_signed_in = self
.workspace
.read_with(cx, |workspace, _| {
workspace.client().status().borrow().is_connected()
!workspace.client().status().borrow().is_signed_out()
})
.unwrap_or(false);

View File

@@ -15,9 +15,10 @@ doctest = false
[dependencies]
anyhow.workspace = true
collections.workspace = true
derive_more.workspace = true
gpui.workspace = true
parking_lot.workspace = true
settings.workspace = true
schemars.workspace = true
serde.workspace = true
rodio = { workspace = true, features = [ "wav", "playback", "tracing" ] }
util.workspace = true
workspace-hack.workspace = true

View File

@@ -1,54 +0,0 @@
use std::{io::Cursor, sync::Arc};
use anyhow::{Context as _, Result};
use collections::HashMap;
use gpui::{App, AssetSource, Global};
use rodio::{Decoder, Source, source::Buffered};
type Sound = Buffered<Decoder<Cursor<Vec<u8>>>>;
pub struct SoundRegistry {
cache: Arc<parking_lot::Mutex<HashMap<String, Sound>>>,
assets: Box<dyn AssetSource>,
}
struct GlobalSoundRegistry(Arc<SoundRegistry>);
impl Global for GlobalSoundRegistry {}
impl SoundRegistry {
pub fn new(source: impl AssetSource) -> Arc<Self> {
Arc::new(Self {
cache: Default::default(),
assets: Box::new(source),
})
}
pub fn global(cx: &App) -> Arc<Self> {
cx.global::<GlobalSoundRegistry>().0.clone()
}
pub(crate) fn set_global(source: impl AssetSource, cx: &mut App) {
cx.set_global(GlobalSoundRegistry(SoundRegistry::new(source)));
}
pub fn get(&self, name: &str) -> Result<impl Source<Item = f32> + use<>> {
if let Some(wav) = self.cache.lock().get(name) {
return Ok(wav.clone());
}
let path = format!("sounds/{}.wav", name);
let bytes = self
.assets
.load(&path)?
.map(anyhow::Ok)
.with_context(|| format!("No asset available for path {path}"))??
.into_owned();
let cursor = Cursor::new(bytes);
let source = Decoder::new(cursor)?.buffered();
self.cache.lock().insert(name.to_string(), source.clone());
Ok(source)
}
}

View File

@@ -1,16 +1,19 @@
use assets::SoundRegistry;
use derive_more::{Deref, DerefMut};
use gpui::{App, AssetSource, BorrowAppContext, Global};
use rodio::{OutputStream, OutputStreamBuilder};
use anyhow::{Context as _, Result, anyhow};
use collections::HashMap;
use gpui::{App, BorrowAppContext, Global};
use rodio::{Decoder, OutputStream, OutputStreamBuilder, Source, source::Buffered};
use settings::Settings;
use std::io::Cursor;
use util::ResultExt;
mod assets;
mod audio_settings;
pub use audio_settings::AudioSettings;
pub fn init(source: impl AssetSource, cx: &mut App) {
SoundRegistry::set_global(source, cx);
cx.set_global(GlobalAudio(Audio::new()));
pub fn init(cx: &mut App) {
AudioSettings::register(cx);
}
#[derive(Copy, Clone, Eq, Hash, PartialEq)]
pub enum Sound {
Joined,
Leave,
@@ -38,18 +41,12 @@ impl Sound {
#[derive(Default)]
pub struct Audio {
output_handle: Option<OutputStream>,
source_cache: HashMap<Sound, Buffered<Decoder<Cursor<Vec<u8>>>>>,
}
#[derive(Deref, DerefMut)]
struct GlobalAudio(Audio);
impl Global for GlobalAudio {}
impl Global for Audio {}
impl Audio {
pub fn new() -> Self {
Self::default()
}
fn ensure_output_exists(&mut self) -> Option<&OutputStream> {
if self.output_handle.is_none() {
self.output_handle = OutputStreamBuilder::open_default_stream().log_err();
@@ -58,26 +55,51 @@ impl Audio {
self.output_handle.as_ref()
}
pub fn play_sound(sound: Sound, cx: &mut App) {
if !cx.has_global::<GlobalAudio>() {
return;
}
pub fn play_source(
source: impl rodio::Source + Send + 'static,
cx: &mut App,
) -> anyhow::Result<()> {
cx.update_default_global(|this: &mut Self, _cx| {
let output_handle = this
.ensure_output_exists()
.ok_or_else(|| anyhow!("Could not open audio output"))?;
output_handle.mixer().add(source);
Ok(())
})
}
cx.update_global::<GlobalAudio, _>(|this, cx| {
pub fn play_sound(sound: Sound, cx: &mut App) {
cx.update_default_global(|this: &mut Self, cx| {
let source = this.sound_source(sound, cx).log_err()?;
let output_handle = this.ensure_output_exists()?;
let source = SoundRegistry::global(cx).get(sound.file()).log_err()?;
output_handle.mixer().add(source);
Some(())
});
}
pub fn end_call(cx: &mut App) {
if !cx.has_global::<GlobalAudio>() {
return;
}
cx.update_global::<GlobalAudio, _>(|this, _| {
cx.update_default_global(|this: &mut Self, _cx| {
this.output_handle.take();
});
}
fn sound_source(&mut self, sound: Sound, cx: &App) -> Result<impl Source + use<>> {
if let Some(wav) = self.source_cache.get(&sound) {
return Ok(wav.clone());
}
let path = format!("sounds/{}.wav", sound.file());
let bytes = cx
.asset_source()
.load(&path)?
.map(anyhow::Ok)
.with_context(|| format!("No asset available for path {path}"))??
.into_owned();
let cursor = Cursor::new(bytes);
let source = Decoder::new(cursor)?.buffered();
self.source_cache.insert(sound, source.clone());
Ok(source)
}
}

View File

@@ -0,0 +1,33 @@
use anyhow::Result;
use gpui::App;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
#[derive(Deserialize, Debug)]
pub struct AudioSettings {
/// Opt into the new audio system.
#[serde(rename = "experimental.rodio_audio", default)]
pub rodio_audio: bool, // default is false
}
/// Configuration of audio in Zed.
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema, Debug)]
#[serde(default)]
pub struct AudioSettingsContent {
/// Whether to use the experimental audio system
#[serde(rename = "experimental.rodio_audio", default)]
pub rodio_audio: bool,
}
impl Settings for AudioSettings {
const KEY: Option<&'static str> = Some("audio");
type FileContent = AudioSettingsContent;
fn load(sources: SettingsSources<Self::FileContent>, _cx: &mut App) -> Result<Self> {
sources.json_merge()
}
fn import_from_vscode(_vscode: &settings::VsCodeSettings, _current: &mut Self::FileContent) {}
}

View File

@@ -0,0 +1,4 @@
alter table billing_subscriptions
add column orb_subscription_status text,
add column orb_current_billing_period_start_date timestamp without time zone,
add column orb_current_billing_period_end_date timestamp without time zone;

View File

@@ -400,6 +400,8 @@ impl Server {
.add_request_handler(forward_mutating_project_request::<proto::SaveBuffer>)
.add_request_handler(forward_mutating_project_request::<proto::BlameBuffer>)
.add_request_handler(multi_lsp_query)
.add_request_handler(lsp_query)
.add_message_handler(broadcast_project_message_from_host::<proto::LspQueryResponse>)
.add_request_handler(forward_mutating_project_request::<proto::RestartLanguageServers>)
.add_request_handler(forward_mutating_project_request::<proto::StopLanguageServers>)
.add_request_handler(forward_mutating_project_request::<proto::LinkedEditingRange>)
@@ -910,7 +912,9 @@ impl Server {
user_id=field::Empty,
login=field::Empty,
impersonator=field::Empty,
// todo(lsp) remove after Zed Stable hits v0.204.x
multi_lsp_query_request=field::Empty,
lsp_query_request=field::Empty,
release_channel=field::Empty,
{ TOTAL_DURATION_MS }=field::Empty,
{ PROCESSING_DURATION_MS }=field::Empty,
@@ -2356,6 +2360,7 @@ where
Ok(())
}
// todo(lsp) remove after Zed Stable hits v0.204.x
async fn multi_lsp_query(
request: MultiLspQuery,
response: Response<MultiLspQuery>,
@@ -2366,6 +2371,21 @@ async fn multi_lsp_query(
forward_mutating_project_request(request, response, session).await
}
async fn lsp_query(
request: proto::LspQuery,
response: Response<proto::LspQuery>,
session: MessageContext,
) -> Result<()> {
let (name, should_write) = request.query_name_and_write_permissions();
tracing::Span::current().record("lsp_query_request", name);
tracing::info!("lsp_query message received");
if should_write {
forward_mutating_project_request(request, response, session).await
} else {
forward_read_only_project_request(request, response, session).await
}
}
/// Notify other participants that a new buffer has been created
async fn create_buffer_for_peer(
request: proto::CreateBufferForPeer,

View File

@@ -15,13 +15,14 @@ use editor::{
},
};
use fs::Fs;
use futures::{StreamExt, lock::Mutex};
use futures::{SinkExt, StreamExt, channel::mpsc, lock::Mutex};
use gpui::{App, Rgba, TestAppContext, UpdateGlobal, VisualContext, VisualTestContext};
use indoc::indoc;
use language::{
FakeLspAdapter,
language_settings::{AllLanguageSettings, InlayHintSettings},
};
use lsp::LSP_REQUEST_TIMEOUT;
use project::{
ProjectPath, SERVER_PROGRESS_THROTTLE_TIMEOUT,
lsp_store::lsp_ext_command::{ExpandedMacro, LspExtExpandMacro},
@@ -1017,6 +1018,211 @@ async fn test_collaborating_with_renames(cx_a: &mut TestAppContext, cx_b: &mut T
})
}
#[gpui::test]
async fn test_slow_lsp_server(cx_a: &mut TestAppContext, cx_b: &mut TestAppContext) {
let mut server = TestServer::start(cx_a.executor()).await;
let client_a = server.create_client(cx_a, "user_a").await;
let client_b = server.create_client(cx_b, "user_b").await;
server
.create_room(&mut [(&client_a, cx_a), (&client_b, cx_b)])
.await;
let active_call_a = cx_a.read(ActiveCall::global);
cx_b.update(editor::init);
let command_name = "test_command";
let capabilities = lsp::ServerCapabilities {
code_lens_provider: Some(lsp::CodeLensOptions {
resolve_provider: None,
}),
execute_command_provider: Some(lsp::ExecuteCommandOptions {
commands: vec![command_name.to_string()],
..lsp::ExecuteCommandOptions::default()
}),
..lsp::ServerCapabilities::default()
};
client_a.language_registry().add(rust_lang());
let mut fake_language_servers = client_a.language_registry().register_fake_lsp(
"Rust",
FakeLspAdapter {
capabilities: capabilities.clone(),
..FakeLspAdapter::default()
},
);
client_b.language_registry().add(rust_lang());
client_b.language_registry().register_fake_lsp_adapter(
"Rust",
FakeLspAdapter {
capabilities,
..FakeLspAdapter::default()
},
);
client_a
.fs()
.insert_tree(
path!("/dir"),
json!({
"one.rs": "const ONE: usize = 1;"
}),
)
.await;
let (project_a, worktree_id) = client_a.build_local_project(path!("/dir"), cx_a).await;
let project_id = active_call_a
.update(cx_a, |call, cx| call.share_project(project_a.clone(), cx))
.await
.unwrap();
let project_b = client_b.join_remote_project(project_id, cx_b).await;
let (workspace_b, cx_b) = client_b.build_workspace(&project_b, cx_b);
let editor_b = workspace_b
.update_in(cx_b, |workspace, window, cx| {
workspace.open_path((worktree_id, "one.rs"), None, true, window, cx)
})
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
let (lsp_store_b, buffer_b) = editor_b.update(cx_b, |editor, cx| {
let lsp_store = editor.project().unwrap().read(cx).lsp_store();
let buffer = editor.buffer().read(cx).as_singleton().unwrap();
(lsp_store, buffer)
});
let fake_language_server = fake_language_servers.next().await.unwrap();
cx_a.run_until_parked();
cx_b.run_until_parked();
let long_request_time = LSP_REQUEST_TIMEOUT / 2;
let (request_started_tx, mut request_started_rx) = mpsc::unbounded();
let requests_started = Arc::new(AtomicUsize::new(0));
let requests_completed = Arc::new(AtomicUsize::new(0));
let _lens_requests = fake_language_server
.set_request_handler::<lsp::request::CodeLensRequest, _, _>({
let request_started_tx = request_started_tx.clone();
let requests_started = requests_started.clone();
let requests_completed = requests_completed.clone();
move |params, cx| {
let mut request_started_tx = request_started_tx.clone();
let requests_started = requests_started.clone();
let requests_completed = requests_completed.clone();
async move {
assert_eq!(
params.text_document.uri.as_str(),
uri!("file:///dir/one.rs")
);
requests_started.fetch_add(1, atomic::Ordering::Release);
request_started_tx.send(()).await.unwrap();
cx.background_executor().timer(long_request_time).await;
let i = requests_completed.fetch_add(1, atomic::Ordering::Release) + 1;
Ok(Some(vec![lsp::CodeLens {
range: lsp::Range::new(lsp::Position::new(0, 0), lsp::Position::new(0, 9)),
command: Some(lsp::Command {
title: format!("LSP Command {i}"),
command: command_name.to_string(),
arguments: None,
}),
data: None,
}]))
}
}
});
// Move cursor to a location, this should trigger the code lens call.
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([7..7])
});
});
let () = request_started_rx.next().await.unwrap();
assert_eq!(
requests_started.load(atomic::Ordering::Acquire),
1,
"Selection change should have initiated the first request"
);
assert_eq!(
requests_completed.load(atomic::Ordering::Acquire),
0,
"Slow requests should be running still"
);
let _first_task = lsp_store_b.update(cx_b, |lsp_store, cx| {
lsp_store
.forget_code_lens_task(buffer_b.read(cx).remote_id())
.expect("Should have the fetch task started")
});
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([1..1])
});
});
let () = request_started_rx.next().await.unwrap();
assert_eq!(
requests_started.load(atomic::Ordering::Acquire),
2,
"Selection change should have initiated the second request"
);
assert_eq!(
requests_completed.load(atomic::Ordering::Acquire),
0,
"Slow requests should be running still"
);
let _second_task = lsp_store_b.update(cx_b, |lsp_store, cx| {
lsp_store
.forget_code_lens_task(buffer_b.read(cx).remote_id())
.expect("Should have the fetch task started for the 2nd time")
});
editor_b.update_in(cx_b, |editor, window, cx| {
editor.change_selections(SelectionEffects::no_scroll(), window, cx, |s| {
s.select_ranges([2..2])
});
});
let () = request_started_rx.next().await.unwrap();
assert_eq!(
requests_started.load(atomic::Ordering::Acquire),
3,
"Selection change should have initiated the third request"
);
assert_eq!(
requests_completed.load(atomic::Ordering::Acquire),
0,
"Slow requests should be running still"
);
_first_task.await.unwrap();
_second_task.await.unwrap();
cx_b.run_until_parked();
assert_eq!(
requests_started.load(atomic::Ordering::Acquire),
3,
        "No further code lens requests should be triggered without selection changes"
);
assert_eq!(
requests_completed.load(atomic::Ordering::Acquire),
3,
"After enough time, all 3 LSP requests should have been served by the language server"
);
let resulting_lens_actions = editor_b
.update(cx_b, |editor, cx| {
let lsp_store = editor.project().unwrap().read(cx).lsp_store();
lsp_store.update(cx, |lsp_store, cx| {
lsp_store.code_lens_actions(&buffer_b, cx)
})
})
.await
.unwrap()
.unwrap();
assert_eq!(
resulting_lens_actions.len(),
1,
"Should have fetched one code lens action, but got: {resulting_lens_actions:?}"
);
assert_eq!(
resulting_lens_actions.first().unwrap().lsp_action.title(),
"LSP Command 3",
"Only the final code lens action should be in the data"
)
}
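The test above coordinates a deliberately slow fake language server with two atomic counters and an unbounded channel: each handler bumps a "started" counter, signals the test, sleeps, then bumps "completed", which is what lets the assertions distinguish in-flight requests from finished ones. A minimal, self-contained sketch of that bookkeeping pattern (plain threads and `std::sync::mpsc` stand in for the executor and the fake server; all names are illustrative, not from the Zed codebase):

```rust
use std::sync::Arc;
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Illustrative stand-in for the slow request handler: record that the
// request started, notify the observer, do slow work, record completion.
fn spawn_slow_request(
    started: Arc<AtomicUsize>,
    completed: Arc<AtomicUsize>,
    notify: mpsc::Sender<()>,
) -> thread::JoinHandle<()> {
    thread::spawn(move || {
        started.fetch_add(1, Ordering::Release);
        notify.send(()).unwrap();
        thread::sleep(Duration::from_millis(50)); // the "slow" part
        completed.fetch_add(1, Ordering::Release);
    })
}

fn main() {
    let started = Arc::new(AtomicUsize::new(0));
    let completed = Arc::new(AtomicUsize::new(0));
    let (tx, rx) = mpsc::channel();

    let handle = spawn_slow_request(started.clone(), completed.clone(), tx);

    // Like the test: first wait until the request has *started*...
    rx.recv().unwrap();
    assert_eq!(started.load(Ordering::Acquire), 1);
    assert_eq!(completed.load(Ordering::Acquire), 0, "still in flight");

    // ...then wait for it to finish.
    handle.join().unwrap();
    assert_eq!(completed.load(Ordering::Acquire), 1);
}
```

The channel signal is essential: asserting on the counters without it would race against the handler thread.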
#[gpui::test(iterations = 10)]
async fn test_language_server_statuses(cx_a: &mut TestAppContext, cx_b: &mut TestAppContext) {
let mut server = TestServer::start(cx_a.executor()).await;


@@ -4850,6 +4850,7 @@ async fn test_definition(
let definitions_1 = project_b
.update(cx_b, |p, cx| p.definitions(&buffer_b, 23, cx))
.await
.unwrap()
.unwrap();
cx_b.read(|cx| {
assert_eq!(
@@ -4885,6 +4886,7 @@ async fn test_definition(
let definitions_2 = project_b
.update(cx_b, |p, cx| p.definitions(&buffer_b, 33, cx))
.await
.unwrap()
.unwrap();
cx_b.read(|cx| {
assert_eq!(definitions_2.len(), 1);
@@ -4922,6 +4924,7 @@ async fn test_definition(
let type_definitions = project_b
.update(cx_b, |p, cx| p.type_definitions(&buffer_b, 7, cx))
.await
.unwrap()
.unwrap();
cx_b.read(|cx| {
assert_eq!(
@@ -5060,7 +5063,7 @@ async fn test_references(
])))
.unwrap();
let references = references.await.unwrap();
let references = references.await.unwrap().unwrap();
executor.run_until_parked();
project_b.read_with(cx_b, |project, cx| {
// User is informed that a request is no longer pending.
@@ -5104,7 +5107,7 @@ async fn test_references(
lsp_response_tx
.unbounded_send(Err(anyhow!("can't find references")))
.unwrap();
assert_eq!(references.await.unwrap(), []);
assert_eq!(references.await.unwrap().unwrap(), []);
// User is informed that the request is no longer pending.
executor.run_until_parked();
@@ -5505,7 +5508,8 @@ async fn test_lsp_hover(
// Request hover information as the guest.
let mut hovers = project_b
.update(cx_b, |p, cx| p.hover(&buffer_b, 22, cx))
.await;
.await
.unwrap();
assert_eq!(
hovers.len(),
2,
@@ -5764,7 +5768,7 @@ async fn test_open_buffer_while_getting_definition_pointing_to_it(
definitions = project_b.update(cx_b, |p, cx| p.definitions(&buffer_b1, 23, cx));
}
let definitions = definitions.await.unwrap();
let definitions = definitions.await.unwrap().unwrap();
assert_eq!(
definitions.len(),
1,


@@ -4,6 +4,8 @@ use minidumper::{Client, LoopAction, MinidumpBinary};
use release_channel::{RELEASE_CHANNEL, ReleaseChannel};
use serde::{Deserialize, Serialize};
#[cfg(target_os = "macos")]
use std::sync::atomic::AtomicU32;
use std::{
env,
fs::{self, File},
@@ -26,6 +28,9 @@ pub static REQUESTED_MINIDUMP: AtomicBool = AtomicBool::new(false);
const CRASH_HANDLER_PING_TIMEOUT: Duration = Duration::from_secs(60);
const CRASH_HANDLER_CONNECT_TIMEOUT: Duration = Duration::from_secs(10);
#[cfg(target_os = "macos")]
static PANIC_THREAD_ID: AtomicU32 = AtomicU32::new(0);
pub async fn init(crash_init: InitCrashHandler) {
if *RELEASE_CHANNEL == ReleaseChannel::Dev && env::var("ZED_GENERATE_MINIDUMPS").is_err() {
return;
@@ -110,9 +115,10 @@ unsafe fn suspend_all_other_threads() {
mach2::task::task_threads(task, &raw mut threads, &raw mut count);
}
let current = unsafe { mach2::mach_init::mach_thread_self() };
let panic_thread = PANIC_THREAD_ID.load(Ordering::SeqCst);
for i in 0..count {
let t = unsafe { *threads.add(i as usize) };
if t != current {
if t != current && t != panic_thread {
unsafe { mach2::thread_act::thread_suspend(t) };
}
}
@@ -238,6 +244,13 @@ pub fn handle_panic(message: String, span: Option<&Location>) {
)
.ok();
log::error!("triggering a crash to generate a minidump...");
#[cfg(target_os = "macos")]
PANIC_THREAD_ID.store(
unsafe { mach2::mach_init::mach_thread_self() },
Ordering::SeqCst,
);
#[cfg(target_os = "linux")]
CrashHandler.simulate_signal(crash_handler::Signal::Trap as u32);
#[cfg(not(target_os = "linux"))]


@@ -99,6 +99,7 @@ fn handle_preprocessing() -> Result<()> {
let mut errors = HashSet::<PreprocessorError>::new();
handle_frontmatter(&mut book, &mut errors);
template_big_table_of_actions(&mut book);
template_and_validate_keybindings(&mut book, &mut errors);
template_and_validate_actions(&mut book, &mut errors);
@@ -147,6 +148,18 @@ fn handle_frontmatter(book: &mut Book, errors: &mut HashSet<PreprocessorError>)
});
}
fn template_big_table_of_actions(book: &mut Book) {
for_each_chapter_mut(book, |chapter| {
let needle = "{#ACTIONS_TABLE#}";
if let Some(start) = chapter.content.rfind(needle) {
chapter.content.replace_range(
start..start + needle.len(),
&generate_big_table_of_actions(),
);
}
});
}
fn template_and_validate_keybindings(book: &mut Book, errors: &mut HashSet<PreprocessorError>) {
let regex = Regex::new(r"\{#kb (.*?)\}").unwrap();
@@ -277,6 +290,7 @@ struct ActionDef {
name: &'static str,
human_name: String,
deprecated_aliases: &'static [&'static str],
docs: Option<&'static str>,
}
fn dump_all_gpui_actions() -> Vec<ActionDef> {
@@ -285,6 +299,7 @@ fn dump_all_gpui_actions() -> Vec<ActionDef> {
name: action.name,
human_name: command_palette::humanize_action_name(action.name),
deprecated_aliases: action.deprecated_aliases,
docs: action.documentation,
})
.collect::<Vec<ActionDef>>();
@@ -418,3 +433,54 @@ fn title_regex() -> &'static Regex {
static TITLE_REGEX: OnceLock<Regex> = OnceLock::new();
TITLE_REGEX.get_or_init(|| Regex::new(r"<title>\s*(.*?)\s*</title>").unwrap())
}
fn generate_big_table_of_actions() -> String {
let actions = &*ALL_ACTIONS;
let mut output = String::new();
let mut actions_sorted = actions.iter().collect::<Vec<_>>();
actions_sorted.sort_by_key(|a| a.name);
// Start the definition list with custom styling for better spacing
output.push_str("<dl style=\"line-height: 1.8;\">\n");
for action in actions_sorted.into_iter() {
// Add the humanized action name as the term with margin
output.push_str(
"<dt style=\"margin-top: 1.5em; margin-bottom: 0.5em; font-weight: bold;\"><code>",
);
output.push_str(&action.human_name);
output.push_str("</code></dt>\n");
// Add the definition with keymap name and description
output.push_str("<dd style=\"margin-left: 2em; margin-bottom: 1em;\">\n");
// Add the description, escaping HTML if needed
if let Some(description) = action.docs {
output.push_str(
&description
.replace("&", "&amp;")
.replace("<", "&lt;")
.replace(">", "&gt;"),
);
output.push_str("<br>\n");
}
output.push_str("Keymap Name: <code>");
output.push_str(action.name);
output.push_str("</code><br>\n");
if !action.deprecated_aliases.is_empty() {
output.push_str("Deprecated Aliases:");
for alias in action.deprecated_aliases.iter() {
output.push_str("<code>");
output.push_str(alias);
output.push_str("</code>, ");
}
}
output.push_str("\n</dd>\n");
}
// Close the definition list
output.push_str("</dl>\n");
output
}
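The chained `replace` calls above escape action documentation for HTML output, and their order matters: `&` must be handled first, otherwise the ampersands produced by escaping `<` and `>` would themselves be re-escaped into `&amp;lt;`. A standalone sketch of the same escaper:

```rust
// Minimal HTML escaper mirroring the chained `replace` calls in the docs
// generator. `&` is replaced first so that the entities emitted for `<`
// and `>` are not double-escaped.
fn escape_html(input: &str) -> String {
    input
        .replace('&', "&amp;")
        .replace('<', "&lt;")
        .replace('>', "&gt;")
}

fn main() {
    assert_eq!(escape_html("a < b && c > d"), "a &lt; b &amp;&amp; c &gt; d");
    assert_eq!(escape_html("plain"), "plain");
}
```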


@@ -1900,6 +1900,60 @@ impl Editor {
editor.update_lsp_data(false, Some(*buffer_id), window, cx);
}
}
project::Event::EntryRenamed(transaction) => {
let Some(workspace) = editor.workspace() else {
return;
};
let Some(active_editor) = workspace.read(cx).active_item_as::<Self>(cx)
else {
return;
};
if active_editor.entity_id() == cx.entity_id() {
let edited_buffers_already_open = {
let other_editors: Vec<Entity<Editor>> = workspace
.read(cx)
.panes()
.iter()
.flat_map(|pane| pane.read(cx).items_of_type::<Editor>())
.filter(|editor| editor.entity_id() != cx.entity_id())
.collect();
transaction.0.keys().all(|buffer| {
other_editors.iter().any(|editor| {
let multi_buffer = editor.read(cx).buffer();
multi_buffer.read(cx).is_singleton()
&& multi_buffer.read(cx).as_singleton().map_or(
false,
|singleton| {
singleton.entity_id() == buffer.entity_id()
},
)
})
})
};
if !edited_buffers_already_open {
let workspace = workspace.downgrade();
let transaction = transaction.clone();
cx.defer_in(window, move |_, window, cx| {
cx.spawn_in(window, async move |editor, cx| {
Self::open_project_transaction(
&editor,
workspace,
transaction,
"Rename".to_string(),
cx,
)
.await
.ok()
})
.detach();
});
}
}
}
_ => {}
},
));
@@ -6282,7 +6336,7 @@ impl Editor {
}
pub async fn open_project_transaction(
this: &WeakEntity<Editor>,
editor: &WeakEntity<Editor>,
workspace: WeakEntity<Workspace>,
transaction: ProjectTransaction,
title: String,
@@ -6300,7 +6354,7 @@ impl Editor {
if let Some((buffer, transaction)) = entries.first() {
if entries.len() == 1 {
let excerpt = this.update(cx, |editor, cx| {
let excerpt = editor.update(cx, |editor, cx| {
editor
.buffer()
.read(cx)
@@ -15700,7 +15754,9 @@ impl Editor {
};
cx.spawn_in(window, async move |editor, cx| {
let definitions = definitions.await?;
let Some(definitions) = definitions.await? else {
return Ok(Navigated::No);
};
let navigated = editor
.update_in(cx, |editor, window, cx| {
editor.navigate_to_hover_links(
@@ -16042,7 +16098,9 @@ impl Editor {
}
});
let locations = references.await?;
let Some(locations) = references.await? else {
return anyhow::Ok(Navigated::No);
};
if locations.is_empty() {
return anyhow::Ok(Navigated::No);
}
@@ -21824,7 +21882,7 @@ pub trait SemanticsProvider {
buffer: &Entity<Buffer>,
position: text::Anchor,
cx: &mut App,
) -> Option<Task<Vec<project::Hover>>>;
) -> Option<Task<Option<Vec<project::Hover>>>>;
fn inline_values(
&self,
@@ -21863,7 +21921,7 @@ pub trait SemanticsProvider {
position: text::Anchor,
kind: GotoDefinitionKind,
cx: &mut App,
) -> Option<Task<Result<Vec<LocationLink>>>>;
) -> Option<Task<Result<Option<Vec<LocationLink>>>>>;
fn range_for_rename(
&self,
@@ -21976,7 +22034,13 @@ impl CodeActionProvider for Entity<Project> {
Ok(code_lens_actions
.context("code lens fetch")?
.into_iter()
.chain(code_actions.context("code action fetch")?)
.flatten()
.chain(
code_actions
.context("code action fetch")?
.into_iter()
.flatten(),
)
.collect())
})
})
@@ -22271,7 +22335,7 @@ impl SemanticsProvider for Entity<Project> {
buffer: &Entity<Buffer>,
position: text::Anchor,
cx: &mut App,
) -> Option<Task<Vec<project::Hover>>> {
) -> Option<Task<Option<Vec<project::Hover>>>> {
Some(self.update(cx, |project, cx| project.hover(buffer, position, cx)))
}
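The signature changes in this file (from `Result<Vec<...>>` to `Result<Option<Vec<...>>>`) let callers tell "no language server produced a response" (`Ok(None)`) apart from "a server responded with an empty list" (`Ok(Some(vec![])))`. A reduced sketch of both sides of that contract, with hypothetical types standing in for `LocationLink` and `Navigated`:

```rust
// Provider side: `Ok(None)` means no server answered; `Ok(Some(..))`
// carries an actual (possibly empty) answer.
fn definitions(answered: bool) -> Result<Option<Vec<&'static str>>, String> {
    if answered {
        Ok(Some(vec!["one.rs:1"]))
    } else {
        Ok(None)
    }
}

// Consumer side: mirrors the `let Some(..) = fut.await? else { return
// Navigated::No }` pattern in the diff (here returning a count instead).
fn navigate(answered: bool) -> usize {
    let Ok(Some(locations)) = definitions(answered) else {
        return 0;
    };
    locations.len()
}

fn main() {
    assert_eq!(navigate(true), 1);
    assert_eq!(navigate(false), 0);
}
```

The `task.await.ok().flatten()` call in the link-definition hunk below collapses both the error and the `None` case in one step when the distinction does not matter.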
@@ -22292,7 +22356,7 @@ impl SemanticsProvider for Entity<Project> {
position: text::Anchor,
kind: GotoDefinitionKind,
cx: &mut App,
) -> Option<Task<Result<Vec<LocationLink>>>> {
) -> Option<Task<Result<Option<Vec<LocationLink>>>>> {
Some(self.update(cx, |project, cx| match kind {
GotoDefinitionKind::Symbol => project.definitions(buffer, position, cx),
GotoDefinitionKind::Declaration => project.declarations(buffer, position, cx),


@@ -1,6 +1,7 @@
use crate::{Editor, RangeToAnchorExt};
use gpui::{Context, Window};
use gpui::{Context, HighlightStyle, Window};
use language::CursorShape;
use theme::ActiveTheme;
enum MatchingBracketHighlight {}
@@ -9,7 +10,7 @@ pub fn refresh_matching_bracket_highlights(
window: &mut Window,
cx: &mut Context<Editor>,
) {
editor.clear_background_highlights::<MatchingBracketHighlight>(cx);
editor.clear_highlights::<MatchingBracketHighlight>(cx);
let newest_selection = editor.selections.newest::<usize>(cx);
// Don't highlight brackets if the selection isn't empty
@@ -35,12 +36,19 @@ pub fn refresh_matching_bracket_highlights(
.buffer_snapshot
.innermost_enclosing_bracket_ranges(head..tail, None)
{
editor.highlight_background::<MatchingBracketHighlight>(
&[
editor.highlight_text::<MatchingBracketHighlight>(
vec![
opening_range.to_anchors(&snapshot.buffer_snapshot),
closing_range.to_anchors(&snapshot.buffer_snapshot),
],
|theme| theme.colors().editor_document_highlight_bracket_background,
HighlightStyle {
background_color: Some(
cx.theme()
.colors()
.editor_document_highlight_bracket_background,
),
..Default::default()
},
cx,
)
}
@@ -104,7 +112,7 @@ mod tests {
another_test(1, 2, 3);
}
"#});
cx.assert_editor_background_highlights::<MatchingBracketHighlight>(indoc! {r#"
cx.assert_editor_text_highlights::<MatchingBracketHighlight>(indoc! {r#"
pub fn test«(»"Test argument"«)» {
another_test(1, 2, 3);
}
@@ -115,7 +123,7 @@ mod tests {
another_test(1, ˇ2, 3);
}
"#});
cx.assert_editor_background_highlights::<MatchingBracketHighlight>(indoc! {r#"
cx.assert_editor_text_highlights::<MatchingBracketHighlight>(indoc! {r#"
pub fn test("Test argument") {
another_test«(»1, 2, 3«)»;
}
@@ -126,7 +134,7 @@ mod tests {
anotherˇ_test(1, 2, 3);
}
"#});
cx.assert_editor_background_highlights::<MatchingBracketHighlight>(indoc! {r#"
cx.assert_editor_text_highlights::<MatchingBracketHighlight>(indoc! {r#"
pub fn test("Test argument") «{»
another_test(1, 2, 3);
«}»
@@ -138,7 +146,7 @@ mod tests {
another_test(1, 2, 3);
}
"#});
cx.assert_editor_background_highlights::<MatchingBracketHighlight>(indoc! {r#"
cx.assert_editor_text_highlights::<MatchingBracketHighlight>(indoc! {r#"
pub fn test("Test argument") {
another_test(1, 2, 3);
}
@@ -150,8 +158,8 @@ mod tests {
another_test(1, 2, 3);
}
"#});
cx.assert_editor_background_highlights::<MatchingBracketHighlight>(indoc! {r#"
pub fn test("Test argument") {
cx.assert_editor_text_highlights::<MatchingBracketHighlight>(indoc! {r#"
pub fn test«("Test argument") {
another_test(1, 2, 3);
}
"#});


@@ -562,7 +562,7 @@ pub fn show_link_definition(
provider.definitions(&buffer, buffer_position, preferred_kind, cx)
})?;
if let Some(task) = task {
task.await.ok().map(|definition_result| {
task.await.ok().flatten().map(|definition_result| {
(
definition_result.iter().find_map(|link| {
link.origin.as_ref().and_then(|origin| {


@@ -428,7 +428,7 @@ fn show_hover(
};
let hovers_response = if let Some(hover_request) = hover_request {
hover_request.await
hover_request.await.unwrap_or_default()
} else {
Vec::new()
};


@@ -431,7 +431,7 @@ impl SemanticsProvider for BranchBufferSemanticsProvider {
buffer: &Entity<Buffer>,
position: text::Anchor,
cx: &mut App,
) -> Option<Task<Vec<project::Hover>>> {
) -> Option<Task<Option<Vec<project::Hover>>>> {
let buffer = self.to_base(buffer, &[position], cx)?;
self.0.hover(&buffer, position, cx)
}
@@ -490,7 +490,7 @@ impl SemanticsProvider for BranchBufferSemanticsProvider {
position: text::Anchor,
kind: crate::GotoDefinitionKind,
cx: &mut App,
) -> Option<Task<anyhow::Result<Vec<project::LocationLink>>>> {
) -> Option<Task<anyhow::Result<Option<Vec<project::LocationLink>>>>> {
let buffer = self.to_base(buffer, &[position], cx)?;
self.0.definitions(&buffer, position, kind, cx)
}


@@ -182,7 +182,9 @@ impl Editor {
let signature_help = task.await;
editor
.update(cx, |editor, cx| {
let Some(mut signature_help) = signature_help.into_iter().next() else {
let Some(mut signature_help) =
signature_help.unwrap_or_default().into_iter().next()
else {
editor
.signature_help_state
.hide(SignatureHelpHiddenBy::AutoClose);


@@ -1,5 +1,6 @@
use std::{
cell::RefCell,
ffi::OsStr,
mem::ManuallyDrop,
path::{Path, PathBuf},
rc::Rc,
@@ -460,13 +461,15 @@ impl Platform for WindowsPlatform {
}
fn open_url(&self, url: &str) {
if url.is_empty() {
return;
}
let url_string = url.to_string();
self.background_executor()
.spawn(async move {
if url_string.is_empty() {
return;
}
open_target(url_string.as_str());
open_target(&url_string)
.with_context(|| format!("Opening url: {}", url_string))
.log_err();
})
.detach();
}
@@ -514,37 +517,29 @@ impl Platform for WindowsPlatform {
}
fn reveal_path(&self, path: &Path) {
let Ok(file_full_path) = path.canonicalize() else {
log::error!("unable to parse file path");
if path.as_os_str().is_empty() {
return;
};
}
let path = path.to_path_buf();
self.background_executor()
.spawn(async move {
let Some(path) = file_full_path.to_str() else {
return;
};
if path.is_empty() {
return;
}
open_target_in_explorer(path);
open_target_in_explorer(&path)
.with_context(|| format!("Revealing path {} in explorer", path.display()))
.log_err();
})
.detach();
}
fn open_with_system(&self, path: &Path) {
let Ok(full_path) = path.canonicalize() else {
log::error!("unable to parse file full path: {}", path.display());
if path.as_os_str().is_empty() {
return;
};
}
let path = path.to_path_buf();
self.background_executor()
.spawn(async move {
let Some(full_path_str) = full_path.to_str() else {
return;
};
if full_path_str.is_empty() {
return;
};
open_target(full_path_str);
open_target(&path)
.with_context(|| format!("Opening {} with system", path.display()))
.log_err();
})
.detach();
}
@@ -735,39 +730,67 @@ pub(crate) struct WindowCreationInfo {
pub(crate) disable_direct_composition: bool,
}
fn open_target(target: &str) {
unsafe {
let ret = ShellExecuteW(
fn open_target(target: impl AsRef<OsStr>) -> Result<()> {
let target = target.as_ref();
let ret = unsafe {
ShellExecuteW(
None,
windows::core::w!("open"),
&HSTRING::from(target),
None,
None,
SW_SHOWDEFAULT,
);
if ret.0 as isize <= 32 {
log::error!("Unable to open target: {}", std::io::Error::last_os_error());
}
)
};
if ret.0 as isize <= 32 {
Err(anyhow::anyhow!(
"Unable to open target: {}",
std::io::Error::last_os_error()
))
} else {
Ok(())
}
}
fn open_target_in_explorer(target: &str) {
fn open_target_in_explorer(target: &Path) -> Result<()> {
let dir = target.parent().context("No parent folder found")?;
let desktop = unsafe { SHGetDesktopFolder()? };
let mut dir_item = std::ptr::null_mut();
unsafe {
let ret = ShellExecuteW(
desktop.ParseDisplayName(
HWND::default(),
None,
windows::core::w!("open"),
windows::core::w!("explorer.exe"),
&HSTRING::from(format!("/select,{}", target).as_str()),
&HSTRING::from(dir),
None,
SW_SHOWDEFAULT,
);
if ret.0 as isize <= 32 {
log::error!(
"Unable to open target in explorer: {}",
std::io::Error::last_os_error()
);
}
&mut dir_item,
std::ptr::null_mut(),
)?;
}
let mut file_item = std::ptr::null_mut();
unsafe {
desktop.ParseDisplayName(
HWND::default(),
None,
&HSTRING::from(target),
None,
&mut file_item,
std::ptr::null_mut(),
)?;
}
let highlight = [file_item as *const _];
unsafe { SHOpenFolderAndSelectItems(dir_item as _, Some(&highlight), 0) }.or_else(|err| {
if err.code().0 == ERROR_FILE_NOT_FOUND.0 as i32 {
// On some systems, the above call mysteriously fails with "file not
// found" even though the file is there. In these cases, ShellExecute()
// seems to work as a fallback (although it won't select the file).
open_target(dir).context("Opening target parent folder")
} else {
Err(anyhow::anyhow!("Can not open target path: {}", err))
}
})
}
fn file_open_dialog(


@@ -5,7 +5,7 @@ use anyhow::Result;
use collections::{FxHashMap, HashMap, HashSet};
use ec4rs::{
Properties as EditorconfigProperties,
property::{FinalNewline, IndentSize, IndentStyle, TabWidth, TrimTrailingWs},
property::{FinalNewline, IndentSize, IndentStyle, MaxLineLen, TabWidth, TrimTrailingWs},
};
use globset::{Glob, GlobMatcher, GlobSet, GlobSetBuilder};
use gpui::{App, Modifiers};
@@ -1131,6 +1131,10 @@ impl AllLanguageSettings {
}
fn merge_with_editorconfig(settings: &mut LanguageSettings, cfg: &EditorconfigProperties) {
let preferred_line_length = cfg.get::<MaxLineLen>().ok().and_then(|v| match v {
MaxLineLen::Value(u) => Some(u as u32),
MaxLineLen::Off => None,
});
let tab_size = cfg.get::<IndentSize>().ok().and_then(|v| match v {
IndentSize::Value(u) => NonZeroU32::new(u as u32),
IndentSize::UseTabWidth => cfg.get::<TabWidth>().ok().and_then(|w| match w {
@@ -1158,6 +1162,7 @@ fn merge_with_editorconfig(settings: &mut LanguageSettings, cfg: &EditorconfigPr
*target = value;
}
}
merge(&mut settings.preferred_line_length, preferred_line_length);
merge(&mut settings.tab_size, tab_size);
merge(&mut settings.hard_tabs, hard_tabs);
merge(


@@ -96,7 +96,7 @@ impl<T: LocalLanguageToolchainStore> LanguageToolchainStore for T {
}
type DefaultIndex = usize;
#[derive(Default, Clone)]
#[derive(Default, Clone, Debug)]
pub struct ToolchainList {
pub toolchains: Vec<Toolchain>,
pub default: Option<DefaultIndex>,


@@ -25,6 +25,7 @@ async-trait.workspace = true
collections.workspace = true
cpal.workspace = true
futures.workspace = true
audio.workspace = true
gpui = { workspace = true, features = ["screen-capture", "x11", "wayland", "windows-manifest"] }
gpui_tokio.workspace = true
http_client_tls.workspace = true
@@ -35,6 +36,7 @@ nanoid.workspace = true
parking_lot.workspace = true
postage.workspace = true
smallvec.workspace = true
settings.workspace = true
tokio-tungstenite.workspace = true
util.workspace = true
workspace-hack.workspace = true


@@ -24,8 +24,11 @@ mod livekit_client;
)))]
pub use livekit_client::*;
// If you need proper LSP in livekit_client you've got to comment out
// the mocks and test
// If you need proper LSP in livekit_client you've got to comment
// - the cfg blocks above
// - the mods: mock_client & test and their conditional blocks
// - the pub use mock_client::* and their conditional blocks
#[cfg(any(
test,
feature = "test-support",


@@ -1,15 +1,16 @@
use std::sync::Arc;
use anyhow::{Context as _, Result};
use audio::AudioSettings;
use collections::HashMap;
use futures::{SinkExt, channel::mpsc};
use gpui::{App, AsyncApp, ScreenCaptureSource, ScreenCaptureStream, Task};
use gpui_tokio::Tokio;
use log::info;
use playback::capture_local_video_track;
use settings::Settings;
mod playback;
#[cfg(feature = "record-microphone")]
mod record;
use crate::{LocalTrack, Participant, RemoteTrack, RoomEvent, TrackPublication};
pub use playback::AudioStream;
@@ -125,9 +126,14 @@ impl Room {
pub fn play_remote_audio_track(
&self,
track: &RemoteAudioTrack,
_cx: &App,
cx: &mut App,
) -> Result<playback::AudioStream> {
Ok(self.playback.play_remote_audio_track(&track.0))
if AudioSettings::get_global(cx).rodio_audio {
info!("Using experimental.rodio_audio audio pipeline");
playback::play_remote_audio_track(&track.0, cx)
} else {
Ok(self.playback.play_remote_audio_track(&track.0))
}
}
}


@@ -18,13 +18,16 @@ use livekit::webrtc::{
video_stream::native::NativeVideoStream,
};
use parking_lot::Mutex;
use rodio::Source;
use std::cell::RefCell;
use std::sync::Weak;
use std::sync::atomic::{self, AtomicI32};
use std::sync::atomic::{AtomicBool, AtomicI32, Ordering};
use std::time::Duration;
use std::{borrow::Cow, collections::VecDeque, sync::Arc, thread};
use util::{ResultExt as _, maybe};
mod source;
pub(crate) struct AudioStack {
executor: BackgroundExecutor,
apm: Arc<Mutex<apm::AudioProcessingModule>>,
@@ -40,6 +43,29 @@ pub(crate) struct AudioStack {
const SAMPLE_RATE: u32 = 48000;
const NUM_CHANNELS: u32 = 2;
pub(crate) fn play_remote_audio_track(
track: &livekit::track::RemoteAudioTrack,
cx: &mut gpui::App,
) -> Result<AudioStream> {
let stop_handle = Arc::new(AtomicBool::new(false));
let stop_handle_clone = stop_handle.clone();
let stream = source::LiveKitStream::new(cx.background_executor(), track)
.stoppable()
.periodic_access(Duration::from_millis(50), move |s| {
if stop_handle.load(Ordering::Relaxed) {
s.stop();
}
});
audio::Audio::play_source(stream, cx).context("Could not play audio")?;
let on_drop = util::defer(move || {
stop_handle_clone.store(true, Ordering::Relaxed);
});
Ok(AudioStream::Output {
_drop: Box::new(on_drop),
})
}
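The stop protocol above is worth noting: the playback source polls a shared `AtomicBool` (via rodio's `periodic_access` every 50 ms), and the returned handle flips that flag when dropped, so dropping the `AudioStream` ends playback. A dependency-free sketch of the same drop-to-stop pattern, using a plain thread in place of the audio pipeline (names are illustrative):

```rust
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::thread;
use std::time::Duration;

// Dropping the handle flips the flag; the worker polls it periodically,
// like rodio's `periodic_access` callback in the code above.
struct StreamHandle {
    stop: Arc<AtomicBool>,
}

impl Drop for StreamHandle {
    fn drop(&mut self) {
        self.stop.store(true, Ordering::Relaxed);
    }
}

fn start_stream() -> (StreamHandle, thread::JoinHandle<u32>) {
    let stop = Arc::new(AtomicBool::new(false));
    let stop_clone = stop.clone();
    let worker = thread::spawn(move || {
        let mut frames: u32 = 0;
        while !stop_clone.load(Ordering::Relaxed) {
            frames += 1; // "play" one chunk
            thread::sleep(Duration::from_millis(1));
        }
        frames
    });
    (StreamHandle { stop }, worker)
}

fn main() {
    let (handle, worker) = start_stream();
    thread::sleep(Duration::from_millis(20));
    drop(handle); // like dropping `AudioStream::Output`
    assert!(worker.join().unwrap() > 0);
}
```

`Relaxed` ordering suffices here because the flag carries no data dependency; only eventual visibility is required.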
impl AudioStack {
pub(crate) fn new(executor: BackgroundExecutor) -> Self {
let apm = Arc::new(Mutex::new(apm::AudioProcessingModule::new(
@@ -61,7 +87,7 @@ impl AudioStack {
) -> AudioStream {
let output_task = self.start_output();
let next_ssrc = self.next_ssrc.fetch_add(1, atomic::Ordering::Relaxed);
let next_ssrc = self.next_ssrc.fetch_add(1, Ordering::Relaxed);
let source = AudioMixerSource {
ssrc: next_ssrc,
sample_rate: SAMPLE_RATE,
@@ -97,6 +123,23 @@ impl AudioStack {
}
}
fn start_output(&self) -> Arc<Task<()>> {
if let Some(task) = self._output_task.borrow().upgrade() {
return task;
}
let task = Arc::new(self.executor.spawn({
let apm = self.apm.clone();
let mixer = self.mixer.clone();
async move {
Self::play_output(apm, mixer, SAMPLE_RATE, NUM_CHANNELS)
.await
.log_err();
}
}));
*self._output_task.borrow_mut() = Arc::downgrade(&task);
task
}
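`start_output` memoizes the shared playback task through a `RefCell<Weak<Task<()>>>`: while any stream holds the `Arc`, later callers reuse the running task; once every stream drops it, the `Weak` dangles and the next caller starts a fresh one. A reduced sketch of that lifetime trick with `Rc`/`Weak` and a counter standing in for the task (types are illustrative):

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// Keep only a Weak to the running "task" so it is torn down when the last
// strong handle drops, yet shared while any handle is still alive.
struct Output {
    slot: RefCell<Weak<String>>,
    starts: RefCell<usize>,
}

impl Output {
    fn get_or_start(&self) -> Rc<String> {
        if let Some(task) = self.slot.borrow().upgrade() {
            return task; // already running: share it
        }
        *self.starts.borrow_mut() += 1;
        let task = Rc::new(format!("task #{}", *self.starts.borrow()));
        *self.slot.borrow_mut() = Rc::downgrade(&task);
        task
    }
}

fn main() {
    let out = Output { slot: RefCell::new(Weak::new()), starts: RefCell::new(0) };
    let a = out.get_or_start();
    let b = out.get_or_start();
    assert!(Rc::ptr_eq(&a, &b)); // reused while a handle is alive
    drop(a);
    drop(b);
    let _c = out.get_or_start(); // restarted after all handles dropped
    assert_eq!(*out.starts.borrow(), 2);
}
```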
pub(crate) fn capture_local_microphone_track(
&self,
) -> Result<(crate::LocalAudioTrack, AudioStream)> {
@@ -139,23 +182,6 @@ impl AudioStack {
))
}
fn start_output(&self) -> Arc<Task<()>> {
if let Some(task) = self._output_task.borrow().upgrade() {
return task;
}
let task = Arc::new(self.executor.spawn({
let apm = self.apm.clone();
let mixer = self.mixer.clone();
async move {
Self::play_output(apm, mixer, SAMPLE_RATE, NUM_CHANNELS)
.await
.log_err();
}
}));
*self._output_task.borrow_mut() = Arc::downgrade(&task);
task
}
async fn play_output(
apm: Arc<Mutex<apm::AudioProcessingModule>>,
mixer: Arc<Mutex<audio_mixer::AudioMixer>>,


@@ -0,0 +1,67 @@
use futures::StreamExt;
use libwebrtc::{audio_stream::native::NativeAudioStream, prelude::AudioFrame};
use livekit::track::RemoteAudioTrack;
use rodio::{Source, buffer::SamplesBuffer, conversions::SampleTypeConverter};
use crate::livekit_client::playback::{NUM_CHANNELS, SAMPLE_RATE};
fn frame_to_samplesbuffer(frame: AudioFrame) -> SamplesBuffer {
let samples = frame.data.iter().copied();
let samples = SampleTypeConverter::<_, _>::new(samples);
let samples: Vec<f32> = samples.collect();
SamplesBuffer::new(frame.num_channels as u16, frame.sample_rate, samples)
}
pub struct LiveKitStream {
// shared_buffer: SharedBuffer,
inner: rodio::queue::SourcesQueueOutput,
_receiver_task: gpui::Task<()>,
}
impl LiveKitStream {
pub fn new(executor: &gpui::BackgroundExecutor, track: &RemoteAudioTrack) -> Self {
let mut stream =
NativeAudioStream::new(track.rtc_track(), SAMPLE_RATE as i32, NUM_CHANNELS as i32);
let (queue_input, queue_output) = rodio::queue::queue(true);
// spawn rtc stream
let receiver_task = executor.spawn({
async move {
while let Some(frame) = stream.next().await {
let samples = frame_to_samplesbuffer(frame);
queue_input.append(samples);
}
}
});
LiveKitStream {
_receiver_task: receiver_task,
inner: queue_output,
}
}
}
impl Iterator for LiveKitStream {
type Item = rodio::Sample;
fn next(&mut self) -> Option<Self::Item> {
self.inner.next()
}
}
impl Source for LiveKitStream {
fn current_span_len(&self) -> Option<usize> {
self.inner.current_span_len()
}
fn channels(&self) -> rodio::ChannelCount {
self.inner.channels()
}
fn sample_rate(&self) -> rodio::SampleRate {
self.inner.sample_rate()
}
fn total_duration(&self) -> Option<std::time::Duration> {
self.inner.total_duration()
}
}


@@ -45,7 +45,7 @@ use util::{ConnectionResult, ResultExt, TryFutureExt, redact};
const JSON_RPC_VERSION: &str = "2.0";
const CONTENT_LEN_HEADER: &str = "Content-Length: ";
const LSP_REQUEST_TIMEOUT: Duration = Duration::from_secs(60 * 2);
pub const LSP_REQUEST_TIMEOUT: Duration = Duration::from_secs(60 * 2);
const SERVER_SHUTDOWN_TIMEOUT: Duration = Duration::from_secs(5);
type NotificationHandler = Box<dyn Send + FnMut(Option<RequestId>, Value, &mut AsyncApp)>;


@@ -88,9 +88,18 @@ pub enum BufferStoreEvent {
},
}
#[derive(Default, Debug)]
#[derive(Default, Debug, Clone)]
pub struct ProjectTransaction(pub HashMap<Entity<Buffer>, language::Transaction>);
impl PartialEq for ProjectTransaction {
fn eq(&self, other: &Self) -> bool {
self.0.len() == other.0.len()
&& self.0.iter().all(|(buffer, transaction)| {
other.0.get(buffer).is_some_and(|t| t.id == transaction.id)
})
}
}
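The hand-written `PartialEq` above deliberately compares transactions by id only: two `ProjectTransaction`s are equal when they cover the same buffers and each pair of transactions shares an id, regardless of the edit payloads. A reduced sketch of the same id-keyed equality with illustrative stand-in types:

```rust
use std::collections::HashMap;

#[derive(Clone)]
struct Transaction {
    id: u64,
    edits: Vec<String>, // ignored by equality, like the full transaction body
}

#[derive(Clone, Default)]
struct ProjectTransaction(HashMap<&'static str, Transaction>);

impl PartialEq for ProjectTransaction {
    // Same length + matching ids per key; edit payloads are not compared.
    fn eq(&self, other: &Self) -> bool {
        self.0.len() == other.0.len()
            && self
                .0
                .iter()
                .all(|(k, t)| other.0.get(k).is_some_and(|o| o.id == t.id))
    }
}

fn main() {
    let mut a = ProjectTransaction::default();
    a.0.insert("one.rs", Transaction { id: 1, edits: vec!["x".into()] });
    let mut b = ProjectTransaction::default();
    b.0.insert("one.rs", Transaction { id: 1, edits: vec![] });
    assert!(a == b); // same id: equal despite differing edits

    b.0.get_mut("one.rs").unwrap().id = 2;
    assert!(a != b);
}
```

Checking `len` first makes the comparison symmetric; iterating one map alone would accept `other` being a superset.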
impl EventEmitter<BufferStoreEvent> for BufferStore {}
impl RemoteBufferStore {


@@ -3444,8 +3444,7 @@ impl LspCommand for GetCodeLens {
capabilities
.server_capabilities
.code_lens_provider
.as_ref()
.is_some_and(|code_lens_options| code_lens_options.resolve_provider.unwrap_or(false))
.is_some()
}
fn to_lsp(

File diff suppressed because it is too large


@@ -181,6 +181,7 @@ impl LanguageServerTree {
&root_path.path,
language_name.clone(),
);
(
Arc::new(InnerTreeNode::new(
adapter.name(),
@@ -408,6 +409,7 @@ impl ServerTreeRebase {
if live_node.id.get().is_some() {
return Some(node);
}
let disposition = &live_node.disposition;
let Some((existing_node, _)) = self
.old_contents


@@ -327,6 +327,7 @@ pub enum Event {
RevealInProjectPanel(ProjectEntryId),
SnippetEdit(BufferId, Vec<(lsp::Range, Snippet)>),
ExpandedAllForEntry(WorktreeId, ProjectEntryId),
EntryRenamed(ProjectTransaction),
AgentLocationChanged,
}
@@ -2119,7 +2120,7 @@ impl Project {
let is_root_entry = self.entry_is_worktree_root(entry_id, cx);
let lsp_store = self.lsp_store().downgrade();
cx.spawn(async move |_, cx| {
cx.spawn(async move |project, cx| {
let (old_abs_path, new_abs_path) = {
let root_path = worktree.read_with(cx, |this, _| this.abs_path())?;
let new_abs_path = if is_root_entry {
@@ -2129,7 +2130,7 @@ impl Project {
};
(root_path.join(&old_path), new_abs_path)
};
-LspStore::will_rename_entry(
+let transaction = LspStore::will_rename_entry(
lsp_store.clone(),
worktree_id,
&old_abs_path,
@@ -2145,6 +2146,12 @@ impl Project {
})?
.await?;
project
.update(cx, |_, cx| {
cx.emit(Event::EntryRenamed(transaction));
})
.ok();
lsp_store
.read_with(cx, |this, _| {
this.did_rename_entry(worktree_id, &old_abs_path, &new_abs_path, is_dir);
@@ -3415,7 +3422,7 @@ impl Project {
buffer: &Entity<Buffer>,
position: T,
cx: &mut Context<Self>,
-) -> Task<Result<Vec<LocationLink>>> {
+) -> Task<Result<Option<Vec<LocationLink>>>> {
let position = position.to_point_utf16(buffer.read(cx));
let guard = self.retain_remotely_created_models(cx);
let task = self.lsp_store.update(cx, |lsp_store, cx| {
@@ -3433,7 +3440,7 @@ impl Project {
buffer: &Entity<Buffer>,
position: T,
cx: &mut Context<Self>,
-) -> Task<Result<Vec<LocationLink>>> {
+) -> Task<Result<Option<Vec<LocationLink>>>> {
let position = position.to_point_utf16(buffer.read(cx));
let guard = self.retain_remotely_created_models(cx);
let task = self.lsp_store.update(cx, |lsp_store, cx| {
@@ -3451,7 +3458,7 @@ impl Project {
buffer: &Entity<Buffer>,
position: T,
cx: &mut Context<Self>,
-) -> Task<Result<Vec<LocationLink>>> {
+) -> Task<Result<Option<Vec<LocationLink>>>> {
let position = position.to_point_utf16(buffer.read(cx));
let guard = self.retain_remotely_created_models(cx);
let task = self.lsp_store.update(cx, |lsp_store, cx| {
@@ -3469,7 +3476,7 @@ impl Project {
buffer: &Entity<Buffer>,
position: T,
cx: &mut Context<Self>,
-) -> Task<Result<Vec<LocationLink>>> {
+) -> Task<Result<Option<Vec<LocationLink>>>> {
let position = position.to_point_utf16(buffer.read(cx));
let guard = self.retain_remotely_created_models(cx);
let task = self.lsp_store.update(cx, |lsp_store, cx| {
@@ -3487,7 +3494,7 @@ impl Project {
buffer: &Entity<Buffer>,
position: T,
cx: &mut Context<Self>,
-) -> Task<Result<Vec<Location>>> {
+) -> Task<Result<Option<Vec<Location>>>> {
let position = position.to_point_utf16(buffer.read(cx));
let guard = self.retain_remotely_created_models(cx);
let task = self.lsp_store.update(cx, |lsp_store, cx| {
@@ -3585,23 +3592,12 @@ impl Project {
})
}
-pub fn signature_help<T: ToPointUtf16>(
-&self,
-buffer: &Entity<Buffer>,
-position: T,
-cx: &mut Context<Self>,
-) -> Task<Vec<SignatureHelp>> {
-self.lsp_store.update(cx, |lsp_store, cx| {
-lsp_store.signature_help(buffer, position, cx)
-})
-}
pub fn hover<T: ToPointUtf16>(
&self,
buffer: &Entity<Buffer>,
position: T,
cx: &mut Context<Self>,
-) -> Task<Vec<Hover>> {
+) -> Task<Option<Vec<Hover>>> {
let position = position.to_point_utf16(buffer.read(cx));
self.lsp_store
.update(cx, |lsp_store, cx| lsp_store.hover(buffer, position, cx))
@@ -3637,7 +3633,7 @@ impl Project {
range: Range<T>,
kinds: Option<Vec<CodeActionKind>>,
cx: &mut Context<Self>,
-) -> Task<Result<Vec<CodeAction>>> {
+) -> Task<Result<Option<Vec<CodeAction>>>> {
let buffer = buffer_handle.read(cx);
let range = buffer.anchor_before(range.start)..buffer.anchor_before(range.end);
self.lsp_store.update(cx, |lsp_store, cx| {
@@ -3650,7 +3646,7 @@ impl Project {
buffer: &Entity<Buffer>,
range: Range<T>,
cx: &mut Context<Self>,
-) -> Task<Result<Vec<CodeAction>>> {
+) -> Task<Result<Option<Vec<CodeAction>>>> {
let snapshot = buffer.read(cx).snapshot();
let range = range.to_point(&snapshot);
let range_start = snapshot.anchor_before(range.start);
@@ -3668,16 +3664,18 @@ impl Project {
let mut code_lens_actions = code_lens_actions
.await
.map_err(|e| anyhow!("code lens fetch failed: {e:#}"))?;
-code_lens_actions.retain(|code_lens_action| {
-range
-.start
-.cmp(&code_lens_action.range.start, &snapshot)
-.is_ge()
-&& range
-.end
-.cmp(&code_lens_action.range.end, &snapshot)
-.is_le()
-});
+if let Some(code_lens_actions) = &mut code_lens_actions {
+code_lens_actions.retain(|code_lens_action| {
+range
+.start
+.cmp(&code_lens_action.range.start, &snapshot)
+.is_ge()
+&& range
+.end
+.cmp(&code_lens_action.range.end, &snapshot)
+.is_le()
+});
+}
Ok(code_lens_actions)
})
}
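The `Option` wrapper threaded through the signatures above lets callers distinguish "no server responded" (`None`) from "servers responded with nothing" (`Some(vec![])`), and the in-range filter only runs in the `Some` case. The same guard reduced to plain offset pairs (a sketch; `retain_contained` and the tuple ranges are hypothetical stand-ins for anchored ranges and snapshots):

```rust
// Keep only items fully contained in `range`; a `None` result (no server
// responded) passes through untouched rather than becoming an empty list.
fn retain_contained(actions: &mut Option<Vec<(u32, u32)>>, range: (u32, u32)) {
    if let Some(actions) = actions {
        actions.retain(|&(start, end)| range.0 <= start && end <= range.1);
    }
}
```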


@@ -4,6 +4,7 @@ use crate::{
Event, git_store::StatusEntry, task_inventory::TaskContexts, task_store::TaskSettingsLocation,
*,
};
use async_trait::async_trait;
use buffer_diff::{
BufferDiffEvent, CALCULATE_DIFF_TASK, DiffHunkSecondaryStatus, DiffHunkStatus,
DiffHunkStatusKind, assert_hunks,
@@ -21,7 +22,8 @@ use http_client::Url;
use itertools::Itertools;
use language::{
Diagnostic, DiagnosticEntry, DiagnosticSet, DiagnosticSourceKind, DiskState, FakeLspAdapter,
-LanguageConfig, LanguageMatcher, LanguageName, LineEnding, OffsetRangeExt, Point, ToPoint,
+LanguageConfig, LanguageMatcher, LanguageName, LineEnding, ManifestName, ManifestProvider,
+ManifestQuery, OffsetRangeExt, Point, ToPoint, ToolchainLister,
language_settings::{AllLanguageSettings, LanguageSettingsContent, language_settings},
tree_sitter_rust, tree_sitter_typescript,
};
@@ -140,8 +142,10 @@ async fn test_editorconfig_support(cx: &mut gpui::TestAppContext) {
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
max_line_length = 120
[*.js]
tab_width = 10
max_line_length = off
"#,
".zed": {
"settings.json": r#"{
@@ -149,7 +153,8 @@ async fn test_editorconfig_support(cx: &mut gpui::TestAppContext) {
"hard_tabs": false,
"ensure_final_newline_on_save": false,
"remove_trailing_whitespace_on_save": false,
-"soft_wrap": "editor_width"
+"preferred_line_length": 64,
+"soft_wrap": "editor_width",
}"#,
},
"a.rs": "fn a() {\n A\n}",
@@ -157,6 +162,7 @@ async fn test_editorconfig_support(cx: &mut gpui::TestAppContext) {
".editorconfig": r#"
[*.rs]
indent_size = 2
max_line_length = off
"#,
"b.rs": "fn b() {\n B\n}",
},
@@ -205,6 +211,7 @@ async fn test_editorconfig_support(cx: &mut gpui::TestAppContext) {
assert_eq!(settings_a.hard_tabs, true);
assert_eq!(settings_a.ensure_final_newline_on_save, true);
assert_eq!(settings_a.remove_trailing_whitespace_on_save, true);
assert_eq!(settings_a.preferred_line_length, 120);
// .editorconfig in b/ overrides .editorconfig in root
assert_eq!(Some(settings_b.tab_size), NonZeroU32::new(2));
@@ -212,6 +219,10 @@ async fn test_editorconfig_support(cx: &mut gpui::TestAppContext) {
// "indent_size" is not set, so "tab_width" is used
assert_eq!(Some(settings_c.tab_size), NonZeroU32::new(10));
// When max_line_length is "off", default to .zed/settings.json
assert_eq!(settings_b.preferred_line_length, 64);
assert_eq!(settings_c.preferred_line_length, 64);
// README.md should not be affected by .editorconfig's glob "*.rs"
assert_eq!(Some(settings_readme.tab_size), NonZeroU32::new(8));
});
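The assertions above exercise the merge rule restored by the `max_line_length` change: a numeric value from `.editorconfig` overrides `preferred_line_length`, while `off` (or no value) falls back to `.zed/settings.json`. A sketch of that rule under those assumptions (`effective_line_length` is a hypothetical helper, not Zed's actual merge code):

```rust
// Numeric max_line_length wins; "off", missing, or unparseable values keep
// the preferred_line_length from settings.json.
fn effective_line_length(max_line_length: Option<&str>, preferred: u32) -> u32 {
    match max_line_length {
        Some(value) => value.parse::<u32>().unwrap_or(preferred),
        None => preferred,
    }
}
```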
@@ -587,6 +598,203 @@ async fn test_fallback_to_single_worktree_tasks(cx: &mut gpui::TestAppContext) {
);
}
#[gpui::test]
async fn test_running_multiple_instances_of_a_single_server_in_one_worktree(
cx: &mut gpui::TestAppContext,
) {
pub(crate) struct PyprojectTomlManifestProvider;
impl ManifestProvider for PyprojectTomlManifestProvider {
fn name(&self) -> ManifestName {
SharedString::new_static("pyproject.toml").into()
}
fn search(
&self,
ManifestQuery {
path,
depth,
delegate,
}: ManifestQuery,
) -> Option<Arc<Path>> {
for path in path.ancestors().take(depth) {
let p = path.join("pyproject.toml");
if delegate.exists(&p, Some(false)) {
return Some(path.into());
}
}
None
}
}
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
path!("/the-root"),
json!({
".zed": {
"settings.json": r#"
{
"languages": {
"Python": {
"language_servers": ["ty"]
}
}
}"#
},
"project-a": {
".venv": {},
"file.py": "",
"pyproject.toml": ""
},
"project-b": {
".venv": {},
"source_file.py":"",
"another_file.py": "",
"pyproject.toml": ""
}
}),
)
.await;
cx.update(|cx| {
ManifestProvidersStore::global(cx).register(Arc::new(PyprojectTomlManifestProvider))
});
let project = Project::test(fs.clone(), [path!("/the-root").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _| project.languages().clone());
let _fake_python_server = language_registry.register_fake_lsp(
"Python",
FakeLspAdapter {
name: "ty",
capabilities: lsp::ServerCapabilities {
..Default::default()
},
..Default::default()
},
);
language_registry.add(python_lang(fs.clone()));
let (first_buffer, _handle) = project
.update(cx, |project, cx| {
project.open_local_buffer_with_lsp(path!("/the-root/project-a/file.py"), cx)
})
.await
.unwrap();
cx.executor().run_until_parked();
let servers = project.update(cx, |project, cx| {
project.lsp_store.update(cx, |this, cx| {
first_buffer.update(cx, |buffer, cx| {
this.language_servers_for_local_buffer(buffer, cx)
.map(|(adapter, server)| (adapter.clone(), server.clone()))
.collect::<Vec<_>>()
})
})
});
cx.executor().run_until_parked();
assert_eq!(servers.len(), 1);
let (adapter, server) = servers.into_iter().next().unwrap();
assert_eq!(adapter.name(), LanguageServerName::new_static("ty"));
assert_eq!(server.server_id(), LanguageServerId(0));
// `workspace_folders` are set to the rooting point.
assert_eq!(
server.workspace_folders(),
BTreeSet::from_iter(
[Url::from_file_path(path!("/the-root/project-a")).unwrap()].into_iter()
)
);
let (second_project_buffer, _other_handle) = project
.update(cx, |project, cx| {
project.open_local_buffer_with_lsp(path!("/the-root/project-b/source_file.py"), cx)
})
.await
.unwrap();
cx.executor().run_until_parked();
let servers = project.update(cx, |project, cx| {
project.lsp_store.update(cx, |this, cx| {
second_project_buffer.update(cx, |buffer, cx| {
this.language_servers_for_local_buffer(buffer, cx)
.map(|(adapter, server)| (adapter.clone(), server.clone()))
.collect::<Vec<_>>()
})
})
});
cx.executor().run_until_parked();
assert_eq!(servers.len(), 1);
let (adapter, server) = servers.into_iter().next().unwrap();
assert_eq!(adapter.name(), LanguageServerName::new_static("ty"));
// We're not using venvs at all here, so both folders should fall under the same root.
assert_eq!(server.server_id(), LanguageServerId(0));
// Now, let's select a different toolchain for one of the subprojects.
let (available_toolchains_for_b, root_path) = project
.update(cx, |this, cx| {
let worktree_id = this.worktrees(cx).next().unwrap().read(cx).id();
this.available_toolchains(
ProjectPath {
worktree_id,
path: Arc::from("project-b/source_file.py".as_ref()),
},
LanguageName::new("Python"),
cx,
)
})
.await
.expect("A toolchain to be discovered");
assert_eq!(root_path.as_ref(), Path::new("project-b"));
assert_eq!(available_toolchains_for_b.toolchains().len(), 1);
let currently_active_toolchain = project
.update(cx, |this, cx| {
let worktree_id = this.worktrees(cx).next().unwrap().read(cx).id();
this.active_toolchain(
ProjectPath {
worktree_id,
path: Arc::from("project-b/source_file.py".as_ref()),
},
LanguageName::new("Python"),
cx,
)
})
.await;
assert!(currently_active_toolchain.is_none());
let _ = project
.update(cx, |this, cx| {
let worktree_id = this.worktrees(cx).next().unwrap().read(cx).id();
this.activate_toolchain(
ProjectPath {
worktree_id,
path: root_path,
},
available_toolchains_for_b
.toolchains
.into_iter()
.next()
.unwrap(),
cx,
)
})
.await
.unwrap();
cx.run_until_parked();
let servers = project.update(cx, |project, cx| {
project.lsp_store.update(cx, |this, cx| {
second_project_buffer.update(cx, |buffer, cx| {
this.language_servers_for_local_buffer(buffer, cx)
.map(|(adapter, server)| (adapter.clone(), server.clone()))
.collect::<Vec<_>>()
})
})
});
cx.executor().run_until_parked();
assert_eq!(servers.len(), 1);
let (adapter, server) = servers.into_iter().next().unwrap();
assert_eq!(adapter.name(), LanguageServerName::new_static("ty"));
// There's a new language server in town.
assert_eq!(server.server_id(), LanguageServerId(1));
}
#[gpui::test]
async fn test_managing_language_servers(cx: &mut gpui::TestAppContext) {
init_test(cx);
@@ -3005,6 +3213,7 @@ async fn test_definition(cx: &mut gpui::TestAppContext) {
let mut definitions = project
.update(cx, |project, cx| project.definitions(&buffer, 22, cx))
.await
.unwrap()
.unwrap();
// Assert no new language server started
@@ -3519,7 +3728,7 @@ async fn test_apply_code_actions_with_commands(cx: &mut gpui::TestAppContext) {
.next()
.await;
-let action = actions.await.unwrap()[0].clone();
+let action = actions.await.unwrap().unwrap()[0].clone();
let apply = project.update(cx, |project, cx| {
project.apply_code_action(buffer.clone(), action, true, cx)
});
@@ -6110,6 +6319,7 @@ async fn test_multiple_language_server_hovers(cx: &mut gpui::TestAppContext) {
hover_task
.await
.into_iter()
.flatten()
.map(|hover| hover.contents.iter().map(|block| &block.text).join("|"))
.sorted()
.collect::<Vec<_>>(),
@@ -6183,6 +6393,7 @@ async fn test_hovers_with_empty_parts(cx: &mut gpui::TestAppContext) {
hover_task
.await
.into_iter()
.flatten()
.map(|hover| hover.contents.iter().map(|block| &block.text).join("|"))
.sorted()
.collect::<Vec<_>>(),
@@ -6261,7 +6472,7 @@ async fn test_code_actions_only_kinds(cx: &mut gpui::TestAppContext) {
.await
.expect("The code action request should have been triggered");
-let code_actions = code_actions_task.await.unwrap();
+let code_actions = code_actions_task.await.unwrap().unwrap();
assert_eq!(code_actions.len(), 1);
assert_eq!(
code_actions[0].lsp_action.action_kind(),
@@ -6420,6 +6631,7 @@ async fn test_multiple_language_server_actions(cx: &mut gpui::TestAppContext) {
code_actions_task
.await
.unwrap()
.unwrap()
.into_iter()
.map(|code_action| code_action.lsp_action.title().to_owned())
.sorted()
@@ -8969,6 +9181,65 @@ fn rust_lang() -> Arc<Language> {
))
}
fn python_lang(fs: Arc<FakeFs>) -> Arc<Language> {
struct PythonMootToolchainLister(Arc<FakeFs>);
#[async_trait]
impl ToolchainLister for PythonMootToolchainLister {
async fn list(
&self,
worktree_root: PathBuf,
subroot_relative_path: Option<Arc<Path>>,
_: Option<HashMap<String, String>>,
) -> ToolchainList {
// This lister will always return the `.venv` directories found within the path's ancestors
let ancestors = subroot_relative_path
.into_iter()
.flat_map(|path| path.ancestors().map(ToOwned::to_owned).collect::<Vec<_>>());
let mut toolchains = vec![];
for ancestor in ancestors {
let venv_path = worktree_root.join(ancestor).join(".venv");
if self.0.is_dir(&venv_path).await {
toolchains.push(Toolchain {
name: SharedString::new("Python Venv"),
path: venv_path.to_string_lossy().into_owned().into(),
language_name: LanguageName(SharedString::new_static("Python")),
as_json: serde_json::Value::Null,
})
}
}
ToolchainList {
toolchains,
..Default::default()
}
}
// Returns a term which we should use in UI to refer to a toolchain.
fn term(&self) -> SharedString {
SharedString::new_static("virtual environment")
}
/// Returns the name of the manifest file for this toolchain.
fn manifest_name(&self) -> ManifestName {
SharedString::new_static("pyproject.toml").into()
}
}
Arc::new(
Language::new(
LanguageConfig {
name: "Python".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["py".to_string()],
..Default::default()
},
..Default::default()
},
None, // We're not testing Python parsing with this language.
)
.with_manifest(Some(ManifestName::from(SharedString::new_static(
"pyproject.toml",
))))
.with_toolchain_lister(Some(Arc::new(PythonMootToolchainLister(fs)))),
)
}
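Both `PyprojectTomlManifestProvider::search` and the toolchain lister above walk a path's ancestors looking for a marker (`pyproject.toml`, `.venv`). The same walk with a plain set of known files standing in for the worktree delegate (a sketch; `find_manifest_root` and the set-based lookup are hypothetical):

```rust
use std::collections::HashSet;
use std::path::{Path, PathBuf};

// Return the nearest ancestor (up to `depth` levels, starting at `path`
// itself) that directly contains a `pyproject.toml`.
fn find_manifest_root(path: &Path, depth: usize, files: &HashSet<PathBuf>) -> Option<PathBuf> {
    for ancestor in path.ancestors().take(depth) {
        if files.contains(&ancestor.join("pyproject.toml")) {
            return Some(ancestor.to_path_buf());
        }
    }
    None
}
```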
fn typescript_lang() -> Arc<Language> {
Arc::new(Language::new(
LanguageConfig {


@@ -753,28 +753,47 @@ message TextEdit {
PointUtf16 lsp_range_end = 3;
}
-message MultiLspQuery {
+message LspQuery {
uint64 project_id = 1;
-uint64 buffer_id = 2;
-repeated VectorClockEntry version = 3;
-oneof strategy {
-AllLanguageServers all = 4;
-}
+uint64 lsp_request_id = 2;
oneof request {
+GetReferences get_references = 3;
+GetDocumentColor get_document_color = 4;
GetHover get_hover = 5;
GetCodeActions get_code_actions = 6;
GetSignatureHelp get_signature_help = 7;
GetCodeLens get_code_lens = 8;
GetDocumentDiagnostics get_document_diagnostics = 9;
-GetDocumentColor get_document_color = 10;
-GetDefinition get_definition = 11;
-GetDeclaration get_declaration = 12;
-GetTypeDefinition get_type_definition = 13;
-GetImplementation get_implementation = 14;
-GetReferences get_references = 15;
+GetDefinition get_definition = 10;
+GetDeclaration get_declaration = 11;
+GetTypeDefinition get_type_definition = 12;
+GetImplementation get_implementation = 13;
}
}
+message LspQueryResponse {
+uint64 project_id = 1;
+uint64 lsp_request_id = 2;
+repeated LspResponse responses = 3;
+}
+message LspResponse {
+oneof response {
+GetHoverResponse get_hover_response = 1;
+GetCodeActionsResponse get_code_actions_response = 2;
+GetSignatureHelpResponse get_signature_help_response = 3;
+GetCodeLensResponse get_code_lens_response = 4;
+GetDocumentDiagnosticsResponse get_document_diagnostics_response = 5;
+GetDocumentColorResponse get_document_color_response = 6;
+GetDefinitionResponse get_definition_response = 8;
+GetDeclarationResponse get_declaration_response = 9;
+GetTypeDefinitionResponse get_type_definition_response = 10;
+GetImplementationResponse get_implementation_response = 11;
+GetReferencesResponse get_references_response = 12;
+}
+uint64 server_id = 7;
+}
message AllLanguageServers {}
message LanguageServerSelector {
@@ -798,27 +817,6 @@ message StopLanguageServers {
bool all = 4;
}
-message MultiLspQueryResponse {
-repeated LspResponse responses = 1;
-}
-message LspResponse {
-oneof response {
-GetHoverResponse get_hover_response = 1;
-GetCodeActionsResponse get_code_actions_response = 2;
-GetSignatureHelpResponse get_signature_help_response = 3;
-GetCodeLensResponse get_code_lens_response = 4;
-GetDocumentDiagnosticsResponse get_document_diagnostics_response = 5;
-GetDocumentColorResponse get_document_color_response = 6;
-GetDefinitionResponse get_definition_response = 8;
-GetDeclarationResponse get_declaration_response = 9;
-GetTypeDefinitionResponse get_type_definition_response = 10;
-GetImplementationResponse get_implementation_response = 11;
-GetReferencesResponse get_references_response = 12;
-}
-uint64 server_id = 7;
-}
message LspExtRunnables {
uint64 project_id = 1;
uint64 buffer_id = 2;
@@ -909,3 +907,30 @@ message PullWorkspaceDiagnostics {
uint64 project_id = 1;
uint64 server_id = 2;
}
// todo(lsp) remove after Zed Stable hits v0.204.x
message MultiLspQuery {
uint64 project_id = 1;
uint64 buffer_id = 2;
repeated VectorClockEntry version = 3;
oneof strategy {
AllLanguageServers all = 4;
}
oneof request {
GetHover get_hover = 5;
GetCodeActions get_code_actions = 6;
GetSignatureHelp get_signature_help = 7;
GetCodeLens get_code_lens = 8;
GetDocumentDiagnostics get_document_diagnostics = 9;
GetDocumentColor get_document_color = 10;
GetDefinition get_definition = 11;
GetDeclaration get_declaration = 12;
GetTypeDefinition get_type_definition = 13;
GetImplementation get_implementation = 14;
GetReferences get_references = 15;
}
}
message MultiLspQueryResponse {
repeated LspResponse responses = 1;
}


@@ -393,7 +393,10 @@ message Envelope {
GetCrashFilesResponse get_crash_files_response = 362;
GitClone git_clone = 363;
-GitCloneResponse git_clone_response = 364; // current max
+GitCloneResponse git_clone_response = 364;
+LspQuery lsp_query = 365;
+LspQueryResponse lsp_query_response = 366; // current max
}
reserved 87 to 88;


@@ -69,3 +69,32 @@ macro_rules! entity_messages {
})*
};
}
#[macro_export]
macro_rules! lsp_messages {
($(($request_name:ident, $response_name:ident, $stop_previous_requests:expr)),* $(,)?) => {
$(impl LspRequestMessage for $request_name {
type Response = $response_name;
fn to_proto_query(self) -> $crate::lsp_query::Request {
$crate::lsp_query::Request::$request_name(self)
}
fn response_to_proto_query(response: Self::Response) -> $crate::lsp_response::Response {
$crate::lsp_response::Response::$response_name(response)
}
fn buffer_id(&self) -> u64 {
self.buffer_id
}
fn buffer_version(&self) -> &[$crate::VectorClockEntry] {
&self.version
}
fn stop_previous_requests() -> bool {
$stop_previous_requests
}
})*
};
}
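The `lsp_messages!` macro above stamps out one `LspRequestMessage` impl per `(request, response, stop_previous_requests)` tuple. A miniature analog of the same macro shape, using a toy trait and demo type names (all hypothetical; the real macro additionally wires buffer ids and proto conversions):

```rust
trait LspRequestLike {
    type Response;
    fn stop_previous_requests() -> bool;
}

// One impl per (request, response, flag) tuple, mirroring lsp_messages!.
macro_rules! lsp_messages_demo {
    ($(($request:ident, $response:ident, $stop:expr)),* $(,)?) => {
        $(
            #[allow(dead_code)]
            struct $request;
            #[allow(dead_code)]
            struct $response;
            impl LspRequestLike for $request {
                type Response = $response;
                fn stop_previous_requests() -> bool { $stop }
            }
        )*
    };
}

lsp_messages_demo!(
    (GetHoverDemo, GetHoverResponseDemo, true),
    (GetDefinitionDemo, GetDefinitionResponseDemo, false),
);
```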


@@ -169,6 +169,9 @@ messages!(
(MarkNotificationRead, Foreground),
(MoveChannel, Foreground),
(ReorderChannel, Foreground),
(LspQuery, Background),
(LspQueryResponse, Background),
// todo(lsp) remove after Zed Stable hits v0.204.x
(MultiLspQuery, Background),
(MultiLspQueryResponse, Background),
(OnTypeFormatting, Background),
@@ -426,7 +429,10 @@ request_messages!(
(SetRoomParticipantRole, Ack),
(BlameBuffer, BlameBufferResponse),
(RejoinRemoteProjects, RejoinRemoteProjectsResponse),
// todo(lsp) remove after Zed Stable hits v0.204.x
(MultiLspQuery, MultiLspQueryResponse),
(LspQuery, Ack),
(LspQueryResponse, Ack),
(RestartLanguageServers, Ack),
(StopLanguageServers, Ack),
(OpenContext, OpenContextResponse),
@@ -478,6 +484,20 @@ request_messages!(
(GitClone, GitCloneResponse)
);
lsp_messages!(
(GetReferences, GetReferencesResponse, true),
(GetDocumentColor, GetDocumentColorResponse, true),
(GetHover, GetHoverResponse, true),
(GetCodeActions, GetCodeActionsResponse, true),
(GetSignatureHelp, GetSignatureHelpResponse, true),
(GetCodeLens, GetCodeLensResponse, true),
(GetDocumentDiagnostics, GetDocumentDiagnosticsResponse, true),
(GetDefinition, GetDefinitionResponse, true),
(GetDeclaration, GetDeclarationResponse, true),
(GetTypeDefinition, GetTypeDefinitionResponse, true),
(GetImplementation, GetImplementationResponse, true),
);
entity_messages!(
{project_id, ShareProject},
AddProjectCollaborator,
@@ -520,6 +540,9 @@ entity_messages!(
LeaveProject,
LinkedEditingRange,
LoadCommitDiff,
LspQuery,
LspQueryResponse,
// todo(lsp) remove after Zed Stable hits v0.204.x
MultiLspQuery,
RestartLanguageServers,
StopLanguageServers,
@@ -777,6 +800,28 @@ pub fn split_repository_update(
}])
}
impl LspQuery {
pub fn query_name_and_write_permissions(&self) -> (&str, bool) {
match self.request {
Some(lsp_query::Request::GetHover(_)) => ("GetHover", false),
Some(lsp_query::Request::GetCodeActions(_)) => ("GetCodeActions", true),
Some(lsp_query::Request::GetSignatureHelp(_)) => ("GetSignatureHelp", false),
Some(lsp_query::Request::GetCodeLens(_)) => ("GetCodeLens", true),
Some(lsp_query::Request::GetDocumentDiagnostics(_)) => {
("GetDocumentDiagnostics", false)
}
Some(lsp_query::Request::GetDefinition(_)) => ("GetDefinition", false),
Some(lsp_query::Request::GetDeclaration(_)) => ("GetDeclaration", false),
Some(lsp_query::Request::GetTypeDefinition(_)) => ("GetTypeDefinition", false),
Some(lsp_query::Request::GetImplementation(_)) => ("GetImplementation", false),
Some(lsp_query::Request::GetReferences(_)) => ("GetReferences", false),
Some(lsp_query::Request::GetDocumentColor(_)) => ("GetDocumentColor", false),
None => ("<unknown>", true),
}
}
}
// todo(lsp) remove after Zed Stable hits v0.204.x
impl MultiLspQuery {
pub fn request_str(&self) -> &str {
match self.request {


@@ -31,6 +31,58 @@ pub trait RequestMessage: EnvelopedMessage {
type Response: EnvelopedMessage;
}
/// A trait that binds LSP requests and responses for the proto layer.
/// Should be implemented for every LSP request that traverses the proto layer.
///
/// The `lsp_messages` macro in the same crate provides a convenient way to implement this.
pub trait LspRequestMessage: EnvelopedMessage {
type Response: EnvelopedMessage;
fn to_proto_query(self) -> crate::lsp_query::Request;
fn response_to_proto_query(response: Self::Response) -> crate::lsp_response::Response;
fn buffer_id(&self) -> u64;
fn buffer_version(&self) -> &[crate::VectorClockEntry];
/// Whether to deduplicate the requests, or keep the previous ones running when another
/// request of the same kind is processed.
fn stop_previous_requests() -> bool;
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct LspRequestId(pub u64);
/// A response from a single language server.
/// There could be multiple responses for a single LSP request,
/// from different servers.
pub struct ProtoLspResponse<R> {
pub server_id: u64,
pub response: R,
}
impl ProtoLspResponse<Box<dyn AnyTypedEnvelope>> {
pub fn into_response<T: LspRequestMessage>(self) -> Result<ProtoLspResponse<T::Response>> {
let envelope = self
.response
.into_any()
.downcast::<TypedEnvelope<T::Response>>()
.map_err(|_| {
anyhow::anyhow!(
"cannot downcast LspResponse to {} for message {}",
T::Response::NAME,
T::NAME,
)
})?;
Ok(ProtoLspResponse {
server_id: self.server_id,
response: envelope.payload,
})
}
}
pub trait AnyTypedEnvelope: Any + Send + Sync {
fn payload_type_id(&self) -> TypeId;
fn payload_type_name(&self) -> &'static str;


@@ -1,35 +1,48 @@
-use anyhow::Context;
+use anyhow::{Context, Result};
use collections::HashMap;
use futures::{
Future, FutureExt as _,
channel::oneshot,
future::{BoxFuture, LocalBoxFuture},
};
-use gpui::{AnyEntity, AnyWeakEntity, AsyncApp, Entity};
+use gpui::{AnyEntity, AnyWeakEntity, AsyncApp, BackgroundExecutor, Entity, FutureExt as _};
use parking_lot::Mutex;
use proto::{
-AnyTypedEnvelope, EntityMessage, Envelope, EnvelopedMessage, RequestMessage, TypedEnvelope,
-error::ErrorExt as _,
+AnyTypedEnvelope, EntityMessage, Envelope, EnvelopedMessage, LspRequestId, LspRequestMessage,
+RequestMessage, TypedEnvelope, error::ErrorExt as _,
};
use std::{
any::{Any, TypeId},
sync::{Arc, Weak},
sync::{
Arc, OnceLock,
atomic::{self, AtomicU64},
},
time::Duration,
};
#[derive(Clone)]
pub struct AnyProtoClient(Arc<dyn ProtoClient>);
pub struct AnyProtoClient(Arc<State>);
impl AnyProtoClient {
pub fn downgrade(&self) -> AnyWeakProtoClient {
AnyWeakProtoClient(Arc::downgrade(&self.0))
}
}
type RequestIds = Arc<
Mutex<
HashMap<
LspRequestId,
oneshot::Sender<
Result<
Option<TypedEnvelope<Vec<proto::ProtoLspResponse<Box<dyn AnyTypedEnvelope>>>>>,
>,
>,
>,
>,
>;
#[derive(Clone)]
pub struct AnyWeakProtoClient(Weak<dyn ProtoClient>);
static NEXT_LSP_REQUEST_ID: OnceLock<Arc<AtomicU64>> = OnceLock::new();
static REQUEST_IDS: OnceLock<RequestIds> = OnceLock::new();
impl AnyWeakProtoClient {
pub fn upgrade(&self) -> Option<AnyProtoClient> {
self.0.upgrade().map(AnyProtoClient)
}
struct State {
client: Arc<dyn ProtoClient>,
next_lsp_request_id: Arc<AtomicU64>,
request_ids: RequestIds,
}
pub trait ProtoClient: Send + Sync {
@@ -37,11 +50,11 @@ pub trait ProtoClient: Send + Sync {
&self,
envelope: Envelope,
request_type: &'static str,
-) -> BoxFuture<'static, anyhow::Result<Envelope>>;
+) -> BoxFuture<'static, Result<Envelope>>;
-fn send(&self, envelope: Envelope, message_type: &'static str) -> anyhow::Result<()>;
+fn send(&self, envelope: Envelope, message_type: &'static str) -> Result<()>;
-fn send_response(&self, envelope: Envelope, message_type: &'static str) -> anyhow::Result<()>;
+fn send_response(&self, envelope: Envelope, message_type: &'static str) -> Result<()>;
fn message_handler_set(&self) -> &parking_lot::Mutex<ProtoMessageHandlerSet>;
@@ -65,7 +78,7 @@ pub type ProtoMessageHandler = Arc<
Box<dyn AnyTypedEnvelope>,
AnyProtoClient,
AsyncApp,
-) -> LocalBoxFuture<'static, anyhow::Result<()>>,
+) -> LocalBoxFuture<'static, Result<()>>,
>;
impl ProtoMessageHandlerSet {
@@ -113,7 +126,7 @@ impl ProtoMessageHandlerSet {
message: Box<dyn AnyTypedEnvelope>,
client: AnyProtoClient,
cx: AsyncApp,
-) -> Option<LocalBoxFuture<'static, anyhow::Result<()>>> {
+) -> Option<LocalBoxFuture<'static, Result<()>>> {
let payload_type_id = message.payload_type_id();
let mut this = this.lock();
let handler = this.message_handlers.get(&payload_type_id)?.clone();
@@ -169,43 +182,195 @@ where
T: ProtoClient + 'static,
{
fn from(client: Arc<T>) -> Self {
-Self(client)
+Self::new(client)
}
}
impl AnyProtoClient {
pub fn new<T: ProtoClient + 'static>(client: Arc<T>) -> Self {
-Self(client)
+Self(Arc::new(State {
+client,
+next_lsp_request_id: NEXT_LSP_REQUEST_ID
+.get_or_init(|| Arc::new(AtomicU64::new(0)))
+.clone(),
+request_ids: REQUEST_IDS.get_or_init(RequestIds::default).clone(),
+}))
}
pub fn is_via_collab(&self) -> bool {
-self.0.is_via_collab()
+self.0.client.is_via_collab()
}
pub fn request<T: RequestMessage>(
&self,
request: T,
-) -> impl Future<Output = anyhow::Result<T::Response>> + use<T> {
+) -> impl Future<Output = Result<T::Response>> + use<T> {
let envelope = request.into_envelope(0, None, None);
-let response = self.0.request(envelope, T::NAME);
+let response = self.0.client.request(envelope, T::NAME);
async move {
T::Response::from_envelope(response.await?)
.context("received response of the wrong type")
}
}
-pub fn send<T: EnvelopedMessage>(&self, request: T) -> anyhow::Result<()> {
+pub fn send<T: EnvelopedMessage>(&self, request: T) -> Result<()> {
let envelope = request.into_envelope(0, None, None);
-self.0.send(envelope, T::NAME)
+self.0.client.send(envelope, T::NAME)
}
-pub fn send_response<T: EnvelopedMessage>(
-&self,
-request_id: u32,
-request: T,
-) -> anyhow::Result<()> {
+pub fn send_response<T: EnvelopedMessage>(&self, request_id: u32, request: T) -> Result<()> {
let envelope = request.into_envelope(0, Some(request_id), None);
-self.0.send(envelope, T::NAME)
+self.0.client.send(envelope, T::NAME)
}
pub fn request_lsp<T>(
&self,
project_id: u64,
timeout: Duration,
executor: BackgroundExecutor,
request: T,
) -> impl Future<
Output = Result<Option<TypedEnvelope<Vec<proto::ProtoLspResponse<T::Response>>>>>,
> + use<T>
where
T: LspRequestMessage,
{
let new_id = LspRequestId(
self.0
.next_lsp_request_id
.fetch_add(1, atomic::Ordering::Acquire),
);
let (tx, rx) = oneshot::channel();
{
self.0.request_ids.lock().insert(new_id, tx);
}
let query = proto::LspQuery {
project_id,
lsp_request_id: new_id.0,
request: Some(request.to_proto_query()),
};
let request = self.request(query);
let request_ids = self.0.request_ids.clone();
async move {
match request.await {
Ok(_request_enqueued) => {}
Err(e) => {
request_ids.lock().remove(&new_id);
return Err(e).context("sending LSP proto request");
}
}
let response = rx.with_timeout(timeout, &executor).await;
{
request_ids.lock().remove(&new_id);
}
match response {
Ok(Ok(response)) => {
let response = response
.context("waiting for LSP proto response")?
.map(|response| {
anyhow::Ok(TypedEnvelope {
payload: response
.payload
.into_iter()
.map(|lsp_response| lsp_response.into_response::<T>())
.collect::<Result<Vec<_>>>()?,
sender_id: response.sender_id,
original_sender_id: response.original_sender_id,
message_id: response.message_id,
received_at: response.received_at,
})
})
.transpose()
.context("converting LSP proto response")?;
Ok(response)
}
Err(_cancelled_due_timeout) => Ok(None),
Ok(Err(_channel_dropped)) => Ok(None),
}
}
}
pub fn send_lsp_response<T: LspRequestMessage>(
&self,
project_id: u64,
lsp_request_id: LspRequestId,
server_responses: HashMap<u64, T::Response>,
) -> Result<()> {
self.send(proto::LspQueryResponse {
project_id,
lsp_request_id: lsp_request_id.0,
responses: server_responses
.into_iter()
.map(|(server_id, response)| proto::LspResponse {
server_id,
response: Some(T::response_to_proto_query(response)),
})
.collect(),
})
}
pub fn handle_lsp_response(&self, mut envelope: TypedEnvelope<proto::LspQueryResponse>) {
let request_id = LspRequestId(envelope.payload.lsp_request_id);
let mut response_senders = self.0.request_ids.lock();
if let Some(tx) = response_senders.remove(&request_id) {
let responses = envelope.payload.responses.drain(..).collect::<Vec<_>>();
tx.send(Ok(Some(proto::TypedEnvelope {
sender_id: envelope.sender_id,
original_sender_id: envelope.original_sender_id,
message_id: envelope.message_id,
received_at: envelope.received_at,
payload: responses
.into_iter()
.filter_map(|response| {
use proto::lsp_response::Response;
let server_id = response.server_id;
let response = match response.response? {
Response::GetReferencesResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetDocumentColorResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetHoverResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetCodeActionsResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetSignatureHelpResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetCodeLensResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetDocumentDiagnosticsResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetDefinitionResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetDeclarationResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetTypeDefinitionResponse(response) => {
to_any_envelope(&envelope, response)
}
Response::GetImplementationResponse(response) => {
to_any_envelope(&envelope, response)
}
};
Some(proto::ProtoLspResponse {
server_id,
response,
})
})
.collect(),
})))
.ok();
}
}
pub fn add_request_handler<M, E, H, F>(&self, entity: gpui::WeakEntity<E>, handler: H)
@@ -213,31 +378,35 @@ impl AnyProtoClient {
M: RequestMessage,
E: 'static,
H: 'static + Sync + Fn(Entity<E>, TypedEnvelope<M>, AsyncApp) -> F + Send + Sync,
F: 'static + Future<Output = anyhow::Result<M::Response>>,
F: 'static + Future<Output = Result<M::Response>>,
{
self.0.message_handler_set().lock().add_message_handler(
TypeId::of::<M>(),
entity.into(),
Arc::new(move |entity, envelope, client, cx| {
let entity = entity.downcast::<E>().unwrap();
let envelope = envelope.into_any().downcast::<TypedEnvelope<M>>().unwrap();
let request_id = envelope.message_id();
handler(entity, *envelope, cx)
.then(move |result| async move {
match result {
Ok(response) => {
client.send_response(request_id, response)?;
Ok(())
self.0
.client
.message_handler_set()
.lock()
.add_message_handler(
TypeId::of::<M>(),
entity.into(),
Arc::new(move |entity, envelope, client, cx| {
let entity = entity.downcast::<E>().unwrap();
let envelope = envelope.into_any().downcast::<TypedEnvelope<M>>().unwrap();
let request_id = envelope.message_id();
handler(entity, *envelope, cx)
.then(move |result| async move {
match result {
Ok(response) => {
client.send_response(request_id, response)?;
Ok(())
}
Err(error) => {
client.send_response(request_id, error.to_proto())?;
Err(error)
}
}
Err(error) => {
client.send_response(request_id, error.to_proto())?;
Err(error)
}
}
})
.boxed_local()
}),
)
})
.boxed_local()
}),
)
}
pub fn add_entity_request_handler<M, E, H, F>(&self, handler: H)
@@ -245,7 +414,7 @@ impl AnyProtoClient {
M: EnvelopedMessage + RequestMessage + EntityMessage,
E: 'static,
H: 'static + Sync + Send + Fn(gpui::Entity<E>, TypedEnvelope<M>, AsyncApp) -> F,
F: 'static + Future<Output = anyhow::Result<M::Response>>,
F: 'static + Future<Output = Result<M::Response>>,
{
let message_type_id = TypeId::of::<M>();
let entity_type_id = TypeId::of::<E>();
@@ -257,6 +426,7 @@ impl AnyProtoClient {
.remote_entity_id()
};
self.0
.client
.message_handler_set()
.lock()
.add_entity_message_handler(
@@ -290,7 +460,7 @@ impl AnyProtoClient {
M: EnvelopedMessage + EntityMessage,
E: 'static,
H: 'static + Sync + Send + Fn(gpui::Entity<E>, TypedEnvelope<M>, AsyncApp) -> F,
F: 'static + Future<Output = anyhow::Result<()>>,
F: 'static + Future<Output = Result<()>>,
{
let message_type_id = TypeId::of::<M>();
let entity_type_id = TypeId::of::<E>();
@@ -302,6 +472,7 @@ impl AnyProtoClient {
.remote_entity_id()
};
self.0
.client
.message_handler_set()
.lock()
.add_entity_message_handler(
@@ -319,7 +490,7 @@ impl AnyProtoClient {
pub fn subscribe_to_entity<E: 'static>(&self, remote_id: u64, entity: &Entity<E>) {
let id = (TypeId::of::<E>(), remote_id);
let mut message_handlers = self.0.message_handler_set().lock();
let mut message_handlers = self.0.client.message_handler_set().lock();
if message_handlers
.entities_by_type_and_remote_id
.contains_key(&id)
@@ -335,3 +506,16 @@ impl AnyProtoClient {
);
}
}
fn to_any_envelope<T: EnvelopedMessage>(
envelope: &TypedEnvelope<proto::LspQueryResponse>,
response: T,
) -> Box<dyn AnyTypedEnvelope> {
Box::new(proto::TypedEnvelope {
sender_id: envelope.sender_id,
original_sender_id: envelope.original_sender_id,
message_id: envelope.message_id,
received_at: envelope.received_at,
payload: response,
}) as Box<_>
}
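The `to_any_envelope` helper above copies the outer `LspQueryResponse` envelope's metadata onto each per-server payload, which is why every arm of the `match` can funnel into the same call. The fan-out can be sketched with a self-contained model (the `Meta`, `TypedEnvelope`, and `wrap` names here are illustrative stand-ins, not Zed's actual types):

```rust
// Simplified model of the per-server response fan-out: one outer envelope's
// metadata is reused for every inner per-server payload.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Meta {
    sender_id: u32,
    message_id: u32,
}

#[derive(Debug, PartialEq)]
struct TypedEnvelope<T> {
    meta: Meta,
    payload: T,
}

// Counterpart of `to_any_envelope`: wrap a payload in an envelope that
// carries the metadata of the envelope it was extracted from.
fn wrap<T>(meta: Meta, payload: T) -> TypedEnvelope<T> {
    TypedEnvelope { meta, payload }
}

fn main() {
    let meta = Meta { sender_id: 1, message_id: 42 };
    // Each language server's response becomes its own typed envelope,
    // all sharing the outer envelope's metadata.
    let per_server = vec!["hover-result", "references-result"];
    let envelopes: Vec<_> = per_server.into_iter().map(|p| wrap(meta, p)).collect();
    assert!(envelopes.iter().all(|e| e.meta == meta));
    println!("{envelopes:?}");
}
```

This is why the real code only needs one generic helper despite the long `Response::…` match: the variants differ only in payload type, never in how the envelope metadata is threaded through.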


@@ -60,6 +60,11 @@ pub trait Settings: 'static + Send + Sync {
/// The logic for combining together values from one or more JSON files into the
/// final value for this setting.
///
/// # Warning
/// `Self::FileContent`'s deserialized field names should match those of `Self`;
/// otherwise the field won't be deserialized properly and you will get the error:
/// "A default setting must be added to the `default.json` file".
fn load(sources: SettingsSources<Self::FileContent>, cx: &mut App) -> Result<Self>
where
Self: Sized;
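The layered merge that `load` performs can be sketched without Zed's actual types. This is a minimal, self-contained model under stated assumptions: `FontSettings`/`FontSettingsContent` are hypothetical names, every `FileContent` field is an `Option`, and later sources (user files) override earlier ones (defaults):

```rust
// Final settings value, produced by merging one or more JSON sources.
#[derive(Debug, PartialEq)]
struct FontSettings {
    font_size: u32,
    font_family: String,
}

// FileContent counterpart: same field names as FontSettings (per the
// warning above), but every field optional so a file may set only some.
#[derive(Default, Clone)]
struct FontSettingsContent {
    font_size: Option<u32>,
    font_family: Option<String>,
}

// Merge sources in order; later sources override earlier ones.
// Returns None if, after merging, a field is still unset (the case the
// "default setting must be added" error guards against).
fn load(sources: &[FontSettingsContent]) -> Option<FontSettings> {
    let mut merged = FontSettingsContent::default();
    for s in sources {
        if s.font_size.is_some() {
            merged.font_size = s.font_size;
        }
        if s.font_family.is_some() {
            merged.font_family = s.font_family.clone();
        }
    }
    Some(FontSettings {
        font_size: merged.font_size?,
        font_family: merged.font_family?,
    })
}

fn main() {
    let default = FontSettingsContent {
        font_size: Some(14),
        font_family: Some("Zed Mono".into()),
    };
    // User file sets only font_size; font_family falls through to the default.
    let user = FontSettingsContent {
        font_size: Some(16),
        font_family: None,
    };
    let settings = load(&[default, user]).unwrap();
    assert_eq!(settings.font_size, 16);
    assert_eq!(settings.font_family, "Zed Mono");
    println!("{settings:?}");
}
```

If `FontSettingsContent` renamed a field (say `size` instead of `font_size`), the merged value for that field would never be populated, which is the failure mode the doc-comment warning describes.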


@@ -2,7 +2,7 @@
description = "The fast, collaborative code editor."
edition.workspace = true
name = "zed"
version = "0.201.2"
version = "0.202.0"
publish.workspace = true
license = "GPL-3.0-or-later"
authors = ["Zed Team <hi@zed.dev>"]


@@ -1 +1 @@
preview
dev


@@ -598,7 +598,7 @@ pub fn main() {
repl::notebook::init(cx);
diagnostics::init(cx);
audio::init(Assets, cx);
audio::init(cx);
workspace::init(app_state.clone(), cx);
ui_prompt::init(cx);


@@ -4614,7 +4614,7 @@ mod tests {
gpui_tokio::init(cx);
vim_mode_setting::init(cx);
theme::init(theme::LoadThemes::JustBase, cx);
audio::init((), cx);
audio::init(cx);
channel::init(&app_state.client, app_state.user_store.clone(), cx);
call::init(app_state.client.clone(), app_state.user_store.clone(), cx);
notifications::init(app_state.client.clone(), app_state.user_store.clone(), cx);


@@ -16,6 +16,7 @@
- [Configuring Zed](./configuring-zed.md)
- [Configuring Languages](./configuring-languages.md)
- [Key bindings](./key-bindings.md)
- [All Actions](./all-actions.md)
- [Snippets](./snippets.md)
- [Themes](./themes.md)
- [Icon Themes](./icon-themes.md)

docs/src/all-actions.md Normal file

@@ -0,0 +1,3 @@
## All Actions
{#ACTIONS_TABLE#}