Compare commits

...

20 Commits

Author SHA1 Message Date
Joseph T. Lyons
56049ac3b5 v0.217.x stable 2025-12-16 11:56:48 -05:00
Joseph T. Lyons
26801dcfc8 Use omnisharp as the default CSharp language server
Co-authored-by: John Tur <john-tur@outlook.com>
2025-12-16 11:43:33 -05:00
Joseph T. Lyons
03ab187a5d Revert "Improve support for multiple registrations of textDocument/diagnostic (#43703)"
This reverts commit a51e975b81.

Co-authored-by: John Tur <john-tur@outlook.com>
2025-12-16 11:40:33 -05:00
Joseph T. Lyons
e8505f1407 Revert "Fix unregistration logic for pull diagnostics (#44294)"
This reverts commit 9e33243015.
2025-12-16 11:33:53 -05:00
zed-zippy[bot]
8965ed40d4 Revert "Add Doxygen injection into C and C++ comments" (#44883) (cherry-pick to preview) (#44897)
Cherry-pick of #44883 to preview

----
Reverts zed-industries/zed#43581

Release notes:
- Fixed comment injections not working with C and C++.

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2025-12-15 19:29:51 +01:00
zed-zippy[bot]
b0c6de7ee9 ci: Explicitly set git committer information in protobuf check (#44582) (cherry-pick to preview) (#44900)
Cherry-pick of #44582 to preview

----
This should hopefully fix the flakes for good.

Release Notes:

- N/A

Co-authored-by: Finn Evers <finn@zed.dev>
2025-12-15 17:29:28 +00:00
Anthony Eid
535caa7ec6 git: Fix create remote branch (#44805)
Fix a bug where the branch picker would be dismissed before completing
the add remote flow, thus making Zed unable to add remote repositories
through the branch picker.

This bug was caused by the picker always being dismissed on the confirm
action, so the fix stops the branch modal from being dismissed too
early.
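The dismissal logic can be modeled as a tiny state check. This is a toy sketch, not the actual gpui picker code; `ConfirmOutcome` and `BranchModal` are hypothetical stand-ins:

```rust
// Toy model of the dismissal bug and its fix; the real picker is a gpui
// entity and looks nothing like this.
enum ConfirmOutcome {
    Done,
    AwaitingRemoteName, // the multi-step "add remote" flow is still in progress
}

struct BranchModal {
    dismissed: bool,
}

impl BranchModal {
    // Only dismiss once the whole flow has completed; previously the modal
    // was dismissed unconditionally on confirm, killing the in-flight flow.
    fn confirm(&mut self, outcome: ConfirmOutcome) {
        if matches!(outcome, ConfirmOutcome::Done) {
            self.dismissed = true;
        }
    }
}
```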

I also cleaned up the UI and the code a bit.

1. Removed the loading field from the Branch delegate because it was
never used, and the activity indicator already shows the remote add
command if it takes a while.
2. Replaced some async task spawning with `cx.defer`.
3. Added an `add remote name` fake entry when the picker is in the
name-remote state, so the UI is consistent with the other states.
4. Added two regression tests:
4.1 One to prevent the bug from
https://github.com/zed-industries/zed/pull/44742 from occurring again.
4.2 Another to prevent the early-dismissal bug from occurring again.
5. Made the `init_branch_list_test` param order consistent with Zed's
code base.

###### Updated UI
<img width="1150" height="298" alt="image"
src="https://github.com/user-attachments/assets/edead508-381c-4bd8-8a41-394dd5b7b781"
/>


Release Notes:

- N/A
2025-12-15 11:33:03 -05:00
Anthony Eid
20f9df660a git: Show all branches in branch picker empty state (#44742)
This fixes an issue where a user could get confused by the branch picker
because it would only show the 10 most recent branches, instead of all
branches.
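The intended empty-query behavior can be sketched as below; the `Branch` type and the substring filter are illustrative stand-ins (the real picker uses fuzzy matching and a recency limit):

```rust
// Minimal sketch of "empty query shows everything" picker behavior.
#[derive(Debug, Clone, PartialEq)]
struct Branch {
    name: String,
}

/// With an empty query, return every branch; otherwise filter.
/// Substring matching stands in for Zed's fuzzy matcher here.
fn matching_branches(all: &[Branch], query: &str) -> Vec<Branch> {
    if query.is_empty() {
        all.to_vec()
    } else {
        all.iter()
            .filter(|b| b.name.contains(query))
            .cloned()
            .collect()
    }
}
```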

Release Notes:

- git: Show all branches in branch picker when search field is empty
2025-12-15 11:32:57 -05:00
zed-zippy[bot]
5ebf4f1e5d proto: Add two language servers and change used grammar (#44440) (cherry-pick to preview) (#44863)
Cherry-pick of #44440 to preview

----
Closes #43784
Closes #44375
Closes #21057

This PR updates the Proto extension to include support for two new
language servers as well as an updated grammar for better highlighting.

Release Notes:

- Improved Proto support to work better out of the box.

Co-authored-by: Finn Evers <finn@zed.dev>
2025-12-15 12:09:16 +01:00
zed-zippy[bot]
8d0fabaeff language: Make TreeSitterData only shared between snapshots of the same version (#44198) (cherry-pick to preview) (#44704)
Cherry-pick of #44198 to preview

----
Currently we have a single cache for this data shared between all
snapshots, which is incorrect: we might update the cache to a new
version while old snapshots are still around, and those snapshots may
then try to access the new data with old offsets/rows.
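The fix ties cached data to the snapshot version it was computed for. A minimal sketch of that idea, with invented types in place of the real `TreeSitterData` and snapshot internals:

```rust
use std::sync::Arc;

// Hypothetical stand-ins: `CachedData` models data derived from the buffer
// text at a particular version, `Snapshot` models a point-in-time view.
#[derive(Clone)]
struct CachedData {
    version: u64,
    // e.g. row offsets computed against the text at `version`
    rows: Arc<Vec<usize>>,
}

struct Snapshot {
    version: u64,
    cache: Arc<CachedData>,
}

impl Snapshot {
    /// Only use the cache if it was computed for this snapshot's version;
    /// otherwise the offsets may point into text that no longer exists.
    fn cached_rows(&self) -> Option<&[usize]> {
        (self.cache.version == self.version).then(|| self.cache.rows.as_slice())
    }
}
```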

Release Notes:

- N/A

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-12-12 13:26:08 +00:00
zed-zippy[bot]
df28644e95 remote: Remove unnecessary and incorrect single quote in MasterProcess (#44697) (cherry-pick to preview) (#44698)
Cherry-pick of #44697 to preview

----
Closes https://github.com/zed-industries/zed/issues/43992

Release Notes:

- Fixed remoting not working on some Linux and macOS systems

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-12-12 12:03:43 +00:00
Zed Bot
47f6fd714d Bump to 0.217.1 for @rtfeldman 2025-12-11 21:24:14 +00:00
Richard Feldman
35c3daaecf Add GPT-5.2 support (cherry-pick to preview) (#44662)
Cherry-pick of 541e086835 to preview

----
2025-12-11 16:17:58 -05:00
zed-zippy[bot]
5a06069f68 agent_ui: Fix project path not found error when pasting code from other project (#44555) (cherry-pick to preview) (#44633)
Cherry-pick of #44555 to preview

----
The problem with inserting absolute paths is that the agent will try to
read them, but we don't allow the agent to read files outside the
current project. For now, we only insert the crease when the pasted
code comes from the same project.
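The containment check can be sketched as a simple prefix test, assuming project roots are plain absolute paths; Zed's real check resolves the path through the project's worktree store:

```rust
use std::path::Path;

// Rough illustration only: treat a file as "in the project" when it sits
// under one of the project's root directories.
fn is_in_project(project_roots: &[&Path], file_path: &Path) -> bool {
    project_roots.iter().any(|root| file_path.starts_with(root))
}
```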

Release Notes:

- Fixed an issue where pasting code into the agent panel from another
window would show an error

Co-authored-by: Bennet Bo Fenner <bennet@zed.dev>
2025-12-11 15:08:31 +00:00
zed-zippy[bot]
c88ba0bafc acp: Better telemetry IDs for ACP agents (#44544) (cherry-pick to preview) (#44610)
Cherry-pick of #44544 to preview

----
We were defining these in multiple places and also weren't leveraging
the ids the agents were already providing.

This should make sure we use them consistently and avoid issues in the
future.
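The selection logic this PR describes can be sketched as a simple fallback. Field and function names here are illustrative, not the actual ACP types (the real code uses `SharedString` and the agent's initialize response):

```rust
// Prefer the id the agent reports about itself; fall back to the name
// Zed configured for the server.
fn telemetry_id(agent_provided_name: Option<String>, server_name: &str) -> String {
    agent_provided_name.unwrap_or_else(|| server_name.to_string())
}
```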

Release Notes:

- N/A

Co-authored-by: Ben Brandt <benjamin.j.brandt@gmail.com>
2025-12-11 10:33:51 +00:00
zed-zippy[bot]
1ca98ade65 windows: Fix incorrect cursor insertion keybinds (#44608) (cherry-pick to preview) (#44611)
Cherry-pick of #44608 to preview

----
Release Notes:

- N/A

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-12-11 10:23:03 +00:00
zed-zippy[bot]
b3249aa93d git: Fix failing commits when hook command is not available (#43993) (cherry-pick to preview) (#44593)
Cherry-pick of #43993 to preview

----

Co-authored-by: Mayank Verma <errmayank@gmail.com>
2025-12-11 04:01:45 +00:00
zed-zippy[bot]
3487d5594e Revert "Increase askpass timeout for git operations (#42946)" (#44578) (cherry-pick to preview) (#44583)
Cherry-pick of #44578 to preview

----
This reverts commit a74aac88c9.

cc @11happy, we need to do a bit more than just running `git hook
pre-push` before pushing, as described
[here](https://github.com/zed-industries/zed/pull/42946#issuecomment-3550570438).
Right now this is also running the pre-push hook twice.

Release Notes:

- N/A

Co-authored-by: Cole Miller <cole@zed.dev>
2025-12-10 23:43:43 +00:00
Finn Evers
297a65410e Disable OmniSharp by default for C# files (#44427)
In preparation for https://github.com/zed-extensions/csharp/pull/11. Do
not merge before that PR is published.

Release Notes:

- Added support for Roslyn in C# files. Roslyn will now be the default
language server for C#
2025-12-10 10:18:29 -05:00
Joseph T. Lyons
fb01de04c3 v0.217.x preview 2025-12-10 10:09:51 -05:00
57 changed files with 1208 additions and 1061 deletions

View File

@@ -497,6 +497,8 @@ jobs:
env:
GIT_AUTHOR_NAME: Protobuf Action
GIT_AUTHOR_EMAIL: ci@zed.dev
GIT_COMMITTER_NAME: Protobuf Action
GIT_COMMITTER_EMAIL: ci@zed.dev
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683

Cargo.lock generated
View File

@@ -7057,6 +7057,7 @@ dependencies = [
"picker",
"pretty_assertions",
"project",
"rand 0.9.2",
"recent_projects",
"remote",
"schemars",
@@ -20469,7 +20470,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.217.0"
version = "0.217.1"
dependencies = [
"acp_tools",
"activity_indicator",
@@ -20795,7 +20796,7 @@ dependencies = [
name = "zed_proto"
version = "0.2.3"
dependencies = [
"zed_extension_api 0.1.0",
"zed_extension_api 0.7.0",
]
[[package]]

View File

@@ -489,8 +489,8 @@
"bindings": {
"ctrl-[": "editor::Outdent",
"ctrl-]": "editor::Indent",
"ctrl-shift-alt-up": ["editor::AddSelectionAbove", { "skip_soft_wrap": true }], // Insert Cursor Above
"ctrl-shift-alt-down": ["editor::AddSelectionBelow", { "skip_soft_wrap": true }], // Insert Cursor Below
"ctrl-alt-up": ["editor::AddSelectionAbove", { "skip_soft_wrap": true }], // Insert Cursor Above
"ctrl-alt-down": ["editor::AddSelectionBelow", { "skip_soft_wrap": true }], // Insert Cursor Below
"ctrl-shift-k": "editor::DeleteLine",
"alt-up": "editor::MoveLineUp",
"alt-down": "editor::MoveLineDown",

View File

@@ -1810,6 +1810,9 @@
"allowed": false
}
},
"CSharp": {
"language_servers": ["omnisharp", "!roslyn", "..."]
},
"CSS": {
"prettier": {
"allowed": true
@@ -1918,6 +1921,9 @@
"allow_rewrap": "anywhere",
"soft_wrap": "editor_width"
},
"Proto": {
"language_servers": ["buf", "!protols", "!protobuf-language-server", "..."]
},
"Python": {
"code_actions_on_format": {
"source.organizeImports.ruff": true

View File

@@ -1372,7 +1372,7 @@ impl AcpThread {
let path_style = self.project.read(cx).path_style(cx);
let id = update.tool_call_id.clone();
let agent = self.connection().telemetry_id();
let agent_telemetry_id = self.connection().telemetry_id();
let session = self.session_id();
if let ToolCallStatus::Completed | ToolCallStatus::Failed = status {
let status = if matches!(status, ToolCallStatus::Completed) {
@@ -1380,7 +1380,12 @@ impl AcpThread {
} else {
"failed"
};
telemetry::event!("Agent Tool Call Completed", agent, session, status);
telemetry::event!(
"Agent Tool Call Completed",
agent_telemetry_id,
session,
status
);
}
if let Some(ix) = self.index_for_tool_call(&id) {
@@ -3556,8 +3561,8 @@ mod tests {
}
impl AgentConnection for FakeAgentConnection {
fn telemetry_id(&self) -> &'static str {
"fake"
fn telemetry_id(&self) -> SharedString {
"fake".into()
}
fn auth_methods(&self) -> &[acp::AuthMethod] {

View File

@@ -20,7 +20,7 @@ impl UserMessageId {
}
pub trait AgentConnection {
fn telemetry_id(&self) -> &'static str;
fn telemetry_id(&self) -> SharedString;
fn new_thread(
self: Rc<Self>,
@@ -322,8 +322,8 @@ mod test_support {
}
impl AgentConnection for StubAgentConnection {
fn telemetry_id(&self) -> &'static str {
"stub"
fn telemetry_id(&self) -> SharedString {
"stub".into()
}
fn auth_methods(&self) -> &[acp::AuthMethod] {

View File

@@ -777,7 +777,7 @@ impl ActionLog {
#[derive(Clone)]
pub struct ActionLogTelemetry {
pub agent_telemetry_id: &'static str,
pub agent_telemetry_id: SharedString,
pub session_id: Arc<str>,
}

View File

@@ -947,8 +947,8 @@ impl acp_thread::AgentModelSelector for NativeAgentModelSelector {
}
impl acp_thread::AgentConnection for NativeAgentConnection {
fn telemetry_id(&self) -> &'static str {
"zed"
fn telemetry_id(&self) -> SharedString {
"zed".into()
}
fn new_thread(

View File

@@ -21,10 +21,6 @@ impl NativeAgentServer {
}
impl AgentServer for NativeAgentServer {
fn telemetry_id(&self) -> &'static str {
"zed"
}
fn name(&self) -> SharedString {
"Zed Agent".into()
}

View File

@@ -29,7 +29,7 @@ pub struct UnsupportedVersion;
pub struct AcpConnection {
server_name: SharedString,
telemetry_id: &'static str,
telemetry_id: SharedString,
connection: Rc<acp::ClientSideConnection>,
sessions: Rc<RefCell<HashMap<acp::SessionId, AcpSession>>>,
auth_methods: Vec<acp::AuthMethod>,
@@ -54,7 +54,6 @@ pub struct AcpSession {
pub async fn connect(
server_name: SharedString,
telemetry_id: &'static str,
command: AgentServerCommand,
root_dir: &Path,
default_mode: Option<acp::SessionModeId>,
@@ -64,7 +63,6 @@ pub async fn connect(
) -> Result<Rc<dyn AgentConnection>> {
let conn = AcpConnection::stdio(
server_name,
telemetry_id,
command.clone(),
root_dir,
default_mode,
@@ -81,7 +79,6 @@ const MINIMUM_SUPPORTED_VERSION: acp::ProtocolVersion = acp::ProtocolVersion::V1
impl AcpConnection {
pub async fn stdio(
server_name: SharedString,
telemetry_id: &'static str,
command: AgentServerCommand,
root_dir: &Path,
default_mode: Option<acp::SessionModeId>,
@@ -199,6 +196,13 @@ impl AcpConnection {
return Err(UnsupportedVersion.into());
}
let telemetry_id = response
.agent_info
// Use the one the agent provides if we have one
.map(|info| info.name.into())
// Otherwise, just use the name
.unwrap_or_else(|| server_name.clone());
Ok(Self {
auth_methods: response.auth_methods,
root_dir: root_dir.to_owned(),
@@ -233,8 +237,8 @@ impl Drop for AcpConnection {
}
impl AgentConnection for AcpConnection {
fn telemetry_id(&self) -> &'static str {
self.telemetry_id
fn telemetry_id(&self) -> SharedString {
self.telemetry_id.clone()
}
fn new_thread(

View File

@@ -56,7 +56,6 @@ impl AgentServerDelegate {
pub trait AgentServer: Send {
fn logo(&self) -> ui::IconName;
fn name(&self) -> SharedString;
fn telemetry_id(&self) -> &'static str;
fn default_mode(&self, _cx: &mut App) -> Option<agent_client_protocol::SessionModeId> {
None
}

View File

@@ -22,10 +22,6 @@ pub struct AgentServerLoginCommand {
}
impl AgentServer for ClaudeCode {
fn telemetry_id(&self) -> &'static str {
"claude-code"
}
fn name(&self) -> SharedString {
"Claude Code".into()
}
@@ -83,7 +79,6 @@ impl AgentServer for ClaudeCode {
cx: &mut App,
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let telemetry_id = self.telemetry_id();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
@@ -108,7 +103,6 @@ impl AgentServer for ClaudeCode {
.await?;
let connection = crate::acp::connect(
name,
telemetry_id,
command,
root_dir.as_ref(),
default_mode,

View File

@@ -23,10 +23,6 @@ pub(crate) mod tests {
}
impl AgentServer for Codex {
fn telemetry_id(&self) -> &'static str {
"codex"
}
fn name(&self) -> SharedString {
"Codex".into()
}
@@ -84,7 +80,6 @@ impl AgentServer for Codex {
cx: &mut App,
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let telemetry_id = self.telemetry_id();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
@@ -110,7 +105,6 @@ impl AgentServer for Codex {
let connection = crate::acp::connect(
name,
telemetry_id,
command,
root_dir.as_ref(),
default_mode,

View File

@@ -1,4 +1,4 @@
use crate::{AgentServerDelegate, load_proxy_env};
use crate::{AgentServer, AgentServerDelegate, load_proxy_env};
use acp_thread::AgentConnection;
use agent_client_protocol as acp;
use anyhow::{Context as _, Result};
@@ -20,11 +20,7 @@ impl CustomAgentServer {
}
}
impl crate::AgentServer for CustomAgentServer {
fn telemetry_id(&self) -> &'static str {
"custom"
}
impl AgentServer for CustomAgentServer {
fn name(&self) -> SharedString {
self.name.clone()
}
@@ -112,7 +108,6 @@ impl crate::AgentServer for CustomAgentServer {
cx: &mut App,
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let telemetry_id = self.telemetry_id();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let default_mode = self.default_mode(cx);
@@ -139,7 +134,6 @@ impl crate::AgentServer for CustomAgentServer {
.await?;
let connection = crate::acp::connect(
name,
telemetry_id,
command,
root_dir.as_ref(),
default_mode,

View File

@@ -12,10 +12,6 @@ use project::agent_server_store::GEMINI_NAME;
pub struct Gemini;
impl AgentServer for Gemini {
fn telemetry_id(&self) -> &'static str {
"gemini-cli"
}
fn name(&self) -> SharedString {
"Gemini CLI".into()
}
@@ -31,7 +27,6 @@ impl AgentServer for Gemini {
cx: &mut App,
) -> Task<Result<(Rc<dyn AgentConnection>, Option<task::SpawnInTerminal>)>> {
let name = self.name();
let telemetry_id = self.telemetry_id();
let root_dir = root_dir.map(|root_dir| root_dir.to_string_lossy().into_owned());
let is_remote = delegate.project.read(cx).is_via_remote_server();
let store = delegate.store.downgrade();
@@ -66,7 +61,6 @@ impl AgentServer for Gemini {
let connection = crate::acp::connect(
name,
telemetry_id,
command,
root_dir.as_ref(),
default_mode,

View File

@@ -565,8 +565,26 @@ impl MessageEditor {
if let Some((workspace, selections)) =
self.workspace.upgrade().zip(editor_clipboard_selections)
{
cx.stop_propagation();
let Some(first_selection) = selections.first() else {
return;
};
if let Some(file_path) = &first_selection.file_path {
// In case someone pastes selections from another window
// with a different project, we don't want to insert the
// crease (containing the absolute path) since the agent
// cannot access files outside the project.
let is_in_project = workspace
.read(cx)
.project()
.read(cx)
.project_path_for_absolute_path(file_path, cx)
.is_some();
if !is_in_project {
return;
}
}
cx.stop_propagation();
let insertion_target = self
.editor
.read(cx)

View File

@@ -170,7 +170,7 @@ impl ThreadFeedbackState {
}
}
let session_id = thread.read(cx).session_id().clone();
let agent = thread.read(cx).connection().telemetry_id();
let agent_telemetry_id = thread.read(cx).connection().telemetry_id();
let task = telemetry.thread_data(&session_id, cx);
let rating = match feedback {
ThreadFeedback::Positive => "positive",
@@ -180,7 +180,7 @@ impl ThreadFeedbackState {
let thread = task.await?;
telemetry::event!(
"Agent Thread Rated",
agent = agent,
agent = agent_telemetry_id,
session_id = session_id,
rating = rating,
thread = thread
@@ -207,13 +207,13 @@ impl ThreadFeedbackState {
self.comments_editor.take();
let session_id = thread.read(cx).session_id().clone();
let agent = thread.read(cx).connection().telemetry_id();
let agent_telemetry_id = thread.read(cx).connection().telemetry_id();
let task = telemetry.thread_data(&session_id, cx);
cx.background_spawn(async move {
let thread = task.await?;
telemetry::event!(
"Agent Thread Feedback Comments",
agent = agent,
agent = agent_telemetry_id,
session_id = session_id,
comments = comments,
thread = thread
@@ -333,6 +333,7 @@ impl AcpThreadView {
project: Entity<Project>,
history_store: Entity<HistoryStore>,
prompt_store: Option<Entity<PromptStore>>,
track_load_event: bool,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
@@ -391,8 +392,9 @@ impl AcpThreadView {
),
];
let show_codex_windows_warning = crate::ExternalAgent::parse_built_in(agent.as_ref())
== Some(crate::ExternalAgent::Codex);
let show_codex_windows_warning = cfg!(windows)
&& project.read(cx).is_local()
&& agent.clone().downcast::<agent_servers::Codex>().is_some();
Self {
agent: agent.clone(),
@@ -404,6 +406,7 @@ impl AcpThreadView {
resume_thread.clone(),
workspace.clone(),
project.clone(),
track_load_event,
window,
cx,
),
@@ -448,6 +451,7 @@ impl AcpThreadView {
self.resume_thread_metadata.clone(),
self.workspace.clone(),
self.project.clone(),
true,
window,
cx,
);
@@ -461,6 +465,7 @@ impl AcpThreadView {
resume_thread: Option<DbThreadMetadata>,
workspace: WeakEntity<Workspace>,
project: Entity<Project>,
track_load_event: bool,
window: &mut Window,
cx: &mut Context<Self>,
) -> ThreadState {
@@ -519,6 +524,10 @@ impl AcpThreadView {
}
};
if track_load_event {
telemetry::event!("Agent Thread Started", agent = connection.telemetry_id());
}
let result = if let Some(native_agent) = connection
.clone()
.downcast::<agent::NativeAgentConnection>()
@@ -1133,8 +1142,8 @@ impl AcpThreadView {
let Some(thread) = self.thread() else {
return;
};
let agent_telemetry_id = self.agent.telemetry_id();
let session_id = thread.read(cx).session_id().clone();
let agent_telemetry_id = thread.read(cx).connection().telemetry_id();
let thread = thread.downgrade();
if self.should_be_following {
self.workspace
@@ -1512,6 +1521,7 @@ impl AcpThreadView {
else {
return;
};
let agent_telemetry_id = connection.telemetry_id();
// Check for the experimental "terminal-auth" _meta field
let auth_method = connection.auth_methods().iter().find(|m| m.id == method);
@@ -1579,19 +1589,18 @@ impl AcpThreadView {
);
cx.notify();
self.auth_task = Some(cx.spawn_in(window, {
let agent = self.agent.clone();
async move |this, cx| {
let result = authenticate.await;
match &result {
Ok(_) => telemetry::event!(
"Authenticate Agent Succeeded",
agent = agent.telemetry_id()
agent = agent_telemetry_id
),
Err(_) => {
telemetry::event!(
"Authenticate Agent Failed",
agent = agent.telemetry_id(),
agent = agent_telemetry_id,
)
}
}
@@ -1675,6 +1684,7 @@ impl AcpThreadView {
None,
this.workspace.clone(),
this.project.clone(),
true,
window,
cx,
)
@@ -1730,43 +1740,38 @@ impl AcpThreadView {
connection.authenticate(method, cx)
};
cx.notify();
self.auth_task =
Some(cx.spawn_in(window, {
let agent = self.agent.clone();
async move |this, cx| {
let result = authenticate.await;
self.auth_task = Some(cx.spawn_in(window, {
async move |this, cx| {
let result = authenticate.await;
match &result {
Ok(_) => telemetry::event!(
"Authenticate Agent Succeeded",
agent = agent.telemetry_id()
),
Err(_) => {
telemetry::event!(
"Authenticate Agent Failed",
agent = agent.telemetry_id(),
)
}
match &result {
Ok(_) => telemetry::event!(
"Authenticate Agent Succeeded",
agent = agent_telemetry_id
),
Err(_) => {
telemetry::event!("Authenticate Agent Failed", agent = agent_telemetry_id,)
}
this.update_in(cx, |this, window, cx| {
if let Err(err) = result {
if let ThreadState::Unauthenticated {
pending_auth_method,
..
} = &mut this.thread_state
{
pending_auth_method.take();
}
this.handle_thread_error(err, cx);
} else {
this.reset(window, cx);
}
this.auth_task.take()
})
.ok();
}
}));
this.update_in(cx, |this, window, cx| {
if let Err(err) = result {
if let ThreadState::Unauthenticated {
pending_auth_method,
..
} = &mut this.thread_state
{
pending_auth_method.take();
}
this.handle_thread_error(err, cx);
} else {
this.reset(window, cx);
}
this.auth_task.take()
})
.ok();
}
}));
}
fn spawn_external_agent_login(
@@ -1896,10 +1901,11 @@ impl AcpThreadView {
let Some(thread) = self.thread() else {
return;
};
let agent_telemetry_id = thread.read(cx).connection().telemetry_id();
telemetry::event!(
"Agent Tool Call Authorized",
agent = self.agent.telemetry_id(),
agent = agent_telemetry_id,
session = thread.read(cx).session_id(),
option = option_kind
);
@@ -3509,6 +3515,8 @@ impl AcpThreadView {
(method.id.0.clone(), method.name.clone())
};
let agent_telemetry_id = connection.telemetry_id();
Button::new(method_id.clone(), name)
.label_size(LabelSize::Small)
.map(|this| {
@@ -3528,7 +3536,7 @@ impl AcpThreadView {
cx.listener(move |this, _, window, cx| {
telemetry::event!(
"Authenticate Agent Started",
agent = this.agent.telemetry_id(),
agent = agent_telemetry_id,
method = method_id
);
@@ -5376,47 +5384,39 @@ impl AcpThreadView {
)
}
fn render_codex_windows_warning(&self, cx: &mut Context<Self>) -> Option<Callout> {
if self.show_codex_windows_warning {
Some(
Callout::new()
.icon(IconName::Warning)
.severity(Severity::Warning)
.title("Codex on Windows")
.description(
"For best performance, run Codex in Windows Subsystem for Linux (WSL2)",
)
.actions_slot(
Button::new("open-wsl-modal", "Open in WSL")
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.on_click(cx.listener({
move |_, _, _window, cx| {
#[cfg(windows)]
_window.dispatch_action(
zed_actions::wsl_actions::OpenWsl::default().boxed_clone(),
cx,
);
cx.notify();
}
})),
)
.dismiss_action(
IconButton::new("dismiss", IconName::Close)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.tooltip(Tooltip::text("Dismiss Warning"))
.on_click(cx.listener({
move |this, _, _, cx| {
this.show_codex_windows_warning = false;
cx.notify();
}
})),
),
fn render_codex_windows_warning(&self, cx: &mut Context<Self>) -> Callout {
Callout::new()
.icon(IconName::Warning)
.severity(Severity::Warning)
.title("Codex on Windows")
.description("For best performance, run Codex in Windows Subsystem for Linux (WSL2)")
.actions_slot(
Button::new("open-wsl-modal", "Open in WSL")
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.on_click(cx.listener({
move |_, _, _window, cx| {
#[cfg(windows)]
_window.dispatch_action(
zed_actions::wsl_actions::OpenWsl::default().boxed_clone(),
cx,
);
cx.notify();
}
})),
)
.dismiss_action(
IconButton::new("dismiss", IconName::Close)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.tooltip(Tooltip::text("Dismiss Warning"))
.on_click(cx.listener({
move |this, _, _, cx| {
this.show_codex_windows_warning = false;
cx.notify();
}
})),
)
} else {
None
}
}
fn render_thread_error(&mut self, window: &mut Window, cx: &mut Context<Self>) -> Option<Div> {
@@ -5936,12 +5936,8 @@ impl Render for AcpThreadView {
_ => this,
})
.children(self.render_thread_retry_status_callout(window, cx))
.children({
if cfg!(windows) && self.project.read(cx).is_local() {
self.render_codex_windows_warning(cx)
} else {
None
}
.when(self.show_codex_windows_warning, |this| {
this.child(self.render_codex_windows_warning(cx))
})
.children(self.render_thread_error(window, cx))
.when_some(
@@ -6398,6 +6394,7 @@ pub(crate) mod tests {
project,
history_store,
None,
false,
window,
cx,
)
@@ -6475,10 +6472,6 @@ pub(crate) mod tests {
where
C: 'static + AgentConnection + Send + Clone,
{
fn telemetry_id(&self) -> &'static str {
"test"
}
fn logo(&self) -> ui::IconName {
ui::IconName::Ai
}
@@ -6505,8 +6498,8 @@ pub(crate) mod tests {
struct SaboteurAgentConnection;
impl AgentConnection for SaboteurAgentConnection {
fn telemetry_id(&self) -> &'static str {
"saboteur"
fn telemetry_id(&self) -> SharedString {
"saboteur".into()
}
fn new_thread(
@@ -6569,8 +6562,8 @@ pub(crate) mod tests {
struct RefusalAgentConnection;
impl AgentConnection for RefusalAgentConnection {
fn telemetry_id(&self) -> &'static str {
"refusal"
fn telemetry_id(&self) -> SharedString {
"refusal".into()
}
fn new_thread(
@@ -6671,6 +6664,7 @@ pub(crate) mod tests {
project.clone(),
history_store.clone(),
None,
false,
window,
cx,
)

View File

@@ -305,6 +305,7 @@ impl ActiveView {
project,
history_store,
prompt_store,
false,
window,
cx,
)
@@ -885,10 +886,6 @@ impl AgentPanel {
let server = ext_agent.server(fs, history);
if !loading {
telemetry::event!("Agent Thread Started", agent = server.telemetry_id());
}
this.update_in(cx, |this, window, cx| {
let selected_agent = ext_agent.into();
if this.selected_agent != selected_agent {
@@ -905,6 +902,7 @@ impl AgentPanel {
project,
this.history_store.clone(),
this.prompt_store.clone(),
!loading,
window,
cx,
)

View File

@@ -160,16 +160,6 @@ pub enum ExternalAgent {
}
impl ExternalAgent {
pub fn parse_built_in(server: &dyn agent_servers::AgentServer) -> Option<Self> {
match server.telemetry_id() {
"gemini-cli" => Some(Self::Gemini),
"claude-code" => Some(Self::ClaudeCode),
"codex" => Some(Self::Codex),
"zed" => Some(Self::NativeAgent),
_ => None,
}
}
pub fn server(
&self,
fs: Arc<dyn fs::Fs>,

View File

@@ -22,7 +22,6 @@ use gpui::{
use indoc::indoc;
use language::{FakeLspAdapter, rust_lang};
use lsp::LSP_REQUEST_TIMEOUT;
use pretty_assertions::assert_eq;
use project::{
ProgressToken, ProjectPath, SERVER_PROGRESS_THROTTLE_TIMEOUT,
lsp_store::lsp_ext_command::{ExpandedMacro, LspExtExpandMacro},
@@ -3190,12 +3189,13 @@ async fn test_lsp_pull_diagnostics(
.collect::<Vec<_>>();
let expected_messages = [
expected_pull_diagnostic_lib_message,
expected_push_diagnostic_lib_message,
// TODO bug: the pushed diagnostics are not being sent to the client when they open the corresponding buffer.
// expected_push_diagnostic_lib_message,
];
assert_eq!(
all_diagnostics.len(),
2,
"Expected pull and push diagnostics, but got: {all_diagnostics:?}"
1,
"Expected pull diagnostics, but got: {all_diagnostics:?}"
);
for diagnostic in all_diagnostics {
assert!(
@@ -3255,15 +3255,14 @@ async fn test_lsp_pull_diagnostics(
.diagnostics_in_range(MultiBufferOffset(0)..snapshot.len())
.collect::<Vec<_>>();
let expected_messages = [
// Despite workspace diagnostics provided,
// the currently open file's diagnostics should be preferred, as LSP suggests.
expected_pull_diagnostic_lib_message,
expected_push_diagnostic_lib_message,
expected_workspace_pull_diagnostics_lib_message,
// TODO bug: the pushed diagnostics are not being sent to the client when they open the corresponding buffer.
// expected_push_diagnostic_lib_message,
];
assert_eq!(
all_diagnostics.len(),
2,
"Expected pull and push diagnostics, but got: {all_diagnostics:?}"
1,
"Expected pull diagnostics, but got: {all_diagnostics:?}"
);
for diagnostic in all_diagnostics {
assert!(
@@ -3376,9 +3375,8 @@ async fn test_lsp_pull_diagnostics(
"Another workspace diagnostics pull should happen after the diagnostics refresh server request"
);
{
assert_eq!(
diagnostics_pulls_result_ids.lock().await.len(),
diagnostic_pulls_result_ids,
assert!(
diagnostics_pulls_result_ids.lock().await.len() == diagnostic_pulls_result_ids,
"Pulls should not happen hence no extra ids should appear"
);
assert!(
@@ -3396,7 +3394,7 @@ async fn test_lsp_pull_diagnostics(
expected_pull_diagnostic_lib_message,
expected_push_diagnostic_lib_message,
];
assert_eq!(all_diagnostics.len(), 2);
assert_eq!(all_diagnostics.len(), 1);
for diagnostic in &all_diagnostics {
assert!(
expected_messages.contains(&diagnostic.diagnostic.message.as_str()),

View File

@@ -45,7 +45,7 @@ impl Editor {
let bracket_matches_by_accent = self.visible_excerpts(false, cx).into_iter().fold(
HashMap::default(),
|mut acc, (excerpt_id, (buffer, buffer_version, buffer_range))| {
|mut acc, (excerpt_id, (buffer, _, buffer_range))| {
let buffer_snapshot = buffer.read(cx).snapshot();
if language_settings::language_settings(
buffer_snapshot.language().map(|language| language.name()),
@@ -62,7 +62,7 @@ impl Editor {
let brackets_by_accent = buffer_snapshot
.fetch_bracket_ranges(
buffer_range.start..buffer_range.end,
Some((&buffer_version, fetched_chunks)),
Some(fetched_chunks),
)
.into_iter()
.flat_map(|(chunk_range, pairs)| {

View File

@@ -1177,7 +1177,6 @@ pub struct Editor {
gutter_breakpoint_indicator: (Option<PhantomBreakpointIndicator>, Option<Task<()>>),
hovered_diff_hunk_row: Option<DisplayRow>,
pull_diagnostics_task: Task<()>,
pull_diagnostics_background_task: Task<()>,
in_project_search: bool,
previous_search_ranges: Option<Arc<[Range<Anchor>]>>,
breadcrumb_header: Option<String>,
@@ -2374,7 +2373,6 @@ impl Editor {
.unwrap_or_default(),
tasks_update_task: None,
pull_diagnostics_task: Task::ready(()),
pull_diagnostics_background_task: Task::ready(()),
colors: None,
refresh_colors_task: Task::ready(()),
inlay_hints: None,
@@ -2551,6 +2549,7 @@ impl Editor {
if let Some(buffer) = multi_buffer.read(cx).as_singleton() {
editor.register_buffer(buffer.read(cx).remote_id(), cx);
}
editor.update_lsp_data(None, window, cx);
editor.report_editor_event(ReportEditorEvent::EditorOpened, None, cx);
}
@@ -18773,101 +18772,54 @@ impl Editor {
return None;
}
let project = self.project()?.downgrade();
let mut edited_buffer_ids = HashSet::default();
let mut edited_worktree_ids = HashSet::default();
let edited_buffers = match buffer_id {
Some(buffer_id) => {
let buffer = self.buffer().read(cx).buffer(buffer_id)?;
let worktree_id = buffer.read(cx).file().map(|f| f.worktree_id(cx))?;
edited_buffer_ids.insert(buffer.read(cx).remote_id());
edited_worktree_ids.insert(worktree_id);
vec![buffer]
}
None => self
.buffer()
.read(cx)
.all_buffers()
.into_iter()
.filter(|buffer| {
let buffer = buffer.read(cx);
match buffer.file().map(|f| f.worktree_id(cx)) {
Some(worktree_id) => {
edited_buffer_ids.insert(buffer.remote_id());
edited_worktree_ids.insert(worktree_id);
true
}
None => false,
}
})
.collect::<Vec<_>>(),
};
if edited_buffers.is_empty() {
let debounce = Duration::from_millis(pull_diagnostics_settings.debounce_ms);
let mut buffers = self.buffer.read(cx).all_buffers();
buffers.retain(|buffer| {
let buffer_id_to_retain = buffer.read(cx).remote_id();
buffer_id.is_none_or(|buffer_id| buffer_id == buffer_id_to_retain)
&& self.registered_buffers.contains_key(&buffer_id_to_retain)
});
if buffers.is_empty() {
self.pull_diagnostics_task = Task::ready(());
self.pull_diagnostics_background_task = Task::ready(());
return None;
}
let mut already_used_buffers = HashSet::default();
let related_open_buffers = self
.workspace
.as_ref()
.and_then(|(workspace, _)| workspace.upgrade())
.into_iter()
.flat_map(|workspace| workspace.read(cx).panes())
.flat_map(|pane| pane.read(cx).items_of_type::<Editor>())
.filter(|editor| editor != &cx.entity())
.flat_map(|editor| editor.read(cx).buffer().read(cx).all_buffers())
.filter(|buffer| {
let buffer = buffer.read(cx);
let buffer_id = buffer.remote_id();
if already_used_buffers.insert(buffer_id) {
if let Some(worktree_id) = buffer.file().map(|f| f.worktree_id(cx)) {
return !edited_buffer_ids.contains(&buffer_id)
&& !edited_worktree_ids.contains(&worktree_id);
}
}
false
})
.collect::<Vec<_>>();
self.pull_diagnostics_task = cx.spawn_in(window, async move |editor, cx| {
cx.background_executor().timer(debounce).await;
let debounce = Duration::from_millis(pull_diagnostics_settings.debounce_ms);
let make_spawn = |buffers: Vec<Entity<Buffer>>, delay: Duration| {
if buffers.is_empty() {
return Task::ready(());
}
let project_weak = project.clone();
cx.spawn_in(window, async move |_, cx| {
cx.background_executor().timer(delay).await;
let Ok(mut pull_diagnostics_tasks) = cx.update(|_, cx| {
buffers
.into_iter()
.filter_map(|buffer| {
project_weak
.update(cx, |project, cx| {
project.lsp_store().update(cx, |lsp_store, cx| {
lsp_store.pull_diagnostics_for_buffer(buffer, cx)
})
let Ok(mut pull_diagnostics_tasks) = cx.update(|_, cx| {
buffers
.into_iter()
.filter_map(|buffer| {
project
.update(cx, |project, cx| {
project.lsp_store().update(cx, |lsp_store, cx| {
lsp_store.pull_diagnostics_for_buffer(buffer, cx)
})
.ok()
})
.collect::<FuturesUnordered<_>>()
}) else {
return;
};
})
.ok()
})
.collect::<FuturesUnordered<_>>()
}) else {
return;
};
while let Some(pull_task) = pull_diagnostics_tasks.next().await {
if let Err(e) = pull_task {
log::error!("Failed to update project diagnostics: {e:#}");
while let Some(pull_task) = pull_diagnostics_tasks.next().await {
match pull_task {
Ok(()) => {
if editor
.update_in(cx, |editor, window, cx| {
editor.update_diagnostics_state(window, cx);
})
.is_err()
{
return;
}
}
Err(e) => log::error!("Failed to update project diagnostics: {e:#}"),
}
})
};
self.pull_diagnostics_task = make_spawn(edited_buffers, debounce);
self.pull_diagnostics_background_task = make_spawn(related_open_buffers, debounce * 2);
}
});
Some(())
}
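The diff above splits diagnostics pulls into two tiers: buffers the user edited are pulled after the normal debounce, while other open buffers (outside the edited worktrees) are pulled at twice the debounce. The partitioning itself is a set-difference filter; a minimal standalone sketch, with hypothetical integer IDs standing in for `BufferId`/`WorktreeId`:

```rust
use std::collections::HashSet;

/// Keep only buffers that were neither edited directly nor live in a
/// worktree containing an edited buffer; these would get the slower
/// background pull (debounce * 2 in the diff above).
fn background_pull_candidates(
    open: &[(u32, u32)], // hypothetical (buffer_id, worktree_id) pairs
    edited_buffers: &HashSet<u32>,
    edited_worktrees: &HashSet<u32>,
) -> Vec<u32> {
    let mut seen = HashSet::new();
    open.iter()
        .filter(|(id, wt)| {
            seen.insert(*id) // dedupe buffers open in several panes
                && !edited_buffers.contains(id)
                && !edited_worktrees.contains(wt)
        })
        .map(|(id, _)| *id)
        .collect()
}
```

This mirrors the `already_used_buffers` / `edited_buffer_ids` / `edited_worktree_ids` checks without the editor plumbing; the real code additionally filters by LSP registration.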


@@ -26798,7 +26798,7 @@ async fn test_pulling_diagnostics(cx: &mut TestAppContext) {
}
});
let ensure_result_id = |expected: Option<SharedString>, cx: &mut TestAppContext| {
let ensure_result_id = |expected: Option<String>, cx: &mut TestAppContext| {
project.update(cx, |project, cx| {
let buffer_id = editor
.read(cx)
@@ -26811,7 +26811,7 @@ async fn test_pulling_diagnostics(cx: &mut TestAppContext) {
let buffer_result_id = project
.lsp_store()
.read(cx)
.result_id_for_buffer_pull(server_id, buffer_id, &None, cx);
.result_id(server_id, buffer_id, cx);
assert_eq!(expected, buffer_result_id);
});
};
@@ -26828,7 +26828,7 @@ async fn test_pulling_diagnostics(cx: &mut TestAppContext) {
.next()
.await
.expect("should have sent the first diagnostics pull request");
ensure_result_id(Some(SharedString::new("1")), cx);
ensure_result_id(Some("1".to_string()), cx);
// Editing should trigger diagnostics
editor.update_in(cx, |editor, window, cx| {
@@ -26841,7 +26841,7 @@ async fn test_pulling_diagnostics(cx: &mut TestAppContext) {
2,
"Editing should trigger diagnostic request"
);
ensure_result_id(Some(SharedString::new("2")), cx);
ensure_result_id(Some("2".to_string()), cx);
// Moving cursor should not trigger diagnostic request
editor.update_in(cx, |editor, window, cx| {
@@ -26856,7 +26856,7 @@ async fn test_pulling_diagnostics(cx: &mut TestAppContext) {
2,
"Cursor movement should not trigger diagnostic request"
);
ensure_result_id(Some(SharedString::new("2")), cx);
ensure_result_id(Some("2".to_string()), cx);
// Multiple rapid edits should be debounced
for _ in 0..5 {
editor.update_in(cx, |editor, window, cx| {
@@ -26871,7 +26871,7 @@ async fn test_pulling_diagnostics(cx: &mut TestAppContext) {
final_requests <= 4,
"Multiple rapid edits should be debounced (got {final_requests} requests)",
);
ensure_result_id(Some(SharedString::new(final_requests.to_string())), cx);
ensure_result_id(Some(final_requests.to_string()), cx);
}
#[gpui::test]


@@ -232,14 +232,12 @@ impl From<Oid> for usize {
#[derive(Copy, Clone, Debug)]
pub enum RunHook {
PreCommit,
PrePush,
}
impl RunHook {
pub fn as_str(&self) -> &str {
match self {
Self::PreCommit => "pre-commit",
Self::PrePush => "pre-push",
}
}
@@ -250,7 +248,6 @@ impl RunHook {
pub fn from_proto(value: i32) -> Option<Self> {
match value {
0 => Some(Self::PreCommit),
1 => Some(Self::PrePush),
_ => None,
}
}
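The new `PrePush` variant has to stay in sync between the string form (`as_str`, used to locate the hook file) and the protobuf integer form (`from_proto`). A self-contained sketch of that round-trip, using the encoding shown in the diff (0 = `PreCommit`, 1 = `PrePush`):

```rust
#[derive(Copy, Clone, Debug, PartialEq)]
enum RunHook {
    PreCommit,
    PrePush,
}

impl RunHook {
    /// Git's on-disk hook names.
    fn as_str(&self) -> &str {
        match self {
            Self::PreCommit => "pre-commit",
            Self::PrePush => "pre-push",
        }
    }

    /// Decode the protobuf integer; unknown values map to None
    /// so newer peers don't crash older clients.
    fn from_proto(value: i32) -> Option<Self> {
        match value {
            0 => Some(Self::PreCommit),
            1 => Some(Self::PrePush),
            _ => None,
        }
    }
}
```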


@@ -2295,8 +2295,38 @@ impl GitRepository for RealGitRepository {
self.executor
.spawn(async move {
let working_directory = working_directory?;
let git = GitBinary::new(git_binary_path, working_directory, executor)
let git = GitBinary::new(git_binary_path, working_directory.clone(), executor)
.envs(HashMap::clone(&env));
let output = git.run(&["help", "-a"]).await?;
if !output.lines().any(|line| line.trim().starts_with("hook ")) {
log::warn!(
"git hook command not available, running the {} hook manually",
hook.as_str()
);
let hook_abs_path = working_directory
.join(".git")
.join("hooks")
.join(hook.as_str());
if hook_abs_path.is_file() {
let output = new_smol_command(&hook_abs_path)
.envs(env.iter())
.current_dir(&working_directory)
.output()
.await?;
anyhow::ensure!(
output.status.success(),
"{} hook failed:\n{}",
hook.as_str(),
String::from_utf8_lossy(&output.stderr)
);
}
return Ok(());
}
git.run(&["hook", "run", "--ignore-missing", hook.as_str()])
.await?;
Ok(())
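The fallback above hinges on detecting whether the installed git supports the `git hook run` subcommand (added in git 2.36) by scanning `git help -a` output, where subcommands appear as indented lines. The detection itself can be isolated as a pure function; a sketch of the same check against a captured help string:

```rust
/// Returns true if `git help -a` output lists a `hook` subcommand.
/// Mirrors the line-scanning check in the diff above: each command is
/// printed on its own (indented) line, name first.
fn supports_hook_subcommand(help_output: &str) -> bool {
    help_output
        .lines()
        .any(|line| line.trim().starts_with("hook "))
}
```

When the check fails, the diff runs `.git/hooks/<name>` directly with the same environment and working directory, preserving behavior on older git installations.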


@@ -74,6 +74,7 @@ gpui = { workspace = true, features = ["test-support"] }
indoc.workspace = true
pretty_assertions.workspace = true
project = { workspace = true, features = ["test-support"] }
rand.workspace = true
settings = { workspace = true, features = ["test-support"] }
unindent.workspace = true
workspace = { workspace = true, features = ["test-support"] }


@@ -6,7 +6,7 @@ use collections::HashSet;
use git::repository::Branch;
use gpui::http_client::Url;
use gpui::{
Action, App, AsyncApp, Context, DismissEvent, Entity, EventEmitter, FocusHandle, Focusable,
Action, App, Context, DismissEvent, Entity, EventEmitter, FocusHandle, Focusable,
InteractiveElement, IntoElement, Modifiers, ModifiersChangedEvent, ParentElement, Render,
SharedString, Styled, Subscription, Task, WeakEntity, Window, actions, rems,
};
@@ -17,8 +17,8 @@ use settings::Settings;
use std::sync::Arc;
use time::OffsetDateTime;
use ui::{
CommonAnimationExt, Divider, HighlightedLabel, KeyBinding, ListHeader, ListItem,
ListItemSpacing, Tooltip, prelude::*,
Divider, HighlightedLabel, KeyBinding, ListHeader, ListItem, ListItemSpacing, Tooltip,
prelude::*,
};
use util::ResultExt;
use workspace::notifications::DetachAndPromptErr;
@@ -232,21 +232,12 @@ impl BranchList {
window: &mut Window,
cx: &mut Context<Self>,
) {
self.picker.update(cx, |this, cx| {
this.delegate.display_remotes = !this.delegate.display_remotes;
cx.spawn_in(window, async move |this, cx| {
this.update_in(cx, |picker, window, cx| {
let last_query = picker.delegate.last_query.clone();
picker.delegate.update_matches(last_query, window, cx)
})?
.await;
Result::Ok::<_, anyhow::Error>(())
})
.detach_and_log_err(cx);
self.picker.update(cx, |picker, cx| {
picker.delegate.branch_filter = picker.delegate.branch_filter.invert();
picker.update_matches(picker.query(cx), window, cx);
picker.refresh_placeholder(window, cx);
cx.notify();
});
cx.notify();
}
}
impl ModalView for BranchList {}
@@ -289,6 +280,10 @@ enum Entry {
NewBranch {
name: String,
},
NewRemoteName {
name: String,
url: SharedString,
},
}
impl Entry {
@@ -304,6 +299,7 @@ impl Entry {
Entry::Branch { branch, .. } => branch.name(),
Entry::NewUrl { url, .. } => url.as_str(),
Entry::NewBranch { name, .. } => name.as_str(),
Entry::NewRemoteName { name, .. } => name.as_str(),
}
}
@@ -318,6 +314,23 @@ impl Entry {
}
}
#[derive(Clone, Copy, PartialEq)]
enum BranchFilter {
/// Only show local branches
Local,
/// Only show remote branches
Remote,
}
impl BranchFilter {
fn invert(&self) -> Self {
match self {
BranchFilter::Local => BranchFilter::Remote,
BranchFilter::Remote => BranchFilter::Local,
}
}
}
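The two-state `BranchFilter` replaces the old `display_remotes: bool`, making the toggle self-describing at call sites. It can be exercised standalone:

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
enum BranchFilter {
    /// Only show local branches
    Local,
    /// Only show remote branches
    Remote,
}

impl BranchFilter {
    /// Toggle between the two filter states; replaces `!display_remotes`.
    fn invert(&self) -> Self {
        match self {
            BranchFilter::Local => BranchFilter::Remote,
            BranchFilter::Remote => BranchFilter::Local,
        }
    }
}
```

Comparisons like `self.branch_filter == BranchFilter::Remote` then read unambiguously where `if display_remotes` previously required knowing the boolean's polarity.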
pub struct BranchListDelegate {
workspace: Option<WeakEntity<Workspace>>,
matches: Vec<Entry>,
@@ -328,9 +341,8 @@ pub struct BranchListDelegate {
selected_index: usize,
last_query: String,
modifiers: Modifiers,
display_remotes: bool,
branch_filter: BranchFilter,
state: PickerState,
loading: bool,
focus_handle: FocusHandle,
}
@@ -363,9 +375,8 @@ impl BranchListDelegate {
selected_index: 0,
last_query: Default::default(),
modifiers: Default::default(),
display_remotes: false,
branch_filter: BranchFilter::Local,
state: PickerState::List,
loading: false,
focus_handle: cx.focus_handle(),
}
}
@@ -406,37 +417,13 @@ impl BranchListDelegate {
let Some(repo) = self.repo.clone() else {
return;
};
cx.spawn(async move |this, cx| {
this.update(cx, |picker, cx| {
picker.delegate.loading = true;
cx.notify();
})
.log_err();
let stop_loader = |this: &WeakEntity<Picker<BranchListDelegate>>, cx: &mut AsyncApp| {
this.update(cx, |picker, cx| {
picker.delegate.loading = false;
cx.notify();
})
.log_err();
};
repo.update(cx, |repo, _| repo.create_remote(remote_name, remote_url))
.inspect_err(|_err| {
stop_loader(&this, cx);
})?
.await
.inspect_err(|_err| {
stop_loader(&this, cx);
})?
.inspect_err(|_err| {
stop_loader(&this, cx);
})?;
stop_loader(&this, cx);
Ok(())
})
.detach_and_prompt_err("Failed to create remote", window, cx, |e, _, _cx| {
Some(e.to_string())
});
let receiver = repo.update(cx, |repo, _| repo.create_remote(remote_name, remote_url));
cx.background_spawn(async move { receiver.await? })
.detach_and_prompt_err("Failed to create remote", window, cx, |e, _, _cx| {
Some(e.to_string())
});
cx.emit(DismissEvent);
}
@@ -528,29 +515,33 @@ impl PickerDelegate for BranchListDelegate {
type ListItem = ListItem;
fn placeholder_text(&self, _window: &mut Window, _cx: &mut App) -> Arc<str> {
"Select branch…".into()
match self.state {
PickerState::List | PickerState::NewRemote | PickerState::NewBranch => {
match self.branch_filter {
BranchFilter::Local => "Select branch…",
BranchFilter::Remote => "Select remote…",
}
}
PickerState::CreateRemote(_) => "Enter a name for this remote…",
}
.into()
}
fn no_matches_text(&self, _window: &mut Window, _cx: &mut App) -> Option<SharedString> {
match self.state {
PickerState::CreateRemote(_) => {
Some(SharedString::new_static("Remote name can't be empty"))
}
_ => None,
}
}
fn render_editor(
&self,
editor: &Entity<Editor>,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
_window: &mut Window,
_cx: &mut Context<Picker<Self>>,
) -> Div {
cx.update_entity(editor, move |editor, cx| {
let placeholder = match self.state {
PickerState::List | PickerState::NewRemote | PickerState::NewBranch => {
if self.display_remotes {
"Select remote…"
} else {
"Select branch…"
}
}
PickerState::CreateRemote(_) => "Choose a name…",
};
editor.set_placeholder_text(placeholder, window, cx);
});
let focus_handle = self.focus_handle.clone();
v_flex()
@@ -568,16 +559,14 @@ impl PickerDelegate for BranchListDelegate {
.when(
self.editor_position() == PickerEditorPosition::End,
|this| {
let tooltip_label = if self.display_remotes {
"Turn Off Remote Filter"
} else {
"Filter Remote Branches"
let tooltip_label = match self.branch_filter {
BranchFilter::Local => "Turn Off Remote Filter",
BranchFilter::Remote => "Filter Remote Branches",
};
this.gap_1().justify_between().child({
IconButton::new("filter-remotes", IconName::Filter)
.disabled(self.loading)
.toggle_state(self.display_remotes)
.toggle_state(self.branch_filter == BranchFilter::Remote)
.tooltip(move |_, cx| {
Tooltip::for_action_in(
tooltip_label,
@@ -636,20 +625,18 @@ impl PickerDelegate for BranchListDelegate {
return Task::ready(());
};
const RECENT_BRANCHES_COUNT: usize = 10;
let display_remotes = self.display_remotes;
let display_remotes = self.branch_filter;
cx.spawn_in(window, async move |picker, cx| {
let mut matches: Vec<Entry> = if query.is_empty() {
all_branches
.into_iter()
.filter(|branch| {
if display_remotes {
if display_remotes == BranchFilter::Remote {
branch.is_remote()
} else {
!branch.is_remote()
}
})
.take(RECENT_BRANCHES_COUNT)
.map(|branch| Entry::Branch {
branch,
positions: Vec::new(),
@@ -659,7 +646,7 @@ impl PickerDelegate for BranchListDelegate {
let branches = all_branches
.iter()
.filter(|branch| {
if display_remotes {
if display_remotes == BranchFilter::Remote {
branch.is_remote()
} else {
!branch.is_remote()
@@ -690,11 +677,19 @@ impl PickerDelegate for BranchListDelegate {
};
picker
.update(cx, |picker, _| {
if matches!(picker.delegate.state, PickerState::CreateRemote(_)) {
if let PickerState::CreateRemote(url) = &picker.delegate.state {
let query = query.replace(' ', "-");
if !query.is_empty() {
picker.delegate.matches = vec![Entry::NewRemoteName {
name: query.clone(),
url: url.clone(),
}];
picker.delegate.selected_index = 0;
} else {
picker.delegate.matches = Vec::new();
picker.delegate.selected_index = 0;
}
picker.delegate.last_query = query;
picker.delegate.matches = Vec::new();
picker.delegate.selected_index = 0;
return;
}
@@ -738,13 +733,6 @@ impl PickerDelegate for BranchListDelegate {
}
fn confirm(&mut self, secondary: bool, window: &mut Window, cx: &mut Context<Picker<Self>>) {
if let PickerState::CreateRemote(remote_url) = &self.state {
self.create_remote(self.last_query.clone(), remote_url.to_string(), window, cx);
self.state = PickerState::List;
cx.notify();
return;
}
let Some(entry) = self.matches.get(self.selected_index()) else {
return;
};
@@ -787,13 +775,19 @@ impl PickerDelegate for BranchListDelegate {
self.state = PickerState::CreateRemote(url.clone().into());
self.matches = Vec::new();
self.selected_index = 0;
cx.spawn_in(window, async move |this, cx| {
this.update_in(cx, |picker, window, cx| {
picker.set_query("", window, cx);
})
})
.detach_and_log_err(cx);
cx.notify();
cx.defer_in(window, |picker, window, cx| {
picker.refresh_placeholder(window, cx);
picker.set_query("", window, cx);
cx.notify();
});
// Return early to avoid dismissing the modal, so the user can
// enter a remote name first.
return;
}
Entry::NewRemoteName { name, url } => {
self.create_remote(name.clone(), url.to_string(), window, cx);
}
Entry::NewBranch { name } => {
let from_branch = if secondary {
@@ -844,17 +838,13 @@ impl PickerDelegate for BranchListDelegate {
.unwrap_or_else(|| (None, None, None));
let entry_icon = match entry {
Entry::NewUrl { .. } | Entry::NewBranch { .. } => {
Entry::NewUrl { .. } | Entry::NewBranch { .. } | Entry::NewRemoteName { .. } => {
Icon::new(IconName::Plus).color(Color::Muted)
}
Entry::Branch { .. } => {
if self.display_remotes {
Icon::new(IconName::Screen).color(Color::Muted)
} else {
Icon::new(IconName::GitBranchAlt).color(Color::Muted)
}
}
Entry::Branch { .. } => match self.branch_filter {
BranchFilter::Local => Icon::new(IconName::GitBranchAlt).color(Color::Muted),
BranchFilter::Remote => Icon::new(IconName::Screen).color(Color::Muted),
},
};
let entry_title = match entry {
@@ -866,6 +856,10 @@ impl PickerDelegate for BranchListDelegate {
.single_line()
.truncate()
.into_any_element(),
Entry::NewRemoteName { name, .. } => Label::new(format!("Create Remote: \"{name}\""))
.single_line()
.truncate()
.into_any_element(),
Entry::Branch { branch, positions } => {
HighlightedLabel::new(branch.name().to_string(), positions.clone())
.single_line()
@@ -875,7 +869,10 @@ impl PickerDelegate for BranchListDelegate {
};
let focus_handle = self.focus_handle.clone();
let is_new_items = matches!(entry, Entry::NewUrl { .. } | Entry::NewBranch { .. });
let is_new_items = matches!(
entry,
Entry::NewUrl { .. } | Entry::NewBranch { .. } | Entry::NewRemoteName { .. }
);
let delete_branch_button = IconButton::new("delete", IconName::Trash)
.tooltip(move |_, cx| {
@@ -937,6 +934,9 @@ impl PickerDelegate for BranchListDelegate {
Entry::NewUrl { url } => {
format!("Based off {url}")
}
Entry::NewRemoteName { url, .. } => {
format!("Based off {url}")
}
Entry::NewBranch { .. } => {
if let Some(current_branch) =
self.repo.as_ref().and_then(|repo| {
@@ -1035,10 +1035,9 @@ impl PickerDelegate for BranchListDelegate {
_cx: &mut Context<Picker<Self>>,
) -> Option<AnyElement> {
matches!(self.state, PickerState::List).then(|| {
let label = if self.display_remotes {
"Remote"
} else {
"Local"
let label = match self.branch_filter {
BranchFilter::Local => "Local",
BranchFilter::Remote => "Remote",
};
ListHeader::new(label).inset(true).into_any_element()
@@ -1049,11 +1048,7 @@ impl PickerDelegate for BranchListDelegate {
if self.editor_position() == PickerEditorPosition::End {
return None;
}
let focus_handle = self.focus_handle.clone();
let loading_icon = Icon::new(IconName::LoadCircle)
.size(IconSize::Small)
.with_rotate_animation(3);
let footer_container = || {
h_flex()
@@ -1092,7 +1087,6 @@ impl PickerDelegate for BranchListDelegate {
.gap_1()
.child(
Button::new("delete-branch", "Delete")
.disabled(self.loading)
.key_binding(
KeyBinding::for_action_in(
&branch_picker::DeleteBranch,
@@ -1140,17 +1134,15 @@ impl PickerDelegate for BranchListDelegate {
)
},
)
} else if self.loading {
this.justify_between()
.child(loading_icon)
.child(delete_and_select_btns)
} else {
this.justify_between()
.child({
let focus_handle = focus_handle.clone();
Button::new("filter-remotes", "Filter Remotes")
.disabled(self.loading)
.toggle_state(self.display_remotes)
.toggle_state(matches!(
self.branch_filter,
BranchFilter::Remote
))
.key_binding(
KeyBinding::for_action_in(
&branch_picker::FilterRemotes,
@@ -1215,14 +1207,15 @@ impl PickerDelegate for BranchListDelegate {
footer_container()
.justify_end()
.child(
Label::new("Choose a name for this remote repository")
.size(LabelSize::Small)
.color(Color::Muted),
)
.child(
Label::new("Save")
.size(LabelSize::Small)
.color(Color::Muted),
Button::new("branch-from-default", "Confirm")
.key_binding(
KeyBinding::for_action_in(&menu::Confirm, &focus_handle, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(cx.listener(|this, _, window, cx| {
this.delegate.confirm(false, window, cx);
}))
.disabled(self.last_query.is_empty()),
)
.into_any_element(),
),
@@ -1239,6 +1232,7 @@ mod tests {
use git::repository::{CommitSummary, Remote};
use gpui::{TestAppContext, VisualTestContext};
use project::{FakeFs, Project};
use rand::{Rng, rngs::StdRng};
use serde_json::json;
use settings::SettingsStore;
use util::path;
@@ -1286,10 +1280,10 @@ mod tests {
}
fn init_branch_list_test(
cx: &mut TestAppContext,
repository: Option<Entity<Repository>>,
branches: Vec<Branch>,
) -> (VisualTestContext, Entity<BranchList>) {
cx: &mut TestAppContext,
) -> (Entity<BranchList>, VisualTestContext) {
let window = cx.add_window(|window, cx| {
let mut delegate =
BranchListDelegate::new(None, repository, BranchListStyle::Modal, cx);
@@ -1315,7 +1309,7 @@ mod tests {
let branch_list = window.root(cx).unwrap();
let cx = VisualTestContext::from_window(*window, cx);
(cx, branch_list)
(branch_list, cx)
}
async fn init_fake_repository(cx: &mut TestAppContext) -> Entity<Repository> {
@@ -1349,7 +1343,7 @@ mod tests {
init_test(cx);
let branches = create_test_branches();
let (mut ctx, branch_list) = init_branch_list_test(cx, None, branches);
let (branch_list, mut ctx) = init_branch_list_test(None, branches, cx);
let cx = &mut ctx;
branch_list
@@ -1425,7 +1419,7 @@ mod tests {
.await;
cx.run_until_parked();
let (mut ctx, branch_list) = init_branch_list_test(cx, repository.into(), branches);
let (branch_list, mut ctx) = init_branch_list_test(repository.into(), branches, cx);
let cx = &mut ctx;
update_branch_list_matches_with_empty_query(&branch_list, cx).await;
@@ -1490,12 +1484,12 @@ mod tests {
.await;
cx.run_until_parked();
let (mut ctx, branch_list) = init_branch_list_test(cx, repository.into(), branches);
let (branch_list, mut ctx) = init_branch_list_test(repository.into(), branches, cx);
let cx = &mut ctx;
// Enable remote filter
branch_list.update(cx, |branch_list, cx| {
branch_list.picker.update(cx, |picker, _cx| {
picker.delegate.display_remotes = true;
picker.delegate.branch_filter = BranchFilter::Remote;
});
});
update_branch_list_matches_with_empty_query(&branch_list, cx).await;
@@ -1548,7 +1542,7 @@ mod tests {
create_test_branch("develop", false, None, Some(700)),
];
let (mut ctx, branch_list) = init_branch_list_test(cx, None, branches);
let (branch_list, mut ctx) = init_branch_list_test(None, branches, cx);
let cx = &mut ctx;
update_branch_list_matches_with_empty_query(&branch_list, cx).await;
@@ -1575,7 +1569,7 @@ mod tests {
let last_match = picker.delegate.matches.last().unwrap();
assert!(!last_match.is_new_branch());
assert!(!last_match.is_new_url());
picker.delegate.display_remotes = true;
picker.delegate.branch_filter = BranchFilter::Remote;
picker.delegate.update_matches(String::new(), window, cx)
})
})
@@ -1602,7 +1596,7 @@ mod tests {
// Verify the last entry is NOT the "create new branch" option
let last_match = picker.delegate.matches.last().unwrap();
assert!(!last_match.is_new_url());
picker.delegate.display_remotes = true;
picker.delegate.branch_filter = BranchFilter::Remote;
picker
.delegate
.update_matches(String::from("fork"), window, cx)
@@ -1631,22 +1625,27 @@ mod tests {
#[gpui::test]
async fn test_new_branch_creation_with_query(test_cx: &mut TestAppContext) {
const MAIN_BRANCH: &str = "main";
const FEATURE_BRANCH: &str = "feature";
const NEW_BRANCH: &str = "new-feature-branch";
init_test(test_cx);
let repository = init_fake_repository(test_cx).await;
let branches = vec![
create_test_branch("main", true, None, Some(1000)),
create_test_branch("feature", false, None, Some(900)),
create_test_branch(MAIN_BRANCH, true, None, Some(1000)),
create_test_branch(FEATURE_BRANCH, false, None, Some(900)),
];
let (mut ctx, branch_list) = init_branch_list_test(test_cx, repository.into(), branches);
let (branch_list, mut ctx) = init_branch_list_test(repository.into(), branches, test_cx);
let cx = &mut ctx;
branch_list
.update_in(cx, |branch_list, window, cx| {
branch_list.picker.update(cx, |picker, cx| {
let query = "new-feature-branch".to_string();
picker.delegate.update_matches(query, window, cx)
picker
.delegate
.update_matches(NEW_BRANCH.to_string(), window, cx)
})
})
.await;
@@ -1657,7 +1656,7 @@ mod tests {
branch_list.picker.update(cx, |picker, cx| {
let last_match = picker.delegate.matches.last().unwrap();
assert!(last_match.is_new_branch());
assert_eq!(last_match.name(), "new-feature-branch");
assert_eq!(last_match.name(), NEW_BRANCH);
// State is NewBranch because no existing branches fuzzy-match the query
assert!(matches!(picker.delegate.state, PickerState::NewBranch));
picker.delegate.confirm(false, window, cx);
@@ -1682,11 +1681,11 @@ mod tests {
let new_branch = branches
.into_iter()
.find(|branch| branch.name() == "new-feature-branch")
.find(|branch| branch.name() == NEW_BRANCH)
.expect("new-feature-branch should exist");
assert_eq!(
new_branch.ref_name.as_ref(),
"refs/heads/new-feature-branch",
&format!("refs/heads/{NEW_BRANCH}"),
"branch ref_name should not have duplicate refs/heads/ prefix"
);
}
@@ -1697,7 +1696,7 @@ mod tests {
let repository = init_fake_repository(cx).await;
let branches = vec![create_test_branch("main", true, None, Some(1000))];
let (mut ctx, branch_list) = init_branch_list_test(cx, repository.into(), branches);
let (branch_list, mut ctx) = init_branch_list_test(repository.into(), branches, cx);
let cx = &mut ctx;
branch_list
@@ -1736,8 +1735,13 @@ mod tests {
branch_list.update_in(cx, |branch_list, window, cx| {
branch_list.picker.update(cx, |picker, cx| {
assert_eq!(picker.delegate.matches.len(), 1);
assert!(matches!(
picker.delegate.matches.first(),
Some(Entry::NewRemoteName { name, url })
if name == "my_new_remote" && url.as_ref() == "https://github.com/user/repo.git"
));
picker.delegate.confirm(false, window, cx);
assert_eq!(picker.delegate.matches.len(), 0);
})
});
cx.run_until_parked();
@@ -1770,7 +1774,7 @@ mod tests {
init_test(cx);
let branches = vec![create_test_branch("main_branch", true, None, Some(1000))];
let (mut ctx, branch_list) = init_branch_list_test(cx, None, branches);
let (branch_list, mut ctx) = init_branch_list_test(None, branches, cx);
let cx = &mut ctx;
branch_list
@@ -1825,4 +1829,79 @@ mod tests {
})
});
}
#[gpui::test]
async fn test_confirm_remote_url_does_not_dismiss(cx: &mut TestAppContext) {
const REMOTE_URL: &str = "https://github.com/user/repo.git";
init_test(cx);
let branches = vec![create_test_branch("main", true, None, Some(1000))];
let (branch_list, mut ctx) = init_branch_list_test(None, branches, cx);
let cx = &mut ctx;
let subscription = cx.update(|_, cx| {
cx.subscribe(&branch_list, |_, _: &DismissEvent, _| {
panic!("DismissEvent should not be emitted when confirming a remote URL");
})
});
branch_list
.update_in(cx, |branch_list, window, cx| {
window.focus(&branch_list.picker_focus_handle);
branch_list.picker.update(cx, |picker, cx| {
picker
.delegate
.update_matches(REMOTE_URL.to_string(), window, cx)
})
})
.await;
cx.run_until_parked();
branch_list.update_in(cx, |branch_list, window, cx| {
branch_list.picker.update(cx, |picker, cx| {
let last_match = picker.delegate.matches.last().unwrap();
assert!(last_match.is_new_url());
assert!(matches!(picker.delegate.state, PickerState::NewRemote));
picker.delegate.confirm(false, window, cx);
assert!(
matches!(picker.delegate.state, PickerState::CreateRemote(ref url) if url.as_ref() == REMOTE_URL),
"State should transition to CreateRemote with the URL"
);
});
assert!(
branch_list.picker_focus_handle.is_focused(window),
"Branch list picker should still be focused after confirming remote URL"
);
});
cx.run_until_parked();
drop(subscription);
}
#[gpui::test(iterations = 10)]
async fn test_empty_query_displays_all_branches(mut rng: StdRng, cx: &mut TestAppContext) {
init_test(cx);
let branch_count = rng.random_range(13..540);
let branches: Vec<Branch> = (0..branch_count)
.map(|i| create_test_branch(&format!("branch-{:02}", i), i == 0, None, Some(i * 100)))
.collect();
let (branch_list, mut ctx) = init_branch_list_test(None, branches, cx);
let cx = &mut ctx;
update_branch_list_matches_with_empty_query(&branch_list, cx).await;
branch_list.update(cx, |branch_list, cx| {
branch_list.picker.update(cx, |picker, _cx| {
assert_eq!(picker.delegate.matches.len(), branch_count as usize);
})
});
}
}


@@ -22,8 +22,8 @@ pub use crate::{
proto,
};
use anyhow::{Context as _, Result};
use clock::Lamport;
pub use clock::ReplicaId;
use clock::{Global, Lamport};
use collections::{HashMap, HashSet};
use fs::MTime;
use futures::channel::oneshot;
@@ -33,7 +33,7 @@ use gpui::{
};
use lsp::{LanguageServerId, NumberOrString};
use parking_lot::{Mutex, RawMutex, lock_api::MutexGuard};
use parking_lot::Mutex;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use settings::WorktreeId;
@@ -130,29 +130,37 @@ pub struct Buffer {
has_unsaved_edits: Cell<(clock::Global, bool)>,
change_bits: Vec<rc::Weak<Cell<bool>>>,
_subscriptions: Vec<gpui::Subscription>,
tree_sitter_data: Arc<Mutex<TreeSitterData>>,
tree_sitter_data: Arc<TreeSitterData>,
}
#[derive(Debug, Clone)]
#[derive(Debug)]
pub struct TreeSitterData {
chunks: RowChunks,
brackets_by_chunks: Vec<Option<Vec<BracketMatch<usize>>>>,
brackets_by_chunks: Mutex<Vec<Option<Vec<BracketMatch<usize>>>>>,
}
const MAX_ROWS_IN_A_CHUNK: u32 = 50;
impl TreeSitterData {
fn clear(&mut self) {
self.brackets_by_chunks = vec![None; self.chunks.len()];
fn clear(&mut self, snapshot: text::BufferSnapshot) {
self.chunks = RowChunks::new(snapshot, MAX_ROWS_IN_A_CHUNK);
self.brackets_by_chunks.get_mut().clear();
self.brackets_by_chunks
.get_mut()
.resize(self.chunks.len(), None);
}
fn new(snapshot: text::BufferSnapshot) -> Self {
let chunks = RowChunks::new(snapshot, MAX_ROWS_IN_A_CHUNK);
Self {
brackets_by_chunks: vec![None; chunks.len()],
brackets_by_chunks: Mutex::new(vec![None; chunks.len()]),
chunks,
}
}
fn version(&self) -> &clock::Global {
self.chunks.version()
}
}
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
@@ -176,7 +184,7 @@ pub struct BufferSnapshot {
remote_selections: TreeMap<ReplicaId, SelectionSet>,
language: Option<Arc<Language>>,
non_text_state_update_count: usize,
tree_sitter_data: Arc<Mutex<TreeSitterData>>,
tree_sitter_data: Arc<TreeSitterData>,
}
/// The kind and amount of indentation in a particular line. For now,
@@ -237,8 +245,6 @@ struct SelectionSet {
pub struct Diagnostic {
/// The name of the service that produced this diagnostic.
pub source: Option<String>,
/// The ID provided by the dynamic registration that produced this diagnostic.
pub registration_id: Option<SharedString>,
/// A machine-readable code that identifies this diagnostic.
pub code: Option<NumberOrString>,
pub code_description: Option<lsp::Uri>,
@@ -1062,7 +1068,7 @@ impl Buffer {
let tree_sitter_data = TreeSitterData::new(snapshot);
Self {
saved_mtime,
tree_sitter_data: Arc::new(Mutex::new(tree_sitter_data)),
tree_sitter_data: Arc::new(tree_sitter_data),
saved_version: buffer.version(),
preview_version: buffer.version(),
reload_task: None,
@@ -1119,7 +1125,7 @@ impl Buffer {
file: None,
diagnostics: Default::default(),
remote_selections: Default::default(),
tree_sitter_data: Arc::new(Mutex::new(tree_sitter_data)),
tree_sitter_data: Arc::new(tree_sitter_data),
language,
non_text_state_update_count: 0,
}
@@ -1141,7 +1147,7 @@ impl Buffer {
BufferSnapshot {
text,
syntax,
tree_sitter_data: Arc::new(Mutex::new(tree_sitter_data)),
tree_sitter_data: Arc::new(tree_sitter_data),
file: None,
diagnostics: Default::default(),
remote_selections: Default::default(),
@@ -1170,7 +1176,7 @@ impl Buffer {
BufferSnapshot {
text,
syntax,
tree_sitter_data: Arc::new(Mutex::new(tree_sitter_data)),
tree_sitter_data: Arc::new(tree_sitter_data),
file: None,
diagnostics: Default::default(),
remote_selections: Default::default(),
@@ -1187,10 +1193,16 @@ impl Buffer {
syntax_map.interpolate(&text);
let syntax = syntax_map.snapshot();
let tree_sitter_data = if self.text.version() != *self.tree_sitter_data.version() {
Arc::new(TreeSitterData::new(text.clone()))
} else {
self.tree_sitter_data.clone()
};
BufferSnapshot {
text,
syntax,
tree_sitter_data: self.tree_sitter_data.clone(),
tree_sitter_data,
file: self.file.clone(),
remote_selections: self.remote_selections.clone(),
diagnostics: self.diagnostics.clone(),
@@ -1624,6 +1636,16 @@ impl Buffer {
self.sync_parse_timeout = timeout;
}
fn invalidate_tree_sitter_data(&mut self, snapshot: text::BufferSnapshot) {
match Arc::get_mut(&mut self.tree_sitter_data) {
Some(tree_sitter_data) => tree_sitter_data.clear(snapshot),
None => {
let tree_sitter_data = TreeSitterData::new(snapshot);
self.tree_sitter_data = Arc::new(tree_sitter_data)
}
}
}
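`invalidate_tree_sitter_data` above is a copy-on-write pattern around `Arc::get_mut`: mutate in place when the buffer is the sole owner, otherwise allocate fresh data so existing snapshots keep reading the old cache. A minimal sketch of the same pattern, with `Vec<Option<u32>>` standing in for the per-chunk bracket cache:

```rust
use std::sync::Arc;

/// Reset the cache to `new_len` empty slots. If a snapshot still holds
/// a clone of the Arc, leave its data untouched and swap in a new
/// allocation; otherwise reuse the existing one.
fn invalidate(cache: &mut Arc<Vec<Option<u32>>>, new_len: usize) {
    match Arc::get_mut(cache) {
        Some(inner) => {
            // Sole owner: clear and resize in place.
            inner.clear();
            inner.resize(new_len, None);
        }
        // Shared with a snapshot: point at a fresh cache instead.
        None => *cache = Arc::new(vec![None; new_len]),
    }
}
```

This is why the diff can drop the outer `Mutex<TreeSitterData>`: snapshots never observe in-place mutation, and only the inner `brackets_by_chunks` cache needs its own lock for lazy fills.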
/// Called after an edit to synchronize the buffer's main parse tree with
/// the buffer's new underlying state.
///
@@ -1648,6 +1670,9 @@ impl Buffer {
/// for the same buffer, we only initiate a new parse if we are not already
/// parsing in the background.
pub fn reparse(&mut self, cx: &mut Context<Self>, may_block: bool) {
if self.text.version() != *self.tree_sitter_data.version() {
self.invalidate_tree_sitter_data(self.text.snapshot());
}
if self.reparse.is_some() {
return;
}
@@ -1749,7 +1774,9 @@ impl Buffer {
self.syntax_map.lock().did_parse(syntax_snapshot);
self.request_autoindent(cx);
self.parse_status.0.send(ParseStatus::Idle).unwrap();
self.tree_sitter_data.lock().clear();
if self.text.version() != *self.tree_sitter_data.version() {
self.invalidate_tree_sitter_data(self.text.snapshot());
}
cx.emit(BufferEvent::Reparsed);
cx.notify();
}
@@ -4281,157 +4308,125 @@ impl BufferSnapshot {
pub fn fetch_bracket_ranges(
&self,
range: Range<usize>,
known_chunks: Option<(&Global, &HashSet<Range<BufferRow>>)>,
known_chunks: Option<&HashSet<Range<BufferRow>>>,
) -> HashMap<Range<BufferRow>, Vec<BracketMatch<usize>>> {
let mut tree_sitter_data = self.latest_tree_sitter_data().clone();
let known_chunks = match known_chunks {
Some((known_version, known_chunks)) => {
if !tree_sitter_data
.chunks
.version()
.changed_since(known_version)
{
known_chunks.clone()
} else {
HashSet::default()
}
}
None => HashSet::default(),
};
let mut new_bracket_matches = HashMap::default();
let mut all_bracket_matches = HashMap::default();
for chunk in tree_sitter_data
for chunk in self
.tree_sitter_data
.chunks
.applicable_chunks(&[self.anchor_before(range.start)..self.anchor_after(range.end)])
{
if known_chunks.contains(&chunk.row_range()) {
if known_chunks.is_some_and(|chunks| chunks.contains(&chunk.row_range())) {
continue;
}
let Some(chunk_range) = tree_sitter_data.chunks.chunk_range(chunk) else {
let Some(chunk_range) = self.tree_sitter_data.chunks.chunk_range(chunk) else {
continue;
};
let chunk_range = chunk_range.to_offset(&tree_sitter_data.chunks.snapshot);
let chunk_range = chunk_range.to_offset(&self);
let bracket_matches = match tree_sitter_data.brackets_by_chunks[chunk.id].take() {
Some(cached_brackets) => cached_brackets,
None => {
let mut all_brackets = Vec::new();
let mut opens = Vec::new();
let mut color_pairs = Vec::new();
if let Some(cached_brackets) =
&self.tree_sitter_data.brackets_by_chunks.lock()[chunk.id]
{
all_bracket_matches.insert(chunk.row_range(), cached_brackets.clone());
continue;
}
let mut matches =
self.syntax
.matches(chunk_range.clone(), &self.text, |grammar| {
grammar.brackets_config.as_ref().map(|c| &c.query)
});
let configs = matches
.grammars()
.iter()
.map(|grammar| grammar.brackets_config.as_ref().unwrap())
.collect::<Vec<_>>();
let mut all_brackets = Vec::new();
let mut opens = Vec::new();
let mut color_pairs = Vec::new();
while let Some(mat) = matches.peek() {
let mut open = None;
let mut close = None;
let syntax_layer_depth = mat.depth;
let config = configs[mat.grammar_index];
let pattern = &config.patterns[mat.pattern_index];
for capture in mat.captures {
if capture.index == config.open_capture_ix {
open = Some(capture.node.byte_range());
} else if capture.index == config.close_capture_ix {
close = Some(capture.node.byte_range());
}
}
let mut matches = self
.syntax
.matches(chunk_range.clone(), &self.text, |grammar| {
grammar.brackets_config.as_ref().map(|c| &c.query)
});
let configs = matches
.grammars()
.iter()
.map(|grammar| grammar.brackets_config.as_ref().unwrap())
.collect::<Vec<_>>();
matches.advance();
let Some((open_range, close_range)) = open.zip(close) else {
continue;
};
let bracket_range = open_range.start..=close_range.end;
if !bracket_range.overlaps(&chunk_range) {
continue;
}
let index = all_brackets.len();
all_brackets.push(BracketMatch {
open_range: open_range.clone(),
close_range: close_range.clone(),
newline_only: pattern.newline_only,
syntax_layer_depth,
color_index: None,
});
// Certain languages have "brackets" that are not really brackets, e.g. tags,
// and such a bracket will match the entire tag with all the text inside.
// For now, avoid highlighting any pair that has more than a single char in each bracket.
// We need to colorize `<Element/>` bracket pairs, so we cannot make this check stricter.
let should_color = !pattern.rainbow_exclude
&& (open_range.len() == 1 || close_range.len() == 1);
if should_color {
opens.push(open_range.clone());
color_pairs.push((open_range, close_range, index));
}
while let Some(mat) = matches.peek() {
let mut open = None;
let mut close = None;
let syntax_layer_depth = mat.depth;
let config = configs[mat.grammar_index];
let pattern = &config.patterns[mat.pattern_index];
for capture in mat.captures {
if capture.index == config.open_capture_ix {
open = Some(capture.node.byte_range());
} else if capture.index == config.close_capture_ix {
close = Some(capture.node.byte_range());
}
opens.sort_by_key(|r| (r.start, r.end));
opens.dedup_by(|a, b| a.start == b.start && a.end == b.end);
color_pairs.sort_by_key(|(_, close, _)| close.end);
let mut open_stack = Vec::new();
let mut open_index = 0;
for (open, close, index) in color_pairs {
while open_index < opens.len() && opens[open_index].start < close.start {
open_stack.push(opens[open_index].clone());
open_index += 1;
}
if open_stack.last() == Some(&open) {
let depth_index = open_stack.len() - 1;
all_brackets[index].color_index = Some(depth_index);
open_stack.pop();
}
}
all_brackets.sort_by_key(|bracket_match| {
(bracket_match.open_range.start, bracket_match.open_range.end)
});
new_bracket_matches.insert(chunk.id, all_brackets.clone());
all_brackets
}
};
all_bracket_matches.insert(chunk.row_range(), bracket_matches);
}
let mut latest_tree_sitter_data = self.latest_tree_sitter_data();
if latest_tree_sitter_data.chunks.version() == &self.version {
for (chunk_id, new_matches) in new_bracket_matches {
let old_chunks = &mut latest_tree_sitter_data.brackets_by_chunks[chunk_id];
if old_chunks.is_none() {
*old_chunks = Some(new_matches);
matches.advance();
let Some((open_range, close_range)) = open.zip(close) else {
continue;
};
let bracket_range = open_range.start..=close_range.end;
if !bracket_range.overlaps(&chunk_range) {
continue;
}
let index = all_brackets.len();
all_brackets.push(BracketMatch {
open_range: open_range.clone(),
close_range: close_range.clone(),
newline_only: pattern.newline_only,
syntax_layer_depth,
color_index: None,
});
// Certain languages have "brackets" that are not really brackets, e.g. tags,
// and such a bracket will match the entire tag with all the text inside.
// For now, avoid highlighting any pair that has more than a single char in each bracket.
// We need to colorize `<Element/>` bracket pairs, so we cannot make this check stricter.
let should_color =
!pattern.rainbow_exclude && (open_range.len() == 1 || close_range.len() == 1);
if should_color {
opens.push(open_range.clone());
color_pairs.push((open_range, close_range, index));
}
}
opens.sort_by_key(|r| (r.start, r.end));
opens.dedup_by(|a, b| a.start == b.start && a.end == b.end);
color_pairs.sort_by_key(|(_, close, _)| close.end);
let mut open_stack = Vec::new();
let mut open_index = 0;
for (open, close, index) in color_pairs {
while open_index < opens.len() && opens[open_index].start < close.start {
open_stack.push(opens[open_index].clone());
open_index += 1;
}
if open_stack.last() == Some(&open) {
let depth_index = open_stack.len() - 1;
all_brackets[index].color_index = Some(depth_index);
open_stack.pop();
}
}
all_brackets.sort_by_key(|bracket_match| {
(bracket_match.open_range.start, bracket_match.open_range.end)
});
if let empty_slot @ None =
&mut self.tree_sitter_data.brackets_by_chunks.lock()[chunk.id]
{
*empty_slot = Some(all_brackets.clone());
}
all_bracket_matches.insert(chunk.row_range(), all_brackets);
}
all_bracket_matches
}
fn latest_tree_sitter_data(&self) -> MutexGuard<'_, RawMutex, TreeSitterData> {
let mut tree_sitter_data = self.tree_sitter_data.lock();
if self
.version
.changed_since(tree_sitter_data.chunks.version())
{
*tree_sitter_data = TreeSitterData::new(self.text.clone());
}
tree_sitter_data
}
pub fn all_bracket_ranges(
&self,
range: Range<usize>,
@@ -5409,7 +5404,6 @@ impl Default for Diagnostic {
is_unnecessary: false,
underline: true,
data: None,
registration_id: None,
}
}
}
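The stack-based depth assignment in `fetch_bracket_ranges` (sort opens, sort pairs by closing position, then walk an open-bracket stack) can be sketched in isolation. This is a hypothetical simplified version over plain byte ranges, not the function in the diff:

```rust
use std::ops::Range;

// Sketch of rainbow-bracket depth assignment: each pair is
// (open_range, close_range); the returned vector holds the nesting depth
// for pairs whose open sits on top of the stack when their close is reached.
fn assign_depths(pairs: &[(Range<usize>, Range<usize>)]) -> Vec<Option<usize>> {
    let mut opens: Vec<Range<usize>> = pairs.iter().map(|(o, _)| o.clone()).collect();
    opens.sort_by_key(|r| (r.start, r.end));
    opens.dedup_by(|a, b| a.start == b.start && a.end == b.end);

    // Process pairs in order of their closing position.
    let mut order: Vec<usize> = (0..pairs.len()).collect();
    order.sort_by_key(|&i| pairs[i].1.end);

    let mut depths: Vec<Option<usize>> = vec![None; pairs.len()];
    let mut open_stack: Vec<Range<usize>> = Vec::new();
    let mut open_index = 0;
    for i in order {
        let (open, close) = &pairs[i];
        // Push every open bracket that starts before this close.
        while open_index < opens.len() && opens[open_index].start < close.start {
            open_stack.push(opens[open_index].clone());
            open_index += 1;
        }
        // A properly nested pair has its own open on top of the stack;
        // its depth is the stack depth at that point.
        if open_stack.last() == Some(open) {
            depths[i] = Some(open_stack.len() - 1);
            open_stack.pop();
        }
    }
    depths
}
```

For `(())` the outer pair gets depth 0 and the inner pair depth 1; mismatched or overlapping pairs simply stay `None` and keep no color index.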


@@ -19,7 +19,7 @@ use crate::BufferRow;
/// <https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#inlayHintParams>
#[derive(Clone)]
pub struct RowChunks {
pub(crate) snapshot: text::BufferSnapshot,
snapshot: text::BufferSnapshot,
chunks: Arc<[RowChunk]>,
}


@@ -3,7 +3,6 @@
use crate::{CursorShape, Diagnostic, DiagnosticSourceKind, diagnostic_set::DiagnosticEntry};
use anyhow::{Context as _, Result};
use clock::ReplicaId;
use gpui::SharedString;
use lsp::{DiagnosticSeverity, LanguageServerId};
use rpc::proto;
use serde_json::Value;
@@ -240,11 +239,6 @@ pub fn serialize_diagnostics<'a>(
is_disk_based: entry.diagnostic.is_disk_based,
is_unnecessary: entry.diagnostic.is_unnecessary,
data: entry.diagnostic.data.as_ref().map(|data| data.to_string()),
registration_id: entry
.diagnostic
.registration_id
.as_ref()
.map(ToString::to_string),
})
.collect()
}
@@ -463,7 +457,6 @@ pub fn deserialize_diagnostics(
is_disk_based: diagnostic.is_disk_based,
is_unnecessary: diagnostic.is_unnecessary,
underline: diagnostic.underline,
registration_id: diagnostic.registration_id.map(SharedString::from),
source_kind: match proto::diagnostic::SourceKind::from_i32(
diagnostic.source_kind,
)? {


@@ -278,6 +278,7 @@ impl LanguageModel for OpenAiLanguageModel {
| Model::FiveMini
| Model::FiveNano
| Model::FivePointOne
| Model::FivePointTwo
| Model::O1
| Model::O3
| Model::O4Mini => true,
@@ -675,8 +676,11 @@ pub fn count_open_ai_tokens(
| Model::O4Mini
| Model::Five
| Model::FiveMini
| Model::FiveNano => tiktoken_rs::num_tokens_from_messages(model.id(), &messages),
// GPT-5.1 doesn't have tiktoken support yet; fall back on gpt-4o tokenizer
Model::FivePointOne => tiktoken_rs::num_tokens_from_messages("gpt-5", &messages),
| Model::FiveNano => tiktoken_rs::num_tokens_from_messages(model.id(), &messages),
// GPT-5.1 and 5.2 don't have dedicated tiktoken support; use gpt-5 tokenizer
Model::FivePointOne | Model::FivePointTwo => {
tiktoken_rs::num_tokens_from_messages("gpt-5", &messages)
}
}
.map(|tokens| tokens as u64)
})


@@ -1,7 +1,6 @@
((comment) @injection.content
(#match? @injection.content "^(///|//!|/\\*\\*|/\\*!)(.*)")
(#set! injection.language "doxygen")
(#set! injection.include-children))
(#set! injection.language "comment")
)
(preproc_def
value: (preproc_arg) @injection.content


@@ -1,7 +1,6 @@
((comment) @injection.content
(#match? @injection.content "^(///|//!|/\\*\\*|/\\*!)(.*)")
(#set! injection.language "doxygen")
(#set! injection.include-children))
(#set! injection.language "comment")
)
(preproc_def
value: (preproc_arg) @injection.content


@@ -87,6 +87,8 @@ pub enum Model {
FiveNano,
#[serde(rename = "gpt-5.1")]
FivePointOne,
#[serde(rename = "gpt-5.2")]
FivePointTwo,
#[serde(rename = "custom")]
Custom {
name: String,
@@ -123,6 +125,7 @@ impl Model {
"gpt-5-mini" => Ok(Self::FiveMini),
"gpt-5-nano" => Ok(Self::FiveNano),
"gpt-5.1" => Ok(Self::FivePointOne),
"gpt-5.2" => Ok(Self::FivePointTwo),
invalid_id => anyhow::bail!("invalid model id '{invalid_id}'"),
}
}
@@ -145,6 +148,7 @@ impl Model {
Self::FiveMini => "gpt-5-mini",
Self::FiveNano => "gpt-5-nano",
Self::FivePointOne => "gpt-5.1",
Self::FivePointTwo => "gpt-5.2",
Self::Custom { name, .. } => name,
}
}
@@ -167,6 +171,7 @@ impl Model {
Self::FiveMini => "gpt-5-mini",
Self::FiveNano => "gpt-5-nano",
Self::FivePointOne => "gpt-5.1",
Self::FivePointTwo => "gpt-5.2",
Self::Custom {
name, display_name, ..
} => display_name.as_ref().unwrap_or(name),
@@ -191,6 +196,7 @@ impl Model {
Self::FiveMini => 272_000,
Self::FiveNano => 272_000,
Self::FivePointOne => 400_000,
Self::FivePointTwo => 400_000,
Self::Custom { max_tokens, .. } => *max_tokens,
}
}
@@ -216,6 +222,7 @@ impl Model {
Self::FiveMini => Some(128_000),
Self::FiveNano => Some(128_000),
Self::FivePointOne => Some(128_000),
Self::FivePointTwo => Some(128_000),
}
}
@@ -244,6 +251,7 @@ impl Model {
| Self::Five
| Self::FiveMini
| Self::FivePointOne
| Self::FivePointTwo
| Self::FiveNano => true,
Self::O1 | Self::O3 | Self::O3Mini | Self::O4Mini | Model::Custom { .. } => false,
}


@@ -4692,11 +4692,9 @@ impl Repository {
});
let this = cx.weak_entity();
let rx = self.run_hook(RunHook::PrePush, cx);
self.send_job(
Some(format!("git push {} {} {}", args, remote, branch).into()),
move |git_repo, mut cx| async move {
rx.await??;
match git_repo {
RepositoryState::Local(LocalRepositoryState {
backend,


@@ -14,7 +14,7 @@ use client::proto::{self, PeerId};
use clock::Global;
use collections::{HashMap, HashSet};
use futures::future;
use gpui::{App, AsyncApp, Entity, SharedString, Task};
use gpui::{App, AsyncApp, Entity, Task};
use language::{
Anchor, Bias, Buffer, BufferSnapshot, CachedLspAdapter, CharKind, CharScopeContext,
OffsetRangeExt, PointUtf16, ToOffset, ToPointUtf16, Transaction, Unclipped,
@@ -26,8 +26,8 @@ use language::{
use lsp::{
AdapterServerCapabilities, CodeActionKind, CodeActionOptions, CodeDescription,
CompletionContext, CompletionListItemDefaultsEditRange, CompletionTriggerKind,
DocumentHighlightKind, LanguageServer, LanguageServerId, LinkedEditingRangeServerCapabilities,
OneOf, RenameOptions, ServerCapabilities,
DiagnosticServerCapabilities, DocumentHighlightKind, LanguageServer, LanguageServerId,
LinkedEditingRangeServerCapabilities, OneOf, RenameOptions, ServerCapabilities,
};
use serde_json::Value;
use signature_help::{lsp_to_proto_signature, proto_to_lsp_signature};
@@ -265,9 +265,8 @@ pub(crate) struct LinkedEditingRange {
pub(crate) struct GetDocumentDiagnostics {
/// We cannot blindly rely on the server's capabilities.diagnostic_provider, as it is a single field, whereas
/// a server can dynamically register multiple diagnostic providers after initialization.
pub registration_id: Option<SharedString>,
pub identifier: Option<String>,
pub previous_result_id: Option<SharedString>,
pub dynamic_caps: DiagnosticServerCapabilities,
pub previous_result_id: Option<String>,
}
#[async_trait(?Send)]
@@ -3756,16 +3755,15 @@ impl GetDocumentDiagnostics {
.into_iter()
.filter_map(|diagnostics| {
Some(LspPullDiagnostics::Response {
registration_id: diagnostics.registration_id.map(SharedString::from),
server_id: LanguageServerId::from_proto(diagnostics.server_id),
uri: lsp::Uri::from_str(diagnostics.uri.as_str()).log_err()?,
diagnostics: if diagnostics.changed {
PulledDiagnostics::Unchanged {
result_id: SharedString::new(diagnostics.result_id?),
result_id: diagnostics.result_id?,
}
} else {
PulledDiagnostics::Changed {
result_id: diagnostics.result_id.map(SharedString::new),
result_id: diagnostics.result_id,
diagnostics: diagnostics
.diagnostics
.into_iter()
@@ -3929,7 +3927,6 @@ impl GetDocumentDiagnostics {
pub fn deserialize_workspace_diagnostics_report(
report: lsp::WorkspaceDiagnosticReportResult,
server_id: LanguageServerId,
registration_id: Option<SharedString>,
) -> Vec<WorkspaceLspPullDiagnostics> {
let mut pulled_diagnostics = HashMap::default();
match report {
@@ -3941,7 +3938,6 @@ impl GetDocumentDiagnostics {
&mut pulled_diagnostics,
server_id,
report,
registration_id.clone(),
)
}
lsp::WorkspaceDocumentDiagnosticReport::Unchanged(report) => {
@@ -3949,7 +3945,6 @@ impl GetDocumentDiagnostics {
&mut pulled_diagnostics,
server_id,
report,
registration_id.clone(),
)
}
}
@@ -3965,7 +3960,6 @@ impl GetDocumentDiagnostics {
&mut pulled_diagnostics,
server_id,
report,
registration_id.clone(),
)
}
lsp::WorkspaceDocumentDiagnosticReport::Unchanged(report) => {
@@ -3973,7 +3967,6 @@ impl GetDocumentDiagnostics {
&mut pulled_diagnostics,
server_id,
report,
registration_id.clone(),
)
}
}
@@ -3994,7 +3987,6 @@ fn process_full_workspace_diagnostics_report(
diagnostics: &mut HashMap<lsp::Uri, WorkspaceLspPullDiagnostics>,
server_id: LanguageServerId,
report: lsp::WorkspaceFullDocumentDiagnosticReport,
registration_id: Option<SharedString>,
) {
let mut new_diagnostics = HashMap::default();
process_full_diagnostics_report(
@@ -4002,7 +3994,6 @@ fn process_full_workspace_diagnostics_report(
server_id,
report.uri,
report.full_document_diagnostic_report,
registration_id,
);
diagnostics.extend(new_diagnostics.into_iter().map(|(uri, diagnostics)| {
(
@@ -4019,7 +4010,6 @@ fn process_unchanged_workspace_diagnostics_report(
diagnostics: &mut HashMap<lsp::Uri, WorkspaceLspPullDiagnostics>,
server_id: LanguageServerId,
report: lsp::WorkspaceUnchangedDocumentDiagnosticReport,
registration_id: Option<SharedString>,
) {
let mut new_diagnostics = HashMap::default();
process_unchanged_diagnostics_report(
@@ -4027,7 +4017,6 @@ fn process_unchanged_workspace_diagnostics_report(
server_id,
report.uri,
report.unchanged_document_diagnostic_report,
registration_id,
);
diagnostics.extend(new_diagnostics.into_iter().map(|(uri, diagnostics)| {
(
@@ -4061,12 +4050,19 @@ impl LspCommand for GetDocumentDiagnostics {
_: &Arc<LanguageServer>,
_: &App,
) -> Result<lsp::DocumentDiagnosticParams> {
let identifier = match &self.dynamic_caps {
lsp::DiagnosticServerCapabilities::Options(options) => options.identifier.clone(),
lsp::DiagnosticServerCapabilities::RegistrationOptions(options) => {
options.diagnostic_options.identifier.clone()
}
};
Ok(lsp::DocumentDiagnosticParams {
text_document: lsp::TextDocumentIdentifier {
uri: file_path_to_lsp_url(path)?,
},
identifier: self.identifier.clone(),
previous_result_id: self.previous_result_id.clone().map(|id| id.to_string()),
identifier,
previous_result_id: self.previous_result_id.clone(),
partial_result_params: Default::default(),
work_done_progress_params: Default::default(),
})
@@ -4101,7 +4097,6 @@ impl LspCommand for GetDocumentDiagnostics {
&mut pulled_diagnostics,
server_id,
related_documents,
self.registration_id.clone(),
);
}
process_full_diagnostics_report(
@@ -4109,7 +4104,6 @@ impl LspCommand for GetDocumentDiagnostics {
server_id,
url,
report.full_document_diagnostic_report,
self.registration_id,
);
}
lsp::DocumentDiagnosticReport::Unchanged(report) => {
@@ -4118,7 +4112,6 @@ impl LspCommand for GetDocumentDiagnostics {
&mut pulled_diagnostics,
server_id,
related_documents,
self.registration_id.clone(),
);
}
process_unchanged_diagnostics_report(
@@ -4126,7 +4119,6 @@ impl LspCommand for GetDocumentDiagnostics {
server_id,
url,
report.unchanged_document_diagnostic_report,
self.registration_id,
);
}
},
@@ -4136,7 +4128,6 @@ impl LspCommand for GetDocumentDiagnostics {
&mut pulled_diagnostics,
server_id,
related_documents,
self.registration_id,
);
}
}
@@ -4179,7 +4170,6 @@ impl LspCommand for GetDocumentDiagnostics {
server_id,
uri,
diagnostics,
registration_id,
} => {
let mut changed = false;
let (diagnostics, result_id) = match diagnostics {
@@ -4194,7 +4184,7 @@ impl LspCommand for GetDocumentDiagnostics {
};
Some(proto::PulledDiagnostics {
changed,
result_id: result_id.map(|id| id.to_string()),
result_id,
uri: uri.to_string(),
server_id: server_id.to_proto(),
diagnostics: diagnostics
@@ -4205,7 +4195,6 @@ impl LspCommand for GetDocumentDiagnostics {
.log_err()
})
.collect(),
registration_id: registration_id.as_ref().map(ToString::to_string),
})
}
})
@@ -4376,25 +4365,14 @@ fn process_related_documents(
diagnostics: &mut HashMap<lsp::Uri, LspPullDiagnostics>,
server_id: LanguageServerId,
documents: impl IntoIterator<Item = (lsp::Uri, lsp::DocumentDiagnosticReportKind)>,
registration_id: Option<SharedString>,
) {
for (url, report_kind) in documents {
match report_kind {
lsp::DocumentDiagnosticReportKind::Full(report) => process_full_diagnostics_report(
diagnostics,
server_id,
url,
report,
registration_id.clone(),
),
lsp::DocumentDiagnosticReportKind::Full(report) => {
process_full_diagnostics_report(diagnostics, server_id, url, report)
}
lsp::DocumentDiagnosticReportKind::Unchanged(report) => {
process_unchanged_diagnostics_report(
diagnostics,
server_id,
url,
report,
registration_id.clone(),
)
process_unchanged_diagnostics_report(diagnostics, server_id, url, report)
}
}
}
@@ -4405,9 +4383,8 @@ fn process_unchanged_diagnostics_report(
server_id: LanguageServerId,
uri: lsp::Uri,
report: lsp::UnchangedDocumentDiagnosticReport,
registration_id: Option<SharedString>,
) {
let result_id = SharedString::new(report.result_id);
let result_id = report.result_id;
match diagnostics.entry(uri.clone()) {
hash_map::Entry::Occupied(mut o) => match o.get_mut() {
LspPullDiagnostics::Default => {
@@ -4415,14 +4392,12 @@ fn process_unchanged_diagnostics_report(
server_id,
uri,
diagnostics: PulledDiagnostics::Unchanged { result_id },
registration_id,
});
}
LspPullDiagnostics::Response {
server_id: existing_server_id,
uri: existing_uri,
diagnostics: existing_diagnostics,
..
} => {
if server_id != *existing_server_id || &uri != existing_uri {
debug_panic!(
@@ -4442,7 +4417,6 @@ fn process_unchanged_diagnostics_report(
server_id,
uri,
diagnostics: PulledDiagnostics::Unchanged { result_id },
registration_id,
});
}
}
@@ -4453,9 +4427,8 @@ fn process_full_diagnostics_report(
server_id: LanguageServerId,
uri: lsp::Uri,
report: lsp::FullDocumentDiagnosticReport,
registration_id: Option<SharedString>,
) {
let result_id = report.result_id.map(SharedString::new);
let result_id = report.result_id;
match diagnostics.entry(uri.clone()) {
hash_map::Entry::Occupied(mut o) => match o.get_mut() {
LspPullDiagnostics::Default => {
@@ -4466,14 +4439,12 @@ fn process_full_diagnostics_report(
result_id,
diagnostics: report.items,
},
registration_id,
});
}
LspPullDiagnostics::Response {
server_id: existing_server_id,
uri: existing_uri,
diagnostics: existing_diagnostics,
..
} => {
if server_id != *existing_server_id || &uri != existing_uri {
debug_panic!(
@@ -4507,7 +4478,6 @@ fn process_full_diagnostics_report(
result_id,
diagnostics: report.items,
},
registration_id,
});
}
}


@@ -116,7 +116,6 @@ use std::{
atomic::{self, AtomicUsize},
},
time::{Duration, Instant},
vec,
};
use sum_tree::Dimensions;
use text::{Anchor, BufferId, LineEnding, OffsetRangeExt, ToPoint as _};
@@ -230,8 +229,7 @@ struct LanguageServerSeed {
#[derive(Debug)]
pub struct DocumentDiagnosticsUpdate<'a, D> {
pub diagnostics: D,
pub result_id: Option<SharedString>,
pub registration_id: Option<SharedString>,
pub result_id: Option<String>,
pub server_id: LanguageServerId,
pub disk_based_sources: Cow<'a, [String]>,
}
@@ -285,14 +283,7 @@ pub struct LocalLspStore {
lsp_tree: LanguageServerTree,
registered_buffers: HashMap<BufferId, usize>,
buffers_opened_in_servers: HashMap<BufferId, HashSet<LanguageServerId>>,
buffer_pull_diagnostics_result_ids: HashMap<
LanguageServerId,
HashMap<Option<SharedString>, HashMap<PathBuf, Option<SharedString>>>,
>,
workspace_pull_diagnostics_result_ids: HashMap<
LanguageServerId,
HashMap<Option<SharedString>, HashMap<PathBuf, Option<SharedString>>>,
>,
buffer_pull_diagnostics_result_ids: HashMap<LanguageServerId, HashMap<PathBuf, Option<String>>>,
}
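After this revert, the pull-diagnostics result-id cache drops the extra per-registration-id level and keys only by server and document path. A minimal sketch of the flattened shape (the type alias and helper are hypothetical illustrations, not Zed's API):

```rust
use std::collections::HashMap;
use std::path::PathBuf;

// Placeholder for the real `LanguageServerId` newtype.
type LanguageServerId = u64;

// Result ids cached per server, then per document path. A `None` value
// means the server reported no result id for that document.
type ResultIdCache = HashMap<LanguageServerId, HashMap<PathBuf, Option<String>>>;

// Record the latest result id a server reported for a document.
fn remember_result_id(
    cache: &mut ResultIdCache,
    server: LanguageServerId,
    path: PathBuf,
    result_id: Option<String>,
) {
    cache.entry(server).or_default().insert(path, result_id);
}
```

Removing a server's entry (as `remove_language_server` does in the diff) drops all of its cached result ids in one `HashMap::remove` call.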
impl LocalLspStore {
@@ -694,7 +685,6 @@ impl LocalLspStore {
disk_based_sources: Cow::Borrowed(
&adapter.disk_based_diagnostic_sources,
),
registration_id: None,
}],
|_, diagnostic, cx| match diagnostic.source_kind {
DiagnosticSourceKind::Other | DiagnosticSourceKind::Pushed => {
@@ -2266,9 +2256,8 @@ impl LocalLspStore {
server_id,
None,
None,
None,
Vec::new(),
diagnostics,
Vec::new(),
cx,
)
.log_err();
@@ -2346,8 +2335,7 @@ impl LocalLspStore {
&mut self,
buffer: &Entity<Buffer>,
server_id: LanguageServerId,
registration_id: Option<Option<SharedString>>,
result_id: Option<SharedString>,
result_id: Option<String>,
version: Option<i32>,
new_diagnostics: Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
reused_diagnostics: Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
@@ -2420,15 +2408,11 @@ impl LocalLspStore {
let set = DiagnosticSet::new(sanitized_diagnostics, &snapshot);
buffer.update(cx, |buffer, cx| {
if let Some(registration_id) = registration_id {
if let Some(abs_path) = File::from_dyn(buffer.file()).map(|f| f.abs_path(cx)) {
self.buffer_pull_diagnostics_result_ids
.entry(server_id)
.or_default()
.entry(registration_id)
.or_default()
.insert(abs_path, result_id);
}
if let Some(abs_path) = File::from_dyn(buffer.file()).map(|f| f.abs_path(cx)) {
self.buffer_pull_diagnostics_result_ids
.entry(server_id)
.or_default()
.insert(abs_path, result_id);
}
buffer.update_diagnostics(server_id, set, cx)
@@ -3282,8 +3266,6 @@ impl LocalLspStore {
self.language_servers.remove(server_id_to_remove);
self.buffer_pull_diagnostics_result_ids
.remove(server_id_to_remove);
self.workspace_pull_diagnostics_result_ids
.remove(server_id_to_remove);
for buffer_servers in self.buffers_opened_in_servers.values_mut() {
buffer_servers.remove(server_id_to_remove);
}
@@ -3970,7 +3952,6 @@ impl LspStore {
registered_buffers: HashMap::default(),
buffers_opened_in_servers: HashMap::default(),
buffer_pull_diagnostics_result_ids: HashMap::default(),
workspace_pull_diagnostics_result_ids: HashMap::default(),
watched_manifest_filenames: ManifestProvidersStore::global(cx)
.manifest_file_names(),
}),
@@ -4244,50 +4225,9 @@ impl LspStore {
lsp_store.lsp_data.remove(&buffer_id);
let local = lsp_store.as_local_mut().unwrap();
local.registered_buffers.remove(&buffer_id);
local.buffers_opened_in_servers.remove(&buffer_id);
if let Some(file) = File::from_dyn(buffer.read(cx).file()).cloned() {
local.unregister_old_buffer_from_language_servers(buffer, &file, cx);
let buffer_abs_path = file.abs_path(cx);
for (_, buffer_pull_diagnostics_result_ids) in
&mut local.buffer_pull_diagnostics_result_ids
{
buffer_pull_diagnostics_result_ids.retain(
|_, buffer_result_ids| {
buffer_result_ids.remove(&buffer_abs_path);
!buffer_result_ids.is_empty()
},
);
}
let diagnostic_updates = local
.language_servers
.keys()
.cloned()
.map(|server_id| DocumentDiagnosticsUpdate {
diagnostics: DocumentDiagnostics {
document_abs_path: buffer_abs_path.clone(),
version: None,
diagnostics: Vec::new(),
},
result_id: None,
registration_id: None,
server_id: server_id,
disk_based_sources: Cow::Borrowed(&[]),
})
.collect::<Vec<_>>();
lsp_store
.merge_diagnostic_entries(
diagnostic_updates,
|_, diagnostic, _| {
diagnostic.source_kind != DiagnosticSourceKind::Pulled
},
cx,
)
.context("Clearing diagnostics for the closed buffer")
.log_err();
}
}
})
@@ -6760,11 +6700,9 @@ impl LspStore {
};
assert!(any_server_has_diagnostics_provider);
let identifier = buffer_diagnostic_identifier(&dynamic_caps);
let request = GetDocumentDiagnostics {
previous_result_id: None,
identifier,
registration_id: None,
dynamic_caps,
};
let request_task = client.request_lsp(
upstream_project_id,
@@ -6797,27 +6735,19 @@ impl LspStore {
.language_server_dynamic_registrations
.get(&server_id)
.into_iter()
.flat_map(|registrations| registrations.diagnostics.clone())
.flat_map(|registrations| registrations.diagnostics.values().cloned())
.collect::<Vec<_>>();
Some(
providers_with_identifiers
.into_iter()
.map(|(registration_id, dynamic_caps)| {
let identifier = buffer_diagnostic_identifier(&dynamic_caps);
let registration_id = registration_id.map(SharedString::from);
let result_id = self.result_id_for_buffer_pull(
server_id,
buffer_id,
&registration_id,
cx,
);
.map(|dynamic_caps| {
let result_id = self.result_id(server_id, buffer_id, cx);
self.request_lsp(
buffer.clone(),
LanguageServerToQuery::Other(server_id),
GetDocumentDiagnostics {
previous_result_id: result_id,
registration_id,
identifier,
dynamic_caps,
},
cx,
)
@@ -7182,7 +7112,8 @@ impl LspStore {
return;
}
let mut unchanged_buffers = HashMap::default();
let mut unchanged_buffers = HashSet::default();
let mut changed_buffers = HashSet::default();
let server_diagnostics_updates = diagnostics
.into_iter()
.filter_map(|diagnostics_set| match diagnostics_set {
@@ -7190,25 +7121,24 @@ impl LspStore {
server_id,
uri,
diagnostics,
registration_id,
} => Some((server_id, uri, diagnostics, registration_id)),
} => Some((server_id, uri, diagnostics)),
LspPullDiagnostics::Default => None,
})
.fold(
HashMap::default(),
|mut acc, (server_id, uri, diagnostics, new_registration_id)| {
|mut acc, (server_id, uri, diagnostics)| {
let (result_id, diagnostics) = match diagnostics {
PulledDiagnostics::Unchanged { result_id } => {
unchanged_buffers
.entry(new_registration_id.clone())
.or_insert_with(HashSet::default)
.insert(uri.clone());
unchanged_buffers.insert(uri.clone());
(Some(result_id), Vec::new())
}
PulledDiagnostics::Changed {
result_id,
diagnostics,
} => (result_id, diagnostics),
} => {
changed_buffers.insert(uri.clone());
(result_id, diagnostics)
}
};
let disk_based_sources = Cow::Owned(
lsp_store
@@ -7218,11 +7148,8 @@ impl LspStore {
.unwrap_or(&[])
.to_vec(),
);
acc.entry(server_id)
.or_insert_with(HashMap::default)
.entry(new_registration_id.clone())
.or_insert_with(Vec::new)
.push(DocumentDiagnosticsUpdate {
acc.entry(server_id).or_insert_with(Vec::new).push(
DocumentDiagnosticsUpdate {
server_id,
diagnostics: lsp::PublishDiagnosticsParams {
uri,
@@ -7231,35 +7158,37 @@ impl LspStore {
},
result_id,
disk_based_sources,
registration_id: new_registration_id,
});
},
);
acc
},
);
for diagnostic_updates in server_diagnostics_updates.into_values() {
for (registration_id, diagnostic_updates) in diagnostic_updates {
lsp_store
.merge_lsp_diagnostics(
DiagnosticSourceKind::Pulled,
diagnostic_updates,
|document_uri, old_diagnostic, _| match old_diagnostic.source_kind {
DiagnosticSourceKind::Pulled => {
old_diagnostic.registration_id != registration_id
|| unchanged_buffers
.get(&old_diagnostic.registration_id)
.is_some_and(|unchanged_buffers| {
unchanged_buffers.contains(&document_uri)
})
}
DiagnosticSourceKind::Other | DiagnosticSourceKind::Pushed => {
true
}
},
cx,
)
.log_err();
}
lsp_store
.merge_lsp_diagnostics(
DiagnosticSourceKind::Pulled,
diagnostic_updates,
|buffer, old_diagnostic, cx| {
File::from_dyn(buffer.file())
.and_then(|file| {
let abs_path = file.as_local()?.abs_path(cx);
lsp::Uri::from_file_path(abs_path).ok()
})
.is_none_or(|buffer_uri| {
unchanged_buffers.contains(&buffer_uri)
|| match old_diagnostic.source_kind {
DiagnosticSourceKind::Pulled => {
!changed_buffers.contains(&buffer_uri)
}
DiagnosticSourceKind::Other
| DiagnosticSourceKind::Pushed => true,
}
})
},
cx,
)
.log_err();
}
})
})
@@ -8266,7 +8195,7 @@ impl LspStore {
&mut self,
server_id: LanguageServerId,
abs_path: PathBuf,
result_id: Option<SharedString>,
result_id: Option<String>,
version: Option<i32>,
diagnostics: Vec<DiagnosticEntry<Unclipped<PointUtf16>>>,
cx: &mut Context<Self>,
@@ -8281,7 +8210,6 @@ impl LspStore {
result_id,
server_id,
disk_based_sources: Cow::Borrowed(&[]),
registration_id: None,
}],
|_, _, _| false,
cx,
@@ -8292,7 +8220,7 @@ impl LspStore {
pub fn merge_diagnostic_entries<'a>(
&mut self,
diagnostic_updates: Vec<DocumentDiagnosticsUpdate<'a, DocumentDiagnostics>>,
merge: impl Fn(&lsp::Uri, &Diagnostic, &App) -> bool + Clone,
merge: impl Fn(&Buffer, &Diagnostic, &App) -> bool + Clone,
cx: &mut Context<Self>,
) -> anyhow::Result<()> {
let mut diagnostics_summary = None::<proto::UpdateDiagnosticSummary>;
@@ -8313,15 +8241,13 @@ impl LspStore {
path: relative_path,
};
let document_uri = lsp::Uri::from_file_path(abs_path)
.map_err(|()| anyhow!("Failed to convert buffer path {abs_path:?} to lsp Uri"))?;
if let Some(buffer_handle) = self.buffer_store.read(cx).get_by_path(&project_path) {
let snapshot = buffer_handle.read(cx).snapshot();
let buffer = buffer_handle.read(cx);
let reused_diagnostics = buffer
.buffer_diagnostics(Some(server_id))
.iter()
.filter(|v| merge(&document_uri, &v.diagnostic, cx))
.filter(|v| merge(buffer, &v.diagnostic, cx))
.map(|v| {
let start = Unclipped(v.range.start.to_point_utf16(&snapshot));
let end = Unclipped(v.range.end.to_point_utf16(&snapshot));
@@ -8337,7 +8263,6 @@ impl LspStore {
.update_buffer_diagnostics(
&buffer_handle,
server_id,
Some(update.registration_id),
update.result_id,
update.diagnostics.version,
update.diagnostics.diagnostics.clone(),
@@ -8346,25 +8271,6 @@ impl LspStore {
)?;
update.diagnostics.diagnostics.extend(reused_diagnostics);
} else if let Some(local) = self.as_local() {
let reused_diagnostics = local
.diagnostics
.get(&worktree_id)
.and_then(|diagnostics_for_tree| diagnostics_for_tree.get(&project_path.path))
.and_then(|diagnostics_by_server_id| {
diagnostics_by_server_id
.binary_search_by_key(&server_id, |e| e.0)
.ok()
.map(|ix| &diagnostics_by_server_id[ix].1)
})
.into_iter()
.flatten()
.filter(|v| merge(&document_uri, &v.diagnostic, cx));
update
.diagnostics
.diagnostics
.extend(reused_diagnostics.cloned());
}
let updated = worktree.update(cx, |worktree, cx| {
@@ -8449,7 +8355,7 @@ impl LspStore {
.unwrap_or_default();
let new_summary = DiagnosticSummary::new(&diagnostics);
if diagnostics.is_empty() {
if new_summary.is_empty() {
if let Some(diagnostics_by_server_id) = diagnostics_for_tree.get_mut(&path_in_worktree)
{
if let Ok(ix) = diagnostics_by_server_id.binary_search_by_key(&server_id, |e| e.0) {
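The hunks above look up a server's diagnostics with `binary_search_by_key` over a `Vec` kept sorted by server id. A standalone sketch of that access pattern, with hypothetical stand-in types (`u64` server ids, `String` diagnostics) in place of the real `LanguageServerId`/`DiagnosticEntry`:

```rust
type ServerId = u64;

// Per-path diagnostics are stored as a Vec sorted by server id, so a
// lookup is a binary search rather than a HashMap access.
fn diagnostics_for_server(
    by_server: &[(ServerId, Vec<String>)],
    server_id: ServerId,
) -> Option<&Vec<String>> {
    by_server
        .binary_search_by_key(&server_id, |e| e.0)
        .ok()
        .map(|ix| &by_server[ix].1)
}

fn main() {
    let by_server = vec![
        (1u64, vec!["warning: unused variable".to_string()]),
        (3, vec!["error: mismatched types".to_string()]),
    ];
    assert_eq!(diagnostics_for_server(&by_server, 3).map(Vec::len), Some(1));
    assert!(diagnostics_for_server(&by_server, 2).is_none());
}
```

The sorted-`Vec` layout trades O(log n) lookups for cheap iteration over all servers, which fits the small number of servers per path.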
@@ -9759,7 +9665,7 @@ impl LspStore {
);
}
lsp::ProgressParamsValue::WorkspaceDiagnostic(report) => {
let registration_id = match progress_params.token {
let identifier = match progress_params.token {
lsp::NumberOrString::Number(_) => None,
lsp::NumberOrString::String(token) => token
.split_once(WORKSPACE_DIAGNOSTICS_TOKEN_START)
@@ -9772,15 +9678,10 @@ impl LspStore {
.as_local_mut()
.and_then(|local| local.language_servers.get_mut(&language_server_id))
&& let Some(workspace_diagnostics) =
workspace_diagnostics_refresh_tasks.get_mut(&registration_id)
workspace_diagnostics_refresh_tasks.get_mut(&identifier)
{
workspace_diagnostics.progress_tx.try_send(()).ok();
self.apply_workspace_diagnostic_report(
language_server_id,
report,
registration_id.map(SharedString::from),
cx,
)
self.apply_workspace_diagnostic_report(language_server_id, report, cx)
}
}
}
@@ -11040,7 +10941,7 @@ impl LspStore {
&mut self,
server_id: LanguageServerId,
diagnostics: lsp::PublishDiagnosticsParams,
result_id: Option<SharedString>,
result_id: Option<String>,
source_kind: DiagnosticSourceKind,
disk_based_sources: &[String],
cx: &mut Context<Self>,
@@ -11052,7 +10953,6 @@ impl LspStore {
result_id,
server_id,
disk_based_sources: Cow::Borrowed(disk_based_sources),
registration_id: None,
}],
|_, _, _| false,
cx,
@@ -11063,7 +10963,7 @@ impl LspStore {
&mut self,
source_kind: DiagnosticSourceKind,
lsp_diagnostics: Vec<DocumentDiagnosticsUpdate<lsp::PublishDiagnosticsParams>>,
merge: impl Fn(&lsp::Uri, &Diagnostic, &App) -> bool + Clone,
merge: impl Fn(&Buffer, &Diagnostic, &App) -> bool + Clone,
cx: &mut Context<Self>,
) -> Result<()> {
anyhow::ensure!(self.mode.is_local(), "called update_diagnostics on remote");
@@ -11078,12 +10978,10 @@ impl LspStore {
update.server_id,
update.diagnostics,
&update.disk_based_sources,
update.registration_id.clone(),
),
result_id: update.result_id,
server_id: update.server_id,
disk_based_sources: update.disk_based_sources,
registration_id: update.registration_id,
})
})
.collect();
@@ -11098,7 +10996,6 @@ impl LspStore {
server_id: LanguageServerId,
mut lsp_diagnostics: lsp::PublishDiagnosticsParams,
disk_based_sources: &[String],
registration_id: Option<SharedString>,
) -> DocumentDiagnostics {
let mut diagnostics = Vec::default();
let mut primary_diagnostic_group_ids = HashMap::default();
@@ -11172,7 +11069,6 @@ impl LspStore {
is_unnecessary,
underline,
data: diagnostic.data.clone(),
registration_id: registration_id.clone(),
},
});
if let Some(infos) = &diagnostic.related_information {
@@ -11200,7 +11096,6 @@ impl LspStore {
is_unnecessary: false,
underline,
data: diagnostic.data.clone(),
registration_id: registration_id.clone(),
},
});
}
@@ -11950,22 +11845,18 @@ impl LspStore {
}
if let Some(local) = self.as_local_mut() {
local.buffer_pull_diagnostics_result_ids.remove(&for_server);
local
.workspace_pull_diagnostics_result_ids
.remove(&for_server);
for buffer_servers in local.buffers_opened_in_servers.values_mut() {
buffer_servers.remove(&for_server);
}
}
}
pub fn result_id_for_buffer_pull(
pub fn result_id(
&self,
server_id: LanguageServerId,
buffer_id: BufferId,
registration_id: &Option<SharedString>,
cx: &App,
) -> Option<SharedString> {
) -> Option<String> {
let abs_path = self
.buffer_store
.read(cx)
@@ -11975,40 +11866,20 @@ impl LspStore {
self.as_local()?
.buffer_pull_diagnostics_result_ids
.get(&server_id)?
.get(registration_id)?
.get(&abs_path)?
.clone()
}
/// Gets all result_ids for a workspace diagnostics pull request.
/// First, it tries to find the buffer's result_id retrieved via the document diagnostics pull; if that fails, it falls back to the workspace diagnostics pull result_id.
/// The latter has lower priority, as we keep pulling diagnostics for open buffers eagerly.
pub fn result_ids_for_workspace_refresh(
&self,
server_id: LanguageServerId,
registration_id: &Option<SharedString>,
) -> HashMap<PathBuf, SharedString> {
pub fn all_result_ids(&self, server_id: LanguageServerId) -> HashMap<PathBuf, String> {
let Some(local) = self.as_local() else {
return HashMap::default();
};
local
.workspace_pull_diagnostics_result_ids
.buffer_pull_diagnostics_result_ids
.get(&server_id)
.into_iter()
.filter_map(|diagnostics| diagnostics.get(registration_id))
.flatten()
.filter_map(|(abs_path, result_id)| {
let result_id = local
.buffer_pull_diagnostics_result_ids
.get(&server_id)
.and_then(|buffer_ids_result_ids| {
buffer_ids_result_ids.get(registration_id)?.get(abs_path)
})
.cloned()
.flatten()
.or_else(|| result_id.clone())?;
Some((abs_path.clone(), result_id))
})
.filter_map(|(abs_path, result_id)| Some((abs_path.clone(), result_id.clone()?)))
.collect()
}
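The removed `result_ids_for_workspace_refresh` implemented the priority rule its doc comment describes by merging two maps. A minimal sketch of that rule, using plain `String` paths and ids in place of `PathBuf`/`SharedString` (all names here are illustrative):

```rust
use std::collections::HashMap;

// Priority rule: a result_id recorded by a per-buffer document pull wins
// over the one recorded by a workspace pull for the same path.
fn merged_result_ids(
    workspace_ids: &HashMap<String, String>,
    buffer_ids: &HashMap<String, String>,
) -> HashMap<String, String> {
    workspace_ids
        .iter()
        .map(|(path, workspace_id)| {
            let id = buffer_ids.get(path).unwrap_or(workspace_id);
            (path.clone(), id.clone())
        })
        .collect()
}

fn main() {
    let workspace_ids = HashMap::from([
        ("a.rs".to_string(), "w1".to_string()),
        ("b.rs".to_string(), "w2".to_string()),
    ]);
    let buffer_ids = HashMap::from([("a.rs".to_string(), "b9".to_string())]);
    let merged = merged_result_ids(&workspace_ids, &buffer_ids);
    assert_eq!(merged["a.rs"], "b9"); // buffer pull wins
    assert_eq!(merged["b.rs"], "w2"); // falls back to the workspace pull
}
```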
@@ -12053,16 +11924,12 @@ impl LspStore {
&mut self,
server_id: LanguageServerId,
report: lsp::WorkspaceDiagnosticReportResult,
registration_id: Option<SharedString>,
cx: &mut Context<Self>,
) {
let workspace_diagnostics =
GetDocumentDiagnostics::deserialize_workspace_diagnostics_report(
report,
server_id,
registration_id,
);
let mut unchanged_buffers = HashMap::default();
GetDocumentDiagnostics::deserialize_workspace_diagnostics_report(report, server_id);
let mut unchanged_buffers = HashSet::default();
let mut changed_buffers = HashSet::default();
let workspace_diagnostics_updates = workspace_diagnostics
.into_iter()
.filter_map(
@@ -12071,32 +11938,25 @@ impl LspStore {
server_id,
uri,
diagnostics,
registration_id,
} => Some((
server_id,
uri,
diagnostics,
workspace_diagnostics.version,
registration_id,
)),
} => Some((server_id, uri, diagnostics, workspace_diagnostics.version)),
LspPullDiagnostics::Default => None,
},
)
.fold(
HashMap::default(),
|mut acc, (server_id, uri, diagnostics, version, new_registration_id)| {
|mut acc, (server_id, uri, diagnostics, version)| {
let (result_id, diagnostics) = match diagnostics {
PulledDiagnostics::Unchanged { result_id } => {
unchanged_buffers
.entry(new_registration_id.clone())
.or_insert_with(HashSet::default)
.insert(uri.clone());
unchanged_buffers.insert(uri.clone());
(Some(result_id), Vec::new())
}
PulledDiagnostics::Changed {
result_id,
diagnostics,
} => (result_id, diagnostics),
} => {
changed_buffers.insert(uri.clone());
(result_id, diagnostics)
}
};
let disk_based_sources = Cow::Owned(
self.language_server_adapter_for_id(server_id)
@@ -12105,68 +11965,47 @@ impl LspStore {
.unwrap_or(&[])
.to_vec(),
);
let Some(abs_path) = uri.to_file_path().ok() else {
return acc;
};
let Some((worktree, relative_path)) =
self.worktree_store.read(cx).find_worktree(abs_path.clone(), cx)
else {
log::warn!("skipping workspace diagnostics update, no worktree found for path {abs_path:?}");
return acc;
};
let worktree_id = worktree.read(cx).id();
let project_path = ProjectPath {
worktree_id,
path: relative_path,
};
if let Some(local_lsp_store) = self.as_local_mut() {
local_lsp_store.workspace_pull_diagnostics_result_ids.entry(server_id)
.or_default().entry(new_registration_id.clone()).or_default().insert(abs_path, result_id.clone());
}
// The LSP spec recommends that "diagnostics from a document pull should win over diagnostics from a workspace pull."
// Since we actively pull diagnostics for documents with open buffers, we ignore the contents of workspace pulls for these documents.
if self.buffer_store.read(cx).get_by_path(&project_path).is_none() {
acc.entry(server_id)
.or_insert_with(HashMap::default)
.entry(new_registration_id.clone())
.or_insert_with(Vec::new)
.push(DocumentDiagnosticsUpdate {
server_id,
diagnostics: lsp::PublishDiagnosticsParams {
uri,
diagnostics,
version,
},
result_id,
disk_based_sources,
registration_id: new_registration_id,
});
}
acc.entry(server_id)
.or_insert_with(Vec::new)
.push(DocumentDiagnosticsUpdate {
server_id,
diagnostics: lsp::PublishDiagnosticsParams {
uri,
diagnostics,
version,
},
result_id,
disk_based_sources,
});
acc
},
);
for diagnostic_updates in workspace_diagnostics_updates.into_values() {
for (registration_id, diagnostic_updates) in diagnostic_updates {
self.merge_lsp_diagnostics(
DiagnosticSourceKind::Pulled,
diagnostic_updates,
|document_uri, old_diagnostic, _| match old_diagnostic.source_kind {
DiagnosticSourceKind::Pulled => {
old_diagnostic.registration_id != registration_id
|| unchanged_buffers
.get(&old_diagnostic.registration_id)
.is_some_and(|unchanged_buffers| {
unchanged_buffers.contains(&document_uri)
})
}
DiagnosticSourceKind::Other | DiagnosticSourceKind::Pushed => true,
},
cx,
)
.log_err();
}
self.merge_lsp_diagnostics(
DiagnosticSourceKind::Pulled,
diagnostic_updates,
|buffer, old_diagnostic, cx| {
File::from_dyn(buffer.file())
.and_then(|file| {
let abs_path = file.as_local()?.abs_path(cx);
lsp::Uri::from_file_path(abs_path).ok()
})
.is_none_or(|buffer_uri| {
unchanged_buffers.contains(&buffer_uri)
|| match old_diagnostic.source_kind {
DiagnosticSourceKind::Pulled => {
!changed_buffers.contains(&buffer_uri)
}
DiagnosticSourceKind::Other | DiagnosticSourceKind::Pushed => {
true
}
}
})
},
cx,
)
.log_err();
}
}
@@ -12445,41 +12284,54 @@ impl LspStore {
.diagnostics
.insert(Some(reg.id.clone()), caps.clone());
let supports_workspace_diagnostics =
|capabilities: &DiagnosticServerCapabilities| match capabilities {
DiagnosticServerCapabilities::Options(diagnostic_options) => {
diagnostic_options.workspace_diagnostics
}
DiagnosticServerCapabilities::RegistrationOptions(
diagnostic_registration_options,
) => {
diagnostic_registration_options
.diagnostic_options
.workspace_diagnostics
}
};
if supports_workspace_diagnostics(&caps) {
if let LanguageServerState::Running {
workspace_diagnostics_refresh_tasks,
..
} = state
&& let Some(task) = lsp_workspace_diagnostics_refresh(
Some(reg.id.clone()),
caps.clone(),
server.clone(),
cx,
)
{
workspace_diagnostics_refresh_tasks.insert(Some(reg.id), task);
}
if let LanguageServerState::Running {
workspace_diagnostics_refresh_tasks,
..
} = state
&& let Some(task) = lsp_workspace_diagnostics_refresh(
Some(reg.id.clone()),
caps.clone(),
server.clone(),
cx,
)
{
workspace_diagnostics_refresh_tasks.insert(Some(reg.id), task);
}
let mut did_update_caps = false;
server.update_capabilities(|capabilities| {
capabilities.diagnostic_provider = Some(caps);
if capabilities.diagnostic_provider.as_ref().is_none_or(
|current_caps| {
let supports_workspace_diagnostics =
|capabilities: &DiagnosticServerCapabilities| {
match capabilities {
DiagnosticServerCapabilities::Options(
diagnostic_options,
) => diagnostic_options.workspace_diagnostics,
DiagnosticServerCapabilities::RegistrationOptions(
diagnostic_registration_options,
) => {
diagnostic_registration_options
.diagnostic_options
.workspace_diagnostics
}
}
};
// We don't actually care about capabilities.diagnostic_provider ourselves, but it IS relevant for the remote peer
// to know that there's at least one provider. Otherwise, it will never ask us to issue textDocument/diagnostic calls on its behalf,
// as it will think they're not supported.
// If we did not support any workspace diagnostics up to this point but now do, let's update.
!supports_workspace_diagnostics(current_caps)
&& supports_workspace_diagnostics(&caps)
},
) {
did_update_caps = true;
capabilities.diagnostic_provider = Some(caps);
}
});
notify_server_capabilities_updated(&server, cx);
if did_update_caps {
notify_server_capabilities_updated(&server, cx);
}
}
}
"textDocument/documentColor" => {
@@ -12647,29 +12499,31 @@ impl LspStore {
.language_servers
.get_mut(&server_id)
.context("Could not obtain Language Servers state")?;
let registrations = local
let options = local
.language_server_dynamic_registrations
.get_mut(&server_id)
.with_context(|| {
format!("Expected dynamic registration to exist for server {server_id}")
})?;
registrations.diagnostics
})?.diagnostics
.remove(&Some(unreg.id.clone()))
.with_context(|| format!(
"Attempted to unregister non-existent diagnostic registration with ID {}",
unreg.id)
)?;
let removed_last_diagnostic_provider = registrations.diagnostics.is_empty();
if let LanguageServerState::Running {
workspace_diagnostics_refresh_tasks,
..
} = state
let mut has_any_diagnostic_providers_still = true;
if let Some(identifier) = diagnostic_identifier(&options)
&& let LanguageServerState::Running {
workspace_diagnostics_refresh_tasks,
..
} = state
{
workspace_diagnostics_refresh_tasks.remove(&Some(unreg.id.clone()));
workspace_diagnostics_refresh_tasks.remove(&identifier);
has_any_diagnostic_providers_still =
!workspace_diagnostics_refresh_tasks.is_empty();
}
if removed_last_diagnostic_provider {
if !has_any_diagnostic_providers_still {
server.update_capabilities(|capabilities| {
debug_assert!(capabilities.diagnostic_provider.is_some());
capabilities.diagnostic_provider = None;
@@ -12968,8 +12822,7 @@ fn lsp_workspace_diagnostics_refresh(
server: Arc<LanguageServer>,
cx: &mut Context<'_, LspStore>,
) -> Option<WorkspaceRefreshTask> {
let identifier = workspace_diagnostic_identifier(&options)?;
let registration_id_shared = registration_id.as_ref().map(SharedString::from);
let identifier = diagnostic_identifier(&options)?;
let (progress_tx, mut progress_rx) = mpsc::channel(1);
let (mut refresh_tx, mut refresh_rx) = mpsc::channel(1);
@@ -13001,13 +12854,13 @@ fn lsp_workspace_diagnostics_refresh(
let Ok(previous_result_ids) = lsp_store.update(cx, |lsp_store, _| {
lsp_store
.result_ids_for_workspace_refresh(server.server_id(), &registration_id_shared)
.all_result_ids(server.server_id())
.into_iter()
.filter_map(|(abs_path, result_id)| {
let uri = file_path_to_lsp_url(&abs_path).ok()?;
Some(lsp::PreviousResultId {
uri,
value: result_id.to_string(),
value: result_id,
})
})
.collect()
@@ -13015,9 +12868,9 @@ fn lsp_workspace_diagnostics_refresh(
return;
};
let token = if let Some(registration_id) = &registration_id {
let token = if let Some(identifier) = &registration_id {
format!(
"workspace/diagnostic/{}/{requests}/{WORKSPACE_DIAGNOSTICS_TOKEN_START}{registration_id}",
"workspace/diagnostic/{}/{requests}/{WORKSPACE_DIAGNOSTICS_TOKEN_START}{identifier}",
server.server_id(),
)
} else {
@@ -13067,7 +12920,6 @@ fn lsp_workspace_diagnostics_refresh(
lsp_store.apply_workspace_diagnostic_report(
server.server_id(),
pulled_diagnostics,
registration_id_shared.clone(),
cx,
)
})
@@ -13089,21 +12941,7 @@ fn lsp_workspace_diagnostics_refresh(
})
}
fn buffer_diagnostic_identifier(options: &DiagnosticServerCapabilities) -> Option<String> {
match &options {
lsp::DiagnosticServerCapabilities::Options(diagnostic_options) => {
diagnostic_options.identifier.clone()
}
lsp::DiagnosticServerCapabilities::RegistrationOptions(registration_options) => {
let diagnostic_options = &registration_options.diagnostic_options;
diagnostic_options.identifier.clone()
}
}
}
fn workspace_diagnostic_identifier(
options: &DiagnosticServerCapabilities,
) -> Option<Option<String>> {
fn diagnostic_identifier(options: &DiagnosticServerCapabilities) -> Option<Option<String>> {
match &options {
lsp::DiagnosticServerCapabilities::Options(diagnostic_options) => {
if !diagnostic_options.workspace_diagnostics {


@@ -90,7 +90,6 @@ pub fn register_notifications(
disk_based_sources: Cow::Borrowed(
&adapter.disk_based_diagnostic_sources,
),
registration_id: None,
}],
|_, diag, _| !is_inactive_region(diag),
cx,


@@ -984,8 +984,6 @@ pub enum LspPullDiagnostics {
server_id: LanguageServerId,
/// URI of the resource.
uri: lsp::Uri,
/// The ID provided by the dynamic registration that produced diagnostics.
registration_id: Option<SharedString>,
/// The diagnostics produced by this language server.
diagnostics: PulledDiagnostics,
},
@@ -996,10 +994,10 @@ pub enum PulledDiagnostics {
Unchanged {
/// An ID of the current pulled batch for this file.
/// If given, can be used to query workspace diagnostics partially.
result_id: SharedString,
result_id: String,
},
Changed {
result_id: Option<SharedString>,
result_id: Option<String>,
diagnostics: Vec<lsp::Diagnostic>,
},
}
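The `Unchanged`/`Changed` split above maps to a simple update rule: keep the old diagnostics or replace them, and remember the new `result_id` either way. A minimal sketch with `String` diagnostics standing in for `lsp::Diagnostic`:

```rust
// Stand-in for the enum above, with String diagnostics.
enum PulledDiagnostics {
    Unchanged { result_id: String },
    Changed { result_id: Option<String>, diagnostics: Vec<String> },
}

// Apply a pulled report to the currently known diagnostics, returning
// the result_id to remember for the next pull (if any).
fn apply(report: PulledDiagnostics, current: &mut Vec<String>) -> Option<String> {
    match report {
        // Unchanged: the server confirmed the previous batch is still valid.
        PulledDiagnostics::Unchanged { result_id } => Some(result_id),
        // Changed: the new list replaces the old one wholesale.
        PulledDiagnostics::Changed { result_id, diagnostics } => {
            *current = diagnostics;
            result_id
        }
    }
}

fn main() {
    let mut current = vec!["old".to_string()];
    let id = apply(
        PulledDiagnostics::Unchanged { result_id: "r1".into() },
        &mut current,
    );
    assert_eq!(id.as_deref(), Some("r1"));
    assert_eq!(current, vec!["old".to_string()]); // untouched

    apply(
        PulledDiagnostics::Changed { result_id: None, diagnostics: vec!["new".into()] },
        &mut current,
    );
    assert_eq!(current, vec!["new".to_string()]); // replaced
}
```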


@@ -2750,13 +2750,11 @@ async fn test_empty_diagnostic_ranges(cx: &mut gpui::TestAppContext) {
);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(path!("/dir"), json!({ "a.rs": text })).await;
fs.insert_tree("/dir", json!({ "a.rs": text })).await;
let project = Project::test(fs, [Path::new(path!("/dir"))], cx).await;
let project = Project::test(fs, ["/dir".as_ref()], cx).await;
let buffer = project
.update(cx, |project, cx| {
project.open_local_buffer(path!("/dir/a.rs"), cx)
})
.update(cx, |project, cx| project.open_local_buffer("/dir/a.rs", cx))
.await
.unwrap();
@@ -2765,7 +2763,7 @@ async fn test_empty_diagnostic_ranges(cx: &mut gpui::TestAppContext) {
lsp_store
.update_diagnostic_entries(
LanguageServerId(0),
PathBuf::from(path!("/dir/a.rs")),
PathBuf::from("/dir/a.rs"),
None,
None,
vec![
@@ -2822,17 +2820,17 @@ async fn test_diagnostics_from_multiple_language_servers(cx: &mut gpui::TestAppC
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(path!("/dir"), json!({ "a.rs": "one two three" }))
fs.insert_tree("/dir", json!({ "a.rs": "one two three" }))
.await;
let project = Project::test(fs, [Path::new(path!("/dir"))], cx).await;
let project = Project::test(fs, ["/dir".as_ref()], cx).await;
let lsp_store = project.read_with(cx, |project, _| project.lsp_store.clone());
lsp_store.update(cx, |lsp_store, cx| {
lsp_store
.update_diagnostic_entries(
LanguageServerId(0),
Path::new(path!("/dir/a.rs")).to_owned(),
Path::new("/dir/a.rs").to_owned(),
None,
None,
vec![DiagnosticEntry {
@@ -2851,7 +2849,7 @@ async fn test_diagnostics_from_multiple_language_servers(cx: &mut gpui::TestAppC
lsp_store
.update_diagnostic_entries(
LanguageServerId(1),
Path::new(path!("/dir/a.rs")).to_owned(),
Path::new("/dir/a.rs").to_owned(),
None,
None,
vec![DiagnosticEntry {


@@ -258,7 +258,6 @@ message Diagnostic {
Anchor start = 1;
Anchor end = 2;
optional string source = 3;
optional string registration_id = 17;
enum SourceKind {
Pulled = 0;


@@ -580,7 +580,7 @@ message GitCreateWorktree {
message RunGitHook {
enum GitHook {
PRE_COMMIT = 0;
PRE_PUSH = 1;
reserved 1;
}
uint64 project_id = 1;


@@ -949,7 +949,6 @@ message PulledDiagnostics {
optional string result_id = 3;
bool changed = 4;
repeated LspDiagnostic diagnostics = 5;
optional string registration_id = 6;
}
message PullWorkspaceDiagnostics {


@@ -115,7 +115,7 @@ impl MasterProcess {
.args(additional_args)
.args(args);
master_process.arg(format!("ControlPath='{}'", socket_path.display()));
master_process.arg(format!("ControlPath={}", socket_path.display()));
let process = master_process.arg(&url).spawn()?;


@@ -2321,8 +2321,13 @@ impl BufferSnapshot {
} else if anchor.is_max() {
self.visible_text.len()
} else {
debug_assert!(anchor.buffer_id == Some(self.remote_id));
debug_assert!(self.version.observed(anchor.timestamp));
debug_assert_eq!(anchor.buffer_id, Some(self.remote_id));
debug_assert!(
self.version.observed(anchor.timestamp),
"Anchor timestamp {:?} not observed by buffer {:?}",
anchor.timestamp,
self.version
);
let anchor_key = InsertionFragmentKey {
timestamp: anchor.timestamp,
split_offset: anchor.offset,


@@ -2,7 +2,7 @@
description = "The fast, collaborative code editor."
edition.workspace = true
name = "zed"
version = "0.217.0"
version = "0.217.1"
publish.workspace = true
license = "GPL-3.0-or-later"
authors = ["Zed Team <hi@zed.dev>"]


@@ -1 +1 @@
dev
stable


@@ -13,4 +13,4 @@ path = "src/proto.rs"
crate-type = ["cdylib"]
[dependencies]
zed_extension_api = "0.1.0"
zed_extension_api = "0.7.0"


@@ -7,9 +7,18 @@ authors = ["Zed Industries <support@zed.dev>"]
repository = "https://github.com/zed-industries/zed"
[grammars.proto]
repository = "https://github.com/zed-industries/tree-sitter-proto"
commit = "0848bd30a64be48772e15fbb9d5ba8c0cc5772ad"
repository = "https://github.com/coder3101/tree-sitter-proto"
commit = "a6caac94b5aa36b322b5b70040d5b67132f109d0"
[language_servers.buf]
name = "Buf"
languages = ["Proto"]
[language_servers.protobuf-language-server]
name = "Protobuf Language Server"
languages = ["Proto"]
[language_servers.protols]
name = "Protols"
languages = ["Proto"]


@@ -0,0 +1,8 @@
mod buf;
mod protobuf_language_server;
mod protols;
mod util;
pub(crate) use buf::*;
pub(crate) use protobuf_language_server::*;
pub(crate) use protols::*;


@@ -0,0 +1,114 @@
use std::fs;
use zed_extension_api::{
self as zed, Architecture, DownloadedFileType, GithubReleaseOptions, Os, Result,
settings::LspSettings,
};
use crate::language_servers::util;
pub(crate) struct BufLsp {
cached_binary_path: Option<String>,
}
impl BufLsp {
pub(crate) const SERVER_NAME: &str = "buf";
pub(crate) fn new() -> Self {
BufLsp {
cached_binary_path: None,
}
}
pub(crate) fn language_server_binary(
&mut self,
worktree: &zed::Worktree,
) -> Result<zed::Command> {
let binary_settings = LspSettings::for_worktree(Self::SERVER_NAME, worktree)
.ok()
.and_then(|lsp_settings| lsp_settings.binary);
let args = binary_settings
.as_ref()
.and_then(|binary_settings| binary_settings.arguments.clone())
.unwrap_or_else(|| ["lsp", "serve"].map(ToOwned::to_owned).into());
if let Some(path) = binary_settings.and_then(|binary_settings| binary_settings.path) {
return Ok(zed::Command {
command: path,
args,
env: Default::default(),
});
} else if let Some(path) = self.cached_binary_path.clone() {
return Ok(zed::Command {
command: path,
args,
env: Default::default(),
});
} else if let Some(path) = worktree.which(Self::SERVER_NAME) {
self.cached_binary_path = Some(path.clone());
return Ok(zed::Command {
command: path,
args,
env: Default::default(),
});
}
let latest_release = zed::latest_github_release(
"bufbuild/buf",
GithubReleaseOptions {
require_assets: true,
pre_release: false,
},
)?;
let (os, arch) = zed::current_platform();
let release_suffix = match (os, arch) {
(Os::Mac, Architecture::Aarch64) => "Darwin-arm64",
(Os::Mac, Architecture::X8664) => "Darwin-x86_64",
(Os::Linux, Architecture::Aarch64) => "Linux-aarch64",
(Os::Linux, Architecture::X8664) => "Linux-x86_64",
(Os::Windows, Architecture::Aarch64) => "Windows-arm64.exe",
(Os::Windows, Architecture::X8664) => "Windows-x86_64.exe",
_ => {
return Err("Platform and architecture not supported by buf CLI".to_string());
}
};
let release_name = format!("buf-{release_suffix}");
let version_dir = format!("{}-{}", Self::SERVER_NAME, latest_release.version);
fs::create_dir_all(&version_dir).map_err(|_| "Could not create directory")?;
let binary_path = format!("{version_dir}/buf");
let download_target = latest_release
.assets
.into_iter()
.find(|asset| asset.name == release_name)
.ok_or_else(|| {
format!(
"Could not find asset with name {} in buf CLI release",
&release_name
)
})?;
zed::download_file(
&download_target.download_url,
&binary_path,
DownloadedFileType::Uncompressed,
)?;
zed::make_file_executable(&binary_path)?;
util::remove_outdated_versions(Self::SERVER_NAME, &version_dir)?;
self.cached_binary_path = Some(binary_path.clone());
Ok(zed::Command {
command: binary_path,
args,
env: Default::default(),
})
}
}


@@ -0,0 +1,52 @@
use zed_extension_api::{self as zed, Result, settings::LspSettings};
pub(crate) struct ProtobufLanguageServer {
cached_binary_path: Option<String>,
}
impl ProtobufLanguageServer {
pub(crate) const SERVER_NAME: &str = "protobuf-language-server";
pub(crate) fn new() -> Self {
ProtobufLanguageServer {
cached_binary_path: None,
}
}
pub(crate) fn language_server_binary(
&mut self,
worktree: &zed::Worktree,
) -> Result<zed::Command> {
let binary_settings = LspSettings::for_worktree(Self::SERVER_NAME, worktree)
.ok()
.and_then(|lsp_settings| lsp_settings.binary);
let args = binary_settings
.as_ref()
.and_then(|binary_settings| binary_settings.arguments.clone())
.unwrap_or_else(|| vec!["-logs".into(), "".into()]);
if let Some(path) = binary_settings.and_then(|binary_settings| binary_settings.path) {
Ok(zed::Command {
command: path,
args,
env: Default::default(),
})
} else if let Some(path) = self.cached_binary_path.clone() {
Ok(zed::Command {
command: path,
args,
env: Default::default(),
})
} else if let Some(path) = worktree.which(Self::SERVER_NAME) {
self.cached_binary_path = Some(path.clone());
Ok(zed::Command {
command: path,
args,
env: Default::default(),
})
} else {
Err(format!("{} not found in PATH", Self::SERVER_NAME))
}
}
}


@@ -0,0 +1,113 @@
use zed_extension_api::{
self as zed, Architecture, DownloadedFileType, GithubReleaseOptions, Os, Result,
settings::LspSettings,
};
use crate::language_servers::util;
pub(crate) struct ProtoLs {
cached_binary_path: Option<String>,
}
impl ProtoLs {
pub(crate) const SERVER_NAME: &str = "protols";
pub(crate) fn new() -> Self {
ProtoLs {
cached_binary_path: None,
}
}
pub(crate) fn language_server_binary(
&mut self,
worktree: &zed::Worktree,
) -> Result<zed::Command> {
let binary_settings = LspSettings::for_worktree(Self::SERVER_NAME, worktree)
.ok()
.and_then(|lsp_settings| lsp_settings.binary);
let args = binary_settings
.as_ref()
.and_then(|binary_settings| binary_settings.arguments.clone())
.unwrap_or_default();
let env = worktree.shell_env();
if let Some(path) = binary_settings.and_then(|binary_settings| binary_settings.path) {
return Ok(zed::Command {
command: path,
args,
env,
});
} else if let Some(path) = self.cached_binary_path.clone() {
return Ok(zed::Command {
command: path,
args,
env,
});
} else if let Some(path) = worktree.which(Self::SERVER_NAME) {
self.cached_binary_path = Some(path.clone());
return Ok(zed::Command {
command: path,
args,
env,
});
}
let latest_release = zed::latest_github_release(
"coder3101/protols",
GithubReleaseOptions {
require_assets: true,
pre_release: false,
},
)?;
let (os, arch) = zed::current_platform();
let release_suffix = match (os, arch) {
(Os::Mac, Architecture::Aarch64) => "aarch64-apple-darwin.tar.gz",
(Os::Mac, Architecture::X8664) => "x86_64-apple-darwin.tar.gz",
(Os::Linux, Architecture::Aarch64) => "aarch64-unknown-linux-gnu.tar.gz",
(Os::Linux, Architecture::X8664) => "x86_64-unknown-linux-gnu.tar.gz",
(Os::Windows, Architecture::X8664) => "x86_64-pc-windows-msvc.zip",
_ => {
return Err("Platform and architecture not supported by Protols".to_string());
}
};
let release_name = format!("protols-{release_suffix}");
let file_type = if os == Os::Windows {
DownloadedFileType::Zip
} else {
DownloadedFileType::GzipTar
};
let version_dir = format!("{}-{}", Self::SERVER_NAME, latest_release.version);
let binary_path = format!("{version_dir}/protols");
let download_target = latest_release
.assets
.into_iter()
.find(|asset| asset.name == release_name)
.ok_or_else(|| {
format!(
"Could not find asset with name {} in Protols release",
&release_name
)
})?;
zed::download_file(&download_target.download_url, &version_dir, file_type)?;
zed::make_file_executable(&binary_path)?;
util::remove_outdated_versions(Self::SERVER_NAME, &version_dir)?;
self.cached_binary_path = Some(binary_path.clone());
Ok(zed::Command {
command: binary_path,
args,
env,
})
}
}


@@ -0,0 +1,19 @@
use std::fs;
use zed_extension_api::Result;
pub(super) fn remove_outdated_versions(
language_server_id: &'static str,
version_dir: &str,
) -> Result<()> {
let entries = fs::read_dir(".").map_err(|e| format!("failed to list working directory {e}"))?;
for entry in entries {
let entry = entry.map_err(|e| format!("failed to load directory entry {e}"))?;
if entry.file_name().to_str().is_none_or(|file_name| {
file_name.starts_with(language_server_id) && file_name != version_dir
}) {
fs::remove_dir_all(entry.path()).ok();
}
}
Ok(())
}


@@ -1,48 +1,22 @@
use zed_extension_api::{self as zed, Result, settings::LspSettings};
const PROTOBUF_LANGUAGE_SERVER_NAME: &str = "protobuf-language-server";
use crate::language_servers::{BufLsp, ProtoLs, ProtobufLanguageServer};
struct ProtobufLanguageServerBinary {
path: String,
args: Option<Vec<String>>,
}
mod language_servers;
struct ProtobufExtension;
impl ProtobufExtension {
fn language_server_binary(
&self,
_language_server_id: &zed::LanguageServerId,
worktree: &zed::Worktree,
) -> Result<ProtobufLanguageServerBinary> {
let binary_settings = LspSettings::for_worktree("protobuf-language-server", worktree)
.ok()
.and_then(|lsp_settings| lsp_settings.binary);
let binary_args = binary_settings
.as_ref()
.and_then(|binary_settings| binary_settings.arguments.clone());
if let Some(path) = binary_settings.and_then(|binary_settings| binary_settings.path) {
return Ok(ProtobufLanguageServerBinary {
path,
args: binary_args,
});
}
if let Some(path) = worktree.which(PROTOBUF_LANGUAGE_SERVER_NAME) {
return Ok(ProtobufLanguageServerBinary {
path,
args: binary_args,
});
}
Err(format!("{PROTOBUF_LANGUAGE_SERVER_NAME} not found in PATH",))
}
struct ProtobufExtension {
protobuf_language_server: Option<ProtobufLanguageServer>,
protols: Option<ProtoLs>,
buf_lsp: Option<BufLsp>,
}
impl zed::Extension for ProtobufExtension {
fn new() -> Self {
Self
Self {
protobuf_language_server: None,
protols: None,
buf_lsp: None,
}
}
fn language_server_command(
@@ -50,14 +24,24 @@ impl zed::Extension for ProtobufExtension {
language_server_id: &zed_extension_api::LanguageServerId,
worktree: &zed_extension_api::Worktree,
) -> zed_extension_api::Result<zed_extension_api::Command> {
let binary = self.language_server_binary(language_server_id, worktree)?;
Ok(zed::Command {
command: binary.path,
args: binary
.args
.unwrap_or_else(|| vec!["-logs".into(), "".into()]),
env: Default::default(),
})
match language_server_id.as_ref() {
ProtobufLanguageServer::SERVER_NAME => self
.protobuf_language_server
.get_or_insert_with(ProtobufLanguageServer::new)
.language_server_binary(worktree),
ProtoLs::SERVER_NAME => self
.protols
.get_or_insert_with(ProtoLs::new)
.language_server_binary(worktree),
BufLsp::SERVER_NAME => self
.buf_lsp
.get_or_insert_with(BufLsp::new)
.language_server_binary(worktree),
_ => Err(format!("Unknown language server ID {}", language_server_id)),
}
}
fn language_server_workspace_configuration(
@@ -65,10 +49,8 @@ impl zed::Extension for ProtobufExtension {
server_id: &zed::LanguageServerId,
worktree: &zed::Worktree,
) -> Result<Option<zed::serde_json::Value>> {
let settings = LspSettings::for_worktree(server_id.as_ref(), worktree)
.ok()
.and_then(|lsp_settings| lsp_settings.settings);
Ok(settings)
LspSettings::for_worktree(server_id.as_ref(), worktree)
.map(|lsp_settings| lsp_settings.settings)
}
fn language_server_initialization_options(
@@ -76,10 +58,8 @@ impl zed::Extension for ProtobufExtension {
server_id: &zed::LanguageServerId,
worktree: &zed::Worktree,
) -> Result<Option<zed_extension_api::serde_json::Value>> {
let initialization_options = LspSettings::for_worktree(server_id.as_ref(), worktree)
.ok()
.and_then(|lsp_settings| lsp_settings.initialization_options);
Ok(initialization_options)
LspSettings::for_worktree(server_id.as_ref(), worktree)
.map(|lsp_settings| lsp_settings.initialization_options)
}
}


@@ -368,6 +368,8 @@ pub(crate) fn check_postgres_and_protobuf_migrations() -> NamedJob {
.runs_on(runners::LINUX_DEFAULT)
.add_env(("GIT_AUTHOR_NAME", "Protobuf Action"))
.add_env(("GIT_AUTHOR_EMAIL", "ci@zed.dev"))
.add_env(("GIT_COMMITTER_NAME", "Protobuf Action"))
.add_env(("GIT_COMMITTER_EMAIL", "ci@zed.dev"))
.add_step(steps::checkout_repo().with(("fetch-depth", 0))) // fetch full history
.add_step(remove_untracked_files())
.add_step(ensure_fresh_merge())


@@ -31,6 +31,9 @@ extend-exclude = [
"crates/rpc/src/auth.rs",
# glsl isn't recognized by this tool.
"extensions/glsl/languages/glsl/",
# Protols is the name of the language server.
"extensions/proto/extension.toml",
"extensions/proto/src/language_servers/protols.rs",
# Windows likes its abbreviations.
"crates/gpui/src/platform/windows/directx_renderer.rs",
"crates/gpui/src/platform/windows/events.rs",