Compare commits

8 Commits

Author SHA1 Message Date
Smit Barmase
2c52e0b614 fix ime popup position on linux 2025-09-05 04:54:33 +05:30
Lukas Wirth
fca44f89c1 languages: Allow installing pre-release of rust-analyzer and clangd (#37530)
Release Notes:

- Added lsp binary config to allow fetching nightly rust-analyzer and
clangd releases
2025-09-04 09:22:19 +00:00
Mitch (a.k.a Voz)
b7ad20773c worktree: Create parent directories on rename (#37437)
Closes https://github.com/zed-industries/zed/issues/37357

Release Notes:

- Allow creating sub-directories when renaming a file in file finder

---------

Co-authored-by: Kirill Bulatov <kirill@zed.dev>
2025-09-04 08:25:47 +00:00
Finn Evers
aa1629b544 Remove some unused events (#37498)
This PR cleans up some emitted events around the codebase. These events
are either never emitted or never listened for.

It seems better to re-implement these at some point, should they be needed
again - that way they will actually be fired in the cases where they are
needed, instead of sitting in the codebase and growing unreliable and stale
(which was already the case for the majority of the events removed here).

Lastly, this ensures the `CapabilitiesChanged` event is not fired too
often.

Release Notes:

- N/A
2025-09-04 09:09:28 +02:00
James Tucker
69a5c45672 gpui: Fix out-of-bounds node indices in dispatch_path (#37252)
Observed in a somewhat regular startup crash on Windows at head (~50% of
launches in release mode).

Closes #37212

Release Notes:

- N/A
2025-09-03 23:18:23 -07:00
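The one-line fix in this commit swaps a direct slice index for a checked lookup. A minimal sketch of the pattern (stand-in `Node` type, not gpui's real dispatch tree):

```rust
#[derive(Clone, Copy)]
struct Node {
    parent: Option<usize>,
}

// Walk from a target node up to the root, as `dispatch_path` does.
// `nodes.get(id)` returns None for an out-of-bounds id instead of
// panicking like `nodes[id]` would, so a stale node id simply ends
// the walk early rather than crashing on startup.
fn dispatch_path(nodes: &[Node], target: usize) -> Vec<usize> {
    let mut path = Vec::new();
    let mut current = Some(target);
    while let Some(id) = current {
        path.push(id);
        current = nodes.get(id).and_then(|node| node.parent);
    }
    path.reverse(); // root first, target last
    path
}

fn main() {
    let nodes = [Node { parent: None }, Node { parent: Some(0) }];
    assert_eq!(dispatch_path(&nodes, 1), vec![0, 1]);
    // An out-of-bounds id no longer panics:
    assert_eq!(dispatch_path(&nodes, 7), vec![7]);
    println!("ok");
}
```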
沈瑗杰
d0aaf04673 Change DeepSeek max token count to 128k (#36864)
https://api-docs.deepseek.com/zh-cn/news/news250821

The official API now supports a 128k token context window.

The models have also been renamed to v3.1 / v3.1 thinking.

Release Notes:

- N/A

---------

Co-authored-by: Ben Brandt <benjamin.j.brandt@gmail.com>
2025-09-04 05:51:48 +00:00
Francis
d677c98f43 agent2: Use inline enums in now and edit_file tools JSON schema (#37397)
Added schemars annotations to generate inline enums instead of
references ($ref) in the JSON schema passed to LLMs.

This concerns:
- the "timezone" parameter of the "now" tool
- the "mode" parameter of the "edit_file" tool

The same approach should be applied to future tool/function enums. Inline
enums make the schema easier for LLMs to understand, since many of them
don't use JSON references correctly.

Tested with:
- local GPT-OSS-120b with a llama.cpp server (OpenAI-compatible)
- remote Claude Sonnet 4.0 with a Zed Pro subscription

Thanks in advance for the merge.
(Note: this is my first PR ever on GitHub, I hope I'm doing things right -
please let me know if you have any comments. Edit: I just noticed my
username/email were not set up correctly in my local git, sorry; it's been
5 years since I last used git.)

Closes #37389

Release Notes:

- agent: Improve "now" and "edit_file" tool schemas to work with more
models.
2025-09-04 05:39:55 +00:00
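For illustration, the difference in the generated schema looks roughly like this (a hypothetical sketch with the variant list abbreviated, not the exact schema Zed emits):

```json
// Before: the "mode" enum sits behind a JSON reference
{
  "properties": { "mode": { "$ref": "#/definitions/EditFileMode" } },
  "definitions": { "EditFileMode": { "type": "string", "enum": ["edit", "create"] } }
}
// After #[schemars(inline)]: the enum is embedded directly
{
  "properties": { "mode": { "type": "string", "enum": ["edit", "create"] } }
}
```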
Ben Brandt
ce362864db docs: Update OpenAI-compatible provider config format (#37517)
The example still showed the old way of setting up OpenAI-compatible
providers, but that format should only be used to change the URL for your
actual OpenAI provider.

If you are configuring a compatible provider, it should use the new
format.

Closes #37093

Release Notes:

- N/A
2025-09-04 04:39:06 +00:00
37 changed files with 275 additions and 167 deletions

View File

@@ -84,7 +84,6 @@ impl ActivityIndicator {
) -> Entity<ActivityIndicator> {
let project = workspace.project().clone();
let auto_updater = AutoUpdater::get(cx);
let workspace_handle = cx.entity();
let this = cx.new(|cx| {
let mut status_events = languages.language_server_binary_statuses();
cx.spawn(async move |this, cx| {
@@ -102,20 +101,6 @@ impl ActivityIndicator {
})
.detach();
cx.subscribe_in(
&workspace_handle,
window,
|activity_indicator, _, event, window, cx| {
if let workspace::Event::ClearActivityIndicator = event
&& activity_indicator.statuses.pop().is_some()
{
activity_indicator.dismiss_error_message(&DismissErrorMessage, window, cx);
cx.notify();
}
},
)
.detach();
cx.subscribe(
&project.read(cx).lsp_store(),
|activity_indicator, _, event, cx| {

View File

@@ -83,6 +83,7 @@ struct EditFileToolPartialInput {
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema)]
#[serde(rename_all = "lowercase")]
#[schemars(inline)]
pub enum EditFileMode {
Edit,
Create,

View File

@@ -11,6 +11,7 @@ use crate::{AgentTool, ToolCallEventStream};
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
#[serde(rename_all = "snake_case")]
#[schemars(inline)]
pub enum Timezone {
/// Use UTC for the datetime.
Utc,

View File

@@ -13,11 +13,8 @@ use anyhow::{Context as _, Result, anyhow};
use collections::IndexMap;
use dap::adapters::DebugAdapterName;
use dap::debugger_settings::DebugPanelDockPosition;
use dap::{
ContinuedEvent, LoadedSourceEvent, ModuleEvent, OutputEvent, StoppedEvent, ThreadEvent,
client::SessionId, debugger_settings::DebuggerSettings,
};
use dap::{DapRegistry, StartDebuggingRequestArguments};
use dap::{client::SessionId, debugger_settings::DebuggerSettings};
use editor::Editor;
use gpui::{
Action, App, AsyncWindowContext, ClipboardItem, Context, DismissEvent, Entity, EntityId,
@@ -46,23 +43,6 @@ use workspace::{
};
use zed_actions::ToggleFocus;
pub enum DebugPanelEvent {
Exited(SessionId),
Terminated(SessionId),
Stopped {
client_id: SessionId,
event: StoppedEvent,
go_to_stack_frame: bool,
},
Thread((SessionId, ThreadEvent)),
Continued((SessionId, ContinuedEvent)),
Output((SessionId, OutputEvent)),
Module((SessionId, ModuleEvent)),
LoadedSource((SessionId, LoadedSourceEvent)),
ClientShutdown(SessionId),
CapabilitiesChanged(SessionId),
}
pub struct DebugPanel {
size: Pixels,
active_session: Option<Entity<DebugSession>>,
@@ -1407,7 +1387,6 @@ async fn register_session_inner(
}
impl EventEmitter<PanelEvent> for DebugPanel {}
impl EventEmitter<DebugPanelEvent> for DebugPanel {}
impl Focusable for DebugPanel {
fn focus_handle(&self, _: &App) -> FocusHandle {

View File

@@ -2,9 +2,7 @@ pub mod running;
use crate::{StackTraceView, persistence::SerializedLayout, session::running::DebugTerminal};
use dap::client::SessionId;
use gpui::{
App, Axis, Entity, EventEmitter, FocusHandle, Focusable, Subscription, Task, WeakEntity,
};
use gpui::{App, Axis, Entity, EventEmitter, FocusHandle, Focusable, Task, WeakEntity};
use project::debugger::session::Session;
use project::worktree_store::WorktreeStore;
use project::{Project, debugger::session::SessionQuirks};
@@ -24,13 +22,6 @@ pub struct DebugSession {
stack_trace_view: OnceCell<Entity<StackTraceView>>,
_worktree_store: WeakEntity<WorktreeStore>,
workspace: WeakEntity<Workspace>,
_subscriptions: [Subscription; 1],
}
#[derive(Debug)]
pub enum DebugPanelItemEvent {
Close,
Stopped { go_to_stack_frame: bool },
}
impl DebugSession {
@@ -59,9 +50,6 @@ impl DebugSession {
let quirks = session.read(cx).quirks();
cx.new(|cx| Self {
_subscriptions: [cx.subscribe(&running_state, |_, _, _, cx| {
cx.notify();
})],
remote_id: None,
running_state,
quirks,
@@ -133,7 +121,7 @@ impl DebugSession {
}
}
impl EventEmitter<DebugPanelItemEvent> for DebugSession {}
impl EventEmitter<()> for DebugSession {}
impl Focusable for DebugSession {
fn focus_handle(&self, cx: &App) -> FocusHandle {
@@ -142,7 +130,7 @@ impl Focusable for DebugSession {
}
impl Item for DebugSession {
type Event = DebugPanelItemEvent;
type Event = ();
fn tab_content_text(&self, _detail: usize, _cx: &App) -> SharedString {
"Debugger".into()
}

View File

@@ -14,7 +14,6 @@ use crate::{
session::running::memory_view::MemoryView,
};
use super::DebugPanelItemEvent;
use anyhow::{Context as _, Result, anyhow};
use breakpoint_list::BreakpointList;
use collections::{HashMap, IndexMap};
@@ -1826,8 +1825,6 @@ impl RunningState {
}
}
impl EventEmitter<DebugPanelItemEvent> for RunningState {}
impl Focusable for RunningState {
fn focus_handle(&self, _: &App) -> FocusHandle {
self.focus_handle.clone()

View File

@@ -96,7 +96,7 @@ impl Model {
pub fn max_token_count(&self) -> u64 {
match self {
Self::Chat | Self::Reasoner => 64_000,
Self::Chat | Self::Reasoner => 128_000,
Self::Custom { max_tokens, .. } => *max_tokens,
}
}
@@ -104,7 +104,7 @@ impl Model {
pub fn max_output_tokens(&self) -> Option<u64> {
match self {
Self::Chat => Some(8_192),
Self::Reasoner => Some(8_192),
Self::Reasoner => Some(64_000),
Self::Custom {
max_output_tokens, ..
} => *max_output_tokens,

View File

@@ -20326,7 +20326,6 @@ impl Editor {
multi_buffer::Event::FileHandleChanged
| multi_buffer::Event::Reloaded
| multi_buffer::Event::BufferDiffChanged => cx.emit(EditorEvent::TitleChanged),
multi_buffer::Event::Closed => cx.emit(EditorEvent::Closed),
multi_buffer::Event::DiagnosticsUpdated => {
self.update_diagnostics_state(window, cx);
}
@@ -23024,7 +23023,6 @@ pub enum EditorEvent {
DirtyChanged,
Saved,
TitleChanged,
DiffBaseChanged,
SelectionsChanged {
local: bool,
},
@@ -23032,14 +23030,12 @@ pub enum EditorEvent {
local: bool,
autoscroll: bool,
},
Closed,
TransactionUndone {
transaction_id: clock::Lamport,
},
TransactionBegun {
transaction_id: clock::Lamport,
},
Reloaded,
CursorShapeChanged,
BreadcrumbsChanged,
PushedToNavHistory {

View File

@@ -16783,6 +16783,7 @@ async fn test_language_server_restart_due_to_settings_change(cx: &mut TestAppCon
"some other init value": false
})),
enable_lsp_tasks: false,
fetch: None,
},
);
});
@@ -16803,6 +16804,7 @@ async fn test_language_server_restart_due_to_settings_change(cx: &mut TestAppCon
"anotherInitValue": false
})),
enable_lsp_tasks: false,
fetch: None,
},
);
});
@@ -16823,6 +16825,7 @@ async fn test_language_server_restart_due_to_settings_change(cx: &mut TestAppCon
"anotherInitValue": false
})),
enable_lsp_tasks: false,
fetch: None,
},
);
});
@@ -16841,6 +16844,7 @@ async fn test_language_server_restart_due_to_settings_change(cx: &mut TestAppCon
settings: None,
initialization_options: None,
enable_lsp_tasks: false,
fetch: None,
},
);
});

View File

@@ -775,12 +775,6 @@ impl Item for Editor {
self.nav_history = Some(history);
}
fn discarded(&self, _project: Entity<Project>, _: &mut Window, cx: &mut Context<Self>) {
for buffer in self.buffer().clone().read(cx).all_buffers() {
buffer.update(cx, |buffer, cx| buffer.discarded(cx))
}
}
fn on_removed(&self, cx: &App) {
self.report_editor_event(ReportEditorEvent::Closed, None, cx);
}
@@ -1022,8 +1016,6 @@ impl Item for Editor {
fn to_item_events(event: &EditorEvent, mut f: impl FnMut(ItemEvent)) {
match event {
EditorEvent::Closed => f(ItemEvent::CloseItem),
EditorEvent::Saved | EditorEvent::TitleChanged => {
f(ItemEvent::UpdateTab);
f(ItemEvent::UpdateBreadcrumbs);

View File

@@ -552,7 +552,7 @@ impl DispatchTree {
let mut current_node_id = Some(target);
while let Some(node_id) = current_node_id {
dispatch_path.push(node_id);
current_node_id = self.nodes[node_id.0].parent;
current_node_id = self.nodes.get(node_id.0).and_then(|node| node.parent);
}
dispatch_path.reverse(); // Reverse the path so it goes from the root to the focused node.
dispatch_path

View File

@@ -646,9 +646,10 @@ impl WaylandWindowStatePtr {
}
}
pub fn get_ime_area(&self) -> Option<Bounds<Pixels>> {
pub fn get_ime_area(&self) -> Option<Bounds<ScaledPixels>> {
let mut state = self.state.borrow_mut();
let mut bounds: Option<Bounds<Pixels>> = None;
let scale_factor = state.scale;
if let Some(mut input_handler) = state.input_handler.take() {
drop(state);
if let Some(selection) = input_handler.marked_text_range() {
@@ -656,7 +657,7 @@ impl WaylandWindowStatePtr {
}
self.state.borrow_mut().input_handler = Some(input_handler);
}
bounds
bounds.map(|b| b.scale(scale_factor))
}
pub fn set_size_and_scale(&self, size: Option<Size<Pixels>>, scale: Option<f32>) {

View File

@@ -1019,9 +1019,10 @@ impl X11WindowStatePtr {
}
}
pub fn get_ime_area(&self) -> Option<Bounds<Pixels>> {
pub fn get_ime_area(&self) -> Option<Bounds<ScaledPixels>> {
let mut state = self.state.borrow_mut();
let mut bounds: Option<Bounds<Pixels>> = None;
let scale_factor = state.scale_factor;
if let Some(mut input_handler) = state.input_handler.take() {
drop(state);
if let Some(selection) = input_handler.selected_text_range(true) {
@@ -1030,7 +1031,7 @@ impl X11WindowStatePtr {
let mut state = self.state.borrow_mut();
state.input_handler = Some(input_handler);
};
bounds
bounds.map(|b| b.scale(scale_factor))
}
pub fn set_bounds(&self, bounds: Bounds<i32>) -> anyhow::Result<()> {
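The Wayland and X11 changes above both convert the IME area from logical to physical pixels before returning it. A minimal sketch with a stand-in `Bounds` type (gpui's real `Bounds<Pixels>` / `Bounds<ScaledPixels>` are generic):

```rust
#[derive(Debug, PartialEq)]
struct Bounds {
    x: f32,
    y: f32,
    width: f32,
    height: f32,
}

impl Bounds {
    // Convert logical pixels to physical (scaled) pixels, as the fixed
    // `get_ime_area` now does before handing bounds to the compositor.
    fn scale(&self, factor: f32) -> Bounds {
        Bounds {
            x: self.x * factor,
            y: self.y * factor,
            width: self.width * factor,
            height: self.height * factor,
        }
    }
}

fn main() {
    let logical = Bounds { x: 10.0, y: 20.0, width: 100.0, height: 16.0 };
    // On a HiDPI display with a 2x scale factor, the IME popup must be
    // positioned in physical pixels, or it appears in the wrong place:
    let physical = logical.scale(2.0);
    assert_eq!(physical, Bounds { x: 20.0, y: 40.0, width: 200.0, height: 32.0 });
    println!("ok");
}
```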

View File

@@ -315,10 +315,6 @@ pub enum BufferEvent {
DiagnosticsUpdated,
/// The buffer gained or lost editing capabilities.
CapabilityChanged,
/// The buffer was explicitly requested to close.
Closed,
/// The buffer was discarded when closing.
Discarded,
}
/// The file associated with a buffer.
@@ -1246,8 +1242,10 @@ impl Buffer {
/// Assign the buffer a new [`Capability`].
pub fn set_capability(&mut self, capability: Capability, cx: &mut Context<Self>) {
self.capability = capability;
cx.emit(BufferEvent::CapabilityChanged)
if self.capability != capability {
self.capability = capability;
cx.emit(BufferEvent::CapabilityChanged)
}
}
/// This method is called to signal that the buffer has been saved.
@@ -1267,12 +1265,6 @@ impl Buffer {
cx.notify();
}
/// This method is called to signal that the buffer has been discarded.
pub fn discarded(&self, cx: &mut Context<Self>) {
cx.emit(BufferEvent::Discarded);
cx.notify();
}
/// Reloads the contents of the buffer from disk.
pub fn reload(&mut self, cx: &Context<Self>) -> oneshot::Receiver<Option<Transaction>> {
let (tx, rx) = futures::channel::oneshot::channel();

View File

@@ -395,6 +395,7 @@ pub trait LspAdapter: 'static + Send + Sync {
async fn fetch_latest_server_version(
&self,
delegate: &dyn LspAdapterDelegate,
cx: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>>;
fn will_fetch_server(
@@ -605,7 +606,7 @@ async fn try_fetch_server_binary<L: LspAdapter + 'static + Send + Sync + ?Sized>
delegate.update_status(name.clone(), BinaryStatus::CheckingForUpdate);
let latest_version = adapter
.fetch_latest_server_version(delegate.as_ref())
.fetch_latest_server_version(delegate.as_ref(), cx)
.await?;
if let Some(binary) = adapter
@@ -2222,6 +2223,7 @@ impl LspAdapter for FakeLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
unreachable!();
}

View File

@@ -204,6 +204,7 @@ impl LspAdapter for ExtensionLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
unreachable!("get_language_server_command is overridden")
}

View File

@@ -5,8 +5,9 @@ use gpui::{App, AsyncApp};
use http_client::github::{AssetKind, GitHubLspBinaryVersion, latest_github_release};
pub use language::*;
use lsp::{InitializeParams, LanguageServerBinary, LanguageServerName};
use project::lsp_store::clangd_ext;
use project::{lsp_store::clangd_ext, project_settings::ProjectSettings};
use serde_json::json;
use settings::Settings as _;
use smol::fs;
use std::{any::Any, env::consts, path::PathBuf, sync::Arc};
use util::{ResultExt, fs::remove_matching, maybe, merge_json_value_into};
@@ -42,9 +43,19 @@ impl super::LspAdapter for CLspAdapter {
async fn fetch_latest_server_version(
&self,
delegate: &dyn LspAdapterDelegate,
cx: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
let release =
latest_github_release("clangd/clangd", true, false, delegate.http_client()).await?;
let release = latest_github_release(
"clangd/clangd",
true,
ProjectSettings::try_read_global(cx, |s| {
s.lsp.get(&Self::SERVER_NAME)?.fetch.as_ref()?.pre_release
})
.flatten()
.unwrap_or(false),
delegate.http_client(),
)
.await?;
let os_suffix = match consts::OS {
"macos" => "mac",
"linux" => "linux",

View File

@@ -61,6 +61,7 @@ impl LspAdapter for CssLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Any + Send>> {
Ok(Box::new(
self.node

View File

@@ -59,6 +59,7 @@ impl super::LspAdapter for GoLspAdapter {
async fn fetch_latest_server_version(
&self,
delegate: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
let release =
latest_github_release("golang/tools", false, false, delegate.http_client()).await?;

View File

@@ -321,6 +321,7 @@ impl LspAdapter for JsonLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
Ok(Box::new(
self.node
@@ -494,6 +495,7 @@ impl LspAdapter for NodeVersionAdapter {
async fn fetch_latest_server_version(
&self,
delegate: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
let release = latest_github_release(
"zed-industries/package-version-server",

View File

@@ -157,6 +157,7 @@ impl LspAdapter for PythonLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Any + Send>> {
Ok(Box::new(
self.node
@@ -1111,6 +1112,7 @@ impl LspAdapter for PyLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Any + Send>> {
Ok(Box::new(()) as Box<_>)
}
@@ -1422,6 +1424,7 @@ impl LspAdapter for BasedPyrightLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Any + Send>> {
Ok(Box::new(()) as Box<_>)
}

View File

@@ -147,11 +147,16 @@ impl LspAdapter for RustLspAdapter {
async fn fetch_latest_server_version(
&self,
delegate: &dyn LspAdapterDelegate,
cx: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
let release = latest_github_release(
"rust-lang/rust-analyzer",
true,
false,
ProjectSettings::try_read_global(cx, |s| {
s.lsp.get(&SERVER_NAME)?.fetch.as_ref()?.pre_release
})
.flatten()
.unwrap_or(false),
delegate.http_client(),
)
.await?;

View File

@@ -66,6 +66,7 @@ impl LspAdapter for TailwindLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Any + Send>> {
Ok(Box::new(
self.node

View File

@@ -563,6 +563,7 @@ impl LspAdapter for TypeScriptLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
Ok(Box::new(TypeScriptVersions {
typescript_version: self.node.npm_package_latest_version("typescript").await?,
@@ -885,6 +886,7 @@ impl LspAdapter for EsLintLspAdapter {
async fn fetch_latest_server_version(
&self,
_delegate: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
let url = build_asset_url(
"zed-industries/vscode-eslint",

View File

@@ -73,6 +73,7 @@ impl LspAdapter for VtslsLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
Ok(Box::new(TypeScriptVersions {
typescript_version: self.node.npm_package_latest_version("typescript").await?,

View File

@@ -44,6 +44,7 @@ impl LspAdapter for YamlLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Any + Send>> {
Ok(Box::new(
self.node

View File

@@ -113,15 +113,10 @@ pub enum Event {
transaction_id: TransactionId,
},
Reloaded,
ReloadNeeded,
LanguageChanged(BufferId),
CapabilityChanged,
Reparsed(BufferId),
Saved,
FileHandleChanged,
Closed,
Discarded,
DirtyChanged,
DiagnosticsUpdated,
BufferDiffChanged,
@@ -2433,28 +2428,24 @@ impl MultiBuffer {
event: &language::BufferEvent,
cx: &mut Context<Self>,
) {
use language::BufferEvent;
cx.emit(match event {
language::BufferEvent::Edited => Event::Edited {
BufferEvent::Edited => Event::Edited {
singleton_buffer_edited: true,
edited_buffer: Some(buffer),
},
language::BufferEvent::DirtyChanged => Event::DirtyChanged,
language::BufferEvent::Saved => Event::Saved,
language::BufferEvent::FileHandleChanged => Event::FileHandleChanged,
language::BufferEvent::Reloaded => Event::Reloaded,
language::BufferEvent::ReloadNeeded => Event::ReloadNeeded,
language::BufferEvent::LanguageChanged => {
Event::LanguageChanged(buffer.read(cx).remote_id())
}
language::BufferEvent::Reparsed => Event::Reparsed(buffer.read(cx).remote_id()),
language::BufferEvent::DiagnosticsUpdated => Event::DiagnosticsUpdated,
language::BufferEvent::Closed => Event::Closed,
language::BufferEvent::Discarded => Event::Discarded,
language::BufferEvent::CapabilityChanged => {
BufferEvent::DirtyChanged => Event::DirtyChanged,
BufferEvent::Saved => Event::Saved,
BufferEvent::FileHandleChanged => Event::FileHandleChanged,
BufferEvent::Reloaded => Event::Reloaded,
BufferEvent::LanguageChanged => Event::LanguageChanged(buffer.read(cx).remote_id()),
BufferEvent::Reparsed => Event::Reparsed(buffer.read(cx).remote_id()),
BufferEvent::DiagnosticsUpdated => Event::DiagnosticsUpdated,
BufferEvent::CapabilityChanged => {
self.capability = buffer.read(cx).capability();
Event::CapabilityChanged
return;
}
language::BufferEvent::Operation { .. } => return,
BufferEvent::Operation { .. } | BufferEvent::ReloadNeeded => return,
});
}

View File

@@ -13070,6 +13070,7 @@ impl LspAdapter for SshLspAdapter {
async fn fetch_latest_server_version(
&self,
_: &dyn LspAdapterDelegate,
_: &AsyncApp,
) -> Result<Box<dyn 'static + Send + Any>> {
anyhow::bail!("SshLspAdapter does not support fetch_latest_server_version")
}

View File

@@ -516,6 +516,12 @@ pub struct BinarySettings {
pub ignore_system_version: Option<bool>,
}
#[derive(Clone, Debug, Default, Serialize, Deserialize, PartialEq, Eq, JsonSchema, Hash)]
pub struct FetchSettings {
/// Whether to consider pre-releases for fetching.
pub pre_release: Option<bool>,
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, Eq, JsonSchema, Hash)]
#[serde(rename_all = "snake_case")]
pub struct LspSettings {
@@ -527,6 +533,7 @@ pub struct LspSettings {
/// Default: true
#[serde(default = "default_true")]
pub enable_lsp_tasks: bool,
pub fetch: Option<FetchSettings>,
}
impl Default for LspSettings {
@@ -536,6 +543,7 @@ impl Default for LspSettings {
initialization_options: None,
settings: None,
enable_lsp_tasks: true,
fetch: None,
}
}
}

View File

@@ -541,7 +541,6 @@ pub trait ItemHandle: 'static + Send {
cx: &mut Context<Workspace>,
);
fn deactivated(&self, window: &mut Window, cx: &mut App);
fn discarded(&self, project: Entity<Project>, window: &mut Window, cx: &mut App);
fn on_removed(&self, cx: &App);
fn workspace_deactivated(&self, window: &mut Window, cx: &mut App);
fn navigate(&self, data: Box<dyn Any>, window: &mut Window, cx: &mut App) -> bool;
@@ -975,10 +974,6 @@ impl<T: Item> ItemHandle for Entity<T> {
});
}
fn discarded(&self, project: Entity<Project>, window: &mut Window, cx: &mut App) {
self.update(cx, |this, cx| this.discarded(project, window, cx));
}
fn deactivated(&self, window: &mut Window, cx: &mut App) {
self.update(cx, |this, cx| this.deactivated(window, cx));
}

View File

@@ -254,9 +254,6 @@ pub enum Event {
Remove {
focus_on_pane: Option<Entity<Pane>>,
},
RemoveItem {
idx: usize,
},
RemovedItem {
item: Box<dyn ItemHandle>,
},
@@ -287,7 +284,6 @@ impl fmt::Debug for Event {
.field("local", local)
.finish(),
Event::Remove { .. } => f.write_str("Remove"),
Event::RemoveItem { idx } => f.debug_struct("RemoveItem").field("idx", idx).finish(),
Event::RemovedItem { item } => f
.debug_struct("RemovedItem")
.field("item", &item.item_id())
@@ -2096,11 +2092,10 @@ impl Pane {
Ok(0) => {}
Ok(1) => {
// Don't save this file
pane.update_in(cx, |pane, window, cx| {
pane.update_in(cx, |pane, _, cx| {
if pane.is_tab_pinned(item_ix) && !item.can_save(cx) {
pane.pinned_tab_count -= 1;
}
item.discarded(project, window, cx)
})
.log_err();
return Ok(true);

View File

@@ -1030,7 +1030,6 @@ pub enum Event {
ItemAdded {
item: Box<dyn ItemHandle>,
},
ItemRemoved,
ActiveItemChanged,
UserSavedItem {
pane: WeakEntity<Pane>,
@@ -1046,7 +1045,6 @@ pub enum Event {
},
ZoomChanged,
ModalOpened,
ClearActivityIndicator,
}
#[derive(Debug)]
@@ -3939,7 +3937,6 @@ impl Workspace {
}
serialize_workspace = false;
}
pane::Event::RemoveItem { .. } => {}
pane::Event::RemovedItem { item } => {
cx.emit(Event::ActiveItemChanged);
self.update_window_edited(window, cx);

View File

@@ -1775,36 +1775,54 @@ impl LocalWorktree {
};
absolutize_path
};
let abs_path = abs_new_path.clone();
let fs = self.fs.clone();
let case_sensitive = self.fs_case_sensitive;
let rename = cx.background_spawn(async move {
let abs_old_path = abs_old_path?;
let abs_new_path = abs_new_path;
let abs_old_path_lower = abs_old_path.to_str().map(|p| p.to_lowercase());
let abs_new_path_lower = abs_new_path.to_str().map(|p| p.to_lowercase());
let fs = self.fs.clone();
let abs_path = abs_new_path.clone();
let case_sensitive = self.fs_case_sensitive;
let do_rename = async move |fs: &dyn Fs, old_path: &Path, new_path: &Path, overwrite| {
fs.rename(
&old_path,
&new_path,
fs::RenameOptions {
overwrite,
..fs::RenameOptions::default()
},
)
.await
.with_context(|| format!("renaming {old_path:?} into {new_path:?}"))
};
let rename_task = cx.background_spawn(async move {
let abs_old_path = abs_old_path?;
// If we're on a case-insensitive FS and we're doing a case-only rename (i.e. `foobar` to `FOOBAR`)
// we want to overwrite, because otherwise we run into a file-already-exists error.
let overwrite = !case_sensitive
&& abs_old_path != abs_new_path
&& abs_old_path_lower == abs_new_path_lower;
&& abs_old_path.to_str().map(|p| p.to_lowercase())
== abs_new_path.to_str().map(|p| p.to_lowercase());
fs.rename(
&abs_old_path,
&abs_new_path,
fs::RenameOptions {
overwrite,
..Default::default()
},
)
.await
.with_context(|| format!("Renaming {abs_old_path:?} into {abs_new_path:?}"))
// The directory we're renaming into might not exist yet
if let Err(e) = do_rename(fs.as_ref(), &abs_old_path, &abs_new_path, overwrite).await {
if let Some(err) = e.downcast_ref::<std::io::Error>()
&& err.kind() == std::io::ErrorKind::NotFound
{
if let Some(parent) = abs_new_path.parent() {
fs.create_dir(parent)
.await
.with_context(|| format!("creating parent directory {parent:?}"))?;
return do_rename(fs.as_ref(), &abs_old_path, &abs_new_path, overwrite)
.await;
}
}
return Err(e);
}
Ok(())
});
cx.spawn(async move |this, cx| {
rename.await?;
rename_task.await?;
Ok(this
.update(cx, |this, cx| {
let local = this.as_local_mut().unwrap();
@@ -1818,6 +1836,11 @@ impl LocalWorktree {
);
Task::ready(Ok(this.root_entry().cloned()))
} else {
// First refresh the parent directory (in case it was newly created)
if let Some(parent) = new_path.parent() {
let _ = local.refresh_entries_for_paths(vec![parent.into()]);
}
// Then refresh the new path
local.refresh_entry(new_path.clone(), Some(old_path), cx)
}
})?

View File

@@ -1924,6 +1924,101 @@ fn random_filename(rng: &mut impl Rng) -> String {
.collect()
}
#[gpui::test]
async fn test_rename_file_to_new_directory(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.background_executor.clone());
let expected_contents = "content";
fs.as_fake()
.insert_tree(
"/root",
json!({
"test.txt": expected_contents
}),
)
.await;
let worktree = Worktree::local(
Path::new("/root"),
true,
fs.clone(),
Arc::default(),
&mut cx.to_async(),
)
.await
.unwrap();
cx.read(|cx| worktree.read(cx).as_local().unwrap().scan_complete())
.await;
let entry_id = worktree.read_with(cx, |worktree, _| {
worktree.entry_for_path("test.txt").unwrap().id
});
let _result = worktree
.update(cx, |worktree, cx| {
worktree.rename_entry(entry_id, Path::new("dir1/dir2/dir3/test.txt"), cx)
})
.await
.unwrap();
worktree.read_with(cx, |worktree, _| {
assert!(
worktree.entry_for_path("test.txt").is_none(),
"Old file should have been removed"
);
assert!(
worktree.entry_for_path("dir1/dir2/dir3/test.txt").is_some(),
"Whole directory hierarchy and the new file should have been created"
);
});
assert_eq!(
worktree
.update(cx, |worktree, cx| {
worktree.load_file("dir1/dir2/dir3/test.txt".as_ref(), cx)
})
.await
.unwrap()
.text,
expected_contents,
"Moved file's contents should be preserved"
);
let entry_id = worktree.read_with(cx, |worktree, _| {
worktree
.entry_for_path("dir1/dir2/dir3/test.txt")
.unwrap()
.id
});
let _result = worktree
.update(cx, |worktree, cx| {
worktree.rename_entry(entry_id, Path::new("dir1/dir2/test.txt"), cx)
})
.await
.unwrap();
worktree.read_with(cx, |worktree, _| {
assert!(
worktree.entry_for_path("test.txt").is_none(),
"First file should not reappear"
);
assert!(
worktree.entry_for_path("dir1/dir2/dir3/test.txt").is_none(),
"Old file should have been removed"
);
assert!(
worktree.entry_for_path("dir1/dir2/test.txt").is_some(),
"No error should have occurred after moving into existing directory"
);
});
assert_eq!(
worktree
.update(cx, |worktree, cx| {
worktree.load_file("dir1/dir2/test.txt".as_ref(), cx)
})
.await
.unwrap()
.text,
expected_contents,
"Moved file's contents should be preserved"
);
}
#[gpui::test]
async fn test_private_single_file_worktree(cx: &mut TestAppContext) {
init_test(cx);

View File

@@ -435,21 +435,24 @@ To do it via your `settings.json`, add the following snippet under `language_mod
```json
{
  "language_models": {
    "openai_compatible": {
      // Using Together AI as an example
      "Together AI": {
        "api_url": "https://api.together.xyz/v1",
        "available_models": [
          {
            "name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "display_name": "Together Mixtral 8x7B",
            "max_tokens": 32768,
            "capabilities": {
              "tools": true,
              "images": false,
              "parallel_tool_calls": false,
              "prompt_cache_key": false
            }
          }
        ]
      }
    }
  }
}
```
@@ -463,7 +466,7 @@ By default, OpenAI-compatible models inherit the following capabilities:
- `prompt_cache_key`: false (does not support `prompt_cache_key` parameter)
Note that LLM API keys aren't stored in your settings file.
So, ensure it is set in your environment variables (`<PROVIDER_NAME>_API_KEY=<your api key>`) so your settings can pick it up. In the example above, it would be `TOGETHER_AI_API_KEY=<your api key>`.
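For instance, sticking with the "Together AI" provider name used in the example, you could export the key and verify it is visible to Zed's environment (replace the placeholder with your real key):

```shell
# Zed reads the key from an environment variable named after the provider;
# for the "Together AI" example that is TOGETHER_AI_API_KEY.
export TOGETHER_AI_API_KEY="your-api-key"
echo "$TOGETHER_AI_API_KEY"
```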
### OpenRouter {#openrouter}

View File

@@ -9,22 +9,23 @@ C++ support is available natively in Zed.
You can configure which `clangd` binary Zed should use.
By default, Zed will try to find `clangd` in your `$PATH` and use it if it executes successfully. Otherwise, Zed falls back to installing its own `clangd` version and using that.

If you want to install a pre-release `clangd` version instead, you can instruct Zed to do so by setting `pre_release` to `true` in your `settings.json`:
```json
{
  "lsp": {
    "clangd": {
      "fetch": {
        "pre_release": true
      }
    }
  }
}
```
If you want to disable Zed looking for a `clangd` binary, you can set `ignore_system_version` to `true` in your `settings.json`:
```json
{
@@ -38,6 +39,23 @@ If you want to disable Zed looking for a `clangd` binary, you can set `ignore_sy
}
```
If you want to use a binary in a custom location, you can specify a `path` and optional `arguments`:
```json
{
  "lsp": {
    "clangd": {
      "binary": {
        "path": "/path/to/clangd",
        "arguments": []
      }
    }
  }
}
```
This `"path"` has to be an absolute path.
## Arguments
You can pass any number of arguments to `clangd`. To see the full set of available options, run `clangd --help` from the command line. For example, with `--function-arg-placeholders=0`, completions contain only parentheses for function calls, while with the default (`--function-arg-placeholders=1`), completions also contain placeholders for method parameters.
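For example, pairing the custom `binary` settings shown earlier with that flag:

```json
{
  "lsp": {
    "clangd": {
      "binary": {
        "path": "/path/to/clangd",
        "arguments": ["--function-arg-placeholders=0"]
      }
    }
  }
}
```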

View File

@@ -63,7 +63,21 @@ A `true` setting will set the target directory to `target/rust-analyzer`. You ca
You can configure which `rust-analyzer` binary Zed should use.

By default, Zed will try to find a `rust-analyzer` in your `$PATH` and use it if it successfully executes `rust-analyzer --help`. Otherwise, Zed falls back to installing its own stable `rust-analyzer` version and using that.

If you want to install a pre-release `rust-analyzer` version instead, you can instruct Zed to do so by setting `pre_release` to `true` in your `settings.json`:
```json
{
  "lsp": {
    "rust-analyzer": {
      "fetch": {
        "pre_release": true
      }
    }
  }
}
```
If you want to disable Zed looking for a `rust-analyzer` binary, you can set `ignore_system_version` to `true` in your `settings.json`: