Compare commits


14 Commits

Author SHA1 Message Date
Bennet Bo Fenner
854b9b85a3 Remove unused dependencies 2025-05-12 17:53:58 +02:00
Bennet Bo Fenner
84efe57c00 Adjust default settings 2025-05-12 17:49:37 +02:00
Bennet Bo Fenner
d34e501b38 Remove language model settings version 2025-05-12 17:49:31 +02:00
Bennet Bo Fenner
58cf69704d Remove tests 2025-05-12 17:40:25 +02:00
Bennet Bo Fenner
59cbc05698 agent: Remove deprecated settings handling 2025-05-12 17:35:58 +02:00
Bennet Bo Fenner
3173f87dc3 agent: Restore find path tool card after restart (#30580)
Release Notes:

- N/A
2025-05-12 17:11:40 +02:00
Ron Harel
6592314984 editor: Trim indent guides at last non-empty line (#29482)
Closes #26274

Adjust the end position of indent guides to prevent them from extending
through empty space.
Also corrected old test values that appeared to have adapted to the
previous indentation behavior.
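
A minimal sketch of the trimming idea (illustrative only; the helper name and
shapes are assumed, not Zed's actual editor API):

```rust
// Clamp an indent guide's end row so the guide stops at the last
// non-empty line instead of running through trailing blank lines.
fn trim_guide_end(lines: &[&str], guide_end_row: usize) -> usize {
    let last_non_empty = lines
        .iter()
        .rposition(|line| !line.trim().is_empty())
        .unwrap_or(0);
    guide_end_row.min(last_non_empty)
}

fn main() {
    let lines = ["    a", "    b", "", ""];
    // A guide that nominally ends at row 3 is trimmed back to row 1,
    // the last non-empty line.
    assert_eq!(trim_guide_end(&lines, 3), 1);
}
```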

Release Notes:

- Fixed indentation guides extending beyond the final scope in a file.
2025-05-12 17:04:46 +02:00
THELOSTSOUL
93b6fdb8e5 assistant_tools: Make terminal tool work on Windows (#30497)
Release Notes:

- N/A
2025-05-12 23:02:33 +08:00
Bennet Bo Fenner
e79d1b27b1 agent: Restore web search tool card after restart (#30578)
Release Notes:

- N/A
2025-05-12 14:59:38 +00:00
Smit Barmase
1a0eedb787 Fix migrate banner not showing markdown on file changes (#30575)
Fixes a case where, on file (settings/keymap) changes, the banner would
appear but its markdown content was not visible.
Regression caused by the refactor in
https://github.com/zed-industries/zed/pull/30456.

Release Notes:

- N/A
2025-05-12 20:16:18 +05:30
Umesh Yadav
8db0333b04 Fix out-of-bounds panic in fuzzy matcher with Unicode/multibyte characters (#30546)
This PR fixes a crash in the fuzzy matcher that occurred when handling
Unicode or multibyte characters (such as Turkish `İ` or `ş`). The issue
was caused by the matcher attempting to index beyond the end of internal
arrays when lowercased Unicode characters expanded into multiple
codepoints, resulting in an out-of-bounds panic.

#### Root Cause

The loop in `recursive_score_match` used an upper bound (`limit`)
derived from `self.last_positions[query_idx]`, which could exceed the
actual length of the arrays being indexed, especially with multibyte
Unicode input.

#### Solution

The fix clamps the loop’s upper bound to the maximum valid index for the
arrays being accessed:
```rust
let max_valid_index = (prefix.len() + path_lowercased.len()).saturating_sub(1);
let safe_limit = limit.min(max_valid_index);
for j in path_idx..=safe_limit { ... }
```
This ensures all indexing is safe and prevents panics.
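
For context on why the bound could exceed the array length: in Rust,
lowercasing certain characters yields more than one codepoint, so index
bookkeeping based on the original text can overrun arrays built from the
lowercased text. A standalone demonstration:

```rust
fn main() {
    // Turkish dotted capital I (U+0130) lowercases to "i" followed by a
    // combining dot above (U+0307): one char becomes two.
    let lowered: Vec<char> = 'İ'.to_lowercase().collect();
    assert_eq!(lowered, vec!['i', '\u{0307}']);
    assert_eq!(lowered.len(), 2);
}
```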

Closes #30269 

Release Notes:

- N/A

---------

Signed-off-by: Umesh Yadav <git@umesh.dev>
2025-05-12 14:43:14 +00:00
Danilo Leal
a13c8b70dd docs: Update the Text Threads page (#30576)
We had some broken links and outdated content here.

Release Notes:

- N/A
2025-05-12 11:23:50 -03:00
Danilo Leal
ddc649bdb8 agent: Don't rely only on color to communicate MCP server status (#30573)
The MCP server item in the settings view has an indicator that used to
only use colors to communicate the connection status. From an
accessibility standpoint, relying on just colors is never a good idea;
there should always be a supporting element that complements color for
communicating a given state. In this case, I added a tooltip that appears
when you hover over the indicator dot and clearly spells out the status.
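
A minimal sketch of the pattern (plain Rust with illustrative types, not
Zed's actual gpui API; the status strings match the ones in the diff below):

```rust
// Derive both a color and human-readable tooltip text from the same status,
// so color is never the only signal.
#[derive(Clone, Copy)]
enum ServerStatus { Starting, Running, Error, Stopped }

fn indicator(status: ServerStatus) -> (&'static str, &'static str) {
    match status {
        ServerStatus::Starting => ("green, pulsing", "Server is starting."),
        ServerStatus::Running => ("green", "Server is running."),
        ServerStatus::Error => ("red", "Server has an error."),
        ServerStatus::Stopped => ("gray", "Server is stopped."),
    }
}

fn main() {
    let (color, tooltip) = indicator(ServerStatus::Running);
    println!("dot: {color}; tooltip: {tooltip}");
}
```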

Release Notes:

- agent: Improved clarity of MCP server connection status in the
Settings view.
2025-05-12 11:21:22 -03:00
张小白
33c896c23d windows: Fix ctrl-click open hovered URL (#30574)
Closes #30452

Release Notes:

- N/A
2025-05-12 22:15:50 +08:00
25 changed files with 359 additions and 917 deletions

Cargo.lock (generated)
View File

@@ -539,7 +539,6 @@ dependencies = [
"indexmap",
"language_model",
"lmstudio",
"log",
"ollama",
"open_ai",
"paths",
@@ -7816,7 +7815,6 @@ dependencies = [
"deepseek",
"editor",
"feature_flags",
"fs",
"futures 0.3.31",
"google_ai",
"gpui",

View File

@@ -700,8 +700,6 @@
"default_width": 380
},
"agent": {
// Version of this setting.
"version": "2",
// Whether the agent is enabled.
"enabled": true,
/// What completion mode to start new threads in, if available. Can be 'normal' or 'max'.
@@ -1560,7 +1558,6 @@
// Different settings for specific language models.
"language_models": {
"anthropic": {
"version": "1",
"api_url": "https://api.anthropic.com"
},
"google": {
@@ -1570,7 +1567,6 @@
"api_url": "http://localhost:11434"
},
"openai": {
"version": "1",
"api_url": "https://api.openai.com/v1"
},
"lmstudio": {

View File

@@ -422,6 +422,7 @@ impl AgentConfiguration {
.unwrap_or(ContextServerStatus::Stopped);
let is_running = matches!(server_status, ContextServerStatus::Running);
let item_id = SharedString::from(context_server_id.0.clone());
let error = if let ContextServerStatus::Error(error) = server_status.clone() {
Some(error)
@@ -443,9 +444,38 @@ impl AgentConfiguration {
let tool_count = tools.len();
let border_color = cx.theme().colors().border.opacity(0.6);
let success_color = Color::Success.color(cx);
let (status_indicator, tooltip_text) = match server_status {
ContextServerStatus::Starting => (
Indicator::dot()
.color(Color::Success)
.with_animation(
SharedString::from(format!("{}-starting", context_server_id.0.clone(),)),
Animation::new(Duration::from_secs(2))
.repeat()
.with_easing(pulsating_between(0.4, 1.)),
move |this, delta| this.color(success_color.alpha(delta).into()),
)
.into_any_element(),
"Server is starting.",
),
ContextServerStatus::Running => (
Indicator::dot().color(Color::Success).into_any_element(),
"Server is running.",
),
ContextServerStatus::Error(_) => (
Indicator::dot().color(Color::Error).into_any_element(),
"Server has an error.",
),
ContextServerStatus::Stopped => (
Indicator::dot().color(Color::Muted).into_any_element(),
"Server is stopped.",
),
};
v_flex()
.id(SharedString::from(context_server_id.0.clone()))
.id(item_id.clone())
.border_1()
.rounded_md()
.border_color(border_color)
@@ -480,35 +510,12 @@ impl AgentConfiguration {
}
})),
)
.child(match server_status {
ContextServerStatus::Starting => {
let color = Color::Success.color(cx);
Indicator::dot()
.color(Color::Success)
.with_animation(
SharedString::from(format!(
"{}-starting",
context_server_id.0.clone(),
)),
Animation::new(Duration::from_secs(2))
.repeat()
.with_easing(pulsating_between(0.4, 1.)),
move |this, delta| {
this.color(color.alpha(delta).into())
},
)
.into_any_element()
}
ContextServerStatus::Running => {
Indicator::dot().color(Color::Success).into_any_element()
}
ContextServerStatus::Error(_) => {
Indicator::dot().color(Color::Error).into_any_element()
}
ContextServerStatus::Stopped => {
Indicator::dot().color(Color::Muted).into_any_element()
}
})
.child(
div()
.id(item_id.clone())
.tooltip(Tooltip::text(tooltip_text))
.child(status_indicator),
)
.child(Label::new(context_server_id.0.clone()).ml_0p5())
.when(is_running, |this| {
this.child(

View File

@@ -274,42 +274,35 @@ impl PickerDelegate for ToolPickerDelegate {
let server_id = server_id.clone();
let tool_name = tool_name.clone();
move |settings: &mut AssistantSettingsContent, _cx| {
settings
.v2_setting(|v2_settings| {
let profiles = v2_settings.profiles.get_or_insert_default();
let profile =
profiles
.entry(profile_id)
.or_insert_with(|| AgentProfileContent {
name: default_profile.name.into(),
tools: default_profile.tools,
enable_all_context_servers: Some(
default_profile.enable_all_context_servers,
),
context_servers: default_profile
.context_servers
.into_iter()
.map(|(server_id, preset)| {
(
server_id,
ContextServerPresetContent {
tools: preset.tools,
},
)
})
.collect(),
});
let profiles = settings.profiles.get_or_insert_default();
let profile = profiles
.entry(profile_id)
.or_insert_with(|| AgentProfileContent {
name: default_profile.name.into(),
tools: default_profile.tools,
enable_all_context_servers: Some(
default_profile.enable_all_context_servers,
),
context_servers: default_profile
.context_servers
.into_iter()
.map(|(server_id, preset)| {
(
server_id,
ContextServerPresetContent {
tools: preset.tools,
},
)
})
.collect(),
});
if let Some(server_id) = server_id {
let preset = profile.context_servers.entry(server_id).or_default();
*preset.tools.entry(tool_name).or_default() = !is_currently_enabled;
} else {
*profile.tools.entry(tool_name).or_default() = !is_currently_enabled;
}
Ok(())
})
.ok();
if let Some(server_id) = server_id {
let preset = profile.context_servers.entry(server_id).or_default();
*preset.tools.entry(tool_name).or_default() = !is_currently_enabled;
} else {
*profile.tools.entry(tool_name).or_default() = !is_currently_enabled;
}
}
});
}

View File

@@ -844,6 +844,7 @@ pub fn load_context(
let load_results = future::join_all(load_tasks).await;
let mut contexts = Vec::new();
let mut text = String::new();
let mut referenced_buffers = HashSet::default();
for context in load_results {
let Some((context, buffers)) = context else {
@@ -862,18 +863,10 @@ pub fn load_context(
let mut text_thread_context = Vec::new();
let mut rules_context = Vec::new();
let mut images = Vec::new();
let mut loaded_files = Vec::new();
let mut loaded_dirs = Vec::new();
for context in &contexts {
match context {
AgentContext::File(context) => {
file_context.push(context);
loaded_files.push(context.full_path.clone());
}
AgentContext::Directory(context) => {
directory_context.push(context);
loaded_dirs.push(context.full_path.clone());
}
AgentContext::File(context) => file_context.push(context),
AgentContext::Directory(context) => directory_context.push(context),
AgentContext::Symbol(context) => symbol_context.push(context),
AgentContext::Selection(context) => selection_context.push(context),
AgentContext::FetchedUrl(context) => fetched_url_context.push(context),
@@ -898,17 +891,18 @@ pub fn load_context(
return ContextLoadResult {
loaded_context: LoadedContext {
contexts,
text: String::new(),
text,
images,
},
referenced_buffers,
};
}
let mut text = "\n<context>\n\
text.push_str(
"\n<context>\n\
The following items were attached by the user. \
They are up-to-date and don't need to be re-read.\n\n"
.to_string();
They are up-to-date and don't need to be re-read.\n\n",
);
if !file_context.is_empty() {
text.push_str("<files>");

View File

@@ -19,7 +19,6 @@ gpui.workspace = true
indexmap.workspace = true
language_model.workspace = true
lmstudio = { workspace = true, features = ["schemars"] }
log.workspace = true
ollama = { workspace = true, features = ["schemars"] }
open_ai = { workspace = true, features = ["schemars"] }
deepseek = { workspace = true, features = ["schemars"] }

View File

@@ -84,7 +84,6 @@ pub struct AssistantSettings {
pub commit_message_model: Option<LanguageModelSelection>,
pub thread_summary_model: Option<LanguageModelSelection>,
pub inline_alternatives: Vec<LanguageModelSelection>,
pub using_outdated_settings_version: bool,
pub default_profile: AgentProfileId,
pub profiles: IndexMap<AgentProfileId, AgentProfile>,
pub always_allow_tool_actions: bool,
@@ -150,358 +149,52 @@ impl LanguageModelParameters {
}
}
/// Assistant panel settings
#[derive(Clone, Serialize, Deserialize, Debug, Default)]
pub struct AssistantSettingsContent {
#[serde(flatten)]
pub inner: Option<AssistantSettingsContentInner>,
}
#[derive(Clone, Serialize, Deserialize, Debug)]
#[serde(untagged)]
pub enum AssistantSettingsContentInner {
Versioned(Box<VersionedAssistantSettingsContent>),
Legacy(LegacyAssistantSettingsContent),
}
impl AssistantSettingsContentInner {
fn for_v2(content: AssistantSettingsContentV2) -> Self {
AssistantSettingsContentInner::Versioned(Box::new(VersionedAssistantSettingsContent::V2(
content,
)))
}
}
impl JsonSchema for AssistantSettingsContent {
fn schema_name() -> String {
VersionedAssistantSettingsContent::schema_name()
}
fn json_schema(r#gen: &mut schemars::r#gen::SchemaGenerator) -> Schema {
VersionedAssistantSettingsContent::json_schema(r#gen)
}
fn is_referenceable() -> bool {
VersionedAssistantSettingsContent::is_referenceable()
}
}
impl AssistantSettingsContent {
pub fn is_version_outdated(&self) -> bool {
match &self.inner {
Some(AssistantSettingsContentInner::Versioned(settings)) => match **settings {
VersionedAssistantSettingsContent::V1(_) => true,
VersionedAssistantSettingsContent::V2(_) => false,
},
Some(AssistantSettingsContentInner::Legacy(_)) => true,
None => false,
}
}
fn upgrade(&self) -> AssistantSettingsContentV2 {
match &self.inner {
Some(AssistantSettingsContentInner::Versioned(settings)) => match **settings {
VersionedAssistantSettingsContent::V1(ref settings) => AssistantSettingsContentV2 {
enabled: settings.enabled,
button: settings.button,
dock: settings.dock,
default_width: settings.default_width,
default_height: settings.default_width,
default_model: settings
.provider
.clone()
.and_then(|provider| match provider {
AssistantProviderContentV1::ZedDotDev { default_model } => {
default_model.map(|model| LanguageModelSelection {
provider: "zed.dev".into(),
model: model.id().to_string(),
})
}
AssistantProviderContentV1::OpenAi { default_model, .. } => {
default_model.map(|model| LanguageModelSelection {
provider: "openai".into(),
model: model.id().to_string(),
})
}
AssistantProviderContentV1::Anthropic { default_model, .. } => {
default_model.map(|model| LanguageModelSelection {
provider: "anthropic".into(),
model: model.id().to_string(),
})
}
AssistantProviderContentV1::Ollama { default_model, .. } => {
default_model.map(|model| LanguageModelSelection {
provider: "ollama".into(),
model: model.id().to_string(),
})
}
AssistantProviderContentV1::LmStudio { default_model, .. } => {
default_model.map(|model| LanguageModelSelection {
provider: "lmstudio".into(),
model: model.id().to_string(),
})
}
AssistantProviderContentV1::DeepSeek { default_model, .. } => {
default_model.map(|model| LanguageModelSelection {
provider: "deepseek".into(),
model: model.id().to_string(),
})
}
}),
inline_assistant_model: None,
commit_message_model: None,
thread_summary_model: None,
inline_alternatives: None,
default_profile: None,
profiles: None,
always_allow_tool_actions: None,
notify_when_agent_waiting: None,
stream_edits: None,
single_file_review: None,
model_parameters: Vec::new(),
preferred_completion_mode: None,
},
VersionedAssistantSettingsContent::V2(ref settings) => settings.clone(),
},
Some(AssistantSettingsContentInner::Legacy(settings)) => AssistantSettingsContentV2 {
enabled: None,
button: settings.button,
dock: settings.dock,
default_width: settings.default_width,
default_height: settings.default_height,
default_model: Some(LanguageModelSelection {
provider: "openai".into(),
model: settings
.default_open_ai_model
.clone()
.unwrap_or_default()
.id()
.to_string(),
}),
inline_assistant_model: None,
commit_message_model: None,
thread_summary_model: None,
inline_alternatives: None,
default_profile: None,
profiles: None,
always_allow_tool_actions: None,
notify_when_agent_waiting: None,
stream_edits: None,
single_file_review: None,
model_parameters: Vec::new(),
preferred_completion_mode: None,
},
None => AssistantSettingsContentV2::default(),
}
}
pub fn set_dock(&mut self, dock: AssistantDockPosition) {
match &mut self.inner {
Some(AssistantSettingsContentInner::Versioned(settings)) => match **settings {
VersionedAssistantSettingsContent::V1(ref mut settings) => {
settings.dock = Some(dock);
}
VersionedAssistantSettingsContent::V2(ref mut settings) => {
settings.dock = Some(dock);
}
},
Some(AssistantSettingsContentInner::Legacy(settings)) => {
settings.dock = Some(dock);
}
None => {
self.inner = Some(AssistantSettingsContentInner::for_v2(
AssistantSettingsContentV2 {
dock: Some(dock),
..Default::default()
},
))
}
}
self.dock = Some(dock);
}
pub fn set_model(&mut self, language_model: Arc<dyn LanguageModel>) {
let model = language_model.id().0.to_string();
let provider = language_model.provider_id().0.to_string();
match &mut self.inner {
Some(AssistantSettingsContentInner::Versioned(settings)) => match **settings {
VersionedAssistantSettingsContent::V1(ref mut settings) => {
match provider.as_ref() {
"zed.dev" => {
log::warn!("attempted to set zed.dev model on outdated settings");
}
"anthropic" => {
let api_url = match &settings.provider {
Some(AssistantProviderContentV1::Anthropic { api_url, .. }) => {
api_url.clone()
}
_ => None,
};
settings.provider = Some(AssistantProviderContentV1::Anthropic {
default_model: AnthropicModel::from_id(&model).ok(),
api_url,
});
}
"ollama" => {
let api_url = match &settings.provider {
Some(AssistantProviderContentV1::Ollama { api_url, .. }) => {
api_url.clone()
}
_ => None,
};
settings.provider = Some(AssistantProviderContentV1::Ollama {
default_model: Some(ollama::Model::new(
&model,
None,
None,
Some(language_model.supports_tools()),
)),
api_url,
});
}
"lmstudio" => {
let api_url = match &settings.provider {
Some(AssistantProviderContentV1::LmStudio { api_url, .. }) => {
api_url.clone()
}
_ => None,
};
settings.provider = Some(AssistantProviderContentV1::LmStudio {
default_model: Some(lmstudio::Model::new(&model, None, None)),
api_url,
});
}
"openai" => {
let (api_url, available_models) = match &settings.provider {
Some(AssistantProviderContentV1::OpenAi {
api_url,
available_models,
..
}) => (api_url.clone(), available_models.clone()),
_ => (None, None),
};
settings.provider = Some(AssistantProviderContentV1::OpenAi {
default_model: OpenAiModel::from_id(&model).ok(),
api_url,
available_models,
});
}
"deepseek" => {
let api_url = match &settings.provider {
Some(AssistantProviderContentV1::DeepSeek { api_url, .. }) => {
api_url.clone()
}
_ => None,
};
settings.provider = Some(AssistantProviderContentV1::DeepSeek {
default_model: DeepseekModel::from_id(&model).ok(),
api_url,
});
}
_ => {}
}
}
VersionedAssistantSettingsContent::V2(ref mut settings) => {
settings.default_model = Some(LanguageModelSelection {
provider: provider.into(),
model,
});
}
},
Some(AssistantSettingsContentInner::Legacy(settings)) => {
if let Ok(model) = OpenAiModel::from_id(&language_model.id().0) {
settings.default_open_ai_model = Some(model);
}
}
None => {
self.inner = Some(AssistantSettingsContentInner::for_v2(
AssistantSettingsContentV2 {
default_model: Some(LanguageModelSelection {
provider: provider.into(),
model,
}),
..Default::default()
},
));
}
}
self.default_model = Some(LanguageModelSelection {
provider: provider.into(),
model,
});
}
pub fn set_inline_assistant_model(&mut self, provider: String, model: String) {
self.v2_setting(|setting| {
setting.inline_assistant_model = Some(LanguageModelSelection {
provider: provider.into(),
model,
});
Ok(())
})
.ok();
self.inline_assistant_model = Some(LanguageModelSelection {
provider: provider.into(),
model,
});
}
pub fn set_commit_message_model(&mut self, provider: String, model: String) {
self.v2_setting(|setting| {
setting.commit_message_model = Some(LanguageModelSelection {
provider: provider.into(),
model,
});
Ok(())
})
.ok();
}
pub fn v2_setting(
&mut self,
f: impl FnOnce(&mut AssistantSettingsContentV2) -> anyhow::Result<()>,
) -> anyhow::Result<()> {
match self.inner.get_or_insert_with(|| {
AssistantSettingsContentInner::for_v2(AssistantSettingsContentV2 {
..Default::default()
})
}) {
AssistantSettingsContentInner::Versioned(boxed) => {
if let VersionedAssistantSettingsContent::V2(ref mut settings) = **boxed {
f(settings)
} else {
Ok(())
}
}
_ => Ok(()),
}
self.commit_message_model = Some(LanguageModelSelection {
provider: provider.into(),
model,
});
}
pub fn set_thread_summary_model(&mut self, provider: String, model: String) {
self.v2_setting(|setting| {
setting.thread_summary_model = Some(LanguageModelSelection {
provider: provider.into(),
model,
});
Ok(())
})
.ok();
self.thread_summary_model = Some(LanguageModelSelection {
provider: provider.into(),
model,
});
}
pub fn set_always_allow_tool_actions(&mut self, allow: bool) {
self.v2_setting(|setting| {
setting.always_allow_tool_actions = Some(allow);
Ok(())
})
.ok();
self.always_allow_tool_actions = Some(allow);
}
pub fn set_single_file_review(&mut self, allow: bool) {
self.v2_setting(|setting| {
setting.single_file_review = Some(allow);
Ok(())
})
.ok();
self.single_file_review = Some(allow);
}
pub fn set_profile(&mut self, profile_id: AgentProfileId) {
self.v2_setting(|setting| {
setting.default_profile = Some(profile_id);
Ok(())
})
.ok();
self.default_profile = Some(profile_id);
}
pub fn create_profile(
@@ -509,74 +202,38 @@ impl AssistantSettingsContent {
profile_id: AgentProfileId,
profile: AgentProfile,
) -> Result<()> {
self.v2_setting(|settings| {
let profiles = settings.profiles.get_or_insert_default();
if profiles.contains_key(&profile_id) {
bail!("profile with ID '{profile_id}' already exists");
}
let profiles = self.profiles.get_or_insert_default();
if profiles.contains_key(&profile_id) {
bail!("profile with ID '{profile_id}' already exists");
}
profiles.insert(
profile_id,
AgentProfileContent {
name: profile.name.into(),
tools: profile.tools,
enable_all_context_servers: Some(profile.enable_all_context_servers),
context_servers: profile
.context_servers
.into_iter()
.map(|(server_id, preset)| {
(
server_id,
ContextServerPresetContent {
tools: preset.tools,
},
)
})
.collect(),
},
);
profiles.insert(
profile_id,
AgentProfileContent {
name: profile.name.into(),
tools: profile.tools,
enable_all_context_servers: Some(profile.enable_all_context_servers),
context_servers: profile
.context_servers
.into_iter()
.map(|(server_id, preset)| {
(
server_id,
ContextServerPresetContent {
tools: preset.tools,
},
)
})
.collect(),
},
);
Ok(())
})
}
}
#[derive(Clone, Serialize, Deserialize, JsonSchema, Debug)]
#[serde(tag = "version")]
pub enum VersionedAssistantSettingsContent {
#[serde(rename = "1")]
V1(AssistantSettingsContentV1),
#[serde(rename = "2")]
V2(AssistantSettingsContentV2),
}
impl Default for VersionedAssistantSettingsContent {
fn default() -> Self {
Self::V2(AssistantSettingsContentV2 {
enabled: None,
button: None,
dock: None,
default_width: None,
default_height: None,
default_model: None,
inline_assistant_model: None,
commit_message_model: None,
thread_summary_model: None,
inline_alternatives: None,
default_profile: None,
profiles: None,
always_allow_tool_actions: None,
notify_when_agent_waiting: None,
stream_edits: None,
single_file_review: None,
model_parameters: Vec::new(),
preferred_completion_mode: None,
})
Ok(())
}
}
#[derive(Clone, Serialize, Deserialize, JsonSchema, Debug, Default)]
pub struct AssistantSettingsContentV2 {
pub struct AssistantSettingsContent {
/// Whether the Assistant is enabled.
///
/// Default: true
@@ -806,11 +463,6 @@ impl Settings for AssistantSettings {
let mut settings = AssistantSettings::default();
for value in sources.defaults_and_customizations() {
if value.is_version_outdated() {
settings.using_outdated_settings_version = true;
}
let value = value.upgrade();
merge(&mut settings.enabled, value.enabled);
merge(&mut settings.button, value.button);
merge(&mut settings.dock, value.dock);
@@ -822,17 +474,23 @@ impl Settings for AssistantSettings {
&mut settings.default_height,
value.default_height.map(Into::into),
);
merge(&mut settings.default_model, value.default_model);
merge(&mut settings.default_model, value.default_model.clone());
settings.inline_assistant_model = value
.inline_assistant_model
.clone()
.or(settings.inline_assistant_model.take());
settings.commit_message_model = value
.commit_message_model
.clone()
.or(settings.commit_message_model.take());
settings.thread_summary_model = value
.thread_summary_model
.clone()
.or(settings.thread_summary_model.take());
merge(&mut settings.inline_alternatives, value.inline_alternatives);
merge(
&mut settings.inline_alternatives,
value.inline_alternatives.clone(),
);
merge(
&mut settings.always_allow_tool_actions,
value.always_allow_tool_actions,
@@ -843,7 +501,7 @@ impl Settings for AssistantSettings {
);
merge(&mut settings.stream_edits, value.stream_edits);
merge(&mut settings.single_file_review, value.single_file_review);
merge(&mut settings.default_profile, value.default_profile);
merge(&mut settings.default_profile, value.default_profile.clone());
merge(
&mut settings.preferred_completion_mode,
value.preferred_completion_mode,
@@ -853,7 +511,7 @@ impl Settings for AssistantSettings {
.model_parameters
.extend_from_slice(&value.model_parameters);
if let Some(profiles) = value.profiles {
if let Some(profiles) = value.profiles.clone() {
settings
.profiles
.extend(profiles.into_iter().map(|(id, profile)| {
@@ -891,31 +549,8 @@ impl Settings for AssistantSettings {
.read_value("chat.agent.enabled")
.and_then(|b| b.as_bool())
{
match &mut current.inner {
Some(AssistantSettingsContentInner::Versioned(versioned)) => {
match versioned.as_mut() {
VersionedAssistantSettingsContent::V1(setting) => {
setting.enabled = Some(b);
setting.button = Some(b);
}
VersionedAssistantSettingsContent::V2(setting) => {
setting.enabled = Some(b);
setting.button = Some(b);
}
}
}
Some(AssistantSettingsContentInner::Legacy(setting)) => setting.button = Some(b),
None => {
current.inner = Some(AssistantSettingsContentInner::for_v2(
AssistantSettingsContentV2 {
enabled: Some(b),
button: Some(b),
..Default::default()
},
));
}
}
current.enabled = Some(b);
current.button = Some(b);
}
}
}
@@ -925,150 +560,3 @@ fn merge<T>(target: &mut T, value: Option<T>) {
*target = value;
}
}
#[cfg(test)]
mod tests {
use fs::Fs;
use gpui::{ReadGlobal, TestAppContext};
use settings::SettingsStore;
use super::*;
#[gpui::test]
async fn test_deserialize_assistant_settings_with_version(cx: &mut TestAppContext) {
let fs = fs::FakeFs::new(cx.executor().clone());
fs.create_dir(paths::settings_file().parent().unwrap())
.await
.unwrap();
cx.update(|cx| {
let test_settings = settings::SettingsStore::test(cx);
cx.set_global(test_settings);
AssistantSettings::register(cx);
});
cx.update(|cx| {
assert!(!AssistantSettings::get_global(cx).using_outdated_settings_version);
assert_eq!(
AssistantSettings::get_global(cx).default_model,
LanguageModelSelection {
provider: "zed.dev".into(),
model: "claude-3-7-sonnet-latest".into(),
}
);
});
cx.update(|cx| {
settings::SettingsStore::global(cx).update_settings_file::<AssistantSettings>(
fs.clone(),
|settings, _| {
*settings = AssistantSettingsContent {
inner: Some(AssistantSettingsContentInner::for_v2(
AssistantSettingsContentV2 {
default_model: Some(LanguageModelSelection {
provider: "test-provider".into(),
model: "gpt-99".into(),
}),
inline_assistant_model: None,
commit_message_model: None,
thread_summary_model: None,
inline_alternatives: None,
enabled: None,
button: None,
dock: None,
default_width: None,
default_height: None,
default_profile: None,
profiles: None,
always_allow_tool_actions: None,
notify_when_agent_waiting: None,
stream_edits: None,
single_file_review: None,
model_parameters: Vec::new(),
preferred_completion_mode: None,
},
)),
}
},
);
});
cx.run_until_parked();
let raw_settings_value = fs.load(paths::settings_file()).await.unwrap();
assert!(raw_settings_value.contains(r#""version": "2""#));
#[derive(Debug, Deserialize)]
struct AssistantSettingsTest {
agent: AssistantSettingsContent,
}
let assistant_settings: AssistantSettingsTest =
serde_json_lenient::from_str(&raw_settings_value).unwrap();
assert!(!assistant_settings.agent.is_version_outdated());
}
#[gpui::test]
async fn test_load_settings_from_old_key(cx: &mut TestAppContext) {
let fs = fs::FakeFs::new(cx.executor().clone());
fs.create_dir(paths::settings_file().parent().unwrap())
.await
.unwrap();
cx.update(|cx| {
let mut test_settings = settings::SettingsStore::test(cx);
let user_settings_content = r#"{
"assistant": {
"enabled": true,
"version": "2",
"default_model": {
"provider": "zed.dev",
"model": "gpt-99"
},
}}"#;
test_settings
.set_user_settings(user_settings_content, cx)
.unwrap();
cx.set_global(test_settings);
AssistantSettings::register(cx);
});
cx.run_until_parked();
let assistant_settings = cx.update(|cx| AssistantSettings::get_global(cx).clone());
assert!(assistant_settings.enabled);
assert!(!assistant_settings.using_outdated_settings_version);
assert_eq!(assistant_settings.default_model.model, "gpt-99");
cx.update_global::<SettingsStore, _>(|settings_store, cx| {
settings_store.update_user_settings::<AssistantSettings>(cx, |settings| {
*settings = AssistantSettingsContent {
inner: Some(AssistantSettingsContentInner::for_v2(
AssistantSettingsContentV2 {
enabled: Some(false),
default_model: Some(LanguageModelSelection {
provider: "xai".to_owned().into(),
model: "grok".to_owned(),
}),
..Default::default()
},
)),
};
});
});
cx.run_until_parked();
let settings = cx.update(|cx| SettingsStore::global(cx).raw_user_settings().clone());
#[derive(Debug, Deserialize)]
struct AssistantSettingsTest {
assistant: AssistantSettingsContent,
agent: Option<serde_json_lenient::Value>,
}
let assistant_settings: AssistantSettingsTest = serde_json::from_value(settings).unwrap();
assert!(assistant_settings.agent.is_none());
}
}

View File

@@ -1245,7 +1245,7 @@ impl EditAgentTest {
Project::init_settings(cx);
language::init(cx);
language_model::init(client.clone(), cx);
language_models::init(user_store.clone(), client.clone(), fs.clone(), cx);
language_models::init(user_store.clone(), client.clone(), cx);
crate::init(client.http_client(), cx);
});

View File

@@ -1,6 +1,6 @@
use crate::{schema::json_schema_for, ui::ToolCallCardHeader};
use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, Tool, ToolCard, ToolResult, ToolUseStatus};
use assistant_tool::{ActionLog, Tool, ToolCard, ToolResult, ToolResultOutput, ToolUseStatus};
use editor::Editor;
use futures::channel::oneshot::{self, Receiver};
use gpui::{
@@ -38,6 +38,12 @@ pub struct FindPathToolInput {
pub offset: usize,
}
#[derive(Debug, Serialize, Deserialize)]
struct FindPathToolOutput {
glob: String,
paths: Vec<PathBuf>,
}
const RESULTS_PER_PAGE: usize = 50;
pub struct FindPathTool;
@@ -111,10 +117,18 @@ impl Tool for FindPathTool {
)
.unwrap();
}
let output = FindPathToolOutput {
glob,
paths: matches.clone(),
};
for mat in matches.into_iter().skip(offset).take(RESULTS_PER_PAGE) {
write!(&mut message, "\n{}", mat.display()).unwrap();
}
Ok(message.into())
Ok(ToolResultOutput {
content: message,
output: Some(serde_json::to_value(output)?),
})
}
});
@@ -123,6 +137,18 @@ impl Tool for FindPathTool {
card: Some(card.into()),
}
}
fn deserialize_card(
self: Arc<Self>,
output: serde_json::Value,
_project: Entity<Project>,
_window: &mut Window,
cx: &mut App,
) -> Option<assistant_tool::AnyToolCard> {
let output = serde_json::from_value::<FindPathToolOutput>(output).ok()?;
let card = cx.new(|_| FindPathToolCard::from_output(output));
Some(card.into())
}
}
fn search_paths(glob: &str, project: Entity<Project>, cx: &mut App) -> Task<Result<Vec<PathBuf>>> {
@@ -180,6 +206,15 @@ impl FindPathToolCard {
_receiver_task: Some(_receiver_task),
}
}
fn from_output(output: FindPathToolOutput) -> Self {
Self {
glob: output.glob,
paths: output.paths,
expanded: false,
_receiver_task: None,
}
}
}
impl ToolCard for FindPathToolCard {

View File

@@ -131,8 +131,12 @@ impl Tool for TerminalTool {
Err(err) => return Task::ready(Err(err)).into(),
};
let program = self.determine_shell.clone();
let command = format!("({}) </dev/null", input.command);
let args = vec!["-c".into(), command.clone()];
let command = if cfg!(windows) {
format!("$null | & {{{}}}", input.command.replace("\"", "'"))
} else {
format!("({}) </dev/null", input.command)
};
let args = vec!["-c".into(), command];
let cwd = working_dir.clone();
let env = match &working_dir {
Some(dir) => project.update(cx, |project, cx| {
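
To make the quoting concrete, a standalone rendering of the two command
wrappers above (the sample command is assumed):

```rust
fn main() {
    let input_command = r#"echo "hi""#;

    // Windows: pipe $null into a PowerShell script block so the command sees
    // EOF on stdin; double quotes are swapped for single quotes first.
    let windows = format!("$null | & {{{}}}", input_command.replace("\"", "'"));
    assert_eq!(windows, "$null | & {echo 'hi'}");

    // Unix: run in a subshell with stdin redirected from /dev/null.
    let unix = format!("({}) </dev/null", input_command);
    assert_eq!(unix, r#"(echo "hi") </dev/null"#);
}
```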

View File

@@ -3,7 +3,7 @@ use std::{sync::Arc, time::Duration};
use crate::schema::json_schema_for;
use crate::ui::ToolCallCardHeader;
use anyhow::{Context as _, Result, anyhow};
use assistant_tool::{ActionLog, Tool, ToolCard, ToolResult, ToolUseStatus};
use assistant_tool::{ActionLog, Tool, ToolCard, ToolResult, ToolResultOutput, ToolUseStatus};
use futures::{Future, FutureExt, TryFutureExt};
use gpui::{
AnyWindowHandle, App, AppContext, Context, Entity, IntoElement, Task, WeakEntity, Window,
@@ -73,9 +73,11 @@ impl Tool for WebSearchTool {
let search_task = search_task.clone();
async move {
let response = search_task.await.map_err(|err| anyhow!(err))?;
serde_json::to_string(&response)
.context("Failed to serialize search results")
.map(Into::into)
Ok(ToolResultOutput {
content: serde_json::to_string(&response)
.context("Failed to serialize search results")?,
output: Some(serde_json::to_value(response)?),
})
}
});
@@ -84,6 +86,18 @@ impl Tool for WebSearchTool {
card: Some(cx.new(|cx| WebSearchToolCard::new(search_task, cx)).into()),
}
}
fn deserialize_card(
self: Arc<Self>,
output: serde_json::Value,
_project: Entity<Project>,
_window: &mut Window,
cx: &mut App,
) -> Option<assistant_tool::AnyToolCard> {
let output = serde_json::from_value::<WebSearchResponse>(output).ok()?;
let card = cx.new(|cx| WebSearchToolCard::new(Task::ready(Ok(output)), cx));
Some(card.into())
}
}
#[derive(RegisterComponent)]

View File

@@ -16520,7 +16520,7 @@ async fn test_indent_guide_tabs(cx: &mut TestAppContext) {
assert_indent_guides(
0..6,
vec![
indent_guide(buffer_id, 1, 6, 0),
indent_guide(buffer_id, 1, 5, 0),
indent_guide(buffer_id, 3, 4, 1),
],
None,

View File

@@ -424,7 +424,7 @@ pub fn init(cx: &mut App) -> Arc<AgentAppState> {
language::init(cx);
language_extension::init(extension_host_proxy.clone(), languages.clone());
language_model::init(client.clone(), cx);
language_models::init(user_store.clone(), client.clone(), fs.clone(), cx);
language_models::init(user_store.clone(), client.clone(), cx);
languages::init(languages.clone(), node_runtime.clone(), cx);
prompt_store::init(cx);
let stdout_is_a_pty = false;

View File

@@ -158,7 +158,6 @@ impl<'a> Matcher<'a> {
if score <= 0.0 {
return 0.0;
}
let path_len = prefix.len() + path.len();
let mut cur_start = 0;
let mut byte_ix = 0;
@@ -173,8 +172,17 @@ impl<'a> Matcher<'a> {
byte_ix += ch.len_utf8();
char_ix += 1;
}
cur_start = match_char_ix + 1;
self.match_positions[i] = byte_ix;
let matched_ch = prefix
.get(match_char_ix)
.or_else(|| path.get(match_char_ix - prefix.len()))
.unwrap();
byte_ix += matched_ch.len_utf8();
cur_start = match_char_ix + 1;
char_ix = match_char_ix + 1;
}
score
@@ -209,8 +217,11 @@ impl<'a> Matcher<'a> {
let query_char = self.lowercase_query[query_idx];
let limit = self.last_positions[query_idx];
let max_valid_index = (prefix.len() + path_lowercased.len()).saturating_sub(1);
let safe_limit = limit.min(max_valid_index);
let mut last_slash = 0;
for j in path_idx..=limit {
for j in path_idx..=safe_limit {
let extra_lowercase_chars_count = extra_lowercase_chars
.iter()
.take_while(|(i, _)| i < &&j)
@@ -218,10 +229,15 @@ impl<'a> Matcher<'a> {
.sum::<usize>();
let j_regular = j - extra_lowercase_chars_count;
let path_char = if j_regular < prefix.len() {
let path_char = if j < prefix.len() {
lowercase_prefix[j]
} else {
path_lowercased[j - prefix.len()]
let path_index = j - prefix.len();
if path_index < path_lowercased.len() {
path_lowercased[path_index]
} else {
continue;
}
};
let is_path_sep = path_char == MAIN_SEPARATOR;
@@ -490,6 +506,89 @@ mod tests {
);
}
#[test]
fn match_unicode_path_entries() {
let mixed_unicode_paths = vec![
"İolu/oluş",
"İstanbul/code",
"Athens/Şanlıurfa",
"Çanakkale/scripts",
"paris/Düzce_İl",
"Berlin_Önemli_Ğündem",
"KİTAPLIK/london/dosya",
"tokyo/kyoto/fuji",
"new_york/san_francisco",
];
assert_eq!(
match_single_path_query("İo/oluş", false, &mixed_unicode_paths),
vec![("İolu/oluş", vec![0, 2, 4, 6, 8, 10, 12])]
);
assert_eq!(
match_single_path_query("İst/code", false, &mixed_unicode_paths),
vec![("İstanbul/code", vec![0, 2, 4, 6, 8, 10, 12, 14])]
);
assert_eq!(
match_single_path_query("athens/şa", false, &mixed_unicode_paths),
vec![("Athens/Şanlıurfa", vec![0, 1, 2, 3, 4, 5, 6, 7, 9])]
);
assert_eq!(
match_single_path_query("BerlinÖĞ", false, &mixed_unicode_paths),
vec![("Berlin_Önemli_Ğündem", vec![0, 1, 2, 3, 4, 5, 7, 15])]
);
assert_eq!(
match_single_path_query("tokyo/fuji", false, &mixed_unicode_paths),
vec![("tokyo/kyoto/fuji", vec![0, 1, 2, 3, 4, 5, 12, 13, 14, 15])]
);
let mixed_script_paths = vec![
"résumé_Москва",
"naïve_київ_implementation",
"café_北京_app",
"東京_über_driver",
"déjà_vu_cairo",
"seoul_piñata_game",
"voilà_istanbul_result",
];
assert_eq!(
match_single_path_query("résmé", false, &mixed_script_paths),
vec![("résumé_Москва", vec![0, 1, 3, 5, 6])]
);
assert_eq!(
match_single_path_query("café北京", false, &mixed_script_paths),
vec![("café_北京_app", vec![0, 1, 2, 3, 6, 9])]
);
assert_eq!(
match_single_path_query("ista", false, &mixed_script_paths),
vec![("voilà_istanbul_result", vec![7, 8, 9, 10])]
);
let complex_paths = vec![
"document_📚_library",
"project_👨👩👧👦_family",
"flags_🇯🇵🇺🇸🇪🇺_world",
"code_😀😃😄😁_happy",
"photo_👩👩👧👦_album",
];
assert_eq!(
match_single_path_query("doc📚lib", false, &complex_paths),
vec![("document_📚_library", vec![0, 1, 2, 9, 14, 15, 16])]
);
assert_eq!(
match_single_path_query("codehappy", false, &complex_paths),
vec![("code_😀😃😄😁_happy", vec![0, 1, 2, 3, 22, 23, 24, 25, 26])]
);
}
fn match_single_path_query<'a>(
query: &str,
smart_case: bool,

View File

@@ -385,10 +385,6 @@ fn handle_keydown_msg(
return Some(1);
};
let mut lock = state_ptr.state.borrow_mut();
let Some(mut func) = lock.callbacks.input.take() else {
return Some(1);
};
drop(lock);
let event = match keystroke_or_modifier {
KeystrokeOrModifier::Keystroke(keystroke) => PlatformInput::KeyDown(KeyDownEvent {
@@ -396,9 +392,19 @@ fn handle_keydown_msg(
is_held: lparam.0 & (0x1 << 30) > 0,
}),
KeystrokeOrModifier::Modifier(modifiers) => {
if let Some(prev_modifiers) = lock.last_reported_modifiers {
if prev_modifiers == modifiers {
return Some(0);
}
}
lock.last_reported_modifiers = Some(modifiers);
PlatformInput::ModifiersChanged(ModifiersChangedEvent { modifiers })
}
};
let Some(mut func) = lock.callbacks.input.take() else {
return Some(1);
};
drop(lock);
let result = if func(event).default_prevented {
Some(0)

View File

@@ -42,6 +42,7 @@ pub struct WindowsWindowState {
pub callbacks: Callbacks,
pub input_handler: Option<PlatformInputHandler>,
pub last_reported_modifiers: Option<Modifiers>,
pub system_key_handled: bool,
pub hovered: bool,
@@ -100,6 +101,7 @@ impl WindowsWindowState {
let renderer = windows_renderer::init(gpu_context, hwnd, transparent)?;
let callbacks = Callbacks::default();
let input_handler = None;
let last_reported_modifiers = None;
let system_key_handled = false;
let hovered = false;
let click_state = ClickState::new();
@@ -118,6 +120,7 @@ impl WindowsWindowState {
min_size,
callbacks,
input_handler,
last_reported_modifiers,
system_key_handled,
hovered,
renderer,

View File

@@ -27,7 +27,6 @@ copilot.workspace = true
deepseek = { workspace = true, features = ["schemars"] }
editor.workspace = true
feature_flags.workspace = true
fs.workspace = true
futures.workspace = true
google_ai = { workspace = true, features = ["schemars"] }
gpui.workspace = true
@@ -41,7 +40,6 @@ mistral = { workspace = true, features = ["schemars"] }
ollama = { workspace = true, features = ["schemars"] }
open_ai = { workspace = true, features = ["schemars"] }
partial-json-fixer.workspace = true
project.workspace = true
proto.workspace = true
release_channel.workspace = true
schemars.workspace = true

View File

@@ -1,7 +1,6 @@
use std::sync::Arc;
use client::{Client, UserStore};
use fs::Fs;
use gpui::{App, Context, Entity};
use language_model::LanguageModelRegistry;
use provider::deepseek::DeepSeekLanguageModelProvider;
@@ -21,8 +20,8 @@ use crate::provider::ollama::OllamaLanguageModelProvider;
use crate::provider::open_ai::OpenAiLanguageModelProvider;
pub use crate::settings::*;
pub fn init(user_store: Entity<UserStore>, client: Arc<Client>, fs: Arc<dyn Fs>, cx: &mut App) {
crate::settings::init(fs, cx);
pub fn init(user_store: Entity<UserStore>, client: Arc<Client>, cx: &mut App) {
crate::settings::init(cx);
let registry = LanguageModelRegistry::global(cx);
registry.update(cx, |registry, cx| {
register_language_model_providers(registry, user_store, client, cx);

View File

@@ -38,7 +38,6 @@ pub struct AnthropicSettings {
pub api_url: String,
/// Extend Zed's list of Anthropic models.
pub available_models: Vec<AvailableModel>,
pub needs_setting_migration: bool,
}
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]

View File

@@ -1,12 +1,8 @@
use std::sync::Arc;
use anyhow::Result;
use gpui::App;
use language_model::LanguageModelCacheConfiguration;
use project::Fs;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources, update_settings_file};
use settings::{Settings, SettingsSources};
use crate::provider::{
self,
@@ -23,36 +19,8 @@ use crate::provider::{
};
/// Initializes the language model settings.
pub fn init(fs: Arc<dyn Fs>, cx: &mut App) {
pub fn init(cx: &mut App) {
AllLanguageModelSettings::register(cx);
if AllLanguageModelSettings::get_global(cx)
.openai
.needs_setting_migration
{
update_settings_file::<AllLanguageModelSettings>(fs.clone(), cx, move |setting, _| {
if let Some(settings) = setting.openai.clone() {
let (newest_version, _) = settings.upgrade();
setting.openai = Some(OpenAiSettingsContent::Versioned(
VersionedOpenAiSettingsContent::V1(newest_version),
));
}
});
}
if AllLanguageModelSettings::get_global(cx)
.anthropic
.needs_setting_migration
{
update_settings_file::<AllLanguageModelSettings>(fs, cx, move |setting, _| {
if let Some(settings) = setting.anthropic.clone() {
let (newest_version, _) = settings.upgrade();
setting.anthropic = Some(AnthropicSettingsContent::Versioned(
VersionedAnthropicSettingsContent::V1(newest_version),
));
}
});
}
}
#[derive(Default)]
@@ -85,78 +53,7 @@ pub struct AllLanguageModelSettingsContent {
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
#[serde(untagged)]
pub enum AnthropicSettingsContent {
Versioned(VersionedAnthropicSettingsContent),
Legacy(LegacyAnthropicSettingsContent),
}
impl AnthropicSettingsContent {
pub fn upgrade(self) -> (AnthropicSettingsContentV1, bool) {
match self {
AnthropicSettingsContent::Legacy(content) => (
AnthropicSettingsContentV1 {
api_url: content.api_url,
available_models: content.available_models.map(|models| {
models
.into_iter()
.filter_map(|model| match model {
anthropic::Model::Custom {
name,
display_name,
max_tokens,
tool_override,
cache_configuration,
max_output_tokens,
default_temperature,
extra_beta_headers,
mode,
} => Some(provider::anthropic::AvailableModel {
name,
display_name,
max_tokens,
tool_override,
cache_configuration: cache_configuration.as_ref().map(
|config| LanguageModelCacheConfiguration {
max_cache_anchors: config.max_cache_anchors,
should_speculate: config.should_speculate,
min_total_token: config.min_total_token,
},
),
max_output_tokens,
default_temperature,
extra_beta_headers,
mode: Some(mode.into()),
}),
_ => None,
})
.collect()
}),
},
true,
),
AnthropicSettingsContent::Versioned(content) => match content {
VersionedAnthropicSettingsContent::V1(content) => (content, false),
},
}
}
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
pub struct LegacyAnthropicSettingsContent {
pub api_url: Option<String>,
pub available_models: Option<Vec<anthropic::Model>>,
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
#[serde(tag = "version")]
pub enum VersionedAnthropicSettingsContent {
#[serde(rename = "1")]
V1(AnthropicSettingsContentV1),
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
pub struct AnthropicSettingsContentV1 {
pub struct AnthropicSettingsContent {
pub api_url: Option<String>,
pub available_models: Option<Vec<provider::anthropic::AvailableModel>>,
}
@@ -195,64 +92,7 @@ pub struct MistralSettingsContent {
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
#[serde(untagged)]
pub enum OpenAiSettingsContent {
Versioned(VersionedOpenAiSettingsContent),
Legacy(LegacyOpenAiSettingsContent),
}
impl OpenAiSettingsContent {
pub fn upgrade(self) -> (OpenAiSettingsContentV1, bool) {
match self {
OpenAiSettingsContent::Legacy(content) => (
OpenAiSettingsContentV1 {
api_url: content.api_url,
available_models: content.available_models.map(|models| {
models
.into_iter()
.filter_map(|model| match model {
open_ai::Model::Custom {
name,
display_name,
max_tokens,
max_output_tokens,
max_completion_tokens,
} => Some(provider::open_ai::AvailableModel {
name,
max_tokens,
max_output_tokens,
display_name,
max_completion_tokens,
}),
_ => None,
})
.collect()
}),
},
true,
),
OpenAiSettingsContent::Versioned(content) => match content {
VersionedOpenAiSettingsContent::V1(content) => (content, false),
},
}
}
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
pub struct LegacyOpenAiSettingsContent {
pub api_url: Option<String>,
pub available_models: Option<Vec<open_ai::Model>>,
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
#[serde(tag = "version")]
pub enum VersionedOpenAiSettingsContent {
#[serde(rename = "1")]
V1(OpenAiSettingsContentV1),
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq, JsonSchema)]
pub struct OpenAiSettingsContentV1 {
pub struct OpenAiSettingsContent {
pub api_url: Option<String>,
pub available_models: Option<Vec<provider::open_ai::AvailableModel>>,
}
@@ -288,15 +128,7 @@ impl settings::Settings for AllLanguageModelSettings {
let mut settings = AllLanguageModelSettings::default();
for value in sources.defaults_and_customizations() {
// Anthropic
let (anthropic, upgraded) = match value.anthropic.clone().map(|s| s.upgrade()) {
Some((content, upgraded)) => (Some(content), upgraded),
None => (None, false),
};
if upgraded {
settings.anthropic.needs_setting_migration = true;
}
let anthropic = value.anthropic.clone();
merge(
&mut settings.anthropic.api_url,
@@ -362,16 +194,7 @@ impl settings::Settings for AllLanguageModelSettings {
deepseek.as_ref().and_then(|s| s.available_models.clone()),
);
// OpenAI
let (openai, upgraded) = match value.openai.clone().map(|s| s.upgrade()) {
Some((content, upgraded)) => (Some(content), upgraded),
None => (None, false),
};
if upgraded {
settings.openai.needs_setting_migration = true;
}
let openai = value.openai.clone();
merge(
&mut settings.openai.api_url,
openai.as_ref().and_then(|s| s.api_url.clone()),

View File

@@ -5793,7 +5793,7 @@ impl MultiBufferSnapshot {
line_indent.len(tab_size) / tab_size
+ ((line_indent.len(tab_size) % tab_size) > 0) as u32
} else {
current_depth
0
};
match depth.cmp(&current_depth) {

View File

@@ -488,12 +488,7 @@ fn main() {
);
supermaven::init(app_state.client.clone(), cx);
language_model::init(app_state.client.clone(), cx);
language_models::init(
app_state.user_store.clone(),
app_state.client.clone(),
app_state.fs.clone(),
cx,
);
language_models::init(app_state.user_store.clone(), app_state.client.clone(), cx);
web_search::init(cx);
web_search_providers::init(app_state.client.clone(), cx);
snippet_provider::init(cx);

View File

@@ -4213,12 +4213,7 @@ mod tests {
copilot::copilot_chat::init(app_state.fs.clone(), app_state.client.http_client(), cx);
image_viewer::init(cx);
language_model::init(app_state.client.clone(), cx);
language_models::init(
app_state.user_store.clone(),
app_state.client.clone(),
app_state.fs.clone(),
cx,
);
language_models::init(app_state.user_store.clone(), app_state.client.clone(), cx);
web_search::init(cx);
web_search_providers::init(app_state.client.clone(), cx);
let prompt_builder = PromptBuilder::load(app_state.fs.clone(), false, cx);

View File

@@ -76,15 +76,15 @@ impl MigrationBanner {
migration_type,
migrated,
} => {
if self.migration_type == Some(*migration_type) {
let location = if *migrated {
ToolbarItemLocation::Secondary
} else {
ToolbarItemLocation::Hidden
};
cx.emit(ToolbarItemEvent::ChangeLocation(location));
cx.notify();
}
if *migrated {
self.migration_type = Some(*migration_type);
self.show(cx);
} else {
cx.emit(ToolbarItemEvent::ChangeLocation(
ToolbarItemLocation::Hidden,
));
self.reset(cx);
};
}
}
}

View File

@@ -32,11 +32,12 @@ If you want to start a new conversation at any time, you can hit <kbd>cmd-n|ctrl
Simple back-and-forth conversations work well with the text threads. However, there may come a time when you want to modify the previous text in the conversation and steer it in a different direction.
## Editing a Context {#edit-context}
## Editing a Text Thread {#edit-text-thread}
> **Note**: Wondering about Context vs. Conversation? [Read more here](./contexts.md).
Text threads give you the flexibility to have control over the context. You can freely edit any previous text, including the responses from the LLM. If you want to remove a message block entirely, simply place your cursor at the beginning of the block and use the `delete` key. A typical workflow might involve making edits and adjustments throughout the context to refine your inquiry or provide additional information. Here's an example:
Text threads give you the flexibility to have control over the context.
You can freely edit any previous text, including the responses from the LLM.
If you want to remove a message block entirely, simply place your cursor at the beginning of the block and use the `delete` key.
A typical workflow might involve making edits and adjustments throughout the context to refine your inquiry or provide additional information. Here's an example:
1. Write text in a `You` block.
2. Submit the message with {#kb assistant::Assist}.
@@ -58,31 +59,26 @@ Some additional points to keep in mind:
Slash commands enhance the assistant's capabilities. Begin by typing a `/` at the beginning of the line to see a list of available commands:
- `/default`: Inserts the default prompt into the context
- `/diagnostics`: Injects errors reported by the project's language server into the context
- `/fetch`: Fetches the content of a webpage and inserts it into the context
- `/file`: Inserts a single file or a directory of files into the context
- `/now`: Inserts the current date and time into the context
- `/default`: Inserts the default rule
- `/diagnostics`: Injects errors reported by the project's language server
- `/fetch`: Fetches the content of a webpage and inserts it
- `/file`: Inserts a single file or a directory of files
- `/now`: Inserts the current date and time
- `/prompt`: Adds a custom-configured prompt to the context ([see Rules Library](./rules.md#rules-library))
- `/symbols`: Inserts the current tab's active symbols into the context
- `/tab`: Inserts the content of the active tab or all open tabs into the context
- `/symbols`: Inserts the current tab's active symbols
- `/tab`: Inserts the content of the active tab or all open tabs
- `/terminal`: Inserts a select number of lines of output from the terminal
- `/selection`: Inserts the selected text into the context
- `/selection`: Inserts the selected text
### Other Commands:
> **Note:** Remember, commands are only evaluated when the text thread is created or when the command is inserted, so a command like `/now` won't continuously update, or `/file` commands won't keep their contents up to date.
- `/search`: Performs semantic search for content in your project based on natural language
- Not generally available yet, but some users may have access to it.
> **Note:** Remember, commands are only evaluated when the context is created or when the command is inserted, so a command like `/now` won't continuously update, or `/file` commands won't keep their contents up to date.
#### `/default`
### `/default`
Read more about `/default` in the [Rules: Editing the Default Rules](./rules.md#default-rules) section.
Usage: `/default`
#### `/diagnostics`
### `/diagnostics`
The `/diagnostics` command injects errors reported by the project's language server into the context. This is useful for getting an overview of current issues in your project.
@@ -91,7 +87,7 @@ Usage: `/diagnostics [--include-warnings] [path]`
- `--include-warnings`: Optional flag to include warnings in addition to errors.
- `path`: Optional path to limit diagnostics to a specific file or directory.
#### `/file`
### `/file`
The `/file` command inserts the content of a single file or a directory of files into the context. This allows you to reference specific parts of your project in your conversation with the assistant.
@@ -105,13 +101,13 @@ Examples:
- `/file src/*.js` - Inserts the content of all `.js` files in the `src` directory.
- `/file src` - Inserts the content of all files in the `src` directory.
#### `/now`
### `/now`
The `/now` command inserts the current date and time into the context. This can be useful letting the language model know the current time (and by extension, how old their current knowledge base is).
Usage: `/now`
#### `/prompt`
### `/prompt`
The `/prompt` command inserts a prompt from the prompt library into the context. It can also be used to nest prompts within prompts.
@@ -119,13 +115,13 @@ Usage: `/prompt <prompt_name>`
Related: `/default`
#### `/symbols`
### `/symbols`
The `/symbols` command inserts the active symbols (functions, classes, etc.) from the current tab into the context. This is useful for getting an overview of the structure of the current file.
Usage: `/symbols`
#### `/tab`
### `/tab`
The `/tab` command inserts the content of the active tab or all open tabs into the context. This allows you to reference the content you're currently working on.
@@ -140,7 +136,7 @@ Examples:
- `/tab "index.js"` - Inserts the content of the tab named "index.js".
- `/tab all` - Inserts the content of all open tabs.
#### `/terminal`
### `/terminal`
The `/terminal` command inserts a select number of lines of output from the terminal into the context. This is useful for referencing recent command outputs or logs.
@@ -148,7 +144,7 @@ Usage: `/terminal [<number>]`
- `<number>`: Optional parameter to specify the number of lines to insert (default is a 50).
#### `/selection`
### `/selection`
The `/selection` command inserts the selected text in the editor into the context. This is useful for referencing specific parts of your code.
@@ -156,9 +152,10 @@ This is equivalent to the `assistant: quote selection` command ({#kb assistant::
Usage: `/selection`
## Commands in the Rules Library (previously known as Prompt Library) {#slash-commands-in-rules}
## Commands in the Rules Library {#slash-commands-in-rules}
[Commands](#commands) can be used in rules to insert dynamic content or perform actions. For example, if you want to create a rule where it is important for the model to know the date, you can use the `/now` command to insert the current date.
[Commands](#commands) can be used in rules, in the Rules Library (previously known as Prompt Library), to insert dynamic content or perform actions.
For example, if you want to create a rule where it is important for the model to know the date, you can use the `/now` command to insert the current date.
> **Warn:** Slash commands in rules **only** work when they are used in text threads. Using them in non-text threads is not supported.
@@ -166,7 +163,7 @@ Usage: `/selection`
See the [list of commands](#commands) above for more information on commands, and what slash commands are available.
### Example:
### Example
```plaintext
You are an expert Rust engineer. The user has asked you to review their project and answer some questions.