Compare commits


29 Commits

Author SHA1 Message Date
Matt Miller
f34d12dd27 SecurityModal uses updated AlertModal 2025-12-04 18:03:27 -06:00
Kirill Bulatov
57b880526a Add initial functioning design of the new security modal 2025-12-04 22:34:53 +02:00
Kirill Bulatov
039fe76a84 Show a rudimentary security modal
Rework new tasks
2025-12-04 20:28:48 +02:00
Kirill Bulatov
be7bfa1803 Add restricted mode indicator to title bar 2025-12-04 20:28:48 +02:00
Kirill Bulatov
792641796a Show untrusted worktrees notifications in proper workspaces 2025-12-04 20:28:48 +02:00
Kirill Bulatov
b14b869aa6 Store untrusted worktrees globally 2025-12-04 20:28:48 +02:00
Kirill Bulatov
9473e69fff Apply a review suggestion 2025-12-04 20:28:48 +02:00
Kirill Bulatov
26945eea1c Fix most of the TODOs 2025-12-04 20:28:48 +02:00
Kirill Bulatov
7cf39ed7e5 Only check trusted worktrees on local settings sync 2025-12-04 20:28:48 +02:00
Kirill Bulatov
2ddd3a033f Small fixes 2025-12-04 20:28:48 +02:00
Kirill Bulatov
ddb7eb1747 Show trust notifications 2025-12-04 20:28:48 +02:00
Kirill Bulatov
db75a2c62a Implement initial project settings trust mechanism 2025-12-04 20:28:48 +02:00
Cole Miller
d5ed9d3e3a git: Don't call git2::Repository::find_remote for every blamed buffer (#44107)
We already store the remote URLs for `origin` and `upstream` in the
`RepositorySnapshot`, so just use that data. Follow-up to #44092.

Release Notes:

- N/A
2025-12-04 13:25:30 -05:00
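The reuse described above amounts to a simple fallback over fields the snapshot already caches. A minimal sketch, with assumed field names rather than Zed's exact types:

```rust
// Hypothetical stand-in for the cached repository state: prefer the
// upstream remote's URL, fall back to origin, and never call into git2.
struct RepositorySnapshot {
    remote_upstream_url: Option<String>,
    remote_origin_url: Option<String>,
}

impl RepositorySnapshot {
    fn blame_remote_url(&self) -> Option<String> {
        self.remote_upstream_url
            .clone()
            .or_else(|| self.remote_origin_url.clone())
    }
}
```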
Liffindra Angga Zaaldian
74a1b5d14d Update PHP language server docs (#44001)
Reformat document structure like other language docs, improve
information flow, add missing requirements, and fix typos.

Release Notes:

- N/A

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2025-12-04 19:04:06 +01:00
Lukas Wirth
07af011eb4 worktree: Fix git ignored directories dropping their contents when they are refreshed (#44143)
Closes https://github.com/zed-industries/zed/issues/38653

Release Notes:

- Fixed git ignored directories appearing as empty when their content
changes on Windows

Co-authored-by: Smit Barmase <smit@zed.dev>
2025-12-04 18:14:10 +01:00
Danilo Leal
c357dc25fc git_ui: Clean up the commit view UI (#44162) 2025-12-04 13:44:48 -03:00
Lukas Wirth
93bc6616c6 editor: Improve performance of update_visible_edit_prediction (#44161)
One half of https://github.com/zed-industries/zed/issues/42861

This basically reduces the main thread work for large enough JSON (and
other) files from multiple milliseconds (15ms was observed in that test
case) down to microseconds (100µs here).

Release Notes:

- Improved cursor movement performance when edit predictions are enabled
2025-12-04 15:41:48 +00:00
Lukas Wirth
a33e881906 remote: Recognize WSL interop to open browser for codex web login (#44136)
Closes #41521

Release Notes:

- Fixed codex web login not working on WSL remotes if no browser is
installed

Co-authored-by: Ben Brandt <benjamin.j.brandt@gmail.com>
2025-12-04 14:42:26 +00:00
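One common way to recognize WSL interop — an assumption here, not necessarily the exact probe this PR uses — is the binfmt_misc registration WSL installs so that Linux processes can exec Windows binaries:

```rust
use std::path::Path;

// Hedged sketch: WSL registers a binfmt_misc handler named "WSLInterop"
// (or "WSLInterop-late" on newer builds). If it is present under the
// given binfmt directory, a Windows browser can likely be launched for
// the web login flow. The directory is a parameter so the probe is
// testable; the real path would be /proc/sys/fs/binfmt_misc.
fn has_wsl_interop_in(binfmt_dir: &Path) -> bool {
    binfmt_dir.join("WSLInterop").exists() || binfmt_dir.join("WSLInterop-late").exists()
}
```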
Agus Zubiaga
c978db8626 Fix background scanner deadlock (#44109)
Fixes a deadlock in the background scanner that occurs on single-core
Linux devices. This happens because the background scanner would `block`
on a background thread waiting for a future, but on single-core Linux
devices there would be no other thread to pick it up. This mostly
affects SSH remoting use cases where it's common for servers to have 1
vCPU.

Closes #43884 
Closes #43809

Release Notes:

- Fixed SSH remoting hang when connecting to 1-vCPU servers
2025-12-04 11:30:16 -03:00
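The failure mode can be reproduced in miniature. This sketch (not Zed's actual scheduler) queues one job that blocks on a value a second queued job produces; with a single worker, the first job pins the only thread and the second never runs:

```rust
use std::sync::{mpsc, Arc, Mutex};
use std::thread;
use std::time::Duration;

// Returns true if the blocking job completed within the timeout.
fn run_jobs(workers: usize) -> bool {
    let (job_tx, job_rx) = mpsc::channel::<Box<dyn FnOnce() + Send>>();
    let job_rx = Arc::new(Mutex::new(job_rx));
    let (sig_tx, sig_rx) = mpsc::channel::<()>();
    let (done_tx, done_rx) = mpsc::channel::<()>();

    // Job A blocks its worker thread until job B signals it.
    job_tx
        .send(Box::new(move || {
            sig_rx.recv().unwrap();
            done_tx.send(()).unwrap();
        }))
        .unwrap();
    // Job B produces the signal job A is waiting for.
    job_tx.send(Box::new(move || sig_tx.send(()).unwrap())).unwrap();
    drop(job_tx);

    for _ in 0..workers {
        let rx = Arc::clone(&job_rx);
        thread::spawn(move || loop {
            let job = rx.lock().unwrap().recv();
            match job {
                Ok(job) => job(),
                Err(_) => break, // queue closed
            }
        });
    }

    // With workers == 1 this times out: the only thread is stuck in job A.
    done_rx.recv_timeout(Duration::from_secs(2)).is_ok()
}
```

The companion change (#44110) attacks the same problem from the other side by always spawning at least two background threads.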
Rawand Ahmed Shaswar
2dad46c5c0 gpui: Fix division by zero when chars/sec = 0 on Wayland (#44151)
Closes #44148

The existing `rate == 0` check inside the timer callback already handles
disabling repeat by dropping the timer immediately, so the fix prevents
the crash while preserving that behavior.

Release Notes:

- Linux (Wayland): Fixed a crash that could occur when
`characters_per_second` was zero
2025-12-04 11:26:17 -03:00
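The guard can be sketched with a hypothetical helper (not gpui's exact code): never derive a repeat interval from a zero rate, and treat zero as "repeat disabled", matching the existing `rate == 0` handling in the timer callback.

```rust
// A rate of 0 chars/sec means key repeat is disabled; returning None
// avoids the division by zero that caused the crash.
fn repeat_interval_ms(chars_per_second: u32) -> Option<u64> {
    if chars_per_second == 0 {
        None
    } else {
        Some(1000 / u64::from(chars_per_second))
    }
}
```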
Coenen Benjamin
4c51fffbb5 Add support for git remotes (#42819)
Follow-up to #42486
Closes #26559



https://github.com/user-attachments/assets/e2f54dda-a78b-4d9b-a910-16d51f98a111



Release Notes:

- Added support for git remotes

---------

Signed-off-by: Benjamin <5719034+bnjjj@users.noreply.github.com>
2025-12-04 14:23:36 +01:00
Piotr Osiewicz
0d80b452fb python: Improve sorting order of toolchains to give higher precedence to project-local virtual environments that are within current subproject (#44141)
Closes #44090

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>

Release Notes:

- python: Improved sorting order of toolchains in monorepos with
multiple local virtual environments.
- python: Fixed toolchain selector not having an active toolchain
selected on open.

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
Co-authored-by: Smit <smit@zed.dev>
2025-12-04 12:33:13 +00:00
John Gibb
bad6bde03a Use buffer language when formatting with Prettier (#43368)
Set `prettier_parser` explicitly if the file extension for the buffer
does not match a known one for the current language

Release Notes:

- N/A

---------

Co-authored-by: Kirill Bulatov <kirill@zed.dev>
2025-12-04 12:07:40 +00:00
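The selection rule amounts to: let Prettier infer the parser when it recognizes the extension, and pass the buffer language's parser explicitly otherwise. A sketch under assumed names:

```rust
// Hypothetical helper: `known_extensions` are extensions Prettier can map
// to a parser on its own; `language_parser` comes from the buffer's
// language config (e.g. "typescript").
fn explicit_prettier_parser<'a>(
    known_extensions: &[&str],
    file_extension: &str,
    language_parser: &'a str,
) -> Option<&'a str> {
    if known_extensions.contains(&file_extension) {
        None // Prettier infers the parser from the filename
    } else {
        Some(language_parser) // force the buffer language's parser
    }
}
```

This matches the new test below, where a `.settings` file with the TypeScript language set formats with the `typescript` parser.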
Piotr Osiewicz
4ec2d04ad9 search: Fix sort order not being maintained in presence of open buffers (#44135)
In project search UI code we were seeing an issue where "Go to next
match" would act up and behave weirdly. It would not wrap at times.
Stuff would be weird, yo. It turned out that match ranges reported by
core project search were sometimes out of sync with the state of the
multi-buffer. As in, the sort order of
`search::ProjectSearch::match_ranges` would not match up with
multi-buffer's sort order. This is ~because multi-buffers maintain their
own sort order.

What happened within project search is that we were skipping straight
from stage 1 (filtering paths) to stage 3 via an internal channel, and in
the process we dropped the channel used to maintain result sorting.
This made it so that, given 2 files to scan:
- project/file1.rs <- not open, has to go through stage 2 (FS scan)
- project/file2.rs <- open, goes straight from stage 1 (path filtering)
  to stage 3 (finding all matches)

We would report matches for project/file2.rs first, because we would
notice that there's an existing language::Buffer for it. However, we
should wait for project/file1.rs's status to be reported before we kick
off project/file2.rs.

The fix is to use the sorting channel instead of an internal one, as
that keeps the sorting worker "in the loop" about the state of the
world.

Closes #43672

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>

Release Notes:

- Fixed "Select next match" in project search results misbehaving when
some of the buffers within the search results were open before the
search was run.
- Fixed project search results being scrolled to the last file active
prior to running the search.

---------

Co-authored-by: Smit Barmase <heysmitbarmase@gmail.com>
Co-authored-by: Smit <smit@zed.dev>
2025-12-04 12:21:02 +01:00
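The invariant being restored can be sketched independently of the search code: results may *complete* out of order (open buffers finish early), but must be *reported* in the order stage 1 emitted their paths. A minimal reorder buffer, hypothetical rather than the actual channel wiring:

```rust
use std::collections::BTreeMap;

// Holds out-of-order completions until every earlier index has arrived,
// then flushes them in order.
struct OrderedReporter {
    next: usize,
    pending: BTreeMap<usize, String>,
    reported: Vec<String>,
}

impl OrderedReporter {
    fn new() -> Self {
        Self { next: 0, pending: BTreeMap::new(), reported: Vec::new() }
    }

    // `index` is the position the path had after stage 1 (path filtering).
    fn complete(&mut self, index: usize, result: String) {
        self.pending.insert(index, result);
        // Flush the longest contiguous prefix that is now ready.
        while let Some(result) = self.pending.remove(&self.next) {
            self.reported.push(result);
            self.next += 1;
        }
    }
}
```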
Shardul Vaidya
0f0017dc8e bedrock: Support global endpoints and new regional endpoints (#44103)
Closes #43598

Release Notes:

- bedrock: Added opt-in `allow_global` which enables global endpoints
- bedrock: Updated cross-region-inference endpoint and model list
- bedrock: Fixed Opus 4.5 access on Bedrock, now only accessible through the `allow_global` setting
2025-12-04 12:14:31 +01:00
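The routing rule this adds can be summarized as: map the region to a regional profile group, then upgrade to the `global` profile only when the user opted in and the model supports it (GovCloud never upgrades). A simplified sketch, omitting the full per-model matrix:

```rust
// Simplified region-to-profile-group routing (not the full Bedrock matrix).
fn region_group(region: &str, allow_global: bool, supports_global: bool) -> Option<&'static str> {
    // GovCloud is checked first, since "us-gov-" also starts with "us-".
    let regional = if region.starts_with("us-gov-") {
        "us-gov"
    } else if region.starts_with("us-") || region.starts_with("ca-") || region.starts_with("sa-") {
        "us"
    } else if region.starts_with("eu-") {
        "eu"
    } else if region.starts_with("ap-") || region == "me-central-1" || region == "me-south-1" {
        "apac"
    } else {
        return None; // unsupported region
    };
    // Upgrade to the global profile only when opted in, supported by the
    // model, and not in GovCloud.
    if allow_global && supports_global && regional != "us-gov" {
        Some("global")
    } else {
        Some(regional)
    }
}
```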
Agus Zubiaga
9db0d66251 linux: Spawn at least two background threads (#44110)
Related to https://github.com/zed-industries/zed/pull/44109,
https://github.com/zed-industries/zed/issues/43884,
https://github.com/zed-industries/zed/issues/43809.

In the Linux dispatcher, we create one background thread per CPU, but
when a single core is available, having a single background thread
significantly hinders the perceived performance of Zed. The extra thread
is particularly helpful when SSH remoting to low-resource servers.

We may want to bump this to more than two threads actually, but I wanted
to be conservative, and this seems to make a big difference already.

Release Notes:

- N/A
2025-12-04 10:40:51 +00:00
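The guard itself is one line; a sketch with an assumed helper name:

```rust
// Clamp the background pool size: one thread per CPU, but never fewer
// than two, so a single blocking task cannot starve all other
// background work on a 1-vCPU host.
fn background_thread_count(available_parallelism: usize) -> usize {
    available_parallelism.max(2)
}
```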
Aero
b07389d9f3 macos: Add missing file access entitlements (#43609)
Adds `com.apple.security.files.user-selected.read-write` and
`com.apple.security.files.downloads.read-write` to zed.entitlements.

This resolves an issue where the integrated terminal could not access
external drives or user-selected files on macOS, even when "Full Disk
Access" was granted. These entitlements are required for the application
to properly inherit file access permissions.

Release Notes:

- Resolves an issue where the integrated terminal could not access
external drives or user-selected files on macOS.
2025-12-04 12:38:10 +02:00
Kirill Bulatov
db2e26f67b Re-colorize the brackets when the theme changes (#44130)
Closes https://github.com/zed-industries/zed/issues/44127

Release Notes:

- Fixed brackets not re-colorizing on theme change
2025-12-04 10:21:37 +00:00
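The fix follows a standard invalidation pattern: cache the theme-derived accent data alongside the user's overrides, and recolorize whenever the freshly fetched value differs from the cache. A sketch with simplified stand-in types:

```rust
// Simplified stand-ins for the editor's cached accent state.
#[derive(PartialEq, Debug, Clone)]
struct AccentData {
    colors: Vec<String>,    // stands in for the theme's accent colors
    overrides: Vec<String>, // user-configured accent overrides
}

// Returns true when brackets need re-colorizing; updates the cache.
fn refresh_accents(cached: &mut Option<AccentData>, fresh: Option<AccentData>) -> bool {
    let changed = fresh != *cached;
    *cached = fresh;
    changed
}
```

Comparing the colors as well as the overrides is what makes a theme switch (which changes the colors but possibly not the overrides) trigger a recolor.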
John Tur
391c92b07a Reduce priority of Windows thread pool work items (#44121)
`WorkItemPriority::High` will enqueue the work items to threads with
higher-than-normal priority. If the work items are very intensive, this
can cause the system to become unresponsive. It's not clear what this
gets us, so let's avoid the responsiveness issue by deleting this.

Release Notes:

- N/A
2025-12-04 07:45:36 +00:00
67 changed files with 3356 additions and 744 deletions

Cargo.lock generated
View File

@@ -13076,6 +13076,7 @@ dependencies = [
"semver",
"serde",
"serde_json",
"session",
"settings",
"sha2",
"shellexpand 2.1.2",
@@ -15383,6 +15384,7 @@ dependencies = [
name = "session"
version = "0.1.0"
dependencies = [
"collections",
"db",
"gpui",
"serde_json",
@@ -17509,6 +17511,7 @@ dependencies = [
"rpc",
"schemars",
"serde",
"session",
"settings",
"smallvec",
"story",

View File

@@ -2040,7 +2040,11 @@
// dirty files when closing the application.
//
// Default: true
"restore_unsaved_buffers": true
"restore_unsaved_buffers": true,
// Whether or not to skip project trust checks and synchronize project settings from any worktree automatically.
//
// Default: false
"trust_all_worktrees": false
},
// Zed's Prettier integration settings.
// Allows to enable/disable formatting with Prettier

View File

@@ -584,41 +584,100 @@ impl Model {
}
}
pub fn cross_region_inference_id(&self, region: &str) -> anyhow::Result<String> {
pub fn cross_region_inference_id(
&self,
region: &str,
allow_global: bool,
) -> anyhow::Result<String> {
// List derived from here:
// https://docs.aws.amazon.com/bedrock/latest/userguide/inference-profiles-support.html#inference-profiles-support-system
let model_id = self.request_id();
let supports_global = matches!(
self,
Model::ClaudeOpus4_5
| Model::ClaudeOpus4_5Thinking
| Model::ClaudeHaiku4_5
| Model::ClaudeSonnet4
| Model::ClaudeSonnet4Thinking
| Model::ClaudeSonnet4_5
| Model::ClaudeSonnet4_5Thinking
);
let region_group = if region.starts_with("us-gov-") {
"us-gov"
} else if region.starts_with("us-") {
"us"
} else if region.starts_with("us-")
|| region.starts_with("ca-")
|| region.starts_with("sa-")
{
if allow_global && supports_global {
"global"
} else {
"us"
}
} else if region.starts_with("eu-") {
"eu"
if allow_global && supports_global {
"global"
} else {
"eu"
}
} else if region.starts_with("ap-") || region == "me-central-1" || region == "me-south-1" {
"apac"
} else if region.starts_with("ca-") || region.starts_with("sa-") {
// Canada and South America regions - default to US profiles
"us"
if allow_global && supports_global {
"global"
} else {
"apac"
}
} else {
anyhow::bail!("Unsupported Region {region}");
};
let model_id = self.request_id();
match (self, region_group, region) {
(Model::Custom { .. }, _, _) => Ok(self.request_id().into()),
match (self, region_group) {
// Custom models can't have CRI IDs
(Model::Custom { .. }, _) => Ok(self.request_id().into()),
(
Model::ClaudeOpus4_5
| Model::ClaudeOpus4_5Thinking
| Model::ClaudeHaiku4_5
| Model::ClaudeSonnet4
| Model::ClaudeSonnet4Thinking
| Model::ClaudeSonnet4_5
| Model::ClaudeSonnet4_5Thinking,
"global",
_,
) => Ok(format!("{}.{}", region_group, model_id)),
// Models with US Gov only
(Model::Claude3_5Sonnet, "us-gov") | (Model::Claude3Haiku, "us-gov") => {
Ok(format!("{}.{}", region_group, model_id))
(
Model::Claude3Haiku
| Model::Claude3_5Sonnet
| Model::Claude3_7Sonnet
| Model::Claude3_7SonnetThinking
| Model::ClaudeSonnet4_5
| Model::ClaudeSonnet4_5Thinking,
"us-gov",
_,
) => Ok(format!("{}.{}", region_group, model_id)),
(
Model::ClaudeHaiku4_5 | Model::ClaudeSonnet4_5 | Model::ClaudeSonnet4_5Thinking,
"apac",
"ap-southeast-2" | "ap-southeast-4",
) => Ok(format!("au.{}", model_id)),
(
Model::ClaudeHaiku4_5 | Model::ClaudeSonnet4_5 | Model::ClaudeSonnet4_5Thinking,
"apac",
"ap-northeast-1" | "ap-northeast-3",
) => Ok(format!("jp.{}", model_id)),
(Model::AmazonNovaLite, "us", r) if r.starts_with("ca-") => {
Ok(format!("ca.{}", model_id))
}
// Available everywhere
(Model::AmazonNovaLite | Model::AmazonNovaMicro | Model::AmazonNovaPro, _) => {
Ok(format!("{}.{}", region_group, model_id))
}
// Models in US
(
Model::AmazonNovaPremier
| Model::AmazonNovaLite
| Model::AmazonNovaMicro
| Model::AmazonNovaPro
| Model::Claude3_5Haiku
| Model::ClaudeHaiku4_5
| Model::Claude3_5Sonnet
@@ -655,16 +714,18 @@ impl Model {
| Model::PalmyraWriterX4
| Model::PalmyraWriterX5,
"us",
_,
) => Ok(format!("{}.{}", region_group, model_id)),
// Models available in EU
(
Model::Claude3_5Sonnet
Model::AmazonNovaLite
| Model::AmazonNovaMicro
| Model::AmazonNovaPro
| Model::Claude3_5Sonnet
| Model::ClaudeHaiku4_5
| Model::Claude3_7Sonnet
| Model::Claude3_7SonnetThinking
| Model::ClaudeSonnet4
| Model::ClaudeSonnet4Thinking
| Model::ClaudeSonnet4_5
| Model::ClaudeSonnet4_5Thinking
| Model::Claude3Haiku
@@ -673,26 +734,26 @@ impl Model {
| Model::MetaLlama323BInstructV1
| Model::MistralPixtralLarge2502V1,
"eu",
_,
) => Ok(format!("{}.{}", region_group, model_id)),
// Models available in APAC
(
Model::Claude3_5Sonnet
Model::AmazonNovaLite
| Model::AmazonNovaMicro
| Model::AmazonNovaPro
| Model::Claude3_5Sonnet
| Model::Claude3_5SonnetV2
| Model::ClaudeHaiku4_5
| Model::Claude3Haiku
| Model::Claude3Sonnet
| Model::Claude3_7Sonnet
| Model::Claude3_7SonnetThinking
| Model::ClaudeSonnet4
| Model::ClaudeSonnet4Thinking
| Model::ClaudeSonnet4_5
| Model::ClaudeSonnet4_5Thinking,
| Model::Claude3Haiku
| Model::Claude3Sonnet,
"apac",
_,
) => Ok(format!("{}.{}", region_group, model_id)),
// Any other combination is not supported
_ => Ok(self.request_id().into()),
_ => Ok(model_id.into()),
}
}
}
@@ -705,15 +766,15 @@ mod tests {
fn test_us_region_inference_ids() -> anyhow::Result<()> {
// Test US regions
assert_eq!(
Model::Claude3_5SonnetV2.cross_region_inference_id("us-east-1")?,
Model::Claude3_5SonnetV2.cross_region_inference_id("us-east-1", false)?,
"us.anthropic.claude-3-5-sonnet-20241022-v2:0"
);
assert_eq!(
Model::Claude3_5SonnetV2.cross_region_inference_id("us-west-2")?,
Model::Claude3_5SonnetV2.cross_region_inference_id("us-west-2", false)?,
"us.anthropic.claude-3-5-sonnet-20241022-v2:0"
);
assert_eq!(
Model::AmazonNovaPro.cross_region_inference_id("us-east-2")?,
Model::AmazonNovaPro.cross_region_inference_id("us-east-2", false)?,
"us.amazon.nova-pro-v1:0"
);
Ok(())
@@ -723,19 +784,19 @@ mod tests {
fn test_eu_region_inference_ids() -> anyhow::Result<()> {
// Test European regions
assert_eq!(
Model::ClaudeSonnet4.cross_region_inference_id("eu-west-1")?,
Model::ClaudeSonnet4.cross_region_inference_id("eu-west-1", false)?,
"eu.anthropic.claude-sonnet-4-20250514-v1:0"
);
assert_eq!(
Model::ClaudeSonnet4_5.cross_region_inference_id("eu-west-1")?,
Model::ClaudeSonnet4_5.cross_region_inference_id("eu-west-1", false)?,
"eu.anthropic.claude-sonnet-4-5-20250929-v1:0"
);
assert_eq!(
Model::Claude3Sonnet.cross_region_inference_id("eu-west-1")?,
Model::Claude3Sonnet.cross_region_inference_id("eu-west-1", false)?,
"eu.anthropic.claude-3-sonnet-20240229-v1:0"
);
assert_eq!(
Model::AmazonNovaMicro.cross_region_inference_id("eu-north-1")?,
Model::AmazonNovaMicro.cross_region_inference_id("eu-north-1", false)?,
"eu.amazon.nova-micro-v1:0"
);
Ok(())
@@ -745,15 +806,15 @@ mod tests {
fn test_apac_region_inference_ids() -> anyhow::Result<()> {
// Test Asia-Pacific regions
assert_eq!(
Model::Claude3_5SonnetV2.cross_region_inference_id("ap-northeast-1")?,
Model::Claude3_5SonnetV2.cross_region_inference_id("ap-northeast-1", false)?,
"apac.anthropic.claude-3-5-sonnet-20241022-v2:0"
);
assert_eq!(
Model::Claude3_5SonnetV2.cross_region_inference_id("ap-southeast-2")?,
Model::Claude3_5SonnetV2.cross_region_inference_id("ap-southeast-2", false)?,
"apac.anthropic.claude-3-5-sonnet-20241022-v2:0"
);
assert_eq!(
Model::AmazonNovaLite.cross_region_inference_id("ap-south-1")?,
Model::AmazonNovaLite.cross_region_inference_id("ap-south-1", false)?,
"apac.amazon.nova-lite-v1:0"
);
Ok(())
@@ -763,11 +824,11 @@ mod tests {
fn test_gov_region_inference_ids() -> anyhow::Result<()> {
// Test Government regions
assert_eq!(
Model::Claude3_5Sonnet.cross_region_inference_id("us-gov-east-1")?,
Model::Claude3_5Sonnet.cross_region_inference_id("us-gov-east-1", false)?,
"us-gov.anthropic.claude-3-5-sonnet-20240620-v1:0"
);
assert_eq!(
Model::Claude3Haiku.cross_region_inference_id("us-gov-west-1")?,
Model::Claude3Haiku.cross_region_inference_id("us-gov-west-1", false)?,
"us-gov.anthropic.claude-3-haiku-20240307-v1:0"
);
Ok(())
@@ -777,15 +838,15 @@ mod tests {
fn test_meta_models_inference_ids() -> anyhow::Result<()> {
// Test Meta models
assert_eq!(
Model::MetaLlama370BInstructV1.cross_region_inference_id("us-east-1")?,
Model::MetaLlama370BInstructV1.cross_region_inference_id("us-east-1", false)?,
"meta.llama3-70b-instruct-v1:0"
);
assert_eq!(
Model::MetaLlama3170BInstructV1.cross_region_inference_id("us-east-1")?,
Model::MetaLlama3170BInstructV1.cross_region_inference_id("us-east-1", false)?,
"us.meta.llama3-1-70b-instruct-v1:0"
);
assert_eq!(
Model::MetaLlama321BInstructV1.cross_region_inference_id("eu-west-1")?,
Model::MetaLlama321BInstructV1.cross_region_inference_id("eu-west-1", false)?,
"eu.meta.llama3-2-1b-instruct-v1:0"
);
Ok(())
@@ -796,11 +857,11 @@ mod tests {
// Mistral models don't follow the regional prefix pattern,
// so they should return their original IDs
assert_eq!(
Model::MistralMistralLarge2402V1.cross_region_inference_id("us-east-1")?,
Model::MistralMistralLarge2402V1.cross_region_inference_id("us-east-1", false)?,
"mistral.mistral-large-2402-v1:0"
);
assert_eq!(
Model::MistralMixtral8x7BInstructV0.cross_region_inference_id("eu-west-1")?,
Model::MistralMixtral8x7BInstructV0.cross_region_inference_id("eu-west-1", false)?,
"mistral.mixtral-8x7b-instruct-v0:1"
);
Ok(())
@@ -811,11 +872,11 @@ mod tests {
// AI21 models don't follow the regional prefix pattern,
// so they should return their original IDs
assert_eq!(
Model::AI21J2UltraV1.cross_region_inference_id("us-east-1")?,
Model::AI21J2UltraV1.cross_region_inference_id("us-east-1", false)?,
"ai21.j2-ultra-v1"
);
assert_eq!(
Model::AI21JambaInstructV1.cross_region_inference_id("eu-west-1")?,
Model::AI21JambaInstructV1.cross_region_inference_id("eu-west-1", false)?,
"ai21.jamba-instruct-v1:0"
);
Ok(())
@@ -826,11 +887,11 @@ mod tests {
// Cohere models don't follow the regional prefix pattern,
// so they should return their original IDs
assert_eq!(
Model::CohereCommandRV1.cross_region_inference_id("us-east-1")?,
Model::CohereCommandRV1.cross_region_inference_id("us-east-1", false)?,
"cohere.command-r-v1:0"
);
assert_eq!(
Model::CohereCommandTextV14_4k.cross_region_inference_id("ap-southeast-1")?,
Model::CohereCommandTextV14_4k.cross_region_inference_id("ap-southeast-1", false)?,
"cohere.command-text-v14:7:4k"
);
Ok(())
@@ -850,10 +911,17 @@ mod tests {
// Custom model should return its name unchanged
assert_eq!(
custom_model.cross_region_inference_id("us-east-1")?,
custom_model.cross_region_inference_id("us-east-1", false)?,
"custom.my-model-v1:0"
);
// Test that models without global support fall back to regional when allow_global is true
assert_eq!(
Model::AmazonNovaPro.cross_region_inference_id("us-east-1", true)?,
"us.amazon.nova-pro-v1:0",
"Nova Pro should fall back to regional profile even when allow_global is true"
);
Ok(())
}
@@ -892,3 +960,28 @@ mod tests {
);
}
}
#[test]
fn test_global_inference_ids() -> anyhow::Result<()> {
// Test global inference for models that support it when allow_global is true
assert_eq!(
Model::ClaudeSonnet4.cross_region_inference_id("us-east-1", true)?,
"global.anthropic.claude-sonnet-4-20250514-v1:0"
);
assert_eq!(
Model::ClaudeSonnet4_5.cross_region_inference_id("eu-west-1", true)?,
"global.anthropic.claude-sonnet-4-5-20250929-v1:0"
);
assert_eq!(
Model::ClaudeHaiku4_5.cross_region_inference_id("ap-south-1", true)?,
"global.anthropic.claude-haiku-4-5-20251001-v1:0"
);
// Test that regional prefix is used when allow_global is false
assert_eq!(
Model::ClaudeSonnet4.cross_region_inference_id("us-east-1", false)?,
"us.anthropic.claude-sonnet-4-20250514-v1:0"
);
Ok(())
}

View File

@@ -1723,6 +1723,10 @@ impl ProtoClient for Client {
fn is_via_collab(&self) -> bool {
true
}
fn has_wsl_interop(&self) -> bool {
false
}
}
/// prefix for the zed:// url scheme

View File

@@ -121,6 +121,8 @@ CREATE TABLE "project_repositories" (
"merge_message" VARCHAR,
"branch_summary" VARCHAR,
"head_commit_details" VARCHAR,
"remote_upstream_url" VARCHAR,
"remote_origin_url" VARCHAR,
PRIMARY KEY (project_id, id)
);

View File

@@ -0,0 +1,2 @@
ALTER TABLE "project_repositories" ADD COLUMN "remote_upstream_url" VARCHAR;
ALTER TABLE "project_repositories" ADD COLUMN "remote_origin_url" VARCHAR;

View File

@@ -362,6 +362,8 @@ impl Database {
entry_ids: ActiveValue::set("[]".into()),
head_commit_details: ActiveValue::set(None),
merge_message: ActiveValue::set(None),
remote_upstream_url: ActiveValue::set(None),
remote_origin_url: ActiveValue::set(None),
}
}),
)
@@ -511,6 +513,8 @@ impl Database {
serde_json::to_string(&update.current_merge_conflicts).unwrap(),
)),
merge_message: ActiveValue::set(update.merge_message.clone()),
remote_upstream_url: ActiveValue::set(update.remote_upstream_url.clone()),
remote_origin_url: ActiveValue::set(update.remote_origin_url.clone()),
})
.on_conflict(
OnConflict::columns([
@@ -1005,6 +1009,8 @@ impl Database {
is_last_update: true,
merge_message: db_repository_entry.merge_message,
stash_entries: Vec::new(),
remote_upstream_url: db_repository_entry.remote_upstream_url.clone(),
remote_origin_url: db_repository_entry.remote_origin_url.clone(),
});
}
}

View File

@@ -796,6 +796,8 @@ impl Database {
is_last_update: true,
merge_message: db_repository.merge_message,
stash_entries: Vec::new(),
remote_upstream_url: db_repository.remote_upstream_url.clone(),
remote_origin_url: db_repository.remote_origin_url.clone(),
});
}
}

View File

@@ -22,6 +22,8 @@ pub struct Model {
pub branch_summary: Option<String>,
// A JSON object representing the current Head commit values
pub head_commit_details: Option<String>,
pub remote_upstream_url: Option<String>,
pub remote_origin_url: Option<String>,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]

View File

@@ -469,6 +469,8 @@ impl Server {
.add_request_handler(forward_mutating_project_request::<proto::GetBlobContent>)
.add_request_handler(forward_mutating_project_request::<proto::GitCreateBranch>)
.add_request_handler(forward_mutating_project_request::<proto::GitChangeBranch>)
.add_request_handler(forward_mutating_project_request::<proto::GitCreateRemote>)
.add_request_handler(forward_mutating_project_request::<proto::GitRemoveRemote>)
.add_request_handler(forward_mutating_project_request::<proto::CheckForPushedCommits>)
.add_message_handler(broadcast_project_message_from_host::<proto::AdvertiseContexts>)
.add_message_handler(update_context)

View File

@@ -3518,7 +3518,6 @@ async fn test_git_blame_is_forwarded(cx_a: &mut TestAppContext, cx_b: &mut TestA
.into_iter()
.map(|(sha, message)| (sha.parse().unwrap(), message.into()))
.collect(),
remote_url: Some("git@github.com:zed-industries/zed.git".to_string()),
};
client_a.fs().set_blame_for_repo(
Path::new(path!("/my-repo/.git")),
@@ -3603,10 +3602,6 @@ async fn test_git_blame_is_forwarded(cx_a: &mut TestAppContext, cx_b: &mut TestA
for (idx, (buffer, entry)) in entries.iter().flatten().enumerate() {
let details = blame.details_for_entry(*buffer, entry).unwrap();
assert_eq!(details.message, format!("message for idx-{}", idx));
assert_eq!(
details.permalink.unwrap().to_string(),
format!("https://github.com/zed-industries/zed/commit/{}", entry.sha)
);
}
});
});

View File

@@ -182,7 +182,7 @@ use std::{
iter::{self, Peekable},
mem,
num::NonZeroU32,
ops::{Deref, DerefMut, Not, Range, RangeInclusive},
ops::{ControlFlow, Deref, DerefMut, Not, Range, RangeInclusive},
path::{Path, PathBuf},
rc::Rc,
sync::Arc,
@@ -191,7 +191,7 @@ use std::{
use task::{ResolvedTask, RunnableTag, TaskTemplate, TaskVariables};
use text::{BufferId, FromAnchor, OffsetUtf16, Rope, ToOffset as _};
use theme::{
ActiveTheme, PlayerColor, StatusColors, SyntaxTheme, Theme, ThemeSettings,
AccentColors, ActiveTheme, PlayerColor, StatusColors, SyntaxTheme, Theme, ThemeSettings,
observe_buffer_font_size_adjustment,
};
use ui::{
@@ -1206,11 +1206,17 @@ pub struct Editor {
select_next_is_case_sensitive: Option<bool>,
pub lookup_key: Option<Box<dyn Any + Send + Sync>>,
applicable_language_settings: HashMap<Option<LanguageName>, LanguageSettings>,
accent_overrides: Vec<SharedString>,
accent_data: Option<AccentData>,
fetched_tree_sitter_chunks: HashMap<ExcerptId, HashSet<Range<BufferRow>>>,
use_base_text_line_numbers: bool,
}
#[derive(Debug, PartialEq)]
struct AccentData {
colors: AccentColors,
overrides: Vec<SharedString>,
}
fn debounce_value(debounce_ms: u64) -> Option<Duration> {
if debounce_ms > 0 {
Some(Duration::from_millis(debounce_ms))
@@ -2354,7 +2360,7 @@ impl Editor {
lookup_key: None,
select_next_is_case_sensitive: None,
applicable_language_settings: HashMap::default(),
accent_overrides: Vec::new(),
accent_data: None,
fetched_tree_sitter_chunks: HashMap::default(),
use_base_text_line_numbers: false,
};
@@ -2364,7 +2370,7 @@ impl Editor {
}
editor.applicable_language_settings = editor.fetch_applicable_language_settings(cx);
editor.accent_overrides = editor.fetch_accent_overrides(cx);
editor.accent_data = editor.fetch_accent_data(cx);
if let Some(breakpoints) = editor.breakpoint_store.as_ref() {
editor
@@ -8067,10 +8073,17 @@ impl Editor {
if self.edit_prediction_indent_conflict {
let cursor_point = cursor.to_point(&multibuffer);
let mut suggested_indent = None;
multibuffer.suggested_indents_callback(
cursor_point.row..cursor_point.row + 1,
|_, indent| {
suggested_indent = Some(indent);
ControlFlow::Break(())
},
cx,
);
let indents = multibuffer.suggested_indents(cursor_point.row..cursor_point.row + 1, cx);
if let Some((_, indent)) = indents.iter().next()
if let Some(indent) = suggested_indent
&& indent.len == cursor_point.column
{
self.edit_prediction_indent_conflict = false;
@@ -20077,6 +20090,11 @@ impl Editor {
self.show_indent_guides
}
pub fn disable_indent_guides(&mut self) -> Option<bool> {
self.show_indent_guides = Some(false);
self.show_indent_guides
}
pub fn toggle_line_numbers(
&mut self,
_: &ToggleLineNumbers,
@@ -21706,16 +21724,18 @@ impl Editor {
cx.notify();
}
fn fetch_accent_overrides(&self, cx: &App) -> Vec<SharedString> {
fn fetch_accent_data(&self, cx: &App) -> Option<AccentData> {
if !self.mode.is_full() {
return Vec::new();
return None;
}
let theme_settings = theme::ThemeSettings::get_global(cx);
let theme = cx.theme();
let accent_colors = theme.accents().clone();
theme_settings
let accent_overrides = theme_settings
.theme_overrides
.get(cx.theme().name.as_ref())
.get(theme.name.as_ref())
.map(|theme_style| &theme_style.accents)
.into_iter()
.flatten()
@@ -21728,7 +21748,12 @@ impl Editor {
.flatten(),
)
.flat_map(|accent| accent.0.clone())
.collect()
.collect();
Some(AccentData {
colors: accent_colors,
overrides: accent_overrides,
})
}
fn fetch_applicable_language_settings(
@@ -21758,9 +21783,9 @@ impl Editor {
let language_settings_changed = new_language_settings != self.applicable_language_settings;
self.applicable_language_settings = new_language_settings;
let new_accent_overrides = self.fetch_accent_overrides(cx);
let accent_overrides_changed = new_accent_overrides != self.accent_overrides;
self.accent_overrides = new_accent_overrides;
let new_accents = self.fetch_accent_data(cx);
let accents_changed = new_accents != self.accent_data;
self.accent_data = new_accents;
if self.diagnostics_enabled() {
let new_severity = EditorSettings::get_global(cx)
@@ -21834,7 +21859,7 @@ impl Editor {
}
}
if language_settings_changed || accent_overrides_changed {
if language_settings_changed || accents_changed {
self.colorize_brackets(true, cx);
}

View File

@@ -19095,6 +19095,109 @@ async fn test_document_format_with_prettier(cx: &mut TestAppContext) {
);
}
#[gpui::test]
async fn test_document_format_with_prettier_explicit_language(cx: &mut TestAppContext) {
init_test(cx, |settings| {
settings.defaults.formatter = Some(FormatterList::Single(Formatter::Prettier))
});
let fs = FakeFs::new(cx.executor());
fs.insert_file(path!("/file.settings"), Default::default())
.await;
let project = Project::test(fs, [path!("/file.settings").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _| project.languages().clone());
let ts_lang = Arc::new(Language::new(
LanguageConfig {
name: "TypeScript".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["ts".to_string()],
..LanguageMatcher::default()
},
prettier_parser_name: Some("typescript".to_string()),
..LanguageConfig::default()
},
Some(tree_sitter_typescript::LANGUAGE_TYPESCRIPT.into()),
));
language_registry.add(ts_lang.clone());
update_test_language_settings(cx, |settings| {
settings.defaults.prettier.get_or_insert_default().allowed = Some(true);
});
let test_plugin = "test_plugin";
let _ = language_registry.register_fake_lsp(
"TypeScript",
FakeLspAdapter {
prettier_plugins: vec![test_plugin],
..Default::default()
},
);
let prettier_format_suffix = project::TEST_PRETTIER_FORMAT_SUFFIX;
let buffer = project
.update(cx, |project, cx| {
project.open_local_buffer(path!("/file.settings"), cx)
})
.await
.unwrap();
project.update(cx, |project, cx| {
project.set_language_for_buffer(&buffer, ts_lang, cx)
});
let buffer_text = "one\ntwo\nthree\n";
let buffer = cx.new(|cx| MultiBuffer::singleton(buffer, cx));
let (editor, cx) = cx.add_window_view(|window, cx| build_editor(buffer, window, cx));
editor.update_in(cx, |editor, window, cx| {
editor.set_text(buffer_text, window, cx)
});
editor
.update_in(cx, |editor, window, cx| {
editor.perform_format(
project.clone(),
FormatTrigger::Manual,
FormatTarget::Buffers(editor.buffer().read(cx).all_buffers()),
window,
cx,
)
})
.unwrap()
.await;
assert_eq!(
editor.update(cx, |editor, cx| editor.text(cx)),
buffer_text.to_string() + prettier_format_suffix + "\ntypescript",
"Test prettier formatting was not applied to the original buffer text",
);
update_test_language_settings(cx, |settings| {
settings.defaults.formatter = Some(FormatterList::default())
});
let format = editor.update_in(cx, |editor, window, cx| {
editor.perform_format(
project.clone(),
FormatTrigger::Manual,
FormatTarget::Buffers(editor.buffer().read(cx).all_buffers()),
window,
cx,
)
});
format.await.unwrap();
assert_eq!(
editor.update(cx, |editor, cx| editor.text(cx)),
buffer_text.to_string()
+ prettier_format_suffix
+ "\ntypescript\n"
+ prettier_format_suffix
+ "\ntypescript",
"Autoformatting (via test prettier) was not applied to the original buffer text",
);
}
#[gpui::test]
async fn test_addition_reverts(cx: &mut TestAppContext) {
init_test(cx, |_| {});

View File

@@ -3915,6 +3915,8 @@ impl EditorElement {
) -> impl IntoElement {
let editor = self.editor.read(cx);
let multi_buffer = editor.buffer.read(cx);
let is_read_only = self.editor.read(cx).read_only(cx);
let file_status = multi_buffer
.all_diff_hunks_expanded()
.then(|| editor.status_for_buffer_id(for_excerpt.buffer_id, cx))
@@ -3967,7 +3969,7 @@ impl EditorElement {
.gap_1p5()
.when(is_sticky, |el| el.shadow_md())
.border_1()
.map(|div| {
.map(|border| {
let border_color = if is_selected
&& is_folded
&& focus_handle.contains_focused(window, cx)
@@ -3976,7 +3978,7 @@ impl EditorElement {
} else {
colors.border
};
div.border_color(border_color)
border.border_color(border_color)
})
.bg(colors.editor_subheader_background)
.hover(|style| style.bg(colors.element_hover))
@@ -4056,13 +4058,15 @@ impl EditorElement {
})
.take(1),
)
.child(
h_flex()
.size_3()
.justify_center()
.flex_shrink_0()
.children(indicator),
)
.when(!is_read_only, |this| {
this.child(
h_flex()
.size_3()
.justify_center()
.flex_shrink_0()
.children(indicator),
)
})
.child(
h_flex()
.cursor_pointer()

View File

@@ -508,7 +508,19 @@ impl GitBlame {
let buffer_edits = buffer.update(cx, |buffer, _| buffer.subscribe());
let blame_buffer = project.blame_buffer(&buffer, None, cx);
Some(async move { (id, snapshot, buffer_edits, blame_buffer.await) })
let remote_url = project
.git_store()
.read(cx)
.repository_and_path_for_buffer_id(buffer.read(cx).remote_id(), cx)
.and_then(|(repo, _)| {
repo.read(cx)
.remote_upstream_url
.clone()
.or(repo.read(cx).remote_origin_url.clone())
});
Some(
async move { (id, snapshot, buffer_edits, blame_buffer.await, remote_url) },
)
})
.collect::<Vec<_>>()
});
@@ -524,13 +536,9 @@ impl GitBlame {
.await;
let mut res = vec![];
let mut errors = vec![];
for (id, snapshot, buffer_edits, blame) in blame {
for (id, snapshot, buffer_edits, blame, remote_url) in blame {
match blame {
Ok(Some(Blame {
entries,
messages,
remote_url,
})) => {
Ok(Some(Blame { entries, messages })) => {
let entries = build_blame_entry_sum_tree(
entries,
snapshot.max_point().row,

View File

@@ -50,6 +50,8 @@ pub struct FakeGitRepositoryState {
pub blames: HashMap<RepoPath, Blame>,
pub current_branch_name: Option<String>,
pub branches: HashSet<String>,
/// List of remotes; keys are names and values are URLs
pub remotes: HashMap<String, String>,
pub simulated_index_write_error_message: Option<String>,
pub refs: HashMap<String, String>,
}
@@ -68,6 +70,7 @@ impl FakeGitRepositoryState {
refs: HashMap::from_iter([("HEAD".into(), "abc".into())]),
merge_base_contents: Default::default(),
oids: Default::default(),
remotes: HashMap::default(),
}
}
}
@@ -432,8 +435,13 @@ impl GitRepository for FakeGitRepository {
})
}
fn delete_branch(&self, _name: String) -> BoxFuture<'_, Result<()>> {
unimplemented!()
fn delete_branch(&self, name: String) -> BoxFuture<'_, Result<()>> {
self.with_state_async(true, move |state| {
if !state.branches.remove(&name) {
bail!("no such branch: {name}");
}
Ok(())
})
}
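The fake repository's new `delete_branch` now fails loudly instead of panicking via `unimplemented!`. A minimal standalone sketch of the same check (not part of the diff; plain `HashSet` and `Result<_, String>` stand in for the fake-repo state and `anyhow`):

```rust
use std::collections::HashSet;

/// Remove a branch from the set; removing a branch that does not
/// exist is an error rather than a silent no-op, mirroring the
/// `bail!("no such branch: ...")` in the fake implementation.
fn delete_branch(branches: &mut HashSet<String>, name: &str) -> Result<(), String> {
    if !branches.remove(name) {
        return Err(format!("no such branch: {name}"));
    }
    Ok(())
}
```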
fn blame(&self, path: RepoPath, _content: Rope) -> BoxFuture<'_, Result<git::blame::Blame>> {
@@ -598,6 +606,19 @@ impl GitRepository for FakeGitRepository {
unimplemented!()
}
fn get_all_remotes(&self) -> BoxFuture<'_, Result<Vec<Remote>>> {
self.with_state_async(false, move |state| {
let remotes = state
.remotes
.keys()
.map(|r| Remote {
name: r.clone().into(),
})
.collect::<Vec<_>>();
Ok(remotes)
})
}
fn get_push_remote(&self, _branch: String) -> BoxFuture<'_, Result<Option<Remote>>> {
unimplemented!()
}
@@ -606,10 +627,6 @@ impl GitRepository for FakeGitRepository {
unimplemented!()
}
fn get_all_remotes(&self) -> BoxFuture<'_, Result<Vec<Remote>>> {
unimplemented!()
}
fn check_for_pushed_commit(&self) -> BoxFuture<'_, Result<Vec<gpui::SharedString>>> {
future::ready(Ok(Vec::new())).boxed()
}
@@ -683,6 +700,20 @@ impl GitRepository for FakeGitRepository {
fn default_branch(&self) -> BoxFuture<'_, Result<Option<SharedString>>> {
async { Ok(Some("main".into())) }.boxed()
}
fn create_remote(&self, name: String, url: String) -> BoxFuture<'_, Result<()>> {
self.with_state_async(true, move |state| {
state.remotes.insert(name, url);
Ok(())
})
}
fn remove_remote(&self, name: String) -> BoxFuture<'_, Result<()>> {
self.with_state_async(true, move |state| {
state.remotes.remove(&name);
Ok(())
})
}
}
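Taken together, the fake repository's `create_remote`, `remove_remote`, and `get_all_remotes` additions amount to a name-to-URL map. A minimal standalone model of that store (not part of the diff; the type and method names here are illustrative):

```rust
use std::collections::HashMap;

/// In-memory model of the fake repository's remote store:
/// names map to URLs; create inserts, remove deletes,
/// and listing returns just the names.
#[derive(Default)]
struct RemoteStore {
    remotes: HashMap<String, String>,
}

impl RemoteStore {
    fn create_remote(&mut self, name: &str, url: &str) {
        self.remotes.insert(name.to_owned(), url.to_owned());
    }
    fn remove_remote(&mut self, name: &str) {
        self.remotes.remove(name);
    }
    fn remote_names(&self) -> Vec<String> {
        let mut names: Vec<String> = self.remotes.keys().cloned().collect();
        names.sort();
        names
    }
}
```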
#[cfg(test)]

View File

@@ -19,7 +19,6 @@ pub use git2 as libgit;
pub struct Blame {
pub entries: Vec<BlameEntry>,
pub messages: HashMap<Oid, String>,
pub remote_url: Option<String>,
}
#[derive(Clone, Debug, Default)]
@@ -36,7 +35,6 @@ impl Blame {
working_directory: &Path,
path: &RepoPath,
content: &Rope,
remote_url: Option<String>,
) -> Result<Self> {
let output = run_git_blame(git_binary, working_directory, path, content).await?;
let mut entries = parse_git_blame(&output)?;
@@ -53,11 +51,7 @@ impl Blame {
.await
.context("failed to get commit messages")?;
Ok(Self {
entries,
messages,
remote_url,
})
Ok(Self { entries, messages })
}
}

View File

@@ -1,3 +1,4 @@
use std::str::FromStr;
use std::sync::LazyLock;
use derive_more::Deref;
@@ -11,7 +12,7 @@ pub struct RemoteUrl(Url);
static USERNAME_REGEX: LazyLock<Regex> =
LazyLock::new(|| Regex::new(r"^[0-9a-zA-Z\-_]+@").expect("Failed to create USERNAME_REGEX"));
impl std::str::FromStr for RemoteUrl {
impl FromStr for RemoteUrl {
type Err = url::ParseError;
fn from_str(input: &str) -> Result<Self, Self::Err> {

View File

@@ -7,13 +7,15 @@ use collections::HashMap;
use futures::future::BoxFuture;
use futures::io::BufWriter;
use futures::{AsyncWriteExt, FutureExt as _, select_biased};
use git2::BranchType;
use git2::{BranchType, ErrorCode};
use gpui::{AppContext as _, AsyncApp, BackgroundExecutor, SharedString, Task};
use parking_lot::Mutex;
use rope::Rope;
use schemars::JsonSchema;
use serde::Deserialize;
use smol::io::{AsyncBufReadExt, AsyncReadExt, BufReader};
use std::collections::HashSet;
use std::ffi::{OsStr, OsString};
use std::process::{ExitStatus, Stdio};
use std::{
@@ -55,6 +57,12 @@ impl Branch {
self.ref_name.starts_with("refs/remotes/")
}
pub fn remote_name(&self) -> Option<&str> {
self.ref_name
.strip_prefix("refs/remotes/")
.and_then(|stripped| stripped.split("/").next())
}
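The new `remote_name` helper extracts the remote's name from a fully qualified remote ref. A standalone sketch of the same prefix-stripping logic (not part of the diff; a plain `&str` stands in for the branch's `ref_name`):

```rust
/// Extract the remote name from a remote-tracking ref,
/// e.g. "refs/remotes/origin/main" -> Some("origin");
/// local refs yield None because the prefix is absent.
fn remote_name(ref_name: &str) -> Option<&str> {
    ref_name
        .strip_prefix("refs/remotes/")
        .and_then(|stripped| stripped.split('/').next())
}
```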
pub fn tracking_status(&self) -> Option<UpstreamTrackingStatus> {
self.upstream
.as_ref()
@@ -590,6 +598,10 @@ pub trait GitRepository: Send + Sync {
fn get_all_remotes(&self) -> BoxFuture<'_, Result<Vec<Remote>>>;
fn remove_remote(&self, name: String) -> BoxFuture<'_, Result<()>>;
fn create_remote(&self, name: String, url: String) -> BoxFuture<'_, Result<()>>;
/// Returns a list of remote branches that contain HEAD.
fn check_for_pushed_commit(&self) -> BoxFuture<'_, Result<Vec<SharedString>>>;
@@ -1385,9 +1397,19 @@ impl GitRepository for RealGitRepository {
branch
} else if let Ok(revision) = repo.find_branch(&name, BranchType::Remote) {
let (_, branch_name) = name.split_once("/").context("Unexpected branch format")?;
let revision = revision.get();
let branch_commit = revision.peel_to_commit()?;
let mut branch = repo.branch(&branch_name, &branch_commit, false)?;
let mut branch = match repo.branch(&branch_name, &branch_commit, false) {
Ok(branch) => branch,
Err(err) if err.code() == ErrorCode::Exists => {
repo.find_branch(&branch_name, BranchType::Local)?
}
Err(err) => {
return Err(err.into());
}
};
branch.set_upstream(Some(&name))?;
branch
} else {
@@ -1403,7 +1425,6 @@ impl GitRepository for RealGitRepository {
self.executor
.spawn(async move {
let branch = branch.await?;
GitBinary::new(git_binary_path, working_directory?, executor)
.run(&["checkout", &branch])
.await?;
@@ -1473,28 +1494,17 @@ impl GitRepository for RealGitRepository {
let git_binary_path = self.any_git_binary_path.clone();
let executor = self.executor.clone();
async move {
let remote_url = if let Some(remote_url) = self.remote_url("upstream").await {
Some(remote_url)
} else if let Some(remote_url) = self.remote_url("origin").await {
Some(remote_url)
} else {
None
};
executor
.spawn(async move {
crate::blame::Blame::for_path(
&git_binary_path,
&working_directory?,
&path,
&content,
remote_url,
)
.await
})
executor
.spawn(async move {
crate::blame::Blame::for_path(
&git_binary_path,
&working_directory?,
&path,
&content,
)
.await
}
.boxed()
})
.boxed()
}
fn file_history(&self, path: RepoPath) -> BoxFuture<'_, Result<FileHistory>> {
@@ -1993,7 +2003,7 @@ impl GitRepository for RealGitRepository {
let working_directory = working_directory?;
let output = new_smol_command(&git_binary_path)
.current_dir(&working_directory)
.args(["remote"])
.args(["remote", "-v"])
.output()
.await?;
@@ -2002,14 +2012,43 @@ impl GitRepository for RealGitRepository {
"Failed to get all remotes:\n{}",
String::from_utf8_lossy(&output.stderr)
);
let remote_names = String::from_utf8_lossy(&output.stdout)
.split('\n')
.filter(|name| !name.is_empty())
.map(|name| Remote {
name: name.trim().to_string().into(),
let remote_names: HashSet<Remote> = String::from_utf8_lossy(&output.stdout)
.lines()
.filter(|line| !line.is_empty())
.filter_map(|line| {
let mut split_line = line.split_whitespace();
let remote_name = split_line.next()?;
Some(Remote {
name: remote_name.trim().to_string().into(),
})
})
.collect();
Ok(remote_names)
Ok(remote_names.into_iter().collect())
})
.boxed()
}
fn remove_remote(&self, name: String) -> BoxFuture<'_, Result<()>> {
let repo = self.repository.clone();
self.executor
.spawn(async move {
let repo = repo.lock();
repo.remote_delete(&name)?;
Ok(())
})
.boxed()
}
fn create_remote(&self, name: String, url: String) -> BoxFuture<'_, Result<()>> {
let repo = self.repository.clone();
self.executor
.spawn(async move {
let repo = repo.lock();
repo.remote(&name, url.as_ref())?;
Ok(())
})
.boxed()
}
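The switch from `git remote` to `git remote -v` means each remote now appears twice in the output (one fetch line, one push line), which is why the parser above collects into a `HashSet` before producing the final list. A standalone sketch of that parse-and-dedup step:

```rust
use std::collections::HashSet;

/// Parse `git remote -v` output into a deduplicated list of remote
/// names. The first whitespace-separated field of each line is the
/// name; fetch and push lines for the same remote collapse in the set.
fn parse_remote_names(output: &str) -> Vec<String> {
    let names: HashSet<&str> = output
        .lines()
        .filter(|line| !line.is_empty())
        .filter_map(|line| line.split_whitespace().next())
        .collect();
    names.into_iter().map(str::to_owned).collect()
}
```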

File diff suppressed because it is too large

View File

@@ -1,7 +1,7 @@
use anyhow::{Context as _, Result};
use buffer_diff::{BufferDiff, BufferDiffSnapshot};
use editor::display_map::{BlockPlacement, BlockProperties, BlockStyle};
use editor::{Addon, Editor, EditorEvent, ExcerptId, ExcerptRange, MultiBuffer};
use editor::{Editor, EditorEvent, ExcerptId, ExcerptRange, MultiBuffer};
use git::repository::{CommitDetails, CommitDiff, RepoPath};
use git::{GitHostingProviderRegistry, GitRemote, parse_git_remote_url};
use gpui::{
@@ -11,9 +11,8 @@ use gpui::{
};
use language::{
Anchor, Buffer, Capability, DiskState, File, LanguageRegistry, LineEnding, ReplicaId, Rope,
TextBuffer, ToPoint,
TextBuffer,
};
use multi_buffer::ExcerptInfo;
use multi_buffer::PathKey;
use project::{Project, WorktreeId, git_store::Repository};
use std::{
@@ -22,11 +21,9 @@ use std::{
sync::Arc,
};
use theme::ActiveTheme;
use ui::{
Avatar, Button, ButtonCommon, Clickable, Color, Icon, IconName, IconSize, Label,
LabelCommon as _, LabelSize, SharedString, div, h_flex, v_flex,
};
use ui::{Avatar, DiffStat, Tooltip, prelude::*};
use util::{ResultExt, paths::PathStyle, rel_path::RelPath, truncate_and_trailoff};
use workspace::item::TabTooltipContent;
use workspace::{
Item, ItemHandle, ItemNavHistory, ToolbarItemEvent, ToolbarItemLocation, ToolbarItemView,
Workspace,
@@ -151,11 +148,11 @@ impl CommitView {
let editor = cx.new(|cx| {
let mut editor =
Editor::for_multibuffer(multibuffer.clone(), Some(project.clone()), window, cx);
editor.disable_inline_diagnostics();
editor.disable_indent_guides();
editor.set_expand_all_diff_hunks(cx);
editor.register_addon(CommitViewAddon {
multibuffer: multibuffer.downgrade(),
});
editor
});
let commit_sha = Arc::<str>::from(commit.sha.as_ref());
@@ -357,6 +354,41 @@ impl CommitView {
.into_any()
}
fn calculate_changed_lines(&self, cx: &App) -> (u32, u32) {
let snapshot = self.multibuffer.read(cx).snapshot(cx);
let mut total_additions = 0u32;
let mut total_deletions = 0u32;
let mut seen_buffers = std::collections::HashSet::new();
for (_, buffer, _) in snapshot.excerpts() {
let buffer_id = buffer.remote_id();
if !seen_buffers.insert(buffer_id) {
continue;
}
let Some(diff) = snapshot.diff_for_buffer_id(buffer_id) else {
continue;
};
let base_text = diff.base_text();
for hunk in diff.hunks_intersecting_range(Anchor::MIN..Anchor::MAX, buffer) {
let added_rows = hunk.range.end.row.saturating_sub(hunk.range.start.row);
total_additions += added_rows;
let base_start = base_text
.offset_to_point(hunk.diff_base_byte_range.start)
.row;
let base_end = base_text.offset_to_point(hunk.diff_base_byte_range.end).row;
let deleted_rows = base_end.saturating_sub(base_start);
total_deletions += deleted_rows;
}
}
(total_additions, total_deletions)
}
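`calculate_changed_lines` derives additions from each hunk's new-text row range and deletions from its base-text row range, using `saturating_sub` so empty ranges contribute zero. A standalone sketch of that counting, with plain row-range pairs standing in for diff hunks (an assumption; the real code resolves rows from anchors and base-text offsets):

```rust
use std::ops::Range;

/// Sum added and deleted rows over (new_rows, base_rows) hunk pairs.
fn diff_stat(hunks: &[(Range<u32>, Range<u32>)]) -> (u32, u32) {
    let mut additions = 0;
    let mut deletions = 0;
    for (new_rows, base_rows) in hunks {
        additions += new_rows.end.saturating_sub(new_rows.start);
        deletions += base_rows.end.saturating_sub(base_rows.start);
    }
    (additions, deletions)
}
```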
fn render_header(&self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let commit = &self.commit;
let author_name = commit.author_name.clone();
@@ -380,46 +412,72 @@ impl CommitView {
)
});
v_flex()
.p_4()
.pl_0()
.gap_4()
let (additions, deletions) = self.calculate_changed_lines(cx);
let commit_diff_stat = if additions > 0 || deletions > 0 {
Some(DiffStat::new(
"commit-diff-stat",
additions as usize,
deletions as usize,
))
} else {
None
};
h_flex()
.border_b_1()
.border_color(cx.theme().colors().border)
.border_color(cx.theme().colors().border_variant)
.child(
h_flex()
.w(self.editor.read(cx).last_gutter_dimensions().full_width())
.justify_center()
.child(self.render_commit_avatar(&commit.sha, rems_from_px(48.), window, cx)),
)
.child(
h_flex()
.py_4()
.pl_1()
.pr_4()
.w_full()
.items_start()
.child(
h_flex()
.w(self.editor.read(cx).last_gutter_dimensions().full_width())
.justify_center()
.child(self.render_commit_avatar(
&commit.sha,
gpui::rems(3.0),
window,
cx,
)),
)
.justify_between()
.flex_wrap()
.child(
v_flex()
.gap_1()
.child(
h_flex()
.gap_3()
.items_baseline()
.gap_1()
.child(Label::new(author_name).color(Color::Default))
.child(
Label::new(format!("commit {}", commit.sha))
.color(Color::Muted),
Label::new(format!("Commit:{}", commit.sha))
.color(Color::Muted)
.size(LabelSize::Small)
.truncate()
.buffer_font(cx),
),
)
.child(Label::new(date_string).color(Color::Muted)),
.child(
h_flex()
.gap_1p5()
.child(
Label::new(date_string)
.color(Color::Muted)
.size(LabelSize::Small),
)
.child(
Label::new("•")
.color(Color::Ignored)
.size(LabelSize::Small),
)
.children(commit_diff_stat),
),
)
.child(div().flex_grow())
.children(github_url.map(|url| {
Button::new("view_on_github", "View on GitHub")
.icon(IconName::Github)
.style(ui::ButtonStyle::Subtle)
.icon_color(Color::Muted)
.icon_size(IconSize::Small)
.icon_position(IconPosition::Start)
.on_click(move |_, _, cx| cx.open_url(&url))
})),
)
@@ -714,55 +772,6 @@ impl language::File for GitBlob {
// }
// }
struct CommitViewAddon {
multibuffer: WeakEntity<MultiBuffer>,
}
impl Addon for CommitViewAddon {
fn render_buffer_header_controls(
&self,
excerpt: &ExcerptInfo,
_window: &Window,
cx: &App,
) -> Option<AnyElement> {
let multibuffer = self.multibuffer.upgrade()?;
let snapshot = multibuffer.read(cx).snapshot(cx);
let excerpts = snapshot.excerpts().collect::<Vec<_>>();
let current_idx = excerpts.iter().position(|(id, _, _)| *id == excerpt.id)?;
let (_, _, current_range) = &excerpts[current_idx];
let start_row = current_range.context.start.to_point(&excerpt.buffer).row;
let prev_end_row = if current_idx > 0 {
let (_, prev_buffer, prev_range) = &excerpts[current_idx - 1];
if prev_buffer.remote_id() == excerpt.buffer_id {
prev_range.context.end.to_point(&excerpt.buffer).row
} else {
0
}
} else {
0
};
let skipped_lines = start_row.saturating_sub(prev_end_row);
if skipped_lines > 0 {
Some(
Label::new(format!("{} unchanged lines", skipped_lines))
.color(Color::Muted)
.size(LabelSize::Small)
.into_any_element(),
)
} else {
None
}
}
fn to_any(&self) -> &dyn Any {
self
}
}
async fn build_buffer(
mut text: String,
blob: Arc<dyn File>,
@@ -865,13 +874,28 @@ impl Item for CommitView {
fn tab_content_text(&self, _detail: usize, _cx: &App) -> SharedString {
let short_sha = self.commit.sha.get(0..7).unwrap_or(&*self.commit.sha);
let subject = truncate_and_trailoff(self.commit.message.split('\n').next().unwrap(), 20);
format!("{short_sha} - {subject}").into()
format!("{short_sha} {subject}").into()
}
fn tab_tooltip_text(&self, _: &App) -> Option<ui::SharedString> {
fn tab_tooltip_content(&self, _: &App) -> Option<TabTooltipContent> {
let short_sha = self.commit.sha.get(0..16).unwrap_or(&*self.commit.sha);
let subject = self.commit.message.split('\n').next().unwrap();
Some(format!("{short_sha} - {subject}").into())
Some(TabTooltipContent::Custom(Box::new(Tooltip::element({
let subject = subject.to_string();
let short_sha = short_sha.to_string();
move |_, _| {
v_flex()
.child(Label::new(subject.clone()))
.child(
Label::new(short_sha.clone())
.color(Color::Muted)
.size(LabelSize::Small),
)
.into_any_element()
}
}))))
}
fn to_item_events(event: &EditorEvent, f: impl FnMut(ItemEvent)) {
@@ -988,12 +1012,11 @@ impl Item for CommitView {
impl Render for CommitView {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let is_stash = self.stash.is_some();
div()
v_flex()
.key_context(if is_stash { "StashDiff" } else { "CommitDiff" })
.bg(cx.theme().colors().editor_background)
.flex()
.flex_col()
.size_full()
.bg(cx.theme().colors().editor_background)
.child(self.render_header(window, cx))
.child(div().flex_grow().child(self.editor.clone()))
}
@@ -1013,7 +1036,7 @@ impl EventEmitter<ToolbarItemEvent> for CommitViewToolbar {}
impl Render for CommitViewToolbar {
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
div()
div().hidden()
}
}

View File

@@ -3463,7 +3463,6 @@ impl GitPanel {
) -> Option<impl IntoElement> {
let active_repository = self.active_repository.clone()?;
let panel_editor_style = panel_editor_style(true, window, cx);
let enable_coauthors = self.render_co_authors(cx);
let editor_focus_handle = self.commit_editor.focus_handle(cx);
@@ -4772,7 +4771,6 @@ impl RenderOnce for PanelRepoFooter {
const MAX_REPO_LEN: usize = 16;
const LABEL_CHARACTER_BUDGET: usize = MAX_BRANCH_LEN + MAX_REPO_LEN;
const MAX_SHORT_SHA_LEN: usize = 8;
let branch_name = self
.branch
.as_ref()

View File

@@ -1,4 +1,5 @@
use anyhow::Context as _;
use git::repository::{Remote, RemoteCommandOutput};
use linkify::{LinkFinder, LinkKind};
use ui::SharedString;

View File

@@ -26,12 +26,13 @@ pub(crate) struct LinuxDispatcher {
main_thread_id: thread::ThreadId,
}
const MIN_THREADS: usize = 2;
impl LinuxDispatcher {
pub fn new(main_sender: Sender<RunnableVariant>) -> Self {
let (background_sender, background_receiver) = flume::unbounded::<RunnableVariant>();
let thread_count = std::thread::available_parallelism()
.map(|i| i.get())
.unwrap_or(1);
let thread_count =
std::thread::available_parallelism().map_or(MIN_THREADS, |i| i.get().max(MIN_THREADS));
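The `map_or`/`max` combination guarantees at least `MIN_THREADS` background threads whether `available_parallelism` errors out or reports a single core. A standalone sketch under simplified types (`Result<usize, ()>` stands in for `io::Result<NonZeroUsize>`):

```rust
const MIN_THREADS: usize = 2;

/// Clamp the reported parallelism to a floor of MIN_THREADS,
/// falling back to the floor entirely when detection fails.
fn thread_count(available: Result<usize, ()>) -> usize {
    available.map_or(MIN_THREADS, |n| n.max(MIN_THREADS))
}
```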
let mut background_threads = (0..thread_count)
.map(|i| {

View File

@@ -1419,7 +1419,7 @@ impl Dispatch<wl_keyboard::WlKeyboard, ()> for WaylandClientStatePtr {
state.repeat.current_keycode = Some(keycode);
let rate = state.repeat.characters_per_second;
let repeat_interval = Duration::from_secs(1) / rate;
let repeat_interval = Duration::from_secs(1) / rate.max(1);
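The `.max(1)` matters because dividing a `Duration` by zero panics; clamping the compositor-reported repeat rate keeps a misbehaving Wayland server from crashing the client. A standalone sketch of the guarded computation:

```rust
use std::time::Duration;

/// Compute the key-repeat interval from characters-per-second,
/// clamping the divisor to at least 1 since `Duration / 0` panics.
fn repeat_interval(characters_per_second: u32) -> Duration {
    Duration::from_secs(1) / characters_per_second.max(1)
}
```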
let id = state.repeat.current_id;
state
.loop_handle

View File

@@ -7,9 +7,7 @@ use std::{
use flume::Sender;
use util::ResultExt;
use windows::{
System::Threading::{
ThreadPool, ThreadPoolTimer, TimerElapsedHandler, WorkItemHandler, WorkItemPriority,
},
System::Threading::{ThreadPool, ThreadPoolTimer, TimerElapsedHandler, WorkItemHandler},
Win32::{
Foundation::{LPARAM, WPARAM},
UI::WindowsAndMessaging::PostMessageW,
@@ -55,7 +53,7 @@ impl WindowsDispatcher {
Ok(())
})
};
ThreadPool::RunWithPriorityAsync(&handler, WorkItemPriority::High).log_err();
ThreadPool::RunAsync(&handler).log_err();
}
fn dispatch_on_threadpool_after(&self, runnable: RunnableVariant, duration: Duration) {

View File

@@ -71,6 +71,7 @@ pub struct AmazonBedrockSettings {
pub profile_name: Option<String>,
pub role_arn: Option<String>,
pub authentication_method: Option<BedrockAuthMethod>,
pub allow_global: Option<bool>,
}
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, EnumIter, IntoStaticStr, JsonSchema)]
@@ -239,6 +240,13 @@ impl State {
.or(settings_region)
.unwrap_or(String::from("us-east-1"))
}
fn get_allow_global(&self) -> bool {
self.settings
.as_ref()
.and_then(|s| s.allow_global)
.unwrap_or(false)
}
}
pub struct BedrockLanguageModelProvider {
@@ -545,11 +553,13 @@ impl LanguageModel for BedrockModel {
LanguageModelCompletionError,
>,
> {
let Ok(region) = cx.read_entity(&self.state, |state, _cx| state.get_region()) else {
let Ok((region, allow_global)) = cx.read_entity(&self.state, |state, _cx| {
(state.get_region(), state.get_allow_global())
}) else {
return async move { Err(anyhow::anyhow!("App State Dropped").into()) }.boxed();
};
let model_id = match self.model.cross_region_inference_id(&region) {
let model_id = match self.model.cross_region_inference_id(&region, allow_global) {
Ok(s) => s,
Err(e) => {
return async move { Err(e.into()) }.boxed();

View File

@@ -58,6 +58,7 @@ impl settings::Settings for AllLanguageModelSettings {
profile_name: bedrock.profile,
role_arn: None, // todo(was never a setting for this...)
authentication_method: bedrock.authentication_method.map(Into::into),
allow_global: bedrock.allow_global,
},
deepseek: DeepSeekSettings {
api_url: deepseek.api_url.unwrap(),

View File

@@ -23,7 +23,7 @@ use serde::{Deserialize, Serialize};
use serde_json::{Value, json};
use settings::Settings;
use smol::lock::OnceCell;
use std::cmp::Ordering;
use std::cmp::{Ordering, Reverse};
use std::env::consts;
use terminal::terminal_settings::TerminalSettings;
use util::command::new_smol_command;
@@ -1101,13 +1101,33 @@ fn get_venv_parent_dir(env: &PythonEnvironment) -> Option<PathBuf> {
venv.parent().map(|parent| parent.to_path_buf())
}
fn wr_distance(wr: &PathBuf, venv: Option<&PathBuf>) -> usize {
// How far is this venv from the root of our current project?
#[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
enum SubprojectDistance {
WithinSubproject(Reverse<usize>),
WithinWorktree(Reverse<usize>),
NotInWorktree,
}
fn wr_distance(
wr: &PathBuf,
subroot_relative_path: &RelPath,
venv: Option<&PathBuf>,
) -> SubprojectDistance {
if let Some(venv) = venv
&& let Ok(p) = venv.strip_prefix(wr)
{
p.components().count()
if subroot_relative_path.components().next().is_some()
&& let Ok(distance) = p
.strip_prefix(subroot_relative_path.as_std_path())
.map(|p| p.components().count())
{
SubprojectDistance::WithinSubproject(Reverse(distance))
} else {
SubprojectDistance::WithinWorktree(Reverse(p.components().count()))
}
} else {
usize::MAX
SubprojectDistance::NotInWorktree
}
}
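`SubprojectDistance` leans on two properties of derived `Ord`: enum variants compare in declaration order, and `Reverse` flips the inner distance comparison so that, within a tier, a smaller distance compares greater. A standalone copy of the enum to make those two properties concrete (the comparison direction the caller ultimately prefers is not shown here):

```rust
use std::cmp::Reverse;

/// Ranking key for Python venvs: subproject matches outrank plain
/// worktree matches (variant order), and `Reverse` inverts the
/// inner path-distance comparison within each tier.
#[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
enum SubprojectDistance {
    WithinSubproject(Reverse<usize>),
    WithinWorktree(Reverse<usize>),
    NotInWorktree,
}
```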
@@ -1170,11 +1190,14 @@ impl ToolchainLister for PythonToolchainProvider {
});
// Compare project paths against worktree root
let proj_ordering = || {
let lhs_project = lhs.project.clone().or_else(|| get_venv_parent_dir(lhs));
let rhs_project = rhs.project.clone().or_else(|| get_venv_parent_dir(rhs));
wr_distance(&wr, lhs_project.as_ref()).cmp(&wr_distance(&wr, rhs_project.as_ref()))
};
let proj_ordering =
|| {
let lhs_project = lhs.project.clone().or_else(|| get_venv_parent_dir(lhs));
let rhs_project = rhs.project.clone().or_else(|| get_venv_parent_dir(rhs));
wr_distance(&wr, &subroot_relative_path, lhs_project.as_ref()).cmp(
&wr_distance(&wr, &subroot_relative_path, rhs_project.as_ref()),
)
};
// Compare environment priorities
let priority_ordering = || env_priority(lhs.kind).cmp(&env_priority(rhs.kind));

View File

@@ -43,7 +43,7 @@ use std::{
io,
iter::{self, FromIterator},
mem,
ops::{self, AddAssign, Range, RangeBounds, Sub, SubAssign},
ops::{self, AddAssign, ControlFlow, Range, RangeBounds, Sub, SubAssign},
rc::Rc,
str,
sync::Arc,
@@ -4618,7 +4618,24 @@ impl MultiBufferSnapshot {
cx: &App,
) -> BTreeMap<MultiBufferRow, IndentSize> {
let mut result = BTreeMap::new();
self.suggested_indents_callback(
rows,
|row, indent| {
result.insert(row, indent);
ControlFlow::Continue(())
},
cx,
);
result
}
// move this to be a generator once those are a thing
pub fn suggested_indents_callback(
&self,
rows: impl IntoIterator<Item = u32>,
mut cb: impl FnMut(MultiBufferRow, IndentSize) -> ControlFlow<()>,
cx: &App,
) {
let mut rows_for_excerpt = Vec::new();
let mut cursor = self.cursor::<Point, Point>();
let mut rows = rows.into_iter().peekable();
@@ -4662,16 +4679,17 @@ impl MultiBufferSnapshot {
let buffer_indents = region
.buffer
.suggested_indents(buffer_rows, single_indent_size);
let multibuffer_indents = buffer_indents.into_iter().map(|(row, indent)| {
(
for (row, indent) in buffer_indents {
if cb(
MultiBufferRow(start_multibuffer_row + row - start_buffer_row),
indent,
)
});
result.extend(multibuffer_indents);
.is_break()
{
return;
}
}
}
result
}
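The refactor above splits the map-building API from a callback-driven one: the caller's closure returns `ControlFlow`, letting it stop the walk early instead of materializing every indent. A standalone sketch of that iteration pattern over a plain slice:

```rust
use std::ops::ControlFlow;

/// Visit items until the callback returns `ControlFlow::Break(())`,
/// in the style of `suggested_indents_callback`.
fn for_each_until<T>(items: &[T], mut cb: impl FnMut(&T) -> ControlFlow<()>) {
    for item in items {
        if cb(item).is_break() {
            return;
        }
    }
}
```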
pub fn indent_size_for_line(&self, row: MultiBufferRow) -> IndentSize {

View File

@@ -2,7 +2,8 @@ use anyhow::Context as _;
use collections::{HashMap, HashSet};
use fs::Fs;
use gpui::{AsyncApp, Entity};
use language::{Buffer, Diff, language_settings::language_settings};
use language::language_settings::PrettierSettings;
use language::{Buffer, Diff, Language, language_settings::language_settings};
use lsp::{LanguageServer, LanguageServerId};
use node_runtime::NodeRuntime;
use paths::default_prettier_dir;
@@ -349,7 +350,7 @@ impl Prettier {
Self::Real(local) => {
let params = buffer
.update(cx, |buffer, cx| {
let buffer_language = buffer.language();
let buffer_language = buffer.language().map(|language| language.as_ref());
let language_settings = language_settings(buffer_language.map(|l| l.name()), buffer.file(), cx);
let prettier_settings = &language_settings.prettier;
anyhow::ensure!(
@@ -449,15 +450,7 @@ impl Prettier {
})
.collect();
let mut prettier_parser = prettier_settings.parser.as_deref();
if buffer_path.is_none() {
prettier_parser = prettier_parser.or_else(|| buffer_language.and_then(|language| language.prettier_parser_name()));
if prettier_parser.is_none() {
log::error!("Formatting unsaved file with prettier failed. No prettier parser configured for language {buffer_language:?}");
anyhow::bail!("Cannot determine prettier parser for unsaved file");
}
}
let parser = prettier_parser_name(buffer_path.as_deref(), buffer_language, prettier_settings).context("getting prettier parser")?;
let ignore_path = ignore_dir.and_then(|dir| {
let ignore_file = dir.join(".prettierignore");
@@ -475,15 +468,15 @@ impl Prettier {
anyhow::Ok(FormatParams {
text: buffer.text(),
options: FormatOptions {
parser: prettier_parser.map(ToOwned::to_owned),
plugins,
path: buffer_path,
parser,
plugins,
prettier_options,
ignore_path,
},
})
})?
.context("building prettier request")?;
})?
.context("building prettier request")?;
let response = local
.server
@@ -503,7 +496,26 @@ impl Prettier {
{
Some("rust") => anyhow::bail!("prettier does not support Rust"),
Some(_other) => {
let formatted_text = buffer.text() + FORMAT_SUFFIX;
let mut formatted_text = buffer.text() + FORMAT_SUFFIX;
let buffer_language =
buffer.language().map(|language| language.as_ref());
let language_settings = language_settings(
buffer_language.map(|l| l.name()),
buffer.file(),
cx,
);
let prettier_settings = &language_settings.prettier;
let parser = prettier_parser_name(
buffer_path.as_deref(),
buffer_language,
prettier_settings,
)?;
if let Some(parser) = parser {
formatted_text = format!("{formatted_text}\n{parser}");
}
Ok(buffer.diff(formatted_text, cx))
}
None => panic!("Should not format buffer without a language with prettier"),
@@ -551,6 +563,40 @@ impl Prettier {
}
}
fn prettier_parser_name(
buffer_path: Option<&Path>,
buffer_language: Option<&Language>,
prettier_settings: &PrettierSettings,
) -> anyhow::Result<Option<String>> {
let parser = if buffer_path.is_none() {
let parser = prettier_settings
.parser
.as_deref()
.or_else(|| buffer_language.and_then(|language| language.prettier_parser_name()));
if parser.is_none() {
log::error!(
"Formatting unsaved file with prettier failed. No prettier parser configured for language {buffer_language:?}"
);
anyhow::bail!("Cannot determine prettier parser for unsaved file");
}
parser
} else if let (Some(buffer_language), Some(buffer_path)) = (buffer_language, buffer_path)
&& buffer_path.extension().is_some_and(|extension| {
!buffer_language
.config()
.matcher
.path_suffixes
.contains(&extension.to_string_lossy().into_owned())
})
{
buffer_language.prettier_parser_name()
} else {
prettier_settings.parser.as_deref()
};
Ok(parser.map(ToOwned::to_owned))
}
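Stripped of the extension-mismatch branch, the parser precedence in `prettier_parser_name` reduces to: a saved file uses the configured parser, while an unsaved buffer falls back from settings to the language's default. A simplified standalone sketch of that precedence (the extension check and error path are deliberately omitted):

```rust
/// Pick a prettier parser: saved buffers trust the settings value;
/// unsaved buffers fall back to the language's default parser.
fn choose_parser<'a>(
    settings_parser: Option<&'a str>,
    language_default: Option<&'a str>,
    has_path: bool,
) -> Option<&'a str> {
    if has_path {
        settings_parser
    } else {
        settings_parser.or(language_default)
    }
}
```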
async fn has_prettier_in_node_modules(fs: &dyn Fs, path: &Path) -> anyhow::Result<bool> {
let possible_node_modules_location = path.join("node_modules").join(PRETTIER_PACKAGE_NAME);
if let Some(node_modules_location_metadata) = fs

View File

@@ -70,6 +70,7 @@ schemars.workspace = true
semver.workspace = true
serde.workspace = true
serde_json.workspace = true
session.workspace = true
settings.workspace = true
sha2.workspace = true
shellexpand.workspace = true

View File

@@ -453,7 +453,9 @@ impl AgentServerStore {
.clone()
.and_then(|settings| settings.custom_command()),
http_client: http_client.clone(),
is_remote: downstream_client.is_some(),
no_browser: downstream_client
.as_ref()
.is_some_and(|(_, client)| !client.has_wsl_interop()),
}),
);
self.external_agents.insert(
@@ -1355,7 +1357,7 @@ struct LocalCodex {
project_environment: Entity<ProjectEnvironment>,
http_client: Arc<dyn HttpClient>,
custom_command: Option<AgentServerCommand>,
is_remote: bool,
no_browser: bool,
}
impl ExternalAgentServer for LocalCodex {
@@ -1375,7 +1377,7 @@ impl ExternalAgentServer for LocalCodex {
.map(|root_dir| Path::new(root_dir))
.unwrap_or(paths::home_dir())
.into();
let is_remote = self.is_remote;
let no_browser = self.no_browser;
cx.spawn(async move |cx| {
let mut env = project_environment
@@ -1388,7 +1390,7 @@ impl ExternalAgentServer for LocalCodex {
})?
.await
.unwrap_or_default();
if is_remote {
if no_browser {
env.insert("NO_BROWSER".to_owned(), "1".to_owned());
}

View File

@@ -472,6 +472,8 @@ impl GitStore {
client.add_entity_request_handler(Self::handle_change_branch);
client.add_entity_request_handler(Self::handle_create_branch);
client.add_entity_request_handler(Self::handle_rename_branch);
client.add_entity_request_handler(Self::handle_create_remote);
client.add_entity_request_handler(Self::handle_remove_remote);
client.add_entity_request_handler(Self::handle_delete_branch);
client.add_entity_request_handler(Self::handle_git_init);
client.add_entity_request_handler(Self::handle_push);
@@ -2274,6 +2276,25 @@ impl GitStore {
Ok(proto::Ack {})
}
async fn handle_create_remote(
this: Entity<Self>,
envelope: TypedEnvelope<proto::GitCreateRemote>,
mut cx: AsyncApp,
) -> Result<proto::Ack> {
let repository_id = RepositoryId::from_proto(envelope.payload.repository_id);
let repository_handle = Self::repository_for_request(&this, repository_id, &mut cx)?;
let remote_name = envelope.payload.remote_name;
let remote_url = envelope.payload.remote_url;
repository_handle
.update(&mut cx, |repository_handle, _| {
repository_handle.create_remote(remote_name, remote_url)
})?
.await??;
Ok(proto::Ack {})
}
async fn handle_delete_branch(
this: Entity<Self>,
envelope: TypedEnvelope<proto::GitDeleteBranch>,
@@ -2292,6 +2313,24 @@ impl GitStore {
Ok(proto::Ack {})
}
async fn handle_remove_remote(
this: Entity<Self>,
envelope: TypedEnvelope<proto::GitRemoveRemote>,
mut cx: AsyncApp,
) -> Result<proto::Ack> {
let repository_id = RepositoryId::from_proto(envelope.payload.repository_id);
let repository_handle = Self::repository_for_request(&this, repository_id, &mut cx)?;
let remote_name = envelope.payload.remote_name;
repository_handle
.update(&mut cx, |repository_handle, _| {
repository_handle.remove_remote(remote_name)
})?
.await??;
Ok(proto::Ack {})
}
async fn handle_show(
this: Entity<Self>,
envelope: TypedEnvelope<proto::GitShow>,
@@ -3257,6 +3296,8 @@ impl RepositorySnapshot {
.iter()
.map(stash_to_proto)
.collect(),
remote_upstream_url: self.remote_upstream_url.clone(),
remote_origin_url: self.remote_origin_url.clone(),
}
}
@@ -3326,6 +3367,8 @@ impl RepositorySnapshot {
.iter()
.map(stash_to_proto)
.collect(),
remote_upstream_url: self.remote_upstream_url.clone(),
remote_origin_url: self.remote_origin_url.clone(),
}
}
@@ -4865,6 +4908,61 @@ impl Repository {
)
}
pub fn create_remote(
&mut self,
remote_name: String,
remote_url: String,
) -> oneshot::Receiver<Result<()>> {
let id = self.id;
self.send_job(
Some(format!("git remote add {remote_name} {remote_url}").into()),
move |repo, _cx| async move {
match repo {
RepositoryState::Local(LocalRepositoryState { backend, .. }) => {
backend.create_remote(remote_name, remote_url).await
}
RepositoryState::Remote(RemoteRepositoryState { project_id, client }) => {
client
.request(proto::GitCreateRemote {
project_id: project_id.0,
repository_id: id.to_proto(),
remote_name,
remote_url,
})
.await?;
Ok(())
}
}
},
)
}
pub fn remove_remote(&mut self, remote_name: String) -> oneshot::Receiver<Result<()>> {
let id = self.id;
self.send_job(
Some(format!("git remote remove {remote_name}").into()),
move |repo, _cx| async move {
match repo {
RepositoryState::Local(LocalRepositoryState { backend, .. }) => {
backend.remove_remote(remote_name).await
}
RepositoryState::Remote(RemoteRepositoryState { project_id, client }) => {
client
.request(proto::GitRemoveRemote {
project_id: project_id.0,
repository_id: id.to_proto(),
remote_name,
})
.await?;
Ok(())
}
}
},
)
}
pub fn get_remotes(
&mut self,
branch_name: Option<String>,
@@ -4902,7 +5000,7 @@ impl Repository {
let remotes = response
.remotes
.into_iter()
.map(|remotes| git::repository::Remote {
.map(|remotes| Remote {
name: remotes.name.into(),
})
.collect();
@@ -5301,6 +5399,8 @@ impl Repository {
cx.emit(RepositoryEvent::StashEntriesChanged)
}
self.snapshot.stash_entries = new_stash_entries;
self.snapshot.remote_upstream_url = update.remote_upstream_url;
self.snapshot.remote_origin_url = update.remote_origin_url;
let edits = update
.removed_statuses
@@ -5860,11 +5960,7 @@ fn serialize_blame_buffer_response(blame: Option<git::blame::Blame>) -> proto::B
.collect::<Vec<_>>();
proto::BlameBufferResponse {
blame_response: Some(proto::blame_buffer_response::BlameResponse {
entries,
messages,
remote_url: blame.remote_url,
}),
blame_response: Some(proto::blame_buffer_response::BlameResponse { entries, messages }),
}
}
@@ -5901,11 +5997,7 @@ fn deserialize_blame_buffer_response(
.filter_map(|message| Some((git::Oid::from_bytes(&message.oid).ok()?, message.message)))
.collect::<HashMap<_, _>>();
Some(Blame {
entries,
messages,
remote_url: response.remote_url,
})
Some(Blame { entries, messages })
}
fn branch_to_proto(branch: &git::repository::Branch) -> proto::Branch {
@@ -6053,7 +6145,6 @@ async fn compute_snapshot(
events.push(RepositoryEvent::BranchChanged);
}
// Used by edit prediction data collection
let remote_origin_url = backend.remote_url("origin").await;
let remote_upstream_url = backend.remote_url("upstream").await;

View File

@@ -93,9 +93,6 @@ enum FindSearchCandidates {
/// based on disk contents of a buffer. This step is not performed for buffers we already have in memory.
confirm_contents_will_match_tx: Sender<MatchingEntry>,
confirm_contents_will_match_rx: Receiver<MatchingEntry>,
/// Of those that contain at least one match (or are already in memory), look for rest of matches (and figure out their ranges).
/// But wait - first, we need to go back to the main thread to open a buffer (& create an entity for it).
get_buffer_for_full_scan_tx: Sender<ProjectPath>,
},
Remote,
OpenBuffersOnly,
@@ -226,7 +223,7 @@ impl Search {
.boxed_local(),
cx.background_spawn(Self::maintain_sorted_search_results(
sorted_search_results_rx,
get_buffer_for_full_scan_tx.clone(),
get_buffer_for_full_scan_tx,
self.limit,
))
.boxed_local(),
@@ -234,7 +231,6 @@ impl Search {
(
FindSearchCandidates::Local {
fs,
get_buffer_for_full_scan_tx,
confirm_contents_will_match_tx,
confirm_contents_will_match_rx,
input_paths_rx,
@@ -593,7 +589,6 @@ impl Worker<'_> {
input_paths_rx,
confirm_contents_will_match_rx,
mut confirm_contents_will_match_tx,
mut get_buffer_for_full_scan_tx,
fs,
) = match self.candidates {
FindSearchCandidates::Local {
@@ -601,21 +596,15 @@ impl Worker<'_> {
input_paths_rx,
confirm_contents_will_match_rx,
confirm_contents_will_match_tx,
get_buffer_for_full_scan_tx,
} => (
input_paths_rx,
confirm_contents_will_match_rx,
confirm_contents_will_match_tx,
get_buffer_for_full_scan_tx,
Some(fs),
),
FindSearchCandidates::Remote | FindSearchCandidates::OpenBuffersOnly => (
unbounded().1,
unbounded().1,
unbounded().0,
unbounded().0,
None,
),
FindSearchCandidates::Remote | FindSearchCandidates::OpenBuffersOnly => {
(unbounded().1, unbounded().1, unbounded().0, None)
}
};
// WorkerA: grabs a request for "find all matches in file/a" <- takes 5 minutes
// right after: WorkerB: grabs a request for "find all matches in file/b" <- takes 5 seconds
@@ -629,7 +618,6 @@ impl Worker<'_> {
open_entries: &self.open_buffers,
fs: fs.as_deref(),
confirm_contents_will_match_tx: &confirm_contents_will_match_tx,
get_buffer_for_full_scan_tx: &get_buffer_for_full_scan_tx,
};
// Whenever we notice that some step of a pipeline is closed, we don't want to close subsequent
// steps straight away. Another worker might be about to produce a value that will
@@ -645,10 +633,7 @@ impl Worker<'_> {
find_first_match = find_first_match.next() => {
if let Some(buffer_with_at_least_one_match) = find_first_match {
handler.handle_find_first_match(buffer_with_at_least_one_match).await;
} else {
get_buffer_for_full_scan_tx = bounded(1).0;
}
},
scan_path = scan_path.next() => {
if let Some(path_to_scan) = scan_path {
@@ -673,7 +658,6 @@ struct RequestHandler<'worker> {
fs: Option<&'worker dyn Fs>,
open_entries: &'worker HashSet<ProjectEntryId>,
confirm_contents_will_match_tx: &'worker Sender<MatchingEntry>,
get_buffer_for_full_scan_tx: &'worker Sender<ProjectPath>,
}
impl RequestHandler<'_> {
@@ -729,9 +713,8 @@ impl RequestHandler<'_> {
_ = maybe!(async move {
let InputPath {
entry,
snapshot,
should_scan_tx,
mut should_scan_tx,
} = req;
if entry.is_fifo || !entry.is_file() {
@@ -754,7 +737,7 @@ impl RequestHandler<'_> {
if self.open_entries.contains(&entry.id) {
// The buffer is already in memory and that's the version we want to scan;
// hence skip the dilly-dally and look for all matches straight away.
self.get_buffer_for_full_scan_tx
should_scan_tx
.send(ProjectPath {
worktree_id: snapshot.id(),
path: entry.path.clone(),

View File

@@ -17,13 +17,14 @@ use rpc::{
};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use session::TrustedWorktreesStorage;
pub use settings::DirenvSettings;
pub use settings::LspSettings;
use settings::{
DapSettingsContent, InvalidSettingsError, LocalSettingsKind, RegisterSetting, Settings,
SettingsLocation, SettingsStore, parse_json_with_comments, watch_config_file,
};
use std::{path::PathBuf, sync::Arc, time::Duration};
use std::{cell::OnceCell, collections::BTreeMap, path::PathBuf, sync::Arc, time::Duration};
use task::{DebugTaskFile, TaskTemplates, VsCodeDebugTaskFile, VsCodeTaskFile};
use util::{ResultExt, rel_path::RelPath, serde::default_true};
use worktree::{PathChange, UpdatedEntriesSet, Worktree, WorktreeId};
@@ -83,6 +84,10 @@ pub struct SessionSettings {
///
/// Default: true
pub restore_unsaved_buffers: bool,
/// Whether to skip project trust checks and automatically synchronize project settings from any worktree.
///
/// Default: false
pub trust_all_worktrees: bool,
}
#[derive(Debug, Clone, Default, PartialEq, Serialize, Deserialize, JsonSchema)]
@@ -570,6 +575,7 @@ impl Settings for ProjectSettings {
load_direnv: project.load_direnv.clone().unwrap(),
session: SessionSettings {
restore_unsaved_buffers: content.session.unwrap().restore_unsaved_buffers.unwrap(),
trust_all_worktrees: content.session.unwrap().trust_all_worktrees.unwrap(),
},
}
}
@@ -595,6 +601,8 @@ pub struct SettingsObserver {
worktree_store: Entity<WorktreeStore>,
project_id: u64,
task_store: Entity<TaskStore>,
pending_local_settings: HashMap<PathBuf, BTreeMap<(WorktreeId, Arc<RelPath>), Option<String>>>,
_trusted_worktrees_watcher: Option<Subscription>,
_user_settings_watcher: Option<Subscription>,
_global_task_config_watcher: Task<()>,
_global_debug_config_watcher: Task<()>,
@@ -620,11 +628,65 @@ impl SettingsObserver {
cx.subscribe(&worktree_store, Self::on_worktree_store_event)
.detach();
let weak_settings_observer = cx.weak_entity();
let _trusted_worktrees_watcher = if cx.has_global::<TrustedWorktreesStorage>() {
cx.update_global::<TrustedWorktreesStorage, _>(|trusted_worktrees, cx| {
let watcher = trusted_worktrees.subscribe(cx, move |_, e, cx| match e {
session::Event::TrustedWorktree(trusted_path) => {
weak_settings_observer
.update(cx, |settings_observer, cx| {
if let Some(pending_local_settings) = settings_observer
.pending_local_settings
.remove(trusted_path)
{
for ((worktree_id, directory_path), settings_contents) in
pending_local_settings
{
apply_local_settings(
worktree_id,
&directory_path,
LocalSettingsKind::Settings,
&settings_contents,
cx,
);
if let Some(downstream_client) =
&settings_observer.downstream_client
{
downstream_client
.send(proto::UpdateWorktreeSettings {
project_id: settings_observer.project_id,
worktree_id: worktree_id.to_proto(),
path: directory_path.to_proto(),
content: settings_contents,
kind: Some(
local_settings_kind_to_proto(
LocalSettingsKind::Settings,
)
.into(),
),
})
.log_err();
}
}
}
})
.ok();
}
session::Event::UntrustedWorktree(_) => {}
});
Some(watcher)
})
} else {
None
};
Self {
worktree_store,
task_store,
mode: SettingsObserverMode::Local(fs.clone()),
downstream_client: None,
_trusted_worktrees_watcher,
pending_local_settings: HashMap::default(),
_user_settings_watcher: None,
project_id: REMOTE_SERVER_PROJECT_ID,
_global_task_config_watcher: Self::subscribe_to_global_task_file_changes(
@@ -677,6 +739,8 @@ impl SettingsObserver {
mode: SettingsObserverMode::Remote,
downstream_client: None,
project_id: REMOTE_SERVER_PROJECT_ID,
_trusted_worktrees_watcher: None,
pending_local_settings: HashMap::default(),
_user_settings_watcher: user_settings_watcher,
_global_task_config_watcher: Self::subscribe_to_global_task_file_changes(
fs.clone(),
@@ -968,36 +1032,36 @@ impl SettingsObserver {
let worktree_id = worktree.read(cx).id();
let remote_worktree_id = worktree.read(cx).id();
let task_store = self.task_store.clone();
let worktree_abs_path = worktree.read(cx).abs_path();
let can_trust_worktree = OnceCell::new();
for (directory, kind, file_content) in settings_contents {
let mut applied = true;
match kind {
LocalSettingsKind::Settings | LocalSettingsKind::Editorconfig => cx
.update_global::<SettingsStore, _>(|store, cx| {
let result = store.set_local_settings(
worktree_id,
directory.clone(),
kind,
file_content.as_deref(),
cx,
);
match result {
Err(InvalidSettingsError::LocalSettings { path, message }) => {
log::error!("Failed to set local settings in {path:?}: {message}");
cx.emit(SettingsObserverEvent::LocalSettingsUpdated(Err(
InvalidSettingsError::LocalSettings { path, message },
)));
}
Err(e) => {
log::error!("Failed to set local settings: {e}");
}
Ok(()) => {
cx.emit(SettingsObserverEvent::LocalSettingsUpdated(Ok(directory
.as_std_path()
.join(local_settings_file_relative_path().as_std_path()))));
}
LocalSettingsKind::Settings => {
if *can_trust_worktree.get_or_init(|| {
if cx.has_global::<TrustedWorktreesStorage>() {
cx.update_global::<TrustedWorktreesStorage, _>(
|trusted_worktrees_storage, cx| {
trusted_worktrees_storage
.can_trust_path(worktree_abs_path.as_ref(), cx)
},
)
} else {
true
}
}),
}) {
apply_local_settings(worktree_id, &directory, kind, &file_content, cx)
} else {
applied = false;
self.pending_local_settings
.entry(worktree_abs_path.to_path_buf())
.or_default()
.insert((worktree_id, directory.clone()), file_content.clone());
}
}
LocalSettingsKind::Editorconfig => {
apply_local_settings(worktree_id, &directory, kind, &file_content, cx)
}
LocalSettingsKind::Tasks => {
let result = task_store.update(cx, |task_store, cx| {
task_store.update_user_tasks(
@@ -1060,16 +1124,18 @@ impl SettingsObserver {
}
};
if let Some(downstream_client) = &self.downstream_client {
downstream_client
.send(proto::UpdateWorktreeSettings {
project_id: self.project_id,
worktree_id: remote_worktree_id.to_proto(),
path: directory.to_proto(),
content: file_content.clone(),
kind: Some(local_settings_kind_to_proto(kind).into()),
})
.log_err();
if applied {
if let Some(downstream_client) = &self.downstream_client {
downstream_client
.send(proto::UpdateWorktreeSettings {
project_id: self.project_id,
worktree_id: remote_worktree_id.to_proto(),
path: directory.to_proto(),
content: file_content.clone(),
kind: Some(local_settings_kind_to_proto(kind).into()),
})
.log_err();
}
}
}
}
@@ -1186,6 +1252,41 @@ impl SettingsObserver {
}
}
fn apply_local_settings(
worktree_id: WorktreeId,
directory: &Arc<RelPath>,
kind: LocalSettingsKind,
file_content: &Option<String>,
cx: &mut Context<'_, SettingsObserver>,
) {
cx.update_global::<SettingsStore, _>(|store, cx| {
let result = store.set_local_settings(
worktree_id,
directory.clone(),
kind,
file_content.as_deref(),
cx,
);
match result {
Err(InvalidSettingsError::LocalSettings { path, message }) => {
log::error!("Failed to set local settings in {path:?}: {message}");
cx.emit(SettingsObserverEvent::LocalSettingsUpdated(Err(
InvalidSettingsError::LocalSettings { path, message },
)));
}
Err(e) => {
log::error!("Failed to set local settings: {e}");
}
Ok(()) => {
cx.emit(SettingsObserverEvent::LocalSettingsUpdated(Ok(directory
.as_std_path()
.join(local_settings_file_relative_path().as_std_path()))));
}
}
})
}
pub fn local_settings_kind_from_proto(kind: proto::LocalSettingsKind) -> LocalSettingsKind {
match kind {
proto::LocalSettingsKind::Settings => LocalSettingsKind::Settings,

View File

@@ -124,6 +124,8 @@ message UpdateRepository {
optional GitCommitDetails head_commit_details = 11;
optional string merge_message = 12;
repeated StashEntry stash_entries = 13;
optional string remote_upstream_url = 14;
optional string remote_origin_url = 15;
}
message RemoveRepository {
@@ -190,6 +192,19 @@ message GitRenameBranch {
string new_name = 4;
}
message GitCreateRemote {
uint64 project_id = 1;
uint64 repository_id = 2;
string remote_name = 3;
string remote_url = 4;
}
message GitRemoveRemote {
uint64 project_id = 1;
uint64 repository_id = 2;
string remote_name = 3;
}
message GitDeleteBranch {
uint64 project_id = 1;
uint64 repository_id = 2;
@@ -487,8 +502,8 @@ message BlameBufferResponse {
message BlameResponse {
repeated BlameEntry entries = 1;
repeated CommitMessage messages = 2;
optional string remote_url = 4;
reserved 3;
reserved 4;
}
optional BlameResponse blame_response = 5;

View File

@@ -437,13 +437,18 @@ message Envelope {
OpenImageResponse open_image_response = 392;
CreateImageForPeer create_image_for_peer = 393;
GitFileHistory git_file_history = 397;
GitFileHistoryResponse git_file_history_response = 398;
RunGitHook run_git_hook = 399;
GitDeleteBranch git_delete_branch = 400;
ExternalExtensionAgentsUpdated external_extension_agents_updated = 401; // current max
ExternalExtensionAgentsUpdated external_extension_agents_updated = 401;
GitCreateRemote git_create_remote = 402;
GitRemoveRemote git_remove_remote = 403; // current max
}
reserved 87 to 88, 396;

View File

@@ -305,6 +305,8 @@ messages!(
(RemoteMessageResponse, Background),
(AskPassRequest, Background),
(AskPassResponse, Background),
(GitCreateRemote, Background),
(GitRemoveRemote, Background),
(GitCreateBranch, Background),
(GitChangeBranch, Background),
(GitRenameBranch, Background),
@@ -504,6 +506,8 @@ request_messages!(
(GetRemotes, GetRemotesResponse),
(Pull, RemoteMessageResponse),
(AskPassRequest, AskPassResponse),
(GitCreateRemote, Ack),
(GitRemoveRemote, Ack),
(GitCreateBranch, Ack),
(GitChangeBranch, Ack),
(GitRenameBranch, Ack),
@@ -676,6 +680,8 @@ entity_messages!(
GitChangeBranch,
GitRenameBranch,
GitCreateBranch,
GitCreateRemote,
GitRemoveRemote,
CheckForPushedCommits,
GitDiff,
GitInit,

View File

@@ -43,7 +43,6 @@ urlencoding.workspace = true
util.workspace = true
which.workspace = true
[dev-dependencies]
gpui = { workspace = true, features = ["test-support"] }
fs = { workspace = true, features = ["test-support"] }

View File

@@ -328,8 +328,15 @@ impl RemoteClient {
let (incoming_tx, incoming_rx) = mpsc::unbounded::<Envelope>();
let (connection_activity_tx, connection_activity_rx) = mpsc::channel::<()>(1);
let client =
cx.update(|cx| ChannelClient::new(incoming_rx, outgoing_tx, cx, "client"))?;
let client = cx.update(|cx| {
ChannelClient::new(
incoming_rx,
outgoing_tx,
cx,
"client",
remote_connection.has_wsl_interop(),
)
})?;
let path_style = remote_connection.path_style();
let this = cx.new(|_| Self {
@@ -420,8 +427,9 @@ impl RemoteClient {
outgoing_tx: mpsc::UnboundedSender<Envelope>,
cx: &App,
name: &'static str,
has_wsl_interop: bool,
) -> AnyProtoClient {
ChannelClient::new(incoming_rx, outgoing_tx, cx, name).into()
ChannelClient::new(incoming_rx, outgoing_tx, cx, name, has_wsl_interop).into()
}
pub fn shutdown_processes<T: RequestMessage>(
@@ -921,8 +929,8 @@ impl RemoteClient {
});
let (outgoing_tx, _) = mpsc::unbounded::<Envelope>();
let (_, incoming_rx) = mpsc::unbounded::<Envelope>();
let server_client =
server_cx.update(|cx| ChannelClient::new(incoming_rx, outgoing_tx, cx, "fake-server"));
let server_client = server_cx
.update(|cx| ChannelClient::new(incoming_rx, outgoing_tx, cx, "fake-server", false));
let connection: Arc<dyn RemoteConnection> = Arc::new(fake::FakeRemoteConnection {
connection_options: opts.clone(),
server_cx: fake::SendableCx::new(server_cx),
@@ -1140,6 +1148,7 @@ pub trait RemoteConnection: Send + Sync {
fn path_style(&self) -> PathStyle;
fn shell(&self) -> String;
fn default_system_shell(&self) -> String;
fn has_wsl_interop(&self) -> bool;
#[cfg(any(test, feature = "test-support"))]
fn simulate_disconnect(&self, _: &AsyncApp) {}
@@ -1188,6 +1197,7 @@ struct ChannelClient {
name: &'static str,
task: Mutex<Task<Result<()>>>,
remote_started: Signal<()>,
has_wsl_interop: bool,
}
impl ChannelClient {
@@ -1196,6 +1206,7 @@ impl ChannelClient {
outgoing_tx: mpsc::UnboundedSender<Envelope>,
cx: &App,
name: &'static str,
has_wsl_interop: bool,
) -> Arc<Self> {
Arc::new_cyclic(|this| Self {
outgoing_tx: Mutex::new(outgoing_tx),
@@ -1211,6 +1222,7 @@ impl ChannelClient {
&cx.to_async(),
)),
remote_started: Signal::new(cx),
has_wsl_interop,
})
}
@@ -1489,6 +1501,10 @@ impl ProtoClient for ChannelClient {
fn is_via_collab(&self) -> bool {
false
}
fn has_wsl_interop(&self) -> bool {
self.has_wsl_interop
}
}
#[cfg(any(test, feature = "test-support"))]
@@ -1652,6 +1668,10 @@ mod fake {
fn default_system_shell(&self) -> String {
"sh".to_owned()
}
fn has_wsl_interop(&self) -> bool {
false
}
}
pub(super) struct Delegate;

View File

@@ -131,11 +131,7 @@ async fn build_remote_server_from_source(
let build_remote_server =
std::env::var("ZED_BUILD_REMOTE_SERVER").unwrap_or("nocompress".into());
if build_remote_server == "false"
|| build_remote_server == "no"
|| build_remote_server == "off"
|| build_remote_server == "0"
{
if let "false" | "no" | "off" | "0" = &*build_remote_server {
return Ok(None);
}

View File

@@ -394,6 +394,10 @@ impl RemoteConnection for SshRemoteConnection {
fn path_style(&self) -> PathStyle {
self.ssh_path_style
}
fn has_wsl_interop(&self) -> bool {
false
}
}
impl SshRemoteConnection {

View File

@@ -47,6 +47,7 @@ pub(crate) struct WslRemoteConnection {
shell: String,
shell_kind: ShellKind,
default_system_shell: String,
has_wsl_interop: bool,
connection_options: WslConnectionOptions,
}
@@ -71,6 +72,7 @@ impl WslRemoteConnection {
shell: String::new(),
shell_kind: ShellKind::Posix,
default_system_shell: String::from("/bin/sh"),
has_wsl_interop: false,
};
delegate.set_status(Some("Detecting WSL environment"), cx);
this.shell = this
@@ -79,6 +81,15 @@ impl WslRemoteConnection {
.context("failed detecting shell")?;
log::info!("Remote shell discovered: {}", this.shell);
this.shell_kind = ShellKind::new(&this.shell, false);
this.has_wsl_interop = this.detect_has_wsl_interop().await.unwrap_or_default();
log::info!(
"Remote has WSL interop {}",
if this.has_wsl_interop {
"enabled"
} else {
"disabled"
}
);
this.platform = this
.detect_platform()
.await
@@ -115,6 +126,14 @@ impl WslRemoteConnection {
.unwrap_or_else(|| "/bin/sh".to_string()))
}
async fn detect_has_wsl_interop(&self) -> Result<bool> {
Ok(self
.run_wsl_command_with_output("cat", &["/proc/sys/fs/binfmt_misc/WSLInterop"])
.await
.inspect_err(|err| log::error!("Failed to detect wsl interop: {err}"))?
.contains("enabled"))
}
async fn windows_path_to_wsl_path(&self, source: &Path) -> Result<String> {
windows_path_to_wsl_path_impl(&self.connection_options, source).await
}
@@ -317,6 +336,7 @@ impl RemoteConnection for WslRemoteConnection {
proxy_args.push(format!("{}={}", env_var, value));
}
}
proxy_args.push(remote_binary_path.display(PathStyle::Posix).into_owned());
proxy_args.push("proxy".to_owned());
proxy_args.push("--identifier".to_owned());
@@ -489,6 +509,10 @@ impl RemoteConnection for WslRemoteConnection {
fn default_system_shell(&self) -> String {
self.default_system_shell.clone()
}
fn has_wsl_interop(&self) -> bool {
self.has_wsl_interop
}
}
/// `wslpath` is an executable available in WSL; it's a Linux binary.

View File

@@ -199,6 +199,7 @@ fn start_server(
listeners: ServerListeners,
log_rx: Receiver<Vec<u8>>,
cx: &mut App,
is_wsl_interop: bool,
) -> AnyProtoClient {
// This is the server idle timeout. If no connection comes in this timeout, the server will shut down.
const IDLE_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(10 * 60);
@@ -318,7 +319,7 @@ fn start_server(
})
.detach();
RemoteClient::proto_client_from_channels(incoming_rx, outgoing_tx, cx, "server")
RemoteClient::proto_client_from_channels(incoming_rx, outgoing_tx, cx, "server", is_wsl_interop)
}
fn init_paths() -> anyhow::Result<()> {
@@ -407,8 +408,15 @@ pub fn execute_run(
HeadlessProject::init(cx);
let is_wsl_interop = if cfg!(target_os = "linux") {
// See: https://learn.microsoft.com/en-us/windows/wsl/filesystems#disable-interoperability
matches!(std::fs::read_to_string("/proc/sys/fs/binfmt_misc/WSLInterop"), Ok(s) if s.contains("enabled"))
} else {
false
};
log::info!("gpui app started, initializing server");
let session = start_server(listeners, log_rx, cx);
let session = start_server(listeners, log_rx, cx, is_wsl_interop);
GitHostingProviderRegistry::set_global(git_hosting_provider_registry, cx);
git_hosting_providers::init(cx);

View File

@@ -59,6 +59,7 @@ pub trait ProtoClient: Send + Sync {
fn message_handler_set(&self) -> &parking_lot::Mutex<ProtoMessageHandlerSet>;
fn is_via_collab(&self) -> bool;
fn has_wsl_interop(&self) -> bool;
}
#[derive(Default)]
@@ -510,6 +511,10 @@ impl AnyProtoClient {
},
);
}
pub fn has_wsl_interop(&self) -> bool {
self.0.client.has_wsl_interop()
}
}
fn to_any_envelope<T: EnvelopedMessage>(

View File

@@ -18,6 +18,7 @@ test-support = [
]
[dependencies]
collections.workspace = true
db.workspace = true
gpui.workspace = true
uuid.workspace = true

View File

@@ -1,9 +1,31 @@
use std::time::Duration;
use std::{
path::{Path, PathBuf},
time::Duration,
};
use collections::HashSet;
use db::kvp::KEY_VALUE_STORE;
use gpui::{App, AppContext as _, Context, Subscription, Task, WindowId};
use gpui::{
App, AppContext as _, Context, Entity, EventEmitter, Global, Subscription, Task, Window,
WindowId,
};
use util::ResultExt;
pub fn init(cx: &mut App) {
cx.spawn(async move |cx| {
let trusted_worktrees = TrustedWorktrees::new().await;
cx.update(|cx| {
let trusted_worktrees_storage = TrustedWorktreesStorage {
trusted: cx.new(|_| trusted_worktrees),
untrusted: HashSet::default(),
};
cx.set_global(trusted_worktrees_storage);
})
.log_err();
})
.detach();
}
pub struct Session {
session_id: String,
old_session_id: Option<String>,
@@ -12,6 +34,8 @@ pub struct Session {
const SESSION_ID_KEY: &str = "session_id";
const SESSION_WINDOW_STACK_KEY: &str = "session_window_stack";
const TRUSTED_WORKSPACES_KEY: &str = "trusted_workspaces";
const TRUSTED_WORKSPACES_SEPARATOR: &str = "<|>";
impl Session {
pub async fn new(session_id: String) -> Self {
@@ -108,6 +132,177 @@ impl AppSession {
}
}
// TODO kb move the whole trusted shebang into `project` crate.
/// A collection of worktree absolute paths that are considered trusted.
/// This can be consulted before enabling certain features.
#[derive(Clone)]
pub struct TrustedWorktreesStorage {
trusted: Entity<TrustedWorktrees>,
untrusted: HashSet<PathBuf>,
}
#[derive(Debug)]
pub enum Event {
TrustedWorktree(PathBuf),
UntrustedWorktree(PathBuf),
}
/// A collection of absolute paths for trusted worktrees.
/// Such worktrees' local settings will be processed and applied.
///
/// Emits an event each time a worktree path is checked and found untrusted,
/// or when a worktree path becomes trusted.
struct TrustedWorktrees {
worktree_roots: HashSet<PathBuf>,
serialization_task: Task<()>,
}
impl EventEmitter<Event> for TrustedWorktrees {}
impl TrustedWorktrees {
async fn new() -> Self {
Self {
worktree_roots: KEY_VALUE_STORE
.read_kvp(TRUSTED_WORKSPACES_KEY)
.ok()
.flatten()
.map(|workspaces| {
workspaces
.split(TRUSTED_WORKSPACES_SEPARATOR)
.map(PathBuf::from)
.collect()
})
.unwrap_or_default(),
serialization_task: Task::ready(()),
}
}
fn trust_path(&mut self, abs_path: PathBuf, cx: &mut Context<'_, Self>) {
debug_assert!(
abs_path.is_absolute(),
"Cannot trust non-absolute path {abs_path:?}"
);
let updated = self.worktree_roots.insert(abs_path.clone());
if updated {
let new_worktree_roots =
self.worktree_roots
.iter()
.fold(String::new(), |mut acc, path| {
if !acc.is_empty() {
acc.push_str(TRUSTED_WORKSPACES_SEPARATOR);
}
acc.push_str(&path.to_string_lossy());
acc
});
self.serialization_task = cx.background_spawn(async move {
KEY_VALUE_STORE
.write_kvp(TRUSTED_WORKSPACES_KEY.to_string(), new_worktree_roots)
.await
.log_err();
});
// TODO kb wrong: need to emit multiple worktrees, as we can trust some high-level directory
cx.emit(Event::TrustedWorktree(abs_path));
}
}
fn clear(&mut self, cx: &App) {
self.worktree_roots.clear();
self.serialization_task = cx.background_spawn(async move {
KEY_VALUE_STORE
.write_kvp(TRUSTED_WORKSPACES_KEY.to_string(), String::new())
.await
.log_err();
});
}
}
impl Global for TrustedWorktreesStorage {}
impl TrustedWorktreesStorage {
pub fn subscribe<T: 'static>(
&self,
cx: &mut Context<T>,
mut on_event: impl FnMut(&mut T, &Event, &mut Context<T>) + 'static,
) -> Subscription {
cx.subscribe(&self.trusted, move |t, _, e, cx| on_event(t, e, cx))
}
pub fn subscribe_in<T: 'static>(
&self,
window: &mut Window,
cx: &mut Context<T>,
mut on_event: impl FnMut(&mut T, &Event, &mut Window, &mut Context<T>) + 'static,
) -> Subscription {
cx.subscribe_in(&self.trusted, window, move |t, _, e, window, cx| {
on_event(t, e, window, cx)
})
}
/// Adds a worktree absolute path to the trusted list.
/// This will emit [`Event::TrustedWorktree`] event.
pub fn trust_path(&mut self, abs_path: PathBuf, cx: &mut App) {
self.untrusted.remove(&abs_path);
self.trusted.update(cx, |trusted_worktrees, cx| {
trusted_worktrees.trust_path(abs_path, cx)
});
}
/// Checks whether a certain worktree absolute path is trusted.
/// If not, emits [`Event::UntrustedWorktree`] event.
pub fn can_trust_path(&mut self, abs_path: &Path, cx: &mut App) -> bool {
debug_assert!(
abs_path.is_absolute(),
"Cannot check if trusting non-absolute path {abs_path:?}"
);
self.trusted.update(cx, |trusted_worktrees, cx| {
let trusted_worktree_roots = &trusted_worktrees.worktree_roots;
let mut can_trust = !self.untrusted.contains(abs_path);
if can_trust {
can_trust = if trusted_worktree_roots.len() > 100 {
let mut path = Some(abs_path);
while let Some(path_to_check) = path {
if trusted_worktree_roots.contains(path_to_check) {
return true;
}
path = path_to_check.parent();
}
false
} else {
trusted_worktree_roots
.iter()
.any(|trusted_root| abs_path.starts_with(trusted_root))
};
}
if !can_trust {
if self.untrusted.insert(abs_path.to_owned()) {
cx.emit(Event::UntrustedWorktree(abs_path.to_owned()));
}
}
can_trust
})
}
pub fn untrusted_worktrees(&self) -> &HashSet<PathBuf> {
&self.untrusted
}
pub fn trust_all(&mut self, cx: &mut App) {
for untrusted_path in std::mem::take(&mut self.untrusted) {
self.trust_path(untrusted_path, cx);
}
}
pub fn clear_trusted_paths(&self, cx: &mut App) {
self.trusted.update(cx, |trusted_worktrees, cx| {
trusted_worktrees.clear(cx);
});
}
}
fn window_stack(cx: &App) -> Option<Vec<u64>> {
Some(
cx.window_stack()?

View File

@@ -61,6 +61,7 @@ pub struct AmazonBedrockSettingsContent {
pub region: Option<String>,
pub profile: Option<String>,
pub authentication_method: Option<BedrockAuthMethodContent>,
pub allow_global: Option<bool>,
}
#[with_fallible_options]

View File

@@ -187,6 +187,10 @@ pub struct SessionSettingsContent {
///
/// Default: true
pub restore_unsaved_buffers: Option<bool>,
/// Whether to skip project trust checks and automatically synchronize project settings from any worktree.
///
/// Default: false
pub trust_all_worktrees: Option<bool>,
}
#[derive(Deserialize, Serialize, Clone, PartialEq, Eq, JsonSchema, MergeFrom, Debug)]

View File

@@ -138,6 +138,28 @@ pub(crate) fn settings_data(cx: &App) -> Vec<SettingsPage> {
metadata: None,
files: USER,
}),
SettingsPageItem::SectionHeader("Security"),
SettingsPageItem::SettingItem(SettingItem {
title: "Trust all worktrees by default",
description: r#"Whether to read and apply project settings from any opened worktree without asking for confirmation"#,
field: Box::new(SettingField {
json_path: Some("session.trust_all_worktrees"),
pick: |settings_content| {
settings_content
.session
.as_ref()
.and_then(|session| session.trust_all_worktrees.as_ref())
},
write: |settings_content, value| {
settings_content
.session
.get_or_insert_default()
.trust_all_worktrees = value;
},
}),
metadata: None,
files: USER,
}),
SettingsPageItem::SectionHeader("Workspace Restoration"),
SettingsPageItem::SettingItem(SettingItem {
title: "Restore Unsaved Buffers",

View File

@@ -34,6 +34,7 @@ channel.workspace = true
chrono.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
db.workspace = true
gpui = { workspace = true, features = ["screen-capture"] }
notifications.workspace = true
@@ -42,6 +43,7 @@ remote.workspace = true
rpc.workspace = true
schemars.workspace = true
serde.workspace = true
session.workspace = true
settings.workspace = true
smallvec.workspace = true
story = { workspace = true, optional = true }

View File

@@ -24,6 +24,7 @@ use auto_update::AutoUpdateStatus;
use call::ActiveCall;
use client::{Client, UserStore, zed_urls};
use cloud_llm_client::{Plan, PlanV1, PlanV2};
use collections::HashSet;
use gpui::{
Action, AnyElement, App, Context, Corner, Element, Entity, Focusable, InteractiveElement,
IntoElement, MouseButton, ParentElement, Render, StatefulInteractiveElement, Styled,
@@ -32,8 +33,9 @@ use gpui::{
use onboarding_banner::OnboardingBanner;
use project::{Project, WorktreeSettings, git_store::GitStoreEvent};
use remote::RemoteConnectionOptions;
use session::TrustedWorktreesStorage;
use settings::{Settings, SettingsLocation};
use std::sync::Arc;
use std::{path::PathBuf, sync::Arc};
use theme::ActiveTheme;
use title_bar_settings::TitleBarSettings;
use ui::{
@@ -134,6 +136,7 @@ pub struct TitleBar {
_subscriptions: Vec<Subscription>,
banner: Entity<OnboardingBanner>,
screen_share_popover_handle: PopoverMenuHandle<ContextMenu>,
untrusted_worktrees: HashSet<PathBuf>,
}
impl Render for TitleBar {
@@ -163,6 +166,8 @@ impl Render for TitleBar {
title_bar
.when(title_bar_settings.show_project_items, |title_bar| {
title_bar
.pl_2()
.children(self.render_restricted_mode(cx))
.children(self.render_project_host(cx))
.child(self.render_project_name(cx))
})
@@ -290,6 +295,43 @@ impl TitleBar {
}),
);
subscriptions.push(cx.observe(&user_store, |_, _, cx| cx.notify()));
let mut untrusted_worktrees = if cx.has_global::<TrustedWorktreesStorage>() {
cx.update_global::<TrustedWorktreesStorage, _>(|trusted_worktrees_storage, cx| {
subscriptions.push(trusted_worktrees_storage.subscribe(
cx,
move |title_bar, e, cx| match e {
session::Event::TrustedWorktree(abs_path) => {
title_bar.untrusted_worktrees.remove(abs_path);
}
session::Event::UntrustedWorktree(abs_path) => {
title_bar
.workspace
.update(cx, |workspace, cx| {
if workspace
.project()
.read(cx)
.find_worktree(abs_path, cx)
.is_some()
{
title_bar.untrusted_worktrees.insert(abs_path.clone());
};
})
.ok();
}
},
));
trusted_worktrees_storage.untrusted_worktrees().clone()
})
} else {
HashSet::default()
};
untrusted_worktrees.retain(|untrusted_path| {
workspace
.project()
.read(cx)
.find_worktree(untrusted_path, cx)
.is_some()
});
let banner = cx.new(|cx| {
OnboardingBanner::new(
@@ -315,7 +357,8 @@ impl TitleBar {
client,
_subscriptions: subscriptions,
banner,
screen_share_popover_handle: Default::default(),
untrusted_worktrees,
screen_share_popover_handle: PopoverMenuHandle::default(),
}
}
@@ -398,6 +441,31 @@ impl TitleBar {
)
}
pub fn render_restricted_mode(&self, cx: &mut Context<Self>) -> Option<AnyElement> {
if self.untrusted_worktrees.is_empty()
|| !cx.has_global::<session::TrustedWorktreesStorage>()
{
return None;
}
Some(
IconButton::new("restricted_mode_trigger", IconName::Warning)
.icon_color(Color::Warning)
.tooltip(Tooltip::text("Restricted Mode".to_string()))
.on_click({
cx.listener(move |title_bar, _, window, cx| {
title_bar
.workspace
.update(cx, |workspace, cx| {
workspace.show_worktree_security_modal(window, cx)
})
.log_err();
})
})
.into_any_element(),
)
}
pub fn render_project_host(&self, cx: &mut Context<Self>) -> Option<AnyElement> {
if self.project.read(cx).is_via_remote_server() {
return self.render_remote_project_connection(cx);

View File

@@ -588,19 +588,20 @@ impl ToolchainSelector {
.worktree_for_id(worktree_id, cx)?
.read(cx)
.abs_path();
let workspace_id = workspace.database_id()?;
let weak = workspace.weak_handle();
cx.spawn_in(window, async move |workspace, cx| {
let active_toolchain = workspace::WORKSPACE_DB
.toolchain(
workspace_id,
worktree_id,
relative_path.clone(),
language_name.clone(),
)
.await
.ok()
.flatten();
let active_toolchain = project
.read_with(cx, |this, cx| {
this.active_toolchain(
ProjectPath {
worktree_id,
path: relative_path.clone(),
},
language_name.clone(),
cx,
)
})?
.await;
workspace
.update_in(cx, |this, window, cx| {
this.toggle_modal(window, cx, move |window, cx| {
@@ -618,6 +619,7 @@ impl ToolchainSelector {
});
})
.ok();
anyhow::Ok(())
})
.detach();

View File

@@ -1,73 +1,121 @@
use crate::component_prelude::*;
use crate::prelude::*;
use crate::{Checkbox, ListBulletItem, ToggleState};
use gpui::IntoElement;
use smallvec::{SmallVec, smallvec};
use theme::ActiveTheme;
#[derive(IntoElement, RegisterComponent)]
pub struct AlertModal {
id: ElementId,
header: Option<AnyElement>,
children: SmallVec<[AnyElement; 2]>,
title: SharedString,
primary_action: SharedString,
dismiss_label: SharedString,
footer: Option<AnyElement>,
title: Option<SharedString>,
primary_action: Option<SharedString>,
dismiss_label: Option<SharedString>,
width: Option<DefiniteLength>,
}
impl AlertModal {
pub fn new(id: impl Into<ElementId>, title: impl Into<SharedString>) -> Self {
pub fn new(id: impl Into<ElementId>) -> Self {
Self {
id: id.into(),
header: None,
children: smallvec![],
title: title.into(),
primary_action: "Ok".into(),
dismiss_label: "Cancel".into(),
footer: None,
title: None,
primary_action: None,
dismiss_label: None,
width: None,
}
}
pub fn title(mut self, title: impl Into<SharedString>) -> Self {
self.title = Some(title.into());
self
}
pub fn header(mut self, header: impl IntoElement) -> Self {
self.header = Some(header.into_any_element());
self
}
pub fn footer(mut self, footer: impl IntoElement) -> Self {
self.footer = Some(footer.into_any_element());
self
}
pub fn primary_action(mut self, primary_action: impl Into<SharedString>) -> Self {
self.primary_action = primary_action.into();
self.primary_action = Some(primary_action.into());
self
}
pub fn dismiss_label(mut self, dismiss_label: impl Into<SharedString>) -> Self {
self.dismiss_label = dismiss_label.into();
self.dismiss_label = Some(dismiss_label.into());
self
}
pub fn width(mut self, width: impl Into<DefiniteLength>) -> Self {
self.width = Some(width.into());
self
}
}
impl RenderOnce for AlertModal {
fn render(self, _window: &mut Window, cx: &mut App) -> impl IntoElement {
v_flex()
let width = self.width.unwrap_or_else(|| px(440.).into());
let has_default_footer = self.primary_action.is_some() || self.dismiss_label.is_some();
let mut modal = v_flex()
.id(self.id)
.elevation_3(cx)
.w(px(440.))
.p_5()
.child(
.bg(cx.theme().colors().elevated_surface_background)
.w(width)
.overflow_hidden();
if let Some(header) = self.header {
modal = modal.child(header);
} else if let Some(title) = self.title {
modal = modal.child(
v_flex()
.pt_3()
.pr_3()
.pl_3()
.pb_1()
.child(Headline::new(title).size(HeadlineSize::Small)),
);
}
if !self.children.is_empty() {
modal = modal.child(
v_flex()
.p_3()
.text_ui(cx)
.text_color(Color::Muted.color(cx))
.gap_1()
.child(Headline::new(self.title).size(HeadlineSize::Small))
.children(self.children),
)
.child(
);
}
if let Some(footer) = self.footer {
modal = modal.child(footer);
} else if has_default_footer {
let primary_action = self.primary_action.unwrap_or_else(|| "Ok".into());
let dismiss_label = self.dismiss_label.unwrap_or_else(|| "Cancel".into());
modal = modal.child(
h_flex()
.h(rems(1.75))
.p_3()
.items_center()
.child(div().flex_1())
.child(
h_flex()
.items_center()
.gap_1()
.child(
Button::new(self.dismiss_label.clone(), self.dismiss_label.clone())
.color(Color::Muted),
)
.child(Button::new(
self.primary_action.clone(),
self.primary_action,
)),
),
)
.justify_end()
.gap_1()
.child(Button::new(dismiss_label.clone(), dismiss_label).color(Color::Muted))
.child(Button::new(primary_action.clone(), primary_action)),
);
}
modal
}
}
@@ -90,24 +138,75 @@ impl Component for AlertModal {
Some("A modal dialog that presents an alert message with primary and dismiss actions.")
}
fn preview(_window: &mut Window, _cx: &mut App) -> Option<AnyElement> {
fn preview(_window: &mut Window, cx: &mut App) -> Option<AnyElement> {
Some(
v_flex()
.gap_6()
.p_4()
.children(vec![example_group(
vec![
single_example(
"Basic Alert",
AlertModal::new("simple-modal", "Do you want to leave the current call?")
.child("The current window will be closed, and connections to any shared projects will be terminated."
)
.primary_action("Leave Call")
.into_any_element(),
)
],
)])
.into_any_element()
.children(vec![
example_group(vec![single_example(
"Basic Alert",
AlertModal::new("simple-modal")
.title("Do you want to leave the current call?")
.child(
"The current window will be closed, and connections to any shared projects will be terminated."
)
.primary_action("Leave Call")
.dismiss_label("Cancel")
.into_any_element(),
)]),
example_group(vec![single_example(
"Custom Header",
AlertModal::new("custom-header-modal")
.header(
v_flex()
.p_3()
.bg(cx.theme().colors().background)
.gap_1()
.child(
h_flex()
.gap_1()
.child(Icon::new(IconName::Warning).color(Color::Warning))
.child(Headline::new("Unrecognized Workspace").size(HeadlineSize::Small))
)
.child(
h_flex()
.pl(IconSize::default().rems() + rems(0.5))
.child(Label::new("~/projects/my-project").color(Color::Muted))
)
)
.child(
"Untrusted workspaces are opened in Restricted Mode to protect your system.
Review .zed/settings.json for any extensions or commands configured by this project.",
)
.child(
v_flex()
.mt_1()
.child(Label::new("Restricted mode prevents:").color(Color::Muted))
.child(ListBulletItem::new("Project settings from being applied"))
.child(ListBulletItem::new("Language servers from running"))
.child(ListBulletItem::new("MCP integrations from installing"))
)
.footer(
h_flex()
.p_3()
.justify_between()
.child(
Checkbox::new("trust-parent", ToggleState::Unselected)
.label("Trust all projects in parent directory")
)
.child(
h_flex()
.gap_1()
.child(Button::new("restricted", "Open in Restricted Mode").color(Color::Muted))
.child(Button::new("trust", "Trust and Continue").style(ButtonStyle::Filled))
)
)
.width(rems(40.))
.into_any_element(),
)]),
])
.into_any_element(),
)
}
}

View File

@@ -1656,49 +1656,6 @@ impl WorkspaceDb {
}
}
pub async fn toolchain(
&self,
workspace_id: WorkspaceId,
worktree_id: WorktreeId,
relative_worktree_path: Arc<RelPath>,
language_name: LanguageName,
) -> Result<Option<Toolchain>> {
self.write(move |this| {
let mut select = this
.select_bound(sql!(
SELECT
name, path, raw_json
FROM toolchains
WHERE
workspace_id = ? AND
language_name = ? AND
worktree_id = ? AND
relative_worktree_path = ?
))
.context("select toolchain")?;
let toolchain: Vec<(String, String, String)> = select((
workspace_id,
language_name.as_ref().to_string(),
worktree_id.to_usize(),
relative_worktree_path.as_unix_str().to_string(),
))?;
Ok(toolchain
.into_iter()
.next()
.and_then(|(name, path, raw_json)| {
Some(Toolchain {
name: name.into(),
path: path.into(),
language_name,
as_json: serde_json::Value::from_str(&raw_json).ok()?,
})
}))
})
.await
}
pub(crate) async fn toolchains(
&self,
workspace_id: WorkspaceId,

View File

@@ -0,0 +1,186 @@
use std::{
borrow::Cow,
path::{Path, PathBuf},
};
use collections::HashSet;
use gpui::BorrowAppContext;
use gpui::{DismissEvent, EventEmitter, Focusable};
use session::TrustedWorktreesStorage;
use theme::ActiveTheme;
use ui::{
AlertModal, Button, ButtonCommon as _, ButtonStyle, Checkbox, Clickable as _, Color, Context,
Headline, HeadlineSize, Icon, IconName, IconSize, IntoElement, Label, LabelCommon as _,
ListBulletItem, ParentElement as _, Render, Styled, ToggleState, Window, h_flex, rems, v_flex,
};
use crate::{DismissDecision, ModalView};
pub struct SecurityModal {
pub paths: HashSet<PathBuf>,
home_dir: Option<PathBuf>,
dismissed: bool,
trust_parents: bool,
}
impl Focusable for SecurityModal {
fn focus_handle(&self, cx: &ui::App) -> gpui::FocusHandle {
cx.focus_handle()
}
}
impl EventEmitter<DismissEvent> for SecurityModal {}
impl ModalView for SecurityModal {
fn on_before_dismiss(
&mut self,
_window: &mut Window,
_: &mut Context<Self>,
) -> DismissDecision {
DismissDecision::Dismiss(self.dismissed)
}
fn fade_out_background(&self) -> bool {
true
}
}
impl Render for SecurityModal {
fn render(&mut self, _window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
if self.paths.is_empty() {
self.dismiss(cx);
return v_flex().into_any_element();
}
let header_label = if self.paths.len() == 1 {
"Unrecognized Workspace"
} else {
"Unrecognized Workspaces"
};
let trust_label = self.build_trust_label();
AlertModal::new("security-modal")
.header(
v_flex()
.p_3()
.bg(cx.theme().colors().background)
.gap_1()
.child(
h_flex()
.gap_1()
.child(Icon::new(IconName::Warning).color(Color::Warning))
.child(Headline::new(header_label).size(HeadlineSize::Small)),
)
.children(self.paths.iter().map(|path| {
h_flex()
.pl(IconSize::default().rems() + rems(0.5))
.child(Label::new(path.display().to_string()).color(Color::Muted))
})),
)
.child(
"Untrusted workspaces are opened in Restricted Mode to protect your system.
Review .zed/settings.json for any extensions or commands configured by this project.",
)
.child(
v_flex()
.mt_2()
.child(Label::new("Restricted mode prevents:").color(Color::Muted))
.child(ListBulletItem::new("Project settings from being applied"))
.child(ListBulletItem::new("Language servers from running"))
.child(ListBulletItem::new("MCP integrations from installing")),
)
.footer(
h_flex()
.p_3()
.justify_between()
.child(
Checkbox::new("trust-parents", ToggleState::from(self.trust_parents))
.label(trust_label)
.on_click(cx.listener(|security_modal, state: &ToggleState, _, cx| {
security_modal.trust_parents = state.selected();
cx.notify();
})),
)
.child(
h_flex()
.gap_1()
.child(
Button::new("open-in-restricted-mode", "Restricted Mode")
.color(Color::Muted)
.on_click(cx.listener(move |security_modal, _, _, cx| {
security_modal.dismiss(cx);
cx.stop_propagation();
})),
)
.child(
Button::new("trust-and-continue", "Trust and Continue")
.style(ButtonStyle::Filled)
.on_click(cx.listener(move |security_modal, _, _, cx| {
security_modal.trust_and_dismiss(cx);
})),
),
),
)
.width(rems(40.))
.into_any_element()
}
}
impl SecurityModal {
pub fn new(paths: HashSet<PathBuf>) -> Self {
Self {
paths,
dismissed: false,
trust_parents: false,
home_dir: std::env::home_dir(),
}
}
fn build_trust_label(&self) -> Cow<'static, str> {
if self.paths.len() == 1 {
let Some(single_path) = self.paths.iter().next() else {
return Cow::Borrowed("Trust all projects in the parent folders");
};
match single_path.parent().map(|path| match &self.home_dir {
Some(home_dir) => path
.strip_prefix(home_dir)
.map(|stripped| Path::new("~").join(stripped))
.map(Cow::Owned)
.unwrap_or(Cow::Borrowed(path)),
None => Cow::Borrowed(path),
}) {
Some(parent) => Cow::Owned(format!("Trust all projects in the {parent:?} folder")),
None => Cow::Borrowed("Trust all projects in the parent folders"),
}
} else {
Cow::Borrowed("Trust all projects in the parent folders")
}
}
fn trust_and_dismiss(&mut self, cx: &mut Context<Self>) {
if cx.has_global::<TrustedWorktreesStorage>() {
cx.update_global::<TrustedWorktreesStorage, _>(|trusted_worktrees_storage, cx| {
let mut paths_to_trust = self.paths.clone();
if self.trust_parents {
paths_to_trust.extend(
self.paths
.iter()
.filter_map(|path| Some(path.parent()?.to_owned())),
);
}
for path_to_trust in paths_to_trust {
trusted_worktrees_storage.trust_path(path_to_trust, cx);
}
});
}
self.dismiss(cx);
}
fn dismiss(&mut self, cx: &mut Context<Self>) {
self.dismissed = true;
cx.emit(DismissEvent);
}
}

View File

@@ -9,6 +9,7 @@ pub mod pane_group;
mod path_list;
mod persistence;
pub mod searchable;
mod security_modal;
pub mod shared_screen;
mod status_bar;
pub mod tasks;
@@ -74,6 +75,7 @@ use project::{
DirectoryLister, Project, ProjectEntryId, ProjectPath, ResolvedPath, Worktree, WorktreeId,
WorktreeSettings,
debugger::{breakpoint_store::BreakpointStoreEvent, session::ThreadStatus},
project_settings::ProjectSettings,
toolchain_store::ToolchainStoreEvent,
};
use remote::{
@@ -82,8 +84,10 @@ use remote::{
};
use schemars::JsonSchema;
use serde::Deserialize;
use session::AppSession;
use settings::{CenteredPaddingSettings, Settings, SettingsLocation, update_settings_file};
use session::{AppSession, TrustedWorktreesStorage};
use settings::{
CenteredPaddingSettings, Settings, SettingsLocation, SettingsStore, update_settings_file,
};
use shared_screen::SharedScreen;
use sqlez::{
bindable::{Bind, Column, StaticColumnCount},
@@ -126,11 +130,14 @@ pub use workspace_settings::{
};
use zed_actions::{Spawn, feedback::FileBugReport};
use crate::persistence::{
SerializedAxis,
model::{DockData, DockStructure, SerializedItem, SerializedPane, SerializedPaneGroup},
};
use crate::{item::ItemBufferKind, notifications::NotificationId};
use crate::{
persistence::{
SerializedAxis,
model::{DockData, DockStructure, SerializedItem, SerializedPane, SerializedPaneGroup},
},
security_modal::SecurityModal,
};
pub const SERIALIZATION_THROTTLE_TIME: Duration = Duration::from_millis(200);
@@ -265,6 +272,13 @@ actions!(
ToggleRightDock,
/// Toggles zoom on the active pane.
ToggleZoom,
/// If any worktrees are in restricted mode, shows a modal with possible actions.
/// TODO kb docs
ShowWorktreeSecurity,
/// Clears all trusted worktrees, placing them in restricted mode on next open.
/// Requires restart to take effect on already opened projects.
/// TODO kb docs
ClearTrustedWorktrees,
/// Stops following a collaborator.
Unfollow,
/// Restores the banner.
@@ -1204,6 +1218,46 @@ impl Workspace {
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
if cx.has_global::<TrustedWorktreesStorage>() {
cx.update_global::<TrustedWorktreesStorage, _>(|trusted_worktrees_storage, cx| {
// TODO kb Is it ok that remote projects' worktrees are identified by abs path only?
// Need to join with remote_hosts DB table data
trusted_worktrees_storage
.subscribe_in(window, cx, move |workspace, e, window, cx| match e {
session::Event::TrustedWorktree(trusted_path) => {
if let Some(security_modal) =
workspace.active_modal::<SecurityModal>(cx)
{
let remove = security_modal.update(cx, |security_modal, _| {
security_modal.paths.remove(trusted_path);
security_modal.paths.is_empty()
});
if remove {
workspace.hide_modal(window, cx);
}
}
}
session::Event::UntrustedWorktree(_) => {
workspace.show_worktree_security_modal(window, cx)
}
})
.detach();
})
};
cx.observe_global::<SettingsStore>(|_, cx| {
if ProjectSettings::get_global(cx).session.trust_all_worktrees {
if cx.has_global::<TrustedWorktreesStorage>() {
cx.update_global::<TrustedWorktreesStorage, _>(
|trusted_worktrees_storage, cx| {
trusted_worktrees_storage.trust_all(cx);
},
)
}
}
})
.detach();
cx.subscribe_in(&project, window, move |this, _, event, window, cx| {
match event {
project::Event::RemoteIdChanged(_) => {
@@ -1461,9 +1515,10 @@ impl Workspace {
}),
];
cx.defer_in(window, |this, window, cx| {
this.update_window_title(window, cx);
this.show_initial_notifications(cx);
cx.defer_in(window, move |workspace, window, cx| {
workspace.update_window_title(window, cx);
workspace.show_initial_notifications(cx);
workspace.show_worktree_security_modal(window, cx);
});
Workspace {
weak_self: weak_handle.clone(),
@@ -1517,6 +1572,7 @@ impl Workspace {
scheduled_tasks: Vec::new(),
last_open_dock_positions: Vec::new(),
removing: false,
}
}
@@ -5912,6 +5968,22 @@ impl Workspace {
}
},
))
.on_action(cx.listener(
|workspace: &mut Workspace, _: &ShowWorktreeSecurity, window, cx| {
workspace.show_worktree_security_modal(window, cx);
},
))
.on_action(
cx.listener(|_: &mut Workspace, _: &ClearTrustedWorktrees, _, cx| {
if cx.has_global::<TrustedWorktreesStorage>() {
cx.update_global::<TrustedWorktreesStorage, _>(
|trusted_worktrees_storage, cx| {
trusted_worktrees_storage.clear_trusted_paths(cx);
},
);
}
}),
)
.on_action(cx.listener(
|workspace: &mut Workspace, _: &ReopenClosedItem, window, cx| {
workspace.reopen_closed_item(window, cx).detach();
@@ -6349,6 +6421,40 @@ impl Workspace {
file.project.all_languages.defaults.show_edit_predictions = Some(!show_edit_predictions)
});
}
pub fn show_worktree_security_modal(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let untrusted_worktrees = if cx.has_global::<TrustedWorktreesStorage>() {
cx.update_global::<TrustedWorktreesStorage, _>(|trusted_worktrees_storage, cx| {
trusted_worktrees_storage
.untrusted_worktrees()
.iter()
.filter(|untrusted_path| {
self.project()
.read(cx)
.find_worktree(untrusted_path, cx)
.is_some()
})
.cloned()
.collect()
})
} else {
HashSet::default()
};
if let Some(security_modal) = self.active_modal::<SecurityModal>(cx) {
let remove = security_modal.update(cx, |security_modal, cx| {
security_modal.paths.extend(untrusted_worktrees);
let remove = security_modal.paths.is_empty();
cx.notify();
remove
});
if remove {
self.hide_modal(window, cx);
}
} else if !untrusted_worktrees.is_empty() {
self.toggle_modal(window, cx, |_, _| SecurityModal::new(untrusted_worktrees));
}
}
}
fn leader_border_for_pane(
@@ -11474,26 +11580,4 @@ mod tests {
});
item
}
#[gpui::test]
async fn test_window_focus_set_on_startup_without_buffer(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree("/root", json!({"a": ""})).await;
let project = Project::test(fs, ["/root".as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
cx.run_until_parked();
workspace.update_in(cx, |_workspace, window, cx| {
let focused = window.focused(cx);
assert!(
focused.is_some(),
"Expected window focus to be set on startup, but it was None"
);
});
}
}

View File

@@ -105,6 +105,7 @@ impl Settings for WorkspaceSettings {
.collect(),
close_on_file_delete: workspace.close_on_file_delete.unwrap(),
use_system_window_tabs: workspace.use_system_window_tabs.unwrap(),
zoomed_padding: workspace.zoomed_padding.unwrap(),
window_decorations: workspace.window_decorations.unwrap(),
}

View File

@@ -52,7 +52,7 @@ use std::{
fmt,
future::Future,
mem::{self},
ops::{Deref, DerefMut},
ops::{Deref, DerefMut, Range},
path::{Path, PathBuf},
pin::Pin,
sync::{
@@ -428,7 +428,7 @@ impl Worktree {
let mut entry = Entry::new(
RelPath::empty().into(),
&metadata,
&next_entry_id,
ProjectEntryId::new(&next_entry_id),
snapshot.root_char_bag,
None,
);
@@ -2736,13 +2736,30 @@ impl BackgroundScannerState {
}
}
async fn insert_entry(
fn entry_id_for(
&mut self,
mut entry: Entry,
fs: &dyn Fs,
watcher: &dyn Watcher,
) -> Entry {
self.reuse_entry_id(&mut entry);
next_entry_id: &AtomicUsize,
path: &RelPath,
metadata: &fs::Metadata,
) -> ProjectEntryId {
// If an entry with the same inode was removed from the worktree during this scan,
// then it *might* represent the same file or directory. But the OS might also have
// re-used the inode for a completely different file or directory.
//
// Conditionally reuse the old entry's id:
// * if the mtime is the same, the file was probably renamed.
// * if the path is the same, the file may just have been updated
if let Some(removed_entry) = self.removed_entries.remove(&metadata.inode) {
if removed_entry.mtime == Some(metadata.mtime) || *removed_entry.path == *path {
return removed_entry.id;
}
} else if let Some(existing_entry) = self.snapshot.entry_for_path(path) {
return existing_entry.id;
}
ProjectEntryId::new(next_entry_id)
}
async fn insert_entry(&mut self, entry: Entry, fs: &dyn Fs, watcher: &dyn Watcher) -> Entry {
let entry = self.snapshot.insert_entry(entry, fs);
if entry.path.file_name() == Some(&DOT_GIT) {
self.insert_git_repository(entry.path.clone(), fs, watcher)
@@ -3389,13 +3406,13 @@ impl Entry {
fn new(
path: Arc<RelPath>,
metadata: &fs::Metadata,
next_entry_id: &AtomicUsize,
id: ProjectEntryId,
root_char_bag: CharBag,
canonical_path: Option<Arc<Path>>,
) -> Self {
let char_bag = char_bag_for_path(root_char_bag, &path);
Self {
id: ProjectEntryId::new(next_entry_id),
id,
kind: if metadata.is_dir {
EntryKind::PendingDir
} else {
@@ -3682,8 +3699,10 @@ impl BackgroundScanner {
.await;
if ignore_stack.is_abs_path_ignored(root_abs_path.as_path(), true) {
root_entry.is_ignored = true;
let mut root_entry = root_entry.clone();
state.reuse_entry_id(&mut root_entry);
state
.insert_entry(root_entry.clone(), self.fs.as_ref(), self.watcher.as_ref())
.insert_entry(root_entry, self.fs.as_ref(), self.watcher.as_ref())
.await;
}
if root_entry.is_dir() {
@@ -3877,29 +3896,35 @@ impl BackgroundScanner {
abs_paths.dedup_by(|a, b| a.starts_with(b));
{
let snapshot = &self.state.lock().await.snapshot;
abs_paths.retain(|abs_path| {
let abs_path = &SanitizedPath::new(abs_path);
let mut ranges_to_drop = SmallVec::<[Range<usize>; 4]>::new();
fn skip_ix(ranges: &mut SmallVec<[Range<usize>; 4]>, ix: usize) {
if let Some(last_range) = ranges.last_mut()
&& last_range.end == ix
{
last_range.end += 1;
} else {
ranges.push(ix..ix + 1);
}
}
for (ix, abs_path) in abs_paths.iter().enumerate() {
let abs_path = &SanitizedPath::new(&abs_path);
{
let mut is_git_related = false;
let mut dot_git_paths = None;
let dot_git_paths = self.executor.block(maybe!(async {
let mut path = None;
for ancestor in abs_path.as_path().ancestors() {
for ancestor in abs_path.as_path().ancestors() {
if is_git_dir(ancestor, self.fs.as_ref()).await {
let path_in_git_dir = abs_path
.as_path()
.strip_prefix(ancestor)
.expect("stripping off the ancestor");
path = Some((ancestor.to_owned(), path_in_git_dir.to_owned()));
break;
dot_git_paths = Some((ancestor.to_owned(), path_in_git_dir.to_owned()));
break;
}
}
path
}));
}
if let Some((dot_git_abs_path, path_in_git_dir)) = dot_git_paths {
if skipped_files_in_dot_git
@@ -3909,8 +3934,11 @@ impl BackgroundScanner {
path_in_git_dir.starts_with(skipped_git_subdir)
})
{
log::debug!("ignoring event {abs_path:?} as it's in the .git directory among skipped files or directories");
return false;
log::debug!(
"ignoring event {abs_path:?} as it's in the .git directory among skipped files or directories"
);
skip_ix(&mut ranges_to_drop, ix);
continue;
}
is_git_related = true;
@@ -3919,8 +3947,7 @@ impl BackgroundScanner {
}
}
let relative_path = if let Ok(path) =
abs_path.strip_prefix(&root_canonical_path)
let relative_path = if let Ok(path) = abs_path.strip_prefix(&root_canonical_path)
&& let Ok(path) = RelPath::new(path, PathStyle::local())
{
path
@@ -3931,10 +3958,11 @@ impl BackgroundScanner {
);
} else {
log::error!(
"ignoring event {abs_path:?} outside of root path {root_canonical_path:?}",
"ignoring event {abs_path:?} outside of root path {root_canonical_path:?}",
);
}
return false;
skip_ix(&mut ranges_to_drop, ix);
continue;
};
if abs_path.file_name() == Some(OsStr::new(GITIGNORE)) {
@@ -3958,21 +3986,26 @@ impl BackgroundScanner {
});
if !parent_dir_is_loaded {
log::debug!("ignoring event {relative_path:?} within unloaded directory");
return false;
skip_ix(&mut ranges_to_drop, ix);
continue;
}
if self.settings.is_path_excluded(&relative_path) {
if !is_git_related {
log::debug!("ignoring FS event for excluded path {relative_path:?}");
}
return false;
skip_ix(&mut ranges_to_drop, ix);
continue;
}
relative_paths.push(relative_path.into_arc());
true
}
});
for range_to_drop in ranges_to_drop.into_iter().rev() {
abs_paths.drain(range_to_drop);
}
}
if relative_paths.is_empty() && dot_git_abs_paths.is_empty() {
return;
}
@@ -4275,7 +4308,7 @@ impl BackgroundScanner {
let mut child_entry = Entry::new(
child_path.clone(),
&child_metadata,
&next_entry_id,
ProjectEntryId::new(&next_entry_id),
root_char_bag,
None,
);
@@ -4462,10 +4495,11 @@ impl BackgroundScanner {
.ignore_stack_for_abs_path(&abs_path, metadata.is_dir, self.fs.as_ref())
.await;
let is_external = !canonical_path.starts_with(&root_canonical_path);
let entry_id = state.entry_id_for(self.next_entry_id.as_ref(), path, &metadata);
let mut fs_entry = Entry::new(
path.clone(),
&metadata,
self.next_entry_id.as_ref(),
entry_id,
state.snapshot.root_char_bag,
if metadata.is_symlink {
Some(canonical_path.as_path().to_path_buf().into())

View File

@@ -1533,6 +1533,175 @@ async fn test_create_dir_all_on_create_entry(cx: &mut TestAppContext) {
});
}
#[gpui::test]
async fn test_create_file_in_expanded_gitignored_dir(cx: &mut TestAppContext) {
// Tests the behavior of our worktree refresh when a file in a gitignored directory
// is created.
init_test(cx);
let fs = FakeFs::new(cx.background_executor.clone());
fs.insert_tree(
"/root",
json!({
".gitignore": "ignored_dir\n",
"ignored_dir": {
"existing_file.txt": "existing content",
"another_file.txt": "another content",
},
}),
)
.await;
let tree = Worktree::local(
Path::new("/root"),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
cx.read(|cx| tree.read(cx).as_local().unwrap().scan_complete())
.await;
tree.read_with(cx, |tree, _| {
let ignored_dir = tree.entry_for_path(rel_path("ignored_dir")).unwrap();
assert!(ignored_dir.is_ignored);
assert_eq!(ignored_dir.kind, EntryKind::UnloadedDir);
});
tree.update(cx, |tree, cx| {
tree.load_file(rel_path("ignored_dir/existing_file.txt"), cx)
})
.await
.unwrap();
tree.read_with(cx, |tree, _| {
let ignored_dir = tree.entry_for_path(rel_path("ignored_dir")).unwrap();
assert!(ignored_dir.is_ignored);
assert_eq!(ignored_dir.kind, EntryKind::Dir);
assert!(
tree.entry_for_path(rel_path("ignored_dir/existing_file.txt"))
.is_some()
);
assert!(
tree.entry_for_path(rel_path("ignored_dir/another_file.txt"))
.is_some()
);
});
let entry = tree
.update(cx, |tree, cx| {
tree.create_entry(rel_path("ignored_dir/new_file.txt").into(), false, None, cx)
})
.await
.unwrap();
assert!(entry.into_included().is_some());
cx.executor().run_until_parked();
tree.read_with(cx, |tree, _| {
let ignored_dir = tree.entry_for_path(rel_path("ignored_dir")).unwrap();
assert!(ignored_dir.is_ignored);
assert_eq!(
ignored_dir.kind,
EntryKind::Dir,
"ignored_dir should still be loaded, not UnloadedDir"
);
assert!(
tree.entry_for_path(rel_path("ignored_dir/existing_file.txt"))
.is_some(),
"existing_file.txt should still be visible"
);
assert!(
tree.entry_for_path(rel_path("ignored_dir/another_file.txt"))
.is_some(),
"another_file.txt should still be visible"
);
assert!(
tree.entry_for_path(rel_path("ignored_dir/new_file.txt"))
.is_some(),
"new_file.txt should be visible"
);
});
}
#[gpui::test]
async fn test_fs_event_for_gitignored_dir_does_not_lose_contents(cx: &mut TestAppContext) {
// Tests the behavior of our worktree refresh when a modification event
// is triggered for a gitignored directory.
init_test(cx);
let fs = FakeFs::new(cx.background_executor.clone());
fs.insert_tree(
"/root",
json!({
".gitignore": "ignored_dir\n",
"ignored_dir": {
"file1.txt": "content1",
"file2.txt": "content2",
},
}),
)
.await;
let tree = Worktree::local(
Path::new("/root"),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
cx.read(|cx| tree.read(cx).as_local().unwrap().scan_complete())
.await;
// Load a file to expand the ignored directory
tree.update(cx, |tree, cx| {
tree.load_file(rel_path("ignored_dir/file1.txt"), cx)
})
.await
.unwrap();
tree.read_with(cx, |tree, _| {
let ignored_dir = tree.entry_for_path(rel_path("ignored_dir")).unwrap();
assert_eq!(ignored_dir.kind, EntryKind::Dir);
assert!(
tree.entry_for_path(rel_path("ignored_dir/file1.txt"))
.is_some()
);
assert!(
tree.entry_for_path(rel_path("ignored_dir/file2.txt"))
.is_some()
);
});
fs.emit_fs_event("/root/ignored_dir", Some(fs::PathEventKind::Changed));
tree.flush_fs_events(cx).await;
tree.read_with(cx, |tree, _| {
let ignored_dir = tree.entry_for_path(rel_path("ignored_dir")).unwrap();
assert_eq!(
ignored_dir.kind,
EntryKind::Dir,
"ignored_dir should still be loaded (Dir), not UnloadedDir"
);
assert!(
tree.entry_for_path(rel_path("ignored_dir/file1.txt"))
.is_some(),
"file1.txt should still be visible after directory fs event"
);
assert!(
tree.entry_for_path(rel_path("ignored_dir/file2.txt"))
.is_some(),
"file2.txt should still be visible after directory fs event"
);
});
}
#[gpui::test(iterations = 100)]
async fn test_random_worktree_operations_during_initial_scan(
cx: &mut TestAppContext,

View File

@@ -22,5 +22,9 @@
<true/>
<key>com.apple.security.personal-information.photos-library</key>
<true/>
<key>com.apple.security.files.user-selected.read-write</key>
<true/>
<key>com.apple.security.files.downloads.read-write</key>
<true/>
</dict>
</plist>

View File

@@ -404,6 +404,7 @@ pub fn main() {
});
app.run(move |cx| {
session::init(cx);
menu::init();
zed_actions::init();

View File

@@ -215,6 +215,10 @@ pub mod git {
Switch,
/// Selects a different repository.
SelectRepo,
/// Filter remotes.
FilterRemotes,
/// Create a git remote.
CreateRemote,
/// Opens the git branch selector.
#[action(deprecated_aliases = ["branches::OpenRecent"])]
Branch,

View File

@@ -89,12 +89,32 @@ To do this:
#### Cross-Region Inference
The Zed implementation of Amazon Bedrock uses [Cross-Region inference](https://docs.aws.amazon.com/bedrock/latest/userguide/cross-region-inference.html) for all the models and region combinations that support it.
The Zed implementation of Amazon Bedrock uses [Cross-Region inference](https://docs.aws.amazon.com/bedrock/latest/userguide/cross-region-inference.html) to improve availability and throughput.
With Cross-Region inference, you can distribute traffic across multiple AWS Regions, enabling higher throughput.
For example, if you use `Claude Sonnet 3.7 Thinking` from `us-east-1`, the request may be processed in any of the US regions: `us-east-1`, `us-east-2`, or `us-west-2`.
Cross-Region inference requests are kept within the AWS Regions that are part of the geography where the data originally resides.
For example, a request made within the US is kept within the AWS Regions in the US.
##### Regional vs Global Inference Profiles
Bedrock supports two types of cross-region inference profiles:
- **Regional profiles** (default): Route requests within a specific geography (US, EU, APAC). For example, `us-east-1` uses the `us.*` profile which routes across `us-east-1`, `us-east-2`, and `us-west-2`.
- **Global profiles**: Route requests across all commercial AWS Regions for maximum availability and performance.
By default, Zed uses **regional profiles**, which keep your data within the same geography. You can opt into global profiles by adding `"allow_global": true` to your Bedrock configuration:
```json [settings]
{
"language_models": {
"bedrock": {
"authentication_method": "named_profile",
"region": "your-aws-region",
"profile": "your-profile-name",
"allow_global": true
}
}
}
```
**Note:** Only select newer models support global inference profiles. See the [AWS Bedrock supported models documentation](https://docs.aws.amazon.com/bedrock/latest/userguide/inference-profiles-support.html#inference-profiles-support-system) for the current list of models that support global inference. If you encounter availability issues with a model in your region, enabling `allow_global` may resolve them.
Although the data remains stored only in the source Region, your input prompts and output results might move outside of your source Region during cross-Region inference.
All data will be transmitted encrypted across Amazon's secure network.

View File

@@ -1451,6 +1451,45 @@ or
`boolean` values
### Session
- Description: Controls Zed lifecycle-related behavior.
- Setting: `session`
- Default:
```json
{
"session": {
"restore_unsaved_buffers": true,
"trust_all_worktrees": false
}
}
```
**Options**
1. Whether or not to restore unsaved buffers on restart:
```json [settings]
{
"session": {
"restore_unsaved_buffers": true
}
}
```
If this is true, the user won't be prompted to save or discard dirty files when closing the application.
2. Whether or not to skip project trust checks and synchronize project settings from any worktree automatically:
```json [settings]
{
"session": {
"trust_all_worktrees": false
}
}
```
### Drag And Drop Selection
- Description: Whether to allow drag and drop text selection in buffer. `delay` is the milliseconds that must elapse before drag and drop is allowed. Otherwise, a new text selection is created.

View File

@@ -2,34 +2,44 @@
PHP support is available through the [PHP extension](https://github.com/zed-extensions/php).
- Tree-sitter: https://github.com/tree-sitter/tree-sitter-php
- Language Servers:
- [phpactor](https://github.com/phpactor/phpactor)
- [intelephense](https://github.com/bmewburn/vscode-intelephense/)
- Tree-sitter: [tree-sitter/tree-sitter-php](https://github.com/tree-sitter/tree-sitter-php)
- Language Server: [phpactor/phpactor](https://github.com/phpactor/phpactor)
- Alternate Language Server: [bmewburn/vscode-intelephense](https://github.com/bmewburn/vscode-intelephense/)
## Install PHP
The PHP extension requires PHP to be installed and available in your `PATH`:
```sh
# macOS via Homebrew
brew install php
# Debian/Ubuntu
sudo apt-get install php-cli
# CentOS 8+/RHEL
sudo dnf install php-cli
# Arch Linux
sudo pacman -S php
# check PHP path
## macOS and Linux
which php
## Windows
where php
```
## Choosing a language server
The PHP extension offers both `phpactor` and `intelephense` language server support.
`phpactor` is enabled by default.
### Phpactor
The Zed PHP Extension can install `phpactor` automatically but requires `php` to be installed and available in your path:
```sh
# brew install php # macOS
# sudo apt-get install php # Debian/Ubuntu
# yum install php # CentOS/RHEL
# pacman -S php # Arch Linux
which php
```
The PHP extension uses [LSP language servers](https://microsoft.github.io/language-server-protocol) with Phpactor as the default. If you want to use another language server that supports Zed (e.g. Intelephense or PHP Tools), follow its documentation to set it up.
### Intelephense
[Intelephense](https://intelephense.com/) is a [proprietary](https://github.com/bmewburn/vscode-intelephense/blob/master/LICENSE.txt#L29) language server for PHP operating under a freemium model. Certain features require purchase of a [premium license](https://intelephense.com/).
[Intelephense](https://intelephense.com/) is a [proprietary](https://github.com/bmewburn/vscode-intelephense/blob/master/LICENSE.txt#L29) language server for PHP operating under a freemium model. Certain features require purchase of a [premium license](https://intelephense.com/buy).
To switch to `intelephense`, add the following to your `settings.json`:
To use Intelephense, add the following to your `settings.json`:
```json [settings]
{
@@ -41,7 +51,9 @@ To switch to `intelephense`, add the following to your `settings.json`:
}
```
To use the premium features, you can place your [licence.txt file](https://intelephense.com/faq.html) at `~/intelephense/licence.txt` inside your home directory. Alternatively, you can pass the licence key or a path to a file containing the licence key as an initialization option for the `intelephense` language server. To do this, add the following to your `settings.json`:
To use the premium features, you can place your licence file inside your home directory, at `~/intelephense/licence.txt` on macOS and Linux, or `%USERPROFILE%/intelephense/licence.txt` on Windows.
Alternatively, you can pass the licence key or a path to a file containing the licence key as an initialization option. To do this, add the following to your `settings.json`:
```json [settings]
{
@@ -55,15 +67,67 @@ To use the premium features, you can place your [licence.txt file](https://intel
}
```
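The settings body is elided in the hunk above. As a rough sketch, Intelephense's documented `licenceKey` initialization option can be passed like this (the key value is a placeholder):

```json [settings]
{
  "lsp": {
    "intelephense": {
      "initialization_options": {
        "licenceKey": "your-licence-key-or-path-to-licence-file"
      }
    }
  }
}
```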
### PHP Tools
[PHP Tools](https://www.devsense.com/) is a proprietary language server that offers free and premium features. You need to [purchase a license](https://www.devsense.com/en/purchase) to activate the premium features.
To use PHP Tools, add the following to your `settings.json`:
```json [settings]
{
"languages": {
"PHP": {
"language_servers": ["phptools", "!intelephense", "!phpactor", "..."]
}
}
}
```
To use the premium features, you can add your license in `initialization_options` in your `settings.json`:
```json [settings]
{
"lsp": {
"phptools": {
"initialization_options": {
"0": "your_license_key"
}
}
}
}
```
Or, set the environment variable `DEVSENSE_PHP_LS_LICENSE` in a `.env` file in your project:
```env
DEVSENSE_PHP_LS_LICENSE="your_license_key"
```
Check out the documentation of [PHP Tools for Zed](https://docs.devsense.com/other/zed/) for more details.
### Phpactor
To use Phpactor instead of Intelephense or any other tools, add the following to your `settings.json`:
```json [settings]
{
"languages": {
"PHP": {
"language_servers": ["phpactor", "!intelephense", "!phptools", "..."]
}
}
}
```
## PHPDoc
Zed supports syntax highlighting for PHPDoc comments.
- Tree-sitter: [claytonrcarter/tree-sitter-phpdoc](https://github.com/claytonrcarter/tree-sitter-phpdoc)
## Setting up Xdebug
## Debugging
Zed's PHP extension provides a debug adapter for PHP and Xdebug. The adapter name is `Xdebug`. Here are a couple of ways you can use it:
The PHP extension provides a debug adapter for PHP via Xdebug. There are several ways to use it:
```json
[
@@ -83,10 +83,10 @@ Zed's PHP extension provides a debug adapter for PHP and Xdebug. The adapter n
]
```
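The configuration body is elided in the hunk above. As a minimal sketch of what a `.zed/debug.json` entry for attaching to Xdebug might look like (field values are assumptions — Xdebug listens on port 9003 by default; check the extension docs for the exact schema):

```json
[
  {
    "adapter": "Xdebug",
    "label": "Listen for Xdebug",
    "port": 9003
  }
]
```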
In case you run into issues:
Here are some common troubleshooting tips in case you run into issues:
- ensure that you have Xdebug installed for the version of PHP you're running
- ensure that Xdebug is configured to run in `debug` mode
- ensure that Xdebug is actually starting a debugging session
- check that the host and port matches between Xdebug and Zed
- look at the diagnostics log by using the `xdebug_info()` function in the page you're trying to debug
- Ensure that you have Xdebug installed for the version of PHP you're running.
- Ensure that Xdebug is configured to run in `debug` mode.
- Ensure that Xdebug is actually starting a debugging session.
- Ensure that the host and port match between Xdebug and Zed.
- Look at the diagnostics log by using the `xdebug_info()` function in the page you're trying to debug.