Compare commits

...

20 Commits

Author SHA1 Message Date
Conrad Irwin
f802444e79 Remove can_open_windows
Co-Authored-By: Mikayla <mikayla@zed.dev>
2024-06-10 15:24:01 -06:00
Conrad Irwin
7716fa8312 Make window initialization fallible 2024-06-10 15:20:20 -06:00
Vitaly Slobodin
a600799840 ruby: Remove outline for running tests (#12642)
Hi, this pull request supersedes
https://github.com/zed-industries/zed/pull/12624
and removes queries for runnables from `outline.scm`. This pull request
has a couple of things to mention:

- Removed the task for running tests with `minitest`, as I don't think it's
reliable in its current state: AFAIK, the only way to run `minitest`
against a specific line, e.g. `bundle exec rake test
spec/models/some_model.rb:12`, is to use it with Rails. Support for
`minitest` is still there, and users can add their own task, for
instance, when they use `minitest` in Rails, to get support for running
tests:

  ```json
  {
    "label": "test $ZED_RELATIVE_FILE:$ZED_ROW",
    "command": "./bin/rails",
    "args": ["test", "\"$ZED_RELATIVE_FILE:$ZED_ROW\""],
    "tags": ["minitest-test"]
  }
  ```

**Question:** Perhaps that should be mentioned in the Ruby extension
documentation?

- Adjusted runnables queries to work without `ZED_SYMBOL`.

Release Notes:

- N/A
2024-06-10 18:04:43 +02:00
Thorsten Ball
05b6581147 linux/x11: handle XIM events sync to reduce lag (#12840)
This helps with the problem of keyboard input feeling laggy when the
event loop is under load.

What would previously happen is:

- N events from X11 arrive
- N events get forwarded to XIM
- N events are handled in N iterations of the event loop (sadly, yes: we
only seem to get back one `ClientMessage` per poll from the XCB
connection)
- Each event is pushed into the channel
- N event loop iterations are needed to get the events off the channel
and handle them

With this change, we get rid of the last 2 steps: instead of pushing the
event onto a channel, we store it on the XIM handler itself, and then
work it off synchronously.

Usually one shouldn't block the event loop, but I think in this case -
user input! - it's better to handle the events directly instead of
re-enqueuing them in a channel, where they can accumulate and need
multiple iterations of the loop to be worked off.

This does *not* fix the problem of input feeling choppy/slower when the
system is under load, but it makes the behavior now feel exactly the
same as when XIM is disabled.

I also think the code is easier to understand since it's more
straightforward.
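
The difference can be sketched with simulated types (not Zed's actual XIM handler; `KeyEvent` and both functions are hypothetical stand-ins). Queueing each filtered event costs one extra loop iteration per event to drain, while handling synchronously works the whole batch off in the current iteration:

```rust
use std::collections::VecDeque;

// Hypothetical stand-in for a filtered XIM key event; not Zed's real type.
#[derive(Clone, Debug, PartialEq)]
pub struct KeyEvent(pub u32);

// "Before": every event goes through a queue, and each queued event costs
// one event-loop iteration to drain and handle.
pub fn drain_via_channel(incoming: Vec<KeyEvent>) -> (Vec<KeyEvent>, usize) {
    let mut queue: VecDeque<KeyEvent> = incoming.into_iter().collect();
    let mut handled = Vec::new();
    let mut loop_iterations = 0;
    while let Some(ev) = queue.pop_front() {
        loop_iterations += 1; // one loop turn per queued event
        handled.push(ev);
    }
    (handled, loop_iterations)
}

// "After": the handler stores events on itself and works them off
// synchronously, so the whole batch costs a single loop turn.
pub fn handle_synchronously(incoming: Vec<KeyEvent>) -> (Vec<KeyEvent>, usize) {
    let handled: Vec<KeyEvent> = incoming.into_iter().collect();
    (handled, 1)
}
```

Both strategies handle the same events; only the number of loop iterations they consume differs.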

Release Notes:

- N/A
2024-06-10 14:08:16 +02:00
Thorsten Ball
43d1a8040d linux: run runnables only when event loop is idle (#12839)
This change ensures that the event loop prioritizes enqueueing another
render or handling user input over executing runnables.

It's a subtle change, the result of a week of digging into performance
on X11. It's also not perfect: ideally we'd get rid of the intermediate
channel here and have more control over when and how we run runnables vs.
X11 events, but I think, short of rewriting how we use the event loop,
this is a good cost/benefit change.

To illustrate:

Before this change, it was possible to block the app from rendering for
a long time by just creating a ton of futures that were executed on the
"main" thread (we don't have a "main" thread on Linux, but we have a
single thread in which we run the event loop).

That was relatively easy to reproduce by opening the `zed` repository
and starting `rust-analyzer`: at some point `rust-analyzer` sends us so
many notifications (all handled in futures) that the event loop is busy
just working off the runnables, never getting to the events that X11
sends us or to our own timer to re-enqueue another render.

When you put print statements into the code to show when each event was
handled, you'd see something like this **before this change**:

```
[ ... hundreds of runnable.run() ... ]
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
new render tick timer. lag: 56.942049ms
X11 event
new render tick timer. lag: 9.668µs
X11 event
new render tick timer. lag: 9.955µs
X11 event
runnable.run()
runnable.run()
runnable.run()
runnable.run()
new render tick timer. lag: 12.462µs
X11 event
new render tick timer. lag: 14.868µs
X11 event
new render tick timer. lag: 11.234µs
X11 event
new render tick timer. lag: 11.681µs
X11 event
new render tick timer. lag: 13.926µs
X11 event
```

Note the `lag: 56ms`: that's the difference between when we wanted to
execute the callback that enqueues another render and when it ran.

Longer lags are possible; this is just the first one I grabbed from the
logs.
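
The lag in these logs is simply the gap between when the render-tick callback was meant to run and when the loop actually got to it. A minimal sketch of that measurement (`timer_lag` is a hypothetical helper, not Zed's code):

```rust
use std::thread::sleep;
use std::time::{Duration, Instant};

// Lag, as logged above: how late we actually got to run a callback,
// relative to when we intended to run it. Saturates to zero if we are
// somehow early.
pub fn timer_lag(intended_at: Instant) -> Duration {
    Instant::now().saturating_duration_since(intended_at)
}
```

When the loop spends tens of milliseconds working off runnables before reaching the timer callback, this value is what shows up as `lag: 56.942049ms`.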

Now, compare this with the logs **after this change**:

```
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
new render tick timer. lag: 36.051µs
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
X11 event
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
runnable.run()
```

In between the many `runnable.run()` calls, we now always handle events.

So, in essence, what this change does is introduce two priorities into
the X11 event queue:

- high: X11 events (user events, render events, ...), render tick, XIM
events, ...
- low: all async Rust code

I've tested this with a debug build and a release build, and I think the
app now feels more responsive. It still doesn't feel perfect, especially
in the slow debug builds, but I couldn't observe 10s lockups anymore.

Since it's a pretty small change, I think we should go for it and see
how it behaves.

Thanks to @maan2003 this now also includes the same change to Wayland.
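
The two-priority behavior can be sketched as a simulation (hypothetical `Work` type; the real implementation defers runnables via the event loop's idle callbacks rather than keeping explicit queues): each turn drains every pending high-priority item before touching a low-priority runnable.

```rust
use std::collections::VecDeque;

// Simulated work items; the real loop deals in calloop event sources.
#[derive(Clone, Debug, PartialEq)]
pub enum Work {
    X11Event(u32),
    Runnable(u32),
}

// One turn of a two-priority loop: drain every pending high-priority item
// (X11 events, render ticks, XIM events) before running a single
// low-priority runnable, so input and rendering can never be starved by
// a backlog of async work.
pub fn run_one_turn(
    high: &mut VecDeque<Work>,
    low: &mut VecDeque<Work>,
    log: &mut Vec<Work>,
) {
    while let Some(item) = high.pop_front() {
        log.push(item);
    }
    if let Some(item) = low.pop_front() {
        log.push(item);
    }
}
```

With a single-priority queue, hundreds of queued runnables would run before the next X11 event; here the high-priority queue is always emptied first.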

Release Notes:

- N/A

---------

Co-authored-by: maan2003 <manmeetmann2003@gmail.com>
2024-06-10 14:04:41 +02:00
Panghu
e829a8c3b0 Add auto-completion support for package.json files (#12792)
![Screenshot 2024-06-08 07 56 41](https://github.com/zed-industries/zed/assets/21101490/da97e7d4-458b-4262-ac23-a4704af4f015)

Release Notes:

- Added auto-completion support for `package.json` files.
2024-06-08 13:33:29 +03:00
Arseny Kapoulkine
87845a349d cpp: Highlight sized type specifiers as keywords (#12751)
Without this, `unsigned` or `unsigned int` is not highlighted properly:
`int` is a `primitive_type`, but `unsigned` is a `sized_type_specifier`. This
is already handled in C, where both are part of the `@type` highlight group.

Before:

![image](https://github.com/zed-industries/zed/assets/1106629/7210b769-9dff-428c-9e4f-55b652f91674)

After:

![image](https://github.com/zed-industries/zed/assets/1106629/8661c412-30f0-4b44-a4a2-1860a0b56a4e)

Release Notes:

- N/A
2024-06-08 13:26:10 +03:00
Kirill Bulatov
953393f6ce Rename workspace::Restart action into workspace::Reload (#12672)
Closes https://github.com/zed-industries/zed/issues/12609

Instead of adding some ordering mechanism to the actions, rename the
action so that it doesn't interfere with the `editor: restart language
server` command.

Before:

![image](https://github.com/zed-industries/zed/assets/2690773/b5e86eda-d766-49fc-a25b-f8b9fdb7b521)

![image](https://github.com/zed-industries/zed/assets/2690773/c5edeb56-12aa-496b-bb6f-dc705cbb9ae3)


After:

![image](https://github.com/zed-industries/zed/assets/2690773/ed30c68d-bfdd-4e00-bb5d-0be52fbe4e16)
![Screenshot 2024-06-05 at 09 46 25](https://github.com/zed-industries/zed/assets/2690773/9fe4eb52-0399-4321-85a9-3b07c11395ce)


Release Notes:

- Improved language server restart command ergonomics by renaming the
`workspace::Restart` action to `workspace::Reload`, so it no longer
collides with other "restart"-worded actions in the list
2024-06-08 13:23:59 +03:00
Conrad Irwin
75f8be6a0f vim: add guu gUU g~~ g/ (#12789)
Release Notes:

- vim: Add `g/` for project search
2024-06-07 16:45:38 -06:00
Max Brunsfeld
e174f16d50 Refactor: Make it possible to share a remote worktree (#12775)
This PR is an internal refactor in preparation for remote editing. It
restructures the public interface of `Worktree`, reducing the number of
call sites that assume that a worktree is local or remote.

* The Project no longer calls `worktree.as_local_mut().unwrap()` in code
paths related to basic file operations
* Fewer code paths in the app rely on the worktree's `LocalSnapshot`
* Worktree-related RPC message handling is more fully encapsulated by
the `Worktree` type.

to do:
* [x] file manipulation operations
* [x] sending worktree updates when sharing

for later
* opening buffers
* updating open buffers upon worktree changes
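
A minimal sketch of the direction (hypothetical, simplified shapes; the real types live in Zed's worktree code): shared behavior moves onto `Worktree` itself, so call sites no longer downcast with `as_local().unwrap()` just to ask a question both variants can answer.

```rust
// Hypothetical, simplified stand-ins for Zed's worktree types.
pub struct LocalWorktree {
    update_observer: bool,
}

pub struct RemoteWorktree {
    update_observer: bool,
}

pub enum Worktree {
    Local(LocalWorktree),
    Remote(RemoteWorktree),
}

impl Worktree {
    // Shared query: works for either variant, so call sites no longer
    // need `as_local().unwrap()` to ask it.
    pub fn has_update_observer(&self) -> bool {
        match self {
            Worktree::Local(t) => t.update_observer,
            Worktree::Remote(t) => t.update_observer,
        }
    }

    // The downcast still exists for genuinely local-only operations.
    pub fn as_local(&self) -> Option<&LocalWorktree> {
        match self {
            Worktree::Local(t) => Some(t),
            Worktree::Remote(_) => None,
        }
    }
}
```

The test diffs in this PR follow exactly this shape: `tree.as_local().unwrap().has_update_observer()` collapses to `tree.has_update_observer()`.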

Release Notes:

- N/A
2024-06-07 12:53:01 -07:00
slowlydev
aa60fc2f19 Use the new assistant icon in the setup instructions (#12787)
This is a PR with just a small visual adjustment, so the instructions are
up to date with the new icon.
I did not remove the "old" ai.svg, as I am not sure whether it's going to
be used in the future or whether it has been completely replaced by the
new "zed assistant" icon.

Release Notes:

- Fixed the wrong icon being used in the assistant setup instructions.

For open ai

<img width="543" alt="image"
src="https://github.com/zed-industries/zed/assets/61624214/5f18a8f4-6761-4df5-8482-92582545dee5">

and anthropic

<img width="544" alt="image"
src="https://github.com/zed-industries/zed/assets/61624214/6ca3ed23-0f68-4c0d-bc8a-32ab7c607029">

how it looked before (Zed Preview 0.139.3
0c083b7f38):

<img width="526" alt="image"
src="https://github.com/zed-industries/zed/assets/61624214/af9c9fa8-89ed-4f6a-88ca-b285b4c522c3">
2024-06-07 15:12:16 -04:00
Conrad Irwin
5548773b2e vim: Add gU/gu/g~ (#12782)
Co-Authored-By: ethanmsl@gmail.com

Release Notes:

- vim: Added `gu`/`gU`/`g~` for changing case. (#12565)
2024-06-07 12:38:12 -06:00
Joseph T. Lyons
3eac83eece Add event for yarn project identification (#12785)
Report an `open yarn project` `app_event` for each worktree where
`yarn.lock` is found, and only report it once per session.

Release Notes:

- N/A
2024-06-07 14:30:38 -04:00
Marshall Bowers
243a0e764d Block publishing of zed_extension_api v0.0.7 (#12784)
This PR adds a temporary block on publishing v0.0.7 of the
`zed_extension_api`.

We have breaking changes to the extension API that are currently staged
on `main` and are still being iterated on, so we don't want to publish
again until we're ready to commit to the new API.

This change is intended to prevent accidental publishing of the crate
before we're ready.

Release Notes:

- N/A
2024-06-07 14:16:21 -04:00
Conrad Irwin
6fa6e0718c Check validity of new.range too (#12781)
I'm not certain yet how it could be invalid, but we are still seeing
panics here.

Release Notes:

- Fixed a panic when opening the diagnostics view
2024-06-07 11:48:23 -06:00
Marshall Bowers
834089feb1 Handle Wikipedia code blocks in /fetch command (#12780)
This PR extends the `/fetch` command with support for Wikipedia code
blocks.

Release Notes:

- N/A
2024-06-07 12:54:33 -04:00
Marshall Bowers
9174858225 Add basic Wikipedia support to /fetch (#12777)
This PR extends the `/fetch` slash command with the initial support for
Wikipedia's HTML structure.

Release Notes:

- N/A
2024-06-07 12:03:43 -04:00
Thorsten Ball
a910f192db docs: Document how to setup Tailwind CSS support in Ruby (#12762)
Release Notes:

- N/A
2024-06-07 13:43:57 +02:00
Piotr Osiewicz
5f5e6b8616 workspace: Fix drag&dropping project panel entries into editor area (#12767)
Fixes #12733 

Release Notes:

- Fixed drag&dropping project panel entries into editor area & tab bar
2024-06-07 10:23:57 +02:00
Stanislav Alekseev
07dbd2bce8 Use rust-analyzer from path if possible (#12418)
Release Notes:

- Added support for looking up the `rust-analyzer` binary in `$PATH`. This allows using tools such as `asdf` and Nix to configure per-folder Rust installations. To enable this behavior, use the `path_lookup` key when configuring the `rust-analyzer` `binary`: `{"lsp": {"rust-analyzer": {"binary": {"path_lookup": true}}}}`.
2024-06-07 06:56:38 +02:00
77 changed files with 2857 additions and 1276 deletions

Cargo.lock generated
View File

@@ -1511,7 +1511,7 @@ dependencies = [
[[package]]
name = "blade-graphics"
version = "0.4.0"
source = "git+https://github.com/kvark/blade?rev=bdaf8c534fbbc9fbca71d1cf272f45640b3a068d#bdaf8c534fbbc9fbca71d1cf272f45640b3a068d"
source = "git+https://github.com/zed-industries/blade?rev=86bf7228f50f44058edf7872055261efd9eca9de#86bf7228f50f44058edf7872055261efd9eca9de"
dependencies = [
"ash",
"ash-window",
@@ -1541,7 +1541,7 @@ dependencies = [
[[package]]
name = "blade-macros"
version = "0.2.1"
source = "git+https://github.com/kvark/blade?rev=bdaf8c534fbbc9fbca71d1cf272f45640b3a068d#bdaf8c534fbbc9fbca71d1cf272f45640b3a068d"
source = "git+https://github.com/zed-industries/blade?rev=86bf7228f50f44058edf7872055261efd9eca9de#86bf7228f50f44058edf7872055261efd9eca9de"
dependencies = [
"proc-macro2",
"quote",
@@ -1551,7 +1551,7 @@ dependencies = [
[[package]]
name = "blade-util"
version = "0.1.0"
source = "git+https://github.com/kvark/blade?rev=bdaf8c534fbbc9fbca71d1cf272f45640b3a068d#bdaf8c534fbbc9fbca71d1cf272f45640b3a068d"
source = "git+https://github.com/zed-industries/blade?rev=86bf7228f50f44058edf7872055261efd9eca9de#86bf7228f50f44058edf7872055261efd9eca9de"
dependencies = [
"blade-graphics",
"bytemuck",
@@ -12898,7 +12898,6 @@ dependencies = [
"gpui",
"http 0.1.0",
"ignore",
"itertools 0.11.0",
"language",
"log",
"parking_lot",
@@ -13153,6 +13152,7 @@ version = "0.140.0"
dependencies = [
"activity_indicator",
"anyhow",
"ashpd",
"assets",
"assistant",
"audio",

View File

@@ -267,9 +267,9 @@ async-tar = "0.4.2"
async-trait = "0.1"
async_zip = { version = "0.0.17", features = ["deflate", "deflate64"] }
bitflags = "2.4.2"
blade-graphics = { git = "https://github.com/kvark/blade", rev = "bdaf8c534fbbc9fbca71d1cf272f45640b3a068d" }
blade-macros = { git = "https://github.com/kvark/blade", rev = "bdaf8c534fbbc9fbca71d1cf272f45640b3a068d" }
blade-util = { git = "https://github.com/kvark/blade", rev = "bdaf8c534fbbc9fbca71d1cf272f45640b3a068d" }
blade-graphics = { git = "https://github.com/zed-industries/blade", rev = "86bf7228f50f44058edf7872055261efd9eca9de" }
blade-macros = { git = "https://github.com/zed-industries/blade", rev = "86bf7228f50f44058edf7872055261efd9eca9de" }
blade-util = { git = "https://github.com/zed-industries/blade", rev = "86bf7228f50f44058edf7872055261efd9eca9de" }
cap-std = "3.0"
cargo_toml = "0.20"
chrono = { version = "0.4", features = ["serde"] }

View File

@@ -80,6 +80,7 @@
"g shift-e": ["vim::PreviousWordEnd", { "ignorePunctuation": true }],
"/": "vim::Search",
"g /": "pane::DeploySearch",
"?": [
"vim::Search",
{
@@ -381,6 +382,9 @@
"shift-s": "vim::SubstituteLine",
">": ["vim::PushOperator", "Indent"],
"<": ["vim::PushOperator", "Outdent"],
"g u": ["vim::PushOperator", "Lowercase"],
"g shift-u": ["vim::PushOperator", "Uppercase"],
"g ~": ["vim::PushOperator", "OppositeCase"],
"ctrl-pagedown": "pane::ActivateNextItem",
"ctrl-pageup": "pane::ActivatePrevItem",
// tree-sitter related commands
@@ -430,6 +434,27 @@
"d": "vim::CurrentLine"
}
},
{
"context": "Editor && vim_operator == gu",
"bindings": {
"g u": "vim::CurrentLine",
"u": "vim::CurrentLine"
}
},
{
"context": "Editor && vim_operator == gU",
"bindings": {
"g shift-u": "vim::CurrentLine",
"shift-u": "vim::CurrentLine"
}
},
{
"context": "Editor && vim_operator == g~",
"bindings": {
"g ~": "vim::CurrentLine",
"~": "vim::CurrentLine"
}
},
{
"context": "Editor && vim_mode == normal && vim_operator == d",
"bindings": {

View File

@@ -285,10 +285,10 @@ impl ActivityIndicator {
icon: None,
message: "Click to restart and update Zed".to_string(),
on_click: Some(Arc::new({
let restart = workspace::Restart {
let reload = workspace::Reload {
binary_path: Some(binary_path.clone()),
};
move |_, cx| workspace::restart(&restart, cx)
move |_, cx| workspace::reload(&reload, cx)
})),
},
AutoUpdateStatus::Errored => Content {

View File

@@ -349,7 +349,7 @@ impl Render for AuthenticationPrompt {
h_flex()
.gap_2()
.child(Label::new("Click on").size(LabelSize::Small))
.child(Icon::new(IconName::Ai).size(IconSize::XSmall))
.child(Icon::new(IconName::ZedAssistant).size(IconSize::XSmall))
.child(
Label::new("in the status bar to close this panel.").size(LabelSize::Small),
),

View File

@@ -336,7 +336,7 @@ impl Render for AuthenticationPrompt {
h_flex()
.gap_2()
.child(Label::new("Click on").size(LabelSize::Small))
.child(Icon::new(IconName::Ai).size(IconSize::XSmall))
.child(Icon::new(IconName::ZedAssistant).size(IconSize::XSmall))
.child(
Label::new("in the status bar to close this panel.").size(LabelSize::Small),
),

View File

@@ -97,7 +97,7 @@ pub fn open_prompt_library(
},
|cx| cx.new_view(|cx| PromptLibrary::new(store, language_registry, cx)),
)
})
})?
})
}
}

View File

@@ -5,7 +5,7 @@ use anyhow::{anyhow, bail, Context, Result};
use assistant_slash_command::{SlashCommand, SlashCommandOutput, SlashCommandOutputSection};
use futures::AsyncReadExt;
use gpui::{AppContext, Task, WeakView};
use html_to_markdown::convert_html_to_markdown;
use html_to_markdown::{convert_html_to_markdown, markdown, HandleTag};
use http::{AsyncBody, HttpClient, HttpClientWithUrl};
use language::LspAdapterDelegate;
use ui::{prelude::*, ButtonLike, ElevationIndex};
@@ -37,7 +37,24 @@ impl FetchSlashCommand {
);
}
convert_html_to_markdown(&body[..])
let mut handlers: Vec<Box<dyn HandleTag>> = vec![
Box::new(markdown::ParagraphHandler),
Box::new(markdown::HeadingHandler),
Box::new(markdown::ListHandler),
Box::new(markdown::TableHandler::new()),
Box::new(markdown::StyledTextHandler),
];
if url.contains("wikipedia.org") {
use html_to_markdown::structure::wikipedia;
handlers.push(Box::new(wikipedia::WikipediaChromeRemover));
handlers.push(Box::new(wikipedia::WikipediaInfoboxHandler));
handlers.push(Box::new(wikipedia::WikipediaCodeHandler::new()));
} else {
handlers.push(Box::new(markdown::CodeHandler));
}
convert_html_to_markdown(&body[..], handlers)
}
}

View File

@@ -1429,6 +1429,31 @@ impl Client {
}
}
pub fn request_dynamic(
&self,
envelope: proto::Envelope,
request_type: &'static str,
) -> impl Future<Output = Result<proto::Envelope>> {
let client_id = self.id();
log::debug!(
"rpc request start. client_id:{}. name:{}",
client_id,
request_type
);
let response = self
.connection_id()
.map(|conn_id| self.peer.request_dynamic(conn_id, envelope, request_type));
async move {
let response = response?.await;
log::debug!(
"rpc request finish. client_id:{}. name:{}",
client_id,
request_type
);
Ok(response?.0)
}
}
fn respond<T: RequestMessage>(&self, receipt: Receipt<T>, response: T::Response) -> Result<()> {
log::debug!("rpc respond. client_id:{}. name:{}", self.id(), T::NAME);
self.peer.respond(receipt, response)

View File

@@ -83,10 +83,7 @@ async fn test_host_disconnect(
let project_b = client_b.build_dev_server_project(project_id, cx_b).await;
cx_a.background_executor.run_until_parked();
assert!(worktree_a.read_with(cx_a, |tree, _| tree
.as_local()
.unwrap()
.has_update_observer()));
assert!(worktree_a.read_with(cx_a, |tree, _| tree.has_update_observer()));
let workspace_b = cx_b
.add_window(|cx| Workspace::new(None, project_b.clone(), client_b.app_state.clone(), cx));
@@ -123,10 +120,7 @@ async fn test_host_disconnect(
project_b.read_with(cx_b, |project, _| project.is_read_only());
assert!(worktree_a.read_with(cx_a, |tree, _| !tree
.as_local()
.unwrap()
.has_update_observer()));
assert!(worktree_a.read_with(cx_a, |tree, _| !tree.has_update_observer()));
// Ensure client B's edited state is reset and that the whole window is blurred.

View File

@@ -1378,10 +1378,7 @@ async fn test_unshare_project(
let project_b = client_b.build_dev_server_project(project_id, cx_b).await;
executor.run_until_parked();
assert!(worktree_a.read_with(cx_a, |tree, _| tree
.as_local()
.unwrap()
.has_update_observer()));
assert!(worktree_a.read_with(cx_a, |tree, _| tree.has_update_observer()));
project_b
.update(cx_b, |p, cx| p.open_buffer((worktree_id, "a.txt"), cx))
@@ -1406,10 +1403,7 @@ async fn test_unshare_project(
.unwrap();
executor.run_until_parked();
assert!(worktree_a.read_with(cx_a, |tree, _| !tree
.as_local()
.unwrap()
.has_update_observer()));
assert!(worktree_a.read_with(cx_a, |tree, _| !tree.has_update_observer()));
assert!(project_c.read_with(cx_c, |project, _| project.is_disconnected()));
@@ -1421,10 +1415,7 @@ async fn test_unshare_project(
let project_c2 = client_c.build_dev_server_project(project_id, cx_c).await;
executor.run_until_parked();
assert!(worktree_a.read_with(cx_a, |tree, _| tree
.as_local()
.unwrap()
.has_update_observer()));
assert!(worktree_a.read_with(cx_a, |tree, _| tree.has_update_observer()));
project_c2
.update(cx_c, |p, cx| p.open_buffer((worktree_id, "a.txt"), cx))
.await
@@ -1531,7 +1522,7 @@ async fn test_project_reconnect(
executor.run_until_parked();
let worktree1_id = worktree_a1.read_with(cx_a, |worktree, _| {
assert!(worktree.as_local().unwrap().has_update_observer());
assert!(worktree.has_update_observer());
worktree.id()
});
let (worktree_a2, _) = project_a1
@@ -1543,7 +1534,7 @@ async fn test_project_reconnect(
executor.run_until_parked();
let worktree2_id = worktree_a2.read_with(cx_a, |tree, _| {
assert!(tree.as_local().unwrap().has_update_observer());
assert!(tree.has_update_observer());
tree.id()
});
executor.run_until_parked();
@@ -1576,9 +1567,7 @@ async fn test_project_reconnect(
assert_eq!(project.collaborators().len(), 1);
});
worktree_a1.read_with(cx_a, |tree, _| {
assert!(tree.as_local().unwrap().has_update_observer())
});
worktree_a1.read_with(cx_a, |tree, _| assert!(tree.has_update_observer()));
// While client A is disconnected, add and remove files from client A's project.
client_a
@@ -1620,7 +1609,7 @@ async fn test_project_reconnect(
.await;
let worktree3_id = worktree_a3.read_with(cx_a, |tree, _| {
assert!(!tree.as_local().unwrap().has_update_observer());
assert!(!tree.has_update_observer());
tree.id()
});
executor.run_until_parked();
@@ -1643,11 +1632,7 @@ async fn test_project_reconnect(
project_a1.read_with(cx_a, |project, cx| {
assert!(project.is_shared());
assert!(worktree_a1
.read(cx)
.as_local()
.unwrap()
.has_update_observer());
assert!(worktree_a1.read(cx).has_update_observer());
assert_eq!(
worktree_a1
.read(cx)
@@ -1665,11 +1650,7 @@ async fn test_project_reconnect(
"subdir2/i.txt"
]
);
assert!(worktree_a3
.read(cx)
.as_local()
.unwrap()
.has_update_observer());
assert!(worktree_a3.read(cx).has_update_observer());
assert_eq!(
worktree_a3
.read(cx)
@@ -1750,7 +1731,7 @@ async fn test_project_reconnect(
executor.run_until_parked();
let worktree4_id = worktree_a4.read_with(cx_a, |tree, _| {
assert!(tree.as_local().unwrap().has_update_observer());
assert!(tree.has_update_observer());
tree.id()
});
project_a1.update(cx_a, |project, cx| {

View File

@@ -686,7 +686,7 @@ impl CollabTitlebarItem {
.on_click(|_, cx| {
if let Some(auto_updater) = auto_update::AutoUpdater::get(cx) {
if auto_updater.read(cx).status().is_updated() {
workspace::restart(&Default::default(), cx);
workspace::reload(&Default::default(), cx);
return;
}
}

View File

@@ -8,6 +8,7 @@ use settings::Settings;
use std::sync::{Arc, Weak};
use theme::ThemeSettings;
use ui::{prelude::*, Button, Label};
use util::ResultExt;
use workspace::AppState;
pub fn init(app_state: &Arc<AppState>, cx: &mut AppContext) {
@@ -27,16 +28,21 @@ pub fn init(app_state: &Arc<AppState>, cx: &mut AppContext) {
for screen in cx.displays() {
let options = notification_window_options(screen, window_size, cx);
let window = cx.open_window(options, |cx| {
cx.new_view(|_| {
ProjectSharedNotification::new(
owner.clone(),
*project_id,
worktree_root_names.clone(),
app_state.clone(),
)
let Some(window) = cx
.open_window(options, |cx| {
cx.new_view(|_| {
ProjectSharedNotification::new(
owner.clone(),
*project_id,
worktree_root_names.clone(),
app_state.clone(),
)
})
})
});
.log_err()
else {
continue;
};
notification_windows
.entry(*project_id)
.or_insert(Vec::new())

View File

@@ -867,10 +867,12 @@ fn compare_diagnostics(
snapshot: &language::BufferSnapshot,
) -> Ordering {
use language::ToOffset;
// The old diagnostics may point to a previously open Buffer for this file.
if !old.range.start.is_valid(snapshot) {
// The diagnostics may point to a previously open Buffer for this file.
if !old.range.start.is_valid(snapshot) || !new.range.start.is_valid(snapshot) {
return Ordering::Greater;
}
old.range
.start
.to_offset(snapshot)

View File

@@ -509,6 +509,7 @@ fn test_clone(cx: &mut TestAppContext) {
.update(cx, |editor, cx| {
cx.open_window(Default::default(), |cx| cx.new_view(|cx| editor.clone(cx)))
})
.unwrap()
.unwrap();
let snapshot = editor.update(cx, |e, cx| e.snapshot(cx)).unwrap();
@@ -7657,6 +7658,7 @@ async fn test_following(cx: &mut gpui::TestAppContext) {
},
|cx| cx.new_view(|cx| build_editor(buffer.clone(), cx)),
)
.unwrap()
});
let is_still_following = Rc::new(RefCell::new(true));

View File

@@ -8,6 +8,10 @@ keywords = ["zed", "extension"]
edition = "2021"
license = "Apache-2.0"
# Don't publish v0.0.7 until we're ready to commit to the breaking API changes
# Marshall is DRI on this.
publish = false
[lints]
workspace = true

View File

@@ -76,6 +76,7 @@ fn main() {
cx.open_window(options, |cx| {
cx.activate(false);
cx.new_view(|_cx| AnimationExample {})
});
})
.unwrap();
});
}

View File

@@ -34,6 +34,7 @@ fn main() {
text: "World".into(),
})
},
);
)
.unwrap();
});
}

View File

@@ -93,6 +93,7 @@ fn main() {
local_resource: Arc::new(PathBuf::from_str("examples/image/app-icon.png").unwrap()),
remote_resource: "https://picsum.photos/512/512".into(),
})
});
})
.unwrap();
});
}

View File

@@ -29,7 +29,8 @@ fn main() {
}]);
cx.open_window(WindowOptions::default(), |cx| {
cx.new_view(|_cx| SetMenus {})
});
})
.unwrap();
});
}

View File

@@ -61,7 +61,8 @@ fn main() {
cx.new_view(|_| WindowContent {
text: format!("{:?}", screen.id()).into(),
})
});
})
.unwrap();
}
});
}

View File

@@ -490,26 +490,26 @@ impl AppContext {
&mut self,
options: crate::WindowOptions,
build_root_view: impl FnOnce(&mut WindowContext) -> View<V>,
) -> WindowHandle<V> {
) -> anyhow::Result<WindowHandle<V>> {
self.update(|cx| {
let id = cx.windows.insert(None);
let handle = WindowHandle::new(id);
let mut window = Window::new(handle.into(), options, cx);
let root_view = build_root_view(&mut WindowContext::new(cx, &mut window));
window.root_view.replace(root_view.into());
cx.window_handles.insert(id, window.handle);
cx.windows.get_mut(id).unwrap().replace(window);
handle
match Window::new(handle.into(), options, cx) {
Ok(mut window) => {
let root_view = build_root_view(&mut WindowContext::new(cx, &mut window));
window.root_view.replace(root_view.into());
cx.window_handles.insert(id, window.handle);
cx.windows.get_mut(id).unwrap().replace(window);
Ok(handle)
}
Err(e) => {
cx.windows.remove(id);
return Err(e);
}
}
})
}
/// Returns Ok() if the platform supports opening windows.
/// This returns an error (for example) on Linux when we could
/// not establish a connection to X or Wayland.
pub fn can_open_windows(&self) -> anyhow::Result<()> {
self.platform.can_open_windows()
}
/// Instructs the platform to activate the application by bringing it to the foreground.
pub fn activate(&self, ignoring_other_apps: bool) {
self.platform.activate(ignoring_other_apps);

View File

@@ -151,7 +151,7 @@ impl AsyncAppContext {
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut lock = app.borrow_mut();
Ok(lock.open_window(options, build_root_view))
lock.open_window(options, build_root_view)
}
/// Schedule a future to be polled in the background.

View File

@@ -193,19 +193,22 @@ impl TestAppContext {
},
|cx| cx.new_view(build_window),
)
.unwrap()
}
/// Adds a new window with no content.
pub fn add_empty_window(&mut self) -> &mut VisualTestContext {
let mut cx = self.app.borrow_mut();
let bounds = Bounds::maximized(None, &mut cx);
let window = cx.open_window(
WindowOptions {
window_bounds: Some(WindowBounds::Windowed(bounds)),
..Default::default()
},
|cx| cx.new_view(|_| Empty),
);
let window = cx
.open_window(
WindowOptions {
window_bounds: Some(WindowBounds::Windowed(bounds)),
..Default::default()
},
|cx| cx.new_view(|_| Empty),
)
.unwrap();
drop(cx);
let cx = VisualTestContext::from_window(*window.deref(), self).as_mut();
cx.run_until_parked();
@@ -222,13 +225,15 @@ impl TestAppContext {
{
let mut cx = self.app.borrow_mut();
let bounds = Bounds::maximized(None, &mut cx);
let window = cx.open_window(
WindowOptions {
window_bounds: Some(WindowBounds::Windowed(bounds)),
..Default::default()
},
|cx| cx.new_view(build_root_view),
);
let window = cx
.open_window(
WindowOptions {
window_bounds: Some(WindowBounds::Windowed(bounds)),
..Default::default()
},
|cx| cx.new_view(build_root_view),
)
.unwrap();
drop(cx);
let view = window.root_view(self).unwrap();
let cx = VisualTestContext::from_window(*window.deref(), self).as_mut();

View File

@@ -486,6 +486,7 @@ mod test {
focus_handle: cx.focus_handle(),
})
})
.unwrap()
});
cx.update(|cx| {

View File

@@ -295,7 +295,7 @@ impl KeyBindingContextPredicate {
}
_ if is_identifier_char(next) => {
let len = source
.find(|c: char| !is_identifier_char(c))
.find(|c: char| !is_identifier_char(c) && !is_vim_operator_char(c))
.unwrap_or(source.len());
let (identifier, rest) = source.split_at(len);
source = skip_whitespace(rest);
@@ -356,7 +356,7 @@ fn is_identifier_char(c: char) -> bool {
}
fn is_vim_operator_char(c: char) -> bool {
c == '>' || c == '<'
c == '>' || c == '<' || c == '~'
}
fn skip_whitespace(source: &str) -> &str {

View File

@@ -106,14 +106,12 @@ pub(crate) trait Platform: 'static {
fn displays(&self) -> Vec<Rc<dyn PlatformDisplay>>;
fn primary_display(&self) -> Option<Rc<dyn PlatformDisplay>>;
fn active_window(&self) -> Option<AnyWindowHandle>;
fn can_open_windows(&self) -> anyhow::Result<()> {
Ok(())
}
fn open_window(
&self,
handle: AnyWindowHandle,
options: WindowParams,
) -> Box<dyn PlatformWindow>;
) -> anyhow::Result<Box<dyn PlatformWindow>>;
/// Returns the appearance of the application's windows.
fn window_appearance(&self) -> WindowAppearance;

View File

@@ -59,10 +59,6 @@ impl LinuxClient for HeadlessClient {
None
}
fn can_open_windows(&self) -> anyhow::Result<()> {
return Err(anyhow::anyhow!("neither DISPLAY, nor WAYLAND_DISPLAY found. You can still run zed for remote development with --dev-server-token."));
}
fn active_window(&self) -> Option<AnyWindowHandle> {
None
}
@@ -71,8 +67,10 @@ impl LinuxClient for HeadlessClient {
&self,
_handle: AnyWindowHandle,
_params: WindowParams,
) -> Box<dyn PlatformWindow> {
unimplemented!()
) -> anyhow::Result<Box<dyn PlatformWindow>> {
Err(anyhow::anyhow!(
"neither DISPLAY nor WAYLAND_DISPLAY is set. You can run in headless mode"
))
}
fn set_cursor_style(&self, _style: CursorStyle) {}

View File

@@ -65,7 +65,7 @@ pub trait LinuxClient {
&self,
handle: AnyWindowHandle,
options: WindowParams,
) -> Box<dyn PlatformWindow>;
) -> anyhow::Result<Box<dyn PlatformWindow>>;
fn set_cursor_style(&self, style: CursorStyle);
fn open_uri(&self, uri: &str);
fn write_to_primary(&self, item: ClipboardItem);
@@ -245,7 +245,7 @@ impl<P: LinuxClient + 'static> Platform for P {
&self,
handle: AnyWindowHandle,
options: WindowParams,
) -> Box<dyn PlatformWindow> {
) -> anyhow::Result<Box<dyn PlatformWindow>> {
self.open_window(handle, options)
}

View File

@@ -403,9 +403,14 @@ impl WaylandClient {
let handle = event_loop.handle();
handle
.insert_source(main_receiver, |event, _, _: &mut WaylandClientStatePtr| {
if let calloop::channel::Event::Msg(runnable) = event {
runnable.run();
.insert_source(main_receiver, {
let handle = handle.clone();
move |event, _, _: &mut WaylandClientStatePtr| {
if let calloop::channel::Event::Msg(runnable) = event {
handle.insert_idle(|_| {
runnable.run();
});
}
}
})
.unwrap();
@@ -554,7 +559,7 @@ impl LinuxClient for WaylandClient {
&self,
handle: AnyWindowHandle,
params: WindowParams,
) -> Box<dyn PlatformWindow> {
) -> anyhow::Result<Box<dyn PlatformWindow>> {
let mut state = self.0.borrow_mut();
let (window, surface_id) = WaylandWindow::new(
@@ -563,10 +568,10 @@ impl LinuxClient for WaylandClient {
WaylandClientStatePtr(Rc::downgrade(&self.0)),
params,
state.common.appearance,
);
)?;
state.windows.insert(surface_id, window.0.clone());
Box::new(window)
Ok(Box::new(window))
}
fn set_cursor_style(&self, style: CursorStyle) {

View File

@@ -107,7 +107,7 @@ impl WaylandWindowState {
client: WaylandClientStatePtr,
globals: Globals,
options: WindowParams,
) -> Self {
) -> anyhow::Result<Self> {
let bounds = options.bounds.map(|p| p.0 as u32);
let raw = RawWindow {
@@ -130,7 +130,7 @@ impl WaylandWindowState {
},
)
}
.unwrap(),
.map_err(|e| anyhow::anyhow!("{:?}", e))?,
);
let config = BladeSurfaceConfig {
size: gpu::Extent {
@@ -141,7 +141,7 @@ impl WaylandWindowState {
transparent: options.window_background != WindowBackgroundAppearance::Opaque,
};
Self {
Ok(Self {
xdg_surface,
acknowledged_first_configure: false,
surface,
@@ -164,7 +164,7 @@ impl WaylandWindowState {
appearance,
handle,
active: false,
}
})
}
}
@@ -224,7 +224,7 @@ impl WaylandWindow {
client: WaylandClientStatePtr,
params: WindowParams,
appearance: WindowAppearance,
) -> (Self, ObjectId) {
) -> anyhow::Result<(Self, ObjectId)> {
let surface = globals.compositor.create_surface(&globals.qh, ());
let xdg_surface = globals
.wm_base
@@ -267,14 +267,14 @@ impl WaylandWindow {
client,
globals,
params,
))),
)?)),
callbacks: Rc::new(RefCell::new(Callbacks::default())),
});
// Kick things off
surface.commit();
(this, surface.id())
Ok((this, surface.id()))
}
}

View File

@@ -5,7 +5,7 @@ use std::rc::{Rc, Weak};
use std::time::{Duration, Instant};
use calloop::generic::{FdWrapper, Generic};
use calloop::{channel, EventLoop, LoopHandle, RegistrationToken};
use calloop::{EventLoop, LoopHandle, RegistrationToken};
use collections::HashMap;
use copypasta::x11_clipboard::{Clipboard, Primary, X11ClipboardContext};
@@ -165,9 +165,17 @@ impl X11Client {
let handle = event_loop.handle();
handle
.insert_source(main_receiver, |event, _, _: &mut X11Client| {
if let calloop::channel::Event::Msg(runnable) = event {
runnable.run();
.insert_source(main_receiver, {
let handle = handle.clone();
move |event, _, _: &mut X11Client| {
if let calloop::channel::Event::Msg(runnable) = event {
// Insert the runnables as idle callbacks, so we make sure that user-input and X11
// events have higher priority and runnables are only worked off after the event
// callbacks.
handle.insert_idle(|_| {
runnable.run();
});
}
}
})
.unwrap();
@@ -274,11 +282,9 @@ impl X11Client {
let xcb_connection = Rc::new(xcb_connection);
let (xim_tx, xim_rx) = channel::channel::<XimCallbackEvent>();
let ximc = X11rbClient::init(Rc::clone(&xcb_connection), x_root_index, None).ok();
let xim_handler = if ximc.is_some() {
Some(XimHandler::new(xim_tx))
Some(XimHandler::new())
} else {
None
};
@@ -303,10 +309,12 @@ impl X11Client {
client.handle_event(event);
continue;
}
let mut ximc = state.ximc.take().unwrap();
let mut xim_handler = state.xim_handler.take().unwrap();
let xim_connected = xim_handler.connected;
drop(state);
let xim_filtered = match ximc.filter_event(&event, &mut xim_handler) {
Ok(handled) => handled,
Err(err) => {
@@ -314,46 +322,34 @@ impl X11Client {
false
}
};
let xim_callback_event = xim_handler.last_callback_event.take();
let mut state = client.0.borrow_mut();
state.ximc = Some(ximc);
state.xim_handler = Some(xim_handler);
drop(state);
if let Some(event) = xim_callback_event {
client.handle_xim_callback_event(event);
}
if xim_filtered {
continue;
}
if xim_connected {
client.xim_handle_event(event);
} else {
client.handle_event(event);
}
}
Ok(calloop::PostAction::Continue)
}
},
)
.expect("Failed to initialize x11 event source");
handle
.insert_source(xim_rx, {
move |chan_event, _, client| match chan_event {
channel::Event::Msg(xim_event) => {
match xim_event {
XimCallbackEvent::XimXEvent(event) => {
client.handle_event(event);
}
XimCallbackEvent::XimCommitEvent(window, text) => {
client.xim_handle_commit(window, text);
}
XimCallbackEvent::XimPreeditEvent(window, text) => {
client.xim_handle_preedit(window, text);
}
};
}
channel::Event::Closed => {
log::error!("XIM Event Sender dropped")
}
}
})
.expect("Failed to initialize XIM event source");
handle
.insert_source(XDPEventSource::new(&common.background_executor), {
move |event, _, client| match event {
@@ -793,6 +789,20 @@ impl X11Client {
Some(())
}
fn handle_xim_callback_event(&self, event: XimCallbackEvent) {
match event {
XimCallbackEvent::XimXEvent(event) => {
self.handle_event(event);
}
XimCallbackEvent::XimCommitEvent(window, text) => {
self.xim_handle_commit(window, text);
}
XimCallbackEvent::XimPreeditEvent(window, text) => {
self.xim_handle_preedit(window, text);
}
};
}
fn xim_handle_event(&self, event: Event) -> Option<()> {
match event {
Event::KeyPress(event) | Event::KeyRelease(event) => {
@@ -913,7 +923,7 @@ impl LinuxClient for X11Client {
&self,
handle: AnyWindowHandle,
params: WindowParams,
) -> Box<dyn PlatformWindow> {
) -> anyhow::Result<Box<dyn PlatformWindow>> {
let mut state = self.0.borrow_mut();
let x_window = state.xcb_connection.generate_id().unwrap();
@@ -928,7 +938,7 @@ impl LinuxClient for X11Client {
&state.atoms,
state.scale_factor,
state.common.appearance,
);
)?;
let screen_resources = state
.xcb_connection
@@ -996,7 +1006,7 @@ impl LinuxClient for X11Client {
};
state.windows.insert(x_window, window_ref);
Box::new(window)
Ok(Box::new(window))
}
fn set_cursor_style(&self, style: CursorStyle) {
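The comment in the hunk above explains the scheduling change: instead of running a runnable as soon as it arrives on the channel, it is re-queued as an idle callback so X11 input events are always handled first. A toy std-only event loop sketches that priority scheme (hypothetical types, not calloop's actual API):

```rust
use std::collections::VecDeque;

// Toy event loop: pending input events always drain before idle work,
// mirroring how `handle.insert_idle` defers runnables behind event callbacks.
struct EventLoop {
    events: VecDeque<String>,
    idle: VecDeque<Box<dyn FnOnce(&mut Vec<String>)>>,
}

impl EventLoop {
    fn new() -> Self {
        Self { events: VecDeque::new(), idle: VecDeque::new() }
    }

    fn push_event(&mut self, e: &str) {
        self.events.push_back(e.to_string());
    }

    // Instead of running work immediately when it arrives on the channel,
    // queue it as an idle callback.
    fn insert_idle(&mut self, f: impl FnOnce(&mut Vec<String>) + 'static) {
        self.idle.push_back(Box::new(f));
    }

    fn dispatch(&mut self, log: &mut Vec<String>) {
        // 1. All pending input events first.
        while let Some(e) = self.events.pop_front() {
            log.push(format!("event:{e}"));
        }
        // 2. Only then work off the deferred idle callbacks.
        while let Some(f) = self.idle.pop_front() {
            f(log);
        }
    }
}

fn main() {
    let mut el = EventLoop::new();
    let mut log = Vec::new();
    // The runnable arrives first, but the input event still wins.
    el.insert_idle(|log| log.push("runnable".to_string()));
    el.push_event("keypress");
    el.dispatch(&mut log);
    println!("{log:?}");
}
```

The point of the design is latency: under load, user input no longer queues behind a backlog of runnables.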

View File

@@ -216,7 +216,7 @@ impl X11WindowState {
atoms: &XcbAtoms,
scale_factor: f32,
appearance: WindowAppearance,
) -> Self {
) -> anyhow::Result<Self> {
let x_screen_index = params
.display_id
.map_or(x_main_screen_index, |did| did.0 as usize);
@@ -248,8 +248,7 @@ impl X11WindowState {
xcb_connection
.create_colormap(xproto::ColormapAlloc::NONE, id, visual_set.root, visual.id)
.unwrap()
.check()
.unwrap();
.check()?;
id
};
@@ -281,8 +280,7 @@ impl X11WindowState {
&win_aux,
)
.unwrap()
.check()
.unwrap();
.check()?;
if let Some(titlebar) = params.titlebar {
if let Some(title) = titlebar.title {
@@ -345,7 +343,7 @@ impl X11WindowState {
},
)
}
.unwrap(),
.map_err(|e| anyhow::anyhow!("{:?}", e))?,
);
let config = BladeSurfaceConfig {
@@ -355,7 +353,7 @@ impl X11WindowState {
transparent: params.window_background != WindowBackgroundAppearance::Opaque,
};
Self {
Ok(Self {
client,
executor,
display: Rc::new(X11Display::new(xcb_connection, x_screen_index).unwrap()),
@@ -368,7 +366,7 @@ impl X11WindowState {
input_handler: None,
appearance,
handle,
}
})
}
fn content_size(&self) -> Size<Pixels> {
@@ -426,8 +424,8 @@ impl X11Window {
atoms: &XcbAtoms,
scale_factor: f32,
appearance: WindowAppearance,
) -> Self {
Self(X11WindowStatePtr {
) -> anyhow::Result<Self> {
Ok(Self(X11WindowStatePtr {
state: Rc::new(RefCell::new(X11WindowState::new(
handle,
client,
@@ -439,11 +437,11 @@ impl X11Window {
atoms,
scale_factor,
appearance,
))),
)?)),
callbacks: Rc::new(RefCell::new(Callbacks::default())),
xcb_connection: xcb_connection.clone(),
x_window,
})
}))
}
fn set_wm_hints(&self, wm_hint_property_state: WmHintPropertyState, prop1: u32, prop2: u32) {

View File

@@ -1,7 +1,5 @@
use std::default::Default;
use calloop::channel;
use x11rb::protocol::{xproto, Event};
use xim::{AHashMap, AttributeName, Client, ClientError, ClientHandler, InputStyle};
@@ -14,19 +12,19 @@ pub enum XimCallbackEvent {
pub struct XimHandler {
pub im_id: u16,
pub ic_id: u16,
pub xim_tx: channel::Sender<XimCallbackEvent>,
pub connected: bool,
pub window: xproto::Window,
pub last_callback_event: Option<XimCallbackEvent>,
}
impl XimHandler {
pub fn new(xim_tx: channel::Sender<XimCallbackEvent>) -> Self {
pub fn new() -> Self {
Self {
im_id: Default::default(),
ic_id: Default::default(),
xim_tx,
connected: false,
window: Default::default(),
last_callback_event: None,
}
}
}
@@ -80,12 +78,10 @@ impl<C: Client<XEvent = xproto::KeyPressEvent>> ClientHandler<C> for XimHandler
_input_context_id: u16,
text: &str,
) -> Result<(), ClientError> {
self.xim_tx
.send(XimCallbackEvent::XimCommitEvent(
self.window,
String::from(text),
))
.ok();
self.last_callback_event = Some(XimCallbackEvent::XimCommitEvent(
self.window,
String::from(text),
));
Ok(())
}
@@ -99,14 +95,11 @@ impl<C: Client<XEvent = xproto::KeyPressEvent>> ClientHandler<C> for XimHandler
) -> Result<(), ClientError> {
match xev.response_type {
x11rb::protocol::xproto::KEY_PRESS_EVENT => {
self.xim_tx
.send(XimCallbackEvent::XimXEvent(Event::KeyPress(xev)))
.ok();
self.last_callback_event = Some(XimCallbackEvent::XimXEvent(Event::KeyPress(xev)));
}
x11rb::protocol::xproto::KEY_RELEASE_EVENT => {
self.xim_tx
.send(XimCallbackEvent::XimXEvent(Event::KeyRelease(xev)))
.ok();
self.last_callback_event =
Some(XimCallbackEvent::XimXEvent(Event::KeyRelease(xev)));
}
_ => {}
}
@@ -145,12 +138,10 @@ impl<C: Client<XEvent = xproto::KeyPressEvent>> ClientHandler<C> for XimHandler
// XIMPrimary, XIMHighlight, XIMSecondary, XIMTertiary are not specified,
// but interchangeable as above
// Currently there's no way to support these.
self.xim_tx
.send(XimCallbackEvent::XimPreeditEvent(
self.window,
String::from(preedit_string),
))
.ok();
self.last_callback_event = Some(XimCallbackEvent::XimPreeditEvent(
self.window,
String::from(preedit_string),
));
Ok(())
}
}
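The `XimHandler` change above swaps the channel sender for a `last_callback_event` field: the callback stores the event, and the event loop `take()`s it immediately after `filter_event` returns, cutting out the extra round-trip through a channel. A simplified std-only sketch of that store-then-take pattern (hypothetical types, not the real XIM handler):

```rust
// Hypothetical, simplified stand-ins for the XIM callback events.
#[derive(Debug, PartialEq)]
enum CallbackEvent {
    Commit(String),
    Preedit(String),
}

#[derive(Default)]
struct Handler {
    // Set synchronously inside the callback instead of sent over a channel.
    last_callback_event: Option<CallbackEvent>,
}

impl Handler {
    fn commit(&mut self, text: &str) {
        self.last_callback_event = Some(CallbackEvent::Commit(text.to_string()));
    }

    fn preedit(&mut self, text: &str) {
        self.last_callback_event = Some(CallbackEvent::Preedit(text.to_string()));
    }
}

fn main() {
    let mut handler = Handler::default();

    // filter_event(&event, &mut handler) would run the XIM callbacks
    // synchronously; simulate a commit callback firing:
    handler.commit("hello");

    // Right after filter_event returns, the caller takes the stored event
    // and dispatches it within the same event-loop iteration.
    if let Some(event) = handler.last_callback_event.take() {
        println!("dispatch {event:?}");
    }
    assert!(handler.last_callback_event.is_none());

    // Preedit updates follow the same path.
    handler.preedit("h");
}
```

Because `filter_event` invokes the callbacks synchronously, at most one event needs to be stashed, so a single `Option` replaces the channel.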

View File

@@ -187,14 +187,14 @@ impl Platform for TestPlatform {
&self,
handle: AnyWindowHandle,
params: WindowParams,
) -> Box<dyn crate::PlatformWindow> {
) -> anyhow::Result<Box<dyn crate::PlatformWindow>> {
let window = TestWindow::new(
handle,
params,
self.weak.clone(),
self.active_display.clone(),
);
Box::new(window)
Ok(Box::new(window))
}
fn window_appearance(&self) -> WindowAppearance {

View File

@@ -605,7 +605,7 @@ impl Window {
handle: AnyWindowHandle,
options: WindowOptions,
cx: &mut AppContext,
) -> Self {
) -> Result<Self> {
let WindowOptions {
window_bounds,
titlebar,
@@ -633,7 +633,7 @@ impl Window {
display_id,
window_background,
},
);
)?;
let display_id = platform_window.display().map(|display| display.id());
let sprite_atlas = platform_window.sprite_atlas();
let mouse_position = platform_window.mouse_position();
@@ -761,7 +761,7 @@ impl Window {
platform_window.set_app_id(&app_id);
}
Window {
Ok(Window {
handle,
removed: false,
platform_window,
@@ -807,7 +807,7 @@ impl Window {
focus_enabled: true,
pending_input: None,
prompt: None,
}
})
}
fn new_focus_listener(
&mut self,

View File

@@ -1,11 +1,9 @@
//! Provides conversion from rustdoc's HTML output to Markdown.
#![deny(missing_docs)]
mod html_element;
mod markdown;
pub mod markdown;
mod markdown_writer;
mod structure;
pub mod structure;
use std::io::Read;
@@ -19,24 +17,17 @@ use markup5ever_rcdom::RcDom;
use crate::markdown::{
HeadingHandler, ListHandler, ParagraphHandler, StyledTextHandler, TableHandler,
};
use crate::markdown_writer::{HandleTag, MarkdownWriter};
use crate::markdown_writer::MarkdownWriter;
pub use crate::markdown_writer::HandleTag;
/// Converts the provided HTML to Markdown.
pub fn convert_html_to_markdown(html: impl Read) -> Result<String> {
pub fn convert_html_to_markdown(
html: impl Read,
handlers: Vec<Box<dyn HandleTag>>,
) -> Result<String> {
let dom = parse_html(html).context("failed to parse HTML")?;
let handlers: Vec<Box<dyn HandleTag>> = vec![
Box::new(ParagraphHandler),
Box::new(HeadingHandler),
Box::new(ListHandler),
Box::new(TableHandler::new()),
Box::new(StyledTextHandler),
Box::new(structure::rustdoc::RustdocChromeRemover),
Box::new(structure::rustdoc::RustdocHeadingHandler),
Box::new(structure::rustdoc::RustdocCodeHandler),
Box::new(structure::rustdoc::RustdocItemHandler),
];
let markdown_writer = MarkdownWriter::new();
let markdown = markdown_writer
.run(&dom.document, handlers)
@@ -47,26 +38,20 @@ pub fn convert_html_to_markdown(html: impl Read) -> Result<String> {
/// Converts the provided rustdoc HTML to Markdown.
pub fn convert_rustdoc_to_markdown(html: impl Read) -> Result<String> {
let dom = parse_html(html).context("failed to parse rustdoc HTML")?;
let handlers: Vec<Box<dyn HandleTag>> = vec![
Box::new(ParagraphHandler),
Box::new(HeadingHandler),
Box::new(ListHandler),
Box::new(TableHandler::new()),
Box::new(StyledTextHandler),
Box::new(structure::rustdoc::RustdocChromeRemover),
Box::new(structure::rustdoc::RustdocHeadingHandler),
Box::new(structure::rustdoc::RustdocCodeHandler),
Box::new(structure::rustdoc::RustdocItemHandler),
];
let markdown_writer = MarkdownWriter::new();
let markdown = markdown_writer
.run(&dom.document, handlers)
.context("failed to convert rustdoc HTML to Markdown")?;
Ok(markdown)
convert_html_to_markdown(
html,
vec![
Box::new(ParagraphHandler),
Box::new(HeadingHandler),
Box::new(ListHandler),
Box::new(TableHandler::new()),
Box::new(StyledTextHandler),
Box::new(structure::rustdoc::RustdocChromeRemover),
Box::new(structure::rustdoc::RustdocHeadingHandler),
Box::new(structure::rustdoc::RustdocCodeHandler),
Box::new(structure::rustdoc::RustdocItemHandler),
],
)
}
fn parse_html(mut html: impl Read) -> Result<RcDom> {

View File

@@ -1,5 +1,5 @@
use crate::html_element::HtmlElement;
use crate::markdown_writer::{HandleTag, MarkdownWriter, StartTagOutcome};
use crate::markdown_writer::{HandleTag, HandlerOutcome, MarkdownWriter, StartTagOutcome};
pub struct ParagraphHandler;
@@ -214,3 +214,53 @@ impl HandleTag for StyledTextHandler {
}
}
}
pub struct CodeHandler;
impl HandleTag for CodeHandler {
fn should_handle(&self, tag: &str) -> bool {
match tag {
"pre" | "code" => true,
_ => false,
}
}
fn handle_tag_start(
&mut self,
tag: &HtmlElement,
writer: &mut MarkdownWriter,
) -> StartTagOutcome {
match tag.tag.as_str() {
"code" => {
if !writer.is_inside("pre") {
writer.push_str("`");
}
}
"pre" => writer.push_str("\n\n```\n"),
_ => {}
}
StartTagOutcome::Continue
}
fn handle_tag_end(&mut self, tag: &HtmlElement, writer: &mut MarkdownWriter) {
match tag.tag.as_str() {
"code" => {
if !writer.is_inside("pre") {
writer.push_str("`");
}
}
"pre" => writer.push_str("\n```\n"),
_ => {}
}
}
fn handle_text(&mut self, text: &str, writer: &mut MarkdownWriter) -> HandlerOutcome {
if writer.is_inside("pre") {
writer.push_str(&text);
return HandlerOutcome::Handled;
}
HandlerOutcome::NoOp
}
}
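The new `CodeHandler` above emits inline backticks for `<code>` outside `<pre>` and a fenced block when inside `<pre>`. A self-contained sketch of that nesting check, with a hypothetical minimal writer standing in for the crate's `MarkdownWriter`:

```rust
// Hypothetical minimal writer that tracks currently-open tags,
// mirroring MarkdownWriter::is_inside.
struct Writer {
    out: String,
    stack: Vec<String>,
}

impl Writer {
    fn new() -> Self {
        Self { out: String::new(), stack: Vec::new() }
    }

    fn is_inside(&self, tag: &str) -> bool {
        self.stack.iter().any(|t| t == tag)
    }

    fn start(&mut self, tag: &str) {
        match tag {
            // `<code>` outside `<pre>` becomes inline backticks...
            "code" if !self.is_inside("pre") => self.out.push('`'),
            // ...while `<pre>` opens a fenced block.
            "pre" => self.out.push_str("\n\n```\n"),
            _ => {}
        }
        self.stack.push(tag.to_string());
    }

    fn end(&mut self, tag: &str) {
        // Pop first so the check sees only the *enclosing* tags.
        self.stack.pop();
        match tag {
            "code" if !self.is_inside("pre") => self.out.push('`'),
            "pre" => self.out.push_str("\n```\n"),
            _ => {}
        }
    }

    fn text(&mut self, s: &str) {
        self.out.push_str(s);
    }
}

fn main() {
    let mut w = Writer::new();
    w.start("code");
    w.text("let x = 1;");
    w.end("code");
    println!("{}", w.out); // prints `let x = 1;` wrapped in backticks
}
```

The `is_inside("pre")` guard is what prevents `<pre><code>…</code></pre>` from producing stray backticks inside the fenced block.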

View File

@@ -162,7 +162,7 @@ impl MarkdownWriter {
}
let text = text
.trim_matches(|char| char == '\n' || char == '\r')
.trim_matches(|char| char == '\n' || char == '\r' || char == '\t')
.replace('\n', " ");
self.push_str(&text);

View File

@@ -1 +1,2 @@
pub mod rustdoc;
pub mod wikipedia;

View File

@@ -0,0 +1,180 @@
use crate::html_element::HtmlElement;
use crate::markdown_writer::{HandlerOutcome, MarkdownWriter, StartTagOutcome};
use crate::HandleTag;
pub struct WikipediaChromeRemover;
impl HandleTag for WikipediaChromeRemover {
fn should_handle(&self, _tag: &str) -> bool {
true
}
fn handle_tag_start(
&mut self,
tag: &HtmlElement,
_writer: &mut MarkdownWriter,
) -> StartTagOutcome {
match tag.tag.as_str() {
"head" | "script" | "style" | "nav" => return StartTagOutcome::Skip,
"sup" => {
if tag.has_class("reference") {
return StartTagOutcome::Skip;
}
}
"div" | "span" | "a" => {
if tag.attr("id").as_deref() == Some("p-lang-btn") {
return StartTagOutcome::Skip;
}
if tag.attr("id").as_deref() == Some("p-search") {
return StartTagOutcome::Skip;
}
let classes_to_skip = ["noprint", "mw-editsection", "mw-jump-link"];
if tag.has_any_classes(&classes_to_skip) {
return StartTagOutcome::Skip;
}
}
_ => {}
}
StartTagOutcome::Continue
}
}
pub struct WikipediaInfoboxHandler;
impl HandleTag for WikipediaInfoboxHandler {
fn should_handle(&self, tag: &str) -> bool {
tag == "table"
}
fn handle_tag_start(
&mut self,
tag: &HtmlElement,
_writer: &mut MarkdownWriter,
) -> StartTagOutcome {
match tag.tag.as_str() {
"table" => {
if tag.has_class("infobox") {
return StartTagOutcome::Skip;
}
}
_ => {}
}
StartTagOutcome::Continue
}
}
pub struct WikipediaCodeHandler {
language: Option<String>,
}
impl WikipediaCodeHandler {
pub fn new() -> Self {
Self { language: None }
}
}
impl HandleTag for WikipediaCodeHandler {
fn should_handle(&self, tag: &str) -> bool {
match tag {
"div" | "pre" | "code" => true,
_ => false,
}
}
fn handle_tag_start(
&mut self,
tag: &HtmlElement,
writer: &mut MarkdownWriter,
) -> StartTagOutcome {
match tag.tag.as_str() {
"code" => {
if !writer.is_inside("pre") {
writer.push_str("`");
}
}
"div" => {
let classes = tag.classes();
self.language = classes.iter().find_map(|class| {
if let Some((_, language)) = class.split_once("mw-highlight-lang-") {
Some(language.trim().to_owned())
} else {
None
}
});
}
"pre" => {
writer.push_blank_line();
writer.push_str("```");
if let Some(language) = self.language.take() {
writer.push_str(&language);
}
writer.push_newline();
}
_ => {}
}
StartTagOutcome::Continue
}
fn handle_tag_end(&mut self, tag: &HtmlElement, writer: &mut MarkdownWriter) {
match tag.tag.as_str() {
"code" => {
if !writer.is_inside("pre") {
writer.push_str("`");
}
}
"pre" => writer.push_str("\n```\n"),
_ => {}
}
}
fn handle_text(&mut self, text: &str, writer: &mut MarkdownWriter) -> HandlerOutcome {
if writer.is_inside("pre") {
writer.push_str(&text);
return HandlerOutcome::Handled;
}
HandlerOutcome::NoOp
}
}
#[cfg(test)]
mod tests {
use indoc::indoc;
use pretty_assertions::assert_eq;
use crate::{convert_html_to_markdown, markdown};
use super::*;
fn wikipedia_handlers() -> Vec<Box<dyn HandleTag>> {
vec![
Box::new(markdown::ParagraphHandler),
Box::new(markdown::HeadingHandler),
Box::new(markdown::ListHandler),
Box::new(markdown::StyledTextHandler),
Box::new(WikipediaChromeRemover),
]
}
#[test]
fn test_citation_references_get_removed() {
let html = indoc! {r##"
<p>Rust began as a personal project in 2006 by <a href="/wiki/Mozilla" title="Mozilla">Mozilla</a> Research employee Graydon Hoare.<sup id="cite_ref-MITTechReview_23-0" class="reference"><a href="#cite_note-MITTechReview-23">[20]</a></sup> Mozilla began sponsoring the project in 2009 as a part of the ongoing development of an experimental <a href="/wiki/Browser_engine" title="Browser engine">browser engine</a> called <a href="/wiki/Servo_(software)" title="Servo (software)">Servo</a>,<sup id="cite_ref-infoq2012_24-0" class="reference"><a href="#cite_note-infoq2012-24">[21]</a></sup> which was officially announced by Mozilla in 2010.<sup id="cite_ref-MattAsay_25-0" class="reference"><a href="#cite_note-MattAsay-25">[22]</a></sup><sup id="cite_ref-26" class="reference"><a href="#cite_note-26">[23]</a></sup> Rust's memory and ownership system was influenced by <a href="/wiki/Region-based_memory_management" title="Region-based memory management">region-based memory management</a> in languages such as <a href="/wiki/Cyclone_(programming_language)" title="Cyclone (programming language)">Cyclone</a> and ML Kit.<sup id="cite_ref-influences_8-13" class="reference"><a href="#cite_note-influences-8">[5]</a></sup>
</p>
"##};
let expected = indoc! {"
Rust began as a personal project in 2006 by Mozilla Research employee Graydon Hoare. Mozilla began sponsoring the project in 2009 as a part of the ongoing development of an experimental browser engine called Servo, which was officially announced by Mozilla in 2010. Rust's memory and ownership system was influenced by region-based memory management in languages such as Cyclone and ML Kit.
"}
.trim();
assert_eq!(
convert_html_to_markdown(html.as_bytes(), wikipedia_handlers()).unwrap(),
expected
)
}
}

View File

@@ -35,12 +35,12 @@ impl super::LspAdapter for CLspAdapter {
.and_then(|s| s.binary.clone())
});
if let Ok(Some(BinarySettings {
path: Some(path),
arguments,
})) = configured_binary
{
Some(LanguageServerBinary {
match configured_binary {
Ok(Some(BinarySettings {
path: Some(path),
arguments,
..
})) => Some(LanguageServerBinary {
path: path.into(),
arguments: arguments
.unwrap_or_default()
@@ -48,15 +48,20 @@ impl super::LspAdapter for CLspAdapter {
.map(|arg| arg.into())
.collect(),
env: None,
})
} else {
let env = delegate.shell_env().await;
let path = delegate.which(Self::SERVER_NAME.as_ref()).await?;
Some(LanguageServerBinary {
path,
arguments: vec![],
env: Some(env),
})
}),
Ok(Some(BinarySettings {
path_lookup: Some(false),
..
})) => None,
_ => {
let env = delegate.shell_env().await;
let path = delegate.which(Self::SERVER_NAME.as_ref()).await?;
Some(LanguageServerBinary {
path,
arguments: vec![],
env: Some(env),
})
}
}
}
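The clangd hunk above turns the `if let` into a three-arm `match`: an explicit `path` always wins, `path_lookup: Some(false)` opts out of lookup entirely, and anything else falls back to a `which`-style search. A std-only sketch of that decision logic, with a simplified hypothetical `BinarySettings`:

```rust
// Hypothetical, simplified version of the settings struct from the diff.
#[derive(Default)]
struct BinarySettings {
    path: Option<String>,
    arguments: Option<Vec<String>>,
    path_lookup: Option<bool>,
}

fn resolve_binary(
    configured: Option<BinarySettings>,
    which: impl Fn(&str) -> Option<String>,
) -> Option<(String, Vec<String>)> {
    match configured {
        // 1. An explicitly configured path always wins.
        Some(BinarySettings { path: Some(path), arguments, .. }) => {
            Some((path, arguments.unwrap_or_default()))
        }
        // 2. `path_lookup: false` disables searching $PATH entirely.
        Some(BinarySettings { path_lookup: Some(false), .. }) => None,
        // 3. Otherwise fall back to a `which`-style lookup.
        _ => which("clangd").map(|path| (path, vec![])),
    }
}

fn main() {
    let fake_which = |_: &str| Some("/usr/bin/clangd".to_string());

    let explicit = BinarySettings {
        path: Some("/opt/clangd".into()),
        ..Default::default()
    };
    assert_eq!(
        resolve_binary(Some(explicit), fake_which),
        Some(("/opt/clangd".into(), vec![]))
    );

    let disabled = BinarySettings { path_lookup: Some(false), ..Default::default() };
    assert_eq!(resolve_binary(Some(disabled), fake_which), None);

    assert_eq!(
        resolve_binary(None, fake_which),
        Some(("/usr/bin/clangd".into(), vec![]))
    );
}
```

The `..` rest-pattern in the first arm is what lets the struct grow a `path_lookup` field without breaking the existing destructuring.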

View File

@@ -91,6 +91,7 @@
"volatile"
"while"
(primitive_type)
(sized_type_specifier)
(type_qualifier)
] @keyword

View File

@@ -78,12 +78,12 @@ impl super::LspAdapter for GoLspAdapter {
.and_then(|s| s.binary.clone())
});
if let Ok(Some(BinarySettings {
path: Some(path),
arguments,
})) = configured_binary
{
Some(LanguageServerBinary {
match configured_binary {
Ok(Some(BinarySettings {
path: Some(path),
arguments,
..
})) => Some(LanguageServerBinary {
path: path.into(),
arguments: arguments
.unwrap_or_default()
@@ -91,15 +91,20 @@ impl super::LspAdapter for GoLspAdapter {
.map(|arg| arg.into())
.collect(),
env: None,
})
} else {
let env = delegate.shell_env().await;
let path = delegate.which(Self::SERVER_NAME.as_ref()).await?;
Some(LanguageServerBinary {
path,
arguments: server_binary_arguments(),
env: Some(env),
})
}),
Ok(Some(BinarySettings {
path_lookup: Some(false),
..
})) => None,
_ => {
let env = delegate.shell_env().await;
let path = delegate.which(Self::SERVER_NAME.as_ref()).await?;
Some(LanguageServerBinary {
path,
arguments: server_binary_arguments(),
env: Some(env),
})
}
}
}

View File

@@ -25,6 +25,8 @@ const SERVER_PATH: &str = "node_modules/vscode-json-languageserver/bin/vscode-js
// Origin: https://github.com/SchemaStore/schemastore
const TSCONFIG_SCHEMA: &str = include_str!("json/schemas/tsconfig.json");
const PACKAGE_JSON_SCHEMA: &str = include_str!("json/schemas/package.json");
pub(super) fn json_task_context() -> ContextProviderWithTasks {
ContextProviderWithTasks::new(TaskTemplates(vec![
TaskTemplate {
@@ -78,6 +80,8 @@ impl JsonLspAdapter {
);
let tasks_schema = task::TaskTemplates::generate_json_schema();
let tsconfig_schema = serde_json::Value::from_str(TSCONFIG_SCHEMA).unwrap();
let package_json_schema = serde_json::Value::from_str(PACKAGE_JSON_SCHEMA).unwrap();
serde_json::json!({
"json": {
"format": {
@@ -88,6 +92,10 @@ impl JsonLspAdapter {
"fileMatch": ["tsconfig.json"],
"schema":tsconfig_schema
},
{
"fileMatch": ["package.json"],
"schema":package_json_schema
},
{
"fileMatch": [
schema_file_match(&paths::SETTINGS),

View File

@@ -0,0 +1,864 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "JSON schema for NPM package.json files",
"definitions": {
"person": {
"description": "A person who has been involved in creating or maintaining this package.",
"type": [
"object",
"string"
],
"required": [
"name"
],
"properties": {
"name": {
"type": "string"
},
"url": {
"type": "string",
"format": "uri"
},
"email": {
"type": "string",
"format": "email"
}
}
},
"dependency": {
"description": "Dependencies are specified with a simple hash of package name to version range. The version range is a string which has one or more space-separated descriptors. Dependencies can also be identified with a tarball or git URL.",
"type": "object",
"additionalProperties": {
"type": "string"
}
},
"license": {
"anyOf": [
{
"type": "string"
},
{
"enum": [
"AGPL-3.0-only",
"Apache-2.0",
"BSD-2-Clause",
"BSD-3-Clause",
"BSL-1.0",
"CC0-1.0",
"CDDL-1.0",
"CDDL-1.1",
"EPL-1.0",
"EPL-2.0",
"GPL-2.0-only",
"GPL-3.0-only",
"ISC",
"LGPL-2.0-only",
"LGPL-2.1-only",
"LGPL-2.1-or-later",
"LGPL-3.0-only",
"LGPL-3.0-or-later",
"MIT",
"MPL-2.0",
"MS-PL",
"UNLICENSED"
]
}
]
},
"scriptsInstallAfter": {
"description": "Run AFTER the package is installed.",
"type": "string",
"x-intellij-language-injection": "Shell Script"
},
"scriptsPublishAfter": {
"description": "Run AFTER the package is published.",
"type": "string",
"x-intellij-language-injection": "Shell Script"
},
"scriptsRestart": {
"description": "Run by the 'npm restart' command. Note: 'npm restart' will run the stop and start scripts if no restart script is provided.",
"type": "string",
"x-intellij-language-injection": "Shell Script"
},
"scriptsStart": {
"description": "Run by the 'npm start' command.",
"type": "string",
"x-intellij-language-injection": "Shell Script"
},
"scriptsStop": {
"description": "Run by the 'npm stop' command.",
"type": "string",
"x-intellij-language-injection": "Shell Script"
},
"scriptsTest": {
"description": "Run by the 'npm test' command.",
"type": "string",
"x-intellij-language-injection": "Shell Script"
},
"scriptsUninstallBefore": {
"description": "Run BEFORE the package is uninstalled.",
"type": "string",
"x-intellij-language-injection": "Shell Script"
},
"scriptsVersionBefore": {
"description": "Run BEFORE bump the package version.",
"type": "string",
"x-intellij-language-injection": "Shell Script"
},
"packageExportsEntryPath": {
"type": [
"string",
"null"
],
"description": "The module path that is resolved when this specifier is imported. Set to `null` to disallow importing this module.",
"pattern": "^\\./"
},
"packageExportsEntryObject": {
"type": "object",
"description": "Used to specify conditional exports, note that Conditional exports are unsupported in older environments, so it's recommended to use the fallback array option if support for those environments is a concern.",
"properties": {
"require": {
"$ref": "#/definitions/packageExportsEntryOrFallback",
"description": "The module path that is resolved when this specifier is imported as a CommonJS module using the `require(...)` function."
},
"import": {
"$ref": "#/definitions/packageExportsEntryOrFallback",
"description": "The module path that is resolved when this specifier is imported as an ECMAScript module using an `import` declaration or the dynamic `import(...)` function."
},
"node": {
"$ref": "#/definitions/packageExportsEntryOrFallback",
"description": "The module path that is resolved when this environment is Node.js."
},
"default": {
"$ref": "#/definitions/packageExportsEntryOrFallback",
"description": "The module path that is resolved when no other export type matches."
},
"types": {
"$ref": "#/definitions/packageExportsEntryOrFallback",
"description": "The module path that is resolved for TypeScript types when this specifier is imported. Should be listed before other conditions."
}
},
"patternProperties": {
"^(?![\\.0-9]).": {
"$ref": "#/definitions/packageExportsEntryOrFallback",
"description": "The module path that is resolved when this environment matches the property name."
}
},
"additionalProperties": false
},
"packageExportsEntry": {
"oneOf": [
{
"$ref": "#/definitions/packageExportsEntryPath"
},
{
"$ref": "#/definitions/packageExportsEntryObject"
}
]
},
"packageExportsFallback": {
"type": "array",
"description": "Used to allow fallbacks in case this environment doesn't support the preceding entries.",
"items": {
"$ref": "#/definitions/packageExportsEntry"
}
},
"packageExportsEntryOrFallback": {
"oneOf": [
{
"$ref": "#/definitions/packageExportsEntry"
},
{
"$ref": "#/definitions/packageExportsFallback"
}
]
},
"fundingUrl": {
"type": "string",
"format": "uri",
"description": "URL to a website with details about how to fund the package."
},
"fundingWay": {
"type": "object",
"description": "Used to inform about ways to help fund development of the package.",
"properties": {
"url": {
"$ref": "#/definitions/fundingUrl"
},
"type": {
"type": "string",
"description": "The type of funding or the platform through which funding can be provided, e.g. patreon, opencollective, tidelift or github."
}
},
"additionalProperties": false,
"required": [
"url"
]
}
},
"type": "object",
"patternProperties": {
"^_": {
"description": "Any property starting with _ is valid.",
"tsType": "any"
}
},
"properties": {
"name": {
"description": "The name of the package.",
"type": "string",
"maxLength": 214,
"minLength": 1,
"pattern": "^(?:(?:@(?:[a-z0-9-*~][a-z0-9-*._~]*)?/[a-z0-9-._~])|[a-z0-9-~])[a-z0-9-._~]*$"
},
"version": {
"description": "Version must be parseable by node-semver, which is bundled with npm as a dependency.",
"type": "string"
},
"description": {
"description": "This helps people discover your package, as it's listed in 'npm search'.",
"type": "string"
},
"keywords": {
"description": "This helps people discover your package as it's listed in 'npm search'.",
"type": "array",
"items": {
"type": "string"
}
},
"homepage": {
"description": "The url to the project homepage.",
"type": "string"
},
"bugs": {
"description": "The url to your project's issue tracker and / or the email address to which issues should be reported. These are helpful for people who encounter issues with your package.",
"type": [
"object",
"string"
],
"properties": {
"url": {
"type": "string",
"description": "The url to your project's issue tracker.",
"format": "uri"
},
"email": {
"type": "string",
"description": "The email address to which issues should be reported.",
"format": "email"
}
}
},
"license": {
"$ref": "#/definitions/license",
"description": "You should specify a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it."
},
"licenses": {
"description": "DEPRECATED: Instead, use SPDX expressions, like this: { \"license\": \"ISC\" } or { \"license\": \"(MIT OR Apache-2.0)\" } see: 'https://docs.npmjs.com/files/package.json#license'.",
"type": "array",
"items": {
"type": "object",
"properties": {
"type": {
"$ref": "#/definitions/license"
},
"url": {
"type": "string",
"format": "uri"
}
}
}
},
"author": {
"$ref": "#/definitions/person"
},
"contributors": {
"description": "A list of people who contributed to this package.",
"type": "array",
"items": {
"$ref": "#/definitions/person"
}
},
"maintainers": {
"description": "A list of people who maintains this package.",
"type": "array",
"items": {
"$ref": "#/definitions/person"
}
},
"files": {
"description": "The 'files' field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder.",
"type": "array",
"items": {
"type": "string"
}
},
"main": {
"description": "The main field is a module ID that is the primary entry point to your program.",
"type": "string"
},
"exports": {
"description": "The \"exports\" field is used to restrict external access to non-exported module files, also enables a module to import itself using \"name\".",
"oneOf": [
{
"$ref": "#/definitions/packageExportsEntryPath",
"description": "The module path that is resolved when the module specifier matches \"name\", shadows the \"main\" field."
},
{
"type": "object",
"properties": {
".": {
"$ref": "#/definitions/packageExportsEntryOrFallback",
"description": "The module path that is resolved when the module specifier matches \"name\", shadows the \"main\" field."
}
},
"patternProperties": {
"^\\./.+": {
"$ref": "#/definitions/packageExportsEntryOrFallback",
"description": "The module path prefix that is resolved when the module specifier starts with \"name/\", set to \"./*\" to allow external modules to import any subpath."
}
},
"additionalProperties": false
},
{
"$ref": "#/definitions/packageExportsEntryObject",
"description": "The module path that is resolved when the module specifier matches \"name\", shadows the \"main\" field."
},
{
"$ref": "#/definitions/packageExportsFallback",
"description": "The module path that is resolved when the module specifier matches \"name\", shadows the \"main\" field."
}
]
},
"bin": {
"type": [
"string",
"object"
],
"additionalProperties": {
"type": "string"
}
},
"type": {
"description": "When set to \"module\", the type field allows a package to specify all .js files within are ES modules. If the \"type\" field is omitted or set to \"commonjs\", all .js files are treated as CommonJS.",
"type": "string",
"enum": [
"commonjs",
"module"
],
"default": "commonjs"
},
"types": {
"description": "Set the types property to point to your bundled declaration file.",
"type": "string"
},
"typings": {
"description": "Note that the \"typings\" field is synonymous with \"types\", and could be used as well.",
"type": "string"
},
"typesVersions": {
"description": "The \"typesVersions\" field is used since TypeScript 3.1 to support features that were only made available in newer TypeScript versions.",
"type": "object",
"additionalProperties": {
"description": "Contains overrides for the TypeScript version that matches the version range matching the property key.",
"type": "object",
"properties": {
"*": {
"description": "Maps all file paths to the file paths specified in the array.",
"type": "array",
"items": {
"type": "string",
"pattern": "^[^*]*(?:\\*[^*]*)?$"
}
}
},
"patternProperties": {
"^[^*]+$": {
"description": "Maps the file path matching the property key to the file paths specified in the array.",
"type": "array",
"items": {
"type": "string"
}
},
"^[^*]*\\*[^*]*$": {
"description": "Maps file paths matching the pattern specified in property key to file paths specified in the array.",
"type": "array",
"items": {
"type": "string",
"pattern": "^[^*]*(?:\\*[^*]*)?$"
}
}
},
"additionalProperties": false
}
},
"man": {
"type": [
"array",
"string"
],
"description": "Specify either a single file or an array of filenames to put in place for the man program to find.",
"items": {
"type": "string"
}
},
"directories": {
"type": "object",
"properties": {
"bin": {
"description": "If you specify a 'bin' directory, then all the files in that folder will be used as the 'bin' hash.",
"type": "string"
},
"doc": {
"description": "Put markdown files in here. Eventually, these will be displayed nicely, maybe, someday.",
"type": "string"
},
"example": {
"description": "Put example scripts in here. Someday, it might be exposed in some clever way.",
"type": "string"
},
"lib": {
"description": "Tell people where the bulk of your library is. Nothing special is done with the lib folder in any way, but it's useful meta info.",
"type": "string"
},
"man": {
"description": "A folder that is full of man pages. Sugar to generate a 'man' array by walking the folder.",
"type": "string"
},
"test": {
"type": "string"
}
}
},
"repository": {
"description": "Specify the place where your code lives. This is helpful for people who want to contribute.",
"type": [
"object",
"string"
],
"properties": {
"type": {
"type": "string"
},
"url": {
"type": "string"
},
"directory": {
"type": "string"
}
}
},
"funding": {
"oneOf": [
{
"$ref": "#/definitions/fundingUrl"
},
{
"$ref": "#/definitions/fundingWay"
},
{
"type": "array",
"items": {
"oneOf": [
{
"$ref": "#/definitions/fundingUrl"
},
{
"$ref": "#/definitions/fundingWay"
}
]
},
"minItems": 1,
"uniqueItems": true
}
]
},
"scripts": {
"description": "The 'scripts' member is an object hash of script commands that are run at various times in the lifecycle of your package. The key is the lifecycle event, and the value is the command to run at that point.",
"type": "object",
"properties": {
"lint": {
"type": "string",
"description": "Run code quality tools such as ESLint or TSLint."
},
"prepublish": {
"type": "string",
"description": "Run BEFORE the package is published (Also run on local npm install without any arguments)."
},
"prepare": {
"type": "string",
"description": "Run both BEFORE the package is packed and published, and on local npm install without any arguments. This is run AFTER prepublish, but BEFORE prepublishOnly."
},
"prepublishOnly": {
"type": "string",
"description": "Run BEFORE the package is prepared and packed, ONLY on npm publish."
},
"prepack": {
"type": "string",
"description": "Run BEFORE a tarball is packed (on npm pack, npm publish, and when installing git dependencies)."
},
"postpack": {
"type": "string",
"description": "Run AFTER the tarball has been generated and moved to its final destination."
},
"publish": {
"type": "string",
"description": "Publishes a package to the registry so that it can be installed by name. See https://docs.npmjs.com/cli/v8/commands/npm-publish"
},
"postpublish": {
"$ref": "#/definitions/scriptsPublishAfter"
},
"preinstall": {
"type": "string",
"description": "Run BEFORE the package is installed."
},
"install": {
"$ref": "#/definitions/scriptsInstallAfter"
},
"postinstall": {
"$ref": "#/definitions/scriptsInstallAfter"
},
"preuninstall": {
"$ref": "#/definitions/scriptsUninstallBefore"
},
"uninstall": {
"$ref": "#/definitions/scriptsUninstallBefore"
},
"postuninstall": {
"type": "string",
"description": "Run AFTER the package is uninstalled."
},
"preversion": {
"$ref": "#/definitions/scriptsVersionBefore"
},
"version": {
"$ref": "#/definitions/scriptsVersionBefore"
},
"postversion": {
"type": "string",
"description": "Run AFTER the package version is bumped."
},
"pretest": {
"$ref": "#/definitions/scriptsTest"
},
"test": {
"$ref": "#/definitions/scriptsTest"
},
"posttest": {
"$ref": "#/definitions/scriptsTest"
},
"prestop": {
"$ref": "#/definitions/scriptsStop"
},
"stop": {
"$ref": "#/definitions/scriptsStop"
},
"poststop": {
"$ref": "#/definitions/scriptsStop"
},
"prestart": {
"$ref": "#/definitions/scriptsStart"
},
"start": {
"$ref": "#/definitions/scriptsStart"
},
"poststart": {
"$ref": "#/definitions/scriptsStart"
},
"prerestart": {
"$ref": "#/definitions/scriptsRestart"
},
"restart": {
"$ref": "#/definitions/scriptsRestart"
},
"postrestart": {
"$ref": "#/definitions/scriptsRestart"
},
"serve": {
"type": "string",
"description": "Start a dev server to serve application files."
}
},
"additionalProperties": {
"type": "string",
"tsType": "string | undefined",
"x-intellij-language-injection": "Shell Script"
}
},
"config": {
"description": "A 'config' hash can be used to set configuration parameters used in package scripts that persist across upgrades.",
"type": "object",
"additionalProperties": true
},
"dependencies": {
"$ref": "#/definitions/dependency"
},
"devDependencies": {
"$ref": "#/definitions/dependency"
},
"optionalDependencies": {
"$ref": "#/definitions/dependency"
},
"peerDependencies": {
"$ref": "#/definitions/dependency"
},
"peerDependenciesMeta": {
"description": "When a user installs your package, warnings are emitted if packages specified in \"peerDependencies\" are not already installed. The \"peerDependenciesMeta\" field serves to provide more information on how your peer dependencies are utilized. Most commonly, it allows peer dependencies to be marked as optional. Metadata for this field is specified with a simple hash of the package name to a metadata object.",
"type": "object",
"additionalProperties": {
"type": "object",
"additionalProperties": true,
"properties": {
"optional": {
"description": "Specifies that this peer dependency is optional and should not be installed automatically.",
"type": "boolean"
}
}
}
},
"bundledDependencies": {
"description": "Array of package names that will be bundled when publishing the package.",
"oneOf": [
{
"type": "array",
"items": {
"type": "string"
}
},
{
"type": "boolean"
}
]
},
"bundleDependencies": {
"description": "DEPRECATED: This field is honored, but \"bundledDependencies\" is the correct field name.",
"oneOf": [
{
"type": "array",
"items": {
"type": "string"
}
},
{
"type": "boolean"
}
]
},
"resolutions": {
"description": "Resolutions is used to support selective version resolutions using yarn, which lets you define custom package versions or ranges inside your dependencies. For npm, use overrides instead. See: https://classic.yarnpkg.com/en/docs/selective-version-resolutions",
"type": "object"
},
"overrides": {
"description": "Overrides is used to support selective version overrides using npm, which lets you define custom package versions or ranges inside your dependencies. For yarn, use resolutions instead. See: https://docs.npmjs.com/cli/v9/configuring-npm/package-json#overrides",
"type": "object"
},
"packageManager": {
"description": "Defines which package manager is expected to be used when working on the current project. This field is currently experimental and needs to be opted-in; see https://nodejs.org/api/corepack.html",
"type": "string",
"pattern": "(npm|pnpm|yarn|bun)@\\d+\\.\\d+\\.\\d+(-.+)?"
},
"engines": {
"type": "object",
"properties": {
"node": {
"type": "string"
}
},
"additionalProperties": {
"type": "string"
}
},
"volta": {
"description": "Defines which tools and versions are expected to be used when Volta is installed.",
"type": "object",
"properties": {
"extends": {
"description": "The value of that entry should be a path to another JSON file which also has a \"volta\" section",
"type": "string"
}
},
"patternProperties": {
"(node|npm|pnpm|yarn)": {
"type": "string"
}
}
},
"engineStrict": {
"type": "boolean"
},
"os": {
"description": "Specify which operating systems your module will run on.",
"type": "array",
"items": {
"type": "string"
}
},
"cpu": {
"description": "Specify that your code only runs on certain CPU architectures.",
"type": "array",
"items": {
"type": "string"
}
},
"preferGlobal": {
"type": "boolean",
"description": "DEPRECATED: This option used to trigger an npm warning, but it will no longer warn. It is purely there for informational purposes. It is now recommended that you install any binaries as local devDependencies wherever possible."
},
"private": {
"description": "If set to true, then npm will refuse to publish it.",
"oneOf": [
{
"type": "boolean"
},
{
"enum": [
"false",
"true"
]
}
]
},
"publishConfig": {
"type": "object",
"properties": {
"access": {
"type": "string",
"enum": [
"public",
"restricted"
]
},
"tag": {
"type": "string"
},
"registry": {
"type": "string",
"format": "uri"
}
},
"additionalProperties": true
},
"dist": {
"type": "object",
"properties": {
"shasum": {
"type": "string"
},
"tarball": {
"type": "string"
}
}
},
"readme": {
"type": "string"
},
"module": {
"description": "An ECMAScript module ID that is the primary entry point to your program.",
"type": "string"
},
"esnext": {
"description": "A module ID with untranspiled code that is the primary entry point to your program.",
"type": [
"string",
"object"
],
"properties": {
"main": {
"type": "string"
},
"browser": {
"type": "string"
}
},
"additionalProperties": {
"type": "string"
}
},
"workspaces": {
"description": "Allows packages within a directory to depend on one another using direct linking of local files. Additionally, dependencies within a workspace are hoisted to the workspace root when possible to reduce duplication. Note: It's also a good idea to set \"private\" to true when using this feature.",
"anyOf": [
{
"type": "array",
"description": "Workspace package paths. Glob patterns are supported.",
"items": {
"type": "string"
}
},
{
"type": "object",
"properties": {
"packages": {
"type": "array",
"description": "Workspace package paths. Glob patterns are supported.",
"items": {
"type": "string"
}
},
"nohoist": {
"type": "array",
"description": "Packages to block from hoisting to the workspace root. Currently supported in Yarn only.",
"items": {
"type": "string"
}
}
}
}
]
},
"jspm": {
"$ref": "#"
},
"eslintConfig": {
"$ref": "https://json.schemastore.org/eslintrc.json"
},
"prettier": {
"$ref": "https://json.schemastore.org/prettierrc.json"
},
"stylelint": {
"$ref": "https://json.schemastore.org/stylelintrc.json"
},
"ava": {
"$ref": "https://json.schemastore.org/ava.json"
},
"release": {
"$ref": "https://json.schemastore.org/semantic-release.json"
},
"jscpd": {
"$ref": "https://json.schemastore.org/jscpd.json"
}
},
"anyOf": [
{
"type": "object",
"not": {
"required": [
"bundledDependencies",
"bundleDependencies"
]
}
},
{
"type": "object",
"not": {
"required": [
"bundleDependencies"
]
},
"required": [
"bundledDependencies"
]
},
{
"type": "object",
"not": {
"required": [
"bundledDependencies"
]
},
"required": [
"bundleDependencies"
]
}
],
"$id": "https://json.schemastore.org/package.json"
}
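The schema above is easier to read alongside a concrete manifest. The following package.json sketch is purely illustrative (the package name, versions, and paths are invented) and exercises several of the fields defined above: "type", "exports" with a subpath pattern, "packageManager", "peerDependenciesMeta", and "workspaces":

```json
{
  "name": "example-pkg",
  "version": "1.0.0",
  "type": "module",
  "main": "./dist/index.js",
  "exports": {
    ".": "./dist/index.js",
    "./utils/*": "./dist/utils/*.js"
  },
  "packageManager": "pnpm@9.1.0",
  "peerDependencies": {
    "react": ">=18"
  },
  "peerDependenciesMeta": {
    "react": {
      "optional": true
    }
  },
  "workspaces": ["packages/*"],
  "private": true
}
```

Note that "packageManager" must match the schema's pattern `(npm|pnpm|yarn|bun)@\d+\.\d+\.\d+(-.+)?`, and that setting "private": true is recommended whenever "workspaces" is used, per the description above.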

View File

@@ -7,7 +7,7 @@ use http::github::{latest_github_release, GitHubLspBinaryVersion};
pub use language::*;
use lazy_static::lazy_static;
use lsp::LanguageServerBinary;
use project::project_settings::ProjectSettings;
use project::project_settings::{BinarySettings, ProjectSettings};
use regex::Regex;
use settings::Settings;
use smol::fs::{self, File};
@@ -35,29 +35,50 @@ impl LspAdapter for RustLspAdapter {
async fn check_if_user_installed(
&self,
_delegate: &dyn LspAdapterDelegate,
delegate: &dyn LspAdapterDelegate,
cx: &AsyncAppContext,
) -> Option<LanguageServerBinary> {
let binary = cx
.update(|cx| {
ProjectSettings::get_global(cx)
.lsp
.get(Self::SERVER_NAME)
.and_then(|s| s.binary.clone())
})
.ok()??;
let configured_binary = cx.update(|cx| {
ProjectSettings::get_global(cx)
.lsp
.get(Self::SERVER_NAME)
.and_then(|s| s.binary.clone())
});
let path = binary.path?;
Some(LanguageServerBinary {
path: path.into(),
arguments: binary
.arguments
.unwrap_or_default()
.iter()
.map(|arg| arg.into())
.collect(),
env: None,
})
match configured_binary {
Ok(Some(BinarySettings {
path,
arguments,
path_lookup,
})) => {
let (path, env) = match (path, path_lookup) {
(Some(path), lookup) => {
if lookup.is_some() {
log::warn!(
"Both `path` and `path_lookup` are set, ignoring `path_lookup`"
);
}
(Some(path.into()), None)
}
(None, Some(true)) => {
let path = delegate.which(Self::SERVER_NAME.as_ref()).await?;
let env = delegate.shell_env().await;
(Some(path), Some(env))
}
(None, Some(false)) | (None, None) => (None, None),
};
path.map(|path| LanguageServerBinary {
path,
arguments: arguments
.unwrap_or_default()
.iter()
.map(|arg| arg.into())
.collect(),
env,
})
}
_ => None,
}
}
async fn fetch_latest_server_version(

View File

@@ -147,7 +147,8 @@ pub fn main() {
cx,
)
})
});
})
.unwrap();
});
}

View File

@@ -27,6 +27,7 @@ use futures::{
oneshot,
},
future::{join_all, try_join_all, Shared},
prelude::future::BoxFuture,
select,
stream::FuturesUnordered,
AsyncWriteExt, Future, FutureExt, StreamExt, TryFutureExt,
@@ -38,6 +39,7 @@ use gpui::{
AnyModel, AppContext, AsyncAppContext, BackgroundExecutor, BorrowAppContext, Context, Entity,
EventEmitter, Model, ModelContext, PromptLevel, SharedString, Task, WeakModel, WindowContext,
};
use http::{HttpClient, Url};
use itertools::Itertools;
use language::{
language_settings::{language_settings, FormatOnSave, Formatter, InlayHintKind},
@@ -66,19 +68,16 @@ use postage::watch;
use prettier_support::{DefaultPrettier, PrettierInstance};
use project_settings::{LspSettings, ProjectSettings};
use rand::prelude::*;
use search_history::SearchHistory;
use snippet::Snippet;
use worktree::{CreatedEntry, LocalSnapshot};
use http::{HttpClient, Url};
use rpc::{ErrorCode, ErrorExt as _};
use search::SearchQuery;
use search_history::SearchHistory;
use serde::Serialize;
use settings::{watch_config_file, Settings, SettingsLocation, SettingsStore};
use sha2::{Digest, Sha256};
use similar::{ChangeTag, TextDiff};
use smol::channel::{Receiver, Sender};
use smol::lock::Semaphore;
use snippet::Snippet;
use std::{
borrow::Cow,
cmp::{self, Ordering},
@@ -111,7 +110,7 @@ use util::{
},
post_inc, ResultExt, TryFutureExt as _,
};
use worktree::{Snapshot, Traversal};
use worktree::{CreatedEntry, RemoteWorktreeClient, Snapshot, Traversal};
pub use fs::*;
pub use language::Location;
@@ -229,6 +228,7 @@ pub struct Project {
hosted_project_id: Option<ProjectId>,
dev_server_project_id: Option<client::DevServerProjectId>,
search_history: SearchHistory,
yarn_worktree_ids_reported: Vec<WorktreeId>,
}
pub enum LanguageServerToQuery {
@@ -787,6 +787,7 @@ impl Project {
hosted_project_id: None,
dev_server_project_id: None,
search_history: Self::new_search_history(),
yarn_worktree_ids_reported: Vec::new(),
}
})
}
@@ -856,7 +857,13 @@ impl Project {
// That's because Worktree's identifier is entity id, which should probably be changed.
let mut worktrees = Vec::new();
for worktree in response.payload.worktrees {
let worktree = Worktree::remote(replica_id, worktree, cx);
let worktree = Worktree::remote(
remote_id,
replica_id,
worktree,
Box::new(CollabRemoteWorktreeClient(client.clone())),
cx,
);
worktrees.push(worktree);
}
@@ -945,6 +952,7 @@ impl Project {
.dev_server_project_id
.map(|dev_server_project_id| DevServerProjectId(dev_server_project_id)),
search_history: Self::new_search_history(),
yarn_worktree_ids_reported: Vec::new(),
};
this.set_role(role, cx);
for worktree in worktrees {
@@ -1450,47 +1458,9 @@ impl Project {
"No worktree for path {project_path:?}"
))));
};
if self.is_local() {
worktree.update(cx, |worktree, cx| {
worktree
.as_local_mut()
.unwrap()
.create_entry(project_path.path, is_directory, cx)
})
} else {
let client = self.client.clone();
let project_id = self.remote_id().unwrap();
cx.spawn(move |_, mut cx| async move {
let response = client
.request(proto::CreateProjectEntry {
worktree_id: project_path.worktree_id.to_proto(),
project_id,
path: project_path.path.to_string_lossy().into(),
is_directory,
})
.await?;
match response.entry {
Some(entry) => worktree
.update(&mut cx, |worktree, cx| {
worktree.as_remote_mut().unwrap().insert_entry(
entry,
response.worktree_scan_id as usize,
cx,
)
})?
.await
.map(CreatedEntry::Included),
None => {
let abs_path = worktree.update(&mut cx, |worktree, _| {
worktree
.absolutize(&project_path.path)
.with_context(|| format!("absolutizing {project_path:?}"))
})??;
Ok(CreatedEntry::Excluded { abs_path })
}
}
})
}
worktree.update(cx, |worktree, cx| {
worktree.create_entry(project_path.path, is_directory, cx)
})
}
pub fn copy_entry(
@@ -1502,41 +1472,9 @@ impl Project {
let Some(worktree) = self.worktree_for_entry(entry_id, cx) else {
return Task::ready(Ok(None));
};
let new_path = new_path.into();
if self.is_local() {
worktree.update(cx, |worktree, cx| {
worktree
.as_local_mut()
.unwrap()
.copy_entry(entry_id, new_path, cx)
})
} else {
let client = self.client.clone();
let project_id = self.remote_id().unwrap();
cx.spawn(move |_, mut cx| async move {
let response = client
.request(proto::CopyProjectEntry {
project_id,
entry_id: entry_id.to_proto(),
new_path: new_path.to_string_lossy().into(),
})
.await?;
match response.entry {
Some(entry) => worktree
.update(&mut cx, |worktree, cx| {
worktree.as_remote_mut().unwrap().insert_entry(
entry,
response.worktree_scan_id as usize,
cx,
)
})?
.await
.map(Some),
None => Ok(None),
}
})
}
worktree.update(cx, |worktree, cx| {
worktree.copy_entry(entry_id, new_path, cx)
})
}
pub fn rename_entry(
@@ -1548,48 +1486,9 @@ impl Project {
let Some(worktree) = self.worktree_for_entry(entry_id, cx) else {
return Task::ready(Err(anyhow!(format!("No worktree for entry {entry_id:?}"))));
};
let new_path = new_path.into();
if self.is_local() {
worktree.update(cx, |worktree, cx| {
worktree
.as_local_mut()
.unwrap()
.rename_entry(entry_id, new_path, cx)
})
} else {
let client = self.client.clone();
let project_id = self.remote_id().unwrap();
cx.spawn(move |_, mut cx| async move {
let response = client
.request(proto::RenameProjectEntry {
project_id,
entry_id: entry_id.to_proto(),
new_path: new_path.to_string_lossy().into(),
})
.await?;
match response.entry {
Some(entry) => worktree
.update(&mut cx, |worktree, cx| {
worktree.as_remote_mut().unwrap().insert_entry(
entry,
response.worktree_scan_id as usize,
cx,
)
})?
.await
.map(CreatedEntry::Included),
None => {
let abs_path = worktree.update(&mut cx, |worktree, _| {
worktree
.absolutize(&new_path)
.with_context(|| format!("absolutizing {new_path:?}"))
})??;
Ok(CreatedEntry::Excluded { abs_path })
}
}
})
}
worktree.update(cx, |worktree, cx| {
worktree.rename_entry(entry_id, new_path, cx)
})
}
pub fn delete_entry(
@@ -1599,38 +1498,10 @@ impl Project {
cx: &mut ModelContext<Self>,
) -> Option<Task<Result<()>>> {
let worktree = self.worktree_for_entry(entry_id, cx)?;
cx.emit(Event::DeletedEntry(entry_id));
if self.is_local() {
worktree.update(cx, |worktree, cx| {
worktree
.as_local_mut()
.unwrap()
.delete_entry(entry_id, trash, cx)
})
} else {
let client = self.client.clone();
let project_id = self.remote_id().unwrap();
Some(cx.spawn(move |_, mut cx| async move {
let response = client
.request(proto::DeleteProjectEntry {
project_id,
entry_id: entry_id.to_proto(),
use_trash: trash,
})
.await?;
worktree
.update(&mut cx, move |worktree, cx| {
worktree.as_remote_mut().unwrap().delete_entry(
entry_id,
response.worktree_scan_id as usize,
cx,
)
})?
.await
}))
}
worktree.update(cx, |worktree, cx| {
worktree.delete_entry(entry_id, trash, cx)
})
}
pub fn expand_entry(
@@ -1640,31 +1511,7 @@ impl Project {
cx: &mut ModelContext<Self>,
) -> Option<Task<Result<()>>> {
let worktree = self.worktree_for_id(worktree_id, cx)?;
if self.is_local() {
worktree.update(cx, |worktree, cx| {
worktree.as_local_mut().unwrap().expand_entry(entry_id, cx)
})
} else {
let worktree = worktree.downgrade();
let request = self.client.request(proto::ExpandProjectEntry {
project_id: self.remote_id().unwrap(),
entry_id: entry_id.to_proto(),
});
Some(cx.spawn(move |_, mut cx| async move {
let response = request.await?;
if let Some(worktree) = worktree.upgrade() {
worktree
.update(&mut cx, |worktree, _| {
worktree
.as_remote_mut()
.unwrap()
.wait_for_snapshot(response.worktree_scan_id as usize)
})?
.await?;
}
Ok(())
}))
}
worktree.update(cx, |worktree, cx| worktree.expand_entry(entry_id, cx))
}
pub fn shared(&mut self, project_id: u64, cx: &mut ModelContext<Self>) -> Result<()> {
@@ -1782,18 +1629,12 @@ impl Project {
}
}
worktree.as_local_mut().unwrap().observe_updates(
project_id,
cx,
{
let client = client.clone();
move |update| {
client
.request(update)
.map(|result| result.is_ok())
}
},
);
worktree.observe_updates(project_id, cx, {
let client = client.clone();
move |update| {
client.request(update).map(|result| result.is_ok())
}
});
anyhow::Ok(())
})?;
@@ -1944,7 +1785,7 @@ impl Project {
for worktree_handle in self.worktrees.iter_mut() {
if let WorktreeHandle::Strong(worktree) = worktree_handle {
let is_visible = worktree.update(cx, |worktree, _| {
worktree.as_local_mut().unwrap().stop_observing_updates();
worktree.stop_observing_updates();
worktree.is_visible()
});
if !is_visible {
@@ -2227,21 +2068,20 @@ impl Project {
cx: &mut ModelContext<Self>,
) -> Task<Result<Model<Buffer>>> {
let load_buffer = worktree.update(cx, |worktree, cx| {
let worktree = worktree.as_local_mut().unwrap();
let file = worktree.load_file(path.as_ref(), cx);
let load_file = worktree.load_file(path.as_ref(), cx);
let reservation = cx.reserve_model();
let buffer_id = BufferId::from(reservation.entity_id().as_non_zero_u64());
cx.spawn(move |_, mut cx| async move {
let (file, contents, diff_base) = file.await?;
let loaded = load_file.await?;
let text_buffer = cx
.background_executor()
.spawn(async move { text::Buffer::new(0, buffer_id, contents) })
.spawn(async move { text::Buffer::new(0, buffer_id, loaded.text) })
.await;
cx.insert_model(reservation, |_| {
Buffer::build(
text_buffer,
diff_base,
Some(Arc::new(file)),
loaded.diff_base,
Some(loaded.file),
Capability::ReadWrite,
)
})
@@ -2395,10 +2235,11 @@ impl Project {
};
let worktree = file.worktree.clone();
let path = file.path.clone();
worktree.update(cx, |worktree, cx| match worktree {
Worktree::Local(worktree) => self.save_local_buffer(&worktree, buffer, path, false, cx),
Worktree::Remote(_) => self.save_remote_buffer(buffer, None, cx),
})
if self.is_local() {
self.save_local_buffer(worktree, buffer, path, false, cx)
} else {
self.save_remote_buffer(buffer, None, cx)
}
}
pub fn save_buffer_as(
@@ -2407,26 +2248,21 @@ impl Project {
path: ProjectPath,
cx: &mut ModelContext<Self>,
) -> Task<Result<()>> {
let old_file = File::from_dyn(buffer.read(cx).file())
.filter(|f| f.is_local())
.cloned();
let old_file = File::from_dyn(buffer.read(cx).file()).cloned();
let Some(worktree) = self.worktree_for_id(path.worktree_id, cx) else {
return Task::ready(Err(anyhow!("worktree does not exist")));
};
cx.spawn(move |this, mut cx| async move {
this.update(&mut cx, |this, cx| {
if let Some(old_file) = &old_file {
this.unregister_buffer_from_language_servers(&buffer, old_file, cx);
if this.is_local() {
if let Some(old_file) = &old_file {
this.unregister_buffer_from_language_servers(&buffer, old_file, cx);
}
this.save_local_buffer(worktree, buffer.clone(), path.path, true, cx)
} else {
this.save_remote_buffer(buffer.clone(), Some(path.to_proto()), cx)
}
worktree.update(cx, |worktree, cx| match worktree {
Worktree::Local(worktree) => {
this.save_local_buffer(worktree, buffer.clone(), path.path, true, cx)
}
Worktree::Remote(_) => {
this.save_remote_buffer(buffer.clone(), Some(path.to_proto()), cx)
}
})
})?
.await?;
@@ -2440,70 +2276,39 @@ impl Project {
pub fn save_local_buffer(
&self,
worktree: &LocalWorktree,
worktree: Model<Worktree>,
buffer_handle: Model<Buffer>,
path: Arc<Path>,
mut has_changed_file: bool,
cx: &mut ModelContext<Worktree>,
cx: &mut ModelContext<Self>,
) -> Task<Result<()>> {
let buffer = buffer_handle.read(cx);
let rpc = self.client.clone();
let buffer_id: u64 = buffer.remote_id().into();
let project_id = self.remote_id();
let buffer_id = buffer.remote_id();
let text = buffer.as_rope().clone();
let line_ending = buffer.line_ending();
let version = buffer.version();
if buffer.file().is_some_and(|file| !file.is_created()) {
has_changed_file = true;
}
let text = buffer.as_rope().clone();
let version = buffer.version();
let save = worktree.write_file(path.as_ref(), text, buffer.line_ending(), cx);
let fs = Arc::clone(&self.fs);
let abs_path = worktree.absolutize(&path);
let is_private = worktree.is_path_private(&path);
cx.spawn(move |this, mut cx| async move {
let entry = save.await?;
let abs_path = abs_path?;
let this = this.upgrade().context("worktree dropped")?;
let (entry_id, mtime, path, is_private) = match entry {
Some(entry) => (Some(entry.id), entry.mtime, entry.path, entry.is_private),
None => {
let metadata = fs
.metadata(&abs_path)
.await
.with_context(|| {
format!(
"Fetching metadata after saving the excluded buffer {abs_path:?}"
)
})?
.with_context(|| {
format!("Excluded buffer {path:?} got removed during saving")
})?;
(None, Some(metadata.mtime), path, is_private)
}
};
let save = worktree.update(cx, |worktree, cx| {
worktree.write_file(path.as_ref(), text, line_ending, cx)
});
let client = self.client.clone();
let project_id = self.remote_id();
cx.spawn(move |_, mut cx| async move {
let new_file = save.await?;
let mtime = new_file.mtime;
if has_changed_file {
let new_file = Arc::new(File {
entry_id,
worktree: this,
path,
mtime,
is_local: true,
is_deleted: false,
is_private,
});
if let Some(project_id) = project_id {
rpc.send(proto::UpdateBufferFile {
project_id,
buffer_id,
file: Some(new_file.to_proto()),
})
.log_err();
client
.send(proto::UpdateBufferFile {
project_id,
buffer_id: buffer_id.into(),
file: Some(new_file.to_proto()),
})
.log_err();
}
buffer_handle.update(&mut cx, |buffer, cx| {
@@ -2514,9 +2319,9 @@ impl Project {
}
if let Some(project_id) = project_id {
rpc.send(proto::BufferSaved {
client.send(proto::BufferSaved {
project_id,
buffer_id,
buffer_id: buffer_id.into(),
version: serialize_version(&version),
mtime: mtime.map(|time| time.into()),
})?;
@@ -2534,7 +2339,7 @@ impl Project {
&self,
buffer_handle: Model<Buffer>,
new_path: Option<proto::ProjectPath>,
cx: &mut ModelContext<Worktree>,
cx: &mut ModelContext<Self>,
) -> Task<Result<()>> {
let buffer = buffer_handle.read(cx);
let buffer_id = buffer.remote_id().into();
@@ -2638,7 +2443,6 @@ impl Project {
self.detect_language_for_buffer(buffer, cx);
self.register_buffer_with_language_servers(buffer, cx);
// self.register_buffer_with_copilot(buffer, cx);
cx.observe_release(buffer, |this, buffer, cx| {
if let Some(file) = File::from_dyn(buffer.file()) {
if file.is_local() {
@@ -2788,16 +2592,6 @@ impl Project {
});
}
// fn register_buffer_with_copilot(
// &self,
// buffer_handle: &Model<Buffer>,
// cx: &mut ModelContext<Self>,
// ) {
// if let Some(copilot) = Copilot::global(cx) {
// copilot.update(cx, |copilot, cx| copilot.register_buffer(buffer_handle, cx));
// }
// }
async fn send_buffer_ordered_messages(
this: WeakModel<Self>,
rx: UnboundedReceiver<BufferOrderedMessage>,
@@ -5518,7 +5312,7 @@ impl Project {
) -> Result<Option<Diff>> {
let working_dir_path = buffer.update(cx, |buffer, cx| {
let file = File::from_dyn(buffer.file())?;
let worktree = file.worktree.read(cx).as_local()?;
let worktree = file.worktree.read(cx);
let mut worktree_path = worktree.abs_path().to_path_buf();
if worktree.root_entry()?.is_file() {
worktree_path.pop();
@@ -5705,9 +5499,6 @@ impl Project {
if !worktree.is_visible() {
continue;
}
let Some(worktree) = worktree.as_local() else {
continue;
};
let worktree_abs_path = worktree.abs_path().clone();
let (adapter, language, server) = match self.language_servers.get(server_id) {
@@ -5871,15 +5662,14 @@ impl Project {
let worktree_abs_path = if let Some(worktree_abs_path) = self
.worktree_for_id(symbol.path.worktree_id, cx)
.and_then(|worktree| worktree.read(cx).as_local())
.map(|local_worktree| local_worktree.abs_path())
.map(|worktree| worktree.read(cx).abs_path())
{
worktree_abs_path
} else {
return Task::ready(Err(anyhow!("worktree not found for symbol")));
};
let symbol_abs_path = resolve_path(worktree_abs_path, &symbol.path.path);
let symbol_abs_path = resolve_path(&worktree_abs_path, &symbol.path.path);
let symbol_uri = if let Ok(uri) = lsp::Url::from_file_path(symbol_abs_path) {
uri
} else {
@@ -7231,8 +7021,8 @@ impl Project {
let snapshots = self
.visible_worktrees(cx)
.filter_map(|tree| {
let tree = tree.read(cx).as_local()?;
Some(tree.snapshot())
let tree = tree.read(cx);
Some((tree.snapshot(), tree.as_local()?.settings()))
})
.collect::<Vec<_>>();
let include_root = snapshots.len() > 1;
@@ -7240,11 +7030,11 @@ impl Project {
let background = cx.background_executor().clone();
let path_count: usize = snapshots
.iter()
.map(|s| {
.map(|(snapshot, _)| {
if query.include_ignored() {
s.file_count()
snapshot.file_count()
} else {
s.visible_file_count()
snapshot.visible_file_count()
}
})
.sum();
@@ -7400,7 +7190,7 @@ impl Project {
query: SearchQuery,
include_root: bool,
path_count: usize,
snapshots: Vec<LocalSnapshot>,
snapshots: Vec<(Snapshot, WorktreeSettings)>,
matching_paths_tx: Sender<SearchMatchCandidate>,
) {
let fs = &fs;
@@ -7456,13 +7246,14 @@ impl Project {
}
if query.include_ignored() {
for snapshot in snapshots {
for (snapshot, settings) in snapshots {
for ignored_entry in snapshot.entries(true).filter(|e| e.is_ignored) {
let limiter = Arc::clone(&max_concurrent_workers);
scope.spawn(async move {
let _guard = limiter.acquire().await;
search_ignored_entry(
snapshot,
settings,
ignored_entry,
fs,
query,
@@ -7897,6 +7688,8 @@ impl Project {
worktree.read(cx).id(),
changes.clone(),
));
this.report_yarn_project(&worktree, changes, cx);
}
worktree::Event::UpdatedGitRepositories(updated_repos) => {
if is_local {
@@ -7949,6 +7742,32 @@ impl Project {
self.metadata_changed(cx);
}
fn report_yarn_project(
&mut self,
worktree: &Model<Worktree>,
updated_entries_set: &UpdatedEntriesSet,
cx: &mut ModelContext<Self>,
) {
let worktree_id = worktree.update(cx, |worktree, _| worktree.id());
if !self.yarn_worktree_ids_reported.contains(&worktree_id) {
let is_yarn_project = updated_entries_set.iter().any(|(path, _, _)| {
path.as_ref()
.file_name()
.and_then(|name| name.to_str())
.map(|name_str| name_str == "yarn.lock")
.unwrap_or(false)
});
if is_yarn_project {
self.client()
.telemetry()
.report_app_event("open yarn project".to_string());
self.yarn_worktree_ids_reported.push(worktree_id);
}
}
}
fn update_local_worktree_buffers(
&mut self,
worktree_handle: &Model<Worktree>,
@@ -8271,7 +8090,7 @@ impl Project {
changes: &UpdatedEntriesSet,
cx: &mut ModelContext<Self>,
) {
if worktree.read(cx).as_local().is_none() {
if worktree.read(cx).is_remote() {
return;
}
let project_id = self.remote_id();
@@ -8576,14 +8395,12 @@ impl Project {
self.worktree_for_id(project_path.worktree_id, cx)?
.read(cx)
.as_local()?
.snapshot()
.local_git_repo(&project_path.path)
}
pub fn get_first_worktree_root_repo(&self, cx: &AppContext) -> Option<Arc<dyn GitRepository>> {
let worktree = self.visible_worktrees(cx).next()?.read(cx).as_local()?;
let root_entry = worktree.root_git_entry()?;
worktree.get_local_repo(&root_entry)?.repo().clone().into()
}
@@ -8985,21 +8802,7 @@ impl Project {
this.worktree_for_id(worktree_id, cx)
.ok_or_else(|| anyhow!("worktree not found"))
})??;
let worktree_scan_id = worktree.update(&mut cx, |worktree, _| worktree.scan_id())?;
let entry = worktree
.update(&mut cx, |worktree, cx| {
let worktree = worktree.as_local_mut().unwrap();
let path = PathBuf::from(envelope.payload.path);
worktree.create_entry(path, envelope.payload.is_directory, cx)
})?
.await?;
Ok(proto::ProjectEntryResponse {
entry: match &entry {
CreatedEntry::Included(entry) => Some(entry.into()),
CreatedEntry::Excluded { .. } => None,
},
worktree_scan_id: worktree_scan_id as u64,
})
Worktree::handle_create_entry(worktree, envelope.payload, cx).await
}
async fn handle_rename_project_entry(
@@ -9013,23 +8816,7 @@ impl Project {
this.worktree_for_entry(entry_id, cx)
.ok_or_else(|| anyhow!("worktree not found"))
})??;
let worktree_scan_id = worktree.update(&mut cx, |worktree, _| worktree.scan_id())?;
let entry = worktree
.update(&mut cx, |worktree, cx| {
let new_path = PathBuf::from(envelope.payload.new_path);
worktree
.as_local_mut()
.unwrap()
.rename_entry(entry_id, new_path, cx)
})?
.await?;
Ok(proto::ProjectEntryResponse {
entry: match &entry {
CreatedEntry::Included(entry) => Some(entry.into()),
CreatedEntry::Excluded { .. } => None,
},
worktree_scan_id: worktree_scan_id as u64,
})
Worktree::handle_rename_entry(worktree, envelope.payload, cx).await
}
async fn handle_copy_project_entry(
@@ -9043,20 +8830,7 @@ impl Project {
this.worktree_for_entry(entry_id, cx)
.ok_or_else(|| anyhow!("worktree not found"))
})??;
let worktree_scan_id = worktree.update(&mut cx, |worktree, _| worktree.scan_id())?;
let entry = worktree
.update(&mut cx, |worktree, cx| {
let new_path = PathBuf::from(envelope.payload.new_path);
worktree
.as_local_mut()
.unwrap()
.copy_entry(entry_id, new_path, cx)
})?
.await?;
Ok(proto::ProjectEntryResponse {
entry: entry.as_ref().map(|e| e.into()),
worktree_scan_id: worktree_scan_id as u64,
})
Worktree::handle_copy_entry(worktree, envelope.payload, cx).await
}
async fn handle_delete_project_entry(
@@ -9066,28 +8840,12 @@ impl Project {
mut cx: AsyncAppContext,
) -> Result<proto::ProjectEntryResponse> {
let entry_id = ProjectEntryId::from_proto(envelope.payload.entry_id);
let trash = envelope.payload.use_trash;
this.update(&mut cx, |_, cx| cx.emit(Event::DeletedEntry(entry_id)))?;
let worktree = this.update(&mut cx, |this, cx| {
this.worktree_for_entry(entry_id, cx)
.ok_or_else(|| anyhow!("worktree not found"))
})??;
let worktree_scan_id = worktree.update(&mut cx, |worktree, _| worktree.scan_id())?;
worktree
.update(&mut cx, |worktree, cx| {
worktree
.as_local_mut()
.unwrap()
.delete_entry(entry_id, trash, cx)
.ok_or_else(|| anyhow!("invalid entry"))
})??
.await?;
Ok(proto::ProjectEntryResponse {
entry: None,
worktree_scan_id: worktree_scan_id as u64,
})
this.update(&mut cx, |_, cx| cx.emit(Event::DeletedEntry(entry_id)))?;
Worktree::handle_delete_entry(worktree, envelope.payload, cx).await
}
async fn handle_expand_project_entry(
@@ -9100,17 +8858,7 @@ impl Project {
let worktree = this
.update(&mut cx, |this, cx| this.worktree_for_entry(entry_id, cx))?
.ok_or_else(|| anyhow!("invalid request"))?;
worktree
.update(&mut cx, |worktree, cx| {
worktree
.as_local_mut()
.unwrap()
.expand_entry(entry_id, cx)
.ok_or_else(|| anyhow!("invalid entry"))
})??
.await?;
let worktree_scan_id = worktree.update(&mut cx, |worktree, _| worktree.scan_id())? as u64;
Ok(proto::ExpandProjectEntryResponse { worktree_scan_id })
Worktree::handle_expand_entry(worktree, envelope.payload, cx).await
}
async fn handle_update_diagnostic_summary(
@@ -10563,6 +10311,7 @@ impl Project {
cx: &mut ModelContext<Project>,
) -> Result<()> {
let replica_id = self.replica_id();
let remote_id = self.remote_id().ok_or_else(|| anyhow!("invalid project"))?;
let mut old_worktrees_by_id = self
.worktrees
@@ -10579,8 +10328,16 @@ impl Project {
{
self.worktrees.push(WorktreeHandle::Strong(old_worktree));
} else {
let worktree = Worktree::remote(replica_id, worktree, cx);
let _ = self.add_worktree(&worktree, cx);
self.add_worktree(
&Worktree::remote(
remote_id,
replica_id,
worktree,
Box::new(CollabRemoteWorktreeClient(self.client.clone())),
cx,
),
cx,
);
}
}
@@ -11343,7 +11100,7 @@ fn deserialize_code_actions(code_actions: &HashMap<String, bool>) -> Vec<lsp::Co
#[allow(clippy::too_many_arguments)]
async fn search_snapshots(
snapshots: &Vec<LocalSnapshot>,
snapshots: &Vec<(Snapshot, WorktreeSettings)>,
worker_start_ix: usize,
worker_end_ix: usize,
query: &SearchQuery,
@@ -11355,7 +11112,7 @@ async fn search_snapshots(
let mut snapshot_start_ix = 0;
let mut abs_path = PathBuf::new();
for snapshot in snapshots {
for (snapshot, _) in snapshots {
let snapshot_end_ix = snapshot_start_ix
+ if query.include_ignored() {
snapshot.file_count()
@@ -11421,7 +11178,8 @@ async fn search_snapshots(
}
async fn search_ignored_entry(
snapshot: &LocalSnapshot,
snapshot: &Snapshot,
settings: &WorktreeSettings,
ignored_entry: &Entry,
fs: &Arc<dyn Fs>,
query: &SearchQuery,
@@ -11455,7 +11213,7 @@ async fn search_ignored_entry(
}
} else if !fs_metadata.is_symlink {
if !query.file_matches(Some(&ignored_abs_path))
|| snapshot.is_path_excluded(&ignored_entry.path)
|| settings.is_path_excluded(&ignored_entry.path)
{
continue;
}
@@ -11531,6 +11289,18 @@ impl OpenBuffer {
}
}
pub struct CollabRemoteWorktreeClient(Arc<Client>);
impl RemoteWorktreeClient for CollabRemoteWorktreeClient {
fn request(
&self,
envelope: proto::Envelope,
request_type: &'static str,
) -> BoxFuture<'static, Result<proto::Envelope>> {
self.0.request_dynamic(envelope, request_type).boxed()
}
}
pub struct PathMatchCandidateSet {
pub snapshot: Snapshot,
pub include_ignored: bool,

@@ -94,6 +94,7 @@ const fn true_value() -> bool {
pub struct BinarySettings {
pub path: Option<String>,
pub arguments: Option<Vec<String>>,
pub path_lookup: Option<bool>,
}
#[derive(Clone, Debug, Default, Serialize, Deserialize, PartialEq, Eq, JsonSchema)]

@@ -2981,21 +2981,26 @@ async fn test_rescan_and_remote_updates(cx: &mut gpui::TestAppContext) {
// Create a remote copy of this worktree.
let tree = project.update(cx, |project, _| project.worktrees().next().unwrap());
let metadata = tree.update(cx, |tree, _| tree.as_local().unwrap().metadata_proto());
let metadata = tree.update(cx, |tree, _| tree.metadata_proto());
let updates = Arc::new(Mutex::new(Vec::new()));
tree.update(cx, |tree, cx| {
tree.as_local_mut().unwrap().observe_updates(0, cx, {
let updates = updates.clone();
move |update| {
updates.lock().push(update);
async { true }
}
let updates = updates.clone();
tree.observe_updates(0, cx, move |update| {
updates.lock().push(update);
async { true }
});
});
let remote = cx.update(|cx| Worktree::remote(1, metadata, cx));
let remote = cx.update(|cx| {
Worktree::remote(
0,
1,
metadata,
Box::new(CollabRemoteWorktreeClient(project.read(cx).client())),
cx,
)
});
cx.executor().run_until_parked();

@@ -36,7 +36,7 @@ use util::{maybe, NumericPrefixWithSuffix, ResultExt, TryFutureExt};
use workspace::{
dock::{DockPosition, Panel, PanelEvent},
notifications::{DetachAndPromptErr, NotifyTaskExt},
OpenInTerminal, Workspace,
DraggedSelection, OpenInTerminal, SelectedEntry, Workspace,
};
use worktree::CreatedEntry;
@@ -65,26 +65,6 @@ pub struct ProjectPanel {
pending_serialization: Task<Option<()>>,
}
#[derive(Copy, Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
struct SelectedEntry {
worktree_id: WorktreeId,
entry_id: ProjectEntryId,
}
struct DraggedSelection {
active_selection: SelectedEntry,
marked_selections: Arc<BTreeSet<SelectedEntry>>,
}
impl DraggedSelection {
fn items<'a>(&'a self) -> Box<dyn Iterator<Item = &'a SelectedEntry> + 'a> {
if self.marked_selections.contains(&self.active_selection) {
Box::new(self.marked_selections.iter())
} else {
Box::new(std::iter::once(&self.active_selection))
}
}
}
#[derive(Clone, Debug)]
struct EditState {
worktree_id: WorktreeId,

@@ -435,6 +435,7 @@ impl Peer {
self.connections.write().clear();
}
/// Make a request and wait for a response.
pub fn request<T: RequestMessage>(
&self,
receiver_id: ConnectionId,
@@ -462,28 +463,50 @@ impl Peer {
.map_ok(|envelope| envelope.payload)
}
pub fn request_internal<T: RequestMessage>(
fn request_internal<T: RequestMessage>(
&self,
original_sender_id: Option<ConnectionId>,
receiver_id: ConnectionId,
request: T,
) -> impl Future<Output = Result<TypedEnvelope<T::Response>>> {
let envelope = request.into_envelope(0, None, original_sender_id.map(Into::into));
let response = self.request_dynamic(receiver_id, envelope, T::NAME);
async move {
let (response, received_at) = response.await?;
Ok(TypedEnvelope {
message_id: response.id,
sender_id: receiver_id,
original_sender_id: response.original_sender_id,
payload: T::Response::from_envelope(response)
.ok_or_else(|| anyhow!("received response of the wrong type"))?,
received_at,
})
}
}
/// Make a request and wait for a response.
///
/// The caller must make sure to deserialize the response into the request's
/// response type. This interface is only useful in trait objects, where
/// generics can't be used. If you have a concrete type, use `request`.
pub fn request_dynamic(
&self,
receiver_id: ConnectionId,
mut envelope: proto::Envelope,
type_name: &'static str,
) -> impl Future<Output = Result<(proto::Envelope, Instant)>> {
let (tx, rx) = oneshot::channel();
let send = self.connection_state(receiver_id).and_then(|connection| {
let message_id = connection.next_message_id.fetch_add(1, SeqCst);
envelope.id = connection.next_message_id.fetch_add(1, SeqCst);
connection
.response_channels
.lock()
.as_mut()
.ok_or_else(|| anyhow!("connection was closed"))?
.insert(message_id, tx);
.insert(envelope.id, tx);
connection
.outgoing_tx
.unbounded_send(proto::Message::Envelope(request.into_envelope(
message_id,
None,
original_sender_id.map(Into::into),
)))
.unbounded_send(proto::Message::Envelope(envelope))
.map_err(|_| anyhow!("connection was closed"))?;
Ok(())
});
@@ -491,19 +514,10 @@ impl Peer {
send?;
let (response, received_at, _barrier) =
rx.await.map_err(|_| anyhow!("connection was closed"))?;
if let Some(proto::envelope::Payload::Error(error)) = &response.payload {
Err(RpcError::from_proto(&error, T::NAME))
} else {
Ok(TypedEnvelope {
message_id: response.id,
sender_id: receiver_id,
original_sender_id: response.original_sender_id,
payload: T::Response::from_envelope(response)
.ok_or_else(|| anyhow!("received response of the wrong type"))?,
received_at,
})
return Err(RpcError::from_proto(&error, type_name));
}
Ok((response, received_at))
}
}
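The refactor routes both `request` and the new `request_dynamic` through the same correlation scheme: every outgoing envelope is stamped with a fresh message id, and a per-id channel is registered so the matching response can be delivered to the waiter. A minimal synchronous sketch of that id-to-channel map, under assumed names (the real `Peer` uses oneshot channels, an atomic id counter, and an outgoing message queue):

```rust
use std::collections::HashMap;
use std::sync::mpsc;

struct Envelope {
    id: u64,
    payload: String,
}

struct Connection {
    next_message_id: u64,
    response_channels: HashMap<u64, mpsc::Sender<Envelope>>,
}

impl Connection {
    fn new() -> Self {
        Self { next_message_id: 0, response_channels: HashMap::new() }
    }

    /// Stamp the outgoing envelope with a fresh id and register a
    /// channel that the matching response will be delivered on.
    fn send_request(&mut self, mut envelope: Envelope) -> mpsc::Receiver<Envelope> {
        envelope.id = self.next_message_id;
        self.next_message_id += 1;
        let (tx, rx) = mpsc::channel();
        self.response_channels.insert(envelope.id, tx);
        // (the real code pushes `envelope` onto `outgoing_tx` here)
        rx
    }

    /// Route an incoming response to whoever registered its id.
    fn deliver_response(&mut self, response: Envelope) {
        if let Some(tx) = self.response_channels.remove(&response.id) {
            let _ = tx.send(response);
        }
    }
}

fn main() {
    let mut conn = Connection::new();
    let rx = conn.send_request(Envelope { id: 0, payload: "ping".into() });
    conn.deliver_response(Envelope { id: 0, payload: "pong".into() });
    assert_eq!(rx.recv().unwrap().payload, "pong");
}
```

This is also why `request_dynamic` can exist at all: the correlation works on raw `proto::Envelope`s, so no generic response type is needed until the caller deserializes, which is what makes it usable from trait objects.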

@@ -30,7 +30,7 @@ use std::{
time::{Duration, SystemTime},
};
use util::ResultExt;
use worktree::LocalSnapshot;
use worktree::Snapshot;
pub use project_index_debug_view::ProjectIndexDebugView;
@@ -583,9 +583,9 @@ impl WorktreeIndex {
}
fn index_entries_changed_on_disk(&self, cx: &AppContext) -> impl Future<Output = Result<()>> {
let worktree = self.worktree.read(cx).as_local().unwrap().snapshot();
let worktree = self.worktree.read(cx).snapshot();
let worktree_abs_path = worktree.abs_path().clone();
let scan = self.scan_entries(worktree.clone(), cx);
let scan = self.scan_entries(worktree, cx);
let chunk = self.chunk_files(worktree_abs_path, scan.updated_entries, cx);
let embed = Self::embed_files(self.embedding_provider.clone(), chunk.files, cx);
let persist = self.persist_embeddings(scan.deleted_entry_ranges, embed.files, cx);
@@ -600,7 +600,7 @@ impl WorktreeIndex {
updated_entries: UpdatedEntriesSet,
cx: &AppContext,
) -> impl Future<Output = Result<()>> {
let worktree = self.worktree.read(cx).as_local().unwrap().snapshot();
let worktree = self.worktree.read(cx).snapshot();
let worktree_abs_path = worktree.abs_path().clone();
let scan = self.scan_updated_entries(worktree, updated_entries.clone(), cx);
let chunk = self.chunk_files(worktree_abs_path, scan.updated_entries, cx);
@@ -612,7 +612,7 @@ impl WorktreeIndex {
}
}
fn scan_entries(&self, worktree: LocalSnapshot, cx: &AppContext) -> ScanEntries {
fn scan_entries(&self, worktree: Snapshot, cx: &AppContext) -> ScanEntries {
let (updated_entries_tx, updated_entries_rx) = channel::bounded(512);
let (deleted_entry_ranges_tx, deleted_entry_ranges_rx) = channel::bounded(128);
let db_connection = self.db_connection.clone();
@@ -692,7 +692,7 @@ impl WorktreeIndex {
fn scan_updated_entries(
&self,
worktree: LocalSnapshot,
worktree: Snapshot,
updated_entries: UpdatedEntriesSet,
cx: &AppContext,
) -> ScanEntries {

@@ -632,8 +632,6 @@ impl SettingsStore {
}
// If the global settings file changed, reload the global value for the field.
project_settings_stack.clear();
paths_stack.clear();
if changed_local_path.is_none() {
if let Some(value) = setting_value
.load_setting(
@@ -653,6 +651,8 @@ impl SettingsStore {
}
// Reload the local values for the setting.
paths_stack.clear();
project_settings_stack.clear();
for ((root_id, path), local_settings) in &self.raw_local_settings {
// Build a stack of all of the local values for that setting.
while let Some(prev_entry) = paths_stack.last() {

@@ -1301,13 +1301,7 @@ mod tests {
.unwrap();
let entry = cx
.update(|cx| {
wt.update(cx, |wt, cx| {
wt.as_local()
.unwrap()
.create_entry(Path::new(""), is_dir, cx)
})
})
.update(|cx| wt.update(cx, |wt, cx| wt.create_entry(Path::new(""), is_dir, cx)))
.await
.unwrap()
.to_included()

@@ -343,36 +343,40 @@ impl<P> PathLikeWithPosition<P> {
#[derive(Clone, Debug)]
pub struct PathMatcher {
maybe_path: PathBuf,
source: String,
glob: GlobMatcher,
}
impl std::fmt::Display for PathMatcher {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
self.maybe_path.to_string_lossy().fmt(f)
self.source.fmt(f)
}
}
impl PartialEq for PathMatcher {
fn eq(&self, other: &Self) -> bool {
self.maybe_path.eq(&other.maybe_path)
self.source.eq(&other.source)
}
}
impl Eq for PathMatcher {}
impl PathMatcher {
pub fn new(maybe_glob: &str) -> Result<Self, globset::Error> {
pub fn new(source: &str) -> Result<Self, globset::Error> {
Ok(PathMatcher {
glob: Glob::new(maybe_glob)?.compile_matcher(),
maybe_path: PathBuf::from(maybe_glob),
glob: Glob::new(source)?.compile_matcher(),
source: String::from(source),
})
}
pub fn source(&self) -> &str {
&self.source
}
pub fn is_match<P: AsRef<Path>>(&self, other: P) -> bool {
let other_path = other.as_ref();
other_path.starts_with(&self.maybe_path)
|| other_path.ends_with(&self.maybe_path)
other_path.starts_with(Path::new(&self.source))
|| other_path.ends_with(Path::new(&self.source))
|| self.glob.is_match(other_path)
|| self.check_with_end_separator(other_path)
}
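`PathMatcher::is_match` accepts a path on three routes: a literal prefix match, a literal suffix match, or the compiled glob. The glob part needs the `globset` crate, but the two literal fallbacks can be sketched with the standard library alone (`literal_match` is an assumed name, not part of the real API):

```rust
use std::path::Path;

/// The non-glob fallbacks in `PathMatcher::is_match`: the stored
/// source string matches any path that starts or ends with it.
/// (The real type additionally consults a compiled globset matcher.)
fn literal_match(source: &str, candidate: &Path) -> bool {
    candidate.starts_with(Path::new(source)) || candidate.ends_with(Path::new(source))
}

fn main() {
    assert!(literal_match("node_modules", Path::new("node_modules/lodash")));
    assert!(literal_match("yarn.lock", Path::new("web/yarn.lock")));
    assert!(!literal_match("target", Path::new("src/main.rs")));
}
```

Note that `Path::starts_with`/`ends_with` compare whole components, so `"target"` does not match `"targeted/file"`; this is why keeping the original `source` string around (rather than only the compiled glob) is useful.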

@@ -21,6 +21,7 @@ use crate::{
surrounds::{check_and_move_to_valid_bracket_pair, SurroundsType},
Vim,
};
use case::{change_case_motion, change_case_object, CaseTarget};
use collections::BTreeSet;
use editor::display_map::ToDisplayPoint;
use editor::scroll::Autoscroll;
@@ -198,6 +199,15 @@ pub fn normal_motion(
Some(Operator::AddSurrounds { target: None }) => {}
Some(Operator::Indent) => indent_motion(vim, motion, times, IndentDirection::In, cx),
Some(Operator::Outdent) => indent_motion(vim, motion, times, IndentDirection::Out, cx),
Some(Operator::Lowercase) => {
change_case_motion(vim, motion, times, CaseTarget::Lowercase, cx)
}
Some(Operator::Uppercase) => {
change_case_motion(vim, motion, times, CaseTarget::Uppercase, cx)
}
Some(Operator::OppositeCase) => {
change_case_motion(vim, motion, times, CaseTarget::OppositeCase, cx)
}
Some(operator) => {
// Can't do anything for text objects, Ignoring
error!("Unexpected normal mode motion operator: {:?}", operator)
@@ -220,6 +230,15 @@ pub fn normal_object(object: Object, cx: &mut WindowContext) {
Some(Operator::Outdent) => {
indent_object(vim, object, around, IndentDirection::Out, cx)
}
Some(Operator::Lowercase) => {
change_case_object(vim, object, around, CaseTarget::Lowercase, cx)
}
Some(Operator::Uppercase) => {
change_case_object(vim, object, around, CaseTarget::Uppercase, cx)
}
Some(Operator::OppositeCase) => {
change_case_object(vim, object, around, CaseTarget::OppositeCase, cx)
}
Some(Operator::AddSurrounds { target: None }) => {
waiting_operator = Some(Operator::AddSurrounds {
target: Some(SurroundsType::Object(object)),

@@ -1,13 +1,98 @@
use editor::scroll::Autoscroll;
use collections::HashMap;
use editor::{display_map::ToDisplayPoint, scroll::Autoscroll};
use gpui::ViewContext;
use language::{Bias, Point};
use language::{Bias, Point, SelectionGoal};
use multi_buffer::MultiBufferRow;
use ui::WindowContext;
use workspace::Workspace;
use crate::{
normal::ChangeCase, normal::ConvertToLowerCase, normal::ConvertToUpperCase, state::Mode, Vim,
motion::Motion,
normal::{ChangeCase, ConvertToLowerCase, ConvertToUpperCase},
object::Object,
state::Mode,
Vim,
};
pub enum CaseTarget {
Lowercase,
Uppercase,
OppositeCase,
}
pub fn change_case_motion(
vim: &mut Vim,
motion: Motion,
times: Option<usize>,
mode: CaseTarget,
cx: &mut WindowContext,
) {
vim.stop_recording();
vim.update_active_editor(cx, |_, editor, cx| {
let text_layout_details = editor.text_layout_details(cx);
editor.transact(cx, |editor, cx| {
let mut selection_starts: HashMap<_, _> = Default::default();
editor.change_selections(None, cx, |s| {
s.move_with(|map, selection| {
let anchor = map.display_point_to_anchor(selection.head(), Bias::Left);
selection_starts.insert(selection.id, anchor);
motion.expand_selection(map, selection, times, false, &text_layout_details);
});
});
match mode {
CaseTarget::Lowercase => editor.convert_to_lower_case(&Default::default(), cx),
CaseTarget::Uppercase => editor.convert_to_upper_case(&Default::default(), cx),
CaseTarget::OppositeCase => {
editor.convert_to_opposite_case(&Default::default(), cx)
}
}
editor.change_selections(None, cx, |s| {
s.move_with(|map, selection| {
let anchor = selection_starts.remove(&selection.id).unwrap();
selection.collapse_to(anchor.to_display_point(map), SelectionGoal::None);
});
});
});
});
}
pub fn change_case_object(
vim: &mut Vim,
object: Object,
around: bool,
mode: CaseTarget,
cx: &mut WindowContext,
) {
vim.stop_recording();
vim.update_active_editor(cx, |_, editor, cx| {
editor.transact(cx, |editor, cx| {
let mut original_positions: HashMap<_, _> = Default::default();
editor.change_selections(None, cx, |s| {
s.move_with(|map, selection| {
object.expand_selection(map, selection, around);
original_positions.insert(
selection.id,
map.display_point_to_anchor(selection.start, Bias::Left),
);
});
});
match mode {
CaseTarget::Lowercase => editor.convert_to_lower_case(&Default::default(), cx),
CaseTarget::Uppercase => editor.convert_to_upper_case(&Default::default(), cx),
CaseTarget::OppositeCase => {
editor.convert_to_opposite_case(&Default::default(), cx)
}
}
editor.change_selections(None, cx, |s| {
s.move_with(|map, selection| {
let anchor = original_positions.remove(&selection.id).unwrap();
selection.collapse_to(anchor.to_display_point(map), SelectionGoal::None);
});
});
});
});
}
pub fn change_case(_: &mut Workspace, _: &ChangeCase, cx: &mut ViewContext<Workspace>) {
manipulate_text(cx, |c| {
if c.is_lowercase() {
@@ -180,4 +265,29 @@ mod test {
cx.simulate_shared_keystrokes("ctrl-v j u").await;
cx.shared_state().await.assert_eq("ˇaa\nbb\nCc");
}
#[gpui::test]
async fn test_change_case_motion(cx: &mut gpui::TestAppContext) {
let mut cx = NeovimBackedTestContext::new(cx).await;
// works in visual mode
cx.set_shared_state("ˇabc def").await;
cx.simulate_shared_keystrokes("g shift-u w").await;
cx.shared_state().await.assert_eq("ˇABC def");
cx.simulate_shared_keystrokes("g u w").await;
cx.shared_state().await.assert_eq("ˇabc def");
cx.simulate_shared_keystrokes("g ~ w").await;
cx.shared_state().await.assert_eq("ˇABC def");
cx.simulate_shared_keystrokes(".").await;
cx.shared_state().await.assert_eq("ˇabc def");
cx.set_shared_state("abˇc def").await;
cx.simulate_shared_keystrokes("g ~ i w").await;
cx.shared_state().await.assert_eq("ˇABC def");
cx.simulate_shared_keystrokes(".").await;
cx.shared_state().await.assert_eq("ˇabc def");
}
}
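The `CaseTarget::OppositeCase` path above (vim's `g~`) flips each character's case and leaves caseless characters alone. A minimal sketch of that per-character transform, independent of the editor plumbing:

```rust
/// `g~` semantics: flip the case of every cased character,
/// passing everything else (digits, spaces, punctuation) through.
fn toggle_case(text: &str) -> String {
    text.chars()
        .flat_map(|c| {
            if c.is_lowercase() {
                c.to_uppercase().collect::<Vec<_>>()
            } else if c.is_uppercase() {
                c.to_lowercase().collect::<Vec<_>>()
            } else {
                vec![c]
            }
        })
        .collect()
}

fn main() {
    assert_eq!(toggle_case("aBc 1"), "AbC 1");
}
```

`to_uppercase`/`to_lowercase` return iterators because a single character can map to several (e.g. `ß`), which is why the sketch uses `flat_map` rather than a one-to-one `map`.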

@@ -63,6 +63,10 @@ pub enum Operator {
Jump { line: bool },
Indent,
Outdent,
Lowercase,
Uppercase,
OppositeCase,
}
#[derive(Default, Clone)]
@@ -270,6 +274,9 @@ impl Operator {
Operator::Jump { line: false } => "`",
Operator::Indent => ">",
Operator::Outdent => "<",
Operator::Uppercase => "gU",
Operator::Lowercase => "gu",
Operator::OppositeCase => "g~",
}
}

@@ -539,6 +539,9 @@ impl Vim {
| Operator::Replace
| Operator::Indent
| Operator::Outdent
| Operator::Lowercase
| Operator::Uppercase
| Operator::OppositeCase
) {
self.start_recording(cx)
};

@@ -0,0 +1,23 @@
{"Put":{"state":"ˇabc def"}}
{"Key":"g"}
{"Key":"shift-u"}
{"Key":"w"}
{"Get":{"state":"ˇABC def","mode":"Normal"}}
{"Key":"g"}
{"Key":"u"}
{"Key":"w"}
{"Get":{"state":"ˇabc def","mode":"Normal"}}
{"Key":"g"}
{"Key":"~"}
{"Key":"w"}
{"Get":{"state":"ˇABC def","mode":"Normal"}}
{"Key":"."}
{"Get":{"state":"ˇabc def","mode":"Normal"}}
{"Put":{"state":"abˇc def"}}
{"Key":"g"}
{"Key":"~"}
{"Key":"i"}
{"Key":"w"}
{"Get":{"state":"ˇABC def","mode":"Normal"}}
{"Key":"."}
{"Get":{"state":"ˇabc def","mode":"Normal"}}

@@ -5,7 +5,7 @@ use client::{telemetry::Telemetry, TelemetrySettings};
use db::kvp::KEY_VALUE_STORE;
use gpui::{
svg, AnyElement, AppContext, EventEmitter, FocusHandle, FocusableView, InteractiveElement,
ParentElement, Render, Styled, Subscription, View, ViewContext, VisualContext, WeakView,
ParentElement, Render, Styled, Subscription, Task, View, ViewContext, VisualContext, WeakView,
WindowContext,
};
use settings::{Settings, SettingsStore};
@@ -36,19 +36,21 @@ pub fn init(cx: &mut AppContext) {
base_keymap_picker::init(cx);
}
pub fn show_welcome_view(app_state: Arc<AppState>, cx: &mut AppContext) {
pub fn show_welcome_view(
app_state: Arc<AppState>,
cx: &mut AppContext,
) -> Task<anyhow::Result<()>> {
open_new(app_state, cx, |workspace, cx| {
workspace.toggle_dock(DockPosition::Left, cx);
let welcome_page = WelcomePage::new(workspace, cx);
workspace.add_item_to_center(Box::new(welcome_page.clone()), cx);
cx.focus_view(&welcome_page);
cx.notify();
})
.detach();
db::write_and_log(cx, || {
KEY_VALUE_STORE.write_kvp(FIRST_OPEN.to_string(), "false".to_string())
});
db::write_and_log(cx, || {
KEY_VALUE_STORE.write_kvp(FIRST_OPEN.to_string(), "false".to_string())
});
})
}
pub struct WelcomePage {

@@ -9,7 +9,7 @@ use crate::{
SplitDirection, ToggleZoom, Workspace,
};
use anyhow::Result;
use collections::{HashMap, HashSet, VecDeque};
use collections::{BTreeSet, HashMap, HashSet, VecDeque};
use futures::{stream::FuturesUnordered, StreamExt};
use gpui::{
actions, anchored, deferred, impl_actions, prelude::*, Action, AnchorCorner, AnyElement,
@@ -20,7 +20,7 @@ use gpui::{
};
use itertools::Itertools;
use parking_lot::Mutex;
use project::{Project, ProjectEntryId, ProjectPath};
use project::{Project, ProjectEntryId, ProjectPath, WorktreeId};
use serde::Deserialize;
use settings::{Settings, SettingsStore};
use std::{
@@ -43,6 +43,30 @@ use ui::{
use ui::{v_flex, ContextMenu};
use util::{debug_panic, maybe, truncate_and_remove_front, ResultExt};
/// A selected entry in e.g. project panel.
#[derive(Copy, Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
pub struct SelectedEntry {
pub worktree_id: WorktreeId,
pub entry_id: ProjectEntryId,
}
/// A group of selected entries from project panel.
#[derive(Debug)]
pub struct DraggedSelection {
pub active_selection: SelectedEntry,
pub marked_selections: Arc<BTreeSet<SelectedEntry>>,
}
impl DraggedSelection {
pub fn items<'a>(&'a self) -> Box<dyn Iterator<Item = &'a SelectedEntry> + 'a> {
if self.marked_selections.contains(&self.active_selection) {
Box::new(self.marked_selections.iter())
} else {
Box::new(std::iter::once(&self.active_selection))
}
}
}
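`DraggedSelection::items` above is the "either one or many" iterator pattern: if the actively dragged entry is part of the marked set, the whole set is dragged; otherwise only the active entry is. A stripped-down sketch of the same shape, using plain integers in place of `SelectedEntry`:

```rust
use std::collections::BTreeSet;

/// The `DraggedSelection::items` pattern: yield the whole marked set
/// when the active item belongs to it, otherwise just the active item.
fn items<'a>(active: &'a u32, marked: &'a BTreeSet<u32>) -> Box<dyn Iterator<Item = &'a u32> + 'a> {
    if marked.contains(active) {
        Box::new(marked.iter())
    } else {
        Box::new(std::iter::once(active))
    }
}

fn main() {
    let marked: BTreeSet<u32> = [1, 2, 3].into();
    assert_eq!(items(&2, &marked).count(), 3); // active is marked: drag all three
    assert_eq!(items(&9, &marked).count(), 1); // active is unmarked: drag it alone
}
```

The `Box<dyn Iterator>` return type is what lets the two branches return different concrete iterator types from one function.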
#[derive(PartialEq, Clone, Copy, Deserialize, Debug)]
#[serde(rename_all = "camelCase")]
pub enum SaveIntent {
@@ -1602,7 +1626,7 @@ impl Pane {
.drag_over::<DraggedTab>(|tab, _, cx| {
tab.bg(cx.theme().colors().drop_target_background)
})
.drag_over::<ProjectEntryId>(|tab, _, cx| {
.drag_over::<DraggedSelection>(|tab, _, cx| {
tab.bg(cx.theme().colors().drop_target_background)
})
.when_some(self.can_drop_predicate.clone(), |this, p| {
@@ -1612,9 +1636,9 @@ impl Pane {
this.drag_split_direction = None;
this.handle_tab_drop(dragged_tab, ix, cx)
}))
.on_drop(cx.listener(move |this, entry_id: &ProjectEntryId, cx| {
.on_drop(cx.listener(move |this, selection: &DraggedSelection, cx| {
this.drag_split_direction = None;
this.handle_project_entry_drop(entry_id, cx)
this.handle_project_entry_drop(&selection.active_selection.entry_id, cx)
}))
.on_drop(cx.listener(move |this, paths, cx| {
this.drag_split_direction = None;
@@ -1820,16 +1844,16 @@ impl Pane {
.drag_over::<DraggedTab>(|bar, _, cx| {
bar.bg(cx.theme().colors().drop_target_background)
})
.drag_over::<ProjectEntryId>(|bar, _, cx| {
.drag_over::<DraggedSelection>(|bar, _, cx| {
bar.bg(cx.theme().colors().drop_target_background)
})
.on_drop(cx.listener(move |this, dragged_tab: &DraggedTab, cx| {
this.drag_split_direction = None;
this.handle_tab_drop(dragged_tab, this.items.len(), cx)
}))
.on_drop(cx.listener(move |this, entry_id: &ProjectEntryId, cx| {
.on_drop(cx.listener(move |this, selection: &DraggedSelection, cx| {
this.drag_split_direction = None;
this.handle_project_entry_drop(entry_id, cx)
this.handle_project_entry_drop(&selection.active_selection.entry_id, cx)
}))
.on_drop(cx.listener(move |this, paths, cx| {
this.drag_split_direction = None;
@@ -2179,7 +2203,7 @@ impl Render for Pane {
.relative()
.group("")
.on_drag_move::<DraggedTab>(cx.listener(Self::handle_drag_move))
.on_drag_move::<ProjectEntryId>(cx.listener(Self::handle_drag_move))
.on_drag_move::<DraggedSelection>(cx.listener(Self::handle_drag_move))
.on_drag_move::<ExternalPaths>(cx.listener(Self::handle_drag_move))
.map(|div| {
if let Some(item) = self.active_item() {
@@ -2205,7 +2229,7 @@ impl Render for Pane {
.absolute()
.bg(cx.theme().colors().drop_target_background)
.group_drag_over::<DraggedTab>("", |style| style.visible())
.group_drag_over::<ProjectEntryId>("", |style| style.visible())
.group_drag_over::<DraggedSelection>("", |style| style.visible())
.group_drag_over::<ExternalPaths>("", |style| style.visible())
.when_some(self.can_drop_predicate.clone(), |this, p| {
this.can_drop(move |a, cx| p(a, cx))
@@ -2213,8 +2237,11 @@ impl Render for Pane {
.on_drop(cx.listener(move |this, dragged_tab, cx| {
this.handle_tab_drop(dragged_tab, this.active_item_index(), cx)
}))
.on_drop(cx.listener(move |this, entry_id, cx| {
this.handle_project_entry_drop(entry_id, cx)
.on_drop(cx.listener(move |this, selection: &DraggedSelection, cx| {
this.handle_project_entry_drop(
&selection.active_selection.entry_id,
cx,
)
}))
.on_drop(cx.listener(move |this, paths, cx| {
this.handle_external_paths_drop(paths, cx)

@@ -185,7 +185,7 @@ pub struct CloseInactiveTabsAndPanes {
pub struct SendKeystrokes(pub String);
#[derive(Clone, Deserialize, PartialEq, Default)]
pub struct Restart {
pub struct Reload {
pub binary_path: Option<PathBuf>,
}
@@ -198,7 +198,7 @@ impl_actions!(
CloseInactiveTabsAndPanes,
NewFileInDirection,
OpenTerminal,
Restart,
Reload,
Save,
SaveAll,
SwapPaneInDirection,
@@ -282,7 +282,7 @@ pub fn init(app_state: Arc<AppState>, cx: &mut AppContext) {
notifications::init(cx);
cx.on_action(Workspace::close_global);
cx.on_action(restart);
cx.on_action(reload);
cx.on_action({
let app_state = Arc::downgrade(&app_state);
@@ -4839,18 +4839,16 @@ pub fn open_new(
app_state: Arc<AppState>,
cx: &mut AppContext,
init: impl FnOnce(&mut Workspace, &mut ViewContext<Workspace>) + 'static + Send,
) -> Task<()> {
) -> Task<anyhow::Result<()>> {
let task = Workspace::new_local(Vec::new(), app_state, None, cx);
cx.spawn(|mut cx| async move {
if let Some((workspace, opened_paths)) = task.await.log_err() {
workspace
.update(&mut cx, |workspace, cx| {
if opened_paths.is_empty() {
init(workspace, cx)
}
})
.log_err();
}
let (workspace, opened_paths) = task.await?;
workspace.update(&mut cx, |workspace, cx| {
if opened_paths.is_empty() {
init(workspace, cx)
}
})?;
Ok(())
})
}
@@ -4922,7 +4920,7 @@ pub fn join_hosted_project(
Workspace::new(Default::default(), project, app_state.clone(), cx)
})
})
})?
})??
};
workspace.update(&mut cx, |_, cx| {
@@ -4987,7 +4985,7 @@ pub fn join_dev_server_project(
Workspace::new(Default::default(), project, app_state.clone(), cx)
})
})
})?
})??
}
};
@@ -5050,7 +5048,7 @@ pub fn join_in_room_project(
Workspace::new(Default::default(), project, app_state.clone(), cx)
})
})
})?
})??
};
workspace.update(&mut cx, |workspace, cx| {
@@ -5085,7 +5083,7 @@ pub fn join_in_room_project(
})
}
pub fn restart(restart: &Restart, cx: &mut AppContext) {
pub fn reload(reload: &Reload, cx: &mut AppContext) {
let should_confirm = WorkspaceSettings::get_global(cx).confirm_quit;
let mut workspace_windows = cx
.windows()
@@ -5111,7 +5109,7 @@ pub fn restart(restart: &Restart, cx: &mut AppContext) {
.ok();
}
let binary_path = restart.binary_path.clone();
let binary_path = reload.binary_path.clone();
cx.spawn(|mut cx| async move {
if let Some(prompt) = prompt {
let answer = prompt.await?;

@@ -31,7 +31,6 @@ fuzzy.workspace = true
git.workspace = true
gpui.workspace = true
ignore.workspace = true
itertools.workspace = true
language.workspace = true
log.workspace = true
parking_lot.workspace = true

File diff suppressed because it is too large.
@@ -1,10 +1,37 @@
use std::{path::Path, sync::Arc};
use gpui::AppContext;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
use util::paths::PathMatcher;
#[derive(Clone, PartialEq, Eq)]
pub struct WorktreeSettings {
pub file_scan_exclusions: Arc<[PathMatcher]>,
pub private_files: Arc<[PathMatcher]>,
}
impl WorktreeSettings {
pub fn is_path_private(&self, path: &Path) -> bool {
path.ancestors().any(|ancestor| {
self.private_files
.iter()
.any(|matcher| matcher.is_match(&ancestor))
})
}
pub fn is_path_excluded(&self, path: &Path) -> bool {
path.ancestors().any(|ancestor| {
self.file_scan_exclusions
.iter()
.any(|matcher| matcher.is_match(&ancestor))
})
}
}
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema)]
pub struct WorktreeSettings {
pub struct WorktreeSettingsContent {
/// Completely ignore files matching globs from `file_scan_exclusions`
///
/// Default: [
@@ -28,12 +55,37 @@ pub struct WorktreeSettings {
impl Settings for WorktreeSettings {
const KEY: Option<&'static str> = None;
type FileContent = Self;
type FileContent = WorktreeSettingsContent;
fn load(
sources: SettingsSources<Self::FileContent>,
_: &mut AppContext,
) -> anyhow::Result<Self> {
sources.json_merge()
let result: WorktreeSettingsContent = sources.json_merge()?;
let mut file_scan_exclusions = result.file_scan_exclusions.unwrap_or_default();
let mut private_files = result.private_files.unwrap_or_default();
file_scan_exclusions.sort();
private_files.sort();
Ok(Self {
file_scan_exclusions: path_matchers(&file_scan_exclusions, "file_scan_exclusions"),
private_files: path_matchers(&private_files, "private_files"),
})
}
}
+fn path_matchers(values: &[String], context: &'static str) -> Arc<[PathMatcher]> {
+values
+.iter()
+.filter_map(|pattern| {
+PathMatcher::new(pattern)
+.map(Some)
+.unwrap_or_else(|e| {
+log::error!(
+"Skipping pattern {pattern} in `{}` project settings due to parsing error: {e:#}", context
+);
+None
+})
+})
+.collect::<Vec<_>>()
+.into()
+}
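The ancestor walk used by `is_path_excluded`/`is_path_private` above can be sketched in isolation. This is a simplified illustration, not Zed's code: the hypothetical `SimpleMatcher` matches a single literal path component and stands in for `util::paths::PathMatcher`, which supports full glob patterns.

```rust
use std::path::Path;

// Hypothetical stand-in for `util::paths::PathMatcher`: matches when the
// final component of a path equals the pattern (the real matcher takes globs).
struct SimpleMatcher(String);

impl SimpleMatcher {
    fn is_match(&self, path: &Path) -> bool {
        path.file_name().map_or(false, |name| name == self.0.as_str())
    }
}

// Mirrors the ancestor walk above: a path is excluded when any of its
// ancestors matches one of the exclusion patterns.
fn is_path_excluded(matchers: &[SimpleMatcher], path: &Path) -> bool {
    path.ancestors()
        .any(|ancestor| matchers.iter().any(|m| m.is_match(ancestor)))
}

fn main() {
    let matchers = [SimpleMatcher("node_modules".into())];
    // "one/node_modules" is an ancestor of the file, so the file is excluded.
    assert!(is_path_excluded(&matchers, Path::new("one/node_modules/b/b1.js")));
    assert!(!is_path_excluded(&matchers, Path::new("src/main.rs")));
    println!("ok");
}
```

Checking every ancestor (rather than only the file itself) is what lets a single `node_modules` pattern exclude everything nested beneath such a directory.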

View File

@@ -453,11 +453,9 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
// Open a file that is nested inside of a gitignored directory that
// has not yet been expanded.
let prev_read_dir_count = fs.read_dir_call_count();
-let (file, _, _) = tree
+let loaded = tree
.update(cx, |tree, cx| {
-tree.as_local_mut()
-.unwrap()
-.load_file("one/node_modules/b/b1.js".as_ref(), cx)
+tree.load_file("one/node_modules/b/b1.js".as_ref(), cx)
})
.await
.unwrap();
@@ -483,7 +481,10 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
]
);
-assert_eq!(file.path.as_ref(), Path::new("one/node_modules/b/b1.js"));
+assert_eq!(
+loaded.file.path.as_ref(),
+Path::new("one/node_modules/b/b1.js")
+);
// Only the newly-expanded directories are scanned.
assert_eq!(fs.read_dir_call_count() - prev_read_dir_count, 2);
@@ -492,11 +493,9 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
// Open another file in a different subdirectory of the same
// gitignored directory.
let prev_read_dir_count = fs.read_dir_call_count();
-let (file, _, _) = tree
+let loaded = tree
.update(cx, |tree, cx| {
-tree.as_local_mut()
-.unwrap()
-.load_file("one/node_modules/a/a2.js".as_ref(), cx)
+tree.load_file("one/node_modules/a/a2.js".as_ref(), cx)
})
.await
.unwrap();
@@ -524,7 +523,10 @@ async fn test_open_gitignored_files(cx: &mut TestAppContext) {
]
);
-assert_eq!(file.path.as_ref(), Path::new("one/node_modules/a/a2.js"));
+assert_eq!(
+loaded.file.path.as_ref(),
+Path::new("one/node_modules/a/a2.js")
+);
// Only the newly-expanded directory is scanned.
assert_eq!(fs.read_dir_call_count() - prev_read_dir_count, 1);
@@ -844,7 +846,7 @@ async fn test_write_file(cx: &mut TestAppContext) {
tree.flush_fs_events(cx).await;
tree.update(cx, |tree, cx| {
-tree.as_local().unwrap().write_file(
+tree.write_file(
Path::new("tracked-dir/file.txt"),
"hello".into(),
Default::default(),
@@ -854,7 +856,7 @@ async fn test_write_file(cx: &mut TestAppContext) {
.await
.unwrap();
tree.update(cx, |tree, cx| {
-tree.as_local().unwrap().write_file(
+tree.write_file(
Path::new("ignored-dir/file.txt"),
"world".into(),
Default::default(),

View File

@@ -103,6 +103,9 @@ zed_actions.workspace = true
[target.'cfg(target_os = "windows")'.build-dependencies]
winresource = "0.1"
+[target.'cfg(target_os = "linux")'.dependencies]
+ashpd.workspace = true
[dev-dependencies]
call = { workspace = true, features = ["test-support"] }
editor = { workspace = true, features = ["test-support"] }

View File

@@ -56,21 +56,58 @@ use crate::zed::inline_completion_registry;
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;
fn fail_to_launch(e: anyhow::Error) {
eprintln!("Zed failed to launch: {:?}", e);
App::new().run(move |cx| {
-let window = cx.open_window(gpui::WindowOptions::default(), |cx| cx.new_view(|_| gpui::Empty));
-window.update(cx, |_, cx| {
-let response = cx.prompt(gpui::PromptLevel::Critical, "Zed failed to launch", Some(&format!("{}\n\nFor help resolving this, please open an issue on https://github.com/zed-industries/zed", e)), &["Exit"]);
+if let Ok(window) = cx.open_window(gpui::WindowOptions::default(), |cx| cx.new_view(|_| gpui::Empty)) {
+window.update(cx, |_, cx| {
+let response = cx.prompt(gpui::PromptLevel::Critical, "Zed failed to launch", Some(&format!("{}\n\nFor help resolving this, please open an issue on https://github.com/zed-industries/zed", e)), &["Exit"]);
-cx.spawn(|_, mut cx| async move {
-response.await?;
-cx.update(|cx| {
-cx.quit()
-})
-}).detach_and_log_err(cx);
-}).log_err();
+cx.spawn(|_, mut cx| async move {
+response.await?;
+cx.update(|cx| {
+cx.quit()
+})
+}).detach_and_log_err(cx);
+}).log_err();
+} else {
+fail_to_open_window(e, cx)
+}
})
}
+fn fail_to_open_window_async(e: anyhow::Error, cx: &mut AsyncAppContext) {
+cx.update(|cx| fail_to_open_window(e, cx)).log_err();
+}
+fn fail_to_open_window(e: anyhow::Error, _cx: &mut AppContext) {
+eprintln!("Zed failed to open a window: {:?}", e);
+#[cfg(target_os = "linux")]
+{
+use ashpd::desktop::notification::{Notification, NotificationProxy, Priority};
+_cx.spawn(|cx| async move {
+let proxy = NotificationProxy::new().await?;
+let notification_id = "dev.zed.Oops";
+proxy
+.add_notification(
+notification_id,
+Notification::new("Zed failed to launch")
+.body(Some(format!("{:?}", e).as_str()))
+.priority(Priority::High)
+.icon(ashpd::desktop::Icon::with_names(&[
+"dialog-question-symbolic",
+])),
+)
+.await?;
+cx.update(|cx| {
+cx.quit();
+})
+})
+.detach();
+}
+}
enum AppMode {
Headless(DevServerToken),
Ui,
@@ -122,10 +159,6 @@ fn init_ui(app_state: Arc<AppState>, cx: &mut AppContext) -> Result<()> {
}
};
-if let Err(err) = cx.can_open_windows() {
-return Err(err);
-}
SystemAppearance::init(cx);
load_embedded_fonts(cx);
@@ -319,7 +352,11 @@ fn main() {
{
cx.spawn({
let app_state = app_state.clone();
-|cx| async move { restore_or_create_workspace(app_state, cx).await }
+|mut cx| async move {
+if let Err(e) = restore_or_create_workspace(app_state, &mut cx).await {
+fail_to_open_window_async(e, &mut cx)
+}
+}
})
.detach();
}
@@ -421,7 +458,11 @@ fn main() {
init_ui(app_state.clone(), cx).unwrap();
cx.spawn({
let app_state = app_state.clone();
-|cx| async move { restore_or_create_workspace(app_state, cx).await }
+|mut cx| async move {
+if let Err(e) = restore_or_create_workspace(app_state, &mut cx).await {
+fail_to_open_window_async(e, &mut cx)
+}
+}
})
.detach();
}
@@ -448,13 +489,12 @@ fn handle_open_request(request: OpenRequest, app_state: Arc<AppState>, cx: &mut
let app_state = app_state.clone();
cx.spawn(move |cx| handle_cli_connection(connection, app_state, cx))
.detach();
return;
}
if let Err(e) = init_ui(app_state.clone(), cx) {
-log::error!("{}", e);
+fail_to_open_window(e, cx);
return;
}
};
let mut task = None;
if !request.open_paths.is_empty() {
@@ -478,48 +518,59 @@ fn handle_open_request(request: OpenRequest, app_state: Arc<AppState>, cx: &mut
if !request.open_channel_notes.is_empty() || request.join_channel.is_some() {
cx.spawn(|mut cx| async move {
-if let Some(task) = task {
-task.await?;
-}
-let client = app_state.client.clone();
-// we continue even if authentication fails as join_channel/ open channel notes will
-// show a visible error message.
-authenticate(client, &cx).await.log_err();
+let result = maybe!(async {
+if let Some(task) = task {
+task.await?;
+}
+let client = app_state.client.clone();
+// we continue even if authentication fails as join_channel/ open channel notes will
+// show a visible error message.
+authenticate(client, &cx).await.log_err();
-if let Some(channel_id) = request.join_channel {
-cx.update(|cx| {
-workspace::join_channel(
-client::ChannelId(channel_id),
-app_state.clone(),
-None,
-cx,
-)
-})?
-.await?;
-}
+if let Some(channel_id) = request.join_channel {
+cx.update(|cx| {
+workspace::join_channel(
+client::ChannelId(channel_id),
+app_state.clone(),
+None,
+cx,
+)
+})?
+.await?;
+}
-let workspace_window =
-workspace::get_any_active_workspace(app_state, cx.clone()).await?;
-let workspace = workspace_window.root_view(&cx)?;
+let workspace_window =
+workspace::get_any_active_workspace(app_state, cx.clone()).await?;
+let workspace = workspace_window.root_view(&cx)?;
-let mut promises = Vec::new();
-for (channel_id, heading) in request.open_channel_notes {
-promises.push(cx.update_window(workspace_window.into(), |_, cx| {
-ChannelView::open(
-client::ChannelId(channel_id),
-heading,
-workspace.clone(),
-cx,
-)
-.log_err()
-})?)
+let mut promises = Vec::new();
+for (channel_id, heading) in request.open_channel_notes {
+promises.push(cx.update_window(workspace_window.into(), |_, cx| {
+ChannelView::open(
+client::ChannelId(channel_id),
+heading,
+workspace.clone(),
+cx,
+)
+.log_err()
+})?)
+}
+future::join_all(promises).await;
+anyhow::Ok(())
+})
+.await;
+if let Err(err) = result {
+fail_to_open_window_async(err, &mut cx);
+}
-future::join_all(promises).await;
-anyhow::Ok(())
-})
-.detach_and_log_err(cx);
+.detach()
} else if let Some(task) = task {
-task.detach_and_log_err(cx)
+cx.spawn(|mut cx| async move {
+if let Err(err) = task.await {
+fail_to_open_window_async(err, &mut cx);
+}
+})
+.detach();
}
}
@@ -562,41 +613,39 @@ async fn installation_id() -> Result<(String, bool)> {
Ok((installation_id, false))
}
-async fn restore_or_create_workspace(app_state: Arc<AppState>, cx: AsyncAppContext) {
-maybe!(async {
-let restore_behaviour =
-cx.update(|cx| WorkspaceSettings::get(None, cx).restore_on_startup)?;
-let location = match restore_behaviour {
-workspace::RestoreOnStartupBehaviour::LastWorkspace => {
-workspace::last_opened_workspace_paths().await
-}
-_ => None,
-};
-if let Some(location) = location {
-cx.update(|cx| {
-workspace::open_paths(
-location.paths().as_ref(),
-app_state,
-workspace::OpenOptions::default(),
-cx,
-)
-})?
-.await
-.log_err();
-} else if matches!(KEY_VALUE_STORE.read_kvp(FIRST_OPEN), Ok(None)) {
-cx.update(|cx| show_welcome_view(app_state, cx)).log_err();
-} else {
-cx.update(|cx| {
-workspace::open_new(app_state, cx, |workspace, cx| {
-Editor::new_file(workspace, &Default::default(), cx)
-})
-.detach();
-})?;
+async fn restore_or_create_workspace(
+app_state: Arc<AppState>,
+cx: &mut AsyncAppContext,
+) -> Result<()> {
+let restore_behaviour = cx.update(|cx| WorkspaceSettings::get(None, cx).restore_on_startup)?;
+let location = match restore_behaviour {
+workspace::RestoreOnStartupBehaviour::LastWorkspace => {
+workspace::last_opened_workspace_paths().await
+}
-anyhow::Ok(())
-})
-.await
-.log_err();
+_ => None,
+};
+if let Some(location) = location {
+cx.update(|cx| {
+workspace::open_paths(
+location.paths().as_ref(),
+app_state,
+workspace::OpenOptions::default(),
+cx,
+)
+})?
+.await?;
+} else if matches!(KEY_VALUE_STORE.read_kvp(FIRST_OPEN), Ok(None)) {
+cx.update(|cx| show_welcome_view(app_state, cx))?.await?;
+} else {
+cx.update(|cx| {
+workspace::open_new(app_state, cx, |workspace, cx| {
+Editor::new_file(workspace, &Default::default(), cx)
+})
+})?
+.await?;
+}
+Ok(())
}
fn init_paths() -> anyhow::Result<()> {

View File

@@ -1314,7 +1314,8 @@ mod tests {
Editor::new_file(workspace, &Default::default(), cx)
})
})
-.await;
+.await
+.unwrap();
cx.run_until_parked();
let workspace = cx

View File

@@ -81,3 +81,48 @@ Ruby LSP uses pull-based diagnostics which Zed doesn't support yet. We can tell
}
}
```
## Using the Tailwind CSS Language Server with Ruby
It's possible to use the [Tailwind CSS Language Server](https://github.com/tailwindlabs/tailwindcss-intellisense/tree/HEAD/packages/tailwindcss-language-server#readme) in Ruby and ERB files.
To do that, configure the language server so that it knows where to look for CSS classes in Ruby/ERB files by adding the following to your `settings.json`:
```json
{
"languages": {
"Ruby": {
"language_servers": ["tailwindcss-language-server", "..."]
}
},
"lsp": {
"tailwindcss-language-server": {
"settings": {
"includeLanguages": {
"erb": "html",
"ruby": "html"
},
"experimental": {
"classRegex": ["\\bclass:\\s*['\"]([^'\"]*)['\"]"]
}
}
}
}
}
```
With these settings you will get completions for Tailwind CSS classes in HTML attributes inside ERB files, and inside Ruby/ERB strings that follow a `class:` key. Examples:
```ruby
# Ruby file:
def method
div(class: "pl-2 <completion here>") do
p(class: "mt-2 <completion here>") { "Hello World" }
end
end
# ERB file:
<%= link_to "Hello", "/hello", class: "pl-2 <completion here>" %>
<a href="/hello" class="pl-2 <completion here>">Hello</a>
```
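As a rough illustration of what that `classRegex` entry matches (a quoted class list following a `class:` key), here is a hand-rolled scan over a source string. `class_lists` is a hypothetical helper written for this example; the actual matching in tailwindcss-language-server is done with the configured regex.

```rust
// Hypothetical scan: collect quoted class lists that follow a `class:` key,
// mimicking the classRegex `\bclass:\s*['"]([^'"]*)['"]` from the settings above.
fn class_lists(source: &str) -> Vec<&str> {
    let mut found = Vec::new();
    let mut rest = source;
    while let Some(idx) = rest.find("class:") {
        rest = &rest[idx + "class:".len()..];
        let trimmed = rest.trim_start();
        // Accept either a single- or double-quoted string right after the key.
        if let Some(quote) = trimmed.chars().next().filter(|c| *c == '"' || *c == '\'') {
            let body = &trimmed[1..];
            if let Some(end) = body.find(quote) {
                found.push(&body[..end]);
                rest = &body[end + 1..];
            }
        }
    }
    found
}

fn main() {
    let ruby = r#"div(class: "pl-2 mt-4") { p(class: 'font-bold') }"#;
    assert_eq!(class_lists(ruby), vec!["pl-2 mt-4", "font-bold"]);
    println!("ok");
}
```

Each captured string is the span where the language server offers Tailwind class completions.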

View File

@@ -48,12 +48,14 @@ g < The same, but backwards
g a Add a visual selection for every copy of the current word
# Pane management
g / Open a project-wide search
+g <space> Open the current search excerpt
+<ctrl-w> <space> Open the current search excerpt in a split
<ctrl-w> g d Go to definition in a split
<ctrl-w> g D Go to type definition in a split
# Insert mode
i a / a a Select the function argument the cursor is in
ctrl-x ctrl-o Open the completion menu
ctrl-x ctrl-c Request GitHub Copilot suggestion (if configured)
ctrl-x ctrl-a Open the inline AI assistant (if configured)

View File

@@ -18,9 +18,3 @@
(module
"module" @context
name: (_) @name) @item
-; Minitest/RSpec
-(call
-method: (identifier) @run (#any-of? @run "describe" "context" "test")
-arguments: (argument_list . (_) @name)
-) @item

View File

@@ -10,16 +10,16 @@
(constant) @run
(scope_resolution scope: (constant) name: (constant) @run)
]
-(superclass (scope_resolution) @superclass (#match? @superclass "(::IntegrationTest|::TestCase|::SystemTestCase)$"))
-) @minitest-test
+(superclass (scope_resolution) @superclass (#match? @superclass "(::IntegrationTest|::TestCase|::SystemTestCase|Minitest::Test)$"))
+) @_minitest-test
(#set! tag minitest-test)
)
(
(call
method: (identifier) @run (#eq? @run "test")
-arguments: (argument_list (string (string_content) @name))
-) @minitest-test
+arguments: (argument_list (string (string_content) @_name))
+) @_minitest-test
(#set! tag minitest-test)
)
@@ -27,7 +27,7 @@
(
(method
name: (identifier) @run (#match? @run "^test_")
-) @minitest-test
+) @_minitest-test
(#set! tag minitest-test)
)
@@ -35,27 +35,17 @@
(
(class
name: (constant) @run (superclass) @superclass (#match? @superclass "(ApplicationSystemTestCase)$")
-) @minitest-test
+) @_minitest-test
(#set! tag minitest-test)
)
; RSpec
-; Example groups with literals
-(
-(call
-method: (identifier) @run (#any-of? @run "describe" "context")
-arguments: (argument_list . (_) @name)
-) @rspec-test
-(#set! tag rspec-test)
-)
-; Examples
(
(call
-method: (identifier) @run (#any-of? @run "it" "its" "specify")
-arguments: (argument_list (string (string_content) @name))
-) @rspec-test
+method: (identifier) @run (#any-of? @run "describe" "context" "it" "its" "specify")
+arguments: (argument_list . (_) @_name)
+) @_rspec-test
(#set! tag rspec-test)
)
@@ -63,8 +53,8 @@
(
(call
method: (identifier) @run (#any-of? @run "it" "its" "specify")
-block: (_) @name
+block: (_) @_name
!arguments
-) @rspec-test
+) @_rspec-test
(#set! tag rspec-test)
)

View File

@@ -1,14 +1,8 @@
[
{
-"label": "test $ZED_SYMBOL",
-"command": "ruby",
-"args": ["-Itest", "$ZED_FILE", "--name", "\"/$ZED_SYMBOL/\""],
-"tags": ["minitest-test"]
-},
-{
-"label": "rspec $ZED_SYMBOL",
+"label": "rspec $ZED_RELATIVE_FILE:$ZED_ROW",
"command": "./bin/rspec",
-"args": ["\"$ZED_FILE:$ZED_ROW\""],
+"args": ["\"$ZED_RELATIVE_FILE:$ZED_ROW\""],
"tags": ["rspec-test"]
}
]
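The `$ZED_RELATIVE_FILE` and `$ZED_ROW` placeholders above are expanded with values from the active editor before the task runs. A toy sketch of that kind of substitution (hypothetical, not Zed's implementation):

```rust
// Toy sketch of task-variable expansion: each known variable name is replaced
// with its value in a template string. Hypothetical, not Zed's actual code.
fn substitute(template: &str, vars: &[(&str, &str)]) -> String {
    let mut out = template.to_string();
    for &(name, value) in vars {
        out = out.replace(name, value);
    }
    out
}

fn main() {
    let vars = [
        ("$ZED_RELATIVE_FILE", "spec/models/user_spec.rb"),
        ("$ZED_ROW", "12"),
    ];
    let arg = substitute("\"$ZED_RELATIVE_FILE:$ZED_ROW\"", &vars);
    assert_eq!(arg, "\"spec/models/user_spec.rb:12\"");
    println!("{arg}");
}
```

Note that naive string replacement like this breaks if one variable name is a prefix of another, so a real implementation would tokenize on variable boundaries instead.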