Compare commits


73 Commits

Author SHA1 Message Date
Nate Butler
3772a467e0 Remove logs 2024-10-07 15:14:54 -04:00
Nate Butler
1be27e7270 wip first approach at advanced entries 2024-10-07 15:14:34 -04:00
Marshall Bowers
7d380e9e18 Temporarily prevent deploying collab to production (#18825)
This PR adds a temporary measure to prevent deploying collab to
production, while we investigate some issues stemming from the HTTP
client change.

Release Notes:

- N/A
2024-10-07 14:31:23 -04:00
Piotr Osiewicz
60c12a8d06 ssh: Remove old dev servers code paths (#18823)
Closes #ISSUE

Release Notes:

- N/A
2024-10-07 19:18:44 +02:00
Marshall Bowers
11206a8444 ui: Fix avatar indicators getting cut off (#18821)
This PR fixes an issue introduced in #18810 that was causing the avatar
indicators to get cut off.

Release Notes:

- N/A
2024-10-07 12:53:11 -04:00
Marshall Bowers
c83690ff14 storybook: Wire up HTTP client (#18818)
This PR wires up the HTTP client in the Storybook.

Release Notes:

- N/A
2024-10-07 12:29:10 -04:00
Marshall Bowers
d1a758708d php: Bump to v0.2.1 (#18815)
This PR bumps the PHP extension to v0.2.1.

Changes:

- https://github.com/zed-industries/zed/pull/18368
- https://github.com/zed-industries/zed/pull/18774

Release Notes:

- N/A
2024-10-07 10:23:16 -04:00
Marshall Bowers
7c7151551a proto: Bump to v0.2.0 (#18814)
This PR bumps the Protobuf extension to v0.2.0.

Changes:

- https://github.com/zed-industries/zed/pull/18763

Release Notes:

- N/A
2024-10-07 10:11:12 -04:00
Bennet Bo Fenner
a3b63448df ssh: Do not cancel connection process if user is typing password (#18812)
Previously, the connection process would be cancelled after 10 seconds,
even if the connection was established successfully but the user was
still typing in a password.
We now recognize when the user is prompted for a password and cancel
the timeout task.

Co-Authored-by: Thorsten <thorsten@zed.dev>

Release Notes:

- N/A

---------

Co-authored-by: Thorsten <thorsten@zed.dev>
2024-10-07 15:53:32 +02:00
Nate Butler
65c9b15796 Remove avatar shape (#18810)
This PR re-removes `AvatarShape` as it is unused. The previous time it
was removed incorrectly, resulting in square avatars!

Release Notes:

- N/A
2024-10-07 09:23:40 -04:00
Bennet Bo Fenner
25a97a6a2b ssh: Detect timeouts when server is unresponsive (#18808)
To detect connection timeouts, we ping the remote server every X seconds
and attempt to reconnect if the server fails to respond.
Next up is showing some feedback in the UI to make this visible to the
user, and stopping reconnection attempts after X retries.
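A minimal sketch of the heartbeat idea (an illustration, not the actual Zed implementation; the `Heartbeat` type and its methods are hypothetical): record when the last response arrived, and treat the connection as timed out once the server has been silent for longer than the allowed window.

```rust
use std::time::{Duration, Instant};

// Hypothetical sketch of ping-based timeout detection.
struct Heartbeat {
    last_pong: Instant,
    timeout: Duration,
}

impl Heartbeat {
    fn new(timeout: Duration) -> Self {
        Self { last_pong: Instant::now(), timeout }
    }

    // Called whenever the server answers a ping.
    fn pong_received(&mut self) {
        self.last_pong = Instant::now();
    }

    // True once the server has been silent longer than the window,
    // signalling that a reconnect attempt should start.
    fn timed_out(&self) -> bool {
        self.last_pong.elapsed() > self.timeout
    }
}

fn main() {
    let mut hb = Heartbeat::new(Duration::from_secs(5));
    hb.pong_received();
    assert!(!hb.timed_out());
}
```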

Release Notes:

- N/A

---------

Co-authored-by: Thorsten <thorsten@zed.dev>
2024-10-07 15:08:16 +02:00
Piotr Osiewicz
5aa165c530 ssh: Overhaul remoting UI (#18727)
Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <67129314+danilo-leal@users.noreply.github.com>
2024-10-07 15:01:50 +02:00
Thorsten Ball
9c5bec5efb formatting: Use project environment to find external formatters (#18611)
Closes #18261

This makes sure that we find external formatters in the project
environment.

TODO:

- [x] Use a different type for the triplet of `(buffer_handle,
buffer_path, buffer_env)`. Something like `FormattableBuffer`.
- [x] Test this!!

Release Notes:

- Fixed external formatters not being found, even when they were
available in the `$PATH` of a project.

---------

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-07 12:24:12 +02:00
Thorsten Ball
c03b8d6c48 ssh remoting: Enable reconnecting after connection losses (#18586)
Release Notes:

- N/A

---------

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-07 11:40:59 +02:00
Danilo Leal
67fbdbbed6 Put back code that makes the avatar rounded (#18799)
Follow-up to https://github.com/zed-industries/zed/pull/18768

---

Release Notes:

- N/A
2024-10-07 05:42:48 -03:00
Piotr Osiewicz
03c84466c2 chore: Fix some violations of 'needless_pass_by_ref_mut' lint (#18795)
While this lint is allow-by-default, it seems pretty useful to get rid
of mutable borrows when they're not needed.
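For illustration, the kind of code this lint flags (a hypothetical example, not taken from this PR): a `&mut` parameter that is never mutated can be downgraded to a shared borrow.

```rust
// Flagged by `needless_pass_by_ref_mut`: `v` is never mutated.
// fn sum(v: &mut Vec<i64>) -> i64 { v.iter().sum() }

// Preferred: a shared (and more general) slice borrow instead.
fn sum(v: &[i64]) -> i64 {
    v.iter().sum()
}

fn main() {
    assert_eq!(sum(&[1, 2, 3]), 6);
}
```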

Closes #ISSUE

Release Notes:

- N/A
2024-10-07 01:29:58 +02:00
Agustin Gomes
59f0f4ac42 Fix script/linux on RHEL/Fedora (#18788)
- Add `/etc/os-release`, which was missing from a grep call
- Remove a `grep grep` typo from another

Co-authored-by: Peter Tripp <peter@zed.dev>
2024-10-06 14:47:48 -04:00
Peter Tripp
bd746145b0 ci: Make docs-only PRs only trigger docs-related tests (#18744)
This should speed up docs-only PRs so that they don't have to run the full 5-minute battery of tests.

Release Notes:

- N/A
2024-10-06 10:28:39 -04:00
Peter Tripp
1b06c70a76 Fix alt-t context (#18783)
- Fix incorrect context introduced in https://github.com/zed-industries/zed/pull/18749/

Release Notes:

- N/A
2024-10-06 10:26:26 -04:00
Peter
06bd2431d2 proto: Add language server support (#18763)
Closes #18762

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-06 10:12:06 -04:00
Roman Zipp
200b2bf70a php: Add syntax highlighting for Intelephense completions (#18774)
Release Notes:

- N/A

This PR introduces syntax highlighting for Intelephense autocompletion.
The styling was selected to roughly match PHPStorm's default scheme.

Please note that I'm not very familiar with writing Rust, but I'm happy
to adapt to any requested changes!

## Examples

### Object attributes, methods and constants

![Screenshot 2024-10-06 at 13 38
03](https://github.com/user-attachments/assets/a91634ff-0f2e-41f0-b548-ecb09c40947c)
![Screenshot 2024-10-06 at 13 38
11](https://github.com/user-attachments/assets/b6f179f4-898b-4d82-9d36-a3e82328325c)

### Typed enum members

![Screenshot 2024-10-06 at 13 38
53](https://github.com/user-attachments/assets/7133b981-4f68-4210-b233-403cdf3ec9bb)
![Screenshot 2024-10-06 at 13 38
41](https://github.com/user-attachments/assets/2e806f3d-3538-45f2-b075-b8be5902b786)

### Variables

Includes altered highlighting for [reserved variable
names](https://www.php.net/manual/en/reserved.variables.php).

![Screenshot 2024-10-06 at 13 39
30](https://github.com/user-attachments/assets/be426eb8-5879-432d-b302-391c2c68a7cb)

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-06 10:11:21 -04:00
Nate Butler
8376dd2011 ui crate docs & spring cleaning (#18768)
Similar to https://github.com/zed-industries/zed/pull/18690 &
https://github.com/zed-industries/zed/pull/18695, this PR enables
required docs for `ui` and does some cleanup.

Changes:
- Enables the `deny(missing_docs)` crate-wide.
- Adds `allow(missing_docs)` on many modules until folks pick them up to
document them
- Documents some modules (all in `ui/src/styles`)
- Crate root-level organization: Traits move to `traits`, other misc
organization
- Cleaned out a bunch of unused code.

Note: I'd like to remove `utils/format_distance` but the assistant panel
uses it. To move it over to use the `time_format` crate we may need to
update it to use `time` instead of `chrono`. Needs more investigation.

Release Notes:

- N/A
2024-10-05 23:28:34 -04:00
Chris Boette
c9bee9f81f docs: Note the need for Rust when developing extensions (#18753) 2024-10-05 12:26:28 -04:00
Kirill Bulatov
1f31022cbe Compare migrations formatted uniformly (#18760)
Otherwise old migrations may be formatted differently than new
migrations, causing comparison errors.

Follow-up of https://github.com/zed-industries/zed/pull/18676

Release Notes:

- N/A
2024-10-05 12:58:45 +03:00
Peter Tripp
7608000df8 Fix option-t and option-shift-t in terminal (#18749) 2024-10-04 16:56:01 -04:00
Remco Smits
8f27ffda4d gpui: Fix uniform list horizon offset for non-horizontal scrollable lists (#18748)
Closes #18739

/cc @osiewicz 
/cc @maxdeviant 

I'm not sure why the `+ padding.left` was added, but it was the cause of
the issue. I also tested removing the extra left padding and didn't see
a difference inside the project panel, so maybe we can even remove it?

**Before:**
![Screenshot 2024-10-04 at 21 43
34](https://github.com/user-attachments/assets/b5d67cd9-f92b-4301-880c-d351fe156c98)

**After:**
<img width="294" alt="Screenshot 2024-10-04 at 21 49 05"
src="https://github.com/user-attachments/assets/8cc84170-a86b-46b8-91c9-39def64f0bd0">

Release Notes:

- Fixed the code action list not being horizontally aligned correctly
2024-10-04 23:07:58 +03:00
Marshall Bowers
cee019b1ea editor: Qualify RangeExt::overlaps call to prevent phantom diagnostics (#18743)
This PR qualifies a call to `RangeExt::overlaps` to avoid some confusion
in rust-analyzer not being able to distinguish between
`RangeExt::overlaps` and `AnchorRangeExt::overlaps` and producing
phantom diagnostics.

We may also want to consider renaming the method on `AnchorRangeExt` to
disambiguate them.

Release Notes:

- N/A
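For illustration, the disambiguation technique in miniature (hypothetical traits, not Zed's actual types): when two traits in scope both define `overlaps`, a fully qualified call names the intended trait explicitly.

```rust
use std::ops::Range;

trait RangeExt {
    fn overlaps(&self, other: &Self) -> bool;
}

trait AnchorRangeExt {
    fn overlaps(&self, other: &Self) -> bool;
}

impl RangeExt for Range<usize> {
    fn overlaps(&self, other: &Self) -> bool {
        self.start < other.end && other.start < self.end
    }
}

impl AnchorRangeExt for Range<usize> {
    fn overlaps(&self, other: &Self) -> bool {
        self.start < other.end && other.start < self.end
    }
}

fn main() {
    let (a, b) = (0..5usize, 3..8usize);
    // `a.overlaps(&b)` would be ambiguous here, since both traits
    // apply. Qualifying with the trait name makes resolution explicit
    // for both rustc and rust-analyzer.
    assert!(RangeExt::overlaps(&a, &b));
}
```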
2024-10-04 15:06:05 -04:00
Boris Cherny
01ad22683d telemetry: Add language_name and model_provider (#18640)
This PR adds a bit more metadata for assistant logging.

Release Notes:

- Assistant: Added `language_name` and `model_provider` fields to
telemetry events.

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
Co-authored-by: Max <max@zed.dev>
2024-10-04 14:37:27 -04:00
Peter Tripp
dfe1e43832 docs: Linux XDG desktop secrets portals 2024-10-04 14:13:07 -04:00
Marshall Bowers
e3a6f89e2d Make report_assistant_event take an AssistantEvent struct (#18741)
This PR makes the `report_assistant_event` method take an
`AssistantEvent` struct instead of all of the struct fields as
individual parameters.

Release Notes:

- N/A
2024-10-04 13:19:18 -04:00
Peter Tripp
07e808d16f Document File Scan Exclusions (#18738)
Release Notes:

- N/A
2024-10-04 12:07:43 -04:00
Muhammad Talal Anwar
2f7430af70 c: Add runnable for main function (#18720)
Release Notes:

- Added a runnable for the C `main` function

This tag can then be used in tasks, for example:

```json
[
  {
    "label": "Run ${ZED_STEM}",
    "command": "gcc",
    "args": [
      "$ZED_FILE",
      "-o",
      "${ZED_DIRNAME}/${ZED_STEM}.out",
      "&&",
      "${ZED_DIRNAME}/${ZED_STEM}.out"
    ],
    "tags": ["c-main"]
  }
]

```
2024-10-04 17:28:12 +02:00
renovate[bot]
d012e35b04 Update Rust crate parking to v2.2.1 (#18664)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [parking](https://redirect.github.com/smol-rs/parking) | dependencies
| patch | `2.2.0` -> `2.2.1` |

---

### Release Notes

<details>
<summary>smol-rs/parking (parking)</summary>

###
[`v2.2.1`](https://redirect.github.com/smol-rs/parking/blob/HEAD/CHANGELOG.md#Version-221)

[Compare
Source](https://redirect.github.com/smol-rs/parking/compare/v2.2.0...v2.2.1)

- Specify the reason for using `parking` in the docs.
([#&#8203;25](https://redirect.github.com/smol-rs/parking/issues/25))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-04 11:09:20 -04:00
Daste
d695de4504 tab_switcher: Use git-aware colors for file icons (#18733)
Release Notes:

- Fixed tab switcher icons not respecting the `tabs.git_status` setting.

Fixes an issue mentioned in
https://github.com/zed-industries/zed/pull/17115#issuecomment-2378966170
- file icons in the tab switcher weren't colored according to git
status, even if `tabs.git_status` was set to true.

I used a similar approach I saw in other places of the project to get
the project entry and its git status, but maybe we could move the
coloring logic entirely to `tab_icon()`? Would that break anything?

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-04 10:37:41 -04:00
renovate[bot]
9702310737 Update Rust crate sqlformat to v0.2.6 (#18676)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [sqlformat](https://redirect.github.com/shssoichiro/sqlformat-rs) |
dependencies | patch | `0.2.4` -> `0.2.6` |

---

### Release Notes

<details>
<summary>shssoichiro/sqlformat-rs (sqlformat)</summary>

###
[`v0.2.6`](https://redirect.github.com/shssoichiro/sqlformat-rs/blob/HEAD/CHANGELOG.md#Version-026)

[Compare
Source](https://redirect.github.com/shssoichiro/sqlformat-rs/compare/v0.2.5...v0.2.6)

- fix: ON UPDATE with too many blanks formatted incorrectly
([#&#8203;46](https://redirect.github.com/shssoichiro/sqlformat-rs/issues/46))
- fix: `EXCEPT` not handled well
- fix: REFERENCES xyz ON UPDATE .. causes formatter to treat the
remaining as an UPDATE statement
- fix: Escaped strings formatted incorrectly
- fix: RETURNING is not placed on a new line
- fix: fix the issue of misaligned comments after formatting
([#&#8203;40](https://redirect.github.com/shssoichiro/sqlformat-rs/issues/40))

###
[`v0.2.5`](https://redirect.github.com/shssoichiro/sqlformat-rs/compare/v0.2.4...v0.2.5)

[Compare
Source](https://redirect.github.com/shssoichiro/sqlformat-rs/compare/v0.2.4...v0.2.5)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-04 10:36:30 -04:00
Piotr Osiewicz
bafd7ed000 gpui: Store measure functions as context of taffy nodes (#18732)
Taffy maintains a mapping of NodeId <-> Context anyways (and does the
lookup), so it's redundant for us to store it separately. Tl;dr: we get
rid of one map and one map lookup per layout request.

Release Notes:

- N/A
2024-10-04 13:58:57 +02:00
Piotr Osiewicz
37ded190cf gpui: Use taffy to retrieve the parent for a given layout node (#18730)
Again. https://github.com/zed-industries/zed/pull/4070

Let's see how it goes this time around. The only report on our Slack
that might've been related to that revert was about a crash in the
collab panel.

Release Notes:

- N/A
2024-10-04 12:37:55 +02:00
Piotr Osiewicz
a99750fd35 chore: Bump taffy to 0.5.2 (#18729)
Release Notes:

- N/A
2024-10-04 12:37:44 +02:00
Ömer Sinan Ağacan
e2647025ac Add vim::MoveTo{Next,Prev} flags for regex and case sensitive search (#18429)
This makes the hard-coded regex and case-sensitive search flags in
`vim::MoveToNext` and `vim::MoveToPrev` commands configurable in key
bindings.

Example:

```json
{
  "context": "VimControl && !menu",
  "bindings": {
    "*": ["vim::MoveToNext", { "regex": false, "caseSensitive": false }],
    "#": ["vim::MoveToPrev", { "regex": false, "caseSensitive": false }]
  }
}
```

Closes #15837.

Release Notes:

- Added `regex` and `caseSensitive` arguments to `vim::MoveToNext` and
`vim::MoveToPrev` commands, for toggling regex and case-sensitive
search.
2024-10-04 09:10:26 +02:00
Marshall Bowers
6635758009 vcs_menu: Streamline branch creation from branch selector (#18712)
This PR streamlines the branch creation from the branch selector when
searching for a branch that does not exist.

The branch selector will show the available branches, as it does today:

<img width="576" alt="Screenshot 2024-10-03 at 4 01 25 PM"
src="https://github.com/user-attachments/assets/e1904f5b-4aad-4f88-901d-ab9422ec18bb">

When entering the name of a branch that does not exist, the picker will
be populated with an entry to create a new branch:

<img width="570" alt="Screenshot 2024-10-03 at 4 01 37 PM"
src="https://github.com/user-attachments/assets/07f8d12c-9422-4fd8-a6dc-ae450e297a13">

Selecting that entry will create the branch and switch to it.

Release Notes:

- Streamlined creating a new branch from the branch selector.
2024-10-03 16:18:28 -04:00
Junkui Zhang
8d6fa9526e windows: Fix sometimes log error messages don't show the crate name (#18706)
On Windows, the path could be something like `C:\path\to\the\crate`.
Hence, `split('/')` doesn't work in this case.
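A sketch of the fix's idea (illustrative, not the actual Zed code; `last_component` is a hypothetical helper): split on both separators so Windows paths still yield a final component.

```rust
// Hypothetical helper: take the last path component, accepting both
// `/` and `\` as separators so Windows paths work too.
fn last_component(path: &str) -> &str {
    path.rsplit(|c| c == '/' || c == '\\')
        .next()
        .unwrap_or(path)
}

fn main() {
    assert_eq!(last_component(r"C:\path\to\the\crate"), "crate");
    assert_eq!(last_component("path/to/the/crate"), "crate");
}
```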

### Before

![Screenshot 2024-10-04
023652](https://github.com/user-attachments/assets/9c14fb24-5ee0-4b56-8fbd-313abb28f134)

### After

![Screenshot 2024-10-04
024115](https://github.com/user-attachments/assets/217e175c-b0e1-4589-9c3d-98670882b185)


Release Notes:

- N/A
2024-10-03 13:00:33 -07:00
Marshall Bowers
fd22c9bef9 editor: Use predefined rounding value for color swatches (#18708)
This PR updates the color swatches added in #18665 to use a predefined
`rounding` value instead of a literal value.

The underlying values are the same, but we don't want to diverge from
our design system.

Release Notes:

- N/A
2024-10-03 15:20:41 -04:00
Joseph T. Lyons
43d05a432b Close stale issues out after 7 days (#18707)
Closes #ISSUE

Release Notes:

- N/A
2024-10-03 14:38:49 -04:00
Jordan Pittman
cac98b7bbf Show color swatches for LSP completions (#18665)
Closes #11991

Release Notes:

- Added support for color swatches for language server completions.

<img width="502" alt="Screenshot 2024-10-02 at 19 02 22"
src="https://github.com/user-attachments/assets/57e85492-3760-461a-9b17-a846dc40576b">

<img width="534" alt="Screenshot 2024-10-02 at 19 02 48"
src="https://github.com/user-attachments/assets/713ac41c-16f0-4ad3-9103-d2c9b3fa8b2e">

This implementation is mostly a port of the VSCode version of the
ColorExtractor. It seems reasonable that we should support _at least_
what VSCode does for detecting color swatches from LSP completions.

This implementation could definitely be better perf-wise by writing a
dedicated color parser. I also think it would be neat if, in the future,
Zed handled _more_ color formats — especially wide-gamut colors.

There are a few differences from the regexes in the VSCode
implementation, mainly to simplify things:
- The hex vs rgb/hsl regexes were split into two parts
- The rgb/hsl regexes allow 3 or 4 color components whether or not the
format is hsla/rgba, and the parsing implementation accepts/rejects
colors as needed

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-03 14:38:17 -04:00
Marshall Bowers
cddd7875a4 Extract Protocol Buffers support into an extension (#18704)
This PR extracts the Protocol Buffers support into an extension.

Release Notes:

- Removed built-in support for Protocol Buffers, in favor of making it
available as an extension. The Protocol Buffers extension will be
suggested for download when you open a `.proto` file.
2024-10-03 13:37:43 -04:00
Nate Butler
8c95b8d89a theme crate spring cleaning (#18695)
This PR does some spring cleaning on the `theme` crate:

- Removed two unused stories and the story dep
- Removed the `one` theme family (from the `theme` crate, not the app),
this is now `zed_default_themes`.
- This will hopefully remove some confusion caused by this theme, which
we started in Rust but didn't end up using
- Removed `theme::prelude` (it just re-exported scale colors, which we
don't use outside `theme`)
- Removed completely unused `zed_pro` themes (we started on these during
the gpui2 port and didn't finish them.)

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-03 13:17:31 -04:00
Marshall Bowers
a9f816d5fb telemetry_events: Update crate-level docs (#18703)
This PR updates the `telemetry_events` crate to use module-level
documentation for its crate-level docs.

Release Notes:

- N/A
2024-10-03 12:38:51 -04:00
renovate[bot]
f7b3680e4d Update Rust crate pretty_assertions to v1.4.1 (#18668)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[pretty_assertions](https://redirect.github.com/rust-pretty-assertions/rust-pretty-assertions)
| workspace.dependencies | patch | `1.4.0` -> `1.4.1` |

---

### Release Notes

<details>
<summary>rust-pretty-assertions/rust-pretty-assertions
(pretty_assertions)</summary>

###
[`v1.4.1`](https://redirect.github.com/rust-pretty-assertions/rust-pretty-assertions/blob/HEAD/CHANGELOG.md#v141)

[Compare
Source](https://redirect.github.com/rust-pretty-assertions/rust-pretty-assertions/compare/v1.4.0...v1.4.1)

#### Fixed

- Show feature-flagged code in documentation. Thanks to
[@&#8203;sandydoo](https://redirect.github.com/sandydoo) for the fix!
([#&#8203;130](https://redirect.github.com/rust-pretty-assertions/rust-pretty-assertions/pull/130))

#### Internal

- Bump `yansi` version to `1.x`. Thanks to
[@&#8203;SergioBenitez](https://redirect.github.com/SergioBenitez) for
the update, and maintaining this library!
([#&#8203;121](https://redirect.github.com/rust-pretty-assertions/rust-pretty-assertions/pull/121))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-03 11:32:04 -04:00
renovate[bot]
ded3d3fc14 Update Python to v3.12.7 (#18652)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [python](https://redirect.github.com/containerbase/python-prebuild) |
dependencies | patch | `3.12.6` -> `3.12.7` |

---

### Release Notes

<details>
<summary>containerbase/python-prebuild (python)</summary>

###
[`v3.12.7`](https://redirect.github.com/containerbase/python-prebuild/releases/tag/3.12.7)

[Compare
Source](https://redirect.github.com/containerbase/python-prebuild/compare/3.12.6...3.12.7)

##### Bug Fixes

- **deps:** update dependency python to v3.12.7

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-03 11:29:29 -04:00
Danilo Leal
ddcd45bb45 docs: Add tweaks to the outline panel page (#18697)
Thought we could be extra clear here with the meaning of "singleton
buffers".

Release Notes:

- N/A
2024-10-03 12:27:42 -03:00
renovate[bot]
29796aa412 Update Rust crate serde_json to v1.0.128 (#18669)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [serde_json](https://redirect.github.com/serde-rs/json) | dependencies
| patch | `1.0.127` -> `1.0.128` |
| [serde_json](https://redirect.github.com/serde-rs/json) |
workspace.dependencies | patch | `1.0.127` -> `1.0.128` |

---

### Release Notes

<details>
<summary>serde-rs/json (serde_json)</summary>

###
[`v1.0.128`](https://redirect.github.com/serde-rs/json/releases/tag/1.0.128)

[Compare
Source](https://redirect.github.com/serde-rs/json/compare/1.0.127...1.0.128)

- Support serializing maps containing 128-bit integer keys to
serde_json::Value
([#&#8203;1188](https://redirect.github.com/serde-rs/json/issues/1188),
thanks [@&#8203;Mrreadiness](https://redirect.github.com/Mrreadiness))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-03 11:14:22 -04:00
Nate Butler
773ad6bfd1 Document the theme crate (#18690)
This PR enables required documentation for the `theme` crate and starts
documenting it.

The end goal is to have all meaningful documentation in the crate filled
out. However, I'm not sure that just adding `#![deny(missing_docs)]` to
the whole crate is the right approach.

I don't know that having 200+ "The color of the _ color" field docs is
useful. In the short term I've excluded some of the modules that contain
structs with a ton of fields (`colors`, `status`, etc.) until we decide
what the right solution is.

Next steps are to clean up the crate, removing unused modules or those
with low usage in favor of other approaches.

Changes in this PR:
- Enable the `deny(missing_docs)` lint for the `theme` crate 
- Start documenting a subset of the crate.
- Enable `#![allow(missing_docs)]` for some modules.


Release Notes:

- N/A
2024-10-03 10:27:19 -04:00
Danilo Leal
dc85378b96 Clean up style properties on hunk controls (#18639)
This PR removes some duplicate style properties on the hunk controls,
namely padding, border, and background color.

Release Notes:

- N/A
2024-10-03 11:23:56 -03:00
Kirill Bulatov
1e8297a469 Remove a debug dev config line (#18689)
Follow-up of https://github.com/zed-industries/zed/pull/18645

Release Notes:

- N/A
2024-10-03 15:38:42 +03:00
renovate[bot]
9cd42427d8 Update Rust crate thiserror to v1.0.64 (#18677)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [thiserror](https://redirect.github.com/dtolnay/thiserror) |
workspace.dependencies | patch | `1.0.63` -> `1.0.64` |

---

### Release Notes

<details>
<summary>dtolnay/thiserror (thiserror)</summary>

###
[`v1.0.64`](https://redirect.github.com/dtolnay/thiserror/releases/tag/1.0.64)

[Compare
Source](https://redirect.github.com/dtolnay/thiserror/compare/1.0.63...1.0.64)

- Exclude derived impls from coverage instrumentation
([#&#8203;322](https://redirect.github.com/dtolnay/thiserror/issues/322),
thanks [@&#8203;oxalica](https://redirect.github.com/oxalica))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-02 23:28:00 -04:00
Joseph T. Lyons
df21fe174d Add command palette action name to outline panel docs (#18678)
Release Notes:

- N/A
2024-10-02 22:16:56 -04:00
Joseph T. Lyons
c48d4dbc6b Add basic outline panel docs (#18674)
Bandaid to: https://github.com/zed-industries/zed/issues/18672

Release Notes:

- Added basic outline panel docs
2024-10-02 22:06:07 -04:00
Piotr Osiewicz
19b186671b ssh: Add session state indicator to title bar (#18645)
![image](https://github.com/user-attachments/assets/0ed6f59c-e0e7-49e6-8db7-f09ec5cdf653)
The indicator turns yellow when the SSH client is trying to reconnect. Note
that the state tracking is probably not ideal (we'll see how it pans out
once we start dog-fooding), but at the very least "green=good" should be
a decent mental model for now.

Release Notes:

- N/A
2024-10-03 00:35:56 +02:00
renovate[bot]
e2d613a803 Update Rust crate clap to v4.5.19 (#18660)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [clap](https://redirect.github.com/clap-rs/clap) |
workspace.dependencies | patch | `4.5.18` -> `4.5.19` |

---

### Release Notes

<details>
<summary>clap-rs/clap (clap)</summary>

###
[`v4.5.19`](https://redirect.github.com/clap-rs/clap/blob/HEAD/CHANGELOG.md#4519---2024-10-01)

[Compare
Source](https://redirect.github.com/clap-rs/clap/compare/v4.5.18...v4.5.19)

##### Internal

- Update dependencies

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-02 17:39:32 -04:00
Marshall Bowers
6f4385e737 Sort dependencies in Cargo.toml files (#18657)
This PR sorts the dependencies in various `Cargo.toml` files after
#18414.

Release Notes:

- N/A
2024-10-02 16:26:48 -04:00
Marshall Bowers
9565a90528 collab: Revert changes to Clickhouse event rows (#18654)
This PR reverts the changes to the Clickhouse event rows that were
included in https://github.com/zed-industries/zed/pull/18414.

The changes don't seem to be correct, as they make the row structs
differ from the underlying table schema.

Release Notes:

- N/A
2024-10-02 16:10:25 -04:00
Conrad Irwin
3a5deb5c6f Replace isahc with async ureq (#18414)
Replace isahc with ureq everywhere gpui is used.

This should allow us to make HTTP requests without libssl, and avoid a
long tail of panics caused by isahc.

Release Notes:

- (potentially breaking change) updated our http client

---------

Co-authored-by: Mikayla <mikayla@zed.dev>
2024-10-02 12:30:48 -07:00
renovate[bot]
f809787275 Update cloudflare/wrangler-action digest to 168bc28 (#18651)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[cloudflare/wrangler-action](https://redirect.github.com/cloudflare/wrangler-action)
| action | digest | `f84a562` -> `168bc28` |

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-02 15:23:22 -04:00
Kirill Bulatov
778dedec6c Prepare to sync other kinds of settings (#18616)
This PR does not change how things work for settings, but lays the
groundwork for future functionality.
After this change, Zed is prepared to sync more than just
`settings.json` files from local worktree and user config.

* ssh tasks

Part of this work is to streamline the task sync mechanism.
Instead of having an extra set of requests to fetch the task contents
from the server (as remote-via-collab does now, which does not cover all
sync cases), we want to reuse the existing mechanism for synchronizing
user and local settings.

* editorconfig

Part of the task is to sync .editorconfig file changes to everyone, which
involves sending and storing those configs.


Both ssh (and remote-over-collab) .zed/tasks.json and .editorconfig
files behave similarly to local .zed/settings.json files: they belong to a
certain path in a certain worktree; may update over time, changing Zed's
functionality; and can be merged hierarchically.
Settings sync follows the same "config file changed -> send to watchers
-> parse and merge locally and on watchers" path that's needed for both
new kinds of files, so the messaging layer is extended to send more
types of settings for future watch-and-merge implementations to follow.

Release Notes:

- N/A
2024-10-02 22:00:40 +03:00
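The hierarchical merge described above can be sketched with plain maps, where more local layers (e.g. a worktree's `.zed/settings.json`) override broader ones (the user config). The types and keys here are illustrative only, not Zed's actual settings representation:

```rust
use std::collections::BTreeMap;

// Merge settings layers in precedence order: later layers (more local)
// override earlier ones (more global), key by key.
fn merge_layers(layers: &[BTreeMap<String, String>]) -> BTreeMap<String, String> {
    let mut merged = BTreeMap::new();
    for layer in layers {
        for (key, value) in layer {
            merged.insert(key.clone(), value.clone());
        }
    }
    merged
}

fn main() {
    let mut user = BTreeMap::new();
    user.insert("tab_size".to_string(), "4".to_string());
    user.insert("theme".to_string(), "One Dark".to_string());

    let mut worktree = BTreeMap::new();
    worktree.insert("tab_size".to_string(), "2".to_string());

    let merged = merge_layers(&[user, worktree]);
    assert_eq!(merged["tab_size"], "2"); // local layer wins
    assert_eq!(merged["theme"], "One Dark"); // user value preserved
}
```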
Marshall Bowers
7c4615519b editor: Ensure proposed changes editor is syntax-highlighted when opened (#18648)
This PR fixes an issue where the proposed changes editor would not have
any syntax highlighting until a modification was made.

When creating the branch buffer we reparse the buffer to rebuild the
syntax map.

Release Notes:

- N/A
2024-10-02 14:23:59 -04:00
Marshall Bowers
0e8276560f language: Update buffer doc comments (#18646)
This PR updates the doc comments in `buffer.rs` to use the standard
style for linking to other items.

Release Notes:

- N/A
2024-10-02 14:10:19 -04:00
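The "standard style for linking to other items" refers to rustdoc intra-doc links, where backticked paths in square brackets resolve to the documented items. A small self-contained example with hypothetical types:

```rust
/// A snapshot of buffer state.
///
/// Produced by [`Buffer::snapshot`]; the square-bracket link resolves
/// to that method in the generated docs.
pub struct Snapshot {
    pub version: u64,
}

/// An editable text buffer.
pub struct Buffer {
    version: u64,
}

impl Buffer {
    /// Returns a [`Snapshot`] of the current state.
    pub fn snapshot(&self) -> Snapshot {
        Snapshot { version: self.version }
    }
}

fn main() {
    let buffer = Buffer { version: 7 };
    assert_eq!(buffer.snapshot().version, 7);
}
```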
Mikayla Maki
209ebb0c65 Revert "Fix blurry cursor on Wayland at a scale other than 100%" (#18642)
Closes #17771

Reverts zed-industries/zed#17496

This PR turned out to need more work than I thought when I merged it.

Release Notes:

- Linux: Fix a bug where the cursor would be the wrong size on Wayland
2024-10-02 10:44:16 -07:00
Danilo Leal
a5f50e5c1e Tweak warning diagnostic toggle (#18637)
This PR adds color to the warning diagnostic toggle, so that, if it's
turned on, the warning icon is yellow. And, in the opposite case, it's
muted.

| Turned on | Turned off |
|--------|--------|
| <img width="1136" alt="Screenshot 2024-10-02 at 6 08 30 PM"
src="https://github.com/user-attachments/assets/be64738b-4c14-41d4-b1d4-ad788cf9e72b">
| <img width="1136" alt="Screenshot 2024-10-02 at 6 08 36 PM"
src="https://github.com/user-attachments/assets/d144ff50-4bf6-4c23-925a-05bcbbcd8b9d">
|

---

Release Notes:

- N/A
2024-10-02 13:57:20 -03:00
Danilo Leal
5aaaed52fc Adjust spacing and sizing of buffer search bar icon buttons (#18638)
This PR mostly makes all of the search bar icon buttons all squared and
adjusts the spacing between them, as well as the additional input that
appears when you toggle the "Replace all" action.

<img width="900" alt="Screenshot 2024-10-02 at 6 08 30 PM"
src="https://github.com/user-attachments/assets/86d50a3b-94bd-4c6a-822e-5f7f7b2e2707">

---

Release Notes:

- N/A
2024-10-02 13:57:03 -03:00
Junseong Park
845991c0e5 docs: Add missing UI font settings to "Configuring Zed" (#18267)
- Add missing `ui_font` options in `configuring-zed.md`

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-02 12:35:35 -04:00
Marshall Bowers
167af4bc1d Use const over static for string literals (#18635)
I noticed a few places where we were storing `&'static str`s in
`static`s instead of `const`s.

This PR updates them to use `const`.

Release Notes:

- N/A
2024-10-02 12:33:13 -04:00
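The distinction is that a `const` is inlined at each use site, while a `static` is a single memory location with a stable address for the whole program; for plain string literals that never need a stable address, `const` is the idiomatic choice. A minimal illustration (names are hypothetical):

```rust
// `const`: the value is inlined wherever it is used.
const SCHEME_CONST: &str = "zed";
// `static`: one memory location with a fixed address; useful when the
// address itself matters, which it rarely does for string literals.
static SCHEME_STATIC: &str = "zed";

fn scheme() -> &'static str {
    // Both forms yield a `&'static str`, but `const` better expresses
    // "this is just a compile-time value".
    SCHEME_CONST
}

fn main() {
    assert_eq!(scheme(), "zed");
    assert_eq!(SCHEME_STATIC, "zed");
}
```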
Victor Roetman
2cd12f84de docs: Add FIPS mode error to Linux troubleshooting (#18407)
- Closes: #18335
Update linux.md with a workaround for the
```
crypto/fips/fips.c:154: OpenSSL internal error: FATAL FIPS SELFTEST FAILURE
```
error when using bundled libssl and libcrypto.

Co-authored-by: Peter Tripp <peter@zed.dev>
2024-10-02 12:18:41 -04:00
Joseph T Lyons
028d7a624f v0.157.x dev 2024-10-02 11:03:57 -04:00
248 changed files with 4945 additions and 3201 deletions

View File

@@ -7,9 +7,13 @@ on:
- "v[0-9]+.[0-9]+.x"
tags:
- "v*"
paths-ignore:
- "docs/**"
pull_request:
branches:
- "**"
paths-ignore:
- "docs/**"
concurrency:
# Allow only one workflow per any non-`main` branch.

View File

@@ -14,7 +14,7 @@ jobs:
stale-issue-message: >
Hi there! 👋
We're working to clean up our issue tracker by closing older issues that might not be relevant anymore. Are you able to reproduce this issue in the latest version of Zed? If so, please let us know by commenting on this issue and we will keep it open; otherwise, we'll close it in 10 days. Feel free to open a new issue if you're seeing this message after the issue has been closed.
We're working to clean up our issue tracker by closing older issues that might not be relevant anymore. Are you able to reproduce this issue in the latest version of Zed? If so, please let us know by commenting on this issue and we will keep it open; otherwise, we'll close it in 7 days. Feel free to open a new issue if you're seeing this message after the issue has been closed.
Thanks for your help!
close-issue-message: "This issue was closed due to inactivity; feel free to open a new issue if you're still experiencing this problem!"
@@ -23,7 +23,7 @@ jobs:
# 'community' to 'zed' repository. The migration added activity to all
# issues, preventing 365 days from working until then.
days-before-stale: 180
days-before-close: 10
days-before-close: 7
any-of-issue-labels: "defect,panic / crash"
operations-per-run: 1000
ascending: true

View File

@@ -36,28 +36,28 @@ jobs:
mdbook build ./docs --dest-dir=../target/deploy/docs/
- name: Deploy Docs
uses: cloudflare/wrangler-action@f84a562284fc78278ff9052435d9526f9c718361 # v3
uses: cloudflare/wrangler-action@168bc28b7078db16f6f1ecc26477fc2248592143 # v3
with:
apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
command: pages deploy target/deploy --project-name=docs
- name: Deploy Install
uses: cloudflare/wrangler-action@f84a562284fc78278ff9052435d9526f9c718361 # v3
uses: cloudflare/wrangler-action@168bc28b7078db16f6f1ecc26477fc2248592143 # v3
with:
apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
command: r2 object put -f script/install.sh zed-open-source-website-assets/install.sh
- name: Deploy Docs Workers
uses: cloudflare/wrangler-action@f84a562284fc78278ff9052435d9526f9c718361 # v3
uses: cloudflare/wrangler-action@168bc28b7078db16f6f1ecc26477fc2248592143 # v3
with:
apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
command: deploy .cloudflare/docs-proxy/src/worker.js
- name: Deploy Install Workers
uses: cloudflare/wrangler-action@f84a562284fc78278ff9052435d9526f9c718361 # v3
uses: cloudflare/wrangler-action@168bc28b7078db16f6f1ecc26477fc2248592143 # v3
with:
apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
accountId: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}

View File

@@ -3,7 +3,8 @@ name: Publish Collab Server Image
on:
push:
tags:
- collab-production
# Pause production deploys while we investigate an issue.
# - collab-production
- collab-staging
env:

View File

@@ -20,11 +20,14 @@ jobs:
with:
version: 9
- run: |
- name: Prettier Check on /docs
working-directory: ./docs
run: |
pnpm dlx prettier . --check || {
echo "To fix, run from the root of the zed repo:"
echo " cd docs && pnpm dlx prettier . --write && cd .."
false
}
working-directory: ./docs
- name: Check spelling
run: script/check-spelling docs/

691
Cargo.lock generated

File diff suppressed because it is too large

View File

@@ -52,7 +52,6 @@ members = [
"crates/indexed_docs",
"crates/inline_completion_button",
"crates/install_cli",
"crates/isahc_http_client",
"crates/journal",
"crates/language",
"crates/language_model",
@@ -88,6 +87,7 @@ members = [
"crates/remote",
"crates/remote_server",
"crates/repl",
"crates/reqwest_client",
"crates/rich_text",
"crates/rope",
"crates/rpc",
@@ -122,6 +122,7 @@ members = [
"crates/ui",
"crates/ui_input",
"crates/ui_macros",
"crates/ureq_client",
"crates/util",
"crates/vcs_menu",
"crates/vim",
@@ -153,6 +154,7 @@ members = [
"extensions/php",
"extensions/perplexity",
"extensions/prisma",
"extensions/proto",
"extensions/purescript",
"extensions/ruff",
"extensions/ruby",
@@ -175,6 +177,7 @@ members = [
default-members = ["crates/zed"]
[workspace.dependencies]
#
# Workspace member crates
#
@@ -220,7 +223,6 @@ go_to_line = { path = "crates/go_to_line" }
google_ai = { path = "crates/google_ai" }
gpui = { path = "crates/gpui" }
gpui_macros = { path = "crates/gpui_macros" }
handlebars = "4.3"
headless = { path = "crates/headless" }
html_to_markdown = { path = "crates/html_to_markdown" }
http_client = { path = "crates/http_client" }
@@ -228,7 +230,6 @@ image_viewer = { path = "crates/image_viewer" }
indexed_docs = { path = "crates/indexed_docs" }
inline_completion_button = { path = "crates/inline_completion_button" }
install_cli = { path = "crates/install_cli" }
isahc_http_client = { path = "crates/isahc_http_client" }
journal = { path = "crates/journal" }
language = { path = "crates/language" }
language_model = { path = "crates/language_model" }
@@ -265,6 +266,7 @@ release_channel = { path = "crates/release_channel" }
remote = { path = "crates/remote" }
remote_server = { path = "crates/remote_server" }
repl = { path = "crates/repl" }
reqwest_client = { path = "crates/reqwest_client" }
rich_text = { path = "crates/rich_text" }
rope = { path = "crates/rope" }
rpc = { path = "crates/rpc" }
@@ -299,6 +301,7 @@ title_bar = { path = "crates/title_bar" }
ui = { path = "crates/ui" }
ui_input = { path = "crates/ui_input" }
ui_macros = { path = "crates/ui_macros" }
ureq_client = { path = "crates/ureq_client" }
util = { path = "crates/util" }
vcs_menu = { path = "crates/vcs_menu" }
vim = { path = "crates/vim" }
@@ -318,6 +321,7 @@ any_vec = "0.14"
anyhow = "1.0.86"
arrayvec = { version = "0.7.4", features = ["serde"] }
ashpd = "0.9.1"
async-compat = "0.2.1"
async-compression = { version = "0.4", features = ["gzip", "futures-io"] }
async-dispatcher = "0.1"
async-fs = "1.6"
@@ -325,7 +329,7 @@ async-pipe = { git = "https://github.com/zed-industries/async-pipe-rs", rev = "8
async-recursion = "1.0.0"
async-tar = "0.5.0"
async-trait = "0.1"
async-tungstenite = "0.23"
async-tungstenite = "0.28"
async-watch = "0.3.1"
async_zip = { version = "0.0.17", features = ["deflate", "deflate64"] }
base64 = "0.22"
@@ -356,18 +360,15 @@ futures-batch = "0.6.1"
futures-lite = "1.13"
git2 = { version = "0.19", default-features = false }
globset = "0.4"
handlebars = "4.3"
heed = { version = "0.20.1", features = ["read-txn-no-tls"] }
hex = "0.4.3"
hyper = "0.14"
html5ever = "0.27.0"
hyper = "0.14"
ignore = "0.4.22"
image = "0.25.1"
indexmap = { version = "1.6.2", features = ["serde"] }
indoc = "2"
# We explicitly disable http2 support in isahc.
isahc = { version = "1.7.2", default-features = false, features = [
"text-decoding",
] }
itertools = "0.13.0"
jsonwebtoken = "9.3"
libc = "0.2"
@@ -382,9 +383,9 @@ ordered-float = "2.1.1"
palette = { version = "0.7.5", default-features = false, features = ["std"] }
parking_lot = "0.12.1"
pathdiff = "0.2"
profiling = "1"
postage = { version = "0.5", features = ["futures-traits"] }
pretty_assertions = "1.3.0"
profiling = "1"
prost = "0.9"
prost-build = "0.9"
prost-types = "0.9"
@@ -392,13 +393,14 @@ pulldown-cmark = { version = "0.12.0", default-features = false }
rand = "0.8.5"
regex = "1.5"
repair_json = "0.1.0"
reqwest = { git = "https://github.com/zed-industries/reqwest.git", rev = "fd110f6998da16bbca97b6dddda9be7827c50e29" }
rsa = "0.9.6"
runtimelib = { version = "0.15", default-features = false, features = [
"async-dispatcher-runtime",
] }
rustc-demangle = "0.1.23"
rust-embed = { version = "8.4", features = ["include-exclude"] }
rustls = "0.20.3"
rustls = "0.21.12"
rustls-native-certs = "0.8.0"
schemars = { version = "0.8", features = ["impl_json_schema"] }
semver = "1.0"
@@ -418,6 +420,7 @@ similar = "1.3"
simplelog = "0.12.2"
smallvec = { version = "1.6", features = ["union"] }
smol = "1.2"
sqlformat = "0.2"
strsim = "0.11"
strum = { version = "0.25.0", features = ["derive"] }
subtle = "2.5.0"
@@ -452,15 +455,14 @@ tree-sitter-html = "0.20"
tree-sitter-jsdoc = "0.23"
tree-sitter-json = "0.23"
tree-sitter-md = { git = "https://github.com/zed-industries/tree-sitter-markdown", rev = "4cfa6aad6b75052a5077c80fd934757d9267d81b" }
protols-tree-sitter-proto = { git = "https://github.com/zed-industries/tree-sitter-proto", rev = "0848bd30a64be48772e15fbb9d5ba8c0cc5772ad" }
tree-sitter-python = "0.23"
tree-sitter-regex = "0.23"
tree-sitter-ruby = "0.23"
tree-sitter-rust = "0.23"
tree-sitter-typescript = "0.23"
tree-sitter-yaml = { git = "https://github.com/zed-industries/tree-sitter-yaml", rev = "baff0b51c64ef6a1fb1f8390f3ad6015b83ec13a" }
unindent = "0.1.7"
tree-sitter-yaml = { git = "https://github.com/zed-industries/tree-sitter-yaml", rev = "baff0b51c64ef6a1fb1f8390f3ad6015b83ec13a" }
unicase = "2.6"
unindent = "0.1.7"
unicode-segmentation = "1.10"
url = "2.2"
uuid = { version = "1.1.2", features = ["v4", "v5", "serde"] }

View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-trash"><path d="M3 6h18"/><path d="M19 6v14c0 1-1 2-2 2H7c-1 0-2-1-2-2V6"/><path d="M8 6V4c0-1 1-2 2-2h4c1 0 2 1 2 2v2"/></svg>

After

Size: 330 B

View File

@@ -440,7 +440,12 @@
"cmd-k shift-right": ["workspace::SwapPaneInDirection", "Right"],
"cmd-k shift-up": ["workspace::SwapPaneInDirection", "Up"],
"cmd-k shift-down": ["workspace::SwapPaneInDirection", "Down"],
"cmd-shift-x": "zed::Extensions",
"cmd-shift-x": "zed::Extensions"
}
},
{
"context": "Workspace && !Terminal",
"bindings": {
"alt-t": "task::Rerun",
"alt-shift-t": "task::Spawn"
}

View File

@@ -1 +1,2 @@
allow-private-module-inception = true
avoid-breaking-exported-api = false

View File

@@ -77,7 +77,7 @@ use ui::TintColor;
use ui::{
prelude::*,
utils::{format_distance_from_now, DateTimeType},
Avatar, AvatarShape, ButtonLike, ContextMenu, Disclosure, ElevationIndex, KeyBinding, ListItem,
Avatar, ButtonLike, ContextMenu, Disclosure, ElevationIndex, KeyBinding, ListItem,
ListItemSpacing, PopoverMenu, PopoverMenuHandle, Tooltip,
};
use util::{maybe, ResultExt};
@@ -262,9 +262,7 @@ impl PickerDelegate for SavedContextPickerDelegate {
.gap_2()
.children(if let Some(host_user) = host_user {
vec![
Avatar::new(host_user.avatar_uri.clone())
.shape(AvatarShape::Circle)
.into_any_element(),
Avatar::new(host_user.avatar_uri.clone()).into_any_element(),
Label::new(format!("Shared by @{}", host_user.github_login))
.color(Color::Muted)
.size(LabelSize::Small)

View File

@@ -46,7 +46,7 @@ use std::{
sync::Arc,
time::{Duration, Instant},
};
use telemetry_events::{AssistantKind, AssistantPhase};
use telemetry_events::{AssistantEvent, AssistantKind, AssistantPhase};
use text::BufferSnapshot;
use util::{post_inc, ResultExt, TryFutureExt};
use uuid::Uuid;
@@ -549,7 +549,7 @@ impl Context {
cx: &mut ModelContext<Self>,
) -> Self {
let buffer = cx.new_model(|_cx| {
let mut buffer = Buffer::remote(
let buffer = Buffer::remote(
language::BufferId::new(1).unwrap(),
replica_id,
capability,
@@ -2133,14 +2133,21 @@ impl Context {
});
if let Some(telemetry) = this.telemetry.as_ref() {
telemetry.report_assistant_event(
Some(this.id.0.clone()),
AssistantKind::Panel,
AssistantPhase::Response,
model.telemetry_id(),
let language_name = this
.buffer
.read(cx)
.language()
.map(|language| language.name());
telemetry.report_assistant_event(AssistantEvent {
conversation_id: Some(this.id.0.clone()),
kind: AssistantKind::Panel,
phase: AssistantPhase::Response,
model: model.telemetry_id(),
model_provider: model.provider_id().to_string(),
response_latency,
error_message,
);
language_name,
});
}
if let Ok(stop_reason) = result {

View File

@@ -50,6 +50,7 @@ use std::{
task::{self, Poll},
time::{Duration, Instant},
};
use telemetry_events::{AssistantEvent, AssistantKind, AssistantPhase};
use terminal_view::terminal_panel::TerminalPanel;
use text::{OffsetRangeExt, ToPoint as _};
use theme::ThemeSettings;
@@ -209,18 +210,6 @@ impl InlineAssistant {
initial_prompt: Option<String>,
cx: &mut WindowContext,
) {
if let Some(telemetry) = self.telemetry.as_ref() {
if let Some(model) = LanguageModelRegistry::read_global(cx).active_model() {
telemetry.report_assistant_event(
None,
telemetry_events::AssistantKind::Inline,
telemetry_events::AssistantPhase::Invoked,
model.telemetry_id(),
None,
None,
);
}
}
let snapshot = editor.read(cx).buffer().read(cx).snapshot(cx);
let mut selections = Vec::<Selection<Point>>::new();
@@ -267,6 +256,21 @@ impl InlineAssistant {
text_anchor: buffer.anchor_after(buffer_range.end),
};
codegen_ranges.push(start..end);
if let Some(telemetry) = self.telemetry.as_ref() {
if let Some(model) = LanguageModelRegistry::read_global(cx).active_model() {
telemetry.report_assistant_event(AssistantEvent {
conversation_id: None,
kind: AssistantKind::Inline,
phase: AssistantPhase::Invoked,
model: model.telemetry_id(),
model_provider: model.provider_id().to_string(),
response_latency: None,
error_message: None,
language_name: buffer.language().map(|language| language.name()),
});
}
}
}
let assist_group_id = self.next_assist_group_id.post_inc();
@@ -761,23 +765,34 @@ impl InlineAssistant {
}
pub fn finish_assist(&mut self, assist_id: InlineAssistId, undo: bool, cx: &mut WindowContext) {
if let Some(telemetry) = self.telemetry.as_ref() {
if let Some(model) = LanguageModelRegistry::read_global(cx).active_model() {
telemetry.report_assistant_event(
None,
telemetry_events::AssistantKind::Inline,
if undo {
telemetry_events::AssistantPhase::Rejected
} else {
telemetry_events::AssistantPhase::Accepted
},
model.telemetry_id(),
None,
None,
);
}
}
if let Some(assist) = self.assists.get(&assist_id) {
if let Some(telemetry) = self.telemetry.as_ref() {
if let Some(model) = LanguageModelRegistry::read_global(cx).active_model() {
let language_name = assist.editor.upgrade().and_then(|editor| {
let multibuffer = editor.read(cx).buffer().read(cx);
let ranges = multibuffer.range_to_buffer_ranges(assist.range.clone(), cx);
ranges
.first()
.and_then(|(buffer, _, _)| buffer.read(cx).language())
.map(|language| language.name())
});
telemetry.report_assistant_event(AssistantEvent {
conversation_id: None,
kind: AssistantKind::Inline,
phase: if undo {
AssistantPhase::Rejected
} else {
AssistantPhase::Accepted
},
model: model.telemetry_id(),
model_provider: model.provider_id().to_string(),
response_latency: None,
error_message: None,
language_name,
});
}
}
let assist_group_id = assist.group_id;
if self.assist_groups[&assist_group_id].linked {
for assist_id in self.unlink_assist_group(assist_group_id, cx) {
@@ -2706,6 +2721,7 @@ impl CodegenAlternative {
self.edit_position = Some(self.range.start.bias_right(&self.snapshot));
let telemetry_id = model.telemetry_id();
let provider_id = model.provider_id();
let chunks: LocalBoxFuture<Result<BoxStream<Result<String>>>> =
if user_prompt.trim().to_lowercase() == "delete" {
async { Ok(stream::empty().boxed()) }.boxed_local()
@@ -2716,7 +2732,7 @@ impl CodegenAlternative {
.spawn(|_, cx| async move { model.stream_completion_text(request, &cx).await });
async move { Ok(chunks.await?.boxed()) }.boxed_local()
};
self.handle_stream(telemetry_id, chunks, cx);
self.handle_stream(telemetry_id, provider_id.to_string(), chunks, cx);
Ok(())
}
@@ -2780,6 +2796,7 @@ impl CodegenAlternative {
pub fn handle_stream(
&mut self,
model_telemetry_id: String,
model_provider_id: String,
stream: impl 'static + Future<Output = Result<BoxStream<'static, Result<String>>>>,
cx: &mut ModelContext<Self>,
) {
@@ -2810,6 +2827,15 @@ impl CodegenAlternative {
}
let telemetry = self.telemetry.clone();
let language_name = {
let multibuffer = self.buffer.read(cx);
let ranges = multibuffer.range_to_buffer_ranges(self.range.clone(), cx);
ranges
.first()
.and_then(|(buffer, _, _)| buffer.read(cx).language())
.map(|language| language.name())
};
self.diff = Diff::default();
self.status = CodegenStatus::Pending;
let mut edit_start = self.range.start.to_offset(&snapshot);
@@ -2920,14 +2946,16 @@ impl CodegenAlternative {
let error_message =
result.as_ref().err().map(|error| error.to_string());
if let Some(telemetry) = telemetry {
telemetry.report_assistant_event(
None,
telemetry_events::AssistantKind::Inline,
telemetry_events::AssistantPhase::Response,
model_telemetry_id,
telemetry.report_assistant_event(AssistantEvent {
conversation_id: None,
kind: AssistantKind::Inline,
phase: AssistantPhase::Response,
model: model_telemetry_id,
model_provider: model_provider_id.to_string(),
response_latency,
error_message,
);
language_name,
});
}
result?;
@@ -3539,6 +3567,7 @@ mod tests {
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
String::new(),
future::ready(Ok(chunks_rx.map(Ok).boxed())),
cx,
@@ -3610,6 +3639,7 @@ mod tests {
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
String::new(),
future::ready(Ok(chunks_rx.map(Ok).boxed())),
cx,
@@ -3684,6 +3714,7 @@ mod tests {
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
String::new(),
future::ready(Ok(chunks_rx.map(Ok).boxed())),
cx,
@@ -3757,6 +3788,7 @@ mod tests {
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
String::new(),
future::ready(Ok(chunks_rx.map(Ok).boxed())),
cx,
@@ -3820,6 +3852,7 @@ mod tests {
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
String::new(),
future::ready(Ok(chunks_rx.map(Ok).boxed())),
cx,

View File

@@ -910,7 +910,7 @@ impl PromptLibrary {
.features
.clone(),
font_size: HeadlineSize::Large
.size()
.rems()
.into(),
font_weight: settings.ui_font.weight,
line_height: relative(

View File

@@ -25,6 +25,7 @@ use std::{
sync::Arc,
time::{Duration, Instant},
};
use telemetry_events::{AssistantEvent, AssistantKind, AssistantPhase};
use terminal::Terminal;
use terminal_view::TerminalView;
use theme::ThemeSettings;
@@ -1039,6 +1040,7 @@ impl Codegen {
self.transaction = Some(TerminalTransaction::start(self.terminal.clone()));
self.generation = cx.spawn(|this, mut cx| async move {
let model_telemetry_id = model.telemetry_id();
let model_provider_id = model.provider_id();
let response = model.stream_completion_text(prompt, &cx).await;
let generate = async {
let (mut hunks_tx, mut hunks_rx) = mpsc::channel(1);
@@ -1063,14 +1065,16 @@ impl Codegen {
let error_message = result.as_ref().err().map(|error| error.to_string());
if let Some(telemetry) = telemetry {
telemetry.report_assistant_event(
None,
telemetry_events::AssistantKind::Inline,
telemetry_events::AssistantPhase::Response,
model_telemetry_id,
telemetry.report_assistant_event(AssistantEvent {
conversation_id: None,
kind: AssistantKind::Inline,
phase: AssistantPhase::Response,
model: model_telemetry_id,
model_provider: model_provider_id.to_string(),
response_latency,
error_message,
);
language_name: None,
});
}
result?;

View File

@@ -18,6 +18,7 @@ test-support = ["clock/test-support", "collections/test-support", "gpui/test-sup
[dependencies]
anyhow.workspace = true
async-recursion = "0.3"
async-tls = "0.13"
async-tungstenite = { workspace = true, features = ["async-std", "async-tls"] }
chrono = { workspace = true, features = ["serde"] }
clock.workspace = true
@@ -34,8 +35,6 @@ postage.workspace = true
rand.workspace = true
release_channel.workspace = true
rpc = { workspace = true, features = ["gpui"] }
rustls.workspace = true
rustls-native-certs.workspace = true
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true

View File

@@ -394,7 +394,7 @@ pub struct PendingEntitySubscription<T: 'static> {
}
impl<T: 'static> PendingEntitySubscription<T> {
pub fn set_model(mut self, model: &Model<T>, cx: &mut AsyncAppContext) -> Subscription {
pub fn set_model(mut self, model: &Model<T>, cx: &AsyncAppContext) -> Subscription {
self.consumed = true;
let mut handlers = self.client.handler_set.lock();
let id = (TypeId::of::<T>(), self.remote_id);
@@ -1023,7 +1023,7 @@ impl Client {
&self,
http: Arc<HttpClientWithUrl>,
release_channel: Option<ReleaseChannel>,
) -> impl Future<Output = Result<Url>> {
) -> impl Future<Output = Result<url::Url>> {
#[cfg(any(test, feature = "test-support"))]
let url_override = self.rpc_url.read().clone();
@@ -1117,7 +1117,7 @@ impl Client {
// for us from the RPC URL.
//
// Among other things, it will generate and set a `Sec-WebSocket-Key` header for us.
let mut request = rpc_url.into_client_request()?;
let mut request = IntoClientRequest::into_client_request(rpc_url.as_str())?;
// We then modify the request to add our desired headers.
let request_headers = request.headers_mut();
@@ -1137,30 +1137,13 @@ impl Client {
match url_scheme {
Https => {
let client_config = {
let mut root_store = rustls::RootCertStore::empty();
let root_certs = rustls_native_certs::load_native_certs();
for error in root_certs.errors {
log::warn!("error loading native certs: {:?}", error);
}
root_store.add_parsable_certificates(
&root_certs
.certs
.into_iter()
.map(|cert| cert.as_ref().to_owned())
.collect::<Vec<_>>(),
);
rustls::ClientConfig::builder()
.with_safe_defaults()
.with_root_certificates(root_store)
.with_no_client_auth()
};
let (stream, _) =
async_tungstenite::async_tls::client_async_tls_with_connector(
request,
stream,
Some(client_config.into()),
Some(async_tls::TlsConnector::from(
http_client::TLS_CONFIG.clone(),
)),
)
.await?;
Ok(Connection::new(
@@ -1752,7 +1735,7 @@ impl CredentialsProvider for KeychainCredentialsProvider {
}
/// prefix for the zed:// url scheme
pub static ZED_URL_SCHEME: &str = "zed";
pub const ZED_URL_SCHEME: &str = "zed";
/// Parses the given link into a Zed link.
///

View File

@@ -16,9 +16,9 @@ use std::io::Write;
use std::{env, mem, path::PathBuf, sync::Arc, time::Duration};
use sysinfo::{CpuRefreshKind, Pid, ProcessRefreshKind, RefreshKind, System};
use telemetry_events::{
ActionEvent, AppEvent, AssistantEvent, AssistantKind, AssistantPhase, CallEvent, CpuEvent,
EditEvent, EditorEvent, Event, EventRequestBody, EventWrapper, ExtensionEvent,
InlineCompletionEvent, MemoryEvent, ReplEvent, SettingEvent,
ActionEvent, AppEvent, AssistantEvent, CallEvent, CpuEvent, EditEvent, EditorEvent, Event,
EventRequestBody, EventWrapper, ExtensionEvent, InlineCompletionEvent, MemoryEvent, ReplEvent,
SettingEvent,
};
use tempfile::NamedTempFile;
#[cfg(not(debug_assertions))]
@@ -288,7 +288,7 @@ impl Telemetry {
system_id: Option<String>,
installation_id: Option<String>,
session_id: String,
cx: &mut AppContext,
cx: &AppContext,
) {
let mut state = self.state.lock();
state.system_id = system_id.map(|id| id.into());
@@ -391,25 +391,8 @@ impl Telemetry {
self.report_event(event)
}
pub fn report_assistant_event(
self: &Arc<Self>,
conversation_id: Option<String>,
kind: AssistantKind,
phase: AssistantPhase,
model: String,
response_latency: Option<Duration>,
error_message: Option<String>,
) {
let event = Event::Assistant(AssistantEvent {
conversation_id,
kind,
phase,
model: model.to_string(),
response_latency,
error_message,
});
self.report_event(event)
pub fn report_assistant_event(self: &Arc<Self>, event: AssistantEvent) {
self.report_event(Event::Assistant(event));
}
pub fn report_call_event(

View File
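The `report_assistant_event` hunk above replaces a long positional parameter list with a single `AssistantEvent` struct argument, which makes call sites self-documenting and lets fields be added without touching every caller. A stdlib-only sketch of the same refactor pattern (fields simplified, not Zed's actual struct):

```rust
// Bundling the parameters into a struct: each field is named at the
// call site, and new optional fields can be added without churn.
struct AssistantEvent {
    kind: String,
    phase: String,
    model: String,
    response_latency_ms: Option<u64>,
}

// Instead of `report(kind, phase, model, latency, ...)`, the function
// takes one struct.
fn report_assistant_event(event: AssistantEvent) -> String {
    format!("{}/{}/{}", event.kind, event.phase, event.model)
}

fn main() {
    let line = report_assistant_event(AssistantEvent {
        kind: "inline".into(),
        phase: "response".into(),
        model: "example-model".into(),
        response_latency_ms: Some(120),
    });
    assert_eq!(line, "inline/response/example-model");
}
```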

@@ -138,7 +138,7 @@ enum UpdateContacts {
}
impl UserStore {
pub fn new(client: Arc<Client>, cx: &mut ModelContext<Self>) -> Self {
pub fn new(client: Arc<Client>, cx: &ModelContext<Self>) -> Self {
let (mut current_user_tx, current_user_rx) = watch::channel();
let (update_contacts_tx, mut update_contacts_rx) = mpsc::unbounded();
let rpc_subscriptions = vec![
@@ -310,7 +310,7 @@ impl UserStore {
fn update_contacts(
&mut self,
message: UpdateContacts,
cx: &mut ModelContext<Self>,
cx: &ModelContext<Self>,
) -> Task<Result<()>> {
match message {
UpdateContacts::Wait(barrier) => {
@@ -525,9 +525,9 @@ impl UserStore {
}
pub fn dismiss_contact_request(
&mut self,
&self,
requester_id: u64,
cx: &mut ModelContext<Self>,
cx: &ModelContext<Self>,
) -> Task<Result<()>> {
let client = self.client.upgrade();
cx.spawn(move |_, _| async move {
@@ -573,7 +573,7 @@ impl UserStore {
})
}
pub fn clear_contacts(&mut self) -> impl Future<Output = ()> {
pub fn clear_contacts(&self) -> impl Future<Output = ()> {
let (tx, mut rx) = postage::barrier::channel();
self.update_contacts_tx
.unbounded_send(UpdateContacts::Clear(tx))
@@ -583,7 +583,7 @@ impl UserStore {
}
}
pub fn contact_updates_done(&mut self) -> impl Future<Output = ()> {
pub fn contact_updates_done(&self) -> impl Future<Output = ()> {
let (tx, mut rx) = postage::barrier::channel();
self.update_contacts_tx
.unbounded_send(UpdateContacts::Wait(tx))
@@ -594,9 +594,9 @@ impl UserStore {
}
pub fn get_users(
&mut self,
&self,
user_ids: Vec<u64>,
cx: &mut ModelContext<Self>,
cx: &ModelContext<Self>,
) -> Task<Result<Vec<Arc<User>>>> {
let mut user_ids_to_fetch = user_ids.clone();
user_ids_to_fetch.retain(|id| !self.users.contains_key(id));
@@ -629,9 +629,9 @@ impl UserStore {
}
pub fn fuzzy_search_users(
&mut self,
&self,
query: String,
cx: &mut ModelContext<Self>,
cx: &ModelContext<Self>,
) -> Task<Result<Vec<Arc<User>>>> {
self.load_users(proto::FuzzySearchUsers { query }, cx)
}
@@ -640,11 +640,7 @@ impl UserStore {
self.users.get(&user_id).cloned()
}
pub fn get_user_optimistic(
&mut self,
user_id: u64,
cx: &mut ModelContext<Self>,
) -> Option<Arc<User>> {
pub fn get_user_optimistic(&self, user_id: u64, cx: &ModelContext<Self>) -> Option<Arc<User>> {
if let Some(user) = self.users.get(&user_id).cloned() {
return Some(user);
}
@@ -653,11 +649,7 @@ impl UserStore {
None
}
pub fn get_user(
&mut self,
user_id: u64,
cx: &mut ModelContext<Self>,
) -> Task<Result<Arc<User>>> {
pub fn get_user(&self, user_id: u64, cx: &ModelContext<Self>) -> Task<Result<Arc<User>>> {
if let Some(user) = self.users.get(&user_id).cloned() {
return Task::ready(Ok(user));
}
@@ -697,7 +689,7 @@ impl UserStore {
.map(|accepted_tos_at| accepted_tos_at.is_some())
}
pub fn accept_terms_of_service(&mut self, cx: &mut ModelContext<Self>) -> Task<Result<()>> {
pub fn accept_terms_of_service(&self, cx: &ModelContext<Self>) -> Task<Result<()>> {
if self.current_user().is_none() {
return Task::ready(Err(anyhow!("no current user")));
};
@@ -726,9 +718,9 @@ impl UserStore {
}
fn load_users(
&mut self,
&self,
request: impl RequestMessage<Response = UsersResponse>,
cx: &mut ModelContext<Self>,
cx: &ModelContext<Self>,
) -> Task<Result<Vec<Arc<User>>>> {
let client = self.client.clone();
cx.spawn(|this, mut cx| async move {

View File

@@ -28,8 +28,8 @@ axum = { version = "0.6", features = ["json", "headers", "ws"] }
axum-extra = { version = "0.4", features = ["erased-json"] }
base64.workspace = true
chrono.workspace = true
clock.workspace = true
clickhouse.workspace = true
clock.workspace = true
collections.workspace = true
dashmap.workspace = true
envy = "0.4.2"
@@ -37,19 +37,19 @@ futures.workspace = true
google_ai.workspace = true
hex.workspace = true
http_client.workspace = true
isahc_http_client.workspace = true
jsonwebtoken.workspace = true
live_kit_server.workspace = true
log.workspace = true
nanoid.workspace = true
open_ai.workspace = true
supermaven_api.workspace = true
parking_lot.workspace = true
prometheus = "0.13"
prost.workspace = true
rand.workspace = true
reqwest = { version = "0.11", features = ["json"] }
reqwest_client.workspace = true
rpc.workspace = true
rustc-demangle.workspace = true
scrypt = "0.11"
sea-orm = { version = "1.1.0-rc.1", features = ["sqlx-postgres", "postgres-array", "runtime-tokio-rustls", "with-uuid"] }
semantic_version.workspace = true
@@ -61,7 +61,7 @@ sha2.workspace = true
sqlx = { version = "0.8", features = ["runtime-tokio-rustls", "postgres", "json", "time", "uuid", "any"] }
strum.workspace = true
subtle.workspace = true
rustc-demangle.workspace = true
supermaven_api.workspace = true
telemetry_events.workspace = true
text.workspace = true
thiserror.workspace = true
@@ -85,6 +85,7 @@ client = { workspace = true, features = ["test-support"] }
collab_ui = { workspace = true, features = ["test-support"] }
collections = { workspace = true, features = ["test-support"] }
ctor.workspace = true
dev_server_projects.workspace = true
editor = { workspace = true, features = ["test-support"] }
env_logger.workspace = true
file_finder.workspace = true
@@ -92,6 +93,7 @@ fs = { workspace = true, features = ["test-support"] }
git = { workspace = true, features = ["test-support"] }
git_hosting_providers.workspace = true
gpui = { workspace = true, features = ["test-support"] }
headless.workspace = true
hyper.workspace = true
indoc.workspace = true
language = { workspace = true, features = ["test-support"] }
@@ -108,7 +110,6 @@ recent_projects = { workspace = true }
release_channel.workspace = true
remote = { workspace = true, features = ["test-support"] }
remote_server.workspace = true
dev_server_projects.workspace = true
rpc = { workspace = true, features = ["test-support"] }
sea-orm = { version = "1.1.0-rc.1", features = ["sqlx-sqlite"] }
serde_json.workspace = true
@@ -120,7 +121,6 @@ unindent.workspace = true
util.workspace = true
workspace = { workspace = true, features = ["test-support"] }
worktree = { workspace = true, features = ["test-support"] }
headless.workspace = true
[package.metadata.cargo-machete]
ignored = ["async-stripe"]

View File

@@ -112,6 +112,7 @@ CREATE TABLE "worktree_settings_files" (
"worktree_id" INTEGER NOT NULL,
"path" VARCHAR NOT NULL,
"content" TEXT,
"kind" VARCHAR,
PRIMARY KEY(project_id, worktree_id, path),
FOREIGN KEY(project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE
);

View File

@@ -0,0 +1 @@
ALTER TABLE "worktree_settings_files" ADD COLUMN "kind" VARCHAR;

View File

@@ -23,7 +23,7 @@ use telemetry_events::{
};
use uuid::Uuid;
static CRASH_REPORTS_BUCKET: &str = "zed-crash-reports";
const CRASH_REPORTS_BUCKET: &str = "zed-crash-reports";
pub fn router() -> Router {
Router::new()
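Several hunks in this compare view (here, and in the clangd and rust-analyzer files further down) change `static` string items to `const`. A minimal std-only sketch of the difference, using the bucket name from this hunk: a `const` is inlined at each use site, while a `static` names one fixed memory location; for a plain `&str` the observable behavior is the same, which is why the swap is safe.

```rust
// `const` items are inlined wherever they are used; `static` items denote a
// single fixed location for the whole program. For simple string data both
// compare equal at every use site, so this diff can swap them freely.
static BUCKET_STATIC: &str = "zed-crash-reports";
const BUCKET_CONST: &str = "zed-crash-reports";

fn main() {
    // A static always yields the same address.
    assert_eq!(BUCKET_STATIC.as_ptr(), BUCKET_STATIC.as_ptr());
    // Values are equal regardless of which item kind is used.
    assert_eq!(BUCKET_STATIC, BUCKET_CONST);
    println!("ok");
}
```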

View File

@@ -35,6 +35,7 @@ use std::{
};
use time::PrimitiveDateTime;
use tokio::sync::{Mutex, OwnedMutexGuard};
use worktree_settings_file::LocalSettingsKind;
#[cfg(test)]
pub use tests::TestDb;
@@ -766,6 +767,7 @@ pub struct Worktree {
pub struct WorktreeSettingsFile {
pub path: String,
pub content: String,
pub kind: LocalSettingsKind,
}
pub struct NewExtensionVersion {
@@ -783,3 +785,21 @@ pub struct ExtensionVersionConstraints {
pub schema_versions: RangeInclusive<i32>,
pub wasm_api_versions: RangeInclusive<SemanticVersion>,
}
impl LocalSettingsKind {
pub fn from_proto(proto_kind: proto::LocalSettingsKind) -> Self {
match proto_kind {
proto::LocalSettingsKind::Settings => Self::Settings,
proto::LocalSettingsKind::Tasks => Self::Tasks,
proto::LocalSettingsKind::Editorconfig => Self::Editorconfig,
}
}
pub fn to_proto(&self) -> proto::LocalSettingsKind {
match self {
Self::Settings => proto::LocalSettingsKind::Settings,
Self::Tasks => proto::LocalSettingsKind::Tasks,
Self::Editorconfig => proto::LocalSettingsKind::Editorconfig,
}
}
}

View File

@@ -1,3 +1,4 @@
use anyhow::Context as _;
use util::ResultExt;
use super::*;
@@ -527,6 +528,12 @@ impl Database {
connection: ConnectionId,
) -> Result<TransactionGuard<Vec<ConnectionId>>> {
let project_id = ProjectId::from_proto(update.project_id);
let kind = match update.kind {
Some(kind) => proto::LocalSettingsKind::from_i32(kind)
.with_context(|| format!("unknown worktree settings kind: {kind}"))?,
None => proto::LocalSettingsKind::Settings,
};
let kind = LocalSettingsKind::from_proto(kind);
self.project_transaction(project_id, |tx| async move {
// Ensure the update comes from the host.
let project = project::Entity::find_by_id(project_id)
@@ -543,6 +550,7 @@ impl Database {
worktree_id: ActiveValue::Set(update.worktree_id as i64),
path: ActiveValue::Set(update.path.clone()),
content: ActiveValue::Set(content.clone()),
kind: ActiveValue::Set(kind),
})
.on_conflict(
OnConflict::columns([
@@ -800,6 +808,7 @@ impl Database {
worktree.settings_files.push(WorktreeSettingsFile {
path: db_settings_file.path,
content: db_settings_file.content,
kind: db_settings_file.kind,
});
}
}
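The decode step added above maps an optional wire integer onto `LocalSettingsKind`, falling back to `Settings` when the field is absent and erroring on unknown values. A self-contained sketch of that pattern, with stand-in types (the numeric variant values here are assumptions for illustration, not the real `proto::LocalSettingsKind` encoding):

```rust
// Mirrors the decode logic in the hunk: None defaults to Settings, known
// integers map to variants, and anything else is rejected with context.
#[derive(Clone, Copy, Debug, PartialEq)]
enum LocalSettingsKind {
    Settings,
    Tasks,
    Editorconfig,
}

fn kind_from_wire(kind: Option<i32>) -> Result<LocalSettingsKind, String> {
    match kind {
        None | Some(0) => Ok(LocalSettingsKind::Settings),
        Some(1) => Ok(LocalSettingsKind::Tasks),
        Some(2) => Ok(LocalSettingsKind::Editorconfig),
        Some(other) => Err(format!("unknown worktree settings kind: {other}")),
    }
}

fn main() {
    assert_eq!(kind_from_wire(None), Ok(LocalSettingsKind::Settings));
    assert_eq!(kind_from_wire(Some(2)), Ok(LocalSettingsKind::Editorconfig));
    assert!(kind_from_wire(Some(99)).is_err());
}
```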

View File

@@ -735,6 +735,7 @@ impl Database {
worktree.settings_files.push(WorktreeSettingsFile {
path: db_settings_file.path,
content: db_settings_file.content,
kind: db_settings_file.kind,
});
}
}

View File

@@ -11,9 +11,25 @@ pub struct Model {
#[sea_orm(primary_key)]
pub path: String,
pub content: String,
pub kind: LocalSettingsKind,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {}
impl ActiveModelBehavior for ActiveModel {}
#[derive(
Copy, Clone, Debug, PartialEq, Eq, EnumIter, DeriveActiveEnum, Default, Hash, serde::Serialize,
)]
#[sea_orm(rs_type = "String", db_type = "String(StringLen::None)")]
#[serde(rename_all = "snake_case")]
pub enum LocalSettingsKind {
#[default]
#[sea_orm(string_value = "settings")]
Settings,
#[sea_orm(string_value = "tasks")]
Tasks,
#[sea_orm(string_value = "editorconfig")]
Editorconfig,
}

View File

@@ -22,7 +22,8 @@ use chrono::{DateTime, Duration, Utc};
use collections::HashMap;
use db::{usage_measure::UsageMeasure, ActiveUserCount, LlmDatabase};
use futures::{Stream, StreamExt as _};
use isahc_http_client::IsahcHttpClient;
use reqwest_client::ReqwestClient;
use rpc::ListModelsResponse;
use rpc::{
proto::Plan, LanguageModelProvider, PerformCompletionParams, EXPIRED_LLM_TOKEN_HEADER_NAME,
@@ -43,7 +44,7 @@ pub struct LlmState {
pub config: Config,
pub executor: Executor,
pub db: Arc<LlmDatabase>,
pub http_client: IsahcHttpClient,
pub http_client: ReqwestClient,
pub clickhouse_client: Option<clickhouse::Client>,
active_user_count_by_model:
RwLock<HashMap<(LanguageModelProvider, String), (DateTime<Utc>, ActiveUserCount)>>,
@@ -69,11 +70,8 @@ impl LlmState {
let db = Arc::new(db);
let user_agent = format!("Zed Server/{}", env!("CARGO_PKG_VERSION"));
let http_client = IsahcHttpClient::builder()
.default_header("User-Agent", user_agent)
.build()
.map(IsahcHttpClient::from)
.context("failed to construct http client")?;
let http_client =
ReqwestClient::user_agent(&user_agent).context("failed to construct http client")?;
let this = Self {
executor,

View File

@@ -36,8 +36,8 @@ use collections::{HashMap, HashSet};
pub use connection_pool::{ConnectionPool, ZedVersion};
use core::fmt::{self, Debug, Formatter};
use http_client::HttpClient;
use isahc_http_client::IsahcHttpClient;
use open_ai::{OpenAiEmbeddingModel, OPEN_AI_API_URL};
use reqwest_client::ReqwestClient;
use sha2::Digest;
use supermaven_api::{CreateExternalUserRequest, SupermavenAdminApi};
@@ -954,8 +954,8 @@ impl Server {
tracing::info!("connection opened");
let user_agent = format!("Zed Server/{}", env!("CARGO_PKG_VERSION"));
let http_client = match IsahcHttpClient::builder().default_header("User-Agent", user_agent).build() {
Ok(http_client) => Arc::new(IsahcHttpClient::from(http_client)),
let http_client = match ReqwestClient::user_agent(&user_agent) {
Ok(http_client) => Arc::new(http_client),
Err(error) => {
tracing::error!(?error, "failed to create HTTP client");
return;
@@ -1739,6 +1739,7 @@ fn notify_rejoined_projects(
worktree_id: worktree.id,
path: settings_file.path,
content: Some(settings_file.content),
kind: Some(settings_file.kind.to_proto().into()),
},
)?;
}
@@ -2220,6 +2221,7 @@ fn join_project_internal(
worktree_id: worktree.id,
path: settings_file.path,
content: Some(settings_file.content),
kind: Some(proto::update_user_settings::Kind::Settings.into()),
},
)?;
}

View File

@@ -33,7 +33,7 @@ use project::{
};
use rand::prelude::*;
use serde_json::json;
use settings::SettingsStore;
use settings::{LocalSettingsKind, SettingsStore};
use std::{
cell::{Cell, RefCell},
env, future, mem,
@@ -3327,8 +3327,16 @@ async fn test_local_settings(
.local_settings(worktree_b.read(cx).id())
.collect::<Vec<_>>(),
&[
(Path::new("").into(), r#"{"tab_size":2}"#.to_string()),
(Path::new("a").into(), r#"{"tab_size":8}"#.to_string()),
(
Path::new("").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":2}"#.to_string()
),
(
Path::new("a").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":8}"#.to_string()
),
]
)
});
@@ -3346,8 +3354,16 @@ async fn test_local_settings(
.local_settings(worktree_b.read(cx).id())
.collect::<Vec<_>>(),
&[
(Path::new("").into(), r#"{}"#.to_string()),
(Path::new("a").into(), r#"{"tab_size":8}"#.to_string()),
(
Path::new("").into(),
LocalSettingsKind::Settings,
r#"{}"#.to_string()
),
(
Path::new("a").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":8}"#.to_string()
),
]
)
});
@@ -3375,8 +3391,16 @@ async fn test_local_settings(
.local_settings(worktree_b.read(cx).id())
.collect::<Vec<_>>(),
&[
(Path::new("a").into(), r#"{"tab_size":8}"#.to_string()),
(Path::new("b").into(), r#"{"tab_size":4}"#.to_string()),
(
Path::new("a").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":8}"#.to_string()
),
(
Path::new("b").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":4}"#.to_string()
),
]
)
});
@@ -3406,7 +3430,11 @@ async fn test_local_settings(
store
.local_settings(worktree_b.read(cx).id())
.collect::<Vec<_>>(),
&[(Path::new("a").into(), r#"{"hard_tabs":true}"#.to_string()),]
&[(
Path::new("a").into(),
LocalSettingsKind::Settings,
r#"{"hard_tabs":true}"#.to_string()
),]
)
});
}

View File

@@ -835,7 +835,7 @@ impl TestClient {
pub async fn build_ssh_project(
&self,
root_path: impl AsRef<Path>,
ssh: Arc<SshRemoteClient>,
ssh: Model<SshRemoteClient>,
cx: &mut TestAppContext,
) -> (Model<Project>, WorktreeId) {
let project = cx.update(|cx| {

View File

@@ -188,7 +188,7 @@ macro_rules! define_connection {
};
}
pub fn write_and_log<F>(cx: &mut AppContext, db_write: impl FnOnce() -> F + Send + 'static)
pub fn write_and_log<F>(cx: &AppContext, db_write: impl FnOnce() -> F + Send + 'static)
where
F: Future<Output = anyhow::Result<()>> + Send,
{

View File

@@ -1,7 +1,7 @@
use crate::ProjectDiagnosticsEditor;
use gpui::{EventEmitter, ParentElement, Render, View, ViewContext, WeakView};
use ui::prelude::*;
use ui::{IconButton, IconName, Tooltip};
use ui::{IconButton, IconButtonShape, IconName, Tooltip};
use workspace::{item::ItemHandle, ToolbarItemEvent, ToolbarItemLocation, ToolbarItemView};
pub struct ToolbarControls {
@@ -33,11 +33,19 @@ impl Render for ToolbarControls {
"Include Warnings"
};
let warning_color = if include_warnings {
Color::Warning
} else {
Color::Muted
};
h_flex()
.gap_1()
.when(has_stale_excerpts, |div| {
div.child(
IconButton::new("update-excerpts", IconName::Update)
.icon_color(Color::Info)
.shape(IconButtonShape::Square)
.disabled(is_updating)
.tooltip(move |cx| Tooltip::text("Update excerpts", cx))
.on_click(cx.listener(|this, _, cx| {
@@ -51,6 +59,8 @@ impl Render for ToolbarControls {
})
.child(
IconButton::new("toggle-warnings", IconName::Warning)
.icon_color(warning_color)
.shape(IconButtonShape::Square)
.tooltip(move |cx| Tooltip::text(tooltip, cx))
.on_click(cx.listener(|this, _, cx| {
if let Some(editor) = this.editor() {

View File

@@ -9,7 +9,7 @@ use crate::lsp_ext::find_specific_language_server_in_selection;
use crate::{element::register_action, Editor, SwitchSourceHeader};
static CLANGD_SERVER_NAME: &str = "clangd";
const CLANGD_SERVER_NAME: &str = "clangd";
fn is_c_language(language: &Language) -> bool {
return language.name() == "C++".into() || language.name() == "C".into();

View File

@@ -1228,6 +1228,10 @@ impl CompletionsMenu {
None
};
let color_swatch = completion
.color()
.map(|color| div().size_4().bg(color).rounded_sm());
div().min_w(px(220.)).max_w(px(540.)).child(
ListItem::new(mat.candidate_id)
.inset(true)
@@ -1243,6 +1247,7 @@ impl CompletionsMenu {
task.detach_and_log_err(cx)
}
}))
.start_slot::<Div>(color_swatch)
.child(h_flex().overflow_hidden().child(completion_label))
.end_slot::<Label>(documentation_label),
)
@@ -10787,7 +10792,7 @@ impl Editor {
.selections
.all::<Point>(cx)
.iter()
.any(|selection| selection.range().overlaps(&intersection_range));
.any(|selection| RangeExt::overlaps(&selection.range(), &intersection_range));
self.unfold_ranges(std::iter::once(intersection_range), true, autoscroll, cx)
}

View File

@@ -379,6 +379,7 @@ impl Editor {
});
let border_color = cx.theme().colors().border_variant;
let bg_color = cx.theme().colors().editor_background;
let gutter_color = match hunk.status {
DiffHunkStatus::Added => cx.theme().status().created,
DiffHunkStatus::Modified => cx.theme().status().modified,
@@ -394,6 +395,7 @@ impl Editor {
render: Box::new({
let editor = cx.view().clone();
let hunk = hunk.clone();
move |cx| {
let hunk_controls_menu_handle =
editor.read(cx).hunk_controls_menu_handle.clone();
@@ -404,7 +406,7 @@ impl Editor {
.w_full()
.border_t_1()
.border_color(border_color)
.bg(cx.theme().colors().editor_background)
.bg(bg_color)
.child(
div()
.id("gutter-strip")
@@ -424,14 +426,9 @@ impl Editor {
)
.child(
h_flex()
.pl_2()
.pr_6()
.px_6()
.size_full()
.justify_between()
.border_t_1()
.pl_6()
.pr_6()
.border_color(border_color)
.child(
h_flex()
.gap_1()
@@ -608,7 +605,7 @@ impl Editor {
move |menu, _| {
menu.context(focus.clone())
.action(
"Discard All",
"Discard All Hunks",
RevertFile
.boxed_clone(),
)

View File

@@ -10,7 +10,7 @@ use crate::{
ExpandMacroRecursively,
};
static RUST_ANALYZER_NAME: &str = "rust-analyzer";
const RUST_ANALYZER_NAME: &str = "rust-analyzer";
fn is_rust_language(language: &Language) -> bool {
language.name() == "Rust".into()

View File

@@ -14,8 +14,8 @@ name = "eval"
path = "src/eval.rs"
[dependencies]
clap.workspace = true
anyhow.workspace = true
clap.workspace = true
client.workspace = true
clock.workspace = true
collections.workspace = true
@@ -24,15 +24,15 @@ feature_flags.workspace = true
fs.workspace = true
git.workspace = true
gpui.workspace = true
isahc_http_client.workspace = true
http_client.workspace = true
language.workspace = true
languages.workspace = true
http_client.workspace = true
node_runtime.workspace = true
open_ai.workspace = true
project.workspace = true
settings.workspace = true
semantic_index.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
smol.workspace = true
semantic_index.workspace = true
node_runtime.workspace = true
ureq_client.workspace = true

View File

@@ -32,6 +32,7 @@ use std::{
Arc,
},
};
use ureq_client::UreqClient;
const CODESEARCH_NET_DIR: &'static str = "target/datasets/code-search-net";
const EVAL_REPOS_DIR: &'static str = "target/datasets/eval-repos";
@@ -100,7 +101,11 @@ fn main() -> Result<()> {
gpui::App::headless().run(move |cx| {
let executor = cx.background_executor().clone();
let client = isahc_http_client::IsahcHttpClient::new(None, None);
let client = Arc::new(UreqClient::new(
None,
"Zed LLM evals".to_string(),
executor.clone(),
));
cx.set_http_client(client.clone());
match cli.command {
Commands::Fetch {} => {

View File

@@ -39,30 +39,31 @@ schemars.workspace = true
semantic_version.workspace = true
serde.workspace = true
serde_json.workspace = true
serde_json_lenient.workspace = true
settings.workspace = true
snippet_provider.workspace = true
task.workspace = true
theme.workspace = true
toml.workspace = true
ui.workspace = true
url.workspace = true
util.workspace = true
wasm-encoder.workspace = true
wasmtime.workspace = true
wasmtime-wasi.workspace = true
wasmparser.workspace = true
wasmtime-wasi.workspace = true
wasmtime.workspace = true
wit-component.workspace = true
workspace.workspace = true
task.workspace = true
serde_json_lenient.workspace = true
[dev-dependencies]
isahc_http_client.workspace = true
ctor.workspace = true
env_logger.workspace = true
parking_lot.workspace = true
fs = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
language = { workspace = true, features = ["test-support"] }
parking_lot.workspace = true
project = { workspace = true, features = ["test-support"] }
reqwest_client.workspace = true
tokio.workspace = true
ureq_client.workspace = true
workspace = { workspace = true, features = ["test-support"] }

View File

@@ -25,7 +25,7 @@ use wit_component::ComponentEncoder;
/// Once Rust 1.78 is released, there will be a `wasm32-wasip2` target available, so we will
/// not need the adapter anymore.
const RUST_TARGET: &str = "wasm32-wasip1";
const WASI_ADAPTER_URL: &str =
pub const WASI_ADAPTER_URL: &str =
"https://github.com/bytecodealliance/wasmtime/releases/download/v18.0.2/wasi_snapshot_preview1.reactor.wasm";
/// Compiling Tree-sitter parsers from C to WASM requires Clang 17, and a WASM build of libc

View File

@@ -1,3 +1,4 @@
use crate::extension_builder::WASI_ADAPTER_URL;
use crate::extension_manifest::SchemaVersion;
use crate::extension_settings::ExtensionSettings;
use crate::{
@@ -11,14 +12,14 @@ use collections::BTreeMap;
use fs::{FakeFs, Fs, RealFs};
use futures::{io::BufReader, AsyncReadExt, StreamExt};
use gpui::{Context, SemanticVersion, TestAppContext};
use http_client::{FakeHttpClient, Response};
use http_client::{AsyncBody, FakeHttpClient, HttpClient, Response};
use indexed_docs::IndexedDocsRegistry;
use isahc_http_client::IsahcHttpClient;
use language::{LanguageMatcher, LanguageRegistry, LanguageServerBinaryStatus, LanguageServerName};
use node_runtime::NodeRuntime;
use parking_lot::Mutex;
use project::{Project, DEFAULT_COMPLETION_CONTEXT};
use release_channel::AppVersion;
use reqwest_client::ReqwestClient;
use serde_json::json;
use settings::{Settings as _, SettingsStore};
use snippet_provider::SnippetRegistry;
@@ -28,6 +29,7 @@ use std::{
sync::Arc,
};
use theme::ThemeRegistry;
use ureq_client::UreqClient;
use util::test::temp_tree;
#[cfg(test)]
@@ -576,7 +578,7 @@ async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
std::env::consts::ARCH
)
});
let builder_client = IsahcHttpClient::new(None, Some(user_agent));
let builder_client = Arc::new(UreqClient::new(None, user_agent, cx.executor().clone()));
let extension_store = cx.new_model(|cx| {
ExtensionStore::new(
@@ -769,6 +771,50 @@ async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
assert!(fs.metadata(&expected_server_path).await.unwrap().is_none());
}
#[gpui::test]
async fn test_wasi_adapter_download(cx: &mut TestAppContext) {
let client = Arc::new(UreqClient::new(
None,
"zed-test-wasi-adapter-download".to_string(),
cx.executor().clone(),
));
let mut response = client
.get(WASI_ADAPTER_URL, AsyncBody::default(), true)
.await
.unwrap();
let mut content = Vec::new();
let mut body = BufReader::new(response.body_mut());
body.read_to_end(&mut content).await.unwrap();
assert!(wasmparser::Parser::is_core_wasm(&content));
assert_eq!(content.len(), 96801); // Determined by downloading this to my computer
wit_component::ComponentEncoder::default()
.adapter("wasi_snapshot_preview1", &content)
.unwrap();
}
#[tokio::test]
async fn test_wasi_adapter_download_tokio() {
let client = Arc::new(ReqwestClient::new());
let mut response = client
.get(WASI_ADAPTER_URL, AsyncBody::default(), true)
.await
.unwrap();
let mut content = Vec::new();
let mut body = BufReader::new(response.body_mut());
body.read_to_end(&mut content).await.unwrap();
assert!(wasmparser::Parser::is_core_wasm(&content));
assert_eq!(content.len(), 96801); // Determined by downloading this to my computer
wit_component::ComponentEncoder::default()
.adapter("wasi_snapshot_preview1", &content)
.unwrap();
}
fn init_test(cx: &mut TestAppContext) {
cx.update(|cx| {
let store = SettingsStore::test(cx);

View File

@@ -18,9 +18,9 @@ clap = { workspace = true, features = ["derive"] }
env_logger.workspace = true
extension = { workspace = true, features = ["no-webrtc"] }
fs.workspace = true
isahc_http_client.workspace = true
language.workspace = true
log.workspace = true
reqwest_client.workspace = true
rpc.workspace = true
serde.workspace = true
serde_json.workspace = true

View File

@@ -13,8 +13,8 @@ use extension::{
extension_builder::{CompileExtensionOptions, ExtensionBuilder},
ExtensionManifest,
};
use isahc_http_client::IsahcHttpClient;
use language::LanguageConfig;
use reqwest_client::ReqwestClient;
use theme::ThemeRegistry;
use tree_sitter::{Language, Query, WasmStore};
@@ -66,12 +66,7 @@ async fn main() -> Result<()> {
std::env::consts::OS,
std::env::consts::ARCH
);
let http_client = Arc::new(
IsahcHttpClient::builder()
.default_header("User-Agent", user_agent)
.build()
.map(IsahcHttpClient::from)?,
);
let http_client = Arc::new(ReqwestClient::user_agent(&user_agent)?);
let builder = ExtensionBuilder::new(http_client, scratch_dir);
builder

View File

@@ -54,6 +54,7 @@ const SUGGESTIONS_BY_EXTENSION_ID: &[(&str, &[&str])] = &[
("ocaml", &["ml", "mli"]),
("php", &["php"]),
("prisma", &["prisma"]),
("proto", &["proto"]),
("purescript", &["purs"]),
("r", &["r", "R"]),
("racket", &["rkt"]),

View File

@@ -8,7 +8,7 @@ use gpui::AppContext;
pub use crate::providers::*;
/// Initializes the Git hosting providers.
pub fn init(cx: &mut AppContext) {
pub fn init(cx: &AppContext) {
let provider_registry = GitHostingProviderRegistry::global(cx);
// The providers are stored in a `BTreeMap`, so insertion order matters.

View File

@@ -66,7 +66,7 @@ smallvec.workspace = true
smol.workspace = true
strum.workspace = true
sum_tree.workspace = true
taffy = "0.4.3"
taffy = "0.5"
thiserror.workspace = true
util.workspace = true
uuid.workspace = true

View File

@@ -348,7 +348,7 @@ impl AppContext {
}
/// Gracefully quit the application via the platform's standard routine.
pub fn quit(&mut self) {
pub fn quit(&self) {
self.platform.quit();
}
@@ -1004,11 +1004,7 @@ impl AppContext {
self.globals_by_type.insert(global_type, lease.global);
}
pub(crate) fn new_view_observer(
&mut self,
key: TypeId,
value: NewViewListener,
) -> Subscription {
pub(crate) fn new_view_observer(&self, key: TypeId, value: NewViewListener) -> Subscription {
let (subscription, activate) = self.new_view_observers.insert(key, value);
activate();
subscription
@@ -1016,7 +1012,7 @@ impl AppContext {
/// Arrange for the given function to be invoked whenever a view of the specified type is created.
/// The function will be passed a mutable reference to the view along with an appropriate context.
pub fn observe_new_views<V: 'static>(
&mut self,
&self,
on_new: impl 'static + Fn(&mut V, &mut ViewContext<V>),
) -> Subscription {
self.new_view_observer(
@@ -1035,7 +1031,7 @@ impl AppContext {
/// Observe the release of a model or view. The callback is invoked after the model or view
/// has no more strong references but before it has been dropped.
pub fn observe_release<E, T>(
&mut self,
&self,
handle: &E,
on_release: impl FnOnce(&mut T, &mut AppContext) + 'static,
) -> Subscription
@@ -1062,7 +1058,7 @@ impl AppContext {
mut f: impl FnMut(&KeystrokeEvent, &mut WindowContext) + 'static,
) -> Subscription {
fn inner(
keystroke_observers: &mut SubscriberSet<(), KeystrokeObserver>,
keystroke_observers: &SubscriberSet<(), KeystrokeObserver>,
handler: KeystrokeObserver,
) -> Subscription {
let (subscription, activate) = keystroke_observers.insert((), handler);
@@ -1140,7 +1136,7 @@ impl AppContext {
/// Register a callback to be invoked when the application is about to quit.
/// It is not possible to cancel the quit event at this point.
pub fn on_app_quit<Fut>(
&mut self,
&self,
mut on_quit: impl FnMut(&mut AppContext) -> Fut + 'static,
) -> Subscription
where
@@ -1186,7 +1182,7 @@ impl AppContext {
}
/// Sets the menu bar for this application. This will replace any existing menu bar.
pub fn set_menus(&mut self, menus: Vec<Menu>) {
pub fn set_menus(&self, menus: Vec<Menu>) {
self.platform.set_menus(menus, &self.keymap.borrow());
}
@@ -1196,7 +1192,7 @@ impl AppContext {
}
/// Sets the right click menu for the app icon in the dock
pub fn set_dock_menu(&mut self, menus: Vec<MenuItem>) {
pub fn set_dock_menu(&self, menus: Vec<MenuItem>) {
self.platform.set_dock_menu(menus, &self.keymap.borrow());
}
@@ -1204,7 +1200,7 @@ impl AppContext {
/// The list is usually shown on the application icon's context menu in the dock,
/// and allows to open the recent files via that context menu.
/// If the path is already in the list, it will be moved to the bottom of the list.
pub fn add_recent_document(&mut self, path: &Path) {
pub fn add_recent_document(&self, path: &Path) {
self.platform.add_recent_document(path);
}

View File

@@ -107,7 +107,7 @@ impl Context for AsyncAppContext {
impl AsyncAppContext {
/// Schedules all windows in the application to be redrawn.
pub fn refresh(&mut self) -> Result<()> {
pub fn refresh(&self) -> Result<()> {
let app = self
.app
.upgrade()
@@ -205,7 +205,7 @@ impl AsyncAppContext {
/// A convenience method for [AppContext::update_global]
/// for updating the global state of the specified type.
pub fn update_global<G: Global, R>(
&mut self,
&self,
update: impl FnOnce(&mut G, &mut AppContext) -> R,
) -> Result<R> {
let app = self

View File

@@ -91,7 +91,7 @@ impl<'a, T: 'static> ModelContext<'a, T> {
/// Register a callback to be invoked when GPUI releases this model.
pub fn on_release(
&mut self,
&self,
on_release: impl FnOnce(&mut T, &mut AppContext) + 'static,
) -> Subscription
where
@@ -110,7 +110,7 @@ impl<'a, T: 'static> ModelContext<'a, T> {
/// Register a callback to be run on the release of another model or view
pub fn observe_release<T2, E>(
&mut self,
&self,
entity: &E,
on_release: impl FnOnce(&mut T, &mut T2, &mut ModelContext<'_, T>) + 'static,
) -> Subscription
@@ -154,7 +154,7 @@ impl<'a, T: 'static> ModelContext<'a, T> {
/// Arrange for the given function to be invoked whenever the application is quit.
/// The future returned from this callback will be polled for up to [crate::SHUTDOWN_TIMEOUT] until the app fully quits.
pub fn on_app_quit<Fut>(
&mut self,
&self,
mut on_quit: impl FnMut(&mut T, &mut ModelContext<T>) -> Fut + 'static,
) -> Subscription
where

View File

@@ -1418,7 +1418,7 @@ impl Interactivity {
}
fn clamp_scroll_position(
&mut self,
&self,
bounds: Bounds<Pixels>,
style: &Style,
cx: &mut WindowContext,
@@ -1547,7 +1547,7 @@ impl Interactivity {
#[cfg(debug_assertions)]
fn paint_debug_info(
&mut self,
&self,
global_id: Option<&GlobalElementId>,
hitbox: &Hitbox,
style: &Style,

View File

@@ -252,7 +252,7 @@ impl TextLayout {
}
fn layout(
&mut self,
&self,
text: SharedString,
runs: Option<Vec<TextRun>>,
cx: &mut WindowContext,
@@ -350,7 +350,7 @@ impl TextLayout {
layout_id
}
fn prepaint(&mut self, bounds: Bounds<Pixels>, text: &str) {
fn prepaint(&self, bounds: Bounds<Pixels>, text: &str) {
let mut element_state = self.lock();
let element_state = element_state
.as_mut()
@@ -359,7 +359,7 @@ impl TextLayout {
element_state.bounds = Some(bounds);
}
fn paint(&mut self, text: &str, cx: &mut WindowContext) {
fn paint(&self, text: &str, cx: &mut WindowContext) {
let element_state = self.lock();
let element_state = element_state
.as_ref()

View File

@@ -115,7 +115,7 @@ impl UniformListScrollHandle {
}
/// Scroll the list to the given item index.
pub fn scroll_to_item(&mut self, ix: usize) {
pub fn scroll_to_item(&self, ix: usize) {
self.0.borrow_mut().deferred_scroll_to_item = Some(ix);
}
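The relaxation from `&mut self` to `&self` above works because the handle's state sits behind shared interior mutability (`self.0.borrow_mut()`). A hypothetical std-only handle showing the same pattern — the field layout is an assumption, not the real `UniformListScrollHandle`:

```rust
use std::cell::RefCell;
use std::rc::Rc;

// State behind Rc<RefCell<..>> lets methods take `&self` yet still mutate,
// which is what allows the signature to drop `&mut`.
#[derive(Clone)]
struct ScrollHandle(Rc<RefCell<Option<usize>>>);

impl ScrollHandle {
    fn scroll_to_item(&self, ix: usize) {
        *self.0.borrow_mut() = Some(ix);
    }

    fn pending(&self) -> Option<usize> {
        *self.0.borrow()
    }
}

fn main() {
    let handle = ScrollHandle(Rc::new(RefCell::new(None)));
    let clone = handle.clone(); // clones share the same cell
    handle.scroll_to_item(3);
    assert_eq!(clone.pending(), Some(3));
}
```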

View File

@@ -706,11 +706,7 @@ pub struct Bounds<T: Clone + Default + Debug> {
impl Bounds<Pixels> {
/// Generate a centered bounds for the given display or primary display if none is provided
pub fn centered(
display_id: Option<DisplayId>,
size: Size<Pixels>,
cx: &mut AppContext,
) -> Self {
pub fn centered(display_id: Option<DisplayId>, size: Size<Pixels>, cx: &AppContext) -> Self {
let display = display_id
.and_then(|id| cx.find_display(id))
.or_else(|| cx.primary_display());
@@ -730,7 +726,7 @@ impl Bounds<Pixels> {
}
/// Generate maximized bounds for the given display or primary display if none is provided
pub fn maximized(display_id: Option<DisplayId>, cx: &mut AppContext) -> Self {
pub fn maximized(display_id: Option<DisplayId>, cx: &AppContext) -> Self {
let display = display_id
.and_then(|id| cx.find_display(id))
.or_else(|| cx.primary_display());
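The hunk above only shows the display lookup, not the centering arithmetic itself; presumably `Bounds::centered` places the window's center on the display's center. A simplified sketch of that math under that assumption, with stand-in types:

```rust
// Simplified stand-ins for gpui's Size/Point, used only for this sketch.
#[derive(Debug, PartialEq, Clone, Copy)]
struct Size {
    width: f32,
    height: f32,
}

#[derive(Debug, PartialEq, Clone, Copy)]
struct Point {
    x: f32,
    y: f32,
}

// Origin that centers a window of `size` within a display of `display_size`.
fn centered_origin(display_origin: Point, display_size: Size, size: Size) -> Point {
    Point {
        x: display_origin.x + (display_size.width - size.width) / 2.0,
        y: display_origin.y + (display_size.height - size.height) / 2.0,
    }
}

fn main() {
    let origin = centered_origin(
        Point { x: 0.0, y: 0.0 },
        Size { width: 1920.0, height: 1080.0 },
        Size { width: 800.0, height: 600.0 },
    );
    assert_eq!(origin, Point { x: 560.0, y: 240.0 });
}
```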

View File

@@ -219,7 +219,7 @@ impl DispatchTree {
self.focusable_node_ids.insert(focus_id, node_id);
}
pub fn parent_view_id(&mut self) -> Option<EntityId> {
pub fn parent_view_id(&self) -> Option<EntityId> {
self.view_stack.last().copied()
}
@@ -484,7 +484,7 @@ impl DispatchTree {
/// Converts the longest prefix of input to a replay event and returns the rest.
fn replay_prefix(
&mut self,
&self,
mut input: SmallVec<[Keystroke; 1]>,
dispatch_path: &SmallVec<[DispatchNodeId; 32]>,
) -> (SmallVec<[Keystroke; 1]>, SmallVec<[Replay; 1]>) {

View File

@@ -171,7 +171,7 @@ pub enum OsAction {
Redo,
}
pub(crate) fn init_app_menus(platform: &dyn Platform, cx: &mut AppContext) {
pub(crate) fn init_app_menus(platform: &dyn Platform, cx: &AppContext) {
platform.on_will_open_app_menu(Box::new({
let cx = cx.to_async();
move || {

View File

@@ -477,8 +477,7 @@ impl WaylandClient {
.as_ref()
.map(|primary_selection_manager| primary_selection_manager.get_device(&seat, &qh, ()));
// FIXME: Determine the scaling factor dynamically by the compositor
let mut cursor = Cursor::new(&conn, &globals, 24, 2);
let mut cursor = Cursor::new(&conn, &globals, 24);
handle
.insert_source(XDPEventSource::new(&common.background_executor), {

View File

@@ -11,7 +11,6 @@ pub(crate) struct Cursor {
theme_name: Option<String>,
surface: WlSurface,
size: u32,
scale: u32,
shm: WlShm,
connection: Connection,
}
@@ -24,7 +23,7 @@ impl Drop for Cursor {
}
impl Cursor {
pub fn new(connection: &Connection, globals: &Globals, size: u32, scale: u32) -> Self {
pub fn new(connection: &Connection, globals: &Globals, size: u32) -> Self {
Self {
theme: CursorTheme::load(&connection, globals.shm.clone(), size).log_err(),
theme_name: None,
@@ -32,7 +31,6 @@ impl Cursor {
shm: globals.shm.clone(),
connection: connection.clone(),
size,
scale,
}
}
@@ -40,18 +38,14 @@ impl Cursor {
if let Some(size) = size {
self.size = size;
}
if let Some(theme) = CursorTheme::load_from_name(
&self.connection,
self.shm.clone(),
theme_name,
self.size * self.scale,
)
.log_err()
if let Some(theme) =
CursorTheme::load_from_name(&self.connection, self.shm.clone(), theme_name, self.size)
.log_err()
{
self.theme = Some(theme);
self.theme_name = Some(theme_name.to_string());
} else if let Some(theme) =
CursorTheme::load(&self.connection, self.shm.clone(), self.size * self.scale).log_err()
CursorTheme::load(&self.connection, self.shm.clone(), self.size).log_err()
{
self.theme = Some(theme);
self.theme_name = None;
@@ -97,22 +91,9 @@ impl Cursor {
let (width, height) = buffer.dimensions();
let (hot_x, hot_y) = buffer.hotspot();
let scaled_width = width / self.scale;
let scaled_height = height / self.scale;
let scaled_hot_x = hot_x / self.scale;
let scaled_hot_y = hot_y / self.scale;
self.surface.set_buffer_scale(self.scale as i32);
wl_pointer.set_cursor(
serial_id,
Some(&self.surface),
scaled_hot_x as i32,
scaled_hot_y as i32,
);
wl_pointer.set_cursor(serial_id, Some(&self.surface), hot_x as i32, hot_y as i32);
self.surface.attach(Some(&buffer), 0, 0);
self.surface
.damage(0, 0, scaled_width as i32, scaled_height as i32);
self.surface.damage(0, 0, width as i32, height as i32);
self.surface.commit();
}
} else {

View File

@@ -284,11 +284,11 @@ impl MetalRenderer {
}
}
pub fn update_transparency(&mut self, _transparent: bool) {
pub fn update_transparency(&self, _transparent: bool) {
// todo(mac)?
}
pub fn destroy(&mut self) {
pub fn destroy(&self) {
// nothing to do
}
@@ -486,7 +486,7 @@ impl MetalRenderer {
}
fn rasterize_paths(
&mut self,
&self,
paths: &[Path<ScaledPixels>],
instance_buffer: &mut InstanceBuffer,
instance_offset: &mut usize,
@@ -576,7 +576,7 @@ impl MetalRenderer {
}
fn draw_shadows(
&mut self,
&self,
shadows: &[Shadow],
instance_buffer: &mut InstanceBuffer,
instance_offset: &mut usize,
@@ -639,7 +639,7 @@ impl MetalRenderer {
}
fn draw_quads(
&mut self,
&self,
quads: &[Quad],
instance_buffer: &mut InstanceBuffer,
instance_offset: &mut usize,
@@ -698,7 +698,7 @@ impl MetalRenderer {
}
fn draw_paths(
&mut self,
&self,
paths: &[Path<ScaledPixels>],
tiles_by_path_id: &HashMap<PathId, AtlasTile>,
instance_buffer: &mut InstanceBuffer,
@@ -808,7 +808,7 @@ impl MetalRenderer {
}
fn draw_underlines(
&mut self,
&self,
underlines: &[Underline],
instance_buffer: &mut InstanceBuffer,
instance_offset: &mut usize,
@@ -871,7 +871,7 @@ impl MetalRenderer {
}
fn draw_monochrome_sprites(
&mut self,
&self,
texture_id: AtlasTextureId,
sprites: &[MonochromeSprite],
instance_buffer: &mut InstanceBuffer,
@@ -945,7 +945,7 @@ impl MetalRenderer {
}
fn draw_polychrome_sprites(
&mut self,
&self,
texture_id: AtlasTextureId,
sprites: &[PolychromeSprite],
instance_buffer: &mut InstanceBuffer,

View File

@@ -1432,7 +1432,7 @@ impl UTType {
self.0
}
fn inner_mut(&mut self) -> *mut Object {
fn inner_mut(&self) -> *mut Object {
self.0 as *mut _
}
}

View File

@@ -15,36 +15,30 @@ use taffy::{
type NodeMeasureFn =
Box<dyn FnMut(Size<Option<Pixels>>, Size<AvailableSpace>, &mut WindowContext) -> Size<Pixels>>;
struct NodeContext {
measure: NodeMeasureFn,
}
pub struct TaffyLayoutEngine {
taffy: TaffyTree<()>,
styles: FxHashMap<LayoutId, Style>,
children_to_parents: FxHashMap<LayoutId, LayoutId>,
taffy: TaffyTree<NodeContext>,
absolute_layout_bounds: FxHashMap<LayoutId, Bounds<Pixels>>,
computed_layouts: FxHashSet<LayoutId>,
nodes_to_measure: FxHashMap<LayoutId, NodeMeasureFn>,
}
static EXPECT_MESSAGE: &str = "we should avoid taffy layout errors by construction if possible";
const EXPECT_MESSAGE: &str = "we should avoid taffy layout errors by construction if possible";
impl TaffyLayoutEngine {
pub fn new() -> Self {
TaffyLayoutEngine {
taffy: TaffyTree::new(),
styles: FxHashMap::default(),
children_to_parents: FxHashMap::default(),
absolute_layout_bounds: FxHashMap::default(),
computed_layouts: FxHashSet::default(),
nodes_to_measure: FxHashMap::default(),
}
}
pub fn clear(&mut self) {
self.taffy.clear();
self.children_to_parents.clear();
self.absolute_layout_bounds.clear();
self.computed_layouts.clear();
self.nodes_to_measure.clear();
self.styles.clear();
}
pub fn request_layout(
@@ -68,11 +62,8 @@ impl TaffyLayoutEngine {
})
.expect(EXPECT_MESSAGE)
.into();
self.children_to_parents
.extend(children.iter().map(|child_id| (*child_id, parent_id)));
parent_id
};
self.styles.insert(layout_id, style);
layout_id
}
@@ -87,11 +78,14 @@ impl TaffyLayoutEngine {
let layout_id = self
.taffy
.new_leaf_with_context(taffy_style, ())
.new_leaf_with_context(
taffy_style,
NodeContext {
measure: Box::new(measure),
},
)
.expect(EXPECT_MESSAGE)
.into();
self.nodes_to_measure.insert(layout_id, Box::new(measure));
self.styles.insert(layout_id, style);
layout_id
}
@@ -180,8 +174,8 @@ impl TaffyLayoutEngine {
.compute_layout_with_measure(
id.into(),
available_space.into(),
|known_dimensions, available_space, node_id, _context| {
let Some(measure) = self.nodes_to_measure.get_mut(&node_id.into()) else {
|known_dimensions, available_space, _node_id, node_context, _style| {
let Some(node_context) = node_context else {
return taffy::geometry::Size::default();
};
@@ -190,7 +184,7 @@ impl TaffyLayoutEngine {
height: known_dimensions.height.map(Pixels),
};
measure(known_dimensions, available_space.into(), cx).into()
(node_context.measure)(known_dimensions, available_space.into(), cx).into()
},
)
.expect(EXPECT_MESSAGE);
@@ -209,8 +203,8 @@ impl TaffyLayoutEngine {
size: layout.size.into(),
};
if let Some(parent_id) = self.children_to_parents.get(&id).copied() {
let parent_bounds = self.layout_bounds(parent_id);
if let Some(parent_id) = self.taffy.parent(id.0) {
let parent_bounds = self.layout_bounds(parent_id.into());
bounds.origin += parent_bounds.origin;
}
self.absolute_layout_bounds.insert(id, bounds);

View File

@@ -835,10 +835,7 @@ impl Window {
prompt: None,
})
}
fn new_focus_listener(
&mut self,
value: AnyWindowFocusListener,
) -> (Subscription, impl FnOnce()) {
fn new_focus_listener(&self, value: AnyWindowFocusListener) -> (Subscription, impl FnOnce()) {
self.focus_listeners.insert((), value)
}
}
@@ -929,7 +926,7 @@ impl<'a> WindowContext<'a> {
/// Obtain a new [`FocusHandle`], which allows you to track and manipulate the keyboard focus
/// for elements rendered within this window.
pub fn focus_handle(&mut self) -> FocusHandle {
pub fn focus_handle(&self) -> FocusHandle {
FocusHandle::new(&self.window.focus_handles)
}
@@ -1127,7 +1124,7 @@ impl<'a> WindowContext<'a> {
/// Register a callback to be invoked when the given Model or View is released.
pub fn observe_release<E, T>(
&mut self,
&self,
entity: &E,
mut on_release: impl FnOnce(&mut T, &mut WindowContext) + 'static,
) -> Subscription
@@ -1155,7 +1152,7 @@ impl<'a> WindowContext<'a> {
}
/// Schedule the given closure to be run directly after the current frame is rendered.
pub fn on_next_frame(&mut self, callback: impl FnOnce(&mut WindowContext) + 'static) {
pub fn on_next_frame(&self, callback: impl FnOnce(&mut WindowContext) + 'static) {
RefCell::borrow_mut(&self.window.next_frame_callbacks).push(Box::new(callback));
}
@@ -1165,7 +1162,7 @@ impl<'a> WindowContext<'a> {
/// It will cause the window to redraw on the next frame, even if no other changes have occurred.
///
/// If called from within a view, it will notify that view on the next frame. Otherwise, it will refresh the entire window.
pub fn request_animation_frame(&mut self) {
pub fn request_animation_frame(&self) {
let parent_id = self.parent_view_id();
self.on_next_frame(move |cx| {
if let Some(parent_id) = parent_id {
@@ -1179,7 +1176,7 @@ impl<'a> WindowContext<'a> {
/// Spawn the future returned by the given closure on the application thread pool.
/// The closure is provided a handle to the current window and an `AsyncWindowContext` for
/// use within your future.
pub fn spawn<Fut, R>(&mut self, f: impl FnOnce(AsyncWindowContext) -> Fut) -> Task<R>
pub fn spawn<Fut, R>(&self, f: impl FnOnce(AsyncWindowContext) -> Fut) -> Task<R>
where
R: 'static,
Fut: Future<Output = R> + 'static,
@@ -2865,7 +2862,7 @@ impl<'a> WindowContext<'a> {
}
/// Get the last view id for the current element
pub fn parent_view_id(&mut self) -> Option<EntityId> {
pub fn parent_view_id(&self) -> Option<EntityId> {
self.window.next_frame.dispatch_tree.parent_view_id()
}
@@ -3606,7 +3603,7 @@ impl<'a> WindowContext<'a> {
}
/// Updates the IME panel position suggestions for languages like Japanese and Chinese.
pub fn invalidate_character_coordinates(&mut self) {
pub fn invalidate_character_coordinates(&self) {
self.on_next_frame(|cx| {
if let Some(mut input_handler) = cx.window.platform_window.take_input_handler() {
if let Some(bounds) = input_handler.selected_bounds(cx) {
@@ -3752,7 +3749,7 @@ impl<'a> WindowContext<'a> {
/// Register a callback that can interrupt the closing of the current window based on the returned boolean.
/// If the callback returns false, the window won't be closed.
pub fn on_window_should_close(&mut self, f: impl Fn(&mut WindowContext) -> bool + 'static) {
pub fn on_window_should_close(&self, f: impl Fn(&mut WindowContext) -> bool + 'static) {
let mut this = self.to_async();
self.window
.platform_window
@@ -4070,7 +4067,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
}
/// Sets a given callback to be run on the next frame.
pub fn on_next_frame(&mut self, f: impl FnOnce(&mut V, &mut ViewContext<V>) + 'static)
pub fn on_next_frame(&self, f: impl FnOnce(&mut V, &mut ViewContext<V>) + 'static)
where
V: 'static,
{
@@ -4162,7 +4159,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
/// The callback receives a handle to the view's window. This handle may be
/// invalid, if the window was closed before the view was released.
pub fn on_release(
&mut self,
&self,
on_release: impl FnOnce(&mut V, AnyWindowHandle, &mut AppContext) + 'static,
) -> Subscription {
let window_handle = self.window.handle;
@@ -4179,7 +4176,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
/// Register a callback to be invoked when the given Model or View is released.
pub fn observe_release<V2, E>(
&mut self,
&self,
entity: &E,
mut on_release: impl FnMut(&mut V, &mut V2, &mut ViewContext<'_, V>) + 'static,
) -> Subscription
@@ -4212,7 +4209,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
/// Register a callback to be invoked when the window is resized.
pub fn observe_window_bounds(
&mut self,
&self,
mut callback: impl FnMut(&mut V, &mut ViewContext<V>) + 'static,
) -> Subscription {
let view = self.view.downgrade();
@@ -4226,7 +4223,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
/// Register a callback to be invoked when the window is activated or deactivated.
pub fn observe_window_activation(
&mut self,
&self,
mut callback: impl FnMut(&mut V, &mut ViewContext<V>) + 'static,
) -> Subscription {
let view = self.view.downgrade();
@@ -4240,7 +4237,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
/// Registers a callback to be invoked when the window appearance changes.
pub fn observe_window_appearance(
&mut self,
&self,
mut callback: impl FnMut(&mut V, &mut ViewContext<V>) + 'static,
) -> Subscription {
let view = self.view.downgrade();
@@ -4260,7 +4257,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
mut f: impl FnMut(&mut V, &KeystrokeEvent, &mut ViewContext<V>) + 'static,
) -> Subscription {
fn inner(
keystroke_observers: &mut SubscriberSet<(), KeystrokeObserver>,
keystroke_observers: &SubscriberSet<(), KeystrokeObserver>,
handler: KeystrokeObserver,
) -> Subscription {
let (subscription, activate) = keystroke_observers.insert((), handler);
@@ -4284,7 +4281,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
/// Register a callback to be invoked when the window's pending input changes.
pub fn observe_pending_input(
&mut self,
&self,
mut callback: impl FnMut(&mut V, &mut ViewContext<V>) + 'static,
) -> Subscription {
let view = self.view.downgrade();
@@ -4372,7 +4369,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
/// and this callback lets you choose a default place to restore the user's focus.
/// Returns a subscription; the callback persists until the subscription is dropped.
pub fn on_focus_lost(
&mut self,
&self,
mut listener: impl FnMut(&mut V, &mut ViewContext<V>) + 'static,
) -> Subscription {
let view = self.view.downgrade();
@@ -4418,10 +4415,7 @@ impl<'a, V: 'static> ViewContext<'a, V> {
/// The given callback is invoked with a [`WeakView<V>`] to avoid leaking the view for a long-running process.
/// It's also given an [`AsyncWindowContext`], which can be used to access the state of the view across await points.
/// The returned future will be polled on the main thread.
pub fn spawn<Fut, R>(
&mut self,
f: impl FnOnce(WeakView<V>, AsyncWindowContext) -> Fut,
) -> Task<R>
pub fn spawn<Fut, R>(&self, f: impl FnOnce(WeakView<V>, AsyncWindowContext) -> Fut) -> Task<R>
where
R: 'static,
Fut: Future<Output = R> + 'static,

View File

@@ -16,11 +16,13 @@ path = "src/http_client.rs"
doctest = true
[dependencies]
http = "0.2"
anyhow.workspace = true
derive_more.workspace = true
futures.workspace = true
http = "1.1"
log.workspace = true
rustls-native-certs.workspace = true
rustls.workspace = true
serde.workspace = true
serde_json.workspace = true
smol.workspace = true

View File

@@ -11,13 +11,21 @@ use http::request::Builder;
#[cfg(feature = "test-support")]
use std::fmt;
use std::{
sync::{Arc, Mutex},
sync::{Arc, LazyLock, Mutex},
time::Duration,
};
pub use url::Url;
#[derive(Clone)]
pub struct ReadTimeout(pub Duration);
#[derive(Default, Debug, Clone)]
impl Default for ReadTimeout {
fn default() -> Self {
Self(Duration::from_secs(5))
}
}
#[derive(Default, Debug, Clone, PartialEq, Eq, Hash)]
pub enum RedirectPolicy {
#[default]
NoFollow,
@@ -26,6 +34,23 @@ pub enum RedirectPolicy {
}
pub struct FollowRedirects(pub bool);
pub static TLS_CONFIG: LazyLock<Arc<rustls::ClientConfig>> = LazyLock::new(|| {
let mut root_store = rustls::RootCertStore::empty();
let root_certs = rustls_native_certs::load_native_certs();
for error in root_certs.errors {
log::warn!("error loading native certs: {:?}", error);
}
root_store.add_parsable_certificates(&root_certs.certs);
Arc::new(
rustls::ClientConfig::builder()
.with_safe_defaults()
.with_root_certificates(root_store)
.with_no_client_auth(),
)
});
pub trait HttpRequestExt {
/// Set a read timeout on the request.
/// For isahc, this is the low_speed_timeout.

View File

@@ -1,105 +0,0 @@
use std::{mem, sync::Arc, time::Duration};
use futures::future::BoxFuture;
use util::maybe;
pub use isahc::config::Configurable;
pub struct IsahcHttpClient(isahc::HttpClient);
pub use http_client::*;
impl IsahcHttpClient {
pub fn new(proxy: Option<Uri>, user_agent: Option<String>) -> Arc<IsahcHttpClient> {
let mut builder = isahc::HttpClient::builder()
.connect_timeout(Duration::from_secs(5))
.low_speed_timeout(100, Duration::from_secs(5))
.proxy(proxy.clone());
if let Some(agent) = user_agent {
builder = builder.default_header("User-Agent", agent);
}
Arc::new(IsahcHttpClient(builder.build().unwrap()))
}
pub fn builder() -> isahc::HttpClientBuilder {
isahc::HttpClientBuilder::new()
}
}
impl From<isahc::HttpClient> for IsahcHttpClient {
fn from(client: isahc::HttpClient) -> Self {
Self(client)
}
}
impl HttpClient for IsahcHttpClient {
fn proxy(&self) -> Option<&Uri> {
None
}
fn send(
&self,
req: http_client::http::Request<http_client::AsyncBody>,
) -> BoxFuture<'static, Result<http_client::Response<http_client::AsyncBody>, anyhow::Error>>
{
let redirect_policy = req
.extensions()
.get::<http_client::RedirectPolicy>()
.cloned()
.unwrap_or_default();
let read_timeout = req
.extensions()
.get::<http_client::ReadTimeout>()
.map(|t| t.0);
let req = maybe!({
let (mut parts, body) = req.into_parts();
let mut builder = isahc::Request::builder()
.method(parts.method)
.uri(parts.uri)
.version(parts.version);
if let Some(read_timeout) = read_timeout {
builder = builder.low_speed_timeout(100, read_timeout);
}
let headers = builder.headers_mut()?;
mem::swap(headers, &mut parts.headers);
let extensions = builder.extensions_mut()?;
mem::swap(extensions, &mut parts.extensions);
let isahc_body = match body.0 {
http_client::Inner::Empty => isahc::AsyncBody::empty(),
http_client::Inner::AsyncReader(reader) => isahc::AsyncBody::from_reader(reader),
http_client::Inner::SyncReader(reader) => {
isahc::AsyncBody::from_bytes_static(reader.into_inner())
}
};
builder
.redirect_policy(match redirect_policy {
http_client::RedirectPolicy::FollowAll => isahc::config::RedirectPolicy::Follow,
http_client::RedirectPolicy::FollowLimit(limit) => {
isahc::config::RedirectPolicy::Limit(limit)
}
http_client::RedirectPolicy::NoFollow => isahc::config::RedirectPolicy::None,
})
.body(isahc_body)
.ok()
});
let client = self.0.clone();
Box::pin(async move {
match req {
Some(req) => client
.send_async(req)
.await
.map_err(Into::into)
.map(|response| {
let (parts, body) = response.into_parts();
let body = http_client::AsyncBody::from_reader(body);
http_client::Response::from_parts(parts, body)
}),
None => Err(anyhow::anyhow!("Request was malformed")),
}
})
}
}

View File

@@ -73,7 +73,7 @@ pub use lsp::DiagnosticSeverity;
/// a diff against the contents of its file.
pub static BUFFER_DIFF_TASK: LazyLock<TaskLabel> = LazyLock::new(TaskLabel::new);
/// Indicate whether a [Buffer] has permissions to edit.
/// Indicate whether a [`Buffer`] has permissions to edit.
#[derive(PartialEq, Clone, Copy, Debug)]
pub enum Capability {
/// The buffer is a mutable replica.
@@ -211,7 +211,7 @@ pub struct Diagnostic {
///
/// When a language server produces a diagnostic with
/// one or more associated diagnostics, those diagnostics are all
/// assigned a single group id.
/// assigned a single group ID.
pub group_id: usize,
/// Whether this diagnostic is the primary diagnostic for its group.
///
@@ -588,7 +588,7 @@ impl IndentGuide {
impl Buffer {
/// Create a new buffer with the given base text.
pub fn local<T: Into<String>>(base_text: T, cx: &mut ModelContext<Self>) -> Self {
pub fn local<T: Into<String>>(base_text: T, cx: &ModelContext<Self>) -> Self {
Self::build(
TextBuffer::new(0, cx.entity_id().as_non_zero_u64().into(), base_text.into()),
None,
@@ -601,7 +601,7 @@ impl Buffer {
pub fn local_normalized(
base_text_normalized: Rope,
line_ending: LineEnding,
cx: &mut ModelContext<Self>,
cx: &ModelContext<Self>,
) -> Self {
Self::build(
TextBuffer::new_normalized(
@@ -718,7 +718,7 @@ impl Buffer {
self
}
/// Returns the [Capability] of this buffer.
/// Returns the [`Capability`] of this buffer.
pub fn capability(&self) -> Capability {
self.capability
}
@@ -728,7 +728,7 @@ impl Buffer {
self.capability == Capability::ReadOnly
}
/// Builds a [Buffer] with the given underlying [TextBuffer], diff base, [File] and [Capability].
/// Builds a [`Buffer`] with the given underlying [`TextBuffer`], diff base, [`File`] and [`Capability`].
pub fn build(
buffer: TextBuffer,
diff_base: Option<String>,
@@ -819,6 +819,9 @@ impl Buffer {
branch.set_language_registry(language_registry);
}
// Reparse the branch buffer so that we get syntax highlighting immediately.
branch.reparse(cx);
branch
})
}
@@ -931,7 +934,7 @@ impl Buffer {
/// Assign a language registry to the buffer. This allows the buffer to retrieve
/// other languages if parts of the buffer are written in different languages.
pub fn set_language_registry(&mut self, language_registry: Arc<LanguageRegistry>) {
pub fn set_language_registry(&self, language_registry: Arc<LanguageRegistry>) {
self.syntax_map
.lock()
.set_language_registry(language_registry);
@@ -941,7 +944,7 @@ impl Buffer {
self.syntax_map.lock().language_registry()
}
/// Assign the buffer a new [Capability].
/// Assign the buffer a new [`Capability`].
pub fn set_capability(&mut self, capability: Capability, cx: &mut ModelContext<Self>) {
self.capability = capability;
cx.emit(BufferEvent::CapabilityChanged)
@@ -964,16 +967,13 @@ impl Buffer {
}
/// This method is called to signal that the buffer has been discarded.
pub fn discarded(&mut self, cx: &mut ModelContext<Self>) {
pub fn discarded(&self, cx: &mut ModelContext<Self>) {
cx.emit(BufferEvent::Discarded);
cx.notify();
}
/// Reloads the contents of the buffer from disk.
pub fn reload(
&mut self,
cx: &mut ModelContext<Self>,
) -> oneshot::Receiver<Option<Transaction>> {
pub fn reload(&mut self, cx: &ModelContext<Self>) -> oneshot::Receiver<Option<Transaction>> {
let (tx, rx) = futures::channel::oneshot::channel();
let prev_version = self.text.version();
self.reload_task = Some(cx.spawn(|this, mut cx| async move {
@@ -1032,7 +1032,7 @@ impl Buffer {
cx.notify();
}
/// Updates the [File] backing this buffer. This should be called when
/// Updates the [`File`] backing this buffer. This should be called when
/// the file has changed or has been deleted.
pub fn file_updated(&mut self, new_file: Arc<dyn File>, cx: &mut ModelContext<Self>) {
let mut file_changed = false;
@@ -1071,7 +1071,7 @@ impl Buffer {
}
}
/// Returns the current diff base, see [Buffer::set_diff_base].
/// Returns the current diff base, see [`Buffer::set_diff_base`].
pub fn diff_base(&self) -> Option<&Rope> {
match self.diff_base.as_ref()? {
BufferDiffBase::Git(rope) | BufferDiffBase::PastBufferVersion { rope, .. } => {
@@ -1082,7 +1082,7 @@ impl Buffer {
/// Sets the text that will be used to compute a Git diff
/// against the buffer text.
pub fn set_diff_base(&mut self, diff_base: Option<String>, cx: &mut ModelContext<Self>) {
pub fn set_diff_base(&mut self, diff_base: Option<String>, cx: &ModelContext<Self>) {
self.diff_base = diff_base.map(|mut raw_diff_base| {
LineEnding::normalize(&mut raw_diff_base);
BufferDiffBase::Git(Rope::from(raw_diff_base))
@@ -1114,7 +1114,7 @@ impl Buffer {
}
/// Recomputes the diff.
pub fn recalculate_diff(&mut self, cx: &mut ModelContext<Self>) -> Option<Task<()>> {
pub fn recalculate_diff(&self, cx: &ModelContext<Self>) -> Option<Task<()>> {
let diff_base_rope = match self.diff_base.as_ref()? {
BufferDiffBase::Git(rope) => rope.clone(),
BufferDiffBase::PastBufferVersion { buffer, .. } => buffer.read(cx).as_rope().clone(),
@@ -1142,12 +1142,12 @@ impl Buffer {
}))
}
/// Returns the primary [Language] assigned to this [Buffer].
/// Returns the primary [`Language`] assigned to this [`Buffer`].
pub fn language(&self) -> Option<&Arc<Language>> {
self.language.as_ref()
}
/// Returns the [Language] at the given location.
/// Returns the [`Language`] at the given location.
pub fn language_at<D: ToOffset>(&self, position: D) -> Option<Arc<Language>> {
let offset = position.to_offset(self);
self.syntax_map
@@ -2246,12 +2246,7 @@ impl Buffer {
}
}
fn send_operation(
&mut self,
operation: Operation,
is_local: bool,
cx: &mut ModelContext<Self>,
) {
fn send_operation(&self, operation: Operation, is_local: bool, cx: &mut ModelContext<Self>) {
cx.emit(BufferEvent::Operation {
operation,
is_local,
@@ -2730,6 +2725,7 @@ impl BufferSnapshot {
.collect();
(captures, highlight_maps)
}
/// Iterates over chunks of text in the given range of the buffer. Text is chunked
/// in an arbitrary way due to being stored in a [`Rope`](text::Rope). The text is also
/// returned in chunks where each chunk has a single syntax highlighting style and
@@ -2781,12 +2777,12 @@ impl BufferSnapshot {
.last()
}
/// Returns the main [Language]
/// Returns the main [`Language`].
pub fn language(&self) -> Option<&Arc<Language>> {
self.language.as_ref()
}
/// Returns the [Language] at the given location.
/// Returns the [`Language`] at the given location.
pub fn language_at<D: ToOffset>(&self, position: D) -> Option<&Arc<Language>> {
self.syntax_layer_at(position)
.map(|info| info.language)
@@ -2806,7 +2802,7 @@ impl BufferSnapshot {
CharClassifier::new(self.language_scope_at(point))
}
/// Returns the [LanguageScope] at the given location.
/// Returns the [`LanguageScope`] at the given location.
pub fn language_scope_at<D: ToOffset>(&self, position: D) -> Option<LanguageScope> {
let offset = position.to_offset(self);
let mut scope = None;
@@ -2961,7 +2957,7 @@ impl BufferSnapshot {
/// Returns the outline for the buffer.
///
/// This method allows passing an optional [SyntaxTheme] to
/// This method allows passing an optional [`SyntaxTheme`] to
/// syntax-highlight the returned symbols.
pub fn outline(&self, theme: Option<&SyntaxTheme>) -> Option<Outline<Anchor>> {
self.outline_items_containing(0..self.len(), true, theme)
@@ -2970,7 +2966,7 @@ impl BufferSnapshot {
/// Returns all the symbols that contain the given position.
///
/// This method allows passing an optional [SyntaxTheme] to
/// This method allows passing an optional [`SyntaxTheme`] to
/// syntax-highlight the returned symbols.
pub fn symbols_containing<T: ToOffset>(
&self,
@@ -3213,7 +3209,7 @@ impl BufferSnapshot {
}
/// For each grammar in the language, runs the provided
/// [tree_sitter::Query] against the given range.
/// [`tree_sitter::Query`] against the given range.
pub fn matches(
&self,
range: Range<usize>,
@@ -3774,7 +3770,7 @@ impl BufferSnapshot {
})
}
/// Whether the buffer contains any git changes.
/// Whether the buffer contains any Git changes.
pub fn has_git_diff(&self) -> bool {
!self.git_diff.is_empty()
}
@@ -3856,7 +3852,7 @@ impl BufferSnapshot {
}
/// Returns all the diagnostic groups associated with the given
/// language server id. If no language server id is provided,
/// language server ID. If no language server ID is provided,
/// all diagnostics groups are returned.
pub fn diagnostic_groups(
&self,
@@ -4239,7 +4235,7 @@ impl Default for Diagnostic {
}
impl IndentSize {
/// Returns an [IndentSize] representing the given spaces.
/// Returns an [`IndentSize`] representing the given spaces.
pub fn spaces(len: u32) -> Self {
Self {
len,
@@ -4247,7 +4243,7 @@ impl IndentSize {
}
}
/// Returns an [IndentSize] representing a tab.
/// Returns an [`IndentSize`] representing a tab.
pub fn tab() -> Self {
Self {
len: 1,
@@ -4255,12 +4251,12 @@ impl IndentSize {
}
}
/// An iterator over the characters represented by this [IndentSize].
/// An iterator over the characters represented by this [`IndentSize`].
pub fn chars(&self) -> impl Iterator<Item = char> {
iter::repeat(self.char()).take(self.len as usize)
}
/// The character representation of this [IndentSize].
/// The character representation of this [`IndentSize`].
pub fn char(&self) -> char {
match self.kind {
IndentKind::Space => ' ',
@@ -4268,7 +4264,7 @@ impl IndentSize {
}
}
/// Consumes the current [IndentSize] and returns a new one that has
/// Consumes the current [`IndentSize`] and returns a new one that has
/// been shrunk or enlarged by the given size along the given direction.
pub fn with_delta(mut self, direction: Ordering, size: IndentSize) -> Self {
match direction {

View File

@@ -22,6 +22,7 @@ pub use request::*;
pub use role::*;
use schemars::JsonSchema;
use serde::{de::DeserializeOwned, Deserialize, Serialize};
use std::fmt;
use std::{future::Future, sync::Arc};
use ui::IconName;
@@ -231,6 +232,12 @@ pub struct LanguageModelProviderId(pub SharedString);
#[derive(Clone, Eq, PartialEq, Hash, Debug, Ord, PartialOrd)]
pub struct LanguageModelProviderName(pub SharedString);
impl fmt::Display for LanguageModelProviderId {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "{}", self.0)
}
}
impl From<String> for LanguageModelId {
fn from(value: String) -> Self {
Self(SharedString::from(value))

View File

@@ -21,7 +21,6 @@ load-grammars = [
"tree-sitter-jsdoc",
"tree-sitter-json",
"tree-sitter-md",
"protols-tree-sitter-proto",
"tree-sitter-python",
"tree-sitter-regex",
"tree-sitter-rust",
@@ -46,7 +45,6 @@ lsp.workspace = true
node_runtime.workspace = true
paths.workspace = true
project.workspace = true
protols-tree-sitter-proto = { workspace = true, optional = true }
regex.workspace = true
rope.workspace = true
rust-embed.workspace = true

View File

@@ -0,0 +1,10 @@
; Tag the main function
(
(function_definition
declarator: (function_declarator
declarator: (identifier) @run
)
) @c-main
(#eq? @run "main")
(#set! tag c-main)
)

View File

@@ -45,7 +45,6 @@ pub fn init(languages: Arc<LanguageRegistry>, node_runtime: NodeRuntime, cx: &mu
("jsonc", tree_sitter_json::LANGUAGE),
("markdown", tree_sitter_md::LANGUAGE),
("markdown-inline", tree_sitter_md::INLINE_LANGUAGE),
("proto", protols_tree_sitter_proto::LANGUAGE),
("python", tree_sitter_python::LANGUAGE),
("regex", tree_sitter_regex::LANGUAGE),
("rust", tree_sitter_rust::LANGUAGE),
@@ -183,7 +182,6 @@ pub fn init(languages: Arc<LanguageRegistry>, node_runtime: NodeRuntime, cx: &mu
"yaml",
vec![Arc::new(yaml::YamlLspAdapter::new(node_runtime.clone()))]
);
language!("proto");
// Register globally available language servers.
//
@@ -277,7 +275,7 @@ pub fn language(name: &str, grammar: tree_sitter::Language) -> Arc<Language> {
fn load_config(name: &str) -> LanguageConfig {
let config_toml = String::from_utf8(
LanguageDir::get(&format!("{}/config.toml", name))
.unwrap()
.unwrap_or_else(|| panic!("missing config for language {:?}", name))
.data
.to_vec(),
)

View File

@@ -20,7 +20,7 @@ jsonwebtoken.workspace = true
log.workspace = true
prost.workspace = true
prost-types.workspace = true
reqwest = "0.11"
reqwest.workspace = true
serde.workspace = true
[build-dependencies]

View File

@@ -631,8 +631,7 @@ impl LanguageServer {
"filterText".to_string(),
"labelDetails".to_string(),
"tags".to_string(),
// NB: Do not have this resolved, otherwise Zed becomes slow to complete things
// "textEdit".to_string(),
"textEdit".to_string(),
],
}),
insert_replace_support: Some(true),

View File

@@ -71,7 +71,7 @@ impl Markdown {
source: String,
style: MarkdownStyle,
language_registry: Option<Arc<LanguageRegistry>>,
cx: &mut ViewContext<Self>,
cx: &ViewContext<Self>,
fallback_code_block_language: Option<String>,
) -> Self {
let focus_handle = cx.focus_handle();
@@ -97,7 +97,7 @@ impl Markdown {
source: String,
style: MarkdownStyle,
language_registry: Option<Arc<LanguageRegistry>>,
cx: &mut ViewContext<Self>,
cx: &ViewContext<Self>,
fallback_code_block_language: Option<String>,
) -> Self {
let focus_handle = cx.focus_handle();
@@ -119,12 +119,12 @@ impl Markdown {
this
}
pub fn append(&mut self, text: &str, cx: &mut ViewContext<Self>) {
pub fn append(&mut self, text: &str, cx: &ViewContext<Self>) {
self.source.push_str(text);
self.parse(cx);
}
pub fn reset(&mut self, source: String, cx: &mut ViewContext<Self>) {
pub fn reset(&mut self, source: String, cx: &ViewContext<Self>) {
if source == self.source() {
return;
}
@@ -145,7 +145,7 @@ impl Markdown {
&self.parsed_markdown
}
fn copy(&self, text: &RenderedText, cx: &mut ViewContext<Self>) {
fn copy(&self, text: &RenderedText, cx: &ViewContext<Self>) {
if self.selection.end <= self.selection.start {
return;
}
@@ -153,7 +153,7 @@ impl Markdown {
cx.write_to_clipboard(ClipboardItem::new_string(text));
}
fn parse(&mut self, cx: &mut ViewContext<Self>) {
fn parse(&mut self, cx: &ViewContext<Self>) {
if self.source.is_empty() {
return;
}
@@ -319,7 +319,7 @@ impl MarkdownElement {
}
fn paint_selection(
&mut self,
&self,
bounds: Bounds<Pixels>,
rendered_text: &RenderedText,
cx: &mut WindowContext,
@@ -382,7 +382,7 @@ impl MarkdownElement {
}
fn paint_mouse_listeners(
&mut self,
&self,
hitbox: &Hitbox,
rendered_text: &RenderedText,
cx: &mut WindowContext,
@@ -487,7 +487,7 @@ impl MarkdownElement {
});
}
fn autoscroll(&mut self, rendered_text: &RenderedText, cx: &mut WindowContext) -> Option<()> {
fn autoscroll(&self, rendered_text: &RenderedText, cx: &mut WindowContext) -> Option<()> {
let autoscroll_index = self
.markdown
.update(cx, |markdown, _| markdown.autoscroll_request.take())?;

View File

@@ -515,7 +515,7 @@ impl MultiBuffer {
}
pub fn edit<I, S, T>(
&mut self,
&self,
edits: I,
mut autoindent_mode: Option<AutoindentMode>,
cx: &mut ModelContext<Self>,
@@ -664,7 +664,7 @@ impl MultiBuffer {
drop(snapshot);
// Non-generic part of edit, hoisted out to avoid blowing up LLVM IR.
fn tail(
this: &mut MultiBuffer,
this: &MultiBuffer,
buffer_edits: HashMap<BufferId, Vec<BufferEdit>>,
autoindent_mode: Option<AutoindentMode>,
edited_excerpt_ids: Vec<ExcerptId>,
@@ -928,7 +928,7 @@ impl MultiBuffer {
}
}
pub fn push_transaction<'a, T>(&mut self, buffer_transactions: T, cx: &mut ModelContext<Self>)
pub fn push_transaction<'a, T>(&mut self, buffer_transactions: T, cx: &ModelContext<Self>)
where
T: IntoIterator<Item = (&'a Model<Buffer>, &'a language::Transaction)>,
{
@@ -952,7 +952,7 @@ impl MultiBuffer {
}
pub fn set_active_selections(
&mut self,
&self,
selections: &[Selection<Anchor>],
line_mode: bool,
cursor_shape: CursorShape,
@@ -1028,7 +1028,7 @@ impl MultiBuffer {
}
}
pub fn remove_active_selections(&mut self, cx: &mut ModelContext<Self>) {
pub fn remove_active_selections(&self, cx: &mut ModelContext<Self>) {
for buffer in self.buffers.borrow().values() {
buffer
.buffer
@@ -1180,7 +1180,7 @@ impl MultiBuffer {
}
pub fn push_multiple_excerpts_with_context_lines(
&mut self,
&self,
buffers_with_ranges: Vec<(Model<Buffer>, Vec<Range<text::Anchor>>)>,
context_line_count: u32,
cx: &mut ModelContext<Self>,
@@ -4208,7 +4208,7 @@ impl History {
&mut self,
buffer_transactions: T,
now: Instant,
cx: &mut ModelContext<MultiBuffer>,
cx: &ModelContext<MultiBuffer>,
) where
T: IntoIterator<Item = (&'a Model<Buffer>, &'a language::Transaction)>,
{
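Many of the hunks above relax method receivers from `&mut self` to `&self`. This compiles because `MultiBuffer` keeps its per-buffer state behind a `RefCell` (visible in `self.buffers.borrow()`), so mutation goes through interior mutability rather than an exclusive reference. Below is a minimal, self-contained sketch of the pattern; `MultiBufferLike` is a made-up stand-in, not the real struct:

```rust
use std::cell::RefCell;
use std::collections::HashMap;

// Hypothetical, stripped-down illustration: state lives behind a RefCell,
// so methods that mutate it can take `&self`. Borrows are checked at runtime.
struct MultiBufferLike {
    buffers: RefCell<HashMap<u64, String>>,
}

impl MultiBufferLike {
    fn new() -> Self {
        Self { buffers: RefCell::new(HashMap::new()) }
    }

    // Takes `&self`, yet still inserts into the map.
    fn insert(&self, id: u64, text: &str) {
        self.buffers.borrow_mut().insert(id, text.to_string());
    }

    fn len(&self) -> usize {
        self.buffers.borrow().len()
    }
}

fn main() {
    let mb = MultiBufferLike::new(); // note: no `mut` binding needed
    mb.insert(1, "hello");
    assert_eq!(mb.len(), 1);
    println!("ok");
}
```

The trade-off is that an overlapping `borrow_mut` now panics at runtime instead of failing to compile, which is why such refactors tend to keep each borrow short-lived.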


@@ -0,0 +1,297 @@
use std::sync::LazyLock;
use gpui::{Hsla, Rgba};
use lsp::{CompletionItem, Documentation};
use regex::{Regex, RegexBuilder};
const HEX: &'static str = r#"(#(?:[\da-fA-F]{3}){1,2})"#;
const RGB_OR_HSL: &'static str = r#"(rgba?|hsla?)\(\s*(\d{1,3}%?)\s*,\s*(\d{1,3}%?)\s*,\s*(\d{1,3}%?)\s*(?:,\s*(1|0?\.\d+))?\s*\)"#;
static RELAXED_HEX_REGEX: LazyLock<Regex> = LazyLock::new(|| {
RegexBuilder::new(HEX)
.case_insensitive(false)
.build()
.expect("Failed to create RELAXED_HEX_REGEX")
});
static STRICT_HEX_REGEX: LazyLock<Regex> = LazyLock::new(|| {
RegexBuilder::new(&format!("^{HEX}$"))
.case_insensitive(true)
.build()
.expect("Failed to create STRICT_HEX_REGEX")
});
static RELAXED_RGB_OR_HSL_REGEX: LazyLock<Regex> = LazyLock::new(|| {
RegexBuilder::new(RGB_OR_HSL)
.case_insensitive(false)
.build()
.expect("Failed to create RELAXED_RGB_OR_HSL_REGEX")
});
static STRICT_RGB_OR_HSL_REGEX: LazyLock<Regex> = LazyLock::new(|| {
RegexBuilder::new(&format!("^{RGB_OR_HSL}$"))
.case_insensitive(true)
.build()
.expect("Failed to create STRICT_RGB_OR_HSL_REGEX")
});
/// Extracts a color from an LSP [`CompletionItem`].
///
/// Adapted from https://github.com/microsoft/vscode/blob/a6870fcb6d79093738c17e8319b760cf1c41764a/src/vs/editor/contrib/suggest/browser/suggestWidgetRenderer.ts#L34-L61
pub fn extract_color(item: &CompletionItem) -> Option<Hsla> {
// Try to extract from entire `label` field.
parse(&item.label, ParseMode::Strict)
// Try to extract from entire `detail` field.
.or_else(|| {
item.detail
.as_ref()
.and_then(|detail| parse(detail, ParseMode::Strict))
})
// Try to extract from beginning or end of `documentation` field.
.or_else(|| match item.documentation {
Some(Documentation::String(ref str)) => parse(str, ParseMode::Relaxed),
Some(Documentation::MarkupContent(ref markup)) => {
parse(&markup.value, ParseMode::Relaxed)
}
None => None,
})
}
enum ParseMode {
Strict,
Relaxed,
}
fn parse(str: &str, mode: ParseMode) -> Option<Hsla> {
let (hex, rgb) = match mode {
ParseMode::Strict => (&STRICT_HEX_REGEX, &STRICT_RGB_OR_HSL_REGEX),
ParseMode::Relaxed => (&RELAXED_HEX_REGEX, &RELAXED_RGB_OR_HSL_REGEX),
};
if let Some(captures) = hex.captures(str) {
let rmatch = captures.get(0)?;
// Color must be anchored to start or end of string.
if rmatch.start() > 0 && rmatch.end() != str.len() {
return None;
}
let hex = captures.get(1)?.as_str();
return from_hex(hex);
}
if let Some(captures) = rgb.captures(str) {
let rmatch = captures.get(0)?;
// Color must be anchored to start or end of string.
if rmatch.start() > 0 && rmatch.end() != str.len() {
return None;
}
let typ = captures.get(1)?.as_str();
let r_or_h = captures.get(2)?.as_str();
let g_or_s = captures.get(3)?.as_str();
let b_or_l = captures.get(4)?.as_str();
let a = captures.get(5).map(|a| a.as_str());
return match (typ, a) {
("rgb", None) | ("rgba", Some(_)) => from_rgb(r_or_h, g_or_s, b_or_l, a),
("hsl", None) | ("hsla", Some(_)) => from_hsl(r_or_h, g_or_s, b_or_l, a),
_ => None,
};
}
return None;
}
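The `rmatch.start() > 0 && rmatch.end() != str.len()` guard above means a relaxed match is rejected only when it sits strictly inside the string; matches touching either end are kept. A stand-alone restatement of that predicate (the helper name is ours, not from the source):

```rust
// Mirrors the anchoring guard in `parse`: a relaxed regex match is kept
// only when it touches the start or the end of the text.
fn is_anchored(match_start: usize, match_end: usize, text_len: usize) -> bool {
    !(match_start > 0 && match_end != text_len)
}

fn main() {
    let text = "foo #ff0000";
    // A match ending exactly at text.len() is accepted ("documentation end").
    assert!(is_anchored(4, text.len(), text.len()));
    // A match in the middle of "foo #ff0000 bar" is rejected.
    assert!(!is_anchored(4, 11, "foo #ff0000 bar".len()));
    println!("ok");
}
```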
fn parse_component(value: &str, max: f32) -> Option<f32> {
if let Some(field) = value.strip_suffix("%") {
field.parse::<f32>().map(|value| value / 100.).ok()
} else {
value.parse::<f32>().map(|value| value / max).ok()
}
}
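`parse_component` treats a trailing `%` as a fraction of 100 and everything else as a fraction of the caller-supplied `max` (255 for RGB channels, 360 for hue, 1.0 for alpha). A self-contained copy of the function you can run to confirm the scaling:

```rust
// Same shape as `parse_component` in the diff (regex-free, stdlib only):
// a "%" suffix scales by 100; otherwise the value is scaled by `max`.
fn parse_component(value: &str, max: f32) -> Option<f32> {
    if let Some(field) = value.strip_suffix('%') {
        field.parse::<f32>().map(|v| v / 100.).ok()
    } else {
        value.parse::<f32>().map(|v| v / max).ok()
    }
}

fn main() {
    assert_eq!(parse_component("255", 255.), Some(1.0)); // full RGB channel
    assert_eq!(parse_component("20%", 255.), Some(0.2)); // percentage channel
    assert_eq!(parse_component("oops", 255.), None);     // non-numeric input
    println!("ok");
}
```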
fn from_hex(hex: &str) -> Option<Hsla> {
Rgba::try_from(hex).map(Hsla::from).ok()
}
fn from_rgb(r: &str, g: &str, b: &str, a: Option<&str>) -> Option<Hsla> {
let r = parse_component(r, 255.)?;
let g = parse_component(g, 255.)?;
let b = parse_component(b, 255.)?;
let a = a.and_then(|a| parse_component(a, 1.0)).unwrap_or(1.0);
Some(Rgba { r, g, b, a }.into())
}
fn from_hsl(h: &str, s: &str, l: &str, a: Option<&str>) -> Option<Hsla> {
let h = parse_component(h, 360.)?;
let s = parse_component(s, 100.)?;
let l = parse_component(l, 100.)?;
let a = a.and_then(|a| parse_component(a, 1.0)).unwrap_or(1.0);
Some(Hsla { h, s, l, a })
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::rgba;
use lsp::{CompletionItem, CompletionItemKind};
pub static COLOR_TABLE: LazyLock<Vec<(&'static str, Option<u32>)>> = LazyLock::new(|| {
vec![
// -- Invalid --
// Invalid hex
("f0f", None),
("#fof", None),
// Extra field
("rgb(255, 0, 0, 0.0)", None),
("hsl(120, 0, 0, 0.0)", None),
// Missing field
("rgba(255, 0, 0)", None),
("hsla(120, 0, 0)", None),
// No decimal after zero
("rgba(255, 0, 0, 0)", None),
("hsla(120, 0, 0, 0)", None),
// Decimal after one
("rgba(255, 0, 0, 1.0)", None),
("hsla(120, 0, 0, 1.0)", None),
// HEX (sRGB)
("#f0f", Some(0xFF00FFFF)),
("#ff0000", Some(0xFF0000FF)),
// RGB / RGBA (sRGB)
("rgb(255, 0, 0)", Some(0xFF0000FF)),
("rgba(255, 0, 0, 0.4)", Some(0xFF000066)),
("rgba(255, 0, 0, 1)", Some(0xFF0000FF)),
("rgb(20%, 0%, 0%)", Some(0x330000FF)),
("rgba(20%, 0%, 0%, 1)", Some(0x330000FF)),
("rgb(0%, 20%, 0%)", Some(0x003300FF)),
("rgba(0%, 20%, 0%, 1)", Some(0x003300FF)),
("rgb(0%, 0%, 20%)", Some(0x000033FF)),
("rgba(0%, 0%, 20%, 1)", Some(0x000033FF)),
// HSL / HSLA (sRGB)
("hsl(0, 100%, 50%)", Some(0xFF0000FF)),
("hsl(120, 100%, 50%)", Some(0x00FF00FF)),
("hsla(0, 100%, 50%, 0.0)", Some(0xFF000000)),
("hsla(0, 100%, 50%, 0.4)", Some(0xFF000066)),
("hsla(0, 100%, 50%, 1)", Some(0xFF0000FF)),
("hsla(120, 100%, 50%, 0.0)", Some(0x00FF0000)),
("hsla(120, 100%, 50%, 0.4)", Some(0x00FF0066)),
("hsla(120, 100%, 50%, 1)", Some(0x00FF00FF)),
]
});
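The expected hex values in the table follow from scaling unit-interval channels to bytes; for example the `0.4` alpha cases end in `0x66` because `0.4 × 255` rounds to 102. The exact rounding happens inside gpui's `Rgba`, so the helper below is an assumption about that behavior, not code from the diff:

```rust
// Assumed channel quantization: a 0.0..=1.0 channel is scaled to a byte.
fn channel_byte(value: f32) -> u8 {
    (value * 255.0).round() as u8
}

fn main() {
    assert_eq!(channel_byte(0.4), 102); // 0x66 — the 0.4-alpha rows above
    assert_eq!(channel_byte(0.2), 51);  // 0x33 — the 20% channel rows
    assert_eq!(channel_byte(1.0), 255); // 0xFF
    println!("ok");
}
```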
#[test]
fn can_extract_from_label() {
for (color_str, color_val) in COLOR_TABLE.iter() {
let color = extract_color(&CompletionItem {
kind: Some(CompletionItemKind::COLOR),
label: color_str.to_string(),
detail: None,
documentation: None,
..Default::default()
});
assert_eq!(color, color_val.map(|v| Hsla::from(rgba(v))));
}
}
#[test]
fn only_whole_label_matches_are_allowed() {
for (color_str, _) in COLOR_TABLE.iter() {
let color = extract_color(&CompletionItem {
kind: Some(CompletionItemKind::COLOR),
label: format!("{} foo", color_str).to_string(),
detail: None,
documentation: None,
..Default::default()
});
assert_eq!(color, None);
}
}
#[test]
fn can_extract_from_detail() {
for (color_str, color_val) in COLOR_TABLE.iter() {
let color = extract_color(&CompletionItem {
kind: Some(CompletionItemKind::COLOR),
label: "".to_string(),
detail: Some(color_str.to_string()),
documentation: None,
..Default::default()
});
assert_eq!(color, color_val.map(|v| Hsla::from(rgba(v))));
}
}
#[test]
fn only_whole_detail_matches_are_allowed() {
for (color_str, _) in COLOR_TABLE.iter() {
let color = extract_color(&CompletionItem {
kind: Some(CompletionItemKind::COLOR),
label: "".to_string(),
detail: Some(format!("{} foo", color_str).to_string()),
documentation: None,
..Default::default()
});
assert_eq!(color, None);
}
}
#[test]
fn can_extract_from_documentation_start() {
for (color_str, color_val) in COLOR_TABLE.iter() {
let color = extract_color(&CompletionItem {
kind: Some(CompletionItemKind::COLOR),
label: "".to_string(),
detail: None,
documentation: Some(Documentation::String(
format!("{} foo", color_str).to_string(),
)),
..Default::default()
});
assert_eq!(color, color_val.map(|v| Hsla::from(rgba(v))));
}
}
#[test]
fn can_extract_from_documentation_end() {
for (color_str, color_val) in COLOR_TABLE.iter() {
let color = extract_color(&CompletionItem {
kind: Some(CompletionItemKind::COLOR),
label: "".to_string(),
detail: None,
documentation: Some(Documentation::String(
format!("foo {}", color_str).to_string(),
)),
..Default::default()
});
assert_eq!(color, color_val.map(|v| Hsla::from(rgba(v))));
}
}
#[test]
fn cannot_extract_from_documentation_middle() {
for (color_str, _) in COLOR_TABLE.iter() {
let color = extract_color(&CompletionItem {
kind: Some(CompletionItemKind::COLOR),
label: "".to_string(),
detail: None,
documentation: Some(Documentation::String(
format!("foo {} foo", color_str).to_string(),
)),
..Default::default()
});
assert_eq!(color, None);
}
}
}


@@ -158,7 +158,7 @@ impl LocalLspStore {
async fn format_locally(
lsp_store: WeakModel<LspStore>,
mut buffers_with_paths: Vec<(Model<Buffer>, Option<PathBuf>)>,
mut buffers: Vec<FormattableBuffer>,
push_to_history: bool,
trigger: FormatTrigger,
mut cx: AsyncAppContext,
@@ -167,22 +167,22 @@ impl LocalLspStore {
// same buffer.
lsp_store.update(&mut cx, |this, cx| {
let this = this.as_local_mut().unwrap();
buffers_with_paths.retain(|(buffer, _)| {
buffers.retain(|buffer| {
this.buffers_being_formatted
.insert(buffer.read(cx).remote_id())
.insert(buffer.handle.read(cx).remote_id())
});
})?;
let _cleanup = defer({
let this = lsp_store.clone();
let mut cx = cx.clone();
let buffers = &buffers_with_paths;
let buffers = &buffers;
move || {
this.update(&mut cx, |this, cx| {
let this = this.as_local_mut().unwrap();
for (buffer, _) in buffers {
for buffer in buffers {
this.buffers_being_formatted
.remove(&buffer.read(cx).remote_id());
.remove(&buffer.handle.read(cx).remote_id());
}
})
.ok();
@@ -190,10 +190,10 @@ impl LocalLspStore {
});
let mut project_transaction = ProjectTransaction::default();
for (buffer, buffer_abs_path) in &buffers_with_paths {
for buffer in &buffers {
let (primary_adapter_and_server, adapters_and_servers) =
lsp_store.update(&mut cx, |lsp_store, cx| {
let buffer = buffer.read(cx);
let buffer = buffer.handle.read(cx);
let adapters_and_servers = lsp_store
.language_servers_for_buffer(buffer, cx)
@@ -207,7 +207,7 @@ impl LocalLspStore {
(primary_adapter, adapters_and_servers)
})?;
let settings = buffer.update(&mut cx, |buffer, cx| {
let settings = buffer.handle.update(&mut cx, |buffer, cx| {
language_settings(buffer.language(), buffer.file(), cx).clone()
})?;
@@ -218,13 +218,14 @@ impl LocalLspStore {
let trailing_whitespace_diff = if remove_trailing_whitespace {
Some(
buffer
.handle
.update(&mut cx, |b, cx| b.remove_trailing_whitespace(cx))?
.await,
)
} else {
None
};
let whitespace_transaction_id = buffer.update(&mut cx, |buffer, cx| {
let whitespace_transaction_id = buffer.handle.update(&mut cx, |buffer, cx| {
buffer.finalize_last_transaction();
buffer.start_transaction();
if let Some(diff) = trailing_whitespace_diff {
@@ -246,7 +247,7 @@ impl LocalLspStore {
&lsp_store,
&adapters_and_servers,
code_actions,
buffer,
&buffer.handle,
push_to_history,
&mut project_transaction,
&mut cx,
@@ -261,9 +262,9 @@ impl LocalLspStore {
primary_adapter_and_server.map(|(_adapter, server)| server.clone());
let server_and_buffer = primary_language_server
.as_ref()
.zip(buffer_abs_path.as_ref());
.zip(buffer.abs_path.as_ref());
let prettier_settings = buffer.read_with(&cx, |buffer, cx| {
let prettier_settings = buffer.handle.read_with(&cx, |buffer, cx| {
language_settings(buffer.language(), buffer.file(), cx)
.prettier
.clone()
@@ -288,7 +289,6 @@ impl LocalLspStore {
server_and_buffer,
lsp_store.clone(),
buffer,
buffer_abs_path,
&settings,
&adapters_and_servers,
push_to_history,
@@ -302,7 +302,6 @@ impl LocalLspStore {
server_and_buffer,
lsp_store.clone(),
buffer,
buffer_abs_path,
&settings,
&adapters_and_servers,
push_to_history,
@@ -325,7 +324,6 @@ impl LocalLspStore {
server_and_buffer,
lsp_store.clone(),
buffer,
buffer_abs_path,
&settings,
&adapters_and_servers,
push_to_history,
@@ -351,7 +349,6 @@ impl LocalLspStore {
server_and_buffer,
lsp_store.clone(),
buffer,
buffer_abs_path,
&settings,
&adapters_and_servers,
push_to_history,
@@ -379,7 +376,6 @@ impl LocalLspStore {
server_and_buffer,
lsp_store.clone(),
buffer,
buffer_abs_path,
&settings,
&adapters_and_servers,
push_to_history,
@@ -393,7 +389,6 @@ impl LocalLspStore {
server_and_buffer,
lsp_store.clone(),
buffer,
buffer_abs_path,
&settings,
&adapters_and_servers,
push_to_history,
@@ -418,7 +413,6 @@ impl LocalLspStore {
server_and_buffer,
lsp_store.clone(),
buffer,
buffer_abs_path,
&settings,
&adapters_and_servers,
push_to_history,
@@ -438,7 +432,7 @@ impl LocalLspStore {
}
}
buffer.update(&mut cx, |b, cx| {
buffer.handle.update(&mut cx, |b, cx| {
// If the buffer had its whitespace formatted and was edited while the language-specific
// formatting was being computed, avoid applying the language-specific formatting, because
// it can't be grouped with the whitespace formatting in the undo history.
@@ -467,7 +461,7 @@ impl LocalLspStore {
if let Some(transaction_id) = whitespace_transaction_id {
b.group_until_transaction(transaction_id);
} else if let Some(transaction) = project_transaction.0.get(buffer) {
} else if let Some(transaction) = project_transaction.0.get(&buffer.handle) {
b.group_until_transaction(transaction.id)
}
}
@@ -476,7 +470,9 @@ impl LocalLspStore {
if !push_to_history {
b.forget_transaction(transaction.id);
}
project_transaction.0.insert(buffer.clone(), transaction);
project_transaction
.0
.insert(buffer.handle.clone(), transaction);
}
})?;
}
@@ -489,8 +485,7 @@ impl LocalLspStore {
formatter: &Formatter,
primary_server_and_buffer: Option<(&Arc<LanguageServer>, &PathBuf)>,
lsp_store: WeakModel<LspStore>,
buffer: &Model<Buffer>,
buffer_abs_path: &Option<PathBuf>,
buffer: &FormattableBuffer,
settings: &LanguageSettings,
adapters_and_servers: &[(Arc<CachedLspAdapter>, Arc<LanguageServer>)],
push_to_history: bool,
@@ -514,7 +509,7 @@ impl LocalLspStore {
Some(FormatOperation::Lsp(
LspStore::format_via_lsp(
&lsp_store,
buffer,
&buffer.handle,
buffer_abs_path,
language_server,
settings,
@@ -531,27 +526,20 @@ impl LocalLspStore {
let prettier = lsp_store.update(cx, |lsp_store, _cx| {
lsp_store.prettier_store().unwrap().downgrade()
})?;
prettier_store::format_with_prettier(&prettier, buffer, cx)
prettier_store::format_with_prettier(&prettier, &buffer.handle, cx)
.await
.transpose()
.ok()
.flatten()
}
Formatter::External { command, arguments } => {
let buffer_abs_path = buffer_abs_path.as_ref().map(|path| path.as_path());
Self::format_via_external_command(
buffer,
buffer_abs_path,
command,
arguments.as_deref(),
cx,
)
.await
.context(format!(
"failed to format via external command {:?}",
command
))?
.map(FormatOperation::External)
Self::format_via_external_command(buffer, command, arguments.as_deref(), cx)
.await
.context(format!(
"failed to format via external command {:?}",
command
))?
.map(FormatOperation::External)
}
Formatter::CodeActions(code_actions) => {
let code_actions = deserialize_code_actions(code_actions);
@@ -560,7 +548,7 @@ impl LocalLspStore {
&lsp_store,
adapters_and_servers,
code_actions,
buffer,
&buffer.handle,
push_to_history,
transaction,
cx,
@@ -574,13 +562,12 @@ impl LocalLspStore {
}
async fn format_via_external_command(
buffer: &Model<Buffer>,
buffer_abs_path: Option<&Path>,
buffer: &FormattableBuffer,
command: &str,
arguments: Option<&[String]>,
cx: &mut AsyncAppContext,
) -> Result<Option<Diff>> {
let working_dir_path = buffer.update(cx, |buffer, cx| {
let working_dir_path = buffer.handle.update(cx, |buffer, cx| {
let file = File::from_dyn(buffer.file())?;
let worktree = file.worktree.read(cx);
let mut worktree_path = worktree.abs_path().to_path_buf();
@@ -597,13 +584,17 @@ impl LocalLspStore {
child.creation_flags(windows::Win32::System::Threading::CREATE_NO_WINDOW.0);
}
if let Some(buffer_env) = buffer.env.as_ref() {
child.envs(buffer_env);
}
if let Some(working_dir_path) = working_dir_path {
child.current_dir(working_dir_path);
}
if let Some(arguments) = arguments {
child.args(arguments.iter().map(|arg| {
if let Some(buffer_abs_path) = buffer_abs_path {
if let Some(buffer_abs_path) = buffer.abs_path.as_ref() {
arg.replace("{buffer_path}", &buffer_abs_path.to_string_lossy())
} else {
arg.replace("{buffer_path}", "Untitled")
@@ -621,7 +612,9 @@ impl LocalLspStore {
.stdin
.as_mut()
.ok_or_else(|| anyhow!("failed to acquire stdin"))?;
let text = buffer.update(cx, |buffer, _| buffer.as_rope().clone())?;
let text = buffer
.handle
.update(cx, |buffer, _| buffer.as_rope().clone())?;
for chunk in text.chunks() {
stdin.write_all(chunk.as_bytes()).await?;
}
@@ -640,12 +633,19 @@ impl LocalLspStore {
let stdout = String::from_utf8(output.stdout)?;
Ok(Some(
buffer
.handle
.update(cx, |buffer, cx| buffer.diff(stdout, cx))?
.await,
))
}
}
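For external formatters, the only templated argument is `{buffer_path}`, now read from `buffer.abs_path` and falling back to `"Untitled"` for unsaved buffers. A stripped-down sketch of that substitution (the function name and argument shapes are illustrative, not from the source):

```rust
// Hypothetical sketch of the argument templating used for external
// formatter commands: "{buffer_path}" becomes the buffer's absolute path,
// or "Untitled" when the buffer has never been saved.
fn substitute_args(arguments: &[String], buffer_abs_path: Option<&str>) -> Vec<String> {
    arguments
        .iter()
        .map(|arg| match buffer_abs_path {
            Some(path) => arg.replace("{buffer_path}", path),
            None => arg.replace("{buffer_path}", "Untitled"),
        })
        .collect()
}

fn main() {
    let args = vec!["--stdin-filepath".to_string(), "{buffer_path}".to_string()];
    assert_eq!(
        substitute_args(&args, Some("/tmp/main.rs")),
        vec!["--stdin-filepath", "/tmp/main.rs"]
    );
    assert_eq!(substitute_args(&args, None)[1], "Untitled");
    println!("ok");
}
```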
pub struct FormattableBuffer {
handle: Model<Buffer>,
abs_path: Option<PathBuf>,
env: Option<HashMap<String, String>>,
}
pub struct RemoteLspStore {
upstream_client: AnyProtoClient,
upstream_project_id: u64,
@@ -5028,6 +5028,28 @@ impl LspStore {
.and_then(|local| local.last_formatting_failure.as_deref())
}
pub fn environment_for_buffer(
&self,
buffer: &Model<Buffer>,
cx: &mut ModelContext<Self>,
) -> Shared<Task<Option<HashMap<String, String>>>> {
let worktree_id = buffer.read(cx).file().map(|file| file.worktree_id(cx));
let worktree_abs_path = worktree_id.and_then(|worktree_id| {
self.worktree_store
.read(cx)
.worktree_for_id(worktree_id, cx)
.map(|entry| entry.read(cx).abs_path().clone())
});
if let Some(environment) = &self.as_local().map(|local| local.environment.clone()) {
environment.update(cx, |env, cx| {
env.get_environment(worktree_id, worktree_abs_path, cx)
})
} else {
Task::ready(None).shared()
}
}
pub fn format(
&mut self,
buffers: HashSet<Model<Buffer>>,
@@ -5042,14 +5064,31 @@ impl LspStore {
let buffer = buffer_handle.read(cx);
let buffer_abs_path = File::from_dyn(buffer.file())
.and_then(|file| file.as_local().map(|f| f.abs_path(cx)));
(buffer_handle, buffer_abs_path)
})
.collect::<Vec<_>>();
cx.spawn(move |lsp_store, mut cx| async move {
let mut formattable_buffers = Vec::with_capacity(buffers_with_paths.len());
for (handle, abs_path) in buffers_with_paths {
let env = lsp_store
.update(&mut cx, |lsp_store, cx| {
lsp_store.environment_for_buffer(&handle, cx)
})?
.await;
formattable_buffers.push(FormattableBuffer {
handle,
abs_path,
env,
});
}
let result = LocalLspStore::format_locally(
lsp_store.clone(),
buffers_with_paths,
formattable_buffers,
push_to_history,
trigger,
cx.clone(),


@@ -1,4 +1,5 @@
pub mod buffer_store;
mod color_extractor;
pub mod connection_manager;
pub mod debounced_delay;
pub mod lsp_command;
@@ -36,7 +37,7 @@ use futures::{
use git::{blame::Blame, repository::GitRepository};
use gpui::{
AnyModel, AppContext, AsyncAppContext, BorrowAppContext, Context, EventEmitter, Model,
AnyModel, AppContext, AsyncAppContext, BorrowAppContext, Context, EventEmitter, Hsla, Model,
ModelContext, SharedString, Task, WeakModel, WindowContext,
};
use itertools::Itertools;
@@ -47,7 +48,9 @@ use language::{
Documentation, File as _, Language, LanguageRegistry, LanguageServerName, PointUtf16, ToOffset,
ToPointUtf16, Transaction, Unclipped,
};
use lsp::{CompletionContext, DocumentHighlightKind, LanguageServer, LanguageServerId};
use lsp::{
CompletionContext, CompletionItemKind, DocumentHighlightKind, LanguageServer, LanguageServerId,
};
use lsp_command::*;
use node_runtime::NodeRuntime;
use parking_lot::{Mutex, RwLock};
@@ -138,7 +141,7 @@ pub struct Project {
join_project_response_message_id: u32,
user_store: Model<UserStore>,
fs: Arc<dyn Fs>,
ssh_client: Option<Arc<SshRemoteClient>>,
ssh_client: Option<Model<SshRemoteClient>>,
client_state: ProjectClientState,
collaborators: HashMap<proto::PeerId, Collaborator>,
client_subscriptions: Vec<client::Subscription>,
@@ -664,7 +667,7 @@ impl Project {
}
pub fn ssh(
ssh: Arc<SshRemoteClient>,
ssh: Model<SshRemoteClient>,
client: Arc<Client>,
node: NodeRuntime,
user_store: Model<UserStore>,
@@ -681,15 +684,16 @@ impl Project {
let snippets =
SnippetProvider::new(fs.clone(), BTreeSet::from_iter([global_snippets_dir]), cx);
let ssh_proto = ssh.read(cx).to_proto_client();
let worktree_store =
cx.new_model(|_| WorktreeStore::remote(false, ssh.to_proto_client(), 0, None));
cx.new_model(|_| WorktreeStore::remote(false, ssh_proto.clone(), 0, None));
cx.subscribe(&worktree_store, Self::on_worktree_store_event)
.detach();
let buffer_store = cx.new_model(|cx| {
BufferStore::remote(
worktree_store.clone(),
ssh.to_proto_client(),
ssh.read(cx).to_proto_client(),
SSH_PROJECT_ID,
cx,
)
@@ -698,7 +702,7 @@ impl Project {
.detach();
let settings_observer = cx.new_model(|cx| {
SettingsObserver::new_ssh(ssh.to_proto_client(), worktree_store.clone(), cx)
SettingsObserver::new_ssh(ssh_proto.clone(), worktree_store.clone(), cx)
});
cx.subscribe(&settings_observer, Self::on_settings_observer_event)
.detach();
@@ -709,13 +713,24 @@ impl Project {
buffer_store.clone(),
worktree_store.clone(),
languages.clone(),
ssh.to_proto_client(),
ssh_proto.clone(),
SSH_PROJECT_ID,
cx,
)
});
cx.subscribe(&lsp_store, Self::on_lsp_store_event).detach();
cx.on_release(|this, cx| {
if let Some(ssh_client) = this.ssh_client.as_ref() {
ssh_client
.read(cx)
.to_proto_client()
.send(proto::ShutdownRemoteServer {})
.log_err();
}
})
.detach();
let this = Self {
buffer_ordered_messages_tx: tx,
collaborators: Default::default(),
@@ -751,20 +766,20 @@ impl Project {
search_excluded_history: Self::new_search_history(),
};
let client: AnyProtoClient = ssh.to_proto_client();
let ssh = ssh.read(cx);
ssh.subscribe_to_entity(SSH_PROJECT_ID, &cx.handle());
ssh.subscribe_to_entity(SSH_PROJECT_ID, &this.buffer_store);
ssh.subscribe_to_entity(SSH_PROJECT_ID, &this.worktree_store);
ssh.subscribe_to_entity(SSH_PROJECT_ID, &this.lsp_store);
ssh.subscribe_to_entity(SSH_PROJECT_ID, &this.settings_observer);
client.add_model_message_handler(Self::handle_create_buffer_for_peer);
client.add_model_message_handler(Self::handle_update_worktree);
client.add_model_message_handler(Self::handle_update_project);
client.add_model_request_handler(BufferStore::handle_update_buffer);
BufferStore::init(&client);
LspStore::init(&client);
SettingsObserver::init(&client);
ssh_proto.add_model_message_handler(Self::handle_create_buffer_for_peer);
ssh_proto.add_model_message_handler(Self::handle_update_worktree);
ssh_proto.add_model_message_handler(Self::handle_update_project);
ssh_proto.add_model_request_handler(BufferStore::handle_update_buffer);
BufferStore::init(&ssh_proto);
LspStore::init(&ssh_proto);
SettingsObserver::init(&ssh_proto);
this
})
@@ -1217,7 +1232,10 @@ impl Project {
server.ssh_connection_string.is_some()
}
pub fn ssh_connection_string(&self, cx: &ModelContext<Self>) -> Option<SharedString> {
pub fn ssh_connection_string(&self, cx: &AppContext) -> Option<SharedString> {
if let Some(ssh_state) = &self.ssh_client {
return Some(ssh_state.read(cx).connection_string().into());
}
let dev_server_id = self.dev_server_project_id()?;
dev_server_projects::Store::global(cx)
.read(cx)
@@ -1226,6 +1244,10 @@ impl Project {
.clone()
}
pub fn ssh_is_connected(&self, cx: &AppContext) -> Option<bool> {
Some(!self.ssh_client.as_ref()?.read(cx).is_reconnect_underway())
}
pub fn replica_id(&self) -> ReplicaId {
match self.client_state {
ProjectClientState::Remote { replica_id, .. } => replica_id,
@@ -1935,6 +1957,7 @@ impl Project {
BufferStoreEvent::BufferDropped(buffer_id) => {
if let Some(ref ssh_client) = self.ssh_client {
ssh_client
.read(cx)
.to_proto_client()
.send(proto::CloseBuffer {
project_id: 0,
@@ -2141,7 +2164,8 @@ impl Project {
let operation = language::proto::serialize_operation(operation);
if let Some(ssh) = &self.ssh_client {
ssh.to_proto_client()
ssh.read(cx)
.to_proto_client()
.send(proto::UpdateBuffer {
project_id: 0,
buffer_id: buffer_id.to_proto(),
@@ -2828,7 +2852,7 @@ impl Project {
let (tx, rx) = smol::channel::unbounded();
let (client, remote_id): (AnyProtoClient, _) = if let Some(ssh_client) = &self.ssh_client {
(ssh_client.to_proto_client(), 0)
(ssh_client.read(cx).to_proto_client(), 0)
} else if let Some(remote_id) = self.remote_id() {
(self.client.clone().into(), remote_id)
} else {
@@ -2963,12 +2987,14 @@ impl Project {
exists.then(|| ResolvedPath::AbsPath(expanded))
})
} else if let Some(ssh_client) = self.ssh_client.as_ref() {
let request = ssh_client
.to_proto_client()
.request(proto::CheckFileExists {
project_id: SSH_PROJECT_ID,
path: path.to_string(),
});
let request =
ssh_client
.read(cx)
.to_proto_client()
.request(proto::CheckFileExists {
project_id: SSH_PROJECT_ID,
path: path.to_string(),
});
cx.background_executor().spawn(async move {
let response = request.await.log_err()?;
if response.exists {
@@ -3044,7 +3070,7 @@ impl Project {
path: query,
};
let response = session.to_proto_client().request(request);
let response = session.read(cx).to_proto_client().request(request);
cx.background_executor().spawn(async move {
let response = response.await?;
Ok(response.entries.into_iter().map(PathBuf::from).collect())
@@ -3472,7 +3498,7 @@ impl Project {
let mut payload = envelope.payload.clone();
payload.project_id = 0;
cx.background_executor()
.spawn(ssh.to_proto_client().request(payload))
.spawn(ssh.read(cx).to_proto_client().request(payload))
.detach_and_log_err(cx);
}
this.buffer_store.clone()
@@ -4438,6 +4464,16 @@ impl Completion {
pub fn is_snippet(&self) -> bool {
self.lsp_completion.insert_text_format == Some(lsp::InsertTextFormat::SNIPPET)
}
/// Returns the corresponding color for this completion.
///
/// Will return `None` if this completion's kind is not [`CompletionItemKind::COLOR`].
pub fn color(&self) -> Option<Hsla> {
match self.lsp_completion.kind {
Some(CompletionItemKind::COLOR) => color_extractor::extract_color(&self.lsp_completion),
_ => None,
}
}
}
#[derive(Debug)]


@@ -1,3 +1,4 @@
use anyhow::Context;
use collections::HashMap;
use fs::Fs;
use gpui::{AppContext, AsyncAppContext, BorrowAppContext, EventEmitter, Model, ModelContext};
@@ -6,7 +7,7 @@ use paths::local_settings_file_relative_path;
use rpc::{proto, AnyProtoClient, TypedEnvelope};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{InvalidSettingsError, Settings, SettingsSources, SettingsStore};
use settings::{InvalidSettingsError, LocalSettingsKind, Settings, SettingsSources, SettingsStore};
use std::{
path::{Path, PathBuf},
sync::Arc,
@@ -266,13 +267,14 @@ impl SettingsObserver {
let store = cx.global::<SettingsStore>();
for worktree in self.worktree_store.read(cx).worktrees() {
let worktree_id = worktree.read(cx).id().to_proto();
for (path, content) in store.local_settings(worktree.read(cx).id()) {
for (path, kind, content) in store.local_settings(worktree.read(cx).id()) {
downstream_client
.send(proto::UpdateWorktreeSettings {
project_id,
worktree_id,
path: path.to_string_lossy().into(),
content: Some(content),
kind: Some(local_settings_kind_to_proto(kind).into()),
})
.log_err();
}
@@ -288,6 +290,11 @@ impl SettingsObserver {
envelope: TypedEnvelope<proto::UpdateWorktreeSettings>,
mut cx: AsyncAppContext,
) -> anyhow::Result<()> {
let kind = match envelope.payload.kind {
Some(kind) => proto::LocalSettingsKind::from_i32(kind)
.with_context(|| format!("unknown kind {kind}"))?,
None => proto::LocalSettingsKind::Settings,
};
this.update(&mut cx, |this, cx| {
let worktree_id = WorktreeId::from_proto(envelope.payload.worktree_id);
let Some(worktree) = this
@@ -297,10 +304,12 @@ impl SettingsObserver {
else {
return;
};
this.update_settings(
worktree,
[(
PathBuf::from(&envelope.payload.path).into(),
local_settings_kind_from_proto(kind),
envelope.payload.content,
)],
cx,
@@ -312,7 +321,7 @@ impl SettingsObserver {
pub async fn handle_update_user_settings(
_: Model<Self>,
envelope: TypedEnvelope<proto::UpdateUserSettings>,
mut cx: AsyncAppContext,
cx: AsyncAppContext,
) -> anyhow::Result<()> {
cx.update_global(move |settings_store: &mut SettingsStore, cx| {
settings_store.set_user_settings(&envelope.payload.content, cx)
@@ -327,6 +336,7 @@ impl SettingsObserver {
ssh.send(proto::UpdateUserSettings {
project_id: 0,
content,
kind: Some(proto::LocalSettingsKind::Settings.into()),
})
.log_err();
}
@@ -342,6 +352,7 @@ impl SettingsObserver {
ssh.send(proto::UpdateUserSettings {
project_id: 0,
content,
kind: Some(proto::LocalSettingsKind::Settings.into()),
})
.log_err();
}
@@ -397,6 +408,7 @@ impl SettingsObserver {
settings_contents.push(async move {
(
settings_dir,
LocalSettingsKind::Settings,
if removed {
None
} else {
@@ -413,15 +425,15 @@ impl SettingsObserver {
let worktree = worktree.clone();
cx.spawn(move |this, cx| async move {
let settings_contents: Vec<(Arc<Path>, _)> =
let settings_contents: Vec<(Arc<Path>, _, _)> =
futures::future::join_all(settings_contents).await;
cx.update(|cx| {
this.update(cx, |this, cx| {
this.update_settings(
worktree,
settings_contents
.into_iter()
.map(|(path, content)| (path, content.and_then(|c| c.log_err()))),
settings_contents.into_iter().map(|(path, kind, content)| {
(path, kind, content.and_then(|c| c.log_err()))
}),
cx,
)
})
@@ -433,17 +445,18 @@ impl SettingsObserver {
fn update_settings(
&mut self,
worktree: Model<Worktree>,
settings_contents: impl IntoIterator<Item = (Arc<Path>, Option<String>)>,
settings_contents: impl IntoIterator<Item = (Arc<Path>, LocalSettingsKind, Option<String>)>,
cx: &mut ModelContext<Self>,
) {
let worktree_id = worktree.read(cx).id();
let remote_worktree_id = worktree.read(cx).id();
let result = cx.update_global::<SettingsStore, anyhow::Result<()>>(|store, cx| {
for (directory, file_content) in settings_contents {
for (directory, kind, file_content) in settings_contents {
store.set_local_settings(
worktree_id,
directory.clone(),
kind,
file_content.as_deref(),
cx,
)?;
@@ -455,6 +468,7 @@ impl SettingsObserver {
worktree_id: remote_worktree_id.to_proto(),
path: directory.to_string_lossy().into_owned(),
content: file_content,
kind: Some(local_settings_kind_to_proto(kind).into()),
})
.log_err();
}
@@ -481,3 +495,19 @@ impl SettingsObserver {
}
}
}
pub fn local_settings_kind_from_proto(kind: proto::LocalSettingsKind) -> LocalSettingsKind {
match kind {
proto::LocalSettingsKind::Settings => LocalSettingsKind::Settings,
proto::LocalSettingsKind::Tasks => LocalSettingsKind::Tasks,
proto::LocalSettingsKind::Editorconfig => LocalSettingsKind::Editorconfig,
}
}
pub fn local_settings_kind_to_proto(kind: LocalSettingsKind) -> proto::LocalSettingsKind {
match kind {
LocalSettingsKind::Settings => proto::LocalSettingsKind::Settings,
LocalSettingsKind::Tasks => proto::LocalSettingsKind::Tasks,
LocalSettingsKind::Editorconfig => proto::LocalSettingsKind::Editorconfig,
}
}
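The new `kind` field is decoded defensively in `handle_update_worktree_settings`: a missing kind defaults to `Settings`, while an out-of-range integer becomes an error rather than a panic. A self-contained sketch of that decode path, where `kind_from_i32` stands in for the generated `proto::LocalSettingsKind::from_i32`:

```rust
#[derive(Debug, PartialEq)]
enum LocalSettingsKind { Settings, Tasks, Editorconfig }

// Stand-in for the prost-generated from_i32 on the proto enum.
fn kind_from_i32(value: i32) -> Option<LocalSettingsKind> {
    match value {
        0 => Some(LocalSettingsKind::Settings),
        1 => Some(LocalSettingsKind::Tasks),
        2 => Some(LocalSettingsKind::Editorconfig),
        _ => None,
    }
}

// Mirrors the guard in handle_update_worktree_settings:
// None => default to Settings; unknown integer => error.
fn decode_kind(payload_kind: Option<i32>) -> Result<LocalSettingsKind, String> {
    match payload_kind {
        Some(kind) => kind_from_i32(kind).ok_or_else(|| format!("unknown kind {kind}")),
        None => Ok(LocalSettingsKind::Settings),
    }
}

fn main() {
    assert_eq!(decode_kind(None), Ok(LocalSettingsKind::Settings));
    assert_eq!(decode_kind(Some(1)), Ok(LocalSettingsKind::Tasks));
    assert!(decode_kind(Some(9)).is_err());
    println!("ok");
}
```

Defaulting on `None` keeps older peers (which never send `kind`) working, while erroring on unknown values surfaces protocol skew instead of silently misfiling settings.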


@@ -70,7 +70,7 @@ impl Project {
if let Some(args) = self
.ssh_client
.as_ref()
.and_then(|session| session.ssh_args())
.and_then(|session| session.read(cx).ssh_args())
{
return Some(SshCommand::Direct(args));
}


@@ -282,7 +282,9 @@ message Envelope {
UpdateUserSettings update_user_settings = 246;
CheckFileExists check_file_exists = 255;
CheckFileExistsResponse check_file_exists_response = 256; // current max
CheckFileExistsResponse check_file_exists_response = 256;
ShutdownRemoteServer shutdown_remote_server = 257; // current max
}
reserved 87 to 88;
@@ -642,6 +644,13 @@ message UpdateWorktreeSettings {
uint64 worktree_id = 2;
string path = 3;
optional string content = 4;
optional LocalSettingsKind kind = 5;
}
enum LocalSettingsKind {
Settings = 0;
Tasks = 1;
Editorconfig = 2;
}
message CreateProjectEntry {
@@ -2487,6 +2496,12 @@ message AddWorktreeResponse {
message UpdateUserSettings {
uint64 project_id = 1;
string content = 2;
optional Kind kind = 3;
enum Kind {
Settings = 0;
Tasks = 1;
}
}
message CheckFileExists {
@@ -2498,3 +2513,5 @@ message CheckFileExistsResponse {
bool exists = 1;
string path = 2;
}
message ShutdownRemoteServer {}


@@ -364,7 +364,8 @@ messages!(
(CloseBuffer, Foreground),
(UpdateUserSettings, Foreground),
(CheckFileExists, Background),
(CheckFileExistsResponse, Background)
(CheckFileExistsResponse, Background),
(ShutdownRemoteServer, Foreground),
);
request_messages!(
@@ -487,7 +488,8 @@ request_messages!(
(SynchronizeContexts, SynchronizeContextsResponse),
(LspExtSwitchSourceHeader, LspExtSwitchSourceHeaderResponse),
(AddWorktree, AddWorktreeResponse),
(CheckFileExists, CheckFileExistsResponse)
(CheckFileExists, CheckFileExistsResponse),
(ShutdownRemoteServer, Ack)
);
entity_messages!(


@@ -22,7 +22,6 @@ futures.workspace = true
fuzzy.workspace = true
gpui.workspace = true
log.workspace = true
markdown.workspace = true
menu.workspace = true
ordered-float.workspace = true
picker.workspace = true

File diff suppressed because it is too large


@@ -4,21 +4,21 @@ use anyhow::Result;
use auto_update::AutoUpdater;
use editor::Editor;
use futures::channel::oneshot;
use gpui::AppContext;
use gpui::{
percentage, px, Animation, AnimationExt, AnyWindowHandle, AsyncAppContext, DismissEvent,
EventEmitter, FocusableView, ParentElement as _, Render, SemanticVersion, SharedString, Task,
Transformation, View,
percentage, px, Action, Animation, AnimationExt, AnyWindowHandle, AsyncAppContext,
DismissEvent, EventEmitter, FocusableView, ParentElement as _, Render, SemanticVersion,
SharedString, Task, Transformation, View,
};
use gpui::{AppContext, Model};
use release_channel::{AppVersion, ReleaseChannel};
use remote::{SshConnectionOptions, SshPlatform, SshRemoteClient};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
use ui::{
h_flex, v_flex, Color, FluentBuilder as _, Icon, IconName, IconSize, InteractiveElement,
IntoElement, Label, LabelCommon, Styled, StyledExt as _, ViewContext, VisualContext,
WindowContext,
div, h_flex, v_flex, ActiveTheme, ButtonCommon, Clickable, Color, FluentBuilder as _, Icon,
IconButton, IconName, IconSize, InteractiveElement, IntoElement, Label, LabelCommon, Styled,
StyledExt as _, Tooltip, ViewContext, VisualContext, WindowContext,
};
use workspace::{AppState, ModalView, Workspace};
@@ -28,10 +28,6 @@ pub struct SshSettings {
}
impl SshSettings {
pub fn use_direct_ssh(&self) -> bool {
self.ssh_connections.is_some()
}
pub fn ssh_connections(&self) -> impl Iterator<Item = SshConnection> {
self.ssh_connections.clone().into_iter().flatten()
}
@@ -140,47 +136,57 @@ impl SshPrompt {
}
impl Render for SshPrompt {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
fn render(&mut self, _: &mut ViewContext<Self>) -> impl IntoElement {
v_flex()
.w_full()
.key_context("PasswordPrompt")
.p_4()
.size_full()
.justify_start()
.child(
h_flex()
.gap_2()
.child(if self.error_message.is_some() {
Icon::new(IconName::XCircle)
.size(IconSize::Medium)
.color(Color::Error)
.into_any_element()
} else {
Icon::new(IconName::ArrowCircle)
.size(IconSize::Medium)
.with_animation(
"arrow-circle",
Animation::new(Duration::from_secs(2)).repeat(),
|icon, delta| {
icon.transform(Transformation::rotate(percentage(delta)))
},
)
.into_any_element()
})
v_flex()
.p_4()
.size_full()
.child(
Label::new(format!("ssh {}", self.connection_string))
.size(ui::LabelSize::Large),
),
h_flex()
.gap_2()
.justify_between()
.child(h_flex().w_full())
.child(if self.error_message.is_some() {
Icon::new(IconName::XCircle)
.size(IconSize::Medium)
.color(Color::Error)
.into_any_element()
} else {
Icon::new(IconName::ArrowCircle)
.size(IconSize::Medium)
.with_animation(
"arrow-circle",
Animation::new(Duration::from_secs(2)).repeat(),
|icon, delta| {
icon.transform(Transformation::rotate(percentage(
delta,
)))
},
)
.into_any_element()
})
.child(Label::new(format!(
"Connecting to {}",
self.connection_string
)))
.child(h_flex().w_full()),
)
.when_some(self.error_message.as_ref(), |el, error| {
el.child(Label::new(error.clone()))
})
.when(
self.error_message.is_none() && self.status_message.is_some(),
|el| el.child(Label::new(self.status_message.clone().unwrap())),
)
.when_some(self.prompt.as_ref(), |el, prompt| {
el.child(Label::new(prompt.0.clone()))
.child(self.editor.clone())
}),
)
.when_some(self.error_message.as_ref(), |el, error| {
el.child(Label::new(error.clone()))
})
.when(
self.error_message.is_none() && self.status_message.is_some(),
|el| el.child(Label::new(self.status_message.clone().unwrap())),
)
.when_some(self.prompt.as_ref(), |el, prompt| {
el.child(Label::new(prompt.0.clone()))
.child(self.editor.clone())
})
}
}
@@ -202,14 +208,41 @@ impl SshConnectionModal {
impl Render for SshConnectionModal {
fn render(&mut self, cx: &mut ui::ViewContext<Self>) -> impl ui::IntoElement {
let connection_string = self.prompt.read(cx).connection_string.clone();
let theme = cx.theme();
let header_color = theme.colors().element_background;
let body_color = theme.colors().background;
v_flex()
.elevation_3(cx)
.p_4()
.gap_2()
.on_action(cx.listener(Self::dismiss))
.on_action(cx.listener(Self::confirm))
.w(px(400.))
.child(self.prompt.clone())
.child(
h_flex()
.p_1()
.border_b_1()
.border_color(theme.colors().border)
.bg(header_color)
.justify_between()
.child(
IconButton::new("ssh-connection-cancel", IconName::ArrowLeft)
.icon_size(IconSize::XSmall)
.on_click(|_, cx| cx.dispatch_action(menu::Cancel.boxed_clone()))
.tooltip(|cx| Tooltip::for_action("Back", &menu::Cancel, cx)),
)
.child(
h_flex()
.gap_2()
.child(Icon::new(IconName::Server).size(IconSize::XSmall))
.child(
Label::new(connection_string)
.size(ui::LabelSize::Small)
.single_line(),
),
)
.child(div()),
)
.child(h_flex().bg(body_color).w_full().child(self.prompt.clone()))
}
}
@@ -373,25 +406,24 @@ impl SshClientDelegate {
}
pub fn connect_over_ssh(
unique_identifier: String,
connection_options: SshConnectionOptions,
ui: View<SshPrompt>,
cx: &mut WindowContext,
) -> Task<Result<Arc<SshRemoteClient>>> {
) -> Task<Result<Model<SshRemoteClient>>> {
let window = cx.window_handle();
let known_password = connection_options.password.clone();
cx.spawn(|mut cx| async move {
remote::SshRemoteClient::new(
connection_options,
Arc::new(SshClientDelegate {
window,
ui,
known_password,
}),
&mut cx,
)
.await
})
remote::SshRemoteClient::new(
unique_identifier,
connection_options,
Arc::new(SshClientDelegate {
window,
ui,
known_password,
}),
cx,
)
}
pub async fn open_ssh_project(
@@ -420,22 +452,25 @@ pub async fn open_ssh_project(
})?
};
let session = window
.update(cx, |workspace, cx| {
cx.activate_window();
workspace.toggle_modal(cx, |cx| SshConnectionModal::new(&connection_options, cx));
let ui = workspace
.active_modal::<SshConnectionModal>(cx)
.unwrap()
.read(cx)
.prompt
.clone();
connect_over_ssh(connection_options.clone(), ui, cx)
})?
.await?;
let delegate = window.update(cx, |workspace, cx| {
cx.activate_window();
workspace.toggle_modal(cx, |cx| SshConnectionModal::new(&connection_options, cx));
let ui = workspace
.active_modal::<SshConnectionModal>(cx)
.unwrap()
.read(cx)
.prompt
.clone();
Arc::new(SshClientDelegate {
window: cx.window_handle(),
ui,
known_password: connection_options.password.clone(),
})
})?;
cx.update(|cx| {
workspace::open_ssh_project(window, connection_options, session, app_state, paths, cx)
workspace::open_ssh_project(window, connection_options, delegate, app_state, paths, cx)
})?
.await
}


@@ -49,3 +49,17 @@ pub async fn write_message<S: AsyncWrite + Unpin>(
stream.write_all(buffer).await?;
Ok(())
}
pub async fn read_message_raw<S: AsyncRead + Unpin>(
stream: &mut S,
buffer: &mut Vec<u8>,
) -> Result<()> {
buffer.resize(MESSAGE_LEN_SIZE, 0);
stream.read_exact(buffer).await?;
let message_len = message_len_from_buffer(buffer);
buffer.resize(message_len as usize, 0);
stream.read_exact(buffer).await?;
Ok(())
}
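The new `read_message_raw` helper reads a fixed-size length prefix and then reuses the same buffer for the payload. The pattern can be sketched synchronously with only `std::io` (the 4-byte big-endian prefix here is an assumption; the real `MESSAGE_LEN_SIZE` and `message_len_from_buffer` live elsewhere in the crate):

```rust
use std::io::{Cursor, Read};

// Assumed framing: a 4-byte big-endian length prefix, then the payload.
const MESSAGE_LEN_SIZE: usize = 4;

fn message_len_from_buffer(buffer: &[u8]) -> u32 {
    u32::from_be_bytes([buffer[0], buffer[1], buffer[2], buffer[3]])
}

fn read_frame<R: Read>(stream: &mut R, buffer: &mut Vec<u8>) -> std::io::Result<()> {
    // Read exactly the length prefix into the caller's buffer...
    buffer.resize(MESSAGE_LEN_SIZE, 0);
    stream.read_exact(buffer)?;
    let message_len = message_len_from_buffer(buffer);
    // ...then resize the same buffer and read exactly the payload.
    buffer.resize(message_len as usize, 0);
    stream.read_exact(buffer)?;
    Ok(())
}

fn main() {
    // One frame on the wire: a 5-byte payload "hello" preceded by its length.
    let mut wire = Cursor::new(vec![0, 0, 0, 5, b'h', b'e', b'l', b'l', b'o']);
    let mut buffer = Vec::new();
    read_frame(&mut wire, &mut buffer).unwrap();
    assert_eq!(buffer, b"hello");
}
```

Reusing one `Vec` for both prefix and payload avoids a per-message allocation, which is why the async version threads `buffer` through both reads.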


@@ -15,7 +15,9 @@ use futures::{
select_biased, AsyncReadExt as _, AsyncWriteExt as _, Future, FutureExt as _, SinkExt,
StreamExt as _,
};
use gpui::{AppContext, AsyncAppContext, Model, SemanticVersion, Task};
use gpui::{
AppContext, AsyncAppContext, Context, Model, ModelContext, SemanticVersion, Task, WeakModel,
};
use parking_lot::Mutex;
use rpc::{
proto::{self, build_typed_envelope, Envelope, EnvelopedMessage, PeerId, RequestMessage},
@@ -24,18 +26,21 @@ use rpc::{
use smol::{
fs,
process::{self, Child, Stdio},
Timer,
};
use std::{
any::TypeId,
ffi::OsStr,
mem,
path::{Path, PathBuf},
sync::{
atomic::{AtomicU32, Ordering::SeqCst},
Arc, Weak,
Arc,
},
time::Instant,
time::{Duration, Instant},
};
use tempfile::TempDir;
use util::maybe;
#[derive(
Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy, serde::Serialize, serde::Deserialize,
@@ -48,7 +53,7 @@ pub struct SshSocket {
socket_path: PathBuf,
}
#[derive(Debug, Clone, PartialEq, Eq)]
#[derive(Debug, Default, Clone, PartialEq, Eq)]
pub struct SshConnectionOptions {
pub host: String,
pub username: Option<String>,
@@ -91,6 +96,17 @@ impl SshConnectionOptions {
host
}
}
// Uniquely identifies dev server projects on a remote host. Needs to be
// stable for the same dev server project.
pub fn dev_server_identifier(&self) -> String {
let mut identifier = format!("dev-server-{:?}", self.host);
if let Some(username) = self.username.as_ref() {
identifier.push('-');
identifier.push_str(&username);
}
identifier
}
}
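The new `dev_server_identifier` can be exercised as a free function. One subtlety worth noting: the diff formats the host with `{:?}` (Debug), so a string host ends up quoted inside the identifier. A standalone sketch (signature adapted to plain `&str` arguments for illustration):

```rust
// Sketch of `SshConnectionOptions::dev_server_identifier` as a free function.
// Mirrors the diff, including the `{:?}` formatting of the host, which
// embeds quotes in the result for string hosts.
fn dev_server_identifier(host: &str, username: Option<&str>) -> String {
    let mut identifier = format!("dev-server-{:?}", host);
    if let Some(username) = username {
        identifier.push('-');
        identifier.push_str(username);
    }
    identifier
}

fn main() {
    assert_eq!(
        dev_server_identifier("example.com", Some("alice")),
        "dev-server-\"example.com\"-alice"
    );
    assert_eq!(dev_server_identifier("example.com", None), "dev-server-\"example.com\"");
}
```

The identifier only needs to be stable per host/user pair, so the embedded quotes are harmless, but `{}` would produce a cleaner directory name.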
#[derive(Copy, Clone, Debug)]
@@ -155,28 +171,6 @@ async fn run_cmd(command: &mut process::Command) -> Result<String> {
))
}
}
#[cfg(unix)]
async fn read_with_timeout(
stdout: &mut process::ChildStdout,
timeout: std::time::Duration,
output: &mut Vec<u8>,
) -> Result<(), std::io::Error> {
smol::future::or(
async {
stdout.read_to_end(output).await?;
Ok::<_, std::io::Error>(())
},
async {
smol::Timer::after(timeout).await;
Err(std::io::Error::new(
std::io::ErrorKind::TimedOut,
"Read operation timed out",
))
},
)
.await
}
struct ChannelForwarder {
quit_tx: UnboundedSender<()>,
@@ -187,7 +181,7 @@ impl ChannelForwarder {
fn new(
mut incoming_tx: UnboundedSender<Envelope>,
mut outgoing_rx: UnboundedReceiver<Envelope>,
cx: &mut AsyncAppContext,
cx: &AsyncAppContext,
) -> (Self, UnboundedSender<Envelope>, UnboundedReceiver<Envelope>) {
let (quit_tx, mut quit_rx) = mpsc::unbounded::<()>();
@@ -245,71 +239,119 @@ struct SshRemoteClientState {
delegate: Arc<dyn SshClientDelegate>,
forwarder: ChannelForwarder,
multiplex_task: Task<Result<()>>,
heartbeat_task: Task<Result<()>>,
}
pub struct SshRemoteClient {
client: Arc<ChannelClient>,
inner_state: Mutex<Option<SshRemoteClientState>>,
unique_identifier: String,
connection_options: SshConnectionOptions,
inner_state: Arc<Mutex<Option<SshRemoteClientState>>>,
}
impl Drop for SshRemoteClient {
fn drop(&mut self) {
self.shutdown_processes();
}
}
impl SshRemoteClient {
pub async fn new(
pub fn new(
unique_identifier: String,
connection_options: SshConnectionOptions,
delegate: Arc<dyn SshClientDelegate>,
cx: &mut AsyncAppContext,
) -> Result<Arc<Self>> {
let (outgoing_tx, outgoing_rx) = mpsc::unbounded::<Envelope>();
let (incoming_tx, incoming_rx) = mpsc::unbounded::<Envelope>();
cx: &AppContext,
) -> Task<Result<Model<Self>>> {
cx.spawn(|mut cx| async move {
let (outgoing_tx, outgoing_rx) = mpsc::unbounded::<Envelope>();
let (incoming_tx, incoming_rx) = mpsc::unbounded::<Envelope>();
let client = cx.update(|cx| ChannelClient::new(incoming_rx, outgoing_tx, cx))?;
let this = Arc::new(Self {
client,
inner_state: Mutex::new(None),
});
let this = cx.new_model(|cx| {
cx.on_app_quit(|this: &mut Self, _| {
this.shutdown_processes();
futures::future::ready(())
})
.detach();
let inner_state = {
let (proxy, proxy_incoming_tx, proxy_outgoing_rx) =
ChannelForwarder::new(incoming_tx, outgoing_rx, cx);
let client = ChannelClient::new(incoming_rx, outgoing_tx, cx);
Self {
client,
unique_identifier: unique_identifier.clone(),
connection_options: SshConnectionOptions::default(),
inner_state: Arc::new(Mutex::new(None)),
}
})?;
let (ssh_connection, ssh_process) =
Self::establish_connection(connection_options.clone(), delegate.clone(), cx)
.await?;
let inner_state = {
let (proxy, proxy_incoming_tx, proxy_outgoing_rx) =
ChannelForwarder::new(incoming_tx, outgoing_rx, &mut cx);
let multiplex_task = Self::multiplex(
Arc::downgrade(&this),
ssh_process,
proxy_incoming_tx,
proxy_outgoing_rx,
cx,
);
let (ssh_connection, ssh_proxy_process) = Self::establish_connection(
unique_identifier,
connection_options,
delegate.clone(),
&mut cx,
)
.await?;
SshRemoteClientState {
ssh_connection,
delegate,
forwarder: proxy,
multiplex_task,
}
};
let multiplex_task = Self::multiplex(
this.downgrade(),
ssh_proxy_process,
proxy_incoming_tx,
proxy_outgoing_rx,
&mut cx,
);
this.inner_state.lock().replace(inner_state);
SshRemoteClientState {
ssh_connection,
delegate,
forwarder: proxy,
multiplex_task,
heartbeat_task: Self::heartbeat(this.downgrade(), &mut cx),
}
};
Ok(this)
this.update(&mut cx, |this, cx| {
this.inner_state.lock().replace(inner_state);
cx.notify();
})?;
Ok(this)
})
}
fn reconnect(this: Arc<Self>, cx: &mut AsyncAppContext) -> Result<()> {
let Some(state) = this.inner_state.lock().take() else {
fn shutdown_processes(&self) {
let Some(mut state) = self.inner_state.lock().take() else {
return;
};
log::info!("shutting down ssh processes");
// Drop `multiplex_task` because it owns our ssh_proxy_process, which is a
// child of master_process.
let task = mem::replace(&mut state.multiplex_task, Task::ready(Ok(())));
drop(task);
// Now drop the rest of state, which kills master process.
drop(state);
}
fn reconnect(&self, cx: &ModelContext<Self>) -> Result<()> {
log::info!("Trying to reconnect to ssh server...");
let Some(state) = self.inner_state.lock().take() else {
return Err(anyhow!("reconnect is already in progress"));
};
let workspace_identifier = self.unique_identifier.clone();
let SshRemoteClientState {
mut ssh_connection,
delegate,
forwarder: proxy,
multiplex_task,
heartbeat_task,
} = state;
drop(multiplex_task);
drop(heartbeat_task);
cx.spawn(|mut cx| async move {
cx.spawn(|this, mut cx| async move {
let (incoming_tx, outgoing_rx) = proxy.into_channels().await;
ssh_connection.master_process.kill()?;
@@ -321,8 +363,13 @@ impl SshRemoteClient {
let connection_options = ssh_connection.socket.connection_options.clone();
let (ssh_connection, ssh_process) =
Self::establish_connection(connection_options, delegate.clone(), &mut cx).await?;
let (ssh_connection, ssh_process) = Self::establish_connection(
workspace_identifier,
connection_options,
delegate.clone(),
&mut cx,
)
.await?;
let (proxy, proxy_incoming_tx, proxy_outgoing_rx) =
ChannelForwarder::new(incoming_tx, outgoing_rx, &mut cx);
@@ -332,32 +379,95 @@ impl SshRemoteClient {
delegate,
forwarder: proxy,
multiplex_task: Self::multiplex(
Arc::downgrade(&this),
this.clone(),
ssh_process,
proxy_incoming_tx,
proxy_outgoing_rx,
&mut cx,
),
heartbeat_task: Self::heartbeat(this.clone(), &mut cx),
};
this.inner_state.lock().replace(inner_state);
anyhow::Ok(())
this.update(&mut cx, |this, _| {
this.inner_state.lock().replace(inner_state);
})
})
.detach();
Ok(())
}
anyhow::Ok(())
fn heartbeat(this: WeakModel<Self>, cx: &mut AsyncAppContext) -> Task<Result<()>> {
let Ok(client) = this.update(cx, |this, _| this.client.clone()) else {
return Task::ready(Err(anyhow!("SshRemoteClient lost")));
};
cx.spawn(|mut cx| {
let this = this.clone();
async move {
const MAX_MISSED_HEARTBEATS: usize = 5;
const HEARTBEAT_INTERVAL: Duration = Duration::from_secs(5);
const HEARTBEAT_TIMEOUT: Duration = Duration::from_secs(5);
let mut missed_heartbeats = 0;
let mut timer = Timer::interval(HEARTBEAT_INTERVAL);
loop {
timer.next().await;
log::info!("Sending heartbeat to server...");
let result = smol::future::or(
async {
client.request(proto::Ping {}).await?;
Ok(())
},
async {
smol::Timer::after(HEARTBEAT_TIMEOUT).await;
Err(anyhow!("Timeout detected"))
},
)
.await;
if result.is_err() {
missed_heartbeats += 1;
log::warn!(
"No heartbeat from server after {:?}. Missed heartbeat {} out of {}.",
HEARTBEAT_TIMEOUT,
missed_heartbeats,
MAX_MISSED_HEARTBEATS
);
} else {
missed_heartbeats = 0;
}
if missed_heartbeats >= MAX_MISSED_HEARTBEATS {
log::error!(
"Missed last {} heartbeats. Reconnecting...",
missed_heartbeats
);
this.update(&mut cx, |this, cx| {
this.reconnect(cx)
.context("failed to reconnect after missing heartbeats")
})
.context("failed to update weak reference, SshRemoteClient lost?")??;
return Ok(());
}
}
}
})
}
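The accounting inside the heartbeat loop is simple enough to isolate: any successful ping resets the counter, and a reconnect triggers after `MAX_MISSED_HEARTBEATS` consecutive failures. A synchronous sketch with the async transport replaced by an iterator of ping results (the constant matches the diff; the function name is illustrative):

```rust
const MAX_MISSED_HEARTBEATS: usize = 5;

/// Returns how many pings were consumed before a reconnect is triggered,
/// or `None` if the stream ends while the connection is still healthy.
fn pings_until_reconnect(results: impl IntoIterator<Item = bool>) -> Option<usize> {
    let mut missed = 0;
    for (i, ok) in results.into_iter().enumerate() {
        if ok {
            missed = 0; // any successful ping resets the counter
        } else {
            missed += 1;
        }
        if missed >= MAX_MISSED_HEARTBEATS {
            return Some(i + 1);
        }
    }
    None
}

fn main() {
    // Five consecutive failures trigger a reconnect...
    assert_eq!(pings_until_reconnect([false; 5]), Some(5));
    // ...but a single success in between resets the counter.
    assert_eq!(
        pings_until_reconnect([false, false, true, false, false, false, false, false]),
        Some(8)
    );
}
```

Resetting on success (rather than decrementing) means only sustained outages cause a reconnect, which keeps a flaky-but-alive link from thrashing.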
fn multiplex(
this: Weak<Self>,
mut ssh_process: Child,
this: WeakModel<Self>,
mut ssh_proxy_process: Child,
incoming_tx: UnboundedSender<Envelope>,
mut outgoing_rx: UnboundedReceiver<Envelope>,
cx: &mut AsyncAppContext,
cx: &AsyncAppContext,
) -> Task<Result<()>> {
let mut child_stderr = ssh_process.stderr.take().unwrap();
let mut child_stdout = ssh_process.stdout.take().unwrap();
let mut child_stdin = ssh_process.stdin.take().unwrap();
let mut child_stderr = ssh_proxy_process.stderr.take().unwrap();
let mut child_stdout = ssh_proxy_process.stdout.take().unwrap();
let mut child_stdin = ssh_proxy_process.stdin.take().unwrap();
let io_task = cx.background_executor().spawn(async move {
let mut stdin_buffer = Vec::new();
@@ -383,7 +493,7 @@ impl SshRemoteClient {
Ok(0) => {
child_stdin.close().await?;
outgoing_rx.close();
let status = ssh_process.status().await?;
let status = ssh_proxy_process.status().await?;
if !status.success() {
log::error!("ssh process exited with status: {status:?}");
return Err(anyhow!("ssh process exited with non-zero status code: {:?}", status.code()));
@@ -444,9 +554,9 @@ impl SshRemoteClient {
if let Err(error) = result {
log::warn!("ssh io task died with error: {:?}. reconnecting...", error);
if let Some(this) = this.upgrade() {
Self::reconnect(this, &mut cx).ok();
}
this.update(&mut cx, |this, cx| {
this.reconnect(cx).ok();
})?;
}
Ok(())
@@ -454,6 +564,7 @@ impl SshRemoteClient {
}
async fn establish_connection(
unique_identifier: String,
connection_options: SshConnectionOptions,
delegate: Arc<dyn SshClientDelegate>,
cx: &mut AsyncAppContext,
@@ -477,17 +588,22 @@ impl SshRemoteClient {
let socket = ssh_connection.socket.clone();
run_cmd(socket.ssh_command(&remote_binary_path).arg("version")).await?;
let ssh_process = socket
delegate.set_status(Some("Starting proxy"), cx);
let ssh_proxy_process = socket
.ssh_command(format!(
"RUST_LOG={} RUST_BACKTRACE={} {:?} run",
"RUST_LOG={} RUST_BACKTRACE={} {:?} proxy --identifier {}",
std::env::var("RUST_LOG").unwrap_or_default(),
std::env::var("RUST_BACKTRACE").unwrap_or_default(),
remote_binary_path,
unique_identifier,
))
// IMPORTANT: we kill this process when we drop the task that uses it.
.kill_on_drop(true)
.spawn()
.context("failed to spawn remote server")?;
Ok((ssh_connection, ssh_process))
Ok((ssh_connection, ssh_proxy_process))
}
pub fn subscribe_to_entity<E: 'static>(&self, remote_id: u64, entity: &Model<E>) {
@@ -505,20 +621,32 @@ impl SshRemoteClient {
self.client.clone().into()
}
pub fn connection_string(&self) -> String {
self.connection_options.connection_string()
}
pub fn is_reconnect_underway(&self) -> bool {
maybe!({ Some(self.inner_state.try_lock()?.is_none()) }).unwrap_or_default()
}
#[cfg(any(test, feature = "test-support"))]
pub fn fake(
client_cx: &mut gpui::TestAppContext,
server_cx: &mut gpui::TestAppContext,
) -> (Arc<Self>, Arc<ChannelClient>) {
) -> (Model<Self>, Arc<ChannelClient>) {
use gpui::Context;
let (server_to_client_tx, server_to_client_rx) = mpsc::unbounded();
let (client_to_server_tx, client_to_server_rx) = mpsc::unbounded();
(
client_cx.update(|cx| {
let client = ChannelClient::new(server_to_client_rx, client_to_server_tx, cx);
Arc::new(Self {
cx.new_model(|_| Self {
client,
inner_state: Mutex::new(None),
unique_identifier: "fake".to_string(),
connection_options: SshConnectionOptions::default(),
inner_state: Arc::new(Mutex::new(None)),
})
}),
server_cx.update(|cx| ChannelClient::new(client_to_server_rx, server_to_client_tx, cx)),
@@ -575,13 +703,19 @@ impl SshRemoteConnection {
// Create a domain socket listener to handle requests from the askpass program.
let askpass_socket = temp_dir.path().join("askpass.sock");
let (askpass_opened_tx, askpass_opened_rx) = oneshot::channel::<()>();
let listener =
UnixListener::bind(&askpass_socket).context("failed to create askpass socket")?;
let askpass_task = cx.spawn({
let delegate = delegate.clone();
|mut cx| async move {
let mut askpass_opened_tx = Some(askpass_opened_tx);
while let Ok((mut stream, _)) = listener.accept().await {
if let Some(askpass_opened_tx) = askpass_opened_tx.take() {
askpass_opened_tx.send(()).ok();
}
let mut buffer = Vec::new();
let mut reader = BufReader::new(&mut stream);
if reader.read_until(b'\0', &mut buffer).await.is_err() {
@@ -631,20 +765,29 @@ impl SshRemoteConnection {
// has completed.
let stdout = master_process.stdout.as_mut().unwrap();
let mut output = Vec::new();
let connection_timeout = std::time::Duration::from_secs(10);
let result = read_with_timeout(stdout, connection_timeout, &mut output).await;
if let Err(e) = result {
let error_message = if e.kind() == std::io::ErrorKind::TimedOut {
format!(
"Failed to connect to host. Timed out after {:?}.",
connection_timeout
)
} else {
format!("Failed to connect to host: {}.", e)
};
let connection_timeout = Duration::from_secs(10);
let result = select_biased! {
_ = askpass_opened_rx.fuse() => {
// If the askpass script has opened, that means the user is typing
// their password, in which case we don't want to timeout anymore,
// since we know a connection has been established.
stdout.read_to_end(&mut output).await?;
Ok(())
}
result = stdout.read_to_end(&mut output).fuse() => {
result?;
Ok(())
}
_ = futures::FutureExt::fuse(smol::Timer::after(connection_timeout)) => {
Err(anyhow!("Exceeded {:?} timeout trying to connect to host", connection_timeout))
}
};
if let Err(e) = result {
let error_message = format!("Failed to connect to host: {}.", e);
delegate.set_error(error_message, cx);
return Err(e.into());
return Err(e);
}
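The `select_biased!` block above encodes the PR's core policy: the connection attempt times out after a fixed duration, unless the askpass prompt opens first, in which case the timeout is abandoned because the user is typing a password. A thread-and-channel sketch of the same policy (the `Event` enum and the durations are illustrative, not from the source):

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

enum Event {
    AskpassOpened, // user is being prompted for a password
    Connected,     // handshake completed
}

fn wait_for_connection(events: mpsc::Receiver<Event>, timeout: Duration) -> Result<(), String> {
    match events.recv_timeout(timeout) {
        // Password prompt opened: cancel the timeout and wait indefinitely.
        Ok(Event::AskpassOpened) => match events.recv() {
            Ok(Event::Connected) => Ok(()),
            _ => Err("connection closed during password entry".into()),
        },
        Ok(Event::Connected) => Ok(()),
        Err(_) => Err(format!("Exceeded {:?} timeout trying to connect to host", timeout)),
    }
}

fn main() {
    // Fast path: connection arrives before the timeout.
    let (tx, rx) = mpsc::channel();
    tx.send(Event::Connected).unwrap();
    assert!(wait_for_connection(rx, Duration::from_millis(10)).is_ok());

    // Slow path: the password prompt disarms the timeout entirely.
    let (tx, rx) = mpsc::channel();
    tx.send(Event::AskpassOpened).unwrap();
    thread::spawn(move || {
        thread::sleep(Duration::from_millis(50)); // longer than the timeout
        tx.send(Event::Connected).ok();
    });
    assert!(wait_for_connection(rx, Duration::from_millis(10)).is_ok());

    // No events at all: the timeout fires.
    let (_tx, rx) = mpsc::channel::<Event>();
    assert!(wait_for_connection(rx, Duration::from_millis(10)).is_err());
}
```

This is why the old `read_with_timeout` helper could be deleted: the three-way race expresses the cancellation directly instead of wrapping the read in an unconditional deadline.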
drop(askpass_task);
@@ -653,10 +796,10 @@ impl SshRemoteConnection {
output.clear();
let mut stderr = master_process.stderr.take().unwrap();
stderr.read_to_end(&mut output).await?;
Err(anyhow!(
"failed to connect: {}",
String::from_utf8_lossy(&output)
))?;
let error_message = format!("failed to connect: {}", String::from_utf8_lossy(&output));
delegate.set_error(error_message.clone(), cx);
Err(anyhow!(error_message))?;
}
Ok(Self {


@@ -22,25 +22,26 @@ test-support = ["fs/test-support"]
[dependencies]
anyhow.workspace = true
clap.workspace = true
client.workspace = true
env_logger.workspace = true
fs.workspace = true
futures.workspace = true
gpui.workspace = true
node_runtime.workspace = true
language.workspace = true
languages.workspace = true
log.workspace = true
node_runtime.workspace = true
project.workspace = true
remote.workspace = true
rpc.workspace = true
settings.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
shellexpand.workspace = true
smol.workspace = true
worktree.workspace = true
language.workspace = true
languages.workspace = true
util.workspace = true
worktree.workspace = true
[dev-dependencies]
client = { workspace = true, features = ["test-support"] }


@@ -112,6 +112,8 @@ impl HeadlessProject {
client.add_request_handler(cx.weak_model(), Self::handle_list_remote_directory);
client.add_request_handler(cx.weak_model(), Self::handle_check_file_exists);
client.add_request_handler(cx.weak_model(), Self::handle_shutdown_remote_server);
client.add_request_handler(cx.weak_model(), Self::handle_ping);
client.add_model_request_handler(Self::handle_add_worktree);
client.add_model_request_handler(Self::handle_open_buffer_by_path);
@@ -335,4 +337,31 @@ impl HeadlessProject {
path: expanded,
})
}
pub async fn handle_shutdown_remote_server(
_this: Model<Self>,
_envelope: TypedEnvelope<proto::ShutdownRemoteServer>,
cx: AsyncAppContext,
) -> Result<proto::Ack> {
cx.spawn(|cx| async move {
cx.update(|cx| {
// TODO: This is a hack, because in a headless project, shutdown isn't executed
// when calling quit, but it should be.
cx.shutdown();
cx.quit();
})
})
.detach();
Ok(proto::Ack {})
}
pub async fn handle_ping(
_this: Model<Self>,
_envelope: TypedEnvelope<proto::Ping>,
_cx: AsyncAppContext,
) -> Result<proto::Ack> {
log::debug!("Received ping from client");
Ok(proto::Ack {})
}
}


@@ -1,20 +1,34 @@
#![cfg_attr(target_os = "windows", allow(unused, dead_code))]
use fs::RealFs;
use futures::channel::mpsc;
use gpui::Context as _;
use remote::{
json_log::LogRecord,
protocol::{read_message, write_message},
};
use remote_server::HeadlessProject;
use smol::{io::AsyncWriteExt, stream::StreamExt as _, Async};
use std::{
env,
io::{self, Write},
mem, process,
sync::Arc,
};
use anyhow::Result;
use clap::{Parser, Subcommand};
use std::path::PathBuf;
#[derive(Parser)]
#[command(disable_version_flag = true)]
struct Cli {
#[command(subcommand)]
command: Option<Commands>,
}
#[derive(Subcommand)]
enum Commands {
Run {
#[arg(long)]
log_file: PathBuf,
#[arg(long)]
pid_file: PathBuf,
#[arg(long)]
stdin_socket: PathBuf,
#[arg(long)]
stdout_socket: PathBuf,
},
Proxy {
#[arg(long)]
identifier: String,
},
Version,
}
#[cfg(windows)]
fn main() {
@@ -22,76 +36,32 @@ fn main() {
}
#[cfg(not(windows))]
fn main() {
use remote::ssh_session::ChannelClient;
fn main() -> Result<()> {
use remote_server::unix::{execute_proxy, execute_run, init_logging};
env_logger::builder()
.format(|buf, record| {
serde_json::to_writer(&mut *buf, &LogRecord::new(record))?;
buf.write_all(b"\n")?;
Ok(())
})
.init();
let cli = Cli::parse();
let subcommand = std::env::args().nth(1);
match subcommand.as_deref() {
Some("run") => {}
Some("version") => {
println!("{}", env!("ZED_PKG_VERSION"));
return;
match cli.command {
Some(Commands::Run {
log_file,
pid_file,
stdin_socket,
stdout_socket,
}) => {
init_logging(Some(log_file))?;
execute_run(pid_file, stdin_socket, stdout_socket)
}
_ => {
eprintln!("usage: remote <run|version>");
process::exit(1);
Some(Commands::Proxy { identifier }) => {
init_logging(None)?;
execute_proxy(identifier)
}
Some(Commands::Version) => {
eprintln!("{}", env!("ZED_PKG_VERSION"));
Ok(())
}
None => {
eprintln!("usage: remote <run|proxy|version>");
std::process::exit(1);
}
}
gpui::App::headless().run(move |cx| {
settings::init(cx);
HeadlessProject::init(cx);
let (incoming_tx, incoming_rx) = mpsc::unbounded();
let (outgoing_tx, mut outgoing_rx) = mpsc::unbounded();
let mut stdin = Async::new(io::stdin()).unwrap();
let mut stdout = Async::new(io::stdout()).unwrap();
let session = ChannelClient::new(incoming_rx, outgoing_tx, cx);
let project = cx.new_model(|cx| {
HeadlessProject::new(
session.clone(),
Arc::new(RealFs::new(Default::default(), None)),
cx,
)
});
cx.background_executor()
.spawn(async move {
let mut output_buffer = Vec::new();
while let Some(message) = outgoing_rx.next().await {
write_message(&mut stdout, &mut output_buffer, message).await?;
stdout.flush().await?;
}
anyhow::Ok(())
})
.detach();
cx.background_executor()
.spawn(async move {
let mut input_buffer = Vec::new();
loop {
let message = match read_message(&mut stdin, &mut input_buffer).await {
Ok(message) => message,
Err(error) => {
log::warn!("error reading message: {:?}", error);
process::exit(0);
}
};
incoming_tx.unbounded_send(message).ok();
}
})
.detach();
mem::forget(project);
});
}


@@ -655,7 +655,7 @@ async fn init_test(
(project, headless, fs)
}
fn build_project(ssh: Arc<SshRemoteClient>, cx: &mut TestAppContext) -> Model<Project> {
fn build_project(ssh: Model<SshRemoteClient>, cx: &mut TestAppContext) -> Model<Project> {
cx.update(|cx| {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);


@@ -1,5 +1,8 @@
mod headless_project;
#[cfg(not(windows))]
pub mod unix;
#[cfg(test)]
mod remote_editing_tests;


@@ -0,0 +1,336 @@
use crate::HeadlessProject;
use anyhow::{anyhow, Context, Result};
use fs::RealFs;
use futures::channel::mpsc;
use futures::{select, select_biased, AsyncRead, AsyncWrite, FutureExt, SinkExt};
use gpui::{AppContext, Context as _};
use remote::ssh_session::ChannelClient;
use remote::{
json_log::LogRecord,
protocol::{read_message, write_message},
};
use rpc::proto::Envelope;
use smol::Async;
use smol::{io::AsyncWriteExt, net::unix::UnixListener, stream::StreamExt as _};
use std::{
env,
io::Write,
mem,
path::{Path, PathBuf},
sync::Arc,
};
pub fn init_logging(log_file: Option<PathBuf>) -> Result<()> {
if let Some(log_file) = log_file {
let target = Box::new(if log_file.exists() {
std::fs::OpenOptions::new()
.append(true)
.open(&log_file)
.context("Failed to open log file in append mode")?
} else {
std::fs::File::create(&log_file).context("Failed to create log file")?
});
env_logger::Builder::from_default_env()
.target(env_logger::Target::Pipe(target))
.init();
} else {
env_logger::builder()
.format(|buf, record| {
serde_json::to_writer(&mut *buf, &LogRecord::new(record))?;
buf.write_all(b"\n")?;
Ok(())
})
.init();
}
Ok(())
}
fn start_server(
stdin_listener: UnixListener,
stdout_listener: UnixListener,
cx: &mut AppContext,
) -> Arc<ChannelClient> {
// This is the server idle timeout. If no connection comes in within this timeout, the server will shut down.
const IDLE_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(10 * 60);
let (incoming_tx, incoming_rx) = mpsc::unbounded::<Envelope>();
let (outgoing_tx, mut outgoing_rx) = mpsc::unbounded::<Envelope>();
let (app_quit_tx, mut app_quit_rx) = mpsc::unbounded::<()>();
cx.on_app_quit(move |_| {
let mut app_quit_tx = app_quit_tx.clone();
async move {
app_quit_tx.send(()).await.ok();
}
})
.detach();
cx.spawn(|cx| async move {
let mut stdin_incoming = stdin_listener.incoming();
let mut stdout_incoming = stdout_listener.incoming();
loop {
let streams = futures::future::join(stdin_incoming.next(), stdout_incoming.next());
log::info!("server: accepting new connections");
let result = select! {
streams = streams.fuse() => {
let (Some(Ok(stdin_stream)), Some(Ok(stdout_stream))) = streams else {
break;
};
anyhow::Ok((stdin_stream, stdout_stream))
}
_ = futures::FutureExt::fuse(smol::Timer::after(IDLE_TIMEOUT)) => {
log::warn!("server: timed out waiting for new connections after {:?}. exiting.", IDLE_TIMEOUT);
cx.update(|cx| {
// TODO: This is a hack, because in a headless project, shutdown isn't executed
// when calling quit, but it should be.
cx.shutdown();
cx.quit();
})?;
break;
}
_ = app_quit_rx.next().fuse() => {
break;
}
};
let Ok((mut stdin_stream, mut stdout_stream)) = result else {
break;
};
let mut input_buffer = Vec::new();
let mut output_buffer = Vec::new();
loop {
select_biased! {
_ = app_quit_rx.next().fuse() => {
return anyhow::Ok(());
}
stdin_message = read_message(&mut stdin_stream, &mut input_buffer).fuse() => {
let message = match stdin_message {
Ok(message) => message,
Err(error) => {
log::warn!("server: error reading message on stdin: {}. exiting.", error);
break;
}
};
if let Err(error) = incoming_tx.unbounded_send(message) {
log::error!("server: failed to send message to application: {:?}. exiting.", error);
return Err(anyhow!(error));
}
}
outgoing_message = outgoing_rx.next().fuse() => {
let Some(message) = outgoing_message else {
log::error!("server: stdout handler, no message");
break;
};
if let Err(error) =
write_message(&mut stdout_stream, &mut output_buffer, message).await
{
log::error!("server: failed to write stdout message: {:?}", error);
break;
}
if let Err(error) = stdout_stream.flush().await {
log::error!("server: failed to flush stdout message: {:?}", error);
break;
}
}
}
}
}
anyhow::Ok(())
})
.detach();
ChannelClient::new(incoming_rx, outgoing_tx, cx)
}
pub fn execute_run(pid_file: PathBuf, stdin_socket: PathBuf, stdout_socket: PathBuf) -> Result<()> {
write_pid_file(&pid_file)
.with_context(|| format!("failed to write pid file: {:?}", &pid_file))?;
let stdin_listener = UnixListener::bind(stdin_socket).context("failed to bind stdin socket")?;
let stdout_listener =
UnixListener::bind(stdout_socket).context("failed to bind stdout socket")?;
gpui::App::headless().run(move |cx| {
settings::init(cx);
HeadlessProject::init(cx);
let session = start_server(stdin_listener, stdout_listener, cx);
let project = cx.new_model(|cx| {
HeadlessProject::new(session, Arc::new(RealFs::new(Default::default(), None)), cx)
});
mem::forget(project);
});
log::info!("server: gpui app is shut down. quitting.");
Ok(())
}
pub fn execute_proxy(identifier: String) -> Result<()> {
log::debug!("proxy: starting up. PID: {}", std::process::id());
let project_dir = ensure_project_dir(&identifier)?;
let pid_file = project_dir.join("server.pid");
let stdin_socket = project_dir.join("stdin.sock");
let stdout_socket = project_dir.join("stdout.sock");
let log_file = project_dir.join("server.log");
let server_running = check_pid_file(&pid_file)?;
if !server_running {
spawn_server(&log_file, &pid_file, &stdin_socket, &stdout_socket)?;
};
let stdin_task = smol::spawn(async move {
let stdin = Async::new(std::io::stdin())?;
let stream = smol::net::unix::UnixStream::connect(stdin_socket).await?;
handle_io(stdin, stream, "stdin").await
});
let stdout_task: smol::Task<Result<()>> = smol::spawn(async move {
let stdout = Async::new(std::io::stdout())?;
let stream = smol::net::unix::UnixStream::connect(stdout_socket).await?;
handle_io(stream, stdout, "stdout").await
});
if let Err(forwarding_result) =
smol::block_on(async move { smol::future::race(stdin_task, stdout_task).await })
{
log::error!(
"proxy: failed to forward messages: {:?}, terminating...",
forwarding_result
);
return Err(forwarding_result);
}
Ok(())
}
fn ensure_project_dir(identifier: &str) -> Result<PathBuf> {
let project_dir = env::var("HOME").unwrap_or_else(|_| ".".to_string());
let project_dir = PathBuf::from(project_dir)
.join(".local")
.join("state")
.join("zed-remote-server")
.join(identifier);
std::fs::create_dir_all(&project_dir)?;
Ok(project_dir)
}
fn spawn_server(
log_file: &Path,
pid_file: &Path,
stdin_socket: &Path,
stdout_socket: &Path,
) -> Result<()> {
if stdin_socket.exists() {
std::fs::remove_file(&stdin_socket)?;
}
if stdout_socket.exists() {
std::fs::remove_file(&stdout_socket)?;
}
let binary_name = std::env::current_exe()?;
let server_process = std::process::Command::new(binary_name)
.arg("run")
.arg("--log-file")
.arg(log_file)
.arg("--pid-file")
.arg(pid_file)
.arg("--stdin-socket")
.arg(stdin_socket)
.arg("--stdout-socket")
.arg(stdout_socket)
.spawn()?;
log::debug!("proxy: server started. PID: {:?}", server_process.id());
let mut total_time_waited = std::time::Duration::from_secs(0);
let wait_duration = std::time::Duration::from_millis(20);
while !stdout_socket.exists() || !stdin_socket.exists() {
log::debug!("proxy: waiting for server to be ready to accept connections...");
std::thread::sleep(wait_duration);
total_time_waited += wait_duration;
}
log::info!(
"proxy: server ready to accept connections. total time waited: {:?}",
total_time_waited
);
Ok(())
}
fn check_pid_file(path: &Path) -> Result<bool> {
let Some(pid) = std::fs::read_to_string(&path)
.ok()
.and_then(|contents| contents.parse::<u32>().ok())
else {
return Ok(false);
};
log::debug!("proxy: Checking if process with PID {} exists...", pid);
match std::process::Command::new("kill")
.arg("-0")
.arg(pid.to_string())
.output()
{
Ok(output) if output.status.success() => {
log::debug!("proxy: Process with PID {} exists. NOT spawning new server, but attaching to existing one.", pid);
Ok(true)
}
_ => {
log::debug!("proxy: Found PID file, but process with that PID does not exist. Removing PID file.");
std::fs::remove_file(&path).context("proxy: Failed to remove PID file")?;
Ok(false)
}
}
}
fn write_pid_file(path: &Path) -> Result<()> {
if path.exists() {
std::fs::remove_file(path)?;
}
std::fs::write(path, std::process::id().to_string()).context("Failed to write PID file")
}
async fn handle_io<R, W>(mut reader: R, mut writer: W, socket_name: &str) -> Result<()>
where
R: AsyncRead + Unpin,
W: AsyncWrite + Unpin,
{
use remote::protocol::read_message_raw;
let mut buffer = Vec::new();
loop {
read_message_raw(&mut reader, &mut buffer)
.await
.with_context(|| format!("proxy: failed to read message from {}", socket_name))?;
write_size_prefixed_buffer(&mut writer, &mut buffer)
.await
.with_context(|| format!("proxy: failed to write message to {}", socket_name))?;
writer.flush().await?;
buffer.clear();
}
}
async fn write_size_prefixed_buffer<S: AsyncWrite + Unpin>(
stream: &mut S,
buffer: &mut Vec<u8>,
) -> Result<()> {
let len = buffer.len() as u32;
stream.write_all(len.to_le_bytes().as_slice()).await?;
stream.write_all(buffer).await?;
Ok(())
}
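For reference, the wire format produced by `write_size_prefixed_buffer` is a little-endian `u32` length prefix followed by the payload bytes. A synchronous sketch of the same framing, with illustrative `write_frame`/`read_frame` helpers that are not part of the codebase:

```rust
use std::io::{Cursor, Read, Write};

// Each frame: a little-endian u32 length, then exactly that many bytes.
fn write_frame<W: Write>(stream: &mut W, payload: &[u8]) -> std::io::Result<()> {
    let len = payload.len() as u32;
    stream.write_all(&len.to_le_bytes())?;
    stream.write_all(payload)
}

fn read_frame<R: Read>(stream: &mut R) -> std::io::Result<Vec<u8>> {
    let mut len_bytes = [0u8; 4];
    stream.read_exact(&mut len_bytes)?;
    let len = u32::from_le_bytes(len_bytes) as usize;
    let mut payload = vec![0u8; len];
    stream.read_exact(&mut payload)?;
    Ok(payload)
}

fn main() -> std::io::Result<()> {
    let mut wire = Vec::new();
    write_frame(&mut wire, b"hello")?;
    write_frame(&mut wire, b"world!")?;
    let mut cursor = Cursor::new(wire);
    assert_eq!(read_frame(&mut cursor)?, b"hello");
    assert_eq!(read_frame(&mut cursor)?, b"world!");
    println!("round-trip ok");
    Ok(())
}
```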

@@ -0,0 +1,31 @@
[package]
name = "reqwest_client"
version = "0.1.0"
edition = "2021"
publish = false
license = "Apache-2.0"
[lints]
workspace = true
[features]
test-support = []
[lib]
path = "src/reqwest_client.rs"
doctest = true
[[example]]
name = "client"
path = "examples/client.rs"
[dependencies]
anyhow.workspace = true
bytes = "1.0"
futures.workspace = true
http_client.workspace = true
serde.workspace = true
smol.workspace = true
tokio.workspace = true
reqwest = { workspace = true, features = ["rustls-tls-manual-roots", "stream"] }

@@ -0,0 +1 @@
../../LICENSE-GPL

@@ -0,0 +1,16 @@
use futures::AsyncReadExt as _;
use http_client::AsyncBody;
use http_client::HttpClient;
use reqwest_client::ReqwestClient;
#[tokio::main]
async fn main() {
let resp = ReqwestClient::new()
.get("http://zed.dev", AsyncBody::empty(), true)
.await
.unwrap();
let mut body = String::new();
resp.into_body().read_to_string(&mut body).await.unwrap();
println!("{}", &body);
}

@@ -0,0 +1,232 @@
use std::{borrow::Cow, io::Read, pin::Pin, task::Poll};
use anyhow::anyhow;
use bytes::{BufMut, Bytes, BytesMut};
use futures::{AsyncRead, TryStreamExt};
use http_client::{http, AsyncBody, ReadTimeout};
use reqwest::header::{HeaderMap, HeaderValue};
use smol::future::FutureExt;
const DEFAULT_CAPACITY: usize = 4096;
pub struct ReqwestClient {
client: reqwest::Client,
}
impl ReqwestClient {
pub fn new() -> Self {
Self {
client: reqwest::Client::new(),
}
}
pub fn user_agent(agent: &str) -> anyhow::Result<Self> {
let mut map = HeaderMap::new();
map.insert(http::header::USER_AGENT, HeaderValue::from_str(agent)?);
Ok(Self {
client: reqwest::Client::builder().default_headers(map).build()?,
})
}
}
impl From<reqwest::Client> for ReqwestClient {
fn from(client: reqwest::Client) -> Self {
Self { client }
}
}
// This struct is essentially a re-implementation of
// https://docs.rs/tokio-util/0.7.12/tokio_util/io/struct.ReaderStream.html
// except outside of Tokio's aegis
struct ReaderStream {
reader: Option<Pin<Box<dyn futures::AsyncRead + Send + Sync>>>,
buf: BytesMut,
capacity: usize,
}
impl ReaderStream {
fn new(reader: Pin<Box<dyn futures::AsyncRead + Send + Sync>>) -> Self {
Self {
reader: Some(reader),
buf: BytesMut::new(),
capacity: DEFAULT_CAPACITY,
}
}
}
impl futures::Stream for ReaderStream {
type Item = std::io::Result<Bytes>;
fn poll_next(
mut self: Pin<&mut Self>,
cx: &mut std::task::Context<'_>,
) -> Poll<Option<Self::Item>> {
let mut this = self.as_mut();
let mut reader = match this.reader.take() {
Some(r) => r,
None => return Poll::Ready(None),
};
if this.buf.capacity() == 0 {
let capacity = this.capacity;
this.buf.reserve(capacity);
}
match poll_read_buf(&mut reader, cx, &mut this.buf) {
Poll::Pending => Poll::Pending,
Poll::Ready(Err(err)) => {
self.reader = None;
Poll::Ready(Some(Err(err)))
}
Poll::Ready(Ok(0)) => {
self.reader = None;
Poll::Ready(None)
}
Poll::Ready(Ok(_)) => {
let chunk = this.buf.split();
self.reader = Some(reader);
Poll::Ready(Some(Ok(chunk.freeze())))
}
}
}
}
/// Implementation from https://docs.rs/tokio-util/0.7.12/src/tokio_util/util/poll_buf.rs.html
/// Specialized for this use case
pub fn poll_read_buf(
io: &mut Pin<Box<dyn futures::AsyncRead + Send + Sync>>,
cx: &mut std::task::Context<'_>,
buf: &mut BytesMut,
) -> Poll<std::io::Result<usize>> {
if !buf.has_remaining_mut() {
return Poll::Ready(Ok(0));
}
let n = {
let dst = buf.chunk_mut();
// Safety: `chunk_mut()` returns a `&mut UninitSlice`, and `UninitSlice` is a
// transparent wrapper around `[MaybeUninit<u8>]`.
let dst = unsafe { &mut *(dst as *mut _ as *mut [std::mem::MaybeUninit<u8>]) };
let mut buf = tokio::io::ReadBuf::uninit(dst);
let ptr = buf.filled().as_ptr();
let unfilled_portion = buf.initialize_unfilled();
// SAFETY: Pin projection
let io_pin = unsafe { Pin::new_unchecked(io) };
std::task::ready!(io_pin.poll_read(cx, unfilled_portion)?);
// Ensure the pointer does not change from under us
assert_eq!(ptr, buf.filled().as_ptr());
buf.filled().len()
};
// Safety: This is guaranteed to be the number of initialized (and read)
// bytes due to the invariants provided by `ReadBuf::filled`.
unsafe {
buf.advance_mut(n);
}
Poll::Ready(Ok(n))
}
enum WrappedBodyInner {
None,
SyncReader(std::io::Cursor<Cow<'static, [u8]>>),
Stream(ReaderStream),
}
struct WrappedBody(WrappedBodyInner);
impl WrappedBody {
fn new(body: AsyncBody) -> Self {
match body.0 {
http_client::Inner::Empty => Self(WrappedBodyInner::None),
http_client::Inner::SyncReader(cursor) => Self(WrappedBodyInner::SyncReader(cursor)),
http_client::Inner::AsyncReader(pin) => {
Self(WrappedBodyInner::Stream(ReaderStream::new(pin)))
}
}
}
}
impl futures::stream::Stream for WrappedBody {
type Item = Result<Bytes, std::io::Error>;
fn poll_next(
mut self: std::pin::Pin<&mut Self>,
cx: &mut std::task::Context<'_>,
) -> std::task::Poll<Option<Self::Item>> {
match &mut self.0 {
WrappedBodyInner::None => Poll::Ready(None),
WrappedBodyInner::SyncReader(cursor) => {
let mut buf = Vec::new();
match cursor.read_to_end(&mut buf) {
Ok(_) => {
return Poll::Ready(Some(Ok(Bytes::from(buf))));
}
Err(e) => return Poll::Ready(Some(Err(e))),
}
}
WrappedBodyInner::Stream(stream) => {
// SAFETY: Pin projection
let stream = unsafe { Pin::new_unchecked(stream) };
futures::Stream::poll_next(stream, cx)
}
}
}
}
impl http_client::HttpClient for ReqwestClient {
fn proxy(&self) -> Option<&http::Uri> {
None
}
fn send(
&self,
req: http::Request<http_client::AsyncBody>,
) -> futures::future::BoxFuture<
'static,
Result<http_client::Response<http_client::AsyncBody>, anyhow::Error>,
> {
let (parts, body) = req.into_parts();
let mut request = self.client.request(parts.method, parts.uri.to_string());
request = request.headers(parts.headers);
if let Some(redirect_policy) = parts.extensions.get::<http_client::RedirectPolicy>() {
request = request.redirect_policy(match redirect_policy {
http_client::RedirectPolicy::NoFollow => reqwest::redirect::Policy::none(),
http_client::RedirectPolicy::FollowLimit(limit) => {
reqwest::redirect::Policy::limited(*limit as usize)
}
http_client::RedirectPolicy::FollowAll => reqwest::redirect::Policy::limited(100),
});
}
if let Some(ReadTimeout(timeout)) = parts.extensions.get::<ReadTimeout>() {
request = request.timeout(*timeout);
}
let body = WrappedBody::new(body);
let request = request.body(reqwest::Body::wrap_stream(body));
async move {
let response = request.send().await.map_err(|e| anyhow!(e))?;
let status = response.status();
let mut builder = http::Response::builder().status(status.as_u16());
for (name, value) in response.headers() {
builder = builder.header(name, value);
}
let bytes = response.bytes_stream();
let bytes = bytes
.map_err(|e| futures::io::Error::new(futures::io::ErrorKind::Other, e))
.into_async_read();
let body = http_client::AsyncBody::from_reader(bytes);
builder.body(body).map_err(|e| anyhow!(e))
}
.boxed()
}
}
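The `ReaderStream` above repeatedly pulls up-to-capacity chunks from an `AsyncRead` and treats a zero-length read as end of stream. A synchronous sketch of the same pull loop using `std::io::Read` (the `chunks` helper is illustrative only, not part of the codebase):

```rust
use std::io::Read;

// Synchronous analogue of ReaderStream: pull chunks of up to `capacity`
// bytes until a zero-length read signals EOF, mirroring the
// Poll::Ready(Ok(0)) => Poll::Ready(None) arm of the async version.
fn chunks<R: Read>(mut reader: R, capacity: usize) -> std::io::Result<Vec<Vec<u8>>> {
    let mut out = Vec::new();
    loop {
        let mut buf = vec![0u8; capacity];
        let n = reader.read(&mut buf)?;
        if n == 0 {
            break; // EOF: the stream ends
        }
        buf.truncate(n);
        out.push(buf);
    }
    Ok(out)
}

fn main() {
    let parts = chunks(&b"abcdefghij"[..], 4).unwrap();
    assert_eq!(parts, vec![b"abcd".to_vec(), b"efgh".to_vec(), b"ij".to_vec()]);
    println!("{} chunks", parts.len());
}
```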

@@ -27,7 +27,7 @@ use settings::Settings;
use std::sync::Arc;
use theme::ThemeSettings;
-use ui::{h_flex, prelude::*, IconButton, IconName, Tooltip, BASE_REM_SIZE_IN_PX};
+use ui::{h_flex, prelude::*, IconButton, IconButtonShape, IconName, Tooltip, BASE_REM_SIZE_IN_PX};
use util::ResultExt;
use workspace::{
item::ItemHandle,
@@ -200,7 +200,7 @@ impl Render for BufferSearchBar {
};
let search_line = h_flex()
.mb_1()
.gap_2()
.child(
h_flex()
.id("editor-scroll")
@@ -208,7 +208,6 @@ impl Render for BufferSearchBar {
.flex_1()
.h_8()
.px_2()
-.mr_2()
.py_1()
.border_1()
.border_color(editor_border)
@@ -244,66 +243,70 @@ impl Render for BufferSearchBar {
}))
}),
)
.when(supported_options.replacement, |this| {
this.child(
IconButton::new("buffer-search-bar-toggle-replace-button", IconName::Replace)
.style(ButtonStyle::Subtle)
.when(self.replace_enabled, |button| {
button.style(ButtonStyle::Filled)
})
.on_click(cx.listener(|this, _: &ClickEvent, cx| {
this.toggle_replace(&ToggleReplace, cx);
}))
.selected(self.replace_enabled)
.size(ButtonSize::Compact)
.tooltip({
let focus_handle = focus_handle.clone();
move |cx| {
Tooltip::for_action_in(
"Toggle replace",
&ToggleReplace,
&focus_handle,
cx,
)
}
}),
)
})
.when(supported_options.selection, |this| {
this.child(
IconButton::new(
"buffer-search-bar-toggle-search-selection-button",
IconName::SearchSelection,
)
.style(ButtonStyle::Subtle)
.when(self.selection_search_enabled, |button| {
button.style(ButtonStyle::Filled)
})
.on_click(cx.listener(|this, _: &ClickEvent, cx| {
this.toggle_selection(&ToggleSelection, cx);
}))
.selected(self.selection_search_enabled)
.size(ButtonSize::Compact)
.tooltip({
let focus_handle = focus_handle.clone();
move |cx| {
Tooltip::for_action_in(
"Toggle Search Selection",
&ToggleSelection,
&focus_handle,
cx,
)
}
}),
)
})
.child(
h_flex()
.flex_none()
.gap_0p5()
.when(supported_options.replacement, |this| {
this.child(
IconButton::new(
"buffer-search-bar-toggle-replace-button",
IconName::Replace,
)
.style(ButtonStyle::Subtle)
.shape(IconButtonShape::Square)
.when(self.replace_enabled, |button| {
button.style(ButtonStyle::Filled)
})
.on_click(cx.listener(|this, _: &ClickEvent, cx| {
this.toggle_replace(&ToggleReplace, cx);
}))
.selected(self.replace_enabled)
.tooltip({
let focus_handle = focus_handle.clone();
move |cx| {
Tooltip::for_action_in(
"Toggle Replace",
&ToggleReplace,
&focus_handle,
cx,
)
}
}),
)
})
.when(supported_options.selection, |this| {
this.child(
IconButton::new(
"buffer-search-bar-toggle-search-selection-button",
IconName::SearchSelection,
)
.style(ButtonStyle::Subtle)
.shape(IconButtonShape::Square)
.when(self.selection_search_enabled, |button| {
button.style(ButtonStyle::Filled)
})
.on_click(cx.listener(|this, _: &ClickEvent, cx| {
this.toggle_selection(&ToggleSelection, cx);
}))
.selected(self.selection_search_enabled)
.tooltip({
let focus_handle = focus_handle.clone();
move |cx| {
Tooltip::for_action_in(
"Toggle Search Selection",
&ToggleSelection,
&focus_handle,
cx,
)
}
}),
)
})
.child(
IconButton::new("select-all", ui::IconName::SelectAll)
.on_click(|_, cx| cx.dispatch_action(SelectAllMatches.boxed_clone()))
.size(ButtonSize::Compact)
.shape(IconButtonShape::Square)
.tooltip({
let focus_handle = focus_handle.clone();
move |cx| {
@@ -332,11 +335,13 @@ impl Render for BufferSearchBar {
))
.when(!narrow_mode, |this| {
this.child(h_flex().ml_2().min_w(rems_from_px(40.)).child(
-Label::new(match_text).color(if self.active_match_index.is_some() {
-Color::Default
-} else {
-Color::Disabled
-}),
+Label::new(match_text).size(LabelSize::Small).color(
+if self.active_match_index.is_some() {
+Color::Default
+} else {
+Color::Disabled
+},
+),
))
}),
);
@@ -367,8 +372,10 @@ impl Render for BufferSearchBar {
.child(
h_flex()
.flex_none()
+.gap_0p5()
.child(
IconButton::new("search-replace-next", ui::IconName::ReplaceNext)
+.shape(IconButtonShape::Square)
.tooltip({
let focus_handle = focus_handle.clone();
move |cx| {
@@ -386,6 +393,7 @@ impl Render for BufferSearchBar {
)
.child(
IconButton::new("search-replace-all", ui::IconName::ReplaceAll)
+.shape(IconButtonShape::Square)
.tooltip({
let focus_handle = focus_handle.clone();
move |cx| {
@@ -441,6 +449,7 @@ impl Render for BufferSearchBar {
.when(!narrow_mode, |div| {
div.child(
IconButton::new(SharedString::from("Close"), IconName::Close)
+.shape(IconButtonShape::Square)
.tooltip(move |cx| {
Tooltip::for_action("Close Search Bar", &Dismiss, cx)
})

@@ -1634,7 +1634,7 @@ impl Render for ProjectSearchBar {
let focus_handle = focus_handle.clone();
move |cx| {
Tooltip::for_action_in(
-"Toggle replace",
+"Toggle Replace",
&ToggleReplace,
&focus_handle,
cx,

@@ -5,7 +5,7 @@ use gpui::{actions, Action, AppContext, FocusHandle, IntoElement};
use project::search::SearchQuery;
pub use project_search::ProjectSearchView;
use ui::{prelude::*, Tooltip};
-use ui::{ButtonStyle, IconButton};
+use ui::{ButtonStyle, IconButton, IconButtonShape};
use workspace::notifications::NotificationId;
use workspace::{Toast, Workspace};
@@ -112,6 +112,7 @@ impl SearchOptions {
IconButton::new(self.label(), self.icon())
.on_click(action)
.style(ButtonStyle::Subtle)
+.shape(IconButtonShape::Square)
.selected(active)
.tooltip({
let action = self.to_toggle_action();

@@ -1,6 +1,6 @@
use gpui::{Action, FocusHandle, IntoElement};
-use ui::IconButton;
use ui::{prelude::*, Tooltip};
+use ui::{IconButton, IconButtonShape};
pub(super) fn render_nav_button(
icon: ui::IconName,
@@ -13,6 +13,7 @@ pub(super) fn render_nav_button(
SharedString::from(format!("search-nav-button-{}", action.name())),
icon,
)
+.shape(IconButtonShape::Square)
.on_click(|_, cx| cx.dispatch_action(action.boxed_clone()))
.tooltip(move |cx| Tooltip::for_action_in(tooltip, action, &focus_handle, cx))
.disabled(!active)

Some files were not shown because too many files have changed in this diff.