Compare commits

...

30 Commits

Author SHA1 Message Date
Antonio Scandurra
a7ed4de96c Update status 2025-03-27 17:34:22 +01:00
Antonio Scandurra
dae295fc72 Reuse the same BufferDiffs instead of always creating new ones 2025-03-27 16:37:51 +01:00
Antonio Scandurra
e1ed35377a Start on a new ThreadDiff primitive 2025-03-27 16:29:47 +01:00
Antonio Scandurra
c707ec1c24 Add a new GitStoreStatus::entries method 2025-03-27 15:02:07 +01:00
Antonio Scandurra
a5677a1e93 Fix diffing checkpoints not returning a newline 2025-03-27 15:01:47 +01:00
Antonio Scandurra
eba6f443fd Introduce a DiffSource trait for ProjectDiff 2025-03-27 10:53:26 +01:00
Antonio Scandurra
82a06f0ca9 Introduce primitives in GitStore to support reviewing assistant diffs (#27576)
Release Notes:

- N/A
2025-03-27 09:46:31 +00:00
renovate[bot]
cd6b1d32d0 Update Rust crate blake3 to v1.7.0 (#27566)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [blake3](https://redirect.github.com/BLAKE3-team/BLAKE3) | workspace.dependencies | minor | `1.6.1` -> `1.7.0` |

---

### Release Notes

<details>
<summary>BLAKE3-team/BLAKE3 (blake3)</summary>

### [`v1.7.0`](https://redirect.github.com/BLAKE3-team/BLAKE3/releases/tag/1.7.0)

[Compare Source](https://redirect.github.com/BLAKE3-team/BLAKE3/compare/1.6.1...1.7.0)

version 1.7.0

Changes since 1.6.1:

- The C implementation has gained multithreading support, based on Intel's oneTBB library. This works similarly to the Rayon-based multithreading used in the Rust implementation. See c/README.md for details. Contributed by [@silvanshade](https://redirect.github.com/silvanshade) ([#445](https://redirect.github.com/BLAKE3-team/BLAKE3/issues/445)).
- The Rust implementation has gained a WASM SIMD backend, gated by the `wasm32_simd` Cargo feature. Under Wasmtime on my laptop, this is a 6x performance improvement for large inputs. This backend is currently Rust-only. Contributed by [@monoid](https://redirect.github.com/monoid) ([#341](https://redirect.github.com/BLAKE3-team/BLAKE3/issues/341)).
- Fixed cross-compilation builds targeting Windows with cargo-xwin. Contributed by [@Sporif](https://redirect.github.com/Sporif) and [@toothbrush7777777](https://redirect.github.com/toothbrush7777777) ([#230](https://redirect.github.com/BLAKE3-team/BLAKE3/issues/230)).
- Added `b3sum --tag`, which changes the output format. This is for compatibility with GNU checksum tools (which use the same flag) and BSD checksum tools (which use the output format this flag turns on). Contributed by [@leahneukirchen](https://redirect.github.com/leahneukirchen) ([#453](https://redirect.github.com/BLAKE3-team/BLAKE3/issues/453)) and [@dbohdan](https://redirect.github.com/dbohdan) ([#430](https://redirect.github.com/BLAKE3-team/BLAKE3/issues/430)).

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-27 11:37:20 +02:00
renovate[bot]
5033a2aba0 Update Rust crate image to v0.25.6 (#27539)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [image](https://redirect.github.com/image-rs/image) | dependencies | patch | `0.25.5` -> `0.25.6` |
| [image](https://redirect.github.com/image-rs/image) | workspace.dependencies | patch | `0.25.5` -> `0.25.6` |

---

### Release Notes

<details>
<summary>image-rs/image (image)</summary>

### [`v0.25.6`](https://redirect.github.com/image-rs/image/blob/HEAD/CHANGES.md#Version-0256)

[Compare Source](https://redirect.github.com/image-rs/image/compare/v0.25.5...v0.25.6)

Features:

- Improved format detection ([#2418](https://redirect.github.com/image-rs/image/pull/2418))
- Implement writing ICC profiles for JPEG and PNG images ([#2389](https://redirect.github.com/image-rs/image/pull/2389))

Bug fixes:

- JPEG encoding bugfix ([#2387](https://redirect.github.com/image-rs/image/pull/2387))
- Expanded ICO format detection ([#2434](https://redirect.github.com/image-rs/image/pull/2434))
- Fixed EXR bug with NaNs ([#2381](https://redirect.github.com/image-rs/image/pull/2381))
- Various documentation improvements

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-27 11:36:33 +02:00
renovate[bot]
0392ef10cf Update Rust crate bitflags to v2.9.0 (#27565)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [bitflags](https://redirect.github.com/bitflags/bitflags) | workspace.dependencies | minor | `2.8.0` -> `2.9.0` |

---

### Release Notes

<details>
<summary>bitflags/bitflags (bitflags)</summary>

### [`v2.9.0`](https://redirect.github.com/bitflags/bitflags/blob/HEAD/CHANGELOG.md#290)

[Compare Source](https://redirect.github.com/bitflags/bitflags/compare/2.8.0...2.9.0)

#### What's Changed

- `Flags` trait: add `clear(&mut self)` method by [@wysiwys](https://redirect.github.com/wysiwys) in [https://github.com/bitflags/bitflags/pull/437](https://redirect.github.com/bitflags/bitflags/pull/437)
- Fix up UI tests by [@KodrAus](https://redirect.github.com/KodrAus) in [https://github.com/bitflags/bitflags/pull/438](https://redirect.github.com/bitflags/bitflags/pull/438)

**Full Changelog**: https://github.com/bitflags/bitflags/compare/2.8.0...2.9.0

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-27 11:36:02 +02:00
Antonio Scandurra
7354ef91e1 Make GitRepository::status async and remove cx parameter (#27514)
This lays the groundwork for using `status` as part of the new agent
panel.

Release Notes:

- N/A
2025-03-27 09:05:54 +00:00
张小白
926d10cc45 Move the EventKind::Access filtering before the loop starts (#27569)
Follow up #27498

Release Notes:

- N/A
2025-03-27 15:15:24 +08:00
renovate[bot]
a7697be857 Update Rust crate async-compression to v0.4.22 (#27529)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [async-compression](https://redirect.github.com/Nullus157/async-compression) | workspace.dependencies | patch | `0.4.21` -> `0.4.22` |

---

### Release Notes

<details>
<summary>Nullus157/async-compression (async-compression)</summary>

### [`v0.4.22`](https://redirect.github.com/Nullus157/async-compression/blob/HEAD/CHANGELOG.md#0422---2025-03-25)

[Compare Source](https://redirect.github.com/Nullus157/async-compression/compare/v0.4.21...v0.4.22)

##### Other

-   Add lz4 encoder/decoder
-   Expose total_in/total_out in DeflateEncoder

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-27 00:00:41 -04:00
renovate[bot]
97392a23e3 Update Rust crate time to v0.3.41 (#27553)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [time](https://time-rs.github.io) ([source](https://redirect.github.com/time-rs/time)) | workspace.dependencies | patch | `0.3.40` -> `0.3.41` |

---

### Release Notes

<details>
<summary>time-rs/time (time)</summary>

### [`v0.3.41`](https://redirect.github.com/time-rs/time/blob/HEAD/CHANGELOG.md#0341-2025-03-23)

[Compare Source](https://redirect.github.com/time-rs/time/compare/v0.3.40...v0.3.41)

##### Fixed

- Compatibility with the latest release of `deranged`. This fix is permanent and covers future similar changes upstream.

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-26 22:55:45 -04:00
renovate[bot]
3f40e0f433 Update Rust crate clap to v4.5.34 (#27530)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [clap](https://redirect.github.com/clap-rs/clap) | workspace.dependencies | patch | `4.5.32` -> `4.5.34` |

---

### Release Notes

<details>
<summary>clap-rs/clap (clap)</summary>

### [`v4.5.34`](https://redirect.github.com/clap-rs/clap/blob/HEAD/CHANGELOG.md#4534---2025-03-27)

[Compare Source](https://redirect.github.com/clap-rs/clap/compare/v4.5.33...v4.5.34)

##### Fixes

- *(help)* Don't add extra blank lines with `flatten_help(true)` and subcommands without arguments

### [`v4.5.33`](https://redirect.github.com/clap-rs/clap/blob/HEAD/CHANGELOG.md#4533---2025-03-26)

[Compare Source](https://redirect.github.com/clap-rs/clap/compare/v4.5.32...v4.5.33)

##### Fixes

- *(error)* When showing the usage of a suggestion for an unknown argument, don't show the group

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-26 22:55:18 -04:00
renovate[bot]
3e6d5c0814 Update dependency @tsconfig/node20 to v20.1.5 (#27560)
This PR contains the following updates:

| Package | Change |
|---|---|
| [@tsconfig/node20](https://redirect.github.com/tsconfig/bases) ([source](https://redirect.github.com/tsconfig/bases/tree/HEAD/bases)) | [`20.1.4` -> `20.1.5`](https://renovatebot.com/diffs/npm/@tsconfig%2fnode20/20.1.4/20.1.5) |

---

### Release Notes

<details>
<summary>tsconfig/bases (@&#8203;tsconfig/node20)</summary>

### [`v20.1.5`](be6b3bb160...f6e0345911)

[Compare Source](be6b3bb160...f6e0345911)

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-26 22:54:49 -04:00
renovate[bot]
2bc91e8c59 Update Rust crate plist to v1.7.1 (#27550)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [plist](https://redirect.github.com/ebarnard/rust-plist) | dependencies | patch | `1.7.0` -> `1.7.1` |

---

### Release Notes

<details>
<summary>ebarnard/rust-plist (plist)</summary>

### [`v1.7.1`](https://redirect.github.com/ebarnard/rust-plist/compare/v1.7.0...v1.7.1)

[Compare Source](https://redirect.github.com/ebarnard/rust-plist/compare/v1.7.0...v1.7.1)

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-26 22:51:58 -04:00
renovate[bot]
bbc80c78fd Update dependency @slack/webhook to v7.0.5 (#27554)
This PR contains the following updates:

| Package | Change |
|---|---|
| [@slack/webhook](https://tools.slack.dev/node-slack-sdk/webhook) ([source](https://redirect.github.com/slackapi/node-slack-sdk)) | [`7.0.4` -> `7.0.5`](https://renovatebot.com/diffs/npm/@slack%2fwebhook/7.0.4/7.0.5) |

---

### Release Notes

<details>
<summary>slackapi/node-slack-sdk (@&#8203;slack/webhook)</summary>

### [`v7.0.5`](https://redirect.github.com/slackapi/node-slack-sdk/releases/tag/%40slack/webhook%407.0.5)

[Compare Source](https://redirect.github.com/slackapi/node-slack-sdk/compare/@slack/webhook@7.0.4...@slack/webhook@7.0.5)

#### What's Changed

This patch release updates the `axios` dependency used to send webhooks with internal bug fixes.

- fix(webhook): bump axios to 1.8.3 to address CVE-2025-27152 by [@zimeg](https://redirect.github.com/zimeg) in [https://github.com/slackapi/node-slack-sdk/pull/2173](https://redirect.github.com/slackapi/node-slack-sdk/pull/2173)

**Full Changelog**: https://github.com/slackapi/node-slack-sdk/compare/@slack/webhook@7.0.4...@slack/webhook@7.0.5
**Milestone**: https://github.com/slackapi/node-slack-sdk/milestone/130

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-26 22:50:40 -04:00
renovate[bot]
24ab5afa10 Update serde monorepo to v1.0.219 (#27561)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [serde](https://serde.rs) ([source](https://redirect.github.com/serde-rs/serde)) | dependencies | patch | `1.0.218` -> `1.0.219` |
| [serde](https://serde.rs) ([source](https://redirect.github.com/serde-rs/serde)) | workspace.dependencies | patch | `1.0.218` -> `1.0.219` |
| [serde_derive](https://serde.rs) ([source](https://redirect.github.com/serde-rs/serde)) | workspace.dependencies | patch | `1.0.218` -> `1.0.219` |

---

### Release Notes

<details>
<summary>serde-rs/serde (serde)</summary>

### [`v1.0.219`](https://redirect.github.com/serde-rs/serde/releases/tag/v1.0.219)

[Compare Source](https://redirect.github.com/serde-rs/serde/compare/v1.0.218...v1.0.219)

- Prevent `absolute_paths` Clippy restriction being triggered inside macro-generated code ([#2906](https://redirect.github.com/serde-rs/serde/issues/2906), thanks [@davidzeng0](https://redirect.github.com/davidzeng0))

</details>

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-26 22:48:53 -04:00
Marshall Bowers
af8acba353 Remove unneeded inline tables in Cargo.tomls (#27563)
This PR removes some unneeded inline tables from our `Cargo.toml`s.

Release Notes:

- N/A
2025-03-27 02:36:47 +00:00
Marshall Bowers
231e9c2000 assistant2: Add ability to configure tools for profiles in the UI (#27562)
This PR adds the ability to configure tools for a profile in the UI:


https://github.com/user-attachments/assets/16642f14-8faa-4a91-bb9e-1d480692f1f2

Note: Doesn't yet work for customizing tools for the default profiles.

Release Notes:

- N/A
2025-03-27 02:19:45 +00:00
João Marcos
47b94e5ef0 Git: Fix hunks being skipped when staging too quickly (#27552)
Release Notes:

- Git: Fix hunks being skipped when staging too quickly.

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-03-26 23:22:37 +00:00
Cole Miller
29e2e13e6d Fix broken merge (#27551)
This PR fixes main after a semantic merge conflict with
https://github.com/zed-industries/zed/pull/27391.

Release Notes:

- N/A
2025-03-26 23:00:08 +00:00
João Marcos
e635798fe0 Fix crash when staging a hunk that overlaps multiple unstaged hunks (#27545)
Release Notes:

- Git: Fix crash when staging a hunk that overlaps multiple unstaged
hunks.

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-03-26 19:54:51 -03:00
Cole Miller
6924720b35 Move repository state RPC handlers to the GitStore (#27391)
This is another in the series of PRs to make the GitStore own all
repository state and enable better concurrency control for git
repository scans.

After this PR, the `RepositoryEntry`s stored in worktree snapshots are
used only as a staging ground for local GitStores to pull from after
git-related events; non-local worktrees don't store them at all,
although this is not reflected in the types. GitTraversal and other
places that need information about repositories get it from the
GitStore. The GitStore also takes over handling of the new
UpdateRepository and RemoveRepository messages. However, repositories
are still discovered and scanned on a per-worktree basis, and we're
still identifying them by the (worktree-specific) project entry ID of
their working directory.

- [x] Remove WorkDirectory from RepositoryEntry
- [x] Remove worktree IDs from repository-related RPC messages
- [x] Handle UpdateRepository and RemoveRepository RPCs from the
GitStore

Release Notes:

- N/A

---------

Co-authored-by: Max <max@zed.dev>
Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-03-26 18:23:44 -04:00
Thomas Mickley-Doyle
1e8b50f471 Add token usage to LanguageModelTextStream (#27490)
Release Notes:

- N/A

---------

Co-authored-by: Michael Sloan <michael@zed.dev>
2025-03-26 22:21:01 +00:00
Anthony Eid
5f8c53ffe8 Debugger UI: Fix breakpoint rendering in git hunks (#27538)
This PR fixes a bug where breakpoints would be rendered on incorrect
lines when opening a git hunk that contained breakpoints. It also stops
breakpoints from being shown in deleted git hunks.

Note: There's some unexpected behavior when using an anchor to get a
display point inside an open git hunk, where
`anchor.to_point().col == 0`.

```rust
let position = multi_buffer_anchor
    .to_point(&multi_buffer_snapshot)
    .to_display_point(&snapshot);
```

The above code returns a display point that is one line below the line
the anchor actually represents when it's in an opened hunk diff, which
causes the bug shown below.



https://github.com/user-attachments/assets/bd15d02a-3cdc-4c8e-841f-bef238583351


@ConradIrwin Is this expected behavior when calling
`.to_display_point(&snapshot)`?

Release Notes:

- N/A
2025-03-26 18:12:00 -04:00
Smit Barmase
6e82bbf367 Revert "editor: Do not use hide_mouse_while_typing for single line editor" (#27547)
Reverts zed-industries/zed#27536

Looks like hiding the cursor in a single-line editor is okay and is the
default behavior for other apps.
2025-03-26 22:02:09 +00:00
Marshall Bowers
0ac717c3a8 assistant2: Start on modal for managing profiles (#27546)
This PR starts work on a modal for managing profiles.

Release Notes:

- N/A
2025-03-26 18:01:34 -04:00
Michael Sloan
44aff7cd46 Fix tools' ui_text to use inline code escaping (#27543)
Markdown escaping was added in #27502.

Release Notes:

- N/A
2025-03-26 21:49:51 +00:00
71 changed files with 4343 additions and 2551 deletions

Cargo.lock (generated)

@@ -453,6 +453,7 @@ dependencies = [
"assistant_slash_command",
"assistant_tool",
"async-watch",
"buffer_diff",
"chrono",
"client",
"clock",
@@ -491,7 +492,6 @@ dependencies = [
"prompt_store",
"proto",
"rand 0.8.5",
"regex",
"release_channel",
"rope",
"serde",
@@ -800,9 +800,9 @@ dependencies = [
[[package]]
name = "async-compression"
version = "0.4.21"
version = "0.4.22"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c0cf008e5e1a9e9e22a7d3c9a4992e21a350290069e36d8fb72304ed17e8f2d2"
checksum = "59a194f9d963d8099596278594b3107448656ba73831c9d8c783e613ce86da64"
dependencies = [
"deflate64",
"flate2",
@@ -2361,7 +2361,7 @@ dependencies = [
"cap-primitives",
"cap-std",
"io-lifetimes",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -2389,7 +2389,7 @@ dependencies = [
"ipnet",
"maybe-owned",
"rustix",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
"winx",
]
@@ -2669,9 +2669,9 @@ dependencies = [
[[package]]
name = "clap"
version = "4.5.32"
version = "4.5.34"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6088f3ae8c3608d19260cd7445411865a485688711b78b5be70d78cd96136f83"
checksum = "e958897981290da2a852763fe9cdb89cd36977a5d729023127095fa94d95e2ff"
dependencies = [
"clap_builder",
"clap_derive",
@@ -2679,9 +2679,9 @@ dependencies = [
[[package]]
name = "clap_builder"
version = "4.5.32"
version = "4.5.34"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "22a7ef7f676155edfb82daa97f99441f3ebf4a58d5e32f295a56259f1b6facc8"
checksum = "83b0f35019843db2160b5bb19ae09b4e6411ac33fc6a712003c33e03090e2489"
dependencies = [
"anstream",
"anstyle",
@@ -4590,7 +4590,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "33d852cb9b869c2a9b3df2f71a3074817f01e1844f839a144f5fcef059a4eb5d"
dependencies = [
"libc",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -4750,6 +4750,7 @@ dependencies = [
"env_logger 0.11.7",
"extension",
"fs",
"gpui",
"language",
"log",
"reqwest_client",
@@ -5253,7 +5254,7 @@ checksum = "5e2e6123af26f0f2c51cc66869137080199406754903cc926a7690401ce09cb4"
dependencies = [
"io-lifetimes",
"rustix",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -6910,7 +6911,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2285ddfe3054097ef4b2fe909ef8c3bcd1ea52a8f0d274416caebeef39f04a65"
dependencies = [
"io-lifetimes",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -7556,7 +7557,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fc2f4eb4bc735547cfed7c0a4922cbd04a4655978c09b54f1f7b228750664c34"
dependencies = [
"cfg-if",
"windows-targets 0.52.6",
"windows-targets 0.48.5",
]
[[package]]
@@ -10227,9 +10228,9 @@ checksum = "953ec861398dccce10c670dfeaf3ec4911ca479e9c02154b3a215178c5f566f2"
[[package]]
name = "plist"
version = "1.7.0"
version = "1.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "42cf17e9a1800f5f396bc67d193dc9411b59012a5876445ef450d449881e1016"
checksum = "eac26e981c03a6e53e0aee43c113e3202f5581d5360dae7bd2c70e800dd0451d"
dependencies = [
"base64 0.22.1",
"indexmap",
@@ -10711,7 +10712,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "22505a5c94da8e3b7c2996394d1c933236c4d743e81a410bcca4e6989fc066a4"
dependencies = [
"bytes 1.10.1",
"heck 0.5.0",
"heck 0.4.1",
"itertools 0.12.1",
"log",
"multimap 0.10.0",
@@ -10939,7 +10940,7 @@ dependencies = [
"once_cell",
"socket2",
"tracing",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -11864,7 +11865,7 @@ dependencies = [
"libc",
"linux-raw-sys",
"once_cell",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -11975,7 +11976,7 @@ dependencies = [
"security-framework 3.0.1",
"security-framework-sys",
"webpki-root-certs",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -12397,18 +12398,18 @@ dependencies = [
[[package]]
name = "serde"
version = "1.0.218"
version = "1.0.219"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e8dfc9d19bdbf6d17e22319da49161d5d0108e4188e8b680aef6299eed22df60"
checksum = "5f0e2c6ed6606019b4e29e69dbaba95b11854410e5347d525002456dbbb786b6"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.218"
version = "1.0.219"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f09503e191f4e797cb8aac08e9a4a4695c5edf6a2e70e376d961ddd5c969f82b"
checksum = "5b0276cf7f2c73365f7157c8123c21cd9a50fbbd844757af28ca1f5925fc2a00"
dependencies = [
"proc-macro2",
"quote",
@@ -13639,7 +13640,7 @@ dependencies = [
"fd-lock",
"io-lifetimes",
"rustix",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
"winx",
]
@@ -13783,7 +13784,7 @@ dependencies = [
"getrandom 0.3.1",
"once_cell",
"rustix",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]
@@ -14057,9 +14058,9 @@ dependencies = [
[[package]]
name = "time"
version = "0.3.40"
version = "0.3.41"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9d9c75b47bdff86fa3334a3db91356b8d7d86a9b839dab7d0bdc5c3d3a077618"
checksum = "8a7619e19bc266e0f9c5e6686659d394bc57973859340060a69221e57dbc0c40"
dependencies = [
"deranged",
"itoa",
@@ -14080,9 +14081,9 @@ checksum = "c9e9a38711f559d9e3ce1cdb06dd7c5b8ea546bc90052da6d06bb76da74bb07c"
[[package]]
name = "time-macros"
version = "0.2.21"
version = "0.2.22"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "29aa485584182073ed57fd5004aa09c371f021325014694e432313345865fd04"
checksum = "3526739392ec93fd8b359c8e98514cb3e8e021beb4e5f597b00a0221f8ed8a49"
dependencies = [
"num-conv",
"time-core",
@@ -16172,7 +16173,7 @@ version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cf221c93e13a30d793f7645a0e7762c55d169dbb0a49671918a2319d289b10bb"
dependencies = [
"windows-sys 0.59.0",
"windows-sys 0.48.0",
]
[[package]]
@@ -16732,7 +16733,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3f3fd376f71958b862e7afb20cfe5a22830e1963462f3a17f49d82a6c1d1f42d"
dependencies = [
"bitflags 2.8.0",
"windows-sys 0.59.0",
"windows-sys 0.52.0",
]
[[package]]

View File

@@ -3712,7 +3712,7 @@ mod tests {
language_settings, tree_sitter_rust, Buffer, Language, LanguageConfig, LanguageMatcher,
Point,
};
use language_model::LanguageModelRegistry;
use language_model::{LanguageModelRegistry, TokenUsage};
use rand::prelude::*;
use serde::Serialize;
use settings::SettingsStore;
@@ -4091,6 +4091,7 @@ mod tests {
future::ready(Ok(LanguageModelTextStream {
message_id: None,
stream: chunks_rx.map(Ok).boxed(),
last_token_usage: Arc::new(Mutex::new(TokenUsage::default())),
})),
cx,
);

View File

@@ -25,6 +25,7 @@ assistant_settings.workspace = true
assistant_slash_command.workspace = true
assistant_tool.workspace = true
async-watch.workspace = true
buffer_diff.workspace = true
chrono.workspace = true
client.workspace = true
clock.workspace = true
@@ -62,7 +63,6 @@ prompt_library.workspace = true
prompt_store.workspace = true
proto.workspace = true
release_channel.workspace = true
regex.workspace = true
rope.workspace = true
serde.workspace = true
serde_json.workspace = true
@@ -86,6 +86,7 @@ workspace.workspace = true
zed_actions.workspace = true
[dev-dependencies]
buffer_diff = { workspace = true, features = ["test-support"] }
editor = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, "features" = ["test-support"] }
indoc.workspace = true


@@ -521,6 +521,10 @@ impl ActiveThread {
}
}
ThreadEvent::CheckpointChanged => cx.notify(),
ThreadEvent::DiffChanged => {
// todo!("update list of changed files")
cx.notify();
}
}
}


@@ -15,6 +15,7 @@ mod profile_selector;
mod terminal_codegen;
mod terminal_inline_assistant;
mod thread;
mod thread_diff;
mod thread_history;
mod thread_store;
mod tool_use;
@@ -32,10 +33,11 @@ use prompt_store::PromptBuilder;
use settings::Settings as _;
pub use crate::active_thread::ActiveThread;
-use crate::assistant_configuration::AddContextServerModal;
+use crate::assistant_configuration::{AddContextServerModal, ManageProfilesModal};
pub use crate::assistant_panel::{AssistantPanel, ConcreteAssistantPanelDelegate};
pub use crate::inline_assistant::InlineAssistant;
pub use crate::thread::{Message, RequestKind, Thread, ThreadEvent};
pub(crate) use crate::thread_diff::*;
pub use crate::thread_store::ThreadStore;
actions!(
@@ -47,6 +49,7 @@ actions!(
RemoveAllContext,
OpenHistory,
OpenConfiguration,
ManageProfiles,
AddContextServer,
RemoveSelectedThread,
Chat,
@@ -59,7 +62,8 @@ actions!(
FocusRight,
RemoveFocusedContext,
AcceptSuggestedContext,
-OpenActiveThreadAsMarkdown
+OpenActiveThreadAsMarkdown,
+ShowThreadDiff
]
);
@@ -89,6 +93,7 @@ pub fn init(
cx,
);
cx.observe_new(AddContextServerModal::register).detach();
cx.observe_new(ManageProfilesModal::register).detach();
feature_gate_assistant2_actions(cx);
}


@@ -1,4 +1,7 @@
mod add_context_server_modal;
mod manage_profiles_modal;
mod profile_picker;
mod tool_picker;
use std::sync::Arc;
@@ -12,6 +15,7 @@ use util::ResultExt as _;
use zed_actions::ExtensionCategoryFilter;
pub(crate) use add_context_server_modal::AddContextServerModal;
pub(crate) use manage_profiles_modal::ManageProfilesModal;
use crate::AddContextServer;


@@ -0,0 +1,201 @@
use std::sync::Arc;
use assistant_settings::AssistantSettings;
use assistant_tool::ToolWorkingSet;
use fs::Fs;
use gpui::{prelude::*, DismissEvent, Entity, EventEmitter, FocusHandle, Focusable};
use settings::Settings as _;
use ui::{prelude::*, ListItem, ListItemSpacing, Navigable, NavigableEntry};
use workspace::{ModalView, Workspace};
use crate::assistant_configuration::profile_picker::{ProfilePicker, ProfilePickerDelegate};
use crate::assistant_configuration::tool_picker::{ToolPicker, ToolPickerDelegate};
use crate::{AssistantPanel, ManageProfiles};
enum Mode {
ChooseProfile(Entity<ProfilePicker>),
ViewProfile(ViewProfileMode),
ConfigureTools(Entity<ToolPicker>),
}
#[derive(Clone)]
pub struct ViewProfileMode {
profile_id: Arc<str>,
configure_tools: NavigableEntry,
}
pub struct ManageProfilesModal {
fs: Arc<dyn Fs>,
tools: Arc<ToolWorkingSet>,
focus_handle: FocusHandle,
mode: Mode,
}
impl ManageProfilesModal {
pub fn register(
workspace: &mut Workspace,
_window: Option<&mut Window>,
_cx: &mut Context<Workspace>,
) {
workspace.register_action(|workspace, _: &ManageProfiles, window, cx| {
if let Some(panel) = workspace.panel::<AssistantPanel>(cx) {
let fs = workspace.app_state().fs.clone();
let thread_store = panel.read(cx).thread_store().read(cx);
let tools = thread_store.tools();
workspace.toggle_modal(window, cx, |window, cx| Self::new(fs, tools, window, cx))
}
});
}
pub fn new(
fs: Arc<dyn Fs>,
tools: Arc<ToolWorkingSet>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let focus_handle = cx.focus_handle();
let handle = cx.entity();
Self {
fs,
tools,
focus_handle,
mode: Mode::ChooseProfile(cx.new(|cx| {
let delegate = ProfilePickerDelegate::new(
move |profile_id, window, cx| {
handle.update(cx, |this, cx| {
this.view_profile(profile_id.clone(), window, cx);
})
},
cx,
);
ProfilePicker::new(delegate, window, cx)
})),
}
}
pub fn view_profile(
&mut self,
profile_id: Arc<str>,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.mode = Mode::ViewProfile(ViewProfileMode {
profile_id,
configure_tools: NavigableEntry::focusable(cx),
});
self.focus_handle(cx).focus(window);
}
fn configure_tools(
&mut self,
profile_id: Arc<str>,
window: &mut Window,
cx: &mut Context<Self>,
) {
let settings = AssistantSettings::get_global(cx);
let Some(profile) = settings.profiles.get(&profile_id).cloned() else {
return;
};
self.mode = Mode::ConfigureTools(cx.new(|cx| {
let delegate = ToolPickerDelegate::new(
self.fs.clone(),
self.tools.clone(),
profile_id,
profile,
cx,
);
ToolPicker::new(delegate, window, cx)
}));
self.focus_handle(cx).focus(window);
}
fn confirm(&mut self, _window: &mut Window, _cx: &mut Context<Self>) {}
fn cancel(&mut self, _window: &mut Window, _cx: &mut Context<Self>) {}
}
impl ModalView for ManageProfilesModal {}
impl Focusable for ManageProfilesModal {
fn focus_handle(&self, cx: &App) -> FocusHandle {
match &self.mode {
Mode::ChooseProfile(profile_picker) => profile_picker.focus_handle(cx),
Mode::ConfigureTools(tool_picker) => tool_picker.focus_handle(cx),
Mode::ViewProfile(_) => self.focus_handle.clone(),
}
}
}
impl EventEmitter<DismissEvent> for ManageProfilesModal {}
impl ManageProfilesModal {
fn render_view_profile(
&mut self,
mode: ViewProfileMode,
window: &mut Window,
cx: &mut Context<Self>,
) -> impl IntoElement {
Navigable::new(
div()
.track_focus(&self.focus_handle(cx))
.size_full()
.child(
v_flex().child(
div()
.id("configure-tools")
.track_focus(&mode.configure_tools.focus_handle)
.on_action({
let profile_id = mode.profile_id.clone();
cx.listener(move |this, _: &menu::Confirm, window, cx| {
this.configure_tools(profile_id.clone(), window, cx);
})
})
.child(
ListItem::new("configure-tools")
.toggle_state(
mode.configure_tools
.focus_handle
.contains_focused(window, cx),
)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.start_slot(Icon::new(IconName::Cog))
.child(Label::new("Configure Tools"))
.on_click({
let profile_id = mode.profile_id.clone();
cx.listener(move |this, _, window, cx| {
this.configure_tools(profile_id.clone(), window, cx);
})
}),
),
),
)
.into_any_element(),
)
.entry(mode.configure_tools)
}
}
impl Render for ManageProfilesModal {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
div()
.elevation_3(cx)
.w(rems(34.))
.key_context("ManageProfilesModal")
.on_action(cx.listener(|this, _: &menu::Cancel, window, cx| this.cancel(window, cx)))
.on_action(cx.listener(|this, _: &menu::Confirm, window, cx| this.confirm(window, cx)))
.capture_any_mouse_down(cx.listener(|this, _, window, cx| {
this.focus_handle(cx).focus(window);
}))
.on_mouse_down_out(cx.listener(|_this, _, _, cx| cx.emit(DismissEvent)))
.child(match &self.mode {
Mode::ChooseProfile(profile_picker) => profile_picker.clone().into_any_element(),
Mode::ViewProfile(mode) => self
.render_view_profile(mode.clone(), window, cx)
.into_any_element(),
Mode::ConfigureTools(tool_picker) => tool_picker.clone().into_any_element(),
})
}
}
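The modal above moves between three modes, with each transition re-focusing the modal. A hypothetical, dependency-free sketch of that navigation (names assumed; the real modes wrap gpui entities rather than plain strings):

```rust
// Hypothetical sketch of ManageProfilesModal's mode transitions, with
// gpui entities replaced by plain data. Choosing a profile moves from
// ChooseProfile to ViewProfile; "Configure Tools" moves on from there.
#[derive(Debug, Clone, PartialEq)]
enum Mode {
    ChooseProfile,
    ViewProfile(String),
    ConfigureTools(String),
}

fn view_profile(profile_id: &str) -> Mode {
    Mode::ViewProfile(profile_id.to_string())
}

fn configure_tools(mode: &Mode) -> Option<Mode> {
    // Only a profile that is currently being viewed can have its tools
    // configured, mirroring the listener wiring in render_view_profile.
    match mode {
        Mode::ViewProfile(id) => Some(Mode::ConfigureTools(id.clone())),
        _ => None,
    }
}
```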


@@ -0,0 +1,194 @@
use std::sync::Arc;
use assistant_settings::AssistantSettings;
use fuzzy::{match_strings, StringMatch, StringMatchCandidate};
use gpui::{
App, Context, DismissEvent, Entity, EventEmitter, Focusable, SharedString, Task, WeakEntity,
Window,
};
use picker::{Picker, PickerDelegate};
use settings::Settings;
use ui::{prelude::*, HighlightedLabel, ListItem, ListItemSpacing};
use util::ResultExt as _;
pub struct ProfilePicker {
picker: Entity<Picker<ProfilePickerDelegate>>,
}
impl ProfilePicker {
pub fn new(
delegate: ProfilePickerDelegate,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let picker = cx.new(|cx| Picker::uniform_list(delegate, window, cx));
Self { picker }
}
}
impl EventEmitter<DismissEvent> for ProfilePicker {}
impl Focusable for ProfilePicker {
fn focus_handle(&self, cx: &App) -> gpui::FocusHandle {
self.picker.focus_handle(cx)
}
}
impl Render for ProfilePicker {
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
v_flex().w(rems(34.)).child(self.picker.clone())
}
}
#[derive(Debug)]
pub struct ProfileEntry {
pub id: Arc<str>,
pub name: SharedString,
}
pub struct ProfilePickerDelegate {
profile_picker: WeakEntity<ProfilePicker>,
profiles: Vec<ProfileEntry>,
matches: Vec<StringMatch>,
selected_index: usize,
on_confirm: Arc<dyn Fn(&Arc<str>, &mut Window, &mut App) + 'static>,
}
impl ProfilePickerDelegate {
pub fn new(
on_confirm: impl Fn(&Arc<str>, &mut Window, &mut App) + 'static,
cx: &mut Context<ProfilePicker>,
) -> Self {
let settings = AssistantSettings::get_global(cx);
let profiles = settings
.profiles
.iter()
.map(|(id, profile)| ProfileEntry {
id: id.clone(),
name: profile.name.clone(),
})
.collect::<Vec<_>>();
Self {
profile_picker: cx.entity().downgrade(),
profiles,
matches: Vec::new(),
selected_index: 0,
on_confirm: Arc::new(on_confirm),
}
}
}
impl PickerDelegate for ProfilePickerDelegate {
type ListItem = ListItem;
fn match_count(&self) -> usize {
self.matches.len()
}
fn selected_index(&self) -> usize {
self.selected_index
}
fn set_selected_index(
&mut self,
ix: usize,
_window: &mut Window,
_cx: &mut Context<Picker<Self>>,
) {
self.selected_index = ix;
}
fn placeholder_text(&self, _window: &mut Window, _cx: &mut App) -> Arc<str> {
"Search profiles…".into()
}
fn update_matches(
&mut self,
query: String,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Task<()> {
let background = cx.background_executor().clone();
let candidates = self
.profiles
.iter()
.enumerate()
.map(|(id, profile)| StringMatchCandidate::new(id, profile.name.as_ref()))
.collect::<Vec<_>>();
cx.spawn_in(window, async move |this, cx| {
let matches = if query.is_empty() {
candidates
.into_iter()
.enumerate()
.map(|(index, candidate)| StringMatch {
candidate_id: index,
string: candidate.string,
positions: Vec::new(),
score: 0.,
})
.collect()
} else {
match_strings(
&candidates,
&query,
false,
100,
&Default::default(),
background,
)
.await
};
this.update(cx, |this, _cx| {
this.delegate.matches = matches;
this.delegate.selected_index = this
.delegate
.selected_index
.min(this.delegate.matches.len().saturating_sub(1));
})
.log_err();
})
}
fn confirm(&mut self, _secondary: bool, window: &mut Window, cx: &mut Context<Picker<Self>>) {
if self.matches.is_empty() {
self.dismissed(window, cx);
return;
}
let candidate_id = self.matches[self.selected_index].candidate_id;
let profile = &self.profiles[candidate_id];
(self.on_confirm)(&profile.id, window, cx);
}
fn dismissed(&mut self, _window: &mut Window, cx: &mut Context<Picker<Self>>) {
self.profile_picker
.update(cx, |_this, cx| cx.emit(DismissEvent))
.log_err();
}
fn render_match(
&self,
ix: usize,
selected: bool,
_window: &mut Window,
_cx: &mut Context<Picker<Self>>,
) -> Option<Self::ListItem> {
let profile_match = &self.matches[ix];
Some(
ListItem::new(ix)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.toggle_state(selected)
.child(HighlightedLabel::new(
profile_match.string.clone(),
profile_match.positions.clone(),
)),
)
}
}
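The `update_matches` flow above can be sketched without the fuzzy crate. This is a simplified stand-in (case-insensitive substring matching instead of Zed's `match_strings` scoring) that keeps the two behaviors worth noting: an empty query lists every candidate in order, and the selected index is clamped with `saturating_sub` so it never points past a shrinking match list:

```rust
// Simplified sketch of the picker's matching flow (not the actual
// fuzzy::match_strings API): returns the matching candidate indices and
// the clamped selection.
fn update_matches(candidates: &[&str], query: &str, selected: usize) -> (Vec<usize>, usize) {
    let matches: Vec<usize> = if query.is_empty() {
        // Empty query: every candidate matches, in original order.
        (0..candidates.len()).collect()
    } else {
        let query = query.to_lowercase();
        candidates
            .iter()
            .enumerate()
            .filter(|(_, candidate)| candidate.to_lowercase().contains(&query))
            .map(|(index, _)| index)
            .collect()
    };
    // Clamp so the selection survives the match list getting shorter;
    // saturating_sub keeps an empty list from underflowing to usize::MAX.
    let selected = selected.min(matches.len().saturating_sub(1));
    (matches, selected)
}
```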


@@ -0,0 +1,267 @@
use std::sync::Arc;
use assistant_settings::{
AgentProfile, AssistantSettings, AssistantSettingsContent, VersionedAssistantSettingsContent,
};
use assistant_tool::{ToolSource, ToolWorkingSet};
use fs::Fs;
use fuzzy::{match_strings, StringMatch, StringMatchCandidate};
use gpui::{App, Context, DismissEvent, Entity, EventEmitter, Focusable, Task, WeakEntity, Window};
use picker::{Picker, PickerDelegate};
use settings::update_settings_file;
use ui::{prelude::*, HighlightedLabel, ListItem, ListItemSpacing};
use util::ResultExt as _;
pub struct ToolPicker {
picker: Entity<Picker<ToolPickerDelegate>>,
}
impl ToolPicker {
pub fn new(delegate: ToolPickerDelegate, window: &mut Window, cx: &mut Context<Self>) -> Self {
let picker = cx.new(|cx| Picker::uniform_list(delegate, window, cx));
Self { picker }
}
}
impl EventEmitter<DismissEvent> for ToolPicker {}
impl Focusable for ToolPicker {
fn focus_handle(&self, cx: &App) -> gpui::FocusHandle {
self.picker.focus_handle(cx)
}
}
impl Render for ToolPicker {
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
v_flex().w(rems(34.)).child(self.picker.clone())
}
}
#[derive(Debug, Clone)]
pub struct ToolEntry {
pub name: Arc<str>,
pub source: ToolSource,
}
pub struct ToolPickerDelegate {
tool_picker: WeakEntity<ToolPicker>,
fs: Arc<dyn Fs>,
tools: Vec<ToolEntry>,
profile_id: Arc<str>,
profile: AgentProfile,
matches: Vec<StringMatch>,
selected_index: usize,
}
impl ToolPickerDelegate {
pub fn new(
fs: Arc<dyn Fs>,
tool_set: Arc<ToolWorkingSet>,
profile_id: Arc<str>,
profile: AgentProfile,
cx: &mut Context<ToolPicker>,
) -> Self {
let mut tool_entries = Vec::new();
for (source, tools) in tool_set.tools_by_source(cx) {
tool_entries.extend(tools.into_iter().map(|tool| ToolEntry {
name: tool.name().into(),
source: source.clone(),
}));
}
Self {
tool_picker: cx.entity().downgrade(),
fs,
tools: tool_entries,
profile_id,
profile,
matches: Vec::new(),
selected_index: 0,
}
}
}
impl PickerDelegate for ToolPickerDelegate {
type ListItem = ListItem;
fn match_count(&self) -> usize {
self.matches.len()
}
fn selected_index(&self) -> usize {
self.selected_index
}
fn set_selected_index(
&mut self,
ix: usize,
_window: &mut Window,
_cx: &mut Context<Picker<Self>>,
) {
self.selected_index = ix;
}
fn placeholder_text(&self, _window: &mut Window, _cx: &mut App) -> Arc<str> {
"Search tools…".into()
}
fn update_matches(
&mut self,
query: String,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Task<()> {
let background = cx.background_executor().clone();
let candidates = self
.tools
.iter()
.enumerate()
.map(|(id, profile)| StringMatchCandidate::new(id, profile.name.as_ref()))
.collect::<Vec<_>>();
cx.spawn_in(window, async move |this, cx| {
let matches = if query.is_empty() {
candidates
.into_iter()
.enumerate()
.map(|(index, candidate)| StringMatch {
candidate_id: index,
string: candidate.string,
positions: Vec::new(),
score: 0.,
})
.collect()
} else {
match_strings(
&candidates,
&query,
false,
100,
&Default::default(),
background,
)
.await
};
this.update(cx, |this, _cx| {
this.delegate.matches = matches;
this.delegate.selected_index = this
.delegate
.selected_index
.min(this.delegate.matches.len().saturating_sub(1));
})
.log_err();
})
}
fn confirm(&mut self, _secondary: bool, window: &mut Window, cx: &mut Context<Picker<Self>>) {
if self.matches.is_empty() {
self.dismissed(window, cx);
return;
}
let candidate_id = self.matches[self.selected_index].candidate_id;
let tool = &self.tools[candidate_id];
let is_enabled = match &tool.source {
ToolSource::Native => {
let is_enabled = self.profile.tools.entry(tool.name.clone()).or_default();
*is_enabled = !*is_enabled;
*is_enabled
}
ToolSource::ContextServer { id } => {
let preset = self
.profile
.context_servers
.entry(id.clone().into())
.or_default();
let is_enabled = preset.tools.entry(tool.name.clone()).or_default();
*is_enabled = !*is_enabled;
*is_enabled
}
};
update_settings_file::<AssistantSettings>(self.fs.clone(), cx, {
let profile_id = self.profile_id.clone();
let tool = tool.clone();
move |settings, _cx| match settings {
AssistantSettingsContent::Versioned(VersionedAssistantSettingsContent::V2(
settings,
)) => {
if let Some(profiles) = &mut settings.profiles {
if let Some(profile) = profiles.get_mut(&profile_id) {
match tool.source {
ToolSource::Native => {
*profile.tools.entry(tool.name).or_default() = is_enabled;
}
ToolSource::ContextServer { id } => {
let preset = profile
.context_servers
.entry(id.clone().into())
.or_default();
*preset.tools.entry(tool.name.clone()).or_default() =
is_enabled;
}
}
}
}
}
_ => {}
}
});
}
fn dismissed(&mut self, _window: &mut Window, cx: &mut Context<Picker<Self>>) {
self.tool_picker
.update(cx, |_this, cx| cx.emit(DismissEvent))
.log_err();
}
fn render_match(
&self,
ix: usize,
selected: bool,
_window: &mut Window,
_cx: &mut Context<Picker<Self>>,
) -> Option<Self::ListItem> {
let tool_match = &self.matches[ix];
let tool = &self.tools[tool_match.candidate_id];
let is_enabled = match &tool.source {
ToolSource::Native => self.profile.tools.get(&tool.name).copied().unwrap_or(false),
ToolSource::ContextServer { id } => self
.profile
.context_servers
.get(id.as_ref())
.and_then(|preset| preset.tools.get(&tool.name))
.copied()
.unwrap_or(false),
};
Some(
ListItem::new(ix)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.toggle_state(selected)
.child(
h_flex()
.gap_2()
.child(HighlightedLabel::new(
tool_match.string.clone(),
tool_match.positions.clone(),
))
.map(|parent| match &tool.source {
ToolSource::Native => parent,
ToolSource::ContextServer { id } => parent
.child(Label::new(id).size(LabelSize::XSmall).color(Color::Muted)),
}),
)
.end_slot::<Icon>(is_enabled.then(|| {
Icon::new(IconName::Check)
.size(IconSize::Small)
.color(Color::Success)
})),
)
}
}
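The enable/disable logic in `confirm` leans on `entry(..).or_default()`: a tool the profile has never recorded defaults to `false`, so the first toggle always enables it, and later toggles flip the stored flag in place. A minimal standalone sketch (assumed names, plain `HashMap` instead of `IndexMap`):

```rust
use std::collections::HashMap;

// Sketch of the per-profile tool toggle: or_default() inserts `false`
// for an unseen tool, the flag is flipped in place, and the new state
// is returned so the caller can persist it to settings.
fn toggle_tool(tools: &mut HashMap<String, bool>, name: &str) -> bool {
    let enabled = tools.entry(name.to_string()).or_default();
    *enabled = !*enabled;
    *enabled
}
```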


@@ -482,11 +482,17 @@ impl CodegenAlternative {
self.generation = cx.spawn(async move |codegen, cx| {
let stream = stream.await;
let token_usage = stream
.as_ref()
.ok()
.map(|stream| stream.last_token_usage.clone());
let message_id = stream
.as_ref()
.ok()
.and_then(|stream| stream.message_id.clone());
let generate = async {
let model_telemetry_id = model_telemetry_id.clone();
let model_provider_id = model_provider_id.clone();
let (mut diff_tx, mut diff_rx) = mpsc::channel(1);
let executor = cx.background_executor().clone();
let message_id = message_id.clone();
@@ -596,7 +602,7 @@ impl CodegenAlternative {
kind: AssistantKind::Inline,
phase: AssistantPhase::Response,
model: model_telemetry_id,
-model_provider: model_provider_id.to_string(),
+model_provider: model_provider_id,
response_latency,
error_message,
language_name: language_name.map(|name| name.to_proto()),
@@ -677,6 +683,16 @@ impl CodegenAlternative {
}
this.elapsed_time = Some(elapsed_time);
this.completion = Some(completion.lock().clone());
if let Some(usage) = token_usage {
let usage = usage.lock();
telemetry::event!(
"Inline Assistant Completion",
model = model_telemetry_id,
model_provider = model_provider_id,
input_tokens = usage.input_tokens,
output_tokens = usage.output_tokens,
)
}
cx.emit(CodegenEvent::Finished);
cx.notify();
})
@@ -1021,7 +1037,7 @@ mod tests {
language_settings, tree_sitter_rust, Buffer, Language, LanguageConfig, LanguageMatcher,
Point,
};
-use language_model::LanguageModelRegistry;
+use language_model::{LanguageModelRegistry, TokenUsage};
use rand::prelude::*;
use serde::Serialize;
use settings::SettingsStore;
@@ -1405,6 +1421,7 @@ mod tests {
future::ready(Ok(LanguageModelTextStream {
message_id: None,
stream: chunks_rx.map(Ok).boxed(),
last_token_usage: Arc::new(Mutex::new(TokenUsage::default())),
})),
cx,
);


@@ -29,7 +29,7 @@ use crate::context_strip::{ContextStrip, ContextStripEvent, SuggestContextKind};
use crate::profile_selector::ProfileSelector;
use crate::thread::{RequestKind, Thread};
use crate::thread_store::ThreadStore;
-use crate::{Chat, ChatMode, RemoveAllContext, ThreadEvent, ToggleContextPicker};
+use crate::{Chat, ChatMode, RemoveAllContext, ShowThreadDiff, ThreadEvent, ToggleContextPicker};
pub struct MessageEditor {
thread: Entity<Thread>,
@@ -152,6 +152,32 @@ impl MessageEditor {
) {
self.context_picker_menu_handle.toggle(window, cx);
}
fn show_thread_diff(
&mut self,
_: &ShowThreadDiff,
window: &mut Window,
cx: &mut Context<Self>,
) {
let Some(workspace) = self.workspace.upgrade() else {
return;
};
let diff_source = Arc::new(self.thread.read(cx).diff_source(cx));
let project_diff = cx.new(|cx| {
git_ui::project_diff::ProjectDiff::new(
diff_source,
self.project.clone(),
workspace.clone(),
window,
cx,
)
});
workspace.update(cx, |workspace, cx| {
workspace.add_item_to_active_pane(Box::new(project_diff), None, true, window, cx)
});
}
pub fn remove_all_context(
&mut self,
_: &RemoveAllContext,
@@ -317,7 +343,7 @@ impl Render for MessageEditor {
let project = self.thread.read(cx).project();
let changed_files = if let Some(repository) = project.read(cx).active_repository(cx) {
-repository.read(cx).status().count()
+repository.read(cx).cached_status().count()
} else {
0
};
@@ -565,6 +591,7 @@ impl Render for MessageEditor {
this.model_selector
.update(cx, |model_selector, cx| model_selector.toggle(window, cx));
}))
.on_action(cx.listener(Self::show_thread_diff))
.on_action(cx.listener(Self::toggle_context_picker))
.on_action(cx.listener(Self::remove_all_context))
.on_action(cx.listener(Self::move_up))


@@ -1,19 +1,14 @@
-use std::sync::{Arc, LazyLock};
+use std::sync::Arc;
use anyhow::Result;
use assistant_settings::{AgentProfile, AssistantSettings};
use editor::scroll::Autoscroll;
use editor::Editor;
use fs::Fs;
-use gpui::{prelude::*, AsyncWindowContext, Entity, Subscription, WeakEntity};
+use gpui::{prelude::*, Action, Entity, Subscription, WeakEntity};
use indexmap::IndexMap;
use regex::Regex;
use settings::{update_settings_file, Settings as _, SettingsStore};
use ui::{prelude::*, ContextMenu, ContextMenuEntry, PopoverMenu, Tooltip};
use util::ResultExt as _;
use workspace::{create_and_open_local_file, Workspace};
-use crate::ThreadStore;
+use crate::{ManageProfiles, ThreadStore};
pub struct ProfileSelector {
profiles: IndexMap<Arc<str>, AgentProfile>,
@@ -92,89 +87,13 @@ impl ProfileSelector {
.icon(IconName::Pencil)
.icon_color(Color::Muted)
.handler(move |window, cx| {
-if let Some(workspace) = window.root().flatten() {
-let workspace = workspace.downgrade();
-window
-.spawn(cx, async |cx| {
-Self::open_profiles_setting_in_editor(workspace, cx).await
-})
-.detach_and_log_err(cx);
-}
+window.dispatch_action(ManageProfiles.boxed_clone(), cx);
}),
);
menu
})
}
async fn open_profiles_setting_in_editor(
workspace: WeakEntity<Workspace>,
cx: &mut AsyncWindowContext,
) -> Result<()> {
let settings_editor = workspace
.update_in(cx, |_, window, cx| {
create_and_open_local_file(paths::settings_file(), window, cx, || {
settings::initial_user_settings_content().as_ref().into()
})
})?
.await?
.downcast::<Editor>()
.unwrap();
settings_editor
.downgrade()
.update_in(cx, |editor, window, cx| {
let text = editor.buffer().read(cx).snapshot(cx).text();
let settings = cx.global::<SettingsStore>();
let edits =
settings.edits_for_update::<AssistantSettings>(
&text,
|settings| match settings {
assistant_settings::AssistantSettingsContent::Versioned(settings) => {
match settings {
assistant_settings::VersionedAssistantSettingsContent::V2(
settings,
) => {
settings.profiles.get_or_insert_with(IndexMap::default);
}
assistant_settings::VersionedAssistantSettingsContent::V1(
_,
) => {}
}
}
assistant_settings::AssistantSettingsContent::Legacy(_) => {}
},
);
if !edits.is_empty() {
editor.edit(edits.iter().cloned(), cx);
}
let text = editor.buffer().read(cx).snapshot(cx).text();
static PROFILES_REGEX: LazyLock<Regex> =
LazyLock::new(|| Regex::new(r#"(?P<key>"profiles":)\s*\{"#).unwrap());
let range = PROFILES_REGEX.captures(&text).and_then(|captures| {
captures
.name("key")
.map(|inner_match| inner_match.start()..inner_match.end())
});
if let Some(range) = range {
editor.change_selections(
Some(Autoscroll::newest()),
window,
cx,
|selections| {
selections.select_ranges(vec![range]);
},
);
}
})?;
anyhow::Ok(())
}
}
impl Render for ProfileSelector {


@@ -34,6 +34,7 @@ use crate::thread_store::{
SerializedToolUse,
};
use crate::tool_use::{PendingToolUse, ToolUse, ToolUseState};
use crate::{ChangeAuthor, ThreadDiff, ThreadDiffSource};
#[derive(Debug, Clone, Copy)]
pub enum RequestKind {
@@ -181,6 +182,7 @@ pub struct Thread {
completion_count: usize,
pending_completions: Vec<PendingCompletion>,
project: Entity<Project>,
git_store: Entity<GitStore>,
prompt_builder: Arc<PromptBuilder>,
tools: Arc<ToolWorkingSet>,
tool_use: ToolUseState,
@@ -190,6 +192,7 @@ pub struct Thread {
initial_project_snapshot: Shared<Task<Option<Arc<ProjectSnapshot>>>>,
cumulative_token_usage: TokenUsage,
feedback: Option<ThreadFeedback>,
diff: Entity<ThreadDiff>,
}
impl Thread {
@@ -213,6 +216,7 @@ impl Thread {
completion_count: 0,
pending_completions: Vec::new(),
project: project.clone(),
git_store: project.read(cx).git_store().clone(),
prompt_builder,
tools: tools.clone(),
last_restore_checkpoint: None,
@@ -220,13 +224,14 @@ impl Thread {
tool_use: ToolUseState::new(tools.clone()),
action_log: cx.new(|_| ActionLog::new()),
initial_project_snapshot: {
-let project_snapshot = Self::project_snapshot(project, cx);
+let project_snapshot = Self::project_snapshot(project.clone(), cx);
cx.foreground_executor()
.spawn(async move { Some(project_snapshot.await) })
.shared()
},
cumulative_token_usage: TokenUsage::default(),
feedback: None,
diff: cx.new(|cx| ThreadDiff::new(project.clone(), cx)),
}
}
@@ -280,6 +285,8 @@ impl Thread {
pending_completions: Vec::new(),
last_restore_checkpoint: None,
pending_checkpoint: None,
diff: cx.new(|cx| ThreadDiff::new(project.clone(), cx)),
git_store: project.read(cx).git_store().clone(),
project,
prompt_builder,
tools,
@@ -356,6 +363,15 @@ impl Thread {
!self.tool_use.pending_tool_uses().is_empty()
}
pub fn diff_source(&self, cx: &App) -> ThreadDiffSource {
let project = self.project.read(cx);
ThreadDiffSource::new(
self.diff.clone(),
project.git_store().clone(),
project.languages().clone(),
)
}
pub fn checkpoint_for_message(&self, id: MessageId) -> Option<ThreadCheckpoint> {
self.checkpoints_by_message.get(&id).cloned()
}
@@ -1231,17 +1247,37 @@ impl Thread {
tool: Arc<dyn Tool>,
cx: &mut Context<Thread>,
) -> Task<()> {
let run_tool = tool.run(
input,
messages,
self.project.clone(),
self.action_log.clone(),
cx,
);
let diff = self.diff.clone();
let git_store = self.git_store.clone();
let checkpoint = git_store.read(cx).checkpoint(cx);
let messages = messages.to_vec();
let project = self.project.clone();
let action_log = self.action_log.clone();
cx.spawn({
async move |thread: WeakEntity<Thread>, cx| {
let output = run_tool.await;
if let Ok(checkpoint) = checkpoint.await {
diff.update(cx, |diff, cx| {
diff.compute_changes(ChangeAuthor::User, checkpoint)
})
.ok();
}
let Ok(output) =
cx.update(|cx| tool.run(input, &messages, project, action_log, cx))
else {
return;
};
let output = output.await;
if let Ok(checkpoint) = git_store.read_with(cx, |git, cx| git.checkpoint(cx)) {
if let Ok(checkpoint) = checkpoint.await {
diff.update(cx, |diff, cx| {
diff.compute_changes(ChangeAuthor::Agent, checkpoint)
})
.ok();
}
}
thread
.update(cx, |thread, cx| {
@@ -1410,7 +1446,7 @@ impl Thread {
git_store
.repositories()
.values()
-.find(|repo| repo.read(cx).worktree_id == snapshot.id())
+.find(|repo| repo.read(cx).worktree_id == Some(snapshot.id()))
.and_then(|repo| {
let repo = repo.read(cx);
Some((repo.branch().cloned(), repo.local_repository()?))
@@ -1429,7 +1465,7 @@ impl Thread {
// Get diff asynchronously
let diff = repo
-.diff(git::repository::DiffType::HeadToWorktree, cx.clone())
+.diff(git::repository::DiffType::HeadToWorktree)
.await
.ok();
@@ -1562,6 +1598,7 @@ pub enum ThreadEvent {
canceled: bool,
},
CheckpointChanged,
DiffChanged,
ToolConfirmationNeeded,
}


@@ -0,0 +1,219 @@
use anyhow::{Context as _, Result};
use buffer_diff::BufferDiff;
use collections::HashMap;
use futures::{channel::mpsc, future::Shared, FutureExt, StreamExt};
use gpui::{prelude::*, App, Entity, Subscription, Task};
use language::{Buffer, LanguageRegistry};
use project::{
git_store::{GitStore, GitStoreCheckpoint, GitStoreIndex, GitStoreStatus},
Project,
};
use std::sync::Arc;
use util::TryFutureExt;
#[derive(Copy, Clone, Debug, Eq, PartialEq)]
pub enum ChangeAuthor {
User,
Agent,
}
pub struct ThreadDiff {
base: Shared<Task<Option<GitStoreIndex>>>,
diffs_by_buffer: HashMap<Entity<Buffer>, Entity<BufferDiff>>,
status: GitStoreStatus,
project: Entity<Project>,
git_store: Entity<GitStore>,
checkpoints_tx: mpsc::UnboundedSender<(ChangeAuthor, GitStoreCheckpoint)>,
update_status_tx: async_watch::Sender<()>,
_subscription: Subscription,
_maintain_status: Task<Result<()>>,
_maintain_diff: Task<Result<()>>,
}
impl ThreadDiff {
pub fn new(project: Entity<Project>, cx: &mut Context<Self>) -> Self {
let git_store = project.read(cx).git_store().clone();
let (checkpoints_tx, mut checkpoints_rx) = mpsc::unbounded();
let (update_status_tx, mut update_status_rx) = async_watch::channel(());
let checkpoint = git_store.read(cx).checkpoint(cx);
let base = cx
.background_spawn(
project
.read(cx)
.git_store()
.read(cx)
.create_index(cx)
.log_err(),
)
.shared();
Self {
base: base.clone(),
status: GitStoreStatus::default(),
diffs_by_buffer: HashMap::default(),
git_store: git_store.clone(),
project,
checkpoints_tx,
update_status_tx,
_subscription: cx.subscribe(&git_store, |this, _git_store, event, _cx| {
if let project::git_store::GitEvent::FileSystemUpdated = event {
this.update_status_tx.send(()).ok();
}
}),
_maintain_status: cx.spawn({
let git_store = git_store.clone();
let base = base.clone();
async move |this, cx| {
let base = base.await.context("failed to create base")?;
while let Ok(()) = update_status_rx.recv().await {
let status = git_store
.read_with(cx, |store, cx| store.status(Some(base.clone()), cx))?
.await
.unwrap_or_default();
this.update(cx, |this, cx| this.set_status(status, cx))?;
}
Ok(())
}
}),
_maintain_diff: cx.spawn(async move |this, cx| {
let mut last_checkpoint = checkpoint.await.ok();
let base = base.await.context("failed to create base")?;
while let Some((author, checkpoint)) = checkpoints_rx.next().await {
if let Some(last_checkpoint) = last_checkpoint {
if author == ChangeAuthor::User {
let diff = git_store
.read_with(cx, |store, cx| {
store.diff_checkpoints(last_checkpoint, checkpoint.clone(), cx)
})?
.await;
if let Ok(diff) = diff {
_ = git_store
.read_with(cx, |store, cx| {
store.apply_diff(base.clone(), diff, cx)
})?
.await;
}
}
this.update(cx, |this, _cx| this.update_status_tx.send(()).ok())?;
}
last_checkpoint = Some(checkpoint);
}
Ok(())
}),
}
}
pub fn compute_changes(&mut self, author: ChangeAuthor, checkpoint: GitStoreCheckpoint) {
_ = self.checkpoints_tx.unbounded_send((author, checkpoint));
}
fn set_status(&mut self, status: GitStoreStatus, cx: &mut Context<Self>) {
self.status = status;
cx.notify();
}
}
pub struct ThreadDiffSource {
thread_diff: Entity<ThreadDiff>,
git_store: Entity<GitStore>,
language_registry: Arc<LanguageRegistry>,
}
impl ThreadDiffSource {
pub fn new(
thread_diff: Entity<ThreadDiff>,
git_store: Entity<GitStore>,
language_registry: Arc<LanguageRegistry>,
) -> Self {
Self {
thread_diff,
git_store,
language_registry,
}
}
}
impl git_ui::project_diff::DiffSource for ThreadDiffSource {
fn observe(&self, cx: &mut App, mut f: Box<dyn FnMut(&mut App) + Send>) -> Subscription {
cx.observe(&self.thread_diff, move |_, cx| f(cx))
}
fn status(&self, cx: &App) -> Vec<git_ui::project_diff::StatusEntry> {
let mut results = Vec::new();
for (repo, repo_path, status) in self
.thread_diff
.read(cx)
.status
.entries(&self.git_store, cx)
{
let Some(project_path) = repo.read(cx).repo_path_to_project_path(repo_path, cx) else {
continue;
};
let status = match *status {
git::status::FileStatus::Tracked(mut tracked_status) => {
if tracked_status.worktree_status == git::status::StatusCode::Unmodified {
continue;
} else {
tracked_status.index_status = git::status::StatusCode::Unmodified;
git::status::FileStatus::Tracked(tracked_status)
}
}
status @ _ => status,
};
results.push(git_ui::project_diff::StatusEntry {
project_path,
status,
has_conflict: false,
});
}
results
}
fn open_uncommitted_diff(
&self,
buffer: Entity<Buffer>,
cx: &mut App,
) -> Task<Result<Entity<BufferDiff>>> {
let base = self.thread_diff.read(cx).base.clone();
let git_store = self.git_store.clone();
let language_registry = self.language_registry.clone();
let thread_diff = self.thread_diff.clone();
cx.spawn(async move |cx| {
let base = base.await.context("failed to load diff base")?;
let base_text = git_store
.read_with(cx, |git, cx| git.load_index_text(Some(base), &buffer, cx))?
.await;
let snapshot = buffer.read_with(cx, |buffer, _cx| buffer.snapshot())?;
let diff = thread_diff.update(cx, |thread_diff, cx| {
thread_diff
.diffs_by_buffer
.entry(buffer.clone())
.or_insert_with(|| cx.new(|cx| BufferDiff::new(&snapshot, cx)))
.clone()
})?;
let base_text = Arc::new(base_text.unwrap_or_default());
let diff_snapshot = BufferDiff::update_diff(
diff.clone(),
snapshot.text.clone(),
Some(base_text),
true,
false,
snapshot.language().cloned(),
Some(language_registry),
cx,
)
.await?;
diff.update(cx, |diff, cx| {
diff.set_snapshot(&snapshot, diff_snapshot, false, None, cx);
})?;
Ok(diff)
})
}
}
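The `DiffSource` implementation above shows the shape of the abstraction: `ProjectDiff` only needs a way to enumerate changed entries and open a diff for one buffer, so the same view can be backed by the git working tree or by an assistant thread. A simplified, self-contained sketch of that idea (the `DiffSource`/`StatusEntry` names mirror the trait, but these bodies are illustrative, not Zed's real types):

```rust
// A toy status entry; the real StatusEntry carries a ProjectPath and FileStatus.
#[derive(Debug, Clone, PartialEq)]
struct StatusEntry {
    path: String,
    modified: bool,
}

// The part of the DiffSource contract sketched here: enumerate entries.
trait DiffSource {
    fn status(&self) -> Vec<StatusEntry>;
}

// An in-memory source standing in for ThreadDiffSource.
struct InMemorySource {
    entries: Vec<StatusEntry>,
}

impl DiffSource for InMemorySource {
    // Skip unmodified entries, like the worktree_status == Unmodified check above.
    fn status(&self) -> Vec<StatusEntry> {
        self.entries.iter().filter(|e| e.modified).cloned().collect()
    }
}

fn main() {
    let source = InMemorySource {
        entries: vec![
            StatusEntry { path: "a.rs".into(), modified: true },
            StatusEntry { path: "b.rs".into(), modified: false },
        ],
    };
    let visible = source.status();
    assert_eq!(visible.len(), 1);
    assert_eq!(visible[0].path, "a.rs");
}
```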

View File

@@ -149,7 +149,10 @@ pub fn init(cx: &mut App) -> Arc<HeadlessAppState> {
cx.set_http_client(client.http_client().clone());
let git_binary_path = None;
let fs = Arc::new(RealFs::new(git_binary_path));
let fs = Arc::new(RealFs::new(
git_binary_path,
cx.background_executor().clone(),
));
let languages = Arc::new(LanguageRegistry::new(cx.background_executor().clone()));

View File

@@ -12,7 +12,7 @@ pub struct AgentProfile {
pub context_servers: IndexMap<Arc<str>, ContextServerPreset>,
}
#[derive(Debug, Clone)]
#[derive(Debug, Clone, Default)]
pub struct ContextServerPreset {
pub tools: IndexMap<Arc<str>, bool>,
}

View File

@@ -442,7 +442,7 @@ pub struct AgentProfileContent {
pub context_servers: IndexMap<Arc<str>, ContextServerPresetContent>,
}
#[derive(Debug, PartialEq, Clone, Serialize, Deserialize, JsonSchema)]
#[derive(Debug, PartialEq, Clone, Default, Serialize, Deserialize, JsonSchema)]
pub struct ContextServerPresetContent {
pub tools: IndexMap<Arc<str>, bool>,
}

View File

@@ -45,11 +45,10 @@ impl Tool for BashTool {
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<BashToolInput>(input.clone()) {
Ok(input) => {
let cmd = MarkdownString::escape(&input.command);
if input.command.contains('\n') {
format!("```bash\n{cmd}\n```")
MarkdownString::code_block("bash", &input.command).0
} else {
format!("`{cmd}`")
MarkdownString::inline_code(&input.command).0
}
}
Err(_) => "Run bash command".to_string(),
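This refactor (repeated across the tool files below) replaces manual `MarkdownString::escape` plus hand-written backticks with `inline_code`/`code_block` helpers. A sketch of the behavior such helpers plausibly provide — this is an illustrative reimplementation under assumed semantics, not Zed's actual `MarkdownString`:

```rust
// Wrap text in backticks, widening the delimiter past any backtick run
// inside the text so the result still renders as a code span.
fn inline_code(text: &str) -> String {
    let longest_run = text
        .split(|c: char| c != '`')
        .map(|run| run.len())
        .max()
        .unwrap_or(0);
    let fence = "`".repeat(longest_run + 1);
    format!("{fence}{text}{fence}")
}

// Wrap multi-line text in a fenced code block with a language tag.
fn code_block(language: &str, text: &str) -> String {
    let fence = "`".repeat(3);
    format!("{fence}{language}\n{text}\n{fence}")
}

fn main() {
    assert_eq!(inline_code("ls -la"), "`ls -la`");
    assert_eq!(inline_code("a`b"), "``a`b``");
    assert!(code_block("bash", "echo hi").starts_with("```bash\n"));
}
```

Centralizing this keeps every tool's `ui_text` from re-implementing escaping, which is why the callers below shrink to plain `format!` calls.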

View File

@@ -61,9 +61,9 @@ impl Tool for CopyPathTool {
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<CopyPathToolInput>(input.clone()) {
Ok(input) => {
let src = MarkdownString::escape(&input.source_path);
let dest = MarkdownString::escape(&input.destination_path);
format!("Copy `{src}` to `{dest}`")
let src = MarkdownString::inline_code(&input.source_path);
let dest = MarkdownString::inline_code(&input.destination_path);
format!("Copy {src} to {dest}")
}
Err(_) => "Copy path".to_string(),
}

View File

@@ -51,7 +51,10 @@ impl Tool for CreateDirectoryTool {
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<CreateDirectoryToolInput>(input.clone()) {
Ok(input) => {
format!("Create directory `{}`", MarkdownString::escape(&input.path))
format!(
"Create directory {}",
MarkdownString::inline_code(&input.path)
)
}
Err(_) => "Create directory".to_string(),
}

View File

@@ -58,8 +58,8 @@ impl Tool for CreateFileTool {
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<CreateFileToolInput>(input.clone()) {
Ok(input) => {
let path = MarkdownString::escape(&input.path);
format!("Create file `{path}`")
let path = MarkdownString::inline_code(&input.path);
format!("Create file {path}")
}
Err(_) => "Create file".to_string(),
}

View File

@@ -66,11 +66,11 @@ impl Tool for DiagnosticsTool {
if let Some(path) = serde_json::from_value::<DiagnosticsToolInput>(input.clone())
.ok()
.and_then(|input| match input.path {
Some(path) if !path.is_empty() => Some(MarkdownString::escape(&path)),
Some(path) if !path.is_empty() => Some(MarkdownString::inline_code(&path)),
_ => None,
})
{
format!("Check diagnostics for `{path}`")
format!("Check diagnostics for {path}")
} else {
"Check project diagnostics".to_string()
}

View File

@@ -63,8 +63,8 @@ impl Tool for ListDirectoryTool {
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<ListDirectoryToolInput>(input.clone()) {
Ok(input) => {
let path = MarkdownString::escape(&input.path);
format!("List the `{path}` directory's contents")
let path = MarkdownString::inline_code(&input.path);
format!("List the {path} directory's contents")
}
Err(_) => "List directory".to_string(),
}

View File

@@ -61,8 +61,8 @@ impl Tool for MovePathTool {
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<MovePathToolInput>(input.clone()) {
Ok(input) => {
let src = MarkdownString::escape(&input.source_path);
let dest = MarkdownString::escape(&input.destination_path);
let src = MarkdownString::inline_code(&input.source_path);
let dest = MarkdownString::inline_code(&input.destination_path);
let src_path = Path::new(&input.source_path);
let dest_path = Path::new(&input.destination_path);
@@ -71,11 +71,11 @@ impl Tool for MovePathTool {
.and_then(|os_str| os_str.to_os_string().into_string().ok())
{
Some(filename) if src_path.parent() == dest_path.parent() => {
let filename = MarkdownString::escape(&filename);
format!("Rename `{src}` to `{filename}`")
let filename = MarkdownString::inline_code(&filename);
format!("Rename {src} to {filename}")
}
_ => {
format!("Move `{src}` to `{dest}`")
format!("Move {src} to {dest}")
}
}
}

View File

@@ -66,8 +66,8 @@ impl Tool for ReadFileTool {
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<ReadFileToolInput>(input.clone()) {
Ok(input) => {
let path = MarkdownString::escape(&input.path.display().to_string());
format!("Read file `{path}`")
let path = MarkdownString::inline_code(&input.path.display().to_string());
format!("Read file {path}")
}
Err(_) => "Read file".to_string(),
}

View File

@@ -64,12 +64,12 @@ impl Tool for RegexSearchTool {
match serde_json::from_value::<RegexSearchToolInput>(input.clone()) {
Ok(input) => {
let page = input.page();
let regex = MarkdownString::escape(&input.regex);
let regex = MarkdownString::inline_code(&input.regex);
if page > 1 {
format!("Get page {page} of search results for regex “`{regex}`”")
format!("Get page {page} of search results for regex “{regex}”")
} else {
format!("Search files for regex “`{regex}`”")
format!("Search files for regex “{regex}”")
}
}
Err(_) => "Search with regex".to_string(),

View File

@@ -3,9 +3,7 @@ use git2::{DiffLineType as GitDiffLineType, DiffOptions as GitOptions, Patch as
use gpui::{App, AppContext as _, AsyncApp, Context, Entity, EventEmitter, Task};
use language::{Language, LanguageRegistry};
use rope::Rope;
use std::cmp::Ordering;
use std::mem;
use std::{future::Future, iter, ops::Range, sync::Arc};
use std::{cmp::Ordering, future::Future, iter, mem, ops::Range, sync::Arc};
use sum_tree::SumTree;
use text::{Anchor, Bias, BufferId, OffsetRangeExt, Point, ToOffset as _};
use util::ResultExt;
@@ -195,7 +193,7 @@ impl BufferDiffInner {
hunks: &[DiffHunk],
buffer: &text::BufferSnapshot,
file_exists: bool,
) -> (Option<Rope>, SumTree<PendingHunk>) {
) -> Option<Rope> {
let head_text = self
.base_text_exists
.then(|| self.base_text.as_rope().clone());
@@ -208,7 +206,7 @@ impl BufferDiffInner {
let (index_text, head_text) = match (index_text, head_text) {
(Some(index_text), Some(head_text)) if file_exists || !stage => (index_text, head_text),
(index_text, head_text) => {
let (rope, new_status) = if stage {
let (new_index_text, new_status) = if stage {
log::debug!("stage all");
(
file_exists.then(|| buffer.as_rope().clone()),
@@ -228,15 +226,13 @@ impl BufferDiffInner {
buffer_version: buffer.version().clone(),
new_status,
};
let tree = SumTree::from_item(hunk, buffer);
return (rope, tree);
self.pending_hunks = SumTree::from_item(hunk, buffer);
return new_index_text;
}
};
let mut pending_hunks = SumTree::new(buffer);
let mut old_pending_hunks = unstaged_diff
.pending_hunks
.cursor::<DiffHunkSummary>(buffer);
let mut old_pending_hunks = self.pending_hunks.cursor::<DiffHunkSummary>(buffer);
// first, merge new hunks into pending_hunks
for DiffHunk {
@@ -261,7 +257,6 @@ impl BufferDiffInner {
old_pending_hunks.next(buffer);
}
// merge into pending hunks
if (stage && secondary_status == DiffHunkSecondaryStatus::NoSecondaryHunk)
|| (!stage && secondary_status == DiffHunkSecondaryStatus::HasSecondaryHunk)
{
@@ -288,56 +283,71 @@ impl BufferDiffInner {
let mut unstaged_hunk_cursor = unstaged_diff.hunks.cursor::<DiffHunkSummary>(buffer);
unstaged_hunk_cursor.next(buffer);
let mut prev_unstaged_hunk_buffer_offset = 0;
let mut prev_unstaged_hunk_base_text_offset = 0;
let mut edits = Vec::<(Range<usize>, String)>::new();
// then, iterate over all pending hunks (both new ones and the existing ones) and compute the edits
for PendingHunk {
let mut prev_unstaged_hunk_buffer_end = 0;
let mut prev_unstaged_hunk_base_text_end = 0;
let mut edits = Vec::<(Range<usize>, String)>::new();
let mut pending_hunks_iter = pending_hunks.iter().cloned().peekable();
while let Some(PendingHunk {
buffer_range,
diff_base_byte_range,
..
} in pending_hunks.iter().cloned()
}) = pending_hunks_iter.next()
{
let skipped_hunks = unstaged_hunk_cursor.slice(&buffer_range.start, Bias::Left, buffer);
// Advance unstaged_hunk_cursor to skip unstaged hunks before current hunk
let skipped_unstaged =
unstaged_hunk_cursor.slice(&buffer_range.start, Bias::Left, buffer);
if let Some(secondary_hunk) = skipped_hunks.last() {
prev_unstaged_hunk_base_text_offset = secondary_hunk.diff_base_byte_range.end;
prev_unstaged_hunk_buffer_offset =
secondary_hunk.buffer_range.end.to_offset(buffer);
if let Some(unstaged_hunk) = skipped_unstaged.last() {
prev_unstaged_hunk_base_text_end = unstaged_hunk.diff_base_byte_range.end;
prev_unstaged_hunk_buffer_end = unstaged_hunk.buffer_range.end.to_offset(buffer);
}
// Find where this hunk is in the index if it doesn't overlap
let mut buffer_offset_range = buffer_range.to_offset(buffer);
let start_overshoot = buffer_offset_range.start - prev_unstaged_hunk_buffer_offset;
let mut index_start = prev_unstaged_hunk_base_text_offset + start_overshoot;
let start_overshoot = buffer_offset_range.start - prev_unstaged_hunk_buffer_end;
let mut index_start = prev_unstaged_hunk_base_text_end + start_overshoot;
while let Some(unstaged_hunk) = unstaged_hunk_cursor.item().filter(|item| {
item.buffer_range
.start
.cmp(&buffer_range.end, buffer)
.is_le()
}) {
let unstaged_hunk_offset_range = unstaged_hunk.buffer_range.to_offset(buffer);
prev_unstaged_hunk_base_text_offset = unstaged_hunk.diff_base_byte_range.end;
prev_unstaged_hunk_buffer_offset = unstaged_hunk_offset_range.end;
loop {
// Merge this hunk with any overlapping unstaged hunks.
if let Some(unstaged_hunk) = unstaged_hunk_cursor.item() {
let unstaged_hunk_offset_range = unstaged_hunk.buffer_range.to_offset(buffer);
if unstaged_hunk_offset_range.start <= buffer_offset_range.end {
prev_unstaged_hunk_base_text_end = unstaged_hunk.diff_base_byte_range.end;
prev_unstaged_hunk_buffer_end = unstaged_hunk_offset_range.end;
index_start = index_start.min(unstaged_hunk.diff_base_byte_range.start);
buffer_offset_range.start = buffer_offset_range
.start
.min(unstaged_hunk_offset_range.start);
index_start = index_start.min(unstaged_hunk.diff_base_byte_range.start);
buffer_offset_range.start = buffer_offset_range
.start
.min(unstaged_hunk_offset_range.start);
buffer_offset_range.end =
buffer_offset_range.end.max(unstaged_hunk_offset_range.end);
unstaged_hunk_cursor.next(buffer);
unstaged_hunk_cursor.next(buffer);
continue;
}
}
// If any unstaged hunks were merged, then subsequent pending hunks may
// now overlap this hunk. Merge them.
if let Some(next_pending_hunk) = pending_hunks_iter.peek() {
let next_pending_hunk_offset_range =
next_pending_hunk.buffer_range.to_offset(buffer);
if next_pending_hunk_offset_range.start <= buffer_offset_range.end {
buffer_offset_range.end = next_pending_hunk_offset_range.end;
pending_hunks_iter.next();
continue;
}
}
break;
}
let end_overshoot = buffer_offset_range
.end
.saturating_sub(prev_unstaged_hunk_buffer_offset);
let index_end = prev_unstaged_hunk_base_text_offset + end_overshoot;
let index_range = index_start..index_end;
buffer_offset_range.end = buffer_offset_range
.end
.max(prev_unstaged_hunk_buffer_offset);
.saturating_sub(prev_unstaged_hunk_buffer_end);
let index_end = prev_unstaged_hunk_base_text_end + end_overshoot;
let index_byte_range = index_start..index_end;
let replacement_text = if stage {
log::debug!("stage hunk {:?}", buffer_offset_range);
@@ -351,8 +361,11 @@ impl BufferDiffInner {
.collect::<String>()
};
edits.push((index_range, replacement_text));
edits.push((index_byte_range, replacement_text));
}
drop(pending_hunks_iter);
drop(old_pending_hunks);
self.pending_hunks = pending_hunks;
#[cfg(debug_assertions)] // invariants: non-overlapping and sorted
{
@@ -371,7 +384,7 @@ impl BufferDiffInner {
new_index_text.push(&replacement_text);
}
new_index_text.append(index_cursor.suffix());
(Some(new_index_text), pending_hunks)
Some(new_index_text)
}
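The merge loop above coalesces intervals: each pending hunk's buffer range is repeatedly widened to absorb overlapping unstaged hunks, and since widening can make the *next* pending hunk overlap too, those are absorbed as well before moving on. A std-only sketch of that coalescing over plain `usize` ranges (buffer anchors and base-text offsets are omitted, and unstaged ranges before the first pending hunk are assumed already skipped, as the `slice` call does above):

```rust
use std::ops::Range;

fn merge_overlapping(
    mut pending: Vec<Range<usize>>,
    unstaged: &[Range<usize>],
) -> Vec<Range<usize>> {
    let mut merged = Vec::new();
    let mut pending_iter = pending.drain(..).peekable();
    let mut unstaged_iter = unstaged.iter().peekable();
    while let Some(mut range) = pending_iter.next() {
        loop {
            // Absorb any unstaged range that overlaps the current one.
            if let Some(u) = unstaged_iter.peek() {
                if u.start <= range.end {
                    range.start = range.start.min(u.start);
                    range.end = range.end.max(u.end);
                    unstaged_iter.next();
                    continue;
                }
            }
            // Widening may have made the next pending range overlap too.
            if let Some(next) = pending_iter.peek() {
                if next.start <= range.end {
                    range.end = range.end.max(next.end);
                    pending_iter.next();
                    continue;
                }
            }
            break;
        }
        merged.push(range);
    }
    merged
}

fn main() {
    // Two pending hunks bridged by one overlapping unstaged hunk collapse
    // into a single range — the case the new inner loop handles.
    assert_eq!(merge_overlapping(vec![2..5, 6..9], &[4..7]), vec![2..9]);
}
```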
fn hunks_intersecting_range<'a>(
@@ -408,15 +421,14 @@ impl BufferDiffInner {
]
});
let mut pending_hunks_cursor = self.pending_hunks.cursor::<DiffHunkSummary>(buffer);
pending_hunks_cursor.next(buffer);
let mut secondary_cursor = None;
let mut pending_hunks_cursor = None;
if let Some(secondary) = secondary.as_ref() {
let mut cursor = secondary.hunks.cursor::<DiffHunkSummary>(buffer);
cursor.next(buffer);
secondary_cursor = Some(cursor);
let mut cursor = secondary.pending_hunks.cursor::<DiffHunkSummary>(buffer);
cursor.next(buffer);
pending_hunks_cursor = Some(cursor);
}
let max_point = buffer.max_point();
@@ -438,29 +450,27 @@ impl BufferDiffInner {
let mut secondary_status = DiffHunkSecondaryStatus::NoSecondaryHunk;
let mut has_pending = false;
if let Some(pending_cursor) = pending_hunks_cursor.as_mut() {
if start_anchor
.cmp(&pending_cursor.start().buffer_range.start, buffer)
.is_gt()
{
pending_cursor.seek_forward(&start_anchor, Bias::Left, buffer);
if start_anchor
.cmp(&pending_hunks_cursor.start().buffer_range.start, buffer)
.is_gt()
{
pending_hunks_cursor.seek_forward(&start_anchor, Bias::Left, buffer);
}
if let Some(pending_hunk) = pending_hunks_cursor.item() {
let mut pending_range = pending_hunk.buffer_range.to_point(buffer);
if pending_range.end.column > 0 {
pending_range.end.row += 1;
pending_range.end.column = 0;
}
if let Some(pending_hunk) = pending_cursor.item() {
let mut pending_range = pending_hunk.buffer_range.to_point(buffer);
if pending_range.end.column > 0 {
pending_range.end.row += 1;
pending_range.end.column = 0;
}
if pending_range == (start_point..end_point) {
if !buffer.has_edits_since_in_range(
&pending_hunk.buffer_version,
start_anchor..end_anchor,
) {
has_pending = true;
secondary_status = pending_hunk.new_status;
}
if pending_range == (start_point..end_point) {
if !buffer.has_edits_since_in_range(
&pending_hunk.buffer_version,
start_anchor..end_anchor,
) {
has_pending = true;
secondary_status = pending_hunk.new_status;
}
}
}
@@ -839,10 +849,8 @@ impl BufferDiff {
}
pub fn clear_pending_hunks(&mut self, cx: &mut Context<Self>) {
if let Some(secondary_diff) = &self.secondary_diff {
secondary_diff.update(cx, |diff, _| {
diff.inner.pending_hunks = SumTree::from_summary(DiffHunkSummary::default());
});
if self.secondary_diff.is_some() {
self.inner.pending_hunks = SumTree::from_summary(DiffHunkSummary::default());
cx.emit(BufferDiffEvent::DiffChanged {
changed_range: Some(Anchor::MIN..Anchor::MAX),
});
@@ -857,7 +865,7 @@ impl BufferDiff {
file_exists: bool,
cx: &mut Context<Self>,
) -> Option<Rope> {
let (new_index_text, new_pending_hunks) = self.inner.stage_or_unstage_hunks_impl(
let new_index_text = self.inner.stage_or_unstage_hunks_impl(
&self.secondary_diff.as_ref()?.read(cx).inner,
stage,
&hunks,
@@ -865,11 +873,6 @@ impl BufferDiff {
file_exists,
);
if let Some(unstaged_diff) = &self.secondary_diff {
unstaged_diff.update(cx, |diff, _| {
diff.inner.pending_hunks = new_pending_hunks;
});
}
cx.emit(BufferDiffEvent::HunksStagedOrUnstaged(
new_index_text.clone(),
));
@@ -1649,6 +1652,75 @@ mod tests {
"
.unindent(),
},
Example {
name: "one unstaged hunk that contains two uncommitted hunks",
head_text: "
one
two
three
four
"
.unindent(),
index_text: "
one
two
three
four
"
.unindent(),
buffer_marked_text: "
«one
three // modified
four»
"
.unindent(),
final_index_text: "
one
three // modified
four
"
.unindent(),
},
Example {
name: "one uncommitted hunk that contains two unstaged hunks",
head_text: "
one
two
three
four
five
"
.unindent(),
index_text: "
ZERO
one
TWO
THREE
FOUR
five
"
.unindent(),
buffer_marked_text: "
«one
TWO_HUNDRED
THREE
FOUR_HUNDRED
five»
"
.unindent(),
final_index_text: "
ZERO
one
TWO_HUNDRED
THREE
FOUR_HUNDRED
five
"
.unindent(),
},
];
for example in table {

View File

@@ -43,10 +43,10 @@ telemetry.workspace = true
util.workspace = true
[target.'cfg(target_os = "macos")'.dependencies]
livekit_client_macos = { workspace = true }
livekit_client_macos.workspace = true
[target.'cfg(not(target_os = "macos"))'.dependencies]
livekit_client = { workspace = true }
livekit_client.workspace = true
[dev-dependencies]
client = { workspace = true, features = ["test-support"] }

View File

@@ -115,7 +115,7 @@ notifications = { workspace = true, features = ["test-support"] }
pretty_assertions.workspace = true
project = { workspace = true, features = ["test-support"] }
prompt_store.workspace = true
recent_projects = { workspace = true }
recent_projects.workspace = true
release_channel.workspace = true
remote = { workspace = true, features = ["test-support"] }
remote_server.workspace = true

View File

@@ -2892,15 +2892,17 @@ async fn test_git_branch_name(
#[track_caller]
fn assert_branch(branch_name: Option<impl Into<String>>, project: &Project, cx: &App) {
let branch_name = branch_name.map(Into::into);
let worktrees = project.visible_worktrees(cx).collect::<Vec<_>>();
assert_eq!(worktrees.len(), 1);
let worktree = worktrees[0].clone();
let snapshot = worktree.read(cx).snapshot();
let repo = snapshot.repositories().first().unwrap();
let repositories = project.repositories(cx).values().collect::<Vec<_>>();
assert_eq!(repositories.len(), 1);
let repository = repositories[0].clone();
assert_eq!(
repo.branch().map(|branch| branch.name.to_string()),
repository
.read(cx)
.repository_entry
.branch()
.map(|branch| branch.name.to_string()),
branch_name
);
)
}
// Smoke test branch reading
@@ -3022,11 +3024,20 @@ async fn test_git_status_sync(
cx: &App,
) {
let file = file.as_ref();
let worktrees = project.visible_worktrees(cx).collect::<Vec<_>>();
assert_eq!(worktrees.len(), 1);
let worktree = worktrees[0].clone();
let snapshot = worktree.read(cx).snapshot();
assert_eq!(snapshot.status_for_file(file), status);
let repos = project
.repositories(cx)
.values()
.cloned()
.collect::<Vec<_>>();
assert_eq!(repos.len(), 1);
let repo = repos.into_iter().next().unwrap();
assert_eq!(
repo.read(cx)
.repository_entry
.status_for_path(&file.into())
.map(|entry| entry.status),
status
);
}
project_local.read_with(cx_a, |project, cx| {
@@ -3094,6 +3105,27 @@ async fn test_git_status_sync(
assert_status("b.txt", Some(B_STATUS_END), project, cx);
assert_status("c.txt", Some(C_STATUS_END), project, cx);
});
// Now remove the original git repository and check that collaborators are notified.
client_a
.fs()
.remove_dir("/dir/.git".as_ref(), RemoveOptions::default())
.await
.unwrap();
executor.run_until_parked();
project_remote.update(cx_b, |project, cx| {
pretty_assertions::assert_eq!(
project.git_store().read(cx).repo_snapshots(cx),
HashMap::default()
);
});
project_remote_c.update(cx_c, |project, cx| {
pretty_assertions::assert_eq!(
project.git_store().read(cx).repo_snapshots(cx),
HashMap::default()
);
});
}
#[gpui::test(iterations = 10)]

View File

@@ -1,8 +1,8 @@
use crate::tests::TestServer;
use call::ActiveCall;
use collections::HashSet;
use collections::{HashMap, HashSet};
use extension::ExtensionHostProxy;
use fs::{FakeFs, Fs as _};
use fs::{FakeFs, Fs as _, RemoveOptions};
use futures::StreamExt as _;
use gpui::{
AppContext as _, BackgroundExecutor, SemanticVersion, TestAppContext, UpdateGlobal as _,
@@ -356,6 +356,26 @@ async fn test_ssh_collaboration_git_branches(
});
assert_eq!(server_branch.name, "totally-new-branch");
// Remove the git repository and check that all participants get the update.
remote_fs
.remove_dir("/project/.git".as_ref(), RemoveOptions::default())
.await
.unwrap();
executor.run_until_parked();
project_a.update(cx_a, |project, cx| {
pretty_assertions::assert_eq!(
project.git_store().read(cx).repo_snapshots(cx),
HashMap::default()
);
});
project_b.update(cx_b, |project, cx| {
pretty_assertions::assert_eq!(
project.git_store().read(cx).repo_snapshots(cx),
HashMap::default()
);
});
}
#[gpui::test]

View File

@@ -1409,14 +1409,6 @@ impl Editor {
code_action_providers.push(Rc::new(project) as Rc<_>);
}
let hide_mouse_while_typing = if !matches!(mode, EditorMode::SingleLine { .. }) {
EditorSettings::get_global(cx)
.hide_mouse_while_typing
.unwrap_or(true)
} else {
false
};
let mut this = Self {
focus_handle,
show_cursor_when_unfocused: false,
@@ -1579,7 +1571,9 @@ impl Editor {
text_style_refinement: None,
load_diff_task: load_uncommitted_diff,
mouse_cursor_hidden: false,
hide_mouse_while_typing,
hide_mouse_while_typing: EditorSettings::get_global(cx)
.hide_mouse_while_typing
.unwrap_or(true),
};
if let Some(breakpoints) = this.breakpoint_store.as_ref() {
this._subscriptions
@@ -6134,35 +6128,6 @@ impl Editor {
return breakpoint_display_points;
};
if let Some(buffer) = self.buffer.read(cx).as_singleton() {
let buffer_snapshot = buffer.read(cx).snapshot();
for breakpoint in
breakpoint_store
.read(cx)
.breakpoints(&buffer, None, &buffer_snapshot, cx)
{
let point = buffer_snapshot.summary_for_anchor::<Point>(&breakpoint.0);
let mut anchor = multi_buffer_snapshot.anchor_before(point);
anchor.text_anchor = breakpoint.0;
breakpoint_display_points.insert(
snapshot
.point_to_display_point(
MultiBufferPoint {
row: point.row,
column: point.column,
},
Bias::Left,
)
.row(),
(anchor, breakpoint.1.clone()),
);
}
return breakpoint_display_points;
}
let range = snapshot.display_point_to_point(DisplayPoint::new(range.start, 0), Bias::Left)
..snapshot.display_point_to_point(DisplayPoint::new(range.end, 0), Bias::Right);
@@ -16701,11 +16666,7 @@ impl Editor {
self.scroll_manager.vertical_scroll_margin = editor_settings.vertical_scroll_margin;
self.show_breadcrumbs = editor_settings.toolbar.breadcrumbs;
self.cursor_shape = editor_settings.cursor_shape.unwrap_or_default();
self.hide_mouse_while_typing = if !matches!(self.mode, EditorMode::SingleLine { .. }) {
editor_settings.hide_mouse_while_typing.unwrap_or(true)
} else {
false
};
self.hide_mouse_while_typing = editor_settings.hide_mouse_while_typing.unwrap_or(true);
if !self.hide_mouse_while_typing {
self.mouse_cursor_hidden = false;

View File

@@ -1955,7 +1955,12 @@ impl EditorElement {
.filter_map(|(display_row, (text_anchor, bp))| {
if row_infos
.get((display_row.0.saturating_sub(range.start.0)) as usize)
.is_some_and(|row_info| row_info.expand_info.is_some())
.is_some_and(|row_info| {
row_info.expand_info.is_some()
|| row_info
.diff_status
.is_some_and(|status| status.is_deleted())
})
{
return None;
}

View File

@@ -150,7 +150,7 @@ impl GitBlame {
this.generate(cx);
}
}
project::Event::WorktreeUpdatedGitRepositories(_) => {
project::Event::GitStateUpdated => {
log::debug!("Status of git repositories updated. Regenerating blame data...",);
this.generate(cx);
}

View File

@@ -339,7 +339,8 @@ impl EditorTestContext {
let mut found = None;
fs.with_git_state(&Self::root_path().join(".git"), false, |git_state| {
found = git_state.index_contents.get(path.as_ref()).cloned();
});
})
.unwrap();
assert_eq!(expected, found.as_deref());
}

View File

@@ -273,7 +273,7 @@ async fn run_evaluation(
let repos_dir = Path::new(EVAL_REPOS_DIR);
let db_path = Path::new(EVAL_DB_PATH);
let api_key = std::env::var("OPENAI_API_KEY").unwrap();
let fs = Arc::new(RealFs::new(None)) as Arc<dyn Fs>;
let fs = Arc::new(RealFs::new(None, cx.background_executor().clone())) as Arc<dyn Fs>;
let clock = Arc::new(RealSystemClock);
let client = cx
.update(|cx| {

View File

@@ -18,6 +18,7 @@ clap = { workspace = true, features = ["derive"] }
env_logger.workspace = true
extension.workspace = true
fs.workspace = true
gpui.workspace = true
language.workspace = true
log.workspace = true
reqwest_client.workspace = true

View File

@@ -34,7 +34,7 @@ async fn main() -> Result<()> {
env_logger::init();
let args = Args::parse();
let fs = Arc::new(RealFs::default());
let fs = Arc::new(RealFs::new(None, gpui::background_executor()));
let engine = wasmtime::Engine::default();
let mut wasm_store = WasmStore::new(&engine)?;

View File

@@ -477,7 +477,7 @@ async fn test_extension_store_with_test_extension(cx: &mut TestAppContext) {
let test_extension_id = "test-extension";
let test_extension_dir = root_dir.join("extensions").join(test_extension_id);
let fs = Arc::new(RealFs::default());
let fs = Arc::new(RealFs::new(None, cx.executor()));
let extensions_dir = TempTree::new(json!({
"installed": {},
"work": {}

View File

@@ -5,8 +5,8 @@ use futures::future::{self, BoxFuture};
use git::{
blame::Blame,
repository::{
AskPassSession, Branch, CommitDetails, GitRepository, GitRepositoryCheckpoint, PushOptions,
Remote, RepoPath, ResetMode,
AskPassSession, Branch, CommitDetails, GitIndex, GitRepository, GitRepositoryCheckpoint,
PushOptions, Remote, RepoPath, ResetMode,
},
status::{FileStatus, GitStatus, StatusCode, TrackedStatus, UnmergedStatus},
};
@@ -57,12 +57,14 @@ impl FakeGitRepository {
where
F: FnOnce(&mut FakeGitRepositoryState) -> T,
{
self.fs.with_git_state(&self.dot_git_path, false, f)
self.fs
.with_git_state(&self.dot_git_path, false, f)
.unwrap()
}
fn with_state_async<F, T>(&self, write: bool, f: F) -> BoxFuture<T>
fn with_state_async<F, T>(&self, write: bool, f: F) -> BoxFuture<'static, Result<T>>
where
F: 'static + Send + FnOnce(&mut FakeGitRepositoryState) -> T,
F: 'static + Send + FnOnce(&mut FakeGitRepositoryState) -> Result<T>,
T: Send,
{
let fs = self.fs.clone();
@@ -70,7 +72,7 @@ impl FakeGitRepository {
let dot_git_path = self.dot_git_path.clone();
async move {
executor.simulate_random_delay().await;
fs.with_git_state(&dot_git_path, write, f)
fs.with_git_state(&dot_git_path, write, f)?
}
.boxed()
}
@@ -79,16 +81,42 @@ impl FakeGitRepository {
impl GitRepository for FakeGitRepository {
fn reload_index(&self) {}
fn load_index_text(&self, path: RepoPath, _cx: AsyncApp) -> BoxFuture<Option<String>> {
self.with_state_async(false, move |state| {
state.index_contents.get(path.as_ref()).cloned()
})
fn load_index_text(
&self,
index: Option<GitIndex>,
path: RepoPath,
) -> BoxFuture<Option<String>> {
if index.is_some() {
unimplemented!();
}
async {
self.with_state_async(false, move |state| {
state
.index_contents
.get(path.as_ref())
.ok_or_else(|| anyhow!("not present in index"))
.cloned()
})
.await
.ok()
}
.boxed()
}
fn load_committed_text(&self, path: RepoPath, _cx: AsyncApp) -> BoxFuture<Option<String>> {
self.with_state_async(false, move |state| {
state.head_contents.get(path.as_ref()).cloned()
})
fn load_committed_text(&self, path: RepoPath) -> BoxFuture<Option<String>> {
async {
self.with_state_async(false, move |state| {
state
.head_contents
.get(path.as_ref())
.ok_or_else(|| anyhow!("not present in HEAD"))
.cloned()
})
.await
.ok()
}
.boxed()
}
fn set_index_text(
@@ -96,7 +124,6 @@ impl GitRepository for FakeGitRepository {
path: RepoPath,
content: Option<String>,
_env: HashMap<String, String>,
_cx: AsyncApp,
) -> BoxFuture<anyhow::Result<()>> {
self.with_state_async(true, move |state| {
if let Some(message) = state.simulated_index_write_error_message.clone() {
@@ -122,7 +149,7 @@ impl GitRepository for FakeGitRepository {
vec![]
}
fn show(&self, _commit: String, _cx: AsyncApp) -> BoxFuture<Result<CommitDetails>> {
fn show(&self, _commit: String) -> BoxFuture<Result<CommitDetails>> {
unimplemented!()
}
@@ -152,7 +179,20 @@ impl GitRepository for FakeGitRepository {
self.path()
}
fn status(&self, path_prefixes: &[RepoPath]) -> Result<GitStatus> {
fn status(
&self,
index: Option<GitIndex>,
path_prefixes: &[RepoPath],
) -> BoxFuture<'static, Result<GitStatus>> {
if index.is_some() {
unimplemented!();
}
let status = self.status_blocking(path_prefixes);
async move { status }.boxed()
}
fn status_blocking(&self, path_prefixes: &[RepoPath]) -> Result<GitStatus> {
let workdir_path = self.dot_git_path.parent().unwrap();
// Load gitignores
@@ -194,7 +234,7 @@ impl GitRepository for FakeGitRepository {
})
.collect();
self.with_state(|state| {
self.fs.with_git_state(&self.dot_git_path, false, |state| {
let mut entries = Vec::new();
let paths = state
.head_contents
@@ -278,7 +318,7 @@ impl GitRepository for FakeGitRepository {
Ok(GitStatus {
entries: entries.into(),
})
})
})?
}
fn branches(&self) -> BoxFuture<Result<Vec<Branch>>> {
@@ -297,26 +337,21 @@ impl GitRepository for FakeGitRepository {
})
}
fn change_branch(&self, name: String, _cx: AsyncApp) -> BoxFuture<Result<()>> {
fn change_branch(&self, name: String) -> BoxFuture<Result<()>> {
self.with_state_async(true, |state| {
state.current_branch_name = Some(name);
Ok(())
})
}
fn create_branch(&self, name: String, _: AsyncApp) -> BoxFuture<Result<()>> {
fn create_branch(&self, name: String) -> BoxFuture<Result<()>> {
self.with_state_async(true, move |state| {
state.branches.insert(name.to_owned());
Ok(())
})
}
fn blame(
&self,
path: RepoPath,
_content: Rope,
_cx: &mut AsyncApp,
) -> BoxFuture<Result<git::blame::Blame>> {
fn blame(&self, path: RepoPath, _content: Rope) -> BoxFuture<Result<git::blame::Blame>> {
self.with_state_async(false, move |state| {
state
.blames
@@ -330,7 +365,6 @@ impl GitRepository for FakeGitRepository {
&self,
_paths: Vec<RepoPath>,
_env: HashMap<String, String>,
_cx: AsyncApp,
) -> BoxFuture<Result<()>> {
unimplemented!()
}
@@ -339,7 +373,6 @@ impl GitRepository for FakeGitRepository {
&self,
_paths: Vec<RepoPath>,
_env: HashMap<String, String>,
_cx: AsyncApp,
) -> BoxFuture<Result<()>> {
unimplemented!()
}
@@ -349,7 +382,6 @@ impl GitRepository for FakeGitRepository {
_message: gpui::SharedString,
_name_and_email: Option<(gpui::SharedString, gpui::SharedString)>,
_env: HashMap<String, String>,
_cx: AsyncApp,
) -> BoxFuture<Result<()>> {
unimplemented!()
}
@@ -386,38 +418,23 @@ impl GitRepository for FakeGitRepository {
unimplemented!()
}
fn get_remotes(
&self,
_branch: Option<String>,
_cx: AsyncApp,
) -> BoxFuture<Result<Vec<Remote>>> {
fn get_remotes(&self, _branch: Option<String>) -> BoxFuture<Result<Vec<Remote>>> {
unimplemented!()
}
fn check_for_pushed_commit(
&self,
_cx: gpui::AsyncApp,
) -> BoxFuture<Result<Vec<gpui::SharedString>>> {
fn check_for_pushed_commit(&self) -> BoxFuture<Result<Vec<gpui::SharedString>>> {
future::ready(Ok(Vec::new())).boxed()
}
fn diff(
&self,
_diff: git::repository::DiffType,
_cx: gpui::AsyncApp,
) -> BoxFuture<Result<String>> {
fn diff(&self, _diff: git::repository::DiffType) -> BoxFuture<Result<String>> {
unimplemented!()
}
fn checkpoint(&self, _cx: AsyncApp) -> BoxFuture<Result<GitRepositoryCheckpoint>> {
fn checkpoint(&self) -> BoxFuture<'static, Result<GitRepositoryCheckpoint>> {
unimplemented!()
}
fn restore_checkpoint(
&self,
_checkpoint: GitRepositoryCheckpoint,
_cx: AsyncApp,
) -> BoxFuture<Result<()>> {
fn restore_checkpoint(&self, _checkpoint: GitRepositoryCheckpoint) -> BoxFuture<Result<()>> {
unimplemented!()
}
@@ -425,16 +442,27 @@ impl GitRepository for FakeGitRepository {
&self,
_left: GitRepositoryCheckpoint,
_right: GitRepositoryCheckpoint,
_cx: AsyncApp,
) -> BoxFuture<Result<bool>> {
unimplemented!()
}
fn delete_checkpoint(
fn delete_checkpoint(&self, _checkpoint: GitRepositoryCheckpoint) -> BoxFuture<Result<()>> {
unimplemented!()
}
fn diff_checkpoints(
&self,
_checkpoint: GitRepositoryCheckpoint,
_cx: AsyncApp,
) -> BoxFuture<Result<()>> {
_base_checkpoint: GitRepositoryCheckpoint,
_target_checkpoint: GitRepositoryCheckpoint,
) -> BoxFuture<Result<String>> {
unimplemented!()
}
fn create_index(&self) -> BoxFuture<Result<GitIndex>> {
unimplemented!()
}
fn apply_diff(&self, _index: GitIndex, _diff: String) -> BoxFuture<Result<()>> {
unimplemented!()
}
}


@@ -8,6 +8,7 @@ use anyhow::{anyhow, Context as _, Result};
#[cfg(any(target_os = "linux", target_os = "freebsd"))]
use ashpd::desktop::trash;
use gpui::App;
use gpui::BackgroundExecutor;
use gpui::Global;
use gpui::ReadGlobal as _;
use std::borrow::Cow;
@@ -240,9 +241,9 @@ impl From<MTime> for proto::Timestamp {
}
}
#[derive(Default)]
pub struct RealFs {
git_binary_path: Option<PathBuf>,
executor: BackgroundExecutor,
}
pub trait FileHandle: Send + Sync + std::fmt::Debug {
@@ -294,8 +295,11 @@ impl FileHandle for std::fs::File {
pub struct RealWatcher {}
impl RealFs {
pub fn new(git_binary_path: Option<PathBuf>) -> Self {
Self { git_binary_path }
pub fn new(git_binary_path: Option<PathBuf>, executor: BackgroundExecutor) -> Self {
Self {
git_binary_path,
executor,
}
}
}
@@ -754,6 +758,7 @@ impl Fs for RealFs {
Some(Arc::new(RealGitRepository::new(
dotgit_path,
self.git_binary_path.clone(),
self.executor.clone(),
)?))
}
@@ -1248,12 +1253,12 @@ impl FakeFs {
.boxed()
}
pub fn with_git_state<T, F>(&self, dot_git: &Path, emit_git_event: bool, f: F) -> T
pub fn with_git_state<T, F>(&self, dot_git: &Path, emit_git_event: bool, f: F) -> Result<T>
where
F: FnOnce(&mut FakeGitRepositoryState) -> T,
{
let mut state = self.state.lock();
let entry = state.read_path(dot_git).unwrap();
let entry = state.read_path(dot_git).context("open .git")?;
let mut entry = entry.lock();
if let FakeFsEntry::Dir { git_repo_state, .. } = &mut *entry {
@@ -1271,9 +1276,9 @@ impl FakeFs {
state.emit_event([(dot_git, None)]);
}
result
Ok(result)
} else {
panic!("not a directory");
Err(anyhow!("not a directory"))
}
}
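The `with_git_state` change above converts a panicking path lookup into a `Result`, which is why every caller in the following hunks gains a trailing `.unwrap()`. A minimal standalone sketch of that refactor pattern (the names and state shape here are illustrative, not Zed's actual types):

```rust
use std::fmt;

// The error that replaces the old `panic!("not a directory")`.
#[derive(Debug)]
struct NotADirectory;

impl fmt::Display for NotADirectory {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "not a directory")
    }
}
impl std::error::Error for NotADirectory {}

struct State {
    branches: Vec<String>,
}

// Previously this helper panicked on a bad entry; now it returns `Result`,
// and callers that relied on the infallible behavior unwrap explicitly.
fn with_state<T>(
    is_dir: bool,
    f: impl FnOnce(&mut State) -> T,
) -> Result<T, NotADirectory> {
    if !is_dir {
        return Err(NotADirectory);
    }
    let mut state = State { branches: Vec::new() };
    Ok(f(&mut state))
}

fn main() {
    // Call sites that want the old panicking behavior unwrap explicitly.
    let count = with_state(true, |state| {
        state.branches.push("main".to_string());
        state.branches.len()
    })
    .unwrap();
    assert_eq!(count, 1);

    // Callers that can recover now see an error instead of a panic.
    assert!(with_state(false, |_| ()).is_err());
}
```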
@@ -1283,6 +1288,7 @@ impl FakeFs {
state.branches.extend(branch.clone());
state.current_branch_name = branch
})
.unwrap();
}
pub fn insert_branches(&self, dot_git: &Path, branches: &[&str]) {
@@ -1296,6 +1302,7 @@ impl FakeFs {
.branches
.extend(branches.iter().map(ToString::to_string));
})
.unwrap();
}
pub fn set_unmerged_paths_for_repo(
@@ -1310,7 +1317,8 @@ impl FakeFs {
.iter()
.map(|(path, content)| (path.clone(), *content)),
);
});
})
.unwrap();
}
pub fn set_index_for_repo(&self, dot_git: &Path, index_state: &[(RepoPath, String)]) {
@@ -1321,7 +1329,8 @@ impl FakeFs {
.iter()
.map(|(path, content)| (path.clone(), content.clone())),
);
});
})
.unwrap();
}
pub fn set_head_for_repo(&self, dot_git: &Path, head_state: &[(RepoPath, String)]) {
@@ -1332,7 +1341,8 @@ impl FakeFs {
.iter()
.map(|(path, content)| (path.clone(), content.clone())),
);
});
})
.unwrap();
}
pub fn set_git_content_for_repo(
@@ -1356,7 +1366,8 @@ impl FakeFs {
)
},
));
});
})
.unwrap();
}
pub fn set_head_and_index_for_repo(
@@ -1371,14 +1382,16 @@ impl FakeFs {
state
.index_contents
.extend(contents_by_path.iter().cloned());
});
})
.unwrap();
}
pub fn set_blame_for_repo(&self, dot_git: &Path, blames: Vec<(RepoPath, git::blame::Blame)>) {
self.with_git_state(dot_git, true, |state| {
state.blames.clear();
state.blames.extend(blames);
});
})
.unwrap();
}
/// Put the given git repository into a state with the given status,
@@ -1460,13 +1473,14 @@ impl FakeFs {
state.head_contents.insert(repo_path.clone(), content);
}
}
});
}).unwrap();
}
pub fn set_error_message_for_index_write(&self, dot_git: &Path, message: Option<String>) {
self.with_git_state(dot_git, true, |state| {
state.simulated_index_write_error_message = message;
});
})
.unwrap();
}
pub fn paths(&self, include_dot_git: bool) -> Vec<PathBuf> {


@@ -38,9 +38,6 @@ impl Watcher for FsWatcher {
EventKind::Create(_) => Some(PathEventKind::Created),
EventKind::Modify(_) => Some(PathEventKind::Changed),
EventKind::Remove(_) => Some(PathEventKind::Removed),
// Adding this fix a weird bug on Linux after upgrading notify
// https://github.com/zed-industries/zed/actions/runs/14085230504/job/39449448832
EventKind::Access(_) => return,
_ => None,
};
let mut path_events = event
@@ -108,7 +105,14 @@ static FS_WATCHER_INSTANCE: OnceLock<anyhow::Result<GlobalWatcher, notify::Error
OnceLock::new();
fn handle_event(event: Result<notify::Event, notify::Error>) {
let Some(event) = event.log_err() else { return };
// Filter out access events, which could lead to a weird bug on Linux after upgrading notify
// https://github.com/zed-industries/zed/actions/runs/14085230504/job/39449448832
let Some(event) = event
.log_err()
.filter(|event| !matches!(event.kind, EventKind::Access(_)))
else {
return;
};
global::<()>(move |watcher| {
for f in watcher.watchers.lock().iter() {
f(&event)

File diff suppressed because it is too large.


@@ -1,7 +1,7 @@
use crate::repository::RepoPath;
use anyhow::{anyhow, Result};
use serde::{Deserialize, Serialize};
use std::{path::Path, process::Stdio, sync::Arc};
use std::{path::Path, str::FromStr, sync::Arc};
use util::ResultExt;
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash, Serialize, Deserialize)]
@@ -438,50 +438,16 @@ impl std::ops::Sub for GitSummary {
}
}
#[derive(Clone)]
#[derive(Clone, Debug)]
pub struct GitStatus {
pub entries: Arc<[(RepoPath, FileStatus)]>,
}
impl GitStatus {
pub(crate) fn new(
git_binary: &Path,
working_directory: &Path,
path_prefixes: &[RepoPath],
) -> Result<Self> {
let child = util::command::new_std_command(git_binary)
.current_dir(working_directory)
.args([
"--no-optional-locks",
"status",
"--porcelain=v1",
"--untracked-files=all",
"--no-renames",
"-z",
])
.args(path_prefixes.iter().map(|path_prefix| {
if path_prefix.0.as_ref() == Path::new("") {
Path::new(".")
} else {
path_prefix
}
}))
.stdin(Stdio::null())
.stdout(Stdio::piped())
.stderr(Stdio::piped())
.spawn()
.map_err(|e| anyhow!("Failed to start git status process: {e}"))?;
impl FromStr for GitStatus {
type Err = anyhow::Error;
let output = child
.wait_with_output()
.map_err(|e| anyhow!("Failed to read git status output: {e}"))?;
if !output.status.success() {
let stderr = String::from_utf8_lossy(&output.stderr);
return Err(anyhow!("git status process failed: {stderr}"));
}
let stdout = String::from_utf8_lossy(&output.stdout);
let mut entries = stdout
fn from_str(s: &str) -> Result<Self> {
let mut entries = s
.split('\0')
.filter_map(|entry| {
let sep = entry.get(2..3)?;
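The hunk above replaces `GitStatus::new` (which spawned a `git status` subprocess) with a `FromStr` impl that parses porcelain output handed in as a string. A simplified, standalone sketch of parsing that NUL-separated `--porcelain=v1 -z` entry format (not Zed's actual parser — status handling is reduced to raw two-character codes):

```rust
// Each `-z` entry is "XY <path>" terminated by NUL: two status
// characters, a separator space, then the path.
fn parse_porcelain(s: &str) -> Vec<(String, String)> {
    s.split('\0')
        .filter_map(|entry| {
            // Byte 2 must be the separator space, else skip the entry
            // (this also drops the empty string after the trailing NUL).
            let sep = entry.get(2..3)?;
            if sep != " " {
                return None;
            }
            let status = entry.get(0..2)?.to_string();
            let path = entry.get(3..)?.to_string();
            Some((status, path))
        })
        .collect()
}

fn main() {
    let raw = " M src/main.rs\0?? notes.txt\0";
    let entries = parse_porcelain(raw);
    assert_eq!(entries[0], (" M".to_string(), "src/main.rs".to_string()));
    assert_eq!(entries[1], ("??".to_string(), "notes.txt".to_string()));
}
```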


@@ -3,7 +3,7 @@ use crate::commit_modal::CommitModal;
use crate::git_panel_settings::StatusStyle;
use crate::project_diff::Diff;
use crate::remote_output::{self, RemoteAction, SuccessMessage};
use crate::repository_selector::filtered_repository_entries;
use crate::{branch_picker, render_remote_button};
use crate::{
git_panel_settings::GitPanelSettings, git_status_icon, repository_selector::RepositorySelector,
@@ -63,7 +63,7 @@ use ui::{
Tooltip,
};
use util::{maybe, post_inc, ResultExt, TryFutureExt};
use workspace::{AppState, OpenOptions, OpenVisible};
use workspace::AppState;
use notifications::status_toast::{StatusToast, ToastIcon};
use workspace::{
@@ -195,7 +195,6 @@ impl GitListEntry {
#[derive(Debug, PartialEq, Eq, Clone)]
pub struct GitStatusEntry {
pub(crate) repo_path: RepoPath,
pub(crate) worktree_path: Arc<Path>,
pub(crate) abs_path: PathBuf,
pub(crate) status: FileStatus,
pub(crate) staging: StageStatus,
@@ -203,14 +202,14 @@ pub struct GitStatusEntry {
impl GitStatusEntry {
fn display_name(&self) -> String {
self.worktree_path
self.repo_path
.file_name()
.map(|name| name.to_string_lossy().into_owned())
.unwrap_or_else(|| self.worktree_path.to_string_lossy().into_owned())
.unwrap_or_else(|| self.repo_path.to_string_lossy().into_owned())
}
fn parent_dir(&self) -> Option<String> {
self.worktree_path
self.repo_path
.parent()
.map(|parent| parent.to_string_lossy().into_owned())
}
@@ -652,7 +651,7 @@ impl GitPanel {
let Some(git_repo) = self.active_repository.as_ref() else {
return;
};
let Some(repo_path) = git_repo.read(cx).project_path_to_repo_path(&path) else {
let Some(repo_path) = git_repo.read(cx).project_path_to_repo_path(&path, cx) else {
return;
};
let Some(ix) = self.entry_by_path(&repo_path) else {
@@ -865,7 +864,7 @@ impl GitPanel {
if Some(&entry.repo_path)
== git_repo
.read(cx)
.project_path_to_repo_path(&project_path)
.project_path_to_repo_path(&project_path, cx)
.as_ref()
{
project_diff.focus_handle(cx).focus(window);
@@ -875,31 +874,12 @@ impl GitPanel {
}
};
if entry.worktree_path.starts_with("..") {
self.workspace
.update(cx, |workspace, cx| {
workspace
.open_abs_path(
entry.abs_path.clone(),
OpenOptions {
visible: Some(OpenVisible::All),
focus: Some(false),
..Default::default()
},
window,
cx,
)
.detach_and_log_err(cx);
})
.ok();
} else {
self.workspace
.update(cx, |workspace, cx| {
ProjectDiff::deploy_at(workspace, Some(entry.clone()), window, cx);
})
.ok();
self.focus_handle.focus(window);
}
self.workspace
.update(cx, |workspace, cx| {
ProjectDiff::deploy_at(workspace, Some(entry.clone()), window, cx);
})
.ok();
self.focus_handle.focus(window);
Some(())
});
@@ -916,7 +896,7 @@ impl GitPanel {
let active_repo = self.active_repository.as_ref()?;
let path = active_repo
.read(cx)
.repo_path_to_project_path(&entry.repo_path)?;
.repo_path_to_project_path(&entry.repo_path, cx)?;
if entry.status.is_deleted() {
return None;
}
@@ -952,9 +932,9 @@ impl GitPanel {
&format!(
"Are you sure you want to restore {}?",
entry
.worktree_path
.repo_path
.file_name()
.unwrap_or(entry.worktree_path.as_os_str())
.unwrap_or(entry.repo_path.as_os_str())
.to_string_lossy()
),
None,
@@ -992,7 +972,7 @@ impl GitPanel {
let active_repo = self.active_repository.clone()?;
let path = active_repo
.read(cx)
.repo_path_to_project_path(&entry.repo_path)?;
.repo_path_to_project_path(&entry.repo_path, cx)?;
let workspace = self.workspace.clone();
if entry.status.staging().has_staged() {
@@ -1052,7 +1032,7 @@ impl GitPanel {
.filter_map(|entry| {
let path = active_repository
.read(cx)
.repo_path_to_project_path(&entry.repo_path)?;
.repo_path_to_project_path(&entry.repo_path, cx)?;
Some(project.open_buffer(path, cx))
})
.collect()
@@ -1218,7 +1198,7 @@ impl GitPanel {
workspace.project().update(cx, |project, cx| {
let project_path = active_repo
.read(cx)
.repo_path_to_project_path(&entry.repo_path)?;
.repo_path_to_project_path(&entry.repo_path, cx)?;
project.delete_file(project_path, true, cx)
})
})
@@ -2279,7 +2259,7 @@ impl GitPanel {
let repo = repo.read(cx);
for entry in repo.status() {
for entry in repo.cached_status() {
let is_conflict = repo.has_conflict(&entry.repo_path);
let is_new = entry.status.is_created();
let staging = entry.status.staging();
@@ -2295,16 +2275,12 @@ impl GitPanel {
continue;
}
// dot_git_abs path always has at least one component, namely .git.
let abs_path = repo
.dot_git_abs_path
.parent()
.unwrap()
.join(&entry.repo_path);
let worktree_path = repo.repository_entry.unrelativize(&entry.repo_path);
.repository_entry
.work_directory_abs_path
.join(&entry.repo_path.0);
let entry = GitStatusEntry {
repo_path: entry.repo_path.clone(),
worktree_path,
abs_path,
status: entry.status,
staging,
@@ -2883,7 +2859,6 @@ impl GitPanel {
) -> Option<impl IntoElement> {
let active_repository = self.active_repository.clone()?;
let (can_commit, tooltip) = self.configure_commit_button(cx);
let project = self.project.clone().read(cx);
let panel_editor_style = panel_editor_style(true, window, cx);
let enable_coauthors = self.render_co_authors(cx);
@@ -2907,7 +2882,7 @@ impl GitPanel {
let display_name = SharedString::from(Arc::from(
active_repository
.read(cx)
.display_name(project, cx)
.display_name()
.trim_end_matches("/"),
));
let editor_is_long = self.commit_editor.update(cx, |editor, cx| {
@@ -3236,7 +3211,8 @@ impl GitPanel {
cx: &App,
) -> Option<AnyElement> {
let repo = self.active_repository.as_ref()?.read(cx);
let repo_path = repo.worktree_id_path_to_repo_path(file.worktree_id(cx), file.path())?;
let project_path = (file.worktree_id(cx), file.path()).into();
let repo_path = repo.project_path_to_repo_path(&project_path, cx)?;
let ix = self.entry_by_path(&repo_path)?;
let entry = self.entries.get(ix)?;
@@ -4056,9 +4032,7 @@ impl RenderOnce for PanelRepoFooter {
let single_repo = project
.as_ref()
.map(|project| {
filtered_repository_entries(project.read(cx).git_store().read(cx), cx).len() == 1
})
.map(|project| project.read(cx).git_store().read(cx).repositories().len() == 1)
.unwrap_or(true);
const MAX_BRANCH_LEN: usize = 16;
@@ -4558,66 +4532,65 @@ mod tests {
GitListEntry::GitStatusEntry(GitStatusEntry {
abs_path: path!("/root/zed/crates/gpui/gpui.rs").into(),
repo_path: "crates/gpui/gpui.rs".into(),
worktree_path: Path::new("gpui.rs").into(),
status: StatusCode::Modified.worktree(),
staging: StageStatus::Unstaged,
}),
GitListEntry::GitStatusEntry(GitStatusEntry {
abs_path: path!("/root/zed/crates/util/util.rs").into(),
repo_path: "crates/util/util.rs".into(),
worktree_path: Path::new("../util/util.rs").into(),
status: StatusCode::Modified.worktree(),
staging: StageStatus::Unstaged,
},),
],
);
cx.update_window_entity(&panel, |panel, window, cx| {
panel.select_last(&Default::default(), window, cx);
assert_eq!(panel.selected_entry, Some(2));
panel.open_diff(&Default::default(), window, cx);
});
cx.run_until_parked();
// TODO(cole) restore this once repository deduplication is implemented properly.
//cx.update_window_entity(&panel, |panel, window, cx| {
// panel.select_last(&Default::default(), window, cx);
// assert_eq!(panel.selected_entry, Some(2));
// panel.open_diff(&Default::default(), window, cx);
//});
//cx.run_until_parked();
let worktree_roots = workspace.update(cx, |workspace, cx| {
workspace
.worktrees(cx)
.map(|worktree| worktree.read(cx).abs_path())
.collect::<Vec<_>>()
});
pretty_assertions::assert_eq!(
worktree_roots,
vec![
Path::new(path!("/root/zed/crates/gpui")).into(),
Path::new(path!("/root/zed/crates/util/util.rs")).into(),
]
);
//let worktree_roots = workspace.update(cx, |workspace, cx| {
// workspace
// .worktrees(cx)
// .map(|worktree| worktree.read(cx).abs_path())
// .collect::<Vec<_>>()
//});
//pretty_assertions::assert_eq!(
// worktree_roots,
// vec![
// Path::new(path!("/root/zed/crates/gpui")).into(),
// Path::new(path!("/root/zed/crates/util/util.rs")).into(),
// ]
//);
project.update(cx, |project, cx| {
let git_store = project.git_store().read(cx);
// The repo that comes from the single-file worktree can't be selected through the UI.
let filtered_entries = filtered_repository_entries(git_store, cx)
.iter()
.map(|repo| repo.read(cx).worktree_abs_path.clone())
.collect::<Vec<_>>();
assert_eq!(
filtered_entries,
[Path::new(path!("/root/zed/crates/gpui")).into()]
);
// But we can select it artificially here.
let repo_from_single_file_worktree = git_store
.repositories()
.values()
.find(|repo| {
repo.read(cx).worktree_abs_path.as_ref()
== Path::new(path!("/root/zed/crates/util/util.rs"))
})
.unwrap()
.clone();
//project.update(cx, |project, cx| {
// let git_store = project.git_store().read(cx);
// // The repo that comes from the single-file worktree can't be selected through the UI.
// let filtered_entries = filtered_repository_entries(git_store, cx)
// .iter()
// .map(|repo| repo.read(cx).worktree_abs_path.clone())
// .collect::<Vec<_>>();
// assert_eq!(
// filtered_entries,
// [Path::new(path!("/root/zed/crates/gpui")).into()]
// );
// // But we can select it artificially here.
// let repo_from_single_file_worktree = git_store
// .repositories()
// .values()
// .find(|repo| {
// repo.read(cx).worktree_abs_path.as_ref()
// == Path::new(path!("/root/zed/crates/util/util.rs"))
// })
// .unwrap()
// .clone();
// Paths still make sense when we somehow activate a repo that comes from a single-file worktree.
repo_from_single_file_worktree.update(cx, |repo, cx| repo.set_as_active_repository(cx));
});
// // Paths still make sense when we somehow activate a repo that comes from a single-file worktree.
// repo_from_single_file_worktree.update(cx, |repo, cx| repo.set_as_active_repository(cx));
//});
let handle = cx.update_window_entity(&panel, |panel, _, _| {
std::mem::replace(&mut panel.update_visible_entries_task, Task::ready(()))
@@ -4634,14 +4607,12 @@ mod tests {
GitListEntry::GitStatusEntry(GitStatusEntry {
abs_path: path!("/root/zed/crates/gpui/gpui.rs").into(),
repo_path: "crates/gpui/gpui.rs".into(),
worktree_path: Path::new("../../gpui/gpui.rs").into(),
status: StatusCode::Modified.worktree(),
staging: StageStatus::Unstaged,
}),
GitListEntry::GitStatusEntry(GitStatusEntry {
abs_path: path!("/root/zed/crates/util/util.rs").into(),
repo_path: "crates/util/util.rs".into(),
worktree_path: Path::new("util.rs").into(),
status: StatusCode::Modified.worktree(),
staging: StageStatus::Unstaged,
},),


@@ -26,7 +26,11 @@ use project::{
git_store::{GitEvent, GitStore},
Project, ProjectPath,
};
use std::any::{Any, TypeId};
use std::{
any::{Any, TypeId},
path::Path,
sync::Arc,
};
use theme::ActiveTheme;
use ui::{prelude::*, vertical_divider, KeyBinding, Tooltip};
use util::ResultExt as _;
@@ -39,7 +43,67 @@ use workspace::{
actions!(git, [Diff, Add]);
#[derive(Debug)]
pub struct StatusEntry {
pub project_path: ProjectPath,
pub status: FileStatus,
pub has_conflict: bool,
}
pub trait DiffSource {
fn observe(&self, cx: &mut App, f: Box<dyn FnMut(&mut App) + Send>) -> Subscription;
fn status(&self, cx: &App) -> Vec<StatusEntry>;
fn open_uncommitted_diff(
&self,
buffer: Entity<Buffer>,
cx: &mut App,
) -> Task<Result<Entity<BufferDiff>>>;
}
pub struct ProjectDiffSource(Entity<Project>);
impl DiffSource for ProjectDiffSource {
fn observe(&self, cx: &mut App, mut f: Box<dyn FnMut(&mut App) + Send>) -> Subscription {
let git_store = self.0.read(cx).git_store().clone();
cx.subscribe(&git_store, move |_git_store, event, cx| match event {
GitEvent::ActiveRepositoryChanged
| GitEvent::FileSystemUpdated
| GitEvent::GitStateUpdated => f(cx),
_ => {}
})
}
fn status(&self, cx: &App) -> Vec<StatusEntry> {
let mut result = Vec::new();
if let Some(git_repo) = self.0.read(cx).git_store().read(cx).active_repository() {
let git_repo = git_repo.read(cx);
for entry in git_repo.cached_status() {
if let Some(project_path) = git_repo.repo_path_to_project_path(&entry.repo_path, cx)
{
let has_conflict = git_repo.has_conflict(&entry.repo_path);
result.push(StatusEntry {
project_path,
status: entry.status,
has_conflict,
});
}
}
}
result
}
fn open_uncommitted_diff(
&self,
buffer: Entity<Buffer>,
cx: &mut App,
) -> Task<Result<Entity<BufferDiff>>> {
self.0
.update(cx, |project, cx| project.open_uncommitted_diff(buffer, cx))
}
}
pub struct ProjectDiff {
source: Arc<dyn DiffSource>,
project: Entity<Project>,
multibuffer: Entity<MultiBuffer>,
editor: Entity<Editor>,
@@ -102,8 +166,16 @@ impl ProjectDiff {
existing
} else {
let workspace_handle = cx.entity();
let project_diff =
cx.new(|cx| Self::new(workspace.project().clone(), workspace_handle, window, cx));
let source = Arc::new(ProjectDiffSource(workspace.project().clone()));
let project_diff = cx.new(|cx| {
Self::new(
source,
workspace.project().clone(),
workspace_handle,
window,
cx,
)
});
workspace.add_item_to_active_pane(
Box::new(project_diff.clone()),
None,
@@ -126,7 +198,8 @@ impl ProjectDiff {
})
}
fn new(
pub fn new(
source: Arc<dyn DiffSource>,
project: Entity<Project>,
workspace: Entity<Workspace>,
window: &mut Window,
@@ -149,17 +222,16 @@ impl ProjectDiff {
.detach();
let git_store = project.read(cx).git_store().clone();
let git_store_subscription = cx.subscribe_in(
&git_store,
window,
move |this, _git_store, event, _window, _cx| match event {
GitEvent::ActiveRepositoryChanged
| GitEvent::FileSystemUpdated
| GitEvent::GitStateUpdated => {
*this.update_needed.borrow_mut() = ();
}
_ => {}
},
let weak_this = cx.weak_entity();
let diff_source_subscription = source.observe(
cx,
Box::new(move |cx| {
weak_this
.update(cx, |this, _| {
*this.update_needed.borrow_mut() = ();
})
.ok();
}),
);
let (mut send, recv) = postage::watch::channel::<()>();
@@ -171,6 +243,7 @@ impl ProjectDiff {
*send.borrow_mut() = ();
Self {
source,
project,
git_store: git_store.clone(),
workspace: workspace.downgrade(),
@@ -181,7 +254,7 @@ impl ProjectDiff {
update_needed: send,
current_branch: None,
_task: worker,
_subscription: git_store_subscription,
_subscription: diff_source_subscription,
}
}
@@ -328,55 +401,53 @@ impl ProjectDiff {
}
fn load_buffers(&mut self, cx: &mut Context<Self>) -> Vec<Task<Result<DiffBuffer>>> {
let Some(repo) = self.git_store.read(cx).active_repository() else {
self.multibuffer.update(cx, |multibuffer, cx| {
multibuffer.clear(cx);
});
return vec![];
};
let mut previous_paths = self.multibuffer.read(cx).paths().collect::<HashSet<_>>();
let mut result = vec![];
repo.update(cx, |repo, cx| {
for entry in repo.status() {
if !entry.status.has_changes() {
continue;
}
let Some(project_path) = repo.repo_path_to_project_path(&entry.repo_path) else {
continue;
};
let namespace = if repo.has_conflict(&entry.repo_path) {
CONFLICT_NAMESPACE
} else if entry.status.is_created() {
NEW_NAMESPACE
} else {
TRACKED_NAMESPACE
};
let path_key = PathKey::namespaced(namespace, entry.repo_path.0.clone());
previous_paths.remove(&path_key);
let load_buffer = self
.project
.update(cx, |project, cx| project.open_buffer(project_path, cx));
let project = self.project.clone();
result.push(cx.spawn(async move |_, cx| {
let buffer = load_buffer.await?;
let changes = project
.update(cx, |project, cx| {
project.open_uncommitted_diff(buffer.clone(), cx)
})?
.await?;
Ok(DiffBuffer {
path_key,
buffer,
diff: changes,
file_status: entry.status,
})
}));
for entry in self.source.status(cx) {
if !entry.status.has_changes() {
continue;
}
});
let Some(worktree) = self
.project
.read(cx)
.worktree_for_id(entry.project_path.worktree_id, cx)
else {
continue;
};
let full_path =
Arc::from(Path::new(worktree.read(cx).root_name()).join(&entry.project_path.path));
let namespace = if entry.has_conflict {
CONFLICT_NAMESPACE
} else if entry.status.is_created() {
NEW_NAMESPACE
} else {
TRACKED_NAMESPACE
};
let path_key = PathKey::namespaced(namespace, full_path);
previous_paths.remove(&path_key);
let load_buffer = self.project.update(cx, |project, cx| {
project.open_buffer(entry.project_path, cx)
});
let source = self.source.clone();
result.push(cx.spawn(async move |_, cx| {
let buffer = load_buffer.await?;
let changes = cx
.update(|cx| source.open_uncommitted_diff(buffer.clone(), cx))?
.await?;
Ok(DiffBuffer {
path_key,
buffer,
diff: changes,
file_status: entry.status,
})
}));
}
self.multibuffer.update(cx, |multibuffer, cx| {
for path in previous_paths {
multibuffer.remove_excerpts_for_path(path, cx);
@@ -585,7 +656,15 @@ impl Item for ProjectDiff {
Self: Sized,
{
let workspace = self.workspace.upgrade()?;
Some(cx.new(|cx| ProjectDiff::new(self.project.clone(), workspace, window, cx)))
Some(cx.new(|cx| {
ProjectDiff::new(
self.source.clone(),
self.project.clone(),
workspace,
window,
cx,
)
}))
}
fn is_dirty(&self, cx: &App) -> bool {
@@ -743,7 +822,7 @@ impl SerializableItem for ProjectDiff {
}
fn deserialize(
_project: Entity<Project>,
project: Entity<Project>,
workspace: WeakEntity<Workspace>,
_workspace_id: workspace::WorkspaceId,
_item_id: workspace::ItemId,
@@ -753,7 +832,16 @@ impl SerializableItem for ProjectDiff {
window.spawn(cx, async move |cx| {
workspace.update_in(cx, |workspace, window, cx| {
let workspace_handle = cx.entity();
cx.new(|cx| Self::new(workspace.project().clone(), workspace_handle, window, cx))
let diff = Arc::new(ProjectDiffSource(project));
cx.new(|cx| {
Self::new(
diff,
workspace.project().clone(),
workspace_handle,
window,
cx,
)
})
})
})
}
@@ -1337,8 +1425,9 @@ mod tests {
let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let source = Arc::new(ProjectDiffSource(project.clone()));
let diff = cx.new_window_entity(|window, cx| {
ProjectDiff::new(project.clone(), workspace, window, cx)
ProjectDiff::new(source, project.clone(), workspace, window, cx)
});
cx.run_until_parked();
@@ -1391,8 +1480,9 @@ mod tests {
let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let source = Arc::new(ProjectDiffSource(project.clone()));
let diff = cx.new_window_entity(|window, cx| {
ProjectDiff::new(project.clone(), workspace, window, cx)
ProjectDiff::new(source, project.clone(), workspace, window, cx)
});
cx.run_until_parked();
@@ -1464,6 +1554,7 @@ mod tests {
let project = Project::test(fs.clone(), [path!("/project").as_ref()], cx).await;
let (workspace, cx) =
cx.add_window_view(|window, cx| Workspace::test_new(project.clone(), window, cx));
let source = Arc::new(ProjectDiffSource(project.clone()));
let buffer = project
.update(cx, |project, cx| {
project.open_local_buffer(path!("/project/foo"), cx)
@@ -1474,7 +1565,7 @@ mod tests {
Editor::for_buffer(buffer, Some(project.clone()), window, cx)
});
let diff = cx.new_window_entity(|window, cx| {
ProjectDiff::new(project.clone(), workspace, window, cx)
ProjectDiff::new(source, project.clone(), workspace, window, cx)
});
cx.run_until_parked();


@@ -3,10 +3,7 @@ use gpui::{
};
use itertools::Itertools;
use picker::{Picker, PickerDelegate};
use project::{
git_store::{GitStore, Repository},
Project,
};
use project::{git_store::Repository, Project};
use std::sync::Arc;
use ui::{prelude::*, ListItem, ListItemSpacing};
use workspace::{ModalView, Workspace};
@@ -40,21 +37,23 @@ impl RepositorySelector {
cx: &mut Context<Self>,
) -> Self {
let git_store = project_handle.read(cx).git_store().clone();
let repository_entries = git_store.update(cx, |git_store, cx| {
filtered_repository_entries(git_store, cx)
let repository_entries = git_store.update(cx, |git_store, _cx| {
git_store
.repositories()
.values()
.cloned()
.collect::<Vec<_>>()
});
let project = project_handle.read(cx);
let filtered_repositories = repository_entries.clone();
let widest_item_ix = repository_entries.iter().position_max_by(|a, b| {
a.read(cx)
.display_name(project, cx)
.display_name()
.len()
.cmp(&b.read(cx).display_name(project, cx).len())
.cmp(&b.read(cx).display_name().len())
});
let delegate = RepositorySelectorDelegate {
project: project_handle.downgrade(),
repository_selector: cx.entity().downgrade(),
repository_entries,
filtered_repositories,
@@ -71,36 +70,36 @@ impl RepositorySelector {
}
}
pub(crate) fn filtered_repository_entries(
git_store: &GitStore,
cx: &App,
) -> Vec<Entity<Repository>> {
let repositories = git_store
.repositories()
.values()
.sorted_by_key(|repo| {
let repo = repo.read(cx);
(
repo.dot_git_abs_path.clone(),
repo.worktree_abs_path.clone(),
)
})
.collect::<Vec<&Entity<Repository>>>();
repositories
.chunk_by(|a, b| a.read(cx).dot_git_abs_path == b.read(cx).dot_git_abs_path)
.flat_map(|chunk| {
let has_non_single_file_worktree = chunk
.iter()
.any(|repo| !repo.read(cx).is_from_single_file_worktree);
chunk.iter().filter(move |repo| {
// Remove any entry that comes from a single file worktree and represents a repository that is also represented by a non-single-file worktree.
!repo.read(cx).is_from_single_file_worktree || !has_non_single_file_worktree
})
})
.map(|&repo| repo.clone())
.collect()
}
//pub(crate) fn filtered_repository_entries(
// git_store: &GitStore,
// cx: &App,
//) -> Vec<Entity<Repository>> {
// let repositories = git_store
// .repositories()
// .values()
// .sorted_by_key(|repo| {
// let repo = repo.read(cx);
// (
// repo.dot_git_abs_path.clone(),
// repo.worktree_abs_path.clone(),
// )
// })
// .collect::<Vec<&Entity<Repository>>>();
//
// repositories
// .chunk_by(|a, b| a.read(cx).dot_git_abs_path == b.read(cx).dot_git_abs_path)
// .flat_map(|chunk| {
// let has_non_single_file_worktree = chunk
// .iter()
// .any(|repo| !repo.read(cx).is_from_single_file_worktree);
// chunk.iter().filter(move |repo| {
// // Remove any entry that comes from a single file worktree and represents a repository that is also represented by a non-single-file worktree.
// !repo.read(cx).is_from_single_file_worktree || !has_non_single_file_worktree
// })
// })
// .map(|&repo| repo.clone())
// .collect()
//}
impl EventEmitter<DismissEvent> for RepositorySelector {}
@@ -119,7 +118,6 @@ impl Render for RepositorySelector {
impl ModalView for RepositorySelector {}
pub struct RepositorySelectorDelegate {
project: WeakEntity<Project>,
repository_selector: WeakEntity<RepositorySelector>,
repository_entries: Vec<Entity<Repository>>,
filtered_repositories: Vec<Entity<Repository>>,
@@ -225,9 +223,8 @@ impl PickerDelegate for RepositorySelectorDelegate {
_window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<Self::ListItem> {
let project = self.project.upgrade()?;
let repo_info = self.filtered_repositories.get(ix)?;
let display_name = repo_info.read(cx).display_name(project.read(cx), cx);
let display_name = repo_info.read(cx).display_name();
Some(
ListItem::new(ix)
.inset(true)


@@ -74,6 +74,11 @@ pub(crate) use windows::*;
#[cfg(any(test, feature = "test-support"))]
pub use test::TestScreenCaptureSource;
/// Returns a background executor for the current platform.
pub fn background_executor() -> BackgroundExecutor {
current_platform(true).background_executor()
}
#[cfg(target_os = "macos")]
pub(crate) fn current_platform(headless: bool) -> Rc<dyn Platform> {
Rc::new(MacPlatform::new(headless))


@@ -14,6 +14,7 @@ use futures::FutureExt;
use futures::{future::BoxFuture, stream::BoxStream, StreamExt, TryStreamExt as _};
use gpui::{AnyElement, AnyView, App, AsyncApp, SharedString, Task, Window};
use icons::IconName;
use parking_lot::Mutex;
use proto::Plan;
use schemars::JsonSchema;
use serde::{de::DeserializeOwned, Deserialize, Serialize};
@@ -141,6 +142,8 @@ pub struct LanguageModelToolUse {
pub struct LanguageModelTextStream {
pub message_id: Option<String>,
pub stream: BoxStream<'static, Result<String>>,
// Has complete token usage after the stream has finished
pub last_token_usage: Arc<Mutex<TokenUsage>>,
}
impl Default for LanguageModelTextStream {
@@ -148,6 +151,7 @@ impl Default for LanguageModelTextStream {
Self {
message_id: None,
stream: Box::pin(futures::stream::empty()),
last_token_usage: Arc::new(Mutex::new(TokenUsage::default())),
}
}
}
@@ -200,6 +204,7 @@ pub trait LanguageModel: Send + Sync {
let mut events = events.await?.fuse();
let mut message_id = None;
let mut first_item_text = None;
let last_token_usage = Arc::new(Mutex::new(TokenUsage::default()));
if let Some(first_event) = events.next().await {
match first_event {
@@ -214,20 +219,33 @@ pub trait LanguageModel: Send + Sync {
}
let stream = futures::stream::iter(first_item_text.map(Ok))
.chain(events.filter_map(|result| async move {
match result {
Ok(LanguageModelCompletionEvent::StartMessage { .. }) => None,
Ok(LanguageModelCompletionEvent::Text(text)) => Some(Ok(text)),
Ok(LanguageModelCompletionEvent::Thinking(_)) => None,
Ok(LanguageModelCompletionEvent::Stop(_)) => None,
Ok(LanguageModelCompletionEvent::ToolUse(_)) => None,
Ok(LanguageModelCompletionEvent::UsageUpdate(_)) => None,
Err(err) => Some(Err(err)),
.chain(events.filter_map({
let last_token_usage = last_token_usage.clone();
move |result| {
let last_token_usage = last_token_usage.clone();
async move {
match result {
Ok(LanguageModelCompletionEvent::StartMessage { .. }) => None,
Ok(LanguageModelCompletionEvent::Text(text)) => Some(Ok(text)),
Ok(LanguageModelCompletionEvent::Thinking(_)) => None,
Ok(LanguageModelCompletionEvent::Stop(_)) => None,
Ok(LanguageModelCompletionEvent::ToolUse(_)) => None,
Ok(LanguageModelCompletionEvent::UsageUpdate(token_usage)) => {
*last_token_usage.lock() = token_usage;
None
}
Err(err) => Some(Err(err)),
}
}
}
}))
.boxed();
Ok(LanguageModelTextStream { message_id, stream })
Ok(LanguageModelTextStream {
message_id,
stream,
last_token_usage,
})
}
.boxed()
}


@@ -44,9 +44,9 @@ postage.workspace = true
core-foundation.workspace = true
[target.'cfg(all(not(target_os = "macos")))'.dependencies]
async-trait = { workspace = true }
collections = { workspace = true }
gpui = { workspace = true }
async-trait.workspace = true
collections.workspace = true
gpui.workspace = true
livekit_api.workspace = true
nanoid.workspace = true


@@ -14,4 +14,4 @@ doctest = false
[dependencies]
gpui.workspace = true
serde = { workspace = true }
serde.workspace = true


@@ -2555,6 +2555,9 @@ impl OutlinePanel {
let auto_fold_dirs = OutlinePanelSettings::get_global(cx).auto_fold_dirs;
let active_multi_buffer = active_editor.read(cx).buffer().clone();
let new_entries = self.new_entries_for_fs_update.clone();
let repo_snapshots = self.project.update(cx, |project, cx| {
project.git_store().read(cx).repo_snapshots(cx)
});
self.updating_fs_entries = true;
self.fs_entries_update_task = cx.spawn_in(window, async move |outline_panel, cx| {
if let Some(debounce) = debounce {
@@ -2679,13 +2682,15 @@ impl OutlinePanel {
.unwrap_or_default(),
entry,
};
let mut traversal =
GitTraversal::new(worktree.traverse_from_path(
let mut traversal = GitTraversal::new(
&repo_snapshots,
worktree.traverse_from_path(
true,
true,
true,
entry.path.as_ref(),
));
),
);
let mut entries_to_add = HashMap::default();
worktree_excerpts

File diff suppressed because it is too large


@@ -1,23 +1,28 @@
use collections::HashMap;
use git::status::GitSummary;
use std::{ops::Deref, path::Path};
use sum_tree::Cursor;
use text::Bias;
use worktree::{Entry, PathProgress, PathTarget, RepositoryEntry, StatusEntry, Traversal};
use worktree::{
Entry, PathProgress, PathTarget, ProjectEntryId, RepositoryEntry, StatusEntry, Traversal,
};
/// Walks the worktree entries and their associated git statuses.
pub struct GitTraversal<'a> {
traversal: Traversal<'a>,
current_entry_summary: Option<GitSummary>,
repo_location: Option<(
&'a RepositoryEntry,
Cursor<'a, StatusEntry, PathProgress<'a>>,
)>,
repo_snapshots: &'a HashMap<ProjectEntryId, RepositoryEntry>,
repo_location: Option<(ProjectEntryId, Cursor<'a, StatusEntry, PathProgress<'a>>)>,
}
impl<'a> GitTraversal<'a> {
pub fn new(traversal: Traversal<'a>) -> GitTraversal<'a> {
pub fn new(
repo_snapshots: &'a HashMap<ProjectEntryId, RepositoryEntry>,
traversal: Traversal<'a>,
) -> GitTraversal<'a> {
let mut this = GitTraversal {
traversal,
repo_snapshots,
current_entry_summary: None,
repo_location: None,
};
@@ -32,7 +37,20 @@ impl<'a> GitTraversal<'a> {
return;
};
let Some(repo) = self.traversal.snapshot().repository_for_path(&entry.path) else {
let Ok(abs_path) = self.traversal.snapshot().absolutize(&entry.path) else {
self.repo_location = None;
return;
};
let Some((repo, repo_path)) = self
.repo_snapshots
.values()
.filter_map(|repo_snapshot| {
let relative_path = repo_snapshot.relativize_abs_path(&abs_path)?;
Some((repo_snapshot, relative_path))
})
.max_by_key(|(repo, _)| repo.work_directory_abs_path.clone())
else {
self.repo_location = None;
return;
};
@@ -42,18 +60,19 @@ impl<'a> GitTraversal<'a> {
|| self
.repo_location
.as_ref()
.map(|(prev_repo, _)| &prev_repo.work_directory)
!= Some(&repo.work_directory)
.map(|(prev_repo_id, _)| *prev_repo_id)
!= Some(repo.work_directory_id())
{
self.repo_location = Some((repo, repo.statuses_by_path.cursor::<PathProgress>(&())));
self.repo_location = Some((
repo.work_directory_id(),
repo.statuses_by_path.cursor::<PathProgress>(&()),
));
}
let Some((repo, statuses)) = &mut self.repo_location else {
let Some((_, statuses)) = &mut self.repo_location else {
return;
};
let repo_path = repo.relativize(&entry.path).unwrap();
if entry.is_dir() {
let mut statuses = statuses.clone();
statuses.seek_forward(&PathTarget::Path(repo_path.as_ref()), Bias::Left, &());
@@ -128,9 +147,15 @@ pub struct ChildEntriesGitIter<'a> {
}
impl<'a> ChildEntriesGitIter<'a> {
pub fn new(snapshot: &'a worktree::Snapshot, parent_path: &'a Path) -> Self {
let mut traversal =
GitTraversal::new(snapshot.traverse_from_path(true, true, true, parent_path));
pub fn new(
repo_snapshots: &'a HashMap<ProjectEntryId, RepositoryEntry>,
worktree_snapshot: &'a worktree::Snapshot,
parent_path: &'a Path,
) -> Self {
let mut traversal = GitTraversal::new(
repo_snapshots,
worktree_snapshot.traverse_from_path(true, true, true, parent_path),
);
traversal.advance();
ChildEntriesGitIter {
parent_path,
@@ -215,6 +240,8 @@ impl AsRef<Entry> for GitEntry {
mod tests {
use std::time::Duration;
use crate::Project;
use super::*;
use fs::FakeFs;
use git::status::{FileStatus, StatusCode, TrackedSummary, UnmergedStatus, UnmergedStatusCode};
@@ -222,7 +249,7 @@ mod tests {
use serde_json::json;
use settings::{Settings as _, SettingsStore};
use util::path;
use worktree::{Worktree, WorktreeSettings};
use worktree::WorktreeSettings;
const CONFLICT: FileStatus = FileStatus::Unmerged(UnmergedStatus {
first_head: UnmergedStatusCode::Updated,
@@ -282,44 +309,35 @@ mod tests {
&[(Path::new("z2.txt"), StatusCode::Added.index())],
);
let tree = Worktree::local(
Path::new(path!("/root")),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
let project = Project::test(fs, [path!("/root").as_ref()], cx).await;
cx.executor().run_until_parked();
let snapshot = tree.read_with(cx, |tree, _| tree.snapshot());
let (repo_snapshots, worktree_snapshot) = project.read_with(cx, |project, cx| {
(
project.git_store().read(cx).repo_snapshots(cx),
project.worktrees(cx).next().unwrap().read(cx).snapshot(),
)
});
let mut traversal =
GitTraversal::new(snapshot.traverse_from_path(true, false, true, Path::new("x")));
let entry = traversal.next().unwrap();
assert_eq!(entry.path.as_ref(), Path::new("x/x1.txt"));
assert_eq!(entry.git_summary, GitSummary::UNCHANGED);
let entry = traversal.next().unwrap();
assert_eq!(entry.path.as_ref(), Path::new("x/x2.txt"));
assert_eq!(entry.git_summary, MODIFIED);
let entry = traversal.next().unwrap();
assert_eq!(entry.path.as_ref(), Path::new("x/y/y1.txt"));
assert_eq!(entry.git_summary, GitSummary::CONFLICT);
let entry = traversal.next().unwrap();
assert_eq!(entry.path.as_ref(), Path::new("x/y/y2.txt"));
assert_eq!(entry.git_summary, GitSummary::UNCHANGED);
let entry = traversal.next().unwrap();
assert_eq!(entry.path.as_ref(), Path::new("x/z.txt"));
assert_eq!(entry.git_summary, ADDED);
let entry = traversal.next().unwrap();
assert_eq!(entry.path.as_ref(), Path::new("z/z1.txt"));
assert_eq!(entry.git_summary, GitSummary::UNCHANGED);
let entry = traversal.next().unwrap();
assert_eq!(entry.path.as_ref(), Path::new("z/z2.txt"));
assert_eq!(entry.git_summary, ADDED);
let traversal = GitTraversal::new(
&repo_snapshots,
worktree_snapshot.traverse_from_path(true, false, true, Path::new("x")),
);
let entries = traversal
.map(|entry| (entry.path.clone(), entry.git_summary))
.collect::<Vec<_>>();
pretty_assertions::assert_eq!(
entries,
[
(Path::new("x/x1.txt").into(), GitSummary::UNCHANGED),
(Path::new("x/x2.txt").into(), MODIFIED),
(Path::new("x/y/y1.txt").into(), GitSummary::CONFLICT),
(Path::new("x/y/y2.txt").into(), GitSummary::UNCHANGED),
(Path::new("x/z.txt").into(), ADDED),
(Path::new("z/z1.txt").into(), GitSummary::UNCHANGED),
(Path::new("z/z2.txt").into(), ADDED),
]
)
}
#[gpui::test]
@@ -366,23 +384,20 @@ mod tests {
&[(Path::new("z2.txt"), StatusCode::Added.index())],
);
let tree = Worktree::local(
Path::new(path!("/root")),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
let project = Project::test(fs, [path!("/root").as_ref()], cx).await;
cx.executor().run_until_parked();
let snapshot = tree.read_with(cx, |tree, _| tree.snapshot());
let (repo_snapshots, worktree_snapshot) = project.read_with(cx, |project, cx| {
(
project.git_store().read(cx).repo_snapshots(cx),
project.worktrees(cx).next().unwrap().read(cx).snapshot(),
)
});
// Sanity check the propagation for x/y and z
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("x/y"), GitSummary::CONFLICT),
(Path::new("x/y/y1.txt"), GitSummary::CONFLICT),
@@ -390,7 +405,8 @@ mod tests {
],
);
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("z"), ADDED),
(Path::new("z/z1.txt"), GitSummary::UNCHANGED),
@@ -400,7 +416,8 @@ mod tests {
// Test one of the fundamental cases of propagation blocking, the transition from one git repository to another
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("x"), MODIFIED + ADDED),
(Path::new("x/y"), GitSummary::CONFLICT),
@@ -410,7 +427,8 @@ mod tests {
// Sanity check everything around it
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("x"), MODIFIED + ADDED),
(Path::new("x/x1.txt"), GitSummary::UNCHANGED),
@@ -424,7 +442,8 @@ mod tests {
// Test the other fundamental case, transitioning from git repository to non-git repository
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new(""), GitSummary::UNCHANGED),
(Path::new("x"), MODIFIED + ADDED),
@@ -434,7 +453,8 @@ mod tests {
// And all together now
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new(""), GitSummary::UNCHANGED),
(Path::new("x"), MODIFIED + ADDED),
@@ -490,21 +510,19 @@ mod tests {
],
);
let tree = Worktree::local(
Path::new(path!("/root")),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
let project = Project::test(fs, [path!("/root").as_ref()], cx).await;
cx.executor().run_until_parked();
let snapshot = tree.read_with(cx, |tree, _| tree.snapshot());
let (repo_snapshots, worktree_snapshot) = project.read_with(cx, |project, cx| {
(
project.git_store().read(cx).repo_snapshots(cx),
project.worktrees(cx).next().unwrap().read(cx).snapshot(),
)
});
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new(""), GitSummary::CONFLICT + MODIFIED + ADDED),
(Path::new("g"), GitSummary::CONFLICT),
@@ -513,7 +531,8 @@ mod tests {
);
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new(""), GitSummary::CONFLICT + ADDED + MODIFIED),
(Path::new("a"), ADDED + MODIFIED),
@@ -530,7 +549,8 @@ mod tests {
);
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("a/b"), ADDED),
(Path::new("a/b/c1.txt"), ADDED),
@@ -545,7 +565,8 @@ mod tests {
);
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("a/b/c1.txt"), ADDED),
(Path::new("a/b/c2.txt"), GitSummary::UNCHANGED),
@@ -598,26 +619,25 @@ mod tests {
&[(Path::new("z2.txt"), StatusCode::Modified.index())],
);
let tree = Worktree::local(
Path::new(path!("/root")),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
let project = Project::test(fs, [path!("/root").as_ref()], cx).await;
cx.executor().run_until_parked();
let snapshot = tree.read_with(cx, |tree, _| tree.snapshot());
let (repo_snapshots, worktree_snapshot) = project.read_with(cx, |project, cx| {
(
project.git_store().read(cx).repo_snapshots(cx),
project.worktrees(cx).next().unwrap().read(cx).snapshot(),
)
});
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[(Path::new("x"), ADDED), (Path::new("x/x1.txt"), ADDED)],
);
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("y"), GitSummary::CONFLICT + MODIFIED),
(Path::new("y/y1.txt"), GitSummary::CONFLICT),
@@ -626,7 +646,8 @@ mod tests {
);
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("z"), MODIFIED),
(Path::new("z/z2.txt"), MODIFIED),
@@ -634,12 +655,14 @@ mod tests {
);
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[(Path::new("x"), ADDED), (Path::new("x/x1.txt"), ADDED)],
);
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new("x"), ADDED),
(Path::new("x/x1.txt"), ADDED),
@@ -689,18 +712,11 @@ mod tests {
);
cx.run_until_parked();
let tree = Worktree::local(
path!("/root").as_ref(),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
cx.executor().run_until_parked();
let (old_entry_ids, old_mtimes) = tree.read_with(cx, |tree, _| {
let (old_entry_ids, old_mtimes) = project.read_with(cx, |project, cx| {
let tree = project.worktrees(cx).next().unwrap().read(cx);
(
tree.entries(true, 0).map(|e| e.id).collect::<Vec<_>>(),
tree.entries(true, 0).map(|e| e.mtime).collect::<Vec<_>>(),
@@ -713,7 +729,8 @@ mod tests {
fs.touch_path(path!("/root")).await;
cx.executor().run_until_parked();
let (new_entry_ids, new_mtimes) = tree.read_with(cx, |tree, _| {
let (new_entry_ids, new_mtimes) = project.read_with(cx, |project, cx| {
let tree = project.worktrees(cx).next().unwrap().read(cx);
(
tree.entries(true, 0).map(|e| e.id).collect::<Vec<_>>(),
tree.entries(true, 0).map(|e| e.mtime).collect::<Vec<_>>(),
@@ -734,10 +751,16 @@ mod tests {
cx.executor().run_until_parked();
cx.executor().advance_clock(Duration::from_secs(1));
let snapshot = tree.read_with(cx, |tree, _| tree.snapshot());
let (repo_snapshots, worktree_snapshot) = project.read_with(cx, |project, cx| {
(
project.git_store().read(cx).repo_snapshots(cx),
project.worktrees(cx).next().unwrap().read(cx).snapshot(),
)
});
check_git_statuses(
&snapshot,
&repo_snapshots,
&worktree_snapshot,
&[
(Path::new(""), MODIFIED),
(Path::new("a.txt"), GitSummary::UNCHANGED),
@@ -748,11 +771,14 @@ mod tests {
#[track_caller]
fn check_git_statuses(
snapshot: &worktree::Snapshot,
repo_snapshots: &HashMap<ProjectEntryId, RepositoryEntry>,
worktree_snapshot: &worktree::Snapshot,
expected_statuses: &[(&Path, GitSummary)],
) {
let mut traversal =
GitTraversal::new(snapshot.traverse_from_path(true, true, false, "".as_ref()));
let mut traversal = GitTraversal::new(
repo_snapshots,
worktree_snapshot.traverse_from_path(true, true, false, "".as_ref()),
);
let found_statuses = expected_statuses
.iter()
.map(|&(path, _)| {
@@ -762,6 +788,6 @@ mod tests {
(path, git_entry.git_summary)
})
.collect::<Vec<_>>();
assert_eq!(found_statuses, expected_statuses);
pretty_assertions::assert_eq!(found_statuses, expected_statuses);
}
}


@@ -24,7 +24,7 @@ mod direnv;
mod environment;
use buffer_diff::BufferDiff;
pub use environment::{EnvironmentErrorMessage, ProjectEnvironmentEvent};
use git_store::Repository;
use git_store::{GitEvent, Repository};
pub mod search_history;
mod yarn;
@@ -270,7 +270,6 @@ pub enum Event {
WorktreeOrderChanged,
WorktreeRemoved(WorktreeId),
WorktreeUpdatedEntries(WorktreeId, UpdatedEntriesSet),
WorktreeUpdatedGitRepositories(WorktreeId),
DiskBasedDiagnosticsStarted {
language_server_id: LanguageServerId,
},
@@ -300,6 +299,8 @@ pub enum Event {
RevealInProjectPanel(ProjectEntryId),
SnippetEdit(BufferId, Vec<(lsp::Range, Snippet)>),
ExpandedAllForEntry(WorktreeId, ProjectEntryId),
GitStateUpdated,
ActiveRepositoryChanged,
}
pub enum DebugAdapterClientState {
@@ -793,8 +794,6 @@ impl Project {
client.add_entity_message_handler(Self::handle_unshare_project);
client.add_entity_request_handler(Self::handle_update_buffer);
client.add_entity_message_handler(Self::handle_update_worktree);
client.add_entity_message_handler(Self::handle_update_repository);
client.add_entity_message_handler(Self::handle_remove_repository);
client.add_entity_request_handler(Self::handle_synchronize_buffers);
client.add_entity_request_handler(Self::handle_search_candidate_buffers);
@@ -922,6 +921,7 @@ impl Project {
cx,
)
});
cx.subscribe(&git_store, Self::on_git_store_event).detach();
cx.subscribe(&lsp_store, Self::on_lsp_store_event).detach();
@@ -1136,8 +1136,6 @@ impl Project {
ssh_proto.add_entity_message_handler(Self::handle_create_buffer_for_peer);
ssh_proto.add_entity_message_handler(Self::handle_update_worktree);
ssh_proto.add_entity_message_handler(Self::handle_update_repository);
ssh_proto.add_entity_message_handler(Self::handle_remove_repository);
ssh_proto.add_entity_message_handler(Self::handle_update_project);
ssh_proto.add_entity_message_handler(Self::handle_toast);
ssh_proto.add_entity_request_handler(Self::handle_language_server_prompt_request);
@@ -1478,7 +1476,7 @@ impl Project {
) -> Entity<Project> {
use clock::FakeSystemClock;
let fs = Arc::new(RealFs::default());
let fs = Arc::new(RealFs::new(None, cx.background_executor().clone()));
let languages = LanguageRegistry::test(cx.background_executor().clone());
let clock = Arc::new(FakeSystemClock::new());
let http_client = http_client::FakeHttpClient::with_404_response();
@@ -2040,6 +2038,11 @@ impl Project {
self.worktree_store.update(cx, |worktree_store, cx| {
worktree_store.send_project_updates(cx);
});
if let Some(remote_id) = self.remote_id() {
self.git_store.update(cx, |git_store, cx| {
git_store.shared(remote_id, self.client.clone().into(), cx)
});
}
cx.emit(Event::Reshared);
Ok(())
}
@@ -2707,6 +2710,19 @@ impl Project {
}
}
fn on_git_store_event(
&mut self,
_: Entity<GitStore>,
event: &GitEvent,
cx: &mut Context<Self>,
) {
match event {
GitEvent::GitStateUpdated => cx.emit(Event::GitStateUpdated),
GitEvent::ActiveRepositoryChanged => cx.emit(Event::ActiveRepositoryChanged),
GitEvent::FileSystemUpdated | GitEvent::IndexWriteError(_) => {}
}
}
fn on_ssh_event(
&mut self,
_: Entity<SshRemoteClient>,
@@ -2792,12 +2808,11 @@ impl Project {
.report_discovered_project_events(*worktree_id, changes);
cx.emit(Event::WorktreeUpdatedEntries(*worktree_id, changes.clone()))
}
WorktreeStoreEvent::WorktreeUpdatedGitRepositories(worktree_id) => {
cx.emit(Event::WorktreeUpdatedGitRepositories(*worktree_id))
}
WorktreeStoreEvent::WorktreeDeletedEntry(worktree_id, id) => {
cx.emit(Event::DeletedEntry(*worktree_id, *id))
}
// Listen to the GitStore instead.
WorktreeStoreEvent::WorktreeUpdatedGitRepositories(_, _) => {}
}
}
@@ -4309,43 +4324,7 @@ impl Project {
if let Some(worktree) = this.worktree_for_id(worktree_id, cx) {
worktree.update(cx, |worktree, _| {
let worktree = worktree.as_remote_mut().unwrap();
worktree.update_from_remote(envelope.payload.into());
});
}
Ok(())
})?
}
async fn handle_update_repository(
this: Entity<Self>,
envelope: TypedEnvelope<proto::UpdateRepository>,
mut cx: AsyncApp,
) -> Result<()> {
this.update(&mut cx, |this, cx| {
if let Some((worktree, _relative_path)) =
this.find_worktree(envelope.payload.abs_path.as_ref(), cx)
{
worktree.update(cx, |worktree, _| {
let worktree = worktree.as_remote_mut().unwrap();
worktree.update_from_remote(envelope.payload.into());
});
}
Ok(())
})?
}
async fn handle_remove_repository(
this: Entity<Self>,
envelope: TypedEnvelope<proto::RemoveRepository>,
mut cx: AsyncApp,
) -> Result<()> {
this.update(&mut cx, |this, cx| {
if let Some(worktree) =
this.worktree_for_entry(ProjectEntryId::from_proto(envelope.payload.id), cx)
{
worktree.update(cx, |worktree, _| {
let worktree = worktree.as_remote_mut().unwrap();
worktree.update_from_remote(envelope.payload.into());
worktree.update_from_remote(envelope.payload);
});
}
Ok(())


@@ -6,7 +6,8 @@ use buffer_diff::{
};
use fs::FakeFs;
use futures::{future, StreamExt};
use gpui::{App, SemanticVersion, UpdateGlobal};
use git::repository::RepoPath;
use gpui::{App, BackgroundExecutor, SemanticVersion, UpdateGlobal};
use http_client::Url;
use language::{
language_settings::{language_settings, AllLanguageSettings, LanguageSettingsContent},
@@ -34,6 +35,7 @@ use util::{
test::{marked_text_offsets, TempTree},
uri, TryFutureExt as _,
};
use worktree::WorktreeModelHandle as _;
#[gpui::test]
async fn test_block_via_channel(cx: &mut gpui::TestAppContext) {
@@ -97,7 +99,12 @@ async fn test_symlinks(cx: &mut gpui::TestAppContext) {
)
.unwrap();
let project = Project::test(Arc::new(RealFs::default()), [root_link_path.as_ref()], cx).await;
let project = Project::test(
Arc::new(RealFs::new(None, cx.executor())),
[root_link_path.as_ref()],
cx,
)
.await;
project.update(cx, |project, cx| {
let tree = project.worktrees(cx).next().unwrap().read(cx);
@@ -3330,7 +3337,7 @@ async fn test_rescan_and_remote_updates(cx: &mut gpui::TestAppContext) {
}
}));
let project = Project::test(Arc::new(RealFs::default()), [dir.path()], cx).await;
let project = Project::test(Arc::new(RealFs::new(None, cx.executor())), [dir.path()], cx).await;
let buffer_for_path = |path: &'static str, cx: &mut gpui::TestAppContext| {
let buffer = project.update(cx, |p, cx| p.open_local_buffer(dir.path().join(path), cx));
@@ -6769,6 +6776,158 @@ async fn test_single_file_diffs(cx: &mut gpui::TestAppContext) {
});
}
#[gpui::test]
async fn test_repository_and_path_for_project_path(
background_executor: BackgroundExecutor,
cx: &mut gpui::TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(background_executor);
fs.insert_tree(
path!("/root"),
json!({
"c.txt": "",
"dir1": {
".git": {},
"deps": {
"dep1": {
".git": {},
"src": {
"a.txt": ""
}
}
},
"src": {
"b.txt": ""
}
},
}),
)
.await;
let project = Project::test(fs.clone(), [path!("/root").as_ref()], cx).await;
let tree = project.read_with(cx, |project, cx| project.worktrees(cx).next().unwrap());
let tree_id = tree.read_with(cx, |tree, _| tree.id());
tree.read_with(cx, |tree, _| tree.as_local().unwrap().scan_complete())
.await;
tree.flush_fs_events(cx).await;
project.read_with(cx, |project, cx| {
let git_store = project.git_store().read(cx);
let pairs = [
("c.txt", None),
("dir1/src/b.txt", Some((path!("/root/dir1"), "src/b.txt"))),
(
"dir1/deps/dep1/src/a.txt",
Some((path!("/root/dir1/deps/dep1"), "src/a.txt")),
),
];
let expected = pairs
.iter()
.map(|(path, result)| {
(
path,
result.map(|(repo, repo_path)| {
(Path::new(repo).to_owned(), RepoPath::from(repo_path))
}),
)
})
.collect::<Vec<_>>();
let actual = pairs
.iter()
.map(|(path, _)| {
let project_path = (tree_id, Path::new(path)).into();
let result = maybe!({
let (repo, repo_path) =
git_store.repository_and_path_for_project_path(&project_path, cx)?;
Some((
repo.read(cx)
.repository_entry
.work_directory_abs_path
.clone(),
repo_path,
))
});
(path, result)
})
.collect::<Vec<_>>();
pretty_assertions::assert_eq!(expected, actual);
});
fs.remove_dir(path!("/root/dir1/.git").as_ref(), RemoveOptions::default())
.await
.unwrap();
tree.flush_fs_events(cx).await;
project.read_with(cx, |project, cx| {
let git_store = project.git_store().read(cx);
assert_eq!(
git_store.repository_and_path_for_project_path(
&(tree_id, Path::new("dir1/src/b.txt")).into(),
cx
),
None
);
});
}
#[gpui::test]
async fn test_home_dir_as_git_repository(cx: &mut gpui::TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.background_executor.clone());
fs.insert_tree(
path!("/root"),
json!({
"home": {
".git": {},
"project": {
"a.txt": "A"
},
},
}),
)
.await;
fs.set_home_dir(Path::new(path!("/root/home")).to_owned());
let project = Project::test(fs.clone(), [path!("/root/home/project").as_ref()], cx).await;
let tree = project.read_with(cx, |project, cx| project.worktrees(cx).next().unwrap());
let tree_id = tree.read_with(cx, |tree, _| tree.id());
tree.read_with(cx, |tree, _| tree.as_local().unwrap().scan_complete())
.await;
tree.flush_fs_events(cx).await;
project.read_with(cx, |project, cx| {
let containing = project
.git_store()
.read(cx)
.repository_and_path_for_project_path(&(tree_id, "a.txt").into(), cx);
assert!(containing.is_none());
});
let project = Project::test(fs.clone(), [path!("/root/home").as_ref()], cx).await;
let tree = project.read_with(cx, |project, cx| project.worktrees(cx).next().unwrap());
let tree_id = tree.read_with(cx, |tree, _| tree.id());
tree.read_with(cx, |tree, _| tree.as_local().unwrap().scan_complete())
.await;
tree.flush_fs_events(cx).await;
project.read_with(cx, |project, cx| {
let containing = project
.git_store()
.read(cx)
.repository_and_path_for_project_path(&(tree_id, "project/a.txt").into(), cx);
assert_eq!(
containing
.unwrap()
.0
.read(cx)
.repository_entry
.work_directory_abs_path,
Path::new(path!("/root/home"))
);
});
}
async fn search(
project: &Entity<Project>,
query: SearchQuery,


@@ -26,7 +26,10 @@ use smol::{
};
use text::ReplicaId;
use util::{paths::SanitizedPath, ResultExt};
use worktree::{Entry, ProjectEntryId, UpdatedEntriesSet, Worktree, WorktreeId, WorktreeSettings};
use worktree::{
Entry, ProjectEntryId, UpdatedEntriesSet, UpdatedGitRepositoriesSet, Worktree, WorktreeId,
WorktreeSettings,
};
use crate::{search::SearchQuery, ProjectPath};
@@ -66,7 +69,7 @@ pub enum WorktreeStoreEvent {
WorktreeOrderChanged,
WorktreeUpdateSent(Entity<Worktree>),
WorktreeUpdatedEntries(WorktreeId, UpdatedEntriesSet),
WorktreeUpdatedGitRepositories(WorktreeId),
WorktreeUpdatedGitRepositories(WorktreeId, UpdatedGitRepositoriesSet),
WorktreeDeletedEntry(WorktreeId, ProjectEntryId),
}
@@ -156,6 +159,11 @@ impl WorktreeStore {
None
}
pub fn absolutize(&self, project_path: &ProjectPath, cx: &App) -> Option<PathBuf> {
let worktree = self.worktree_for_id(project_path.worktree_id, cx)?;
worktree.read(cx).absolutize(&project_path.path).ok()
}
pub fn find_or_create_worktree(
&mut self,
abs_path: impl AsRef<Path>,
@@ -367,9 +375,10 @@ impl WorktreeStore {
changes.clone(),
));
}
worktree::Event::UpdatedGitRepositories(_) => {
worktree::Event::UpdatedGitRepositories(set) => {
cx.emit(WorktreeStoreEvent::WorktreeUpdatedGitRepositories(
worktree_id,
set.clone(),
));
}
worktree::Event::DeletedEntry(id) => {
@@ -561,44 +570,12 @@ impl WorktreeStore {
let client = client.clone();
async move {
if client.is_via_collab() {
match update {
proto::WorktreeRelatedMessage::UpdateWorktree(
update,
) => {
client
.request(update)
.map(|result| result.log_err().is_some())
.await
}
proto::WorktreeRelatedMessage::UpdateRepository(
update,
) => {
client
.request(update)
.map(|result| result.log_err().is_some())
.await
}
proto::WorktreeRelatedMessage::RemoveRepository(
update,
) => {
client
.request(update)
.map(|result| result.log_err().is_some())
.await
}
}
client
.request(update)
.map(|result| result.log_err().is_some())
.await
} else {
match update {
proto::WorktreeRelatedMessage::UpdateWorktree(
update,
) => client.send(update).log_err().is_some(),
proto::WorktreeRelatedMessage::UpdateRepository(
update,
) => client.send(update).log_err().is_some(),
proto::WorktreeRelatedMessage::RemoveRepository(
update,
) => client.send(update).log_err().is_some(),
}
client.send(update).log_err().is_some()
}
}
}


@@ -334,7 +334,8 @@ impl ProjectPanel {
this.update_visible_entries(None, cx);
cx.notify();
}
project::Event::WorktreeUpdatedGitRepositories(_)
project::Event::GitStateUpdated
| project::Event::ActiveRepositoryChanged
| project::Event::WorktreeUpdatedEntries(_, _)
| project::Event::WorktreeAdded(_)
| project::Event::WorktreeOrderChanged => {
@@ -1553,6 +1554,7 @@ impl ProjectPanel {
.map(|entry| entry.worktree_id)
.filter_map(|id| project.worktree_for_id(id, cx).map(|w| (id, w.read(cx))))
.max_by(|(_, a), (_, b)| a.root_name().cmp(b.root_name()))?;
let git_store = project.git_store().read(cx);
let marked_entries_in_worktree = sanitized_entries
.iter()
@@ -1577,18 +1579,20 @@ impl ProjectPanel {
let parent_entry = worktree.entry_for_path(parent_path)?;
// Remove all siblings that are being deleted except the last marked entry
let snapshot = worktree.snapshot();
let repo_snapshots = git_store.repo_snapshots(cx);
let worktree_snapshot = worktree.snapshot();
let hide_gitignore = ProjectPanelSettings::get_global(cx).hide_gitignore;
let mut siblings: Vec<_> = ChildEntriesGitIter::new(&snapshot, parent_path)
.filter(|sibling| {
(sibling.id == latest_entry.id)
|| (!marked_entries_in_worktree.contains(&&SelectedEntry {
worktree_id,
entry_id: sibling.id,
}) && (!hide_gitignore || !sibling.is_ignored))
})
.map(|entry| entry.to_owned())
.collect();
let mut siblings: Vec<_> =
ChildEntriesGitIter::new(&repo_snapshots, &worktree_snapshot, parent_path)
.filter(|sibling| {
(sibling.id == latest_entry.id)
|| (!marked_entries_in_worktree.contains(&&SelectedEntry {
worktree_id,
entry_id: sibling.id,
}) && (!hide_gitignore || !sibling.is_ignored))
})
.map(|entry| entry.to_owned())
.collect();
project::sort_worktree_entries(&mut siblings);
let sibling_entry_index = siblings
@@ -2605,6 +2609,7 @@ impl ProjectPanel {
let auto_collapse_dirs = settings.auto_fold_dirs;
let hide_gitignore = settings.hide_gitignore;
let project = self.project.read(cx);
let repo_snapshots = project.git_store().read(cx).repo_snapshots(cx);
self.last_worktree_root_id = project
.visible_worktrees(cx)
.next_back()
@@ -2615,15 +2620,15 @@ impl ProjectPanel {
self.visible_entries.clear();
let mut max_width_item = None;
for worktree in project.visible_worktrees(cx) {
let snapshot = worktree.read(cx).snapshot();
let worktree_id = snapshot.id();
let worktree_snapshot = worktree.read(cx).snapshot();
let worktree_id = worktree_snapshot.id();
let expanded_dir_ids = match self.expanded_dir_ids.entry(worktree_id) {
hash_map::Entry::Occupied(e) => e.into_mut(),
hash_map::Entry::Vacant(e) => {
// The first time a worktree's root entry becomes available,
// mark that root entry as expanded.
if let Some(entry) = snapshot.root_entry() {
if let Some(entry) = worktree_snapshot.root_entry() {
e.insert(vec![entry.id]).as_slice()
} else {
&[]
@@ -2645,14 +2650,15 @@ impl ProjectPanel {
}
let mut visible_worktree_entries = Vec::new();
let mut entry_iter = GitTraversal::new(snapshot.entries(true, 0));
let mut entry_iter =
GitTraversal::new(&repo_snapshots, worktree_snapshot.entries(true, 0));
let mut auto_folded_ancestors = vec![];
while let Some(entry) = entry_iter.entry() {
if auto_collapse_dirs && entry.kind.is_dir() {
auto_folded_ancestors.push(entry.id);
if !self.unfolded_dir_ids.contains(&entry.id) {
if let Some(root_path) = snapshot.root_entry() {
let mut child_entries = snapshot.child_entries(&entry.path);
if let Some(root_path) = worktree_snapshot.root_entry() {
let mut child_entries = worktree_snapshot.child_entries(&entry.path);
if let Some(child) = child_entries.next() {
if entry.path != root_path.path
&& child_entries.next().is_none()
@@ -3297,10 +3303,16 @@ impl ProjectPanel {
.cloned();
}
let repo_snapshots = self
.project
.read(cx)
.git_store()
.read(cx)
.repo_snapshots(cx);
let worktree = self.project.read(cx).worktree_for_id(worktree_id, cx)?;
worktree.update(cx, |tree, _| {
utils::ReversibleIterable::new(
GitTraversal::new(tree.entries(true, 0usize)),
GitTraversal::new(&repo_snapshots, tree.entries(true, 0usize)),
reverse_search,
)
.find_single_ended(|ele| predicate(*ele, worktree_id))
@@ -3320,6 +3332,12 @@ impl ProjectPanel {
.iter()
.map(|(worktree_id, _, _)| *worktree_id)
.collect();
let repo_snapshots = self
.project
.read(cx)
.git_store()
.read(cx)
.repo_snapshots(cx);
let mut last_found: Option<SelectedEntry> = None;
@@ -3334,12 +3352,10 @@ impl ProjectPanel {
let root_entry = tree.root_entry()?;
let tree_id = tree.id();
let mut first_iter = GitTraversal::new(tree.traverse_from_path(
true,
true,
true,
entry.path.as_ref(),
));
let mut first_iter = GitTraversal::new(
&repo_snapshots,
tree.traverse_from_path(true, true, true, entry.path.as_ref()),
);
if reverse_search {
first_iter.next();
@@ -3352,7 +3368,7 @@ impl ProjectPanel {
.find(|ele| predicate(*ele, tree_id))
.map(|ele| ele.to_owned());
let second_iter = GitTraversal::new(tree.entries(true, 0usize));
let second_iter = GitTraversal::new(&repo_snapshots, tree.entries(true, 0usize));
let second = if reverse_search {
second_iter


@@ -2240,7 +2240,7 @@ message OpenUncommittedDiffResponse {
message SetIndexText {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
string path = 4;
optional string text = 5;
@@ -3350,7 +3350,7 @@ message GetPanicFiles {
message GitShow {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
string commit = 4;
}
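Throughout this proto file the retired `worktree_id = 2;` fields are replaced with `reserved 2;` rather than simply deleted. Reserving the tag prevents a future field from ever reusing number 2, which would otherwise let old and new peers silently misinterpret each other's wire data. A minimal sketch of the pattern (the message name here is hypothetical):

```protobuf
// Hypothetical message illustrating the pattern used throughout the diff:
// the old tag 2 (worktree_id) is retired and can never be reassigned,
// while the replacement field takes a fresh tag.
message ExampleGitRequest {
  uint64 project_id = 1;
  reserved 2;                    // formerly `uint64 worktree_id = 2;`
  uint64 work_directory_id = 3;  // replacement, keyed by work directory
}
```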
@@ -3365,7 +3365,7 @@ message GitCommitDetails {
message GitReset {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
string commit = 4;
ResetMode mode = 5;
@@ -3377,7 +3377,7 @@ message GitReset {
message GitCheckoutFiles {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
string commit = 4;
repeated string paths = 5;
@@ -3432,21 +3432,21 @@ message RegisterBufferWithLanguageServers{
message Stage {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
repeated string paths = 4;
}
message Unstage {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
repeated string paths = 4;
}
message Commit {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
optional string name = 4;
optional string email = 5;
@@ -3455,13 +3455,13 @@ message Commit {
message OpenCommitMessageBuffer {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
}
message Push {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
string remote_name = 4;
string branch_name = 5;
@@ -3476,14 +3476,14 @@ message Push {
message Fetch {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
uint64 askpass_id = 4;
}
message GetRemotes {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
optional string branch_name = 4;
}
@@ -3498,7 +3498,7 @@ message GetRemotesResponse {
message Pull {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
string remote_name = 4;
string branch_name = 5;
@@ -3512,7 +3512,7 @@ message RemoteMessageResponse {
message AskPassRequest {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
uint64 askpass_id = 4;
string prompt = 5;
@@ -3524,27 +3524,27 @@ message AskPassResponse {
message GitGetBranches {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
}
message GitCreateBranch {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
string branch_name = 4;
}
message GitChangeBranch {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
string branch_name = 4;
}
message CheckForPushedCommits {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
}
@@ -3554,7 +3554,7 @@ message CheckForPushedCommitsResponse {
message GitDiff {
uint64 project_id = 1;
uint64 worktree_id = 2;
reserved 2;
uint64 work_directory_id = 3;
DiffType diff_type = 4;


@@ -793,31 +793,6 @@ pub const MAX_WORKTREE_UPDATE_MAX_CHUNK_SIZE: usize = 2;
#[cfg(not(any(test, feature = "test-support")))]
pub const MAX_WORKTREE_UPDATE_MAX_CHUNK_SIZE: usize = 256;
#[derive(Clone, Debug)]
pub enum WorktreeRelatedMessage {
UpdateWorktree(UpdateWorktree),
UpdateRepository(UpdateRepository),
RemoveRepository(RemoveRepository),
}
impl From<UpdateWorktree> for WorktreeRelatedMessage {
fn from(value: UpdateWorktree) -> Self {
Self::UpdateWorktree(value)
}
}
impl From<UpdateRepository> for WorktreeRelatedMessage {
fn from(value: UpdateRepository) -> Self {
Self::UpdateRepository(value)
}
}
impl From<RemoveRepository> for WorktreeRelatedMessage {
fn from(value: RemoveRepository) -> Self {
Self::RemoveRepository(value)
}
}
pub fn split_worktree_update(mut message: UpdateWorktree) -> impl Iterator<Item = UpdateWorktree> {
let mut done = false;
@@ -924,20 +899,6 @@ pub fn split_repository_update(
})
}
pub fn split_worktree_related_message(
message: WorktreeRelatedMessage,
) -> Box<dyn Iterator<Item = WorktreeRelatedMessage> + Send> {
match message {
WorktreeRelatedMessage::UpdateWorktree(message) => {
Box::new(split_worktree_update(message).map(WorktreeRelatedMessage::UpdateWorktree))
}
WorktreeRelatedMessage::UpdateRepository(message) => {
Box::new(split_repository_update(message).map(WorktreeRelatedMessage::UpdateRepository))
}
WorktreeRelatedMessage::RemoveRepository(update) => Box::new([update.into()].into_iter()),
}
}
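The removed `split_worktree_related_message` dispatched on the message variant: only the variants that can exceed the chunk size were split, while single-item variants passed through unchanged. A simplified sketch of that dispatch pattern, using stand-in types rather than the real proto messages:

```rust
// Stand-ins for the real proto messages: one large, splittable update and
// one small, atomic message.
#[derive(Debug, PartialEq)]
enum Message {
    UpdateWorktree(Vec<u32>),
    RemoveRepository(u32),
}

// Split only the variant that can exceed `chunk_size`; pass others through.
fn split(message: Message, chunk_size: usize) -> Box<dyn Iterator<Item = Message>> {
    match message {
        Message::UpdateWorktree(entries) => Box::new(
            entries
                .chunks(chunk_size)
                .map(|c| Message::UpdateWorktree(c.to_vec()))
                .collect::<Vec<_>>()
                .into_iter(),
        ),
        other => Box::new(std::iter::once(other)),
    }
}

fn main() {
    // Five entries split into chunks of at most two.
    let chunks: Vec<_> = split(Message::UpdateWorktree(vec![1, 2, 3, 4, 5]), 2).collect();
    assert_eq!(chunks.len(), 3);
    // Atomic messages are forwarded as a single-element iterator.
    let passthrough: Vec<_> = split(Message::RemoveRepository(7), 2).collect();
    assert_eq!(passthrough, vec![Message::RemoveRepository(7)]);
    println!("ok");
}
```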
#[cfg(test)]
mod tests {
use super::*;


@@ -44,7 +44,7 @@ workspace.workspace = true
zed_actions.workspace = true
[dev-dependencies]
dap = { workspace = true }
dap.workspace = true
editor = { workspace = true, features = ["test-support"] }
language = { workspace = true, features = ["test-support"] }
project = { workspace = true, features = ["test-support"] }


@@ -44,7 +44,7 @@ languages.workspace = true
log.workspace = true
lsp.workspace = true
node_runtime.workspace = true

paths = { workspace = true }
paths.workspace = true
project.workspace = true
proto.workspace = true
release_channel.workspace = true


@@ -445,7 +445,7 @@ pub fn execute_run(
let extension_host_proxy = ExtensionHostProxy::global(cx);
let project = cx.new(|cx| {
let fs = Arc::new(RealFs::new(None));
let fs = Arc::new(RealFs::new(None, cx.background_executor().clone()));
let node_settings_rx = initialize_settings(session.clone(), fs.clone(), cx);
let proxy_url = read_proxy_settings(cx);


@@ -2,6 +2,7 @@ use crate::prelude::*;
use gpui::{AnyElement, FocusHandle, ScrollAnchor, ScrollHandle};
/// An element that can be navigated through via keyboard. Intended for use with scrollable views that want to support keyboard navigation between their entries.
#[derive(IntoElement)]
pub struct Navigable {
child: AnyElement,
selectable_children: Vec<NavigableEntry>,


@@ -11,7 +11,9 @@ impl Display for MarkdownString {
}
impl MarkdownString {
/// Escapes markdown special characters.
/// Escapes markdown special characters in markdown text blocks. Markdown code blocks follow
/// different rules and `MarkdownString::inline_code` or `MarkdownString::code_block` should be
/// used in that case.
///
/// Also escapes the following markdown extensions:
///
@@ -134,6 +136,12 @@ impl MarkdownString {
Self(format!("{backticks}{space}{text}{space}{backticks}"))
}
}
/// Returns markdown for code blocks, wrapped in 3 or more backticks as needed.
pub fn code_block(tag: &str, text: &str) -> Self {
let backticks = "`".repeat(3.max(count_max_consecutive_chars(text, '`') + 1));
Self(format!("{backticks}{tag}\n{text}\n{backticks}\n"))
}
}
// Copied from `pulldown-cmark-to-cmark-20.0.0` with changed names.
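The new `code_block` above picks a fence of `max(3, longest backtick run in the text + 1)` backticks, so fences always out-length any backticks inside the block. A self-contained sketch of that fencing rule (reimplementing the counting helper rather than using the crate's private `count_max_consecutive_chars`):

```rust
// Fence length rule: at least three backticks, and strictly more than the
// longest run of backticks appearing in the text itself.
fn fence_len(text: &str) -> usize {
    let mut max_run = 0;
    let mut run = 0;
    for ch in text.chars() {
        if ch == '`' {
            run += 1;
            max_run = max_run.max(run);
        } else {
            run = 0;
        }
    }
    3.max(max_run + 1)
}

fn code_block(tag: &str, text: &str) -> String {
    let backticks = "`".repeat(fence_len(text));
    format!("{backticks}{tag}\n{text}\n{backticks}\n")
}

fn main() {
    // Plain text gets a standard triple-backtick fence.
    assert_eq!(fence_len("let x = 1;"), 3);
    assert!(code_block("rust", "let x = 1;").starts_with("```rust\n"));
    // Text containing a ``` fence is wrapped in four backticks.
    assert_eq!(fence_len("```\nnested\n```"), 4);
    println!("ok");
}
```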


@@ -41,7 +41,7 @@ use postage::{
watch,
};
use rpc::{
proto::{self, split_worktree_related_message, FromProto, ToProto, WorktreeRelatedMessage},
proto::{self, split_worktree_update, FromProto, ToProto},
AnyProtoClient,
};
pub use settings::WorktreeId;
@@ -138,12 +138,12 @@ struct ScanRequest {
pub struct RemoteWorktree {
snapshot: Snapshot,
background_snapshot: Arc<Mutex<(Snapshot, Vec<WorktreeRelatedMessage>)>>,
background_snapshot: Arc<Mutex<(Snapshot, Vec<proto::UpdateWorktree>)>>,
project_id: u64,
client: AnyProtoClient,
file_scan_inclusions: PathMatcher,
updates_tx: Option<UnboundedSender<WorktreeRelatedMessage>>,
update_observer: Option<mpsc::UnboundedSender<WorktreeRelatedMessage>>,
updates_tx: Option<UnboundedSender<proto::UpdateWorktree>>,
update_observer: Option<mpsc::UnboundedSender<proto::UpdateWorktree>>,
snapshot_subscriptions: VecDeque<(usize, oneshot::Sender<()>)>,
replica_id: ReplicaId,
visible: bool,
@@ -196,28 +196,25 @@ pub struct RepositoryEntry {
/// - my_sub_folder_1/project_root/changed_file_1
/// - my_sub_folder_2/changed_file_2
pub statuses_by_path: SumTree<StatusEntry>,
work_directory_id: ProjectEntryId,
pub work_directory: WorkDirectory,
work_directory_abs_path: PathBuf,
pub(crate) current_branch: Option<Branch>,
pub work_directory_id: ProjectEntryId,
pub work_directory_abs_path: PathBuf,
pub worktree_scan_id: usize,
pub current_branch: Option<Branch>,
pub current_merge_conflicts: TreeSet<RepoPath>,
}
impl RepositoryEntry {
pub fn relativize(&self, path: &Path) -> Result<RepoPath> {
self.work_directory.relativize(path)
pub fn relativize_abs_path(&self, abs_path: &Path) -> Option<RepoPath> {
Some(
abs_path
.strip_prefix(&self.work_directory_abs_path)
.ok()?
.into(),
)
}
pub fn try_unrelativize(&self, path: &RepoPath) -> Option<Arc<Path>> {
self.work_directory.try_unrelativize(path)
}
pub fn unrelativize(&self, path: &RepoPath) -> Arc<Path> {
self.work_directory.unrelativize(path)
}
pub fn directory_contains(&self, path: impl AsRef<Path>) -> bool {
self.work_directory.directory_contains(path)
pub fn directory_contains_abs_path(&self, abs_path: impl AsRef<Path>) -> bool {
abs_path.as_ref().starts_with(&self.work_directory_abs_path)
}
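The new `relativize_abs_path` and `directory_contains_abs_path` above both reduce to plain path-prefix operations on the work directory's absolute path. A minimal sketch of the relativize half, assuming only `std::path` (the real method wraps the result in `RepoPath`):

```rust
use std::path::Path;

// Strip the repository's work-directory prefix from an absolute path to get
// a repo-relative path; `None` if the path lies outside the work directory.
fn relativize_abs_path<'a>(work_dir_abs: &Path, abs_path: &'a Path) -> Option<&'a Path> {
    abs_path.strip_prefix(work_dir_abs).ok()
}

fn main() {
    let work_dir = Path::new("/home/user/project");
    assert_eq!(
        relativize_abs_path(work_dir, Path::new("/home/user/project/src/main.rs")),
        Some(Path::new("src/main.rs"))
    );
    // Paths outside the work directory cannot be relativized.
    assert!(relativize_abs_path(work_dir, Path::new("/tmp/other.rs")).is_none());
    println!("ok");
}
```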
pub fn branch(&self) -> Option<&Branch> {
@@ -246,11 +243,7 @@ impl RepositoryEntry {
.cloned()
}
pub fn initial_update(
&self,
project_id: u64,
worktree_scan_id: usize,
) -> proto::UpdateRepository {
pub fn initial_update(&self, project_id: u64) -> proto::UpdateRepository {
proto::UpdateRepository {
branch_summary: self.current_branch.as_ref().map(branch_to_proto),
updated_statuses: self
@@ -274,16 +267,11 @@ impl RepositoryEntry {
entry_ids: vec![self.work_directory_id().to_proto()],
// This is also semantically wrong, and should be replaced once we separate git repo updates
// from worktree scans.
scan_id: worktree_scan_id as u64,
scan_id: self.worktree_scan_id as u64,
}
}
pub fn build_update(
&self,
old: &Self,
project_id: u64,
scan_id: usize,
) -> proto::UpdateRepository {
pub fn build_update(&self, old: &Self, project_id: u64) -> proto::UpdateRepository {
let mut updated_statuses: Vec<proto::StatusEntry> = Vec::new();
let mut removed_statuses: Vec<String> = Vec::new();
@@ -338,7 +326,7 @@ impl RepositoryEntry {
id: self.work_directory_id.to_proto(),
abs_path: self.work_directory_abs_path.as_path().to_proto(),
entry_ids: vec![self.work_directory_id.to_proto()],
scan_id: scan_id as u64,
scan_id: self.worktree_scan_id as u64,
}
}
}
@@ -428,28 +416,21 @@ impl WorkDirectory {
}
}
#[cfg(test)]
fn canonicalize(&self) -> Self {
match self {
WorkDirectory::InProject { relative_path } => WorkDirectory::InProject {
relative_path: relative_path.clone(),
},
WorkDirectory::AboveProject {
absolute_path,
location_in_repo,
} => WorkDirectory::AboveProject {
absolute_path: absolute_path.canonicalize().unwrap().into(),
location_in_repo: location_in_repo.clone(),
},
}
}
pub fn is_above_project(&self) -> bool {
match self {
WorkDirectory::InProject { .. } => false,
WorkDirectory::AboveProject { .. } => true,
}
}
//#[cfg(test)]
//fn canonicalize(&self) -> Self {
// match self {
// WorkDirectory::InProject { relative_path } => WorkDirectory::InProject {
// relative_path: relative_path.clone(),
// },
// WorkDirectory::AboveProject {
// absolute_path,
// location_in_repo,
// } => WorkDirectory::AboveProject {
// absolute_path: absolute_path.canonicalize().unwrap().into(),
// location_in_repo: location_in_repo.clone(),
// },
// }
//}
fn path_key(&self) -> PathKey {
match self {
@@ -699,8 +680,7 @@ enum ScanState {
}
struct UpdateObservationState {
snapshots_tx:
mpsc::UnboundedSender<(LocalSnapshot, UpdatedEntriesSet, UpdatedGitRepositoriesSet)>,
snapshots_tx: mpsc::UnboundedSender<(LocalSnapshot, UpdatedEntriesSet)>,
resume_updates: watch::Sender<()>,
_maintain_remote_snapshot: Task<Option<()>>,
}
@@ -824,10 +804,10 @@ impl Worktree {
let background_snapshot = Arc::new(Mutex::new((
snapshot.clone(),
Vec::<WorktreeRelatedMessage>::new(),
Vec::<proto::UpdateWorktree>::new(),
)));
let (background_updates_tx, mut background_updates_rx) =
mpsc::unbounded::<WorktreeRelatedMessage>();
mpsc::unbounded::<proto::UpdateWorktree>();
let (mut snapshot_updated_tx, mut snapshot_updated_rx) = watch::channel();
let worktree_id = snapshot.id();
@@ -872,25 +852,14 @@ impl Worktree {
cx.spawn(async move |this, cx| {
while (snapshot_updated_rx.recv().await).is_some() {
this.update(cx, |this, cx| {
let mut git_repos_changed = false;
let mut entries_changed = false;
let this = this.as_remote_mut().unwrap();
{
let mut lock = this.background_snapshot.lock();
this.snapshot = lock.0.clone();
for update in lock.1.drain(..) {
entries_changed |= match &update {
WorktreeRelatedMessage::UpdateWorktree(update_worktree) => {
!update_worktree.updated_entries.is_empty()
|| !update_worktree.removed_entries.is_empty()
}
_ => false,
};
git_repos_changed |= matches!(
update,
WorktreeRelatedMessage::UpdateRepository(_)
| WorktreeRelatedMessage::RemoveRepository(_)
);
entries_changed |= !update.updated_entries.is_empty()
|| !update.removed_entries.is_empty();
if let Some(tx) = &this.update_observer {
tx.unbounded_send(update).ok();
}
@@ -900,9 +869,6 @@ impl Worktree {
if entries_changed {
cx.emit(Event::UpdatedEntries(Arc::default()));
}
if git_repos_changed {
cx.emit(Event::UpdatedGitRepositories(Arc::default()));
}
cx.notify();
while let Some((scan_id, _)) = this.snapshot_subscriptions.front() {
if this.observed_snapshot(*scan_id) {
@@ -1027,7 +993,7 @@ impl Worktree {
pub fn observe_updates<F, Fut>(&mut self, project_id: u64, cx: &Context<Worktree>, callback: F)
where
F: 'static + Send + Fn(WorktreeRelatedMessage) -> Fut,
F: 'static + Send + Fn(proto::UpdateWorktree) -> Fut,
Fut: 'static + Send + Future<Output = bool>,
{
match self {
@@ -1069,15 +1035,15 @@ impl Worktree {
Worktree::Local(this) => {
let path = Arc::from(path);
let snapshot = this.snapshot();
cx.spawn(async move |cx| {
if let Some(repo) = snapshot.repository_for_path(&path) {
cx.spawn(async move |_cx| {
if let Some(repo) = snapshot.local_repo_containing_path(&path) {
if let Some(repo_path) = repo.relativize(&path).log_err() {
if let Some(git_repo) =
snapshot.git_repositories.get(&repo.work_directory_id)
{
return Ok(git_repo
.repo_ptr
.load_index_text(repo_path, cx.clone())
.load_index_text(None, repo_path)
.await);
}
}
@@ -1096,16 +1062,13 @@ impl Worktree {
Worktree::Local(this) => {
let path = Arc::from(path);
let snapshot = this.snapshot();
cx.spawn(async move |cx| {
if let Some(repo) = snapshot.repository_for_path(&path) {
cx.spawn(async move |_cx| {
if let Some(repo) = snapshot.local_repo_containing_path(&path) {
if let Some(repo_path) = repo.relativize(&path).log_err() {
if let Some(git_repo) =
snapshot.git_repositories.get(&repo.work_directory_id)
{
return Ok(git_repo
.repo_ptr
.load_committed_text(repo_path, cx.clone())
.await);
return Ok(git_repo.repo_ptr.load_committed_text(repo_path).await);
}
}
}
@@ -1611,11 +1574,7 @@ impl LocalWorktree {
if let Some(share) = self.update_observer.as_mut() {
share
.snapshots_tx
.unbounded_send((
self.snapshot.clone(),
entry_changes.clone(),
repo_changes.clone(),
))
.unbounded_send((self.snapshot.clone(), entry_changes.clone()))
.ok();
}
@@ -1656,10 +1615,8 @@ impl LocalWorktree {
|| new_repo.status_scan_id != old_repo.status_scan_id
{
if let Some(entry) = new_snapshot.entry_for_id(new_entry_id) {
let old_repo = old_snapshot
.repositories
.get(&PathKey(entry.path.clone()), &())
.cloned();
let old_repo =
old_snapshot.repository_for_id(old_entry_id).cloned();
changes.push((
entry.clone(),
GitRepositoryChange {
@@ -1673,10 +1630,8 @@ impl LocalWorktree {
}
Ordering::Greater => {
if let Some(entry) = old_snapshot.entry_for_id(old_entry_id) {
let old_repo = old_snapshot
.repositories
.get(&PathKey(entry.path.clone()), &())
.cloned();
let old_repo =
old_snapshot.repository_for_id(old_entry_id).cloned();
changes.push((
entry.clone(),
GitRepositoryChange {
@@ -1701,10 +1656,7 @@ impl LocalWorktree {
}
(None, Some((entry_id, _))) => {
if let Some(entry) = old_snapshot.entry_for_id(entry_id) {
let old_repo = old_snapshot
.repositories
.get(&PathKey(entry.path.clone()), &())
.cloned();
let old_repo = old_snapshot.repository_for_id(entry_id).cloned();
changes.push((
entry.clone(),
GitRepositoryChange {
@@ -2320,7 +2272,7 @@ impl LocalWorktree {
fn observe_updates<F, Fut>(&mut self, project_id: u64, cx: &Context<Worktree>, callback: F)
where
F: 'static + Send + Fn(WorktreeRelatedMessage) -> Fut,
F: 'static + Send + Fn(proto::UpdateWorktree) -> Fut,
Fut: 'static + Send + Future<Output = bool>,
{
if let Some(observer) = self.update_observer.as_mut() {
@@ -2330,26 +2282,23 @@ impl LocalWorktree {
let (resume_updates_tx, mut resume_updates_rx) = watch::channel::<()>();
let (snapshots_tx, mut snapshots_rx) =
mpsc::unbounded::<(LocalSnapshot, UpdatedEntriesSet, UpdatedGitRepositoriesSet)>();
mpsc::unbounded::<(LocalSnapshot, UpdatedEntriesSet)>();
snapshots_tx
.unbounded_send((self.snapshot(), Arc::default(), Arc::default()))
.unbounded_send((self.snapshot(), Arc::default()))
.ok();
let worktree_id = cx.entity_id().as_u64();
let _maintain_remote_snapshot = cx.background_spawn(async move {
let mut is_first = true;
while let Some((snapshot, entry_changes, repo_changes)) = snapshots_rx.next().await {
let updates = if is_first {
while let Some((snapshot, entry_changes)) = snapshots_rx.next().await {
let update = if is_first {
is_first = false;
snapshot.build_initial_update(project_id, worktree_id)
} else {
snapshot.build_update(project_id, worktree_id, entry_changes, repo_changes)
snapshot.build_update(project_id, worktree_id, entry_changes)
};
for update in updates
.into_iter()
.flat_map(proto::split_worktree_related_message)
{
for update in proto::split_worktree_update(update) {
let _ = resume_updates_rx.try_recv();
loop {
let result = callback(update.clone());
@@ -2412,7 +2361,7 @@ impl RemoteWorktree {
self.disconnected = true;
}
pub fn update_from_remote(&self, update: WorktreeRelatedMessage) {
pub fn update_from_remote(&self, update: proto::UpdateWorktree) {
if let Some(updates_tx) = &self.updates_tx {
updates_tx
.unbounded_send(update)
@@ -2422,41 +2371,29 @@ impl RemoteWorktree {
fn observe_updates<F, Fut>(&mut self, project_id: u64, cx: &Context<Worktree>, callback: F)
where
F: 'static + Send + Fn(WorktreeRelatedMessage) -> Fut,
F: 'static + Send + Fn(proto::UpdateWorktree) -> Fut,
Fut: 'static + Send + Future<Output = bool>,
{
let (tx, mut rx) = mpsc::unbounded();
let initial_updates = self
let initial_update = self
.snapshot
.build_initial_update(project_id, self.id().to_proto());
self.update_observer = Some(tx);
cx.spawn(async move |this, cx| {
let mut updates = initial_updates;
let mut update = initial_update;
'outer: loop {
for mut update in updates {
// SSH projects use a special project ID of 0, and we need to
// remap it to the correct one here.
match &mut update {
WorktreeRelatedMessage::UpdateWorktree(update_worktree) => {
update_worktree.project_id = project_id;
}
WorktreeRelatedMessage::UpdateRepository(update_repository) => {
update_repository.project_id = project_id;
}
WorktreeRelatedMessage::RemoveRepository(remove_repository) => {
remove_repository.project_id = project_id;
}
};
// SSH projects use a special project ID of 0, and we need to
// remap it to the correct one here.
update.project_id = project_id;
for chunk in split_worktree_related_message(update) {
if !callback(chunk).await {
break 'outer;
}
for chunk in split_worktree_update(update) {
if !callback(chunk).await {
break 'outer;
}
}
if let Some(next_update) = rx.next().await {
updates = vec![next_update];
update = next_update;
} else {
break;
}
@@ -2616,11 +2553,7 @@ impl Snapshot {
self.abs_path.as_path()
}
fn build_initial_update(
&self,
project_id: u64,
worktree_id: u64,
) -> Vec<WorktreeRelatedMessage> {
fn build_initial_update(&self, project_id: u64, worktree_id: u64) -> proto::UpdateWorktree {
let mut updated_entries = self
.entries_by_path
.iter()
@@ -2628,7 +2561,7 @@ impl Snapshot {
.collect::<Vec<_>>();
updated_entries.sort_unstable_by_key(|e| e.id);
[proto::UpdateWorktree {
proto::UpdateWorktree {
project_id,
worktree_id,
abs_path: self.abs_path().to_proto(),
@@ -2641,14 +2574,15 @@ impl Snapshot {
updated_repositories: Vec::new(),
removed_repositories: Vec::new(),
}
.into()]
.into_iter()
.chain(
self.repositories
.iter()
.map(|repository| repository.initial_update(project_id, self.scan_id).into()),
)
.collect()
}
pub fn work_directory_abs_path(&self, work_directory: &WorkDirectory) -> Result<PathBuf> {
match work_directory {
WorkDirectory::InProject { relative_path } => self.absolutize(relative_path),
WorkDirectory::AboveProject { absolute_path, .. } => {
Ok(absolute_path.as_ref().to_owned())
}
}
}
pub fn absolutize(&self, path: &Path) -> Result<PathBuf> {
@@ -2712,15 +2646,24 @@ impl Snapshot {
Some(removed_entry.path)
}
//#[cfg(any(test, feature = "test-support"))]
//pub fn status_for_file(&self, path: impl AsRef<Path>) -> Option<FileStatus> {
// let path = path.as_ref();
// self.repository_for_path(path).and_then(|repo| {
// let repo_path = repo.relativize(path).unwrap();
// repo.statuses_by_path
// .get(&PathKey(repo_path.0), &())
// .map(|entry| entry.status)
// })
//}
#[cfg(any(test, feature = "test-support"))]
pub fn status_for_file(&self, path: impl AsRef<Path>) -> Option<FileStatus> {
let path = path.as_ref();
self.repository_for_path(path).and_then(|repo| {
let repo_path = repo.relativize(path).unwrap();
repo.statuses_by_path
.get(&PathKey(repo_path.0), &())
.map(|entry| entry.status)
})
pub fn status_for_file_abs_path(&self, abs_path: impl AsRef<Path>) -> Option<FileStatus> {
let abs_path = abs_path.as_ref();
let repo = self.repository_containing_abs_path(abs_path)?;
let repo_path = repo.relativize_abs_path(abs_path)?;
let status = repo.statuses_by_path.get(&PathKey(repo_path.0), &())?;
Some(status.status)
}
fn update_abs_path(&mut self, abs_path: SanitizedPath, root_name: String) {
@@ -2731,95 +2674,7 @@ impl Snapshot {
}
}
pub(crate) fn apply_update_repository(
&mut self,
update: proto::UpdateRepository,
) -> Result<()> {
// NOTE: this is practically but not semantically correct. For now we're using the
// ID field to store the work directory ID, but eventually it will be a different
// kind of ID.
let work_directory_id = ProjectEntryId::from_proto(update.id);
if let Some(work_dir_entry) = self.entry_for_id(work_directory_id) {
let conflicted_paths = TreeSet::from_ordered_entries(
update
.current_merge_conflicts
.into_iter()
.map(|path| RepoPath(Path::new(&path).into())),
);
if self
.repositories
.contains(&PathKey(work_dir_entry.path.clone()), &())
{
let edits = update
.removed_statuses
.into_iter()
.map(|path| Edit::Remove(PathKey(FromProto::from_proto(path))))
.chain(
update
.updated_statuses
.into_iter()
.filter_map(|updated_status| {
Some(Edit::Insert(updated_status.try_into().log_err()?))
}),
)
.collect::<Vec<_>>();
self.repositories
.update(&PathKey(work_dir_entry.path.clone()), &(), |repo| {
repo.current_branch = update.branch_summary.as_ref().map(proto_to_branch);
repo.statuses_by_path.edit(edits, &());
repo.current_merge_conflicts = conflicted_paths
});
} else {
let statuses = SumTree::from_iter(
update
.updated_statuses
.into_iter()
.filter_map(|updated_status| updated_status.try_into().log_err()),
&(),
);
self.repositories.insert_or_replace(
RepositoryEntry {
work_directory_id,
// When syncing repository entries from a peer, we don't need
// the location_in_repo field, since git operations don't happen locally
// anyway.
work_directory: WorkDirectory::InProject {
relative_path: work_dir_entry.path.clone(),
},
current_branch: update.branch_summary.as_ref().map(proto_to_branch),
statuses_by_path: statuses,
current_merge_conflicts: conflicted_paths,
work_directory_abs_path: update.abs_path.into(),
},
&(),
);
}
} else {
log::error!("no work directory entry for repository {:?}", update.id)
}
Ok(())
}
pub(crate) fn apply_remove_repository(
&mut self,
update: proto::RemoveRepository,
) -> Result<()> {
// NOTE: this is practically but not semantically correct. For now we're using the
// ID field to store the work directory ID, but eventually it will be a different
// kind of ID.
let work_directory_id = ProjectEntryId::from_proto(update.id);
self.repositories.retain(&(), |entry: &RepositoryEntry| {
entry.work_directory_id != work_directory_id
});
Ok(())
}
pub(crate) fn apply_update_worktree(
pub(crate) fn apply_remote_update(
&mut self,
update: proto::UpdateWorktree,
always_included_paths: &PathMatcher,
@@ -2875,24 +2730,6 @@ impl Snapshot {
Ok(())
}
pub(crate) fn apply_remote_update(
&mut self,
update: WorktreeRelatedMessage,
always_included_paths: &PathMatcher,
) -> Result<()> {
match update {
WorktreeRelatedMessage::UpdateWorktree(update) => {
self.apply_update_worktree(update, always_included_paths)
}
WorktreeRelatedMessage::UpdateRepository(update) => {
self.apply_update_repository(update)
}
WorktreeRelatedMessage::RemoveRepository(update) => {
self.apply_remove_repository(update)
}
}
}
pub fn entry_count(&self) -> usize {
self.entries_by_path.summary().count
}
@@ -2972,48 +2809,18 @@ impl Snapshot {
&self.repositories
}
/// Get the repository whose work directory corresponds to the given path.
fn repository(&self, work_directory: PathKey) -> Option<RepositoryEntry> {
self.repositories.get(&work_directory, &()).cloned()
}
/// Get the repository whose work directory contains the given path.
#[track_caller]
pub fn repository_for_path(&self, path: &Path) -> Option<&RepositoryEntry> {
fn repository_containing_abs_path(&self, abs_path: &Path) -> Option<&RepositoryEntry> {
self.repositories
.iter()
.filter(|repo| repo.directory_contains(path))
.filter(|repo| repo.directory_contains_abs_path(abs_path))
.last()
}
/// Given an ordered iterator of entries, returns an iterator of those entries,
/// along with their containing git repository.
#[cfg(test)]
#[track_caller]
fn entries_with_repositories<'a>(
&'a self,
entries: impl 'a + Iterator<Item = &'a Entry>,
) -> impl 'a + Iterator<Item = (&'a Entry, Option<&'a RepositoryEntry>)> {
let mut containing_repos = Vec::<&RepositoryEntry>::new();
let mut repositories = self.repositories.iter().peekable();
entries.map(move |entry| {
while let Some(repository) = containing_repos.last() {
if repository.directory_contains(&entry.path) {
break;
} else {
containing_repos.pop();
}
}
while let Some(repository) = repositories.peek() {
if repository.directory_contains(&entry.path) {
containing_repos.push(repositories.next().unwrap());
} else {
break;
}
}
let repo = containing_repos.last().copied();
(entry, repo)
})
fn repository_for_id(&self, id: ProjectEntryId) -> Option<&RepositoryEntry> {
self.repositories
.iter()
.find(|repo| repo.work_directory_id == id)
}
pub fn paths(&self) -> impl Iterator<Item = &Arc<Path>> {
@@ -3098,10 +2905,18 @@ impl Snapshot {
}
impl LocalSnapshot {
pub fn local_repo_for_path(&self, path: &Path) -> Option<&LocalRepositoryEntry> {
let repository_entry = self.repository_for_path(path)?;
let work_directory_id = repository_entry.work_directory_id();
self.git_repositories.get(&work_directory_id)
pub fn local_repo_for_work_directory_path(&self, path: &Path) -> Option<&LocalRepositoryEntry> {
self.git_repositories
.iter()
.map(|(_, entry)| entry)
.find(|entry| entry.work_directory.path_key() == PathKey(path.into()))
}
pub fn local_repo_containing_path(&self, path: &Path) -> Option<&LocalRepositoryEntry> {
self.git_repositories
.values()
.filter(|local_repo| path.starts_with(&local_repo.path_key().0))
.max_by_key(|local_repo| local_repo.path_key())
}
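The new `local_repo_containing_path` above filters to repositories whose work directory is a prefix of the path, then takes the maximum `path_key`, which selects the innermost (deepest) containing repository when repos are nested. A simplified sketch of that selection rule over plain paths, with component count standing in for the real `PathKey` ordering:

```rust
use std::path::{Path, PathBuf};

// Among all work directories that are a prefix of `path`, pick the deepest
// one, i.e. the innermost containing repository.
fn repo_containing<'a>(work_dirs: &'a [PathBuf], path: &Path) -> Option<&'a PathBuf> {
    work_dirs
        .iter()
        .filter(|dir| path.starts_with(dir))
        .max_by_key(|dir| dir.components().count())
}

fn main() {
    let repos = vec![PathBuf::from("project"), PathBuf::from("project/vendored")];
    // A file inside the nested repo resolves to the inner work directory.
    let found = repo_containing(&repos, Path::new("project/vendored/src/lib.rs"));
    assert_eq!(found, Some(&PathBuf::from("project/vendored")));
    // A file outside every work directory matches no repository.
    assert!(repo_containing(&repos, Path::new("other/file.rs")).is_none());
    println!("ok");
}
```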
fn build_update(
@@ -3109,11 +2924,9 @@ impl LocalSnapshot {
project_id: u64,
worktree_id: u64,
entry_changes: UpdatedEntriesSet,
repo_changes: UpdatedGitRepositoriesSet,
) -> Vec<WorktreeRelatedMessage> {
) -> proto::UpdateWorktree {
let mut updated_entries = Vec::new();
let mut removed_entries = Vec::new();
let mut updates = Vec::new();
for (_, entry_id, path_change) in entry_changes.iter() {
if let PathChange::Removed = path_change {
@@ -3123,55 +2936,25 @@ impl LocalSnapshot {
}
}
for (entry, change) in repo_changes.iter() {
let new_repo = self.repositories.get(&PathKey(entry.path.clone()), &());
match (&change.old_repository, new_repo) {
(Some(old_repo), Some(new_repo)) => {
updates.push(
new_repo
.build_update(old_repo, project_id, self.scan_id)
.into(),
);
}
(None, Some(new_repo)) => {
updates.push(new_repo.initial_update(project_id, self.scan_id).into());
}
(Some(old_repo), None) => {
updates.push(
proto::RemoveRepository {
project_id,
id: old_repo.work_directory_id.to_proto(),
}
.into(),
);
}
_ => {}
}
}
removed_entries.sort_unstable();
updated_entries.sort_unstable_by_key(|e| e.id);
// TODO - optimize, knowing that removed_entries are sorted.
removed_entries.retain(|id| updated_entries.binary_search_by_key(id, |e| e.id).is_err());
updates.push(
proto::UpdateWorktree {
project_id,
worktree_id,
abs_path: self.abs_path().to_proto(),
root_name: self.root_name().to_string(),
updated_entries,
removed_entries,
scan_id: self.scan_id as u64,
is_last_update: self.completed_scan_id == self.scan_id,
// Sent in separate messages.
updated_repositories: Vec::new(),
removed_repositories: Vec::new(),
}
.into(),
);
updates
proto::UpdateWorktree {
project_id,
worktree_id,
abs_path: self.abs_path().to_proto(),
root_name: self.root_name().to_string(),
updated_entries,
removed_entries,
scan_id: self.scan_id as u64,
is_last_update: self.completed_scan_id == self.scan_id,
// Sent in separate messages.
updated_repositories: Vec::new(),
removed_repositories: Vec::new(),
}
}
fn insert_entry(&mut self, mut entry: Entry, fs: &dyn Fs) -> Entry {
@@ -3351,7 +3134,7 @@ impl LocalSnapshot {
let work_dir_paths = self
.repositories
.iter()
.map(|repo| repo.work_directory.path_key())
.map(|repo| repo.work_directory_abs_path.clone())
.collect::<HashSet<_>>();
assert_eq!(dotgit_paths.len(), work_dir_paths.len());
assert_eq!(self.repositories.iter().count(), work_dir_paths.len());
@@ -3560,14 +3343,9 @@ impl BackgroundScannerState {
.git_repositories
.retain(|id, _| removed_ids.binary_search(id).is_err());
self.snapshot.repositories.retain(&(), |repository| {
let retain = !repository.work_directory.path_key().0.starts_with(path);
if !retain {
log::info!(
"dropping repository entry for {:?}",
repository.work_directory
);
}
retain
removed_ids
.binary_search(&repository.work_directory_id)
.is_err()
});
#[cfg(test)]
@@ -3622,9 +3400,13 @@ impl BackgroundScannerState {
fs: &dyn Fs,
watcher: &dyn Watcher,
) -> Option<LocalRepositoryEntry> {
// TODO canonicalize here
log::info!("insert git repository for {dot_git_path:?}");
let work_dir_entry = self.snapshot.entry_for_path(work_directory.path_key().0)?;
let work_directory_abs_path = self.snapshot.absolutize(&work_dir_entry.path).log_err()?;
let work_directory_abs_path = self
.snapshot
.work_directory_abs_path(&work_directory)
.log_err()?;
if self
.snapshot
@@ -3676,18 +3458,18 @@ impl BackgroundScannerState {
self.snapshot.repositories.insert_or_replace(
RepositoryEntry {
work_directory_id,
work_directory: work_directory.clone(),
work_directory_abs_path,
current_branch: None,
statuses_by_path: Default::default(),
current_merge_conflicts: Default::default(),
worktree_scan_id: 0,
},
&(),
);
let local_repository = LocalRepositoryEntry {
work_directory_id,
work_directory: work_directory.clone(),
work_directory,
git_dir_scan_id: 0,
status_scan_id: 0,
repo_ptr: repository.clone(),
@@ -4120,22 +3902,53 @@ impl<'a, S: Summary> sum_tree::Dimension<'a, PathSummary<S>> for PathProgress<'a
}
}
#[derive(Clone, Debug)]
pub struct AbsPathSummary {
max_path: Arc<Path>,
}
impl Summary for AbsPathSummary {
type Context = ();
fn zero(_: &Self::Context) -> Self {
Self {
max_path: Path::new("").into(),
}
}
fn add_summary(&mut self, rhs: &Self, _: &Self::Context) {
self.max_path = rhs.max_path.clone();
}
}
impl sum_tree::Item for RepositoryEntry {
type Summary = PathSummary<Unit>;
type Summary = AbsPathSummary;
fn summary(&self, _: &<Self::Summary as Summary>::Context) -> Self::Summary {
PathSummary {
max_path: self.work_directory.path_key().0,
item_summary: Unit,
AbsPathSummary {
max_path: self.work_directory_abs_path.as_path().into(),
}
}
}
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct AbsPathKey(pub Arc<Path>);
impl<'a> sum_tree::Dimension<'a, AbsPathSummary> for AbsPathKey {
fn zero(_: &()) -> Self {
Self(Path::new("").into())
}
fn add_summary(&mut self, summary: &'a AbsPathSummary, _: &()) {
self.0 = summary.max_path.clone();
}
}
impl sum_tree::KeyedItem for RepositoryEntry {
type Key = PathKey;
type Key = AbsPathKey;
fn key(&self) -> Self::Key {
self.work_directory.path_key()
AbsPathKey(self.work_directory_abs_path.as_path().into())
}
}
@@ -4375,7 +4188,7 @@ impl<'a> sum_tree::Dimension<'a, PathEntrySummary> for ProjectEntryId {
}
#[derive(Clone, Debug, Eq, PartialEq, Ord, PartialOrd, Hash)]
pub struct PathKey(Arc<Path>);
pub struct PathKey(pub Arc<Path>);
impl Default for PathKey {
fn default() -> Self {
@@ -5191,11 +5004,11 @@ impl BackgroundScanner {
// Group all relative paths by their git repository.
let mut paths_by_git_repo = HashMap::default();
for relative_path in relative_paths.iter() {
for (relative_path, abs_path) in relative_paths.iter().zip(&abs_paths) {
let repository_data = state
.snapshot
.local_repo_for_path(relative_path)
.zip(state.snapshot.repository_for_path(relative_path));
.local_repo_containing_path(relative_path)
.zip(state.snapshot.repository_containing_abs_path(abs_path));
if let Some((local_repo, entry)) = repository_data {
if let Ok(repo_path) = local_repo.relativize(relative_path) {
paths_by_git_repo
@@ -5210,8 +5023,8 @@ impl BackgroundScanner {
}
}
for (work_directory, mut paths) in paths_by_git_repo {
if let Ok(status) = paths.repo.status(&paths.repo_paths) {
for (_work_directory, mut paths) in paths_by_git_repo {
if let Ok(status) = paths.repo.status_blocking(&paths.repo_paths) {
let mut changed_path_statuses = Vec::new();
let statuses = paths.entry.statuses_by_path.clone();
let mut cursor = statuses.cursor::<PathProgress>(&());
@@ -5239,7 +5052,7 @@ impl BackgroundScanner {
if !changed_path_statuses.is_empty() {
let work_directory_id = state.snapshot.repositories.update(
&work_directory.path_key(),
&AbsPathKey(paths.entry.work_directory_abs_path.as_path().into()),
&(),
move |repository_entry| {
repository_entry
@@ -5324,14 +5137,13 @@ impl BackgroundScanner {
.components()
.any(|component| component.as_os_str() == *DOT_GIT)
{
if let Some(repository) = snapshot.repository(PathKey(path.clone())) {
if let Some(local_repo) = snapshot.local_repo_for_work_directory_path(path) {
let id = local_repo.work_directory_id;
log::debug!("remove repo path: {:?}", path);
snapshot.git_repositories.remove(&id);
snapshot
.git_repositories
.remove(&repository.work_directory_id);
snapshot
.snapshot
.repositories
.remove(&repository.work_directory.path_key(), &());
.retain(&(), |repo_entry| repo_entry.work_directory_id != id);
return Some(());
}
}
@@ -5540,6 +5352,17 @@ impl BackgroundScanner {
entry.status_scan_id = scan_id;
},
);
if let Some(repo_entry) = state
.snapshot
.repository_for_id(local_repository.work_directory_id)
{
let abs_path_key =
AbsPathKey(repo_entry.work_directory_abs_path.as_path().into());
state
.snapshot
.repositories
.update(&abs_path_key, &(), |repo| repo.worktree_scan_id = scan_id);
}
local_repository
}
@@ -5674,8 +5497,11 @@ async fn update_branches(
let branches = repository.repo().branches().await?;
let snapshot = state.lock().snapshot.snapshot.clone();
let mut repository = snapshot
.repository(repository.work_directory.path_key())
.context("Missing repository")?;
.repositories
.iter()
.find(|repo_entry| repo_entry.work_directory_id == repository.work_directory_id)
.context("missing repository")?
.clone();
repository.current_branch = branches.into_iter().find(|branch| branch.is_head);
let mut state = state.lock();
@@ -5702,7 +5528,7 @@ async fn do_git_status_update(
log::trace!("updating git statuses for repo {repository_name}");
let Some(statuses) = local_repository
.repo()
.status(&[git::WORK_DIRECTORY_REPO_PATH.clone()])
.status_blocking(&[git::WORK_DIRECTORY_REPO_PATH.clone()])
.log_err()
else {
return;
@@ -5717,9 +5543,10 @@ async fn do_git_status_update(
let snapshot = job_state.lock().snapshot.snapshot.clone();
let Some(mut repository) = snapshot
.repository(local_repository.work_directory.path_key())
.context("Tried to update git statuses for a repository that isn't in the snapshot")
.repository_for_id(local_repository.work_directory_id)
.context("tried to update git statuses for a repository that isn't in the snapshot")
.log_err()
.cloned()
else {
return;
};
@@ -5731,7 +5558,7 @@ async fn do_git_status_update(
let mut new_entries_by_path = SumTree::new(&());
for (repo_path, status) in statuses.entries.iter() {
let project_path = repository.work_directory.try_unrelativize(repo_path);
let project_path = local_repository.work_directory.try_unrelativize(repo_path);
new_entries_by_path.insert_or_replace(
StatusEntry {
@@ -5749,6 +5576,7 @@ async fn do_git_status_update(
}
}
log::trace!("statuses: {:#?}", new_entries_by_path);
repository.statuses_by_path = new_entries_by_path;
let mut state = job_state.lock();
state


@@ -1,6 +1,6 @@
use crate::{
worktree_settings::WorktreeSettings, Entry, EntryKind, Event, PathChange, WorkDirectory,
Worktree, WorktreeModelHandle,
worktree_settings::WorktreeSettings, Entry, EntryKind, Event, PathChange, StatusEntry,
WorkDirectory, Worktree, WorktreeModelHandle,
};
use anyhow::Result;
use fs::{FakeFs, Fs, RealFs, RemoveOptions};
@@ -15,7 +15,7 @@ use parking_lot::Mutex;
use postage::stream::Stream;
use pretty_assertions::assert_eq;
use rand::prelude::*;
use rpc::proto::WorktreeRelatedMessage;
use serde_json::json;
use settings::{Settings, SettingsStore};
use std::{
@@ -351,7 +351,7 @@ async fn test_renaming_case_only(cx: &mut TestAppContext) {
const OLD_NAME: &str = "aaa.rs";
const NEW_NAME: &str = "AAA.rs";
let fs = Arc::new(RealFs::default());
let fs = Arc::new(RealFs::new(None, cx.executor()));
let temp_root = TempTree::new(json!({
OLD_NAME: "",
}));
@@ -876,7 +876,7 @@ async fn test_write_file(cx: &mut TestAppContext) {
let worktree = Worktree::local(
dir.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -965,7 +965,7 @@ async fn test_file_scan_inclusions(cx: &mut TestAppContext) {
let tree = Worktree::local(
dir.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -1028,7 +1028,7 @@ async fn test_file_scan_exclusions_overrules_inclusions(cx: &mut TestAppContext)
let tree = Worktree::local(
dir.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -1085,7 +1085,7 @@ async fn test_file_scan_inclusions_reindexes_on_setting_change(cx: &mut TestAppC
let tree = Worktree::local(
dir.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -1166,7 +1166,7 @@ async fn test_file_scan_exclusions(cx: &mut TestAppContext) {
let tree = Worktree::local(
dir.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -1271,7 +1271,7 @@ async fn test_fs_events_in_exclusions(cx: &mut TestAppContext) {
let tree = Worktree::local(
dir.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -1382,7 +1382,7 @@ async fn test_fs_events_in_dot_git_worktree(cx: &mut TestAppContext) {
let tree = Worktree::local(
dot_git_worktree_dir.clone(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -1512,7 +1512,7 @@ async fn test_create_dir_all_on_create_entry(cx: &mut TestAppContext) {
assert!(tree.entry_for_path("a/b/").unwrap().is_dir());
});
let fs_real = Arc::new(RealFs::default());
let fs_real = Arc::new(RealFs::new(None, cx.executor()));
let temp_root = TempTree::new(json!({
"a": {}
}));
@@ -1665,12 +1665,7 @@ async fn test_random_worktree_operations_during_initial_scan(
for (i, snapshot) in snapshots.into_iter().enumerate().rev() {
let mut updated_snapshot = snapshot.clone();
for update in updates.lock().iter() {
let scan_id = match update {
WorktreeRelatedMessage::UpdateWorktree(update) => update.scan_id,
WorktreeRelatedMessage::UpdateRepository(update) => update.scan_id,
WorktreeRelatedMessage::RemoveRepository(_) => u64::MAX,
};
if scan_id >= updated_snapshot.scan_id() as u64 {
if update.scan_id >= updated_snapshot.scan_id() as u64 {
updated_snapshot
.apply_remote_update(update.clone(), &settings.file_scan_inclusions)
.unwrap();
@@ -1807,12 +1802,7 @@ async fn test_random_worktree_changes(cx: &mut TestAppContext, mut rng: StdRng)
for (i, mut prev_snapshot) in snapshots.into_iter().enumerate().rev() {
for update in updates.lock().iter() {
let scan_id = match update {
WorktreeRelatedMessage::UpdateWorktree(update) => update.scan_id,
WorktreeRelatedMessage::UpdateRepository(update) => update.scan_id,
WorktreeRelatedMessage::RemoveRepository(_) => u64::MAX,
};
if scan_id >= prev_snapshot.scan_id() as u64 {
if update.scan_id >= prev_snapshot.scan_id() as u64 {
prev_snapshot
.apply_remote_update(update.clone(), &settings.file_scan_inclusions)
.unwrap();
@@ -2136,7 +2126,7 @@ async fn test_rename_work_directory(cx: &mut TestAppContext) {
let tree = Worktree::local(
root_path,
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -2157,15 +2147,15 @@ async fn test_rename_work_directory(cx: &mut TestAppContext) {
let tree = tree.read(cx);
let repo = tree.repositories.iter().next().unwrap();
assert_eq!(
repo.work_directory,
WorkDirectory::in_project("projects/project1")
repo.work_directory_abs_path,
root_path.join("projects/project1")
);
assert_eq!(
tree.status_for_file(Path::new("projects/project1/a")),
repo.status_for_path(&"a".into()).map(|entry| entry.status),
Some(StatusCode::Modified.worktree()),
);
assert_eq!(
tree.status_for_file(Path::new("projects/project1/b")),
repo.status_for_path(&"b".into()).map(|entry| entry.status),
Some(FileStatus::Untracked),
);
});
@@ -2181,201 +2171,20 @@ async fn test_rename_work_directory(cx: &mut TestAppContext) {
let tree = tree.read(cx);
let repo = tree.repositories.iter().next().unwrap();
assert_eq!(
repo.work_directory,
WorkDirectory::in_project("projects/project2")
repo.work_directory_abs_path,
root_path.join("projects/project2")
);
assert_eq!(
tree.status_for_file(Path::new("projects/project2/a")),
Some(StatusCode::Modified.worktree()),
repo.status_for_path(&"a".into()).unwrap().status,
StatusCode::Modified.worktree(),
);
assert_eq!(
tree.status_for_file(Path::new("projects/project2/b")),
Some(FileStatus::Untracked),
repo.status_for_path(&"b".into()).unwrap().status,
FileStatus::Untracked,
);
});
}
#[gpui::test]
async fn test_home_dir_as_git_repository(cx: &mut TestAppContext) {
init_test(cx);
cx.executor().allow_parking();
let fs = FakeFs::new(cx.background_executor.clone());
fs.insert_tree(
"/root",
json!({
"home": {
".git": {},
"project": {
"a.txt": "A"
},
},
}),
)
.await;
fs.set_home_dir(Path::new(path!("/root/home")).to_owned());
let tree = Worktree::local(
Path::new(path!("/root/home/project")),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
cx.read(|cx| tree.read(cx).as_local().unwrap().scan_complete())
.await;
tree.flush_fs_events(cx).await;
tree.read_with(cx, |tree, _cx| {
let tree = tree.as_local().unwrap();
let repo = tree.repository_for_path(path!("a.txt").as_ref());
assert!(repo.is_none());
});
let home_tree = Worktree::local(
Path::new(path!("/root/home")),
true,
fs.clone(),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
cx.read(|cx| home_tree.read(cx).as_local().unwrap().scan_complete())
.await;
home_tree.flush_fs_events(cx).await;
home_tree.read_with(cx, |home_tree, _cx| {
let home_tree = home_tree.as_local().unwrap();
let repo = home_tree.repository_for_path(path!("project/a.txt").as_ref());
assert_eq!(
repo.map(|repo| &repo.work_directory),
Some(&WorkDirectory::InProject {
relative_path: Path::new("").into()
})
);
})
}
#[gpui::test]
async fn test_git_repository_for_path(cx: &mut TestAppContext) {
init_test(cx);
cx.executor().allow_parking();
let root = TempTree::new(json!({
"c.txt": "",
"dir1": {
".git": {},
"deps": {
"dep1": {
".git": {},
"src": {
"a.txt": ""
}
}
},
"src": {
"b.txt": ""
}
},
}));
let tree = Worktree::local(
root.path(),
true,
Arc::new(RealFs::default()),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
cx.read(|cx| tree.read(cx).as_local().unwrap().scan_complete())
.await;
tree.flush_fs_events(cx).await;
tree.read_with(cx, |tree, _cx| {
let tree = tree.as_local().unwrap();
assert!(tree.repository_for_path("c.txt".as_ref()).is_none());
let repo = tree.repository_for_path("dir1/src/b.txt".as_ref()).unwrap();
assert_eq!(repo.work_directory, WorkDirectory::in_project("dir1"));
let repo = tree
.repository_for_path("dir1/deps/dep1/src/a.txt".as_ref())
.unwrap();
assert_eq!(
repo.work_directory,
WorkDirectory::in_project("dir1/deps/dep1")
);
let entries = tree.files(false, 0);
let paths_with_repos = tree
.entries_with_repositories(entries)
.map(|(entry, repo)| {
(
entry.path.as_ref(),
repo.map(|repo| repo.work_directory.clone()),
)
})
.collect::<Vec<_>>();
assert_eq!(
paths_with_repos,
&[
(Path::new("c.txt"), None),
(
Path::new("dir1/deps/dep1/src/a.txt"),
Some(WorkDirectory::in_project("dir1/deps/dep1"))
),
(
Path::new("dir1/src/b.txt"),
Some(WorkDirectory::in_project("dir1"))
),
]
);
});
let repo_update_events = Arc::new(Mutex::new(vec![]));
tree.update(cx, |_, cx| {
let repo_update_events = repo_update_events.clone();
cx.subscribe(&tree, move |_, _, event, _| {
if let Event::UpdatedGitRepositories(update) = event {
repo_update_events.lock().push(update.clone());
}
})
.detach();
});
std::fs::write(root.path().join("dir1/.git/random_new_file"), "hello").unwrap();
tree.flush_fs_events(cx).await;
assert_eq!(
repo_update_events.lock()[0]
.iter()
.map(|(entry, _)| entry.path.clone())
.collect::<Vec<Arc<Path>>>(),
vec![Path::new("dir1").into()]
);
std::fs::remove_dir_all(root.path().join("dir1/.git")).unwrap();
tree.flush_fs_events(cx).await;
tree.read_with(cx, |tree, _cx| {
let tree = tree.as_local().unwrap();
assert!(tree
.repository_for_path("dir1/src/b.txt".as_ref())
.is_none());
});
}
// NOTE: This test always fails on Windows, because on Windows, unlike on Unix,
// you can't rename a directory which some program has already open. This is a
// limitation of Windows. See:
@@ -2411,7 +2220,6 @@ async fn test_file_status(cx: &mut TestAppContext) {
const F_TXT: &str = "f.txt";
const DOTGITIGNORE: &str = ".gitignore";
const BUILD_FILE: &str = "target/build_file";
let project_path = Path::new("project");
// Set up git repository before creating the worktree.
let work_dir = root.path().join("project");
@@ -2425,12 +2233,13 @@ async fn test_file_status(cx: &mut TestAppContext) {
let tree = Worktree::local(
root.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
.await
.unwrap();
let root_path = root.path();
tree.flush_fs_events(cx).await;
cx.read(|cx| tree.read(cx).as_local().unwrap().scan_complete())
@@ -2443,17 +2252,17 @@ async fn test_file_status(cx: &mut TestAppContext) {
assert_eq!(snapshot.repositories.iter().count(), 1);
let repo_entry = snapshot.repositories.iter().next().unwrap();
assert_eq!(
repo_entry.work_directory,
WorkDirectory::in_project("project")
repo_entry.work_directory_abs_path,
root_path.join("project")
);
assert_eq!(
snapshot.status_for_file(project_path.join(B_TXT)),
Some(FileStatus::Untracked),
repo_entry.status_for_path(&B_TXT.into()).unwrap().status,
FileStatus::Untracked,
);
assert_eq!(
snapshot.status_for_file(project_path.join(F_TXT)),
Some(FileStatus::Untracked),
repo_entry.status_for_path(&F_TXT.into()).unwrap().status,
FileStatus::Untracked,
);
});
@@ -2465,9 +2274,11 @@ async fn test_file_status(cx: &mut TestAppContext) {
// The worktree detects that the file's git status has changed.
tree.read_with(cx, |tree, _cx| {
let snapshot = tree.snapshot();
assert_eq!(snapshot.repositories.iter().count(), 1);
let repo_entry = snapshot.repositories.iter().next().unwrap();
assert_eq!(
snapshot.status_for_file(project_path.join(A_TXT)),
Some(StatusCode::Modified.worktree()),
repo_entry.status_for_path(&A_TXT.into()).unwrap().status,
StatusCode::Modified.worktree(),
);
});
@@ -2481,12 +2292,14 @@ async fn test_file_status(cx: &mut TestAppContext) {
// The worktree detects that the files' git status have changed.
tree.read_with(cx, |tree, _cx| {
let snapshot = tree.snapshot();
assert_eq!(snapshot.repositories.iter().count(), 1);
let repo_entry = snapshot.repositories.iter().next().unwrap();
assert_eq!(
snapshot.status_for_file(project_path.join(F_TXT)),
Some(FileStatus::Untracked),
repo_entry.status_for_path(&F_TXT.into()).unwrap().status,
FileStatus::Untracked,
);
assert_eq!(snapshot.status_for_file(project_path.join(B_TXT)), None);
assert_eq!(snapshot.status_for_file(project_path.join(A_TXT)), None);
assert_eq!(repo_entry.status_for_path(&B_TXT.into()), None);
assert_eq!(repo_entry.status_for_path(&A_TXT.into()), None);
});
// Modify files in the working copy and perform git operations on other files.
@@ -2501,15 +2314,17 @@ async fn test_file_status(cx: &mut TestAppContext) {
// Check that more complex repo changes are tracked
tree.read_with(cx, |tree, _cx| {
let snapshot = tree.snapshot();
assert_eq!(snapshot.repositories.iter().count(), 1);
let repo_entry = snapshot.repositories.iter().next().unwrap();
assert_eq!(snapshot.status_for_file(project_path.join(A_TXT)), None);
assert_eq!(repo_entry.status_for_path(&A_TXT.into()), None);
assert_eq!(
snapshot.status_for_file(project_path.join(B_TXT)),
Some(FileStatus::Untracked),
repo_entry.status_for_path(&B_TXT.into()).unwrap().status,
FileStatus::Untracked,
);
assert_eq!(
snapshot.status_for_file(project_path.join(E_TXT)),
Some(StatusCode::Modified.worktree()),
repo_entry.status_for_path(&E_TXT.into()).unwrap().status,
StatusCode::Modified.worktree(),
);
});
@@ -2542,9 +2357,14 @@ async fn test_file_status(cx: &mut TestAppContext) {
tree.read_with(cx, |tree, _cx| {
let snapshot = tree.snapshot();
assert_eq!(snapshot.repositories.iter().count(), 1);
let repo_entry = snapshot.repositories.iter().next().unwrap();
assert_eq!(
snapshot.status_for_file(project_path.join(renamed_dir_name).join(RENAMED_FILE)),
Some(FileStatus::Untracked),
repo_entry
.status_for_path(&Path::new(renamed_dir_name).join(RENAMED_FILE).into())
.unwrap()
.status,
FileStatus::Untracked,
);
});
@@ -2561,14 +2381,15 @@ async fn test_file_status(cx: &mut TestAppContext) {
tree.read_with(cx, |tree, _cx| {
let snapshot = tree.snapshot();
assert_eq!(snapshot.repositories.iter().count(), 1);
let repo_entry = snapshot.repositories.iter().next().unwrap();
assert_eq!(
snapshot.status_for_file(
project_path
.join(Path::new(renamed_dir_name))
.join(RENAMED_FILE)
),
Some(FileStatus::Untracked),
repo_entry
.status_for_path(&Path::new(renamed_dir_name).join(RENAMED_FILE).into())
.unwrap()
.status,
FileStatus::Untracked,
);
});
}
@@ -2601,7 +2422,7 @@ async fn test_git_repository_status(cx: &mut TestAppContext) {
let tree = Worktree::local(
root.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -2619,17 +2440,26 @@ async fn test_git_repository_status(cx: &mut TestAppContext) {
let repo = snapshot.repositories.iter().next().unwrap();
let entries = repo.status().collect::<Vec<_>>();
assert_eq!(entries.len(), 3);
assert_eq!(entries[0].repo_path.as_ref(), Path::new("a.txt"));
assert_eq!(entries[0].status, StatusCode::Modified.worktree());
assert_eq!(entries[1].repo_path.as_ref(), Path::new("b.txt"));
assert_eq!(entries[1].status, FileStatus::Untracked);
assert_eq!(entries[2].repo_path.as_ref(), Path::new("d.txt"));
assert_eq!(entries[2].status, StatusCode::Deleted.worktree());
assert_eq!(
entries,
[
StatusEntry {
repo_path: "a.txt".into(),
status: StatusCode::Modified.worktree(),
},
StatusEntry {
repo_path: "b.txt".into(),
status: FileStatus::Untracked,
},
StatusEntry {
repo_path: "d.txt".into(),
status: StatusCode::Deleted.worktree(),
},
]
);
});
std::fs::write(work_dir.join("c.txt"), "some changes").unwrap();
eprintln!("File c.txt has been modified");
tree.flush_fs_events(cx).await;
cx.read(|cx| tree.read(cx).as_local().unwrap().scan_complete())
@@ -2641,16 +2471,27 @@ async fn test_git_repository_status(cx: &mut TestAppContext) {
let repository = snapshot.repositories.iter().next().unwrap();
let entries = repository.status().collect::<Vec<_>>();
std::assert_eq!(entries.len(), 4, "entries: {entries:?}");
assert_eq!(entries[0].repo_path.as_ref(), Path::new("a.txt"));
assert_eq!(entries[0].status, StatusCode::Modified.worktree());
assert_eq!(entries[1].repo_path.as_ref(), Path::new("b.txt"));
assert_eq!(entries[1].status, FileStatus::Untracked);
// Status updated
assert_eq!(entries[2].repo_path.as_ref(), Path::new("c.txt"));
assert_eq!(entries[2].status, StatusCode::Modified.worktree());
assert_eq!(entries[3].repo_path.as_ref(), Path::new("d.txt"));
assert_eq!(entries[3].status, StatusCode::Deleted.worktree());
assert_eq!(
entries,
[
StatusEntry {
repo_path: "a.txt".into(),
status: StatusCode::Modified.worktree(),
},
StatusEntry {
repo_path: "b.txt".into(),
status: FileStatus::Untracked,
},
StatusEntry {
repo_path: "c.txt".into(),
status: StatusCode::Modified.worktree(),
},
StatusEntry {
repo_path: "d.txt".into(),
status: StatusCode::Deleted.worktree(),
},
]
);
});
git_add("a.txt", &repo);
@@ -2677,13 +2518,12 @@ async fn test_git_repository_status(cx: &mut TestAppContext) {
// Deleting an untracked entry, b.txt, should leave no status
// a.txt was tracked, and so should have a status
assert_eq!(
entries.len(),
1,
"Entries length was incorrect\n{:#?}",
&entries
entries,
[StatusEntry {
repo_path: "a.txt".into(),
status: StatusCode::Deleted.worktree(),
}]
);
assert_eq!(entries[0].repo_path.as_ref(), Path::new("a.txt"));
assert_eq!(entries[0].status, StatusCode::Deleted.worktree());
});
}
@@ -2711,7 +2551,7 @@ async fn test_git_status_postprocessing(cx: &mut TestAppContext) {
let tree = Worktree::local(
root.path(),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -2729,17 +2569,18 @@ async fn test_git_status_postprocessing(cx: &mut TestAppContext) {
let entries = repo.status().collect::<Vec<_>>();
// `sub` doesn't appear in our computed statuses.
assert_eq!(entries.len(), 1);
assert_eq!(entries[0].repo_path.as_ref(), Path::new("a.txt"));
// a.txt appears with a combined `DA` status.
assert_eq!(
entries[0].status,
TrackedStatus {
index_status: StatusCode::Deleted,
worktree_status: StatusCode::Added
}
.into()
);
entries,
[StatusEntry {
repo_path: "a.txt".into(),
status: TrackedStatus {
index_status: StatusCode::Deleted,
worktree_status: StatusCode::Added
}
.into(),
}]
)
});
}
@@ -2778,7 +2619,7 @@ async fn test_repository_subfolder_git_status(cx: &mut TestAppContext) {
let tree = Worktree::local(
root.path().join(project_root),
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -2797,19 +2638,14 @@ async fn test_repository_subfolder_git_status(cx: &mut TestAppContext) {
assert_eq!(snapshot.repositories.iter().count(), 1);
let repo = snapshot.repositories.iter().next().unwrap();
assert_eq!(
repo.work_directory.canonicalize(),
WorkDirectory::AboveProject {
absolute_path: Arc::from(root.path().join("my-repo").canonicalize().unwrap()),
location_in_repo: Arc::from(Path::new(util::separator!(
"sub-folder-1/sub-folder-2"
)))
}
repo.work_directory_abs_path.canonicalize().unwrap(),
root.path().join("my-repo").canonicalize().unwrap()
);
assert_eq!(snapshot.status_for_file("c.txt"), None);
assert_eq!(repo.status_for_path(&C_TXT.into()), None);
assert_eq!(
snapshot.status_for_file("d/e.txt"),
Some(FileStatus::Untracked)
repo.status_for_path(&E_TXT.into()).unwrap().status,
FileStatus::Untracked
);
});
@@ -2823,11 +2659,14 @@ async fn test_repository_subfolder_git_status(cx: &mut TestAppContext) {
tree.read_with(cx, |tree, _cx| {
let snapshot = tree.snapshot();
let repos = snapshot.repositories().iter().cloned().collect::<Vec<_>>();
assert_eq!(repos.len(), 1);
let repo_entry = repos.into_iter().next().unwrap();
assert!(snapshot.repositories.iter().next().is_some());
assert_eq!(snapshot.status_for_file("c.txt"), None);
assert_eq!(snapshot.status_for_file("d/e.txt"), None);
assert_eq!(repo_entry.status_for_path(&C_TXT.into()), None);
assert_eq!(repo_entry.status_for_path(&E_TXT.into()), None);
});
}
@@ -2846,7 +2685,7 @@ async fn test_conflicted_cherry_pick(cx: &mut TestAppContext) {
let tree = Worktree::local(
root_path,
true,
Arc::new(RealFs::default()),
Arc::new(RealFs::new(None, cx.executor())),
Default::default(),
&mut cx.to_async(),
)
@@ -3140,7 +2979,12 @@ fn assert_entry_git_state(
is_ignored: bool,
) {
let entry = tree.entry_for_path(path).expect("entry {path} not found");
let status = tree.status_for_file(Path::new(path));
let repos = tree.repositories().iter().cloned().collect::<Vec<_>>();
assert_eq!(repos.len(), 1);
let repo_entry = repos.into_iter().next().unwrap();
let status = repo_entry
.status_for_path(&path.into())
.map(|entry| entry.status);
let expected = index_status.map(|index_status| {
TrackedStatus {
index_status,


@@ -264,7 +264,7 @@ fn main() {
};
log::info!("Using git binary path: {:?}", git_binary_path);
let fs = Arc::new(RealFs::new(git_binary_path));
let fs = Arc::new(RealFs::new(git_binary_path, app.background_executor()));
let user_settings_file_rx = watch_config_file(
&app.background_executor(),
fs.clone(),


@@ -17,7 +17,7 @@
"devDependencies": {
"@octokit/types": "^13.8.0",
"@slack/types": "^2.14.0",
"@tsconfig/node20": "20.1.4",
"@tsconfig/node20": "20.1.5",
"@tsconfig/strictest": "2.0.5",
"typescript": "5.7.3"
}


@@ -13,7 +13,7 @@ importers:
version: 21.1.1
'@slack/webhook':
specifier: ^7.0.4
version: 7.0.4
version: 7.0.5
date-fns:
specifier: ^4.1.0
version: 4.1.0
@@ -28,8 +28,8 @@ importers:
specifier: ^2.14.0
version: 2.14.0
'@tsconfig/node20':
specifier: 20.1.4
version: 20.1.4
specifier: 20.1.5
version: 20.1.5
'@tsconfig/strictest':
specifier: 2.0.5
version: 2.0.5
@@ -160,12 +160,12 @@ packages:
resolution: {integrity: sha512-n0EGm7ENQRxlXbgKSrQZL69grzg1gHLAVd+GlRVQJ1NSORo0FrApR7wql/gaKdu2n4TO83Sq/AmeUOqD60aXUA==}
engines: {node: '>= 12.13.0', npm: '>= 6.12.0'}
'@slack/webhook@7.0.4':
resolution: {integrity: sha512-JDJte2dbJCcq1/GCMBYJH6fj+YS4n5GuPjT4tF3O1NPN6pFPCR9yA/apRh9sdfhdFG7hadiRgmiQqC4GLgNkZg==}
'@slack/webhook@7.0.5':
resolution: {integrity: sha512-PmbZx89+SmH4zt78FUwe4If8hWX2MAIRmGXjmlF0A8PwyJb/H7CWaQYV6DDlZn1+7Zs6CEytKH0ejEE/idVSDw==}
engines: {node: '>= 18', npm: '>= 8.6.0'}
'@tsconfig/node20@20.1.4':
resolution: {integrity: sha512-sqgsT69YFeLWf5NtJ4Xq/xAF8p4ZQHlmGW74Nu2tD4+g5fAsposc4ZfaaPixVu4y01BEiDCWLRDCvDM5JOsRxg==}
'@tsconfig/node20@20.1.5':
resolution: {integrity: sha512-Vm8e3WxDTqMGPU4GATF9keQAIy1Drd7bPwlgzKJnZtoOsTm1tduUTbDjg0W5qERvGuxPI2h9RbMufH0YdfBylA==}
'@tsconfig/strictest@2.0.5':
resolution: {integrity: sha512-ec4tjL2Rr0pkZ5hww65c+EEPYwxOi4Ryv+0MtjeaSQRJyq322Q27eOQiFbuNgw2hpL4hB1/W/HBGk3VKS43osg==}
@@ -173,14 +173,14 @@ packages:
'@types/aws-lambda@8.10.147':
resolution: {integrity: sha512-nD0Z9fNIZcxYX5Mai2CTmFD7wX7UldCkW2ezCF8D1T5hdiLsnTWDGRpfRYntU6VjTdLQjOvyszru7I1c1oCQew==}
'@types/node@22.13.4':
resolution: {integrity: sha512-ywP2X0DYtX3y08eFVx5fNIw7/uIv8hYUKgXoK8oayJlLnKcRfEYCxWMVE1XagUdVtCJlZT1AU4LXEABW+L1Peg==}
'@types/node@22.13.13':
resolution: {integrity: sha512-ClsL5nMwKaBRwPcCvH8E7+nU4GxHVx1axNvMZTFHMEfNI7oahimt26P5zjVCRrjiIWj6YFXfE1v3dEp94wLcGQ==}
asynckit@0.4.0:
resolution: {integrity: sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==}
axios@1.7.9:
resolution: {integrity: sha512-LhLcE7Hbiryz8oMDdDptSrWowmB4Bl6RCt6sIJKpRB4XtVf0iEgewX3au/pJqm+Py1kCASkb/FFKjxQaLtxJvw==}
axios@1.8.4:
resolution: {integrity: sha512-eBSYY4Y68NNlHbHBMdeDmKNtDgXWhQsJcGqzO3iLUM0GraQFSS9cVgPX5I9b3lbdFKyYoAEGAZF1DwhTaljNAw==}
before-after-hook@3.0.2:
resolution: {integrity: sha512-Nik3Sc0ncrMK4UUdXQmAnRtzmNQTAAXmXIopizwZ1W1t8QmfJj+zL4OA2I7XPTPW5z5TDqv4hRo/JzouDJnX3A==}
@@ -188,6 +188,10 @@ packages:
bottleneck@2.19.5:
resolution: {integrity: sha512-VHiNCbI1lKdl44tGrhNfU3lup0Tj/ZBMJB5/2ZbNXRCPuRCO7ed2mgcK4r17y+KB2EfuYuRaVlwNbAeaWGSpbw==}
call-bind-apply-helpers@1.0.2:
resolution: {integrity: sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==}
engines: {node: '>= 0.4'}
combined-stream@1.0.8:
resolution: {integrity: sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==}
engines: {node: '>= 0.8'}
@@ -199,6 +203,26 @@ packages:
resolution: {integrity: sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==}
engines: {node: '>=0.4.0'}
dunder-proto@1.0.1:
resolution: {integrity: sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==}
engines: {node: '>= 0.4'}
es-define-property@1.0.1:
resolution: {integrity: sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==}
engines: {node: '>= 0.4'}
es-errors@1.3.0:
resolution: {integrity: sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==}
engines: {node: '>= 0.4'}
es-object-atoms@1.1.1:
resolution: {integrity: sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==}
engines: {node: '>= 0.4'}
es-set-tostringtag@2.1.0:
resolution: {integrity: sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==}
engines: {node: '>= 0.4'}
fast-content-type-parse@2.0.1:
resolution: {integrity: sha512-nGqtvLrj5w0naR6tDPfB4cUmYCqouzyQiz6C5y/LtcDllJdrcc6WaWW6iXyIIOErTa/XRybj28aasdn4LkVk6Q==}
@@ -211,10 +235,41 @@ packages:
debug:
optional: true
form-data@4.0.1:
resolution: {integrity: sha512-tzN8e4TX8+kkxGPK8D5u0FNmjPUjw3lwC9lSLxxoB/+GtsJG91CO8bSWy73APlgAZzZbXEYZJuxjkHH2w+Ezhw==}
form-data@4.0.2:
resolution: {integrity: sha512-hGfm/slu0ZabnNt4oaRZ6uREyfCj6P4fT/n6A1rGV+Z0VdGXjfOhVUpkn6qVQONHGIFwmveGXyDs75+nr6FM8w==}
engines: {node: '>= 6'}
function-bind@1.1.2:
resolution: {integrity: sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==}
get-intrinsic@1.3.0:
resolution: {integrity: sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==}
engines: {node: '>= 0.4'}
get-proto@1.0.1:
resolution: {integrity: sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==}
engines: {node: '>= 0.4'}
gopd@1.2.0:
resolution: {integrity: sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==}
engines: {node: '>= 0.4'}
has-symbols@1.1.0:
resolution: {integrity: sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==}
engines: {node: '>= 0.4'}
has-tostringtag@1.0.2:
resolution: {integrity: sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==}
engines: {node: '>= 0.4'}
hasown@2.0.2:
resolution: {integrity: sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==}
engines: {node: '>= 0.4'}
math-intrinsics@1.1.0:
resolution: {integrity: sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==}
engines: {node: '>= 0.4'}
mime-db@1.52.0:
resolution: {integrity: sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==}
engines: {node: '>= 0.6'}
@@ -410,30 +465,30 @@ snapshots:
   '@slack/types@2.14.0': {}
 
-  '@slack/webhook@7.0.4':
+  '@slack/webhook@7.0.5':
     dependencies:
       '@slack/types': 2.14.0
-      '@types/node': 22.13.4
-      axios: 1.7.9
+      '@types/node': 22.13.13
+      axios: 1.8.4
     transitivePeerDependencies:
       - debug
 
-  '@tsconfig/node20@20.1.4': {}
+  '@tsconfig/node20@20.1.5': {}
 
   '@tsconfig/strictest@2.0.5': {}
 
   '@types/aws-lambda@8.10.147': {}
 
-  '@types/node@22.13.4':
+  '@types/node@22.13.13':
     dependencies:
       undici-types: 6.20.0
 
   asynckit@0.4.0: {}
 
-  axios@1.7.9:
+  axios@1.8.4:
     dependencies:
       follow-redirects: 1.15.9
-      form-data: 4.0.1
+      form-data: 4.0.2
       proxy-from-env: 1.1.0
     transitivePeerDependencies:
       - debug
@@ -442,6 +497,11 @@ snapshots:
   bottleneck@2.19.5: {}
 
+  call-bind-apply-helpers@1.0.2:
+    dependencies:
+      es-errors: 1.3.0
+      function-bind: 1.1.2
+
   combined-stream@1.0.8:
     dependencies:
       delayed-stream: 1.0.0
@@ -450,16 +510,72 @@ snapshots:
   delayed-stream@1.0.0: {}
 
+  dunder-proto@1.0.1:
+    dependencies:
+      call-bind-apply-helpers: 1.0.2
+      es-errors: 1.3.0
+      gopd: 1.2.0
+
+  es-define-property@1.0.1: {}
+
+  es-errors@1.3.0: {}
+
+  es-object-atoms@1.1.1:
+    dependencies:
+      es-errors: 1.3.0
+
+  es-set-tostringtag@2.1.0:
+    dependencies:
+      es-errors: 1.3.0
+      get-intrinsic: 1.3.0
+      has-tostringtag: 1.0.2
+      hasown: 2.0.2
+
   fast-content-type-parse@2.0.1: {}
 
   follow-redirects@1.15.9: {}
 
-  form-data@4.0.1:
+  form-data@4.0.2:
     dependencies:
       asynckit: 0.4.0
       combined-stream: 1.0.8
+      es-set-tostringtag: 2.1.0
       mime-types: 2.1.35
 
+  function-bind@1.1.2: {}
+
+  get-intrinsic@1.3.0:
+    dependencies:
+      call-bind-apply-helpers: 1.0.2
+      es-define-property: 1.0.1
+      es-errors: 1.3.0
+      es-object-atoms: 1.1.1
+      function-bind: 1.1.2
+      get-proto: 1.0.1
+      gopd: 1.2.0
+      has-symbols: 1.1.0
+      hasown: 2.0.2
+      math-intrinsics: 1.1.0
+
+  get-proto@1.0.1:
+    dependencies:
+      dunder-proto: 1.0.1
+      es-object-atoms: 1.1.1
+
+  gopd@1.2.0: {}
+
+  has-symbols@1.1.0: {}
+
+  has-tostringtag@1.0.2:
+    dependencies:
+      has-symbols: 1.1.0
+
+  hasown@2.0.2:
+    dependencies:
+      function-bind: 1.1.2
+
+  math-intrinsics@1.1.0: {}
+
   mime-db@1.52.0: {}
 
   mime-types@2.1.35:


@@ -5,15 +5,15 @@ index 9ba10e56ba..bb69440691 100644
@@ -41,10 +41,10 @@ serde_derive.workspace = true
telemetry.workspace = true
util.workspace = true
-[target.'cfg(target_os = "macos")'.dependencies]
+[target.'cfg(any())'.dependencies]
livekit_client_macos = { workspace = true }
livekit_client_macos.workspace = true
-[target.'cfg(not(target_os = "macos"))'.dependencies]
+[target.'cfg(all())'.dependencies]
livekit_client = { workspace = true }
livekit_client.workspace = true
[dev-dependencies]
diff --git a/crates/call/src/call.rs b/crates/call/src/call.rs
index 5e212d35b7..a8f9e8f43e 100644
@@ -21,19 +21,19 @@ index 5e212d35b7..a8f9e8f43e 100644
+++ b/crates/call/src/call.rs
@@ -1,13 +1,13 @@
pub mod call_settings;
-#[cfg(target_os = "macos")]
+#[cfg(any())]
mod macos;
-#[cfg(target_os = "macos")]
+#[cfg(any())]
pub use macos::*;
-#[cfg(not(target_os = "macos"))]
+#[cfg(all())]
mod cross_platform;
-#[cfg(not(target_os = "macos"))]
+#[cfg(all())]
pub use cross_platform::*;
@@ -45,15 +45,15 @@ index 1d17cfa145..f845234987 100644
-#[cfg(target_os = "macos")]
+#[cfg(any())]
mod macos;
-#[cfg(target_os = "macos")]
+#[cfg(any())]
pub use macos::*;
-#[cfg(not(target_os = "macos"))]
+#[cfg(all())]
mod cross_platform;
-#[cfg(not(target_os = "macos"))]
+#[cfg(all())]
pub use cross_platform::*;