Compare commits

..

86 Commits

Author SHA1 Message Date
Conrad Irwin
41406b3043 Hush little feedback, don't say a word 2025-03-21 20:39:01 -06:00
Conrad Irwin
4186fecdbc Fewer WebRTC related build hacks? 2025-03-21 11:07:41 -06:00
Conrad Irwin
e8aede9aae Remove WebRTC from build 2025-03-21 10:55:05 -06:00
Conrad Irwin
0900d9c6c1 Try removing windows config directives (#27261)
Closes #ISSUE

Release Notes:

- N/A
2025-03-21 10:39:44 -06:00
Conrad Irwin
5655bde5d5 Fix tests 2025-03-21 09:55:00 -06:00
Conrad Irwin
9db1b33344 Uncomment out some tests 2025-03-20 22:44:24 -06:00
Conrad Irwin
9d44e38d7a Fix linux tests 2025-03-20 22:37:23 -06:00
Conrad Irwin
d94fd6a433 own goaaaal 2025-03-20 22:35:42 -06:00
Conrad Irwin
1914d872e4 Merge branch 'main' into nv12-buffers 2025-03-20 22:22:52 -06:00
Conrad Irwin
3b4d3a2206 clippity clop 2025-03-20 22:16:05 -06:00
Cole Miller
cf7d639fbc Migrate most callers of git-related worktree APIs to use the GitStore (#27225)
This is a pure refactoring PR that goes through all the git-related APIs
exposed by the worktree crate and minimizes their use outside that
crate, migrating callers of those APIs to read from the GitStore
instead. This is to prepare for evacuating git repository state from
worktrees and making the GitStore the new source of truth.

Other drive-by changes:

- `project::git` is now `project::git_store`, for consistency with the
other project stores
- the project panel's test module has been split into its own file

Release Notes:

- N/A

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-03-21 00:10:17 -04:00
Conrad Irwin
96220a5656 Fix linux build 2025-03-20 21:58:39 -06:00
Conrad Irwin
8694e46f24 More 4s 2025-03-20 21:58:05 -06:00
Conrad Irwin
414a3ae32c Fix one test 2025-03-20 21:12:34 -06:00
Conrad Irwin
926ca4fead Fewer remote impls 2025-03-20 20:49:24 -06:00
Conrad Irwin
7ba41e6536 Merge branch 'main' into nv12-buffers 2025-03-20 20:38:23 -06:00
Smit Barmase
9134630841 extensions: Add copy author info button in context menu (#27221)
Closes #26108

Add "Copy Author Info" button to extension context menu.

Release Notes:

- Added option to copy an extension author's name and email from the
extension context menu.
2025-03-21 03:45:06 +05:30
Cole Miller
bc1c0a2297 Separate repository state synchronization from worktree synchronization (#27140)
This PR updates our DB schemas and wire protocol to separate the
synchronization of git statuses and other repository state from the
synchronization of worktrees. This paves the way for moving the code
that executes git status updates out of the `worktree` crate and onto
the new `GitStore`. That end goal is motivated by two (related) points:

- Disentangling git status updates from the worktree's
`BackgroundScanner` will allow us to implement a simpler concurrency
story for those updates, hopefully fixing some known but elusive bugs
(upstream state not updating after push; statuses getting out of sync in
remote projects).
- By moving git repository state to the project-scoped `GitStore`, we
can get rid of the duplication that currently happens when two worktrees
are associated with the same git repository.

Co-authored-by: Max <max@zed.dev>

Release Notes:

- N/A

---------

Co-authored-by: Max <max@zed.dev>
Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-03-20 18:07:03 -04:00
Marshall Bowers
700af63c45 assistant2: Watch settings for changes to profiles (#27219)
This PR makes it so we watch the settings and update when the profiles
change.

Release Notes:

- N/A
2025-03-20 21:12:58 +00:00
Conrad Irwin
e5b8d86a09 do the soft shoe shuffle 2025-03-20 15:09:18 -06:00
Conrad Irwin
679ecdb7d3 re-add build flags 2025-03-20 14:55:38 -06:00
Conrad Irwin
3578d01c4a Make app run with no livekit build 2025-03-20 14:50:41 -06:00
Marshall Bowers
4b5df2189b assistant2: Allow creating agent profiles via settings (#27216)
This PR adds support for creating new agent profiles via the settings:

```json
{
  "assistant": {
    "profiles": {
      "lua": {
        "name": "Lua",
        "tools": {
          "lua-interpreter": true
        }
      },
      "lua-thinking": {
        "name": "Lua + Thinking",
        "tools": {
          "lua-interpreter": true,
          "thinking": true
        }
      }
    }
  }
}
```

Release Notes:

- N/A
2025-03-20 20:30:07 +00:00
Finn Evers
48b1a43f5e docs: Fix rendering of keybind in languages.md (#27217)
This fixes a broken keybind in the language extension docs: [Language
metadata](https://zed.dev/docs/extensions/languages#language-metadata) >
`line_comments`.

Release Notes:

- N/A
2025-03-20 20:25:54 +00:00
Kirill Bulatov
9609e04bb2 Add a way to copy with the selections trimmed (#27206)
No default binding currently; `cmd/ctrl-shift-c` seems somewhat natural,
but it's occupied by the collab panel.


https://github.com/user-attachments/assets/702cc52a-a4b7-4f2c-bb7f-12ca0c66faeb


Release Notes:

- Added a way to copy with the selections trimmed

---------

Co-authored-by: Cole Miller <m@cole-miller.net>
2025-03-20 19:58:51 +00:00
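A minimal sketch of what "copy with the selections trimmed" could mean (illustrative only, with an invented `copy_trimmed` helper, not Zed's actual implementation): each selected span is trimmed of surrounding whitespace before the pieces are joined for the clipboard.

```rust
// Hypothetical sketch: trim leading/trailing whitespace from each
// selection before joining them for the clipboard.
fn copy_trimmed(selections: &[&str]) -> String {
    selections
        .iter()
        .map(|s| s.trim())
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let copied = copy_trimmed(&["  let x = 1;  ", "\tlet y = 2;"]);
    assert_eq!(copied, "let x = 1;\nlet y = 2;");
}
```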
Conrad Irwin
a7dd93fefe Build a livekit facade.. 2025-03-20 13:26:19 -06:00
Anthony Eid
a74f2bb18b Reuse values from last debug panel inert state if they exist (#27211)
This should allow the team to iterate faster when using the debug panel
to set up a session

Release Notes:

- N/A
2025-03-20 18:53:11 +00:00
Conrad Irwin
dce8d8a5a5 TEMP 2025-03-20 12:36:22 -06:00
Remco Smits
ac452799b0 debugger: Fix shutdown issues (#27071)
This PR fixes a few issues around shutting down a debug adapter.

The first issue I discovered was when I shut down all sessions via the
`shutdown all adapters` command. We would still send the threads
request again, because we receive a thread event indicating that a
thread exited. That request always times out because the debug adapter
has already shut down at this point, so updating the check so that we
don't send requests once the session is terminated fixes the issue.

The second issue was a bug where we would always shut down the parent
session when a child session was terminated. This was reintroduced by
the big refactor, and it's not something we want: a session can receive
multiple StartDebugging reverse requests, so one child shutting down
does not mean the other children should shut down as well.
The issue was originally fixed in
https://github.com/RemcoSmitsDev/zed/pull/80#issuecomment-2573943661.


## TODO:
- [x] Add tests

Release Notes:

- N/A
2025-03-20 18:32:37 +00:00
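The first fix described above amounts to a guard on outgoing requests. A rough sketch, with invented names (`Session`, `fetch_threads` — not the debugger crate's real API):

```rust
// Hypothetical sketch: skip outgoing requests once a session is
// terminated, so we never wait on a dead adapter and time out.
#[derive(PartialEq)]
enum SessionState {
    Running,
    Terminated,
}

struct Session {
    state: SessionState,
}

impl Session {
    fn fetch_threads(&self) -> Result<Vec<u64>, &'static str> {
        if self.state == SessionState::Terminated {
            // Previously this request was sent anyway and always timed out.
            return Err("session terminated; not sending request");
        }
        Ok(vec![1]) // placeholder thread list
    }
}

fn main() {
    let dead = Session { state: SessionState::Terminated };
    assert!(dead.fetch_threads().is_err());
    let live = Session { state: SessionState::Running };
    assert_eq!(live.fetch_threads().unwrap(), vec![1]);
}
```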
Smit Barmase
7b80cd865d Show more possible matches in code context completion (#27199)
Closes #24794

We no longer filter out matches provided by the fuzzy matcher, as it
already performs most of the filtering for us. Instead, the custom
logic we previously used for filtering now partitions the list, so
matches that would previously have been discarded are appended at the
end.

Before - Filtering out matches with higher fuzzy score
<img width="400" alt="image"
src="https://github.com/user-attachments/assets/7f9d66a2-0921-499c-af8a-f1e530da50b1"
/>

After - Changing filter to partition instead, and appending remaining
items at the end
<img width="400" alt="image"
src="https://github.com/user-attachments/assets/45848f70-ed51-4935-976c-6c16c5b5777b"
/>


Release Notes:

- Improved LSP autocomplete to show more possible matches.

---------

Co-authored-by: Peter Tripp <petertripp@gmail.com>
2025-03-20 23:46:20 +05:30
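The filter-to-partition change can be sketched as follows (hypothetical `Match` type and filter, not Zed's actual completion code; Rust's `Iterator::partition` preserves the relative order within each group):

```rust
struct Match {
    text: &'static str,
    score: f64,
}

fn order_matches(matches: Vec<Match>, passes_filter: impl Fn(&Match) -> bool) -> Vec<Match> {
    // Iterator::partition keeps relative order within each group.
    let (kept, discarded): (Vec<Match>, Vec<Match>) =
        matches.into_iter().partition(|m| passes_filter(m));
    // Previously `discarded` was dropped; now it goes to the end.
    kept.into_iter().chain(discarded).collect()
}

fn main() {
    let matches = vec![
        Match { text: "one_document", score: 0.9 },
        Match { text: "d", score: 0.8 },
        Match { text: "one d", score: 0.7 },
    ];
    // Hypothetical custom filter: matches failing it used to be dropped,
    // now they are simply moved to the end of the list.
    let ordered = order_matches(matches, |m| m.text.len() > 1 && m.score > 0.5);
    assert_eq!(ordered.last().unwrap().text, "d");
}
```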
Joseph T. Lyons
7931b1d345 Pre-fill body of email with system specs (#27210)
I think we still want to be able to easily capture system spec info from
users. They can decide if they want to include it or not.

Release Notes:

- N/A
2025-03-20 18:09:10 +00:00
Marshall Bowers
27ebedf517 gpui: Make App::get_name return an Option (#27209)
This PR makes `App::get_name` return an `Option` instead of panicking if
the name is not set.

We'll let the caller be responsible for dealing with the absence of a
name.

Release Notes:

- N/A
2025-03-20 17:56:27 +00:00
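As a sketch of the API change (heavily simplified — the real `App` lives in gpui), the accessor now returns an `Option` and the caller picks a fallback instead of the getter panicking:

```rust
struct App {
    name: Option<String>,
}

impl App {
    // Before (hypothetically): panicked via `expect` if the name was unset.
    // After: the caller is responsible for handling the absence of a name.
    fn get_name(&self) -> Option<&str> {
        self.name.as_deref()
    }
}

fn main() {
    let unnamed = App { name: None };
    assert_eq!(unnamed.get_name(), None);

    let named = App { name: Some("Zed".to_string()) };
    // Callers can choose their own fallback.
    assert_eq!(named.get_name().unwrap_or("(unnamed)"), "Zed");
}
```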
Marshall Bowers
f9f5126d2c assistant2: Uniquely identify context server entries in configuration view (#27207)
This PR gives each context server entry in the configuration view a
unique element ID.

This fixes some issues where the disclosures and switches weren't
working properly due to element ID collisions.

Release Notes:

- N/A
2025-03-20 17:37:15 +00:00
Agus Zubiaga
6408ae81d1 assistant2: Return no-edits response to architect model (#27200)
Sometimes the editor model returns no search/replace blocks. This
usually happens when the architect model calls the edit tool before
reading any files. When this happens, we'll now return the raw response
from the editor model to the architect model so it can recover
accordingly.

Release Notes:

- N/A
2025-03-20 14:29:34 -03:00
Marshall Bowers
c60a7034c8 context_server: Interpret context server command paths relative to the extension's work dir (#27201)
This PR fixes an issue where the commands returned from context server
extensions were being used as-is instead of being interpreted relative
to the extension's work dir.

Release Notes:

- Fixed an issue with context server paths not being interpreted
relative to the extension's work dir.

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
Co-authored-by: Thomas Mickley-Doyle <tmickleydoyle@gmail.com>
2025-03-20 16:36:40 +00:00
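A sketch of the intended resolution, using a hypothetical `resolve_command` helper (not the extension crate's real API): relative command paths are joined onto the extension's work dir, while absolute paths pass through unchanged.

```rust
use std::path::{Path, PathBuf};

// Hypothetical sketch: resolve a context server command against the
// extension's work dir, leaving absolute paths untouched.
fn resolve_command(work_dir: &Path, command: &str) -> PathBuf {
    let command = Path::new(command);
    if command.is_absolute() {
        command.to_path_buf()
    } else {
        // Note: Path::join with an absolute right-hand side would replace
        // the base anyway; the branch above just makes the intent explicit.
        work_dir.join(command)
    }
}

fn main() {
    let work_dir = Path::new("/extensions/my-server");
    assert_eq!(
        resolve_command(work_dir, "bin/server"),
        PathBuf::from("/extensions/my-server/bin/server")
    );
    assert_eq!(
        resolve_command(work_dir, "/usr/bin/node"),
        PathBuf::from("/usr/bin/node")
    );
}
```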
Antonio Scandurra
7feb50fafe Add UI feedback for checkpoint restoration (#27203)
Release Notes:

- N/A

Co-authored-by: Agus Zubiaga <hi@aguz.me>
Co-authored-by: Bennet Bo Fenner <bennetbo@gmx.de>
2025-03-20 16:35:44 +00:00
Antonio Scandurra
f365b80814 Avoid polluting branch list and restore parent commit when using checkpoints (#27191)
Release Notes:

- N/A
2025-03-20 15:00:23 +00:00
Joseph T. Lyons
d0641a38a4 Rework feedback modal (#27186)
After our last community sync, we came to the conclusion that feedback
being sent outside of email is difficult to reply to. Our decision was
to use the old, tried and true email system, so that we can better
respond to people asking questions.

<img width="392" alt="SCR-20250320-igub"
src="https://github.com/user-attachments/assets/f1d01771-30eb-4b6f-b031-c68ddaac5700"
/>

Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-03-20 10:28:43 -04:00
Agus Zubiaga
2e8c0ff244 assistant edit tool: Report when file is empty or doesn't exist (#27190)
Instead of just reporting a search match failure, we'll now indicate
whether the file is empty or doesn't exist, to help the model recover
better from bad edits.


Release Notes:

- N/A
2025-03-20 10:42:10 -03:00
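The reporting logic might be sketched like this (a hypothetical function operating on already-read contents, not Zed's actual buffer or tool types):

```rust
// Hypothetical sketch: classify why a search/replace block failed to
// match, so the model gets a more actionable error message.
fn describe_match_failure(contents: Option<&str>) -> &'static str {
    match contents {
        None => "file does not exist",
        Some("") => "file is empty",
        Some(_) => "no match found in file",
    }
}

fn main() {
    assert_eq!(describe_match_failure(None), "file does not exist");
    assert_eq!(describe_match_failure(Some("")), "file is empty");
    assert_eq!(describe_match_failure(Some("fn main() {}")), "no match found in file");
}
```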
Danilo Leal
4421bdd12e assistant: Dismiss model picker upon selection (#27162)
This PR makes the model picker close when you click on a new item.

Release Notes:

- N/A
2025-03-20 10:22:49 -03:00
Peter Tripp
aa2fe9cce1 Add additional git-blame-ignore-revs (#27189)
Release Notes:

- N/A
2025-03-20 09:17:56 -04:00
Richard Feldman
e3578fc44a Display what the tool is doing (#27120)
<img width="639" alt="Screenshot 2025-03-19 at 4 56 47 PM"
src="https://github.com/user-attachments/assets/b997f04d-4aff-4070-87b1-ffdb61019bd1"
/>

Release Notes:

- N/A

---------

Co-authored-by: Agus Zubiaga <hi@aguz.me>
2025-03-20 09:16:39 -04:00
Kirill Bulatov
aae81fd54c Notify about broken task file contents (#27185)
Closes https://github.com/zed-industries/zed/issues/23783


https://github.com/user-attachments/assets/df019f68-a76b-4953-967a-a35ed21206ab

Release Notes:

- Added notifications when invalid tasks.json/debug.json is saved
2025-03-20 13:06:10 +00:00
Kirill Bulatov
de99febd9b debugger: Ensure both debug and regular global tasks are correctly merged (#27184)
Follow-up of https://github.com/zed-industries/zed/pull/13433
Closes https://github.com/zed-industries/zed/issues/27124
Closes https://github.com/zed-industries/zed/issues/27066

After that change, both the old global task source, `tasks.json`, and
the new one, `debug.json`, started calling the same task update method:


14920ab910/crates/project/src/task_inventory.rs (L414)

erasing each other's previous declarations.

This PR stores the data under different paths instead and adjusts the
code around it.

Release Notes:

- Fixed custom tasks not being shown
2025-03-20 12:51:26 +00:00
Kirill Bulatov
5bef32f3ed When determining Python task context, do not consider worktree-less files as an error (#27183)
Makes the Python plugin output


![image](https://github.com/user-attachments/assets/4960bc48-21b7-4392-82b9-18bfd1dd9cd0)

for standalone Python files, instead of nothing as before.

Before this change, no task context was created for a standalone file
because the `VariableName::RelativeFile` lookup was treated as an error.
Now Zed continues and constructs whatever context it can instead.

That `pytest` task seems odd, as the logic fixed here needs a relative
path (hence, a worktree) to consider unit tests.
We do not have the variables available at the moment the associated
tasks are queried:


14920ab910/crates/languages/src/python.rs (L359-L363)


14920ab910/crates/languages/src/python.rs (L417-L446)

so we cannot filter it there the same way this PR does.
Maybe we could use `VariableName::RelativeFile` instead of
`VariableName::File` there?

Release Notes:

- Show tasks from Python plugin for standalone files
2025-03-20 12:44:07 +00:00
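The change amounts to treating the worktree-relative path as optional rather than required. A toy sketch (invented variable names, not the real task-context API):

```rust
use std::collections::HashMap;

// Hypothetical sketch: a missing worktree-relative path becomes a
// skipped variable rather than an error aborting context construction.
fn build_context(relative_file: Option<&str>, absolute_file: &str) -> HashMap<&'static str, String> {
    let mut vars = HashMap::new();
    vars.insert("FILE", absolute_file.to_string());
    // Before: an `ok_or(...)?`-style lookup failed the whole context for
    // standalone files. Now we simply omit the variable.
    if let Some(rel) = relative_file {
        vars.insert("RELATIVE_FILE", rel.to_string());
    }
    vars
}

fn main() {
    // Standalone file: no worktree, so no relative path.
    let ctx = build_context(None, "/tmp/scratch.py");
    assert!(ctx.contains_key("FILE"));
    assert!(!ctx.contains_key("RELATIVE_FILE"));
}
```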
Smit Barmase
23e8519057 Add completion_query_characters in language (#27175)
Closes #18581

The characters used to build the completion query and the word
characters (which control selecting words by double-clicking or
navigating) are now distinct. This fixes a bunch of things:

For settings.json, completions now treat the whole string as the
completion query instead of just the last word: space is now a
completion query character without being a word character.

For keymap.json, this improves selecting part of an action, since the
":" character is now only a completion character and not a word
character. Completions still trigger on ":" and the query capture still
treats ":" as part of a word, but for selection and navigation, ":" is
treated as punctuation.

Before:
Unnecessary suggestions, since the query is only the last word, "d".
<img width="300" alt="image"
src="https://github.com/user-attachments/assets/8199a715-7521-49dd-948b-e6aaed04c488"
/>

Double clicking `ToggleFold` selects the whole action:
<img width="300" alt="image"
src="https://github.com/user-attachments/assets/c7f91a6b-06d5-45b6-9d59-61a1b2deda71"
/>

After:
Now the query is "one d" and only matching items are shown.
<img width="300" alt="image"
src="https://github.com/user-attachments/assets/1455dfbc-9906-42e8-b8aa-b3f551194ca2"
/>

Double clicking `ToggleFold` only selects part of the action, which is
more refined behavior.
<img width="300" alt="image"
src="https://github.com/user-attachments/assets/34b1c3c2-184f-402f-9dc8-73030a8c370f"
/>

Release Notes:

- Improved autocomplete suggestions in `settings.json`: the whole
string is now used as the query instead of just its last word, which
filters out a lot of false positives.
- Improved action selection in `keymap.json`: double-clicking now
selects part of an action rather than the whole action.

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-03-20 16:45:35 +05:30
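The distinction can be sketched with two character predicates, where completion-query characters are a superset of word characters (the specific extra characters here are illustrative, not the shipped language config):

```rust
// Hypothetical sketch: word characters drive double-click selection and
// navigation; completion-query characters additionally extend the query.
fn is_word_char(c: char) -> bool {
    c.is_alphanumeric() || c == '_'
}

fn is_completion_query_char(c: char) -> bool {
    // Extra characters (e.g. ':' for keymap.json actions, ' ' for
    // settings.json strings) extend the completion query only.
    is_word_char(c) || c == ':' || c == ' '
}

fn main() {
    // In "editor::ToggleFold", ':' extends the completion query...
    assert!(is_completion_query_char(':'));
    // ...but double-click word selection stops at it.
    assert!(!is_word_char(':'));
    // Ordinary identifier characters are both.
    assert!(is_word_char('a') && is_completion_query_char('a'));
}
```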
Michael Sloan
1180b6fbc7 Initial support for AI assistant rules files (#27168)
Release Notes:

- N/A

---------

Co-authored-by: Danilo <danilo@zed.dev>
Co-authored-by: Nathan <nathan@zed.dev>
Co-authored-by: Thomas <thomas@zed.dev>
2025-03-20 08:30:04 +00:00
renovate[bot]
14920ab910 Update swatinem/rust-cache digest to 9d47c6a (#27121)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [swatinem/rust-cache](https://redirect.github.com/swatinem/rust-cache)
| action | digest | `f0deed1` -> `9d47c6a` |

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-20 07:58:22 +02:00
renovate[bot]
000b981cb4 Update Rust crate rustls-platform-verifier to v0.5.1 (#27136)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
|
[rustls-platform-verifier](https://redirect.github.com/rustls/rustls-platform-verifier)
| workspace.dependencies | patch | `0.5.0` -> `0.5.1` |

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-20 07:52:49 +02:00
renovate[bot]
c9bff6e762 Update Rust crate sea-orm to v1.1.7 (#27137)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [sea-orm](https://www.sea-ql.org/SeaORM)
([source](https://redirect.github.com/SeaQL/sea-orm)) | dev-dependencies
| patch | `1.1.5` -> `1.1.7` |
| [sea-orm](https://www.sea-ql.org/SeaORM)
([source](https://redirect.github.com/SeaQL/sea-orm)) | dependencies |
patch | `1.1.5` -> `1.1.7` |

---

### Release Notes

<details>
<summary>SeaQL/sea-orm (sea-orm)</summary>

###
[`v1.1.7`](https://redirect.github.com/SeaQL/sea-orm/blob/HEAD/CHANGELOG.md#117---2025-03-02)

[Compare
Source](https://redirect.github.com/SeaQL/sea-orm/compare/1.1.6...1.1.7)

##### New Features

- Support nested entities in `FromQueryResult`
[https://github.com/SeaQL/sea-orm/pull/2508](https://redirect.github.com/SeaQL/sea-orm/pull/2508)

```rust

#[derive(FromQueryResult)]
struct Cake {
    id: i32,
    name: String,
    #[sea_orm(nested)]
    bakery: Option<CakeBakery>,
}

#[derive(FromQueryResult)]
struct CakeBakery {
    #[sea_orm(from_alias = "bakery_id")]
    id: i32,
    #[sea_orm(from_alias = "bakery_name")]
    title: String,
}

let cake: Cake = cake::Entity::find()
    .select_only()
    .column(cake::Column::Id)
    .column(cake::Column::Name)
    .column_as(bakery::Column::Id, "bakery_id")
    .column_as(bakery::Column::Name, "bakery_name")
    .left_join(bakery::Entity)
    .order_by_asc(cake::Column::Id)
    .into_model()
    .one(&ctx.db)
    .await?
    .unwrap();

assert_eq!(
    cake,
    Cake {
        id: 1,
        name: "Cake".to_string(),
        bakery: Some(CakeBakery {
            id: 20,
            title: "Bakery".to_string(),
        })
    }
);
```

- Support nested entities in `DerivePartialModel`
[https://github.com/SeaQL/sea-orm/pull/2508](https://redirect.github.com/SeaQL/sea-orm/pull/2508)

```rust

#[derive(DerivePartialModel)] // FromQueryResult is no longer needed
#[sea_orm(entity = "cake::Entity", from_query_result)]
struct Cake {
    id: i32,
    name: String,
    #[sea_orm(nested)]
    bakery: Option<Bakery>,
}

#[derive(DerivePartialModel)]
#[sea_orm(entity = "bakery::Entity", from_query_result)]
struct Bakery {
    id: i32,
    #[sea_orm(from_col = "Name")]
    title: String,
}

// same as previous example, but without the custom selects
let cake: Cake = cake::Entity::find()
    .left_join(bakery::Entity)
    .order_by_asc(cake::Column::Id)
    .into_partial_model()
    .one(&ctx.db)
    .await?
    .unwrap();

assert_eq!(
    cake,
    Cake {
        id: 1,
        name: "Cake".to_string(),
        bakery: Some(Bakery {
            id: 20,
            title: "Bakery".to_string(),
        })
    }
);
```

- Derive also `IntoActiveModel` with `DerivePartialModel`
[https://github.com/SeaQL/sea-orm/pull/2517](https://redirect.github.com/SeaQL/sea-orm/pull/2517)

```rust

#[derive(DerivePartialModel)]
#[sea_orm(entity = "cake::Entity", into_active_model)]
struct Cake {
    id: i32,
    name: String,
}

assert_eq!(
    Cake {
        id: 12,
        name: "Lemon Drizzle".to_owned(),
    }
    .into_active_model(),
    cake::ActiveModel {
        id: Set(12),
        name: Set("Lemon Drizzle".to_owned()),
        ..Default::default()
    }
);
```

- Added `SelectThree`
[https://github.com/SeaQL/sea-orm/pull/2518](https://redirect.github.com/SeaQL/sea-orm/pull/2518)

```rust
// Order -> (many) Lineitem -> Cake
let items: Vec<(order::Model, Option<lineitem::Model>, Option<cake::Model>)> =
    order::Entity::find()
        .find_also_related(lineitem::Entity)
        .and_also_related(cake::Entity)
        .order_by_asc(order::Column::Id)
        .order_by_asc(lineitem::Column::Id)
        .all(&ctx.db)
        .await?;
```

##### Enhancements

- Support complex type path in `DeriveIntoActiveModel`
[https://github.com/SeaQL/sea-orm/pull/2517](https://redirect.github.com/SeaQL/sea-orm/pull/2517)

```rust

#[derive(DeriveIntoActiveModel)]
#[sea_orm(active_model = "<fruit::Entity as EntityTrait>::ActiveModel")]
struct Fruit {
    cake_id: Option<Option<i32>>,
}
```

- Added `DatabaseConnection::close_by_ref`
[https://github.com/SeaQL/sea-orm/pull/2511](https://redirect.github.com/SeaQL/sea-orm/pull/2511)

```rust
pub async fn close(self) -> Result<(), DbErr> { .. } // existing
pub async fn close_by_ref(&self) -> Result<(), DbErr> { .. } // new
```

##### House Keeping

- Cleanup legacy `ActiveValue::Set`
[https://github.com/SeaQL/sea-orm/pull/2515](https://redirect.github.com/SeaQL/sea-orm/pull/2515)

###
[`v1.1.6`](https://redirect.github.com/SeaQL/sea-orm/blob/HEAD/CHANGELOG.md#116---2025-02-24)

[Compare
Source](https://redirect.github.com/SeaQL/sea-orm/compare/1.1.5...1.1.6)

##### New Features

- Support PgVector (under feature flag `postgres-vector`)
[https://github.com/SeaQL/sea-orm/pull/2500](https://redirect.github.com/SeaQL/sea-orm/pull/2500)

```rust
// Model

#[derive(Clone, Debug, PartialEq, DeriveEntityModel)]
#[sea_orm(table_name = "image_model")]
pub struct Model {
    #[sea_orm(primary_key, auto_increment = false)]
    pub id: i32,
    pub embedding: PgVector,
}
 
// Schema
sea_query::Table::create()
    .table(image_model::Entity.table_ref())
    .col(ColumnDef::new(Column::Id).integer().not_null().primary_key())
    .col(ColumnDef::new(Column::Embedding).vector(None).not_null())
    ..

// Insert
ActiveModel {
    id: NotSet,
    embedding: Set(PgVector::from(vec![1., 2., 3.])),
}
.insert(db)
.await?
```

- Added `Insert::exec_with_returning_keys` &
`Insert::exec_with_returning_many` (Postgres only)

```rust
assert_eq!(
    Entity::insert_many([
        ActiveModel { id: NotSet, name: Set("two".into()) },
        ActiveModel { id: NotSet, name: Set("three".into()) },
    ])
    .exec_with_returning_many(db)
    .await
    .unwrap(),
    [
        Model { id: 2, name: "two".into() },
        Model { id: 3, name: "three".into() },
    ]
);

assert_eq!(
    cakes_bakers::Entity::insert_many([
        cakes_bakers::ActiveModel {
            cake_id: Set(1),
            baker_id: Set(2),
        },
        cakes_bakers::ActiveModel {
            cake_id: Set(2),
            baker_id: Set(1),
        },
    ])
    .exec_with_returning_keys(db)
    .await
    .unwrap(),
    [(1, 2), (2, 1)]
);
```

- Added `DeleteOne::exec_with_returning` &
`DeleteMany::exec_with_returning`
[https://github.com/SeaQL/sea-orm/pull/2432](https://redirect.github.com/SeaQL/sea-orm/pull/2432)

##### Enhancements

- Expose underlying row types (e.g. `sqlx::postgres::PgRow`)
[https://github.com/SeaQL/sea-orm/pull/2265](https://redirect.github.com/SeaQL/sea-orm/pull/2265)
- \[sea-orm-cli] Added `acquire-timeout` option
[https://github.com/SeaQL/sea-orm/pull/2461](https://redirect.github.com/SeaQL/sea-orm/pull/2461)
- \[sea-orm-cli] Added `with-prelude` option
[https://github.com/SeaQL/sea-orm/pull/2322](https://redirect.github.com/SeaQL/sea-orm/pull/2322)
- \[sea-orm-cli] Added `impl-active-model-behavior` option
[https://github.com/SeaQL/sea-orm/pull/2487](https://redirect.github.com/SeaQL/sea-orm/pull/2487)

##### Bug Fixes

- Fixed `seaography::register_active_enums` macro
[https://github.com/SeaQL/sea-orm/pull/2475](https://redirect.github.com/SeaQL/sea-orm/pull/2475)

##### House keeping

- Remove `futures` crate, replace with `futures-util`
[https://github.com/SeaQL/sea-orm/pull/2466](https://redirect.github.com/SeaQL/sea-orm/pull/2466)

</details>

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-20 07:52:35 +02:00
renovate[bot]
9fd2d064ee Update Rust crate mimalloc to v0.1.44 (#27131)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [mimalloc](https://redirect.github.com/purpleprotocol/mimalloc_rust) |
dependencies | patch | `0.1.43` -> `0.1.44` |

---

### Release Notes

<details>
<summary>purpleprotocol/mimalloc_rust (mimalloc)</summary>

###
[`v0.1.44`](https://redirect.github.com/purpleprotocol/mimalloc_rust/releases/tag/v0.1.44):
Version 0.1.44

[Compare
Source](https://redirect.github.com/purpleprotocol/mimalloc_rust/compare/v0.1.43...v0.1.44)

##### Changes

-   Mimalloc v2.2.2

</details>

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-20 07:48:35 +02:00
renovate[bot]
11425cf5f1 Update Rust crate unindent to v0.2.4 (#27151)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [unindent](https://redirect.github.com/dtolnay/indoc) |
workspace.dependencies | patch | `0.2.3` -> `0.2.4` |

---

### Release Notes

<details>
<summary>dtolnay/indoc (unindent)</summary>

###
[`v0.2.4`](https://redirect.github.com/dtolnay/indoc/releases/tag/0.2.4)

[Compare
Source](https://redirect.github.com/dtolnay/indoc/compare/0.2.3...0.2.4)

-   Update to Syn 0.13

</details>

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-20 07:42:39 +02:00
renovate[bot]
b54c92079f Update Rust crate winresource to v0.1.20 (#27152)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [winresource](https://redirect.github.com/BenjaminRi/winresource) |
build-dependencies | patch | `0.1.19` -> `0.1.20` |

Release Notes:

- N/A

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-20 07:42:20 +02:00
Conrad Irwin
ed9afd86a1 Merge branch 'main' into nv12-buffers 2025-03-19 22:34:21 -06:00
renovate[bot]
3bbdc546ec Update Rust crate time to v0.3.40 (#27147)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [time](https://time-rs.github.io)
([source](https://redirect.github.com/time-rs/time)) |
workspace.dependencies | patch | `0.3.37` -> `0.3.40` |

---

### Release Notes

<details>
<summary>time-rs/time (time)</summary>

###
[`v0.3.40`](https://redirect.github.com/time-rs/time/blob/HEAD/CHANGELOG.md#0340-2025-03-18)

[Compare
Source](https://redirect.github.com/time-rs/time/compare/v0.3.39...v0.3.40)

##### Added

- Visibility modifiers may now be added to the `mod` generated by
`time::serde::format_description!`.

###
[`v0.3.39`](https://redirect.github.com/time-rs/time/blob/HEAD/CHANGELOG.md#0339-2025-03-06)

[Compare
Source](https://redirect.github.com/time-rs/time/compare/v0.3.38...v0.3.39)

##### Fixed

-   Doc tests run successfully with the default feature set.
-   wasm builds work again.

Both of these were regressions in v0.3.38 and are now checked in CI.

###
[`v0.3.38`](https://redirect.github.com/time-rs/time/blob/HEAD/CHANGELOG.md#0338-2025-03-05)

[Compare
Source](https://redirect.github.com/time-rs/time/compare/v0.3.37...v0.3.38)

##### Added

-   The `[year]` component (in format descriptions) now supports a `range` modifier, which can be
    either `standard` or `extended`. The default is `extended` for backwards compatibility. This
    is intended as a way to opt *out* of the extended range when the `large-dates` feature is
    enabled. When the `large-dates` feature is not enabled, the modifier has no effect.
-   `UtcDateTime`, which is semantically equivalent to an `OffsetDateTime` with UTC as its
    offset. The advantage is that it is the same size as a `PrimitiveDateTime` and has improved
    interoperability with well-known formats.

    As part of this, there were some other additions:

    -   `utc_datetime!` macro, which is similar to the `datetime!` macro but constructs a
        `UtcDateTime`.
    -   `PrimitiveDateTime::as_utc`
    -   `OffsetDateTime::to_utc`
    -   `OffsetDateTime::checked_to_utc`
-   `time::serde::timestamp::milliseconds_i64`, which is a module to serialize/deserialize
    timestamps as the Unix timestamp in milliseconds. The pre-existing module does this as an
    `i128` where an `i64` would suffice. This new module should be preferred.

##### Changed

-   `error::Format` has had its `source()` implementation changed to no longer return a boxed
    value from the `ComponentRange` variant. If you were explicitly expecting this, you will need
    to update your code. The method API remains unchanged.
-   `[year repr:century]` supports single-digit values.
-   All `format_into` methods accept `?Sized` references.

##### Miscellaneous

-   Some non-exhaustive enum variants that are no longer used have been modified to be statically
    proven as uninhabited. The relevant fields are doc-hidden and not semver-guaranteed to remain
    as such, though it is unlikely to change.
-   An unnecessary check when parsing RFC 2822 has been removed.
-   Various methods have had their implementations changed, resulting in significant performance
    gains. Among the methods changed are:
    -   `util::is_leap_year`
    -   `util::weeks_in_year`
    -   `Month::length`
    -   `Date::to_calendar_date`
    -   `Date::month`
    -   `Date::day`
    -   `Date::from_julian_day`
    -   `Date::to_julian_day`
    -   other methods that call into these methods

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-19 23:36:33 -04:00
renovate[bot]
e4e3ce6a38 Update Rust crate serde_repr to v0.1.20 (#27146)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [serde_repr](https://redirect.github.com/dtolnay/serde-repr) | workspace.dependencies | patch | `0.1.19` -> `0.1.20` |

---

### Release Notes

<details>
<summary>dtolnay/serde-repr (serde_repr)</summary>

###
[`v0.1.20`](https://redirect.github.com/dtolnay/serde-repr/releases/tag/0.1.20)

[Compare
Source](https://redirect.github.com/dtolnay/serde-repr/compare/0.1.19...0.1.20)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-19 23:35:25 -04:00
renovate[bot]
8cd96cbf59 Update Rust crate serde_json to v1.0.140 (#27144)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [serde_json](https://redirect.github.com/serde-rs/json) | dependencies | patch | `1.0.139` -> `1.0.140` |
| [serde_json](https://redirect.github.com/serde-rs/json) | workspace.dependencies | patch | `1.0.139` -> `1.0.140` |

---

### Release Notes

<details>
<summary>serde-rs/json (serde_json)</summary>

###
[`v1.0.140`](https://redirect.github.com/serde-rs/json/releases/tag/v1.0.140)

[Compare
Source](https://redirect.github.com/serde-rs/json/compare/v1.0.139...v1.0.140)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-03-19 23:35:21 -04:00
Ben Kunkle
274124256d Fix code action formatters creating separate transaction (#26311)
Closes #24588
Closes #25419

Restructures `LspStore::format_local` to make how the transaction history is
preserved clearer, and in doing so fixes various bugs in how the transaction
history is handled during a format request (especially when formatting in
remote dev)

Release Notes:

- Fixed an issue that prevented formatting from working in remote dev
- Fixed an issue where, when using code actions as a format step, the edits
made by the code actions were not grouped with the other format edits in the
undo history
2025-03-19 20:59:43 -05:00
Mikayla Maki
076aeaec49 Fix tests 2025-03-19 16:24:05 -07:00
Mikayla Maki
18a342c6c0 Hide macos dependencies from other platforms 2025-03-19 16:01:16 -07:00
Mikayla Maki
ac16b144b1 remove unused dependencies 2025-03-19 15:56:37 -07:00
Mikayla Maki
c30f758879 clippppppp 2025-03-19 15:51:31 -07:00
Mikayla Maki
f37a763b88 Merge branch 'main' into nv12-buffers 2025-03-19 15:44:35 -07:00
Mikayla Maki
7dfd257631 Fix blade with new core video types 2025-03-19 15:42:05 -07:00
Mikayla Maki
22e8b374ac Delete swift bindings 2025-03-19 15:16:57 -07:00
Conrad Irwin
e18cb79a73 Fix screen resolution calculation 2025-03-19 15:45:40 -06:00
Conrad Irwin
6b726a8a31 spell more correclty 2025-03-19 15:30:11 -06:00
Conrad Irwin
4a37b59f14 and more
:ewa
2025-03-19 15:28:06 -06:00
Conrad Irwin
96f246ddf4 Git deps 2025-03-19 15:26:11 -06:00
Conrad Irwin
853f4299e4 MOARMOARMOARMOAR 2025-03-19 15:21:49 -06:00
Conrad Irwin
4dfa685ea5 Get all the crate versions aligned 2025-03-19 13:38:43 -06:00
Conrad Irwin
5ef80e355f Moar 2025-03-19 12:41:37 -06:00
Conrad Irwin
d93db7c3ec TEMP 2025-03-18 12:55:28 -06:00
Conrad Irwin
d0cb7fb5d4 TEMP 2025-03-18 11:59:13 -06:00
Conrad Irwin
7af7281348 TEMP 2025-03-18 10:36:52 -06:00
Conrad Irwin
1fe10ae1ba TEMP 2025-03-17 22:27:04 -06:00
Conrad Irwin
13c3551f09 Update room.rs 2025-03-17 20:59:27 -06:00
Conrad Irwin
d61eb8dbc6 tokio based livekit.. 2025-03-17 20:54:33 -06:00
Conrad Irwin
14ef68f76c TEMP 2025-03-17 20:15:53 -06:00
Conrad Irwin
d39a215df7 TEMP 2025-03-17 19:16:15 -06:00
Conrad Irwin
3099bbc48b TEMP 2025-03-14 20:59:02 -06:00
Conrad Irwin
ba8a8701e4 TEMP 2025-03-14 14:14:05 -06:00
Conrad Irwin
cc7373e7e7 TEMP 2025-03-14 12:07:44 -06:00
Conrad Irwin
38330d96e5 Revert "TEMP"
This reverts commit db4685cf1f.
2025-03-14 11:42:47 -06:00
Conrad Irwin
db4685cf1f TEMP 2025-03-14 11:42:46 -06:00
Conrad Irwin
0253c8e93e Use livekit rust 2025-03-14 10:30:38 -06:00
176 changed files with 12967 additions and 16992 deletions


@@ -19,6 +19,10 @@
# https://github.com/zed-industries/zed/pull/2394
eca93c124a488b4e538946cd2d313bd571aa2b86
# 2024-02-15 Format YAML files
# https://github.com/zed-industries/zed/pull/7887
a161a7d0c95ca7505bf9218bfae640ee5444c88b
# 2024-02-25 Format JSON files in assets/
# https://github.com/zed-industries/zed/pull/8405
ffdda588b41f7d9d270ffe76cab116f828ad545e


@@ -209,7 +209,6 @@ jobs:
cargo check -p workspace
cargo build -p remote_server
cargo check -p gpui --examples
script/check-rust-livekit-macos
# Since the macOS runners are stateful, we need to remove the config file to prevent potential bugs.
- name: Clean CI config file
@@ -235,7 +234,7 @@ jobs:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache-provider: "buildjet"
@@ -287,7 +286,7 @@ jobs:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache-provider: "buildjet"
@@ -334,7 +333,7 @@ jobs:
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.ZED_WORKSPACE }}" -Recurse
- name: Cache dependencies
uses: swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
workspaces: ${{ env.ZED_WORKSPACE }}
@@ -393,7 +392,7 @@ jobs:
Copy-Item -Path "${{ github.workspace }}" -Destination "${{ env.ZED_WORKSPACE }}" -Recurse
- name: Cache dependencies
uses: swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
workspaces: ${{ env.ZED_WORKSPACE }}


@@ -22,7 +22,7 @@ jobs:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@f0deed1e0edfc6a9be95417288c0e1099b1eeec3 # v2
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache-provider: "github"

361
Cargo.lock generated

@@ -607,6 +607,7 @@ version = "0.1.0"
dependencies = [
"anthropic",
"anyhow",
"collections",
"deepseek",
"feature_flags",
"fs",
@@ -1074,7 +1075,7 @@ source = "git+https://github.com/zed-industries/async-tls?rev=1e759a4b5e370f87dc
dependencies = [
"futures-core",
"futures-io",
"rustls 0.23.23",
"rustls 0.23.25",
"rustls-pemfile 2.2.0",
"webpki-roots",
]
@@ -2297,10 +2298,10 @@ dependencies = [
"fs",
"futures 0.3.31",
"gpui",
"gpui_tokio",
"http_client",
"language",
"livekit_client",
"livekit_client_macos",
"log",
"postage",
"project",
@@ -2356,7 +2357,7 @@ dependencies = [
"cap-primitives",
"cap-std",
"io-lifetimes",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -2384,7 +2385,7 @@ dependencies = [
"ipnet",
"maybe-owned",
"rustix",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
"winx",
]
@@ -2436,8 +2437,7 @@ dependencies = [
[[package]]
name = "cargo_metadata"
version = "0.19.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dd5eb614ed4c27c5d706420e4320fbe3216ab31fa1c33cd8246ac36dae4479ba"
source = "git+https://github.com/zed-industries/cargo_metadata?rev=ce8171bad673923d61a77b6761d0dc4aff63398a#ce8171bad673923d61a77b6761d0dc4aff63398a"
dependencies = [
"camino",
"cargo-platform",
@@ -2563,6 +2563,15 @@ version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "613afe47fcd5fac7ccf1db93babcb082c5994d996f20b8b159f2ad1658eb5724"
[[package]]
name = "cgl"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0ced0551234e87afee12411d535648dd89d2e7f34c78b753395567aff3d447ff"
dependencies = [
"libc",
]
[[package]]
name = "channel"
version = "0.1.0"
@@ -2719,7 +2728,7 @@ dependencies = [
"anyhow",
"clap",
"collections",
"core-foundation 0.9.4",
"core-foundation 0.10.0",
"core-services",
"exec",
"fork",
@@ -2919,6 +2928,7 @@ dependencies = [
"git_ui",
"google_ai",
"gpui",
"gpui_tokio",
"hex",
"http_client",
"hyper 0.14.32",
@@ -2928,7 +2938,6 @@ dependencies = [
"language_model",
"livekit_api",
"livekit_client",
"livekit_client_macos",
"log",
"lsp",
"menu",
@@ -3354,6 +3363,19 @@ dependencies = [
"libc",
]
[[package]]
name = "core-graphics2"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7e4583956b9806b69f73fcb23aee05eb3620efc282972f08f6a6db7504f8334d"
dependencies = [
"bitflags 2.8.0",
"block",
"cfg-if",
"core-foundation 0.10.0",
"libc",
]
[[package]]
name = "core-services"
version = "0.2.1"
@@ -3365,16 +3387,30 @@ dependencies = [
[[package]]
name = "core-text"
version = "20.1.0"
version = "21.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c9d2790b5c08465d49f8dc05c8bcae9fea467855947db39b0f8145c091aaced5"
checksum = "a593227b66cbd4007b2a050dfdd9e1d1318311409c8d600dc82ba1b15ca9c130"
dependencies = [
"core-foundation 0.9.4",
"core-graphics 0.23.2",
"core-foundation 0.10.0",
"core-graphics 0.24.0",
"foreign-types 0.5.0",
"libc",
]
[[package]]
name = "core-video"
version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d45e71d5be22206bed53c3c3cb99315fc4c3d31b8963808c6bc4538168c4f8ef"
dependencies = [
"block",
"core-foundation 0.10.0",
"core-graphics2",
"io-surface",
"libc",
"metal",
]
[[package]]
name = "core_maths"
version = "0.1.1"
@@ -4075,9 +4111,9 @@ dependencies = [
[[package]]
name = "deranged"
version = "0.3.11"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b42b6fa04a440b495c8b04d0e71b707c585f83cb9cb28cf8cd0d976c315e31b4"
checksum = "9c9e6a11ca8224451684bc0d7d5a7adbf8f2fd6887261a1cfc3c0432f9d4068e"
dependencies = [
"powerfmt",
"serde",
@@ -4592,7 +4628,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "33d852cb9b869c2a9b3df2f71a3074817f01e1844f839a144f5fcef059a4eb5d"
dependencies = [
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -4927,25 +4963,13 @@ dependencies = [
name = "feedback"
version = "0.1.0"
dependencies = [
"anyhow",
"bitflags 2.8.0",
"client",
"db",
"editor",
"futures 0.3.31",
"gpui",
"http_client",
"human_bytes",
"language",
"log",
"menu",
"project",
"regex",
"release_channel",
"serde",
"serde_derive",
"serde_json",
"smol",
"sysinfo",
"ui",
"urlencoding",
@@ -5088,12 +5112,12 @@ checksum = "f81ec6369c545a7d40e4589b5597581fa1c441fe1cce96dd1de43159910a36a2"
[[package]]
name = "font-kit"
version = "0.14.1"
source = "git+https://github.com/zed-industries/font-kit?rev=40391b7#40391b7c0041d8a8572af2afa3de32ae088f0120"
source = "git+https://github.com/zed-industries/font-kit?rev=5474cfad4b719a72ec8ed2cb7327b2b01fd10568#5474cfad4b719a72ec8ed2cb7327b2b01fd10568"
dependencies = [
"bitflags 2.8.0",
"byteorder",
"core-foundation 0.9.4",
"core-graphics 0.23.2",
"core-foundation 0.10.0",
"core-graphics 0.24.0",
"core-text",
"dirs 5.0.1",
"dwrote",
@@ -5267,7 +5291,7 @@ checksum = "5e2e6123af26f0f2c51cc66869137080199406754903cc926a7690401ce09cb4"
dependencies = [
"io-lifetimes",
"rustix",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -5291,7 +5315,7 @@ name = "fsevent"
version = "0.1.0"
dependencies = [
"bitflags 2.8.0",
"core-foundation 0.9.4",
"core-foundation 0.10.0",
"fsevent-sys 3.1.0",
"parking_lot",
"tempfile",
@@ -5828,10 +5852,11 @@ dependencies = [
"cbindgen 0.28.0",
"cocoa 0.26.0",
"collections",
"core-foundation 0.9.4",
"core-foundation 0.10.0",
"core-foundation-sys",
"core-graphics 0.23.2",
"core-graphics 0.24.0",
"core-text",
"core-video",
"cosmic-text",
"ctor",
"derive_more",
@@ -6344,7 +6369,7 @@ dependencies = [
name = "http_client_tls"
version = "0.1.0"
dependencies = [
"rustls 0.23.23",
"rustls 0.23.25",
"rustls-platform-verifier",
]
@@ -6442,7 +6467,7 @@ dependencies = [
"http 1.2.0",
"hyper 1.5.1",
"hyper-util",
"rustls 0.23.23",
"rustls 0.23.25",
"rustls-native-certs 0.8.1",
"rustls-pki-types",
"tokio",
@@ -6926,7 +6951,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2285ddfe3054097ef4b2fe909ef8c3bcd1ea52a8f0d274416caebeef39f04a65"
dependencies = [
"io-lifetimes",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -6935,6 +6960,19 @@ version = "2.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "06432fb54d3be7964ecd3649233cddf80db2832f47fec34c01f65b3d9d774983"
[[package]]
name = "io-surface"
version = "0.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8283575d5f0b2e7447ec0840363879d71c0fa325d4c699d5b45208ea4a51f45e"
dependencies = [
"cgl",
"core-foundation 0.10.0",
"core-foundation-sys",
"leaky-cow",
"libc",
]
[[package]]
name = "iovec"
version = "0.1.4"
@@ -7515,6 +7553,21 @@ version = "1.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "830d08ce1d1d941e6b30645f1a0eb5643013d835ce3779a5fc208261dbe10f55"
[[package]]
name = "leak"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bd100e01f1154f2908dfa7d02219aeab25d0b9c7fa955164192e3245255a0c73"
[[package]]
name = "leaky-cow"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "40a8225d44241fd324a8af2806ba635fc7c8a7e9a7de4d5cf3ef54e71f5926fc"
dependencies = [
"leak",
]
[[package]]
name = "leb128"
version = "0.2.5"
@@ -7572,7 +7625,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fc2f4eb4bc735547cfed7c0a4922cbd04a4655978c09b54f1f7b228750664c34"
dependencies = [
"cfg-if",
"windows-targets 0.48.5",
"windows-targets 0.52.6",
]
[[package]]
@@ -7583,9 +7636,9 @@ checksum = "8355be11b20d696c8f18f6cc018c4e372165b1fa8126cef092399c9951984ffa"
[[package]]
name = "libmimalloc-sys"
version = "0.1.39"
version = "0.1.40"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "23aa6811d3bd4deb8a84dde645f943476d13b248d818edcf8ce0b2f37f036b44"
checksum = "07d0e07885d6a754b9c7993f2625187ad694ee985d60f23355ff0e7077261502"
dependencies = [
"cc",
"libc",
@@ -7615,8 +7668,8 @@ dependencies = [
[[package]]
name = "libwebrtc"
version = "0.3.7"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=811ceae29fabee455f110c56cd66b3f49a7e5003#811ceae29fabee455f110c56cd66b3f49a7e5003"
version = "0.3.10"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=4941ff2352c9a243113607e808fc784450aa2ff3#4941ff2352c9a243113607e808fc784450aa2ff3"
dependencies = [
"cxx",
"jni",
@@ -7700,12 +7753,13 @@ checksum = "4ee93343901ab17bd981295f2cf0026d4ad018c7c31ba84549a4ddbb47a45104"
[[package]]
name = "livekit"
version = "0.7.0"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=811ceae29fabee455f110c56cd66b3f49a7e5003#811ceae29fabee455f110c56cd66b3f49a7e5003"
version = "0.7.7"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=4941ff2352c9a243113607e808fc784450aa2ff3#4941ff2352c9a243113607e808fc784450aa2ff3"
dependencies = [
"chrono",
"futures-util",
"lazy_static",
"libloading",
"libwebrtc",
"livekit-api",
"livekit-protocol",
@@ -7722,10 +7776,10 @@ dependencies = [
[[package]]
name = "livekit-api"
version = "0.4.1"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=811ceae29fabee455f110c56cd66b3f49a7e5003#811ceae29fabee455f110c56cd66b3f49a7e5003"
version = "0.4.2"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=4941ff2352c9a243113607e808fc784450aa2ff3#4941ff2352c9a243113607e808fc784450aa2ff3"
dependencies = [
"async-tungstenite",
"base64 0.21.7",
"futures-util",
"http 0.2.12",
"jsonwebtoken",
@@ -7733,7 +7787,9 @@ dependencies = [
"livekit-runtime",
"log",
"parking_lot",
"pbjson-types",
"prost 0.12.6",
"rand 0.9.0",
"reqwest 0.11.27",
"scopeguard",
"serde",
@@ -7741,14 +7797,14 @@ dependencies = [
"sha2",
"thiserror 1.0.69",
"tokio",
"tokio-tungstenite 0.20.1",
"tokio-tungstenite 0.26.2",
"url",
]
[[package]]
name = "livekit-protocol"
version = "0.3.6"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=811ceae29fabee455f110c56cd66b3f49a7e5003#811ceae29fabee455f110c56cd66b3f49a7e5003"
version = "0.3.9"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=4941ff2352c9a243113607e808fc784450aa2ff3#4941ff2352c9a243113607e808fc784450aa2ff3"
dependencies = [
"futures-util",
"livekit-runtime",
@@ -7764,13 +7820,11 @@ dependencies = [
[[package]]
name = "livekit-runtime"
version = "0.3.1"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=811ceae29fabee455f110c56cd66b3f49a7e5003#811ceae29fabee455f110c56cd66b3f49a7e5003"
version = "0.4.0"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=4941ff2352c9a243113607e808fc784450aa2ff3#4941ff2352c9a243113607e808fc784450aa2ff3"
dependencies = [
"async-io",
"async-std",
"async-task",
"futures 0.3.31",
"tokio",
"tokio-stream",
]
[[package]]
@@ -7795,19 +7849,21 @@ dependencies = [
"anyhow",
"async-trait",
"collections",
"core-foundation 0.9.4",
"core-foundation 0.10.0",
"core-video",
"coreaudio-rs 0.12.1",
"cpal",
"futures 0.3.31",
"gpui",
"http 0.2.12",
"http_client",
"gpui_tokio",
"http_client_tls",
"image",
"libwebrtc",
"livekit",
"livekit_api",
"log",
"media",
"nanoid",
"objc",
"parking_lot",
"postage",
"serde",
@@ -7815,32 +7871,10 @@ dependencies = [
"sha2",
"simplelog",
"smallvec",
"tokio-tungstenite 0.26.2",
"util",
]
[[package]]
name = "livekit_client_macos"
version = "0.1.0"
dependencies = [
"anyhow",
"async-broadcast",
"async-trait",
"collections",
"core-foundation 0.9.4",
"futures 0.3.31",
"gpui",
"livekit_api",
"log",
"media",
"nanoid",
"parking_lot",
"postage",
"serde",
"serde_json",
"sha2",
"simplelog",
]
[[package]]
name = "lmdb-master-sys"
version = "0.2.4"
@@ -8201,7 +8235,8 @@ version = "0.1.0"
dependencies = [
"anyhow",
"bindgen 0.70.1",
"core-foundation 0.9.4",
"core-foundation 0.10.0",
"core-video",
"ctor",
"foreign-types 0.5.0",
"metal",
@@ -8251,9 +8286,9 @@ dependencies = [
[[package]]
name = "metal"
version = "0.31.0"
version = "0.29.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f569fb946490b5743ad69813cb19629130ce9374034abe31614a36402d18f99e"
checksum = "7ecfd3296f8c56b7c1f6fbac3c71cefa9d78ce009850c45000015f206dc7fa21"
dependencies = [
"bitflags 2.8.0",
"block",
@@ -8280,9 +8315,9 @@ dependencies = [
[[package]]
name = "mimalloc"
version = "0.1.43"
version = "0.1.44"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "68914350ae34959d83f732418d51e2427a794055d0b9529f48259ac07af65633"
checksum = "99585191385958383e13f6b822e6b6d8d9cf928e7d286ceb092da92b43c87bc1"
dependencies = [
"libmimalloc-sys",
]
@@ -10112,6 +10147,15 @@ dependencies = [
"indexmap",
]
[[package]]
name = "pgvector"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e0e8871b6d7ca78348c6cd29b911b94851f3429f0cd403130ca17f26c1fb91a6"
dependencies = [
"serde",
]
[[package]]
name = "phf"
version = "0.11.2"
@@ -10592,6 +10636,7 @@ dependencies = [
"smol",
"snippet",
"snippet_provider",
"sum_tree",
"task",
"tempfile",
"terminal",
@@ -10961,7 +11006,7 @@ dependencies = [
"quinn-proto",
"quinn-udp",
"rustc-hash 2.1.1",
"rustls 0.23.23",
"rustls 0.23.25",
"socket2",
"thiserror 2.0.12",
"tokio",
@@ -10979,7 +11024,7 @@ dependencies = [
"rand 0.8.5",
"ring",
"rustc-hash 2.1.1",
"rustls 0.23.23",
"rustls 0.23.25",
"rustls-pki-types",
"slab",
"thiserror 2.0.12",
@@ -10999,7 +11044,7 @@ dependencies = [
"once_cell",
"socket2",
"tracing",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -11576,7 +11621,7 @@ dependencies = [
"percent-encoding",
"pin-project-lite",
"quinn",
"rustls 0.23.23",
"rustls 0.23.25",
"rustls-native-certs 0.8.1",
"rustls-pemfile 2.2.0",
"rustls-pki-types",
@@ -11924,7 +11969,7 @@ dependencies = [
"libc",
"linux-raw-sys",
"once_cell",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -11952,16 +11997,16 @@ dependencies = [
[[package]]
name = "rustls"
version = "0.23.23"
version = "0.23.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "47796c98c480fce5406ef69d1c76378375492c3b0a0de587be0c1d9feb12f395"
checksum = "822ee9188ac4ec04a2f0531e55d035fb2de73f18b41a63c70c2712503b6fb13c"
dependencies = [
"aws-lc-rs",
"log",
"once_cell",
"ring",
"rustls-pki-types",
"rustls-webpki 0.102.8",
"rustls-webpki 0.103.0",
"subtle",
"zeroize",
]
@@ -12010,32 +12055,32 @@ dependencies = [
[[package]]
name = "rustls-pki-types"
version = "1.10.0"
version = "1.11.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "16f1201b3c9a7ee8039bcadc17b7e605e2945b27eee7631788c1bd2b0643674b"
checksum = "917ce264624a4b4db1c364dcc35bfca9ded014d0a958cd47ad3e960e988ea51c"
dependencies = [
"web-time",
]
[[package]]
name = "rustls-platform-verifier"
version = "0.5.0"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e012c45844a1790332c9386ed4ca3a06def221092eda277e6f079728f8ea99da"
checksum = "4a5467026f437b4cb2a533865eaa73eb840019a0916f4b9ec563c6e617e086c9"
dependencies = [
"core-foundation 0.10.0",
"core-foundation-sys",
"jni",
"log",
"once_cell",
"rustls 0.23.23",
"rustls 0.23.25",
"rustls-native-certs 0.8.1",
"rustls-platform-verifier-android",
"rustls-webpki 0.102.8",
"rustls-webpki 0.103.0",
"security-framework 3.0.1",
"security-framework-sys",
"webpki-root-certs",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -12056,9 +12101,9 @@ dependencies = [
[[package]]
name = "rustls-webpki"
version = "0.102.8"
version = "0.103.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "64ca1bc8749bd4cf37b5ce386cc146580777b4e8572c7b97baf22c83f444bee9"
checksum = "0aa4eeac2588ffff23e9d7a7e9b3f971c5fb5b7ebc9452745e0c232c64f83b2f"
dependencies = [
"aws-lc-rs",
"ring",
@@ -12257,17 +12302,18 @@ dependencies = [
[[package]]
name = "sea-orm"
version = "1.1.5"
version = "1.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "00733e5418e8ae3758cdb988c3654174e716230cc53ee2cb884207cf86a23029"
checksum = "3417812d38049e8ec3d588c03570f8c60de811d2453fb48e424045a1600ffd86"
dependencies = [
"async-stream",
"async-trait",
"bigdecimal",
"chrono",
"futures 0.3.31",
"futures-util",
"log",
"ouroboros",
"pgvector",
"rust_decimal",
"sea-orm-macros",
"sea-query",
@@ -12285,9 +12331,9 @@ dependencies = [
[[package]]
name = "sea-orm-macros"
version = "1.1.5"
version = "1.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a98408f82fb4875d41ef469a79944a7da29767c7b3e4028e22188a3dd613b10f"
checksum = "d705ba84e1c74c8ac27784e4ac6f21584058c1dc0cadb9d39b43e109fcf8139c"
dependencies = [
"heck 0.4.1",
"proc-macro2",
@@ -12520,9 +12566,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.139"
version = "1.0.140"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "44f86c3acccc9c65b153fe1b85a3be07fe5515274ec9f0653b4a0875731c72a6"
checksum = "20068b6e96dc6c9bd23e01df8827e6c7e1f2fddd43c21810382803c136b99373"
dependencies = [
"indexmap",
"itoa",
@@ -12578,9 +12624,9 @@ dependencies = [
[[package]]
name = "serde_repr"
version = "0.1.19"
version = "0.1.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6c64451ba24fc7a6a2d60fc75dd9c83c90903b19028d4eff35e88fc1e86564e9"
checksum = "175ee3e80ae9982737ca543e96133087cbd9a485eecc3bc4de9c1a37b47ea59c"
dependencies = [
"proc-macro2",
"quote",
@@ -12706,13 +12752,6 @@ version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24188a676b6ae68c3b2cb3a01be17fbf7240ce009799bb56d5b1409051e78fde"
[[package]]
name = "shell_parser"
version = "0.1.0"
dependencies = [
"shlex",
]
[[package]]
name = "shellexpand"
version = "2.1.2"
@@ -13100,7 +13139,7 @@ dependencies = [
"once_cell",
"percent-encoding",
"rust_decimal",
"rustls 0.23.23",
"rustls 0.23.25",
"rustls-pemfile 2.2.0",
"serde",
"serde_json",
@@ -13723,7 +13762,7 @@ dependencies = [
"fd-lock",
"io-lifetimes",
"rustix",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
"winx",
]
@@ -13867,7 +13906,7 @@ dependencies = [
"getrandom 0.3.1",
"once_cell",
"rustix",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -14141,9 +14180,9 @@ dependencies = [
[[package]]
name = "time"
version = "0.3.37"
version = "0.3.40"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "35e7868883861bd0e56d9ac6efcaaca0d6d5d82a2a7ec8209ff492c07cf37b21"
checksum = "9d9c75b47bdff86fa3334a3db91356b8d7d86a9b839dab7d0bdc5c3d3a077618"
dependencies = [
"deranged",
"itoa",
@@ -14158,15 +14197,15 @@ dependencies = [
[[package]]
name = "time-core"
version = "0.1.2"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ef927ca75afb808a4d64dd374f00a2adf8d0fcff8e7b184af886c3c87ec4a3f3"
checksum = "c9e9a38711f559d9e3ce1cdb06dd7c5b8ea546bc90052da6d06bb76da74bb07c"
[[package]]
name = "time-macros"
version = "0.2.19"
version = "0.2.21"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2834e6017e3e5e4b9834939793b282bc03b37a3336245fa820e35e233e2a85de"
checksum = "29aa485584182073ed57fd5004aa09c371f021325014694e432313345865fd04"
dependencies = [
"num-conv",
"time-core",
@@ -14176,7 +14215,7 @@ dependencies = [
name = "time_format"
version = "0.1.0"
dependencies = [
"core-foundation 0.9.4",
"core-foundation 0.10.0",
"core-foundation-sys",
"sys-locale",
"time",
@@ -14364,7 +14403,7 @@ version = "0.26.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5f6d0975eaace0cf0fcadee4e4aaa5da15b5c079146f2cffb67c113be122bf37"
dependencies = [
"rustls 0.23.23",
"rustls 0.23.25",
"tokio",
]
@@ -14400,10 +14439,7 @@ checksum = "212d5dcb2a1ce06d81107c3d0ffa3121fe974b73f068c8282cb1c32328113b6c"
dependencies = [
"futures-util",
"log",
"rustls 0.21.12",
"rustls-native-certs 0.6.3",
"tokio",
"tokio-rustls 0.24.1",
"tungstenite 0.20.1",
]
@@ -14419,6 +14455,21 @@ dependencies = [
"tungstenite 0.21.0",
]
[[package]]
name = "tokio-tungstenite"
version = "0.26.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7a9daff607c6d2bf6c16fd681ccb7eecc83e4e2cdc1ca067ffaadfca5de7f084"
dependencies = [
"futures-util",
"log",
"rustls 0.23.25",
"rustls-pki-types",
"tokio",
"tokio-rustls 0.26.1",
"tungstenite 0.26.2",
]
[[package]]
name = "tokio-util"
version = "0.7.13"
@@ -14917,7 +14968,6 @@ dependencies = [
"httparse",
"log",
"rand 0.8.5",
"rustls 0.21.12",
"sha1",
"thiserror 1.0.69",
"url",
@@ -14961,6 +15011,25 @@ dependencies = [
"utf-8",
]
[[package]]
name = "tungstenite"
version = "0.26.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4793cb5e56680ecbb1d843515b23b6de9a75eb04b66643e256a396d43be33c13"
dependencies = [
"bytes 1.10.1",
"data-encoding",
"http 1.2.0",
"httparse",
"log",
"rand 0.9.0",
"rustls 0.23.25",
"rustls-pki-types",
"sha1",
"thiserror 2.0.12",
"utf-8",
]
[[package]]
name = "typeid"
version = "1.0.2"
@@ -15147,9 +15216,9 @@ checksum = "39ec24b3121d976906ece63c9daad25b85969647682eee313cb5779fdd69e14e"
[[package]]
name = "unindent"
version = "0.2.3"
version = "0.2.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c7de7d73e1754487cb58364ee906a499937a0dfabd86bcb980fa99ec8c8fa2ce"
checksum = "7264e107f553ccae879d21fbea1d6724ac785e8c3bfc762137959b5802826ef3"
[[package]]
name = "untrusted"
@@ -16096,8 +16165,8 @@ dependencies = [
[[package]]
name = "webrtc-sys"
version = "0.3.5"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=811ceae29fabee455f110c56cd66b3f49a7e5003#811ceae29fabee455f110c56cd66b3f49a7e5003"
version = "0.3.7"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=4941ff2352c9a243113607e808fc784450aa2ff3#4941ff2352c9a243113607e808fc784450aa2ff3"
dependencies = [
"cc",
"cxx",
@@ -16109,8 +16178,8 @@ dependencies = [
[[package]]
name = "webrtc-sys-build"
version = "0.3.5"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=811ceae29fabee455f110c56cd66b3f49a7e5003#811ceae29fabee455f110c56cd66b3f49a7e5003"
version = "0.3.6"
source = "git+https://github.com/zed-industries/livekit-rust-sdks?rev=4941ff2352c9a243113607e808fc784450aa2ff3#4941ff2352c9a243113607e808fc784450aa2ff3"
dependencies = [
"fs2",
"regex",
@@ -16261,7 +16330,7 @@ version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cf221c93e13a30d793f7645a0e7762c55d169dbb0a49671918a2319d289b10bb"
dependencies = [
"windows-sys 0.48.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -16800,9 +16869,9 @@ dependencies = [
[[package]]
name = "winresource"
version = "0.1.19"
version = "0.1.20"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7276691b353ad4547af8c3268488d1311f4be791ffdc0c65b8cfa8f41eed693b"
checksum = "ba4a67c78ee5782c0c1cb41bebc7e12c6e79644daa1650ebbc1de5d5b08593f7"
dependencies = [
"toml 0.8.20",
"version_check",
@@ -16821,7 +16890,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3f3fd376f71958b862e7afb20cfe5a22830e1963462f3a17f49d82a6c1d1f42d"
dependencies = [
"bitflags 2.8.0",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]


@@ -86,7 +86,6 @@ members = [
"crates/languages",
"crates/livekit_api",
"crates/livekit_client",
"crates/livekit_client_macos",
"crates/lmstudio",
"crates/lsp",
"crates/markdown",
@@ -131,7 +130,6 @@ members = [
"crates/session",
"crates/settings",
"crates/settings_ui",
"crates/shell_parser",
"crates/snippet",
"crates/snippet_provider",
"crates/snippets_ui",
@@ -290,7 +288,6 @@ language_tools = { path = "crates/language_tools" }
languages = { path = "crates/languages" }
livekit_api = { path = "crates/livekit_api" }
livekit_client = { path = "crates/livekit_client" }
livekit_client_macos = { path = "crates/livekit_client_macos" }
lmstudio = { path = "crates/lmstudio" }
lsp = { path = "crates/lsp" }
markdown = { path = "crates/markdown" }
@@ -410,15 +407,16 @@ blade-util = { git = "https://github.com/kvark/blade", rev = "b16f5c7bd873c7126f
naga = { version = "23.1.0", features = ["wgsl-in"] }
blake3 = "1.5.3"
bytes = "1.0"
cargo_metadata = "0.19"
cargo_metadata = { git = "https://github.com/zed-industries/cargo_metadata", rev = "ce8171bad673923d61a77b6761d0dc4aff63398a"}
cargo_toml = "0.21"
chrono = { version = "0.4", features = ["serde"] }
circular-buffer = "1.0"
clap = { version = "4.4", features = ["derive"] }
cocoa = "0.26"
cocoa-foundation = "0.2.0"
core-video = { version = "0.4.3", features = ["metal"] }
convert_case = "0.8.0"
core-foundation = "0.9.3"
core-foundation = "0.10.0"
core-foundation-sys = "0.8.6"
ctor = "0.4.0"
dashmap = "6.0"
@@ -456,11 +454,6 @@ libc = "0.2"
libsqlite3-sys = { version = "0.30.1", features = ["bundled"] }
linkify = "0.10.0"
linkme = "0.3.31"
livekit = { git = "https://github.com/zed-industries/livekit-rust-sdks", rev = "811ceae29fabee455f110c56cd66b3f49a7e5003", features = [
"dispatcher",
"services-dispatcher",
"rustls-tls-native-roots",
], default-features = false }
log = { version = "0.4.16", features = ["kv_unstable_serde", "serde"] }
markup5ever_rcdom = "0.3.0"
mlua = { version = "0.10", features = ["lua54", "vendored", "async", "send"] }
@@ -548,6 +541,7 @@ time = { version = "0.3", features = [
tiny_http = "0.8"
toml = "0.8"
tokio = { version = "1" }
tokio-tungstenite = { version = "0.26", features = ["__rustls-tls"]}
tower-http = "0.4.4"
tree-sitter = { version = "0.25.3", features = ["wasm"] }
tree-sitter-bash = "0.23"
@@ -593,7 +587,7 @@ which = "6.0.0"
wit-component = "0.221"
zed_llm_client = "0.4"
zstd = "0.11"
metal = "0.31"
metal = "0.29"
[workspace.dependencies.async-stripe]
git = "https://github.com/zed-industries/async-stripe"


@@ -14,5 +14,19 @@ Be concise and direct in your responses.
The user has opened a project that contains the following root directories/files:
{{#each worktrees}}
- {{root_name}} (absolute path: {{abs_path}})
- `{{root_name}}` (absolute path: `{{abs_path}}`)
{{/each}}
{{#if has_rules}}
There are rules that apply to these root directories:
{{#each worktrees}}
{{#if rules_file}}
`{{root_name}}/{{rules_file.rel_path}}`:
``````
{{{rules_file.text}}}
``````
{{/if}}
{{/each}}
{{/if}}
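For reference, a minimal stdlib-only Rust sketch of how this Handlebars template expands. The `WorktreeInfo` struct and the tuple shape of `rules_file` here are assumptions for illustration, not the real `prompt_store` types:

```rust
use std::fmt::Write as _;

// Hypothetical stand-in for the template's `worktrees` entries.
struct WorktreeInfo {
    root_name: String,
    abs_path: String,
    rules_file: Option<(String, String)>, // (rel_path, text)
}

fn render_prompt(worktrees: &[WorktreeInfo]) -> String {
    let mut out = String::from(
        "The user has opened a project that contains the following root directories/files:\n",
    );
    for wt in worktrees {
        // Mirrors: - `{{root_name}}` (absolute path: `{{abs_path}}`)
        writeln!(out, "- `{}` (absolute path: `{}`)", wt.root_name, wt.abs_path).unwrap();
    }
    // Mirrors the {{#if has_rules}} section.
    if worktrees.iter().any(|wt| wt.rules_file.is_some()) {
        out.push_str("There are rules that apply to these root directories:\n");
        for wt in worktrees {
            if let Some((rel_path, text)) = &wt.rules_file {
                // Six backticks fence the rules text so embedded ``` blocks survive.
                writeln!(out, "`{}/{}`:\n``````\n{}\n``````", wt.root_name, rel_path, text)
                    .unwrap();
            }
        }
    }
    out
}
```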


@@ -1,14 +1,16 @@
use crate::thread::{MessageId, RequestKind, Thread, ThreadError, ThreadEvent};
use crate::thread::{
LastRestoreCheckpoint, MessageId, RequestKind, Thread, ThreadError, ThreadEvent,
};
use crate::thread_store::ThreadStore;
use crate::tool_use::{ToolUse, ToolUseStatus};
use crate::ui::ContextPill;
use collections::HashMap;
use editor::{Editor, MultiBuffer};
use gpui::{
list, percentage, AbsoluteLength, Animation, AnimationExt, AnyElement, App, ClickEvent,
DefiniteLength, EdgesRefinement, Empty, Entity, Focusable, Length, ListAlignment, ListOffset,
ListState, StyleRefinement, Subscription, Task, TextStyleRefinement, Transformation,
UnderlineStyle,
list, percentage, pulsating_between, AbsoluteLength, Animation, AnimationExt, AnyElement, App,
ClickEvent, DefiniteLength, EdgesRefinement, Empty, Entity, Focusable, Length, ListAlignment,
ListOffset, ListState, StyleRefinement, Subscription, Task, TextStyleRefinement,
Transformation, UnderlineStyle, WeakEntity,
};
use language::{Buffer, LanguageRegistry};
use language_model::{LanguageModelRegistry, LanguageModelToolUseId, Role};
@@ -18,9 +20,9 @@ use settings::Settings as _;
use std::sync::Arc;
use std::time::Duration;
use theme::ThemeSettings;
use ui::Color;
use ui::{prelude::*, Disclosure, KeyBinding};
use ui::{prelude::*, Disclosure, KeyBinding, Tooltip};
use util::ResultExt as _;
use workspace::{OpenOptions, Workspace};
use crate::context_store::{refresh_context_store_text, ContextStore};
@@ -29,11 +31,13 @@ pub struct ActiveThread {
thread_store: Entity<ThreadStore>,
thread: Entity<Thread>,
context_store: Entity<ContextStore>,
workspace: WeakEntity<Workspace>,
save_thread_task: Option<Task<()>>,
messages: Vec<MessageId>,
list_state: ListState,
rendered_messages_by_id: HashMap<MessageId, Entity<Markdown>>,
rendered_scripting_tool_uses: HashMap<LanguageModelToolUseId, Entity<Markdown>>,
rendered_tool_use_labels: HashMap<LanguageModelToolUseId, Entity<Markdown>>,
editing_message: Option<(MessageId, EditMessageState)>,
expanded_tool_uses: HashMap<LanguageModelToolUseId, bool>,
last_error: Option<ThreadError>,
@@ -50,6 +54,7 @@ impl ActiveThread {
thread_store: Entity<ThreadStore>,
language_registry: Arc<LanguageRegistry>,
context_store: Entity<ContextStore>,
workspace: WeakEntity<Workspace>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
@@ -63,10 +68,12 @@ impl ActiveThread {
thread_store,
thread: thread.clone(),
context_store,
workspace,
save_thread_task: None,
messages: Vec::new(),
rendered_messages_by_id: HashMap::default(),
rendered_scripting_tool_uses: HashMap::default(),
rendered_tool_use_labels: HashMap::default(),
expanded_tool_uses: HashMap::default(),
list_state: ListState::new(0, ListAlignment::Bottom, px(1024.), {
let this = cx.entity().downgrade();
@@ -83,10 +90,29 @@ impl ActiveThread {
for message in thread.read(cx).messages().cloned().collect::<Vec<_>>() {
this.push_message(&message.id, message.text.clone(), window, cx);
for tool_use in thread.read(cx).scripting_tool_uses_for_message(message.id) {
for tool_use in thread.read(cx).tool_uses_for_message(message.id, cx) {
this.render_tool_use_label_markdown(
tool_use.id.clone(),
tool_use.ui_text.clone(),
window,
cx,
);
}
for tool_use in thread
.read(cx)
.scripting_tool_uses_for_message(message.id, cx)
{
this.render_tool_use_label_markdown(
tool_use.id.clone(),
tool_use.ui_text.clone(),
window,
cx,
);
this.render_scripting_tool_use_markdown(
tool_use.id.clone(),
tool_use.name.as_ref(),
tool_use.ui_text.as_ref(),
tool_use.input.clone(),
window,
cx,
@@ -284,6 +310,19 @@ impl ActiveThread {
.insert(tool_use_id, lua_script);
}
fn render_tool_use_label_markdown(
&mut self,
tool_use_id: LanguageModelToolUseId,
tool_label: impl Into<SharedString>,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.rendered_tool_use_labels.insert(
tool_use_id,
self.render_markdown(tool_label.into(), window, cx),
);
}
fn handle_thread_event(
&mut self,
_thread: &Entity<Thread>,
@@ -338,9 +377,18 @@ impl ActiveThread {
cx.notify();
}
ThreadEvent::UsePendingTools => {
self.thread.update(cx, |thread, cx| {
thread.use_pending_tools(cx);
});
let tool_uses = self
.thread
.update(cx, |thread, cx| thread.use_pending_tools(cx));
for tool_use in tool_uses {
self.render_tool_use_label_markdown(
tool_use.id,
tool_use.ui_text.clone(),
window,
cx,
);
}
}
ThreadEvent::ToolFinished {
pending_tool_use,
@@ -349,6 +397,12 @@ impl ActiveThread {
} => {
let canceled = *canceled;
if let Some(tool_use) = pending_tool_use {
self.render_tool_use_label_markdown(
tool_use.id.clone(),
SharedString::from(tool_use.ui_text.clone()),
window,
cx,
);
self.render_scripting_tool_use_markdown(
tool_use.id.clone(),
tool_use.name.as_ref(),
@@ -410,6 +464,7 @@ impl ActiveThread {
}
}
}
ThreadEvent::CheckpointChanged => cx.notify(),
}
}
@@ -552,8 +607,8 @@ impl ActiveThread {
// Get all the data we need from thread before we start using it in closures
let checkpoint = thread.checkpoint_for_message(message_id);
let context = thread.context_for_message(message_id);
let tool_uses = thread.tool_uses_for_message(message_id);
let scripting_tool_uses = thread.scripting_tool_uses_for_message(message_id);
let tool_uses = thread.tool_uses_for_message(message_id, cx);
let scripting_tool_uses = thread.scripting_tool_uses_for_message(message_id, cx);
// Don't render user messages that are just there for returning tool results.
if message.role == Role::User
@@ -706,27 +761,25 @@ impl ActiveThread {
)
.child(div().p_2().child(message_content)),
),
Role::Assistant => {
v_flex()
.id(("message-container", ix))
.child(div().py_3().px_4().child(message_content))
.when(
!tool_uses.is_empty() || !scripting_tool_uses.is_empty(),
|parent| {
parent.child(
v_flex()
.children(
tool_uses
.into_iter()
.map(|tool_use| self.render_tool_use(tool_use, cx)),
)
.children(scripting_tool_uses.into_iter().map(|tool_use| {
self.render_scripting_tool_use(tool_use, cx)
})),
)
},
)
}
Role::Assistant => v_flex()
.id(("message-container", ix))
.child(div().py_3().px_4().child(message_content))
.when(
!tool_uses.is_empty() || !scripting_tool_uses.is_empty(),
|parent| {
parent.child(
v_flex()
.children(
tool_uses
.into_iter()
.map(|tool_use| self.render_tool_use(tool_use, cx)),
)
.children(scripting_tool_uses.into_iter().map(|tool_use| {
self.render_scripting_tool_use(tool_use, window, cx)
})),
)
},
),
Role::System => div().id(("message-container", ix)).py_1().px_2().child(
v_flex()
.bg(colors.editor_background)
@@ -736,21 +789,64 @@ impl ActiveThread {
};
v_flex()
.when(ix == 0, |parent| parent.child(self.render_rules_item(cx)))
.when_some(checkpoint, |parent, checkpoint| {
parent.child(
h_flex().pl_2().child(
Button::new("restore-checkpoint", "Restore Checkpoint")
.icon(IconName::Undo)
.size(ButtonSize::Compact)
.on_click(cx.listener(move |this, _, _window, cx| {
this.thread.update(cx, |thread, cx| {
thread
.restore_checkpoint(checkpoint.clone(), cx)
.detach_and_log_err(cx);
});
})),
),
)
let mut is_pending = false;
let mut error = None;
if let Some(last_restore_checkpoint) =
self.thread.read(cx).last_restore_checkpoint()
{
if last_restore_checkpoint.message_id() == message_id {
match last_restore_checkpoint {
LastRestoreCheckpoint::Pending { .. } => is_pending = true,
LastRestoreCheckpoint::Error { error: err, .. } => {
error = Some(err.clone());
}
}
}
}
let restore_checkpoint_button =
Button::new(("restore-checkpoint", ix), "Restore Checkpoint")
.icon(if error.is_some() {
IconName::XCircle
} else {
IconName::Undo
})
.size(ButtonSize::Compact)
.disabled(is_pending)
.icon_color(if error.is_some() {
Some(Color::Error)
} else {
None
})
.on_click(cx.listener(move |this, _, _window, cx| {
this.thread.update(cx, |thread, cx| {
thread
.restore_checkpoint(checkpoint.clone(), cx)
.detach_and_log_err(cx);
});
}));
let restore_checkpoint_button = if is_pending {
restore_checkpoint_button
.with_animation(
("pulsating-restore-checkpoint-button", ix),
Animation::new(Duration::from_secs(2))
.repeat()
.with_easing(pulsating_between(0.6, 1.)),
|label, delta| label.alpha(delta),
)
.into_any_element()
} else if let Some(error) = error {
restore_checkpoint_button
.tooltip(Tooltip::text(error.to_string()))
.into_any_element()
} else {
restore_checkpoint_button.into_any_element()
};
parent.child(h_flex().pl_2().child(restore_checkpoint_button))
})
.child(styled_message)
.into_any()
@@ -801,11 +897,10 @@ impl ActiveThread {
}
}),
))
.child(
Label::new(tool_use.name)
.size(LabelSize::Small)
.buffer_font(cx),
),
.child(div().text_ui_sm(cx).children(
self.rendered_tool_use_labels.get(&tool_use.id).cloned(),
))
.truncate(),
)
.child({
let (icon_name, color, animated) = match &tool_use.status {
@@ -933,6 +1028,7 @@ impl ActiveThread {
fn render_scripting_tool_use(
&self,
tool_use: ToolUse,
window: &Window,
cx: &mut Context<Self>,
) -> impl IntoElement {
let is_open = self
@@ -978,7 +1074,12 @@ impl ActiveThread {
}
}),
))
.child(Label::new(tool_use.name)),
.child(div().text_ui_sm(cx).child(self.render_markdown(
tool_use.ui_text.clone(),
window,
cx,
)))
.truncate(),
)
.child(
Label::new(match tool_use.status {
@@ -1042,6 +1143,86 @@ impl ActiveThread {
}),
)
}
fn render_rules_item(&self, cx: &Context<Self>) -> AnyElement {
let Some(system_prompt_context) = self.thread.read(cx).system_prompt_context().as_ref()
else {
return div().into_any();
};
let rules_files = system_prompt_context
.worktrees
.iter()
.filter_map(|worktree| worktree.rules_file.as_ref())
.collect::<Vec<_>>();
let label_text = match rules_files.as_slice() {
&[] => return div().into_any(),
&[rules_file] => {
format!("Using {:?} file", rules_file.rel_path)
}
rules_files => {
format!("Using {} rules files", rules_files.len())
}
};
div()
.pt_1()
.px_2p5()
.child(
h_flex()
.group("rules-item")
.w_full()
.gap_2()
.justify_between()
.child(
h_flex()
.gap_1p5()
.child(
Icon::new(IconName::File)
.size(IconSize::XSmall)
.color(Color::Disabled),
)
.child(
Label::new(label_text)
.size(LabelSize::XSmall)
.color(Color::Muted)
.buffer_font(cx),
),
)
.child(
div().visible_on_hover("rules-item").child(
Button::new("open-rules", "Open Rules")
.label_size(LabelSize::XSmall)
.on_click(cx.listener(Self::handle_open_rules)),
),
),
)
.into_any()
}
fn handle_open_rules(&mut self, _: &ClickEvent, window: &mut Window, cx: &mut Context<Self>) {
let Some(system_prompt_context) = self.thread.read(cx).system_prompt_context().as_ref()
else {
return;
};
let abs_paths = system_prompt_context
.worktrees
.iter()
.flat_map(|worktree| worktree.rules_file.as_ref())
.map(|rules_file| rules_file.abs_path.to_path_buf())
.collect::<Vec<_>>();
if let Ok(task) = self.workspace.update(cx, move |workspace, cx| {
// TODO: Open a multibuffer instead? In some cases this doesn't make the set of rules
// files clear. For example, if rules file 1 is already open but rules file 2 is not,
// this would open and focus rules file 2 in a tab that is not next to rules file 1.
workspace.open_paths(abs_paths, OpenOptions::default(), None, window, cx)
}) {
task.detach();
}
}
}
impl Render for ActiveThread {


@@ -1,5 +1,4 @@
mod active_thread;
mod agent_profile;
mod assistant_configuration;
mod assistant_model_selector;
mod assistant_panel;


@@ -195,6 +195,7 @@ impl AssistantConfiguration {
let tool_count = tools.len();
v_flex()
.id(SharedString::from(context_server.id()))
.border_1()
.rounded_sm()
.border_color(cx.theme().colors().border)


@@ -174,6 +174,7 @@ impl AssistantPanel {
thread_store.clone(),
language_registry.clone(),
message_editor_context_store.clone(),
workspace.clone(),
window,
cx,
)
@@ -252,6 +253,7 @@ impl AssistantPanel {
self.thread_store.clone(),
self.language_registry.clone(),
message_editor_context_store.clone(),
self.workspace.clone(),
window,
cx,
)
@@ -389,6 +391,7 @@ impl AssistantPanel {
this.thread_store.clone(),
this.language_registry.clone(),
message_editor_context_store.clone(),
this.workspace.clone(),
window,
cx,
)
@@ -455,7 +458,7 @@ impl AssistantPanel {
workspace.update_in(cx, |workspace, window, cx| {
let thread = thread.read(cx);
let markdown = thread.to_markdown()?;
let markdown = thread.to_markdown(cx)?;
let thread_summary = thread
.summary()
.map(|summary| summary.to_string())
@@ -922,8 +925,8 @@ impl AssistantPanel {
ThreadError::MaxMonthlySpendReached => {
self.render_max_monthly_spend_reached_error(cx)
}
ThreadError::Message(error_message) => {
self.render_error_message(&error_message, cx)
ThreadError::Message { header, message } => {
self.render_error_message(header, message, cx)
}
})
.into_any(),
@@ -1026,7 +1029,8 @@ impl AssistantPanel {
fn render_error_message(
&self,
error_message: &SharedString,
header: SharedString,
message: SharedString,
cx: &mut Context<Self>,
) -> AnyElement {
v_flex()
@@ -1036,17 +1040,14 @@ impl AssistantPanel {
.gap_1p5()
.items_center()
.child(Icon::new(IconName::XCircle).color(Color::Error))
.child(
Label::new("Error interacting with language model")
.weight(FontWeight::MEDIUM),
),
.child(Label::new(header).weight(FontWeight::MEDIUM)),
)
.child(
div()
.id("error-message")
.max_h_32()
.overflow_y_scroll()
.child(Label::new(error_message.clone())),
.child(Label::new(message)),
)
.child(
h_flex()


@@ -33,7 +33,7 @@ use crate::context_strip::{ContextStrip, ContextStripEvent, SuggestContextKind};
use crate::thread::{RequestKind, Thread};
use crate::thread_store::ThreadStore;
use crate::tool_selector::ToolSelector;
use crate::{Chat, ChatMode, RemoveAllContext, ToggleContextPicker};
use crate::{Chat, ChatMode, RemoveAllContext, ThreadEvent, ToggleContextPicker};
pub struct MessageEditor {
thread: Entity<Thread>,
@@ -206,12 +206,23 @@ impl MessageEditor {
let refresh_task =
refresh_context_store_text(self.context_store.clone(), &HashSet::default(), cx);
let system_prompt_context_task = self.thread.read(cx).load_system_prompt_context(cx);
let thread = self.thread.clone();
let context_store = self.context_store.clone();
let git_store = self.project.read(cx).git_store();
let checkpoint = git_store.read(cx).checkpoint(cx);
cx.spawn(async move |_, cx| {
refresh_task.await;
let (system_prompt_context, load_error) = system_prompt_context_task.await;
thread
.update(cx, |thread, cx| {
thread.set_system_prompt_context(system_prompt_context);
if let Some(load_error) = load_error {
cx.emit(ThreadEvent::ShowError(load_error));
}
})
.ok();
let checkpoint = checkpoint.await.log_err();
thread
.update(cx, |thread, cx| {


@@ -6,6 +6,7 @@ use anyhow::{Context as _, Result};
use assistant_tool::{ActionLog, ToolWorkingSet};
use chrono::{DateTime, Utc};
use collections::{BTreeMap, HashMap, HashSet};
use fs::Fs;
use futures::future::Shared;
use futures::{FutureExt, StreamExt as _};
use git;
@@ -16,12 +17,14 @@ use language_model::{
LanguageModelToolUseId, MaxMonthlySpendReachedError, MessageContent, PaymentRequiredError,
Role, StopReason, TokenUsage,
};
use project::git::GitStoreCheckpoint;
use project::Project;
use prompt_store::{AssistantSystemPromptWorktree, PromptBuilder};
use project::git_store::{GitStore, GitStoreCheckpoint};
use project::{Project, Worktree};
use prompt_store::{
AssistantSystemPromptContext, PromptBuilder, RulesFile, WorktreeInfoForSystemPrompt,
};
use scripting_tool::{ScriptingSession, ScriptingTool};
use serde::{Deserialize, Serialize};
use util::{post_inc, ResultExt, TryFutureExt as _};
use util::{maybe, post_inc, ResultExt as _, TryFutureExt as _};
use uuid::Uuid;
use crate::context::{attach_context_to_message, ContextId, ContextSnapshot};
@@ -96,6 +99,25 @@ pub struct ThreadCheckpoint {
git_checkpoint: GitStoreCheckpoint,
}
pub enum LastRestoreCheckpoint {
Pending {
message_id: MessageId,
},
Error {
message_id: MessageId,
error: String,
},
}
impl LastRestoreCheckpoint {
pub fn message_id(&self) -> MessageId {
match self {
LastRestoreCheckpoint::Pending { message_id } => *message_id,
LastRestoreCheckpoint::Error { message_id, .. } => *message_id,
}
}
}
/// A thread of conversation with the LLM.
pub struct Thread {
id: ThreadId,
@@ -106,6 +128,7 @@ pub struct Thread {
next_message_id: MessageId,
context: BTreeMap<ContextId, ContextSnapshot>,
context_by_message: HashMap<MessageId, Vec<ContextId>>,
system_prompt_context: Option<AssistantSystemPromptContext>,
checkpoints_by_message: HashMap<MessageId, GitStoreCheckpoint>,
completion_count: usize,
pending_completions: Vec<PendingCompletion>,
@@ -114,6 +137,7 @@ pub struct Thread {
tools: Arc<ToolWorkingSet>,
tool_use: ToolUseState,
action_log: Entity<ActionLog>,
last_restore_checkpoint: Option<LastRestoreCheckpoint>,
scripting_session: Entity<ScriptingSession>,
scripting_tool_use: ToolUseState,
initial_project_snapshot: Shared<Task<Option<Arc<ProjectSnapshot>>>>,
@@ -136,15 +160,17 @@ impl Thread {
next_message_id: MessageId(0),
context: BTreeMap::default(),
context_by_message: HashMap::default(),
system_prompt_context: None,
checkpoints_by_message: HashMap::default(),
completion_count: 0,
pending_completions: Vec::new(),
project: project.clone(),
prompt_builder,
tools,
tool_use: ToolUseState::new(),
tools: tools.clone(),
last_restore_checkpoint: None,
tool_use: ToolUseState::new(tools.clone()),
scripting_session: cx.new(|cx| ScriptingSession::new(project.clone(), cx)),
scripting_tool_use: ToolUseState::new(),
scripting_tool_use: ToolUseState::new(tools),
action_log: cx.new(|_| ActionLog::new()),
initial_project_snapshot: {
let project_snapshot = Self::project_snapshot(project, cx);
@@ -171,11 +197,12 @@ impl Thread {
.map(|message| message.id.0 + 1)
.unwrap_or(0),
);
let tool_use = ToolUseState::from_serialized_messages(&serialized.messages, |name| {
name != ScriptingTool::NAME
});
let tool_use =
ToolUseState::from_serialized_messages(tools.clone(), &serialized.messages, |name| {
name != ScriptingTool::NAME
});
let scripting_tool_use =
ToolUseState::from_serialized_messages(&serialized.messages, |name| {
ToolUseState::from_serialized_messages(tools.clone(), &serialized.messages, |name| {
name == ScriptingTool::NAME
});
let scripting_session = cx.new(|cx| ScriptingSession::new(project.clone(), cx));
@@ -197,9 +224,11 @@ impl Thread {
next_message_id,
context: BTreeMap::default(),
context_by_message: HashMap::default(),
system_prompt_context: None,
checkpoints_by_message: HashMap::default(),
completion_count: 0,
pending_completions: Vec::new(),
last_restore_checkpoint: None,
project,
prompt_builder,
tools,
@@ -272,17 +301,38 @@ impl Thread {
checkpoint: ThreadCheckpoint,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
self.last_restore_checkpoint = Some(LastRestoreCheckpoint::Pending {
message_id: checkpoint.message_id,
});
cx.emit(ThreadEvent::CheckpointChanged);
let project = self.project.read(cx);
let restore = project
.git_store()
.read(cx)
.restore_checkpoint(checkpoint.git_checkpoint, cx);
cx.spawn(async move |this, cx| {
restore.await?;
this.update(cx, |this, cx| this.truncate(checkpoint.message_id, cx))
let result = restore.await;
this.update(cx, |this, cx| {
if let Err(err) = result.as_ref() {
this.last_restore_checkpoint = Some(LastRestoreCheckpoint::Error {
message_id: checkpoint.message_id,
error: err.to_string(),
});
} else {
this.last_restore_checkpoint = None;
this.truncate(checkpoint.message_id, cx);
}
cx.emit(ThreadEvent::CheckpointChanged);
})?;
result
})
}
pub fn last_restore_checkpoint(&self) -> Option<&LastRestoreCheckpoint> {
self.last_restore_checkpoint.as_ref()
}
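The active-thread view reads this accessor to decide whether a message's "Restore Checkpoint" button is disabled (pending) or carries an error tooltip. A self-contained sketch of that derivation, redeclaring simplified versions of the types (not the real gpui rendering code):

```rust
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct MessageId(usize);

enum LastRestoreCheckpoint {
    Pending { message_id: MessageId },
    Error { message_id: MessageId, error: String },
}

/// Returns (is_pending, error) for the restore button on `message_id`,
/// mirroring the match in the message-rendering code.
fn restore_button_state(
    last: Option<&LastRestoreCheckpoint>,
    message_id: MessageId,
) -> (bool, Option<String>) {
    match last {
        Some(LastRestoreCheckpoint::Pending { message_id: id }) if *id == message_id => {
            (true, None)
        }
        Some(LastRestoreCheckpoint::Error { message_id: id, error }) if *id == message_id => {
            (false, Some(error.clone()))
        }
        // The state only affects the message it points at.
        _ => (false, None),
    }
}
```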
pub fn truncate(&mut self, message_id: MessageId, cx: &mut Context<Self>) {
let Some(message_ix) = self
.messages
@@ -322,12 +372,12 @@ impl Thread {
all_pending_tool_uses.all(|tool_use| tool_use.status.is_error())
}
pub fn tool_uses_for_message(&self, id: MessageId) -> Vec<ToolUse> {
self.tool_use.tool_uses_for_message(id)
pub fn tool_uses_for_message(&self, id: MessageId, cx: &App) -> Vec<ToolUse> {
self.tool_use.tool_uses_for_message(id, cx)
}
pub fn scripting_tool_uses_for_message(&self, id: MessageId) -> Vec<ToolUse> {
self.scripting_tool_use.tool_uses_for_message(id)
pub fn scripting_tool_uses_for_message(&self, id: MessageId, cx: &App) -> Vec<ToolUse> {
self.scripting_tool_use.tool_uses_for_message(id, cx)
}
pub fn tool_results_for_message(&self, id: MessageId) -> Vec<&LanguageModelToolResult> {
@@ -442,7 +492,7 @@ impl Thread {
let initial_project_snapshot = self.initial_project_snapshot.clone();
cx.spawn(async move |this, cx| {
let initial_project_snapshot = initial_project_snapshot.await;
this.read_with(cx, |this, _| SerializedThread {
this.read_with(cx, |this, cx| SerializedThread {
summary: this.summary_or_default(),
updated_at: this.updated_at(),
messages: this
@@ -452,9 +502,9 @@ impl Thread {
role: message.role,
text: message.text.clone(),
tool_uses: this
.tool_uses_for_message(message.id)
.tool_uses_for_message(message.id, cx)
.into_iter()
.chain(this.scripting_tool_uses_for_message(message.id))
.chain(this.scripting_tool_uses_for_message(message.id, cx))
.map(|tool_use| SerializedToolUse {
id: tool_use.id,
name: tool_use.name,
@@ -478,6 +528,116 @@ impl Thread {
})
}
pub fn set_system_prompt_context(&mut self, context: AssistantSystemPromptContext) {
self.system_prompt_context = Some(context);
}
pub fn system_prompt_context(&self) -> &Option<AssistantSystemPromptContext> {
&self.system_prompt_context
}
pub fn load_system_prompt_context(
&self,
cx: &App,
) -> Task<(AssistantSystemPromptContext, Option<ThreadError>)> {
let project = self.project.read(cx);
let tasks = project
.visible_worktrees(cx)
.map(|worktree| {
Self::load_worktree_info_for_system_prompt(
project.fs().clone(),
worktree.read(cx),
cx,
)
})
.collect::<Vec<_>>();
cx.spawn(async |_cx| {
let results = futures::future::join_all(tasks).await;
let mut first_err = None;
let worktrees = results
.into_iter()
.map(|(worktree, err)| {
if first_err.is_none() && err.is_some() {
first_err = err;
}
worktree
})
.collect::<Vec<_>>();
(AssistantSystemPromptContext::new(worktrees), first_err)
})
}
fn load_worktree_info_for_system_prompt(
fs: Arc<dyn Fs>,
worktree: &Worktree,
cx: &App,
) -> Task<(WorktreeInfoForSystemPrompt, Option<ThreadError>)> {
let root_name = worktree.root_name().into();
let abs_path = worktree.abs_path();
// Note that Cline supports `.clinerules` being a directory, but directories are
// not currently supported here. This doesn't seem to occur often in GitHub repositories.
const RULES_FILE_NAMES: [&'static str; 5] = [
".rules",
".cursorrules",
".windsurfrules",
".clinerules",
"CLAUDE.md",
];
let selected_rules_file = RULES_FILE_NAMES
.into_iter()
.filter_map(|name| {
worktree
.entry_for_path(name)
.filter(|entry| entry.is_file())
.map(|entry| (entry.path.clone(), worktree.absolutize(&entry.path)))
})
.next();
if let Some((rel_rules_path, abs_rules_path)) = selected_rules_file {
cx.spawn(async move |_| {
let rules_file_result = maybe!(async move {
let abs_rules_path = abs_rules_path?;
let text = fs.load(&abs_rules_path).await.with_context(|| {
format!("Failed to load assistant rules file {:?}", abs_rules_path)
})?;
anyhow::Ok(RulesFile {
rel_path: rel_rules_path,
abs_path: abs_rules_path.into(),
text: text.trim().to_string(),
})
})
.await;
let (rules_file, rules_file_error) = match rules_file_result {
Ok(rules_file) => (Some(rules_file), None),
Err(err) => (
None,
Some(ThreadError::Message {
header: "Error loading rules file".into(),
message: format!("{err}").into(),
}),
),
};
let worktree_info = WorktreeInfoForSystemPrompt {
root_name,
abs_path,
rules_file,
};
(worktree_info, rules_file_error)
})
} else {
Task::ready((
WorktreeInfoForSystemPrompt {
root_name,
abs_path,
rules_file: None,
},
None,
))
}
}
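The rules-file lookup above is priority-ordered: the first name in `RULES_FILE_NAMES` that exists as a file in the worktree root wins. A stdlib-only sketch of that selection, with `existing` standing in for the worktree's file entries (the real code walks worktree entries and loads the file asynchronously):

```rust
const RULES_FILE_NAMES: [&str; 5] = [
    ".rules",
    ".cursorrules",
    ".windsurfrules",
    ".clinerules",
    "CLAUDE.md",
];

/// Pick the highest-priority rules file present in a worktree root.
fn select_rules_file(existing: &[&str]) -> Option<&'static str> {
    RULES_FILE_NAMES
        .into_iter()
        .find(|name| existing.contains(name))
}
```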
pub fn send_to_model(
&mut self,
model: Arc<dyn LanguageModel>,
@@ -515,36 +675,30 @@ impl Thread {
request_kind: RequestKind,
cx: &App,
) -> LanguageModelRequest {
let worktree_root_names = self
.project
.read(cx)
.visible_worktrees(cx)
.map(|worktree| {
let worktree = worktree.read(cx);
AssistantSystemPromptWorktree {
root_name: worktree.root_name().into(),
abs_path: worktree.abs_path(),
}
})
.collect::<Vec<_>>();
let system_prompt = self
.prompt_builder
.generate_assistant_system_prompt(worktree_root_names)
.context("failed to generate assistant system prompt")
.log_err()
.unwrap_or_default();
let mut request = LanguageModelRequest {
messages: vec![LanguageModelRequestMessage {
role: Role::System,
content: vec![MessageContent::Text(system_prompt)],
cache: true,
}],
messages: vec![],
tools: Vec::new(),
stop: Vec::new(),
temperature: None,
};
if let Some(system_prompt_context) = self.system_prompt_context.as_ref() {
if let Some(system_prompt) = self
.prompt_builder
.generate_assistant_system_prompt(system_prompt_context)
.context("failed to generate assistant system prompt")
.log_err()
{
request.messages.push(LanguageModelRequestMessage {
role: Role::System,
content: vec![MessageContent::Text(system_prompt)],
cache: true,
});
}
} else {
log::error!("system_prompt_context not set.")
}
let mut referenced_context_ids = HashSet::default();
for message in &self.messages {
@@ -699,13 +853,17 @@ impl Thread {
.rfind(|message| message.role == Role::Assistant)
{
if tool_use.name.as_ref() == ScriptingTool::NAME {
thread
.scripting_tool_use
.request_tool_use(last_assistant_message.id, tool_use);
thread.scripting_tool_use.request_tool_use(
last_assistant_message.id,
tool_use,
cx,
);
} else {
thread
.tool_use
.request_tool_use(last_assistant_message.id, tool_use);
thread.tool_use.request_tool_use(
last_assistant_message.id,
tool_use,
cx,
);
}
}
}
@@ -757,9 +915,10 @@ impl Thread {
.map(|err| err.to_string())
.collect::<Vec<_>>()
.join("\n");
cx.emit(ThreadEvent::ShowError(ThreadError::Message(
SharedString::from(error_message.clone()),
)));
cx.emit(ThreadEvent::ShowError(ThreadError::Message {
header: "Error interacting with language model".into(),
message: SharedString::from(error_message.clone()),
}));
}
thread.cancel_last_completion(cx);
@@ -845,7 +1004,10 @@ impl Thread {
});
}
pub fn use_pending_tools(&mut self, cx: &mut Context<Self>) {
pub fn use_pending_tools(
&mut self,
cx: &mut Context<Self>,
) -> impl IntoIterator<Item = PendingToolUse> {
let request = self.to_completion_request(RequestKind::Chat, cx);
let pending_tool_uses = self
.tool_use
@@ -855,17 +1017,22 @@ impl Thread {
.cloned()
.collect::<Vec<_>>();
for tool_use in pending_tool_uses {
for tool_use in pending_tool_uses.iter() {
if let Some(tool) = self.tools.tool(&tool_use.name, cx) {
let task = tool.run(
tool_use.input,
tool_use.input.clone(),
&request.messages,
self.project.clone(),
self.action_log.clone(),
cx,
);
self.insert_tool_output(tool_use.id.clone(), task, cx);
self.insert_tool_output(
tool_use.id.clone(),
tool_use.ui_text.clone().into(),
task,
cx,
);
}
}
@@ -877,8 +1044,8 @@ impl Thread {
.cloned()
.collect::<Vec<_>>();
for scripting_tool_use in pending_scripting_tool_uses {
let task = match ScriptingTool::deserialize_input(scripting_tool_use.input) {
for scripting_tool_use in pending_scripting_tool_uses.iter() {
let task = match ScriptingTool::deserialize_input(scripting_tool_use.input.clone()) {
Err(err) => Task::ready(Err(err.into())),
Ok(input) => {
let (script_id, script_task) =
@@ -905,13 +1072,20 @@ impl Thread {
}
};
self.insert_scripting_tool_output(scripting_tool_use.id.clone(), task, cx);
let ui_text: SharedString = scripting_tool_use.name.clone().into();
self.insert_scripting_tool_output(scripting_tool_use.id.clone(), ui_text, task, cx);
}
pending_tool_uses
.into_iter()
.chain(pending_scripting_tool_uses)
}
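The diff changes `use_pending_tools` from fire-and-forget to returning the tool uses it started, so the caller can render a markdown label per tool use without re-querying the thread. A simplified, stdlib-only sketch of that pattern (hypothetical `PendingToolUse`/`Thread` shapes, not the real entities):

```rust
#[derive(Clone, Debug, PartialEq)]
struct PendingToolUse {
    id: u64,
    ui_text: String,
}

struct Thread {
    pending: Vec<PendingToolUse>,
}

impl Thread {
    /// Start every pending tool, then hand the started uses back to the
    /// caller so it can render per-tool UI for them.
    fn use_pending_tools(&mut self) -> Vec<PendingToolUse> {
        let started = std::mem::take(&mut self.pending);
        // (The real code spawns a task per tool here before returning.)
        started
    }
}
```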
pub fn insert_tool_output(
&mut self,
tool_use_id: LanguageModelToolUseId,
ui_text: SharedString,
output: Task<Result<String>>,
cx: &mut Context<Self>,
) {
@@ -936,12 +1110,13 @@ impl Thread {
});
self.tool_use
.run_pending_tool(tool_use_id, insert_output_task);
.run_pending_tool(tool_use_id, ui_text, insert_output_task);
}
pub fn insert_scripting_tool_output(
&mut self,
tool_use_id: LanguageModelToolUseId,
ui_text: SharedString,
output: Task<Result<String>>,
cx: &mut Context<Self>,
) {
@@ -966,7 +1141,7 @@ impl Thread {
});
self.scripting_tool_use
.run_pending_tool(tool_use_id, insert_output_task);
.run_pending_tool(tool_use_id, ui_text, insert_output_task);
}
pub fn attach_tool_results(
@@ -1044,10 +1219,11 @@ impl Thread {
project: Entity<Project>,
cx: &mut Context<Self>,
) -> Task<Arc<ProjectSnapshot>> {
let git_store = project.read(cx).git_store().clone();
let worktree_snapshots: Vec<_> = project
.read(cx)
.visible_worktrees(cx)
.map(|worktree| Self::worktree_snapshot(worktree, cx))
.map(|worktree| Self::worktree_snapshot(worktree, git_store.clone(), cx))
.collect();
cx.spawn(async move |_, cx| {
@@ -1076,7 +1252,11 @@ impl Thread {
})
}
fn worktree_snapshot(worktree: Entity<project::Worktree>, cx: &App) -> Task<WorktreeSnapshot> {
fn worktree_snapshot(
worktree: Entity<project::Worktree>,
git_store: Entity<GitStore>,
cx: &App,
) -> Task<WorktreeSnapshot> {
cx.spawn(async move |cx| {
// Get worktree path and snapshot
let worktree_info = cx.update(|app_cx| {
@@ -1093,42 +1273,40 @@ impl Thread {
};
};
let repo_info = git_store
.update(cx, |git_store, cx| {
git_store
.repositories()
.values()
.find(|repo| repo.read(cx).worktree_id == snapshot.id())
.and_then(|repo| {
let repo = repo.read(cx);
Some((repo.branch().cloned(), repo.local_repository()?))
})
})
.ok()
.flatten();
// Extract git information
let git_state = match snapshot.repositories().first() {
let git_state = match repo_info {
None => None,
Some(repo_entry) => {
// Get branch information
let current_branch = repo_entry.branch().map(|branch| branch.name.to_string());
Some((branch, repo)) => {
let current_branch = branch.map(|branch| branch.name.to_string());
let remote_url = repo.remote_url("origin");
let head_sha = repo.head_sha();
// Get repository info
let repo_result = worktree.read_with(cx, |worktree, _cx| {
if let project::Worktree::Local(local_worktree) = &worktree {
local_worktree.get_local_repo(repo_entry).map(|local_repo| {
let repo = local_repo.repo();
(repo.remote_url("origin"), repo.head_sha(), repo.clone())
})
} else {
None
}
});
// Get diff asynchronously
let diff = repo
.diff(git::repository::DiffType::HeadToWorktree, cx.clone())
.await
.ok();
match repo_result {
Ok(Some((remote_url, head_sha, repository))) => {
// Get diff asynchronously
let diff = repository
.diff(git::repository::DiffType::HeadToWorktree, cx.clone())
.await
.ok();
Some(GitState {
remote_url,
head_sha,
current_branch,
diff,
})
}
Err(_) | Ok(None) => None,
}
Some(GitState {
remote_url,
head_sha,
current_branch,
diff,
})
}
};
@@ -1139,7 +1317,7 @@ impl Thread {
})
}
pub fn to_markdown(&self) -> Result<String> {
pub fn to_markdown(&self, cx: &App) -> Result<String> {
let mut markdown = Vec::new();
if let Some(summary) = self.summary() {
@@ -1158,7 +1336,7 @@ impl Thread {
)?;
writeln!(markdown, "{}\n", message.text)?;
for tool_use in self.tool_uses_for_message(message.id) {
for tool_use in self.tool_uses_for_message(message.id, cx) {
writeln!(
markdown,
"**Use Tool: {} ({})**",
@@ -1204,7 +1382,10 @@ impl Thread {
pub enum ThreadError {
PaymentRequired,
MaxMonthlySpendReached,
Message(SharedString),
Message {
header: SharedString,
message: SharedString,
},
}
#[derive(Debug, Clone)]
@@ -1226,6 +1407,7 @@ pub enum ThreadEvent {
/// Whether the tool was canceled by the user.
canceled: bool,
},
CheckpointChanged,
}
impl EventEmitter<ThreadEvent> for Thread {}

View File

@@ -20,7 +20,7 @@ use prompt_store::PromptBuilder;
use serde::{Deserialize, Serialize};
use util::ResultExt as _;
use crate::thread::{MessageId, ProjectSnapshot, Thread, ThreadId};
use crate::thread::{MessageId, ProjectSnapshot, Thread, ThreadEvent, ThreadId};
pub fn init(cx: &mut App) {
ThreadsDatabase::init(cx);
@@ -113,7 +113,7 @@ impl ThreadStore {
.await?
.ok_or_else(|| anyhow!("no thread found with ID: {id:?}"))?;
this.update(cx, |this, cx| {
let thread = this.update(cx, |this, cx| {
cx.new(|cx| {
Thread::deserialize(
id.clone(),
@@ -124,7 +124,19 @@ impl ThreadStore {
cx,
)
})
})
})?;
let (system_prompt_context, load_error) = thread
.update(cx, |thread, cx| thread.load_system_prompt_context(cx))?
.await;
thread.update(cx, |thread, cx| {
thread.set_system_prompt_context(system_prompt_context);
if let Some(load_error) = load_error {
cx.emit(ThreadEvent::ShowError(load_error));
}
})?;
Ok(thread)
})
}

View File

@@ -1,23 +1,50 @@
use std::sync::Arc;
use assistant_settings::{AgentProfile, AssistantSettings};
use assistant_tool::{ToolSource, ToolWorkingSet};
use gpui::Entity;
use collections::HashMap;
use gpui::{Entity, Subscription};
use scripting_tool::ScriptingTool;
use settings::{Settings as _, SettingsStore};
use ui::{prelude::*, ContextMenu, PopoverMenu, Tooltip};
use crate::agent_profile::AgentProfile;
pub struct ToolSelector {
profiles: Vec<AgentProfile>,
profiles: HashMap<Arc<str>, AgentProfile>,
tools: Arc<ToolWorkingSet>,
_subscriptions: Vec<Subscription>,
}
impl ToolSelector {
pub fn new(tools: Arc<ToolWorkingSet>, _cx: &mut Context<Self>) -> Self {
Self {
profiles: vec![AgentProfile::read_only(), AgentProfile::code_writer()],
pub fn new(tools: Arc<ToolWorkingSet>, cx: &mut Context<Self>) -> Self {
let settings_subscription = cx.observe_global::<SettingsStore>(move |this, cx| {
this.refresh_profiles(cx);
});
let mut this = Self {
profiles: HashMap::default(),
tools,
_subscriptions: vec![settings_subscription],
};
this.refresh_profiles(cx);
this
}
fn refresh_profiles(&mut self, cx: &mut Context<Self>) {
let settings = AssistantSettings::get_global(cx);
let mut profiles = settings.profiles.clone();
let read_only = AgentProfile::read_only();
if !profiles.contains_key(read_only.name.as_ref()) {
profiles.insert(read_only.name.clone().into(), read_only);
}
let code_writer = AgentProfile::code_writer();
if !profiles.contains_key(code_writer.name.as_ref()) {
profiles.insert(code_writer.name.clone().into(), code_writer);
}
self.profiles = profiles;
}
fn build_context_menu(
@@ -31,7 +58,7 @@ impl ToolSelector {
let icon_position = IconPosition::End;
menu = menu.header("Profiles");
for profile in profiles.clone() {
for (_id, profile) in profiles.clone() {
menu = menu.toggleable_entry(profile.name.clone(), false, icon_position, None, {
let tools = tool_set.clone();
move |_window, cx| {
@@ -44,6 +71,10 @@ impl ToolSelector {
.filter_map(|(tool, enabled)| enabled.then(|| tool.clone()))
.collect::<Vec<_>>(),
);
if profile.tools.contains_key(ScriptingTool::NAME) {
tools.enable_scripting_tool();
}
}
});
}
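The `refresh_profiles` change above merges user-configured profiles with the built-in `read_only` and `code_writer` defaults, inserting a default only when the user has no profile under that name. A minimal standalone sketch of that merge policy, with simplified `String` keys and values standing in for the real `Arc<str>`/`AgentProfile` types (those names are assumptions here):

```rust
use std::collections::HashMap;

// Sketch of the "user profiles win, built-ins fill gaps" merge: a default is
// only inserted when no user-configured profile already uses the same key.
fn merge_defaults(
    mut profiles: HashMap<String, String>,
    defaults: &[(&str, &str)],
) -> HashMap<String, String> {
    for (key, value) in defaults {
        profiles
            .entry(key.to_string())
            .or_insert_with(|| value.to_string());
    }
    profiles
}

fn main() {
    let mut user = HashMap::new();
    user.insert("read-only".to_string(), "custom read-only".to_string());

    let merged = merge_defaults(user, &[("read-only", "builtin"), ("code-writer", "builtin")]);

    // The user's override survives; the missing built-in is filled in.
    assert_eq!(merged["read-only"], "custom read-only");
    assert_eq!(merged["code-writer"], "builtin");
}
```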

View File

@@ -1,10 +1,11 @@
use std::sync::Arc;
use anyhow::Result;
use assistant_tool::ToolWorkingSet;
use collections::HashMap;
use futures::future::Shared;
use futures::FutureExt as _;
use gpui::{SharedString, Task};
use gpui::{App, SharedString, Task};
use language_model::{
LanguageModelRequestMessage, LanguageModelToolResult, LanguageModelToolUse,
LanguageModelToolUseId, MessageContent, Role,
@@ -17,6 +18,7 @@ use crate::thread_store::SerializedMessage;
pub struct ToolUse {
pub id: LanguageModelToolUseId,
pub name: SharedString,
pub ui_text: SharedString,
pub status: ToolUseStatus,
pub input: serde_json::Value,
}
@@ -30,6 +32,7 @@ pub enum ToolUseStatus {
}
pub struct ToolUseState {
tools: Arc<ToolWorkingSet>,
tool_uses_by_assistant_message: HashMap<MessageId, Vec<LanguageModelToolUse>>,
tool_uses_by_user_message: HashMap<MessageId, Vec<LanguageModelToolUseId>>,
tool_results: HashMap<LanguageModelToolUseId, LanguageModelToolResult>,
@@ -37,8 +40,9 @@ pub struct ToolUseState {
}
impl ToolUseState {
pub fn new() -> Self {
pub fn new(tools: Arc<ToolWorkingSet>) -> Self {
Self {
tools,
tool_uses_by_assistant_message: HashMap::default(),
tool_uses_by_user_message: HashMap::default(),
tool_results: HashMap::default(),
@@ -50,10 +54,11 @@ impl ToolUseState {
///
/// Accepts a function to filter the tools that should be used to populate the state.
pub fn from_serialized_messages(
tools: Arc<ToolWorkingSet>,
messages: &[SerializedMessage],
mut filter_by_tool_name: impl FnMut(&str) -> bool,
) -> Self {
let mut this = Self::new();
let mut this = Self::new(tools);
let mut tool_names_by_id = HashMap::default();
for message in messages {
@@ -138,7 +143,7 @@ impl ToolUseState {
self.pending_tool_uses_by_id.values().collect()
}
pub fn tool_uses_for_message(&self, id: MessageId) -> Vec<ToolUse> {
pub fn tool_uses_for_message(&self, id: MessageId, cx: &App) -> Vec<ToolUse> {
let Some(tool_uses_for_message) = &self.tool_uses_by_assistant_message.get(&id) else {
return Vec::new();
};
@@ -173,6 +178,7 @@ impl ToolUseState {
tool_uses.push(ToolUse {
id: tool_use.id.clone(),
name: tool_use.name.clone().into(),
ui_text: self.tool_ui_label(&tool_use.name, &tool_use.input, cx),
input: tool_use.input.clone(),
status,
})
@@ -181,6 +187,19 @@ impl ToolUseState {
tool_uses
}
pub fn tool_ui_label(
&self,
tool_name: &str,
input: &serde_json::Value,
cx: &App,
) -> SharedString {
if let Some(tool) = self.tools.tool(tool_name, cx) {
tool.ui_text(input).into()
} else {
"Unknown tool".into()
}
}
pub fn tool_results_for_message(&self, message_id: MessageId) -> Vec<&LanguageModelToolResult> {
let empty = Vec::new();
@@ -209,6 +228,7 @@ impl ToolUseState {
&mut self,
assistant_message_id: MessageId,
tool_use: LanguageModelToolUse,
cx: &App,
) {
self.tool_uses_by_assistant_message
.entry(assistant_message_id)
@@ -228,15 +248,24 @@ impl ToolUseState {
PendingToolUse {
assistant_message_id,
id: tool_use.id,
name: tool_use.name,
name: tool_use.name.clone(),
ui_text: self
.tool_ui_label(&tool_use.name, &tool_use.input, cx)
.into(),
input: tool_use.input,
status: PendingToolUseStatus::Idle,
},
);
}
pub fn run_pending_tool(&mut self, tool_use_id: LanguageModelToolUseId, task: Task<()>) {
pub fn run_pending_tool(
&mut self,
tool_use_id: LanguageModelToolUseId,
ui_text: SharedString,
task: Task<()>,
) {
if let Some(tool_use) = self.pending_tool_uses_by_id.get_mut(&tool_use_id) {
tool_use.ui_text = ui_text.into();
tool_use.status = PendingToolUseStatus::Running {
_task: task.shared(),
};
@@ -335,6 +364,7 @@ pub struct PendingToolUse {
#[allow(unused)]
pub assistant_message_id: MessageId,
pub name: Arc<str>,
pub ui_text: Arc<str>,
pub input: serde_json::Value,
pub status: PendingToolUseStatus,
}

View File

@@ -6,15 +6,6 @@ fn main() {
if cfg!(target_os = "macos") {
println!("cargo:rustc-env=MACOSX_DEPLOYMENT_TARGET=10.15.7");
println!("cargo:rerun-if-env-changed=ZED_BUNDLE");
if std::env::var("ZED_BUNDLE").ok().as_deref() == Some("true") {
// Find WebRTC.framework in the Frameworks folder when running as part of an application bundle.
println!("cargo:rustc-link-arg=-Wl,-rpath,@executable_path/../Frameworks");
} else {
// Find WebRTC.framework as a sibling of the executable when running outside of an application bundle.
println!("cargo:rustc-link-arg=-Wl,-rpath,@executable_path");
}
// Weakly link ReplayKit to ensure Zed can be used on macOS 10.15+.
println!("cargo:rustc-link-arg=-Wl,-weak_framework,ReplayKit");

View File

@@ -79,10 +79,25 @@ impl Eval {
let start_time = std::time::SystemTime::now();
let (system_prompt_context, load_error) = cx
.update(|cx| {
assistant
.read(cx)
.thread
.read(cx)
.load_system_prompt_context(cx)
})?
.await;
if let Some(load_error) = load_error {
return Err(anyhow!("{:?}", load_error));
};
assistant.update(cx, |assistant, cx| {
assistant.thread.update(cx, |thread, cx| {
let context = vec![];
thread.insert_user_message(self.user_prompt.clone(), context, None, cx);
thread.set_system_prompt_context(system_prompt_context);
thread.send_to_model(model, RequestKind::Chat, cx);
});
})?;

View File

@@ -128,12 +128,7 @@ impl HeadlessAssistant {
}
}
}
ThreadEvent::StreamedCompletion
| ThreadEvent::SummaryChanged
| ThreadEvent::StreamedAssistantText(_, _)
| ThreadEvent::MessageAdded(_)
| ThreadEvent::MessageEdited(_)
| ThreadEvent::MessageDeleted(_) => {}
_ => {}
}
}
}

View File

@@ -14,6 +14,7 @@ path = "src/assistant_settings.rs"
[dependencies]
anthropic = { workspace = true, features = ["schemars"] }
anyhow.workspace = true
collections.workspace = true
feature_flags.workspace = true
gpui.workspace = true
language_model.workspace = true

View File

@@ -1,7 +1,10 @@
mod agent_profile;
use std::sync::Arc;
use ::open_ai::Model as OpenAiModel;
use anthropic::Model as AnthropicModel;
use collections::HashMap;
use deepseek::Model as DeepseekModel;
use feature_flags::FeatureFlagAppExt;
use gpui::{App, Pixels};
@@ -12,6 +15,8 @@ use schemars::{schema::Schema, JsonSchema};
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
pub use crate::agent_profile::*;
#[derive(Copy, Clone, Default, Debug, Serialize, Deserialize, JsonSchema)]
#[serde(rename_all = "snake_case")]
pub enum AssistantDockPosition {
@@ -66,6 +71,7 @@ pub struct AssistantSettings {
pub inline_alternatives: Vec<LanguageModelSelection>,
pub using_outdated_settings_version: bool,
pub enable_experimental_live_diffs: bool,
pub profiles: HashMap<Arc<str>, AgentProfile>,
}
impl AssistantSettings {
@@ -166,6 +172,7 @@ impl AssistantSettingsContent {
editor_model: None,
inline_alternatives: None,
enable_experimental_live_diffs: None,
profiles: None,
},
VersionedAssistantSettingsContent::V2(settings) => settings.clone(),
},
@@ -187,6 +194,7 @@ impl AssistantSettingsContent {
editor_model: None,
inline_alternatives: None,
enable_experimental_live_diffs: None,
profiles: None,
},
}
}
@@ -316,6 +324,7 @@ impl Default for VersionedAssistantSettingsContent {
editor_model: None,
inline_alternatives: None,
enable_experimental_live_diffs: None,
profiles: None,
})
}
}
@@ -352,6 +361,8 @@ pub struct AssistantSettingsContentV2 {
///
/// Default: false
enable_experimental_live_diffs: Option<bool>,
#[schemars(skip)]
profiles: Option<HashMap<Arc<str>, AgentProfileContent>>,
}
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema, PartialEq)]
@@ -388,6 +399,12 @@ impl Default for LanguageModelSelection {
}
}
#[derive(Debug, PartialEq, Clone, Serialize, Deserialize, JsonSchema)]
pub struct AgentProfileContent {
pub name: Arc<str>,
pub tools: HashMap<Arc<str>, bool>,
}
#[derive(Clone, Serialize, Deserialize, JsonSchema, Debug)]
pub struct AssistantSettingsContentV1 {
/// Whether the Assistant is enabled.
@@ -482,6 +499,24 @@ impl Settings for AssistantSettings {
&mut settings.enable_experimental_live_diffs,
value.enable_experimental_live_diffs,
);
merge(
&mut settings.profiles,
value.profiles.map(|profiles| {
profiles
.into_iter()
.map(|(id, profile)| {
(
id,
AgentProfile {
name: profile.name.into(),
tools: profile.tools,
context_servers: HashMap::default(),
},
)
})
.collect()
}),
);
}
Ok(settings)
@@ -546,6 +581,7 @@ mod tests {
default_width: None,
default_height: None,
enable_experimental_live_diffs: None,
profiles: None,
}),
)
},

View File

@@ -5,8 +5,7 @@ use std::sync::Arc;
use anyhow::Result;
use collections::{HashMap, HashSet};
use gpui::Context;
use gpui::{App, Entity, SharedString, Task};
use gpui::{App, Context, Entity, SharedString, Task};
use language::Buffer;
use language_model::LanguageModelRequestMessage;
use project::Project;
@@ -44,6 +43,9 @@ pub trait Tool: 'static + Send + Sync {
serde_json::Value::Object(serde_json::Map::default())
}
/// Returns markdown to be displayed in the UI for this tool.
fn ui_text(&self, input: &serde_json::Value) -> String;
/// Runs the tool with the provided input.
fn run(
self: Arc<Self>,

View File

@@ -32,6 +32,13 @@ impl Tool for BashTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<BashToolInput>(input.clone()) {
Ok(input) => format!("`$ {}`", input.command),
Err(_) => "Run bash command".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,

View File

@@ -39,6 +39,13 @@ impl Tool for DeletePathTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<DeletePathToolInput>(input.clone()) {
Ok(input) => format!("Delete “`{}`”", input.path),
Err(_) => "Delete path".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
@@ -59,13 +66,12 @@ impl Tool for DeletePathTool {
{
Some(deletion_task) => cx.background_spawn(async move {
match deletion_task.await {
Ok(()) => Ok(format!("Deleted {}", &path_str)),
Err(err) => Err(anyhow!("Failed to delete {}: {}", &path_str, err)),
Ok(()) => Ok(format!("Deleted {path_str}")),
Err(err) => Err(anyhow!("Failed to delete {path_str}: {err}")),
}
}),
None => Task::ready(Err(anyhow!(
"Couldn't delete {} because that path isn't in this project.",
path_str
"Couldn't delete {path_str} because that path isn't in this project."
))),
}
}

View File

@@ -46,6 +46,17 @@ impl Tool for DiagnosticsTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
if let Some(path) = serde_json::from_value::<DiagnosticsToolInput>(input.clone())
.ok()
.and_then(|input| input.path)
{
format!("Check diagnostics for “`{}`”", path.display())
} else {
"Check project diagnostics".to_string()
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
@@ -54,14 +65,15 @@ impl Tool for DiagnosticsTool {
_action_log: Entity<ActionLog>,
cx: &mut App,
) -> Task<Result<String>> {
let input = match serde_json::from_value::<DiagnosticsToolInput>(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
};
if let Some(path) = input.path {
if let Some(path) = serde_json::from_value::<DiagnosticsToolInput>(input)
.ok()
.and_then(|input| input.path)
{
let Some(project_path) = project.read(cx).find_project_path(&path, cx) else {
return Task::ready(Err(anyhow!("Could not find path in project")));
return Task::ready(Err(anyhow!(
"Could not find path {} in project",
path.display()
)));
};
let buffer = project.update(cx, |project, cx| project.open_buffer(project_path, cx));

View File

@@ -24,10 +24,7 @@ use util::ResultExt;
pub struct EditFilesToolInput {
/// High-level edit instructions. These will be interpreted by a smaller
/// model, so explain the changes you want that model to make and which
/// file paths need changing.
///
/// The description should be concise and clear. We will show this
/// description to the user as well.
/// file paths need changing. The description should be concise and clear.
///
/// WARNING: When specifying which file paths need changing, you MUST
/// start each path with one of the project's root directories.
@@ -58,6 +55,21 @@ pub struct EditFilesToolInput {
/// Notice how we never specify code snippets in the instructions!
/// </example>
pub edit_instructions: String,
/// A user-friendly description of what changes are being made.
/// This will be shown to the user in the UI to describe the edit operation. The screen real estate for this UI will be extremely
/// constrained, so make the description extremely terse.
///
/// <example>
/// For fixing a broken authentication system:
/// "Fix auth bug in login flow"
/// </example>
///
/// <example>
/// For adding unit tests to a module:
/// "Add tests for user profile logic"
/// </example>
pub display_description: String,
}
pub struct EditFilesTool;
@@ -76,6 +88,13 @@ impl Tool for EditFilesTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<EditFilesToolInput>(input.clone()) {
Ok(input) => input.display_description,
Err(_) => "Edit files".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
@@ -126,24 +145,39 @@ impl Tool for EditFilesTool {
struct EditToolRequest {
parser: EditActionParser,
output: String,
changed_buffers: HashSet<Entity<language::Buffer>>,
bad_searches: Vec<BadSearch>,
editor_response: EditorResponse,
project: Entity<Project>,
action_log: Entity<ActionLog>,
tool_log: Option<(Entity<EditToolLog>, EditToolRequestId)>,
}
#[derive(Debug)]
enum DiffResult {
BadSearch(BadSearch),
Diff(language::Diff),
enum EditorResponse {
/// The editor model hasn't produced any actions yet.
/// If we don't have any by the end, we'll return its message to the architect model.
Message(String),
/// The editor model produced at least one action.
Actions {
applied: Vec<AppliedAction>,
search_errors: Vec<SearchError>,
},
}
struct AppliedAction {
source: String,
buffer: Entity<language::Buffer>,
}
#[derive(Debug)]
struct BadSearch {
file_path: String,
search: String,
enum SearchError {
NoMatch {
file_path: String,
search: String,
},
EmptyBuffer {
file_path: String,
search: String,
exists: bool,
},
}
impl EditToolRequest {
@@ -200,10 +234,7 @@ impl EditToolRequest {
let mut request = Self {
parser: EditActionParser::new(),
// we start with the success header so we don't need to shift the output in the common case
output: Self::SUCCESS_OUTPUT_HEADER.to_string(),
changed_buffers: HashSet::default(),
bad_searches: Vec::new(),
editor_response: EditorResponse::Message(String::with_capacity(256)),
action_log,
project,
tool_log,
@@ -220,6 +251,12 @@ impl EditToolRequest {
async fn process_response_chunk(&mut self, chunk: &str, cx: &mut AsyncApp) -> Result<()> {
let new_actions = self.parser.parse_chunk(chunk);
if let EditorResponse::Message(ref mut message) = self.editor_response {
if new_actions.is_empty() {
message.push_str(chunk);
}
}
if let Some((ref log, req_id)) = self.tool_log {
log.update(cx, |log, cx| {
log.push_editor_response_chunk(req_id, chunk, &new_actions, cx)
@@ -250,6 +287,11 @@ impl EditToolRequest {
.update(cx, |project, cx| project.open_buffer(project_path, cx))?
.await?;
enum DiffResult {
Diff(language::Diff),
SearchError(SearchError),
}
let result = match action {
EditAction::Replace {
old,
@@ -259,7 +301,39 @@ impl EditToolRequest {
let snapshot = buffer.read_with(cx, |buffer, _cx| buffer.snapshot())?;
cx.background_executor()
.spawn(Self::replace_diff(old, new, file_path, snapshot))
.spawn(async move {
if snapshot.is_empty() {
let exists = snapshot
.file()
.map_or(false, |file| file.disk_state().exists());
let error = SearchError::EmptyBuffer {
file_path: file_path.display().to_string(),
exists,
search: old,
};
return anyhow::Ok(DiffResult::SearchError(error));
}
let replace_result =
// Try to match exactly
replace_exact(&old, &new, &snapshot)
.await
// If that fails, try being flexible about indentation
.or_else(|| replace_with_flexible_indent(&old, &new, &snapshot));
let Some(diff) = replace_result else {
let error = SearchError::NoMatch {
search: old,
file_path: file_path.display().to_string(),
};
return Ok(DiffResult::SearchError(error));
};
Ok(DiffResult::Diff(diff))
})
.await
}
EditAction::Write { content, .. } => Ok(DiffResult::Diff(
@@ -270,139 +344,179 @@ impl EditToolRequest {
}?;
match result {
DiffResult::BadSearch(invalid_replace) => {
self.bad_searches.push(invalid_replace);
DiffResult::SearchError(error) => {
self.push_search_error(error);
}
DiffResult::Diff(diff) => {
let _clock = buffer.update(cx, |buffer, cx| buffer.apply_diff(diff, cx))?;
write!(&mut self.output, "\n\n{}", source)?;
self.changed_buffers.insert(buffer);
self.push_applied_action(AppliedAction { source, buffer });
}
}
Ok(())
anyhow::Ok(())
}
async fn replace_diff(
old: String,
new: String,
file_path: std::path::PathBuf,
snapshot: language::BufferSnapshot,
) -> Result<DiffResult> {
let result =
// Try to match exactly
replace_exact(&old, &new, &snapshot)
.await
// If that fails, try being flexible about indentation
.or_else(|| replace_with_flexible_indent(&old, &new, &snapshot));
let Some(diff) = result else {
return anyhow::Ok(DiffResult::BadSearch(BadSearch {
search: old,
file_path: file_path.display().to_string(),
}));
};
anyhow::Ok(DiffResult::Diff(diff))
fn push_search_error(&mut self, error: SearchError) {
match &mut self.editor_response {
EditorResponse::Message(_) => {
self.editor_response = EditorResponse::Actions {
applied: Vec::new(),
search_errors: vec![error],
};
}
EditorResponse::Actions { search_errors, .. } => {
search_errors.push(error);
}
}
}
const SUCCESS_OUTPUT_HEADER: &str = "Successfully applied. Here's a list of changes:";
const ERROR_OUTPUT_HEADER_NO_EDITS: &str = "I couldn't apply any edits!";
const ERROR_OUTPUT_HEADER_WITH_EDITS: &str =
"Errors occurred. First, here's a list of the edits we managed to apply:";
fn push_applied_action(&mut self, action: AppliedAction) {
match &mut self.editor_response {
EditorResponse::Message(_) => {
self.editor_response = EditorResponse::Actions {
applied: vec![action],
search_errors: Vec::new(),
};
}
EditorResponse::Actions { applied, .. } => {
applied.push(action);
}
}
}
async fn finalize(self, cx: &mut AsyncApp) -> Result<String> {
let changed_buffer_count = self.changed_buffers.len();
match self.editor_response {
EditorResponse::Message(message) => Err(anyhow!(
"No edits were applied! You might need to provide more context.\n\n{}",
message
)),
EditorResponse::Actions {
applied,
search_errors,
} => {
let mut output = String::with_capacity(1024);
// Save each buffer once at the end
for buffer in &self.changed_buffers {
self.project
.update(cx, |project, cx| project.save_buffer(buffer.clone(), cx))?
.await?;
}
let parse_errors = self.parser.errors();
let has_errors = !search_errors.is_empty() || !parse_errors.is_empty();
self.action_log
.update(cx, |log, cx| log.buffer_edited(self.changed_buffers, cx))
.log_err();
if has_errors {
let error_count = search_errors.len() + parse_errors.len();
let errors = self.parser.errors();
if applied.is_empty() {
writeln!(
&mut output,
"{} errors occurred! No edits were applied.",
error_count,
)?;
} else {
writeln!(
&mut output,
"{} errors occurred, but {} edits were correctly applied.",
error_count,
applied.len(),
)?;
if errors.is_empty() && self.bad_searches.is_empty() {
if changed_buffer_count == 0 {
return Err(anyhow!(
"The instructions didn't lead to any changes. You might need to consult the file contents first."
));
}
Ok(self.output)
} else {
let mut output = self.output;
if output.is_empty() {
output.replace_range(
0..Self::SUCCESS_OUTPUT_HEADER.len(),
Self::ERROR_OUTPUT_HEADER_NO_EDITS,
);
} else {
output.replace_range(
0..Self::SUCCESS_OUTPUT_HEADER.len(),
Self::ERROR_OUTPUT_HEADER_WITH_EDITS,
);
}
if !self.bad_searches.is_empty() {
writeln!(
&mut output,
"\n\n# {} SEARCH/REPLACE block(s) failed to match:\n",
self.bad_searches.len()
)?;
for replace in self.bad_searches {
writeln!(
writeln!(
&mut output,
"# {} SEARCH/REPLACE block(s) applied:\n\nDo not re-send these since they are already applied!\n",
applied.len()
)?;
}
} else {
write!(
&mut output,
"## No exact match in: {}\n```\n{}\n```\n",
replace.file_path, replace.search,
"Successfully applied! Here's a list of applied edits:"
)?;
}
write!(&mut output,
"The SEARCH section must exactly match an existing block of lines including all white \
space, comments, indentation, docstrings, etc."
)?;
}
let mut changed_buffers = HashSet::default();
if !errors.is_empty() {
writeln!(
&mut output,
"\n\n# {} SEARCH/REPLACE blocks failed to parse:",
errors.len()
)?;
for error in errors {
writeln!(&mut output, "- {}", error)?;
for action in applied {
changed_buffers.insert(action.buffer);
write!(&mut output, "\n\n{}", action.source)?;
}
}
if changed_buffer_count > 0 {
writeln!(
&mut output,
"\n\nThe other SEARCH/REPLACE blocks were applied successfully. Do not re-send them!",
)?;
}
for buffer in &changed_buffers {
self.project
.update(cx, |project, cx| project.save_buffer(buffer.clone(), cx))?
.await?;
}
writeln!(
&mut output,
"{}You can fix errors by running the tool again. You can include instructions, \
but errors are part of the conversation so you don't need to repeat them.",
if changed_buffer_count == 0 {
"\n\n"
self.action_log
.update(cx, |log, cx| log.buffer_edited(changed_buffers.clone(), cx))
.log_err();
if !search_errors.is_empty() {
writeln!(
&mut output,
"\n\n## {} SEARCH/REPLACE block(s) failed to match:\n",
search_errors.len()
)?;
for error in search_errors {
match error {
SearchError::NoMatch { file_path, search } => {
writeln!(
&mut output,
"### No exact match in: `{}`\n```\n{}\n```\n",
file_path, search,
)?;
}
SearchError::EmptyBuffer {
file_path,
exists: true,
search,
} => {
writeln!(
&mut output,
"### No match because `{}` is empty:\n```\n{}\n```\n",
file_path, search,
)?;
}
SearchError::EmptyBuffer {
file_path,
exists: false,
search,
} => {
writeln!(
&mut output,
"### No match because `{}` does not exist:\n```\n{}\n```\n",
file_path, search,
)?;
}
}
}
write!(&mut output,
"The SEARCH section must exactly match an existing block of lines including all white \
space, comments, indentation, docstrings, etc."
)?;
}
if !parse_errors.is_empty() {
writeln!(
&mut output,
"\n\n## {} SEARCH/REPLACE blocks failed to parse:",
parse_errors.len()
)?;
for error in parse_errors {
writeln!(&mut output, "- {}", error)?;
}
}
if has_errors {
writeln!(&mut output,
"\n\nYou can fix errors by running the tool again. You can include instructions, \
but errors are part of the conversation so you don't need to repeat them.",
)?;
Err(anyhow!(output))
} else {
""
Ok(output)
}
)?;
Err(anyhow!(output))
}
}
}
}
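The new `EditorResponse` enum above starts as `Message`, accumulating raw model text, and flips to `Actions` the first time an applied edit or a search error arrives; later arrivals just append. A minimal sketch of that state transition, with `String` payloads assumed in place of the real `AppliedAction`/`SearchError` types:

```rust
// Simplified model of the editor-response state machine: any text received
// before the first action is kept as a Message; once an action or error
// arrives, the response permanently becomes Actions.
#[derive(Debug)]
enum EditorResponse {
    Message(String),
    Actions {
        applied: Vec<String>,
        search_errors: Vec<String>,
    },
}

impl EditorResponse {
    fn push_search_error(&mut self, error: String) {
        match self {
            // First error: transition out of the Message state.
            EditorResponse::Message(_) => {
                *self = EditorResponse::Actions {
                    applied: Vec::new(),
                    search_errors: vec![error],
                };
            }
            // Already in Actions: just record the additional error.
            EditorResponse::Actions { search_errors, .. } => search_errors.push(error),
        }
    }
}

fn main() {
    let mut response = EditorResponse::Message(String::from("partial model text"));
    response.push_search_error("no match in foo.rs".to_string());
    response.push_search_error("empty buffer bar.rs".to_string());
    match &response {
        EditorResponse::Actions { applied, search_errors } => {
            assert!(applied.is_empty());
            assert_eq!(search_errors.len(), 2);
        }
        _ => panic!("expected Actions after first error"),
    }
}
```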

View File

@@ -122,6 +122,13 @@ impl Tool for FetchTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<FetchToolInput>(input.clone()) {
Ok(input) => format!("Fetch `{}`", input.url),
Err(_) => "Fetch URL".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,

View File

@@ -50,6 +50,13 @@ impl Tool for ListDirectoryTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<ListDirectoryToolInput>(input.clone()) {
Ok(input) => format!("List the `{}` directory's contents", input.path.display()),
Err(_) => "List directory".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
@@ -64,7 +71,10 @@ impl Tool for ListDirectoryTool {
};
let Some(project_path) = project.read(cx).find_project_path(&input.path, cx) else {
return Task::ready(Err(anyhow!("Path not found in project")));
return Task::ready(Err(anyhow!(
"Path {} not found in project",
input.path.display()
)));
};
let Some(worktree) = project
.read(cx)
@@ -79,7 +89,7 @@ impl Tool for ListDirectoryTool {
};
if !entry.is_dir() {
return Task::ready(Err(anyhow!("{} is a file.", input.path.display())));
return Task::ready(Err(anyhow!("{} is not a directory.", input.path.display())));
}
let mut output = String::new();

View File

@@ -40,6 +40,10 @@ impl Tool for NowTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, _input: &serde_json::Value) -> String {
"Get current time".to_string()
}
fn run(
self: Arc<Self>,
input: serde_json::Value,

View File

@@ -48,6 +48,13 @@ impl Tool for PathSearchTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<PathSearchToolInput>(input.clone()) {
Ok(input) => format!("Find paths matching “`{}`”", input.glob),
Err(_) => "Search paths".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
@@ -62,7 +69,7 @@ impl Tool for PathSearchTool {
};
let path_matcher = match PathMatcher::new(&[glob.clone()]) {
Ok(matcher) => matcher,
Err(err) => return Task::ready(Err(anyhow!("Invalid glob: {}", err))),
Err(err) => return Task::ready(Err(anyhow!("Invalid glob: {err}"))),
};
let snapshots: Vec<Snapshot> = project
.read(cx)

View File

@@ -53,6 +53,13 @@ impl Tool for ReadFileTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<ReadFileToolInput>(input.clone()) {
Ok(input) => format!("Read file `{}`", input.path.display()),
Err(_) => "Read file".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
@@ -67,7 +74,10 @@ impl Tool for ReadFileTool {
};
let Some(project_path) = project.read(cx).find_project_path(&input.path, cx) else {
return Task::ready(Err(anyhow!("Path not found in project")));
return Task::ready(Err(anyhow!(
"Path {} not found in project",
&input.path.display()
)));
};
cx.spawn(async move |cx| {

View File

@@ -22,10 +22,17 @@ pub struct RegexSearchToolInput {
/// Optional starting position for paginated results (0-based).
/// When not provided, starts from the beginning.
#[serde(default)]
pub offset: Option<usize>,
pub offset: Option<u32>,
}
const RESULTS_PER_PAGE: usize = 20;
impl RegexSearchToolInput {
/// Which page of search results this is.
pub fn page(&self) -> u32 {
1 + (self.offset.unwrap_or(0) / RESULTS_PER_PAGE)
}
}
const RESULTS_PER_PAGE: u32 = 20;
pub struct RegexSearchTool;
@@ -43,6 +50,24 @@ impl Tool for RegexSearchTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, input: &serde_json::Value) -> String {
match serde_json::from_value::<RegexSearchToolInput>(input.clone()) {
Ok(input) => {
let page = input.page();
if page > 1 {
format!(
"Get page {page} of search results for regex “`{}`”",
input.regex
)
} else {
format!("Search files for regex “`{}`”", input.regex)
}
}
Err(_) => "Search with regex".to_string(),
}
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
@@ -154,7 +179,7 @@ impl Tool for RegexSearchTool {
offset + matches_found,
offset + RESULTS_PER_PAGE,
))
} else {
} else {
Ok(format!("Found {matches_found} matches:\n{output}"))
}
})
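The `page()` helper introduced above derives a 1-based page number from the 0-based `offset` by integer division. A standalone sketch of that arithmetic (stdlib only; `RESULTS_PER_PAGE` mirrors the diff's constant, the free function stands in for the method on `RegexSearchToolInput`):

```rust
// Each page holds RESULTS_PER_PAGE matches; `offset` counts matches
// already returned (0-based), and a missing offset means "start over".
const RESULTS_PER_PAGE: u32 = 20;

fn page(offset: Option<u32>) -> u32 {
    // Integer division truncates, so offsets 0..19 map to page 1,
    // 20..39 to page 2, and so on.
    1 + offset.unwrap_or(0) / RESULTS_PER_PAGE
}

fn main() {
    assert_eq!(page(None), 1); // no offset: first page
    assert_eq!(page(Some(19)), 1); // still within the first page
    assert_eq!(page(Some(20)), 2); // next page starts at offset 20
}
```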

View File

@@ -31,6 +31,10 @@ impl Tool for ThinkingTool {
serde_json::to_value(&schema).unwrap()
}
fn ui_text(&self, _input: &serde_json::Value) -> String {
"Thinking".to_string()
}
fn run(
self: Arc<Self>,
input: serde_json::Value,

View File

@@ -18,7 +18,6 @@ test-support = [
"collections/test-support",
"gpui/test-support",
"livekit_client/test-support",
"livekit_client_macos/test-support",
"project/test-support",
"util/test-support"
]
@@ -41,12 +40,8 @@ serde_derive.workspace = true
settings.workspace = true
telemetry.workspace = true
util.workspace = true
[target.'cfg(target_os = "macos")'.dependencies]
livekit_client_macos = { workspace = true }
[target.'cfg(not(target_os = "macos"))'.dependencies]
livekit_client = { workspace = true }
gpui_tokio.workspace = true
livekit_client.workspace = true
[dev-dependencies]
client = { workspace = true, features = ["test-support"] }
@@ -57,9 +52,4 @@ language = { workspace = true, features = ["test-support"] }
project = { workspace = true, features = ["test-support"] }
util = { workspace = true, features = ["test-support"] }
http_client = { workspace = true, features = ["test-support"] }
[target.'cfg(target_os = "macos")'.dev-dependencies]
livekit_client_macos = { workspace = true, features = ["test-support"] }
[target.'cfg(not(target_os = "macos"))'.dev-dependencies]
livekit_client = { workspace = true, features = ["test-support"] }

View File

@@ -1,13 +1,5 @@
pub mod call_settings;
#[cfg(target_os = "macos")]
mod macos;
mod call_impl;
#[cfg(target_os = "macos")]
pub use macos::*;
#[cfg(not(target_os = "macos"))]
mod cross_platform;
#[cfg(not(target_os = "macos"))]
pub use cross_platform::*;
pub use call_impl::*;

View File

@@ -17,9 +17,7 @@ use room::Event;
use settings::Settings;
use std::sync::Arc;
pub use livekit_client::{
track::RemoteVideoTrack, RemoteVideoTrackView, RemoteVideoTrackViewEvent,
};
pub use livekit_client::{RemoteVideoTrack, RemoteVideoTrackView, RemoteVideoTrackViewEvent};
pub use participant::ParticipantLocation;
pub use room::Room;
@@ -28,10 +26,6 @@ struct GlobalActiveCall(Entity<ActiveCall>);
impl Global for GlobalActiveCall {}
pub fn init(client: Arc<Client>, user_store: Entity<UserStore>, cx: &mut App) {
livekit_client::init(
cx.background_executor().dispatcher.clone(),
cx.http_client(),
);
CallSettings::register(cx);
let active_call = cx.new(|cx| ActiveCall::new(client, user_store, cx));

View File

@@ -1,13 +1,14 @@
use anyhow::{anyhow, Result};
use client::ParticipantIndex;
use client::{proto, User};
use client::{proto, ParticipantIndex, User};
use collections::HashMap;
use gpui::WeakEntity;
pub use livekit_client_macos::Frame;
pub use livekit_client_macos::{RemoteAudioTrack, RemoteVideoTrack};
use livekit_client::AudioStream;
use project::Project;
use std::sync::Arc;
pub use livekit_client::TrackSid;
pub use livekit_client::{RemoteAudioTrack, RemoteVideoTrack};
#[derive(Copy, Clone, Debug, Eq, PartialEq)]
pub enum ParticipantLocation {
SharedProject { project_id: u64 },
@@ -48,7 +49,6 @@ impl LocalParticipant {
}
}
#[derive(Clone, Debug)]
pub struct RemoteParticipant {
pub user: Arc<User>,
pub peer_id: proto::PeerId,
@@ -58,13 +58,13 @@ pub struct RemoteParticipant {
pub participant_index: ParticipantIndex,
pub muted: bool,
pub speaking: bool,
pub video_tracks: HashMap<livekit_client_macos::Sid, Arc<RemoteVideoTrack>>,
pub audio_tracks: HashMap<livekit_client_macos::Sid, Arc<RemoteAudioTrack>>,
pub video_tracks: HashMap<TrackSid, RemoteVideoTrack>,
pub audio_tracks: HashMap<TrackSid, (RemoteAudioTrack, AudioStream)>,
}
impl RemoteParticipant {
pub fn has_video_tracks(&self) -> bool {
!self.video_tracks.is_empty()
return !self.video_tracks.is_empty();
}
pub fn can_write(&self) -> bool {

View File

@@ -1,5 +1,3 @@
#![cfg_attr(all(target_os = "windows", target_env = "gnu"), allow(unused))]
use crate::{
call_settings::CallSettings,
participant::{LocalParticipant, ParticipantLocation, RemoteParticipant},
@@ -14,19 +12,9 @@ use collections::{BTreeMap, HashMap, HashSet};
use fs::Fs;
use futures::{FutureExt, StreamExt};
use gpui::{App, AppContext as _, AsyncApp, Context, Entity, EventEmitter, Task, WeakEntity};
use gpui_tokio::Tokio;
use language::LanguageRegistry;
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
use livekit::{
capture_local_audio_track, capture_local_video_track,
id::ParticipantIdentity,
options::{TrackPublishOptions, VideoCodec},
play_remote_audio_track,
publication::LocalTrackPublication,
track::{TrackKind, TrackSource},
RoomEvent, RoomOptions,
};
#[cfg(all(target_os = "windows", target_env = "gnu"))]
use livekit::{publication::LocalTrackPublication, RoomEvent};
use livekit::{play_remote_audio_track, LocalTrackPublication, ParticipantIdentity, RoomEvent};
use livekit_client as livekit;
use postage::{sink::Sink, stream::Stream, watch};
use project::Project;
@@ -104,11 +92,7 @@ impl Room {
!self.shared_projects.is_empty()
}
#[cfg(all(
any(test, feature = "test-support"),
not(all(target_os = "windows", target_env = "gnu"))
))]
pub fn is_connected(&self) -> bool {
pub fn is_connected(&self, _: &App) -> bool {
if let Some(live_kit) = self.live_kit.as_ref() {
live_kit.room.connection_state() == livekit::ConnectionState::Connected
} else {
@@ -469,18 +453,27 @@ impl Room {
let project = handle.read(cx);
if let Some(project_id) = project.remote_id() {
projects.insert(project_id, handle.clone());
let mut worktrees = Vec::new();
let mut repositories = Vec::new();
for worktree in project.worktrees(cx) {
let worktree = worktree.read(cx);
worktrees.push(proto::RejoinWorktree {
id: worktree.id().to_proto(),
scan_id: worktree.completed_scan_id() as u64,
});
}
for (entry_id, repository) in project.repositories(cx) {
let repository = repository.read(cx);
repositories.push(proto::RejoinRepository {
id: entry_id.to_proto(),
scan_id: repository.completed_scan_id as u64,
});
}
rejoined_projects.push(proto::RejoinProject {
id: project_id,
worktrees: project
.worktrees(cx)
.map(|worktree| {
let worktree = worktree.read(cx);
proto::RejoinWorktree {
id: worktree.id().to_proto(),
scan_id: worktree.completed_scan_id() as u64,
}
})
.collect(),
worktrees,
repositories,
});
}
return true;
@@ -680,12 +673,6 @@ impl Room {
}
}
#[cfg(all(target_os = "windows", target_env = "gnu"))]
fn start_room_connection(&self, mut room: proto::Room, cx: &mut Context<Self>) -> Task<()> {
Task::ready(())
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
fn start_room_connection(&self, mut room: proto::Room, cx: &mut Context<Self>) -> Task<()> {
// Filter ourselves out from the room's participants.
let local_participant_ix = room
@@ -838,7 +825,6 @@ impl Room {
muted: true,
speaking: false,
video_tracks: Default::default(),
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
audio_tracks: Default::default(),
},
);
@@ -941,7 +927,6 @@ impl Room {
);
match event {
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
RoomEvent::TrackSubscribed {
track,
participant,
@@ -956,18 +941,27 @@ impl Room {
)
})?;
if self.live_kit.as_ref().map_or(true, |kit| kit.deafened) {
track.rtc_track().set_enabled(false);
if matches!(track, livekit_client::RemoteTrack::Audio(_)) {
track.set_enabled(false, cx);
}
}
match track {
livekit::track::RemoteTrack::Audio(track) => {
livekit_client::RemoteTrack::Audio(track) => {
cx.emit(Event::RemoteAudioTracksChanged {
participant_id: participant.peer_id,
});
let stream = play_remote_audio_track(&track, cx.background_executor())?;
let apm = self
.live_kit
.as_ref()
.unwrap()
.room
.audio_processing_module();
let stream =
play_remote_audio_track(apm, &track, cx.background_executor())?;
participant.audio_tracks.insert(track_id, (track, stream));
participant.muted = publication.is_muted();
}
livekit::track::RemoteTrack::Video(track) => {
livekit_client::RemoteTrack::Video(track) => {
cx.emit(Event::RemoteVideoTracksChanged {
participant_id: participant.peer_id,
});
@@ -976,7 +970,6 @@ impl Room {
}
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
RoomEvent::TrackUnsubscribed {
track, participant, ..
} => {
@@ -988,14 +981,14 @@ impl Room {
)
})?;
match track {
livekit::track::RemoteTrack::Audio(track) => {
livekit_client::RemoteTrack::Audio(track) => {
participant.audio_tracks.remove(&track.sid());
participant.muted = true;
cx.emit(Event::RemoteAudioTracksChanged {
participant_id: participant.peer_id,
});
}
livekit::track::RemoteTrack::Video(track) => {
livekit_client::RemoteTrack::Video(track) => {
participant.video_tracks.remove(&track.sid());
cx.emit(Event::RemoteVideoTracksChanged {
participant_id: participant.peer_id,
@@ -1004,7 +997,6 @@ impl Room {
}
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
RoomEvent::ActiveSpeakersChanged { speakers } => {
let mut speaker_ids = speakers
.into_iter()
@@ -1021,7 +1013,6 @@ impl Room {
}
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
RoomEvent::TrackMuted {
participant,
publication,
@@ -1046,7 +1037,6 @@ impl Room {
}
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
RoomEvent::LocalTrackUnpublished { publication, .. } => {
log::info!("unpublished track {}", publication.sid());
if let Some(room) = &mut self.live_kit {
@@ -1069,12 +1059,10 @@ impl Room {
}
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
RoomEvent::LocalTrackPublished { publication, .. } => {
log::info!("published track {:?}", publication.sid());
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
RoomEvent::Disconnected { reason } => {
log::info!("disconnected from room: {reason:?}");
self.leave(cx).detach_and_log_err(cx);
@@ -1302,13 +1290,6 @@ impl Room {
pub fn can_use_microphone(&self) -> bool {
use proto::ChannelRole::*;
#[cfg(not(any(test, feature = "test-support")))]
{
if cfg!(all(target_os = "windows", target_env = "gnu")) {
return false;
}
}
match self.local_participant.role {
Admin | Member | Talker => true,
Guest | Banned => false,
@@ -1323,40 +1304,27 @@ impl Room {
}
}
#[cfg(all(target_os = "windows", target_env = "gnu"))]
pub fn share_microphone(&mut self, cx: &mut Context<Self>) -> Task<Result<()>> {
Task::ready(Err(anyhow!("MinGW is not supported yet")))
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
#[track_caller]
pub fn share_microphone(&mut self, cx: &mut Context<Self>) -> Task<Result<()>> {
if self.status.is_offline() {
return Task::ready(Err(anyhow!("room is offline")));
}
let (participant, publish_id) = if let Some(live_kit) = self.live_kit.as_mut() {
let (participant, publish_id, apm) = if let Some(live_kit) = self.live_kit.as_mut() {
let publish_id = post_inc(&mut live_kit.next_publish_id);
live_kit.microphone_track = LocalTrack::Pending { publish_id };
cx.notify();
(live_kit.room.local_participant(), publish_id)
(
live_kit.room.local_participant(),
publish_id,
live_kit.room.audio_processing_module(),
)
} else {
return Task::ready(Err(anyhow!("live-kit was not initialized")));
};
cx.spawn(async move |this, cx| {
let (track, stream) = capture_local_audio_track(cx.background_executor())?.await;
let publication = participant
.publish_track(
livekit::track::LocalTrack::Audio(track),
TrackPublishOptions {
source: TrackSource::Microphone,
..Default::default()
},
)
.await
.map_err(|error| anyhow!("failed to publish track: {error}"));
let publication = participant.publish_microphone_track(apm, cx).await;
this.update(cx, |this, cx| {
let live_kit = this
.live_kit
@@ -1373,15 +1341,15 @@ impl Room {
};
match publication {
Ok(publication) => {
Ok((publication, stream)) => {
if canceled {
cx.background_spawn(async move {
participant.unpublish_track(&publication.sid()).await
cx.spawn(async move |_, cx| {
participant.unpublish_track(publication.sid(), cx).await
})
.detach_and_log_err(cx)
} else {
if live_kit.muted_by_user || live_kit.deafened {
publication.mute();
publication.mute(cx);
}
live_kit.microphone_track = LocalTrack::Published {
track_publication: publication,
@@ -1405,12 +1373,6 @@ impl Room {
})
}
#[cfg(all(target_os = "windows", target_env = "gnu"))]
pub fn share_screen(&mut self, cx: &mut Context<Self>) -> Task<Result<()>> {
Task::ready(Err(anyhow!("MinGW is not supported yet")))
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
pub fn share_screen(&mut self, cx: &mut Context<Self>) -> Task<Result<()>> {
if self.status.is_offline() {
return Task::ready(Err(anyhow!("room is offline")));
@@ -1434,19 +1396,7 @@ impl Room {
let sources = sources.await??;
let source = sources.first().ok_or_else(|| anyhow!("no display found"))?;
let (track, stream) = capture_local_video_track(&**source).await?;
let publication = participant
.publish_track(
livekit::track::LocalTrack::Video(track),
TrackPublishOptions {
source: TrackSource::Screenshare,
video_codec: VideoCodec::H264,
..Default::default()
},
)
.await
.map_err(|error| anyhow!("error publishing screen track {error:?}"));
let publication = participant.publish_screenshare_track(&**source, cx).await;
this.update(cx, |this, cx| {
let live_kit = this
@@ -1464,10 +1414,10 @@ impl Room {
};
match publication {
Ok(publication) => {
Ok((publication, stream)) => {
if canceled {
cx.background_spawn(async move {
participant.unpublish_track(&publication.sid()).await
cx.spawn(async move |_, cx| {
participant.unpublish_track(publication.sid(), cx).await
})
.detach()
} else {
@@ -1557,14 +1507,11 @@ impl Room {
LocalTrack::Published {
track_publication, ..
} => {
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
{
let local_participant = live_kit.room.local_participant();
let sid = track_publication.sid();
cx.background_spawn(
async move { local_participant.unpublish_track(&sid).await },
)
.detach_and_log_err(cx);
cx.spawn(async move |_, cx| local_participant.unpublish_track(sid, cx).await)
.detach_and_log_err(cx);
cx.notify();
}
@@ -1575,14 +1522,13 @@ impl Room {
}
fn set_deafened(&mut self, deafened: bool, cx: &mut Context<Self>) -> Option<()> {
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
{
let live_kit = self.live_kit.as_mut()?;
cx.notify();
for (_, participant) in live_kit.room.remote_participants() {
for (_, publication) in participant.track_publications() {
if publication.kind() == TrackKind::Audio {
publication.set_enabled(!deafened);
if publication.is_audio() {
publication.set_enabled(!deafened, cx);
}
}
}
@@ -1613,14 +1559,13 @@ impl Room {
LocalTrack::Published {
track_publication, ..
} => {
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
{
if should_mute {
track_publication.mute()
} else {
track_publication.unmute()
}
let guard = Tokio::handle(cx);
if should_mute {
track_publication.mute(cx)
} else {
track_publication.unmute(cx)
}
drop(guard);
None
}
@@ -1628,30 +1573,19 @@ impl Room {
}
}
#[cfg(all(target_os = "windows", target_env = "gnu"))]
fn spawn_room_connection(
livekit_connection_info: Option<proto::LiveKitConnectionInfo>,
cx: &mut Context<'_, Room>,
) {
}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
fn spawn_room_connection(
livekit_connection_info: Option<proto::LiveKitConnectionInfo>,
cx: &mut Context<'_, Room>,
) {
if let Some(connection_info) = livekit_connection_info {
cx.spawn(async move |this, cx| {
let (room, mut events) = livekit::Room::connect(
&connection_info.server_url,
&connection_info.token,
RoomOptions::default(),
)
.await?;
let (room, mut events) =
livekit::Room::connect(connection_info.server_url, connection_info.token, cx)
.await?;
this.update(cx, |this, cx| {
let _handle_updates = cx.spawn(async move |this, cx| {
while let Some(event) = events.recv().await {
while let Some(event) = events.next().await {
if this
.update(cx, |this, cx| {
this.livekit_room_updated(event, cx).warn_on_err();
@@ -1700,10 +1634,6 @@ struct LiveKitRoom {
}
impl LiveKitRoom {
#[cfg(all(target_os = "windows", target_env = "gnu"))]
fn stop_publishing(&mut self, _cx: &mut Context<Room>) {}
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
fn stop_publishing(&mut self, cx: &mut Context<Room>) {
let mut tracks_to_unpublish = Vec::new();
if let LocalTrack::Published {
@@ -1723,9 +1653,9 @@ impl LiveKitRoom {
}
let participant = self.room.local_participant();
cx.background_spawn(async move {
cx.spawn(async move |_, cx| {
for sid in tracks_to_unpublish {
participant.unpublish_track(&sid).await.log_err();
participant.unpublish_track(sid, cx).await.log_err();
}
})
.detach();

View File

@@ -1,84 +0,0 @@
#![cfg_attr(all(target_os = "windows", target_env = "gnu"), allow(unused))]
use anyhow::{anyhow, Result};
use client::{proto, ParticipantIndex, User};
use collections::HashMap;
use gpui::WeakEntity;
use livekit_client::AudioStream;
use project::Project;
use std::sync::Arc;
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
pub use livekit_client::id::TrackSid;
pub use livekit_client::track::{RemoteAudioTrack, RemoteVideoTrack};
#[derive(Copy, Clone, Debug, Eq, PartialEq)]
pub enum ParticipantLocation {
SharedProject { project_id: u64 },
UnsharedProject,
External,
}
impl ParticipantLocation {
pub fn from_proto(location: Option<proto::ParticipantLocation>) -> Result<Self> {
match location.and_then(|l| l.variant) {
Some(proto::participant_location::Variant::SharedProject(project)) => {
Ok(Self::SharedProject {
project_id: project.id,
})
}
Some(proto::participant_location::Variant::UnsharedProject(_)) => {
Ok(Self::UnsharedProject)
}
Some(proto::participant_location::Variant::External(_)) => Ok(Self::External),
None => Err(anyhow!("participant location was not provided")),
}
}
}
#[derive(Clone, Default)]
pub struct LocalParticipant {
pub projects: Vec<proto::ParticipantProject>,
pub active_project: Option<WeakEntity<Project>>,
pub role: proto::ChannelRole,
}
impl LocalParticipant {
pub fn can_write(&self) -> bool {
matches!(
self.role,
proto::ChannelRole::Admin | proto::ChannelRole::Member
)
}
}
pub struct RemoteParticipant {
pub user: Arc<User>,
pub peer_id: proto::PeerId,
pub role: proto::ChannelRole,
pub projects: Vec<proto::ParticipantProject>,
pub location: ParticipantLocation,
pub participant_index: ParticipantIndex,
pub muted: bool,
pub speaking: bool,
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
pub video_tracks: HashMap<TrackSid, RemoteVideoTrack>,
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
pub audio_tracks: HashMap<TrackSid, (RemoteAudioTrack, AudioStream)>,
}
impl RemoteParticipant {
pub fn has_video_tracks(&self) -> bool {
#[cfg(not(all(target_os = "windows", target_env = "gnu")))]
return !self.video_tracks.is_empty();
#[cfg(all(target_os = "windows", target_env = "gnu"))]
return false;
}
pub fn can_write(&self) -> bool {
matches!(
self.role,
proto::ChannelRole::Admin | proto::ChannelRole::Member
)
}
}

View File

@@ -1,521 +0,0 @@
pub mod participant;
pub mod room;
use crate::call_settings::CallSettings;
use anyhow::{anyhow, Result};
use audio::Audio;
use client::{proto, ChannelId, Client, TypedEnvelope, User, UserStore, ZED_ALWAYS_ACTIVE};
use collections::HashSet;
use futures::{channel::oneshot, future::Shared, Future, FutureExt};
use gpui::{
App, AppContext as _, AsyncApp, Context, Entity, EventEmitter, Global, Subscription, Task,
WeakEntity,
};
use postage::watch;
use project::Project;
use room::Event;
use settings::Settings;
use std::sync::Arc;
pub use participant::ParticipantLocation;
pub use room::Room;
struct GlobalActiveCall(Entity<ActiveCall>);
impl Global for GlobalActiveCall {}
pub fn init(client: Arc<Client>, user_store: Entity<UserStore>, cx: &mut App) {
CallSettings::register(cx);
let active_call = cx.new(|cx| ActiveCall::new(client, user_store, cx));
cx.set_global(GlobalActiveCall(active_call));
}
pub struct OneAtATime {
cancel: Option<oneshot::Sender<()>>,
}
impl OneAtATime {
/// spawn a task in the given context.
/// if another task is spawned before that resolves, or if the OneAtATime itself is dropped, the first task will be cancelled and return Ok(None)
/// otherwise you'll see the result of the task.
fn spawn<F, Fut, R>(&mut self, cx: &mut App, f: F) -> Task<Result<Option<R>>>
where
F: 'static + FnOnce(AsyncApp) -> Fut,
Fut: Future<Output = Result<R>>,
R: 'static,
{
let (tx, rx) = oneshot::channel();
self.cancel.replace(tx);
cx.spawn(async move |cx| {
futures::select_biased! {
_ = rx.fuse() => Ok(None),
result = f(cx.clone()).fuse() => result.map(Some),
}
})
}
fn running(&self) -> bool {
self.cancel
.as_ref()
.is_some_and(|cancel| !cancel.is_canceled())
}
}
#[derive(Clone)]
pub struct IncomingCall {
pub room_id: u64,
pub calling_user: Arc<User>,
pub participants: Vec<Arc<User>>,
pub initial_project: Option<proto::ParticipantProject>,
}
/// Singleton global maintaining the user's participation in a room across workspaces.
pub struct ActiveCall {
room: Option<(Entity<Room>, Vec<Subscription>)>,
pending_room_creation: Option<Shared<Task<Result<Entity<Room>, Arc<anyhow::Error>>>>>,
location: Option<WeakEntity<Project>>,
_join_debouncer: OneAtATime,
pending_invites: HashSet<u64>,
incoming_call: (
watch::Sender<Option<IncomingCall>>,
watch::Receiver<Option<IncomingCall>>,
),
client: Arc<Client>,
user_store: Entity<UserStore>,
_subscriptions: Vec<client::Subscription>,
}
impl EventEmitter<Event> for ActiveCall {}
impl ActiveCall {
fn new(client: Arc<Client>, user_store: Entity<UserStore>, cx: &mut Context<Self>) -> Self {
Self {
room: None,
pending_room_creation: None,
location: None,
pending_invites: Default::default(),
incoming_call: watch::channel(),
_join_debouncer: OneAtATime { cancel: None },
_subscriptions: vec![
client.add_request_handler(cx.weak_entity(), Self::handle_incoming_call),
client.add_message_handler(cx.weak_entity(), Self::handle_call_canceled),
],
client,
user_store,
}
}
pub fn channel_id(&self, cx: &App) -> Option<ChannelId> {
self.room()?.read(cx).channel_id()
}
async fn handle_incoming_call(
this: Entity<Self>,
envelope: TypedEnvelope<proto::IncomingCall>,
mut cx: AsyncApp,
) -> Result<proto::Ack> {
let user_store = this.update(&mut cx, |this, _| this.user_store.clone())?;
let call = IncomingCall {
room_id: envelope.payload.room_id,
participants: user_store
.update(&mut cx, |user_store, cx| {
user_store.get_users(envelope.payload.participant_user_ids, cx)
})?
.await?,
calling_user: user_store
.update(&mut cx, |user_store, cx| {
user_store.get_user(envelope.payload.calling_user_id, cx)
})?
.await?,
initial_project: envelope.payload.initial_project,
};
this.update(&mut cx, |this, _| {
*this.incoming_call.0.borrow_mut() = Some(call);
})?;
Ok(proto::Ack {})
}
async fn handle_call_canceled(
this: Entity<Self>,
envelope: TypedEnvelope<proto::CallCanceled>,
mut cx: AsyncApp,
) -> Result<()> {
this.update(&mut cx, |this, _| {
let mut incoming_call = this.incoming_call.0.borrow_mut();
if incoming_call
.as_ref()
.map_or(false, |call| call.room_id == envelope.payload.room_id)
{
incoming_call.take();
}
})?;
Ok(())
}
pub fn global(cx: &App) -> Entity<Self> {
cx.global::<GlobalActiveCall>().0.clone()
}
pub fn try_global(cx: &App) -> Option<Entity<Self>> {
cx.try_global::<GlobalActiveCall>()
.map(|call| call.0.clone())
}
pub fn invite(
&mut self,
called_user_id: u64,
initial_project: Option<Entity<Project>>,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
if !self.pending_invites.insert(called_user_id) {
return Task::ready(Err(anyhow!("user was already invited")));
}
cx.notify();
if self._join_debouncer.running() {
return Task::ready(Ok(()));
}
let room = if let Some(room) = self.room().cloned() {
Some(Task::ready(Ok(room)).shared())
} else {
self.pending_room_creation.clone()
};
let invite = if let Some(room) = room {
cx.spawn(async move |_, cx| {
let room = room.await.map_err(|err| anyhow!("{:?}", err))?;
let initial_project_id = if let Some(initial_project) = initial_project {
Some(
room.update(cx, |room, cx| room.share_project(initial_project, cx))?
.await?,
)
} else {
None
};
room.update(cx, move |room, cx| {
room.call(called_user_id, initial_project_id, cx)
})?
.await?;
anyhow::Ok(())
})
} else {
let client = self.client.clone();
let user_store = self.user_store.clone();
let room = cx
.spawn(async move |this, cx| {
let create_room = async {
let room = cx
.update(|cx| {
Room::create(
called_user_id,
initial_project,
client,
user_store,
cx,
)
})?
.await?;
this.update(cx, |this, cx| this.set_room(Some(room.clone()), cx))?
.await?;
anyhow::Ok(room)
};
let room = create_room.await;
this.update(cx, |this, _| this.pending_room_creation = None)?;
room.map_err(Arc::new)
})
.shared();
self.pending_room_creation = Some(room.clone());
cx.background_spawn(async move {
room.await.map_err(|err| anyhow!("{:?}", err))?;
anyhow::Ok(())
})
};
cx.spawn(async move |this, cx| {
let result = invite.await;
if result.is_ok() {
this.update(cx, |this, cx| {
this.report_call_event("Participant Invited", cx)
})?;
} else {
//TODO: report collaboration error
log::error!("invite failed: {:?}", result);
}
this.update(cx, |this, cx| {
this.pending_invites.remove(&called_user_id);
cx.notify();
})?;
result
})
}
pub fn cancel_invite(
&mut self,
called_user_id: u64,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let room_id = if let Some(room) = self.room() {
room.read(cx).id()
} else {
return Task::ready(Err(anyhow!("no active call")));
};
let client = self.client.clone();
cx.background_spawn(async move {
client
.request(proto::CancelCall {
room_id,
called_user_id,
})
.await?;
anyhow::Ok(())
})
}
pub fn incoming(&self) -> watch::Receiver<Option<IncomingCall>> {
self.incoming_call.1.clone()
}
pub fn accept_incoming(&mut self, cx: &mut Context<Self>) -> Task<Result<()>> {
if self.room.is_some() {
return Task::ready(Err(anyhow!("cannot join while on another call")));
}
let call = if let Some(call) = self.incoming_call.0.borrow_mut().take() {
call
} else {
return Task::ready(Err(anyhow!("no incoming call")));
};
if self.pending_room_creation.is_some() {
return Task::ready(Ok(()));
}
let room_id = call.room_id;
let client = self.client.clone();
let user_store = self.user_store.clone();
let join = self._join_debouncer.spawn(cx, move |mut cx| async move {
Room::join(room_id, client, user_store, &mut cx).await
});
cx.spawn(async move |this, cx| {
let room = join.await?;
this.update(cx, |this, cx| this.set_room(room.clone(), cx))?
.await?;
this.update(cx, |this, cx| {
this.report_call_event("Incoming Call Accepted", cx)
})?;
Ok(())
})
}
pub fn decline_incoming(&mut self, _: &mut Context<Self>) -> Result<()> {
let call = self
.incoming_call
.0
.borrow_mut()
.take()
.ok_or_else(|| anyhow!("no incoming call"))?;
telemetry::event!("Incoming Call Declined", room_id = call.room_id);
self.client.send(proto::DeclineCall {
room_id: call.room_id,
})?;
Ok(())
}
pub fn join_channel(
&mut self,
channel_id: ChannelId,
cx: &mut Context<Self>,
) -> Task<Result<Option<Entity<Room>>>> {
if let Some(room) = self.room().cloned() {
if room.read(cx).channel_id() == Some(channel_id) {
return Task::ready(Ok(Some(room)));
} else {
room.update(cx, |room, cx| room.clear_state(cx));
}
}
if self.pending_room_creation.is_some() {
return Task::ready(Ok(None));
}
let client = self.client.clone();
let user_store = self.user_store.clone();
let join = self._join_debouncer.spawn(cx, move |mut cx| async move {
Room::join_channel(channel_id, client, user_store, &mut cx).await
});
cx.spawn(async move |this, cx| {
let room = join.await?;
this.update(cx, |this, cx| this.set_room(room.clone(), cx))?
.await?;
this.update(cx, |this, cx| this.report_call_event("Channel Joined", cx))?;
Ok(room)
})
}
pub fn hang_up(&mut self, cx: &mut Context<Self>) -> Task<Result<()>> {
cx.notify();
self.report_call_event("Call Ended", cx);
Audio::end_call(cx);
let channel_id = self.channel_id(cx);
if let Some((room, _)) = self.room.take() {
cx.emit(Event::RoomLeft { channel_id });
room.update(cx, |room, cx| room.leave(cx))
} else {
Task::ready(Ok(()))
}
}
pub fn share_project(
&mut self,
project: Entity<Project>,
cx: &mut Context<Self>,
) -> Task<Result<u64>> {
if let Some((room, _)) = self.room.as_ref() {
self.report_call_event("Project Shared", cx);
room.update(cx, |room, cx| room.share_project(project, cx))
} else {
Task::ready(Err(anyhow!("no active call")))
}
}
pub fn unshare_project(
&mut self,
project: Entity<Project>,
cx: &mut Context<Self>,
) -> Result<()> {
if let Some((room, _)) = self.room.as_ref() {
self.report_call_event("Project Unshared", cx);
room.update(cx, |room, cx| room.unshare_project(project, cx))
} else {
Err(anyhow!("no active call"))
}
}
pub fn location(&self) -> Option<&WeakEntity<Project>> {
self.location.as_ref()
}
pub fn set_location(
&mut self,
project: Option<&Entity<Project>>,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
if project.is_some() || !*ZED_ALWAYS_ACTIVE {
self.location = project.map(|project| project.downgrade());
if let Some((room, _)) = self.room.as_ref() {
return room.update(cx, |room, cx| room.set_location(project, cx));
}
}
Task::ready(Ok(()))
}
fn set_room(&mut self, room: Option<Entity<Room>>, cx: &mut Context<Self>) -> Task<Result<()>> {
if room.as_ref() == self.room.as_ref().map(|room| &room.0) {
Task::ready(Ok(()))
} else {
cx.notify();
if let Some(room) = room {
if room.read(cx).status().is_offline() {
self.room = None;
Task::ready(Ok(()))
} else {
let subscriptions = vec![
cx.observe(&room, |this, room, cx| {
if room.read(cx).status().is_offline() {
this.set_room(None, cx).detach_and_log_err(cx);
}
cx.notify();
}),
cx.subscribe(&room, |_, _, event, cx| cx.emit(event.clone())),
];
self.room = Some((room.clone(), subscriptions));
let location = self
.location
.as_ref()
.and_then(|location| location.upgrade());
let channel_id = room.read(cx).channel_id();
cx.emit(Event::RoomJoined { channel_id });
room.update(cx, |room, cx| room.set_location(location.as_ref(), cx))
}
} else {
self.room = None;
Task::ready(Ok(()))
}
}
}
pub fn room(&self) -> Option<&Entity<Room>> {
self.room.as_ref().map(|(room, _)| room)
}
pub fn client(&self) -> Arc<Client> {
self.client.clone()
}
pub fn pending_invites(&self) -> &HashSet<u64> {
&self.pending_invites
}
pub fn report_call_event(&self, operation: &'static str, cx: &mut App) {
if let Some(room) = self.room() {
let room = room.read(cx);
telemetry::event!(
operation,
room_id = room.id(),
channel_id = room.channel_id()
);
}
}
}
#[cfg(test)]
mod test {
use gpui::TestAppContext;
use crate::OneAtATime;
#[gpui::test]
async fn test_one_at_a_time(cx: &mut TestAppContext) {
let mut one_at_a_time = OneAtATime { cancel: None };
assert_eq!(
cx.update(|cx| one_at_a_time.spawn(cx, |_| async { Ok(1) }))
.await
.unwrap(),
Some(1)
);
let (a, b) = cx.update(|cx| {
(
one_at_a_time.spawn(cx, |_| async {
panic!("");
}),
one_at_a_time.spawn(cx, |_| async { Ok(3) }),
)
});
assert_eq!(a.await.unwrap(), None::<u32>);
assert_eq!(b.await.unwrap(), Some(3));
let promise = cx.update(|cx| one_at_a_time.spawn(cx, |_| async { Ok(4) }));
drop(one_at_a_time);
assert_eq!(promise.await.unwrap(), None);
}
}
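The `OneAtATime` debouncer in the deleted file above cancels any in-flight task by replacing a stored oneshot sender: dropping the old sender fires the old task's cancellation branch in `select_biased!`, so the superseded task resolves to `Ok(None)`. A minimal stdlib sketch of just that cancellation mechanism, substituting `std::sync::mpsc` for the async oneshot channel (names are illustrative):

```rust
use std::sync::mpsc;

// Sketch of the cancel-previous-task idea: each new task replaces the
// stored sender, and the replaced sender's drop signals the old task.
struct OneAtATime {
    cancel: Option<mpsc::Sender<()>>,
}

impl OneAtATime {
    // Begin a new "task", returning the receiver the task would watch
    // for cancellation.
    fn begin(&mut self) -> mpsc::Receiver<()> {
        let (tx, rx) = mpsc::channel();
        // Replacing drops the previous sender, cancelling its task.
        self.cancel.replace(tx);
        rx
    }
}

fn main() {
    let mut gate = OneAtATime { cancel: None };
    let first = gate.begin();
    let second = gate.begin(); // supersedes `first`

    // `first`'s sender was dropped, so its channel is disconnected:
    // the first task would observe this and resolve to Ok(None).
    assert!(first.recv().is_err());

    // The current task's sender is still held, so it is not cancelled
    // (try_recv sees an empty-but-connected channel).
    assert_eq!(second.try_recv(), Err(mpsc::TryRecvError::Empty));
}
```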

File diff suppressed because it is too large

View File

@@ -706,10 +706,11 @@ mod mac_os {
use anyhow::{anyhow, Context as _, Result};
use core_foundation::{
array::{CFArray, CFIndex},
base::TCFType as _,
string::kCFStringEncodingUTF8,
url::{CFURLCreateWithBytes, CFURL},
};
use core_services::{kLSLaunchDefaults, LSLaunchURLSpec, LSOpenFromURLSpec, TCFType};
use core_services::{kLSLaunchDefaults, LSLaunchURLSpec, LSOpenFromURLSpec};
use serde::Deserialize;
use std::{
ffi::OsStr,
@@ -736,7 +737,6 @@ mod mac_os {
},
LocalPath {
executable: PathBuf,
plist: InfoPlist,
},
}
@@ -773,34 +773,16 @@ mod mac_os {
plist,
})
}
_ => {
println!("Bundle path {bundle_path:?} has no *.app extension, attempting to locate a dev build");
let plist_path = bundle_path
.parent()
.with_context(|| format!("Bundle path {bundle_path:?} has no parent"))?
.join("WebRTC.framework/Resources/Info.plist");
let plist =
plist::from_file::<_, InfoPlist>(&plist_path).with_context(|| {
format!("Reading dev bundle plist file at {plist_path:?}")
})?;
Ok(Bundle::LocalPath {
executable: bundle_path,
plist,
})
}
_ => Ok(Bundle::LocalPath {
executable: bundle_path,
}),
}
}
}
impl InstalledApp for Bundle {
fn zed_version_string(&self) -> String {
let is_dev = matches!(self, Self::LocalPath { .. });
format!(
"Zed {}{} {}",
self.plist().bundle_short_version_string,
if is_dev { " (dev)" } else { "" },
self.path().display(),
)
format!("Zed {} {}", self.version(), self.path().display(),)
}
fn launch(&self, url: String) -> anyhow::Result<()> {
@@ -879,10 +861,10 @@ mod mac_os {
}
impl Bundle {
fn plist(&self) -> &InfoPlist {
fn version(&self) -> String {
match self {
Self::App { plist, .. } => plist,
Self::LocalPath { plist, .. } => plist,
Self::App { plist, .. } => plist.bundle_short_version_string.clone(),
Self::LocalPath { .. } => "<development>".to_string(),
}
}


@@ -100,13 +100,15 @@ extension.workspace = true
file_finder.workspace = true
fs = { workspace = true, features = ["test-support"] }
git = { workspace = true, features = ["test-support"] }
git_ui = { workspace = true, features = ["test-support"] }
git_hosting_providers.workspace = true
git_ui = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
gpui_tokio.workspace = true
hyper.workspace = true
indoc.workspace = true
language = { workspace = true, features = ["test-support"] }
language_model = { workspace = true, features = ["test-support"] }
livekit_client = { workspace = true, features = ["test-support"] }
lsp = { workspace = true, features = ["test-support"] }
menu.workspace = true
multi_buffer = { workspace = true, features = ["test-support"] }
@@ -131,11 +133,5 @@ util.workspace = true
workspace = { workspace = true, features = ["test-support"] }
worktree = { workspace = true, features = ["test-support"] }
[target.'cfg(target_os = "macos")'.dev-dependencies]
livekit_client_macos = { workspace = true, features = ["test-support"] }
[target.'cfg(not(target_os = "macos"))'.dev-dependencies]
livekit_client = { workspace = true, features = ["test-support"] }
[package.metadata.cargo-machete]
ignored = ["async-stripe"]


@@ -15,9 +15,13 @@ CREATE TABLE "users" (
"github_user_created_at" TIMESTAMP WITHOUT TIME ZONE,
"custom_llm_monthly_allowance_in_cents" INTEGER
);
CREATE UNIQUE INDEX "index_users_github_login" ON "users" ("github_login");
CREATE UNIQUE INDEX "index_invite_code_users" ON "users" ("invite_code");
CREATE INDEX "index_users_on_email_address" ON "users" ("email_address");
CREATE UNIQUE INDEX "index_users_on_github_user_id" ON "users" ("github_user_id");
CREATE TABLE "access_tokens" (
@@ -26,6 +30,7 @@ CREATE TABLE "access_tokens" (
"impersonated_user_id" INTEGER REFERENCES users (id),
"hash" VARCHAR(128)
);
CREATE INDEX "index_access_tokens_user_id" ON "access_tokens" ("user_id");
CREATE TABLE "contacts" (
@@ -36,7 +41,9 @@ CREATE TABLE "contacts" (
"should_notify" BOOLEAN NOT NULL,
"accepted" BOOLEAN NOT NULL
);
CREATE UNIQUE INDEX "index_contacts_user_ids" ON "contacts" ("user_id_a", "user_id_b");
CREATE INDEX "index_contacts_user_id_b" ON "contacts" ("user_id_b");
CREATE TABLE "rooms" (
@@ -45,6 +52,7 @@ CREATE TABLE "rooms" (
"environment" VARCHAR,
"channel_id" INTEGER REFERENCES channels (id) ON DELETE CASCADE
);
CREATE UNIQUE INDEX "index_rooms_on_channel_id" ON "rooms" ("channel_id");
CREATE TABLE "projects" (
@@ -55,7 +63,9 @@ CREATE TABLE "projects" (
"host_connection_server_id" INTEGER REFERENCES servers (id) ON DELETE CASCADE,
"unregistered" BOOLEAN NOT NULL DEFAULT FALSE
);
CREATE INDEX "index_projects_on_host_connection_server_id" ON "projects" ("host_connection_server_id");
CREATE INDEX "index_projects_on_host_connection_id_and_host_connection_server_id" ON "projects" ("host_connection_id", "host_connection_server_id");
CREATE TABLE "worktrees" (
@@ -67,8 +77,9 @@ CREATE TABLE "worktrees" (
"scan_id" INTEGER NOT NULL,
"is_complete" BOOL NOT NULL DEFAULT FALSE,
"completed_scan_id" INTEGER NOT NULL,
PRIMARY KEY(project_id, id)
PRIMARY KEY (project_id, id)
);
CREATE INDEX "index_worktrees_on_project_id" ON "worktrees" ("project_id");
CREATE TABLE "worktree_entries" (
@@ -87,32 +98,33 @@ CREATE TABLE "worktree_entries" (
"is_deleted" BOOL NOT NULL,
"git_status" INTEGER,
"is_fifo" BOOL NOT NULL,
PRIMARY KEY(project_id, worktree_id, id),
FOREIGN KEY(project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE
PRIMARY KEY (project_id, worktree_id, id),
FOREIGN KEY (project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE
);
CREATE INDEX "index_worktree_entries_on_project_id" ON "worktree_entries" ("project_id");
CREATE INDEX "index_worktree_entries_on_project_id_and_worktree_id" ON "worktree_entries" ("project_id", "worktree_id");
CREATE TABLE "worktree_repositories" (
CREATE TABLE "project_repositories" (
"project_id" INTEGER NOT NULL,
"worktree_id" INTEGER NOT NULL,
"work_directory_id" INTEGER NOT NULL,
"abs_path" VARCHAR,
"id" INTEGER NOT NULL,
"entry_ids" VARCHAR,
"legacy_worktree_id" INTEGER,
"branch" VARCHAR,
"scan_id" INTEGER NOT NULL,
"is_deleted" BOOL NOT NULL,
"current_merge_conflicts" VARCHAR,
"branch_summary" VARCHAR,
PRIMARY KEY(project_id, worktree_id, work_directory_id),
FOREIGN KEY(project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE,
FOREIGN KEY(project_id, worktree_id, work_directory_id) REFERENCES worktree_entries (project_id, worktree_id, id) ON DELETE CASCADE
PRIMARY KEY (project_id, id)
);
CREATE INDEX "index_worktree_repositories_on_project_id" ON "worktree_repositories" ("project_id");
CREATE INDEX "index_worktree_repositories_on_project_id_and_worktree_id" ON "worktree_repositories" ("project_id", "worktree_id");
CREATE TABLE "worktree_repository_statuses" (
CREATE INDEX "index_project_repositories_on_project_id" ON "project_repositories" ("project_id");
CREATE TABLE "project_repository_statuses" (
"project_id" INTEGER NOT NULL,
"worktree_id" INT8 NOT NULL,
"work_directory_id" INT8 NOT NULL,
"repository_id" INTEGER NOT NULL,
"repo_path" VARCHAR NOT NULL,
"status" INT8 NOT NULL,
"status_kind" INT4 NOT NULL,
@@ -120,13 +132,12 @@ CREATE TABLE "worktree_repository_statuses" (
"second_status" INT4 NULL,
"scan_id" INT8 NOT NULL,
"is_deleted" BOOL NOT NULL,
PRIMARY KEY(project_id, worktree_id, work_directory_id, repo_path),
FOREIGN KEY(project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE,
FOREIGN KEY(project_id, worktree_id, work_directory_id) REFERENCES worktree_entries (project_id, worktree_id, id) ON DELETE CASCADE
PRIMARY KEY (project_id, repository_id, repo_path)
);
CREATE INDEX "index_wt_repos_statuses_on_project_id" ON "worktree_repository_statuses" ("project_id");
CREATE INDEX "index_wt_repos_statuses_on_project_id_and_wt_id" ON "worktree_repository_statuses" ("project_id", "worktree_id");
CREATE INDEX "index_wt_repos_statuses_on_project_id_and_wt_id_and_wd_id" ON "worktree_repository_statuses" ("project_id", "worktree_id", "work_directory_id");
CREATE INDEX "index_project_repos_statuses_on_project_id" ON "project_repository_statuses" ("project_id");
CREATE INDEX "index_project_repos_statuses_on_project_id_and_repo_id" ON "project_repository_statuses" ("project_id", "repository_id");
CREATE TABLE "worktree_settings_files" (
"project_id" INTEGER NOT NULL,
@@ -134,10 +145,12 @@ CREATE TABLE "worktree_settings_files" (
"path" VARCHAR NOT NULL,
"content" TEXT,
"kind" VARCHAR,
PRIMARY KEY(project_id, worktree_id, path),
FOREIGN KEY(project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE
PRIMARY KEY (project_id, worktree_id, path),
FOREIGN KEY (project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE
);
CREATE INDEX "index_worktree_settings_files_on_project_id" ON "worktree_settings_files" ("project_id");
CREATE INDEX "index_worktree_settings_files_on_project_id_and_worktree_id" ON "worktree_settings_files" ("project_id", "worktree_id");
CREATE TABLE "worktree_diagnostic_summaries" (
@@ -147,18 +160,21 @@ CREATE TABLE "worktree_diagnostic_summaries" (
"language_server_id" INTEGER NOT NULL,
"error_count" INTEGER NOT NULL,
"warning_count" INTEGER NOT NULL,
PRIMARY KEY(project_id, worktree_id, path),
FOREIGN KEY(project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE
PRIMARY KEY (project_id, worktree_id, path),
FOREIGN KEY (project_id, worktree_id) REFERENCES worktrees (project_id, id) ON DELETE CASCADE
);
CREATE INDEX "index_worktree_diagnostic_summaries_on_project_id" ON "worktree_diagnostic_summaries" ("project_id");
CREATE INDEX "index_worktree_diagnostic_summaries_on_project_id_and_worktree_id" ON "worktree_diagnostic_summaries" ("project_id", "worktree_id");
CREATE TABLE "language_servers" (
"id" INTEGER NOT NULL,
"project_id" INTEGER NOT NULL REFERENCES projects (id) ON DELETE CASCADE,
"name" VARCHAR NOT NULL,
PRIMARY KEY(project_id, id)
PRIMARY KEY (project_id, id)
);
CREATE INDEX "index_language_servers_on_project_id" ON "language_servers" ("project_id");
CREATE TABLE "project_collaborators" (
@@ -170,11 +186,20 @@ CREATE TABLE "project_collaborators" (
"replica_id" INTEGER NOT NULL,
"is_host" BOOLEAN NOT NULL
);
CREATE INDEX "index_project_collaborators_on_project_id" ON "project_collaborators" ("project_id");
CREATE UNIQUE INDEX "index_project_collaborators_on_project_id_and_replica_id" ON "project_collaborators" ("project_id", "replica_id");
CREATE INDEX "index_project_collaborators_on_connection_server_id" ON "project_collaborators" ("connection_server_id");
CREATE INDEX "index_project_collaborators_on_connection_id" ON "project_collaborators" ("connection_id");
CREATE UNIQUE INDEX "index_project_collaborators_on_project_id_connection_id_and_server_id" ON "project_collaborators" ("project_id", "connection_id", "connection_server_id");
CREATE UNIQUE INDEX "index_project_collaborators_on_project_id_connection_id_and_server_id" ON "project_collaborators" (
"project_id",
"connection_id",
"connection_server_id"
);
CREATE TABLE "room_participants" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
@@ -193,12 +218,21 @@ CREATE TABLE "room_participants" (
"role" TEXT,
"in_call" BOOLEAN NOT NULL DEFAULT FALSE
);
CREATE UNIQUE INDEX "index_room_participants_on_user_id" ON "room_participants" ("user_id");
CREATE INDEX "index_room_participants_on_room_id" ON "room_participants" ("room_id");
CREATE INDEX "index_room_participants_on_answering_connection_server_id" ON "room_participants" ("answering_connection_server_id");
CREATE INDEX "index_room_participants_on_calling_connection_server_id" ON "room_participants" ("calling_connection_server_id");
CREATE INDEX "index_room_participants_on_answering_connection_id" ON "room_participants" ("answering_connection_id");
CREATE UNIQUE INDEX "index_room_participants_on_answering_connection_id_and_answering_connection_server_id" ON "room_participants" ("answering_connection_id", "answering_connection_server_id");
CREATE UNIQUE INDEX "index_room_participants_on_answering_connection_id_and_answering_connection_server_id" ON "room_participants" (
"answering_connection_id",
"answering_connection_server_id"
);
CREATE TABLE "servers" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
@@ -214,9 +248,15 @@ CREATE TABLE "followers" (
"follower_connection_server_id" INTEGER NOT NULL REFERENCES servers (id) ON DELETE CASCADE,
"follower_connection_id" INTEGER NOT NULL
);
CREATE UNIQUE INDEX
"index_followers_on_project_id_and_leader_connection_server_id_and_leader_connection_id_and_follower_connection_server_id_and_follower_connection_id"
ON "followers" ("project_id", "leader_connection_server_id", "leader_connection_id", "follower_connection_server_id", "follower_connection_id");
CREATE UNIQUE INDEX "index_followers_on_project_id_and_leader_connection_server_id_and_leader_connection_id_and_follower_connection_server_id_and_follower_connection_id" ON "followers" (
"project_id",
"leader_connection_server_id",
"leader_connection_id",
"follower_connection_server_id",
"follower_connection_id"
);
CREATE INDEX "index_followers_on_room_id" ON "followers" ("room_id");
CREATE TABLE "channels" (
@@ -237,6 +277,7 @@ CREATE TABLE IF NOT EXISTS "channel_chat_participants" (
"connection_id" INTEGER NOT NULL,
"connection_server_id" INTEGER NOT NULL REFERENCES servers (id) ON DELETE CASCADE
);
CREATE INDEX "index_channel_chat_participants_on_channel_id" ON "channel_chat_participants" ("channel_id");
CREATE TABLE IF NOT EXISTS "channel_messages" (
@@ -249,7 +290,9 @@ CREATE TABLE IF NOT EXISTS "channel_messages" (
"nonce" BLOB NOT NULL,
"reply_to_message_id" INTEGER DEFAULT NULL
);
CREATE INDEX "index_channel_messages_on_channel_id" ON "channel_messages" ("channel_id");
CREATE UNIQUE INDEX "index_channel_messages_on_sender_id_nonce" ON "channel_messages" ("sender_id", "nonce");
CREATE TABLE "channel_message_mentions" (
@@ -257,7 +300,7 @@ CREATE TABLE "channel_message_mentions" (
"start_offset" INTEGER NOT NULL,
"end_offset" INTEGER NOT NULL,
"user_id" INTEGER NOT NULL REFERENCES users (id) ON DELETE CASCADE,
PRIMARY KEY(message_id, start_offset)
PRIMARY KEY (message_id, start_offset)
);
CREATE TABLE "channel_members" (
@@ -288,7 +331,7 @@ CREATE TABLE "buffer_operations" (
"replica_id" INTEGER NOT NULL,
"lamport_timestamp" INTEGER NOT NULL,
"value" BLOB NOT NULL,
PRIMARY KEY(buffer_id, epoch, lamport_timestamp, replica_id)
PRIMARY KEY (buffer_id, epoch, lamport_timestamp, replica_id)
);
CREATE TABLE "buffer_snapshots" (
@@ -296,7 +339,7 @@ CREATE TABLE "buffer_snapshots" (
"epoch" INTEGER NOT NULL,
"text" TEXT NOT NULL,
"operation_serialization_version" INTEGER NOT NULL,
PRIMARY KEY(buffer_id, epoch)
PRIMARY KEY (buffer_id, epoch)
);
CREATE TABLE "channel_buffer_collaborators" (
@@ -310,11 +353,18 @@ CREATE TABLE "channel_buffer_collaborators" (
);
CREATE INDEX "index_channel_buffer_collaborators_on_channel_id" ON "channel_buffer_collaborators" ("channel_id");
CREATE UNIQUE INDEX "index_channel_buffer_collaborators_on_channel_id_and_replica_id" ON "channel_buffer_collaborators" ("channel_id", "replica_id");
CREATE INDEX "index_channel_buffer_collaborators_on_connection_server_id" ON "channel_buffer_collaborators" ("connection_server_id");
CREATE INDEX "index_channel_buffer_collaborators_on_connection_id" ON "channel_buffer_collaborators" ("connection_id");
CREATE UNIQUE INDEX "index_channel_buffer_collaborators_on_channel_id_connection_id_and_server_id" ON "channel_buffer_collaborators" ("channel_id", "connection_id", "connection_server_id");
CREATE UNIQUE INDEX "index_channel_buffer_collaborators_on_channel_id_and_replica_id" ON "channel_buffer_collaborators" ("channel_id", "replica_id");
CREATE INDEX "index_channel_buffer_collaborators_on_connection_server_id" ON "channel_buffer_collaborators" ("connection_server_id");
CREATE INDEX "index_channel_buffer_collaborators_on_connection_id" ON "channel_buffer_collaborators" ("connection_id");
CREATE UNIQUE INDEX "index_channel_buffer_collaborators_on_channel_id_connection_id_and_server_id" ON "channel_buffer_collaborators" (
"channel_id",
"connection_id",
"connection_server_id"
);
CREATE TABLE "feature_flags" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
@@ -324,7 +374,6 @@ CREATE TABLE "feature_flags" (
CREATE INDEX "index_feature_flags" ON "feature_flags" ("id");
CREATE TABLE "user_features" (
"user_id" INTEGER NOT NULL REFERENCES users (id) ON DELETE CASCADE,
"feature_id" INTEGER NOT NULL REFERENCES feature_flags (id) ON DELETE CASCADE,
@@ -332,9 +381,10 @@ CREATE TABLE "user_features" (
);
CREATE UNIQUE INDEX "index_user_features_user_id_and_feature_id" ON "user_features" ("user_id", "feature_id");
CREATE INDEX "index_user_features_on_user_id" ON "user_features" ("user_id");
CREATE INDEX "index_user_features_on_feature_id" ON "user_features" ("feature_id");
CREATE INDEX "index_user_features_on_user_id" ON "user_features" ("user_id");
CREATE INDEX "index_user_features_on_feature_id" ON "user_features" ("feature_id");
CREATE TABLE "observed_buffer_edits" (
"user_id" INTEGER NOT NULL REFERENCES users (id) ON DELETE CASCADE,
@@ -374,13 +424,10 @@ CREATE TABLE "notifications" (
"response" BOOLEAN
);
CREATE INDEX
"index_notifications_on_recipient_id_is_read_kind_entity_id"
ON "notifications"
("recipient_id", "is_read", "kind", "entity_id");
CREATE INDEX "index_notifications_on_recipient_id_is_read_kind_entity_id" ON "notifications" ("recipient_id", "is_read", "kind", "entity_id");
CREATE TABLE contributors (
user_id INTEGER REFERENCES users(id),
user_id INTEGER REFERENCES users (id),
signed_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (user_id)
);
@@ -394,7 +441,7 @@ CREATE TABLE extensions (
);
CREATE TABLE extension_versions (
extension_id INTEGER REFERENCES extensions(id),
extension_id INTEGER REFERENCES extensions (id),
version TEXT NOT NULL,
published_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
authors TEXT NOT NULL,
@@ -416,6 +463,7 @@ CREATE TABLE extension_versions (
);
CREATE UNIQUE INDEX "index_extensions_external_id" ON "extensions" ("external_id");
CREATE INDEX "index_extensions_total_download_count" ON "extensions" ("total_download_count");
CREATE TABLE rate_buckets (
@@ -424,14 +472,15 @@ CREATE TABLE rate_buckets (
token_count INT NOT NULL,
last_refill TIMESTAMP WITHOUT TIME ZONE NOT NULL,
PRIMARY KEY (user_id, rate_limit_name),
FOREIGN KEY (user_id) REFERENCES users(id)
FOREIGN KEY (user_id) REFERENCES users (id)
);
CREATE INDEX idx_user_id_rate_limit ON rate_buckets (user_id, rate_limit_name);
CREATE TABLE IF NOT EXISTS billing_preferences (
id INTEGER PRIMARY KEY AUTOINCREMENT,
created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
user_id INTEGER NOT NULL REFERENCES users(id),
user_id INTEGER NOT NULL REFERENCES users (id),
max_monthly_llm_usage_spending_in_cents INTEGER NOT NULL
);
@@ -440,18 +489,19 @@ CREATE UNIQUE INDEX "uix_billing_preferences_on_user_id" ON billing_preferences
CREATE TABLE IF NOT EXISTS billing_customers (
id INTEGER PRIMARY KEY AUTOINCREMENT,
created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
user_id INTEGER NOT NULL REFERENCES users(id),
user_id INTEGER NOT NULL REFERENCES users (id),
has_overdue_invoices BOOLEAN NOT NULL DEFAULT FALSE,
stripe_customer_id TEXT NOT NULL
);
CREATE UNIQUE INDEX "uix_billing_customers_on_user_id" ON billing_customers (user_id);
CREATE UNIQUE INDEX "uix_billing_customers_on_stripe_customer_id" ON billing_customers (stripe_customer_id);
CREATE TABLE IF NOT EXISTS billing_subscriptions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
billing_customer_id INTEGER NOT NULL REFERENCES billing_customers(id),
billing_customer_id INTEGER NOT NULL REFERENCES billing_customers (id),
stripe_subscription_id TEXT NOT NULL,
stripe_subscription_status TEXT NOT NULL,
stripe_cancel_at TIMESTAMP,
@@ -459,6 +509,7 @@ CREATE TABLE IF NOT EXISTS billing_subscriptions (
);
CREATE INDEX "ix_billing_subscriptions_on_billing_customer_id" ON billing_subscriptions (billing_customer_id);
CREATE UNIQUE INDEX "uix_billing_subscriptions_on_stripe_subscription_id" ON billing_subscriptions (stripe_subscription_id);
CREATE TABLE IF NOT EXISTS processed_stripe_events (
@@ -479,4 +530,5 @@ CREATE TABLE IF NOT EXISTS "breakpoints" (
"path" TEXT NOT NULL,
"kind" VARCHAR NOT NULL
);
CREATE INDEX "index_breakpoints_on_project_id" ON "breakpoints" ("project_id");


@@ -0,0 +1,32 @@
CREATE TABLE "project_repositories" (
"project_id" INTEGER NOT NULL,
"abs_path" VARCHAR,
"id" INT8 NOT NULL,
"legacy_worktree_id" INT8,
"entry_ids" VARCHAR,
"branch" VARCHAR,
"scan_id" INT8 NOT NULL,
"is_deleted" BOOL NOT NULL,
"current_merge_conflicts" VARCHAR,
"branch_summary" VARCHAR,
PRIMARY KEY (project_id, id)
);
CREATE INDEX "index_project_repositories_on_project_id" ON "project_repositories" ("project_id");
CREATE TABLE "project_repository_statuses" (
"project_id" INTEGER NOT NULL,
"repository_id" INT8 NOT NULL,
"repo_path" VARCHAR NOT NULL,
"status" INT8 NOT NULL,
"status_kind" INT4 NOT NULL,
"first_status" INT4 NULL,
"second_status" INT4 NULL,
"scan_id" INT8 NOT NULL,
"is_deleted" BOOL NOT NULL,
PRIMARY KEY (project_id, repository_id, repo_path)
);
CREATE INDEX "index_project_repos_statuses_on_project_id" ON "project_repository_statuses" ("project_id");
CREATE INDEX "index_project_repos_statuses_on_project_id_and_repo_id" ON "project_repository_statuses" ("project_id", "repository_id");


@@ -9,6 +9,7 @@ use anyhow::anyhow;
use collections::{BTreeMap, BTreeSet, HashMap, HashSet};
use dashmap::DashMap;
use futures::StreamExt;
use project_repository_statuses::StatusKind;
use rand::{prelude::StdRng, Rng, SeedableRng};
use rpc::ExtensionProvides;
use rpc::{
@@ -36,7 +37,6 @@ use std::{
};
use time::PrimitiveDateTime;
use tokio::sync::{Mutex, OwnedMutexGuard};
use worktree_repository_statuses::StatusKind;
use worktree_settings_file::LocalSettingsKind;
#[cfg(test)]
@@ -658,6 +658,8 @@ pub struct RejoinedProject {
pub old_connection_id: ConnectionId,
pub collaborators: Vec<ProjectCollaborator>,
pub worktrees: Vec<RejoinedWorktree>,
pub updated_repositories: Vec<proto::UpdateRepository>,
pub removed_repositories: Vec<u64>,
pub language_servers: Vec<proto::LanguageServer>,
}
@@ -726,6 +728,7 @@ pub struct Project {
pub role: ChannelRole,
pub collaborators: Vec<ProjectCollaborator>,
pub worktrees: BTreeMap<u64, Worktree>,
pub repositories: Vec<proto::UpdateRepository>,
pub language_servers: Vec<proto::LanguageServer>,
}
@@ -760,7 +763,7 @@ pub struct Worktree {
pub root_name: String,
pub visible: bool,
pub entries: Vec<proto::Entry>,
pub repository_entries: BTreeMap<u64, proto::RepositoryEntry>,
pub legacy_repository_entries: BTreeMap<u64, proto::RepositoryEntry>,
pub diagnostic_summaries: Vec<proto::DiagnosticSummary>,
pub settings_files: Vec<WorktreeSettingsFile>,
pub scan_id: u64,
@@ -810,7 +813,7 @@ impl LocalSettingsKind {
}
fn db_status_to_proto(
entry: worktree_repository_statuses::Model,
entry: project_repository_statuses::Model,
) -> anyhow::Result<proto::StatusEntry> {
use proto::git_file_status::{Tracked, Unmerged, Variant};


@@ -324,119 +324,135 @@ impl Database {
.await?;
}
if !update.updated_repositories.is_empty() {
worktree_repository::Entity::insert_many(update.updated_repositories.iter().map(
|repository| {
worktree_repository::ActiveModel {
project_id: ActiveValue::set(project_id),
worktree_id: ActiveValue::set(worktree_id),
work_directory_id: ActiveValue::set(
repository.work_directory_id as i64,
),
scan_id: ActiveValue::set(update.scan_id as i64),
branch: ActiveValue::set(repository.branch.clone()),
is_deleted: ActiveValue::set(false),
branch_summary: ActiveValue::Set(
repository
.branch_summary
.as_ref()
.map(|summary| serde_json::to_string(summary).unwrap()),
),
current_merge_conflicts: ActiveValue::Set(Some(
serde_json::to_string(&repository.current_merge_conflicts).unwrap(),
)),
}
},
))
.on_conflict(
OnConflict::columns([
worktree_repository::Column::ProjectId,
worktree_repository::Column::WorktreeId,
worktree_repository::Column::WorkDirectoryId,
])
.update_columns([
worktree_repository::Column::ScanId,
worktree_repository::Column::Branch,
worktree_repository::Column::BranchSummary,
worktree_repository::Column::CurrentMergeConflicts,
])
.to_owned(),
)
.exec(&*tx)
.await?;
// Backward-compatibility for old Zed clients.
//
// Remove this block when Zed 1.80 stable has been out for a week.
{
if !update.updated_repositories.is_empty() {
project_repository::Entity::insert_many(
update.updated_repositories.iter().map(|repository| {
project_repository::ActiveModel {
project_id: ActiveValue::set(project_id),
legacy_worktree_id: ActiveValue::set(Some(worktree_id)),
id: ActiveValue::set(repository.work_directory_id as i64),
scan_id: ActiveValue::set(update.scan_id as i64),
is_deleted: ActiveValue::set(false),
branch_summary: ActiveValue::Set(
repository
.branch_summary
.as_ref()
.map(|summary| serde_json::to_string(summary).unwrap()),
),
current_merge_conflicts: ActiveValue::Set(Some(
serde_json::to_string(&repository.current_merge_conflicts)
.unwrap(),
)),
let has_any_statuses = update
.updated_repositories
.iter()
.any(|repository| !repository.updated_statuses.is_empty());
if has_any_statuses {
worktree_repository_statuses::Entity::insert_many(
update.updated_repositories.iter().flat_map(
|repository: &proto::RepositoryEntry| {
repository.updated_statuses.iter().map(|status_entry| {
let (repo_path, status_kind, first_status, second_status) =
proto_status_to_db(status_entry.clone());
worktree_repository_statuses::ActiveModel {
project_id: ActiveValue::set(project_id),
worktree_id: ActiveValue::set(worktree_id),
work_directory_id: ActiveValue::set(
repository.work_directory_id as i64,
),
scan_id: ActiveValue::set(update.scan_id as i64),
is_deleted: ActiveValue::set(false),
repo_path: ActiveValue::set(repo_path),
status: ActiveValue::set(0),
status_kind: ActiveValue::set(status_kind),
first_status: ActiveValue::set(first_status),
second_status: ActiveValue::set(second_status),
}
})
},
),
// Old clients do not use abs path or entry ids.
abs_path: ActiveValue::set(String::new()),
entry_ids: ActiveValue::set("[]".into()),
}
}),
)
.on_conflict(
OnConflict::columns([
worktree_repository_statuses::Column::ProjectId,
worktree_repository_statuses::Column::WorktreeId,
worktree_repository_statuses::Column::WorkDirectoryId,
worktree_repository_statuses::Column::RepoPath,
project_repository::Column::ProjectId,
project_repository::Column::Id,
])
.update_columns([
worktree_repository_statuses::Column::ScanId,
worktree_repository_statuses::Column::StatusKind,
worktree_repository_statuses::Column::FirstStatus,
worktree_repository_statuses::Column::SecondStatus,
project_repository::Column::ScanId,
project_repository::Column::BranchSummary,
project_repository::Column::CurrentMergeConflicts,
])
.to_owned(),
)
.exec(&*tx)
.await?;
let has_any_statuses = update
.updated_repositories
.iter()
.any(|repository| !repository.updated_statuses.is_empty());
if has_any_statuses {
project_repository_statuses::Entity::insert_many(
update.updated_repositories.iter().flat_map(
|repository: &proto::RepositoryEntry| {
repository.updated_statuses.iter().map(|status_entry| {
let (repo_path, status_kind, first_status, second_status) =
proto_status_to_db(status_entry.clone());
project_repository_statuses::ActiveModel {
project_id: ActiveValue::set(project_id),
repository_id: ActiveValue::set(
repository.work_directory_id as i64,
),
scan_id: ActiveValue::set(update.scan_id as i64),
is_deleted: ActiveValue::set(false),
repo_path: ActiveValue::set(repo_path),
status: ActiveValue::set(0),
status_kind: ActiveValue::set(status_kind),
first_status: ActiveValue::set(first_status),
second_status: ActiveValue::set(second_status),
}
})
},
),
)
.on_conflict(
OnConflict::columns([
project_repository_statuses::Column::ProjectId,
project_repository_statuses::Column::RepositoryId,
project_repository_statuses::Column::RepoPath,
])
.update_columns([
project_repository_statuses::Column::ScanId,
project_repository_statuses::Column::StatusKind,
project_repository_statuses::Column::FirstStatus,
project_repository_statuses::Column::SecondStatus,
])
.to_owned(),
)
.exec(&*tx)
.await?;
}
for repo in &update.updated_repositories {
if !repo.removed_statuses.is_empty() {
project_repository_statuses::Entity::update_many()
.filter(
project_repository_statuses::Column::ProjectId
.eq(project_id)
.and(
project_repository_statuses::Column::RepositoryId
.eq(repo.work_directory_id),
)
.and(
project_repository_statuses::Column::RepoPath
.is_in(repo.removed_statuses.iter()),
),
)
.set(project_repository_statuses::ActiveModel {
is_deleted: ActiveValue::Set(true),
scan_id: ActiveValue::Set(update.scan_id as i64),
..Default::default()
})
.exec(&*tx)
.await?;
}
}
}
let has_any_removed_statuses = update
.updated_repositories
.iter()
.any(|repository| !repository.removed_statuses.is_empty());
if has_any_removed_statuses {
worktree_repository_statuses::Entity::update_many()
if !update.removed_repositories.is_empty() {
project_repository::Entity::update_many()
.filter(
worktree_repository_statuses::Column::ProjectId
project_repository::Column::ProjectId
.eq(project_id)
.and(
worktree_repository_statuses::Column::WorktreeId
.eq(worktree_id),
)
.and(
worktree_repository_statuses::Column::RepoPath.is_in(
update.updated_repositories.iter().flat_map(|repository| {
repository.removed_statuses.iter()
}),
),
),
.and(project_repository::Column::LegacyWorktreeId.eq(worktree_id))
.and(project_repository::Column::Id.is_in(
update.removed_repositories.iter().map(|id| *id as i64),
)),
)
.set(worktree_repository_statuses::ActiveModel {
.set(project_repository::ActiveModel {
is_deleted: ActiveValue::Set(true),
scan_id: ActiveValue::Set(update.scan_id as i64),
..Default::default()
@@ -446,18 +462,109 @@ impl Database {
}
}
if !update.removed_repositories.is_empty() {
worktree_repository::Entity::update_many()
let connection_ids = self.project_guest_connection_ids(project_id, &tx).await?;
Ok(connection_ids)
})
.await
}
pub async fn update_repository(
&self,
update: &proto::UpdateRepository,
_connection: ConnectionId,
) -> Result<TransactionGuard<Vec<ConnectionId>>> {
let project_id = ProjectId::from_proto(update.project_id);
let repository_id = update.id as i64;
self.project_transaction(project_id, |tx| async move {
project_repository::Entity::insert(project_repository::ActiveModel {
project_id: ActiveValue::set(project_id),
id: ActiveValue::set(repository_id),
legacy_worktree_id: ActiveValue::set(None),
abs_path: ActiveValue::set(update.abs_path.clone()),
entry_ids: ActiveValue::Set(serde_json::to_string(&update.entry_ids).unwrap()),
scan_id: ActiveValue::set(update.scan_id as i64),
is_deleted: ActiveValue::set(false),
branch_summary: ActiveValue::Set(
update
.branch_summary
.as_ref()
.map(|summary| serde_json::to_string(summary).unwrap()),
),
current_merge_conflicts: ActiveValue::Set(Some(
serde_json::to_string(&update.current_merge_conflicts).unwrap(),
)),
})
.on_conflict(
OnConflict::columns([
project_repository::Column::ProjectId,
project_repository::Column::Id,
])
.update_columns([
project_repository::Column::ScanId,
project_repository::Column::BranchSummary,
project_repository::Column::EntryIds,
project_repository::Column::AbsPath,
project_repository::Column::CurrentMergeConflicts,
])
.to_owned(),
)
.exec(&*tx)
.await?;
let has_any_statuses = !update.updated_statuses.is_empty();
if has_any_statuses {
project_repository_statuses::Entity::insert_many(
update.updated_statuses.iter().map(|status_entry| {
let (repo_path, status_kind, first_status, second_status) =
proto_status_to_db(status_entry.clone());
project_repository_statuses::ActiveModel {
project_id: ActiveValue::set(project_id),
repository_id: ActiveValue::set(repository_id),
scan_id: ActiveValue::set(update.scan_id as i64),
is_deleted: ActiveValue::set(false),
repo_path: ActiveValue::set(repo_path),
status: ActiveValue::set(0),
status_kind: ActiveValue::set(status_kind),
first_status: ActiveValue::set(first_status),
second_status: ActiveValue::set(second_status),
}
}),
)
.on_conflict(
OnConflict::columns([
project_repository_statuses::Column::ProjectId,
project_repository_statuses::Column::RepositoryId,
project_repository_statuses::Column::RepoPath,
])
.update_columns([
project_repository_statuses::Column::ScanId,
project_repository_statuses::Column::StatusKind,
project_repository_statuses::Column::FirstStatus,
project_repository_statuses::Column::SecondStatus,
])
.to_owned(),
)
.exec(&*tx)
.await?;
}
let has_any_removed_statuses = !update.removed_statuses.is_empty();
if has_any_removed_statuses {
project_repository_statuses::Entity::update_many()
.filter(
worktree_repository::Column::ProjectId
project_repository_statuses::Column::ProjectId
.eq(project_id)
.and(worktree_repository::Column::WorktreeId.eq(worktree_id))
.and(
worktree_repository::Column::WorkDirectoryId
.is_in(update.removed_repositories.iter().map(|id| *id as i64)),
project_repository_statuses::Column::RepositoryId.eq(repository_id),
)
.and(
project_repository_statuses::Column::RepoPath
.is_in(update.removed_statuses.iter()),
),
)
.set(worktree_repository::ActiveModel {
.set(project_repository_statuses::ActiveModel {
is_deleted: ActiveValue::Set(true),
scan_id: ActiveValue::Set(update.scan_id as i64),
..Default::default()
@@ -472,6 +579,34 @@ impl Database {
.await
}
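The `insert(...).on_conflict(OnConflict::columns(...).update_columns(...))` calls above implement an upsert: insert a row keyed by the composite primary key, or overwrite only the updatable columns when that key already exists. A minimal std-only sketch of the same semantics, using a `HashMap` as a hypothetical stand-in for the `project_repository_statuses` table (names and fields here are illustrative, not the real schema):

```rust
use std::collections::hash_map::Entry;
use std::collections::HashMap;

// Composite key mirroring the table's primary key:
// (project_id, repository_id, repo_path).
type Key = (i64, i64, String);

#[derive(Debug, Clone, PartialEq)]
struct StatusRow {
    scan_id: i64,
    first_status: i32,
    is_deleted: bool,
}

// Insert, or overwrite the updatable columns when the key already exists —
// the in-memory analogue of `on_conflict(...).update_columns(...)`.
fn upsert(table: &mut HashMap<Key, StatusRow>, key: Key, row: StatusRow) {
    match table.entry(key) {
        Entry::Occupied(mut occupied) => {
            // Conflict: update the existing row in place.
            *occupied.get_mut() = row;
        }
        Entry::Vacant(vacant) => {
            // No conflict: plain insert.
            vacant.insert(row);
        }
    }
}

fn main() {
    let mut table = HashMap::new();
    let key = (1_i64, 7_i64, "src/main.rs".to_string());
    upsert(
        &mut table,
        key.clone(),
        StatusRow { scan_id: 1, first_status: 2, is_deleted: false },
    );
    // A second write with the same key updates in place instead of failing
    // with a primary-key violation.
    upsert(
        &mut table,
        key.clone(),
        StatusRow { scan_id: 2, first_status: 3, is_deleted: false },
    );
    assert_eq!(table.len(), 1);
    assert_eq!(table[&key].scan_id, 2);
}
```

In the real code the conflict target and update columns are declared explicitly so columns like `ProjectId` and `Id` are never clobbered; the sketch overwrites the whole value for brevity.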
pub async fn remove_repository(
&self,
remove: &proto::RemoveRepository,
_connection: ConnectionId,
) -> Result<TransactionGuard<Vec<ConnectionId>>> {
let project_id = ProjectId::from_proto(remove.project_id);
let repository_id = remove.id as i64;
self.project_transaction(project_id, |tx| async move {
project_repository::Entity::update_many()
.filter(
project_repository::Column::ProjectId
.eq(project_id)
.and(project_repository::Column::Id.eq(repository_id)),
)
.set(project_repository::ActiveModel {
is_deleted: ActiveValue::Set(true),
// scan_id: ActiveValue::Set(update.scan_id as i64),
..Default::default()
})
.exec(&*tx)
.await?;
let connection_ids = self.project_guest_connection_ids(project_id, &tx).await?;
Ok(connection_ids)
})
.await
}
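Note that `remove_repository` flags the row with `is_deleted = true` rather than issuing a hard DELETE. Soft deletion leaves a tombstone behind, so a client that reconnects later can still be told the repository was removed. A small sketch of how readers treat such rows (hypothetical struct, not the real entity):

```rust
// Hypothetical row mirroring `project_repository`'s soft-delete column.
#[derive(Debug)]
struct RepoRow {
    id: i64,
    is_deleted: bool,
}

// Readers split rows into live repositories and tombstones to report as
// removals, instead of removed rows simply vanishing as a hard DELETE
// would cause.
fn partition(rows: &[RepoRow]) -> (Vec<i64>, Vec<i64>) {
    let mut live = Vec::new();
    let mut removed = Vec::new();
    for row in rows {
        if row.is_deleted {
            removed.push(row.id);
        } else {
            live.push(row.id);
        }
    }
    (live, removed)
}

fn main() {
    let rows = vec![
        RepoRow { id: 1, is_deleted: false },
        RepoRow { id: 2, is_deleted: true },
    ];
    let (live, removed) = partition(&rows);
    assert_eq!(live, vec![1]);
    assert_eq!(removed, vec![2]);
}
```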
/// Updates the diagnostic summary for the given connection.
pub async fn update_diagnostic_summary(
&self,
@@ -703,11 +838,11 @@ impl Database {
root_name: db_worktree.root_name,
visible: db_worktree.visible,
entries: Default::default(),
repository_entries: Default::default(),
diagnostic_summaries: Default::default(),
settings_files: Default::default(),
scan_id: db_worktree.scan_id as u64,
completed_scan_id: db_worktree.completed_scan_id as u64,
legacy_repository_entries: Default::default(),
},
)
})
@@ -750,65 +885,77 @@ impl Database {
}
// Populate repository entries.
let mut repositories = Vec::new();
{
let db_repository_entries = worktree_repository::Entity::find()
let db_repository_entries = project_repository::Entity::find()
.filter(
Condition::all()
.add(worktree_repository::Column::ProjectId.eq(project.id))
.add(worktree_repository::Column::IsDeleted.eq(false)),
.add(project_repository::Column::ProjectId.eq(project.id))
.add(project_repository::Column::IsDeleted.eq(false)),
)
.all(tx)
.await?;
for db_repository_entry in db_repository_entries {
if let Some(worktree) = worktrees.get_mut(&(db_repository_entry.worktree_id as u64))
{
let mut repository_statuses = worktree_repository_statuses::Entity::find()
.filter(
Condition::all()
.add(worktree_repository_statuses::Column::ProjectId.eq(project.id))
.add(
worktree_repository_statuses::Column::WorktreeId
.eq(worktree.id),
)
.add(
worktree_repository_statuses::Column::WorkDirectoryId
.eq(db_repository_entry.work_directory_id),
)
.add(worktree_repository_statuses::Column::IsDeleted.eq(false)),
)
.stream(tx)
.await?;
let mut updated_statuses = Vec::new();
while let Some(status_entry) = repository_statuses.next().await {
let status_entry: worktree_repository_statuses::Model = status_entry?;
updated_statuses.push(db_status_to_proto(status_entry)?);
let mut repository_statuses = project_repository_statuses::Entity::find()
.filter(
Condition::all()
.add(project_repository_statuses::Column::ProjectId.eq(project.id))
.add(
project_repository_statuses::Column::RepositoryId
.eq(db_repository_entry.id),
)
.add(project_repository_statuses::Column::IsDeleted.eq(false)),
)
.stream(tx)
.await?;
let mut updated_statuses = Vec::new();
while let Some(status_entry) = repository_statuses.next().await {
let status_entry = status_entry?;
updated_statuses.push(db_status_to_proto(status_entry)?);
}
let current_merge_conflicts = db_repository_entry
.current_merge_conflicts
.as_ref()
.map(|conflicts| serde_json::from_str(&conflicts))
.transpose()?
.unwrap_or_default();
let branch_summary = db_repository_entry
.branch_summary
.as_ref()
.map(|branch_summary| serde_json::from_str(&branch_summary))
.transpose()?
.unwrap_or_default();
let entry_ids = serde_json::from_str(&db_repository_entry.entry_ids)
.context("failed to deserialize repository's entry ids")?;
if let Some(worktree_id) = db_repository_entry.legacy_worktree_id {
if let Some(worktree) = worktrees.get_mut(&(worktree_id as u64)) {
worktree.legacy_repository_entries.insert(
db_repository_entry.id as u64,
proto::RepositoryEntry {
work_directory_id: db_repository_entry.id as u64,
updated_statuses,
removed_statuses: Vec::new(),
current_merge_conflicts,
branch_summary,
},
);
}
let current_merge_conflicts = db_repository_entry
.current_merge_conflicts
.as_ref()
.map(|conflicts| serde_json::from_str(&conflicts))
.transpose()?
.unwrap_or_default();
let branch_summary = db_repository_entry
.branch_summary
.as_ref()
.map(|branch_summary| serde_json::from_str(&branch_summary))
.transpose()?
.unwrap_or_default();
worktree.repository_entries.insert(
db_repository_entry.work_directory_id as u64,
proto::RepositoryEntry {
work_directory_id: db_repository_entry.work_directory_id as u64,
branch: db_repository_entry.branch,
updated_statuses,
removed_statuses: Vec::new(),
current_merge_conflicts,
branch_summary,
},
);
} else {
repositories.push(proto::UpdateRepository {
project_id: db_repository_entry.project_id.0 as u64,
id: db_repository_entry.id as u64,
abs_path: db_repository_entry.abs_path,
entry_ids,
updated_statuses,
removed_statuses: Vec::new(),
current_merge_conflicts,
branch_summary,
scan_id: db_repository_entry.scan_id as u64,
});
}
}
}
@@ -871,6 +1018,7 @@ impl Database {
})
.collect(),
worktrees,
repositories,
language_servers: language_servers
.into_iter()
.map(|language_server| proto::LanguageServer {



@@ -1,3 +1,5 @@
use anyhow::Context as _;
use super::*;
impl Database {
@@ -606,6 +608,11 @@ impl Database {
let mut worktrees = Vec::new();
let db_worktrees = project.find_related(worktree::Entity).all(tx).await?;
let db_repos = project
.find_related(project_repository::Entity)
.all(tx)
.await?;
for db_worktree in db_worktrees {
let mut worktree = RejoinedWorktree {
id: db_worktree.id as u64,
@@ -673,96 +680,112 @@ impl Database {
}
}
// Repository Entries
{
let repository_entry_filter = if let Some(rejoined_worktree) = rejoined_worktree {
worktree_repository::Column::ScanId.gt(rejoined_worktree.scan_id)
worktrees.push(worktree);
}
let mut removed_repositories = Vec::new();
let mut updated_repositories = Vec::new();
for db_repo in db_repos {
let rejoined_repository = rejoined_project
.repositories
.iter()
.find(|repo| repo.id == db_repo.id as u64);
let repository_filter = if let Some(rejoined_repository) = rejoined_repository {
project_repository::Column::ScanId.gt(rejoined_repository.scan_id)
} else {
project_repository::Column::IsDeleted.eq(false)
};
let db_repositories = project_repository::Entity::find()
.filter(
Condition::all()
.add(project_repository::Column::ProjectId.eq(project.id))
.add(repository_filter),
)
.all(tx)
.await?;
for db_repository in db_repositories.into_iter() {
if db_repository.is_deleted {
removed_repositories.push(db_repository.id as u64);
} else {
worktree_repository::Column::IsDeleted.eq(false)
};
let db_repositories = worktree_repository::Entity::find()
.filter(
Condition::all()
.add(worktree_repository::Column::ProjectId.eq(project.id))
.add(worktree_repository::Column::WorktreeId.eq(worktree.id))
.add(repository_entry_filter),
)
.all(tx)
.await?;
for db_repository in db_repositories.into_iter() {
if db_repository.is_deleted {
worktree
.removed_repositories
.push(db_repository.work_directory_id as u64);
let status_entry_filter = if let Some(rejoined_repository) = rejoined_repository
{
project_repository_statuses::Column::ScanId.gt(rejoined_repository.scan_id)
} else {
let status_entry_filter = if let Some(rejoined_worktree) = rejoined_worktree
{
worktree_repository_statuses::Column::ScanId
.gt(rejoined_worktree.scan_id)
project_repository_statuses::Column::IsDeleted.eq(false)
};
let mut db_statuses = project_repository_statuses::Entity::find()
.filter(
Condition::all()
.add(project_repository_statuses::Column::ProjectId.eq(project.id))
.add(
project_repository_statuses::Column::RepositoryId
.eq(db_repository.id),
)
.add(status_entry_filter),
)
.stream(tx)
.await?;
let mut removed_statuses = Vec::new();
let mut updated_statuses = Vec::new();
while let Some(db_status) = db_statuses.next().await {
let db_status: project_repository_statuses::Model = db_status?;
if db_status.is_deleted {
removed_statuses.push(db_status.repo_path);
} else {
worktree_repository_statuses::Column::IsDeleted.eq(false)
};
let mut db_statuses = worktree_repository_statuses::Entity::find()
.filter(
Condition::all()
.add(
worktree_repository_statuses::Column::ProjectId
.eq(project.id),
)
.add(
worktree_repository_statuses::Column::WorktreeId
.eq(worktree.id),
)
.add(
worktree_repository_statuses::Column::WorkDirectoryId
.eq(db_repository.work_directory_id),
)
.add(status_entry_filter),
)
.stream(tx)
.await?;
let mut removed_statuses = Vec::new();
let mut updated_statuses = Vec::new();
while let Some(db_status) = db_statuses.next().await {
let db_status: worktree_repository_statuses::Model = db_status?;
if db_status.is_deleted {
removed_statuses.push(db_status.repo_path);
} else {
updated_statuses.push(db_status_to_proto(db_status)?);
}
updated_statuses.push(db_status_to_proto(db_status)?);
}
}
let current_merge_conflicts = db_repository
.current_merge_conflicts
.as_ref()
.map(|conflicts| serde_json::from_str(&conflicts))
.transpose()?
.unwrap_or_default();
let current_merge_conflicts = db_repository
.current_merge_conflicts
.as_ref()
.map(|conflicts| serde_json::from_str(&conflicts))
.transpose()?
.unwrap_or_default();
let branch_summary = db_repository
.branch_summary
.as_ref()
.map(|branch_summary| serde_json::from_str(&branch_summary))
.transpose()?
.unwrap_or_default();
let branch_summary = db_repository
.branch_summary
.as_ref()
.map(|branch_summary| serde_json::from_str(&branch_summary))
.transpose()?
.unwrap_or_default();
worktree.updated_repositories.push(proto::RepositoryEntry {
work_directory_id: db_repository.work_directory_id as u64,
branch: db_repository.branch,
let entry_ids = serde_json::from_str(&db_repository.entry_ids)
.context("failed to deserialize repository's entry ids")?;
if let Some(legacy_worktree_id) = db_repository.legacy_worktree_id {
if let Some(worktree) = worktrees
.iter_mut()
.find(|worktree| worktree.id as i64 == legacy_worktree_id)
{
worktree.updated_repositories.push(proto::RepositoryEntry {
work_directory_id: db_repository.id as u64,
updated_statuses,
removed_statuses,
current_merge_conflicts,
branch_summary,
});
}
} else {
updated_repositories.push(proto::UpdateRepository {
entry_ids,
updated_statuses,
removed_statuses,
current_merge_conflicts,
branch_summary,
project_id: project_id.to_proto(),
id: db_repository.id as u64,
abs_path: db_repository.abs_path,
scan_id: db_repository.scan_id as u64,
});
}
}
}
worktrees.push(worktree);
}
let language_servers = project
@@ -832,6 +855,8 @@ impl Database {
id: project_id,
old_connection_id,
collaborators,
updated_repositories,
removed_repositories,
worktrees,
language_servers,
}))
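The rejoin path above chooses its filter based on what the client already saw: a rejoining client gets only rows whose `scan_id` advanced past the one it last synced (including tombstones, which become `removed_repositories`), while a fresh state fetch takes every row with `is_deleted = false`. A std-only sketch of that selection logic, assuming a simplified row type:

```rust
#[derive(Debug, Clone, PartialEq)]
struct RepoRow {
    id: i64,
    scan_id: i64,
    is_deleted: bool,
}

// A rejoining client that synced up to `last_scan_id` only needs rows whose
// scan_id advanced past it (tombstones included, reported as removals); a
// fresh join gets every live row and never sees tombstones.
fn rows_to_send(rows: &[RepoRow], last_scan_id: Option<i64>) -> Vec<RepoRow> {
    rows.iter()
        .filter(|row| match last_scan_id {
            Some(scan_id) => row.scan_id > scan_id,
            None => !row.is_deleted,
        })
        .cloned()
        .collect()
}

fn main() {
    let rows = vec![
        RepoRow { id: 1, scan_id: 3, is_deleted: false },
        RepoRow { id: 2, scan_id: 5, is_deleted: true },
    ];
    // Rejoin after scan 4: only the tombstone changed since then.
    let rejoin = rows_to_send(&rows, Some(4));
    assert_eq!(rejoin.len(), 1);
    assert_eq!(rejoin[0].id, 2);
    // Fresh join: only live rows.
    let fresh = rows_to_send(&rows, None);
    assert_eq!(fresh.len(), 1);
    assert_eq!(fresh[0].id, 1);
}
```

This is why the soft-deleted rows must keep an up-to-date `scan_id`: the `scan_id > last_scan_id` comparison is the only thing that makes a deletion visible to a rejoiner.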


@@ -26,6 +26,8 @@ pub mod observed_channel_messages;
pub mod processed_stripe_event;
pub mod project;
pub mod project_collaborator;
pub mod project_repository;
pub mod project_repository_statuses;
pub mod rate_buckets;
pub mod room;
pub mod room_participant;
@@ -36,6 +38,4 @@ pub mod user_feature;
pub mod worktree;
pub mod worktree_diagnostic_summary;
pub mod worktree_entry;
pub mod worktree_repository;
pub mod worktree_repository_statuses;
pub mod worktree_settings_file;


@@ -45,6 +45,8 @@ pub enum Relation {
Room,
#[sea_orm(has_many = "super::worktree::Entity")]
Worktrees,
#[sea_orm(has_many = "super::project_repository::Entity")]
Repositories,
#[sea_orm(has_many = "super::project_collaborator::Entity")]
Collaborators,
#[sea_orm(has_many = "super::language_server::Entity")]
@@ -69,6 +71,12 @@ impl Related<super::worktree::Entity> for Entity {
}
}
impl Related<super::project_repository::Entity> for Entity {
fn to() -> RelationDef {
Relation::Repositories.def()
}
}
impl Related<super::project_collaborator::Entity> for Entity {
fn to() -> RelationDef {
Relation::Collaborators.def()


@@ -2,16 +2,17 @@ use crate::db::ProjectId;
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "worktree_repositories")]
#[sea_orm(table_name = "project_repositories")]
pub struct Model {
#[sea_orm(primary_key)]
pub project_id: ProjectId,
#[sea_orm(primary_key)]
pub worktree_id: i64,
#[sea_orm(primary_key)]
pub work_directory_id: i64,
pub id: i64,
pub abs_path: String,
pub legacy_worktree_id: Option<i64>,
// JSON array containing 1 or more integer project entry ids
pub entry_ids: String,
pub scan_id: i64,
pub branch: Option<String>,
pub is_deleted: bool,
// JSON-serialized array, stored as a string
pub current_merge_conflicts: Option<String>,
@@ -20,6 +21,19 @@ pub struct Model {
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {}
pub enum Relation {
#[sea_orm(
belongs_to = "super::project::Entity",
from = "Column::ProjectId",
to = "super::project::Column::Id"
)]
Project,
}
impl Related<super::project::Entity> for Entity {
fn to() -> RelationDef {
Relation::Project.def()
}
}
impl ActiveModelBehavior for ActiveModel {}
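The `entry_ids` column above stores a JSON array of integers in a TEXT column; the handlers round-trip it with `serde_json::to_string` and `serde_json::from_str`. As a std-only sketch of that storage format (the real code uses serde_json, and these helper names are hypothetical):

```rust
// Encode a list of entry ids as a JSON array literal, e.g. "[11,42,7]".
fn encode_entry_ids(ids: &[u64]) -> String {
    let parts: Vec<String> = ids.iter().map(|id| id.to_string()).collect();
    format!("[{}]", parts.join(","))
}

// Decode the same format back into ids; a hand-rolled stand-in for
// `serde_json::from_str::<Vec<u64>>`.
fn decode_entry_ids(json: &str) -> Result<Vec<u64>, std::num::ParseIntError> {
    let inner = json.trim().trim_start_matches('[').trim_end_matches(']');
    if inner.trim().is_empty() {
        return Ok(Vec::new());
    }
    inner.split(',').map(|part| part.trim().parse::<u64>()).collect()
}

fn main() {
    let ids = vec![11_u64, 42, 7];
    let encoded = encode_entry_ids(&ids);
    assert_eq!(encoded, "[11,42,7]");
    assert_eq!(decode_entry_ids(&encoded).unwrap(), ids);
    assert_eq!(decode_entry_ids("[]").unwrap(), Vec::<u64>::new());
}
```

Storing the list as one JSON string keeps the schema flat at the cost of querying: the server never filters on individual entry ids, only deserializes the whole list, which is why a join table isn't needed here.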


@@ -2,14 +2,12 @@ use crate::db::ProjectId;
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "worktree_repository_statuses")]
#[sea_orm(table_name = "project_repository_statuses")]
pub struct Model {
#[sea_orm(primary_key)]
pub project_id: ProjectId,
#[sea_orm(primary_key)]
pub worktree_id: i64,
#[sea_orm(primary_key)]
pub work_directory_id: i64,
pub repository_id: i64,
#[sea_orm(primary_key)]
pub repo_path: String,
/// Old single-code status field, no longer used but kept here to mirror the DB schema.


@@ -37,6 +37,7 @@ use core::fmt::{self, Debug, Formatter};
use http_client::HttpClient;
use open_ai::{OpenAiEmbeddingModel, OPEN_AI_API_URL};
use reqwest_client::ReqwestClient;
use rpc::proto::split_repository_update;
use sha2::Digest;
use supermaven_api::{CreateExternalUserRequest, SupermavenAdminApi};
@@ -291,6 +292,8 @@ impl Server {
.add_message_handler(leave_project)
.add_request_handler(update_project)
.add_request_handler(update_worktree)
.add_request_handler(update_repository)
.add_request_handler(remove_repository)
.add_message_handler(start_language_server)
.add_message_handler(update_language_server)
.add_message_handler(update_diagnostic_summary)
@@ -1464,7 +1467,7 @@ fn notify_rejoined_projects(
removed_repositories: worktree.removed_repositories,
};
for update in proto::split_worktree_update(message) {
session.peer.send(session.connection_id, update.clone())?;
session.peer.send(session.connection_id, update)?;
}
// Stream this worktree's diagnostics.
@@ -1493,21 +1496,23 @@ fn notify_rejoined_projects(
}
}
for language_server in &project.language_servers {
for repository in mem::take(&mut project.updated_repositories) {
for update in split_repository_update(repository) {
session.peer.send(session.connection_id, update)?;
}
}
for id in mem::take(&mut project.removed_repositories) {
session.peer.send(
session.connection_id,
proto::UpdateLanguageServer {
proto::RemoveRepository {
project_id: project.id.to_proto(),
language_server_id: language_server.id,
variant: Some(
proto::update_language_server::Variant::DiskBasedDiagnosticsUpdated(
proto::LspDiskBasedDiagnosticsUpdated {},
),
),
id,
},
)?;
}
}
Ok(())
}
@@ -1893,7 +1898,7 @@ fn join_project_internal(
removed_entries: Default::default(),
scan_id: worktree.scan_id,
is_last_update: worktree.scan_id == worktree.completed_scan_id,
updated_repositories: worktree.repository_entries.into_values().collect(),
updated_repositories: worktree.legacy_repository_entries.into_values().collect(),
removed_repositories: Default::default(),
};
for update in proto::split_worktree_update(message) {
@@ -1926,6 +1931,12 @@ fn join_project_internal(
}
}
for repository in mem::take(&mut project.repositories) {
for update in split_repository_update(repository) {
session.peer.send(session.connection_id, update)?;
}
}
for language_server in &project.language_servers {
session.peer.send(
session.connection_id,
@@ -2018,6 +2029,54 @@ async fn update_worktree(
Ok(())
}
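The `split_repository_update` loops above exist because one `UpdateRepository` can carry thousands of status entries, which would exceed the transport's message-size limit if sent as a single frame; the update is split into several messages and each is sent in order. A sketch of the batching idea (the real batch size and message fields differ; a chunk size of 2 is purely illustrative):

```rust
// Split a long status list into fixed-size batches so no single message
// grows unbounded; `max_per_message` is clamped to at least 1.
fn split_statuses(statuses: Vec<String>, max_per_message: usize) -> Vec<Vec<String>> {
    statuses
        .chunks(max_per_message.max(1))
        .map(|chunk| chunk.to_vec())
        .collect()
}

fn main() {
    let statuses: Vec<String> =
        ["a.rs", "b.rs", "c.rs"].iter().map(|s| s.to_string()).collect();
    let batches = split_statuses(statuses, 2);
    assert_eq!(batches.len(), 2);
    assert_eq!(batches[0], vec!["a.rs".to_string(), "b.rs".to_string()]);
    assert_eq!(batches[1], vec!["c.rs".to_string()]);
}
```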
async fn update_repository(
request: proto::UpdateRepository,
response: Response<proto::UpdateRepository>,
session: Session,
) -> Result<()> {
let guest_connection_ids = session
.db()
.await
.update_repository(&request, session.connection_id)
.await?;
broadcast(
Some(session.connection_id),
guest_connection_ids.iter().copied(),
|connection_id| {
session
.peer
.forward_send(session.connection_id, connection_id, request.clone())
},
);
response.send(proto::Ack {})?;
Ok(())
}
async fn remove_repository(
request: proto::RemoveRepository,
response: Response<proto::RemoveRepository>,
session: Session,
) -> Result<()> {
let guest_connection_ids = session
.db()
.await
.remove_repository(&request, session.connection_id)
.await?;
broadcast(
Some(session.connection_id),
guest_connection_ids.iter().copied(),
|connection_id| {
session
.peer
.forward_send(session.connection_id, connection_id, request.clone())
},
);
response.send(proto::Ack {})?;
Ok(())
}
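Both handlers follow the same shape: persist the change inside a project transaction, fan the original request out to every guest connection except the sender, then ack the sender. A std-only sketch of that fan-out, matching the `broadcast(Some(session.connection_id), guest_connection_ids.iter().copied(), ...)` calls above (the real version forwards protobuf messages over a peer, and the closure here just records ids):

```rust
// Deliver to every receiver except the originating connection, if any.
fn broadcast(
    sender: Option<u32>,
    receivers: impl Iterator<Item = u32>,
    send: &mut dyn FnMut(u32),
) {
    for connection_id in receivers {
        if Some(connection_id) != sender {
            send(connection_id);
        }
    }
}

fn main() {
    let guests = vec![1_u32, 2, 3];
    let mut delivered = Vec::new();
    broadcast(Some(2), guests.into_iter(), &mut |id| delivered.push(id));
    // The originating connection (2) is skipped; it gets an Ack instead.
    assert_eq!(delivered, vec![1, 3]);
}
```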
/// Updates other participants with changes to the diagnostics
async fn update_diagnostic_summary(
message: proto::UpdateDiagnosticSummary,


@@ -387,7 +387,7 @@ async fn test_channel_room(
executor.run_until_parked();
let room_a =
cx_a.read(|cx| active_call_a.read_with(cx, |call, _| call.room().unwrap().clone()));
cx_a.read(|cx| room_a.read_with(cx, |room, _| assert!(room.is_connected())));
cx_a.read(|cx| room_a.read_with(cx, |room, cx| assert!(room.is_connected(cx))));
cx_a.read(|cx| {
client_a.channel_store().read_with(cx, |channels, _| {
@@ -461,7 +461,7 @@ async fn test_channel_room(
let room_a =
cx_a.read(|cx| active_call_a.read_with(cx, |call, _| call.room().unwrap().clone()));
cx_a.read(|cx| room_a.read_with(cx, |room, _| assert!(room.is_connected())));
cx_a.read(|cx| room_a.read_with(cx, |room, cx| assert!(room.is_connected(cx))));
assert_eq!(
room_participants(&room_a, cx_a),
RoomParticipants {
@@ -472,7 +472,7 @@ async fn test_channel_room(
let room_b =
cx_b.read(|cx| active_call_b.read_with(cx, |call, _| call.room().unwrap().clone()));
cx_b.read(|cx| room_b.read_with(cx, |room, _| assert!(room.is_connected())));
cx_b.read(|cx| room_b.read_with(cx, |room, cx| assert!(room.is_connected(cx))));
assert_eq!(
room_participants(&room_b, cx_b),
RoomParticipants {
@@ -556,7 +556,7 @@ async fn test_channel_room(
let room_a =
cx_a.read(|cx| active_call_a.read_with(cx, |call, _| call.room().unwrap().clone()));
cx_a.read(|cx| room_a.read_with(cx, |room, _| assert!(room.is_connected())));
cx_a.read(|cx| room_a.read_with(cx, |room, cx| assert!(room.is_connected(cx))));
assert_eq!(
room_participants(&room_a, cx_a),
RoomParticipants {
@@ -567,7 +567,7 @@ async fn test_channel_room(
let room_b =
cx_b.read(|cx| active_call_b.read_with(cx, |call, _| call.room().unwrap().clone()));
cx_b.read(|cx| room_b.read_with(cx, |room, _| assert!(room.is_connected())));
cx_b.read(|cx| room_b.read_with(cx, |room, cx| assert!(room.is_connected(cx))));
assert_eq!(
room_participants(&room_b, cx_b),
RoomParticipants {


@@ -435,118 +435,114 @@ async fn test_basic_following(
editor_a1.item_id()
);
// TODO: Re-enable this test once we can replace our swift Livekit SDK with the rust SDK
#[cfg(not(target_os = "macos"))]
{
use crate::rpc::RECONNECT_TIMEOUT;
use gpui::TestScreenCaptureSource;
use workspace::{
dock::{test::TestPanel, DockPosition},
item::test::TestItem,
shared_screen::SharedScreen,
};
use crate::rpc::RECONNECT_TIMEOUT;
use gpui::TestScreenCaptureSource;
use workspace::{
dock::{test::TestPanel, DockPosition},
item::test::TestItem,
shared_screen::SharedScreen,
};
// Client B activates an external window, which causes a new screen-sharing item to be added to the pane.
let display = TestScreenCaptureSource::new();
active_call_b
.update(cx_b, |call, cx| call.set_location(None, cx))
.await
.unwrap();
cx_b.set_screen_capture_sources(vec![display]);
active_call_b
.update(cx_b, |call, cx| {
call.room()
.unwrap()
.update(cx, |room, cx| room.share_screen(cx))
})
.await
.unwrap(); // This is what breaks
executor.run_until_parked();
let shared_screen = workspace_a.update(cx_a, |workspace, cx| {
workspace
.active_item(cx)
.expect("no active item")
.downcast::<SharedScreen>()
.expect("active item isn't a shared screen")
});
// Client B activates Zed again, which causes the previous editor to become focused again.
active_call_b
.update(cx_b, |call, cx| call.set_location(Some(&project_b), cx))
.await
.unwrap();
executor.run_until_parked();
workspace_a.update(cx_a, |workspace, cx| {
assert_eq!(
workspace.active_item(cx).unwrap().item_id(),
editor_a1.item_id()
)
});
// Client B activates a multibuffer that was created by following client A. Client A returns to that multibuffer.
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.activate_item(&multibuffer_editor_b, true, true, window, cx)
});
executor.run_until_parked();
workspace_a.update(cx_a, |workspace, cx| {
assert_eq!(
workspace.active_item(cx).unwrap().item_id(),
multibuffer_editor_a.item_id()
)
});
// Client B activates a panel, and the previously-opened screen-sharing item gets activated.
let panel = cx_b.new(|cx| TestPanel::new(DockPosition::Left, cx));
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.add_panel(panel, window, cx);
workspace.toggle_panel_focus::<TestPanel>(window, cx);
});
executor.run_until_parked();
assert_eq!(
workspace_a.update(cx_a, |workspace, cx| workspace
.active_item(cx)
// Client B activates an external window, which causes a new screen-sharing item to be added to the pane.
let display = TestScreenCaptureSource::new();
active_call_b
.update(cx_b, |call, cx| call.set_location(None, cx))
.await
.unwrap();
cx_b.set_screen_capture_sources(vec![display]);
active_call_b
.update(cx_b, |call, cx| {
call.room()
.unwrap()
.item_id()),
shared_screen.item_id()
);
.update(cx, |room, cx| room.share_screen(cx))
})
.await
.unwrap();
executor.run_until_parked();
let shared_screen = workspace_a.update(cx_a, |workspace, cx| {
workspace
.active_item(cx)
.expect("no active item")
.downcast::<SharedScreen>()
.expect("active item isn't a shared screen")
});
// Toggling the focus back to the pane causes client A to return to the multibuffer.
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.toggle_panel_focus::<TestPanel>(window, cx);
});
executor.run_until_parked();
workspace_a.update(cx_a, |workspace, cx| {
assert_eq!(
workspace.active_item(cx).unwrap().item_id(),
multibuffer_editor_a.item_id()
)
});
// Client B activates an item that doesn't implement following,
// so the previously-opened screen-sharing item gets activated.
let unfollowable_item = cx_b.new(TestItem::new);
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.active_pane().update(cx, |pane, cx| {
pane.add_item(Box::new(unfollowable_item), true, true, None, window, cx)
})
});
executor.run_until_parked();
// Client B activates Zed again, which causes the previous editor to become focused again.
active_call_b
.update(cx_b, |call, cx| call.set_location(Some(&project_b), cx))
.await
.unwrap();
executor.run_until_parked();
workspace_a.update(cx_a, |workspace, cx| {
assert_eq!(
workspace_a.update(cx_a, |workspace, cx| workspace
.active_item(cx)
.unwrap()
.item_id()),
shared_screen.item_id()
);
workspace.active_item(cx).unwrap().item_id(),
editor_a1.item_id()
)
});
// Following interrupts when client B disconnects.
client_b.disconnect(&cx_b.to_async());
executor.advance_clock(RECONNECT_TIMEOUT);
// Client B activates a multibuffer that was created by following client A. Client A returns to that multibuffer.
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.activate_item(&multibuffer_editor_b, true, true, window, cx)
});
executor.run_until_parked();
workspace_a.update(cx_a, |workspace, cx| {
assert_eq!(
workspace_a.update(cx_a, |workspace, _| workspace.leader_for_pane(&pane_a)),
None
);
}
workspace.active_item(cx).unwrap().item_id(),
multibuffer_editor_a.item_id()
)
});
// Client B activates a panel, and the previously-opened screen-sharing item gets activated.
let panel = cx_b.new(|cx| TestPanel::new(DockPosition::Left, cx));
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.add_panel(panel, window, cx);
workspace.toggle_panel_focus::<TestPanel>(window, cx);
});
executor.run_until_parked();
assert_eq!(
workspace_a.update(cx_a, |workspace, cx| workspace
.active_item(cx)
.unwrap()
.item_id()),
shared_screen.item_id()
);
// Toggling the focus back to the pane causes client A to return to the multibuffer.
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.toggle_panel_focus::<TestPanel>(window, cx);
});
executor.run_until_parked();
workspace_a.update(cx_a, |workspace, cx| {
assert_eq!(
workspace.active_item(cx).unwrap().item_id(),
multibuffer_editor_a.item_id()
)
});
// Client B activates an item that doesn't implement following,
// so the previously-opened screen-sharing item gets activated.
let unfollowable_item = cx_b.new(TestItem::new);
workspace_b.update_in(cx_b, |workspace, window, cx| {
workspace.active_pane().update(cx, |pane, cx| {
pane.add_item(Box::new(unfollowable_item), true, true, None, window, cx)
})
});
executor.run_until_parked();
assert_eq!(
workspace_a.update(cx_a, |workspace, cx| workspace
.active_item(cx)
.unwrap()
.item_id()),
shared_screen.item_id()
);
// Following interrupts when client B disconnects.
client_b.disconnect(&cx_b.to_async());
executor.advance_clock(RECONNECT_TIMEOUT);
assert_eq!(
workspace_a.update(cx_a, |workspace, _| workspace.leader_for_pane(&pane_a)),
None
);
}
#[gpui::test]


@@ -243,60 +243,56 @@ async fn test_basic_calls(
}
);
// TODO: Re-enable this test once we can replace our swift Livekit SDK with the rust SDK
#[cfg(not(target_os = "macos"))]
{
// User A shares their screen
let display = gpui::TestScreenCaptureSource::new();
let events_b = active_call_events(cx_b);
let events_c = active_call_events(cx_c);
cx_a.set_screen_capture_sources(vec![display]);
active_call_a
.update(cx_a, |call, cx| {
call.room()
.unwrap()
.update(cx, |room, cx| room.share_screen(cx))
})
.await
.unwrap();
// User A shares their screen
let display = gpui::TestScreenCaptureSource::new();
let events_b = active_call_events(cx_b);
let events_c = active_call_events(cx_c);
cx_a.set_screen_capture_sources(vec![display]);
active_call_a
.update(cx_a, |call, cx| {
call.room()
.unwrap()
.update(cx, |room, cx| room.share_screen(cx))
})
.await
.unwrap();
executor.run_until_parked();
executor.run_until_parked();
// User B observes the remote screen sharing track.
assert_eq!(events_b.borrow().len(), 1);
let event_b = events_b.borrow().first().unwrap().clone();
if let call::room::Event::RemoteVideoTracksChanged { participant_id } = event_b {
assert_eq!(participant_id, client_a.peer_id().unwrap());
// User B observes the remote screen sharing track.
assert_eq!(events_b.borrow().len(), 1);
let event_b = events_b.borrow().first().unwrap().clone();
if let call::room::Event::RemoteVideoTracksChanged { participant_id } = event_b {
assert_eq!(participant_id, client_a.peer_id().unwrap());
room_b.read_with(cx_b, |room, _| {
assert_eq!(
room.remote_participants()[&client_a.user_id().unwrap()]
.video_tracks
.len(),
1
);
});
} else {
panic!("unexpected event")
}
room_b.read_with(cx_b, |room, _| {
assert_eq!(
room.remote_participants()[&client_a.user_id().unwrap()]
.video_tracks
.len(),
1
);
});
} else {
panic!("unexpected event")
}
// User C observes the remote screen sharing track.
assert_eq!(events_c.borrow().len(), 1);
let event_c = events_c.borrow().first().unwrap().clone();
if let call::room::Event::RemoteVideoTracksChanged { participant_id } = event_c {
assert_eq!(participant_id, client_a.peer_id().unwrap());
// User C observes the remote screen sharing track.
assert_eq!(events_c.borrow().len(), 1);
let event_c = events_c.borrow().first().unwrap().clone();
if let call::room::Event::RemoteVideoTracksChanged { participant_id } = event_c {
assert_eq!(participant_id, client_a.peer_id().unwrap());
room_c.read_with(cx_c, |room, _| {
assert_eq!(
room.remote_participants()[&client_a.user_id().unwrap()]
.video_tracks
.len(),
1
);
});
} else {
panic!("unexpected event")
}
room_c.read_with(cx_c, |room, _| {
assert_eq!(
room.remote_participants()[&client_a.user_id().unwrap()]
.video_tracks
.len(),
1
);
});
} else {
panic!("unexpected event")
}
// User A leaves the room.
@@ -2085,17 +2081,7 @@ async fn test_mute_deafen(
audio_tracks_playing: participant
.audio_tracks
.values()
.map({
#[cfg(target_os = "macos")]
{
|track| track.is_playing()
}
#[cfg(not(target_os = "macos"))]
{
|(track, _)| track.rtc_track().enabled()
}
})
.map(|(track, _)| track.enabled())
.collect(),
})
.collect::<Vec<_>>()
@@ -2847,7 +2833,7 @@ async fn test_git_diff_base_change(
});
}
#[gpui::test]
#[gpui::test(iterations = 10)]
async fn test_git_branch_name(
executor: BackgroundExecutor,
cx_a: &mut TestAppContext,
@@ -2895,9 +2881,10 @@ async fn test_git_branch_name(
let worktrees = project.visible_worktrees(cx).collect::<Vec<_>>();
assert_eq!(worktrees.len(), 1);
let worktree = worktrees[0].clone();
let root_entry = worktree.read(cx).snapshot().root_git_entry().unwrap();
let snapshot = worktree.read(cx).snapshot();
let repo = snapshot.repositories().first().unwrap();
assert_eq!(
root_entry.branch().map(|branch| branch.name.to_string()),
repo.branch().map(|branch| branch.name.to_string()),
branch_name
);
}
@@ -6161,8 +6148,6 @@ async fn test_contact_requests(
}
}
// TODO: Re-enable this test once we can replace our swift Livekit SDK with the rust SDK
#[cfg(not(target_os = "macos"))]
#[gpui::test(iterations = 10)]
async fn test_join_call_after_screen_was_shared(
executor: BackgroundExecutor,
@@ -6771,7 +6756,7 @@ async fn test_remote_git_branches(
.map(ToString::to_string)
.collect::<HashSet<_>>();
let (project_a, worktree_id) = client_a.build_local_project("/project", cx_a).await;
let (project_a, _) = client_a.build_local_project("/project", cx_a).await;
let project_id = active_call_a
.update(cx_a, |call, cx| call.share_project(project_a.clone(), cx))
@@ -6784,8 +6769,6 @@ async fn test_remote_git_branches(
let repo_b = cx_b.update(|cx| project_b.read(cx).active_repository(cx).unwrap());
let root_path = ProjectPath::root_path(worktree_id);
let branches_b = cx_b
.update(|cx| repo_b.update(cx, |repository, _| repository.branches()))
.await
@@ -6810,11 +6793,15 @@ async fn test_remote_git_branches(
let host_branch = cx_a.update(|cx| {
project_a.update(cx, |project, cx| {
project.worktree_store().update(cx, |worktree_store, cx| {
worktree_store
.current_branch(root_path.clone(), cx)
.unwrap()
})
project
.repositories(cx)
.values()
.next()
.unwrap()
.read(cx)
.current_branch()
.unwrap()
.clone()
})
});
@@ -6843,9 +6830,15 @@ async fn test_remote_git_branches(
let host_branch = cx_a.update(|cx| {
project_a.update(cx, |project, cx| {
project.worktree_store().update(cx, |worktree_store, cx| {
worktree_store.current_branch(root_path, cx).unwrap()
})
project
.repositories(cx)
.values()
.next()
.unwrap()
.read(cx)
.current_branch()
.unwrap()
.clone()
})
});


@@ -258,7 +258,7 @@ async fn test_ssh_collaboration_git_branches(
});
let client_ssh = SshRemoteClient::fake_client(opts, cx_a).await;
let (project_a, worktree_id) = client_a
let (project_a, _) = client_a
.build_ssh_project("/project", client_ssh, cx_a)
.await;
@@ -277,7 +277,6 @@ async fn test_ssh_collaboration_git_branches(
executor.run_until_parked();
let repo_b = cx_b.update(|cx| project_b.read(cx).active_repository(cx).unwrap());
let root_path = ProjectPath::root_path(worktree_id);
let branches_b = cx_b
.update(|cx| repo_b.read(cx).branches())
@@ -303,13 +302,17 @@ async fn test_ssh_collaboration_git_branches(
let server_branch = server_cx.update(|cx| {
headless_project.update(cx, |headless_project, cx| {
headless_project
.worktree_store
.update(cx, |worktree_store, cx| {
worktree_store
.current_branch(root_path.clone(), cx)
.unwrap()
})
headless_project.git_store.update(cx, |git_store, cx| {
git_store
.repositories()
.values()
.next()
.unwrap()
.read(cx)
.current_branch()
.unwrap()
.clone()
})
})
});
@@ -338,11 +341,17 @@ async fn test_ssh_collaboration_git_branches(
let server_branch = server_cx.update(|cx| {
headless_project.update(cx, |headless_project, cx| {
headless_project
.worktree_store
.update(cx, |worktree_store, cx| {
worktree_store.current_branch(root_path, cx).unwrap()
})
headless_project.git_store.update(cx, |git_store, cx| {
git_store
.repositories()
.values()
.next()
.unwrap()
.read(cx)
.current_branch()
.unwrap()
.clone()
})
})
});


@@ -45,12 +45,8 @@ use std::{
};
use workspace::{Workspace, WorkspaceStore};
#[cfg(not(target_os = "macos"))]
use livekit_client::test::TestServer as LivekitTestServer;
#[cfg(target_os = "macos")]
use livekit_client_macos::TestServer as LivekitTestServer;
pub struct TestServer {
pub app_state: Arc<AppState>,
pub test_livekit_server: Arc<LivekitTestServer>,
@@ -165,6 +161,7 @@ impl TestServer {
let fs = FakeFs::new(cx.executor());
cx.update(|cx| {
gpui_tokio::init(cx);
if cx.has_global::<SettingsStore>() {
panic!("Same cx used to create two test clients")
}


@@ -56,6 +56,10 @@ impl Tool for ContextServerTool {
}
}
fn ui_text(&self, _input: &serde_json::Value) -> String {
format!("Run MCP tool `{}`", self.tool.name)
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
@@ -65,42 +69,43 @@ impl Tool for ContextServerTool {
cx: &mut App,
) -> Task<Result<String>> {
if let Some(server) = self.server_manager.read(cx).get_server(&self.server_id) {
cx.foreground_executor().spawn({
let tool_name = self.tool.name.clone();
async move {
let Some(protocol) = server.client() else {
bail!("Context server not initialized");
};
let tool_name = self.tool.name.clone();
let server_clone = server.clone();
let input_clone = input.clone();
let arguments = if let serde_json::Value::Object(map) = input {
Some(map.into_iter().collect())
} else {
None
};
cx.spawn(async move |_cx| {
let Some(protocol) = server_clone.client() else {
bail!("Context server not initialized");
};
log::trace!(
"Running tool: {} with arguments: {:?}",
tool_name,
arguments
);
let response = protocol.run_tool(tool_name, arguments).await?;
let arguments = if let serde_json::Value::Object(map) = input_clone {
Some(map.into_iter().collect())
} else {
None
};
let mut result = String::new();
for content in response.content {
match content {
types::ToolResponseContent::Text { text } => {
result.push_str(&text);
}
types::ToolResponseContent::Image { .. } => {
log::warn!("Ignoring image content from tool response");
}
types::ToolResponseContent::Resource { .. } => {
log::warn!("Ignoring resource content from tool response");
}
log::trace!(
"Running tool: {} with arguments: {:?}",
tool_name,
arguments
);
let response = protocol.run_tool(tool_name, arguments).await?;
let mut result = String::new();
for content in response.content {
match content {
types::ToolResponseContent::Text { text } => {
result.push_str(&text);
}
types::ToolResponseContent::Image { .. } => {
log::warn!("Ignoring image content from tool response");
}
types::ToolResponseContent::Resource { .. } => {
log::warn!("Ignoring resource content from tool response");
}
}
Ok(result)
}
Ok(result)
})
} else {
Task::ready(Err(anyhow!("Context server not found")))
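The refactor above pulls the clones of the tool name, the server handle, and the input out of the spawned future, so the closure owns everything it needs before it runs. A minimal std-only sketch of the same clone-before-spawn pattern, using `thread::spawn` in place of the executor (the `Server` type and `run_tool` helper are illustrative, not the real `context_server` API):

```rust
use std::thread;

// Hypothetical stand-in for a server handle shared with a task.
#[derive(Clone)]
struct Server {
    label: String,
}

fn run_tool(tool_name: &str, server: &Server, input: &str) -> thread::JoinHandle<String> {
    // Clone everything the task needs *before* spawning, so the
    // closure owns its data and no borrow of the caller escapes.
    let tool_name = tool_name.to_string();
    let server = server.clone();
    let input = input.to_string();
    thread::spawn(move || format!("{}:{}:{}", server.label, tool_name, input))
}

fn main() {
    let server = Server { label: "ctx".into() };
    let out = run_tool("echo", &server, "{}").join().unwrap();
    assert_eq!(out, "ctx:echo:{}");
}
```

The same shape applies to `cx.spawn`: once the owned copies exist, the `async move` block can outlive the `&self` borrow that produced them.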


@@ -51,9 +51,13 @@ impl ExtensionContextServerProxy for ContextServerFactoryRegistryProxy {
})
})?;
let command = extension
let mut command = extension
.context_server_command(id.clone(), extension_project)
.await?;
command.command = extension
.path_from_extension(command.command.as_ref())
.to_string_lossy()
.to_string();
log::info!("loaded command for context server {id}: {command:?}");


@@ -3,8 +3,8 @@ use anyhow::{anyhow, Result};
use collections::HashMap;
use command_palette_hooks::CommandPaletteFilter;
use dap::{
client::SessionId, debugger_settings::DebuggerSettings, ContinuedEvent, LoadedSourceEvent,
ModuleEvent, OutputEvent, StoppedEvent, ThreadEvent,
client::SessionId, debugger_settings::DebuggerSettings, ContinuedEvent, DebugAdapterConfig,
LoadedSourceEvent, ModuleEvent, OutputEvent, StoppedEvent, ThreadEvent,
};
use futures::{channel::mpsc, SinkExt as _};
use gpui::{
@@ -21,6 +21,7 @@ use settings::Settings;
use std::{any::TypeId, path::PathBuf};
use terminal_view::terminal_panel::TerminalPanel;
use ui::prelude::*;
use util::ResultExt;
use workspace::{
dock::{DockPosition, Panel, PanelEvent},
pane, Continue, Disconnect, Pane, Pause, Restart, StepBack, StepInto, StepOut, StepOver, Stop,
@@ -51,6 +52,7 @@ pub struct DebugPanel {
project: WeakEntity<Project>,
workspace: WeakEntity<Workspace>,
_subscriptions: Vec<Subscription>,
pub(crate) last_inert_config: Option<DebugAdapterConfig>,
}
impl DebugPanel {
@@ -63,6 +65,7 @@ impl DebugPanel {
let project = workspace.project().clone();
let dap_store = project.read(cx).dap_store();
let weak_workspace = workspace.weak_handle();
let debug_panel = cx.weak_entity();
let pane = cx.new(|cx| {
let mut pane = Pane::new(
workspace.weak_handle(),
@@ -81,6 +84,7 @@ impl DebugPanel {
pane.set_render_tab_bar_buttons(cx, {
let project = project.clone();
let weak_workspace = weak_workspace.clone();
let debug_panel = debug_panel.clone();
move |_, _, cx| {
let project = project.clone();
let weak_workspace = weak_workspace.clone();
@@ -91,21 +95,34 @@ impl DebugPanel {
.child(
IconButton::new("new-debug-session", IconName::Plus)
.icon_size(IconSize::Small)
.on_click(cx.listener(move |pane, _, window, cx| {
pane.add_item(
Box::new(DebugSession::inert(
project.clone(),
weak_workspace.clone(),
.on_click({
let debug_panel = debug_panel.clone();
cx.listener(move |pane, _, window, cx| {
let config = debug_panel
.read_with(cx, |this: &DebugPanel, _| {
this.last_inert_config.clone()
})
.log_err()
.flatten();
pane.add_item(
Box::new(DebugSession::inert(
project.clone(),
weak_workspace.clone(),
debug_panel.clone(),
config,
window,
cx,
)),
false,
false,
None,
window,
cx,
)),
false,
false,
None,
window,
cx,
);
})),
);
})
}),
)
.into_any_element(),
),
@@ -116,6 +133,8 @@ impl DebugPanel {
Box::new(DebugSession::inert(
project.clone(),
weak_workspace.clone(),
debug_panel.clone(),
None,
window,
cx,
)),
@@ -138,6 +157,7 @@ impl DebugPanel {
pane,
size: px(300.),
_subscriptions,
last_inert_config: None,
project: project.downgrade(),
workspace: workspace.weak_handle(),
};
@@ -280,8 +300,14 @@ impl DebugPanel {
// We already have an item for this session.
return;
}
let session_item =
DebugSession::running(project, self.workspace.clone(), session, window, cx);
let session_item = DebugSession::running(
project,
self.workspace.clone(),
session,
cx.weak_entity(),
window,
cx,
);
self.pane.update(cx, |pane, cx| {
pane.add_item(Box::new(session_item), true, true, None, window, cx);
@@ -504,12 +530,16 @@ impl Panel for DebugPanel {
let Some(project) = self.project.clone().upgrade() else {
return;
};
let config = self.last_inert_config.clone();
let panel = cx.weak_entity();
// todo: Revisit this when we start adding stopped items to the pane (as that'll cause us to add two items).
self.pane.update(cx, |this, cx| {
this.add_item(
Box::new(DebugSession::inert(
project,
self.workspace.clone(),
panel,
config,
window,
cx,
)),


@@ -6,6 +6,7 @@ mod starting;
use std::time::Duration;
use dap::client::SessionId;
use dap::DebugAdapterConfig;
use failed::FailedState;
use gpui::{
percentage, Animation, AnimationExt, AnyElement, App, Entity, EventEmitter, FocusHandle,
@@ -19,11 +20,14 @@ use rpc::proto::{self, PeerId};
use running::RunningState;
use starting::{StartingEvent, StartingState};
use ui::prelude::*;
use util::ResultExt;
use workspace::{
item::{self, Item},
FollowableItem, ViewId, Workspace,
};
use crate::debugger_panel::DebugPanel;
pub(crate) enum DebugSessionState {
Inert(Entity<InertState>),
Starting(Entity<StartingState>),
@@ -44,6 +48,7 @@ pub struct DebugSession {
remote_id: Option<workspace::ViewId>,
mode: DebugSessionState,
dap_store: WeakEntity<DapStore>,
debug_panel: WeakEntity<DebugPanel>,
worktree_store: WeakEntity<WorktreeStore>,
workspace: WeakEntity<Workspace>,
_subscriptions: [Subscription; 1],
@@ -67,6 +72,8 @@ impl DebugSession {
pub(super) fn inert(
project: Entity<Project>,
workspace: WeakEntity<Workspace>,
debug_panel: WeakEntity<DebugPanel>,
config: Option<DebugAdapterConfig>,
window: &mut Window,
cx: &mut App,
) -> Entity<Self> {
@@ -77,7 +84,8 @@ impl DebugSession {
.and_then(|tree| tree.read(cx).abs_path().to_str().map(|str| str.to_string()))
.unwrap_or_default();
let inert = cx.new(|cx| InertState::new(workspace.clone(), &default_cwd, window, cx));
let inert =
cx.new(|cx| InertState::new(workspace.clone(), &default_cwd, config, window, cx));
let project = project.read(cx);
let dap_store = project.dap_store().downgrade();
@@ -89,6 +97,7 @@ impl DebugSession {
mode: DebugSessionState::Inert(inert),
dap_store,
worktree_store,
debug_panel,
workspace,
_subscriptions,
}
@@ -99,6 +108,7 @@ impl DebugSession {
project: Entity<Project>,
workspace: WeakEntity<Workspace>,
session: Entity<Session>,
debug_panel: WeakEntity<DebugPanel>,
window: &mut Window,
cx: &mut App,
) -> Entity<Self> {
@@ -111,6 +121,7 @@ impl DebugSession {
remote_id: None,
mode: DebugSessionState::Running(mode),
dap_store: project.read(cx).dap_store().downgrade(),
debug_panel,
worktree_store: project.read(cx).worktree_store().downgrade(),
workspace,
})
@@ -148,6 +159,11 @@ impl DebugSession {
let dap_store = self.dap_store.clone();
let InertEvent::Spawned { config } = event;
let config = config.clone();
self.debug_panel
.update(cx, |this, _| this.last_inert_config = Some(config.clone()))
.log_err();
let worktree = self
.worktree_store
.update(cx, |this, _| this.worktrees().next())


@@ -32,6 +32,15 @@ impl SpawnMode {
}
}
impl From<DebugRequestType> for SpawnMode {
fn from(request: DebugRequestType) -> Self {
match request {
DebugRequestType::Launch => SpawnMode::Launch,
DebugRequestType::Attach(_) => SpawnMode::Attach,
}
}
}
pub(crate) struct InertState {
focus_handle: FocusHandle,
selected_debugger: Option<SharedString>,
@@ -46,27 +55,56 @@ impl InertState {
pub(super) fn new(
workspace: WeakEntity<Workspace>,
default_cwd: &str,
debug_config: Option<DebugAdapterConfig>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let selected_debugger = debug_config.as_ref().and_then(|config| match config.kind {
DebugAdapterKind::Lldb => Some("LLDB".into()),
DebugAdapterKind::Go(_) => Some("Delve".into()),
DebugAdapterKind::Php(_) => Some("PHP".into()),
DebugAdapterKind::Javascript(_) => Some("JavaScript".into()),
DebugAdapterKind::Python(_) => Some("Debugpy".into()),
_ => None,
});
let spawn_mode = debug_config
.as_ref()
.map(|config| config.request.clone().into())
.unwrap_or_default();
let program = debug_config
.as_ref()
.and_then(|config| config.program.to_owned());
let program_editor = cx.new(|cx| {
let mut editor = Editor::single_line(window, cx);
editor.set_placeholder_text("Program path", cx);
if let Some(program) = program {
editor.insert(&program, window, cx);
} else {
editor.set_placeholder_text("Program path", cx);
}
editor
});
let cwd = debug_config
.and_then(|config| config.cwd.map(|cwd| cwd.to_owned()))
.unwrap_or_else(|| PathBuf::from(default_cwd));
let cwd_editor = cx.new(|cx| {
let mut editor = Editor::single_line(window, cx);
editor.insert(default_cwd, window, cx);
editor.insert(cwd.to_str().unwrap_or(default_cwd), window, cx);
editor.set_placeholder_text("Working directory", cx);
editor
});
Self {
workspace,
cwd_editor,
program_editor,
selected_debugger: None,
selected_debugger,
spawn_mode,
focus_handle: cx.focus_handle(),
spawn_mode: SpawnMode::default(),
popover_handle: Default::default(),
}
}
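The new constructor above derives its initial UI state field by field from an optional previously-used config, falling back to defaults only where the config has nothing to offer. A std-only sketch of the same `Option`-driven prefill (the `Config` and `Prefill` types here are illustrative, not the real `DebugAdapterConfig`):

```rust
#[derive(Clone)]
struct Config {
    program: Option<String>,
    cwd: Option<String>,
}

#[derive(Debug, PartialEq)]
struct Prefill {
    program: Option<String>,
    cwd: String,
}

// Each field falls back independently: a saved config may carry a cwd
// but no program, and the default cwd is used only when neither exists.
fn prefill(config: Option<&Config>, default_cwd: &str) -> Prefill {
    Prefill {
        program: config.and_then(|c| c.program.clone()),
        cwd: config
            .and_then(|c| c.cwd.clone())
            .unwrap_or_else(|| default_cwd.to_string()),
    }
}

fn main() {
    let saved = Config { program: Some("a.out".into()), cwd: None };
    assert_eq!(prefill(Some(&saved), "/tmp").program.as_deref(), Some("a.out"));
    assert_eq!(prefill(Some(&saved), "/tmp").cwd, "/tmp");
    assert_eq!(prefill(None, "/tmp").program, None);
}
```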


@@ -602,7 +602,7 @@ async fn test_handle_start_debugging_reverse_request(
});
let child_client = child_session.update(cx, |session, _| session.adapter_client().unwrap());
client
child_client
.on_request::<dap::requests::Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
@@ -645,6 +645,230 @@ async fn test_handle_start_debugging_reverse_request(
shutdown_session.await.unwrap();
}
#[gpui::test]
async fn test_shutdown_children_when_parent_session_shutdown(
executor: BackgroundExecutor,
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(executor.clone());
fs.insert_tree(
"/project",
json!({
"main.rs": "First line\nSecond line\nThird line\nFourth line",
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let dap_store = project.update(cx, |project, _| project.dap_store());
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(DebugRequestType::Launch, None, None), cx)
});
let parent_session = task.await.unwrap();
let client = parent_session.update(cx, |session, _| session.adapter_client().unwrap());
client
.on_request::<dap::requests::Threads, _>(move |_, _| {
Ok(dap::ThreadsResponse {
threads: vec![dap::Thread {
id: 1,
name: "Thread 1".into(),
}],
})
})
.await;
client.on_response::<StartDebugging, _>(move |_| {}).await;
// start first child session
client
.fake_reverse_request::<StartDebugging>(StartDebuggingRequestArguments {
configuration: json!({}),
request: StartDebuggingRequestArgumentsRequest::Launch,
})
.await;
cx.run_until_parked();
// start second child session
client
.fake_reverse_request::<StartDebugging>(StartDebuggingRequestArguments {
configuration: json!({}),
request: StartDebuggingRequestArgumentsRequest::Launch,
})
.await;
cx.run_until_parked();
// configure first child session
let first_child_session = dap_store.read_with(cx, |dap_store, _| {
dap_store.session_by_id(SessionId(1)).unwrap()
});
let first_child_client =
first_child_session.update(cx, |session, _| session.adapter_client().unwrap());
first_child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
// configure second child session
let second_child_session = dap_store.read_with(cx, |dap_store, _| {
dap_store.session_by_id(SessionId(2)).unwrap()
});
let second_child_client =
second_child_session.update(cx, |session, _| session.adapter_client().unwrap());
second_child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
cx.run_until_parked();
// shutdown parent session
dap_store
.update(cx, |dap_store, cx| {
dap_store.shutdown_session(parent_session.read(cx).session_id(), cx)
})
.await
.unwrap();
// assert parent session and all child sessions are shut down
dap_store.update(cx, |dap_store, cx| {
assert!(dap_store
.session_by_id(parent_session.read(cx).session_id())
.is_none());
assert!(dap_store
.session_by_id(first_child_session.read(cx).session_id())
.is_none());
assert!(dap_store
.session_by_id(second_child_session.read(cx).session_id())
.is_none());
});
}
#[gpui::test]
async fn test_shutdown_parent_session_if_all_children_are_shutdown(
executor: BackgroundExecutor,
cx: &mut TestAppContext,
) {
init_test(cx);
let fs = FakeFs::new(executor.clone());
fs.insert_tree(
"/project",
json!({
"main.rs": "First line\nSecond line\nThird line\nFourth line",
}),
)
.await;
let project = Project::test(fs, ["/project".as_ref()], cx).await;
let dap_store = project.update(cx, |project, _| project.dap_store());
let workspace = init_test_workspace(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let task = project.update(cx, |project, cx| {
project.start_debug_session(dap::test_config(DebugRequestType::Launch, None, None), cx)
});
let parent_session = task.await.unwrap();
let client = parent_session.update(cx, |session, _| session.adapter_client().unwrap());
client.on_response::<StartDebugging, _>(move |_| {}).await;
// start first child session
client
.fake_reverse_request::<StartDebugging>(StartDebuggingRequestArguments {
configuration: json!({}),
request: StartDebuggingRequestArgumentsRequest::Launch,
})
.await;
cx.run_until_parked();
// start second child session
client
.fake_reverse_request::<StartDebugging>(StartDebuggingRequestArguments {
configuration: json!({}),
request: StartDebuggingRequestArgumentsRequest::Launch,
})
.await;
cx.run_until_parked();
// configure first child session
let first_child_session = dap_store.read_with(cx, |dap_store, _| {
dap_store.session_by_id(SessionId(1)).unwrap()
});
let first_child_client =
first_child_session.update(cx, |session, _| session.adapter_client().unwrap());
first_child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
// configure second child session
let second_child_session = dap_store.read_with(cx, |dap_store, _| {
dap_store.session_by_id(SessionId(2)).unwrap()
});
let second_child_client =
second_child_session.update(cx, |session, _| session.adapter_client().unwrap());
second_child_client
.on_request::<Disconnect, _>(move |_, _| Ok(()))
.await;
cx.run_until_parked();
// shutdown first child session
dap_store
.update(cx, |dap_store, cx| {
dap_store.shutdown_session(first_child_session.read(cx).session_id(), cx)
})
.await
.unwrap();
// assert parent session and second child session still exist
dap_store.update(cx, |dap_store, cx| {
assert!(dap_store
.session_by_id(parent_session.read(cx).session_id())
.is_some());
assert!(dap_store
.session_by_id(first_child_session.read(cx).session_id())
.is_none());
assert!(dap_store
.session_by_id(second_child_session.read(cx).session_id())
.is_some());
});
// shutdown second child session
dap_store
.update(cx, |dap_store, cx| {
dap_store.shutdown_session(second_child_session.read(cx).session_id(), cx)
})
.await
.unwrap();
// assert parent session got shutdown by second child session
// because it was the last child
dap_store.update(cx, |dap_store, cx| {
assert!(dap_store
.session_by_id(parent_session.read(cx).session_id())
.is_none());
assert!(dap_store
.session_by_id(second_child_session.read(cx).session_id())
.is_none());
});
}
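The two tests above pin down a symmetric invariant: shutting down a parent session tears down all of its children, and shutting down the last remaining child tears down the parent. A minimal std-only model of that bookkeeping, assuming sessions track only a parent id (this is a sketch of the invariant, not the real `DapStore`):

```rust
use std::collections::HashMap;

#[derive(Default)]
struct SessionStore {
    // session id -> parent id (None for root sessions)
    parents: HashMap<u32, Option<u32>>,
}

impl SessionStore {
    fn add(&mut self, id: u32, parent: Option<u32>) {
        self.parents.insert(id, parent);
    }

    fn shutdown(&mut self, id: u32) {
        let Some(parent) = self.parents.remove(&id) else {
            return;
        };
        // Shutting down a parent cascades to all of its children.
        let children: Vec<u32> = self
            .parents
            .iter()
            .filter(|(_, p)| **p == Some(id))
            .map(|(child, _)| *child)
            .collect();
        for child in children {
            self.shutdown(child);
        }
        // Shutting down the last child shuts the parent down too.
        if let Some(parent) = parent {
            let siblings_left = self.parents.values().any(|p| *p == Some(parent));
            if !siblings_left {
                self.shutdown(parent);
            }
        }
    }

    fn contains(&self, id: u32) -> bool {
        self.parents.contains_key(&id)
    }
}

fn main() {
    let mut store = SessionStore::default();
    store.add(0, None);
    store.add(1, Some(0));
    store.add(2, Some(0));
    store.shutdown(1);
    assert!(store.contains(0) && store.contains(2));
    store.shutdown(2); // last child: the parent goes down with it
    assert!(!store.contains(0));
}
```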
#[gpui::test]
async fn test_debug_panel_item_thread_status_reset_on_failure(
executor: BackgroundExecutor,


@@ -275,6 +275,7 @@ actions!(
ConvertToUpperCamelCase,
ConvertToUpperCase,
Copy,
CopyAndTrim,
CopyFileLocation,
CopyHighlightJson,
CopyFileName,


@@ -665,10 +665,11 @@ impl CompletionsMenu {
.collect()
};
// Remove all candidates where the query's start does not match the start of any word in the candidate
let mut additional_matches = Vec::new();
// Deprioritize all candidates where the query's start does not match the start of any word in the candidate
if let Some(query) = query {
if let Some(query_start) = query.chars().next() {
matches.retain(|string_match| {
let (primary, secondary) = matches.into_iter().partition(|string_match| {
split_words(&string_match.string).any(|word| {
// Check that the first codepoint of the word as lowercase matches the first
// codepoint of the query as lowercase
@@ -678,6 +679,8 @@ impl CompletionsMenu {
.all(|(word_cp, query_cp)| word_cp == query_cp)
})
});
matches = primary;
additional_matches = secondary;
}
}
@@ -740,6 +743,8 @@ impl CompletionsMenu {
}
drop(completions);
matches.extend(additional_matches);
*self.entries.borrow_mut() = matches;
self.selected_item = 0;
// This keeps the display consistent when y_flipped.
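The change above replaces `retain` (which discarded non-word-start matches outright) with a partition that merely deprioritizes them: word-start matches keep their order at the front, and the rest are appended afterwards. A std-only sketch of the same ranking step, where `split_words` is a simplified stand-in for the editor's word splitter:

```rust
// Simplified stand-in for the editor's `split_words`.
fn split_words(s: &str) -> impl Iterator<Item = &str> {
    s.split(|c: char| c == '_' || c == '-' || c.is_whitespace())
        .filter(|w| !w.is_empty())
}

fn rank(mut matches: Vec<String>, query: &str) -> Vec<String> {
    let Some(query_start) = query.chars().next() else {
        return matches;
    };
    // Partition instead of retain: candidates whose words start with the
    // query's first codepoint come first; the rest are kept at the back.
    let (primary, secondary): (Vec<_>, Vec<_>) = matches.drain(..).partition(|m| {
        split_words(m).any(|word| {
            word.chars()
                .flat_map(|c| c.to_lowercase())
                .zip(query_start.to_lowercase())
                .all(|(word_cp, query_cp)| word_cp == query_cp)
        })
    });
    let mut ranked = primary;
    ranked.extend(secondary);
    ranked
}

fn main() {
    let out = rank(
        vec!["last_word".into(), "plaster".into(), "lint".into()],
        "l",
    );
    // "plaster" still appears, but only after the word-start matches.
    assert_eq!(out, vec!["last_word", "lint", "plaster"]);
}
```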


@@ -9429,7 +9429,15 @@ impl Editor {
self.do_paste(&text, metadata, false, window, cx);
}
pub fn copy_and_trim(&mut self, _: &CopyAndTrim, _: &mut Window, cx: &mut Context<Self>) {
self.do_copy(true, cx);
}
pub fn copy(&mut self, _: &Copy, _: &mut Window, cx: &mut Context<Self>) {
self.do_copy(false, cx);
}
fn do_copy(&self, strip_leading_indents: bool, cx: &mut Context<Self>) {
let selections = self.selections.all::<Point>(cx);
let buffer = self.buffer.read(cx).read(cx);
let mut text = String::new();
@@ -9438,7 +9446,7 @@ impl Editor {
{
let max_point = buffer.max_point();
let mut is_first = true;
for selection in selections.iter() {
for selection in &selections {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
@@ -9446,21 +9454,55 @@ impl Editor {
start = Point::new(start.row, 0);
end = cmp::min(max_point, Point::new(end.row + 1, 0));
}
if is_first {
is_first = false;
let mut trimmed_selections = Vec::new();
if strip_leading_indents && end.row.saturating_sub(start.row) > 0 {
let row = MultiBufferRow(start.row);
let first_indent = buffer.indent_size_for_line(row);
if first_indent.len == 0 || start.column > first_indent.len {
trimmed_selections.push(start..end);
} else {
trimmed_selections.push(
Point::new(row.0, first_indent.len)
..Point::new(row.0, buffer.line_len(row)),
);
for row in start.row + 1..=end.row {
let row_indent_size = buffer.indent_size_for_line(MultiBufferRow(row));
if row_indent_size.len >= first_indent.len {
trimmed_selections.push(
Point::new(row, first_indent.len)
..Point::new(row, buffer.line_len(MultiBufferRow(row))),
);
} else {
trimmed_selections.clear();
trimmed_selections.push(start..end);
break;
}
}
}
} else {
text += "\n";
trimmed_selections.push(start..end);
}
let mut len = 0;
for chunk in buffer.text_for_range(start..end) {
text.push_str(chunk);
len += chunk.len();
for trimmed_range in trimmed_selections {
if is_first {
is_first = false;
} else {
text += "\n";
}
let mut len = 0;
for chunk in buffer.text_for_range(trimmed_range.start..trimmed_range.end) {
text.push_str(chunk);
len += chunk.len();
}
clipboard_selections.push(ClipboardSelection {
len,
is_entire_line,
first_line_indent: buffer
.indent_size_for_line(MultiBufferRow(trimmed_range.start.row))
.len,
});
}
clipboard_selections.push(ClipboardSelection {
len,
is_entire_line,
first_line_indent: buffer.indent_size_for_line(MultiBufferRow(start.row)).len,
});
}
}
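The trimming logic above strips the first line's indent from a multi-line selection only when every following line is indented at least as deeply; if any line is shallower, it falls back to copying the raw range. A std-only approximation of that rule on plain strings (a sketch of the idea behind `do_copy`, not the real multibuffer code):

```rust
// Strip the first line's leading indent from every line, but only if
// all following non-empty lines are indented at least that deeply;
// otherwise return the text unchanged, matching do_copy's fallback.
fn copy_and_trim(text: &str) -> String {
    let mut lines = text.lines();
    let Some(first) = lines.next() else {
        return String::new();
    };
    let indent = first.len() - first.trim_start().len();
    let mut out = vec![first.trim_start().to_string()];
    for line in lines {
        let line_indent = line.len() - line.trim_start().len();
        if line_indent < indent && !line.trim().is_empty() {
            // A shallower line: keep the whole selection as-is.
            return text.to_string();
        }
        out.push(line[indent.min(line.len())..].to_string());
    }
    out.join("\n")
}

fn main() {
    assert_eq!(
        copy_and_trim("    if x {\n        y();\n    }"),
        "if x {\n    y();\n}"
    );
    // A shallower second line disables trimming entirely.
    assert_eq!(
        copy_and_trim("        deep\n    shallow"),
        "        deep\n    shallow"
    );
}
```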


@@ -4918,6 +4918,180 @@ async fn test_clipboard(cx: &mut TestAppContext) {
tˇhe lazy dog"});
}
#[gpui::test]
async fn test_copy_trim(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let mut cx = EditorTestContext::new(cx).await;
cx.set_state(
r#" «for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);ˇ»
end = cmp::min(max_point, Point::new(end.row + 1, 0));
}
"#,
);
cx.update_editor(|e, window, cx| e.copy(&Copy, window, cx));
assert_eq!(
cx.read_from_clipboard()
.and_then(|item| item.text().as_deref().map(str::to_string)),
Some(
"for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);"
.to_string()
),
"Regular copying preserves all indentation selected",
);
cx.update_editor(|e, window, cx| e.copy_and_trim(&CopyAndTrim, window, cx));
assert_eq!(
cx.read_from_clipboard()
.and_then(|item| item.text().as_deref().map(str::to_string)),
Some(
"for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);"
.to_string()
),
"Copying with stripping should strip all leading whitespaces"
);
cx.set_state(
r#" « for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);ˇ»
end = cmp::min(max_point, Point::new(end.row + 1, 0));
}
"#,
);
cx.update_editor(|e, window, cx| e.copy(&Copy, window, cx));
assert_eq!(
cx.read_from_clipboard()
.and_then(|item| item.text().as_deref().map(str::to_string)),
Some(
" for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);"
.to_string()
),
"Regular copying preserves all indentation selected",
);
cx.update_editor(|e, window, cx| e.copy_and_trim(&CopyAndTrim, window, cx));
assert_eq!(
cx.read_from_clipboard()
.and_then(|item| item.text().as_deref().map(str::to_string)),
Some(
"for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);"
.to_string()
),
"Copying with stripping should strip all leading whitespaces, even if some of it was selected"
);
cx.set_state(
r#" «ˇ for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);»
end = cmp::min(max_point, Point::new(end.row + 1, 0));
}
"#,
);
cx.update_editor(|e, window, cx| e.copy(&Copy, window, cx));
assert_eq!(
cx.read_from_clipboard()
.and_then(|item| item.text().as_deref().map(str::to_string)),
Some(
" for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);"
.to_string()
),
"Regular copying for reverse selection works the same",
);
cx.update_editor(|e, window, cx| e.copy_and_trim(&CopyAndTrim, window, cx));
assert_eq!(
cx.read_from_clipboard()
.and_then(|item| item.text().as_deref().map(str::to_string)),
Some(
"for selection in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);"
.to_string()
),
"Copying with stripping for reverse selection works the same"
);
cx.set_state(
r#" for selection «in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);ˇ»
end = cmp::min(max_point, Point::new(end.row + 1, 0));
}
"#,
);
cx.update_editor(|e, window, cx| e.copy(&Copy, window, cx));
assert_eq!(
cx.read_from_clipboard()
.and_then(|item| item.text().as_deref().map(str::to_string)),
Some(
"in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);"
.to_string()
),
"When selecting past the indent, the copying works as usual",
);
cx.update_editor(|e, window, cx| e.copy_and_trim(&CopyAndTrim, window, cx));
assert_eq!(
cx.read_from_clipboard()
.and_then(|item| item.text().as_deref().map(str::to_string)),
Some(
"in selections.iter() {
let mut start = selection.start;
let mut end = selection.end;
let is_entire_line = selection.is_empty() || self.selections.line_mode;
if is_entire_line {
start = Point::new(start.row, 0);"
.to_string()
),
"When selecting past the indent, nothing is trimmed"
);
}
#[gpui::test]
async fn test_paste_multiline(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -9492,7 +9666,7 @@ async fn test_word_completions_continue_on_typing(cx: &mut TestAppContext) {
}
});
cx.simulate_keystroke("s");
cx.simulate_keystroke("l");
cx.executor().run_until_parked();
cx.condition(|editor, _| editor.context_menu_visible())
.await;
@@ -9501,7 +9675,7 @@ async fn test_word_completions_continue_on_typing(cx: &mut TestAppContext) {
{
assert_eq!(
completion_menu_entries(&menu),
&["second"],
&["last"],
"After showing word completions, further editing should filter them and not query the LSP"
);
} else {
@@ -12871,7 +13045,7 @@ async fn test_completions_in_languages_with_extra_word_characters(cx: &mut TestA
overrides: [(
"element".into(),
LanguageConfigOverride {
word_characters: Override::Set(['-'].into_iter().collect()),
completion_query_characters: Override::Set(['-'].into_iter().collect()),
..Default::default()
},
)]


@@ -244,6 +244,7 @@ impl EditorElement {
register_action(editor, window, Editor::kill_ring_cut);
register_action(editor, window, Editor::kill_ring_yank);
register_action(editor, window, Editor::copy);
register_action(editor, window, Editor::copy_and_trim);
register_action(editor, window, Editor::paste);
register_action(editor, window, Editor::undo);
register_action(editor, window, Editor::redo);


@@ -629,18 +629,20 @@ impl Item for Editor {
self.buffer()
.read(cx)
.as_singleton()
.and_then(|buffer| buffer.read(cx).project_path(cx))
.and_then(|path| {
.and_then(|buffer| {
let buffer = buffer.read(cx);
let path = buffer.project_path(cx)?;
let buffer_id = buffer.remote_id();
let project = self.project.as_ref()?.read(cx);
let entry = project.entry_for_path(&path, cx)?;
let git_status = project
.worktree_for_id(path.worktree_id, cx)?
let (repo, repo_path) = project
.git_store()
.read(cx)
.snapshot()
.status_for_file(path.path)?;
.repository_and_path_for_buffer_id(buffer_id, cx)?;
let status = repo.read(cx).status_for_path(&repo_path)?.status;
Some(entry_git_aware_label_color(
git_status.summary(),
status.summary(),
entry.is_ignored,
params.selected,
))


@@ -137,9 +137,9 @@ pub fn deploy_context_menu(
menu
} else {
// Don't show the context menu if there isn't a project associated with this editor
if editor.project.is_none() {
let Some(project) = editor.project.clone() else {
return;
}
};
let display_map = editor.selections.display_map(cx);
let buffer = &editor.snapshot(window, cx).buffer_snapshot;
@@ -159,10 +159,13 @@ pub fn deploy_context_menu(
.all::<PointUtf16>(cx)
.into_iter()
.any(|s| !s.is_empty());
let has_git_repo = editor.project.as_ref().map_or(false, |project| {
project.update(cx, |project, cx| {
project.get_first_worktree_root_repo(cx).is_some()
})
let has_git_repo = anchor.buffer_id.is_some_and(|buffer_id| {
project
.read(cx)
.git_store()
.read(cx)
.repository_and_path_for_buffer_id(buffer_id, cx)
.is_some()
});
ui::ContextMenu::build(window, cx, |menu, _window, _cx| {


@@ -264,7 +264,7 @@ impl EditorLspTestContext {
..Default::default()
},
block_comment: Some(("<!-- ".into(), " -->".into())),
word_characters: ['-'].into_iter().collect(),
completion_query_characters: ['-'].into_iter().collect(),
..Default::default()
},
Some(tree_sitter_html::LANGUAGE.into()),


@@ -1,14 +1,5 @@
fn main() {
if cfg!(target_os = "macos") {
println!("cargo:rustc-env=MACOSX_DEPLOYMENT_TARGET=10.15.7");
println!("cargo:rerun-if-env-changed=ZED_BUNDLE");
if std::env::var("ZED_BUNDLE").ok().as_deref() == Some("true") {
// Find WebRTC.framework in the Frameworks folder when running as part of an application bundle.
println!("cargo:rustc-link-arg=-Wl,-rpath,@executable_path/../Frameworks");
} else {
// Find WebRTC.framework as a sibling of the executable when running outside of an application bundle.
println!("cargo:rustc-link-arg=-Wl,-rpath,@executable_path");
}
}
}


@@ -906,7 +906,10 @@ impl ExtensionStore {
.await
}
})
.await?;
.await
.inspect_err(|error| {
util::log_err(error);
})?;
let output_path = &extensions_dir.join(extension_id.as_ref());
if let Some(metadata) = fs.metadata(output_path).await? {


@@ -612,6 +612,7 @@ impl ExtensionsPage {
self.buttons_for_entry(extension, &status, has_dev_extension, cx);
let version = extension.manifest.version.clone();
let repository_url = extension.manifest.repository.clone();
let authors = extension.manifest.authors.clone();
let installed_version = match status {
ExtensionStatus::Installed(installed_version) => Some(installed_version),
@@ -749,6 +750,7 @@ impl ExtensionsPage {
Some(Self::render_remote_extension_context_menu(
&this,
extension_id.clone(),
authors.clone(),
window,
cx,
))
@@ -761,6 +763,7 @@ impl ExtensionsPage {
fn render_remote_extension_context_menu(
this: &Entity<Self>,
extension_id: Arc<str>,
authors: Vec<String>,
window: &mut Window,
cx: &mut App,
) -> Entity<ContextMenu> {
@@ -782,6 +785,12 @@ impl ExtensionsPage {
cx.write_to_clipboard(ClipboardItem::new_string(extension_id.to_string()));
}
})
.entry("Copy Author Info", None, {
let authors = authors.clone();
move |_, cx| {
cx.write_to_clipboard(ClipboardItem::new_string(authors.join(", ")));
}
})
});
context_menu
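The new "Copy Author Info" entry writes the manifest's author list to the clipboard as a single comma-separated string; `join` on the `Vec<String>` does all the formatting (the helper name here is illustrative, not from the diff):

```rust
// Format an extension's author list the way the context menu entry copies it.
fn format_author_info(authors: &[String]) -> String {
    // An empty author list yields an empty string rather than an error.
    authors.join(", ")
}

fn main() {
    let authors = vec!["Ada Lovelace".to_string(), "Grace Hopper".to_string()];
    assert_eq!(format_author_info(&authors), "Ada Lovelace, Grace Hopper");
    assert_eq!(format_author_info(&[]), "");
}
```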

View File

@@ -15,25 +15,12 @@ path = "src/feedback.rs"
test-support = []
[dependencies]
anyhow.workspace = true
bitflags.workspace = true
client.workspace = true
db.workspace = true
editor.workspace = true
futures.workspace = true
gpui.workspace = true
http_client.workspace = true
human_bytes = "0.4.1"
language.workspace = true
log.workspace = true
menu.workspace = true
project.workspace = true
regex.workspace = true
release_channel.workspace = true
serde.workspace = true
serde_derive.workspace = true
serde_json.workspace = true
smol.workspace = true
sysinfo.workspace = true
ui.workspace = true
urlencoding.workspace = true

View File

@@ -11,19 +11,16 @@ actions!(
zed,
[
CopySystemSpecsIntoClipboard,
EmailZed,
FileBugReport,
OpenZedRepo,
RequestFeature,
OpenZedRepo
]
);
const fn zed_repo_url() -> &'static str {
"https://github.com/zed-industries/zed"
}
const ZED_REPO_URL: &str = "https://github.com/zed-industries/zed";
fn request_feature_url() -> String {
"https://github.com/zed-industries/zed/discussions/new/choose".to_string()
}
const REQUEST_FEATURE_URL: &str = "https://github.com/zed-industries/zed/discussions/new/choose";
fn file_bug_report_url(specs: &SystemSpecs) -> String {
format!(
@@ -38,6 +35,18 @@ fn file_bug_report_url(specs: &SystemSpecs) -> String {
)
}
fn email_zed_url(specs: &SystemSpecs) -> String {
format!(
concat!("mailto:hi@zed.dev", "?", "body={}"),
email_body(specs)
)
}
fn email_body(specs: &SystemSpecs) -> String {
let body = format!("\n\nSystem Information:\n\n{}", specs);
urlencoding::encode(&body).to_string()
}
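`email_zed_url` builds a `mailto:` link whose body must be percent-encoded so newlines and spaces survive inside the URL. A std-only sketch of the encoding `urlencoding::encode` performs, on the assumption that it escapes every byte outside RFC 3986's unreserved set:

```rust
// Percent-encode all bytes outside RFC 3986's unreserved set.
fn percent_encode(input: &str) -> String {
    let mut out = String::new();
    for byte in input.bytes() {
        match byte {
            b'A'..=b'Z' | b'a'..=b'z' | b'0'..=b'9' | b'-' | b'.' | b'_' | b'~' => {
                out.push(byte as char)
            }
            _ => out.push_str(&format!("%{byte:02X}")),
        }
    }
    out
}

fn main() {
    let body = format!("\n\nSystem Information:\n\n{}", "macOS 14.2");
    let url = format!("mailto:hi@zed.dev?body={}", percent_encode(&body));
    // Each newline becomes %0A, each space %20.
    assert!(url.starts_with("mailto:hi@zed.dev?body=%0A%0ASystem"));
    assert_eq!(percent_encode("a b"), "a%20b");
}
```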
pub fn init(cx: &mut App) {
cx.observe_new(|workspace: &mut Workspace, window, cx| {
let Some(window) = window else {
@@ -66,14 +75,8 @@ pub fn init(cx: &mut App) {
})
.detach();
})
.register_action(|_, _: &RequestFeature, window, cx| {
cx.spawn_in(window, async move |_, cx| {
cx.update(|_, cx| {
cx.open_url(&request_feature_url());
})
.log_err();
})
.detach();
.register_action(|_, _: &RequestFeature, _, cx| {
cx.open_url(REQUEST_FEATURE_URL);
})
.register_action(move |_, _: &FileBugReport, window, cx| {
let specs = SystemSpecs::new(window, cx);
@@ -86,8 +89,19 @@ pub fn init(cx: &mut App) {
})
.detach();
})
.register_action(move |_, _: &EmailZed, window, cx| {
let specs = SystemSpecs::new(window, cx);
cx.spawn_in(window, async move |_, cx| {
let specs = specs.await;
cx.update(|_, cx| {
cx.open_url(&email_zed_url(&specs));
})
.log_err();
})
.detach();
})
.register_action(move |_, _: &OpenZedRepo, _, cx| {
cx.open_url(zed_repo_url());
cx.open_url(ZED_REPO_URL);
});
})
.detach();

View File

@@ -1,421 +1,37 @@
use std::{
ops::RangeInclusive,
sync::{Arc, LazyLock},
time::Duration,
};
use anyhow::{anyhow, bail};
use bitflags::bitflags;
use client::Client;
use db::kvp::KEY_VALUE_STORE;
use editor::{Editor, EditorEvent};
use futures::AsyncReadExt;
use gpui::{
div, rems, App, Context, DismissEvent, Entity, EventEmitter, FocusHandle, Focusable,
PromptLevel, Render, Task, Window,
};
use http_client::HttpClient;
use language::Buffer;
use project::Project;
use regex::Regex;
use serde_derive::Serialize;
use ui::{prelude::*, Button, ButtonStyle, IconPosition, Tooltip};
use util::ResultExt;
use workspace::{DismissDecision, ModalView, Workspace};
use gpui::{App, Context, DismissEvent, EventEmitter, FocusHandle, Focusable, Render, Window};
use ui::{prelude::*, IconPosition};
use workspace::{ModalView, Workspace};
use zed_actions::feedback::GiveFeedback;
use crate::{system_specs::SystemSpecs, OpenZedRepo};
// For UI testing purposes
const SEND_SUCCESS_IN_DEV_MODE: bool = true;
const SEND_TIME_IN_DEV_MODE: Duration = Duration::from_secs(2);
// Temporary, until tests are in place
#[cfg(debug_assertions)]
const DEV_MODE: bool = true;
#[cfg(not(debug_assertions))]
const DEV_MODE: bool = false;
const DATABASE_KEY_NAME: &str = "email_address";
static EMAIL_REGEX: LazyLock<Regex> =
LazyLock::new(|| Regex::new(r"\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Z|a-z]{2,}\b").unwrap());
const FEEDBACK_CHAR_LIMIT: RangeInclusive<i32> = 10..=5000;
const FEEDBACK_SUBMISSION_ERROR_TEXT: &str =
"Feedback failed to submit, see error log for details.";
#[derive(Serialize)]
struct FeedbackRequestBody<'a> {
feedback_text: &'a str,
email: Option<String>,
installation_id: Option<Arc<str>>,
metrics_id: Option<Arc<str>>,
system_specs: SystemSpecs,
is_staff: bool,
}
bitflags! {
#[derive(Debug, Clone, PartialEq)]
struct InvalidStateFlags: u8 {
const EmailAddress = 0b00000001;
const CharacterCount = 0b00000010;
}
}
#[derive(Debug, Clone, PartialEq)]
enum CannotSubmitReason {
InvalidState { flags: InvalidStateFlags },
AwaitingSubmission,
}
#[derive(Debug, Clone, PartialEq)]
enum SubmissionState {
CanSubmit,
CannotSubmit { reason: CannotSubmitReason },
}
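The removed modal tracked why submission was blocked with a `bitflags` bitset, so several independent validation failures (invalid email address, character count outside the 10..=5000 range) could be set and queried at once. A std-only sketch of the same pattern without the `bitflags` crate:

```rust
const EMAIL_ADDRESS: u8 = 0b0000_0001;
const CHARACTER_COUNT: u8 = 0b0000_0010;

// Accumulate one bit per failed validation; zero means the form can submit.
fn validate(email_ok: bool, char_count: i32) -> u8 {
    let mut flags = 0u8;
    if !email_ok {
        flags |= EMAIL_ADDRESS;
    }
    if !(10..=5000).contains(&char_count) {
        flags |= CHARACTER_COUNT;
    }
    flags
}

fn main() {
    assert_eq!(validate(true, 100), 0);
    let flags = validate(false, 3);
    assert_ne!(flags & EMAIL_ADDRESS, 0);
    assert_ne!(flags & CHARACTER_COUNT, 0);
}
```

The `bitflags` crate adds type safety and `Debug` output on top of this, but the underlying representation is the same bitwise-OR accumulation.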
use crate::{EmailZed, FileBugReport, OpenZedRepo, RequestFeature};
pub struct FeedbackModal {
system_specs: SystemSpecs,
feedback_editor: Entity<Editor>,
email_address_editor: Entity<Editor>,
submission_state: Option<SubmissionState>,
dismiss_modal: bool,
character_count: i32,
focus_handle: FocusHandle,
}
impl Focusable for FeedbackModal {
fn focus_handle(&self, cx: &App) -> FocusHandle {
self.feedback_editor.focus_handle(cx)
fn focus_handle(&self, _: &App) -> FocusHandle {
self.focus_handle.clone()
}
}
impl EventEmitter<DismissEvent> for FeedbackModal {}
impl ModalView for FeedbackModal {
fn on_before_dismiss(
&mut self,
window: &mut Window,
cx: &mut Context<Self>,
) -> DismissDecision {
self.update_email_in_store(window, cx);
if self.dismiss_modal {
return DismissDecision::Dismiss(true);
}
let has_feedback = self.feedback_editor.read(cx).text_option(cx).is_some();
if !has_feedback {
return DismissDecision::Dismiss(true);
}
let answer = window.prompt(
PromptLevel::Info,
"Discard feedback?",
None,
&["Yes", "No"],
cx,
);
cx.spawn_in(window, async move |this, cx| {
if answer.await.ok() == Some(0) {
this.update(cx, |this, cx| {
this.dismiss_modal = true;
cx.emit(DismissEvent)
})
.log_err();
}
})
.detach();
DismissDecision::Pending
}
}
impl ModalView for FeedbackModal {}
impl FeedbackModal {
pub fn register(workspace: &mut Workspace, _: &mut Window, cx: &mut Context<Workspace>) {
let _handle = cx.entity().downgrade();
workspace.register_action(move |workspace, _: &GiveFeedback, window, cx| {
workspace
.with_local_workspace(window, cx, |workspace, window, cx| {
let markdown = workspace
.app_state()
.languages
.language_for_name("Markdown");
let project = workspace.project().clone();
let system_specs = SystemSpecs::new(window, cx);
cx.spawn_in(window, async move |workspace, cx| {
let markdown = markdown.await.log_err();
let buffer = project.update(cx, |project, cx| {
project.create_local_buffer("", markdown, cx)
})?;
let system_specs = system_specs.await;
workspace.update_in(cx, |workspace, window, cx| {
workspace.toggle_modal(window, cx, move |window, cx| {
FeedbackModal::new(system_specs, project, buffer, window, cx)
});
})?;
anyhow::Ok(())
})
.detach_and_log_err(cx);
})
.detach_and_log_err(cx);
workspace.toggle_modal(window, cx, move |_, cx| FeedbackModal::new(cx));
});
}
pub fn new(
system_specs: SystemSpecs,
project: Entity<Project>,
buffer: Entity<Buffer>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let email_address_editor = cx.new(|cx| {
let mut editor = Editor::single_line(window, cx);
editor.set_placeholder_text("Email address (optional)", cx);
if let Ok(Some(email_address)) = KEY_VALUE_STORE.read_kvp(DATABASE_KEY_NAME) {
editor.set_text(email_address, window, cx)
}
editor
});
let feedback_editor = cx.new(|cx| {
let mut editor = Editor::for_buffer(buffer, Some(project.clone()), window, cx);
editor.set_placeholder_text(
"You can use markdown to organize your feedback with code and links.",
cx,
);
editor.set_show_gutter(false, cx);
editor.set_show_indent_guides(false, cx);
editor.set_show_edit_predictions(Some(false), window, cx);
editor.set_vertical_scroll_margin(5, cx);
editor.set_use_modal_editing(false);
editor.set_soft_wrap();
editor
});
cx.subscribe(&feedback_editor, |this, editor, event: &EditorEvent, cx| {
if matches!(event, EditorEvent::Edited { .. }) {
this.character_count = editor
.read(cx)
.buffer()
.read(cx)
.as_singleton()
.expect("Feedback editor is never a multi-buffer")
.read(cx)
.len() as i32;
cx.notify();
}
})
.detach();
pub fn new(cx: &mut Context<Self>) -> Self {
Self {
system_specs: system_specs.clone(),
feedback_editor,
email_address_editor,
submission_state: None,
dismiss_modal: false,
character_count: 0,
focus_handle: cx.focus_handle(),
}
}
pub fn submit(
&mut self,
window: &mut Window,
cx: &mut Context<Self>,
) -> Task<anyhow::Result<()>> {
let feedback_text = self.feedback_editor.read(cx).text(cx).trim().to_string();
let email = self.email_address_editor.read(cx).text_option(cx);
let answer = window.prompt(
PromptLevel::Info,
"Ready to submit your feedback?",
None,
&["Yes, Submit!", "No"],
cx,
);
let client = Client::global(cx).clone();
let specs = self.system_specs.clone();
cx.spawn_in(window, async move |this, cx| {
let answer = answer.await.ok();
if answer == Some(0) {
this.update(cx, |this, cx| {
this.submission_state = Some(SubmissionState::CannotSubmit {
reason: CannotSubmitReason::AwaitingSubmission,
});
cx.notify();
})
.log_err();
let res =
FeedbackModal::submit_feedback(&feedback_text, email, client, specs).await;
match res {
Ok(_) => {
this.update(cx, |this, cx| {
this.dismiss_modal = true;
cx.notify();
cx.emit(DismissEvent)
})
.ok();
}
Err(error) => {
log::error!("{}", error);
this.update_in(cx, |this, window, cx| {
let prompt = window.prompt(
PromptLevel::Critical,
FEEDBACK_SUBMISSION_ERROR_TEXT,
None,
&["OK"],
cx,
);
cx.spawn_in(window, async move |_, _cx| {
prompt.await.ok();
})
.detach();
this.submission_state = Some(SubmissionState::CanSubmit);
cx.notify();
})
.log_err();
}
}
}
})
.detach();
Task::ready(Ok(()))
}
async fn submit_feedback(
feedback_text: &str,
email: Option<String>,
zed_client: Arc<Client>,
system_specs: SystemSpecs,
) -> anyhow::Result<()> {
if DEV_MODE {
smol::Timer::after(SEND_TIME_IN_DEV_MODE).await;
if SEND_SUCCESS_IN_DEV_MODE {
return Ok(());
} else {
return Err(anyhow!("Error submitting feedback"));
}
}
let telemetry = zed_client.telemetry();
let installation_id = telemetry.installation_id();
let metrics_id = telemetry.metrics_id();
let is_staff = telemetry.is_staff();
let http_client = zed_client.http_client();
let feedback_endpoint = http_client.build_url("/api/feedback");
let request = FeedbackRequestBody {
feedback_text,
email,
installation_id,
metrics_id,
system_specs,
is_staff: is_staff.unwrap_or(false),
};
let json_bytes = serde_json::to_vec(&request)?;
let request = http_client::http::Request::post(feedback_endpoint)
.header("content-type", "application/json")
.body(json_bytes.into())?;
let mut response = http_client.send(request).await?;
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
let response_status = response.status();
if !response_status.is_success() {
bail!("Feedback API failed with error: {}", response_status)
}
Ok(())
}
fn update_submission_state(&mut self, cx: &mut Context<Self>) {
if self.awaiting_submission() {
return;
}
let mut invalid_state_flags = InvalidStateFlags::empty();
let valid_email_address = match self.email_address_editor.read(cx).text_option(cx) {
Some(email_address) => EMAIL_REGEX.is_match(&email_address),
None => true,
};
if !valid_email_address {
invalid_state_flags |= InvalidStateFlags::EmailAddress;
}
if !FEEDBACK_CHAR_LIMIT.contains(&self.character_count) {
invalid_state_flags |= InvalidStateFlags::CharacterCount;
}
if invalid_state_flags.is_empty() {
self.submission_state = Some(SubmissionState::CanSubmit);
} else {
self.submission_state = Some(SubmissionState::CannotSubmit {
reason: CannotSubmitReason::InvalidState {
flags: invalid_state_flags,
},
});
}
}
fn update_email_in_store(&self, window: &mut Window, cx: &mut Context<Self>) {
let email = self.email_address_editor.read(cx).text_option(cx);
cx.spawn_in(window, async move |_, _| match email {
Some(email) => {
KEY_VALUE_STORE
.write_kvp(DATABASE_KEY_NAME.to_string(), email)
.await
.ok();
}
None => {
KEY_VALUE_STORE
.delete_kvp(DATABASE_KEY_NAME.to_string())
.await
.ok();
}
})
.detach();
}
fn valid_email_address(&self) -> bool {
!self.in_invalid_state(InvalidStateFlags::EmailAddress)
}
fn valid_character_count(&self) -> bool {
!self.in_invalid_state(InvalidStateFlags::CharacterCount)
}
fn in_invalid_state(&self, flag: InvalidStateFlags) -> bool {
match self.submission_state {
Some(SubmissionState::CannotSubmit {
reason: CannotSubmitReason::InvalidState { ref flags },
}) => flags.contains(flag),
_ => false,
}
}
fn awaiting_submission(&self) -> bool {
matches!(
self.submission_state,
Some(SubmissionState::CannotSubmit {
reason: CannotSubmitReason::AwaitingSubmission
})
)
}
fn can_submit(&self) -> bool {
matches!(self.submission_state, Some(SubmissionState::CanSubmit))
}
fn cancel(&mut self, _: &menu::Cancel, _: &mut Window, cx: &mut Context<Self>) {
cx.emit(DismissEvent)
}
@@ -423,118 +39,75 @@ impl FeedbackModal {
impl Render for FeedbackModal {
fn render(&mut self, _: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
self.update_submission_state(cx);
let submit_button_text = if self.awaiting_submission() {
"Submitting..."
} else {
"Submit"
};
let open_zed_repo =
cx.listener(|_, _, window, cx| window.dispatch_action(Box::new(OpenZedRepo), cx));
v_flex()
.elevation_3(cx)
.key_context("GiveFeedback")
.on_action(cx.listener(Self::cancel))
.min_w(rems(40.))
.max_w(rems(96.))
.h(rems(32.))
.elevation_3(cx)
.w_96()
.h_auto()
.p_4()
.gap_2()
.child(Headline::new("Give Feedback"))
.child(
Label::new(if self.character_count < *FEEDBACK_CHAR_LIMIT.start() {
format!(
"Feedback must be at least {} characters.",
FEEDBACK_CHAR_LIMIT.start()
)
} else {
format!(
"Characters: {}",
*FEEDBACK_CHAR_LIMIT.end() - self.character_count
)
})
.color(if self.valid_character_count() {
Color::Success
} else {
Color::Error
}),
)
.child(
div()
.flex_1()
.bg(cx.theme().colors().editor_background)
.p_2()
.border_1()
.rounded_sm()
.border_color(cx.theme().colors().border)
.child(self.feedback_editor.clone()),
)
.child(
v_flex()
.gap_1()
.child(
h_flex()
.bg(cx.theme().colors().editor_background)
.p_2()
.border_1()
.rounded_sm()
.border_color(if self.valid_email_address() {
cx.theme().colors().border
} else {
cx.theme().status().error_border
})
.child(self.email_address_editor.clone()),
)
.child(
Label::new("Provide an email address if you want us to be able to reply.")
.size(LabelSize::Small)
.color(Color::Muted),
),
)
.child(
h_flex()
.w_full()
.justify_between()
.gap_1()
.child(Headline::new("Give Feedback"))
.child(
Button::new("zed_repository", "Zed Repository")
.style(ButtonStyle::Transparent)
.icon(IconName::ExternalLink)
.icon_position(IconPosition::End)
.icon_size(IconSize::Small)
.on_click(open_zed_repo),
)
.child(
h_flex()
.gap_1()
.child(
Button::new("cancel_feedback", "Cancel")
.style(ButtonStyle::Subtle)
.color(Color::Muted)
.on_click(cx.listener(move |_, _, window, cx| {
cx.spawn_in(window, async move |this, cx| {
this.update(cx, |_, cx| cx.emit(DismissEvent)).ok();
})
.detach();
})),
)
.child(
Button::new("submit_feedback", submit_button_text)
.color(Color::Accent)
.style(ButtonStyle::Filled)
.on_click(cx.listener(|this, _, window, cx| {
this.submit(window, cx).detach();
}))
.tooltip(move |_, cx| {
Tooltip::simple("Submit feedback to the Zed team.", cx)
})
.when(!self.can_submit(), |this| this.disabled(true)),
),
IconButton::new("close-btn", IconName::Close)
.icon_color(Color::Muted)
.on_click(cx.listener(move |_, _, window, cx| {
cx.spawn_in(window, async move |this, cx| {
this.update(cx, |_, cx| cx.emit(DismissEvent)).ok();
})
.detach();
})),
),
)
.child(Label::new("Thanks for using Zed! To share your experience with us, pick whichever channel fits best:"))
.child(
Button::new("file-a-bug-report", "File a Bug Report")
.full_width()
.icon(IconName::Debug)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(cx.listener(|_, _, window, cx| {
window.dispatch_action(Box::new(FileBugReport), cx);
})),
)
.child(
Button::new("request-a-feature", "Request a Feature")
.full_width()
.icon(IconName::Sparkle)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(cx.listener(|_, _, window, cx| {
window.dispatch_action(Box::new(RequestFeature), cx);
})),
)
.child(
Button::new("send-us_an-email", "Send an Email")
.full_width()
.icon(IconName::Envelope)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(cx.listener(|_, _, window, cx| {
window.dispatch_action(Box::new(EmailZed), cx);
})),
)
.child(
Button::new("zed_repository", "GitHub Repository")
.full_width()
.icon(IconName::Github)
.icon_size(IconSize::XSmall)
.icon_color(Color::Muted)
.icon_position(IconPosition::Start)
.on_click(open_zed_repo),
)
}
}
// TODO: Testing of various button states, dismissal prompts, etc. :)

View File

@@ -5,8 +5,8 @@ use futures::future::{self, BoxFuture};
use git::{
blame::Blame,
repository::{
AskPassSession, Branch, CommitDetails, GitRepository, PushOptions, Remote, RepoPath,
ResetMode,
AskPassSession, Branch, CommitDetails, GitRepository, GitRepositoryCheckpoint, PushOptions,
Remote, RepoPath, ResetMode,
},
status::{FileStatus, GitStatus, StatusCode, TrackedStatus, UnmergedStatus},
};
@@ -409,11 +409,15 @@ impl GitRepository for FakeGitRepository {
unimplemented!()
}
fn checkpoint(&self, _cx: AsyncApp) -> BoxFuture<Result<git::Oid>> {
fn checkpoint(&self, _cx: AsyncApp) -> BoxFuture<Result<GitRepositoryCheckpoint>> {
unimplemented!()
}
fn restore_checkpoint(&self, _oid: git::Oid, _cx: AsyncApp) -> BoxFuture<Result<()>> {
fn restore_checkpoint(
&self,
_checkpoint: GitRepositoryCheckpoint,
_cx: AsyncApp,
) -> BoxFuture<Result<()>> {
unimplemented!()
}
}

View File

@@ -290,10 +290,14 @@ pub trait GitRepository: Send + Sync {
fn diff(&self, diff: DiffType, cx: AsyncApp) -> BoxFuture<Result<String>>;
/// Creates a checkpoint for the repository.
fn checkpoint(&self, cx: AsyncApp) -> BoxFuture<Result<Oid>>;
fn checkpoint(&self, cx: AsyncApp) -> BoxFuture<Result<GitRepositoryCheckpoint>>;
/// Resets to a previously-created checkpoint.
fn restore_checkpoint(&self, oid: Oid, cx: AsyncApp) -> BoxFuture<Result<()>>;
fn restore_checkpoint(
&self,
checkpoint: GitRepositoryCheckpoint,
cx: AsyncApp,
) -> BoxFuture<Result<()>>;
}
pub enum DiffType {
@@ -337,6 +341,12 @@ impl RealGitRepository {
}
}
#[derive(Copy, Clone)]
pub struct GitRepositoryCheckpoint {
head_sha: Option<Oid>,
sha: Oid,
}
// https://git-scm.com/book/en/v2/Git-Internals-Git-Objects
const GIT_MODE_SYMLINK: u32 = 0o120000;
@@ -1033,7 +1043,7 @@ impl GitRepository for RealGitRepository {
.boxed()
}
fn checkpoint(&self, cx: AsyncApp) -> BoxFuture<Result<Oid>> {
fn checkpoint(&self, cx: AsyncApp) -> BoxFuture<Result<GitRepositoryCheckpoint>> {
let working_directory = self.working_directory();
let git_binary_path = self.git_binary_path.clone();
let executor = cx.background_executor().clone();
@@ -1056,10 +1066,7 @@ impl GitRepository for RealGitRepository {
let output = new_smol_command(&git_binary_path)
.current_dir(&working_directory)
.env("GIT_INDEX_FILE", &index_file_path)
.env("GIT_AUTHOR_NAME", "Zed")
.env("GIT_AUTHOR_EMAIL", "hi@zed.dev")
.env("GIT_COMMITTER_NAME", "Zed")
.env("GIT_COMMITTER_EMAIL", "hi@zed.dev")
.envs(checkpoint_author_envs())
.args(args)
.output()
.await?;
@@ -1071,35 +1078,56 @@ impl GitRepository for RealGitRepository {
}
};
let head_sha = run_git_command(&["rev-parse", "HEAD"]).await.ok();
run_git_command(&["add", "--all"]).await?;
let tree = run_git_command(&["write-tree"]).await?;
let commit_sha = run_git_command(&["commit-tree", &tree, "-m", "Checkpoint"]).await?;
let checkpoint_sha = if let Some(head_sha) = head_sha.as_deref() {
run_git_command(&["commit-tree", &tree, "-p", head_sha, "-m", "Checkpoint"]).await?
} else {
run_git_command(&["commit-tree", &tree, "-m", "Checkpoint"]).await?
};
let ref_name = Uuid::new_v4().to_string();
run_git_command(&["update-ref", &format!("refs/heads/{ref_name}"), &commit_sha])
.await?;
run_git_command(&[
"update-ref",
&format!("refs/zed/{ref_name}"),
&checkpoint_sha,
])
.await?;
smol::fs::remove_file(index_file_path).await.ok();
delete_temp_index.abort();
commit_sha.parse()
Ok(GitRepositoryCheckpoint {
head_sha: if let Some(head_sha) = head_sha {
Some(head_sha.parse()?)
} else {
None
},
sha: checkpoint_sha.parse()?,
})
})
.boxed()
}
fn restore_checkpoint(&self, oid: Oid, cx: AsyncApp) -> BoxFuture<Result<()>> {
fn restore_checkpoint(
&self,
checkpoint: GitRepositoryCheckpoint,
cx: AsyncApp,
) -> BoxFuture<Result<()>> {
let working_directory = self.working_directory();
let git_binary_path = self.git_binary_path.clone();
cx.background_spawn(async move {
let working_directory = working_directory?;
let index_file_path = working_directory.join(".git/index.tmp");
let run_git_command = async |args: &[&str]| {
let output = new_smol_command(&git_binary_path)
.current_dir(&working_directory)
.env("GIT_INDEX_FILE", &index_file_path)
.args(args)
.output()
.await?;
let run_git_command = async |args: &[&str], use_temp_index: bool| {
let mut command = new_smol_command(&git_binary_path);
command.current_dir(&working_directory);
command.args(args);
if use_temp_index {
command.env("GIT_INDEX_FILE", &index_file_path);
}
let output = command.output().await?;
if output.status.success() {
anyhow::Ok(String::from_utf8(output.stdout)?.trim_end().to_string())
} else {
@@ -1108,9 +1136,26 @@ impl GitRepository for RealGitRepository {
}
};
run_git_command(&["restore", "--source", &oid.to_string(), "--worktree", "."]).await?;
run_git_command(&["read-tree", &oid.to_string()]).await?;
run_git_command(&["clean", "-d", "--force"]).await?;
run_git_command(
&[
"restore",
"--source",
&checkpoint.sha.to_string(),
"--worktree",
".",
],
false,
)
.await?;
run_git_command(&["read-tree", &checkpoint.sha.to_string()], true).await?;
run_git_command(&["clean", "-d", "--force"], true).await?;
if let Some(head_sha) = checkpoint.head_sha {
run_git_command(&["reset", "--mixed", &head_sha.to_string()], false).await?;
} else {
run_git_command(&["update-ref", "-d", "HEAD"], false).await?;
}
Ok(())
})
.boxed()
@@ -1350,14 +1395,111 @@ fn check_path_to_repo_path_errors(relative_file_path: &Path) -> Result<()> {
}
}
fn checkpoint_author_envs() -> HashMap<String, String> {
HashMap::from_iter([
("GIT_AUTHOR_NAME".to_string(), "Zed".to_string()),
("GIT_AUTHOR_EMAIL".to_string(), "hi@zed.dev".to_string()),
("GIT_COMMITTER_NAME".to_string(), "Zed".to_string()),
("GIT_COMMITTER_EMAIL".to_string(), "hi@zed.dev".to_string()),
])
}
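The checkpoint commit above gains the current `HEAD` as a parent when one exists, and restoring either resets back to that parent or, for a previously-empty repository, deletes `HEAD`'s ref entirely. A sketch of just that branching, returning the git argument vectors the diff runs (a simplification: the real code also routes some commands through a temporary `GIT_INDEX_FILE`):

```rust
// Mirror of the optional-parent logic in `checkpoint`.
fn commit_tree_args(tree: &str, head_sha: Option<&str>) -> Vec<String> {
    match head_sha {
        Some(head) => vec!["commit-tree", tree, "-p", head, "-m", "Checkpoint"],
        None => vec!["commit-tree", tree, "-m", "Checkpoint"],
    }
    .into_iter()
    .map(String::from)
    .collect()
}

// Mirror of the head_sha branch in `restore_checkpoint`.
fn restore_head_args(head_sha: Option<&str>) -> Vec<String> {
    match head_sha {
        // A parent existed: move HEAD back without touching the worktree.
        Some(head) => vec!["reset".into(), "--mixed".into(), head.into()],
        // The repo had no commits: delete HEAD's ref so it is empty again.
        None => vec!["update-ref".into(), "-d".into(), "HEAD".into()],
    }
}

fn main() {
    assert_eq!(
        commit_tree_args("abc123", Some("def456")),
        ["commit-tree", "abc123", "-p", "def456", "-m", "Checkpoint"]
    );
    assert_eq!(commit_tree_args("abc123", None).len(), 4);
    assert_eq!(restore_head_args(None), ["update-ref", "-d", "HEAD"]);
}
```

Keeping the parent link is what lets `reset --mixed` later walk `HEAD` back to the pre-checkpoint commit while leaving the restored worktree contents in place.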
#[cfg(test)]
mod tests {
use super::*;
use crate::status::FileStatus;
use gpui::TestAppContext;
use super::*;
#[gpui::test]
async fn test_checkpoint_basic(cx: &mut TestAppContext) {
cx.executor().allow_parking();
let repo_dir = tempfile::tempdir().unwrap();
git2::Repository::init(repo_dir.path()).unwrap();
let file_path = repo_dir.path().join("file");
smol::fs::write(&file_path, "initial").await.unwrap();
let repo = RealGitRepository::new(&repo_dir.path().join(".git"), None).unwrap();
repo.stage_paths(
vec![RepoPath::from_str("file")],
HashMap::default(),
cx.to_async(),
)
.await
.unwrap();
repo.commit(
"Initial commit".into(),
None,
checkpoint_author_envs(),
cx.to_async(),
)
.await
.unwrap();
smol::fs::write(&file_path, "modified before checkpoint")
.await
.unwrap();
smol::fs::write(repo_dir.path().join("new_file_before_checkpoint"), "1")
.await
.unwrap();
let sha_before_checkpoint = repo.head_sha().unwrap();
let checkpoint = repo.checkpoint(cx.to_async()).await.unwrap();
// Ensure creating a checkpoint doesn't add any branches visible to the user.
assert_eq!(repo.branches().await.unwrap().len(), 1);
smol::fs::write(&file_path, "modified after checkpoint")
.await
.unwrap();
repo.stage_paths(
vec![RepoPath::from_str("file")],
HashMap::default(),
cx.to_async(),
)
.await
.unwrap();
repo.commit(
"Commit after checkpoint".into(),
None,
checkpoint_author_envs(),
cx.to_async(),
)
.await
.unwrap();
smol::fs::remove_file(repo_dir.path().join("new_file_before_checkpoint"))
.await
.unwrap();
smol::fs::write(repo_dir.path().join("new_file_after_checkpoint"), "2")
.await
.unwrap();
repo.restore_checkpoint(checkpoint, cx.to_async())
.await
.unwrap();
assert_eq!(repo.head_sha().unwrap(), sha_before_checkpoint);
assert_eq!(
smol::fs::read_to_string(&file_path).await.unwrap(),
"modified before checkpoint"
);
assert_eq!(
smol::fs::read_to_string(repo_dir.path().join("new_file_before_checkpoint"))
.await
.unwrap(),
"1"
);
assert_eq!(
smol::fs::read_to_string(repo_dir.path().join("new_file_after_checkpoint"))
.await
.ok(),
None
);
}
#[gpui::test]
async fn test_checkpoint(cx: &mut TestAppContext) {
async fn test_checkpoint_empty_repo(cx: &mut TestAppContext) {
cx.executor().allow_parking();
let repo_dir = tempfile::tempdir().unwrap();
@@ -1369,6 +1511,9 @@ mod tests {
.unwrap();
let checkpoint_sha = repo.checkpoint(cx.to_async()).await.unwrap();
// Ensure creating a checkpoint doesn't add any branches visible to the user.
assert_eq!(repo.branches().await.unwrap().len(), 1);
smol::fs::write(repo_dir.path().join("foo"), "bar")
.await
.unwrap();
@@ -1392,6 +1537,88 @@ mod tests {
);
}
#[gpui::test]
async fn test_undoing_commit_via_checkpoint(cx: &mut TestAppContext) {
cx.executor().allow_parking();
let repo_dir = tempfile::tempdir().unwrap();
git2::Repository::init(repo_dir.path()).unwrap();
let file_path = repo_dir.path().join("file");
smol::fs::write(&file_path, "initial").await.unwrap();
let repo = RealGitRepository::new(&repo_dir.path().join(".git"), None).unwrap();
repo.stage_paths(
vec![RepoPath::from_str("file")],
HashMap::default(),
cx.to_async(),
)
.await
.unwrap();
repo.commit(
"Initial commit".into(),
None,
checkpoint_author_envs(),
cx.to_async(),
)
.await
.unwrap();
let initial_commit_sha = repo.head_sha().unwrap();
smol::fs::write(repo_dir.path().join("new_file1"), "content1")
.await
.unwrap();
smol::fs::write(repo_dir.path().join("new_file2"), "content2")
.await
.unwrap();
let checkpoint = repo.checkpoint(cx.to_async()).await.unwrap();
repo.stage_paths(
vec![
RepoPath::from_str("new_file1"),
RepoPath::from_str("new_file2"),
],
HashMap::default(),
cx.to_async(),
)
.await
.unwrap();
repo.commit(
"Commit new files".into(),
None,
checkpoint_author_envs(),
cx.to_async(),
)
.await
.unwrap();
repo.restore_checkpoint(checkpoint, cx.to_async())
.await
.unwrap();
assert_eq!(repo.head_sha().unwrap(), initial_commit_sha);
assert_eq!(
smol::fs::read_to_string(repo_dir.path().join("new_file1"))
.await
.unwrap(),
"content1"
);
assert_eq!(
smol::fs::read_to_string(repo_dir.path().join("new_file2"))
.await
.unwrap(),
"content2"
);
assert_eq!(
repo.status(&[]).unwrap().entries.as_ref(),
&[
(RepoPath::from_str("new_file1"), FileStatus::Untracked),
(RepoPath::from_str("new_file2"), FileStatus::Untracked)
]
);
}
#[test]
fn test_branches_parsing() {
// suppress "help: octal escapes are not supported, `\0` is always null"

View File

@@ -8,7 +8,7 @@ use gpui::{
SharedString, Styled, Subscription, Task, Window,
};
use picker::{Picker, PickerDelegate, PickerEditorPosition};
use project::git::Repository;
use project::git_store::Repository;
use std::sync::Arc;
use time::OffsetDateTime;
use time_format::format_local_timestamp;

View File

@@ -46,7 +46,7 @@ use panel::{
panel_icon_button, PanelHeader,
};
use project::{
git::{GitEvent, Repository},
git_store::{GitEvent, Repository},
Fs, Project, ProjectPath,
};
use serde::{Deserialize, Serialize};

View File

@@ -23,7 +23,7 @@ use gpui::{
use language::{Anchor, Buffer, Capability, OffsetRangeExt};
use multi_buffer::{MultiBuffer, PathKey};
use project::{
git::{GitEvent, GitStore},
git_store::{GitEvent, GitStore},
Project, ProjectPath,
};
use std::any::{Any, TypeId};

View File

@@ -4,7 +4,7 @@ use gpui::{
use itertools::Itertools;
use picker::{Picker, PickerDelegate};
use project::{
git::{GitStore, Repository},
git_store::{GitStore, Repository},
Project,
};
use std::sync::Arc;

View File

@@ -12,7 +12,7 @@ license = "Apache-2.0"
workspace = true
[features]
default = ["http_client", "font-kit", "wayland", "x11"]
default = ["macos-blade", "http_client", "font-kit", "wayland", "x11"]
test-support = [
"leak-detection",
"collections/test-support",
@@ -123,10 +123,11 @@ lyon = "1.0"
block = "0.1"
cocoa.workspace = true
core-foundation.workspace = true
core-foundation-sys = "0.8"
core-graphics = "0.23"
core-text = "20.1"
font-kit = { git = "https://github.com/zed-industries/font-kit", rev = "40391b7", optional = true }
core-foundation-sys.workspace = true
core-graphics = "0.24"
core-video.workspace = true
core-text = "21"
font-kit = { git = "https://github.com/zed-industries/font-kit", rev = "5474cfad4b719a72ec8ed2cb7327b2b01fd10568", optional = true }
foreign-types = "0.5"
log.workspace = true
media.workspace = true
@@ -154,9 +155,10 @@ blade-macros = { workspace = true, optional = true }
blade-util = { workspace = true, optional = true }
bytemuck = { version = "1", optional = true }
cosmic-text = { git = "https://github.com/pop-os/cosmic-text", rev = "542b20c", optional = true }
font-kit = { git = "https://github.com/zed-industries/font-kit", rev = "40391b7", features = [
font-kit = { git = "https://github.com/zed-industries/font-kit", rev = "5474cfad4b719a72ec8ed2cb7327b2b01fd10568", features = [
"source-fontconfig-dlopen",
], optional = true }
calloop = { version = "0.13.0" }
filedescriptor = { version = "0.8.2", optional = true }
open = { version = "5.2.0", optional = true }

View File

@@ -1590,10 +1590,10 @@ impl App {
.insert(entity_id, window_invalidators);
}
/// Get the name for this App.
/// Returns the name for this [`App`].
#[cfg(any(test, feature = "test-support", debug_assertions))]
pub fn get_name(&self) -> &'static str {
self.name.as_ref().unwrap()
pub fn get_name(&self) -> Option<&'static str> {
self.name
}
/// Returns `true` if the platform file picker supports selecting a mix of files and directories.

View File

@@ -3,7 +3,7 @@ use crate::{
Style, StyleRefinement, Styled, Window,
};
#[cfg(target_os = "macos")]
use media::core_video::CVImageBuffer;
use core_video::pixel_buffer::CVPixelBuffer;
use refineable::Refineable;
/// A source of a surface's content.
@@ -11,12 +11,12 @@ use refineable::Refineable;
pub enum SurfaceSource {
/// A macOS image buffer from CoreVideo
#[cfg(target_os = "macos")]
Surface(CVImageBuffer),
Surface(CVPixelBuffer),
}
#[cfg(target_os = "macos")]
impl From<CVImageBuffer> for SurfaceSource {
fn from(value: CVImageBuffer) -> Self {
impl From<CVPixelBuffer> for SurfaceSource {
fn from(value: CVPixelBuffer) -> Self {
SurfaceSource::Surface(value)
}
}
@@ -87,7 +87,7 @@ impl Element for Surface {
match &self.source {
#[cfg(target_os = "macos")]
SurfaceSource::Surface(surface) => {
let size = crate::size(surface.width().into(), surface.height().into());
let size = crate::size(surface.get_width().into(), surface.get_height().into());
let new_bounds = self.object_fit.get_bounds(bounds, size);
// TODO: Add support for corner_radii
window.paint_surface(new_bounds, surface.clone());

View File

@@ -725,8 +725,8 @@ impl BladeRenderer {
use std::ptr;
assert_eq!(
surface.image_buffer.pixel_format_type(),
media::core_video::kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
surface.image_buffer.get_pixel_format(),
core_video::pixel_buffer::kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
);
let y_texture = self
@@ -735,8 +735,8 @@ impl BladeRenderer {
surface.image_buffer.as_concrete_TypeRef(),
ptr::null(),
metal::MTLPixelFormat::R8Unorm,
surface.image_buffer.plane_width(0),
surface.image_buffer.plane_height(0),
surface.image_buffer.get_width_of_plane(0),
surface.image_buffer.get_height_of_plane(0),
0,
)
.unwrap();
@@ -746,8 +746,8 @@ impl BladeRenderer {
surface.image_buffer.as_concrete_TypeRef(),
ptr::null(),
metal::MTLPixelFormat::RG8Unorm,
surface.image_buffer.plane_width(1),
surface.image_buffer.plane_height(1),
surface.image_buffer.get_width_of_plane(1),
surface.image_buffer.get_height_of_plane(1),
1,
)
.unwrap();

View File
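The renderer hunks above assert the biplanar NV12 format (`420YpCbCr8BiPlanarFullRange`) and then create one Metal texture per plane: `R8Unorm` for luma, `RG8Unorm` for interleaved chroma. A sketch of the plane geometry this relies on, with plain integers standing in for the core-video plane accessors:

```rust
// NV12 (4:2:0 biplanar): plane 0 holds full-resolution Y samples,
// plane 1 holds Cb/Cr pairs subsampled 2x in both dimensions.
// This is why plane 0 maps to MTLPixelFormat::R8Unorm (one channel)
// and plane 1 to RG8Unorm (two interleaved channels).
fn plane_dimensions(width: usize, height: usize, plane: usize) -> (usize, usize) {
    match plane {
        0 => (width, height),         // luma plane: one byte per pixel
        1 => (width / 2, height / 2), // chroma plane: one Cb+Cr pair per 2x2 block
        _ => panic!("NV12 has exactly two planes"),
    }
}

fn main() {
    let (w, h) = (1920, 1080);
    assert_eq!(plane_dimensions(w, h, 0), (1920, 1080));
    assert_eq!(plane_dimensions(w, h, 1), (960, 540));
    // Total bytes: Y plane + 2-channel chroma plane = 1.5 bytes per pixel.
    let bytes = w * h + (w / 2) * (h / 2) * 2;
    assert_eq!(bytes, w * h * 3 / 2);
}
```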

@@ -11,7 +11,7 @@ mod metal_atlas;
#[cfg(not(feature = "macos-blade"))]
pub mod metal_renderer;
use media::core_video::CVImageBuffer;
use core_video::image_buffer::CVImageBuffer;
#[cfg(not(feature = "macos-blade"))]
use metal_renderer as renderer;

View File

@@ -13,8 +13,11 @@ use cocoa::{
};
use collections::HashMap;
use core_foundation::base::TCFType;
use foreign_types::ForeignType;
use media::core_video::CVMetalTextureCache;
use core_video::{
metal_texture::CVMetalTextureGetTexture, metal_texture_cache::CVMetalTextureCache,
pixel_buffer::kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
};
use foreign_types::{ForeignType, ForeignTypeRef};
use metal::{CAMetalLayer, CommandQueue, MTLPixelFormat, MTLResourceOptions, NSRange};
use objc::{self, msg_send, sel, sel_impl};
use parking_lot::Mutex;
@@ -107,7 +110,7 @@ pub(crate) struct MetalRenderer {
#[allow(clippy::arc_with_non_send_sync)]
instance_buffer_pool: Arc<Mutex<InstanceBufferPool>>,
sprite_atlas: Arc<MetalAtlas>,
core_video_texture_cache: CVMetalTextureCache,
core_video_texture_cache: core_video::metal_texture_cache::CVMetalTextureCache,
}
impl MetalRenderer {
@@ -235,7 +238,7 @@ impl MetalRenderer {
let command_queue = device.new_command_queue();
let sprite_atlas = Arc::new(MetalAtlas::new(device.clone(), PATH_SAMPLE_COUNT));
let core_video_texture_cache =
unsafe { CVMetalTextureCache::new(device.as_ptr()).unwrap() };
CVMetalTextureCache::new(None, device.clone(), None).unwrap();
Self {
device,
@@ -1054,39 +1057,37 @@ impl MetalRenderer {
for surface in surfaces {
let texture_size = size(
DevicePixels::from(surface.image_buffer.width() as i32),
DevicePixels::from(surface.image_buffer.height() as i32),
DevicePixels::from(surface.image_buffer.get_width() as i32),
DevicePixels::from(surface.image_buffer.get_height() as i32),
);
assert_eq!(
surface.image_buffer.pixel_format_type(),
media::core_video::kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
surface.image_buffer.get_pixel_format(),
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
);
let y_texture = unsafe {
self.core_video_texture_cache
.create_texture_from_image(
surface.image_buffer.as_concrete_TypeRef(),
ptr::null(),
MTLPixelFormat::R8Unorm,
surface.image_buffer.plane_width(0),
surface.image_buffer.plane_height(0),
0,
)
.unwrap()
};
let cb_cr_texture = unsafe {
self.core_video_texture_cache
.create_texture_from_image(
surface.image_buffer.as_concrete_TypeRef(),
ptr::null(),
MTLPixelFormat::RG8Unorm,
surface.image_buffer.plane_width(1),
surface.image_buffer.plane_height(1),
1,
)
.unwrap()
};
let y_texture = self
.core_video_texture_cache
.create_texture_from_image(
surface.image_buffer.as_concrete_TypeRef(),
None,
MTLPixelFormat::R8Unorm,
surface.image_buffer.get_width_of_plane(0),
surface.image_buffer.get_height_of_plane(0),
0,
)
.unwrap();
let cb_cr_texture = self
.core_video_texture_cache
.create_texture_from_image(
surface.image_buffer.as_concrete_TypeRef(),
None,
MTLPixelFormat::RG8Unorm,
surface.image_buffer.get_width_of_plane(1),
surface.image_buffer.get_height_of_plane(1),
1,
)
.unwrap();
align_offset(instance_offset);
let next_offset = *instance_offset + mem::size_of::<Surface>();
@@ -1104,14 +1105,15 @@ impl MetalRenderer {
mem::size_of_val(&texture_size) as u64,
&texture_size as *const Size<DevicePixels> as *const _,
);
command_encoder.set_fragment_texture(
SurfaceInputIndex::YTexture as u64,
Some(y_texture.as_texture_ref()),
);
command_encoder.set_fragment_texture(
SurfaceInputIndex::CbCrTexture as u64,
Some(cb_cr_texture.as_texture_ref()),
);
// let y_texture = y_texture.get_texture().unwrap().
command_encoder.set_fragment_texture(SurfaceInputIndex::YTexture as u64, unsafe {
let texture = CVMetalTextureGetTexture(y_texture.as_concrete_TypeRef());
Some(metal::TextureRef::from_ptr(texture as *mut _))
});
command_encoder.set_fragment_texture(SurfaceInputIndex::CbCrTexture as u64, unsafe {
let texture = CVMetalTextureGetTexture(cb_cr_texture.as_concrete_TypeRef());
Some(metal::TextureRef::from_ptr(texture as *mut _))
});
unsafe {
let buffer_contents = (instance_buffer.metal_buffer.contents() as *mut u8)

View File

@@ -9,6 +9,10 @@ use cocoa::{
foundation::NSArray,
};
use core_foundation::base::TCFType;
use core_graphics::display::{
CGDirectDisplayID, CGDisplayCopyDisplayMode, CGDisplayModeGetPixelHeight,
CGDisplayModeGetPixelWidth, CGDisplayModeRelease,
};
use ctor::ctor;
use futures::channel::oneshot;
use media::core_media::{CMSampleBuffer, CMSampleBufferRef};
@@ -45,8 +49,12 @@ const SCStreamOutputTypeScreen: NSInteger = 0;
impl ScreenCaptureSource for MacScreenCaptureSource {
fn resolution(&self) -> Result<Size<Pixels>> {
unsafe {
let width: i64 = msg_send![self.sc_display, width];
let height: i64 = msg_send![self.sc_display, height];
let display_id: CGDirectDisplayID = msg_send![self.sc_display, displayID];
let display_mode_ref = CGDisplayCopyDisplayMode(display_id);
let width = CGDisplayModeGetPixelWidth(display_mode_ref);
let height = CGDisplayModeGetPixelHeight(display_mode_ref);
CGDisplayModeRelease(display_mode_ref);
Ok(size(px(width as f32), px(height as f32)))
}
}
@@ -65,6 +73,10 @@ impl ScreenCaptureSource for MacScreenCaptureSource {
let excluded_windows = NSArray::array(nil);
let filter: id = msg_send![filter, initWithDisplay:self.sc_display excludingWindows:excluded_windows];
let configuration: id = msg_send![configuration, init];
let _: id = msg_send![configuration, setScalesToFit: true];
let _: id = msg_send![configuration, setPixelFormat: 0x42475241];
// let _: id = msg_send![configuration, setShowsCursor: false];
// let _: id = msg_send![configuration, setCaptureResolution: 3];
let delegate: id = msg_send![delegate, init];
let output: id = msg_send![output, init];
@@ -73,6 +85,9 @@ impl ScreenCaptureSource for MacScreenCaptureSource {
Box::into_raw(Box::new(frame_callback)) as *mut c_void,
);
let resolution = self.resolution().unwrap();
let _: id = msg_send![configuration, setWidth: resolution.width.0 as i64];
let _: id = msg_send![configuration, setHeight: resolution.height.0 as i64];
let stream: id = msg_send![stream, initWithFilter:filter configuration:configuration delegate:delegate];
let (mut tx, rx) = oneshot::channel();

View File
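The `setPixelFormat: 0x42475241` literal above is a FourCC code: reading its big-endian bytes as ASCII gives `"BGRA"`, i.e. `kCVPixelFormatType_32BGRA`. A small decoding sketch (the `fourcc` helper is hypothetical, not part of the codebase):

```rust
// Decode a FourCC pixel-format constant into its four ASCII characters.
fn fourcc(code: u32) -> String {
    code.to_be_bytes().iter().map(|&b| b as char).collect()
}

fn main() {
    // The literal passed to `setPixelFormat:` in the capture configuration.
    assert_eq!(fourcc(0x42475241), "BGRA");
    // The biplanar format asserted by the renderers is '420f'.
    assert_eq!(fourcc(0x34323066), "420f");
}
```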

@@ -651,7 +651,7 @@ pub(crate) struct PaintSurface {
pub bounds: Bounds<ScaledPixels>,
pub content_mask: ContentMask<ScaledPixels>,
#[cfg(target_os = "macos")]
pub image_buffer: media::core_video::CVImageBuffer,
pub image_buffer: core_video::pixel_buffer::CVPixelBuffer,
}
impl From<PaintSurface> for Primitive {

View File

@@ -17,11 +17,11 @@ use crate::{
};
use anyhow::{anyhow, Context as _, Result};
use collections::{FxHashMap, FxHashSet};
#[cfg(target_os = "macos")]
use core_video::pixel_buffer::CVPixelBuffer;
use derive_more::{Deref, DerefMut};
use futures::channel::oneshot;
use futures::FutureExt;
#[cfg(target_os = "macos")]
use media::core_video::CVImageBuffer;
use parking_lot::RwLock;
use raw_window_handle::{HandleError, HasWindowHandle};
use refineable::Refineable;
@@ -2654,7 +2654,7 @@ impl Window {
///
/// This method should only be called as part of the paint phase of element drawing.
#[cfg(target_os = "macos")]
pub fn paint_surface(&mut self, bounds: Bounds<Pixels>, image_buffer: CVImageBuffer) {
pub fn paint_surface(&mut self, bounds: Bounds<Pixels>, image_buffer: CVPixelBuffer) {
use crate::PaintSurface;
self.invalidator.debug_assert_paint();

View File

@@ -32,7 +32,7 @@ pub struct Tokio {}
impl Tokio {
/// Spawns the given future on Tokio's thread pool, and returns it via a GPUI task
/// Note that the Tokio task will be cancelled if the GPUI task is dropped
pub fn spawn<C, Fut, R>(cx: &mut C, f: Fut) -> C::Result<Task<Result<R, JoinError>>>
pub fn spawn<C, Fut, R>(cx: &C, f: Fut) -> C::Result<Task<Result<R, JoinError>>>
where
C: AppContext,
Fut: Future<Output = R> + Send + 'static,
@@ -52,7 +52,7 @@ impl Tokio {
})
}
pub fn handle(cx: &mut App) -> tokio::runtime::Handle {
pub fn handle(cx: &App) -> tokio::runtime::Handle {
GlobalTokio::global(cx).runtime.handle().clone()
}
}

View File

@@ -4727,23 +4727,27 @@ impl CharClassifier {
}
pub fn kind_with(&self, c: char, ignore_punctuation: bool) -> CharKind {
if c.is_whitespace() {
return CharKind::Whitespace;
} else if c.is_alphanumeric() || c == '_' {
if c.is_alphanumeric() || c == '_' {
return CharKind::Word;
}
if let Some(scope) = &self.scope {
if let Some(characters) = scope.word_characters() {
let characters = if self.for_completion {
scope.completion_query_characters()
} else {
scope.word_characters()
};
if let Some(characters) = characters {
if characters.contains(&c) {
if c == '-' && !self.for_completion && !ignore_punctuation {
return CharKind::Punctuation;
}
return CharKind::Word;
}
}
}
if c.is_whitespace() {
return CharKind::Whitespace;
}
if ignore_punctuation {
CharKind::Word
} else {

View File
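The reordered `kind_with` above moves the scope-character check ahead of the whitespace check and special-cases `-` so it only counts as a word character during completion queries. A standalone sketch of that precedence, collapsing the two scope character sets (`word_characters` / `completion_query_characters`) into one `HashSet<char>` for brevity:

```rust
use std::collections::HashSet;

#[derive(Debug, PartialEq)]
enum CharKind {
    Word,
    Whitespace,
    Punctuation,
}

// Simplified precedence from the diff: alphanumerics/underscore first, then
// scope-specific characters (with '-' demoted outside completions), then
// whitespace, then the ignore_punctuation fallback.
fn kind_with(
    c: char,
    scope_chars: &HashSet<char>,
    for_completion: bool,
    ignore_punctuation: bool,
) -> CharKind {
    if c.is_alphanumeric() || c == '_' {
        return CharKind::Word;
    }
    if scope_chars.contains(&c) {
        if c == '-' && !for_completion && !ignore_punctuation {
            return CharKind::Punctuation;
        }
        return CharKind::Word;
    }
    if c.is_whitespace() {
        return CharKind::Whitespace;
    }
    if ignore_punctuation {
        CharKind::Word
    } else {
        CharKind::Punctuation
    }
}

fn main() {
    let css_chars: HashSet<char> = ['-'].into_iter().collect();
    // '-' extends words in completion queries (e.g. CSS `font-size`)...
    assert_eq!(kind_with('-', &css_chars, true, false), CharKind::Word);
    // ...but stays punctuation for ordinary word motions.
    assert_eq!(kind_with('-', &css_chars, false, false), CharKind::Punctuation);
    assert_eq!(kind_with(' ', &css_chars, false, false), CharKind::Whitespace);
}
```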

@@ -700,6 +700,9 @@ pub struct LanguageConfig {
/// If configured, this language contains JSX style tags, and should support auto-closing of those tags.
#[serde(default)]
pub jsx_tag_auto_close: Option<JsxTagAutoCloseConfig>,
/// A list of characters that Zed should treat as word characters for completion queries.
#[serde(default)]
pub completion_query_characters: HashSet<char>,
}
#[derive(Clone, Debug, Serialize, Deserialize, Default, JsonSchema)]
@@ -765,6 +768,8 @@ pub struct LanguageConfigOverride {
#[serde(default)]
pub word_characters: Override<HashSet<char>>,
#[serde(default)]
pub completion_query_characters: Override<HashSet<char>>,
#[serde(default)]
pub opt_into_language_servers: Vec<LanguageServerName>,
}
@@ -816,6 +821,7 @@ impl Default for LanguageConfig {
prettier_parser_name: None,
hidden: false,
jsx_tag_auto_close: None,
completion_query_characters: Default::default(),
}
}
}
@@ -1705,6 +1711,16 @@ impl LanguageScope {
)
}
/// Returns a list of language-specific characters that are considered part of
/// a completion query.
pub fn completion_query_characters(&self) -> Option<&HashSet<char>> {
Override::as_option(
self.config_override()
.map(|o| &o.completion_query_characters),
Some(&self.language.config.completion_query_characters),
)
}
/// Returns a list of bracket pairs for a given language with an additional
/// piece of information about whether the particular bracket pair is currently active for a given language.
pub fn brackets(&self) -> impl Iterator<Item = (&BracketPair, bool)> {

View File
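The new `completion_query_characters` accessor above resolves a per-scope override against the language-level default via `Override::as_option`. A hedged sketch of that fallback, with a simplified stand-in for the crate's `Override` type (variant names hypothetical):

```rust
use std::collections::HashSet;

// Simplified stand-in for the crate's Override type: a scope either
// replaces the language-level set or inherits it.
enum Override<T> {
    Set(T),
    Inherit,
}

impl<T> Override<T> {
    // Prefer the scope override when present, else fall back to the base config.
    fn as_option<'a>(this: Option<&'a Self>, base: Option<&'a T>) -> Option<&'a T> {
        match this {
            Some(Override::Set(value)) => Some(value),
            _ => base,
        }
    }
}

fn main() {
    let base: HashSet<char> = ['-'].into_iter().collect();
    let scope_override = Override::Set([':'].into_iter().collect::<HashSet<char>>());

    // A scope with its own set wins...
    let resolved = Override::as_option(Some(&scope_override), Some(&base)).unwrap();
    assert!(resolved.contains(&':'));
    // ...while an inheriting scope falls back to the language default.
    let inherited = Override::as_option(Some(&Override::Inherit), Some(&base)).unwrap();
    assert!(inherited.contains(&'-'));
}
```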

@@ -56,15 +56,20 @@ impl LanguageModelSelector {
.max_height(Some(rems(20.).into()))
});
let subscription = cx.subscribe(&picker, |_, _, _, cx| cx.emit(DismissEvent));
LanguageModelSelector {
picker,
update_matches_task: None,
_authenticate_all_providers_task: Self::authenticate_all_providers(cx),
_subscriptions: vec![cx.subscribe_in(
&LanguageModelRegistry::global(cx),
window,
Self::handle_language_model_registry_event,
)],
_subscriptions: vec![
cx.subscribe_in(
&LanguageModelRegistry::global(cx),
window,
Self::handle_language_model_registry_event,
),
subscription,
],
}
}

View File

@@ -9,6 +9,6 @@ brackets = [
{ start = "\"", end = "\"", close = true, newline = false, not_in = ["string", "comment"] },
{ start = "'", end = "'", close = true, newline = false, not_in = ["string", "comment"] },
]
word_characters = ["-"]
completion_query_characters = ["-"]
block_comment = ["/* ", " */"]
prettier_parser_name = "css"

View File

@@ -32,5 +32,5 @@ block_comment = ["{/* ", " */}"]
opt_into_language_servers = ["emmet-language-server"]
[overrides.string]
word_characters = ["-"]
completion_query_characters = ["-"]
opt_into_language_servers = ["tailwindcss-language-server"]

View File

@@ -12,4 +12,4 @@ tab_size = 2
prettier_parser_name = "json"
[overrides.string]
word_characters = [":"]
completion_query_characters = [":", " "]

Some files were not shown because too many files have changed in this diff.