Compare commits

...

169 Commits

Author SHA1 Message Date
Nate Butler
58a7a277df wip 2024-10-21 10:20:26 -04:00
Nate Butler
5e9f084f12 Start on quick commit UI PoC 2024-10-21 09:33:15 -04:00
Michael Sloan
b355a6f449 Fix markdown preview handling of empty list items (#19449)
Before this change, `parse_block` was consuming events that it doesn't
handle. This was fine in its use in `parse_document`, but in its use in
`parse_list` this broke when there was an empty list item, causing it to
consume list end tags / list item starts / etc.
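
For illustration, here is a minimal, self-contained sketch of the pattern behind the fix; the `Event` type is a simplified stand-in for the real markdown events, not Zed's actual parser code:

```rust
use std::iter::Peekable;

#[derive(Debug, PartialEq)]
enum Event {
    Text(&'static str),
    ListItemStart,
    ListItemEnd,
    ListEnd,
}

// Parse one block. Crucially, an event this function doesn't handle is
// only *peeked*, never consumed, so `parse_list` still sees the
// `ListItemEnd`/`ListEnd` events of an empty list item.
fn parse_block<I: Iterator<Item = Event>>(events: &mut Peekable<I>) -> Option<String> {
    match events.peek()? {
        Event::Text(_) => match events.next()? {
            Event::Text(text) => Some(text.to_string()),
            _ => unreachable!(),
        },
        _ => None, // leave list delimiters for the caller
    }
}

fn main() {
    // An empty list item: the item start is immediately followed by its end.
    let mut events = [Event::ListItemStart, Event::ListItemEnd, Event::ListEnd]
        .into_iter()
        .peekable();
    events.next(); // a `parse_list` caller consumes the item start
    assert_eq!(parse_block(&mut events), None);
    // The item end is still available for the caller to consume.
    assert_eq!(events.next(), Some(Event::ListItemEnd));
}
```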

Release Notes:

- Fixed markdown preview rendering of empty list items.
2024-10-21 15:13:26 +02:00
reslear
6341ad2f7a docs: Correct link to Vue extension (#19508)
Closes -

Release Notes:

- N/A
2024-10-21 08:12:10 -04:00
Kirill Bulatov
d3cb08bf35 Support .editorconfig (#19455)
Closes https://github.com/zed-industries/zed/issues/8534
Supersedes https://github.com/zed-industries/zed/pull/16349

Potential concerns:
* We do not follow up to the `/` when looking for `.editorconfig`, only
up to the worktree root.
This seems fine for most cases, and the rest should be solved
generically later, as the same issue exists for settings.json.
* `fn language` in `AllLanguageSettings` is very hot, as it is called very
frequently during rendering. We accumulate and parse all `.editorconfig`
file contents beforehand, but still have to go over the globs, match them
against the given path, and merge the properties (see the sketch below).
This does not seem to be very bad, but it needs more testing and
potentially some extra caching.
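
A hedged sketch of the pre-parse-then-match idea described above; the types and the `glob_matches` helper are hypothetical stand-ins, not Zed's implementation:

```rust
use std::collections::HashMap;

struct Section {
    glob: String,
    properties: HashMap<String, String>,
}

// Parsed once, ahead of time; the hot path below only matches globs
// and merges properties.
struct EditorconfigCache {
    sections: Vec<Section>,
}

impl EditorconfigCache {
    fn properties_for(&self, path: &str) -> HashMap<String, String> {
        let mut merged = HashMap::new();
        for section in &self.sections {
            if glob_matches(&section.glob, path) {
                // Later sections override earlier ones, as in .editorconfig.
                merged.extend(section.properties.clone());
            }
        }
        merged
    }
}

fn glob_matches(glob: &str, path: &str) -> bool {
    // Stand-in for a real glob matcher such as the `globset` crate.
    glob == "*" || path.ends_with(glob.trim_start_matches('*'))
}

fn main() {
    let cache = EditorconfigCache {
        sections: vec![Section {
            glob: "*.rs".to_string(),
            properties: HashMap::from([("indent_size".to_string(), "4".to_string())]),
        }],
    };
    let props = cache.properties_for("src/main.rs");
    assert_eq!(props.get("indent_size").map(String::as_str), Some("4"));
}
```

If this matching still shows up in profiles, memoizing the merged result per path would be a natural next layer of caching.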


Release Notes:

- Added .editorconfig support

---------

Co-authored-by: Ulysse Buonomo <buonomo.ulysse@gmail.com>
2024-10-21 13:05:30 +03:00
renovate[bot]
d95a4f8671 Update swatinem/rust-cache digest to 82a92a6 (#19318)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [swatinem/rust-cache](https://redirect.github.com/swatinem/rust-cache) | action | digest | `23bce25` -> `82a92a6` |


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-21 11:10:45 +03:00
Alvaro Gaona
44dc693d30 docs: Update C# and C++ configuration pages (#19500)
- Update C# configuration page.
- Fix typo in C++ configuration page.

Release Notes:

- N/A
2024-10-21 11:07:12 +03:00
Conrad Irwin
92c29be74c SSH Remoting: Fix reconnects (#19485)
Before this change, messages could be lost on reconnect; now they will
not be.

Release Notes:

- SSH Remoting: make reconnects smoother

---------

Co-authored-by: Nathan <nathan@zed.dev>
2024-10-19 23:14:19 -06:00
Kirill Bulatov
1ae30f5813 Show project panel symlink icons for remote clients (#19464) 2024-10-19 19:44:47 +03:00
Antonio Scandurra
e8207288e5 Refold updated patch only if the patch was already folded (#19462)
Release Notes:

- N/A
2024-10-19 17:10:50 +02:00
Nathan Sobo
781fff220c Fix merging of an update of a symbol with an insert_before operation before the same symbol (#19450)
When we insert before some text and then update that same text, we need
to preserve and concatenate the new text associated with both
operations.
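
A toy illustration of the rule, assuming simplified operation types that carry only their new text (the real operations carry more state):

```rust
// Both operations target the same symbol; merging must keep both
// pieces of new text, in order, rather than letting one replace the other.
struct InsertBefore {
    new_text: String,
}

struct Update {
    new_text: String,
}

fn merge(insert: InsertBefore, update: Update) -> String {
    // Text inserted before the symbol comes first, followed by the
    // symbol's updated text; neither may be dropped.
    format!("{}{}", insert.new_text, update.new_text)
}

fn main() {
    let merged = merge(
        InsertBefore { new_text: "// new comment\n".into() },
        Update { new_text: "fn renamed() {}".into() },
    );
    assert_eq!(merged, "// new comment\nfn renamed() {}");
}
```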

Release Notes:

- N/A
2024-10-19 08:36:21 -06:00
Max Brunsfeld
d209eab058 Combine excerpt footer and header into a single block (#19441)
This simplifies rendering of excerpt headers and footers, and removes
the need to store a `BlockDisposition` on these boundary blocks. It's a
step toward implementing "replace blocks", which we want to use in the
assistant panel.

We've also cleaned up the way heights are specified for headers and
footers and fixed some visual asymmetries between the "expand upward"
and "expand downward" buttons.

Release Notes:

- N/A

---------

Co-authored-by: Richard <richard@zed.dev>
2024-10-18 17:58:07 -07:00
Vitaly Slobodin
3e0c5c10b7 lsp: Handle unregistration "textDocument/rename" from a server (#19427)
Hi. While working on https://github.com/zed-industries/zed/pull/19230 I
noticed that some servers send a request to unregister the
`textDocument/rename` capability. I thought it would be good to handle
that message in Zed:

```plaintext
[2024-10-18T21:25:07+02:00 WARN  project::lsp_store] unhandled capability unregistration: Unregistration { id: "biome_rename", method: "textDocument/rename" }
```

So this pull request implements that. Thanks.

Release Notes:

- N/A
2024-10-19 00:52:17 +02:00
Mikayla Maki
8a912726d7 Fix flaky SSH connection (#19439)
Fixes a bug due to the `select!` macro tossing futures that had
partially read messages, causing us to desync our message reading with
the input stream.
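
A hedged sketch of the cancellation-safety pitfall and the shape of a fix, with simplified stand-ins rather than Zed's actual I/O types:

```rust
use std::pin::pin;
use tokio::sync::mpsc;

async fn read_message() -> String {
    // Stand-in for reading one length-prefixed frame from the stream.
    "message".to_string()
}

async fn io_loop(mut shutdown: mpsc::Receiver<()>) {
    // Calling `read_message()` *inside* `select!` each iteration creates a
    // fresh future; whenever the other branch wins, a partially completed
    // read is dropped, leaving half a frame unread and the stream desynced.
    //
    // Instead, pin the future once and poll the *same* future across
    // iterations, restarting it only after a full message has arrived.
    let mut read = pin!(read_message());
    loop {
        tokio::select! {
            message = &mut read => {
                println!("received: {message}");
                read.set(read_message());
            }
            _ = shutdown.recv() => break,
        }
    }
}

#[tokio::main]
async fn main() {
    let (shutdown_tx, shutdown_rx) = mpsc::channel(1);
    shutdown_tx.send(()).await.unwrap();
    io_loop(shutdown_rx).await; // exits once the shutdown branch wins
}
```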

Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: conrad <conrad@zed.dev>
2024-10-18 15:41:43 -07:00
Marshall Bowers
30e081b3f7 elixir: Bump to v0.1.1 (#19437)
This PR bumps the Elixir extension to v0.1.1.

Changes:

- https://github.com/zed-industries/zed/pull/19135

Release Notes:

- N/A
2024-10-18 18:31:08 -04:00
Conrad Irwin
a5492b3ea6 Revert "SSH reconnect reliability (#19398)" (#19440)
This reverts commit 98ecb43b2d.

Tests fail on main?!

Closes #ISSUE

Release Notes:

- N/A
2024-10-18 15:08:56 -07:00
Marshall Bowers
47380001cc remote: Fix formatting (#19438)
This PR fixes some formatting issues from #19398 that slipped past CI,
somehow.

Release Notes:

- N/A
2024-10-18 17:31:41 -04:00
Conrad Irwin
98ecb43b2d SSH reconnect reliability (#19398)
Release Notes:

- SSH Remoting: Fix message reliability across restarts

---------

Co-authored-by: Nathan <nathan@zed.dev>
2024-10-18 15:28:08 -06:00
Valentine Briese
be474a6d6f docs: Update info on JDTLS install for Java (#19436) 2024-10-18 17:12:34 -04:00
Marshall Bowers
b44bed0115 collab: Unconditionally execute billing checks (#19432)
This PR removes the conditional checks around the billing-related
enforcement for LLM completions.

These were just in place to prevent executing any billing code before we
had rolled it out. Now that it is rolled out, we don't need this
conditional execution anymore.

Release Notes:

- N/A
2024-10-18 15:55:28 -04:00
Joseph T. Lyons
11a82e3347 Do not run CI on changes to community action config files (#19430)
Release Notes:

- N/A
2024-10-18 15:36:20 -04:00
Joseph T. Lyons
be81e29b0f Prefer users to open new issues if closed stale issue is valid (#19428)
I no longer want to have to keep my ears and eyes open for GitHub
notifications that relate to someone requesting a closed stale issue be
reopened. As a community maintainer, I can get hundreds or even
thousands of notifications a week, and a lot of those are about activity
on closed issues. If everyone following an issue did not react fast
enough (7 days) to keep an issue flagged as `stale` open, let's instruct
them to open new issues, so we are forced to see them during the next triage.

Release Notes:

- N/A
2024-10-18 15:04:26 -04:00
Valentine Briese
6a463be1ae docs: Direct Java extension users to JDTLS initialization options (#19401)
Continuation of #19390
2024-10-18 15:02:42 -04:00
Vladimir Varankin
64a6e9cafb docs: Outline Jsonnet language (#19410)
This PR adds basic documentation about Jsonnet language support.

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-18 14:20:15 -04:00
Marshall Bowers
fa738ee5e1 vue: Extract to zed-extensions/vue repository (#19426)
This PR extracts the Vue extension to the
[zed-extensions/vue](https://github.com/zed-extensions/vue) repository.

Release Notes:

- N/A
2024-10-18 14:08:32 -04:00
Marshall Bowers
15449cdf30 svelte: Extract to zed-extensions/svelte repository (#19425)
This PR extracts the Svelte extension to the
[zed-extensions/svelte](https://github.com/zed-extensions/svelte)
repository.

Release Notes:

- N/A
2024-10-18 13:36:07 -04:00
Shish
2db9090a2f remote: Polish for connection progress & error dialogs (#19379)
Before/after:

![err1-before](https://github.com/user-attachments/assets/43d959b3-c9d9-45dd-938e-42d34ec1cfc5)

![err1-after](https://github.com/user-attachments/assets/311d53e0-752c-4eb8-9816-64b1970c228d)

Before/after (I feel like text-wrapping would be more useful than
text-ellipsis here, but I don't see any wrap function):

![err2-before](https://github.com/user-attachments/assets/1626cda9-bf06-43fe-9b7d-3ec64f4db08a)

![err2-after](https://github.com/user-attachments/assets/749a6950-1409-4e75-808e-a1a96dbfc87e)

Before/after:

![prog-before](https://github.com/user-attachments/assets/f5f5a171-db42-4797-bab0-ad71c750bb20)

![prog-after](https://github.com/user-attachments/assets/b52a7694-36f6-4f7a-8a90-ceb223f12ec1)

Release Notes:

- N/A
2024-10-18 11:09:52 -06:00
Thomas
34b8655bf6 Improve increment/decrement with leading zeros in vim mode (#18362)
- Closes: #18360

Release Notes:

- Added support for incrementing and decrementing numbers with leading
zeros
2024-10-18 11:03:34 -06:00
张小白
5b745a82e1 reqwest_client: Fix socks proxy settings (#19123)
Closes #19362

This pull request includes several updates to the `reqwest_client` crate
and its dependencies. The most important changes involve adding support
for SOCKS proxies, improving error handling for proxy URIs, and adding
tests for proxy functionality.

### Dependency Updates:
* [`Cargo.toml`](diffhunk://#diff-2e9d962a08321605940b5a657135052fbcef87b5e360662bb527c96d9a615542L394-R401): Added support for SOCKS proxies in the `reqwest` dependency by including the `socks` feature.

### Code Improvements:
* [`crates/reqwest_client/src/reqwest_client.rs`](diffhunk://#diff-8e036b034e987390be2f57373864b75d6983f0cf84e85c43793eb431d13538f3L47-R52): Improved error handling when parsing proxy URIs by logging errors instead of directly panicking.

### Testing Enhancements:
* [`crates/reqwest_client/src/reqwest_client.rs`](diffhunk://#diff-8e036b034e987390be2f57373864b75d6983f0cf84e85c43793eb431d13538f3R274-R317): Added tests to verify the handling of various proxy URIs, including valid and invalid cases.

Release Notes:

- N/A
2024-10-18 09:57:00 -07:00
renovate[bot]
c59a75db1d Update actions/upload-artifact digest to b4b15b8 (#19310)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/upload-artifact](https://redirect.github.com/actions/upload-artifact) | action | digest | `604373d` -> `b4b15b8` |


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-18 12:41:36 -04:00
David Baldwin
b3c93130ec elixir: Support describe, test, setup, setup_all in outlines (#19135)
Closes #9894 

Release Notes:

- N/A

### Before


![2024-10-12T204848@2x](https://github.com/user-attachments/assets/84b7f123-8845-4e6d-b1b1-444e54ea6599)

### After


![2024-10-12T204749@2x](https://github.com/user-attachments/assets/67fdcead-bad3-4967-9ac4-0b85f1da7bca)
2024-10-18 12:39:24 -04:00
Marshall Bowers
73a6c542f3 vue: Bump to v0.1.1 (#19421)
This PR bumps the Vue extension to v0.1.1.

Changes:

- https://github.com/zed-industries/zed/pull/19419

Release Notes:

- N/A
2024-10-18 11:56:16 -04:00
Marshall Bowers
2cd6c19873 svelte: Bump to v0.2.1 (#19420)
This PR bumps the Svelte extension to v0.2.1.

Changes:

- https://github.com/zed-industries/zed/pull/19418

Release Notes:

- N/A
2024-10-18 11:37:51 -04:00
Marshall Bowers
6f24c1da79 vue: Support lang attribute for style tag injections (#19419)
This PR adds support for injecting languages into `<style>` tags using
`<style lang="...">` in Vue.

Extracted from https://github.com/zed-industries/zed/pull/18052.

Release Notes:

- N/A

Co-authored-by: Albert Marashi <albert@lumina.earth>
2024-10-18 11:36:30 -04:00
Marshall Bowers
5508832ba6 svelte: Adjust block keyword highlighting (#19418)
This PR adjusts the highlights for `{#each ...}` and `{#if ...}`
expression blocks to use keyword highlighting.

Extracted from https://github.com/zed-industries/zed/pull/18052.

Release Notes:

- N/A

Co-authored-by: Albert Marashi <albert@lumina.earth>
2024-10-18 11:25:03 -04:00
Marshall Bowers
35f2f2aac4 Treat .postcss files as CSS (#19416)
This PR makes it so `.postcss` files are recognized as CSS.

The `tree-sitter-css` grammar has basic support for PostCSS:
https://github.com/tree-sitter/tree-sitter-css/issues/17#issuecomment-1830349808.

Closes #18051.

Release Notes:

- `.postcss` files are now recognized as CSS.
2024-10-18 11:11:29 -04:00
Peter Tripp
9e27b6694a keymap: Add cmd-o 'Go to symbol' for JetBrains MacOS key bindings (#19415) 2024-10-18 10:12:39 -04:00
CharlesChen0823
f5124c21d1 scrollbar: Fix horizontal scrollbar display overflow last item (#19403)
Currently, the last item is overlapped by the scrollbar, so it
cannot be clicked. This change reserves one item's height for
displaying the horizontal scrollbar.
![Screenshot 2024-10-18
192352](https://github.com/user-attachments/assets/d686f0e8-93f8-426e-8816-6f00ed17a599)



Release Notes:

- N/A
2024-10-18 14:07:50 +02:00
Thorsten Ball
ea460014ab ssh remoting: Undo the spawning of message handlers (#19409)
I suspect that this might have something to do with saving files not
being possible after a while.

I want to merge this and bump nightly.

Release Notes:

- N/A
2024-10-18 10:48:11 +02:00
Peter Tripp
5168fc27a1 docs: More Java extension documentation (#19390)
Follow up of: https://github.com/zed-industries/zed/pull/19113
2024-10-17 20:21:45 -04:00
Marshall Bowers
2bcf9fc490 Add client::zed_urls module for constructing zed.dev URLs (#19391)
This PR adds a new `zed_urls` module to the `client` crate.

This module contains functions for constructing URLs to Zed properties,
such as zed.dev.

The URLs produced by this module will respect the server URL set via
settings or the `ZED_SERVER_URL` environment variable. This allows them
to correctly reflect the current environment (such as when testing Zed
against a local collab/zed.dev).
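
A minimal sketch of the idea; the function names and the `/account` path are illustrative, not the actual `zed_urls` API:

```rust
// Resolve the server URL from the environment, falling back to the
// production default, so every constructed URL follows the current
// environment.
fn server_url() -> String {
    std::env::var("ZED_SERVER_URL").unwrap_or_else(|_| "https://zed.dev".to_string())
}

fn account_url() -> String {
    format!("{}/account", server_url())
}

fn main() {
    // With ZED_SERVER_URL=http://localhost:8080 set, this prints
    // http://localhost:8080/account instead of the production URL.
    println!("{}", account_url());
}
```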

Release Notes:

- N/A
2024-10-17 16:18:35 -04:00
Marshall Bowers
3c32f01a8f paths: Update doc comment (#19388)
This PR updates a doc comment to use proper capitalization.

Also removes an unneeded `return`.

Release Notes:

- N/A
2024-10-17 15:36:29 -04:00
Marshall Bowers
9d61cd5120 ci: Check spelling separately from formatting (#19385)
This PR moves the spelling check out of the `check_style` action, which
we can leave for just checking formatting.

We can't use the `crates-ci-typos` action as-is on the macOS runners due
to the absence of `wget`.

Release Notes:

- N/A
2024-10-17 13:47:23 -04:00
Max Brunsfeld
411f64b374 Restructure assistant edits to show all changes in a proposed-change editor (#18240)
This changes the `/workflow` command so that instead of emitting edits
in separate steps, the user is presented with a single tab, with an
editable diff that they can apply to the buffer.

Todo

* Assistant panel
    * [x] Show a patch title and a list of changed files in a block decoration
    * [x] Don't store resolved patches as state on Context. Resolve on demand.
    * [ ] Better presentation of patches in the panel
    * [ ] Show a spinner while patch is streaming in
* Patches
    * [x] Preserve leading whitespace in new text, auto-indent insertions
    * [x] Ensure patch title is very short, to fit better in tab
    * [x] Improve patch location resolution, prefer skipping whitespace over skipping `}`
    * [x] Ensure patch edits are auto-indented properly
    * [ ] Apply `Update` edits via a diff between the old and new text, to get fine-grained edits.
* Proposed changes editor
    * [x] Show patch title in the tab
    * [x] Add a toolbar with an "Apply all" button
    * [x] Make `open excerpts` open the corresponding location in the base buffer (https://github.com/zed-industries/zed/pull/18591)
    * [x] Add an apply button above every hunk (https://github.com/zed-industries/zed/pull/18592)
    * [x] Expand all diff hunks by default (https://github.com/zed-industries/zed/pull/18598)
    * [x] Fix https://github.com/zed-industries/zed/issues/18589
    * [x] Syntax highlighting doesn't work until the buffer is edited (https://github.com/zed-industries/zed/pull/18648)
    * [x] Disable LSP interaction in Proposed Changes editor (https://github.com/zed-industries/zed/pull/18945)
    * [x] No auto-indent? (https://github.com/zed-industries/zed/pull/18984)
* Prompt
    * [ ] make sure old_text is unique

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
Co-authored-by: Antonio <antonio@zed.dev>
Co-authored-by: Richard <richard@zed.dev>
Co-authored-by: Marshall <marshall@zed.dev>
Co-authored-by: Nate Butler <iamnbutler@gmail.com>
Co-authored-by: Antonio Scandurra <me@as-cii.com>
Co-authored-by: Richard Feldman <oss@rtfeldman.com>
2024-10-17 13:18:13 -04:00
Peter Tripp
4ae2f93086 ci: Improve GitHub Action modularity (#18861)
- Closes: https://github.com/zed-industries/zed/issues/19351
- Switch to using the official [typos GitHub Action](https://github.com/crate-ci/typos/blob/master/docs/github-action.md)
- Move the typos check into `actions/check_style`
- Move Squawk Postgres migration check out of `actions/check_style` file into ci.yml
- `actions/check_style` can now be run on stateless/Linux runners (previously required a self-hosted macOS runner)
- ci.yml: Split the old `style` checks into those that can run statelessly (Linux) and everything else into a new `migration` group, which benefits from the full git checkout available on the macOS runners.
- ci.yml: Move `Check unused dependencies` from style to `linux_tests`
- Add `if: github.repository_owner == 'zed-industries'` to all jobs so they won't try to run on GitHub forks.
2024-10-17 12:50:19 -04:00
Conrad Irwin
65fb2782eb Always have cmd-o open a local project (#19376)
Release Notes:

- Fixed `cmd-o` in an SSH project to always open a local project
2024-10-17 10:36:53 -06:00
Thorsten Ball
e6b9a8ef9b ssh remoting: Handle OpenNewBuffer request (#19373)
Release Notes:

- N/A
2024-10-17 17:49:17 +02:00
Elliot Thomas
398d0396b6 workspace: Fix inconsistent paths order serialization (#19232)
Release Notes:

- Fixed inconsistent serialization of workspace paths order
2024-10-17 17:38:28 +02:00
Finn Evers
e9e4c770ca Update all occurrences of option_as_meta to new default value (#19369)
This PR is a quick follow-up to #19364 which updates some left-out
occurrences of `option_as_meta` to the new default value (`false`).
2024-10-17 11:21:07 -04:00
Thorsten Ball
4be9da2641 remote ssh: Make "get permalink to line" work (#19366)
This makes the `editor: copy permalink to line` action work in SSH
remote projects.

Previously it would only work in local projects.

Demo:


https://github.com/user-attachments/assets/a8012152-b631-4b34-9ff2-e4d033c97dee




Release Notes:

- N/A
2024-10-17 17:07:42 +02:00
Thorsten Ball
c186e99a3d ssh remote: Reset missed heartbeats on connection activity (#19368)
Ran into this this morning. At least I suspect I ran into it. In any
case: we need to reset the missed heartbeats to 0 whenever we see any
connection activity.
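
A small sketch of the reset-on-activity idea, assuming a simple counter-based heartbeat monitor (identifiers are illustrative):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

static MISSED_HEARTBEATS: AtomicUsize = AtomicUsize::new(0);

// Any inbound message is evidence the connection is alive, so the
// missed-heartbeat counter resets on every receipt, not only on
// explicit heartbeat messages.
fn on_connection_activity() {
    MISSED_HEARTBEATS.store(0, Ordering::SeqCst);
}

// Called when a heartbeat interval elapses without a response; returns
// true once the connection should be considered dead.
fn on_heartbeat_timeout(max_missed: usize) -> bool {
    MISSED_HEARTBEATS.fetch_add(1, Ordering::SeqCst) + 1 >= max_missed
}

fn main() {
    assert!(!on_heartbeat_timeout(3));
    on_connection_activity(); // any traffic resets the count
    assert!(!on_heartbeat_timeout(3));
}
```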

Release Notes:

- N/A
2024-10-17 17:07:34 +02:00
Peter Tripp
4df882c295 Make terminal.option_as_meta=false in default settings (#19364)
- This reverts the change I made in https://github.com/zed-industries/zed/pull/15535 which set `option_as_meta` to `true` in the default settings.
- `true` is a reasonable default for US Keyboards, but is terrible for many others which rely on `alt+<key>` for totally normal keystroke combinations.
2024-10-17 10:31:35 -04:00
Marshall Bowers
17f2929b4c collab: Anchor new subscription's billing cycle to the first of the month (#19367)
This PR makes it so new subscriptions will have their billing cycle
anchored to the first of the month.

When someone signs up today, they will be billed starting on the first
of next month.
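
For illustration, a hedged sketch of computing such an anchor with `chrono`; Stripe's `billing_cycle_anchor` takes a timestamp, and the helper name here is hypothetical:

```rust
use chrono::{Datelike, NaiveDate, Utc};

// The first day of the next month, to be used as the new
// subscription's billing cycle anchor.
fn next_month_first() -> NaiveDate {
    let today = Utc::now().date_naive();
    let (year, month) = if today.month() == 12 {
        (today.year() + 1, 1)
    } else {
        (today.year(), today.month() + 1)
    };
    NaiveDate::from_ymd_opt(year, month, 1).expect("the 1st always exists")
}

fn main() {
    println!("anchor: {}", next_month_first());
}
```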

Release Notes:

- N/A

Co-authored-by: Antonio <antontio@zed.dev>
Co-authored-by: Richard <richard@zed.dev>
2024-10-17 10:18:12 -04:00
Peter Tripp
5ad392035e Support uppercase extensions in image preview (#19304) 2024-10-17 08:48:18 -04:00
Antonio Scandurra
8c910540ed Subtract FREE_TIER_MONTHLY_SPENDING_LIMIT from reported monthly spend (#19358)
Release Notes:

- N/A
2024-10-17 13:09:50 +02:00
Antonio Scandurra
455f241c6a Introduce a new /billing/monthly_spend API (#19354)
Fixes https://github.com/zed-industries/zed/issues/19353

Release Notes:

- N/A
2024-10-17 12:34:25 +02:00
Antonio Scandurra
498ecd6404 Fetch more than one page when polling stripe events (#19343)
This fixes a bug that was causing most users to be unable to use the
LLMs via Zed. It was caused by not using pagination and, instead, always
querying the very first page of stripe events.

Note that we're also allowing processing events generated in the last 24
hours (before, this was only 1 hour). I did this so that we can process
the backlog of events that the aforementioned bug was skipping.
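
A hedged sketch of cursor-based pagination over the event list; `fetch_page` is a stand-in for Stripe's `GET /v1/events` call with its `starting_after` cursor:

```rust
struct Page {
    events: Vec<String>,
    has_more: bool,
}

fn fetch_page(starting_after: Option<&str>) -> Page {
    // Stand-in for one Stripe list request.
    let _ = starting_after;
    Page { events: vec![], has_more: false }
}

fn all_events() -> Vec<String> {
    let mut all = Vec::new();
    let mut cursor: Option<String> = None;
    loop {
        let page = fetch_page(cursor.as_deref());
        all.extend(page.events);
        if !page.has_more {
            break; // the bug: previously only the first page was ever read
        }
        cursor = all.last().cloned();
    }
    all
}

fn main() {
    assert!(all_events().is_empty());
}
```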

Release Notes:

- N/A
2024-10-17 09:47:25 +02:00
Thorsten Ball
3216de7eb5 ssh remoting: Do not print error backtrace on non-zero exit (#19290)
Closes #ISSUE


Release Notes:

- N/A
2024-10-17 09:41:16 +02:00
renovate[bot]
57369b5a54 Update Rust crate tree-sitter-elixir to v0.3.1 (#19335) 2024-10-17 08:44:51 +03:00
Heavysnowjakarta
f9d4272e13 docs: Java extension settings (#19113)
Co-authored-by: Peter Tripp <peter@zed.dev>
2024-10-17 00:04:59 -04:00
Conrad Irwin
378a2cf9d8 Allow passing args to ssh (#19336)
This is useful for passing a custom identity file, jump hosts, etc.

Unlike with the v1 feature, we won't support `gh`/`gcloud` ssh wrappers
(yet?). I think the right way of supporting those would be to let
extensions provide remote projects.

Closes #19118

Release Notes:

- SSH remoting: restored ability to set arguments for SSH
2024-10-16 21:09:31 -06:00
Conrad Irwin
f1d01d59ac Simplify PR template (#19337)
Release Notes:

- N/A
2024-10-16 20:22:08 -06:00
Danilo Leal
78093b8e76 ssh: Clean up title bar indicator icon (#19328)
This PR cleans up the custom icon with indicator implementation in favor
of `IconWithIndicator`, which we already had. It seems like it isn't
widely used yet, but it's good to try to enforce some consistency
either way. I checked my changes against the REPL stuff (one instance
where it's used) and everything's looking good so far. As far as SSH,
nothing has visually changed; we just have less code for this thing now.

<img width="800" alt="Screenshot 2024-10-17 at 2 15 47 AM"
src="https://github.com/user-attachments/assets/5c146757-501e-4242-b145-a576a8f289b5">

---

Release Notes:

- N/A
2024-10-16 22:25:27 -03:00
Danilo Leal
a41e973782 ssh: Remove server count from modal header (#19329)
The server count was something that existed since the remote development
implementation and we just kept it there without a lot of critical
thinking. However, it doesn't feel like it's particularly useful yet,
which means that, at least for now, we could clean it up more and wait
for further feedback to add it back, if ever requested.

Release Notes:

- N/A
2024-10-16 22:25:15 -03:00
Danilo Leal
9a3d8733ce ssh: Use system prompt for the server removal action (#19332)
This PR replaces a toast with the system prompt to confirm the action of
removing a server from the remote list. The alert dialog component is
the right choice here, as we want a modal action that forces a
choice. This should make it easier to convert to a native alert dialog
in the future, as well as for other platforms.

<img width="800" alt="Screenshot 2024-10-17 at 3 01 41 AM"
src="https://github.com/user-attachments/assets/7bb1210a-54bf-40da-a85a-f269484825a1">

Release Notes:

- N/A
2024-10-16 22:25:03 -03:00
Conrad Irwin
c888101e4b SSH remoting: Don't panic when opening root, open ~ instead (#19322)
Release Notes:

- Fixed a panic when doing `zed ssh://server/`
2024-10-16 17:17:20 -06:00
Conrad Irwin
0c04fb9862 SSH remoting: better error message for projects (#19320)
Before this, if no project paths were opened you were in a weird UI
state where most things didn't work, because the project was ssh but no
files/folders were open.

Release Notes:

- Fixed error handling when no project paths could be opened
2024-10-16 17:16:56 -06:00
Marshall Bowers
f6fad3b09e collab: Remove lifetime spending limit in favor of LLM usage billing (#19321)
This PR removes the lifetime spending limit that was added in #16780.

We had previously added this as a way to prevent runaway usage, but now
that we have a cap on free usage per month with paid access after that,
we don't need this check anymore.

Release Notes:

- N/A
2024-10-16 18:14:07 -04:00
renovate[bot]
6614feff97 Pin astral-sh/setup-uv action to f3bcaeb (#19309)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [astral-sh/setup-uv](https://redirect.github.com/astral-sh/setup-uv) | action | pinDigest | -> `f3bcaeb` |


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-16 17:20:39 -04:00
Conrad Irwin
08b1545c85 Show a user-visible error message if saving fails (#19311)
Release Notes:

- Added a user-visible error message when a manual save fails.
2024-10-16 15:17:38 -06:00
Marshall Bowers
fedd177b08 collab: Add context to errors syncing billing events to Stripe (#19315)
This PR adds context to errors that occur when trying to sync billing
events to Stripe.

Release Notes:

- N/A
2024-10-16 17:09:26 -04:00
renovate[bot]
4288096ca1 Update Rust crate tree-sitter-cpp to v0.23.1 (#18974)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [tree-sitter-cpp](https://redirect.github.com/tree-sitter/tree-sitter-cpp) | workspace.dependencies | patch | `0.23.0` -> `0.23.1` |

---

### Release Notes

<details>
<summary>tree-sitter/tree-sitter-cpp (tree-sitter-cpp)</summary>

###
[`v0.23.1`](https://redirect.github.com/tree-sitter/tree-sitter-cpp/compare/v0.23.0...v0.23.1)

[Compare
Source](https://redirect.github.com/tree-sitter/tree-sitter-cpp/compare/v0.23.0...v0.23.1)

</details>


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-16 23:21:23 +03:00
renovate[bot]
256c31a5d9 Update Rust crate tree-sitter-c to v0.23.1 (#18958)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [tree-sitter-c](https://redirect.github.com/tree-sitter/tree-sitter-c) | workspace.dependencies | patch | `0.23.0` -> `0.23.1` |

---

### Release Notes

<details>
<summary>tree-sitter/tree-sitter-c (tree-sitter-c)</summary>

###
[`v0.23.1`](https://redirect.github.com/tree-sitter/tree-sitter-c/compare/v0.23.0...v0.23.1)

[Compare
Source](https://redirect.github.com/tree-sitter/tree-sitter-c/compare/v0.23.0...v0.23.1)

</details>


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-16 23:19:10 +03:00
David Soria Parra
c8b6ad9666 Context Servers: Protocol fixes and UI improvements (#19087)
This PR does two things. It fixes some minor inconsistencies in the
protocol. This is mostly about handling JSON RPC notifications correctly
and skipping fields when set to None.

The second part improves the rendering of context server commands
by passing the command's description on to the slash command UI and
showing the name of the argument as a CodeLabel.

Release Notes:

- N/A
2024-10-16 13:07:15 -07:00
Peter Tripp
0e22c9f275 docs: Add C++ clangd example arguments (#19308) 2024-10-16 16:07:05 -04:00
Kirill Bulatov
56f69be2e7 Do not allow [re]running ssh tasks when not connected to the server (#19306)
Release Notes:

- N/A
2024-10-16 22:57:39 +03:00
Kirill Bulatov
02f63e49ed Resolve proto hints with empty resolve data (#19274)
Fixed ssh remoting not showing a lot of hints


Release Notes:

- N/A
2024-10-16 21:50:51 +03:00
Kirill Bulatov
3dcc638537 Better handle shell for remote ssh projects (#19297)
Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad@zed.dev>
2024-10-16 21:49:54 +03:00
Marshall Bowers
d35b646dbb assistant: Direct user to account page to subscribe for more LLM usage (#19300)
This PR updates the location where we send the user to subscribe for
more LLM usage to the account page.

Release Notes:

- Updated the URL to the account page when subscribing to LLM usage.
2024-10-16 14:03:42 -04:00
张小白
338bf3fd28 windows: Fix window not displaying correctly on launch (#19124)
Closes #18705 (comment)

This PR fixes the issue where the Zed window was not displaying
correctly on launch. Now, when Zed is closed in a maximized state, it
will reopen in a maximized state.

On macOS, when a window is created but not yet visible, calling `zoom`
or `toggle_fullscreen` will still affect the hidden window. However,
this behavior is different on Windows, so special handling is required.

Also, since #18705 hasn't been reviewed yet, I'm not sure if this PR
should be merged now or if it should wait until #18705 is reviewed
first.


Release Notes:

- N/A
2024-10-16 10:29:42 -07:00
Matin Aniss
879a2ea06f gpui: Replace redundant code in animation (#19273)
Just a small change to replace some redundant code in the animation
element.

Release Notes:

- N/A
2024-10-16 10:26:26 -07:00
Piotr Osiewicz
7a5003bea2 ssh: Do not look up dev servers when rendering the default mode (#19295)
This should help with the bug where there's a mismatch between
connection count and the list showing empty state.

Closes #ISSUE

Release Notes:

- N/A
2024-10-16 18:53:05 +02:00
Joseph T. Lyons
f8f3f369f6 v0.159.x dev 2024-10-16 12:47:57 -04:00
Antonio Scandurra
474e670bbd Increase monthly free tier spend from 5 dollars to 10 dollars (#19291)
Release Notes:

- N/A

Co-authored-by: Marshall <marshall@zed.dev>
Co-authored-by: Richard <richard@zed.dev>
2024-10-16 12:22:24 -04:00
Marshall Bowers
fe0bcc063c collab: Add Stripe API key to Kubernetes template (#19292)
This PR adds the Stripe API key to the Kubernetes template.

It's optional right now, so we can set the API key when we're ready.

Release Notes:

- N/A
2024-10-16 12:10:39 -04:00
Thorsten Ball
69abe71bf7 ssh remoting: Treat closed stderr as error (#19289)
Before this change we had a race condition bug: if stderr was closed
before the other two sockets, we wouldn't properly detect when the
server died, and not report or retry anything.

That's because we treated a closed stderr as a non-error.

Technically, it isn't an error (closing a connection is okay!), but
until we have a proper shutdown ceremony between all three processes, we
can treat it as an error, because that lets us detect when the server
is gone.

On the client-side, we also always react to these errors by
reconnecting. Except when we shutdown: there we do a proper shutdown and
won't error on the proxy exit code.

So, this works, even if I wish there was a better way for the server to
communicate to the proxy that it shutdown properly. But I don't want a
fourth socket.

Release Notes:

- N/A
2024-10-16 18:05:52 +02:00
Marshall Bowers
9c3d80d6e8 collab: Fetch more meters and prices when initializing StripeBilling (#19288)
This PR makes it so we fetch more meters and prices when initializing
`StripeBilling`, as we have more than 10 meters defined.

Release Notes:

- N/A
2024-10-16 11:40:56 -04:00
Kirill Bulatov
834d50f0db Properly open worktrees when cmd-clicking in terminal or on inlay hints (#19280)
* uses the state that's synced, to fetch the language server name
* uses a proper, canonicalized path when creating a remote ssh worktree,
otherwise `~/foo/something` stays unexpanded

Release Notes:

- N/A
2024-10-16 18:12:36 +03:00
Kirill Bulatov
bcdb10b3cb Do not attempt to install prettier if the language change is unrelated (#19283)
Release Notes:

- Fixed prettier installation being attempted too often
2024-10-16 18:10:05 +03:00
Marshall Bowers
598939d186 collab: Refresh the user's LLM token when their subscription changes (#19281)
This PR makes it so collab will trigger a refresh for a user's LLM token
whenever their subscription changes.

This allows us to proactively push down changes to their subscription.

In order to facilitate this, the Stripe event processing has been moved
from the `api` service to the `collab` service in order to access the
RPC server.

Release Notes:

- N/A
2024-10-16 10:58:28 -04:00
Thorsten Ball
9d944d0662 ssh remote: Restore ControlPersist=no (#19277)
This restores the change from #19193 that I erroneously reverted in
#19234.

I think the bug in #19275 got in my way when testing.

With that bug fixed, the changes in here also work fine.


Release Notes:

- N/A
2024-10-16 16:13:31 +02:00
Tilman Roeder
7d2628e805 Make the divider rule color more muted (#19255)
I've been a bit annoyed by the hover divider rule being extremely bright
compared to other divider rules in the UI. This PR updates their color
to use the regular border color from the current theme instead of the
muted (but still pretty bright) text color.

Apologies for the unsolicited PR (and please feel free to close if it
goes against some other plans / designs you already have in place :).

#### Example screenshot before:
<img width="302" alt="Screenshot 2024-10-15 at 23 29 18"
src="https://github.com/user-attachments/assets/7ea22808-8135-4a46-9457-e670225aebaa">

#### Example screenshot after:
<img width="312" alt="Screenshot 2024-10-15 at 23 28 16"
src="https://github.com/user-attachments/assets/63ac0d02-ae6d-4962-84a2-1fdb95519b15">

***

Release Notes:

- Make the divider rule in LSP hovers more muted
2024-10-16 11:00:22 -03:00
Ihnat Aŭtuška
84df3a0cad Allow formatting selections via LSP (#18752)
Release Notes:

- Added a new `editor: format selections` action that allows formatting
only the currently selected text via the primary language server.

---------

Co-authored-by: Thorsten Ball <mrnugget@gmail.com>
2024-10-16 15:58:37 +02:00
Thorsten Ball
eb76065ad3 ssh remoting: Fix hang when activity channel gets dropped (#19275)
When the SSH command or the server dies, the channel gets dropped and
the heartbeat method goes into an infinite loop, causing a hang.

Oversight from yesterday. Fixed now.

Release Notes:

- N/A
2024-10-16 15:57:58 +02:00
Peter Tripp
84018d7a2d zig: Bump to v0.3.1 (#19252)
Includes:
- https://github.com/zed-industries/zed/pull/18323
- https://github.com/zed-industries/zed/pull/17488
2024-10-16 08:42:45 -04:00
Peter Tripp
57c55b32e1 html: Bump to v0.1.3 (#19251)
Includes:
- https://github.com/zed-industries/zed/pull/18024
2024-10-16 08:42:27 -04:00
Peter Tripp
a4357c429a elixir: Bump to v0.1.0 (#19250)
Includes:
- https://github.com/zed-industries/zed/pull/18024
- https://github.com/zed-industries/zed/pull/17488
- https://github.com/zed-industries/zed/pull/16985
2024-10-16 08:42:07 -04:00
Peter Tripp
103665ee28 astro: Bump to v0.1.1 (#19249)
Includes:
- https://github.com/zed-industries/zed/pull/18024
2024-10-16 08:41:45 -04:00
Thorsten Ball
2f960c4aba project environment: Log when which env is used (#19270)
This adds more logging for debugging purposes.

Release Notes:

- N/A
2024-10-16 14:12:45 +02:00
Piotr Osiewicz
109ebc5f27 ui: Add Scrollbar component (#18927)
Closes #ISSUE

Release Notes:

- N/A
2024-10-16 13:57:28 +02:00
Stanislav Alekseev
eddf70b5c4 Revert "lsp: Do not notify all language servers on file save" (#19183)
Reverts zed-industries/zed#17756. According to the existing
implementations of the LSP specification, namely
[Helix](a7651f5bf0/helix-view/src/document.rs (L1038))
and, if I'm not wrong,
[VSCode](https://github.com/microsoft/vscode-languageserver-node/blob/main/client/src/common/textSynchronization.ts#L580),
`textDocument/didSave` has nothing to do with the watched files and
should be sent to the language servers connected to the buffers even if
the files are not watched by those. As the LSP spec doesn't say anything
about `didSave` being related to the watched files, and the reference
implementation in VSCode seemingly does not filter the notifications
according to those, it seems like this is an incorrect interpretation of
the specification.

This also causes issues with language servers. See [Metals
issue](https://github.com/scalameta/metals-zed/issues/28#issuecomment-2410393150)
for example.

Closes #18636

Release Notes:

- N/A
2024-10-16 12:41:01 +02:00
Stanislav Alekseev
128619899e Environment loading fixes (#19144)
Closes #19040
Addresses the problem with annoying error messages on Windows (see
comment from SomeoneToIgnore on #18567)

Release Notes:

- Fixed the bug where language servers from PATH would sometimes be
prioritised over the ones from `direnv`
- Stopped running environment loading on Windows, as it didn't work
anyway due to `SHELL` not being set
2024-10-16 12:14:40 +02:00
Axel Carlsson
a77ec94cbc vim: Add support for insert button (#19245)
This commit adds support for using the physical Insert key. The first
press toggles insert mode, and subsequent presses toggle back and forth
between replace and insert mode.

Closes #19224

Release Notes:

- Added support for using the Insert key in vim mode.
2024-10-16 12:11:17 +02:00
Kirill Bulatov
a56f946a7d Force astro-language-server to be the primary one for Astro (#19266)
Part of https://github.com/zed-industries/zed/issues/19239

Overall, this hardcoding approach has to stop; Zed should instead show some
notification/modal that proposes selecting a primary language server
when launching with a language that has no such setting.

Release Notes:

- Fixed Astro LSP interactions
2024-10-16 10:06:45 +03:00
Mikayla Maki
f944ebc4cb Add settings to remote servers, use XDG paths on remote, and enable node LSPs (#19176)
Supersedes https://github.com/zed-industries/zed/pull/19166

TODO:
- [x] Update basic zed paths
- [x] update create_state_directory
- [x] Use this with `NodeRuntime`
- [x] Add server settings
- [x] Add an 'open server settings command'
- [x] Make sure it all works


Release Notes:

- Updated the actions `zed::OpenLocalSettings` and `zed::OpenLocalTasks`
to `zed::OpenProjectSettings` and `zed::OpenProjectTasks`.

---------

Co-authored-by: Conrad <conrad@zed.dev>
Co-authored-by: Richard <richard@zed.dev>
2024-10-15 23:32:44 -07:00
CharlesChen0823
1dda039f38 remote_server: Local builds also need the debug-embed feature (#19265)
This wasted me one more hour. IMO, this is also needed for `build_local`.

Release Notes:

- N/A
2024-10-15 23:08:37 -07:00
Alvaro Gaona
182230a0ba Fix C++ configuration documentation (#19258)
- Update the binary JSON object
- Add .clangd configuration file summary


![image](https://github.com/user-attachments/assets/5c4cfd26-3cd8-4d12-96fc-483596c74287)

Release Notes:

- N/A
2024-10-16 08:45:26 +03:00
Danilo Leal
b64919aa11 ssh: Refine the modal UI (#19256)
This PR refines the SSH modal UI, adjusting spacing and alignment. Via
these changes, I'm also introducing the ability for the `empty_message`
on the `List` component to receive not just a string but any element.
The custom way in which the SSH modal was designed made it feel like
this was needed for proper spacing.

<img width="700" alt="Screenshot 2024-10-16 at 1 20 54 AM"
src="https://github.com/user-attachments/assets/f2e0586b-4c9f-4497-b4cb-e90c8157512b">


Release Notes:

- N/A
2024-10-15 20:39:27 -03:00
Marshall Bowers
b752548742 storybook: Load GPUI with default features (#19253)
This PR makes it so the Storybook loads GPUI with the default features
enabled.

This fixes a panic that would occur when trying to run any of the
stories.

Release Notes:

- N/A
2024-10-15 17:55:58 -04:00
Peter Tripp
d63a49647f Glob documentation (#18789)
Leaving this unlinked for now because I'm not sure where it belongs.
Plan to write up something for regexes too.
2024-10-15 17:21:04 -04:00
Peter Tripp
c00f2d8842 Add Diff language (#19129) 2024-10-15 16:02:12 -04:00
Joseph T. Lyons
973143fa35 Fix variable name typo (#19244)
Release Notes:

- N/A
2024-10-15 16:01:50 -04:00
Joseph T. Lyons
d806df9f16 Ensure issues without core labels have triage labels (#19243)
Sometimes, issues are created outside of issue templates (which we don't
prefer, but we can't prevent). This updates our top-ranking issues
script such that it will add `triage` and `admin read` labels to any
issue that is missing a core label, so that we don't miss the issues
when doing the next triage.

Release Notes:

- N/A
2024-10-15 15:55:18 -04:00
Peter Tripp
b682fc6d1d Use multi-line regex for '\s' (#19241) 2024-10-15 15:47:43 -04:00
Peter Tripp
5445f898e8 ruby: Move Ruby extension to zed-extensions/ruby repo (#19098) 2024-10-15 15:41:20 -04:00
Peter Schilling
56163b1e35 Add ability to reload a file (#18395)
Closes #13212

Release Notes:

- Added reload command
- vim: Added `:e[dit]`, `:e[dit]!` which calls reload
2024-10-15 13:02:18 -06:00
CharlesChen0823
695176898e project_search: Fix message displayed when no results are found (#19108)
When no results were found, the message always displayed `Search all
files`, which is confusing.

Release Notes:

- Fixed an issue where the project search would sometimes show "Search
all files" when there were no results.

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-15 13:41:51 -04:00
Thorsten Ball
e3c6ba4bd7 ssh remote: Revert #19193 and treat killed proxy as non-zero (#19234)
This does two things.

Important one: it reverts #19193, which led to our whole process
handling breaking. When the `proxy` process was killed, it apparently
didn't close the stdout/stderr anymore, which meant we would not detect
when it died. (Watching its `status()` in the io loop also didn't work!)

We should figure out how to keep our process handling working before we
make this change in #19193, which sounds reasonable.

Second, less important thing: I think we should treat the process being
killed from a signal as non-zero, as an error.

Release Notes:

- N/A
2024-10-15 19:20:06 +02:00
thataboy
8924b3fb5b Fix block cursor obscuring placeholder text and editor text in some cases (#18114) 2024-10-15 13:10:01 -04:00
Joseph T. Lyons
1732441f48 Use python 3.13 (#19225)
Release Notes:

- N/A
2024-10-15 11:08:02 -04:00
Joseph T. Lyons
096d37a1ed Remove outdated requirements.txt (#19223)
Release Notes:

- N/A
2024-10-15 11:04:54 -04:00
cabrinha
ba69f48ccf docs: Add Helm extension docs (#19095)
Merge after this:
- https://github.com/zed-industries/extensions/pull/746

Release Notes:

- N/A

---------

Co-authored-by: Peter Tripp <peter@zed.dev>
Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-15 10:40:47 -04:00
Bennet Bo Fenner
0eb6bfd323 ssh remoting: Treat other message as heartbeat (#19219)
This improves the heartbeat detection logic. We now treat any other
incoming message from the ssh remote server
as a heartbeat message, meaning that we can detect re-connects earlier.

It also changes the connection handling to await futures detached.

Co-Authored-by: Thorsten <thorsten@zed.dev>

Release Notes:

- N/A

---------

Co-authored-by: Thorsten <thorsten@zed.dev>
Co-authored-by: Antonio <antonio@zed.dev>
2024-10-15 16:37:56 +02:00
Piotr Osiewicz
4fa75a78b9 gpui: Improve performance of laying out long lines (#19215)
TL;DR: Another O(n^2) strikes.

In #19194 we received a report about a 7Mb JSON file that Zed struggles
with. Naturally this file showcased an O(n^2) in line layout; this file
has one long line.

During line layout for Mac we have to convert between UTF-16 and UTF-8
indices in the string, as CoreText works with UTF-16 and Rust strings
are UTF-8. The problem stemmed from the fact that we were re-seeking our
string converter on each glyph, which boils down to: we were reparsing
[0..curr_string_position] bytes, up to the full length of the string,
which is the O(n^2) in question. This PR changes this behaviour to reuse
the index converter if the position we're seeking to is not yet reached.
Basically, we're treating the converter as a forward iterator and we try
to seek with the same iterator, if possible.
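
A self-contained sketch of the forward-iterator idea (not GPUI's actual converter): remember the last byte/UTF-16 position pair and scan forward from it, rescanning from zero only when asked to go backwards:

```rust
struct Utf16Indexer<'a> {
    s: &'a str,
    byte_pos: usize,
    utf16_pos: usize,
}

impl<'a> Utf16Indexer<'a> {
    fn new(s: &'a str) -> Self {
        Self { s, byte_pos: 0, utf16_pos: 0 }
    }

    /// Convert a UTF-16 offset to a UTF-8 byte offset, scanning forward
    /// from the last position when possible: O(n) total over a line
    /// instead of O(n^2).
    fn utf16_to_byte(&mut self, target_utf16: usize) -> usize {
        if target_utf16 < self.utf16_pos {
            // Target is behind us: fall back to rescanning from the start.
            self.byte_pos = 0;
            self.utf16_pos = 0;
        }
        for ch in self.s[self.byte_pos..].chars() {
            if self.utf16_pos >= target_utf16 {
                break;
            }
            self.utf16_pos += ch.len_utf16();
            self.byte_pos += ch.len_utf8();
        }
        self.byte_pos
    }
}

fn main() {
    let mut indexer = Utf16Indexer::new("a你b");
    assert_eq!(indexer.utf16_to_byte(1), 1); // after 'a'
    assert_eq!(indexer.utf16_to_byte(2), 4); // after '你' (3 UTF-8 bytes)
}
```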

Where previously you could not even open the file in OP (within
reasonable time frame, I waited for 40 seconds before giving up), now
you can do it in.. slightly over a second. The best part is: the
experience is still not ideal. Typing in the buffer is sluggish. Still,
this is a start.


Release Notes:

- Mac: Improved performance with very long lines
2024-10-15 16:28:47 +02:00
Thorsten Ball
397e4bee0a ssh remoting: Forward LSP logs to client (#19212)
Release Notes:

- N/A

---------

Co-authored-by: Bennet Bo Fenner <bennet@zed.dev>
2024-10-15 16:04:29 +02:00
Piotr Osiewicz
db7417f3b5 Rework file picker for SSH modal (#19020)
This PR changes the SSH modal design so it's more keyboard
navigation-friendly and adds the server nickname feature.

Release Notes:

- N/A

---------

Co-authored-by: Danilo <danilo@zed.dev>
Co-authored-by: Danilo Leal <67129314+danilo-leal@users.noreply.github.com>
2024-10-15 12:38:03 +02:00
Thorsten Ball
be7b24fcf7 ssh remote: Shutdown SSH & server process correctly on app quit (#19210)
Release Notes:

- N/A

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-15 11:26:23 +02:00
CharlesChen0823
62de03f286 tab: Fix copy wrong relative path for tab (#19206)
Closes #19204 

Release Notes:

- Fixed relative paths copied incorrectly from tabs
([#19204](https://github.com/zed-industries/zed/issues/19204))
2024-10-15 09:27:34 +03:00
wannacu
41ba4178fc gpui: Fix crash caused by ownership leak (#19185)
- Closes #18811

Release Notes:

- N/A
2024-10-14 12:46:04 -07:00
Antonio Scandurra
6e2869a321 Prevent deadlock when create a new meter/price on Stripe (#19196)
This also puts the entire state of `StripeBilling` behind a `RwLock`.
When fetching the existing prices and meters, or when inserting new
ones, we acquire a write lock and hold it until the Stripe request
completes. This prevents two concurrent calls to `get_or_insert_price`
from inserting the same data twice.

Creating a new meter/price is unusual, so in practice we'll acquire a
read lock most of the time.
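
A hedged sketch of the pattern, with hypothetical names; holding the write lock across the network call is what prevents two callers from inserting the same price:

```rust
use std::collections::HashMap;
use std::sync::Arc;
use tokio::sync::RwLock;

struct StripeBillingState {
    prices: HashMap<String, String>,
}

async fn get_or_insert_price(state: Arc<RwLock<StripeBillingState>>, key: &str) -> String {
    // Fast path: most calls find the price under a read lock.
    if let Some(price) = state.read().await.prices.get(key) {
        return price.clone();
    }
    // Slow path: hold the write lock until the Stripe request completes,
    // so a concurrent caller cannot insert the same price twice.
    let mut state = state.write().await;
    if let Some(price) = state.prices.get(key) {
        return price.clone(); // another caller won the race
    }
    let price = create_price_on_stripe(key).await;
    state.prices.insert(key.to_string(), price.clone());
    price
}

async fn create_price_on_stripe(key: &str) -> String {
    format!("price_{key}") // stand-in for the real Stripe API call
}

#[tokio::main]
async fn main() {
    let state = Arc::new(RwLock::new(StripeBillingState { prices: HashMap::new() }));
    assert_eq!(get_or_insert_price(state, "pro").await, "price_pro");
}
```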

/cc @rtfeldman @maxdeviant 

Release Notes:

- N/A
2024-10-14 17:31:51 +02:00
张小白
6986f081d0 supermaven: Fix crash when editing non-ASCII text (#19153)
Closes #19051
Closes #19182


#### How to reproduce this crash:
1. Open any file and input some ASCII characters.
2. Replace these characters with `你好`.
3. Press `backspace`.
4. Crash.



https://github.com/user-attachments/assets/ea5c5340-29a5-42c8-98c5-6e60770445a4



The issue lies with the `prefix_offset` introduced in #18858. After the
buffer is modified, this value is not always valid and may no longer
fall on a `char` boundary, which results in a crash.
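
A small runnable sketch of the hazard and one defensive fix, clamping a stale offset back to the nearest `char` boundary (an illustration of the failure mode, not necessarily the exact fix that shipped):

```rust
// An offset computed before an edit can land inside a multi-byte
// character afterwards; slicing the string there panics.
fn clamp_to_char_boundary(s: &str, mut offset: usize) -> usize {
    offset = offset.min(s.len());
    while !s.is_char_boundary(offset) {
        offset -= 1; // back up to the previous boundary
    }
    offset
}

fn main() {
    let text = "你好"; // 6 bytes, boundaries at 0, 3, 6
    let stale_offset = 4; // &text[..4] would panic
    assert_eq!(clamp_to_char_boundary(text, stale_offset), 3);
}
```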



Release Notes:

- Fixed Supermaven crashing on deleting non-ASCII text
2024-10-14 18:27:15 +03:00
张小白
3ff52a816e windows: Fix opening wrong path when clicking path in the terminal view (#18726)
Closes #18550

This PR removes the prefix `\\?\`.

https://github.com/user-attachments/assets/f4f4333c-5d87-4f0f-b12c-fb2108703b6a


Release Notes:

- N/A
2024-10-14 18:23:16 +03:00
Shish
7d5fe66b54 remote: Disable ControlPersist for master ssh connection (#19193)
`ControlPersist=yes` combined with `ControlMaster=yes` silently forces
`ForkAfterAuthentication=yes` (even when the user has explicitly set it
to `no` - reported upstream in [0]) - and the latter makes the ssh
subprocess disappear, which makes us think that the connection died

(This is only an issue for people who have `ControlPersist=yes` in their
`ssh_config`, and perhaps the answer is "if that option breaks things,
don't use that option?" - but it's an option that makes sense _most_ of
the time, it's just in this edge-case of "creating an ssh connection
with -N and expecting the process to stay in the foreground" where it
_must_ be set to no)
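
A hedged illustration of forcing the option when spawning the master connection; the exact flags Zed passes may differ:

```rust
use std::process::Command;

fn master_ssh_command(host: &str) -> Command {
    let mut cmd = Command::new("ssh");
    cmd.args([
        "-N", // no remote command; just hold the connection open
        "-o", "ControlMaster=yes",
        "-o", "ControlPersist=no", // keep the process in the foreground
        host,
    ]);
    cmd
}

fn main() {
    let cmd = master_ssh_command("example-host");
    println!("{cmd:?}");
}
```

Explicitly passing `-o ControlPersist=no` on the command line takes precedence over the user's `ssh_config`, which avoids the silent `ForkAfterAuthentication` behavior described above.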

I think the alternative approach is to tell people "if you want to use
persistent connections, have a separate ~/.ssh/config entry for
servername (to ssh into) and servername-no-persist (to zed into)", which
is possible, but ugh. Kind of a messy situation >.<


Tests:

- Before: Connections to my server result in "Failed to connect: ." (The
error message is attempting to show stderr, but stderr is empty)

- After: Connections to my server work reliably


[0] https://bugzilla.mindrot.org/show_bug.cgi?id=3743


Release Notes:

- N/A
2024-10-14 15:36:31 +02:00
Piotr Osiewicz
792f583b97 Revert "chore: Bump taffy to 0.5.2 (#18729)" (#19189)
This reverts commit a99750fd35.

@huacnlee found that commit to have a bad impact on perf and triaged it
for us in
https://github.com/zed-industries/zed/pull/18729#issuecomment-2410445980
Closes #ISSUE

Release Notes:

- N/A
2024-10-14 15:19:10 +02:00
Thorsten Ball
6ec00cdb06 ssh remoting: Restore SSH projects when reopening Zed (#19188)
Release Notes:

- N/A

---------

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-14 14:56:25 +02:00
Thorsten Ball
71a878aa39 remote ssh: Fix asset embedding in cross-compilation (#19180)
This fixes the panic from the settings file not being embedded.


Release Notes:

- N/A

---------

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-14 14:13:06 +02:00
Antonio Scandurra
f2337bbed1 Redirect to checkout page when payment is required (#19179)
Previously, we were redirecting to a non-existent page.

Release Notes:

- N/A
2024-10-14 12:39:20 +02:00
Lilith Iris
fcf9e546da project: Fix content not displaying when selecting a folder in Windows (#18946)
- Closes #16998

This PR resolves issues with the /file and /diagnostics commands in the
assistant panel, which previously failed to display the contents of a
directory when searching for a folder instead of using the arrow button.

- Changed the format in `project.rs` (located at
`crates/project/src/project.rs`) to use `std::path::MAIN_SEPARATOR` for
cross-platform compatibility, which resolves errors encountered on
Windows that originally used the format `format!("{}/", ...)`.
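
A minimal sketch of the change, with an illustrative function name:

```rust
use std::path::MAIN_SEPARATOR;

// Build the "directory" form of a name. Hardcoding '/' never matched
// paths on Windows, where the separator is '\'.
fn directory_query(dir_name: &str) -> String {
    // Previously: format!("{}/", dir_name)
    format!("{}{}", dir_name, MAIN_SEPARATOR)
}

fn main() {
    // Prints "src\" on Windows and "src/" elsewhere.
    println!("{}", directory_query("src"));
}
```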


Release Notes:

- N/A
2024-10-14 12:13:53 +03:00
Xavier Lau
7dc069100d Improve macOS build guide (#19172)
- `mold` moved to `sold` a long time ago.
  And https://github.com/bluewhalesystems/sold/issues/43...
- Add a step for accepting the xcodebuild license

Signed-off-by: Xavier Lau <x@acg.box>
2024-10-14 10:20:36 +02:00
Frank Sheiness
5b207ba238 vim: Add some "z" keybindings for scrolling (#18928)
Release Notes:

- vim: Added a few "z" keybindings for scrolling
2024-10-14 10:00:37 +02:00
Ömer Sinan Ağacan
325f106c8b Add vim::Search command option for non-regex search (#19177)
Similar to e2647025ac, this adds a `regex`
option to the `vim::Search` command to allow disabling regex search.

Release Notes:

- Added `regex` option to `vim::Search` command to allow disabling regex
search by default in the keymap. Example usage:
  ```json
  {
    "context": "VimControl && !menu",
    "bindings": {
      "/": ["vim::Search", { "regex": false }],
    }
  }
  ```
2024-10-14 09:59:29 +02:00
Kirill Bulatov
ec5d6e96bb Make danger to output less false-positives (#19151) 2024-10-14 01:50:46 +03:00
Bolaji Olajide
54683ff2b9 docs: Fix typo in environment documentation (#19164)
Update incorrect spelling of Raycast in environment.md
2024-10-13 16:47:09 -04:00
Peter Tripp
cdead5760a docs: Formatter arguments, document {buffer_path} usage (#19156) 2024-10-13 10:25:08 -04:00
Kirill Bulatov
39468de8c6 Return back to history-based tabs activation on close (#19150)
Closes https://github.com/zed-industries/zed/issues/19036

Alters https://github.com/zed-industries/zed/pull/18168 and moves its
change behind a settings flag, restoring the previous behavior.

Release Notes:

- Fixed tab closing not respecting history. Use the `tabs.activate_on_close
= neighbour` setting to activate neighboring tabs instead.
2024-10-13 14:35:06 +03:00
Kirill Bulatov
6491148196 Fail on warnings during CI builds (#19149)
Forbid things like
https://github.com/zed-industries/zed/pull/19144#issuecomment-2408871788

Release Notes:

- N/A
2024-10-13 13:39:15 +03:00
Peter Tripp
0b10fd5098 cpp: Better icon support (#19146) 2024-10-13 04:09:27 -04:00
Shish
74cc90887a ci: Give names to all github actions (#19080) 2024-10-13 03:01:56 -04:00
Peter Tripp
875c0cb09f Bytes 1.7.2 merge fix (#19145) 2024-10-13 02:56:12 -04:00
Peter Tripp
aefc559f43 Improve auto-detection via shebang of TypeScript, JavaScript and Shell Script (#19114) 2024-10-13 02:35:46 -04:00
Mikayla Maki
bebe24ea77 Add remote server cross compilation (#19136)
This will allow us to compile debug builds of the remote-server for a
different architecture than the one we are developing on.

This also adds a CI step for building our remote server with minimal
dependencies.

Release Notes:

- N/A
2024-10-12 23:23:56 -07:00
renovate[bot]
f73a076a63 Update Rust crate bytes to v1.7.2 (#18656)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [bytes](https://redirect.github.com/tokio-rs/bytes) | workspace.dependencies | patch | `1.7.1` -> `1.7.2` |

---

### Release Notes

<details>
<summary>tokio-rs/bytes (bytes)</summary>

###
[`v1.7.2`](https://redirect.github.com/tokio-rs/bytes/blob/HEAD/CHANGELOG.md#172-September-17-2024)

[Compare
Source](https://redirect.github.com/tokio-rs/bytes/compare/v1.7.1...v1.7.2)

##### Fixed

- Fix default impl of `Buf::{get_int, get_int_le}` ([#732](https://redirect.github.com/tokio-rs/bytes/issues/732))

##### Documented

- Fix double spaces in comments and doc comments ([#731](https://redirect.github.com/tokio-rs/bytes/issues/731))

##### Internal changes

- Ensure BytesMut::advance reduces capacity ([#728](https://redirect.github.com/tokio-rs/bytes/issues/728))

</details>


Release Notes:

- N/A

<!--renovate-debug:eyJjcmVhdGVkSW5WZXIiOiIzOC45Ny4wIiwidXBkYXRlZEluVmVyIjoiMzguMTE1LjEiLCJ0YXJnZXRCcmFuY2giOiJtYWluIiwibGFiZWxzIjpbXX0=-->

Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-12 14:54:06 -07:00
Mikayla Maki
b2e844f2ec Fix an issue with using non-reusable body types with redirects (#19134)
Closes #19131
Closes #19039

This fixes the broken auto-updater.

I had the bright idea of using streams as the most common unit of data
transfer. Unfortunately, streams are not re-usable. So HTTP redirects
that have a stream body (like our remote server and auto update
downloads), don't redirect, as they can't reuse the stream. This PR
fixes the problem and simplifies the AsyncBody implementation now that
we're not using Isahc.
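
As a rough illustration of the failure mode (a hedged sketch; the real `AsyncBody` type is more involved): an in-memory body can be cloned and replayed for the redirected request, while a one-shot stream cannot.

```rust
use std::io::Read;

// Illustrative only; not Zed's actual AsyncBody implementation.
enum Body {
    Bytes(Vec<u8>),               // reusable: can be replayed after a redirect
    Stream(Box<dyn Read + Send>), // one-shot: consumed by the first request
}

impl Body {
    /// Following a redirect means sending the body again. Only an in-memory
    /// body can produce a second copy; a stream is already spent.
    fn try_clone(&self) -> Option<Body> {
        match self {
            Body::Bytes(bytes) => Some(Body::Bytes(bytes.clone())),
            Body::Stream(_) => None,
        }
    }
}
```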

Release Notes:

- N/A
2024-10-12 13:32:08 -07:00
Lorenzo Cinque
9e14fd915f docs: Fix missing parenthesis in the Terminal: Detect Virtual Environments section of configuring-zed.md (#19127)
Release Notes:

- N/A
2024-10-12 20:18:33 +03:00
Mikayla Maki
c85a3cc117 Switch from OpenSSL to Rustls (#19104)
This PR also includes a downgrade of our async_tungstenite version to
0.24

Release Notes:

- N/A
2024-10-11 18:18:09 -07:00
Mikayla Maki
22ac178f9d Restore HTTP client transition, but use reqwest everywhere (#19055)
Release Notes:

- N/A
2024-10-11 14:58:58 -07:00
Marshall Bowers
c709b66f35 collab: Don't record billing events if billing is not enabled (#19102)
This PR adjusts the billing logic to not write any records to
`billing_events` if:

- The user is staff, as we don't want to bill staff members
- Billing is disabled (we currently enable billing based on the presence
of the Stripe API key)
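
The guard boils down to something like this (a hedged sketch with invented names, not the actual collab code):

```rust
struct User {
    is_staff: bool,
}

// Hypothetical helper: staff are never billed, and billing is treated as
// disabled entirely when no Stripe API key is configured.
fn should_record_billing_event(user: &User, stripe_api_key: Option<&str>) -> bool {
    !user.is_staff && stripe_api_key.is_some()
}
```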

Release Notes:

- N/A
2024-10-11 17:54:10 -04:00
Peter Tripp
b739cfa73f docs: Link environment.md (#19101) 2024-10-11 17:24:56 -04:00
Peter Tripp
0fc3072362 Document extension extraction process (#19085)
Release Notes:

- N/A
2024-10-11 16:33:40 -04:00
Peter Tripp
3cbaa08d89 Fix script/linux for Linux Mint (#19096)
- Closes: https://github.com/zed-industries/zed/issues/18827

Release Notes:

- N/A
2024-10-11 16:29:35 -04:00
Richard Feldman
12c9f0f723 Test some billing events logic (#19094)
Release Notes:

- N/A

Co-authored-by: Marshall <marshall@zed.dev>
2024-10-11 16:21:21 -04:00
Marshall Bowers
f280b29859 collab: Make the StripeBilling object long-lived (#19090)
This PR makes the `StripeBilling` object long-lived so that we can make
better use of the cached data on it.

We now hold it on the `AppState` and spawn a background task to
initialize the cache on startup.
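
Roughly like this (a hedged sketch with invented names; the real `StripeBilling` and `AppState` differ):

```rust
use std::sync::Arc;

struct StripeBilling {
    // cached Stripe prices/meters would live here
}

impl StripeBilling {
    async fn initialize_cache(&self) {
        // fetch Stripe objects once at startup and cache them
    }
}

struct AppState {
    stripe_billing: Arc<StripeBilling>,
}

fn warm_billing_cache(state: &AppState) {
    let billing = state.stripe_billing.clone();
    // Warm the cache in the background so server startup isn't blocked on Stripe.
    tokio::spawn(async move { billing.initialize_cache().await });
}
```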

Release Notes:

- N/A

Co-authored-by: Richard <richard@zed.dev>
2024-10-11 15:15:08 -04:00
Kirill Bulatov
550064f80f Fix ~ expansion in ssh projects' terminals (#19078)
When a remote SSH project path started with `~`, Zed would fail to cd
into that project's directory when opening a new terminal.
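
The fix amounts to expanding the leading `~` against the remote `$HOME` before changing directories; a minimal sketch (illustrative, not the actual implementation):

```rust
use std::path::PathBuf;

// Hypothetical helper: resolve a `~`-prefixed project path against the
// remote home directory before running `cd`.
fn expand_tilde(path: &str, home: &str) -> PathBuf {
    if path == "~" {
        PathBuf::from(home)
    } else if let Some(rest) = path.strip_prefix("~/") {
        PathBuf::from(home).join(rest)
    } else {
        PathBuf::from(path)
    }
}
```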

Release Notes:

- N/A

---------

Co-authored-by: Thorsten Ball <mrnugget@gmail.com>
2024-10-11 21:53:37 +03:00
Marshall Bowers
f33b8abc72 collab: Sort LLM database ID types (#19083)
This PR sorts the order of the LLM database ID type declarations.

Release Notes:

- N/A
2024-10-11 13:57:48 -04:00
Marshall Bowers
22ea7cef7a collab: Add usage-based billing for LLM interactions (#19081)
This PR adds usage-based billing for LLM interactions in the Assistant.

Release Notes:

- N/A

---------

Co-authored-by: Antonio Scandurra <me@as-cii.com>
Co-authored-by: Antonio <antonio@zed.dev>
Co-authored-by: Richard <richard@zed.dev>
Co-authored-by: Richard Feldman <oss@rtfeldman.com>
2024-10-11 13:36:54 -04:00
Shish
f1c45d988e collab: Remove dependency on X11 (#19079)
collab: Remove dependency on X11

I'm not sure if this is the best solution (perhaps pulling
`LanguageName` into a separate `language_types` crate would be
better...?) - but it massively reduces build time / dependencies / size
and means that the collab server no longer requires X11 libraries to be
installed.

tl;dr: `telemetry_events` requires the `language` crate, and the
language crate requires a whole ton of extra stuff. Since
telemetry_events only uses `language` for a single type definition
(`LanguageName`, aka `String`), we can cut all of these out by using the
base `String` type (This doesn't seem too terrible, given that all other
telemetry fields are using basic datatypes like String as opposed to
more strongly-typed variants).
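
In effect the type swap is just this (a simplified sketch; the actual struct and field names in `telemetry_events` may differ):

```rust
// Before: Option<LanguageName> forced telemetry_events to depend on the
// entire `language` crate (and, transitively, on gpui and X11).
// After: a plain String carries the same information with no heavy deps.
pub struct EditorEvent {
    pub language: Option<String>,
}
```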


FYI the dependency tree for "why does collab need X11 libraries??" looks
like this:

```
collab
 \- telemetry_events
     \- language
         |- gpui
         |- fuzzy
         |   \- gpui
         |- git
         |   \- gpui
         |- lsp
         |   |- gpui
         |   \- release_channel
         |       \- gpui
         |- settings
         |   |- fs
         |   |   \- gpui
         |   \- gpui
         |- task
         |   \- gpui
         \- theme
             \- gpui
```

Release Notes:

- N/A
2024-10-11 13:28:34 -04:00
Marshall Bowers
84b61c8b1a assistant: Add support for displaying billing-related errors (#19082)
This PR adds support to the assistant for displaying billing-related
errors.

Pulling this out of #19081 to make it easier to cherry-pick.

Release Notes:

- N/A

Co-authored-by: Antonio <antonio@zed.dev>
Co-authored-by: Richard <richard@zed.dev>
2024-10-11 13:22:45 -04:00
Shish
5cf0217549 terminal: Improve default locale handling (#18967)
terminal: Improve default locale handling

* Use `LANG` instead of `LC_ALL` (`LC_ALL` is the highest priority which
will override any other end-user settings; when that isn't set things
fall back to separate `LC_*` variables; and when those aren't set things
fall back to `LANG`). [0]
* Only set `LANG` for our child if necessary (if it already exists in
the parent, then the child will inherit that, no need for us to do
anything)

[0]
https://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1_chap08.html#tag_08_02
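
A minimal sketch of the resulting behavior (illustrative, not the actual terminal-spawning code):

```rust
use std::env;
use std::process::{Child, Command};

// Only provide a LANG fallback when the parent environment defines none,
// and never touch LC_ALL, so user-level LC_* overrides keep working.
fn spawn_shell(shell: &str) -> std::io::Result<Child> {
    let mut cmd = Command::new(shell);
    if env::var_os("LANG").is_none() {
        cmd.env("LANG", "en_US.UTF-8");
    }
    cmd.spawn()
}
```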

Tested cases:

- `unset LANG ; cargo run`: locale inside zed's terminal is set to
`en_US.UTF-8`
- `export LANG=en_GB.UTF-8 ; cargo run`: locale inside zed's terminal is
set to `en_GB.UTF-8`

Release Notes:

- Use the system locale in the terminal instead of forcing `en_US.UTF-8`
2024-10-11 18:09:24 +02:00
Thorsten Ball
c21f26c419 ssh remote: Stream stderr from server via proxy to client (#19073)
Release Notes:

- N/A

---------

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-11 17:11:20 +02:00
336 changed files with 12437 additions and 9277 deletions

View File

@@ -7,9 +7,3 @@ runs:
- name: cargo fmt
shell: bash -euxo pipefail {0}
run: cargo fmt --all -- --check
- name: Find modified migrations
shell: bash -euxo pipefail {0}
run: |
export SQUAWK_GITHUB_TOKEN=${{ github.token }}
. ./script/squawk

View File

@@ -2,14 +2,4 @@ Closes #ISSUE
Release Notes:
- Added/Fixed/Improved ...
Optionally, include screenshots / media showcasing your addition that can be included in the release notes.
### Or...
Closes #ISSUE
Release Notes:
- N/A
- N/A *or* Added/Fixed/Improved ...

View File

@@ -14,6 +14,7 @@ on:
- "**"
paths-ignore:
- "docs/**"
- ".github/workflows/community_*"
concurrency:
# Allow only one workflow per any non-`main` branch.
@@ -26,9 +27,10 @@ env:
RUST_BACKTRACE: 1
jobs:
style:
migration_checks:
name: Check Postgres and Protobuf migrations, mergability
if: github.repository_owner == 'zed-industries'
timeout-minutes: 60
name: Check formatting and spelling
runs-on:
- self-hosted
- test
@@ -37,25 +39,16 @@ jobs:
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
with:
clean: false
fetch-depth: 0
fetch-depth: 0 # fetch full history
- name: Remove untracked files
run: git clean -df
- name: Check spelling
run: script/check-spelling
- name: Run style checks
uses: ./.github/actions/check_style
- name: Check unused dependencies
uses: bnjbvr/cargo-machete@main
- name: Check licenses are present
run: script/check-licenses
- name: Check license generation
run: script/generate-licenses /tmp/zed_licenses_output
- name: Find modified migrations
shell: bash -euxo pipefail {0}
run: |
export SQUAWK_GITHUB_TOKEN=${{ github.token }}
. ./script/squawk
- name: Ensure fresh merge
shell: bash -euxo pipefail {0}
@@ -77,6 +70,24 @@ jobs:
input: "crates/proto/proto/"
against: "https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/"
style:
timeout-minutes: 60
name: Check formatting and spelling
if: github.repository_owner == 'zed-industries'
runs-on:
- buildjet-8vcpu-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4
- name: Run style checks
uses: ./.github/actions/check_style
- name: Check for typos
uses: crate-ci/typos@v1.24.6
with:
config: ./typos.toml
macos_tests:
timeout-minutes: 60
name: (macOS) Run Clippy and tests
@@ -92,14 +103,25 @@ jobs:
- name: cargo clippy
run: ./script/clippy
- name: Check unused dependencies
uses: bnjbvr/cargo-machete@main
- name: Check licenses
run: |
script/check-licenses
script/generate-licenses /tmp/zed_licenses_output
- name: Run tests
uses: ./.github/actions/run_tests
- name: Build collab
run: cargo build -p collab
run: RUSTFLAGS="-D warnings" cargo build -p collab
- name: Build other binaries and features
run: cargo build --workspace --bins --all-features; cargo check -p gpui --features "macos-blade"
run: |
RUSTFLAGS="-D warnings" cargo build --workspace --bins --all-features
cargo check -p gpui --features "macos-blade"
RUSTFLAGS="-D warnings" cargo build -p remote_server
linux_tests:
timeout-minutes: 60
@@ -116,7 +138,7 @@ jobs:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@23bce251a8cd2ffc3c1075eaa2367cf899916d84 # v2
uses: swatinem/rust-cache@82a92a6e8fbeee089604da2575dc567ae9ddeaab # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache-provider: "buildjet"
@@ -131,7 +153,33 @@ jobs:
uses: ./.github/actions/run_tests
- name: Build Zed
run: cargo build -p zed
run: RUSTFLAGS="-D warnings" cargo build -p zed
build_remote_server:
timeout-minutes: 60
name: (Linux) Build Remote Server
runs-on:
- buildjet-16vcpu-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> $GITHUB_PATH
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@82a92a6e8fbeee089604da2575dc567ae9ddeaab # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache-provider: "buildjet"
- name: Install Clang & Mold
run: ./script/remote-server && ./script/install-mold 2.34.0
- name: Build Remote Server
run: RUSTFLAGS="-D warnings" cargo build -p remote_server
# todo(windows): Actually run the tests
windows_tests:
@@ -145,7 +193,7 @@ jobs:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@23bce251a8cd2ffc3c1075eaa2367cf899916d84 # v2
uses: swatinem/rust-cache@82a92a6e8fbeee089604da2575dc567ae9ddeaab # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache-provider: "github"
@@ -155,7 +203,7 @@ jobs:
run: cargo xtask clippy
- name: Build Zed
run: cargo build -p zed
run: $env:RUSTFLAGS="-D warnings"; cargo build
bundle-mac:
timeout-minutes: 60
@@ -219,20 +267,20 @@ jobs:
mv target/x86_64-apple-darwin/release/Zed.dmg target/x86_64-apple-darwin/release/Zed-x86_64.dmg
- name: Upload app bundle (universal) to workflow run if main branch or specific label
uses: actions/upload-artifact@604373da6381bf24206979c74d06a550515601b9 # v4
uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}.dmg
path: target/release/Zed.dmg
- name: Upload app bundle (aarch64) to workflow run if main branch or specific label
uses: actions/upload-artifact@604373da6381bf24206979c74d06a550515601b9 # v4
uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed-aarch64.dmg
- name: Upload app bundle (x86_64) to workflow run if main branch or specific label
uses: actions/upload-artifact@604373da6381bf24206979c74d06a550515601b9 # v4
uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
@@ -283,7 +331,7 @@ jobs:
run: script/bundle-linux
- name: Upload Linux bundle to workflow run if main branch or specific label
uses: actions/upload-artifact@604373da6381bf24206979c74d06a550515601b9 # v4
uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
@@ -330,7 +378,7 @@ jobs:
run: script/bundle-linux
- name: Upload Linux bundle to workflow run if main branch or specific label
uses: actions/upload-artifact@604373da6381bf24206979c74d06a550515601b9 # v4
uses: actions/upload-artifact@b4b15b8c7c6ac21ea08fcf65892d2ee8f75cf882 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz

View File

@@ -17,7 +17,7 @@ jobs:
We're working to clean up our issue tracker by closing older issues that might not be relevant anymore. Are you able to reproduce this issue in the latest version of Zed? If so, please let us know by commenting on this issue and we will keep it open; otherwise, we'll close it in 7 days. Feel free to open a new issue if you're seeing this message after the issue has been closed.
Thanks for your help!
close-issue-message: "This issue was closed due to inactivity. If you're still experiencing this problem, feel free to ping a Zed team member to reopen this issue or open a new one."
close-issue-message: "This issue was closed due to inactivity. If you're still experiencing this problem, please open a new issue with a link to this issue."
# We will increase `days-before-stale` to 365 on or after Jan 24th,
# 2024. This date marks one year since migrating issues from
# 'community' to 'zed' repository. The migration added activity to all

View File

@@ -1,3 +1,5 @@
name: Release Actions
on:
release:
types: [published]

View File

@@ -1,3 +1,5 @@
name: Update All Top Ranking Issues
on:
schedule:
- cron: "0 */12 * * *"
@@ -10,14 +12,14 @@ jobs:
steps:
- uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
- name: Set up uv
uses: astral-sh/setup-uv@v3
uses: astral-sh/setup-uv@f3bcaebff5eace81a1c062af9f9011aae482ca9d # v3
with:
version: "latest"
enable-cache: true
cache-dependency-glob: "script/update_top_ranking_issues/pyproject.toml"
- name: Install Python 3.12
run: uv python install 3.12
- name: Install Python 3.13
run: uv python install 3.13
- name: Install dependencies
run: uv sync --project script/update_top_ranking_issues -p 3.12
run: uv sync --project script/update_top_ranking_issues -p 3.13
- name: Run script
run: uv run --project script/update_top_ranking_issues script/update_top_ranking_issues/main.py --github-token ${{ secrets.GITHUB_TOKEN }} --issue-reference-number 5393

View File

@@ -1,3 +1,5 @@
name: Update Weekly Top Ranking Issues
on:
schedule:
- cron: "0 15 * * *"
@@ -10,14 +12,14 @@ jobs:
steps:
- uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
- name: Set up uv
uses: astral-sh/setup-uv@v3
uses: astral-sh/setup-uv@f3bcaebff5eace81a1c062af9f9011aae482ca9d # v3
with:
version: "latest"
enable-cache: true
cache-dependency-glob: "script/update_top_ranking_issues/pyproject.toml"
- name: Install Python 3.12
run: uv python install 3.12
- name: Install Python 3.13
run: uv python install 3.13
- name: Install dependencies
run: uv sync --project script/update_top_ranking_issues -p 3.12
run: uv sync --project script/update_top_ranking_issues -p 3.13
- name: Run script
run: uv run --project script/update_top_ranking_issues script/update_top_ranking_issues/main.py --github-token ${{ secrets.GITHUB_TOKEN }} --issue-reference-number 6952 --query-day-interval 7

View File

@@ -8,6 +8,7 @@ on:
jobs:
deploy-docs:
name: Deploy Docs
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
steps:

View File

@@ -11,6 +11,7 @@ on:
jobs:
check_formatting:
name: "Check formatting"
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
steps:
@@ -29,5 +30,8 @@ jobs:
false
}
- name: Check spelling
run: script/check-spelling docs/
- name: Check for Typos with Typos-CLI
uses: crate-ci/typos@v1.24.6
with:
config: ./typos.toml
files: ./docs/

View File

@@ -21,7 +21,7 @@ jobs:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@23bce251a8cd2ffc3c1075eaa2367cf899916d84 # v2
uses: swatinem/rust-cache@82a92a6e8fbeee089604da2575dc567ae9ddeaab # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
cache-provider: "github"

Cargo.lock (generated, 668 lines): file diff suppressed because it is too large.

View File

@@ -52,7 +52,6 @@ members = [
"crates/indexed_docs",
"crates/inline_completion_button",
"crates/install_cli",
"crates/isahc_http_client",
"crates/journal",
"crates/language",
"crates/language_model",
@@ -88,6 +87,7 @@ members = [
"crates/remote",
"crates/remote_server",
"crates/repl",
"crates/reqwest_client",
"crates/rich_text",
"crates/rope",
"crates/rpc",
@@ -122,6 +122,7 @@ members = [
"crates/ui",
"crates/ui_input",
"crates/ui_macros",
"crates/reqwest_client",
"crates/util",
"crates/vcs_menu",
"crates/vim",
@@ -155,15 +156,12 @@ members = [
"extensions/proto",
"extensions/purescript",
"extensions/ruff",
"extensions/ruby",
"extensions/slash-commands-example",
"extensions/snippets",
"extensions/svelte",
"extensions/terraform",
"extensions/test-extension",
"extensions/toml",
"extensions/uiua",
"extensions/vue",
"extensions/zig",
#
@@ -219,7 +217,7 @@ git = { path = "crates/git" }
git_hosting_providers = { path = "crates/git_hosting_providers" }
go_to_line = { path = "crates/go_to_line" }
google_ai = { path = "crates/google_ai" }
gpui = { path = "crates/gpui" }
gpui = { path = "crates/gpui", default-features = false, features = ["http_client"]}
gpui_macros = { path = "crates/gpui_macros" }
headless = { path = "crates/headless" }
html_to_markdown = { path = "crates/html_to_markdown" }
@@ -228,7 +226,6 @@ image_viewer = { path = "crates/image_viewer" }
indexed_docs = { path = "crates/indexed_docs" }
inline_completion_button = { path = "crates/inline_completion_button" }
install_cli = { path = "crates/install_cli" }
isahc_http_client = { path = "crates/isahc_http_client" }
journal = { path = "crates/journal" }
language = { path = "crates/language" }
language_model = { path = "crates/language_model" }
@@ -265,6 +262,7 @@ release_channel = { path = "crates/release_channel" }
remote = { path = "crates/remote" }
remote_server = { path = "crates/remote_server" }
repl = { path = "crates/repl" }
reqwest_client = { path = "crates/reqwest_client" }
rich_text = { path = "crates/rich_text" }
rope = { path = "crates/rope" }
rpc = { path = "crates/rpc" }
@@ -326,7 +324,7 @@ async-pipe = { git = "https://github.com/zed-industries/async-pipe-rs", rev = "8
async-recursion = "1.0.0"
async-tar = "0.5.0"
async-trait = "0.1"
async-tungstenite = "0.23"
async-tungstenite = "0.24"
async-watch = "0.3.1"
async_zip = { version = "0.0.17", features = ["deflate", "deflate64"] }
base64 = "0.22"
@@ -335,6 +333,7 @@ blade-graphics = { git = "https://github.com/kvark/blade", rev = "e142a3a5e678eb
blade-macros = { git = "https://github.com/kvark/blade", rev = "e142a3a5e678eb6a13e642ad8401b1f3aa38e969" }
blade-util = { git = "https://github.com/kvark/blade", rev = "e142a3a5e678eb6a13e642ad8401b1f3aa38e969" }
blake3 = "1.5.3"
bytes = "1.0"
cargo_metadata = "0.18"
cargo_toml = "0.20"
chrono = { version = "0.4", features = ["serde"] }
@@ -348,6 +347,7 @@ ctor = "0.2.6"
dashmap = "6.0"
derive_more = "0.99.17"
dirs = "4.0"
ec4rs = "1.1"
emojis = "0.6.1"
env_logger = "0.11"
exec = "0.3.1"
@@ -366,10 +366,6 @@ ignore = "0.4.22"
image = "0.25.1"
indexmap = { version = "1.6.2", features = ["serde"] }
indoc = "2"
# We explicitly disable http2 support in isahc.
isahc = { version = "1.7.2", default-features = false, features = [
"text-decoding",
] }
itertools = "0.13.0"
jsonwebtoken = "9.3"
libc = "0.2"
@@ -394,6 +390,14 @@ pulldown-cmark = { version = "0.12.0", default-features = false }
rand = "0.8.5"
regex = "1.5"
repair_json = "0.1.0"
reqwest = { git = "https://github.com/zed-industries/reqwest.git", rev = "fd110f6998da16bbca97b6dddda9be7827c50e29", default-features = false, features = [
"charset",
"http2",
"macos-system-configuration",
"rustls-tls-native-roots",
"socks",
"stream",
] }
rsa = "0.9.6"
runtimelib = { version = "0.15", default-features = false, features = [
"async-dispatcher-runtime",
@@ -438,7 +442,7 @@ time = { version = "0.3", features = [
] }
tiny_http = "0.8"
toml = "0.8"
tokio = { version = "1", features = ["full"] }
tokio = { version = "1" }
tower-http = "0.4.4"
tree-sitter = { version = "0.23", features = ["wasm"] }
tree-sitter-bash = "0.23"
@@ -451,6 +455,7 @@ tree-sitter-go = "0.23"
tree-sitter-go-mod = { git = "https://github.com/zed-industries/tree-sitter-go-mod", rev = "a9aea5e358cde4d0f8ff20b7bc4fa311e359c7ca", package = "tree-sitter-gomod" }
tree-sitter-gowork = { git = "https://github.com/zed-industries/tree-sitter-go-work", rev = "acb0617bf7f4fda02c6217676cc64acb89536dc7" }
tree-sitter-heex = { git = "https://github.com/zed-industries/tree-sitter-heex", rev = "1dd45142fbb05562e35b2040c6129c9bca346592" }
tree-sitter-diff = "0.1.0"
tree-sitter-html = "0.20"
tree-sitter-jsdoc = "0.23"
tree-sitter-json = "0.23"
@@ -478,9 +483,11 @@ wasmtime = { version = "24", default-features = false, features = [
wasmtime-wasi = "24"
which = "6.0.0"
wit-component = "0.201"
zstd = "0.11"
[workspace.dependencies.async-stripe]
version = "0.39"
git = "https://github.com/zed-industries/async-stripe"
rev = "3672dd4efb7181aa597bf580bf5a2f5d23db6735"
default-features = false
features = [
"runtime-tokio-hyper-rustls",

Cross.toml (new file, 2 lines)
View File

@@ -0,0 +1,2 @@
[build]
dockerfile = "Dockerfile-cross"

View File

@@ -13,30 +13,9 @@ ARG GITHUB_SHA
ENV GITHUB_SHA=$GITHUB_SHA
# At some point in the past 3 weeks, additional dependencies on `xkbcommon` and
# `xkbcommon-x11` were introduced into collab.
#
# A `git bisect` points to this commit as being the culprit: `b8e6098f60e5dabe98fe8281f993858dacc04a55`.
#
# Now when we try to build collab for the Docker image, it fails with the following
# error:
#
# ```
# 985.3 = note: /usr/bin/ld: cannot find -lxkbcommon: No such file or directory
# 985.3 /usr/bin/ld: cannot find -lxkbcommon-x11: No such file or directory
# 985.3 collect2: error: ld returned 1 exit status
# ```
#
# The last successful deploys were at:
# - Staging: `4f408ec65a3867278322a189b4eb20f1ab51f508`
# - Production: `fc4c533d0a8c489e5636a4249d2b52a80039fbd7`
#
# Also add `cmake`, since we need it to build `wasmtime`.
#
# Installing these as a temporary workaround, but I think ideally we'd want to figure
# out what caused them to be included in the first place.
RUN apt-get update; \
apt-get install -y --no-install-recommends libxkbcommon-dev libxkbcommon-x11-dev cmake
apt-get install -y --no-install-recommends cmake
RUN --mount=type=cache,target=./script/node_modules \
--mount=type=cache,target=/usr/local/cargo/registry \

Dockerfile-cross (new file, 17 lines)
View File

@@ -0,0 +1,17 @@
# syntax=docker/dockerfile:1
ARG CROSS_BASE_IMAGE
FROM ${CROSS_BASE_IMAGE}
WORKDIR /app
ARG TZ=Etc/UTC \
LANG=C.UTF-8 \
LC_ALL=C.UTF-8 \
DEBIAN_FRONTEND=noninteractive
ENV CARGO_TERM_COLOR=always
COPY script/install-mold script/
RUN ./script/install-mold "2.34.0"
COPY script/remote-server script/
RUN ./script/remote-server
COPY . .

View File

@@ -0,0 +1,16 @@
.git
.github
**/.gitignore
**/.gitkeep
.gitattributes
.mailmap
**/target
zed.xcworkspace
.DS_Store
compose.yml
plugins/bin
script/node_modules
styles/node_modules
crates/collab/static/styles.css
vendor/bin
assets/themes/

assets/icons/diff.svg (new file, 1 line)
View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-diff"><path d="M12 3v14"/><path d="M5 10h14"/><path d="M5 21h14"/></svg>


View File

@@ -20,6 +20,7 @@
"bashrc": "terminal",
"bmp": "image",
"c": "c",
"c++": "cpp",
"cc": "cpp",
"cjs": "javascript",
"coffee": "coffeescript",
@@ -27,6 +28,7 @@
"cpp": "cpp",
"css": "css",
"csv": "storage",
"cxx": "cpp",
"cts": "typescript",
"dart": "dart",
"dat": "storage",
@@ -66,11 +68,13 @@
"heex": "elixir",
"heic": "image",
"heif": "image",
"hh": "cpp",
"hpp": "cpp",
"hrl": "erlang",
"hs": "haskell",
"htm": "template",
"html": "template",
"hxx": "cpp",
"ib": "storage",
"ico": "image",
"ini": "settings",
@@ -124,6 +128,7 @@
"php": "php",
"plist": "template",
"png": "image",
"postcss": "css",
"ppt": "document",
"pptx": "document",
"prettierignore": "prettier",

View File

@@ -395,6 +395,7 @@
// Change the default action on `menu::Confirm` by setting the parameter
// "alt-cmd-o": ["projects::OpenRecent", {"create_new_window": true }],
"alt-cmd-o": "projects::OpenRecent",
"ctrl-cmd-o": "projects::OpenRemote",
"alt-cmd-b": "branches::OpenRecent",
"ctrl-~": "workspace::NewTerminal",
"cmd-s": "workspace::Save",

View File

@@ -34,7 +34,7 @@
"cmd-]": "pane::GoForward",
"alt-f7": "editor::FindAllReferences",
"cmd-alt-f7": "editor::FindAllReferences",
"cmd-b": "editor::GoToDefinition",
"cmd-b": "editor::GoToDefinition", // Conflicts with workspace::ToggleLeftDock
"cmd-alt-b": "editor::GoToDefinitionSplit",
"cmd-shift-b": "editor::GoToTypeDefinition",
"cmd-alt-shift-b": "editor::GoToTypeDefinitionSplit",
@@ -64,7 +64,8 @@
"cmd-shift-o": "file_finder::Toggle",
"cmd-shift-a": "command_palette::Toggle",
"shift shift": "command_palette::Toggle",
"cmd-alt-o": "project_symbols::Toggle",
"cmd-alt-o": "project_symbols::Toggle", // JetBrains: Go to Symbol
"cmd-o": "project_symbols::Toggle", // JetBrains: Go to Class
"cmd-1": "workspace::ToggleLeftDock",
"cmd-6": "diagnostics::Deploy"
}

View File

@@ -128,6 +128,10 @@
"shift-m": "vim::WindowMiddle",
"shift-l": "vim::WindowBottom",
// z commands
"z enter": ["workspace::SendKeystrokes", "z t ^"],
"z -": ["workspace::SendKeystrokes", "z b ^"],
"z ^": ["workspace::SendKeystrokes", "shift-h k z b ^"],
"z +": ["workspace::SendKeystrokes", "shift-l j z t ^"],
"z t": "editor::ScrollCursorTop",
"z z": "editor::ScrollCursorCenter",
"z .": ["workspace::SendKeystrokes", "z z ^"],
@@ -252,6 +256,7 @@
"@": ["vim::PushOperator", "ReplayRegister"],
"ctrl-pagedown": "pane::ActivateNextItem",
"ctrl-pageup": "pane::ActivatePrevItem",
"insert": "vim::InsertBefore",
// tree-sitter related commands
"[ x": "editor::SelectLargerSyntaxNode",
"] x": "editor::SelectSmallerSyntaxNode",
@@ -334,7 +339,8 @@
"ctrl-t": "vim::Indent",
"ctrl-d": "vim::Outdent",
"ctrl-k": ["vim::PushOperator", { "Digraph": {} }],
"ctrl-r": ["vim::PushOperator", "Register"]
"ctrl-r": ["vim::PushOperator", "Register"],
"insert": "vim::ToggleReplace"
}
},
{
@@ -353,7 +359,8 @@
"ctrl-k": ["vim::PushOperator", { "Digraph": {} }],
"backspace": "vim::UndoReplace",
"tab": "vim::Tab",
"enter": "vim::Enter"
"enter": "vim::Enter",
"insert": "vim::InsertBefore"
}
},
{

View File

@@ -1,85 +1,33 @@
<task_description>
# Code Change Workflow
The user of a code editor wants to make a change to their codebase.
You must describe the change using the following XML structure:
Your task is to guide the user through code changes using a series of steps. Each step should describe a high-level change, which can consist of multiple edits to distinct locations in the codebase.
## Output Example
Provide output as XML, with the following format:
<step>
Update the Person struct to store an age
```rust
struct Person {
// existing fields...
age: u8,
height: f32,
// existing fields...
}
impl Person {
fn age(&self) -> u8 {
self.age
}
}
```
<edit>
<path>src/person.rs</path>
<operation>insert_before</operation>
<search>height: f32,</search>
<description>Add the age field</description>
</edit>
<edit>
<path>src/person.rs</path>
<operation>insert_after</operation>
<search>impl Person {</search>
<description>Add the age getter</description>
</edit>
</step>
## Output Format
First, each `<step>` must contain a written description of the change that should be made. The description should begin with a high-level overview, and can contain markdown code blocks as well. The description should be self-contained and actionable.
After the description, each `<step>` must contain one or more `<edit>` tags, each of which refer to a specific range in a source file. Each `<edit>` tag must contain the following child tags:
### `<path>` (required)
This tag contains the path to the file that will be changed. It can be an existing path, or a path that should be created.
### `<search>` (optional)
This tag contains a search string to locate in the source file, e.g. `pub fn baz() {`. If not provided, the new content will be inserted at the top of the file. Make sure to produce a string that exists in the source file and that isn't ambiguous. When there's ambiguity, add more lines to the search to eliminate it.
### `<description>` (required)
This tag contains a single-line description of the edit that should be made at the given location.
### `<operation>` (required)
This tag indicates what type of change should be made, relative to the given location. It can be one of the following:
- `update`: Rewrites the specified string entirely based on the given description.
- `create`: Creates a new file with the given path based on the provided description.
- `insert_before`: Inserts new text based on the given description before the specified search string.
- `insert_after`: Inserts new text based on the given description after the specified search string.
- `delete`: Deletes the specified string from the containing file.
- <patch> - A group of related code changes.
Child tags:
- <title> (required) - A high-level description of the changes. This should be as short
as possible, possibly using common abbreviations.
- <edit> (1 or more) - An edit to make at a particular range within a file.
Includes the following child tags:
- <path> (required) - The path to the file that will be changed.
- <description> (optional) - An arbitrarily-long comment that describes the purpose
of this edit.
- <old_text> (optional) - An excerpt from the file's current contents that uniquely
identifies a range within the file where the edit should occur. If this tag is not
specified, then the entire file will be used as the range.
- <new_text> (required) - The new text to insert into the file.
- <operation> (required) - The type of change that should occur at the given range
of the file. Must be one of the following values:
- `update`: Replaces the entire range with the new text.
- `insert_before`: Inserts the new text before the range.
- `insert_after`: Inserts new text after the range.
- `create`: Creates a new file with the given path and the new text.
- `delete`: Deletes the specified range from the file.
<guidelines>
- There's no need to describe *what* to do, just *where* to do it.
- Only reference locations that actually exist (unless you're creating a file).
- If creating a file, assume any subsequent updates are included at the time of creation.
- Don't create and then update a file. Always create new files in one shot.
- Prefer multiple edits to smaller regions, as opposed to one big edit to a larger region.
- Don't produce edits that intersect each other. In that case, merge them into a bigger edit.
- Never nest an edit with another edit. Never include CDATA. All edits are leaf nodes.
- Descriptions are required for all edits except delete.
- When generating multiple edits, ensure the descriptions are specific to each individual operation.
- Avoid referring to the search string in the description. Focus on the change to be made, not the location where it's made. That's implicit with the `search` string you provide.
- Don't generate multiple edits at the same location. Instead, combine them together in a single edit with a succinct combined description.
- Never provide multiple edits whose ranges intersect each other. Instead, merge them into one edit.
- Prefer multiple edits to smaller, disjoint ranges, rather than one edit to a larger range.
- There's no need to escape angle brackets within XML tags.
- Always ensure imports are added if you're referencing symbols that are not in scope.
</guidelines>
@@ -124,189 +72,137 @@ Update all shapes to store their origin as an (x, y) tuple and implement Display
<message role="assistant">
We'll need to update both the rectangle and circle modules.
<step>
Add origin fields to both shape types.
```rust
struct Rectangle {
// existing fields ...
origin: (f64, f64),
}
```
```rust
struct Circle {
// existing fields ...
origin: (f64, f64),
}
```
<patch>
<title>Add origins and display impls to shapes</title>
<edit>
<path>src/shapes/rectangle.rs</path>
<description>Add the origin field to Rectangle struct</description>
<operation>insert_after</operation>
<old_text>
pub struct Rectangle {
</old_text>
<new_text>
origin: (f64, f64),
</new_text>
</edit>
<edit>
<path>src/shapes/rectangle.rs</path>
<operation>insert_before</operation>
<search>
width: f64,
height: f64,
</search>
<description>Add the origin field to Rectangle</description>
<description>Update the Rectangle's new function to take an origin parameter</description>
<operation>update</operation>
<old_text>
fn new(width: f64, height: f64) -> Self {
Rectangle { width, height }
}
</old_text>
<new_text>
fn new(origin: (f64, f64), width: f64, height: f64) -> Self {
Rectangle { origin, width, height }
}
</new_text>
</edit>
<edit>
<path>src/shapes/circle.rs</path>
<operation>insert_before</operation>
<search>
<description>Add the origin field to Circle struct</description>
<operation>insert_after</operation>
<old_text>
pub struct Circle {
radius: f64,
</search>
<description>Add the origin field to Circle</description>
</edit>
<step>
Update both shape's constructors to take an origin.
<edit>
<path>src/shapes/rectangle.rs</path>
<operation>update</operation>
<search>
fn new(width: f64, height: f64) -> Self {
Rectangle { width, height }
}
</search>
<description>Update the Rectangle new function to take an origin</description>
</old_text>
<new_text>
origin: (f64, f64),
</new_text>
</edit>
<edit>
<path>src/shapes/circle.rs</path>
<description>Update the Circle's new function to take an origin parameter</description>
<operation>update</operation>
<search>
fn new(radius: f64) -> Self {
Circle { radius }
}
</search>
<description>Update the Circle new function to take an origin</description>
<old_text>
fn new(radius: f64) -> Self {
Circle { radius }
}
</old_text>
<new_text>
fn new(origin: (f64, f64), radius: f64) -> Self {
Circle { origin, radius }
}
</new_text>
</edit>
</step>
<step>
Implement Display for both shapes
<edit>
<path>src/shapes/rectangle.rs</path>
<description>Add an import for the std::fmt module</description>
<operation>insert_before</operation>
<search>
<old_text>
struct Rectangle {
</search>
<description>Add an import for the `std::fmt` module</description>
</old_text>
<new_text>
use std::fmt;
</new_text>
</edit>
<edit>
<path>src/shapes/rectangle.rs</path>
<description>Add a Display implementation for Rectangle</description>
<operation>insert_after</operation>
<search>
<old_text>
Rectangle { width, height }
}
}
</search>
<description>Add a Display implementation for Rectangle</description>
</old_text>
<new_text>
impl fmt::Display for Rectangle {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_struct("Rectangle")
.field("origin", &self.origin)
.field("width", &self.width)
.field("height", &self.height)
.finish()
}
}
</new_text>
</edit>
<edit>
<path>src/shapes/circle.rs</path>
<operation>insert_before</operation>
<search>
struct Circle {
</search>
<description>Add an import for the `std::fmt` module</description>
<operation>insert_before</operation>
<old_text>
struct Circle {
</old_text>
<new_text>
use std::fmt;
</new_text>
</edit>
<edit>
<path>src/shapes/circle.rs</path>
<description>Add a Display implementation for Circle</description>
<operation>insert_after</operation>
<search>
<old_text>
Circle { radius }
}
}
</search>
<description>Add a Display implementation for Circle</description>
</old_text>
<new_text>
impl fmt::Display for Circle {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_struct("Circle")
.field("origin", &self.origin)
.field("radius", &self.radius)
.finish()
}
}
</new_text>
</edit>
</step>
</patch>
</message>
</example>
<example number=2>
<message role="user">
```rs src/user.rs
struct User {
pub name: String,
age: u32,
email: String,
}
impl User {
fn new(name: String, age: u32, email: String) -> Self {
User { name, age, email }
}
pub fn print_info(&self) {
todo!()
}
}
```
Let's print all the user information and delete the email field.
</message>
<message role="assistant">
<step>
Update the 'print_info' method to print user information
```rust
impl User {
// ... other methods ...
pub fn print_info(&self) {
println!("Name: {name}, Age: {age}", name = self.name, age = self.age);
}
}
```
<edit>
<path>src/user.rs</path>
<operation>update</operation>
<search>
pub fn print_info(&self) {
todo!()
}
</search>
<description>Print all the user information</description>
</edit>
</step>
<step>
Remove the 'email' field from the User struct
<edit>
<path>src/user.rs</path>
<operation>delete</operation>
<search>
email: String,
</search>
</edit>
<edit>
<path>src/user.rs</path>
<operation>update</operation>
<symbol>
fn new(name: String, age: u32, email: String) -> Self {
User { name, age, email }
}
</symbol>
<description>Remove email parameter from new method</description>
</edit>
</step>
</message>
</example>
You should think step by step. When possible, produce smaller, coherent logical steps as opposed to one big step that combines lots of heterogeneous edits.
</task_description>

View File

@@ -1,496 +0,0 @@
<overview>
Your task is to map a step from a workflow to locations in source code where code needs to be changed to fulfill that step.
Given a workflow containing background context plus a series of <step> tags, you will resolve *one* of these step tags to resolve to one or more locations in the code.
With each location, you will produce a brief, one-line description of the changes to be made.
<guidelines>
- There's no need to describe *what* to do, just *where* to do it.
- Only reference locations that actually exist (unless you're creating a file).
- If creating a file, assume any subsequent updates are included at the time of creation.
- Don't create and then update a file. Always create new files in one shot.
- Prefer updating symbols lower in the syntax tree if possible.
- Never include suggestions on a parent symbol and one of its children in the same suggestions block.
- Never nest an operation with another operation or include CDATA or other content. All suggestions are leaf nodes.
- Descriptions are required for all suggestions except delete.
- When generating multiple suggestions, ensure the descriptions are specific to each individual operation.
- Avoid referring to the location in the description. Focus on the change to be made, not the location where it's made. That's implicit with the symbol you provide.
- Don't generate multiple suggestions at the same location. Instead, combine them together in a single operation with a succinct combined description.
- To add imports respond with a suggestion where the `"symbol"` key is set to `"#imports"`
</guidelines>
</overview>
<examples>
<example>
<workflow_context>
<message role="user">
```rs src/rectangle.rs
struct Rectangle {
width: f64,
height: f64,
}
impl Rectangle {
fn new(width: f64, height: f64) -> Self {
Rectangle { width, height }
}
}
```
We need to add methods to calculate the area and perimeter of the rectangle. Can you help with that?
</message>
<message role="assistant">
Sure, I can help with that!
<step>Add new methods 'calculate_area' and 'calculate_perimeter' to the Rectangle struct</step>
<step>Implement the 'Display' trait for the Rectangle struct</step>
</message>
</workflow_context>
<step_to_resolve>
Add new methods 'calculate_area' and 'calculate_perimeter' to the Rectangle struct
</step_to_resolve>
<incorrect_output reason="NEVER append multiple children at the same location.">
{
"title": "Add Rectangle methods",
"suggestions": [
{
"kind": "AppendChild",
"path": "src/shapes.rs",
"symbol": "impl Rectangle",
"description": "Add calculate_area method"
},
{
"kind": "AppendChild",
"path": "src/shapes.rs",
"symbol": "impl Rectangle",
"description": "Add calculate_perimeter method"
}
]
}
</incorrect_output>
<correct_output>
{
"title": "Add Rectangle methods",
"suggestions": [
{
"kind": "AppendChild",
"path": "src/shapes.rs",
"symbol": "impl Rectangle",
"description": "Add calculate area and perimeter methods"
}
]
}
</correct_output>
<step_to_resolve>
Implement the 'Display' trait for the Rectangle struct
</step_to_resolve>
<output>
{
"title": "Implement Display for Rectangle",
"suggestions": [
{
"kind": "InsertSiblingAfter",
"path": "src/shapes.rs",
"symbol": "impl Rectangle",
"description": "Implement Display trait for Rectangle"
}
]
}
</output>
<example>
<workflow_context>
<message role="user">
```rs src/user.rs
struct User {
pub name: String,
age: u32,
email: String,
}
impl User {
fn new(name: String, age: u32, email: String) -> Self {
User { name, age, email }
}
pub fn print_info(&self) {
println!("Name: {}, Age: {}, Email: {}", self.name, self.age, self.email);
}
}
```
</message>
<message role="assistant">
Certainly!
<step>Update the 'print_info' method to use formatted output</step>
<step>Remove the 'email' field from the User struct</step>
</message>
</workflow_context>
<step_to_resolve>
Update the 'print_info' method to use formatted output
</step_to_resolve>
<output>
{
"title": "Use formatted output",
"suggestions": [
{
"kind": "Update",
"path": "src/user.rs",
"symbol": "impl User pub fn print_info",
"description": "Use formatted output"
}
]
}
</output>
<step_to_resolve>
Remove the 'email' field from the User struct
</step_to_resolve>
<output>
{
"title": "Remove email field",
"suggestions": [
{
"kind": "Delete",
"path": "src/user.rs",
"symbol": "struct User email"
}
]
}
</output>
</example>
<example>
<workflow_context>
<message role="user">
```rs src/vehicle.rs
struct Vehicle {
make: String,
model: String,
year: u32,
}
impl Vehicle {
fn new(make: String, model: String, year: u32) -> Self {
Vehicle { make, model, year }
}
fn print_year(&self) {
println!("Year: {}", self.year);
}
}
```
</message>
<message role="assistant">
<step>Add a 'use std::fmt;' statement at the beginning of the file</step>
<step>Add a new method 'start_engine' in the Vehicle impl block</step>
</message>
</workflow_context>
<step_to_resolve>
Add a 'use std::fmt;' statement at the beginning of the file
</step_to_resolve>
<output>
{
"title": "Add use std::fmt statement",
"suggestions": [
{
"kind": "PrependChild",
"path": "src/vehicle.rs",
"symbol": "#imports",
"description": "Add 'use std::fmt' statement"
}
]
}
</output>
<step_to_resolve>
Add a new method 'start_engine' in the Vehicle impl block
</step_to_resolve>
<output>
{
"title": "Add start_engine method",
"suggestions": [
{
"kind": "InsertSiblingAfter",
"path": "src/vehicle.rs",
"symbol": "impl Vehicle fn new",
"description": "Add start_engine method"
}
]
}
</output>
</example>
<example>
<workflow_context>
<message role="user">
```rs src/employee.rs
struct Employee {
name: String,
position: String,
salary: u32,
department: String,
}
impl Employee {
fn new(name: String, position: String, salary: u32, department: String) -> Self {
Employee { name, position, salary, department }
}
fn print_details(&self) {
println!("Name: {}, Position: {}, Salary: {}, Department: {}",
self.name, self.position, self.salary, self.department);
}
fn give_raise(&mut self, amount: u32) {
self.salary += amount;
}
}
```
</message>
<message role="assistant">
<step>Make salary an f32</step>
<step>Remove the 'department' field and update the 'print_details' method</step>
</message>
</workflow_context>
<step_to_resolve>
Make salary an f32
</step_to_resolve>
<incorrect_output reason="NEVER include suggestions on a parent symbol and one of its children in the same suggestions block.">
{
"title": "Change salary to f32",
"suggestions": [
{
"kind": "Update",
"path": "src/employee.rs",
"symbol": "struct Employee",
"description": "Change the type of salary to an f32"
},
{
"kind": "Update",
"path": "src/employee.rs",
"symbol": "struct Employee salary",
"description": "Change the type to an f32"
}
]
}
</incorrect_output>
<correct_output>
{
"title": "Change salary to f32",
"suggestions": [
{
"kind": "Update",
"path": "src/employee.rs",
"symbol": "struct Employee salary",
"description": "Change the type to an f32"
}
]
}
</correct_output>
<step_to_resolve>
Remove the 'department' field and update the 'print_details' method
</step_to_resolve>
<output>
{
"title": "Remove department",
"suggestions": [
{
"kind": "Delete",
"path": "src/employee.rs",
"symbol": "struct Employee department"
},
{
"kind": "Update",
"path": "src/employee.rs",
"symbol": "impl Employee fn print_details",
"description": "Don't print the 'department' field"
}
]
}
</output>
</example>
<example>
<workflow_context>
<message role="user">
```rs src/game.rs
struct Player {
name: String,
health: i32,
pub score: u32,
}
impl Player {
pub fn new(name: String) -> Self {
Player { name, health: 100, score: 0 }
}
}
struct Game {
players: Vec<Player>,
}
impl Game {
fn new() -> Self {
Game { players: Vec::new() }
}
}
```
</message>
<message role="assistant">
<step>Add a 'level' field to Player and update the 'new' method</step>
</message>
</workflow_context>
<step_to_resolve>
Add a 'level' field to Player and update the 'new' method
</step_to_resolve>
<output>
{
"title": "Add level field to Player",
"suggestions": [
{
"kind": "InsertSiblingAfter",
"path": "src/game.rs",
"symbol": "struct Player pub score",
"description": "Add level field to Player"
},
{
"kind": "Update",
"path": "src/game.rs",
"symbol": "impl Player pub fn new",
"description": "Initialize level in new method"
}
]
}
</output>
</example>
<example>
<workflow_context>
<message role="user">
```rs src/config.rs
use std::collections::HashMap;
struct Config {
settings: HashMap<String, String>,
}
impl Config {
fn new() -> Self {
Config { settings: HashMap::new() }
}
}
```
</message>
<message role="assistant">
<step>Add a 'load_from_file' method to Config and import necessary modules</step>
</message>
</workflow_context>
<step_to_resolve>
Add a 'load_from_file' method to Config and import necessary modules
</step_to_resolve>
<output>
{
"title": "Add load_from_file method",
"suggestions": [
{
"kind": "PrependChild",
"path": "src/config.rs",
"symbol": "#imports",
"description": "Import std::fs and std::io modules"
},
{
"kind": "AppendChild",
"path": "src/config.rs",
"symbol": "impl Config",
"description": "Add load_from_file method"
}
]
}
</output>
</example>
<example>
<workflow_context>
<message role="user">
```rs src/database.rs
pub(crate) struct Database {
connection: Connection,
}
impl Database {
fn new(url: &str) -> Result<Self, Error> {
let connection = Connection::connect(url)?;
Ok(Database { connection })
}
async fn query(&self, sql: &str) -> Result<Vec<Row>, Error> {
self.connection.query(sql, &[])
}
}
```
</message>
<message role="assistant">
<step>Add error handling to the 'query' method and create a custom error type</step>
</message>
</workflow_context>
<step_to_resolve>
Add error handling to the 'query' method and create a custom error type
</step_to_resolve>
<output>
{
"title": "Add error handling to query",
"suggestions": [
{
"kind": "PrependChild",
"path": "src/database.rs",
"description": "Import necessary error handling modules"
},
{
"kind": "InsertSiblingBefore",
"path": "src/database.rs",
"symbol": "pub(crate) struct Database",
"description": "Define custom DatabaseError enum"
},
{
"kind": "Update",
"path": "src/database.rs",
"symbol": "impl Database async fn query",
"description": "Implement error handling in query method"
}
]
}
</output>
</example>
</examples>
Now generate the suggestions for the following step:
<workflow_context>
{{{workflow_context}}}
</workflow_context>
<step_to_resolve>
{{{step_to_resolve}}}
</step_to_resolve>

View File

@@ -494,7 +494,14 @@
// Position of the close button on the editor tabs.
"close_position": "right",
// Whether to show the file icon for a tab.
"file_icons": false
"file_icons": false,
// What to do after closing the current tab.
//
// 1. Activate the tab that was open previously (default)
// "History"
// 2. Activate the neighbour tab (prefers the right one, if present)
// "Neighbour"
"activate_on_close": "history"
},
// Settings related to preview tabs.
"preview_tabs": {
@@ -705,10 +712,10 @@
// May take 2 values:
// 1. Rely on default platform handling of option key, on macOS
// this means generating certain unicode characters
// "option_to_meta": false,
// "option_as_meta": false,
// 2. Make the option keys behave as a 'meta' key, e.g. for emacs
// "option_to_meta": true,
"option_as_meta": true,
// "option_as_meta": true,
"option_as_meta": false,
// Whether or not selecting text in the terminal will automatically
// copy to the system clipboard.
"copy_on_select": false,
@@ -817,6 +824,7 @@
// Different settings for specific languages.
"languages": {
"Astro": {
"language_servers": ["astro-language-server", "..."],
"prettier": {
"allowed": true,
"plugins": ["prettier-plugin-astro"]
@@ -843,6 +851,10 @@
"Dart": {
"tab_size": 2
},
"Diff": {
"remove_trailing_whitespace_on_save": false,
"ensure_final_newline_on_save": false
},
"Elixir": {
"language_servers": ["elixir-ls", "!next-ls", "!lexical", "..."]
},

View File

@@ -0,0 +1,7 @@
// Server-specific settings
//
// For a full list of overridable settings, and general information on settings,
// see the documentation: https://zed.dev/docs/configuring-zed#settings-files
{
"lsp": {}
}

View File

@@ -26,6 +26,3 @@ serde_json.workspace = true
strum.workspace = true
thiserror.workspace = true
util.workspace = true
[dev-dependencies]
tokio.workspace = true

View File

@@ -97,6 +97,7 @@ language = { workspace = true, features = ["test-support"] }
language_model = { workspace = true, features = ["test-support"] }
languages = { workspace = true, features = ["test-support"] }
log.workspace = true
pretty_assertions.workspace = true
project = { workspace = true, features = ["test-support"] }
rand.workspace = true
serde_json_lenient.workspace = true

View File

@@ -6,6 +6,7 @@ mod context;
pub mod context_store;
mod inline_assistant;
mod model_selector;
mod patch;
mod prompt_library;
mod prompts;
mod slash_command;
@@ -14,7 +15,6 @@ pub mod slash_command_settings;
mod streaming_diff;
mod terminal_inline_assistant;
mod tools;
mod workflow;
pub use assistant_panel::{AssistantPanel, AssistantPanelEvent};
use assistant_settings::AssistantSettings;
@@ -35,11 +35,13 @@ use language_model::{
LanguageModelId, LanguageModelProviderId, LanguageModelRegistry, LanguageModelResponseMessage,
};
pub(crate) use model_selector::*;
pub use patch::*;
pub use prompts::PromptBuilder;
use prompts::PromptLoadingParams;
use semantic_index::{CloudEmbeddingProvider, SemanticDb};
use serde::{Deserialize, Serialize};
use settings::{update_settings_file, Settings, SettingsStore};
use slash_command::workflow_command::WorkflowSlashCommand;
use slash_command::{
auto_command, cargo_workspace_command, context_server_command, default_command, delta_command,
diagnostics_command, docs_command, fetch_command, file_command, now_command, project_command,
@@ -50,7 +52,6 @@ use std::path::PathBuf;
use std::sync::Arc;
pub(crate) use streaming_diff::*;
use util::ResultExt;
pub use workflow::*;
use crate::slash_command_settings::SlashCommandSettings;
@@ -393,12 +394,25 @@ fn register_slash_commands(prompt_builder: Option<Arc<PromptBuilder>>, cx: &mut
slash_command_registry.register_command(now_command::NowSlashCommand, false);
slash_command_registry.register_command(diagnostics_command::DiagnosticsSlashCommand, true);
slash_command_registry.register_command(fetch_command::FetchSlashCommand, false);
slash_command_registry.register_command(fetch_command::FetchSlashCommand, false);
if let Some(prompt_builder) = prompt_builder {
slash_command_registry.register_command(
workflow_command::WorkflowSlashCommand::new(prompt_builder.clone()),
true,
);
cx.observe_global::<SettingsStore>({
let slash_command_registry = slash_command_registry.clone();
let prompt_builder = prompt_builder.clone();
move |cx| {
if AssistantSettings::get_global(cx).are_live_diffs_enabled(cx) {
slash_command_registry.register_command(
workflow_command::WorkflowSlashCommand::new(prompt_builder.clone()),
true,
);
} else {
slash_command_registry.unregister_command_by_name(WorkflowSlashCommand::NAME);
}
}
})
.detach();
cx.observe_flag::<project_command::ProjectSlashCommandFeatureFlag, _>({
let slash_command_registry = slash_command_registry.clone();
move |is_enabled, _cx| {

File diff suppressed because it is too large

View File

@@ -2,6 +2,7 @@ use std::sync::Arc;
use ::open_ai::Model as OpenAiModel;
use anthropic::Model as AnthropicModel;
use feature_flags::FeatureFlagAppExt;
use fs::Fs;
use gpui::{AppContext, Pixels};
use language_model::provider::open_ai;
@@ -61,6 +62,13 @@ pub struct AssistantSettings {
pub default_model: LanguageModelSelection,
pub inline_alternatives: Vec<LanguageModelSelection>,
pub using_outdated_settings_version: bool,
pub enable_experimental_live_diffs: bool,
}
impl AssistantSettings {
pub fn are_live_diffs_enabled(&self, cx: &AppContext) -> bool {
cx.is_staff() || self.enable_experimental_live_diffs
}
}
/// Assistant panel settings
@@ -238,6 +246,7 @@ impl AssistantSettingsContent {
}
}),
inline_alternatives: None,
enable_experimental_live_diffs: None,
},
VersionedAssistantSettingsContent::V2(settings) => settings.clone(),
},
@@ -257,6 +266,7 @@ impl AssistantSettingsContent {
.to_string(),
}),
inline_alternatives: None,
enable_experimental_live_diffs: None,
},
}
}
@@ -373,6 +383,7 @@ impl Default for VersionedAssistantSettingsContent {
default_height: None,
default_model: None,
inline_alternatives: None,
enable_experimental_live_diffs: None,
})
}
}
@@ -403,6 +414,10 @@ pub struct AssistantSettingsContentV2 {
default_model: Option<LanguageModelSelection>,
/// Additional models with which to generate alternatives when performing inline assists.
inline_alternatives: Option<Vec<LanguageModelSelection>>,
/// Enable experimental live diffs in the assistant panel.
///
/// Default: false
enable_experimental_live_diffs: Option<bool>,
}
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema, PartialEq)]
@@ -525,7 +540,10 @@ impl Settings for AssistantSettings {
);
merge(&mut settings.default_model, value.default_model);
merge(&mut settings.inline_alternatives, value.inline_alternatives);
// merge(&mut settings.infer_context, value.infer_context); TODO re-enable this once we ship context inference
merge(
&mut settings.enable_experimental_live_diffs,
value.enable_experimental_live_diffs,
);
}
Ok(settings)
@@ -584,6 +602,7 @@ mod tests {
dock: None,
default_width: None,
default_height: None,
enable_experimental_live_diffs: None,
}),
)
},

View File

@@ -2,8 +2,8 @@
mod context_tests;
use crate::{
prompts::PromptBuilder, slash_command::SlashCommandLine, MessageId, MessageStatus,
WorkflowStep, WorkflowStepEdit, WorkflowStepResolution, WorkflowSuggestionGroup,
prompts::PromptBuilder, slash_command::SlashCommandLine, AssistantEdit, AssistantPatch,
AssistantPatchStatus, MessageId, MessageStatus,
};
use anyhow::{anyhow, Context as _, Result};
use assistant_slash_command::{
@@ -15,17 +15,15 @@ use clock::ReplicaId;
use collections::{HashMap, HashSet};
use feature_flags::{FeatureFlag, FeatureFlagAppExt};
use fs::{Fs, RemoveOptions};
use futures::{
future::{self, Shared},
FutureExt, StreamExt,
};
use futures::{future::Shared, FutureExt, StreamExt};
use gpui::{
AppContext, AsyncAppContext, Context as _, EventEmitter, Model, ModelContext, RenderImage,
SharedString, Subscription, Task,
AppContext, Context as _, EventEmitter, Model, ModelContext, RenderImage, SharedString,
Subscription, Task,
};
use language::{AnchorRangeExt, Bias, Buffer, LanguageRegistry, OffsetRangeExt, Point, ToOffset};
use language_model::{
provider::cloud::{MaxMonthlySpendReachedError, PaymentRequiredError},
LanguageModel, LanguageModelCacheConfiguration, LanguageModelCompletionEvent,
LanguageModelImage, LanguageModelRegistry, LanguageModelRequest, LanguageModelRequestMessage,
LanguageModelRequestTool, LanguageModelToolResult, LanguageModelToolUse, MessageContent, Role,
@@ -37,7 +35,7 @@ use project::Project;
use serde::{Deserialize, Serialize};
use smallvec::SmallVec;
use std::{
cmp::{self, max, Ordering},
cmp::{max, Ordering},
fmt::Debug,
iter, mem,
ops::Range,
@@ -294,10 +292,12 @@ impl ContextOperation {
#[derive(Debug, Clone)]
pub enum ContextEvent {
ShowAssistError(SharedString),
ShowPaymentRequiredError,
ShowMaxMonthlySpendReachedError,
MessagesEdited,
SummaryChanged,
StreamedCompletion,
WorkflowStepsUpdated {
PatchesUpdated {
removed: Vec<Range<language::Anchor>>,
updated: Vec<Range<language::Anchor>>,
},
@@ -451,13 +451,14 @@ pub struct XmlTag {
#[derive(Copy, Clone, Debug, strum::EnumString, PartialEq, Eq, strum::AsRefStr)]
#[strum(serialize_all = "snake_case")]
pub enum XmlTagKind {
Step,
Patch,
Title,
Edit,
Path,
Search,
Within,
Operation,
Description,
OldText,
NewText,
Operation,
}
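For orientation, these tag kinds compose into markup of the following shape, mirroring the test fixtures later in this diff (`old_text`/`new_text` replace the earlier `search` element; the `<title>` content here is illustrative):

```
<patch>
<title>Add a `two` function</title>
<edit>
<description>add a `two` function</description>
<path>src/lib.rs</path>
<operation>insert_after</operation>
<old_text>fn one</old_text>
<new_text>fn two() {}</new_text>
</edit>
</patch>
```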
pub struct Context {
@@ -487,7 +488,7 @@ pub struct Context {
_subscriptions: Vec<Subscription>,
telemetry: Option<Arc<Telemetry>>,
language_registry: Arc<LanguageRegistry>,
workflow_steps: Vec<WorkflowStep>,
patches: Vec<AssistantPatch>,
xml_tags: Vec<XmlTag>,
project: Option<Model<Project>>,
prompt_builder: Arc<PromptBuilder>,
@@ -503,7 +504,7 @@ impl ContextAnnotation for PendingSlashCommand {
}
}
impl ContextAnnotation for WorkflowStep {
impl ContextAnnotation for AssistantPatch {
fn range(&self) -> &Range<language::Anchor> {
&self.range
}
@@ -588,7 +589,7 @@ impl Context {
telemetry,
project,
language_registry,
workflow_steps: Vec::new(),
patches: Vec::new(),
xml_tags: Vec::new(),
prompt_builder,
};
@@ -926,48 +927,49 @@ impl Context {
self.summary.as_ref()
}
pub(crate) fn workflow_step_containing(
pub(crate) fn patch_containing(
&self,
offset: usize,
position: Point,
cx: &AppContext,
) -> Option<&WorkflowStep> {
) -> Option<&AssistantPatch> {
let buffer = self.buffer.read(cx);
let index = self
.workflow_steps
.binary_search_by(|step| {
let step_range = step.range.to_offset(&buffer);
if offset < step_range.start {
Ordering::Greater
} else if offset > step_range.end {
Ordering::Less
} else {
Ordering::Equal
}
})
.ok()?;
Some(&self.workflow_steps[index])
let index = self.patches.binary_search_by(|patch| {
let patch_range = patch.range.to_point(&buffer);
if position < patch_range.start {
Ordering::Greater
} else if position > patch_range.end {
Ordering::Less
} else {
Ordering::Equal
}
});
if let Ok(ix) = index {
Some(&self.patches[ix])
} else {
None
}
}
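A hedged sketch of this containment search in isolation, using a hypothetical `containing_range` helper over sorted, non-overlapping ranges: the probe reports `Greater`/`Less` until the position falls inside it, so `Ok(ix)` means containment.

```rust
use std::cmp::Ordering;
use std::ops::Range;

// Hypothetical standalone version of the search above: `ranges` must be
// sorted and non-overlapping, and the end bound is treated as inclusive.
fn containing_range(ranges: &[Range<u32>], pos: u32) -> Option<usize> {
    ranges
        .binary_search_by(|range| {
            if pos < range.start {
                Ordering::Greater // this range is past the position; search left
            } else if pos > range.end {
                Ordering::Less // this range is before the position; search right
            } else {
                Ordering::Equal // the position falls inside this range
            }
        })
        .ok()
}
```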
pub fn workflow_step_ranges(&self) -> impl Iterator<Item = Range<language::Anchor>> + '_ {
self.workflow_steps.iter().map(|step| step.range.clone())
pub fn patch_ranges(&self) -> impl Iterator<Item = Range<language::Anchor>> + '_ {
self.patches.iter().map(|patch| patch.range.clone())
}
pub(crate) fn workflow_step_for_range(
pub(crate) fn patch_for_range(
&self,
range: &Range<language::Anchor>,
cx: &AppContext,
) -> Option<&WorkflowStep> {
) -> Option<&AssistantPatch> {
let buffer = self.buffer.read(cx);
let index = self.workflow_step_index_for_range(range, buffer).ok()?;
Some(&self.workflow_steps[index])
let index = self.patch_index_for_range(range, buffer).ok()?;
Some(&self.patches[index])
}
fn workflow_step_index_for_range(
fn patch_index_for_range(
&self,
tagged_range: &Range<text::Anchor>,
buffer: &text::BufferSnapshot,
) -> Result<usize, usize> {
self.workflow_steps
self.patches
.binary_search_by(|probe| probe.range.cmp(&tagged_range, buffer))
}
@@ -1015,8 +1017,6 @@ impl Context {
language::BufferEvent::Edited => {
self.count_remaining_tokens(cx);
self.reparse(cx);
// Use `inclusive = true` to invalidate a step when an edit occurs
// at the start/end of a parsed step.
cx.emit(ContextEvent::MessagesEdited);
}
_ => {}
@@ -1245,8 +1245,8 @@ impl Context {
let mut removed_slash_command_ranges = Vec::new();
let mut updated_slash_commands = Vec::new();
let mut removed_steps = Vec::new();
let mut updated_steps = Vec::new();
let mut removed_patches = Vec::new();
let mut updated_patches = Vec::new();
while let Some(mut row_range) = row_ranges.next() {
while let Some(next_row_range) = row_ranges.peek() {
if row_range.end >= next_row_range.start {
@@ -1270,11 +1270,11 @@ impl Context {
&mut removed_slash_command_ranges,
cx,
);
self.reparse_workflow_steps_in_range(
self.reparse_patches_in_range(
start..end,
&buffer,
&mut updated_steps,
&mut removed_steps,
&mut updated_patches,
&mut removed_patches,
cx,
);
}
@@ -1286,10 +1286,10 @@ impl Context {
});
}
if !updated_steps.is_empty() || !removed_steps.is_empty() {
cx.emit(ContextEvent::WorkflowStepsUpdated {
removed: removed_steps,
updated: updated_steps,
if !updated_patches.is_empty() || !removed_patches.is_empty() {
cx.emit(ContextEvent::PatchesUpdated {
removed: removed_patches,
updated: updated_patches,
});
}
}
@@ -1351,7 +1351,7 @@ impl Context {
removed.extend(removed_commands.map(|command| command.source_range));
}
fn reparse_workflow_steps_in_range(
fn reparse_patches_in_range(
&mut self,
range: Range<text::Anchor>,
buffer: &BufferSnapshot,
@@ -1366,41 +1366,32 @@ impl Context {
self.xml_tags
.splice(intersecting_tags_range.clone(), new_tags);
// Find which steps intersect the changed range.
let intersecting_steps_range =
self.indices_intersecting_buffer_range(&self.workflow_steps, range.clone(), cx);
// Find which patches intersect the changed range.
let intersecting_patches_range =
self.indices_intersecting_buffer_range(&self.patches, range.clone(), cx);
// Reparse all tags after the last unchanged step before the change.
// Reparse all tags after the last unchanged patch before the change.
let mut tags_start_ix = 0;
if let Some(preceding_unchanged_step) =
self.workflow_steps[..intersecting_steps_range.start].last()
if let Some(preceding_unchanged_patch) =
self.patches[..intersecting_patches_range.start].last()
{
tags_start_ix = match self.xml_tags.binary_search_by(|tag| {
tag.range
.start
.cmp(&preceding_unchanged_step.range.end, buffer)
.cmp(&preceding_unchanged_patch.range.end, buffer)
.then(Ordering::Less)
}) {
Ok(ix) | Err(ix) => ix,
};
}
// Rebuild the edit suggestions in the range.
let mut new_steps = self.parse_steps(tags_start_ix, range.end, buffer);
if let Some(project) = self.project() {
for step in &mut new_steps {
Self::resolve_workflow_step_internal(step, &project, cx);
}
}
updated.extend(new_steps.iter().map(|step| step.range.clone()));
let removed_steps = self
.workflow_steps
.splice(intersecting_steps_range, new_steps);
// Rebuild the patches in the range.
let new_patches = self.parse_patches(tags_start_ix, range.end, buffer, cx);
updated.extend(new_patches.iter().map(|patch| patch.range.clone()));
let removed_patches = self.patches.splice(intersecting_patches_range, new_patches);
removed.extend(
removed_steps
.map(|step| step.range)
removed_patches
.map(|patch| patch.range)
.filter(|range| !updated.contains(&range)),
);
}
@@ -1461,60 +1452,95 @@ impl Context {
tags
}
fn parse_steps(
fn parse_patches(
&mut self,
tags_start_ix: usize,
buffer_end: text::Anchor,
buffer: &BufferSnapshot,
) -> Vec<WorkflowStep> {
let mut new_steps = Vec::new();
let mut pending_step = None;
let mut edit_step_depth = 0;
cx: &AppContext,
) -> Vec<AssistantPatch> {
let mut new_patches = Vec::new();
let mut pending_patch = None;
let mut patch_tag_depth = 0;
let mut tags = self.xml_tags[tags_start_ix..].iter().peekable();
'tags: while let Some(tag) = tags.next() {
if tag.range.start.cmp(&buffer_end, buffer).is_gt() && edit_step_depth == 0 {
if tag.range.start.cmp(&buffer_end, buffer).is_gt() && patch_tag_depth == 0 {
break;
}
if tag.kind == XmlTagKind::Step && tag.is_open_tag {
edit_step_depth += 1;
let edit_start = tag.range.start;
let mut edits = Vec::new();
let mut step = WorkflowStep {
range: edit_start..edit_start,
leading_tags_end: tag.range.end,
trailing_tag_start: None,
if tag.kind == XmlTagKind::Patch && tag.is_open_tag {
patch_tag_depth += 1;
let patch_start = tag.range.start;
let mut edits = Vec::<Result<AssistantEdit>>::new();
let mut patch = AssistantPatch {
range: patch_start..patch_start,
title: String::new().into(),
edits: Default::default(),
resolution: None,
resolution_task: None,
status: crate::AssistantPatchStatus::Pending,
};
while let Some(tag) = tags.next() {
step.trailing_tag_start.get_or_insert(tag.range.start);
if tag.kind == XmlTagKind::Patch && !tag.is_open_tag {
patch_tag_depth -= 1;
if patch_tag_depth == 0 {
patch.range.end = tag.range.end;
if tag.kind == XmlTagKind::Step && !tag.is_open_tag {
// step.trailing_tag_start = Some(tag.range.start);
edit_step_depth -= 1;
if edit_step_depth == 0 {
step.range.end = tag.range.end;
step.edits = edits.into();
new_steps.push(step);
// Include the line immediately after this closing </patch> tag if it's empty.
let patch_end_offset = patch.range.end.to_offset(buffer);
let mut patch_end_chars = buffer.chars_at(patch_end_offset);
if patch_end_chars.next() == Some('\n')
&& patch_end_chars.next().map_or(true, |ch| ch == '\n')
{
let messages = self.messages_for_offsets(
[patch_end_offset, patch_end_offset + 1],
cx,
);
if messages.len() == 1 {
patch.range.end = buffer.anchor_before(patch_end_offset + 1);
}
}
edits.sort_unstable_by(|a, b| {
if let (Ok(a), Ok(b)) = (a, b) {
a.path.cmp(&b.path)
} else {
Ordering::Equal
}
});
patch.edits = edits.into();
patch.status = AssistantPatchStatus::Ready;
new_patches.push(patch);
continue 'tags;
}
}
if tag.kind == XmlTagKind::Title && tag.is_open_tag {
let content_start = tag.range.end;
while let Some(tag) = tags.next() {
if tag.kind == XmlTagKind::Title && !tag.is_open_tag {
let content_end = tag.range.start;
patch.title =
trimmed_text_in_range(buffer, content_start..content_end)
.into();
break;
}
}
}
if tag.kind == XmlTagKind::Edit && tag.is_open_tag {
let mut path = None;
let mut search = None;
let mut old_text = None;
let mut new_text = None;
let mut operation = None;
let mut description = None;
while let Some(tag) = tags.next() {
if tag.kind == XmlTagKind::Edit && !tag.is_open_tag {
edits.push(WorkflowStepEdit::new(
edits.push(AssistantEdit::new(
path,
operation,
search,
old_text,
new_text,
description,
));
break;
@@ -1523,7 +1549,8 @@ impl Context {
if tag.is_open_tag
&& [
XmlTagKind::Path,
XmlTagKind::Search,
XmlTagKind::OldText,
XmlTagKind::NewText,
XmlTagKind::Operation,
XmlTagKind::Description,
]
@@ -1535,15 +1562,18 @@ impl Context {
if tag.kind == kind && !tag.is_open_tag {
let tag = tags.next().unwrap();
let content_end = tag.range.start;
let mut content = buffer
.text_for_range(content_start..content_end)
.collect::<String>();
content.truncate(content.trim_end().len());
let content = trimmed_text_in_range(
buffer,
content_start..content_end,
);
match kind {
XmlTagKind::Path => path = Some(content),
XmlTagKind::Operation => operation = Some(content),
XmlTagKind::Search => {
search = Some(content).filter(|s| !s.is_empty())
XmlTagKind::OldText => {
old_text = Some(content).filter(|s| !s.is_empty())
}
XmlTagKind::NewText => {
new_text = Some(content).filter(|s| !s.is_empty())
}
XmlTagKind::Description => {
description =
@@ -1558,162 +1588,28 @@ impl Context {
}
}
pending_step = Some(step);
patch.edits = edits.into();
pending_patch = Some(patch);
}
}
if let Some(mut pending_step) = pending_step {
pending_step.range.end = text::Anchor::MAX;
new_steps.push(pending_step);
}
new_steps
}
pub fn resolve_workflow_step(
&mut self,
tagged_range: Range<text::Anchor>,
cx: &mut ModelContext<Self>,
) -> Option<()> {
let index = self
.workflow_step_index_for_range(&tagged_range, self.buffer.read(cx))
.ok()?;
let step = &mut self.workflow_steps[index];
let project = self.project.as_ref()?;
step.resolution.take();
Self::resolve_workflow_step_internal(step, project, cx);
None
}
fn resolve_workflow_step_internal(
step: &mut WorkflowStep,
project: &Model<Project>,
cx: &mut ModelContext<'_, Context>,
) {
step.resolution_task = Some(cx.spawn({
let range = step.range.clone();
let edits = step.edits.clone();
let project = project.clone();
|this, mut cx| async move {
let suggestion_groups =
Self::compute_step_resolution(project, edits, &mut cx).await;
this.update(&mut cx, |this, cx| {
let buffer = this.buffer.read(cx).text_snapshot();
let ix = this.workflow_step_index_for_range(&range, &buffer).ok();
if let Some(ix) = ix {
let step = &mut this.workflow_steps[ix];
let resolution = suggestion_groups.map(|suggestion_groups| {
let mut title = String::new();
for mut chunk in buffer.text_for_range(
step.leading_tags_end
..step.trailing_tag_start.unwrap_or(step.range.end),
) {
if title.is_empty() {
chunk = chunk.trim_start();
}
if let Some((prefix, _)) = chunk.split_once('\n') {
title.push_str(prefix);
break;
} else {
title.push_str(chunk);
}
}
WorkflowStepResolution {
title,
suggestion_groups,
}
});
step.resolution = Some(Arc::new(resolution));
cx.emit(ContextEvent::WorkflowStepsUpdated {
removed: vec![],
updated: vec![range],
})
}
})
.ok();
}
}));
}
async fn compute_step_resolution(
project: Model<Project>,
edits: Arc<[Result<WorkflowStepEdit>]>,
cx: &mut AsyncAppContext,
) -> Result<HashMap<Model<Buffer>, Vec<WorkflowSuggestionGroup>>> {
let mut suggestion_tasks = Vec::new();
for edit in edits.iter() {
let edit = edit.as_ref().map_err(|e| anyhow!("{e}"))?;
suggestion_tasks.push(edit.resolve(project.clone(), cx.clone()));
}
// Expand the context ranges of each suggestion and group suggestions with overlapping context ranges.
let suggestions = future::try_join_all(suggestion_tasks).await?;
let mut suggestions_by_buffer = HashMap::default();
for (buffer, suggestion) in suggestions {
suggestions_by_buffer
.entry(buffer)
.or_insert_with(Vec::new)
.push(suggestion);
}
let mut suggestion_groups_by_buffer = HashMap::default();
for (buffer, mut suggestions) in suggestions_by_buffer {
let mut suggestion_groups = Vec::<WorkflowSuggestionGroup>::new();
let snapshot = buffer.update(cx, |buffer, _| buffer.snapshot())?;
// Sort suggestions by their range so that earlier, larger ranges come first
suggestions.sort_by(|a, b| a.range().cmp(&b.range(), &snapshot));
// Merge overlapping suggestions
suggestions.dedup_by(|a, b| b.try_merge(a, &snapshot));
// Create context ranges for each suggestion
for suggestion in suggestions {
let context_range = {
let suggestion_point_range = suggestion.range().to_point(&snapshot);
let start_row = suggestion_point_range.start.row.saturating_sub(5);
let end_row =
cmp::min(suggestion_point_range.end.row + 5, snapshot.max_point().row);
let start = snapshot.anchor_before(Point::new(start_row, 0));
let end =
snapshot.anchor_after(Point::new(end_row, snapshot.line_len(end_row)));
start..end
};
if let Some(last_group) = suggestion_groups.last_mut() {
if last_group
.context_range
.end
.cmp(&context_range.start, &snapshot)
.is_ge()
{
// Merge with the previous group if context ranges overlap
last_group.context_range.end = context_range.end;
last_group.suggestions.push(suggestion);
} else {
// Create a new group
suggestion_groups.push(WorkflowSuggestionGroup {
context_range,
suggestions: vec![suggestion],
});
}
if let Some(mut pending_patch) = pending_patch {
let patch_start = pending_patch.range.start.to_offset(buffer);
if let Some(message) = self.message_for_offset(patch_start, cx) {
if message.anchor_range.end == text::Anchor::MAX {
pending_patch.range.end = text::Anchor::MAX;
} else {
// Create the first group
suggestion_groups.push(WorkflowSuggestionGroup {
context_range,
suggestions: vec![suggestion],
});
let message_end = buffer.anchor_after(message.offset_range.end - 1);
pending_patch.range.end = message_end;
}
} else {
pending_patch.range.end = text::Anchor::MAX;
}
suggestion_groups_by_buffer.insert(buffer, suggestion_groups);
new_patches.push(pending_patch);
}
Ok(suggestion_groups_by_buffer)
new_patches
}
pub fn pending_command_for_position(
@@ -2112,25 +2008,36 @@ impl Context {
let result = stream_completion.await;
this.update(&mut cx, |this, cx| {
let error_message = result
.as_ref()
.err()
.map(|error| error.to_string().trim().to_string());
if let Some(error_message) = error_message.as_ref() {
cx.emit(ContextEvent::ShowAssistError(SharedString::from(
error_message.clone(),
)));
}
this.update_metadata(assistant_message_id, cx, |metadata| {
if let Some(error_message) = error_message.as_ref() {
metadata.status =
MessageStatus::Error(SharedString::from(error_message.clone()));
let error_message = if let Some(error) = result.as_ref().err() {
if error.is::<PaymentRequiredError>() {
cx.emit(ContextEvent::ShowPaymentRequiredError);
this.update_metadata(assistant_message_id, cx, |metadata| {
metadata.status = MessageStatus::Canceled;
});
Some(error.to_string())
} else if error.is::<MaxMonthlySpendReachedError>() {
cx.emit(ContextEvent::ShowMaxMonthlySpendReachedError);
this.update_metadata(assistant_message_id, cx, |metadata| {
metadata.status = MessageStatus::Canceled;
});
Some(error.to_string())
} else {
metadata.status = MessageStatus::Done;
let error_message = error.to_string().trim().to_string();
cx.emit(ContextEvent::ShowAssistError(SharedString::from(
error_message.clone(),
)));
this.update_metadata(assistant_message_id, cx, |metadata| {
metadata.status =
MessageStatus::Error(SharedString::from(error_message.clone()));
});
Some(error_message)
}
});
} else {
this.update_metadata(assistant_message_id, cx, |metadata| {
metadata.status = MessageStatus::Done;
});
None
};
if let Some(telemetry) = this.telemetry.as_ref() {
let language_name = this
@@ -2146,7 +2053,7 @@ impl Context {
model_provider: model.provider_id().to_string(),
response_latency,
error_message,
language_name,
language_name: language_name.map(|name| name.to_proto()),
});
}
@@ -2301,11 +2208,11 @@ impl Context {
let mut updated = Vec::new();
let mut removed = Vec::new();
for range in ranges {
self.reparse_workflow_steps_in_range(range, &buffer, &mut updated, &mut removed, cx);
self.reparse_patches_in_range(range, &buffer, &mut updated, &mut removed, cx);
}
if !updated.is_empty() || !removed.is_empty() {
cx.emit(ContextEvent::WorkflowStepsUpdated { removed, updated })
cx.emit(ContextEvent::PatchesUpdated { removed, updated })
}
}
@@ -2811,6 +2718,24 @@ impl Context {
}
}
fn trimmed_text_in_range(buffer: &BufferSnapshot, range: Range<text::Anchor>) -> String {
let mut is_start = true;
let mut content = buffer
.text_for_range(range)
.map(|mut chunk| {
if is_start {
chunk = chunk.trim_start_matches('\n');
if !chunk.is_empty() {
is_start = false;
}
}
chunk
})
.collect::<String>();
content.truncate(content.trim_end().len());
content
}
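A hedged behavior sketch of this helper, with string chunks standing in for `BufferSnapshot` chunks: leading newlines are stripped until the first non-empty chunk, then trailing whitespace is truncated from the combined result.

```rust
// Minimal standalone restatement of trimmed_text_in_range's chunk logic.
fn trimmed(chunks: &[&str]) -> String {
    let mut is_start = true;
    let mut content = chunks
        .iter()
        .map(|chunk| {
            let mut chunk = *chunk;
            if is_start {
                chunk = chunk.trim_start_matches('\n');
                if !chunk.is_empty() {
                    is_start = false;
                }
            }
            chunk
        })
        .collect::<String>();
    content.truncate(content.trim_end().len());
    content
}

fn main() {
    // Leading newlines vanish; whitespace is trimmed only at the very end.
    assert_eq!(trimmed(&["\n", "\ntitle ", " \n"]), "title");
}
```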
#[derive(Debug, Default)]
pub struct ContextVersion {
context: clock::Global,

View File

@@ -1,8 +1,7 @@
use super::{MessageCacheMetadata, WorkflowStepEdit};
use super::{AssistantEdit, MessageCacheMetadata};
use crate::{
assistant_panel, prompt_library, slash_command::file_command, CacheStatus, Context,
ContextEvent, ContextId, ContextOperation, MessageId, MessageStatus, PromptBuilder,
WorkflowStepEditKind,
assistant_panel, prompt_library, slash_command::file_command, AssistantEditKind, CacheStatus,
Context, ContextEvent, ContextId, ContextOperation, MessageId, MessageStatus, PromptBuilder,
};
use anyhow::Result;
use assistant_slash_command::{
@@ -15,6 +14,7 @@ use gpui::{AppContext, Model, SharedString, Task, TestAppContext, WeakView};
use language::{Buffer, BufferSnapshot, LanguageRegistry, LspAdapterDelegate};
use language_model::{LanguageModelCacheConfiguration, LanguageModelRegistry, Role};
use parking_lot::Mutex;
use pretty_assertions::assert_eq;
use project::Project;
use rand::prelude::*;
use serde_json::json;
@@ -478,7 +478,15 @@ async fn test_slash_commands(cx: &mut TestAppContext) {
#[gpui::test]
async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
cx.update(prompt_library::init);
let settings_store = cx.update(SettingsStore::test);
let mut settings_store = cx.update(SettingsStore::test);
cx.update(|cx| {
settings_store
.set_user_settings(
r#"{ "assistant": { "enable_experimental_live_diffs": true } }"#,
cx,
)
.unwrap()
});
cx.set_global(settings_store);
cx.update(language::init);
cx.update(Project::init_settings);
@@ -520,7 +528,7 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
»",
cx,
);
expect_steps(
expect_patches(
&context,
"
@@ -539,17 +547,17 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
one
two
«
<step»",
<patch»",
cx,
);
expect_steps(
expect_patches(
&context,
"
one
two
<step",
<patch",
&[],
cx,
);
@@ -563,36 +571,24 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
one
two
<step«>
Add a second function
```rust
fn two() {}
```
<patch«>
<edit>»",
cx,
);
expect_steps(
expect_patches(
&context,
"
one
two
«<step>
Add a second function
```rust
fn two() {}
```
«<patch>
<edit>»",
&[&[]],
cx,
);
// The full suggestion is added
// The full patch is added
edit(
&context,
"
@@ -600,51 +596,46 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
one
two
<step>
Add a second function
```rust
fn two() {}
```
<patch>
<edit>«
<description>add a `two` function</description>
<path>src/lib.rs</path>
<operation>insert_after</operation>
<search>fn one</search>
<description>add a `two` function</description>
<old_text>fn one</old_text>
<new_text>
fn two() {}
</new_text>
</edit>
</step>
</patch>
also,»",
cx,
);
expect_steps(
expect_patches(
&context,
"
one
two
«<step>
Add a second function
```rust
fn two() {}
```
«<patch>
<edit>
<description>add a `two` function</description>
<path>src/lib.rs</path>
<operation>insert_after</operation>
<search>fn one</search>
<description>add a `two` function</description>
<old_text>fn one</old_text>
<new_text>
fn two() {}
</new_text>
</edit>
</step>»
</patch>
»
also,",
&[&[WorkflowStepEdit {
&[&[AssistantEdit {
path: "src/lib.rs".into(),
kind: WorkflowStepEditKind::InsertAfter {
search: "fn one".into(),
kind: AssistantEditKind::InsertAfter {
old_text: "fn one".into(),
new_text: "fn two() {}".into(),
description: "add a `two` function".into(),
},
}]],
@@ -659,51 +650,46 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
one
two
<step>
Add a second function
```rust
fn two() {}
```
<patch>
<edit>
<description>add a `two` function</description>
<path>src/lib.rs</path>
<operation>insert_after</operation>
<search>«fn zero»</search>
<description>add a `two` function</description>
<old_text>«fn zero»</old_text>
<new_text>
fn two() {}
</new_text>
</edit>
</step>
</patch>
also,",
cx,
);
expect_steps(
expect_patches(
&context,
"
one
two
«<step>
Add a second function
```rust
fn two() {}
```
«<patch>
<edit>
<description>add a `two` function</description>
<path>src/lib.rs</path>
<operation>insert_after</operation>
<search>fn zero</search>
<description>add a `two` function</description>
<old_text>fn zero</old_text>
<new_text>
fn two() {}
</new_text>
</edit>
</step>»
</patch>
»
also,",
&[&[WorkflowStepEdit {
&[&[AssistantEdit {
path: "src/lib.rs".into(),
kind: WorkflowStepEditKind::InsertAfter {
search: "fn zero".into(),
kind: AssistantEditKind::InsertAfter {
old_text: "fn zero".into(),
new_text: "fn two() {}".into(),
description: "add a `two` function".into(),
},
}]],
@@ -715,27 +701,24 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
context.cycle_message_roles(HashSet::from_iter([assistant_message_id]), cx);
context.cycle_message_roles(HashSet::from_iter([assistant_message_id]), cx);
});
expect_steps(
expect_patches(
&context,
"
one
two
<step>
Add a second function
```rust
fn two() {}
```
<patch>
<edit>
<description>add a `two` function</description>
<path>src/lib.rs</path>
<operation>insert_after</operation>
<search>fn zero</search>
<description>add a `two` function</description>
<old_text>fn zero</old_text>
<new_text>
fn two() {}
</new_text>
</edit>
</step>
</patch>
also,",
&[],
@@ -746,33 +729,31 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
context.update(cx, |context, cx| {
context.cycle_message_roles(HashSet::from_iter([assistant_message_id]), cx);
});
expect_steps(
expect_patches(
&context,
"
one
two
«<step>
Add a second function
```rust
fn two() {}
```
«<patch>
<edit>
<description>add a `two` function</description>
<path>src/lib.rs</path>
<operation>insert_after</operation>
<search>fn zero</search>
<description>add a `two` function</description>
<old_text>fn zero</old_text>
<new_text>
fn two() {}
</new_text>
</edit>
</step>»
</patch>
»
also,",
&[&[WorkflowStepEdit {
&[&[AssistantEdit {
path: "src/lib.rs".into(),
kind: WorkflowStepEditKind::InsertAfter {
search: "fn zero".into(),
kind: AssistantEditKind::InsertAfter {
old_text: "fn zero".into(),
new_text: "fn two() {}".into(),
description: "add a `two` function".into(),
},
}]],
@@ -792,33 +773,31 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
cx,
)
});
expect_steps(
expect_patches(
&deserialized_context,
"
one
two
«<step>
Add a second function
```rust
fn two() {}
```
«<patch>
<edit>
<description>add a `two` function</description>
<path>src/lib.rs</path>
<operation>insert_after</operation>
<search>fn zero</search>
<description>add a `two` function</description>
<old_text>fn zero</old_text>
<new_text>
fn two() {}
</new_text>
</edit>
</step>»
</patch>
»
also,",
&[&[WorkflowStepEdit {
&[&[AssistantEdit {
path: "src/lib.rs".into(),
kind: WorkflowStepEditKind::InsertAfter {
search: "fn zero".into(),
kind: AssistantEditKind::InsertAfter {
old_text: "fn zero".into(),
new_text: "fn two() {}".into(),
description: "add a `two` function".into(),
},
}]],
@@ -834,48 +813,58 @@ async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
cx.executor().run_until_parked();
}
fn expect_steps(
#[track_caller]
fn expect_patches(
context: &Model<Context>,
expected_marked_text: &str,
expected_suggestions: &[&[WorkflowStepEdit]],
expected_suggestions: &[&[AssistantEdit]],
cx: &mut TestAppContext,
) {
context.update(cx, |context, cx| {
let expected_marked_text = expected_marked_text.unindent();
let (expected_text, expected_ranges) = marked_text_ranges(&expected_marked_text, false);
let expected_marked_text = expected_marked_text.unindent();
let (expected_text, _) = marked_text_ranges(&expected_marked_text, false);
let (buffer_text, ranges, patches) = context.update(cx, |context, cx| {
context.buffer.read_with(cx, |buffer, _| {
assert_eq!(buffer.text(), expected_text);
let ranges = context
.workflow_steps
.patches
.iter()
.map(|entry| entry.range.to_offset(buffer))
.collect::<Vec<_>>();
let marked = generate_marked_text(&expected_text, &ranges, false);
assert_eq!(
marked,
expected_marked_text,
"unexpected suggestion ranges. actual: {ranges:?}, expected: {expected_ranges:?}"
);
let suggestions = context
.workflow_steps
.iter()
.map(|step| {
step.edits
.iter()
.map(|edit| {
let edit = edit.as_ref().unwrap();
WorkflowStepEdit {
path: edit.path.clone(),
kind: edit.kind.clone(),
}
})
.collect::<Vec<_>>()
})
.collect::<Vec<_>>();
assert_eq!(suggestions, expected_suggestions);
});
(
buffer.text(),
ranges,
context
.patches
.iter()
.map(|step| step.edits.clone())
.collect::<Vec<_>>(),
)
})
});
assert_eq!(buffer_text, expected_text);
let actual_marked_text = generate_marked_text(&expected_text, &ranges, false);
assert_eq!(actual_marked_text, expected_marked_text);
assert_eq!(
patches
.iter()
.map(|patch| {
patch
.iter()
.map(|edit| {
let edit = edit.as_ref().unwrap();
AssistantEdit {
path: edit.path.clone(),
kind: edit.kind.clone(),
}
})
.collect::<Vec<_>>()
})
.collect::<Vec<_>>(),
expected_suggestions
);
}
}

View File

@@ -82,13 +82,6 @@ pub struct InlineAssistant {
assists: HashMap<InlineAssistId, InlineAssist>,
assists_by_editor: HashMap<WeakView<Editor>, EditorInlineAssists>,
assist_groups: HashMap<InlineAssistGroupId, InlineAssistGroup>,
assist_observations: HashMap<
InlineAssistId,
(
async_watch::Sender<AssistStatus>,
async_watch::Receiver<AssistStatus>,
),
>,
confirmed_assists: HashMap<InlineAssistId, Model<CodegenAlternative>>,
prompt_history: VecDeque<String>,
prompt_builder: Arc<PromptBuilder>,
@@ -96,19 +89,6 @@ pub struct InlineAssistant {
fs: Arc<dyn Fs>,
}
pub enum AssistStatus {
Idle,
Started,
Stopped,
Finished,
}
impl AssistStatus {
pub fn is_done(&self) -> bool {
matches!(self, Self::Stopped | Self::Finished)
}
}
impl Global for InlineAssistant {}
impl InlineAssistant {
@@ -123,7 +103,6 @@ impl InlineAssistant {
assists: HashMap::default(),
assists_by_editor: HashMap::default(),
assist_groups: HashMap::default(),
assist_observations: HashMap::default(),
confirmed_assists: HashMap::default(),
prompt_history: VecDeque::default(),
prompt_builder,
@@ -267,7 +246,7 @@ impl InlineAssistant {
model_provider: model.provider_id().to_string(),
response_latency: None,
error_message: None,
language_name: buffer.language().map(|language| language.name()),
language_name: buffer.language().map(|language| language.name().to_proto()),
});
}
}
@@ -788,7 +767,7 @@ impl InlineAssistant {
model_provider: model.provider_id().to_string(),
response_latency: None,
error_message: None,
language_name,
language_name: language_name.map(|name| name.to_proto()),
});
}
}
@@ -835,17 +814,6 @@ impl InlineAssistant {
.insert(assist_id, confirmed_alternative);
}
}
// Remove the assist from the status updates map
self.assist_observations.remove(&assist_id);
}
pub fn undo_assist(&mut self, assist_id: InlineAssistId, cx: &mut WindowContext) -> bool {
let Some(codegen) = self.confirmed_assists.remove(&assist_id) else {
return false;
};
codegen.update(cx, |this, cx| this.undo(cx));
true
}
fn dismiss_assist(&mut self, assist_id: InlineAssistId, cx: &mut WindowContext) -> bool {
@@ -1039,10 +1007,6 @@ impl InlineAssistant {
codegen.start(user_prompt, assistant_panel_context, cx)
})
.log_err();
if let Some((tx, _)) = self.assist_observations.get(&assist_id) {
tx.send(AssistStatus::Started).ok();
}
}
pub fn stop_assist(&mut self, assist_id: InlineAssistId, cx: &mut WindowContext) {
@@ -1053,25 +1017,6 @@ impl InlineAssistant {
};
assist.codegen.update(cx, |codegen, cx| codegen.stop(cx));
if let Some((tx, _)) = self.assist_observations.get(&assist_id) {
tx.send(AssistStatus::Stopped).ok();
}
}
pub fn assist_status(&self, assist_id: InlineAssistId, cx: &AppContext) -> InlineAssistStatus {
if let Some(assist) = self.assists.get(&assist_id) {
match assist.codegen.read(cx).status(cx) {
CodegenStatus::Idle => InlineAssistStatus::Idle,
CodegenStatus::Pending => InlineAssistStatus::Pending,
CodegenStatus::Done => InlineAssistStatus::Done,
CodegenStatus::Error(_) => InlineAssistStatus::Error,
}
} else if self.confirmed_assists.contains_key(&assist_id) {
InlineAssistStatus::Confirmed
} else {
InlineAssistStatus::Canceled
}
}
fn update_editor_highlights(&self, editor: &View<Editor>, cx: &mut WindowContext) {
@@ -1257,42 +1202,6 @@ impl InlineAssistant {
.collect();
})
}
pub fn observe_assist(
&mut self,
assist_id: InlineAssistId,
) -> async_watch::Receiver<AssistStatus> {
if let Some((_, rx)) = self.assist_observations.get(&assist_id) {
rx.clone()
} else {
let (tx, rx) = async_watch::channel(AssistStatus::Idle);
self.assist_observations.insert(assist_id, (tx, rx.clone()));
rx
}
}
}
pub enum InlineAssistStatus {
Idle,
Pending,
Done,
Error,
Confirmed,
Canceled,
}
impl InlineAssistStatus {
pub(crate) fn is_pending(&self) -> bool {
matches!(self, Self::Pending)
}
pub(crate) fn is_confirmed(&self) -> bool {
matches!(self, Self::Confirmed)
}
pub(crate) fn is_done(&self) -> bool {
matches!(self, Self::Done)
}
}
struct EditorInlineAssists {
@@ -2278,7 +2187,7 @@ impl InlineAssist {
struct InlineAssistantError;
let id =
NotificationId::identified::<InlineAssistantError>(
NotificationId::composite::<InlineAssistantError>(
assist_id.0,
);
@@ -2290,8 +2199,6 @@ impl InlineAssist {
if assist.decorations.is_none() {
this.finish_assist(assist_id, false, cx);
} else if let Some(tx) = this.assist_observations.get(&assist_id) {
tx.0.send(AssistStatus::Finished).ok();
}
}
})
@@ -2954,7 +2861,7 @@ impl CodegenAlternative {
model_provider: model_provider_id.to_string(),
response_latency,
error_message,
language_name,
language_name: language_name.map(|name| name.to_proto()),
});
}

View File

@@ -0,0 +1,829 @@
use anyhow::{anyhow, Context as _, Result};
use collections::HashMap;
use editor::ProposedChangesEditor;
use futures::{future, TryFutureExt as _};
use gpui::{AppContext, AsyncAppContext, Model, SharedString};
use language::{AutoindentMode, Buffer, BufferSnapshot};
use project::{Project, ProjectPath};
use std::{cmp, ops::Range, path::Path, sync::Arc};
use text::{AnchorRangeExt as _, Bias, OffsetRangeExt as _, Point};
#[derive(Clone, Debug)]
pub(crate) struct AssistantPatch {
pub range: Range<language::Anchor>,
pub title: SharedString,
pub edits: Arc<[Result<AssistantEdit>]>,
pub status: AssistantPatchStatus,
}
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub(crate) enum AssistantPatchStatus {
Pending,
Ready,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) struct AssistantEdit {
pub path: String,
pub kind: AssistantEditKind,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum AssistantEditKind {
Update {
old_text: String,
new_text: String,
description: String,
},
Create {
new_text: String,
description: String,
},
InsertBefore {
old_text: String,
new_text: String,
description: String,
},
InsertAfter {
old_text: String,
new_text: String,
description: String,
},
Delete {
old_text: String,
},
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub(crate) struct ResolvedPatch {
pub edit_groups: HashMap<Model<Buffer>, Vec<ResolvedEditGroup>>,
pub errors: Vec<AssistantPatchResolutionError>,
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub struct ResolvedEditGroup {
pub context_range: Range<language::Anchor>,
pub edits: Vec<ResolvedEdit>,
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub struct ResolvedEdit {
range: Range<language::Anchor>,
new_text: String,
description: Option<String>,
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub(crate) struct AssistantPatchResolutionError {
pub edit_ix: usize,
pub message: String,
}
#[derive(Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord)]
enum SearchDirection {
Up,
Left,
Diagonal,
}
// A measure of the current quality of an in-progress fuzzy search.
//
// Stores a numeric score together with the direction of the preceding
// operation in the search.
#[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
struct SearchState {
score: u32,
direction: SearchDirection,
}
impl SearchState {
fn new(score: u32, direction: SearchDirection) -> Self {
Self { score, direction }
}
}
impl ResolvedPatch {
pub fn apply(&self, editor: &ProposedChangesEditor, cx: &mut AppContext) {
for (buffer, groups) in &self.edit_groups {
let branch = editor.branch_buffer_for_base(buffer).unwrap();
Self::apply_edit_groups(groups, &branch, cx);
}
editor.recalculate_all_buffer_diffs();
}
fn apply_edit_groups(
groups: &Vec<ResolvedEditGroup>,
buffer: &Model<Buffer>,
cx: &mut AppContext,
) {
let mut edits = Vec::new();
for group in groups {
for suggestion in &group.edits {
edits.push((suggestion.range.clone(), suggestion.new_text.clone()));
}
}
buffer.update(cx, |buffer, cx| {
buffer.edit(
edits,
Some(AutoindentMode::Block {
original_indent_columns: Vec::new(),
}),
cx,
);
});
}
}
impl ResolvedEdit {
pub fn try_merge(&mut self, other: &Self, buffer: &text::BufferSnapshot) -> bool {
let range = &self.range;
let other_range = &other.range;
// Only merge if this edit's range fully contains the other edit's range.
if range.start.cmp(&other_range.start, buffer).is_gt()
|| range.end.cmp(&other_range.end, buffer).is_lt()
{
return false;
}
let other_offset_range = other_range.to_offset(buffer);
let offset_range = range.to_offset(buffer);
// If the other edit's range is empty and sits at the start of this edit's range, prepend its new text.
if other_offset_range.is_empty() && other_offset_range.start == offset_range.start {
self.new_text = format!("{}\n{}", other.new_text, self.new_text);
self.range.start = other_range.start;
if let Some((description, other_description)) =
self.description.as_mut().zip(other.description.as_ref())
{
*description = format!("{}\n{}", other_description, description)
}
} else {
if let Some((description, other_description)) =
self.description.as_mut().zip(other.description.as_ref())
{
description.push('\n');
description.push_str(other_description);
}
}
true
}
}
impl AssistantEdit {
pub fn new(
path: Option<String>,
operation: Option<String>,
old_text: Option<String>,
new_text: Option<String>,
description: Option<String>,
) -> Result<Self> {
let path = path.ok_or_else(|| anyhow!("missing path"))?;
let operation = operation.ok_or_else(|| anyhow!("missing operation"))?;
let kind = match operation.as_str() {
"update" => AssistantEditKind::Update {
old_text: old_text.ok_or_else(|| anyhow!("missing old_text"))?,
new_text: new_text.ok_or_else(|| anyhow!("missing new_text"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
},
"insert_before" => AssistantEditKind::InsertBefore {
old_text: old_text.ok_or_else(|| anyhow!("missing old_text"))?,
new_text: new_text.ok_or_else(|| anyhow!("missing new_text"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
},
"insert_after" => AssistantEditKind::InsertAfter {
old_text: old_text.ok_or_else(|| anyhow!("missing old_text"))?,
new_text: new_text.ok_or_else(|| anyhow!("missing new_text"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
},
"delete" => AssistantEditKind::Delete {
old_text: old_text.ok_or_else(|| anyhow!("missing old_text"))?,
},
"create" => AssistantEditKind::Create {
description: description.ok_or_else(|| anyhow!("missing description"))?,
new_text: new_text.ok_or_else(|| anyhow!("missing new_text"))?,
},
_ => Err(anyhow!("unknown operation {operation:?}"))?,
};
Ok(Self { path, kind })
}
pub async fn resolve(
&self,
project: Model<Project>,
mut cx: AsyncAppContext,
) -> Result<(Model<Buffer>, ResolvedEdit)> {
let path = self.path.clone();
let kind = self.kind.clone();
let buffer = project
.update(&mut cx, |project, cx| {
let project_path = project
.find_project_path(Path::new(&path), cx)
.or_else(|| {
// If we couldn't find a project path for it, put it in the active worktree
// so that when we create the buffer, it can be saved.
let worktree = project
.active_entry()
.and_then(|entry_id| project.worktree_for_entry(entry_id, cx))
.or_else(|| project.worktrees(cx).next())?;
let worktree = worktree.read(cx);
Some(ProjectPath {
worktree_id: worktree.id(),
path: Arc::from(Path::new(&path)),
})
})
.with_context(|| format!("worktree not found for {:?}", path))?;
anyhow::Ok(project.open_buffer(project_path, cx))
})??
.await?;
let snapshot = buffer.update(&mut cx, |buffer, _| buffer.snapshot())?;
let suggestion = cx
.background_executor()
.spawn(async move { kind.resolve(&snapshot) })
.await;
Ok((buffer, suggestion))
}
}
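A hedged usage sketch mirroring the test data earlier in this diff: the `<operation>` string selects the `AssistantEditKind` variant, and any missing required field surfaces as an error from `AssistantEdit::new`.

```rust
use anyhow::Result;

fn example() -> Result<()> {
    let edit = AssistantEdit::new(
        Some("src/lib.rs".into()),           // <path>
        Some("insert_after".into()),         // <operation>
        Some("fn one".into()),               // <old_text>
        Some("fn two() {}".into()),          // <new_text>
        Some("add a `two` function".into()), // <description>
    )?;
    assert_eq!(edit.path, "src/lib.rs");
    Ok(())
}
```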
impl AssistantEditKind {
fn resolve(self, snapshot: &BufferSnapshot) -> ResolvedEdit {
match self {
Self::Update {
old_text,
new_text,
description,
} => {
let range = Self::resolve_location(&snapshot, &old_text);
ResolvedEdit {
range,
new_text,
description: Some(description),
}
}
Self::Create {
new_text,
description,
} => ResolvedEdit {
range: text::Anchor::MIN..text::Anchor::MAX,
description: Some(description),
new_text,
},
Self::InsertBefore {
old_text,
mut new_text,
description,
} => {
let range = Self::resolve_location(&snapshot, &old_text);
new_text.push('\n');
ResolvedEdit {
range: range.start..range.start,
new_text,
description: Some(description),
}
}
Self::InsertAfter {
old_text,
mut new_text,
description,
} => {
let range = Self::resolve_location(&snapshot, &old_text);
new_text.insert(0, '\n');
ResolvedEdit {
range: range.end..range.end,
new_text,
description: Some(description),
}
}
Self::Delete { old_text } => {
let range = Self::resolve_location(&snapshot, &old_text);
ResolvedEdit {
range,
new_text: String::new(),
description: None,
}
}
}
}
fn resolve_location(buffer: &text::BufferSnapshot, search_query: &str) -> Range<text::Anchor> {
const INSERTION_COST: u32 = 3;
const WHITESPACE_INSERTION_COST: u32 = 1;
const DELETION_COST: u32 = 3;
const WHITESPACE_DELETION_COST: u32 = 1;
const EQUALITY_BONUS: u32 = 5;
struct Matrix {
cols: usize,
data: Vec<SearchState>,
}
impl Matrix {
fn new(rows: usize, cols: usize) -> Self {
Matrix {
cols,
data: vec![SearchState::new(0, SearchDirection::Diagonal); rows * cols],
}
}
fn get(&self, row: usize, col: usize) -> SearchState {
self.data[row * self.cols + col]
}
fn set(&mut self, row: usize, col: usize, cost: SearchState) {
self.data[row * self.cols + col] = cost;
}
}
let buffer_len = buffer.len();
let query_len = search_query.len();
let mut matrix = Matrix::new(query_len + 1, buffer_len + 1);
for (row, query_byte) in search_query.bytes().enumerate() {
for (col, buffer_byte) in buffer.bytes_in_range(0..buffer.len()).flatten().enumerate() {
let deletion_cost = if query_byte.is_ascii_whitespace() {
WHITESPACE_DELETION_COST
} else {
DELETION_COST
};
let insertion_cost = if buffer_byte.is_ascii_whitespace() {
WHITESPACE_INSERTION_COST
} else {
INSERTION_COST
};
let up = SearchState::new(
matrix.get(row, col + 1).score.saturating_sub(deletion_cost),
SearchDirection::Up,
);
let left = SearchState::new(
matrix
.get(row + 1, col)
.score
.saturating_sub(insertion_cost),
SearchDirection::Left,
);
let diagonal = SearchState::new(
if query_byte == *buffer_byte {
matrix.get(row, col).score.saturating_add(EQUALITY_BONUS)
} else {
matrix
.get(row, col)
.score
.saturating_sub(deletion_cost + insertion_cost)
},
SearchDirection::Diagonal,
);
matrix.set(row + 1, col + 1, up.max(left).max(diagonal));
}
}
// Traceback to find the best match
let mut best_buffer_end = buffer_len;
let mut best_score = 0;
for col in 1..=buffer_len {
let score = matrix.get(query_len, col).score;
if score > best_score {
best_score = score;
best_buffer_end = col;
}
}
let mut query_ix = query_len;
let mut buffer_ix = best_buffer_end;
while query_ix > 0 && buffer_ix > 0 {
let current = matrix.get(query_ix, buffer_ix);
match current.direction {
SearchDirection::Diagonal => {
query_ix -= 1;
buffer_ix -= 1;
}
SearchDirection::Up => {
query_ix -= 1;
}
SearchDirection::Left => {
buffer_ix -= 1;
}
}
}
let mut start = buffer.offset_to_point(buffer.clip_offset(buffer_ix, Bias::Left));
start.column = 0;
let mut end = buffer.offset_to_point(buffer.clip_offset(best_buffer_end, Bias::Right));
if end.column > 0 {
end.column = buffer.line_len(end.row);
}
buffer.anchor_after(start)..buffer.anchor_before(end)
}
}
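The search above is a local-alignment dynamic program (Smith–Waterman-flavored): whitespace edits are cheap, exact byte matches earn a bonus, and the best-scoring cell in the last row marks where the match ends. Below is a minimal sketch of the same scoring on plain byte strings, omitting the per-cell direction and traceback (the real code traces back to a start offset and then snaps the range to whole lines):

```rust
// Hedged sketch: compute the best-scoring end offset of `needle` in
// `haystack` using the same cost constants as resolve_location.
fn best_match_end(needle: &[u8], haystack: &[u8]) -> usize {
    const INSERTION_COST: u32 = 3;
    const WHITESPACE_INSERTION_COST: u32 = 1;
    const DELETION_COST: u32 = 3;
    const WHITESPACE_DELETION_COST: u32 = 1;
    const EQUALITY_BONUS: u32 = 5;

    let cols = haystack.len() + 1;
    // (needle.len() + 1) x cols score matrix, zero-initialized so an
    // alignment may begin anywhere in the haystack for free.
    let mut matrix = vec![0u32; (needle.len() + 1) * cols];
    for (row, &q) in needle.iter().enumerate() {
        for (col, &b) in haystack.iter().enumerate() {
            let deletion = if q.is_ascii_whitespace() {
                WHITESPACE_DELETION_COST
            } else {
                DELETION_COST
            };
            let insertion = if b.is_ascii_whitespace() {
                WHITESPACE_INSERTION_COST
            } else {
                INSERTION_COST
            };
            let up = matrix[row * cols + col + 1].saturating_sub(deletion);
            let left = matrix[(row + 1) * cols + col].saturating_sub(insertion);
            let diagonal = if q == b {
                matrix[row * cols + col].saturating_add(EQUALITY_BONUS)
            } else {
                matrix[row * cols + col].saturating_sub(deletion + insertion)
            };
            matrix[(row + 1) * cols + col + 1] = up.max(left).max(diagonal);
        }
    }
    // The best alignment may end anywhere in the haystack.
    let last_row = needle.len() * cols;
    (0..cols)
        .max_by_key(|&col| matrix[last_row + col])
        .unwrap_or(0)
}
```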
impl AssistantPatch {
pub(crate) async fn resolve(
&self,
project: Model<Project>,
cx: &mut AsyncAppContext,
) -> ResolvedPatch {
let mut resolve_tasks = Vec::new();
for (ix, edit) in self.edits.iter().enumerate() {
if let Ok(edit) = edit.as_ref() {
resolve_tasks.push(
edit.resolve(project.clone(), cx.clone())
.map_err(move |error| (ix, error)),
);
}
}
let edits = future::join_all(resolve_tasks).await;
let mut errors = Vec::new();
let mut edits_by_buffer = HashMap::default();
for entry in edits {
match entry {
Ok((buffer, edit)) => {
edits_by_buffer
.entry(buffer)
.or_insert_with(Vec::new)
.push(edit);
}
Err((edit_ix, error)) => errors.push(AssistantPatchResolutionError {
edit_ix,
message: error.to_string(),
}),
}
}
// Expand the context ranges of each edit and group edits with overlapping context ranges.
let mut edit_groups_by_buffer = HashMap::default();
for (buffer, edits) in edits_by_buffer {
if let Ok(snapshot) = buffer.update(cx, |buffer, _| buffer.text_snapshot()) {
edit_groups_by_buffer.insert(buffer, Self::group_edits(edits, &snapshot));
}
}
ResolvedPatch {
edit_groups: edit_groups_by_buffer,
errors,
}
}
fn group_edits(
mut edits: Vec<ResolvedEdit>,
snapshot: &text::BufferSnapshot,
) -> Vec<ResolvedEditGroup> {
let mut edit_groups = Vec::<ResolvedEditGroup>::new();
// Sort edits by their range so that earlier, larger ranges come first
edits.sort_by(|a, b| a.range.cmp(&b.range, &snapshot));
// Merge overlapping edits
edits.dedup_by(|a, b| b.try_merge(a, &snapshot));
// Create context ranges for each edit
for edit in edits {
let context_range = {
let edit_point_range = edit.range.to_point(&snapshot);
let start_row = edit_point_range.start.row.saturating_sub(5);
let end_row = cmp::min(edit_point_range.end.row + 5, snapshot.max_point().row);
let start = snapshot.anchor_before(Point::new(start_row, 0));
let end = snapshot.anchor_after(Point::new(end_row, snapshot.line_len(end_row)));
start..end
};
if let Some(last_group) = edit_groups.last_mut() {
if last_group
.context_range
.end
.cmp(&context_range.start, &snapshot)
.is_ge()
{
// Merge with the previous group if context ranges overlap
last_group.context_range.end = context_range.end;
last_group.edits.push(edit);
} else {
// Create a new group
edit_groups.push(ResolvedEditGroup {
context_range,
edits: vec![edit],
});
}
} else {
// Create the first group
edit_groups.push(ResolvedEditGroup {
context_range,
edits: vec![edit],
});
}
}
edit_groups
}
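A hedged sketch of the grouping rule on plain row ranges, assuming the same five-row context window: widen each edit, then coalesce groups whose context windows touch.

```rust
use std::ops::Range;

// Hypothetical helper: `edits` are sorted, non-overlapping row ranges.
fn group_rows(edits: &[Range<u32>], max_row: u32) -> Vec<Range<u32>> {
    let mut groups: Vec<Range<u32>> = Vec::new();
    for edit in edits {
        let context = edit.start.saturating_sub(5)..(edit.end + 5).min(max_row);
        match groups.last_mut() {
            // Overlapping or touching context windows collapse into one group.
            Some(last) if last.end >= context.start => last.end = context.end,
            _ => groups.push(context),
        }
    }
    groups
}

fn main() {
    // Edits at rows 10..11 and 14..15 share context rows, so they group:
    assert_eq!(group_rows(&[10..11, 14..15], 100), vec![5..20]);
}
```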
pub fn path_count(&self) -> usize {
self.paths().count()
}
pub fn paths(&self) -> impl '_ + Iterator<Item = &str> {
let mut prev_path = None;
self.edits.iter().filter_map(move |edit| {
if let Ok(edit) = edit {
let path = Some(edit.path.as_str());
if path != prev_path {
prev_path = path;
return path;
}
}
None
})
}
}
impl PartialEq for AssistantPatch {
fn eq(&self, other: &Self) -> bool {
self.range == other.range
&& self.title == other.title
&& Arc::ptr_eq(&self.edits, &other.edits)
}
}
impl Eq for AssistantPatch {}
#[cfg(test)]
mod tests {
use super::*;
use gpui::{AppContext, Context};
use language::{
language_settings::AllLanguageSettings, Language, LanguageConfig, LanguageMatcher,
};
use settings::SettingsStore;
use text::{OffsetRangeExt, Point};
use ui::BorrowAppContext;
use unindent::Unindent as _;
#[gpui::test]
fn test_resolve_location(cx: &mut AppContext) {
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
" Lorem\n",
" ipsum\n",
" dolor sit amet\n",
" consecteur",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
AssistantEditKind::resolve_location(&snapshot, "ipsum\ndolor").to_point(&snapshot),
Point::new(1, 0)..Point::new(2, 18)
);
}
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
"fn foo1(a: usize) -> usize {\n",
" 40\n",
"}\n",
"\n",
"fn foo2(b: usize) -> usize {\n",
" 42\n",
"}\n",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
AssistantEditKind::resolve_location(&snapshot, "fn foo1(b: usize) {\n40\n}")
.to_point(&snapshot),
Point::new(0, 0)..Point::new(2, 1)
);
}
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
"fn main() {\n",
" Foo\n",
" .bar()\n",
" .baz()\n",
" .qux()\n",
"}\n",
"\n",
"fn foo2(b: usize) -> usize {\n",
" 42\n",
"}\n",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
AssistantEditKind::resolve_location(&snapshot, "Foo.bar.baz.qux()")
.to_point(&snapshot),
Point::new(1, 0)..Point::new(4, 14)
);
}
}
#[gpui::test]
fn test_resolve_edits(cx: &mut AppContext) {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language::init(cx);
cx.update_global::<SettingsStore, _>(|settings, cx| {
settings.update_user_settings::<AllLanguageSettings>(cx, |_| {});
});
assert_edits(
"
/// A person
struct Person {
name: String,
age: usize,
}
/// A dog
struct Dog {
weight: f32,
}
impl Person {
fn name(&self) -> &str {
&self.name
}
}
"
.unindent(),
vec![
AssistantEditKind::Update {
old_text: "
name: String,
"
.unindent(),
new_text: "
first_name: String,
last_name: String,
"
.unindent(),
description: "".into(),
},
AssistantEditKind::Update {
old_text: "
fn name(&self) -> &str {
&self.name
}
"
.unindent(),
new_text: "
fn name(&self) -> String {
format!(\"{} {}\", self.first_name, self.last_name)
}
"
.unindent(),
description: "".into(),
},
],
"
/// A person
struct Person {
first_name: String,
last_name: String,
age: usize,
}
/// A dog
struct Dog {
weight: f32,
}
impl Person {
fn name(&self) -> String {
format!(\"{} {}\", self.first_name, self.last_name)
}
}
"
.unindent(),
cx,
);
// Ensure InsertBefore merges correctly with Update of the same text
assert_edits(
"
fn foo() {
}
"
.unindent(),
vec![
AssistantEditKind::InsertBefore {
old_text: "
fn foo() {"
.unindent(),
new_text: "
fn bar() {
qux();
}"
.unindent(),
description: "implement bar".into(),
},
AssistantEditKind::Update {
old_text: "
fn foo() {
}"
.unindent(),
new_text: "
fn foo() {
bar();
}"
.unindent(),
description: "call bar in foo".into(),
},
AssistantEditKind::InsertAfter {
old_text: "
fn foo() {
}
"
.unindent(),
new_text: "
fn qux() {
// todo
}
"
.unindent(),
description: "implement qux".into(),
},
],
"
fn bar() {
qux();
}
fn foo() {
bar();
}
fn qux() {
// todo
}
"
.unindent(),
cx,
);
}
#[track_caller]
fn assert_edits(
old_text: String,
edits: Vec<AssistantEditKind>,
new_text: String,
cx: &mut AppContext,
) {
let buffer =
cx.new_model(|cx| Buffer::local(old_text, cx).with_language(Arc::new(rust_lang()), cx));
let snapshot = buffer.read(cx).snapshot();
let resolved_edits = edits
.into_iter()
.map(|kind| kind.resolve(&snapshot))
.collect();
let edit_groups = AssistantPatch::group_edits(resolved_edits, &snapshot);
ResolvedPatch::apply_edit_groups(&edit_groups, &buffer, cx);
let actual_new_text = buffer.read(cx).text();
pretty_assertions::assert_eq!(actual_new_text, new_text);
}
fn rust_lang() -> Language {
Language::new(
LanguageConfig {
name: "Rust".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["rs".to_string()],
..Default::default()
},
..Default::default()
},
Some(language::tree_sitter_rust::LANGUAGE.into()),
)
.with_indents_query(
r#"
(call_expression) @indent
(field_expression) @indent
(_ "(" ")" @end) @indent
(_ "{" "}" @end) @indent
"#,
)
.unwrap()
}
}

View File

@@ -45,15 +45,6 @@ pub struct ProjectSlashCommandPromptContext {
pub context_buffer: String,
}
/// Context required to generate a workflow step resolution prompt.
#[derive(Debug, Serialize)]
pub struct StepResolutionContext {
/// The full context, including <step>...</step> tags
pub workflow_context: String,
/// The text of the specific step from the context to resolve
pub step_to_resolve: String,
}
pub struct PromptLoadingParams<'a> {
pub fs: Arc<dyn Fs>,
pub repo_path: Option<PathBuf>,

View File

@@ -1,3 +1,4 @@
use super::create_label_for_command;
use anyhow::{anyhow, Result};
use assistant_slash_command::{
AfterCompletion, ArgumentCompletion, SlashCommand, SlashCommandOutput,
@@ -6,9 +7,9 @@ use assistant_slash_command::{
use collections::HashMap;
use context_servers::{
manager::{ContextServer, ContextServerManager},
protocol::PromptInfo,
types::Prompt,
};
use gpui::{Task, WeakView, WindowContext};
use gpui::{AppContext, Task, WeakView, WindowContext};
use language::{BufferSnapshot, CodeLabel, LspAdapterDelegate};
use std::sync::atomic::AtomicBool;
use std::sync::Arc;
@@ -18,11 +19,11 @@ use workspace::Workspace;
pub struct ContextServerSlashCommand {
server_id: String,
prompt: PromptInfo,
prompt: Prompt,
}
impl ContextServerSlashCommand {
pub fn new(server: &Arc<ContextServer>, prompt: PromptInfo) -> Self {
pub fn new(server: &Arc<ContextServer>, prompt: Prompt) -> Self {
Self {
server_id: server.id.clone(),
prompt,
@@ -35,12 +36,28 @@ impl SlashCommand for ContextServerSlashCommand {
self.prompt.name.clone()
}
fn label(&self, cx: &AppContext) -> language::CodeLabel {
let mut parts = vec![self.prompt.name.as_str()];
if let Some(args) = &self.prompt.arguments {
if let Some(arg) = args.first() {
parts.push(arg.name.as_str());
}
}
create_label_for_command(&parts[0], &parts[1..], cx)
}
fn description(&self) -> String {
format!("Run context server command: {}", self.prompt.name)
match &self.prompt.description {
Some(desc) => desc.clone(),
None => format!("Run '{}' from {}", self.prompt.name, self.server_id),
}
}
fn menu_text(&self) -> String {
format!("Run '{}' from {}", self.prompt.name, self.server_id)
match &self.prompt.description {
Some(desc) => desc.clone(),
None => format!("Run '{}' from {}", self.prompt.name, self.server_id),
}
}
fn requires_argument(&self) -> bool {
@@ -154,7 +171,7 @@ impl SlashCommand for ContextServerSlashCommand {
}
}
fn completion_argument(prompt: &PromptInfo, arguments: &[String]) -> Result<(String, String)> {
fn completion_argument(prompt: &Prompt, arguments: &[String]) -> Result<(String, String)> {
if arguments.is_empty() {
return Err(anyhow!("No arguments given"));
}
@@ -170,7 +187,7 @@ fn completion_argument(prompt: &PromptInfo, arguments: &[String]) -> Result<(Str
}
}
fn prompt_arguments(prompt: &PromptInfo, arguments: &[String]) -> Result<HashMap<String, String>> {
fn prompt_arguments(prompt: &Prompt, arguments: &[String]) -> Result<HashMap<String, String>> {
match &prompt.arguments {
Some(args) if args.len() > 1 => Err(anyhow!(
"Prompt has more than one argument, which is not supported"
@@ -199,7 +216,7 @@ fn prompt_arguments(prompt: &PromptInfo, arguments: &[String]) -> Result<HashMap
/// MCP servers can return prompts with multiple arguments. Since we only
/// support one argument, this predicate checks that a prompt declares at
/// most one argument.
pub fn acceptable_prompt(prompt: &PromptInfo) -> bool {
pub fn acceptable_prompt(prompt: &Prompt) -> bool {
match &prompt.arguments {
None => true,
Some(args) if args.len() <= 1 => true,

View File

@@ -18,6 +18,8 @@ pub(crate) struct WorkflowSlashCommand {
}
impl WorkflowSlashCommand {
pub const NAME: &'static str = "workflow";
pub fn new(prompt_builder: Arc<PromptBuilder>) -> Self {
Self { prompt_builder }
}
@@ -25,7 +27,7 @@ impl WorkflowSlashCommand {
impl SlashCommand for WorkflowSlashCommand {
fn name(&self) -> String {
"workflow".into()
Self::NAME.into()
}
fn description(&self) -> String {

View File

@@ -38,7 +38,10 @@ impl Settings for SlashCommandSettings {
fn load(sources: SettingsSources<Self::FileContent>, _cx: &mut AppContext) -> Result<Self> {
SettingsSources::<Self::FileContent>::json_merge_with(
[sources.default].into_iter().chain(sources.user),
[sources.default]
.into_iter()
.chain(sources.user)
.chain(sources.server),
)
}
}

View File

@@ -414,7 +414,7 @@ impl TerminalInlineAssist {
struct InlineAssistantError;
let id =
NotificationId::identified::<InlineAssistantError>(
NotificationId::composite::<InlineAssistantError>(
assist_id.0,
);

View File

@@ -1,507 +0,0 @@
use crate::{AssistantPanel, InlineAssistId, InlineAssistant};
use anyhow::{anyhow, Context as _, Result};
use collections::HashMap;
use editor::Editor;
use gpui::AsyncAppContext;
use gpui::{Model, Task, UpdateGlobal as _, View, WeakView, WindowContext};
use language::{Buffer, BufferSnapshot};
use project::{Project, ProjectPath};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::{ops::Range, path::Path, sync::Arc};
use text::Bias;
use workspace::Workspace;
#[derive(Debug)]
pub(crate) struct WorkflowStep {
pub range: Range<language::Anchor>,
pub leading_tags_end: text::Anchor,
pub trailing_tag_start: Option<text::Anchor>,
pub edits: Arc<[Result<WorkflowStepEdit>]>,
pub resolution_task: Option<Task<()>>,
pub resolution: Option<Arc<Result<WorkflowStepResolution>>>,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) struct WorkflowStepEdit {
pub path: String,
pub kind: WorkflowStepEditKind,
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub(crate) struct WorkflowStepResolution {
pub title: String,
pub suggestion_groups: HashMap<Model<Buffer>, Vec<WorkflowSuggestionGroup>>,
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub struct WorkflowSuggestionGroup {
pub context_range: Range<language::Anchor>,
pub suggestions: Vec<WorkflowSuggestion>,
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub enum WorkflowSuggestion {
Update {
range: Range<language::Anchor>,
description: String,
},
CreateFile {
description: String,
},
InsertBefore {
position: language::Anchor,
description: String,
},
InsertAfter {
position: language::Anchor,
description: String,
},
Delete {
range: Range<language::Anchor>,
},
}
impl WorkflowSuggestion {
pub fn range(&self) -> Range<language::Anchor> {
match self {
Self::Update { range, .. } => range.clone(),
Self::CreateFile { .. } => language::Anchor::MIN..language::Anchor::MAX,
Self::InsertBefore { position, .. } | Self::InsertAfter { position, .. } => {
*position..*position
}
Self::Delete { range, .. } => range.clone(),
}
}
pub fn description(&self) -> Option<&str> {
match self {
Self::Update { description, .. }
| Self::CreateFile { description }
| Self::InsertBefore { description, .. }
| Self::InsertAfter { description, .. } => Some(description),
Self::Delete { .. } => None,
}
}
fn description_mut(&mut self) -> Option<&mut String> {
match self {
Self::Update { description, .. }
| Self::CreateFile { description }
| Self::InsertBefore { description, .. }
| Self::InsertAfter { description, .. } => Some(description),
Self::Delete { .. } => None,
}
}
pub fn try_merge(&mut self, other: &Self, buffer: &BufferSnapshot) -> bool {
let range = self.range();
let other_range = other.range();
// Don't merge if we don't contain the other suggestion.
if range.start.cmp(&other_range.start, buffer).is_gt()
|| range.end.cmp(&other_range.end, buffer).is_lt()
{
return false;
}
if let Some(description) = self.description_mut() {
if let Some(other_description) = other.description() {
description.push('\n');
description.push_str(other_description);
}
}
true
}
pub fn show(
&self,
editor: &View<Editor>,
excerpt_id: editor::ExcerptId,
workspace: &WeakView<Workspace>,
assistant_panel: &View<AssistantPanel>,
cx: &mut WindowContext,
) -> Option<InlineAssistId> {
let mut initial_transaction_id = None;
let initial_prompt;
let suggestion_range;
let buffer = editor.read(cx).buffer().clone();
let snapshot = buffer.read(cx).snapshot(cx);
match self {
Self::Update {
range, description, ..
} => {
initial_prompt = description.clone();
suggestion_range = snapshot.anchor_in_excerpt(excerpt_id, range.start)?
..snapshot.anchor_in_excerpt(excerpt_id, range.end)?;
}
Self::CreateFile { description } => {
initial_prompt = description.clone();
suggestion_range = editor::Anchor::min()..editor::Anchor::min();
}
Self::InsertBefore {
position,
description,
..
} => {
let position = snapshot.anchor_in_excerpt(excerpt_id, *position)?;
initial_prompt = description.clone();
suggestion_range = buffer.update(cx, |buffer, cx| {
buffer.start_transaction(cx);
let line_start = buffer.insert_empty_line(position, true, true, cx);
initial_transaction_id = buffer.end_transaction(cx);
buffer.refresh_preview(cx);
let line_start = buffer.read(cx).anchor_before(line_start);
line_start..line_start
});
}
Self::InsertAfter {
position,
description,
..
} => {
let position = snapshot.anchor_in_excerpt(excerpt_id, *position)?;
initial_prompt = description.clone();
suggestion_range = buffer.update(cx, |buffer, cx| {
buffer.start_transaction(cx);
let line_start = buffer.insert_empty_line(position, true, true, cx);
initial_transaction_id = buffer.end_transaction(cx);
buffer.refresh_preview(cx);
let line_start = buffer.read(cx).anchor_before(line_start);
line_start..line_start
});
}
Self::Delete { range, .. } => {
initial_prompt = "Delete".to_string();
suggestion_range = snapshot.anchor_in_excerpt(excerpt_id, range.start)?
..snapshot.anchor_in_excerpt(excerpt_id, range.end)?;
}
}
InlineAssistant::update_global(cx, |inline_assistant, cx| {
Some(inline_assistant.suggest_assist(
editor,
suggestion_range,
initial_prompt,
initial_transaction_id,
false,
Some(workspace.clone()),
Some(assistant_panel),
cx,
))
})
}
}
impl WorkflowStepEdit {
pub fn new(
path: Option<String>,
operation: Option<String>,
search: Option<String>,
description: Option<String>,
) -> Result<Self> {
let path = path.ok_or_else(|| anyhow!("missing path"))?;
let operation = operation.ok_or_else(|| anyhow!("missing operation"))?;
let kind = match operation.as_str() {
"update" => WorkflowStepEditKind::Update {
search: search.ok_or_else(|| anyhow!("missing search"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
},
"insert_before" => WorkflowStepEditKind::InsertBefore {
search: search.ok_or_else(|| anyhow!("missing search"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
},
"insert_after" => WorkflowStepEditKind::InsertAfter {
search: search.ok_or_else(|| anyhow!("missing search"))?,
description: description.ok_or_else(|| anyhow!("missing description"))?,
},
"delete" => WorkflowStepEditKind::Delete {
search: search.ok_or_else(|| anyhow!("missing search"))?,
},
"create" => WorkflowStepEditKind::Create {
description: description.ok_or_else(|| anyhow!("missing description"))?,
},
_ => Err(anyhow!("unknown operation {operation:?}"))?,
};
Ok(Self { path, kind })
}
pub async fn resolve(
&self,
project: Model<Project>,
mut cx: AsyncAppContext,
) -> Result<(Model<Buffer>, super::WorkflowSuggestion)> {
let path = self.path.clone();
let kind = self.kind.clone();
let buffer = project
.update(&mut cx, |project, cx| {
let project_path = project
.find_project_path(Path::new(&path), cx)
.or_else(|| {
// If we couldn't find a project path for it, put it in the active worktree
// so that when we create the buffer, it can be saved.
let worktree = project
.active_entry()
.and_then(|entry_id| project.worktree_for_entry(entry_id, cx))
.or_else(|| project.worktrees(cx).next())?;
let worktree = worktree.read(cx);
Some(ProjectPath {
worktree_id: worktree.id(),
path: Arc::from(Path::new(&path)),
})
})
.with_context(|| format!("worktree not found for {:?}", path))?;
anyhow::Ok(project.open_buffer(project_path, cx))
})??
.await?;
let snapshot = buffer.update(&mut cx, |buffer, _| buffer.snapshot())?;
let suggestion = cx
.background_executor()
.spawn(async move {
match kind {
WorkflowStepEditKind::Update {
search,
description,
} => {
let range = Self::resolve_location(&snapshot, &search);
WorkflowSuggestion::Update { range, description }
}
WorkflowStepEditKind::Create { description } => {
WorkflowSuggestion::CreateFile { description }
}
WorkflowStepEditKind::InsertBefore {
search,
description,
} => {
let range = Self::resolve_location(&snapshot, &search);
WorkflowSuggestion::InsertBefore {
position: range.start,
description,
}
}
WorkflowStepEditKind::InsertAfter {
search,
description,
} => {
let range = Self::resolve_location(&snapshot, &search);
WorkflowSuggestion::InsertAfter {
position: range.end,
description,
}
}
WorkflowStepEditKind::Delete { search } => {
let range = Self::resolve_location(&snapshot, &search);
WorkflowSuggestion::Delete { range }
}
}
})
.await;
Ok((buffer, suggestion))
}
fn resolve_location(buffer: &text::BufferSnapshot, search_query: &str) -> Range<text::Anchor> {
const INSERTION_SCORE: f64 = -1.0;
const DELETION_SCORE: f64 = -1.0;
const REPLACEMENT_SCORE: f64 = -1.0;
const EQUALITY_SCORE: f64 = 5.0;
struct Matrix {
cols: usize,
data: Vec<f64>,
}
impl Matrix {
fn new(rows: usize, cols: usize) -> Self {
Matrix {
cols,
data: vec![0.0; rows * cols],
}
}
fn get(&self, row: usize, col: usize) -> f64 {
self.data[row * self.cols + col]
}
fn set(&mut self, row: usize, col: usize, value: f64) {
self.data[row * self.cols + col] = value;
}
}
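// The search below is a Smith-Waterman-style local alignment over bytes:
// clamping cell scores at zero lets a match start anywhere in the buffer,
// and the traceback from the best-scoring cell recovers the matched span,
// which is then widened to whole lines.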
let buffer_len = buffer.len();
let query_len = search_query.len();
let mut matrix = Matrix::new(query_len + 1, buffer_len + 1);
for (i, query_byte) in search_query.bytes().enumerate() {
for (j, buffer_byte) in buffer.bytes_in_range(0..buffer.len()).flatten().enumerate() {
let match_score = if query_byte == *buffer_byte {
EQUALITY_SCORE
} else {
REPLACEMENT_SCORE
};
let up = matrix.get(i + 1, j) + DELETION_SCORE;
let left = matrix.get(i, j + 1) + INSERTION_SCORE;
let diagonal = matrix.get(i, j) + match_score;
let score = up.max(left.max(diagonal)).max(0.);
matrix.set(i + 1, j + 1, score);
}
}
// Traceback to find the best match
let mut best_buffer_end = buffer_len;
let mut best_score = 0.0;
for col in 1..=buffer_len {
let score = matrix.get(query_len, col);
if score > best_score {
best_score = score;
best_buffer_end = col;
}
}
let mut query_ix = query_len;
let mut buffer_ix = best_buffer_end;
while query_ix > 0 && buffer_ix > 0 {
let current = matrix.get(query_ix, buffer_ix);
let up = matrix.get(query_ix - 1, buffer_ix);
let left = matrix.get(query_ix, buffer_ix - 1);
if current == left + INSERTION_SCORE {
buffer_ix -= 1;
} else if current == up + DELETION_SCORE {
query_ix -= 1;
} else {
query_ix -= 1;
buffer_ix -= 1;
}
}
let mut start = buffer.offset_to_point(buffer.clip_offset(buffer_ix, Bias::Left));
start.column = 0;
let mut end = buffer.offset_to_point(buffer.clip_offset(best_buffer_end, Bias::Right));
end.column = buffer.line_len(end.row);
buffer.anchor_after(start)..buffer.anchor_before(end)
}
}
#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize, JsonSchema)]
#[serde(tag = "operation")]
pub enum WorkflowStepEditKind {
/// Rewrites the specified text entirely based on the given description.
/// This operation completely replaces the given text.
Update {
/// A string in the source text to apply the update to.
search: String,
/// A brief description of the transformation to apply to the symbol.
description: String,
},
/// Creates a new file with the given path based on the provided description.
/// This operation adds a new file to the codebase.
Create {
/// A brief description of the file to be created.
description: String,
},
/// Inserts text before the specified text in the source file.
InsertBefore {
/// A string in the source text to insert text before.
search: String,
/// A brief description of how the new text should be generated.
description: String,
},
/// Inserts text after the specified text in the source file.
InsertAfter {
/// A string in the source text to insert text after.
search: String,
/// A brief description of how the new text should be generated.
description: String,
},
/// Deletes the specified symbol from the containing file.
Delete {
/// A string in the source text to delete.
search: String,
},
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::{AppContext, Context};
use text::{OffsetRangeExt, Point};
#[gpui::test]
fn test_resolve_location(cx: &mut AppContext) {
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
" Lorem\n",
" ipsum\n",
" dolor sit amet\n",
" consecteur",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
WorkflowStepEdit::resolve_location(&snapshot, "ipsum\ndolor").to_point(&snapshot),
Point::new(1, 0)..Point::new(2, 18)
);
}
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
"fn foo1(a: usize) -> usize {\n",
" 42\n",
"}\n",
"\n",
"fn foo2(b: usize) -> usize {\n",
" 42\n",
"}\n",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
WorkflowStepEdit::resolve_location(&snapshot, "fn foo1(b: usize) {\n42\n}")
.to_point(&snapshot),
Point::new(0, 0)..Point::new(2, 1)
);
}
{
let buffer = cx.new_model(|cx| {
Buffer::local(
concat!(
"fn main() {\n",
" Foo\n",
" .bar()\n",
" .baz()\n",
" .qux()\n",
"}\n",
"\n",
"fn foo2(b: usize) -> usize {\n",
" 42\n",
"}\n",
),
cx,
)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
WorkflowStepEdit::resolve_location(&snapshot, "Foo.bar.baz.qux()")
.to_point(&snapshot),
Point::new(1, 0)..Point::new(4, 14)
);
}
}
}

View File

@@ -130,7 +130,7 @@ impl Settings for AutoUpdateSetting {
type FileContent = Option<AutoUpdateSettingContent>;
fn load(sources: SettingsSources<Self::FileContent>, _: &mut AppContext) -> Result<Self> {
let auto_update = [sources.release_channel, sources.user]
let auto_update = [sources.server, sources.release_channel, sources.user]
.into_iter()
.find_map(|value| value.copied().flatten())
.unwrap_or(sources.default.ok_or_else(Self::missing_default)?);
@@ -464,6 +464,7 @@ impl AutoUpdater {
smol::fs::create_dir_all(&platform_dir).await.ok();
let client = this.read_with(cx, |this, _| this.http_client.clone())?;
if smol::fs::metadata(&version_path).await.is_err() {
log::info!("downloading zed-remote-server {os} {arch}");
download_remote_server_binary(&version_path, release, client, cx).await?;

View File

@@ -34,8 +34,8 @@ postage.workspace = true
rand.workspace = true
release_channel.workspace = true
rpc = { workspace = true, features = ["gpui"] }
rustls.workspace = true
rustls-native-certs.workspace = true
rustls.workspace = true
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true

View File

@@ -4,6 +4,7 @@ pub mod test;
mod socks;
pub mod telemetry;
pub mod user;
pub mod zed_urls;
use anyhow::{anyhow, bail, Context as _, Result};
use async_recursion::async_recursion;
@@ -141,6 +142,7 @@ impl Settings for ProxySettings {
Ok(Self {
proxy: sources
.user
.or(sources.server)
.and_then(|value| value.proxy.clone())
.or(sources.default.proxy.clone()),
})
@@ -472,15 +474,21 @@ impl settings::Settings for TelemetrySettings {
fn load(sources: SettingsSources<Self::FileContent>, _: &mut AppContext) -> Result<Self> {
Ok(Self {
diagnostics: sources.user.as_ref().and_then(|v| v.diagnostics).unwrap_or(
sources
.default
.diagnostics
.ok_or_else(Self::missing_default)?,
),
diagnostics: sources
.user
.as_ref()
.or(sources.server.as_ref())
.and_then(|v| v.diagnostics)
.unwrap_or(
sources
.default
.diagnostics
.ok_or_else(Self::missing_default)?,
),
metrics: sources
.user
.as_ref()
.or(sources.server.as_ref())
.and_then(|v| v.metrics)
.unwrap_or(sources.default.metrics.ok_or_else(Self::missing_default)?),
})
@@ -1023,7 +1031,7 @@ impl Client {
&self,
http: Arc<HttpClientWithUrl>,
release_channel: Option<ReleaseChannel>,
) -> impl Future<Output = Result<Url>> {
) -> impl Future<Output = Result<url::Url>> {
#[cfg(any(test, feature = "test-support"))]
let url_override = self.rpc_url.read().clone();
@@ -1117,7 +1125,7 @@ impl Client {
// for us from the RPC URL.
//
// Among other things, it will generate and set a `Sec-WebSocket-Key` header for us.
let mut request = rpc_url.into_client_request()?;
let mut request = IntoClientRequest::into_client_request(rpc_url.as_str())?;
// We then modify the request to add our desired headers.
let request_headers = request.headers_mut();
@@ -1156,6 +1164,7 @@ impl Client {
.with_root_certificates(root_store)
.with_no_client_auth()
};
let (stream, _) =
async_tungstenite::async_tls::client_async_tls_with_connector(
request,

View File

@@ -0,0 +1,19 @@
//! Contains helper functions for constructing URLs to various Zed-related pages.
//!
//! These URLs will adapt to the configured server URL in order to construct
//! links appropriate for the environment (e.g., by linking to a local copy of
//! zed.dev in development).
use gpui::AppContext;
use settings::Settings;
use crate::ClientSettings;
fn server_url(cx: &AppContext) -> &str {
&ClientSettings::get_global(cx).server_url
}
/// Returns the URL to the account page on zed.dev.
pub fn account_url(cx: &AppContext) -> String {
format!("{server_url}/account", server_url = server_url(cx))
}
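
A hypothetical call site for the new helper, assuming gpui's `AppContext::open_url` is available:

// Sketch: open the account page from anywhere with an AppContext.
fn open_account_page(cx: &AppContext) {
    cx.open_url(&zed_urls::account_url(cx));
}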

View File

@@ -38,7 +38,6 @@ futures.workspace = true
google_ai.workspace = true
hex.workspace = true
http_client.workspace = true
isahc_http_client.workspace = true
jsonwebtoken.workspace = true
live_kit_server.workspace = true
log.workspace = true
@@ -49,6 +48,7 @@ prometheus = "0.13"
prost.workspace = true
rand.workspace = true
reqwest = { version = "0.11", features = ["json"] }
reqwest_client.workspace = true
rpc.workspace = true
rustc-demangle.workspace = true
scrypt = "0.11"
@@ -67,7 +67,7 @@ telemetry_events.workspace = true
text.workspace = true
thiserror.workspace = true
time.workspace = true
tokio.workspace = true
tokio = { workspace = true, features = ["full"] }
toml.workspace = true
tower = "0.4"
tower-http = { workspace = true, features = ["trace"] }

View File

@@ -199,6 +199,12 @@ spec:
secretKeyRef:
name: slack
key: panics_webhook
- name: STRIPE_API_KEY
valueFrom:
secretKeyRef:
name: stripe
key: api_key
optional: true
- name: COMPLETE_WITH_LANGUAGE_MODEL_RATE_LIMIT_PER_HOUR
value: "1000"
- name: SUPERMAVEN_ADMIN_API_KEY

View File

@@ -78,10 +78,10 @@ CREATE TABLE "worktree_entries" (
"id" INTEGER NOT NULL,
"is_dir" BOOL NOT NULL,
"path" VARCHAR NOT NULL,
"canonical_path" TEXT,
"inode" INTEGER NOT NULL,
"mtime_seconds" INTEGER NOT NULL,
"mtime_nanos" INTEGER NOT NULL,
"is_symlink" BOOL NOT NULL,
"is_external" BOOL NOT NULL,
"is_ignored" BOOL NOT NULL,
"is_deleted" BOOL NOT NULL,

View File

@@ -0,0 +1,2 @@
ALTER TABLE worktree_entries ADD COLUMN canonical_path text;
ALTER TABLE worktree_entries ALTER COLUMN is_symlink SET DEFAULT false;

View File

@@ -0,0 +1,12 @@
create table billing_events (
id serial primary key,
idempotency_key uuid not null default gen_random_uuid(),
user_id integer not null,
model_id integer not null references models (id) on delete cascade,
input_tokens bigint not null default 0,
input_cache_creation_tokens bigint not null default 0,
input_cache_read_tokens bigint not null default 0,
output_tokens bigint not null default 0
);
create index uix_billing_events_on_user_id_model_id on billing_events (user_id, model_id);

View File

@@ -1,7 +1,3 @@
use std::str::FromStr;
use std::sync::Arc;
use std::time::Duration;
use anyhow::{anyhow, bail, Context};
use axum::{
extract::{self, Query},
@@ -9,28 +5,35 @@ use axum::{
Extension, Json, Router,
};
use chrono::{DateTime, SecondsFormat, Utc};
use collections::HashSet;
use reqwest::StatusCode;
use sea_orm::ActiveValue;
use serde::{Deserialize, Serialize};
use std::{str::FromStr, sync::Arc, time::Duration};
use stripe::{
BillingPortalSession, CheckoutSession, CreateBillingPortalSession,
CreateBillingPortalSessionFlowData, CreateBillingPortalSessionFlowDataAfterCompletion,
BillingPortalSession, CreateBillingPortalSession, CreateBillingPortalSessionFlowData,
CreateBillingPortalSessionFlowDataAfterCompletion,
CreateBillingPortalSessionFlowDataAfterCompletionRedirect,
CreateBillingPortalSessionFlowDataType, CreateCheckoutSession, CreateCheckoutSessionLineItems,
CreateCustomer, Customer, CustomerId, EventObject, EventType, Expandable, ListEvents,
Subscription, SubscriptionId, SubscriptionStatus,
CreateBillingPortalSessionFlowDataType, CreateCustomer, Customer, CustomerId, EventObject,
EventType, Expandable, ListEvents, Subscription, SubscriptionId, SubscriptionStatus,
};
use util::ResultExt;
use crate::db::billing_subscription::{self, StripeSubscriptionStatus};
use crate::db::{
billing_customer, BillingSubscriptionId, CreateBillingCustomerParams,
CreateBillingSubscriptionParams, CreateProcessedStripeEventParams, UpdateBillingCustomerParams,
UpdateBillingPreferencesParams, UpdateBillingSubscriptionParams,
};
use crate::llm::db::LlmDatabase;
use crate::llm::{DEFAULT_MAX_MONTHLY_SPEND, FREE_TIER_MONTHLY_SPENDING_LIMIT};
use crate::rpc::ResultExt as _;
use crate::rpc::{ResultExt as _, Server};
use crate::{
db::{
billing_customer, BillingSubscriptionId, CreateBillingCustomerParams,
CreateBillingSubscriptionParams, CreateProcessedStripeEventParams,
UpdateBillingCustomerParams, UpdateBillingPreferencesParams,
UpdateBillingSubscriptionParams,
},
stripe_billing::StripeBilling,
};
use crate::{
db::{billing_subscription::StripeSubscriptionStatus, UserId},
llm::db::LlmDatabase,
};
use crate::{AppState, Error, Result};
pub fn router() -> Router {
@@ -47,6 +50,7 @@ pub fn router() -> Router {
"/billing/subscriptions/manage",
post(manage_billing_subscription),
)
.route("/billing/monthly_spend", get(get_monthly_spend))
}
#[derive(Debug, Deserialize)]
@@ -87,6 +91,7 @@ struct UpdateBillingPreferencesBody {
async fn update_billing_preferences(
Extension(app): Extension<Arc<AppState>>,
Extension(rpc_server): Extension<Arc<crate::rpc::Server>>,
extract::Json(body): extract::Json<UpdateBillingPreferencesBody>,
) -> Result<Json<BillingPreferencesResponse>> {
let user = app
@@ -119,6 +124,8 @@ async fn update_billing_preferences(
.await?
};
rpc_server.refresh_llm_tokens_for_user(user.id).await;
Ok(Json(BillingPreferencesResponse {
max_monthly_llm_usage_spending_in_cents: billing_preferences
.max_monthly_llm_usage_spending_in_cents,
@@ -197,12 +204,22 @@ async fn create_billing_subscription(
.await?
.ok_or_else(|| anyhow!("user not found"))?;
let Some((stripe_client, stripe_access_price_id)) = app
.stripe_client
.clone()
.zip(app.config.stripe_llm_access_price_id.clone())
else {
log::error!("failed to retrieve Stripe client or price ID");
let Some(stripe_client) = app.stripe_client.clone() else {
log::error!("failed to retrieve Stripe client");
Err(Error::http(
StatusCode::NOT_IMPLEMENTED,
"not supported".into(),
))?
};
let Some(stripe_billing) = app.stripe_billing.clone() else {
log::error!("failed to retrieve Stripe billing object");
Err(Error::http(
StatusCode::NOT_IMPLEMENTED,
"not supported".into(),
))?
};
let Some(llm_db) = app.llm_db.clone() else {
log::error!("failed to retrieve LLM database");
Err(Error::http(
StatusCode::NOT_IMPLEMENTED,
"not supported".into(),
@@ -226,26 +243,14 @@ async fn create_billing_subscription(
customer.id
};
let checkout_session = {
let mut params = CreateCheckoutSession::new();
params.mode = Some(stripe::CheckoutSessionMode::Subscription);
params.customer = Some(customer_id);
params.client_reference_id = Some(user.github_login.as_str());
params.line_items = Some(vec![CreateCheckoutSessionLineItems {
price: Some(stripe_access_price_id.to_string()),
quantity: Some(1),
..Default::default()
}]);
let success_url = format!("{}/account", app.config.zed_dot_dev_url());
params.success_url = Some(&success_url);
CheckoutSession::create(&stripe_client, params).await?
};
let default_model = llm_db.model(rpc::LanguageModelProvider::Anthropic, "claude-3-5-sonnet")?;
let stripe_model = stripe_billing.register_model(default_model).await?;
let success_url = format!("{}/account", app.config.zed_dot_dev_url());
let checkout_session_url = stripe_billing
.checkout(customer_id, &user.github_login, &stripe_model, &success_url)
.await?;
Ok(Json(CreateBillingSubscriptionResponse {
checkout_session_url: checkout_session
.url
.ok_or_else(|| anyhow!("no checkout session URL"))?,
checkout_session_url,
}))
}
@@ -400,7 +405,7 @@ const NUMBER_OF_ALREADY_PROCESSED_PAGES_BEFORE_WE_STOP: usize = 4;
/// Polls the Stripe events API periodically to reconcile the records in our
/// database with the data in Stripe.
pub fn poll_stripe_events_periodically(app: Arc<AppState>) {
pub fn poll_stripe_events_periodically(app: Arc<AppState>, rpc_server: Arc<Server>) {
let Some(stripe_client) = app.stripe_client.clone() else {
log::warn!("failed to retrieve Stripe client");
return;
@@ -411,7 +416,9 @@ pub fn poll_stripe_events_periodically(app: Arc<AppState>) {
let executor = executor.clone();
async move {
loop {
poll_stripe_events(&app, &stripe_client).await.log_err();
poll_stripe_events(&app, &rpc_server, &stripe_client)
.await
.log_err();
executor.sleep(POLL_EVENTS_INTERVAL).await;
}
@@ -421,6 +428,7 @@ pub fn poll_stripe_events_periodically(app: Arc<AppState>) {
async fn poll_stripe_events(
app: &Arc<AppState>,
rpc_server: &Arc<Server>,
stripe_client: &stripe::Client,
) -> anyhow::Result<()> {
fn event_type_to_string(event_type: EventType) -> String {
@@ -445,29 +453,28 @@ async fn poll_stripe_events(
let mut pages_of_already_processed_events = 0;
let mut unprocessed_events = Vec::new();
log::info!(
"Stripe events: starting retrieval for {}",
event_types.join(", ")
);
let mut params = ListEvents::new();
params.types = Some(event_types.clone());
params.limit = Some(EVENTS_LIMIT_PER_PAGE);
let mut event_pages = stripe::Event::list(&stripe_client, &params)
.await?
.paginate(params);
loop {
if pages_of_already_processed_events >= NUMBER_OF_ALREADY_PROCESSED_PAGES_BEFORE_WE_STOP {
log::info!("saw {pages_of_already_processed_events} pages of already-processed events: stopping event retrieval");
break;
}
log::info!("retrieving events from Stripe: {}", event_types.join(", "));
let mut params = ListEvents::new();
params.types = Some(event_types.clone());
params.limit = Some(EVENTS_LIMIT_PER_PAGE);
let events = stripe::Event::list(stripe_client, &params).await?;
let processed_event_ids = {
let event_ids = &events
let event_ids = event_pages
.page
.data
.iter()
.map(|event| event.id.as_str())
.collect::<Vec<_>>();
app.db
.get_processed_stripe_events_by_event_ids(event_ids)
.get_processed_stripe_events_by_event_ids(&event_ids)
.await?
.into_iter()
.map(|event| event.stripe_event_id)
@@ -475,13 +482,13 @@ async fn poll_stripe_events(
};
let mut processed_events_in_page = 0;
let events_in_page = events.data.len();
for event in events.data {
let events_in_page = event_pages.page.data.len();
for event in &event_pages.page.data {
if processed_event_ids.contains(&event.id.to_string()) {
processed_events_in_page += 1;
log::debug!("Stripe event {} already processed: skipping", event.id);
log::debug!("Stripe events: already processed '{}', skipping", event.id);
} else {
unprocessed_events.push(event);
unprocessed_events.push(event.clone());
}
}
@@ -489,15 +496,21 @@ async fn poll_stripe_events(
pages_of_already_processed_events += 1;
}
if !events.has_more {
if event_pages.page.has_more {
if pages_of_already_processed_events >= NUMBER_OF_ALREADY_PROCESSED_PAGES_BEFORE_WE_STOP
{
log::info!("Stripe events: stopping, saw {pages_of_already_processed_events} pages of already-processed events");
break;
} else {
log::info!("Stripe events: retrieving next page");
event_pages = event_pages.next(&stripe_client).await?;
}
} else {
break;
}
}
log::info!(
"unprocessed events from Stripe: {}",
unprocessed_events.len()
);
log::info!("Stripe events: unprocessed {}", unprocessed_events.len());
// Sort all of the unprocessed events in ascending order, so we can handle them in the order they occurred.
unprocessed_events.sort_by(|a, b| a.created.cmp(&b.created).then_with(|| a.id.cmp(&b.id)));
@@ -513,12 +526,12 @@ async fn poll_stripe_events(
// If the event has happened too far in the past, we don't want to
// process it and risk overwriting other more-recent updates.
//
// 1 hour was chosen arbitrarily. This could be made longer or shorter.
let one_hour = Duration::from_secs(60 * 60);
let an_hour_ago = Utc::now() - one_hour;
if an_hour_ago.timestamp() > event.created {
// 1 day was chosen arbitrarily. This could be made longer or shorter.
let one_day = Duration::from_secs(24 * 60 * 60);
let a_day_ago = Utc::now() - one_day;
if a_day_ago.timestamp() > event.created {
log::info!(
"Stripe event {} is more than {one_hour:?} old, marking as processed",
"Stripe events: event '{}' is more than {one_day:?} old, marking as processed",
event_id
);
app.db
@@ -537,7 +550,7 @@ async fn poll_stripe_events(
| EventType::CustomerSubscriptionPaused
| EventType::CustomerSubscriptionResumed
| EventType::CustomerSubscriptionDeleted => {
handle_customer_subscription_event(app, stripe_client, event).await
handle_customer_subscription_event(app, rpc_server, stripe_client, event).await
}
_ => Ok(()),
};
@@ -605,6 +618,7 @@ async fn handle_customer_event(
async fn handle_customer_subscription_event(
app: &Arc<AppState>,
rpc_server: &Arc<Server>,
stripe_client: &stripe::Client,
event: stripe::Event,
) -> anyhow::Result<()> {
@@ -650,9 +664,52 @@ async fn handle_customer_subscription_event(
.await?;
}
// When the user's subscription changes, we want to refresh their LLM tokens
// to either grant or revoke access.
rpc_server
.refresh_llm_tokens_for_user(billing_customer.user_id)
.await;
Ok(())
}
#[derive(Debug, Deserialize)]
struct GetMonthlySpendParams {
github_user_id: i32,
}
#[derive(Debug, Serialize)]
struct GetMonthlySpendResponse {
monthly_spend_in_cents: i32,
}
async fn get_monthly_spend(
Extension(app): Extension<Arc<AppState>>,
Query(params): Query<GetMonthlySpendParams>,
) -> Result<Json<GetMonthlySpendResponse>> {
let user = app
.db
.get_user_by_github_user_id(params.github_user_id)
.await?
.ok_or_else(|| anyhow!("user not found"))?;
let Some(llm_db) = app.llm_db.clone() else {
return Err(Error::http(
StatusCode::NOT_IMPLEMENTED,
"LLM database not available".into(),
));
};
let monthly_spend = llm_db
.get_user_spending_for_month(user.id, Utc::now())
.await?
.saturating_sub(FREE_TIER_MONTHLY_SPENDING_LIMIT);
Ok(Json(GetMonthlySpendResponse {
monthly_spend_in_cents: monthly_spend.0 as i32,
}))
}
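
A hedged sketch of calling the new endpoint from a client; only the `/billing/monthly_spend` path and the `github_user_id` query parameter come from the handler above, while the host and helper name are hypothetical:

// Hypothetical client for the endpoint registered in the router above.
async fn fetch_monthly_spend() -> anyhow::Result<i64> {
    let resp: serde_json::Value =
        reqwest::get("https://collab.example.com/billing/monthly_spend?github_user_id=123")
            .await?
            .json()
            .await?;
    Ok(resp["monthly_spend_in_cents"].as_i64().unwrap_or(0))
}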
impl From<SubscriptionStatus> for StripeSubscriptionStatus {
fn from(value: SubscriptionStatus) -> Self {
match value {
@@ -715,15 +772,15 @@ async fn find_or_create_billing_customer(
Ok(Some(billing_customer))
}
const SYNC_LLM_USAGE_WITH_STRIPE_INTERVAL: Duration = Duration::from_secs(24 * 60 * 60);
const SYNC_LLM_USAGE_WITH_STRIPE_INTERVAL: Duration = Duration::from_secs(60);
pub fn sync_llm_usage_with_stripe_periodically(app: Arc<AppState>, llm_db: LlmDatabase) {
let Some(stripe_client) = app.stripe_client.clone() else {
log::warn!("failed to retrieve Stripe client");
pub fn sync_llm_usage_with_stripe_periodically(app: Arc<AppState>) {
let Some(stripe_billing) = app.stripe_billing.clone() else {
log::warn!("failed to retrieve Stripe billing object");
return;
};
let Some(stripe_llm_usage_price_id) = app.config.stripe_llm_usage_price_id.clone() else {
log::warn!("failed to retrieve Stripe LLM usage price ID");
let Some(llm_db) = app.llm_db.clone() else {
log::warn!("failed to retrieve LLM database");
return;
};
@@ -732,15 +789,10 @@ pub fn sync_llm_usage_with_stripe_periodically(app: Arc<AppState>, llm_db: LlmDa
let executor = executor.clone();
async move {
loop {
sync_with_stripe(
&app,
&llm_db,
&stripe_client,
stripe_llm_usage_price_id.clone(),
)
.await
.trace_err();
sync_with_stripe(&app, &llm_db, &stripe_billing)
.await
.context("failed to sync LLM usage to Stripe")
.trace_err();
executor.sleep(SYNC_LLM_USAGE_WITH_STRIPE_INTERVAL).await;
}
}
@@ -749,71 +801,44 @@ pub fn sync_llm_usage_with_stripe_periodically(app: Arc<AppState>, llm_db: LlmDa
async fn sync_with_stripe(
app: &Arc<AppState>,
llm_db: &LlmDatabase,
stripe_client: &stripe::Client,
stripe_llm_usage_price_id: Arc<str>,
llm_db: &Arc<LlmDatabase>,
stripe_billing: &Arc<StripeBilling>,
) -> anyhow::Result<()> {
let subscriptions = app.db.get_active_billing_subscriptions().await?;
let events = llm_db.get_billing_events().await?;
let user_ids = events
.iter()
.map(|(event, _)| event.user_id)
.collect::<HashSet<UserId>>();
let stripe_subscriptions = app.db.get_active_billing_subscriptions(user_ids).await?;
for (customer, subscription) in subscriptions {
update_stripe_subscription(
llm_db,
stripe_client,
&stripe_llm_usage_price_id,
customer,
subscription,
)
.await
.log_err();
for (event, model) in events {
let Some((stripe_db_customer, stripe_db_subscription)) =
stripe_subscriptions.get(&event.user_id)
else {
tracing::warn!(
user_id = event.user_id.0,
"Registered billing event for user who is not a Stripe customer. Billing events should only be created for users who are Stripe customers, so this is a mistake on our side."
);
continue;
};
let stripe_subscription_id: stripe::SubscriptionId = stripe_db_subscription
.stripe_subscription_id
.parse()
.context("failed to parse stripe subscription id from db")?;
let stripe_customer_id: stripe::CustomerId = stripe_db_customer
.stripe_customer_id
.parse()
.context("failed to parse stripe customer id from db")?;
let stripe_model = stripe_billing.register_model(&model).await?;
stripe_billing
.subscribe_to_model(&stripe_subscription_id, &stripe_model)
.await?;
stripe_billing
.bill_model_usage(&stripe_customer_id, &stripe_model, &event)
.await?;
llm_db.consume_billing_event(event.id).await?;
}
Ok(())
}
async fn update_stripe_subscription(
llm_db: &LlmDatabase,
stripe_client: &stripe::Client,
stripe_llm_usage_price_id: &Arc<str>,
customer: billing_customer::Model,
subscription: billing_subscription::Model,
) -> Result<(), anyhow::Error> {
let monthly_spending = llm_db
.get_user_spending_for_month(customer.user_id, Utc::now())
.await?;
let subscription_id = SubscriptionId::from_str(&subscription.stripe_subscription_id)
.context("failed to parse subscription ID")?;
let monthly_spending_over_free_tier =
monthly_spending.saturating_sub(FREE_TIER_MONTHLY_SPENDING_LIMIT);
let new_quantity = (monthly_spending_over_free_tier.0 as f32 / 100.).ceil();
let current_subscription = Subscription::retrieve(stripe_client, &subscription_id, &[]).await?;
let mut update_params = stripe::UpdateSubscription {
proration_behavior: Some(
stripe::generated::billing::subscription::SubscriptionProrationBehavior::None,
),
..Default::default()
};
if let Some(existing_item) = current_subscription.items.data.iter().find(|item| {
item.price.as_ref().map_or(false, |price| {
price.id == stripe_llm_usage_price_id.as_ref()
})
}) {
update_params.items = Some(vec![stripe::UpdateSubscriptionItems {
id: Some(existing_item.id.to_string()),
quantity: Some(new_quantity as u64),
..Default::default()
}]);
} else {
update_params.items = Some(vec![stripe::UpdateSubscriptionItems {
price: Some(stripe_llm_usage_price_id.to_string()),
quantity: Some(new_quantity as u64),
..Default::default()
}]);
}
Subscription::update(stripe_client, &subscription_id, update_params).await?;
Ok(())
}

View File

@@ -10,6 +10,8 @@
Copy,
derive_more::Add,
derive_more::AddAssign,
derive_more::Sub,
derive_more::SubAssign,
)]
pub struct Cents(pub u32);
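
With `Sub`/`SubAssign` derived, overage arithmetic on `Cents` reads naturally. Note that plain `-` on the inner `u32` panics on underflow in debug builds, which is why the billing code elsewhere in this diff uses `saturating_sub`. A small sketch (assuming the `PartialEq`/`Debug` derives elided from this hunk):

let spending = Cents(1050);
let free_tier = Cents(1000);
assert_eq!(spending - free_tier, Cents(50)); // derived Sub on the inner u32
// saturating_sub, as used by the billing code, clamps at zero instead of panicking.
assert_eq!(free_tier.saturating_sub(spending), Cents(0));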

View File

@@ -114,23 +114,31 @@ impl Database {
pub async fn get_active_billing_subscriptions(
&self,
) -> Result<Vec<(billing_customer::Model, billing_subscription::Model)>> {
self.transaction(|tx| async move {
let mut result = Vec::new();
let mut rows = billing_subscription::Entity::find()
.inner_join(billing_customer::Entity)
.select_also(billing_customer::Entity)
.order_by_asc(billing_subscription::Column::Id)
.stream(&*tx)
.await?;
user_ids: HashSet<UserId>,
) -> Result<HashMap<UserId, (billing_customer::Model, billing_subscription::Model)>> {
self.transaction(|tx| {
let user_ids = user_ids.clone();
async move {
let mut rows = billing_subscription::Entity::find()
.inner_join(billing_customer::Entity)
.select_also(billing_customer::Entity)
.filter(billing_customer::Column::UserId.is_in(user_ids))
.filter(
billing_subscription::Column::StripeSubscriptionStatus
.eq(StripeSubscriptionStatus::Active),
)
.order_by_asc(billing_subscription::Column::Id)
.stream(&*tx)
.await?;
while let Some(row) = rows.next().await {
if let (subscription, Some(customer)) = row? {
result.push((customer, subscription));
let mut subscriptions = HashMap::default();
while let Some(row) = rows.next().await {
if let (subscription, Some(customer)) = row? {
subscriptions.insert(customer.user_id, (customer, subscription));
}
}
Ok(subscriptions)
}
Ok(result)
})
.await
}

View File

@@ -317,7 +317,7 @@ impl Database {
inode: ActiveValue::set(entry.inode as i64),
mtime_seconds: ActiveValue::set(mtime.seconds as i64),
mtime_nanos: ActiveValue::set(mtime.nanos as i32),
is_symlink: ActiveValue::set(entry.is_symlink),
canonical_path: ActiveValue::set(entry.canonical_path.clone()),
is_ignored: ActiveValue::set(entry.is_ignored),
is_external: ActiveValue::set(entry.is_external),
git_status: ActiveValue::set(entry.git_status.map(|status| status as i64)),
@@ -338,7 +338,7 @@ impl Database {
worktree_entry::Column::Inode,
worktree_entry::Column::MtimeSeconds,
worktree_entry::Column::MtimeNanos,
worktree_entry::Column::IsSymlink,
worktree_entry::Column::CanonicalPath,
worktree_entry::Column::IsIgnored,
worktree_entry::Column::GitStatus,
worktree_entry::Column::ScanId,
@@ -735,7 +735,7 @@ impl Database {
seconds: db_entry.mtime_seconds as u64,
nanos: db_entry.mtime_nanos as u32,
}),
is_symlink: db_entry.is_symlink,
canonical_path: db_entry.canonical_path,
is_ignored: db_entry.is_ignored,
is_external: db_entry.is_external,
git_status: db_entry.git_status.map(|status| status as i32),
@@ -838,6 +838,7 @@ impl Database {
.map(|language_server| proto::LanguageServer {
id: language_server.id as u64,
name: language_server.name,
worktree_id: None,
})
.collect(),
dev_server_project_id: project.dev_server_project_id,

View File

@@ -659,7 +659,7 @@ impl Database {
seconds: db_entry.mtime_seconds as u64,
nanos: db_entry.mtime_nanos as u32,
}),
is_symlink: db_entry.is_symlink,
canonical_path: db_entry.canonical_path,
is_ignored: db_entry.is_ignored,
is_external: db_entry.is_external,
git_status: db_entry.git_status.map(|status| status as i32),
@@ -718,6 +718,7 @@ impl Database {
.map(|language_server| proto::LanguageServer {
id: language_server.id as u64,
name: language_server.name,
worktree_id: None,
})
.collect::<Vec<_>>();

View File

@@ -16,12 +16,12 @@ pub struct Model {
pub mtime_seconds: i64,
pub mtime_nanos: i32,
pub git_status: Option<i64>,
pub is_symlink: bool,
pub is_ignored: bool,
pub is_external: bool,
pub is_deleted: bool,
pub scan_id: i64,
pub is_fifo: bool,
pub canonical_path: Option<String>,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]

View File

@@ -10,6 +10,7 @@ pub mod migrations;
mod rate_limiter;
pub mod rpc;
pub mod seed;
pub mod stripe_billing;
pub mod user_backfiller;
#[cfg(test)]
@@ -24,11 +25,14 @@ use axum::{
pub use cents::*;
use db::{ChannelId, Database};
use executor::Executor;
use llm::db::LlmDatabase;
pub use rate_limiter::*;
use serde::Deserialize;
use std::{path::PathBuf, sync::Arc};
use util::ResultExt;
use crate::stripe_billing::StripeBilling;
pub type Result<T, E = Error> = std::result::Result<T, E>;
pub enum Error {
@@ -176,8 +180,6 @@ pub struct Config {
pub slack_panics_webhook: Option<String>,
pub auto_join_channel_id: Option<ChannelId>,
pub stripe_api_key: Option<String>,
pub stripe_llm_access_price_id: Option<Arc<str>>,
pub stripe_llm_usage_price_id: Option<Arc<str>>,
pub supermaven_admin_api_key: Option<Arc<str>>,
pub user_backfiller_github_access_token: Option<Arc<str>>,
}
@@ -196,10 +198,6 @@ impl Config {
}
}
pub fn is_llm_billing_enabled(&self) -> bool {
self.stripe_llm_usage_price_id.is_some()
}
#[cfg(test)]
pub fn test() -> Self {
Self {
@@ -238,8 +236,6 @@ impl Config {
migrations_path: None,
seed_path: None,
stripe_api_key: None,
stripe_llm_access_price_id: None,
stripe_llm_usage_price_id: None,
supermaven_admin_api_key: None,
user_backfiller_github_access_token: None,
}
@@ -272,9 +268,11 @@ impl ServiceMode {
pub struct AppState {
pub db: Arc<Database>,
pub llm_db: Option<Arc<LlmDatabase>>,
pub live_kit_client: Option<Arc<dyn live_kit_server::api::Client>>,
pub blob_store_client: Option<aws_sdk_s3::Client>,
pub stripe_client: Option<Arc<stripe::Client>>,
pub stripe_billing: Option<Arc<StripeBilling>>,
pub rate_limiter: Arc<RateLimiter>,
pub executor: Executor,
pub clickhouse_client: Option<::clickhouse::Client>,
@@ -288,6 +286,20 @@ impl AppState {
let mut db = Database::new(db_options, Executor::Production).await?;
db.initialize_notification_kinds().await?;
let llm_db = if let Some((llm_database_url, llm_database_max_connections)) = config
.llm_database_url
.clone()
.zip(config.llm_database_max_connections)
{
let mut llm_db_options = db::ConnectOptions::new(llm_database_url);
llm_db_options.max_connections(llm_database_max_connections);
let mut llm_db = LlmDatabase::new(llm_db_options, executor.clone()).await?;
llm_db.initialize().await?;
Some(Arc::new(llm_db))
} else {
None
};
let live_kit_client = if let Some(((server, key), secret)) = config
.live_kit_server
.as_ref()
@@ -304,11 +316,16 @@ impl AppState {
};
let db = Arc::new(db);
let stripe_client = build_stripe_client(&config).map(Arc::new).log_err();
let this = Self {
db: db.clone(),
llm_db,
live_kit_client,
blob_store_client: build_blob_store_client(&config).await.log_err(),
stripe_client: build_stripe_client(&config).await.map(Arc::new).log_err(),
stripe_billing: stripe_client
.clone()
.map(|stripe_client| Arc::new(StripeBilling::new(stripe_client))),
stripe_client,
rate_limiter: Arc::new(RateLimiter::new(db)),
executor,
clickhouse_client: config
@@ -321,12 +338,11 @@ impl AppState {
}
}
async fn build_stripe_client(config: &Config) -> anyhow::Result<stripe::Client> {
fn build_stripe_client(config: &Config) -> anyhow::Result<stripe::Client> {
let api_key = config
.stripe_api_key
.as_ref()
.ok_or_else(|| anyhow!("missing stripe_api_key"))?;
Ok(stripe::Client::new(api_key))
}

View File

@@ -20,13 +20,14 @@ use axum::{
};
use chrono::{DateTime, Duration, Utc};
use collections::HashMap;
use db::TokenUsage;
use db::{usage_measure::UsageMeasure, ActiveUserCount, LlmDatabase};
use futures::{Stream, StreamExt as _};
use isahc_http_client::IsahcHttpClient;
use rpc::ListModelsResponse;
use reqwest_client::ReqwestClient;
use rpc::{
proto::Plan, LanguageModelProvider, PerformCompletionParams, EXPIRED_LLM_TOKEN_HEADER_NAME,
};
use rpc::{ListModelsResponse, MAX_LLM_MONTHLY_SPEND_REACHED_HEADER_NAME};
use std::{
pin::Pin,
sync::Arc,
@@ -43,7 +44,7 @@ pub struct LlmState {
pub config: Config,
pub executor: Executor,
pub db: Arc<LlmDatabase>,
pub http_client: IsahcHttpClient,
pub http_client: ReqwestClient,
pub clickhouse_client: Option<clickhouse::Client>,
active_user_count_by_model:
RwLock<HashMap<(LanguageModelProvider, String), (DateTime<Utc>, ActiveUserCount)>>,
@@ -69,11 +70,8 @@ impl LlmState {
let db = Arc::new(db);
let user_agent = format!("Zed Server/{}", env!("CARGO_PKG_VERSION"));
let http_client = IsahcHttpClient::builder()
.default_header("User-Agent", user_agent)
.build()
.map(IsahcHttpClient::from)
.context("failed to construct http client")?;
let http_client =
ReqwestClient::user_agent(&user_agent).context("failed to construct http client")?;
let this = Self {
executor,
@@ -418,10 +416,7 @@ async fn perform_completion(
claims,
provider: params.provider,
model,
input_tokens: 0,
output_tokens: 0,
cache_creation_input_tokens: 0,
cache_read_input_tokens: 0,
tokens: TokenUsage::default(),
inner_stream: stream,
})))
}
@@ -440,7 +435,7 @@ fn normalize_model_name(known_models: Vec<String>, name: String) -> String {
/// The maximum monthly spending an individual user can reach on the free tier
/// before they have to pay.
pub const FREE_TIER_MONTHLY_SPENDING_LIMIT: Cents = Cents::from_dollars(5);
pub const FREE_TIER_MONTHLY_SPENDING_LIMIT: Cents = Cents::from_dollars(10);
/// The default value to use for maximum spend per month if the user did not
/// explicitly set a maximum spend.
@@ -448,9 +443,6 @@ pub const FREE_TIER_MONTHLY_SPENDING_LIMIT: Cents = Cents::from_dollars(5);
/// Used to prevent surprise bills.
pub const DEFAULT_MAX_MONTHLY_SPEND: Cents = Cents::from_dollars(10);
/// The maximum lifetime spending an individual user can reach before being cut off.
const LIFETIME_SPENDING_LIMIT: Cents = Cents::from_dollars(1_000);
async fn check_usage_limit(
state: &Arc<LlmState>,
provider: LanguageModelProvider,
@@ -468,23 +460,28 @@ async fn check_usage_limit(
)
.await?;
if state.config.is_llm_billing_enabled() {
if usage.spending_this_month >= FREE_TIER_MONTHLY_SPENDING_LIMIT {
if !claims.has_llm_subscription {
return Err(Error::http(
StatusCode::PAYMENT_REQUIRED,
"Maximum spending limit reached for this month.".to_string(),
));
}
if usage.spending_this_month >= FREE_TIER_MONTHLY_SPENDING_LIMIT {
if !claims.has_llm_subscription {
return Err(Error::http(
StatusCode::PAYMENT_REQUIRED,
"Maximum spending limit reached for this month.".to_string(),
));
}
}
// TODO: Remove this once we've rolled out monthly spending limits.
if usage.lifetime_spending >= LIFETIME_SPENDING_LIMIT {
return Err(Error::http(
StatusCode::FORBIDDEN,
"Maximum spending limit reached.".to_string(),
));
if (usage.spending_this_month - FREE_TIER_MONTHLY_SPENDING_LIMIT)
>= Cents(claims.max_monthly_spend_in_cents)
{
return Err(Error::Http(
StatusCode::FORBIDDEN,
"Maximum spending limit reached for this month.".to_string(),
[(
HeaderName::from_static(MAX_LLM_MONTHLY_SPEND_REACHED_HEADER_NAME),
HeaderValue::from_static("true"),
)]
.into_iter()
.collect(),
));
}
}
let active_users = state.get_active_user_count(provider, model_name).await?;
@@ -598,10 +595,7 @@ struct TokenCountingStream<S> {
claims: LlmTokenClaims,
provider: LanguageModelProvider,
model: String,
input_tokens: usize,
output_tokens: usize,
cache_creation_input_tokens: usize,
cache_read_input_tokens: usize,
tokens: TokenUsage,
inner_stream: S,
}
@@ -615,10 +609,10 @@ where
match Pin::new(&mut self.inner_stream).poll_next(cx) {
Poll::Ready(Some(Ok(mut chunk))) => {
chunk.bytes.push(b'\n');
self.input_tokens += chunk.input_tokens;
self.output_tokens += chunk.output_tokens;
self.cache_creation_input_tokens += chunk.cache_creation_input_tokens;
self.cache_read_input_tokens += chunk.cache_read_input_tokens;
self.tokens.input += chunk.input_tokens;
self.tokens.output += chunk.output_tokens;
self.tokens.input_cache_creation += chunk.cache_creation_input_tokens;
self.tokens.input_cache_read += chunk.cache_read_input_tokens;
Poll::Ready(Some(Ok(chunk.bytes)))
}
Poll::Ready(Some(Err(e))) => Poll::Ready(Some(Err(e))),
@@ -634,10 +628,7 @@ impl<S> Drop for TokenCountingStream<S> {
let claims = self.claims.clone();
let provider = self.provider;
let model = std::mem::take(&mut self.model);
let input_token_count = self.input_tokens;
let output_token_count = self.output_tokens;
let cache_creation_input_token_count = self.cache_creation_input_tokens;
let cache_read_input_token_count = self.cache_read_input_tokens;
let tokens = self.tokens;
self.state.executor.spawn_detached(async move {
let usage = state
.db
@@ -646,10 +637,9 @@ impl<S> Drop for TokenCountingStream<S> {
claims.is_staff,
provider,
&model,
input_token_count,
cache_creation_input_token_count,
cache_read_input_token_count,
output_token_count,
tokens,
claims.has_llm_subscription,
Cents(claims.max_monthly_spend_in_cents),
Utc::now(),
)
.await
@@ -679,22 +669,23 @@ impl<S> Drop for TokenCountingStream<S> {
},
model,
provider: provider.to_string(),
input_token_count: input_token_count as u64,
cache_creation_input_token_count: cache_creation_input_token_count
as u64,
cache_read_input_token_count: cache_read_input_token_count as u64,
output_token_count: output_token_count as u64,
input_token_count: tokens.input as u64,
cache_creation_input_token_count: tokens.input_cache_creation as u64,
cache_read_input_token_count: tokens.input_cache_read as u64,
output_token_count: tokens.output as u64,
requests_this_minute: usage.requests_this_minute as u64,
tokens_this_minute: usage.tokens_this_minute as u64,
tokens_this_day: usage.tokens_this_day as u64,
input_tokens_this_month: usage.input_tokens_this_month as u64,
input_tokens_this_month: usage.tokens_this_month.input as u64,
cache_creation_input_tokens_this_month: usage
.cache_creation_input_tokens_this_month
.tokens_this_month
.input_cache_creation
as u64,
cache_read_input_tokens_this_month: usage
.cache_read_input_tokens_this_month
.tokens_this_month
.input_cache_read
as u64,
output_tokens_this_month: usage.output_tokens_this_month as u64,
output_tokens_this_month: usage.tokens_this_month.output as u64,
spending_this_month: usage.spending_this_month.0 as u64,
lifetime_spending: usage.lifetime_spending.0 as u64,
},

View File

@@ -20,7 +20,7 @@ use std::future::Future;
use std::sync::Arc;
use anyhow::anyhow;
pub use queries::usages::ActiveUserCount;
pub use queries::usages::{ActiveUserCount, TokenUsage};
use sea_orm::prelude::*;
pub use sea_orm::ConnectOptions;
use sea_orm::{

View File

@@ -3,8 +3,9 @@ use serde::{Deserialize, Serialize};
use crate::id_type;
id_type!(BillingEventId);
id_type!(ModelId);
id_type!(ProviderId);
id_type!(RevokedAccessTokenId);
id_type!(UsageId);
id_type!(UsageMeasureId);
id_type!(RevokedAccessTokenId);

View File

@@ -1,5 +1,6 @@
use super::*;
pub mod billing_events;
pub mod providers;
pub mod revoked_access_tokens;
pub mod usages;

View File

@@ -0,0 +1,31 @@
use super::*;
use crate::Result;
use anyhow::Context as _;
impl LlmDatabase {
pub async fn get_billing_events(&self) -> Result<Vec<(billing_event::Model, model::Model)>> {
self.transaction(|tx| async move {
let events_with_models = billing_event::Entity::find()
.find_also_related(model::Entity)
.all(&*tx)
.await?;
events_with_models
.into_iter()
.map(|(event, model)| {
let model =
model.context("could not find model associated with billing event")?;
Ok((event, model))
})
.collect()
})
.await
}
pub async fn consume_billing_event(&self, id: BillingEventId) -> Result<()> {
self.transaction(|tx| async move {
billing_event::Entity::delete_by_id(id).exec(&*tx).await?;
Ok(())
})
.await
}
}

View File

@@ -1,5 +1,5 @@
use crate::db::UserId;
use crate::llm::Cents;
use crate::{db::UserId, llm::FREE_TIER_MONTHLY_SPENDING_LIMIT};
use chrono::{Datelike, Duration};
use futures::StreamExt as _;
use rpc::LanguageModelProvider;
@@ -9,15 +9,26 @@ use strum::IntoEnumIterator as _;
use super::*;
#[derive(Debug, PartialEq, Clone, Copy, Default)]
pub struct TokenUsage {
pub input: usize,
pub input_cache_creation: usize,
pub input_cache_read: usize,
pub output: usize,
}
impl TokenUsage {
pub fn total(&self) -> usize {
self.input + self.input_cache_creation + self.input_cache_read + self.output
}
}
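
For reference, a worked example of the new accessor, fully determined by the struct above:

let tokens = TokenUsage {
    input: 100,
    input_cache_creation: 10,
    input_cache_read: 5,
    output: 40,
};
assert_eq!(tokens.total(), 155); // 100 + 10 + 5 + 40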
#[derive(Debug, PartialEq, Clone, Copy)]
pub struct Usage {
pub requests_this_minute: usize,
pub tokens_this_minute: usize,
pub tokens_this_day: usize,
pub input_tokens_this_month: usize,
pub cache_creation_input_tokens_this_month: usize,
pub cache_read_input_tokens_this_month: usize,
pub output_tokens_this_month: usize,
pub tokens_this_month: TokenUsage,
pub spending_this_month: Cents,
pub lifetime_spending: Cents,
}
@@ -257,18 +268,20 @@ impl LlmDatabase {
requests_this_minute,
tokens_this_minute,
tokens_this_day,
input_tokens_this_month: monthly_usage
.as_ref()
.map_or(0, |usage| usage.input_tokens as usize),
cache_creation_input_tokens_this_month: monthly_usage
.as_ref()
.map_or(0, |usage| usage.cache_creation_input_tokens as usize),
cache_read_input_tokens_this_month: monthly_usage
.as_ref()
.map_or(0, |usage| usage.cache_read_input_tokens as usize),
output_tokens_this_month: monthly_usage
.as_ref()
.map_or(0, |usage| usage.output_tokens as usize),
tokens_this_month: TokenUsage {
input: monthly_usage
.as_ref()
.map_or(0, |usage| usage.input_tokens as usize),
input_cache_creation: monthly_usage
.as_ref()
.map_or(0, |usage| usage.cache_creation_input_tokens as usize),
input_cache_read: monthly_usage
.as_ref()
.map_or(0, |usage| usage.cache_read_input_tokens as usize),
output: monthly_usage
.as_ref()
.map_or(0, |usage| usage.output_tokens as usize),
},
spending_this_month,
lifetime_spending,
})
@@ -283,10 +296,9 @@ impl LlmDatabase {
is_staff: bool,
provider: LanguageModelProvider,
model_name: &str,
input_token_count: usize,
cache_creation_input_tokens: usize,
cache_read_input_tokens: usize,
output_token_count: usize,
tokens: TokenUsage,
has_llm_subscription: bool,
max_monthly_spend: Cents,
now: DateTimeUtc,
) -> Result<Usage> {
self.transaction(|tx| async move {
@@ -313,10 +325,6 @@ impl LlmDatabase {
&tx,
)
.await?;
let total_token_count = input_token_count
+ cache_read_input_tokens
+ cache_creation_input_tokens
+ output_token_count;
let tokens_this_minute = self
.update_usage_for_measure(
user_id,
@@ -325,7 +333,7 @@ impl LlmDatabase {
&usages,
UsageMeasure::TokensPerMinute,
now,
total_token_count,
tokens.total(),
&tx,
)
.await?;
@@ -337,7 +345,7 @@ impl LlmDatabase {
&usages,
UsageMeasure::TokensPerDay,
now,
total_token_count,
tokens.total(),
&tx,
)
.await?;
@@ -361,18 +369,14 @@ impl LlmDatabase {
Some(usage) => {
monthly_usage::Entity::update(monthly_usage::ActiveModel {
id: ActiveValue::unchanged(usage.id),
input_tokens: ActiveValue::set(
usage.input_tokens + input_token_count as i64,
),
input_tokens: ActiveValue::set(usage.input_tokens + tokens.input as i64),
cache_creation_input_tokens: ActiveValue::set(
usage.cache_creation_input_tokens + cache_creation_input_tokens as i64,
usage.cache_creation_input_tokens + tokens.input_cache_creation as i64,
),
cache_read_input_tokens: ActiveValue::set(
usage.cache_read_input_tokens + cache_read_input_tokens as i64,
),
output_tokens: ActiveValue::set(
usage.output_tokens + output_token_count as i64,
usage.cache_read_input_tokens + tokens.input_cache_read as i64,
),
output_tokens: ActiveValue::set(usage.output_tokens + tokens.output as i64),
..Default::default()
})
.exec(&*tx)
@@ -384,12 +388,12 @@ impl LlmDatabase {
model_id: ActiveValue::set(model.id),
month: ActiveValue::set(month),
year: ActiveValue::set(year),
input_tokens: ActiveValue::set(input_token_count as i64),
input_tokens: ActiveValue::set(tokens.input as i64),
cache_creation_input_tokens: ActiveValue::set(
cache_creation_input_tokens as i64,
tokens.input_cache_creation as i64,
),
cache_read_input_tokens: ActiveValue::set(cache_read_input_tokens as i64),
output_tokens: ActiveValue::set(output_token_count as i64),
cache_read_input_tokens: ActiveValue::set(tokens.input_cache_read as i64),
output_tokens: ActiveValue::set(tokens.output as i64),
..Default::default()
}
.insert(&*tx)
@@ -405,6 +409,27 @@ impl LlmDatabase {
monthly_usage.output_tokens as usize,
);
if !is_staff
&& spending_this_month > FREE_TIER_MONTHLY_SPENDING_LIMIT
&& has_llm_subscription
&& (spending_this_month - FREE_TIER_MONTHLY_SPENDING_LIMIT) <= max_monthly_spend
{
billing_event::ActiveModel {
id: ActiveValue::not_set(),
idempotency_key: ActiveValue::not_set(),
user_id: ActiveValue::set(user_id),
model_id: ActiveValue::set(model.id),
input_tokens: ActiveValue::set(tokens.input as i64),
input_cache_creation_tokens: ActiveValue::set(
tokens.input_cache_creation as i64,
),
input_cache_read_tokens: ActiveValue::set(tokens.input_cache_read as i64),
output_tokens: ActiveValue::set(tokens.output as i64),
}
.insert(&*tx)
.await?;
}
// Update lifetime usage
let lifetime_usage = lifetime_usage::Entity::find()
.filter(
@@ -419,18 +444,14 @@ impl LlmDatabase {
Some(usage) => {
lifetime_usage::Entity::update(lifetime_usage::ActiveModel {
id: ActiveValue::unchanged(usage.id),
input_tokens: ActiveValue::set(
usage.input_tokens + input_token_count as i64,
),
input_tokens: ActiveValue::set(usage.input_tokens + tokens.input as i64),
cache_creation_input_tokens: ActiveValue::set(
usage.cache_creation_input_tokens + cache_creation_input_tokens as i64,
usage.cache_creation_input_tokens + tokens.input_cache_creation as i64,
),
cache_read_input_tokens: ActiveValue::set(
usage.cache_read_input_tokens + cache_read_input_tokens as i64,
),
output_tokens: ActiveValue::set(
usage.output_tokens + output_token_count as i64,
usage.cache_read_input_tokens + tokens.input_cache_read as i64,
),
output_tokens: ActiveValue::set(usage.output_tokens + tokens.output as i64),
..Default::default()
})
.exec(&*tx)
@@ -440,12 +461,12 @@ impl LlmDatabase {
lifetime_usage::ActiveModel {
user_id: ActiveValue::set(user_id),
model_id: ActiveValue::set(model.id),
input_tokens: ActiveValue::set(input_token_count as i64),
input_tokens: ActiveValue::set(tokens.input as i64),
cache_creation_input_tokens: ActiveValue::set(
cache_creation_input_tokens as i64,
tokens.input_cache_creation as i64,
),
cache_read_input_tokens: ActiveValue::set(cache_read_input_tokens as i64),
output_tokens: ActiveValue::set(output_token_count as i64),
cache_read_input_tokens: ActiveValue::set(tokens.input_cache_read as i64),
output_tokens: ActiveValue::set(tokens.output as i64),
..Default::default()
}
.insert(&*tx)
@@ -465,11 +486,12 @@ impl LlmDatabase {
requests_this_minute,
tokens_this_minute,
tokens_this_day,
input_tokens_this_month: monthly_usage.input_tokens as usize,
cache_creation_input_tokens_this_month: monthly_usage.cache_creation_input_tokens
as usize,
cache_read_input_tokens_this_month: monthly_usage.cache_read_input_tokens as usize,
output_tokens_this_month: monthly_usage.output_tokens as usize,
tokens_this_month: TokenUsage {
input: monthly_usage.input_tokens as usize,
input_cache_creation: monthly_usage.cache_creation_input_tokens as usize,
input_cache_read: monthly_usage.cache_read_input_tokens as usize,
output: monthly_usage.output_tokens as usize,
},
spending_this_month,
lifetime_spending,
})

View File

@@ -1,3 +1,4 @@
pub mod billing_event;
pub mod lifetime_usage;
pub mod model;
pub mod monthly_usage;

View File

@@ -0,0 +1,37 @@
use crate::{
db::UserId,
llm::db::{BillingEventId, ModelId},
};
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, DeriveEntityModel)]
#[sea_orm(table_name = "billing_events")]
pub struct Model {
#[sea_orm(primary_key)]
pub id: BillingEventId,
pub idempotency_key: Uuid,
pub user_id: UserId,
pub model_id: ModelId,
pub input_tokens: i64,
pub input_cache_creation_tokens: i64,
pub input_cache_read_tokens: i64,
pub output_tokens: i64,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {
#[sea_orm(
belongs_to = "super::model::Entity",
from = "Column::ModelId",
to = "super::model::Column::Id"
)]
Model,
}
impl Related<super::model::Entity> for Entity {
fn to() -> RelationDef {
Relation::Model.def()
}
}
impl ActiveModelBehavior for ActiveModel {}

View File

@@ -29,6 +29,8 @@ pub enum Relation {
Provider,
#[sea_orm(has_many = "super::usage::Entity")]
Usages,
#[sea_orm(has_many = "super::billing_event::Entity")]
BillingEvents,
}
impl Related<super::provider::Entity> for Entity {
@@ -43,4 +45,10 @@ impl Related<super::usage::Entity> for Entity {
}
}
impl Related<super::billing_event::Entity> for Entity {
fn to() -> RelationDef {
Relation::BillingEvents.def()
}
}
impl ActiveModelBehavior for ActiveModel {}

View File

@@ -1,3 +1,4 @@
mod billing_tests;
mod provider_tests;
mod usage_tests;

View File

@@ -0,0 +1,148 @@
use crate::{
db::UserId,
llm::{
db::{queries::providers::ModelParams, LlmDatabase, TokenUsage},
FREE_TIER_MONTHLY_SPENDING_LIMIT,
},
test_llm_db, Cents,
};
use chrono::{DateTime, Utc};
use pretty_assertions::assert_eq;
use rpc::LanguageModelProvider;
test_llm_db!(
test_billing_limit_exceeded,
test_billing_limit_exceeded_postgres
);
async fn test_billing_limit_exceeded(db: &mut LlmDatabase) {
let provider = LanguageModelProvider::Anthropic;
let model = "fake-claude-limerick";
const PRICE_PER_MILLION_INPUT_TOKENS: i32 = 5;
const PRICE_PER_MILLION_OUTPUT_TOKENS: i32 = 5;
// Initialize the database and insert the model
db.initialize().await.unwrap();
db.insert_models(&[ModelParams {
provider,
name: model.to_string(),
max_requests_per_minute: 5,
max_tokens_per_minute: 10_000,
max_tokens_per_day: 50_000,
price_per_million_input_tokens: PRICE_PER_MILLION_INPUT_TOKENS,
price_per_million_output_tokens: PRICE_PER_MILLION_OUTPUT_TOKENS,
}])
.await
.unwrap();
// Set a fixed datetime for consistent testing
let now = DateTime::parse_from_rfc3339("2024-08-08T22:46:33Z")
.unwrap()
.with_timezone(&Utc);
let user_id = UserId::from_proto(123);
let max_monthly_spend = Cents::from_dollars(11);
// Record usage that brings us close to the limit but doesn't exceed it
// Let's say we use $10.50 worth of tokens
let tokens_to_use = 210_000_000; // This will cost $10.50 at $0.05 per 1 million tokens
let usage = TokenUsage {
input: tokens_to_use,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
};
// Verify that before we record any usage, there are 0 billing events
let billing_events = db.get_billing_events().await.unwrap();
assert_eq!(billing_events.len(), 0);
db.record_usage(
user_id,
false,
provider,
model,
usage,
true,
max_monthly_spend,
now,
)
.await
.unwrap();
// Verify the recorded usage and spending
let recorded_usage = db.get_usage(user_id, provider, model, now).await.unwrap();
// Verify that we exceeded the free tier usage
assert_eq!(recorded_usage.spending_this_month, Cents::new(1050));
assert!(recorded_usage.spending_this_month > FREE_TIER_MONTHLY_SPENDING_LIMIT);
// Verify that there is one `billing_event` record
let billing_events = db.get_billing_events().await.unwrap();
assert_eq!(billing_events.len(), 1);
let (billing_event, _model) = &billing_events[0];
assert_eq!(billing_event.user_id, user_id);
assert_eq!(billing_event.input_tokens, tokens_to_use as i64);
assert_eq!(billing_event.input_cache_creation_tokens, 0);
assert_eq!(billing_event.input_cache_read_tokens, 0);
assert_eq!(billing_event.output_tokens, 0);
// Record usage that puts us at $20.50
let usage_2 = TokenUsage {
input: 200_000_000, // This will cost $10 more, pushing us from $10.50 to $20.50,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
};
db.record_usage(
user_id,
false,
provider,
model,
usage_2,
true,
max_monthly_spend,
now,
)
.await
.unwrap();
// Verify the updated usage and spending
let updated_usage = db.get_usage(user_id, provider, model, now).await.unwrap();
assert_eq!(updated_usage.spending_this_month, Cents::new(2050));
// Verify that there are now two billing events
let billing_events = db.get_billing_events().await.unwrap();
assert_eq!(billing_events.len(), 2);
let tokens_to_exceed = 20_000_000; // This will cost $1.00 more, pushing us from $20.50 to $21.50, which exceeds the free tier allowance plus the $11 maximum monthly spend
let usage_exceeding = TokenUsage {
input: tokens_to_exceed,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
};
// This request pushes spending past the allowed maximum, so it should not create another billing event
db.record_usage(
user_id,
false,
provider,
model,
usage_exceeding,
true,
max_monthly_spend,
now,
)
.await
.unwrap();
// Verify the updated usage and spending
let updated_usage = db.get_usage(user_id, provider, model, now).await.unwrap();
assert_eq!(updated_usage.spending_this_month, Cents::new(2150));
// Verify that we never bill the user beyond their maximum monthly spend:
// no new billing event was recorded for the request that exceeded it.
let billing_events = db.get_billing_events().await.unwrap();
assert_eq!(billing_events.len(), 2);
}
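All of the dollar amounts asserted in this test follow from one formula: cost in cents = tokens × price_per_million / 1,000,000, with prices stored in cents per million tokens. A small self-contained check of the three requests above (the formula is inferred from the assertions, not quoted from the PR):

// Cost model implied by the assertions above: prices are cents per million tokens.
fn cost_in_cents(tokens: u64, price_per_million_cents: u64) -> u64 {
    tokens * price_per_million_cents / 1_000_000
}

fn main() {
    assert_eq!(cost_in_cents(210_000_000, 5), 1050); // $10.50
    assert_eq!(cost_in_cents(200_000_000, 5), 1000); // +$10.00, for $20.50 total
    assert_eq!(cost_in_cents(20_000_000, 5), 100); // +$1.00, which would cross the cap
}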

View File

@@ -2,7 +2,7 @@ use crate::{
db::UserId,
llm::db::{
queries::{providers::ModelParams, usages::Usage},
LlmDatabase,
LlmDatabase, TokenUsage,
},
test_llm_db, Cents,
};
@@ -36,14 +36,42 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
let user_id = UserId::from_proto(123);
let now = t0;
db.record_usage(user_id, false, provider, model, 1000, 0, 0, 0, now)
.await
.unwrap();
db.record_usage(
user_id,
false,
provider,
model,
TokenUsage {
input: 1000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
false,
Cents::ZERO,
now,
)
.await
.unwrap();
let now = t0 + Duration::seconds(10);
db.record_usage(user_id, false, provider, model, 2000, 0, 0, 0, now)
.await
.unwrap();
db.record_usage(
user_id,
false,
provider,
model,
TokenUsage {
input: 2000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
false,
Cents::ZERO,
now,
)
.await
.unwrap();
let usage = db.get_usage(user_id, provider, model, now).await.unwrap();
assert_eq!(
@@ -52,10 +80,12 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 2,
tokens_this_minute: 3000,
tokens_this_day: 3000,
input_tokens_this_month: 3000,
cache_creation_input_tokens_this_month: 0,
cache_read_input_tokens_this_month: 0,
output_tokens_this_month: 0,
tokens_this_month: TokenUsage {
input: 3000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
spending_this_month: Cents::ZERO,
lifetime_spending: Cents::ZERO,
}
@@ -69,19 +99,35 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 1,
tokens_this_minute: 2000,
tokens_this_day: 3000,
input_tokens_this_month: 3000,
cache_creation_input_tokens_this_month: 0,
cache_read_input_tokens_this_month: 0,
output_tokens_this_month: 0,
tokens_this_month: TokenUsage {
input: 3000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
spending_this_month: Cents::ZERO,
lifetime_spending: Cents::ZERO,
}
);
let now = t0 + Duration::seconds(60);
db.record_usage(user_id, false, provider, model, 3000, 0, 0, 0, now)
.await
.unwrap();
db.record_usage(
user_id,
false,
provider,
model,
TokenUsage {
input: 3000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
false,
Cents::ZERO,
now,
)
.await
.unwrap();
let usage = db.get_usage(user_id, provider, model, now).await.unwrap();
assert_eq!(
@@ -90,10 +136,12 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 2,
tokens_this_minute: 5000,
tokens_this_day: 6000,
input_tokens_this_month: 6000,
cache_creation_input_tokens_this_month: 0,
cache_read_input_tokens_this_month: 0,
output_tokens_this_month: 0,
tokens_this_month: TokenUsage {
input: 6000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
spending_this_month: Cents::ZERO,
lifetime_spending: Cents::ZERO,
}
@@ -108,18 +156,34 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 0,
tokens_this_minute: 0,
tokens_this_day: 5000,
input_tokens_this_month: 6000,
cache_creation_input_tokens_this_month: 0,
cache_read_input_tokens_this_month: 0,
output_tokens_this_month: 0,
tokens_this_month: TokenUsage {
input: 6000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
spending_this_month: Cents::ZERO,
lifetime_spending: Cents::ZERO,
}
);
db.record_usage(user_id, false, provider, model, 4000, 0, 0, 0, now)
.await
.unwrap();
db.record_usage(
user_id,
false,
provider,
model,
TokenUsage {
input: 4000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
false,
Cents::ZERO,
now,
)
.await
.unwrap();
let usage = db.get_usage(user_id, provider, model, now).await.unwrap();
assert_eq!(
@@ -128,10 +192,12 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 1,
tokens_this_minute: 4000,
tokens_this_day: 9000,
input_tokens_this_month: 10000,
cache_creation_input_tokens_this_month: 0,
cache_read_input_tokens_this_month: 0,
output_tokens_this_month: 0,
tokens_this_month: TokenUsage {
input: 10000,
input_cache_creation: 0,
input_cache_read: 0,
output: 0,
},
spending_this_month: Cents::ZERO,
lifetime_spending: Cents::ZERO,
}
@@ -143,9 +209,23 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
.with_timezone(&Utc);
// Test cache creation input tokens
db.record_usage(user_id, false, provider, model, 1000, 500, 0, 0, now)
.await
.unwrap();
db.record_usage(
user_id,
false,
provider,
model,
TokenUsage {
input: 1000,
input_cache_creation: 500,
input_cache_read: 0,
output: 0,
},
false,
Cents::ZERO,
now,
)
.await
.unwrap();
let usage = db.get_usage(user_id, provider, model, now).await.unwrap();
assert_eq!(
@@ -154,19 +234,35 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 1,
tokens_this_minute: 1500,
tokens_this_day: 1500,
input_tokens_this_month: 1000,
cache_creation_input_tokens_this_month: 500,
cache_read_input_tokens_this_month: 0,
output_tokens_this_month: 0,
tokens_this_month: TokenUsage {
input: 1000,
input_cache_creation: 500,
input_cache_read: 0,
output: 0,
},
spending_this_month: Cents::ZERO,
lifetime_spending: Cents::ZERO,
}
);
// Test cache read input tokens
db.record_usage(user_id, false, provider, model, 1000, 0, 300, 0, now)
.await
.unwrap();
db.record_usage(
user_id,
false,
provider,
model,
TokenUsage {
input: 1000,
input_cache_creation: 0,
input_cache_read: 300,
output: 0,
},
false,
Cents::ZERO,
now,
)
.await
.unwrap();
let usage = db.get_usage(user_id, provider, model, now).await.unwrap();
assert_eq!(
@@ -175,10 +271,12 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 2,
tokens_this_minute: 2800,
tokens_this_day: 2800,
input_tokens_this_month: 2000,
cache_creation_input_tokens_this_month: 500,
cache_read_input_tokens_this_month: 300,
output_tokens_this_month: 0,
tokens_this_month: TokenUsage {
input: 2000,
input_cache_creation: 500,
input_cache_read: 300,
output: 0,
},
spending_this_month: Cents::ZERO,
lifetime_spending: Cents::ZERO,
}

View File

@@ -111,6 +111,13 @@ async fn main() -> Result<()> {
let state = AppState::new(config, Executor::Production).await?;
if let Some(stripe_billing) = state.stripe_billing.clone() {
let executor = state.executor.clone();
executor.spawn_detached(async move {
stripe_billing.initialize().await.trace_err();
});
}
if mode.is_collab() {
state.db.purge_old_embeddings().await.trace_err();
RateLimiter::save_periodically(
@@ -125,6 +132,8 @@ async fn main() -> Result<()> {
let rpc_server = collab::rpc::Server::new(epoch, state.clone());
rpc_server.start().await?;
poll_stripe_events_periodically(state.clone(), rpc_server.clone());
app = app
.merge(collab::api::routes(rpc_server.clone()))
.merge(collab::rpc::routes(rpc_server.clone()));
@@ -133,7 +142,6 @@ async fn main() -> Result<()> {
}
if mode.is_api() {
poll_stripe_events_periodically(state.clone());
fetch_extensions_from_blob_store_periodically(state.clone());
spawn_user_backfiller(state.clone());
@@ -157,7 +165,7 @@ async fn main() -> Result<()> {
if let Some(mut llm_db) = llm_db {
llm_db.initialize().await?;
sync_llm_usage_with_stripe_periodically(state.clone(), llm_db);
sync_llm_usage_with_stripe_periodically(state.clone());
}
app = app

View File

@@ -36,8 +36,8 @@ use collections::{HashMap, HashSet};
pub use connection_pool::{ConnectionPool, ZedVersion};
use core::fmt::{self, Debug, Formatter};
use http_client::HttpClient;
use isahc_http_client::IsahcHttpClient;
use open_ai::{OpenAiEmbeddingModel, OPEN_AI_API_URL};
use reqwest_client::ReqwestClient;
use sha2::Digest;
use supermaven_api::{CreateExternalUserRequest, SupermavenAdminApi};
@@ -961,8 +961,8 @@ impl Server {
tracing::info!("connection opened");
let user_agent = format!("Zed Server/{}", env!("CARGO_PKG_VERSION"));
let http_client = match IsahcHttpClient::builder().default_header("User-Agent", user_agent).build() {
Ok(http_client) => Arc::new(IsahcHttpClient::from(http_client)),
let http_client = match ReqwestClient::user_agent(&user_agent) {
Ok(http_client) => Arc::new(http_client),
Err(error) => {
tracing::error!(?error, "failed to create HTTP client");
return;
@@ -1218,6 +1218,15 @@ impl Server {
Ok(())
}
pub async fn refresh_llm_tokens_for_user(self: &Arc<Self>, user_id: UserId) {
let pool = self.connection_pool.lock();
for connection_id in pool.user_connection_ids(user_id) {
self.peer
.send(connection_id, proto::RefreshLlmToken {})
.trace_err();
}
}
pub async fn snapshot<'a>(self: &'a Arc<Self>) -> ServerSnapshot<'a> {
ServerSnapshot {
connection_pool: ConnectionPoolGuard {
@@ -2228,7 +2237,7 @@ fn join_project_internal(
worktree_id: worktree.id,
path: settings_file.path,
content: Some(settings_file.content),
kind: Some(proto::update_user_settings::Kind::Settings.into()),
kind: Some(settings_file.kind.to_proto() as i32),
},
)?;
}

View File

@@ -0,0 +1,479 @@
use std::sync::Arc;
use crate::{llm, Cents, Result};
use anyhow::Context;
use chrono::{Datelike, Utc};
use collections::HashMap;
use serde::{Deserialize, Serialize};
use tokio::sync::RwLock;
pub struct StripeBilling {
state: RwLock<StripeBillingState>,
client: Arc<stripe::Client>,
}
#[derive(Default)]
struct StripeBillingState {
meters_by_event_name: HashMap<String, StripeMeter>,
price_ids_by_meter_id: HashMap<String, stripe::PriceId>,
}
pub struct StripeModel {
input_tokens_price: StripeBillingPrice,
input_cache_creation_tokens_price: StripeBillingPrice,
input_cache_read_tokens_price: StripeBillingPrice,
output_tokens_price: StripeBillingPrice,
}
struct StripeBillingPrice {
id: stripe::PriceId,
meter_event_name: String,
}
impl StripeBilling {
pub fn new(client: Arc<stripe::Client>) -> Self {
Self {
client,
state: RwLock::default(),
}
}
pub async fn initialize(&self) -> Result<()> {
log::info!("StripeBilling: initializing");
let mut state = self.state.write().await;
let (meters, prices) = futures::try_join!(
StripeMeter::list(&self.client),
stripe::Price::list(
&self.client,
&stripe::ListPrices {
limit: Some(100),
..Default::default()
}
)
)?;
for meter in meters.data {
state
.meters_by_event_name
.insert(meter.event_name.clone(), meter);
}
for price in prices.data {
if let Some(recurring) = price.recurring {
if let Some(meter) = recurring.meter {
state.price_ids_by_meter_id.insert(meter, price.id);
}
}
}
log::info!("StripeBilling: initialized");
Ok(())
}
pub async fn register_model(&self, model: &llm::db::model::Model) -> Result<StripeModel> {
let input_tokens_price = self
.get_or_insert_price(
&format!("model_{}/input_tokens", model.id),
&format!("{} (Input Tokens)", model.name),
Cents::new(model.price_per_million_input_tokens as u32),
)
.await?;
let input_cache_creation_tokens_price = self
.get_or_insert_price(
&format!("model_{}/input_cache_creation_tokens", model.id),
&format!("{} (Input Cache Creation Tokens)", model.name),
Cents::new(model.price_per_million_cache_creation_input_tokens as u32),
)
.await?;
let input_cache_read_tokens_price = self
.get_or_insert_price(
&format!("model_{}/input_cache_read_tokens", model.id),
&format!("{} (Input Cache Read Tokens)", model.name),
Cents::new(model.price_per_million_cache_read_input_tokens as u32),
)
.await?;
let output_tokens_price = self
.get_or_insert_price(
&format!("model_{}/output_tokens", model.id),
&format!("{} (Output Tokens)", model.name),
Cents::new(model.price_per_million_output_tokens as u32),
)
.await?;
Ok(StripeModel {
input_tokens_price,
input_cache_creation_tokens_price,
input_cache_read_tokens_price,
output_tokens_price,
})
}
async fn get_or_insert_price(
&self,
meter_event_name: &str,
price_description: &str,
price_per_million_tokens: Cents,
) -> Result<StripeBillingPrice> {
// Fast code path when the meter and the price already exist.
{
let state = self.state.read().await;
if let Some(meter) = state.meters_by_event_name.get(meter_event_name) {
if let Some(price_id) = state.price_ids_by_meter_id.get(&meter.id) {
return Ok(StripeBillingPrice {
id: price_id.clone(),
meter_event_name: meter_event_name.to_string(),
});
}
}
}
let mut state = self.state.write().await;
let meter = if let Some(meter) = state.meters_by_event_name.get(meter_event_name) {
meter.clone()
} else {
let meter = StripeMeter::create(
&self.client,
StripeCreateMeterParams {
default_aggregation: DefaultAggregation { formula: "sum" },
display_name: price_description.to_string(),
event_name: meter_event_name,
},
)
.await?;
state
.meters_by_event_name
.insert(meter_event_name.to_string(), meter.clone());
meter
};
let price_id = if let Some(price_id) = state.price_ids_by_meter_id.get(&meter.id) {
price_id.clone()
} else {
let price = stripe::Price::create(
&self.client,
stripe::CreatePrice {
active: Some(true),
billing_scheme: Some(stripe::PriceBillingScheme::PerUnit),
currency: stripe::Currency::USD,
currency_options: None,
custom_unit_amount: None,
expand: &[],
lookup_key: None,
metadata: None,
nickname: None,
product: None,
product_data: Some(stripe::CreatePriceProductData {
id: None,
active: Some(true),
metadata: None,
name: price_description.to_string(),
statement_descriptor: None,
tax_code: None,
unit_label: None,
}),
recurring: Some(stripe::CreatePriceRecurring {
aggregate_usage: None,
interval: stripe::CreatePriceRecurringInterval::Month,
interval_count: None,
trial_period_days: None,
usage_type: Some(stripe::CreatePriceRecurringUsageType::Metered),
meter: Some(meter.id.clone()),
}),
tax_behavior: None,
tiers: None,
tiers_mode: None,
transfer_lookup_key: None,
transform_quantity: None,
unit_amount: None,
unit_amount_decimal: Some(&format!(
"{:.12}",
price_per_million_tokens.0 as f64 / 1_000_000f64
)),
},
)
.await?;
state
.price_ids_by_meter_id
.insert(meter.id, price.id.clone());
price.id
};
Ok(StripeBillingPrice {
id: price_id,
meter_event_name: meter_event_name.to_string(),
})
}
pub async fn subscribe_to_model(
&self,
subscription_id: &stripe::SubscriptionId,
model: &StripeModel,
) -> Result<()> {
let subscription =
stripe::Subscription::retrieve(&self.client, &subscription_id, &[]).await?;
let mut items = Vec::new();
if !subscription_contains_price(&subscription, &model.input_tokens_price.id) {
items.push(stripe::UpdateSubscriptionItems {
price: Some(model.input_tokens_price.id.to_string()),
..Default::default()
});
}
if !subscription_contains_price(&subscription, &model.input_cache_creation_tokens_price.id)
{
items.push(stripe::UpdateSubscriptionItems {
price: Some(model.input_cache_creation_tokens_price.id.to_string()),
..Default::default()
});
}
if !subscription_contains_price(&subscription, &model.input_cache_read_tokens_price.id) {
items.push(stripe::UpdateSubscriptionItems {
price: Some(model.input_cache_read_tokens_price.id.to_string()),
..Default::default()
});
}
if !subscription_contains_price(&subscription, &model.output_tokens_price.id) {
items.push(stripe::UpdateSubscriptionItems {
price: Some(model.output_tokens_price.id.to_string()),
..Default::default()
});
}
if !items.is_empty() {
items.extend(subscription.items.data.iter().map(|item| {
stripe::UpdateSubscriptionItems {
id: Some(item.id.to_string()),
..Default::default()
}
}));
stripe::Subscription::update(
&self.client,
subscription_id,
stripe::UpdateSubscription {
items: Some(items),
..Default::default()
},
)
.await?;
}
Ok(())
}
pub async fn bill_model_usage(
&self,
customer_id: &stripe::CustomerId,
model: &StripeModel,
event: &llm::db::billing_event::Model,
) -> Result<()> {
let timestamp = Utc::now().timestamp();
if event.input_tokens > 0 {
StripeMeterEvent::create(
&self.client,
StripeCreateMeterEventParams {
identifier: &format!("input_tokens/{}", event.idempotency_key),
event_name: &model.input_tokens_price.meter_event_name,
payload: StripeCreateMeterEventPayload {
value: event.input_tokens as u64,
stripe_customer_id: customer_id,
},
timestamp: Some(timestamp),
},
)
.await?;
}
if event.input_cache_creation_tokens > 0 {
StripeMeterEvent::create(
&self.client,
StripeCreateMeterEventParams {
identifier: &format!("input_cache_creation_tokens/{}", event.idempotency_key),
event_name: &model.input_cache_creation_tokens_price.meter_event_name,
payload: StripeCreateMeterEventPayload {
value: event.input_cache_creation_tokens as u64,
stripe_customer_id: customer_id,
},
timestamp: Some(timestamp),
},
)
.await?;
}
if event.input_cache_read_tokens > 0 {
StripeMeterEvent::create(
&self.client,
StripeCreateMeterEventParams {
identifier: &format!("input_cache_read_tokens/{}", event.idempotency_key),
event_name: &model.input_cache_read_tokens_price.meter_event_name,
payload: StripeCreateMeterEventPayload {
value: event.input_cache_read_tokens as u64,
stripe_customer_id: customer_id,
},
timestamp: Some(timestamp),
},
)
.await?;
}
if event.output_tokens > 0 {
StripeMeterEvent::create(
&self.client,
StripeCreateMeterEventParams {
identifier: &format!("output_tokens/{}", event.idempotency_key),
event_name: &model.output_tokens_price.meter_event_name,
payload: StripeCreateMeterEventPayload {
value: event.output_tokens as u64,
stripe_customer_id: customer_id,
},
timestamp: Some(timestamp),
},
)
.await?;
}
Ok(())
}
pub async fn checkout(
&self,
customer_id: stripe::CustomerId,
github_login: &str,
model: &StripeModel,
success_url: &str,
) -> Result<String> {
let first_of_next_month = Utc::now()
.checked_add_months(chrono::Months::new(1))
.unwrap()
.with_day(1)
.unwrap();
let mut params = stripe::CreateCheckoutSession::new();
params.mode = Some(stripe::CheckoutSessionMode::Subscription);
params.customer = Some(customer_id);
params.client_reference_id = Some(github_login);
params.subscription_data = Some(stripe::CreateCheckoutSessionSubscriptionData {
billing_cycle_anchor: Some(first_of_next_month.timestamp()),
..Default::default()
});
params.line_items = Some(
[
&model.input_tokens_price.id,
&model.input_cache_creation_tokens_price.id,
&model.input_cache_read_tokens_price.id,
&model.output_tokens_price.id,
]
.into_iter()
.map(|price_id| stripe::CreateCheckoutSessionLineItems {
price: Some(price_id.to_string()),
..Default::default()
})
.collect(),
);
params.success_url = Some(success_url);
let session = stripe::CheckoutSession::create(&self.client, params).await?;
Ok(session.url.context("no checkout session URL")?)
}
}
#[derive(Serialize)]
struct DefaultAggregation {
formula: &'static str,
}
#[derive(Serialize)]
struct StripeCreateMeterParams<'a> {
default_aggregation: DefaultAggregation,
display_name: String,
event_name: &'a str,
}
#[derive(Clone, Deserialize)]
struct StripeMeter {
id: String,
event_name: String,
}
impl StripeMeter {
pub fn create(
client: &stripe::Client,
params: StripeCreateMeterParams,
) -> stripe::Response<Self> {
client.post_form("/billing/meters", params)
}
pub fn list(client: &stripe::Client) -> stripe::Response<stripe::List<Self>> {
#[derive(Serialize)]
struct Params {
#[serde(skip_serializing_if = "Option::is_none")]
limit: Option<u64>,
}
client.get_query("/billing/meters", Params { limit: Some(100) })
}
}
#[derive(Deserialize)]
struct StripeMeterEvent {
identifier: String,
}
impl StripeMeterEvent {
pub async fn create(
client: &stripe::Client,
params: StripeCreateMeterEventParams<'_>,
) -> Result<Self, stripe::StripeError> {
let identifier = params.identifier;
match client.post_form("/billing/meter_events", params).await {
Ok(event) => Ok(event),
Err(stripe::StripeError::Stripe(error)) => {
if error.http_status == 400
&& error
.message
.as_ref()
.map_or(false, |message| message.contains(identifier))
{
Ok(Self {
identifier: identifier.to_string(),
})
} else {
Err(stripe::StripeError::Stripe(error))
}
}
Err(error) => Err(error),
}
}
}
#[derive(Serialize)]
struct StripeCreateMeterEventParams<'a> {
identifier: &'a str,
event_name: &'a str,
payload: StripeCreateMeterEventPayload<'a>,
timestamp: Option<i64>,
}
#[derive(Serialize)]
struct StripeCreateMeterEventPayload<'a> {
value: u64,
stripe_customer_id: &'a stripe::CustomerId,
}
fn subscription_contains_price(
subscription: &stripe::Subscription,
price_id: &stripe::PriceId,
) -> bool {
subscription.items.data.iter().any(|item| {
item.price
.as_ref()
.map_or(false, |price| price.id == *price_id)
})
}
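Taken together, the new StripeBilling type is used in roughly this order: initialize the meter/price caches once at startup (see the main.rs change below), register a model's four prices, attach them to the customer's subscription, then report usage through meter events. A schematic sketch of that wiring (the function and its call order are assumptions, not code from the PR):

async fn bill_event_example(
    billing: &StripeBilling,
    model: &llm::db::model::Model,
    subscription_id: &stripe::SubscriptionId,
    customer_id: &stripe::CustomerId,
    event: &llm::db::billing_event::Model,
) -> Result<()> {
    // Get-or-create the Stripe meters and prices for this model.
    let stripe_model = billing.register_model(model).await?;
    // Add any of the four prices the subscription is still missing.
    billing.subscribe_to_model(subscription_id, &stripe_model).await?;
    // Meter events are keyed on the billing event's idempotency_key, so a retried
    // sync cannot double-bill: StripeMeterEvent::create above treats a 400 whose
    // error message references the identifier as success.
    billing.bill_model_usage(customer_id, &stripe_model, event).await
}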

View File

@@ -12,6 +12,7 @@ use editor::{
test::editor_test_context::{AssertionContextManager, EditorTestContext},
Editor,
};
use fs::Fs;
use futures::StreamExt;
use gpui::{TestAppContext, UpdateGlobal, VisualContext, VisualTestContext};
use indoc::indoc;
@@ -30,7 +31,7 @@ use serde_json::json;
use settings::SettingsStore;
use std::{
ops::Range,
path::Path,
path::{Path, PathBuf},
sync::{
atomic::{self, AtomicBool, AtomicUsize},
Arc,
@@ -60,7 +61,7 @@ async fn test_host_disconnect(
.fs()
.insert_tree(
"/a",
serde_json::json!({
json!({
"a.txt": "a-contents",
"b.txt": "b-contents",
}),
@@ -2152,6 +2153,295 @@ async fn test_git_blame_is_forwarded(cx_a: &mut TestAppContext, cx_b: &mut TestA
});
}
#[gpui::test(iterations = 30)]
async fn test_collaborating_with_editorconfig(
cx_a: &mut TestAppContext,
cx_b: &mut TestAppContext,
) {
let mut server = TestServer::start(cx_a.executor()).await;
let client_a = server.create_client(cx_a, "user_a").await;
let client_b = server.create_client(cx_b, "user_b").await;
server
.create_room(&mut [(&client_a, cx_a), (&client_b, cx_b)])
.await;
let active_call_a = cx_a.read(ActiveCall::global);
cx_b.update(editor::init);
// Set up a fake language server.
client_a.language_registry().add(rust_lang());
client_a
.fs()
.insert_tree(
"/a",
json!({
"src": {
"main.rs": "mod other;\nfn main() { let foo = other::foo(); }",
"other_mod": {
"other.rs": "pub fn foo() -> usize {\n 4\n}",
".editorconfig": "",
},
},
".editorconfig": "[*]\ntab_width = 2\n",
}),
)
.await;
let (project_a, worktree_id) = client_a.build_local_project("/a", cx_a).await;
let project_id = active_call_a
.update(cx_a, |call, cx| call.share_project(project_a.clone(), cx))
.await
.unwrap();
let main_buffer_a = project_a
.update(cx_a, |p, cx| {
p.open_buffer((worktree_id, "src/main.rs"), cx)
})
.await
.unwrap();
let other_buffer_a = project_a
.update(cx_a, |p, cx| {
p.open_buffer((worktree_id, "src/other_mod/other.rs"), cx)
})
.await
.unwrap();
let cx_a = cx_a.add_empty_window();
let main_editor_a =
cx_a.new_view(|cx| Editor::for_buffer(main_buffer_a, Some(project_a.clone()), cx));
let other_editor_a =
cx_a.new_view(|cx| Editor::for_buffer(other_buffer_a, Some(project_a), cx));
let mut main_editor_cx_a = EditorTestContext {
cx: cx_a.clone(),
window: cx_a.handle(),
editor: main_editor_a,
assertion_cx: AssertionContextManager::new(),
};
let mut other_editor_cx_a = EditorTestContext {
cx: cx_a.clone(),
window: cx_a.handle(),
editor: other_editor_a,
assertion_cx: AssertionContextManager::new(),
};
// Join the project as client B.
let project_b = client_b.join_remote_project(project_id, cx_b).await;
let main_buffer_b = project_b
.update(cx_b, |p, cx| {
p.open_buffer((worktree_id, "src/main.rs"), cx)
})
.await
.unwrap();
let other_buffer_b = project_b
.update(cx_b, |p, cx| {
p.open_buffer((worktree_id, "src/other_mod/other.rs"), cx)
})
.await
.unwrap();
let cx_b = cx_b.add_empty_window();
let main_editor_b =
cx_b.new_view(|cx| Editor::for_buffer(main_buffer_b, Some(project_b.clone()), cx));
let other_editor_b =
cx_b.new_view(|cx| Editor::for_buffer(other_buffer_b, Some(project_b.clone()), cx));
let mut main_editor_cx_b = EditorTestContext {
cx: cx_b.clone(),
window: cx_b.handle(),
editor: main_editor_b,
assertion_cx: AssertionContextManager::new(),
};
let mut other_editor_cx_b = EditorTestContext {
cx: cx_b.clone(),
window: cx_b.handle(),
editor: other_editor_b,
assertion_cx: AssertionContextManager::new(),
};
let initial_main = indoc! {"
ˇmod other;
fn main() { let foo = other::foo(); }"};
let initial_other = indoc! {"
ˇpub fn foo() -> usize {
4
}"};
let first_tabbed_main = indoc! {"
  ˇmod other;
fn main() { let foo = other::foo(); }"};
tab_undo_assert(
&mut main_editor_cx_a,
&mut main_editor_cx_b,
initial_main,
first_tabbed_main,
true,
);
tab_undo_assert(
&mut main_editor_cx_a,
&mut main_editor_cx_b,
initial_main,
first_tabbed_main,
false,
);
let first_tabbed_other = indoc! {"
  ˇpub fn foo() -> usize {
4
}"};
tab_undo_assert(
&mut other_editor_cx_a,
&mut other_editor_cx_b,
initial_other,
first_tabbed_other,
true,
);
tab_undo_assert(
&mut other_editor_cx_a,
&mut other_editor_cx_b,
initial_other,
first_tabbed_other,
false,
);
client_a
.fs()
.atomic_write(
PathBuf::from("/a/src/.editorconfig"),
"[*]\ntab_width = 3\n".to_owned(),
)
.await
.unwrap();
cx_a.run_until_parked();
cx_b.run_until_parked();
let second_tabbed_main = indoc! {"
   ˇmod other;
fn main() { let foo = other::foo(); }"};
tab_undo_assert(
&mut main_editor_cx_a,
&mut main_editor_cx_b,
initial_main,
second_tabbed_main,
true,
);
tab_undo_assert(
&mut main_editor_cx_a,
&mut main_editor_cx_b,
initial_main,
second_tabbed_main,
false,
);
let second_tabbed_other = indoc! {"
   ˇpub fn foo() -> usize {
4
}"};
tab_undo_assert(
&mut other_editor_cx_a,
&mut other_editor_cx_b,
initial_other,
second_tabbed_other,
true,
);
tab_undo_assert(
&mut other_editor_cx_a,
&mut other_editor_cx_b,
initial_other,
second_tabbed_other,
false,
);
let editorconfig_buffer_b = project_b
.update(cx_b, |p, cx| {
p.open_buffer((worktree_id, "src/other_mod/.editorconfig"), cx)
})
.await
.unwrap();
editorconfig_buffer_b.update(cx_b, |buffer, cx| {
buffer.set_text("[*.rs]\ntab_width = 6\n", cx);
});
project_b
.update(cx_b, |project, cx| {
project.save_buffer(editorconfig_buffer_b.clone(), cx)
})
.await
.unwrap();
cx_a.run_until_parked();
cx_b.run_until_parked();
tab_undo_assert(
&mut main_editor_cx_a,
&mut main_editor_cx_b,
initial_main,
second_tabbed_main,
true,
);
tab_undo_assert(
&mut main_editor_cx_a,
&mut main_editor_cx_b,
initial_main,
second_tabbed_main,
false,
);
let third_tabbed_other = indoc! {"
      ˇpub fn foo() -> usize {
4
}"};
tab_undo_assert(
&mut other_editor_cx_a,
&mut other_editor_cx_b,
initial_other,
third_tabbed_other,
true,
);
tab_undo_assert(
&mut other_editor_cx_a,
&mut other_editor_cx_b,
initial_other,
third_tabbed_other,
false,
);
}
#[track_caller]
fn tab_undo_assert(
cx_a: &mut EditorTestContext,
cx_b: &mut EditorTestContext,
expected_initial: &str,
expected_tabbed: &str,
a_tabs: bool,
) {
cx_a.assert_editor_state(expected_initial);
cx_b.assert_editor_state(expected_initial);
if a_tabs {
cx_a.update_editor(|editor, cx| {
editor.tab(&editor::actions::Tab, cx);
});
} else {
cx_b.update_editor(|editor, cx| {
editor.tab(&editor::actions::Tab, cx);
});
}
cx_a.run_until_parked();
cx_b.run_until_parked();
cx_a.assert_editor_state(expected_tabbed);
cx_b.assert_editor_state(expected_tabbed);
if a_tabs {
cx_a.update_editor(|editor, cx| {
editor.undo(&editor::actions::Undo, cx);
});
} else {
cx_b.update_editor(|editor, cx| {
editor.undo(&editor::actions::Undo, cx);
});
}
cx_a.run_until_parked();
cx_b.run_until_parked();
cx_a.assert_editor_state(expected_initial);
cx_b.assert_editor_state(expected_initial);
}
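// For readers tracing the assertions above, the tab widths resolve from the
// nearest matching .editorconfig (a schematic summary of this test's setup):
//   /a/.editorconfig                 [*]    tab_width = 2 -> both files, initially
//   /a/src/.editorconfig (written)   [*]    tab_width = 3 -> overrides main.rs and other.rs
//   /a/src/other_mod/.editorconfig   [*.rs] tab_width = 6 -> overrides other.rs only,
//                                                            while main.rs stays at 3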
fn extract_hint_labels(editor: &Editor) -> Vec<String> {
let mut labels = Vec::new();
for hint in editor.inlay_hint_cache().hints() {

View File

@@ -27,13 +27,14 @@ use language::{
use live_kit_client::MacOSDisplay;
use lsp::LanguageServerId;
use parking_lot::Mutex;
use project::lsp_store::FormatTarget;
use project::{
lsp_store::FormatTrigger, search::SearchQuery, search::SearchResult, DiagnosticSummary,
HoverBlockKind, Project, ProjectPath,
};
use rand::prelude::*;
use serde_json::json;
use settings::{LocalSettingsKind, SettingsStore};
use settings::SettingsStore;
use std::{
cell::{Cell, RefCell},
env, future, mem,
@@ -3327,16 +3328,8 @@ async fn test_local_settings(
.local_settings(worktree_b.read(cx).id())
.collect::<Vec<_>>(),
&[
(
Path::new("").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":2}"#.to_string()
),
(
Path::new("a").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":8}"#.to_string()
),
(Path::new("").into(), r#"{"tab_size":2}"#.to_string()),
(Path::new("a").into(), r#"{"tab_size":8}"#.to_string()),
]
)
});
@@ -3354,16 +3347,8 @@ async fn test_local_settings(
.local_settings(worktree_b.read(cx).id())
.collect::<Vec<_>>(),
&[
(
Path::new("").into(),
LocalSettingsKind::Settings,
r#"{}"#.to_string()
),
(
Path::new("a").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":8}"#.to_string()
),
(Path::new("").into(), r#"{}"#.to_string()),
(Path::new("a").into(), r#"{"tab_size":8}"#.to_string()),
]
)
});
@@ -3391,16 +3376,8 @@ async fn test_local_settings(
.local_settings(worktree_b.read(cx).id())
.collect::<Vec<_>>(),
&[
(
Path::new("a").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":8}"#.to_string()
),
(
Path::new("b").into(),
LocalSettingsKind::Settings,
r#"{"tab_size":4}"#.to_string()
),
(Path::new("a").into(), r#"{"tab_size":8}"#.to_string()),
(Path::new("b").into(), r#"{"tab_size":4}"#.to_string()),
]
)
});
@@ -3430,11 +3407,7 @@ async fn test_local_settings(
store
.local_settings(worktree_b.read(cx).id())
.collect::<Vec<_>>(),
&[(
Path::new("a").into(),
LocalSettingsKind::Settings,
r#"{"hard_tabs":true}"#.to_string()
),]
&[(Path::new("a").into(), r#"{"hard_tabs":true}"#.to_string()),]
)
});
}
@@ -4417,6 +4390,7 @@ async fn test_formatting_buffer(
HashSet::from_iter([buffer_b.clone()]),
true,
FormatTrigger::Save,
FormatTarget::Buffer,
cx,
)
})
@@ -4450,6 +4424,7 @@ async fn test_formatting_buffer(
HashSet::from_iter([buffer_b.clone()]),
true,
FormatTrigger::Save,
FormatTarget::Buffer,
cx,
)
})
@@ -4555,6 +4530,7 @@ async fn test_prettier_formatting_buffer(
HashSet::from_iter([buffer_b.clone()]),
true,
FormatTrigger::Save,
FormatTarget::Buffer,
cx,
)
})
@@ -4574,6 +4550,7 @@ async fn test_prettier_formatting_buffer(
HashSet::from_iter([buffer_a.clone()]),
true,
FormatTrigger::Manual,
FormatTarget::Buffer,
cx,
)
})

View File

@@ -2,10 +2,12 @@ use crate::tests::TestServer;
use call::ActiveCall;
use fs::{FakeFs, Fs as _};
use gpui::{Context as _, TestAppContext};
use language::language_settings::all_language_settings;
use http_client::BlockedHttpClient;
use language::{language_settings::language_settings, LanguageRegistry};
use node_runtime::NodeRuntime;
use project::ProjectPath;
use remote::SshRemoteClient;
use remote_server::HeadlessProject;
use remote_server::{HeadlessAppState, HeadlessProject};
use serde_json::json;
use std::{path::Path, sync::Arc};
@@ -24,7 +26,7 @@ async fn test_sharing_an_ssh_remote_project(
.await;
// Set up project on remote FS
let (client_ssh, server_ssh) = SshRemoteClient::fake(cx_a, server_cx);
let (port, server_ssh) = SshRemoteClient::fake_server(cx_a, server_cx);
let remote_fs = FakeFs::new(server_cx.executor());
remote_fs
.insert_tree(
@@ -48,9 +50,24 @@ async fn test_sharing_an_ssh_remote_project(
// User A connects to the remote project via SSH.
server_cx.update(HeadlessProject::init);
let _headless_project =
server_cx.new_model(|cx| HeadlessProject::new(server_ssh, remote_fs.clone(), cx));
let remote_http_client = Arc::new(BlockedHttpClient);
let node = NodeRuntime::unavailable();
let languages = Arc::new(LanguageRegistry::new(server_cx.executor()));
let _headless_project = server_cx.new_model(|cx| {
client::init_settings(cx);
HeadlessProject::new(
HeadlessAppState {
session: server_ssh,
fs: remote_fs.clone(),
http_client: remote_http_client,
node_runtime: node,
languages,
},
cx,
)
});
let client_ssh = SshRemoteClient::fake_client(port, cx_a).await;
let (project_a, worktree_id) = client_a
.build_ssh_project("/code/project1", client_ssh, cx_a)
.await;
@@ -118,9 +135,7 @@ async fn test_sharing_an_ssh_remote_project(
cx_b.read(|cx| {
let file = buffer_b.read(cx).file();
assert_eq!(
all_language_settings(file, cx)
.language(Some(&("Rust".into())))
.language_servers,
language_settings(Some("Rust".into()), file, cx).language_servers,
["override-rust-analyzer".to_string()]
)
});

View File

@@ -635,9 +635,11 @@ impl TestServer {
) -> Arc<AppState> {
Arc::new(AppState {
db: test_db.db().clone(),
llm_db: None,
live_kit_client: Some(Arc::new(live_kit_test_server.create_api_client())),
blob_store_client: None,
stripe_client: None,
stripe_billing: None,
rate_limiter: Arc::new(RateLimiter::new(test_db.db().clone())),
executor,
clickhouse_client: None,
@@ -677,8 +679,6 @@ impl TestServer {
migrations_path: None,
seed_path: None,
stripe_api_key: None,
stripe_llm_access_price_id: None,
stripe_llm_usage_price_id: None,
supermaven_admin_api_key: None,
user_backfiller_github_access_token: None,
},

View File

@@ -26,7 +26,7 @@ const JSON_RPC_VERSION: &str = "2.0";
const REQUEST_TIMEOUT: Duration = Duration::from_secs(60);
type ResponseHandler = Box<dyn Send + FnOnce(Result<String, Error>)>;
type NotificationHandler = Box<dyn Send + FnMut(RequestId, Value, AsyncAppContext)>;
type NotificationHandler = Box<dyn Send + FnMut(Value, AsyncAppContext)>;
#[derive(Debug, Clone, Eq, PartialEq, Hash, Serialize, Deserialize)]
#[serde(untagged)]
@@ -94,7 +94,6 @@ enum CspResult<T> {
#[derive(Serialize, Deserialize)]
struct Notification<'a, T> {
jsonrpc: &'static str,
id: RequestId,
#[serde(borrow)]
method: &'a str,
params: T,
@@ -103,7 +102,6 @@ struct Notification<'a, T> {
#[derive(Debug, Clone, Deserialize)]
struct AnyNotification<'a> {
jsonrpc: &'a str,
id: RequestId,
method: String,
#[serde(default)]
params: Option<Value>,
@@ -246,11 +244,7 @@ impl Client {
if let Some(handler) =
notification_handlers.get_mut(notification.method.as_str())
{
handler(
notification.id,
notification.params.unwrap_or(Value::Null),
cx.clone(),
);
handler(notification.params.unwrap_or(Value::Null), cx.clone());
}
}
}
@@ -378,10 +372,8 @@ impl Client {
/// Sends a notification to the context server without expecting a response.
/// This function serializes the notification and sends it through the outbound channel.
pub fn notify(&self, method: &str, params: impl Serialize) -> Result<()> {
let id = self.next_id.fetch_add(1, SeqCst);
let notification = serde_json::to_string(&Notification {
jsonrpc: JSON_RPC_VERSION,
id: RequestId::Int(id),
method,
params,
})
@@ -390,13 +382,13 @@ impl Client {
Ok(())
}
pub fn on_notification<F>(&self, method: &'static str, mut f: F)
pub fn on_notification<F>(&self, method: &'static str, f: F)
where
F: 'static + Send + FnMut(Value, AsyncAppContext),
{
self.notification_handlers
.lock()
.insert(method, Box::new(move |_, params, cx| f(params, cx)));
.insert(method, Box::new(f));
}
pub fn name(&self) -> &str {

View File

@@ -85,7 +85,7 @@ impl ContextServer {
)?;
let protocol = crate::protocol::ModelContextProtocol::new(client);
let client_info = types::EntityInfo {
let client_info = types::Implementation {
name: "Zed".to_string(),
version: env!("CARGO_PKG_VERSION").to_string(),
};

View File

@@ -11,8 +11,6 @@ use collections::HashMap;
use crate::client::Client;
use crate::types;
pub use types::PromptInfo;
const PROTOCOL_VERSION: u32 = 1;
pub struct ModelContextProtocol {
@@ -26,7 +24,7 @@ impl ModelContextProtocol {
pub async fn initialize(
self,
client_info: types::EntityInfo,
client_info: types::Implementation,
) -> Result<InitializedContextServerProtocol> {
let params = types::InitializeParams {
protocol_version: PROTOCOL_VERSION,
@@ -96,7 +94,7 @@ impl InitializedContextServerProtocol {
}
/// List the MCP prompts.
pub async fn list_prompts(&self) -> Result<Vec<types::PromptInfo>> {
pub async fn list_prompts(&self) -> Result<Vec<types::Prompt>> {
self.check_capability(ServerCapability::Prompts)?;
let response: types::PromptsListResponse = self
@@ -107,6 +105,18 @@ impl InitializedContextServerProtocol {
Ok(response.prompts)
}
/// List the MCP resources.
pub async fn list_resources(&self) -> Result<types::ResourcesListResponse> {
self.check_capability(ServerCapability::Resources)?;
let response: types::ResourcesListResponse = self
.inner
.request(types::RequestType::ResourcesList.as_str(), ())
.await?;
Ok(response)
}
/// Executes a prompt with the given arguments and returns the result.
pub async fn run_prompt<P: AsRef<str>>(
&self,

View File

@@ -15,6 +15,7 @@ pub enum RequestType {
PromptsGet,
PromptsList,
CompletionComplete,
Ping,
}
impl RequestType {
@@ -30,6 +31,7 @@ impl RequestType {
RequestType::PromptsGet => "prompts/get",
RequestType::PromptsList => "prompts/list",
RequestType::CompletionComplete => "completion/complete",
RequestType::Ping => "ping",
}
}
}
@@ -39,14 +41,15 @@ impl RequestType {
pub struct InitializeParams {
pub protocol_version: u32,
pub capabilities: ClientCapabilities,
pub client_info: EntityInfo,
pub client_info: Implementation,
}
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct CallToolParams {
pub name: String,
pub arguments: Option<serde_json::Value>,
#[serde(skip_serializing_if = "Option::is_none")]
pub arguments: Option<HashMap<String, serde_json::Value>>,
}
#[derive(Debug, Serialize)]
@@ -77,6 +80,7 @@ pub struct LoggingSetLevelParams {
#[serde(rename_all = "camelCase")]
pub struct PromptsGetParams {
pub name: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub arguments: Option<HashMap<String, String>>,
}
@@ -101,6 +105,13 @@ pub struct PromptReference {
pub name: String,
}
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct ResourceReference {
pub r#type: PromptReferenceType,
pub uri: Url,
}
#[derive(Debug, Serialize)]
#[serde(rename_all = "snake_case")]
pub enum PromptReferenceType {
@@ -110,13 +121,6 @@ pub enum PromptReferenceType {
Resource,
}
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct ResourceReference {
pub r#type: String,
pub uri: String,
}
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct CompletionArgument {
@@ -129,7 +133,7 @@ pub struct CompletionArgument {
pub struct InitializeResponse {
pub protocol_version: u32,
pub capabilities: ServerCapabilities,
pub server_info: EntityInfo,
pub server_info: Implementation,
}
#[derive(Debug, Deserialize)]
@@ -141,13 +145,39 @@ pub struct ResourcesReadResponse {
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ResourcesListResponse {
#[serde(skip_serializing_if = "Option::is_none")]
pub resource_templates: Option<Vec<ResourceTemplate>>,
pub resources: Vec<Resource>,
#[serde(skip_serializing_if = "Option::is_none")]
pub resources: Option<Vec<Resource>>,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct SamplingMessage {
pub role: SamplingRole,
pub content: SamplingContent,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum SamplingRole {
User,
Assistant,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(tag = "type")]
pub enum SamplingContent {
#[serde(rename = "text")]
Text { text: String },
#[serde(rename = "image")]
Image { data: String, mime_type: String },
}
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct PromptsGetResponse {
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
pub prompt: String,
}
@@ -155,7 +185,7 @@ pub struct PromptsGetResponse {
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct PromptsListResponse {
pub prompts: Vec<PromptInfo>,
pub prompts: Vec<Prompt>,
}
#[derive(Debug, Deserialize)]
@@ -168,61 +198,91 @@ pub struct CompletionCompleteResponse {
#[serde(rename_all = "camelCase")]
pub struct CompletionResult {
pub values: Vec<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub total: Option<u32>,
#[serde(skip_serializing_if = "Option::is_none")]
pub has_more: Option<bool>,
}
#[derive(Debug, Deserialize, Clone)]
#[derive(Debug, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct PromptInfo {
pub struct Prompt {
pub name: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub arguments: Option<Vec<PromptArgument>>,
}
#[derive(Debug, Deserialize, Clone)]
#[derive(Debug, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct PromptArgument {
pub name: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub required: Option<bool>,
}
// Shared Types
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ClientCapabilities {
#[serde(skip_serializing_if = "Option::is_none")]
pub experimental: Option<HashMap<String, serde_json::Value>>,
pub sampling: Option<HashMap<String, serde_json::Value>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub sampling: Option<serde_json::Value>,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ServerCapabilities {
#[serde(skip_serializing_if = "Option::is_none")]
pub experimental: Option<HashMap<String, serde_json::Value>>,
pub logging: Option<HashMap<String, serde_json::Value>>,
pub prompts: Option<HashMap<String, serde_json::Value>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub logging: Option<serde_json::Value>,
#[serde(skip_serializing_if = "Option::is_none")]
pub prompts: Option<PromptsCapabilities>,
#[serde(skip_serializing_if = "Option::is_none")]
pub resources: Option<ResourcesCapabilities>,
pub tools: Option<HashMap<String, serde_json::Value>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub tools: Option<ToolsCapabilities>,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct PromptsCapabilities {
#[serde(skip_serializing_if = "Option::is_none")]
pub list_changed: Option<bool>,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ResourcesCapabilities {
#[serde(skip_serializing_if = "Option::is_none")]
pub subscribe: Option<bool>,
#[serde(skip_serializing_if = "Option::is_none")]
pub list_changed: Option<bool>,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ToolsCapabilities {
#[serde(skip_serializing_if = "Option::is_none")]
pub list_changed: Option<bool>,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct Tool {
pub name: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
pub input_schema: serde_json::Value,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct EntityInfo {
pub struct Implementation {
pub name: String,
pub version: String,
}
@@ -231,6 +291,10 @@ pub struct EntityInfo {
#[serde(rename_all = "camelCase")]
pub struct Resource {
pub uri: Url,
pub name: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub mime_type: Option<String>,
}
@@ -238,17 +302,23 @@ pub struct Resource {
#[serde(rename_all = "camelCase")]
pub struct ResourceContent {
pub uri: Url,
#[serde(skip_serializing_if = "Option::is_none")]
pub mime_type: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub text: Option<String>,
pub data: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub blob: Option<String>,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ResourceTemplate {
pub uri_template: String,
pub name: Option<String>,
pub name: String,
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
#[serde(skip_serializing_if = "Option::is_none")]
pub mime_type: Option<String>,
}
#[derive(Debug, Serialize, Deserialize)]
@@ -260,13 +330,16 @@ pub enum LoggingLevel {
Error,
}
// Client Notifications
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
pub enum NotificationType {
Initialized,
Progress,
Message,
ResourcesUpdated,
ResourcesListChanged,
ToolsListChanged,
PromptsListChanged,
}
impl NotificationType {
@@ -274,6 +347,11 @@ impl NotificationType {
match self {
NotificationType::Initialized => "notifications/initialized",
NotificationType::Progress => "notifications/progress",
NotificationType::Message => "notifications/message",
NotificationType::ResourcesUpdated => "notifications/resources/updated",
NotificationType::ResourcesListChanged => "notifications/resources/list_changed",
NotificationType::ToolsListChanged => "notifications/tools/list_changed",
NotificationType::PromptsListChanged => "notifications/prompts/list_changed",
}
}
}
@@ -288,12 +366,13 @@ pub enum ClientNotification {
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct ProgressParams {
pub progress_token: String,
pub progress_token: ProgressToken,
pub progress: f64,
#[serde(skip_serializing_if = "Option::is_none")]
pub total: Option<f64>,
}
// Helper Types that don't map directly to the protocol
pub type ProgressToken = String;
pub enum CompletionTotal {
Exact(u32),

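A recurring pattern in the type changes above is pairing rename_all = "camelCase" with skip_serializing_if = "Option::is_none", so unset optional capabilities are omitted from the wire format entirely. A minimal self-contained illustration (Caps is an invented name, not a type from the PR):

use serde::Serialize;

#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct Caps {
    #[serde(skip_serializing_if = "Option::is_none")]
    list_changed: Option<bool>,
    #[serde(skip_serializing_if = "Option::is_none")]
    experimental: Option<serde_json::Value>,
}

fn main() {
    let json = serde_json::to_string(&Caps { list_changed: Some(true), experimental: None }).unwrap();
    // None fields vanish and keys are camelCased on the wire.
    assert_eq!(json, r#"{"listChanged":true}"#);
}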
View File

@@ -864,7 +864,11 @@ impl Copilot {
let buffer = buffer.read(cx);
let uri = registered_buffer.uri.clone();
let position = position.to_point_utf16(buffer);
let settings = language_settings(buffer.language_at(position).as_ref(), buffer.file(), cx);
let settings = language_settings(
buffer.language_at(position).map(|l| l.name()),
buffer.file(),
cx,
);
let tab_size = settings.tab_size;
let hard_tabs = settings.hard_tabs;
let relative_path = buffer

View File

@@ -77,7 +77,7 @@ impl InlineCompletionProvider for CopilotCompletionProvider {
let file = buffer.file();
let language = buffer.language_at(cursor_position);
let settings = all_language_settings(file, cx);
settings.inline_completions_enabled(language.as_ref(), file.map(|f| f.path().as_ref()))
settings.inline_completions_enabled(language.as_ref(), file.map(|f| f.path().as_ref()), cx)
}
fn refresh(
@@ -209,7 +209,7 @@ impl InlineCompletionProvider for CopilotCompletionProvider {
) {
let settings = AllLanguageSettings::get_global(cx);
let copilot_enabled = settings.inline_completions_enabled(None, None);
let copilot_enabled = settings.inline_completions_enabled(None, None, cx);
if !copilot_enabled {
return;

View File

@@ -962,7 +962,6 @@ fn random_diagnostic(
const FILE_HEADER: &str = "file header";
const EXCERPT_HEADER: &str = "excerpt header";
const EXCERPT_FOOTER: &str = "excerpt footer";
fn editor_blocks(
editor: &View<Editor>,
@@ -998,7 +997,7 @@ fn editor_blocks(
.ok()?
}
Block::ExcerptHeader {
Block::ExcerptBoundary {
starts_new_buffer, ..
} => {
if *starts_new_buffer {
@@ -1007,7 +1006,6 @@ fn editor_blocks(
EXCERPT_HEADER.into()
}
}
Block::ExcerptFooter { .. } => EXCERPT_FOOTER.into(),
};
Some((row, name))

View File

@@ -237,6 +237,7 @@ gpui::actions!(
ToggleFold,
ToggleFoldRecursive,
Format,
FormatSelections,
GoToDeclaration,
GoToDeclarationSplit,
GoToDefinition,
@@ -294,6 +295,7 @@ gpui::actions!(
RevealInFileManager,
ReverseLines,
RevertFile,
ReloadFile,
RevertSelectedHunks,
Rewrap,
ScrollCursorBottom,

View File

@@ -423,11 +423,12 @@ impl DisplayMap {
}
fn tab_size(buffer: &Model<MultiBuffer>, cx: &mut ModelContext<Self>) -> NonZeroU32 {
let buffer = buffer.read(cx).as_singleton().map(|buffer| buffer.read(cx));
let language = buffer
.read(cx)
.as_singleton()
.and_then(|buffer| buffer.read(cx).language());
language_settings(language, None, cx).tab_size
.and_then(|buffer| buffer.language())
.map(|l| l.name());
let file = buffer.and_then(|buffer| buffer.file());
language_settings(language, file, cx).tab_size
}
#[cfg(test)]

View File

@@ -5,8 +5,8 @@ use super::{
use crate::{EditorStyle, GutterDimensions};
use collections::{Bound, HashMap, HashSet};
use gpui::{AnyElement, EntityId, Pixels, WindowContext};
use language::{BufferSnapshot, Chunk, Patch, Point};
use multi_buffer::{Anchor, ExcerptId, ExcerptRange, MultiBufferRow, ToPoint as _};
use language::{Chunk, Patch, Point};
use multi_buffer::{Anchor, ExcerptId, ExcerptInfo, MultiBufferRow, ToPoint as _};
use parking_lot::Mutex;
use std::{
cell::RefCell,
@@ -128,26 +128,17 @@ pub struct BlockContext<'a, 'b> {
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum BlockId {
Custom(CustomBlockId),
ExcerptHeader(ExcerptId),
ExcerptFooter(ExcerptId),
}
impl From<BlockId> for EntityId {
fn from(value: BlockId) -> Self {
match value {
BlockId::Custom(CustomBlockId(id)) => EntityId::from(id as u64),
BlockId::ExcerptHeader(id) => id.into(),
BlockId::ExcerptFooter(id) => id.into(),
}
}
ExcerptBoundary(Option<ExcerptId>),
}
impl From<BlockId> for ElementId {
fn from(value: BlockId) -> Self {
match value {
BlockId::Custom(CustomBlockId(id)) => ("Block", id).into(),
BlockId::ExcerptHeader(id) => ("ExcerptHeader", EntityId::from(id)).into(),
BlockId::ExcerptFooter(id) => ("ExcerptFooter", EntityId::from(id)).into(),
BlockId::ExcerptBoundary(next_excerpt) => match next_excerpt {
Some(id) => ("ExcerptBoundary", EntityId::from(id)).into(),
None => "LastExcerptBoundary".into(),
},
}
}
}
@@ -156,8 +147,7 @@ impl std::fmt::Display for BlockId {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Self::Custom(id) => write!(f, "Block({id:?})"),
Self::ExcerptHeader(id) => write!(f, "ExcerptHeader({id:?})"),
Self::ExcerptFooter(id) => write!(f, "ExcerptFooter({id:?})"),
Self::ExcerptBoundary(id) => write!(f, "ExcerptBoundary({id:?})"),
}
}
}
@@ -177,8 +167,7 @@ struct Transform {
pub(crate) enum BlockType {
Custom(CustomBlockId),
Header,
Footer,
ExcerptBoundary,
}
pub(crate) trait BlockLike {
@@ -191,27 +180,20 @@ pub(crate) trait BlockLike {
#[derive(Clone)]
pub enum Block {
Custom(Arc<CustomBlock>),
ExcerptHeader {
id: ExcerptId,
buffer: BufferSnapshot,
range: ExcerptRange<text::Anchor>,
ExcerptBoundary {
prev_excerpt: Option<ExcerptInfo>,
next_excerpt: Option<ExcerptInfo>,
height: u32,
starts_new_buffer: bool,
show_excerpt_controls: bool,
},
ExcerptFooter {
id: ExcerptId,
disposition: BlockDisposition,
height: u32,
},
}
impl BlockLike for Block {
fn block_type(&self) -> BlockType {
match self {
Block::Custom(block) => BlockType::Custom(block.id),
Block::ExcerptHeader { .. } => BlockType::Header,
Block::ExcerptFooter { .. } => BlockType::Footer,
Block::ExcerptBoundary { .. } => BlockType::ExcerptBoundary,
}
}
@@ -222,8 +204,7 @@ impl BlockLike for Block {
fn priority(&self) -> usize {
match self {
Block::Custom(block) => block.priority,
Block::ExcerptHeader { .. } => usize::MAX,
Block::ExcerptFooter { .. } => 0,
Block::ExcerptBoundary { .. } => usize::MAX,
}
}
}
@@ -232,32 +213,36 @@ impl Block {
pub fn id(&self) -> BlockId {
match self {
Block::Custom(block) => BlockId::Custom(block.id),
Block::ExcerptHeader { id, .. } => BlockId::ExcerptHeader(*id),
Block::ExcerptFooter { id, .. } => BlockId::ExcerptFooter(*id),
Block::ExcerptBoundary { next_excerpt, .. } => {
BlockId::ExcerptBoundary(next_excerpt.as_ref().map(|info| info.id))
}
}
}
fn disposition(&self) -> BlockDisposition {
match self {
Block::Custom(block) => block.disposition,
Block::ExcerptHeader { .. } => BlockDisposition::Above,
Block::ExcerptFooter { disposition, .. } => *disposition,
Block::ExcerptBoundary { next_excerpt, .. } => {
if next_excerpt.is_some() {
BlockDisposition::Above
} else {
BlockDisposition::Below
}
}
}
}
pub fn height(&self) -> u32 {
match self {
Block::Custom(block) => block.height,
Block::ExcerptHeader { height, .. } => *height,
Block::ExcerptFooter { height, .. } => *height,
Block::ExcerptBoundary { height, .. } => *height,
}
}
pub fn style(&self) -> BlockStyle {
match self {
Block::Custom(block) => block.style,
Block::ExcerptHeader { .. } => BlockStyle::Sticky,
Block::ExcerptFooter { .. } => BlockStyle::Sticky,
Block::ExcerptBoundary { .. } => BlockStyle::Sticky,
}
}
}
@@ -266,24 +251,17 @@ impl Debug for Block {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Self::Custom(block) => f.debug_struct("Custom").field("block", block).finish(),
Self::ExcerptHeader {
buffer,
Self::ExcerptBoundary {
starts_new_buffer,
id,
next_excerpt,
prev_excerpt,
..
} => f
.debug_struct("ExcerptHeader")
.field("id", &id)
.field("path", &buffer.file().map(|f| f.path()))
.debug_struct("ExcerptBoundary")
.field("prev_excerpt", &prev_excerpt)
.field("next_excerpt", &next_excerpt)
.field("starts_new_buffer", &starts_new_buffer)
.finish(),
Block::ExcerptFooter {
id, disposition, ..
} => f
.debug_struct("ExcerptFooter")
.field("id", &id)
.field("disposition", &disposition)
.finish(),
}
}
}
@@ -595,66 +573,62 @@ impl BlockMap {
{
buffer
.excerpt_boundaries_in_range(range)
.flat_map(move |excerpt_boundary| {
let mut wrap_row = wrap_snapshot
.make_wrap_point(Point::new(excerpt_boundary.row.0, 0), Bias::Left)
.row();
[
show_excerpt_controls
.then(|| {
let disposition;
if excerpt_boundary.next.is_some() {
disposition = BlockDisposition::Above;
} else {
wrap_row = wrap_snapshot
.make_wrap_point(
Point::new(
excerpt_boundary.row.0,
buffer.line_len(excerpt_boundary.row),
),
Bias::Left,
)
.row();
disposition = BlockDisposition::Below;
}
excerpt_boundary.prev.as_ref().map(|prev| {
(
wrap_row,
Block::ExcerptFooter {
id: prev.id,
height: excerpt_footer_height,
disposition,
},
)
})
})
.flatten(),
excerpt_boundary.next.map(|next| {
let starts_new_buffer = excerpt_boundary
.prev
.map_or(true, |prev| prev.buffer_id != next.buffer_id);
(
wrap_row,
Block::ExcerptHeader {
id: next.id,
buffer: next.buffer,
range: next.range,
height: if starts_new_buffer {
buffer_header_height
} else {
excerpt_header_height
},
starts_new_buffer,
show_excerpt_controls,
},
.filter_map(move |excerpt_boundary| {
let wrap_row;
if excerpt_boundary.next.is_some() {
wrap_row = wrap_snapshot
.make_wrap_point(Point::new(excerpt_boundary.row.0, 0), Bias::Left)
.row();
} else {
wrap_row = wrap_snapshot
.make_wrap_point(
Point::new(
excerpt_boundary.row.0,
buffer.line_len(excerpt_boundary.row),
),
Bias::Left,
)
}),
]
.row();
}
let starts_new_buffer = match (&excerpt_boundary.prev, &excerpt_boundary.next) {
(_, None) => false,
(None, Some(_)) => true,
(Some(prev), Some(next)) => prev.buffer_id != next.buffer_id,
};
let mut height = 0;
if excerpt_boundary.prev.is_some() {
if show_excerpt_controls {
height += excerpt_footer_height;
}
}
if excerpt_boundary.next.is_some() {
if starts_new_buffer {
height += buffer_header_height;
if show_excerpt_controls {
height += excerpt_header_height;
}
} else {
height += excerpt_header_height;
}
}
if height == 0 {
return None;
}
Some((
wrap_row,
Block::ExcerptBoundary {
prev_excerpt: excerpt_boundary.prev,
next_excerpt: excerpt_boundary.next,
height,
starts_new_buffer,
show_excerpt_controls,
},
))
})
.flatten()
}
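
The height rule in this hunk is the crux of merging the old header and footer blocks into a single `ExcerptBoundary`. Here is a standalone sketch of that rule; the function and parameter names are illustrative, not Zed's actual API, but the arithmetic mirrors the updated test expectations further down.

```rust
// Standalone sketch of the boundary-height rule implemented in the hunk above.
fn boundary_height(
    has_prev: bool,
    has_next: bool,
    starts_new_buffer: bool,
    show_excerpt_controls: bool,
    buffer_header_height: u32,
    excerpt_header_height: u32,
    excerpt_footer_height: u32,
) -> Option<u32> {
    let mut height = 0;
    // Footer for the excerpt that ends here, only when controls are shown.
    if has_prev && show_excerpt_controls {
        height += excerpt_footer_height;
    }
    // Header for the excerpt that starts here: a new buffer gets the taller
    // file header, plus an excerpt header when controls are shown.
    if has_next {
        if starts_new_buffer {
            height += buffer_header_height;
            if show_excerpt_controls {
                height += excerpt_header_height;
            }
        } else {
            height += excerpt_header_height;
        }
    }
    // A zero-height boundary produces no block at all.
    (height > 0).then_some(height)
}

fn main() {
    // First excerpt: no prev, new buffer, controls on -> header rows only.
    assert_eq!(boundary_height(false, true, true, true, 1, 1, 1), Some(2));
    // Interior boundary between buffers: footer + file header + header.
    assert_eq!(boundary_height(true, true, true, true, 1, 1, 1), Some(3));
    // Trailing boundary at the end of the multibuffer: footer only.
    assert_eq!(boundary_height(true, false, false, true, 1, 1, 1), Some(1));
}
```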
pub(crate) fn sort_blocks<B: BlockLike>(blocks: &mut [(u32, B)]) {
@@ -665,12 +639,9 @@ impl BlockMap {
.disposition()
.cmp(&block_b.disposition())
.then_with(|| match ((block_a.block_type()), (block_b.block_type())) {
(BlockType::Footer, BlockType::Footer) => Ordering::Equal,
(BlockType::Footer, _) => Ordering::Less,
(_, BlockType::Footer) => Ordering::Greater,
(BlockType::Header, BlockType::Header) => Ordering::Equal,
(BlockType::Header, _) => Ordering::Less,
(_, BlockType::Header) => Ordering::Greater,
(BlockType::ExcerptBoundary, BlockType::ExcerptBoundary) => Ordering::Equal,
(BlockType::ExcerptBoundary, _) => Ordering::Less,
(_, BlockType::ExcerptBoundary) => Ordering::Greater,
(BlockType::Custom(a_id), BlockType::Custom(b_id)) => block_b
.priority()
.cmp(&block_a.priority())
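
For reference, the ordering this comparator now produces at a given row and disposition, as a minimal standalone sketch. `BlockType` is mocked here; the real comparator also breaks custom-vs-custom ties by block id after priority (truncated in the hunk above).

```rust
use std::cmp::Ordering;

#[derive(Debug, PartialEq)]
enum BlockType {
    ExcerptBoundary,
    Custom { priority: usize },
}

fn compare(a: &BlockType, b: &BlockType) -> Ordering {
    match (a, b) {
        (BlockType::ExcerptBoundary, BlockType::ExcerptBoundary) => Ordering::Equal,
        // A single boundary variant replaces the old Footer-then-Header rules.
        (BlockType::ExcerptBoundary, _) => Ordering::Less,
        (_, BlockType::ExcerptBoundary) => Ordering::Greater,
        (BlockType::Custom { priority: a }, BlockType::Custom { priority: b }) => b.cmp(a),
    }
}

fn main() {
    let mut blocks = vec![
        BlockType::Custom { priority: 1 },
        BlockType::ExcerptBoundary,
        BlockType::Custom { priority: 9 },
    ];
    blocks.sort_by(compare);
    // Boundary first, then custom blocks by descending priority.
    assert_eq!(blocks[0], BlockType::ExcerptBoundary);
    assert_eq!(blocks[1], BlockType::Custom { priority: 9 });
}
```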
@@ -1045,33 +1016,19 @@ impl BlockSnapshot {
let custom_block = self.custom_blocks_by_id.get(&custom_block_id)?;
Some(Block::Custom(custom_block.clone()))
}
BlockId::ExcerptHeader(excerpt_id) => {
let excerpt_range = buffer.range_for_excerpt::<Point>(excerpt_id)?;
let wrap_point = self
.wrap_snapshot
.make_wrap_point(excerpt_range.start, Bias::Left);
let mut cursor = self.transforms.cursor::<(WrapRow, BlockRow)>(&());
cursor.seek(&WrapRow(wrap_point.row()), Bias::Left, &());
while let Some(transform) = cursor.item() {
if let Some(block) = transform.block.as_ref() {
if block.id() == block_id {
return Some(block.clone());
}
} else if cursor.start().0 > WrapRow(wrap_point.row()) {
break;
}
cursor.next(&());
BlockId::ExcerptBoundary(next_excerpt_id) => {
let wrap_point;
if let Some(next_excerpt_id) = next_excerpt_id {
let excerpt_range = buffer.range_for_excerpt::<Point>(next_excerpt_id)?;
wrap_point = self
.wrap_snapshot
.make_wrap_point(excerpt_range.start, Bias::Left);
} else {
wrap_point = self
.wrap_snapshot
.make_wrap_point(buffer.max_point(), Bias::Left);
}
None
}
BlockId::ExcerptFooter(excerpt_id) => {
let excerpt_range = buffer.range_for_excerpt::<Point>(excerpt_id)?;
let wrap_point = self
.wrap_snapshot
.make_wrap_point(excerpt_range.end, Bias::Left);
let mut cursor = self.transforms.cursor::<(WrapRow, BlockRow)>(&());
cursor.seek(&WrapRow(wrap_point.row()), Bias::Left, &());
while let Some(transform) = cursor.item() {
@@ -1468,7 +1425,7 @@ mod tests {
};
use gpui::{div, font, px, AppContext, Context as _, Element};
use language::{Buffer, Capability};
use multi_buffer::MultiBuffer;
use multi_buffer::{ExcerptRange, MultiBuffer};
use rand::prelude::*;
use settings::SettingsStore;
use std::env;
@@ -1724,22 +1681,20 @@ mod tests {
// Each excerpt has a header above and footer below. Excerpts are also *separated* by a newline.
assert_eq!(
snapshot.text(),
"\nBuff\ner 1\n\n\nBuff\ner 2\n\n\nBuff\ner 3\n"
"\n\nBuff\ner 1\n\n\n\nBuff\ner 2\n\n\n\nBuff\ner 3\n"
);
let blocks: Vec<_> = snapshot
.blocks_in_range(0..u32::MAX)
.map(|(row, block)| (row, block.id()))
.map(|(row, block)| (row..row + block.height(), block.id()))
.collect();
assert_eq!(
blocks,
vec![
(0, BlockId::ExcerptHeader(excerpt_ids[0])),
(3, BlockId::ExcerptFooter(excerpt_ids[0])),
(4, BlockId::ExcerptHeader(excerpt_ids[1])),
(7, BlockId::ExcerptFooter(excerpt_ids[1])),
(8, BlockId::ExcerptHeader(excerpt_ids[2])),
(11, BlockId::ExcerptFooter(excerpt_ids[2]))
(0..2, BlockId::ExcerptBoundary(Some(excerpt_ids[0]))), // path, header
(4..7, BlockId::ExcerptBoundary(Some(excerpt_ids[1]))), // footer, path, header
(9..12, BlockId::ExcerptBoundary(Some(excerpt_ids[2]))), // footer, path, header
(14..15, BlockId::ExcerptBoundary(None)), // footer
]
);
}
@@ -2283,13 +2238,10 @@ mod tests {
#[derive(Debug, Eq, PartialEq)]
enum ExpectedBlock {
ExcerptHeader {
ExcerptBoundary {
height: u32,
starts_new_buffer: bool,
},
ExcerptFooter {
height: u32,
disposition: BlockDisposition,
is_last: bool,
},
Custom {
disposition: BlockDisposition,
@@ -2303,8 +2255,7 @@ mod tests {
fn block_type(&self) -> BlockType {
match self {
ExpectedBlock::Custom { id, .. } => BlockType::Custom(*id),
ExpectedBlock::ExcerptHeader { .. } => BlockType::Header,
ExpectedBlock::ExcerptFooter { .. } => BlockType::Footer,
ExpectedBlock::ExcerptBoundary { .. } => BlockType::ExcerptBoundary,
}
}
@@ -2315,8 +2266,7 @@ mod tests {
fn priority(&self) -> usize {
match self {
ExpectedBlock::Custom { priority, .. } => *priority,
ExpectedBlock::ExcerptHeader { .. } => usize::MAX,
ExpectedBlock::ExcerptFooter { .. } => 0,
ExpectedBlock::ExcerptBoundary { .. } => usize::MAX,
}
}
}
@@ -2324,17 +2274,21 @@ mod tests {
impl ExpectedBlock {
fn height(&self) -> u32 {
match self {
ExpectedBlock::ExcerptHeader { height, .. } => *height,
ExpectedBlock::ExcerptBoundary { height, .. } => *height,
ExpectedBlock::Custom { height, .. } => *height,
ExpectedBlock::ExcerptFooter { height, .. } => *height,
}
}
fn disposition(&self) -> BlockDisposition {
match self {
ExpectedBlock::ExcerptHeader { .. } => BlockDisposition::Above,
ExpectedBlock::ExcerptBoundary { is_last, .. } => {
if *is_last {
BlockDisposition::Below
} else {
BlockDisposition::Above
}
}
ExpectedBlock::Custom { disposition, .. } => *disposition,
ExpectedBlock::ExcerptFooter { disposition, .. } => *disposition,
}
}
}
@@ -2348,21 +2302,15 @@ mod tests {
height: block.height,
priority: block.priority,
},
Block::ExcerptHeader {
Block::ExcerptBoundary {
height,
starts_new_buffer,
next_excerpt,
..
} => ExpectedBlock::ExcerptHeader {
} => ExpectedBlock::ExcerptBoundary {
height,
starts_new_buffer,
},
Block::ExcerptFooter {
height,
disposition,
..
} => ExpectedBlock::ExcerptFooter {
height,
disposition,
is_last: next_excerpt.is_none(),
},
}
}
@@ -2380,8 +2328,7 @@ mod tests {
fn as_custom(&self) -> Option<&CustomBlock> {
match self {
Block::Custom(block) => Some(block),
Block::ExcerptHeader { .. } => None,
Block::ExcerptFooter { .. } => None,
Block::ExcerptBoundary { .. } => None,
}
}
}



@@ -48,7 +48,6 @@ mod signature_help;
pub mod test;
use ::git::diff::DiffHunkStatus;
use ::git::{parse_git_remote_url, BuildPermalinkParams, GitHostingProviderRegistry};
pub(crate) use actions::*;
use aho_corasick::AhoCorasick;
use anyhow::{anyhow, Context as _, Result};
@@ -74,12 +73,12 @@ use git::blame::GitBlame;
use gpui::{
div, impl_actions, point, prelude::*, px, relative, size, uniform_list, Action, AnyElement,
AppContext, AsyncWindowContext, AvailableSpace, BackgroundExecutor, Bounds, ClipboardEntry,
ClipboardItem, Context, DispatchPhase, ElementId, EntityId, EventEmitter, FocusHandle,
FocusOutEvent, FocusableView, FontId, FontWeight, HighlightStyle, Hsla, InteractiveText,
KeyContext, ListSizingBehavior, Model, MouseButton, PaintQuad, ParentElement, Pixels, Render,
SharedString, Size, StrikethroughStyle, Styled, StyledText, Subscription, Task, TextStyle,
UTF16Selection, UnderlineStyle, UniformListScrollHandle, View, ViewContext, ViewInputHandler,
VisualContext, WeakFocusHandle, WeakView, WindowContext,
ClipboardItem, Context, DispatchPhase, ElementId, EventEmitter, FocusHandle, FocusOutEvent,
FocusableView, FontId, FontWeight, HighlightStyle, Hsla, InteractiveText, KeyContext,
ListSizingBehavior, Model, MouseButton, PaintQuad, ParentElement, Pixels, Render, SharedString,
Size, StrikethroughStyle, Styled, StyledText, Subscription, Task, TextStyle, UTF16Selection,
UnderlineStyle, UniformListScrollHandle, View, ViewContext, ViewInputHandler, VisualContext,
WeakFocusHandle, WeakView, WindowContext,
};
use highlight_matching_bracket::refresh_matching_bracket_highlights;
use hover_popover::{hide_hover, HoverState};
@@ -91,15 +90,17 @@ pub use inline_completion_provider::*;
pub use items::MAX_TAB_TITLE_LEN;
use itertools::Itertools;
use language::{
language_settings::{self, all_language_settings, InlayHintSettings},
language_settings::{self, all_language_settings, language_settings, InlayHintSettings},
markdown, point_from_lsp, AutoindentMode, BracketPair, Buffer, Capability, CharKind, CodeLabel,
CursorShape, Diagnostic, Documentation, IndentKind, IndentSize, Language, OffsetRangeExt,
Point, Selection, SelectionGoal, TransactionId,
};
use language::{point_to_lsp, BufferRow, CharClassifier, Runnable, RunnableRange};
use language::{
point_to_lsp, BufferRow, CharClassifier, LanguageServerName, Runnable, RunnableRange,
};
use linked_editing_ranges::refresh_linked_ranges;
pub use proposed_changes_editor::{
ProposedChangesBuffer, ProposedChangesEditor, ProposedChangesEditorToolbar,
ProposedChangeLocation, ProposedChangesEditor, ProposedChangesEditorToolbar,
};
use similar::{ChangeTag, TextDiff};
use task::{ResolvedTask, TaskTemplate, TaskVariables};
@@ -122,7 +123,7 @@ use multi_buffer::{
use ordered_float::OrderedFloat;
use parking_lot::{Mutex, RwLock};
use project::{
lsp_store::FormatTrigger,
lsp_store::{FormatTarget, FormatTrigger},
project_settings::{GitGutterSetting, ProjectSettings},
CodeAction, Completion, CompletionIntent, DocumentHighlight, InlayHint, Item, Location,
LocationLink, Project, ProjectPath, ProjectTransaction, TaskSourceKind,
@@ -161,16 +162,16 @@ use ui::{
};
use util::{defer, maybe, post_inc, RangeExt, ResultExt, TryFutureExt};
use workspace::item::{ItemHandle, PreviewTabsSettings};
use workspace::notifications::{DetachAndPromptErr, NotificationId};
use workspace::notifications::{DetachAndPromptErr, NotificationId, NotifyTaskExt};
use workspace::{
searchable::SearchEvent, ItemNavHistory, SplitDirection, ViewId, Workspace, WorkspaceId,
};
use workspace::{OpenInTerminal, OpenTerminal, TabBarSettings, Toast};
use workspace::{Item as WorkspaceItem, OpenInTerminal, OpenTerminal, TabBarSettings, Toast};
use crate::hover_links::find_url;
use crate::signature_help::{SignatureHelpHiddenBy, SignatureHelpState};
pub const FILE_HEADER_HEIGHT: u32 = 1;
pub const FILE_HEADER_HEIGHT: u32 = 2;
pub const MULTI_BUFFER_EXCERPT_HEADER_HEIGHT: u32 = 1;
pub const MULTI_BUFFER_EXCERPT_FOOTER_HEIGHT: u32 = 1;
pub const DEFAULT_MULTIBUFFER_CONTEXT: u32 = 2;
@@ -427,8 +428,7 @@ impl Default for EditorStyle {
}
pub fn make_inlay_hints_style(cx: &WindowContext) -> HighlightStyle {
let show_background = all_language_settings(None, cx)
.language(None)
let show_background = language_settings::language_settings(None, None, cx)
.inlay_hints
.show_background;
@@ -639,7 +639,6 @@ pub struct Editor {
tasks: BTreeMap<(BufferId, BufferRow), RunnableTasks>,
tasks_update_task: Option<Task<()>>,
previous_search_ranges: Option<Arc<[Range<Anchor>]>>,
file_header_size: u32,
breadcrumb_header: Option<String>,
focused_block: Option<FocusedBlock>,
next_scroll_position: NextScrollCursorCenterTopBottom,
@@ -1845,7 +1844,6 @@ impl Editor {
}),
merge_adjacent: true,
};
let file_header_size = if show_excerpt_controls { 3 } else { 2 };
let display_map = cx.new_model(|cx| {
DisplayMap::new(
buffer.clone(),
@@ -1853,7 +1851,7 @@ impl Editor {
font_size,
None,
show_excerpt_controls,
file_header_size,
FILE_HEADER_HEIGHT,
MULTI_BUFFER_EXCERPT_HEADER_HEIGHT,
MULTI_BUFFER_EXCERPT_FOOTER_HEIGHT,
fold_placeholder,
@@ -2037,7 +2035,6 @@ impl Editor {
.restore_unsaved_buffers,
blame: None,
blame_subscription: None,
file_header_size,
tasks: Default::default(),
_subscriptions: vec![
cx.observe(&buffer, Self::on_buffer_changed),
@@ -4250,7 +4247,10 @@ impl Editor {
.text_anchor_for_position(position, cx)?;
let settings = language_settings::language_settings(
buffer.read(cx).language_at(buffer_position).as_ref(),
buffer
.read(cx)
.language_at(buffer_position)
.map(|l| l.name()),
buffer.read(cx).file(),
cx,
);
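
Several call sites in this diff change shape the same way: `language_settings` now takes the language *name* rather than a reference to the `Language` itself. A hedged sketch of the new call shape, with mocked types standing in for the real ones in the `language` crate:

```rust
// Mocked stand-ins; the real types live in the language crate.
#[derive(Clone, Debug, PartialEq)]
struct LanguageName(&'static str);

struct Language {
    name: LanguageName,
}

impl Language {
    fn name(&self) -> LanguageName {
        self.name.clone()
    }
}

// New shape: the lookup is keyed by an optional name, so callers that only
// know a name (and never materialized a Language) can use it too.
fn language_settings(name: Option<LanguageName>) -> &'static str {
    match name {
        Some(LanguageName("Rust")) => "per-language overrides",
        _ => "defaults",
    }
}

fn main() {
    let lang = Language { name: LanguageName("Rust") };
    // Callers now write `language.map(|l| l.name())` instead of passing the handle:
    assert_eq!(language_settings(Some(lang.name())), "per-language overrides");
}
```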
@@ -6241,6 +6241,13 @@ impl Editor {
}
}
pub fn reload_file(&mut self, _: &ReloadFile, cx: &mut ViewContext<Self>) {
let Some(project) = self.project.clone() else {
return;
};
self.reload(project, cx).detach_and_notify_err(cx);
}
pub fn revert_selected_hunks(&mut self, _: &RevertSelectedHunks, cx: &mut ViewContext<Self>) {
let revert_changes = self.gather_revert_changes(&self.selections.disjoint_anchors(), cx);
if !revert_changes.is_empty() {
@@ -6277,11 +6284,9 @@ impl Editor {
let project_path = buffer.read(cx).project_path(cx)?;
let project = self.project.as_ref()?.read(cx);
let entry = project.entry_for_path(&project_path, cx)?;
let abs_path = project.absolute_path(&project_path, cx)?;
let parent = if entry.is_symlink {
abs_path.canonicalize().ok()?
} else {
abs_path
let parent = match &entry.canonical_path {
Some(canonical_path) => canonical_path.to_path_buf(),
None => project.absolute_path(&project_path, cx)?,
}
.parent()?
.to_path_buf();
@@ -9886,21 +9891,19 @@ impl Editor {
&self,
lsp_location: lsp::Location,
server_id: LanguageServerId,
cx: &mut ViewContext<Editor>,
cx: &mut ViewContext<Self>,
) -> Task<anyhow::Result<Option<Location>>> {
let Some(project) = self.project.clone() else {
return Task::Ready(Some(Ok(None)));
};
cx.spawn(move |editor, mut cx| async move {
let location_task = editor.update(&mut cx, |editor, cx| {
let location_task = editor.update(&mut cx, |_, cx| {
project.update(cx, |project, cx| {
let language_server_name =
editor.buffer.read(cx).as_singleton().and_then(|buffer| {
project
.language_server_for_buffer(buffer.read(cx), server_id, cx)
.map(|(lsp_adapter, _)| lsp_adapter.name.clone())
});
let language_server_name = project
.language_server_statuses(cx)
.find(|(id, _)| server_id == *id)
.map(|(_, status)| LanguageServerName::from(status.name.as_str()));
language_server_name.map(|language_server_name| {
project.open_local_buffer_via_lsp(
lsp_location.uri.clone(),
@@ -10379,13 +10382,39 @@ impl Editor {
None => return None,
};
Some(self.perform_format(project, FormatTrigger::Manual, cx))
Some(self.perform_format(project, FormatTrigger::Manual, FormatTarget::Buffer, cx))
}
fn format_selections(
&mut self,
_: &FormatSelections,
cx: &mut ViewContext<Self>,
) -> Option<Task<Result<()>>> {
let project = match &self.project {
Some(project) => project.clone(),
None => return None,
};
let selections = self
.selections
.all_adjusted(cx)
.into_iter()
.filter(|s| !s.is_empty())
.collect_vec();
Some(self.perform_format(
project,
FormatTrigger::Manual,
FormatTarget::Ranges(selections),
cx,
))
}
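
A hedged sketch of the two `FormatTarget` shapes used above. The real enum lives in `project::lsp_store` and carries editor selections rather than the plain ranges mocked here.

```rust
use std::ops::Range;

// Illustrative mock only; field types are simplified.
#[derive(Debug)]
enum FormatTarget {
    // Whole-buffer formatting: the Format action and format-on-save.
    Buffer,
    // Range formatting: the new FormatSelections action, which collects the
    // non-empty selections and hands them to the formatter.
    Ranges(Vec<Range<usize>>),
}

fn describe(target: &FormatTarget) -> String {
    match target {
        FormatTarget::Buffer => "format entire buffer".into(),
        FormatTarget::Ranges(ranges) => format!("format {} selected range(s)", ranges.len()),
    }
}

fn main() {
    println!("{}", describe(&FormatTarget::Buffer));
    println!("{}", describe(&FormatTarget::Ranges(vec![0..10, 42..58])));
}
```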
fn perform_format(
&mut self,
project: Model<Project>,
trigger: FormatTrigger,
target: FormatTarget,
cx: &mut ViewContext<Self>,
) -> Task<Result<()>> {
let buffer = self.buffer().clone();
@@ -10395,7 +10424,9 @@ impl Editor {
}
let mut timeout = cx.background_executor().timer(FORMAT_TIMEOUT).fuse();
let format = project.update(cx, |project, cx| project.format(buffers, true, trigger, cx));
let format = project.update(cx, |project, cx| {
project.format(buffers, true, trigger, target, cx)
});
cx.spawn(|_, mut cx| async move {
let transaction = futures::select_biased! {
@@ -11453,11 +11484,8 @@ impl Editor {
snapshot.line_len(buffer_row) == 0
}
fn get_permalink_to_line(&mut self, cx: &mut ViewContext<Self>) -> Result<url::Url> {
let (path, selection, repo) = maybe!({
let project_handle = self.project.as_ref()?.clone();
let project = project_handle.read(cx);
fn get_permalink_to_line(&mut self, cx: &mut ViewContext<Self>) -> Task<Result<url::Url>> {
let buffer_and_selection = maybe!({
let selection = self.selections.newest::<Point>(cx);
let selection_range = selection.range();
@@ -11481,64 +11509,58 @@ impl Editor {
(buffer.clone(), selection)
};
let path = buffer
.read(cx)
.file()?
.as_local()?
.path()
.to_str()?
.to_string();
let repo = project.get_repo(&buffer.read(cx).project_path(cx)?, cx)?;
Some((path, selection, repo))
Some((buffer, selection))
});
let Some((buffer, selection)) = buffer_and_selection else {
return Task::ready(Err(anyhow!("failed to determine buffer and selection")));
};
let Some(project) = self.project.as_ref() else {
return Task::ready(Err(anyhow!("editor does not have project")));
};
project.update(cx, |project, cx| {
project.get_permalink_to_line(&buffer, selection, cx)
})
.ok_or_else(|| anyhow!("unable to open git repository"))?;
const REMOTE_NAME: &str = "origin";
let origin_url = repo
.remote_url(REMOTE_NAME)
.ok_or_else(|| anyhow!("remote \"{REMOTE_NAME}\" not found"))?;
let sha = repo
.head_sha()
.ok_or_else(|| anyhow!("failed to read HEAD SHA"))?;
let (provider, remote) =
parse_git_remote_url(GitHostingProviderRegistry::default_global(cx), &origin_url)
.ok_or_else(|| anyhow!("failed to parse Git remote URL"))?;
Ok(provider.build_permalink(
remote,
BuildPermalinkParams {
sha: &sha,
path: &path,
selection: Some(selection),
},
))
}
pub fn copy_permalink_to_line(&mut self, _: &CopyPermalinkToLine, cx: &mut ViewContext<Self>) {
let permalink = self.get_permalink_to_line(cx);
let permalink_task = self.get_permalink_to_line(cx);
let workspace = self.workspace();
match permalink {
Ok(permalink) => {
cx.write_to_clipboard(ClipboardItem::new_string(permalink.to_string()));
}
Err(err) => {
let message = format!("Failed to copy permalink: {err}");
Err::<(), anyhow::Error>(err).log_err();
if let Some(workspace) = self.workspace() {
workspace.update(cx, |workspace, cx| {
struct CopyPermalinkToLine;
workspace.show_toast(
Toast::new(NotificationId::unique::<CopyPermalinkToLine>(), message),
cx,
)
cx.spawn(|_, mut cx| async move {
match permalink_task.await {
Ok(permalink) => {
cx.update(|cx| {
cx.write_to_clipboard(ClipboardItem::new_string(permalink.to_string()));
})
.ok();
}
Err(err) => {
let message = format!("Failed to copy permalink: {err}");
Err::<(), anyhow::Error>(err).log_err();
if let Some(workspace) = workspace {
workspace
.update(&mut cx, |workspace, cx| {
struct CopyPermalinkToLine;
workspace.show_toast(
Toast::new(
NotificationId::unique::<CopyPermalinkToLine>(),
message,
),
cx,
)
})
.ok();
}
}
}
}
})
.detach();
}
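
The same pattern repeats for `open_permalink_to_line` below: build the permalink `Task` synchronously while `self` is still available, then spawn, await it, and report failures as a toast. A minimal sketch using plain `futures` instead of gpui's spawn machinery:

```rust
use futures::executor::block_on;

// Stand-in for project.get_permalink_to_line(...), which now returns a task
// instead of computing the URL synchronously.
async fn get_permalink_to_line() -> Result<String, String> {
    Ok("https://example.com/zed/blob/abc123/src/main.rs#L42".into())
}

fn copy_permalink_to_line() {
    let permalink_task = get_permalink_to_line(); // kick off the work first
    block_on(async move {
        match permalink_task.await {
            Ok(permalink) => println!("copied: {permalink}"),
            // In the editor this surfaces as a workspace toast.
            Err(err) => eprintln!("Failed to copy permalink: {err}"),
        }
    });
}

fn main() {
    copy_permalink_to_line();
}
```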
pub fn copy_file_location(&mut self, _: &CopyFileLocation, cx: &mut ViewContext<Self>) {
@@ -11551,29 +11573,41 @@ impl Editor {
}
pub fn open_permalink_to_line(&mut self, _: &OpenPermalinkToLine, cx: &mut ViewContext<Self>) {
let permalink = self.get_permalink_to_line(cx);
let permalink_task = self.get_permalink_to_line(cx);
let workspace = self.workspace();
match permalink {
Ok(permalink) => {
cx.open_url(permalink.as_ref());
}
Err(err) => {
let message = format!("Failed to open permalink: {err}");
Err::<(), anyhow::Error>(err).log_err();
if let Some(workspace) = self.workspace() {
workspace.update(cx, |workspace, cx| {
struct OpenPermalinkToLine;
workspace.show_toast(
Toast::new(NotificationId::unique::<OpenPermalinkToLine>(), message),
cx,
)
cx.spawn(|_, mut cx| async move {
match permalink_task.await {
Ok(permalink) => {
cx.update(|cx| {
cx.open_url(permalink.as_ref());
})
.ok();
}
Err(err) => {
let message = format!("Failed to open permalink: {err}");
Err::<(), anyhow::Error>(err).log_err();
if let Some(workspace) = workspace {
workspace
.update(&mut cx, |workspace, cx| {
struct OpenPermalinkToLine;
workspace.show_toast(
Toast::new(
NotificationId::unique::<OpenPermalinkToLine>(),
message,
),
cx,
)
})
.ok();
}
}
}
}
})
.detach();
}
/// Adds a row highlight for the given range. If a row has multiple highlights, the
@@ -12326,10 +12360,15 @@ impl Editor {
let proposed_changes_buffers = new_selections_by_buffer
.into_iter()
.map(|(buffer, ranges)| ProposedChangesBuffer { buffer, ranges })
.map(|(buffer, ranges)| ProposedChangeLocation { buffer, ranges })
.collect::<Vec<_>>();
let proposed_changes_editor = cx.new_view(|cx| {
ProposedChangesEditor::new(proposed_changes_buffers, self.project.clone(), cx)
ProposedChangesEditor::new(
"Proposed changes",
proposed_changes_buffers,
self.project.clone(),
cx,
)
});
cx.window_context().defer(move |cx| {
@@ -12766,7 +12805,7 @@ impl Editor {
}
pub fn file_header_size(&self) -> u32 {
self.file_header_size
FILE_HEADER_HEIGHT
}
pub fn revert(
@@ -13087,18 +13126,11 @@ fn snippet_completions(
return vec![];
}
let snapshot = buffer.read(cx).text_snapshot();
let chunks = snapshot.reversed_chunks_in_range(text::Anchor::MIN..buffer_position);
let mut lines = chunks.lines();
let Some(line_at) = lines.next().filter(|line| !line.is_empty()) else {
return vec![];
};
let chars = snapshot.reversed_chars_for_range(text::Anchor::MIN..buffer_position);
let scope = language.map(|language| language.default_scope());
let classifier = CharClassifier::new(scope).for_completion(true);
let mut last_word = line_at
.chars()
.rev()
let mut last_word = chars
.take_while(|c| classifier.is_word(*c))
.collect::<String>();
last_word = last_word.chars().rev().collect();
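
The simplification above drops the intermediate line iterator: instead of materializing the last line, it walks characters in reverse straight from the cursor. A standalone sketch, with `is_alphanumeric`/`_` standing in for Zed's `CharClassifier` and `cursor` assumed to be a byte offset on a char boundary:

```rust
// Walk characters in reverse from the cursor, keep them while they are word
// characters, then reverse the collected string back into reading order.
fn last_word_before_cursor(text: &str, cursor: usize) -> String {
    let reversed: String = text[..cursor]
        .chars()
        .rev()
        .take_while(|c| c.is_alphanumeric() || *c == '_')
        .collect();
    reversed.chars().rev().collect()
}

fn main() {
    assert_eq!(last_word_before_cursor("let foo_bar", 11), "foo_bar");
    // A non-word character right before the cursor yields an empty prefix.
    assert_eq!(last_word_before_cursor("foo ", 4), "");
}
```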
@@ -13344,11 +13376,8 @@ fn inlay_hint_settings(
cx: &mut ViewContext<'_, Editor>,
) -> InlayHintSettings {
let file = snapshot.file_at(location);
let language = snapshot.language_at(location);
let settings = all_language_settings(file, cx);
settings
.language(language.map(|l| l.name()).as_ref())
.inlay_hints
let language = snapshot.language_at(location).map(|l| l.name());
language_settings(language, file, cx).inlay_hints
}
fn consume_contiguous_rows(
@@ -13631,6 +13660,7 @@ pub enum EditorEvent {
TransactionBegun {
transaction_id: clock::Lamport,
},
Reloaded,
CursorShapeChanged,
}
@@ -14084,7 +14114,7 @@ pub fn diagnostic_block_renderer(
let multi_line_diagnostic = diagnostic.message.contains('\n');
let buttons = |diagnostic: &Diagnostic, block_id: BlockId| {
let buttons = |diagnostic: &Diagnostic| {
if multi_line_diagnostic {
v_flex()
} else {
@@ -14092,7 +14122,7 @@ pub fn diagnostic_block_renderer(
}
.when(allow_closing, |div| {
div.children(diagnostic.is_primary.then(|| {
IconButton::new(("close-block", EntityId::from(block_id)), IconName::XCircle)
IconButton::new("close-block", IconName::XCircle)
.icon_color(Color::Muted)
.size(ButtonSize::Compact)
.style(ButtonStyle::Transparent)
@@ -14102,7 +14132,7 @@ pub fn diagnostic_block_renderer(
}))
})
.child(
IconButton::new(("copy-block", EntityId::from(block_id)), IconName::Copy)
IconButton::new("copy-block", IconName::Copy)
.icon_color(Color::Muted)
.size(ButtonSize::Compact)
.style(ButtonStyle::Transparent)
@@ -14117,7 +14147,7 @@ pub fn diagnostic_block_renderer(
)
};
let icon_size = buttons(&diagnostic, cx.block_id)
let icon_size = buttons(&diagnostic)
.into_any_element()
.layout_as_root(AvailableSpace::min_size(), cx);
@@ -14134,7 +14164,7 @@ pub fn diagnostic_block_renderer(
.w(cx.anchor_x - cx.gutter_dimensions.width - icon_size.width)
.flex_shrink(),
)
.child(buttons(&diagnostic, cx.block_id))
.child(buttons(&diagnostic))
.child(div().flex().flex_shrink_0().child(
StyledText::new(text_without_backticks.clone()).with_highlights(
&text_style,


@@ -7076,7 +7076,12 @@ async fn test_document_format_manual_trigger(cx: &mut gpui::TestAppContext) {
let format = editor
.update(cx, |editor, cx| {
editor.perform_format(project.clone(), FormatTrigger::Manual, cx)
editor.perform_format(
project.clone(),
FormatTrigger::Manual,
FormatTarget::Buffer,
cx,
)
})
.unwrap();
fake_server
@@ -7112,7 +7117,7 @@ async fn test_document_format_manual_trigger(cx: &mut gpui::TestAppContext) {
});
let format = editor
.update(cx, |editor, cx| {
editor.perform_format(project, FormatTrigger::Manual, cx)
editor.perform_format(project, FormatTrigger::Manual, FormatTarget::Buffer, cx)
})
.unwrap();
cx.executor().advance_clock(super::FORMAT_TIMEOUT);
@@ -10309,7 +10314,12 @@ async fn test_document_format_with_prettier(cx: &mut gpui::TestAppContext) {
editor
.update(cx, |editor, cx| {
editor.perform_format(project.clone(), FormatTrigger::Manual, cx)
editor.perform_format(
project.clone(),
FormatTrigger::Manual,
FormatTarget::Buffer,
cx,
)
})
.unwrap()
.await;
@@ -10323,7 +10333,12 @@ async fn test_document_format_with_prettier(cx: &mut gpui::TestAppContext) {
settings.defaults.formatter = Some(language_settings::SelectedFormatter::Auto)
});
let format = editor.update(cx, |editor, cx| {
editor.perform_format(project.clone(), FormatTrigger::Manual, cx)
editor.perform_format(
project.clone(),
FormatTrigger::Manual,
FormatTarget::Buffer,
cx,
)
});
format.await.unwrap();
assert_eq!(


@@ -21,7 +21,8 @@ use crate::{
EditorSnapshot, EditorStyle, ExpandExcerpts, FocusedBlock, GutterDimensions, HalfPageDown,
HalfPageUp, HandleInput, HoveredCursor, HoveredHunk, LineDown, LineUp, OpenExcerpts, PageDown,
PageUp, Point, RowExt, RowRangeExt, SelectPhase, Selection, SoftWrap, ToPoint,
CURSORS_VISIBLE_FOR, GIT_BLAME_MAX_AUTHOR_CHARS_DISPLAYED, MAX_LINE_LEN,
CURSORS_VISIBLE_FOR, FILE_HEADER_HEIGHT, GIT_BLAME_MAX_AUTHOR_CHARS_DISPLAYED, MAX_LINE_LEN,
MULTI_BUFFER_EXCERPT_HEADER_HEIGHT,
};
use client::ParticipantIndex;
use collections::{BTreeMap, HashMap};
@@ -31,7 +32,7 @@ use gpui::{
anchored, deferred, div, fill, outline, point, px, quad, relative, size, svg,
transparent_black, Action, AnchorCorner, AnyElement, AvailableSpace, Bounds, ClipboardItem,
ContentMask, Corners, CursorStyle, DispatchPhase, Edges, Element, ElementInputHandler, Entity,
EntityId, FontId, GlobalElementId, Hitbox, Hsla, InteractiveElement, IntoElement, Length,
FontId, GlobalElementId, Hitbox, Hsla, InteractiveElement, IntoElement, Length,
ModifiersChangedEvent, MouseButton, MouseDownEvent, MouseMoveEvent, MouseUpEvent, PaintQuad,
ParentElement, Pixels, ScrollDelta, ScrollWheelEvent, ShapedLine, SharedString, Size,
StatefulInteractiveElement, Style, Styled, TextRun, TextStyle, TextStyleRefinement, View,
@@ -46,7 +47,7 @@ use language::{
ChunkRendererContext,
};
use lsp::DiagnosticSeverity;
use multi_buffer::{Anchor, MultiBufferPoint, MultiBufferRow};
use multi_buffer::{Anchor, ExcerptId, ExpandExcerptDirection, MultiBufferPoint, MultiBufferRow};
use project::{
project_settings::{GitGutterSetting, ProjectSettings},
ProjectPath,
@@ -64,7 +65,7 @@ use std::{
sync::Arc,
};
use sum_tree::Bias;
use theme::{ActiveTheme, PlayerColor};
use theme::{ActiveTheme, Appearance, PlayerColor};
use ui::prelude::*;
use ui::{h_flex, ButtonLike, ButtonStyle, ContextMenu, Tooltip};
use util::RangeExt;
@@ -376,6 +377,13 @@ impl EditorElement {
cx.propagate();
}
});
register_action(view, cx, |editor, action, cx| {
if let Some(task) = editor.format_selections(action, cx) {
task.detach_and_log_err(cx);
} else {
cx.propagate();
}
});
register_action(view, cx, Editor::restart_language_server);
register_action(view, cx, Editor::cancel_language_server_work);
register_action(view, cx, Editor::show_character_palette);
@@ -437,7 +445,8 @@ impl EditorElement {
register_action(view, cx, Editor::revert_file);
register_action(view, cx, Editor::revert_selected_hunks);
register_action(view, cx, Editor::apply_selected_diff_hunks);
register_action(view, cx, Editor::open_active_item_in_terminal)
register_action(view, cx, Editor::open_active_item_in_terminal);
register_action(view, cx, Editor::reload_file)
}
fn register_key_listeners(&self, cx: &mut WindowContext, layout: &EditorLayout) {
@@ -1015,8 +1024,20 @@ impl EditorElement {
block_width = em_width;
}
let block_text = if let CursorShape::Block = selection.cursor_shape {
snapshot.display_chars_at(cursor_position).next().and_then(
|(character, _)| {
snapshot
.display_chars_at(cursor_position)
.next()
.or_else(|| {
if cursor_column == 0 {
snapshot
.placeholder_text()
.and_then(|s| s.chars().next())
.map(|c| (c, cursor_position))
} else {
None
}
})
.and_then(|(character, _)| {
let text = if character == '\n' {
SharedString::from(" ")
} else {
@@ -1031,6 +1052,22 @@ impl EditorElement {
})
.unwrap_or(self.style.text.font());
// Invert the text color for the block cursor. Ensure that the text
// color is opaque enough to be visible against the background color.
//
// 0.75 is an arbitrary threshold to determine if the background color is
// opaque enough to use as a text color.
//
// TODO: In the future we should ensure themes have a `text_inverse` color.
let color = if cx.theme().colors().editor_background.a < 0.75 {
match cx.theme().appearance {
Appearance::Dark => Hsla::black(),
Appearance::Light => Hsla::white(),
}
} else {
cx.theme().colors().editor_background
};
cx.text_system()
.shape_line(
text,
@@ -1038,15 +1075,14 @@ impl EditorElement {
&[TextRun {
len,
font,
color: self.style.background,
color,
background_color: None,
strikethrough: None,
underline: None,
}],
)
.log_err()
},
)
})
} else {
None
};
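
The color fallback above, as a standalone sketch. `Hsla` is mocked here; in Zed the background color and appearance come from the active theme.

```rust
#[derive(Clone, Copy)]
enum Appearance {
    Dark,
    Light,
}

#[derive(Clone, Copy, Debug, PartialEq)]
struct Hsla {
    h: f32,
    s: f32,
    l: f32,
    a: f32,
}

const BLACK: Hsla = Hsla { h: 0.0, s: 0.0, l: 0.0, a: 1.0 };
const WHITE: Hsla = Hsla { h: 0.0, s: 0.0, l: 1.0, a: 1.0 };

fn block_cursor_text_color(background: Hsla, appearance: Appearance) -> Hsla {
    // 0.75 is the arbitrary opacity threshold chosen in the change above.
    if background.a < 0.75 {
        // Background too transparent to draw readable text in; fall back to
        // hard black/white picked against the theme appearance.
        match appearance {
            Appearance::Dark => BLACK,
            Appearance::Light => WHITE,
        }
    } else {
        background
    }
}

fn main() {
    let translucent = Hsla { h: 0.6, s: 0.5, l: 0.2, a: 0.5 };
    assert_eq!(block_cursor_text_color(translucent, Appearance::Dark), BLACK);
}
```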
@@ -1597,7 +1633,7 @@ impl EditorElement {
let mut block_offset = 0;
let mut found_excerpt_header = false;
for (_, block) in snapshot.blocks_in_range(prev_line..row_range.start) {
if matches!(block, Block::ExcerptHeader { .. }) {
if matches!(block, Block::ExcerptBoundary { .. }) {
found_excerpt_header = true;
break;
}
@@ -1614,7 +1650,7 @@ impl EditorElement {
let mut block_height = 0;
let mut found_excerpt_header = false;
for (_, block) in snapshot.blocks_in_range(row_range.end..cons_line) {
if matches!(block, Block::ExcerptHeader { .. }) {
if matches!(block, Block::ExcerptBoundary { .. }) {
found_excerpt_header = true;
}
block_height += block.height();
@@ -2065,23 +2101,14 @@ impl EditorElement {
.into_any_element()
}
Block::ExcerptHeader {
buffer,
range,
Block::ExcerptBoundary {
prev_excerpt,
next_excerpt,
show_excerpt_controls,
starts_new_buffer,
height,
id,
show_excerpt_controls,
..
} => {
let include_root = self
.editor
.read(cx)
.project
.as_ref()
.map(|project| project.read(cx).visible_worktrees(cx).count() > 1)
.unwrap_or_default();
#[derive(Clone)]
struct JumpData {
position: Point,
@@ -2090,233 +2117,227 @@ impl EditorElement {
line_offset_from_top: u32,
}
let jump_data = project::File::from_dyn(buffer.file()).map(|file| {
let jump_path = ProjectPath {
worktree_id: file.worktree_id(cx),
path: file.path.clone(),
};
let jump_anchor = range
.primary
.as_ref()
.map_or(range.context.start, |primary| primary.start);
let excerpt_start = range.context.start;
let jump_position = language::ToPoint::to_point(&jump_anchor, buffer);
let offset_from_excerpt_start = if jump_anchor == excerpt_start {
0
} else {
let excerpt_start_row =
language::ToPoint::to_point(&jump_anchor, buffer).row;
jump_position.row - excerpt_start_row
};
let line_offset_from_top =
block_row_start.0 + *height + offset_from_excerpt_start
- snapshot
.scroll_anchor
.scroll_position(&snapshot.display_snapshot)
.y as u32;
JumpData {
position: jump_position,
anchor: jump_anchor,
path: jump_path,
line_offset_from_top,
}
});
let icon_offset = gutter_dimensions.width
- (gutter_dimensions.left_padding + gutter_dimensions.margin);
let element = if *starts_new_buffer {
let path = buffer.resolve_file_path(cx, include_root);
let mut filename = None;
let mut parent_path = None;
// Can't use .and_then() because `.file_name()` and `.parent()` return references :(
if let Some(path) = path {
filename = path.file_name().map(|f| f.to_string_lossy().to_string());
parent_path = path
.parent()
.map(|p| SharedString::from(p.to_string_lossy().to_string() + "/"));
}
let header_padding = px(6.0);
let header_padding = px(6.0);
let mut result = v_flex().id(block_id).w_full();
v_flex()
.id(("path excerpt header", EntityId::from(block_id)))
.w_full()
.px(header_padding)
.pt(header_padding)
.child(
if let Some(prev_excerpt) = prev_excerpt {
if *show_excerpt_controls {
result = result.child(
h_flex()
.flex_basis(Length::Definite(DefiniteLength::Fraction(0.667)))
.id("path header block")
.h(2. * cx.line_height())
.px(gpui::px(12.))
.rounded_md()
.shadow_md()
.border_1()
.border_color(cx.theme().colors().border)
.bg(cx.theme().colors().editor_subheader_background)
.justify_between()
.hover(|style| style.bg(cx.theme().colors().element_hover))
.child(
h_flex().gap_3().child(
h_flex()
.gap_2()
.child(
filename
.map(SharedString::from)
.unwrap_or_else(|| "untitled".into()),
)
.when_some(parent_path, |then, path| {
then.child(
div()
.child(path)
.text_color(cx.theme().colors().text_muted),
)
}),
),
)
.when_some(jump_data.clone(), |el, jump_data| {
el.child(Icon::new(IconName::ArrowUpRight))
.cursor_pointer()
.tooltip(|cx| {
Tooltip::for_action("Jump to File", &OpenExcerpts, cx)
})
.on_mouse_down(MouseButton::Left, |_, cx| {
cx.stop_propagation()
})
.on_click(cx.listener_for(&self.editor, {
move |editor, _, cx| {
editor.jump(
jump_data.path.clone(),
jump_data.position,
jump_data.anchor,
jump_data.line_offset_from_top,
cx,
);
}
}))
}),
)
.children(show_excerpt_controls.then(|| {
h_flex()
.flex_basis(Length::Definite(DefiniteLength::Fraction(0.333)))
.h(1. * cx.line_height())
.pt_1()
.justify_end()
.w(icon_offset)
.h(MULTI_BUFFER_EXCERPT_HEADER_HEIGHT as f32 * cx.line_height())
.flex_none()
.w(icon_offset - header_padding)
.child(
ButtonLike::new("expand-icon")
.style(ButtonStyle::Transparent)
.child(
svg()
.path(IconName::ArrowUpFromLine.path())
.size(IconSize::XSmall.rems())
.text_color(cx.theme().colors().editor_line_number)
.group("")
.hover(|style| {
style.text_color(
cx.theme()
.colors()
.editor_active_line_number,
)
}),
)
.on_click(cx.listener_for(&self.editor, {
let id = *id;
move |editor, _, cx| {
editor.expand_excerpt(
id,
multi_buffer::ExpandExcerptDirection::Up,
cx,
);
}
}))
.tooltip({
move |cx| {
Tooltip::for_action(
"Expand Excerpt",
&ExpandExcerpts { lines: 0 },
cx,
)
}
}),
)
}))
} else {
v_flex()
.id(("excerpt header", EntityId::from(block_id)))
.w_full()
.h(snapshot.excerpt_header_height() as f32 * cx.line_height())
.child(
.justify_end()
.child(self.render_expand_excerpt_button(
prev_excerpt.id,
ExpandExcerptDirection::Down,
IconName::ArrowDownFromLine,
cx,
)),
);
}
}
if let Some(next_excerpt) = next_excerpt {
let buffer = &next_excerpt.buffer;
let range = &next_excerpt.range;
let jump_data = project::File::from_dyn(buffer.file()).map(|file| {
let jump_path = ProjectPath {
worktree_id: file.worktree_id(cx),
path: file.path.clone(),
};
let jump_anchor = range
.primary
.as_ref()
.map_or(range.context.start, |primary| primary.start);
let excerpt_start = range.context.start;
let jump_position = language::ToPoint::to_point(&jump_anchor, buffer);
let offset_from_excerpt_start = if jump_anchor == excerpt_start {
0
} else {
let excerpt_start_row =
language::ToPoint::to_point(&jump_anchor, buffer).row;
jump_position.row - excerpt_start_row
};
let line_offset_from_top =
block_row_start.0 + *height + offset_from_excerpt_start
- snapshot
.scroll_anchor
.scroll_position(&snapshot.display_snapshot)
.y as u32;
JumpData {
position: jump_position,
anchor: jump_anchor,
path: jump_path,
line_offset_from_top,
}
});
if *starts_new_buffer {
let include_root = self
.editor
.read(cx)
.project
.as_ref()
.map(|project| project.read(cx).visible_worktrees(cx).count() > 1)
.unwrap_or_default();
let path = buffer.resolve_file_path(cx, include_root);
let filename = path
.as_ref()
.and_then(|path| Some(path.file_name()?.to_string_lossy().to_string()));
let parent_path = path.as_ref().and_then(|path| {
Some(path.parent()?.to_string_lossy().to_string() + "/")
});
result = result.child(
div()
.flex()
.v_flex()
.px(header_padding)
.pt(header_padding)
.w_full()
.h(FILE_HEADER_HEIGHT as f32 * cx.line_height())
.child(
h_flex()
.id("path header block")
.size_full()
.flex_basis(Length::Definite(DefiniteLength::Fraction(
0.667,
)))
.px(gpui::px(12.))
.rounded_md()
.shadow_md()
.border_1()
.border_color(cx.theme().colors().border)
.bg(cx.theme().colors().editor_subheader_background)
.justify_between()
.hover(|style| style.bg(cx.theme().colors().element_hover))
.child(
h_flex().gap_3().child(
h_flex()
.gap_2()
.child(
filename
.map(SharedString::from)
.unwrap_or_else(|| "untitled".into()),
)
.when_some(parent_path, |then, path| {
then.child(div().child(path).text_color(
cx.theme().colors().text_muted,
))
}),
),
)
.when_some(jump_data, |el, jump_data| {
el.child(Icon::new(IconName::ArrowUpRight))
.cursor_pointer()
.tooltip(|cx| {
Tooltip::for_action(
"Jump to File",
&OpenExcerpts,
cx,
)
})
.on_mouse_down(MouseButton::Left, |_, cx| {
cx.stop_propagation()
})
.on_click(cx.listener_for(&self.editor, {
move |editor, _, cx| {
editor.jump(
jump_data.path.clone(),
jump_data.position,
jump_data.anchor,
jump_data.line_offset_from_top,
cx,
);
}
}))
}),
),
);
if *show_excerpt_controls {
result = result.child(
h_flex()
.w(icon_offset)
.h(MULTI_BUFFER_EXCERPT_HEADER_HEIGHT as f32 * cx.line_height())
.flex_none()
.justify_end()
.child(self.render_expand_excerpt_button(
next_excerpt.id,
ExpandExcerptDirection::Up,
IconName::ArrowUpFromLine,
cx,
)),
);
}
} else {
result = result.child(
h_flex()
.id("excerpt header block")
.group("excerpt-jump-action")
.justify_start()
.id("jump to collapsed context")
.w(relative(1.0))
.h_full()
.w_full()
.h(MULTI_BUFFER_EXCERPT_HEADER_HEIGHT as f32 * cx.line_height())
.relative()
.child(
div()
.h_px()
.top(px(0.))
.absolute()
.w_full()
.h_px()
.bg(cx.theme().colors().border_variant)
.group_hover("excerpt-jump-action", |style| {
style.bg(cx.theme().colors().border)
}),
),
)
.child(
h_flex()
.justify_end()
.flex_none()
.w(icon_offset)
.h_full()
)
.cursor_pointer()
.when_some(jump_data.clone(), |this, jump_data| {
this.on_click(cx.listener_for(&self.editor, {
let path = jump_data.path.clone();
move |editor, _, cx| {
cx.stop_propagation();
editor.jump(
path.clone(),
jump_data.position,
jump_data.anchor,
jump_data.line_offset_from_top,
cx,
);
}
}))
.tooltip(move |cx| {
Tooltip::for_action(
format!(
"Jump to {}:L{}",
jump_data.path.path.display(),
jump_data.position.row + 1
),
&OpenExcerpts,
cx,
)
})
})
.child(
show_excerpt_controls
.then(|| {
ButtonLike::new("expand-icon")
.style(ButtonStyle::Transparent)
.child(
svg()
.path(IconName::ArrowUpFromLine.path())
.size(IconSize::XSmall.rems())
.text_color(
cx.theme().colors().editor_line_number,
)
.group("")
.hover(|style| {
style.text_color(
cx.theme()
.colors()
.editor_active_line_number,
)
}),
)
.on_click(cx.listener_for(&self.editor, {
let id = *id;
move |editor, _, cx| {
editor.expand_excerpt(
id,
multi_buffer::ExpandExcerptDirection::Up,
cx,
);
}
}))
.tooltip({
move |cx| {
Tooltip::for_action(
"Expand Excerpt",
&ExpandExcerpts { lines: 0 },
cx,
)
}
})
})
.unwrap_or_else(|| {
h_flex()
.w(icon_offset)
.h(MULTI_BUFFER_EXCERPT_HEADER_HEIGHT as f32
* cx.line_height())
.flex_none()
.justify_end()
.child(if *show_excerpt_controls {
self.render_expand_excerpt_button(
next_excerpt.id,
ExpandExcerptDirection::Up,
IconName::ArrowUpFromLine,
cx,
)
} else {
ButtonLike::new("jump-icon")
.style(ButtonStyle::Transparent)
.child(
@@ -2326,7 +2347,6 @@ impl EditorElement {
.text_color(
cx.theme().colors().border_variant,
)
.group("excerpt-jump-action")
.group_hover(
"excerpt-jump-action",
|style| {
@@ -2336,118 +2356,13 @@ impl EditorElement {
},
),
)
.when_some(jump_data.clone(), |this, jump_data| {
this.on_click(cx.listener_for(&self.editor, {
let path = jump_data.path.clone();
move |editor, _, cx| {
cx.stop_propagation();
editor.jump(
path.clone(),
jump_data.position,
jump_data.anchor,
jump_data.line_offset_from_top,
cx,
);
}
}))
.tooltip(move |cx| {
Tooltip::for_action(
format!(
"Jump to {}:L{}",
jump_data.path.path.display(),
jump_data.position.row + 1
),
&OpenExcerpts,
cx,
)
})
})
}),
),
)
.group("excerpt-jump-action")
.cursor_pointer()
.when_some(jump_data.clone(), |this, jump_data| {
this.on_click(cx.listener_for(&self.editor, {
let path = jump_data.path.clone();
move |editor, _, cx| {
cx.stop_propagation();
);
}
}
editor.jump(
path.clone(),
jump_data.position,
jump_data.anchor,
jump_data.line_offset_from_top,
cx,
);
}
}))
.tooltip(move |cx| {
Tooltip::for_action(
format!(
"Jump to {}:L{}",
jump_data.path.path.display(),
jump_data.position.row + 1
),
&OpenExcerpts,
cx,
)
})
})
};
element.into_any()
}
Block::ExcerptFooter { id, .. } => {
let element = v_flex()
.id(("excerpt footer", EntityId::from(block_id)))
.w_full()
.h(snapshot.excerpt_footer_height() as f32 * cx.line_height())
.child(
h_flex()
.justify_end()
.flex_none()
.w(gutter_dimensions.width
- (gutter_dimensions.left_padding + gutter_dimensions.margin))
.h_full()
.child(
ButtonLike::new("expand-icon")
.style(ButtonStyle::Transparent)
.child(
svg()
.path(IconName::ArrowDownFromLine.path())
.size(IconSize::XSmall.rems())
.text_color(cx.theme().colors().editor_line_number)
.group("")
.hover(|style| {
style.text_color(
cx.theme().colors().editor_active_line_number,
)
}),
)
.on_click(cx.listener_for(&self.editor, {
let id = *id;
move |editor, _, cx| {
editor.expand_excerpt(
id,
multi_buffer::ExpandExcerptDirection::Down,
cx,
);
}
}))
.tooltip({
move |cx| {
Tooltip::for_action(
"Expand Excerpt",
&ExpandExcerpts { lines: 0 },
cx,
)
}
}),
),
);
element.into_any()
result.into_any()
}
};
@@ -2474,6 +2389,33 @@ impl EditorElement {
(element, final_size)
}
fn render_expand_excerpt_button(
&self,
excerpt_id: ExcerptId,
direction: ExpandExcerptDirection,
icon: IconName,
cx: &mut WindowContext,
) -> ButtonLike {
ButtonLike::new("expand-icon")
.style(ButtonStyle::Transparent)
.child(
svg()
.path(icon.path())
.size(IconSize::XSmall.rems())
.text_color(cx.theme().colors().editor_line_number)
.group("")
.hover(|style| style.text_color(cx.theme().colors().editor_active_line_number)),
)
.on_click(cx.listener_for(&self.editor, {
move |editor, _, cx| {
editor.expand_excerpt(excerpt_id, direction, cx);
}
}))
.tooltip({
move |cx| Tooltip::for_action("Expand Excerpt", &ExpandExcerpts { lines: 0 }, cx)
})
}
#[allow(clippy::too_many_arguments)]
fn render_blocks(
&self,
@@ -3332,7 +3274,7 @@ impl EditorElement {
let end_row_in_current_excerpt = snapshot
.blocks_in_range(start_row..end_row)
.find_map(|(start_row, block)| {
if matches!(block, Block::ExcerptHeader { .. }) {
if matches!(block, Block::ExcerptBoundary { .. }) {
Some(start_row)
} else {
None


@@ -403,7 +403,10 @@ impl GitBlame {
if this.user_triggered {
log::error!("failed to get git blame data: {error:?}");
let notification = format!("{:#}", error).trim().to_string();
cx.emit(project::Event::Notification(notification));
cx.emit(project::Event::Toast {
notification_id: "git-blame".into(),
message: notification,
});
} else {
// If we weren't triggered by a user, we just log errors in the background, instead of sending
// notifications.
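
A hedged sketch of the event change, with a mocked `Event` standing in for `project::Event`. The stable `notification_id` is presumably what lets repeated blame failures update a single toast instead of stacking a new notification each time:

```rust
#[derive(Debug, PartialEq)]
enum Event {
    Toast {
        notification_id: String,
        message: String,
    },
}

fn blame_failed(error: &str) -> Event {
    Event::Toast {
        notification_id: "git-blame".into(), // stable id, reused across failures
        message: error.trim().to_string(),
    }
}

fn main() {
    let event = blame_failed("failed to get blame for \"file.txt\"");
    let expected = Event::Toast {
        notification_id: "git-blame".into(),
        message: "failed to get blame for \"file.txt\"".into(),
    };
    assert_eq!(event, expected);
}
```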
@@ -619,9 +622,11 @@ mod tests {
let event = project.next_event(cx).await;
assert_eq!(
event,
project::Event::Notification(
"Failed to blame \"file.txt\": failed to get blame for \"file.txt\"".to_string()
)
project::Event::Toast {
notification_id: "git-blame".into(),
message: "Failed to blame \"file.txt\": failed to get blame for \"file.txt\""
.to_string()
}
);
blame.update(cx, |blame, cx| {


@@ -525,7 +525,7 @@ async fn parse_blocks(
font_family: Some(buffer_font_family),
..Default::default()
},
rule_color: Color::Muted.color(cx),
rule_color: cx.theme().colors().border,
block_quote_border_color: Color::Muted.color(cx),
block_quote: TextStyleRefinement {
color: Some(Color::Muted.color(cx)),


@@ -39,9 +39,13 @@ impl Editor {
) -> Option<Vec<MultiBufferIndentGuide>> {
let show_indent_guides = self.should_show_indent_guides().unwrap_or_else(|| {
if let Some(buffer) = self.buffer().read(cx).as_singleton() {
language_settings(buffer.read(cx).language(), buffer.read(cx).file(), cx)
.indent_guides
.enabled
language_settings(
buffer.read(cx).language().map(|l| l.name()),
buffer.read(cx).file(),
cx,
)
.indent_guides
.enabled
} else {
true
}


@@ -27,6 +27,7 @@ use rpc::proto::{self, update_view, PeerId};
use settings::Settings;
use workspace::item::{Dedup, ItemSettings, SerializableItem, TabContentParams};
use project::lsp_store::FormatTarget;
use std::{
any::TypeId,
borrow::Cow,
@@ -722,7 +723,12 @@ impl Item for Editor {
cx.spawn(|this, mut cx| async move {
if format {
this.update(&mut cx, |editor, cx| {
editor.perform_format(project.clone(), FormatTrigger::Save, cx)
editor.perform_format(
project.clone(),
FormatTrigger::Save,
FormatTarget::Buffer,
cx,
)
})?
.await?;
}


@@ -1,5 +1,4 @@
use std::ops::Range;
use crate::actions::FormatSelections;
use crate::{
actions::Format, selections_collection::SelectionsCollection, Copy, CopyPermalinkToLine, Cut,
DisplayPoint, DisplaySnapshot, Editor, EditorMode, FindAllReferences, GoToDeclaration,
@@ -8,6 +7,8 @@ use crate::{
};
use gpui::prelude::FluentBuilder;
use gpui::{DismissEvent, Pixels, Point, Subscription, View, ViewContext};
use std::ops::Range;
use text::PointUtf16;
use workspace::OpenInTerminal;
#[derive(Debug)]
@@ -164,6 +165,12 @@ pub fn deploy_context_menu(
} else {
"Reveal in File Manager"
};
let has_selections = editor
.selections
.all::<PointUtf16>(cx)
.into_iter()
.any(|s| !s.is_empty());
ui::ContextMenu::build(cx, |menu, _cx| {
let builder = menu
.on_blur_subscription(Subscription::new(|| {}))
@@ -175,6 +182,9 @@ pub fn deploy_context_menu(
.separator()
.action("Rename Symbol", Box::new(Rename))
.action("Format Buffer", Box::new(Format))
.when(has_selections, |cx| {
cx.action("Format Selections", Box::new(FormatSelections))
})
.action(
"Code Actions",
Box::new(ToggleCodeActions {


@@ -952,7 +952,7 @@ mod tests {
px(14.0),
None,
true,
2,
0,
2,
0,
FoldPlaceholder::test(),


@@ -16,16 +16,24 @@ use workspace::{
pub struct ProposedChangesEditor {
editor: View<Editor>,
_subscriptions: Vec<Subscription>,
multibuffer: Model<MultiBuffer>,
title: SharedString,
buffer_entries: Vec<BufferEntry>,
_recalculate_diffs_task: Task<Option<()>>,
recalculate_diffs_tx: mpsc::UnboundedSender<RecalculateDiff>,
}
pub struct ProposedChangesBuffer<T> {
pub struct ProposedChangeLocation<T> {
pub buffer: Model<Buffer>,
pub ranges: Vec<Range<T>>,
}
struct BufferEntry {
base: Model<Buffer>,
branch: Model<Buffer>,
_subscription: Subscription,
}
pub struct ProposedChangesEditorToolbar {
current_editor: Option<View<ProposedChangesEditor>>,
}
@@ -43,32 +51,14 @@ struct BranchBufferSemanticsProvider(Rc<dyn SemanticsProvider>);
impl ProposedChangesEditor {
pub fn new<T: ToOffset>(
buffers: Vec<ProposedChangesBuffer<T>>,
title: impl Into<SharedString>,
locations: Vec<ProposedChangeLocation<T>>,
project: Option<Model<Project>>,
cx: &mut ViewContext<Self>,
) -> Self {
let mut subscriptions = Vec::new();
let multibuffer = cx.new_model(|_| MultiBuffer::new(Capability::ReadWrite));
for buffer in buffers {
let branch_buffer = buffer.buffer.update(cx, |buffer, cx| buffer.branch(cx));
subscriptions.push(cx.subscribe(&branch_buffer, Self::on_buffer_event));
multibuffer.update(cx, |multibuffer, cx| {
multibuffer.push_excerpts(
branch_buffer,
buffer.ranges.into_iter().map(|range| ExcerptRange {
context: range,
primary: None,
}),
cx,
);
});
}
let (recalculate_diffs_tx, mut recalculate_diffs_rx) = mpsc::unbounded();
Self {
let mut this = Self {
editor: cx.new_view(|cx| {
let mut editor = Editor::for_multibuffer(multibuffer.clone(), project, true, cx);
editor.set_expand_all_diff_hunks();
@@ -81,6 +71,9 @@ impl ProposedChangesEditor {
);
editor
}),
multibuffer,
title: title.into(),
buffer_entries: Vec::new(),
recalculate_diffs_tx,
_recalculate_diffs_task: cx.spawn(|_, mut cx| async move {
let mut buffers_to_diff = HashSet::default();
@@ -112,7 +105,100 @@ impl ProposedChangesEditor {
}
None
}),
_subscriptions: subscriptions,
};
this.reset_locations(locations, cx);
this
}
pub fn branch_buffer_for_base(&self, base_buffer: &Model<Buffer>) -> Option<Model<Buffer>> {
self.buffer_entries.iter().find_map(|entry| {
if &entry.base == base_buffer {
Some(entry.branch.clone())
} else {
None
}
})
}
pub fn set_title(&mut self, title: SharedString, cx: &mut ViewContext<Self>) {
self.title = title;
cx.notify();
}
pub fn reset_locations<T: ToOffset>(
&mut self,
locations: Vec<ProposedChangeLocation<T>>,
cx: &mut ViewContext<Self>,
) {
// Undo all branch changes
for entry in &self.buffer_entries {
let base_version = entry.base.read(cx).version();
entry.branch.update(cx, |buffer, cx| {
let undo_counts = buffer
.operations()
.iter()
.filter_map(|(timestamp, _)| {
if !base_version.observed(*timestamp) {
Some((*timestamp, u32::MAX))
} else {
None
}
})
.collect();
buffer.undo_operations(undo_counts, cx);
});
}
self.multibuffer.update(cx, |multibuffer, cx| {
multibuffer.clear(cx);
});
let mut buffer_entries = Vec::new();
for location in locations {
let branch_buffer;
if let Some(ix) = self
.buffer_entries
.iter()
.position(|entry| entry.base == location.buffer)
{
let entry = self.buffer_entries.remove(ix);
branch_buffer = entry.branch.clone();
buffer_entries.push(entry);
} else {
branch_buffer = location.buffer.update(cx, |buffer, cx| buffer.branch(cx));
buffer_entries.push(BufferEntry {
branch: branch_buffer.clone(),
base: location.buffer.clone(),
_subscription: cx.subscribe(&branch_buffer, Self::on_buffer_event),
});
}
self.multibuffer.update(cx, |multibuffer, cx| {
multibuffer.push_excerpts(
branch_buffer,
location.ranges.into_iter().map(|range| ExcerptRange {
context: range,
primary: None,
}),
cx,
);
});
}
self.buffer_entries = buffer_entries;
self.editor.update(cx, |editor, cx| {
editor.change_selections(None, cx, |selections| selections.refresh())
});
}
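
The branch-buffer reuse in `reset_locations` above, reduced to a standalone sketch in which strings stand in for buffer handles: keep the existing branch for any base buffer that reappears, fork a branch for new ones, and let entries whose base is gone fall away.

```rust
use std::collections::HashMap;

fn reset_locations(
    entries: &mut HashMap<&'static str, String>, // base -> branch
    locations: &[&'static str],
) {
    let mut next = HashMap::new();
    for &base in locations {
        let branch = entries
            .remove(base) // reuse a surviving branch buffer...
            .unwrap_or_else(|| format!("branch-of-{base}")); // ...or fork a new one
        next.insert(base, branch);
    }
    *entries = next; // anything left in the old map is dropped here
}

fn main() {
    let mut entries = HashMap::from([("a.rs", "branch-of-a.rs".to_string())]);
    reset_locations(&mut entries, &["a.rs", "b.rs"]);
    assert_eq!(entries["a.rs"], "branch-of-a.rs"); // reused, not re-forked
    assert_eq!(entries.len(), 2);
}
```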
pub fn recalculate_all_buffer_diffs(&self) {
for (ix, entry) in self.buffer_entries.iter().enumerate().rev() {
self.recalculate_diffs_tx
.unbounded_send(RecalculateDiff {
buffer: entry.branch.clone(),
debounce: ix > 0,
})
.ok();
}
}
@@ -162,11 +248,11 @@ impl Item for ProposedChangesEditor {
type Event = EditorEvent;
fn tab_icon(&self, _cx: &ui::WindowContext) -> Option<Icon> {
Some(Icon::new(IconName::Pencil))
Some(Icon::new(IconName::Diff))
}
fn tab_content_text(&self, _cx: &WindowContext) -> Option<SharedString> {
Some("Proposed changes".into())
Some(self.title.clone())
}
fn as_searchable(&self, _: &View<Self>) -> Option<Box<dyn SearchableItemHandle>> {


@@ -25,7 +25,6 @@ fs.workspace = true
git.workspace = true
gpui.workspace = true
http_client.workspace = true
isahc_http_client.workspace = true
language.workspace = true
languages.workspace = true
node_runtime.workspace = true
@@ -36,3 +35,4 @@ serde.workspace = true
serde_json.workspace = true
settings.workspace = true
smol.workspace = true
reqwest_client.workspace = true


@@ -12,6 +12,7 @@ use language::LanguageRegistry;
use node_runtime::NodeRuntime;
use open_ai::OpenAiEmbeddingModel;
use project::Project;
use reqwest_client::ReqwestClient;
use semantic_index::{
EmbeddingProvider, OpenAiEmbeddingProvider, ProjectIndex, SemanticDb, Status,
};
@@ -100,7 +101,7 @@ fn main() -> Result<()> {
gpui::App::headless().run(move |cx| {
let executor = cx.background_executor().clone();
let client = isahc_http_client::IsahcHttpClient::new(None, None);
let client = Arc::new(ReqwestClient::user_agent("Zed LLM evals").unwrap());
cx.set_http_client(client.clone());
match cli.command {
Commands::Fetch {} => {


@@ -56,7 +56,6 @@ wit-component.workspace = true
workspace.workspace = true
[dev-dependencies]
isahc_http_client.workspace = true
ctor.workspace = true
env_logger.workspace = true
fs = { workspace = true, features = ["test-support"] }
@@ -64,5 +63,5 @@ gpui = { workspace = true, features = ["test-support"] }
language = { workspace = true, features = ["test-support"] }
parking_lot.workspace = true
project = { workspace = true, features = ["test-support"] }
tokio.workspace = true
reqwest_client.workspace = true
workspace = { workspace = true, features = ["test-support"] }


@@ -42,7 +42,10 @@ impl Settings for ExtensionSettings {
fn load(sources: SettingsSources<Self::FileContent>, _cx: &mut AppContext) -> Result<Self> {
SettingsSources::<Self::FileContent>::json_merge_with(
[sources.default].into_iter().chain(sources.user),
[sources.default]
.into_iter()
.chain(sources.user)
.chain(sources.server),
)
}
}
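
The change above adds server-provided settings as a third layer in the merge, after defaults and user settings. A minimal sketch of that layered JSON merge using serde_json directly; the real code goes through `SettingsSources::json_merge_with`, so this only illustrates the override order.

```rust
use serde_json::{json, Value};

// Recursively merge `overlay` into `base`: objects merge key by key,
// everything else is replaced outright.
fn merge(base: &mut Value, overlay: Value) {
    match (base, overlay) {
        (Value::Object(base_map), Value::Object(overlay_map)) => {
            for (key, value) in overlay_map {
                merge(base_map.entry(key).or_insert(Value::Null), value);
            }
        }
        (slot, value) => *slot = value,
    }
}

fn main() {
    let mut settings = json!({ "auto_update": true, "theme": "one-dark" });
    // Later layers win: default <- user <- server.
    for layer in [json!({ "theme": "ayu" }), json!({ "auto_update": false })] {
        merge(&mut settings, layer);
    }
    assert_eq!(settings, json!({ "auto_update": false, "theme": "ayu" }));
}
```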

Some files were not shown because too many files have changed in this diff.