Compare commits

...

83 Commits

Author SHA1 Message Date
Nathan Sobo
5790f85383 Experiment with duckdb for analytics 2024-10-24 11:33:45 -06:00
Antonio Scandurra
499e1459eb Fix crash in collab when sending worktree updates (#19678)
This pull request does a couple of things:

- In 29c2df73e1, we introduced a safety
guard that prevents this crash from happening again in the future by
returning an error instead of panicking when the payload is too large.
- In 3e7a2e5c30, we introduced chunking
for updates coming from SSH servers (previously, we were sending the
whole changeset and initial set of paths in their entirety).
- In 122b5b4, we introduced a panic hook that sends panics to Axiom.
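As a rough illustration of that safety guard, here is a minimal sketch of a payload-size check that returns an error instead of panicking; the constant, limit, and function name are hypothetical, not the actual collab code:

```rust
// Hypothetical sketch: refuse to send an oversized payload instead of panicking.
const MAX_PAYLOAD_BYTES: usize = 4 * 1024 * 1024; // illustrative limit

fn check_payload_size(payload: &[u8]) -> anyhow::Result<()> {
    if payload.len() > MAX_PAYLOAD_BYTES {
        anyhow::bail!(
            "worktree update payload is {} bytes, exceeding the {} byte limit",
            payload.len(),
            MAX_PAYLOAD_BYTES
        );
    }
    Ok(())
}
```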

For posterity, this is how we figured out what the panic was:

```
kubectl logs current-pod-name --previous --namespace=production
```

Release Notes:

- N/A

---------

Co-authored-by: Thorsten <thorsten@zed.dev>
Co-authored-by: Bennet <bennet@zed.dev>
Co-authored-by: Kirill <kirill@zed.dev>
2024-10-24 15:57:24 +02:00
Danilo Leal
b5aea548a8 ssh: Capitalize error and connection strings (#19675)
Another tiny PR for the sake of consistency :)

Release Notes:

- N/A
2024-10-24 09:43:35 -03:00
Danilo Leal
3c6a505166 docs: Add tweaks to the Remote Development page (#19674)
Just making sure we also add the other keybinding to open the Remote
Projects dialog and capitalize every "SSH" mention for consistency. Tiny
stuff!

Release Notes:

- N/A
2024-10-24 09:35:59 -03:00
Thorsten Ball
efc4d3efdf ssh remoting: Fix wrong working directory for SSH terminals (#19672)
Before this change, we would save the working directory of each shell
running in a terminal *on the client*.

While that's technically right, it's wrong in all of these cases where
`working_directory` was used:

- in inline assistant
- when resolving file paths in the terminal output
- when serializing the current working dir and deserializing it on
restart

Release Notes:

- Fixed terminals opened on remote hosts failing to deserialize with an
error message after restarting Zed.
2024-10-24 13:52:26 +02:00
Bennet Bo Fenner
4214ed927f project panel: Add indent guides (#18260)
See #12673



https://github.com/user-attachments/assets/94079afc-a851-4206-9c9b-4fad3542334e



TODO:
- [x] Make active indent guides work for autofolded directories
- [x] Figure out which theme colors to use
- [x] Fix horizontal scrolling
- [x] Make indent guides easier to click
- [x] Fix selected background flashing when hovering over entry/indent
guide
- [x] Docs

Release Notes:

- Added indent guides to the project panel
2024-10-24 13:07:20 +02:00
Zhang
e040b200bc project_panel: Make up/down in file rename editor not select items (#19670)
Closes #19017 

Release Notes:

- Fixed project panel bug when renaming files where up/down keys could
select other files.
2024-10-24 12:15:42 +02:00
Thorsten Ball
1dba50f42f ssh remoting: Fix version check (#19668)
This snuck in when Bennet and I were debugging why our connection to the
SSH host would break. We suspected that somewhere something was logging
to STDOUT and, I guess, we changed all `println!` to `eprintln!`.

Now, two weeks later, I'm sitting here, wondering why the version check
doesn't work anymore. The server always reports a version of `""`.

Turns out we take the command's STDOUT and not STDERR, which is correct.

But it also turns out we started to print the version to STDERR, which
breaks the version check.

One-character bug & one-character fix.
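For reference, the general shape of such a check is to read the remote command's STDOUT; a minimal sketch with a hypothetical invocation, not the actual Zed code:

```rust
use std::process::Command;

// Illustrative only: run the version command over ssh and read STDOUT.
// If the server prints its version to STDERR instead, `stdout` is empty
// and the check always sees "".
fn read_remote_version() -> std::io::Result<String> {
    let output = Command::new("ssh")
        .args(["example-host", "zed-remote-server", "version"]) // hypothetical command
        .output()?;
    Ok(String::from_utf8_lossy(&output.stdout).trim().to_string())
}
```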

Release Notes:

- N/A
2024-10-24 11:37:32 +02:00
renovate[bot]
0ffc92ab65 Update actions/checkout digest to 11bd719 (#19636)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [actions/checkout](https://redirect.github.com/actions/checkout) |
action | digest | `eef6144` -> `11bd719` |

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-23 21:57:06 -04:00
Marshall Bowers
d30361537e assistant: Update SlashCommand trait with streaming return type (#19652)
This PR updates the `SlashCommand` trait to use a streaming return type.

This change is just at the trait layer. The goal here is to decouple
changing the trait's API while preserving behavior on either side.

The `SlashCommandOutput` type now has two methods for converting to and
from a stream, for use in cases where we're not yet doing streaming.

On the `SlashCommand` implementer side, implementers can call
`to_event_stream` to produce a stream of events based on the
`SlashCommandOutput`.

On the slash command consumer side we use
`SlashCommandOutput::from_event_stream` to convert a stream of events
back into a `SlashCommandOutput`.

The `/file` slash command has been updated to emit `SlashCommandEvent`s
directly in order for it to work properly.
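As a rough sketch of the two conversion directions, with simplified, hypothetical stand-ins for the real types:

```rust
use futures::{stream, Stream, StreamExt};

// Hypothetical, simplified stand-ins for the real Zed types.
struct SlashCommandOutput { text: String }
enum SlashCommandEvent { Content(String), Done }

impl SlashCommandOutput {
    // Implementer side: turn an already-materialized output into a stream of events.
    fn to_event_stream(self) -> impl Stream<Item = SlashCommandEvent> {
        stream::iter(vec![SlashCommandEvent::Content(self.text), SlashCommandEvent::Done])
    }

    // Consumer side: collect a stream of events back into a single output.
    async fn from_event_stream(events: impl Stream<Item = SlashCommandEvent>) -> Self {
        let mut events = Box::pin(events);
        let mut text = String::new();
        while let Some(event) = events.next().await {
            if let SlashCommandEvent::Content(chunk) = event {
                text.push_str(&chunk);
            }
        }
        SlashCommandOutput { text }
    }
}
```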

Release Notes:

- N/A

---------

Co-authored-by: Max <max@zed.dev>
2024-10-23 21:26:50 -04:00
renovate[bot]
510c71d41b Pin crate-ci/typos action to 8e6a428 (#19635)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [crate-ci/typos](https://redirect.github.com/crate-ci/typos) | action
| pinDigest | -> `8e6a428` |

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-23 19:00:28 -04:00
renovate[bot]
013d2d52fd Update Rust crate anyhow to v1.0.91 (#19640)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [anyhow](https://redirect.github.com/dtolnay/anyhow) |
workspace.dependencies | patch | `1.0.89` -> `1.0.91` |

---

### Release Notes

<details>
<summary>dtolnay/anyhow (anyhow)</summary>

###
[`v1.0.91`](https://redirect.github.com/dtolnay/anyhow/releases/tag/1.0.91)

[Compare
Source](https://redirect.github.com/dtolnay/anyhow/compare/1.0.90...1.0.91)

- Ensure OUT_DIR is left with deterministic contents after build script
execution
([#&#8203;388](https://redirect.github.com/dtolnay/anyhow/issues/388))

###
[`v1.0.90`](https://redirect.github.com/dtolnay/anyhow/releases/tag/1.0.90)

[Compare
Source](https://redirect.github.com/dtolnay/anyhow/compare/1.0.89...1.0.90)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-24 00:45:58 +03:00
Conrad Irwin
eee91f3f1b docs: Update SSH docs (#19339)
Update of the SSH remoting docs

Release Notes:

- N/A
2024-10-23 14:25:27 -06:00
Kirill Bulatov
e87d5e145d Use zstd without dynamic linking due to musl usage (#19627)
Due to leaning towards `musl` builds, stop dynamically linking `zstd` and
link it statically for Zed as well.


bfe1e34f59/zstd-safe/zstd-sys/build.rs (L260)
shows that the `ZSTD_SYS_USE_PKG_CONFIG` env var can be used to restore this
behavior.

Release Notes:

- N/A
2024-10-23 22:19:06 +03:00
Peter Tripp
291af664e1 Switch to Anthropic -latest tags (#19615)
- Closes: https://github.com/zed-industries/zed/issues/19609

Switches us to using `-latest` tags with Anthropic models instead of
pinning to a specific date version.
See: [Anthropic Model
Docs](https://docs.anthropic.com/en/docs/about-claude/models)

This is a no-op for:
- Claude 3 Opus (`claude-3-opus-20240229`)
- Claude 3 Sonnet (`claude-3-sonnet-20240229`)
- Claude 3 Haiku (`claude-3-haiku-20240307`)

For Claude 3.5 Sonnet this will update us from
`claude-3-5-sonnet-20240620` to `claude-3-5-sonnet-20241022`. We will
also pickup any subsequent model updates automatically when Anthropic
updates the `latest` tag.

This matches the behavior for OpenAI, where we use `gpt-4o` as the
model name and not `gpt-4o-2024-08-06`. 2024-10-23 15:13:52 -04:00
2024-10-23 15:13:52 -04:00
Marshall Bowers
9c0dba4ce1 Add a SlashCommandResult type alias (#19633)
This PR adds a new `SlashCommandResult` type alias.

We're going to be changing what slash commands can return in order to
support streaming, so having this type alias in place will make that
switch a bit more neat.

Release Notes:

- N/A
2024-10-23 14:32:43 -04:00
Peter Tripp
8bfd27b00b docs: Improve Markdown trailing whitespace section (#19630) 2024-10-23 13:23:01 -04:00
Joseph T. Lyons
c6f08dea89 v0.160.x dev 2024-10-23 13:12:28 -04:00
Piotr Osiewicz
9dfe4a30bb languages: Do not expose unnecessary captures from tasks (#19625)
This tackles an issue where we exposed unnecessary env variables in the
environment which are not actually needed for the tasks themselves (and may
have little utility), yet get in the way of SSH remoting.

/cc @ConradIrwin 

Release Notes:

- N/A
2024-10-23 18:54:08 +02:00
Marshall Bowers
69b12f4e33 semantic_index: Disable embeddings index for non-staff (#19618)
This PR disables the embeddings index for non-staff users.

Release Notes:

- N/A
2024-10-23 12:34:51 -04:00
Danilo Leal
c3860804ff ssh: Ensure long server names (and nicknames) truncate (#19621)
Just polishing the UI a bit more. One drawback of this, though, is that
if you _do_ have a big nickname or server name, with this current
solution, you won't be able to see it. Ideally, we should be able to
hover over it and see it in a tooltip, but the `div` still doesn't
support that out of the box.

| Main modal | Modal header |
|--------|--------|
| <img width="1136" alt="Screenshot 2024-10-23 at 12 49 18"
src="https://github.com/user-attachments/assets/ed5f0222-faa1-49bd-b249-2f22497566d8">
| <img width="1136" alt="Screenshot 2024-10-23 at 12 49 23"
src="https://github.com/user-attachments/assets/5a464b12-99e8-4934-aa6a-c9c4c40ea4d4">
|

Release Notes:

- N/A
2024-10-23 13:31:03 -03:00
Thorsten Ball
6a0c19fcf9 ssh remoting: Log server version check result (#19622)
Release Notes:

- N/A
2024-10-23 18:30:48 +02:00
Tom Zaspel
622c266160 docs: Add documentation for auto_install_extensions setting (#19559)
Improved: Documentation for (un)installing extensions automatically.

Signed-off-by: Tom Zaspel <tom@zaspel.it>
Co-authored-by: Peter Tripp <peter@zed.dev>
2024-10-23 11:57:13 -04:00
Thorsten Ball
dc4396b79c ssh remoting: Fix overflowing host in titlebar (#19619)
Release Notes:

- N/A

Co-authored-by: Danilo <danilo@zed.dev>
2024-10-23 18:37:30 +03:00
Kirill Bulatov
1a59b6413b Fill the rest of the prompt data when sending to the client (#19616)
Follow-up of https://github.com/zed-industries/zed/pull/19587

Release Notes:

- N/A
2024-10-23 18:33:57 +03:00
Jamie Bowers
992155c60c Update example node settings in default.json to use the correct key for node path (#19607)
Hey team!

I was investigating the new node settings added in #18172, and when
trying to set the node path I noticed that the example settings in
`default.json` use the wrong key for the `node_path` - it should be
`path` instead.

See
[here](19eebcd349/crates/zed/src/main.rs (L488))
for where the setting is used.

Release Notes:

- N/A
2024-10-23 18:33:31 +03:00
Thorsten Ball
e633f62eaf ssh remoting: Fix opening settings file (#19614)
We have to do `workspace.with_local_workspace`, otherwise we'll try to
open the settings on the remote host.


Release Notes:

- N/A
2024-10-23 16:56:39 +02:00
Kirill Bulatov
b85af0e533 Fall back to handling the abs path for external worktree entries (#19612)
Certain files, like Rust stdlib ones, can be opened by cmd-clicking in the
terminal, on editor contents, etc.

Those files do not belong to the current worktree, so a fake worktree with a
single file, invisible (i.e. its dir(s) will not be shown in the UI such as
the project panel), is created when the file is opened.

When the file is closed, the worktree is closed and removed along the
way, so those worktrees are considered ephemeral and their ids are not
stored in the database.
This caused issues when reopening such files after they were closed.

The PR makes Zed fall back to opening the file by abs path when it's
not in the project metadata but has the abs path stored in history or
in the opened items DB data.

Release Notes:

- Handle [re]opening external worktree entries better
2024-10-23 17:34:23 +03:00
Thorsten Ball
bce1b7a10a ssh remoting: Add setting to download binary on host (#19606)
This adds the following optional setting:

```json
{
  "remote_server": {
    "download_on_host": false
  }
}
```
Right now, it's **off by default** because I haven't tested it enough.

Release Notes:

- N/A
2024-10-23 16:06:16 +02:00
Peter Tripp
c292bdd2ca docs: Improve language server configuration dotted/nested notation example (#19608) 2024-10-23 09:41:27 -04:00
Danilo Leal
1cb9f64917 ssh: Clean up bits of the main modal UI (#19604)
This PR's main relevant change is removing the logic we had inserted for
keyboard nav scroll as that was unreliable; we need to figure out a
better solution still. I'm also removing the visible on hover behavior
for the scrollbar as that was making us lose the click and drag feature
the component has. Lastly, I added a bit of right-margin in the delete
icon button so that's not too crammed with the scrollbar.

Release Notes:

- N/A
2024-10-23 09:49:44 -03:00
Piotr Osiewicz
1bded42b2a ssh: Dismiss file picker when new window is opened (#19600)
Closes #ISSUE

Release Notes:

- N/A
2024-10-23 13:57:50 +02:00
Thorsten Ball
7f64f0454d ssh remoting: Ensure only single instance can upload binary (#19597)
This creates a `<binary_path>.lock` file and checks for it whenever
uploading a binary.

Parameters:
- Wait for other instance to finish: 10m
- Mark lockfile as stale and ignore it after: 10m
- When waiting on another process, check every 5 seconds

We can tweak all of them.

Ideally we'd have a value in the lockfile that lets us detect whether
the other process is still there, but I haven't found something stable
yet:

- We don't have a stable PID on the server side when we run multiple
commands
- The `ControlPath` is on the client side
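A minimal sketch of that lockfile protocol, with illustrative paths, durations, and error handling rather than the actual Zed implementation:

```rust
use std::{fs, io, path::{Path, PathBuf}, thread, time::Duration};

const WAIT_FOR_OTHER_INSTANCE: Duration = Duration::from_secs(10 * 60);
const STALE_AFTER: Duration = Duration::from_secs(10 * 60);
const POLL_INTERVAL: Duration = Duration::from_secs(5);

fn acquire_upload_lock(binary_path: &Path) -> io::Result<PathBuf> {
    let lock_path = PathBuf::from(format!("{}.lock", binary_path.display()));
    let mut waited = Duration::ZERO;
    loop {
        // `create_new` fails if the file already exists, so only one instance
        // can hold the lock at a time.
        match fs::OpenOptions::new().write(true).create_new(true).open(&lock_path) {
            Ok(_) => return Ok(lock_path),
            Err(err) if err.kind() == io::ErrorKind::AlreadyExists => {
                // If the lockfile is older than the stale threshold, assume the
                // other process died and take the lock over.
                let age = fs::metadata(&lock_path)
                    .and_then(|m| m.modified())
                    .map(|t| t.elapsed().unwrap_or_default());
                if matches!(age, Ok(age) if age > STALE_AFTER) {
                    let _ = fs::remove_file(&lock_path);
                    continue;
                }
                if waited >= WAIT_FOR_OTHER_INSTANCE {
                    return Err(io::Error::new(io::ErrorKind::TimedOut, "gave up waiting for other upload"));
                }
                thread::sleep(POLL_INTERVAL);
                waited += POLL_INTERVAL;
            }
            Err(err) => return Err(err),
        }
    }
}
```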

Release Notes:

- N/A
2024-10-23 13:18:27 +02:00
Vitaly Slobodin
375bc88f95 lsp_log: Add server capabilities view (#19448)
Hello, this PR adds a new view to the LSP servers menu for
displaying an LSP server's capabilities.

When I work on LSP stuff, quite often I need to check what capabilities
an LSP server has. Currently there is no built-in way to check that
in Zed, and I have to use the [`LSP
DevTools`](https://lsp-devtools.readthedocs.io) project. LSP DevTools
works OK, but it acts as a proxy between the client and the server, so
setting it up is not that easy in Zed. Zed already has many goodies for
LSP like tracing and RPC messages, so I thought that a simple view with
server capabilities could be useful too. Thanks!

## Some screenshots:

### Ruby LSP

![CleanShot 2024-10-19 at 07 44
38@2x](https://github.com/user-attachments/assets/22c97b49-c539-4e39-a5f1-1c926347abca)


### New menu entry:

![CleanShot 2024-10-19 at 07 45
08@2x](https://github.com/user-attachments/assets/d3903d6e-c09a-40e2-b042-1abde490987d)


Release Notes:

- N/A
2024-10-23 12:53:49 +02:00
wannacu
d53a86b01d project_panel: Fix the confusing display when files or directories have the same name in a case-insensitive comparison (#19211)
- Closes #18851

Release Notes:

- Fixed directory name comparison on Linux not considering the case
([#18851](https://github.com/zed-industries/zed/issues/18851))
2024-10-23 10:37:57 +03:00
Kevin Wang
6c7e79eff6 Cap the size of the Supermaven states buffer (#19246)
Caps the size of the Supermaven states buffer to 1000 elements.
Previously, the buffer would grow unbounded so for long sessions the
number of states that the Supermaven autocomplete provider maintains can
be quite large. In practice, states that are sufficiently old are so
unlikely to be visited again that we can regenerate the completion.
Thus, we can cap the buffer to 1000 elements.
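A minimal sketch of the capping behavior (illustrative types, not the actual Supermaven provider code):

```rust
use std::collections::VecDeque;

// Illustrative only: once the cap is hit, the oldest state is dropped so the
// buffer stays at a constant size over long sessions.
const MAX_STATES: usize = 1000;

struct StateBuffer<T> {
    states: VecDeque<T>,
}

impl<T> StateBuffer<T> {
    fn push(&mut self, state: T) {
        if self.states.len() == MAX_STATES {
            // Old states are unlikely to be revisited; regenerate the
            // completion on demand instead of keeping them around.
            self.states.pop_front();
        }
        self.states.push_back(state);
    }
}
```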

Release Notes:

- N/A
2024-10-23 10:36:14 +03:00
Mikayla Maki
d0bc84eb33 Fix remoting things (#19587)
- Fixes modal closing when using the remote modal folder 
- Fixes a bug with local terminals where they could open in / instead of
~
- Fixes a bug where SSH connections would continue running after their
window is closed
- Hides SSH Terminal process details from Zed UI
- Implement `cmd-o` for remote projects
- Implement LanguageServerPromptRequest for remote LSPs

Release Notes:

- N/A
2024-10-23 00:14:43 -07:00
Joseph T. Lyons
fabc14355c Update telemetry docs to not point to specific lines of code (#19583)
The old links pointed to specific lines that were no longer the correct
lines. Let's skip pointing the reader of the docs to a specific line.

Release Notes:

- N/A
2024-10-23 00:36:43 -04:00
Conrad Irwin
07e086b41e Don't upload local settings to ssh remotes (#19577)
Closes: #18618

Release Notes:

- (breaking) SSH Remoting: stop uploading local settings to the remote.
2024-10-22 20:11:05 -06:00
Mikayla Maki
48674ec54c Add paths to connection headers (#19579)
Closes #ISSUE

<img width="562" alt="Screenshot 2024-10-22 at 3 21 22 PM"
src="https://github.com/user-attachments/assets/51afd8e8-b635-4d69-9463-2ccdaabeb955">

<img width="844" alt="Screenshot 2024-10-22 at 3 22 35 PM"
src="https://github.com/user-attachments/assets/ae57c0f8-396c-485d-b5e2-14558da98a18">

Release Notes:

- N/A

---------

Co-authored-by: Trace <violet.white.batt@gmail.com>
2024-10-22 16:50:33 -07:00
Finn Evers
fc7874e64e Fix GitHub link for upstream markdown grammar (#19580)
This PR removes an extra slash from the GitHub link to the upstream
Tree-sitter Markdown grammar introduced in #19570 in the Cargo files.

Release Notes:

- N/A
2024-10-22 19:06:34 -04:00
Marshall Bowers
dcb0da0a7d collab: Return free tier usage from GET /billing/monthly_spend (#19578)
This PR updates the `GET /billing/monthly_spend` endpoint to also return
information about the free tier usage.

Release Notes:

- N/A
2024-10-22 18:06:54 -04:00
Conrad Irwin
a9f48bd9d1 Try to build with musl on CI (#19571)
- Closes #19092 
- Closes #15411

Release Notes:

- SSH Remoting: Build with musl (adds support for machines with old, or
no glibc)
2024-10-22 16:00:00 -06:00
Mikayla Maki
33197608ed Make closing the SSH modal equivalent to cancelling the SSH connection task (#19568)
TODO: 
- [x] Fix focus not being on the modal initially

Release Notes:

- N/A

---------

Co-authored-by: Trace <violet.white.batt@gmail.com>
2024-10-22 14:50:55 -07:00
Kirill Bulatov
f951410ef0 Use upstream tree-sitter-markdown (#19570)
After
https://github.com/tree-sitter-grammars/tree-sitter-markdown/pull/163,
there is no need to use the zed-industries fork of tree-sitter-markdown.

Release Notes:

- N/A
2024-10-23 00:41:01 +03:00
Peter Tripp
47ade2f9f9 Check for rsync before downloading updates (#19392) 2024-10-22 17:12:26 -04:00
Peter Tripp
263e143d1b macos: Add services menu (#16959) 2024-10-22 17:08:19 -04:00
Marshall Bowers
21a44d74bd collab: Prevent users from ending up with multiple active billing subscriptions (#19574)
This PR adds some safeguards to ensure that users do not end up with
multiple active billing subscriptions.

We now do the following:

1. When initiating a checkout, we first make sure the user does not
already have an active subscription.
2. When creating subscriptions in response to Stripe events, we ensure
that we don't already have an active subscription.

Release Notes:

- N/A
2024-10-22 17:01:59 -04:00
Derek Briggs
efd485cbb8 Add file icons for Zig, Julia, SCSS, HCL, Nix, and Roc (#19529)
Release Notes:

- Added file icons for Zig, Julia, SCSS, HCL, Nix, and Roc files.

<img width="733" alt="image"
src="https://github.com/user-attachments/assets/37602be7-173f-40d1-8177-3d35b11d4208">

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-10-22 16:03:58 -04:00
Kirill Bulatov
7a6550c1d1 Do not allow drag and drop of FS entries into the remote projects (#19565)
Release Notes:

- N/A
2024-10-22 22:34:54 +03:00
Danilo Leal
23ad470daf ssh: Add UI refinements to the remote modals (#19558)
This PR polishes spacing, borders, header design, font size, etc.

Release Notes:

- N/A
2024-10-22 16:33:59 -03:00
Conrad Irwin
6dcec47235 Show invisibles in editor (#19298)
Release Notes:

- Added highlighting for "invisible" unicode characters

Closes #16310

---------

Co-authored-by: dovakin0007 <dovakin0007@gmail.com>
Co-authored-by: dovakin0007 <73059450+dovakin0007@users.noreply.github.com>
2024-10-22 13:23:13 -06:00
Conrad Irwin
e93d62680d Improve disconnected modal for SSH (#19567)
Closes #ISSUE

Release Notes:

- SSH Remoting: made reconnect modal more robust
2024-10-22 13:22:27 -06:00
Marshall Bowers
5dbf68ddc4 editor: Save base buffers when applying changes from their branches (#19562)
This PR makes it so that when we apply changes within a branch
buffer—currently just the edits buffer—we save the underlying buffer.

This also fixes an issue where new files created via edits were not
properly flushed to disk.

Release Notes:

- N/A

---------

Co-authored-by: Max <max@zed.dev>
2024-10-22 14:10:11 -04:00
Joseph T. Lyons
d8d84bf5d4 Tell users they can close a stale issue if they can't repro (#19563)
A user commented on this issue to let us know they couldn't reproduce,
which would keep it open for another 6 months. Best we can do is tell
them what to do if they can't repro, which is to close the issue
themselves.

- https://github.com/zed-industries/zed/issues/10671

Release Notes:

- N/A
2024-10-22 13:46:21 -04:00
Mikayla Maki
6f6893a93a Fix reversed prompt in #19533 (#19538)
Release Notes:

- N/A
2024-10-22 10:33:42 -07:00
Joseph T. Lyons
7ae25d10c8 Fix comment when marking issues as stale (#19561)
Oops

<img width="928" alt="image"
src="https://github.com/user-attachments/assets/a16ea5cd-c02d-4add-9b58-853de9aa5ba7">

Release Notes:

- N/A
2024-10-22 13:29:48 -04:00
Thorsten Ball
d80683f2bf ssh remoting: Fix heartbeat timer and exit conditions (#19557)
This contains a bunch of smallish but nasty fixes:

- Heartbeat timer was never reset after first heartbeat
- Use same return value when stderr is closed as when stdout is closed
- Always check proxy process status since it should also be done when we
get to this point (either it died and our task stopped, or our task
stopped and we dropped the process handle and it was killed on drop)
- make error messages less wrongly-specific


Release Notes:

- N/A

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-22 18:45:56 +02:00
Adam Wolff
ab98d4889b assistant: Add tracking fields to Codegen for alternative assists (#19467)
Added some tracking fields to Codegen and CodegenAlternatives in order
to collect data for experimentation in a fork.

Release Notes:

- N/A
2024-10-22 12:45:01 -04:00
Richard Feldman
ce11ca9d49 Don't suggest assistant code actions if it's disabled (#19553)
Closes #19155

Release Notes:

- Fixed code action for "Fix with assistant" appearing when assistant
was disabled.
2024-10-22 12:36:36 -04:00
Kirill Bulatov
edda149d75 Do not remove worktrees after the headless server removal (#19556)
Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad@zed.dev>
2024-10-22 19:27:50 +03:00
renovate[bot]
291ca2c32c Update Rust crate wasmtime-wasi to v24.0.1 (#19338)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [wasmtime-wasi](https://redirect.github.com/bytecodealliance/wasmtime)
| workspace.dependencies | patch | `24.0.0` -> `24.0.1` |

---

### Release Notes

<details>
<summary>bytecodealliance/wasmtime (wasmtime-wasi)</summary>

###
[`v24.0.1`](https://redirect.github.com/bytecodealliance/wasmtime/releases/tag/v24.0.1)

[Compare
Source](https://redirect.github.com/bytecodealliance/wasmtime/compare/v24.0.0...v24.0.1)

#### 24.0.1

Released 2024-10-09.

##### Fixed

- Fix a runtime crash when combining tail-calls with host imports that
capture a
    stack trace or trap.

[GHSA-q8hx-mm92-4wvg](https://redirect.github.com/bytecodealliance/wasmtime/security/advisories/GHSA-q8hx-mm92-4wvg)

- Fix a race condition that could lead to WebAssembly control-flow integrity
    and type safety violations.

[GHSA-7qmx-3fpx-r45m](https://redirect.github.com/bytecodealliance/wasmtime/security/advisories/GHSA-7qmx-3fpx-r45m)

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-10-22 11:39:35 -04:00
Danilo Leal
3ba2af289b ssh: Expose server address in the title bar (#19549)
This PR exposes the server address (or the nickname, if there is one) on
the title bar and in all modals that have the SSH header. The title bar
tooltip meta description still shows the original server address
(regardless of whether a nickname exists), though.

<img width="600" alt="Screenshot 2024-10-22 at 10 58 36"
src="https://github.com/user-attachments/assets/64a94d9f-798b-44a4-9dee-6056886535bb">

Release Notes:

- N/A

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2024-10-22 12:39:00 -03:00
David Soria Parra
d8d8c908ed context_servers: Update protocol (#19547)
We sadly have to change the underlying protocol once again. This will
likely be the last change to the core protocol without correctly
handling older versions. From here on out, we want to get better with
version handling. To do so, we introduce the notion of a string protocol
version to be explicit about when the underlying protocol last changed.

This change also modifies the return values of prompts. For now, we only
allow User messages from servers, to match the current behaviour. We will
change this once #19222 lands, which will allow slash commands to insert
user and assistant messages.

Release Notes:

- N/A
2024-10-22 11:19:32 -04:00
Adam Wolff
680b3dd80b Refine AI context summary prompt (#19530)
Release Notes:

- Improved prompt for generating context editor summaries.
2024-10-22 11:02:04 -04:00
Peter Tripp
270e13bb9a docs: Document upstream JSONC Prettier issues (#19552) 2024-10-22 10:43:21 -04:00
Thorsten Ball
b3aa7055e4 ssh remoting: Daemonize server process properly by forking and closing stdio (#19544)
This fixes the `ssh` proxy process not being notified when the proxy
process dies. Turns out that the server had stdout/stderr/stdin
connected to the grand-parent ssh process, and as long as
the server kept running (even once it was daemonized into the
background) the grand-parent ssh process wouldn't exit.

That in turn meant that the Zed client wasn't notified when the proxy
process died.
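For context, a rough sketch of the classic daemonization steps described here, using raw libc calls; the actual change in Zed's remote server differs in the details:

```rust
use std::fs::File;
use std::os::fd::AsRawFd;

fn daemonize() -> std::io::Result<()> {
    // Fork and let the parent exit, so the child keeps running in the background.
    match unsafe { libc::fork() } {
        -1 => return Err(std::io::Error::last_os_error()),
        0 => {}
        _ => unsafe { libc::_exit(0) },
    }
    // Start a new session to detach from the controlling terminal.
    if unsafe { libc::setsid() } == -1 {
        return Err(std::io::Error::last_os_error());
    }
    // Point stdin/stdout/stderr at /dev/null. As long as they stay connected,
    // the grand-parent ssh process holds open pipes and never exits.
    let devnull = File::options().read(true).write(true).open("/dev/null")?;
    for fd in 0..3 {
        if unsafe { libc::dup2(devnull.as_raw_fd(), fd) } == -1 {
            return Err(std::io::Error::last_os_error());
        }
    }
    Ok(())
}
```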

Release Notes:

- N/A

---------

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-22 16:36:40 +02:00
Joseph T. Lyons
f16461d7d0 Fix Discord link (#19551)
We can't seem to generate invite links that dump a user directly into a
specific channel, so we're just adding a general invite link and then
pointing them to the Windows channel name.

Release Notes:

- N/A
2024-10-22 10:36:05 -04:00
Danilo Leal
970f8db5c4 ssh: Don't erase the connection input on escape key (#19550)
This PR changes the behavior of pressing the `Esc` key on the connection
input. Now, if you hit it, the address inserted into the input won't be
erased. Effectively, escape now only cancels the connection process
instead of doing both (clearing the input _and_ cancelling the
connection).

Release Notes:

- N/A

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2024-10-22 11:28:39 -03:00
Piotr Osiewicz
bc9086c9af ssh: Fix file picker not getting focus when it's opened for the first time (#19546)
Release Notes:

- N/A

Co-authored-by: Danilo <danilo@zed.dev>
2024-10-22 16:11:59 +02:00
Thorsten Ball
a367c6de6e ssh remote: Only send a single FlushBufferedMessages (#19541)
Release Notes:

- N/A

---------

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-22 12:04:15 +02:00
Thorsten Ball
27d1a566d0 ssh remoting: Emit Disconnected when reconnected exhausted (#19540)
Release Notes:

- N/A

---------

Co-authored-by: Bennet <bennet@zed.dev>
2024-10-22 11:35:27 +02:00
Conrad Irwin
4f52077d97 Better error handling for SSH (#19533)
Before this change we sometimes showed errors inline, sometimes in
alerts.
Sometimes we closed the window, sometimes we didn't.

Now they always show as prompts and we never close windows.

Co-Authored-By: Mikayla <mikayla@zed.dev>

Release Notes:

- SSH Remoting: Improve error handling
2024-10-21 22:17:42 -07:00
Conrad Irwin
6e485453d0 Rename dev servers to remote projects in UI (#19527)
Release Notes:

- N/A

Co-authored-by: Mikayla <mikayla@zed.dev>
2024-10-21 16:50:25 -06:00
Conrad Irwin
9bae93cd39 SSH Remoting: Fix yes/no/fingerprint prompt (#19526)
Release Notes:

- SSH Remoting: fix SSH fingerprint prompt

Co-authored-by: Mikayla <mikayla@zed.dev>
2024-10-21 15:28:22 -06:00
Marshall Bowers
1a4b253ee5 collab: Add support for a custom monthly allowance for LLM usage (#19525)
This PR adds support for setting a monthly LLM usage allowance for
certain users.

Release Notes:

- N/A
2024-10-21 17:12:33 -04:00
Vladimir Varankin
89f6b65ee6 docs: Add missing link to jsonnet.md in the summary (#19522)
This is a fixup for #19410.

Apparently, `mdbook` requires a reference to the document from the
`SUMMARY.md`. This fixes the 404
(https://zed.dev/docs/languages/jsonnet.html) and also adds Jsonnet
to the docs' navigation.

Release Notes:

- N/A
2024-10-21 15:56:56 -04:00
Bruno Calza
a2c6b4ad2f Fix empty keystroke with simulated IME (#19414)
Closes #19181

When the keystroke was empty (""), the `ime_key` was converted from
`None` to `Some("")` when `with_simulated_ime` was called. That led to
unintentional behavior when an empty keystroke was combined
with `shift-up` in a keybinding `["workspace::SendKeystrokes", "shift-up
"]`.

By adding a `key.is_empty()` check we make sure the `ime_key` stays `None`.

This was manually tested.
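A tiny sketch of that guard (hypothetical signature, not the actual GPUI code):

```rust
// Illustrative only: only promote the key into `ime_key` when the keystroke
// is non-empty, so an empty ("") keystroke leaves `ime_key` as `None`.
fn simulated_ime_key(key: &str, ime_key: Option<String>) -> Option<String> {
    if ime_key.is_none() && !key.is_empty() {
        Some(key.to_string())
    } else {
        ime_key
    }
}
```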

Release Notes:

- Fixed empty keystrokes with simulated IME

Signed-off-by: Bruno Calza <brunoangelicalza@gmail.com>
2024-10-21 13:44:05 -06:00
Conrad Irwin
6b7d85b769 SSH Remoting: Fix Save As (#19517)
Co-Authored-By: Mikayla <mikayla@zed.dev>

Closes #ISSUE

Release Notes:

- SSH Remoting: Fix SaveAs to pick the file on the remote

Co-authored-by: Mikayla <mikayla@zed.dev>
2024-10-21 13:41:48 -06:00
Max Brunsfeld
cb3eb75712 Fix block-wise autoindent when editing adjacent ranges (#19521)
This fixes problems where auto-indent wasn't working correctly for
assistant edits.

Release Notes:

- Fixed a bug where auto-indent didn't work correctly when pasting with
multiple cursors on adjacent lines

Co-authored-by: Marshall <marshall@zed.dev>
2024-10-21 12:41:41 -07:00
Mikayla Maki
bae85d858e SSH Remoting: Fix reload/save race (#19519)
Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2024-10-21 11:23:19 -07:00
Marshall Bowers
755fd695f5 editor: Move hunk controls to the right (#19515)
This PR moves the hunk controls over to the right so we can see how they
feel over there.

Git:

<img width="1068" alt="Screenshot 2024-10-21 at 10 27 34 AM"
src="https://github.com/user-attachments/assets/71556e22-024a-4bdf-8a99-fe28430b9155">

Live diffs:

<img width="1060" alt="Screenshot 2024-10-21 at 10 27 28 AM"
src="https://github.com/user-attachments/assets/681ff409-dc55-4b63-87d7-7e39016417d2">


Release Notes:

- Moved hunk controls to the right of the header.
2024-10-21 10:45:54 -04:00
Daniel Eichman
74e25c11f1 Fix incorrect checkbox placement in Markdown preview (#19383)
- Closes #12515

Before fix:
<img width="1506" alt="Screenshot 2024-10-17 at 09 50 19"
src="https://github.com/user-attachments/assets/250f50cb-0119-4b96-bc9b-7258aa83247c">

After fix:
<img width="1027" alt="Screenshot 2024-10-17 at 09 52 36"
src="https://github.com/user-attachments/assets/c2eb7e4a-3c03-466c-b215-7fcc22eed024">

Testing:
- Manual testing 
- Added unit test

Test results: these tests fail on the main branch for my setup as well;
I have Docker running but still had some failures:
```
failures:
    tests::integration_tests::test_context_collaboration_with_reconnect
    tests::integration_tests::test_formatting_buffer
    tests::integration_tests::test_fs_operations
    tests::integration_tests::test_git_branch_name
    tests::integration_tests::test_git_diff_base_change
    tests::integration_tests::test_git_status_sync
    tests::integration_tests::test_join_after_restart
    tests::integration_tests::test_join_call_after_screen_was_shared
    tests::integration_tests::test_joining_channels_and_calling_multiple_users_simultaneously
    tests::integration_tests::test_leaving_project
    tests::integration_tests::test_leaving_worktree_while_opening_buffer
    tests::integration_tests::test_local_settings
    tests::integration_tests::test_lsp_hover
    tests::integration_tests::test_mute_deafen
    tests::integration_tests::test_open_buffer_while_getting_definition_pointing_to_it
    tests::integration_tests::test_pane_split_left
    tests::integration_tests::test_prettier_formatting_buffer
    tests::integration_tests::test_preview_tabs
    tests::integration_tests::test_project_reconnect
    tests::integration_tests::test_project_search
    tests::integration_tests::test_project_symbols
    tests::integration_tests::test_propagate_saves_and_fs_changes
    tests::integration_tests::test_references
    tests::integration_tests::test_reloading_buffer_manually
    tests::integration_tests::test_right_click_menu_behind_collab_panel
    tests::integration_tests::test_room_location
    tests::integration_tests::test_room_uniqueness
    tests::integration_tests::test_server_restarts
    tests::integration_tests::test_unshare_project
    tests::notification_tests::test_notifications
    tests::random_project_collaboration_tests::test_random_project_collaboration
    tests::remote_editing_collaboration_tests::test_sharing_an_ssh_remote_project

test result: FAILED. 156 passed; 32 failed; 0 ignored; 0 measured; 0 filtered out; finished in 100.98s
```
Comments:
I do not have a ton of rust knowledge, so very open to feedback. TYSM

Release Notes:

- Fixed incorrect checkbox placement in Markdown preview

---------

Co-authored-by: Bennet Bo Fenner <bennet@zed.dev>
2024-10-21 15:57:49 +02:00
169 changed files with 5423 additions and 1771 deletions

View File

@@ -11,7 +11,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
fetch-depth: 0

View File

@@ -18,7 +18,7 @@ jobs:
- buildjet-16vcpu-ubuntu-2204
steps:
- name: Checkout code
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
ref: ${{ github.event.inputs.branch }}
ssh-key: ${{ secrets.ZED_BOT_DEPLOY_KEY }}

View File

@@ -36,7 +36,7 @@ jobs:
- test
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
fetch-depth: 0 # fetch full history
@@ -78,13 +78,13 @@ jobs:
- buildjet-8vcpu-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- name: Run style checks
uses: ./.github/actions/check_style
- name: Check for typos
uses: crate-ci/typos@v1.24.6
uses: crate-ci/typos@8e6a4285bcbde632c5d79900a7779746e8b7ea3f # v1.24.6
with:
config: ./typos.toml
@@ -96,7 +96,7 @@ jobs:
- test
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -133,7 +133,7 @@ jobs:
run: echo "$HOME/.cargo/bin" >> $GITHUB_PATH
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -165,7 +165,7 @@ jobs:
run: echo "$HOME/.cargo/bin" >> $GITHUB_PATH
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -188,7 +188,7 @@ jobs:
runs-on: hosted-windows-1
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -229,7 +229,7 @@ jobs:
node-version: "18"
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# We need to fetch more than one commit so that `script/draft-release-notes`
# is able to diff between the current and previous tag.
@@ -314,7 +314,7 @@ jobs:
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -361,7 +361,7 @@ jobs:
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -386,7 +386,6 @@ jobs:
- name: Upload app bundle to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ env.RELEASE_CHANNEL == 'preview' || env.RELEASE_CHANNEL == 'stable' }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}

View File

@@ -14,10 +14,10 @@ jobs:
stale-issue-message: >
Hi there! 👋
We're working to clean up our issue tracker by closing older issues that might not be relevant anymore. Are you able to reproduce this issue in the latest version of Zed? If so, please let us know by commenting on this issue and we will keep it open; otherwise, we'll close it in 7 days. Feel free to open a new issue if you're seeing this message after the issue has been closed.
We're working to clean up our issue tracker by closing older issues that might not be relevant anymore. If you are able to reproduce this issue in the latest version of Zed, please let us know by commenting on this issue, and we will keep it open. If you can't reproduce it, feel free to close the issue yourself. Otherwise, we'll close it in 7 days.
Thanks for your help!
close-issue-message: "This issue was closed due to inactivity. If you're still experiencing this problem, please open a new issue with a link to this issue."
close-issue-message: "This issue was closed due to inactivity. If you're still experiencing this problem, please open a new issue with a link to this issue."
# We will increase `days-before-stale` to 365 on or after Jan 24th,
# 2024. This date marks one year since migrating issues from
# 'community' to 'zed' repository. The migration added activity to all

View File

@@ -10,7 +10,7 @@ jobs:
runs-on: ubuntu-latest
if: github.repository_owner == 'zed-industries'
steps:
- uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- name: Set up uv
uses: astral-sh/setup-uv@f3bcaebff5eace81a1c062af9f9011aae482ca9d # v3
with:

View File

@@ -10,7 +10,7 @@ jobs:
runs-on: ubuntu-latest
if: github.repository_owner == 'zed-industries'
steps:
- uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- name: Set up uv
uses: astral-sh/setup-uv@f3bcaebff5eace81a1c062af9f9011aae482ca9d # v3
with:

View File

@@ -14,7 +14,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:

View File

@@ -13,7 +13,7 @@ jobs:
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false

View File

@@ -17,7 +17,7 @@ jobs:
- test
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
fetch-depth: 0
@@ -36,7 +36,7 @@ jobs:
needs: style
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
fetch-depth: 0
@@ -71,7 +71,7 @@ jobs:
run: doctl registry login
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -97,7 +97,7 @@ jobs:
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false

View File

@@ -15,7 +15,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:
@@ -31,7 +31,7 @@ jobs:
}
- name: Check for Typos with Typos-CLI
uses: crate-ci/typos@v1.24.6
uses: crate-ci/typos@8e6a4285bcbde632c5d79900a7779746e8b7ea3f # v1.24.6
with:
config: ./typos.toml
files: ./docs/

View File

@@ -16,7 +16,7 @@ jobs:
- ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false

View File

@@ -27,7 +27,7 @@ jobs:
node-version: "18"
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false

View File

@@ -23,7 +23,7 @@ jobs:
- test
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
fetch-depth: 0
@@ -44,7 +44,7 @@ jobs:
needs: style
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -75,7 +75,7 @@ jobs:
node-version: "18"
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -109,7 +109,7 @@ jobs:
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -149,7 +149,7 @@ jobs:
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
@@ -182,7 +182,7 @@ jobs:
- bundle-linux-arm
steps:
- name: Checkout repo
uses: actions/checkout@eef61447b9ff4aafe5dcd4e0bbf5d482be7e7871 # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
fetch-depth: 0

Cargo.lock (generated)
View File

@@ -71,6 +71,7 @@ checksum = "e89da841a80418a9b391ebaea17f5c112ffaaa96f621d2c285b5174da76b9011"
dependencies = [
"cfg-if",
"const-random",
"getrandom 0.2.15",
"once_cell",
"version_check",
"zerocopy",
@@ -261,9 +262,9 @@ checksum = "34cd60c5e3152cef0a592f1b296f1cc93715d89d2551d85315828c3a09575ff4"
[[package]]
name = "anyhow"
version = "1.0.89"
version = "1.0.91"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "86fdf8605db99b54d3cd748a44c6d04df638eb5dafb219b135d0149bd0db01f6"
checksum = "c042108f3ed77fd83760a5fd79b53be043192bb3b9dba91d8c574c0ada7850c8"
[[package]]
name = "approx"
@@ -306,6 +307,168 @@ dependencies = [
"serde",
]
[[package]]
name = "arrow"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a9ba0d7248932f4e2a12fb37f0a2e3ec82b3bdedbac2a1dce186e036843b8f8c"
dependencies = [
"arrow-arith",
"arrow-array",
"arrow-buffer",
"arrow-cast",
"arrow-data",
"arrow-ord",
"arrow-row",
"arrow-schema",
"arrow-select",
"arrow-string",
]
[[package]]
name = "arrow-arith"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d60afcdc004841a5c8d8da4f4fa22d64eb19c0c01ef4bcedd77f175a7cf6e38f"
dependencies = [
"arrow-array",
"arrow-buffer",
"arrow-data",
"arrow-schema",
"chrono",
"half",
"num",
]
[[package]]
name = "arrow-array"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7f16835e8599dbbb1659fd869d865254c4cf32c6c2bb60b6942ac9fc36bfa5da"
dependencies = [
"ahash 0.8.11",
"arrow-buffer",
"arrow-data",
"arrow-schema",
"chrono",
"half",
"hashbrown 0.14.5",
"num",
]
[[package]]
name = "arrow-buffer"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1a1f34f0faae77da6b142db61deba2cb6d60167592b178be317b341440acba80"
dependencies = [
"bytes 1.7.2",
"half",
"num",
]
[[package]]
name = "arrow-cast"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "450e4abb5775bca0740bec0bcf1b1a5ae07eff43bd625661c4436d8e8e4540c4"
dependencies = [
"arrow-array",
"arrow-buffer",
"arrow-data",
"arrow-schema",
"arrow-select",
"atoi",
"base64 0.22.1",
"chrono",
"comfy-table",
"half",
"lexical-core",
"num",
"ryu",
]
[[package]]
name = "arrow-data"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2b1e618bbf714c7a9e8d97203c806734f012ff71ae3adc8ad1b075689f540634"
dependencies = [
"arrow-buffer",
"arrow-schema",
"half",
"num",
]
[[package]]
name = "arrow-ord"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2427f37b4459a4b9e533045abe87a5183a5e0995a3fc2c2fd45027ae2cc4ef3f"
dependencies = [
"arrow-array",
"arrow-buffer",
"arrow-data",
"arrow-schema",
"arrow-select",
"half",
"num",
]
[[package]]
name = "arrow-row"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "15959657d92e2261a7a323517640af87f5afd9fd8a6492e424ebee2203c567f6"
dependencies = [
"ahash 0.8.11",
"arrow-array",
"arrow-buffer",
"arrow-data",
"arrow-schema",
"half",
]
[[package]]
name = "arrow-schema"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fbf0388a18fd7f7f3fe3de01852d30f54ed5182f9004db700fbe3ba843ed2794"
dependencies = [
"bitflags 2.6.0",
]
[[package]]
name = "arrow-select"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b83e5723d307a38bf00ecd2972cd078d1339c7fd3eb044f609958a9a24463f3a"
dependencies = [
"ahash 0.8.11",
"arrow-array",
"arrow-buffer",
"arrow-data",
"arrow-schema",
"num",
]
[[package]]
name = "arrow-string"
version = "53.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7ab3db7c09dd826e74079661d84ed01ed06547cf75d52c2818ef776d0d852305"
dependencies = [
"arrow-array",
"arrow-buffer",
"arrow-data",
"arrow-schema",
"arrow-select",
"memchr",
"num",
"regex",
"regex-syntax 0.8.4",
]
[[package]]
name = "as-raw-xcb-connection"
version = "1.0.1"
@@ -453,9 +616,11 @@ dependencies = [
"anyhow",
"collections",
"derive_more",
"futures 0.3.30",
"gpui",
"language",
"parking_lot",
"pretty_assertions",
"serde",
"serde_json",
"workspace",
@@ -868,7 +1033,7 @@ dependencies = [
"libc",
"pin-project",
"redox_syscall 0.2.16",
"xattr",
"xattr 0.2.3",
]
[[package]]
@@ -1009,6 +1174,7 @@ dependencies = [
"smol",
"tempfile",
"util",
"which 6.0.3",
"workspace",
]
@@ -2548,6 +2714,7 @@ dependencies = [
"dashmap 6.0.1",
"derive_more",
"dev_server_projects",
"duckdb",
"editor",
"env_logger",
"envy",
@@ -2581,6 +2748,7 @@ dependencies = [
"project",
"prometheus",
"prost",
"r2d2",
"rand 0.8.5",
"recent_projects",
"release_channel",
@@ -2696,6 +2864,17 @@ dependencies = [
"memchr",
]
[[package]]
name = "comfy-table"
version = "7.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b34115915337defe99b2aff5c2ce6771e5fbc4079f4b506301f5cf394c8452f7"
dependencies = [
"strum 0.26.3",
"strum_macros 0.26.4",
"unicode-width",
]
[[package]]
name = "command_palette"
version = "0.1.0"
@@ -3631,6 +3810,26 @@ dependencies = [
"phf",
]
[[package]]
name = "duckdb"
version = "1.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "86844939330ba6ce345c4b5333d3be45c4f0c092779bf9617bba92efb8b841f5"
dependencies = [
"arrow",
"cast",
"fallible-iterator",
"fallible-streaming-iterator",
"hashlink",
"libduckdb-sys",
"memchr",
"num-integer",
"r2d2",
"rust_decimal",
"smallvec",
"strum 0.25.0",
]
[[package]]
name = "dwrote"
version = "0.11.1"
@@ -3722,6 +3921,7 @@ dependencies = [
"tree-sitter-rust",
"tree-sitter-typescript",
"ui",
"unicode-segmentation",
"unindent",
"url",
"util",
@@ -4185,6 +4385,12 @@ version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2acce4a10f12dc2fb14a218589d4f1f62ef011b2d0cc4b3cb1bba8e94da14649"
[[package]]
name = "fallible-streaming-iterator"
version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7360491ce676a36bf9bb3c56c1aa791658183a54d2744120f27285738d90465a"
[[package]]
name = "fancy-regex"
version = "0.12.0"
@@ -5178,6 +5384,7 @@ checksum = "6dd08c532ae367adf81c312a4580bc67f1d0fe8bc9c460520283f4c0ff277888"
dependencies = [
"cfg-if",
"crunchy",
"num-traits",
]
[[package]]
@@ -6230,6 +6437,7 @@ dependencies = [
"lsp",
"parking_lot",
"postage",
"pretty_assertions",
"pulldown-cmark 0.12.1",
"rand 0.8.5",
"regex",
@@ -6425,6 +6633,70 @@ version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "03087c2bad5e1034e8cace5926dec053fb3790248370865f5117a7d0213354c8"
[[package]]
name = "lexical-core"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0431c65b318a590c1de6b8fd6e72798c92291d27762d94c9e6c37ed7a73d8458"
dependencies = [
"lexical-parse-float",
"lexical-parse-integer",
"lexical-util",
"lexical-write-float",
"lexical-write-integer",
]
[[package]]
name = "lexical-parse-float"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eb17a4bdb9b418051aa59d41d65b1c9be5affab314a872e5ad7f06231fb3b4e0"
dependencies = [
"lexical-parse-integer",
"lexical-util",
"static_assertions",
]
[[package]]
name = "lexical-parse-integer"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5df98f4a4ab53bf8b175b363a34c7af608fe31f93cc1fb1bf07130622ca4ef61"
dependencies = [
"lexical-util",
"static_assertions",
]
[[package]]
name = "lexical-util"
version = "1.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "85314db53332e5c192b6bca611fb10c114a80d1b831ddac0af1e9be1b9232ca0"
dependencies = [
"static_assertions",
]
[[package]]
name = "lexical-write-float"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6e7c3ad4e37db81c1cbe7cf34610340adc09c322871972f74877a712abc6c809"
dependencies = [
"lexical-util",
"lexical-write-integer",
"static_assertions",
]
[[package]]
name = "lexical-write-integer"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eb89e9f6958b83258afa3deed90b5de9ef68eef090ad5086c791cd2345610162"
dependencies = [
"lexical-util",
"static_assertions",
]
[[package]]
name = "libc"
version = "0.2.159"
@@ -6441,6 +6713,21 @@ dependencies = [
"pkg-config",
]
[[package]]
name = "libduckdb-sys"
version = "1.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eac2de5219db852597558df5dcd617ffccd5cbd7b9f5402ccbf899aca6cb6047"
dependencies = [
"autocfg",
"flate2",
"pkg-config",
"serde",
"serde_json",
"tar",
"vcpkg",
]
[[package]]
name = "libfuzzer-sys"
version = "0.4.7"
@@ -8510,6 +8797,7 @@ dependencies = [
"serde_derive",
"serde_json",
"settings",
"smallvec",
"theme",
"ui",
"util",
@@ -8789,6 +9077,17 @@ dependencies = [
"proc-macro2",
]
[[package]]
name = "r2d2"
version = "0.8.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "51de85fb3fb6524929c8a2eb85e6b6d363de4e8c48f9e2c2eac4944abc181c93"
dependencies = [
"log",
"parking_lot",
"scheduled-thread-pool",
]
[[package]]
name = "radium"
version = "0.7.0"
@@ -8985,6 +9284,7 @@ dependencies = [
"itertools 0.13.0",
"language",
"log",
"markdown",
"menu",
"ordered-float 2.10.1",
"paths",
@@ -9000,6 +9300,7 @@ dependencies = [
"smol",
"task",
"terminal_view",
"theme",
"ui",
"util",
"workspace",
@@ -9156,6 +9457,7 @@ dependencies = [
"client",
"clock",
"env_logger",
"fork",
"fs",
"futures 0.3.30",
"git",
@@ -9164,6 +9466,7 @@ dependencies = [
"http_client",
"language",
"languages",
"libc",
"log",
"lsp",
"node_runtime",
@@ -9860,6 +10163,15 @@ dependencies = [
"windows-sys 0.52.0",
]
[[package]]
name = "scheduled-thread-pool"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3cbc66816425a074528352f5789333ecff06ca41b36b0b0efdfbb29edc391a19"
dependencies = [
"parking_lot",
]
[[package]]
name = "schemars"
version = "0.8.21"
@@ -11087,7 +11399,7 @@ version = "0.25.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "290d54ea6f91c969195bdbcd7442c8c2a2ba87da8bf60a7ee86a235d4bc1e125"
dependencies = [
"strum_macros",
"strum_macros 0.25.3",
]
[[package]]
@@ -11109,6 +11421,19 @@ dependencies = [
"syn 2.0.76",
]
[[package]]
name = "strum_macros"
version = "0.26.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c6bee85a5a24955dc440386795aa378cd9cf82acd5f764469152d2270e581be"
dependencies = [
"heck 0.5.0",
"proc-macro2",
"quote",
"rustversion",
"syn 2.0.76",
]
[[package]]
name = "subtle"
version = "2.6.1"
@@ -11474,6 +11799,17 @@ version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "55937e1799185b12863d447f42597ed69d9928686b8d88a1df17376a097d8369"
[[package]]
name = "tar"
version = "0.4.42"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4ff6c40d3aedb5e06b57c6f669ad17ab063dd1e63d977c6a88e7f4dfa4f04020"
dependencies = [
"filetime",
"libc",
"xattr 1.3.1",
]
[[package]]
name = "target-lexicon"
version = "0.12.16"
@@ -12436,7 +12772,7 @@ checksum = "2545046bd1473dac6c626659cc2567c6c0ff302fc8b84a56c4243378276f7f57"
[[package]]
name = "tree-sitter-md"
version = "0.3.2"
source = "git+https://github.com/zed-industries/tree-sitter-markdown?rev=4cfa6aad6b75052a5077c80fd934757d9267d81b#4cfa6aad6b75052a5077c80fd934757d9267d81b"
source = "git+https://github.com/tree-sitter-grammars/tree-sitter-markdown?rev=9a23c1a96c0513d8fc6520972beedd419a973539#9a23c1a96c0513d8fc6520972beedd419a973539"
dependencies = [
"cc",
"tree-sitter-language",
@@ -13420,9 +13756,9 @@ dependencies = [
[[package]]
name = "wasmtime-wasi"
version = "24.0.0"
version = "24.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "545ae8298ffce025604f7480f9c7d6948c985bef7ce9aee249ef79307813e83c"
checksum = "fda03f5bfd5c4cc09f75c7e44846663f25f2c48a2d688fbfb5c7a33af6cf34f5"
dependencies = [
"anyhow",
"async-trait",
@@ -13675,9 +14011,9 @@ dependencies = [
[[package]]
name = "wiggle"
version = "24.0.0"
version = "24.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cc850ca3c02c5835934d23f28cec4c5a3fb66fe0b4ecd968bbb35609dda5ddc0"
checksum = "2d3b31bd2b4d2d82a4b747b8dbc45f566214214a4ffdc5690429a73bc221dc8a"
dependencies = [
"anyhow",
"async-trait",
@@ -13690,9 +14026,9 @@ dependencies = [
[[package]]
name = "wiggle-generate"
version = "24.0.0"
version = "24.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "634b8804a67200bcb43ea8af5f7c53e862439a086b68b16fd333454bc74d5aab"
checksum = "e2c6136b195fc12067aa9d4e7a5baf118729394df7bc7cbf8c63119bc9f2a7cd"
dependencies = [
"anyhow",
"heck 0.4.1",
@@ -13705,9 +14041,9 @@ dependencies = [
[[package]]
name = "wiggle-macro"
version = "24.0.0"
version = "24.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "474b7cbdb942c74031e619d66c600bba7f73867c5800fc2c2306cf307649be2f"
checksum = "8a41eaceee468da976ac43b85c4eb82e482f828d5e8e56f49f90dfac2d9bc3b4"
dependencies = [
"proc-macro2",
"quote",
@@ -14412,6 +14748,17 @@ dependencies = [
"libc",
]
[[package]]
name = "xattr"
version = "1.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8da84f1a25939b27f6820d92aed108f83ff920fdf11a7b19366c27c4cda81d4f"
dependencies = [
"libc",
"linux-raw-sys 0.4.14",
"rustix 0.38.35",
]
[[package]]
name = "xcursor"
version = "0.3.8"
@@ -14594,7 +14941,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.159.0"
version = "0.160.0"
dependencies = [
"activity_indicator",
"anyhow",
@@ -14702,7 +15049,6 @@ dependencies = [
"winresource",
"workspace",
"zed_actions",
"zstd",
]
[[package]]
@@ -14834,7 +15180,7 @@ dependencies = [
[[package]]
name = "zed_php"
version = "0.2.1"
version = "0.2.2"
dependencies = [
"zed_extension_api 0.1.0",
]

View File

@@ -459,7 +459,7 @@ tree-sitter-diff = "0.1.0"
tree-sitter-html = "0.20"
tree-sitter-jsdoc = "0.23"
tree-sitter-json = "0.23"
tree-sitter-md = { git = "https://github.com/zed-industries/tree-sitter-markdown", rev = "4cfa6aad6b75052a5077c80fd934757d9267d81b" }
tree-sitter-md = { git = "https://github.com/tree-sitter-grammars/tree-sitter-markdown", rev = "9a23c1a96c0513d8fc6520972beedd419a973539" }
tree-sitter-python = "0.23"
tree-sitter-regex = "0.23"
tree-sitter-ruby = "0.23"
@@ -468,7 +468,7 @@ tree-sitter-typescript = "0.23"
tree-sitter-yaml = { git = "https://github.com/zed-industries/tree-sitter-yaml", rev = "baff0b51c64ef6a1fb1f8390f3ad6015b83ec13a" }
unicase = "2.6"
unindent = "0.1.7"
unicode-segmentation = "1.10"
unicode-segmentation = "1.11"
url = "2.2"
uuid = { version = "1.1.2", features = ["v4", "v5", "serde"] }
wasmparser = "0.215"

View File

@@ -65,6 +65,7 @@
"h": "c",
"handlebars": "code",
"hbs": "template",
"hcl": "hcl",
"heex": "elixir",
"heic": "image",
"heif": "image",
@@ -89,6 +90,7 @@
"json": "storage",
"jsonc": "storage",
"jsx": "react",
"julia": "julia",
"jxl": "image",
"kt": "kotlin",
"ldf": "storage",
@@ -116,6 +118,7 @@
"myd": "storage",
"myi": "storage",
"nim": "nim",
"nix": "nix",
"nu": "terminal",
"odp": "document",
"ods": "document",
@@ -143,12 +146,15 @@
"rb": "ruby",
"rebar.config": "erlang",
"rkt": "code",
"roc": "roc",
"rs": "rust",
"rtf": "document",
"sass": "sass",
"sav": "storage",
"sc": "scala",
"scala": "scala",
"scm": "code",
"scss": "sass",
"sdf": "storage",
"sh": "terminal",
"sql": "storage",
@@ -182,6 +188,7 @@
"yaml": "settings",
"yml": "settings",
"yrl": "erlang",
"zig": "zig",
"zlogin": "terminal",
"zsh": "terminal",
"zsh_aliases": "terminal",
@@ -266,6 +273,9 @@
"haskell": {
"icon": "icons/file_icons/haskell.svg"
},
"hcl": {
"icon": "icons/file_icons/hcl.svg"
},
"heroku": {
"icon": "icons/file_icons/heroku.svg"
},
@@ -278,6 +288,9 @@
"javascript": {
"icon": "icons/file_icons/javascript.svg"
},
"julia": {
"icon": "icons/file_icons/julia.svg"
},
"kotlin": {
"icon": "icons/file_icons/kotlin.svg"
},
@@ -293,6 +306,9 @@
"nim": {
"icon": "icons/file_icons/nim.svg"
},
"nix": {
"icon": "icons/file_icons/nix.svg"
},
"ocaml": {
"icon": "icons/file_icons/ocaml.svg"
},
@@ -317,12 +333,18 @@
"react": {
"icon": "icons/file_icons/react.svg"
},
"roc": {
"icon": "icons/file_icons/roc.svg"
},
"ruby": {
"icon": "icons/file_icons/ruby.svg"
},
"rust": {
"icon": "icons/file_icons/rust.svg"
},
"sass": {
"icon": "icons/file_icons/sass.svg"
},
"scala": {
"icon": "icons/file_icons/scala.svg"
},
@@ -361,6 +383,9 @@
},
"vue": {
"icon": "icons/file_icons/vue.svg"
},
"zig": {
"icon": "icons/file_icons/zig.svg"
}
}
}

View File

@@ -0,0 +1,3 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M7.11466 3.11809C7.21859 3.37393 7.09545 3.66558 6.83961 3.76952L4.31181 4.79643C4.1233 4.87302 4 5.05619 4 5.25967V11.5C4 11.7761 3.77614 12 3.5 12H2.5C2.22386 12 2 11.7761 2 11.5V4.41827C2 3.90959 2.30825 3.45164 2.77953 3.26018L6.08686 1.91658C6.34269 1.81265 6.63434 1.93579 6.73828 2.19163L7.11466 3.11809ZM10.5 1.99999C10.7761 1.99999 11 2.22384 11 2.49999V10.5C11 10.7761 10.7761 11 10.5 11H9.5C9.22386 11 9 10.7761 9 10.5V9.49999C9 9.22384 8.77614 8.99999 8.5 8.99999H7.5C7.22386 8.99999 7 9.22384 7 9.49999V13.5C7 13.7761 6.77614 14 6.5 14H5.5C5.22386 14 5 13.7761 5 13.5V5.53124C5 5.25509 5.22386 5.03124 5.5 5.03124H6.5C6.77614 5.03124 7 5.25509 7 5.53124V6.49999C7 6.77613 7.22386 6.99999 7.5 6.99999H8.5C8.77614 6.99999 9 6.77613 9 6.49999V2.49999C9 2.22384 9.22386 1.99999 9.5 1.99999H10.5ZM13.5 4.03124C13.7761 4.03124 14 4.2551 14 4.53124L14 11.5847C14 12.0859 13.7006 12.5386 13.2394 12.7349L9.99399 14.1159C9.7399 14.224 9.44626 14.1057 9.33813 13.8516L8.94658 12.9315C8.83845 12.6774 8.95678 12.3837 9.21087 12.2756L11.6958 11.2182C11.8802 11.1397 12 10.9586 12 10.7581L12 4.53124C12 4.2551 12.2238 4.03124 12.5 4.03124L13.5 4.03124Z" fill="black"/>
</svg>


View File

@@ -0,0 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<circle cx="8" cy="5" r="2.75" fill="black"/>
<circle cx="4.75" cy="11" r="2.75" fill="black" fill-opacity="0.5"/>
<circle cx="11.25" cy="11" r="2.75" fill="black" fill-opacity="0.75"/>
</svg>


View File

@@ -0,0 +1,8 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M6.00005 4.76556L4.76569 2.74996M6.00005 4.76556L3.75 4.76563M6.00005 4.76556L7.25006 4.7656" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M10.0232 11.2311L11.2675 13.2406M10.0232 11.2311L12.2732 11.2199M10.0232 11.2311L8.7732 11.2373" stroke="black" stroke-opacity="0.5" stroke-width="1.5" stroke-linecap="round"/>
<path d="M9.99025 4.91551L10.9985 2.77781M9.99025 4.91551L8.75599 3.03419M9.99025 4.91551L10.6759 5.9607" stroke="black" stroke-opacity="0.5" stroke-width="1.5" stroke-linecap="round"/>
<path d="M6.0323 11.1009L5.03465 13.2436M6.0323 11.1009L7.27585 12.9761M6.0323 11.1009L5.34151 10.0592" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M11.883 8.19023L14.2466 8.19287M11.883 8.19023L13.0602 6.27268M11.883 8.19023L11.229 9.25547" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M4.12354 7.8356L1.76002 7.84465M4.12354 7.8356L2.95585 9.75894M4.12354 7.8356L4.7723 6.76713" stroke="black" stroke-opacity="0.5" stroke-width="1.5" stroke-linecap="round"/>
</svg>


View File

@@ -0,0 +1,7 @@
<svg width="17" height="16" viewBox="0 0 17 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M5.51497 2.02702L1.92042 1.95067C1.69543 1.94589 1.57917 2.21756 1.73796 2.37702L6.24865 6.9068C6.42388 7.08277 6.72071 6.92326 6.67067 6.68002L5.75454 2.22659C5.73103 2.11231 5.63161 2.02949 5.51497 2.02702Z" fill="black" fill-opacity="0.5"/>
<path d="M8.05816 7.38492L12.1366 8.02844C12.3704 8.06532 12.5198 7.78697 12.3599 7.61255L7.30439 2.09814C7.13336 1.91159 6.82522 2.06811 6.87499 2.31624L7.852 7.18714C7.87257 7.28971 7.95483 7.36862 8.05816 7.38492Z" fill="black"/>
<path d="M9.0952 10.9797L11.3824 9.35081C11.564 9.22151 11.4983 8.93722 11.2785 8.90058L8.496 8.43683C8.31974 8.40746 8.17047 8.56712 8.21162 8.74101L8.70689 10.8337C8.74777 11.0064 8.95062 11.0827 9.0952 10.9797Z" fill="black" fill-opacity="0.5"/>
<path d="M5.10282 13.9632L7.59108 12.4532C7.68331 12.3972 7.72923 12.2884 7.70498 12.1832L6.75736 8.07484C6.699 7.8218 6.34133 7.81448 6.27266 8.06491L4.73201 13.6834C4.67223 13.9014 4.90954 14.0805 5.10282 13.9632Z" fill="black"/>
<path d="M11.3183 4.89351L13.1588 7.03149L15.535 6.14302C15.7099 6.07761 15.754 5.85043 15.6161 5.72438L13.7222 3.99219L11.4546 4.48614C11.2695 4.52645 11.1947 4.74995 11.3183 4.89351Z" fill="black"/>
</svg>


View File

@@ -0,0 +1,3 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M6.92096 7.00668C7.87408 7.83549 10.0987 7.48203 10.9376 7.06254C12.8751 6.09381 13.9407 4.39379 12.6407 2.90629C11.0157 1.04692 6.24221 2.49998 4.89844 3.40625C3.55467 4.31252 2.67972 5.53126 2.89071 7.1719C3.1017 8.81254 4.68758 9.7422 6.03128 10.3203C5.38786 10.5616 3.8517 11.0388 3.3125 11.7188C2.71341 12.4742 3.04343 14 4.51577 14C7.15639 14 7.59539 11.1486 7.14847 10.4375C7.88773 10.1295 8.49597 9.96169 9.40138 9.77081C9.63831 9.72087 9.65457 9.46395 9.41295 9.44827C8.80252 9.40864 7.30567 9.8489 6.92096 9.97657C5.78909 9.35157 4.51016 7.93818 4.59378 6.87501C4.68676 5.6928 5.27676 5.07603 6.84508 4.21876C8.01705 3.57813 10.258 3.10695 11.25 3.62501C12.6563 4.35936 10.7875 5.75599 9.92969 6.32031C9.28179 6.74656 8.21971 6.77513 7.22979 6.61435C6.99371 6.576 6.74048 6.84974 6.92096 7.00668ZM5.6719 12.4643C6.35508 11.9894 6.45471 11.1076 6.29955 10.8844C5.76663 11.0874 4.36593 11.9102 4.75111 12.4643C4.90628 12.6875 5.31358 12.7134 5.6719 12.4643Z" fill="black"/>
</svg>


View File

@@ -0,0 +1,5 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M14.25 12H11C10.794 12 10.6764 11.7648 10.8 11.6L11.925 10.1C11.9722 10.037 12.0463 10 12.125 10H12.75C12.8881 10 13 9.88807 13 9.75V6.25C13 6.11193 12.8881 6 12.75 6H12.4045C12.2187 6 12.0978 5.80442 12.1809 5.6382L12.9309 4.1382C12.9732 4.0535 13.0598 4 13.1545 4H14.25C14.3881 4 14.5 4.11193 14.5 4.25V11.75C14.5 11.8881 14.3881 12 14.25 12Z" fill="black"/>
<path d="M1.75 4H5C5.20601 4 5.32361 4.23519 5.2 4.4L4.075 5.9C4.02779 5.96295 3.95369 6 3.875 6H3.25C3.11193 6 3 6.11193 3 6.25V9.75C3 9.88807 3.11193 10 3.25 10H3.59549C3.78134 10 3.90221 10.1956 3.8191 10.3618L3.0691 11.8618C3.02675 11.9465 2.94018 12 2.84549 12H1.75C1.61193 12 1.5 11.8881 1.5 11.75V4.25C1.5 4.11193 1.61193 4 1.75 4Z" fill="black"/>
<path d="M7.55748 6H5.95006C5.74177 6 5.62482 5.76022 5.75306 5.59609L6.92493 4.09609C6.97231 4.03544 7.04498 4 7.12194 4H9.93075C9.97607 4 10.0205 3.98769 10.0594 3.96437L11.6408 3.0155C11.8641 2.88154 12.1179 3.13555 11.9837 3.3587L8.22612 9.6083C8.12629 9.77433 8.24508 9.98591 8.43881 9.98712L10.0039 9.9969C10.2092 9.99818 10.3255 10.2327 10.2023 10.3969L9.075 11.9C9.02779 11.963 8.95369 12 8.875 12H6.55383C6.51835 12 6.48328 12.0076 6.45094 12.0222L4.32473 12.9824C4.10122 13.0833 3.88113 12.8356 4.00771 12.6255L7.77161 6.37903C7.87201 6.2124 7.75202 6 7.55748 6Z" fill="black"/>
</svg>


View File

@@ -346,6 +346,8 @@
"git_status": true,
// Amount of indentation for nested items.
"indent_size": 20,
// Whether to show indent guides in the project panel.
"indent_guides": true,
// Whether to reveal it in the project panel automatically,
// when a corresponding project entry becomes active.
// Gitignored entries are never auto revealed.
@@ -803,7 +805,7 @@
/// You can override this to use a version of node that is not in $PATH with:
/// {
/// "node": {
/// "node_path": "/path/to/node"
/// "path": "/path/to/node"
/// "npm_path": "/path/to/npm" (defaults to node_path/../npm)
/// }
/// }
@@ -1099,13 +1101,13 @@
// }
"command_aliases": {},
// ssh_connections is an array of ssh connections.
// By default this setting is null, which disables the direct ssh connection support.
// You can configure these from `project: Open Remote` in the command palette.
// Zed's ssh support will pull configuration from your ~/.ssh too.
// Examples:
// [
// {
// "host": "example-box",
// // "port": 22, "username": "test", "args": ["-i", "/home/user/.ssh/id_rsa"]
// "projects": [
// {
// "paths": ["/home/user/code/zed"]
@@ -1113,7 +1115,7 @@
// ]
// }
// ]
"ssh_connections": null,
"ssh_connections": [],
// Configures the Context Server Protocol binaries
//
// Examples:

View File

@@ -29,13 +29,13 @@ pub struct AnthropicModelCacheConfiguration {
#[derive(Clone, Debug, Default, Serialize, Deserialize, PartialEq, EnumIter)]
pub enum Model {
#[default]
#[serde(rename = "claude-3-5-sonnet", alias = "claude-3-5-sonnet-20240620")]
#[serde(rename = "claude-3-5-sonnet", alias = "claude-3-5-sonnet-latest")]
Claude3_5Sonnet,
#[serde(rename = "claude-3-opus", alias = "claude-3-opus-20240229")]
#[serde(rename = "claude-3-opus", alias = "claude-3-opus-latest")]
Claude3Opus,
#[serde(rename = "claude-3-sonnet", alias = "claude-3-sonnet-20240229")]
#[serde(rename = "claude-3-sonnet", alias = "claude-3-sonnet-latest")]
Claude3Sonnet,
#[serde(rename = "claude-3-haiku", alias = "claude-3-haiku-20240307")]
#[serde(rename = "claude-3-haiku", alias = "claude-3-haiku-latest")]
Claude3Haiku,
#[serde(rename = "custom")]
Custom {
@@ -69,10 +69,10 @@ impl Model {
pub fn id(&self) -> &str {
match self {
Model::Claude3_5Sonnet => "claude-3-5-sonnet-20240620",
Model::Claude3Opus => "claude-3-opus-20240229",
Model::Claude3Sonnet => "claude-3-sonnet-20240229",
Model::Claude3Haiku => "claude-3-haiku-20240307",
Model::Claude3_5Sonnet => "claude-3-5-sonnet-latest",
Model::Claude3Opus => "claude-3-opus-latest",
Model::Claude3Sonnet => "claude-3-sonnet-latest",
Model::Claude3Haiku => "claude-3-haiku-latest",
Self::Custom { name, .. } => name,
}
}
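The change above swaps the dated model ids for `-latest` ids in both the serde aliases and `id()`. As a hedged illustration of the serde mechanism involved (not Zed's actual definitions; `serde` and `serde_json` assumed as dependencies), `rename` fixes the canonical name while `alias` lets the `-latest` id deserialize into the same variant:

```rust
use serde::Deserialize;

// Hypothetical stand-in for the Model enum above.
#[derive(Debug, PartialEq, Deserialize)]
enum Model {
    #[serde(rename = "claude-3-opus", alias = "claude-3-opus-latest")]
    Claude3Opus,
}

fn main() {
    // Both the canonical short name and the aliased `-latest` id
    // deserialize to the same variant.
    let a: Model = serde_json::from_str("\"claude-3-opus\"").unwrap();
    let b: Model = serde_json::from_str("\"claude-3-opus-latest\"").unwrap();
    assert_eq!(a, b);
}
```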

View File

@@ -356,8 +356,10 @@ impl AssistantPanel {
let project = workspace.project().clone();
pane.set_custom_drop_handle(cx, move |_, dropped_item, cx| {
let action = maybe!({
if let Some(paths) = dropped_item.downcast_ref::<ExternalPaths>() {
return Some(InsertDraggedFiles::ExternalFiles(paths.paths().to_vec()));
if project.read(cx).is_local() {
if let Some(paths) = dropped_item.downcast_ref::<ExternalPaths>() {
return Some(InsertDraggedFiles::ExternalFiles(paths.paths().to_vec()));
}
}
let project_paths = if let Some(tab) = dropped_item.downcast_ref::<DraggedTab>()

View File

@@ -7,7 +7,7 @@ use crate::{
};
use anyhow::{anyhow, Context as _, Result};
use assistant_slash_command::{
SlashCommandOutput, SlashCommandOutputSection, SlashCommandRegistry,
SlashCommandOutput, SlashCommandOutputSection, SlashCommandRegistry, SlashCommandResult,
};
use assistant_tool::ToolRegistry;
use client::{self, proto, telemetry::Telemetry};
@@ -1677,7 +1677,7 @@ impl Context {
pub fn insert_command_output(
&mut self,
command_range: Range<language::Anchor>,
output: Task<Result<SlashCommandOutput>>,
output: Task<SlashCommandResult>,
ensure_trailing_newline: bool,
expand_result: bool,
cx: &mut ModelContext<Self>,
@@ -1688,19 +1688,13 @@ impl Context {
let command_range = command_range.clone();
async move {
let output = output.await;
let output = match output {
Ok(output) => SlashCommandOutput::from_event_stream(output).await,
Err(err) => Err(err),
};
this.update(&mut cx, |this, cx| match output {
Ok(mut output) => {
// Ensure section ranges are valid.
for section in &mut output.sections {
section.range.start = section.range.start.min(output.text.len());
section.range.end = section.range.end.min(output.text.len());
while !output.text.is_char_boundary(section.range.start) {
section.range.start -= 1;
}
while !output.text.is_char_boundary(section.range.end) {
section.range.end += 1;
}
}
output.ensure_valid_section_ranges();
// Ensure there is a newline after the last section.
if ensure_trailing_newline {
@@ -2487,7 +2481,8 @@ impl Context {
request.messages.push(LanguageModelRequestMessage {
role: Role::User,
content: vec![
"Summarize the context into a short title without punctuation.".into(),
"Generate a concise 3-7 word title for this conversation, omitting punctuation"
.into(),
],
cache: false,
});
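The inline char-boundary loop removed above now lives in `SlashCommandOutput::ensure_valid_section_ranges` (defined later in this diff). A hypothetical example of the clamping it performs, assuming the `assistant_slash_command` and `ui` types shown elsewhere in this diff:

```rust
use assistant_slash_command::{SlashCommandOutput, SlashCommandOutputSection};
use ui::IconName;

fn main() {
    // A section range that starts inside the two-byte character 'é' is
    // snapped back to a valid UTF-8 boundary, so slicing output.text
    // afterwards cannot panic.
    let mut output = SlashCommandOutput {
        text: "héllo".into(), // char boundaries at bytes 0, 1, 3, 4, 5, 6
        sections: vec![SlashCommandOutputSection {
            range: 2..4, // byte 2 is the middle of 'é'
            icon: IconName::Code,
            label: "example".into(),
            metadata: None,
        }],
        run_commands_in_text: false,
    };
    output.ensure_valid_section_ranges();
    assert_eq!(output.sections[0].range, 1..4);
}
```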

View File

@@ -6,7 +6,7 @@ use crate::{
use anyhow::Result;
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandRegistry,
SlashCommandRegistry, SlashCommandResult,
};
use collections::HashSet;
use fs::FakeFs;
@@ -1097,7 +1097,8 @@ async fn test_random_context_collaboration(cx: &mut TestAppContext, mut rng: Std
text: output_text,
sections,
run_commands_in_text: false,
})),
}
.to_event_stream())),
true,
false,
cx,
@@ -1416,11 +1417,12 @@ impl SlashCommand for FakeSlashCommand {
_workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
_cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
Task::ready(Ok(SlashCommandOutput {
text: format!("Executed fake command: {}", self.0),
sections: vec![],
run_commands_in_text: false,
}))
}
.to_event_stream()))
}
}

View File

@@ -2256,6 +2256,7 @@ pub enum CodegenEvent {
pub struct Codegen {
alternatives: Vec<Model<CodegenAlternative>>,
active_alternative: usize,
seen_alternatives: HashSet<usize>,
subscriptions: Vec<Subscription>,
buffer: Model<MultiBuffer>,
range: Range<Anchor>,
@@ -2286,6 +2287,7 @@ impl Codegen {
let mut this = Self {
alternatives: vec![codegen],
active_alternative: 0,
seen_alternatives: HashSet::default(),
subscriptions: Vec::new(),
buffer,
range,
@@ -2338,6 +2340,7 @@ impl Codegen {
fn activate(&mut self, index: usize, cx: &mut ModelContext<Self>) {
self.active_alternative()
.update(cx, |codegen, cx| codegen.set_active(false, cx));
self.seen_alternatives.insert(index);
self.active_alternative = index;
self.active_alternative()
.update(cx, |codegen, cx| codegen.set_active(true, cx));
@@ -2467,6 +2470,8 @@ pub struct CodegenAlternative {
active: bool,
edits: Vec<(Range<Anchor>, String)>,
line_operations: Vec<LineOperation>,
request: Option<LanguageModelRequest>,
elapsed_time: Option<f64>,
}
enum CodegenStatus {
@@ -2538,6 +2543,8 @@ impl CodegenAlternative {
edits: Vec::new(),
line_operations: Vec::new(),
range,
request: None,
elapsed_time: None,
}
}
@@ -2634,6 +2641,7 @@ impl CodegenAlternative {
async { Ok(stream::empty().boxed()) }.boxed_local()
} else {
let request = self.build_request(user_prompt, assistant_panel_context, cx)?;
self.request = Some(request.clone());
let chunks = cx
.spawn(|_, cx| async move { model.stream_completion_text(request, &cx).await });
@@ -2707,6 +2715,7 @@ impl CodegenAlternative {
stream: impl 'static + Future<Output = Result<BoxStream<'static, Result<String>>>>,
cx: &mut ModelContext<Self>,
) {
let start_time = Instant::now();
let snapshot = self.snapshot.clone();
let selected_text = snapshot
.text_for_range(self.range.start..self.range.end)
@@ -2923,6 +2932,8 @@ impl CodegenAlternative {
};
let result = generate.await;
let elapsed_time = start_time.elapsed().as_secs_f64();
codegen
.update(&mut cx, |this, cx| {
this.last_equal_ranges.clear();
@@ -2931,6 +2942,7 @@ impl CodegenAlternative {
} else {
this.status = CodegenStatus::Done;
}
this.elapsed_time = Some(elapsed_time);
cx.emit(CodegenEvent::Finished);
cx.notify();
})
@@ -3277,6 +3289,10 @@ impl CodeActionProvider for AssistantCodeActionProvider {
range: Range<text::Anchor>,
cx: &mut WindowContext,
) -> Task<Result<Vec<CodeAction>>> {
if !AssistantSettings::get_global(cx).enabled {
return Task::ready(Ok(Vec::new()));
}
let snapshot = buffer.read(cx).snapshot();
let mut range = range.to_point(&snapshot);

View File

@@ -717,7 +717,6 @@ mod tests {
);
// Ensure InsertBefore merges correctly with Update of the same text
assert_edits(
"
fn foo() {
@@ -782,6 +781,90 @@ mod tests {
.unindent(),
cx,
);
// Correctly indent new text when replacing multiple adjacent indented blocks.
assert_edits(
"
impl Numbers {
fn one() {
1
}
fn two() {
2
}
fn three() {
3
}
}
"
.unindent(),
vec![
AssistantEditKind::Update {
old_text: "
fn one() {
1
}
"
.unindent(),
new_text: "
fn one() {
101
}
"
.unindent(),
description: "pick better number".into(),
},
AssistantEditKind::Update {
old_text: "
fn two() {
2
}
"
.unindent(),
new_text: "
fn two() {
102
}
"
.unindent(),
description: "pick better number".into(),
},
AssistantEditKind::Update {
old_text: "
fn three() {
3
}
"
.unindent(),
new_text: "
fn three() {
103
}
"
.unindent(),
description: "pick better number".into(),
},
],
"
impl Numbers {
fn one() {
101
}
fn two() {
102
}
fn three() {
103
}
}
"
.unindent(),
cx,
);
}
#[track_caller]

View File

@@ -1,7 +1,8 @@
use super::create_label_for_command;
use super::{SlashCommand, SlashCommandOutput};
use anyhow::{anyhow, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use feature_flags::FeatureFlag;
use futures::StreamExt;
use gpui::{AppContext, AsyncAppContext, Task, WeakView};
@@ -17,6 +18,8 @@ use ui::{BorrowAppContext, WindowContext};
use util::ResultExt;
use workspace::Workspace;
use crate::slash_command::create_label_for_command;
pub struct AutoSlashCommandFeatureFlag;
impl FeatureFlag for AutoSlashCommandFeatureFlag {
@@ -92,7 +95,7 @@ impl SlashCommand for AutoCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let Some(workspace) = workspace.upgrade() else {
return Task::ready(Err(anyhow::anyhow!("workspace was dropped")));
};
@@ -144,7 +147,8 @@ impl SlashCommand for AutoCommand {
text: prompt,
sections: Vec::new(),
run_commands_in_text: true,
})
}
.to_event_stream())
})
}
}

View File

@@ -1,6 +1,8 @@
use super::{SlashCommand, SlashCommandOutput};
use anyhow::{anyhow, Context, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use fs::Fs;
use gpui::{AppContext, Model, Task, WeakView};
use language::{BufferSnapshot, LspAdapterDelegate};
@@ -123,7 +125,7 @@ impl SlashCommand for CargoWorkspaceSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let output = workspace.update(cx, |workspace, cx| {
let project = workspace.project().clone();
let fs = workspace.project().read(cx).fs().clone();
@@ -145,7 +147,8 @@ impl SlashCommand for CargoWorkspaceSlashCommand {
metadata: None,
}],
run_commands_in_text: false,
})
}
.to_event_stream())
})
});
output.unwrap_or_else(|error| Task::ready(Err(error)))

View File

@@ -1,8 +1,7 @@
use super::create_label_for_command;
use anyhow::{anyhow, Result};
use assistant_slash_command::{
AfterCompletion, ArgumentCompletion, SlashCommand, SlashCommandOutput,
SlashCommandOutputSection,
SlashCommandOutputSection, SlashCommandResult,
};
use collections::HashMap;
use context_servers::{
@@ -17,6 +16,8 @@ use text::LineEnding;
use ui::{IconName, SharedString};
use workspace::Workspace;
use crate::slash_command::create_label_for_command;
pub struct ContextServerSlashCommand {
server_id: String,
prompt: Prompt,
@@ -128,7 +129,7 @@ impl SlashCommand for ContextServerSlashCommand {
_workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let server_id = self.server_id.clone();
let prompt_name = self.prompt.name.clone();
@@ -145,7 +146,28 @@ impl SlashCommand for ContextServerSlashCommand {
return Err(anyhow!("Context server not initialized"));
};
let result = protocol.run_prompt(&prompt_name, prompt_args).await?;
let mut prompt = result.prompt;
// Check that there are only user roles
if result
.messages
.iter()
.any(|msg| !matches!(msg.role, context_servers::types::SamplingRole::User))
{
return Err(anyhow!(
"Prompt contains non-user roles, which is not supported"
));
}
// Extract text from user messages into a single prompt string
let mut prompt = result
.messages
.into_iter()
.filter_map(|msg| match msg.content {
context_servers::types::SamplingContent::Text { text } => Some(text),
_ => None,
})
.collect::<Vec<String>>()
.join("\n\n");
// We must normalize the line endings here, since servers might return CR characters.
LineEnding::normalize(&mut prompt);
@@ -163,7 +185,8 @@ impl SlashCommand for ContextServerSlashCommand {
}],
text: prompt,
run_commands_in_text: false,
})
}
.to_event_stream())
})
} else {
Task::ready(Err(anyhow!("Context server not found")))

View File

@@ -1,7 +1,9 @@
use super::{SlashCommand, SlashCommandOutput};
use crate::prompt_library::PromptStore;
use anyhow::{anyhow, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use gpui::{Task, WeakView};
use language::{BufferSnapshot, LspAdapterDelegate};
use std::{
@@ -48,7 +50,7 @@ impl SlashCommand for DefaultSlashCommand {
_workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let store = PromptStore::global(cx);
cx.background_executor().spawn(async move {
let store = store.await?;
@@ -76,7 +78,8 @@ impl SlashCommand for DefaultSlashCommand {
}],
text,
run_commands_in_text: true,
})
}
.to_event_stream())
})
}
}

View File

@@ -2,6 +2,7 @@ use crate::slash_command::file_command::{FileCommandMetadata, FileSlashCommand};
use anyhow::Result;
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use collections::HashSet;
use futures::future;
@@ -48,7 +49,7 @@ impl SlashCommand for DeltaSlashCommand {
workspace: WeakView<Workspace>,
delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let mut paths = HashSet::default();
let mut file_command_old_outputs = Vec::new();
let mut file_command_new_outputs = Vec::new();
@@ -85,25 +86,28 @@ impl SlashCommand for DeltaSlashCommand {
.zip(file_command_new_outputs)
{
if let Ok(new_output) = new_output {
if let Some(file_command_range) = new_output.sections.first() {
let new_text = &new_output.text[file_command_range.range.clone()];
if old_text.chars().ne(new_text.chars()) {
output.sections.extend(new_output.sections.into_iter().map(
|section| SlashCommandOutputSection {
range: output.text.len() + section.range.start
..output.text.len() + section.range.end,
icon: section.icon,
label: section.label,
metadata: section.metadata,
},
));
output.text.push_str(&new_output.text);
if let Ok(new_output) = SlashCommandOutput::from_event_stream(new_output).await
{
if let Some(file_command_range) = new_output.sections.first() {
let new_text = &new_output.text[file_command_range.range.clone()];
if old_text.chars().ne(new_text.chars()) {
output.sections.extend(new_output.sections.into_iter().map(
|section| SlashCommandOutputSection {
range: output.text.len() + section.range.start
..output.text.len() + section.range.end,
icon: section.icon,
label: section.label,
metadata: section.metadata,
},
));
output.text.push_str(&new_output.text);
}
}
}
}
}
Ok(output)
Ok(output.to_event_stream())
})
}
}

View File

@@ -1,6 +1,8 @@
use super::{create_label_for_command, SlashCommand, SlashCommandOutput};
use anyhow::{anyhow, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use fuzzy::{PathMatch, StringMatchCandidate};
use gpui::{AppContext, Model, Task, View, WeakView};
use language::{
@@ -19,6 +21,8 @@ use util::paths::PathMatcher;
use util::ResultExt;
use workspace::Workspace;
use crate::slash_command::create_label_for_command;
pub(crate) struct DiagnosticsSlashCommand;
impl DiagnosticsSlashCommand {
@@ -167,7 +171,7 @@ impl SlashCommand for DiagnosticsSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let Some(workspace) = workspace.upgrade() else {
return Task::ready(Err(anyhow!("workspace was dropped")));
};
@@ -176,7 +180,11 @@ impl SlashCommand for DiagnosticsSlashCommand {
let task = collect_diagnostics(workspace.read(cx).project().clone(), options, cx);
cx.spawn(move |_| async move { task.await?.ok_or_else(|| anyhow!("No diagnostics found")) })
cx.spawn(move |_| async move {
task.await?
.map(|output| output.to_event_stream())
.ok_or_else(|| anyhow!("No diagnostics found"))
})
}
}

View File

@@ -6,6 +6,7 @@ use std::time::Duration;
use anyhow::{anyhow, bail, Result};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use gpui::{AppContext, BackgroundExecutor, Model, Task, WeakView};
use indexed_docs::{
@@ -274,7 +275,7 @@ impl SlashCommand for DocsSlashCommand {
_workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
if arguments.is_empty() {
return Task::ready(Err(anyhow!("missing an argument")));
};
@@ -355,7 +356,8 @@ impl SlashCommand for DocsSlashCommand {
})
.collect(),
run_commands_in_text: false,
})
}
.to_event_stream())
})
}
}

View File

@@ -6,6 +6,7 @@ use std::sync::Arc;
use anyhow::{anyhow, bail, Context, Result};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use futures::AsyncReadExt;
use gpui::{Task, WeakView};
@@ -133,7 +134,7 @@ impl SlashCommand for FetchSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let Some(argument) = arguments.first() else {
return Task::ready(Err(anyhow!("missing URL")));
};
@@ -166,7 +167,8 @@ impl SlashCommand for FetchSlashCommand {
metadata: None,
}],
run_commands_in_text: false,
})
}
.to_event_stream())
})
}
}

View File

@@ -1,11 +1,15 @@
use super::{diagnostics_command::collect_buffer_diagnostics, SlashCommand, SlashCommandOutput};
use anyhow::{anyhow, Context as _, Result};
use assistant_slash_command::{AfterCompletion, ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
AfterCompletion, ArgumentCompletion, SlashCommand, SlashCommandContent, SlashCommandEvent,
SlashCommandOutput, SlashCommandOutputSection, SlashCommandResult,
};
use futures::channel::mpsc;
use fuzzy::PathMatch;
use gpui::{AppContext, Model, Task, View, WeakView};
use language::{BufferSnapshot, CodeLabel, HighlightId, LineEnding, LspAdapterDelegate};
use project::{PathMatchCandidateSet, Project};
use serde::{Deserialize, Serialize};
use smol::stream::StreamExt;
use std::{
fmt::Write,
ops::{Range, RangeInclusive},
@@ -16,6 +20,8 @@ use ui::prelude::*;
use util::ResultExt;
use workspace::Workspace;
use crate::slash_command::diagnostics_command::collect_buffer_diagnostics;
pub(crate) struct FileSlashCommand;
impl FileSlashCommand {
@@ -181,7 +187,7 @@ impl SlashCommand for FileSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let Some(workspace) = workspace.upgrade() else {
return Task::ready(Err(anyhow!("workspace was dropped")));
};
@@ -198,7 +204,7 @@ fn collect_files(
project: Model<Project>,
glob_inputs: &[String],
cx: &mut AppContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let Ok(matchers) = glob_inputs
.into_iter()
.map(|glob_input| {
@@ -217,11 +223,11 @@ fn collect_files(
.map(|worktree| worktree.read(cx).snapshot())
.collect::<Vec<_>>();
let (events_tx, events_rx) = mpsc::unbounded();
cx.spawn(|mut cx| async move {
let mut output = SlashCommandOutput::default();
for snapshot in snapshots {
let worktree_id = snapshot.id();
let mut directory_stack: Vec<(Arc<Path>, String, usize)> = Vec::new();
let mut directory_stack: Vec<Arc<Path>> = Vec::new();
let mut folded_directory_names_stack = Vec::new();
let mut is_top_level_directory = true;
@@ -237,17 +243,19 @@ fn collect_files(
continue;
}
while let Some((dir, _, _)) = directory_stack.last() {
while let Some(dir) = directory_stack.last() {
if entry.path.starts_with(dir) {
break;
}
let (_, entry_name, start) = directory_stack.pop().unwrap();
output.sections.push(build_entry_output_section(
start..output.text.len().saturating_sub(1),
Some(&PathBuf::from(entry_name)),
true,
None,
));
directory_stack.pop().unwrap();
events_tx
.unbounded_send(Ok(SlashCommandEvent::EndSection { metadata: None }))?;
events_tx.unbounded_send(Ok(SlashCommandEvent::Content(
SlashCommandContent::Text {
text: "\n".into(),
run_commands_in_text: false,
},
)))?;
}
let filename = entry
@@ -279,23 +287,46 @@ fn collect_files(
continue;
}
let prefix_paths = folded_directory_names_stack.drain(..).as_slice().join("/");
let entry_start = output.text.len();
if prefix_paths.is_empty() {
if is_top_level_directory {
output
.text
.push_str(&path_including_worktree_name.to_string_lossy());
let label = if is_top_level_directory {
is_top_level_directory = false;
path_including_worktree_name.to_string_lossy().to_string()
} else {
output.text.push_str(&filename);
}
directory_stack.push((entry.path.clone(), filename, entry_start));
filename
};
events_tx.unbounded_send(Ok(SlashCommandEvent::StartSection {
icon: IconName::Folder,
label: label.clone().into(),
metadata: None,
}))?;
events_tx.unbounded_send(Ok(SlashCommandEvent::Content(
SlashCommandContent::Text {
text: label,
run_commands_in_text: false,
},
)))?;
directory_stack.push(entry.path.clone());
} else {
let entry_name = format!("{}/{}", prefix_paths, &filename);
output.text.push_str(&entry_name);
directory_stack.push((entry.path.clone(), entry_name, entry_start));
events_tx.unbounded_send(Ok(SlashCommandEvent::StartSection {
icon: IconName::Folder,
label: entry_name.clone().into(),
metadata: None,
}))?;
events_tx.unbounded_send(Ok(SlashCommandEvent::Content(
SlashCommandContent::Text {
text: entry_name,
run_commands_in_text: false,
},
)))?;
directory_stack.push(entry.path.clone());
}
output.text.push('\n');
events_tx.unbounded_send(Ok(SlashCommandEvent::Content(
SlashCommandContent::Text {
text: "\n".into(),
run_commands_in_text: false,
},
)))?;
} else if entry.is_file() {
let Some(open_buffer_task) = project_handle
.update(&mut cx, |project, cx| {
@@ -306,6 +337,7 @@ fn collect_files(
continue;
};
if let Some(buffer) = open_buffer_task.await.log_err() {
let mut output = SlashCommandOutput::default();
let snapshot = buffer.read_with(&cx, |buffer, _| buffer.snapshot())?;
append_buffer_to_output(
&snapshot,
@@ -313,32 +345,19 @@ fn collect_files(
&mut output,
)
.log_err();
let mut buffer_events = output.to_event_stream();
while let Some(event) = buffer_events.next().await {
events_tx.unbounded_send(event)?;
}
}
}
}
while let Some((dir, entry, start)) = directory_stack.pop() {
if directory_stack.is_empty() {
let mut root_path = PathBuf::new();
root_path.push(snapshot.root_name());
root_path.push(&dir);
output.sections.push(build_entry_output_section(
start..output.text.len(),
Some(&root_path),
true,
None,
));
} else {
output.sections.push(build_entry_output_section(
start..output.text.len(),
Some(&PathBuf::from(entry.as_str())),
true,
None,
));
}
while let Some(_) = directory_stack.pop() {
events_tx.unbounded_send(Ok(SlashCommandEvent::EndSection { metadata: None }))?;
}
}
Ok(output)
Ok(events_rx.boxed())
})
}
@@ -524,8 +543,10 @@ pub fn append_buffer_to_output(
#[cfg(test)]
mod test {
use assistant_slash_command::SlashCommandOutput;
use fs::FakeFs;
use gpui::TestAppContext;
use pretty_assertions::assert_eq;
use project::Project;
use serde_json::json;
use settings::SettingsStore;
@@ -573,6 +594,9 @@ mod test {
.update(|cx| collect_files(project.clone(), &["root/dir".to_string()], cx))
.await
.unwrap();
let result_1 = SlashCommandOutput::from_event_stream(result_1)
.await
.unwrap();
assert!(result_1.text.starts_with("root/dir"));
// 4 files + 2 directories
@@ -582,6 +606,9 @@ mod test {
.update(|cx| collect_files(project.clone(), &["root/dir/".to_string()], cx))
.await
.unwrap();
let result_2 = SlashCommandOutput::from_event_stream(result_2)
.await
.unwrap();
assert_eq!(result_1, result_2);
@@ -589,6 +616,7 @@ mod test {
.update(|cx| collect_files(project.clone(), &["root/dir*".to_string()], cx))
.await
.unwrap();
let result = SlashCommandOutput::from_event_stream(result).await.unwrap();
assert!(result.text.starts_with("root/dir"));
// 5 files + 2 directories
@@ -635,6 +663,7 @@ mod test {
.update(|cx| collect_files(project.clone(), &["zed/assets/themes".to_string()], cx))
.await
.unwrap();
let result = SlashCommandOutput::from_event_stream(result).await.unwrap();
// Sanity check
assert!(result.text.starts_with("zed/assets/themes\n"));
@@ -696,6 +725,7 @@ mod test {
.update(|cx| collect_files(project.clone(), &["zed/assets/themes".to_string()], cx))
.await
.unwrap();
let result = SlashCommandOutput::from_event_stream(result).await.unwrap();
assert!(result.text.starts_with("zed/assets/themes\n"));
assert_eq!(result.sections[0].label, "zed/assets/themes/LICENSE");
@@ -716,6 +746,8 @@ mod test {
assert_eq!(result.sections[6].label, "summercamp");
assert_eq!(result.sections[7].label, "zed/assets/themes");
assert_eq!(result.text, "zed/assets/themes\n```zed/assets/themes/LICENSE\n1\n```\n\nsummercamp\n```zed/assets/themes/summercamp/LICENSE\n1\n```\n\nsubdir\n```zed/assets/themes/summercamp/subdir/LICENSE\n1\n```\n\nsubsubdir\n```zed/assets/themes/summercamp/subdir/subsubdir/LICENSE\n3\n```\n\n");
// Ensure that the project lasts until after the last await
drop(project);
}
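The `collect_files` changes above replace an accumulated `SlashCommandOutput` with an event channel. A minimal hedged sketch of that channel-to-stream pattern (hypothetical function; imports assumed from the crates used in this diff):

```rust
use anyhow::Result;
use assistant_slash_command::{SlashCommandContent, SlashCommandEvent, SlashCommandResult};
use futures::{channel::mpsc, StreamExt};
use ui::IconName;

// Events are pushed into an unbounded channel; the receiver half is returned
// as the command's event stream, which callers can drain incrementally.
fn stream_one_directory() -> SlashCommandResult {
    let (events_tx, events_rx) = mpsc::unbounded();
    events_tx.unbounded_send(Ok(SlashCommandEvent::StartSection {
        icon: IconName::Folder,
        label: "root".into(),
        metadata: None,
    }))?;
    events_tx.unbounded_send(Ok(SlashCommandEvent::Content(
        SlashCommandContent::Text {
            text: "root\n".into(),
            run_commands_in_text: false,
        },
    )))?;
    events_tx.unbounded_send(Ok(SlashCommandEvent::EndSection { metadata: None }))?;
    Ok(events_rx.boxed())
}
```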

View File

@@ -4,6 +4,7 @@ use std::sync::Arc;
use anyhow::Result;
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use chrono::Local;
use gpui::{Task, WeakView};
@@ -48,7 +49,7 @@ impl SlashCommand for NowSlashCommand {
_workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
_cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let now = Local::now();
let text = format!("Today is {now}.", now = now.to_rfc2822());
let range = 0..text.len();
@@ -62,6 +63,7 @@ impl SlashCommand for NowSlashCommand {
metadata: None,
}],
run_commands_in_text: false,
}))
}
.to_event_stream()))
}
}

View File

@@ -4,7 +4,7 @@ use super::{
};
use crate::PromptBuilder;
use anyhow::{anyhow, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection, SlashCommandResult};
use feature_flags::FeatureFlag;
use gpui::{AppContext, Task, WeakView, WindowContext};
use language::{Anchor, CodeLabel, LspAdapterDelegate};
@@ -76,7 +76,7 @@ impl SlashCommand for ProjectSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let model_registry = LanguageModelRegistry::read_global(cx);
let current_model = model_registry.active_model();
let prompt_builder = self.prompt_builder.clone();
@@ -162,7 +162,8 @@ impl SlashCommand for ProjectSlashCommand {
text: output,
sections,
run_commands_in_text: true,
})
}
.to_event_stream())
})
.await
})

View File

@@ -1,7 +1,9 @@
use super::{SlashCommand, SlashCommandOutput};
use crate::prompt_library::PromptStore;
use anyhow::{anyhow, Context, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use gpui::{Task, WeakView};
use language::{BufferSnapshot, LspAdapterDelegate};
use std::sync::{atomic::AtomicBool, Arc};
@@ -61,7 +63,7 @@ impl SlashCommand for PromptSlashCommand {
_workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let title = arguments.to_owned().join(" ");
if title.trim().is_empty() {
return Task::ready(Err(anyhow!("missing prompt name")));
@@ -100,7 +102,8 @@ impl SlashCommand for PromptSlashCommand {
metadata: None,
}],
run_commands_in_text: true,
})
}
.to_event_stream())
})
}
}

View File

@@ -1,10 +1,8 @@
use super::{
create_label_for_command,
file_command::{build_entry_output_section, codeblock_fence_for_path},
SlashCommand, SlashCommandOutput,
};
use anyhow::Result;
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use feature_flags::FeatureFlag;
use gpui::{AppContext, Task, WeakView};
use language::{CodeLabel, LspAdapterDelegate};
@@ -16,6 +14,9 @@ use std::{
use ui::{prelude::*, IconName};
use workspace::Workspace;
use crate::slash_command::create_label_for_command;
use crate::slash_command::file_command::{build_entry_output_section, codeblock_fence_for_path};
pub(crate) struct SearchSlashCommandFeatureFlag;
impl FeatureFlag for SearchSlashCommandFeatureFlag {
@@ -63,7 +64,7 @@ impl SlashCommand for SearchSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let Some(workspace) = workspace.upgrade() else {
return Task::ready(Err(anyhow::anyhow!("workspace was dropped")));
};
@@ -129,6 +130,7 @@ impl SlashCommand for SearchSlashCommand {
sections,
run_commands_in_text: false,
}
.to_event_stream()
})
.await;

View File

@@ -1,6 +1,8 @@
use super::{SlashCommand, SlashCommandOutput};
use anyhow::{anyhow, Context as _, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use editor::Editor;
use gpui::{Task, WeakView};
use language::{BufferSnapshot, LspAdapterDelegate};
@@ -46,7 +48,7 @@ impl SlashCommand for OutlineSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let output = workspace.update(cx, |workspace, cx| {
let Some(active_item) = workspace.active_item(cx) else {
return Task::ready(Err(anyhow!("no active tab")));
@@ -83,7 +85,8 @@ impl SlashCommand for OutlineSlashCommand {
}],
text: outline_text,
run_commands_in_text: false,
})
}
.to_event_stream())
})
});

View File

@@ -1,6 +1,8 @@
use super::{file_command::append_buffer_to_output, SlashCommand, SlashCommandOutput};
use anyhow::{Context, Result};
use assistant_slash_command::{ArgumentCompletion, SlashCommandOutputSection};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use collections::{HashMap, HashSet};
use editor::Editor;
use futures::future::join_all;
@@ -14,6 +16,8 @@ use ui::{ActiveTheme, WindowContext};
use util::ResultExt;
use workspace::Workspace;
use crate::slash_command::file_command::append_buffer_to_output;
pub(crate) struct TabSlashCommand;
const ALL_TABS_COMPLETION_ITEM: &str = "all";
@@ -132,7 +136,7 @@ impl SlashCommand for TabSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let tab_items_search = tab_items_for_queries(
Some(workspace),
arguments,
@@ -146,7 +150,7 @@ impl SlashCommand for TabSlashCommand {
for (full_path, buffer, _) in tab_items_search.await? {
append_buffer_to_output(&buffer, full_path.as_deref(), &mut output).log_err();
}
Ok(output)
Ok(output.to_event_stream())
})
}
}

View File

@@ -4,6 +4,7 @@ use std::sync::Arc;
use anyhow::Result;
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use gpui::{AppContext, Task, View, WeakView};
use language::{BufferSnapshot, CodeLabel, LspAdapterDelegate};
@@ -62,7 +63,7 @@ impl SlashCommand for TerminalSlashCommand {
workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let Some(workspace) = workspace.upgrade() else {
return Task::ready(Err(anyhow::anyhow!("workspace was dropped")));
};
@@ -96,7 +97,8 @@ impl SlashCommand for TerminalSlashCommand {
metadata: None,
}],
run_commands_in_text: false,
}))
}
.to_event_stream()))
}
}

View File

@@ -1,18 +1,18 @@
use crate::prompts::PromptBuilder;
use std::sync::Arc;
use std::sync::atomic::AtomicBool;
use std::sync::Arc;
use anyhow::Result;
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use gpui::{Task, WeakView};
use language::{BufferSnapshot, LspAdapterDelegate};
use ui::prelude::*;
use workspace::Workspace;
use crate::prompts::PromptBuilder;
pub(crate) struct WorkflowSlashCommand {
prompt_builder: Arc<PromptBuilder>,
}
@@ -60,7 +60,7 @@ impl SlashCommand for WorkflowSlashCommand {
_workspace: WeakView<Workspace>,
_delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let prompt_builder = self.prompt_builder.clone();
cx.spawn(|_cx| async move {
let text = prompt_builder.generate_workflow_prompt()?;
@@ -75,7 +75,8 @@ impl SlashCommand for WorkflowSlashCommand {
metadata: None,
}],
run_commands_in_text: false,
})
}
.to_event_stream())
})
}
}

View File

@@ -15,9 +15,15 @@ path = "src/assistant_slash_command.rs"
anyhow.workspace = true
collections.workspace = true
derive_more.workspace = true
futures.workspace = true
gpui.workspace = true
language.workspace = true
parking_lot.workspace = true
serde.workspace = true
serde_json.workspace = true
workspace.workspace = true
[dev-dependencies]
gpui = { workspace = true, features = ["test-support"] }
pretty_assertions.workspace = true
workspace = { workspace = true, features = ["test-support"] }

View File

@@ -1,6 +1,8 @@
mod slash_command_registry;
use anyhow::Result;
use futures::stream::{self, BoxStream};
use futures::StreamExt;
use gpui::{AnyElement, AppContext, ElementId, SharedString, Task, WeakView, WindowContext};
use language::{BufferSnapshot, CodeLabel, LspAdapterDelegate, OffsetRangeExt};
use serde::{Deserialize, Serialize};
@@ -56,6 +58,8 @@ pub struct ArgumentCompletion {
pub replace_previous_arguments: bool,
}
pub type SlashCommandResult = Result<BoxStream<'static, Result<SlashCommandEvent>>>;
pub trait SlashCommand: 'static + Send + Sync {
fn name(&self) -> String;
fn label(&self, _cx: &AppContext) -> CodeLabel {
@@ -87,7 +91,7 @@ pub trait SlashCommand: 'static + Send + Sync {
// perhaps another kind of delegate is needed here.
delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>>;
) -> Task<SlashCommandResult>;
}
pub type RenderFoldPlaceholder = Arc<
@@ -96,13 +100,146 @@ pub type RenderFoldPlaceholder = Arc<
+ Fn(ElementId, Arc<dyn Fn(&mut WindowContext)>, &mut WindowContext) -> AnyElement,
>;
#[derive(Debug, Default, PartialEq)]
#[derive(Debug, PartialEq, Eq)]
pub enum SlashCommandContent {
Text {
text: String,
run_commands_in_text: bool,
},
}
#[derive(Debug, PartialEq, Eq)]
pub enum SlashCommandEvent {
StartSection {
icon: IconName,
label: SharedString,
metadata: Option<serde_json::Value>,
},
Content(SlashCommandContent),
EndSection {
metadata: Option<serde_json::Value>,
},
}
#[derive(Debug, Default, PartialEq, Clone)]
pub struct SlashCommandOutput {
pub text: String,
pub sections: Vec<SlashCommandOutputSection<usize>>,
pub run_commands_in_text: bool,
}
impl SlashCommandOutput {
pub fn ensure_valid_section_ranges(&mut self) {
for section in &mut self.sections {
section.range.start = section.range.start.min(self.text.len());
section.range.end = section.range.end.min(self.text.len());
while !self.text.is_char_boundary(section.range.start) {
section.range.start -= 1;
}
while !self.text.is_char_boundary(section.range.end) {
section.range.end += 1;
}
}
}
/// Returns this [`SlashCommandOutput`] as a stream of [`SlashCommandEvent`]s.
pub fn to_event_stream(mut self) -> BoxStream<'static, Result<SlashCommandEvent>> {
self.ensure_valid_section_ranges();
let mut events = Vec::new();
let mut last_section_end = 0;
for section in self.sections {
if last_section_end < section.range.start {
events.push(Ok(SlashCommandEvent::Content(SlashCommandContent::Text {
text: self
.text
.get(last_section_end..section.range.start)
.unwrap_or_default()
.to_string(),
run_commands_in_text: self.run_commands_in_text,
})));
}
events.push(Ok(SlashCommandEvent::StartSection {
icon: section.icon,
label: section.label,
metadata: section.metadata.clone(),
}));
events.push(Ok(SlashCommandEvent::Content(SlashCommandContent::Text {
text: self
.text
.get(section.range.start..section.range.end)
.unwrap_or_default()
.to_string(),
run_commands_in_text: self.run_commands_in_text,
})));
events.push(Ok(SlashCommandEvent::EndSection {
metadata: section.metadata,
}));
last_section_end = section.range.end;
}
if last_section_end < self.text.len() {
events.push(Ok(SlashCommandEvent::Content(SlashCommandContent::Text {
text: self.text[last_section_end..].to_string(),
run_commands_in_text: self.run_commands_in_text,
})));
}
stream::iter(events).boxed()
}
pub async fn from_event_stream(
mut events: BoxStream<'static, Result<SlashCommandEvent>>,
) -> Result<SlashCommandOutput> {
let mut output = SlashCommandOutput::default();
let mut section_stack = Vec::new();
while let Some(event) = events.next().await {
match event? {
SlashCommandEvent::StartSection {
icon,
label,
metadata,
} => {
let start = output.text.len();
section_stack.push(SlashCommandOutputSection {
range: start..start,
icon,
label,
metadata,
});
}
SlashCommandEvent::Content(SlashCommandContent::Text {
text,
run_commands_in_text,
}) => {
output.text.push_str(&text);
output.run_commands_in_text = run_commands_in_text;
if let Some(section) = section_stack.last_mut() {
section.range.end = output.text.len();
}
}
SlashCommandEvent::EndSection { metadata } => {
if let Some(mut section) = section_stack.pop() {
section.metadata = metadata;
output.sections.push(section);
}
}
}
}
while let Some(section) = section_stack.pop() {
output.sections.push(section);
}
Ok(output)
}
}
#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize)]
pub struct SlashCommandOutputSection<T> {
pub range: Range<T>,
@@ -116,3 +253,243 @@ impl SlashCommandOutputSection<language::Anchor> {
self.range.start.is_valid(buffer) && !self.range.to_offset(buffer).is_empty()
}
}
#[cfg(test)]
mod tests {
use pretty_assertions::assert_eq;
use serde_json::json;
use super::*;
#[gpui::test]
async fn test_slash_command_output_to_events_round_trip() {
// Test basic output consisting of a single section.
{
let text = "Hello, world!".to_string();
let range = 0..text.len();
let output = SlashCommandOutput {
text,
sections: vec![SlashCommandOutputSection {
range,
icon: IconName::Code,
label: "Section 1".into(),
metadata: None,
}],
run_commands_in_text: false,
};
let events = output.clone().to_event_stream().collect::<Vec<_>>().await;
let events = events
.into_iter()
.filter_map(|event| event.ok())
.collect::<Vec<_>>();
assert_eq!(
events,
vec![
SlashCommandEvent::StartSection {
icon: IconName::Code,
label: "Section 1".into(),
metadata: None
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "Hello, world!".into(),
run_commands_in_text: false
}),
SlashCommandEvent::EndSection { metadata: None }
]
);
let new_output =
SlashCommandOutput::from_event_stream(output.clone().to_event_stream())
.await
.unwrap();
assert_eq!(new_output, output);
}
// Test output where the sections do not comprise all of the text.
{
let text = "Apple\nCucumber\nBanana\n".to_string();
let output = SlashCommandOutput {
text,
sections: vec![
SlashCommandOutputSection {
range: 0..6,
icon: IconName::Check,
label: "Fruit".into(),
metadata: None,
},
SlashCommandOutputSection {
range: 15..22,
icon: IconName::Check,
label: "Fruit".into(),
metadata: None,
},
],
run_commands_in_text: false,
};
let events = output.clone().to_event_stream().collect::<Vec<_>>().await;
let events = events
.into_iter()
.filter_map(|event| event.ok())
.collect::<Vec<_>>();
assert_eq!(
events,
vec![
SlashCommandEvent::StartSection {
icon: IconName::Check,
label: "Fruit".into(),
metadata: None
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "Apple\n".into(),
run_commands_in_text: false
}),
SlashCommandEvent::EndSection { metadata: None },
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "Cucumber\n".into(),
run_commands_in_text: false
}),
SlashCommandEvent::StartSection {
icon: IconName::Check,
label: "Fruit".into(),
metadata: None
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "Banana\n".into(),
run_commands_in_text: false
}),
SlashCommandEvent::EndSection { metadata: None }
]
);
let new_output =
SlashCommandOutput::from_event_stream(output.clone().to_event_stream())
.await
.unwrap();
assert_eq!(new_output, output);
}
// Test output consisting of multiple sections.
{
let text = "Line 1\nLine 2\nLine 3\nLine 4\n".to_string();
let output = SlashCommandOutput {
text,
sections: vec![
SlashCommandOutputSection {
range: 0..6,
icon: IconName::FileCode,
label: "Section 1".into(),
metadata: Some(json!({ "a": true })),
},
SlashCommandOutputSection {
range: 7..13,
icon: IconName::FileDoc,
label: "Section 2".into(),
metadata: Some(json!({ "b": true })),
},
SlashCommandOutputSection {
range: 14..20,
icon: IconName::FileGit,
label: "Section 3".into(),
metadata: Some(json!({ "c": true })),
},
SlashCommandOutputSection {
range: 21..27,
icon: IconName::FileToml,
label: "Section 4".into(),
metadata: Some(json!({ "d": true })),
},
],
run_commands_in_text: false,
};
let events = output.clone().to_event_stream().collect::<Vec<_>>().await;
let events = events
.into_iter()
.filter_map(|event| event.ok())
.collect::<Vec<_>>();
assert_eq!(
events,
vec![
SlashCommandEvent::StartSection {
icon: IconName::FileCode,
label: "Section 1".into(),
metadata: Some(json!({ "a": true }))
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "Line 1".into(),
run_commands_in_text: false
}),
SlashCommandEvent::EndSection {
metadata: Some(json!({ "a": true }))
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "\n".into(),
run_commands_in_text: false
}),
SlashCommandEvent::StartSection {
icon: IconName::FileDoc,
label: "Section 2".into(),
metadata: Some(json!({ "b": true }))
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "Line 2".into(),
run_commands_in_text: false
}),
SlashCommandEvent::EndSection {
metadata: Some(json!({ "b": true }))
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "\n".into(),
run_commands_in_text: false
}),
SlashCommandEvent::StartSection {
icon: IconName::FileGit,
label: "Section 3".into(),
metadata: Some(json!({ "c": true }))
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "Line 3".into(),
run_commands_in_text: false
}),
SlashCommandEvent::EndSection {
metadata: Some(json!({ "c": true }))
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "\n".into(),
run_commands_in_text: false
}),
SlashCommandEvent::StartSection {
icon: IconName::FileToml,
label: "Section 4".into(),
metadata: Some(json!({ "d": true }))
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "Line 4".into(),
run_commands_in_text: false
}),
SlashCommandEvent::EndSection {
metadata: Some(json!({ "d": true }))
},
SlashCommandEvent::Content(SlashCommandContent::Text {
text: "\n".into(),
run_commands_in_text: false
}),
]
);
let new_output =
SlashCommandOutput::from_event_stream(output.clone().to_event_stream())
.await
.unwrap();
assert_eq!(new_output, output);
}
}
}

View File

@@ -32,4 +32,5 @@ settings.workspace = true
smol.workspace = true
tempfile.workspace = true
util.workspace = true
which.workspace = true
workspace.workspace = true

View File

@@ -33,6 +33,7 @@ use std::{
};
use update_notification::UpdateNotification;
use util::ResultExt;
use which::which;
use workspace::notifications::NotificationId;
use workspace::Workspace;
@@ -473,6 +474,39 @@ impl AutoUpdater {
Ok(version_path)
}
pub async fn get_latest_remote_server_release_url(
os: &str,
arch: &str,
mut release_channel: ReleaseChannel,
cx: &mut AsyncAppContext,
) -> Result<(String, String)> {
let this = cx.update(|cx| {
cx.default_global::<GlobalAutoUpdate>()
.0
.clone()
.ok_or_else(|| anyhow!("auto-update not initialized"))
})??;
if release_channel == ReleaseChannel::Dev {
release_channel = ReleaseChannel::Nightly;
}
let release = Self::get_latest_release(
&this,
"zed-remote-server",
os,
arch,
Some(release_channel),
cx,
)
.await?;
let update_request_body = build_remote_server_update_request_body(cx)?;
let body = serde_json::to_string(&update_request_body)?;
Ok((release.url, body))
}
async fn get_latest_release(
this: &Model<Self>,
asset: &str,
@@ -560,6 +594,12 @@ impl AutoUpdater {
"linux" => Ok("zed.tar.gz"),
_ => Err(anyhow!("not supported: {:?}", OS)),
}?;
anyhow::ensure!(
which("rsync").is_ok(),
"Aborting. Could not find rsync which is required for auto-updates."
);
let downloaded_asset = temp_dir.path().join(filename);
download_release(&downloaded_asset, release, client, &cx).await?;
@@ -622,6 +662,15 @@ async fn download_remote_server_binary(
cx: &AsyncAppContext,
) -> Result<()> {
let mut target_file = File::create(&target_path).await?;
let update_request_body = build_remote_server_update_request_body(cx)?;
let request_body = AsyncBody::from(serde_json::to_string(&update_request_body)?);
let mut response = client.get(&release.url, request_body, true).await?;
smol::io::copy(response.body_mut(), &mut target_file).await?;
Ok(())
}
fn build_remote_server_update_request_body(cx: &AsyncAppContext) -> Result<UpdateRequestBody> {
let (installation_id, release_channel, telemetry_enabled, is_staff) = cx.update(|cx| {
let telemetry = Client::global(cx).telemetry().clone();
let is_staff = telemetry.is_staff();
@@ -637,17 +686,14 @@ async fn download_remote_server_binary(
is_staff,
)
})?;
let request_body = AsyncBody::from(serde_json::to_string(&UpdateRequestBody {
Ok(UpdateRequestBody {
installation_id,
release_channel,
telemetry: telemetry_enabled,
is_staff,
destination: "remote",
})?);
let mut response = client.get(&release.url, request_body, true).await?;
smol::io::copy(response.body_mut(), &mut target_file).await?;
Ok(())
})
}
async fn download_release(

View File

@@ -33,6 +33,7 @@ clock.workspace = true
collections.workspace = true
dashmap.workspace = true
derive_more.workspace = true
duckdb = { version = "1.1.1", features = ["r2d2"] }
envy = "0.4.2"
futures.workspace = true
google_ai.workspace = true
@@ -51,6 +52,7 @@ reqwest = { version = "0.11", features = ["json"] }
reqwest_client.workspace = true
rpc.workspace = true
rustc-demangle.workspace = true
r2d2 = "0.8.9"
scrypt = "0.11"
sea-orm = { version = "1.1.0-rc.1", features = ["sqlx-postgres", "postgres-array", "runtime-tokio-rustls", "with-uuid"] }
semantic_version.workspace = true

View File

@@ -11,7 +11,8 @@ CREATE TABLE "users" (
"metrics_id" TEXT,
"github_user_id" INTEGER NOT NULL,
"accepted_tos_at" TIMESTAMP WITHOUT TIME ZONE,
"github_user_created_at" TIMESTAMP WITHOUT TIME ZONE
"github_user_created_at" TIMESTAMP WITHOUT TIME ZONE,
"custom_llm_monthly_allowance_in_cents" INTEGER
);
CREATE UNIQUE INDEX "index_users_github_login" ON "users" ("github_login");
CREATE UNIQUE INDEX "index_invite_code_users" ON "users" ("invite_code");

View File

@@ -0,0 +1 @@
alter table users add column custom_llm_monthly_allowance_in_cents integer;

View File

@@ -34,7 +34,7 @@ use crate::{
db::{billing_subscription::StripeSubscriptionStatus, UserId},
llm::db::LlmDatabase,
};
use crate::{AppState, Error, Result};
use crate::{AppState, Cents, Error, Result};
pub fn router() -> Router {
Router::new()
@@ -226,6 +226,13 @@ async fn create_billing_subscription(
))?
};
if app.db.has_active_billing_subscription(user.id).await? {
return Err(Error::http(
StatusCode::CONFLICT,
"user already has an active subscription".into(),
));
}
let customer_id =
if let Some(existing_customer) = app.db.get_billing_customer_by_user_id(user.id).await? {
CustomerId::from_str(&existing_customer.stripe_customer_id)
@@ -655,6 +662,33 @@ async fn handle_customer_subscription_event(
)
.await?;
} else {
// If the user already has an active billing subscription, ignore the
// event and return an `Ok` to signal that it was processed
// successfully.
//
// There is the possibility that this could cause us to not create a
// subscription in the following scenario:
//
// 1. User has an active subscription A
// 2. User cancels subscription A
// 3. User creates a new subscription B
// 4. We process the new subscription B before the cancellation of subscription A
// 5. User ends up with no subscriptions
//
// In theory this situation shouldn't arise as we try to process the events in the order they occur.
if app
.db
.has_active_billing_subscription(billing_customer.user_id)
.await?
{
log::info!(
"user {user_id} already has an active subscription, skipping creation of subscription {subscription_id}",
user_id = billing_customer.user_id,
subscription_id = subscription.id
);
return Ok(());
}
app.db
.create_billing_subscription(&CreateBillingSubscriptionParams {
billing_customer_id: billing_customer.id,
@@ -680,7 +714,9 @@ struct GetMonthlySpendParams {
#[derive(Debug, Serialize)]
struct GetMonthlySpendResponse {
monthly_spend_in_cents: i32,
monthly_free_tier_spend_in_cents: u32,
monthly_free_tier_allowance_in_cents: u32,
monthly_spend_in_cents: u32,
}
async fn get_monthly_spend(
@@ -700,13 +736,22 @@ async fn get_monthly_spend(
));
};
let monthly_spend = llm_db
let free_tier = user
.custom_llm_monthly_allowance_in_cents
.map(|allowance| Cents(allowance as u32))
.unwrap_or(FREE_TIER_MONTHLY_SPENDING_LIMIT);
let spending_for_month = llm_db
.get_user_spending_for_month(user.id, Utc::now())
.await?
.saturating_sub(FREE_TIER_MONTHLY_SPENDING_LIMIT);
.await?;
let free_tier_spend = Cents::min(spending_for_month, free_tier);
let monthly_spend = spending_for_month.saturating_sub(free_tier);
Ok(Json(GetMonthlySpendResponse {
monthly_spend_in_cents: monthly_spend.0 as i32,
monthly_free_tier_spend_in_cents: free_tier_spend.0,
monthly_free_tier_allowance_in_cents: free_tier.0,
monthly_spend_in_cents: monthly_spend.0,
}))
}
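
A small standalone sketch of the free-tier split computed in `get_monthly_spend` above, using hypothetical amounts and a re-sketched `Cents` newtype (the real type lives elsewhere in the collab crate); not part of this diff:

```rust
// Hypothetical numbers only; `Cents` here is a minimal stand-in for the crate's newtype.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Cents(u32);

impl Cents {
    fn min(a: Cents, b: Cents) -> Cents {
        Cents(a.0.min(b.0))
    }
    fn saturating_sub(self, other: Cents) -> Cents {
        Cents(self.0.saturating_sub(other.0))
    }
}

fn main() {
    let free_tier = Cents(1_000); // e.g. a $10.00 allowance (illustrative value)
    let spending_for_month = Cents(1_250); // $12.50 of usage so far

    // Spend counted against the free tier is capped at the allowance...
    let free_tier_spend = Cents::min(spending_for_month, free_tier);
    // ...and only the remainder counts as billable monthly spend.
    let monthly_spend = spending_for_month.saturating_sub(free_tier);

    assert_eq!(free_tier_spend, Cents(1_000));
    assert_eq!(monthly_spend, Cents(250));
}
```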

View File

@@ -1,6 +1,7 @@
use super::ips_file::IpsFile;
use crate::api::CloudflareIpCountryHeader;
use crate::clickhouse::write_to_table;
use crate::clickhouse;
use crate::duckdb;
use crate::{api::slack, AppState, Error, Result};
use anyhow::{anyhow, Context};
use aws_sdk_s3::primitives::ByteStream;
@@ -11,6 +12,7 @@ use axum::{
routing::post,
Extension, Router, TypedHeader,
};
use duckdb::Connection as DuckDbConnection;
use rpc::ExtensionMetadata;
use semantic_version::SemanticVersion;
use serde::{Serialize, Serializer};
@@ -388,13 +390,6 @@ pub async fn post_events(
country_code_header: Option<TypedHeader<CloudflareIpCountryHeader>>,
body: Bytes,
) -> Result<()> {
let Some(clickhouse_client) = app.clickhouse_client.clone() else {
Err(Error::http(
StatusCode::NOT_IMPLEMENTED,
"not supported".into(),
))?
};
let Some(expected) = calculate_json_checksum(app.clone(), &body) else {
return Err(Error::http(
StatusCode::INTERNAL_SERVER_ERROR,
@@ -527,10 +522,26 @@ pub async fn post_events(
}
}
to_upload
.upload(&clickhouse_client)
if let Some(clickhouse_client) = app.clickhouse_client.clone() {
to_upload
.upload_to_clickhouse(&clickhouse_client)
.await
.map_err(|err| Error::Internal(anyhow!(err)))?;
}
if let Some(pool) = app.duckdb_pool.clone() {
tokio::task::spawn_blocking(move || {
let connection = pool
.get()
.context("can't get duckdb connection from pool")?;
to_upload
.upload_to_duckdb(&connection)
.map_err(|err| Error::Internal(anyhow!(err)))?;
anyhow::Ok(())
})
.await
.map_err(|err| Error::Internal(anyhow!(err)))?;
.context("error spawning duckdb write")??;
}
Ok(())
}
@@ -552,14 +563,17 @@ struct ToUpload {
}
impl ToUpload {
pub async fn upload(&self, clickhouse_client: &clickhouse::Client) -> anyhow::Result<()> {
pub async fn upload_to_clickhouse(
&self,
clickhouse_client: &clickhouse::Client,
) -> anyhow::Result<()> {
const EDITOR_EVENTS_TABLE: &str = "editor_events";
write_to_table(EDITOR_EVENTS_TABLE, &self.editor_events, clickhouse_client)
clickhouse::write_to_table(EDITOR_EVENTS_TABLE, &self.editor_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{EDITOR_EVENTS_TABLE}'"))?;
const INLINE_COMPLETION_EVENTS_TABLE: &str = "inline_completion_events";
write_to_table(
clickhouse::write_to_table(
INLINE_COMPLETION_EVENTS_TABLE,
&self.inline_completion_events,
clickhouse_client,
@@ -568,7 +582,7 @@ impl ToUpload {
.with_context(|| format!("failed to upload to table '{INLINE_COMPLETION_EVENTS_TABLE}'"))?;
const ASSISTANT_EVENTS_TABLE: &str = "assistant_events";
write_to_table(
clickhouse::write_to_table(
ASSISTANT_EVENTS_TABLE,
&self.assistant_events,
clickhouse_client,
@@ -577,27 +591,27 @@ impl ToUpload {
.with_context(|| format!("failed to upload to table '{ASSISTANT_EVENTS_TABLE}'"))?;
const CALL_EVENTS_TABLE: &str = "call_events";
write_to_table(CALL_EVENTS_TABLE, &self.call_events, clickhouse_client)
clickhouse::write_to_table(CALL_EVENTS_TABLE, &self.call_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{CALL_EVENTS_TABLE}'"))?;
const CPU_EVENTS_TABLE: &str = "cpu_events";
write_to_table(CPU_EVENTS_TABLE, &self.cpu_events, clickhouse_client)
clickhouse::write_to_table(CPU_EVENTS_TABLE, &self.cpu_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{CPU_EVENTS_TABLE}'"))?;
const MEMORY_EVENTS_TABLE: &str = "memory_events";
write_to_table(MEMORY_EVENTS_TABLE, &self.memory_events, clickhouse_client)
clickhouse::write_to_table(MEMORY_EVENTS_TABLE, &self.memory_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{MEMORY_EVENTS_TABLE}'"))?;
const APP_EVENTS_TABLE: &str = "app_events";
write_to_table(APP_EVENTS_TABLE, &self.app_events, clickhouse_client)
clickhouse::write_to_table(APP_EVENTS_TABLE, &self.app_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{APP_EVENTS_TABLE}'"))?;
const SETTING_EVENTS_TABLE: &str = "setting_events";
write_to_table(
clickhouse::write_to_table(
SETTING_EVENTS_TABLE,
&self.setting_events,
clickhouse_client,
@@ -606,7 +620,7 @@ impl ToUpload {
.with_context(|| format!("failed to upload to table '{SETTING_EVENTS_TABLE}'"))?;
const EXTENSION_EVENTS_TABLE: &str = "extension_events";
write_to_table(
clickhouse::write_to_table(
EXTENSION_EVENTS_TABLE,
&self.extension_events,
clickhouse_client,
@@ -615,22 +629,29 @@ impl ToUpload {
.with_context(|| format!("failed to upload to table '{EXTENSION_EVENTS_TABLE}'"))?;
const EDIT_EVENTS_TABLE: &str = "edit_events";
write_to_table(EDIT_EVENTS_TABLE, &self.edit_events, clickhouse_client)
clickhouse::write_to_table(EDIT_EVENTS_TABLE, &self.edit_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{EDIT_EVENTS_TABLE}'"))?;
const ACTION_EVENTS_TABLE: &str = "action_events";
write_to_table(ACTION_EVENTS_TABLE, &self.action_events, clickhouse_client)
clickhouse::write_to_table(ACTION_EVENTS_TABLE, &self.action_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{ACTION_EVENTS_TABLE}'"))?;
const REPL_EVENTS_TABLE: &str = "repl_events";
write_to_table(REPL_EVENTS_TABLE, &self.repl_events, clickhouse_client)
clickhouse::write_to_table(REPL_EVENTS_TABLE, &self.repl_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{REPL_EVENTS_TABLE}'"))?;
Ok(())
}
pub fn upload_to_duckdb(&self, connection: &DuckDbConnection) -> anyhow::Result<()> {
duckdb::write_to_table("edit_events", &self.edit_events, connection)
.with_context(|| "failed to upload to table 'edit_events'")?;
Ok(())
}
}
pub fn serialize_country_code<S>(country_code: &str, serializer: S) -> Result<S::Ok, S::Error>

View File

@@ -1,4 +1,6 @@
use serde::Serialize;
pub use clickhouse::*;
use ::serde::Serialize;
/// Writes the given rows to the specified Clickhouse table.
pub async fn write_to_table<T: clickhouse::Row + Serialize + std::fmt::Debug>(

View File

@@ -272,6 +272,16 @@ impl Database {
update: &proto::UpdateWorktree,
connection: ConnectionId,
) -> Result<TransactionGuard<Vec<ConnectionId>>> {
if update.removed_entries.len() > proto::MAX_WORKTREE_UPDATE_MAX_CHUNK_SIZE
|| update.updated_entries.len() > proto::MAX_WORKTREE_UPDATE_MAX_CHUNK_SIZE
{
return Err(anyhow!(
"invalid worktree update. removed entries: {}, updated entries: {}",
update.removed_entries.len(),
update.updated_entries.len()
))?;
}
let project_id = ProjectId::from_proto(update.project_id);
let worktree_id = update.worktree_id as i64;
self.project_transaction(project_id, |tx| async move {

View File

@@ -21,6 +21,7 @@ pub struct Model {
pub metrics_id: Uuid,
pub created_at: NaiveDateTime,
pub accepted_tos_at: Option<NaiveDateTime>,
pub custom_llm_monthly_allowance_in_cents: Option<i32>,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]

View File

@@ -0,0 +1,18 @@
pub use duckdb::*;
pub fn write_to_table<T>(
table_name: &str,
rows: &[T],
connection: &duckdb::Connection,
) -> anyhow::Result<()>
where
T: serde::Serialize,
{
let mut stmt = connection.prepare(&format!(
"INSERT INTO {} SELECT * FROM json_each(?)",
table_name
))?;
let json = serde_json::to_string(rows)?;
stmt.execute([json])?;
Ok(())
}

View File

@@ -3,6 +3,7 @@ pub mod auth;
mod cents;
pub mod clickhouse;
pub mod db;
mod duckdb;
pub mod env;
pub mod executor;
pub mod llm;
@@ -24,6 +25,7 @@ use axum::{
};
pub use cents::*;
use db::{ChannelId, Database};
use duckdb::DuckdbConnectionManager;
use executor::Executor;
use llm::db::LlmDatabase;
pub use rate_limiter::*;
@@ -155,6 +157,7 @@ pub struct Config {
pub clickhouse_user: Option<String>,
pub clickhouse_password: Option<String>,
pub clickhouse_database: Option<String>,
pub duckdb_path: Option<String>,
pub invite_link_prefix: String,
pub live_kit_server: Option<String>,
pub live_kit_key: Option<String>,
@@ -230,6 +233,7 @@ impl Config {
clickhouse_user: None,
clickhouse_password: None,
clickhouse_database: None,
duckdb_path: None,
zed_client_checksum_seed: None,
slack_panics_webhook: None,
auto_join_channel_id: None,
@@ -276,9 +280,16 @@ pub struct AppState {
pub rate_limiter: Arc<RateLimiter>,
pub executor: Executor,
pub clickhouse_client: Option<::clickhouse::Client>,
pub duckdb_pool: Option<r2d2::Pool<DuckdbConnectionManager>>,
pub config: Config,
}
#[test]
fn test_app_state_is_send_and_sync() {
fn assert_send_sync<T: Send + Sync>() {}
assert_send_sync::<AppState>();
}
impl AppState {
pub async fn new(config: Config, executor: Executor) -> Result<Arc<Self>> {
let mut db_options = db::ConnectOptions::new(config.database_url.clone());
@@ -317,6 +328,14 @@ impl AppState {
let db = Arc::new(db);
let stripe_client = build_stripe_client(&config).map(Arc::new).log_err();
let duckdb_pool = config.duckdb_path.as_ref().and_then(|path| {
r2d2::Pool::builder()
.max_size(15)
.build(DuckdbConnectionManager::file(path).log_err()?)
.log_err()
});
let this = Self {
db: db.clone(),
llm_db,
@@ -332,6 +351,7 @@ impl AppState {
.clickhouse_url
.as_ref()
.and_then(|_| build_clickhouse_client(&config).log_err()),
duckdb_pool,
config,
};
Ok(Arc::new(this))

View File

@@ -459,8 +459,9 @@ async fn check_usage_limit(
Utc::now(),
)
.await?;
let free_tier = claims.free_tier_monthly_spending_limit();
if usage.spending_this_month >= FREE_TIER_MONTHLY_SPENDING_LIMIT {
if usage.spending_this_month >= free_tier {
if !claims.has_llm_subscription {
return Err(Error::http(
StatusCode::PAYMENT_REQUIRED,
@@ -468,9 +469,7 @@ async fn check_usage_limit(
));
}
if (usage.spending_this_month - FREE_TIER_MONTHLY_SPENDING_LIMIT)
>= Cents(claims.max_monthly_spend_in_cents)
{
if (usage.spending_this_month - free_tier) >= Cents(claims.max_monthly_spend_in_cents) {
return Err(Error::Http(
StatusCode::FORBIDDEN,
"Maximum spending limit reached for this month.".to_string(),
@@ -640,6 +639,7 @@ impl<S> Drop for TokenCountingStream<S> {
tokens,
claims.has_llm_subscription,
Cents(claims.max_monthly_spend_in_cents),
claims.free_tier_monthly_spending_limit(),
Utc::now(),
)
.await

View File

@@ -1,5 +1,5 @@
use crate::db::UserId;
use crate::llm::Cents;
use crate::{db::UserId, llm::FREE_TIER_MONTHLY_SPENDING_LIMIT};
use chrono::{Datelike, Duration};
use futures::StreamExt as _;
use rpc::LanguageModelProvider;
@@ -299,6 +299,7 @@ impl LlmDatabase {
tokens: TokenUsage,
has_llm_subscription: bool,
max_monthly_spend: Cents,
free_tier_monthly_spending_limit: Cents,
now: DateTimeUtc,
) -> Result<Usage> {
self.transaction(|tx| async move {
@@ -410,9 +411,9 @@ impl LlmDatabase {
);
if !is_staff
&& spending_this_month > FREE_TIER_MONTHLY_SPENDING_LIMIT
&& spending_this_month > free_tier_monthly_spending_limit
&& has_llm_subscription
&& (spending_this_month - FREE_TIER_MONTHLY_SPENDING_LIMIT) <= max_monthly_spend
&& (spending_this_month - free_tier_monthly_spending_limit) <= max_monthly_spend
{
billing_event::ActiveModel {
id: ActiveValue::not_set(),

View File

@@ -66,6 +66,7 @@ async fn test_billing_limit_exceeded(db: &mut LlmDatabase) {
usage,
true,
max_monthly_spend,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)
.await
@@ -103,6 +104,7 @@ async fn test_billing_limit_exceeded(db: &mut LlmDatabase) {
usage_2,
true,
max_monthly_spend,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)
.await
@@ -132,6 +134,7 @@ async fn test_billing_limit_exceeded(db: &mut LlmDatabase) {
model,
usage_exceeding,
true,
max_monthly_spend,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)

View File

@@ -1,3 +1,4 @@
use crate::llm::FREE_TIER_MONTHLY_SPENDING_LIMIT;
use crate::{
db::UserId,
llm::db::{
@@ -49,6 +50,7 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
},
false,
Cents::ZERO,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)
.await
@@ -68,6 +70,7 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
},
false,
Cents::ZERO,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)
.await
@@ -124,6 +127,7 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
},
false,
Cents::ZERO,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)
.await
@@ -180,6 +184,7 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
},
false,
Cents::ZERO,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)
.await
@@ -222,6 +227,7 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
},
false,
Cents::ZERO,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)
.await
@@ -259,6 +265,7 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
},
false,
Cents::ZERO,
FREE_TIER_MONTHLY_SPENDING_LIMIT,
now,
)
.await

View File

@@ -1,8 +1,7 @@
use crate::llm::DEFAULT_MAX_MONTHLY_SPEND;
use crate::{
db::{billing_preference, UserId},
Config,
};
use crate::db::user;
use crate::llm::{DEFAULT_MAX_MONTHLY_SPEND, FREE_TIER_MONTHLY_SPENDING_LIMIT};
use crate::Cents;
use crate::{db::billing_preference, Config};
use anyhow::{anyhow, Result};
use chrono::Utc;
use jsonwebtoken::{DecodingKey, EncodingKey, Header, Validation};
@@ -22,6 +21,7 @@ pub struct LlmTokenClaims {
pub has_llm_closed_beta_feature_flag: bool,
pub has_llm_subscription: bool,
pub max_monthly_spend_in_cents: u32,
pub custom_llm_monthly_allowance_in_cents: Option<u32>,
pub plan: rpc::proto::Plan,
}
@@ -30,8 +30,7 @@ const LLM_TOKEN_LIFETIME: Duration = Duration::from_secs(60 * 60);
impl LlmTokenClaims {
#[allow(clippy::too_many_arguments)]
pub fn create(
user_id: UserId,
github_user_login: String,
user: &user::Model,
is_staff: bool,
billing_preferences: Option<billing_preference::Model>,
has_llm_closed_beta_feature_flag: bool,
@@ -49,8 +48,8 @@ impl LlmTokenClaims {
iat: now.timestamp() as u64,
exp: (now + LLM_TOKEN_LIFETIME).timestamp() as u64,
jti: uuid::Uuid::new_v4().to_string(),
user_id: user_id.to_proto(),
github_user_login,
user_id: user.id.to_proto(),
github_user_login: user.github_login.clone(),
is_staff,
has_llm_closed_beta_feature_flag,
has_llm_subscription,
@@ -58,6 +57,9 @@ impl LlmTokenClaims {
.map_or(DEFAULT_MAX_MONTHLY_SPEND.0, |preferences| {
preferences.max_monthly_llm_usage_spending_in_cents as u32
}),
custom_llm_monthly_allowance_in_cents: user
.custom_llm_monthly_allowance_in_cents
.map(|allowance| allowance as u32),
plan,
};
@@ -89,6 +91,12 @@ impl LlmTokenClaims {
}
}
}
pub fn free_tier_monthly_spending_limit(&self) -> Cents {
self.custom_llm_monthly_allowance_in_cents
.map(Cents)
.unwrap_or(FREE_TIER_MONTHLY_SPENDING_LIMIT)
}
}
#[derive(Error, Debug)]

View File

@@ -84,6 +84,8 @@ async fn main() -> Result<()> {
let config = envy::from_env::<Config>().expect("error loading config");
init_tracing(&config);
init_panic_hook();
let mut app = Router::new()
.route("/", get(handle_root))
.route("/healthz", get(handle_liveness_probe))
@@ -378,3 +380,20 @@ pub fn init_tracing(config: &Config) -> Option<()> {
None
}
fn init_panic_hook() {
std::panic::set_hook(Box::new(move |panic_info| {
let panic_message = match panic_info.payload().downcast_ref::<&'static str>() {
Some(message) => *message,
None => match panic_info.payload().downcast_ref::<String>() {
Some(message) => message.as_str(),
None => "Box<Any>",
},
};
let backtrace = std::backtrace::Backtrace::force_capture();
let location = panic_info
.location()
.map(|loc| format!("{}:{}", loc.file(), loc.line()));
tracing::error!(panic = true, ?location, %panic_message, %backtrace, "Server Panic");
}));
}

View File

@@ -1713,11 +1713,6 @@ fn notify_rejoined_projects(
for project in rejoined_projects {
for worktree in mem::take(&mut project.worktrees) {
#[cfg(any(test, feature = "test-support"))]
const MAX_CHUNK_SIZE: usize = 2;
#[cfg(not(any(test, feature = "test-support")))]
const MAX_CHUNK_SIZE: usize = 256;
// Stream this worktree's entries.
let message = proto::UpdateWorktree {
project_id: project.id.to_proto(),
@@ -1731,7 +1726,7 @@ fn notify_rejoined_projects(
updated_repositories: worktree.updated_repositories,
removed_repositories: worktree.removed_repositories,
};
for update in proto::split_worktree_update(message, MAX_CHUNK_SIZE) {
for update in proto::split_worktree_update(message) {
session.peer.send(session.connection_id, update.clone())?;
}
@@ -2195,11 +2190,6 @@ fn join_project_internal(
})?;
for (worktree_id, worktree) in mem::take(&mut project.worktrees) {
#[cfg(any(test, feature = "test-support"))]
const MAX_CHUNK_SIZE: usize = 2;
#[cfg(not(any(test, feature = "test-support")))]
const MAX_CHUNK_SIZE: usize = 256;
// Stream this worktree's entries.
let message = proto::UpdateWorktree {
project_id: project_id.to_proto(),
@@ -2213,7 +2203,7 @@ fn join_project_internal(
updated_repositories: worktree.repository_entries.into_values().collect(),
removed_repositories: Default::default(),
};
for update in proto::split_worktree_update(message, MAX_CHUNK_SIZE) {
for update in proto::split_worktree_update(message) {
session.peer.send(session.connection_id, update.clone())?;
}
@@ -4930,8 +4920,7 @@ async fn get_llm_api_token(
let billing_preferences = db.get_billing_preferences(user.id).await?;
let token = LlmTokenClaims::create(
user.id,
user.github_login.clone(),
&user,
session.is_staff(),
billing_preferences,
has_llm_closed_beta_feature_flag,

View File

@@ -643,6 +643,7 @@ impl TestServer {
rate_limiter: Arc::new(RateLimiter::new(test_db.db().clone())),
executor,
clickhouse_client: None,
duckdb_pool: None,
config: Config {
http_port: 0,
database_url: "".into(),
@@ -673,6 +674,7 @@ impl TestServer {
clickhouse_user: None,
clickhouse_password: None,
clickhouse_database: None,
duckdb_path: None,
zed_client_checksum_seed: None,
slack_panics_webhook: None,
auto_join_channel_id: None,

View File

@@ -11,7 +11,7 @@ use collections::HashMap;
use crate::client::Client;
use crate::types;
const PROTOCOL_VERSION: u32 = 1;
const PROTOCOL_VERSION: &str = "2024-10-07";
pub struct ModelContextProtocol {
inner: Client,
@@ -22,12 +22,19 @@ impl ModelContextProtocol {
Self { inner }
}
fn supported_protocols() -> Vec<types::ProtocolVersion> {
vec![
types::ProtocolVersion::VersionString(PROTOCOL_VERSION.to_string()),
types::ProtocolVersion::VersionNumber(1),
]
}
pub async fn initialize(
self,
client_info: types::Implementation,
) -> Result<InitializedContextServerProtocol> {
let params = types::InitializeParams {
protocol_version: PROTOCOL_VERSION,
protocol_version: types::ProtocolVersion::VersionString(PROTOCOL_VERSION.to_string()),
capabilities: types::ClientCapabilities {
experimental: None,
sampling: None,
@@ -40,6 +47,13 @@ impl ModelContextProtocol {
.request(types::RequestType::Initialize.as_str(), params)
.await?;
if !Self::supported_protocols().contains(&response.protocol_version) {
return Err(anyhow::anyhow!(
"Unsupported protocol version: {:?}",
response.protocol_version
));
}
log::trace!("mcp server info {:?}", response.server_info);
self.inner.notify(

View File

@@ -36,10 +36,17 @@ impl RequestType {
}
}
#[derive(Debug, PartialEq, Eq, Serialize, Deserialize)]
#[serde(untagged)]
pub enum ProtocolVersion {
VersionString(String),
VersionNumber(u32),
}
#[derive(Debug, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct InitializeParams {
pub protocol_version: u32,
pub protocol_version: ProtocolVersion,
pub capabilities: ClientCapabilities,
pub client_info: Implementation,
}
@@ -131,7 +138,7 @@ pub struct CompletionArgument {
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct InitializeResponse {
pub protocol_version: u32,
pub protocol_version: ProtocolVersion,
pub capabilities: ServerCapabilities,
pub server_info: Implementation,
}
@@ -145,10 +152,9 @@ pub struct ResourcesReadResponse {
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct ResourcesListResponse {
pub resources: Vec<Resource>,
#[serde(skip_serializing_if = "Option::is_none")]
pub resource_templates: Option<Vec<ResourceTemplate>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub resources: Option<Vec<Resource>>,
pub next_cursor: Option<String>,
}
#[derive(Debug, Serialize, Deserialize)]
@@ -179,13 +185,15 @@ pub enum SamplingContent {
pub struct PromptsGetResponse {
#[serde(skip_serializing_if = "Option::is_none")]
pub description: Option<String>,
pub prompt: String,
pub messages: Vec<SamplingMessage>,
}
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct PromptsListResponse {
pub prompts: Vec<Prompt>,
#[serde(skip_serializing_if = "Option::is_none")]
pub next_cursor: Option<String>,
}
#[derive(Debug, Deserialize)]
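
As a side note on the `ProtocolVersion` change above: with `#[serde(untagged)]`, either wire shape parses into the enum. A minimal standalone sketch (not part of this diff; the enum is re-declared here only for illustration) of the expected behavior:

```rust
use serde::{Deserialize, Serialize};

// Re-declared for illustration; the real enum lives in the MCP types module touched above.
#[derive(Debug, PartialEq, Eq, Serialize, Deserialize)]
#[serde(untagged)]
enum ProtocolVersion {
    VersionString(String),
    VersionNumber(u32),
}

fn main() -> serde_json::Result<()> {
    // Newer servers report a date-style version string...
    let v: ProtocolVersion = serde_json::from_str("\"2024-10-07\"")?;
    assert_eq!(v, ProtocolVersion::VersionString("2024-10-07".into()));

    // ...while servers that still send a bare number keep working.
    let v: ProtocolVersion = serde_json::from_str("1")?;
    assert_eq!(v, ProtocolVersion::VersionNumber(1));

    // Serialization is symmetric: no enum tag appears on the wire.
    assert_eq!(serde_json::to_string(&ProtocolVersion::VersionNumber(1))?, "1");
    Ok(())
}
```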

View File

@@ -81,6 +81,7 @@ ui.workspace = true
url.workspace = true
util.workspace = true
workspace.workspace = true
unicode-segmentation.workspace = true
[dev-dependencies]
ctor.workspace = true

View File

@@ -18,7 +18,7 @@ pub struct SelectPrevious {
#[derive(PartialEq, Clone, Deserialize, Default)]
pub struct MoveToBeginningOfLine {
#[serde(default = "default_true")]
pub(super) stop_at_soft_wraps: bool,
pub stop_at_soft_wraps: bool,
}
#[derive(PartialEq, Clone, Deserialize, Default)]

View File

@@ -8,7 +8,7 @@
//! of several smaller structures that form a hierarchy (starting at the bottom):
//! - [`InlayMap`] that decides where the [`Inlay`]s should be displayed.
//! - [`FoldMap`] that decides where the fold indicators should be; it also tracks parts of a source file that are currently folded.
//! - [`TabMap`] that keeps track of hard tabs in a buffer.
//! - [`CharMap`] that replaces tabs and non-printable characters.
//! - [`WrapMap`] that handles soft wrapping.
//! - [`BlockMap`] that tracks custom blocks such as diagnostics that should be displayed within buffer.
//! - [`DisplayMap`] that adds background highlights to the regions of text.
@@ -18,10 +18,11 @@
//! [EditorElement]: crate::element::EditorElement
mod block_map;
mod char_map;
mod crease_map;
mod fold_map;
mod inlay_map;
mod tab_map;
mod invisibles;
mod wrap_map;
use crate::{
@@ -32,6 +33,7 @@ pub use block_map::{
BlockMap, BlockPoint, BlockProperties, BlockStyle, CustomBlockId, RenderBlock,
};
use block_map::{BlockRow, BlockSnapshot};
use char_map::{CharMap, CharSnapshot};
use collections::{HashMap, HashSet};
pub use crease_map::*;
pub use fold_map::{Fold, FoldId, FoldPlaceholder, FoldPoint};
@@ -42,6 +44,7 @@ use gpui::{
pub(crate) use inlay_map::Inlay;
use inlay_map::{InlayMap, InlaySnapshot};
pub use inlay_map::{InlayOffset, InlayPoint};
pub use invisibles::is_invisible;
use language::{
language_settings::language_settings, ChunkRenderer, OffsetUtf16, Point,
Subscription as BufferSubscription,
@@ -61,9 +64,9 @@ use std::{
sync::Arc,
};
use sum_tree::{Bias, TreeMap};
use tab_map::{TabMap, TabSnapshot};
use text::LineIndent;
use ui::WindowContext;
use ui::{px, WindowContext};
use unicode_segmentation::UnicodeSegmentation;
use wrap_map::{WrapMap, WrapSnapshot};
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
@@ -94,7 +97,7 @@ pub struct DisplayMap {
/// Decides where the fold indicators should be and tracks parts of a source file that are currently folded.
fold_map: FoldMap,
/// Keeps track of hard tabs in a buffer.
tab_map: TabMap,
char_map: CharMap,
/// Handles soft wrapping.
wrap_map: Model<WrapMap>,
/// Tracks custom blocks such as diagnostics that should be displayed within buffer.
@@ -131,7 +134,7 @@ impl DisplayMap {
let crease_map = CreaseMap::new(&buffer_snapshot);
let (inlay_map, snapshot) = InlayMap::new(buffer_snapshot);
let (fold_map, snapshot) = FoldMap::new(snapshot);
let (tab_map, snapshot) = TabMap::new(snapshot, tab_size);
let (char_map, snapshot) = CharMap::new(snapshot, tab_size);
let (wrap_map, snapshot) = WrapMap::new(snapshot, font, font_size, wrap_width, cx);
let block_map = BlockMap::new(
snapshot,
@@ -148,7 +151,7 @@ impl DisplayMap {
buffer_subscription,
fold_map,
inlay_map,
tab_map,
char_map,
wrap_map,
block_map,
crease_map,
@@ -166,17 +169,17 @@ impl DisplayMap {
let (inlay_snapshot, edits) = self.inlay_map.sync(buffer_snapshot, edits);
let (fold_snapshot, edits) = self.fold_map.read(inlay_snapshot.clone(), edits);
let tab_size = Self::tab_size(&self.buffer, cx);
let (tab_snapshot, edits) = self.tab_map.sync(fold_snapshot.clone(), edits, tab_size);
let (char_snapshot, edits) = self.char_map.sync(fold_snapshot.clone(), edits, tab_size);
let (wrap_snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(tab_snapshot.clone(), edits, cx));
.update(cx, |map, cx| map.sync(char_snapshot.clone(), edits, cx));
let block_snapshot = self.block_map.read(wrap_snapshot.clone(), edits).snapshot;
DisplaySnapshot {
buffer_snapshot: self.buffer.read(cx).snapshot(cx),
fold_snapshot,
inlay_snapshot,
tab_snapshot,
char_snapshot,
wrap_snapshot,
block_snapshot,
crease_snapshot: self.crease_map.snapshot(),
@@ -212,13 +215,13 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (mut fold_map, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
self.block_map.read(snapshot, edits);
let (snapshot, edits) = fold_map.fold(ranges);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -236,13 +239,13 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (mut fold_map, snapshot, edits) = self.fold_map.write(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
self.block_map.read(snapshot, edits);
let (snapshot, edits) = fold_map.unfold(ranges, inclusive);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -277,7 +280,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -295,7 +298,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -313,7 +316,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -331,7 +334,7 @@ impl DisplayMap {
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.inlay_map.sync(snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -407,7 +410,7 @@ impl DisplayMap {
let (snapshot, edits) = self.inlay_map.sync(buffer_snapshot, edits);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let tab_size = Self::tab_size(&self.buffer, cx);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -415,7 +418,7 @@ impl DisplayMap {
let (snapshot, edits) = self.inlay_map.splice(to_remove, to_insert);
let (snapshot, edits) = self.fold_map.read(snapshot, edits);
let (snapshot, edits) = self.tab_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self.char_map.sync(snapshot, edits, tab_size);
let (snapshot, edits) = self
.wrap_map
.update(cx, |map, cx| map.sync(snapshot, edits, cx));
@@ -467,7 +470,7 @@ pub struct DisplaySnapshot {
pub fold_snapshot: FoldSnapshot,
pub crease_snapshot: CreaseSnapshot,
inlay_snapshot: InlaySnapshot,
tab_snapshot: TabSnapshot,
char_snapshot: CharSnapshot,
wrap_snapshot: WrapSnapshot,
block_snapshot: BlockSnapshot,
text_highlights: TextHighlights,
@@ -567,8 +570,8 @@ impl DisplaySnapshot {
fn point_to_display_point(&self, point: MultiBufferPoint, bias: Bias) -> DisplayPoint {
let inlay_point = self.inlay_snapshot.to_inlay_point(point);
let fold_point = self.fold_snapshot.to_fold_point(inlay_point, bias);
let tab_point = self.tab_snapshot.to_tab_point(fold_point);
let wrap_point = self.wrap_snapshot.tab_point_to_wrap_point(tab_point);
let char_point = self.char_snapshot.to_char_point(fold_point);
let wrap_point = self.wrap_snapshot.char_point_to_wrap_point(char_point);
let block_point = self.block_snapshot.to_block_point(wrap_point);
DisplayPoint(block_point)
}
@@ -596,21 +599,21 @@ impl DisplaySnapshot {
fn display_point_to_inlay_point(&self, point: DisplayPoint, bias: Bias) -> InlayPoint {
let block_point = point.0;
let wrap_point = self.block_snapshot.to_wrap_point(block_point);
let tab_point = self.wrap_snapshot.to_tab_point(wrap_point);
let fold_point = self.tab_snapshot.to_fold_point(tab_point, bias).0;
let char_point = self.wrap_snapshot.to_char_point(wrap_point);
let fold_point = self.char_snapshot.to_fold_point(char_point, bias).0;
fold_point.to_inlay_point(&self.fold_snapshot)
}
pub fn display_point_to_fold_point(&self, point: DisplayPoint, bias: Bias) -> FoldPoint {
let block_point = point.0;
let wrap_point = self.block_snapshot.to_wrap_point(block_point);
let tab_point = self.wrap_snapshot.to_tab_point(wrap_point);
self.tab_snapshot.to_fold_point(tab_point, bias).0
let char_point = self.wrap_snapshot.to_char_point(wrap_point);
self.char_snapshot.to_fold_point(char_point, bias).0
}
pub fn fold_point_to_display_point(&self, fold_point: FoldPoint) -> DisplayPoint {
let tab_point = self.tab_snapshot.to_tab_point(fold_point);
let wrap_point = self.wrap_snapshot.tab_point_to_wrap_point(tab_point);
let char_point = self.char_snapshot.to_char_point(fold_point);
let wrap_point = self.wrap_snapshot.char_point_to_wrap_point(char_point);
let block_point = self.block_snapshot.to_block_point(wrap_point);
DisplayPoint(block_point)
}
@@ -688,6 +691,23 @@ impl DisplaySnapshot {
}
}
if chunk.is_invisible {
let invisible_highlight = HighlightStyle {
background_color: Some(editor_style.status.hint_background),
underline: Some(UnderlineStyle {
color: Some(editor_style.status.hint),
thickness: px(1.),
wavy: false,
}),
..Default::default()
};
if let Some(highlight_style) = highlight_style.as_mut() {
highlight_style.highlight(invisible_highlight);
} else {
highlight_style = Some(invisible_highlight);
}
}
let mut diagnostic_highlight = HighlightStyle::default();
if chunk.is_unnecessary {
@@ -784,12 +804,11 @@ impl DisplaySnapshot {
layout_line.closest_index_for_x(x) as u32
}
pub fn display_chars_at(
&self,
mut point: DisplayPoint,
) -> impl Iterator<Item = (char, DisplayPoint)> + '_ {
pub fn grapheme_at(&self, mut point: DisplayPoint) -> Option<String> {
point = DisplayPoint(self.block_snapshot.clip_point(point.0, Bias::Left));
self.text_chunks(point.row())
let chars = self
.text_chunks(point.row())
.flat_map(str::chars)
.skip_while({
let mut column = 0;
@@ -799,16 +818,21 @@ impl DisplaySnapshot {
!at_point
}
})
.map(move |ch| {
let result = (ch, point);
if ch == '\n' {
*point.row_mut() += 1;
*point.column_mut() = 0;
} else {
*point.column_mut() += ch.len_utf8() as u32;
.take_while({
let mut prev = false;
move |char| {
let now = char.is_ascii();
let end = char.is_ascii() && (char.is_ascii_whitespace() || prev);
prev = now;
!end
}
result
})
});
chars
.collect::<String>()
.graphemes(true)
.next()
.map(|s| s.to_owned())
}
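
A quick standalone sketch (not from this diff) of the unicode-segmentation call that `grapheme_at` now relies on, showing why it returns a `String` rather than a single `char`:

```rust
use unicode_segmentation::UnicodeSegmentation;

// Returns the first user-perceived character (extended grapheme cluster) of `s`.
fn first_grapheme(s: &str) -> Option<String> {
    s.graphemes(true).next().map(|g| g.to_owned())
}

fn main() {
    // "e" followed by a combining acute accent is two Unicode scalar values
    // but a single grapheme cluster, so it comes back as one unit.
    assert_eq!(first_grapheme("e\u{301}llo"), Some("e\u{301}".to_string()));
    assert_eq!(first_grapheme("abc"), Some("a".to_string()));
    assert_eq!(first_grapheme(""), None);
}
```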
pub fn buffer_chars_at(&self, mut offset: usize) -> impl Iterator<Item = (char, usize)> + '_ {
@@ -1120,8 +1144,8 @@ impl DisplayPoint {
pub fn to_offset(self, map: &DisplaySnapshot, bias: Bias) -> usize {
let wrap_point = map.block_snapshot.to_wrap_point(self.0);
let tab_point = map.wrap_snapshot.to_tab_point(wrap_point);
let fold_point = map.tab_snapshot.to_fold_point(tab_point, bias).0;
let char_point = map.wrap_snapshot.to_char_point(wrap_point);
let fold_point = map.char_snapshot.to_fold_point(char_point, bias).0;
let inlay_point = fold_point.to_inlay_point(&map.fold_snapshot);
map.inlay_snapshot
.to_buffer_offset(map.inlay_snapshot.to_offset(inlay_point))
@@ -1228,7 +1252,7 @@ pub mod tests {
let snapshot = map.update(cx, |map, cx| map.snapshot(cx));
log::info!("buffer text: {:?}", snapshot.buffer_snapshot.text());
log::info!("fold text: {:?}", snapshot.fold_snapshot.text());
log::info!("tab text: {:?}", snapshot.tab_snapshot.text());
log::info!("char text: {:?}", snapshot.char_snapshot.text());
log::info!("wrap text: {:?}", snapshot.wrap_snapshot.text());
log::info!("block text: {:?}", snapshot.block_snapshot.text());
log::info!("display text: {:?}", snapshot.text());
@@ -1345,7 +1369,7 @@ pub mod tests {
fold_count = snapshot.fold_count();
log::info!("buffer text: {:?}", snapshot.buffer_snapshot.text());
log::info!("fold text: {:?}", snapshot.fold_snapshot.text());
log::info!("tab text: {:?}", snapshot.tab_snapshot.text());
log::info!("char text: {:?}", snapshot.char_snapshot.text());
log::info!("wrap text: {:?}", snapshot.wrap_snapshot.text());
log::info!("block text: {:?}", snapshot.block_snapshot.text());
log::info!("display text: {:?}", snapshot.text());

View File

@@ -1421,7 +1421,7 @@ fn offset_for_row(s: &str, target: u32) -> (u32, usize) {
mod tests {
use super::*;
use crate::display_map::{
fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap, wrap_map::WrapMap,
char_map::CharMap, fold_map::FoldMap, inlay_map::InlayMap, wrap_map::WrapMap,
};
use gpui::{div, font, px, AppContext, Context as _, Element};
use language::{Buffer, Capability};
@@ -1456,9 +1456,9 @@ mod tests {
let subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (mut tab_map, tab_snapshot) = TabMap::new(fold_snapshot, 1.try_into().unwrap());
let (mut char_map, char_snapshot) = CharMap::new(fold_snapshot, 1.try_into().unwrap());
let (wrap_map, wraps_snapshot) =
cx.update(|cx| WrapMap::new(tab_snapshot, font("Helvetica"), px(14.0), None, cx));
cx.update(|cx| WrapMap::new(char_snapshot, font("Helvetica"), px(14.0), None, cx));
let mut block_map = BlockMap::new(wraps_snapshot.clone(), true, 1, 1, 1);
let mut writer = block_map.write(wraps_snapshot.clone(), Default::default());
@@ -1609,10 +1609,10 @@ mod tests {
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot, subscription.consume().into_inner());
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, 4.try_into().unwrap());
let (char_snapshot, tab_edits) =
char_map.sync(fold_snapshot, fold_edits, 4.try_into().unwrap());
let (wraps_snapshot, wrap_edits) = wrap_map.update(cx, |wrap_map, cx| {
wrap_map.sync(tab_snapshot, tab_edits, cx)
wrap_map.sync(char_snapshot, tab_edits, cx)
});
let snapshot = block_map.read(wraps_snapshot, wrap_edits);
assert_eq!(snapshot.text(), "aaa\n\nb!!!\n\n\nbb\nccc\nddd\n\n\n");
@@ -1672,8 +1672,9 @@ mod tests {
let multi_buffer_snapshot = multi_buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(multi_buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wraps_snapshot) = WrapMap::new(tab_snapshot, font, font_size, Some(wrap_width), cx);
let (_, char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wraps_snapshot) =
WrapMap::new(char_snapshot, font, font_size, Some(wrap_width), cx);
let block_map = BlockMap::new(wraps_snapshot.clone(), true, 1, 1, 1);
let snapshot = block_map.read(wraps_snapshot, Default::default());
@@ -1710,9 +1711,9 @@ mod tests {
let _subscription = buffer.update(cx, |buffer, _| buffer.subscribe());
let (_inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_tab_map, tab_snapshot) = TabMap::new(fold_snapshot, 1.try_into().unwrap());
let (_char_map, char_snapshot) = CharMap::new(fold_snapshot, 1.try_into().unwrap());
let (_wrap_map, wraps_snapshot) =
cx.update(|cx| WrapMap::new(tab_snapshot, font("Helvetica"), px(14.0), None, cx));
cx.update(|cx| WrapMap::new(char_snapshot, font("Helvetica"), px(14.0), None, cx));
let mut block_map = BlockMap::new(wraps_snapshot.clone(), false, 1, 1, 0);
let mut writer = block_map.write(wraps_snapshot.clone(), Default::default());
@@ -1815,9 +1816,15 @@ mod tests {
let buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, wraps_snapshot) = cx.update(|cx| {
WrapMap::new(tab_snapshot, font("Helvetica"), px(14.0), Some(px(60.)), cx)
WrapMap::new(
char_snapshot,
font("Helvetica"),
px(14.0),
Some(px(60.)),
cx,
)
});
let mut block_map = BlockMap::new(wraps_snapshot.clone(), true, 1, 1, 0);
@@ -1885,9 +1892,9 @@ mod tests {
let mut buffer_snapshot = cx.update(|cx| buffer.read(cx).snapshot(cx));
let (mut inlay_map, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (mut tab_map, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (mut char_map, char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
let (wrap_map, wraps_snapshot) = cx
.update(|cx| WrapMap::new(tab_snapshot, font("Helvetica"), font_size, wrap_width, cx));
.update(|cx| WrapMap::new(char_snapshot, font("Helvetica"), font_size, wrap_width, cx));
let mut block_map = BlockMap::new(
wraps_snapshot,
true,
@@ -1944,10 +1951,10 @@ mod tests {
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot.clone(), vec![]);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, tab_size);
let (char_snapshot, tab_edits) =
char_map.sync(fold_snapshot, fold_edits, tab_size);
let (wraps_snapshot, wrap_edits) = wrap_map.update(cx, |wrap_map, cx| {
wrap_map.sync(tab_snapshot, tab_edits, cx)
wrap_map.sync(char_snapshot, tab_edits, cx)
});
let mut block_map = block_map.write(wraps_snapshot, wrap_edits);
let block_ids =
@@ -1976,10 +1983,10 @@ mod tests {
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot.clone(), vec![]);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, tab_size);
let (char_snapshot, tab_edits) =
char_map.sync(fold_snapshot, fold_edits, tab_size);
let (wraps_snapshot, wrap_edits) = wrap_map.update(cx, |wrap_map, cx| {
wrap_map.sync(tab_snapshot, tab_edits, cx)
wrap_map.sync(char_snapshot, tab_edits, cx)
});
let mut block_map = block_map.write(wraps_snapshot, wrap_edits);
block_map.remove(block_ids_to_remove);
@@ -1999,9 +2006,9 @@ mod tests {
let (inlay_snapshot, inlay_edits) =
inlay_map.sync(buffer_snapshot.clone(), buffer_edits);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tab_snapshot, tab_edits) = tab_map.sync(fold_snapshot, fold_edits, tab_size);
let (char_snapshot, tab_edits) = char_map.sync(fold_snapshot, fold_edits, tab_size);
let (wraps_snapshot, wrap_edits) = wrap_map.update(cx, |wrap_map, cx| {
wrap_map.sync(tab_snapshot, tab_edits, cx)
wrap_map.sync(char_snapshot, tab_edits, cx)
});
let blocks_snapshot = block_map.read(wraps_snapshot.clone(), wrap_edits);
assert_eq!(
@@ -2084,7 +2091,10 @@ mod tests {
}
}
let soft_wrapped = wraps_snapshot.to_tab_point(WrapPoint::new(row, 0)).column() > 0;
let soft_wrapped = wraps_snapshot
.to_char_point(WrapPoint::new(row, 0))
.column()
> 0;
expected_buffer_rows.push(if soft_wrapped { None } else { buffer_row });
expected_text.push_str(input_line);

View File

@@ -1,5 +1,6 @@
use super::{
fold_map::{self, FoldChunks, FoldEdit, FoldPoint, FoldSnapshot},
invisibles::{is_invisible, replacement},
Highlights,
};
use language::{Chunk, Point};
@@ -9,14 +10,14 @@ use sum_tree::Bias;
const MAX_EXPANSION_COLUMN: u32 = 256;
/// Keeps track of hard tabs in a text buffer.
/// Keeps track of hard tabs and non-printable characters in a text buffer.
///
/// See the [`display_map` module documentation](crate::display_map) for more information.
pub struct TabMap(TabSnapshot);
pub struct CharMap(CharSnapshot);
impl TabMap {
pub fn new(fold_snapshot: FoldSnapshot, tab_size: NonZeroU32) -> (Self, TabSnapshot) {
let snapshot = TabSnapshot {
impl CharMap {
pub fn new(fold_snapshot: FoldSnapshot, tab_size: NonZeroU32) -> (Self, CharSnapshot) {
let snapshot = CharSnapshot {
fold_snapshot,
tab_size,
max_expansion_column: MAX_EXPANSION_COLUMN,
@@ -26,7 +27,7 @@ impl TabMap {
}
#[cfg(test)]
pub fn set_max_expansion_column(&mut self, column: u32) -> TabSnapshot {
pub fn set_max_expansion_column(&mut self, column: u32) -> CharSnapshot {
self.0.max_expansion_column = column;
self.0.clone()
}
@@ -36,9 +37,9 @@ impl TabMap {
fold_snapshot: FoldSnapshot,
mut fold_edits: Vec<FoldEdit>,
tab_size: NonZeroU32,
) -> (TabSnapshot, Vec<TabEdit>) {
) -> (CharSnapshot, Vec<TabEdit>) {
let old_snapshot = &mut self.0;
let mut new_snapshot = TabSnapshot {
let mut new_snapshot = CharSnapshot {
fold_snapshot,
tab_size,
max_expansion_column: old_snapshot.max_expansion_column,
@@ -137,15 +138,15 @@ impl TabMap {
let new_start = fold_edit.new.start.to_point(&new_snapshot.fold_snapshot);
let new_end = fold_edit.new.end.to_point(&new_snapshot.fold_snapshot);
tab_edits.push(TabEdit {
old: old_snapshot.to_tab_point(old_start)..old_snapshot.to_tab_point(old_end),
new: new_snapshot.to_tab_point(new_start)..new_snapshot.to_tab_point(new_end),
old: old_snapshot.to_char_point(old_start)..old_snapshot.to_char_point(old_end),
new: new_snapshot.to_char_point(new_start)..new_snapshot.to_char_point(new_end),
});
}
} else {
new_snapshot.version += 1;
tab_edits.push(TabEdit {
old: TabPoint::zero()..old_snapshot.max_point(),
new: TabPoint::zero()..new_snapshot.max_point(),
old: CharPoint::zero()..old_snapshot.max_point(),
new: CharPoint::zero()..new_snapshot.max_point(),
});
}
@@ -155,14 +156,14 @@ impl TabMap {
}
#[derive(Clone)]
pub struct TabSnapshot {
pub struct CharSnapshot {
pub fold_snapshot: FoldSnapshot,
pub tab_size: NonZeroU32,
pub max_expansion_column: u32,
pub version: usize,
}
impl TabSnapshot {
impl CharSnapshot {
pub fn buffer_snapshot(&self) -> &MultiBufferSnapshot {
&self.fold_snapshot.inlay_snapshot.buffer
}
@@ -170,7 +171,7 @@ impl TabSnapshot {
pub fn line_len(&self, row: u32) -> u32 {
let max_point = self.max_point();
if row < max_point.row() {
self.to_tab_point(FoldPoint::new(row, self.fold_snapshot.line_len(row)))
self.to_char_point(FoldPoint::new(row, self.fold_snapshot.line_len(row)))
.0
.column
} else {
@@ -179,10 +180,10 @@ impl TabSnapshot {
}
pub fn text_summary(&self) -> TextSummary {
self.text_summary_for_range(TabPoint::zero()..self.max_point())
self.text_summary_for_range(CharPoint::zero()..self.max_point())
}
pub fn text_summary_for_range(&self, range: Range<TabPoint>) -> TextSummary {
pub fn text_summary_for_range(&self, range: Range<CharPoint>) -> TextSummary {
let input_start = self.to_fold_point(range.start, Bias::Left).0;
let input_end = self.to_fold_point(range.end, Bias::Right).0;
let input_summary = self
@@ -211,7 +212,7 @@ impl TabSnapshot {
} else {
for _ in self
.chunks(
TabPoint::new(range.end.row(), 0)..range.end,
CharPoint::new(range.end.row(), 0)..range.end,
false,
Highlights::default(),
)
@@ -232,7 +233,7 @@ impl TabSnapshot {
pub fn chunks<'a>(
&'a self,
range: Range<TabPoint>,
range: Range<CharPoint>,
language_aware: bool,
highlights: Highlights<'a>,
) -> TabChunks<'a> {
@@ -278,7 +279,7 @@ impl TabSnapshot {
#[cfg(test)]
pub fn text(&self) -> String {
self.chunks(
TabPoint::zero()..self.max_point(),
CharPoint::zero()..self.max_point(),
false,
Highlights::default(),
)
@@ -286,24 +287,24 @@ impl TabSnapshot {
.collect()
}
pub fn max_point(&self) -> TabPoint {
self.to_tab_point(self.fold_snapshot.max_point())
pub fn max_point(&self) -> CharPoint {
self.to_char_point(self.fold_snapshot.max_point())
}
pub fn clip_point(&self, point: TabPoint, bias: Bias) -> TabPoint {
self.to_tab_point(
pub fn clip_point(&self, point: CharPoint, bias: Bias) -> CharPoint {
self.to_char_point(
self.fold_snapshot
.clip_point(self.to_fold_point(point, bias).0, bias),
)
}
pub fn to_tab_point(&self, input: FoldPoint) -> TabPoint {
pub fn to_char_point(&self, input: FoldPoint) -> CharPoint {
let chars = self.fold_snapshot.chars_at(FoldPoint::new(input.row(), 0));
let expanded = self.expand_tabs(chars, input.column());
TabPoint::new(input.row(), expanded)
CharPoint::new(input.row(), expanded)
}
pub fn to_fold_point(&self, output: TabPoint, bias: Bias) -> (FoldPoint, u32, u32) {
pub fn to_fold_point(&self, output: CharPoint, bias: Bias) -> (FoldPoint, u32, u32) {
let chars = self.fold_snapshot.chars_at(FoldPoint::new(output.row(), 0));
let expanded = output.column();
let (collapsed, expanded_char_column, to_next_stop) =
@@ -315,13 +316,13 @@ impl TabSnapshot {
)
}
pub fn make_tab_point(&self, point: Point, bias: Bias) -> TabPoint {
pub fn make_char_point(&self, point: Point, bias: Bias) -> CharPoint {
let inlay_point = self.fold_snapshot.inlay_snapshot.to_inlay_point(point);
let fold_point = self.fold_snapshot.to_fold_point(inlay_point, bias);
self.to_tab_point(fold_point)
self.to_char_point(fold_point)
}
pub fn to_point(&self, point: TabPoint, bias: Bias) -> Point {
pub fn to_point(&self, point: CharPoint, bias: Bias) -> Point {
let fold_point = self.to_fold_point(point, bias).0;
let inlay_point = fold_point.to_inlay_point(&self.fold_snapshot);
self.fold_snapshot
@@ -344,6 +345,9 @@ impl TabSnapshot {
let tab_len = tab_size - expanded_chars % tab_size;
expanded_bytes += tab_len;
expanded_chars += tab_len;
} else if let Some(replacement) = replacement(c) {
expanded_chars += replacement.chars().count() as u32;
expanded_bytes += replacement.len() as u32;
} else {
expanded_bytes += c.len_utf8() as u32;
expanded_chars += 1;
@@ -383,6 +387,9 @@ impl TabSnapshot {
Bias::Right => (collapsed_bytes + 1, expanded_chars, 0),
};
}
} else if let Some(replacement) = replacement(c) {
expanded_chars += replacement.chars().count() as u32;
expanded_bytes += replacement.len() as u32;
} else {
expanded_chars += 1;
expanded_bytes += c.len_utf8() as u32;
@@ -404,9 +411,9 @@ impl TabSnapshot {
}
#[derive(Copy, Clone, Debug, Default, Eq, Ord, PartialOrd, PartialEq)]
pub struct TabPoint(pub Point);
pub struct CharPoint(pub Point);
impl TabPoint {
impl CharPoint {
pub fn new(row: u32, column: u32) -> Self {
Self(Point::new(row, column))
}
@@ -424,13 +431,13 @@ impl TabPoint {
}
}
impl From<Point> for TabPoint {
impl From<Point> for CharPoint {
fn from(point: Point) -> Self {
Self(point)
}
}
pub type TabEdit = text::Edit<TabPoint>;
pub type TabEdit = text::Edit<CharPoint>;
#[derive(Clone, Debug, Default, Eq, PartialEq)]
pub struct TextSummary {
@@ -551,6 +558,37 @@ impl<'a> Iterator for TabChunks<'a> {
self.input_column = 0;
self.output_position += Point::new(1, 0);
}
_ if is_invisible(c) => {
if ix > 0 {
let (prefix, suffix) = self.chunk.text.split_at(ix);
self.chunk.text = suffix;
return Some(Chunk {
text: prefix,
is_invisible: false,
..self.chunk.clone()
});
}
let c_len = c.len_utf8();
let replacement = replacement(c).unwrap_or(&self.chunk.text[..c_len]);
if self.chunk.text.len() >= c_len {
self.chunk.text = &self.chunk.text[c_len..];
} else {
self.chunk.text = "";
}
let len = replacement.chars().count() as u32;
let next_output_position = cmp::min(
self.output_position + Point::new(0, len),
self.max_output_position,
);
self.column += len;
self.input_column += 1;
self.output_position = next_output_position;
return Some(Chunk {
text: replacement,
is_invisible: true,
..self.chunk.clone()
});
}
_ => {
self.column += 1;
if !self.inside_leading_tab {
@@ -580,11 +618,11 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
assert_eq!(tab_snapshot.expand_tabs("\t".chars(), 0), 0);
assert_eq!(tab_snapshot.expand_tabs("\t".chars(), 1), 4);
assert_eq!(tab_snapshot.expand_tabs("\ta".chars(), 2), 5);
assert_eq!(char_snapshot.expand_tabs("\t".chars(), 0), 0);
assert_eq!(char_snapshot.expand_tabs("\t".chars(), 1), 4);
assert_eq!(char_snapshot.expand_tabs("\ta".chars(), 2), 5);
}
#[gpui::test]
@@ -597,16 +635,16 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, mut tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, mut char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
tab_snapshot.max_expansion_column = max_expansion_column;
assert_eq!(tab_snapshot.text(), output);
char_snapshot.max_expansion_column = max_expansion_column;
assert_eq!(char_snapshot.text(), output);
for (ix, c) in input.char_indices() {
assert_eq!(
tab_snapshot
char_snapshot
.chunks(
TabPoint::new(0, ix as u32)..tab_snapshot.max_point(),
CharPoint::new(0, ix as u32)..char_snapshot.max_point(),
false,
Highlights::default(),
)
@@ -620,13 +658,13 @@ mod tests {
let input_point = Point::new(0, ix as u32);
let output_point = Point::new(0, output.find(c).unwrap() as u32);
assert_eq!(
tab_snapshot.to_tab_point(FoldPoint(input_point)),
TabPoint(output_point),
"to_tab_point({input_point:?})"
char_snapshot.to_char_point(FoldPoint(input_point)),
CharPoint(output_point),
"to_char_point({input_point:?})"
);
assert_eq!(
tab_snapshot
.to_fold_point(TabPoint(output_point), Bias::Left)
char_snapshot
.to_fold_point(CharPoint(output_point), Bias::Left)
.0,
FoldPoint(input_point),
"to_fold_point({output_point:?})"
@@ -644,10 +682,10 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, mut tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, mut char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
tab_snapshot.max_expansion_column = max_expansion_column;
assert_eq!(tab_snapshot.text(), input);
char_snapshot.max_expansion_column = max_expansion_column;
assert_eq!(char_snapshot.text(), input);
}
#[gpui::test]
@@ -658,10 +696,10 @@ mod tests {
let buffer_snapshot = buffer.read(cx).snapshot(cx);
let (_, inlay_snapshot) = InlayMap::new(buffer_snapshot.clone());
let (_, fold_snapshot) = FoldMap::new(inlay_snapshot);
let (_, tab_snapshot) = TabMap::new(fold_snapshot, 4.try_into().unwrap());
let (_, char_snapshot) = CharMap::new(fold_snapshot, 4.try_into().unwrap());
assert_eq!(
chunks(&tab_snapshot, TabPoint::zero()),
chunks(&char_snapshot, CharPoint::zero()),
vec![
(" ".to_string(), true),
(" ".to_string(), false),
@@ -670,7 +708,7 @@ mod tests {
]
);
assert_eq!(
chunks(&tab_snapshot, TabPoint::new(0, 2)),
chunks(&char_snapshot, CharPoint::new(0, 2)),
vec![
(" ".to_string(), true),
(" ".to_string(), false),
@@ -679,7 +717,7 @@ mod tests {
]
);
fn chunks(snapshot: &TabSnapshot, start: TabPoint) -> Vec<(String, bool)> {
fn chunks(snapshot: &CharSnapshot, start: CharPoint) -> Vec<(String, bool)> {
let mut chunks = Vec::new();
let mut was_tab = false;
let mut text = String::new();
@@ -725,12 +763,12 @@ mod tests {
let (inlay_snapshot, _) = inlay_map.randomly_mutate(&mut 0, &mut rng);
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (mut tab_map, _) = TabMap::new(fold_snapshot.clone(), tab_size);
let tabs_snapshot = tab_map.set_max_expansion_column(32);
let (mut char_map, _) = CharMap::new(fold_snapshot.clone(), tab_size);
let tabs_snapshot = char_map.set_max_expansion_column(32);
let text = text::Rope::from(tabs_snapshot.text().as_str());
log::info!(
"TabMap text (tab size: {}): {:?}",
"CharMap text (tab size: {}): {:?}",
tab_size,
tabs_snapshot.text(),
);
@@ -738,11 +776,11 @@ mod tests {
for _ in 0..5 {
let end_row = rng.gen_range(0..=text.max_point().row);
let end_column = rng.gen_range(0..=text.line_len(end_row));
let mut end = TabPoint(text.clip_point(Point::new(end_row, end_column), Bias::Right));
let mut end = CharPoint(text.clip_point(Point::new(end_row, end_column), Bias::Right));
let start_row = rng.gen_range(0..=text.max_point().row);
let start_column = rng.gen_range(0..=text.line_len(start_row));
let mut start =
TabPoint(text.clip_point(Point::new(start_row, start_column), Bias::Left));
CharPoint(text.clip_point(Point::new(start_row, start_column), Bias::Left));
if start > end {
mem::swap(&mut start, &mut end);
}
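
To make the tab-stop arithmetic used by this map concrete (`tab_len = tab_size - expanded_chars % tab_size`), here is a small standalone sketch. It only illustrates the formula checked by the tests above; it is not the actual `CharMap`/`expand_tabs` API, which also caps expansion at `max_expansion_column` and tracks byte offsets.

```rust
// Standalone illustration of the expansion formula used above; the real map
// also limits expansion to max_expansion_column and handles invisibles.
fn expanded_width(text: &str, tab_size: u32) -> u32 {
    let mut column = 0;
    for c in text.chars() {
        if c == '\t' {
            // Jump to the next tab stop.
            column += tab_size - column % tab_size;
        } else {
            column += 1;
        }
    }
    column
}

fn main() {
    assert_eq!(expanded_width("\t", 4), 4); // a lone tab fills the first stop
    assert_eq!(expanded_width("\ta", 4), 5); // tab to column 4, then one char
    assert_eq!(expanded_width("ab\t", 4), 4); // tab only advances to the next stop
}
```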

View File

@@ -0,0 +1,157 @@
use std::sync::LazyLock;
use collections::HashMap;
// Invisibility in a Unicode context is not well defined, so we have to guess.
//
// We highlight all ASCII control codes and Unicode whitespace because they are likely
// to be confused with a normal space (U+0020).
//
// We also highlight the handful of blank non-space characters:
// U+2800 BRAILLE PATTERN BLANK - Category: So
// U+115F HANGUL CHOSEONG FILLER - Category: Lo
// U+1160 HANGUL CHOSEONG FILLER - Category: Lo
// U+3164 HANGUL FILLER - Category: Lo
// U+FFA0 HALFWIDTH HANGUL FILLER - Category: Lo
// U+FFFC OBJECT REPLACEMENT CHARACTER - Category: So
//
// For the rest of Unicode, invisibility happens for two reasons:
// * A Format character (like a byte order mark or right-to-left override)
// * An invisible Nonspacing Mark character (like U+034F, or variation selectors)
//
// We don't consider unassigned codepoints invisible, as the font renderer already shows
// a replacement character in that case (and there are a *lot* of them).
//
// Control characters are mostly fine to highlight, except:
// * U+E0020..=U+E007F are used in emoji flags. We don't highlight them right now, but we could if we tightened our heuristics.
// * U+200D is used to join characters. We highlight this but don't replace it. As our font system ignores mid-glyph highlights, this mostly works to highlight unexpected uses.
//
// Nonspacing marks are handled like U+200D: we ignore them mid-glyph, but this
// probably causes issues with end-of-glyph usage.
//
// ref: https://invisible-characters.com
// ref: https://www.compart.com/en/unicode/category/Cf
// ref: https://gist.github.com/ConradIrwin/f759e1fc29267143c4c7895aa495dca5?h=1
// ref: https://unicode.org/Public/emoji/13.0/emoji-test.txt
// https://github.com/bits/UTF-8-Unicode-Test-Documents/blob/master/UTF-8_sequence_separated/utf8_sequence_0-0x10ffff_assigned_including-unprintable-asis.txt
pub fn is_invisible(c: char) -> bool {
if c <= '\u{1f}' {
c != '\t' && c != '\n' && c != '\r'
} else if c >= '\u{7f}' {
c <= '\u{9f}' || c.is_whitespace() || contains(c, &FORMAT) || contains(c, &OTHER)
} else {
false
}
}
pub(crate) fn replacement(c: char) -> Option<&'static str> {
if !is_invisible(c) {
return None;
}
if c <= '\x7f' {
REPLACEMENTS.get(&c).copied()
} else if contains(c, &PRESERVE) {
None
} else {
Some(" ")
}
}
static REPLACEMENTS: LazyLock<HashMap<char, &'static str>> = LazyLock::new(|| {
[
('\x00', "␀"),
('\x01', "␁"),
('\x02', "␂"),
('\x03', "␃"),
('\x04', "␄"),
('\x05', "␅"),
('\x06', "␆"),
('\x07', "␇"),
('\x08', "␈"),
('\x0B', "␋"),
('\x0C', "␌"),
('\x0D', "␍"),
('\x0E', "␎"),
('\x0F', "␏"),
('\x10', "␐"),
('\x11', "␑"),
('\x12', "␒"),
('\x13', "␓"),
('\x14', "␔"),
('\x15', "␕"),
('\x16', "␖"),
('\x17', "␗"),
('\x18', "␘"),
('\x19', "␙"),
('\x1A', "␚"),
('\x1B', "␛"),
('\x1C', "␜"),
('\x1D', "␝"),
('\x1E', "␞"),
('\x1F', "␟"),
('\u{007F}', "␡"),
]
.into_iter()
.collect()
});
// generated using ucd-generate: ucd-generate general-category --include Format --chars ucd-16.0.0
pub const FORMAT: &'static [(char, char)] = &[
('\u{ad}', '\u{ad}'),
('\u{600}', '\u{605}'),
('\u{61c}', '\u{61c}'),
('\u{6dd}', '\u{6dd}'),
('\u{70f}', '\u{70f}'),
('\u{890}', '\u{891}'),
('\u{8e2}', '\u{8e2}'),
('\u{180e}', '\u{180e}'),
('\u{200b}', '\u{200f}'),
('\u{202a}', '\u{202e}'),
('\u{2060}', '\u{2064}'),
('\u{2066}', '\u{206f}'),
('\u{feff}', '\u{feff}'),
('\u{fff9}', '\u{fffb}'),
('\u{110bd}', '\u{110bd}'),
('\u{110cd}', '\u{110cd}'),
('\u{13430}', '\u{1343f}'),
('\u{1bca0}', '\u{1bca3}'),
('\u{1d173}', '\u{1d17a}'),
('\u{e0001}', '\u{e0001}'),
('\u{e0020}', '\u{e007f}'),
];
// hand-made, based on https://invisible-characters.com (excluding Cf)
pub const OTHER: &'static [(char, char)] = &[
('\u{034f}', '\u{034f}'),
('\u{115F}', '\u{1160}'),
('\u{17b4}', '\u{17b5}'),
('\u{180b}', '\u{180d}'),
('\u{2800}', '\u{2800}'),
('\u{3164}', '\u{3164}'),
('\u{fe00}', '\u{fe0d}'),
('\u{ffa0}', '\u{ffa0}'),
('\u{fffc}', '\u{fffc}'),
('\u{e0100}', '\u{e01ef}'),
];
// a subset of FORMAT/OTHER that may appear within glyphs
const PRESERVE: &'static [(char, char)] = &[
('\u{034f}', '\u{034f}'),
('\u{200d}', '\u{200d}'),
('\u{17b4}', '\u{17b5}'),
('\u{180b}', '\u{180d}'),
('\u{e0061}', '\u{e007a}'),
('\u{e007f}', '\u{e007f}'),
];
fn contains(c: char, list: &[(char, char)]) -> bool {
for (start, end) in list {
if c < *start {
return false;
}
if c <= *end {
return true;
}
}
false
}
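
As a usage sketch of the classification above, the following relies only on the `is_invisible` and `replacement` functions defined in this file; the specific characters are just examples.

```rust
// Usage sketch: classify a few characters the way the editor would.
fn describe(c: char) {
    if !is_invisible(c) {
        println!("U+{:04X} is rendered normally", c as u32);
    } else {
        match replacement(c) {
            // Invisible and substituted with a visible placeholder.
            Some(text) => println!("U+{:04X} is invisible, drawn as {text:?}", c as u32),
            // Invisible but preserved (e.g. joiners that affect glyph shaping).
            None => println!("U+{:04X} is invisible but kept as-is", c as u32),
        }
    }
}

fn main() {
    describe('a');        // rendered normally
    describe('\u{200B}'); // ZERO WIDTH SPACE: invisible, replaced
    describe('\u{200D}'); // ZERO WIDTH JOINER: invisible, preserved
}
```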

View File

@@ -1,6 +1,6 @@
use super::{
char_map::{self, CharPoint, CharSnapshot, TabEdit},
fold_map::FoldBufferRows,
tab_map::{self, TabEdit, TabPoint, TabSnapshot},
Highlights,
};
use gpui::{AppContext, Context, Font, LineWrapper, Model, ModelContext, Pixels, Task};
@@ -12,7 +12,7 @@ use std::{cmp, collections::VecDeque, mem, ops::Range, time::Duration};
use sum_tree::{Bias, Cursor, SumTree};
use text::Patch;
pub use super::tab_map::TextSummary;
pub use super::char_map::TextSummary;
pub type WrapEdit = text::Edit<u32>;
/// Handles soft wrapping of text.
@@ -20,7 +20,7 @@ pub type WrapEdit = text::Edit<u32>;
/// See the [`display_map` module documentation](crate::display_map) for more information.
pub struct WrapMap {
snapshot: WrapSnapshot,
pending_edits: VecDeque<(TabSnapshot, Vec<TabEdit>)>,
pending_edits: VecDeque<(CharSnapshot, Vec<TabEdit>)>,
interpolated_edits: Patch<u32>,
edits_since_sync: Patch<u32>,
wrap_width: Option<Pixels>,
@@ -30,7 +30,7 @@ pub struct WrapMap {
#[derive(Clone)]
pub struct WrapSnapshot {
tab_snapshot: TabSnapshot,
char_snapshot: CharSnapshot,
transforms: SumTree<Transform>,
interpolated: bool,
}
@@ -51,11 +51,11 @@ struct TransformSummary {
pub struct WrapPoint(pub Point);
pub struct WrapChunks<'a> {
input_chunks: tab_map::TabChunks<'a>,
input_chunks: char_map::TabChunks<'a>,
input_chunk: Chunk<'a>,
output_position: WrapPoint,
max_output_row: u32,
transforms: Cursor<'a, Transform, (WrapPoint, TabPoint)>,
transforms: Cursor<'a, Transform, (WrapPoint, CharPoint)>,
}
#[derive(Clone)]
@@ -65,12 +65,12 @@ pub struct WrapBufferRows<'a> {
output_row: u32,
soft_wrapped: bool,
max_output_row: u32,
transforms: Cursor<'a, Transform, (WrapPoint, TabPoint)>,
transforms: Cursor<'a, Transform, (WrapPoint, CharPoint)>,
}
impl WrapMap {
pub fn new(
tab_snapshot: TabSnapshot,
char_snapshot: CharSnapshot,
font: Font,
font_size: Pixels,
wrap_width: Option<Pixels>,
@@ -83,7 +83,7 @@ impl WrapMap {
pending_edits: Default::default(),
interpolated_edits: Default::default(),
edits_since_sync: Default::default(),
snapshot: WrapSnapshot::new(tab_snapshot),
snapshot: WrapSnapshot::new(char_snapshot),
background_task: None,
};
this.set_wrap_width(wrap_width, cx);
@@ -101,17 +101,17 @@ impl WrapMap {
pub fn sync(
&mut self,
tab_snapshot: TabSnapshot,
char_snapshot: CharSnapshot,
edits: Vec<TabEdit>,
cx: &mut ModelContext<Self>,
) -> (WrapSnapshot, Patch<u32>) {
if self.wrap_width.is_some() {
self.pending_edits.push_back((tab_snapshot, edits));
self.pending_edits.push_back((char_snapshot, edits));
self.flush_edits(cx);
} else {
self.edits_since_sync = self
.edits_since_sync
.compose(self.snapshot.interpolate(tab_snapshot, &edits));
.compose(self.snapshot.interpolate(char_snapshot, &edits));
self.snapshot.interpolated = false;
}
@@ -161,11 +161,11 @@ impl WrapMap {
let (font, font_size) = self.font_with_size.clone();
let task = cx.background_executor().spawn(async move {
let mut line_wrapper = text_system.line_wrapper(font, font_size);
let tab_snapshot = new_snapshot.tab_snapshot.clone();
let range = TabPoint::zero()..tab_snapshot.max_point();
let char_snapshot = new_snapshot.char_snapshot.clone();
let range = CharPoint::zero()..char_snapshot.max_point();
let edits = new_snapshot
.update(
tab_snapshot,
char_snapshot,
&[TabEdit {
old: range.clone(),
new: range.clone(),
@@ -205,7 +205,7 @@ impl WrapMap {
} else {
let old_rows = self.snapshot.transforms.summary().output.lines.row + 1;
self.snapshot.transforms = SumTree::default();
let summary = self.snapshot.tab_snapshot.text_summary();
let summary = self.snapshot.char_snapshot.text_summary();
if !summary.lines.is_zero() {
self.snapshot
.transforms
@@ -223,8 +223,8 @@ impl WrapMap {
fn flush_edits(&mut self, cx: &mut ModelContext<Self>) {
if !self.snapshot.interpolated {
let mut to_remove_len = 0;
for (tab_snapshot, _) in &self.pending_edits {
if tab_snapshot.version <= self.snapshot.tab_snapshot.version {
for (char_snapshot, _) in &self.pending_edits {
if char_snapshot.version <= self.snapshot.char_snapshot.version {
to_remove_len += 1;
} else {
break;
@@ -246,9 +246,9 @@ impl WrapMap {
let update_task = cx.background_executor().spawn(async move {
let mut edits = Patch::default();
let mut line_wrapper = text_system.line_wrapper(font, font_size);
for (tab_snapshot, tab_edits) in pending_edits {
for (char_snapshot, tab_edits) in pending_edits {
let wrap_edits = snapshot
.update(tab_snapshot, &tab_edits, wrap_width, &mut line_wrapper)
.update(char_snapshot, &tab_edits, wrap_width, &mut line_wrapper)
.await;
edits = edits.compose(&wrap_edits);
}
@@ -285,11 +285,11 @@ impl WrapMap {
let was_interpolated = self.snapshot.interpolated;
let mut to_remove_len = 0;
for (tab_snapshot, edits) in &self.pending_edits {
if tab_snapshot.version <= self.snapshot.tab_snapshot.version {
for (char_snapshot, edits) in &self.pending_edits {
if char_snapshot.version <= self.snapshot.char_snapshot.version {
to_remove_len += 1;
} else {
let interpolated_edits = self.snapshot.interpolate(tab_snapshot.clone(), edits);
let interpolated_edits = self.snapshot.interpolate(char_snapshot.clone(), edits);
self.edits_since_sync = self.edits_since_sync.compose(&interpolated_edits);
self.interpolated_edits = self.interpolated_edits.compose(&interpolated_edits);
}
@@ -302,45 +302,49 @@ impl WrapMap {
}
impl WrapSnapshot {
fn new(tab_snapshot: TabSnapshot) -> Self {
fn new(char_snapshot: CharSnapshot) -> Self {
let mut transforms = SumTree::default();
let extent = tab_snapshot.text_summary();
let extent = char_snapshot.text_summary();
if !extent.lines.is_zero() {
transforms.push(Transform::isomorphic(extent), &());
}
Self {
transforms,
tab_snapshot,
char_snapshot,
interpolated: true,
}
}
pub fn buffer_snapshot(&self) -> &MultiBufferSnapshot {
self.tab_snapshot.buffer_snapshot()
self.char_snapshot.buffer_snapshot()
}
fn interpolate(&mut self, new_tab_snapshot: TabSnapshot, tab_edits: &[TabEdit]) -> Patch<u32> {
fn interpolate(
&mut self,
new_char_snapshot: CharSnapshot,
tab_edits: &[TabEdit],
) -> Patch<u32> {
let mut new_transforms;
if tab_edits.is_empty() {
new_transforms = self.transforms.clone();
} else {
let mut old_cursor = self.transforms.cursor::<TabPoint>(&());
let mut old_cursor = self.transforms.cursor::<CharPoint>(&());
let mut tab_edits_iter = tab_edits.iter().peekable();
new_transforms =
old_cursor.slice(&tab_edits_iter.peek().unwrap().old.start, Bias::Right, &());
while let Some(edit) = tab_edits_iter.next() {
if edit.new.start > TabPoint::from(new_transforms.summary().input.lines) {
let summary = new_tab_snapshot.text_summary_for_range(
TabPoint::from(new_transforms.summary().input.lines)..edit.new.start,
if edit.new.start > CharPoint::from(new_transforms.summary().input.lines) {
let summary = new_char_snapshot.text_summary_for_range(
CharPoint::from(new_transforms.summary().input.lines)..edit.new.start,
);
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
if !edit.new.is_empty() {
new_transforms.push_or_extend(Transform::isomorphic(
new_tab_snapshot.text_summary_for_range(edit.new.clone()),
new_char_snapshot.text_summary_for_range(edit.new.clone()),
));
}
@@ -349,7 +353,7 @@ impl WrapSnapshot {
if next_edit.old.start > old_cursor.end(&()) {
if old_cursor.end(&()) > edit.old.end {
let summary = self
.tab_snapshot
.char_snapshot
.text_summary_for_range(edit.old.end..old_cursor.end(&()));
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
@@ -363,7 +367,7 @@ impl WrapSnapshot {
} else {
if old_cursor.end(&()) > edit.old.end {
let summary = self
.tab_snapshot
.char_snapshot
.text_summary_for_range(edit.old.end..old_cursor.end(&()));
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
@@ -376,7 +380,7 @@ impl WrapSnapshot {
let old_snapshot = mem::replace(
self,
WrapSnapshot {
tab_snapshot: new_tab_snapshot,
char_snapshot: new_char_snapshot,
transforms: new_transforms,
interpolated: true,
},
@@ -387,7 +391,7 @@ impl WrapSnapshot {
async fn update(
&mut self,
new_tab_snapshot: TabSnapshot,
new_char_snapshot: CharSnapshot,
tab_edits: &[TabEdit],
wrap_width: Pixels,
line_wrapper: &mut LineWrapper,
@@ -424,27 +428,27 @@ impl WrapSnapshot {
new_transforms = self.transforms.clone();
} else {
let mut row_edits = row_edits.into_iter().peekable();
let mut old_cursor = self.transforms.cursor::<TabPoint>(&());
let mut old_cursor = self.transforms.cursor::<CharPoint>(&());
new_transforms = old_cursor.slice(
&TabPoint::new(row_edits.peek().unwrap().old_rows.start, 0),
&CharPoint::new(row_edits.peek().unwrap().old_rows.start, 0),
Bias::Right,
&(),
);
while let Some(edit) = row_edits.next() {
if edit.new_rows.start > new_transforms.summary().input.lines.row {
let summary = new_tab_snapshot.text_summary_for_range(
TabPoint(new_transforms.summary().input.lines)
..TabPoint::new(edit.new_rows.start, 0),
let summary = new_char_snapshot.text_summary_for_range(
CharPoint(new_transforms.summary().input.lines)
..CharPoint::new(edit.new_rows.start, 0),
);
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
let mut line = String::new();
let mut remaining = None;
let mut chunks = new_tab_snapshot.chunks(
TabPoint::new(edit.new_rows.start, 0)..new_tab_snapshot.max_point(),
let mut chunks = new_char_snapshot.chunks(
CharPoint::new(edit.new_rows.start, 0)..new_char_snapshot.max_point(),
false,
Highlights::default(),
);
@@ -491,19 +495,19 @@ impl WrapSnapshot {
}
new_transforms.extend(edit_transforms, &());
old_cursor.seek_forward(&TabPoint::new(edit.old_rows.end, 0), Bias::Right, &());
old_cursor.seek_forward(&CharPoint::new(edit.old_rows.end, 0), Bias::Right, &());
if let Some(next_edit) = row_edits.peek() {
if next_edit.old_rows.start > old_cursor.end(&()).row() {
if old_cursor.end(&()) > TabPoint::new(edit.old_rows.end, 0) {
let summary = self.tab_snapshot.text_summary_for_range(
TabPoint::new(edit.old_rows.end, 0)..old_cursor.end(&()),
if old_cursor.end(&()) > CharPoint::new(edit.old_rows.end, 0) {
let summary = self.char_snapshot.text_summary_for_range(
CharPoint::new(edit.old_rows.end, 0)..old_cursor.end(&()),
);
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
old_cursor.next(&());
new_transforms.append(
old_cursor.slice(
&TabPoint::new(next_edit.old_rows.start, 0),
&CharPoint::new(next_edit.old_rows.start, 0),
Bias::Right,
&(),
),
@@ -511,9 +515,9 @@ impl WrapSnapshot {
);
}
} else {
if old_cursor.end(&()) > TabPoint::new(edit.old_rows.end, 0) {
let summary = self.tab_snapshot.text_summary_for_range(
TabPoint::new(edit.old_rows.end, 0)..old_cursor.end(&()),
if old_cursor.end(&()) > CharPoint::new(edit.old_rows.end, 0) {
let summary = self.char_snapshot.text_summary_for_range(
CharPoint::new(edit.old_rows.end, 0)..old_cursor.end(&()),
);
new_transforms.push_or_extend(Transform::isomorphic(summary));
}
@@ -526,7 +530,7 @@ impl WrapSnapshot {
let old_snapshot = mem::replace(
self,
WrapSnapshot {
tab_snapshot: new_tab_snapshot,
char_snapshot: new_char_snapshot,
transforms: new_transforms,
interpolated: false,
},
@@ -579,17 +583,17 @@ impl WrapSnapshot {
) -> WrapChunks<'a> {
let output_start = WrapPoint::new(rows.start, 0);
let output_end = WrapPoint::new(rows.end, 0);
let mut transforms = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
let mut transforms = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
transforms.seek(&output_start, Bias::Right, &());
let mut input_start = TabPoint(transforms.start().1 .0);
let mut input_start = CharPoint(transforms.start().1 .0);
if transforms.item().map_or(false, |t| t.is_isomorphic()) {
input_start.0 += output_start.0 - transforms.start().0 .0;
}
let input_end = self
.to_tab_point(output_end)
.min(self.tab_snapshot.max_point());
.to_char_point(output_end)
.min(self.char_snapshot.max_point());
WrapChunks {
input_chunks: self.tab_snapshot.chunks(
input_chunks: self.char_snapshot.chunks(
input_start..input_end,
language_aware,
highlights,
@@ -606,7 +610,7 @@ impl WrapSnapshot {
}
pub fn line_len(&self, row: u32) -> u32 {
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
let mut cursor = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
cursor.seek(&WrapPoint::new(row + 1, 0), Bias::Left, &());
if cursor
.item()
@@ -614,7 +618,7 @@ impl WrapSnapshot {
{
let overshoot = row - cursor.start().0.row();
let tab_row = cursor.start().1.row() + overshoot;
let tab_line_len = self.tab_snapshot.line_len(tab_row);
let tab_line_len = self.char_snapshot.line_len(tab_row);
if overshoot == 0 {
cursor.start().0.column() + (tab_line_len - cursor.start().1.column())
} else {
@@ -642,14 +646,14 @@ impl WrapSnapshot {
}
pub fn buffer_rows(&self, start_row: u32) -> WrapBufferRows {
let mut transforms = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
let mut transforms = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
transforms.seek(&WrapPoint::new(start_row, 0), Bias::Left, &());
let mut input_row = transforms.start().1.row();
if transforms.item().map_or(false, |t| t.is_isomorphic()) {
input_row += start_row - transforms.start().0.row();
}
let soft_wrapped = transforms.item().map_or(false, |t| !t.is_isomorphic());
let mut input_buffer_rows = self.tab_snapshot.buffer_rows(input_row);
let mut input_buffer_rows = self.char_snapshot.buffer_rows(input_row);
let input_buffer_row = input_buffer_rows.next().unwrap();
WrapBufferRows {
transforms,
@@ -661,26 +665,26 @@ impl WrapSnapshot {
}
}
pub fn to_tab_point(&self, point: WrapPoint) -> TabPoint {
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
pub fn to_char_point(&self, point: WrapPoint) -> CharPoint {
let mut cursor = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
cursor.seek(&point, Bias::Right, &());
let mut tab_point = cursor.start().1 .0;
let mut char_point = cursor.start().1 .0;
if cursor.item().map_or(false, |t| t.is_isomorphic()) {
tab_point += point.0 - cursor.start().0 .0;
char_point += point.0 - cursor.start().0 .0;
}
TabPoint(tab_point)
CharPoint(char_point)
}
pub fn to_point(&self, point: WrapPoint, bias: Bias) -> Point {
self.tab_snapshot.to_point(self.to_tab_point(point), bias)
self.char_snapshot.to_point(self.to_char_point(point), bias)
}
pub fn make_wrap_point(&self, point: Point, bias: Bias) -> WrapPoint {
self.tab_point_to_wrap_point(self.tab_snapshot.make_tab_point(point, bias))
self.char_point_to_wrap_point(self.char_snapshot.make_char_point(point, bias))
}
pub fn tab_point_to_wrap_point(&self, point: TabPoint) -> WrapPoint {
let mut cursor = self.transforms.cursor::<(TabPoint, WrapPoint)>(&());
pub fn char_point_to_wrap_point(&self, point: CharPoint) -> WrapPoint {
let mut cursor = self.transforms.cursor::<(CharPoint, WrapPoint)>(&());
cursor.seek(&point, Bias::Right, &());
WrapPoint(cursor.start().1 .0 + (point.0 - cursor.start().0 .0))
}
@@ -695,7 +699,10 @@ impl WrapSnapshot {
}
}
self.tab_point_to_wrap_point(self.tab_snapshot.clip_point(self.to_tab_point(point), bias))
self.char_point_to_wrap_point(
self.char_snapshot
.clip_point(self.to_char_point(point), bias),
)
}
pub fn prev_row_boundary(&self, mut point: WrapPoint) -> u32 {
@@ -705,7 +712,7 @@ impl WrapSnapshot {
*point.column_mut() = 0;
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
let mut cursor = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
cursor.seek(&point, Bias::Right, &());
if cursor.item().is_none() {
cursor.prev(&());
@@ -725,7 +732,7 @@ impl WrapSnapshot {
pub fn next_row_boundary(&self, mut point: WrapPoint) -> Option<u32> {
point.0 += Point::new(1, 0);
let mut cursor = self.transforms.cursor::<(WrapPoint, TabPoint)>(&());
let mut cursor = self.transforms.cursor::<(WrapPoint, CharPoint)>(&());
cursor.seek(&point, Bias::Right, &());
while let Some(transform) = cursor.item() {
if transform.is_isomorphic() && cursor.start().1.column() == 0 {
@@ -742,8 +749,8 @@ impl WrapSnapshot {
#[cfg(test)]
{
assert_eq!(
TabPoint::from(self.transforms.summary().input.lines),
self.tab_snapshot.max_point()
CharPoint::from(self.transforms.summary().input.lines),
self.char_snapshot.max_point()
);
{
@@ -756,18 +763,18 @@ impl WrapSnapshot {
}
let text = language::Rope::from(self.text().as_str());
let mut input_buffer_rows = self.tab_snapshot.buffer_rows(0);
let mut input_buffer_rows = self.char_snapshot.buffer_rows(0);
let mut expected_buffer_rows = Vec::new();
let mut prev_tab_row = 0;
for display_row in 0..=self.max_point().row() {
let tab_point = self.to_tab_point(WrapPoint::new(display_row, 0));
if tab_point.row() == prev_tab_row && display_row != 0 {
let char_point = self.to_char_point(WrapPoint::new(display_row, 0));
if char_point.row() == prev_tab_row && display_row != 0 {
expected_buffer_rows.push(None);
} else {
expected_buffer_rows.push(input_buffer_rows.next().unwrap());
}
prev_tab_row = tab_point.row();
prev_tab_row = char_point.row();
assert_eq!(self.line_len(display_row), text.line_len(display_row));
}
@@ -831,13 +838,11 @@ impl<'a> Iterator for WrapChunks<'a> {
} else {
*self.output_position.column_mut() += char_len as u32;
}
if self.output_position >= transform_end {
self.transforms.next(&());
break;
}
}
let (prefix, suffix) = self.input_chunk.text.split_at(input_len);
self.input_chunk.text = suffix;
Some(Chunk {
@@ -992,7 +997,7 @@ impl sum_tree::Summary for TransformSummary {
}
}
impl<'a> sum_tree::Dimension<'a, TransformSummary> for TabPoint {
impl<'a> sum_tree::Dimension<'a, TransformSummary> for CharPoint {
fn zero(_cx: &()) -> Self {
Default::default()
}
@@ -1002,7 +1007,7 @@ impl<'a> sum_tree::Dimension<'a, TransformSummary> for TabPoint {
}
}
impl<'a> sum_tree::SeekTarget<'a, TransformSummary, TransformSummary> for TabPoint {
impl<'a> sum_tree::SeekTarget<'a, TransformSummary, TransformSummary> for CharPoint {
fn cmp(&self, cursor_location: &TransformSummary, _: &()) -> std::cmp::Ordering {
Ord::cmp(&self.0, &cursor_location.input.lines)
}
@@ -1050,7 +1055,7 @@ fn consolidate_wrap_edits(edits: Vec<WrapEdit>) -> Vec<WrapEdit> {
mod tests {
use super::*;
use crate::{
display_map::{fold_map::FoldMap, inlay_map::InlayMap, tab_map::TabMap},
display_map::{char_map::CharMap, fold_map::FoldMap, inlay_map::InlayMap},
MultiBuffer,
};
use gpui::{font, px, test::observe};
@@ -1102,9 +1107,9 @@ mod tests {
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (mut fold_map, fold_snapshot) = FoldMap::new(inlay_snapshot.clone());
log::info!("FoldMap text: {:?}", fold_snapshot.text());
let (mut tab_map, _) = TabMap::new(fold_snapshot.clone(), tab_size);
let tabs_snapshot = tab_map.set_max_expansion_column(32);
log::info!("TabMap text: {:?}", tabs_snapshot.text());
let (mut char_map, _) = CharMap::new(fold_snapshot.clone(), tab_size);
let tabs_snapshot = char_map.set_max_expansion_column(32);
log::info!("CharMap text: {:?}", tabs_snapshot.text());
let mut line_wrapper = text_system.line_wrapper(font.clone(), font_size);
let unwrapped_text = tabs_snapshot.text();
@@ -1150,7 +1155,7 @@ mod tests {
20..=39 => {
for (fold_snapshot, fold_edits) in fold_map.randomly_mutate(&mut rng) {
let (tabs_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, tab_size);
char_map.sync(fold_snapshot, fold_edits, tab_size);
let (mut snapshot, wrap_edits) =
wrap_map.update(cx, |map, cx| map.sync(tabs_snapshot, tab_edits, cx));
snapshot.check_invariants();
@@ -1163,7 +1168,7 @@ mod tests {
inlay_map.randomly_mutate(&mut next_inlay_id, &mut rng);
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
let (tabs_snapshot, tab_edits) =
tab_map.sync(fold_snapshot, fold_edits, tab_size);
char_map.sync(fold_snapshot, fold_edits, tab_size);
let (mut snapshot, wrap_edits) =
wrap_map.update(cx, |map, cx| map.sync(tabs_snapshot, tab_edits, cx));
snapshot.check_invariants();
@@ -1187,8 +1192,8 @@ mod tests {
log::info!("InlayMap text: {:?}", inlay_snapshot.text());
let (fold_snapshot, fold_edits) = fold_map.read(inlay_snapshot, inlay_edits);
log::info!("FoldMap text: {:?}", fold_snapshot.text());
let (tabs_snapshot, tab_edits) = tab_map.sync(fold_snapshot, fold_edits, tab_size);
log::info!("TabMap text: {:?}", tabs_snapshot.text());
let (tabs_snapshot, tab_edits) = char_map.sync(fold_snapshot, fold_edits, tab_size);
log::info!("CharMap text: {:?}", tabs_snapshot.text());
let unwrapped_text = tabs_snapshot.text();
let expected_text = wrap_text(&unwrapped_text, wrap_width, &mut line_wrapper);
@@ -1234,7 +1239,7 @@ mod tests {
if tab_size.get() == 1
|| !wrapped_snapshot
.tab_snapshot
.char_snapshot
.fold_snapshot
.text()
.contains('\t')

View File

@@ -76,9 +76,9 @@ use gpui::{
ClipboardItem, Context, DispatchPhase, ElementId, EventEmitter, FocusHandle, FocusOutEvent,
FocusableView, FontId, FontWeight, HighlightStyle, Hsla, InteractiveText, KeyContext,
ListSizingBehavior, Model, MouseButton, PaintQuad, ParentElement, Pixels, Render, SharedString,
Size, StrikethroughStyle, Styled, StyledText, Subscription, Task, TextStyle, UTF16Selection,
UnderlineStyle, UniformListScrollHandle, View, ViewContext, ViewInputHandler, VisualContext,
WeakFocusHandle, WeakView, WindowContext,
Size, StrikethroughStyle, Styled, StyledText, Subscription, Task, TextStyle,
TextStyleRefinement, UTF16Selection, UnderlineStyle, UniformListScrollHandle, View,
ViewContext, ViewInputHandler, VisualContext, WeakFocusHandle, WeakView, WindowContext,
};
use highlight_matching_bracket::refresh_matching_bracket_highlights;
use hover_popover::{hide_hover, HoverState};
@@ -546,6 +546,7 @@ pub struct Editor {
ime_transaction: Option<TransactionId>,
active_diagnostics: Option<ActiveDiagnosticGroup>,
soft_wrap_mode_override: Option<language_settings::SoftWrap>,
project: Option<Model<Project>>,
semantics_provider: Option<Rc<dyn SemanticsProvider>>,
completion_provider: Option<Box<dyn CompletionProvider>>,
@@ -615,6 +616,7 @@ pub struct Editor {
pixel_position_of_newest_cursor: Option<gpui::Point<Pixels>>,
gutter_dimensions: GutterDimensions,
style: Option<EditorStyle>,
text_style_refinement: Option<TextStyleRefinement>,
next_editor_action_id: EditorActionId,
editor_actions: Rc<RefCell<BTreeMap<EditorActionId, Box<dyn Fn(&mut ViewContext<Self>)>>>>,
use_autoclose: bool,
@@ -2062,6 +2064,7 @@ impl Editor {
next_scroll_position: NextScrollCursorCenterTopBottom::default(),
addons: HashMap::default(),
_scroll_cursor_center_top_bottom_task: Task::ready(()),
text_style_refinement: None,
};
this.tasks_update_task = Some(this.refresh_runnables(cx));
this._subscriptions.extend(project_subscriptions);
@@ -6257,28 +6260,6 @@ impl Editor {
}
}
fn apply_selected_diff_hunks(&mut self, _: &ApplyDiffHunk, cx: &mut ViewContext<Self>) {
let snapshot = self.buffer.read(cx).snapshot(cx);
let hunks = hunks_for_selections(&snapshot, &self.selections.disjoint_anchors());
let mut ranges_by_buffer = HashMap::default();
self.transact(cx, |editor, cx| {
for hunk in hunks {
if let Some(buffer) = editor.buffer.read(cx).buffer(hunk.buffer_id) {
ranges_by_buffer
.entry(buffer.clone())
.or_insert_with(Vec::new)
.push(hunk.buffer_range.to_offset(buffer.read(cx)));
}
}
for (buffer, ranges) in ranges_by_buffer {
buffer.update(cx, |buffer, cx| {
buffer.merge_into_base(ranges, cx);
});
}
});
}
pub fn open_active_item_in_terminal(&mut self, _: &OpenInTerminal, cx: &mut ViewContext<Self>) {
if let Some(working_directory) = self.active_excerpt(cx).and_then(|(_, buffer, _)| {
let project_path = buffer.read(cx).project_path(cx)?;
@@ -11180,7 +11161,12 @@ impl Editor {
cx.notify();
}
pub fn set_style(&mut self, style: EditorStyle, cx: &mut ViewContext<Self>) {
pub fn set_text_style_refinement(&mut self, style: TextStyleRefinement) {
self.text_style_refinement = Some(style);
}
/// Called by the Element so we know what style we were most recently rendered with.
pub(crate) fn set_style(&mut self, style: EditorStyle, cx: &mut ViewContext<Self>) {
let rem_size = cx.rem_size();
self.display_map.update(cx, |map, cx| {
map.set_font(
@@ -13676,7 +13662,7 @@ impl Render for Editor {
fn render<'a>(&mut self, cx: &mut ViewContext<'a, Self>) -> impl IntoElement {
let settings = ThemeSettings::get_global(cx);
let text_style = match self.mode {
let mut text_style = match self.mode {
EditorMode::SingleLine { .. } | EditorMode::AutoHeight { .. } => TextStyle {
color: cx.theme().colors().editor_foreground,
font_family: settings.ui_font.family.clone(),
@@ -13698,6 +13684,9 @@ impl Render for Editor {
..Default::default()
},
};
if let Some(text_style_refinement) = &self.text_style_refinement {
text_style.refine(text_style_refinement)
}
let background = match self.mode {
EditorMode::SingleLine { .. } => cx.theme().system().transparent,

View File

@@ -68,6 +68,7 @@ use sum_tree::Bias;
use theme::{ActiveTheme, Appearance, PlayerColor};
use ui::prelude::*;
use ui::{h_flex, ButtonLike, ButtonStyle, ContextMenu, Tooltip};
use unicode_segmentation::UnicodeSegmentation;
use util::RangeExt;
use util::ResultExt;
use workspace::{item::Item, Workspace};
@@ -1025,23 +1026,21 @@ impl EditorElement {
}
let block_text = if let CursorShape::Block = selection.cursor_shape {
snapshot
.display_chars_at(cursor_position)
.next()
.grapheme_at(cursor_position)
.or_else(|| {
if cursor_column == 0 {
snapshot
.placeholder_text()
.and_then(|s| s.chars().next())
.map(|c| (c, cursor_position))
snapshot.placeholder_text().and_then(|s| {
s.graphemes(true).next().map(|s| s.to_owned())
})
} else {
None
}
})
.and_then(|(character, _)| {
let text = if character == '\n' {
.and_then(|grapheme| {
let text = if grapheme == "\n" {
SharedString::from(" ")
} else {
SharedString::from(character.to_string())
SharedString::from(grapheme)
};
let len = text.len();

View File

@@ -1,6 +1,7 @@
use crate::{
display_map::{InlayOffset, ToDisplayPoint},
hover_links::{InlayHighlight, RangeInEditor},
is_invisible,
scroll::ScrollAmount,
Anchor, AnchorRangeExt, DisplayPoint, DisplayRow, Editor, EditorSettings, EditorSnapshot,
Hover, RangeToAnchorExt,
@@ -11,7 +12,7 @@ use gpui::{
StyleRefinement, Styled, Task, TextStyleRefinement, View, ViewContext,
};
use itertools::Itertools;
use language::{DiagnosticEntry, Language, LanguageRegistry};
use language::{Diagnostic, DiagnosticEntry, Language, LanguageRegistry};
use lsp::DiagnosticSeverity;
use markdown::{Markdown, MarkdownStyle};
use multi_buffer::ToOffset;
@@ -199,7 +200,6 @@ fn show_hover(
if editor.pending_rename.is_some() {
return None;
}
let snapshot = editor.snapshot(cx);
let (buffer, buffer_position) = editor
@@ -259,7 +259,7 @@ fn show_hover(
}
// If there's a diagnostic, assign it on the hover state and notify
let local_diagnostic = snapshot
let mut local_diagnostic = snapshot
.buffer_snapshot
.diagnostics_in_range::<_, usize>(anchor..anchor, false)
// Find the entry with the most specific range
@@ -281,6 +281,42 @@ fn show_hover(
})
});
if let Some(invisible) = snapshot
.buffer_snapshot
.chars_at(anchor)
.next()
.filter(|&c| is_invisible(c))
{
let after = snapshot.buffer_snapshot.anchor_after(
anchor.to_offset(&snapshot.buffer_snapshot) + invisible.len_utf8(),
);
local_diagnostic = Some(DiagnosticEntry {
diagnostic: Diagnostic {
severity: DiagnosticSeverity::HINT,
message: format!("Unicode character U+{:02X}", invisible as u32),
..Default::default()
},
range: anchor..after,
})
} else if let Some(invisible) = snapshot
.buffer_snapshot
.reversed_chars_at(anchor)
.next()
.filter(|&c| is_invisible(c))
{
let before = snapshot.buffer_snapshot.anchor_before(
anchor.to_offset(&snapshot.buffer_snapshot) - invisible.len_utf8(),
);
local_diagnostic = Some(DiagnosticEntry {
diagnostic: Diagnostic {
severity: DiagnosticSeverity::HINT,
message: format!("Unicode character U+{:02X}", invisible as u32),
..Default::default()
},
range: before..anchor,
})
}
let diagnostic_popover = if let Some(local_diagnostic) = local_diagnostic {
let text = match local_diagnostic.diagnostic.source {
Some(ref source) => {
@@ -288,7 +324,6 @@ fn show_hover(
}
None => local_diagnostic.diagnostic.message.clone(),
};
let mut border_color: Option<Hsla> = None;
let mut background_color: Option<Hsla> = None;
@@ -344,7 +379,6 @@ fn show_hover(
Markdown::new_text(text, markdown_style.clone(), None, cx, None)
})
.ok();
Some(DiagnosticPopover {
local_diagnostic,
primary_diagnostic,
@@ -432,7 +466,6 @@ fn show_hover(
cx.notify();
cx.refresh();
})?;
anyhow::Ok(())
}
.log_err()

View File

@@ -7,11 +7,13 @@ use multi_buffer::{
MultiBufferSnapshot, ToPoint,
};
use std::{ops::Range, sync::Arc};
use text::OffsetRangeExt;
use ui::{
prelude::*, ActiveTheme, ContextMenu, IconButtonShape, InteractiveElement, IntoElement,
ParentElement, PopoverMenu, Styled, Tooltip, ViewContext, VisualContext,
};
use util::RangeExt;
use workspace::Item;
use crate::{
editor_settings::CurrentLineHighlight, hunk_status, hunks_for_selections, ApplyDiffHunk,
@@ -327,7 +329,7 @@ impl Editor {
Some(())
}
fn apply_changes_in_range(
fn apply_diff_hunks_in_range(
&mut self,
range: Range<Anchor>,
cx: &mut ViewContext<'_, Editor>,
@@ -343,16 +345,54 @@ impl Editor {
branch_buffer.merge_into_base(vec![range], cx);
});
if let Some(project) = self.project.clone() {
self.save(true, project, cx).detach_and_log_err(cx);
}
None
}
pub(crate) fn apply_all_changes(&self, cx: &mut ViewContext<Self>) {
pub(crate) fn apply_all_diff_hunks(&mut self, cx: &mut ViewContext<Self>) {
let buffers = self.buffer.read(cx).all_buffers();
for branch_buffer in buffers {
branch_buffer.update(cx, |branch_buffer, cx| {
branch_buffer.merge_into_base(Vec::new(), cx);
});
}
if let Some(project) = self.project.clone() {
self.save(true, project, cx).detach_and_log_err(cx);
}
}
pub(crate) fn apply_selected_diff_hunks(
&mut self,
_: &ApplyDiffHunk,
cx: &mut ViewContext<Self>,
) {
let snapshot = self.buffer.read(cx).snapshot(cx);
let hunks = hunks_for_selections(&snapshot, &self.selections.disjoint_anchors());
let mut ranges_by_buffer = HashMap::default();
self.transact(cx, |editor, cx| {
for hunk in hunks {
if let Some(buffer) = editor.buffer.read(cx).buffer(hunk.buffer_id) {
ranges_by_buffer
.entry(buffer.clone())
.or_insert_with(Vec::new)
.push(hunk.buffer_range.to_offset(buffer.read(cx)));
}
}
for (buffer, ranges) in ranges_by_buffer {
buffer.update(cx, |buffer, cx| {
buffer.merge_into_base(ranges, cx);
});
}
});
if let Some(project) = self.project.clone() {
self.save(true, project, cx).detach_and_log_err(cx);
}
}
fn hunk_header_block(
@@ -418,7 +458,7 @@ impl Editor {
h_flex()
.px_6()
.size_full()
.justify_between()
.justify_end()
.child(
h_flex()
.gap_1()
@@ -548,11 +588,12 @@ impl Editor {
let hunk = hunk.clone();
move |_event, cx| {
editor.update(cx, |editor, cx| {
editor.apply_changes_in_range(
hunk.multi_buffer_range
.clone(),
cx,
);
editor
.apply_diff_hunks_in_range(
hunk.multi_buffer_range
.clone(),
cx,
);
});
}
}),

View File

@@ -720,6 +720,10 @@ impl Item for Editor {
) -> Task<Result<()>> {
self.report_editor_event("save", None, cx);
let buffers = self.buffer().clone().read(cx).all_buffers();
let buffers = buffers
.into_iter()
.map(|handle| handle.read(cx).diff_base_buffer().unwrap_or(handle.clone()))
.collect::<HashSet<_>>();
cx.spawn(|this, mut cx| async move {
if format {
this.update(&mut cx, |editor, cx| {
@@ -932,7 +936,7 @@ impl SerializableItem for Editor {
fn deserialize(
project: Model<Project>,
_workspace: WeakView<Workspace>,
workspace: WeakView<Workspace>,
workspace_id: workspace::WorkspaceId,
item_id: ItemId,
cx: &mut ViewContext<Pane>,
@@ -949,7 +953,7 @@ impl SerializableItem for Editor {
serialized_editor
} else {
SerializedEditor {
path: serialized_editor.path,
abs_path: serialized_editor.abs_path,
contents: None,
language: None,
mtime: None,
@@ -964,13 +968,13 @@ impl SerializableItem for Editor {
}
};
let buffer_task = match serialized_editor {
match serialized_editor {
SerializedEditor {
path: None,
abs_path: None,
contents: Some(contents),
language,
..
} => cx.spawn(|_, mut cx| {
} => cx.spawn(|pane, mut cx| {
let project = project.clone();
async move {
let language = if let Some(language_name) = language {
@@ -997,30 +1001,34 @@ impl SerializableItem for Editor {
buffer.set_text(contents, cx);
})?;
anyhow::Ok(buffer)
pane.update(&mut cx, |_, cx| {
cx.new_view(|cx| {
let mut editor = Editor::for_buffer(buffer, Some(project), cx);
editor.read_scroll_position_from_db(item_id, workspace_id, cx);
editor
})
})
}
}),
SerializedEditor {
path: Some(path),
abs_path: Some(abs_path),
contents,
mtime,
..
} => {
let project_item = project.update(cx, |project, cx| {
let (worktree, path) = project
.find_worktree(&path, cx)
.with_context(|| format!("No worktree for path: {path:?}"))?;
let (worktree, path) = project.find_worktree(&abs_path, cx)?;
let project_path = ProjectPath {
worktree_id: worktree.read(cx).id(),
path: path.into(),
};
Ok(project.open_path(project_path, cx))
Some(project.open_path(project_path, cx))
});
project_item
.map(|project_item| {
cx.spawn(|_, mut cx| async move {
match project_item {
Some(project_item) => {
cx.spawn(|pane, mut cx| async move {
let (_, project_item) = project_item.await?;
let buffer = project_item.downcast::<Buffer>().map_err(|_| {
anyhow!("Project item at stored path was not a buffer")
@@ -1047,26 +1055,36 @@ impl SerializableItem for Editor {
})?;
}
Ok(buffer)
pane.update(&mut cx, |_, cx| {
cx.new_view(|cx| {
let mut editor = Editor::for_buffer(buffer, Some(project), cx);
editor.read_scroll_position_from_db(item_id, workspace_id, cx);
editor
})
})
})
})
.unwrap_or_else(|error| Task::ready(Err(error)))
}
None => {
let open_by_abs_path = workspace.update(cx, |workspace, cx| {
workspace.open_abs_path(abs_path.clone(), false, cx)
});
cx.spawn(|_, mut cx| async move {
let editor = open_by_abs_path?.await?.downcast::<Editor>().with_context(|| format!("Failed to downcast to Editor after opening abs path {abs_path:?}"))?;
editor.update(&mut cx, |editor, cx| {
editor.read_scroll_position_from_db(item_id, workspace_id, cx);
})?;
Ok(editor)
})
}
}
}
_ => return Task::ready(Err(anyhow!("No path or contents found for buffer"))),
};
cx.spawn(|pane, mut cx| async move {
let buffer = buffer_task.await?;
pane.update(&mut cx, |_, cx| {
cx.new_view(|cx| {
let mut editor = Editor::for_buffer(buffer, Some(project), cx);
editor.read_scroll_position_from_db(item_id, workspace_id, cx);
editor
})
})
})
SerializedEditor {
abs_path: None,
contents: None,
..
} => Task::ready(Err(anyhow!("No path or contents found for buffer"))),
}
}
fn serialize(
@@ -1092,12 +1110,19 @@ impl SerializableItem for Editor {
let workspace_id = workspace.database_id()?;
let buffer = self.buffer().read(cx).as_singleton()?;
let path = buffer
.read(cx)
.file()
.map(|file| file.full_path(cx))
.and_then(|full_path| project.read(cx).find_project_path(&full_path, cx))
.and_then(|project_path| project.read(cx).absolute_path(&project_path, cx));
let abs_path = buffer.read(cx).file().and_then(|file| {
let worktree_id = file.worktree_id(cx);
project
.read(cx)
.worktree_for_id(worktree_id, cx)
.and_then(|worktree| worktree.read(cx).absolutize(&file.path()).ok())
.or_else(|| {
let full_path = file.full_path(cx);
let project_path = project.read(cx).find_project_path(&full_path, cx)?;
project.read(cx).absolute_path(&project_path, cx)
})
});
let is_dirty = buffer.read(cx).is_dirty();
let mtime = buffer.read(cx).saved_mtime();
@@ -1116,7 +1141,7 @@ impl SerializableItem for Editor {
};
let editor = SerializedEditor {
path,
abs_path,
contents,
language,
mtime,
@@ -1629,7 +1654,7 @@ mod tests {
let item_id = 1234 as ItemId;
let serialized_editor = SerializedEditor {
path: Some(PathBuf::from("/file.rs")),
abs_path: Some(PathBuf::from("/file.rs")),
contents: Some("fn main() {}".to_string()),
language: Some("Rust".to_string()),
mtime: Some(now),
@@ -1660,7 +1685,7 @@ mod tests {
let item_id = 5678 as ItemId;
let serialized_editor = SerializedEditor {
path: Some(PathBuf::from("/file.rs")),
abs_path: Some(PathBuf::from("/file.rs")),
contents: None,
language: None,
mtime: None,
@@ -1695,7 +1720,7 @@ mod tests {
let item_id = 9012 as ItemId;
let serialized_editor = SerializedEditor {
path: None,
abs_path: None,
contents: Some("hello".to_string()),
language: Some("Rust".to_string()),
mtime: None,
@@ -1733,7 +1758,7 @@ mod tests {
.checked_sub(std::time::Duration::from_secs(60 * 60 * 24))
.unwrap();
let serialized_editor = SerializedEditor {
path: Some(PathBuf::from("/file.rs")),
abs_path: Some(PathBuf::from("/file.rs")),
contents: Some("fn main() {}".to_string()),
language: Some("Rust".to_string()),
mtime: Some(old_mtime),

View File

@@ -11,7 +11,7 @@ use workspace::{ItemId, WorkspaceDb, WorkspaceId};
#[derive(Clone, Debug, PartialEq, Default)]
pub(crate) struct SerializedEditor {
pub(crate) path: Option<PathBuf>,
pub(crate) abs_path: Option<PathBuf>,
pub(crate) contents: Option<String>,
pub(crate) language: Option<String>,
pub(crate) mtime: Option<SystemTime>,
@@ -25,7 +25,7 @@ impl StaticColumnCount for SerializedEditor {
impl Bind for SerializedEditor {
fn bind(&self, statement: &Statement, start_index: i32) -> Result<i32> {
let start_index = statement.bind(&self.path, start_index)?;
let start_index = statement.bind(&self.abs_path, start_index)?;
let start_index = statement.bind(&self.contents, start_index)?;
let start_index = statement.bind(&self.language, start_index)?;
@@ -51,7 +51,8 @@ impl Bind for SerializedEditor {
impl Column for SerializedEditor {
fn column(statement: &mut Statement, start_index: i32) -> Result<(Self, i32)> {
let (path, start_index): (Option<PathBuf>, i32) = Column::column(statement, start_index)?;
let (abs_path, start_index): (Option<PathBuf>, i32) =
Column::column(statement, start_index)?;
let (contents, start_index): (Option<String>, i32) =
Column::column(statement, start_index)?;
let (language, start_index): (Option<String>, i32) =
@@ -66,7 +67,7 @@ impl Column for SerializedEditor {
.map(|(seconds, nanos)| UNIX_EPOCH + Duration::new(seconds as u64, nanos as u32));
let editor = Self {
path,
abs_path,
contents,
language,
mtime,
@@ -226,7 +227,7 @@ mod tests {
let workspace_id = workspace::WORKSPACE_DB.next_id().await.unwrap();
let serialized_editor = SerializedEditor {
path: Some(PathBuf::from("testing.txt")),
abs_path: Some(PathBuf::from("testing.txt")),
contents: None,
language: None,
mtime: None,
@@ -244,7 +245,7 @@ mod tests {
// Now update contents and language
let serialized_editor = SerializedEditor {
path: Some(PathBuf::from("testing.txt")),
abs_path: Some(PathBuf::from("testing.txt")),
contents: Some("Test".to_owned()),
language: Some("Go".to_owned()),
mtime: None,
@@ -262,7 +263,7 @@ mod tests {
// Now set all the fields to NULL
let serialized_editor = SerializedEditor {
path: None,
abs_path: None,
contents: None,
language: None,
mtime: None,
@@ -281,7 +282,7 @@ mod tests {
// Storing and retrieving mtime
let now = SystemTime::now();
let serialized_editor = SerializedEditor {
path: None,
abs_path: None,
contents: None,
language: None,
mtime: Some(now),

View File

@@ -298,6 +298,20 @@ impl Item for ProposedChangesEditor {
Item::set_nav_history(editor, nav_history, cx)
});
}
fn can_save(&self, cx: &AppContext) -> bool {
self.editor.read(cx).can_save(cx)
}
fn save(
&mut self,
format: bool,
project: Model<Project>,
cx: &mut ViewContext<Self>,
) -> Task<gpui::Result<()>> {
self.editor
.update(cx, |editor, cx| Item::save(editor, format, project, cx))
}
}
impl ProposedChangesEditorToolbar {
@@ -323,7 +337,7 @@ impl Render for ProposedChangesEditorToolbar {
if let Some(editor) = &editor {
editor.update(cx, |editor, cx| {
editor.editor.update(cx, |editor, cx| {
editor.apply_all_changes(cx);
editor.apply_all_diff_hunks(cx);
})
});
}

View File

@@ -3,6 +3,7 @@ use std::sync::{atomic::AtomicBool, Arc};
use anyhow::{anyhow, Result};
use assistant_slash_command::{
ArgumentCompletion, SlashCommand, SlashCommandOutput, SlashCommandOutputSection,
SlashCommandResult,
};
use futures::FutureExt;
use gpui::{Task, WeakView, WindowContext};
@@ -87,7 +88,7 @@ impl SlashCommand for ExtensionSlashCommand {
_workspace: WeakView<Workspace>,
delegate: Option<Arc<dyn LspAdapterDelegate>>,
cx: &mut WindowContext,
) -> Task<Result<SlashCommandOutput>> {
) -> Task<SlashCommandResult> {
let arguments = arguments.to_owned();
let output = cx.background_executor().spawn(async move {
self.extension
@@ -127,7 +128,8 @@ impl SlashCommand for ExtensionSlashCommand {
})
.collect(),
run_commands_in_text: false,
})
}
.to_event_stream())
})
}
}

View File

@@ -48,6 +48,7 @@ where
item_count,
item_to_measure_index: 0,
render_items: Box::new(render_range),
decorations: Vec::new(),
interactivity: Interactivity {
element_id: Some(id),
base_style: Box::new(base_style),
@@ -69,6 +70,7 @@ pub struct UniformList {
item_to_measure_index: usize,
render_items:
Box<dyn for<'a> Fn(Range<usize>, &'a mut WindowContext) -> SmallVec<[AnyElement; 64]>>,
decorations: Vec<Box<dyn UniformListDecoration>>,
interactivity: Interactivity,
scroll_handle: Option<UniformListScrollHandle>,
sizing_behavior: ListSizingBehavior,
@@ -78,6 +80,7 @@ pub struct UniformList {
/// Frame state used by the [UniformList].
pub struct UniformListFrameState {
items: SmallVec<[AnyElement; 32]>,
decorations: SmallVec<[AnyElement; 1]>,
}
/// A handle for controlling the scroll position of a uniform list.
@@ -185,6 +188,7 @@ impl Element for UniformList {
layout_id,
UniformListFrameState {
items: SmallVec::new(),
decorations: SmallVec::new(),
},
)
}
@@ -292,9 +296,10 @@ impl Element for UniformList {
..cmp::min(last_visible_element_ix, self.item_count);
let mut items = (self.render_items)(visible_range.clone(), cx);
let content_mask = ContentMask { bounds };
cx.with_content_mask(Some(content_mask), |cx| {
for (mut item, ix) in items.into_iter().zip(visible_range) {
for (mut item, ix) in items.into_iter().zip(visible_range.clone()) {
let item_origin = padded_bounds.origin
+ point(
if can_scroll_horizontally {
@@ -317,6 +322,34 @@ impl Element for UniformList {
item.prepaint_at(item_origin, cx);
frame_state.items.push(item);
}
let bounds = Bounds::new(
padded_bounds.origin
+ point(
if can_scroll_horizontally {
scroll_offset.x + padding.left
} else {
scroll_offset.x
},
scroll_offset.y + padding.top,
),
padded_bounds.size,
);
for decoration in &self.decorations {
let mut decoration = decoration.as_ref().compute(
visible_range.clone(),
bounds,
item_height,
cx,
);
let available_space = size(
AvailableSpace::Definite(bounds.size.width),
AvailableSpace::Definite(bounds.size.height),
);
decoration.layout_as_root(available_space, cx);
decoration.prepaint_at(bounds.origin, cx);
frame_state.decorations.push(decoration);
}
});
}
@@ -338,6 +371,9 @@ impl Element for UniformList {
for item in &mut request_layout.items {
item.paint(cx);
}
for decoration in &mut request_layout.decorations {
decoration.paint(cx);
}
})
}
}
@@ -350,6 +386,20 @@ impl IntoElement for UniformList {
}
}
/// A decoration for a [`UniformList`]. This can be used for various things,
/// such as rendering indent guides, or other visual effects.
pub trait UniformListDecoration {
/// Compute the decoration element, given the visible range of list items,
/// the bounds of the list, and the height of each item.
fn compute(
&self,
visible_range: Range<usize>,
bounds: Bounds<Pixels>,
item_height: Pixels,
cx: &mut WindowContext,
) -> AnyElement;
}
impl UniformList {
/// Selects a specific list item for measurement.
pub fn with_width_from_item(mut self, item_index: Option<usize>) -> Self {
@@ -382,6 +432,12 @@ impl UniformList {
self
}
/// Adds a decoration element to the list.
pub fn with_decoration(mut self, decoration: impl UniformListDecoration + 'static) -> Self {
self.decorations.push(Box::new(decoration));
self
}
fn measure_item(&self, list_width: Option<Pixels>, cx: &mut WindowContext) -> Size<Pixels> {
if self.item_count == 0 {
return Size::default();
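
To illustrate the new `UniformListDecoration` trait, a decoration simply turns the visible range, list bounds, and item height into one extra element that the list prepaints and paints alongside its items. A hedged sketch: `NoopDecoration` is hypothetical and not part of this change, while `div()` and `into_any_element()` are existing gpui helpers.

```rust
// Hypothetical decoration that contributes an empty element; a real decoration
// (like the project panel's indent guides) would compute geometry from
// visible_range, bounds, and item_height before building its element.
struct NoopDecoration;

impl UniformListDecoration for NoopDecoration {
    fn compute(
        &self,
        _visible_range: Range<usize>,
        _bounds: Bounds<Pixels>,
        _item_height: Pixels,
        _cx: &mut WindowContext,
    ) -> AnyElement {
        div().into_any_element()
    }
}

// Attached when building the list, e.g.:
// some_uniform_list.with_decoration(NoopDecoration)
```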

View File

@@ -146,7 +146,7 @@ impl Keystroke {
"space" => Some(" ".into()),
"tab" => Some("\t".into()),
"enter" => Some("\n".into()),
key if !is_printable_key(key) => None,
key if !is_printable_key(key) || key.is_empty() => None,
key => {
if self.modifiers.shift {
Some(key.to_uppercase())

View File

@@ -381,6 +381,11 @@ impl MacPlatform {
}
item.setSubmenu_(submenu);
item.setTitle_(ns_string(&name));
if name == "Services" {
let app: id = msg_send![APP_CLASS, sharedApplication];
app.setServicesMenu_(item);
}
item
}
}

View File

@@ -1,6 +1,7 @@
use crate::{
black, fill, point, px, size, Bounds, Hsla, LineLayout, Pixels, Point, Result, SharedString,
StrikethroughStyle, UnderlineStyle, WindowContext, WrapBoundary, WrappedLineLayout,
black, fill, point, px, size, Bounds, Half, Hsla, LineLayout, Pixels, Point, Result,
SharedString, StrikethroughStyle, UnderlineStyle, WindowContext, WrapBoundary,
WrappedLineLayout,
};
use derive_more::{Deref, DerefMut};
use smallvec::SmallVec;
@@ -129,8 +130,9 @@ fn paint_line(
let text_system = cx.text_system().clone();
let mut glyph_origin = origin;
let mut prev_glyph_position = Point::default();
let mut max_glyph_size = size(px(0.), px(0.));
for (run_ix, run) in layout.runs.iter().enumerate() {
let max_glyph_size = text_system.bounding_box(run.font_id, layout.font_size).size;
max_glyph_size = text_system.bounding_box(run.font_id, layout.font_size).size;
for (glyph_ix, glyph) in run.glyphs.iter().enumerate() {
glyph_origin.x += glyph.position.x - prev_glyph_position.x;
@@ -139,6 +141,9 @@ fn paint_line(
wraps.next();
if let Some((background_origin, background_color)) = current_background.as_mut()
{
if glyph_origin.x == background_origin.x {
background_origin.x -= max_glyph_size.width.half()
}
cx.paint_quad(fill(
Bounds {
origin: *background_origin,
@@ -150,6 +155,9 @@ fn paint_line(
background_origin.y += line_height;
}
if let Some((underline_origin, underline_style)) = current_underline.as_mut() {
if glyph_origin.x == underline_origin.x {
underline_origin.x -= max_glyph_size.width.half();
};
cx.paint_underline(
*underline_origin,
glyph_origin.x - underline_origin.x,
@@ -161,6 +169,9 @@ fn paint_line(
if let Some((strikethrough_origin, strikethrough_style)) =
current_strikethrough.as_mut()
{
if glyph_origin.x == strikethrough_origin.x {
strikethrough_origin.x -= max_glyph_size.width.half();
};
cx.paint_strikethrough(
*strikethrough_origin,
glyph_origin.x - strikethrough_origin.x,
@@ -179,7 +190,18 @@ fn paint_line(
let mut finished_underline: Option<(Point<Pixels>, UnderlineStyle)> = None;
let mut finished_strikethrough: Option<(Point<Pixels>, StrikethroughStyle)> = None;
if glyph.index >= run_end {
if let Some(style_run) = decoration_runs.next() {
let mut style_run = decoration_runs.next();
// ignore style runs that apply to a partial glyph
while let Some(run) = style_run {
if glyph.index < run_end + (run.len as usize) {
break;
}
run_end += run.len as usize;
style_run = decoration_runs.next();
}
if let Some(style_run) = style_run {
if let Some((_, background_color)) = &mut current_background {
if style_run.background_color.as_ref() != Some(background_color) {
finished_background = current_background.take();
@@ -240,10 +262,14 @@ fn paint_line(
}
if let Some((background_origin, background_color)) = finished_background {
let mut width = glyph_origin.x - background_origin.x;
if width == px(0.) {
width = px(5.)
};
cx.paint_quad(fill(
Bounds {
origin: background_origin,
size: size(glyph_origin.x - background_origin.x, line_height),
size: size(width, line_height),
},
background_color,
));
@@ -299,7 +325,10 @@ fn paint_line(
last_line_end_x -= glyph.position.x;
}
if let Some((background_origin, background_color)) = current_background.take() {
if let Some((mut background_origin, background_color)) = current_background.take() {
if last_line_end_x == background_origin.x {
background_origin.x -= max_glyph_size.width.half()
};
cx.paint_quad(fill(
Bounds {
origin: background_origin,
@@ -309,7 +338,10 @@ fn paint_line(
));
}
if let Some((underline_start, underline_style)) = current_underline.take() {
if let Some((mut underline_start, underline_style)) = current_underline.take() {
if last_line_end_x == underline_start.x {
underline_start.x -= max_glyph_size.width.half()
};
cx.paint_underline(
underline_start,
last_line_end_x - underline_start.x,
@@ -317,7 +349,10 @@ fn paint_line(
);
}
if let Some((strikethrough_start, strikethrough_style)) = current_strikethrough.take() {
if let Some((mut strikethrough_start, strikethrough_style)) = current_strikethrough.take() {
if last_line_end_x == strikethrough_start.x {
strikethrough_start.x -= max_glyph_size.width.half()
};
cx.paint_strikethrough(
strikethrough_start,
last_line_end_x - strikethrough_start.x,
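
The paint_line hunks above keep decorations visible when a highlight covers no horizontal space: the largest glyph bounding box seen so far is tracked across runs, a background/underline/strikethrough whose origin still equals the current glyph x is pulled back by half that width, and a finished background that still measures zero gets a small fixed width. A standalone sketch that collapses those two adjustments into one helper, using plain f32 instead of gpui's Pixels (the 5.0 fallback mirrors the hard-coded px(5.) above):

```
fn visible_decoration(start_x: f32, glyph_x: f32, max_glyph_width: f32) -> (f32, f32) {
    let mut start = start_x;
    if glyph_x == start {
        // Zero width so far: pull the origin back by half the widest glyph.
        start -= max_glyph_width / 2.0;
    }
    let mut width = glyph_x - start;
    if width == 0.0 {
        // Still degenerate (e.g. an empty line): fall back to a small fixed width.
        width = 5.0;
    }
    (start, width)
}

fn main() {
    // A highlight that starts exactly at the current glyph gets half a glyph of width.
    assert_eq!(visible_decoration(10.0, 10.0, 8.0), (6.0, 4.0));
    // A highlight that already spans some text is left untouched.
    assert_eq!(visible_decoration(10.0, 30.0, 8.0), (10.0, 20.0));
}
```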

View File

@@ -71,6 +71,7 @@ env_logger.workspace = true
gpui = { workspace = true, features = ["test-support"] }
indoc.workspace = true
lsp = { workspace = true, features = ["test-support"] }
pretty_assertions.workspace = true
rand.workspace = true
settings = { workspace = true, features = ["test-support"] }
text = { workspace = true, features = ["test-support"] }

View File

@@ -336,6 +336,8 @@ pub enum BufferEvent {
FileHandleChanged,
/// The buffer was reloaded.
Reloaded,
/// The buffer is in need of a reload
ReloadNeeded,
/// The buffer's diff_base changed.
DiffBaseChanged,
/// Buffer's excerpts for a certain diff base were recalculated.
@@ -440,7 +442,7 @@ struct AutoindentRequest {
is_block_mode: bool,
}
#[derive(Clone)]
#[derive(Debug, Clone)]
struct AutoindentRequestEntry {
/// A range of the buffer whose indentation should be adjusted.
range: Range<Anchor>,
@@ -499,6 +501,8 @@ pub struct Chunk<'a> {
pub is_unnecessary: bool,
/// Whether this chunk of text was originally a tab character.
pub is_tab: bool,
/// Whether this chunk of text is an invisible character.
pub is_invisible: bool,
/// An optional recipe for how the chunk should be presented.
pub renderer: Option<ChunkRenderer>,
}
@@ -1077,7 +1081,7 @@ impl Buffer {
file_changed = true;
if !self.is_dirty() {
self.reload(cx).close();
cx.emit(BufferEvent::ReloadNeeded);
}
}
}
@@ -1418,24 +1422,17 @@ impl Buffer {
yield_now().await;
}
// In block mode, only compute indentation suggestions for the first line
// of each insertion. Otherwise, compute suggestions for every inserted line.
let new_edited_row_ranges = contiguous_ranges(
row_ranges.iter().flat_map(|(range, _)| {
if request.is_block_mode {
range.start..range.start + 1
} else {
range.clone()
}
}),
max_rows_between_yields,
);
// Compute new suggestions for each line, but only include them in the result
// if they differ from the old suggestion for that line.
let mut language_indent_sizes = language_indent_sizes_by_new_row.iter().peekable();
let mut language_indent_size = IndentSize::default();
for new_edited_row_range in new_edited_row_ranges {
for (row_range, original_indent_column) in row_ranges {
let new_edited_row_range = if request.is_block_mode {
row_range.start..row_range.start + 1
} else {
row_range.clone()
};
let suggestions = snapshot
.suggest_autoindents(new_edited_row_range.clone())
.into_iter()
@@ -1469,22 +1466,9 @@ impl Buffer {
}
}
}
yield_now().await;
}
// For each block of inserted text, adjust the indentation of the remaining
// lines of the block by the same amount as the first line was adjusted.
if request.is_block_mode {
for (row_range, original_indent_column) in
row_ranges
.into_iter()
.filter_map(|(range, original_indent_column)| {
if range.len() > 1 {
Some((range, original_indent_column?))
} else {
None
}
})
if let (true, Some(original_indent_column)) =
(request.is_block_mode, original_indent_column)
{
let new_indent = indent_sizes
.get(&row_range.start)
@@ -1509,6 +1493,8 @@ impl Buffer {
}
}
}
yield_now().await;
}
}
@@ -4227,7 +4213,6 @@ impl<'a> Iterator for BufferChunks<'a> {
if self.range.start == self.chunks.offset() + chunk.len() {
self.chunks.next().unwrap();
}
Some(Chunk {
text: slice,
syntax_highlight_id: highlight_id,
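
The autoindent hunks above fold block mode into the main per-range loop: only the first row of each replaced range gets an indentation suggestion, and the delta between that suggestion and the range's original indent column is then applied to the rest of the block. A standalone sketch of that delta logic, using plain usize columns in place of Zed's IndentSize/Anchor types (names are illustrative):

```
fn block_mode_indents(
    line_indents: &[usize],        // indent column of each line of the pasted block, as authored
    original_indent_column: usize, // indent of the block's first line, as authored
    suggested_first_line: usize,   // indent suggested for the first line at its new location
) -> Vec<usize> {
    line_indents
        .iter()
        .map(|&col| (col + suggested_first_line).saturating_sub(original_indent_column))
        .collect()
}

fn main() {
    // A block authored flush-left whose first line now belongs at column 4:
    // every line of the block shifts right by the same 4 columns.
    assert_eq!(block_mode_indents(&[0, 4, 0], 0, 4), vec![4, 8, 4]);
}
```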

View File

@@ -1658,6 +1658,69 @@ fn test_autoindent_block_mode_without_original_indent_columns(cx: &mut AppContex
});
}
#[gpui::test]
fn test_autoindent_block_mode_multiple_adjacent_ranges(cx: &mut AppContext) {
init_settings(cx, |_| {});
cx.new_model(|cx| {
let (text, ranges_to_replace) = marked_text_ranges(
&"
mod numbers {
«fn one() {
1
}
»
«fn two() {
2
}
»
«fn three() {
3
}
»}
"
.unindent(),
false,
);
let mut buffer = Buffer::local(text, cx).with_language(Arc::new(rust_lang()), cx);
buffer.edit(
[
(ranges_to_replace[0].clone(), "fn one() {\n 101\n}\n"),
(ranges_to_replace[1].clone(), "fn two() {\n 102\n}\n"),
(ranges_to_replace[2].clone(), "fn three() {\n 103\n}\n"),
],
Some(AutoindentMode::Block {
original_indent_columns: vec![0, 0, 0],
}),
cx,
);
pretty_assertions::assert_eq!(
buffer.text(),
"
mod numbers {
fn one() {
101
}
fn two() {
102
}
fn three() {
103
}
}
"
.unindent()
);
buffer
});
}
#[gpui::test]
fn test_autoindent_language_without_indents_query(cx: &mut AppContext) {
init_settings(cx, |_| {});

View File

@@ -831,7 +831,7 @@ impl AllLanguageSettings {
let editorconfig_properties = location.and_then(|location| {
cx.global::<SettingsStore>()
.editorconfg_properties(location.worktree_id, location.path)
.editorconfig_properties(location.worktree_id, location.path)
});
if let Some(editorconfig_properties) = editorconfig_properties {
let mut settings = settings.clone();

View File

@@ -40,7 +40,7 @@ pub struct AnthropicSettings {
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]
pub struct AvailableModel {
/// The model's name in the Anthropic API. e.g. claude-3-5-sonnet-20240620
/// The model's name in the Anthropic API. e.g. claude-3-5-sonnet-latest, claude-3-opus-20240229, etc
pub name: String,
/// The model's name in Zed's UI, such as in the model selector dropdown menu in the assistant panel.
pub display_name: Option<String>,

View File

@@ -9,7 +9,8 @@ use gpui::{
};
use language::{LanguageServerId, LanguageServerName};
use lsp::{
notification::SetTrace, IoKind, LanguageServer, MessageType, SetTraceParams, TraceValue,
notification::SetTrace, IoKind, LanguageServer, MessageType, ServerCapabilities,
SetTraceParams, TraceValue,
};
use project::{search::SearchQuery, Project, WorktreeId};
use std::{borrow::Cow, sync::Arc};
@@ -107,6 +108,7 @@ struct LanguageServerState {
rpc_state: Option<LanguageServerRpcState>,
trace_level: TraceValue,
log_level: MessageType,
capabilities: ServerCapabilities,
io_logs_subscription: Option<lsp::Subscription>,
}
@@ -176,6 +178,7 @@ pub enum LogKind {
Trace,
#[default]
Logs,
Capabilities,
}
impl LogKind {
@@ -184,6 +187,7 @@ impl LogKind {
LogKind::Rpc => RPC_MESSAGES,
LogKind::Trace => SERVER_TRACE,
LogKind::Logs => SERVER_LOGS,
LogKind::Capabilities => SERVER_CAPABILITIES,
}
}
}
@@ -374,6 +378,7 @@ impl LogStore {
trace_level: TraceValue::Off,
log_level: MessageType::LOG,
io_logs_subscription: None,
capabilities: ServerCapabilities::default(),
}
});
@@ -384,7 +389,10 @@ impl LogStore {
server_state.worktree_id = Some(worktree_id);
}
if let Some(server) = server.filter(|_| server_state.io_logs_subscription.is_none()) {
if let Some(server) = server
.clone()
.filter(|_| server_state.io_logs_subscription.is_none())
{
let io_tx = self.io_tx.clone();
let server_id = server.server_id();
server_state.io_logs_subscription = Some(server.on_io(move |io_kind, message| {
@@ -393,6 +401,11 @@ impl LogStore {
.ok();
}));
}
if let Some(server) = server {
server_state.capabilities = server.capabilities();
}
Some(server_state)
}
@@ -477,6 +490,10 @@ impl LogStore {
Some(&self.language_servers.get(&server_id)?.trace_messages)
}
fn server_capabilities(&self, server_id: LanguageServerId) -> Option<&ServerCapabilities> {
Some(&self.language_servers.get(&server_id)?.capabilities)
}
fn server_ids_for_project<'a>(
&'a self,
lookup_project: &'a WeakModel<Project>,
@@ -602,6 +619,9 @@ impl LspLogView {
LogKind::Rpc => this.show_rpc_trace_for_server(server_id, cx),
LogKind::Trace => this.show_trace_for_server(server_id, cx),
LogKind::Logs => this.show_logs_for_server(server_id, cx),
LogKind::Capabilities => {
this.show_capabilities_for_server(server_id, cx)
}
}
} else {
this.current_server_id = None;
@@ -618,6 +638,7 @@ impl LspLogView {
LogKind::Rpc => this.show_rpc_trace_for_server(server_id, cx),
LogKind::Trace => this.show_trace_for_server(server_id, cx),
LogKind::Logs => this.show_logs_for_server(server_id, cx),
LogKind::Capabilities => this.show_capabilities_for_server(server_id, cx),
}
}
@@ -695,6 +716,33 @@ impl LspLogView {
(editor, vec![editor_subscription, search_subscription])
}
fn editor_for_capabilities(
capabilities: ServerCapabilities,
cx: &mut ViewContext<Self>,
) -> (View<Editor>, Vec<Subscription>) {
let editor = cx.new_view(|cx| {
let mut editor = Editor::multi_line(cx);
editor.set_text(serde_json::to_string_pretty(&capabilities).unwrap(), cx);
editor.move_to_end(&MoveToEnd, cx);
editor.set_read_only(true);
editor.set_show_inline_completions(Some(false), cx);
editor
});
let editor_subscription = cx.subscribe(
&editor,
|_, _, event: &EditorEvent, cx: &mut ViewContext<'_, LspLogView>| {
cx.emit(event.clone())
},
);
let search_subscription = cx.subscribe(
&editor,
|_, _, event: &SearchEvent, cx: &mut ViewContext<'_, LspLogView>| {
cx.emit(event.clone())
},
);
(editor, vec![editor_subscription, search_subscription])
}
pub(crate) fn menu_items<'a>(&'a self, cx: &'a AppContext) -> Option<Vec<LogMenuItem>> {
let log_store = self.log_store.read(cx);
@@ -881,6 +929,7 @@ impl LspLogView {
cx.notify();
}
}
fn update_trace_level(
&self,
server_id: LanguageServerId,
@@ -899,6 +948,25 @@ impl LspLogView {
.ok();
}
}
fn show_capabilities_for_server(
&mut self,
server_id: LanguageServerId,
cx: &mut ViewContext<Self>,
) {
let capabilities = self.log_store.read(cx).server_capabilities(server_id);
if let Some(capabilities) = capabilities {
self.current_server_id = Some(server_id);
self.active_entry_kind = LogKind::Capabilities;
let (editor, editor_subscriptions) =
Self::editor_for_capabilities(capabilities.clone(), cx);
self.editor = editor;
self.editor_subscriptions = editor_subscriptions;
cx.notify();
}
cx.focus(&self.focus_handle);
}
}
fn log_filter<T: Message>(line: &T, cmp: <T as Message>::Level) -> Option<&str> {
@@ -967,6 +1035,7 @@ impl Item for LspLogView {
LogKind::Rpc => new_view.show_rpc_trace_for_server(server_id, cx),
LogKind::Trace => new_view.show_trace_for_server(server_id, cx),
LogKind::Logs => new_view.show_logs_for_server(server_id, cx),
LogKind::Capabilities => new_view.show_capabilities_for_server(server_id, cx),
}
}
new_view
@@ -1168,6 +1237,13 @@ impl Render for LspLogToolbarItemView {
view.show_rpc_trace_for_server(row.server_id, cx);
}),
);
menu = menu.entry(
SERVER_CAPABILITIES,
None,
cx.handler_for(&log_view, move |view, cx| {
view.show_capabilities_for_server(row.server_id, cx);
}),
);
if server_selected && row.selected_entry == LogKind::Rpc {
let selected_ix = menu.select_last();
debug_assert_eq!(
@@ -1317,6 +1393,7 @@ impl Render for LspLogToolbarItemView {
const RPC_MESSAGES: &str = "RPC Messages";
const SERVER_LOGS: &str = "Server Logs";
const SERVER_TRACE: &str = "Server Trace";
const SERVER_CAPABILITIES: &str = "Server Capabilities";
impl Default for LspLogToolbarItemView {
fn default() -> Self {
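
The hunks above add a "Server Capabilities" entry to the LSP log panel: the capabilities cached per server are serialized with serde_json::to_string_pretty and shown in a read-only editor. A minimal sketch of just that serialization step, using serde_json::json! as a stand-in for the real ServerCapabilities struct:

```
// Assumes serde_json = "1" as a dependency.
fn main() {
    // Stand-in for the ServerCapabilities a language server reports.
    let capabilities = serde_json::json!({
        "hoverProvider": true,
        "completionProvider": { "triggerCharacters": [".", "::"] }
    });

    // The same call the log view uses before handing the text to a read-only editor.
    let pretty = serde_json::to_string_pretty(&capabilities).unwrap();
    println!("{pretty}");
}
```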

View File

@@ -4,7 +4,7 @@
declarator: (function_declarator
declarator: (identifier) @run
)
) @c-main
) @_c-main
(#eq? @run "main")
(#set! tag c-main)
)

View File

@@ -9,6 +9,6 @@
(string_fragment) @run
)
)
) @js-test
) @_js-test
(#set! tag js-test)
)

View File

@@ -7,7 +7,7 @@
(attribute (identifier) @_superclass)]
)
(#eq? @_superclass "TestCase")
) @python-unittest-class
) @_python-unittest-class
(#set! tag python-unittest-class)
)
@@ -24,7 +24,7 @@
(function_definition
name: (identifier) @run @_unittest_method_name
(#match? @_unittest_method_name "^test.*")
) @python-unittest-method
) @_python-unittest-method
(#set! tag python-unittest-method)
)
)

View File

@@ -3,7 +3,7 @@
(mod_item
name: (_) @run
(#eq? @run "tests")
) @rust-mod-test
)
(#set! tag rust-mod-test)
)
@@ -14,14 +14,14 @@
(scoped_identifier (identifier) @_attribute)
])
(#match? @_attribute "test")
) @start
) @_start
.
(attribute_item) *
.
(function_item
name: (_) @run
body: _
) @end
) @_end
)
(#set! tag rust-test)
)

View File

@@ -9,6 +9,6 @@
(string_fragment) @run
)
)
) @ts-test
) @_ts-test
(#set! tag ts-test)
)
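
The runnables query changes above rename captures that exist only to anchor predicates (for example @c-main, @start, @ts-test) so they carry a leading underscore, the tree-sitter convention for captures that consumers should treat as internal. A small, generic sketch of how such captures are typically filtered on the consuming side (an illustration, not Zed's actual runnables code):

```
fn public_captures<'a>(capture_names: &[&'a str]) -> Vec<&'a str> {
    // Keep only the capture names a consumer should act on, skipping the
    // underscore-prefixed captures that exist purely to drive predicates.
    capture_names
        .iter()
        .copied()
        .filter(|name| !name.starts_with('_'))
        .collect()
}

fn main() {
    let names = ["run", "_rust-mod-test", "_start", "_end"];
    assert_eq!(public_captures(&names), vec!["run"]);
}
```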

Some files were not shown because too many files have changed in this diff