Compare commits


71 Commits

Author SHA1 Message Date
Ben Brandt
a2ce038352 eval: retry in more scenarios 2025-07-31 11:21:05 +02:00
Joseph T. Lyons
47af878ebb Do not sort settings profiles (#35389)
After playing with this for a bit, I realize it does not feel good to
not have control over the order of profiles. I find myself wanting to
group similar profiles together and not being able to.

Release Notes:

- N/A
2025-07-31 07:34:35 +00:00
Danilo Leal
5488398986 onboarding: Refine page and component designs (#35387)
Includes adding new variants to the Dropdown and Numeric Stepper
components.

Release Notes:

- N/A
2025-07-31 05:32:18 +00:00
Marshall Bowers
b1a7993544 cloud_api_types: Add more data to the GetAuthenticatedUserResponse (#35384)
This PR adds more data to the `GetAuthenticatedUserResponse`.

We now return more information about the authenticated user, as well as
their plan information.

Release Notes:

- N/A
2025-07-30 23:38:51 -04:00
Marshall Bowers
b90fd4287f client: Don't fetch the authenticated user once we have them (#35385)
This PR makes it so we don't keep fetching the authenticated user once
we have them.

Release Notes:

- N/A
2025-07-31 03:37:02 +00:00
Ben Kunkle
e1e2775b80 docs: Run lychee link check on generated docs output (#35381)
Closes #ISSUE

Following #35310, this PR makes it so the lychee link check is run on
the md files before building the docs to catch basic errors, and then
on the html output after building to catch generation errors, including
regressions like the one #35380 fixes.

Release Notes:

- N/A
2025-07-31 02:01:40 +00:00
Joseph T. Lyons
ed104ec5e0 Ensure settings are being adjusted via settings profile selector (#35382)
This PR just pins down the behavior of the settings profile selector by
checking a single setting, `buffer_font_size`, as options in the
selector are changed / selected.

Release Notes:

- N/A
2025-07-31 01:52:02 +00:00
Kainoa Kanter
67a491df50 Use outlined bolt icon for the LSP tool (#35373)
| Before | After |
|--------|--------|
| <img width="266" height="67" alt="image"
src="https://github.com/user-attachments/assets/bbfc75b6-6747-4eb1-ab94-ab098eba5335"
/> | <img width="266" height="67" alt="image"
src="https://github.com/user-attachments/assets/4631be9d-3d5e-4eb6-bf2f-596403fdf014"
/> |

Release Notes:

- Changed the icon of the language servers entry in the status bar.
2025-07-30 21:37:10 -04:00
Marshall Bowers
f003036aec docs: Pin mdbook to v0.4.40 (#35380)
This PR pins `mdbook` to v0.4.40 to fix an issue with sidebar links
having some of their path segments duplicated (e.g.,
`http://localhost:3000/extensions/extensions/developing-extensions.html`).

For reference:

-
https://zed-industries.slack.com/archives/C04S5TU0RSN/p1745439470378339?thread_ts=1745428671.190059&cid=C04S5TU0RSN
-
https://zed-industries.slack.com/archives/C04S5TU0RSN/p1753922478290399

Release Notes:

- N/A
2025-07-31 01:34:26 +00:00
Marshall Bowers
fbc784d323 Use the user from the CloudUserStore to drive the user menu (#35375)
This PR updates the user menu in the title bar to base the "signed in"
state on the user in the `CloudUserStore` rather than the `UserStore`.

This makes it possible to be signed-in—at least, as far as the user menu
is concerned—even when disconnected from Collab.

Release Notes:

- N/A
2025-07-30 20:31:22 -04:00
Piotr Osiewicz
296bb66b65 chore: Move a few more tasks into background_spawn (#35374)
Release Notes:

- N/A
2025-07-30 23:56:47 +00:00
Marshall Bowers
bb1a7ccbba client: Add CloudUserStore (#35370)
This PR adds a new `CloudUserStore` for storing information about the
user retrieved from Cloud instead of Collab.

Release Notes:

- N/A
2025-07-30 18:43:10 -04:00
Marshall Bowers
289f420504 Sort crate members in Cargo.toml (#35371)
This PR sorts the crate members in the `Cargo.toml` file, as they had
gotten unsorted.

Release Notes:

- N/A
2025-07-30 22:35:17 +00:00
张小白
15ad986329 windows: Port to DirectX 11 (#34374)
Closes #16713
Closes #19739
Closes #33191
Closes #26692
Closes #17374
Closes #35077
Closes https://github.com/zed-industries/zed/issues/35205
Closes https://github.com/zed-industries/zed/issues/35262


Compared to the current Vulkan implementation, this PR brings several
improvements:

- Fewer weird bugs
- Better hardware compatibility
- VSync support
- More accurate colors
- Lower memory usage
- Graceful handling of device loss

---

**TODO:**

- [x] Don’t use AGS binaries directly
- [ ] The message loop is using too much CPU when the app is idle
- [x] There’s a
[bug](https://github.com/zed-industries/zed/issues/33191#issuecomment-3109306630)
in how `Path` is being rendered.

---

Release Notes:

- N/A

---------

Co-authored-by: Kate <kate@zed.dev>
Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-07-30 15:27:58 -07:00
Finn Evers
0d9715325c docs: Add section about terminal contrast adjustments (#35369)
Closes #35146

This change adds documentation for the `terminal.minimum_contrast`
setting to the docs as we've had a lot of reports regarding the contrast
adjustments, yet are missing proper documentation (aside from that in
the `defaults.json`) for it.

Release Notes:

- N/A
2025-07-31 00:19:56 +02:00
Joseph T. Lyons
5ef5f3c5ca Introduce settings profiles (#35339)
Settings Profiles

- [X] Allow profiles to be defined, where each profile can be any of
Zed's settings
    - [X] Autocompletion of all settings
    - [X] Errors on invalid keys
- [X] Action brings up modal that shows user-defined profiles
- [X] Alphabetize profiles
- [X] Ability to filter down via keyboard, and navigate via arrow up and
down
- [X] Auto select Disabled option by default (first in list, after
alphabetizing user-defined profiles)
- [X] Automatically select active profile on next picker summoning
- [X] Persist settings until toggled off
- [X] Show live preview as you select from the profile picker
- [X] Tweaking a setting, while in a profile, updates the profile live
- [X] Make sure actions that live update Zed, such as `cmd-0`, `cmd-+`,
and `cmd--`, work while in a profile
- [X] Add a test to track state

Release Notes:

- Added the ability to configure settings profiles, via the "profiles"
key. Example:

```json
{
  "profiles": {
    "Streaming": {
      "agent_font_size": 20,
      "buffer_font_size": 20,
      "theme": "One Light",
      "ui_font_size": 20
    }
  }
}
```

To set a profile, use `settings profile selector: toggle`
2025-07-30 21:48:24 +00:00
Anthony Eid
2d4afd2119 Polish onboarding page (#35367)
- Added borders to the numeric stepper.
- Changed the hover mouse style for SwitchField.
- Made the edit page toggle buttons more responsive.

Release Notes:

- N/A
2025-07-30 20:35:21 +00:00
Ben Kunkle
afcb8f2a3f onboarding: Wire up settings import (#35366)
Closes #ISSUE

Release Notes:

- N/A
2025-07-30 20:09:11 +00:00
Smit Barmase
cdce3b3620 linux: Fix caps lock not working consistently for certain X11 systems (#35361)
Closes #35316

Bug in https://github.com/zed-industries/zed/pull/34514

Turns out you are not supposed to call `update_key` for modifiers on
`KeyPress`/`KeyRelease`, as modifiers are already updated in
`XkbStateNotify` events. Not sure why this only causes issues on a few
systems and works on others.

Tested on Ubuntu 24.04.2 LTS (initial bug) and Kubuntu 25.04 (worked
fine before too).

Release Notes:

- Fixed an issue where caps lock stopped working consistently on some
Linux X11 systems.
2025-07-31 00:56:36 +05:30
Marshall Bowers
bc6bb42745 Add cloud_api_client and cloud_api_types crates (#35357)
This PR adds two new crates for interacting with Cloud:

- `cloud_api_client` - The client that will be used to talk to Cloud.
- `cloud_api_types` - The types for the Cloud API that are shared
between Zed and Cloud.

Release Notes:

- N/A
2025-07-30 18:57:51 +00:00
Marshall Bowers
7695c4b82e collab: Temporarily add back GET /user endpoint for local development (#35358)
This PR temporarily adds back the `GET /user` endpoint to Collab since
we're still using it for local development.

Will remove it again once we update the local development process to
leverage Cloud.

Release Notes:

- N/A
2025-07-30 18:54:44 +00:00
Smit Barmase
794ade8b6d ui_prompt: Fix prompt dialog is hard to see on large screen (#35348)
Closes #18516

Release Notes:

- Improved visibility of prompt dialog on Linux by dimming the
background.
2025-07-30 23:03:53 +05:30
Smit Barmase
f4bd524d7f ui_prompt: Fix copy version number from About Zed (#35346)
Closes #29361

Release Notes:

- Fixed the version number not being selectable in the About Zed prompt on Linux.
2025-07-30 22:51:59 +05:30
Joseph T. Lyons
9d82e148de Bump Zed to v0.199 (#35343)
Release Notes:

- N/A
2025-07-30 16:53:53 +00:00
Finn Evers
f8d1062484 onboarding: Fix keybindings showing up after a delay (#35342)
This fixes an issue where keybinds would only show up after a delay on
the welcome page upon re-opening it. It also binds one of the buttons to
the corresponding action.

Release Notes:

- N/A
2025-07-30 16:18:14 +00:00
Antonio Scandurra
45af1fcc2f Always double reconnection delay and add jitter (#35337)
Previously, we would pick an exponent between 0.5 and 2.5, which would
cause a lot of clients to try reconnecting in rapid succession,
overwhelming the server as a result.

This pull request always doubles the previous delay and introduces a
jitter that can, at most, double it.

As part of this, we're also increasing the maximum reconnection delay
from 10s to 30s: this gives us more space to spread out the reconnection
requests.
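The scheme described above can be sketched as follows. The function and constant names are hypothetical, and a real implementation would draw the jitter from a random source rather than take it as a parameter:

```rust
use std::time::Duration;

// Cap from the PR description: the maximum reconnection delay is 30s.
const MAX_RECONNECT_DELAY: Duration = Duration::from_secs(30);

/// Each attempt doubles the previous delay, then adds a jitter that can
/// at most double it again, before capping at the maximum.
fn next_reconnect_delay(previous: Duration, jitter: f64) -> Duration {
    // `jitter` stands in for a random value in [0.0, 1.0); 1.0 would
    // double the already-doubled delay.
    let doubled = previous * 2;
    let with_jitter = doubled.mul_f64(1.0 + jitter);
    with_jitter.min(MAX_RECONNECT_DELAY)
}

fn main() {
    let mut delay = Duration::from_millis(500);
    for attempt in 0..6 {
        delay = next_reconnect_delay(delay, 0.5); // fixed jitter for the demo
        println!("attempt {attempt}: waiting {delay:?}");
    }
}
```

Capping after the jitter is applied keeps the worst-case wait at 30s even when the jitter would otherwise push the doubled delay past it.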

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <git@maxdeviant.com>
2025-07-30 15:34:09 +00:00
Peter Tripp
0aea5acc68 Fix Windows CI logic (#35335)
Fixes unintentional change in
https://github.com/zed-industries/zed/pull/35204

Release Notes:

- N/A
2025-07-30 11:01:20 -04:00
Kirill Bulatov
4d66d967f2 Revert "gpui: Implement support for wlr layer shell (#32651)" (#35331)
This reverts commit c110f78015.

On Linux Wayland, that causes a panic:

```
already mutably borrowed: BorrowError
zed::reliability::init_panic_hook::{{closure}}::h276cc55bf0717738+165677654
std::panicking::rust_panic_with_hook::h409da73ddef13937+139331443
std::panicking::begin_panic_handler::{{closure}}::h159b61b27f96a9c2+139330666
std::sys::backtrace::__rust_end_short_backtrace::h5b56844d75e766fc+139314825
__rustc[4794b31dd7191200]::rust_begin_unwind+139329805
core::panicking::panic_fmt::hc8737e8cca20a7c8+9934576
core::cell::panic_already_mutably_borrowed::h95c7d326eb19a92a+9934403
<gpui::platform::linux::wayland::window::WaylandWindow as gpui::platform::PlatformWindow>::set_app_id::hfa7deae0be264f60+10621600
gpui::window::Window::new::h6505f6042d99702f+80424235
gpui::app::async_context::AsyncApp::open_window::h62ef8f80789a0af2+159117345
workspace::Workspace::new_local::{{closure}}::{{closure}}::h4d786ba393f391b5+160720110
gpui::app::App::spawn::{{closure}}::haf6a6ef0f9bab21c+159294806
async_task::raw::RawTask<F,T,S,M>::run::h9e5f668e091fddff+158375501
<gpui::platform::linux::wayland::client::WaylandClient as gpui::platform::linux::platform::LinuxClient>::run::h69e40feabd97f1bb+79906738
gpui::platform::linux::platform::<impl gpui::platform::Platform for P>::run::hd80e5b2da41c7d0a+79758141
gpui::app::Application::run::h9136595e7346a2c9+163935333
zed::main::h83f7ef86a32dbbfd+165755480
std::sys::backtrace::__rust_begin_short_backtrace::hb6da6fe5454d7688+168421891
std::rt::lang_start::{{closure}}::h51a50d6423746d5f+168421865
std::rt::lang_start_internal::ha8ef919ae4984948+139244369
main+168421964
__libc_start_call_main+29344125649354
__libc_start_main_impl+29344125649547
_start+12961358
```


Release Notes:

- N/A
2025-07-30 13:36:22 +00:00
Kirill Bulatov
93e6b01486 Actually disable ai for now (#35327)
Closes https://github.com/zed-industries/zed/issues/35325

* removes Supermaven actions
* removes copilot-related action
* stops re-enabling edit predictions when disabled

Release Notes:

- N/A
2025-07-30 13:10:05 +00:00
Danilo Leal
00725273e4 agent: Rename "open configuration" action to "open settings" (#35329)
"Settings" is the terminology we use in the agent panel, thus having the
action use "configuration" makes it harder for folks to find this either
via the command palette or the keybinding editor UI in case they'd like
to change it.

Release Notes:

- agent: Renamed the "open configuration" action to "open settings" for
better discoverability and consistency
2025-07-30 09:55:13 -03:00
Piotr Osiewicz
c22fa9adee chore: Move a bunch of foreground tasks into background (#35322)
Closes #ISSUE

Release Notes:

- N/A
2025-07-30 10:29:03 +00:00
Kirill Bulatov
49b75e9e93 Kb/wasm panics (#35319)
Follow-up of https://github.com/zed-industries/zed/pull/34208
Closes https://github.com/zed-industries/zed/issues/35185

Previous code assumed that extensions' language server wrappers could
leak only via static data (e.g. fields that were not cleared on
deinit), but we seem to have a race that breaks this assumption.

1. We do clean the `all_lsp_adapters` field after
https://github.com/zed-industries/zed/pull/34334, and it's called for
every extension that is unregistered.
2. The `LspStore::maintain_workspace_config` ->
`LspStore::refresh_workspace_configurations` chain is triggered
independently, apparently on the `ToolchainStoreEvent::ToolchainActivated`
event, which means somewhere behind it there is potentially Python code
that gets executed to activate the toolchain, making
`refresh_workspace_configurations` start timings unpredictable.
3. Toolchain activation seems to overlap with plugin reload, as the
`2025-07-28T12:16:19+03:00 INFO [extension_host] extensions updated.
loading 0, reloading 1, unloading 0` line in the issue logs suggests.

The plugin reload seems to happen faster than the workspace
configuration refresh in


c65da547c9/crates/project/src/lsp_store.rs (L7426-L7456)

as the language servers are just starting and take extra time to respond
to the notification.

At least one of the `.clone()`d `adapter`s there is the adapter that got
removed during plugin reload and has its channel closed, which causes a
panic later.

----------------------------

A good fix would be to re-architect the workspace refresh approach, the
same as other accesses to the language server collections.
One way could be to use `Weak`-based structures instead, as the
extension server data definitely belongs to the extension, not the
`LspStore`. This is quite a large undertaking near the extension core,
though, so it is not done yet.

Currently, to stop the excessive panics, we no longer call `.expect` on
the channel result, as it can indeed now be closed very dynamically.
This will result in more errors (and backtraces, presumably) printed in
the logs, and no panics.

More logging and comments were added, and workspace querying was made
concurrent: there is no need to wait until a previous server has
processed the notification before sending the same one to the next.
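The `.expect` removal can be sketched like this; the names are hypothetical, and the actual code in `LspStore` uses different channel and message types:

```rust
use std::sync::mpsc;

/// Instead of panicking when the adapter's channel has been closed
/// (which a plugin reload can now do at any time), log the error and
/// skip the notification.
fn notify_server(tx: &mpsc::Sender<String>, notification: String) {
    // Before: tx.send(notification).expect("channel closed") -> panic.
    if let Err(err) = tx.send(notification) {
        eprintln!("language server channel closed; skipping notification: {err}");
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    drop(rx); // simulate the adapter being removed during a plugin reload
    notify_server(&tx, "workspace/didChangeConfiguration".to_string()); // no panic
}
```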

Release Notes:

- Fixed a wasm-related panic happening during startup
2025-07-30 09:18:26 +00:00
Marshall Bowers
7be1f2418d Replace zed_llm_client with cloud_llm_client (#35309)
This PR replaces the usage of the `zed_llm_client` with the
`cloud_llm_client`.

It was ported into this repo in #35307.

Release Notes:

- N/A
2025-07-30 00:09:14 +00:00
Mikayla Maki
17a0179f0a Stop caching needlessly (#35308)
Release Notes:

- N/A
2025-07-29 23:38:06 +00:00
Marshall Bowers
b8f3a9101c Add cloud_llm_client crate (#35307)
This PR adds a `cloud_llm_client` crate to take the place of the
`zed_llm_client`.

Release Notes:

- N/A
2025-07-29 23:30:45 +00:00
Ben Kunkle
3824751e61 Add meta description tag to docs pages (#35112)
Closes #ISSUE

Adds basic frontmatter support to `.md` files in docs. The only
supported keys currently are `description` which becomes a `<meta
name="description" contents="...">` tag, and `title` which becomes a
normal `title` tag, with the title contents prefixed with the subject of
the file.

An example of the syntax can be found in `git.md`, as well as below

```md
---
title: Some more detailed title for this page
description: A page-specific description
---

# Editor
```

The above will be transformed into (with non-relevant tags removed)

```html
<head>
    <title>Editor | Some more detailed title for this page</title>
    <meta name="description" contents="A page-specific description">
</head>
<body>
<h1>Editor</h1>
</body>
```

If no front-matter is provided, or if one or both keys aren't provided,
the title and description will be set based on the `default-title` and
`default-description` keys in `book.toml`, respectively.
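A line-based parser for this scheme might look roughly like the following. This is an illustrative sketch, not the actual `docs_preprocessor` code:

```rust
/// Parse a simple `---`-fenced front-matter block at the top of a file:
/// one `key: value` per line, no quoting, no multi-line values.
/// Returns the key/value pairs and the remaining body.
fn parse_front_matter(src: &str) -> (Vec<(&str, &str)>, &str) {
    let trimmed = src.trim_start();
    let Some(rest) = trimmed.strip_prefix("---\n") else {
        return (Vec::new(), src); // no front matter: return the file as-is
    };
    let Some(end) = rest.find("\n---") else {
        return (Vec::new(), src); // unterminated block: treat as no front matter
    };
    let pairs = rest[..end]
        .lines()
        .filter_map(|line| line.split_once(':'))
        .map(|(k, v)| (k.trim(), v.trim()))
        .collect();
    // The body starts after the closing fence and any blank lines.
    let body = rest[end + 4..].trim_start_matches('\n');
    (pairs, body)
}

fn main() {
    let (pairs, body) = parse_front_matter("---\ntitle: Editor tips\n---\n# Editor");
    println!("{pairs:?}\n{body}");
}
```

Keys and values must sit entirely on one line, which matches the limitations listed below.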

## Implementation details

Unfortunately, `mdbook` does not support post-processing like it does
pre-processing, and only supports defining one description to put in the
meta tag per book rather than per file. So in order to apply
post-processing (necessary to modify the html head tags) the global book
description is set to a marker value `#description#` and the html
renderer is replaced with a sub-command of `docs_preprocessor` that
wraps the builtin `html` renderer and applies post-processing to the
`html` files, replacing the marker value and the `<title>(.*)</title>`
with the contents of the front-matter if there is one.

## Known limitations

The front-matter parsing is extremely simple, which avoids needing to
take on an additional dependency, or implement full yaml parsing.

* Double quotes and multi-line values are not supported, i.e. Keys and
values must be entirely on the same line, with no double quotes around
the value.

The following will not work:

```md
---
title: Some
 Multi-line
 Title
---
```

* The front-matter must be at the top of the file, with only white-space
preceding it

* The contents of the title and description will not be html-escaped.
They should be simple ascii text with no unicode or emoji characters

Release Notes:

- N/A

---------

Co-authored-by: Katie Greer <katie@zed.dev>
2025-07-29 23:01:03 +00:00
Finn Evers
57766199cf ui: Clean up toggle button group component (#35303)
This change cleans up the toggle button component a bit by utilizing
const parameters, and also removes some clones by consuming the values
where possible.

Release Notes:

- N/A
2025-07-30 00:47:04 +02:00
Finn Evers
0be83f1c67 emmet: Bump to 0.0.4 (#35305)
This PR bumps the emmet extension to version 0.0.4. 

Includes:
- https://github.com/zed-industries/zed/pull/33865
- https://github.com/zed-industries/zed/pull/32208
- https://github.com/zed-industries/zed/pull/15177

Note that this intentionally does NOT include a change in the
`extension.toml`: The version was bumped incorrectly once in
https://github.com/zed-industries/zed/pull/32208 in both the
`extension.toml` as well as the `Cargo.lock` but not in the
`Cargo.toml`. After that,
https://github.com/zed-industries/zed/pull/33667 only removed the
changes in the `Cargo.lock` but didn't revert the change in the
`extension.toml` file. Hence, the version in the `extension.toml` is
already at `0.0.4`.

Release Notes:

- N/A
2025-07-29 22:46:17 +00:00
Marshall Bowers
f0927faf61 collab: Add kill switches for syncing data to and from Stripe (#35304)
This PR adds two kill switches for syncing data to and from Stripe using
Collab.

The `cloud-stripe-events-polling` and `cloud-stripe-usage-meters-sync`
feature flags control whether we use Cloud for polling Stripe events and
updating Stripe meters, respectively.

When we're ready to hand off the syncing to Cloud we can enable the
feature flag to do so.

Release Notes:

- N/A
2025-07-29 22:45:00 +00:00
Marshall Bowers
d2d116cb02 collab: Remove GET /user endpoint (#35301)
This PR removes the `GET /user` endpoint, as it has been moved to
`cloud.zed.dev`.

Release Notes:

- N/A
2025-07-29 22:34:04 +00:00
Ben Kunkle
9f69b53869 keymap_ui: Additional cleanup (#35299)
Closes #ISSUE

Additional cleanup and testing for the keystroke input including
- Focused testing of the "previous modifiers" logic in search mode
- Not merging unmodified keystrokes into previous modifier only bindings
(extension of #35208)
- Fixing a bug where input would overflow in search mode when entering
only modifiers
- Additional testing logic to ensure keystrokes updated events are
always emitted correctly

Release Notes:

- N/A
2025-07-29 18:04:00 -04:00
Anthony Eid
48e085a523 onboarding ui: Add editing page to onboarding page (#35298)
I added buttons for inlay values, showing the minimap, git blame, and
controlling the UI/editor font and font size. The only thing left for
this page is some UI cleanup and adding buttons for settings import
from VS Code/Cursor.

I also added Numeric Stepper as a component preview.

Current state:
<img width="1085" height="585" alt="image"
src="https://github.com/user-attachments/assets/230df474-da81-4810-ba64-05673896d119"
/>


Release Notes:

- N/A
2025-07-29 21:54:58 +00:00
marius851000
3378f02b7e Fix link to panic location on GitHub (#35162)
I had a panic, and it reported


``24c2a465bb/src/crates/assistant_tools/src/edit_agent.rs (L686)
(may not be uploaded, line may be incorrect if files modified)``

The `/src` part seems superfluous and results in a link that doesn't
work (unlike
`24c2a465bb/src/crates/assistant_tools/src/edit_agent.rs (L686)`).
I don't know why it originally worked (or if it even actually worked
properly), but there seems to be no reason to keep that `/src`.

Release Notes:

- N/A
2025-07-29 17:45:46 -04:00
Ridan Vandenbergh
c110f78015 gpui: Implement support for wlr layer shell (#32651)
I was interested in potentially using gpui for a hobby project, but
needed [layer
shell](https://wayland.app/protocols/wlr-layer-shell-unstable-v1)
support for it. Turns out gpui's (excellent!) architecture made that
super easy to implement, so I went ahead and did it.

Layer shell is a window role used for notification windows, lock
screens, docks, backgrounds, etc. Supporting it in gpui opens the door
to implementing applications like that using the framework.

If this turns out interesting enough to merge - I'm also happy to
provide a follow-up PR (when I have the time to) to implement some of
the desirable window options for layer shell surfaces, such as:
- namespace (currently always `""`)
- keyboard interactivity (currently always `OnDemand`, which mimics
normal keyboard interactivity)
- anchor, exclusive zone, margins
- popups

Release Notes:

- Added support for wayland layer shell surfaces in gpui

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-07-29 21:26:30 +00:00
Ben Kunkle
85b712c04e keymap_ui: Clear close keystroke capture on timeout (#35289)
Closes #ISSUE

Introduces a mechanism whereby keystrokes whose suffix matches the
prefix of the stop-recording binding can still be entered. The solution
is to introduce a (as of right now) 300ms timeout before the close
keystroke state is wiped.

Previously, with the default stop-recording binding `esc esc esc`,
searching for or entering a binding ending in `esc` was not possible
without using the mouse. E.g., entering the keystroke `ctrl-g esc` and
then attempting to hit `esc` three times would stop recording on the
penultimate `esc` press, and the final `esc` would not be intercepted.
Now, with the timeout, it is possible to enter `ctrl-g esc`, pause for
a moment, then hit `esc esc esc` to end the recording with the
keystroke input state being `ctrl-g esc`.

I arrived at 300ms for this delay as it was long enough that I didn't
run into it very often when trying to escape, but short enough that a
natural pause will almost always work as expected.
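The mechanism can be sketched like this; the 300ms value matches the description above, but the type and method names are made up for illustration:

```rust
use std::time::{Duration, Instant};

// Timeout from the PR description before the close keystroke state is wiped.
const CLOSE_KEYSTROKE_TIMEOUT: Duration = Duration::from_millis(300);

struct CloseKeystrokeState {
    last_close_keystroke_at: Option<Instant>,
}

impl CloseKeystrokeState {
    /// A partial stop-recording sequence only survives if the next
    /// keystroke arrives within the timeout window; after a pause, the
    /// state is wiped and `esc` is treated as ordinary input again.
    fn should_clear(&self, now: Instant) -> bool {
        match self.last_close_keystroke_at {
            Some(at) => now.duration_since(at) > CLOSE_KEYSTROKE_TIMEOUT,
            None => false,
        }
    }
}

fn main() {
    let now = Instant::now();
    let state = CloseKeystrokeState {
        last_close_keystroke_at: Some(now - Duration::from_millis(400)),
    };
    println!("stale state cleared: {}", state.should_clear(now));
}
```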

Release Notes:

- Keymap Editor: Added a short timeout to the stop recording keybind
handling in the keystroke input, so that it is now possible using the
default bindings as an example (custom bindings should work as well) to
search for/enter a binding ending with `escape` (with no modifier),
pause for a moment, then hit `escape escape escape` to stop recording
and search for/enter a keystroke ending with `escape`.
2025-07-29 17:24:57 -04:00
Daniel Sauble
5fa212183a Fix animations in the component preview (#33673)
Fixes #33869

The Animation page in the Component Preview had a few issues.

* The animations only ran once, so you couldn't watch animations below
the fold.
* The offset math was wrong, so some animated elements were rendered
outside of their parent container.
* The "animate in from right" elements were defined with an initial
`.left()` offset, which overrode the animation behavior.

I made fixes to address these issues. In particular, every time you
click the active list item, it renders the preview again (which causes
the animations to run again).

Before:


https://github.com/user-attachments/assets/a1fa2e3f-653c-4b83-a6ed-c55ca9c78ad4

After:


https://github.com/user-attachments/assets/3623bbbc-9047-4443-b7f3-96bd92f582bf

Release Notes:

- N/A
2025-07-29 14:22:53 -07:00
David Kleingeld
1501ae0013 Upgrade rodio to 0.21 (#34368)
Hi all,

We just released [Rodio
0.21](https://github.com/RustAudio/rodio/blob/master/CHANGELOG.md)
🥳 with quite some breaking changes. This should take care
of those for zed. I tested it by hopping in and out some of the zed
channels, sound seems to still work.

Given zed uses tracing I also took the liberty of enabling the tracing
feature for rodio.

edit:
We changed the default wav decoder from hound to symphonia. The latter
has a slightly more restrictive license however that should be no issue
here (as the audio crate uses the GPL)

Release Notes:

- N/A
2025-07-29 13:24:34 -07:00
Joseph T. Lyons
3973142324 Adjust Zed badge (#35294)
- Make right side background white
- Fix Zed casing

Release Notes:

- N/A
2025-07-29 15:09:31 -04:00
Cole Miller
7878eacc73 python: Use a single workspace folder for basedpyright (#35292)
Treat the new basedpyright adapter the same as pyright was treated in
#35243.

Release Notes:

- N/A
2025-07-29 19:00:41 +00:00
Joseph T. Lyons
72f8fa6d1e Adjust Zed badge (#35290)
- Inline badges
- Set label background fill color to black
- Uppercase Zed text
- Remove gray padding

Release Notes:

- N/A
2025-07-29 14:24:10 -04:00
Joseph T. Lyons
902c17ac1a Add Zed badge to README.md (#35287)
Release Notes:

- N/A
2025-07-29 14:15:17 -04:00
Ben Kunkle
efa3cc13ef keymap_ui: Test keystroke input (#35286)
Closes #ISSUE

Separate out the keystroke input into its own component and add a bunch
of tests for its core keystroke+modifier event handling logic.

Release Notes:

- N/A
2025-07-29 14:10:51 -04:00
Michael Sloan
65250fe08d cloud provider: Use CompletionEvent type from zed_llm_client (#35285)
Release Notes:

- N/A
2025-07-29 17:28:18 +00:00
Marshall Bowers
77dc65d826 collab: Attach User-Agent to handle connection span (#35282)
This PR makes it so we attach the value from the `User-Agent` header to
the `handle connection` span.

We'll start sending this header in
https://github.com/zed-industries/zed/pull/35280.

Release Notes:

- N/A
2025-07-29 17:06:27 +00:00
Marshall Bowers
f9224b1d74 client: Send User-Agent header on WebSocket connection requests (#35280)
This PR makes it so we send the `User-Agent` header on the WebSocket
connection requests when connecting to Collab.

We use the user agent set on the parent HTTP client.

Release Notes:

- N/A
2025-07-29 16:53:56 +00:00
localcc
aa3437e98f Allow installing from an administrator user (#35202)
Release Notes:

- N/A
2025-07-29 18:03:57 +02:00
Finn Evers
397b5f9301 Ensure context servers are spawned in the workspace directory (#35271)
This fixes an issue where we were not setting the context server working
directory at all.

Release Notes:

- Context servers will now be spawned in the currently active project
root.

---------

Co-authored-by: Danilo Leal <danilo@zed.dev>
2025-07-29 18:03:43 +02:00
localcc
d43f464174 Fix nightly icon (#35204)
Release Notes:

- N/A
2025-07-29 18:01:07 +02:00
localcc
511fdaed43 Allow searching Windows paths with forward slash (#35198)
Release Notes:

- Searching windows paths is now possible with a forward slash
2025-07-29 17:58:28 +02:00
Marshall Bowers
a8bdf30259 client: Fix typo in the error message (#35275)
This PR fixes a typo in the error message for when we fail to parse the
Collab URL.

Release Notes:

- N/A
2025-07-29 15:45:49 +00:00
devjasperwang
2fced602b8 paths: Fix using relative path as custom_data_dir (#35256)
This PR fixes issue of incorrect LSP path args caused by using a
relative path when customizing data directory.

command:
```bash
.\target\debug\zed.exe --user-data-dir=.\target\data
```

before:
```log
2025-07-29T14:17:18+08:00 INFO  [lsp] starting language server process. binary path: "F:\\nvm\\nodejs\\node.exe", working directory: "F:\\zed\\target\\data\\config", args: [".\\target\\data\\languages\\json-language-server\\node_modules/vscode-langservers-extracted/bin/vscode-json-language-server", "--stdio"]
2025-07-29T14:17:18+08:00 INFO  [project::prettier_store] Installing default prettier and plugins: [("prettier", "3.6.2")]
2025-07-29T14:17:18+08:00 ERROR [lsp] cannot read LSP message headers
2025-07-29T14:17:18+08:00 ERROR [lsp] Shutdown request failure, server json-language-server (id 1): server shut down

2025-07-29T14:17:43+08:00 ERROR [project] Invalid file path provided to LSP request: ".\\target\\data\\config\\settings.json"
Thread "main" panicked with "called `Result::unwrap()` on an `Err` value: ()" at crates\project\src\lsp_store.rs:7203:54
cfd5b8ff10/src/crates\project\src\lsp_store.rs#L7203 (may not be uploaded, line may be incorrect if files modified)
```

after:
```log
2025-07-29T14:24:20+08:00 INFO  [lsp] starting language server process. binary path: "F:\\nvm\\nodejs\\node.exe", working directory: "F:\\zed\\target\\data\\config", args: ["F:\\zed\\target\\data\\languages\\json-language-server\\node_modules/vscode-langservers-extracted/bin/vscode-json-language-server", "--stdio"]
```
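A sketch of the kind of fix described here (assumed, not the actual Zed code): resolve the relative `--user-data-dir` argument against the current working directory up front, so every path later derived from it, such as language-server binary args, is absolute.

```rust
use std::path::{Path, PathBuf};

/// Return an absolute form of the user-supplied data directory.
/// Relative arguments are resolved against the current working directory.
fn resolve_data_dir(arg: &Path) -> std::io::Result<PathBuf> {
    if arg.is_absolute() {
        Ok(arg.to_path_buf())
    } else {
        Ok(std::env::current_dir()?.join(arg))
    }
}

fn main() -> std::io::Result<()> {
    let dir = resolve_data_dir(Path::new("target/data"))?;
    println!("resolved: {}", dir.display());
    Ok(())
}
```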

Release Notes:

- N/A
2025-07-29 15:31:54 +00:00
Peter Tripp
3fc84f8a62 Comment on source of ctrl-m in keymaps (#35273)
Closes https://github.com/zed-industries/zed/issues/23896

Release Notes:

- N/A
2025-07-29 15:29:12 +00:00
Kirill Bulatov
5a218d8323 Add more data to see which extension got leaked (#35272)
Part of https://github.com/zed-industries/zed/issues/35185

Release Notes:

- N/A
2025-07-29 15:24:52 +00:00
Agus Zubiaga
9353ba7887 Fix remaining agent server integration tests (#35222)
Release Notes:

- N/A
2025-07-29 12:40:59 +00:00
Finn Evers
8f952f1b58 gpui: Ensure first tab index is selected on first focus (#35247)
This fixes an issue with tab indices where we would actually focus the
second focus handle on first focus instead of the first one. The test
was updated accordingly.

Release Notes:

- N/A

---------

Co-authored-by: Jason Lee <huacnlee@gmail.com>
2025-07-29 10:30:38 +00:00
Piotr Osiewicz
6c5791532e lsp: Remove Attach enum, default to Shared behaviour (#35248)
This should be a no-op PR, behavior-wise.

Release Notes:

- N/A
2025-07-29 10:07:36 +00:00
Kirill Bulatov
691b3ca238 Cache LSP code lens requests (#35207) 2025-07-29 09:51:58 +03:00
Cole Miller
cfd5b8ff10 python: Uplift basedpyright support into core (#35250)
This PR adds a built-in adapter for the basedpyright language server.

For now, it's behind the `basedpyright` feature flag, and needs to be
requested explicitly like this for staff:

```
  "languages": {
    "Python": {
      "language_servers": ["basedpyright", "!pylsp", "!pyright"]
    }
  }
```

(After uninstalling the basedpyright extension.)

Release Notes:

- N/A
2025-07-29 03:19:31 +00:00
Piotr Osiewicz
e5269212ad lsp/python: Temporarily report just a singular workspace folder instead of all of the roots (#35243)
Temporarily fixes #29133

Co-authored-by: Cole <cole@zed.dev>

Release Notes:

- python: Zed now reports a slightly different set of workspace folders
for Python projects to work around quirks in the handling of multi-LSP
projects with virtual environments. This behavior will be revisited in
the near future.

Co-authored-by: Cole <cole@zed.dev>
2025-07-29 00:10:32 +00:00
Bedis Nbiba
d2ef287791 Add runnable support for Deno.test (#34593)
Example of detected code:
```ts
Deno.test("t", () => {
  console.log("Hello, World!");
});

Deno.test(function azaz() {
  console.log("Hello, World!");
});
```

I can't build Zed locally, so I haven't tested this, but the code is
straightforward enough; hopefully someone else can verify it.

Closes #ISSUE

Release Notes:

- N/A
2025-07-29 01:45:41 +02:00
Tom Monaghan
109eddafd0 docs: Fix link in configuration documentation (#35249)
# Summary

The link "under the configuration page" [on this
page](https://zed.dev/docs/configuring-zed#agent) is broken. It should
be linking to [this page](https://zed.dev/docs/ai/configuration).

## Approach

I noted that all other links in this document begin with "./", whereas
the AI configuration link does not. I also noticed [this
PR](https://github.com/zed-industries/zed/pull/31119) fixing a link with
the same approach. I don't fully understand why this is the fix.

## Previous Approaches

I tried writing the following redirect in `docs/book.toml`:
`"/ai/configuration.html" = "/docs/ai/configuration.html"`. However,
this broke the `mdbook` build with the error below.

```
2025-07-29 08:49:36 [ERROR] (mdbook::utils): 	Caused By: Not redirecting "/Users/tmonaghan/dev/zed/docs/book/ai/configuration.html" to "/docs/ai/configuration.html" because it already exists. Are you sure it needs to be redirected?
```
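The working fix amounts to switching to a relative link. A minimal sketch of the before/after (the link text and target file are assumed for illustration; the actual link lives in the configuring-zed doc):

```md
<!-- Root-absolute: resolves against the site root and misses the /docs prefix after deployment -->
[configuration page](/ai/configuration.md)

<!-- Relative to the current file: mdbook rewrites it correctly during the build -->
[configuration page](./ai/configuration.md)
```

Since the built docs are served under the `/docs/` path prefix, a root-absolute path skips that prefix entirely, which is the likely reason the "./" form works here.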

Release Notes:

- N/A
2025-07-28 22:59:46 +00:00
172 changed files with 6398 additions and 2732 deletions

View File

@@ -19,7 +19,7 @@ runs:
shell: bash -euxo pipefail {0}
run: ./script/linux
- name: Check for broken links
- name: Check for broken links (in MD)
uses: lycheeverse/lychee-action@82202e5e9c2f4ef1a55a3d02563e1cb6041e5332 # v2.4.1
with:
args: --no-progress --exclude '^http' './docs/src/**/*'
@@ -30,3 +30,9 @@ runs:
run: |
mkdir -p target/deploy
mdbook build ./docs --dest-dir=../target/deploy/docs/
- name: Check for broken links (in HTML)
uses: lycheeverse/lychee-action@82202e5e9c2f4ef1a55a3d02563e1cb6041e5332 # v2.4.1
with:
args: --no-progress --exclude '^http' 'target/deploy/docs/'
fail: true

View File

@@ -771,7 +771,8 @@ jobs:
timeout-minutes: 120
name: Create a Windows installer
runs-on: [self-hosted, Windows, X64]
if: true && (startsWith(github.ref, 'refs/tags/v') || contains(github.event.pull_request.labels.*.name, 'run-bundling'))
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
# if: (startsWith(github.ref, 'refs/tags/v') || contains(github.event.pull_request.labels.*.name, 'run-bundling'))
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
@@ -807,16 +808,16 @@ jobs:
name: ZedEditorUserSetup-x64-${{ github.event.pull_request.head.sha || github.sha }}.exe
path: ${{ env.SETUP_PATH }}
# - name: Upload Artifacts to release
# uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
# # Re-enable when we are ready to publish windows preview releases
# if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) && env.RELEASE_CHANNEL == 'preview' }} # upload only preview
# with:
# draft: true
# prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
# files: ${{ env.SETUP_PATH }}
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
# Re-enable when we are ready to publish windows preview releases
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) && env.RELEASE_CHANNEL == 'preview' }} # upload only preview
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: ${{ env.SETUP_PATH }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
auto-release-preview:
name: Auto release preview

458
Cargo.lock generated
View File

@@ -90,6 +90,7 @@ dependencies = [
"assistant_tools",
"chrono",
"client",
"cloud_llm_client",
"collections",
"component",
"context_server",
@@ -132,7 +133,6 @@ dependencies = [
"uuid",
"workspace",
"workspace-hack",
"zed_llm_client",
"zstd",
]
@@ -189,6 +189,7 @@ name = "agent_settings"
version = "0.1.0"
dependencies = [
"anyhow",
"cloud_llm_client",
"collections",
"fs",
"gpui",
@@ -200,7 +201,6 @@ dependencies = [
"serde_json_lenient",
"settings",
"workspace-hack",
"zed_llm_client",
]
[[package]]
@@ -223,6 +223,7 @@ dependencies = [
"buffer_diff",
"chrono",
"client",
"cloud_llm_client",
"collections",
"command_palette_hooks",
"component",
@@ -294,7 +295,6 @@ dependencies = [
"workspace",
"workspace-hack",
"zed_actions",
"zed_llm_client",
]
[[package]]
@@ -687,6 +687,7 @@ dependencies = [
"chrono",
"client",
"clock",
"cloud_llm_client",
"collections",
"context_server",
"fs",
@@ -720,7 +721,6 @@ dependencies = [
"uuid",
"workspace",
"workspace-hack",
"zed_llm_client",
]
[[package]]
@@ -828,6 +828,7 @@ dependencies = [
"chrono",
"client",
"clock",
"cloud_llm_client",
"collections",
"component",
"derive_more 0.99.19",
@@ -881,7 +882,6 @@ dependencies = [
"which 6.0.3",
"workspace",
"workspace-hack",
"zed_llm_client",
"zlog",
]
@@ -2976,6 +2976,8 @@ dependencies = [
"base64 0.22.1",
"chrono",
"clock",
"cloud_api_client",
"cloud_llm_client",
"cocoa 0.26.0",
"collections",
"credentials_provider",
@@ -3018,7 +3020,6 @@ dependencies = [
"windows 0.61.1",
"workspace-hack",
"worktree",
"zed_llm_client",
]
[[package]]
@@ -3031,6 +3032,44 @@ dependencies = [
"workspace-hack",
]
[[package]]
name = "cloud_api_client"
version = "0.1.0"
dependencies = [
"anyhow",
"cloud_api_types",
"futures 0.3.31",
"http_client",
"parking_lot",
"serde_json",
"workspace-hack",
]
[[package]]
name = "cloud_api_types"
version = "0.1.0"
dependencies = [
"chrono",
"cloud_llm_client",
"pretty_assertions",
"serde",
"serde_json",
"workspace-hack",
]
[[package]]
name = "cloud_llm_client"
version = "0.1.0"
dependencies = [
"anyhow",
"pretty_assertions",
"serde",
"serde_json",
"strum 0.27.1",
"uuid",
"workspace-hack",
]
[[package]]
name = "clru"
version = "0.6.2"
@@ -3157,6 +3196,7 @@ dependencies = [
"chrono",
"client",
"clock",
"cloud_llm_client",
"collab_ui",
"collections",
"command_palette_hooks",
@@ -3243,7 +3283,6 @@ dependencies = [
"workspace",
"workspace-hack",
"worktree",
"zed_llm_client",
"zlog",
]
@@ -3684,17 +3723,6 @@ dependencies = [
"libm",
]
[[package]]
name = "coreaudio-rs"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "321077172d79c662f64f5071a03120748d5bb652f5231570141be24cfcd2bace"
dependencies = [
"bitflags 1.3.2",
"core-foundation-sys",
"coreaudio-sys",
]
[[package]]
name = "coreaudio-rs"
version = "0.12.1"
@@ -3752,29 +3780,6 @@ dependencies = [
"unicode-segmentation",
]
[[package]]
name = "cpal"
version = "0.15.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "873dab07c8f743075e57f524c583985fbaf745602acbe916a01539364369a779"
dependencies = [
"alsa",
"core-foundation-sys",
"coreaudio-rs 0.11.3",
"dasp_sample",
"jni",
"js-sys",
"libc",
"mach2",
"ndk 0.8.0",
"ndk-context",
"oboe",
"wasm-bindgen",
"wasm-bindgen-futures",
"web-sys",
"windows 0.54.0",
]
[[package]]
name = "cpal"
version = "0.16.0"
@@ -3788,7 +3793,7 @@ dependencies = [
"js-sys",
"libc",
"mach2",
"ndk 0.9.0",
"ndk",
"ndk-context",
"num-derive",
"num-traits",
@@ -4290,41 +4295,6 @@ dependencies = [
"workspace-hack",
]
[[package]]
name = "darling"
version = "0.20.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fc7f46116c46ff9ab3eb1597a45688b6715c6e628b5c133e288e709a29bcb4ee"
dependencies = [
"darling_core",
"darling_macro",
]
[[package]]
name = "darling_core"
version = "0.20.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d00b9596d185e565c2207a0b01f8bd1a135483d02d9b7b0a54b11da8d53412e"
dependencies = [
"fnv",
"ident_case",
"proc-macro2",
"quote",
"strsim",
"syn 2.0.101",
]
[[package]]
name = "darling_macro"
version = "0.20.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fc34b93ccb385b40dc71c6fceac4b2ad23662c7eeb248cf10d529b7e055b6ead"
dependencies = [
"darling_core",
"quote",
"syn 2.0.101",
]
[[package]]
name = "dashmap"
version = "5.5.3"
@@ -4540,37 +4510,6 @@ dependencies = [
"serde",
]
[[package]]
name = "derive_builder"
version = "0.20.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "507dfb09ea8b7fa618fcf76e953f4f5e192547945816d5358edffe39f6f94947"
dependencies = [
"derive_builder_macro",
]
[[package]]
name = "derive_builder_core"
version = "0.20.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2d5bcf7b024d6835cfb3d473887cd966994907effbe9227e8c8219824d06c4e8"
dependencies = [
"darling",
"proc-macro2",
"quote",
"syn 2.0.101",
]
[[package]]
name = "derive_builder_macro"
version = "0.20.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab63b0e2bf4d5928aff72e83a7dace85d7bba5fe12dcc3c5a572d78caffd3f3c"
dependencies = [
"derive_builder_core",
"syn 2.0.101",
]
[[package]]
name = "derive_more"
version = "0.99.19"
@@ -4792,7 +4731,6 @@ name = "docs_preprocessor"
version = "0.1.0"
dependencies = [
"anyhow",
"clap",
"command_palette",
"gpui",
"mdbook",
@@ -4803,6 +4741,7 @@ dependencies = [
"util",
"workspace-hack",
"zed",
"zlog",
]
[[package]]
@@ -5263,6 +5202,7 @@ dependencies = [
"chrono",
"clap",
"client",
"cloud_llm_client",
"collections",
"debug_adapter_extension",
"dirs 4.0.0",
@@ -5302,7 +5242,6 @@ dependencies = [
"uuid",
"watch",
"workspace-hack",
"zed_llm_client",
]
[[package]]
@@ -5367,6 +5306,12 @@ dependencies = [
"zune-inflate",
]
[[package]]
name = "extended"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "af9673d8203fcb076b19dfd17e38b3d4ae9f44959416ea532ce72415a6020365"
[[package]]
name = "extension"
version = "0.1.0"
@@ -5943,7 +5888,7 @@ dependencies = [
"ignore",
"libc",
"log",
"notify",
"notify 8.0.0",
"objc",
"parking_lot",
"paths",
@@ -6378,6 +6323,7 @@ dependencies = [
"call",
"chrono",
"client",
"cloud_llm_client",
"collections",
"command_palette_hooks",
"component",
@@ -6420,7 +6366,6 @@ dependencies = [
"workspace",
"workspace-hack",
"zed_actions",
"zed_llm_client",
"zlog",
]
@@ -7500,18 +7445,16 @@ dependencies = [
[[package]]
name = "handlebars"
version = "6.3.2"
version = "5.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "759e2d5aea3287cb1190c8ec394f42866cb5bf74fcbf213f354e3c856ea26098"
checksum = "d08485b96a0e6393e9e4d1b8d48cf74ad6c063cd905eb33f42c1ce3f0377539b"
dependencies = [
"derive_builder",
"log",
"num-order",
"pest",
"pest_derive",
"serde",
"serde_json",
"thiserror 2.0.12",
"thiserror 1.0.69",
]
[[package]]
@@ -7742,12 +7685,6 @@ dependencies = [
"windows-sys 0.59.0",
]
[[package]]
name = "hound"
version = "3.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "62adaabb884c94955b19907d60019f4e145d091c75345379e70d1ee696f7854f"
[[package]]
name = "html5ever"
version = "0.27.0"
@@ -8188,12 +8125,6 @@ version = "2.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "25a2bc672d1148e28034f176e01fffebb08b35768468cc954630da77a1449005"
[[package]]
name = "ident_case"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b9e0384b61958566e926dc50660321d12159025e767c18e043daf26b70104c39"
[[package]]
name = "idna"
version = "1.0.3"
@@ -8386,6 +8317,7 @@ version = "0.1.0"
dependencies = [
"anyhow",
"client",
"cloud_llm_client",
"copilot",
"editor",
"feature_flags",
@@ -8408,10 +8340,20 @@ dependencies = [
"workspace",
"workspace-hack",
"zed_actions",
"zed_llm_client",
"zeta",
]
[[package]]
name = "inotify"
version = "0.9.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f8069d3ec154eb856955c1c0fbffefbf5f3c40a104ec912d4797314c1801abff"
dependencies = [
"bitflags 1.3.2",
"inotify-sys",
"libc",
]
[[package]]
name = "inotify"
version = "0.11.0"
@@ -8565,7 +8507,7 @@ dependencies = [
"fnv",
"lazy_static",
"libc",
"mio",
"mio 1.0.3",
"rand 0.8.5",
"serde",
"tempfile",
@@ -9090,6 +9032,7 @@ dependencies = [
"anyhow",
"base64 0.22.1",
"client",
"cloud_llm_client",
"collections",
"futures 0.3.31",
"gpui",
@@ -9107,7 +9050,6 @@ dependencies = [
"thiserror 2.0.12",
"util",
"workspace-hack",
"zed_llm_client",
]
[[package]]
@@ -9123,6 +9065,7 @@ dependencies = [
"bedrock",
"chrono",
"client",
"cloud_llm_client",
"collections",
"component",
"convert_case 0.8.0",
@@ -9164,7 +9107,6 @@ dependencies = [
"vercel",
"workspace-hack",
"x_ai",
"zed_llm_client",
]
[[package]]
@@ -9226,6 +9168,7 @@ dependencies = [
"chrono",
"collections",
"dap",
"feature_flags",
"futures 0.3.31",
"gpui",
"http_client",
@@ -9594,7 +9537,7 @@ dependencies = [
"core-foundation 0.10.0",
"core-video",
"coreaudio-rs 0.12.1",
"cpal 0.16.0",
"cpal",
"futures 0.3.31",
"gpui",
"gpui_tokio",
@@ -10004,9 +9947,9 @@ dependencies = [
[[package]]
name = "mdbook"
version = "0.4.48"
version = "0.4.40"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8b6fbb4ac2d9fd7aa987c3510309ea3c80004a968d063c42f0d34fea070817c1"
checksum = "b45a38e19bd200220ef07c892b0157ad3d2365e5b5a267ca01ad12182491eea5"
dependencies = [
"ammonia",
"anyhow",
@@ -10016,12 +9959,11 @@ dependencies = [
"elasticlunr-rs",
"env_logger 0.11.8",
"futures-util",
"handlebars 6.3.2",
"hex",
"handlebars 5.1.2",
"ignore",
"log",
"memchr",
"notify",
"notify 6.1.1",
"notify-debouncer-mini",
"once_cell",
"opener",
@@ -10030,7 +9972,6 @@ dependencies = [
"regex",
"serde",
"serde_json",
"sha2",
"shlex",
"tempfile",
"tokio",
@@ -10173,6 +10114,18 @@ version = "0.5.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e53debba6bda7a793e5f99b8dacf19e626084f525f7829104ba9898f367d85ff"
[[package]]
name = "mio"
version = "0.8.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a4a650543ca06a924e8b371db273b2756685faae30f8487da1b56505a8f78b0c"
dependencies = [
"libc",
"log",
"wasi 0.11.0+wasi-snapshot-preview1",
"windows-sys 0.48.0",
]
[[package]]
name = "mio"
version = "1.0.3"
@@ -10365,20 +10318,6 @@ dependencies = [
"workspace-hack",
]
[[package]]
name = "ndk"
version = "0.8.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2076a31b7010b17a38c01907c45b945e8f11495ee4dd588309718901b1f7a5b7"
dependencies = [
"bitflags 2.9.0",
"jni-sys",
"log",
"ndk-sys 0.5.0+25.2.9519653",
"num_enum",
"thiserror 1.0.69",
]
[[package]]
name = "ndk"
version = "0.9.0"
@@ -10388,7 +10327,7 @@ dependencies = [
"bitflags 2.9.0",
"jni-sys",
"log",
"ndk-sys 0.6.0+11769913",
"ndk-sys",
"num_enum",
"thiserror 1.0.69",
]
@@ -10399,15 +10338,6 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "27b02d87554356db9e9a873add8782d4ea6e3e58ea071a9adb9a2e8ddb884a8b"
[[package]]
name = "ndk-sys"
version = "0.5.0+25.2.9519653"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c196769dd60fd4f363e11d948139556a344e79d451aeb2fa2fd040738ef7691"
dependencies = [
"jni-sys",
]
[[package]]
name = "ndk-sys"
version = "0.6.0+11769913"
@@ -10542,6 +10472,25 @@ dependencies = [
"zed_actions",
]
[[package]]
name = "notify"
version = "6.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6205bd8bb1e454ad2e27422015fb5e4f2bcc7e08fa8f27058670d208324a4d2d"
dependencies = [
"bitflags 2.9.0",
"crossbeam-channel",
"filetime",
"fsevent-sys 4.1.0",
"inotify 0.9.6",
"kqueue",
"libc",
"log",
"mio 0.8.11",
"walkdir",
"windows-sys 0.48.0",
]
[[package]]
name = "notify"
version = "8.0.0"
@@ -10550,11 +10499,11 @@ dependencies = [
"bitflags 2.9.0",
"filetime",
"fsevent-sys 4.1.0",
"inotify",
"inotify 0.11.0",
"kqueue",
"libc",
"log",
"mio",
"mio 1.0.3",
"notify-types",
"walkdir",
"windows-sys 0.59.0",
@@ -10562,14 +10511,13 @@ dependencies = [
[[package]]
name = "notify-debouncer-mini"
version = "0.6.0"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a689eb4262184d9a1727f9087cd03883ea716682ab03ed24efec57d7716dccb8"
checksum = "5d40b221972a1fc5ef4d858a2f671fb34c75983eb385463dff3780eeff6a9d43"
dependencies = [
"crossbeam-channel",
"log",
"notify",
"notify-types",
"tempfile",
"notify 6.1.1",
]
[[package]]
@@ -10709,21 +10657,6 @@ dependencies = [
"num-traits",
]
[[package]]
name = "num-modular"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "17bb261bf36fa7d83f4c294f834e91256769097b3cb505d44831e0a179ac647f"
[[package]]
name = "num-order"
version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "537b596b97c40fcf8056d153049eb22f481c17ebce72a513ec9286e4986d1bb6"
dependencies = [
"num-modular",
]
[[package]]
name = "num-rational"
version = "0.4.2"
@@ -10977,29 +10910,6 @@ dependencies = [
"memchr",
]
[[package]]
name = "oboe"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e8b61bebd49e5d43f5f8cc7ee2891c16e0f41ec7954d36bcb6c14c5e0de867fb"
dependencies = [
"jni",
"ndk 0.8.0",
"ndk-context",
"num-derive",
"num-traits",
"oboe-sys",
]
[[package]]
name = "oboe-sys"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6c8bb09a4a2b1d668170cfe0a7d5bc103f8999fb316c98099b6a9939c9f2e79d"
dependencies = [
"cc",
]
[[package]]
name = "ollama"
version = "0.1.0"
@@ -11020,15 +10930,22 @@ dependencies = [
"anyhow",
"command_palette_hooks",
"db",
"editor",
"feature_flags",
"fs",
"gpui",
"language",
"project",
"schemars",
"serde",
"settings",
"theme",
"ui",
"util",
"workspace",
"workspace-hack",
"zed_actions",
"zlog",
]
[[package]]
@@ -13779,12 +13696,15 @@ dependencies = [
[[package]]
name = "rodio"
version = "0.20.1"
version = "0.21.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e7ceb6607dd738c99bc8cb28eff249b7cd5c8ec88b9db96c0608c1480d140fb1"
checksum = "e40ecf59e742e03336be6a3d53755e789fd05a059fa22dfa0ed624722319e183"
dependencies = [
"cpal 0.15.3",
"hound",
"cpal",
"dasp_sample",
"num-rational",
"symphonia",
"tracing",
]
[[package]]
@@ -14789,6 +14709,27 @@ dependencies = [
"zlog",
]
[[package]]
name = "settings_profile_selector"
version = "0.1.0"
dependencies = [
"client",
"editor",
"fuzzy",
"gpui",
"language",
"menu",
"picker",
"project",
"serde_json",
"settings",
"theme",
"ui",
"workspace",
"workspace-hack",
"zed_actions",
]
[[package]]
name = "settings_ui"
version = "0.1.0"
@@ -14811,7 +14752,6 @@ dependencies = [
"notifications",
"paths",
"project",
"schemars",
"search",
"serde",
"serde_json",
@@ -15805,6 +15745,66 @@ dependencies = [
"zeno",
]
[[package]]
name = "symphonia"
version = "0.5.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "815c942ae7ee74737bb00f965fa5b5a2ac2ce7b6c01c0cc169bbeaf7abd5f5a9"
dependencies = [
"lazy_static",
"symphonia-codec-pcm",
"symphonia-core",
"symphonia-format-riff",
"symphonia-metadata",
]
[[package]]
name = "symphonia-codec-pcm"
version = "0.5.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f395a67057c2ebc5e84d7bb1be71cce1a7ba99f64e0f0f0e303a03f79116f89b"
dependencies = [
"log",
"symphonia-core",
]
[[package]]
name = "symphonia-core"
version = "0.5.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "798306779e3dc7d5231bd5691f5a813496dc79d3f56bf82e25789f2094e022c3"
dependencies = [
"arrayvec",
"bitflags 1.3.2",
"bytemuck",
"lazy_static",
"log",
]
[[package]]
name = "symphonia-format-riff"
version = "0.5.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "05f7be232f962f937f4b7115cbe62c330929345434c834359425e043bfd15f50"
dependencies = [
"extended",
"log",
"symphonia-core",
"symphonia-metadata",
]
[[package]]
name = "symphonia-metadata"
version = "0.5.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bc622b9841a10089c5b18e99eb904f4341615d5aa55bbf4eedde1be721a4023c"
dependencies = [
"encoding_rs",
"lazy_static",
"log",
"symphonia-core",
]
[[package]]
name = "syn"
version = "1.0.109"
@@ -16572,7 +16572,7 @@ dependencies = [
"backtrace",
"bytes 1.10.1",
"libc",
"mio",
"mio 1.0.3",
"parking_lot",
"pin-project-lite",
"signal-hook-registry",
@@ -18505,11 +18505,11 @@ name = "web_search"
version = "0.1.0"
dependencies = [
"anyhow",
"cloud_llm_client",
"collections",
"gpui",
"serde",
"workspace-hack",
"zed_llm_client",
]
[[package]]
@@ -18518,6 +18518,7 @@ version = "0.1.0"
dependencies = [
"anyhow",
"client",
"cloud_llm_client",
"futures 0.3.31",
"gpui",
"http_client",
@@ -18526,7 +18527,6 @@ dependencies = [
"serde_json",
"web_search",
"workspace-hack",
"zed_llm_client",
]
[[package]]
@@ -19692,14 +19692,12 @@ dependencies = [
"cc",
"chrono",
"cipher",
"clang-sys",
"clap",
"clap_builder",
"codespan-reporting 0.12.0",
"concurrent-queue",
"core-foundation 0.9.4",
"core-foundation-sys",
"coreaudio-sys",
"cranelift-codegen",
"crc32fast",
"crossbeam-epoch",
@@ -19751,7 +19749,7 @@ dependencies = [
"md-5",
"memchr",
"miniz_oxide",
"mio",
"mio 1.0.3",
"naga",
"nix 0.29.0",
"nom",
@@ -20195,7 +20193,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.198.0"
version = "0.199.0"
dependencies = [
"activity_indicator",
"agent",
@@ -20298,6 +20296,7 @@ dependencies = [
"serde_json",
"session",
"settings",
"settings_profile_selector",
"settings_ui",
"shellexpand 2.1.2",
"smol",
@@ -20356,7 +20355,7 @@ dependencies = [
[[package]]
name = "zed_emmet"
version = "0.0.3"
version = "0.0.4"
dependencies = [
"zed_extension_api 0.1.0",
]
@@ -20395,19 +20394,6 @@ dependencies = [
"zed_extension_api 0.1.0",
]
[[package]]
name = "zed_llm_client"
version = "0.8.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6607f74dee2a18a9ce0f091844944a0e59881359ab62e0768fb0618f55d4c1dc"
dependencies = [
"anyhow",
"serde",
"serde_json",
"strum 0.27.1",
"uuid",
]
[[package]]
name = "zed_proto"
version = "0.2.2"
@@ -20587,6 +20573,7 @@ dependencies = [
"call",
"client",
"clock",
"cloud_llm_client",
"collections",
"command_palette_hooks",
"copilot",
@@ -20628,7 +20615,6 @@ dependencies = [
"workspace-hack",
"worktree",
"zed_actions",
"zed_llm_client",
"zlog",
]

View File

@@ -1,13 +1,13 @@
[workspace]
resolver = "2"
members = [
"crates/activity_indicator",
"crates/acp_thread",
"crates/agent_ui",
"crates/activity_indicator",
"crates/agent",
"crates/agent_settings",
"crates/ai_onboarding",
"crates/agent_servers",
"crates/agent_settings",
"crates/agent_ui",
"crates/ai_onboarding",
"crates/anthropic",
"crates/askpass",
"crates/assets",
@@ -29,6 +29,9 @@ members = [
"crates/cli",
"crates/client",
"crates/clock",
"crates/cloud_api_client",
"crates/cloud_api_types",
"crates/cloud_llm_client",
"crates/collab",
"crates/collab_ui",
"crates/collections",
@@ -48,8 +51,8 @@ members = [
"crates/diagnostics",
"crates/docs_preprocessor",
"crates/editor",
"crates/explorer_command_injector",
"crates/eval",
"crates/explorer_command_injector",
"crates/extension",
"crates/extension_api",
"crates/extension_cli",
@@ -70,7 +73,6 @@ members = [
"crates/gpui",
"crates/gpui_macros",
"crates/gpui_tokio",
"crates/html_to_markdown",
"crates/http_client",
"crates/http_client_tls",
@@ -99,7 +101,6 @@ members = [
"crates/markdown_preview",
"crates/media",
"crates/menu",
"crates/svg_preview",
"crates/migrator",
"crates/mistral",
"crates/multi_buffer",
@@ -140,6 +141,7 @@ members = [
"crates/semantic_version",
"crates/session",
"crates/settings",
"crates/settings_profile_selector",
"crates/settings_ui",
"crates/snippet",
"crates/snippet_provider",
@@ -152,6 +154,7 @@ members = [
"crates/sum_tree",
"crates/supermaven",
"crates/supermaven_api",
"crates/svg_preview",
"crates/tab_switcher",
"crates/task",
"crates/tasks_ui",
@@ -251,6 +254,9 @@ channel = { path = "crates/channel" }
cli = { path = "crates/cli" }
client = { path = "crates/client" }
clock = { path = "crates/clock" }
cloud_api_client = { path = "crates/cloud_api_client" }
cloud_api_types = { path = "crates/cloud_api_types" }
cloud_llm_client = { path = "crates/cloud_llm_client" }
collab = { path = "crates/collab" }
collab_ui = { path = "crates/collab_ui" }
collections = { path = "crates/collections" }
@@ -337,6 +343,7 @@ picker = { path = "crates/picker" }
plugin = { path = "crates/plugin" }
plugin_macros = { path = "crates/plugin_macros" }
prettier = { path = "crates/prettier" }
settings_profile_selector = { path = "crates/settings_profile_selector" }
project = { path = "crates/project" }
project_panel = { path = "crates/project_panel" }
project_symbols = { path = "crates/project_symbols" }
@@ -645,7 +652,6 @@ which = "6.0.0"
windows-core = "0.61"
wit-component = "0.221"
workspace-hack = "0.1.0"
zed_llm_client = "= 0.8.6"
zstd = "0.11"
[workspace.dependencies.async-stripe]

View File

@@ -1,5 +1,6 @@
# Zed
[![Zed](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/zed-industries/zed/main/assets/badge/v0.json)](https://zed.dev)
[![CI](https://github.com/zed-industries/zed/actions/workflows/ci.yml/badge.svg)](https://github.com/zed-industries/zed/actions/workflows/ci.yml)
Welcome to Zed, a high-performance, multiplayer code editor from the creators of [Atom](https://github.com/atom/atom) and [Tree-sitter](https://github.com/tree-sitter/tree-sitter).

8
assets/badge/v0.json Normal file
View File

@@ -0,0 +1,8 @@
{
"label": "",
"message": "Zed",
"logoSvg": "<svg xmlns=\"http://www.w3.org/2000/svg\" viewBox=\"0 0 96 96\"><rect width=\"96\" height=\"96\" fill=\"#000\"/><path fill-rule=\"evenodd\" clip-rule=\"evenodd\" d=\"M9 6C7.34315 6 6 7.34315 6 9V75H0V9C0 4.02944 4.02944 0 9 0H89.3787C93.3878 0 95.3955 4.84715 92.5607 7.68198L43.0551 57.1875H57V51H63V58.6875C63 61.1728 60.9853 63.1875 58.5 63.1875H37.0551L26.7426 73.5H73.5V36H79.5V73.5C79.5 76.8137 76.8137 79.5 73.5 79.5H20.7426L10.2426 90H87C88.6569 90 90 88.6569 90 87V21H96V87C96 91.9706 91.9706 96 87 96H6.62132C2.61224 96 0.604504 91.1529 3.43934 88.318L52.7574 39H39V45H33V37.5C33 35.0147 35.0147 33 37.5 33H58.7574L69.2574 22.5H22.5V60H16.5V22.5C16.5 19.1863 19.1863 16.5 22.5 16.5H75.2574L85.7574 6H9Z\" fill=\"#fff\"/></svg>",
"logoWidth": 16,
"labelColor": "black",
"color": "white"
}

View File

@@ -232,7 +232,7 @@
"ctrl-n": "agent::NewThread",
"ctrl-alt-n": "agent::NewTextThread",
"ctrl-shift-h": "agent::OpenHistory",
"ctrl-alt-c": "agent::OpenConfiguration",
"ctrl-alt-c": "agent::OpenSettings",
"ctrl-alt-p": "agent::OpenRulesLibrary",
"ctrl-i": "agent::ToggleProfileSelector",
"ctrl-alt-/": "agent::ToggleModelSelector",
@@ -495,7 +495,7 @@
"shift-f12": "editor::GoToImplementation",
"alt-ctrl-f12": "editor::GoToTypeDefinitionSplit",
"alt-shift-f12": "editor::FindAllReferences",
"ctrl-m": "editor::MoveToEnclosingBracket",
"ctrl-m": "editor::MoveToEnclosingBracket", // from jetbrains
"ctrl-|": "editor::MoveToEnclosingBracket",
"ctrl-{": "editor::Fold",
"ctrl-}": "editor::UnfoldLines",

View File

@@ -272,7 +272,7 @@
"cmd-n": "agent::NewThread",
"cmd-alt-n": "agent::NewTextThread",
"cmd-shift-h": "agent::OpenHistory",
"cmd-alt-c": "agent::OpenConfiguration",
"cmd-alt-c": "agent::OpenSettings",
"cmd-alt-p": "agent::OpenRulesLibrary",
"cmd-i": "agent::ToggleProfileSelector",
"cmd-alt-/": "agent::ToggleModelSelector",
@@ -549,7 +549,7 @@
"alt-cmd-f12": "editor::GoToTypeDefinitionSplit",
"alt-shift-f12": "editor::FindAllReferences",
"cmd-|": "editor::MoveToEnclosingBracket",
"ctrl-m": "editor::MoveToEnclosingBracket",
"ctrl-m": "editor::MoveToEnclosingBracket", // From Jetbrains
"alt-cmd-[": "editor::Fold",
"alt-cmd-]": "editor::UnfoldLines",
"cmd-k cmd-l": "editor::ToggleFold",

View File

@@ -8,7 +8,7 @@
"ctrl-shift-i": "agent::ToggleFocus",
"ctrl-l": "agent::ToggleFocus",
"ctrl-shift-l": "agent::ToggleFocus",
"ctrl-shift-j": "agent::OpenConfiguration"
"ctrl-shift-j": "agent::OpenSettings"
}
},
{

View File

@@ -8,7 +8,7 @@
"cmd-shift-i": "agent::ToggleFocus",
"cmd-l": "agent::ToggleFocus",
"cmd-shift-l": "agent::ToggleFocus",
"cmd-shift-j": "agent::OpenConfiguration"
"cmd-shift-j": "agent::OpenSettings"
}
},
{

View File

@@ -1877,5 +1877,8 @@
"save_breakpoints": true,
"dock": "bottom",
"button": true
}
},
// Configures any number of settings profiles that are temporarily applied
// when selected from `settings profile selector: toggle`.
"profiles": []
}

View File

@@ -1597,6 +1597,7 @@ mod tests {
name: "test",
connection,
child_status: io_task,
current_thread: thread_rc,
};
AcpThread::new(

View File

@@ -7,6 +7,7 @@ use gpui::{AppContext as _, AsyncApp, Entity, Task, WeakEntity};
use project::Project;
use std::{cell::RefCell, error::Error, fmt, path::Path, rc::Rc};
use ui::App;
use util::ResultExt as _;
use crate::{AcpThread, AgentConnection};
@@ -46,7 +47,7 @@ impl acp_old::Client for OldAcpClientDelegate {
thread.push_assistant_content_block(thought.into(), true, cx)
}
})
.ok();
.log_err();
})?;
Ok(())
@@ -364,6 +365,7 @@ pub struct OldAcpAgentConnection {
pub name: &'static str,
pub connection: acp_old::AgentConnection,
pub child_status: Task<Result<()>>,
pub current_thread: Rc<RefCell<WeakEntity<AcpThread>>>,
}
impl AgentConnection for OldAcpAgentConnection {
@@ -383,6 +385,7 @@ impl AgentConnection for OldAcpAgentConnection {
}
.into_any(),
);
let current_thread = self.current_thread.clone();
cx.spawn(async move |cx| {
let result = task.await?;
let result = acp_old::InitializeParams::response_from_any(result)?;
@@ -396,6 +399,7 @@ impl AgentConnection for OldAcpAgentConnection {
let session_id = acp::SessionId("acp-old-no-id".into());
AcpThread::new(self.clone(), project, session_id, cx)
});
current_thread.replace(thread.downgrade());
thread
})
})

View File

@@ -25,6 +25,7 @@ assistant_context.workspace = true
assistant_tool.workspace = true
chrono.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
component.workspace = true
context_server.workspace = true
@@ -35,9 +36,9 @@ futures.workspace = true
git.workspace = true
gpui.workspace = true
heed.workspace = true
http_client.workspace = true
icons.workspace = true
indoc.workspace = true
http_client.workspace = true
itertools.workspace = true
language.workspace = true
language_model.workspace = true
@@ -63,7 +64,6 @@ time.workspace = true
util.workspace = true
uuid.workspace = true
workspace-hack.workspace = true
zed_llm_client.workspace = true
zstd.workspace = true
[dev-dependencies]

View File

@@ -13,6 +13,7 @@ use anyhow::{Result, anyhow};
use assistant_tool::{ActionLog, AnyToolCard, Tool, ToolWorkingSet};
use chrono::{DateTime, Utc};
use client::{ModelRequestUsage, RequestUsage};
use cloud_llm_client::{CompletionIntent, CompletionRequestStatus, UsageLimit};
use collections::HashMap;
use feature_flags::{self, FeatureFlagAppExt};
use futures::{FutureExt, StreamExt as _, future::Shared};
@@ -49,7 +50,6 @@ use std::{
use thiserror::Error;
use util::{ResultExt as _, post_inc};
use uuid::Uuid;
use zed_llm_client::{CompletionIntent, CompletionRequestStatus, UsageLimit};
const MAX_RETRY_ATTEMPTS: u8 = 4;
const BASE_RETRY_DELAY: Duration = Duration::from_secs(5);
@@ -1681,7 +1681,7 @@ impl Thread {
let completion_mode = request
.mode
.unwrap_or(zed_llm_client::CompletionMode::Normal);
.unwrap_or(cloud_llm_client::CompletionMode::Normal);
self.last_received_chunk_at = Some(Instant::now());

View File

@@ -47,6 +47,7 @@ impl AgentServer for Codex {
cx: &mut App,
) -> Task<Result<Rc<dyn AgentConnection>>> {
let project = project.clone();
let working_directory = project.read(cx).active_project_directory(cx);
cx.spawn(async move |cx| {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).codex.clone()
@@ -65,6 +66,7 @@ impl AgentServer for Codex {
args: command.args,
env: command.env,
},
working_directory,
)
.into();
ContextServer::start(client.clone(), cx).await?;
@@ -310,7 +312,7 @@ pub(crate) mod tests {
AgentServerCommand {
path: cli_path,
args: vec!["mcp".into()],
args: vec![],
env: None,
}
}


@@ -12,7 +12,6 @@ use futures::{FutureExt, StreamExt, channel::mpsc, select};
use gpui::{Entity, TestAppContext};
use indoc::indoc;
use project::{FakeFs, Project};
use serde_json::json;
use settings::{Settings, SettingsStore};
use util::path;
@@ -27,7 +26,11 @@ pub async fn test_basic(server: impl AgentServer + 'static, cx: &mut TestAppCont
.unwrap();
thread.read_with(cx, |thread, _| {
assert_eq!(thread.entries().len(), 2);
assert!(
thread.entries().len() >= 2,
"Expected at least 2 entries. Got: {:?}",
thread.entries()
);
assert!(matches!(
thread.entries()[0],
AgentThreadEntry::UserMessage(_)
@@ -108,19 +111,19 @@ pub async fn test_path_mentions(server: impl AgentServer + 'static, cx: &mut Tes
}
pub async fn test_tool_call(server: impl AgentServer + 'static, cx: &mut TestAppContext) {
let fs = init_test(cx).await;
fs.insert_tree(
path!("/private/tmp"),
json!({"foo": "Lorem ipsum dolor", "bar": "bar", "baz": "baz"}),
)
.await;
let project = Project::test(fs, [path!("/private/tmp").as_ref()], cx).await;
let _fs = init_test(cx).await;
let tempdir = tempfile::tempdir().unwrap();
let foo_path = tempdir.path().join("foo");
std::fs::write(&foo_path, "Lorem ipsum dolor").expect("failed to write file");
let project = Project::example([tempdir.path()], &mut cx.to_async()).await;
let thread = new_test_thread(server, project.clone(), "/private/tmp", cx).await;
thread
.update(cx, |thread, cx| {
thread.send_raw(
"Read the '/private/tmp/foo' file and tell me what you see.",
&format!("Read {} and tell me what you see.", foo_path.display()),
cx,
)
})
@@ -143,6 +146,8 @@ pub async fn test_tool_call(server: impl AgentServer + 'static, cx: &mut TestApp
.any(|entry| { matches!(entry, AgentThreadEntry::AssistantMessage(_)) })
);
});
drop(tempdir);
}
pub async fn test_tool_call_with_confirmation(
@@ -155,7 +160,7 @@ pub async fn test_tool_call_with_confirmation(
let thread = new_test_thread(server, project.clone(), "/private/tmp", cx).await;
let full_turn = thread.update(cx, |thread, cx| {
thread.send_raw(
r#"Run `touch hello.txt && echo "Hello, world!" | tee hello.txt`"#,
r#"Run exactly `touch hello.txt && echo "Hello, world!" | tee hello.txt` in the terminal."#,
cx,
)
});
@@ -175,10 +180,10 @@ pub async fn test_tool_call_with_confirmation(
)
.await;
let tool_call_id = thread.read_with(cx, |thread, _cx| {
let tool_call_id = thread.read_with(cx, |thread, cx| {
let AgentThreadEntry::ToolCall(ToolCall {
id,
content,
label,
status: ToolCallStatus::WaitingForConfirmation { .. },
..
}) = &thread
@@ -190,7 +195,8 @@ pub async fn test_tool_call_with_confirmation(
panic!();
};
assert!(content.iter().any(|c| c.to_markdown(_cx).contains("touch")));
let label = label.read(cx).source();
assert!(label.contains("touch"), "Got: {}", label);
id.clone()
});
@@ -242,7 +248,7 @@ pub async fn test_cancel(server: impl AgentServer + 'static, cx: &mut TestAppCon
let thread = new_test_thread(server, project.clone(), "/private/tmp", cx).await;
let full_turn = thread.update(cx, |thread, cx| {
thread.send_raw(
r#"Run `touch hello.txt && echo "Hello, world!" >> hello.txt`"#,
r#"Run exactly `touch hello.txt && echo "Hello, world!" | tee hello.txt` in the terminal."#,
cx,
)
});
@@ -262,10 +268,10 @@ pub async fn test_cancel(server: impl AgentServer + 'static, cx: &mut TestAppCon
)
.await;
thread.read_with(cx, |thread, _cx| {
thread.read_with(cx, |thread, cx| {
let AgentThreadEntry::ToolCall(ToolCall {
id,
content,
label,
status: ToolCallStatus::WaitingForConfirmation { .. },
..
}) = &thread.entries()[first_tool_call_ix]
@@ -273,7 +279,8 @@ pub async fn test_cancel(server: impl AgentServer + 'static, cx: &mut TestAppCon
panic!("{:?}", thread.entries()[1]);
};
assert!(content.iter().any(|c| c.to_markdown(_cx).contains("touch")));
let label = label.read(cx).source();
assert!(label.contains("touch"), "Got: {}", label);
id.clone()
});


@@ -107,6 +107,7 @@ impl AgentServer for Gemini {
name,
connection,
child_status,
current_thread: thread_rc,
});
Ok(connection)


@@ -13,6 +13,7 @@ path = "src/agent_settings.rs"
[dependencies]
anyhow.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
gpui.workspace = true
language_model.workspace = true
@@ -20,7 +21,6 @@ schemars.workspace = true
serde.workspace = true
settings.workspace = true
workspace-hack.workspace = true
zed_llm_client.workspace = true
[dev-dependencies]
fs.workspace = true


@@ -321,11 +321,11 @@ pub enum CompletionMode {
Burn,
}
impl From<CompletionMode> for zed_llm_client::CompletionMode {
impl From<CompletionMode> for cloud_llm_client::CompletionMode {
fn from(value: CompletionMode) -> Self {
match value {
CompletionMode::Normal => zed_llm_client::CompletionMode::Normal,
CompletionMode::Burn => zed_llm_client::CompletionMode::Max,
CompletionMode::Normal => cloud_llm_client::CompletionMode::Normal,
CompletionMode::Burn => cloud_llm_client::CompletionMode::Max,
}
}
}


@@ -31,6 +31,7 @@ audio.workspace = true
buffer_diff.workspace = true
chrono.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
command_palette_hooks.workspace = true
component.workspace = true
@@ -46,9 +47,9 @@ futures.workspace = true
fuzzy.workspace = true
gpui.workspace = true
html_to_markdown.workspace = true
indoc.workspace = true
http_client.workspace = true
indexed_docs.workspace = true
indoc.workspace = true
inventory.workspace = true
itertools.workspace = true
jsonschema.workspace = true
@@ -97,7 +98,6 @@ watch.workspace = true
workspace-hack.workspace = true
workspace.workspace = true
zed_actions.workspace = true
zed_llm_client.workspace = true
[dev-dependencies]
assistant_tools.workspace = true


@@ -14,6 +14,7 @@ use agent_settings::{AgentSettings, NotifyWhenAgentWaiting};
use anyhow::Context as _;
use assistant_tool::ToolUseStatus;
use audio::{Audio, Sound};
use cloud_llm_client::CompletionIntent;
use collections::{HashMap, HashSet};
use editor::actions::{MoveUp, Paste};
use editor::scroll::Autoscroll;
@@ -52,7 +53,6 @@ use util::ResultExt as _;
use util::markdown::MarkdownCodeBlock;
use workspace::{CollaboratorId, Workspace};
use zed_actions::assistant::OpenRulesLibrary;
use zed_llm_client::CompletionIntent;
const CODEBLOCK_CONTAINER_GROUP: &str = "codeblock_container";
const EDIT_PREVIOUS_MESSAGE_MIN_LINES: usize = 1;


@@ -44,6 +44,7 @@ use assistant_context::{AssistantContext, ContextEvent, ContextSummary};
use assistant_slash_command::SlashCommandWorkingSet;
use assistant_tool::ToolWorkingSet;
use client::{DisableAiSettings, UserStore, zed_urls};
use cloud_llm_client::{CompletionIntent, UsageLimit};
use editor::{Anchor, AnchorRangeExt as _, Editor, EditorEvent, MultiBuffer};
use feature_flags::{self, FeatureFlagAppExt};
use fs::Fs;
@@ -77,10 +78,9 @@ use workspace::{
};
use zed_actions::{
DecreaseBufferFontSize, IncreaseBufferFontSize, ResetBufferFontSize,
agent::{OpenConfiguration, OpenOnboardingModal, ResetOnboarding, ToggleModelSelector},
agent::{OpenOnboardingModal, OpenSettings, ResetOnboarding, ToggleModelSelector},
assistant::{OpenRulesLibrary, ToggleFocus},
};
use zed_llm_client::{CompletionIntent, UsageLimit};
const AGENT_PANEL_KEY: &str = "agent_panel";
@@ -105,7 +105,7 @@ pub fn init(cx: &mut App) {
panel.update(cx, |panel, cx| panel.open_history(window, cx));
}
})
.register_action(|workspace, _: &OpenConfiguration, window, cx| {
.register_action(|workspace, _: &OpenSettings, window, cx| {
if let Some(panel) = workspace.panel::<AgentPanel>(cx) {
workspace.focus_panel::<AgentPanel>(window, cx);
panel.update(cx, |panel, cx| panel.open_configuration(window, cx));
@@ -2088,7 +2088,7 @@ impl AgentPanel {
menu = menu
.action("Rules…", Box::new(OpenRulesLibrary::default()))
.action("Settings", Box::new(OpenConfiguration))
.action("Settings", Box::new(OpenSettings))
.action(zoom_in_label, Box::new(ToggleZoom));
menu
}))
@@ -2482,14 +2482,14 @@ impl AgentPanel {
.icon_color(Color::Muted)
.full_width()
.key_binding(KeyBinding::for_action_in(
&OpenConfiguration,
&OpenSettings,
&focus_handle,
window,
cx,
))
.on_click(|_event, window, cx| {
window.dispatch_action(
OpenConfiguration.boxed_clone(),
OpenSettings.boxed_clone(),
cx,
)
}),
@@ -2713,16 +2713,11 @@ impl AgentPanel {
.style(ButtonStyle::Tinted(ui::TintColor::Warning))
.label_size(LabelSize::Small)
.key_binding(
KeyBinding::for_action_in(
&OpenConfiguration,
&focus_handle,
window,
cx,
)
.map(|kb| kb.size(rems_from_px(12.))),
KeyBinding::for_action_in(&OpenSettings, &focus_handle, window, cx)
.map(|kb| kb.size(rems_from_px(12.))),
)
.on_click(|_event, window, cx| {
window.dispatch_action(OpenConfiguration.boxed_clone(), cx)
window.dispatch_action(OpenSettings.boxed_clone(), cx)
}),
),
ConfigurationError::ProviderPendingTermsAcceptance(provider) => {
@@ -3226,7 +3221,7 @@ impl Render for AgentPanel {
.on_action(cx.listener(|this, _: &OpenHistory, window, cx| {
this.open_history(window, cx);
}))
.on_action(cx.listener(|this, _: &OpenConfiguration, window, cx| {
.on_action(cx.listener(|this, _: &OpenSettings, window, cx| {
this.open_configuration(window, cx);
}))
.on_action(cx.listener(Self::open_active_thread_as_markdown))


@@ -265,8 +265,8 @@ fn update_command_palette_filter(cx: &mut App) {
filter.hide_namespace("agent");
filter.hide_namespace("assistant");
filter.hide_namespace("copilot");
filter.hide_namespace("supermaven");
filter.hide_namespace("zed_predict_onboarding");
filter.hide_namespace("edit_prediction");
use editor::actions::{


@@ -6,6 +6,7 @@ use agent::{
use agent_settings::AgentSettings;
use anyhow::{Context as _, Result};
use client::telemetry::Telemetry;
use cloud_llm_client::CompletionIntent;
use collections::HashSet;
use editor::{Anchor, AnchorRangeExt, MultiBuffer, MultiBufferSnapshot, ToOffset as _, ToPoint};
use futures::{
@@ -35,7 +36,6 @@ use std::{
};
use streaming_diff::{CharOperation, LineDiff, LineOperation, StreamingDiff};
use telemetry_events::{AssistantEventData, AssistantKind, AssistantPhase};
use zed_llm_client::CompletionIntent;
pub struct BufferCodegen {
alternatives: Vec<Entity<CodegenAlternative>>,


@@ -1,10 +1,10 @@
#![allow(unused, dead_code)]
use client::{ModelRequestUsage, RequestUsage};
use cloud_llm_client::{Plan, UsageLimit};
use gpui::Global;
use std::ops::{Deref, DerefMut};
use ui::prelude::*;
use zed_llm_client::{Plan, UsageLimit};
/// Debug only: Used for testing various account states
///


@@ -48,7 +48,7 @@ use text::{OffsetRangeExt, ToPoint as _};
use ui::prelude::*;
use util::{RangeExt, ResultExt, maybe};
use workspace::{ItemHandle, Toast, Workspace, dock::Panel, notifications::NotificationId};
use zed_actions::agent::OpenConfiguration;
use zed_actions::agent::OpenSettings;
pub fn init(
fs: Arc<dyn Fs>,
@@ -345,7 +345,7 @@ impl InlineAssistant {
if let Some(answer) = answer {
if answer == 0 {
cx.update(|window, cx| {
window.dispatch_action(Box::new(OpenConfiguration), cx)
window.dispatch_action(Box::new(OpenSettings), cx)
})
.ok();
}


@@ -576,7 +576,7 @@ impl PickerDelegate for LanguageModelPickerDelegate {
.icon_position(IconPosition::Start)
.on_click(|_, window, cx| {
window.dispatch_action(
zed_actions::agent::OpenConfiguration.boxed_clone(),
zed_actions::agent::OpenSettings.boxed_clone(),
cx,
);
}),


@@ -18,6 +18,7 @@ use agent_settings::{AgentSettings, CompletionMode};
use ai_onboarding::ApiKeysWithProviders;
use buffer_diff::BufferDiff;
use client::UserStore;
use cloud_llm_client::CompletionIntent;
use collections::{HashMap, HashSet};
use editor::actions::{MoveUp, Paste};
use editor::display_map::CreaseId;
@@ -53,7 +54,6 @@ use util::ResultExt as _;
use workspace::{CollaboratorId, Workspace};
use zed_actions::agent::Chat;
use zed_actions::agent::ToggleModelSelector;
use zed_llm_client::CompletionIntent;
use crate::context_picker::{ContextPicker, ContextPickerCompletionProvider, crease_for_mention};
use crate::context_strip::{ContextStrip, ContextStripEvent, SuggestContextKind};
@@ -1300,11 +1300,11 @@ impl MessageEditor {
let plan = user_store
.current_plan()
.map(|plan| match plan {
Plan::Free => zed_llm_client::Plan::ZedFree,
Plan::ZedPro => zed_llm_client::Plan::ZedPro,
Plan::ZedProTrial => zed_llm_client::Plan::ZedProTrial,
Plan::Free => cloud_llm_client::Plan::ZedFree,
Plan::ZedPro => cloud_llm_client::Plan::ZedPro,
Plan::ZedProTrial => cloud_llm_client::Plan::ZedProTrial,
})
.unwrap_or(zed_llm_client::Plan::ZedFree);
.unwrap_or(cloud_llm_client::Plan::ZedFree);
let usage = user_store.model_request_usage()?;


@@ -10,6 +10,7 @@ use agent::{
use agent_settings::AgentSettings;
use anyhow::{Context as _, Result};
use client::telemetry::Telemetry;
use cloud_llm_client::CompletionIntent;
use collections::{HashMap, VecDeque};
use editor::{MultiBuffer, actions::SelectAll};
use fs::Fs;
@@ -27,7 +28,6 @@ use terminal_view::TerminalView;
use ui::prelude::*;
use util::ResultExt;
use workspace::{Toast, Workspace, notifications::NotificationId};
use zed_llm_client::CompletionIntent;
pub fn init(
fs: Arc<dyn Fs>,


@@ -1,8 +1,8 @@
use client::{ModelRequestUsage, RequestUsage, zed_urls};
use cloud_llm_client::{Plan, UsageLimit};
use component::{empty_example, example_group_with_title, single_example};
use gpui::{AnyElement, App, IntoElement, RenderOnce, Window};
use ui::{Callout, prelude::*};
use zed_llm_client::{Plan, UsageLimit};
#[derive(IntoElement, RegisterComponent)]
pub struct UsageCallout {


@@ -136,10 +136,7 @@ impl RenderOnce for ApiKeysWithoutProviders {
.full_width()
.style(ButtonStyle::Outlined)
.on_click(move |_, window, cx| {
window.dispatch_action(
zed_actions::agent::OpenConfiguration.boxed_clone(),
cx,
);
window.dispatch_action(zed_actions::agent::OpenSettings.boxed_clone(), cx);
}),
)
}


@@ -19,6 +19,7 @@ assistant_slash_commands.workspace = true
chrono.workspace = true
client.workspace = true
clock.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
context_server.workspace = true
fs.workspace = true
@@ -48,7 +49,6 @@ util.workspace = true
uuid.workspace = true
workspace-hack.workspace = true
workspace.workspace = true
zed_llm_client.workspace = true
[dev-dependencies]
indoc.workspace = true


@@ -11,6 +11,7 @@ use assistant_slash_command::{
use assistant_slash_commands::FileCommandMetadata;
use client::{self, Client, proto, telemetry::Telemetry};
use clock::ReplicaId;
use cloud_llm_client::CompletionIntent;
use collections::{HashMap, HashSet};
use fs::{Fs, RenameOptions};
use futures::{FutureExt, StreamExt, future::Shared};
@@ -46,7 +47,6 @@ use text::{BufferSnapshot, ToPoint};
use ui::IconName;
use util::{ResultExt, TryFutureExt, post_inc};
use uuid::Uuid;
use zed_llm_client::CompletionIntent;
pub use crate::context_store::*;


@@ -21,9 +21,11 @@ assistant_tool.workspace = true
buffer_diff.workspace = true
chrono.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
component.workspace = true
derive_more.workspace = true
diffy = "0.4.2"
editor.workspace = true
feature_flags.workspace = true
futures.workspace = true
@@ -63,8 +65,6 @@ web_search.workspace = true
which.workspace = true
workspace-hack.workspace = true
workspace.workspace = true
zed_llm_client.workspace = true
diffy = "0.4.2"
[dev-dependencies]
lsp = { workspace = true, features = ["test-support"] }


@@ -7,6 +7,7 @@ mod streaming_fuzzy_matcher;
use crate::{Template, Templates};
use anyhow::Result;
use assistant_tool::ActionLog;
use cloud_llm_client::CompletionIntent;
use create_file_parser::{CreateFileParser, CreateFileParserEvent};
pub use edit_parser::EditFormat;
use edit_parser::{EditParser, EditParserEvent, EditParserMetrics};
@@ -29,7 +30,6 @@ use std::{cmp, iter, mem, ops::Range, path::PathBuf, pin::Pin, sync::Arc, task::
use streaming_diff::{CharOperation, StreamingDiff};
use streaming_fuzzy_matcher::StreamingFuzzyMatcher;
use util::debug_panic;
use zed_llm_client::CompletionIntent;
#[derive(Serialize)]
struct CreateFilePromptTemplate {


@@ -1663,47 +1663,68 @@ async fn retry_on_rate_limit<R>(mut request: impl AsyncFnMut() -> Result<R>) ->
attempt += 1;
match request().await {
Ok(result) => return Ok(result),
Err(err) => match err.downcast::<LanguageModelCompletionError>() {
Ok(err) => match &err {
LanguageModelCompletionError::RateLimitExceeded { retry_after, .. }
| LanguageModelCompletionError::ServerOverloaded { retry_after, .. } => {
let retry_after = retry_after.unwrap_or(Duration::from_secs(5));
// Wait for the duration supplied, with some jitter to avoid all requests being made at the same time.
let jitter = retry_after.mul_f64(rand::thread_rng().gen_range(0.0..1.0));
eprintln!(
"Attempt #{attempt}: {err}. Retry after {retry_after:?} + jitter of {jitter:?}"
);
Timer::after(retry_after + jitter).await;
continue;
}
LanguageModelCompletionError::UpstreamProviderError {
status,
retry_after,
..
} => {
// Only retry for specific status codes
let should_retry = matches!(
*status,
StatusCode::TOO_MANY_REQUESTS | StatusCode::SERVICE_UNAVAILABLE
) || status.as_u16() == 529;
Err(err) => {
if attempt > 20 {
return Err(err);
}
if !should_retry {
return Err(err.into());
match err.downcast::<LanguageModelCompletionError>() {
Ok(err) => match &err {
LanguageModelCompletionError::RateLimitExceeded { retry_after, .. }
| LanguageModelCompletionError::ServerOverloaded { retry_after, .. } => {
let retry_after = retry_after.unwrap_or(Duration::from_secs(5));
// Wait for the duration supplied, with some jitter to avoid all requests being made at the same time.
let jitter =
retry_after.mul_f64(rand::thread_rng().gen_range(0.0..1.0));
eprintln!(
"Attempt #{attempt}: {err}. Retry after {retry_after:?} + jitter of {jitter:?}"
);
Timer::after(retry_after + jitter).await;
continue;
}
LanguageModelCompletionError::UpstreamProviderError {
status,
retry_after,
..
} => {
// Only retry for specific status codes
let should_retry = matches!(
*status,
StatusCode::TOO_MANY_REQUESTS | StatusCode::SERVICE_UNAVAILABLE
) || status.as_u16() == 529;
// Use server-provided retry_after if available, otherwise use default
let retry_after = retry_after.unwrap_or(Duration::from_secs(5));
let jitter = retry_after.mul_f64(rand::thread_rng().gen_range(0.0..1.0));
eprintln!(
"Attempt #{attempt}: {err}. Retry after {retry_after:?} + jitter of {jitter:?}"
);
Timer::after(retry_after + jitter).await;
continue;
}
_ => return Err(err.into()),
},
Err(err) => return Err(err),
},
if !should_retry {
return Err(err.into());
}
// Use server-provided retry_after if available, otherwise use default
let retry_after = retry_after.unwrap_or(Duration::from_secs(5));
let jitter =
retry_after.mul_f64(rand::thread_rng().gen_range(0.0..1.0));
eprintln!(
"Attempt #{attempt}: {err}. Retry after {retry_after:?} + jitter of {jitter:?}"
);
Timer::after(retry_after + jitter).await;
continue;
}
LanguageModelCompletionError::ApiInternalServerError { .. }
| LanguageModelCompletionError::ApiReadResponseError { .. }
| LanguageModelCompletionError::DeserializeResponse { .. }
| LanguageModelCompletionError::HttpSend { .. } => {
let retry_after = Duration::from_secs(attempt);
let jitter =
retry_after.mul_f64(rand::thread_rng().gen_range(0.0..1.0));
eprintln!(
"Attempt #{attempt}: {err}. Retry after {retry_after:?} + jitter of {jitter:?}"
);
Timer::after(retry_after + jitter).await;
continue;
}
_ => return Err(err.into()),
},
Err(err) => return Err(err),
}
}
}
}
}
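The retry loop above computes its sleep from one of two bases: rate-limit and overload errors use the server-provided `retry_after` (defaulting to 5 s), while transient I/O errors use a base that grows with the attempt number; in both cases a random jitter proportional to the base is added. A minimal, deterministic sketch of that delay computation, where `jitter_fraction` stands in for `rand::thread_rng().gen_range(0.0..1.0)`:

```rust
use std::time::Duration;

// Sketch of the delay computation in `retry_on_rate_limit` above.
// `jitter_fraction` replaces the random draw so the example is deterministic.
fn retry_delay(attempt: u32, retry_after: Option<Duration>, jitter_fraction: f64) -> Duration {
    // Rate-limit/overload errors carry a server Retry-After (default 5s);
    // transient I/O errors fall back to a delay that grows with the attempt.
    let base = retry_after.unwrap_or(Duration::from_secs(attempt as u64));
    base + base.mul_f64(jitter_fraction)
}

fn main() {
    // Transient error on attempt 3, zero jitter: 3s.
    assert_eq!(retry_delay(3, None, 0.0), Duration::from_secs(3));
    // Server said Retry-After: 5s, plus half of it as jitter: 7.5s.
    assert_eq!(
        retry_delay(1, Some(Duration::from_secs(5)), 0.5),
        Duration::from_secs_f64(7.5)
    );
}
```

In the real code the jitter fraction is drawn per attempt, so repeated failures fan out across the window rather than retrying in lockstep.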


@@ -6,6 +6,7 @@ use anyhow::{Context as _, Result, anyhow};
use assistant_tool::{
ActionLog, Tool, ToolCard, ToolResult, ToolResultContent, ToolResultOutput, ToolUseStatus,
};
use cloud_llm_client::{WebSearchResponse, WebSearchResult};
use futures::{Future, FutureExt, TryFutureExt};
use gpui::{
AnyWindowHandle, App, AppContext, Context, Entity, IntoElement, Task, WeakEntity, Window,
@@ -17,7 +18,6 @@ use serde::{Deserialize, Serialize};
use ui::{IconName, Tooltip, prelude::*};
use web_search::WebSearchRegistry;
use workspace::Workspace;
use zed_llm_client::{WebSearchResponse, WebSearchResult};
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct WebSearchToolInput {


@@ -18,6 +18,6 @@ collections.workspace = true
derive_more.workspace = true
gpui.workspace = true
parking_lot.workspace = true
rodio = { version = "0.20.0", default-features = false, features = ["wav"] }
rodio = { version = "0.21.1", default-features = false, features = ["wav", "playback", "tracing"] }
util.workspace = true
workspace-hack.workspace = true


@@ -3,12 +3,9 @@ use std::{io::Cursor, sync::Arc};
use anyhow::{Context as _, Result};
use collections::HashMap;
use gpui::{App, AssetSource, Global};
use rodio::{
Decoder, Source,
source::{Buffered, SamplesConverter},
};
use rodio::{Decoder, Source, source::Buffered};
type Sound = Buffered<SamplesConverter<Decoder<Cursor<Vec<u8>>>, f32>>;
type Sound = Buffered<Decoder<Cursor<Vec<u8>>>>;
pub struct SoundRegistry {
cache: Arc<parking_lot::Mutex<HashMap<String, Sound>>>,
@@ -48,7 +45,7 @@ impl SoundRegistry {
.with_context(|| format!("No asset available for path {path}"))??
.into_owned();
let cursor = Cursor::new(bytes);
let source = Decoder::new(cursor)?.convert_samples::<f32>().buffered();
let source = Decoder::new(cursor)?.buffered();
self.cache.lock().insert(name.to_string(), source.clone());


@@ -1,7 +1,7 @@
use assets::SoundRegistry;
use derive_more::{Deref, DerefMut};
use gpui::{App, AssetSource, BorrowAppContext, Global};
use rodio::{OutputStream, OutputStreamHandle};
use rodio::{OutputStream, OutputStreamBuilder};
use util::ResultExt;
mod assets;
@@ -37,8 +37,7 @@ impl Sound {
#[derive(Default)]
pub struct Audio {
_output_stream: Option<OutputStream>,
output_handle: Option<OutputStreamHandle>,
output_handle: Option<OutputStream>,
}
#[derive(Deref, DerefMut)]
@@ -51,11 +50,9 @@ impl Audio {
Self::default()
}
fn ensure_output_exists(&mut self) -> Option<&OutputStreamHandle> {
fn ensure_output_exists(&mut self) -> Option<&OutputStream> {
if self.output_handle.is_none() {
let (_output_stream, output_handle) = OutputStream::try_default().log_err().unzip();
self.output_handle = output_handle;
self._output_stream = _output_stream;
self.output_handle = OutputStreamBuilder::open_default_stream().log_err();
}
self.output_handle.as_ref()
@@ -69,7 +66,7 @@ impl Audio {
cx.update_global::<GlobalAudio, _>(|this, cx| {
let output_handle = this.ensure_output_exists()?;
let source = SoundRegistry::global(cx).get(sound.file()).log_err()?;
output_handle.play_raw(source).log_err()?;
output_handle.mixer().add(source);
Some(())
});
}
@@ -80,7 +77,6 @@ impl Audio {
}
cx.update_global::<GlobalAudio, _>(|this, _| {
this._output_stream.take();
this.output_handle.take();
});
}


@@ -22,6 +22,8 @@ async-tungstenite = { workspace = true, features = ["tokio", "tokio-rustls-manua
base64.workspace = true
chrono = { workspace = true, features = ["serde"] }
clock.workspace = true
cloud_api_client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
credentials_provider.workspace = true
derive_more.workspace = true
@@ -33,8 +35,8 @@ http_client.workspace = true
http_client_tls.workspace = true
httparse = "1.10"
log.workspace = true
paths.workspace = true
parking_lot.workspace = true
paths.workspace = true
postage.workspace = true
rand.workspace = true
regex.workspace = true
@@ -46,19 +48,18 @@ serde_json.workspace = true
settings.workspace = true
sha2.workspace = true
smol.workspace = true
telemetry.workspace = true
telemetry_events.workspace = true
text.workspace = true
thiserror.workspace = true
time.workspace = true
tiny_http.workspace = true
tokio-socks = { version = "0.5.2", default-features = false, features = ["futures-io"] }
tokio.workspace = true
url.workspace = true
util.workspace = true
worktree.workspace = true
telemetry.workspace = true
tokio.workspace = true
workspace-hack.workspace = true
zed_llm_client.workspace = true
worktree.workspace = true
[dev-dependencies]
clock = { workspace = true, features = ["test-support"] }


@@ -1,6 +1,7 @@
#[cfg(any(test, feature = "test-support"))]
pub mod test;
mod cloud;
mod proxy;
pub mod telemetry;
pub mod user;
@@ -15,13 +16,14 @@ use async_tungstenite::tungstenite::{
};
use chrono::{DateTime, Utc};
use clock::SystemClock;
use cloud_api_client::CloudApiClient;
use credentials_provider::CredentialsProvider;
use futures::{
AsyncReadExt, FutureExt, SinkExt, Stream, StreamExt, TryFutureExt as _, TryStreamExt,
channel::oneshot, future::BoxFuture,
};
use gpui::{App, AsyncApp, Entity, Global, Task, WeakEntity, actions};
use http_client::{AsyncBody, HttpClient, HttpClientWithUrl};
use http_client::{AsyncBody, HttpClient, HttpClientWithUrl, http};
use parking_lot::RwLock;
use postage::watch;
use proxy::connect_proxy_stream;
@@ -31,7 +33,6 @@ use rpc::proto::{AnyTypedEnvelope, EnvelopedMessage, PeerId, RequestMessage};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
use std::pin::Pin;
use std::{
any::TypeId,
convert::TryFrom,
@@ -45,12 +46,14 @@ use std::{
},
time::{Duration, Instant},
};
use std::{cmp, pin::Pin};
use telemetry::Telemetry;
use thiserror::Error;
use tokio::net::TcpStream;
use url::Url;
use util::{ConnectionResult, ResultExt};
pub use cloud::*;
pub use rpc::*;
pub use telemetry_events::Event;
pub use user::*;
@@ -78,7 +81,7 @@ pub static ZED_ALWAYS_ACTIVE: LazyLock<bool> =
LazyLock::new(|| std::env::var("ZED_ALWAYS_ACTIVE").map_or(false, |e| !e.is_empty()));
pub const INITIAL_RECONNECTION_DELAY: Duration = Duration::from_millis(500);
pub const MAX_RECONNECTION_DELAY: Duration = Duration::from_secs(10);
pub const MAX_RECONNECTION_DELAY: Duration = Duration::from_secs(30);
pub const CONNECTION_TIMEOUT: Duration = Duration::from_secs(20);
actions!(
@@ -213,6 +216,7 @@ pub struct Client {
id: AtomicU64,
peer: Arc<Peer>,
http: Arc<HttpClientWithUrl>,
cloud_client: Arc<CloudApiClient>,
telemetry: Arc<Telemetry>,
credentials_provider: ClientCredentialsProvider,
state: RwLock<ClientState>,
@@ -586,6 +590,7 @@ impl Client {
id: AtomicU64::new(0),
peer: Peer::new(0),
telemetry: Telemetry::new(clock, http.clone(), cx),
cloud_client: Arc::new(CloudApiClient::new(http.clone())),
http,
credentials_provider: ClientCredentialsProvider::new(cx),
state: Default::default(),
@@ -618,6 +623,10 @@ impl Client {
self.http.clone()
}
pub fn cloud_client(&self) -> Arc<CloudApiClient> {
self.cloud_client.clone()
}
pub fn set_id(&self, id: u64) -> &Self {
self.id.store(id, Ordering::SeqCst);
self
@@ -727,11 +736,10 @@ impl Client {
},
&cx,
);
cx.background_executor().timer(delay).await;
delay = delay
.mul_f32(rng.gen_range(0.5..=2.5))
.max(INITIAL_RECONNECTION_DELAY)
.min(MAX_RECONNECTION_DELAY);
let jitter =
Duration::from_millis(rng.gen_range(0..delay.as_millis() as u64));
cx.background_executor().timer(delay + jitter).await;
delay = cmp::min(delay * 2, MAX_RECONNECTION_DELAY);
} else {
break;
}
@@ -931,6 +939,8 @@ impl Client {
}
let credentials = credentials.unwrap();
self.set_id(credentials.user_id);
self.cloud_client
.set_credentials(credentials.user_id as u32, credentials.access_token.clone());
if was_disconnected {
self.set_status(Status::Connecting, cx);
@@ -1138,7 +1148,7 @@ impl Client {
.to_str()
.map_err(EstablishConnectionError::other)?
.to_string();
Url::parse(&collab_url).with_context(|| format!("parsing colab rpc url {collab_url}"))
Url::parse(&collab_url).with_context(|| format!("parsing collab rpc url {collab_url}"))
}
}
@@ -1158,6 +1168,7 @@ impl Client {
let http = self.http.clone();
let proxy = http.proxy().cloned();
let user_agent = http.user_agent().cloned();
let credentials = credentials.clone();
let rpc_url = self.rpc_url(http, release_channel);
let system_id = self.telemetry.system_id();
@@ -1209,7 +1220,7 @@ impl Client {
// We then modify the request to add our desired headers.
let request_headers = request.headers_mut();
request_headers.insert(
"Authorization",
http::header::AUTHORIZATION,
HeaderValue::from_str(&credentials.authorization_header())?,
);
request_headers.insert(
@@ -1221,6 +1232,9 @@ impl Client {
"x-zed-release-channel",
HeaderValue::from_str(release_channel.map(|r| r.dev_name()).unwrap_or("unknown"))?,
);
if let Some(user_agent) = user_agent {
request_headers.insert(http::header::USER_AGENT, user_agent);
}
if let Some(system_id) = system_id {
request_headers.insert("x-zed-system-id", HeaderValue::from_str(&system_id)?);
}
@@ -1477,6 +1491,7 @@ impl Client {
pub async fn sign_out(self: &Arc<Self>, cx: &AsyncApp) {
self.state.write().credentials = None;
self.cloud_client.clear_credentials();
self.disconnect(cx);
if self.has_credentials(cx).await {

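The client change above replaces the multiplicative-random reconnection delay with plain exponential doubling (plus a uniform jitter in `[0, delay)`), capped at the new 30 s `MAX_RECONNECTION_DELAY`. A sketch of the resulting schedule, with the jitter omitted for determinism:

```rust
use std::{cmp, time::Duration};

const INITIAL_RECONNECTION_DELAY: Duration = Duration::from_millis(500);
const MAX_RECONNECTION_DELAY: Duration = Duration::from_secs(30);

// Next base delay: double the previous one, capped at the maximum.
// (The real loop also sleeps an extra random jitter in [0, delay).)
fn next_delay(delay: Duration) -> Duration {
    cmp::min(delay * 2, MAX_RECONNECTION_DELAY)
}

fn main() {
    let mut delay = INITIAL_RECONNECTION_DELAY;
    let mut schedule = Vec::new();
    for _ in 0..8 {
        schedule.push(delay);
        delay = next_delay(delay);
    }
    // 0.5s, 1s, 2s, 4s, 8s, 16s, then pinned at the 30s cap.
    assert_eq!(schedule[0], Duration::from_millis(500));
    assert_eq!(schedule[5], Duration::from_secs(16));
    assert_eq!(schedule[6], Duration::from_secs(30));
    assert_eq!(schedule[7], Duration::from_secs(30));
}
```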

@@ -0,0 +1,3 @@
mod user_store;
pub use user_store::*;


@@ -0,0 +1,69 @@
use std::sync::Arc;
use std::time::Duration;
use anyhow::Context as _;
use cloud_api_client::{AuthenticatedUser, CloudApiClient};
use gpui::{Context, Task};
use util::{ResultExt as _, maybe};
pub struct CloudUserStore {
authenticated_user: Option<Arc<AuthenticatedUser>>,
_maintain_authenticated_user_task: Task<()>,
}
impl CloudUserStore {
pub fn new(cloud_client: Arc<CloudApiClient>, cx: &mut Context<Self>) -> Self {
Self {
authenticated_user: None,
_maintain_authenticated_user_task: cx.spawn(async move |this, cx| {
maybe!(async move {
loop {
let Some(this) = this.upgrade() else {
return anyhow::Ok(());
};
if cloud_client.has_credentials() {
let already_fetched_authenticated_user = this
.read_with(cx, |this, _cx| this.authenticated_user().is_some())
.unwrap_or(false);
if already_fetched_authenticated_user {
// We already fetched the authenticated user; nothing to do.
} else {
let authenticated_user_result = cloud_client
.get_authenticated_user()
.await
.context("failed to fetch authenticated user");
if let Some(response) = authenticated_user_result.log_err() {
this.update(cx, |this, _cx| {
this.authenticated_user = Some(Arc::new(response.user));
})
.ok();
}
}
} else {
this.update(cx, |this, _cx| {
this.authenticated_user = None;
})
.ok();
}
cx.background_executor()
.timer(Duration::from_millis(100))
.await;
}
})
.await
.log_err();
}),
}
}
pub fn is_authenticated(&self) -> bool {
self.authenticated_user.is_some()
}
pub fn authenticated_user(&self) -> Option<Arc<AuthenticatedUser>> {
self.authenticated_user.clone()
}
}


@@ -1,6 +1,10 @@
use super::{Client, Status, TypedEnvelope, proto};
use anyhow::{Context as _, Result, anyhow};
use chrono::{DateTime, Utc};
use cloud_llm_client::{
EDIT_PREDICTIONS_USAGE_AMOUNT_HEADER_NAME, EDIT_PREDICTIONS_USAGE_LIMIT_HEADER_NAME,
MODEL_REQUESTS_USAGE_AMOUNT_HEADER_NAME, MODEL_REQUESTS_USAGE_LIMIT_HEADER_NAME, UsageLimit,
};
use collections::{HashMap, HashSet, hash_map::Entry};
use derive_more::Deref;
use feature_flags::FeatureFlagAppExt;
@@ -17,10 +21,6 @@ use std::{
};
use text::ReplicaId;
use util::{TryFutureExt as _, maybe};
use zed_llm_client::{
EDIT_PREDICTIONS_USAGE_AMOUNT_HEADER_NAME, EDIT_PREDICTIONS_USAGE_LIMIT_HEADER_NAME,
MODEL_REQUESTS_USAGE_AMOUNT_HEADER_NAME, MODEL_REQUESTS_USAGE_LIMIT_HEADER_NAME, UsageLimit,
};
pub type UserId = u64;


@@ -0,0 +1,21 @@
[package]
name = "cloud_api_client"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "Apache-2.0"
[lints]
workspace = true
[lib]
path = "src/cloud_api_client.rs"
[dependencies]
anyhow.workspace = true
cloud_api_types.workspace = true
futures.workspace = true
http_client.workspace = true
parking_lot.workspace = true
serde_json.workspace = true
workspace-hack.workspace = true


@@ -0,0 +1 @@
../../LICENSE-APACHE


@@ -0,0 +1,83 @@
use std::sync::Arc;
use anyhow::{Result, anyhow};
pub use cloud_api_types::*;
use futures::AsyncReadExt as _;
use http_client::{AsyncBody, HttpClientWithUrl, Method, Request};
use parking_lot::RwLock;
struct Credentials {
user_id: u32,
access_token: String,
}
pub struct CloudApiClient {
credentials: RwLock<Option<Credentials>>,
http_client: Arc<HttpClientWithUrl>,
}
impl CloudApiClient {
pub fn new(http_client: Arc<HttpClientWithUrl>) -> Self {
Self {
credentials: RwLock::new(None),
http_client,
}
}
pub fn has_credentials(&self) -> bool {
self.credentials.read().is_some()
}
pub fn set_credentials(&self, user_id: u32, access_token: String) {
*self.credentials.write() = Some(Credentials {
user_id,
access_token,
});
}
pub fn clear_credentials(&self) {
*self.credentials.write() = None;
}
fn authorization_header(&self) -> Result<String> {
let guard = self.credentials.read();
let credentials = guard
.as_ref()
.ok_or_else(|| anyhow!("No credentials provided"))?;
Ok(format!(
"{} {}",
credentials.user_id, credentials.access_token
))
}
pub async fn get_authenticated_user(&self) -> Result<GetAuthenticatedUserResponse> {
let request = Request::builder()
.method(Method::GET)
.uri(
self.http_client
.build_zed_cloud_url("/client/users/me", &[])?
.as_ref(),
)
.header("Content-Type", "application/json")
.header("Authorization", self.authorization_header()?)
.body(AsyncBody::default())?;
let mut response = self.http_client.send(request).await?;
if !response.status().is_success() {
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
anyhow::bail!(
"Failed to get authenticated user.\nStatus: {:?}\nBody: {body}",
response.status()
)
}
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
Ok(serde_json::from_str(&body)?)
}
}
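Note that `authorization_header` above builds a nonstandard `"<user_id> <access_token>"` pair rather than a `Bearer` scheme. A minimal std-only sketch of that formatting (the free function here is illustrative, not part of the crate):

```rust
// Illustrative sketch: builds the "<user_id> <access_token>" Authorization
// value in the same shape as CloudApiClient::authorization_header above.
fn authorization_header(user_id: u32, access_token: &str) -> String {
    format!("{} {}", user_id, access_token)
}

fn main() {
    let header = authorization_header(42, "secret-token");
    assert_eq!(header, "42 secret-token");
    println!("{header}");
}
```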

View File

@@ -0,0 +1,22 @@
[package]
name = "cloud_api_types"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "Apache-2.0"
[lints]
workspace = true
[lib]
path = "src/cloud_api_types.rs"
[dependencies]
chrono.workspace = true
cloud_llm_client.workspace = true
serde.workspace = true
workspace-hack.workspace = true
[dev-dependencies]
pretty_assertions.workspace = true
serde_json.workspace = true

View File

@@ -0,0 +1 @@
../../LICENSE-APACHE

View File

@@ -0,0 +1,40 @@
mod timestamp;
use serde::{Deserialize, Serialize};
pub use crate::timestamp::Timestamp;
#[derive(Debug, PartialEq, Serialize, Deserialize)]
pub struct GetAuthenticatedUserResponse {
pub user: AuthenticatedUser,
pub feature_flags: Vec<String>,
pub plan: PlanInfo,
}
#[derive(Debug, PartialEq, Serialize, Deserialize)]
pub struct AuthenticatedUser {
pub id: i32,
pub metrics_id: String,
pub avatar_url: String,
pub github_login: String,
pub name: Option<String>,
pub is_staff: bool,
pub accepted_tos_at: Option<Timestamp>,
}
#[derive(Debug, PartialEq, Serialize, Deserialize)]
pub struct PlanInfo {
pub plan: cloud_llm_client::Plan,
pub subscription_period: Option<SubscriptionPeriod>,
pub usage: cloud_llm_client::CurrentUsage,
pub trial_started_at: Option<Timestamp>,
pub is_usage_based_billing_enabled: bool,
pub is_account_too_young: bool,
pub has_overdue_invoices: bool,
}
#[derive(Debug, PartialEq, Clone, Copy, Serialize, Deserialize)]
pub struct SubscriptionPeriod {
pub started_at: Timestamp,
pub ended_at: Timestamp,
}

View File

@@ -0,0 +1,166 @@
use chrono::{DateTime, NaiveDateTime, SecondsFormat, Utc};
use serde::{Deserialize, Deserializer, Serialize, Serializer};
/// A timestamp with a serialized representation in RFC 3339 format.
#[derive(Debug, PartialEq, Eq, Hash, Clone, Copy)]
pub struct Timestamp(pub DateTime<Utc>);
impl Timestamp {
pub fn new(datetime: DateTime<Utc>) -> Self {
Self(datetime)
}
}
impl From<DateTime<Utc>> for Timestamp {
fn from(value: DateTime<Utc>) -> Self {
Self(value)
}
}
impl From<NaiveDateTime> for Timestamp {
fn from(value: NaiveDateTime) -> Self {
Self(value.and_utc())
}
}
impl Serialize for Timestamp {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let rfc3339_string = self.0.to_rfc3339_opts(SecondsFormat::Millis, true);
serializer.serialize_str(&rfc3339_string)
}
}
impl<'de> Deserialize<'de> for Timestamp {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: Deserializer<'de>,
{
let value = String::deserialize(deserializer)?;
let datetime = DateTime::parse_from_rfc3339(&value)
.map_err(serde::de::Error::custom)?
.to_utc();
Ok(Self(datetime))
}
}
#[cfg(test)]
mod tests {
use chrono::NaiveDate;
use pretty_assertions::assert_eq;
use super::*;
#[test]
fn test_timestamp_serialization() {
let datetime = DateTime::parse_from_rfc3339("2023-12-25T14:30:45.123Z")
.unwrap()
.to_utc();
let timestamp = Timestamp::new(datetime);
let json = serde_json::to_string(&timestamp).unwrap();
assert_eq!(json, "\"2023-12-25T14:30:45.123Z\"");
}
#[test]
fn test_timestamp_deserialization() {
let json = "\"2023-12-25T14:30:45.123Z\"";
let timestamp: Timestamp = serde_json::from_str(json).unwrap();
let expected = DateTime::parse_from_rfc3339("2023-12-25T14:30:45.123Z")
.unwrap()
.to_utc();
assert_eq!(timestamp.0, expected);
}
#[test]
fn test_timestamp_roundtrip() {
let original = DateTime::parse_from_rfc3339("2023-12-25T14:30:45.123Z")
.unwrap()
.to_utc();
let timestamp = Timestamp::new(original);
let json = serde_json::to_string(&timestamp).unwrap();
let deserialized: Timestamp = serde_json::from_str(&json).unwrap();
assert_eq!(deserialized.0, original);
}
#[test]
fn test_timestamp_from_datetime_utc() {
let datetime = DateTime::parse_from_rfc3339("2023-12-25T14:30:45.123Z")
.unwrap()
.to_utc();
let timestamp = Timestamp::from(datetime);
assert_eq!(timestamp.0, datetime);
}
#[test]
fn test_timestamp_from_naive_datetime() {
let naive_dt = NaiveDate::from_ymd_opt(2023, 12, 25)
.unwrap()
.and_hms_milli_opt(14, 30, 45, 123)
.unwrap();
let timestamp = Timestamp::from(naive_dt);
let expected = naive_dt.and_utc();
assert_eq!(timestamp.0, expected);
}
#[test]
fn test_timestamp_serialization_with_microseconds() {
// Test that microseconds are truncated to milliseconds
let datetime = NaiveDate::from_ymd_opt(2023, 12, 25)
.unwrap()
.and_hms_micro_opt(14, 30, 45, 123456)
.unwrap()
.and_utc();
let timestamp = Timestamp::new(datetime);
let json = serde_json::to_string(&timestamp).unwrap();
// Should be truncated to milliseconds
assert_eq!(json, "\"2023-12-25T14:30:45.123Z\"");
}
#[test]
fn test_timestamp_deserialization_without_milliseconds() {
let json = "\"2023-12-25T14:30:45Z\"";
let timestamp: Timestamp = serde_json::from_str(json).unwrap();
let expected = NaiveDate::from_ymd_opt(2023, 12, 25)
.unwrap()
.and_hms_opt(14, 30, 45)
.unwrap()
.and_utc();
assert_eq!(timestamp.0, expected);
}
#[test]
fn test_timestamp_deserialization_with_timezone() {
let json = "\"2023-12-25T14:30:45.123+05:30\"";
let timestamp: Timestamp = serde_json::from_str(json).unwrap();
// Should be converted to UTC
let expected = NaiveDate::from_ymd_opt(2023, 12, 25)
.unwrap()
.and_hms_milli_opt(9, 0, 45, 123) // 14:30:45.123 at +05:30 is 09:00:45.123 UTC
.unwrap()
.and_utc();
assert_eq!(timestamp.0, expected);
}
#[test]
fn test_timestamp_deserialization_with_invalid_format() {
let json = "\"invalid-date\"";
let result: Result<Timestamp, _> = serde_json::from_str(json);
assert!(result.is_err());
}
}
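The timezone test above relies on `DateTime::parse_from_rfc3339` normalizing the `+05:30` offset to UTC; the arithmetic itself is just subtracting the offset from the local wall-clock time, as this std-only sketch shows (a hypothetical helper, not part of the crate):

```rust
// Convert a local wall-clock time plus a UTC offset into UTC, expressed as
// seconds since midnight. Times east of Greenwich are ahead of UTC, so the
// offset is subtracted: 14:30:45 at +05:30 is 09:00:45 UTC.
fn to_utc_seconds(h: i64, m: i64, s: i64, offset_minutes: i64) -> i64 {
    let local = h * 3600 + m * 60 + s;
    (local - offset_minutes * 60).rem_euclid(24 * 3600)
}

fn main() {
    let utc = to_utc_seconds(14, 30, 45, 5 * 60 + 30);
    assert_eq!(utc, 9 * 3600 + 45); // 09:00:45
    println!("{:02}:{:02}:{:02}", utc / 3600, (utc % 3600) / 60, utc % 60);
}
```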

View File

@@ -0,0 +1,23 @@
[package]
name = "cloud_llm_client"
version = "0.1.0"
publish.workspace = true
edition.workspace = true
license = "Apache-2.0"
[lints]
workspace = true
[lib]
path = "src/cloud_llm_client.rs"
[dependencies]
anyhow.workspace = true
serde = { workspace = true, features = ["derive", "rc"] }
serde_json.workspace = true
strum = { workspace = true, features = ["derive"] }
uuid = { workspace = true, features = ["serde"] }
workspace-hack.workspace = true
[dev-dependencies]
pretty_assertions.workspace = true

View File

@@ -0,0 +1 @@
../../LICENSE-APACHE

View File

@@ -0,0 +1,370 @@
use std::str::FromStr;
use std::sync::Arc;
use anyhow::Context as _;
use serde::{Deserialize, Serialize};
use strum::{Display, EnumIter, EnumString};
use uuid::Uuid;
/// The name of the header used to indicate which version of Zed the client is running.
pub const ZED_VERSION_HEADER_NAME: &str = "x-zed-version";
/// The name of the header used to indicate when a request failed due to an
/// expired LLM token.
///
/// The client may use this as a signal to refresh the token.
pub const EXPIRED_LLM_TOKEN_HEADER_NAME: &str = "x-zed-expired-token";
/// The name of the header used to indicate what plan the user is currently on.
pub const CURRENT_PLAN_HEADER_NAME: &str = "x-zed-plan";
/// The name of the header used to indicate the usage limit for model requests.
pub const MODEL_REQUESTS_USAGE_LIMIT_HEADER_NAME: &str = "x-zed-model-requests-usage-limit";
/// The name of the header used to indicate the usage amount for model requests.
pub const MODEL_REQUESTS_USAGE_AMOUNT_HEADER_NAME: &str = "x-zed-model-requests-usage-amount";
/// The name of the header used to indicate the usage limit for edit predictions.
pub const EDIT_PREDICTIONS_USAGE_LIMIT_HEADER_NAME: &str = "x-zed-edit-predictions-usage-limit";
/// The name of the header used to indicate the usage amount for edit predictions.
pub const EDIT_PREDICTIONS_USAGE_AMOUNT_HEADER_NAME: &str = "x-zed-edit-predictions-usage-amount";
/// The name of the header used to indicate the resource for which the subscription limit has been reached.
pub const SUBSCRIPTION_LIMIT_RESOURCE_HEADER_NAME: &str = "x-zed-subscription-limit-resource";
pub const MODEL_REQUESTS_RESOURCE_HEADER_VALUE: &str = "model_requests";
pub const EDIT_PREDICTIONS_RESOURCE_HEADER_VALUE: &str = "edit_predictions";
/// The name of the header used to indicate that the maximum number of consecutive tool uses has been reached.
pub const TOOL_USE_LIMIT_REACHED_HEADER_NAME: &str = "x-zed-tool-use-limit-reached";
/// The name of the header used to indicate the minimum required Zed version.
///
/// This can be used to force a Zed upgrade in order to continue communicating
/// with the LLM service.
pub const MINIMUM_REQUIRED_VERSION_HEADER_NAME: &str = "x-zed-minimum-required-version";
/// The name of the header used by the client to indicate to the server that it supports receiving status messages.
pub const CLIENT_SUPPORTS_STATUS_MESSAGES_HEADER_NAME: &str =
"x-zed-client-supports-status-messages";
/// The name of the header used by the server to indicate to the client that it supports sending status messages.
pub const SERVER_SUPPORTS_STATUS_MESSAGES_HEADER_NAME: &str =
"x-zed-server-supports-status-messages";
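A client consuming the usage headers above would read the amount/limit pair off the response. A std-only sketch with a plain `HashMap` standing in for a real header map (hypothetical helper; it only handles numeric limits, not the `unlimited` value the limit header can also carry):

```rust
use std::collections::HashMap;

// Hypothetical helper: read the model-request usage headers defined above.
// Returns None if either header is missing or not a plain integer.
fn read_usage(headers: &HashMap<&str, &str>) -> Option<(i32, i32)> {
    let amount = headers.get("x-zed-model-requests-usage-amount")?.parse().ok()?;
    let limit = headers.get("x-zed-model-requests-usage-limit")?.parse().ok()?;
    Some((amount, limit))
}

fn main() {
    let mut headers = HashMap::new();
    headers.insert("x-zed-model-requests-usage-amount", "42");
    headers.insert("x-zed-model-requests-usage-limit", "500");
    assert_eq!(read_usage(&headers), Some((42, 500)));
}
```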
#[derive(Debug, PartialEq, Clone, Copy, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum UsageLimit {
Limited(i32),
Unlimited,
}
impl FromStr for UsageLimit {
type Err = anyhow::Error;
fn from_str(value: &str) -> Result<Self, Self::Err> {
match value {
"unlimited" => Ok(Self::Unlimited),
limit => limit
.parse::<i32>()
.map(Self::Limited)
.context("failed to parse limit"),
}
}
}
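The parsing rule above — the literal `unlimited`, otherwise a signed integer — can be sketched without the crate's `anyhow` error type, using std only (the local `UsageLimit` mirrors the one defined above):

```rust
use std::str::FromStr;

// Local mirror of the crate's UsageLimit, for illustration only.
#[derive(Debug, PartialEq)]
enum UsageLimit {
    Limited(i32),
    Unlimited,
}

impl FromStr for UsageLimit {
    type Err = std::num::ParseIntError;

    fn from_str(value: &str) -> Result<Self, Self::Err> {
        match value {
            "unlimited" => Ok(Self::Unlimited),
            limit => limit.parse::<i32>().map(Self::Limited),
        }
    }
}

fn main() {
    assert_eq!("unlimited".parse::<UsageLimit>(), Ok(UsageLimit::Unlimited));
    assert_eq!("50".parse::<UsageLimit>(), Ok(UsageLimit::Limited(50)));
    assert!("50xyz".parse::<UsageLimit>().is_err());
}
```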
#[derive(Debug, Clone, Copy, Default, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum Plan {
#[default]
#[serde(alias = "Free")]
ZedFree,
#[serde(alias = "ZedPro")]
ZedPro,
#[serde(alias = "ZedProTrial")]
ZedProTrial,
}
impl Plan {
pub fn as_str(&self) -> &'static str {
match self {
Plan::ZedFree => "zed_free",
Plan::ZedPro => "zed_pro",
Plan::ZedProTrial => "zed_pro_trial",
}
}
pub fn model_requests_limit(&self) -> UsageLimit {
match self {
Plan::ZedPro => UsageLimit::Limited(500),
Plan::ZedProTrial => UsageLimit::Limited(150),
Plan::ZedFree => UsageLimit::Limited(50),
}
}
pub fn edit_predictions_limit(&self) -> UsageLimit {
match self {
Plan::ZedPro => UsageLimit::Unlimited,
Plan::ZedProTrial => UsageLimit::Unlimited,
Plan::ZedFree => UsageLimit::Limited(2_000),
}
}
}
impl FromStr for Plan {
type Err = anyhow::Error;
fn from_str(value: &str) -> Result<Self, Self::Err> {
match value {
"zed_free" => Ok(Plan::ZedFree),
"zed_pro" => Ok(Plan::ZedPro),
"zed_pro_trial" => Ok(Plan::ZedProTrial),
plan => Err(anyhow::anyhow!("invalid plan: {plan:?}")),
}
}
}
#[derive(
Debug, PartialEq, Eq, Hash, Clone, Copy, Serialize, Deserialize, EnumString, EnumIter, Display,
)]
#[serde(rename_all = "snake_case")]
#[strum(serialize_all = "snake_case")]
pub enum LanguageModelProvider {
Anthropic,
OpenAi,
Google,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PredictEditsBody {
#[serde(skip_serializing_if = "Option::is_none", default)]
pub outline: Option<String>,
pub input_events: String,
pub input_excerpt: String,
#[serde(skip_serializing_if = "Option::is_none", default)]
pub speculated_output: Option<String>,
/// Whether the user provided consent for sampling this interaction.
#[serde(default, alias = "data_collection_permission")]
pub can_collect_data: bool,
#[serde(skip_serializing_if = "Option::is_none", default)]
pub diagnostic_groups: Option<Vec<(String, serde_json::Value)>>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PredictEditsResponse {
pub request_id: Uuid,
pub output_excerpt: String,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct AcceptEditPredictionBody {
pub request_id: Uuid,
}
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum CompletionMode {
Normal,
Max,
}
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum CompletionIntent {
UserPrompt,
ToolResults,
ThreadSummarization,
ThreadContextSummarization,
CreateFile,
EditFile,
InlineAssist,
TerminalInlineAssist,
GenerateGitCommitMessage,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct CompletionBody {
#[serde(skip_serializing_if = "Option::is_none", default)]
pub thread_id: Option<String>,
#[serde(skip_serializing_if = "Option::is_none", default)]
pub prompt_id: Option<String>,
#[serde(skip_serializing_if = "Option::is_none", default)]
pub intent: Option<CompletionIntent>,
#[serde(skip_serializing_if = "Option::is_none", default)]
pub mode: Option<CompletionMode>,
pub provider: LanguageModelProvider,
pub model: String,
pub provider_request: serde_json::Value,
}
#[derive(Debug, PartialEq, Clone, Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum CompletionRequestStatus {
Queued {
position: usize,
},
Started,
Failed {
code: String,
message: String,
request_id: Uuid,
/// Retry duration in seconds.
retry_after: Option<f64>,
},
UsageUpdated {
amount: usize,
limit: UsageLimit,
},
ToolUseLimitReached,
}
#[derive(Serialize, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum CompletionEvent<T> {
Status(CompletionRequestStatus),
Event(T),
}
impl<T> CompletionEvent<T> {
pub fn into_status(self) -> Option<CompletionRequestStatus> {
match self {
Self::Status(status) => Some(status),
Self::Event(_) => None,
}
}
pub fn into_event(self) -> Option<T> {
match self {
Self::Event(event) => Some(event),
Self::Status(_) => None,
}
}
}
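The `into_status`/`into_event` accessors let a consumer split a mixed stream into control-plane status messages and payload events. A std-only sketch of that consumption pattern, with local mirror types rather than the crate's (a `Vec` stands in for the real event stream):

```rust
// Local mirrors of the crate's types, for illustration only.
#[derive(Debug, PartialEq)]
enum CompletionRequestStatus {
    Started,
    ToolUseLimitReached,
}

enum CompletionEvent<T> {
    Status(CompletionRequestStatus),
    Event(T),
}

impl<T> CompletionEvent<T> {
    fn into_event(self) -> Option<T> {
        match self {
            Self::Event(event) => Some(event),
            Self::Status(_) => None,
        }
    }
}

fn main() {
    // A mixed stream: statuses interleaved with payload events.
    let stream: Vec<CompletionEvent<&str>> = vec![
        CompletionEvent::Status(CompletionRequestStatus::Started),
        CompletionEvent::Event("token-1"),
        CompletionEvent::Event("token-2"),
        CompletionEvent::Status(CompletionRequestStatus::ToolUseLimitReached),
    ];

    // into_event() keeps only the payload events, dropping statuses.
    let tokens: Vec<&str> = stream.into_iter().filter_map(|e| e.into_event()).collect();
    assert_eq!(tokens, ["token-1", "token-2"]);
}
```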
#[derive(Serialize, Deserialize)]
pub struct WebSearchBody {
pub query: String,
}
#[derive(Serialize, Deserialize, Clone)]
pub struct WebSearchResponse {
pub results: Vec<WebSearchResult>,
}
#[derive(Serialize, Deserialize, Clone)]
pub struct WebSearchResult {
pub title: String,
pub url: String,
pub text: String,
}
#[derive(Serialize, Deserialize)]
pub struct CountTokensBody {
pub provider: LanguageModelProvider,
pub model: String,
pub provider_request: serde_json::Value,
}
#[derive(Serialize, Deserialize)]
pub struct CountTokensResponse {
pub tokens: usize,
}
#[derive(Debug, PartialEq, Eq, Hash, Clone, Serialize, Deserialize)]
pub struct LanguageModelId(pub Arc<str>);
impl std::fmt::Display for LanguageModelId {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.0)
}
}
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct LanguageModel {
pub provider: LanguageModelProvider,
pub id: LanguageModelId,
pub display_name: String,
pub max_token_count: usize,
pub max_token_count_in_max_mode: Option<usize>,
pub max_output_tokens: usize,
pub supports_tools: bool,
pub supports_images: bool,
pub supports_thinking: bool,
pub supports_max_mode: bool,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct ListModelsResponse {
pub models: Vec<LanguageModel>,
pub default_model: LanguageModelId,
pub default_fast_model: LanguageModelId,
pub recommended_models: Vec<LanguageModelId>,
}
#[derive(Debug, Serialize, Deserialize)]
pub struct GetSubscriptionResponse {
pub plan: Plan,
pub usage: Option<CurrentUsage>,
}
#[derive(Debug, PartialEq, Serialize, Deserialize)]
pub struct CurrentUsage {
pub model_requests: UsageData,
pub edit_predictions: UsageData,
}
#[derive(Debug, PartialEq, Serialize, Deserialize)]
pub struct UsageData {
pub used: u32,
pub limit: UsageLimit,
}
#[cfg(test)]
mod tests {
use pretty_assertions::assert_eq;
use serde_json::json;
use super::*;
#[test]
fn test_plan_deserialize_snake_case() {
let plan = serde_json::from_value::<Plan>(json!("zed_free")).unwrap();
assert_eq!(plan, Plan::ZedFree);
let plan = serde_json::from_value::<Plan>(json!("zed_pro")).unwrap();
assert_eq!(plan, Plan::ZedPro);
let plan = serde_json::from_value::<Plan>(json!("zed_pro_trial")).unwrap();
assert_eq!(plan, Plan::ZedProTrial);
}
#[test]
fn test_plan_deserialize_aliases() {
let plan = serde_json::from_value::<Plan>(json!("Free")).unwrap();
assert_eq!(plan, Plan::ZedFree);
let plan = serde_json::from_value::<Plan>(json!("ZedPro")).unwrap();
assert_eq!(plan, Plan::ZedPro);
let plan = serde_json::from_value::<Plan>(json!("ZedProTrial")).unwrap();
assert_eq!(plan, Plan::ZedProTrial);
}
#[test]
fn test_usage_limit_from_str() {
let limit = UsageLimit::from_str("unlimited").unwrap();
assert!(matches!(limit, UsageLimit::Unlimited));
let limit = UsageLimit::from_str(&0.to_string()).unwrap();
assert!(matches!(limit, UsageLimit::Limited(0)));
let limit = UsageLimit::from_str(&50.to_string()).unwrap();
assert!(matches!(limit, UsageLimit::Limited(50)));
for value in ["not_a_number", "50xyz"] {
let limit = UsageLimit::from_str(value);
assert!(limit.is_err());
}
}
}

View File

@@ -23,13 +23,14 @@ async-stripe.workspace = true
async-trait.workspace = true
async-tungstenite.workspace = true
aws-config = { version = "1.1.5" }
aws-sdk-s3 = { version = "1.15.0" }
aws-sdk-kinesis = "1.51.0"
aws-sdk-s3 = { version = "1.15.0" }
axum = { version = "0.6", features = ["json", "headers", "ws"] }
axum-extra = { version = "0.4", features = ["erased-json"] }
base64.workspace = true
chrono.workspace = true
clock.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
dashmap.workspace = true
derive_more.workspace = true
@@ -75,7 +76,6 @@ tracing-subscriber = { version = "0.3.18", features = ["env-filter", "json", "re
util.workspace = true
uuid.workspace = true
workspace-hack.workspace = true
zed_llm_client.workspace = true
[dev-dependencies]
agent_settings.workspace = true

View File

@@ -100,7 +100,7 @@ impl std::fmt::Display for SystemIdHeader {
pub fn routes(rpc_server: Arc<rpc::Server>) -> Router<(), Body> {
Router::new()
.route("/user", get(update_or_create_authenticated_user))
.route("/user", get(legacy_update_or_create_authenticated_user))
.route("/users/look_up", get(look_up_user))
.route("/users/:id/access_tokens", post(create_access_token))
.route("/users/:id/refresh_llm_tokens", post(refresh_llm_tokens))
@@ -161,7 +161,10 @@ struct AuthenticatedUserResponse {
feature_flags: Vec<String>,
}
async fn update_or_create_authenticated_user(
/// This is a legacy endpoint that is no longer used in production.
///
/// It currently only exists to be used when developing Collab locally.
async fn legacy_update_or_create_authenticated_user(
Query(params): Query<AuthenticatedUserParams>,
Extension(app): Extension<Arc<AppState>>,
) -> Result<Json<AuthenticatedUserResponse>> {
@@ -353,9 +356,9 @@ async fn refresh_llm_tokens(
#[derive(Debug, Serialize, Deserialize)]
struct UpdatePlanBody {
pub plan: zed_llm_client::Plan,
pub plan: cloud_llm_client::Plan,
pub subscription_period: SubscriptionPeriod,
pub usage: zed_llm_client::CurrentUsage,
pub usage: cloud_llm_client::CurrentUsage,
pub trial_started_at: Option<DateTime<Utc>>,
pub is_usage_based_billing_enabled: bool,
pub is_account_too_young: bool,
@@ -377,9 +380,9 @@ async fn update_plan(
extract::Json(body): extract::Json<UpdatePlanBody>,
) -> Result<Json<UpdatePlanResponse>> {
let plan = match body.plan {
zed_llm_client::Plan::ZedFree => proto::Plan::Free,
zed_llm_client::Plan::ZedPro => proto::Plan::ZedPro,
zed_llm_client::Plan::ZedProTrial => proto::Plan::ZedProTrial,
cloud_llm_client::Plan::ZedFree => proto::Plan::Free,
cloud_llm_client::Plan::ZedPro => proto::Plan::ZedPro,
cloud_llm_client::Plan::ZedProTrial => proto::Plan::ZedProTrial,
};
let update_user_plan = proto::UpdateUserPlan {
@@ -411,15 +414,15 @@ async fn update_plan(
Ok(Json(UpdatePlanResponse {}))
}
fn usage_limit_to_proto(limit: zed_llm_client::UsageLimit) -> proto::UsageLimit {
fn usage_limit_to_proto(limit: cloud_llm_client::UsageLimit) -> proto::UsageLimit {
proto::UsageLimit {
variant: Some(match limit {
zed_llm_client::UsageLimit::Limited(limit) => {
cloud_llm_client::UsageLimit::Limited(limit) => {
proto::usage_limit::Variant::Limited(proto::usage_limit::Limited {
limit: limit as u32,
})
}
zed_llm_client::UsageLimit::Unlimited => {
cloud_llm_client::UsageLimit::Unlimited => {
proto::usage_limit::Variant::Unlimited(proto::usage_limit::Unlimited {})
}
}),

View File

@@ -1,11 +1,11 @@
use anyhow::{Context as _, bail};
use chrono::{DateTime, Utc};
use cloud_llm_client::LanguageModelProvider;
use collections::{HashMap, HashSet};
use sea_orm::ActiveValue;
use std::{sync::Arc, time::Duration};
use stripe::{CancellationDetailsReason, EventObject, EventType, ListEvents, SubscriptionStatus};
use util::{ResultExt, maybe};
use zed_llm_client::LanguageModelProvider;
use crate::AppState;
use crate::db::billing_subscription::{
@@ -87,6 +87,14 @@ async fn poll_stripe_events(
stripe_client: &Arc<dyn StripeClient>,
real_stripe_client: &stripe::Client,
) -> anyhow::Result<()> {
let feature_flags = app.db.list_feature_flags().await?;
let sync_events_using_cloud = feature_flags
.iter()
.any(|flag| flag.flag == "cloud-stripe-events-polling" && flag.enabled_for_all);
if sync_events_using_cloud {
return Ok(());
}
fn event_type_to_string(event_type: EventType) -> String {
// Calling `to_string` on `stripe::EventType` members gives us a quoted string,
// so we need to unquote it.
@@ -569,6 +577,14 @@ async fn sync_model_request_usage_with_stripe(
llm_db: &Arc<LlmDatabase>,
stripe_billing: &Arc<StripeBilling>,
) -> anyhow::Result<()> {
let feature_flags = app.db.list_feature_flags().await?;
let sync_model_request_usage_using_cloud = feature_flags
.iter()
.any(|flag| flag.flag == "cloud-stripe-usage-meters-sync" && flag.enabled_for_all);
if sync_model_request_usage_using_cloud {
return Ok(());
}
log::info!("Stripe usage sync: Starting");
let started_at = Utc::now();

View File

@@ -8,7 +8,6 @@ use axum::{
use chrono::{NaiveDateTime, SecondsFormat};
use serde::{Deserialize, Serialize};
use crate::api::AuthenticatedUserParams;
use crate::db::ContributorSelector;
use crate::{AppState, Result};
@@ -104,9 +103,18 @@ impl RenovateBot {
}
}
#[derive(Debug, Deserialize)]
struct AddContributorBody {
github_user_id: i32,
github_login: String,
github_email: Option<String>,
github_name: Option<String>,
github_user_created_at: chrono::DateTime<chrono::Utc>,
}
async fn add_contributor(
Extension(app): Extension<Arc<AppState>>,
extract::Json(params): extract::Json<AuthenticatedUserParams>,
extract::Json(params): extract::Json<AddContributorBody>,
) -> Result<()> {
let initial_channel_id = app.config.auto_join_channel_id;
app.db

View File

@@ -95,7 +95,7 @@ pub enum SubscriptionKind {
ZedFree,
}
impl From<SubscriptionKind> for zed_llm_client::Plan {
impl From<SubscriptionKind> for cloud_llm_client::Plan {
fn from(value: SubscriptionKind) -> Self {
match value {
SubscriptionKind::ZedPro => Self::ZedPro,

View File

@@ -6,11 +6,11 @@ mod tables;
#[cfg(test)]
mod tests;
use cloud_llm_client::LanguageModelProvider;
use collections::HashMap;
pub use ids::*;
pub use seed::*;
pub use tables::*;
use zed_llm_client::LanguageModelProvider;
#[cfg(test)]
pub use tests::TestLlmDb;

View File

@@ -1,5 +1,5 @@
use cloud_llm_client::LanguageModelProvider;
use pretty_assertions::assert_eq;
use zed_llm_client::LanguageModelProvider;
use crate::llm::db::LlmDatabase;
use crate::test_llm_db;

View File

@@ -4,12 +4,12 @@ use crate::llm::{AGENT_EXTENDED_TRIAL_FEATURE_FLAG, BYPASS_ACCOUNT_AGE_CHECK_FEA
use crate::{Config, db::billing_preference};
use anyhow::{Context as _, Result};
use chrono::{NaiveDateTime, Utc};
use cloud_llm_client::Plan;
use jsonwebtoken::{DecodingKey, EncodingKey, Header, Validation};
use serde::{Deserialize, Serialize};
use std::time::Duration;
use thiserror::Error;
use uuid::Uuid;
use zed_llm_client::Plan;
#[derive(Clone, Debug, Default, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]

View File

@@ -23,6 +23,7 @@ use anyhow::{Context as _, anyhow, bail};
use async_tungstenite::tungstenite::{
Message as TungsteniteMessage, protocol::CloseFrame as TungsteniteCloseFrame,
};
use axum::headers::UserAgent;
use axum::{
Extension, Router, TypedHeader,
body::Body,
@@ -750,6 +751,7 @@ impl Server {
address: String,
principal: Principal,
zed_version: ZedVersion,
user_agent: Option<String>,
geoip_country_code: Option<String>,
system_id: Option<String>,
send_connection_id: Option<oneshot::Sender<ConnectionId>>,
@@ -762,9 +764,14 @@ impl Server {
user_id=field::Empty,
login=field::Empty,
impersonator=field::Empty,
user_agent=field::Empty,
geoip_country_code=field::Empty
);
principal.update_span(&span);
if let Some(user_agent) = user_agent {
span.record("user_agent", user_agent);
}
if let Some(country_code) = geoip_country_code.as_ref() {
span.record("geoip_country_code", country_code);
}
@@ -1172,6 +1179,7 @@ pub async fn handle_websocket_request(
ConnectInfo(socket_address): ConnectInfo<SocketAddr>,
Extension(server): Extension<Arc<Server>>,
Extension(principal): Extension<Principal>,
user_agent: Option<TypedHeader<UserAgent>>,
country_code_header: Option<TypedHeader<CloudflareIpCountryHeader>>,
system_id_header: Option<TypedHeader<SystemIdHeader>>,
ws: WebSocketUpgrade,
@@ -1227,6 +1235,7 @@ pub async fn handle_websocket_request(
socket_address,
principal,
version,
user_agent.map(|header| header.to_string()),
country_code_header.map(|header| header.to_string()),
system_id_header.map(|header| header.to_string()),
None,
@@ -2859,12 +2868,12 @@ async fn make_update_user_plan_message(
}
fn model_requests_limit(
plan: zed_llm_client::Plan,
plan: cloud_llm_client::Plan,
feature_flags: &Vec<String>,
) -> zed_llm_client::UsageLimit {
) -> cloud_llm_client::UsageLimit {
match plan.model_requests_limit() {
zed_llm_client::UsageLimit::Limited(limit) => {
let limit = if plan == zed_llm_client::Plan::ZedProTrial
cloud_llm_client::UsageLimit::Limited(limit) => {
let limit = if plan == cloud_llm_client::Plan::ZedProTrial
&& feature_flags
.iter()
.any(|flag| flag == AGENT_EXTENDED_TRIAL_FEATURE_FLAG)
@@ -2874,9 +2883,9 @@ fn model_requests_limit(
limit
};
zed_llm_client::UsageLimit::Limited(limit)
cloud_llm_client::UsageLimit::Limited(limit)
}
zed_llm_client::UsageLimit::Unlimited => zed_llm_client::UsageLimit::Unlimited,
cloud_llm_client::UsageLimit::Unlimited => cloud_llm_client::UsageLimit::Unlimited,
}
}
@@ -2886,21 +2895,21 @@ fn subscription_usage_to_proto(
feature_flags: &Vec<String>,
) -> proto::SubscriptionUsage {
let plan = match plan {
proto::Plan::Free => zed_llm_client::Plan::ZedFree,
proto::Plan::ZedPro => zed_llm_client::Plan::ZedPro,
proto::Plan::ZedProTrial => zed_llm_client::Plan::ZedProTrial,
proto::Plan::Free => cloud_llm_client::Plan::ZedFree,
proto::Plan::ZedPro => cloud_llm_client::Plan::ZedPro,
proto::Plan::ZedProTrial => cloud_llm_client::Plan::ZedProTrial,
};
proto::SubscriptionUsage {
model_requests_usage_amount: usage.model_requests as u32,
model_requests_usage_limit: Some(proto::UsageLimit {
variant: Some(match model_requests_limit(plan, feature_flags) {
zed_llm_client::UsageLimit::Limited(limit) => {
cloud_llm_client::UsageLimit::Limited(limit) => {
proto::usage_limit::Variant::Limited(proto::usage_limit::Limited {
limit: limit as u32,
})
}
zed_llm_client::UsageLimit::Unlimited => {
cloud_llm_client::UsageLimit::Unlimited => {
proto::usage_limit::Variant::Unlimited(proto::usage_limit::Unlimited {})
}
}),
@@ -2908,12 +2917,12 @@ fn subscription_usage_to_proto(
edit_predictions_usage_amount: usage.edit_predictions as u32,
edit_predictions_usage_limit: Some(proto::UsageLimit {
variant: Some(match plan.edit_predictions_limit() {
zed_llm_client::UsageLimit::Limited(limit) => {
cloud_llm_client::UsageLimit::Limited(limit) => {
proto::usage_limit::Variant::Limited(proto::usage_limit::Limited {
limit: limit as u32,
})
}
zed_llm_client::UsageLimit::Unlimited => {
cloud_llm_client::UsageLimit::Unlimited => {
proto::usage_limit::Variant::Unlimited(proto::usage_limit::Unlimited {})
}
}),
@@ -2926,21 +2935,21 @@ fn make_default_subscription_usage(
feature_flags: &Vec<String>,
) -> proto::SubscriptionUsage {
let plan = match plan {
proto::Plan::Free => zed_llm_client::Plan::ZedFree,
proto::Plan::ZedPro => zed_llm_client::Plan::ZedPro,
proto::Plan::ZedProTrial => zed_llm_client::Plan::ZedProTrial,
proto::Plan::Free => cloud_llm_client::Plan::ZedFree,
proto::Plan::ZedPro => cloud_llm_client::Plan::ZedPro,
proto::Plan::ZedProTrial => cloud_llm_client::Plan::ZedProTrial,
};
proto::SubscriptionUsage {
model_requests_usage_amount: 0,
model_requests_usage_limit: Some(proto::UsageLimit {
variant: Some(match model_requests_limit(plan, feature_flags) {
zed_llm_client::UsageLimit::Limited(limit) => {
cloud_llm_client::UsageLimit::Limited(limit) => {
proto::usage_limit::Variant::Limited(proto::usage_limit::Limited {
limit: limit as u32,
})
}
zed_llm_client::UsageLimit::Unlimited => {
cloud_llm_client::UsageLimit::Unlimited => {
proto::usage_limit::Variant::Unlimited(proto::usage_limit::Unlimited {})
}
}),
@@ -2948,12 +2957,12 @@ fn make_default_subscription_usage(
edit_predictions_usage_amount: 0,
edit_predictions_usage_limit: Some(proto::UsageLimit {
variant: Some(match plan.edit_predictions_limit() {
zed_llm_client::UsageLimit::Limited(limit) => {
cloud_llm_client::UsageLimit::Limited(limit) => {
proto::usage_limit::Variant::Limited(proto::usage_limit::Limited {
limit: limit as u32,
})
}
zed_llm_client::UsageLimit::Unlimited => {
cloud_llm_client::UsageLimit::Unlimited => {
proto::usage_limit::Variant::Unlimited(proto::usage_limit::Unlimited {})
}
}),

View File

@@ -842,7 +842,7 @@ async fn test_client_disconnecting_from_room(
// Allow user A to reconnect to the server.
server.allow_connections();
executor.advance_clock(RECEIVE_TIMEOUT);
executor.advance_clock(RECONNECT_TIMEOUT);
// Call user B again from client A.
active_call_a
@@ -1358,7 +1358,7 @@ async fn test_calls_on_multiple_connections(
// User A reconnects automatically, then calls user B again.
server.allow_connections();
executor.advance_clock(RECEIVE_TIMEOUT);
executor.advance_clock(RECONNECT_TIMEOUT);
active_call_a
.update(cx_a, |call, cx| {
call.invite(client_b1.user_id().unwrap(), None, cx)

View File

@@ -8,6 +8,7 @@ use crate::{
use anyhow::anyhow;
use call::ActiveCall;
use channel::{ChannelBuffer, ChannelStore};
use client::CloudUserStore;
use client::{
self, ChannelId, Client, Connection, Credentials, EstablishConnectionError, UserStore,
proto::PeerId,
@@ -256,6 +257,7 @@ impl TestServer {
ZedVersion(SemanticVersion::new(1, 0, 0)),
None,
None,
None,
Some(connection_id_tx),
Executor::Deterministic(cx.background_executor().clone()),
None,
@@ -280,12 +282,14 @@ impl TestServer {
.register_hosting_provider(Arc::new(git_hosting_providers::Github::public_instance()));
let user_store = cx.new(|cx| UserStore::new(client.clone(), cx));
let cloud_user_store = cx.new(|cx| CloudUserStore::new(client.cloud_client(), cx));
let workspace_store = cx.new(|cx| WorkspaceStore::new(client.clone(), cx));
let language_registry = Arc::new(LanguageRegistry::test(cx.executor()));
let session = cx.new(|cx| AppSession::new(Session::test(), cx));
let app_state = Arc::new(workspace::AppState {
client: client.clone(),
user_store: user_store.clone(),
cloud_user_store,
workspace_store,
languages: language_registry,
fs: fs.clone(),

View File

@@ -158,6 +158,7 @@ impl Client {
pub fn stdio(
server_id: ContextServerId,
binary: ModelContextServerBinary,
working_directory: &Option<PathBuf>,
cx: AsyncApp,
) -> Result<Self> {
log::info!(
@@ -172,7 +173,7 @@ impl Client {
.map(|name| name.to_string_lossy().to_string())
.unwrap_or_else(String::new);
let transport = Arc::new(StdioTransport::new(binary, &cx)?);
let transport = Arc::new(StdioTransport::new(binary, working_directory, &cx)?);
Self::new(server_id, server_name.into(), transport, cx)
}

View File

@@ -53,7 +53,7 @@ impl std::fmt::Debug for ContextServerCommand {
}
enum ContextServerTransport {
Stdio(ContextServerCommand),
Stdio(ContextServerCommand, Option<PathBuf>),
Custom(Arc<dyn crate::transport::Transport>),
}
@@ -64,11 +64,18 @@ pub struct ContextServer {
}
impl ContextServer {
pub fn stdio(id: ContextServerId, command: ContextServerCommand) -> Self {
pub fn stdio(
id: ContextServerId,
command: ContextServerCommand,
working_directory: Option<Arc<Path>>,
) -> Self {
Self {
id,
client: RwLock::new(None),
configuration: ContextServerTransport::Stdio(command),
configuration: ContextServerTransport::Stdio(
command,
working_directory.map(|directory| directory.to_path_buf()),
),
}
}
@@ -90,13 +97,14 @@ impl ContextServer {
pub async fn start(self: Arc<Self>, cx: &AsyncApp) -> Result<()> {
let client = match &self.configuration {
ContextServerTransport::Stdio(command) => Client::stdio(
ContextServerTransport::Stdio(command, working_directory) => Client::stdio(
client::ContextServerId(self.id.0.clone()),
client::ModelContextServerBinary {
executable: Path::new(&command.path).to_path_buf(),
args: command.args.clone(),
env: command.env.clone(),
},
working_directory,
cx.clone(),
)?,
ContextServerTransport::Custom(transport) => Client::new(

View File

@@ -1,3 +1,4 @@
use std::path::PathBuf;
use std::pin::Pin;
use anyhow::{Context as _, Result};
@@ -22,7 +23,11 @@ pub struct StdioTransport {
}
impl StdioTransport {
pub fn new(binary: ModelContextServerBinary, cx: &AsyncApp) -> Result<Self> {
pub fn new(
binary: ModelContextServerBinary,
working_directory: &Option<PathBuf>,
cx: &AsyncApp,
) -> Result<Self> {
let mut command = util::command::new_smol_command(&binary.executable);
command
.args(&binary.args)
@@ -32,6 +37,10 @@ impl StdioTransport {
.stderr(std::process::Stdio::piped())
.kill_on_drop(true);
if let Some(working_directory) = working_directory {
command.current_dir(working_directory);
}
let mut server = command.spawn().with_context(|| {
format!(
"failed to spawn command. (path={:?}, args={:?})",

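The change above threads an optional working directory into `StdioTransport::new` and applies it with `command.current_dir(...)` before spawning. A stdlib-only sketch of the same pattern (assumes a Unix `sh` is available; the diff itself uses smol's command builder rather than `std::process`):

```rust
use std::process::Command;

// Mirrors the diff's `if let Some(working_directory) = ...` guard:
// the directory is applied only when one was configured.
fn spawn_pwd(working_directory: Option<&str>) -> String {
    let mut command = Command::new("sh");
    command.arg("-c").arg("pwd");
    if let Some(dir) = working_directory {
        command.current_dir(dir);
    }
    let output = command.output().expect("failed to spawn command");
    String::from_utf8_lossy(&output.stdout).trim().to_string()
}

fn main() {
    // The child process observes the configured working directory.
    assert_eq!(spawn_pwd(Some("/")), "/");
}
```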
View File

@@ -7,17 +7,19 @@ license = "GPL-3.0-or-later"
[dependencies]
anyhow.workspace = true
clap.workspace = true
mdbook = "0.4.40"
command_palette.workspace = true
gpui.workspace = true
# We are specifically pinning this version of mdbook, as later versions introduce issues with double-nested subdirectories.
# Ask @maxdeviant about this before bumping.
mdbook = "= 0.4.40"
regex.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
regex.workspace = true
util.workspace = true
workspace-hack.workspace = true
zed.workspace = true
gpui.workspace = true
command_palette.workspace = true
zlog.workspace = true
[lints]
workspace = true

View File

@@ -1,14 +1,15 @@
use anyhow::Result;
use clap::{Arg, ArgMatches, Command};
use anyhow::{Context, Result};
use mdbook::BookItem;
use mdbook::book::{Book, Chapter};
use mdbook::preprocess::CmdPreprocessor;
use regex::Regex;
use settings::KeymapFile;
use std::collections::HashSet;
use std::borrow::Cow;
use std::collections::{HashMap, HashSet};
use std::io::{self, Read};
use std::process;
use std::sync::LazyLock;
use util::paths::PathExt;
static KEYMAP_MACOS: LazyLock<KeymapFile> = LazyLock::new(|| {
load_keymap("keymaps/default-macos.json").expect("Failed to load MacOS keymap")
@@ -20,60 +21,68 @@ static KEYMAP_LINUX: LazyLock<KeymapFile> = LazyLock::new(|| {
static ALL_ACTIONS: LazyLock<Vec<ActionDef>> = LazyLock::new(dump_all_gpui_actions);
pub fn make_app() -> Command {
Command::new("zed-docs-preprocessor")
.about("Preprocesses Zed Docs content to provide rich action & keybinding support and more")
.subcommand(
Command::new("supports")
.arg(Arg::new("renderer").required(true))
.about("Check whether a renderer is supported by this preprocessor"),
)
}
const FRONT_MATTER_COMMENT: &'static str = "<!-- ZED_META {} -->";
fn main() -> Result<()> {
let matches = make_app().get_matches();
zlog::init();
zlog::init_output_stderr();
// call a zed:: function so everything in `zed` crate is linked and
// all actions in the actual app are registered
zed::stdout_is_a_pty();
let args = std::env::args().skip(1).collect::<Vec<_>>();
if let Some(sub_args) = matches.subcommand_matches("supports") {
handle_supports(sub_args);
} else {
handle_preprocessing()?;
match args.get(0).map(String::as_str) {
Some("supports") => {
let renderer = args.get(1).expect("Required argument");
let supported = renderer != "not-supported";
if supported {
process::exit(0);
} else {
process::exit(1);
}
}
Some("postprocess") => handle_postprocessing()?,
_ => handle_preprocessing()?,
}
Ok(())
}
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
enum Error {
enum PreprocessorError {
ActionNotFound { action_name: String },
DeprecatedActionUsed { used: String, should_be: String },
InvalidFrontmatterLine(String),
}
impl Error {
impl PreprocessorError {
fn new_for_not_found_action(action_name: String) -> Self {
for action in &*ALL_ACTIONS {
for alias in action.deprecated_aliases {
if alias == &action_name {
return Error::DeprecatedActionUsed {
return PreprocessorError::DeprecatedActionUsed {
used: action_name.clone(),
should_be: action.name.to_string(),
};
}
}
}
Error::ActionNotFound {
PreprocessorError::ActionNotFound {
action_name: action_name.to_string(),
}
}
}
impl std::fmt::Display for Error {
impl std::fmt::Display for PreprocessorError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Error::ActionNotFound { action_name } => write!(f, "Action not found: {}", action_name),
Error::DeprecatedActionUsed { used, should_be } => write!(
PreprocessorError::InvalidFrontmatterLine(line) => {
write!(f, "Invalid frontmatter line: {}", line)
}
PreprocessorError::ActionNotFound { action_name } => {
write!(f, "Action not found: {}", action_name)
}
PreprocessorError::DeprecatedActionUsed { used, should_be } => write!(
f,
"Deprecated action used: {} should be {}",
used, should_be
@@ -89,8 +98,9 @@ fn handle_preprocessing() -> Result<()> {
let (_ctx, mut book) = CmdPreprocessor::parse_input(input.as_bytes())?;
let mut errors = HashSet::<Error>::new();
let mut errors = HashSet::<PreprocessorError>::new();
handle_frontmatter(&mut book, &mut errors);
template_and_validate_keybindings(&mut book, &mut errors);
template_and_validate_actions(&mut book, &mut errors);
@@ -108,19 +118,41 @@ fn handle_preprocessing() -> Result<()> {
Ok(())
}
fn handle_supports(sub_args: &ArgMatches) -> ! {
let renderer = sub_args
.get_one::<String>("renderer")
.expect("Required argument");
let supported = renderer != "not-supported";
if supported {
process::exit(0);
} else {
process::exit(1);
}
fn handle_frontmatter(book: &mut Book, errors: &mut HashSet<PreprocessorError>) {
let frontmatter_regex = Regex::new(r"(?s)^\s*---(.*?)---").unwrap();
for_each_chapter_mut(book, |chapter| {
let new_content = frontmatter_regex.replace(&chapter.content, |caps: &regex::Captures| {
let frontmatter = caps[1].trim();
let frontmatter = frontmatter.trim_matches(&[' ', '-', '\n']);
let mut metadata = HashMap::<String, String>::default();
for line in frontmatter.lines() {
let Some((name, value)) = line.split_once(':') else {
errors.insert(PreprocessorError::InvalidFrontmatterLine(format!(
"{}: {}",
chapter_breadcrumbs(&chapter),
line
)));
continue;
};
let name = name.trim();
let value = value.trim();
metadata.insert(name.to_string(), value.to_string());
}
FRONT_MATTER_COMMENT.replace(
"{}",
&serde_json::to_string(&metadata).expect("Failed to serialize metadata"),
)
});
match new_content {
Cow::Owned(content) => {
chapter.content = content;
}
Cow::Borrowed(_) => {}
}
});
}
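The frontmatter loop above accepts only `name: value` lines and records anything else as an `InvalidFrontmatterLine` error. A stdlib-only sketch of that parsing step (errors modeled as a plain `Vec` instead of the preprocessor's error set):

```rust
use std::collections::HashMap;

// Each line must split on the first ':'; malformed lines are collected
// rather than aborting, matching the diff's continue-on-error behavior.
fn parse_frontmatter(frontmatter: &str) -> (HashMap<String, String>, Vec<String>) {
    let mut metadata = HashMap::new();
    let mut errors = Vec::new();
    for line in frontmatter.lines() {
        let Some((name, value)) = line.split_once(':') else {
            errors.push(line.to_string());
            continue;
        };
        metadata.insert(name.trim().to_string(), value.trim().to_string());
    }
    (metadata, errors)
}

fn main() {
    let (meta, errors) = parse_frontmatter("title: Vim Mode\ndescription: Modal editing\noops");
    assert_eq!(meta.get("title").map(String::as_str), Some("Vim Mode"));
    assert_eq!(meta.get("description").map(String::as_str), Some("Modal editing"));
    assert_eq!(errors, vec!["oops".to_string()]);
}
```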
fn template_and_validate_keybindings(book: &mut Book, errors: &mut HashSet<Error>) {
fn template_and_validate_keybindings(book: &mut Book, errors: &mut HashSet<PreprocessorError>) {
let regex = Regex::new(r"\{#kb (.*?)\}").unwrap();
for_each_chapter_mut(book, |chapter| {
@@ -128,7 +160,9 @@ fn template_and_validate_keybindings(book: &mut Book, errors: &mut HashSet<Error
.replace_all(&chapter.content, |caps: &regex::Captures| {
let action = caps[1].trim();
if find_action_by_name(action).is_none() {
errors.insert(Error::new_for_not_found_action(action.to_string()));
errors.insert(PreprocessorError::new_for_not_found_action(
action.to_string(),
));
return String::new();
}
let macos_binding = find_binding("macos", action).unwrap_or_default();
@@ -144,7 +178,7 @@ fn template_and_validate_keybindings(book: &mut Book, errors: &mut HashSet<Error
});
}
fn template_and_validate_actions(book: &mut Book, errors: &mut HashSet<Error>) {
fn template_and_validate_actions(book: &mut Book, errors: &mut HashSet<PreprocessorError>) {
let regex = Regex::new(r"\{#action (.*?)\}").unwrap();
for_each_chapter_mut(book, |chapter| {
@@ -152,7 +186,9 @@ fn template_and_validate_actions(book: &mut Book, errors: &mut HashSet<Error>) {
.replace_all(&chapter.content, |caps: &regex::Captures| {
let name = caps[1].trim();
let Some(action) = find_action_by_name(name) else {
errors.insert(Error::new_for_not_found_action(name.to_string()));
errors.insert(PreprocessorError::new_for_not_found_action(
name.to_string(),
));
return String::new();
};
format!("<code class=\"hljs\">{}</code>", &action.human_name)
@@ -217,6 +253,13 @@ fn name_for_action(action_as_str: String) -> String {
.unwrap_or(action_as_str)
}
fn chapter_breadcrumbs(chapter: &Chapter) -> String {
let mut breadcrumbs = Vec::with_capacity(chapter.parent_names.len() + 1);
breadcrumbs.extend(chapter.parent_names.iter().map(String::as_str));
breadcrumbs.push(chapter.name.as_str());
format!("[{:?}] {}", chapter.source_path, breadcrumbs.join(" > "))
}
fn load_keymap(asset_path: &str) -> Result<KeymapFile> {
let content = util::asset_str::<settings::SettingsAssets>(asset_path);
KeymapFile::parse(content.as_ref())
@@ -254,3 +297,126 @@ fn dump_all_gpui_actions() -> Vec<ActionDef> {
return actions;
}
fn handle_postprocessing() -> Result<()> {
let logger = zlog::scoped!("render");
let mut ctx = mdbook::renderer::RenderContext::from_json(io::stdin())?;
let output = ctx
.config
.get_mut("output")
.expect("has output")
.as_table_mut()
.expect("output is table");
let zed_html = output.remove("zed-html").expect("zed-html output defined");
let default_description = zed_html
.get("default-description")
.expect("Default description not found")
.as_str()
.expect("Default description not a string")
.to_string();
let default_title = zed_html
.get("default-title")
.expect("Default title not found")
.as_str()
.expect("Default title not a string")
.to_string();
output.insert("html".to_string(), zed_html);
mdbook::Renderer::render(&mdbook::renderer::HtmlHandlebars::new(), &ctx)?;
let ignore_list = ["toc.html"];
let root_dir = ctx.destination.clone();
let mut files = Vec::with_capacity(128);
let mut queue = Vec::with_capacity(64);
queue.push(root_dir.clone());
while let Some(dir) = queue.pop() {
for entry in std::fs::read_dir(&dir).context(dir.to_sanitized_string())? {
let Ok(entry) = entry else {
continue;
};
let file_type = entry.file_type().context("Failed to determine file type")?;
if file_type.is_dir() {
queue.push(entry.path());
}
if file_type.is_file()
&& matches!(
entry.path().extension().and_then(std::ffi::OsStr::to_str),
Some("html")
)
{
if ignore_list.contains(&&*entry.file_name().to_string_lossy()) {
zlog::info!(logger => "Ignoring {}", entry.path().to_string_lossy());
} else {
files.push(entry.path());
}
}
}
}
zlog::info!(logger => "Processing {} `.html` files", files.len());
let meta_regex = Regex::new(&FRONT_MATTER_COMMENT.replace("{}", "(.*)")).unwrap();
for file in files {
let contents = std::fs::read_to_string(&file)?;
let mut meta_description = None;
let mut meta_title = None;
let contents = meta_regex.replace(&contents, |caps: &regex::Captures| {
let metadata: HashMap<String, String> = serde_json::from_str(&caps[1]).with_context(|| format!("JSON Metadata: {:?}", &caps[1])).expect("Failed to deserialize metadata");
for (kind, content) in metadata {
match kind.as_str() {
"description" => {
meta_description = Some(content);
}
"title" => {
meta_title = Some(content);
}
_ => {
zlog::warn!(logger => "Unrecognized frontmatter key: {} in {:?}", kind, pretty_path(&file, &root_dir));
}
}
}
String::new()
});
let meta_description = meta_description.as_ref().unwrap_or_else(|| {
zlog::warn!(logger => "No meta description found for {:?}", pretty_path(&file, &root_dir));
&default_description
});
let page_title = extract_title_from_page(&contents, pretty_path(&file, &root_dir));
let meta_title = meta_title.as_ref().unwrap_or_else(|| {
zlog::debug!(logger => "No meta title found for {:?}", pretty_path(&file, &root_dir));
&default_title
});
let meta_title = format!("{} | {}", page_title, meta_title);
zlog::trace!(logger => "Updating {:?}", pretty_path(&file, &root_dir));
let contents = contents.replace("#description#", meta_description);
let contents = TITLE_REGEX
.replace(&contents, |_: &regex::Captures| {
format!("<title>{}</title>", meta_title)
})
.to_string();
// let contents = contents.replace("#title#", &meta_title);
std::fs::write(file, contents)?;
}
return Ok(());
fn pretty_path<'a>(
path: &'a std::path::PathBuf,
root: &'a std::path::PathBuf,
) -> &'a std::path::Path {
&path.strip_prefix(&root).unwrap_or(&path)
}
const TITLE_REGEX: std::cell::LazyCell<Regex> =
std::cell::LazyCell::new(|| Regex::new(r"<title>\s*(.*?)\s*</title>").unwrap());
fn extract_title_from_page(contents: &str, pretty_path: &std::path::Path) -> String {
let title_tag_contents = &TITLE_REGEX
.captures(&contents)
.with_context(|| format!("Failed to find title in {:?}", pretty_path))
.expect("Page has <title> element")[1];
let title = title_tag_contents
.trim()
.strip_suffix("- Zed")
.unwrap_or(title_tag_contents)
.trim()
.to_string();
title
}
}
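One note on the postprocessor above: `TITLE_REGEX` is declared as a `const` `LazyCell`, and `const` items are inlined at each use site, so every use gets a fresh cell and recompiles the regex; a `static LazyLock` would cache it once. The suffix-stripping step itself is simple enough to sketch standalone:

```rust
// Mirrors extract_title_from_page: the generated <title> ends in
// "- Zed", which is stripped and re-trimmed before the page title is
// recombined with the meta title.
fn clean_title(title_tag_contents: &str) -> String {
    let title = title_tag_contents.trim();
    title.strip_suffix("- Zed").unwrap_or(title).trim().to_string()
}

fn main() {
    assert_eq!(clean_title("  Vim Mode - Zed "), "Vim Mode");
    // Titles without the suffix pass through unchanged.
    assert_eq!(clean_title("Standalone"), "Standalone");
}
```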

View File

@@ -56,7 +56,7 @@ use aho_corasick::AhoCorasick;
use anyhow::{Context as _, Result, anyhow};
use blink_manager::BlinkManager;
use buffer_diff::DiffHunkStatus;
use client::{Collaborator, ParticipantIndex};
use client::{Collaborator, DisableAiSettings, ParticipantIndex};
use clock::{AGENT_REPLICA_ID, ReplicaId};
use collections::{BTreeMap, HashMap, HashSet, VecDeque};
use convert_case::{Case, Casing};
@@ -65,7 +65,7 @@ use display_map::*;
pub use display_map::{ChunkRenderer, ChunkRendererContext, DisplayPoint, FoldPlaceholder};
pub use editor_settings::{
CurrentLineHighlight, DocumentColorsRenderMode, EditorSettings, HideMouseMode,
ScrollBeyondLastLine, ScrollbarAxes, SearchSettings, ShowScrollbar,
ScrollBeyondLastLine, ScrollbarAxes, SearchSettings, ShowMinimap, ShowScrollbar,
};
use editor_settings::{GoToDefinitionFallback, Minimap as MinimapSettings};
pub use editor_settings_controls::*;
@@ -7048,7 +7048,7 @@ impl Editor {
}
pub fn update_edit_prediction_settings(&mut self, cx: &mut Context<Self>) {
if self.edit_prediction_provider.is_none() {
if self.edit_prediction_provider.is_none() || DisableAiSettings::get_global(cx).disable_ai {
self.edit_prediction_settings = EditPredictionSettings::Disabled;
} else {
let selection = self.selections.newest_anchor();
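The hunk above widens the condition in `update_edit_prediction_settings`: predictions are disabled either when no provider exists or when the global disable-AI setting is on. The gating logic, with booleans standing in for the gpui settings lookup:

```rust
// A provider may be configured, but DisableAiSettings still wins;
// both conditions must pass for predictions to stay enabled.
fn edit_predictions_enabled(has_provider: bool, disable_ai: bool) -> bool {
    has_provider && !disable_ai
}

fn main() {
    assert!(edit_predictions_enabled(true, false));
    assert!(!edit_predictions_enabled(true, true)); // the new disable_ai check
    assert!(!edit_predictions_enabled(false, false)); // the pre-existing check
}
```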
@@ -21834,11 +21834,11 @@ impl CodeActionProvider for Entity<Project> {
cx: &mut App,
) -> Task<Result<Vec<CodeAction>>> {
self.update(cx, |project, cx| {
let code_lens = project.code_lens(buffer, range.clone(), cx);
let code_lens_actions = project.code_lens_actions(buffer, range.clone(), cx);
let code_actions = project.code_actions(buffer, range, None, cx);
cx.background_spawn(async move {
let (code_lens, code_actions) = join(code_lens, code_actions).await;
Ok(code_lens
let (code_lens_actions, code_actions) = join(code_lens_actions, code_actions).await;
Ok(code_lens_actions
.context("code lens fetch")?
.into_iter()
.chain(code_actions.context("code action fetch")?)

View File

@@ -10072,8 +10072,14 @@ async fn test_autosave_with_dirty_buffers(cx: &mut TestAppContext) {
);
}
#[gpui::test]
async fn test_range_format_during_save(cx: &mut TestAppContext) {
async fn setup_range_format_test(
cx: &mut TestAppContext,
) -> (
Entity<Project>,
Entity<Editor>,
&mut gpui::VisualTestContext,
lsp::FakeLanguageServer,
) {
init_test(cx, |_| {});
let fs = FakeFs::new(cx.executor());
@@ -10088,9 +10094,9 @@ async fn test_range_format_during_save(cx: &mut TestAppContext) {
FakeLspAdapter {
capabilities: lsp::ServerCapabilities {
document_range_formatting_provider: Some(lsp::OneOf::Left(true)),
..Default::default()
..lsp::ServerCapabilities::default()
},
..Default::default()
..FakeLspAdapter::default()
},
);
@@ -10105,14 +10111,22 @@ async fn test_range_format_during_save(cx: &mut TestAppContext) {
let (editor, cx) = cx.add_window_view(|window, cx| {
build_editor_with_project(project.clone(), buffer, window, cx)
});
cx.executor().start_waiting();
let fake_server = fake_servers.next().await.unwrap();
(project, editor, cx, fake_server)
}
#[gpui::test]
async fn test_range_format_on_save_success(cx: &mut TestAppContext) {
let (project, editor, cx, fake_server) = setup_range_format_test(cx).await;
editor.update_in(cx, |editor, window, cx| {
editor.set_text("one\ntwo\nthree\n", window, cx)
});
assert!(cx.read(|cx| editor.is_dirty(cx)));
cx.executor().start_waiting();
let fake_server = fake_servers.next().await.unwrap();
let save = editor
.update_in(cx, |editor, window, cx| {
editor.save(
@@ -10147,13 +10161,18 @@ async fn test_range_format_during_save(cx: &mut TestAppContext) {
"one, two\nthree\n"
);
assert!(!cx.read(|cx| editor.is_dirty(cx)));
}
#[gpui::test]
async fn test_range_format_on_save_timeout(cx: &mut TestAppContext) {
let (project, editor, cx, fake_server) = setup_range_format_test(cx).await;
editor.update_in(cx, |editor, window, cx| {
editor.set_text("one\ntwo\nthree\n", window, cx)
});
assert!(cx.read(|cx| editor.is_dirty(cx)));
// Ensure we can still save even if formatting hangs.
// Test that save still works when formatting hangs
fake_server.set_request_handler::<lsp::request::RangeFormatting, _, _>(
move |params, _| async move {
assert_eq!(
@@ -10185,8 +10204,13 @@ async fn test_range_format_during_save(cx: &mut TestAppContext) {
"one\ntwo\nthree\n"
);
assert!(!cx.read(|cx| editor.is_dirty(cx)));
}
// For non-dirty buffer, no formatting request should be sent
#[gpui::test]
async fn test_range_format_not_called_for_clean_buffer(cx: &mut TestAppContext) {
let (project, editor, cx, fake_server) = setup_range_format_test(cx).await;
// Buffer starts clean, no formatting should be requested
let save = editor
.update_in(cx, |editor, window, cx| {
editor.save(
@@ -10207,6 +10231,12 @@ async fn test_range_format_during_save(cx: &mut TestAppContext) {
.next();
cx.executor().start_waiting();
save.await;
cx.run_until_parked();
}
#[gpui::test]
async fn test_range_format_respects_language_tab_size_override(cx: &mut TestAppContext) {
let (project, editor, cx, fake_server) = setup_range_format_test(cx).await;
// Set Rust language override and assert overridden tabsize is sent to language server
update_test_language_settings(cx, |settings| {
@@ -10220,7 +10250,7 @@ async fn test_range_format_during_save(cx: &mut TestAppContext) {
});
editor.update_in(cx, |editor, window, cx| {
editor.set_text("somehting_new\n", window, cx)
editor.set_text("something_new\n", window, cx)
});
assert!(cx.read(|cx| editor.is_dirty(cx)));
let save = editor
@@ -21310,16 +21340,32 @@ async fn test_apply_code_lens_actions_with_commands(cx: &mut gpui::TestAppContex
},
);
let (buffer, _handle) = project
.update(cx, |p, cx| {
p.open_local_buffer_with_lsp(path!("/dir/a.ts"), cx)
let editor = workspace
.update(cx, |workspace, window, cx| {
workspace.open_abs_path(
PathBuf::from(path!("/dir/a.ts")),
OpenOptions::default(),
window,
cx,
)
})
.unwrap()
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
cx.executor().run_until_parked();
let fake_server = fake_language_servers.next().await.unwrap();
let buffer = editor.update(cx, |editor, cx| {
editor
.buffer()
.read(cx)
.as_singleton()
.expect("have opened a single file by path")
});
let buffer_snapshot = buffer.update(cx, |buffer, _| buffer.snapshot());
let anchor = buffer_snapshot.anchor_at(0, text::Bias::Left);
drop(buffer_snapshot);
@@ -21377,7 +21423,7 @@ async fn test_apply_code_lens_actions_with_commands(cx: &mut gpui::TestAppContex
assert_eq!(
actions.len(),
1,
"Should have only one valid action for the 0..0 range"
"Should have only one valid action for the 0..0 range, got: {actions:#?}"
);
let action = actions[0].clone();
let apply = project.update(cx, |project, cx| {
@@ -21423,7 +21469,7 @@ async fn test_apply_code_lens_actions_with_commands(cx: &mut gpui::TestAppContex
.into_iter()
.collect(),
),
..Default::default()
..lsp::WorkspaceEdit::default()
},
},
)
@@ -21446,6 +21492,38 @@ async fn test_apply_code_lens_actions_with_commands(cx: &mut gpui::TestAppContex
buffer.undo(cx);
assert_eq!(buffer.text(), "a");
});
let actions_after_edits = cx
.update_window(*workspace, |_, window, cx| {
project.code_actions(&buffer, anchor..anchor, window, cx)
})
.unwrap()
.await
.unwrap();
assert_eq!(
actions, actions_after_edits,
"For the same selection, same code lens actions should be returned"
);
let _responses =
fake_server.set_request_handler::<lsp::request::CodeLensRequest, _, _>(|_, _| async move {
panic!("No more code lens requests are expected");
});
editor.update_in(cx, |editor, window, cx| {
editor.select_all(&SelectAll, window, cx);
});
cx.executor().run_until_parked();
let new_actions = cx
.update_window(*workspace, |_, window, cx| {
project.code_actions(&buffer, anchor..anchor, window, cx)
})
.unwrap()
.await
.unwrap();
assert_eq!(
actions, new_actions,
"Code lens are queried for the same range and should get the same set back, but without additional LSP queries now"
);
}
#[gpui::test]

View File

@@ -6,7 +6,7 @@ use gpui::{Hsla, Rgba};
use itertools::Itertools;
use language::point_from_lsp;
use multi_buffer::Anchor;
use project::{DocumentColor, lsp_store::ColorFetchStrategy};
use project::{DocumentColor, lsp_store::LspFetchStrategy};
use settings::Settings as _;
use text::{Bias, BufferId, OffsetRangeExt as _};
use ui::{App, Context, Window};
@@ -180,9 +180,9 @@ impl Editor {
.filter_map(|buffer| {
let buffer_id = buffer.read(cx).remote_id();
let fetch_strategy = if ignore_cache {
ColorFetchStrategy::IgnoreCache
LspFetchStrategy::IgnoreCache
} else {
ColorFetchStrategy::UseCache {
LspFetchStrategy::UseCache {
known_cache_version: self.colors.as_ref().and_then(|colors| {
Some(colors.buffer_colors.get(&buffer_id)?.cache_version_used)
}),

View File

@@ -19,8 +19,8 @@ path = "src/explorer.rs"
[dependencies]
agent.workspace = true
agent_ui.workspace = true
agent_settings.workspace = true
agent_ui.workspace = true
anyhow.workspace = true
assistant_tool.workspace = true
assistant_tools.workspace = true
@@ -29,6 +29,7 @@ buffer_diff.workspace = true
chrono.workspace = true
clap.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
debug_adapter_extension.workspace = true
dirs.workspace = true
@@ -68,4 +69,3 @@ util.workspace = true
uuid.workspace = true
watch.workspace = true
workspace-hack.workspace = true
zed_llm_client.workspace = true

View File

@@ -15,11 +15,11 @@ use agent_settings::AgentProfileId;
use anyhow::{Result, anyhow};
use async_trait::async_trait;
use buffer_diff::DiffHunkStatus;
use cloud_llm_client::CompletionIntent;
use collections::HashMap;
use futures::{FutureExt as _, StreamExt, channel::mpsc, select_biased};
use gpui::{App, AppContext, AsyncApp, Entity};
use language_model::{LanguageModel, Role, StopReason};
use zed_llm_client::CompletionIntent;
pub const THREAD_EVENT_TIMEOUT: Duration = Duration::from_secs(60 * 2);

View File

@@ -106,7 +106,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn language_server_initialization_options(
@@ -131,7 +131,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn language_server_workspace_configuration(
@@ -154,7 +154,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn language_server_additional_initialization_options(
@@ -179,7 +179,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn language_server_additional_workspace_configuration(
@@ -204,7 +204,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn labels_for_completions(
@@ -230,7 +230,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn labels_for_symbols(
@@ -256,7 +256,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn complete_slash_command_argument(
@@ -275,7 +275,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn run_slash_command(
@@ -301,7 +301,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn context_server_command(
@@ -320,7 +320,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn context_server_configuration(
@@ -347,7 +347,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn suggest_docs_packages(&self, provider: Arc<str>) -> Result<Vec<String>> {
@@ -362,7 +362,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn index_docs(
@@ -388,7 +388,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn get_dap_binary(
@@ -410,7 +410,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn dap_request_kind(
&self,
@@ -427,7 +427,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn dap_config_to_scenario(&self, config: ZedDebugConfig) -> Result<DebugScenario> {
@@ -441,7 +441,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn dap_locator_create_scenario(
@@ -465,7 +465,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
async fn run_dap_locator(
&self,
@@ -481,7 +481,7 @@ impl extension::Extension for WasmExtension {
}
.boxed()
})
.await
.await?
}
}
@@ -761,7 +761,7 @@ impl WasmExtension {
.with_context(|| format!("failed to load wasm extension {}", manifest.id))
}
pub async fn call<T, Fn>(&self, f: Fn) -> T
pub async fn call<T, Fn>(&self, f: Fn) -> Result<T>
where
T: 'static + Send,
Fn: 'static
@@ -777,8 +777,19 @@ impl WasmExtension {
}
.boxed()
}))
.expect("wasm extension channel should not be closed yet");
return_rx.await.expect("wasm extension channel")
.map_err(|_| {
anyhow!(
"wasm extension channel should not be closed yet, extension {} (id {})",
self.manifest.name,
self.manifest.id,
)
})?;
return_rx.await.with_context(|| {
format!(
"wasm extension channel, extension {} (id {})",
self.manifest.name, self.manifest.id,
)
})
}
}
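The `WasmExtension::call` change above replaces `.expect(...)` on the channel send and the return receive with error propagation, so a crashed extension task surfaces as an `Err` instead of panicking the host. A synchronous stdlib sketch of that shape (`String` stands in for `anyhow::Error`; names are illustrative):

```rust
use std::sync::mpsc;
use std::thread;

// Send work plus a return channel; both the send and the receive can
// fail if the worker side is gone, and both now map to Err.
fn call(tx: &mpsc::Sender<(i32, mpsc::Sender<i32>)>, input: i32) -> Result<i32, String> {
    let (return_tx, return_rx) = mpsc::channel();
    tx.send((input, return_tx))
        .map_err(|_| "wasm extension channel should not be closed yet".to_string())?;
    return_rx
        .recv()
        .map_err(|_| "wasm extension channel".to_string())
}

fn demo() -> Result<i32, String> {
    let (tx, rx) = mpsc::channel::<(i32, mpsc::Sender<i32>)>();
    // Worker doubles each value, standing in for the extension task.
    let worker = thread::spawn(move || {
        for (value, return_tx) in rx {
            let _ = return_tx.send(value * 2);
        }
    });
    let result = call(&tx, 21);
    drop(tx); // closing the channel ends the worker loop
    worker.join().unwrap();
    result
}

fn main() {
    assert_eq!(demo(), Ok(42));
}
```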
@@ -799,8 +810,19 @@ impl WasmState {
}
.boxed_local()
}))
.expect("main thread message channel should not be closed yet");
async move { return_rx.await.expect("main thread message channel") }
.unwrap_or_else(|_| {
panic!(
"main thread message channel should not be closed yet, extension {} (id {})",
self.manifest.name, self.manifest.id,
)
});
let name = self.manifest.name.clone();
let id = self.manifest.id.clone();
async move {
return_rx.await.unwrap_or_else(|_| {
panic!("main thread message channel, extension {name} (id {id})")
})
}
}
fn work_dir(&self) -> PathBuf {

View File

@@ -1404,14 +1404,21 @@ impl PickerDelegate for FileFinderDelegate {
} else {
let path_position = PathWithPosition::parse_str(&raw_query);
#[cfg(windows)]
let raw_query = raw_query.trim().to_owned().replace("/", "\\");
#[cfg(not(windows))]
let raw_query = raw_query.trim().to_owned();
let file_query_end = if path_position.path.to_str().unwrap_or(&raw_query) == raw_query {
None
} else {
// Safe to unwrap as we won't get here when the unwrap in if fails
Some(path_position.path.to_str().unwrap().len())
};
let query = FileSearchQuery {
raw_query: raw_query.trim().to_owned(),
file_query_end: if path_position.path.to_str().unwrap_or(raw_query) == raw_query {
None
} else {
// Safe to unwrap as we won't get here when the unwrap in if fails
Some(path_position.path.to_str().unwrap().len())
},
raw_query,
file_query_end,
path_position,
};
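The file finder restructuring above computes `file_query_end` from `raw_query` before the string is moved into the struct, rather than borrowing it inside its own initializer. A minimal sketch of that compute-then-move ordering (the struct and helper are simplified stand-ins):

```rust
struct FileSearchQuery {
    raw_query: String,
    file_query_end: Option<usize>,
}

// Derive the value that borrows `raw_query` first; only then move the
// string into the struct, so no clone is needed.
fn build_query(raw_query: String, path_str: &str) -> FileSearchQuery {
    let file_query_end = if path_str == raw_query {
        None
    } else {
        Some(path_str.len())
    };
    FileSearchQuery { raw_query, file_query_end }
}

fn main() {
    // A query with a position suffix: the path portion ends at byte 7.
    let query = build_query("main.rs:42".to_string(), "main.rs");
    assert_eq!(query.file_query_end, Some(7));
    // A bare path: no separate end marker.
    assert_eq!(build_query("main.rs".to_string(), "main.rs").file_query_end, None);
}
```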

View File

@@ -24,6 +24,7 @@ buffer_diff.workspace = true
call.workspace = true
chrono.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
command_palette_hooks.workspace = true
component.workspace = true
@@ -62,7 +63,6 @@ watch.workspace = true
workspace-hack.workspace = true
workspace.workspace = true
zed_actions.workspace = true
zed_llm_client.workspace = true
[target.'cfg(windows)'.dependencies]
windows.workspace = true

View File

@@ -71,12 +71,12 @@ use ui::{
use util::{ResultExt, TryFutureExt, maybe};
use workspace::SERIALIZATION_THROTTLE_TIME;
use cloud_llm_client::CompletionIntent;
use workspace::{
Workspace,
dock::{DockPosition, Panel, PanelEvent},
notifications::{DetachAndPromptErr, ErrorMessagePrompt, NotificationId},
};
use zed_llm_client::CompletionIntent;
actions!(
git_panel,

View File

@@ -310,18 +310,6 @@ mod windows {
&rust_binding_path,
);
}
{
let shader_path = PathBuf::from(std::env::var("CARGO_MANIFEST_DIR").unwrap())
.join("src/platform/windows/color_text_raster.hlsl");
compile_shader_for_module(
"emoji_rasterization",
&out_dir,
&fxc_path,
shader_path.to_str().unwrap(),
&rust_binding_path,
);
}
}
/// You can set the `GPUI_FXC_PATH` environment variable to specify the path to the fxc.exe compiler.

View File

@@ -6,6 +6,7 @@ use gpui::{
actions!(example, [Tab, TabPrev]);
struct Example {
focus_handle: FocusHandle,
items: Vec<FocusHandle>,
message: SharedString,
}
@@ -20,8 +21,11 @@ impl Example {
cx.focus_handle().tab_index(2).tab_stop(true),
];
window.focus(items.first().unwrap());
let focus_handle = cx.focus_handle();
window.focus(&focus_handle);
Self {
focus_handle,
items,
message: SharedString::from("Press `Tab`, `Shift-Tab` to switch focus."),
}
@@ -40,6 +44,10 @@ impl Example {
impl Render for Example {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
fn tab_stop_style<T: Styled>(this: T) -> T {
this.border_3().border_color(gpui::blue())
}
fn button(id: impl Into<ElementId>) -> Stateful<Div> {
div()
.id(id)
@@ -52,12 +60,13 @@ impl Render for Example {
.border_color(gpui::black())
.bg(gpui::black())
.text_color(gpui::white())
.focus(|this| this.border_color(gpui::blue()))
.focus(tab_stop_style)
.shadow_sm()
}
div()
.id("app")
.track_focus(&self.focus_handle)
.on_action(cx.listener(Self::on_tab))
.on_action(cx.listener(Self::on_tab_prev))
.size_full()
@@ -86,7 +95,7 @@ impl Render for Example {
.border_color(gpui::black())
.when(
item_handle.tab_stop && item_handle.is_focused(window),
|this| this.border_color(gpui::blue()),
tab_stop_style,
)
.map(|this| match item_handle.tab_stop {
true => this

View File

@@ -198,7 +198,7 @@ impl RenderOnce for CharacterGrid {
"χ", "ψ", "", "а", "в", "Ж", "ж", "З", "з", "К", "к", "л", "м", "Н", "н", "Р", "р",
"У", "у", "ф", "ч", "ь", "ы", "Э", "э", "Я", "я", "ij", "öẋ", ".,", "⣝⣑", "~", "*",
"_", "^", "`", "'", "(", "{", "«", "#", "&", "@", "$", "¢", "%", "|", "?", "", "µ",
"", "<=", "!=", "==", "--", "++", "=>", "->", "🏀", "🎊", "😍", "❤️", "👍", "👎",
"", "<=", "!=", "==", "--", "++", "=>", "->",
];
let columns = 11;

View File

@@ -2023,6 +2023,10 @@ impl HttpClient for NullHttpClient {
.boxed()
}
fn user_agent(&self) -> Option<&http_client::http::HeaderValue> {
None
}
fn proxy(&self) -> Option<&Url> {
None
}

View File

@@ -35,7 +35,6 @@ pub(crate) fn swap_rgba_pa_to_bgra(color: &mut [u8]) {
/// An RGBA color
#[derive(PartialEq, Clone, Copy, Default)]
#[repr(C)]
pub struct Rgba {
/// The red component of the color, in the range 0.0 to 1.0
pub r: f32,

View File

@@ -1004,12 +1004,13 @@ impl X11Client {
let mut keystroke = crate::Keystroke::from_xkb(&state.xkb, modifiers, code);
let keysym = state.xkb.key_get_one_sym(code);
// should be called after key_get_one_sym
state.xkb.update_key(code, xkbc::KeyDirection::Down);
if keysym.is_modifier_key() {
return Some(());
}
// should be called after key_get_one_sym
state.xkb.update_key(code, xkbc::KeyDirection::Down);
if let Some(mut compose_state) = state.compose_state.take() {
compose_state.feed(keysym);
match compose_state.status() {
@@ -1067,12 +1068,13 @@ impl X11Client {
let keystroke = crate::Keystroke::from_xkb(&state.xkb, modifiers, code);
let keysym = state.xkb.key_get_one_sym(code);
// should be called after key_get_one_sym
state.xkb.update_key(code, xkbc::KeyDirection::Up);
if keysym.is_modifier_key() {
return Some(());
}
// should be called after key_get_one_sym
state.xkb.update_key(code, xkbc::KeyDirection::Up);
keystroke
};
drop(state);

View File

@@ -1,39 +0,0 @@
struct RasterVertexOutput {
float4 position : SV_Position;
float2 texcoord : TEXCOORD0;
};
RasterVertexOutput emoji_rasterization_vertex(uint vertexID : SV_VERTEXID)
{
RasterVertexOutput output;
output.texcoord = float2((vertexID << 1) & 2, vertexID & 2);
output.position = float4(output.texcoord * 2.0f - 1.0f, 0.0f, 1.0f);
output.position.y = -output.position.y;
return output;
}
struct PixelInput {
float4 position: SV_Position;
float2 texcoord : TEXCOORD0;
};
struct Bounds {
int2 origin;
int2 size;
};
Texture2D<float4> t_layer : register(t0);
SamplerState s_layer : register(s0);
cbuffer GlyphLayerTextureParams : register(b0) {
Bounds bounds;
float4 run_color;
};
float4 emoji_rasterization_fragment(PixelInput input): SV_Target {
float3 sampled = t_layer.Sample(s_layer, input.texcoord.xy).rgb;
float alpha = (sampled.r + sampled.g + sampled.b) / 3;
return float4(run_color.rgb, alpha);
}

View File

@@ -1,4 +1,4 @@
use std::{borrow::Cow, mem::ManuallyDrop, sync::Arc};
use std::{borrow::Cow, sync::Arc};
use ::util::ResultExt;
use anyhow::Result;
@@ -10,8 +10,11 @@ use windows::{
Foundation::*,
Globalization::GetUserDefaultLocaleName,
Graphics::{
Direct3D::D3D_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP, Direct3D11::*, DirectWrite::*,
Dxgi::Common::*, Gdi::LOGFONTW, Imaging::*,
Direct2D::{Common::*, *},
DirectWrite::*,
Dxgi::Common::*,
Gdi::LOGFONTW,
Imaging::*,
},
System::SystemServices::LOCALE_NAME_MAX_LENGTH,
UI::WindowsAndMessaging::*,
@@ -37,21 +40,16 @@ struct DirectWriteComponent {
locale: String,
factory: IDWriteFactory5,
bitmap_factory: AgileReference<IWICImagingFactory>,
d2d1_factory: ID2D1Factory,
in_memory_loader: IDWriteInMemoryFontFileLoader,
builder: IDWriteFontSetBuilder1,
text_renderer: Arc<TextRendererWrapper>,
render_params: IDWriteRenderingParams3,
gpu_state: GPUState,
render_context: GlyphRenderContext,
}
struct GPUState {
device: ID3D11Device,
device_context: ID3D11DeviceContext,
sampler: [Option<ID3D11SamplerState>; 1],
blend_state: ID3D11BlendState,
vertex_shader: ID3D11VertexShader,
pixel_shader: ID3D11PixelShader,
struct GlyphRenderContext {
params: IDWriteRenderingParams3,
dc_target: ID2D1DeviceContext4,
}
struct DirectWriteState {
@@ -72,11 +70,12 @@ struct FontIdentifier {
}
impl DirectWriteComponent {
pub fn new(bitmap_factory: &IWICImagingFactory, gpu_context: &DirectXDevices) -> Result<Self> {
// todo: ideally this would not be a large unsafe block but smaller isolated ones for easier auditing
pub fn new(bitmap_factory: &IWICImagingFactory) -> Result<Self> {
unsafe {
let factory: IDWriteFactory5 = DWriteCreateFactory(DWRITE_FACTORY_TYPE_SHARED)?;
let bitmap_factory = AgileReference::new(bitmap_factory)?;
let d2d1_factory: ID2D1Factory =
D2D1CreateFactory(D2D1_FACTORY_TYPE_MULTI_THREADED, None)?;
// The `IDWriteInMemoryFontFileLoader` here is supported starting from
// Windows 10 Creators Update, which consequently requires the entire
// `DirectWriteTextSystem` to run on `win10 1703`+.
@@ -87,133 +86,60 @@ impl DirectWriteComponent {
GetUserDefaultLocaleName(&mut locale_vec);
let locale = String::from_utf16_lossy(&locale_vec);
let text_renderer = Arc::new(TextRendererWrapper::new(&locale));
let render_params = {
let default_params: IDWriteRenderingParams3 =
factory.CreateRenderingParams()?.cast()?;
let gamma = default_params.GetGamma();
let enhanced_contrast = default_params.GetEnhancedContrast();
let gray_contrast = default_params.GetGrayscaleEnhancedContrast();
let cleartype_level = default_params.GetClearTypeLevel();
let grid_fit_mode = default_params.GetGridFitMode();
factory.CreateCustomRenderingParams(
gamma,
enhanced_contrast,
gray_contrast,
cleartype_level,
DWRITE_PIXEL_GEOMETRY_RGB,
DWRITE_RENDERING_MODE1_NATURAL_SYMMETRIC,
grid_fit_mode,
)?
};
let gpu_state = GPUState::new(gpu_context)?;
let render_context = GlyphRenderContext::new(&factory, &d2d1_factory)?;
Ok(DirectWriteComponent {
locale,
factory,
bitmap_factory,
d2d1_factory,
in_memory_loader,
builder,
text_renderer,
render_params,
gpu_state,
render_context,
})
}
}
}
impl GPUState {
fn new(gpu_context: &DirectXDevices) -> Result<Self> {
// todo: safety comments
let device = gpu_context.device.clone();
let device_context = gpu_context.device_context.clone();
impl GlyphRenderContext {
pub fn new(factory: &IDWriteFactory5, d2d1_factory: &ID2D1Factory) -> Result<Self> {
unsafe {
let default_params: IDWriteRenderingParams3 =
factory.CreateRenderingParams()?.cast()?;
let gamma = default_params.GetGamma();
let enhanced_contrast = default_params.GetEnhancedContrast();
let gray_contrast = default_params.GetGrayscaleEnhancedContrast();
let cleartype_level = default_params.GetClearTypeLevel();
let grid_fit_mode = default_params.GetGridFitMode();
let blend_state = {
let mut blend_state = None;
let desc = D3D11_BLEND_DESC {
AlphaToCoverageEnable: false.into(),
IndependentBlendEnable: false.into(),
RenderTarget: [
D3D11_RENDER_TARGET_BLEND_DESC {
BlendEnable: true.into(),
SrcBlend: D3D11_BLEND_SRC_ALPHA,
DestBlend: D3D11_BLEND_INV_SRC_ALPHA,
BlendOp: D3D11_BLEND_OP_ADD,
SrcBlendAlpha: D3D11_BLEND_SRC_ALPHA,
DestBlendAlpha: D3D11_BLEND_INV_SRC_ALPHA,
BlendOpAlpha: D3D11_BLEND_OP_ADD,
RenderTargetWriteMask: D3D11_COLOR_WRITE_ENABLE_ALL.0 as u8,
},
Default::default(),
Default::default(),
Default::default(),
Default::default(),
Default::default(),
Default::default(),
Default::default(),
],
};
unsafe { device.CreateBlendState(&desc, Some(&mut blend_state)) }?;
blend_state.unwrap()
};
let sampler = {
let mut sampler = None;
let desc = D3D11_SAMPLER_DESC {
Filter: D3D11_FILTER_MIN_MAG_MIP_POINT,
AddressU: D3D11_TEXTURE_ADDRESS_BORDER,
AddressV: D3D11_TEXTURE_ADDRESS_BORDER,
AddressW: D3D11_TEXTURE_ADDRESS_BORDER,
MipLODBias: 0.0,
MaxAnisotropy: 1,
ComparisonFunc: D3D11_COMPARISON_ALWAYS,
BorderColor: [0.0, 0.0, 0.0, 0.0],
MinLOD: 0.0,
MaxLOD: 0.0,
};
unsafe { device.CreateSamplerState(&desc, Some(&mut sampler)) }?;
[sampler]
};
let vertex_shader = {
let source = shader_resources::RawShaderBytes::new(
shader_resources::ShaderModule::EmojiRasterization,
shader_resources::ShaderTarget::Vertex,
let params = factory.CreateCustomRenderingParams(
gamma,
enhanced_contrast,
gray_contrast,
cleartype_level,
DWRITE_PIXEL_GEOMETRY_RGB,
DWRITE_RENDERING_MODE1_NATURAL_SYMMETRIC,
grid_fit_mode,
)?;
let mut shader = None;
unsafe { device.CreateVertexShader(source.as_bytes(), None, Some(&mut shader)) }?;
shader.unwrap()
};
let dc_target = {
let target = d2d1_factory.CreateDCRenderTarget(&get_render_target_property(
DXGI_FORMAT_B8G8R8A8_UNORM,
D2D1_ALPHA_MODE_PREMULTIPLIED,
))?;
let target = target.cast::<ID2D1DeviceContext4>()?;
target.SetTextRenderingParams(&params);
target
};
let pixel_shader = {
let source = shader_resources::RawShaderBytes::new(
shader_resources::ShaderModule::EmojiRasterization,
shader_resources::ShaderTarget::Fragment,
)?;
let mut shader = None;
unsafe { device.CreatePixelShader(source.as_bytes(), None, Some(&mut shader)) }?;
shader.unwrap()
};
Ok(Self {
device,
device_context,
sampler,
blend_state,
vertex_shader,
pixel_shader,
})
Ok(Self { params, dc_target })
}
}
}
impl DirectWriteTextSystem {
pub(crate) fn new(
gpu_context: &DirectXDevices,
bitmap_factory: &IWICImagingFactory,
) -> Result<Self> {
let components = DirectWriteComponent::new(bitmap_factory, gpu_context)?;
pub(crate) fn new(bitmap_factory: &IWICImagingFactory) -> Result<Self> {
let components = DirectWriteComponent::new(bitmap_factory)?;
let system_font_collection = unsafe {
let mut result = std::mem::zeroed();
components
@@ -723,6 +649,11 @@ impl DirectWriteState {
}
fn raster_bounds(&self, params: &RenderGlyphParams) -> Result<Bounds<DevicePixels>> {
let render_target = &self.components.render_context.dc_target;
unsafe {
render_target.SetUnitMode(D2D1_UNIT_MODE_DIPS);
render_target.SetDpi(96.0 * params.scale_factor, 96.0 * params.scale_factor);
}
let font = &self.fonts[params.font_id.0];
let glyph_id = [params.glyph_id.0 as u16];
let advance = [0.0f32];
@@ -737,36 +668,25 @@ impl DirectWriteState {
isSideways: BOOL(0),
bidiLevel: 0,
};
let rendering_mode = DWRITE_RENDERING_MODE1_NATURAL_SYMMETRIC;
let measuring_mode = DWRITE_MEASURING_MODE_NATURAL;
let baseline_origin_x = 0.0;
let baseline_origin_y = 0.0;
let transform = DWRITE_MATRIX {
m11: params.scale_factor,
m12: 0.0,
m21: 0.0,
m22: params.scale_factor,
dx: 0.0,
dy: 0.0,
};
let glyph_analysis = unsafe {
self.components.factory.CreateGlyphRunAnalysis(
let bounds = unsafe {
render_target.GetGlyphRunWorldBounds(
Vector2 { X: 0.0, Y: 0.0 },
&glyph_run,
Some(&transform),
rendering_mode,
measuring_mode,
DWRITE_GRID_FIT_MODE_DEFAULT,
DWRITE_TEXT_ANTIALIAS_MODE_CLEARTYPE,
baseline_origin_x,
baseline_origin_y,
DWRITE_MEASURING_MODE_NATURAL,
)?
};
let texture_type = DWRITE_TEXTURE_CLEARTYPE_3x1;
let bounds = unsafe { glyph_analysis.GetAlphaTextureBounds(texture_type)? };
// todo(windows)
// This is a workaround; delete it once the root cause is figured out.
let y_offset;
let extra_height;
if params.is_emoji {
y_offset = 0;
extra_height = 0;
} else {
// make some room for the scaler.
y_offset = -1;
extra_height = 2;
}
if bounds.right < bounds.left {
Ok(Bounds {
@@ -775,10 +695,15 @@ impl DirectWriteState {
})
} else {
Ok(Bounds {
origin: point((bounds.left as i32).into(), (bounds.top as i32).into()),
origin: point(
((bounds.left * params.scale_factor).ceil() as i32).into(),
((bounds.top * params.scale_factor).ceil() as i32 + y_offset).into(),
),
size: size(
(bounds.right - bounds.left).into(),
(bounds.bottom - bounds.top).into(),
(((bounds.right - bounds.left) * params.scale_factor).ceil() as i32).into(),
(((bounds.bottom - bounds.top) * params.scale_factor).ceil() as i32
+ extra_height)
.into(),
),
})
}
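The bounds math above converts Direct2D DIP-space bounds into device pixels by multiplying by the scale factor and rounding up, matching the `SetDpi(96.0 * params.scale_factor, ...)` call earlier in the hunk. A minimal sketch of that conversion (the helper name is ours, not GPUI's):

```rust
// With D2D1_UNIT_MODE_DIPS and the DPI set to 96 * scale_factor,
// one DIP maps to `scale_factor` physical pixels; bounds are rounded
// up so a partially covered pixel is still included.
fn dips_to_device_pixels(dip: f32, scale_factor: f32) -> i32 {
    (dip * scale_factor).ceil() as i32
}

fn main() {
    assert_eq!(dips_to_device_pixels(10.0, 1.5), 15);
    assert_eq!(dips_to_device_pixels(10.1, 2.0), 21); // 20.2 rounds up to 21
}
```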
@@ -814,7 +739,7 @@ impl DirectWriteState {
ascenderOffset: glyph_bounds.origin.y.0 as f32 / params.scale_factor,
}];
let glyph_run = DWRITE_GLYPH_RUN {
fontFace: ManuallyDrop::new(Some(font_info.font_face.cast()?)),
fontFace: unsafe { std::mem::transmute_copy(&font_info.font_face) },
fontEmSize: params.font_size.0,
glyphCount: 1,
glyphIndices: glyph_id.as_ptr(),
@@ -834,398 +759,150 @@ impl DirectWriteState {
}
let bitmap_size = bitmap_size;
let subpixel_shift = params
.subpixel_variant
.map(|v| v as f32 / SUBPIXEL_VARIANTS as f32);
let baseline_origin_x = subpixel_shift.x / params.scale_factor;
let baseline_origin_y = subpixel_shift.y / params.scale_factor;
let transform = DWRITE_MATRIX {
m11: params.scale_factor,
m12: 0.0,
m21: 0.0,
m22: params.scale_factor,
dx: 0.0,
dy: 0.0,
};
let rendering_mode = if params.is_emoji {
DWRITE_RENDERING_MODE1_NATURAL
} else {
DWRITE_RENDERING_MODE1_NATURAL_SYMMETRIC
};
let measuring_mode = DWRITE_MEASURING_MODE_NATURAL;
let glyph_analysis = unsafe {
self.components.factory.CreateGlyphRunAnalysis(
&glyph_run,
Some(&transform),
rendering_mode,
measuring_mode,
DWRITE_GRID_FIT_MODE_DEFAULT,
DWRITE_TEXT_ANTIALIAS_MODE_CLEARTYPE,
baseline_origin_x,
baseline_origin_y,
)?
};
let texture_type = DWRITE_TEXTURE_CLEARTYPE_3x1;
let texture_bounds = unsafe { glyph_analysis.GetAlphaTextureBounds(texture_type)? };
let texture_width = (texture_bounds.right - texture_bounds.left) as u32;
let texture_height = (texture_bounds.bottom - texture_bounds.top) as u32;
if texture_width == 0 || texture_height == 0 {
return Ok((
bitmap_size,
vec![
0u8;
bitmap_size.width.0 as usize
* bitmap_size.height.0 as usize
* if params.is_emoji { 4 } else { 1 }
],
));
}
let mut bitmap_data: Vec<u8>;
let total_bytes;
let bitmap_format;
let render_target_property;
let bitmap_width;
let bitmap_height;
let bitmap_stride;
let bitmap_dpi;
if params.is_emoji {
if let Ok(color) = self.rasterize_color(
&glyph_run,
rendering_mode,
measuring_mode,
&transform,
point(baseline_origin_x, baseline_origin_y),
bitmap_size,
) {
bitmap_data = color;
} else {
let monochrome = Self::rasterize_monochrome(
&glyph_analysis,
bitmap_size,
size(texture_width, texture_height),
&texture_bounds,
)?;
bitmap_data = monochrome
.into_iter()
.flat_map(|pixel| [0, 0, 0, pixel])
.collect::<Vec<_>>();
}
total_bytes = bitmap_size.height.0 as usize * bitmap_size.width.0 as usize * 4;
bitmap_format = &GUID_WICPixelFormat32bppPBGRA;
render_target_property = get_render_target_property(
DXGI_FORMAT_B8G8R8A8_UNORM,
D2D1_ALPHA_MODE_PREMULTIPLIED,
);
bitmap_width = bitmap_size.width.0 as u32;
bitmap_height = bitmap_size.height.0 as u32;
bitmap_stride = bitmap_size.width.0 as u32 * 4;
bitmap_dpi = 96.0;
} else {
bitmap_data = Self::rasterize_monochrome(
&glyph_analysis,
bitmap_size,
size(texture_width, texture_height),
&texture_bounds,
)?;
total_bytes = bitmap_size.height.0 as usize * bitmap_size.width.0 as usize;
bitmap_format = &GUID_WICPixelFormat8bppAlpha;
render_target_property =
get_render_target_property(DXGI_FORMAT_A8_UNORM, D2D1_ALPHA_MODE_STRAIGHT);
bitmap_width = bitmap_size.width.0 as u32 * 2;
bitmap_height = bitmap_size.height.0 as u32 * 2;
bitmap_stride = bitmap_size.width.0 as u32;
bitmap_dpi = 192.0;
}
Ok((bitmap_size, bitmap_data))
}
fn rasterize_monochrome(
glyph_analysis: &IDWriteGlyphRunAnalysis,
bitmap_size: Size<DevicePixels>,
texture_size: Size<u32>,
texture_bounds: &RECT,
) -> Result<Vec<u8>> {
let mut bitmap_data =
vec![0u8; bitmap_size.width.0 as usize * bitmap_size.height.0 as usize];
let mut alpha_data = vec![0u8; (texture_size.width * texture_size.height * 3) as usize];
let bitmap_factory = self.components.bitmap_factory.resolve()?;
unsafe {
glyph_analysis.CreateAlphaTexture(
DWRITE_TEXTURE_CLEARTYPE_3x1,
texture_bounds,
&mut alpha_data,
let bitmap = bitmap_factory.CreateBitmap(
bitmap_width,
bitmap_height,
bitmap_format,
WICBitmapCacheOnLoad,
)?;
}
let render_target = self
.components
.d2d1_factory
.CreateWicBitmapRenderTarget(&bitmap, &render_target_property)?;
let brush = render_target.CreateSolidColorBrush(&BRUSH_COLOR, None)?;
let subpixel_shift = params
.subpixel_variant
.map(|v| v as f32 / SUBPIXEL_VARIANTS as f32);
let baseline_origin = Vector2 {
X: subpixel_shift.x / params.scale_factor,
Y: subpixel_shift.y / params.scale_factor,
};
// Convert ClearType RGB data to grayscale and place in bitmap
let offset_x = texture_bounds.left.max(0) as usize;
let offset_y = texture_bounds.top.max(0) as usize;
// This `cast()` should never fail since we are running on Win10+, and
// `ID2D1DeviceContext4` only requires Win8+
let render_target = render_target.cast::<ID2D1DeviceContext4>().unwrap();
render_target.SetUnitMode(D2D1_UNIT_MODE_DIPS);
render_target.SetDpi(
bitmap_dpi * params.scale_factor,
bitmap_dpi * params.scale_factor,
);
render_target.SetTextRenderingParams(&self.components.render_context.params);
render_target.BeginDraw();
for y in 0..texture_size.height as usize {
for x in 0..texture_size.width as usize {
let bitmap_x = offset_x + x;
let bitmap_y = offset_y + y;
if bitmap_x < bitmap_size.width.0 as usize
&& bitmap_y < bitmap_size.height.0 as usize
{
let texture_idx = (y * texture_size.width as usize + x) * 3;
let bitmap_idx = bitmap_y * bitmap_size.width.0 as usize + bitmap_x;
if texture_idx + 2 < alpha_data.len() && bitmap_idx < bitmap_data.len() {
let max_value = alpha_data[texture_idx]
.max(alpha_data[texture_idx + 1])
.max(alpha_data[texture_idx + 2]);
bitmap_data[bitmap_idx] = max_value;
if params.is_emoji {
// WARN: only DWRITE_GLYPH_IMAGE_FORMATS_COLR has been tested
let enumerator = self.components.factory.TranslateColorGlyphRun(
baseline_origin,
&glyph_run as _,
None,
DWRITE_GLYPH_IMAGE_FORMATS_COLR
| DWRITE_GLYPH_IMAGE_FORMATS_SVG
| DWRITE_GLYPH_IMAGE_FORMATS_PNG
| DWRITE_GLYPH_IMAGE_FORMATS_JPEG
| DWRITE_GLYPH_IMAGE_FORMATS_PREMULTIPLIED_B8G8R8A8,
DWRITE_MEASURING_MODE_NATURAL,
None,
0,
)?;
while enumerator.MoveNext().is_ok() {
let Ok(color_glyph) = enumerator.GetCurrentRun() else {
break;
};
let color_glyph = &*color_glyph;
let brush_color = translate_color(&color_glyph.Base.runColor);
brush.SetColor(&brush_color);
match color_glyph.glyphImageFormat {
DWRITE_GLYPH_IMAGE_FORMATS_PNG
| DWRITE_GLYPH_IMAGE_FORMATS_JPEG
| DWRITE_GLYPH_IMAGE_FORMATS_PREMULTIPLIED_B8G8R8A8 => render_target
.DrawColorBitmapGlyphRun(
color_glyph.glyphImageFormat,
baseline_origin,
&color_glyph.Base.glyphRun,
color_glyph.measuringMode,
D2D1_COLOR_BITMAP_GLYPH_SNAP_OPTION_DEFAULT,
),
DWRITE_GLYPH_IMAGE_FORMATS_SVG => render_target.DrawSvgGlyphRun(
baseline_origin,
&color_glyph.Base.glyphRun,
&brush,
None,
color_glyph.Base.paletteIndex as u32,
color_glyph.measuringMode,
),
_ => render_target.DrawGlyphRun(
baseline_origin,
&color_glyph.Base.glyphRun,
Some(color_glyph.Base.glyphRunDescription as *const _),
&brush,
color_glyph.measuringMode,
),
}
}
}
}
Ok(bitmap_data)
}
fn rasterize_color(
&self,
glyph_run: &DWRITE_GLYPH_RUN,
rendering_mode: DWRITE_RENDERING_MODE1,
measuring_mode: DWRITE_MEASURING_MODE,
transform: &DWRITE_MATRIX,
baseline_origin: Point<f32>,
bitmap_size: Size<DevicePixels>,
) -> Result<Vec<u8>> {
// todo: support formats other than COLR
let color_enumerator = unsafe {
self.components.factory.TranslateColorGlyphRun(
Vector2::new(baseline_origin.x, baseline_origin.y),
glyph_run,
None,
DWRITE_GLYPH_IMAGE_FORMATS_COLR,
measuring_mode,
Some(transform),
0,
)
}?;
let mut glyph_layers = Vec::new();
loop {
let color_run = unsafe { color_enumerator.GetCurrentRun() }?;
let color_run = unsafe { &*color_run };
let image_format = color_run.glyphImageFormat & !DWRITE_GLYPH_IMAGE_FORMATS_TRUETYPE;
if image_format == DWRITE_GLYPH_IMAGE_FORMATS_COLR {
let color_analysis = unsafe {
self.components.factory.CreateGlyphRunAnalysis(
&color_run.Base.glyphRun as *const _,
Some(transform),
rendering_mode,
measuring_mode,
DWRITE_GRID_FIT_MODE_DEFAULT,
DWRITE_TEXT_ANTIALIAS_MODE_CLEARTYPE,
baseline_origin.x,
baseline_origin.y,
)
}?;
let color_bounds =
unsafe { color_analysis.GetAlphaTextureBounds(DWRITE_TEXTURE_CLEARTYPE_3x1) }?;
let color_size = size(
color_bounds.right - color_bounds.left,
color_bounds.bottom - color_bounds.top,
} else {
render_target.DrawGlyphRun(
baseline_origin,
&glyph_run,
None,
&brush,
DWRITE_MEASURING_MODE_NATURAL,
);
if color_size.width > 0 && color_size.height > 0 {
let mut alpha_data =
vec![0u8; (color_size.width * color_size.height * 3) as usize];
unsafe {
color_analysis.CreateAlphaTexture(
DWRITE_TEXTURE_CLEARTYPE_3x1,
&color_bounds,
&mut alpha_data,
)
}?;
}
render_target.EndDraw(None, None)?;
let run_color = {
let run_color = color_run.Base.runColor;
Rgba {
r: run_color.r,
g: run_color.g,
b: run_color.b,
a: run_color.a,
}
};
let bounds = bounds(point(color_bounds.left, color_bounds.top), color_size);
let alpha_data = alpha_data
.chunks_exact(3)
.flat_map(|chunk| [chunk[0], chunk[1], chunk[2], 255])
.collect::<Vec<_>>();
glyph_layers.push(GlyphLayerTexture::new(
&self.components.gpu_state,
run_color,
bounds,
&alpha_data,
)?);
let mut raw_data = vec![0u8; total_bytes];
if params.is_emoji {
bitmap.CopyPixels(std::ptr::null() as _, bitmap_stride, &mut raw_data)?;
// Convert from BGRA with premultiplied alpha to BGRA with straight alpha.
for pixel in raw_data.chunks_exact_mut(4) {
let a = pixel[3] as f32 / 255.;
pixel[0] = (pixel[0] as f32 / a) as u8;
pixel[1] = (pixel[1] as f32 / a) as u8;
pixel[2] = (pixel[2] as f32 / a) as u8;
}
}
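The loop above divides each color channel by alpha to undo premultiplication. A standalone sketch of the same conversion, with an added zero-alpha guard that the diff's loop does not have (a fully transparent pixel would otherwise divide by zero):

```rust
/// Convert BGRA pixels from premultiplied to straight alpha.
/// Fully transparent pixels are skipped to avoid dividing by zero.
fn unpremultiply(data: &mut [u8]) {
    for px in data.chunks_exact_mut(4) {
        let a = px[3] as f32 / 255.0;
        if a > 0.0 {
            for c in &mut px[..3] {
                *c = ((*c as f32 / a).min(255.0)) as u8;
            }
        }
    }
}

fn main() {
    // Half-opaque gray: premultiplied channels are half their straight values.
    let mut px = vec![64u8, 64, 64, 128];
    unpremultiply(&mut px);
    println!("{:?}", px);
}
```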
let has_next = unsafe { color_enumerator.MoveNext() }
.map(|e| e.as_bool())
.unwrap_or(false);
if !has_next {
break;
}
}
let gpu_state = &self.components.gpu_state;
let params_buffer = {
let desc = D3D11_BUFFER_DESC {
ByteWidth: std::mem::size_of::<GlyphLayerTextureParams>() as u32,
Usage: D3D11_USAGE_DYNAMIC,
BindFlags: D3D11_BIND_CONSTANT_BUFFER.0 as u32,
CPUAccessFlags: D3D11_CPU_ACCESS_WRITE.0 as u32,
MiscFlags: 0,
StructureByteStride: 0,
};
let mut buffer = None;
unsafe {
gpu_state
.device
.CreateBuffer(&desc, None, Some(&mut buffer))
}?;
[buffer]
};
let render_target_texture = {
let mut texture = None;
let desc = D3D11_TEXTURE2D_DESC {
Width: bitmap_size.width.0 as u32,
Height: bitmap_size.height.0 as u32,
MipLevels: 1,
ArraySize: 1,
Format: DXGI_FORMAT_B8G8R8A8_UNORM,
SampleDesc: DXGI_SAMPLE_DESC {
Count: 1,
Quality: 0,
},
Usage: D3D11_USAGE_DEFAULT,
BindFlags: D3D11_BIND_RENDER_TARGET.0 as u32,
CPUAccessFlags: 0,
MiscFlags: 0,
};
unsafe {
gpu_state
.device
.CreateTexture2D(&desc, None, Some(&mut texture))
}?;
texture.unwrap()
};
let render_target_view = {
let desc = D3D11_RENDER_TARGET_VIEW_DESC {
Format: DXGI_FORMAT_B8G8R8A8_UNORM,
ViewDimension: D3D11_RTV_DIMENSION_TEXTURE2D,
Anonymous: D3D11_RENDER_TARGET_VIEW_DESC_0 {
Texture2D: D3D11_TEX2D_RTV { MipSlice: 0 },
},
};
let mut rtv = None;
unsafe {
gpu_state.device.CreateRenderTargetView(
&render_target_texture,
Some(&desc),
Some(&mut rtv),
)
}?;
[rtv]
};
let staging_texture = {
let mut texture = None;
let desc = D3D11_TEXTURE2D_DESC {
Width: bitmap_size.width.0 as u32,
Height: bitmap_size.height.0 as u32,
MipLevels: 1,
ArraySize: 1,
Format: DXGI_FORMAT_B8G8R8A8_UNORM,
SampleDesc: DXGI_SAMPLE_DESC {
Count: 1,
Quality: 0,
},
Usage: D3D11_USAGE_STAGING,
BindFlags: 0,
CPUAccessFlags: D3D11_CPU_ACCESS_READ.0 as u32,
MiscFlags: 0,
};
unsafe {
gpu_state
.device
.CreateTexture2D(&desc, None, Some(&mut texture))
}?;
texture.unwrap()
};
let device_context = &gpu_state.device_context;
unsafe { device_context.IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP) };
unsafe { device_context.VSSetShader(&gpu_state.vertex_shader, None) };
unsafe { device_context.PSSetShader(&gpu_state.pixel_shader, None) };
unsafe { device_context.VSSetConstantBuffers(0, Some(&params_buffer)) };
unsafe { device_context.PSSetConstantBuffers(0, Some(&params_buffer)) };
unsafe { device_context.OMSetRenderTargets(Some(&render_target_view), None) };
unsafe { device_context.PSSetSamplers(0, Some(&gpu_state.sampler)) };
unsafe { device_context.OMSetBlendState(&gpu_state.blend_state, None, 0xffffffff) };
for layer in glyph_layers {
let params = GlyphLayerTextureParams {
run_color: layer.run_color,
bounds: layer.bounds,
};
unsafe {
let mut dest = std::mem::zeroed();
gpu_state.device_context.Map(
params_buffer[0].as_ref().unwrap(),
0,
D3D11_MAP_WRITE_DISCARD,
0,
Some(&mut dest),
} else {
let scaler = bitmap_factory.CreateBitmapScaler()?;
scaler.Initialize(
&bitmap,
bitmap_size.width.0 as u32,
bitmap_size.height.0 as u32,
WICBitmapInterpolationModeHighQualityCubic,
)?;
std::ptr::copy_nonoverlapping(&params as *const _, dest.pData as *mut _, 1);
gpu_state
.device_context
.Unmap(params_buffer[0].as_ref().unwrap(), 0);
};
let texture = [Some(layer.texture_view)];
unsafe { device_context.PSSetShaderResources(0, Some(&texture)) };
let viewport = [D3D11_VIEWPORT {
TopLeftX: layer.bounds.origin.x as f32,
TopLeftY: layer.bounds.origin.y as f32,
Width: layer.bounds.size.width as f32,
Height: layer.bounds.size.height as f32,
MinDepth: 0.0,
MaxDepth: 1.0,
}];
unsafe { device_context.RSSetViewports(Some(&viewport)) };
unsafe { device_context.Draw(4, 0) };
scaler.CopyPixels(std::ptr::null() as _, bitmap_stride, &mut raw_data)?;
}
Ok((bitmap_size, raw_data))
}
unsafe { device_context.CopyResource(&staging_texture, &render_target_texture) };
let mapped_data = {
let mut mapped_data = D3D11_MAPPED_SUBRESOURCE::default();
unsafe {
device_context.Map(
&staging_texture,
0,
D3D11_MAP_READ,
0,
Some(&mut mapped_data),
)
}?;
mapped_data
};
let mut rasterized =
vec![0u8; (bitmap_size.width.0 as u32 * bitmap_size.height.0 as u32 * 4) as usize];
for y in 0..bitmap_size.height.0 as usize {
let width = bitmap_size.width.0 as usize;
unsafe {
std::ptr::copy_nonoverlapping::<u8>(
(mapped_data.pData as *const u8).byte_add(mapped_data.RowPitch as usize * y),
rasterized
.as_mut_ptr()
.byte_add(width * y * std::mem::size_of::<u32>()),
width * std::mem::size_of::<u32>(),
)
};
}
Ok(rasterized)
}
fn get_typographic_bounds(&self, font_id: FontId, glyph_id: GlyphId) -> Result<Bounds<f32>> {
@@ -1299,84 +976,6 @@ impl Drop for DirectWriteState {
}
}
struct GlyphLayerTexture {
run_color: Rgba,
bounds: Bounds<i32>,
texture: ID3D11Texture2D,
texture_view: ID3D11ShaderResourceView,
}
impl GlyphLayerTexture {
pub fn new(
gpu_state: &GPUState,
run_color: Rgba,
bounds: Bounds<i32>,
alpha_data: &[u8],
) -> Result<Self> {
// todo: ideally we would have a nice texture wrapper
let texture_size = bounds.size;
let desc = D3D11_TEXTURE2D_DESC {
Width: texture_size.width as u32,
Height: texture_size.height as u32,
MipLevels: 1,
ArraySize: 1,
Format: DXGI_FORMAT_R8G8B8A8_UNORM,
SampleDesc: DXGI_SAMPLE_DESC {
Count: 1,
Quality: 0,
},
Usage: D3D11_USAGE_DEFAULT,
BindFlags: D3D11_BIND_SHADER_RESOURCE.0 as u32,
CPUAccessFlags: D3D11_CPU_ACCESS_WRITE.0 as u32,
MiscFlags: 0,
};
let texture = {
let mut texture: Option<ID3D11Texture2D> = None;
unsafe {
gpu_state
.device
.CreateTexture2D(&desc, None, Some(&mut texture))?
};
texture.unwrap()
};
let texture_view = {
let mut view: Option<ID3D11ShaderResourceView> = None;
unsafe {
gpu_state
.device
.CreateShaderResourceView(&texture, None, Some(&mut view))?
};
view.unwrap()
};
unsafe {
gpu_state.device_context.UpdateSubresource(
&texture,
0,
None,
alpha_data.as_ptr() as _,
(texture_size.width * 4) as u32,
0,
)
};
Ok(GlyphLayerTexture {
run_color,
bounds,
texture,
texture_view,
})
}
}
#[repr(C)]
struct GlyphLayerTextureParams {
bounds: Bounds<i32>,
run_color: Rgba,
}
struct TextRendererWrapper(pub IDWriteTextRenderer);
impl TextRendererWrapper {
@@ -1871,6 +1470,16 @@ fn get_name(string: IDWriteLocalizedStrings, locale: &str) -> Result<String> {
Ok(String::from_utf16_lossy(&name_vec[..name_length]))
}
#[inline]
fn translate_color(color: &DWRITE_COLOR_F) -> D2D1_COLOR_F {
D2D1_COLOR_F {
r: color.r,
g: color.g,
b: color.b,
a: color.a,
}
}
fn get_system_ui_font_name() -> SharedString {
unsafe {
let mut info: LOGFONTW = std::mem::zeroed();
@@ -1895,6 +1504,24 @@ fn get_system_ui_font_name() -> SharedString {
}
}
#[inline]
fn get_render_target_property(
pixel_format: DXGI_FORMAT,
alpha_mode: D2D1_ALPHA_MODE,
) -> D2D1_RENDER_TARGET_PROPERTIES {
D2D1_RENDER_TARGET_PROPERTIES {
r#type: D2D1_RENDER_TARGET_TYPE_DEFAULT,
pixelFormat: D2D1_PIXEL_FORMAT {
format: pixel_format,
alphaMode: alpha_mode,
},
dpiX: 96.0,
dpiY: 96.0,
usage: D2D1_RENDER_TARGET_USAGE_NONE,
minLevel: D2D1_FEATURE_LEVEL_DEFAULT,
}
}
// One would think the newer DirectWrite method IDWriteFontFace4::GetGlyphImageFormats could be used here,
// but it doesn't seem to work for some glyphs, say ❤
fn is_color_glyph(
@@ -1934,6 +1561,12 @@ fn is_color_glyph(
}
const DEFAULT_LOCALE_NAME: PCWSTR = windows::core::w!("en-US");
const BRUSH_COLOR: D2D1_COLOR_F = D2D1_COLOR_F {
r: 1.0,
g: 1.0,
b: 1.0,
a: 1.0,
};
#[cfg(test)]
mod tests {

View File

@@ -7,7 +7,7 @@ use windows::Win32::Graphics::{
D3D11_USAGE_DEFAULT, ID3D11Device, ID3D11DeviceContext, ID3D11ShaderResourceView,
ID3D11Texture2D,
},
Dxgi::Common::*,
Dxgi::Common::{DXGI_FORMAT_A8_UNORM, DXGI_FORMAT_B8G8R8A8_UNORM, DXGI_SAMPLE_DESC},
};
use crate::{
@@ -167,7 +167,7 @@ impl DirectXAtlasState {
let bytes_per_pixel;
match kind {
AtlasTextureKind::Monochrome => {
pixel_format = DXGI_FORMAT_R8_UNORM;
pixel_format = DXGI_FORMAT_A8_UNORM;
bind_flag = D3D11_BIND_SHADER_RESOURCE;
bytes_per_pixel = 1;
}
@@ -265,7 +265,6 @@ impl DirectXAtlasTexture {
bounds: Bounds<DevicePixels>,
bytes: &[u8],
) {
println!("{:?}", bounds);
unsafe {
device_context.UpdateSubresource(
&self.texture,

View File

@@ -42,8 +42,8 @@ pub(crate) struct DirectXRenderer {
pub(crate) struct DirectXDevices {
adapter: IDXGIAdapter1,
dxgi_factory: IDXGIFactory6,
pub(crate) device: ID3D11Device,
pub(crate) device_context: ID3D11DeviceContext,
device: ID3D11Device,
device_context: ID3D11DeviceContext,
dxgi_device: Option<IDXGIDevice>,
}
@@ -175,8 +175,6 @@ impl DirectXRenderer {
}
fn pre_draw(&self) -> Result<()> {
let premultiplied_alpha = 1;
update_buffer(
&self.devices.device_context,
self.globals.global_params_buffer[0].as_ref().unwrap(),
@@ -185,7 +183,6 @@ impl DirectXRenderer {
self.resources.viewport[0].Width,
self.resources.viewport[0].Height,
],
premultiplied_alpha,
..Default::default()
}],
)?;
@@ -822,8 +819,7 @@ impl DirectXGlobalElements {
#[repr(C)]
struct GlobalParams {
viewport_size: [f32; 2],
premultiplied_alpha: u32,
_pad: u32,
_pad: u64,
}
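Dropping `premultiplied_alpha` leaves the remaining fields at 12 bytes, so the padding widens from `u32` to `u64`: D3D11 constant buffers must have a `ByteWidth` that is a multiple of 16. A sketch of the resulting layout check (mirroring the struct above, not the full GPUI definition):

```rust
// Mirrors the diff's GlobalParams after the change: 8 bytes of data
// plus 8 bytes of explicit padding keeps the cbuffer size at 16,
// satisfying D3D11's 16-byte constant-buffer alignment rule.
#[repr(C)]
struct GlobalParams {
    viewport_size: [f32; 2], // 8 bytes
    _pad: u64,               // 8 bytes of explicit padding
}

fn main() {
    assert_eq!(std::mem::size_of::<GlobalParams>(), 16);
    assert_eq!(std::mem::size_of::<GlobalParams>() % 16, 0);
}
```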
struct PipelineState<T> {
@@ -1077,7 +1073,7 @@ fn create_swap_chain_for_composition(
// Composition SwapChains only support the DXGI_SCALING_STRETCH Scaling.
Scaling: DXGI_SCALING_STRETCH,
SwapEffect: DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL,
AlphaMode: DXGI_ALPHA_MODE_IGNORE,
AlphaMode: DXGI_ALPHA_MODE_PREMULTIPLIED,
Flags: 0,
};
Ok(unsafe { dxgi_factory.CreateSwapChainForComposition(device, &desc, None)? })
@@ -1281,7 +1277,7 @@ fn create_blend_state(device: &ID3D11Device) -> Result<ID3D11BlendState> {
desc.RenderTarget[0].SrcBlend = D3D11_BLEND_SRC_ALPHA;
desc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE;
desc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA;
desc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_INV_SRC_ALPHA;
desc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ONE;
desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL.0 as u8;
unsafe {
let mut state = None;
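The blend state above encodes the standard "over" operator for color, while the changed `DestBlendAlpha = D3D11_BLEND_ONE` accumulates destination alpha instead of attenuating it. A scalar sketch of the two equations (our naming, not D3D11 API calls; the clamp stands in for a UNORM render target's saturation):

```rust
// Color: out = src * src_a + dst * (1 - src_a)   (SRC_ALPHA / INV_SRC_ALPHA)
fn blend_color(src: f32, dst: f32, src_a: f32) -> f32 {
    src * src_a + dst * (1.0 - src_a)
}

// Alpha: out_a = src_a * 1 + dst_a * 1           (ONE / ONE, after this change)
fn blend_alpha(src_a: f32, dst_a: f32) -> f32 {
    (src_a + dst_a).min(1.0)
}

fn main() {
    // A fully opaque source replaces the destination color entirely.
    assert_eq!(blend_color(0.8, 0.2, 1.0), 0.8);
    // A fully transparent source leaves the destination untouched.
    assert_eq!(blend_color(0.8, 0.2, 0.0), 0.2);
    // Destination alpha now accumulates rather than being attenuated.
    assert_eq!(blend_alpha(0.5, 0.25), 0.75);
}
```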
@@ -1427,7 +1423,7 @@ fn report_live_objects(device: &ID3D11Device) -> Result<()> {
const BUFFER_COUNT: usize = 3;
pub(crate) mod shader_resources {
mod shader_resources {
use anyhow::Result;
#[cfg(debug_assertions)]
@@ -1440,7 +1436,7 @@ pub(crate) mod shader_resources {
};
#[derive(Copy, Clone, Debug, Eq, PartialEq)]
pub(crate) enum ShaderModule {
pub(super) enum ShaderModule {
Quad,
Shadow,
Underline,
@@ -1448,16 +1444,15 @@ pub(crate) mod shader_resources {
PathSprite,
MonochromeSprite,
PolychromeSprite,
EmojiRasterization,
}
#[derive(Copy, Clone, Debug, Eq, PartialEq)]
pub(crate) enum ShaderTarget {
pub(super) enum ShaderTarget {
Vertex,
Fragment,
}
pub(crate) struct RawShaderBytes<'t> {
pub(super) struct RawShaderBytes<'t> {
inner: &'t [u8],
#[cfg(debug_assertions)]
@@ -1465,7 +1460,7 @@ pub(crate) mod shader_resources {
}
impl<'t> RawShaderBytes<'t> {
pub(crate) fn new(module: ShaderModule, target: ShaderTarget) -> Result<Self> {
pub(super) fn new(module: ShaderModule, target: ShaderTarget) -> Result<Self> {
#[cfg(not(debug_assertions))]
{
Ok(Self::from_bytes(module, target))
@@ -1483,7 +1478,7 @@ pub(crate) mod shader_resources {
}
}
pub(crate) fn as_bytes(&'t self) -> &'t [u8] {
pub(super) fn as_bytes(&'t self) -> &'t [u8] {
self.inner
}
@@ -1518,10 +1513,6 @@ pub(crate) mod shader_resources {
ShaderTarget::Vertex => POLYCHROME_SPRITE_VERTEX_BYTES,
ShaderTarget::Fragment => POLYCHROME_SPRITE_FRAGMENT_BYTES,
},
ShaderModule::EmojiRasterization => match target {
ShaderTarget::Vertex => EMOJI_RASTERIZATION_VERTEX_BYTES,
ShaderTarget::Fragment => EMOJI_RASTERIZATION_FRAGMENT_BYTES,
},
};
Self { inner: bytes }
}
@@ -1530,12 +1521,6 @@ pub(crate) mod shader_resources {
#[cfg(debug_assertions)]
pub(super) fn build_shader_blob(entry: ShaderModule, target: ShaderTarget) -> Result<ID3DBlob> {
unsafe {
let shader_name = if matches!(entry, ShaderModule::EmojiRasterization) {
"color_text_raster.hlsl"
} else {
"shaders.hlsl"
};
let entry = format!(
"{}_{}\0",
entry.as_str(),
@@ -1551,9 +1536,8 @@ pub(crate) mod shader_resources {
let mut compile_blob = None;
let mut error_blob = None;
let shader_path = std::path::PathBuf::from(env!("CARGO_MANIFEST_DIR"))
.join(&format!("src/platform/windows/{}", shader_name))
.join("src/platform/windows/shaders.hlsl")
.canonicalize()?;
let entry_point = PCSTR::from_raw(entry.as_ptr());
@@ -1599,7 +1583,6 @@ pub(crate) mod shader_resources {
ShaderModule::PathSprite => "path_sprite",
ShaderModule::MonochromeSprite => "monochrome_sprite",
ShaderModule::PolychromeSprite => "polychrome_sprite",
ShaderModule::EmojiRasterization => "emoji_rasterization",
}
}
}


@@ -44,7 +44,6 @@ pub(crate) struct WindowsPlatform {
drop_target_helper: IDropTargetHelper,
validation_number: usize,
main_thread_id_win32: u32,
disable_direct_composition: bool,
}
pub(crate) struct WindowsPlatformState {
@@ -94,18 +93,14 @@ impl WindowsPlatform {
main_thread_id_win32,
validation_number,
));
let disable_direct_composition = std::env::var(DISABLE_DIRECT_COMPOSITION)
.is_ok_and(|value| value == "true" || value == "1");
let background_executor = BackgroundExecutor::new(dispatcher.clone());
let foreground_executor = ForegroundExecutor::new(dispatcher);
let directx_devices = DirectXDevices::new(disable_direct_composition)
.context("Unable to init directx devices.")?;
let bitmap_factory = ManuallyDrop::new(unsafe {
CoCreateInstance(&CLSID_WICImagingFactory, None, CLSCTX_INPROC_SERVER)
.context("Error creating bitmap factory.")?
});
let text_system = Arc::new(
DirectWriteTextSystem::new(&directx_devices, &bitmap_factory)
DirectWriteTextSystem::new(&bitmap_factory)
.context("Error creating DirectWriteTextSystem")?,
);
let drop_target_helper: IDropTargetHelper = unsafe {
@@ -125,7 +120,6 @@ impl WindowsPlatform {
background_executor,
foreground_executor,
text_system,
disable_direct_composition,
windows_version,
bitmap_factory,
drop_target_helper,
@@ -190,7 +184,6 @@ impl WindowsPlatform {
validation_number: self.validation_number,
main_receiver: self.main_receiver.clone(),
main_thread_id_win32: self.main_thread_id_win32,
disable_direct_composition: self.disable_direct_composition,
}
}
@@ -722,7 +715,6 @@ pub(crate) struct WindowCreationInfo {
pub(crate) validation_number: usize,
pub(crate) main_receiver: flume::Receiver<Runnable>,
pub(crate) main_thread_id_win32: u32,
pub(crate) disable_direct_composition: bool,
}
fn open_target(target: &str) {


@@ -1,7 +1,6 @@
cbuffer GlobalParams: register(b0) {
float2 global_viewport_size;
uint premultiplied_alpha;
uint _pad;
uint2 _global_pad;
};
Texture2D<float4> t_sprite: register(t0);
@@ -1070,7 +1069,6 @@ struct MonochromeSpriteFragmentInput {
float4 position: SV_Position;
float2 tile_position: POSITION;
nointerpolation float4 color: COLOR;
float4 clip_distance: SV_ClipDistance;
};
StructuredBuffer<MonochromeSprite> mono_sprites: register(t1);
@@ -1092,28 +1090,11 @@ MonochromeSpriteVertexOutput monochrome_sprite_vertex(uint vertex_id: SV_VertexI
return output;
}
float4 blend_color(float4 color, float alpha_factor) {
float alpha = color.a * alpha_factor;
float multiplier = premultiplied_alpha != 0 ? alpha : 1.0;
return float4(color.rgb * multiplier, alpha);
}
float3 linear_to_srgb(float3 l) {
bool cutoff = l < float3(0.0031308, 0.0031308, 0.0031308);
float3 higher = float3(1.055, 1.055, 1.055) * pow(l, float3(1.0 / 2.4, 1.0 / 2.4, 1.0 / 2.4)) - float3(0.055, 0.055, 0.055);
float3 lower = l * float3(12.92, 12.92, 12.92);
return cutoff ? lower : higher;
}
float4 monochrome_sprite_fragment(MonochromeSpriteFragmentInput input): SV_Target {
float sample = t_sprite.Sample(s_sprite, input.tile_position).r;
float4 sample = t_sprite.Sample(s_sprite, input.tile_position);
float4 color = input.color;
// color.a *= sample;
// return float4(color.rgb, color.a);
if (any(input.clip_distance < 0.0)) {
return float4(0.0, 0.0, 0.0, 0.0);
}
return blend_color(input.color, sample);
color.a *= sample.a;
return color;
}
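The monochrome-sprite hunk above touches a `blend_color` helper whose premultiplied-alpha handling is easy to misread in HLSL. A minimal Rust sketch of that helper's logic (hypothetical simplified types; the real code operates on shader `float4` values):

```rust
// Sketch of the `blend_color` helper from the hunk above: under
// premultiplied alpha, the RGB channels are scaled by the final alpha;
// otherwise they are left untouched and only alpha is modulated.
fn blend_color(color: [f32; 4], alpha_factor: f32, premultiplied: bool) -> [f32; 4] {
    let alpha = color[3] * alpha_factor;
    let multiplier = if premultiplied { alpha } else { 1.0 };
    [
        color[0] * multiplier,
        color[1] * multiplier,
        color[2] * multiplier,
        alpha,
    ]
}

fn main() {
    // Premultiplied: RGB scaled by the resulting alpha.
    assert_eq!(blend_color([1.0, 0.5, 0.0, 1.0], 0.5, true), [0.5, 0.25, 0.0, 0.5]);
    // Straight alpha: RGB unchanged.
    assert_eq!(blend_color([1.0, 0.5, 0.0, 1.0], 0.5, false), [1.0, 0.5, 0.0, 0.5]);
}
```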


@@ -360,7 +360,6 @@ impl WindowsWindow {
validation_number,
main_receiver,
main_thread_id_win32,
disable_direct_composition,
} = creation_info;
let classname = register_wnd_class(icon);
let hide_title_bar = params
@@ -376,6 +375,8 @@ impl WindowsWindow {
.map(|title| title.as_ref())
.unwrap_or(""),
);
let disable_direct_composition = std::env::var(DISABLE_DIRECT_COMPOSITION)
.is_ok_and(|value| value == "true" || value == "1");
let (mut dwexstyle, dwstyle) = if params.kind == WindowKind::PopUp {
(WS_EX_TOOLWINDOW, WINDOW_STYLE(0x0))

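The window hunk above moves the `DISABLE_DIRECT_COMPOSITION` check from platform init into window creation. A self-contained sketch of that env-flag check (the helper name is hypothetical; the diff inlines the expression):

```rust
use std::env::VarError;

// The flag counts as set only when the variable is exactly "true" or "1",
// mirroring the `is_ok_and` expression in the hunk above.
fn flag_enabled(value: Result<String, VarError>) -> bool {
    value.is_ok_and(|v| v == "true" || v == "1")
}

fn main() {
    assert!(flag_enabled(Ok("1".to_string())));
    assert!(flag_enabled(Ok("true".to_string())));
    assert!(!flag_enabled(Ok("yes".to_string())));
    assert!(!flag_enabled(Err(VarError::NotPresent)));
}
```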

@@ -32,20 +32,18 @@ impl TabHandles {
self.handles.clear();
}
fn current_index(&self, focused_id: Option<&FocusId>) -> usize {
self.handles
.iter()
.position(|h| Some(&h.id) == focused_id)
.unwrap_or_default()
fn current_index(&self, focused_id: Option<&FocusId>) -> Option<usize> {
self.handles.iter().position(|h| Some(&h.id) == focused_id)
}
pub(crate) fn next(&self, focused_id: Option<&FocusId>) -> Option<FocusHandle> {
let ix = self.current_index(focused_id);
let mut next_ix = ix + 1;
if next_ix + 1 > self.handles.len() {
next_ix = 0;
}
let next_ix = self
.current_index(focused_id)
.and_then(|ix| {
let next_ix = ix + 1;
(next_ix < self.handles.len()).then_some(next_ix)
})
.unwrap_or_default();
if let Some(next_handle) = self.handles.get(next_ix) {
Some(next_handle.clone())
@@ -55,7 +53,7 @@ impl TabHandles {
}
pub(crate) fn prev(&self, focused_id: Option<&FocusId>) -> Option<FocusHandle> {
let ix = self.current_index(focused_id);
let ix = self.current_index(focused_id).unwrap_or_default();
let prev_ix;
if ix == 0 {
prev_ix = self.handles.len().saturating_sub(1);
@@ -108,8 +106,14 @@ mod tests {
]
);
// next
assert_eq!(tab.next(None), Some(tab.handles[1].clone()));
// Select first tab index if no handle is currently focused.
assert_eq!(tab.next(None), Some(tab.handles[0].clone()));
// Select last tab index if no handle is currently focused.
assert_eq!(
tab.prev(None),
Some(tab.handles[tab.handles.len() - 1].clone())
);
assert_eq!(
tab.next(Some(&tab.handles[0].id)),
Some(tab.handles[1].clone())

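The `TabHandles` hunk above changes `current_index` to return `Option<usize>`, so `next` can distinguish "no tab focused" (start at the first tab) from "last tab focused" (wrap around). A minimal sketch of the new selection logic, using plain integers as stand-in handles (hypothetical simplification of `FocusHandle`):

```rust
// `current_index` now reports None when no handle is focused.
fn current_index(handles: &[u32], focused: Option<u32>) -> Option<usize> {
    handles.iter().position(|h| Some(*h) == focused)
}

// `next` advances, wraps past the end, and defaults to index 0
// when nothing is focused -- matching the updated tests in the hunk.
fn next_index(handles: &[u32], focused: Option<u32>) -> Option<usize> {
    if handles.is_empty() {
        return None;
    }
    let next_ix = current_index(handles, focused)
        .and_then(|ix| {
            let next = ix + 1;
            (next < handles.len()).then_some(next)
        })
        .unwrap_or_default();
    Some(next_ix)
}

fn main() {
    let handles = [10, 20, 30];
    // No focus: select the first tab.
    assert_eq!(next_index(&handles, None), Some(0));
    // Last tab focused: wrap to the first.
    assert_eq!(next_index(&handles, Some(30)), Some(0));
    // Normal advance.
    assert_eq!(next_index(&handles, Some(10)), Some(1));
}
```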

@@ -4,6 +4,7 @@ pub mod github;
pub use anyhow::{Result, anyhow};
pub use async_body::{AsyncBody, Inner};
use derive_more::Deref;
use http::HeaderValue;
pub use http::{self, Method, Request, Response, StatusCode, Uri};
use futures::future::BoxFuture;
@@ -39,6 +40,8 @@ impl HttpRequestExt for http::request::Builder {
pub trait HttpClient: 'static + Send + Sync {
fn type_name(&self) -> &'static str;
fn user_agent(&self) -> Option<&HeaderValue>;
fn send(
&self,
req: http::Request<AsyncBody>,
@@ -118,6 +121,10 @@ impl HttpClient for HttpClientWithProxy {
self.client.send(req)
}
fn user_agent(&self) -> Option<&HeaderValue> {
self.client.user_agent()
}
fn proxy(&self) -> Option<&Url> {
self.proxy.as_ref()
}
@@ -135,6 +142,10 @@ impl HttpClient for Arc<HttpClientWithProxy> {
self.client.send(req)
}
fn user_agent(&self) -> Option<&HeaderValue> {
self.client.user_agent()
}
fn proxy(&self) -> Option<&Url> {
self.proxy.as_ref()
}
@@ -225,6 +236,22 @@ impl HttpClientWithUrl {
)?)
}
/// Builds a Zed Cloud URL using the given path.
pub fn build_zed_cloud_url(&self, path: &str, query: &[(&str, &str)]) -> Result<Url> {
let base_url = self.base_url();
let base_api_url = match base_url.as_ref() {
"https://zed.dev" => "https://cloud.zed.dev",
"https://staging.zed.dev" => "https://cloud.zed.dev",
"http://localhost:3000" => "http://localhost:8787",
other => other,
};
Ok(Url::parse_with_params(
&format!("{}{}", base_api_url, path),
query,
)?)
}
/// Builds a Zed LLM URL using the given path.
pub fn build_zed_llm_url(&self, path: &str, query: &[(&str, &str)]) -> Result<Url> {
let base_url = self.base_url();
@@ -250,6 +277,10 @@ impl HttpClient for Arc<HttpClientWithUrl> {
self.client.send(req)
}
fn user_agent(&self) -> Option<&HeaderValue> {
self.client.user_agent()
}
fn proxy(&self) -> Option<&Url> {
self.client.proxy.as_ref()
}
@@ -267,6 +298,10 @@ impl HttpClient for HttpClientWithUrl {
self.client.send(req)
}
fn user_agent(&self) -> Option<&HeaderValue> {
self.client.user_agent()
}
fn proxy(&self) -> Option<&Url> {
self.client.proxy.as_ref()
}
@@ -314,6 +349,10 @@ impl HttpClient for BlockedHttpClient {
})
}
fn user_agent(&self) -> Option<&HeaderValue> {
None
}
fn proxy(&self) -> Option<&Url> {
None
}
@@ -334,6 +373,7 @@ type FakeHttpHandler = Box<
#[cfg(feature = "test-support")]
pub struct FakeHttpClient {
handler: FakeHttpHandler,
user_agent: HeaderValue,
}
#[cfg(feature = "test-support")]
@@ -348,6 +388,7 @@ impl FakeHttpClient {
client: HttpClientWithProxy {
client: Arc::new(Self {
handler: Box::new(move |req| Box::pin(handler(req))),
user_agent: HeaderValue::from_static(type_name::<Self>()),
}),
proxy: None,
},
@@ -390,6 +431,10 @@ impl HttpClient for FakeHttpClient {
future
}
fn user_agent(&self) -> Option<&HeaderValue> {
Some(&self.user_agent)
}
fn proxy(&self) -> Option<&Url> {
None
}

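The `build_zed_cloud_url` addition above maps the client's base URL onto the Cloud API host before appending the path. The mapping itself, extracted into a standalone sketch (the helper name is hypothetical; the values come straight from the hunk):

```rust
// Production and staging both target the shared Cloud host, local
// development targets the Cloud worker's port, and any other base URL
// passes through unchanged.
fn cloud_api_base(base_url: &str) -> &str {
    match base_url {
        "https://zed.dev" => "https://cloud.zed.dev",
        "https://staging.zed.dev" => "https://cloud.zed.dev",
        "http://localhost:3000" => "http://localhost:8787",
        other => other,
    }
}

fn main() {
    assert_eq!(cloud_api_base("https://zed.dev"), "https://cloud.zed.dev");
    assert_eq!(cloud_api_base("http://localhost:3000"), "http://localhost:8787");
    assert_eq!(cloud_api_base("https://example.com"), "https://example.com");
}
```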

@@ -15,6 +15,7 @@ doctest = false
[dependencies]
anyhow.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
copilot.workspace = true
editor.workspace = true
feature_flags.workspace = true
@@ -32,7 +33,6 @@ ui.workspace = true
workspace-hack.workspace = true
workspace.workspace = true
zed_actions.workspace = true
zed_llm_client.workspace = true
zeta.workspace = true
[dev-dependencies]


@@ -1,5 +1,6 @@
use anyhow::Result;
use client::{DisableAiSettings, UserStore, zed_urls};
use cloud_llm_client::UsageLimit;
use copilot::{Copilot, Status};
use editor::{
Editor, SelectionEffects,
@@ -34,7 +35,6 @@ use workspace::{
notifications::NotificationId,
};
use zed_actions::OpenBrowser;
use zed_llm_client::UsageLimit;
use zeta::RateCompletions;
actions!(


@@ -166,7 +166,6 @@ pub struct CachedLspAdapter {
pub reinstall_attempt_count: AtomicU64,
cached_binary: futures::lock::Mutex<Option<LanguageServerBinary>>,
manifest_name: OnceLock<Option<ManifestName>>,
attach_kind: OnceLock<Attach>,
}
impl Debug for CachedLspAdapter {
@@ -202,7 +201,6 @@ impl CachedLspAdapter {
adapter,
cached_binary: Default::default(),
reinstall_attempt_count: AtomicU64::new(0),
attach_kind: Default::default(),
manifest_name: Default::default(),
})
}
@@ -288,29 +286,15 @@ impl CachedLspAdapter {
.get_or_init(|| self.adapter.manifest_name())
.clone()
}
pub fn attach_kind(&self) -> Attach {
*self.attach_kind.get_or_init(|| self.adapter.attach_kind())
}
}
/// Determines what gets sent out as a workspace folders content
#[derive(Clone, Copy, Debug, PartialEq)]
pub enum Attach {
/// Create a single language server instance per subproject root.
InstancePerRoot,
/// Use one shared language server instance for all subprojects within a project.
Shared,
}
impl Attach {
pub fn root_path(
&self,
root_subproject_path: (WorktreeId, Arc<Path>),
) -> (WorktreeId, Arc<Path>) {
match self {
Attach::InstancePerRoot => root_subproject_path,
Attach::Shared => (root_subproject_path.0, Arc::from(Path::new(""))),
}
}
pub enum WorkspaceFoldersContent {
/// Send out a single entry with the root of the workspace.
WorktreeRoot,
/// Send out a list of subproject roots.
SubprojectRoots,
}
/// [`LspAdapterDelegate`] allows [`LspAdapter]` implementations to interface with the application
@@ -602,8 +586,11 @@ pub trait LspAdapter: 'static + Send + Sync {
Ok(original)
}
fn attach_kind(&self) -> Attach {
Attach::Shared
/// Determines whether a language server supports workspace folders.
///
/// And does not trip over itself in the process.
fn workspace_folders_content(&self) -> WorkspaceFoldersContent {
WorkspaceFoldersContent::SubprojectRoots
}
fn manifest_name(&self) -> Option<ManifestName> {

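The hunk above removes the `Attach` enum in favor of `WorkspaceFoldersContent`. The behavior being retired is the `root_path` mapping, where a `Shared` server collapsed every subproject root to the worktree root. A sketch with simplified stand-in types (a plain integer for `WorktreeId`, a `String` for the path):

```rust
#[derive(Clone, Copy)]
enum Attach {
    InstancePerRoot,
    Shared,
}

// `InstancePerRoot` keeps the subproject path; `Shared` collapses it to
// the empty (worktree-root) path, as in the removed `root_path` method.
fn root_path(attach: Attach, root: (u64, &str)) -> (u64, String) {
    match attach {
        Attach::InstancePerRoot => (root.0, root.1.to_string()),
        Attach::Shared => (root.0, String::new()),
    }
}

fn main() {
    assert_eq!(root_path(Attach::Shared, (1, "crates/gpui")), (1, String::new()));
    assert_eq!(
        root_path(Attach::InstancePerRoot, (1, "crates/gpui")),
        (1, "crates/gpui".to_string())
    );
}
```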

@@ -20,6 +20,7 @@ anthropic = { workspace = true, features = ["schemars"] }
anyhow.workspace = true
base64.workspace = true
client.workspace = true
cloud_llm_client.workspace = true
collections.workspace = true
futures.workspace = true
gpui.workspace = true
@@ -37,7 +38,6 @@ telemetry_events.workspace = true
thiserror.workspace = true
util.workspace = true
workspace-hack.workspace = true
zed_llm_client.workspace = true
[dev-dependencies]
gpui = { workspace = true, features = ["test-support"] }

Some files were not shown because too many files have changed in this diff.