Compare commits

...

157 Commits

Author SHA1 Message Date
Marshall Bowers
9641ae0755 Remove basic.conf (#10120)
This PR removes the `basic.conf` file.

In #10099 we suppressed some typo warnings that had cropped up in this
file, but it turns out we don't need the file at all.

Release Notes:

- N/A
2024-04-03 12:32:47 -04:00
Kirill Bulatov
ce73ff9808 Avoid failing format test with current date (#10068)
Replace the test that used the current date taken from
`chrono::offset::Local::now().naive_local()`, which caused the
formatting test to fail at least once per year.


Release Notes:

- N/A
2024-04-03 12:32:40 -04:00
Joseph T. Lyons
240db73199 v0.129.x stable 2024-04-03 12:11:10 -04:00
gcp-cherry-pick-bot[bot]
6b52917e75 Don't update active completion for editors that are not focused (cherry-pick #9904) (#9907)
Cherry-picked Don't update active completion for editors that are not
focused (#9904)

Release Notes:

- N/A

Co-authored-by: Antonio Scandurra <me@as-cii.com>
2024-03-28 10:52:33 +01:00
Marshall Bowers
f226a9932a zed 0.129.1 2024-03-27 13:50:53 -04:00
Marshall Bowers
a7915cb848 Look up extensions in the new index when reporting extension events (#9879)
This PR fixes a bug that was causing extension telemetry events to not
be reported.

We need to look up the extensions in the new index, as the extensions to
load won't be found in the old index.

Release Notes:

- N/A
2024-03-27 13:48:45 -04:00
Joseph T. Lyons
2d8288f076 v0.129.x preview 2024-03-27 10:52:55 -04:00
Dunqing
96b812b2c4 Pin Vue language server to 1.8 (#9846)
After the `@vue/language-server` 2.0 release, the Vue LSP no longer works. I
tried to support 2.0, but since I'm not familiar with `@vue/language-server`
and `zed` I was unsuccessful. To avoid long-term unavailability, I
temporarily pinned the version to 1.8 until we have 2.0 support.

Release Notes:

- Pinned `@vue/language-server` to version `1.8` until Zed supports
`2.x`. ([#9388](https://github.com/zed-industries/zed/issues/9388) &
[#9329](https://github.com/zed-industries/zed/issues/9329)).

---------

Co-authored-by: Thorsten Ball <mrnugget@gmail.com>
2024-03-27 14:23:10 +01:00
Piotr Osiewicz
de4a54a204 chore: Bump ahash 0.7.6 (yanked) -> 0.7.8 (#9860)
Fixes #9855 

Release Notes:

- N/A
2024-03-27 13:51:58 +01:00
Kirill Bulatov
63f17c50b9 Fix mac bundling errors (#9848)
Based on
https://github.com/zed-industries/zed/pull/8952#issuecomment-2021693384
and
https://github.com/zed-industries/zed/pull/8952#issuecomment-2022241455

Fixes `./script/bundle-mac -l` workflow errors

* Use proper WebRTC.framework location path (without the arch name dir
in its path)

* Fix `./script/bundle-mac -l` behavior that unconditionally installed
the app and broke it on rerun.
Now the installation is done only with the `-i` flag and always cleans up
the target dir (always using the `-f` flag logic, hence that flag was removed).


Release Notes:

- N/A
2024-03-27 13:04:32 +02:00
Conrad Irwin
140b8418c1 Stop reading deserialize_fingerprint (#9668)
Release Notes:

- N/A

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2024-03-27 11:24:31 +01:00
Mikayla Maki
8583c3bd94 Add go to implementation shortcut (#9837)
This adds a keybinding for an existing action. Notably, our bindings for
`Go To Type Definition` and `Go To Implementation` are swapped from
VSCode. We use `cmd` and `shift`, they use `shift` and `cmd`.

Release Notes:

- Added a keybinding for `editor::GoToImplementation`
2024-03-26 17:04:55 -07:00
Mikayla Maki
40f60ebe2d Fix the linux keymap (#9829)
Earlier versions were a simple find-replace of `cmd` => `ctrl`. In this
PR, I've gone over every keybinding individually and checked them.

Release Notes:

- Removed the `ShowContextMenu` action; its only usage was in the
collab panel and it's been rebound to `SecondaryConfirm`
2024-03-26 16:10:09 -07:00
Marshall Bowers
3676ca879b Extract Astro support into an extension (#9835)
This PR extracts Astro support into an extension and removes the
built-in Astro support from Zed.

Release Notes:

- Removed built-in support for Astro, in favor of making it available as
an extension. The Astro extension will be suggested for download when
you open a `.astro` file.
2024-03-26 18:50:08 -04:00
Marshall Bowers
7807f23e2a Extract Dockerfile extension (#9832)
This PR extracts Dockerfile support into an extension and removes the
built-in Dockerfile support from Zed.

There's already an existing [Dockerfile
extension](https://github.com/d1y/dockerfile.zed) that was just missing
language server support. Language server support is being added to that
extension in https://github.com/d1y/dockerfile.zed/pull/2.

Release Notes:

- Removed built-in support for Dockerfile, in favor of making it
available as an extension. The Dockerfile extension will be suggested
for download when you open a `Dockerfile`.
2024-03-26 16:38:21 -04:00
Marshall Bowers
181dc86b48 Fix typo in PureScript extension's struct name (#9831)
This PR fixes a copy/paste typo in the name of the PureScript
extension's struct name.

Release Notes:

- N/A
2024-03-26 16:29:55 -04:00
Bennet Bo Fenner
e272acd1bc collab ui: Fix notification windows on external monitors (#9817)
Sharing a project displays a notification (window) on every screen.
Previously there was an issue with the positioning of windows on all
screens except the primary screen.

As you can see here:


![image](https://github.com/zed-industries/zed/assets/53836821/314cf367-8c70-4e8e-bc4a-dcbb99cb4f71)

Now:


![image](https://github.com/zed-industries/zed/assets/53836821/42af9ef3-8af9-453a-ad95-147b5f9d90ba)

@mikayla-maki and I also decided to refactor the `WindowOptions` a bit. 
Previously you could specify bounds which controlled the positioning and
size of the window in the global coordinate space, while also providing
a display id (which screen to show the window on). This can lead to
unusual behavior because you could theoretically specify a global bound
which does not even belong to the display id which was provided.

Therefore we changed the api to this:
```rust
struct WindowOptions {
    /// The bounds of the window in screen coordinates
    /// None -> inherit, Some(bounds) -> set bounds.
    pub bounds: Option<Bounds<DevicePixels>>,

    /// The display to create the window on, if this is None,
    /// the window will be created on the main display
    pub display_id: Option<DisplayId>,
}
```

This lets you specify a display id, which maps to the screen where the
window should be created and bounds relative to the upper left of the
screen.
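
As a rough usage sketch (the placeholder types below mirror the simplified struct above rather than gpui's full definitions, and the concrete sizes and display id are made up for illustration):

```rust
#[derive(Debug, Clone, Copy)]
struct DevicePixels(i32);
#[derive(Debug, Clone, Copy)]
struct Point<T> { x: T, y: T }
#[derive(Debug, Clone, Copy)]
struct Size<T> { width: T, height: T }
#[derive(Debug, Clone, Copy)]
struct Bounds<T> { origin: Point<T>, size: Size<T> }
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct DisplayId(u32);

struct WindowOptions {
    bounds: Option<Bounds<DevicePixels>>,
    display_id: Option<DisplayId>,
}

fn main() {
    // Place a notification window on the second display, sized 400x72 device pixels,
    // with its bounds interpreted relative to that display's upper-left corner.
    let options = WindowOptions {
        bounds: Some(Bounds {
            origin: Point { x: DevicePixels(0), y: DevicePixels(0) },
            size: Size { width: DevicePixels(400), height: DevicePixels(72) },
        }),
        display_id: Some(DisplayId(1)),
    };
    assert!(options.bounds.is_some() && options.display_id.is_some());
}
```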

Release Notes:

- Fixed positioning of popup windows (e.g. when sharing a project) when
using multiple external displays.

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2024-03-26 13:07:38 -07:00
Marshall Bowers
ffd698be14 Remove lingering uiua.rs file (#9828)
This PR removes a lingering file related to Uiua support.

This file was no longer being referenced after #9085, but just hadn't
been removed.

Release Notes:

- N/A
2024-03-26 16:07:23 -04:00
Remco Smits
0fd91652de Fixed channel chat notifications are not working anymore (#9827)
This PR fixes the following issues, which were introduced by pull
request #9557.

- Cannot create messages with a mention inside it
- Cannot invite a user
- Cannot accept an invitation

Release Notes:

- Fixed channel chat notifications not working anymore.

Co-authored-by: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
2024-03-26 20:25:09 +01:00
Marshall Bowers
9604b22d98 Suppress error logs from CopilotCompletionProvider when Copilot is disabled (#9826)
This PR fixes some noisy error logs from the `CopilotCompletionProvider`
when Copilot is disabled entirely via the settings.

I have the following in my settings file:

```json
{
  "features": {
    "copilot": false
  },
}
```

After #9777 I started seeing my Zed logs getting filled up with messages
like this:

```
[2024-03-26T14:33:09-04:00 ERROR util] crates/copilot_ui/src/copilot_completion_provider.rs:206: copilot is disabled
```

Release Notes:

- N/A
2024-03-26 15:14:44 -04:00
Marshall Bowers
cd32ef64ff Use correct file extension for Haskell suggestions (#9825)
This PR fixes the file extension used for suggesting the Haskell
extension.

Release Notes:

- N/A
2024-03-26 14:21:03 -04:00
Marshall Bowers
b8ef97015c Extract PureScript support into an extension (#9824)
This PR extracts PureScript support into an extension and removes the
built-in PureScript support from Zed.

Release Notes:

- Removed built-in support for PureScript, in favor of making it
available as an extension. The PureScript extension will be suggested
for download when you open a `.purs` file.
2024-03-26 13:55:46 -04:00
Maksim Bondarenkov
d77cda1ea9 windows: Support compiling with MinGW toolchain (#9815)
Fixes #9757: compile manifest using `embed-manifest` crate, which
supports both MSVC and MinGW

Release Notes:

- N/A
2024-03-26 10:39:39 -07:00
白山風露
35b39e02ce Windows: Fullscreen (#9728)
~~This is still a work in progress, but to show the public where I am
working on it~~ Ready for review

TODO:
- [x] Justify fullscreen size to display
- [x] Record and apply restored size

Release Notes:

- N/A
2024-03-26 09:58:16 -07:00
Piotr Osiewicz
fc5a0885f3 Fix visual glitches in titlebar when there's an active prompt
It looks like a fractional traffic lights position doesn't play well with Mac 12+.
Related to #7339 and #8128
2024-03-26 17:57:16 +01:00
Marshall Bowers
dbcff2a420 Extract Prisma support into an extension (#9820)
This PR extracts Prisma support into an extension and removes the
built-in Prisma support from Zed.

Release Notes:

- Removed built-in support for Prisma, in favor of making it available
as an extension. The Prisma extension will be suggested for download
when you open a `.prisma` file.
2024-03-26 12:50:44 -04:00
Marshall Bowers
71441317bd Remove blank line in Cargo.toml 2024-03-26 11:44:15 -04:00
Marshall Bowers
1d6792b17d Extract Haskell support into an extension (#9814)
This PR extracts Haskell support into an extension and removes the
built-in Haskell support from Zed.

I tested out the extension locally in a Nix shell using `nix-shell -p
ghc haskell-language-server` to confirm the language server still
operated as expected:

<img width="341" alt="Screenshot 2024-03-26 at 11 26 26 AM"
src="https://github.com/zed-industries/zed/assets/1486634/df16fd38-4046-4a45-ac9f-c2b85bffe5c0">

Release Notes:

- Removed built-in support for Haskell, in favor of making it available
as an extension. The Haskell extension will be suggested for download
when you open a `.hs` file.
2024-03-26 11:41:41 -04:00
Kirill Bulatov
9d4c6c60fb Do not format or fully save non-dirty buffers (#9813)
Fixes https://github.com/zed-industries/zed/issues/9475

Release Notes:

- Start skipping formatting and actual FS saving for non-dirty buffers
([9475](https://github.com/zed-industries/zed/issues/9475))
2024-03-26 17:17:20 +02:00
张小白
f495ee0848 windows: implement mouse double click event (#9642)
Release Notes:

- N/A
2024-03-26 07:45:32 -07:00
Antonio Scandurra
fb6cff89d7 Introduce InlineCompletionProvider (#9777)
This pull request introduces a new `InlineCompletionProvider` trait,
which enables making `Editor` copilot-agnostic and lets us push all the
copilot functionality into the `copilot_ui` module. Long-term, I would
like to merge `copilot` and `copilot_ui`, but right now `project`
depends on `copilot`, which makes this impossible.

The reason for adding this new trait is so that we can experiment with
other inline completion providers and swap them at runtime using config
settings.
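
A purely hypothetical sketch of what such a provider abstraction could look like (the method names below are invented for illustration and are not the trait's actual API):

```rust
// Hypothetical shape of a swappable inline-completion provider; the real trait
// lives in the Zed codebase and differs in its exact methods and signatures.
trait InlineCompletionProvider {
    /// Kick off a completion request for the given cursor position.
    fn refresh(&mut self, buffer_text: &str, cursor_offset: usize);
    /// The ghost text currently suggested at the cursor, if any.
    fn active_completion(&self) -> Option<String>;
    /// Accept the current suggestion, returning the text to insert.
    fn accept(&mut self) -> Option<String>;
}
```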

Please also note that we renamed some of the existing copilot actions
to be more agnostic (see release notes below). We still kept the old
actions bound for backwards-compatibility, but we should probably remove
them at some later version.

Also, as a drive-by, we added new methods to the `Global` trait that let
you read or mutate a global directly, e.g.:

```rs
MyGlobal::update(cx, |global, cx| {
});
```

Release Notes:

- Renamed the `copilot::Suggest` action to
`editor::ShowInlineCompletion`
- Renamed the `copilot::NextSuggestion` action to
`editor::NextInlineCompletion`
- Renamed the `copilot::PreviousSuggestion` action to
`editor::PreviousInlineCompletion`
- Renamed the `editor::AcceptPartialCopilotSuggestion` action to
`editor::AcceptPartialInlineCompletion`

---------

Co-authored-by: Nathan <nathan@zed.dev>
Co-authored-by: Kyle <kylek@zed.dev>
Co-authored-by: Kyle Kelley <rgbkrk@gmail.com>
2024-03-26 13:28:06 +01:00
Kirill Bulatov
b8663e56a9 Make UniformList non-occluding. (#9806)
Fixes https://github.com/zed-industries/zed/issues/9723

Move the `occlude` calls to the places where they are needed.

Release Notes:

- Fixed right click in the project panel's empty region
([9723](https://github.com/zed-industries/zed/issues/9723))

Co-authored-by: Antonio Scandurra <antonio@zed.dev>
2024-03-26 13:13:10 +02:00
Bennet Bo Fenner
db9221aa57 markdown preview: Handle line breaks in between task list items correctly (#9795)
Closes #9783 

Release Notes:

- Fixed task list rendering when there was a line break between two list
items ([#9783](https://github.com/zed-industries/zed/issues/9783))
2024-03-26 12:12:57 +02:00
Thorsten Ball
157fb98a8b Add ability to specify binary path/args for gopls (#9803)
This uses the language server settings added in #9293 to allow users to
specify the binary path and arguments with which to start up `gopls`.

Example user settings for `gopls`:

```json
{
  "lsp": {
    "gopls": {
      "binary": {
        "path": "/Users/thorstenball/tmp/gopls",
        "arguments": ["-debug=0.0.0.0:8080"]
      },
    }
  }
}
```

Constraints:

* Right now this only allows ABSOLUTE paths.

Release Notes:

- Added ability to specify `gopls` binary `path` (must be absolute) and
`arguments` in user settings. Example: `{"lsp": {"gopls": {"binary":
{"path": "/my/abs/path/gopls", "arguments": ["-debug=0.0.0.0:8080"]
}}}}`
2024-03-26 07:09:06 +01:00
Max Brunsfeld
b0409ddd68 Consolidate more extension API structs that were duplicated btwn client and server (#9797)
Release Notes:

- N/A
2024-03-25 21:28:18 -07:00
Max Brunsfeld
5e7fcc02fa Remove old extension dir when upgrading (#9800)
Fixes #9799

Release Notes:

- Fixed a bug where upgrading an extension did not work correctly if the
extension had switched from using an old extension schema with
`extension.json` to the new schema with `extension.toml`.
2024-03-25 21:27:30 -07:00
Max Brunsfeld
5adc51f113 Add telemetry events for loading extensions (#9793)
* Store extensions versions' wasm API version in the database
* Share a common struct for extension API responses between collab and
client
* Add wasm API version and schema version to extension API responses

Release Notes:

- N/A

Co-authored-by: Marshall <marshall@zed.dev>
2024-03-25 17:30:48 -04:00
Ezekiel Warren
9b62e461ed windows: Add extension builder support (#9791)
Release Notes:

- N/A
2024-03-25 17:25:03 -04:00
Andrew Lygin
1b4c82dc2c Fix next/prev shortcuts handling in the File Finder (#9785)
This PR fixes the unexpected File Finder behaviour described in
https://github.com/zed-industries/zed/pull/8782#issuecomment-2018551041

Any change of the modifier keys except for the release of the initial
modifier keys now prevents opening the selected file.

Release Notes:

- N/A
2024-03-25 14:06:37 -07:00
Mikayla Maki
bdea804c48 Restore the hitbox of the excerpt header (#9790)
In https://github.com/zed-industries/zed/pull/9722, the
jump-to-excerpt-source buttons were shrunk too far.

Release Notes:

- N/A
2024-03-25 12:41:54 -07:00
Ezekiel Warren
00a8659491 More C++ path suffixes (#9761)
There are also `.C` and `.H` (capital), but I can't imagine they are very
popular, and I'd be worried about clashing with C.

Release Notes:

- Added more path suffixes recognized as C++
2024-03-25 15:23:09 -04:00
Paul
4789c02a19 X11: Double click (#9739)
This copies the logic from #9608 to the X11 client.

Fixes #9707.

Release Notes:
- N/A

Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2024-03-25 11:44:31 -07:00
RoblKyogre
030e299b27 Fix key repeat after releasing a different key on Wayland (#9768)
Quick fix for key repeat not working when releasing a different
key than the one currently being held.

I don't really know much Rust yet, so I'm unsure whether this is the best
way to handle it, but it does seem like a good starting point to get at
least a tad familiar with the language.

Release Notes:
- N/A
2024-03-25 11:44:24 -07:00
白山風露
6231df978b Windows: fix initial active status (#9694)
Separate from #9451

On Windows, a new window may already be active immediately after creation.

Release Notes:

- N/A

---------

Co-authored-by: Mikayla <mikayla@zed.dev>
2024-03-25 11:44:18 -07:00
Ezekiel Warren
78dc458231 windows: Mouse wheel coordinates fix (#9749)
Mouse scroll wasn't working unless the window was maximized or in the
top-left corner, because Windows wheel events give screen coordinates.

Release Notes:

- N/A
2024-03-25 11:14:51 -07:00
Raunak Raj
2646ed08e7 linux: Implement restart and app_path (#9681)
Added the `restart` and `app_path` methods for the Linux platform, which
were marked as `//todo(linux)`.


Release Notes:

- N/A
2024-03-25 11:14:29 -07:00
Jakob Grønhaug
7bba9da281 Fix dependency install script on RHEL derivatives (#9684)
Added a check to `script/linux` so the script does not try to enable CSB
or add EPEL if the user is on Fedora, which does not need these steps.
The script now runs nicely on Fedora! :)

Release Notes:

- N/A
2024-03-25 11:14:11 -07:00
Daniel Zhu
569a7234fd Handle first click on Zed window (#9553)
Fixes #4336
2024-03-25 10:52:18 -07:00
白山風露
5361a4d72d Windows: Better cursor (#9451)
Moved `SetCursor` calls to `WM_SETCURSOR`, which occurs when the OS
requires the cursor to be set.

Release Notes:

- N/A
2024-03-25 10:06:52 -07:00
Max Brunsfeld
053d05f6f5 Bump Tree-sitter for inclusion of strncat in wasm c stdlib
Co-authored-by: Marshall <marshall@zed.dev>
2024-03-25 09:54:43 -07:00
Marshall Bowers
0981c97a22 extension_cli: Clear out existing manifest collections for old extension.json (#9780)
This PR fixes an issue in the extension CLI when building extensions
using the old manifest schema (`extension.json`).

If there were values provided for the `languages`, `grammars`, or
`themes` collections, these could interfere with the packaging process.

We aren't expecting these fields to be set in the source
`extension.json` (just in the generated one), so we can clear them out
when building up the manifest.

Release Notes:

- N/A

Co-authored-by: Max <max@zed.dev>
2024-03-25 12:44:32 -04:00
Waffle Maybe
eec8660759 Use .is_some_and() instead of .is_some() && .unwrap() (#9704)
That's nicer & more readable.

(I just noticed that this looks weird while trying to understand why zed
changes my cursor, so I decided to make a quick fix (btw, the issue with
the cursor is that zed always loads the cursor named "default" on Wayland))
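
A minimal, self-contained illustration of the rewrite (the option and value here are made up, not taken from the actual diff):

```rust
fn main() {
    let cursor_name: Option<&str> = Some("default");

    // Before: two calls, with an unwrap that must stay in sync with the is_some check.
    let is_default_old = cursor_name.is_some() && cursor_name.unwrap() == "default";

    // After: a single combinator expressing the same condition.
    let is_default_new = cursor_name.is_some_and(|name| name == "default");

    assert_eq!(is_default_old, is_default_new);
}
```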

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-03-25 11:11:35 -04:00
Kirill Bulatov
e3894a4e1e Document main workspace structs (#9772) 2024-03-25 16:09:51 +01:00
Aaron Ruan
f83884518a Change maximum height of TitleBar (#9758)
<img width="209" alt="image"
src="https://github.com/zed-industries/zed/assets/38318044/0dcc4d0b-db9e-4eba-aa36-5c35f185e7e3">

Release Notes:

- Fixed alignment of items in the title bar
([#9709](https://github.com/zed-industries/zed/issues/9709)).
2024-03-25 10:45:19 -04:00
Piotr Osiewicz
a7047f67fb chore: Revert "gpui: update dependencies" (#9774)
Reverts zed-industries/zed#9741

/cc @niklaswimmer it looks like that PR broke our rendering of
avatars (as @bennetbo found out) - they have a blue-ish tint now, which
I suppose might have to do with the change between BGRA and RGBA. I'm gonna
revert it for now, let's reopen it though.


![image](https://github.com/zed-industries/zed/assets/24362066/3078d9c6-9638-441b-8b32-d969c46951e0)

Release Notes:

- N/A
2024-03-25 15:27:16 +01:00
Ko
6776688987 Fix Prisma indentation size (#9753)
This PR fixes #9567 

Release Notes:

- Changed default indentation for Prisma files to 2 spaces
([#9567](https://github.com/zed-industries/zed/issues/9567)).
2024-03-25 08:39:49 -04:00
Niklas Wimmer
7f9355e11f windows: update text system to new cosmic version
The same changes have been used on linux here 5003504031
and here 34832d49b09071846ff6f55f8ca1df019980a1df.

Signed-off-by: Niklas Wimmer <mail@nwimmer.me>
2024-03-25 13:03:57 +01:00
Niklas Wimmer
6a22c8a298 gpui: Update image dependency
The latest update to resvg bumped some transitive dependencies,
which led to duplicates. The update to the image dependency
unifies most of their versions again.

Most notably, gif and kurbo are still duplicated, which is best fixed
downstream however.

Signed-off-by: Niklas Wimmer <mail@nwimmer.me>
2024-03-25 13:03:57 +01:00
Niklas Wimmer
007acc4bc2 gpui: Update cosmic-text and resvg dependency
This unifies the rustybuzz dependency to the same version.

Signed-off-by: Niklas Wimmer <mail@nwimmer.me>
2024-03-25 13:03:57 +01:00
Niklas Wimmer
97f1d61a4a gpui: make build dependencies mac only
This removes bindgen and cbindgen from the dependency graph on non-macos
systems, improving compile times on those systems.

Signed-off-by: Niklas Wimmer <mail@nwimmer.me>
2024-03-25 13:03:57 +01:00
Niklas Wimmer
50ab60b9f0 gpui: update ashpd and open dependency
The ashpd update removes the dependency on an older zbus version which
decreases compile times.

Signed-off-by: Niklas Wimmer <mail@nwimmer.me>
2024-03-25 13:03:57 +01:00
Mayfield
4785520d99 Support newline and tab literals in regex search-and-replace operations (#9609)
Closes #7645

Release Notes:

- Added support for inserting newlines (`\n`) and tabs (`\t`) in editor
Regex search replacements
([#7645](https://github.com/zed-industries/zed/issues/7645)).
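
A hedged sketch of the unescaping idea behind the release note above (not Zed's actual implementation; the helper below is invented for illustration):

```rust
// Turn the literal two-character sequences "\n" and "\t" typed into the replace
// field into real newline and tab characters before applying the replacement.
fn unescape_replacement(input: &str) -> String {
    input.replace("\\n", "\n").replace("\\t", "\t")
}

fn main() {
    let replacement = unescape_replacement(r",\n"); // user typed `,\n` in the replace field
    assert_eq!("item1, item2".replace(", ", &replacement), "item1,\nitem2");
}
```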
2024-03-25 12:21:04 +01:00
Hans
eb3264c0ad change HashSet to BTreeSet (#9734)
I found that there may be a minor problem here: `editor.edit` may
depend on the order of operations. Given the same set of operations,
different execution orders may lead to different results, so we may need
to use `BTreeSet` instead of `HashSet`, because `HashSet` cannot guarantee
a consistent order for the same set of data. But maybe my worries are
unfounded.
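
A small self-contained illustration of the ordering difference (the example values are made up):

```rust
use std::collections::{BTreeSet, HashSet};

fn main() {
    let ops = ["replace", "delete", "insert"];

    // HashSet iteration order is unspecified and can differ between runs/builds,
    // so applying edits in this order is not guaranteed to be deterministic.
    let hashed: HashSet<&str> = ops.iter().copied().collect();
    println!("hash order: {:?}", hashed);

    // BTreeSet always iterates in sorted order, so the edit order is stable.
    let ordered: BTreeSet<&str> = ops.iter().copied().collect();
    assert_eq!(ordered.into_iter().collect::<Vec<_>>(), ["delete", "insert", "replace"]);
}
```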

Release notes:

- N/A
2024-03-25 12:13:44 +01:00
Bennet Bo Fenner
e77d313839 markdown preview: Improve task list visuals (#9695)
Instead of using some arbitrary unicode characters to render a task as
completed/not completed, I feel that using an actual checkbox from the
components crate makes it look more polished.

Before:

![image](https://github.com/zed-industries/zed/assets/53836821/700de8f8-2e01-4e03-b237-e3da2971f039)

After:

<img width="883" alt="image"
src="https://github.com/zed-industries/zed/assets/53836821/f63d56c3-bfbb-41c8-b150-8ebf973f75e2">


Release Notes:

- Improved visuals of task lists inside the markdown preview
2024-03-25 09:43:17 +01:00
Richard Taylor
5181d3f719 Workspace configuration for elixir-ls LSP (#9330)
This allows the workspace configuration settings to be passed to the
elixir-ls LSP via lsp settings.

The following example settings disable dialyzer in the LSP:

```
"lsp": {
  "elixir-ls": {
    "settings": {
      "dialyzerEnabled": false
    }
  }
}
```

It follows the same pattern used in
[#8568](https://github.com/zed-industries/zed/pull/8568) and resolves
[#4260](https://github.com/zed-industries/zed/issues/4260).

Zed's language server logs show the settings are being sent to the
language server:

```
Received client configuration via workspace/configuration
%{"dialyzerEnabled" => false}
Registering for workspace/didChangeConfiguration notifications
Starting build with MIX_ENV: test MIX_TARGET: host
client/registerCapability succeeded
Registering for workspace/didChangeWatchedFiles notifications
client/registerCapability succeeded
Received workspace/didChangeConfiguration
Received client configuration via workspace/didChangeConfiguration
%{"dialyzerEnabled" => false}
```

Release Notes:

- Added workspace configuration settings support for the elixir-ls language
server. Those can now be configured by setting `{"lsp": {"elixir-ls":
{"settings": {"your-settings-here": "here"}}}}` in Zed settings.
[#4260](https://github.com/zed-industries/zed/issues/4260).
2024-03-25 09:35:28 +01:00
Bennet Bo Fenner
6d78737973 markdown preview: Insert missing line break on hard break (#9687)
Closes #8990

For this input
```
Test \
Test
```

pulldown_cmark reports
```
Start(Paragraph)
Text(Borrowed("Test "))
HardBreak
Text(Borrowed("Test"))
End(Paragraph)
```

Previously `Event::HardBreak` just marked the paragraph block as
completed and ignored all the remaining text inside the paragraph.

Before:
See https://github.com/zed-industries/zed/issues/8990#issue-2173197637

After:

![image](https://github.com/zed-industries/zed/assets/53836821/48237ea6-d749-4207-89a3-b0f146b0e544)


Release Notes:

- Fixed markdown preview not handling hard breaks (e.g. `\`) correctly
([#8990](https://github.com/zed-industries/zed/issues/8990)).
2024-03-25 10:06:00 +02:00
Max Brunsfeld
7367350f41 Use upstream cargo-about 2024-03-22 22:12:24 -07:00
Max Brunsfeld
478e2a29a5 Add license symlinks for svelta and uiua extensions 2024-03-22 21:56:47 -07:00
Max Brunsfeld
6ebe599c98 Fix issues with extension API that come up when moving Svelte into an extension (#9611)
We're doing it. Svelte support is moving into an extension. This PR
fixes some issues that came up along the way.

Notes

* extensions need to be able to retrieve the path the `node` binary
installed by Zed
* previously we were silently swallowing any errors that occurred while
loading a grammar
* npm commands run by extensions weren't run in the right directory
* Tree-sitter's WASM stdlib didn't support a C function (`strncmp`)
needed by the Svelte parser's external scanner
* the way that LSP installation status was reported was unnecessarily
complex

Release Notes:

- Removed built-in support for the Svelte and Gleam languages, because
full support for those languages is now available via extensions. These
extensions will be suggested for download when you open a `.svelte` or
`.gleam` file.

---------

Co-authored-by: Marshall <marshall@zed.dev>
2024-03-22 17:29:06 -07:00
Mikayla Maki
4459eacc98 Improve the clarity of multi buffer headers (#9722)
<img width="544" alt="Screenshot 2024-03-22 at 3 23 09 PM"
src="https://github.com/zed-industries/zed/assets/2280405/83fde9ad-76e1-4eed-a3f2-bc25d5a88d84">

Release Notes:


- Improved the clarity of the UI for diagnostic and search result
headers
2024-03-22 15:38:19 -07:00
Thorsten Ball
16a2013021 Update to vscode-eslint 2.4.4 & support flat config file extensions (#9708)
This upgrades to vscode-eslint 2.4.4 to support flat configs in
multiple configuration files, ending in `.js`, `.cjs`, or `.mjs`.

We changed the code to not use the GitHub release because we actually
don't need the artifacts of the release, we just need the source code,
which we compile anyway.

Fixes #7271.

Release Notes:

- Added support for ESLint flat config files.
([#7271](https://github.com/zed-industries/zed/issues/7271)).

Co-authored-by: Kristján Oddsson <koddsson@gmail.com>
2024-03-22 17:19:23 +01:00
Marshall Bowers
c6d479715d Add setting to allow disabling the Assistant (#9706)
This PR adds a new `assistant.enabled` setting that controls whether the
Zed Assistant is enabled.

Some users have requested the ability to disable the AI-related features
in Zed if they don't use them. Changing `assistant.enabled` to `false`
will hide the Assistant icon in the status bar (taking priority over the
`assistant.button` setting) as well as filter out the `assistant:`
actions.

The Assistant is enabled by default.

Release Notes:

- Added an `assistant.enabled` setting to control whether the Assistant
is enabled.
2024-03-22 11:55:29 -04:00
Piotr Osiewicz
4dc61f7ccd Extensions registering tasks (#9572)
This PR also introduces built-in tasks for Rust and Elixir. Note that
this is not a precedent for future PRs to include tasks for more
languages; we simply want to find the rough edges with tasks & language
integrations before proceeding to task contexts provided by extensions.

As is, we'll load tasks for all loaded languages, so in order to get
Elixir tasks, you have to open an Elixir buffer first. I think it sort
of makes sense (though it's not ideal): in the future, where extensions
provide their own tasks.json, we'd like to limit the number of tasks
surfaced to the user to keep them as relevant to the project at hand as
possible.

Release Notes:

- Added built-in tasks for Rust and Elixir files.
2024-03-22 16:18:33 +01:00
Conrad Irwin
cb4f868815 remoting (#9680)
This PR provides some of the plumbing needed for a "remote" zed
instance.

The way this will work is:
* From zed on your laptop you'll be able to manage a set of dev servers,
each of which is identified by a token.
* You'll run `zed --dev-server-token XXXX` to boot a remotable dev
server.
* From the zed on your laptop you'll be able to open directories and
work on the projects on the remote server (exactly like collaboration
works today).

For now all this PR does is provide the ability for a zed instance to
sign in
using a "dev server token". The next steps will be:
* Adding support to the collaboration protocol to instruct a dev server
to "open" a directory and share it into a channel.
* Adding UI to manage these servers and tokens (manually for now)

Related #5347

Release Notes:

- N/A

---------

Co-authored-by: Nathan <nathan@zed.dev>
2024-03-22 08:44:56 -06:00
Nathan Sobo
f56707e076 Assign OPENAI_API_KEY from a k8s secret in the collab deployment (#9703)
Merging this eagerly because it's just a configuration change, and I want to test it on staging.
2024-03-22 08:36:52 -06:00
Bennet Bo Fenner
ce57db497e chat panel: Fix tooltips not working for links (#9691)
Closes #9418 

Noticed a difference in the `cx.set_tooltip(...)` calls between `div`
and `InteractiveText`. `div` calls `cx.set_tooltip(...)` inside
`after_layout`, but `InteractiveText` was calling it inside `paint`. I
believe that since #9012 was merged, we need to call `cx.set_tooltip` inside
`after_layout`, as inserting it inside `paint` does not seem to be
supported anymore.

I moved the code for setting the tooltip to `after_layout` and hovering
over links inside the chat seems to bring up the tooltips again.

Before:

See https://github.com/zed-industries/zed/issues/9418#issue-2189398784

After:


![image](https://github.com/zed-industries/zed/assets/53836821/a623164c-1ce0-40d7-bc53-020f176fba4a)


Release Notes:

- Fixed tooltip not showing up when hovering over links inside the chat
panel ([#9418](https://github.com/zed-industries/zed/issues/9418))
2024-03-22 14:47:57 +01:00
Piotr Osiewicz
945d8c2112 Revert "Revert "chore: Bump Rust version to 1.77 (#9631)"" (#9672)
Reverts zed-industries/zed#9658, as the Docker image is now available.

Release notes:

- N/A
2024-03-22 11:17:16 +01:00
张小白
ae6f138b6c Fix compute_width_for_char (#9643)
Release Notes:

- N/A
2024-03-22 09:10:42 +01:00
Nathan Sobo
6d5787cfdc Hard code max token counts for supported models (#9675) 2024-03-21 20:30:33 -06:00
Daniel Zhu
441677c90a Fix typo (mimized -> minimized) (#9674) 2024-03-21 16:14:31 -07:00
apricotbucket28
95b2f4caf2 linux: fix word move/select shortcuts (#9673)
`alt+left/right` are never used on Linux.
The Linux keymap still has some other issues, but these shortcuts in
particular are really common when editing text.

Release Notes:

- N/A
2024-03-21 15:28:39 -07:00
Ezekiel Warren
0f15fd37d6 windows: User installed language server support (#9606)
Release Notes:

- N/A
2024-03-21 15:25:26 -07:00
张小白
4183805a39 windows: fix window activate action (#9664)
Now, the window activation event can be triggered correctly. As shown in
the video, when the window is activated, the caret blinks; when the
window loses activation due to me clicking on the PowerShell window, the
caret stops blinking.



https://github.com/zed-industries/zed/assets/14981363/4c1b2bec-319d-4f21-879e-5a0af0a00d8e



Release Notes:

- N/A
2024-03-21 15:24:17 -07:00
Conrad Irwin
eaa803298e Fix error handling in buffer open (#9667)
Co-Authored-By: Max <max@zed.dev>
Co-Authored-By: Marshall <marshall@zed.dev>

Release Notes:

- N/A

Co-authored-by: Max <max@zed.dev>
Co-authored-by: Marshall <marshall@zed.dev>
2024-03-21 15:57:20 -06:00
Conrad Irwin
caed275fbf Revert "language: Remove buffer fingerprinting (#9007)"
This reverts commit 6f2f61c9b1.
2024-03-21 14:10:18 -06:00
Remco Smits
35e3935e8f Fix invalid highlight position for (edited) text (#9660)
This pull request fixes a bug that was inserting the wrong position for
the `(edited)` text. This is only an issue when you have other styling
inside the markdown that causes the `rich_text.text.len()` to increase.

**Before**
<img width="234" alt="Screenshot 2024-03-21 at 19 53 34"
src="https://github.com/zed-industries/zed/assets/62463826/bf9e7fec-1563-415b-9b14-f7b95fc7d784">
**After**
<img width="234" alt="Screenshot 2024-03-21 at 19 53 48"
src="https://github.com/zed-industries/zed/assets/62463826/c0b7ba3a-5bdc-4b9e-a4f2-c3df4064f0ec">

Release Notes:

- N/A
2024-03-21 13:37:45 -06:00
Stanislav Alekseev
3b7cd9cf1e Remove incorrect venv base directory used (#9661)
Follow-up of https://github.com/zed-industries/zed/pull/8444

Release Notes:

- N/A
2024-03-21 21:34:14 +02:00
张小白
1e543b9755 windows: implement IME caret movement and editing while composing (#9659)
https://github.com/zed-industries/zed/assets/14981363/598440f7-0364-4053-9f44-710291f8aa92



Release Notes:

- N/A
2024-03-21 12:10:22 -07:00
Marshall Bowers
d557f8e36c Revert "chore: Bump Rust version to 1.77 (#9631)" (#9658)
This reverts commit 6184278faf.

We can't upgrade to Rust 1.77 until there are Rust 1.77 Docker images
available
(https://github.com/docker-library/official-images/pull/16457).


Release Notes:

- N/A
2024-03-21 14:07:22 -04:00
Ezekiel Warren
b6201a34b9 Check for user installed clangd (#9605)
Release Notes:

- Improved C/C++ support by using a user-installed clangd when available
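
A hedged sketch of the general idea (not Zed's actual lookup code; the function below is invented for illustration):

```rust
use std::{env, path::PathBuf};

// Prefer a clangd binary that is already on the user's PATH before falling
// back to a Zed-managed download.
fn find_user_installed(binary_name: &str) -> Option<PathBuf> {
    let path_var = env::var_os("PATH")?;
    env::split_paths(&path_var)
        .map(|dir| dir.join(binary_name))
        .find(|candidate| candidate.is_file())
}

fn main() {
    match find_user_installed("clangd") {
        Some(path) => println!("using user-installed clangd at {}", path.display()),
        None => println!("falling back to the bundled clangd"),
    }
}
```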
2024-03-21 10:50:42 -07:00
Stanislav Alekseev
85fdcef564 Do not enable venv in terminal for bash-like oneshot task invocations (#8444)
Release Notes:
- Work around #8334 by only activating venv in the terminal not in tasks
(see #8440 for a proper solution)
- To use venv modify your tasks in the following way:
```json
{
  "label": "Python main.py",
  "command": "sh",
  "args": ["-c", "source .venv/bin/activate && python3 main.py"]
}
```

---------

Co-authored-by: Kirill Bulatov <kirill@zed.dev>
2024-03-21 19:40:33 +02:00
Piotr Osiewicz
cd61297740 collab: Bump minimal client version to 0.127.3 (#9649)
Release Notes:

- N/A
2024-03-21 18:23:18 +01:00
Mikayla Maki
e07192e4e3 Implement is_minimized for macOS (#9651)
Release Notes:

- N/A
2024-03-21 09:36:54 -07:00
Marshall Bowers
adcb591629 extension_cli: Populate grammars from grammars directory for legacy extension formats (#9650)
This PR makes the extension CLI populate the grammars in the manifest
from the contents of the `grammars` directory for legacy extensions
using the `extension.json` format (`schema_version == 0`).

This allows us to continue packaging these older extensions until they
can be migrated to the new schema version.

Release Notes:

- N/A
2024-03-21 12:31:53 -04:00
张小白
d89905fc3d Fix IME window position with scale factor greater than 1.0 (#9637)
In #9456 I forgot to handle this...

Release Notes:

- N/A
2024-03-21 09:31:43 -07:00
白山風露
e1d1d575c3 Windows: Fix XButton direction (#9629)
Release Notes:

- N/A
2024-03-21 09:31:19 -07:00
张小白
3fd62a2313 windows: display icon (#9571)
Now `Zed` can display icons. The image below shows the icon of the
`zed.exe` file and the icon in the right-click properties.

![Screenshot 2024-03-20
181054](https://github.com/zed-industries/zed/assets/14981363/8f1ccc7f-aab0-46cf-8c32-a3545ba710a3)

I used the `crates\zed\resources\app-icon@2x.png` file to generate the
`.ico` file. Due to some blank space around the logo in the original
file, the logo appears slightly smaller on Windows compared to other
software.

![Screenshot 2024-03-20
181155](https://github.com/zed-industries/zed/assets/14981363/874c5ed3-6796-428c-9a91-f91231bb6510)

The current `.ico` file contains logo files of multiple sizes: 16x16,
24x24, 32x32, 48x48, 64x64, 96x96, 128x128, 256x256, 512x512.

Release Notes:

- N/A
2024-03-21 09:30:01 -07:00
白山風露
f179158913 windows: Avoid recording minimized position to database (#9407)
Minimizing the window records a strange position to the DB. When Zed is
restarted, the window ends up far away.
<img width="975" alt="image"
src="https://github.com/zed-industries/zed/assets/6465609/98dbab71-fdbb-4911-b41a-9c3e498e5478">

So I fixed it.

Release Notes:

- N/A
2024-03-21 09:15:16 -07:00
白山風露
f88f1bce20 Windows: Not logging frequent WM_PAINTs (#9566)
The logging of WM_PAINT for each frame was not very meaningful, so it
was eliminated.
Other logging levels were also reduced to trace.

Release Notes:

- N/A

Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2024-03-21 09:11:33 -07:00
Marshall Bowers
20b88b6078 Add a script for bumping the extension CLI (#9646)
This PR adds a script for bumping the extension CLI (thus kicking off a
new build).

Release Notes:

- N/A
2024-03-21 12:05:31 -04:00
Piotr Osiewicz
6f2f61c9b1 language: Remove buffer fingerprinting (#9007)
Followup to #9005 that actually removes buffer fingerprinting.

Release Notes:

- N/A
2024-03-21 17:03:26 +01:00
Marshall Bowers
1f21088591 Normalize - to _ in resulting Wasm file names (#9644)
This PR updates the extension builder to normalize `-` to `_` in the
Rust package names when computing the resulting `.wasm` file name.

The `wasm32-wasi` target does this normalization internally already, so
we need to do the same to ensure we're looking for the resulting `.wasm`
file in the right spot.
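
A minimal sketch of the normalization described above (illustrative only, not the builder's actual code):

```rust
// cargo's wasm32-wasi output uses underscores, so a package named "my-extension"
// produces "my_extension.wasm"; the extension builder needs to look for that name.
fn wasm_file_name(package_name: &str) -> String {
    format!("{}.wasm", package_name.replace('-', "_"))
}

fn main() {
    assert_eq!(wasm_file_name("my-extension"), "my_extension.wasm");
}
```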

Release Notes:

- N/A
2024-03-21 11:48:08 -04:00
Marshall Bowers
3831088251 extension_cli: Don't propagate errors caused by trying to read Cargo.toml (#9641)
This PR fixes an issue where the extension CLI would error if a
`Cargo.toml` didn't exist when we were trying to check for its
existence. Since we're just checking if it exists for the purposes of
detecting a Rust extension, we can safely ignore the errors.

Also improved the logging/error handling in a few spots to make other
errors easier to troubleshoot in the future.

Release Notes:

- N/A
2024-03-21 11:29:35 -04:00
Piotr Osiewicz
e20508f66c lsp: Add partial support for insert/replace completions (#9634)
Most notably, this should do away with completions overriding the whole
word around completion trigger text. Fixes: #4816
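
An illustrative, self-contained sketch of the insert-vs-replace distinction from the LSP spec (hand-rolled types and offsets, not Zed's or lsp-types' actual definitions):

```rust
use std::ops::Range;

// A completion can carry two ranges: "insert" covers only the typed prefix before
// the cursor, while "replace" also covers the rest of the word after the cursor.
struct InsertReplaceCompletion {
    new_text: &'static str,
    insert: Range<usize>,
    replace: Range<usize>,
}

fn apply(line: &str, range: &Range<usize>, new_text: &str) -> String {
    format!("{}{}{}", &line[..range.start], new_text, &line[range.end..])
}

fn main() {
    // The cursor sits after "pri" in the middle of the existing word "primary".
    let line = "let x = primary;";
    let completion = InsertReplaceCompletion {
        new_text: "println!",
        insert: 8..11,   // just "pri"
        replace: 8..15,  // the whole word "primary"
    };

    // Insert mode keeps the trailing "mary"; replace mode overrides the whole word.
    assert_eq!(apply(line, &completion.insert, completion.new_text), "let x = println!mary;");
    assert_eq!(apply(line, &completion.replace, completion.new_text), "let x = println!;");
}
```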



Release Notes:

- Fixed code completions overriding text around the cursor.
2024-03-21 16:19:21 +01:00
Piotr Osiewicz
6184278faf chore: Bump Rust version to 1.77 (#9631)
Release Notes:

- N/A
2024-03-21 15:42:59 +01:00
Remco Smits
65152baa3f Fix prettier-plugin-organize-imports plugin removes used imports (#9598)
### Issue
So this pull request fixes an issue that was driving me crazy. The issue
was that when you use the `prettier-plugin-organize-imports` plugin, it would
remove some imports that should not be removed because they are used
inside the module itself.

You can reproduce it with the following `prettierrc.json` config and
source code. When you **save** the file, it would remove the `import
clsx from "clsx";` import from the file.

**Prettier config**
```json
{
  "semi": true,
  "tabWidth": 4,
  "trailingComma": "es5",
  "useTabs": true,
  "plugins": [
    "prettier-plugin-tailwindcss",
    "prettier-plugin-organize-imports"
  ]
}
```

**Source code**
```typescript
import clsx from "clsx";

export default function Home() {
  return (
      <main>
	      {clsx("asdjklasdjlkasd", "asdjlkasjdjlk")}
      </main>
  );
}
``` 

### Findings
After a deep dive with @mrnugget, I was debugging deep down in the prettier
plugin system and found the cause of this issue. While looking inside
`node_modules/prettier-plugin-organize-imports/lib/organize.js`, I saw
the following code, which looked strange to me because it falls back to
`file.ts` if `filepath` is not passed through inside the prettier config
options.

<img width="860" alt="Screenshot 2024-03-20 at 21 31 46"
src="https://github.com/zed-industries/zed/assets/62463826/47177fe5-e5a9-41d8-9f2f-0304b2c2159f">

So the issue was small: if you look at the following code, the `path`
key should be `filepath` inside
`crates/prettier/src/prettier_server.js:205`
![Screenshot 2024-03-20 at 21 35
25](https://github.com/zed-industries/zed/assets/62463826/1eea0a88-c886-4632-9c69-9f3095126971)

Release Notes:

- Fixed prettier integration not using the correct filepath when
invoking prettier, which could lead to some prettier plugins failing to
format correctly.
([#9496](https://github.com/zed-industries/zed/issues/9496)).
2024-03-21 08:23:15 +01:00
Conrad Irwin
65c6bfebda get-preview-channel-changes errors on invalid token (#9616)
Release Notes:

- N/A
2024-03-20 21:44:12 -06:00
Conrad Irwin
ac4c6c60f1 Make it (a tiny bit) easier to run your own collab (#9557)
* Allow creating channels when seeding
* Allow configuring a custom `SEED_PATH`
* Seed the database when creating/migrating it so you don't need a
  separate step for this.

Release Notes:

- N/A
2024-03-20 21:00:02 -06:00
Conrad Irwin
1062c5bd26 Fix copilot modal (#9613)
Release Notes:

- Fixed copilot modal not responding
([#9596](https://github.com/zed-industries/zed/issues/9596)). (preview
only)
2024-03-20 20:37:40 -06:00
Mikayla Maki
0b019282c3 Wayland: double click (#9608)
This PR builds off of an earlier version of
https://github.com/zed-industries/zed/pull/9595, rearranges some of the
logic, and removes an unused platform API.

Release Notes:

- N/A

---------

Co-authored-by: apricotbucket28 <agustin.nicolas.marcos@outlook.com>
2024-03-20 19:22:47 -07:00
Ezekiel Warren
9b0949b6fb Allow specifying no base keymap (#9471)
This PR is a bit of a shot in the dark. I'm not sure if this will be
acceptable and I understand if it gets rejected.

I've been trying to integrate Zed as my daily driver and the key
bindings have been a major hurdle for me. Mostly due to the
windows/linux keybindings being messed up, but also me wanting to have
more chained key bindings similar to helix or common in custom neovim
configurations.

I think having a `None` base keymap would allow someone to more easily
implement a new base keymap (#4642) and would make my daily use of Zed a
little nicer 😅.

Also I am aware that there would need to be a little more work done in
this PR for the other base keymaps such as 'atom' since they assume the
'default' (vscode) base keymaps are loaded. I'm happy to do that work if
a 'none' base keymap is acceptable.

Release Notes:

- Added ability to specify no base keymap which allows for full
keybinding customization
2024-03-20 18:52:17 -06:00
Andrew Lygin
5602c48136 Action release handlers (#8782)
This PR adds support for handling action releases &mdash; events that
are fired when the user releases all the modifier keys that were part of
an action-triggering shortcut.

If the user holds modifiers and invokes several actions sequentially via
shortcuts (same or different), only the last action is "released" when
its modifier keys are released.

~The following methods were added to `Div`:~
- ~`capture_action_release()`~
- ~`on_action_release()`~
- ~`on_boxed_action_release()`~

~They work similarly to `capture_action()`, `on_action()` and
`on_boxed_action()`.~

See the implementation details in [this
comment](https://github.com/zed-industries/zed/pull/8782#issuecomment-2009154646).

Release Notes:

- Added a fast-switch mode to the file finder: hit `p` or `shift-p`
while holding down `cmd` to select a file immediately. (#8258).

Related Issues:

- Implements #8757 
- Implements #8258
- Part of #7653 

Co-authored-by: @ConradIrwin
2024-03-20 18:43:31 -06:00
Mikayla Maki
91ab95ec82 Fix bugs in linux text system (#9604)
Supersedes https://github.com/zed-industries/zed/pull/9579 by manually
including it.

Release Notes:

- N/A
2024-03-20 15:27:56 -07:00
Max Brunsfeld
585e8671e3 Add a schema to extensions, to prevent installing extensions on too old of a Zed version (#9599)
Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
2024-03-20 17:33:26 -04:00
白山風露
b1feeb9f29 Windows: Refactoring (#9580)
Aggregate `DefWindowProc` calls with individual handler return value as
`Option<isize>`

Release Notes:

- N/A
2024-03-20 13:36:28 -07:00
Mikayla Maki
78e116c111 Fix skip prompt warning (#9590)
This fixes a non-panicking log error caused by
https://github.com/zed-industries/zed/pull/9452

Release Notes:

- N/A
2024-03-20 13:35:29 -07:00
Max Brunsfeld
d699b8e104 Allow extensions to define more of the methods in the LspAdapter trait (#9554)
Our goal is to extract Svelte support into an extension, since we've
seen problems with the Tree-sitter Svelte parser crashing due to bugs in
the external scanner. In order to do this, we need a couple more
capabilities in LSP extensions:

* [x] `initialization_options` - programmatically controlling the JSON
initialization params sent to the language server
* [x] `prettier_plugins` - statically specifying a list of prettier
plugins that apply for a given language.
* [x] `npm_install_package`

Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-03-20 12:47:04 -07:00
Jason Lee
0ce5cdc48f Only allow opening one Extensions view (#9569)
Release Notes:

- Changed the extensions view to only allow one open instance at a time.


## Before

<img width="494" alt="image"
src="https://github.com/zed-industries/zed/assets/5518/9329e685-1946-4384-bec3-f7eadf18a0cc">
2024-03-20 14:49:36 -04:00
Thorsten Ball
3853991c20 Format prettier_server.js (#9583)
This PURELY formats the file by opening it in Zed and hitting save with
save-on-format on.

It's been bugging me that I can't change the file without the whole
thing getting reformatted, so here we are.

Release Notes:

- N/A
2024-03-20 19:49:14 +01:00
Marshall Bowers
3a2eb12f68 Fix binary name for extension CLI (#9591)
This PR fixes the binary name for the extension CLI.

This was originally done in #9541, but got accidentally reverted in
#9549.

Release Notes:

- N/A
2024-03-20 14:16:23 -04:00
Mikayla
59bc81d1bc v0.129.x dev 2024-03-20 09:16:41 -07:00
Anthony Eid
88857f8149 VS Code -> Zed tasks converter (#9538)
We can convert shell, npm and gulp tasks to a Zed format. Additionally, we convert a subset of task variables that VsCode supports.

Release notes:

- Zed can now load tasks in Visual Studio Code task format

---------

Co-authored-by: Piotr Osiewicz <piotr@zed.dev>
Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2024-03-20 16:37:26 +01:00
Jason Lee
269d2513ca Add support for applying theme after extension is installed (#9529)
Release Notes:

- Added support for opening the theme selector with installed themes
after installing an extension containing themes.
([#9228](https://github.com/zed-industries/zed/issues/9228)).

<img width="1315" alt="Screenshot 2024-03-20 at 11 00 35 AM"
src="https://github.com/zed-industries/zed/assets/1486634/593389b3-eade-4bce-ae17-25c02a074f21">

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-03-20 11:13:58 -04:00
Marshall Bowers
6cec389125 ui: Make top_padding an associated function on the TitleBar (#9577)
This PR makes the function for computing the top padding for the
`TitleBar` an associated function.

Release Notes:

- N/A
2024-03-20 10:55:09 -04:00
Antonio Scandurra
9ab7a22fa8 Fix licensing errors 2024-03-20 15:52:02 +01:00
Antonio Scandurra
f2394c76f5 Fix licensing 2024-03-20 13:03:13 +01:00
Tim Masliuchenko
7855b9e9a8 Allow to handle autoclosed characters differently (#8666)
Adds the `always_treat_brackets_as_autoclosed` setting to control how
the autoclosed characters are handled.

The setting is off by default, meaning the behaviour stays the same
(following how VSCode handles autoclosed characters).
When set to `true`, the autoclosed characters are always skipped over
and auto-removed no matter how they were inserted (following how Sublime
Text/Xcode handle this).


https://github.com/zed-industries/zed/assets/471335/304cd04a-59fe-450f-9c65-cc31b781b0db


https://github.com/zed-industries/zed/assets/471335/0f5b09c2-260f-48d4-8528-23f122dee45f

Release Notes:

- Added the setting `always_treat_brackets_as_autoclosed` (default:
`false`) to always treat brackets as "auto-closed" brackets, i.e.
deleting the pair when deleting start/end, etc.
([#7146](https://github.com/zed-industries/zed/issues/7146)).

---------

Co-authored-by: Thorsten Ball <mrnugget@gmail.com>
2024-03-20 09:35:42 +01:00
Ezekiel Warren
d5e0817fbc windows: Fix title bar height when maximized (#9449)
screenshots and description incoming

## title bar when window is maximized
| before | after |
| ---    | ---   |
|
![image](https://github.com/zed-industries/zed/assets/1284289/075a943d-54db-4b71-9fa0-15f823255182)
|
![image](https://github.com/zed-industries/zed/assets/1284289/39a1d381-fcfd-4651-aab4-231a8ec3bd99)
|

## ~~caption buttons at 200%~~
~~buttons are now properly responsive at different scales~~
~~closes #9438~~
~~proper scale factor handling in follow up PR (possibly #9440)~~

<details>
  <summary>out of date image</summary>


![scale-factor](https://github.com/zed-industries/zed/assets/1284289/299d37b8-0d2e-4f2e-81db-2fff6fc59a62)
</details>

should be fixed by https://github.com/zed-industries/zed/pull/9456


Release Notes:

- N/A
2024-03-19 20:54:00 -07:00
Remco Smits
3dadfe4787 Channel chat: Add edit message (#9035)
**Summary**:
- Removed reply message from message_menu
- Made render_popover_buttons a bit more reusable
- Fixed issue that you can't close the reply/edit preview when you are
not focusing the message editor
- Notify only the new people that were mentioned inside the edited
message

**Follow up**
- Fix that we update the notification message for the people that we
mentioned already
- Fix that we remove the notification when a message gets deleted.
  - Fix that the last acknowledged message id is incorrect now

**Todo**:
- [x] Add tests
- [x] Change new added bindings to the `Editor::Cancel` event.

Release Notes:

- Added editing of chat messages
([#6707](https://github.com/zed-industries/zed/issues/6707)).

<img width="239" alt="Screenshot 2024-03-09 at 11 55 23"
src="https://github.com/zed-industries/zed/assets/62463826/b0949f0d-0f8b-43e1-ac20-4c6d40ac41e1">
<img width="240" alt="Screenshot 2024-03-13 at 13 34 23"
src="https://github.com/zed-industries/zed/assets/62463826/d0636da2-c5aa-4fed-858e-4bebe5695ba7">

---------

Co-authored-by: Bennet Bo Fenner <53836821+bennetbo@users.noreply.github.com>
Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2024-03-19 19:49:04 -06:00
Conrad Irwin
5139aa3811 Fix merge conflict in collab (#9550)
Release Notes:

- N/A
2024-03-19 16:02:33 -06:00
Max Brunsfeld
86a86a9635 Fix incorrect git ref check in publish extension cli workflow
Co-authored-by: Marshall <marshall@zed.dev>
2024-03-19 14:51:46 -07:00
Max Brunsfeld
fd11bd68f2 Perform extension packaging in extension-cli (#9549)
Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
2024-03-19 17:26:06 -04:00
张小白
85c294da9a windows: Implement app_version (#9410)
#### Call `app_version`:

![Screenshot 2024-03-16
011821](https://github.com/zed-industries/zed/assets/14981363/9e618e49-fee2-4e7a-b884-6b0be05a0c95)

#### `Zed.exe` info:

![Screenshot 2024-03-16
011856](https://github.com/zed-industries/zed/assets/14981363/2b17a5df-ad38-42d0-8396-53680d77101d)


Release Notes:

- N/A
2024-03-19 12:40:57 -07:00
Luke Jones
cfa0fc96f0 wayland: change some borrow_mut to borrow, reduce borrow scopes, fix two crashes (#9306)
Release Notes:

- N/A
2024-03-19 12:40:09 -07:00
张小白
086f4e63c5 windows: Properly handle DPI (#9456)
As I mentioned before, there are the following issues with how GPUI
handles scale factors greater than 1.0:
1. The title bar buttons do not function correctly, with the minimize
button performing maximization and the maximize button closing the window.
2. As discussed in #8809, setting a scale factor greater than 1.0 causes
GPUI's drawing content to be pushed off the screen.

This PR introduces `LogicalSize` and `PhysicalSize` to differentiate
between coordinate systems for proper GPUI rendering, and now scale
factors above 1.5 are working correctly.
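
An illustrative sketch of the distinction (the types below are hypothetical stand-ins, not GPUI's actual definitions): physical pixels are logical pixels multiplied by the scale factor.

```rust
#[derive(Debug, Clone, Copy)]
struct LogicalSize { width: f32, height: f32 }

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct PhysicalSize { width: u32, height: u32 }

impl LogicalSize {
    // Rendering and hit-testing need physical pixels; window-management logic is
    // easier to reason about in logical pixels, hence the two coordinate spaces.
    fn to_physical(self, scale_factor: f32) -> PhysicalSize {
        PhysicalSize {
            width: (self.width * scale_factor).round() as u32,
            height: (self.height * scale_factor).round() as u32,
        }
    }
}

fn main() {
    let logical = LogicalSize { width: 800.0, height: 600.0 };
    assert_eq!(logical.to_physical(1.5), PhysicalSize { width: 1200, height: 900 });
}
```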

`Zed` with a scale factor of 1.5, and switching between different scale
factors:



https://github.com/zed-industries/zed/assets/14981363/3348536d-8bd3-41dd-82f6-052723312a5b



Release Notes:

- N/A
2024-03-19 12:39:36 -07:00
apricotbucket28
2c36652be2 wayland: fix handling of non-discrete scroll events (#9548)
https://github.com/zed-industries/zed/pull/9103 broke touchpad scrolling
on Wayland.
This PR correctly filters the `Axis` to handle all non-discrete scroll
events (see
https://wayland.app/protocols/wayland#wl_pointer:enum:axis_source)

Should fix https://github.com/zed-industries/zed/issues/9525

Release Notes:

- N/A
2024-03-19 12:25:10 -07:00
Nathan Sobo
8ae5a3b61a Allow AI interactions to be proxied through Zed's server so you don't need an API key (#7367)
Co-authored-by: Antonio <antonio@zed.dev>

Resurrected this from some assistant work I did in Spring of 2023.
- [x] Resurrect streaming responses
- [x] Use streaming responses to enable AI via Zed's servers by default
(but preserve API key option for now)
- [x] Simplify protobuf
- [x] Proxy to OpenAI on zed.dev
- [x] Proxy to Gemini on zed.dev
- [x] Improve UX for switching between openAI and google models
- We current disallow cycling when setting a custom model, but we need a
better solution to keep OpenAI models available while testing the google
ones
- [x] Show remaining tokens correctly for Google models
- [x] Remove semantic index
- [x] Delete `ai` crate
- [x] Cloud front so we can ban abuse
- [x] Rate-limiting
- [x] Fix panic when using inline assistant
- [x] Double check the upgraded `AssistantSettings` are
backwards-compatible
- [x] Add hosted LLM interaction behind a `language-models` feature
flag.

Release Notes:

- We are temporarily removing the semantic index in order to redesign it
from scratch.

---------

Co-authored-by: Antonio <antonio@zed.dev>
Co-authored-by: Antonio Scandurra <me@as-cii.com>
Co-authored-by: Thorsten <thorsten@zed.dev>
Co-authored-by: Max <max@zed.dev>
2024-03-19 19:22:26 +01:00
Marshall Bowers
905a24079a Add GitHub Action for publishing the extension CLI (#9542)
This PR adds a GitHub Action for publishing the extension CLI.

When the `extension-cli` tag is pushed, this Action will run, build the
`zed-extension` binary, and upload it to DigitalOcean for consumption.

This will allow us to consume the pre-built binary in the CI for the
extensions repo.

Release Notes:

- N/A

---------

Co-authored-by: Max <max@zed.dev>
2024-03-19 14:19:32 -04:00
Antonio Scandurra
2ea333fff6 Insert hitbox if div contains tooltip (#9545)
Release Notes:

- Fixed a bug that would cause certain tooltips to not show up
(preview-only).
2024-03-19 19:02:59 +01:00
Ezekiel Warren
ac6c4f3ca8 windows: fix 'space' keystroke keydown event (#9476)
The space key was being reported as the key " ", which didn't allow it to be
used in keybindings.

Release Notes:

- N/A
2024-03-19 10:35:28 -07:00
Mikayla Maki
fd0071f2af Add an animation to the LSP checking indicator (#9463)
Spinner go spinny.

Extra thanks to @kvark for helping me with the shaders.



https://github.com/zed-industries/zed/assets/2280405/9d5f4f4e-0d43-44d2-a089-5d69939938e9


Release Notes:

- Added a spinning animation to the LSP checking indicator

---------

Co-authored-by: Dzmitry Malyshau <kvark@fastmail.com>
2024-03-19 10:16:18 -07:00
Kyle Kelley
56bd96bc64 Image viewer (#9425)
This builds on #9353 by adding an image viewer to Zed. Closes #5251.

Release Notes:

- Added support for rendering image files
([#5251](https://github.com/zed-industries/zed/issues/5251)).

<img width="1840" alt="image"
src="https://github.com/zed-industries/zed/assets/836375/3bccfa8e-aa5c-421f-9dfa-671caa274c3c">

---------

Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2024-03-19 10:13:10 -07:00
Marshall Bowers
8d515a620f Update binary name for extension CLI (#9541)
This PR updates the binary name used by the extension CLI from
`extension_cli` to `zed-extension`.

Release Notes:

- N/A
2024-03-19 12:36:53 -04:00
Conrad Irwin
d6b7f14b51 suggested extensions (#9526)
Follow-up from #9138

Release Notes:

- Adds suggested extensions for some filetypes
([#7096](https://github.com/zed-industries/zed/issues/7096)).

---------

Co-authored-by: Felix Zeller <felixazeller@gmail.com>
2024-03-19 10:06:01 -06:00
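As a rough illustration of the feature (not the PR's actual data or API), suggesting an extension for a filetype can be as simple as a lookup from the file's extension to the ID of an extension in the registry; the pairs below are hypothetical examples drawn from extensions that exist in this repository.

```rust
/// Hypothetical mapping; the real suggestions live in the PR, not here.
fn suggested_extension_for(file_extension: &str) -> Option<&'static str> {
    match file_extension {
        "gleam" => Some("gleam"),
        "prisma" => Some("prisma"),
        "purs" => Some("purescript"),
        "astro" => Some("astro"),
        _ => None,
    }
}
```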
Niklas Wimmer
7573f35e8e Simplify and document parts of linux text system code (#9443)
I mainly focused on improving the `font_id` function; see the
description of e286483262 for more details. The rest are drive-by
changes I could not resist making.

If I am right about af4d6c43ce, someone
with a Mac could make the same change there as well.

This PR is probably best reviewed commit by commit :)

cc @gabydd @h3mosphere

Release Notes:

- N/A

---------

Signed-off-by: Niklas Wimmer <mail@nwimmer.me>
2024-03-19 08:27:48 -07:00
白山風露
250528d28f Windows: Auto close HANDLE (#9429)
`HANDLE` is wrapped in a RAII struct.

Release Notes:

- N/A

Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2024-03-19 08:21:01 -07:00
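A minimal sketch of the RAII pattern the commit describes, assuming the `windows` crate the workspace already depends on; the struct name is made up, and the exact `CloseHandle` signature varies between crate versions.

```rust
use windows::Win32::Foundation::{CloseHandle, HANDLE};

/// Owns a Win32 HANDLE and closes it when dropped, so call sites can no
/// longer forget (or double) a CloseHandle call.
struct OwnedHandle(HANDLE);

impl Drop for OwnedHandle {
    fn drop(&mut self) {
        if !self.0.is_invalid() {
            // Nothing useful to do if closing fails, so ignore the result.
            unsafe {
                let _ = CloseHandle(self.0);
            }
        }
    }
}
```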
Max Brunsfeld
868616d62e Introduce extension-cli binary, for packaging extensions in CI (#9523)
This will be used in the
[extensions](https://github.com/zed-industries/extensions) repository
for packaging the extensions that users submit.

Release Notes:

- N/A

---------

Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
2024-03-19 10:50:21 -04:00
Ben Hamment
7ef6600cdd Improve Ruby language to recognize Guardfiles (#9530)
Release Notes:

- Improved Ruby language to recognize Guardfile.
2024-03-19 10:44:32 -04:00
Tobias Decking
97fbec9b33 Clean up the live_kit_client manifest file (#9532)
Removes some redundant dependency definitions and updates one
dependency.
2024-03-19 14:31:47 +01:00
Piotr Osiewicz
080e25dd45 chore: Merge zed lib with zed binary.
TL;DR:
- shaves off about 0.5 seconds from most of our debug builds.
- It would've slightly regressed release builds by preventing build pipelining, but as a tradeoff I've bumped up codegen-units for zed.

# What did you come up with this time, Piotr?
In our zed repository I've noticed that merely *loading dependencies* in each crate takes a non-trivial amount of time (~800ms in the case of editor).
That is to say, the moment you `use editor`, your build time increases by 800ms. This happens just once per crate though, as it looks like the compiler has to load the .rlibs of all of the referenced dependencies.
This is visible under rustc's self-profile. Repro steps on twitter: https://twitter.com/PiotrOsiewicz/status/1762845413072101567

# How does this commit alleviate this?
zed lib + zed bin are on the critical path of every build and cumulatively take about 3s to build. This commit bundles all of this up into ~2.2s of bin build time instead.

# Wait, splitting binary targets is good, no?
Splitting up a binary target into lib + bin is generally considered to be a good practice, as you can then reuse the lib part elsewhere if needed.
It also allows the build to kick off the moment metadata for all of the dependencies is available (thus, you don't need to wait for codegen).

However, we do not really use zed as a lib, so the first benefit is not really a thing for us.
The latter *is* indeed something we lose out on in release mode (in dev, the codegen phase of leaf-ish crates is insignificant, as we use shared generics, so we don't spend much time on codegen).
That's why I've bumped codegen units for the zed crate to 16 in release mode to keep build times intact.
2024-03-19 10:54:36 +01:00
Hans
79a424f28f Add Vue language server auto update (#9474)
For #9401

---------

Co-authored-by: Joseph T. Lyons <JosephTLyons@gmail.com>
2024-03-19 09:58:07 +01:00
Andrew Lygin
192cd5f2d2 Fix file git status refresh on .gitignore update (#9466)
This PR fixes file name coloring in the project panel and tabs when the
.gitignore file is updated. It's intended to fix #7831.

There's another, less visible, problem with git-aware label coloring:
files that are both ignored and contained in the git index.
I'll file a separate issue for it to keep this fix focused.

Release Notes:

- Fixed file Git status refreshing on .gitignore update (#7831).
2024-03-18 20:35:38 -06:00
Tobias Decking
1e1fb21c81 Merge prost dependencies (#9522)
This patch puts the prost, prost-build, and prost-types dependencies
together and unifies their version. This improves organization a bit in
addition to improving build time slightly, since a redundant version of
prost is now removed.

The dependencies are _not_ updated to the newest versions, because the
newest versions add a dependency on the `protoc` application, which is
not provided by cargo and thus breaks the building process.
2024-03-18 20:33:20 -06:00
Conrad Irwin
0c82585ea2 Bump collab min version (#9521)
We made a change last week to allow creating files with names. This
means some files have null saved_mtime, which old versions of zed panic
on.

A fix is available in 0.126.3 and above

Release Notes:


- N/A
2024-03-18 16:59:51 -06:00
apricotbucket28
d8e32c3e3c linux: scrolling improvements (#9103)
This PR adjusts scrolling to be a lot faster on Linux and also makes
terminal scrolling work.

For Wayland, it makes scrolling faster by handling the `AxisValue120`
event (which also allows high-resolution scrolling on supported mice).
On X11, the scroll step was changed from 1 line per tick to 3.

### Different solutions

I tried replicating Chromium's scrolling behaviour, but it was
inconsistent between X11 and Wayland, and I found it too fast on Wayland.
It also didn't match VS Code, which seems to do something different.

Release Notes:

- Made scrolling faster on Linux
- Made terminal scrolling work on Linux
2024-03-18 14:50:29 -07:00
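For reference, a hedged sketch of the Wayland arithmetic the commit relies on: `AxisValue120` reports scroll deltas in 1/120ths of a wheel detent, so dividing by 120 and scaling by a lines-per-detent factor gives smooth line deltas, while the X11 path simply counts whole detents. The constant and function names are illustrative; the value 3 matches the X11 change described above.

```rust
/// Lines to scroll per full wheel detent (the X11 value mentioned above).
const LINES_PER_DETENT: f32 = 3.0;

/// Wayland: `wl_pointer.axis_value120` deltas arrive in 1/120ths of a detent,
/// including fractional steps from high-resolution wheels.
fn lines_from_axis_value120(value120: i32) -> f32 {
    value120 as f32 / 120.0 * LINES_PER_DETENT
}

/// X11: a classic button-4/5 scroll event counts as one whole detent.
fn lines_from_x11_clicks(clicks: i32) -> f32 {
    clicks as f32 * LINES_PER_DETENT
}
```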
白山風露
c0f8581b29 Windows: implement symlink (#9508)
Since Windows has a distinction between symlinks for directories and
symlinks for files, the implementation is adapted to this distinction.

Release Notes:

- N/A
2024-03-18 14:27:39 -07:00
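A sketch of the distinction mentioned above, using only the standard library: Windows exposes separate calls for file and directory symlinks, so the implementation has to check what the target is before choosing one. This is a simplified stand-in for the PR's code, not a copy of it.

```rust
#[cfg(windows)]
fn create_symlink(original: &std::path::Path, link: &std::path::Path) -> std::io::Result<()> {
    use std::os::windows::fs::{symlink_dir, symlink_file};

    // Directory targets need symlink_dir; everything else gets symlink_file.
    if original.is_dir() {
        symlink_dir(original, link)
    } else {
        symlink_file(original, link)
    }
}
```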
Conrad Irwin
cd9b865e0a maintain channel subscriptions in RAM (#9512)
This avoids a giant database query on every leave/join event.

Release Notes:

- N/A
2024-03-18 15:14:16 -06:00
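A minimal sketch (not collab's real types) of what keeping subscriptions in RAM means here: membership lives in a map keyed by channel, so join/leave events update and read that map instead of re-running a large database query each time. `ChannelId` and `UserId` are stand-ins for the real identifiers.

```rust
use std::collections::{HashMap, HashSet};

type ChannelId = u64;
type UserId = u64;

#[derive(Default)]
struct ChannelSubscriptions {
    by_channel: HashMap<ChannelId, HashSet<UserId>>,
}

impl ChannelSubscriptions {
    fn joined(&mut self, channel: ChannelId, user: UserId) {
        self.by_channel.entry(channel).or_default().insert(user);
    }

    fn left(&mut self, channel: ChannelId, user: UserId) {
        if let Some(users) = self.by_channel.get_mut(&channel) {
            users.remove(&user);
            if users.is_empty() {
                self.by_channel.remove(&channel);
            }
        }
    }

    fn subscribers(&self, channel: ChannelId) -> impl Iterator<Item = &UserId> + '_ {
        self.by_channel.get(&channel).into_iter().flatten()
    }
}
```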
411 changed files with 16761 additions and 15521 deletions

View File

@@ -0,0 +1,40 @@
name: Publish zed-extension CLI
on:
push:
tags:
- extension-cli
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
jobs:
publish:
name: Publish zed-extension CLI
runs-on:
- ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v4
with:
clean: false
submodules: "recursive"
- name: Cache dependencies
uses: swatinem/rust-cache@v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
- name: Configure linux
shell: bash -euxo pipefail {0}
run: script/linux
- name: Build extension CLI
run: cargo build --release --package extension_cli
- name: Upload binary
env:
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
run: script/upload-extension-cli ${{ github.sha }}

View File

@@ -15,6 +15,10 @@
"JSON": {
"tab_size": 2,
"formatter": "prettier"
},
"JavaScript": {
"tab_size": 2,
"formatter": "prettier"
}
},
"formatter": "auto"

0
.zed/tasks.json Normal file
View File

896
Cargo.lock generated

File diff suppressed because it is too large

View File

@@ -1,7 +1,6 @@
[workspace]
members = [
"crates/activity_indicator",
"crates/ai",
"crates/assets",
"crates/assistant",
"crates/audio",
@@ -24,6 +23,7 @@ members = [
"crates/editor",
"crates/extension",
"crates/extension_api",
"crates/extension_cli",
"crates/extensions_ui",
"crates/feature_flags",
"crates/feedback",
@@ -33,8 +33,10 @@ members = [
"crates/fuzzy",
"crates/git",
"crates/go_to_line",
"crates/google_ai",
"crates/gpui",
"crates/gpui_macros",
"crates/image_viewer",
"crates/install_cli",
"crates/journal",
"crates/language",
@@ -50,6 +52,7 @@ members = [
"crates/multi_buffer",
"crates/node_runtime",
"crates/notifications",
"crates/open_ai",
"crates/outline",
"crates/picker",
"crates/prettier",
@@ -67,7 +70,6 @@ members = [
"crates/task",
"crates/tasks_ui",
"crates/search",
"crates/semantic_index",
"crates/settings",
"crates/snippet",
"crates/sqlez",
@@ -93,7 +95,12 @@ members = [
"crates/zed",
"crates/zed_actions",
"extensions/astro",
"extensions/gleam",
"extensions/haskell",
"extensions/prisma",
"extensions/purescript",
"extensions/svelte",
"extensions/uiua",
"tooling/xtask",
@@ -136,9 +143,11 @@ fsevent = { path = "crates/fsevent" }
fuzzy = { path = "crates/fuzzy" }
git = { path = "crates/git" }
go_to_line = { path = "crates/go_to_line" }
google_ai = { path = "crates/google_ai" }
gpui = { path = "crates/gpui" }
gpui_macros = { path = "crates/gpui_macros" }
install_cli = { path = "crates/install_cli" }
image_viewer = { path = "crates/image_viewer" }
journal = { path = "crates/journal" }
language = { path = "crates/language" }
language_selector = { path = "crates/language_selector" }
@@ -153,6 +162,7 @@ menu = { path = "crates/menu" }
multi_buffer = { path = "crates/multi_buffer" }
node_runtime = { path = "crates/node_runtime" }
notifications = { path = "crates/notifications" }
open_ai = { path = "crates/open_ai" }
outline = { path = "crates/outline" }
picker = { path = "crates/picker" }
plugin = { path = "crates/plugin" }
@@ -171,7 +181,6 @@ rpc = { path = "crates/rpc" }
task = { path = "crates/task" }
tasks_ui = { path = "crates/tasks_ui" }
search = { path = "crates/search" }
semantic_index = { path = "crates/semantic_index" }
settings = { path = "crates/settings" }
snippet = { path = "crates/snippet" }
sqlez = { path = "crates/sqlez" }
@@ -206,7 +215,7 @@ bitflags = "2.4.2"
blade-graphics = { git = "https://github.com/kvark/blade", rev = "61cbd6b2c224791d52b150fe535cee665cc91bb2" }
blade-macros = { git = "https://github.com/kvark/blade", rev = "61cbd6b2c224791d52b150fe535cee665cc91bb2" }
blade-rwh = { package = "raw-window-handle", version = "0.5" }
cap-std = "2.0"
cap-std = "3.0"
chrono = { version = "0.4", features = ["serde"] }
clap = { version = "4.4", features = ["derive"] }
clickhouse = { version = "0.11.6" }
@@ -238,7 +247,9 @@ parking_lot = "0.12.1"
profiling = "1"
postage = { version = "0.5", features = ["futures-traits"] }
pretty_assertions = "1.3.0"
prost = "0.8"
prost = "0.9"
prost-build = "0.9"
prost-types = "0.9"
pulldown-cmark = { version = "0.10.0", default-features = false }
rand = "0.8.5"
refineable = { path = "./crates/refineable" }
@@ -271,27 +282,24 @@ time = { version = "0.3", features = [
"formatting",
] }
toml = "0.8"
tokio = { version = "1", features = ["full"] }
tower-http = "0.4.4"
tree-sitter = { version = "0.20", features = ["wasm"] }
tree-sitter-astro = { git = "https://github.com/virchau13/tree-sitter-astro.git", rev = "e924787e12e8a03194f36a113290ac11d6dc10f3" }
tree-sitter-bash = { git = "https://github.com/tree-sitter/tree-sitter-bash", rev = "7331995b19b8f8aba2d5e26deb51d2195c18bc94" }
tree-sitter-c = "0.20.1"
tree-sitter-clojure = { git = "https://github.com/prcastro/tree-sitter-clojure", branch = "update-ts" }
tree-sitter-c-sharp = { git = "https://github.com/tree-sitter/tree-sitter-c-sharp", rev = "dd5e59721a5f8dae34604060833902b882023aaf" }
tree-sitter-cpp = { git = "https://github.com/tree-sitter/tree-sitter-cpp", rev = "f44509141e7e483323d2ec178f2d2e6c0fc041c1" }
tree-sitter-css = { git = "https://github.com/tree-sitter/tree-sitter-css", rev = "769203d0f9abe1a9a691ac2b9fe4bb4397a73c51" }
tree-sitter-dockerfile = { git = "https://github.com/camdencheek/tree-sitter-dockerfile", rev = "33e22c33bcdbfc33d42806ee84cfd0b1248cc392" }
tree-sitter-dart = { git = "https://github.com/agent3bood/tree-sitter-dart", rev = "48934e3bf757a9b78f17bdfaa3e2b4284656fdc7" }
tree-sitter-elixir = { git = "https://github.com/elixir-lang/tree-sitter-elixir", rev = "a2861e88a730287a60c11ea9299c033c7d076e30" }
tree-sitter-elm = { git = "https://github.com/elm-tooling/tree-sitter-elm", rev = "692c50c0b961364c40299e73c1306aecb5d20f40" }
tree-sitter-embedded-template = "0.20.0"
tree-sitter-erlang = "0.4.0"
tree-sitter-gleam = { git = "https://github.com/gleam-lang/tree-sitter-gleam", rev = "58b7cac8fc14c92b0677c542610d8738c373fa81" }
tree-sitter-glsl = { git = "https://github.com/theHamsta/tree-sitter-glsl", rev = "2a56fb7bc8bb03a1892b4741279dd0a8758b7fb3" }
tree-sitter-go = { git = "https://github.com/tree-sitter/tree-sitter-go", rev = "aeb2f33b366fd78d5789ff104956ce23508b85db" }
tree-sitter-gomod = { git = "https://github.com/camdencheek/tree-sitter-go-mod" }
tree-sitter-gowork = { git = "https://github.com/d1y/tree-sitter-go-work" }
tree-sitter-haskell = { git = "https://github.com/tree-sitter/tree-sitter-haskell", rev = "8a99848fc734f9c4ea523b3f2a07df133cbbcec2" }
tree-sitter-hcl = { git = "https://github.com/MichaHoffmann/tree-sitter-hcl", rev = "v1.1.0" }
rustc-demangle = "0.1.23"
tree-sitter-heex = { git = "https://github.com/phoenixframework/tree-sitter-heex", rev = "2e1348c3cf2c9323e87c2744796cf3f3868aa82a" }
@@ -304,16 +312,13 @@ tree-sitter-nix = { git = "https://github.com/nix-community/tree-sitter-nix", re
tree-sitter-nu = { git = "https://github.com/nushell/tree-sitter-nu", rev = "7dd29f9616822e5fc259f5b4ae6c4ded9a71a132" }
tree-sitter-ocaml = { git = "https://github.com/tree-sitter/tree-sitter-ocaml", rev = "4abfdc1c7af2c6c77a370aee974627be1c285b3b" }
tree-sitter-php = "0.21.1"
tree-sitter-prisma-io = { git = "https://github.com/victorhqc/tree-sitter-prisma" }
tree-sitter-proto = { git = "https://github.com/rewinfrey/tree-sitter-proto", rev = "36d54f288aee112f13a67b550ad32634d0c2cb52" }
tree-sitter-purescript = { git = "https://github.com/postsolar/tree-sitter-purescript", rev = "v0.1.0" }
tree-sitter-python = "0.20.2"
tree-sitter-racket = { git = "https://github.com/zed-industries/tree-sitter-racket", rev = "eb010cf2c674c6fd9a6316a84e28ef90190fe51a" }
tree-sitter-regex = "0.20.0"
tree-sitter-ruby = "0.20.0"
tree-sitter-rust = "0.20.3"
tree-sitter-scheme = { git = "https://github.com/6cdh/tree-sitter-scheme", rev = "af0fd1fa452cb2562dc7b5c8a8c55551c39273b9" }
tree-sitter-svelte = { git = "https://github.com/Himujjal/tree-sitter-svelte", rev = "bd60db7d3d06f89b6ec3b287c9a6e9190b5564bd" }
tree-sitter-toml = { git = "https://github.com/tree-sitter/tree-sitter-toml", rev = "342d9be207c2dba869b9967124c679b5e6fd0ebe" }
tree-sitter-typescript = { git = "https://github.com/tree-sitter/tree-sitter-typescript", rev = "5d20856f34315b068c41edaee2ac8a100081d259" }
tree-sitter-vue = { git = "https://github.com/zed-industries/tree-sitter-vue", rev = "6608d9d60c386f19d80af7d8132322fa11199c42" }
@@ -323,18 +328,18 @@ unindent = "0.1.7"
unicase = "2.6"
url = "2.2"
uuid = { version = "1.1.2", features = ["v4"] }
wasmparser = "0.121"
wasm-encoder = "0.41"
wasmtime = { version = "18.0", default-features = false, features = [
wasmparser = "0.201"
wasm-encoder = "0.201"
wasmtime = { version = "19.0.0", default-features = false, features = [
"async",
"demangle",
"runtime",
"cranelift",
"component-model",
] }
wasmtime-wasi = "18.0"
wasmtime-wasi = "19.0.0"
which = "6.0.0"
wit-component = "0.20"
wit-component = "0.201"
sys-locale = "0.3.1"
[workspace.dependencies.windows]
@@ -349,6 +354,7 @@ features = [
"Win32_Security",
"Win32_Security_Credentials",
"Win32_Storage_FileSystem",
"Win32_System_LibraryLoader",
"Win32_System_Com",
"Win32_System_Com_StructuredStorage",
"Win32_System_DataExchange",
@@ -367,7 +373,7 @@ features = [
]
[patch.crates-io]
tree-sitter = { git = "https://github.com/tree-sitter/tree-sitter", rev = "4294e59279205f503eb14348dd5128bd5910c8fb" }
tree-sitter = { git = "https://github.com/tree-sitter/tree-sitter", rev = "7f21c3b98c0749ac192da67a0d65dfe3eabc4a63" }
# Workaround for a broken nightly build of gpui: See #7644 and revisit once 0.5.3 is released.
pathfinder_simd = { git = "https://github.com/servo/pathfinder.git", rev = "30419d07660dc11a21e42ef4a7fa329600cff152" }
@@ -381,12 +387,16 @@ cranelift-codegen = { opt-level = 3 }
rustybuzz = { opt-level = 3 }
ttf-parser = { opt-level = 3 }
wasmtime-cranelift = { opt-level = 3 }
wasmtime = { opt-level = 3 }
[profile.release]
debug = "limited"
lto = "thin"
codegen-units = 1
[profile.release.package]
zed = { codegen-units = 16 }
[workspace.lints.clippy]
dbg_macro = "deny"
todo = "deny"

View File

@@ -1,6 +1,6 @@
# syntax = docker/dockerfile:1.2
FROM rust:1.76-bookworm as builder
FROM rust:1.77-bookworm as builder
WORKDIR app
COPY . .

4
assets/icons/pencil.svg Normal file
View File

@@ -0,0 +1,4 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M18 10L21 7L17 3L14 6M18 10L8 20H4V16L14 6M18 10L14 6" stroke="#000000" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/>
</svg>

Size: 379 B

View File

@@ -1,4 +1,6 @@
[
// todo(linux): Review the editor bindings
// Standard Linux bindings
{
"bindings": {
"up": "menu::SelectPrev",
@@ -9,12 +11,10 @@
"pagedown": "menu::SelectLast",
"shift-pagedown": "menu::SelectFirst",
"ctrl-n": "menu::SelectNext",
"ctrl-up": "menu::SelectFirst",
"ctrl-down": "menu::SelectLast",
"enter": "menu::Confirm",
"shift-f10": "menu::ShowContextMenu",
"ctrl-enter": "menu::SecondaryConfirm",
"escape": "menu::Cancel",
"ctrl-escape": "menu::Cancel",
"ctrl-c": "menu::Cancel",
"shift-enter": "menu::UseSelectedQuery",
"ctrl-shift-w": "workspace::CloseWindow",
@@ -27,8 +27,6 @@
"ctrl-,": "zed::OpenSettings",
"ctrl-q": "zed::Quit",
"ctrl-h": "zed::Hide",
"alt-ctrl-h": "zed::HideOthers",
"ctrl-m": "zed::Minimize",
"f11": "zed::ToggleFullScreen"
}
},
@@ -45,80 +43,100 @@
"shift-tab": "editor::TabPrev",
"ctrl-k": "editor::CutToEndOfLine",
"ctrl-t": "editor::Transpose",
"ctrl-backspace": "editor::DeleteToBeginningOfLine",
"ctrl-delete": "editor::DeleteToEndOfLine",
"alt-backspace": "editor::DeleteToPreviousWordStart",
"alt-delete": "editor::DeleteToNextWordEnd",
"alt-h": "editor::DeleteToPreviousWordStart",
"alt-d": "editor::DeleteToNextWordEnd",
// "ctrl-backspace": "editor::DeleteToBeginningOfLine",
// "ctrl-delete": "editor::DeleteToEndOfLine",
"ctrl-backspace": "editor::DeleteToPreviousWordStart",
// "ctrl-w": "editor::DeleteToPreviousWordStart",
"ctrl-delete": "editor::DeleteToNextWordEnd",
// "alt-h": "editor::DeleteToPreviousWordStart",
// "alt-d": "editor::DeleteToNextWordEnd",
"ctrl-x": "editor::Cut",
"ctrl-c": "editor::Copy",
"ctrl-v": "editor::Paste",
"ctrl-z": "editor::Undo",
"ctrl-shift-z": "editor::Redo",
"ctrl-y": "editor::Redo",
"up": "editor::MoveUp",
"ctrl-up": "editor::MoveToStartOfParagraph",
// "ctrl-up": "editor::MoveToStartOfParagraph", todo(linux) Should be "scroll down by 1 line"
"pageup": "editor::PageUp",
"shift-pageup": "editor::MovePageUp",
// "shift-pageup": "editor::MovePageUp", todo(linux) should be 'select page up'
"home": "editor::MoveToBeginningOfLine",
"down": "editor::MoveDown",
"ctrl-down": "editor::MoveToEndOfParagraph",
// "ctrl-down": "editor::MoveToEndOfParagraph", todo(linux) should be "scroll up by 1 line"
"pagedown": "editor::PageDown",
"shift-pagedown": "editor::MovePageDown",
// "shift-pagedown": "editor::MovePageDown", todo(linux) should be 'select page down'
"end": "editor::MoveToEndOfLine",
"left": "editor::MoveLeft",
"right": "editor::MoveRight",
"ctrl-p": "editor::MoveUp",
"ctrl-n": "editor::MoveDown",
"ctrl-b": "editor::MoveLeft",
"ctrl-f": "editor::MoveRight",
"ctrl-shift-l": "editor::NextScreen", // todo(linux): What is this
"alt-left": "editor::MoveToPreviousWordStart",
"alt-b": "editor::MoveToPreviousWordStart",
"alt-right": "editor::MoveToNextWordEnd",
"alt-f": "editor::MoveToNextWordEnd",
"ctrl-e": "editor::MoveToEndOfLine",
"ctrl-left": "editor::MoveToPreviousWordStart",
// "alt-b": "editor::MoveToPreviousWordStart",
"ctrl-right": "editor::MoveToNextWordEnd",
// "alt-f": "editor::MoveToNextWordEnd",
// "cmd-left": "editor::MoveToBeginningOfLine",
// "ctrl-a": "editor::MoveToBeginningOfLine",
// "cmd-right": "editor::MoveToEndOfLine",
// "ctrl-e": "editor::MoveToEndOfLine",
"ctrl-home": "editor::MoveToBeginning",
"ctrl-=end": "editor::MoveToEnd",
"ctrl-end": "editor::MoveToEnd",
"shift-up": "editor::SelectUp",
"shift-down": "editor::SelectDown",
"ctrl-shift-n": "editor::SelectDown",
"shift-left": "editor::SelectLeft",
"ctrl-shift-b": "editor::SelectLeft",
"shift-right": "editor::SelectRight",
"ctrl-shift-f": "editor::SelectRight",
"alt-shift-left": "editor::SelectToPreviousWordStart",
"alt-shift-b": "editor::SelectToPreviousWordStart",
"alt-shift-right": "editor::SelectToNextWordEnd",
"alt-shift-f": "editor::SelectToNextWordEnd",
"ctrl-shift-up": "editor::SelectToStartOfParagraph",
"ctrl-shift-down": "editor::SelectToEndOfParagraph",
"ctrl-shift-left": "editor::SelectToPreviousWordStart",
"ctrl-shift-right": "editor::SelectToNextWordEnd",
"ctrl-shift-up": "editor::AddSelectionAbove",
"ctrl-shift-down": "editor::AddSelectionBelow",
// "ctrl-shift-up": "editor::SelectToStartOfParagraph",
// "ctrl-shift-down": "editor::SelectToEndOfParagraph",
"ctrl-shift-home": "editor::SelectToBeginning",
"ctrl-shift-end": "editor::SelectToEnd",
"ctrl-a": "editor::SelectAll",
"ctrl-l": "editor::SelectLine",
"ctrl-shift-i": "editor::Format",
// "cmd-shift-left": [
// "editor::SelectToBeginningOfLine",
// {
// "stop_at_soft_wraps": true
// }
// ],
"shift-home": [
"editor::SelectToBeginningOfLine",
{
"stop_at_soft_wraps": true
}
],
// "ctrl-shift-a": [
// "editor::SelectToBeginningOfLine",
// {
// "stop_at_soft_wraps": true
// }
// ],
// "cmd-shift-right": [
// "editor::SelectToEndOfLine",
// {
// "stop_at_soft_wraps": true
// }
// ],
"shift-end": [
"editor::SelectToEndOfLine",
{
"stop_at_soft_wraps": true
}
],
"ctrl-shift-e": [
"editor::SelectToEndOfLine",
{
"stop_at_soft_wraps": true
}
],
// "ctrl-shift-e": [
// "editor::SelectToEndOfLine",
// {
// "stop_at_soft_wraps": true
// }
// ],
// "alt-v": [
// "editor::MovePageUp",
// {
// "center_cursor": true
// }
// ],
"ctrl-alt-space": "editor::ShowCharacterPalette",
"ctrl-;": "editor::ToggleLineNumbers",
"ctrl-alt-z": "editor::RevertSelectedHunks"
"ctrl-k ctrl-r": "editor::RevertSelectedHunks"
}
},
{
@@ -126,8 +144,8 @@
"bindings": {
"enter": "editor::Newline",
"shift-enter": "editor::Newline",
"ctrl-shift-enter": "editor::NewlineAbove",
"ctrl-enter": "editor::NewlineBelow",
"ctrl-shift-enter": "editor::NewlineBelow",
"ctrl-enter": "editor::NewlineAbove",
"alt-z": "editor::ToggleSoftWrap",
"ctrl-f": [
"buffer_search::Deploy",
@@ -135,21 +153,27 @@
"focus": true
}
],
// "cmd-e": [
// "buffer_search::Deploy",
// {
// "focus": false
// }
// ],
"ctrl->": "assistant::QuoteSelection"
}
},
{
"context": "Editor && mode == full && copilot_suggestion",
"context": "Editor && mode == full && inline_completion",
"bindings": {
"alt-]": "copilot::NextSuggestion",
"alt-[": "copilot::PreviousSuggestion",
"alt-right": "editor::AcceptPartialCopilotSuggestion"
"alt-]": "editor::NextInlineCompletion",
"alt-[": "editor::PreviousInlineCompletion",
"alt-right": "editor::AcceptPartialInlineCompletion"
}
},
{
"context": "Editor && !copilot_suggestion",
"context": "Editor && !inline_completion",
"bindings": {
"alt-\\": "copilot::Suggest"
"alt-\\": "editor::ShowInlineCompletion"
}
},
{
@@ -163,8 +187,8 @@
{
"context": "AssistantPanel",
"bindings": {
"f3": "search::SelectNextMatch",
"shift-f3": "search::SelectPrevMatch"
"ctrl-g": "search::SelectNextMatch",
"ctrl-shift-g": "search::SelectPrevMatch"
}
},
{
@@ -192,7 +216,7 @@
"context": "BufferSearchBar && in_replace",
"bindings": {
"enter": "search::ReplaceNext",
"ctrl-enter": "search::ReplaceAll"
"cmd-enter": "search::ReplaceAll"
}
},
{
@@ -208,9 +232,8 @@
"escape": "project_search::ToggleFocus",
"alt-tab": "search::CycleMode",
"ctrl-shift-h": "search::ToggleReplace",
"ctrl-alt-g": "search::ActivateRegexMode",
"ctrl-alt-s": "search::ActivateSemanticMode",
"ctrl-alt-x": "search::ActivateTextMode"
"alt-ctrl-g": "search::ActivateRegexMode",
"alt-ctrl-x": "search::ActivateTextMode"
}
},
{
@@ -224,7 +247,7 @@
"context": "ProjectSearchBar && in_replace",
"bindings": {
"enter": "search::ReplaceNext",
"ctrl-enter": "search::ReplaceAll"
"ctrl-alt-enter": "search::ReplaceAll"
}
},
{
@@ -232,36 +255,34 @@
"bindings": {
"escape": "project_search::ToggleFocus",
"alt-tab": "search::CycleMode",
"ctrl-shift-h": "search::ToggleReplace",
"ctrl-alt-g": "search::ActivateRegexMode",
"ctrl-alt-s": "search::ActivateSemanticMode",
"ctrl-alt-x": "search::ActivateTextMode"
"cmd-shift-h": "search::ToggleReplace",
"alt-ctrl-g": "search::ActivateRegexMode",
"alt-ctrl-x": "search::ActivateTextMode"
}
},
{
"context": "Pane",
"bindings": {
"ctrl-{": "pane::ActivatePrevItem",
"ctrl-}": "pane::ActivateNextItem",
"ctrl-alt-left": "pane::ActivatePrevItem",
"ctrl-alt-right": "pane::ActivateNextItem",
"ctrl-shift-tab": "pane::ActivatePrevItem",
"ctrl-pageup": "pane::ActivatePrevItem",
"ctrl-tab": "pane::ActivateNextItem",
"ctrl-pagedown": "pane::ActivateNextItem",
"ctrl-w": "pane::CloseActiveItem",
"ctrl-alt-t": "pane::CloseInactiveItems",
"ctrl-alt-shift-w": "workspace::CloseInactiveTabsAndPanes",
"alt-ctrl-t": "pane::CloseInactiveItems",
"alt-ctrl-shift-w": "workspace::CloseInactiveTabsAndPanes",
"ctrl-k u": "pane::CloseCleanItems",
"ctrl-k ctrl-w": "pane::CloseAllItems",
"ctrl-f": "project_search::ToggleFocus",
"f3": "search::SelectNextMatch",
"shift-f3": "search::SelectPrevMatch",
"ctrl-shift-h": "search::ToggleReplace",
"ctrl-k w": "pane::CloseAllItems",
"ctrl-shift-f": "project_search::ToggleFocus",
"ctrl-alt-g": "search::SelectNextMatch",
"ctrl-alt-shift-g": "search::SelectPrevMatch",
"ctrl-alt-shift-h": "search::ToggleReplace",
"alt-enter": "search::SelectAllMatches",
"ctrl-alt-c": "search::ToggleCaseSensitive",
"ctrl-alt-w": "search::ToggleWholeWord",
"alt-tab": "search::CycleMode",
"ctrl-alt-f": "project_search::ToggleFilters",
"ctrl-alt-g": "search::ActivateRegexMode",
"ctrl-alt-s": "search::ActivateSemanticMode",
"ctrl-alt-x": "search::ActivateTextMode"
"alt-c": "search::ToggleCaseSensitive",
"alt-w": "search::ToggleWholeWord",
"alt-r": "search::CycleMode",
"alt-ctrl-f": "project_search::ToggleFilters",
"ctrl-alt-shift-r": "search::ActivateRegexMode",
"ctrl-alt-shift-x": "search::ActivateTextMode"
}
},
// Bindings from VS Code
@@ -270,8 +291,20 @@
"bindings": {
"ctrl-[": "editor::Outdent",
"ctrl-]": "editor::Indent",
"ctrl-alt-up": "editor::AddSelectionAbove",
"ctrl-alt-down": "editor::AddSelectionBelow",
"shift-alt-up": "editor::AddSelectionAbove",
"shift-alt-down": "editor::AddSelectionBelow",
"ctrl-shift-k": "editor::DeleteLine",
"alt-up": "editor::MoveLineUp",
"alt-down": "editor::MoveLineDown",
"ctrl-alt-shift-up": [
"editor::DuplicateLine",
{
"move_upwards": true
}
],
"ctrl-alt-shift-down": "editor::DuplicateLine",
"ctrl-shift-right": "editor::SelectLargerSyntaxNode",
"ctrl-shift-left": "editor::SelectSmallerSyntaxNode",
"ctrl-d": [
"editor::SelectNext",
{
@@ -304,8 +337,6 @@
"advance_downwards": false
}
],
"alt-up": "editor::SelectLargerSyntaxNode",
"alt-down": "editor::SelectSmallerSyntaxNode",
"ctrl-u": "editor::UndoSelection",
"ctrl-shift-u": "editor::RedoSelection",
"f8": "editor::GoToDiagnostic",
@@ -314,39 +345,40 @@
"f12": "editor::GoToDefinition",
"alt-f12": "editor::GoToDefinitionSplit",
"ctrl-f12": "editor::GoToTypeDefinition",
"ctrl-alt-f12": "editor::GoToTypeDefinitionSplit",
"shift-f12": "editor::GoToImplementation",
"alt-ctrl-f12": "editor::GoToTypeDefinitionSplit",
"alt-shift-f12": "editor::FindAllReferences",
"ctrl-m": "editor::MoveToEnclosingBracket",
"ctrl-alt-[": "editor::Fold",
"ctrl-alt-]": "editor::UnfoldLines",
"ctrl-shift-[": "editor::Fold",
"ctrl-shift-]": "editor::UnfoldLines",
"ctrl-space": "editor::ShowCompletions",
"ctrl-.": "editor::ToggleCodeActions",
"ctrl-alt-r": "editor::RevealInFinder",
"ctrl-alt-c": "editor::DisplayCursorNames"
"alt-cmd-r": "editor::RevealInFinder",
"ctrl-alt-shift-c": "editor::DisplayCursorNames"
}
},
{
"context": "Editor && mode == full",
"bindings": {
"ctrl-shift-o": "outline::Toggle",
"cmd-shift-o": "outline::Toggle",
"ctrl-g": "go_to_line::Toggle"
}
},
{
"context": "Pane",
"bindings": {
"ctrl-1": ["pane::ActivateItem", 0],
"ctrl-2": ["pane::ActivateItem", 1],
"ctrl-3": ["pane::ActivateItem", 2],
"ctrl-4": ["pane::ActivateItem", 3],
"ctrl-5": ["pane::ActivateItem", 4],
"ctrl-6": ["pane::ActivateItem", 5],
"ctrl-7": ["pane::ActivateItem", 6],
"ctrl-8": ["pane::ActivateItem", 7],
"ctrl-9": ["pane::ActivateItem", 8],
"ctrl-0": "pane::ActivateLastItem",
"ctrl--": "pane::GoBack",
"ctrl-_": "pane::GoForward",
"alt-1": ["pane::ActivateItem", 0],
"alt-2": ["pane::ActivateItem", 1],
"alt-3": ["pane::ActivateItem", 2],
"alt-4": ["pane::ActivateItem", 3],
"alt-5": ["pane::ActivateItem", 4],
"alt-6": ["pane::ActivateItem", 5],
"alt-7": ["pane::ActivateItem", 6],
"alt-8": ["pane::ActivateItem", 7],
"alt-9": ["pane::ActivateItem", 8],
"alt-0": "pane::ActivateLastItem",
"ctrl-alt--": "pane::GoBack",
"ctrl-alt-_": "pane::GoForward",
"ctrl-shift-t": "pane::ReopenClosedItem",
"ctrl-shift-f": "project_search::ToggleFocus"
}
@@ -361,8 +393,8 @@
// "create_new_window": true
// }
// ]
"ctrl-alt-o": "projects::OpenRecent",
"ctrl-alt-b": "branches::OpenRecent",
"alt-ctrl-o": "projects::OpenRecent",
"alt-ctrl-shift-b": "branches::OpenRecent",
"ctrl-~": "workspace::NewTerminal",
"ctrl-s": "workspace::Save",
"ctrl-k s": "workspace::SaveWithoutFormat",
@@ -370,24 +402,25 @@
"ctrl-n": "workspace::NewFile",
"ctrl-shift-n": "workspace::NewWindow",
"ctrl-`": "terminal_panel::ToggleFocus",
"ctrl-1": ["workspace::ActivatePane", 0],
"ctrl-2": ["workspace::ActivatePane", 1],
"ctrl-3": ["workspace::ActivatePane", 2],
"ctrl-4": ["workspace::ActivatePane", 3],
"ctrl-5": ["workspace::ActivatePane", 4],
"ctrl-6": ["workspace::ActivatePane", 5],
"ctrl-7": ["workspace::ActivatePane", 6],
"ctrl-8": ["workspace::ActivatePane", 7],
"ctrl-9": ["workspace::ActivatePane", 8],
"ctrl-b": "workspace::ToggleLeftDock",
"ctrl-r": "workspace::ToggleRightDock",
"alt-1": ["workspace::ActivatePane", 0],
"alt-2": ["workspace::ActivatePane", 1],
"alt-3": ["workspace::ActivatePane", 2],
"alt-4": ["workspace::ActivatePane", 3],
"alt-5": ["workspace::ActivatePane", 4],
"alt-6": ["workspace::ActivatePane", 5],
"alt-7": ["workspace::ActivatePane", 6],
"alt-8": ["workspace::ActivatePane", 7],
"alt-9": ["workspace::ActivatePane", 8],
"ctrl-alt-b": "workspace::ToggleLeftDock",
"ctrl-b": "workspace::ToggleRightDock",
"ctrl-j": "workspace::ToggleBottomDock",
"ctrl-alt-y": "workspace::CloseAllDocks",
"ctrl-shift-f": "pane::DeploySearch",
"ctrl-k ctrl-t": "theme_selector::Toggle",
"ctrl-k ctrl-s": "zed::OpenKeymap",
"ctrl-k ctrl-t": "theme_selector::Toggle",
"ctrl-t": "project_symbols::Toggle",
"ctrl-p": "file_finder::Toggle",
"ctrl-e": "file_finder::Toggle",
"ctrl-shift-p": "command_palette::Toggle",
"ctrl-shift-m": "diagnostics::Deploy",
"ctrl-shift-e": "project_panel::ToggleFocus",
@@ -408,15 +441,10 @@
}
},
// Bindings from Sublime Text
// todo(linux) make sure these match linux bindings or remove above comment?
{
"context": "Editor",
"bindings": {
"ctrl-shift-k": "editor::DeleteLine",
"ctrl-shift-d": "editor::DuplicateLine",
"ctrl-j": "editor::JoinLines",
"ctrl-alt-up": "editor::MoveLineUp",
"ctrl-alt-down": "editor::MoveLineDown",
"ctrl-alt-backspace": "editor::DeleteToPreviousSubwordStart",
"ctrl-alt-h": "editor::DeleteToPreviousSubwordStart",
"ctrl-alt-delete": "editor::DeleteToNextSubwordEnd",
@@ -432,7 +460,6 @@
}
},
// Bindings from Atom
// todo(linux) make sure these match linux bindings or remove above comment?
{
"context": "Pane",
"bindings": {
@@ -478,7 +505,7 @@
"bindings": {
"ctrl-alt-shift-f": "workspace::FollowNextCollaborator",
// TODO: Move this to a dock open action
"ctrl-alt-c": "collab_panel::ToggleFocus",
"ctrl-shift-c": "collab_panel::ToggleFocus",
"ctrl-alt-i": "zed::DebugElements",
"ctrl-:": "editor::ToggleInlayHints"
}
@@ -505,19 +532,19 @@
"left": "project_panel::CollapseSelectedEntry",
"right": "project_panel::ExpandSelectedEntry",
"ctrl-n": "project_panel::NewFile",
"ctrl-alt-n": "project_panel::NewDirectory",
"alt-ctrl-n": "project_panel::NewDirectory",
"ctrl-x": "project_panel::Cut",
"ctrl-c": "project_panel::Copy",
"ctrl-v": "project_panel::Paste",
"ctrl-alt-c": "project_panel::CopyPath",
"ctrl-alt-shift-c": "project_panel::CopyRelativePath",
"alt-ctrl-shift-c": "project_panel::CopyRelativePath",
"f2": "project_panel::Rename",
"enter": "project_panel::Rename",
"backspace": "project_panel::Delete",
"delete": "project_panel::Delete",
"ctrl-backspace": ["project_panel::Delete", { "skip_prompt": true }],
"ctrl-delete": ["project_panel::Delete", { "skip_prompt": true }],
"ctrl-alt-r": "project_panel::RevealInFinder",
"alt-cmd-r": "project_panel::RevealInFinder",
"alt-shift-f": "project_panel::NewSearchInDirectory"
}
},
@@ -558,22 +585,16 @@
"escape": "chat_panel::CloseReplyPreview"
}
},
{
"context": "FileFinder",
"bindings": { "ctrl-shift-p": "file_finder::SelectPrev" }
},
{
"context": "Terminal",
"bindings": {
"ctrl-alt-space": "terminal::ShowCharacterPalette",
"ctrl-shift-c": "terminal::Copy",
"ctrl-shift-v": "terminal::Paste",
"ctrl-k": "terminal::Clear",
// Some nice conveniences
"ctrl-backspace": ["terminal::SendText", "\u0015"],
"ctrl-right": ["terminal::SendText", "\u0005"],
"ctrl-left": ["terminal::SendText", "\u0001"],
// Terminal.app compatibility
"alt-left": ["terminal::SendText", "\u001bb"],
"alt-right": ["terminal::SendText", "\u001bf"],
// There are conflicting bindings for these keys in the global context.
// these bindings override them, remove at your own risk:
"shift-ctrl-c": "terminal::Copy",
"shift-ctrl-v": "terminal::Paste",
"up": ["terminal::SendKeystroke", "up"],
"pageup": ["terminal::SendKeystroke", "pageup"],
"down": ["terminal::SendKeystroke", "down"],

View File

@@ -13,9 +13,10 @@
"cmd-up": "menu::SelectFirst",
"cmd-down": "menu::SelectLast",
"enter": "menu::Confirm",
"ctrl-enter": "menu::ShowContextMenu",
"ctrl-enter": "menu::SecondaryConfirm",
"cmd-enter": "menu::SecondaryConfirm",
"escape": "menu::Cancel",
"cmd-escape": "menu::Cancel",
"ctrl-c": "menu::Cancel",
"shift-enter": "menu::UseSelectedQuery",
"cmd-shift-w": "workspace::CloseWindow",
@@ -181,17 +182,17 @@
}
},
{
"context": "Editor && mode == full && copilot_suggestion",
"context": "Editor && mode == full && inline_completion",
"bindings": {
"alt-]": "copilot::NextSuggestion",
"alt-[": "copilot::PreviousSuggestion",
"alt-right": "editor::AcceptPartialCopilotSuggestion"
"alt-]": "editor::NextInlineCompletion",
"alt-[": "editor::PreviousInlineCompletion",
"alt-right": "editor::AcceptPartialInlineCompletion"
}
},
{
"context": "Editor && !copilot_suggestion",
"context": "Editor && !inline_completion",
"bindings": {
"alt-\\": "copilot::Suggest"
"alt-\\": "editor::ShowInlineCompletion"
}
},
{
@@ -251,7 +252,6 @@
"alt-tab": "search::CycleMode",
"cmd-shift-h": "search::ToggleReplace",
"alt-cmd-g": "search::ActivateRegexMode",
"alt-cmd-s": "search::ActivateSemanticMode",
"alt-cmd-x": "search::ActivateTextMode"
}
},
@@ -276,7 +276,6 @@
"alt-tab": "search::CycleMode",
"cmd-shift-h": "search::ToggleReplace",
"alt-cmd-g": "search::ActivateRegexMode",
"alt-cmd-s": "search::ActivateSemanticMode",
"alt-cmd-x": "search::ActivateTextMode"
}
},
@@ -302,7 +301,6 @@
"alt-tab": "search::CycleMode",
"alt-cmd-f": "project_search::ToggleFilters",
"alt-cmd-g": "search::ActivateRegexMode",
"alt-cmd-s": "search::ActivateSemanticMode",
"alt-cmd-x": "search::ActivateTextMode"
}
},
@@ -368,6 +366,7 @@
"f12": "editor::GoToDefinition",
"alt-f12": "editor::GoToDefinitionSplit",
"cmd-f12": "editor::GoToTypeDefinition",
"shift-f12": "editor::GoToImplementation",
"alt-cmd-f12": "editor::GoToTypeDefinitionSplit",
"alt-shift-f12": "editor::FindAllReferences",
"ctrl-m": "editor::MoveToEnclosingBracket",
@@ -601,10 +600,8 @@
}
},
{
"context": "ChatPanel > MessageEditor",
"bindings": {
"escape": "chat_panel::CloseReplyPreview"
}
"context": "FileFinder",
"bindings": { "cmd-shift-p": "file_finder::SelectPrev" }
},
{
"context": "Terminal",

View File

@@ -13,7 +13,7 @@
"cmd-up": "menu::SelectFirst",
"cmd-down": "menu::SelectLast",
"enter": "menu::Confirm",
"ctrl-enter": "menu::ShowContextMenu",
"ctrl-enter": "menu::SecondaryConfirm",
"cmd-enter": "menu::SecondaryConfirm",
"escape": "menu::Cancel",
"ctrl-c": "menu::Cancel",

View File

@@ -510,7 +510,7 @@
"ctrl-[": "vim::NormalBefore",
"ctrl-x ctrl-o": "editor::ShowCompletions",
"ctrl-x ctrl-a": "assistant::InlineAssist", // zed specific
"ctrl-x ctrl-c": "copilot::Suggest", // zed specific
"ctrl-x ctrl-c": "editor::ShowInlineCompletion", // zed specific
"ctrl-x ctrl-l": "editor::ToggleCodeActions", // zed specific
"ctrl-x ctrl-z": "editor::Cancel",
"ctrl-w": "editor::DeleteToPreviousWordStart",

View File

@@ -92,6 +92,12 @@
// Whether to automatically type closing characters for you. For example,
// when you type (, Zed will automatically add a closing ) at the correct position.
"use_autoclose": true,
// Controls how the editor handles the autoclosed characters.
// When set to `false`(default), skipping over and auto-removing of the closing characters
// happen only for auto-inserted characters.
// Otherwise(when `true`), the closing characters are always skipped over and auto-removed
// no matter how they were inserted.
"always_treat_brackets_as_autoclosed": false,
// Controls whether copilot provides suggestion immediately
// or waits for a `copilot::Toggle`
"show_copilot_suggestions": true,
@@ -237,6 +243,10 @@
"default_width": 380
},
"assistant": {
// Version of this setting.
"version": "1",
// Whether the assistant is enabled.
"enabled": true,
// Whether to show the assistant panel button in the status bar.
"button": true,
// Where to dock the assistant panel. Can be 'left', 'right' or 'bottom'.
@@ -245,28 +255,16 @@
"default_width": 640,
// Default height when the assistant is docked to the bottom.
"default_height": 320,
// Deprecated: Please use `provider.api_url` instead.
// The default OpenAI API endpoint to use when starting new conversations.
"openai_api_url": "https://api.openai.com/v1",
// Deprecated: Please use `provider.default_model` instead.
// The default OpenAI model to use when starting new conversations. This
// setting can take three values:
//
// 1. "gpt-3.5-turbo-0613""
// 2. "gpt-4-0613""
// 3. "gpt-4-1106-preview"
"default_open_ai_model": "gpt-4-1106-preview",
// AI provider.
"provider": {
"type": "openai",
// The default OpenAI API endpoint to use when starting new conversations.
"api_url": "https://api.openai.com/v1",
// The default OpenAI model to use when starting new conversations. This
"name": "openai",
// The default model to use when starting new conversations. This
// setting can take three values:
//
// 1. "gpt-3.5-turbo-0613""
// 2. "gpt-4-0613""
// 3. "gpt-4-1106-preview"
"default_model": "gpt-4-1106-preview"
// 1. "gpt-3.5-turbo"
// 2. "gpt-4"
// 3. "gpt-4-turbo-preview"
"default_model": "gpt-4-turbo-preview"
}
},
// Whether the screen sharing icon is shown in the os status bar.
@@ -505,10 +503,6 @@
// Existing terminals will not pick up this change until they are recreated.
// "max_scroll_history_lines": 10000,
},
// Difference settings for semantic_index
"semantic_index": {
"enabled": true
},
// Settings specific to our elixir integration
"elixir": {
// Change the LSP zed uses for elixir.
@@ -593,6 +587,9 @@
},
"OCaml Interface": {
"tab_size": 2
},
"Prisma": {
"tab_size": 2
}
},
// Zed's Prettier integration settings.

View File

@@ -205,7 +205,7 @@ impl ActivityIndicator {
}
LanguageServerBinaryStatus::Downloading => downloading.push(status.name.0.as_ref()),
LanguageServerBinaryStatus::Failed { .. } => failed.push(status.name.0.as_ref()),
LanguageServerBinaryStatus::Downloaded | LanguageServerBinaryStatus::Cached => {}
LanguageServerBinaryStatus::None => {}
}
}

View File

@@ -1,41 +0,0 @@
[package]
name = "ai"
version = "0.1.0"
edition = "2021"
publish = false
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/ai.rs"
doctest = false
[features]
test-support = []
[dependencies]
anyhow.workspace = true
async-trait.workspace = true
bincode = "1.3.3"
futures.workspace = true
gpui.workspace = true
isahc.workspace = true
language.workspace = true
log.workspace = true
matrixmultiply = "0.3.7"
ordered-float.workspace = true
parking_lot.workspace = true
parse_duration = "2.1.1"
postage.workspace = true
rand.workspace = true
rusqlite = { version = "0.29.0", features = ["blob", "array", "modern_sqlite"] }
schemars.workspace = true
serde.workspace = true
serde_json.workspace = true
tiktoken-rs.workspace = true
util.workspace = true
[dev-dependencies]
gpui = { workspace = true, features = ["test-support"] }

View File

@@ -1,8 +0,0 @@
pub mod auth;
pub mod completion;
pub mod embedding;
pub mod models;
pub mod prompts;
pub mod providers;
#[cfg(any(test, feature = "test-support"))]
pub mod test;

View File

@@ -1,23 +0,0 @@
use futures::future::BoxFuture;
use gpui::AppContext;
#[derive(Clone, Debug)]
pub enum ProviderCredential {
Credentials { api_key: String },
NoCredentials,
NotNeeded,
}
pub trait CredentialProvider: Send + Sync {
fn has_credentials(&self) -> bool;
#[must_use]
fn retrieve_credentials(&self, cx: &mut AppContext) -> BoxFuture<ProviderCredential>;
#[must_use]
fn save_credentials(
&self,
cx: &mut AppContext,
credential: ProviderCredential,
) -> BoxFuture<()>;
#[must_use]
fn delete_credentials(&self, cx: &mut AppContext) -> BoxFuture<()>;
}

View File

@@ -1,23 +0,0 @@
use anyhow::Result;
use futures::{future::BoxFuture, stream::BoxStream};
use crate::{auth::CredentialProvider, models::LanguageModel};
pub trait CompletionRequest: Send + Sync {
fn data(&self) -> serde_json::Result<String>;
}
pub trait CompletionProvider: CredentialProvider {
fn base_model(&self) -> Box<dyn LanguageModel>;
fn complete(
&self,
prompt: Box<dyn CompletionRequest>,
) -> BoxFuture<'static, Result<BoxStream<'static, Result<String>>>>;
fn box_clone(&self) -> Box<dyn CompletionProvider>;
}
impl Clone for Box<dyn CompletionProvider> {
fn clone(&self) -> Box<dyn CompletionProvider> {
self.box_clone()
}
}

View File

@@ -1,121 +0,0 @@
use std::time::Instant;
use anyhow::Result;
use async_trait::async_trait;
use ordered_float::OrderedFloat;
use rusqlite::types::{FromSql, FromSqlResult, ToSqlOutput, ValueRef};
use rusqlite::ToSql;
use crate::auth::CredentialProvider;
use crate::models::LanguageModel;
#[derive(Debug, PartialEq, Clone)]
pub struct Embedding(pub Vec<f32>);
// This is needed for semantic index functionality
// Unfortunately it has to live wherever the "Embedding" struct is created.
// Keeping this in here though, introduces a 'rusqlite' dependency into AI
// which is less than ideal
impl FromSql for Embedding {
fn column_result(value: ValueRef) -> FromSqlResult<Self> {
let bytes = value.as_blob()?;
let embedding =
bincode::deserialize(bytes).map_err(|err| rusqlite::types::FromSqlError::Other(err))?;
Ok(Embedding(embedding))
}
}
impl ToSql for Embedding {
fn to_sql(&self) -> rusqlite::Result<ToSqlOutput> {
let bytes = bincode::serialize(&self.0)
.map_err(|err| rusqlite::Error::ToSqlConversionFailure(Box::new(err)))?;
Ok(ToSqlOutput::Owned(rusqlite::types::Value::Blob(bytes)))
}
}
impl From<Vec<f32>> for Embedding {
fn from(value: Vec<f32>) -> Self {
Embedding(value)
}
}
impl Embedding {
pub fn similarity(&self, other: &Self) -> OrderedFloat<f32> {
let len = self.0.len();
assert_eq!(len, other.0.len());
let mut result = 0.0;
unsafe {
matrixmultiply::sgemm(
1,
len,
1,
1.0,
self.0.as_ptr(),
len as isize,
1,
other.0.as_ptr(),
1,
len as isize,
0.0,
&mut result as *mut f32,
1,
1,
);
}
OrderedFloat(result)
}
}
#[async_trait]
pub trait EmbeddingProvider: CredentialProvider {
fn base_model(&self) -> Box<dyn LanguageModel>;
async fn embed_batch(&self, spans: Vec<String>) -> Result<Vec<Embedding>>;
fn max_tokens_per_batch(&self) -> usize;
fn rate_limit_expiration(&self) -> Option<Instant>;
}
#[cfg(test)]
mod tests {
use super::*;
use rand::prelude::*;
#[gpui::test]
fn test_similarity(mut rng: StdRng) {
assert_eq!(
Embedding::from(vec![1., 0., 0., 0., 0.])
.similarity(&Embedding::from(vec![0., 1., 0., 0., 0.])),
0.
);
assert_eq!(
Embedding::from(vec![2., 0., 0., 0., 0.])
.similarity(&Embedding::from(vec![3., 1., 0., 0., 0.])),
6.
);
for _ in 0..100 {
let size = 1536;
let mut a = vec![0.; size];
let mut b = vec![0.; size];
for (a, b) in a.iter_mut().zip(b.iter_mut()) {
*a = rng.gen();
*b = rng.gen();
}
let a = Embedding::from(a);
let b = Embedding::from(b);
assert_eq!(
round_to_decimals(a.similarity(&b), 1),
round_to_decimals(reference_dot(&a.0, &b.0), 1)
);
}
fn round_to_decimals(n: OrderedFloat<f32>, decimal_places: i32) -> f32 {
let factor = 10.0_f32.powi(decimal_places);
(n * factor).round() / factor
}
fn reference_dot(a: &[f32], b: &[f32]) -> OrderedFloat<f32> {
OrderedFloat(a.iter().zip(b.iter()).map(|(a, b)| a * b).sum())
}
}
}

View File

@@ -1,16 +0,0 @@
pub enum TruncationDirection {
Start,
End,
}
pub trait LanguageModel {
fn name(&self) -> String;
fn count_tokens(&self, content: &str) -> anyhow::Result<usize>;
fn truncate(
&self,
content: &str,
length: usize,
direction: TruncationDirection,
) -> anyhow::Result<String>;
fn capacity(&self) -> anyhow::Result<usize>;
}

View File

@@ -1,337 +0,0 @@
use std::cmp::Reverse;
use std::ops::Range;
use std::sync::Arc;
use language::BufferSnapshot;
use util::ResultExt;
use crate::models::LanguageModel;
use crate::prompts::repository_context::PromptCodeSnippet;
pub(crate) enum PromptFileType {
Text,
Code,
}
// TODO: Set this up to manage for defaults well
pub struct PromptArguments {
pub model: Arc<dyn LanguageModel>,
pub user_prompt: Option<String>,
pub language_name: Option<String>,
pub project_name: Option<String>,
pub snippets: Vec<PromptCodeSnippet>,
pub reserved_tokens: usize,
pub buffer: Option<BufferSnapshot>,
pub selected_range: Option<Range<usize>>,
}
impl PromptArguments {
pub(crate) fn get_file_type(&self) -> PromptFileType {
if self
.language_name
.as_ref()
.map(|name| !["Markdown", "Plain Text"].contains(&name.as_str()))
.unwrap_or(true)
{
PromptFileType::Code
} else {
PromptFileType::Text
}
}
}
pub trait PromptTemplate {
fn generate(
&self,
args: &PromptArguments,
max_token_length: Option<usize>,
) -> anyhow::Result<(String, usize)>;
}
#[repr(i8)]
#[derive(PartialEq, Eq)]
pub enum PromptPriority {
/// Ignores truncation.
Mandatory,
/// Truncates based on priority.
Ordered { order: usize },
}
impl PartialOrd for PromptPriority {
fn partial_cmp(&self, other: &Self) -> Option<std::cmp::Ordering> {
Some(self.cmp(other))
}
}
impl Ord for PromptPriority {
fn cmp(&self, other: &Self) -> std::cmp::Ordering {
match (self, other) {
(Self::Mandatory, Self::Mandatory) => std::cmp::Ordering::Equal,
(Self::Mandatory, Self::Ordered { .. }) => std::cmp::Ordering::Greater,
(Self::Ordered { .. }, Self::Mandatory) => std::cmp::Ordering::Less,
(Self::Ordered { order: a }, Self::Ordered { order: b }) => b.cmp(a),
}
}
}
pub struct PromptChain {
args: PromptArguments,
templates: Vec<(PromptPriority, Box<dyn PromptTemplate>)>,
}
impl PromptChain {
pub fn new(
args: PromptArguments,
templates: Vec<(PromptPriority, Box<dyn PromptTemplate>)>,
) -> Self {
PromptChain { args, templates }
}
pub fn generate(&self, truncate: bool) -> anyhow::Result<(String, usize)> {
// Argsort based on Prompt Priority
let separator = "\n";
let separator_tokens = self.args.model.count_tokens(separator)?;
let mut sorted_indices = (0..self.templates.len()).collect::<Vec<_>>();
sorted_indices.sort_by_key(|&i| Reverse(&self.templates[i].0));
let mut tokens_outstanding = if truncate {
Some(self.args.model.capacity()? - self.args.reserved_tokens)
} else {
None
};
let mut prompts = vec!["".to_string(); sorted_indices.len()];
for idx in sorted_indices {
let (_, template) = &self.templates[idx];
if let Some((template_prompt, prompt_token_count)) =
template.generate(&self.args, tokens_outstanding).log_err()
{
if template_prompt != "" {
prompts[idx] = template_prompt;
if let Some(remaining_tokens) = tokens_outstanding {
let new_tokens = prompt_token_count + separator_tokens;
tokens_outstanding = if remaining_tokens > new_tokens {
Some(remaining_tokens - new_tokens)
} else {
Some(0)
};
}
}
}
}
prompts.retain(|x| x != "");
let full_prompt = prompts.join(separator);
let total_token_count = self.args.model.count_tokens(&full_prompt)?;
anyhow::Ok((prompts.join(separator), total_token_count))
}
}
#[cfg(test)]
pub(crate) mod tests {
use crate::models::TruncationDirection;
use crate::test::FakeLanguageModel;
use super::*;
#[test]
pub fn test_prompt_chain() {
struct TestPromptTemplate {}
impl PromptTemplate for TestPromptTemplate {
fn generate(
&self,
args: &PromptArguments,
max_token_length: Option<usize>,
) -> anyhow::Result<(String, usize)> {
let mut content = "This is a test prompt template".to_string();
let mut token_count = args.model.count_tokens(&content)?;
if let Some(max_token_length) = max_token_length {
if token_count > max_token_length {
content = args.model.truncate(
&content,
max_token_length,
TruncationDirection::End,
)?;
token_count = max_token_length;
}
}
anyhow::Ok((content, token_count))
}
}
struct TestLowPriorityTemplate {}
impl PromptTemplate for TestLowPriorityTemplate {
fn generate(
&self,
args: &PromptArguments,
max_token_length: Option<usize>,
) -> anyhow::Result<(String, usize)> {
let mut content = "This is a low priority test prompt template".to_string();
let mut token_count = args.model.count_tokens(&content)?;
if let Some(max_token_length) = max_token_length {
if token_count > max_token_length {
content = args.model.truncate(
&content,
max_token_length,
TruncationDirection::End,
)?;
token_count = max_token_length;
}
}
anyhow::Ok((content, token_count))
}
}
let model: Arc<dyn LanguageModel> = Arc::new(FakeLanguageModel { capacity: 100 });
let args = PromptArguments {
model: model.clone(),
language_name: None,
project_name: None,
snippets: Vec::new(),
reserved_tokens: 0,
buffer: None,
selected_range: None,
user_prompt: None,
};
let templates: Vec<(PromptPriority, Box<dyn PromptTemplate>)> = vec![
(
PromptPriority::Ordered { order: 0 },
Box::new(TestPromptTemplate {}),
),
(
PromptPriority::Ordered { order: 1 },
Box::new(TestLowPriorityTemplate {}),
),
];
let chain = PromptChain::new(args, templates);
let (prompt, token_count) = chain.generate(false).unwrap();
assert_eq!(
prompt,
"This is a test prompt template\nThis is a low priority test prompt template"
.to_string()
);
assert_eq!(model.count_tokens(&prompt).unwrap(), token_count);
// Testing with Truncation Off
// Should ignore capacity and return all prompts
let model: Arc<dyn LanguageModel> = Arc::new(FakeLanguageModel { capacity: 20 });
let args = PromptArguments {
model: model.clone(),
language_name: None,
project_name: None,
snippets: Vec::new(),
reserved_tokens: 0,
buffer: None,
selected_range: None,
user_prompt: None,
};
let templates: Vec<(PromptPriority, Box<dyn PromptTemplate>)> = vec![
(
PromptPriority::Ordered { order: 0 },
Box::new(TestPromptTemplate {}),
),
(
PromptPriority::Ordered { order: 1 },
Box::new(TestLowPriorityTemplate {}),
),
];
let chain = PromptChain::new(args, templates);
let (prompt, token_count) = chain.generate(false).unwrap();
assert_eq!(
prompt,
"This is a test prompt template\nThis is a low priority test prompt template"
.to_string()
);
assert_eq!(model.count_tokens(&prompt).unwrap(), token_count);
// Testing with Truncation Off
// Should ignore capacity and return all prompts
let capacity = 20;
let model: Arc<dyn LanguageModel> = Arc::new(FakeLanguageModel { capacity });
let args = PromptArguments {
model: model.clone(),
language_name: None,
project_name: None,
snippets: Vec::new(),
reserved_tokens: 0,
buffer: None,
selected_range: None,
user_prompt: None,
};
let templates: Vec<(PromptPriority, Box<dyn PromptTemplate>)> = vec![
(
PromptPriority::Ordered { order: 0 },
Box::new(TestPromptTemplate {}),
),
(
PromptPriority::Ordered { order: 1 },
Box::new(TestLowPriorityTemplate {}),
),
(
PromptPriority::Ordered { order: 2 },
Box::new(TestLowPriorityTemplate {}),
),
];
let chain = PromptChain::new(args, templates);
let (prompt, token_count) = chain.generate(true).unwrap();
assert_eq!(prompt, "This is a test promp".to_string());
assert_eq!(token_count, capacity);
// Change Ordering of Prompts Based on Priority
let capacity = 120;
let reserved_tokens = 10;
let model: Arc<dyn LanguageModel> = Arc::new(FakeLanguageModel { capacity });
let args = PromptArguments {
model: model.clone(),
language_name: None,
project_name: None,
snippets: Vec::new(),
reserved_tokens,
buffer: None,
selected_range: None,
user_prompt: None,
};
let templates: Vec<(PromptPriority, Box<dyn PromptTemplate>)> = vec![
(
PromptPriority::Mandatory,
Box::new(TestLowPriorityTemplate {}),
),
(
PromptPriority::Ordered { order: 0 },
Box::new(TestPromptTemplate {}),
),
(
PromptPriority::Ordered { order: 1 },
Box::new(TestLowPriorityTemplate {}),
),
];
let chain = PromptChain::new(args, templates);
let (prompt, token_count) = chain.generate(true).unwrap();
assert_eq!(
prompt,
"This is a low priority test prompt template\nThis is a test prompt template\nThis is a low priority test prompt "
.to_string()
);
assert_eq!(token_count, capacity - reserved_tokens);
}
}

View File

@@ -1,164 +0,0 @@
use anyhow::anyhow;
use language::BufferSnapshot;
use language::ToOffset;
use crate::models::LanguageModel;
use crate::models::TruncationDirection;
use crate::prompts::base::PromptArguments;
use crate::prompts::base::PromptTemplate;
use std::fmt::Write;
use std::ops::Range;
use std::sync::Arc;
fn retrieve_context(
buffer: &BufferSnapshot,
selected_range: &Option<Range<usize>>,
model: Arc<dyn LanguageModel>,
max_token_count: Option<usize>,
) -> anyhow::Result<(String, usize, bool)> {
let mut prompt = String::new();
let mut truncated = false;
if let Some(selected_range) = selected_range {
let start = selected_range.start.to_offset(buffer);
let end = selected_range.end.to_offset(buffer);
let start_window = buffer.text_for_range(0..start).collect::<String>();
let mut selected_window = String::new();
if start == end {
write!(selected_window, "<|START|>").unwrap();
} else {
write!(selected_window, "<|START|").unwrap();
}
write!(
selected_window,
"{}",
buffer.text_for_range(start..end).collect::<String>()
)
.unwrap();
if start != end {
write!(selected_window, "|END|>").unwrap();
}
let end_window = buffer.text_for_range(end..buffer.len()).collect::<String>();
if let Some(max_token_count) = max_token_count {
let selected_tokens = model.count_tokens(&selected_window)?;
if selected_tokens > max_token_count {
return Err(anyhow!(
"selected range is greater than model context window, truncation not possible"
));
};
let mut remaining_tokens = max_token_count - selected_tokens;
let start_window_tokens = model.count_tokens(&start_window)?;
let end_window_tokens = model.count_tokens(&end_window)?;
let outside_tokens = start_window_tokens + end_window_tokens;
if outside_tokens > remaining_tokens {
let (start_goal_tokens, end_goal_tokens) =
if start_window_tokens < end_window_tokens {
let start_goal_tokens = (remaining_tokens / 2).min(start_window_tokens);
remaining_tokens -= start_goal_tokens;
let end_goal_tokens = remaining_tokens.min(end_window_tokens);
(start_goal_tokens, end_goal_tokens)
} else {
let end_goal_tokens = (remaining_tokens / 2).min(end_window_tokens);
remaining_tokens -= end_goal_tokens;
let start_goal_tokens = remaining_tokens.min(start_window_tokens);
(start_goal_tokens, end_goal_tokens)
};
let truncated_start_window =
model.truncate(&start_window, start_goal_tokens, TruncationDirection::Start)?;
let truncated_end_window =
model.truncate(&end_window, end_goal_tokens, TruncationDirection::End)?;
writeln!(
prompt,
"{truncated_start_window}{selected_window}{truncated_end_window}"
)
.unwrap();
truncated = true;
} else {
writeln!(prompt, "{start_window}{selected_window}{end_window}").unwrap();
}
} else {
// If we dont have a selected range, include entire file.
writeln!(prompt, "{}", &buffer.text()).unwrap();
// Dumb truncation strategy
if let Some(max_token_count) = max_token_count {
if model.count_tokens(&prompt)? > max_token_count {
truncated = true;
prompt = model.truncate(&prompt, max_token_count, TruncationDirection::End)?;
}
}
}
}
let token_count = model.count_tokens(&prompt)?;
anyhow::Ok((prompt, token_count, truncated))
}
pub struct FileContext {}
impl PromptTemplate for FileContext {
fn generate(
&self,
args: &PromptArguments,
max_token_length: Option<usize>,
) -> anyhow::Result<(String, usize)> {
if let Some(buffer) = &args.buffer {
let mut prompt = String::new();
// Add Initial Preamble
// TODO: Do we want to add the path in here?
writeln!(
prompt,
"The file you are currently working on has the following content:"
)
.unwrap();
let language_name = args
.language_name
.clone()
.unwrap_or("".to_string())
.to_lowercase();
let (context, _, truncated) = retrieve_context(
buffer,
&args.selected_range,
args.model.clone(),
max_token_length,
)?;
writeln!(prompt, "```{language_name}\n{context}\n```").unwrap();
if truncated {
writeln!(prompt, "Note the content has been truncated and only represents a portion of the file.").unwrap();
}
if let Some(selected_range) = &args.selected_range {
let start = selected_range.start.to_offset(buffer);
let end = selected_range.end.to_offset(buffer);
if start == end {
writeln!(prompt, "In particular, the user's cursor is currently on the '<|START|>' span in the above content, with no text selected.").unwrap();
} else {
writeln!(prompt, "In particular, the user has selected a section of the text between the '<|START|' and '|END|>' spans.").unwrap();
}
}
// Naive final truncation: if the assembled prompt still exceeds the budget, cut tokens from the end.
if let Some(max_tokens) = max_token_length {
prompt = args
.model
.truncate(&prompt, max_tokens, TruncationDirection::End)?;
}
let token_count = args.model.count_tokens(&prompt)?;
anyhow::Ok((prompt, token_count))
} else {
Err(anyhow!("no buffer provided to retrieve file context from"))
}
}
}


@@ -1,99 +0,0 @@
use crate::prompts::base::{PromptArguments, PromptFileType, PromptTemplate};
use anyhow::anyhow;
use std::fmt::Write;
pub fn capitalize(s: &str) -> String {
let mut c = s.chars();
match c.next() {
None => String::new(),
Some(f) => f.to_uppercase().collect::<String>() + c.as_str(),
}
}
pub struct GenerateInlineContent {}
impl PromptTemplate for GenerateInlineContent {
fn generate(
&self,
args: &PromptArguments,
max_token_length: Option<usize>,
) -> anyhow::Result<(String, usize)> {
let Some(user_prompt) = &args.user_prompt else {
return Err(anyhow!("user prompt not provided"));
};
let file_type = args.get_file_type();
let content_type = match &file_type {
PromptFileType::Code => "code",
PromptFileType::Text => "text",
};
let mut prompt = String::new();
if let Some(selected_range) = &args.selected_range {
if selected_range.start == selected_range.end {
writeln!(
prompt,
"Assume the cursor is located where the `<|START|>` span is."
)
.unwrap();
writeln!(
prompt,
"{} can't be replaced, so assume your answer will be inserted at the cursor.",
capitalize(content_type)
)
.unwrap();
writeln!(
prompt,
"Generate {content_type} based on the users prompt: {user_prompt}",
)
.unwrap();
} else {
writeln!(prompt, "Modify the user's selected {content_type} based upon the users prompt: '{user_prompt}'").unwrap();
writeln!(prompt, "You must reply with only the adjusted {content_type} (within the '<|START|' and '|END|>' spans) not the entire file.").unwrap();
writeln!(prompt, "Double check that you only return code and not the '<|START|' and '|END|'> spans").unwrap();
}
} else {
writeln!(
prompt,
"Generate {content_type} based on the users prompt: {user_prompt}"
)
.unwrap();
}
if let Some(language_name) = &args.language_name {
writeln!(
prompt,
"Your answer MUST always and only be valid {}.",
language_name
)
.unwrap();
}
writeln!(prompt, "Never make remarks about the output.").unwrap();
writeln!(
prompt,
"Do not return anything else, except the generated {content_type}."
)
.unwrap();
match file_type {
PromptFileType::Code => {
// writeln!(prompt, "Always wrap your code in a Markdown block.").unwrap();
}
_ => {}
}
// Naive truncation strategy: if the prompt exceeds the budget, cut tokens from the end.
if let Some(max_tokens) = max_token_length {
prompt = args.model.truncate(
&prompt,
max_tokens,
crate::models::TruncationDirection::End,
)?;
}
let token_count = args.model.count_tokens(&prompt)?;
anyhow::Ok((prompt, token_count))
}
}


@@ -1,5 +0,0 @@
pub mod base;
pub mod file_context;
pub mod generate;
pub mod preamble;
pub mod repository_context;


@@ -1,52 +0,0 @@
use crate::prompts::base::{PromptArguments, PromptFileType, PromptTemplate};
use std::fmt::Write;
pub struct EngineerPreamble {}
impl PromptTemplate for EngineerPreamble {
fn generate(
&self,
args: &PromptArguments,
max_token_length: Option<usize>,
) -> anyhow::Result<(String, usize)> {
let mut prompts = Vec::new();
match args.get_file_type() {
PromptFileType::Code => {
prompts.push(format!(
"You are an expert {}engineer.",
args.language_name.clone().map(|name| name + " ").unwrap_or_default()
));
}
PromptFileType::Text => {
prompts.push("You are an expert engineer.".to_string());
}
}
if let Some(project_name) = args.project_name.clone() {
prompts.push(format!(
"You are currently working inside the '{project_name}' project in code editor Zed."
));
}
if let Some(mut remaining_tokens) = max_token_length {
let mut prompt = String::new();
let mut total_count = 0;
for prompt_piece in prompts {
let prompt_token_count =
args.model.count_tokens(&prompt_piece)? + args.model.count_tokens("\n")?;
if remaining_tokens > prompt_token_count {
writeln!(prompt, "{prompt_piece}").unwrap();
remaining_tokens -= prompt_token_count;
total_count += prompt_token_count;
}
}
anyhow::Ok((prompt, total_count))
} else {
let prompt = prompts.join("\n");
let token_count = args.model.count_tokens(&prompt)?;
anyhow::Ok((prompt, token_count))
}
}
}


@@ -1,96 +0,0 @@
use crate::prompts::base::{PromptArguments, PromptTemplate};
use std::fmt::Write;
use std::{ops::Range, path::PathBuf};
use gpui::{AsyncAppContext, Model};
use language::{Anchor, Buffer};
#[derive(Clone)]
pub struct PromptCodeSnippet {
path: Option<PathBuf>,
language_name: Option<String>,
content: String,
}
impl PromptCodeSnippet {
pub fn new(
buffer: Model<Buffer>,
range: Range<Anchor>,
cx: &mut AsyncAppContext,
) -> anyhow::Result<Self> {
let (content, language_name, file_path) = buffer.update(cx, |buffer, _| {
let snapshot = buffer.snapshot();
let content = snapshot.text_for_range(range.clone()).collect::<String>();
let language_name = buffer
.language()
.map(|language| language.name().to_string().to_lowercase());
let file_path = buffer.file().map(|file| file.path().to_path_buf());
(content, language_name, file_path)
})?;
anyhow::Ok(PromptCodeSnippet {
path: file_path,
language_name,
content,
})
}
}
impl ToString for PromptCodeSnippet {
fn to_string(&self) -> String {
let path = self
.path
.as_ref()
.map(|path| path.to_string_lossy().to_string())
.unwrap_or("".to_string());
let language_name = self.language_name.clone().unwrap_or("".to_string());
let content = self.content.clone();
format!("The below code snippet may be relevant from file: {path}\n```{language_name}\n{content}\n```")
}
}
pub struct RepositoryContext {}
impl PromptTemplate for RepositoryContext {
fn generate(
&self,
args: &PromptArguments,
max_token_length: Option<usize>,
) -> anyhow::Result<(String, usize)> {
const MAXIMUM_SNIPPET_TOKEN_COUNT: usize = 500;
let template = "You are working inside a large repository, here are a few code snippets that may be useful.";
let mut prompt = String::new();
let mut remaining_tokens = max_token_length;
let separator_token_length = args.model.count_tokens("\n")?;
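// Include each snippet that is within the per-snippet cap and, when a budget was given, still fits in the remaining tokens.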
for snippet in &args.snippets {
let mut snippet_prompt = template.to_string();
let content = snippet.to_string();
writeln!(snippet_prompt, "{content}").unwrap();
let token_count = args.model.count_tokens(&snippet_prompt)?;
if token_count <= MAXIMUM_SNIPPET_TOKEN_COUNT {
if let Some(tokens_left) = remaining_tokens {
if tokens_left >= token_count {
writeln!(prompt, "{snippet_prompt}").unwrap();
remaining_tokens = if tokens_left >= (token_count + separator_token_length)
{
Some(tokens_left - token_count - separator_token_length)
} else {
Some(0)
};
}
} else {
writeln!(prompt, "{snippet_prompt}").unwrap();
}
}
}
let total_token_count = args.model.count_tokens(&prompt)?;
anyhow::Ok((prompt, total_token_count))
}
}


@@ -1 +0,0 @@
pub mod open_ai;


@@ -1,9 +0,0 @@
pub mod completion;
pub mod embedding;
pub mod model;
pub use completion::*;
pub use embedding::*;
pub use model::OpenAiLanguageModel;
pub const OPEN_AI_API_URL: &str = "https://api.openai.com/v1";


@@ -1,421 +0,0 @@
use std::{
env,
fmt::{self, Display},
io,
sync::Arc,
};
use anyhow::{anyhow, Result};
use futures::{
future::BoxFuture, io::BufReader, stream::BoxStream, AsyncBufReadExt, AsyncReadExt, FutureExt,
Stream, StreamExt,
};
use gpui::{AppContext, BackgroundExecutor};
use isahc::{http::StatusCode, Request, RequestExt};
use parking_lot::RwLock;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use util::ResultExt;
use crate::providers::open_ai::{OpenAiLanguageModel, OPEN_AI_API_URL};
use crate::{
auth::{CredentialProvider, ProviderCredential},
completion::{CompletionProvider, CompletionRequest},
models::LanguageModel,
};
#[derive(Clone, Copy, Serialize, Deserialize, Debug, Eq, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum Role {
User,
Assistant,
System,
}
impl Role {
pub fn cycle(&mut self) {
*self = match self {
Role::User => Role::Assistant,
Role::Assistant => Role::System,
Role::System => Role::User,
}
}
}
impl Display for Role {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Role::User => write!(f, "User"),
Role::Assistant => write!(f, "Assistant"),
Role::System => write!(f, "System"),
}
}
}
#[derive(Serialize, Deserialize, Debug, Eq, PartialEq)]
pub struct RequestMessage {
pub role: Role,
pub content: String,
}
#[derive(Debug, Default, Serialize)]
pub struct OpenAiRequest {
pub model: String,
pub messages: Vec<RequestMessage>,
pub stream: bool,
pub stop: Vec<String>,
pub temperature: f32,
}
impl CompletionRequest for OpenAiRequest {
fn data(&self) -> serde_json::Result<String> {
serde_json::to_string(self)
}
}
#[derive(Serialize, Deserialize, Debug, Eq, PartialEq)]
pub struct ResponseMessage {
pub role: Option<Role>,
pub content: Option<String>,
}
#[derive(Deserialize, Debug)]
pub struct OpenAiUsage {
pub prompt_tokens: u32,
pub completion_tokens: u32,
pub total_tokens: u32,
}
#[derive(Deserialize, Debug)]
pub struct ChatChoiceDelta {
pub index: u32,
pub delta: ResponseMessage,
pub finish_reason: Option<String>,
}
#[derive(Deserialize, Debug)]
pub struct OpenAiResponseStreamEvent {
pub id: Option<String>,
pub object: String,
pub created: u32,
pub model: String,
pub choices: Vec<ChatChoiceDelta>,
pub usage: Option<OpenAiUsage>,
}
async fn stream_completion(
api_url: String,
kind: OpenAiCompletionProviderKind,
credential: ProviderCredential,
executor: BackgroundExecutor,
request: Box<dyn CompletionRequest>,
) -> Result<impl Stream<Item = Result<OpenAiResponseStreamEvent>>> {
let api_key = match credential {
ProviderCredential::Credentials { api_key } => api_key,
_ => {
return Err(anyhow!("no credentials provider for completion"));
}
};
let (tx, rx) = futures::channel::mpsc::unbounded::<Result<OpenAiResponseStreamEvent>>();
let (auth_header_name, auth_header_value) = kind.auth_header(api_key);
let json_data = request.data()?;
let mut response = Request::post(kind.completions_endpoint_url(&api_url))
.header("Content-Type", "application/json")
.header(auth_header_name, auth_header_value)
.body(json_data)?
.send_async()
.await?;
let status = response.status();
if status == StatusCode::OK {
executor
.spawn(async move {
let mut lines = BufReader::new(response.body_mut()).lines();
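// Each server-sent-event line is prefixed with "data: "; forward parsed events until a choice reports a finish_reason.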
fn parse_line(
line: Result<String, io::Error>,
) -> Result<Option<OpenAiResponseStreamEvent>> {
if let Some(data) = line?.strip_prefix("data: ") {
let event = serde_json::from_str(data)?;
Ok(Some(event))
} else {
Ok(None)
}
}
while let Some(line) = lines.next().await {
if let Some(event) = parse_line(line).transpose() {
let done = event.as_ref().map_or(false, |event| {
event
.choices
.last()
.map_or(false, |choice| choice.finish_reason.is_some())
});
if tx.unbounded_send(event).is_err() {
break;
}
if done {
break;
}
}
}
anyhow::Ok(())
})
.detach();
Ok(rx)
} else {
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
#[derive(Deserialize)]
struct OpenAiResponse {
error: OpenAiError,
}
#[derive(Deserialize)]
struct OpenAiError {
message: String,
}
match serde_json::from_str::<OpenAiResponse>(&body) {
Ok(response) if !response.error.message.is_empty() => Err(anyhow!(
"Failed to connect to OpenAI API: {}",
response.error.message,
)),
_ => Err(anyhow!(
"Failed to connect to OpenAI API: {} {}",
response.status(),
body,
)),
}
}
}
#[derive(Debug, Clone, Copy, Serialize, Deserialize, JsonSchema)]
pub enum AzureOpenAiApiVersion {
/// Retiring April 2, 2024.
#[serde(rename = "2023-03-15-preview")]
V2023_03_15Preview,
#[serde(rename = "2023-05-15")]
V2023_05_15,
/// Retiring April 2, 2024.
#[serde(rename = "2023-06-01-preview")]
V2023_06_01Preview,
/// Retiring April 2, 2024.
#[serde(rename = "2023-07-01-preview")]
V2023_07_01Preview,
/// Retiring April 2, 2024.
#[serde(rename = "2023-08-01-preview")]
V2023_08_01Preview,
/// Retiring April 2, 2024.
#[serde(rename = "2023-09-01-preview")]
V2023_09_01Preview,
#[serde(rename = "2023-12-01-preview")]
V2023_12_01Preview,
#[serde(rename = "2024-02-15-preview")]
V2024_02_15Preview,
}
impl fmt::Display for AzureOpenAiApiVersion {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(
f,
"{}",
match self {
Self::V2023_03_15Preview => "2023-03-15-preview",
Self::V2023_05_15 => "2023-05-15",
Self::V2023_06_01Preview => "2023-06-01-preview",
Self::V2023_07_01Preview => "2023-07-01-preview",
Self::V2023_08_01Preview => "2023-08-01-preview",
Self::V2023_09_01Preview => "2023-09-01-preview",
Self::V2023_12_01Preview => "2023-12-01-preview",
Self::V2024_02_15Preview => "2024-02-15-preview",
}
)
}
}
#[derive(Clone)]
pub enum OpenAiCompletionProviderKind {
OpenAi,
AzureOpenAi {
deployment_id: String,
api_version: AzureOpenAiApiVersion,
},
}
impl OpenAiCompletionProviderKind {
/// Returns the chat completion endpoint URL for this [`OpenAiCompletionProviderKind`].
fn completions_endpoint_url(&self, api_url: &str) -> String {
match self {
Self::OpenAi => {
// https://platform.openai.com/docs/api-reference/chat/create
format!("{api_url}/chat/completions")
}
Self::AzureOpenAi {
deployment_id,
api_version,
} => {
// https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#chat-completions
format!("{api_url}/openai/deployments/{deployment_id}/chat/completions?api-version={api_version}")
}
}
}
/// Returns the authentication header for this [`OpenAiCompletionProviderKind`].
fn auth_header(&self, api_key: String) -> (&'static str, String) {
match self {
Self::OpenAi => ("Authorization", format!("Bearer {api_key}")),
Self::AzureOpenAi { .. } => ("Api-Key", api_key),
}
}
}
#[derive(Clone)]
pub struct OpenAiCompletionProvider {
api_url: String,
kind: OpenAiCompletionProviderKind,
model: OpenAiLanguageModel,
credential: Arc<RwLock<ProviderCredential>>,
executor: BackgroundExecutor,
}
impl OpenAiCompletionProvider {
pub async fn new(
api_url: String,
kind: OpenAiCompletionProviderKind,
model_name: String,
executor: BackgroundExecutor,
) -> Self {
let model = executor
.spawn(async move { OpenAiLanguageModel::load(&model_name) })
.await;
let credential = Arc::new(RwLock::new(ProviderCredential::NoCredentials));
Self {
api_url,
kind,
model,
credential,
executor,
}
}
}
impl CredentialProvider for OpenAiCompletionProvider {
fn has_credentials(&self) -> bool {
match *self.credential.read() {
ProviderCredential::Credentials { .. } => true,
_ => false,
}
}
fn retrieve_credentials(&self, cx: &mut AppContext) -> BoxFuture<ProviderCredential> {
let existing_credential = self.credential.read().clone();
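// Prefer credentials already held in memory, then the OPENAI_API_KEY environment variable, then any credentials stored for OPEN_AI_API_URL in the OS credential store.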
let retrieved_credential = match existing_credential {
ProviderCredential::Credentials { .. } => {
return async move { existing_credential }.boxed()
}
_ => {
if let Some(api_key) = env::var("OPENAI_API_KEY").log_err() {
async move { ProviderCredential::Credentials { api_key } }.boxed()
} else {
let credentials = cx.read_credentials(OPEN_AI_API_URL);
async move {
if let Some(Some((_, api_key))) = credentials.await.log_err() {
if let Some(api_key) = String::from_utf8(api_key).log_err() {
ProviderCredential::Credentials { api_key }
} else {
ProviderCredential::NoCredentials
}
} else {
ProviderCredential::NoCredentials
}
}
.boxed()
}
}
};
async move {
let retrieved_credential = retrieved_credential.await;
*self.credential.write() = retrieved_credential.clone();
retrieved_credential
}
.boxed()
}
fn save_credentials(
&self,
cx: &mut AppContext,
credential: ProviderCredential,
) -> BoxFuture<()> {
*self.credential.write() = credential.clone();
let credential = credential.clone();
let write_credentials = match credential {
ProviderCredential::Credentials { api_key } => {
Some(cx.write_credentials(OPEN_AI_API_URL, "Bearer", api_key.as_bytes()))
}
_ => None,
};
async move {
if let Some(write_credentials) = write_credentials {
write_credentials.await.log_err();
}
}
.boxed()
}
fn delete_credentials(&self, cx: &mut AppContext) -> BoxFuture<()> {
*self.credential.write() = ProviderCredential::NoCredentials;
let delete_credentials = cx.delete_credentials(OPEN_AI_API_URL);
async move {
delete_credentials.await.log_err();
}
.boxed()
}
}
impl CompletionProvider for OpenAiCompletionProvider {
fn base_model(&self) -> Box<dyn LanguageModel> {
let model: Box<dyn LanguageModel> = Box::new(self.model.clone());
model
}
fn complete(
&self,
prompt: Box<dyn CompletionRequest>,
) -> BoxFuture<'static, Result<BoxStream<'static, Result<String>>>> {
// Currently the CompletionRequest for OpenAI includes a 'model' parameter.
// This means the model is determined by the CompletionRequest and not by the CompletionProvider,
// even though the provider itself already holds a language model.
// At some point in the future we should rectify this.
let credential = self.credential.read().clone();
let api_url = self.api_url.clone();
let kind = self.kind.clone();
let request = stream_completion(api_url, kind, credential, self.executor.clone(), prompt);
async move {
let response = request.await?;
let stream = response
.filter_map(|response| async move {
match response {
Ok(mut response) => Some(Ok(response.choices.pop()?.delta.content?)),
Err(error) => Some(Err(error)),
}
})
.boxed();
Ok(stream)
}
.boxed()
}
fn box_clone(&self) -> Box<dyn CompletionProvider> {
Box::new((*self).clone())
}
}


@@ -1,345 +0,0 @@
use anyhow::{anyhow, Result};
use async_trait::async_trait;
use futures::future::BoxFuture;
use futures::AsyncReadExt;
use futures::FutureExt;
use gpui::AppContext;
use gpui::BackgroundExecutor;
use isahc::http::StatusCode;
use isahc::prelude::Configurable;
use isahc::{AsyncBody, Response};
use parking_lot::{Mutex, RwLock};
use parse_duration::parse;
use postage::watch;
use serde::{Deserialize, Serialize};
use serde_json;
use std::env;
use std::ops::Add;
use std::sync::{Arc, OnceLock};
use std::time::{Duration, Instant};
use tiktoken_rs::{cl100k_base, CoreBPE};
use util::http::{HttpClient, Request};
use util::ResultExt;
use crate::auth::{CredentialProvider, ProviderCredential};
use crate::embedding::{Embedding, EmbeddingProvider};
use crate::models::LanguageModel;
use crate::providers::open_ai::OpenAiLanguageModel;
use crate::providers::open_ai::OPEN_AI_API_URL;
pub(crate) fn open_ai_bpe_tokenizer() -> &'static CoreBPE {
static OPEN_AI_BPE_TOKENIZER: OnceLock<CoreBPE> = OnceLock::new();
OPEN_AI_BPE_TOKENIZER.get_or_init(|| cl100k_base().unwrap())
}
#[derive(Clone)]
pub struct OpenAiEmbeddingProvider {
api_url: String,
model: OpenAiLanguageModel,
credential: Arc<RwLock<ProviderCredential>>,
pub client: Arc<dyn HttpClient>,
pub executor: BackgroundExecutor,
rate_limit_count_rx: watch::Receiver<Option<Instant>>,
rate_limit_count_tx: Arc<Mutex<watch::Sender<Option<Instant>>>>,
}
#[derive(Serialize)]
struct OpenAiEmbeddingRequest<'a> {
model: &'static str,
input: Vec<&'a str>,
}
#[derive(Deserialize)]
struct OpenAiEmbeddingResponse {
data: Vec<OpenAiEmbedding>,
usage: OpenAiEmbeddingUsage,
}
#[derive(Debug, Deserialize)]
struct OpenAiEmbedding {
embedding: Vec<f32>,
index: usize,
object: String,
}
#[derive(Deserialize)]
struct OpenAiEmbeddingUsage {
prompt_tokens: usize,
total_tokens: usize,
}
impl OpenAiEmbeddingProvider {
pub async fn new(
api_url: String,
client: Arc<dyn HttpClient>,
executor: BackgroundExecutor,
) -> Self {
let (rate_limit_count_tx, rate_limit_count_rx) = watch::channel_with(None);
let rate_limit_count_tx = Arc::new(Mutex::new(rate_limit_count_tx));
// Loading the model is expensive, so ensure this runs off the main thread.
let model = executor
.spawn(async move { OpenAiLanguageModel::load("text-embedding-ada-002") })
.await;
let credential = Arc::new(RwLock::new(ProviderCredential::NoCredentials));
OpenAiEmbeddingProvider {
api_url,
model,
credential,
client,
executor,
rate_limit_count_rx,
rate_limit_count_tx,
}
}
fn get_api_key(&self) -> Result<String> {
match self.credential.read().clone() {
ProviderCredential::Credentials { api_key } => Ok(api_key),
_ => Err(anyhow!("api credentials not provided")),
}
}
fn resolve_rate_limit(&self) {
let reset_time = *self.rate_limit_count_tx.lock().borrow();
if let Some(reset_time) = reset_time {
if Instant::now() >= reset_time {
*self.rate_limit_count_tx.lock().borrow_mut() = None
}
}
log::trace!(
"resolving reset time: {:?}",
*self.rate_limit_count_tx.lock().borrow()
);
}
fn update_reset_time(&self, reset_time: Instant) {
let original_time = *self.rate_limit_count_tx.lock().borrow();
let updated_time = if let Some(original_time) = original_time {
if reset_time < original_time {
Some(reset_time)
} else {
Some(original_time)
}
} else {
Some(reset_time)
};
log::trace!("updating rate limit time: {:?}", updated_time);
*self.rate_limit_count_tx.lock().borrow_mut() = updated_time;
}
async fn send_request(
&self,
api_url: &str,
api_key: &str,
spans: Vec<&str>,
request_timeout: u64,
) -> Result<Response<AsyncBody>> {
let request = Request::post(format!("{api_url}/embeddings"))
.redirect_policy(isahc::config::RedirectPolicy::Follow)
.timeout(Duration::from_secs(request_timeout))
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {}", api_key))
.body(
serde_json::to_string(&OpenAiEmbeddingRequest {
input: spans.clone(),
model: "text-embedding-ada-002",
})
.unwrap()
.into(),
)?;
Ok(self.client.send(request).await?)
}
}
impl CredentialProvider for OpenAiEmbeddingProvider {
fn has_credentials(&self) -> bool {
match *self.credential.read() {
ProviderCredential::Credentials { .. } => true,
_ => false,
}
}
fn retrieve_credentials(&self, cx: &mut AppContext) -> BoxFuture<ProviderCredential> {
let existing_credential = self.credential.read().clone();
let retrieved_credential = match existing_credential {
ProviderCredential::Credentials { .. } => {
return async move { existing_credential }.boxed()
}
_ => {
if let Some(api_key) = env::var("OPENAI_API_KEY").log_err() {
async move { ProviderCredential::Credentials { api_key } }.boxed()
} else {
let credentials = cx.read_credentials(OPEN_AI_API_URL);
async move {
if let Some(Some((_, api_key))) = credentials.await.log_err() {
if let Some(api_key) = String::from_utf8(api_key).log_err() {
ProviderCredential::Credentials { api_key }
} else {
ProviderCredential::NoCredentials
}
} else {
ProviderCredential::NoCredentials
}
}
.boxed()
}
}
};
async move {
let retrieved_credential = retrieved_credential.await;
*self.credential.write() = retrieved_credential.clone();
retrieved_credential
}
.boxed()
}
fn save_credentials(
&self,
cx: &mut AppContext,
credential: ProviderCredential,
) -> BoxFuture<()> {
*self.credential.write() = credential.clone();
let credential = credential.clone();
let write_credentials = match credential {
ProviderCredential::Credentials { api_key } => {
Some(cx.write_credentials(OPEN_AI_API_URL, "Bearer", api_key.as_bytes()))
}
_ => None,
};
async move {
if let Some(write_credentials) = write_credentials {
write_credentials.await.log_err();
}
}
.boxed()
}
fn delete_credentials(&self, cx: &mut AppContext) -> BoxFuture<()> {
*self.credential.write() = ProviderCredential::NoCredentials;
let delete_credentials = cx.delete_credentials(OPEN_AI_API_URL);
async move {
delete_credentials.await.log_err();
}
.boxed()
}
}
#[async_trait]
impl EmbeddingProvider for OpenAiEmbeddingProvider {
fn base_model(&self) -> Box<dyn LanguageModel> {
let model: Box<dyn LanguageModel> = Box::new(self.model.clone());
model
}
fn max_tokens_per_batch(&self) -> usize {
50000
}
fn rate_limit_expiration(&self) -> Option<Instant> {
*self.rate_limit_count_rx.borrow()
}
async fn embed_batch(&self, spans: Vec<String>) -> Result<Vec<Embedding>> {
const BACKOFF_SECONDS: [usize; 4] = [3, 5, 15, 45];
const MAX_RETRIES: usize = 4;
let api_url = self.api_url.as_str();
let api_key = self.get_api_key()?;
let mut request_number = 0;
let mut rate_limiting = false;
let mut request_timeout: u64 = 15;
let mut response: Response<AsyncBody>;
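// Retry up to MAX_RETRIES times: grow the timeout on request timeouts, and on 429s back off using the server-provided reset header (falling back to BACKOFF_SECONDS).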
while request_number < MAX_RETRIES {
response = self
.send_request(
&api_url,
&api_key,
spans.iter().map(|x| &**x).collect(),
request_timeout,
)
.await?;
request_number += 1;
match response.status() {
StatusCode::REQUEST_TIMEOUT => {
request_timeout += 5;
}
StatusCode::OK => {
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
let response: OpenAiEmbeddingResponse = serde_json::from_str(&body)?;
log::trace!(
"openai embedding completed. tokens: {:?}",
response.usage.total_tokens
);
// If a request succeeds after we were previously rate-limited,
// resolve (clear) the stored rate limit.
if rate_limiting {
self.resolve_rate_limit()
}
return Ok(response
.data
.into_iter()
.map(|embedding| Embedding::from(embedding.embedding))
.collect());
}
StatusCode::TOO_MANY_REQUESTS => {
rate_limiting = true;
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
let delay_duration = {
let delay = Duration::from_secs(BACKOFF_SECONDS[request_number - 1] as u64);
if let Some(time_to_reset) =
response.headers().get("x-ratelimit-reset-tokens")
{
if let Ok(time_str) = time_to_reset.to_str() {
parse(time_str).unwrap_or(delay)
} else {
delay
}
} else {
delay
}
};
// Record when the rate limit is expected to lift; the earliest known reset time is kept.
let reset_time = Instant::now().add(delay_duration);
self.update_reset_time(reset_time);
log::trace!(
"openai rate limiting: waiting {:?} until lifted",
&delay_duration
);
self.executor.timer(delay_duration).await;
}
_ => {
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
return Err(anyhow!(
"open ai bad request: {:?} {:?}",
&response.status(),
body
));
}
}
}
Err(anyhow!("openai max retries"))
}
}


@@ -1,59 +0,0 @@
use anyhow::anyhow;
use tiktoken_rs::CoreBPE;
use crate::models::{LanguageModel, TruncationDirection};
use super::open_ai_bpe_tokenizer;
#[derive(Clone)]
pub struct OpenAiLanguageModel {
name: String,
bpe: Option<CoreBPE>,
}
impl OpenAiLanguageModel {
pub fn load(model_name: &str) -> Self {
let bpe = tiktoken_rs::get_bpe_from_model(model_name)
.unwrap_or(open_ai_bpe_tokenizer().to_owned());
OpenAiLanguageModel {
name: model_name.to_string(),
bpe: Some(bpe),
}
}
}
impl LanguageModel for OpenAiLanguageModel {
fn name(&self) -> String {
self.name.clone()
}
fn count_tokens(&self, content: &str) -> anyhow::Result<usize> {
if let Some(bpe) = &self.bpe {
anyhow::Ok(bpe.encode_with_special_tokens(content).len())
} else {
Err(anyhow!("bpe for open ai model was not retrieved"))
}
}
fn truncate(
&self,
content: &str,
length: usize,
direction: TruncationDirection,
) -> anyhow::Result<String> {
if let Some(bpe) = &self.bpe {
let tokens = bpe.encode_with_special_tokens(content);
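// `End` keeps the first `length` tokens; `Start` drops the first `length` tokens.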
if tokens.len() > length {
match direction {
TruncationDirection::End => bpe.decode(tokens[..length].to_vec()),
TruncationDirection::Start => bpe.decode(tokens[length..].to_vec()),
}
} else {
bpe.decode(tokens)
}
} else {
Err(anyhow!("bpe for open ai model was not retrieved"))
}
}
fn capacity(&self) -> anyhow::Result<usize> {
anyhow::Ok(tiktoken_rs::model::get_context_size(&self.name))
}
}


@@ -1,206 +0,0 @@
use std::{
sync::atomic::{self, AtomicUsize, Ordering},
time::Instant,
};
use async_trait::async_trait;
use futures::{channel::mpsc, future::BoxFuture, stream::BoxStream, FutureExt, StreamExt};
use gpui::AppContext;
use parking_lot::Mutex;
use crate::{
auth::{CredentialProvider, ProviderCredential},
completion::{CompletionProvider, CompletionRequest},
embedding::{Embedding, EmbeddingProvider},
models::{LanguageModel, TruncationDirection},
};
#[derive(Clone)]
pub struct FakeLanguageModel {
pub capacity: usize,
}
impl LanguageModel for FakeLanguageModel {
fn name(&self) -> String {
"dummy".to_string()
}
fn count_tokens(&self, content: &str) -> anyhow::Result<usize> {
anyhow::Ok(content.chars().collect::<Vec<char>>().len())
}
fn truncate(
&self,
content: &str,
length: usize,
direction: TruncationDirection,
) -> anyhow::Result<String> {
println!("TRYING TO TRUNCATE: {:?}", length.clone());
if length > self.count_tokens(content)? {
println!("NOT TRUNCATING");
return anyhow::Ok(content.to_string());
}
anyhow::Ok(match direction {
TruncationDirection::End => content.chars().collect::<Vec<char>>()[..length]
.into_iter()
.collect::<String>(),
TruncationDirection::Start => content.chars().collect::<Vec<char>>()[length..]
.into_iter()
.collect::<String>(),
})
}
fn capacity(&self) -> anyhow::Result<usize> {
anyhow::Ok(self.capacity)
}
}
#[derive(Default)]
pub struct FakeEmbeddingProvider {
pub embedding_count: AtomicUsize,
}
impl Clone for FakeEmbeddingProvider {
fn clone(&self) -> Self {
FakeEmbeddingProvider {
embedding_count: AtomicUsize::new(self.embedding_count.load(Ordering::SeqCst)),
}
}
}
impl FakeEmbeddingProvider {
pub fn embedding_count(&self) -> usize {
self.embedding_count.load(atomic::Ordering::SeqCst)
}
pub fn embed_sync(&self, span: &str) -> Embedding {
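// Build a 26-dimensional letter-frequency vector and normalize it to unit length.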
let mut result = vec![1.0; 26];
for letter in span.chars() {
let letter = letter.to_ascii_lowercase();
if letter as u32 >= 'a' as u32 {
let ix = (letter as u32) - ('a' as u32);
if ix < 26 {
result[ix as usize] += 1.0;
}
}
}
let norm = result.iter().map(|x| x * x).sum::<f32>().sqrt();
for x in &mut result {
*x /= norm;
}
result.into()
}
}
impl CredentialProvider for FakeEmbeddingProvider {
fn has_credentials(&self) -> bool {
true
}
fn retrieve_credentials(&self, _cx: &mut AppContext) -> BoxFuture<ProviderCredential> {
async { ProviderCredential::NotNeeded }.boxed()
}
fn save_credentials(
&self,
_cx: &mut AppContext,
_credential: ProviderCredential,
) -> BoxFuture<()> {
async {}.boxed()
}
fn delete_credentials(&self, _cx: &mut AppContext) -> BoxFuture<()> {
async {}.boxed()
}
}
#[async_trait]
impl EmbeddingProvider for FakeEmbeddingProvider {
fn base_model(&self) -> Box<dyn LanguageModel> {
Box::new(FakeLanguageModel { capacity: 1000 })
}
fn max_tokens_per_batch(&self) -> usize {
1000
}
fn rate_limit_expiration(&self) -> Option<Instant> {
None
}
async fn embed_batch(&self, spans: Vec<String>) -> anyhow::Result<Vec<Embedding>> {
self.embedding_count
.fetch_add(spans.len(), atomic::Ordering::SeqCst);
anyhow::Ok(spans.iter().map(|span| self.embed_sync(span)).collect())
}
}
pub struct FakeCompletionProvider {
last_completion_tx: Mutex<Option<mpsc::Sender<String>>>,
}
impl Clone for FakeCompletionProvider {
fn clone(&self) -> Self {
Self {
last_completion_tx: Mutex::new(None),
}
}
}
impl FakeCompletionProvider {
pub fn new() -> Self {
Self {
last_completion_tx: Mutex::new(None),
}
}
pub fn send_completion(&self, completion: impl Into<String>) {
let mut tx = self.last_completion_tx.lock();
tx.as_mut().unwrap().try_send(completion.into()).unwrap();
}
pub fn finish_completion(&self) {
self.last_completion_tx.lock().take().unwrap();
}
}
impl CredentialProvider for FakeCompletionProvider {
fn has_credentials(&self) -> bool {
true
}
fn retrieve_credentials(&self, _cx: &mut AppContext) -> BoxFuture<ProviderCredential> {
async { ProviderCredential::NotNeeded }.boxed()
}
fn save_credentials(
&self,
_cx: &mut AppContext,
_credential: ProviderCredential,
) -> BoxFuture<()> {
async {}.boxed()
}
fn delete_credentials(&self, _cx: &mut AppContext) -> BoxFuture<()> {
async {}.boxed()
}
}
impl CompletionProvider for FakeCompletionProvider {
fn base_model(&self) -> Box<dyn LanguageModel> {
let model: Box<dyn LanguageModel> = Box::new(FakeLanguageModel { capacity: 8190 });
model
}
fn complete(
&self,
_prompt: Box<dyn CompletionRequest>,
) -> BoxFuture<'static, anyhow::Result<BoxStream<'static, anyhow::Result<String>>>> {
let (tx, rx) = mpsc::channel(1);
*self.last_completion_tx.lock() = Some(tx);
async move { Ok(rx.map(|rx| Ok(rx)).boxed()) }.boxed()
}
fn box_clone(&self) -> Box<dyn CompletionProvider> {
Box::new((*self).clone())
}
}


@@ -5,18 +5,16 @@ edition = "2021"
publish = false
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/assistant.rs"
doctest = false
[dependencies]
ai.workspace = true
anyhow.workspace = true
chrono.workspace = true
client.workspace = true
collections.workspace = true
command_palette_hooks.workspace = true
editor.workspace = true
fs.workspace = true
futures.workspace = true
@@ -26,12 +24,13 @@ language.workspace = true
log.workspace = true
menu.workspace = true
multi_buffer.workspace = true
open_ai = { workspace = true, features = ["schemars"] }
ordered-float.workspace = true
parking_lot.workspace = true
project.workspace = true
regex.workspace = true
schemars.workspace = true
search.workspace = true
semantic_index.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
@@ -45,7 +44,6 @@ uuid.workspace = true
workspace.workspace = true
[dev-dependencies]
ai = { workspace = true, features = ["test-support"] }
ctor.workspace = true
editor = { workspace = true, features = ["test-support"] }
env_logger.workspace = true


@@ -1,22 +1,25 @@
pub mod assistant_panel;
pub mod assistant_settings;
mod codegen;
mod completion_provider;
mod prompts;
mod saved_conversation;
mod streaming_diff;
use ai::providers::open_ai::Role;
use anyhow::Result;
pub use assistant_panel::AssistantPanel;
use assistant_settings::OpenAiModel;
use assistant_settings::{AssistantSettings, OpenAiModel, ZedDotDevModel};
use chrono::{DateTime, Local};
use collections::HashMap;
use fs::Fs;
use futures::StreamExt;
use gpui::{actions, AppContext, SharedString};
use regex::Regex;
use client::{proto, Client};
use command_palette_hooks::CommandPaletteFilter;
pub(crate) use completion_provider::*;
use gpui::{actions, AppContext, BorrowAppContext, Global, SharedString};
pub(crate) use saved_conversation::*;
use serde::{Deserialize, Serialize};
use std::{cmp::Reverse, ffi::OsStr, path::PathBuf, sync::Arc};
use util::paths::CONVERSATIONS_DIR;
use settings::{Settings, SettingsStore};
use std::{
fmt::{self, Display},
sync::Arc,
};
actions!(
assistant,
@@ -30,7 +33,6 @@ actions!(
ResetKey,
InlineAssist,
ToggleIncludeConversation,
ToggleRetrieveContext,
]
);
@@ -39,6 +41,134 @@ actions!(
)]
struct MessageId(usize);
#[derive(Clone, Copy, Serialize, Deserialize, Debug, Eq, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum Role {
User,
Assistant,
System,
}
impl Role {
pub fn cycle(&mut self) {
*self = match self {
Role::User => Role::Assistant,
Role::Assistant => Role::System,
Role::System => Role::User,
}
}
}
impl Display for Role {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Role::User => write!(f, "user"),
Role::Assistant => write!(f, "assistant"),
Role::System => write!(f, "system"),
}
}
}
#[derive(Clone, Debug, Serialize, Deserialize, PartialEq)]
pub enum LanguageModel {
ZedDotDev(ZedDotDevModel),
OpenAi(OpenAiModel),
}
impl Default for LanguageModel {
fn default() -> Self {
LanguageModel::ZedDotDev(ZedDotDevModel::default())
}
}
impl LanguageModel {
pub fn telemetry_id(&self) -> String {
match self {
LanguageModel::OpenAi(model) => format!("openai/{}", model.id()),
LanguageModel::ZedDotDev(model) => format!("zed.dev/{}", model.id()),
}
}
pub fn display_name(&self) -> String {
match self {
LanguageModel::OpenAi(model) => format!("openai/{}", model.display_name()),
LanguageModel::ZedDotDev(model) => format!("zed.dev/{}", model.display_name()),
}
}
pub fn max_token_count(&self) -> usize {
match self {
LanguageModel::OpenAi(model) => model.max_token_count(),
LanguageModel::ZedDotDev(model) => model.max_token_count(),
}
}
pub fn id(&self) -> &str {
match self {
LanguageModel::OpenAi(model) => model.id(),
LanguageModel::ZedDotDev(model) => model.id(),
}
}
}
#[derive(Serialize, Deserialize, Debug, Eq, PartialEq)]
pub struct LanguageModelRequestMessage {
pub role: Role,
pub content: String,
}
impl LanguageModelRequestMessage {
pub fn to_proto(&self) -> proto::LanguageModelRequestMessage {
proto::LanguageModelRequestMessage {
role: match self.role {
Role::User => proto::LanguageModelRole::LanguageModelUser,
Role::Assistant => proto::LanguageModelRole::LanguageModelAssistant,
Role::System => proto::LanguageModelRole::LanguageModelSystem,
} as i32,
content: self.content.clone(),
}
}
}
#[derive(Debug, Default, Serialize)]
pub struct LanguageModelRequest {
pub model: LanguageModel,
pub messages: Vec<LanguageModelRequestMessage>,
pub stop: Vec<String>,
pub temperature: f32,
}
impl LanguageModelRequest {
pub fn to_proto(&self) -> proto::CompleteWithLanguageModel {
proto::CompleteWithLanguageModel {
model: self.model.id().to_string(),
messages: self.messages.iter().map(|m| m.to_proto()).collect(),
stop: self.stop.clone(),
temperature: self.temperature,
}
}
}
#[derive(Serialize, Deserialize, Debug, Eq, PartialEq)]
pub struct LanguageModelResponseMessage {
pub role: Option<Role>,
pub content: Option<String>,
}
#[derive(Deserialize, Debug)]
pub struct LanguageModelUsage {
pub prompt_tokens: u32,
pub completion_tokens: u32,
pub total_tokens: u32,
}
#[derive(Deserialize, Debug)]
pub struct LanguageModelChoiceDelta {
pub index: u32,
pub delta: LanguageModelResponseMessage,
pub finish_reason: Option<String>,
}
#[derive(Clone, Debug, Serialize, Deserialize)]
struct MessageMetadata {
role: Role,
@@ -53,72 +183,61 @@ enum MessageStatus {
Error(SharedString),
}
#[derive(Serialize, Deserialize)]
struct SavedMessage {
id: MessageId,
start: usize,
/// The state pertaining to the Assistant.
#[derive(Default)]
struct Assistant {
/// Whether the Assistant is enabled.
enabled: bool,
}
#[derive(Serialize, Deserialize)]
struct SavedConversation {
id: Option<String>,
zed: String,
version: String,
text: String,
messages: Vec<SavedMessage>,
message_metadata: HashMap<MessageId, MessageMetadata>,
summary: String,
api_url: Option<String>,
model: OpenAiModel,
}
impl Global for Assistant {}
impl SavedConversation {
const VERSION: &'static str = "0.1.0";
}
impl Assistant {
const NAMESPACE: &'static str = "assistant";
struct SavedConversationMetadata {
title: String,
path: PathBuf,
mtime: chrono::DateTime<chrono::Local>,
}
impl SavedConversationMetadata {
pub async fn list(fs: Arc<dyn Fs>) -> Result<Vec<Self>> {
fs.create_dir(&CONVERSATIONS_DIR).await?;
let mut paths = fs.read_dir(&CONVERSATIONS_DIR).await?;
let mut conversations = Vec::<SavedConversationMetadata>::new();
while let Some(path) = paths.next().await {
let path = path?;
if path.extension() != Some(OsStr::new("json")) {
continue;
}
let pattern = r" - \d+.zed.json$";
let re = Regex::new(pattern).unwrap();
let metadata = fs.metadata(&path).await?;
if let Some((file_name, metadata)) = path
.file_name()
.and_then(|name| name.to_str())
.zip(metadata)
{
let title = re.replace(file_name, "");
conversations.push(Self {
title: title.into_owned(),
path,
mtime: metadata.mtime.into(),
});
}
fn set_enabled(&mut self, enabled: bool, cx: &mut AppContext) {
if self.enabled == enabled {
return;
}
conversations.sort_unstable_by_key(|conversation| Reverse(conversation.mtime));
Ok(conversations)
self.enabled = enabled;
if !enabled {
CommandPaletteFilter::update_global(cx, |filter, _cx| {
filter.hide_namespace(Self::NAMESPACE);
});
return;
}
CommandPaletteFilter::update_global(cx, |filter, _cx| {
filter.show_namespace(Self::NAMESPACE);
});
}
}
pub fn init(cx: &mut AppContext) {
pub fn init(client: Arc<Client>, cx: &mut AppContext) {
cx.set_global(Assistant::default());
AssistantSettings::register(cx);
completion_provider::init(client, cx);
assistant_panel::init(cx);
CommandPaletteFilter::update_global(cx, |filter, _cx| {
filter.hide_namespace(Assistant::NAMESPACE);
});
cx.update_global(|assistant: &mut Assistant, cx: &mut AppContext| {
let settings = AssistantSettings::get_global(cx);
assistant.set_enabled(settings.enabled, cx);
});
cx.observe_global::<SettingsStore>(|cx| {
cx.update_global(|assistant: &mut Assistant, cx: &mut AppContext| {
let settings = AssistantSettings::get_global(cx);
assistant.set_enabled(settings.enabled, cx);
});
})
.detach();
}
#[cfg(test)]

File diff suppressed because it is too large


@@ -1,169 +1,295 @@
use ai::providers::open_ai::{
AzureOpenAiApiVersion, OpenAiCompletionProviderKind, OPEN_AI_API_URL,
};
use anyhow::anyhow;
use std::fmt;
use gpui::Pixels;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
pub use open_ai::Model as OpenAiModel;
use schemars::{
schema::{InstanceType, Metadata, Schema, SchemaObject},
JsonSchema,
};
use serde::{
de::{self, Visitor},
Deserialize, Deserializer, Serialize, Serializer,
};
use settings::Settings;
#[derive(Clone, Copy, Debug, Serialize, Deserialize, JsonSchema, PartialEq)]
#[serde(rename_all = "snake_case")]
pub enum OpenAiModel {
#[serde(rename = "gpt-3.5-turbo-0613")]
ThreePointFiveTurbo,
#[serde(rename = "gpt-4-0613")]
Four,
#[serde(rename = "gpt-4-1106-preview")]
FourTurbo,
#[derive(Clone, Debug, Default, PartialEq)]
pub enum ZedDotDevModel {
GptThreePointFiveTurbo,
GptFour,
#[default]
GptFourTurbo,
Custom(String),
}
impl OpenAiModel {
pub fn full_name(&self) -> &'static str {
impl Serialize for ZedDotDevModel {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
serializer.serialize_str(self.id())
}
}
impl<'de> Deserialize<'de> for ZedDotDevModel {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: Deserializer<'de>,
{
struct ZedDotDevModelVisitor;
impl<'de> Visitor<'de> for ZedDotDevModelVisitor {
type Value = ZedDotDevModel;
fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
formatter.write_str("a string for a ZedDotDevModel variant or a custom model")
}
fn visit_str<E>(self, value: &str) -> Result<Self::Value, E>
where
E: de::Error,
{
match value {
"gpt-3.5-turbo" => Ok(ZedDotDevModel::GptThreePointFiveTurbo),
"gpt-4" => Ok(ZedDotDevModel::GptFour),
"gpt-4-turbo-preview" => Ok(ZedDotDevModel::GptFourTurbo),
_ => Ok(ZedDotDevModel::Custom(value.to_owned())),
}
}
}
deserializer.deserialize_str(ZedDotDevModelVisitor)
}
}
impl JsonSchema for ZedDotDevModel {
fn schema_name() -> String {
"ZedDotDevModel".to_owned()
}
fn json_schema(_generator: &mut schemars::gen::SchemaGenerator) -> Schema {
let variants = vec![
"gpt-3.5-turbo".to_owned(),
"gpt-4".to_owned(),
"gpt-4-turbo-preview".to_owned(),
];
Schema::Object(SchemaObject {
instance_type: Some(InstanceType::String.into()),
enum_values: Some(variants.into_iter().map(|s| s.into()).collect()),
metadata: Some(Box::new(Metadata {
title: Some("ZedDotDevModel".to_owned()),
default: Some(serde_json::json!("gpt-4-turbo-preview")),
examples: vec![
serde_json::json!("gpt-3.5-turbo"),
serde_json::json!("gpt-4"),
serde_json::json!("gpt-4-turbo-preview"),
serde_json::json!("custom-model-name"),
],
..Default::default()
})),
..Default::default()
})
}
}
impl ZedDotDevModel {
pub fn id(&self) -> &str {
match self {
Self::ThreePointFiveTurbo => "gpt-3.5-turbo-0613",
Self::Four => "gpt-4-0613",
Self::FourTurbo => "gpt-4-1106-preview",
Self::GptThreePointFiveTurbo => "gpt-3.5-turbo",
Self::GptFour => "gpt-4",
Self::GptFourTurbo => "gpt-4-turbo-preview",
Self::Custom(id) => id,
}
}
pub fn short_name(&self) -> &'static str {
pub fn display_name(&self) -> &str {
match self {
Self::ThreePointFiveTurbo => "gpt-3.5-turbo",
Self::Four => "gpt-4",
Self::FourTurbo => "gpt-4-turbo",
Self::GptThreePointFiveTurbo => "gpt-3.5-turbo",
Self::GptFour => "gpt-4",
Self::GptFourTurbo => "gpt-4-turbo",
Self::Custom(id) => id.as_str(),
}
}
pub fn cycle(&self) -> Self {
pub fn max_token_count(&self) -> usize {
match self {
Self::ThreePointFiveTurbo => Self::Four,
Self::Four => Self::FourTurbo,
Self::FourTurbo => Self::ThreePointFiveTurbo,
Self::GptThreePointFiveTurbo => 2048,
Self::GptFour => 4096,
Self::GptFourTurbo => 128000,
Self::Custom(_) => 4096, // TODO: Make this configurable
}
}
}
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema)]
#[derive(Copy, Clone, Default, Debug, Serialize, Deserialize, JsonSchema)]
#[serde(rename_all = "snake_case")]
pub enum AssistantDockPosition {
Left,
#[default]
Right,
Bottom,
}
#[derive(Debug, Deserialize)]
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema, PartialEq)]
#[serde(tag = "name", rename_all = "snake_case")]
pub enum AssistantProvider {
#[serde(rename = "zed.dev")]
ZedDotDev {
#[serde(default)]
default_model: ZedDotDevModel,
},
#[serde(rename = "openai")]
OpenAi {
#[serde(default)]
default_model: OpenAiModel,
#[serde(default = "open_ai_url")]
api_url: String,
},
}
impl Default for AssistantProvider {
fn default() -> Self {
Self::ZedDotDev {
default_model: ZedDotDevModel::default(),
}
}
}
fn open_ai_url() -> String {
"https://api.openai.com/v1".into()
}
#[derive(Default, Debug, Deserialize, Serialize)]
pub struct AssistantSettings {
/// Whether to show the assistant panel button in the status bar.
pub enabled: bool,
pub button: bool,
/// Where to dock the assistant.
pub dock: AssistantDockPosition,
/// Default width in pixels when the assistant is docked to the left or right.
pub default_width: Pixels,
/// Default height in pixels when the assistant is docked to the bottom.
pub default_height: Pixels,
/// The default OpenAI model to use when starting new conversations.
#[deprecated = "Please use `provider.default_model` instead."]
pub default_open_ai_model: OpenAiModel,
/// OpenAI API base URL to use when starting new conversations.
#[deprecated = "Please use `provider.api_url` instead."]
pub openai_api_url: String,
/// The settings for the AI provider.
pub provider: AiProviderSettings,
}
impl AssistantSettings {
pub fn provider_kind(&self) -> anyhow::Result<OpenAiCompletionProviderKind> {
match &self.provider {
AiProviderSettings::OpenAi(_) => Ok(OpenAiCompletionProviderKind::OpenAi),
AiProviderSettings::AzureOpenAi(settings) => {
let deployment_id = settings
.deployment_id
.clone()
.ok_or_else(|| anyhow!("no Azure OpenAI deployment ID"))?;
let api_version = settings
.api_version
.ok_or_else(|| anyhow!("no Azure OpenAI API version"))?;
Ok(OpenAiCompletionProviderKind::AzureOpenAi {
deployment_id,
api_version,
})
}
}
}
pub fn provider_api_url(&self) -> anyhow::Result<String> {
match &self.provider {
AiProviderSettings::OpenAi(settings) => Ok(settings
.api_url
.clone()
.unwrap_or_else(|| OPEN_AI_API_URL.to_string())),
AiProviderSettings::AzureOpenAi(settings) => settings
.api_url
.clone()
.ok_or_else(|| anyhow!("no Azure OpenAI API URL")),
}
}
pub fn provider_model(&self) -> anyhow::Result<OpenAiModel> {
match &self.provider {
AiProviderSettings::OpenAi(settings) => {
Ok(settings.default_model.unwrap_or(OpenAiModel::FourTurbo))
}
AiProviderSettings::AzureOpenAi(settings) => {
let deployment_id = settings
.deployment_id
.as_deref()
.ok_or_else(|| anyhow!("no Azure OpenAI deployment ID"))?;
match deployment_id {
// https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models#gpt-4-and-gpt-4-turbo-preview
"gpt-4" | "gpt-4-32k" => Ok(OpenAiModel::Four),
// https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models#gpt-35
"gpt-35-turbo" | "gpt-35-turbo-16k" | "gpt-35-turbo-instruct" => {
Ok(OpenAiModel::ThreePointFiveTurbo)
}
_ => Err(anyhow!(
"no matching OpenAI model found for deployment ID: '{deployment_id}'"
)),
}
}
}
}
pub fn provider_model_name(&self) -> anyhow::Result<String> {
match &self.provider {
AiProviderSettings::OpenAi(settings) => Ok(settings
.default_model
.unwrap_or(OpenAiModel::FourTurbo)
.full_name()
.to_string()),
AiProviderSettings::AzureOpenAi(settings) => settings
.deployment_id
.clone()
.ok_or_else(|| anyhow!("no Azure OpenAI deployment ID")),
}
}
}
impl Settings for AssistantSettings {
const KEY: Option<&'static str> = Some("assistant");
type FileContent = AssistantSettingsContent;
fn load(
default_value: &Self::FileContent,
user_values: &[&Self::FileContent],
_: &mut gpui::AppContext,
) -> anyhow::Result<Self> {
Self::load_via_json_merge(default_value, user_values)
}
pub provider: AssistantProvider,
}
/// Assistant panel settings
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema, Debug)]
pub struct AssistantSettingsContent {
#[derive(Clone, Serialize, Deserialize, Debug)]
#[serde(untagged)]
pub enum AssistantSettingsContent {
Versioned(VersionedAssistantSettingsContent),
Legacy(LegacyAssistantSettingsContent),
}
impl JsonSchema for AssistantSettingsContent {
fn schema_name() -> String {
VersionedAssistantSettingsContent::schema_name()
}
fn json_schema(gen: &mut schemars::gen::SchemaGenerator) -> Schema {
VersionedAssistantSettingsContent::json_schema(gen)
}
fn is_referenceable() -> bool {
VersionedAssistantSettingsContent::is_referenceable()
}
}
impl Default for AssistantSettingsContent {
fn default() -> Self {
Self::Versioned(VersionedAssistantSettingsContent::default())
}
}
impl AssistantSettingsContent {
fn upgrade(&self) -> AssistantSettingsContentV1 {
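// Map legacy (unversioned) settings onto the V1 shape, inferring an OpenAI provider from the deprecated fields when present.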
match self {
AssistantSettingsContent::Versioned(settings) => match settings {
VersionedAssistantSettingsContent::V1(settings) => settings.clone(),
},
AssistantSettingsContent::Legacy(settings) => AssistantSettingsContentV1 {
enabled: None,
button: settings.button,
dock: settings.dock,
default_width: settings.default_width,
default_height: settings.default_height,
provider: if let Some(open_ai_api_url) = settings.openai_api_url.as_ref() {
Some(AssistantProvider::OpenAi {
default_model: settings.default_open_ai_model.clone().unwrap_or_default(),
api_url: open_ai_api_url.clone(),
})
} else {
settings.default_open_ai_model.clone().map(|open_ai_model| {
AssistantProvider::OpenAi {
default_model: open_ai_model,
api_url: open_ai_url(),
}
})
},
},
}
}
pub fn set_dock(&mut self, dock: AssistantDockPosition) {
match self {
AssistantSettingsContent::Versioned(settings) => match settings {
VersionedAssistantSettingsContent::V1(settings) => {
settings.dock = Some(dock);
}
},
AssistantSettingsContent::Legacy(settings) => {
settings.dock = Some(dock);
}
}
}
}
#[derive(Clone, Serialize, Deserialize, JsonSchema, Debug)]
#[serde(tag = "version")]
pub enum VersionedAssistantSettingsContent {
#[serde(rename = "1")]
V1(AssistantSettingsContentV1),
}
impl Default for VersionedAssistantSettingsContent {
fn default() -> Self {
Self::V1(AssistantSettingsContentV1 {
enabled: None,
button: None,
dock: None,
default_width: None,
default_height: None,
provider: None,
})
}
}
#[derive(Clone, Serialize, Deserialize, JsonSchema, Debug)]
pub struct AssistantSettingsContentV1 {
/// Whether the Assistant is enabled.
///
/// Default: true
enabled: Option<bool>,
/// Whether to show the assistant panel button in the status bar.
///
/// Default: true
button: Option<bool>,
/// Where to dock the assistant.
///
/// Default: right
dock: Option<AssistantDockPosition>,
/// Default width in pixels when the assistant is docked to the left or right.
///
/// Default: 640
default_width: Option<f32>,
/// Default height in pixels when the assistant is docked to the bottom.
///
/// Default: 320
default_height: Option<f32>,
/// The provider of the assistant service.
///
/// This can either be the internal `zed.dev` service or an external `openai` service,
/// each with their respective default models and configurations.
provider: Option<AssistantProvider>,
}
#[derive(Clone, Serialize, Deserialize, JsonSchema, Debug)]
pub struct LegacyAssistantSettingsContent {
/// Whether to show the assistant panel button in the status bar.
///
/// Default: true
@@ -180,88 +306,165 @@ pub struct AssistantSettingsContent {
///
/// Default: 320
pub default_height: Option<f32>,
/// Deprecated: Please use `provider.default_model` instead.
/// The default OpenAI model to use when starting new conversations.
///
/// Default: gpt-4-1106-preview
#[deprecated = "Please use `provider.default_model` instead."]
pub default_open_ai_model: Option<OpenAiModel>,
/// Deprecated: Please use `provider.api_url` instead.
/// OpenAI API base URL to use when starting new conversations.
///
/// Default: https://api.openai.com/v1
#[deprecated = "Please use `provider.api_url` instead."]
pub openai_api_url: Option<String>,
/// The settings for the AI provider.
#[serde(default)]
pub provider: AiProviderSettingsContent,
}
#[derive(Debug, Clone, Deserialize)]
#[serde(tag = "type", rename_all = "snake_case")]
pub enum AiProviderSettings {
/// The settings for the OpenAI provider.
#[serde(rename = "openai")]
OpenAi(OpenAiProviderSettings),
/// The settings for the Azure OpenAI provider.
#[serde(rename = "azure_openai")]
AzureOpenAi(AzureOpenAiProviderSettings),
}
impl Settings for AssistantSettings {
const KEY: Option<&'static str> = Some("assistant");
/// The settings for the AI provider used by the Zed Assistant.
#[derive(Debug, Clone, Serialize, Deserialize, JsonSchema)]
#[serde(tag = "type", rename_all = "snake_case")]
pub enum AiProviderSettingsContent {
/// The settings for the OpenAI provider.
#[serde(rename = "openai")]
OpenAi(OpenAiProviderSettingsContent),
/// The settings for the Azure OpenAI provider.
#[serde(rename = "azure_openai")]
AzureOpenAi(AzureOpenAiProviderSettingsContent),
}
type FileContent = AssistantSettingsContent;
impl Default for AiProviderSettingsContent {
fn default() -> Self {
Self::OpenAi(OpenAiProviderSettingsContent::default())
fn load(
default_value: &Self::FileContent,
user_values: &[&Self::FileContent],
_: &mut gpui::AppContext,
) -> anyhow::Result<Self> {
let mut settings = AssistantSettings::default();
for value in [default_value].iter().chain(user_values) {
let value = value.upgrade();
merge(&mut settings.enabled, value.enabled);
merge(&mut settings.button, value.button);
merge(&mut settings.dock, value.dock);
merge(
&mut settings.default_width,
value.default_width.map(Into::into),
);
merge(
&mut settings.default_height,
value.default_height.map(Into::into),
);
if let Some(provider) = value.provider.clone() {
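// If the incoming provider matches the current variant, merge its fields; otherwise replace the provider wholesale.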
match (&mut settings.provider, provider) {
(
AssistantProvider::ZedDotDev { default_model },
AssistantProvider::ZedDotDev {
default_model: default_model_override,
},
) => {
*default_model = default_model_override;
}
(
AssistantProvider::OpenAi {
default_model,
api_url,
},
AssistantProvider::OpenAi {
default_model: default_model_override,
api_url: api_url_override,
},
) => {
*default_model = default_model_override;
*api_url = api_url_override;
}
(merged, provider_override) => {
*merged = provider_override;
}
}
}
}
Ok(settings)
}
}
#[derive(Debug, Clone, Deserialize)]
pub struct OpenAiProviderSettings {
/// The OpenAI API base URL to use when starting new conversations.
pub api_url: Option<String>,
/// The default OpenAI model to use when starting new conversations.
pub default_model: Option<OpenAiModel>,
fn merge<T: Copy>(target: &mut T, value: Option<T>) {
if let Some(value) = value {
*target = value;
}
}
#[derive(Debug, Default, Clone, Serialize, Deserialize, JsonSchema)]
pub struct OpenAiProviderSettingsContent {
/// The OpenAI API base URL to use when starting new conversations.
///
/// Default: https://api.openai.com/v1
pub api_url: Option<String>,
/// The default OpenAI model to use when starting new conversations.
///
/// Default: gpt-4-1106-preview
pub default_model: Option<OpenAiModel>,
}
#[cfg(test)]
mod tests {
use gpui::{AppContext, BorrowAppContext};
use settings::SettingsStore;
#[derive(Debug, Clone, Deserialize)]
pub struct AzureOpenAiProviderSettings {
/// The Azure OpenAI API base URL to use when starting new conversations.
pub api_url: Option<String>,
/// The Azure OpenAI API version.
pub api_version: Option<AzureOpenAiApiVersion>,
/// The Azure OpenAI API deployment ID.
pub deployment_id: Option<String>,
}
use super::*;
#[derive(Debug, Default, Clone, Serialize, Deserialize, JsonSchema)]
pub struct AzureOpenAiProviderSettingsContent {
/// The Azure OpenAI API base URL to use when starting new conversations.
pub api_url: Option<String>,
/// The Azure OpenAI API version.
pub api_version: Option<AzureOpenAiApiVersion>,
/// The Azure OpenAI deployment ID.
pub deployment_id: Option<String>,
#[gpui::test]
fn test_deserialize_assistant_settings(cx: &mut AppContext) {
let store = settings::SettingsStore::test(cx);
cx.set_global(store);
// Settings default to gpt-4-turbo.
AssistantSettings::register(cx);
assert_eq!(
AssistantSettings::get_global(cx).provider,
AssistantProvider::OpenAi {
default_model: OpenAiModel::FourTurbo,
api_url: open_ai_url()
}
);
// Ensure backward-compatibility.
cx.update_global::<SettingsStore, _>(|store, cx| {
store
.set_user_settings(
r#"{
"assistant": {
"openai_api_url": "test-url",
}
}"#,
cx,
)
.unwrap();
});
assert_eq!(
AssistantSettings::get_global(cx).provider,
AssistantProvider::OpenAi {
default_model: OpenAiModel::FourTurbo,
api_url: "test-url".into()
}
);
cx.update_global::<SettingsStore, _>(|store, cx| {
store
.set_user_settings(
r#"{
"assistant": {
"default_open_ai_model": "gpt-4-0613"
}
}"#,
cx,
)
.unwrap();
});
assert_eq!(
AssistantSettings::get_global(cx).provider,
AssistantProvider::OpenAi {
default_model: OpenAiModel::Four,
api_url: open_ai_url()
}
);
// The new version supports setting a custom model when using zed.dev.
cx.update_global::<SettingsStore, _>(|store, cx| {
store
.set_user_settings(
r#"{
"assistant": {
"version": "1",
"provider": {
"name": "zed.dev",
"default_model": "custom"
}
}
}"#,
cx,
)
.unwrap();
});
assert_eq!(
AssistantSettings::get_global(cx).provider,
AssistantProvider::ZedDotDev {
default_model: ZedDotDevModel::Custom("custom".into())
}
);
}
}

View File

@@ -1,12 +1,13 @@
use crate::streaming_diff::{Hunk, StreamingDiff};
use ai::completion::{CompletionProvider, CompletionRequest};
use crate::{
streaming_diff::{Hunk, StreamingDiff},
CompletionProvider, LanguageModelRequest,
};
use anyhow::Result;
use editor::{Anchor, MultiBuffer, MultiBufferSnapshot, ToOffset, ToPoint};
use futures::{channel::mpsc, SinkExt, Stream, StreamExt};
use gpui::{EventEmitter, Model, ModelContext, Task};
use language::{Rope, TransactionId};
use multi_buffer;
use std::{cmp, future, ops::Range, sync::Arc};
use std::{cmp, future, ops::Range};
pub enum Event {
Finished,
@@ -20,7 +21,6 @@ pub enum CodegenKind {
}
pub struct Codegen {
provider: Arc<dyn CompletionProvider>,
buffer: Model<MultiBuffer>,
snapshot: MultiBufferSnapshot,
kind: CodegenKind,
@@ -35,15 +35,9 @@ pub struct Codegen {
impl EventEmitter<Event> for Codegen {}
impl Codegen {
pub fn new(
buffer: Model<MultiBuffer>,
kind: CodegenKind,
provider: Arc<dyn CompletionProvider>,
cx: &mut ModelContext<Self>,
) -> Self {
pub fn new(buffer: Model<MultiBuffer>, kind: CodegenKind, cx: &mut ModelContext<Self>) -> Self {
let snapshot = buffer.read(cx).snapshot(cx);
Self {
provider,
buffer: buffer.clone(),
snapshot,
kind,
@@ -94,7 +88,7 @@ impl Codegen {
self.error.as_ref()
}
pub fn start(&mut self, prompt: Box<dyn CompletionRequest>, cx: &mut ModelContext<Self>) {
pub fn start(&mut self, prompt: LanguageModelRequest, cx: &mut ModelContext<Self>) {
let range = self.range();
let snapshot = self.snapshot.clone();
let selected_text = snapshot
@@ -108,7 +102,7 @@ impl Codegen {
.next()
.unwrap_or_else(|| snapshot.indent_size_for_line(selection_start.row));
let response = self.provider.complete(prompt);
let response = CompletionProvider::global(cx).complete(prompt);
self.generation = cx.spawn(|this, mut cx| {
async move {
let generate = async {
@@ -305,7 +299,7 @@ fn strip_invalid_spans_from_codeblock(
}
if first_line {
if buffer == "" || buffer == "`" || buffer == "``" {
if buffer.is_empty() || buffer == "`" || buffer == "``" {
return future::ready(None);
} else if buffer.starts_with("```") {
starts_with_markdown_codeblock = true;
@@ -360,8 +354,9 @@ fn strip_invalid_spans_from_codeblock(
mod tests {
use std::sync::Arc;
use crate::FakeCompletionProvider;
use super::*;
use ai::test::FakeCompletionProvider;
use futures::stream::{self};
use gpui::{Context, TestAppContext};
use indoc::indoc;
@@ -378,15 +373,11 @@ mod tests {
pub name: String,
}
impl CompletionRequest for DummyCompletionRequest {
fn data(&self) -> serde_json::Result<String> {
serde_json::to_string(self)
}
}
#[gpui::test(iterations = 10)]
async fn test_transform_autoindent(cx: &mut TestAppContext, mut rng: StdRng) {
let provider = FakeCompletionProvider::default();
cx.set_global(cx.update(SettingsStore::test));
cx.set_global(CompletionProvider::Fake(provider.clone()));
cx.update(language_settings::init);
let text = indoc! {"
@@ -405,19 +396,10 @@ mod tests {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 0))..snapshot.anchor_after(Point::new(4, 5))
});
let provider = Arc::new(FakeCompletionProvider::new());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
CodegenKind::Transform { range },
provider.clone(),
cx,
)
});
let codegen =
cx.new_model(|cx| Codegen::new(buffer.clone(), CodegenKind::Transform { range }, cx));
let request = Box::new(DummyCompletionRequest {
name: "test".to_string(),
});
let request = LanguageModelRequest::default();
codegen.update(cx, |codegen, cx| codegen.start(request, cx));
let mut new_text = concat!(
@@ -430,8 +412,7 @@ mod tests {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
println!("CHUNK: {:?}", &chunk);
provider.send_completion(chunk);
provider.send_completion(chunk.into());
new_text = suffix;
cx.background_executor.run_until_parked();
}
@@ -456,6 +437,8 @@ mod tests {
cx: &mut TestAppContext,
mut rng: StdRng,
) {
let provider = FakeCompletionProvider::default();
cx.set_global(CompletionProvider::Fake(provider.clone()));
cx.set_global(cx.update(SettingsStore::test));
cx.update(language_settings::init);
@@ -472,19 +455,10 @@ mod tests {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 6))
});
let provider = Arc::new(FakeCompletionProvider::new());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
CodegenKind::Generate { position },
provider.clone(),
cx,
)
});
let codegen =
cx.new_model(|cx| Codegen::new(buffer.clone(), CodegenKind::Generate { position }, cx));
let request = Box::new(DummyCompletionRequest {
name: "test".to_string(),
});
let request = LanguageModelRequest::default();
codegen.update(cx, |codegen, cx| codegen.start(request, cx));
let mut new_text = concat!(
@@ -497,7 +471,7 @@ mod tests {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
provider.send_completion(chunk);
provider.send_completion(chunk.into());
new_text = suffix;
cx.background_executor.run_until_parked();
}
@@ -522,6 +496,8 @@ mod tests {
cx: &mut TestAppContext,
mut rng: StdRng,
) {
let provider = FakeCompletionProvider::default();
cx.set_global(CompletionProvider::Fake(provider.clone()));
cx.set_global(cx.update(SettingsStore::test));
cx.update(language_settings::init);
@@ -538,19 +514,10 @@ mod tests {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 2))
});
let provider = Arc::new(FakeCompletionProvider::new());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
CodegenKind::Generate { position },
provider.clone(),
cx,
)
});
let codegen =
cx.new_model(|cx| Codegen::new(buffer.clone(), CodegenKind::Generate { position }, cx));
let request = Box::new(DummyCompletionRequest {
name: "test".to_string(),
});
let request = LanguageModelRequest::default();
codegen.update(cx, |codegen, cx| codegen.start(request, cx));
let mut new_text = concat!(
@@ -563,8 +530,7 @@ mod tests {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
println!("{:?}", &chunk);
provider.send_completion(chunk);
provider.send_completion(chunk.into());
new_text = suffix;
cx.background_executor.run_until_parked();
}

View File

@@ -0,0 +1,188 @@
#[cfg(test)]
mod fake;
mod open_ai;
mod zed;
#[cfg(test)]
pub use fake::*;
pub use open_ai::*;
pub use zed::*;
use crate::{
assistant_settings::{AssistantProvider, AssistantSettings},
LanguageModel, LanguageModelRequest,
};
use anyhow::Result;
use client::Client;
use futures::{future::BoxFuture, stream::BoxStream};
use gpui::{AnyView, AppContext, BorrowAppContext, Task, WindowContext};
use settings::{Settings, SettingsStore};
use std::sync::Arc;
pub fn init(client: Arc<Client>, cx: &mut AppContext) {
let mut settings_version = 0;
let provider = match &AssistantSettings::get_global(cx).provider {
AssistantProvider::ZedDotDev { default_model } => {
CompletionProvider::ZedDotDev(ZedDotDevCompletionProvider::new(
default_model.clone(),
client.clone(),
settings_version,
cx,
))
}
AssistantProvider::OpenAi {
default_model,
api_url,
} => CompletionProvider::OpenAi(OpenAiCompletionProvider::new(
default_model.clone(),
api_url.clone(),
client.http_client(),
settings_version,
)),
};
cx.set_global(provider);
cx.observe_global::<SettingsStore>(move |cx| {
settings_version += 1;
cx.update_global::<CompletionProvider, _>(|provider, cx| {
match (&mut *provider, &AssistantSettings::get_global(cx).provider) {
(
CompletionProvider::OpenAi(provider),
AssistantProvider::OpenAi {
default_model,
api_url,
},
) => {
provider.update(default_model.clone(), api_url.clone(), settings_version);
}
(
CompletionProvider::ZedDotDev(provider),
AssistantProvider::ZedDotDev { default_model },
) => {
provider.update(default_model.clone(), settings_version);
}
(CompletionProvider::OpenAi(_), AssistantProvider::ZedDotDev { default_model }) => {
*provider = CompletionProvider::ZedDotDev(ZedDotDevCompletionProvider::new(
default_model.clone(),
client.clone(),
settings_version,
cx,
));
}
(
CompletionProvider::ZedDotDev(_),
AssistantProvider::OpenAi {
default_model,
api_url,
},
) => {
*provider = CompletionProvider::OpenAi(OpenAiCompletionProvider::new(
default_model.clone(),
api_url.clone(),
client.http_client(),
settings_version,
));
}
#[cfg(test)]
(CompletionProvider::Fake(_), _) => unimplemented!(),
}
})
})
.detach();
}
pub enum CompletionProvider {
OpenAi(OpenAiCompletionProvider),
ZedDotDev(ZedDotDevCompletionProvider),
#[cfg(test)]
Fake(FakeCompletionProvider),
}
impl gpui::Global for CompletionProvider {}
impl CompletionProvider {
pub fn global(cx: &AppContext) -> &Self {
cx.global::<Self>()
}
pub fn settings_version(&self) -> usize {
match self {
CompletionProvider::OpenAi(provider) => provider.settings_version(),
CompletionProvider::ZedDotDev(provider) => provider.settings_version(),
#[cfg(test)]
CompletionProvider::Fake(_) => unimplemented!(),
}
}
pub fn is_authenticated(&self) -> bool {
match self {
CompletionProvider::OpenAi(provider) => provider.is_authenticated(),
CompletionProvider::ZedDotDev(provider) => provider.is_authenticated(),
#[cfg(test)]
CompletionProvider::Fake(_) => true,
}
}
pub fn authenticate(&self, cx: &AppContext) -> Task<Result<()>> {
match self {
CompletionProvider::OpenAi(provider) => provider.authenticate(cx),
CompletionProvider::ZedDotDev(provider) => provider.authenticate(cx),
#[cfg(test)]
CompletionProvider::Fake(_) => Task::ready(Ok(())),
}
}
pub fn authentication_prompt(&self, cx: &mut WindowContext) -> AnyView {
match self {
CompletionProvider::OpenAi(provider) => provider.authentication_prompt(cx),
CompletionProvider::ZedDotDev(provider) => provider.authentication_prompt(cx),
#[cfg(test)]
CompletionProvider::Fake(_) => unimplemented!(),
}
}
pub fn reset_credentials(&self, cx: &AppContext) -> Task<Result<()>> {
match self {
CompletionProvider::OpenAi(provider) => provider.reset_credentials(cx),
CompletionProvider::ZedDotDev(_) => Task::ready(Ok(())),
#[cfg(test)]
CompletionProvider::Fake(_) => Task::ready(Ok(())),
}
}
pub fn default_model(&self) -> LanguageModel {
match self {
CompletionProvider::OpenAi(provider) => LanguageModel::OpenAi(provider.default_model()),
CompletionProvider::ZedDotDev(provider) => {
LanguageModel::ZedDotDev(provider.default_model())
}
#[cfg(test)]
CompletionProvider::Fake(_) => unimplemented!(),
}
}
pub fn count_tokens(
&self,
request: LanguageModelRequest,
cx: &AppContext,
) -> BoxFuture<'static, Result<usize>> {
match self {
CompletionProvider::OpenAi(provider) => provider.count_tokens(request, cx),
CompletionProvider::ZedDotDev(provider) => provider.count_tokens(request, cx),
#[cfg(test)]
CompletionProvider::Fake(_) => unimplemented!(),
}
}
pub fn complete(
&self,
request: LanguageModelRequest,
) -> BoxFuture<'static, Result<BoxStream<'static, Result<String>>>> {
match self {
CompletionProvider::OpenAi(provider) => provider.complete(request),
CompletionProvider::ZedDotDev(provider) => provider.complete(request),
#[cfg(test)]
CompletionProvider::Fake(provider) => provider.complete(),
}
}
}
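// Illustrative sketch, not part of the change above: one way a caller might
// stream text through whichever provider is currently installed as the global.
// The function name and the prebuilt `request` are assumptions for this
// example; everything else uses the API shown in this file.
#[allow(dead_code)]
fn spawn_completion(request: LanguageModelRequest, cx: &AppContext) -> Task<Result<String>> {
    // The boxed future returned by `complete` owns everything it needs, so it
    // can be awaited on the background executor.
    let completion = CompletionProvider::global(cx).complete(request);
    cx.background_executor().spawn(async move {
        use futures::StreamExt as _;
        let mut chunks = completion.await?;
        let mut text = String::new();
        while let Some(chunk) = chunks.next().await {
            text.push_str(&chunk?);
        }
        anyhow::Ok(text)
    })
}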

View File

@@ -0,0 +1,29 @@
use anyhow::Result;
use futures::{channel::mpsc, future::BoxFuture, stream::BoxStream, FutureExt, StreamExt};
use std::sync::Arc;
#[derive(Clone, Default)]
pub struct FakeCompletionProvider {
current_completion_tx: Arc<parking_lot::Mutex<Option<mpsc::UnboundedSender<String>>>>,
}
impl FakeCompletionProvider {
pub fn complete(&self) -> BoxFuture<'static, Result<BoxStream<'static, Result<String>>>> {
let (tx, rx) = mpsc::unbounded();
*self.current_completion_tx.lock() = Some(tx);
async move { Ok(rx.map(Ok).boxed()) }.boxed()
}
pub fn send_completion(&self, chunk: String) {
self.current_completion_tx
.lock()
.as_ref()
.unwrap()
.unbounded_send(chunk)
.unwrap();
}
pub fn finish_completion(&self) {
self.current_completion_tx.lock().take();
}
}
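// Illustrative sketch, not part of the change above: how a test might drive
// the fake provider end to end. It obtains the stream from `complete()`, pushes
// chunks, then closes the channel; the function name is invented for this
// example, and `StreamExt` is already imported at the top of this file.
#[allow(dead_code)]
async fn fake_completion_roundtrip() -> Result<Vec<String>> {
    let provider = FakeCompletionProvider::default();
    let completion = provider.complete();
    // The sender is installed synchronously by `complete`, so chunks can be
    // queued before the stream is first polled.
    provider.send_completion("hello ".into());
    provider.send_completion("world".into());
    provider.finish_completion();
    let chunks = completion.await?.collect::<Vec<_>>().await;
    // Fold the per-chunk results into a single result.
    chunks.into_iter().collect()
}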

View File

@@ -0,0 +1,301 @@
use crate::{
assistant_settings::OpenAiModel, CompletionProvider, LanguageModel, LanguageModelRequest, Role,
};
use anyhow::{anyhow, Result};
use editor::{Editor, EditorElement, EditorStyle};
use futures::{future::BoxFuture, stream::BoxStream, FutureExt, StreamExt};
use gpui::{AnyView, AppContext, FontStyle, FontWeight, Task, TextStyle, View, WhiteSpace};
use open_ai::{stream_completion, Request, RequestMessage, Role as OpenAiRole};
use settings::Settings;
use std::{env, sync::Arc};
use theme::ThemeSettings;
use ui::prelude::*;
use util::{http::HttpClient, ResultExt};
pub struct OpenAiCompletionProvider {
api_key: Option<String>,
api_url: String,
default_model: OpenAiModel,
http_client: Arc<dyn HttpClient>,
settings_version: usize,
}
impl OpenAiCompletionProvider {
pub fn new(
default_model: OpenAiModel,
api_url: String,
http_client: Arc<dyn HttpClient>,
settings_version: usize,
) -> Self {
Self {
api_key: None,
api_url,
default_model,
http_client,
settings_version,
}
}
pub fn update(&mut self, default_model: OpenAiModel, api_url: String, settings_version: usize) {
self.default_model = default_model;
self.api_url = api_url;
self.settings_version = settings_version;
}
pub fn settings_version(&self) -> usize {
self.settings_version
}
pub fn is_authenticated(&self) -> bool {
self.api_key.is_some()
}
pub fn authenticate(&self, cx: &AppContext) -> Task<Result<()>> {
if self.is_authenticated() {
Task::ready(Ok(()))
} else {
let api_url = self.api_url.clone();
cx.spawn(|mut cx| async move {
let api_key = if let Ok(api_key) = env::var("OPENAI_API_KEY") {
api_key
} else {
let (_, api_key) = cx
.update(|cx| cx.read_credentials(&api_url))?
.await?
.ok_or_else(|| anyhow!("credentials not found"))?;
String::from_utf8(api_key)?
};
cx.update_global::<CompletionProvider, _>(|provider, _cx| {
if let CompletionProvider::OpenAi(provider) = provider {
provider.api_key = Some(api_key);
}
})
})
}
}
pub fn reset_credentials(&self, cx: &AppContext) -> Task<Result<()>> {
let delete_credentials = cx.delete_credentials(&self.api_url);
cx.spawn(|mut cx| async move {
delete_credentials.await.log_err();
cx.update_global::<CompletionProvider, _>(|provider, _cx| {
if let CompletionProvider::OpenAi(provider) = provider {
provider.api_key = None;
}
})
})
}
pub fn authentication_prompt(&self, cx: &mut WindowContext) -> AnyView {
cx.new_view(|cx| AuthenticationPrompt::new(self.api_url.clone(), cx))
.into()
}
pub fn default_model(&self) -> OpenAiModel {
self.default_model.clone()
}
pub fn count_tokens(
&self,
request: LanguageModelRequest,
cx: &AppContext,
) -> BoxFuture<'static, Result<usize>> {
count_open_ai_tokens(request, cx.background_executor())
}
pub fn complete(
&self,
request: LanguageModelRequest,
) -> BoxFuture<'static, Result<BoxStream<'static, Result<String>>>> {
let request = self.to_open_ai_request(request);
let http_client = self.http_client.clone();
let api_key = self.api_key.clone();
let api_url = self.api_url.clone();
async move {
let api_key = api_key.ok_or_else(|| anyhow!("missing api key"))?;
let request = stream_completion(http_client.as_ref(), &api_url, &api_key, request);
let response = request.await?;
let stream = response
.filter_map(|response| async move {
match response {
Ok(mut response) => Some(Ok(response.choices.pop()?.delta.content?)),
Err(error) => Some(Err(error)),
}
})
.boxed();
Ok(stream)
}
.boxed()
}
fn to_open_ai_request(&self, request: LanguageModelRequest) -> Request {
let model = match request.model {
LanguageModel::ZedDotDev(_) => self.default_model(),
LanguageModel::OpenAi(model) => model,
};
Request {
model,
messages: request
.messages
.into_iter()
.map(|msg| RequestMessage {
role: msg.role.into(),
content: msg.content,
})
.collect(),
stream: true,
stop: request.stop,
temperature: request.temperature,
}
}
}
pub fn count_open_ai_tokens(
request: LanguageModelRequest,
background_executor: &gpui::BackgroundExecutor,
) -> BoxFuture<'static, Result<usize>> {
background_executor
.spawn(async move {
let messages = request
.messages
.into_iter()
.map(|message| tiktoken_rs::ChatCompletionRequestMessage {
role: match message.role {
Role::User => "user".into(),
Role::Assistant => "assistant".into(),
Role::System => "system".into(),
},
content: Some(message.content),
name: None,
function_call: None,
})
.collect::<Vec<_>>();
tiktoken_rs::num_tokens_from_messages(request.model.id(), &messages)
})
.boxed()
}
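// Illustrative sketch, not part of the change above: token counting already
// runs on the background executor, so a caller can await the future off the
// main thread and log the result. The function name and the prebuilt
// `request` are assumptions for this example.
#[allow(dead_code)]
fn log_token_count(request: LanguageModelRequest, cx: &AppContext) {
    let count = count_open_ai_tokens(request, cx.background_executor());
    cx.background_executor()
        .spawn(async move {
            if let Ok(count) = count.await {
                log::info!("prompt would use {} tokens", count);
            }
        })
        .detach();
}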
impl From<Role> for open_ai::Role {
fn from(val: Role) -> Self {
match val {
Role::User => OpenAiRole::User,
Role::Assistant => OpenAiRole::Assistant,
Role::System => OpenAiRole::System,
}
}
}
struct AuthenticationPrompt {
api_key: View<Editor>,
api_url: String,
}
impl AuthenticationPrompt {
fn new(api_url: String, cx: &mut WindowContext) -> Self {
Self {
api_key: cx.new_view(|cx| {
let mut editor = Editor::single_line(cx);
editor.set_placeholder_text(
"sk-000000000000000000000000000000000000000000000000",
cx,
);
editor
}),
api_url,
}
}
fn save_api_key(&mut self, _: &menu::Confirm, cx: &mut ViewContext<Self>) {
let api_key = self.api_key.read(cx).text(cx);
if api_key.is_empty() {
return;
}
let write_credentials = cx.write_credentials(&self.api_url, "Bearer", api_key.as_bytes());
cx.spawn(|_, mut cx| async move {
write_credentials.await?;
cx.update_global::<CompletionProvider, _>(|provider, _cx| {
if let CompletionProvider::OpenAi(provider) = provider {
provider.api_key = Some(api_key);
}
})
})
.detach_and_log_err(cx);
}
fn render_api_key_editor(&self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let settings = ThemeSettings::get_global(cx);
let text_style = TextStyle {
color: cx.theme().colors().text,
font_family: settings.ui_font.family.clone(),
font_features: settings.ui_font.features,
font_size: rems(0.875).into(),
font_weight: FontWeight::NORMAL,
font_style: FontStyle::Normal,
line_height: relative(1.3),
background_color: None,
underline: None,
strikethrough: None,
white_space: WhiteSpace::Normal,
};
EditorElement::new(
&self.api_key,
EditorStyle {
background: cx.theme().colors().editor_background,
local_player: cx.theme().players().local(),
text: text_style,
..Default::default()
},
)
}
}
impl Render for AuthenticationPrompt {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
const INSTRUCTIONS: [&str; 6] = [
"To use the assistant panel or inline assistant, you need to add your OpenAI API key.",
" - You can create an API key at: platform.openai.com/api-keys",
" - Make sure your OpenAI account has credits",
" - Having a subscription for another service like GitHub Copilot won't work.",
"",
"Paste your OpenAI API key below and hit enter to use the assistant:",
];
v_flex()
.p_4()
.size_full()
.on_action(cx.listener(Self::save_api_key))
.children(
INSTRUCTIONS.map(|instruction| Label::new(instruction).size(LabelSize::Small)),
)
.child(
h_flex()
.w_full()
.my_2()
.px_2()
.py_1()
.bg(cx.theme().colors().editor_background)
.rounded_md()
.child(self.render_api_key_editor(cx)),
)
.child(
Label::new(
"You can also assign the OPENAI_API_KEY environment variable and restart Zed.",
)
.size(LabelSize::Small),
)
.child(
h_flex()
.gap_2()
.child(Label::new("Click on").size(LabelSize::Small))
.child(Icon::new(IconName::Ai).size(IconSize::XSmall))
.child(
Label::new("in the status bar to close this panel.").size(LabelSize::Small),
),
)
.into_any()
}
}

View File

@@ -0,0 +1,167 @@
use crate::{
assistant_settings::ZedDotDevModel, count_open_ai_tokens, CompletionProvider,
LanguageModelRequest,
};
use anyhow::{anyhow, Result};
use client::{proto, Client};
use futures::{future::BoxFuture, stream::BoxStream, FutureExt, StreamExt, TryFutureExt};
use gpui::{AnyView, AppContext, Task};
use std::{future, sync::Arc};
use ui::prelude::*;
pub struct ZedDotDevCompletionProvider {
client: Arc<Client>,
default_model: ZedDotDevModel,
settings_version: usize,
status: client::Status,
_maintain_client_status: Task<()>,
}
impl ZedDotDevCompletionProvider {
pub fn new(
default_model: ZedDotDevModel,
client: Arc<Client>,
settings_version: usize,
cx: &mut AppContext,
) -> Self {
let mut status_rx = client.status();
let status = *status_rx.borrow();
let maintain_client_status = cx.spawn(|mut cx| async move {
while let Some(status) = status_rx.next().await {
let _ = cx.update_global::<CompletionProvider, _>(|provider, _cx| {
if let CompletionProvider::ZedDotDev(provider) = provider {
provider.status = status;
} else {
unreachable!()
}
});
}
});
Self {
client,
default_model,
settings_version,
status,
_maintain_client_status: maintain_client_status,
}
}
pub fn update(&mut self, default_model: ZedDotDevModel, settings_version: usize) {
self.default_model = default_model;
self.settings_version = settings_version;
}
pub fn settings_version(&self) -> usize {
self.settings_version
}
pub fn default_model(&self) -> ZedDotDevModel {
self.default_model.clone()
}
pub fn is_authenticated(&self) -> bool {
self.status.is_connected()
}
pub fn authenticate(&self, cx: &AppContext) -> Task<Result<()>> {
let client = self.client.clone();
cx.spawn(move |cx| async move { client.authenticate_and_connect(true, &cx).await })
}
pub fn authentication_prompt(&self, cx: &mut WindowContext) -> AnyView {
cx.new_view(|_cx| AuthenticationPrompt).into()
}
pub fn count_tokens(
&self,
request: LanguageModelRequest,
cx: &AppContext,
) -> BoxFuture<'static, Result<usize>> {
match request.model {
crate::LanguageModel::OpenAi(_) => future::ready(Err(anyhow!("invalid model"))).boxed(),
crate::LanguageModel::ZedDotDev(ZedDotDevModel::GptFour)
| crate::LanguageModel::ZedDotDev(ZedDotDevModel::GptFourTurbo)
| crate::LanguageModel::ZedDotDev(ZedDotDevModel::GptThreePointFiveTurbo) => {
count_open_ai_tokens(request, cx.background_executor())
}
crate::LanguageModel::ZedDotDev(ZedDotDevModel::Custom(model)) => {
let request = self.client.request(proto::CountTokensWithLanguageModel {
model,
messages: request
.messages
.iter()
.map(|message| message.to_proto())
.collect(),
});
async move {
let response = request.await?;
Ok(response.token_count as usize)
}
.boxed()
}
}
}
pub fn complete(
&self,
request: LanguageModelRequest,
) -> BoxFuture<'static, Result<BoxStream<'static, Result<String>>>> {
let request = proto::CompleteWithLanguageModel {
model: request.model.id().to_string(),
messages: request
.messages
.iter()
.map(|message| message.to_proto())
.collect(),
stop: request.stop,
temperature: request.temperature,
};
self.client
.request_stream(request)
.map_ok(|stream| {
stream
.filter_map(|response| async move {
match response {
Ok(mut response) => Some(Ok(response.choices.pop()?.delta?.content?)),
Err(error) => Some(Err(error)),
}
})
.boxed()
})
.boxed()
}
}
struct AuthenticationPrompt;
impl Render for AuthenticationPrompt {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
const LABEL: &str = "Generate and analyze code with language models. You can dialog with the assistant in this panel or transform code inline.";
v_flex().gap_6().p_4().child(Label::new(LABEL)).child(
v_flex()
.gap_2()
.child(
Button::new("sign_in", "Sign in")
.icon_color(Color::Muted)
.icon(IconName::Github)
.icon_position(IconPosition::Start)
.style(ButtonStyle::Filled)
.full_width()
.on_click(|_, cx| {
CompletionProvider::global(cx)
.authenticate(cx)
.detach_and_log_err(cx);
}),
)
.child(
div().flex().w_full().items_center().child(
Label::new("Sign in to enable collaboration.")
.color(Color::Muted)
.size(LabelSize::Small),
),
),
)
}
}

View File

@@ -1,394 +1,95 @@
use ai::models::LanguageModel;
use ai::prompts::base::{PromptArguments, PromptChain, PromptPriority, PromptTemplate};
use ai::prompts::file_context::FileContext;
use ai::prompts::generate::GenerateInlineContent;
use ai::prompts::preamble::EngineerPreamble;
use ai::prompts::repository_context::{PromptCodeSnippet, RepositoryContext};
use ai::providers::open_ai::OpenAiLanguageModel;
use language::{BufferSnapshot, OffsetRangeExt, ToOffset};
use std::cmp::{self, Reverse};
use std::ops::Range;
use std::sync::Arc;
#[allow(dead_code)]
fn summarize(buffer: &BufferSnapshot, selected_range: Range<impl ToOffset>) -> String {
#[derive(Debug)]
struct Match {
collapse: Range<usize>,
keep: Vec<Range<usize>>,
}
let selected_range = selected_range.to_offset(buffer);
let mut ts_matches = buffer.matches(0..buffer.len(), |grammar| {
Some(&grammar.embedding_config.as_ref()?.query)
});
let configs = ts_matches
.grammars()
.iter()
.map(|g| g.embedding_config.as_ref().unwrap())
.collect::<Vec<_>>();
let mut matches = Vec::new();
while let Some(mat) = ts_matches.peek() {
let config = &configs[mat.grammar_index];
if let Some(collapse) = mat.captures.iter().find_map(|cap| {
if Some(cap.index) == config.collapse_capture_ix {
Some(cap.node.byte_range())
} else {
None
}
}) {
let mut keep = Vec::new();
for capture in mat.captures.iter() {
if Some(capture.index) == config.keep_capture_ix {
keep.push(capture.node.byte_range());
} else {
continue;
}
}
ts_matches.advance();
matches.push(Match { collapse, keep });
} else {
ts_matches.advance();
}
}
matches.sort_unstable_by_key(|mat| (mat.collapse.start, Reverse(mat.collapse.end)));
let mut matches = matches.into_iter().peekable();
let mut summary = String::new();
let mut offset = 0;
let mut flushed_selection = false;
while let Some(mat) = matches.next() {
// Keep extending the collapsed range if the next match surrounds
// the current one.
while let Some(next_mat) = matches.peek() {
if mat.collapse.start <= next_mat.collapse.start
&& mat.collapse.end >= next_mat.collapse.end
{
matches.next().unwrap();
} else {
break;
}
}
if offset > mat.collapse.start {
// Skip collapsed nodes that have already been summarized.
offset = cmp::max(offset, mat.collapse.end);
continue;
}
if offset <= selected_range.start && selected_range.start <= mat.collapse.end {
if !flushed_selection {
// The collapsed node ends after the selection starts, so we'll flush the selection first.
summary.extend(buffer.text_for_range(offset..selected_range.start));
summary.push_str("<|S|");
if selected_range.end == selected_range.start {
summary.push_str(">");
} else {
summary.extend(buffer.text_for_range(selected_range.clone()));
summary.push_str("|E|>");
}
offset = selected_range.end;
flushed_selection = true;
}
// If the selection intersects the collapsed node, we won't collapse it.
if selected_range.end >= mat.collapse.start {
continue;
}
}
summary.extend(buffer.text_for_range(offset..mat.collapse.start));
for keep in mat.keep {
summary.extend(buffer.text_for_range(keep));
}
offset = mat.collapse.end;
}
// Flush selection if we haven't already done so.
if !flushed_selection && offset <= selected_range.start {
summary.extend(buffer.text_for_range(offset..selected_range.start));
summary.push_str("<|S|");
if selected_range.end == selected_range.start {
summary.push_str(">");
} else {
summary.extend(buffer.text_for_range(selected_range.clone()));
summary.push_str("|E|>");
}
offset = selected_range.end;
}
summary.extend(buffer.text_for_range(offset..buffer.len()));
summary
}
use language::BufferSnapshot;
use std::{fmt::Write, ops::Range};
pub fn generate_content_prompt(
user_prompt: String,
language_name: Option<&str>,
buffer: BufferSnapshot,
range: Range<usize>,
search_results: Vec<PromptCodeSnippet>,
model: &str,
project_name: Option<String>,
) -> anyhow::Result<String> {
// Using new Prompt Templates
let openai_model: Arc<dyn LanguageModel> = Arc::new(OpenAiLanguageModel::load(model));
let lang_name = if let Some(language_name) = language_name {
Some(language_name.to_string())
let mut prompt = String::new();
let content_type = match language_name {
None | Some("Markdown" | "Plain Text") => {
writeln!(prompt, "You are an expert engineer.")?;
"Text"
}
Some(language_name) => {
writeln!(prompt, "You are an expert {language_name} engineer.")?;
writeln!(
prompt,
"Your answer MUST always and only be valid {}.",
language_name
)?;
"Code"
}
};
if let Some(project_name) = project_name {
writeln!(
prompt,
"You are currently working inside the '{project_name}' project in code editor Zed."
)?;
}
// Include file content.
for chunk in buffer.text_for_range(0..range.start) {
prompt.push_str(chunk);
}
if range.is_empty() {
prompt.push_str("<|START|>");
} else {
None
};
let args = PromptArguments {
model: openai_model,
language_name: lang_name.clone(),
project_name,
snippets: search_results.clone(),
reserved_tokens: 1000,
buffer: Some(buffer),
selected_range: Some(range),
user_prompt: Some(user_prompt.clone()),
};
let templates: Vec<(PromptPriority, Box<dyn PromptTemplate>)> = vec![
(PromptPriority::Mandatory, Box::new(EngineerPreamble {})),
(
PromptPriority::Ordered { order: 1 },
Box::new(RepositoryContext {}),
),
(
PromptPriority::Ordered { order: 0 },
Box::new(FileContext {}),
),
(
PromptPriority::Mandatory,
Box::new(GenerateInlineContent {}),
),
];
let chain = PromptChain::new(args, templates);
let (prompt, _) = chain.generate(true)?;
anyhow::Ok(prompt)
}
#[cfg(test)]
pub(crate) mod tests {
use super::*;
use gpui::{AppContext, Context};
use indoc::indoc;
use language::{
language_settings, tree_sitter_rust, Buffer, BufferId, Language, LanguageConfig,
LanguageMatcher, Point,
};
use settings::SettingsStore;
use std::sync::Arc;
pub(crate) fn rust_lang() -> Language {
Language::new(
LanguageConfig {
name: "Rust".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["rs".to_string()],
..Default::default()
},
..Default::default()
},
Some(tree_sitter_rust::language()),
)
.with_embedding_query(
r#"
(
[(line_comment) (attribute_item)]* @context
.
[
(struct_item
name: (_) @name)
(enum_item
name: (_) @name)
(impl_item
trait: (_)? @name
"for"? @name
type: (_) @name)
(trait_item
name: (_) @name)
(function_item
name: (_) @name
body: (block
"{" @keep
"}" @keep) @collapse)
(macro_definition
name: (_) @name)
] @item
)
"#,
)
.unwrap()
prompt.push_str("<|START|");
}
#[gpui::test]
fn test_outline_for_prompt(cx: &mut AppContext) {
let settings_store = SettingsStore::test(cx);
cx.set_global(settings_store);
language_settings::init(cx);
let text = indoc! {"
struct X {
a: usize,
b: usize,
}
impl X {
fn new() -> Self {
let a = 1;
let b = 2;
Self { a, b }
}
pub fn a(&self, param: bool) -> usize {
self.a
}
pub fn b(&self) -> usize {
self.b
}
}
"};
let buffer = cx.new_model(|cx| {
Buffer::new(0, BufferId::new(1).unwrap(), text).with_language(Arc::new(rust_lang()), cx)
});
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
summarize(&snapshot, Point::new(1, 4)..Point::new(1, 4)),
indoc! {"
struct X {
<|S|>a: usize,
b: usize,
}
impl X {
fn new() -> Self {}
pub fn a(&self, param: bool) -> usize {}
pub fn b(&self) -> usize {}
}
"}
);
assert_eq!(
summarize(&snapshot, Point::new(8, 12)..Point::new(8, 14)),
indoc! {"
struct X {
a: usize,
b: usize,
}
impl X {
fn new() -> Self {
let <|S|a |E|>= 1;
let b = 2;
Self { a, b }
}
pub fn a(&self, param: bool) -> usize {}
pub fn b(&self) -> usize {}
}
"}
);
assert_eq!(
summarize(&snapshot, Point::new(6, 0)..Point::new(6, 0)),
indoc! {"
struct X {
a: usize,
b: usize,
}
impl X {
<|S|>
fn new() -> Self {}
pub fn a(&self, param: bool) -> usize {}
pub fn b(&self) -> usize {}
}
"}
);
assert_eq!(
summarize(&snapshot, Point::new(21, 0)..Point::new(21, 0)),
indoc! {"
struct X {
a: usize,
b: usize,
}
impl X {
fn new() -> Self {}
pub fn a(&self, param: bool) -> usize {}
pub fn b(&self) -> usize {}
}
<|S|>"}
);
// Ensure nested functions get collapsed properly.
let text = indoc! {"
struct X {
a: usize,
b: usize,
}
impl X {
fn new() -> Self {
let a = 1;
let b = 2;
Self { a, b }
}
pub fn a(&self, param: bool) -> usize {
let a = 30;
fn nested() -> usize {
3
}
self.a + nested()
}
pub fn b(&self) -> usize {
self.b
}
}
"};
buffer.update(cx, |buffer, cx| buffer.set_text(text, cx));
let snapshot = buffer.read(cx).snapshot();
assert_eq!(
summarize(&snapshot, Point::new(0, 0)..Point::new(0, 0)),
indoc! {"
<|S|>struct X {
a: usize,
b: usize,
}
impl X {
fn new() -> Self {}
pub fn a(&self, param: bool) -> usize {}
pub fn b(&self) -> usize {}
}
"}
);
for chunk in buffer.text_for_range(range.clone()) {
prompt.push_str(chunk);
}
if !range.is_empty() {
prompt.push_str("|END|>");
}
for chunk in buffer.text_for_range(range.end..buffer.len()) {
prompt.push_str(chunk);
}
prompt.push('\n');
if range.is_empty() {
writeln!(
prompt,
"Assume the cursor is located where the `<|START|>` span is."
)
.unwrap();
writeln!(
prompt,
"{content_type} can't be replaced, so assume your answer will be inserted at the cursor.",
)
.unwrap();
writeln!(
prompt,
"Generate {content_type} based on the users prompt: {user_prompt}",
)
.unwrap();
} else {
writeln!(prompt, "Modify the user's selected {content_type} based upon the users prompt: '{user_prompt}'").unwrap();
writeln!(prompt, "You must reply with only the adjusted {content_type} (within the '<|START|' and '|END|>' spans) not the entire file.").unwrap();
writeln!(
prompt,
"Double check that you only return code and not the '<|START|' and '|END|'> spans"
)
.unwrap();
}
writeln!(prompt, "Never make remarks about the output.").unwrap();
writeln!(
prompt,
"Do not return anything else, except the generated {content_type}."
)
.unwrap();
Ok(prompt)
}

View File

@@ -0,0 +1,121 @@
use crate::{assistant_settings::OpenAiModel, MessageId, MessageMetadata};
use anyhow::{anyhow, Result};
use collections::HashMap;
use fs::Fs;
use futures::StreamExt;
use regex::Regex;
use serde::{Deserialize, Serialize};
use std::{
cmp::Reverse,
ffi::OsStr,
path::{Path, PathBuf},
sync::Arc,
};
use util::paths::CONVERSATIONS_DIR;
#[derive(Serialize, Deserialize)]
pub struct SavedMessage {
pub id: MessageId,
pub start: usize,
}
#[derive(Serialize, Deserialize)]
pub struct SavedConversation {
pub id: Option<String>,
pub zed: String,
pub version: String,
pub text: String,
pub messages: Vec<SavedMessage>,
pub message_metadata: HashMap<MessageId, MessageMetadata>,
pub summary: String,
}
impl SavedConversation {
pub const VERSION: &'static str = "0.2.0";
pub async fn load(path: &Path, fs: &dyn Fs) -> Result<Self> {
let saved_conversation = fs.load(path).await?;
let saved_conversation_json =
serde_json::from_str::<serde_json::Value>(&saved_conversation)?;
match saved_conversation_json
.get("version")
.ok_or_else(|| anyhow!("version not found"))?
{
serde_json::Value::String(version) => match version.as_str() {
Self::VERSION => Ok(serde_json::from_value::<Self>(saved_conversation_json)?),
"0.1.0" => {
let saved_conversation =
serde_json::from_value::<SavedConversationV0_1_0>(saved_conversation_json)?;
Ok(Self {
id: saved_conversation.id,
zed: saved_conversation.zed,
version: saved_conversation.version,
text: saved_conversation.text,
messages: saved_conversation.messages,
message_metadata: saved_conversation.message_metadata,
summary: saved_conversation.summary,
})
}
_ => Err(anyhow!(
"unrecognized saved conversation version: {}",
version
)),
},
_ => Err(anyhow!("version not found on saved conversation")),
}
}
}
#[derive(Serialize, Deserialize)]
struct SavedConversationV0_1_0 {
id: Option<String>,
zed: String,
version: String,
text: String,
messages: Vec<SavedMessage>,
message_metadata: HashMap<MessageId, MessageMetadata>,
summary: String,
api_url: Option<String>,
model: OpenAiModel,
}
pub struct SavedConversationMetadata {
pub title: String,
pub path: PathBuf,
pub mtime: chrono::DateTime<chrono::Local>,
}
impl SavedConversationMetadata {
pub async fn list(fs: Arc<dyn Fs>) -> Result<Vec<Self>> {
fs.create_dir(&CONVERSATIONS_DIR).await?;
let mut paths = fs.read_dir(&CONVERSATIONS_DIR).await?;
let mut conversations = Vec::<SavedConversationMetadata>::new();
while let Some(path) = paths.next().await {
let path = path?;
if path.extension() != Some(OsStr::new("json")) {
continue;
}
let pattern = r" - \d+.zed.json$";
let re = Regex::new(pattern).unwrap();
let metadata = fs.metadata(&path).await?;
if let Some((file_name, metadata)) = path
.file_name()
.and_then(|name| name.to_str())
.zip(metadata)
{
let title = re.replace(file_name, "");
conversations.push(Self {
title: title.into_owned(),
path,
mtime: metadata.mtime.into(),
});
}
}
conversations.sort_unstable_by_key(|conversation| Reverse(conversation.mtime));
Ok(conversations)
}
}
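// Illustrative sketch, not part of the change above: combining the two types
// in this file to load the most recent conversation. `list` already sorts by
// mtime descending, so the first entry is the newest; the function name is
// invented for this example.
#[allow(dead_code)]
async fn load_latest_conversation(fs: Arc<dyn Fs>) -> Result<Option<SavedConversation>> {
    let conversations = SavedConversationMetadata::list(fs.clone()).await?;
    match conversations.first() {
        Some(newest) => Ok(Some(SavedConversation::load(&newest.path, fs.as_ref()).await?)),
        None => Ok(None),
    }
}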

View File

@@ -197,12 +197,10 @@ impl StreamingDiff {
} else {
hunks.push(Hunk::Remove { len: char_len })
}
} else if let Some(Hunk::Keep { len }) = hunks.last_mut() {
*len += char_len;
} else {
if let Some(Hunk::Keep { len }) = hunks.last_mut() {
*len += char_len;
} else {
hunks.push(Hunk::Keep { len: char_len })
}
hunks.push(Hunk::Keep { len: char_len })
}
}

View File

@@ -1,6 +1,6 @@
use assets::SoundRegistry;
use derive_more::{Deref, DerefMut};
use gpui::{AppContext, AssetSource, Global};
use gpui::{AppContext, AssetSource, BorrowAppContext, Global};
use rodio::{OutputStream, OutputStreamHandle};
use util::ResultExt;

View File

@@ -51,6 +51,7 @@ pub struct ChannelMessage {
pub nonce: u128,
pub mentions: Vec<(Range<usize>, UserId)>,
pub reply_to_message_id: Option<u64>,
pub edited_at: Option<OffsetDateTime>,
}
#[derive(Copy, Clone, Debug, PartialEq, Eq, PartialOrd, Ord, Hash)]
@@ -83,6 +84,10 @@ pub enum ChannelChatEvent {
old_range: Range<usize>,
new_count: usize,
},
UpdateMessage {
message_id: ChannelMessageId,
message_ix: usize,
},
NewMessage {
channel_id: ChannelId,
message_id: u64,
@@ -93,6 +98,7 @@ impl EventEmitter<ChannelChatEvent> for ChannelChat {}
pub fn init(client: &Arc<Client>) {
client.add_model_message_handler(ChannelChat::handle_message_sent);
client.add_model_message_handler(ChannelChat::handle_message_removed);
client.add_model_message_handler(ChannelChat::handle_message_updated);
}
impl ChannelChat {
@@ -189,6 +195,7 @@ impl ChannelChat {
mentions: message.mentions.clone(),
nonce,
reply_to_message_id: message.reply_to_message_id,
edited_at: None,
},
&(),
),
@@ -234,6 +241,35 @@ impl ChannelChat {
})
}
pub fn update_message(
&mut self,
id: u64,
message: MessageParams,
cx: &mut ModelContext<Self>,
) -> Result<Task<Result<()>>> {
self.message_update(
ChannelMessageId::Saved(id),
message.text.clone(),
message.mentions.clone(),
Some(OffsetDateTime::now_utc()),
cx,
);
let nonce: u128 = self.rng.gen();
let request = self.rpc.request(proto::UpdateChannelMessage {
channel_id: self.channel_id.0,
message_id: id,
body: message.text,
nonce: Some(nonce.into()),
mentions: mentions_to_proto(&message.mentions),
});
Ok(cx.spawn(move |_, _| async move {
request.await?;
Ok(())
}))
}
pub fn load_more_messages(&mut self, cx: &mut ModelContext<Self>) -> Option<Task<Option<()>>> {
if self.loaded_all_messages {
return None;
@@ -523,6 +559,32 @@ impl ChannelChat {
Ok(())
}
async fn handle_message_updated(
this: Model<Self>,
message: TypedEnvelope<proto::ChannelMessageUpdate>,
_: Arc<Client>,
mut cx: AsyncAppContext,
) -> Result<()> {
let user_store = this.update(&mut cx, |this, _| this.user_store.clone())?;
let message = message
.payload
.message
.ok_or_else(|| anyhow!("empty message"))?;
let message = ChannelMessage::from_proto(message, &user_store, &mut cx).await?;
this.update(&mut cx, |this, cx| {
this.message_update(
message.id,
message.body,
message.mentions,
message.edited_at,
cx,
)
})?;
Ok(())
}
fn insert_messages(&mut self, messages: SumTree<ChannelMessage>, cx: &mut ModelContext<Self>) {
if let Some((first_message, last_message)) = messages.first().zip(messages.last()) {
let nonces = messages
@@ -599,6 +661,38 @@ impl ChannelChat {
}
}
}
fn message_update(
&mut self,
id: ChannelMessageId,
body: String,
mentions: Vec<(Range<usize>, u64)>,
edited_at: Option<OffsetDateTime>,
cx: &mut ModelContext<Self>,
) {
let mut cursor = self.messages.cursor::<ChannelMessageId>();
let mut messages = cursor.slice(&id, Bias::Left, &());
let ix = messages.summary().count;
if let Some(mut message_to_update) = cursor.item().cloned() {
message_to_update.body = body;
message_to_update.mentions = mentions;
message_to_update.edited_at = edited_at;
messages.push(message_to_update, &());
cursor.next(&());
}
messages.append(cursor.suffix(&()), &());
drop(cursor);
self.messages = messages;
cx.emit(ChannelChatEvent::UpdateMessage {
message_ix: ix,
message_id: id,
});
cx.notify();
}
}
async fn messages_from_proto(
@@ -623,6 +717,15 @@ impl ChannelMessage {
user_store.get_user(message.sender_id, cx)
})?
.await?;
let edited_at = message.edited_at.and_then(|t| -> Option<OffsetDateTime> {
if let Ok(a) = OffsetDateTime::from_unix_timestamp(t as i64) {
return Some(a);
}
None
});
Ok(ChannelMessage {
id: ChannelMessageId::Saved(message.id),
body: message.body,
@@ -641,6 +744,7 @@ impl ChannelMessage {
.ok_or_else(|| anyhow!("nonce is required"))?
.into(),
reply_to_message_id: message.reply_to_message_id,
edited_at,
})
}

View File

@@ -17,7 +17,7 @@ use rpc::{
};
use settings::Settings;
use std::{mem, sync::Arc, time::Duration};
use util::{async_maybe, maybe, ResultExt};
use util::{maybe, ResultExt};
pub const RECONNECT_TIMEOUT: Duration = Duration::from_secs(30);
@@ -227,7 +227,7 @@ impl ChannelStore {
_watch_connection_status: watch_connection_status,
disconnect_channel_buffers_task: None,
_update_channels: cx.spawn(|this, mut cx| async move {
async_maybe!({
maybe!(async move {
while let Some(update_channels) = update_channels_rx.next().await {
if let Some(this) = this.upgrade() {
let update_task = this.update(&mut cx, |this, cx| {

View File

@@ -186,6 +186,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
mentions: vec![],
nonce: Some(1.into()),
reply_to_message_id: None,
edited_at: None,
},
proto::ChannelMessage {
id: 11,
@@ -195,6 +196,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
mentions: vec![],
nonce: Some(2.into()),
reply_to_message_id: None,
edited_at: None,
},
],
done: false,
@@ -243,6 +245,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
mentions: vec![],
nonce: Some(3.into()),
reply_to_message_id: None,
edited_at: None,
}),
});
@@ -297,6 +300,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
nonce: Some(4.into()),
mentions: vec![],
reply_to_message_id: None,
edited_at: None,
},
proto::ChannelMessage {
id: 9,
@@ -306,6 +310,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
nonce: Some(5.into()),
mentions: vec![],
reply_to_message_id: None,
edited_at: None,
},
],
},

View File

@@ -13,11 +13,12 @@ use async_tungstenite::tungstenite::{
use clock::SystemClock;
use collections::HashMap;
use futures::{
channel::oneshot, future::LocalBoxFuture, AsyncReadExt, FutureExt, SinkExt, StreamExt,
channel::oneshot, future::LocalBoxFuture, AsyncReadExt, FutureExt, SinkExt, Stream, StreamExt,
TryFutureExt as _, TryStreamExt,
};
use gpui::{
actions, AnyModel, AnyWeakModel, AppContext, AsyncAppContext, Global, Model, Task, WeakModel,
actions, AnyModel, AnyWeakModel, AppContext, AsyncAppContext, BorrowAppContext, Global, Model,
Task, WeakModel,
};
use lazy_static::lazy_static;
use parking_lot::RwLock;
@@ -27,8 +28,8 @@ use release_channel::{AppVersion, ReleaseChannel};
use rpc::proto::{AnyTypedEnvelope, EntityMessage, EnvelopedMessage, PeerId, RequestMessage};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsStore};
use std::fmt;
use std::{
any::TypeId,
convert::TryFrom,
@@ -36,7 +37,10 @@ use std::{
future::Future,
marker::PhantomData,
path::PathBuf,
sync::{atomic::AtomicU64, Arc, Weak},
sync::{
atomic::{AtomicU64, Ordering},
Arc, Weak,
},
time::{Duration, Instant},
};
use telemetry::Telemetry;
@@ -49,6 +53,15 @@ pub use rpc::*;
pub use telemetry_events::Event;
pub use user::*;
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct DevServerToken(pub String);
impl fmt::Display for DevServerToken {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "{}", self.0)
}
}
lazy_static! {
static ref ZED_SERVER_URL: Option<String> = std::env::var("ZED_SERVER_URL").ok();
static ref ZED_RPC_URL: Option<String> = std::env::var("ZED_RPC_URL").ok();
@@ -274,10 +287,22 @@ enum WeakSubscriber {
Pending(Vec<Box<dyn AnyTypedEnvelope>>),
}
#[derive(Clone, Debug)]
pub struct Credentials {
pub user_id: u64,
pub access_token: String,
#[derive(Clone, Debug, Eq, PartialEq)]
pub enum Credentials {
DevServer { token: DevServerToken },
User { user_id: u64, access_token: String },
}
impl Credentials {
pub fn authorization_header(&self) -> String {
match self {
Credentials::DevServer { token } => format!("dev-server-token {}", token),
Credentials::User {
user_id,
access_token,
} => format!("{} {}", user_id, access_token),
}
}
}
impl Default for ClientState {
@@ -442,7 +467,7 @@ impl Client {
}
pub fn id(&self) -> u64 {
self.id.load(std::sync::atomic::Ordering::SeqCst)
self.id.load(Ordering::SeqCst)
}
pub fn http_client(&self) -> Arc<HttpClientWithUrl> {
@@ -450,7 +475,7 @@ impl Client {
}
pub fn set_id(&self, id: u64) -> &Self {
self.id.store(id, std::sync::atomic::Ordering::SeqCst);
self.id.store(id, Ordering::SeqCst);
self
}
@@ -494,11 +519,11 @@ impl Client {
}
pub fn user_id(&self) -> Option<u64> {
self.state
.read()
.credentials
.as_ref()
.map(|credentials| credentials.user_id)
if let Some(Credentials::User { user_id, .. }) = self.state.read().credentials.as_ref() {
Some(*user_id)
} else {
None
}
}
pub fn peer_id(&self) -> Option<PeerId> {
@@ -743,6 +768,10 @@ impl Client {
read_credentials_from_keychain(cx).await.is_some()
}
pub fn set_dev_server_token(&self, token: DevServerToken) {
self.state.write().credentials = Some(Credentials::DevServer { token });
}
#[async_recursion(?Send)]
pub async fn authenticate_and_connect(
self: &Arc<Self>,
@@ -793,7 +822,9 @@ impl Client {
}
}
let credentials = credentials.unwrap();
self.set_id(credentials.user_id);
if let Credentials::User { user_id, .. } = &credentials {
self.set_id(*user_id);
}
if was_disconnected {
self.set_status(Status::Connecting, cx);
@@ -809,7 +840,9 @@ impl Client {
Ok(conn) => {
self.state.write().credentials = Some(credentials.clone());
if !read_from_keychain && IMPERSONATE_LOGIN.is_none() {
write_credentials_to_keychain(credentials, cx).await.log_err();
if let Credentials::User{user_id, access_token} = credentials {
write_credentials_to_keychain(user_id, access_token, cx).await.log_err();
}
}
futures::select_biased! {
@@ -1017,10 +1050,7 @@ impl Client {
.unwrap_or_default();
let request = Request::builder()
.header(
"Authorization",
format!("{} {}", credentials.user_id, credentials.access_token),
)
.header("Authorization", credentials.authorization_header())
.header("x-zed-protocol-version", rpc::PROTOCOL_VERSION)
.header("x-zed-app-version", app_version)
.header(
@@ -1173,7 +1203,7 @@ impl Client {
.decrypt_string(&access_token)
.context("failed to decrypt access token")?;
Ok(Credentials {
Ok(Credentials::User {
user_id: user_id.parse()?,
access_token,
})
@@ -1223,7 +1253,7 @@ impl Client {
// Use the admin API token to authenticate as the impersonated user.
api_token.insert_str(0, "ADMIN_TOKEN:");
Ok(Credentials {
Ok(Credentials::User {
user_id: response.user.id,
access_token: api_token,
})
@@ -1260,6 +1290,30 @@ impl Client {
.map_ok(|envelope| envelope.payload)
}
pub fn request_stream<T: RequestMessage>(
&self,
request: T,
) -> impl Future<Output = Result<impl Stream<Item = Result<T::Response>>>> {
let client_id = self.id.load(Ordering::SeqCst);
log::debug!(
"rpc request start. client_id:{}. name:{}",
client_id,
T::NAME
);
let response = self
.connection_id()
.map(|conn_id| self.peer.request_stream(conn_id, request));
async move {
let response = response?.await;
log::debug!(
"rpc request finish. client_id:{}. name:{}",
client_id,
T::NAME
);
response
}
}
pub fn request_envelope<T: RequestMessage>(
&self,
request: T,
@@ -1412,21 +1466,22 @@ async fn read_credentials_from_keychain(cx: &AsyncAppContext) -> Option<Credenti
.await
.log_err()??;
Some(Credentials {
Some(Credentials::User {
user_id: user_id.parse().ok()?,
access_token: String::from_utf8(access_token).ok()?,
})
}
async fn write_credentials_to_keychain(
credentials: Credentials,
user_id: u64,
access_token: String,
cx: &AsyncAppContext,
) -> Result<()> {
cx.update(move |cx| {
cx.write_credentials(
&ClientSettings::get_global(cx).server_url,
&credentials.user_id.to_string(),
credentials.access_token.as_bytes(),
&user_id.to_string(),
access_token.as_bytes(),
)
})?
.await
@@ -1531,7 +1586,7 @@ mod tests {
// Time out when client tries to connect.
client.override_authenticate(move |cx| {
cx.background_executor().spawn(async move {
Ok(Credentials {
Ok(Credentials::User {
user_id,
access_token: "token".into(),
})

View File

@@ -15,7 +15,8 @@ use std::{env, mem, path::PathBuf, sync::Arc, time::Duration};
use sysinfo::{CpuRefreshKind, MemoryRefreshKind, Pid, ProcessRefreshKind, RefreshKind, System};
use telemetry_events::{
ActionEvent, AppEvent, AssistantEvent, AssistantKind, CallEvent, CopilotEvent, CpuEvent,
EditEvent, EditorEvent, Event, EventRequestBody, EventWrapper, MemoryEvent, SettingEvent,
EditEvent, EditorEvent, Event, EventRequestBody, EventWrapper, ExtensionEvent, MemoryEvent,
SettingEvent,
};
use tempfile::NamedTempFile;
use util::http::{self, HttpClient, HttpClientWithUrl, Method};
@@ -261,7 +262,7 @@ impl Telemetry {
self: &Arc<Self>,
conversation_id: Option<String>,
kind: AssistantKind,
model: &str,
model: String,
) {
let event = Event::Assistant(AssistantEvent {
conversation_id,
@@ -326,6 +327,13 @@ impl Telemetry {
self.report_event(event)
}
pub fn report_extension_event(self: &Arc<Self>, extension_id: Arc<str>, version: Arc<str>) {
self.report_event(Event::Extension(ExtensionEvent {
extension_id,
version,
}))
}
pub fn log_edit_event(self: &Arc<Self>, environment: &'static str) {
let mut state = self.state.lock();
let period_data = state.event_coalescer.log_event(environment);
@@ -470,7 +478,11 @@ impl Telemetry {
let request = http::Request::builder()
.method(Method::POST)
.uri(this.http_client.build_zed_api_url("/telemetry/events"))
.uri(
this.http_client
.build_zed_api_url("/telemetry/events", &[])?
.as_ref(),
)
.header("Content-Type", "text/plain")
.header("x-zed-checksum", checksum)
.body(json_bytes.into());

View File

@@ -48,7 +48,7 @@ impl FakeServer {
let mut state = state.lock();
state.auth_count += 1;
let access_token = state.access_token.to_string();
Ok(Credentials {
Ok(Credentials::User {
user_id: client_user_id,
access_token,
})
@@ -71,9 +71,12 @@ impl FakeServer {
)))?
}
assert_eq!(credentials.user_id, client_user_id);
if credentials.access_token != state.lock().access_token.to_string() {
if credentials
!= (Credentials::User {
user_id: client_user_id,
access_token: state.lock().access_token.to_string(),
})
{
Err(EstablishConnectionError::Unauthorized)?
}

View File

@@ -1,8 +0,0 @@
[
"nathansobo",
"as-cii",
"maxbrunsfeld",
"iamnbutler",
"mikayla-maki",
"JosephTLyons"
]

View File

@@ -1,4 +1,5 @@
DATABASE_URL = "postgres://postgres@localhost/zed"
# DATABASE_URL = "sqlite:////home/zed/.config/zed/db.sqlite3?mode=rwc"
DATABASE_MAX_CONNECTIONS = 5
HTTP_PORT = 8080
API_TOKEN = "secret"
@@ -13,6 +14,7 @@ BLOB_STORE_BUCKET = "the-extensions-bucket"
BLOB_STORE_URL = "http://127.0.0.1:9000"
BLOB_STORE_REGION = "the-region"
ZED_CLIENT_CHECKSUM_SEED = "development-checksum-seed"
SEED_PATH = "crates/collab/seed.default.json"
# CLICKHOUSE_URL = ""
# CLICKHOUSE_USER = "default"

View File

@@ -13,8 +13,9 @@ workspace = true
[[bin]]
name = "collab"
[[bin]]
name = "seed"
[features]
sqlite = ["sea-orm/sqlx-sqlite", "sqlx/sqlite"]
test-support = ["sqlite"]
[dependencies]
anyhow.workspace = true
@@ -31,10 +32,12 @@ collections.workspace = true
dashmap = "5.4"
envy = "0.4.2"
futures.workspace = true
google_ai.workspace = true
hex.workspace = true
live_kit_server.workspace = true
log.workspace = true
nanoid = "0.4"
open_ai.workspace = true
parking_lot.workspace = true
prometheus = "0.13"
prost.workspace = true
@@ -54,7 +57,7 @@ rustc-demangle.workspace = true
telemetry_events.workspace = true
text.workspace = true
time.workspace = true
tokio = { version = "1", features = ["full"] }
tokio.workspace = true
toml.workspace = true
tower = "0.4"
tower-http = { workspace = true, features = ["trace"] }
@@ -80,7 +83,6 @@ git = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
indoc.workspace = true
language = { workspace = true, features = ["test-support"] }
lazy_static.workspace = true
live_kit_client = { workspace = true, features = ["test-support"] }
lsp = { workspace = true, features = ["test-support"] }
menu.workspace = true

View File

@@ -6,21 +6,21 @@ It contains our back-end logic for collaboration, to which we connect from the Z
# Local Development
Detailed instructions on getting started are [here](https://zed.dev/docs/local-collaboration).
# Deployment
We run two instances of collab:
* Staging (https://staging-collab.zed.dev)
* Production (https://collab.zed.dev)
- Staging (https://staging-collab.zed.dev)
- Production (https://collab.zed.dev)
Both of these run on the Kubernetes cluster hosted in Digital Ocean.
Deployment is triggered by pushing to the `collab-staging` (or `collab-production`) tag in Github. The best way to do this is:
* `./script/deploy-collab staging`
* `./script/deploy-collab production`
- `./script/deploy-collab staging`
- `./script/deploy-collab production`
You can tell what is currently deployed with `./script/what-is-deployed`.
@@ -29,7 +29,7 @@ You can tell what is currently deployed with `./script/what-is-deployed`.
To create a new migration:
```
./script/sqlx migrate add <name>
./script/create-migration <name>
```
Migrations are run automatically on service start, so run `foreman start` again. The service will crash if the migrations fail.

View File

@@ -1,12 +0,0 @@
[Interface]
PrivateKey = B5Fp/yVfP0QYlb+YJv9ea+EMI1mWODPD3akh91cVjvc=
Address = fdaa:0:2ce3:a7b:bea:0:a:2/120
DNS = fdaa:0:2ce3::3
[Peer]
PublicKey = RKAYPljEJiuaELNDdQIEJmQienT9+LRISfIHwH45HAw=
AllowedIPs = fdaa:0:2ce3::/48
Endpoint = ord1.gateway.6pn.dev:51820
PersistentKeepalive = 15

View File

@@ -125,6 +125,11 @@ spec:
secretKeyRef:
name: livekit
key: secret
- name: OPENAI_API_KEY
valueFrom:
secretKeyRef:
name: openai
key: api_key
- name: BLOB_STORE_ACCESS_KEY
valueFrom:
secretKeyRef:

View File

@@ -219,6 +219,7 @@ CREATE TABLE IF NOT EXISTS "channel_messages" (
"sender_id" INTEGER NOT NULL REFERENCES users (id),
"body" TEXT NOT NULL,
"sent_at" TIMESTAMP,
"edited_at" TIMESTAMP,
"nonce" BLOB NOT NULL,
"reply_to_message_id" INTEGER DEFAULT NULL
);
@@ -372,6 +373,8 @@ CREATE TABLE extension_versions (
authors TEXT NOT NULL,
repository TEXT NOT NULL,
description TEXT NOT NULL,
schema_version INTEGER NOT NULL DEFAULT 0,
wasm_api_version TEXT,
download_count INTEGER NOT NULL DEFAULT 0,
PRIMARY KEY (extension_id, version)
);
@@ -379,6 +382,16 @@ CREATE TABLE extension_versions (
CREATE UNIQUE INDEX "index_extensions_external_id" ON "extensions" ("external_id");
CREATE INDEX "index_extensions_total_download_count" ON "extensions" ("total_download_count");
CREATE TABLE rate_buckets (
user_id INT NOT NULL,
rate_limit_name VARCHAR(255) NOT NULL,
token_count INT NOT NULL,
last_refill TIMESTAMP WITHOUT TIME ZONE NOT NULL,
PRIMARY KEY (user_id, rate_limit_name),
FOREIGN KEY (user_id) REFERENCES users(id)
);
CREATE INDEX idx_user_id_rate_limit ON rate_buckets (user_id, rate_limit_name);
CREATE TABLE hosted_projects (
id INTEGER PRIMARY KEY AUTOINCREMENT,
channel_id INTEGER NOT NULL REFERENCES channels(id),
@@ -388,3 +401,11 @@ CREATE TABLE hosted_projects (
);
CREATE INDEX idx_hosted_projects_on_channel_id ON hosted_projects (channel_id);
CREATE UNIQUE INDEX uix_hosted_projects_on_channel_id_and_name ON hosted_projects (channel_id, name) WHERE (deleted_at IS NULL);
CREATE TABLE dev_servers (
id INTEGER PRIMARY KEY AUTOINCREMENT,
channel_id INTEGER NOT NULL REFERENCES channels(id),
name TEXT NOT NULL,
hashed_token TEXT NOT NULL
);
CREATE INDEX idx_dev_servers_on_channel_id ON dev_servers (channel_id);

View File

@@ -0,0 +1,11 @@
CREATE TABLE IF NOT EXISTS rate_buckets (
user_id INT NOT NULL,
rate_limit_name VARCHAR(255) NOT NULL,
token_count INT NOT NULL,
last_refill TIMESTAMP WITHOUT TIME ZONE NOT NULL,
PRIMARY KEY (user_id, rate_limit_name),
CONSTRAINT fk_user
FOREIGN KEY (user_id) REFERENCES users(id)
);
CREATE INDEX idx_user_id_rate_limit ON rate_buckets (user_id, rate_limit_name);

View File

@@ -0,0 +1 @@
ALTER TABLE channel_messages ADD edited_at TIMESTAMP DEFAULT NULL;

View File

@@ -0,0 +1,2 @@
-- Add migration script here
ALTER TABLE extension_versions ADD COLUMN schema_version INTEGER NOT NULL DEFAULT 0;

View File

@@ -0,0 +1,7 @@
CREATE TABLE dev_servers (
id INT PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
channel_id INT NOT NULL REFERENCES channels(id),
name TEXT NOT NULL,
hashed_token TEXT NOT NULL
);
CREATE INDEX idx_dev_servers_on_channel_id ON dev_servers (channel_id);

View File

@@ -0,0 +1 @@
ALTER TABLE extension_versions ADD COLUMN wasm_api_version TEXT;

View File

@@ -0,0 +1,12 @@
{
"admins": [
"nathansobo",
"as-cii",
"maxbrunsfeld",
"iamnbutler",
"mikayla-maki",
"JosephTLyons"
],
"channels": ["zed"],
"number_of_users": 100
}

crates/collab/src/ai.rs (new file)
View File

@@ -0,0 +1,75 @@
use anyhow::{anyhow, Result};
use rpc::proto;
pub fn language_model_request_to_open_ai(
request: proto::CompleteWithLanguageModel,
) -> Result<open_ai::Request> {
Ok(open_ai::Request {
model: open_ai::Model::from_id(&request.model).unwrap_or(open_ai::Model::FourTurbo),
messages: request
.messages
.into_iter()
.map(|message| {
let role = proto::LanguageModelRole::from_i32(message.role)
.ok_or_else(|| anyhow!("invalid role {}", message.role))?;
Ok(open_ai::RequestMessage {
role: match role {
proto::LanguageModelRole::LanguageModelUser => open_ai::Role::User,
proto::LanguageModelRole::LanguageModelAssistant => {
open_ai::Role::Assistant
}
proto::LanguageModelRole::LanguageModelSystem => open_ai::Role::System,
},
content: message.content,
})
})
.collect::<Result<Vec<open_ai::RequestMessage>>>()?,
stream: true,
stop: request.stop,
temperature: request.temperature,
})
}
pub fn language_model_request_to_google_ai(
request: proto::CompleteWithLanguageModel,
) -> Result<google_ai::GenerateContentRequest> {
Ok(google_ai::GenerateContentRequest {
contents: request
.messages
.into_iter()
.map(language_model_request_message_to_google_ai)
.collect::<Result<Vec<_>>>()?,
generation_config: None,
safety_settings: None,
})
}
pub fn language_model_request_message_to_google_ai(
message: proto::LanguageModelRequestMessage,
) -> Result<google_ai::Content> {
let role = proto::LanguageModelRole::from_i32(message.role)
.ok_or_else(|| anyhow!("invalid role {}", message.role))?;
Ok(google_ai::Content {
parts: vec![google_ai::Part::TextPart(google_ai::TextPart {
text: message.content,
})],
role: match role {
proto::LanguageModelRole::LanguageModelUser => google_ai::Role::User,
proto::LanguageModelRole::LanguageModelAssistant => google_ai::Role::Model,
proto::LanguageModelRole::LanguageModelSystem => google_ai::Role::User,
},
})
}
pub fn count_tokens_request_to_google_ai(
request: proto::CountTokensWithLanguageModel,
) -> Result<google_ai::CountTokensRequest> {
Ok(google_ai::CountTokensRequest {
contents: request
.messages
.into_iter()
.map(language_model_request_message_to_google_ai)
.collect::<Result<Vec<_>>>()?,
})
}

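A minimal sketch of the role mapping these conversion helpers perform, using hypothetical local enums in place of the real `proto`, `open_ai`, and `google_ai` types. The point it illustrates, taken from the code above, is that OpenAI keeps a dedicated system role, while the Google mapping has only user and model roles, so system messages are sent as user content:

```
// Hypothetical stand-in enums; the real types live in the proto, open_ai,
// and google_ai crates.
#[derive(Debug, PartialEq)]
enum ProtoRole { User, Assistant, System }

#[derive(Debug, PartialEq)]
enum OpenAiRole { User, Assistant, System }

#[derive(Debug, PartialEq)]
enum GoogleRole { User, Model }

fn to_open_ai(role: &ProtoRole) -> OpenAiRole {
    match role {
        ProtoRole::User => OpenAiRole::User,
        ProtoRole::Assistant => OpenAiRole::Assistant,
        ProtoRole::System => OpenAiRole::System,
    }
}

fn to_google_ai(role: &ProtoRole) -> GoogleRole {
    match role {
        // No system role on the Google side of this mapping, so system
        // prompts go out with the user role.
        ProtoRole::User | ProtoRole::System => GoogleRole::User,
        ProtoRole::Assistant => GoogleRole::Model,
    }
}

fn main() {
    assert_eq!(to_open_ai(&ProtoRole::System), OpenAiRole::System);
    assert_eq!(to_google_ai(&ProtoRole::System), GoogleRole::User);
    assert_eq!(to_google_ai(&ProtoRole::Assistant), GoogleRole::Model);
}
```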
View File

@@ -1,5 +1,5 @@
use std::sync::{Arc, OnceLock};
use super::ips_file::IpsFile;
use crate::{api::slack, AppState, Error, Result};
use anyhow::{anyhow, Context};
use aws_sdk_s3::primitives::ByteStream;
use axum::{
@@ -9,18 +9,16 @@ use axum::{
routing::post,
Extension, Router, TypedHeader,
};
use rpc::ExtensionMetadata;
use serde::{Serialize, Serializer};
use sha2::{Digest, Sha256};
use std::sync::{Arc, OnceLock};
use telemetry_events::{
ActionEvent, AppEvent, AssistantEvent, CallEvent, CopilotEvent, CpuEvent, EditEvent,
EditorEvent, Event, EventRequestBody, EventWrapper, MemoryEvent, SettingEvent,
EditorEvent, Event, EventRequestBody, EventWrapper, ExtensionEvent, MemoryEvent, SettingEvent,
};
use util::SemanticVersion;
use crate::{api::slack, AppState, Error, Result};
use super::ips_file::IpsFile;
pub fn router() -> Router {
Router::new()
.route("/telemetry/events", post(post_events))
@@ -331,6 +329,21 @@ pub async fn post_events(
&request_body,
first_event_at,
)),
Event::Extension(event) => {
let metadata = app
.db
.get_extension_version(&event.extension_id, &event.version)
.await?;
to_upload
.extension_events
.push(ExtensionEventRow::from_event(
event.clone(),
&wrapper,
&request_body,
metadata,
first_event_at,
))
}
}
}
@@ -352,6 +365,7 @@ struct ToUpload {
memory_events: Vec<MemoryEventRow>,
app_events: Vec<AppEventRow>,
setting_events: Vec<SettingEventRow>,
extension_events: Vec<ExtensionEventRow>,
edit_events: Vec<EditEventRow>,
action_events: Vec<ActionEventRow>,
}
@@ -410,6 +424,15 @@ impl ToUpload {
.await
.with_context(|| format!("failed to upload to table '{SETTING_EVENTS_TABLE}'"))?;
const EXTENSION_EVENTS_TABLE: &str = "extension_events";
Self::upload_to_table(
EXTENSION_EVENTS_TABLE,
&self.extension_events,
clickhouse_client,
)
.await
.with_context(|| format!("failed to upload to table '{EXTENSION_EVENTS_TABLE}'"))?;
const EDIT_EVENTS_TABLE: &str = "edit_events";
Self::upload_to_table(EDIT_EVENTS_TABLE, &self.edit_events, clickhouse_client)
.await
@@ -861,6 +884,68 @@ impl SettingEventRow {
}
}
#[derive(Serialize, Debug, clickhouse::Row)]
pub struct ExtensionEventRow {
// AppInfoBase
app_version: String,
major: Option<i32>,
minor: Option<i32>,
patch: Option<i32>,
release_channel: String,
// ClientEventBase
installation_id: Option<String>,
session_id: Option<String>,
is_staff: Option<bool>,
time: i64,
// ExtensionEventRow
extension_id: Arc<str>,
extension_version: Arc<str>,
dev: bool,
schema_version: Option<i32>,
wasm_api_version: Option<String>,
}
impl ExtensionEventRow {
fn from_event(
event: ExtensionEvent,
wrapper: &EventWrapper,
body: &EventRequestBody,
extension_metadata: Option<ExtensionMetadata>,
first_event_at: chrono::DateTime<chrono::Utc>,
) -> Self {
let semver = body.semver();
let time =
first_event_at + chrono::Duration::milliseconds(wrapper.milliseconds_since_first_event);
Self {
app_version: body.app_version.clone(),
major: semver.map(|s| s.major as i32),
minor: semver.map(|s| s.minor as i32),
patch: semver.map(|s| s.patch as i32),
release_channel: body.release_channel.clone().unwrap_or_default(),
installation_id: body.installation_id.clone(),
session_id: body.session_id.clone(),
is_staff: body.is_staff,
time: time.timestamp_millis(),
extension_id: event.extension_id,
extension_version: event.version,
dev: extension_metadata.is_none(),
schema_version: extension_metadata
.as_ref()
.and_then(|metadata| metadata.manifest.schema_version),
wasm_api_version: extension_metadata.as_ref().and_then(|metadata| {
metadata
.manifest
.wasm_api_version
.as_ref()
.map(|version| version.to_string())
}),
}
}
}
#[derive(Serialize, Debug, clickhouse::Row)]
pub struct EditEventRow {
// AppInfoBase

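One non-obvious detail above is that each wrapped event carries only `milliseconds_since_first_event`, so its absolute timestamp is reconstructed from `first_event_at` plus that offset. A small sketch of that arithmetic, assuming the `chrono` crate; the `event_time` helper is illustrative, not part of the collab API:

```
use chrono::{DateTime, Duration, TimeZone, Utc};

// Reconstructs an event's absolute time from the batch's first-event time and
// the event's stored millisecond offset, returning epoch milliseconds.
fn event_time(first_event_at: DateTime<Utc>, milliseconds_since_first_event: i64) -> i64 {
    (first_event_at + Duration::milliseconds(milliseconds_since_first_event)).timestamp_millis()
}

fn main() {
    let first = Utc.timestamp_millis_opt(1_700_000_000_000).unwrap();
    assert_eq!(event_time(first, 250), 1_700_000_000_250);
}
```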
View File

@@ -1,8 +1,4 @@
use crate::{
db::{ExtensionMetadata, NewExtensionVersion},
executor::Executor,
AppState, Error, Result,
};
use crate::{db::NewExtensionVersion, AppState, Error, Result};
use anyhow::{anyhow, Context as _};
use aws_sdk_s3::presigning::PresigningConfig;
use axum::{
@@ -13,7 +9,8 @@ use axum::{
Extension, Json, Router,
};
use collections::HashMap;
use serde::{Deserialize, Serialize};
use rpc::{ExtensionApiManifest, GetExtensionsResponse};
use serde::Deserialize;
use std::{sync::Arc, time::Duration};
use time::PrimitiveDateTime;
use util::ResultExt;
@@ -21,6 +18,10 @@ use util::ResultExt;
pub fn router() -> Router {
Router::new()
.route("/extensions", get(get_extensions))
.route(
"/extensions/:extension_id/download",
get(download_latest_extension),
)
.route(
"/extensions/:extension_id/:version/download",
get(download_extension),
@@ -30,6 +31,13 @@ pub fn router() -> Router {
#[derive(Debug, Deserialize)]
struct GetExtensionsParams {
filter: Option<String>,
#[serde(default)]
max_schema_version: i32,
}
#[derive(Debug, Deserialize)]
struct DownloadLatestExtensionParams {
extension_id: String,
}
#[derive(Debug, Deserialize)]
@@ -38,28 +46,36 @@ struct DownloadExtensionParams {
version: String,
}
#[derive(Debug, Serialize)]
struct GetExtensionsResponse {
pub data: Vec<ExtensionMetadata>,
}
#[derive(Deserialize)]
struct ExtensionManifest {
name: String,
version: String,
description: Option<String>,
authors: Vec<String>,
repository: String,
}
async fn get_extensions(
Extension(app): Extension<Arc<AppState>>,
Query(params): Query<GetExtensionsParams>,
) -> Result<Json<GetExtensionsResponse>> {
let extensions = app.db.get_extensions(params.filter.as_deref(), 500).await?;
let extensions = app
.db
.get_extensions(params.filter.as_deref(), params.max_schema_version, 500)
.await?;
Ok(Json(GetExtensionsResponse { data: extensions }))
}
async fn download_latest_extension(
Extension(app): Extension<Arc<AppState>>,
Path(params): Path<DownloadLatestExtensionParams>,
) -> Result<Redirect> {
let extension = app
.db
.get_extension(&params.extension_id)
.await?
.ok_or_else(|| anyhow!("unknown extension"))?;
download_extension(
Extension(app),
Path(DownloadExtensionParams {
extension_id: params.extension_id,
version: extension.manifest.version.to_string(),
}),
)
.await
}
async fn download_extension(
Extension(app): Extension<Arc<AppState>>,
Path(params): Path<DownloadExtensionParams>,
@@ -108,7 +124,7 @@ async fn download_extension(
const EXTENSION_FETCH_INTERVAL: Duration = Duration::from_secs(5 * 60);
const EXTENSION_DOWNLOAD_URL_LIFETIME: Duration = Duration::from_secs(3 * 60);
pub fn fetch_extensions_from_blob_store_periodically(app_state: Arc<AppState>, executor: Executor) {
pub fn fetch_extensions_from_blob_store_periodically(app_state: Arc<AppState>) {
let Some(blob_store_client) = app_state.blob_store_client.clone() else {
log::info!("no blob store client");
return;
@@ -118,6 +134,7 @@ pub fn fetch_extensions_from_blob_store_periodically(app_state: Arc<AppState>, e
return;
};
let executor = app_state.executor.clone();
executor.spawn_detached({
let executor = executor.clone();
async move {
@@ -239,7 +256,7 @@ async fn fetch_extension_manifest(
})?
.to_vec();
let manifest =
serde_json::from_slice::<ExtensionManifest>(&manifest_bytes).with_context(|| {
serde_json::from_slice::<ExtensionApiManifest>(&manifest_bytes).with_context(|| {
format!(
"invalid manifest for extension {extension_id} version {version}: {}",
String::from_utf8_lossy(&manifest_bytes)
@@ -259,6 +276,8 @@ async fn fetch_extension_manifest(
description: manifest.description.unwrap_or_default(),
authors: manifest.authors,
repository: manifest.repository,
schema_version: manifest.schema_version.unwrap_or(0),
wasm_api_version: manifest.wasm_api_version,
published_at,
})
}

View File

@@ -1,5 +1,6 @@
use crate::{
db::{self, AccessTokenId, Database, UserId},
db::{self, dev_server, AccessTokenId, Database, DevServerId, UserId},
rpc::Principal,
AppState, Error, Result,
};
use anyhow::{anyhow, Context};
@@ -19,11 +20,11 @@ use std::sync::OnceLock;
use std::{sync::Arc, time::Instant};
use subtle::ConstantTimeEq;
#[derive(Clone, Debug, Default, PartialEq, Eq)]
pub struct Impersonator(pub Option<db::User>);
/// Validates the authorization header. This has two mechanisms, one for the ADMIN_TOKEN
/// and one for the access tokens that we issue.
/// Validates the authorization header and adds an Extension<Principal> to the request.
/// Authorization: <user-id> <token>
/// <token> can be an access_token attached to that user, or an access token of an admin
/// or (in development) the string ADMIN:<config.api_token>.
/// Authorization: "dev-server-token" <token>
pub async fn validate_header<B>(mut req: Request<B>, next: Next<B>) -> impl IntoResponse {
let mut auth_header = req
.headers()
@@ -37,7 +38,26 @@ pub async fn validate_header<B>(mut req: Request<B>, next: Next<B>) -> impl Into
})?
.split_whitespace();
let user_id = UserId(auth_header.next().unwrap_or("").parse().map_err(|_| {
let state = req.extensions().get::<Arc<AppState>>().unwrap();
let first = auth_header.next().unwrap_or("");
if first == "dev-server-token" {
let dev_server_token = auth_header.next().ok_or_else(|| {
Error::Http(
StatusCode::BAD_REQUEST,
"missing dev-server-token token in authorization header".to_string(),
)
})?;
let dev_server = verify_dev_server_token(dev_server_token, &state.db)
.await
.map_err(|e| Error::Http(StatusCode::UNAUTHORIZED, format!("{}", e)))?;
req.extensions_mut()
.insert(Principal::DevServer(dev_server));
return Ok::<_, Error>(next.run(req).await);
}
let user_id = UserId(first.parse().map_err(|_| {
Error::Http(
StatusCode::BAD_REQUEST,
"missing user id in authorization header".to_string(),
@@ -51,8 +71,6 @@ pub async fn validate_header<B>(mut req: Request<B>, next: Next<B>) -> impl Into
)
})?;
let state = req.extensions().get::<Arc<AppState>>().unwrap();
// In development, allow impersonation using the admin API token.
// Don't allow this in production because we can't tell who is doing
// the impersonating.
@@ -76,18 +94,17 @@ pub async fn validate_header<B>(mut req: Request<B>, next: Next<B>) -> impl Into
.await?
.ok_or_else(|| anyhow!("user {} not found", user_id))?;
let impersonator = if let Some(impersonator_id) = validate_result.impersonator_id {
let impersonator = state
if let Some(impersonator_id) = validate_result.impersonator_id {
let admin = state
.db
.get_user_by_id(impersonator_id)
.await?
.ok_or_else(|| anyhow!("user {} not found", impersonator_id))?;
Some(impersonator)
req.extensions_mut()
.insert(Principal::Impersonated { user, admin });
} else {
None
req.extensions_mut().insert(Principal::User(user));
};
req.extensions_mut().insert(user);
req.extensions_mut().insert(Impersonator(impersonator));
return Ok::<_, Error>(next.run(req).await);
}
}
@@ -213,6 +230,33 @@ pub async fn verify_access_token(
})
}
// A dev_server_token has the format <id>.<base64>, which makes tokens
// relatively easy to copy and paste around.
pub async fn verify_dev_server_token(
dev_server_token: &str,
db: &Arc<Database>,
) -> anyhow::Result<dev_server::Model> {
let mut parts = dev_server_token.splitn(2, '.');
let id = DevServerId(parts.next().unwrap_or_default().parse()?);
let token = parts
.next()
.ok_or_else(|| anyhow!("invalid dev server token format"))?;
let token_hash = hash_access_token(&token);
let server = db.get_dev_server(id).await?;
if server
.hashed_token
.as_bytes()
.ct_eq(token_hash.as_ref())
.into()
{
Ok(server)
} else {
Err(anyhow!("wrong token for dev server"))
}
}
#[cfg(test)]
mod test {
use rand::thread_rng;

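A self-contained sketch of the two header shapes the middleware above accepts, `Authorization: <user-id> <token>` and `Authorization: dev-server-token <id>.<base64>`, using plain std parsing. The `ParsedAuth` enum and `parse_authorization` function are illustrative names, not the real API; the actual middleware goes on to hash the presented token and compare it against the stored hash in constant time.

```
#[derive(Debug, PartialEq)]
enum ParsedAuth {
    // "Authorization: <user-id> <token>"
    User { user_id: i32, token: String },
    // "Authorization: dev-server-token <id>.<base64>"
    DevServer { dev_server_id: i32, token: String },
}

fn parse_authorization(header: &str) -> Option<ParsedAuth> {
    let mut parts = header.split_whitespace();
    let first = parts.next()?;
    let second = parts.next()?;
    if first == "dev-server-token" {
        // "<id>.<base64>": the id locates the stored hashed token, which the
        // real middleware then verifies in constant time.
        let (id, token) = second.split_once('.')?;
        Some(ParsedAuth::DevServer {
            dev_server_id: id.parse().ok()?,
            token: token.to_string(),
        })
    } else {
        Some(ParsedAuth::User {
            user_id: first.parse().ok()?,
            token: second.to_string(),
        })
    }
}

fn main() {
    assert_eq!(
        parse_authorization("42 some-access-token"),
        Some(ParsedAuth::User { user_id: 42, token: "some-access-token".into() })
    );
    assert_eq!(
        parse_authorization("dev-server-token 7.c2VjcmV0"),
        Some(ParsedAuth::DevServer { dev_server_id: 7, token: "c2VjcmV0".into() })
    );
}
```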
View File

@@ -1,97 +0,0 @@
use collab::{
db::{self, NewUserParams},
env::load_dotenv,
executor::Executor,
};
use db::{ConnectOptions, Database};
use serde::{de::DeserializeOwned, Deserialize};
use std::{fmt::Write, fs};
#[derive(Debug, Deserialize)]
struct GitHubUser {
id: i32,
login: String,
email: Option<String>,
}
#[tokio::main]
async fn main() {
load_dotenv().expect("failed to load .env.toml file");
let mut admin_logins = load_admins("crates/collab/.admins.default.json")
.expect("failed to load default admins file");
if let Ok(other_admins) = load_admins("./.admins.json") {
admin_logins.extend(other_admins);
}
let database_url = std::env::var("DATABASE_URL").expect("missing DATABASE_URL env var");
let db = Database::new(ConnectOptions::new(database_url), Executor::Production)
.await
.expect("failed to connect to postgres database");
let client = reqwest::Client::new();
// Create admin users for all of the users in `.admins.toml` or `.admins.default.toml`.
for admin_login in admin_logins {
let user = fetch_github::<GitHubUser>(
&client,
&format!("https://api.github.com/users/{admin_login}"),
)
.await;
db.create_user(
&user.email.unwrap_or(format!("{admin_login}@example.com")),
true,
NewUserParams {
github_login: user.login,
github_user_id: user.id,
},
)
.await
.expect("failed to create admin user");
}
// Fetch 100 other random users from GitHub and insert them into the database.
let mut user_count = db
.get_all_users(0, 200)
.await
.expect("failed to load users from db")
.len();
let mut last_user_id = None;
while user_count < 100 {
let mut uri = "https://api.github.com/users?per_page=100".to_string();
if let Some(last_user_id) = last_user_id {
write!(&mut uri, "&since={}", last_user_id).unwrap();
}
let users = fetch_github::<Vec<GitHubUser>>(&client, &uri).await;
for github_user in users {
last_user_id = Some(github_user.id);
user_count += 1;
db.get_or_create_user_by_github_account(
&github_user.login,
Some(github_user.id),
github_user.email.as_deref(),
None,
)
.await
.expect("failed to insert user");
}
}
}
fn load_admins(path: &str) -> anyhow::Result<Vec<String>> {
let file_content = fs::read_to_string(path)?;
Ok(serde_json::from_str(&file_content)?)
}
async fn fetch_github<T: DeserializeOwned>(client: &reqwest::Client, url: &str) -> T {
let response = client
.get(url)
.header("user-agent", "zed")
.send()
.await
.unwrap_or_else(|_| panic!("failed to fetch '{}'", url));
response
.json()
.await
.unwrap_or_else(|_| panic!("failed to deserialize github user from '{}'", url))
}

View File

@@ -12,7 +12,7 @@ use futures::StreamExt;
use rand::{prelude::StdRng, Rng, SeedableRng};
use rpc::{
proto::{self},
ConnectionId,
ConnectionId, ExtensionMetadata,
};
use sea_orm::{
entity::prelude::*,
@@ -128,12 +128,6 @@ impl Database {
Ok(new_migrations)
}
/// Initializes static data that resides in the database by upserting it.
pub async fn initialize_static_data(&mut self) -> Result<()> {
self.initialize_notification_kinds().await?;
Ok(())
}
/// Transaction runs things in a transaction. If you want to call other methods
/// and pass the transaction around you need to reborrow the transaction at each
/// call site with: `&*tx`.
@@ -458,6 +452,14 @@ pub struct CreatedChannelMessage {
pub notifications: NotificationBatch,
}
pub struct UpdatedChannelMessage {
pub message_id: MessageId,
pub participant_connection_ids: Vec<ConnectionId>,
pub notifications: NotificationBatch,
pub reply_to_message_id: Option<MessageId>,
pub timestamp: PrimitiveDateTime,
}
#[derive(Clone, Debug, PartialEq, Eq, FromQueryResult, Serialize, Deserialize)]
pub struct Invite {
pub email_address: String,
@@ -546,7 +548,7 @@ pub struct Channel {
}
impl Channel {
fn from_model(value: channel::Model) -> Self {
pub fn from_model(value: channel::Model) -> Self {
Channel {
id: value.id,
visibility: value.visibility,
@@ -604,16 +606,14 @@ pub struct RejoinedChannelBuffer {
#[derive(Clone)]
pub struct JoinRoom {
pub room: proto::Room,
pub channel_id: Option<ChannelId>,
pub channel_members: Vec<UserId>,
pub channel: Option<channel::Model>,
}
pub struct RejoinedRoom {
pub room: proto::Room,
pub rejoined_projects: Vec<RejoinedProject>,
pub reshared_projects: Vec<ResharedProject>,
pub channel_id: Option<ChannelId>,
pub channel_members: Vec<UserId>,
pub channel: Option<channel::Model>,
}
pub struct ResharedProject {
@@ -649,8 +649,7 @@ pub struct RejoinedWorktree {
pub struct LeftRoom {
pub room: proto::Room,
pub channel_id: Option<ChannelId>,
pub channel_members: Vec<UserId>,
pub channel: Option<channel::Model>,
pub left_projects: HashMap<ProjectId, LeftProject>,
pub canceled_calls_to_user_ids: Vec<UserId>,
pub deleted: bool,
@@ -658,8 +657,7 @@ pub struct LeftRoom {
pub struct RefreshedRoom {
pub room: proto::Room,
pub channel_id: Option<ChannelId>,
pub channel_members: Vec<UserId>,
pub channel: Option<channel::Model>,
pub stale_participant_user_ids: Vec<UserId>,
pub canceled_calls_to_user_ids: Vec<UserId>,
}
@@ -727,22 +725,11 @@ pub struct NewExtensionVersion {
pub description: String,
pub authors: Vec<String>,
pub repository: String,
pub schema_version: i32,
pub wasm_api_version: Option<String>,
pub published_at: PrimitiveDateTime,
}
#[derive(Debug, Serialize, PartialEq)]
pub struct ExtensionMetadata {
pub id: String,
pub name: String,
pub version: String,
pub authors: Vec<String>,
pub description: String,
pub repository: String,
#[serde(serialize_with = "serialize_iso8601")]
pub published_at: PrimitiveDateTime,
pub download_count: u64,
}
pub fn serialize_iso8601<S: Serializer>(
datetime: &PrimitiveDateTime,
serializer: S,

View File

@@ -67,31 +67,34 @@ macro_rules! id_type {
};
}
id_type!(BufferId);
id_type!(AccessTokenId);
id_type!(BufferId);
id_type!(ChannelBufferCollaboratorId);
id_type!(ChannelChatParticipantId);
id_type!(ChannelId);
id_type!(ChannelMemberId);
id_type!(MessageId);
id_type!(ContactId);
id_type!(DevServerId);
id_type!(ExtensionId);
id_type!(FlagId);
id_type!(FollowerId);
id_type!(HostedProjectId);
id_type!(MessageId);
id_type!(NotificationId);
id_type!(NotificationKindId);
id_type!(ProjectCollaboratorId);
id_type!(ProjectId);
id_type!(ReplicaId);
id_type!(RoomId);
id_type!(RoomParticipantId);
id_type!(ProjectId);
id_type!(ProjectCollaboratorId);
id_type!(ReplicaId);
id_type!(ServerId);
id_type!(SignupId);
id_type!(UserId);
id_type!(ChannelBufferCollaboratorId);
id_type!(FlagId);
id_type!(ExtensionId);
id_type!(NotificationId);
id_type!(NotificationKindId);
id_type!(HostedProjectId);
/// ChannelRole gives you permissions for both channels and calls.
#[derive(Eq, PartialEq, Copy, Clone, Debug, EnumIter, DeriveActiveEnum, Default, Hash)]
#[derive(
Eq, PartialEq, Copy, Clone, Debug, EnumIter, DeriveActiveEnum, Default, Hash, Serialize,
)]
#[sea_orm(rs_type = "String", db_type = "String(None)")]
pub enum ChannelRole {
/// Admin can read/write and change permissions.

View File

@@ -5,11 +5,13 @@ pub mod buffers;
pub mod channels;
pub mod contacts;
pub mod contributors;
pub mod dev_servers;
pub mod extensions;
pub mod hosted_projects;
pub mod messages;
pub mod notifications;
pub mod projects;
pub mod rate_buckets;
pub mod rooms;
pub mod servers;
pub mod users;

View File

@@ -45,11 +45,7 @@ impl Database {
name: &str,
parent_channel_id: Option<ChannelId>,
admin_id: UserId,
) -> Result<(
Channel,
Option<channel_member::Model>,
Vec<channel_member::Model>,
)> {
) -> Result<(channel::Model, Option<channel_member::Model>)> {
let name = Self::sanitize_channel_name(name)?;
self.transaction(move |tx| async move {
let mut parent = None;
@@ -90,12 +86,7 @@ impl Database {
);
}
let channel_members = channel_member::Entity::find()
.filter(channel_member::Column::ChannelId.eq(channel.root_id()))
.all(&*tx)
.await?;
Ok((Channel::from_model(channel), membership, channel_members))
Ok((channel, membership))
})
.await
}
@@ -181,7 +172,7 @@ impl Database {
channel_id: ChannelId,
visibility: ChannelVisibility,
admin_id: UserId,
) -> Result<(Channel, Vec<channel_member::Model>)> {
) -> Result<channel::Model> {
self.transaction(move |tx| async move {
let channel = self.get_channel_internal(channel_id, &tx).await?;
self.check_user_is_channel_admin(&channel, admin_id, &tx)
@@ -214,12 +205,7 @@ impl Database {
model.visibility = ActiveValue::Set(visibility);
let channel = model.update(&*tx).await?;
let channel_members = channel_member::Entity::find()
.filter(channel_member::Column::ChannelId.eq(channel.root_id()))
.all(&*tx)
.await?;
Ok((Channel::from_model(channel), channel_members))
Ok(channel)
})
.await
}
@@ -245,21 +231,12 @@ impl Database {
&self,
channel_id: ChannelId,
user_id: UserId,
) -> Result<(Vec<ChannelId>, Vec<UserId>)> {
) -> Result<(ChannelId, Vec<ChannelId>)> {
self.transaction(move |tx| async move {
let channel = self.get_channel_internal(channel_id, &tx).await?;
self.check_user_is_channel_admin(&channel, user_id, &tx)
.await?;
let members_to_notify: Vec<UserId> = channel_member::Entity::find()
.filter(channel_member::Column::ChannelId.eq(channel.root_id()))
.select_only()
.column(channel_member::Column::UserId)
.distinct()
.into_values::<_, QueryUserIds>()
.all(&*tx)
.await?;
let channels_to_remove = self
.get_channel_descendants_excluding_self([&channel], &tx)
.await?
@@ -273,7 +250,7 @@ impl Database {
.exec(&*tx)
.await?;
Ok((channels_to_remove, members_to_notify))
Ok((channel.root_id(), channels_to_remove))
})
.await
}
@@ -343,7 +320,7 @@ impl Database {
channel_id: ChannelId,
admin_id: UserId,
new_name: &str,
) -> Result<(Channel, Vec<channel_member::Model>)> {
) -> Result<channel::Model> {
self.transaction(move |tx| async move {
let new_name = Self::sanitize_channel_name(new_name)?.to_string();
@@ -355,12 +332,7 @@ impl Database {
model.name = ActiveValue::Set(new_name.clone());
let channel = model.update(&*tx).await?;
let channel_members = channel_member::Entity::find()
.filter(channel_member::Column::ChannelId.eq(channel.root_id()))
.all(&*tx)
.await?;
Ok((Channel::from_model(channel), channel_members))
Ok(channel)
})
.await
}
@@ -984,7 +956,7 @@ impl Database {
channel_id: ChannelId,
new_parent_id: ChannelId,
admin_id: UserId,
) -> Result<(Vec<Channel>, Vec<channel_member::Model>)> {
) -> Result<(ChannelId, Vec<Channel>)> {
self.transaction(|tx| async move {
let channel = self.get_channel_internal(channel_id, &tx).await?;
self.check_user_is_channel_admin(&channel, admin_id, &tx)
@@ -1039,12 +1011,7 @@ impl Database {
.map(|c| Channel::from_model(c))
.collect::<Vec<_>>();
let channel_members = channel_member::Entity::find()
.filter(channel_member::Column::ChannelId.eq(root_id))
.all(&*tx)
.await?;
Ok((channels, channel_members))
Ok((root_id, channels))
})
.await
}

View File

@@ -0,0 +1,18 @@
use sea_orm::EntityTrait;
use super::{dev_server, Database, DevServerId};
impl Database {
pub async fn get_dev_server(
&self,
dev_server_id: DevServerId,
) -> crate::Result<dev_server::Model> {
self.transaction(|tx| async move {
Ok(dev_server::Entity::find_by_id(dev_server_id)
.one(&*tx)
.await?
.ok_or_else(|| anyhow::anyhow!("no dev server with id {}", dev_server_id))?)
})
.await
}
}

View File

@@ -1,23 +1,50 @@
use chrono::Utc;
use super::*;
impl Database {
pub async fn get_extensions(
&self,
filter: Option<&str>,
max_schema_version: i32,
limit: usize,
) -> Result<Vec<ExtensionMetadata>> {
self.transaction(|tx| async move {
let mut condition = Condition::all();
let mut condition = Condition::all().add(
extension::Column::LatestVersion
.into_expr()
.eq(extension_version::Column::Version.into_expr()),
);
if let Some(filter) = filter {
let fuzzy_name_filter = Self::fuzzy_like_string(filter);
condition = condition.add(Expr::cust_with_expr("name ILIKE $1", fuzzy_name_filter));
}
let extensions = extension::Entity::find()
.inner_join(extension_version::Entity)
.select_also(extension_version::Entity)
.filter(condition)
.filter(extension_version::Column::SchemaVersion.lte(max_schema_version))
.order_by_desc(extension::Column::TotalDownloadCount)
.order_by_asc(extension::Column::Name)
.limit(Some(limit as u64))
.all(&*tx)
.await?;
Ok(extensions
.into_iter()
.filter_map(|(extension, version)| {
Some(metadata_from_extension_and_version(extension, version?))
})
.collect())
})
.await
}
pub async fn get_extension(&self, extension_id: &str) -> Result<Option<ExtensionMetadata>> {
self.transaction(|tx| async move {
let extension = extension::Entity::find()
.filter(extension::Column::ExternalId.eq(extension_id))
.filter(
extension::Column::LatestVersion
.into_expr()
@@ -25,29 +52,33 @@ impl Database {
)
.inner_join(extension_version::Entity)
.select_also(extension_version::Entity)
.all(&*tx)
.one(&*tx)
.await?;
Ok(extensions
.into_iter()
.filter_map(|(extension, latest_version)| {
let version = latest_version?;
Some(ExtensionMetadata {
id: extension.external_id,
name: extension.name,
version: version.version,
authors: version
.authors
.split(',')
.map(|author| author.trim().to_string())
.collect::<Vec<_>>(),
description: version.description,
repository: version.repository,
published_at: version.published_at,
download_count: extension.total_download_count as u64,
})
})
.collect())
Ok(extension.and_then(|(extension, version)| {
Some(metadata_from_extension_and_version(extension, version?))
}))
})
.await
}
pub async fn get_extension_version(
&self,
extension_id: &str,
version: &str,
) -> Result<Option<ExtensionMetadata>> {
self.transaction(|tx| async move {
let extension = extension::Entity::find()
.filter(extension::Column::ExternalId.eq(extension_id))
.filter(extension_version::Column::Version.eq(version))
.inner_join(extension_version::Entity)
.select_also(extension_version::Entity)
.one(&*tx)
.await?;
Ok(extension.and_then(|(extension, version)| {
Some(metadata_from_extension_and_version(extension, version?))
}))
})
.await
}
@@ -135,6 +166,8 @@ impl Database {
authors: ActiveValue::Set(version.authors.join(", ")),
repository: ActiveValue::Set(version.repository.clone()),
description: ActiveValue::Set(version.description.clone()),
schema_version: ActiveValue::Set(version.schema_version),
wasm_api_version: ActiveValue::Set(version.wasm_api_version.clone()),
download_count: ActiveValue::NotSet,
}
}))
@@ -204,3 +237,35 @@ impl Database {
.await
}
}
fn metadata_from_extension_and_version(
extension: extension::Model,
version: extension_version::Model,
) -> ExtensionMetadata {
ExtensionMetadata {
id: extension.external_id.into(),
manifest: rpc::ExtensionApiManifest {
name: extension.name,
version: version.version.into(),
authors: version
.authors
.split(',')
.map(|author| author.trim().to_string())
.collect::<Vec<_>>(),
description: Some(version.description),
repository: version.repository,
schema_version: Some(version.schema_version),
wasm_api_version: version.wasm_api_version,
},
published_at: convert_time_to_chrono(version.published_at),
download_count: extension.total_download_count as u64,
}
}
pub fn convert_time_to_chrono(time: time::PrimitiveDateTime) -> chrono::DateTime<Utc> {
chrono::DateTime::from_naive_utc_and_offset(
chrono::NaiveDateTime::from_timestamp_opt(time.assume_utc().unix_timestamp(), 0).unwrap(),
Utc,
)
}

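The query above joins each extension to its latest version and then drops rows whose `schema_version` exceeds the client's `max_schema_version`. A plain-Rust sketch of that filtering predicate, with a hypothetical `VersionRow` standing in for the joined `extension`/`extension_version` rows:

```
struct VersionRow {
    extension_id: &'static str,
    schema_version: i32,
    is_latest: bool,
}

// Keeps only the latest version of each extension whose schema version the
// client can understand.
fn visible_extensions(rows: &[VersionRow], max_schema_version: i32) -> Vec<&'static str> {
    rows.iter()
        .filter(|row| row.is_latest && row.schema_version <= max_schema_version)
        .map(|row| row.extension_id)
        .collect()
}

fn main() {
    let rows = [
        VersionRow { extension_id: "ext1", schema_version: 1, is_latest: true },
        VersionRow { extension_id: "ext2", schema_version: 0, is_latest: true },
    ];
    // An older client that only supports schema version 0 sees just ext2.
    assert_eq!(visible_extensions(&rows, 0), vec!["ext2"]);
    assert_eq!(visible_extensions(&rows, 1), vec!["ext1", "ext2"]);
}
```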
View File

@@ -162,6 +162,9 @@ impl Database {
lower_half: nonce.1,
}),
reply_to_message_id: row.reply_to_message_id.map(|id| id.to_proto()),
edited_at: row
.edited_at
.map(|t| t.assume_utc().unix_timestamp() as u64),
}
})
.collect::<Vec<_>>();
@@ -199,6 +202,31 @@ impl Database {
Ok(messages)
}
fn format_mentions_to_entities(
&self,
message_id: MessageId,
body: &str,
mentions: &[proto::ChatMention],
) -> Result<Vec<tables::channel_message_mention::ActiveModel>> {
Ok(mentions
.iter()
.filter_map(|mention| {
let range = mention.range.as_ref()?;
if !body.is_char_boundary(range.start as usize)
|| !body.is_char_boundary(range.end as usize)
{
return None;
}
Some(channel_message_mention::ActiveModel {
message_id: ActiveValue::Set(message_id),
start_offset: ActiveValue::Set(range.start as i32),
end_offset: ActiveValue::Set(range.end as i32),
user_id: ActiveValue::Set(UserId::from_proto(mention.user_id)),
})
})
.collect::<Vec<_>>())
}
/// Creates a new channel message.
#[allow(clippy::too_many_arguments)]
pub async fn create_channel_message(
@@ -249,6 +277,7 @@ impl Database {
nonce: ActiveValue::Set(Uuid::from_u128(nonce)),
id: ActiveValue::NotSet,
reply_to_message_id: ActiveValue::Set(reply_to_message_id),
edited_at: ActiveValue::NotSet,
})
.on_conflict(
OnConflict::columns([
@@ -270,23 +299,7 @@ impl Database {
let mentioned_user_ids =
mentions.iter().map(|m| m.user_id).collect::<HashSet<_>>();
let mentions = mentions
.iter()
.filter_map(|mention| {
let range = mention.range.as_ref()?;
if !body.is_char_boundary(range.start as usize)
|| !body.is_char_boundary(range.end as usize)
{
return None;
}
Some(channel_message_mention::ActiveModel {
message_id: ActiveValue::Set(message_id),
start_offset: ActiveValue::Set(range.start as i32),
end_offset: ActiveValue::Set(range.end as i32),
user_id: ActiveValue::Set(UserId::from_proto(mention.user_id)),
})
})
.collect::<Vec<_>>();
let mentions = self.format_mentions_to_entities(message_id, body, mentions)?;
if !mentions.is_empty() {
channel_message_mention::Entity::insert_many(mentions)
.exec(&*tx)
@@ -522,4 +535,131 @@ impl Database {
})
.await
}
/// Updates the channel message with the given ID, setting its body and `edited_at` timestamp.
pub async fn update_channel_message(
&self,
channel_id: ChannelId,
message_id: MessageId,
user_id: UserId,
body: &str,
mentions: &[proto::ChatMention],
edited_at: OffsetDateTime,
) -> Result<UpdatedChannelMessage> {
self.transaction(|tx| async move {
let channel = self.get_channel_internal(channel_id, &tx).await?;
self.check_user_is_channel_participant(&channel, user_id, &tx)
.await?;
let mut rows = channel_chat_participant::Entity::find()
.filter(channel_chat_participant::Column::ChannelId.eq(channel_id))
.stream(&*tx)
.await?;
let mut is_participant = false;
let mut participant_connection_ids = Vec::new();
let mut participant_user_ids = Vec::new();
while let Some(row) = rows.next().await {
let row = row?;
if row.user_id == user_id {
is_participant = true;
}
participant_user_ids.push(row.user_id);
participant_connection_ids.push(row.connection());
}
drop(rows);
if !is_participant {
Err(anyhow!("not a chat participant"))?;
}
let channel_message = channel_message::Entity::find_by_id(message_id)
.filter(channel_message::Column::SenderId.eq(user_id))
.one(&*tx)
.await?;
let Some(channel_message) = channel_message else {
Err(anyhow!("Channel message not found"))?
};
let edited_at = edited_at.to_offset(time::UtcOffset::UTC);
let edited_at = time::PrimitiveDateTime::new(edited_at.date(), edited_at.time());
let updated_message = channel_message::ActiveModel {
body: ActiveValue::Set(body.to_string()),
edited_at: ActiveValue::Set(Some(edited_at)),
reply_to_message_id: ActiveValue::Unchanged(channel_message.reply_to_message_id),
id: ActiveValue::Unchanged(message_id),
channel_id: ActiveValue::Unchanged(channel_id),
sender_id: ActiveValue::Unchanged(user_id),
sent_at: ActiveValue::Unchanged(channel_message.sent_at),
nonce: ActiveValue::Unchanged(channel_message.nonce),
};
let result = channel_message::Entity::update_many()
.set(updated_message)
.filter(channel_message::Column::Id.eq(message_id))
.filter(channel_message::Column::SenderId.eq(user_id))
.exec(&*tx)
.await?;
if result.rows_affected == 0 {
return Err(anyhow!(
"Attempted to edit a message (id: {message_id}) which does not exist anymore."
))?;
}
// We have to fetch the old mentions so that, after an edit, we don't
// re-notify users who were already mentioned in the original message.
let old_mentions = channel_message_mention::Entity::find()
.filter(channel_message_mention::Column::MessageId.eq(message_id))
.all(&*tx)
.await?;
// remove all existing mentions
channel_message_mention::Entity::delete_many()
.filter(channel_message_mention::Column::MessageId.eq(message_id))
.exec(&*tx)
.await?;
let new_mentions = self.format_mentions_to_entities(message_id, body, mentions)?;
if !new_mentions.is_empty() {
// insert new mentions
channel_message_mention::Entity::insert_many(new_mentions)
.exec(&*tx)
.await?;
}
let mut mentioned_user_ids = mentions.iter().map(|m| m.user_id).collect::<HashSet<_>>();
// Filter out users that were mentioned before
for mention in old_mentions {
mentioned_user_ids.remove(&mention.user_id.to_proto());
}
let mut notifications = Vec::new();
for mentioned_user in mentioned_user_ids {
notifications.extend(
self.create_notification(
UserId::from_proto(mentioned_user),
rpc::Notification::ChannelMessageMention {
message_id: message_id.to_proto(),
sender_id: user_id.to_proto(),
channel_id: channel_id.to_proto(),
},
false,
&tx,
)
.await?,
);
}
Ok(UpdatedChannelMessage {
message_id,
participant_connection_ids,
notifications,
reply_to_message_id: channel_message.reply_to_message_id,
timestamp: channel_message.sent_at,
})
})
.await
}
}

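The edit path above fetches the message's old mentions and removes them from the new mention set, so only users mentioned for the first time by the edit are notified. A small set-difference sketch of that step, using plain `u64` user ids in place of the proto types:

```
use std::collections::HashSet;

// Users already mentioned before the edit are dropped from the edited
// message's mention set, so only newly mentioned users get notifications.
fn users_to_notify(old_mentions: &[u64], new_mentions: &[u64]) -> HashSet<u64> {
    let old: HashSet<u64> = old_mentions.iter().copied().collect();
    new_mentions
        .iter()
        .copied()
        .filter(|user_id| !old.contains(user_id))
        .collect()
}

fn main() {
    // User 2 was already mentioned before the edit, so only user 3 is notified.
    assert_eq!(users_to_notify(&[1, 2], &[2, 3]), HashSet::from([3]));
}
```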
View File

@@ -0,0 +1,58 @@
use super::*;
use crate::db::tables::rate_buckets;
use sea_orm::{ColumnTrait, EntityTrait, QueryFilter};
impl Database {
/// Saves the given rate buckets, upserting the stored token count and
/// last refill time for each (user_id, rate_limit_name) pair.
pub async fn save_rate_buckets(&self, buckets: &[rate_buckets::Model]) -> Result<()> {
if buckets.is_empty() {
return Ok(());
}
self.transaction(|tx| async move {
rate_buckets::Entity::insert_many(buckets.iter().map(|bucket| {
rate_buckets::ActiveModel {
user_id: ActiveValue::Set(bucket.user_id),
rate_limit_name: ActiveValue::Set(bucket.rate_limit_name.clone()),
token_count: ActiveValue::Set(bucket.token_count),
last_refill: ActiveValue::Set(bucket.last_refill),
}
}))
.on_conflict(
OnConflict::columns([
rate_buckets::Column::UserId,
rate_buckets::Column::RateLimitName,
])
.update_columns([
rate_buckets::Column::TokenCount,
rate_buckets::Column::LastRefill,
])
.to_owned(),
)
.exec(&*tx)
.await?;
Ok(())
})
.await
}
/// Retrieves the rate limit for the given user and rate limit name.
pub async fn get_rate_bucket(
&self,
user_id: UserId,
rate_limit_name: &str,
) -> Result<Option<rate_buckets::Model>> {
self.transaction(|tx| async move {
let rate_limit = rate_buckets::Entity::find()
.filter(rate_buckets::Column::UserId.eq(user_id))
.filter(rate_buckets::Column::RateLimitName.eq(rate_limit_name))
.one(&*tx)
.await?;
Ok(rate_limit)
})
.await
}
}

View File

@@ -52,12 +52,7 @@ impl Database {
);
let (channel, room) = self.get_channel_room(room_id, &tx).await?;
let channel_members;
if let Some(channel) = &channel {
channel_members = self.get_channel_participants(channel, &tx).await?;
} else {
channel_members = Vec::new();
if channel.is_none() {
// Delete the room if it becomes empty.
if room.participants.is_empty() {
project::Entity::delete_many()
@@ -70,8 +65,7 @@ impl Database {
Ok(RefreshedRoom {
room,
channel_id: channel.map(|channel| channel.id),
channel_members,
channel,
stale_participant_user_ids,
canceled_calls_to_user_ids,
})
@@ -349,8 +343,7 @@ impl Database {
let room = self.get_room(room_id, &tx).await?;
Ok(JoinRoom {
room,
channel_id: None,
channel_members: vec![],
channel: None,
})
})
.await
@@ -446,11 +439,9 @@ impl Database {
let (channel, room) = self.get_channel_room(room_id, &tx).await?;
let channel = channel.ok_or_else(|| anyhow!("no channel for room"))?;
let channel_members = self.get_channel_participants(&channel, tx).await?;
Ok(JoinRoom {
room,
channel_id: Some(channel.id),
channel_members,
channel: Some(channel),
})
}
@@ -736,16 +727,10 @@ impl Database {
}
let (channel, room) = self.get_channel_room(room_id, &tx).await?;
let channel_members = if let Some(channel) = &channel {
self.get_channel_participants(&channel, &tx).await?
} else {
Vec::new()
};
Ok(RejoinedRoom {
room,
channel_id: channel.map(|channel| channel.id),
channel_members,
channel,
rejoined_projects,
reshared_projects,
})
@@ -902,15 +887,9 @@ impl Database {
false
};
let channel_members = if let Some(channel) = &channel {
self.get_channel_participants(channel, &tx).await?
} else {
Vec::new()
};
let left_room = LeftRoom {
room,
channel_id: channel.map(|channel| channel.id),
channel_members,
channel,
left_projects,
canceled_calls_to_user_ids,
deleted,

View File

@@ -10,6 +10,7 @@ pub mod channel_message;
pub mod channel_message_mention;
pub mod contact;
pub mod contributor;
pub mod dev_server;
pub mod extension;
pub mod extension_version;
pub mod feature_flag;
@@ -22,6 +23,7 @@ pub mod observed_buffer_edits;
pub mod observed_channel_messages;
pub mod project;
pub mod project_collaborator;
pub mod rate_buckets;
pub mod room;
pub mod room_participant;
pub mod server;

View File

@@ -11,6 +11,7 @@ pub struct Model {
pub sender_id: UserId,
pub body: String,
pub sent_at: PrimitiveDateTime,
pub edited_at: Option<PrimitiveDateTime>,
pub nonce: Uuid,
pub reply_to_message_id: Option<MessageId>,
}

View File

@@ -0,0 +1,17 @@
use crate::db::{ChannelId, DevServerId};
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "dev_servers")]
pub struct Model {
#[sea_orm(primary_key)]
pub id: DevServerId,
pub name: String,
pub channel_id: ChannelId,
pub hashed_token: String,
}
impl ActiveModelBehavior for ActiveModel {}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {}

View File

@@ -13,6 +13,8 @@ pub struct Model {
pub authors: String,
pub repository: String,
pub description: String,
pub schema_version: i32,
pub wasm_api_version: Option<String>,
pub download_count: i64,
}

View File

@@ -0,0 +1,31 @@
use crate::db::UserId;
use sea_orm::entity::prelude::*;
#[derive(Clone, Debug, PartialEq, Eq, DeriveEntityModel)]
#[sea_orm(table_name = "rate_buckets")]
pub struct Model {
#[sea_orm(primary_key, auto_increment = false)]
pub user_id: UserId,
#[sea_orm(primary_key, auto_increment = false)]
pub rate_limit_name: String,
pub token_count: i32,
pub last_refill: DateTime,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {
#[sea_orm(
belongs_to = "super::user::Entity",
from = "Column::UserId",
to = "super::user::Column::Id"
)]
User,
}
impl Related<super::user::Entity> for Entity {
fn to() -> RelationDef {
Relation::User.def()
}
}
impl ActiveModelBehavior for ActiveModel {}

View File

@@ -109,10 +109,9 @@ async fn test_channels(db: &Arc<Database>) {
assert!(db.get_channel(crdb_id, a_id).await.is_err());
// Remove a channel tree
let (mut channel_ids, user_ids) = db.delete_channel(rust_id, a_id).await.unwrap();
let (_, mut channel_ids) = db.delete_channel(rust_id, a_id).await.unwrap();
channel_ids.sort();
assert_eq!(channel_ids, &[rust_id, cargo_id, cargo_ra_id]);
assert_eq!(user_ids, &[a_id]);
assert!(db.get_channel(rust_id, a_id).await.is_err());
assert!(db.get_channel(cargo_id, a_id).await.is_err());

View File

@@ -1,10 +1,9 @@
use super::Database;
use crate::{
db::{ExtensionMetadata, NewExtensionVersion},
db::{queries::extensions::convert_time_to_chrono, ExtensionMetadata, NewExtensionVersion},
test_both_dbs,
};
use std::sync::Arc;
use time::{OffsetDateTime, PrimitiveDateTime};
test_both_dbs!(
test_extensions,
@@ -16,11 +15,13 @@ async fn test_extensions(db: &Arc<Database>) {
let versions = db.get_known_extension_versions().await.unwrap();
assert!(versions.is_empty());
let extensions = db.get_extensions(None, 5).await.unwrap();
let extensions = db.get_extensions(None, 1, 5).await.unwrap();
assert!(extensions.is_empty());
let t0 = OffsetDateTime::from_unix_timestamp_nanos(0).unwrap();
let t0 = PrimitiveDateTime::new(t0.date(), t0.time());
let t0 = time::OffsetDateTime::from_unix_timestamp_nanos(0).unwrap();
let t0 = time::PrimitiveDateTime::new(t0.date(), t0.time());
let t0_chrono = convert_time_to_chrono(t0);
db.insert_extension_versions(
&[
@@ -33,6 +34,8 @@ async fn test_extensions(db: &Arc<Database>) {
description: "an extension".into(),
authors: vec!["max".into()],
repository: "ext1/repo".into(),
schema_version: 1,
wasm_api_version: None,
published_at: t0,
},
NewExtensionVersion {
@@ -41,6 +44,8 @@ async fn test_extensions(db: &Arc<Database>) {
description: "a good extension".into(),
authors: vec!["max".into(), "marshall".into()],
repository: "ext1/repo".into(),
schema_version: 1,
wasm_api_version: None,
published_at: t0,
},
],
@@ -53,6 +58,8 @@ async fn test_extensions(db: &Arc<Database>) {
description: "a great extension".into(),
authors: vec!["marshall".into()],
repository: "ext2/repo".into(),
schema_version: 0,
wasm_api_version: None,
published_at: t0,
}],
),
@@ -75,33 +82,61 @@ async fn test_extensions(db: &Arc<Database>) {
);
// The latest version of each extension is returned.
let extensions = db.get_extensions(None, 5).await.unwrap();
let extensions = db.get_extensions(None, 1, 5).await.unwrap();
assert_eq!(
extensions,
&[
ExtensionMetadata {
id: "ext1".into(),
name: "Extension One".into(),
version: "0.0.2".into(),
authors: vec!["max".into(), "marshall".into()],
description: "a good extension".into(),
repository: "ext1/repo".into(),
published_at: t0,
manifest: rpc::ExtensionApiManifest {
name: "Extension One".into(),
version: "0.0.2".into(),
authors: vec!["max".into(), "marshall".into()],
description: Some("a good extension".into()),
repository: "ext1/repo".into(),
schema_version: Some(1),
wasm_api_version: None,
},
published_at: t0_chrono,
download_count: 0,
},
ExtensionMetadata {
id: "ext2".into(),
name: "Extension Two".into(),
version: "0.2.0".into(),
authors: vec!["marshall".into()],
description: "a great extension".into(),
repository: "ext2/repo".into(),
published_at: t0,
manifest: rpc::ExtensionApiManifest {
name: "Extension Two".into(),
version: "0.2.0".into(),
authors: vec!["marshall".into()],
description: Some("a great extension".into()),
repository: "ext2/repo".into(),
schema_version: Some(0),
wasm_api_version: None,
},
published_at: t0_chrono,
download_count: 0
},
]
);
// Extensions whose schema version is too new are excluded.
let extensions = db.get_extensions(None, 0, 5).await.unwrap();
assert_eq!(
extensions,
&[ExtensionMetadata {
id: "ext2".into(),
manifest: rpc::ExtensionApiManifest {
name: "Extension Two".into(),
version: "0.2.0".into(),
authors: vec!["marshall".into()],
description: Some("a great extension".into()),
repository: "ext2/repo".into(),
schema_version: Some(0),
wasm_api_version: None,
},
published_at: t0_chrono,
download_count: 0
},]
);
// Record extensions being downloaded.
for _ in 0..7 {
assert!(db.record_extension_download("ext2", "0.0.2").await.unwrap());
@@ -122,28 +157,36 @@ async fn test_extensions(db: &Arc<Database>) {
.unwrap());
// Extensions are returned in descending order of total downloads.
let extensions = db.get_extensions(None, 5).await.unwrap();
let extensions = db.get_extensions(None, 1, 5).await.unwrap();
assert_eq!(
extensions,
&[
ExtensionMetadata {
id: "ext2".into(),
name: "Extension Two".into(),
version: "0.2.0".into(),
authors: vec!["marshall".into()],
description: "a great extension".into(),
repository: "ext2/repo".into(),
published_at: t0,
manifest: rpc::ExtensionApiManifest {
name: "Extension Two".into(),
version: "0.2.0".into(),
authors: vec!["marshall".into()],
description: Some("a great extension".into()),
repository: "ext2/repo".into(),
schema_version: Some(0),
wasm_api_version: None,
},
published_at: t0_chrono,
download_count: 7
},
ExtensionMetadata {
id: "ext1".into(),
name: "Extension One".into(),
version: "0.0.2".into(),
authors: vec!["max".into(), "marshall".into()],
description: "a good extension".into(),
repository: "ext1/repo".into(),
published_at: t0,
manifest: rpc::ExtensionApiManifest {
name: "Extension One".into(),
version: "0.0.2".into(),
authors: vec!["max".into(), "marshall".into()],
description: Some("a good extension".into()),
repository: "ext1/repo".into(),
schema_version: Some(1),
wasm_api_version: None,
},
published_at: t0_chrono,
download_count: 5,
},
]
@@ -161,6 +204,8 @@ async fn test_extensions(db: &Arc<Database>) {
description: "a real good extension".into(),
authors: vec!["max".into(), "marshall".into()],
repository: "ext1/repo".into(),
schema_version: 1,
wasm_api_version: None,
published_at: t0,
}],
),
@@ -172,6 +217,8 @@ async fn test_extensions(db: &Arc<Database>) {
description: "an old extension".into(),
authors: vec!["marshall".into()],
repository: "ext2/repo".into(),
schema_version: 0,
wasm_api_version: None,
published_at: t0,
}],
),
@@ -196,28 +243,36 @@ async fn test_extensions(db: &Arc<Database>) {
.collect()
);
let extensions = db.get_extensions(None, 5).await.unwrap();
let extensions = db.get_extensions(None, 1, 5).await.unwrap();
assert_eq!(
extensions,
&[
ExtensionMetadata {
id: "ext2".into(),
name: "Extension Two".into(),
version: "0.2.0".into(),
authors: vec!["marshall".into()],
description: "a great extension".into(),
repository: "ext2/repo".into(),
published_at: t0,
manifest: rpc::ExtensionApiManifest {
name: "Extension Two".into(),
version: "0.2.0".into(),
authors: vec!["marshall".into()],
description: Some("a great extension".into()),
repository: "ext2/repo".into(),
schema_version: Some(0),
wasm_api_version: None,
},
published_at: t0_chrono,
download_count: 7
},
ExtensionMetadata {
id: "ext1".into(),
name: "Extension One".into(),
version: "0.0.3".into(),
authors: vec!["max".into(), "marshall".into()],
description: "a real good extension".into(),
repository: "ext1/repo".into(),
published_at: t0,
manifest: rpc::ExtensionApiManifest {
name: "Extension One".into(),
version: "0.0.3".into(),
authors: vec!["max".into(), "marshall".into()],
description: Some("a real good extension".into()),
repository: "ext1/repo".into(),
schema_version: Some(1),
wasm_api_version: None,
},
published_at: t0_chrono,
download_count: 5,
},
]

View File

@@ -1,9 +1,12 @@
pub mod ai;
pub mod api;
pub mod auth;
pub mod db;
pub mod env;
pub mod executor;
mod rate_limiter;
pub mod rpc;
pub mod seed;
#[cfg(test)]
mod tests;
@@ -13,6 +16,7 @@ use aws_config::{BehaviorVersion, Region};
use axum::{http::StatusCode, response::IntoResponse};
use db::{ChannelId, Database};
use executor::Executor;
pub use rate_limiter::*;
use serde::Deserialize;
use std::{path::PathBuf, sync::Arc};
use util::ResultExt;
@@ -108,6 +112,8 @@ impl std::error::Error for Error {}
pub struct Config {
pub http_port: u16,
pub database_url: String,
pub migrations_path: Option<PathBuf>,
pub seed_path: Option<PathBuf>,
pub database_max_connections: u32,
pub api_token: String,
pub clickhouse_url: Option<String>,
@@ -126,6 +132,8 @@ pub struct Config {
pub blob_store_secret_key: Option<String>,
pub blob_store_bucket: Option<String>,
pub zed_environment: Arc<str>,
pub openai_api_key: Option<Arc<str>>,
pub google_ai_api_key: Option<Arc<str>>,
pub zed_client_checksum_seed: Option<String>,
pub slack_panics_webhook: Option<String>,
pub auto_join_channel_id: Option<ChannelId>,
@@ -137,22 +145,18 @@ impl Config {
}
}
#[derive(Default, Deserialize)]
pub struct MigrateConfig {
pub database_url: String,
pub migrations_path: Option<PathBuf>,
}
pub struct AppState {
pub db: Arc<Database>,
pub live_kit_client: Option<Arc<dyn live_kit_server::api::Client>>,
pub blob_store_client: Option<aws_sdk_s3::Client>,
pub rate_limiter: Arc<RateLimiter>,
pub executor: Executor,
pub clickhouse_client: Option<clickhouse::Client>,
pub config: Config,
}
impl AppState {
pub async fn new(config: Config) -> Result<Arc<Self>> {
pub async fn new(config: Config, executor: Executor) -> Result<Arc<Self>> {
let mut db_options = db::ConnectOptions::new(config.database_url.clone());
db_options.max_connections(config.database_max_connections);
let mut db = Database::new(db_options, Executor::Production).await?;
@@ -173,10 +177,13 @@ impl AppState {
None
};
let db = Arc::new(db);
let this = Self {
db: Arc::new(db),
db: db.clone(),
live_kit_client,
blob_store_client: build_blob_store_client(&config).await.log_err(),
rate_limiter: Arc::new(RateLimiter::new(db)),
executor,
clickhouse_client: config
.clickhouse_url
.as_ref()

View File

@@ -7,7 +7,7 @@ use axum::{
};
use collab::{
api::fetch_extensions_from_blob_store_periodically, db, env, executor::Executor, AppState,
Config, MigrateConfig, Result,
Config, RateLimiter, Result,
};
use db::Database;
use std::{
@@ -43,7 +43,16 @@ async fn main() -> Result<()> {
println!("collab v{} ({})", VERSION, REVISION.unwrap_or("unknown"));
}
Some("migrate") => {
run_migrations().await?;
let config = envy::from_env::<Config>().expect("error loading config");
run_migrations(&config).await?;
}
Some("seed") => {
let config = envy::from_env::<Config>().expect("error loading config");
let db_options = db::ConnectOptions::new(config.database_url.clone());
let mut db = Database::new(db_options, Executor::Production).await?;
db.initialize_notification_kinds().await?;
collab::seed::seed(&config, &db, true).await?;
}
Some("serve") => {
let (is_api, is_collab) = if let Some(next) = args.next() {
@@ -53,16 +62,16 @@ async fn main() -> Result<()> {
};
if !is_api && !is_collab {
Err(anyhow!(
"usage: collab <version | migrate | serve [api|collab]>"
"usage: collab <version | migrate | seed | serve [api|collab]>"
))?;
}
let config = envy::from_env::<Config>().expect("error loading config");
init_tracing(&config);
run_migrations().await?;
run_migrations(&config).await?;
let state = AppState::new(config).await?;
let state = AppState::new(config, Executor::Production).await?;
let listener = TcpListener::bind(&format!("0.0.0.0:{}", state.config.http_port))
.expect("failed to bind TCP listener");
@@ -72,8 +81,7 @@ async fn main() -> Result<()> {
.db
.create_server(&state.config.zed_environment)
.await?;
let rpc_server =
collab::rpc::Server::new(epoch, state.clone(), Executor::Production);
let rpc_server = collab::rpc::Server::new(epoch, state.clone());
rpc_server.start().await?;
Some(rpc_server)
@@ -81,8 +89,12 @@ async fn main() -> Result<()> {
None
};
if is_collab {
RateLimiter::save_periodically(state.rate_limiter.clone(), state.executor.clone());
}
if is_api {
fetch_extensions_from_blob_store_periodically(state.clone(), Executor::Production);
fetch_extensions_from_blob_store_periodically(state.clone());
}
let mut app = collab::api::routes(rpc_server.clone(), state.clone());
@@ -152,22 +164,25 @@ async fn main() -> Result<()> {
}
_ => {
Err(anyhow!(
"usage: collab <version | migrate | serve [api|collab]>"
"usage: collab <version | migrate | seed | serve [api|collab]>"
))?;
}
}
Ok(())
}
async fn run_migrations() -> Result<()> {
let config = envy::from_env::<MigrateConfig>().expect("error loading config");
async fn run_migrations(config: &Config) -> Result<()> {
let db_options = db::ConnectOptions::new(config.database_url.clone());
let db = Database::new(db_options, Executor::Production).await?;
let mut db = Database::new(db_options, Executor::Production).await?;
let migrations_path = config
.migrations_path
.as_deref()
.unwrap_or_else(|| Path::new(concat!(env!("CARGO_MANIFEST_DIR"), "/migrations")));
let migrations_path = config.migrations_path.as_deref().unwrap_or_else(|| {
#[cfg(feature = "sqlite")]
let default_migrations = concat!(env!("CARGO_MANIFEST_DIR"), "/migrations.sqlite");
#[cfg(not(feature = "sqlite"))]
let default_migrations = concat!(env!("CARGO_MANIFEST_DIR"), "/migrations");
Path::new(default_migrations)
});
let migrations = db.migrate(&migrations_path, false).await?;
for (migration, duration) in migrations {
@@ -179,6 +194,12 @@ async fn run_migrations() -> Result<()> {
);
}
db.initialize_notification_kinds().await?;
if config.seed_path.is_some() {
collab::seed::seed(&config, &db, false).await?;
}
return Ok(());
}

View File

@@ -0,0 +1,274 @@
use crate::{db::UserId, executor::Executor, Database, Error, Result};
use anyhow::anyhow;
use chrono::{DateTime, Duration, Utc};
use dashmap::{DashMap, DashSet};
use sea_orm::prelude::DateTimeUtc;
use std::sync::Arc;
use util::ResultExt;
pub trait RateLimit: 'static {
fn capacity() -> usize;
fn refill_duration() -> Duration;
fn db_name() -> &'static str;
}
/// Used to enforce per-user rate limits
pub struct RateLimiter {
buckets: DashMap<(UserId, String), RateBucket>,
dirty_buckets: DashSet<(UserId, String)>,
db: Arc<Database>,
}
impl RateLimiter {
pub fn new(db: Arc<Database>) -> Self {
RateLimiter {
buckets: DashMap::new(),
dirty_buckets: DashSet::new(),
db,
}
}
/// Spawns a new task that periodically saves rate limit data to the database.
pub fn save_periodically(rate_limiter: Arc<Self>, executor: Executor) {
const RATE_LIMITER_SAVE_INTERVAL: std::time::Duration = std::time::Duration::from_secs(10);
executor.clone().spawn_detached(async move {
loop {
executor.sleep(RATE_LIMITER_SAVE_INTERVAL).await;
rate_limiter.save().await.log_err();
}
});
}
/// Returns an error if the user has exceeded the specified `RateLimit`.
/// Attempts to read from the database if no cached RateBucket currently exists.
pub async fn check<T: RateLimit>(&self, user_id: UserId) -> Result<()> {
self.check_internal::<T>(user_id, Utc::now()).await
}
async fn check_internal<T: RateLimit>(&self, user_id: UserId, now: DateTimeUtc) -> Result<()> {
let bucket_key = (user_id, T::db_name().to_string());
// Attempt to fetch the bucket from the database if it hasn't been cached.
// For now, we keep buckets in memory for the lifetime of the process rather than expiring them,
// but this enforces limits across restarts so long as the database is reachable.
if !self.buckets.contains_key(&bucket_key) {
if let Some(bucket) = self.load_bucket::<T>(user_id).await.log_err().flatten() {
self.buckets.insert(bucket_key.clone(), bucket);
self.dirty_buckets.insert(bucket_key.clone());
}
}
let mut bucket = self
.buckets
.entry(bucket_key.clone())
.or_insert_with(|| RateBucket::new(T::capacity(), T::refill_duration(), now));
if bucket.value_mut().allow(now) {
self.dirty_buckets.insert(bucket_key);
Ok(())
} else {
Err(anyhow!("rate limit exceeded"))?
}
}
async fn load_bucket<K: RateLimit>(
&self,
user_id: UserId,
) -> Result<Option<RateBucket>, Error> {
Ok(self
.db
.get_rate_bucket(user_id, K::db_name())
.await?
.map(|saved_bucket| RateBucket {
capacity: K::capacity(),
refill_time_per_token: K::refill_duration(),
token_count: saved_bucket.token_count as usize,
last_refill: DateTime::from_naive_utc_and_offset(saved_bucket.last_refill, Utc),
}))
}
pub async fn save(&self) -> Result<()> {
let mut buckets = Vec::new();
self.dirty_buckets.retain(|key| {
if let Some(bucket) = self.buckets.get(&key) {
buckets.push(crate::db::rate_buckets::Model {
user_id: key.0,
rate_limit_name: key.1.clone(),
token_count: bucket.token_count as i32,
last_refill: bucket.last_refill.naive_utc(),
});
}
false
});
match self.db.save_rate_buckets(&buckets).await {
Ok(()) => Ok(()),
Err(err) => {
for bucket in buckets {
self.dirty_buckets
.insert((bucket.user_id, bucket.rate_limit_name));
}
Err(err)
}
}
}
}
#[derive(Clone)]
struct RateBucket {
capacity: usize,
token_count: usize,
refill_time_per_token: Duration,
last_refill: DateTimeUtc,
}
impl RateBucket {
fn new(capacity: usize, refill_duration: Duration, now: DateTimeUtc) -> Self {
RateBucket {
capacity,
token_count: capacity,
refill_time_per_token: refill_duration / capacity as i32,
last_refill: now,
}
}
fn allow(&mut self, now: DateTimeUtc) -> bool {
self.refill(now);
if self.token_count > 0 {
self.token_count -= 1;
true
} else {
false
}
}
fn refill(&mut self, now: DateTimeUtc) {
let elapsed = now - self.last_refill;
if elapsed >= self.refill_time_per_token {
let new_tokens =
elapsed.num_milliseconds() / self.refill_time_per_token.num_milliseconds();
self.token_count = (self.token_count + new_tokens as usize).min(self.capacity);
self.last_refill = now;
}
}
}
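// Worked example of the refill arithmetic above (editorial note): with capacity = 2 and
// refill_duration = 2s, refill_time_per_token is 1s. If 1.5s elapse between requests,
// new_tokens = 1500ms / 1000ms = 1, so one token is restored (capped at capacity) and
// last_refill is reset to `now`, discarding the remaining 0.5s of partial credit.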
#[cfg(test)]
mod tests {
use super::*;
use crate::db::{NewUserParams, TestDb};
use gpui::TestAppContext;
#[gpui::test]
async fn test_rate_limiter(cx: &mut TestAppContext) {
let test_db = TestDb::sqlite(cx.executor().clone());
let db = test_db.db().clone();
let user_1 = db
.create_user(
"user-1@zed.dev",
false,
NewUserParams {
github_login: "user-1".into(),
github_user_id: 1,
},
)
.await
.unwrap()
.user_id;
let user_2 = db
.create_user(
"user-2@zed.dev",
false,
NewUserParams {
github_login: "user-2".into(),
github_user_id: 2,
},
)
.await
.unwrap()
.user_id;
let mut now = Utc::now();
let rate_limiter = RateLimiter::new(db.clone());
// User 1 can access resource A two times before being rate-limited.
rate_limiter
.check_internal::<RateLimitA>(user_1, now)
.await
.unwrap();
rate_limiter
.check_internal::<RateLimitA>(user_1, now)
.await
.unwrap();
rate_limiter
.check_internal::<RateLimitA>(user_1, now)
.await
.unwrap_err();
// Limits are tracked per user and per resource: user 2 can access resource B, and
// user 1 can still access resource B despite being limited on resource A.
rate_limiter
.check_internal::<RateLimitB>(user_2, now)
.await
.unwrap();
rate_limiter
.check_internal::<RateLimitB>(user_1, now)
.await
.unwrap();
// After one second, user 1 can make another request before being rate-limited again.
now += Duration::seconds(1);
rate_limiter
.check_internal::<RateLimitA>(user_1, now)
.await
.unwrap();
rate_limiter
.check_internal::<RateLimitA>(user_1, now)
.await
.unwrap_err();
rate_limiter.save().await.unwrap();
// Rate limits are reloaded from the database, so user 1 is still rate-limited
// for resource A.
let rate_limiter = RateLimiter::new(db.clone());
rate_limiter
.check_internal::<RateLimitA>(user_1, now)
.await
.unwrap_err();
}
struct RateLimitA;
impl RateLimit for RateLimitA {
fn capacity() -> usize {
2
}
fn refill_duration() -> Duration {
Duration::seconds(2)
}
fn db_name() -> &'static str {
"rate-limit-a"
}
}
struct RateLimitB;
impl RateLimit for RateLimitB {
fn capacity() -> usize {
10
}
fn refill_duration() -> Duration {
Duration::seconds(3)
}
fn db_name() -> &'static str {
"rate-limit-b"
}
}
}

File diff suppressed because it is too large

@@ -1,15 +1,16 @@
use crate::db::UserId;
use crate::db::{ChannelId, ChannelRole, UserId};
use anyhow::{anyhow, Result};
use collections::{BTreeMap, HashSet};
use collections::{BTreeMap, HashMap, HashSet};
use rpc::ConnectionId;
use serde::Serialize;
use tracing::instrument;
use util::SemanticVersion;
use util::{semver, SemanticVersion};
#[derive(Default, Serialize)]
pub struct ConnectionPool {
connections: BTreeMap<ConnectionId, Connection>,
connected_users: BTreeMap<UserId, ConnectedUser>,
channels: ChannelPool,
}
#[derive(Default, Serialize)]
@@ -28,11 +29,8 @@ impl fmt::Display for ZedVersion {
}
impl ZedVersion {
pub fn is_supported(&self) -> bool {
self.0 != SemanticVersion::new(0, 123, 0)
}
pub fn supports_talker_role(&self) -> bool {
self.0 >= SemanticVersion::new(0, 125, 0)
pub fn can_collaborate(&self) -> bool {
self.0 >= semver(0, 127, 3)
}
}
@@ -47,6 +45,7 @@ impl ConnectionPool {
pub fn reset(&mut self) {
self.connections.clear();
self.connected_users.clear();
self.channels.clear();
}
#[instrument(skip(self))]
@@ -81,6 +80,7 @@ impl ConnectionPool {
connected_user.connection_ids.remove(&connection_id);
if connected_user.connection_ids.is_empty() {
self.connected_users.remove(&user_id);
self.channels.remove_user(&user_id);
}
self.connections.remove(&connection_id).unwrap();
Ok(())
@@ -110,6 +110,38 @@ impl ConnectionPool {
.copied()
}
pub fn channel_user_ids(
&self,
channel_id: ChannelId,
) -> impl Iterator<Item = (UserId, ChannelRole)> + '_ {
self.channels.users_to_notify(channel_id)
}
pub fn channel_connection_ids(
&self,
channel_id: ChannelId,
) -> impl Iterator<Item = (ConnectionId, ChannelRole)> + '_ {
self.channels
.users_to_notify(channel_id)
.flat_map(|(user_id, role)| {
self.user_connection_ids(user_id)
.map(move |connection_id| (connection_id, role))
})
}
pub fn subscribe_to_channel(
&mut self,
user_id: UserId,
channel_id: ChannelId,
role: ChannelRole,
) {
self.channels.subscribe(user_id, channel_id, role);
}
pub fn unsubscribe_from_channel(&mut self, user_id: &UserId, channel_id: &ChannelId) {
self.channels.unsubscribe(user_id, channel_id);
}
pub fn is_user_online(&self, user_id: UserId) -> bool {
!self
.connected_users
@@ -140,3 +172,70 @@ impl ConnectionPool {
}
}
}
#[derive(Default, Serialize)]
pub struct ChannelPool {
by_user: HashMap<UserId, HashMap<ChannelId, ChannelRole>>,
by_channel: HashMap<ChannelId, HashSet<UserId>>,
}
impl ChannelPool {
pub fn clear(&mut self) {
self.by_user.clear();
self.by_channel.clear();
}
pub fn subscribe(&mut self, user_id: UserId, channel_id: ChannelId, role: ChannelRole) {
self.by_user
.entry(user_id)
.or_default()
.insert(channel_id, role);
self.by_channel
.entry(channel_id)
.or_default()
.insert(user_id);
}
pub fn unsubscribe(&mut self, user_id: &UserId, channel_id: &ChannelId) {
if let Some(channels) = self.by_user.get_mut(user_id) {
channels.remove(channel_id);
if channels.is_empty() {
self.by_user.remove(user_id);
}
}
if let Some(users) = self.by_channel.get_mut(channel_id) {
users.remove(user_id);
if users.is_empty() {
self.by_channel.remove(channel_id);
}
}
}
pub fn remove_user(&mut self, user_id: &UserId) {
if let Some(channels) = self.by_user.remove(&user_id) {
for channel_id in channels.keys() {
self.unsubscribe(user_id, &channel_id)
}
}
}
pub fn users_to_notify(
&self,
channel_id: ChannelId,
) -> impl '_ + Iterator<Item = (UserId, ChannelRole)> {
self.by_channel
.get(&channel_id)
.into_iter()
.flat_map(move |users| {
users.iter().flat_map(move |user_id| {
Some((
*user_id,
self.by_user
.get(user_id)
.and_then(|channels| channels.get(&channel_id))
.copied()?,
))
})
})
}
}
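// Editorial sketch (not part of the diff): both indices are written on subscribe so that
// notification fan-out can go from channel to (user, role) pairs. Assuming `pool`, `user`,
// and `channel` values are in scope:
//
//     pool.subscribe(user, channel, ChannelRole::Member);
//     for (user_id, role) in pool.users_to_notify(channel) {
//         // look up the user's connections and send the channel update
//     }
//     pool.unsubscribe(&user, &channel);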

crates/collab/src/seed.rs (new file, 137 lines)

@@ -0,0 +1,137 @@
use crate::db::{self, ChannelRole, NewUserParams};
use anyhow::Context;
use db::Database;
use serde::{de::DeserializeOwned, Deserialize};
use std::{fmt::Write, fs, path::Path};
use crate::Config;
#[derive(Debug, Deserialize)]
struct GitHubUser {
id: i32,
login: String,
email: Option<String>,
}
#[derive(Deserialize)]
struct SeedConfig {
// Which users to create as admins.
admins: Vec<String>,
// Which channels to create (all admins are invited to all channels)
channels: Vec<String>,
// Number of random users to create from the GitHub API.
number_of_users: Option<usize>,
}
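// Editorial sketch (not part of the diff): a hypothetical seed file matching the fields
// above; it is read from SEED_PATH and parsed with serde_json in `load_admins` below.
//
//     {
//         "admins": ["octocat"],
//         "channels": ["zed"],
//         "number_of_users": 100
//     }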
pub async fn seed(config: &Config, db: &Database, force: bool) -> anyhow::Result<()> {
let client = reqwest::Client::new();
if !db.get_all_users(0, 1).await?.is_empty() && !force {
return Ok(());
}
let seed_path = config
.seed_path
.as_ref()
.context("called seed with no SEED_PATH")?;
let seed_config = load_admins(seed_path)
.context(format!("failed to load {}", seed_path.to_string_lossy()))?;
let mut first_user = None;
let mut others = vec![];
for admin_login in seed_config.admins {
let user = fetch_github::<GitHubUser>(
&client,
&format!("https://api.github.com/users/{admin_login}"),
)
.await;
let user = db
.create_user(
&user.email.unwrap_or(format!("{admin_login}@example.com")),
true,
NewUserParams {
github_login: user.login,
github_user_id: user.id,
},
)
.await
.context("failed to create admin user")?;
if first_user.is_none() {
first_user = Some(user.user_id);
} else {
others.push(user.user_id)
}
}
for channel in seed_config.channels {
let (channel, _) = db
.create_channel(&channel, None, first_user.unwrap())
.await
.context("failed to create channel")?;
for user_id in &others {
db.invite_channel_member(
channel.id,
*user_id,
first_user.unwrap(),
ChannelRole::Admin,
)
.await
.context("failed to add user to channel")?;
}
}
if let Some(number_of_users) = seed_config.number_of_users {
// Fetch additional random users from GitHub (in pages of 100) and insert them into
// the database (for testing autocompleters, etc.)
let mut user_count = db
.get_all_users(0, 200)
.await
.expect("failed to load users from db")
.len();
let mut last_user_id = None;
while user_count < number_of_users {
let mut uri = "https://api.github.com/users?per_page=100".to_string();
if let Some(last_user_id) = last_user_id {
write!(&mut uri, "&since={}", last_user_id).unwrap();
}
let users = fetch_github::<Vec<GitHubUser>>(&client, &uri).await;
for github_user in users {
last_user_id = Some(github_user.id);
user_count += 1;
db.get_or_create_user_by_github_account(
&github_user.login,
Some(github_user.id),
github_user.email.as_deref(),
None,
)
.await
.expect("failed to insert user");
}
}
}
Ok(())
}
fn load_admins(path: impl AsRef<Path>) -> anyhow::Result<SeedConfig> {
let file_content = fs::read_to_string(path)?;
Ok(serde_json::from_str(&file_content)?)
}
async fn fetch_github<T: DeserializeOwned>(client: &reqwest::Client, url: &str) -> T {
let response = client
.get(url)
.header("user-agent", "zed")
.send()
.await
.unwrap_or_else(|_| panic!("failed to fetch '{}'", url));
response
.json()
.await
.unwrap_or_else(|_| panic!("failed to deserialize github user from '{}'", url))
}
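// Editorial note (not part of the diff): callers are expected to invoke this as
// `seed(&config, &db, false).await?`; passing `force = true` re-runs seeding even when
// users already exist (see the early return at the top of `seed`).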


@@ -466,3 +466,136 @@ async fn test_chat_replies(cx_a: &mut TestAppContext, cx_b: &mut TestAppContext)
)
});
}
#[gpui::test]
async fn test_chat_editing(cx_a: &mut TestAppContext, cx_b: &mut TestAppContext) {
let mut server = TestServer::start(cx_a.executor()).await;
let client_a = server.create_client(cx_a, "user_a").await;
let client_b = server.create_client(cx_b, "user_b").await;
let channel_id = server
.make_channel(
"the-channel",
None,
(&client_a, cx_a),
&mut [(&client_b, cx_b)],
)
.await;
// Client A sends a message, client B should see that there is a new message.
let channel_chat_a = client_a
.channel_store()
.update(cx_a, |store, cx| store.open_channel_chat(channel_id, cx))
.await
.unwrap();
let channel_chat_b = client_b
.channel_store()
.update(cx_b, |store, cx| store.open_channel_chat(channel_id, cx))
.await
.unwrap();
let msg_id = channel_chat_a
.update(cx_a, |c, cx| {
c.send_message(
MessageParams {
text: "Initial message".into(),
reply_to_message_id: None,
mentions: Vec::new(),
},
cx,
)
.unwrap()
})
.await
.unwrap();
cx_a.run_until_parked();
channel_chat_a
.update(cx_a, |c, cx| {
c.update_message(
msg_id,
MessageParams {
text: "Updated body".into(),
reply_to_message_id: None,
mentions: Vec::new(),
},
cx,
)
.unwrap()
})
.await
.unwrap();
cx_a.run_until_parked();
cx_b.run_until_parked();
channel_chat_a.update(cx_a, |channel_chat, _| {
let update_message = channel_chat.find_loaded_message(msg_id).unwrap();
assert_eq!(update_message.body, "Updated body");
assert_eq!(update_message.mentions, Vec::new());
});
channel_chat_b.update(cx_b, |channel_chat, _| {
let update_message = channel_chat.find_loaded_message(msg_id).unwrap();
assert_eq!(update_message.body, "Updated body");
assert_eq!(update_message.mentions, Vec::new());
});
// Test that mentions are updated correctly when the message is edited.
client_b.notification_store().read_with(cx_b, |store, _| {
assert_eq!(store.notification_count(), 1);
let entry = store.notification_at(0).unwrap();
assert!(matches!(
entry.notification,
Notification::ChannelInvitation { .. }
),);
});
channel_chat_a
.update(cx_a, |c, cx| {
c.update_message(
msg_id,
MessageParams {
text: "Updated body including a mention for @user_b".into(),
reply_to_message_id: None,
mentions: vec![(37..45, client_b.id())],
},
cx,
)
.unwrap()
})
.await
.unwrap();
cx_a.run_until_parked();
cx_b.run_until_parked();
channel_chat_a.update(cx_a, |channel_chat, _| {
assert_eq!(
channel_chat.find_loaded_message(msg_id).unwrap().body,
"Updated body including a mention for @user_b",
)
});
channel_chat_b.update(cx_b, |channel_chat, _| {
assert_eq!(
channel_chat.find_loaded_message(msg_id).unwrap().body,
"Updated body including a mention for @user_b",
)
});
client_b.notification_store().read_with(cx_b, |store, _| {
assert_eq!(store.notification_count(), 2);
let entry = store.notification_at(0).unwrap();
assert_eq!(
entry.notification,
Notification::ChannelMessageMention {
message_id: msg_id,
sender_id: client_a.id(),
channel_id: channel_id.0,
}
);
});
}


@@ -12,7 +12,7 @@ use editor::{
Editor,
};
use futures::StreamExt;
use gpui::{TestAppContext, VisualContext, VisualTestContext};
use gpui::{BorrowAppContext, TestAppContext, VisualContext, VisualTestContext};
use indoc::indoc;
use language::{
language_settings::{AllLanguageSettings, InlayHintSettings},


@@ -7,8 +7,8 @@ use collab_ui::{
};
use editor::{Editor, ExcerptRange, MultiBuffer};
use gpui::{
point, BackgroundExecutor, Context, Entity, SharedString, TestAppContext, View, VisualContext,
VisualTestContext,
point, BackgroundExecutor, BorrowAppContext, Context, Entity, SharedString, TestAppContext,
View, VisualContext, VisualTestContext,
};
use language::Capability;
use live_kit_client::MacOSDisplay;


@@ -8,8 +8,8 @@ use collections::{HashMap, HashSet};
use fs::{repository::GitFileStatus, FakeFs, Fs as _, RemoveOptions};
use futures::StreamExt as _;
use gpui::{
px, size, AppContext, BackgroundExecutor, Model, Modifiers, MouseButton, MouseDownEvent,
TestAppContext,
px, size, AppContext, BackgroundExecutor, BorrowAppContext, Model, Modifiers, MouseButton,
MouseDownEvent, TestAppContext,
};
use language::{
language_settings::{AllLanguageSettings, Formatter},
@@ -5891,6 +5891,7 @@ async fn test_right_click_menu_behind_collab_panel(cx: &mut TestAppContext) {
position: new_tab_button_bounds.center(),
modifiers: Modifiers::default(),
click_count: 1,
first_mouse: false,
});
// Regression test: the right-click menu for tabs should not open.
@@ -5902,6 +5903,7 @@ async fn test_right_click_menu_behind_collab_panel(cx: &mut TestAppContext) {
position: tab_bounds.center(),
modifiers: Modifiers::default(),
click_count: 1,
first_mouse: false,
});
assert!(cx.debug_bounds("MENU_ITEM-Close").is_some());
}


@@ -1,8 +1,8 @@
use crate::{
db::{tests::TestDb, NewUserParams, UserId},
executor::Executor,
rpc::{Server, ZedVersion, CLEANUP_TIMEOUT, RECONNECT_TIMEOUT},
AppState, Config,
rpc::{Principal, Server, ZedVersion, CLEANUP_TIMEOUT, RECONNECT_TIMEOUT},
AppState, Config, RateLimiter,
};
use anyhow::anyhow;
use call::ActiveCall;
@@ -93,17 +93,14 @@ impl TestServer {
deterministic.clone(),
)
.unwrap();
let app_state = Self::build_app_state(&test_db, &live_kit_server).await;
let executor = Executor::Deterministic(deterministic.clone());
let app_state = Self::build_app_state(&test_db, &live_kit_server, executor.clone()).await;
let epoch = app_state
.db
.create_server(&app_state.config.zed_environment)
.await
.unwrap();
let server = Server::new(
epoch,
app_state.clone(),
Executor::Deterministic(deterministic.clone()),
);
let server = Server::new(epoch, app_state.clone());
server.start().await.unwrap();
// Advance clock to ensure the server's cleanup task is finished.
deterministic.advance_clock(CLEANUP_TIMEOUT);
@@ -200,15 +197,20 @@ impl TestServer {
.override_authenticate(move |cx| {
cx.spawn(|_| async move {
let access_token = "the-token".to_string();
Ok(Credentials {
Ok(Credentials::User {
user_id: user_id.to_proto(),
access_token,
})
})
})
.override_establish_connection(move |credentials, cx| {
assert_eq!(credentials.user_id, user_id.0 as u64);
assert_eq!(credentials.access_token, "the-token");
assert_eq!(
credentials,
&Credentials::User {
user_id: user_id.0 as u64,
access_token: "the-token".into()
}
);
let server = server.clone();
let db = db.clone();
@@ -233,9 +235,8 @@ impl TestServer {
.spawn(server.handle_connection(
server_conn,
client_name,
user,
Principal::User(user),
ZedVersion(SemanticVersion::new(1, 0, 0)),
None,
Some(connection_id_tx),
Executor::Deterministic(cx.background_executor().clone()),
))
@@ -482,12 +483,15 @@ impl TestServer {
pub async fn build_app_state(
test_db: &TestDb,
fake_server: &live_kit_client::TestServer,
live_kit_test_server: &live_kit_client::TestServer,
executor: Executor,
) -> Arc<AppState> {
Arc::new(AppState {
db: test_db.db().clone(),
live_kit_client: Some(Arc::new(fake_server.create_api_client())),
live_kit_client: Some(Arc::new(live_kit_test_server.create_api_client())),
blob_store_client: None,
rate_limiter: Arc::new(RateLimiter::new(test_db.db().clone())),
executor,
clickhouse_client: None,
config: Config {
http_port: 0,
@@ -506,6 +510,8 @@ impl TestServer {
blob_store_access_key: None,
blob_store_secret_key: None,
blob_store_bucket: None,
openai_api_key: None,
google_ai_api_key: None,
clickhouse_url: None,
clickhouse_user: None,
clickhouse_password: None,
@@ -513,6 +519,8 @@ impl TestServer {
zed_client_checksum_seed: None,
slack_panics_webhook: None,
auto_join_channel_id: None,
migrations_path: None,
seed_path: None,
},
})
}


@@ -5,18 +5,18 @@ use channel::{ChannelChat, ChannelChatEvent, ChannelMessage, ChannelMessageId, C
use client::{ChannelId, Client};
use collections::HashMap;
use db::kvp::KEY_VALUE_STORE;
use editor::Editor;
use editor::{actions, Editor};
use gpui::{
actions, div, list, prelude::*, px, Action, AppContext, AsyncWindowContext, ClipboardItem,
CursorStyle, DismissEvent, ElementId, EventEmitter, FocusHandle, FocusableView, FontWeight,
ListOffset, ListScrollEvent, ListState, Model, Render, Subscription, Task, View, ViewContext,
VisualContext, WeakView,
HighlightStyle, ListOffset, ListScrollEvent, ListState, Model, Render, Stateful, Subscription,
Task, View, ViewContext, VisualContext, WeakView,
};
use language::LanguageRegistry;
use menu::Confirm;
use message_editor::MessageEditor;
use project::Fs;
use rich_text::RichText;
use rich_text::{Highlight, RichText};
use serde::{Deserialize, Serialize};
use settings::Settings;
use std::{sync::Arc, time::Duration};
@@ -64,7 +64,6 @@ pub struct ChatPanel {
open_context_menu: Option<(u64, Subscription)>,
highlighted_message: Option<(u64, Task<()>)>,
last_acknowledged_message_id: Option<u64>,
selected_message_to_reply_id: Option<u64>,
}
#[derive(Serialize, Deserialize)]
@@ -72,7 +71,7 @@ struct SerializedChatPanel {
width: Option<Pixels>,
}
actions!(chat_panel, [ToggleFocus, CloseReplyPreview]);
actions!(chat_panel, [ToggleFocus]);
impl ChatPanel {
pub fn new(workspace: &mut Workspace, cx: &mut ViewContext<Workspace>) -> View<Self> {
@@ -129,7 +128,6 @@ impl ChatPanel {
open_context_menu: None,
highlighted_message: None,
last_acknowledged_message_id: None,
selected_message_to_reply_id: None,
};
if let Some(channel_id) = ActiveCall::global(cx)
@@ -268,6 +266,13 @@ impl ChatPanel {
self.acknowledge_last_message(cx);
}
}
ChannelChatEvent::UpdateMessage {
message_id,
message_ix,
} => {
self.message_list.splice(*message_ix..*message_ix + 1, 1);
self.markdown_data.remove(message_id);
}
ChannelChatEvent::NewMessage {
channel_id,
message_id,
@@ -349,6 +354,7 @@ impl ChatPanel {
.px_0p5()
.gap_x_1()
.rounded_md()
.overflow_hidden()
.hover(|style| style.bg(cx.theme().colors().element_background))
.child(Icon::new(IconName::ReplyArrowRight).color(Color::Muted))
.child(Avatar::new(user_being_replied_to.avatar_uri.clone()).size(rems(0.7)))
@@ -413,6 +419,7 @@ impl ChatPanel {
let belongs_to_user = Some(message.sender.id) == self.client.user_id();
let can_delete_message = belongs_to_user || is_admin;
let can_edit_message = belongs_to_user;
let element_id: ElementId = match message.id {
ChannelMessageId::Saved(id) => ("saved-message", id).into(),
@@ -449,6 +456,8 @@ impl ChatPanel {
cx.theme().colors().panel_background
};
let reply_to_message_id = self.message_editor.read(cx).reply_to_message_id();
v_flex()
.w_full()
.relative()
@@ -462,7 +471,7 @@ impl ChatPanel {
.overflow_hidden()
.px_1p5()
.py_0p5()
.when_some(self.selected_message_to_reply_id, |el, reply_id| {
.when_some(reply_to_message_id, |el, reply_id| {
el.when_some(message_id, |el, message_id| {
el.when(reply_id == message_id, |el| {
el.bg(cx.theme().colors().element_selected)
@@ -559,7 +568,7 @@ impl ChatPanel {
},
)
.child(
self.render_popover_buttons(&cx, message_id, can_delete_message)
self.render_popover_buttons(&cx, message_id, can_delete_message, can_edit_message)
.neg_mt_2p5(),
)
}
@@ -571,94 +580,122 @@ impl ChatPanel {
}
}
fn render_popover_button(&self, cx: &ViewContext<Self>, child: Stateful<Div>) -> Div {
div()
.w_6()
.bg(cx.theme().colors().element_background)
.hover(|style| style.bg(cx.theme().colors().element_hover).rounded_md())
.child(child)
}
fn render_popover_buttons(
&self,
cx: &ViewContext<Self>,
message_id: Option<u64>,
can_delete_message: bool,
can_edit_message: bool,
) -> Div {
div()
h_flex()
.absolute()
.child(
div()
.absolute()
.right_8()
.w_6()
.rounded_tl_md()
.rounded_bl_md()
.border_l_1()
.border_t_1()
.border_b_1()
.border_color(cx.theme().colors().element_selected)
.bg(cx.theme().colors().element_background)
.hover(|style| style.bg(cx.theme().colors().element_hover))
.when(!self.has_open_menu(message_id), |el| {
el.visible_on_hover("")
})
.when_some(message_id, |el, message_id| {
el.child(
.right_2()
.overflow_hidden()
.rounded_md()
.border_color(cx.theme().colors().element_selected)
.border_1()
.when(!self.has_open_menu(message_id), |el| {
el.visible_on_hover("")
})
.bg(cx.theme().colors().element_background)
.when_some(message_id, |el, message_id| {
el.child(
self.render_popover_button(
cx,
div()
.id("reply")
.child(
IconButton::new(("reply", message_id), IconName::ReplyArrowRight)
.on_click(cx.listener(move |this, _, cx| {
this.message_editor.update(cx, |editor, cx| {
editor.set_reply_to_message_id(message_id);
editor.focus_handle(cx).focus(cx);
})
})),
)
.tooltip(|cx| Tooltip::text("Reply", cx)),
),
)
})
.when_some(message_id, |el, message_id| {
el.when(can_edit_message, |el| {
el.child(
self.render_popover_button(
cx,
div()
.id("reply")
.id("edit")
.child(
IconButton::new(
("reply", message_id),
IconName::ReplyArrowLeft,
)
.on_click(cx.listener(
move |this, _, cx| {
this.selected_message_to_reply_id = Some(message_id);
IconButton::new(("edit", message_id), IconName::Pencil)
.on_click(cx.listener(move |this, _, cx| {
this.message_editor.update(cx, |editor, cx| {
editor.set_reply_to_message_id(message_id);
editor.focus_handle(cx).focus(cx);
})
},
)),
)
.tooltip(|cx| Tooltip::text("Reply", cx)),
)
}),
)
.child(
div()
.absolute()
.right_2()
.w_6()
.rounded_tr_md()
.rounded_br_md()
.border_r_1()
.border_t_1()
.border_b_1()
.border_color(cx.theme().colors().element_selected)
.bg(cx.theme().colors().element_background)
.hover(|style| style.bg(cx.theme().colors().element_hover))
.when(!self.has_open_menu(message_id), |el| {
el.visible_on_hover("")
})
.when_some(message_id, |el, message_id| {
let this = cx.view().clone();
let message = this
.active_chat()
.and_then(|active_chat| {
active_chat
.read(cx)
.find_loaded_message(message_id)
})
.cloned();
el.child(
div()
.id("more")
.child(
popover_menu(("menu", message_id))
.trigger(IconButton::new(
("trigger", message_id),
IconName::Ellipsis,
))
.menu(move |cx| {
Some(Self::render_message_menu(
&this,
message_id,
can_delete_message,
cx,
))
}),
if let Some(message) = message {
let buffer = editor
.editor
.read(cx)
.buffer()
.read(cx)
.as_singleton()
.expect("message editor must be singleton");
buffer.update(cx, |buffer, cx| {
buffer.set_text(message.body.clone(), cx)
});
editor.set_edit_message_id(message_id);
editor.focus_handle(cx).focus(cx);
}
})
})),
)
.tooltip(|cx| Tooltip::text("More", cx)),
)
}),
)
.tooltip(|cx| Tooltip::text("Edit", cx)),
),
)
})
})
.when_some(message_id, |el, message_id| {
let this = cx.view().clone();
el.child(
self.render_popover_button(
cx,
div()
.child(
popover_menu(("menu", message_id))
.trigger(IconButton::new(
("trigger", message_id),
IconName::Ellipsis,
))
.menu(move |cx| {
Some(Self::render_message_menu(
&this,
message_id,
can_delete_message,
cx,
))
}),
)
.id("more")
.tooltip(|cx| Tooltip::text("More", cx)),
),
)
})
}
fn render_message_menu(
@@ -670,18 +707,6 @@ impl ChatPanel {
let menu = {
ContextMenu::build(cx, move |menu, cx| {
menu.entry(
"Reply to message",
None,
cx.handler_for(&this, move |this, cx| {
this.selected_message_to_reply_id = Some(message_id);
this.message_editor.update(cx, |editor, cx| {
editor.set_reply_to_message_id(message_id);
editor.focus_handle(cx).focus(cx);
})
}),
)
.entry(
"Copy message text",
None,
cx.handler_for(&this, move |this, cx| {
@@ -693,7 +718,7 @@ impl ChatPanel {
}
}),
)
.when(can_delete_message, move |menu| {
.when(can_delete_message, |menu| {
menu.entry(
"Delete message",
None,
@@ -725,22 +750,52 @@ impl ChatPanel {
})
.collect::<Vec<_>>();
rich_text::render_rich_text(message.body.clone(), &mentions, language_registry, None)
const MESSAGE_UPDATED: &str = " (edited)";
let mut body = message.body.clone();
if message.edited_at.is_some() {
body.push_str(MESSAGE_UPDATED);
}
let mut rich_text = rich_text::render_rich_text(body, &mentions, language_registry, None);
if message.edited_at.is_some() {
rich_text.highlights.push((
(rich_text.text.len() - MESSAGE_UPDATED.len())..rich_text.text.len(),
Highlight::Highlight(HighlightStyle {
fade_out: Some(0.8),
..Default::default()
}),
));
}
rich_text
}
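// Editorial note (not part of the diff), a worked example of the highlight range above:
// assuming the rendered text equals the raw body, a body of "Updated body" becomes
// "Updated body (edited)" (21 characters), and the highlight covers 12..21, fading only
// the 9-character " (edited)" suffix.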
fn send(&mut self, _: &Confirm, cx: &mut ViewContext<Self>) {
self.selected_message_to_reply_id = None;
if let Some((chat, _)) = self.active_chat.as_ref() {
let message = self
.message_editor
.update(cx, |editor, cx| editor.take_message(cx));
if let Some(task) = chat
.update(cx, |chat, cx| chat.send_message(message, cx))
.log_err()
{
task.detach();
if let Some(id) = self.message_editor.read(cx).edit_message_id() {
self.message_editor.update(cx, |editor, _| {
editor.clear_edit_message_id();
});
if let Some(task) = chat
.update(cx, |chat, cx| chat.update_message(id, message, cx))
.log_err()
{
task.detach();
}
} else {
if let Some(task) = chat
.update(cx, |chat, cx| chat.send_message(message, cx))
.log_err()
{
task.detach();
}
}
}
}
@@ -825,16 +880,39 @@ impl ChatPanel {
})
}
fn close_reply_preview(&mut self, _: &CloseReplyPreview, cx: &mut ViewContext<Self>) {
self.selected_message_to_reply_id = None;
fn close_reply_preview(&mut self, cx: &mut ViewContext<Self>) {
self.message_editor
.update(cx, |editor, _| editor.clear_reply_to_message_id());
}
fn cancel_edit_message(&mut self, cx: &mut ViewContext<Self>) {
self.message_editor.update(cx, |editor, cx| {
// only clear the editor input if we were editing a message
if editor.edit_message_id().is_none() {
return;
}
editor.clear_edit_message_id();
let buffer = editor
.editor
.read(cx)
.buffer()
.read(cx)
.as_singleton()
.expect("message editor must be singleton");
buffer.update(cx, |buffer, cx| buffer.set_text("", cx));
});
}
}
impl Render for ChatPanel {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let reply_to_message_id = self.message_editor.read(cx).reply_to_message_id();
let message_editor = self.message_editor.read(cx);
let reply_to_message_id = message_editor.reply_to_message_id();
let edit_message_id = message_editor.edit_message_id();
v_flex()
.key_context("ChatPanel")
@@ -890,13 +968,36 @@ impl Render for ChatPanel {
)
}
}))
.when(!self.is_scrolled_to_bottom, |el| {
el.child(div().border_t_1().border_color(cx.theme().colors().border))
})
.when_some(edit_message_id, |el, _| {
el.child(
h_flex()
.px_2()
.text_ui_xs()
.justify_between()
.border_t_1()
.border_color(cx.theme().colors().border)
.bg(cx.theme().colors().background)
.child("Editing message")
.child(
IconButton::new("cancel-edit-message", IconName::Close)
.shape(ui::IconButtonShape::Square)
.tooltip(|cx| Tooltip::text("Cancel edit message", cx))
.on_click(cx.listener(move |this, _, cx| {
this.cancel_edit_message(cx);
})),
),
)
})
.when_some(reply_to_message_id, |el, reply_to_message_id| {
let reply_message = self
.active_chat()
.and_then(|active_chat| {
active_chat.read(cx).messages().iter().find(|message| {
message.id == ChannelMessageId::Saved(reply_to_message_id)
})
active_chat
.read(cx)
.find_loaded_message(reply_to_message_id)
})
.cloned();
@@ -932,13 +1033,9 @@ impl Render for ChatPanel {
.child(
IconButton::new("close-reply-preview", IconName::Close)
.shape(ui::IconButtonShape::Square)
.tooltip(|cx| {
Tooltip::for_action("Close reply", &CloseReplyPreview, cx)
})
.tooltip(|cx| Tooltip::text("Close reply", cx))
.on_click(cx.listener(move |this, _, cx| {
this.selected_message_to_reply_id = None;
cx.dispatch_action(CloseReplyPreview.boxed_clone())
this.close_reply_preview(cx);
})),
),
)
@@ -947,13 +1044,11 @@ impl Render for ChatPanel {
.children(
Some(
h_flex()
.key_context("MessageEditor")
.on_action(cx.listener(ChatPanel::close_reply_preview))
.when(
!self.is_scrolled_to_bottom && reply_to_message_id.is_none(),
|el| el.border_t_1().border_color(cx.theme().colors().border),
)
.p_2()
.on_action(cx.listener(|this, _: &actions::Cancel, cx| {
this.cancel_edit_message(cx);
this.close_reply_preview(cx);
}))
.map(|el| el.child(self.message_editor.clone())),
)
.filter(|_| self.active_chat.is_some()),
@@ -1056,6 +1151,7 @@ mod tests {
nonce: 5,
mentions: vec![(ranges[0].clone(), 101), (ranges[1].clone(), 102)],
reply_to_message_id: None,
edited_at: None,
};
let message = ChatPanel::render_markdown_with_mentions(&language_registry, 102, &message);
@@ -1103,6 +1199,7 @@ mod tests {
nonce: 5,
mentions: Vec::new(),
reply_to_message_id: None,
edited_at: None,
};
let message = ChatPanel::render_markdown_with_mentions(&language_registry, 102, &message);
@@ -1143,6 +1240,7 @@ mod tests {
nonce: 5,
mentions: Vec::new(),
reply_to_message_id: None,
edited_at: None,
};
let message = ChatPanel::render_markdown_with_mentions(&language_registry, 102, &message);


@@ -37,6 +37,7 @@ pub struct MessageEditor {
mentions_task: Option<Task<()>>,
channel_id: Option<ChannelId>,
reply_to_message_id: Option<u64>,
edit_message_id: Option<u64>,
}
struct MessageEditorCompletionProvider(WeakView<MessageEditor>);
@@ -131,6 +132,7 @@ impl MessageEditor {
mentions: Vec::new(),
mentions_task: None,
reply_to_message_id: None,
edit_message_id: None,
}
}
@@ -146,6 +148,18 @@ impl MessageEditor {
self.reply_to_message_id = None;
}
pub fn edit_message_id(&self) -> Option<u64> {
self.edit_message_id
}
pub fn set_edit_message_id(&mut self, edit_message_id: u64) {
self.edit_message_id = Some(edit_message_id);
}
pub fn clear_edit_message_id(&mut self) {
self.edit_message_id = None;
}
pub fn set_channel(
&mut self,
channel_id: ChannelId,


@@ -1802,7 +1802,7 @@ impl CollabPanel {
}
}
fn show_inline_context_menu(&mut self, _: &menu::ShowContextMenu, cx: &mut ViewContext<Self>) {
fn show_inline_context_menu(&mut self, _: &menu::SecondaryConfirm, cx: &mut ViewContext<Self>) {
let Some(bounds) = self
.selection
.and_then(|ix| self.list_state.bounds_for_item(ix))


@@ -689,7 +689,7 @@ impl CollabTitlebarItem {
ContextMenu::build(cx, |menu, _| {
menu.action("Settings", zed_actions::OpenSettings.boxed_clone())
.action("Extensions", extensions_ui::Extensions.boxed_clone())
.action("Themes...", theme_selector::Toggle.boxed_clone())
.action("Themes...", theme_selector::Toggle::default().boxed_clone())
.separator()
.action("Sign Out", client::SignOut.boxed_clone())
})
@@ -713,7 +713,7 @@ impl CollabTitlebarItem {
ContextMenu::build(cx, |menu, _| {
menu.action("Settings", zed_actions::OpenSettings.boxed_clone())
.action("Extensions", extensions_ui::Extensions.boxed_clone())
.action("Themes...", theme_selector::Toggle.boxed_clone())
.action("Themes...", theme_selector::Toggle::default().boxed_clone())
})
.into()
})


@@ -13,7 +13,7 @@ use call::{report_call_event_for_room, ActiveCall};
pub use collab_panel::CollabPanel;
pub use collab_titlebar_item::CollabTitlebarItem;
use gpui::{
actions, point, AppContext, GlobalPixels, Pixels, PlatformDisplay, Size, Task, WindowContext,
actions, point, AppContext, DevicePixels, Pixels, PlatformDisplay, Size, Task, WindowContext,
WindowKind, WindowOptions,
};
use panel_settings::MessageEditorSettings;
@@ -97,13 +97,13 @@ fn notification_window_options(
screen: Rc<dyn PlatformDisplay>,
window_size: Size<Pixels>,
) -> WindowOptions {
let notification_margin_width = GlobalPixels::from(16.);
let notification_margin_height = GlobalPixels::from(-0.) - GlobalPixels::from(48.);
let notification_margin_width = DevicePixels::from(16);
let notification_margin_height = DevicePixels::from(-0) - DevicePixels::from(48);
let screen_bounds = screen.bounds();
let size: Size<GlobalPixels> = window_size.into();
let size: Size<DevicePixels> = window_size.into();
let bounds = gpui::Bounds::<GlobalPixels> {
let bounds = gpui::Bounds::<DevicePixels> {
origin: screen_bounds.upper_right()
- point(
size.width + notification_margin_width,

Some files were not shown because too many files have changed in this diff.