Compare commits


67 Commits

Author SHA1 Message Date
Jason Mancuso
a533e622fa clean up documentation, fix links, add to summary 2024-08-20 13:59:41 -04:00
Jason Mancuso
cd7073bd19 docs: add documentation about context servers 2024-08-20 12:35:15 -04:00
Bennet Bo Fenner
0c980cde74 docs: Cleanup assistant configuration documentation (#16526)
Release Notes:

- N/A
2024-08-20 17:40:19 +02:00
Nate Butler
9951df7709 Update some docs keybindings to new format (#16524)
Updates some of the docs pages to the new keybinding format.

Release Notes:

- N/A
2024-08-20 11:23:40 -04:00
Marshall Bowers
936466e02c docs: Reword "Extensibility" section of slash command docs (#16521)
This PR rewords the "Extensibility" section of the slash command docs.

Release Notes:

- N/A
2024-08-20 10:13:53 -04:00
Marshall Bowers
1eb1e16954 docs: Fix possessive "its" typos (#16516)
This PR fixes a number of typos where possessive "its" wasn't being used
properly.

Release Notes:

- N/A
2024-08-20 09:59:29 -04:00
Suhun Han
b67404323c workspace: Improve error handling when dropping a file that cannot be opened into the workspace pane (#15613)
This PR improves the UX when dropping a file that cannot be opened into
the workspace pane. Previously, such an error produced no message at
all, which could be confusing for users. Additionally, the pane was
split even though the file failed to open.

Here's a screen recording demonstrating the previous/updated behavior:


https://github.com/user-attachments/assets/cfdf3488-9464-4568-b16a-9b87718bd729

Changes:

- It now displays an error message if a file cannot be opened.
- Updated the logic to first try to open the file. The pane splits only
if the file opening process is successful.

Release Notes:

- Improved error handling when opening files in the workspace pane. An
error message will now be displayed if the file cannot be opened.
- Fixed an issue where unnecessary pane splitting occurred when a file
failed to open.
2024-08-20 15:05:59 +02:00
Bennet Bo Fenner
c251a50e41 assistant: Update docs (#16515)
- Fix links on assistant page to subpages
- Mention the configuration view in the `configuration.md` and document
more settings

Release Notes:

- N/A

---------

Co-authored-by: Piotr <piotr@zed.dev>
2024-08-20 14:42:10 +02:00
Kirill Bulatov
e482fcde5b Fall back to FindAllReferences if GoToDefinition has not navigated (#16512)
Follow-up of https://github.com/zed-industries/zed/pull/9243 

Release Notes:

- N/A

---------

Co-authored-by: Alex Kladov <aleksey.kladov@gmail.com>
2024-08-20 14:56:19 +03:00
Kyle Kelley
f185269d03 repl: Upgrade runtimelib (#16499)
Upgrades runtimelib to bring in some fixes from
https://github.com/runtimed/runtimed/pull/114 and
https://github.com/runtimed/runtimed/pull/113 that work towards
addressing issues interfacing with the Julia kernel.

Release Notes:

- N/A
2024-08-19 22:39:17 -07:00
Nate Butler
1f0dc8b754 Expand assistant docs (#16501)
This PR significantly expands the assistant documentation, breaking it
out into sections, adding examples and further documenting features.

This PR introduces a convention in docs for swapping keybindings for mac
vs linux:

`<kbd>cmd-enter|ctrl-enter</kbd>`

In the above example, the first will be shown for mac, the second for
linux or windows.
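The convention above lends itself to a small resolver. The sketch below is illustrative (names and structure are assumptions, not the actual docs script): it splits a spec like `cmd-enter|ctrl-enter` and returns the binding for the viewer's platform.

```rust
/// Resolve a docs keybinding spec such as "cmd-enter|ctrl-enter":
/// the first alternative is for macOS, the second for Linux/Windows.
/// A spec with no '|' applies to every platform.
fn resolve_keybinding(spec: &str, is_mac: bool) -> &str {
    match spec.split_once('|') {
        Some((mac, other)) => {
            if is_mac {
                mac
            } else {
                other
            }
        }
        // No separator: the same binding is shown everywhere.
        None => spec,
    }
}

fn main() {
    assert_eq!(resolve_keybinding("cmd-enter|ctrl-enter", true), "cmd-enter");
    assert_eq!(resolve_keybinding("cmd-enter|ctrl-enter", false), "ctrl-enter");
    assert_eq!(resolve_keybinding("escape", true), "escape");
    println!("ok");
}
```

In a docs pipeline, a preprocessor would run this resolution over each `<kbd>` element before rendering.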

TODO:

- [ ] Fix table style (for `/assistant/configuration`)
- [x] Add script to swap keybindings based on platform
- It should take in this format: [`cmd-n` (mac)|`ctrl-n`(linux)] and
return just the correct binding for the viewer's platform.
- [ ] Add image/video assets (non-blocking)

Release Notes:

- Updated assistant documentation
2024-08-19 23:50:09 -04:00
Marshall Bowers
395a68133d Add Postgrest to Docker Compose (#16498)
This PR adds two Postgrest containers—one for the app database and one
for the LLM database—to the Docker Compose cluster.

Also fixed an issue where `postgres_app.conf` and `postgres_llm.conf`
had been switched.

Release Notes:

- N/A
2024-08-19 20:50:45 -04:00
Marshall Bowers
77c08fade5 elixir: Bump to v0.0.8 (#16495)
This PR bumps the Elixir extension to v0.0.8.

Changes:

- #16382

Release Notes:

- N/A
2024-08-19 19:15:41 -04:00
Marshall Bowers
f7f7cd5bb9 repl: Don't prefix free variables with _ (#16494)
This PR is a small refactor to remove the leading `_` for some free
variables, as this unintentionally marks them as unused to the compiler.

While the fields on the struct _are_ unused, the free variables should
participate in usage tracking, as we want to make sure they get stored
on the struct.
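As a sketch of the distinction (names here are illustrative, not the actual repl code): a `_`-prefixed binding tells the Rust compiler the value is intentionally unused, so dropping the prefix is what keeps usage tracking alive for values that must end up stored on a struct.

```rust
struct RunningKernel {
    stdout: String,
    stderr: String,
}

fn spawn_kernel(stdout: String, stderr: String) -> RunningKernel {
    // Had these parameters been named `_stdout` / `_stderr`, the compiler
    // would treat them as intentionally unused and stay silent if we forgot
    // to store one of them on the struct below.
    RunningKernel { stdout, stderr }
}

fn main() {
    let kernel = spawn_kernel("out".into(), "err".into());
    assert_eq!(kernel.stdout, "out");
    assert_eq!(kernel.stderr, "err");
    println!("ok");
}
```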

Release Notes:

- N/A
2024-08-19 19:15:27 -04:00
Bennet Bo Fenner
6f5674691c assistant: Set default provider to zed.dev (#16454)
Do NOT merge until tomorrow

Release Notes:

- N/A

---------

Co-authored-by: Thorsten <thorsten@zed.dev>
2024-08-19 19:00:38 -04:00
Stanislav Alekseev
8993a9f2ee elixir: Make two more files required by lexical executable (#16382)
I still haven't fixed building dev extensions with rust managed by nix,
so I'd appreciate testing this for me

Release Notes:

- N/A
2024-08-19 18:48:05 -04:00
Joseph T Lyons
9f66f12f7b v0.151.x dev 2024-08-19 18:40:19 -04:00
Peter Tripp
3eb5488c63 Update Terms and Conditions (#16478)
- Update Zed Terms of Use:
  - Rename from 'EULA' / 'Terms and Conditions'
  - Rename 'Zed Network Based Service' to 'Zed Service'
  - 3.3.2 Usage Data (formerly Telemetry Data)
    - Add examples of 'Usage Data'
    - Add link to https://zed.dev/docs/telemetry - Explain 'telemetry ID' and user linkage
  - 3.3.5 Privacy Policy - Add privacy policy reference - Add link to https://zed.dev/privacy-policy/
  - 5. OWNERSHIP
    - Move "You retain all right, title and interest..." from 3.3 Customer Data
    - Additional note that customers retain Intellectual Property rights
  - 9. Third Party Services - Add link to https://zed.dev/third-party-terms
- Add Privacy Policy
- Add Subprocessors
- Add Third Party Terms
- Update script/terms/terms.rtf for DMG bundle
2024-08-19 17:08:46 -04:00
Max Brunsfeld
30bfa56a24 Avoid double message header in new contexts, don't expand default prompt (#16490)
Follow-up to https://github.com/zed-industries/zed/pull/16471

* Don't expand the default prompt by default, since it looks strange in
the expanded state
* Don't create two `You` headers by default. Just insert a blank line
after the default prompt.

Release Notes:

- N/A
2024-08-19 12:54:03 -07:00
Roy Williams
0042c24d3c Simplify logic & add UI affordances to show model cache status (#16395)
Release Notes:

- Adds UI affordances to the assistant panel to show which messages have
been cached
- Migrate cache invalidation to be based on `has_edits_since_in_range`
to be smarter and more selective about when to invalidate the cache and
when to fetch.

<img width="310" alt="Screenshot 2024-08-16 at 11 19 23 PM"
src="https://github.com/user-attachments/assets/4ee2d111-2f55-4b0e-b944-50c4f78afc42">

<img width="580" alt="Screenshot 2024-08-18 at 10 05 16 PM"
src="https://github.com/user-attachments/assets/17630a60-7b78-421c-ae39-425246638a12">


I had originally added the lightning bolt on every message and only
added the tooltip warning about editing prior messages on the first
anchor, but thought it looked too busy, so I settled on just annotating
the last anchor.
2024-08-19 12:06:14 -07:00
Marshall Bowers
971db5c6f6 ci: Set the ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON for builds (#16486)
This PR updates the various GitHub Actions that build Zed binaries to
set the `ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON` environment variable
from the corresponding secret.

Release Notes:

- N/A
2024-08-19 14:47:20 -04:00
Max Brunsfeld
b5bd8a5c5d Add logic for closed beta LLM models (#16482)
Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
2024-08-19 11:09:52 -07:00
Nate Butler
41fc6d0885 Make providers more clear in model selector (#16480)
Make providers more clear in model selector

Before:

![CleanShot 2024-08-19 at 13 20
36@2x](https://github.com/user-attachments/assets/5b43fa27-4aca-446a-a035-bc8bcb0d9b0e)

After:

![CleanShot 2024-08-19 at 13 20
05@2x](https://github.com/user-attachments/assets/cb961405-b573-42fe-80e1-f3c2ce828ea4)


Release Notes:

- N/A
2024-08-19 13:38:19 -04:00
Bennet Bo Fenner
90897707c3 assistant: Add imports in a single area when using workflows (#16355)
Co-Authored-by: Kirill <kirill@zed.dev>

Release Notes:

- N/A

---------

Co-authored-by: Kirill <kirill@zed.dev>
Co-authored-by: Thorsten <thorsten@zed.dev>
2024-08-19 19:01:45 +02:00
Piotr Osiewicz
7fbea39566 ui: Dismiss popovers when clicking on trigger button (#16476)
Release Notes:

- Clicking on an already-deployed popover menu trigger now hides the
popover menu.
2024-08-19 18:48:57 +02:00
Thorsten Ball
037cf1393c assistant: Undo workflow step when buffer is discarded (#16465)
This fixes a weird bug:

1. Use `/workflow` in assistant
2. Have it generate a step that modifies a file
3. Either (a) select the step in the assistant and have it auto-insert
newlines (b) select "Transform" to have the step applied
4. Close the modified file in the editor ("Discard")
5. Re-open the file
6. BUG: the changes made by assistant are still there!

The reason for the bug is that the assistant keeps references to buffers
and they're not closed/reloaded when closed/reopened.

To fix the bug we now rollback the applied workflow steps when
discarding a buffer.

(This does *not* yet fix the issue where a workflow step inserts a new
buffer into the project/worktree that does not show up on the file
system yet but in `/file` and hangs around until Zed is closed.)


Release Notes:

- N/A

Co-authored-by: Bennet <bennet@zed.dev>
2024-08-19 18:42:49 +02:00
Kirill Bulatov
69aae2037d Display default prompts more elaborately (#16471)
Show them under `User` role instead of a `System` one, and insert them
expanded.

Release Notes:

- N/A
2024-08-19 18:44:52 +03:00
Piotr Osiewicz
bac8e81e73 assistant: Add the "create your command" item (#16467)
This PR adds an extra item to the slash command picker that links users to the doc that teaches how to create a custom one.

Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <67129314+danilo-leal@users.noreply.github.com>
2024-08-19 12:29:16 -03:00
Marshall Bowers
0bea4d5fa6 theme: Change autocomplete value for ui_font_features and buffer_font_features (#16466)
This PR changes the default value used when autocompleting the
`ui_font_features` and `buffer_font_features` settings from `null` to `{}`.
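For context, a settings fragment using the new default might look like the following (the nesting is assumed from the setting names; the empty object means "no extra OpenType features", and a feature such as `"calt": false` could be added inside it):

```json
{
  "ui_font_features": {},
  "buffer_font_features": {}
}
```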

Release Notes:

- N/A
2024-08-19 10:55:25 -04:00
renovate[bot]
4dec7806cb Update Rust crate heed to v0.20.5 (#16464)
[![Mend
Renovate](https://app.renovatebot.com/images/banner.svg)](https://renovatebot.com)

This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [heed](https://togithub.com/Kerollmops/heed) | workspace.dependencies | patch | `0.20.4` -> `0.20.5` |

---

### Release Notes

<details>
<summary>Kerollmops/heed (heed)</summary>

###
[`v0.20.5`](https://togithub.com/meilisearch/heed/releases/tag/v0.20.5):
🛁

[Compare
Source](https://togithub.com/Kerollmops/heed/compare/v0.20.4...v0.20.5)

<p align="center"><img width="280px"
src="https://raw.githubusercontent.com/meilisearch/heed/main/assets/heed-pigeon-logo.png"></a></p>
<h1 align="center" >heed</h1>

##### What's Changed
* fix function docs (clippy warnings) by @antonilol in https://github.com/meilisearch/heed/pull/273
* fix custom_key_cmp_wrapper being able to unwind to C code (ub) by @antonilol in https://github.com/meilisearch/heed/pull/275

##### New Contributors
* @antonilol made their first contribution in https://github.com/meilisearch/heed/pull/273

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-08-19 10:26:14 -04:00
Marshall Bowers
de41c151c8 collab: Add is_staff to upstream rate limit spans (#16463)
This PR adds the `is_staff` field to the `upstream rate limit` spans.

Since we use different API keys for staff vs non-staff, it will be
useful to break down the rate limits accordingly.

Release Notes:

- N/A
2024-08-19 10:15:25 -04:00
Piotr Osiewicz
56f1ab9459 assistant: Remove "Resolving" text for step resolution and use Transform instead (#16461)
That way, the user can click on "Transform" straight away and have it
applied immediately once the step is resolved.



https://github.com/user-attachments/assets/08c99804-3841-4eba-a5eb-7066a9f45b47


TODO:
- [x] Tie "Send" button at the bottom into the same behavior

Release Notes:

- N/A
2024-08-19 15:17:04 +02:00
Piotr Osiewicz
911112d94a assistant: Fix toggling slash command menu from toolbar menu (#16459)
Release Notes:

- N/A
2024-08-19 14:47:05 +02:00
Thorsten Ball
e68b2d5ecc assistant panel: Disable send button on config error (#16455)
Release Notes:

- N/A

Co-authored-by: Bennet <bennet@zed.dev>
2024-08-19 11:44:56 +02:00
Thorsten Ball
f651333896 assistant panel: Show if env var with API key is set (#16453)
This makes it easier to debug why resetting a key doesn't work. We now
show when the key is set via an env var and if so, we disable the
reset-key button and instead give instructions.

![screenshot-2024-08-19-11 22
05@2x](https://github.com/user-attachments/assets/6c75dc82-cb61-4661-9647-f77fca8fdf41)


Release Notes:

- N/A

Co-authored-by: Bennet <bennet@zed.dev>
2024-08-19 11:34:58 +02:00
Bennet Bo Fenner
14fa4abce4 assistant: Fix edge case where "Open new context" button would do nothing (#16452)
Co-Authored-by: Thorsten <thorsten@zed.dev>

Release Notes:

- N/A

Co-authored-by: Thorsten <thorsten@zed.dev>
2024-08-19 11:07:04 +02:00
Ryan Hawkins
8a320668ed Add support for GPT-4o in Copilot Chat (#16446)
Release Notes:
- Added support for GPT-4o for Copilot Chat.
2024-08-19 09:03:06 +02:00
Mikayla Maki
86efde4b76 Fixed bugs in workflow step preview (#16445)
Release Notes:

- N/A
2024-08-18 22:18:04 -07:00
Nathan Sobo
43e13df9f3 Add a /perplexity slash command in an extension (#16438)
Release Notes:

- N/A
2024-08-18 16:34:55 -06:00
Nathan Sobo
b9176fe4bb Add custom icon for Anthropic hosted models (#16436)
This commit adds a custom icon for Anthropic hosted models.


![CleanShot 2024-08-18 at 15 40
38@2x](https://github.com/user-attachments/assets/d467ccab-9628-4258-89fc-782e0d4a48d4)
![CleanShot 2024-08-18 at 15 40
34@2x](https://github.com/user-attachments/assets/7efaff9c-6a58-47ba-87ea-e0fe0586fedc)


- Adding a new SVG icon for Anthropic hosted models.
  - The new icon is located at: `assets/icons/ai_anthropic_hosted.svg`
- Updating the LanguageModel trait to include an optional icon method
- Implementing the icon method for CloudModel to return the custom icon
for Anthropic hosted models
- Updating the UI components to use the model-specific icon when
available
- Adding a new IconName variant for the Anthropic hosted icon

We should change the non-hosted icon in some small way to distinguish it
from the hosted version. I duplicated the path for now so we can
hopefully add it for the next release.

Release Notes:

- N/A
2024-08-18 16:07:15 -06:00
Nathan Sobo
11753914d7 Add a setting to show time to first window draw and frames per second in status bar (#16422)
I want to showcase Zed's performance via videos, and this seemed like a
good way to demonstrate it.


https://github.com/user-attachments/assets/f4a5fabc-efe7-4b48-9ba5-719882fdc856

Release Notes:

- On macOS, you can now set `performance.show_in_status_bar: true` in
your settings to show the time to first window draw on startup, followed
by the current FPS of the containing window's renderer.
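The release note gives the setting in dotted form; as a `settings.json` fragment this would presumably be written as follows (the nesting is an assumption based on the dotted key):

```json
{
  "performance": {
    "show_in_status_bar": true
  }
}
```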

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
Co-authored-by: Kirill Bulatov <kirill@zed.dev>
Co-authored-by: David Soria Parra <167242713+dsp-ant@users.noreply.github.com>
Co-authored-by: Danny Hua <danny.hua@hey.com>
2024-08-18 15:22:19 -06:00
Danny Hua
6f93b42ecb docs: Fix example extension directory structure (#16424)
Add a language-specific subdirectory to the example directory structure,
since that's the required structure - see `extensions/languages.md`

Release Notes:

- N/A
2024-08-18 07:40:08 -04:00
David Soria Parra
10a996cbc4 context_servers: Fix argument handling (#16402) 2024-08-17 20:04:34 -07:00
Kirill Bulatov
5e6e465294 Show correct number of characters selected (#16420) 2024-08-18 02:24:32 +03:00
Max Brunsfeld
8841d6faad Avoid redundant newline insertion after file command (#16419)
Release Notes:

- Fixed an issue where an extra newline was inserted after running a
`/file` command in the assistant.
2024-08-17 15:10:10 -07:00
Nathan Sobo
c9c5eef8f2 Improve dev experience for built-in prompts (#16413)
When launching Zed from the CLI via `cargo run`, we'll always load
prompt templates from the repo.

This restores behavior that I reverted last night in #16403.

Also, I've improved the `script/prompts link/unlink` workflow for
overriding prompts of your production copy of Zed. Zed now detects when
the overrides directory is created or removed, and does the right thing.
You can link and unlink repeatedly without restarting Zed.

Release Notes:

- N/A
2024-08-17 12:28:53 -06:00
Danilo Leal
7c268d0c6d assistant: Remove meta description from quote selection tooltip (#16412)
The original idea was for the keybinding to be within the description, but given it's already inline with the title, I figure we don't need this anymore—cleaning it up a bit!

--- 

Release Notes:

- N/A
2024-08-17 13:30:32 -03:00
Danilo Leal
e4a591dcbd workflow: Add button to open the step view (#16387)
This PR adds an icon button that appears as you hover over the step header, which allows users to visit the step view.

---

Release Notes:

- N/A
2024-08-17 13:06:34 -03:00
Nathan Sobo
07d5e22cbe Revert changes to inline assist indentation logic and prompt (#16403)
This PR reverts #16145 and subsequent changes.

This reverts commit a515442a36.

We still have issues with our approach to indentation in Python
unfortunately, but this feels like a safer equilibrium than where we
were.

Release Notes:

- Returned to our previous prompt for inline assist transformations,
since recent changes were introducing issues.
2024-08-17 02:24:55 -06:00
Joseph T. Lyons
ebecd7e65f Fix issue with fetching users in seed script (#16393)
Release Notes:

- N/A
2024-08-16 21:51:51 -04:00
Joseph T Lyons
18f0626e08 Update assistant docs to mention inline works in the terminal 2024-08-16 21:13:02 -04:00
Marshall Bowers
3d997e5fd6 collab: Add is_staff to spans (#16389)
This PR adds the `is_staff` field to our LLM spans so that we can
distinguish between staff and non-staff traffic.

Release Notes:

- N/A
2024-08-16 18:42:44 -04:00
Max Brunsfeld
1b1070e0f7 Add tracing needed for LLM rate limit dashboards (#16388)
Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
2024-08-16 17:52:31 -04:00
Joseph T. Lyons
9ef3306f55 Add feature flags to seed script (#16385)
Release Notes:

- N/A
2024-08-16 17:08:44 -04:00
Kyle Kelley
0fdc9d0f05 context_servers: Log errors from detached context server tasks (#16377)
Added logging to several detached tasks that would previously fail
silently if the context server wasn't in compliance.

Release Notes:

- N/A
2024-08-16 13:50:19 -07:00
Nathan Sobo
907d76208d Allow display name of custom Anthropic models to be customized (#16376)
Also added some docs for our settings.
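As a sketch only, a custom model with a custom display name might be configured roughly like this (the exact key names are an assumption; check the settings docs this PR adds):

```json
{
  "language_models": {
    "anthropic": {
      "available_models": [
        {
          "name": "claude-3-5-sonnet-20240620",
          "display_name": "Sonnet 3.5 (custom)"
        }
      ]
    }
  }
}
```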

Release Notes:

- N/A
2024-08-16 14:02:37 -06:00
Kirill Bulatov
ae9e6a9daa Allow rerunning tasks with unknown termination status (#16374) 2024-08-16 23:00:20 +03:00
Danilo Leal
e36e605c96 assistant: Fine-tune error toast design (#16373)
Just some super small padding and absolute-positioning tweaks. 

---

Release Notes:

- N/A
2024-08-16 16:56:52 -03:00
Marshall Bowers
35cd397a40 collab: Allow enabling feature flags for all users (#16372)
This PR adds a new `enabled_for_all` column to the `feature_flags` table
to allow enabling a feature flag for all users.

Release Notes:

- N/A
2024-08-16 15:17:03 -04:00
Danilo Leal
2180dbdb50 assistant: Add action footer and refine slash command popover (#16360)
- [x] Put the slash command popover on the footer
- [x] Refine the popover (change it to a picker)
- [x] Add more options dropdown on the assistant's toolbar
- [x] Add quote selection button on the footer

---

Release Notes:

- N/A

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
Co-authored-by: Nate Butler <iamnbutler@gmail.com>
Co-authored-by: Kirill Bulatov <mail4score@gmail.com>
2024-08-16 16:07:42 -03:00
Mikayla Maki
23d56a1a84 Add configuration flow for inline assist button (#16369)
This adds a configuration prompt when using the inline assist button in
the editor.

Release Notes:

- N/A
2024-08-16 11:21:30 -07:00
Marshall Bowers
a9441879c3 collab: Fix writing LLM rate limit events to Clickhouse (#16367)
This PR fixes the writing of LLM rate limit events to Clickhouse.

We had a typo in the table name: `llm_rate_limits` instead of
`llm_rate_limit_events`.

I also extracted a helper function to write to Clickhouse so we can use
it anywhere we need to.

Release Notes:

- N/A
2024-08-16 14:03:34 -04:00
Nate Butler
6cfbb54ede Switch icon (#16363)
Updates instances of the `MagicWand` icon to our more recent `Sparkle` /
`ZedAssistant` icon in places where we reference inline assist.

Before:

![CleanShot 2024-08-16 at 13 41
58@2x](https://github.com/user-attachments/assets/67af27a2-a09b-44bb-a8af-2bafcbbd9038)

After:
![CleanShot 2024-08-16 at 13 48
34@2x](https://github.com/user-attachments/assets/229ccc8e-8a93-44c1-abe0-7b6e22ca93e2)


Release Notes:

- Updated inline assist icon in the editor & terminal.
2024-08-16 14:01:56 -04:00
Marshall Bowers
7a5acc0b0c collab: Rework model name checks (#16365)
This PR reworks how we do checks for model names in the LLM service.

We now normalize the model names using the models defined in the
database.

Release Notes:

- N/A
2024-08-16 13:54:28 -04:00
Joseph T Lyons
463ac7f5e4 Correct H1 text for assistant documentation 2024-08-16 13:52:41 -04:00
Joseph T. Lyons
ee27114b35 Remove redundant assistant content (#16364)
Release Notes:

- N/A
2024-08-16 13:42:16 -04:00
Joseph T. Lyons
ebac9a7342 Combine assistant documentation (#16362)
Release Notes:

- N/A
2024-08-16 13:37:54 -04:00
145 changed files with 5810 additions and 2126 deletions

View File

@@ -167,6 +167,7 @@ jobs:
APPLE_NOTARIZATION_USERNAME: ${{ secrets.APPLE_NOTARIZATION_USERNAME }}
APPLE_NOTARIZATION_PASSWORD: ${{ secrets.APPLE_NOTARIZATION_PASSWORD }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
steps:
@@ -276,6 +277,7 @@ jobs:
needs: [linux_tests]
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> $GITHUB_PATH
@@ -346,6 +348,7 @@ jobs:
needs: [linux_tests]
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Checkout repo
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4

View File

@@ -67,6 +67,7 @@ jobs:
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Install Node
uses: actions/setup-node@1e60f620b9541d16bece96c5465dc8ee9832be0b # v4
@@ -106,6 +107,7 @@ jobs:
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Checkout repo
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4
@@ -139,6 +141,7 @@ jobs:
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
steps:
- name: Checkout repo
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4

Cargo.lock generated
View File

@@ -223,6 +223,7 @@ name = "anthropic"
version = "0.1.0"
dependencies = [
"anyhow",
"chrono",
"futures 0.3.30",
"http_client",
"isahc",
@@ -232,6 +233,7 @@ dependencies = [
"strum",
"thiserror",
"tokio",
"util",
]
[[package]]
@@ -5057,9 +5059,9 @@ checksum = "2304e00983f87ffb38b55b444b5e3b60a884b5d30c0fca7d82fe33449bbe55ea"
[[package]]
name = "heed"
version = "0.20.4"
version = "0.20.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "620033c8c8edfd2f53e6f99a30565eb56a33b42c468e3ad80e21d85fb93bafb0"
checksum = "7d4f449bab7320c56003d37732a917e18798e2f1709d80263face2b4f9436ddb"
dependencies = [
"bitflags 2.6.0",
"byteorder",
@@ -6314,9 +6316,9 @@ dependencies = [
[[package]]
name = "lmdb-master-sys"
version = "0.2.3"
version = "0.2.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1de7e761853c15ca72821d9f928d7bb123ef4c05377c4e7ab69fa1c742f91d24"
checksum = "472c3760e2a8d0f61f322fb36788021bb36d573c502b50fa3e2bcaac3ec326c9"
dependencies = [
"cc",
"doxygen-rs",
@@ -7590,6 +7592,29 @@ version = "2.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e3148f5046208a5d56bcfc03053e3ca6334e51da8dfb19b6cdc8b306fae3283e"
[[package]]
name = "performance"
version = "0.1.0"
dependencies = [
"anyhow",
"collections",
"gpui",
"log",
"schemars",
"serde",
"settings",
"util",
"workspace",
]
[[package]]
name = "perplexity"
version = "0.1.0"
dependencies = [
"serde",
"zed_extension_api 0.1.0",
]
[[package]]
name = "pest"
version = "2.7.11"
@@ -9033,9 +9058,9 @@ dependencies = [
[[package]]
name = "runtimelib"
version = "0.14.0"
version = "0.15.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0c3d817764e3971867351e6103955b17d808f5330e9ef63aaaaab55bf8c664c1"
checksum = "a7d76d28b882a7b889ebb04e79bc2b160b3061821ea596ff0f4a838fc7a76db0"
dependencies = [
"anyhow",
"async-dispatcher",
@@ -13814,7 +13839,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.149.1"
version = "0.151.0"
dependencies = [
"activity_indicator",
"anyhow",
@@ -13875,6 +13900,7 @@ dependencies = [
"outline_panel",
"parking_lot",
"paths",
"performance",
"profiling",
"project",
"project_panel",
@@ -13962,7 +13988,7 @@ dependencies = [
[[package]]
name = "zed_elixir"
version = "0.0.7"
version = "0.0.8"
dependencies = [
"zed_extension_api 0.0.6",
]

View File

@@ -70,6 +70,7 @@ members = [
"crates/outline",
"crates/outline_panel",
"crates/paths",
"crates/performance",
"crates/picker",
"crates/prettier",
"crates/project",
@@ -145,6 +146,7 @@ members = [
"extensions/lua",
"extensions/ocaml",
"extensions/php",
"extensions/perplexity",
"extensions/prisma",
"extensions/purescript",
"extensions/ruff",
@@ -241,6 +243,7 @@ open_ai = { path = "crates/open_ai" }
outline = { path = "crates/outline" }
outline_panel = { path = "crates/outline_panel" }
paths = { path = "crates/paths" }
performance = { path = "crates/performance" }
picker = { path = "crates/picker" }
plugin = { path = "crates/plugin" }
plugin_macros = { path = "crates/plugin_macros" }
@@ -380,7 +383,7 @@ rand = "0.8.5"
regex = "1.5"
repair_json = "0.1.0"
rsa = "0.9.6"
runtimelib = { version = "0.14", default-features = false, features = [
runtimelib = { version = "0.15", default-features = false, features = [
"async-dispatcher-runtime",
] }
rusqlite = { version = "0.29.0", features = ["blob", "array", "modern_sqlite"] }

View File

@@ -0,0 +1,12 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<rect width="16" height="16" rx="2" fill="black" fill-opacity="0.2"/>
<g clip-path="url(#clip0_1916_18)">
<path d="M10.652 3.79999H8.816L12.164 12.2H14L10.652 3.79999Z" fill="#1F1F1E"/>
<path d="M5.348 3.79999L2 12.2H3.872L4.55672 10.436H8.05927L8.744 12.2H10.616L7.268 3.79999H5.348ZM5.16224 8.87599L6.308 5.92399L7.45374 8.87599H5.16224Z" fill="#1F1F1E"/>
</g>
<defs>
<clipPath id="clip0_1916_18">
<rect width="12" height="8.4" fill="white" transform="translate(2 3.79999)"/>
</clipPath>
</defs>
</svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-database-zap"><ellipse cx="12" cy="5" rx="9" ry="3"/><path d="M3 5V19A9 3 0 0 0 15 21.84"/><path d="M21 5V8"/><path d="M21 12L18 17H22L19 22"/><path d="M3 12A9 3 0 0 0 14.59 14.87"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-ellipsis-vertical"><circle cx="12" cy="12" r="1"/><circle cx="12" cy="5" r="1"/><circle cx="12" cy="19" r="1"/></svg>


View File

@@ -1,10 +0,0 @@
<svg width="16" height="16" viewBox="0 0 16 16" fill="none" xmlns="http://www.w3.org/2000/svg">
<path d="M3 13L7.01562 8.98438" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M8.6875 7.3125L9.5 6.5" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M7 5V3" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M12 5V3" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M12 10V8" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M6 4L8 4" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M11 4L13 4" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
<path d="M11 9L13 9" stroke="black" stroke-width="1.5" stroke-linecap="round"/>
</svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-search-code"><path d="m13 13.5 2-2.5-2-2.5"/><path d="m21 21-4.3-4.3"/><path d="M9 8.5 7 11l2 2.5"/><circle cx="11" cy="11" r="8"/></svg>


assets/icons/slash.svg Normal file
View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-slash"><path d="M22 2 2 22"/></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-square-slash"><rect width="18" height="18" x="3" y="3" rx="2"/><line x1="9" x2="15" y1="15" y2="9"/></svg>


View File

@@ -1,426 +1,61 @@
You are an expert developer assistant working in an AI-enabled text editor.
Your task is to rewrite a specific section of the provided document based on a user-provided prompt.
{{#if language_name}}
Here's a file of {{language_name}} that I'm going to ask you to make an edit to.
{{else}}
Here's a file of text that I'm going to ask you to make an edit to.
{{/if}}
<guidelines>
1. Scope: Modify only content within <rewrite_this> tags. Do not alter anything outside these boundaries.
2. Precision: Make changes strictly necessary to fulfill the given prompt. Preserve all other content as-is.
3. Seamless integration: Ensure rewritten sections flow naturally with surrounding text and maintain document structure.
4. Tag exclusion: Never include <rewrite_this>, </rewrite_this>, <edit_here>, or <insert_here> tags in the output.
5. Indentation: Maintain the original indentation level of the file in rewritten sections.
6. Completeness: Rewrite the entire tagged section, even if only partial changes are needed. Avoid omissions or elisions.
7. Insertions: Replace <insert_here></insert_here> tags with appropriate content as specified by the prompt.
8. Code integrity: Respect existing code structure and functionality when making changes.
9. Consistency: Maintain a uniform style and tone throughout the rewritten text.
</guidelines>
{{#if is_insert}}
The point you'll need to insert at is marked with <insert_here></insert_here>.
{{else}}
The section you'll need to rewrite is marked with <rewrite_this></rewrite_this> tags.
{{/if}}
<examples>
<example>
<input>
<document>
use std::cell::Cell;
use std::collections::HashMap;
use std::cmp;
<rewrite_this>
<insert_here></insert_here>
</rewrite_this>
pub struct LruCache<K, V> {
/// The maximum number of items the cache can hold.
capacity: usize,
/// The map storing the cached items.
items: HashMap<K, V>,
}
// The rest of the implementation...
</document>
<prompt>
doc this
</prompt>
</input>
<incorrect_output failure="Over-generation. The text starting with `pub struct AabbTree<T> {` is *after* the rewrite_this tag">
/// Represents an Axis-Aligned Bounding Box (AABB) tree data structure.
///
/// This structure is used for efficient spatial queries and collision detection.
/// It organizes objects in a hierarchical tree structure based on their bounding boxes.
///
/// # Type Parameters
///
/// * `T`: The type of data associated with each node in the tree.
pub struct AabbTree<T> {
root: Option<usize>,
</incorrect_output>
<corrected_output improvement="Generation stops before repeating content after the rewrite_this section">
/// Represents an Axis-Aligned Bounding Box (AABB) tree data structure.
///
/// This structure is used for efficient spatial queries and collision detection.
/// It organizes objects in a hierarchical tree structure based on their bounding boxes.
///
/// # Type Parameters
///
/// * `T`: The type of data associated with each node in the tree.
</corrected_output>
</example>
<example>
<input>
<document>
import math
def calculate_circle_area(radius):
"""Calculate the area of a circle given its radius."""
return math.pi * radius ** 2
<rewrite_this>
<insert_here></insert_here>
</rewrite_this>
class Circle:
def __init__(self, radius):
self.radius = radius
def area(self):
return math.pi * self.radius ** 2
def circumference(self):
return 2 * math.pi * self.radius
# Usage example
circle = Circle(5)
print(f"Area: {circle.area():.2f}")
print(f"Circumference: {circle.circumference():.2f}")
</document>
<prompt>
write docs
</prompt>
</input>
<incorrect_output failure="Over-generation. The text starting with `class Circle:` is *after* the rewrite_this tag">
"""
Represents a circle with methods to calculate its area and circumference.
This class provides a simple way to work with circles in a geometric context.
It allows for the creation of Circle objects with a specified radius and
offers methods to compute the circle's area and circumference.
Attributes:
radius (float): The radius of the circle.
Methods:
area(): Calculates and returns the area of the circle.
circumference(): Calculates and returns the circumference of the circle.
"""
class Circle:
</incorrect_output>
<corrected_output improvement="Generation stops before repeating content after the rewrite_this section">
"""
Represents a circle with methods to calculate its area and circumference.
This class provides a simple way to work with circles in a geometric context.
It allows for the creation of Circle objects with a specified radius and
offers methods to compute the circle's area and circumference.
Attributes:
radius (float): The radius of the circle.
Methods:
area(): Calculates and returns the area of the circle.
circumference(): Calculates and returns the circumference of the circle.
"""
</corrected_output>
</example>
<example>
<input>
<document>
class BankAccount {
private balance: number;
constructor(initialBalance: number) {
this.balance = initialBalance;
}
<rewrite_this>
<insert_here></insert_here>
</rewrite_this>
deposit(amount: number): void {
if (amount > 0) {
this.balance += amount;
}
}
withdraw(amount: number): boolean {
if (amount > 0 && this.balance >= amount) {
this.balance -= amount;
return true;
}
return false;
}
getBalance(): number {
return this.balance;
}
}
// Usage
const account = new BankAccount(1000);
account.deposit(500);
console.log(account.getBalance()); // 1500
account.withdraw(200);
console.log(account.getBalance()); // 1300
</document>
<prompt>
//
</prompt>
</input>
<incorrect_output failure="Over-generation. The text starting with `deposit(amount: number): void {` is *after* the rewrite_this tag">
/**
* Deposits the specified amount into the bank account.
*
* @param amount The amount to deposit. Must be a positive number.
* @throws Error if the amount is not positive.
*/
deposit(amount: number): void {
if (amount > 0) {
this.balance += amount;
} else {
throw new Error("Deposit amount must be positive");
}
}
</incorrect_output>
<corrected_output improvement="Generation stops before repeating content after the rewrite_this section">
/**
* Deposits the specified amount into the bank account.
*
* @param amount The amount to deposit. Must be a positive number.
* @throws Error if the amount is not positive.
*/
</corrected_output>
</example>
<example>
<input>
<document>
use std::collections::VecDeque;
pub struct BinaryTree<T> {
root: Option<Node<T>>,
}
<rewrite_this>
<insert_here></insert_here>
</rewrite_this>
struct Node<T> {
value: T,
left: Option<Box<Node<T>>>,
right: Option<Box<Node<T>>>,
}
</document>
<prompt>
derive clone
</prompt>
</input>
<incorrect_output failure="Over-generation below the rewrite_this tags. Extra space between derive annotation and struct definition.">
#[derive(Clone)]
struct Node<T> {
value: T,
left: Option<Box<Node<T>>>,
right: Option<Box<Node<T>>>,
}
</incorrect_output>
<incorrect_output failure="Over-generation above the rewrite_this tags">
pub struct BinaryTree<T> {
root: Option<Node<T>>,
}
#[derive(Clone)]
</incorrect_output>
<incorrect_output failure="Over-generation below the rewrite_this tags">
#[derive(Clone)]
struct Node<T> {
value: T,
left: Option<Box<Node<T>>>,
right: Option<Box<Node<T>>>,
}
impl<T> Node<T> {
fn new(value: T) -> Self {
Node {
value,
left: None,
right: None,
}
}
}
</incorrect_output>
<corrected_output improvement="Only includes the new content within the rewrite_this tags">
#[derive(Clone)]
</corrected_output>
</example>
<example>
<input>
<document>
import math
def calculate_circle_area(radius):
"""Calculate the area of a circle given its radius."""
return math.pi * radius ** 2
<rewrite_this>
<insert_here></insert_here>
</rewrite_this>
class Circle:
def __init__(self, radius):
self.radius = radius
def area(self):
return math.pi * self.radius ** 2
def circumference(self):
return 2 * math.pi * self.radius
# Usage example
circle = Circle(5)
print(f"Area: {circle.area():.2f}")
print(f"Circumference: {circle.circumference():.2f}")
</document>
<prompt>
add dataclass decorator
</prompt>
</input>
<incorrect_output failure="Over-generation. The text starting with `class Circle:` is *after* the rewrite_this tag">
@dataclass
class Circle:
radius: float
def __init__(self, radius):
self.radius = radius
def area(self):
return math.pi * self.radius ** 2
</incorrect_output>
<corrected_output improvement="Generation stops before repeating content after the rewrite_this section">
@dataclass
</corrected_output>
</example>
<example>
<input>
<document>
interface ShoppingCart {
items: string[];
total: number;
}
<rewrite_this>
<insert_here></insert_here>class ShoppingCartManager {
</rewrite_this>
private cart: ShoppingCart;
constructor() {
this.cart = { items: [], total: 0 };
}
addItem(item: string, price: number): void {
this.cart.items.push(item);
this.cart.total += price;
}
getTotal(): number {
return this.cart.total;
}
}
// Usage
const manager = new ShoppingCartManager();
manager.addItem("Book", 15.99);
console.log(manager.getTotal()); // 15.99
</document>
<prompt>
add readonly modifier
</prompt>
</input>
<incorrect_output failure="Over-generation. The line starting with ` items: string[];` is *after* the rewrite_this tag">
readonly interface ShoppingCart {
items: string[];
total: number;
}
class ShoppingCartManager {
private readonly cart: ShoppingCart;
constructor() {
this.cart = { items: [], total: 0 };
}
</incorrect_output>
<corrected_output improvement="Only includes the new content within the rewrite_this tags and integrates cleanly into surrounding code">
readonly interface ShoppingCart {
</corrected_output>
</example>
</examples>
With these examples in mind, edit the following file:
<document language="{{ language_name }}">
{{{ document_content }}}
{{{document_content}}}
</document>
{{#if is_truncated}}
The provided document has been truncated (potentially mid-line) for brevity.
The context around the relevant section has been truncated (possibly in the middle of a line) for brevity.
{{/if}}
<instructions>
{{#if has_insertion}}
Insert text anywhere you see marked with <insert_here></insert_here> tags. It's CRITICAL that you DO NOT include <insert_here> tags in your output.
{{/if}}
{{#if has_replacement}}
Edit text that you see surrounded with <edit_here>...</edit_here> tags. It's CRITICAL that you DO NOT include <edit_here> tags in your output.
{{/if}}
Make no changes to the rewritten content outside these tags.
{{#if is_insert}}
You can't replace {{content_type}}, your answer will be inserted in place of the `<insert_here></insert_here>` tags. Don't include the insert_here tags in your output.
<snippet language="{{ language_name }}" annotated="true">
{{{ rewrite_section_prefix }}}
<rewrite_this>
{{{ rewrite_section_with_edits }}}
</rewrite_this>
{{{ rewrite_section_suffix }}}
</snippet>
Rewrite the lines enclosed within the <rewrite_this></rewrite_this> tags in accordance with the provided instructions and the prompt below.
Generate {{content_type}} based on the following prompt:
<prompt>
{{{ user_prompt }}}
{{{user_prompt}}}
</prompt>
Do not include <insert_here> or <edit_here> annotations in your output. Here is a clean copy of the snippet without annotations for your reference.
Match the indentation in the original file in the inserted {{content_type}}, don't include any indentation on blank lines.
<snippet>
{{{ rewrite_section_prefix }}}
{{{ rewrite_section }}}
{{{ rewrite_section_suffix }}}
</snippet>
</instructions>
Immediately start with the following format with no remarks:
<guidelines_reminder>
1. Focus on necessary changes: Modify only what's required to fulfill the prompt.
2. Preserve context: Maintain all surrounding content as-is, ensuring the rewritten section seamlessly integrates with the existing document structure and flow.
3. Exclude annotation tags: Do not output <rewrite_this>, </rewrite_this>, <edit_here>, or <insert_here> tags.
4. Maintain indentation: Begin at the original file's indentation level.
5. Complete rewrite: Continue until the entire section is rewritten, even if no further changes are needed.
6. Avoid elisions: Always write out the full section without unnecessary omissions. NEVER say `// ...` or `// ...existing code` in your output.
7. Respect content boundaries: Preserve code integrity.
</guidelines_reminder>
```
\{{INSERTED_CODE}}
```
{{else}}
Edit the section of {{content_type}} in <rewrite_this></rewrite_this> tags based on the following prompt:
<prompt>
{{{user_prompt}}}
</prompt>
{{#if rewrite_section}}
And here's the section to rewrite based on that prompt again for reference:
<rewrite_this>
{{{rewrite_section}}}
</rewrite_this>
{{/if}}
Only make changes that are necessary to fulfill the prompt, leave everything else as-is. All surrounding {{content_type}} will be preserved.
Start at the indentation level in the original file in the rewritten {{content_type}}. Don't stop until you've rewritten the entire section, even if you have no more changes to make, always write out the whole section with no unnecessary elisions.
Immediately start with the following format with no remarks:
```
\{{REWRITTEN_CODE}}
```
{{/if}}

View File

@@ -15,6 +15,7 @@ With each location, you will produce a brief, one-line description of the change
- When generating multiple suggestions, ensure the descriptions are specific to each individual operation.
- Avoid referring to the location in the description. Focus on the change to be made, not the location where it's made. That's implicit with the symbol you provide.
- Don't generate multiple suggestions at the same location. Instead, combine them together in a single operation with a succinct combined description.
- To add imports respond with a suggestion where the `"symbol"` key is set to `"#imports"`
</guidelines>
</overview>
@@ -203,6 +204,7 @@ Add a 'use std::fmt;' statement at the beginning of the file
{
"kind": "PrependChild",
"path": "src/vehicle.rs",
"symbol": "#imports",
"description": "Add 'use std::fmt' statement"
}
]
@@ -413,6 +415,7 @@ Add a 'load_from_file' method to Config and import necessary modules
{
"kind": "PrependChild",
"path": "src/config.rs",
"symbol": "#imports",
"description": "Import std::fs and std::io modules"
},
{

View File

@@ -395,9 +395,9 @@
// The default model to use when creating new contexts.
"default_model": {
// The provider to use.
"provider": "openai",
"provider": "zed.dev",
// The model to use.
"model": "gpt-4o"
"model": "claude-3-5-sonnet"
}
},
// The settings for slash commands.

View File

@@ -33,5 +33,31 @@ services:
volumes:
- ./livekit.yaml:/livekit.yaml
postgrest_app:
image: postgrest/postgrest
container_name: postgrest_app
ports:
- 8081:8081
environment:
PGRST_DB_URI: postgres://postgres@postgres:5432/zed
volumes:
- ./crates/collab/postgrest_app.conf:/etc/postgrest.conf
command: postgrest /etc/postgrest.conf
depends_on:
- postgres
postgrest_llm:
image: postgrest/postgrest
container_name: postgrest_llm
ports:
- 8082:8082
environment:
PGRST_DB_URI: postgres://postgres@postgres:5432/zed_llm
volumes:
- ./crates/collab/postgrest_llm.conf:/etc/postgrest.conf
command: postgrest /etc/postgrest.conf
depends_on:
- postgres
volumes:
postgres_data:

View File

@@ -17,6 +17,7 @@ path = "src/anthropic.rs"
[dependencies]
anyhow.workspace = true
chrono.workspace = true
futures.workspace = true
http_client.workspace = true
isahc.workspace = true
@@ -25,6 +26,7 @@ serde.workspace = true
serde_json.workspace = true
strum.workspace = true
thiserror.workspace = true
util.workspace = true
[dev-dependencies]
tokio.workspace = true

View File

@@ -1,14 +1,17 @@
mod supported_countries;
use anyhow::{anyhow, Context, Result};
use chrono::{DateTime, Utc};
use futures::{io::BufReader, stream::BoxStream, AsyncBufReadExt, AsyncReadExt, Stream, StreamExt};
use http_client::{AsyncBody, HttpClient, Method, Request as HttpRequest};
use isahc::config::Configurable;
use isahc::http::{HeaderMap, HeaderValue};
use serde::{Deserialize, Serialize};
use std::time::Duration;
use std::{pin::Pin, str::FromStr};
use strum::{EnumIter, EnumString};
use thiserror::Error;
use util::ResultExt as _;
pub use supported_countries::*;
@@ -38,6 +41,8 @@ pub enum Model {
Custom {
name: String,
max_tokens: usize,
/// The name displayed in the UI, such as in the assistant panel model dropdown menu.
display_name: Option<String>,
/// Override this model with a different Anthropic model for tool calls.
tool_override: Option<String>,
/// Indicates whether this custom model supports caching.
@@ -77,7 +82,9 @@ impl Model {
Self::Claude3Opus => "Claude 3 Opus",
Self::Claude3Sonnet => "Claude 3 Sonnet",
Self::Claude3Haiku => "Claude 3 Haiku",
Self::Custom { name, .. } => name,
Self::Custom {
name, display_name, ..
} => display_name.as_ref().unwrap_or(name),
}
}
@@ -191,6 +198,66 @@ pub async fn stream_completion(
request: Request,
low_speed_timeout: Option<Duration>,
) -> Result<BoxStream<'static, Result<Event, AnthropicError>>, AnthropicError> {
stream_completion_with_rate_limit_info(client, api_url, api_key, request, low_speed_timeout)
.await
.map(|output| output.0)
}
/// https://docs.anthropic.com/en/api/rate-limits#response-headers
#[derive(Debug)]
pub struct RateLimitInfo {
pub requests_limit: usize,
pub requests_remaining: usize,
pub requests_reset: DateTime<Utc>,
pub tokens_limit: usize,
pub tokens_remaining: usize,
pub tokens_reset: DateTime<Utc>,
}
impl RateLimitInfo {
fn from_headers(headers: &HeaderMap<HeaderValue>) -> Result<Self> {
let tokens_limit = get_header("anthropic-ratelimit-tokens-limit", headers)?.parse()?;
let requests_limit = get_header("anthropic-ratelimit-requests-limit", headers)?.parse()?;
let tokens_remaining =
get_header("anthropic-ratelimit-tokens-remaining", headers)?.parse()?;
let requests_remaining =
get_header("anthropic-ratelimit-requests-remaining", headers)?.parse()?;
let requests_reset = get_header("anthropic-ratelimit-requests-reset", headers)?;
let tokens_reset = get_header("anthropic-ratelimit-tokens-reset", headers)?;
let requests_reset = DateTime::parse_from_rfc3339(requests_reset)?.to_utc();
let tokens_reset = DateTime::parse_from_rfc3339(tokens_reset)?.to_utc();
Ok(Self {
requests_limit,
tokens_limit,
requests_remaining,
tokens_remaining,
requests_reset,
tokens_reset,
})
}
}
fn get_header<'a>(key: &str, headers: &'a HeaderMap) -> Result<&'a str, anyhow::Error> {
Ok(headers
.get(key)
.ok_or_else(|| anyhow!("missing header `{key}`"))?
.to_str()?)
}
pub async fn stream_completion_with_rate_limit_info(
client: &dyn HttpClient,
api_url: &str,
api_key: &str,
request: Request,
low_speed_timeout: Option<Duration>,
) -> Result<
(
BoxStream<'static, Result<Event, AnthropicError>>,
Option<RateLimitInfo>,
),
AnthropicError,
> {
let request = StreamingRequest {
base: request,
stream: true,
@@ -220,8 +287,9 @@ pub async fn stream_completion(
.await
.context("failed to send request to Anthropic")?;
if response.status().is_success() {
let rate_limits = RateLimitInfo::from_headers(response.headers());
let reader = BufReader::new(response.into_body());
Ok(reader
let stream = reader
.lines()
.filter_map(|line| async move {
match line {
@@ -235,7 +303,8 @@ pub async fn stream_completion(
Err(error) => Some(Err(AnthropicError::Other(anyhow!(error)))),
}
})
.boxed())
.boxed();
Ok((stream, rate_limits.log_err()))
} else {
let mut body = Vec::new();
response
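The hunk above parses Anthropic's `anthropic-ratelimit-*` response headers into a `RateLimitInfo` struct. The pattern can be sketched in self-contained stdlib Rust; note that a plain `HashMap` stands in for http's `HeaderMap`, the struct and error types here are simplified, and the real code also parses the RFC 3339 `*-reset` timestamps with chrono:

```rust
// Minimal sketch of the header-parsing pattern used by RateLimitInfo::from_headers.
// A HashMap stands in for http's HeaderMap; names and types are illustrative only.
use std::collections::HashMap;

#[derive(Debug, PartialEq)]
struct RateLimitSketch {
    requests_limit: usize,
    requests_remaining: usize,
    tokens_limit: usize,
    tokens_remaining: usize,
}

// Mirrors the diff's get_header: look up a header or report which key was missing.
fn get_header<'a>(key: &str, headers: &'a HashMap<String, String>) -> Result<&'a str, String> {
    headers
        .get(key)
        .map(String::as_str)
        .ok_or_else(|| format!("missing header `{key}`"))
}

fn from_headers(headers: &HashMap<String, String>) -> Result<RateLimitSketch, String> {
    // Each field is fetched then parsed, failing fast on a missing or malformed header.
    let parse = |key: &str| -> Result<usize, String> {
        get_header(key, headers)?
            .parse()
            .map_err(|e| format!("invalid `{key}`: {e}"))
    };
    Ok(RateLimitSketch {
        requests_limit: parse("anthropic-ratelimit-requests-limit")?,
        requests_remaining: parse("anthropic-ratelimit-requests-remaining")?,
        tokens_limit: parse("anthropic-ratelimit-tokens-limit")?,
        tokens_remaining: parse("anthropic-ratelimit-tokens-remaining")?,
    })
}

fn main() {
    let mut headers = HashMap::new();
    headers.insert("anthropic-ratelimit-requests-limit".into(), "50".into());
    headers.insert("anthropic-ratelimit-requests-remaining".into(), "49".into());
    headers.insert("anthropic-ratelimit-tokens-limit".into(), "100000".into());
    headers.insert("anthropic-ratelimit-tokens-remaining".into(), "99000".into());
    let info = from_headers(&headers).expect("all headers present");
    assert_eq!(info.requests_remaining, 49);
    // A missing header surfaces as an error rather than a panic.
    assert!(from_headers(&HashMap::new()).is_err());
    println!("{info:?}");
}
```

This also shows why the diff wraps the result in `rate_limits.log_err()`: header parsing can fail independently of the completion stream, so failures are logged and discarded rather than aborting the request.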

View File

@@ -9,6 +9,7 @@ mod model_selector;
mod prompt_library;
mod prompts;
mod slash_command;
pub(crate) mod slash_command_picker;
pub mod slash_command_settings;
mod streaming_diff;
mod terminal_inline_assistant;
@@ -33,7 +34,7 @@ use language_model::{
};
pub(crate) use model_selector::*;
pub use prompts::PromptBuilder;
use prompts::PromptOverrideContext;
use prompts::PromptLoadingParams;
use semantic_index::{CloudEmbeddingProvider, SemanticIndex};
use serde::{Deserialize, Serialize};
use settings::{update_settings_file, Settings, SettingsStore};
@@ -59,7 +60,6 @@ actions!(
InsertIntoEditor,
ToggleFocus,
InsertActivePrompt,
ShowConfiguration,
DeployHistory,
DeployPromptLibrary,
ConfirmCommand,
@@ -184,7 +184,7 @@ impl Assistant {
pub fn init(
fs: Arc<dyn Fs>,
client: Arc<Client>,
dev_mode: bool,
stdout_is_a_pty: bool,
cx: &mut AppContext,
) -> Arc<PromptBuilder> {
cx.set_global(Assistant::default());
@@ -223,9 +223,11 @@ pub fn init(
assistant_panel::init(cx);
context_servers::init(cx);
let prompt_builder = prompts::PromptBuilder::new(Some(PromptOverrideContext {
dev_mode,
let prompt_builder = prompts::PromptBuilder::new(Some(PromptLoadingParams {
fs: fs.clone(),
repo_path: stdout_is_a_pty
.then(|| std::env::current_dir().log_err())
.flatten(),
cx,
}))
.log_err()

File diff suppressed because it is too large.

View File

@@ -543,8 +543,8 @@ mod tests {
assert_eq!(
AssistantSettings::get_global(cx).default_model,
LanguageModelSelection {
provider: "openai".into(),
model: "gpt-4o".into(),
provider: "zed.dev".into(),
model: "claude-3-5-sonnet".into(),
}
);
});

View File

@@ -40,6 +40,7 @@ use std::{
time::{Duration, Instant},
};
use telemetry_events::AssistantKind;
use text::BufferSnapshot;
use util::{post_inc, ResultExt, TryFutureExt};
use uuid::Uuid;
@@ -107,8 +108,7 @@ impl ContextOperation {
message.status.context("invalid status")?,
),
timestamp: id.0,
should_cache: false,
is_cache_anchor: false,
cache: None,
},
version: language::proto::deserialize_version(&insert.version),
})
@@ -123,8 +123,7 @@ impl ContextOperation {
timestamp: language::proto::deserialize_timestamp(
update.timestamp.context("invalid timestamp")?,
),
should_cache: false,
is_cache_anchor: false,
cache: None,
},
version: language::proto::deserialize_version(&update.version),
}),
@@ -295,6 +294,7 @@ pub enum ContextEvent {
output_range: Range<language::Anchor>,
sections: Vec<SlashCommandOutputSection<language::Anchor>>,
run_commands_in_output: bool,
expand_result: bool,
},
Operation(ContextOperation),
}
@@ -312,13 +312,43 @@ pub struct MessageAnchor {
pub start: language::Anchor,
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub enum CacheStatus {
Pending,
Cached,
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub struct MessageCacheMetadata {
pub is_anchor: bool,
pub is_final_anchor: bool,
pub status: CacheStatus,
pub cached_at: clock::Global,
}
#[derive(Clone, Debug, Eq, PartialEq, Serialize, Deserialize)]
pub struct MessageMetadata {
pub role: Role,
pub status: MessageStatus,
timestamp: clock::Lamport,
should_cache: bool,
is_cache_anchor: bool,
#[serde(skip)]
pub cache: Option<MessageCacheMetadata>,
}
impl MessageMetadata {
pub fn is_cache_valid(&self, buffer: &BufferSnapshot, range: &Range<usize>) -> bool {
let result = match &self.cache {
Some(MessageCacheMetadata { cached_at, .. }) => !buffer.has_edits_since_in_range(
&cached_at,
Range {
start: buffer.anchor_at(range.start, Bias::Right),
end: buffer.anchor_at(range.end, Bias::Left),
},
),
_ => false,
};
result
}
}
#[derive(Clone, Debug)]
@@ -344,7 +374,7 @@ pub struct Message {
pub anchor: language::Anchor,
pub role: Role,
pub status: MessageStatus,
pub cache: bool,
pub cache: Option<MessageCacheMetadata>,
}
impl Message {
@@ -380,7 +410,7 @@ impl Message {
Some(LanguageModelRequestMessage {
role: self.role,
content,
cache: self.cache,
cache: self.cache.as_ref().map_or(false, |cache| cache.is_anchor),
})
}
@@ -543,8 +573,7 @@ impl Context {
role: Role::User,
status: MessageStatus::Done,
timestamp: first_message_id.0,
should_cache: false,
is_cache_anchor: false,
cache: None,
},
);
this.message_anchors.push(message);
@@ -774,6 +803,7 @@ impl Context {
cx.emit(ContextEvent::SlashCommandFinished {
output_range,
sections,
expand_result: false,
run_commands_in_output: false,
});
}
@@ -977,7 +1007,7 @@ impl Context {
});
}
pub fn mark_longest_messages_for_cache(
pub fn mark_cache_anchors(
&mut self,
cache_configuration: &Option<LanguageModelCacheConfiguration>,
speculative: bool,
@@ -992,66 +1022,104 @@ impl Context {
min_total_token: 0,
});
let messages: Vec<Message> = self
.messages_from_anchors(
self.message_anchors.iter().take(if speculative {
self.message_anchors.len().saturating_sub(1)
} else {
self.message_anchors.len()
}),
cx,
)
.filter(|message| message.offset_range.len() >= 5_000)
.collect();
let messages: Vec<Message> = self.messages(cx).collect();
let mut sorted_messages = messages.clone();
sorted_messages.sort_by(|a, b| b.offset_range.len().cmp(&a.offset_range.len()));
if cache_configuration.max_cache_anchors == 0 && cache_configuration.should_speculate {
// Some models support caching, but don't support anchors. In that case we want to
// mark the largest message as needing to be cached, but we will not mark it as an
// anchor.
sorted_messages.truncate(1);
} else {
// Save 1 anchor for the inline assistant.
sorted_messages.truncate(max(cache_configuration.max_cache_anchors, 1) - 1);
if speculative {
// Avoid caching the last message if this is a speculative cache fetch as
// it's likely to change.
sorted_messages.pop();
}
sorted_messages.retain(|m| m.role == Role::User);
sorted_messages.sort_by(|a, b| b.offset_range.len().cmp(&a.offset_range.len()));
let longest_message_ids: HashSet<MessageId> = sorted_messages
let cache_anchors = if self.token_count.unwrap_or(0) < cache_configuration.min_total_token {
// If we haven't hit the minimum threshold to enable caching, don't cache anything.
0
} else {
// Save 1 anchor for the inline assistant to use.
max(cache_configuration.max_cache_anchors, 1) - 1
};
sorted_messages.truncate(cache_anchors);
let anchors: HashSet<MessageId> = sorted_messages
.into_iter()
.map(|message| message.id)
.collect();
let cache_deltas: HashSet<MessageId> = self
.messages_metadata
let buffer = self.buffer.read(cx).snapshot();
let invalidated_caches: HashSet<MessageId> = messages
.iter()
.filter_map(|(id, metadata)| {
let should_cache = longest_message_ids.contains(id);
let should_be_anchor = should_cache && cache_configuration.max_cache_anchors > 0;
if metadata.should_cache != should_cache
|| metadata.is_cache_anchor != should_be_anchor
{
Some(*id)
} else {
None
}
.scan(false, |encountered_invalid, message| {
let message_id = message.id;
let is_invalid = self
.messages_metadata
.get(&message_id)
.map_or(true, |metadata| {
!metadata.is_cache_valid(&buffer, &message.offset_range)
|| *encountered_invalid
});
*encountered_invalid |= is_invalid;
Some(if is_invalid { Some(message_id) } else { None })
})
.flatten()
.collect();
let mut newly_cached_item = false;
for id in cache_deltas {
newly_cached_item = newly_cached_item || longest_message_ids.contains(&id);
self.update_metadata(id, cx, |metadata| {
metadata.should_cache = longest_message_ids.contains(&id);
metadata.is_cache_anchor =
metadata.should_cache && (cache_configuration.max_cache_anchors > 0);
let last_anchor = messages.iter().rev().find_map(|message| {
if anchors.contains(&message.id) {
Some(message.id)
} else {
None
}
});
let mut new_anchor_needs_caching = false;
let current_version = &buffer.version;
// If we have no anchors, mark all messages as not being cached.
let mut hit_last_anchor = last_anchor.is_none();
for message in messages.iter() {
if hit_last_anchor {
self.update_metadata(message.id, cx, |metadata| metadata.cache = None);
continue;
}
if let Some(last_anchor) = last_anchor {
if message.id == last_anchor {
hit_last_anchor = true;
}
}
new_anchor_needs_caching = new_anchor_needs_caching
|| (invalidated_caches.contains(&message.id) && anchors.contains(&message.id));
self.update_metadata(message.id, cx, |metadata| {
let cache_status = if invalidated_caches.contains(&message.id) {
CacheStatus::Pending
} else {
metadata
.cache
.as_ref()
.map_or(CacheStatus::Pending, |cm| cm.status.clone())
};
metadata.cache = Some(MessageCacheMetadata {
is_anchor: anchors.contains(&message.id),
is_final_anchor: hit_last_anchor,
status: cache_status,
cached_at: current_version.clone(),
});
});
}
newly_cached_item
new_anchor_needs_caching
}
fn start_cache_warming(&mut self, model: &Arc<dyn LanguageModel>, cx: &mut ModelContext<Self>) {
let cache_configuration = model.cache_configuration();
if !self.mark_longest_messages_for_cache(&cache_configuration, true, cx) {
if !self.mark_cache_anchors(&cache_configuration, true, cx) {
return;
}
if !self.pending_completions.is_empty() {
return;
}
if let Some(cache_configuration) = cache_configuration {
@@ -1074,7 +1142,7 @@ impl Context {
};
let model = Arc::clone(model);
self.pending_cache_warming_task = cx.spawn(|_, cx| {
self.pending_cache_warming_task = cx.spawn(|this, mut cx| {
async move {
match model.stream_completion(request, &cx).await {
Ok(mut stream) => {
@@ -1085,13 +1153,41 @@ impl Context {
log::warn!("Cache warming failed: {}", e);
}
};
this.update(&mut cx, |this, cx| {
this.update_cache_status_for_completion(cx);
})
.ok();
anyhow::Ok(())
}
.log_err()
});
}
pub fn update_cache_status_for_completion(&mut self, cx: &mut ModelContext<Self>) {
let cached_message_ids: Vec<MessageId> = self
.messages_metadata
.iter()
.filter_map(|(message_id, metadata)| {
metadata.cache.as_ref().and_then(|cache| {
if cache.status == CacheStatus::Pending {
Some(*message_id)
} else {
None
}
})
})
.collect();
for message_id in cached_message_ids {
self.update_metadata(message_id, cx, |metadata| {
if let Some(cache) = &mut metadata.cache {
cache.status = CacheStatus::Cached;
}
});
}
cx.notify();
}
pub fn reparse_slash_commands(&mut self, cx: &mut ModelContext<Self>) {
let buffer = self.buffer.read(cx);
let mut row_ranges = self
@@ -1395,7 +1491,8 @@ impl Context {
&mut self,
command_range: Range<language::Anchor>,
output: Task<Result<SlashCommandOutput>>,
insert_trailing_newline: bool,
ensure_trailing_newline: bool,
expand_result: bool,
cx: &mut ModelContext<Self>,
) {
self.reparse_slash_commands(cx);
@@ -1406,8 +1503,27 @@ impl Context {
let output = output.await;
this.update(&mut cx, |this, cx| match output {
Ok(mut output) => {
if insert_trailing_newline {
output.text.push('\n');
// Ensure section ranges are valid.
for section in &mut output.sections {
section.range.start = section.range.start.min(output.text.len());
section.range.end = section.range.end.min(output.text.len());
while !output.text.is_char_boundary(section.range.start) {
section.range.start -= 1;
}
while !output.text.is_char_boundary(section.range.end) {
section.range.end += 1;
}
}
// Ensure there is a newline after the last section.
if ensure_trailing_newline {
let has_newline_after_last_section =
output.sections.last().map_or(false, |last_section| {
output.text[last_section.range.end..].ends_with('\n')
});
if !has_newline_after_last_section {
output.text.push('\n');
}
}
let version = this.version.clone();
@@ -1450,6 +1566,7 @@ impl Context {
output_range,
sections,
run_commands_in_output: output.run_commands_in_text,
expand_result,
},
)
});
@@ -1508,7 +1625,7 @@ impl Context {
return None;
}
// Compute which messages to cache, including the last one.
self.mark_longest_messages_for_cache(&model.cache_configuration(), false, cx);
self.mark_cache_anchors(&model.cache_configuration(), false, cx);
let request = self.to_completion_request(cx);
let assistant_message = self
@@ -1573,6 +1690,7 @@ impl Context {
this.pending_completions
.retain(|completion| completion.id != pending_completion_id);
this.summarize(false, cx);
this.update_cache_status_for_completion(cx);
})?;
anyhow::Ok(())
@@ -1723,8 +1841,7 @@ impl Context {
role,
status,
timestamp: anchor.id.0,
should_cache: false,
is_cache_anchor: false,
cache: None,
};
self.insert_message(anchor.clone(), metadata.clone(), cx);
self.push_op(
@@ -1841,8 +1958,7 @@ impl Context {
role,
status: MessageStatus::Done,
timestamp: suffix.id.0,
should_cache: false,
is_cache_anchor: false,
cache: None,
};
self.insert_message(suffix.clone(), suffix_metadata.clone(), cx);
self.push_op(
@@ -1892,8 +2008,7 @@ impl Context {
role,
status: MessageStatus::Done,
timestamp: selection.id.0,
should_cache: false,
is_cache_anchor: false,
cache: None,
};
self.insert_message(selection.clone(), selection_metadata.clone(), cx);
self.push_op(
@@ -2127,7 +2242,7 @@ impl Context {
anchor: message_anchor.start,
role: metadata.role,
status: metadata.status.clone(),
cache: metadata.is_cache_anchor,
cache: metadata.cache.clone(),
image_offsets,
});
}
@@ -2374,8 +2489,7 @@ impl SavedContext {
role: message.metadata.role,
status: message.metadata.status,
timestamp: message.metadata.timestamp,
should_cache: false,
is_cache_anchor: false,
cache: None,
},
version: version.clone(),
});
@@ -2392,8 +2506,7 @@ impl SavedContext {
role: metadata.role,
status: metadata.status,
timestamp,
should_cache: false,
is_cache_anchor: false,
cache: None,
},
version: version.clone(),
});
@@ -2488,8 +2601,7 @@ impl SavedContextV0_3_0 {
role: metadata.role,
status: metadata.status.clone(),
timestamp,
should_cache: false,
is_cache_anchor: false,
cache: None,
},
image_offsets: Vec::new(),
})


@@ -1,6 +1,6 @@
use crate::{
assistant_panel, prompt_library, slash_command::file_command, workflow::tool, Context,
ContextEvent, ContextId, ContextOperation, MessageId, MessageStatus, PromptBuilder,
assistant_panel, prompt_library, slash_command::file_command, workflow::tool, CacheStatus,
Context, ContextEvent, ContextId, ContextOperation, MessageId, MessageStatus, PromptBuilder,
};
use anyhow::Result;
use assistant_slash_command::{
@@ -12,7 +12,7 @@ use fs::{FakeFs, Fs as _};
use gpui::{AppContext, Model, SharedString, Task, TestAppContext, WeakView};
use indoc::indoc;
use language::{Buffer, LanguageRegistry, LspAdapterDelegate};
use language_model::{LanguageModelRegistry, Role};
use language_model::{LanguageModelCacheConfiguration, LanguageModelRegistry, Role};
use parking_lot::Mutex;
use project::Project;
use rand::prelude::*;
@@ -33,6 +33,8 @@ use unindent::Unindent;
use util::{test::marked_text_ranges, RandomCharIter};
use workspace::Workspace;
use super::MessageCacheMetadata;
#[gpui::test]
fn test_inserting_and_removing_messages(cx: &mut AppContext) {
let settings_store = SettingsStore::test(cx);
@@ -473,7 +475,7 @@ async fn test_slash_commands(cx: &mut TestAppContext) {
}
#[gpui::test]
async fn test_edit_step_parsing(cx: &mut TestAppContext) {
async fn test_workflow_step_parsing(cx: &mut TestAppContext) {
cx.update(prompt_library::init);
let settings_store = cx.update(SettingsStore::test);
cx.set_global(settings_store);
@@ -891,6 +893,7 @@ async fn test_random_context_collaboration(cx: &mut TestAppContext, mut rng: Std
run_commands_in_text: false,
})),
true,
false,
cx,
);
});
@@ -1001,6 +1004,159 @@ async fn test_random_context_collaboration(cx: &mut TestAppContext, mut rng: Std
});
}
#[gpui::test]
fn test_mark_cache_anchors(cx: &mut AppContext) {
let settings_store = SettingsStore::test(cx);
LanguageModelRegistry::test(cx);
cx.set_global(settings_store);
assistant_panel::init(cx);
let registry = Arc::new(LanguageRegistry::test(cx.background_executor().clone()));
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let context =
cx.new_model(|cx| Context::local(registry, None, None, prompt_builder.clone(), cx));
let buffer = context.read(cx).buffer.clone();
// Create a test cache configuration
let cache_configuration = &Some(LanguageModelCacheConfiguration {
max_cache_anchors: 3,
should_speculate: true,
min_total_token: 10,
});
let message_1 = context.read(cx).message_anchors[0].clone();
context.update(cx, |context, cx| {
context.mark_cache_anchors(cache_configuration, false, cx)
});
assert_eq!(
messages_cache(&context, cx)
.iter()
.filter(|(_, cache)| cache.as_ref().map_or(false, |cache| cache.is_anchor))
.count(),
0,
"Empty messages should not have any cache anchors."
);
buffer.update(cx, |buffer, cx| buffer.edit([(0..0, "aaa")], None, cx));
let message_2 = context
.update(cx, |context, cx| {
context.insert_message_after(message_1.id, Role::User, MessageStatus::Pending, cx)
})
.unwrap();
buffer.update(cx, |buffer, cx| buffer.edit([(4..4, "bbbbbbb")], None, cx));
let message_3 = context
.update(cx, |context, cx| {
context.insert_message_after(message_2.id, Role::User, MessageStatus::Pending, cx)
})
.unwrap();
buffer.update(cx, |buffer, cx| buffer.edit([(12..12, "cccccc")], None, cx));
context.update(cx, |context, cx| {
context.mark_cache_anchors(cache_configuration, false, cx)
});
assert_eq!(buffer.read(cx).text(), "aaa\nbbbbbbb\ncccccc");
assert_eq!(
messages_cache(&context, cx)
.iter()
.filter(|(_, cache)| cache.as_ref().map_or(false, |cache| cache.is_anchor))
.count(),
0,
"Messages should not be marked for caching before exceeding the token minimum."
);
context.update(cx, |context, _| {
context.token_count = Some(20);
});
context.update(cx, |context, cx| {
context.mark_cache_anchors(cache_configuration, true, cx)
});
assert_eq!(
messages_cache(&context, cx)
.iter()
.map(|(_, cache)| cache.as_ref().map_or(false, |cache| cache.is_anchor))
.collect::<Vec<bool>>(),
vec![true, true, false],
"Last message should not be an anchor on a speculative request."
);
context
.update(cx, |context, cx| {
context.insert_message_after(message_3.id, Role::Assistant, MessageStatus::Pending, cx)
})
.unwrap();
context.update(cx, |context, cx| {
context.mark_cache_anchors(cache_configuration, false, cx)
});
assert_eq!(
messages_cache(&context, cx)
.iter()
.map(|(_, cache)| cache.as_ref().map_or(false, |cache| cache.is_anchor))
.collect::<Vec<bool>>(),
vec![false, true, true, false],
"Most recent message should also be cached if not a speculative request."
);
context.update(cx, |context, cx| {
context.update_cache_status_for_completion(cx)
});
assert_eq!(
messages_cache(&context, cx)
.iter()
.map(|(_, cache)| cache
.as_ref()
.map_or(None, |cache| Some(cache.status.clone())))
.collect::<Vec<Option<CacheStatus>>>(),
vec![
Some(CacheStatus::Cached),
Some(CacheStatus::Cached),
Some(CacheStatus::Cached),
None
],
"All user messages prior to the anchor should be marked as cached."
);
buffer.update(cx, |buffer, cx| buffer.edit([(14..14, "d")], None, cx));
context.update(cx, |context, cx| {
context.mark_cache_anchors(cache_configuration, false, cx)
});
assert_eq!(
messages_cache(&context, cx)
.iter()
.map(|(_, cache)| cache
.as_ref()
.map_or(None, |cache| Some(cache.status.clone())))
.collect::<Vec<Option<CacheStatus>>>(),
vec![
Some(CacheStatus::Cached),
Some(CacheStatus::Cached),
Some(CacheStatus::Pending),
None
],
"Modifying a message should invalidate its cache but leave previous messages."
);
buffer.update(cx, |buffer, cx| buffer.edit([(2..2, "e")], None, cx));
context.update(cx, |context, cx| {
context.mark_cache_anchors(cache_configuration, false, cx)
});
assert_eq!(
messages_cache(&context, cx)
.iter()
.map(|(_, cache)| cache
.as_ref()
.map_or(None, |cache| Some(cache.status.clone())))
.collect::<Vec<Option<CacheStatus>>>(),
vec![
Some(CacheStatus::Pending),
Some(CacheStatus::Pending),
Some(CacheStatus::Pending),
None
],
"Modifying a message should invalidate all future messages."
);
}
fn messages(context: &Model<Context>, cx: &AppContext) -> Vec<(MessageId, Role, Range<usize>)> {
context
.read(cx)
@@ -1009,6 +1165,17 @@ fn messages(context: &Model<Context>, cx: &AppContext) -> Vec<(MessageId, Role,
.collect()
}
fn messages_cache(
context: &Model<Context>,
cx: &AppContext,
) -> Vec<(MessageId, Option<MessageCacheMetadata>)> {
context
.read(cx)
.messages(cx)
.map(|message| (message.id, message.cache.clone()))
.collect()
}
#[derive(Clone)]
struct FakeSlashCommand(String);


@@ -28,7 +28,7 @@ use gpui::{
FontWeight, Global, HighlightStyle, Model, ModelContext, Subscription, Task, TextStyle,
UpdateGlobal, View, ViewContext, WeakView, WindowContext,
};
use language::{Buffer, IndentKind, Point, TransactionId};
use language::{Buffer, IndentKind, Point, Selection, TransactionId};
use language_model::{
LanguageModelRegistry, LanguageModelRequest, LanguageModelRequestMessage, Role,
};
@@ -38,6 +38,7 @@ use rope::Rope;
use settings::Settings;
use smol::future::FutureExt;
use std::{
cmp,
future::{self, Future},
mem,
ops::{Range, RangeInclusive},
@@ -46,7 +47,6 @@ use std::{
task::{self, Poll},
time::{Duration, Instant},
};
use text::OffsetRangeExt as _;
use theme::ThemeSettings;
use ui::{prelude::*, CheckboxWithLabel, IconButtonShape, Popover, Tooltip};
use util::{RangeExt, ResultExt};
@@ -76,8 +76,13 @@ pub struct InlineAssistant {
assists: HashMap<InlineAssistId, InlineAssist>,
assists_by_editor: HashMap<WeakView<Editor>, EditorInlineAssists>,
assist_groups: HashMap<InlineAssistGroupId, InlineAssistGroup>,
assist_observations:
HashMap<InlineAssistId, (async_watch::Sender<()>, async_watch::Receiver<()>)>,
assist_observations: HashMap<
InlineAssistId,
(
async_watch::Sender<AssistStatus>,
async_watch::Receiver<AssistStatus>,
),
>,
confirmed_assists: HashMap<InlineAssistId, Model<Codegen>>,
prompt_history: VecDeque<String>,
prompt_builder: Arc<PromptBuilder>,
@@ -85,6 +90,19 @@ pub struct InlineAssistant {
fs: Arc<dyn Fs>,
}
pub enum AssistStatus {
Idle,
Started,
Stopped,
Finished,
}
impl AssistStatus {
pub fn is_done(&self) -> bool {
matches!(self, Self::Stopped | Self::Finished)
}
}
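With this change, `observe_assist` channels carry an `AssistStatus` instead of a bare `()`, so observers can distinguish start, stop, and finish. A minimal sketch of the observer side, substituting `std::sync::mpsc` for the `async_watch` channel the assistant actually uses:

```rust
use std::sync::mpsc;

#[derive(Clone, Copy, Debug, PartialEq)]
enum AssistStatus {
    Idle,
    Started,
    Stopped,
    Finished,
}

impl AssistStatus {
    fn is_done(&self) -> bool {
        matches!(self, Self::Stopped | Self::Finished)
    }
}

/// Drain status updates until the assist reports a terminal state,
/// returning the last status observed.
fn wait_until_done(rx: &mpsc::Receiver<AssistStatus>) -> AssistStatus {
    let mut last = AssistStatus::Idle;
    for status in rx.iter() {
        last = status;
        if status.is_done() {
            break;
        }
    }
    last
}
```

A watch channel only retains the latest value, unlike the mpsc queue here, but the terminal-state check is the same either way.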
impl Global for InlineAssistant {}
impl InlineAssistant {
@@ -140,81 +158,66 @@ impl InlineAssistant {
cx: &mut WindowContext,
) {
let snapshot = editor.read(cx).buffer().read(cx).snapshot(cx);
struct CodegenRange {
transform_range: Range<Point>,
selection_ranges: Vec<Range<Point>>,
focus_assist: bool,
}
let newest_selection_range = editor.read(cx).selections.newest::<Point>(cx).range();
let mut codegen_ranges: Vec<CodegenRange> = Vec::new();
let selection_ranges = snapshot
.split_ranges(editor.read(cx).selections.disjoint_anchor_ranges())
.map(|range| range.to_point(&snapshot))
.collect::<Vec<Range<Point>>>();
for selection_range in selection_ranges {
let selection_is_newest = newest_selection_range.contains_inclusive(&selection_range);
let mut transform_range = selection_range.start..selection_range.end;
// Expand the transform range to start/end of lines.
// If a non-empty selection ends at the start of the last line, clip at the end of the penultimate line.
transform_range.start.column = 0;
if transform_range.end.column == 0 && transform_range.end > transform_range.start {
transform_range.end.row -= 1;
let mut selections = Vec::<Selection<Point>>::new();
let mut newest_selection = None;
for mut selection in editor.read(cx).selections.all::<Point>(cx) {
if selection.end > selection.start {
selection.start.column = 0;
// If the selection ends at the start of the line, we don't want to include it.
if selection.end.column == 0 {
selection.end.row -= 1;
}
selection.end.column = snapshot.line_len(MultiBufferRow(selection.end.row));
}
transform_range.end.column = snapshot.line_len(MultiBufferRow(transform_range.end.row));
let selection_range =
selection_range.start..selection_range.end.min(transform_range.end);
// If we intersect the previous transform range,
if let Some(CodegenRange {
transform_range: prev_transform_range,
selection_ranges,
focus_assist,
}) = codegen_ranges.last_mut()
{
if transform_range.start <= prev_transform_range.end {
prev_transform_range.end = transform_range.end;
selection_ranges.push(selection_range);
*focus_assist |= selection_is_newest;
if let Some(prev_selection) = selections.last_mut() {
if selection.start <= prev_selection.end {
prev_selection.end = selection.end;
continue;
}
}
codegen_ranges.push(CodegenRange {
transform_range,
selection_ranges: vec![selection_range],
focus_assist: selection_is_newest,
})
let latest_selection = newest_selection.get_or_insert_with(|| selection.clone());
if selection.id > latest_selection.id {
*latest_selection = selection.clone();
}
selections.push(selection);
}
let newest_selection = newest_selection.unwrap();
let mut codegen_ranges = Vec::new();
for (excerpt_id, buffer, buffer_range) in
snapshot.excerpts_in_ranges(selections.iter().map(|selection| {
snapshot.anchor_before(selection.start)..snapshot.anchor_after(selection.end)
}))
{
let start = Anchor {
buffer_id: Some(buffer.remote_id()),
excerpt_id,
text_anchor: buffer.anchor_before(buffer_range.start),
};
let end = Anchor {
buffer_id: Some(buffer.remote_id()),
excerpt_id,
text_anchor: buffer.anchor_after(buffer_range.end),
};
codegen_ranges.push(start..end);
}
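The selection loop above expands each non-empty selection to whole lines and folds any selection that overlaps its predecessor into a single range before excerpt anchors are built. The merge step in isolation, over sorted `(start, end)` row pairs (the tuple representation is a simplification of the editor's `Selection<Point>`):

```rust
/// Merge overlapping or touching ranges in a list sorted by start,
/// as the selection loop above does for line-expanded selections.
/// Taking the max of the two ends also guards against nested ranges.
fn merge_sorted_ranges(ranges: Vec<(u32, u32)>) -> Vec<(u32, u32)> {
    let mut merged: Vec<(u32, u32)> = Vec::new();
    for (start, end) in ranges {
        if let Some(prev) = merged.last_mut() {
            if start <= prev.1 {
                // Overlaps the previous range: extend it instead of pushing.
                prev.1 = prev.1.max(end);
                continue;
            }
        }
        merged.push((start, end));
    }
    merged
}
```

Merging before creating codegen ranges is what prevents two overlapping selections from spawning two assists over the same lines.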
let assist_group_id = self.next_assist_group_id.post_inc();
let prompt_buffer =
cx.new_model(|cx| Buffer::local(initial_prompt.unwrap_or_default(), cx));
let prompt_buffer = cx.new_model(|cx| MultiBuffer::singleton(prompt_buffer, cx));
let mut assists = Vec::new();
let mut assist_to_focus = None;
for CodegenRange {
transform_range,
selection_ranges,
focus_assist,
} in codegen_ranges
{
let transform_range = snapshot.anchor_before(transform_range.start)
..snapshot.anchor_after(transform_range.end);
let selection_ranges = selection_ranges
.iter()
.map(|range| snapshot.anchor_before(range.start)..snapshot.anchor_after(range.end))
.collect::<Vec<_>>();
for range in codegen_ranges {
let assist_id = self.next_assist_id.post_inc();
let codegen = cx.new_model(|cx| {
Codegen::new(
editor.read(cx).buffer().clone(),
transform_range.clone(),
selection_ranges,
range.clone(),
None,
self.telemetry.clone(),
self.prompt_builder.clone(),
@@ -222,7 +225,6 @@ impl InlineAssistant {
)
});
let assist_id = self.next_assist_id.post_inc();
let gutter_dimensions = Arc::new(Mutex::new(GutterDimensions::default()));
let prompt_editor = cx.new_view(|cx| {
PromptEditor::new(
@@ -239,16 +241,23 @@ impl InlineAssistant {
)
});
if focus_assist {
assist_to_focus = Some(assist_id);
if assist_to_focus.is_none() {
let focus_assist = if newest_selection.reversed {
range.start.to_point(&snapshot) == newest_selection.start
} else {
range.end.to_point(&snapshot) == newest_selection.end
};
if focus_assist {
assist_to_focus = Some(assist_id);
}
}
let [prompt_block_id, end_block_id] =
self.insert_assist_blocks(editor, &transform_range, &prompt_editor, cx);
self.insert_assist_blocks(editor, &range, &prompt_editor, cx);
assists.push((
assist_id,
transform_range,
range,
prompt_editor,
prompt_block_id,
end_block_id,
@@ -315,7 +324,6 @@ impl InlineAssistant {
Codegen::new(
editor.read(cx).buffer().clone(),
range.clone(),
vec![range.clone()],
initial_transaction_id,
self.telemetry.clone(),
self.prompt_builder.clone(),
@@ -925,12 +933,17 @@ impl InlineAssistant {
assist
.codegen
.update(cx, |codegen, cx| {
codegen.start(user_prompt, assistant_panel_context, cx)
codegen.start(
assist.range.clone(),
user_prompt,
assistant_panel_context,
cx,
)
})
.log_err();
if let Some((tx, _)) = self.assist_observations.get(&assist_id) {
tx.send(()).ok();
tx.send(AssistStatus::Started).ok();
}
}
@@ -944,7 +957,7 @@ impl InlineAssistant {
assist.codegen.update(cx, |codegen, cx| codegen.stop(cx));
if let Some((tx, _)) = self.assist_observations.get(&assist_id) {
tx.send(()).ok();
tx.send(AssistStatus::Stopped).ok();
}
}
@@ -1146,11 +1159,14 @@ impl InlineAssistant {
})
}
pub fn observe_assist(&mut self, assist_id: InlineAssistId) -> async_watch::Receiver<()> {
pub fn observe_assist(
&mut self,
assist_id: InlineAssistId,
) -> async_watch::Receiver<AssistStatus> {
if let Some((_, rx)) = self.assist_observations.get(&assist_id) {
rx.clone()
} else {
let (tx, rx) = async_watch::channel(());
let (tx, rx) = async_watch::channel(AssistStatus::Idle);
self.assist_observations.insert(assist_id, (tx, rx.clone()));
rx
}
@@ -2084,7 +2100,7 @@ impl InlineAssist {
if assist.decorations.is_none() {
this.finish_assist(assist_id, false, cx);
} else if let Some(tx) = this.assist_observations.get(&assist_id) {
tx.0.send(()).ok();
tx.0.send(AssistStatus::Finished).ok();
}
}
})
@@ -2120,9 +2136,12 @@ impl InlineAssist {
return future::ready(Err(anyhow!("no user prompt"))).boxed();
};
let assistant_panel_context = self.assistant_panel_context(cx);
self.codegen
.read(cx)
.count_tokens(user_prompt, assistant_panel_context, cx)
self.codegen.read(cx).count_tokens(
self.range.clone(),
user_prompt,
assistant_panel_context,
cx,
)
}
}
@@ -2143,8 +2162,6 @@ pub struct Codegen {
buffer: Model<MultiBuffer>,
old_buffer: Model<Buffer>,
snapshot: MultiBufferSnapshot,
transform_range: Range<Anchor>,
selected_ranges: Vec<Range<Anchor>>,
edit_position: Option<Anchor>,
last_equal_ranges: Vec<Range<Anchor>>,
initial_transaction_id: Option<TransactionId>,
@@ -2154,7 +2171,7 @@ pub struct Codegen {
diff: Diff,
telemetry: Option<Arc<Telemetry>>,
_subscription: gpui::Subscription,
prompt_builder: Arc<PromptBuilder>,
builder: Arc<PromptBuilder>,
}
enum CodegenStatus {
@@ -2181,8 +2198,7 @@ impl EventEmitter<CodegenEvent> for Codegen {}
impl Codegen {
pub fn new(
buffer: Model<MultiBuffer>,
transform_range: Range<Anchor>,
selected_ranges: Vec<Range<Anchor>>,
range: Range<Anchor>,
initial_transaction_id: Option<TransactionId>,
telemetry: Option<Arc<Telemetry>>,
builder: Arc<PromptBuilder>,
@@ -2192,7 +2208,7 @@ impl Codegen {
let (old_buffer, _, _) = buffer
.read(cx)
.range_to_buffer_ranges(transform_range.clone(), cx)
.range_to_buffer_ranges(range.clone(), cx)
.pop()
.unwrap();
let old_buffer = cx.new_model(|cx| {
@@ -2223,9 +2239,7 @@ impl Codegen {
telemetry,
_subscription: cx.subscribe(&buffer, Self::handle_buffer_event),
initial_transaction_id,
prompt_builder: builder,
transform_range,
selected_ranges,
builder,
}
}
@@ -2250,12 +2264,14 @@ impl Codegen {
pub fn count_tokens(
&self,
edit_range: Range<Anchor>,
user_prompt: String,
assistant_panel_context: Option<LanguageModelRequest>,
cx: &AppContext,
) -> BoxFuture<'static, Result<TokenCounts>> {
if let Some(model) = LanguageModelRegistry::read_global(cx).active_model() {
let request = self.build_request(user_prompt, assistant_panel_context.clone(), cx);
let request =
self.build_request(user_prompt, assistant_panel_context.clone(), edit_range, cx);
match request {
Ok(request) => {
let total_count = model.count_tokens(request.clone(), cx);
@@ -2280,6 +2296,7 @@ impl Codegen {
pub fn start(
&mut self,
edit_range: Range<Anchor>,
user_prompt: String,
assistant_panel_context: Option<LanguageModelRequest>,
cx: &mut ModelContext<Self>,
@@ -2294,20 +2311,24 @@ impl Codegen {
});
}
self.edit_position = Some(self.transform_range.start.bias_right(&self.snapshot));
self.edit_position = Some(edit_range.start.bias_right(&self.snapshot));
let telemetry_id = model.telemetry_id();
let chunks: LocalBoxFuture<Result<BoxStream<Result<String>>>> =
if user_prompt.trim().to_lowercase() == "delete" {
async { Ok(stream::empty().boxed()) }.boxed_local()
} else {
let request = self.build_request(user_prompt, assistant_panel_context, cx)?;
let chunks: LocalBoxFuture<Result<BoxStream<Result<String>>>> = if user_prompt
.trim()
.to_lowercase()
== "delete"
{
async { Ok(stream::empty().boxed()) }.boxed_local()
} else {
let request =
self.build_request(user_prompt, assistant_panel_context, edit_range.clone(), cx)?;
let chunks =
cx.spawn(|_, cx| async move { model.stream_completion(request, &cx).await });
async move { Ok(chunks.await?.boxed()) }.boxed_local()
};
self.handle_stream(telemetry_id, self.transform_range.clone(), chunks, cx);
let chunks =
cx.spawn(|_, cx| async move { model.stream_completion(request, &cx).await });
async move { Ok(chunks.await?.boxed()) }.boxed_local()
};
self.handle_stream(telemetry_id, edit_range, chunks, cx);
Ok(())
}
@@ -2315,10 +2336,11 @@ impl Codegen {
&self,
user_prompt: String,
assistant_panel_context: Option<LanguageModelRequest>,
edit_range: Range<Anchor>,
cx: &AppContext,
) -> Result<LanguageModelRequest> {
let buffer = self.buffer.read(cx).snapshot(cx);
let language = buffer.language_at(self.transform_range.start);
let language = buffer.language_at(edit_range.start);
let language_name = if let Some(language) = language.as_ref() {
if Arc::ptr_eq(language, &language::PLAIN_TEXT) {
None
@@ -2343,9 +2365,9 @@ impl Codegen {
};
let language_name = language_name.as_deref();
let start = buffer.point_to_buffer_offset(self.transform_range.start);
let end = buffer.point_to_buffer_offset(self.transform_range.end);
let (transform_buffer, transform_range) = if let Some((start, end)) = start.zip(end) {
let start = buffer.point_to_buffer_offset(edit_range.start);
let end = buffer.point_to_buffer_offset(edit_range.end);
let (buffer, range) = if let Some((start, end)) = start.zip(end) {
let (start_buffer, start_buffer_offset) = start;
let (end_buffer, end_buffer_offset) = end;
if start_buffer.remote_id() == end_buffer.remote_id() {
@@ -2357,39 +2379,9 @@ impl Codegen {
return Err(anyhow::anyhow!("invalid transformation range"));
};
let mut transform_context_range = transform_range.to_point(&transform_buffer);
transform_context_range.start.row = transform_context_range.start.row.saturating_sub(3);
transform_context_range.start.column = 0;
transform_context_range.end =
(transform_context_range.end + Point::new(3, 0)).min(transform_buffer.max_point());
transform_context_range.end.column =
transform_buffer.line_len(transform_context_range.end.row);
let transform_context_range = transform_context_range.to_offset(&transform_buffer);
let selected_ranges = self
.selected_ranges
.iter()
.filter_map(|selected_range| {
let start = buffer
.point_to_buffer_offset(selected_range.start)
.map(|(_, offset)| offset)?;
let end = buffer
.point_to_buffer_offset(selected_range.end)
.map(|(_, offset)| offset)?;
Some(start..end)
})
.collect::<Vec<_>>();
let prompt = self
.prompt_builder
.generate_content_prompt(
user_prompt,
language_name,
transform_buffer,
transform_range,
selected_ranges,
transform_context_range,
)
.builder
.generate_content_prompt(user_prompt, language_name, buffer, range)
.map_err(|e| anyhow::anyhow!("Failed to generate content prompt: {}", e))?;
let mut messages = Vec::new();
@@ -2462,19 +2454,84 @@ impl Codegen {
let mut diff = StreamingDiff::new(selected_text.to_string());
let mut line_diff = LineDiff::default();
let mut new_text = String::new();
let mut base_indent = None;
let mut line_indent = None;
let mut first_line = true;
while let Some(chunk) = chunks.next().await {
if response_latency.is_none() {
response_latency = Some(request_start.elapsed());
}
let chunk = chunk?;
let char_ops = diff.push_new(&chunk);
line_diff.push_char_operations(&char_ops, &selected_text);
diff_tx
.send((char_ops, line_diff.line_operations()))
.await?;
let mut lines = chunk.split('\n').peekable();
while let Some(line) = lines.next() {
new_text.push_str(line);
if line_indent.is_none() {
if let Some(non_whitespace_ch_ix) =
new_text.find(|ch: char| !ch.is_whitespace())
{
line_indent = Some(non_whitespace_ch_ix);
base_indent = base_indent.or(line_indent);
let line_indent = line_indent.unwrap();
let base_indent = base_indent.unwrap();
let indent_delta =
line_indent as i32 - base_indent as i32;
let mut corrected_indent_len = cmp::max(
0,
suggested_line_indent.len as i32 + indent_delta,
)
as usize;
if first_line {
corrected_indent_len = corrected_indent_len
.saturating_sub(
selection_start.column as usize,
);
}
let indent_char = suggested_line_indent.char();
let mut indent_buffer = [0; 4];
let indent_str =
indent_char.encode_utf8(&mut indent_buffer);
new_text.replace_range(
..line_indent,
&indent_str.repeat(corrected_indent_len),
);
}
}
if line_indent.is_some() {
let char_ops = diff.push_new(&new_text);
line_diff
.push_char_operations(&char_ops, &selected_text);
diff_tx
.send((char_ops, line_diff.line_operations()))
.await?;
new_text.clear();
}
if lines.peek().is_some() {
let char_ops = diff.push_new("\n");
line_diff
.push_char_operations(&char_ops, &selected_text);
diff_tx
.send((char_ops, line_diff.line_operations()))
.await?;
if line_indent.is_none() {
// Don't carry leading indentation from empty lines over to the next line;
// this covers the case where the `if` above didn't clear the buffer.
new_text.clear();
}
line_indent = None;
first_line = false;
}
}
}
let char_ops = diff.finish();
let mut char_ops = diff.push_new(&new_text);
char_ops.extend(diff.finish());
line_diff.push_char_operations(&char_ops, &selected_text);
line_diff.finish(&selected_text);
diff_tx
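The indentation correction in the streaming loop above re-bases each generated line onto the editor's suggested indent: the first non-whitespace column of the first line becomes the baseline, every later line keeps its delta relative to that baseline, and the very first line is additionally reduced by the selection-start column. The core arithmetic, extracted as a sketch (names and the flattened signature are illustrative):

```rust
/// Corrected indent width for a streamed line: the suggested indent plus
/// the line's delta from the base indent, clamped at zero, minus the
/// selection-start column on the very first line.
fn corrected_indent_len(
    suggested: usize,
    base_indent: usize,
    line_indent: usize,
    first_line: bool,
    selection_start_column: usize,
) -> usize {
    let delta = line_indent as i64 - base_indent as i64;
    let mut len = (suggested as i64 + delta).max(0) as usize;
    if first_line {
        // The editor already rendered this many columns before the cursor.
        len = len.saturating_sub(selection_start_column);
    }
    len
}
```

Clamping at zero matters when the model out-dents below its own first line, which would otherwise produce a negative width.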
@@ -2938,13 +2995,311 @@ fn merge_ranges(ranges: &mut Vec<Range<Anchor>>, buffer: &MultiBufferSnapshot) {
mod tests {
use super::*;
use futures::stream::{self};
use gpui::{Context, TestAppContext};
use indoc::indoc;
use language::{
language_settings, tree_sitter_rust, Buffer, Language, LanguageConfig, LanguageMatcher,
Point,
};
use language_model::LanguageModelRegistry;
use rand::prelude::*;
use serde::Serialize;
use settings::SettingsStore;
use std::{future, sync::Arc};
#[derive(Serialize)]
pub struct DummyCompletionRequest {
pub name: String,
}
#[gpui::test(iterations = 10)]
async fn test_transform_autoindent(cx: &mut TestAppContext, mut rng: StdRng) {
cx.set_global(cx.update(SettingsStore::test));
cx.update(language_model::LanguageModelRegistry::test);
cx.update(language_settings::init);
let text = indoc! {"
fn main() {
let x = 0;
for _ in 0..10 {
x += 1;
}
}
"};
let buffer =
cx.new_model(|cx| Buffer::local(text, cx).with_language(Arc::new(rust_lang()), cx));
let buffer = cx.new_model(|cx| MultiBuffer::singleton(buffer, cx));
let range = buffer.read_with(cx, |buffer, cx| {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 0))..snapshot.anchor_after(Point::new(4, 5))
});
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
range.clone(),
None,
None,
prompt_builder,
cx,
)
});
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
range,
future::ready(Ok(chunks_rx.map(|chunk| Ok(chunk)).boxed())),
cx,
)
});
let mut new_text = concat!(
" let mut x = 0;\n",
" while x < 10 {\n",
" x += 1;\n",
" }",
);
while !new_text.is_empty() {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
chunks_tx.unbounded_send(chunk.to_string()).unwrap();
new_text = suffix;
cx.background_executor.run_until_parked();
}
drop(chunks_tx);
cx.background_executor.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx).text()),
indoc! {"
fn main() {
let mut x = 0;
while x < 10 {
x += 1;
}
}
"}
);
}
#[gpui::test(iterations = 10)]
async fn test_autoindent_when_generating_past_indentation(
cx: &mut TestAppContext,
mut rng: StdRng,
) {
cx.set_global(cx.update(SettingsStore::test));
cx.update(language_settings::init);
let text = indoc! {"
fn main() {
le
}
"};
let buffer =
cx.new_model(|cx| Buffer::local(text, cx).with_language(Arc::new(rust_lang()), cx));
let buffer = cx.new_model(|cx| MultiBuffer::singleton(buffer, cx));
let range = buffer.read_with(cx, |buffer, cx| {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 6))..snapshot.anchor_after(Point::new(1, 6))
});
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
range.clone(),
None,
None,
prompt_builder,
cx,
)
});
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
range.clone(),
future::ready(Ok(chunks_rx.map(|chunk| Ok(chunk)).boxed())),
cx,
)
});
cx.background_executor.run_until_parked();
let mut new_text = concat!(
"t mut x = 0;\n",
"while x < 10 {\n",
" x += 1;\n",
"}", //
);
while !new_text.is_empty() {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
chunks_tx.unbounded_send(chunk.to_string()).unwrap();
new_text = suffix;
cx.background_executor.run_until_parked();
}
drop(chunks_tx);
cx.background_executor.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx).text()),
indoc! {"
fn main() {
let mut x = 0;
while x < 10 {
x += 1;
}
}
"}
);
}
#[gpui::test(iterations = 10)]
async fn test_autoindent_when_generating_before_indentation(
cx: &mut TestAppContext,
mut rng: StdRng,
) {
cx.update(LanguageModelRegistry::test);
cx.set_global(cx.update(SettingsStore::test));
cx.update(language_settings::init);
let text = concat!(
"fn main() {\n",
" \n",
"}\n" //
);
let buffer =
cx.new_model(|cx| Buffer::local(text, cx).with_language(Arc::new(rust_lang()), cx));
let buffer = cx.new_model(|cx| MultiBuffer::singleton(buffer, cx));
let range = buffer.read_with(cx, |buffer, cx| {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(1, 2))..snapshot.anchor_after(Point::new(1, 2))
});
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
range.clone(),
None,
None,
prompt_builder,
cx,
)
});
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
range.clone(),
future::ready(Ok(chunks_rx.map(|chunk| Ok(chunk)).boxed())),
cx,
)
});
cx.background_executor.run_until_parked();
let mut new_text = concat!(
"let mut x = 0;\n",
"while x < 10 {\n",
" x += 1;\n",
"}", //
);
while !new_text.is_empty() {
let max_len = cmp::min(new_text.len(), 10);
let len = rng.gen_range(1..=max_len);
let (chunk, suffix) = new_text.split_at(len);
chunks_tx.unbounded_send(chunk.to_string()).unwrap();
new_text = suffix;
cx.background_executor.run_until_parked();
}
drop(chunks_tx);
cx.background_executor.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx).text()),
indoc! {"
fn main() {
let mut x = 0;
while x < 10 {
x += 1;
}
}
"}
);
}
#[gpui::test(iterations = 10)]
async fn test_autoindent_respects_tabs_in_selection(cx: &mut TestAppContext) {
cx.update(LanguageModelRegistry::test);
cx.set_global(cx.update(SettingsStore::test));
cx.update(language_settings::init);
let text = indoc! {"
func main() {
\tx := 0
\tfor i := 0; i < 10; i++ {
\t\tx++
\t}
}
"};
let buffer = cx.new_model(|cx| Buffer::local(text, cx));
let buffer = cx.new_model(|cx| MultiBuffer::singleton(buffer, cx));
let range = buffer.read_with(cx, |buffer, cx| {
let snapshot = buffer.snapshot(cx);
snapshot.anchor_before(Point::new(0, 0))..snapshot.anchor_after(Point::new(4, 2))
});
let prompt_builder = Arc::new(PromptBuilder::new(None).unwrap());
let codegen = cx.new_model(|cx| {
Codegen::new(
buffer.clone(),
range.clone(),
None,
None,
prompt_builder,
cx,
)
});
let (chunks_tx, chunks_rx) = mpsc::unbounded();
codegen.update(cx, |codegen, cx| {
codegen.handle_stream(
String::new(),
range.clone(),
future::ready(Ok(chunks_rx.map(|chunk| Ok(chunk)).boxed())),
cx,
)
});
let new_text = concat!(
"func main() {\n",
"\tx := 0\n",
"\tfor x < 10 {\n",
"\t\tx++\n",
"\t}", //
);
chunks_tx.unbounded_send(new_text.to_string()).unwrap();
drop(chunks_tx);
cx.background_executor.run_until_parked();
assert_eq!(
buffer.read_with(cx, |buffer, cx| buffer.snapshot(cx).text()),
indoc! {"
func main() {
\tx := 0
\tfor x < 10 {
\t\tx++
\t}
}
"}
);
}
#[gpui::test]
async fn test_strip_invalid_spans_from_codeblock() {
assert_chunks("Lorem ipsum dolor", "Lorem ipsum dolor").await;
@@ -2984,4 +3339,27 @@ mod tests {
)
}
}
fn rust_lang() -> Language {
Language::new(
LanguageConfig {
name: "Rust".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["rs".to_string()],
..Default::default()
},
..Default::default()
},
Some(tree_sitter_rust::language()),
)
.with_indents_query(
r#"
(call_expression) @indent
(field_expression) @indent
(_ "(" ")" @end) @indent
(_ "{" "}" @end) @indent
"#,
)
.unwrap()
}
}


@@ -1,15 +1,16 @@
use feature_flags::ZedPro;
use gpui::Action;
use gpui::DismissEvent;
use language_model::{LanguageModel, LanguageModelAvailability, LanguageModelRegistry};
use proto::Plan;
use workspace::ShowConfiguration;
use std::sync::Arc;
use ui::ListItemSpacing;
use crate::assistant_settings::AssistantSettings;
use crate::ShowConfiguration;
use fs::Fs;
use gpui::Action;
use gpui::SharedString;
use gpui::Task;
use picker::{Picker, PickerDelegate};
@@ -36,7 +37,7 @@ pub struct ModelPickerDelegate {
#[derive(Clone)]
struct ModelInfo {
model: Arc<dyn LanguageModel>,
provider_icon: IconName,
icon: IconName,
availability: LanguageModelAvailability,
is_selected: bool,
}
@@ -149,6 +150,8 @@ impl PickerDelegate for ModelPickerDelegate {
use feature_flags::FeatureFlagAppExt;
let model_info = self.filtered_models.get(ix)?;
let show_badges = cx.has_flag::<ZedPro>();
let provider_name: String = model_info.model.provider_name().0.into();
Some(
ListItem::new(ix)
.inset(true)
@@ -156,7 +159,7 @@ impl PickerDelegate for ModelPickerDelegate {
.selected(selected)
.start_slot(
div().pr_1().child(
Icon::new(model_info.provider_icon)
Icon::new(model_info.icon)
.color(Color::Muted)
.size(IconSize::Medium),
),
@@ -166,11 +169,16 @@ impl PickerDelegate for ModelPickerDelegate {
.w_full()
.justify_between()
.font_buffer(cx)
.min_w(px(200.))
.min_w(px(240.))
.child(
h_flex()
.gap_2()
.child(Label::new(model_info.model.name().0.clone()))
.child(
Label::new(provider_name)
.size(LabelSize::XSmall)
.color(Color::Muted),
)
.children(match model_info.availability {
LanguageModelAvailability::Public => None,
LanguageModelAvailability::RequiresPlan(Plan::Free) => None,
@@ -261,16 +269,17 @@ impl<T: PopoverTrigger> RenderOnce for ModelSelector<T> {
.iter()
.flat_map(|provider| {
let provider_id = provider.id();
let provider_icon = provider.icon();
let icon = provider.icon();
let selected_model = selected_model.clone();
let selected_provider = selected_provider.clone();
provider.provided_models(cx).into_iter().map(move |model| {
let model = model.clone();
let icon = model.icon().unwrap_or(icon);
ModelInfo {
model: model.clone(),
provider_icon,
icon,
availability: model.availability(),
is_selected: selected_model.as_ref() == Some(&model.id())
&& selected_provider.as_ref() == Some(&provider_id),

View File

@@ -1,26 +1,24 @@
use anyhow::Result;
use assets::Assets;
use fs::Fs;
use futures::StreamExt;
use handlebars::{Handlebars, RenderError, TemplateError};
use gpui::AssetSource;
use handlebars::{Handlebars, RenderError};
use language::BufferSnapshot;
use parking_lot::Mutex;
use serde::Serialize;
use std::{ops::Range, sync::Arc, time::Duration};
use std::{ops::Range, path::PathBuf, sync::Arc, time::Duration};
use util::ResultExt;
#[derive(Serialize)]
pub struct ContentPromptContext {
pub content_type: String,
pub language_name: Option<String>,
pub is_insert: bool,
pub is_truncated: bool,
pub document_content: String,
pub user_prompt: String,
pub rewrite_section: String,
pub rewrite_section_prefix: String,
pub rewrite_section_suffix: String,
pub rewrite_section_with_edits: String,
pub has_insertion: bool,
pub has_replacement: bool,
pub rewrite_section: Option<String>,
}
#[derive(Serialize)]
@@ -42,128 +40,162 @@ pub struct StepResolutionContext {
pub step_to_resolve: String,
}
pub struct PromptLoadingParams<'a> {
pub fs: Arc<dyn Fs>,
pub repo_path: Option<PathBuf>,
pub cx: &'a gpui::AppContext,
}
pub struct PromptBuilder {
handlebars: Arc<Mutex<Handlebars<'static>>>,
}
pub struct PromptOverrideContext<'a> {
pub dev_mode: bool,
pub fs: Arc<dyn Fs>,
pub cx: &'a mut gpui::AppContext,
}
impl PromptBuilder {
pub fn new(override_cx: Option<PromptOverrideContext>) -> Result<Self, Box<TemplateError>> {
pub fn new(loading_params: Option<PromptLoadingParams>) -> Result<Self> {
let mut handlebars = Handlebars::new();
Self::register_templates(&mut handlebars)?;
Self::register_built_in_templates(&mut handlebars)?;
let handlebars = Arc::new(Mutex::new(handlebars));
if let Some(override_cx) = override_cx {
Self::watch_fs_for_template_overrides(override_cx, handlebars.clone());
if let Some(params) = loading_params {
Self::watch_fs_for_template_overrides(params, handlebars.clone());
}
Ok(Self { handlebars })
}
/// Watches the filesystem for changes to prompt template overrides.
///
/// This function sets up a file watcher on the prompt templates directory. It performs
/// an initial scan of the directory and registers any existing template overrides.
/// Then it continuously monitors for changes, reloading templates as they are
/// modified or added.
///
/// If the templates directory doesn't exist initially, it waits for it to be created.
/// If the directory is removed, it restores the built-in templates and waits for the
/// directory to be recreated.
///
/// # Arguments
///
/// * `params` - A `PromptLoadingParams` struct containing the filesystem, repository path,
/// and application context.
/// * `handlebars` - An `Arc<Mutex<Handlebars>>` for registering and updating templates.
fn watch_fs_for_template_overrides(
PromptOverrideContext { dev_mode, fs, cx }: PromptOverrideContext,
mut params: PromptLoadingParams,
handlebars: Arc<Mutex<Handlebars<'static>>>,
) {
cx.background_executor()
params.repo_path = None;
let templates_dir = paths::prompt_overrides_dir(params.repo_path.as_deref());
params.cx.background_executor()
.spawn(async move {
let templates_dir = if dev_mode {
std::env::current_dir()
.ok()
.and_then(|pwd| {
let pwd_assets_prompts = pwd.join("assets").join("prompts");
pwd_assets_prompts.exists().then_some(pwd_assets_prompts)
})
.unwrap_or_else(|| paths::prompt_overrides_dir().clone())
} else {
paths::prompt_overrides_dir().clone()
let Some(parent_dir) = templates_dir.parent() else {
return;
};
// Create the prompt templates directory if it doesn't exist
if !fs.is_dir(&templates_dir).await {
if let Err(e) = fs.create_dir(&templates_dir).await {
log::error!("Failed to create prompt templates directory: {}", e);
return;
let mut found_dir_once = false;
loop {
// Check if the templates directory exists and handle its status
// If it exists, log its presence and check if it's a symlink
// If it doesn't exist:
// - Log that we're using built-in prompts
// - Check if it's a broken symlink and log if so
// - Set up a watcher to detect when it's created
// After the first check, set the `found_dir_once` flag
// This allows us to avoid logging when looping back around after deleting the prompt overrides directory.
let dir_status = params.fs.is_dir(&templates_dir).await;
let symlink_status = params.fs.read_link(&templates_dir).await.ok();
if dir_status {
let mut log_message = format!("Prompt template overrides directory found at {}", templates_dir.display());
if let Some(target) = symlink_status {
log_message.push_str(" -> ");
log_message.push_str(&target.display().to_string());
}
log::info!("{}.", log_message);
} else {
if !found_dir_once {
log::info!("No prompt template overrides directory found at {}. Using built-in prompts.", templates_dir.display());
if let Some(target) = symlink_status {
log::info!("Symlink found pointing to {}, but target is invalid.", target.display());
}
}
if params.fs.is_dir(parent_dir).await {
let (mut changes, _watcher) = params.fs.watch(parent_dir, Duration::from_secs(1)).await;
while let Some(changed_paths) = changes.next().await {
if changed_paths.iter().any(|p| p == &templates_dir) {
let mut log_message = format!("Prompt template overrides directory detected at {}", templates_dir.display());
if let Ok(target) = params.fs.read_link(&templates_dir).await {
log_message.push_str(" -> ");
log_message.push_str(&target.display().to_string());
}
log::info!("{}.", log_message);
break;
}
}
} else {
return;
}
}
}
// Initial scan of the prompts directory
if let Ok(mut entries) = fs.read_dir(&templates_dir).await {
while let Some(Ok(file_path)) = entries.next().await {
if file_path.to_string_lossy().ends_with(".hbs") {
if let Ok(content) = fs.load(&file_path).await {
let file_name = file_path.file_stem().unwrap().to_string_lossy();
found_dir_once = true;
match handlebars.lock().register_template_string(&file_name, content) {
Ok(_) => {
log::info!(
"Successfully registered template override: {} ({})",
file_name,
file_path.display()
);
},
Err(e) => {
log::error!(
"Failed to register template during initial scan: {} ({})",
e,
file_path.display()
);
},
// Initial scan of the prompt overrides directory
if let Ok(mut entries) = params.fs.read_dir(&templates_dir).await {
while let Some(Ok(file_path)) = entries.next().await {
if file_path.to_string_lossy().ends_with(".hbs") {
if let Ok(content) = params.fs.load(&file_path).await {
let file_name = file_path.file_stem().unwrap().to_string_lossy();
log::info!("Registering prompt template override: {}", file_name);
handlebars.lock().register_template_string(&file_name, content).log_err();
}
}
}
}
}
// Watch for changes
let (mut changes, watcher) = fs.watch(&templates_dir, Duration::from_secs(1)).await;
while let Some(changed_paths) = changes.next().await {
for changed_path in changed_paths {
if changed_path.extension().map_or(false, |ext| ext == "hbs") {
log::info!("Reloading template: {}", changed_path.display());
if let Some(content) = fs.load(&changed_path).await.log_err() {
let file_name = changed_path.file_stem().unwrap().to_string_lossy();
let file_path = changed_path.to_string_lossy();
match handlebars.lock().register_template_string(&file_name, content) {
Ok(_) => log::info!(
"Successfully reloaded template: {} ({})",
file_name,
file_path
),
Err(e) => log::error!(
"Failed to register template: {} ({})",
e,
file_path
),
// Watch both the parent directory and the template overrides directory:
// - Monitor the parent directory to detect if the template overrides directory is deleted.
// - Monitor the template overrides directory to re-register templates when they change.
// Combine both watch streams into a single stream.
let (parent_changes, parent_watcher) = params.fs.watch(parent_dir, Duration::from_secs(1)).await;
let (changes, watcher) = params.fs.watch(&templates_dir, Duration::from_secs(1)).await;
let mut combined_changes = futures::stream::select(changes, parent_changes);
while let Some(changed_paths) = combined_changes.next().await {
if changed_paths.iter().any(|p| p == &templates_dir) {
if !params.fs.is_dir(&templates_dir).await {
log::info!("Prompt template overrides directory removed. Restoring built-in prompt templates.");
Self::register_built_in_templates(&mut handlebars.lock()).log_err();
break;
}
}
for changed_path in changed_paths {
if changed_path.starts_with(&templates_dir) && changed_path.extension().map_or(false, |ext| ext == "hbs") {
log::info!("Reloading prompt template override: {}", changed_path.display());
if let Some(content) = params.fs.load(&changed_path).await.log_err() {
let file_name = changed_path.file_stem().unwrap().to_string_lossy();
handlebars.lock().register_template_string(&file_name, content).log_err();
}
}
}
}
drop(watcher);
drop(parent_watcher);
}
drop(watcher);
})
.detach();
}
fn register_templates(handlebars: &mut Handlebars) -> Result<(), Box<TemplateError>> {
let mut register_template = |id: &str| {
let prompt = Assets::get(&format!("prompts/{}.hbs", id))
.unwrap_or_else(|| panic!("{} prompt template not found", id))
.data;
handlebars
.register_template_string(id, String::from_utf8_lossy(&prompt))
.map_err(Box::new)
};
register_template("content_prompt")?;
register_template("terminal_assistant_prompt")?;
register_template("edit_workflow")?;
register_template("step_resolution")?;
fn register_built_in_templates(handlebars: &mut Handlebars) -> Result<()> {
for path in Assets.list("prompts")? {
if let Some(id) = path.split('/').last().and_then(|s| s.strip_suffix(".hbs")) {
if let Some(prompt) = Assets.load(path.as_ref()).log_err().flatten() {
log::info!("Registering built-in prompt template: {}", id);
handlebars
.register_template_string(id, String::from_utf8_lossy(prompt.as_ref()))?
}
}
}
Ok(())
}
@@ -173,9 +205,7 @@ impl PromptBuilder {
user_prompt: String,
language_name: Option<&str>,
buffer: BufferSnapshot,
transform_range: Range<usize>,
selected_ranges: Vec<Range<usize>>,
transform_context_range: Range<usize>,
range: Range<usize>,
) -> Result<String, RenderError> {
let content_type = match language_name {
None | Some("Markdown" | "Plain Text") => "text",
@@ -183,20 +213,21 @@ impl PromptBuilder {
};
const MAX_CTX: usize = 50000;
let is_insert = range.is_empty();
let mut is_truncated = false;
let before_range = 0..transform_range.start;
let before_range = 0..range.start;
let truncated_before = if before_range.len() > MAX_CTX {
is_truncated = true;
transform_range.start - MAX_CTX..transform_range.start
range.start - MAX_CTX..range.start
} else {
before_range
};
let after_range = transform_range.end..buffer.len();
let after_range = range.end..buffer.len();
let truncated_after = if after_range.len() > MAX_CTX {
is_truncated = true;
transform_range.end..transform_range.end + MAX_CTX
range.end..range.end + MAX_CTX
} else {
after_range
};
@@ -205,74 +236,37 @@ impl PromptBuilder {
for chunk in buffer.text_for_range(truncated_before) {
document_content.push_str(chunk);
}
document_content.push_str("<rewrite_this>\n");
for chunk in buffer.text_for_range(transform_range.clone()) {
document_content.push_str(chunk);
if is_insert {
document_content.push_str("<insert_here></insert_here>");
} else {
document_content.push_str("<rewrite_this>\n");
for chunk in buffer.text_for_range(range.clone()) {
document_content.push_str(chunk);
}
document_content.push_str("\n</rewrite_this>");
}
document_content.push_str("\n</rewrite_this>");
for chunk in buffer.text_for_range(truncated_after) {
document_content.push_str(chunk);
}
let mut rewrite_section = String::new();
for chunk in buffer.text_for_range(transform_range.clone()) {
rewrite_section.push_str(chunk);
}
let mut rewrite_section_prefix = String::new();
for chunk in buffer.text_for_range(transform_context_range.start..transform_range.start) {
rewrite_section_prefix.push_str(chunk);
}
let mut rewrite_section_suffix = String::new();
for chunk in buffer.text_for_range(transform_range.end..transform_context_range.end) {
rewrite_section_suffix.push_str(chunk);
}
let rewrite_section_with_edits = {
let mut section_with_selections = String::new();
let mut last_end = 0;
for selected_range in &selected_ranges {
if selected_range.start > last_end {
section_with_selections.push_str(
&rewrite_section[last_end..selected_range.start - transform_range.start],
);
}
if selected_range.start == selected_range.end {
section_with_selections.push_str("<insert_here></insert_here>");
} else {
section_with_selections.push_str("<edit_here>");
section_with_selections.push_str(
&rewrite_section[selected_range.start - transform_range.start
..selected_range.end - transform_range.start],
);
section_with_selections.push_str("</edit_here>");
}
last_end = selected_range.end - transform_range.start;
let rewrite_section = if !is_insert {
let mut section = String::new();
for chunk in buffer.text_for_range(range.clone()) {
section.push_str(chunk);
}
if last_end < rewrite_section.len() {
section_with_selections.push_str(&rewrite_section[last_end..]);
}
section_with_selections
Some(section)
} else {
None
};
let has_insertion = selected_ranges.iter().any(|range| range.start == range.end);
let has_replacement = selected_ranges.iter().any(|range| range.start != range.end);
let context = ContentPromptContext {
content_type: content_type.to_string(),
language_name: language_name.map(|s| s.to_string()),
is_insert,
is_truncated,
document_content,
user_prompt,
rewrite_section,
rewrite_section_prefix,
rewrite_section_suffix,
rewrite_section_with_edits,
has_insertion,
has_replacement,
};
self.handlebars.lock().render("content_prompt", &context)
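The marker placement above (an empty range becomes an `<insert_here>` marker, a non-empty range is wrapped in `<rewrite_this>` tags) can be sketched as a plain string-building function. This is a simplified illustration: the real code streams chunks out of a `BufferSnapshot` and also truncates the surrounding context.

```rust
use std::ops::Range;

// Build the prompt document: text before the range, the marked section,
// then text after. An empty range becomes an insertion marker; a
// non-empty range is wrapped in rewrite tags.
fn build_document(text: &str, range: Range<usize>) -> String {
    let mut out = String::new();
    out.push_str(&text[..range.start]);
    if range.is_empty() {
        out.push_str("<insert_here></insert_here>");
    } else {
        out.push_str("<rewrite_this>\n");
        out.push_str(&text[range.clone()]);
        out.push_str("\n</rewrite_this>");
    }
    out.push_str(&text[range.end..]);
    out
}

fn main() {
    assert_eq!(
        build_document("abcdef", 2..4),
        "ab<rewrite_this>\ncd\n</rewrite_this>ef"
    );
    assert_eq!(build_document("abc", 1..1), "a<insert_here></insert_here>bc");
}
```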

View File

@@ -124,6 +124,7 @@ impl SlashCommandCompletionProvider {
&command_name,
&[],
true,
false,
workspace.clone(),
cx,
);
@@ -208,6 +209,7 @@ impl SlashCommandCompletionProvider {
&command_name,
&completed_arguments,
true,
false,
workspace.clone(),
cx,
);

View File

@@ -67,7 +67,11 @@ impl SlashCommand for ContextServerSlashCommand {
) -> Task<Result<SlashCommandOutput>> {
let server_id = self.server_id.clone();
let prompt_name = self.prompt.name.clone();
let argument = arguments.first().cloned();
let prompt_args = match prompt_arguments(&self.prompt, arguments) {
Ok(args) => args,
Err(e) => return Task::ready(Err(e)),
};
let manager = ContextServerManager::global(cx);
let manager = manager.read(cx);
@@ -76,10 +80,7 @@ impl SlashCommand for ContextServerSlashCommand {
let Some(protocol) = server.client.read().clone() else {
return Err(anyhow!("Context server not initialized"));
};
let result = protocol
.run_prompt(&prompt_name, prompt_arguments(&self.prompt, argument)?)
.await?;
let result = protocol.run_prompt(&prompt_name, prompt_args).await?;
Ok(SlashCommandOutput {
sections: vec![SlashCommandOutputSection {
@@ -97,19 +98,27 @@ impl SlashCommand for ContextServerSlashCommand {
}
}
fn prompt_arguments(
prompt: &PromptInfo,
argument: Option<String>,
) -> Result<HashMap<String, String>> {
fn prompt_arguments(prompt: &PromptInfo, arguments: &[String]) -> Result<HashMap<String, String>> {
match &prompt.arguments {
Some(args) if args.len() >= 2 => Err(anyhow!(
Some(args) if args.len() > 1 => Err(anyhow!(
"Prompt has more than one argument, which is not supported"
)),
Some(args) if args.len() == 1 => match argument {
Some(value) => Ok(HashMap::from_iter([(args[0].name.clone(), value)])),
None => Err(anyhow!("Prompt expects argument but none given")),
},
Some(_) | None => Ok(HashMap::default()),
Some(args) if args.len() == 1 => {
if !arguments.is_empty() {
let mut map = HashMap::default();
map.insert(args[0].name.clone(), arguments.join(" "));
Ok(map)
} else {
Err(anyhow!("Prompt expects argument but none given"))
}
}
Some(_) | None => {
if arguments.is_empty() {
Ok(HashMap::default())
} else {
Err(anyhow!("Prompt expects no arguments but some were given"))
}
}
}
}
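The argument-mapping rules above can be exercised as a standalone sketch (names here are hypothetical; the real function operates on `PromptInfo` and returns `anyhow::Result`): at most one declared argument is supported, and all provided words are joined into its value.

```rust
use std::collections::HashMap;

// Sketch of the prompt-argument rules: more than one declared argument is
// unsupported; one declared argument consumes all provided words; zero
// declared arguments tolerates only an empty argument list.
fn map_prompt_arguments(
    declared: &[&str],
    provided: &[&str],
) -> Result<HashMap<String, String>, String> {
    match declared.len() {
        n if n > 1 => Err("Prompt has more than one argument, which is not supported".into()),
        1 => {
            if provided.is_empty() {
                Err("Prompt expects argument but none given".into())
            } else {
                let mut map = HashMap::new();
                map.insert(declared[0].to_string(), provided.join(" "));
                Ok(map)
            }
        }
        _ => {
            if provided.is_empty() {
                Ok(HashMap::new())
            } else {
                Err("Prompt expects no arguments but some were given".into())
            }
        }
    }
}

fn main() {
    let args = map_prompt_arguments(&["query"], &["rust", "async"]).unwrap();
    assert_eq!(args.get("query").map(String::as_str), Some("rust async"));
    assert!(map_prompt_arguments(&["a", "b"], &[]).is_err());
    assert!(map_prompt_arguments(&[], &["extra"]).is_err());
}
```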

View File

@@ -0,0 +1,306 @@
use std::sync::Arc;
use assistant_slash_command::SlashCommandRegistry;
use gpui::AnyElement;
use gpui::DismissEvent;
use gpui::WeakView;
use picker::PickerEditorPosition;
use ui::ListItemSpacing;
use gpui::SharedString;
use gpui::Task;
use picker::{Picker, PickerDelegate};
use ui::{prelude::*, ListItem, PopoverMenu, PopoverTrigger};
use crate::assistant_panel::ContextEditor;
#[derive(IntoElement)]
pub(super) struct SlashCommandSelector<T: PopoverTrigger> {
registry: Arc<SlashCommandRegistry>,
active_context_editor: WeakView<ContextEditor>,
trigger: T,
}
#[derive(Clone)]
struct SlashCommandInfo {
name: SharedString,
description: SharedString,
args: Option<SharedString>,
}
#[derive(Clone)]
enum SlashCommandEntry {
Info(SlashCommandInfo),
Advert {
name: SharedString,
renderer: fn(&mut WindowContext<'_>) -> AnyElement,
on_confirm: fn(&mut WindowContext<'_>),
},
}
impl AsRef<str> for SlashCommandEntry {
fn as_ref(&self) -> &str {
match self {
SlashCommandEntry::Info(SlashCommandInfo { name, .. })
| SlashCommandEntry::Advert { name, .. } => name,
}
}
}
pub(crate) struct SlashCommandDelegate {
all_commands: Vec<SlashCommandEntry>,
filtered_commands: Vec<SlashCommandEntry>,
active_context_editor: WeakView<ContextEditor>,
selected_index: usize,
}
impl<T: PopoverTrigger> SlashCommandSelector<T> {
pub(crate) fn new(
registry: Arc<SlashCommandRegistry>,
active_context_editor: WeakView<ContextEditor>,
trigger: T,
) -> Self {
SlashCommandSelector {
registry,
active_context_editor,
trigger,
}
}
}
impl PickerDelegate for SlashCommandDelegate {
type ListItem = ListItem;
fn match_count(&self) -> usize {
self.filtered_commands.len()
}
fn selected_index(&self) -> usize {
self.selected_index
}
fn set_selected_index(&mut self, ix: usize, cx: &mut ViewContext<Picker<Self>>) {
self.selected_index = ix.min(self.filtered_commands.len().saturating_sub(1));
cx.notify();
}
fn placeholder_text(&self, _cx: &mut WindowContext) -> Arc<str> {
"Select a command...".into()
}
fn update_matches(&mut self, query: String, cx: &mut ViewContext<Picker<Self>>) -> Task<()> {
let all_commands = self.all_commands.clone();
cx.spawn(|this, mut cx| async move {
let filtered_commands = cx
.background_executor()
.spawn(async move {
if query.is_empty() {
all_commands
} else {
all_commands
.into_iter()
.filter(|command| {
command
.as_ref()
.to_lowercase()
.contains(&query.to_lowercase())
})
.collect()
}
})
.await;
this.update(&mut cx, |this, cx| {
this.delegate.filtered_commands = filtered_commands;
this.delegate.set_selected_index(0, cx);
cx.notify();
})
.ok();
})
}
fn separators_after_indices(&self) -> Vec<usize> {
let mut ret = vec![];
let mut previous_is_advert = false;
for (index, command) in self.filtered_commands.iter().enumerate() {
if previous_is_advert {
if let SlashCommandEntry::Info(_) = command {
previous_is_advert = false;
debug_assert_ne!(
index, 0,
"index cannot be zero, as we can never have a separator at 0th position"
);
ret.push(index - 1);
}
} else {
if let SlashCommandEntry::Advert { .. } = command {
previous_is_advert = true;
if index != 0 {
ret.push(index - 1);
}
}
}
}
ret
}
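The separator computation above can be tested in isolation. This sketch substitutes a boolean `is_advert` flag for `SlashCommandEntry`, but keeps the same state machine: a separator goes before the first advert of a run and after the last advert of a run.

```rust
// Return the indices after which a separator should be drawn, given
// which entries in the list are adverts.
fn separators_after_indices(is_advert: &[bool]) -> Vec<usize> {
    let mut ret = vec![];
    let mut previous_is_advert = false;
    for (index, &advert) in is_advert.iter().enumerate() {
        if previous_is_advert {
            if !advert {
                previous_is_advert = false;
                debug_assert_ne!(index, 0, "a separator can never precede position 0");
                ret.push(index - 1);
            }
        } else if advert {
            previous_is_advert = true;
            if index != 0 {
                ret.push(index - 1);
            }
        }
    }
    ret
}

fn main() {
    // info, info, advert, info → separators after index 1 (before the
    // advert) and after index 2 (after the advert).
    assert_eq!(separators_after_indices(&[false, false, true, false]), vec![1, 2]);
    // An advert at position 0 only gets a trailing separator.
    assert_eq!(separators_after_indices(&[true, false]), vec![0]);
}
```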
fn confirm(&mut self, _secondary: bool, cx: &mut ViewContext<Picker<Self>>) {
if let Some(command) = self.filtered_commands.get(self.selected_index) {
if let SlashCommandEntry::Info(info) = command {
self.active_context_editor
.update(cx, |context_editor, cx| {
context_editor.insert_command(&info.name, cx)
})
.ok();
} else if let SlashCommandEntry::Advert { on_confirm, .. } = command {
on_confirm(cx);
}
cx.emit(DismissEvent);
}
}
fn dismissed(&mut self, _cx: &mut ViewContext<Picker<Self>>) {}
fn editor_position(&self) -> PickerEditorPosition {
PickerEditorPosition::End
}
fn render_match(
&self,
ix: usize,
selected: bool,
cx: &mut ViewContext<Picker<Self>>,
) -> Option<Self::ListItem> {
let command_info = self.filtered_commands.get(ix)?;
match command_info {
SlashCommandEntry::Info(info) => Some(
ListItem::new(ix)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.selected(selected)
.child(
h_flex()
.group(format!("command-entry-label-{ix}"))
.w_full()
.min_w(px(220.))
.child(
v_flex()
.child(
h_flex()
.child(div().font_buffer(cx).child({
let mut label = format!("/{}", info.name);
if let Some(args) =
info.args.as_ref().filter(|_| selected)
{
label.push_str(&args);
}
Label::new(label).size(LabelSize::Small)
}))
.children(info.args.clone().filter(|_| !selected).map(
|args| {
div()
.font_buffer(cx)
.child(
Label::new(args).size(LabelSize::Small),
)
.visible_on_hover(format!(
"command-entry-label-{ix}"
))
},
)),
)
.child(
Label::new(info.description.clone())
.size(LabelSize::Small)
.color(Color::Muted),
),
),
),
),
SlashCommandEntry::Advert { renderer, .. } => Some(
ListItem::new(ix)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.selected(selected)
.child(renderer(cx)),
),
}
}
}
impl<T: PopoverTrigger> RenderOnce for SlashCommandSelector<T> {
fn render(self, cx: &mut WindowContext) -> impl IntoElement {
let all_models = self
.registry
.featured_command_names()
.into_iter()
.filter_map(|command_name| {
let command = self.registry.command(&command_name)?;
let menu_text = SharedString::from(Arc::from(command.menu_text()));
let label = command.label(cx);
let args = label.filter_range.end.ne(&label.text.len()).then(|| {
SharedString::from(
label.text[label.filter_range.end..label.text.len()].to_owned(),
)
});
Some(SlashCommandEntry::Info(SlashCommandInfo {
name: command_name.into(),
description: menu_text,
args,
}))
})
.chain([SlashCommandEntry::Advert {
name: "create-your-command".into(),
renderer: |cx| {
v_flex()
.child(
h_flex()
.font_buffer(cx)
.items_center()
.gap_1()
.child(div().font_buffer(cx).child(
Label::new("create-your-command").size(LabelSize::Small),
))
.child(Icon::new(IconName::ArrowUpRight).size(IconSize::XSmall)),
)
.child(
Label::new("Learn how to create a custom command")
.size(LabelSize::Small)
.color(Color::Muted),
)
.into_any_element()
},
on_confirm: |cx| cx.open_url("https://zed.dev/docs/extensions/slash-commands"),
}])
.collect::<Vec<_>>();
let delegate = SlashCommandDelegate {
all_commands: all_models.clone(),
active_context_editor: self.active_context_editor.clone(),
filtered_commands: all_models,
selected_index: 0,
};
let picker_view = cx.new_view(|cx| {
let picker = Picker::uniform_list(delegate, cx).max_height(Some(rems(20.).into()));
picker
});
let handle = self
.active_context_editor
.update(cx, |this, _| this.slash_menu_handle.clone())
.ok();
PopoverMenu::new("model-switcher")
.menu(move |_cx| Some(picker_view.clone()))
.trigger(self.trigger)
.attach(gpui::AnchorCorner::TopLeft)
.anchor(gpui::AnchorCorner::BottomLeft)
.offset(gpui::Point {
x: px(0.0),
y: px(-16.0),
})
.when_some(handle, |this, handle| this.with_handle(handle))
}
}

View File

@@ -1,42 +0,0 @@
## Assistant Panel
Once you have configured a provider, you can interact with the provider's language models in a context editor.
To create a new context editor, use the menu in the top right of the assistant panel and select the `New Context` option.
In the context editor, select a model from one of the configured providers, type a message in the `You` block, and submit with `cmd-enter` (or `ctrl-enter` on Linux).
### Adding Prompts
You can customize the default prompts used in new context editors by opening the `Prompt Library`.
Open the `Prompt Library` either by using the menu in the top right of the assistant panel and choosing the `Prompt Library` option, or by running the `assistant: deploy prompt library` command while the assistant panel is focused.
### Viewing past contexts
You can view all previous contexts by opening the `History` tab in the assistant panel.
Open the `History` from the menu in the top right of the assistant panel by choosing `History`.
### Slash commands
Slash commands enhance the assistant's capabilities. Type a `/` at the beginning of a line to see a list of available commands:
- default: Inserts the default prompt into the context
- diagnostics: Injects errors reported by the project's language server into the context
- fetch: Pulls the content of a webpage and inserts it into the context
- file: Pulls a single file or a directory of files into the context
- now: Inserts the current date and time into the context
- prompt: Adds a custom-configured prompt to the context (see Prompt Library)
- search: Performs semantic search for content in your project based on natural language
- symbols: Pulls the current tab's active symbols into the context
- tab: Pulls the content of the active tab or all open tabs into the context
- terminal: Pulls in a select number of lines of output from the terminal
## Inline assistant
You can use `ctrl-enter` to open the inline assistant in both a normal editor and within the assistant panel.
The inline assistant allows you to send the current selection (or the current line) to a language model and modify the selection with the language model's response.
The inline assistant pulls its context from the assistant panel, allowing you to provide additional instructions or rules for code transformations.

View File

@@ -23,6 +23,8 @@ use workspace::Workspace;
pub use step_view::WorkflowStepView;
const IMPORTS_SYMBOL: &str = "#imports";
pub struct WorkflowStep {
context: WeakModel<Context>,
context_buffer_range: Range<Anchor>,
@@ -467,7 +469,7 @@ pub mod tool {
use super::*;
use anyhow::Context as _;
use gpui::AsyncAppContext;
use language::ParseStatus;
use language::{Outline, OutlineItem, ParseStatus};
use language_model::LanguageModelTool;
use project::ProjectPath;
use schemars::JsonSchema;
@@ -562,10 +564,7 @@ pub mod tool {
symbol,
description,
} => {
let (symbol_path, symbol) = outline
.find_most_similar(&symbol)
.with_context(|| format!("symbol not found: {:?}", symbol))?;
let symbol = symbol.to_point(&snapshot);
let (symbol_path, symbol) = Self::resolve_symbol(&snapshot, &outline, &symbol)?;
let start = symbol
.annotation_range
.map_or(symbol.range.start, |range| range.start);
@@ -588,10 +587,7 @@ pub mod tool {
symbol,
description,
} => {
let (symbol_path, symbol) = outline
.find_most_similar(&symbol)
.with_context(|| format!("symbol not found: {:?}", symbol))?;
let symbol = symbol.to_point(&snapshot);
let (symbol_path, symbol) = Self::resolve_symbol(&snapshot, &outline, &symbol)?;
let position = snapshot.anchor_before(
symbol
.annotation_range
@@ -609,10 +605,7 @@ pub mod tool {
symbol,
description,
} => {
let (symbol_path, symbol) = outline
.find_most_similar(&symbol)
.with_context(|| format!("symbol not found: {:?}", symbol))?;
let symbol = symbol.to_point(&snapshot);
let (symbol_path, symbol) = Self::resolve_symbol(&snapshot, &outline, &symbol)?;
let position = snapshot.anchor_after(symbol.range.end);
WorkflowSuggestion::InsertSiblingAfter {
position,
@@ -625,10 +618,8 @@ pub mod tool {
description,
} => {
if let Some(symbol) = symbol {
let (symbol_path, symbol) = outline
.find_most_similar(&symbol)
.with_context(|| format!("symbol not found: {:?}", symbol))?;
let symbol = symbol.to_point(&snapshot);
let (symbol_path, symbol) =
Self::resolve_symbol(&snapshot, &outline, &symbol)?;
let position = snapshot.anchor_after(
symbol
@@ -653,10 +644,8 @@ pub mod tool {
description,
} => {
if let Some(symbol) = symbol {
let (symbol_path, symbol) = outline
.find_most_similar(&symbol)
.with_context(|| format!("symbol not found: {:?}", symbol))?;
let symbol = symbol.to_point(&snapshot);
let (symbol_path, symbol) =
Self::resolve_symbol(&snapshot, &outline, &symbol)?;
let position = snapshot.anchor_before(
symbol
@@ -677,10 +666,7 @@ pub mod tool {
}
}
WorkflowSuggestionToolKind::Delete { symbol } => {
let (symbol_path, symbol) = outline
.find_most_similar(&symbol)
.with_context(|| format!("symbol not found: {:?}", symbol))?;
let symbol = symbol.to_point(&snapshot);
let (symbol_path, symbol) = Self::resolve_symbol(&snapshot, &outline, &symbol)?;
let start = symbol
.annotation_range
.map_or(symbol.range.start, |range| range.start);
@@ -696,6 +682,60 @@ pub mod tool {
Ok((buffer, suggestion))
}
fn resolve_symbol(
snapshot: &BufferSnapshot,
outline: &Outline<Anchor>,
symbol: &str,
) -> Result<(SymbolPath, OutlineItem<Point>)> {
if symbol == IMPORTS_SYMBOL {
let target_row = find_first_non_comment_line(snapshot);
Ok((
SymbolPath(IMPORTS_SYMBOL.to_string()),
OutlineItem {
range: Point::new(target_row, 0)..Point::new(target_row + 1, 0),
..Default::default()
},
))
} else {
let (symbol_path, symbol) = outline
.find_most_similar(symbol)
.with_context(|| format!("symbol not found: {symbol}"))?;
Ok((symbol_path, symbol.to_point(snapshot)))
}
}
}
fn find_first_non_comment_line(snapshot: &BufferSnapshot) -> u32 {
let Some(language) = snapshot.language() else {
return 0;
};
let scope = language.default_scope();
let comment_prefixes = scope.line_comment_prefixes();
let mut chunks = snapshot.as_rope().chunks();
let mut target_row = 0;
loop {
let starts_with_comment = chunks
.peek()
.map(|chunk| {
comment_prefixes
.iter()
.any(|s| chunk.starts_with(s.as_ref().trim_end()))
})
.unwrap_or(false);
if !starts_with_comment {
break;
}
target_row += 1;
if !chunks.next_line() {
break;
}
}
target_row
}
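The scan above can be approximated on a plain string. This is a simplified sketch: the real function walks the buffer's rope chunk by chunk and pulls the comment prefixes from the language's scope, while here the prefixes are passed in directly.

```rust
// Count leading lines that start with any of the given comment prefixes;
// the returned row is the first line that is not a comment. Prefixes are
// trimmed on the right so "// " also matches lines like "//!".
fn first_non_comment_row(text: &str, comment_prefixes: &[&str]) -> u32 {
    let mut row = 0;
    for line in text.lines() {
        let starts_with_comment = comment_prefixes
            .iter()
            .any(|prefix| line.starts_with(prefix.trim_end()));
        if !starts_with_comment {
            break;
        }
        row += 1;
    }
    row
}

fn main() {
    let src = "// copyright\n// license\nuse std::fmt;\n";
    assert_eq!(first_non_comment_row(src, &["// "]), 2);
    assert_eq!(first_non_comment_row("fn main() {}\n", &["// "]), 0);
}
```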
#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize, JsonSchema)]

View File

@@ -273,7 +273,7 @@ impl Item for WorkflowStepView {
}
fn tab_icon(&self, _cx: &WindowContext) -> Option<ui::Icon> {
Some(Icon::new(IconName::Pencil))
Some(Icon::new(IconName::SearchCode))
}
fn to_item_events(event: &Self::Event, mut f: impl FnMut(item::ItemEvent)) {

View File

@@ -139,6 +139,11 @@ spec:
secretKeyRef:
name: anthropic
key: staff_api_key
- name: LLM_CLOSED_BETA_MODEL_NAME
valueFrom:
secretKeyRef:
name: llm-closed-beta
key: model_name
- name: GOOGLE_AI_API_KEY
valueFrom:
secretKeyRef:

View File

@@ -295,7 +295,8 @@ CREATE UNIQUE INDEX "index_channel_buffer_collaborators_on_channel_id_connection
CREATE TABLE "feature_flags" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"flag" TEXT NOT NULL UNIQUE
"flag" TEXT NOT NULL UNIQUE,
"enabled_for_all" BOOLEAN NOT NULL DEFAULT false
);
CREATE INDEX "index_feature_flags" ON "feature_flags" ("id");

View File

@@ -0,0 +1 @@
alter table feature_flags add column enabled_for_all boolean not null default false;

View File

@@ -1,4 +1,4 @@
db-uri = "postgres://postgres@localhost/zed_llm"
server-port = 8082
db-uri = "postgres://postgres@localhost/zed"
server-port = 8081
jwt-secret = "the-postgrest-jwt-secret-for-authorization"
log-level = "info"

View File

@@ -1,4 +1,4 @@
db-uri = "postgres://postgres@localhost/zed"
server-port = 8081
db-uri = "postgres://postgres@localhost/zed_llm"
server-port = 8082
jwt-secret = "the-postgrest-jwt-secret-for-authorization"
log-level = "info"

View File

@@ -1,5 +1,6 @@
use super::ips_file::IpsFile;
use crate::api::CloudflareIpCountryHeader;
use crate::clickhouse::write_to_table;
use crate::{api::slack, AppState, Error, Result};
use anyhow::{anyhow, Context};
use aws_sdk_s3::primitives::ByteStream;
@@ -529,12 +530,12 @@ struct ToUpload {
impl ToUpload {
pub async fn upload(&self, clickhouse_client: &clickhouse::Client) -> anyhow::Result<()> {
const EDITOR_EVENTS_TABLE: &str = "editor_events";
Self::upload_to_table(EDITOR_EVENTS_TABLE, &self.editor_events, clickhouse_client)
write_to_table(EDITOR_EVENTS_TABLE, &self.editor_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{EDITOR_EVENTS_TABLE}'"))?;
const INLINE_COMPLETION_EVENTS_TABLE: &str = "inline_completion_events";
Self::upload_to_table(
write_to_table(
INLINE_COMPLETION_EVENTS_TABLE,
&self.inline_completion_events,
clickhouse_client,
@@ -543,7 +544,7 @@ impl ToUpload {
.with_context(|| format!("failed to upload to table '{INLINE_COMPLETION_EVENTS_TABLE}'"))?;
const ASSISTANT_EVENTS_TABLE: &str = "assistant_events";
Self::upload_to_table(
write_to_table(
ASSISTANT_EVENTS_TABLE,
&self.assistant_events,
clickhouse_client,
@@ -552,27 +553,27 @@ impl ToUpload {
.with_context(|| format!("failed to upload to table '{ASSISTANT_EVENTS_TABLE}'"))?;
const CALL_EVENTS_TABLE: &str = "call_events";
Self::upload_to_table(CALL_EVENTS_TABLE, &self.call_events, clickhouse_client)
write_to_table(CALL_EVENTS_TABLE, &self.call_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{CALL_EVENTS_TABLE}'"))?;
const CPU_EVENTS_TABLE: &str = "cpu_events";
Self::upload_to_table(CPU_EVENTS_TABLE, &self.cpu_events, clickhouse_client)
write_to_table(CPU_EVENTS_TABLE, &self.cpu_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{CPU_EVENTS_TABLE}'"))?;
const MEMORY_EVENTS_TABLE: &str = "memory_events";
Self::upload_to_table(MEMORY_EVENTS_TABLE, &self.memory_events, clickhouse_client)
write_to_table(MEMORY_EVENTS_TABLE, &self.memory_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{MEMORY_EVENTS_TABLE}'"))?;
const APP_EVENTS_TABLE: &str = "app_events";
Self::upload_to_table(APP_EVENTS_TABLE, &self.app_events, clickhouse_client)
write_to_table(APP_EVENTS_TABLE, &self.app_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{APP_EVENTS_TABLE}'"))?;
const SETTING_EVENTS_TABLE: &str = "setting_events";
Self::upload_to_table(
write_to_table(
SETTING_EVENTS_TABLE,
&self.setting_events,
clickhouse_client,
@@ -581,7 +582,7 @@ impl ToUpload {
.with_context(|| format!("failed to upload to table '{SETTING_EVENTS_TABLE}'"))?;
const EXTENSION_EVENTS_TABLE: &str = "extension_events";
Self::upload_to_table(
write_to_table(
EXTENSION_EVENTS_TABLE,
&self.extension_events,
clickhouse_client,
@@ -590,48 +591,22 @@ impl ToUpload {
.with_context(|| format!("failed to upload to table '{EXTENSION_EVENTS_TABLE}'"))?;
const EDIT_EVENTS_TABLE: &str = "edit_events";
Self::upload_to_table(EDIT_EVENTS_TABLE, &self.edit_events, clickhouse_client)
write_to_table(EDIT_EVENTS_TABLE, &self.edit_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{EDIT_EVENTS_TABLE}'"))?;
const ACTION_EVENTS_TABLE: &str = "action_events";
Self::upload_to_table(ACTION_EVENTS_TABLE, &self.action_events, clickhouse_client)
write_to_table(ACTION_EVENTS_TABLE, &self.action_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{ACTION_EVENTS_TABLE}'"))?;
const REPL_EVENTS_TABLE: &str = "repl_events";
Self::upload_to_table(REPL_EVENTS_TABLE, &self.repl_events, clickhouse_client)
write_to_table(REPL_EVENTS_TABLE, &self.repl_events, clickhouse_client)
.await
.with_context(|| format!("failed to upload to table '{REPL_EVENTS_TABLE}'"))?;
Ok(())
}
async fn upload_to_table<T: clickhouse::Row + Serialize + std::fmt::Debug>(
table: &str,
rows: &[T],
clickhouse_client: &clickhouse::Client,
) -> anyhow::Result<()> {
if rows.is_empty() {
return Ok(());
}
let mut insert = clickhouse_client.insert(table)?;
for event in rows {
insert.write(event).await?;
}
insert.end().await?;
let event_count = rows.len();
log::info!(
"wrote {event_count} {event_specifier} to '{table}'",
event_specifier = if event_count == 1 { "event" } else { "events" }
);
Ok(())
}
}
pub fn serialize_country_code<S>(country_code: &str, serializer: S) -> Result<S::Ok, S::Error>

View File

@@ -0,0 +1,28 @@
use serde::Serialize;
/// Writes the given rows to the specified Clickhouse table.
pub async fn write_to_table<T: clickhouse::Row + Serialize + std::fmt::Debug>(
table: &str,
rows: &[T],
clickhouse_client: &clickhouse::Client,
) -> anyhow::Result<()> {
if rows.is_empty() {
return Ok(());
}
let mut insert = clickhouse_client.insert(table)?;
for event in rows {
insert.write(event).await?;
}
insert.end().await?;
let event_count = rows.len();
log::info!(
"wrote {event_count} {event_specifier} to '{table}'",
event_specifier = if event_count == 1 { "event" } else { "events" }
);
Ok(())
}
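The new shared `write_to_table` helper replaces the per-table copies above: skip empty batches, write every row into one insert, then log a pluralized count. A sketch of that pattern with an in-memory sink standing in for `clickhouse::Client` (which needs a live server); `MockSink` and its method are illustrative, not a real API:

```rust
// Sketch of the batching pattern behind `write_to_table`, using an
// in-memory sink in place of a real `clickhouse::Client`.
struct MockSink {
    rows_written: Vec<String>,
}

impl MockSink {
    fn write_to_table(&mut self, table: &str, rows: &[String]) -> String {
        // Early return avoids opening an insert for an empty batch.
        if rows.is_empty() {
            return format!("skipped '{table}': no rows");
        }
        for row in rows {
            self.rows_written.push(row.clone());
        }
        let event_count = rows.len();
        // Pluralize the log message the same way the real helper does.
        let event_specifier = if event_count == 1 { "event" } else { "events" };
        format!("wrote {event_count} {event_specifier} to '{table}'")
    }
}

fn main() {
    let mut sink = MockSink { rows_written: Vec::new() };
    println!("{}", sink.write_to_table("editor_events", &["e1".into(), "e2".into()]));
    println!("{}", sink.write_to_table("call_events", &[]));
}
```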

View File

@@ -312,10 +312,11 @@ impl Database {
}
/// Creates a new feature flag.
pub async fn create_user_flag(&self, flag: &str) -> Result<FlagId> {
pub async fn create_user_flag(&self, flag: &str, enabled_for_all: bool) -> Result<FlagId> {
self.transaction(|tx| async move {
let flag = feature_flag::Entity::insert(feature_flag::ActiveModel {
flag: ActiveValue::set(flag.to_string()),
enabled_for_all: ActiveValue::set(enabled_for_all),
..Default::default()
})
.exec(&*tx)
@@ -350,7 +351,15 @@ impl Database {
Flag,
}
let flags = user::Model {
let flags_enabled_for_all = feature_flag::Entity::find()
.filter(feature_flag::Column::EnabledForAll.eq(true))
.select_only()
.column(feature_flag::Column::Flag)
.into_values::<_, QueryAs>()
.all(&*tx)
.await?;
let flags_enabled_for_user = user::Model {
id: user,
..Default::default()
}
@@ -361,7 +370,10 @@ impl Database {
.all(&*tx)
.await?;
Ok(flags)
let mut all_flags = HashSet::from_iter(flags_enabled_for_all);
all_flags.extend(flags_enabled_for_user);
Ok(all_flags.into_iter().collect())
})
.await
}

View File

@@ -8,6 +8,7 @@ pub struct Model {
#[sea_orm(primary_key)]
pub id: FlagId,
pub flag: String,
pub enabled_for_all: bool,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]

View File

@@ -2,6 +2,7 @@ use crate::{
db::{Database, NewUserParams},
test_both_dbs,
};
use pretty_assertions::assert_eq;
use std::sync::Arc;
test_both_dbs!(
@@ -37,22 +38,27 @@ async fn test_get_user_flags(db: &Arc<Database>) {
.unwrap()
.user_id;
const CHANNELS_ALPHA: &str = "channels-alpha";
const NEW_SEARCH: &str = "new-search";
const FEATURE_FLAG_ONE: &str = "brand-new-ux";
const FEATURE_FLAG_TWO: &str = "cool-feature";
const FEATURE_FLAG_THREE: &str = "feature-enabled-for-everyone";
let channels_flag = db.create_user_flag(CHANNELS_ALPHA).await.unwrap();
let search_flag = db.create_user_flag(NEW_SEARCH).await.unwrap();
let feature_flag_one = db.create_user_flag(FEATURE_FLAG_ONE, false).await.unwrap();
let feature_flag_two = db.create_user_flag(FEATURE_FLAG_TWO, false).await.unwrap();
db.create_user_flag(FEATURE_FLAG_THREE, true).await.unwrap();
db.add_user_flag(user_1, channels_flag).await.unwrap();
db.add_user_flag(user_1, search_flag).await.unwrap();
db.add_user_flag(user_1, feature_flag_one).await.unwrap();
db.add_user_flag(user_1, feature_flag_two).await.unwrap();
db.add_user_flag(user_2, channels_flag).await.unwrap();
db.add_user_flag(user_2, feature_flag_one).await.unwrap();
let mut user_1_flags = db.get_user_flags(user_1).await.unwrap();
user_1_flags.sort();
assert_eq!(user_1_flags, &[CHANNELS_ALPHA, NEW_SEARCH]);
assert_eq!(
user_1_flags,
&[FEATURE_FLAG_ONE, FEATURE_FLAG_TWO, FEATURE_FLAG_THREE]
);
let mut user_2_flags = db.get_user_flags(user_2).await.unwrap();
user_2_flags.sort();
assert_eq!(user_2_flags, &[CHANNELS_ALPHA]);
assert_eq!(user_2_flags, &[FEATURE_FLAG_ONE, FEATURE_FLAG_THREE]);
}

View File

@@ -1,5 +1,6 @@
pub mod api;
pub mod auth;
pub mod clickhouse;
pub mod db;
pub mod env;
pub mod executor;
@@ -167,6 +168,7 @@ pub struct Config {
pub google_ai_api_key: Option<Arc<str>>,
pub anthropic_api_key: Option<Arc<str>>,
pub anthropic_staff_api_key: Option<Arc<str>>,
pub llm_closed_beta_model_name: Option<Arc<str>>,
pub qwen2_7b_api_key: Option<Arc<str>>,
pub qwen2_7b_api_url: Option<Arc<str>>,
pub zed_client_checksum_seed: Option<String>,
@@ -218,6 +220,7 @@ impl Config {
google_ai_api_key: None,
anthropic_api_key: None,
anthropic_staff_api_key: None,
llm_closed_beta_model_name: None,
clickhouse_url: None,
clickhouse_user: None,
clickhouse_password: None,
@@ -267,7 +270,7 @@ pub struct AppState {
pub stripe_client: Option<Arc<stripe::Client>>,
pub rate_limiter: Arc<RateLimiter>,
pub executor: Executor,
pub clickhouse_client: Option<clickhouse::Client>,
pub clickhouse_client: Option<::clickhouse::Client>,
pub config: Config,
}
@@ -358,8 +361,8 @@ async fn build_blob_store_client(config: &Config) -> anyhow::Result<aws_sdk_s3::
Ok(aws_sdk_s3::Client::new(&s3_config))
}
fn build_clickhouse_client(config: &Config) -> anyhow::Result<clickhouse::Client> {
Ok(clickhouse::Client::default()
fn build_clickhouse_client(config: &Config) -> anyhow::Result<::clickhouse::Client> {
Ok(::clickhouse::Client::default()
.with_url(
config
.clickhouse_url

View File

@@ -141,7 +141,8 @@ async fn validate_api_token<B>(mut req: Request<B>, next: Next<B>) -> impl IntoR
tracing::Span::current()
.record("user_id", claims.user_id)
.record("login", claims.github_user_login.clone())
.record("authn.jti", &claims.jti);
.record("authn.jti", &claims.jti)
.record("is_staff", &claims.is_staff);
req.extensions_mut().insert(claims);
Ok::<_, Error>(next.run(req).await.into_response())
@@ -169,7 +170,10 @@ async fn perform_completion(
country_code_header: Option<TypedHeader<CloudflareIpCountryHeader>>,
Json(params): Json<PerformCompletionParams>,
) -> Result<impl IntoResponse> {
let model = normalize_model_name(params.provider, params.model);
let model = normalize_model_name(
state.db.model_names_for_provider(params.provider),
params.model,
);
authorize_access_to_language_model(
&state.config,
@@ -200,17 +204,21 @@ async fn perform_completion(
let mut request: anthropic::Request =
serde_json::from_str(&params.provider_request.get())?;
// Parse the model, throw away the version that was included, and then set a specific
// version that we control on the server.
// Override the model on the request with the latest version of the model that is
// known to the server.
//
// Right now, we use the version that's defined in `model.id()`, but we will likely
// want to change this code once a new version of an Anthropic model is released,
// so that users can use the new version, without having to update Zed.
request.model = match anthropic::Model::from_id(&request.model) {
Ok(model) => model.id().to_string(),
Err(_) => request.model,
request.model = match model.as_str() {
"claude-3-5-sonnet" => anthropic::Model::Claude3_5Sonnet.id().to_string(),
"claude-3-opus" => anthropic::Model::Claude3Opus.id().to_string(),
"claude-3-haiku" => anthropic::Model::Claude3Haiku.id().to_string(),
"claude-3-sonnet" => anthropic::Model::Claude3Sonnet.id().to_string(),
_ => request.model,
};
let chunks = anthropic::stream_completion(
let (chunks, rate_limit_info) = anthropic::stream_completion_with_rate_limit_info(
&state.http_client,
anthropic::ANTHROPIC_API_URL,
api_key,
@@ -238,6 +246,19 @@ async fn perform_completion(
anthropic::AnthropicError::Other(err) => Error::Internal(err),
})?;
if let Some(rate_limit_info) = rate_limit_info {
tracing::info!(
target: "upstream rate limit",
is_staff = claims.is_staff,
provider = params.provider.to_string(),
model = model,
tokens_remaining = rate_limit_info.tokens_remaining,
requests_remaining = rate_limit_info.requests_remaining,
requests_reset = ?rate_limit_info.requests_reset,
tokens_reset = ?rate_limit_info.tokens_reset,
);
}
chunks
.map(move |event| {
let chunk = event?;
@@ -369,31 +390,13 @@ async fn perform_completion(
})))
}
fn normalize_model_name(provider: LanguageModelProvider, name: String) -> String {
let prefixes: &[_] = match provider {
LanguageModelProvider::Anthropic => &[
"claude-3-5-sonnet",
"claude-3-haiku",
"claude-3-opus",
"claude-3-sonnet",
],
LanguageModelProvider::OpenAi => &[
"gpt-3.5-turbo",
"gpt-4-turbo-preview",
"gpt-4o-mini",
"gpt-4o",
"gpt-4",
],
LanguageModelProvider::Google => &[],
LanguageModelProvider::Zed => &[],
};
if let Some(prefix) = prefixes
fn normalize_model_name(known_models: Vec<String>, name: String) -> String {
if let Some(known_model_name) = known_models
.iter()
.filter(|&&prefix| name.starts_with(prefix))
.max_by_key(|&&prefix| prefix.len())
.filter(|known_model_name| name.starts_with(known_model_name.as_str()))
.max_by_key(|known_model_name| known_model_name.len())
{
prefix.to_string()
known_model_name.to_string()
} else {
name
}
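The rewritten `normalize_model_name` drops the hard-coded per-provider prefix lists in favor of model names loaded from the database, keeping the same rule: choose the longest known name that is a prefix of the incoming model id, or pass the name through unchanged. The logic is small enough to run standalone:

```rust
// The normalization rule from the hunk above, runnable in isolation: pick
// the longest known model name that prefixes `name`, else return `name`.
fn normalize_model_name(known_models: Vec<String>, name: String) -> String {
    if let Some(known_model_name) = known_models
        .iter()
        .filter(|known_model_name| name.starts_with(known_model_name.as_str()))
        .max_by_key(|known_model_name| known_model_name.len())
    {
        known_model_name.to_string()
    } else {
        name
    }
}

fn main() {
    let known = vec!["claude-3-sonnet".to_string(), "claude-3-5-sonnet".to_string()];
    // Longest matching prefix wins, so a dated id maps to "claude-3-5-sonnet".
    println!("{}", normalize_model_name(known.clone(), "claude-3-5-sonnet-20240620".to_string()));
    // Unknown names pass through untouched.
    println!("{}", normalize_model_name(known, "gpt-4o".to_string()));
}
```

Taking the longest match matters because model families share prefixes ("claude-3-sonnet" is not a prefix of "claude-3-5-sonnet-…", but shorter overlaps do occur for names like "gpt-4" vs "gpt-4o").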
@@ -551,33 +554,75 @@ impl<S> Drop for TokenCountingStream<S> {
.await
.log_err();
if let Some((clickhouse_client, usage)) = state.clickhouse_client.as_ref().zip(usage) {
report_llm_usage(
clickhouse_client,
LlmUsageEventRow {
time: Utc::now().timestamp_millis(),
user_id: claims.user_id as i32,
is_staff: claims.is_staff,
plan: match claims.plan {
Plan::Free => "free".to_string(),
Plan::ZedPro => "zed_pro".to_string(),
if let Some(usage) = usage {
tracing::info!(
target: "user usage",
user_id = claims.user_id,
login = claims.github_user_login,
authn.jti = claims.jti,
is_staff = claims.is_staff,
requests_this_minute = usage.requests_this_minute,
tokens_this_minute = usage.tokens_this_minute,
);
if let Some(clickhouse_client) = state.clickhouse_client.as_ref() {
report_llm_usage(
clickhouse_client,
LlmUsageEventRow {
time: Utc::now().timestamp_millis(),
user_id: claims.user_id as i32,
is_staff: claims.is_staff,
plan: match claims.plan {
Plan::Free => "free".to_string(),
Plan::ZedPro => "zed_pro".to_string(),
},
model,
provider: provider.to_string(),
input_token_count: input_token_count as u64,
output_token_count: output_token_count as u64,
requests_this_minute: usage.requests_this_minute as u64,
tokens_this_minute: usage.tokens_this_minute as u64,
tokens_this_day: usage.tokens_this_day as u64,
input_tokens_this_month: usage.input_tokens_this_month as u64,
output_tokens_this_month: usage.output_tokens_this_month as u64,
spending_this_month: usage.spending_this_month as u64,
lifetime_spending: usage.lifetime_spending as u64,
},
model,
provider: provider.to_string(),
input_token_count: input_token_count as u64,
output_token_count: output_token_count as u64,
requests_this_minute: usage.requests_this_minute as u64,
tokens_this_minute: usage.tokens_this_minute as u64,
tokens_this_day: usage.tokens_this_day as u64,
input_tokens_this_month: usage.input_tokens_this_month as u64,
output_tokens_this_month: usage.output_tokens_this_month as u64,
spending_this_month: usage.spending_this_month as u64,
lifetime_spending: usage.lifetime_spending as u64,
},
)
.await
.log_err();
)
.await
.log_err();
}
}
})
}
}
pub fn log_usage_periodically(state: Arc<LlmState>) {
state.executor.clone().spawn_detached(async move {
loop {
state
.executor
.sleep(std::time::Duration::from_secs(30))
.await;
let Some(usages) = state
.db
.get_application_wide_usages_by_model(Utc::now())
.await
.log_err()
else {
continue;
};
for usage in usages {
tracing::info!(
target: "computed usage",
provider = usage.provider.to_string(),
model = usage.model,
requests_this_minute = usage.requests_this_minute,
tokens_this_minute = usage.tokens_this_minute,
);
}
}
})
}

View File

@@ -12,11 +12,12 @@ pub fn authorize_access_to_language_model(
model: &str,
) -> Result<()> {
authorize_access_for_country(config, country_code, provider)?;
authorize_access_to_model(claims, provider, model)?;
authorize_access_to_model(config, claims, provider, model)?;
Ok(())
}
fn authorize_access_to_model(
config: &Config,
claims: &LlmTokenClaims,
provider: LanguageModelProvider,
model: &str,
@@ -25,15 +26,25 @@ fn authorize_access_to_model(
return Ok(());
}
match (provider, model) {
(LanguageModelProvider::Anthropic, model) if model.starts_with("claude-3-5-sonnet") => {
Ok(())
match provider {
LanguageModelProvider::Anthropic => {
if model == "claude-3-5-sonnet" {
return Ok(());
}
if claims.has_llm_closed_beta_feature_flag
&& Some(model) == config.llm_closed_beta_model_name.as_deref()
{
return Ok(());
}
}
_ => Err(Error::http(
StatusCode::FORBIDDEN,
format!("access to model {model:?} is not included in your plan"),
))?,
_ => {}
}
Err(Error::http(
StatusCode::FORBIDDEN,
format!("access to model {model:?} is not included in your plan"),
))
}
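The new authorization flow permits `claude-3-5-sonnet` for everyone, and additionally permits the configured closed-beta model when the caller holds the `llm-closed-beta` feature flag. A sketch of that rule with plain parameters standing in for `Config` and `LlmTokenClaims` (the function name and error type here are illustrative):

```rust
// Sketch of the Anthropic branch of `authorize_access_to_model`. `is_staff`
// short-circuits everything, mirroring the early return elided in the hunk.
fn authorize_anthropic_model(
    is_staff: bool,
    has_llm_closed_beta_feature_flag: bool,
    llm_closed_beta_model_name: Option<&str>,
    model: &str,
) -> Result<(), String> {
    if is_staff {
        return Ok(());
    }
    if model == "claude-3-5-sonnet" {
        return Ok(());
    }
    if has_llm_closed_beta_feature_flag && Some(model) == llm_closed_beta_model_name {
        return Ok(());
    }
    Err(format!("access to model {model:?} is not included in your plan"))
}

fn main() {
    // Flag holder may use the closed-beta model; others may not.
    println!("{:?}", authorize_anthropic_model(false, true, Some("claude-next"), "claude-next"));
    println!("{:?}", authorize_anthropic_model(false, false, Some("claude-next"), "claude-next"));
}
```

Note the restructuring in the diff: the `match` now falls through to a single trailing `Err`, so every non-allowed combination hits one shared denial path instead of the per-arm `Err` in the old code.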
fn authorize_access_for_country(

View File

@@ -67,6 +67,21 @@ impl LlmDatabase {
Ok(())
}
/// Returns the names of the known models for the given [`LanguageModelProvider`].
pub fn model_names_for_provider(&self, provider: LanguageModelProvider) -> Vec<String> {
self.models
.keys()
.filter_map(|(model_provider, model_name)| {
if model_provider == &provider {
Some(model_name)
} else {
None
}
})
.cloned()
.collect::<Vec<_>>()
}
pub fn model(&self, provider: LanguageModelProvider, name: &str) -> Result<&model::Model> {
Ok(self
.models

View File

@@ -1,5 +1,6 @@
use crate::db::UserId;
use chrono::Duration;
use futures::StreamExt as _;
use rpc::LanguageModelProvider;
use sea_orm::QuerySelect;
use std::{iter, str::FromStr};
@@ -18,6 +19,14 @@ pub struct Usage {
pub lifetime_spending: usize,
}
#[derive(Debug, PartialEq, Clone)]
pub struct ApplicationWideUsage {
pub provider: LanguageModelProvider,
pub model: String,
pub requests_this_minute: usize,
pub tokens_this_minute: usize,
}
#[derive(Clone, Copy, Debug, Default)]
pub struct ActiveUserCount {
pub users_in_recent_minutes: usize,
@@ -63,6 +72,72 @@ impl LlmDatabase {
Ok(())
}
pub async fn get_application_wide_usages_by_model(
&self,
now: DateTimeUtc,
) -> Result<Vec<ApplicationWideUsage>> {
self.transaction(|tx| async move {
let past_minute = now - Duration::minutes(1);
let requests_per_minute = self.usage_measure_ids[&UsageMeasure::RequestsPerMinute];
let tokens_per_minute = self.usage_measure_ids[&UsageMeasure::TokensPerMinute];
let mut results = Vec::new();
for ((provider, model_name), model) in self.models.iter() {
let mut usages = usage::Entity::find()
.filter(
usage::Column::Timestamp
.gte(past_minute.naive_utc())
.and(usage::Column::IsStaff.eq(false))
.and(usage::Column::ModelId.eq(model.id))
.and(
usage::Column::MeasureId
.eq(requests_per_minute)
.or(usage::Column::MeasureId.eq(tokens_per_minute)),
),
)
.stream(&*tx)
.await?;
let mut requests_this_minute = 0;
let mut tokens_this_minute = 0;
while let Some(usage) = usages.next().await {
let usage = usage?;
if usage.measure_id == requests_per_minute {
requests_this_minute += Self::get_live_buckets(
&usage,
now.naive_utc(),
UsageMeasure::RequestsPerMinute,
)
.0
.iter()
.copied()
.sum::<i64>() as usize;
} else if usage.measure_id == tokens_per_minute {
tokens_this_minute += Self::get_live_buckets(
&usage,
now.naive_utc(),
UsageMeasure::TokensPerMinute,
)
.0
.iter()
.copied()
.sum::<i64>() as usize;
}
}
results.push(ApplicationWideUsage {
provider: *provider,
model: model_name.clone(),
requests_this_minute,
tokens_this_minute,
})
}
Ok(results)
})
.await
}
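`get_application_wide_usages_by_model` streams per-minute usage rows for each model and folds them into a single requests/tokens pair, distinguishing rows by measure id. A simplified sketch of that fold, with plain structs standing in for the SeaORM entities and `get_live_buckets` (here the live-bucket windowing is assumed to have already been applied, so `buckets` holds only in-window counts):

```rust
// Sketch of the per-model aggregation: usage rows carry a measure kind plus
// bucket counts, and we sum the buckets into (requests, tokens) per minute.
#[derive(Clone, Copy, PartialEq)]
enum Measure {
    RequestsPerMinute,
    TokensPerMinute,
}

struct UsageRow {
    measure: Measure,
    buckets: Vec<i64>, // counts still inside the sliding one-minute window
}

fn sum_usage(rows: &[UsageRow]) -> (usize, usize) {
    let mut requests_this_minute = 0;
    let mut tokens_this_minute = 0;
    for row in rows {
        let total = row.buckets.iter().copied().sum::<i64>() as usize;
        match row.measure {
            Measure::RequestsPerMinute => requests_this_minute += total,
            Measure::TokensPerMinute => tokens_this_minute += total,
        }
    }
    (requests_this_minute, tokens_this_minute)
}

fn main() {
    let rows = vec![
        UsageRow { measure: Measure::RequestsPerMinute, buckets: vec![2, 3] },
        UsageRow { measure: Measure::TokensPerMinute, buckets: vec![100, 250] },
    ];
    println!("{:?}", sum_usage(&rows)); // (5, 350)
}
```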
pub async fn get_usage(
&self,
user_id: UserId,

View File

@@ -1,6 +1,8 @@
use anyhow::Result;
use anyhow::{Context, Result};
use serde::Serialize;
use crate::clickhouse::write_to_table;
#[derive(Serialize, Debug, clickhouse::Row)]
pub struct LlmUsageEventRow {
pub time: i64,
@@ -40,9 +42,10 @@ pub struct LlmRateLimitEventRow {
}
pub async fn report_llm_usage(client: &clickhouse::Client, row: LlmUsageEventRow) -> Result<()> {
let mut insert = client.insert("llm_usage_events")?;
insert.write(&row).await?;
insert.end().await?;
const LLM_USAGE_EVENTS_TABLE: &str = "llm_usage_events";
write_to_table(LLM_USAGE_EVENTS_TABLE, &[row], client)
.await
.with_context(|| format!("failed to upload to table '{LLM_USAGE_EVENTS_TABLE}'"))?;
Ok(())
}
@@ -50,8 +53,9 @@ pub async fn report_llm_rate_limit(
client: &clickhouse::Client,
row: LlmRateLimitEventRow,
) -> Result<()> {
let mut insert = client.insert("llm_rate_limits")?;
insert.write(&row).await?;
insert.end().await?;
const LLM_RATE_LIMIT_EVENTS_TABLE: &str = "llm_rate_limit_events";
write_to_table(LLM_RATE_LIMIT_EVENTS_TABLE, &[row], client)
.await
.with_context(|| format!("failed to upload to table '{LLM_RATE_LIMIT_EVENTS_TABLE}'"))?;
Ok(())
}

View File

@@ -20,6 +20,8 @@ pub struct LlmTokenClaims {
#[serde(default)]
pub github_user_login: Option<String>,
pub is_staff: bool,
#[serde(default)]
pub has_llm_closed_beta_feature_flag: bool,
pub plan: rpc::proto::Plan,
}
@@ -30,6 +32,7 @@ impl LlmTokenClaims {
user_id: UserId,
github_user_login: String,
is_staff: bool,
has_llm_closed_beta_feature_flag: bool,
plan: rpc::proto::Plan,
config: &Config,
) -> Result<String> {
@@ -46,6 +49,7 @@ impl LlmTokenClaims {
user_id: user_id.to_proto(),
github_user_login: Some(github_user_login),
is_staff,
has_llm_closed_beta_feature_flag,
plan,
};

View File

@@ -5,7 +5,7 @@ use axum::{
routing::get,
Extension, Router,
};
use collab::llm::db::LlmDatabase;
use collab::llm::{db::LlmDatabase, log_usage_periodically};
use collab::migrations::run_database_migrations;
use collab::{api::billing::poll_stripe_events_periodically, llm::LlmState, ServiceMode};
use collab::{
@@ -95,6 +95,8 @@ async fn main() -> Result<()> {
let state = LlmState::new(config.clone(), Executor::Production).await?;
log_usage_periodically(state.clone());
app = app
.merge(collab::llm::routes())
.layer(Extension(state.clone()));
@@ -152,7 +154,8 @@ async fn main() -> Result<()> {
matched_path,
user_id = tracing::field::Empty,
login = tracing::field::Empty,
authn.jti = tracing::field::Empty
authn.jti = tracing::field::Empty,
is_staff = tracing::field::Empty
)
})
.on_response(

View File

@@ -4918,7 +4918,10 @@ async fn get_llm_api_token(
let db = session.db().await;
let flags = db.get_user_flags(session.user_id()).await?;
if !session.is_staff() && !flags.iter().any(|flag| flag == "language-models") {
let has_language_models_feature_flag = flags.iter().any(|flag| flag == "language-models");
let has_llm_closed_beta_feature_flag = flags.iter().any(|flag| flag == "llm-closed-beta");
if !session.is_staff() && !has_language_models_feature_flag {
Err(anyhow!("permission denied"))?
}
@@ -4943,6 +4946,7 @@ async fn get_llm_api_token(
user.id,
user.github_login.clone(),
session.is_staff(),
has_llm_closed_beta_feature_flag,
session.current_plan(db).await?,
&session.app_state.config,
)?;

View File

@@ -1,7 +1,6 @@
use crate::db::{self, ChannelRole, NewUserParams};
use anyhow::Context;
use chrono::{DateTime, Utc};
use db::Database;
use serde::{de::DeserializeOwned, Deserialize};
use std::{fmt::Write, fs, path::Path};
@@ -13,7 +12,6 @@ struct GitHubUser {
id: i32,
login: String,
email: Option<String>,
created_at: DateTime<Utc>,
}
#[derive(Deserialize)]
@@ -44,6 +42,17 @@ pub async fn seed(config: &Config, db: &Database, force: bool) -> anyhow::Result
let mut first_user = None;
let mut others = vec![];
let flag_names = ["remoting", "language-models"];
let mut flags = Vec::new();
for flag_name in flag_names {
let flag = db
.create_user_flag(flag_name, false)
.await
.unwrap_or_else(|_| panic!("failed to create flag: '{flag_name}'"));
flags.push(flag);
}
for admin_login in seed_config.admins {
let user = fetch_github::<GitHubUser>(
&client,
@@ -66,6 +75,15 @@ pub async fn seed(config: &Config, db: &Database, force: bool) -> anyhow::Result
} else {
others.push(user.user_id)
}
for flag in &flags {
db.add_user_flag(user.user_id, *flag)
.await
.context(format!(
"Unable to enable flag '{}' for user '{}'",
flag, user.user_id
))?;
}
}
for channel in seed_config.channels {
@@ -86,6 +104,7 @@ pub async fn seed(config: &Config, db: &Database, force: bool) -> anyhow::Result
}
}
// TODO: Fix this later
if let Some(number_of_users) = seed_config.number_of_users {
// Fetch 100 other random users from GitHub and insert them into the database
// (for testing autocompleters, etc.)
@@ -105,15 +124,23 @@ pub async fn seed(config: &Config, db: &Database, force: bool) -> anyhow::Result
for github_user in users {
last_user_id = Some(github_user.id);
user_count += 1;
db.get_or_create_user_by_github_account(
&github_user.login,
Some(github_user.id),
github_user.email.as_deref(),
Some(github_user.created_at),
None,
)
.await
.expect("failed to insert user");
let user = db
.get_or_create_user_by_github_account(
&github_user.login,
Some(github_user.id),
github_user.email.as_deref(),
None,
None,
)
.await
.expect("failed to insert user");
for flag in &flags {
db.add_user_flag(user.id, *flag).await.context(format!(
"Unable to enable flag '{}' for user '{}'",
flag, user.id
))?;
}
}
}
}
@@ -132,9 +159,9 @@ async fn fetch_github<T: DeserializeOwned>(client: &reqwest::Client, url: &str)
.header("user-agent", "zed")
.send()
.await
.unwrap_or_else(|_| panic!("failed to fetch '{}'", url));
.unwrap_or_else(|error| panic!("failed to fetch '{url}': {error}"));
response
.json()
.await
.unwrap_or_else(|_| panic!("failed to deserialize github user from '{}'", url))
.unwrap_or_else(|error| panic!("failed to deserialize github user from '{url}': {error}"))
}

View File

@@ -667,6 +667,7 @@ impl TestServer {
google_ai_api_key: None,
anthropic_api_key: None,
anthropic_staff_api_key: None,
llm_closed_beta_model_name: None,
clickhouse_url: None,
clickhouse_user: None,
clickhouse_password: None,

View File

@@ -30,7 +30,9 @@ fn restart_servers(_workspace: &mut Workspace, _action: &Restart, cx: &mut ViewC
let model = ContextServerManager::global(&cx);
cx.update_model(&model, |manager, cx| {
for server in manager.servers() {
manager.restart_server(&server.id, cx).detach();
manager
.restart_server(&server.id, cx)
.detach_and_log_err(cx);
}
});
}

View File

@@ -266,11 +266,11 @@ pub fn init(cx: &mut AppContext) {
log::trace!("servers_to_add={:?}", servers_to_add);
for config in servers_to_add {
manager.add_server(config, cx).detach();
manager.add_server(config, cx).detach_and_log_err(cx);
}
for id in servers_to_remove {
manager.remove_server(&id, cx).detach();
manager.remove_server(&id, cx).detach_and_log_err(cx);
}
})
})

View File

@@ -31,6 +31,8 @@ pub enum Role {
#[derive(Clone, Debug, Default, Serialize, Deserialize, PartialEq, EnumIter)]
pub enum Model {
#[default]
#[serde(alias = "gpt-4o", rename = "gpt-4o-2024-05-13")]
Gpt4o,
#[serde(alias = "gpt-4", rename = "gpt-4")]
Gpt4,
#[serde(alias = "gpt-3.5-turbo", rename = "gpt-3.5-turbo")]
@@ -40,6 +42,7 @@ pub enum Model {
impl Model {
pub fn from_id(id: &str) -> Result<Self> {
match id {
"gpt-4o" => Ok(Self::Gpt4o),
"gpt-4" => Ok(Self::Gpt4),
"gpt-3.5-turbo" => Ok(Self::Gpt3_5Turbo),
_ => Err(anyhow!("Invalid model id: {}", id)),
@@ -50,6 +53,7 @@ impl Model {
match self {
Self::Gpt3_5Turbo => "gpt-3.5-turbo",
Self::Gpt4 => "gpt-4",
Self::Gpt4o => "gpt-4o",
}
}
@@ -57,11 +61,13 @@ impl Model {
match self {
Self::Gpt3_5Turbo => "GPT-3.5",
Self::Gpt4 => "GPT-4",
Self::Gpt4o => "GPT-4o",
}
}
pub fn max_token_count(&self) -> usize {
match self {
Self::Gpt4o => 128000,
Self::Gpt4 => 8192,
Self::Gpt3_5Turbo => 16385,
}

View File

@@ -266,6 +266,22 @@ pub enum Direction {
Next,
}
#[derive(Debug, Copy, Clone, PartialEq, Eq)]
pub enum Navigated {
Yes,
No,
}
impl Navigated {
pub fn from_bool(yes: bool) -> Navigated {
if yes {
Navigated::Yes
} else {
Navigated::No
}
}
}
pub fn init_settings(cx: &mut AppContext) {
EditorSettings::register(cx);
}
@@ -1561,6 +1577,7 @@ pub(crate) struct NavigationData {
scroll_top_row: u32,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum GotoDefinitionKind {
Symbol,
Declaration,
@@ -4843,7 +4860,7 @@ impl Editor {
let range = Anchor {
buffer_id,
excerpt_id: excerpt_id,
excerpt_id,
text_anchor: start,
}..Anchor {
buffer_id,
@@ -9020,15 +9037,28 @@ impl Editor {
&mut self,
_: &GoToDefinition,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
self.go_to_definition_of_kind(GotoDefinitionKind::Symbol, false, cx)
) -> Task<Result<Navigated>> {
let definition = self.go_to_definition_of_kind(GotoDefinitionKind::Symbol, false, cx);
let references = self.find_all_references(&FindAllReferences, cx);
cx.background_executor().spawn(async move {
if definition.await? == Navigated::Yes {
return Ok(Navigated::Yes);
}
if let Some(references) = references {
if references.await? == Navigated::Yes {
return Ok(Navigated::Yes);
}
}
Ok(Navigated::No)
})
}
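The reworked `go_to_definition` adds a fallback: if the definition lookup did not navigate, it tries find-all-references before reporting `Navigated::No`. A synchronous sketch of that chain, with plain closures standing in for the editor's async `Task`s:

```rust
// Sketch of the definition -> references fallback. Closures stand in for
// the `Task<Result<Navigated>>` values used in the real editor code.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Navigated {
    Yes,
    No,
}

fn go_to_definition(
    definition: impl FnOnce() -> Navigated,
    references: Option<impl FnOnce() -> Navigated>,
) -> Navigated {
    if definition() == Navigated::Yes {
        return Navigated::Yes;
    }
    // Fall back to find-all-references only when it is available.
    if let Some(references) = references {
        if references() == Navigated::Yes {
            return Navigated::Yes;
        }
    }
    Navigated::No
}

fn main() {
    let navigated = go_to_definition(|| Navigated::No, Some(|| Navigated::Yes));
    println!("{navigated:?}");
}
```

Returning the two-state `Navigated` instead of a bare `bool` (as in the hunk above) makes each early return self-describing at call sites.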
pub fn go_to_declaration(
&mut self,
_: &GoToDeclaration,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
self.go_to_definition_of_kind(GotoDefinitionKind::Declaration, false, cx)
}
@@ -9036,7 +9066,7 @@ impl Editor {
&mut self,
_: &GoToDeclaration,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
self.go_to_definition_of_kind(GotoDefinitionKind::Declaration, true, cx)
}
@@ -9044,7 +9074,7 @@ impl Editor {
&mut self,
_: &GoToImplementation,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
self.go_to_definition_of_kind(GotoDefinitionKind::Implementation, false, cx)
}
@@ -9052,7 +9082,7 @@ impl Editor {
&mut self,
_: &GoToImplementationSplit,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
self.go_to_definition_of_kind(GotoDefinitionKind::Implementation, true, cx)
}
@@ -9060,7 +9090,7 @@ impl Editor {
&mut self,
_: &GoToTypeDefinition,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
self.go_to_definition_of_kind(GotoDefinitionKind::Type, false, cx)
}
@@ -9068,7 +9098,7 @@ impl Editor {
&mut self,
_: &GoToDefinitionSplit,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
self.go_to_definition_of_kind(GotoDefinitionKind::Symbol, true, cx)
}
@@ -9076,7 +9106,7 @@ impl Editor {
&mut self,
_: &GoToTypeDefinitionSplit,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
self.go_to_definition_of_kind(GotoDefinitionKind::Type, true, cx)
}
@@ -9085,16 +9115,16 @@ impl Editor {
kind: GotoDefinitionKind,
split: bool,
cx: &mut ViewContext<Self>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
let Some(workspace) = self.workspace() else {
return Task::ready(Ok(false));
return Task::ready(Ok(Navigated::No));
};
let buffer = self.buffer.read(cx);
let head = self.selections.newest::<usize>(cx).head();
let (buffer, head) = if let Some(text_anchor) = buffer.text_anchor_for_position(head, cx) {
text_anchor
} else {
return Task::ready(Ok(false));
return Task::ready(Ok(Navigated::No));
};
let project = workspace.read(cx).project().clone();
@@ -9153,7 +9183,7 @@ impl Editor {
mut definitions: Vec<HoverLink>,
split: bool,
cx: &mut ViewContext<Editor>,
) -> Task<Result<bool>> {
) -> Task<Result<Navigated>> {
// If there is one definition, just open it directly
if definitions.len() == 1 {
let definition = definitions.pop().unwrap();
@@ -9169,77 +9199,61 @@ impl Editor {
};
cx.spawn(|editor, mut cx| async move {
let target = target_task.await.context("target resolution task")?;
if let Some(target) = target {
editor.update(&mut cx, |editor, cx| {
let Some(workspace) = editor.workspace() else {
return false;
};
let pane = workspace.read(cx).active_pane().clone();
let Some(target) = target else {
return Ok(Navigated::No);
};
editor.update(&mut cx, |editor, cx| {
let Some(workspace) = editor.workspace() else {
return Navigated::No;
};
let pane = workspace.read(cx).active_pane().clone();
let range = target.range.to_offset(target.buffer.read(cx));
let range = editor.range_for_match(&range);
let range = target.range.to_offset(target.buffer.read(cx));
let range = editor.range_for_match(&range);
/// If select range has more than one line, we
/// just point the cursor to range.start.
fn check_multiline_range(
buffer: &Buffer,
range: Range<usize>,
) -> Range<usize> {
if buffer.offset_to_point(range.start).row
== buffer.offset_to_point(range.end).row
{
range
} else {
range.start..range.start
}
}
if Some(&target.buffer) == editor.buffer.read(cx).as_singleton().as_ref() {
let buffer = target.buffer.read(cx);
let range = check_multiline_range(buffer, range);
editor.change_selections(Some(Autoscroll::focused()), cx, |s| {
s.select_ranges([range]);
});
} else {
cx.window_context().defer(move |cx| {
let target_editor: View<Self> =
workspace.update(cx, |workspace, cx| {
let pane = if split {
workspace.adjacent_pane(cx)
} else {
workspace.active_pane().clone()
};
if Some(&target.buffer) == editor.buffer.read(cx).as_singleton().as_ref() {
let buffer = target.buffer.read(cx);
let range = check_multiline_range(buffer, range);
editor.change_selections(Some(Autoscroll::focused()), cx, |s| {
s.select_ranges([range]);
});
} else {
cx.window_context().defer(move |cx| {
let target_editor: View<Self> =
workspace.update(cx, |workspace, cx| {
let pane = if split {
workspace.adjacent_pane(cx)
} else {
workspace.active_pane().clone()
};
workspace.open_project_item(
pane,
target.buffer.clone(),
true,
true,
cx,
)
});
target_editor.update(cx, |target_editor, cx| {
// When selecting a definition in a different buffer, disable the nav history
// to avoid creating a history entry at the previous cursor location.
pane.update(cx, |pane, _| pane.disable_history());
let buffer = target.buffer.read(cx);
let range = check_multiline_range(buffer, range);
target_editor.change_selections(
Some(Autoscroll::focused()),
workspace.open_project_item(
pane,
target.buffer.clone(),
true,
true,
cx,
|s| {
s.select_ranges([range]);
},
);
pane.update(cx, |pane, _| pane.enable_history());
)
});
target_editor.update(cx, |target_editor, cx| {
// When selecting a definition in a different buffer, disable the nav history
// to avoid creating a history entry at the previous cursor location.
pane.update(cx, |pane, _| pane.disable_history());
let buffer = target.buffer.read(cx);
let range = check_multiline_range(buffer, range);
target_editor.change_selections(
Some(Autoscroll::focused()),
cx,
|s| {
s.select_ranges([range]);
},
);
pane.update(cx, |pane, _| pane.enable_history());
});
}
true
})
} else {
Ok(false)
}
});
}
Navigated::Yes
})
})
} else if !definitions.is_empty() {
let replica_id = self.replica_id(cx);
@@ -9289,7 +9303,7 @@ impl Editor {
.context("location tasks")?;
let Some(workspace) = workspace else {
return Ok(false);
return Ok(Navigated::No);
};
let opened = workspace
.update(&mut cx, |workspace, cx| {
@@ -9299,10 +9313,10 @@ impl Editor {
})
.ok();
anyhow::Ok(opened.is_some())
anyhow::Ok(Navigated::from_bool(opened.is_some()))
})
} else {
Task::ready(Ok(false))
Task::ready(Ok(Navigated::No))
}
}
@@ -9361,7 +9375,7 @@ impl Editor {
&mut self,
_: &FindAllReferences,
cx: &mut ViewContext<Self>,
) -> Option<Task<Result<()>>> {
) -> Option<Task<Result<Navigated>>> {
let multi_buffer = self.buffer.read(cx);
let selection = self.selections.newest::<usize>(cx);
let head = selection.head();
@@ -9416,7 +9430,7 @@ impl Editor {
let locations = references.await?;
if locations.is_empty() {
return anyhow::Ok(());
return anyhow::Ok(Navigated::No);
}
workspace.update(&mut cx, |workspace, cx| {
@@ -9436,6 +9450,7 @@ impl Editor {
Self::open_locations_in_multibuffer(
workspace, locations, replica_id, title, false, cx,
);
Navigated::Yes
})
}))
}
@@ -13277,3 +13292,13 @@ fn hunk_status(hunk: &DiffHunk<MultiBufferRow>) -> DiffHunkStatus {
DiffHunkStatus::Modified
}
}
/// If the selected range spans more than one line, we
/// just point the cursor at range.start.
fn check_multiline_range(buffer: &Buffer, range: Range<usize>) -> Range<usize> {
if buffer.offset_to_point(range.start).row == buffer.offset_to_point(range.end).row {
range
} else {
range.start..range.start
}
}
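For context, the `Navigated` return type threaded through these signatures can be sketched as a two-variant enum. This is a hypothetical reconstruction: only the `Yes`/`No` variants, `PartialEq`, and `from_bool` are confirmed by the call sites in this diff; the real definition lives elsewhere in the editor crate.

```rust
/// Hypothetical sketch of the `Navigated` type introduced by this diff.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum Navigated {
    Yes,
    No,
}

impl Navigated {
    // Mirrors the `Navigated::from_bool(opened.is_some())` call site above.
    pub fn from_bool(navigated: bool) -> Self {
        if navigated {
            Navigated::Yes
        } else {
            Navigated::No
        }
    }
}

fn main() {
    assert_eq!(Navigated::from_bool(true), Navigated::Yes);
    assert_eq!(Navigated::from_bool(false), Navigated::No);
}
```

Replacing `bool` with a named enum makes call sites like `if definition_revealed == Navigated::Yes` self-documenting, where a bare `true` would not say what was true.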


@@ -13221,6 +13221,127 @@ let foo = 15;"#,
});
}
#[gpui::test]
async fn test_goto_definition_with_find_all_references_fallback(cx: &mut gpui::TestAppContext) {
init_test(cx, |_| {});
let mut cx = EditorLspTestContext::new_rust(
lsp::ServerCapabilities {
definition_provider: Some(lsp::OneOf::Left(true)),
references_provider: Some(lsp::OneOf::Left(true)),
..lsp::ServerCapabilities::default()
},
cx,
)
.await;
let set_up_lsp_handlers = |empty_go_to_definition: bool, cx: &mut EditorLspTestContext| {
let go_to_definition = cx.lsp.handle_request::<lsp::request::GotoDefinition, _, _>(
move |params, _| async move {
if empty_go_to_definition {
Ok(None)
} else {
Ok(Some(lsp::GotoDefinitionResponse::Scalar(lsp::Location {
uri: params.text_document_position_params.text_document.uri,
range: lsp::Range::new(lsp::Position::new(4, 3), lsp::Position::new(4, 6)),
})))
}
},
);
let references =
cx.lsp
.handle_request::<lsp::request::References, _, _>(move |params, _| async move {
Ok(Some(vec![lsp::Location {
uri: params.text_document_position.text_document.uri,
range: lsp::Range::new(lsp::Position::new(0, 8), lsp::Position::new(0, 11)),
}]))
});
(go_to_definition, references)
};
cx.set_state(
&r#"fn one() {
let mut a = ˇtwo();
}
fn two() {}"#
.unindent(),
);
set_up_lsp_handlers(false, &mut cx);
let navigated = cx
.update_editor(|editor, cx| editor.go_to_definition(&GoToDefinition, cx))
.await
.expect("Failed to navigate to definition");
assert_eq!(
navigated,
Navigated::Yes,
"Should have navigated to definition from the GotoDefinition response"
);
cx.assert_editor_state(
&r#"fn one() {
let mut a = two();
}
fn «twoˇ»() {}"#
.unindent(),
);
let editors = cx.update_workspace(|workspace, cx| {
workspace.items_of_type::<Editor>(cx).collect::<Vec<_>>()
});
cx.update_editor(|_, test_editor_cx| {
assert_eq!(
editors.len(),
1,
"Initially, only the one test editor should be open in the workspace"
);
assert_eq!(
test_editor_cx.view(),
editors.last().expect("Asserted len is 1")
);
});
set_up_lsp_handlers(true, &mut cx);
let navigated = cx
.update_editor(|editor, cx| editor.go_to_definition(&GoToDefinition, cx))
.await
.expect("Failed to fall back to the references lookup");
assert_eq!(
navigated,
Navigated::Yes,
"Should have navigated to references as a fallback after empty GoToDefinition response"
);
// We should not change the selections in the existing file
// when opening another multi buffer with the references
cx.assert_editor_state(
&r#"fn one() {
let mut a = two();
}
fn «twoˇ»() {}"#
.unindent(),
);
let editors = cx.update_workspace(|workspace, cx| {
workspace.items_of_type::<Editor>(cx).collect::<Vec<_>>()
});
cx.update_editor(|_, test_editor_cx| {
assert_eq!(
editors.len(),
2,
"After falling back to references search, we open a new editor with the results"
);
let references_fallback_text = editors
.into_iter()
.find(|new_editor| new_editor != test_editor_cx.view())
.expect("Should have one non-test editor now")
.read(test_editor_cx)
.text(test_editor_cx);
assert_eq!(
references_fallback_text, "fn one() {\n let mut a = two();\n}",
"Should use the range from the references response and not the GoToDefinition one"
);
});
}
fn empty_range(row: usize, column: usize) -> Range<DisplayPoint> {
let point = DisplayPoint::new(DisplayRow(row as u32), column as u32);
point..point


@@ -2,7 +2,7 @@ use crate::{
hover_popover::{self, InlayHover},
scroll::ScrollAmount,
Anchor, Editor, EditorSnapshot, FindAllReferences, GoToDefinition, GoToTypeDefinition, InlayId,
PointForPosition, SelectPhase,
Navigated, PointForPosition, SelectPhase,
};
use gpui::{px, AppContext, AsyncWindowContext, Model, Modifiers, Task, ViewContext};
use language::{Bias, ToOffset};
@@ -157,10 +157,10 @@ impl Editor {
) {
let reveal_task = self.cmd_click_reveal_task(point, modifiers, cx);
cx.spawn(|editor, mut cx| async move {
let definition_revealed = reveal_task.await.log_err().unwrap_or(false);
let definition_revealed = reveal_task.await.log_err().unwrap_or(Navigated::No);
let find_references = editor
.update(&mut cx, |editor, cx| {
if definition_revealed {
if definition_revealed == Navigated::Yes {
return None;
}
editor.find_all_references(&FindAllReferences, cx)
@@ -194,7 +194,7 @@ impl Editor {
point: PointForPosition,
modifiers: Modifiers,
cx: &mut ViewContext<Editor>,
) -> Task<anyhow::Result<bool>> {
) -> Task<anyhow::Result<Navigated>> {
if let Some(hovered_link_state) = self.hovered_link_state.take() {
self.hide_hovered_link(cx);
if !hovered_link_state.links.is_empty() {
@@ -211,7 +211,7 @@ impl Editor {
.read(cx)
.text_anchor_for_position(current_position, cx)
else {
return Task::ready(Ok(false));
return Task::ready(Ok(Navigated::No));
};
let links = hovered_link_state
.links
@@ -247,7 +247,7 @@ impl Editor {
self.go_to_definition(&GoToDefinition, cx)
}
} else {
Task::ready(Ok(false))
Task::ready(Ok(Navigated::No))
}
}
}


@@ -680,6 +680,12 @@ impl Item for Editor {
self.nav_history = Some(history);
}
fn discarded(&self, _project: Model<Project>, cx: &mut ViewContext<Self>) {
for buffer in self.buffer().clone().read(cx).all_buffers() {
buffer.update(cx, |buffer, cx| buffer.discarded(cx))
}
}
fn deactivated(&mut self, cx: &mut ViewContext<Self>) {
let selection = self.selections.newest_anchor();
self.push_to_nav_history(selection.head(), None, cx);


@@ -43,6 +43,11 @@ impl FeatureFlag for LanguageModels {
const NAME: &'static str = "language-models";
}
pub struct LlmClosedBeta {}
impl FeatureFlag for LlmClosedBeta {
const NAME: &'static str = "llm-closed-beta";
}
pub struct ZedPro {}
impl FeatureFlag for ZedPro {
const NAME: &'static str = "zed-pro";


@@ -12,11 +12,11 @@ use ui::{
use util::paths::FILE_ROW_COLUMN_DELIMITER;
use workspace::{item::ItemHandle, StatusItemView, Workspace};
#[derive(Copy, Clone, Default, PartialOrd, PartialEq)]
struct SelectionStats {
lines: usize,
characters: usize,
selections: usize,
#[derive(Copy, Clone, Debug, Default, PartialOrd, PartialEq)]
pub(crate) struct SelectionStats {
pub lines: usize,
pub characters: usize,
pub selections: usize,
}
pub struct CursorPosition {
@@ -44,7 +44,10 @@ impl CursorPosition {
self.selected_count.selections = editor.selections.count();
let mut last_selection: Option<Selection<usize>> = None;
for selection in editor.selections.all::<usize>(cx) {
self.selected_count.characters += selection.end - selection.start;
self.selected_count.characters += buffer
.text_for_range(selection.start..selection.end)
.map(|t| t.chars().count())
.sum::<usize>();
if last_selection
.as_ref()
.map_or(true, |last_selection| selection.id > last_selection.id)
@@ -106,6 +109,11 @@ impl CursorPosition {
}
text.push(')');
}
#[cfg(test)]
pub(crate) fn selection_stats(&self) -> &SelectionStats {
&self.selected_count
}
}
impl Render for CursorPosition {
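The hunk above replaces a byte-offset subtraction (`selection.end - selection.start`) with a per-character count. A quick standalone check, using the same `"ēlo"` string as the new test below, shows why the two disagree for multibyte text:

```rust
fn main() {
    let text = "ēlo";
    // 'ē' is two bytes in UTF-8, so the old byte-offset math reports 4.
    assert_eq!(text.len(), 4);
    // Counting chars, as the patched code does, reports 3.
    assert_eq!(text.chars().count(), 3);
}
```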


@@ -221,6 +221,8 @@ impl Render for GoToLine {
#[cfg(test)]
mod tests {
use super::*;
use cursor_position::{CursorPosition, SelectionStats};
use editor::actions::SelectAll;
use gpui::{TestAppContext, VisualTestContext};
use indoc::indoc;
use project::{FakeFs, Project};
@@ -335,6 +337,83 @@ mod tests {
assert_single_caret_at_row(&editor, expected_highlighted_row, cx);
}
#[gpui::test]
async fn test_unicode_characters_selection(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
"/dir",
json!({
"a.rs": "ēlo"
}),
)
.await;
let project = Project::test(fs, ["/dir".as_ref()], cx).await;
let (workspace, cx) = cx.add_window_view(|cx| Workspace::test_new(project.clone(), cx));
workspace.update(cx, |workspace, cx| {
let cursor_position = cx.new_view(|_| CursorPosition::new(workspace));
workspace.status_bar().update(cx, |status_bar, cx| {
status_bar.add_right_item(cursor_position, cx);
});
});
let worktree_id = workspace.update(cx, |workspace, cx| {
workspace.project().update(cx, |project, cx| {
project.worktrees(cx).next().unwrap().read(cx).id()
})
});
let _buffer = project
.update(cx, |project, cx| project.open_local_buffer("/dir/a.rs", cx))
.await
.unwrap();
let editor = workspace
.update(cx, |workspace, cx| {
workspace.open_path((worktree_id, "a.rs"), None, true, cx)
})
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
workspace.update(cx, |workspace, cx| {
assert_eq!(
&SelectionStats {
lines: 0,
characters: 0,
selections: 1,
},
workspace
.status_bar()
.read(cx)
.item_of_type::<CursorPosition>()
.expect("missing cursor position item")
.read(cx)
.selection_stats(),
"There should be no selections initially"
);
});
editor.update(cx, |editor, cx| editor.select_all(&SelectAll, cx));
workspace.update(cx, |workspace, cx| {
assert_eq!(
&SelectionStats {
lines: 1,
characters: 3,
selections: 1,
},
workspace
.status_bar()
.read(cx)
.item_of_type::<CursorPosition>()
.expect("missing cursor position item")
.read(cx)
.selection_stats(),
"After selecting text with multibyte Unicode characters, the character count should be correct"
);
});
}
fn open_go_to_line_view(
workspace: &View<Workspace>,
cx: &mut VisualTestContext,


@@ -6,7 +6,7 @@ use std::{
path::{Path, PathBuf},
rc::{Rc, Weak},
sync::{atomic::Ordering::SeqCst, Arc},
time::Duration,
time::{Duration, Instant},
};
use anyhow::{anyhow, Result};
@@ -142,6 +142,12 @@ impl App {
self
}
/// Sets a start time for tracking time to first window draw.
pub fn measure_time_to_first_window_draw(self, start: Instant) -> Self {
self.0.borrow_mut().time_to_first_window_draw = Some(TimeToFirstWindowDraw::Pending(start));
self
}
/// Start the application. The provided callback will be called once the
/// app is fully launched.
pub fn run<F>(self, on_finish_launching: F)
@@ -247,6 +253,7 @@ pub struct AppContext {
pub(crate) layout_id_buffer: Vec<LayoutId>, // We recycle this memory across layout requests.
pub(crate) propagate_event: bool,
pub(crate) prompt_builder: Option<PromptBuilder>,
pub(crate) time_to_first_window_draw: Option<TimeToFirstWindowDraw>,
}
impl AppContext {
@@ -300,6 +307,7 @@ impl AppContext {
layout_id_buffer: Default::default(),
propagate_event: true,
prompt_builder: Some(PromptBuilder::Default),
time_to_first_window_draw: None,
}),
});
@@ -1302,6 +1310,14 @@ impl AppContext {
(task, is_first)
}
/// Returns the time to first window draw, if available.
pub fn time_to_first_window_draw(&self) -> Option<Duration> {
match self.time_to_first_window_draw {
Some(TimeToFirstWindowDraw::Done(duration)) => Some(duration),
_ => None,
}
}
}
impl Context for AppContext {
@@ -1465,6 +1481,15 @@ impl<G: Global> DerefMut for GlobalLease<G> {
}
}
/// Represents the initialization duration of the application.
#[derive(Clone, Copy)]
pub enum TimeToFirstWindowDraw {
/// The application is still initializing, and contains the start time.
Pending(Instant),
/// The application has finished initializing, and contains the total duration.
Done(Duration),
}
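The two states above drive a simple one-way transition: the app records an `Instant` at startup, and the first present-completed callback converts it into a `Duration`. A minimal standalone sketch of that transition (the channel plumbing and `Effect::Refresh` from the real code are omitted):

```rust
use std::time::{Duration, Instant};

/// Same shape as the enum introduced above.
#[derive(Clone, Copy)]
enum TimeToFirstWindowDraw {
    Pending(Instant),
    Done(Duration),
}

fn main() {
    // App startup records a start time...
    let mut state = TimeToFirstWindowDraw::Pending(Instant::now());
    // ...and the first present-completed callback replaces it with the
    // elapsed duration, after which `time_to_first_window_draw()` reports it.
    if let TimeToFirstWindowDraw::Pending(start) = state {
        state = TimeToFirstWindowDraw::Done(start.elapsed());
    }
    assert!(matches!(state, TimeToFirstWindowDraw::Done(_)));
}
```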
/// Contains state associated with an active drag operation, started by dragging an element
/// within the window or by dragging into the app from the underlying platform.
pub struct AnyDrag {


@@ -16,6 +16,7 @@ mod blade;
#[cfg(any(test, feature = "test-support"))]
mod test;
mod fps;
#[cfg(target_os = "windows")]
mod windows;
@@ -51,6 +52,7 @@ use strum::EnumIter;
use uuid::Uuid;
pub use app_menu::*;
pub use fps::*;
pub use keystroke::*;
#[cfg(target_os = "linux")]
@@ -354,7 +356,7 @@ pub(crate) trait PlatformWindow: HasWindowHandle + HasDisplayHandle {
fn on_should_close(&self, callback: Box<dyn FnMut() -> bool>);
fn on_close(&self, callback: Box<dyn FnOnce()>);
fn on_appearance_changed(&self, callback: Box<dyn FnMut()>);
fn draw(&self, scene: &Scene);
fn draw(&self, scene: &Scene, on_complete: Option<oneshot::Sender<()>>);
fn completed_frame(&self) {}
fn sprite_atlas(&self) -> Arc<dyn PlatformAtlas>;
@@ -379,6 +381,7 @@ pub(crate) trait PlatformWindow: HasWindowHandle + HasDisplayHandle {
}
fn set_client_inset(&self, _inset: Pixels) {}
fn gpu_specs(&self) -> Option<GPUSpecs>;
fn fps(&self) -> Option<f32>;
#[cfg(any(test, feature = "test-support"))]
fn as_test(&mut self) -> Option<&mut TestWindow> {

View File

@@ -9,6 +9,7 @@ use crate::{
};
use bytemuck::{Pod, Zeroable};
use collections::HashMap;
use futures::channel::oneshot;
#[cfg(target_os = "macos")]
use media::core_video::CVMetalTextureCache;
#[cfg(target_os = "macos")]
@@ -537,7 +538,12 @@ impl BladeRenderer {
self.gpu.destroy_command_encoder(&mut self.command_encoder);
}
pub fn draw(&mut self, scene: &Scene) {
pub fn draw(
&mut self,
scene: &Scene,
// Required to compile on macOS, but not currently supported.
_on_complete: Option<oneshot::Sender<()>>,
) {
self.command_encoder.start();
self.atlas.before_frame(&mut self.command_encoder);
self.rasterize_paths(scene.paths());
@@ -766,4 +772,9 @@ impl BladeRenderer {
self.wait_for_gpu();
self.last_sync_point = Some(sync_point);
}
/// Required to compile on macOS, but not currently supported.
pub fn fps(&self) -> f32 {
0.0
}
}


@@ -0,0 +1,94 @@
use std::sync::atomic::{AtomicU64, AtomicUsize, Ordering};
use std::sync::Arc;
const NANOS_PER_SEC: u64 = 1_000_000_000;
const WINDOW_SIZE: usize = 128;
/// Represents a rolling FPS (Frames Per Second) counter.
///
/// This struct provides a lock-free mechanism to measure and calculate FPS
/// continuously, updating with every frame. It uses atomic operations to
/// ensure thread-safety without the need for locks.
pub struct FpsCounter {
frame_times: [AtomicU64; WINDOW_SIZE],
head: AtomicUsize,
tail: AtomicUsize,
}
impl FpsCounter {
/// Creates a new `FpsCounter`.
///
/// Returns an `Arc<FpsCounter>` for safe sharing across threads.
pub fn new() -> Arc<Self> {
Arc::new(Self {
frame_times: std::array::from_fn(|_| AtomicU64::new(0)),
head: AtomicUsize::new(0),
tail: AtomicUsize::new(0),
})
}
/// Increments the FPS counter with a new frame timestamp.
///
/// This method updates the internal state to maintain a rolling window
/// of frame data for the last second. It uses atomic operations to
/// ensure thread-safety.
///
/// # Arguments
///
/// * `timestamp_ns` - The timestamp of the new frame in nanoseconds.
pub fn increment(&self, timestamp_ns: u64) {
let mut head = self.head.load(Ordering::Relaxed);
let mut tail = self.tail.load(Ordering::Relaxed);
// Add new timestamp
self.frame_times[head].store(timestamp_ns, Ordering::Relaxed);
// Increment head and wrap around to 0 if it reaches WINDOW_SIZE
head = (head + 1) % WINDOW_SIZE;
self.head.store(head, Ordering::Relaxed);
// Remove old timestamps (older than 1 second)
while tail != head {
let oldest = self.frame_times[tail].load(Ordering::Relaxed);
if timestamp_ns.wrapping_sub(oldest) <= NANOS_PER_SEC {
break;
}
// Increment tail and wrap around to 0 if it reaches WINDOW_SIZE
tail = (tail + 1) % WINDOW_SIZE;
self.tail.store(tail, Ordering::Relaxed);
}
}
/// Calculates and returns the current FPS.
///
/// This method computes the FPS based on the frames recorded in the last second.
/// It uses atomic loads to ensure thread-safety.
///
/// # Returns
///
/// The calculated FPS as a `f32`, or 0.0 if no frames have been recorded.
pub fn fps(&self) -> f32 {
let head = self.head.load(Ordering::Relaxed);
let tail = self.tail.load(Ordering::Relaxed);
if head == tail {
return 0.0;
}
let newest =
self.frame_times[head.wrapping_sub(1) & (WINDOW_SIZE - 1)].load(Ordering::Relaxed);
let oldest = self.frame_times[tail].load(Ordering::Relaxed);
let time_diff = newest.wrapping_sub(oldest) as f32;
if time_diff == 0.0 {
return 0.0;
}
let frame_count = if head > tail {
head - tail
} else {
WINDOW_SIZE - tail + head
};
(frame_count as f32 - 1.0) * NANOS_PER_SEC as f32 / time_diff
}
}


@@ -6,7 +6,7 @@ use std::sync::Arc;
use blade_graphics as gpu;
use collections::HashMap;
use futures::channel::oneshot::Receiver;
use futures::channel::oneshot;
use raw_window_handle as rwh;
use wayland_backend::client::ObjectId;
@@ -827,7 +827,7 @@ impl PlatformWindow for WaylandWindow {
_msg: &str,
_detail: Option<&str>,
_answers: &[&str],
) -> Option<Receiver<usize>> {
) -> Option<oneshot::Receiver<usize>> {
None
}
@@ -934,9 +934,9 @@ impl PlatformWindow for WaylandWindow {
self.0.callbacks.borrow_mut().appearance_changed = Some(callback);
}
fn draw(&self, scene: &Scene) {
fn draw(&self, scene: &Scene, on_complete: Option<oneshot::Sender<()>>) {
let mut state = self.borrow_mut();
state.renderer.draw(scene);
state.renderer.draw(scene, on_complete);
}
fn completed_frame(&self) {
@@ -1009,6 +1009,10 @@ impl PlatformWindow for WaylandWindow {
fn gpu_specs(&self) -> Option<GPUSpecs> {
self.borrow().renderer.gpu_specs().into()
}
fn fps(&self) -> Option<f32> {
None
}
}
fn update_window(mut state: RefMut<WaylandWindowState>) {


@@ -1,5 +1,3 @@
use anyhow::Context;
use crate::{
platform::blade::{BladeRenderer, BladeSurfaceConfig},
px, size, AnyWindowHandle, Bounds, Decorations, DevicePixels, ForegroundExecutor, GPUSpecs,
@@ -9,7 +7,9 @@ use crate::{
X11ClientStatePtr,
};
use anyhow::Context;
use blade_graphics as gpu;
use futures::channel::oneshot;
use raw_window_handle as rwh;
use util::{maybe, ResultExt};
use x11rb::{
@@ -1210,9 +1210,10 @@ impl PlatformWindow for X11Window {
self.0.callbacks.borrow_mut().appearance_changed = Some(callback);
}
fn draw(&self, scene: &Scene) {
// TODO: on_complete not yet supported for X11 windows
fn draw(&self, scene: &Scene, on_complete: Option<oneshot::Sender<()>>) {
let mut inner = self.0.state.borrow_mut();
inner.renderer.draw(scene);
inner.renderer.draw(scene, on_complete);
}
fn sprite_atlas(&self) -> Arc<dyn PlatformAtlas> {
@@ -1398,4 +1399,8 @@ impl PlatformWindow for X11Window {
fn gpu_specs(&self) -> Option<GPUSpecs> {
self.0.state.borrow().renderer.gpu_specs().into()
}
fn fps(&self) -> Option<f32> {
None
}
}


@@ -1,7 +1,7 @@
use super::metal_atlas::MetalAtlas;
use crate::{
point, size, AtlasTextureId, AtlasTextureKind, AtlasTile, Bounds, ContentMask, DevicePixels,
Hsla, MonochromeSprite, PaintSurface, Path, PathId, PathVertex, PolychromeSprite,
FpsCounter, Hsla, MonochromeSprite, PaintSurface, Path, PathId, PathVertex, PolychromeSprite,
PrimitiveBatch, Quad, ScaledPixels, Scene, Shadow, Size, Surface, Underline,
};
use anyhow::{anyhow, Result};
@@ -14,6 +14,7 @@ use cocoa::{
use collections::HashMap;
use core_foundation::base::TCFType;
use foreign_types::ForeignType;
use futures::channel::oneshot;
use media::core_video::CVMetalTextureCache;
use metal::{CAMetalLayer, CommandQueue, MTLPixelFormat, MTLResourceOptions, NSRange};
use objc::{self, msg_send, sel, sel_impl};
@@ -105,6 +106,7 @@ pub(crate) struct MetalRenderer {
instance_buffer_pool: Arc<Mutex<InstanceBufferPool>>,
sprite_atlas: Arc<MetalAtlas>,
core_video_texture_cache: CVMetalTextureCache,
fps_counter: Arc<FpsCounter>,
}
impl MetalRenderer {
@@ -250,6 +252,7 @@ impl MetalRenderer {
instance_buffer_pool,
sprite_atlas,
core_video_texture_cache,
fps_counter: FpsCounter::new(),
}
}
@@ -292,7 +295,8 @@ impl MetalRenderer {
// nothing to do
}
pub fn draw(&mut self, scene: &Scene) {
pub fn draw(&mut self, scene: &Scene, on_complete: Option<oneshot::Sender<()>>) {
let on_complete = Arc::new(Mutex::new(on_complete));
let layer = self.layer.clone();
let viewport_size = layer.drawable_size();
let viewport_size: Size<DevicePixels> = size(
@@ -319,13 +323,24 @@ impl MetalRenderer {
Ok(command_buffer) => {
let instance_buffer_pool = self.instance_buffer_pool.clone();
let instance_buffer = Cell::new(Some(instance_buffer));
let block = ConcreteBlock::new(move |_| {
if let Some(instance_buffer) = instance_buffer.take() {
instance_buffer_pool.lock().release(instance_buffer);
}
});
let block = block.copy();
command_buffer.add_completed_handler(&block);
let device = self.device.clone();
let fps_counter = self.fps_counter.clone();
let completed_handler =
ConcreteBlock::new(move |_: &metal::CommandBufferRef| {
let mut cpu_timestamp = 0;
let mut gpu_timestamp = 0;
device.sample_timestamps(&mut cpu_timestamp, &mut gpu_timestamp);
fps_counter.increment(gpu_timestamp);
if let Some(on_complete) = on_complete.lock().take() {
on_complete.send(()).ok();
}
if let Some(instance_buffer) = instance_buffer.take() {
instance_buffer_pool.lock().release(instance_buffer);
}
});
let completed_handler = completed_handler.copy();
command_buffer.add_completed_handler(&completed_handler);
if self.presents_with_transaction {
command_buffer.commit();
@@ -1117,6 +1132,10 @@ impl MetalRenderer {
}
true
}
pub fn fps(&self) -> f32 {
self.fps_counter.fps()
}
}
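The completed-handler pattern above can be sketched independently of Metal. This is a simplified stand-in: the real code uses `futures::channel::oneshot` and Metal's command-buffer completed handler, while here `std::sync::mpsc` is substituted so the sketch stays dependency-free.

```rust
use std::sync::mpsc;

// Sketch of `draw(scene, on_complete)`: the caller may pass a sender, and the
// frame-completion callback fires it exactly once when the GPU work finishes.
fn draw(on_complete: Option<mpsc::Sender<()>>) {
    // ...encode and commit the frame; the completion callback then runs:
    if let Some(on_complete) = on_complete {
        on_complete.send(()).ok();
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    draw(Some(tx));
    // The window layer awaits this signal to timestamp the first draw.
    assert!(rx.recv().is_ok());
    // Passing `None` (the common case after startup) is also fine.
    draw(None);
}
```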
fn build_pipeline_state(


@@ -784,14 +784,14 @@ impl PlatformWindow for MacWindow {
self.0.as_ref().lock().bounds()
}
fn window_bounds(&self) -> WindowBounds {
self.0.as_ref().lock().window_bounds()
}
fn is_maximized(&self) -> bool {
self.0.as_ref().lock().is_maximized()
}
fn window_bounds(&self) -> WindowBounds {
self.0.as_ref().lock().window_bounds()
}
fn content_size(&self) -> Size<Pixels> {
self.0.as_ref().lock().content_size()
}
@@ -975,8 +975,6 @@ impl PlatformWindow for MacWindow {
}
}
fn set_app_id(&mut self, _app_id: &str) {}
fn set_background_appearance(&self, background_appearance: WindowBackgroundAppearance) {
let mut this = self.0.as_ref().lock();
this.renderer
@@ -1007,30 +1005,6 @@ impl PlatformWindow for MacWindow {
}
}
fn set_edited(&mut self, edited: bool) {
unsafe {
let window = self.0.lock().native_window;
msg_send![window, setDocumentEdited: edited as BOOL]
}
// Changing the document edited state resets the traffic light position,
// so we have to move it again.
self.0.lock().move_traffic_light();
}
fn show_character_palette(&self) {
let this = self.0.lock();
let window = this.native_window;
this.executor
.spawn(async move {
unsafe {
let app = NSApplication::sharedApplication(nil);
let _: () = msg_send![app, orderFrontCharacterPalette: window];
}
})
.detach();
}
fn minimize(&self) {
let window = self.0.lock().native_window;
unsafe {
@@ -1107,18 +1081,48 @@ impl PlatformWindow for MacWindow {
self.0.lock().appearance_changed_callback = Some(callback);
}
fn draw(&self, scene: &crate::Scene) {
fn draw(&self, scene: &crate::Scene, on_complete: Option<oneshot::Sender<()>>) {
let mut this = self.0.lock();
this.renderer.draw(scene);
this.renderer.draw(scene, on_complete);
}
fn sprite_atlas(&self) -> Arc<dyn PlatformAtlas> {
self.0.lock().renderer.sprite_atlas().clone()
}
fn set_edited(&mut self, edited: bool) {
unsafe {
let window = self.0.lock().native_window;
msg_send![window, setDocumentEdited: edited as BOOL]
}
// Changing the document edited state resets the traffic light position,
// so we have to move it again.
self.0.lock().move_traffic_light();
}
fn show_character_palette(&self) {
let this = self.0.lock();
let window = this.native_window;
this.executor
.spawn(async move {
unsafe {
let app = NSApplication::sharedApplication(nil);
let _: () = msg_send![app, orderFrontCharacterPalette: window];
}
})
.detach();
}
fn set_app_id(&mut self, _app_id: &str) {}
fn gpu_specs(&self) -> Option<crate::GPUSpecs> {
None
}
fn fps(&self) -> Option<f32> {
Some(self.0.lock().renderer.fps())
}
}
impl rwh::HasWindowHandle for MacWindow {


@@ -251,7 +251,12 @@ impl PlatformWindow for TestWindow {
fn on_appearance_changed(&self, _callback: Box<dyn FnMut()>) {}
fn draw(&self, _scene: &crate::Scene) {}
fn draw(
&self,
_scene: &crate::Scene,
_on_complete: Option<futures::channel::oneshot::Sender<()>>,
) {
}
fn sprite_atlas(&self) -> sync::Arc<dyn crate::PlatformAtlas> {
self.0.lock().sprite_atlas.clone()
@@ -277,6 +282,10 @@ impl PlatformWindow for TestWindow {
fn gpu_specs(&self) -> Option<GPUSpecs> {
None
}
fn fps(&self) -> Option<f32> {
None
}
}
pub(crate) struct TestAtlasState {


@@ -660,8 +660,8 @@ impl PlatformWindow for WindowsWindow {
self.0.state.borrow_mut().callbacks.appearance_changed = Some(callback);
}
fn draw(&self, scene: &Scene) {
self.0.state.borrow_mut().renderer.draw(scene)
fn draw(&self, scene: &Scene, on_complete: Option<oneshot::Sender<()>>) {
self.0.state.borrow_mut().renderer.draw(scene, on_complete)
}
fn sprite_atlas(&self) -> Arc<dyn PlatformAtlas> {
@@ -675,6 +675,10 @@ impl PlatformWindow for WindowsWindow {
fn gpu_specs(&self) -> Option<GPUSpecs> {
Some(self.0.state.borrow().renderer.gpu_specs())
}
fn fps(&self) -> Option<f32> {
None
}
}
#[implement(IDropTarget)]


@@ -11,9 +11,9 @@ use crate::{
PromptLevel, Quad, Render, RenderGlyphParams, RenderImage, RenderImageParams, RenderSvgParams,
Replay, ResizeEdge, ScaledPixels, Scene, Shadow, SharedString, Size, StrikethroughStyle, Style,
SubscriberSet, Subscription, TaffyLayoutEngine, Task, TextStyle, TextStyleRefinement,
TransformationMatrix, Underline, UnderlineStyle, View, VisualContext, WeakView,
WindowAppearance, WindowBackgroundAppearance, WindowBounds, WindowControls, WindowDecorations,
WindowOptions, WindowParams, WindowTextSystem, SUBPIXEL_VARIANTS,
TimeToFirstWindowDraw, TransformationMatrix, Underline, UnderlineStyle, View, VisualContext,
WeakView, WindowAppearance, WindowBackgroundAppearance, WindowBounds, WindowControls,
WindowDecorations, WindowOptions, WindowParams, WindowTextSystem, SUBPIXEL_VARIANTS,
};
use anyhow::{anyhow, Context as _, Result};
use collections::{FxHashMap, FxHashSet};
@@ -544,6 +544,8 @@ pub struct Window {
hovered: Rc<Cell<bool>>,
pub(crate) dirty: Rc<Cell<bool>>,
pub(crate) needs_present: Rc<Cell<bool>>,
/// We assign this to be notified when the platform graphics backend fires the next completion callback for drawing the window.
present_completed: RefCell<Option<oneshot::Sender<()>>>,
pub(crate) last_input_timestamp: Rc<Cell<Instant>>,
pub(crate) refreshing: bool,
pub(crate) draw_phase: DrawPhase,
@@ -820,6 +822,7 @@ impl Window {
hovered,
dirty,
needs_present,
present_completed: RefCell::default(),
last_input_timestamp,
refreshing: false,
draw_phase: DrawPhase::None,
@@ -1489,13 +1492,29 @@ impl<'a> WindowContext<'a> {
self.window.refreshing = false;
self.window.draw_phase = DrawPhase::None;
self.window.needs_present.set(true);
if let Some(TimeToFirstWindowDraw::Pending(start)) = self.app.time_to_first_window_draw {
let (tx, rx) = oneshot::channel();
*self.window.present_completed.borrow_mut() = Some(tx);
self.spawn(|mut cx| async move {
rx.await.ok();
cx.update(|cx| {
let duration = start.elapsed();
cx.time_to_first_window_draw = Some(TimeToFirstWindowDraw::Done(duration));
log::info!("time to first window draw: {:?}", duration);
cx.push_effect(Effect::Refresh);
})
})
.detach();
}
}
#[profiling::function]
fn present(&self) {
let on_complete = self.window.present_completed.take();
self.window
.platform_window
.draw(&self.window.rendered_frame.scene);
.draw(&self.window.rendered_frame.scene, on_complete);
self.window.needs_present.set(false);
profiling::finish_frame!();
}
@@ -3718,6 +3737,12 @@ impl<'a> WindowContext<'a> {
pub fn gpu_specs(&self) -> Option<GPUSpecs> {
self.window.platform_window.gpu_specs()
}
/// Get the current FPS (frames per second) of the window.
/// This is only supported on macOS currently.
pub fn fps(&self) -> Option<f32> {
self.window.platform_window.fps()
}
}
#[cfg(target_os = "windows")]


@@ -332,6 +332,8 @@ pub enum Event {
CapabilityChanged,
/// The buffer was explicitly requested to close.
Closed,
/// The buffer was discarded when closing.
Discarded,
}
/// The file associated with a buffer.
@@ -827,6 +829,12 @@ impl Buffer {
cx.notify();
}
/// This method is called to signal that the buffer has been discarded.
pub fn discarded(&mut self, cx: &mut ModelContext<Self>) {
cx.emit(Event::Discarded);
cx.notify();
}
/// Reloads the contents of the buffer from disk.
pub fn reload(
&mut self,


@@ -14,7 +14,7 @@ pub struct Outline<T> {
path_candidate_prefixes: Vec<usize>,
}
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
#[derive(Clone, Debug, Default, PartialEq, Eq, Hash)]
pub struct OutlineItem<T> {
pub depth: usize,
pub range: Range<T>,


@@ -54,6 +54,10 @@ pub struct LanguageModelCacheConfiguration {
pub trait LanguageModel: Send + Sync {
fn id(&self) -> LanguageModelId;
fn name(&self) -> LanguageModelName;
/// If None, falls back to [LanguageModelProvider::icon]
fn icon(&self) -> Option<IconName> {
None
}
fn provider_id(&self) -> LanguageModelProviderId;
fn provider_name(&self) -> LanguageModelProviderName;
fn telemetry_id(&self) -> String;


@@ -2,6 +2,7 @@ use proto::Plan;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use strum::EnumIter;
use ui::IconName;
use crate::LanguageModelAvailability;
@@ -65,6 +66,13 @@ impl CloudModel {
}
}
pub fn icon(&self) -> Option<IconName> {
match self {
Self::Anthropic(_) => Some(IconName::AiAnthropicHosted),
_ => None,
}
}
pub fn max_token_count(&self) -> usize {
match self {
Self::Anthropic(model) => model.max_token_count(),


@@ -19,7 +19,7 @@ use settings::{Settings, SettingsStore};
use std::{sync::Arc, time::Duration};
use strum::IntoEnumIterator;
use theme::ThemeSettings;
use ui::{prelude::*, Icon, IconName};
use ui::{prelude::*, Icon, IconName, Tooltip};
use util::ResultExt;
const PROVIDER_ID: &str = "anthropic";
@@ -29,15 +29,22 @@ const PROVIDER_NAME: &str = "Anthropic";
pub struct AnthropicSettings {
pub api_url: String,
pub low_speed_timeout: Option<Duration>,
/// Extend Zed's list of Anthropic models.
pub available_models: Vec<AvailableModel>,
pub needs_setting_migration: bool,
}
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]
pub struct AvailableModel {
/// The model's name in the Anthropic API. e.g. claude-3-5-sonnet-20240620
pub name: String,
/// The model's name in Zed's UI, such as in the model selector dropdown menu in the assistant panel.
pub display_name: Option<String>,
/// The model's context window size.
pub max_tokens: usize,
/// A model `name` to substitute when calling tools, in case the primary model doesn't support tool calling.
pub tool_override: Option<String>,
/// Configuration of Anthropic's caching API.
pub cache_configuration: Option<LanguageModelCacheConfiguration>,
pub max_output_tokens: Option<u32>,
}
@@ -47,8 +54,11 @@ pub struct AnthropicLanguageModelProvider {
state: gpui::Model<State>,
}
const ANTHROPIC_API_KEY_VAR: &'static str = "ANTHROPIC_API_KEY";
pub struct State {
api_key: Option<String>,
api_key_from_env: bool,
_subscription: Subscription,
}
@@ -60,6 +70,7 @@ impl State {
delete_credentials.await.ok();
this.update(&mut cx, |this, cx| {
this.api_key = None;
this.api_key_from_env = false;
cx.notify();
})
})
@@ -98,18 +109,20 @@ impl State {
.clone();
cx.spawn(|this, mut cx| async move {
let api_key = if let Ok(api_key) = std::env::var("ANTHROPIC_API_KEY") {
api_key
let (api_key, from_env) = if let Ok(api_key) = std::env::var(ANTHROPIC_API_KEY_VAR)
{
(api_key, true)
} else {
let (_, api_key) = cx
.update(|cx| cx.read_credentials(&api_url))?
.await?
.ok_or_else(|| anyhow!("credentials not found"))?;
String::from_utf8(api_key)?
(String::from_utf8(api_key)?, false)
};
this.update(&mut cx, |this, cx| {
this.api_key = Some(api_key);
this.api_key_from_env = from_env;
cx.notify();
})
})
@@ -121,6 +134,7 @@ impl AnthropicLanguageModelProvider {
pub fn new(http_client: Arc<dyn HttpClient>, cx: &mut AppContext) -> Self {
let state = cx.new_model(|cx| State {
api_key: None,
api_key_from_env: false,
_subscription: cx.observe_global::<SettingsStore>(|_, cx| {
cx.notify();
}),
@@ -171,6 +185,7 @@ impl LanguageModelProvider for AnthropicLanguageModelProvider {
model.name.clone(),
anthropic::Model::Custom {
name: model.name.clone(),
display_name: model.display_name.clone(),
max_tokens: model.max_tokens,
tool_override: model.tool_override.clone(),
cache_configuration: model.cache_configuration.as_ref().map(|config| {
@@ -529,6 +544,8 @@ impl Render for ConfigurationView {
"Paste your Anthropic API key below and hit enter to use the assistant:",
];
let env_var_set = self.state.read(cx).api_key_from_env;
if self.load_credentials_task.is_some() {
div().child(Label::new("Loading credentials...")).into_any()
} else if self.should_render_editor(cx) {
@@ -550,7 +567,7 @@ impl Render for ConfigurationView {
)
.child(
Label::new(
"You can also assign the ANTHROPIC_API_KEY environment variable and restart Zed.",
format!("You can also assign the {ANTHROPIC_API_KEY_VAR} environment variable and restart Zed."),
)
.size(LabelSize::Small),
)
@@ -563,13 +580,21 @@ impl Render for ConfigurationView {
h_flex()
.gap_1()
.child(Icon::new(IconName::Check).color(Color::Success))
.child(Label::new("API key configured.")),
.child(Label::new(if env_var_set {
format!("API key set in {ANTHROPIC_API_KEY_VAR} environment variable.")
} else {
"API key configured.".to_string()
})),
)
.child(
Button::new("reset-key", "Reset key")
.icon(Some(IconName::Trash))
.icon_size(IconSize::Small)
.icon_position(IconPosition::Start)
.disabled(env_var_set)
.when(env_var_set, |this| {
this.tooltip(|cx| Tooltip::text(format!("To reset your API key, unset the {ANTHROPIC_API_KEY_VAR} environment variable."), cx))
})
.on_click(cx.listener(|this, _, cx| this.reset_api_key(cx))),
)
.into_any()
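The credential change above (repeated below for the Google and OpenAI providers) prefers the environment variable over stored credentials and records where the key came from, so the UI can disable the "Reset key" button when the variable is in control. A minimal, framework-free sketch of that resolution order — `resolve_api_key` and the `stored` argument are illustrative stand-ins, not Zed's API:

```rust
use std::env;

/// Resolve an API key: the environment variable wins, otherwise fall
/// back to a stored credential. The `bool` records whether the key
/// came from the environment, mirroring `api_key_from_env`.
fn resolve_api_key(var: &str, stored: Option<String>) -> Option<(String, bool)> {
    if let Ok(key) = env::var(var) {
        return Some((key, true));
    }
    stored.map(|key| (key, false))
}
```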


@@ -8,7 +8,7 @@ use anthropic::AnthropicError;
use anyhow::{anyhow, Result};
use client::{Client, PerformCompletionParams, UserStore, EXPIRED_LLM_TOKEN_HEADER_NAME};
use collections::BTreeMap;
use feature_flags::{FeatureFlagAppExt, ZedPro};
use feature_flags::{FeatureFlagAppExt, LlmClosedBeta, ZedPro};
use futures::{
future::BoxFuture, stream::BoxStream, AsyncBufReadExt, FutureExt, Stream, StreamExt,
TryStreamExt as _,
@@ -26,7 +26,10 @@ use smol::{
io::{AsyncReadExt, BufReader},
lock::{RwLock, RwLockUpgradableReadGuard, RwLockWriteGuard},
};
use std::{future, sync::Arc};
use std::{
future,
sync::{Arc, LazyLock},
};
use strum::IntoEnumIterator;
use ui::prelude::*;
@@ -37,6 +40,18 @@ use super::anthropic::count_anthropic_tokens;
pub const PROVIDER_ID: &str = "zed.dev";
pub const PROVIDER_NAME: &str = "Zed";
const ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: Option<&str> =
option_env!("ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON");
fn zed_cloud_provider_additional_models() -> &'static [AvailableModel] {
static ADDITIONAL_MODELS: LazyLock<Vec<AvailableModel>> = LazyLock::new(|| {
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON
.map(|json| serde_json::from_str(json).unwrap())
.unwrap_or(Vec::new())
});
ADDITIONAL_MODELS.as_slice()
}
#[derive(Default, Clone, Debug, PartialEq)]
pub struct ZedDotDevSettings {
pub available_models: Vec<AvailableModel>,
@@ -52,12 +67,20 @@ pub enum AvailableProvider {
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize, JsonSchema)]
pub struct AvailableModel {
provider: AvailableProvider,
name: String,
max_tokens: usize,
tool_override: Option<String>,
cache_configuration: Option<LanguageModelCacheConfiguration>,
max_output_tokens: Option<u32>,
/// The provider of the language model.
pub provider: AvailableProvider,
/// The model's name in the provider's API. e.g. claude-3-5-sonnet-20240620
pub name: String,
/// The name displayed in the UI, such as in the assistant panel model dropdown menu.
pub display_name: Option<String>,
/// The size of the context window, indicating the maximum number of tokens the model can process.
pub max_tokens: usize,
/// The maximum number of output tokens allowed by the model.
pub max_output_tokens: Option<u32>,
/// Override this model with a different Anthropic model for tool calls.
pub tool_override: Option<String>,
/// Indicates whether this custom model supports caching.
pub cache_configuration: Option<LanguageModelCacheConfiguration>,
}
pub struct CloudLanguageModelProvider {
@@ -192,39 +215,6 @@ impl LanguageModelProvider for CloudLanguageModelProvider {
for model in ZedModel::iter() {
models.insert(model.id().to_string(), CloudModel::Zed(model));
}
// Override with available models from settings
for model in &AllLanguageModelSettings::get_global(cx)
.zed_dot_dev
.available_models
{
let model = match model.provider {
AvailableProvider::Anthropic => {
CloudModel::Anthropic(anthropic::Model::Custom {
name: model.name.clone(),
max_tokens: model.max_tokens,
tool_override: model.tool_override.clone(),
cache_configuration: model.cache_configuration.as_ref().map(|config| {
anthropic::AnthropicModelCacheConfiguration {
max_cache_anchors: config.max_cache_anchors,
should_speculate: config.should_speculate,
min_total_token: config.min_total_token,
}
}),
max_output_tokens: model.max_output_tokens,
})
}
AvailableProvider::OpenAi => CloudModel::OpenAi(open_ai::Model::Custom {
name: model.name.clone(),
max_tokens: model.max_tokens,
}),
AvailableProvider::Google => CloudModel::Google(google_ai::Model::Custom {
name: model.name.clone(),
max_tokens: model.max_tokens,
}),
};
models.insert(model.id().to_string(), model.clone());
}
} else {
models.insert(
anthropic::Model::Claude3_5Sonnet.id().to_string(),
@@ -232,6 +222,47 @@ impl LanguageModelProvider for CloudLanguageModelProvider {
);
}
let llm_closed_beta_models = if cx.has_flag::<LlmClosedBeta>() {
zed_cloud_provider_additional_models()
} else {
&[]
};
// Override with available models from settings
for model in AllLanguageModelSettings::get_global(cx)
.zed_dot_dev
.available_models
.iter()
.chain(llm_closed_beta_models)
.cloned()
{
let model = match model.provider {
AvailableProvider::Anthropic => CloudModel::Anthropic(anthropic::Model::Custom {
name: model.name.clone(),
display_name: model.display_name.clone(),
max_tokens: model.max_tokens,
tool_override: model.tool_override.clone(),
cache_configuration: model.cache_configuration.as_ref().map(|config| {
anthropic::AnthropicModelCacheConfiguration {
max_cache_anchors: config.max_cache_anchors,
should_speculate: config.should_speculate,
min_total_token: config.min_total_token,
}
}),
max_output_tokens: model.max_output_tokens,
}),
AvailableProvider::OpenAi => CloudModel::OpenAi(open_ai::Model::Custom {
name: model.name.clone(),
max_tokens: model.max_tokens,
}),
AvailableProvider::Google => CloudModel::Google(google_ai::Model::Custom {
name: model.name.clone(),
max_tokens: model.max_tokens,
}),
};
models.insert(model.id().to_string(), model.clone());
}
models
.into_values()
.map(|model| {
@@ -389,6 +420,10 @@ impl LanguageModel for CloudLanguageModel {
LanguageModelName::from(self.model.display_name().to_string())
}
fn icon(&self) -> Option<IconName> {
self.model.icon()
}
fn provider_id(&self) -> LanguageModelProviderId {
LanguageModelProviderId(PROVIDER_ID.into())
}
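The `zed_cloud_provider_additional_models` helper above bakes an optional JSON blob into the binary with `option_env!` and parses it exactly once via `LazyLock` (stable since Rust 1.80). A self-contained sketch of the same once-only pattern — a comma-separated list stands in for the `serde_json` parsing, and the env var name is made up:

```rust
use std::sync::LazyLock;

// Optional compile-time env var; `None` when unset at build time.
// EXAMPLE_EXTRA_MODELS_JSON is a hypothetical name, not Zed's.
const EXTRA_MODELS: Option<&str> = option_env!("EXAMPLE_EXTRA_MODELS_JSON");

/// Parse the embedded value on first access; later calls reuse the
/// same static slice without re-parsing.
fn extra_models() -> &'static [String] {
    static MODELS: LazyLock<Vec<String>> = LazyLock::new(|| {
        EXTRA_MODELS
            .map(|s| s.split(',').map(str::to_string).collect())
            .unwrap_or_default()
    });
    MODELS.as_slice()
}
```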


@@ -180,6 +180,7 @@ impl LanguageModel for CopilotChatLanguageModel {
cx: &AppContext,
) -> BoxFuture<'static, Result<usize>> {
let model = match self.model {
CopilotChatModel::Gpt4o => open_ai::Model::FourOmni,
CopilotChatModel::Gpt4 => open_ai::Model::Four,
CopilotChatModel::Gpt3_5Turbo => open_ai::Model::ThreePointFiveTurbo,
};


@@ -14,7 +14,7 @@ use settings::{Settings, SettingsStore};
use std::{future, sync::Arc, time::Duration};
use strum::IntoEnumIterator;
use theme::ThemeSettings;
use ui::{prelude::*, Icon, IconName};
use ui::{prelude::*, Icon, IconName, Tooltip};
use util::ResultExt;
use crate::{
@@ -46,9 +46,12 @@ pub struct GoogleLanguageModelProvider {
pub struct State {
api_key: Option<String>,
api_key_from_env: bool,
_subscription: Subscription,
}
const GOOGLE_AI_API_KEY_VAR: &'static str = "GOOGLE_AI_API_KEY";
impl State {
fn is_authenticated(&self) -> bool {
self.api_key.is_some()
@@ -61,6 +64,7 @@ impl State {
delete_credentials.await.ok();
this.update(&mut cx, |this, cx| {
this.api_key = None;
this.api_key_from_env = false;
cx.notify();
})
})
@@ -90,18 +94,20 @@ impl State {
.clone();
cx.spawn(|this, mut cx| async move {
let api_key = if let Ok(api_key) = std::env::var("GOOGLE_AI_API_KEY") {
api_key
let (api_key, from_env) = if let Ok(api_key) = std::env::var(GOOGLE_AI_API_KEY_VAR)
{
(api_key, true)
} else {
let (_, api_key) = cx
.update(|cx| cx.read_credentials(&api_url))?
.await?
.ok_or_else(|| anyhow!("credentials not found"))?;
String::from_utf8(api_key)?
(String::from_utf8(api_key)?, false)
};
this.update(&mut cx, |this, cx| {
this.api_key = Some(api_key);
this.api_key_from_env = from_env;
cx.notify();
})
})
@@ -113,6 +119,7 @@ impl GoogleLanguageModelProvider {
pub fn new(http_client: Arc<dyn HttpClient>, cx: &mut AppContext) -> Self {
let state = cx.new_model(|cx| State {
api_key: None,
api_key_from_env: false,
_subscription: cx.observe_global::<SettingsStore>(|_, cx| {
cx.notify();
}),
@@ -422,6 +429,8 @@ impl Render for ConfigurationView {
"Paste your Google AI API key below and hit enter to use the assistant:",
];
let env_var_set = self.state.read(cx).api_key_from_env;
if self.load_credentials_task.is_some() {
div().child(Label::new("Loading credentials...")).into_any()
} else if self.should_render_editor(cx) {
@@ -443,7 +452,7 @@ impl Render for ConfigurationView {
)
.child(
Label::new(
"You can also assign the GOOGLE_AI_API_KEY environment variable and restart Zed.",
format!("You can also assign the {GOOGLE_AI_API_KEY_VAR} environment variable and restart Zed."),
)
.size(LabelSize::Small),
)
@@ -456,13 +465,21 @@ impl Render for ConfigurationView {
h_flex()
.gap_1()
.child(Icon::new(IconName::Check).color(Color::Success))
.child(Label::new("API key configured.")),
.child(Label::new(if env_var_set {
format!("API key set in {GOOGLE_AI_API_KEY_VAR} environment variable.")
} else {
"API key configured.".to_string()
})),
)
.child(
Button::new("reset-key", "Reset key")
.icon(Some(IconName::Trash))
.icon_size(IconSize::Small)
.icon_position(IconPosition::Start)
.disabled(env_var_set)
.when(env_var_set, |this| {
this.tooltip(|cx| Tooltip::text(format!("To reset your API key, unset the {GOOGLE_AI_API_KEY_VAR} environment variable."), cx))
})
.on_click(cx.listener(|this, _, cx| this.reset_api_key(cx))),
)
.into_any()


@@ -16,7 +16,7 @@ use settings::{Settings, SettingsStore};
use std::{sync::Arc, time::Duration};
use strum::IntoEnumIterator;
use theme::ThemeSettings;
use ui::{prelude::*, Icon, IconName};
use ui::{prelude::*, Icon, IconName, Tooltip};
use util::ResultExt;
use crate::{
@@ -49,9 +49,12 @@ pub struct OpenAiLanguageModelProvider {
pub struct State {
api_key: Option<String>,
api_key_from_env: bool,
_subscription: Subscription,
}
const OPENAI_API_KEY_VAR: &'static str = "OPENAI_API_KEY";
impl State {
fn is_authenticated(&self) -> bool {
self.api_key.is_some()
@@ -64,6 +67,7 @@ impl State {
delete_credentials.await.log_err();
this.update(&mut cx, |this, cx| {
this.api_key = None;
this.api_key_from_env = false;
cx.notify();
})
})
@@ -92,17 +96,18 @@ impl State {
.api_url
.clone();
cx.spawn(|this, mut cx| async move {
let api_key = if let Ok(api_key) = std::env::var("OPENAI_API_KEY") {
api_key
let (api_key, from_env) = if let Ok(api_key) = std::env::var(OPENAI_API_KEY_VAR) {
(api_key, true)
} else {
let (_, api_key) = cx
.update(|cx| cx.read_credentials(&api_url))?
.await?
.ok_or_else(|| anyhow!("credentials not found"))?;
String::from_utf8(api_key)?
(String::from_utf8(api_key)?, false)
};
this.update(&mut cx, |this, cx| {
this.api_key = Some(api_key);
this.api_key_from_env = from_env;
cx.notify();
})
})
@@ -114,6 +119,7 @@ impl OpenAiLanguageModelProvider {
pub fn new(http_client: Arc<dyn HttpClient>, cx: &mut AppContext) -> Self {
let state = cx.new_model(|cx| State {
api_key: None,
api_key_from_env: false,
_subscription: cx.observe_global::<SettingsStore>(|_this: &mut State, cx| {
cx.notify();
}),
@@ -476,6 +482,8 @@ impl Render for ConfigurationView {
"Paste your OpenAI API key below and hit enter to use the assistant:",
];
let env_var_set = self.state.read(cx).api_key_from_env;
if self.load_credentials_task.is_some() {
div().child(Label::new("Loading credentials...")).into_any()
} else if self.should_render_editor(cx) {
@@ -497,7 +505,7 @@ impl Render for ConfigurationView {
)
.child(
Label::new(
"You can also assign the OPENAI_API_KEY environment variable and restart Zed.",
format!("You can also assign the {OPENAI_API_KEY_VAR} environment variable and restart Zed."),
)
.size(LabelSize::Small),
)
@@ -510,13 +518,21 @@ impl Render for ConfigurationView {
h_flex()
.gap_1()
.child(Icon::new(IconName::Check).color(Color::Success))
.child(Label::new("API key configured.")),
.child(Label::new(if env_var_set {
format!("API key set in {OPENAI_API_KEY_VAR} environment variable.")
} else {
"API key configured.".to_string()
})),
)
.child(
Button::new("reset-key", "Reset key")
.icon(Some(IconName::Trash))
.icon_size(IconSize::Small)
.icon_position(IconPosition::Start)
.disabled(env_var_set)
.when(env_var_set, |this| {
this.tooltip(|cx| Tooltip::text(format!("To reset your API key, unset the {OPENAI_API_KEY_VAR} environment variable."), cx))
})
.on_click(cx.listener(|this, _, cx| this.reset_api_key(cx))),
)
.into_any()


@@ -94,12 +94,14 @@ impl AnthropicSettingsContent {
.filter_map(|model| match model {
anthropic::Model::Custom {
name,
display_name,
max_tokens,
tool_override,
cache_configuration,
max_output_tokens,
} => Some(provider::anthropic::AvailableModel {
name,
display_name,
max_tokens,
tool_override,
cache_configuration: cache_configuration.as_ref().map(


@@ -106,6 +106,7 @@ pub enum Event {
Saved,
FileHandleChanged,
Closed,
Discarded,
DirtyChanged,
DiagnosticsUpdated,
}
@@ -1691,6 +1692,7 @@ impl MultiBuffer {
language::Event::Reparsed => Event::Reparsed(buffer.read(cx).remote_id()),
language::Event::DiagnosticsUpdated => Event::DiagnosticsUpdated,
language::Event::Closed => Event::Closed,
language::Event::Discarded => Event::Discarded,
language::Event::CapabilityChanged => {
self.capability = buffer.read(cx).capability();
Event::CapabilityChanged


@@ -193,15 +193,28 @@ pub fn prompts_dir() -> &'static PathBuf {
/// Returns the path to the prompt templates directory.
///
/// This is where the prompt templates used by core features can be overridden.
pub fn prompt_overrides_dir() -> &'static PathBuf {
static PROMPT_TEMPLATES_DIR: OnceLock<PathBuf> = OnceLock::new();
PROMPT_TEMPLATES_DIR.get_or_init(|| {
if cfg!(target_os = "macos") {
config_dir().join("prompt_overrides")
} else {
support_dir().join("prompt_overrides")
///
/// # Arguments
///
/// * `repo_path` - If provided, prefer `assets/prompts` inside this Zed repository checkout when it exists.
pub fn prompt_overrides_dir(repo_path: Option<&Path>) -> PathBuf {
if let Some(path) = repo_path {
let dev_path = path.join("assets").join("prompts");
if dev_path.exists() {
return dev_path;
}
})
}
static PROMPT_TEMPLATES_DIR: OnceLock<PathBuf> = OnceLock::new();
PROMPT_TEMPLATES_DIR
.get_or_init(|| {
if cfg!(target_os = "macos") {
config_dir().join("prompt_overrides")
} else {
support_dir().join("prompt_overrides")
}
})
.clone()
}
/// Returns the path to the semantic search's embeddings directory.
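The rewritten `prompt_overrides_dir` checks for an in-repo `assets/prompts` directory before falling back to the per-user location. The precedence logic can be sketched without Zed's path helpers — `overrides_dir` and the `fallback_dir` parameter are stand-ins for the real `config_dir`/`support_dir` selection:

```rust
use std::path::{Path, PathBuf};

/// Choose the prompt-overrides directory: a development checkout's
/// `assets/prompts` wins when it exists, otherwise the per-user dir.
fn overrides_dir(repo_path: Option<&Path>, fallback_dir: &Path) -> PathBuf {
    if let Some(repo) = repo_path {
        let dev = repo.join("assets").join("prompts");
        if dev.exists() {
            return dev;
        }
    }
    fallback_dir.join("prompt_overrides")
}
```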


@@ -0,0 +1,36 @@
[package]
name = "performance"
version = "0.1.0"
edition = "2021"
publish = false
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/performance.rs"
doctest = false
[features]
test-support = [
"collections/test-support",
"gpui/test-support",
"workspace/test-support",
]
[dependencies]
anyhow.workspace = true
gpui.workspace = true
log.workspace = true
schemars.workspace = true
serde.workspace = true
settings.workspace = true
workspace.workspace = true
[dev-dependencies]
collections = { workspace = true, features = ["test-support"] }
gpui = { workspace = true, features = ["test-support"] }
settings = { workspace = true, features = ["test-support"] }
util = { workspace = true, features = ["test-support"] }
workspace = { workspace = true, features = ["test-support"] }


@@ -0,0 +1 @@
../../LICENSE-GPL


@@ -0,0 +1,189 @@
use std::time::Instant;
use anyhow::Result;
use gpui::{
div, AppContext, InteractiveElement as _, Render, StatefulInteractiveElement as _,
Subscription, ViewContext, VisualContext,
};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use settings::{Settings, SettingsSources, SettingsStore};
use workspace::{
ui::{Label, LabelCommon, LabelSize, Tooltip},
ItemHandle, StatusItemView, Workspace,
};
const SHOW_STARTUP_TIME_DURATION: std::time::Duration = std::time::Duration::from_secs(5);
pub fn init(cx: &mut AppContext) {
PerformanceSettings::register(cx);
let mut enabled = PerformanceSettings::get_global(cx)
.show_in_status_bar
.unwrap_or(false);
let start_time = Instant::now();
let mut _observe_workspaces = toggle_status_bar_items(enabled, start_time, cx);
cx.observe_global::<SettingsStore>(move |cx| {
let new_value = PerformanceSettings::get_global(cx)
.show_in_status_bar
.unwrap_or(false);
if new_value != enabled {
enabled = new_value;
_observe_workspaces = toggle_status_bar_items(enabled, start_time, cx);
}
})
.detach();
}
fn toggle_status_bar_items(
enabled: bool,
start_time: Instant,
cx: &mut AppContext,
) -> Option<Subscription> {
for window in cx.windows() {
if let Some(workspace) = window.downcast::<Workspace>() {
workspace
.update(cx, |workspace, cx| {
toggle_status_bar_item(workspace, enabled, start_time, cx);
})
.ok();
}
}
if enabled {
log::info!("performance metrics display enabled");
Some(cx.observe_new_views::<Workspace>(move |workspace, cx| {
toggle_status_bar_item(workspace, true, start_time, cx);
}))
} else {
log::info!("performance metrics display disabled");
None
}
}
struct PerformanceStatusBarItem {
display_mode: DisplayMode,
}
#[derive(Copy, Clone, Debug)]
enum DisplayMode {
StartupTime,
Fps,
}
impl PerformanceStatusBarItem {
fn new(start_time: Instant, cx: &mut ViewContext<Self>) -> Self {
let now = Instant::now();
let display_mode = if now < start_time + SHOW_STARTUP_TIME_DURATION {
DisplayMode::StartupTime
} else {
DisplayMode::Fps
};
let this = Self { display_mode };
if let DisplayMode::StartupTime = display_mode {
cx.spawn(|this, mut cx| async move {
let now = Instant::now();
let remaining_duration =
(start_time + SHOW_STARTUP_TIME_DURATION).saturating_duration_since(now);
cx.background_executor().timer(remaining_duration).await;
this.update(&mut cx, |this, cx| {
this.display_mode = DisplayMode::Fps;
cx.notify();
})
.ok();
})
.detach();
}
this
}
}
impl Render for PerformanceStatusBarItem {
fn render(&mut self, cx: &mut gpui::ViewContext<Self>) -> impl gpui::IntoElement {
let text = match self.display_mode {
DisplayMode::StartupTime => cx
.time_to_first_window_draw()
.map_or("Pending".to_string(), |duration| {
format!("{}ms", duration.as_millis())
}),
DisplayMode::Fps => cx.fps().map_or("".to_string(), |fps| {
format!("{:3} FPS", fps.round() as u32)
}),
};
use gpui::ParentElement;
let display_mode = self.display_mode;
div()
.id("performance status")
.child(Label::new(text).size(LabelSize::Small))
.tooltip(move |cx| match display_mode {
DisplayMode::StartupTime => Tooltip::text("Time to first window draw", cx),
DisplayMode::Fps => cx
.new_view(|cx| {
let tooltip = Tooltip::new("Current FPS");
if let Some(time_to_first) = cx.time_to_first_window_draw() {
tooltip.meta(format!(
"Time to first window draw: {}ms",
time_to_first.as_millis()
))
} else {
tooltip
}
})
.into(),
})
}
}
impl StatusItemView for PerformanceStatusBarItem {
fn set_active_pane_item(
&mut self,
_active_pane_item: Option<&dyn ItemHandle>,
_cx: &mut gpui::ViewContext<Self>,
) {
// This is not currently used.
}
}
fn toggle_status_bar_item(
workspace: &mut Workspace,
enabled: bool,
start_time: Instant,
cx: &mut ViewContext<Workspace>,
) {
if enabled {
workspace.status_bar().update(cx, |bar, cx| {
bar.add_right_item(
cx.new_view(|cx| PerformanceStatusBarItem::new(start_time, cx)),
cx,
)
});
} else {
workspace.status_bar().update(cx, |bar, cx| {
bar.remove_items_of_type::<PerformanceStatusBarItem>(cx);
});
}
}
/// Configuration of the display of performance details.
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema, Debug)]
pub struct PerformanceSettings {
/// Display the time to first window draw and frame rate in the status bar.
///
/// Default: false
pub show_in_status_bar: Option<bool>,
}
impl Settings for PerformanceSettings {
const KEY: Option<&'static str> = Some("performance");
type FileContent = Self;
fn load(sources: SettingsSources<Self::FileContent>, _: &mut AppContext) -> Result<Self> {
sources.json_merge()
}
}
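The status bar item above shows startup time for the first `SHOW_STARTUP_TIME_DURATION` (five seconds) and then flips to FPS, using `saturating_duration_since` so the remaining-time calculation bottoms out at zero instead of underflowing. The timing decision in isolation — names are simplified stand-ins:

```rust
use std::time::{Duration, Instant};

const SHOW_STARTUP_TIME: Duration = Duration::from_secs(5);

/// Show startup time for the first five seconds, FPS afterwards.
fn show_startup_time(start: Instant, now: Instant) -> bool {
    now < start + SHOW_STARTUP_TIME
}

/// Time left before flipping to FPS; `saturating_duration_since`
/// returns zero once the deadline has passed.
fn remaining(start: Instant, now: Instant) -> Duration {
    (start + SHOW_STARTUP_TIME).saturating_duration_since(now)
}
```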


@@ -51,6 +51,15 @@ pub struct Picker<D: PickerDelegate> {
is_modal: bool,
}
#[derive(Debug, Default, Clone, Copy, PartialEq)]
pub enum PickerEditorPosition {
#[default]
/// Render the editor at the start of the picker. Usually the top.
Start,
/// Render the editor at the end of the picker. Usually the bottom.
End,
}
pub trait PickerDelegate: Sized + 'static {
type ListItem: IntoElement;
@@ -103,8 +112,16 @@ pub trait PickerDelegate: Sized + 'static {
None
}
fn editor_position(&self) -> PickerEditorPosition {
PickerEditorPosition::default()
}
fn render_editor(&self, editor: &View<Editor>, _cx: &mut ViewContext<Picker<Self>>) -> Div {
v_flex()
.when(
self.editor_position() == PickerEditorPosition::End,
|this| this.child(Divider::horizontal()),
)
.child(
h_flex()
.overflow_hidden()
@@ -113,7 +130,10 @@ pub trait PickerDelegate: Sized + 'static {
.px_3()
.child(editor.clone()),
)
.child(Divider::horizontal())
.when(
self.editor_position() == PickerEditorPosition::Start,
|this| this.child(Divider::horizontal()),
)
}
fn render_match(
@@ -504,7 +524,7 @@ impl<D: PickerDelegate> Picker<D> {
picker
.border_color(cx.theme().colors().border_variant)
.border_b_1()
.pb(px(-1.0))
.py(px(-1.0))
},
)
}
@@ -555,6 +575,8 @@ impl<D: PickerDelegate> ModalView for Picker<D> {}
impl<D: PickerDelegate> Render for Picker<D> {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let editor_position = self.delegate.editor_position();
v_flex()
.key_context("Picker")
.size_full()
@@ -574,9 +596,15 @@ impl<D: PickerDelegate> Render for Picker<D> {
.on_action(cx.listener(Self::secondary_confirm))
.on_action(cx.listener(Self::confirm_completion))
.on_action(cx.listener(Self::confirm_input))
.child(match &self.head {
Head::Editor(editor) => self.delegate.render_editor(&editor.clone(), cx),
Head::Empty(empty_head) => div().child(empty_head.clone()),
.children(match &self.head {
Head::Editor(editor) => {
if editor_position == PickerEditorPosition::Start {
Some(self.delegate.render_editor(&editor.clone(), cx))
} else {
None
}
}
Head::Empty(empty_head) => Some(div().child(empty_head.clone())),
})
.when(self.delegate.match_count() > 0, |el| {
el.child(
@@ -602,5 +630,15 @@ impl<D: PickerDelegate> Render for Picker<D> {
)
})
.children(self.delegate.render_footer(cx))
.children(match &self.head {
Head::Editor(editor) => {
if editor_position == PickerEditorPosition::End {
Some(self.delegate.render_editor(&editor.clone(), cx))
} else {
None
}
}
Head::Empty(empty_head) => Some(div().child(empty_head.clone())),
})
}
}
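`PickerEditorPosition` lets a delegate move the query editor to the bottom of the picker; the render method then emits the editor either before the match list or after the footer. The ordering logic, reduced to plain data — the section names are illustrative, not real gpui elements:

```rust
#[derive(Debug, Default, Clone, Copy, PartialEq)]
enum EditorPosition {
    #[default]
    Start,
    End,
}

/// Render order of the picker's sections for each editor position.
fn render_order(position: EditorPosition) -> Vec<&'static str> {
    match position {
        EditorPosition::Start => vec!["editor", "matches", "footer"],
        EditorPosition::End => vec!["matches", "footer", "editor"],
    }
}
```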


@@ -232,6 +232,7 @@ pub struct Project {
cached_shell_environments: HashMap<WorktreeId, HashMap<String, String>>,
}
#[derive(Debug)]
pub enum LanguageServerToQuery {
Primary,
Other(LanguageServerId),


@@ -141,7 +141,7 @@ impl Render for QuickActionBar {
let assistant_button = QuickActionBarButton::new(
"toggle inline assistant",
IconName::MagicWand,
IconName::ZedAssistant,
false,
Box::new(InlineAssist::default()),
"Inline Assist",


@@ -261,7 +261,7 @@ impl RunningKernel {
messages_rx.push(control_reply_rx);
messages_rx.push(shell_reply_rx);
let _iopub_task = cx.background_executor().spawn({
let iopub_task = cx.background_executor().spawn({
async move {
while let Ok(message) = iopub_socket.read().await {
iopub.send(message).await?;
@@ -274,7 +274,7 @@ impl RunningKernel {
futures::channel::mpsc::channel(100);
let (mut shell_request_tx, mut shell_request_rx) = futures::channel::mpsc::channel(100);
let _routing_task = cx.background_executor().spawn({
let routing_task = cx.background_executor().spawn({
async move {
while let Some(message) = request_rx.next().await {
match message.content {
@@ -292,7 +292,7 @@ impl RunningKernel {
}
});
let _shell_task = cx.background_executor().spawn({
let shell_task = cx.background_executor().spawn({
async move {
while let Some(message) = shell_request_rx.next().await {
shell_socket.send(message).await.ok();
@@ -303,7 +303,7 @@ impl RunningKernel {
}
});
let _control_task = cx.background_executor().spawn({
let control_task = cx.background_executor().spawn({
async move {
while let Some(message) = control_request_rx.next().await {
control_socket.send(message).await.ok();
@@ -319,10 +319,10 @@ impl RunningKernel {
process,
request_tx,
working_directory,
_shell_task,
_iopub_task,
_control_task,
_routing_task,
_shell_task: shell_task,
_iopub_task: iopub_task,
_control_task: control_task,
_routing_task: routing_task,
connection_path,
execution_state: ExecutionState::Busy,
kernel_info: None,


@@ -943,26 +943,31 @@ impl Item for TerminalView {
fn tab_content(&self, params: TabContentParams, cx: &WindowContext) -> AnyElement {
let terminal = self.terminal().read(cx);
let title = terminal.title(true);
let rerun_button = |task_id: task::TaskId| {
IconButton::new("rerun-icon", IconName::Rerun)
.icon_size(IconSize::Small)
.size(ButtonSize::Compact)
.icon_color(Color::Default)
.shape(ui::IconButtonShape::Square)
.tooltip(|cx| Tooltip::text("Rerun task", cx))
.on_click(move |_, cx| {
cx.dispatch_action(Box::new(tasks_ui::Rerun {
task_id: Some(task_id.clone()),
..tasks_ui::Rerun::default()
}));
})
};
let (icon, icon_color, rerun_button) = match terminal.task() {
Some(terminal_task) => match &terminal_task.status {
TaskStatus::Unknown => (IconName::ExclamationTriangle, Color::Warning, None),
TaskStatus::Running => (IconName::Play, Color::Disabled, None),
TaskStatus::Unknown => (
IconName::ExclamationTriangle,
Color::Warning,
Some(rerun_button(terminal_task.id.clone())),
),
TaskStatus::Completed { success } => {
let task_id = terminal_task.id.clone();
let rerun_button = IconButton::new("rerun-icon", IconName::Rerun)
.icon_size(IconSize::Small)
.size(ButtonSize::Compact)
.icon_color(Color::Default)
.shape(ui::IconButtonShape::Square)
.tooltip(|cx| Tooltip::text("Rerun task", cx))
.on_click(move |_, cx| {
cx.dispatch_action(Box::new(tasks_ui::Rerun {
task_id: Some(task_id.clone()),
..Default::default()
}));
});
let rerun_button = rerun_button(terminal_task.id.clone());
if *success {
(IconName::Check, Color::Success, Some(rerun_button))
} else {
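The terminal-tab change above hoists the rerun-button construction into one closure so that every task status that should offer a rerun can reuse it, instead of rebuilding the button inline in the `Completed` arm. A sketch of that shape with simplified stand-in types (strings in place of Zed's icon and button types):

```rust
#[derive(Debug, PartialEq)]
enum TaskStatus {
    Unknown,
    Running,
    Completed { success: bool },
}

/// Pick a status icon plus an optional rerun action. The closure is
/// defined once and shared by every arm that offers a rerun.
fn tab_decorations(status: &TaskStatus, task_id: &str) -> (&'static str, Option<String>) {
    let rerun_button = |id: &str| format!("rerun:{id}");
    match status {
        TaskStatus::Unknown => ("warning", Some(rerun_button(task_id))),
        TaskStatus::Running => ("play", None),
        TaskStatus::Completed { success: true } => ("check", Some(rerun_button(task_id))),
        TaskStatus::Completed { success: false } => ("x", Some(rerun_button(task_id))),
    }
}
```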


@@ -249,6 +249,7 @@ pub struct ThemeSettingsContent {
pub ui_font_fallbacks: Option<Vec<String>>,
/// The OpenType features to enable for text in the UI.
#[serde(default)]
#[schemars(default = "default_font_features")]
pub ui_font_features: Option<FontFeatures>,
/// The weight of the UI font in CSS units from 100 to 900.
#[serde(default)]
@@ -270,6 +271,7 @@ pub struct ThemeSettingsContent {
pub buffer_line_height: Option<BufferLineHeight>,
/// The OpenType features to enable for rendering in text buffers.
#[serde(default)]
#[schemars(default = "default_font_features")]
pub buffer_font_features: Option<FontFeatures>,
/// The name of the Zed theme to use.
#[serde(default)]
@@ -288,6 +290,10 @@ pub struct ThemeSettingsContent {
pub theme_overrides: Option<ThemeStyleContent>,
}
fn default_font_features() -> Option<FontFeatures> {
Some(FontFeatures::default())
}
impl ThemeSettingsContent {
/// Sets the theme for the given appearance to the theme with the specified name.
pub fn set_theme(&mut self, theme_name: String, appearance: Appearance) {


@@ -89,6 +89,7 @@ pub struct Button {
selected_icon: Option<IconName>,
selected_icon_color: Option<Color>,
key_binding: Option<KeyBinding>,
alpha: Option<f32>,
}
impl Button {
@@ -113,6 +114,7 @@ impl Button {
selected_icon: None,
selected_icon_color: None,
key_binding: None,
alpha: None,
}
}
@@ -181,6 +183,12 @@ impl Button {
self.key_binding = key_binding.into();
self
}
/// Sets the alpha property of the color of label.
pub fn alpha(mut self, alpha: f32) -> Self {
self.alpha = Some(alpha);
self
}
}
impl Selectable for Button {
@@ -409,6 +417,7 @@ impl RenderOnce for Button {
Label::new(label)
.color(label_color)
.size(self.label_size.unwrap_or_default())
.when_some(self.alpha, |this, alpha| this.alpha(alpha))
.line_height_style(LineHeightStyle::UiLabel),
)
.children(self.key_binding),
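The new `.alpha(..)` builder stores an optional override that is applied to the label color only when set (via `when_some`). The effect, modeled on plain RGBA tuples rather than gpui's color type — `with_alpha` and the clamping are an illustrative sketch, not the library's implementation:

```rust
/// Apply an optional alpha override to an RGBA color; `None` leaves
/// the color untouched, and values are clamped to [0.0, 1.0].
fn with_alpha(rgba: (f32, f32, f32, f32), alpha: Option<f32>) -> (f32, f32, f32, f32) {
    match alpha {
        Some(a) => (rgba.0, rgba.1, rgba.2, a.clamp(0.0, 1.0)),
        None => rgba,
    }
}
```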


@@ -107,6 +107,7 @@ impl IconSize {
pub enum IconName {
Ai,
AiAnthropic,
AiAnthropicHosted,
AiOpenAi,
AiGoogle,
AiOllama,
@@ -153,10 +154,12 @@ pub enum IconName {
Copy,
CountdownTimer,
Dash,
DatabaseZap,
Delete,
Disconnected,
Download,
Ellipsis,
EllipsisVertical,
Envelope,
Escape,
ExclamationTriangle,
@@ -195,7 +198,6 @@ pub enum IconName {
LineHeight,
Link,
ListTree,
MagicWand,
MagnifyingGlass,
MailOpen,
Maximize,
@@ -230,10 +232,13 @@ pub enum IconName {
Save,
Screen,
SearchSelection,
SearchCode,
SelectAll,
Server,
Settings,
Shift,
Slash,
SlashSquare,
Sliders,
SlidersAlt,
Snip,
@@ -272,6 +277,7 @@ impl IconName {
match self {
IconName::Ai => "icons/ai.svg",
IconName::AiAnthropic => "icons/ai_anthropic.svg",
IconName::AiAnthropicHosted => "icons/ai_anthropic_hosted.svg",
IconName::AiOpenAi => "icons/ai_open_ai.svg",
IconName::AiGoogle => "icons/ai_google.svg",
IconName::AiOllama => "icons/ai_ollama.svg",
@@ -317,10 +323,12 @@ impl IconName {
IconName::Copy => "icons/copy.svg",
IconName::CountdownTimer => "icons/countdown_timer.svg",
IconName::Dash => "icons/dash.svg",
IconName::DatabaseZap => "icons/database_zap.svg",
IconName::Delete => "icons/delete.svg",
IconName::Disconnected => "icons/disconnected.svg",
IconName::Download => "icons/download.svg",
IconName::Ellipsis => "icons/ellipsis.svg",
IconName::EllipsisVertical => "icons/ellipsis_vertical.svg",
IconName::Envelope => "icons/feedback.svg",
IconName::Escape => "icons/escape.svg",
IconName::ExclamationTriangle => "icons/warning.svg",
@@ -359,7 +367,6 @@ impl IconName {
IconName::LineHeight => "icons/line_height.svg",
IconName::Link => "icons/link.svg",
IconName::ListTree => "icons/list_tree.svg",
IconName::MagicWand => "icons/magic_wand.svg",
IconName::MagnifyingGlass => "icons/magnifying_glass.svg",
IconName::MailOpen => "icons/mail_open.svg",
IconName::Maximize => "icons/maximize.svg",
@@ -394,10 +401,13 @@ impl IconName {
IconName::Save => "icons/save.svg",
IconName::Screen => "icons/desktop.svg",
IconName::SearchSelection => "icons/search_selection.svg",
IconName::SearchCode => "icons/search_code.svg",
IconName::SelectAll => "icons/select_all.svg",
IconName::Server => "icons/server.svg",
IconName::Settings => "icons/file_icons/settings.svg",
IconName::Shift => "icons/shift.svg",
IconName::Slash => "icons/slash.svg",
IconName::SlashSquare => "icons/slash_square.svg",
IconName::Sliders => "icons/sliders.svg",
IconName::SlidersAlt => "icons/sliders-alt.svg",
IconName::Snip => "icons/snip.svg",


@@ -208,14 +208,15 @@ impl<M> Default for PopoverMenuElementState<M> {
}
}
pub struct PopoverMenuFrameState {
pub struct PopoverMenuFrameState<M: ManagedView> {
child_layout_id: Option<LayoutId>,
child_element: Option<AnyElement>,
menu_element: Option<AnyElement>,
menu_handle: Rc<RefCell<Option<View<M>>>>,
}
impl<M: ManagedView> Element for PopoverMenu<M> {
type RequestLayoutState = PopoverMenuFrameState;
type RequestLayoutState = PopoverMenuFrameState<M>;
type PrepaintState = Option<HitboxId>;
fn id(&self) -> Option<ElementId> {
@@ -280,6 +281,7 @@ impl<M: ManagedView> Element for PopoverMenu<M> {
child_element,
child_layout_id,
menu_element,
menu_handle: element_state.menu.clone(),
},
),
element_state,
@@ -333,11 +335,14 @@ impl<M: ManagedView> Element for PopoverMenu<M> {
menu.paint(cx);
if let Some(child_hitbox) = *child_hitbox {
let menu_handle = request_layout.menu_handle.clone();
// Mouse-downing outside the menu dismisses it, so we don't
// want a click on the toggle to re-open it.
cx.on_mouse_event(move |_: &MouseDownEvent, phase, cx| {
if phase == DispatchPhase::Bubble && child_hitbox.is_hovered(cx) {
cx.stop_propagation()
menu_handle.borrow_mut().take();
cx.stop_propagation();
cx.refresh();
}
})
}
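The fix above threads a shared `Rc<RefCell<Option<View<M>>>>` handle into the frame state so that a mouse-down on the toggle can `take()` the open menu out of the cell, preventing the follow-up click from re-opening it. A minimal sketch of that shared-handle dismissal (the `&'static str` payload stands in for the menu view):

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Sketch of the dismissal fix: frame state shares a handle to the open
// menu, and mouse-down on the toggle takes it so the click can't reopen.
type MenuHandle = Rc<RefCell<Option<&'static str>>>;

fn on_toggle_mouse_down(handle: &MenuHandle) -> Option<&'static str> {
    // `take()` both dismisses the menu and reports what was dismissed.
    handle.borrow_mut().take()
}

fn main() {
    let handle: MenuHandle = Rc::new(RefCell::new(Some("menu")));
    assert_eq!(on_toggle_mouse_down(&handle), Some("menu"));
    // A second mouse-down finds the menu already closed.
    assert_eq!(on_toggle_mouse_down(&handle), None);
}
```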


@@ -184,6 +184,7 @@ pub trait Item: FocusableView + EventEmitter<Self::Event> {
fn to_item_events(_event: &Self::Event, _f: impl FnMut(ItemEvent)) {}
fn deactivated(&mut self, _: &mut ViewContext<Self>) {}
fn discarded(&self, _project: Model<Project>, _cx: &mut ViewContext<Self>) {}
fn workspace_deactivated(&mut self, _: &mut ViewContext<Self>) {}
fn navigate(&mut self, _: Box<dyn Any>, _: &mut ViewContext<Self>) -> bool {
false
@@ -373,6 +374,7 @@ pub trait ItemHandle: 'static + Send {
fn dragged_tab_content(&self, params: TabContentParams, cx: &WindowContext) -> AnyElement;
fn project_path(&self, cx: &AppContext) -> Option<ProjectPath>;
fn project_entry_ids(&self, cx: &AppContext) -> SmallVec<[ProjectEntryId; 3]>;
fn project_paths(&self, cx: &AppContext) -> SmallVec<[ProjectPath; 3]>;
fn project_item_model_ids(&self, cx: &AppContext) -> SmallVec<[EntityId; 3]>;
fn for_each_project_item(
&self,
@@ -393,6 +395,7 @@ pub trait ItemHandle: 'static + Send {
cx: &mut ViewContext<Workspace>,
);
fn deactivated(&self, cx: &mut WindowContext);
fn discarded(&self, project: Model<Project>, cx: &mut WindowContext);
fn workspace_deactivated(&self, cx: &mut WindowContext);
fn navigate(&self, data: Box<dyn Any>, cx: &mut WindowContext) -> bool;
fn item_id(&self) -> EntityId;
@@ -531,6 +534,16 @@ impl<T: Item> ItemHandle for View<T> {
result
}
fn project_paths(&self, cx: &AppContext) -> SmallVec<[ProjectPath; 3]> {
let mut result = SmallVec::new();
self.read(cx).for_each_project_item(cx, &mut |_, item| {
if let Some(id) = item.project_path(cx) {
result.push(id);
}
});
result
}
fn project_item_model_ids(&self, cx: &AppContext) -> SmallVec<[EntityId; 3]> {
let mut result = SmallVec::new();
self.read(cx).for_each_project_item(cx, &mut |id, _| {
@@ -724,6 +737,10 @@ impl<T: Item> ItemHandle for View<T> {
});
}
fn discarded(&self, project: Model<Project>, cx: &mut WindowContext) {
self.update(cx, |this, cx| this.discarded(project, cx));
}
fn deactivated(&self, cx: &mut WindowContext) {
self.update(cx, |this, cx| this.deactivated(cx));
}
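The new `discarded` hook above ships with a no-op default on the `Item` trait, so existing implementors compile unchanged while the handle forwards the call to whichever items opt in. A minimal sketch of that default-method pattern (simplified signatures, no gpui context types):

```rust
// Sketch of the trait-hook pattern: a no-op default keeps existing
// implementors source-compatible; opting in overrides the method.
trait Item {
    fn discarded(&self) -> &'static str {
        // Default: items that don't care about discard do nothing.
        "ignored"
    }
}

struct Buffer;
impl Item for Buffer {
    fn discarded(&self) -> &'static str {
        "cleaned up"
    }
}

struct Terminal;
impl Item for Terminal {} // relies on the default no-op

fn main() {
    assert_eq!(Buffer.discarded(), "cleaned up");
    assert_eq!(Terminal.discarded(), "ignored");
}
```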


@@ -3,6 +3,7 @@ use crate::{
ClosePosition, Item, ItemHandle, ItemSettings, PreviewTabsSettings, TabContentParams,
WeakItemHandle,
},
notifications::NotifyResultExt,
toolbar::Toolbar,
workspace_settings::{AutosaveSetting, TabBarSettings, WorkspaceSettings},
CloseWindow, CopyPath, CopyRelativePath, NewFile, NewTerminal, OpenInTerminal, OpenTerminal,
@@ -920,7 +921,22 @@ impl Pane {
cx: &AppContext,
) -> Option<Box<dyn ItemHandle>> {
self.items.iter().find_map(|item| {
if item.is_singleton(cx) && item.project_entry_ids(cx).as_slice() == [entry_id] {
if item.is_singleton(cx) && (item.project_entry_ids(cx).as_slice() == [entry_id]) {
Some(item.boxed_clone())
} else {
None
}
})
}
pub fn item_for_path(
&self,
project_path: ProjectPath,
cx: &AppContext,
) -> Option<Box<dyn ItemHandle>> {
self.items.iter().find_map(move |item| {
if item.is_singleton(cx) && (item.project_path(cx).as_slice() == [project_path.clone()])
{
Some(item.boxed_clone())
} else {
None
@@ -1485,8 +1501,13 @@ impl Pane {
})?;
match answer {
Ok(0) => {}
Ok(1) => return Ok(true), // Don't save this file
_ => return Ok(false), // Cancel
Ok(1) => {
// Don't save this file
pane.update(cx, |_, cx| item.discarded(project, cx))
.log_err();
return Ok(true);
}
_ => return Ok(false), // Cancel
}
} else {
return Ok(false);
@@ -2089,13 +2110,32 @@ impl Pane {
.read(cx)
.path_for_entry(project_entry_id, cx)
{
if let Some(split_direction) = split_direction {
to_pane = workspace.split_pane(to_pane, split_direction, cx);
}
workspace
.open_path(path, Some(to_pane.downgrade()), true, cx)
.detach_and_log_err(cx);
}
let load_path_task = workspace.load_path(path, cx);
cx.spawn(|workspace, mut cx| async move {
if let Some((project_entry_id, build_item)) =
load_path_task.await.notify_async_err(&mut cx)
{
workspace
.update(&mut cx, |workspace, cx| {
if let Some(split_direction) = split_direction {
to_pane =
workspace.split_pane(to_pane, split_direction, cx);
}
to_pane.update(cx, |pane, cx| {
pane.open_item(
project_entry_id,
true,
false,
cx,
build_item,
)
})
})
.log_err();
}
})
.detach();
};
});
})
.log_err();
@@ -2166,7 +2206,14 @@ impl Pane {
})
.ok()
{
let _opened_items: Vec<_> = open_task.await;
let opened_items: Vec<_> = open_task.await;
_ = workspace.update(&mut cx, |workspace, cx| {
for item in opened_items.into_iter().flatten() {
if let Err(e) = item {
workspace.show_error(&e, cx);
}
}
});
}
})
.detach();
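The last hunk stops discarding open results (`_opened_items`) and instead surfaces each failed open to the user. A minimal sketch of that change, with a hypothetical `open_files` standing in for the workspace's open task and a `Vec` standing in for `workspace.show_error`:

```rust
// Sketch of the error-surfacing change: iterate the open results and
// report each failure instead of silently dropping them.
fn open_files(paths: &[&str]) -> Vec<Result<String, String>> {
    paths
        .iter()
        .map(|p| {
            if p.ends_with(".txt") {
                Ok(format!("opened {p}"))
            } else {
                Err(format!("unable to open {p}"))
            }
        })
        .collect()
}

fn main() {
    let mut errors = Vec::new();
    for item in open_files(&["a.txt", "b.bin"]) {
        if let Err(e) = item {
            // In the diff this is `workspace.show_error(&e, cx)`.
            errors.push(e);
        }
    }
    assert_eq!(errors, vec!["unable to open b.bin".to_string()]);
}
```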


@@ -153,6 +153,17 @@ impl StatusBar {
cx.notify();
}
pub fn remove_items_of_type<T>(&mut self, cx: &mut ViewContext<Self>)
where
T: 'static + StatusItemView,
{
self.left_items
.retain(|item| item.item_type() != TypeId::of::<T>());
self.right_items
.retain(|item| item.item_type() != TypeId::of::<T>());
cx.notify();
}
pub fn add_right_item<T>(&mut self, item: View<T>, cx: &mut ViewContext<Self>)
where
T: 'static + StatusItemView,
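`remove_items_of_type` above uses `TypeId` to filter trait objects by their concrete type. A minimal, self-contained sketch of that mechanism (illustrative `StatusItem` trait, not the real `StatusItemView`):

```rust
use std::any::TypeId;

// Sketch of `remove_items_of_type`: each boxed item reports its concrete
// `TypeId`, and `retain` drops every item of the requested type.
trait StatusItem {
    fn item_type(&self) -> TypeId;
}

struct Cursor;
impl StatusItem for Cursor {
    fn item_type(&self) -> TypeId {
        TypeId::of::<Cursor>()
    }
}

struct Diagnostics;
impl StatusItem for Diagnostics {
    fn item_type(&self) -> TypeId {
        TypeId::of::<Diagnostics>()
    }
}

fn remove_items_of_type<T: 'static>(items: &mut Vec<Box<dyn StatusItem>>) {
    items.retain(|item| item.item_type() != TypeId::of::<T>());
}

fn main() {
    let mut items: Vec<Box<dyn StatusItem>> =
        vec![Box::new(Cursor), Box::new(Diagnostics), Box::new(Cursor)];
    remove_items_of_type::<Cursor>(&mut items);
    assert_eq!(items.len(), 1);
}
```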
