Compare commits

...

52 Commits

Author SHA1 Message Date
Richard Feldman
bd6bb9204f wip 2024-08-12 12:36:13 -04:00
Richard Feldman
4a90eba0da fixup! Handle combinations of strings and images in clipboard 2024-08-12 12:36:03 -04:00
Richard Feldman
3c183c4698 Handle combinations of strings and images in clipboard 2024-08-12 09:32:39 -04:00
Richard Feldman
ab03680094 Support copying a mixture of images and text 2024-08-12 09:16:54 -04:00
Mikayla
7e76aea435 Merge branch 'main' into paste-image 2024-08-11 23:38:58 -07:00
Mikayla
c7d2fa8b1b Fix image rendering performance 2024-08-11 23:16:57 -07:00
Nathan Sobo
355aebd0e4 Introduce prompt override script and adjust names (#16094)
This PR introduces a new script for iterative development of prompt
overrides in Zed.

Just run `script/prompts link` and your running Zed should start using
prompts from `zed/assets/prompts`. Use `script/prompts unlink` to undo.
You can also link with `script/prompts link --worktree` to store the
prompts to a `../zed_prompts` worktree that's a sibling of your repo, in
case you don't want to mess with your working copy. Just don't forget
about it!

Key changes:
- Add new `script/prompts` for managing prompt overrides
- Rename `prompt_templates_dir` to `prompt_overrides_dir` for clarity
- Update paths to use `~/.config/zed/prompt_overrides` instead of
`~/.config/zed/prompts/templates`
- Adjust `PromptBuilder` to use the new `prompt_overrides_dir` function

These changes simplify the process of customizing prompts and provide a
more intuitive naming convention for override-related functionality.

Release Notes:

- N/A
2024-08-11 17:21:17 -06:00
Joseph T Lyons
10937c6e37 Fetch staff members from GitHub org 2024-08-11 15:10:45 -04:00
Joseph T Lyons
4818e0b7eb Don't thank staff members in release notes 2024-08-11 15:02:57 -04:00
Joseph T Lyons
2490b050ee Fix warning message 2024-08-11 14:48:58 -04:00
Joseph T Lyons
e99113e4cb Fix warning message 2024-08-11 14:45:46 -04:00
Marshall Bowers
3140d6ce8c collab: Temporarily bypass LLM rate limiting for staff (#16089)
This PR makes it so staff members will be exempt from rate limiting by
the LLM service.

This is just a temporary measure until we can tweak the rate-limiting
heuristics.

Staff members are still subject to upstream LLM provider rate limits.

Release Notes:

- N/A
2024-08-11 14:41:49 -04:00
Joseph T Lyons
492f6b9cdf Update issue-detection RegEx 2024-08-11 14:40:27 -04:00
Joseph T Lyons
36a560dcd0 Update the dangerfile to check for issue links 2024-08-11 14:18:19 -04:00
Nathan Sobo
6f104fecad Copy extension_api Rust files to OUT_DIR when building extension to work around rust-analyzer limitation (#16064)
Copies Rust files from extension_api/wit to the OUT_DIR so they can be
included from within the crate, which is supported by rust-analyzer. This
lets rust-analyzer resolve the included files; it doesn't currently support
including files from outside the crate.
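The copy step boils down to a small build-time loop. Below is a minimal, self-contained sketch of the idea (not Zed's actual build script; the directory layout and helper names are assumptions), demonstrated against a temp directory instead of a real `OUT_DIR`:

```rust
use std::{fs, io, path::Path};

// Only Rust sources get copied; everything else in the directory stays put.
fn is_rust_source(name: &str) -> bool {
    name.ends_with(".rs")
}

// Copy every `.rs` file from `src` into `dest`, returning how many were copied.
fn copy_rs_files(src: &Path, dest: &Path) -> io::Result<usize> {
    fs::create_dir_all(dest)?;
    let mut copied = 0;
    for entry in fs::read_dir(src)? {
        let path = entry?.path();
        let name = path.file_name().and_then(|n| n.to_str()).unwrap_or("");
        if is_rust_source(name) {
            fs::copy(&path, dest.join(name))?;
            copied += 1;
        }
    }
    Ok(copied)
}

fn main() -> io::Result<()> {
    // In a real build script, `dest` would come from `env::var("OUT_DIR")`.
    let tmp = std::env::temp_dir().join("extension_api_out_dir_demo");
    let src = tmp.join("wit");
    let out = tmp.join("out");
    fs::create_dir_all(&src)?;
    fs::write(src.join("lib.rs"), "// generated")?;
    fs::write(src.join("README.md"), "not copied")?;
    assert_eq!(copy_rs_files(&src, &out)?, 1);
    assert!(out.join("lib.rs").exists());
    Ok(())
}
```

Because the copies live under OUT_DIR, `include!(concat!(env!("OUT_DIR"), "/lib.rs"))` stays inside the crate from rust-analyzer's point of view.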

Release Notes:

- N/A
2024-08-10 18:19:11 -06:00
Kyle Kelley
550e139b61 repl: Set the default lines ✖️ columns for the REPL to 32x128 (#16061)
![image](https://github.com/user-attachments/assets/dce938ef-bdfd-4412-a074-90c883286263)

Release Notes:

- Improved visuals of repl stdout/stderr by reducing default line count
to 32
2024-08-10 10:28:56 -07:00
Max Brunsfeld
33e120d964 Capture telemetry data on per-user monthly LLM spending (#16050)
Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
2024-08-09 16:38:37 -07:00
Max Brunsfeld
8688b2ad19 Add telemetry for LLM usage (#16049)
Release Notes:

- N/A

Co-authored-by: Marshall <marshall@zed.dev>
2024-08-09 18:15:57 -04:00
Max Brunsfeld
423c7b999a Larger rate limit integers (#16047)
Tokens per day may exceed the range of Postgres's 32-bit `integer` data
type.
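For scale, Postgres's 32-bit `integer` tops out at 2,147,483,647, which a month of heavy usage can exceed. A small Rust illustration of why the counter needs 64 bits (the token figures are made up for the example, not real usage data):

```rust
// Accumulate daily token counts in a 32-bit counter, detecting overflow.
fn accumulate_tokens_i32(daily_tokens: &[i32]) -> Option<i32> {
    daily_tokens.iter().try_fold(0i32, |acc, &t| acc.checked_add(t))
}

// The same accumulation in 64 bits (Postgres `bigint`) has ample headroom.
fn accumulate_tokens_i64(daily_tokens: &[i64]) -> i64 {
    daily_tokens.iter().sum()
}

fn main() {
    // 30 days at 100M tokens/day totals 3 billion, past i32::MAX (2_147_483_647).
    assert_eq!(accumulate_tokens_i32(&[100_000_000; 30]), None);
    assert_eq!(accumulate_tokens_i64(&[100_000_000; 30]), 3_000_000_000);
}
```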

Release Notes:

- N/A

Co-authored-by: Marshall <marshall@zed.dev>
2024-08-09 14:07:49 -07:00
Marshall Bowers
0932818829 zed_extension_api: Release v0.0.7 (#16048)
This PR releases v0.0.7 of the Zed extension API.

Support for this version of the extension API will land in Zed v0.149.0.

Release Notes:

- N/A
2024-08-09 16:52:16 -04:00
Max Brunsfeld
fbebb73d7b Use LLM service for tool call requests (#16046)
Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
2024-08-09 16:22:58 -04:00
Max Brunsfeld
d96afde5bf Avoid insert ... on conflict on startup (#16045)
These queries advance the id sequence even when there's nothing to
insert.
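In Postgres, `INSERT ... ON CONFLICT` draws ids from the table's sequence even for rows that end up conflicting, so a seed query run on every startup steadily burns id space. A sketch of the guard (the `settings` table and query shape are hypothetical, not collab's actual schema):

```rust
// Build the seed query only when there are rows to insert; returning None
// skips the statement entirely, so the id sequence is never advanced.
fn seed_query(rows: &[(&str, &str)]) -> Option<String> {
    if rows.is_empty() {
        return None;
    }
    let values = rows
        .iter()
        .map(|(key, value)| format!("('{key}', '{value}')"))
        .collect::<Vec<_>>()
        .join(", ");
    Some(format!(
        "INSERT INTO settings (key, value) VALUES {values} ON CONFLICT (key) DO NOTHING"
    ))
}

fn main() {
    // Nothing to seed: no statement at all, sequence untouched.
    assert_eq!(seed_query(&[]), None);
    let query = seed_query(&[("max_connections", "10")]).unwrap();
    assert!(query.contains("ON CONFLICT (key) DO NOTHING"));
}
```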

Release Notes:

- N/A

Co-authored-by: Marshall <marshall@zed.dev>
2024-08-09 15:32:11 -04:00
Max Brunsfeld
b1c69c2178 Fix usage recording in llm service (#16044)
Release Notes:

- N/A

Co-authored-by: Marshall <marshall@zed.dev>
2024-08-09 11:48:18 -07:00
Peter Tripp
eb3c4b0e46 Docs Party 2024 (#15876)
Co-authored-by: Raunak Raj <nkray21111983@gmail.com>
Co-authored-by: Thorsten Ball <mrnugget@gmail.com>
Co-authored-by: Bennet <bennet@zed.dev>
Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
Co-authored-by: Joseph T Lyons <JosephTLyons@gmail.com>
Co-authored-by: Mikayla <mikayla@zed.dev>
Co-authored-by: Jason <jason@zed.dev>
Co-authored-by: Antonio Scandurra <me@as-cii.com>
Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
Co-authored-by: Marshall <marshall@zed.dev>
Co-authored-by: Nathan Sobo <nathan@zed.dev>
Co-authored-by: Jason Mancuso <7891333+jvmncs@users.noreply.github.com>
Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2024-08-09 13:37:54 -04:00
Bennet Bo Fenner
c633fa5a10 assistant: Improve quote selection (#16038)
https://github.com/user-attachments/assets/fe283eec-169f-4dfd-bae2-cc72c1c3f223



Release Notes:

- Include more context when using `assistant: Quote selection` to insert
text into the assistant panel

---------

Co-authored-by: Thorsten <thorsten@zed.dev>
2024-08-09 18:29:27 +02:00
renovate[bot]
46a4dd3ab1 Update Rust crate prometheus to v0.13.4 (#15941)

This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [prometheus](https://togithub.com/tikv/rust-prometheus) | dependencies
| patch | `0.13.3` -> `0.13.4` |

---

### Release Notes

<details>
<summary>tikv/rust-prometheus (prometheus)</summary>

###
[`v0.13.4`](https://togithub.com/tikv/rust-prometheus/blob/HEAD/CHANGELOG.md#0134)

[Compare
Source](https://togithub.com/tikv/rust-prometheus/compare/v0.13.3...v0.13.4)

- Improvement: Add PullingGauge
([#&#8203;405](https://togithub.com/tikv/rust-prometheus/issues/405))

- Improvement: Let cargo know which example requires which features
([#&#8203;511](https://togithub.com/tikv/rust-prometheus/issues/511))

- Bug fix: Prevent `clippy::ignored_unit_patterns` in macro expansions
([#&#8203;497](https://togithub.com/tikv/rust-prometheus/issues/497))

- Internal change: Add CI job for minimum toolchain (MSRV)
([#&#8203;467](https://togithub.com/tikv/rust-prometheus/issues/467))

- Internal change: Update CI to `actions/checkout@v4`
([#&#8203;499](https://togithub.com/tikv/rust-prometheus/issues/499))

- Internal change: Update dependencies

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---


Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2024-08-09 11:54:08 -04:00
Marshall Bowers
49f760eeda collab: Set LLM_DATABASE_MAX_CONNECTIONS (#16035)
This PR updates the collab template to set the
`LLM_DATABASE_MAX_CONNECTIONS` environment variable for the LLM service.

Release Notes:

- N/A
2024-08-09 11:16:02 -04:00
Max Brunsfeld
225726ba4a Remove code paths that skip LLM db in prod (#16008)
Release Notes:

- N/A
2024-08-09 10:41:50 -04:00
Thorsten Ball
c1872e9cb0 vim: Fix ctrl-u/ctrl-d with high vertical_scroll_margin (#16031)
This makes sure that the `vertical_scroll_margin` doesn't push the cursor
off-screen or somewhere it shouldn't go when the margin is higher than the
number of visible lines on screen.

So we cap it to `visible_line_count / 2`, similar to nvim:


5aa1a9532c/src/nvim/window.c (L6560)

Fixes #15101
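The cap is simple to state in isolation; a minimal sketch (hypothetical free function, while the real logic lives in Zed's editor scrolling code):

```rust
// Cap the configured scroll margin at half the viewport, as nvim does,
// so a huge margin can never push the cursor off-screen.
fn effective_scroll_margin(configured: f32, visible_line_count: f32) -> f32 {
    configured.min((visible_line_count / 2.0).floor())
}

fn main() {
    // "vertical_scroll_margin": 99 on a 40-line viewport behaves like 20.
    assert_eq!(effective_scroll_margin(99.0, 40.0), 20.0);
    // Sensible margins pass through unchanged.
    assert_eq!(effective_scroll_margin(3.0, 40.0), 3.0);
}
```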

Release Notes:

- Fixed `ctrl-u`/`ctrl-d` in Vim mode not working correctly when
`vertical_scroll_margin` is set to a really high value.

Co-authored-by: Bennet <bennet@zed.dev>
2024-08-09 16:00:05 +02:00
Thorsten Ball
09c9ed4765 vim: Fix panic due to overflow when scrolling (#16029)
When setting `"vertical_scroll_margin": 99` or other high values, this can
lead to a panic that crashes Zed.

Release Notes:

- vim: Fixed a possible panic that could happen when using a very high
value for `vertical_scroll_margin` that exceeded the number of visible
lines on the screen.

Co-authored-by: Bennet <bennet@zed.dev>
2024-08-09 15:15:27 +02:00
Thorsten Ball
19d8422933 stories: Get OverflowScrollStory to scroll again (#15982)
This makes it at least scroll again, but it doesn't scroll down to the
last element yet; we haven't figured out why.


Release Notes:

- N/A

Co-authored-by: Bennet <bennet@zed.dev>
Co-authored-by: Antonio <antonio@zed.dev>
2024-08-09 12:32:26 +02:00
Thorsten Ball
173f6e7c8f assistant panel: Make configuration view scrollable (#16022)
We had to resort to this "hack" to get it to work, since nothing else we
tried (changing the `Pane`, changing the `ConfigurationView`, changing
the `AssistantPanel`, ...) worked.

Release Notes:

- N/A

Co-authored-by: Antonio <antonio@zed.dev>
Co-authored-by: Bennet <bennet@zed.dev>
2024-08-09 12:21:22 +02:00
Mikayla
76966b33f7 WIP 2024-08-08 09:14:42 -07:00
Richard Feldman
909e31a48d Merge remote-tracking branch 'origin/main' into paste-image 2024-08-06 21:47:32 -04:00
Richard Feldman
34053860c9 Include images in token counts 2024-08-06 21:28:47 -04:00
Mikayla
42727dd74e End to end image uploading achieved 2024-08-06 16:40:50 -07:00
Richard Feldman
cac634960e Add an initial test for messages_from_iters 2024-08-06 15:58:48 -04:00
Richard Feldman
bd3d81a0e2 Got everything compiling again
Co-authored-by: Jason <jason@zed.dev>
2024-08-06 15:42:19 -04:00
Mikayla
e600b35dcd WIP
co-authored-by: Kyle <kylek@zed.dev>
co-authored-by: Richard <richard@zed.dev>
co-authored-by: Jason <jason@zed.dev>
2024-08-06 12:01:39 -07:00
Richard Feldman
63837e5d43 Gracefully handle image not being able to be converted to RGBA 2024-08-06 13:33:14 -04:00
Richard Feldman
fbf71cf8f8 Fix sizing on pasted images
Co-authored-by: Antonio <antonio@zed.dev>
Co-authored-by: Jason <jason@zed.dev>
2024-08-06 10:54:28 -04:00
Richard Feldman
2d6e83d1d0 Fix inserting/deleting image behavior 2024-08-06 09:36:26 -04:00
Mikayla
092610bc15 Prep types to re-use the GPUI cache, expand the general message type to incorporate images 2024-08-05 22:51:09 -07:00
Mikayla
257e8aaacc Split img into img and surface, add image rendering to assistant panel
co-authored-by: Richard <richard@zed.dev>
2024-08-05 12:44:58 -07:00
Richard Feldman
5664f1f95d Introduce ClipboardImage 2024-08-05 13:04:24 -04:00
Richard Feldman
d60a29bbe6 Add ClipboardImage::to_image_data
co-authored-by: Antonio <antonio@zed.dev>
2024-08-05 11:38:25 -04:00
Richard Feldman
e7e59aa10c Add paste hook in assistant panel 2024-08-05 11:21:35 -04:00
Richard Feldman
61b204bedb Fix a test 2024-08-05 08:12:25 -04:00
Richard Feldman
d9c42a817f Fix UTType strategy 2024-08-04 17:46:47 -04:00
Richard Feldman
bbdefeab25 Support copying and pasting images better 2024-08-04 17:14:43 -04:00
Richard Feldman
4667e2ec22 Add ClipboardItem::string 2024-08-04 14:34:28 -04:00
Richard Feldman
8c8d350b61 Add support for copying and pasting images 2024-08-04 09:53:42 -04:00
187 changed files with 6744 additions and 2622 deletions

Cargo.lock (generated): 2263 changed lines

File diff suppressed because it is too large.

View File

@@ -53,7 +53,7 @@ impl Model {
Model::Claude3_5Sonnet => "claude-3-5-sonnet-20240620",
Model::Claude3Opus => "claude-3-opus-20240229",
Model::Claude3Sonnet => "claude-3-sonnet-20240229",
- Model::Claude3Haiku => "claude-3-opus-20240307",
+ Model::Claude3Haiku => "claude-3-haiku-20240307",
Self::Custom { name, .. } => name,
}
}

View File

@@ -66,6 +66,8 @@ semantic_index.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
similar.workspace = true
smallvec.workspace = true
smol.workspace = true
telemetry_events.workspace = true
terminal.workspace = true

View File

@@ -6,6 +6,7 @@ use crate::{
slash_command::{
default_command::DefaultSlashCommand,
docs_command::{DocsSlashCommand, DocsSlashCommandArgs},
file_command::codeblock_fence_for_path,
SlashCommandCompletionProvider, SlashCommandRegistry,
},
terminal_inline_assistant::TerminalInlineAssistant,
@@ -32,12 +33,12 @@ use editor::{
use editor::{display_map::CreaseId, FoldPlaceholder};
use fs::Fs;
use gpui::{
div, percentage, point, Action, Animation, AnimationExt, AnyElement, AnyView, AppContext,
AsyncWindowContext, ClipboardItem, Context as _, DismissEvent, Empty, Entity, EntityId,
EventEmitter, FocusHandle, FocusableView, FontWeight, InteractiveElement, IntoElement, Model,
ParentElement, Pixels, ReadGlobal, Render, SharedString, StatefulInteractiveElement, Styled,
Subscription, Task, Transformation, UpdateGlobal, View, ViewContext, VisualContext, WeakView,
WindowContext,
canvas, div, img, percentage, point, size, Action, Animation, AnimationExt, AnyElement,
AnyView, AppContext, AsyncWindowContext, ClipboardEntry, ClipboardItem, Context as _,
DismissEvent, Empty, Entity, EntityId, EventEmitter, FocusHandle, FocusableView, FontWeight,
InteractiveElement, IntoElement, Model, ParentElement, Pixels, ReadGlobal, Render, RenderImage,
SharedString, Size, StatefulInteractiveElement, Styled, Subscription, Task, Transformation,
UpdateGlobal, View, ViewContext, VisualContext, WeakView, WindowContext,
};
use indexed_docs::IndexedDocsStore;
use language::{
@@ -1608,6 +1609,7 @@ pub struct ContextEditor {
lsp_adapter_delegate: Option<Arc<dyn LspAdapterDelegate>>,
editor: View<Editor>,
blocks: HashSet<CustomBlockId>,
image_blocks: HashSet<CustomBlockId>,
scroll_position: Option<ScrollPosition>,
remote_id: Option<workspace::ViewId>,
pending_slash_command_creases: HashMap<Range<language::Anchor>, CreaseId>,
@@ -1664,6 +1666,7 @@ impl ContextEditor {
editor,
lsp_adapter_delegate,
blocks: Default::default(),
image_blocks: Default::default(),
scroll_position: None,
remote_id: None,
fs,
@@ -1678,6 +1681,7 @@ impl ContextEditor {
error_message: None,
};
this.update_message_headers(cx);
this.update_image_blocks(cx);
this.insert_slash_command_output_sections(sections, cx);
this
}
@@ -2040,6 +2044,7 @@ impl ContextEditor {
match event {
ContextEvent::MessagesEdited => {
self.update_message_headers(cx);
self.update_image_blocks(cx);
self.context.update(cx, |context, cx| {
context.save(Some(Duration::from_millis(500)), self.fs.clone(), cx);
});
@@ -2929,6 +2934,11 @@ impl ContextEditor {
let buffer = editor.buffer().read(cx).snapshot(cx);
let range = editor::ToOffset::to_offset(&selection.start, &buffer)
..editor::ToOffset::to_offset(&selection.end, &buffer);
let selected_text = buffer.text_for_range(range.clone()).collect::<String>();
if selected_text.is_empty() {
return;
}
let start_language = buffer.language_at(range.start);
let end_language = buffer.language_at(range.end);
let language_name = if start_language == end_language {
@@ -2938,19 +2948,66 @@ impl ContextEditor {
};
let language_name = language_name.as_deref().unwrap_or("");
let selected_text = buffer.text_for_range(range).collect::<String>();
let text = if selected_text.is_empty() {
None
let filename = buffer
.file_at(selection.start)
.map(|file| file.full_path(cx));
let text = if language_name == "markdown" {
selected_text
.lines()
.map(|line| format!("> {}", line))
.collect::<Vec<_>>()
.join("\n")
} else {
Some(if language_name == "markdown" {
selected_text
.lines()
.map(|line| format!("> {}", line))
.collect::<Vec<_>>()
.join("\n")
let start_symbols = buffer
.symbols_containing(selection.start, None)
.map(|(_, symbols)| symbols);
let end_symbols = buffer
.symbols_containing(selection.end, None)
.map(|(_, symbols)| symbols);
let outline_text =
if let Some((start_symbols, end_symbols)) = start_symbols.zip(end_symbols) {
Some(
start_symbols
.into_iter()
.zip(end_symbols)
.take_while(|(a, b)| a == b)
.map(|(a, _)| a.text)
.collect::<Vec<_>>()
.join(" > "),
)
} else {
None
};
let line_comment_prefix = start_language
.and_then(|l| l.default_scope().line_comment_prefixes().first().cloned());
let fence = codeblock_fence_for_path(
filename.as_deref(),
Some(selection.start.row..selection.end.row),
);
if let Some((line_comment_prefix, outline_text)) = line_comment_prefix.zip(outline_text)
{
let breadcrumb = format!("{line_comment_prefix}Excerpt from: {outline_text}\n");
format!("{fence}{breadcrumb}{selected_text}\n```")
} else {
format!("```{language_name}\n{selected_text}\n```")
})
format!("{fence}{selected_text}\n```")
}
};
let crease_title = if let Some(path) = filename {
let start_line = selection.start.row + 1;
let end_line = selection.end.row + 1;
if start_line == end_line {
format!("{}, Line {}", path.display(), start_line)
} else {
format!("{}, Lines {} to {}", path.display(), start_line, end_line)
}
} else {
"Quoted selection".to_string()
};
// Activate the panel
@@ -2958,24 +3015,55 @@ impl ContextEditor {
workspace.toggle_panel_focus::<AssistantPanel>(cx);
}
if let Some(text) = text {
panel.update(cx, |_, cx| {
// Wait to create a new context until the workspace is no longer
// being updated.
cx.defer(move |panel, cx| {
if let Some(context) = panel
.active_context_editor(cx)
.or_else(|| panel.new_context(cx))
{
context.update(cx, |context, cx| {
context
.editor
.update(cx, |editor, cx| editor.insert(&text, cx))
});
};
});
panel.update(cx, |_, cx| {
// Wait to create a new context until the workspace is no longer
// being updated.
cx.defer(move |panel, cx| {
if let Some(context) = panel
.active_context_editor(cx)
.or_else(|| panel.new_context(cx))
{
context.update(cx, |context, cx| {
context.editor.update(cx, |editor, cx| {
editor.insert("\n", cx);
let point = editor.selections.newest::<Point>(cx).head();
let start_row = MultiBufferRow(point.row);
editor.insert(&text, cx);
let snapshot = editor.buffer().read(cx).snapshot(cx);
let anchor_before = snapshot.anchor_after(point);
let anchor_after = editor
.selections
.newest_anchor()
.head()
.bias_left(&snapshot);
editor.insert("\n", cx);
let fold_placeholder = quote_selection_fold_placeholder(
crease_title,
cx.view().downgrade(),
);
let crease = Crease::new(
anchor_before..anchor_after,
fold_placeholder,
render_quote_selection_output_toggle,
|_, _, _| Empty.into_any(),
);
editor.insert_creases(vec![crease], cx);
editor.fold_at(
&FoldAt {
buffer_row: start_row,
},
cx,
);
})
});
};
});
}
});
}
fn copy(&mut self, _: &editor::actions::Copy, cx: &mut ViewContext<Self>) {
@@ -2985,6 +3073,8 @@ impl ContextEditor {
let selection = editor.selections.newest::<usize>(cx);
let mut copied_text = String::new();
let mut spanned_messages = 0;
let mut clipboard_entries: Vec<ClipboardEntry> = Vec::new();
for message in context.messages(cx) {
if message.offset_range.start >= selection.range().end {
break;
@@ -3002,8 +3092,16 @@ impl ContextEditor {
}
}
for image_anchor in context.image_anchors(cx) {
if let Some((render_image, _)) = context.get_image(image_anchor.image_id) {
//
} else {
log::error!("Assistant panel context had an image id of {:?} but there was no associated images entry stored for that id. This should never happen!", image_anchor.image_id);
}
}
if spanned_messages > 1 {
- cx.write_to_clipboard(ClipboardItem::new(copied_text));
+ cx.write_to_clipboard(ClipboardItem::new(clipboard_entries));
return;
}
}
@@ -3011,6 +3109,104 @@ impl ContextEditor {
cx.propagate();
}
fn paste(&mut self, _: &editor::actions::Paste, cx: &mut ViewContext<Self>) {
let images = if let Some(item) = cx.read_from_clipboard() {
item.into_entries()
.filter_map(|entry| {
if let ClipboardEntry::Image(image) = entry {
Some(image)
} else {
None
}
})
.collect()
} else {
Vec::new()
};
if images.is_empty() {
// If we didn't find any valid image data to paste, propagate to let normal pasting happen.
cx.propagate();
} else {
let mut image_positions = Vec::new();
self.editor.update(cx, |editor, cx| {
editor.transact(cx, |editor, cx| {
let edits = editor
.selections
.all::<usize>(cx)
.into_iter()
.map(|selection| (selection.start..selection.end, "\n"));
editor.edit(edits, cx);
let snapshot = editor.buffer().read(cx).snapshot(cx);
for selection in editor.selections.all::<usize>(cx) {
image_positions.push(snapshot.anchor_before(selection.end));
}
});
});
self.context.update(cx, |context, cx| {
for image in images {
let image_id = image.id();
context.insert_image(image, cx);
for image_position in image_positions.iter() {
context.insert_image_anchor(image_id, image_position.text_anchor, cx);
}
}
});
}
}
fn update_image_blocks(&mut self, cx: &mut ViewContext<Self>) {
self.editor.update(cx, |editor, cx| {
let buffer = editor.buffer().read(cx).snapshot(cx);
let excerpt_id = *buffer.as_singleton().unwrap().0;
let old_blocks = std::mem::take(&mut self.image_blocks);
let new_blocks = self
.context
.read(cx)
.image_anchors(cx)
.filter_map(|image_anchor| {
const MAX_HEIGHT_IN_LINES: u32 = 8;
let anchor = buffer
.anchor_in_excerpt(excerpt_id, image_anchor.anchor)
.unwrap();
let image = image_anchor.render_image.clone();
anchor.is_valid(&buffer).then(|| BlockProperties {
position: anchor,
height: MAX_HEIGHT_IN_LINES,
style: BlockStyle::Sticky,
render: Box::new(move |cx| {
let image_size = size_for_image(
&image,
size(
cx.max_width - cx.gutter_dimensions.full_width(),
MAX_HEIGHT_IN_LINES as f32 * cx.line_height,
),
);
h_flex()
.pl(cx.gutter_dimensions.full_width())
.child(
img(image.clone())
.object_fit(gpui::ObjectFit::ScaleDown)
.w(image_size.width)
.h(image_size.height),
)
.into_any_element()
}),
disposition: BlockDisposition::Above,
priority: 0,
})
})
.collect::<Vec<_>>();
editor.remove_blocks(old_blocks, None, cx);
let ids = editor.insert_blocks(new_blocks, None, cx);
self.image_blocks = HashSet::from_iter(ids);
});
}
fn split(&mut self, _: &Split, cx: &mut ViewContext<Self>) {
self.context.update(cx, |context, cx| {
let selections = self.editor.read(cx).selections.disjoint_anchors();
@@ -3211,6 +3407,7 @@ impl Render for ContextEditor {
.capture_action(cx.listener(ContextEditor::cancel))
.capture_action(cx.listener(ContextEditor::save))
.capture_action(cx.listener(ContextEditor::copy))
.capture_action(cx.listener(ContextEditor::paste))
.capture_action(cx.listener(ContextEditor::cycle_message_role))
.capture_action(cx.listener(ContextEditor::confirm_command))
.on_action(cx.listener(ContextEditor::assist))
@@ -3945,7 +4142,7 @@ impl Render for ConfigurationView {
.map(|provider| self.render_provider_view(&provider, cx))
.collect::<Vec<_>>();
v_flex()
let mut element = v_flex()
.id("assistant-configuration-view")
.track_focus(&self.focus_handle)
.bg(cx.theme().colors().editor_background)
@@ -3970,9 +4167,24 @@ impl Render for ConfigurationView {
.p(Spacing::XXLarge.rems(cx))
.mt_1()
.gap_6()
.size_full()
.flex_1()
.children(provider_views),
)
.into_any();
// We use a canvas here to get scrolling to work in the ConfigurationView. It's a workaround
// because we couldn't get the element to take up the size of the parent.
canvas(
move |bounds, cx| {
element.prepaint_as_root(bounds.origin, bounds.size.into(), cx);
element
},
|_, mut element, cx| {
element.paint(cx);
},
)
.flex_1()
.w_full()
}
}
@@ -4013,6 +4225,47 @@ fn render_slash_command_output_toggle(
.into_any_element()
}
fn quote_selection_fold_placeholder(title: String, editor: WeakView<Editor>) -> FoldPlaceholder {
FoldPlaceholder {
render: Arc::new({
move |fold_id, fold_range, _cx| {
let editor = editor.clone();
ButtonLike::new(fold_id)
.style(ButtonStyle::Filled)
.layer(ElevationIndex::ElevatedSurface)
.child(Icon::new(IconName::FileText))
.child(Label::new(title.clone()).single_line())
.on_click(move |_, cx| {
editor
.update(cx, |editor, cx| {
let buffer_start = fold_range
.start
.to_point(&editor.buffer().read(cx).read(cx));
let buffer_row = MultiBufferRow(buffer_start.row);
editor.unfold_at(&UnfoldAt { buffer_row }, cx);
})
.ok();
})
.into_any_element()
}
}),
constrain_width: false,
merge_adjacent: false,
}
}
fn render_quote_selection_output_toggle(
row: MultiBufferRow,
is_folded: bool,
fold: ToggleFold,
_cx: &mut WindowContext,
) -> AnyElement {
Disclosure::new(("quote-selection-indicator", row.0 as u64), !is_folded)
.selected(is_folded)
.on_click(move |_e, cx| fold(!is_folded, cx))
.into_any_element()
}
fn render_pending_slash_command_gutter_decoration(
row: MultiBufferRow,
status: &PendingSlashCommandStatus,
@@ -4168,6 +4421,30 @@ fn token_state(context: &Model<Context>, cx: &AppContext) -> Option<TokenState>
Some(token_state)
}
fn size_for_image(data: &RenderImage, max_size: Size<Pixels>) -> Size<Pixels> {
let image_size = data
.size(0)
.map(|dimension| Pixels::from(u32::from(dimension)));
let image_ratio = image_size.width / image_size.height;
let bounds_ratio = max_size.width / max_size.height;
if image_size.width > max_size.width || image_size.height > max_size.height {
if bounds_ratio > image_ratio {
size(
image_size.width * (max_size.height / image_size.height),
max_size.height,
)
} else {
size(
max_size.width,
image_size.height * (max_size.width / image_size.width),
)
}
} else {
size(image_size.width, image_size.height)
}
}
enum ConfigurationError {
NoProvider,
ProviderNotAuthenticated,

View File

@@ -16,22 +16,23 @@ use futures::{
FutureExt, StreamExt,
};
use gpui::{
AppContext, Context as _, EventEmitter, Model, ModelContext, Subscription, Task, UpdateGlobal,
View, WeakView,
AppContext, Context as _, EventEmitter, Image, Model, ModelContext, RenderImage, Subscription,
Task, UpdateGlobal, View, WeakView,
};
use language::{
AnchorRangeExt, Bias, Buffer, BufferSnapshot, LanguageRegistry, OffsetRangeExt, ParseStatus,
Point, ToOffset,
};
use language_model::{
LanguageModelRegistry, LanguageModelRequest, LanguageModelRequestMessage, LanguageModelTool,
Role,
LanguageModelImage, LanguageModelRegistry, LanguageModelRequest, LanguageModelRequestMessage,
LanguageModelTool, Role,
};
use open_ai::Model as OpenAiModel;
use paths::contexts_dir;
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use smallvec::SmallVec;
use std::{
cmp,
fmt::Debug,
@@ -319,8 +320,23 @@ pub struct MessageMetadata {
timestamp: clock::Lamport,
}
- #[derive(Clone, Debug, PartialEq, Eq)]
+ #[derive(Clone, Debug)]
pub struct MessageImage {
image_id: u64,
image: Shared<Task<Option<LanguageModelImage>>>,
}
impl PartialEq for MessageImage {
fn eq(&self, other: &Self) -> bool {
self.image_id == other.image_id
}
}
impl Eq for MessageImage {}
#[derive(Clone, Debug)]
pub struct Message {
pub image_offsets: SmallVec<[(usize, MessageImage); 1]>,
pub offset_range: Range<usize>,
pub index_range: Range<usize>,
pub id: MessageId,
@@ -331,13 +347,55 @@ pub struct Message {
impl Message {
fn to_request_message(&self, buffer: &Buffer) -> LanguageModelRequestMessage {
let mut content = Vec::new();
let mut range_start = self.offset_range.start;
for (image_offset, message_image) in self.image_offsets.iter() {
if *image_offset != range_start {
content.push(
buffer
.text_for_range(range_start..*image_offset)
.collect::<String>()
.into(),
)
}
if let Some(image) = message_image.image.clone().now_or_never().flatten() {
content.push(language_model::MessageContent::Image(image));
}
range_start = *image_offset;
}
if range_start != self.offset_range.end {
content.push(
buffer
.text_for_range(range_start..self.offset_range.end)
.collect::<String>()
.into(),
)
}
LanguageModelRequestMessage {
role: self.role,
- content: buffer.text_for_range(self.offset_range.clone()).collect(),
+ content,
}
}
}
#[derive(Clone, Debug)]
pub struct ImageAnchor {
pub anchor: language::Anchor,
pub image_id: u64,
pub render_image: Arc<RenderImage>,
pub image: Shared<Task<Option<LanguageModelImage>>>,
}
impl PartialEq for ImageAnchor {
fn eq(&self, other: &Self) -> bool {
self.image_id == other.image_id
}
}
struct PendingCompletion {
id: usize,
_task: Task<()>,
@@ -605,6 +663,8 @@ pub struct Context {
finished_slash_commands: HashSet<SlashCommandId>,
slash_command_output_sections: Vec<SlashCommandOutputSection<language::Anchor>>,
message_anchors: Vec<MessageAnchor>,
images: HashMap<u64, (Arc<RenderImage>, Shared<Task<Option<LanguageModelImage>>>)>,
image_anchors: Vec<ImageAnchor>,
messages_metadata: HashMap<MessageId, MessageMetadata>,
summary: Option<ContextSummary>,
pending_summary: Task<Option<()>>,
@@ -674,6 +734,8 @@ impl Context {
pending_ops: Vec::new(),
operations: Vec::new(),
message_anchors: Default::default(),
image_anchors: Default::default(),
images: Default::default(),
messages_metadata: Default::default(),
pending_slash_commands: Vec::new(),
finished_slash_commands: HashSet::default(),
@@ -1288,7 +1350,7 @@ impl Context {
request.messages.push(LanguageModelRequestMessage {
role: Role::User,
- content: prompt,
+ content: vec![prompt.into()],
});
// Invoke the model to get its edit suggestions for this workflow step.
@@ -1690,13 +1752,15 @@ impl Context {
}
pub fn to_completion_request(&self, cx: &AppContext) -> LanguageModelRequest {
let messages = self
let buffer = self.buffer.read(cx);
let request_messages = self
.messages(cx)
.filter(|message| matches!(message.status, MessageStatus::Done))
.map(|message| message.to_request_message(self.buffer.read(cx)));
.filter(|message| message.status == MessageStatus::Done)
.map(|message| message.to_request_message(&buffer))
.collect();
LanguageModelRequest {
- messages: messages.collect(),
+ messages: request_messages,
stop: vec![],
temperature: 1.0,
}
@@ -1794,6 +1858,68 @@ impl Context {
}
}
pub fn insert_image(&mut self, image: Image, cx: &mut ModelContext<Self>) -> Option<()> {
if !self.images.contains_key(&image.id()) {
self.images.insert(
image.id(),
(
image.to_image_data(cx).log_err()?,
LanguageModelImage::from_image(image, cx).shared(),
),
);
}
Some(())
}
pub fn insert_image_anchor(
&mut self,
image_id: u64,
anchor: language::Anchor,
cx: &mut ModelContext<Self>,
) -> bool {
cx.emit(ContextEvent::MessagesEdited);
let buffer = self.buffer.read(cx);
let insertion_ix = match self
.image_anchors
.binary_search_by(|existing_anchor| anchor.cmp(&existing_anchor.anchor, buffer))
{
Ok(ix) => ix,
Err(ix) => ix,
};
if let Some((render_image, image)) = self.images.get(&image_id) {
self.image_anchors.insert(
insertion_ix,
ImageAnchor {
anchor,
image_id,
image: image.clone(),
render_image: render_image.clone(),
},
);
true
} else {
false
}
}
pub fn image_anchors<'a>(
&'a self,
_cx: &'a AppContext,
) -> impl 'a + Iterator<Item = ImageAnchor> {
self.image_anchors.iter().cloned()
}
pub fn get_image(
&self,
image_id: u64,
) -> Option<&(Arc<RenderImage>, Shared<Task<Option<LanguageModelImage>>>)> {
self.images.get(&image_id)
}
pub fn split_message(
&mut self,
range: Range<usize>,
@@ -1812,7 +1938,10 @@ impl Context {
let mut edited_buffer = false;
let mut suffix_start = None;
- if range.start > message.offset_range.start && range.end < message.offset_range.end - 1
+ // TODO: why did this start panicking?
+ if range.start > message.offset_range.start
+     && range.end < message.offset_range.end.saturating_sub(1)
{
if self.buffer.read(cx).chars_at(range.end).next() == Some('\n') {
suffix_start = Some(range.end + 1);
@@ -1954,7 +2083,9 @@ impl Context {
.map(|message| message.to_request_message(self.buffer.read(cx)))
.chain(Some(LanguageModelRequestMessage {
role: Role::User,
content: "Summarize the context into a short title without punctuation.".into(),
content: vec![
"Summarize the context into a short title without punctuation.".into(),
],
}));
let request = LanguageModelRequest {
messages: messages.collect(),
@@ -2056,25 +2187,55 @@ impl Context {
pub fn messages<'a>(&'a self, cx: &'a AppContext) -> impl 'a + Iterator<Item = Message> {
let buffer = self.buffer.read(cx);
let mut message_anchors = self.message_anchors.iter().enumerate().peekable();
let messages = self.message_anchors.iter().enumerate();
let images = self.image_anchors.iter();
Self::messages_from_iters(buffer, &self.messages_metadata, messages, images)
}
pub fn messages_from_iters<'a>(
buffer: &'a Buffer,
metadata: &'a HashMap<MessageId, MessageMetadata>,
messages: impl Iterator<Item = (usize, &'a MessageAnchor)> + 'a,
images: impl Iterator<Item = &'a ImageAnchor> + 'a,
) -> impl 'a + Iterator<Item = Message> {
let mut messages = messages.peekable();
let mut images = images.peekable();
iter::from_fn(move || {
if let Some((start_ix, message_anchor)) = message_anchors.next() {
let metadata = self.messages_metadata.get(&message_anchor.id)?;
if let Some((start_ix, message_anchor)) = messages.next() {
let metadata = metadata.get(&message_anchor.id)?;
let message_start = message_anchor.start.to_offset(buffer);
let mut message_end = None;
let mut end_ix = start_ix;
while let Some((_, next_message)) = message_anchors.peek() {
while let Some((_, next_message)) = messages.peek() {
if next_message.start.is_valid(buffer) {
message_end = Some(next_message.start);
break;
} else {
end_ix += 1;
message_anchors.next();
messages.next();
}
}
let message_end_anchor = message_end.unwrap_or(language::Anchor::MAX);
let message_end = message_anchor.start.to_offset(buffer);
let mut image_offsets = SmallVec::new();
while let Some(image_anchor) = images.peek() {
if image_anchor.anchor.cmp(&message_end_anchor, buffer).is_lt() {
image_offsets.push((
image_anchor.anchor.to_offset(buffer),
MessageImage {
image_id: image_anchor.image_id,
image: image_anchor.image.clone(),
},
));
images.next();
} else {
break;
}
}
let message_end = message_end
.unwrap_or(language::Anchor::MAX)
.to_offset(buffer);
return Some(Message {
index_range: start_ix..end_ix,
@@ -2083,6 +2244,7 @@ impl Context {
anchor: message_anchor.start,
role: metadata.role,
status: metadata.status.clone(),
image_offsets,
});
}
None

View File

@@ -2302,7 +2302,7 @@ impl Codegen {
messages.push(LanguageModelRequestMessage {
role: Role::User,
content: prompt,
content: vec![prompt.into()],
});
Ok(LanguageModelRequest {

View File

@@ -775,7 +775,7 @@ impl PromptLibrary {
LanguageModelRequest {
messages: vec![LanguageModelRequestMessage {
role: Role::System,
content: body.to_string(),
content: vec![body.to_string().into()],
}],
stop: Vec::new(),
temperature: 1.,

View File

@@ -54,7 +54,7 @@ impl PromptBuilder {
cx: &gpui::AppContext,
handlebars: Arc<Mutex<Handlebars<'static>>>,
) {
let templates_dir = paths::prompt_templates_dir();
let templates_dir = paths::prompt_overrides_dir();
cx.background_executor()
.spawn(async move {

View File

@@ -276,7 +276,7 @@ impl TerminalInlineAssistant {
messages.push(LanguageModelRequestMessage {
role: Role::User,
content: prompt,
content: vec![prompt.into()],
});
Ok(LanguageModelRequest {

View File

@@ -10,7 +10,7 @@ It contains our back-end logic for collaboration, to which we connect from the Z
Before you can run the collab server locally, you'll need to set up a zed Postgres database.
```
```sh
script/bootstrap
```
@@ -32,13 +32,13 @@ To use a different set of admin users, create `crates/collab/seed.json`.
In one terminal, run Zed's collaboration server and the livekit dev server:
```
```sh
foreman start
```
In a second terminal, run two or more instances of Zed.
```
```sh
script/zed-local -2
```
@@ -64,7 +64,7 @@ You can tell what is currently deployed with `./script/what-is-deployed`.
To create a new migration:
```
```sh
./script/create-migration <name>
```

View File

@@ -85,11 +85,6 @@ spec:
secretKeyRef:
name: database
key: url
- name: LLM_DATABASE_URL
valueFrom:
secretKeyRef:
name: llm-database
key: url
- name: DATABASE_MAX_CONNECTIONS
value: "${DATABASE_MAX_CONNECTIONS}"
- name: API_TOKEN
@@ -102,6 +97,13 @@ spec:
secretKeyRef:
name: llm-token
key: secret
- name: LLM_DATABASE_URL
valueFrom:
secretKeyRef:
name: llm-database
key: url
- name: LLM_DATABASE_MAX_CONNECTIONS
value: "${LLM_DATABASE_MAX_CONNECTIONS}"
- name: ZED_CLIENT_CHECKSUM_SEED
valueFrom:
secretKeyRef:

View File

@@ -3,3 +3,4 @@ RUST_LOG=info
INVITE_LINK_PREFIX=https://zed.dev/invites/
AUTO_JOIN_CHANNEL_ID=283
DATABASE_MAX_CONNECTIONS=85
LLM_DATABASE_MAX_CONNECTIONS=25

View File

@@ -2,4 +2,5 @@ ZED_ENVIRONMENT=staging
RUST_LOG=info
INVITE_LINK_PREFIX=https://staging.zed.dev/invites/
DATABASE_MAX_CONNECTIONS=5
LLM_DATABASE_MAX_CONNECTIONS=5
AUTO_JOIN_CHANNEL_ID=8

View File

@@ -0,0 +1,4 @@
ALTER TABLE models
ALTER COLUMN max_requests_per_minute TYPE bigint,
ALTER COLUMN max_tokens_per_minute TYPE bigint,
ALTER COLUMN max_tokens_per_day TYPE bigint;

View File

@@ -0,0 +1,3 @@
ALTER TABLE models
ADD COLUMN price_per_million_input_tokens integer NOT NULL DEFAULT 0,
ADD COLUMN price_per_million_output_tokens integer NOT NULL DEFAULT 0;

View File

@@ -5,15 +5,27 @@ use util::ResultExt;
impl Database {
/// Initializes the different kinds of notifications by upserting records for them.
pub async fn initialize_notification_kinds(&mut self) -> Result<()> {
notification_kind::Entity::insert_many(Notification::all_variant_names().iter().map(
|kind| notification_kind::ActiveModel {
let all_kinds = Notification::all_variant_names();
let existing_kinds = notification_kind::Entity::find().all(&self.pool).await?;
let kinds_to_create: Vec<_> = all_kinds
.iter()
.filter(|&kind| {
!existing_kinds
.iter()
.any(|existing| existing.name == **kind)
})
.map(|kind| notification_kind::ActiveModel {
name: ActiveValue::Set(kind.to_string()),
..Default::default()
},
))
.on_conflict(OnConflict::new().do_nothing().to_owned())
.exec_without_returning(&self.pool)
.await?;
})
.collect();
if !kinds_to_create.is_empty() {
notification_kind::Entity::insert_many(kinds_to_create)
.exec_without_returning(&self.pool)
.await?;
}
let mut rows = notification_kind::Entity::find().stream(&self.pool).await?;
while let Some(row) = rows.next().await {
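The change above replaces an `ON CONFLICT DO NOTHING` upsert with a read-then-filter insert: fetch the existing rows, keep only the kinds with no row yet, and insert those. The set-difference step can be sketched on its own (function and parameter names here are illustrative, not the crate's API):

```rust
// Given every notification kind the code knows about and the names already
// present in the table, keep only the kinds that still need a row.
fn kinds_to_create<'a>(all_kinds: &[&'a str], existing: &[String]) -> Vec<&'a str> {
    all_kinds
        .iter()
        .filter(|kind| !existing.iter().any(|e| e.as_str() == **kind))
        .copied()
        .collect()
}
```

As in the diff, the `insert_many` is skipped entirely when the resulting list is empty.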

View File

@@ -1,8 +1,12 @@
mod authorization;
pub mod db;
mod telemetry;
mod token;
use crate::{api::CloudflareIpCountryHeader, executor::Executor, Config, Error, Result};
use crate::{
api::CloudflareIpCountryHeader, build_clickhouse_client, executor::Executor, Config, Error,
Result,
};
use anyhow::{anyhow, Context as _};
use authorization::authorize_access_to_language_model;
use axum::{
@@ -15,10 +19,17 @@ use axum::{
};
use chrono::{DateTime, Duration, Utc};
use db::{ActiveUserCount, LlmDatabase};
use futures::StreamExt as _;
use futures::{Stream, StreamExt as _};
use http_client::IsahcHttpClient;
use rpc::{LanguageModelProvider, PerformCompletionParams, EXPIRED_LLM_TOKEN_HEADER_NAME};
use std::sync::Arc;
use rpc::{
proto::Plan, LanguageModelProvider, PerformCompletionParams, EXPIRED_LLM_TOKEN_HEADER_NAME,
};
use std::{
pin::Pin,
sync::Arc,
task::{Context, Poll},
};
use telemetry::{report_llm_usage, LlmUsageEventRow};
use tokio::sync::RwLock;
use util::ResultExt;
@@ -27,8 +38,9 @@ pub use token::*;
pub struct LlmState {
pub config: Config,
pub executor: Executor,
pub db: Option<Arc<LlmDatabase>>,
pub db: Arc<LlmDatabase>,
pub http_client: IsahcHttpClient,
pub clickhouse_client: Option<clickhouse::Client>,
active_user_count: RwLock<Option<(DateTime<Utc>, ActiveUserCount)>>,
}
@@ -36,25 +48,20 @@ const ACTIVE_USER_COUNT_CACHE_DURATION: Duration = Duration::seconds(30);
impl LlmState {
pub async fn new(config: Config, executor: Executor) -> Result<Arc<Self>> {
// TODO: This is temporary until we have the LLM database stood up.
let db = if config.is_development() {
let database_url = config
.llm_database_url
.as_ref()
.ok_or_else(|| anyhow!("missing LLM_DATABASE_URL"))?;
let max_connections = config
.llm_database_max_connections
.ok_or_else(|| anyhow!("missing LLM_DATABASE_MAX_CONNECTIONS"))?;
let database_url = config
.llm_database_url
.as_ref()
.ok_or_else(|| anyhow!("missing LLM_DATABASE_URL"))?;
let max_connections = config
.llm_database_max_connections
.ok_or_else(|| anyhow!("missing LLM_DATABASE_MAX_CONNECTIONS"))?;
let mut db_options = db::ConnectOptions::new(database_url);
db_options.max_connections(max_connections);
let mut db = LlmDatabase::new(db_options, executor.clone()).await?;
db.initialize().await?;
let mut db_options = db::ConnectOptions::new(database_url);
db_options.max_connections(max_connections);
let mut db = LlmDatabase::new(db_options, executor.clone()).await?;
db.initialize().await?;
Some(Arc::new(db))
} else {
None
};
let db = Arc::new(db);
let user_agent = format!("Zed Server/{}", env!("CARGO_PKG_VERSION"));
let http_client = IsahcHttpClient::builder()
@@ -62,18 +69,19 @@ impl LlmState {
.build()
.context("failed to construct http client")?;
let initial_active_user_count = if let Some(db) = &db {
Some((Utc::now(), db.get_active_user_count(Utc::now()).await?))
} else {
None
};
let initial_active_user_count =
Some((Utc::now(), db.get_active_user_count(Utc::now()).await?));
let this = Self {
config,
executor,
db,
http_client,
clickhouse_client: config
.clickhouse_url
.as_ref()
.and_then(|_| build_clickhouse_client(&config).log_err()),
active_user_count: RwLock::new(initial_active_user_count),
config,
};
Ok(Arc::new(this))
@@ -88,14 +96,10 @@ impl LlmState {
}
}
if let Some(db) = &self.db {
let mut cache = self.active_user_count.write().await;
let new_count = db.get_active_user_count(now).await?;
*cache = Some((now, new_count));
Ok(new_count)
} else {
Ok(ActiveUserCount::default())
}
let mut cache = self.active_user_count.write().await;
let new_count = self.db.get_active_user_count(now).await?;
*cache = Some((now, new_count));
Ok(new_count)
}
}
@@ -163,13 +167,9 @@ async fn perform_completion(
&model,
)?;
let user_id = claims.user_id as i32;
check_usage_limit(&state, params.provider, &model, &claims).await?;
if state.db.is_some() {
check_usage_limit(&state, params.provider, &model, &claims).await?;
}
match params.provider {
let stream = match params.provider {
LanguageModelProvider::Anthropic => {
let api_key = state
.config
@@ -199,39 +199,27 @@ async fn perform_completion(
)
.await?;
let mut recorder = state.db.clone().map(|db| UsageRecorder {
db,
executor: state.executor.clone(),
user_id,
provider: params.provider,
model,
token_count: 0,
});
let stream = chunks.map(move |event| {
let mut buffer = Vec::new();
event.map(|chunk| {
match &chunk {
chunks
.map(move |event| {
let chunk = event?;
let (input_tokens, output_tokens) = match &chunk {
anthropic::Event::MessageStart {
message: anthropic::Response { usage, .. },
}
| anthropic::Event::MessageDelta { usage, .. } => {
if let Some(recorder) = &mut recorder {
recorder.token_count += usage.input_tokens.unwrap_or(0) as usize;
recorder.token_count += usage.output_tokens.unwrap_or(0) as usize;
}
}
_ => {}
}
| anthropic::Event::MessageDelta { usage, .. } => (
usage.input_tokens.unwrap_or(0) as usize,
usage.output_tokens.unwrap_or(0) as usize,
),
_ => (0, 0),
};
buffer.clear();
serde_json::to_writer(&mut buffer, &chunk).unwrap();
buffer.push(b'\n');
buffer
anyhow::Ok((
serde_json::to_vec(&chunk).unwrap(),
input_tokens,
output_tokens,
))
})
});
Ok(Response::new(Body::wrap_stream(stream)))
.boxed()
}
LanguageModelProvider::OpenAi => {
let api_key = state
@@ -248,17 +236,21 @@ async fn perform_completion(
)
.await?;
let stream = chunks.map(|event| {
let mut buffer = Vec::new();
event.map(|chunk| {
buffer.clear();
serde_json::to_writer(&mut buffer, &chunk).unwrap();
buffer.push(b'\n');
buffer
chunks
.map(|event| {
event.map(|chunk| {
let input_tokens =
chunk.usage.as_ref().map_or(0, |u| u.prompt_tokens) as usize;
let output_tokens =
chunk.usage.as_ref().map_or(0, |u| u.completion_tokens) as usize;
(
serde_json::to_vec(&chunk).unwrap(),
input_tokens,
output_tokens,
)
})
})
});
Ok(Response::new(Body::wrap_stream(stream)))
.boxed()
}
LanguageModelProvider::Google => {
let api_key = state
@@ -274,17 +266,20 @@ async fn perform_completion(
)
.await?;
let stream = chunks.map(|event| {
let mut buffer = Vec::new();
event.map(|chunk| {
buffer.clear();
serde_json::to_writer(&mut buffer, &chunk).unwrap();
buffer.push(b'\n');
buffer
chunks
.map(|event| {
event.map(|chunk| {
// TODO - implement token counting for Google AI
let input_tokens = 0;
let output_tokens = 0;
(
serde_json::to_vec(&chunk).unwrap(),
input_tokens,
output_tokens,
)
})
})
});
Ok(Response::new(Body::wrap_stream(stream)))
.boxed()
}
LanguageModelProvider::Zed => {
let api_key = state
@@ -306,41 +301,63 @@ async fn perform_completion(
)
.await?;
let stream = chunks.map(|event| {
let mut buffer = Vec::new();
event.map(|chunk| {
buffer.clear();
serde_json::to_writer(&mut buffer, &chunk).unwrap();
buffer.push(b'\n');
buffer
chunks
.map(|event| {
event.map(|chunk| {
let input_tokens =
chunk.usage.as_ref().map_or(0, |u| u.prompt_tokens) as usize;
let output_tokens =
chunk.usage.as_ref().map_or(0, |u| u.completion_tokens) as usize;
(
serde_json::to_vec(&chunk).unwrap(),
input_tokens,
output_tokens,
)
})
})
});
Ok(Response::new(Body::wrap_stream(stream)))
.boxed()
}
}
};
Ok(Response::new(Body::wrap_stream(TokenCountingStream {
state,
claims,
provider: params.provider,
model,
input_tokens: 0,
output_tokens: 0,
inner_stream: stream,
})))
}
fn normalize_model_name(provider: LanguageModelProvider, name: String) -> String {
match provider {
LanguageModelProvider::Anthropic => {
for prefix in &[
"claude-3-5-sonnet",
"claude-3-haiku",
"claude-3-opus",
"claude-3-sonnet",
] {
if name.starts_with(prefix) {
return prefix.to_string();
}
}
}
LanguageModelProvider::OpenAi => {}
LanguageModelProvider::Google => {}
LanguageModelProvider::Zed => {}
}
let prefixes: &[_] = match provider {
LanguageModelProvider::Anthropic => &[
"claude-3-5-sonnet",
"claude-3-haiku",
"claude-3-opus",
"claude-3-sonnet",
],
LanguageModelProvider::OpenAi => &[
"gpt-3.5-turbo",
"gpt-4-turbo-preview",
"gpt-4o-mini",
"gpt-4o",
"gpt-4",
],
LanguageModelProvider::Google => &[],
LanguageModelProvider::Zed => &[],
};
name
if let Some(prefix) = prefixes
.iter()
.filter(|&&prefix| name.starts_with(prefix))
.max_by_key(|&&prefix| prefix.len())
{
prefix.to_string()
} else {
name
}
}
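Picking the longest matching prefix via `max_by_key` matters because several of the OpenAI names share prefixes. A self-contained sketch of the lookup (prefix list copied from the diff; the bare `normalize` name is illustrative):

```rust
// Collapse a dated or suffixed model name to its known base-model prefix,
// preferring the longest prefix that matches.
fn normalize(name: &str, prefixes: &[&str]) -> String {
    prefixes
        .iter()
        .filter(|prefix| name.starts_with(**prefix))
        .max_by_key(|prefix| prefix.len())
        .map(|prefix| prefix.to_string())
        .unwrap_or_else(|| name.to_string())
}
```

With `["gpt-4o-mini", "gpt-4o", "gpt-4"]`, a name like `gpt-4o-mini-2024-07-18` normalizes to `gpt-4o-mini` rather than `gpt-4`; `max_by_key` keeps the result independent of the order the prefixes are listed in.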
async fn check_usage_limit(
@@ -349,12 +366,9 @@ async fn check_usage_limit(
model_name: &str,
claims: &LlmTokenClaims,
) -> Result<()> {
let db = state
let model = state.db.model(provider, model_name)?;
let usage = state
.db
.as_ref()
.ok_or_else(|| anyhow!("LLM database not configured"))?;
let model = db.model(provider, model_name)?;
let usage = db
.get_usage(claims.user_id as i32, provider, model_name, Utc::now())
.await?;
@@ -386,6 +400,11 @@ async fn check_usage_limit(
];
for (usage, limit, resource) in checks {
// Temporarily bypass rate-limiting for staff members.
if claims.is_staff {
continue;
}
if usage > limit {
return Err(Error::http(
StatusCode::TOO_MANY_REQUESTS,
@@ -396,26 +415,86 @@ async fn check_usage_limit(
Ok(())
}
struct UsageRecorder {
db: Arc<LlmDatabase>,
executor: Executor,
user_id: i32,
struct TokenCountingStream<S> {
state: Arc<LlmState>,
claims: LlmTokenClaims,
provider: LanguageModelProvider,
model: String,
token_count: usize,
input_tokens: usize,
output_tokens: usize,
inner_stream: S,
}
impl Drop for UsageRecorder {
impl<S> Stream for TokenCountingStream<S>
where
S: Stream<Item = Result<(Vec<u8>, usize, usize), anyhow::Error>> + Unpin,
{
type Item = Result<Vec<u8>, anyhow::Error>;
fn poll_next(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Option<Self::Item>> {
match Pin::new(&mut self.inner_stream).poll_next(cx) {
Poll::Ready(Some(Ok((mut bytes, input_tokens, output_tokens)))) => {
bytes.push(b'\n');
self.input_tokens += input_tokens;
self.output_tokens += output_tokens;
Poll::Ready(Some(Ok(bytes)))
}
Poll::Ready(Some(Err(e))) => Poll::Ready(Some(Err(e))),
Poll::Ready(None) => Poll::Ready(None),
Poll::Pending => Poll::Pending,
}
}
}
impl<S> Drop for TokenCountingStream<S> {
fn drop(&mut self) {
let db = self.db.clone();
let user_id = self.user_id;
let state = self.state.clone();
let claims = self.claims.clone();
let provider = self.provider;
let model = std::mem::take(&mut self.model);
let token_count = self.token_count;
self.executor.spawn_detached(async move {
db.record_usage(user_id, provider, &model, token_count, Utc::now())
let input_token_count = self.input_tokens;
let output_token_count = self.output_tokens;
self.state.executor.spawn_detached(async move {
let usage = state
.db
.record_usage(
claims.user_id as i32,
provider,
&model,
input_token_count,
output_token_count,
Utc::now(),
)
.await
.log_err();
if let Some((clickhouse_client, usage)) = state.clickhouse_client.as_ref().zip(usage) {
report_llm_usage(
clickhouse_client,
LlmUsageEventRow {
time: Utc::now().timestamp_millis(),
user_id: claims.user_id as i32,
is_staff: claims.is_staff,
plan: match claims.plan {
Plan::Free => "free".to_string(),
Plan::ZedPro => "zed_pro".to_string(),
},
model,
provider: provider.to_string(),
input_token_count: input_token_count as u64,
output_token_count: output_token_count as u64,
requests_this_minute: usage.requests_this_minute as u64,
tokens_this_minute: usage.tokens_this_minute as u64,
tokens_this_day: usage.tokens_this_day as u64,
input_tokens_this_month: usage.input_tokens_this_month as u64,
output_tokens_this_month: usage.output_tokens_this_month as u64,
spending_this_month: usage.spending_this_month as u64,
},
)
.await
.log_err();
}
})
}
}
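`TokenCountingStream` combines three jobs: newline-delimit each serialized chunk, accumulate token counts as chunks flow through, and record the totals when the stream is dropped, even if the client disconnects mid-stream. A dependency-free sketch of the same shape, using `Iterator` in place of `futures::Stream` (an assumed simplification; the real type implements `poll_next`):

```rust
struct CountingIter<I> {
    inner: I,
    input_tokens: usize,
    output_tokens: usize,
}

impl<I: Iterator<Item = (Vec<u8>, usize, usize)>> Iterator for CountingIter<I> {
    type Item = Vec<u8>;

    fn next(&mut self) -> Option<Vec<u8>> {
        // Each upstream item is (serialized chunk, input tokens, output tokens).
        let (mut bytes, input, output) = self.inner.next()?;
        bytes.push(b'\n'); // newline-delimit the chunk, as in the diff
        self.input_tokens += input;
        self.output_tokens += output;
        Some(bytes)
    }
}

impl<I> Drop for CountingIter<I> {
    fn drop(&mut self) {
        // The real code spawns a detached task here that records usage in the
        // LLM database and reports to ClickHouse; printing stands in for that.
        println!("in={} out={}", self.input_tokens, self.output_tokens);
    }
}
```

Doing the accounting in `Drop` is what lets a single code path handle both completed and abandoned streams.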

View File

@@ -3,10 +3,14 @@ use sea_orm::QueryOrder;
use std::str::FromStr;
use strum::IntoEnumIterator as _;
pub struct ModelRateLimits {
pub max_requests_per_minute: i32,
pub max_tokens_per_minute: i32,
pub max_tokens_per_day: i32,
pub struct ModelParams {
pub provider: LanguageModelProvider,
pub name: String,
pub max_requests_per_minute: i64,
pub max_tokens_per_minute: i64,
pub max_tokens_per_day: i64,
pub price_per_million_input_tokens: i32,
pub price_per_million_output_tokens: i32,
}
impl LlmDatabase {
@@ -75,20 +79,23 @@ impl LlmDatabase {
Ok(())
}
pub async fn insert_models(
&mut self,
models: &[(LanguageModelProvider, String, ModelRateLimits)],
) -> Result<()> {
pub async fn insert_models(&mut self, models: &[ModelParams]) -> Result<()> {
let all_provider_ids = &self.provider_ids;
self.transaction(|tx| async move {
model::Entity::insert_many(models.into_iter().map(|(provider, name, rate_limits)| {
let provider_id = all_provider_ids[&provider];
model::Entity::insert_many(models.into_iter().map(|model_params| {
let provider_id = all_provider_ids[&model_params.provider];
model::ActiveModel {
provider_id: ActiveValue::set(provider_id),
name: ActiveValue::set(name.clone()),
max_requests_per_minute: ActiveValue::set(rate_limits.max_requests_per_minute),
max_tokens_per_minute: ActiveValue::set(rate_limits.max_tokens_per_minute),
max_tokens_per_day: ActiveValue::set(rate_limits.max_tokens_per_day),
name: ActiveValue::set(model_params.name.clone()),
max_requests_per_minute: ActiveValue::set(model_params.max_requests_per_minute),
max_tokens_per_minute: ActiveValue::set(model_params.max_tokens_per_minute),
max_tokens_per_day: ActiveValue::set(model_params.max_tokens_per_day),
price_per_million_input_tokens: ActiveValue::set(
model_params.price_per_million_input_tokens,
),
price_per_million_output_tokens: ActiveValue::set(
model_params.price_per_million_output_tokens,
),
..Default::default()
}
}))

View File

@@ -11,7 +11,9 @@ pub struct Usage {
pub requests_this_minute: usize,
pub tokens_this_minute: usize,
pub tokens_this_day: usize,
pub tokens_this_month: usize,
pub input_tokens_this_month: usize,
pub output_tokens_this_month: usize,
pub spending_this_month: usize,
}
#[derive(Clone, Copy, Debug, Default)]
@@ -87,14 +89,20 @@ impl LlmDatabase {
self.get_usage_for_measure(&usages, now, UsageMeasure::TokensPerMinute)?;
let tokens_this_day =
self.get_usage_for_measure(&usages, now, UsageMeasure::TokensPerDay)?;
let tokens_this_month =
self.get_usage_for_measure(&usages, now, UsageMeasure::TokensPerMonth)?;
let input_tokens_this_month =
self.get_usage_for_measure(&usages, now, UsageMeasure::InputTokensPerMonth)?;
let output_tokens_this_month =
self.get_usage_for_measure(&usages, now, UsageMeasure::OutputTokensPerMonth)?;
let spending_this_month =
calculate_spending(model, input_tokens_this_month, output_tokens_this_month);
Ok(Usage {
requests_this_minute,
tokens_this_minute,
tokens_this_day,
tokens_this_month,
input_tokens_this_month,
output_tokens_this_month,
spending_this_month,
})
})
.await
@@ -105,9 +113,10 @@ impl LlmDatabase {
user_id: i32,
provider: LanguageModelProvider,
model_name: &str,
token_count: usize,
input_token_count: usize,
output_token_count: usize,
now: DateTimeUtc,
) -> Result<()> {
) -> Result<Usage> {
self.transaction(|tx| async move {
let model = self.model(provider, model_name)?;
@@ -120,48 +129,72 @@ impl LlmDatabase {
.all(&*tx)
.await?;
self.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::RequestsPerMinute,
now,
1,
&tx,
)
.await?;
self.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::TokensPerMinute,
now,
token_count,
&tx,
)
.await?;
self.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::TokensPerDay,
now,
token_count,
&tx,
)
.await?;
self.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::TokensPerMonth,
now,
token_count,
&tx,
)
.await?;
let requests_this_minute = self
.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::RequestsPerMinute,
now,
1,
&tx,
)
.await?;
let tokens_this_minute = self
.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::TokensPerMinute,
now,
input_token_count + output_token_count,
&tx,
)
.await?;
let tokens_this_day = self
.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::TokensPerDay,
now,
input_token_count + output_token_count,
&tx,
)
.await?;
let input_tokens_this_month = self
.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::InputTokensPerMonth,
now,
input_token_count,
&tx,
)
.await?;
let output_tokens_this_month = self
.update_usage_for_measure(
user_id,
model.id,
&usages,
UsageMeasure::OutputTokensPerMonth,
now,
output_token_count,
&tx,
)
.await?;
let spending_this_month =
calculate_spending(model, input_tokens_this_month, output_tokens_this_month);
Ok(())
Ok(Usage {
requests_this_minute,
tokens_this_minute,
tokens_this_day,
input_tokens_this_month,
output_tokens_this_month,
spending_this_month,
})
})
.await
}
@@ -205,7 +238,7 @@ impl LlmDatabase {
now: DateTimeUtc,
usage_to_add: usize,
tx: &DatabaseTransaction,
) -> Result<()> {
) -> Result<usize> {
let now = now.naive_utc();
let measure_id = *self
.usage_measure_ids
@@ -230,6 +263,7 @@ impl LlmDatabase {
}
*buckets.last_mut().unwrap() += usage_to_add as i64;
let total_usage = buckets.iter().sum::<i64>() as usize;
let mut model = usage::ActiveModel {
user_id: ActiveValue::set(user_id),
@@ -249,7 +283,7 @@ impl LlmDatabase {
.await?;
}
Ok(())
Ok(total_usage)
}
fn get_usage_for_measure(
@@ -293,6 +327,18 @@ impl LlmDatabase {
}
}
fn calculate_spending(
model: &model::Model,
input_tokens_this_month: usize,
output_tokens_this_month: usize,
) -> usize {
let input_token_cost =
input_tokens_this_month * model.price_per_million_input_tokens as usize / 1_000_000;
let output_token_cost =
output_tokens_this_month * model.price_per_million_output_tokens as usize / 1_000_000;
input_token_cost + output_token_cost
}
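`calculate_spending` works in integer cents: per the seed data's comments, a stored price of `300` means $3.00 per million tokens, so monthly spend is `tokens × price / 1_000_000`, truncated. A standalone sketch of the arithmetic (the `spending_cents` name is illustrative):

```rust
// Prices are in cents per million tokens; the result is spending in cents.
fn spending_cents(
    input_tokens: usize,
    output_tokens: usize,
    price_per_million_input: usize,
    price_per_million_output: usize,
) -> usize {
    input_tokens * price_per_million_input / 1_000_000
        + output_tokens * price_per_million_output / 1_000_000
}
```

At claude-3-5-sonnet's seeded prices (300 in, 1500 out), 2M input tokens and 500K output tokens come to 600 + 750 = 1350 cents, i.e. $13.50. The integer division truncates sub-cent amounts each time the total is computed.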
const MINUTE_BUCKET_COUNT: usize = 12;
const DAY_BUCKET_COUNT: usize = 48;
const MONTH_BUCKET_COUNT: usize = 30;
@@ -303,7 +349,8 @@ impl UsageMeasure {
UsageMeasure::RequestsPerMinute => MINUTE_BUCKET_COUNT,
UsageMeasure::TokensPerMinute => MINUTE_BUCKET_COUNT,
UsageMeasure::TokensPerDay => DAY_BUCKET_COUNT,
UsageMeasure::TokensPerMonth => MONTH_BUCKET_COUNT,
UsageMeasure::InputTokensPerMonth => MONTH_BUCKET_COUNT,
UsageMeasure::OutputTokensPerMonth => MONTH_BUCKET_COUNT,
}
}
@@ -312,7 +359,8 @@ impl UsageMeasure {
UsageMeasure::RequestsPerMinute => Duration::minutes(1),
UsageMeasure::TokensPerMinute => Duration::minutes(1),
UsageMeasure::TokensPerDay => Duration::hours(24),
UsageMeasure::TokensPerMonth => Duration::days(30),
UsageMeasure::InputTokensPerMonth => Duration::days(30),
UsageMeasure::OutputTokensPerMonth => Duration::days(30),
}
}

View File

@@ -1,45 +1,45 @@
use super::*;
use crate::{Config, Result};
use queries::providers::ModelRateLimits;
use queries::providers::ModelParams;
pub async fn seed_database(_config: &Config, db: &mut LlmDatabase, _force: bool) -> Result<()> {
db.insert_models(&[
(
LanguageModelProvider::Anthropic,
"claude-3-5-sonnet".into(),
ModelRateLimits {
max_requests_per_minute: 5,
max_tokens_per_minute: 20_000,
max_tokens_per_day: 300_000,
},
),
(
LanguageModelProvider::Anthropic,
"claude-3-opus".into(),
ModelRateLimits {
max_requests_per_minute: 5,
max_tokens_per_minute: 10_000,
max_tokens_per_day: 300_000,
},
),
(
LanguageModelProvider::Anthropic,
"claude-3-sonnet".into(),
ModelRateLimits {
max_requests_per_minute: 5,
max_tokens_per_minute: 20_000,
max_tokens_per_day: 300_000,
},
),
(
LanguageModelProvider::Anthropic,
"claude-3-haiku".into(),
ModelRateLimits {
max_requests_per_minute: 5,
max_tokens_per_minute: 25_000,
max_tokens_per_day: 300_000,
},
),
ModelParams {
provider: LanguageModelProvider::Anthropic,
name: "claude-3-5-sonnet".into(),
max_requests_per_minute: 5,
max_tokens_per_minute: 20_000,
max_tokens_per_day: 300_000,
price_per_million_input_tokens: 300, // $3.00/MTok
price_per_million_output_tokens: 1500, // $15.00/MTok
},
ModelParams {
provider: LanguageModelProvider::Anthropic,
name: "claude-3-opus".into(),
max_requests_per_minute: 5,
max_tokens_per_minute: 10_000,
max_tokens_per_day: 300_000,
price_per_million_input_tokens: 1500, // $15.00/MTok
price_per_million_output_tokens: 7500, // $75.00/MTok
},
ModelParams {
provider: LanguageModelProvider::Anthropic,
name: "claude-3-sonnet".into(),
max_requests_per_minute: 5,
max_tokens_per_minute: 20_000,
max_tokens_per_day: 300_000,
price_per_million_input_tokens: 1500, // $15.00/MTok
price_per_million_output_tokens: 7500, // $75.00/MTok
},
ModelParams {
provider: LanguageModelProvider::Anthropic,
name: "claude-3-haiku".into(),
max_requests_per_minute: 5,
max_tokens_per_minute: 25_000,
max_tokens_per_day: 300_000,
price_per_million_input_tokens: 25, // $0.25/MTok
price_per_million_output_tokens: 125, // $1.25/MTok
},
])
.await
}

View File

@@ -10,9 +10,11 @@ pub struct Model {
pub id: ModelId,
pub provider_id: ProviderId,
pub name: String,
pub max_requests_per_minute: i32,
pub max_tokens_per_minute: i32,
pub max_tokens_per_day: i32,
pub max_requests_per_minute: i64,
pub max_tokens_per_minute: i64,
pub max_tokens_per_day: i64,
pub price_per_million_input_tokens: i32,
pub price_per_million_output_tokens: i32,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]

View File

@@ -9,7 +9,8 @@ pub enum UsageMeasure {
RequestsPerMinute,
TokensPerMinute,
TokensPerDay,
TokensPerMonth,
InputTokensPerMonth,
OutputTokensPerMonth,
}
#[derive(Clone, Debug, PartialEq, DeriveEntityModel)]

View File

@@ -1,5 +1,5 @@
use crate::{
llm::db::{queries::providers::ModelRateLimits, queries::usages::Usage, LlmDatabase},
llm::db::{queries::providers::ModelParams, queries::usages::Usage, LlmDatabase},
test_llm_db,
};
use chrono::{Duration, Utc};
@@ -13,15 +13,15 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
let model = "claude-3-5-sonnet";
db.initialize().await.unwrap();
db.insert_models(&[(
db.insert_models(&[ModelParams {
provider,
model.to_string(),
ModelRateLimits {
max_requests_per_minute: 5,
max_tokens_per_minute: 10_000,
max_tokens_per_day: 50_000,
},
)])
name: model.to_string(),
max_requests_per_minute: 5,
max_tokens_per_minute: 10_000,
max_tokens_per_day: 50_000,
price_per_million_input_tokens: 50,
price_per_million_output_tokens: 50,
}])
.await
.unwrap();
@@ -29,12 +29,12 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
let user_id = 123;
let now = t0;
db.record_usage(user_id, provider, model, 1000, now)
db.record_usage(user_id, provider, model, 1000, 0, now)
.await
.unwrap();
let now = t0 + Duration::seconds(10);
db.record_usage(user_id, provider, model, 2000, now)
db.record_usage(user_id, provider, model, 2000, 0, now)
.await
.unwrap();
@@ -45,7 +45,9 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 2,
tokens_this_minute: 3000,
tokens_this_day: 3000,
tokens_this_month: 3000,
input_tokens_this_month: 3000,
output_tokens_this_month: 0,
spending_this_month: 0,
}
);
@@ -57,12 +59,14 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 1,
tokens_this_minute: 2000,
tokens_this_day: 3000,
tokens_this_month: 3000,
input_tokens_this_month: 3000,
output_tokens_this_month: 0,
spending_this_month: 0,
}
);
let now = t0 + Duration::seconds(60);
db.record_usage(user_id, provider, model, 3000, now)
db.record_usage(user_id, provider, model, 3000, 0, now)
.await
.unwrap();
@@ -73,7 +77,9 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 2,
tokens_this_minute: 5000,
tokens_this_day: 6000,
tokens_this_month: 6000,
input_tokens_this_month: 6000,
output_tokens_this_month: 0,
spending_this_month: 0,
}
);
@@ -86,11 +92,13 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 0,
tokens_this_minute: 0,
tokens_this_day: 5000,
tokens_this_month: 6000,
input_tokens_this_month: 6000,
output_tokens_this_month: 0,
spending_this_month: 0,
}
);
db.record_usage(user_id, provider, model, 4000, now)
db.record_usage(user_id, provider, model, 4000, 0, now)
.await
.unwrap();
@@ -101,7 +109,9 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 1,
tokens_this_minute: 4000,
tokens_this_day: 9000,
tokens_this_month: 10000,
input_tokens_this_month: 10000,
output_tokens_this_month: 0,
spending_this_month: 0,
}
);
@@ -114,7 +124,9 @@ async fn test_tracking_usage(db: &mut LlmDatabase) {
requests_this_minute: 0,
tokens_this_minute: 0,
tokens_this_day: 0,
tokens_this_month: 9000,
input_tokens_this_month: 9000,
output_tokens_this_month: 0,
spending_this_month: 0,
}
);
}

View File

@@ -0,0 +1,27 @@
use anyhow::Result;
use serde::Serialize;
#[derive(Serialize, Debug, clickhouse::Row)]
pub struct LlmUsageEventRow {
pub time: i64,
pub user_id: i32,
pub is_staff: bool,
pub plan: String,
pub model: String,
pub provider: String,
pub input_token_count: u64,
pub output_token_count: u64,
pub requests_this_minute: u64,
pub tokens_this_minute: u64,
pub tokens_this_day: u64,
pub input_tokens_this_month: u64,
pub output_tokens_this_month: u64,
pub spending_this_month: u64,
}
pub async fn report_llm_usage(client: &clickhouse::Client, row: LlmUsageEventRow) -> Result<()> {
let mut insert = client.insert("llm_usage_events")?;
insert.write(&row).await?;
insert.end().await?;
Ok(())
}

View File

@@ -248,11 +248,6 @@ async fn setup_app_database(config: &Config) -> Result<()> {
}
async fn setup_llm_database(config: &Config) -> Result<()> {
// TODO: This is temporary until we have the LLM database stood up.
if !config.is_development() {
return Ok(());
}
let database_url = config
.llm_database_url
.as_ref()
@@ -298,7 +293,12 @@ async fn handle_liveness_probe(
state.db.get_all_users(0, 1).await?;
}
if let Some(_llm_state) = llm_state {}
if let Some(llm_state) = llm_state {
llm_state
.db
.get_active_user_count(chrono::Utc::now())
.await?;
}
Ok("ok".to_string())
}

View File

@@ -280,7 +280,7 @@ impl ChannelView {
};
let link = channel.notes_link(closest_heading.map(|heading| heading.text), cx);
cx.write_to_clipboard(ClipboardItem::new(link));
cx.write_to_clipboard(ClipboardItem::new_string(link));
self.workspace
.update(cx, |workspace, cx| {
struct CopyLinkForPositionToast;

View File

@@ -710,7 +710,7 @@ impl ChatPanel {
active_chat.read(cx).find_loaded_message(message_id)
}) {
let text = message.body.clone();
cx.write_to_clipboard(ClipboardItem::new(text))
cx.write_to_clipboard(ClipboardItem::new_string(text))
}
}),
)

View File

@@ -2042,7 +2042,7 @@ impl CollabPanel {
let Some(channel) = channel_store.channel_for_id(channel_id) else {
return;
};
let item = ClipboardItem::new(channel.link(cx));
let item = ClipboardItem::new_string(channel.link(cx));
cx.write_to_clipboard(item)
}
@@ -2261,7 +2261,7 @@ impl CollabPanel {
.size(ButtonSize::None)
.visible_on_hover("section-header")
.on_click(move |_, cx| {
let item = ClipboardItem::new(channel_link_copy.clone());
let item = ClipboardItem::new_string(channel_link_copy.clone());
cx.write_to_clipboard(item)
})
.tooltip(|cx| Tooltip::text("Copy channel link", cx))

View File

@@ -175,7 +175,8 @@ impl Render for ChannelModal {
.read(cx)
.channel_for_id(channel_id)
{
let item = ClipboardItem::new(channel.link(cx));
let item =
ClipboardItem::new_string(channel.link(cx));
cx.write_to_clipboard(item);
}
})),

View File

@@ -55,7 +55,7 @@ impl CopilotCodeVerification {
) -> impl IntoElement {
let copied = cx
.read_from_clipboard()
.map(|item| item.text() == &data.user_code)
.map(|item| item.text().as_ref() == Some(&data.user_code))
.unwrap_or(false);
h_flex()
.w_full()
@@ -68,7 +68,7 @@ impl CopilotCodeVerification {
.on_mouse_down(gpui::MouseButton::Left, {
let user_code = data.user_code.clone();
move |_, cx| {
cx.write_to_clipboard(ClipboardItem::new(user_code.clone()));
cx.write_to_clipboard(ClipboardItem::new_string(user_code.clone()));
cx.refresh();
}
})

View File

@@ -2,8 +2,8 @@ use futures::Future;
use git::blame::BlameEntry;
use git::Oid;
use gpui::{
Asset, ClipboardItem, Element, ParentElement, Render, ScrollHandle, StatefulInteractiveElement,
WeakView, WindowContext,
AppContext, Asset, ClipboardItem, Element, ParentElement, Render, ScrollHandle,
StatefulInteractiveElement, WeakView,
};
use settings::Settings;
use std::hash::Hash;
@@ -35,7 +35,7 @@ impl<'a> CommitAvatar<'a> {
let avatar_url = CommitAvatarAsset::new(remote.clone(), self.sha);
let element = match cx.use_cached_asset::<CommitAvatarAsset>(&avatar_url) {
let element = match cx.use_asset::<CommitAvatarAsset>(&avatar_url) {
// Loading or no avatar found
None | Some(None) => Icon::new(IconName::Person)
.color(Color::Muted)
@@ -73,7 +73,7 @@ impl Asset for CommitAvatarAsset {
fn load(
source: Self::Source,
cx: &mut WindowContext,
cx: &mut AppContext,
) -> impl Future<Output = Self::Output> + Send + 'static {
let client = cx.http_client();
@@ -242,9 +242,9 @@ impl Render for BlameEntryTooltip {
.icon_color(Color::Muted)
.on_click(move |_, cx| {
cx.stop_propagation();
cx.write_to_clipboard(ClipboardItem::new(
full_sha.clone(),
))
cx.write_to_clipboard(
ClipboardItem::new_string(full_sha.clone()),
)
}),
),
),

View File

@@ -69,13 +69,13 @@ use git::blame::GitBlame;
use git::diff_hunk_to_display;
use gpui::{
div, impl_actions, point, prelude::*, px, relative, size, uniform_list, Action, AnyElement,
AppContext, AsyncWindowContext, AvailableSpace, BackgroundExecutor, Bounds, ClipboardItem,
Context, DispatchPhase, ElementId, EntityId, EventEmitter, FocusHandle, FocusOutEvent,
FocusableView, FontId, FontWeight, HighlightStyle, Hsla, InteractiveText, KeyContext,
ListSizingBehavior, Model, MouseButton, PaintQuad, ParentElement, Pixels, Render, SharedString,
Size, StrikethroughStyle, Styled, StyledText, Subscription, Task, TextStyle, UnderlineStyle,
UniformListScrollHandle, View, ViewContext, ViewInputHandler, VisualContext, WeakFocusHandle,
WeakView, WindowContext,
AppContext, AsyncWindowContext, AvailableSpace, BackgroundExecutor, Bounds, ClipboardEntry,
ClipboardItem, Context, DispatchPhase, ElementId, EntityId, EventEmitter, FocusHandle,
FocusOutEvent, FocusableView, FontId, FontWeight, HighlightStyle, Hsla, InteractiveText,
KeyContext, ListSizingBehavior, Model, MouseButton, PaintQuad, ParentElement, Pixels, Render,
SharedString, Size, StrikethroughStyle, Styled, StyledText, Subscription, Task, TextStyle,
UnderlineStyle, UniformListScrollHandle, View, ViewContext, ViewInputHandler, VisualContext,
WeakFocusHandle, WeakView, WindowContext,
};
use highlight_matching_bracket::refresh_matching_bracket_highlights;
use hover_popover::{hide_hover, HoverState};
@@ -2281,7 +2281,7 @@ impl Editor {
}
if !text.is_empty() {
cx.write_to_primary(ClipboardItem::new(text));
cx.write_to_primary(ClipboardItem::new_string(text));
}
}
@@ -6546,7 +6546,10 @@ impl Editor {
s.select(selections);
});
this.insert("", cx);
cx.write_to_clipboard(ClipboardItem::new(text).with_metadata(clipboard_selections));
cx.write_to_clipboard(ClipboardItem::new_string_with_metadata(
text,
clipboard_selections,
));
});
}
@@ -6585,7 +6588,10 @@ impl Editor {
}
}
cx.write_to_clipboard(ClipboardItem::new(text).with_metadata(clipboard_selections));
cx.write_to_clipboard(ClipboardItem::new_string_with_metadata(
text,
clipboard_selections,
));
}
pub fn do_paste(
@@ -6669,13 +6675,21 @@ impl Editor {
pub fn paste(&mut self, _: &Paste, cx: &mut ViewContext<Self>) {
if let Some(item) = cx.read_from_clipboard() {
self.do_paste(
item.text(),
item.metadata::<Vec<ClipboardSelection>>(),
true,
cx,
)
};
let entries = item.entries();
match entries.first() {
// For now, we only support applying metadata if there's one string. In the future, we can incorporate all the selections
// of all the pasted entries.
Some(ClipboardEntry::String(clipboard_string)) if entries.len() == 1 => self
.do_paste(
clipboard_string.text(),
clipboard_string.metadata::<Vec<ClipboardSelection>>(),
true,
cx,
),
_ => self.do_paste(&item.text().unwrap_or_default(), None, true, cx),
}
}
}
pub fn undo(&mut self, _: &Undo, cx: &mut ViewContext<Self>) {
@@ -10496,7 +10510,7 @@ impl Editor {
if let Some(buffer) = self.buffer().read(cx).as_singleton() {
if let Some(file) = buffer.read(cx).file().and_then(|f| f.as_local()) {
if let Some(path) = file.abs_path(cx).to_str() {
cx.write_to_clipboard(ClipboardItem::new(path.to_string()));
cx.write_to_clipboard(ClipboardItem::new_string(path.to_string()));
}
}
}
@@ -10506,7 +10520,7 @@ impl Editor {
if let Some(buffer) = self.buffer().read(cx).as_singleton() {
if let Some(file) = buffer.read(cx).file().and_then(|f| f.as_local()) {
if let Some(path) = file.path().to_str() {
cx.write_to_clipboard(ClipboardItem::new(path.to_string()));
cx.write_to_clipboard(ClipboardItem::new_string(path.to_string()));
}
}
}
@@ -10696,7 +10710,7 @@ impl Editor {
match permalink {
Ok(permalink) => {
cx.write_to_clipboard(ClipboardItem::new(permalink.to_string()));
cx.write_to_clipboard(ClipboardItem::new_string(permalink.to_string()));
}
Err(err) => {
let message = format!("Failed to copy permalink: {err}");
@@ -11632,7 +11646,7 @@ impl Editor {
let Some(lines) = serde_json::to_string_pretty(&lines).log_err() else {
return;
};
cx.write_to_clipboard(ClipboardItem::new(lines));
cx.write_to_clipboard(ClipboardItem::new_string(lines));
}
pub fn inlay_hint_cache(&self) -> &InlayHintCache {
@@ -12899,7 +12913,9 @@ pub fn diagnostic_block_renderer(
.visible_on_hover(group_id.clone())
.on_click({
let message = diagnostic.message.clone();
move |_click, cx| cx.write_to_clipboard(ClipboardItem::new(message.clone()))
move |_click, cx| {
cx.write_to_clipboard(ClipboardItem::new_string(message.clone()))
}
})
.tooltip(|cx| Tooltip::text("Copy diagnostic message", cx)),
)
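The `paste` change above only applies `ClipboardSelection` metadata when the clipboard holds exactly one string entry; anything else falls back to plain concatenated text. A standalone sketch of that dispatch rule, using simplified stand-ins for gpui's `ClipboardItem`/`ClipboardEntry` (the real types carry images and typed metadata):

```rust
// Simplified stand-ins for gpui's clipboard types, shaped like the diff above.
#[derive(Clone)]
enum ClipboardEntry {
    String { text: String, metadata: Option<String> },
    Image(Vec<u8>),
}

struct ClipboardItem {
    entries: Vec<ClipboardEntry>,
}

// Mirror of the paste rule: metadata is honored only for a single string
// entry; a mixed or multi-entry clipboard pastes the concatenated text only.
fn paste_payload(item: &ClipboardItem) -> (String, Option<String>) {
    match item.entries.as_slice() {
        [ClipboardEntry::String { text, metadata }] => (text.clone(), metadata.clone()),
        entries => {
            let text = entries
                .iter()
                .filter_map(|entry| match entry {
                    ClipboardEntry::String { text, .. } => Some(text.as_str()),
                    ClipboardEntry::Image(_) => None,
                })
                .collect::<Vec<_>>()
                .join("");
            (text, None)
        }
    }
}
```

This matches the diff's comment that incorporating selections from multiple pasted entries is left for later.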

View File

@@ -3956,8 +3956,9 @@ async fn test_clipboard(cx: &mut gpui::TestAppContext) {
the lazy dog"});
cx.update_editor(|e, cx| e.copy(&Copy, cx));
assert_eq!(
cx.read_from_clipboard().map(|item| item.text().to_owned()),
Some("fox jumps over\n".to_owned())
cx.read_from_clipboard()
.and_then(|item| item.text().map(str::to_string)),
Some("fox jumps over\n".to_string())
);
// Paste with three selections, noticing how the copied full-line selection is inserted

View File

@@ -4282,7 +4282,7 @@ fn deploy_blame_entry_context_menu(
let sha = format!("{}", blame_entry.sha);
menu.on_blur_subscription(Subscription::new(|| {}))
.entry("Copy commit SHA", None, move |cx| {
cx.write_to_clipboard(ClipboardItem::new(sha.clone()));
cx.write_to_clipboard(ClipboardItem::new_string(sha.clone()));
})
.when_some(
details.and_then(|details| details.permalink.clone()),

crates/extension/build.rs Normal file
View File

@@ -0,0 +1,43 @@
use std::env;
use std::fs;
use std::path::PathBuf;
fn main() -> Result<(), Box<dyn std::error::Error>> {
copy_extension_api_rust_files()
}
// rust-analyzer doesn't support include! for files from outside the crate.
// Copy them to the OUT_DIR, so we can include them from there, which is supported.
fn copy_extension_api_rust_files() -> Result<(), Box<dyn std::error::Error>> {
let out_dir = env::var("OUT_DIR")?;
let input_dir = PathBuf::from("../extension_api/wit");
let output_dir = PathBuf::from(out_dir);
for entry in fs::read_dir(&input_dir)? {
let entry = entry?;
let path = entry.path();
if path.is_dir() {
for subentry in fs::read_dir(&path)? {
let subentry = subentry?;
let subpath = subentry.path();
if subpath.extension() == Some(std::ffi::OsStr::new("rs")) {
let relative_path = subpath.strip_prefix(&input_dir)?;
let destination = output_dir.join(relative_path);
fs::create_dir_all(destination.parent().unwrap())?;
fs::copy(&subpath, &destination)?;
println!("cargo:rerun-if-changed={}", subpath.display());
}
}
} else if path.extension() == Some(std::ffi::OsStr::new("rs")) {
let relative_path = path.strip_prefix(&input_dir)?;
let destination = output_dir.join(relative_path);
fs::create_dir_all(destination.parent().unwrap())?;
fs::copy(&path, &destination)?;
println!("cargo:rerun-if-changed={}", path.display());
}
}
Ok(())
}

View File

@@ -49,12 +49,10 @@ pub fn is_supported_wasm_api_version(
/// Returns the Wasm API version range that is supported by the Wasm host.
#[inline(always)]
pub fn wasm_api_version_range(release_channel: ReleaseChannel) -> RangeInclusive<SemanticVersion> {
let max_version = match release_channel {
ReleaseChannel::Dev | ReleaseChannel::Nightly => latest::MAX_VERSION,
ReleaseChannel::Stable | ReleaseChannel::Preview => since_v0_0_6::MAX_VERSION,
};
// Note: The release channel can be used to stage a new version of the extension API.
let _ = release_channel;
since_v0_0_1::MIN_VERSION..=max_version
since_v0_0_1::MIN_VERSION..=latest::MAX_VERSION
}
pub enum Extension {
@@ -71,7 +69,10 @@ impl Extension {
version: SemanticVersion,
component: &Component,
) -> Result<(Self, Instance)> {
if release_channel == ReleaseChannel::Dev && version >= latest::MIN_VERSION {
// Note: The release channel can be used to stage a new version of the extension API.
let _ = release_channel;
if version >= latest::MIN_VERSION {
let (extension, instance) =
latest::Extension::instantiate_async(store, &component, latest::linker())
.await
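The change above makes `wasm_api_version_range` return the same `MIN_VERSION..=MAX_VERSION` range on every release channel. A minimal sketch of that containment check, with a hypothetical tuple-struct `SemanticVersion` standing in for Zed's real type (derived ordering gives lexicographic major/minor/patch comparison):

```rust
// Hypothetical stand-in for Zed's SemanticVersion; derived Ord compares
// (major, minor, patch) lexicographically, which is what semver needs here.
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
struct SemanticVersion(u32, u32, u32);

const MIN_VERSION: SemanticVersion = SemanticVersion(0, 0, 1);
const MAX_VERSION: SemanticVersion = SemanticVersion(0, 0, 7);

// After the diff above, the supported range no longer depends on the
// release channel: every channel accepts MIN_VERSION..=MAX_VERSION.
fn is_supported(version: SemanticVersion) -> bool {
    (MIN_VERSION..=MAX_VERSION).contains(&version)
}
```

The release channel is still threaded through (`let _ = release_channel;`) so it can gate a staged API version again later.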

View File

@@ -8,7 +8,6 @@ use std::sync::{Arc, OnceLock};
use wasmtime::component::{Linker, Resource};
pub const MIN_VERSION: SemanticVersion = SemanticVersion::new(0, 0, 6);
pub const MAX_VERSION: SemanticVersion = SemanticVersion::new(0, 0, 6);
wasmtime::component::bindgen!({
async: true,
@@ -24,7 +23,7 @@ wasmtime::component::bindgen!({
});
mod settings {
include!("../../../../extension_api/wit/since_v0.0.6/settings.rs");
include!(concat!(env!("OUT_DIR"), "/since_v0.0.6/settings.rs"));
}
pub type ExtensionWorktree = Arc<dyn LspAdapterDelegate>;

View File

@@ -37,7 +37,7 @@ wasmtime::component::bindgen!({
pub use self::zed::extension::*;
mod settings {
include!("../../../../extension_api/wit/since_v0.0.7/settings.rs");
include!(concat!(env!("OUT_DIR"), "/since_v0.0.7/settings.rs"));
}
pub type ExtensionWorktree = Arc<dyn LspAdapterDelegate>;

View File

@@ -8,10 +8,6 @@ keywords = ["zed", "extension"]
edition = "2021"
license = "Apache-2.0"
# Don't publish v0.0.7 until we're ready to commit to the breaking API changes
# Marshall is DRI on this.
publish = false
[lints]
workspace = true

View File

@@ -23,7 +23,7 @@ need to set your `crate-type` accordingly:
```toml
[dependencies]
zed_extension_api = "0.0.6"
zed_extension_api = "0.0.7"
[lib]
crate-type = ["cdylib"]
@@ -63,6 +63,7 @@ Here is the compatibility of the `zed_extension_api` with versions of Zed:
| Zed version | `zed_extension_api` version |
| ----------- | --------------------------- |
| `0.149.x` | `0.0.1` - `0.0.7` |
| `0.131.x` | `0.0.1` - `0.0.6` |
| `0.130.x` | `0.0.1` - `0.0.5` |
| `0.129.x` | `0.0.1` - `0.0.4` |

View File

@@ -44,7 +44,7 @@ pub fn init(cx: &mut AppContext) {
cx.spawn(|_, mut cx| async move {
let specs = specs.await.to_string();
cx.update(|cx| cx.write_to_clipboard(ClipboardItem::new(specs.clone())))
cx.update(|cx| cx.write_to_clipboard(ClipboardItem::new_string(specs.clone())))
.log_err();
cx.prompt(

View File

@@ -67,6 +67,7 @@ serde_json.workspace = true
slotmap = "1.0.6"
smallvec.workspace = true
smol.workspace = true
strum.workspace = true
sum_tree.workspace = true
taffy = "0.4.3"
thiserror.workspace = true

View File

@@ -7,7 +7,7 @@ for Rust, designed to support a wide variety of applications.
GPUI is still in active development as we work on the Zed code editor and isn't yet on crates.io. You'll also need to use the latest version of stable Rust and be on macOS. Add the following to your Cargo.toml:
```
```toml
gpui = { git = "https://github.com/zed-industries/zed" }
```

View File

@@ -4,7 +4,7 @@ GPUI is designed for keyboard-first interactivity.
To expose functionality to the mouse, you render a button with a click handler.
To expose functionality to the keyboard, you bind an *action* in a *key context*.
To expose functionality to the keyboard, you bind an _action_ in a _key context_.
Actions are similar to framework-level events like `MouseDown`, `KeyDown`, etc., but you can define them yourself:
@@ -55,7 +55,7 @@ impl Render for Menu {
}
```
In order to bind keys to actions, you need to declare a *key context* for part of the element tree by calling `key_context`.
In order to bind keys to actions, you need to declare a _key context_ for part of the element tree by calling `key_context`.
```rust
impl Render for Menu {
@@ -97,5 +97,4 @@ If you had opted for the more complex type definition, you'd provide the seriali
"shift-down": ["menu::Move", {direction: "down", select: true}]
}
}
```

View File

@@ -11,9 +11,12 @@ use std::{
use anyhow::{anyhow, Result};
use derive_more::{Deref, DerefMut};
use futures::{channel::oneshot, future::LocalBoxFuture, Future};
use futures::{
channel::oneshot,
future::{LocalBoxFuture, Shared},
Future, FutureExt,
};
use slotmap::SlotMap;
use smol::future::FutureExt;
pub use async_context::*;
use collections::{FxHashMap, FxHashSet, VecDeque};
@@ -25,8 +28,8 @@ pub use test_context::*;
use util::ResultExt;
use crate::{
current_platform, init_app_menus, Action, ActionRegistry, Any, AnyView, AnyWindowHandle,
AssetCache, AssetSource, BackgroundExecutor, ClipboardItem, Context, DispatchPhase, DisplayId,
current_platform, hash, init_app_menus, Action, ActionRegistry, Any, AnyView, AnyWindowHandle,
Asset, AssetSource, BackgroundExecutor, ClipboardItem, Context, DispatchPhase, DisplayId,
Entity, EventEmitter, ForegroundExecutor, Global, KeyBinding, Keymap, Keystroke, LayoutId,
Menu, MenuItem, OwnedMenu, PathPromptOptions, Pixels, Platform, PlatformDisplay, Point,
PromptBuilder, PromptHandle, PromptLevel, Render, RenderablePromptHandle, Reservation,
@@ -220,7 +223,6 @@ pub struct AppContext {
pub(crate) background_executor: BackgroundExecutor,
pub(crate) foreground_executor: ForegroundExecutor,
pub(crate) loading_assets: FxHashMap<(TypeId, u64), Box<dyn Any>>,
pub(crate) asset_cache: AssetCache,
asset_source: Arc<dyn AssetSource>,
pub(crate) svg_renderer: SvgRenderer,
http_client: Arc<dyn HttpClient>,
@@ -276,7 +278,6 @@ impl AppContext {
background_executor: executor,
foreground_executor,
svg_renderer: SvgRenderer::new(asset_source.clone()),
asset_cache: AssetCache::new(),
loading_assets: Default::default(),
asset_source,
http_client,
@@ -1267,6 +1268,40 @@ impl AppContext {
) {
self.prompt_builder = Some(PromptBuilder::Custom(Box::new(renderer)))
}
/// Remove an asset from GPUI's cache
pub fn remove_cached_asset<A: Asset + 'static>(&mut self, source: &A::Source) {
let asset_id = (TypeId::of::<A>(), hash(source));
self.loading_assets.remove(&asset_id);
}
/// Asynchronously load an asset. If the asset hasn't finished loading, this will return `None`.
///
/// Note that multiple calls to this method will only result in one `Asset::load` call at a
/// time, and the result of that call will be cached.
///
/// This asset will not be cached by default; see [Self::use_cached_asset].
pub fn fetch_asset<A: Asset + 'static>(
&mut self,
source: &A::Source,
) -> (Shared<Task<A::Output>>, bool) {
let asset_id = (TypeId::of::<A>(), hash(source));
let mut is_first = false;
let task = self
.loading_assets
.remove(&asset_id)
.map(|boxed_task| *boxed_task.downcast::<Shared<Task<A::Output>>>().unwrap())
.unwrap_or_else(|| {
is_first = true;
let future = A::load(source.clone(), self);
let task = self.background_executor().spawn(future).shared();
task
});
self.loading_assets.insert(asset_id, Box::new(task.clone()));
(task, is_first)
}
}
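The new `fetch_asset` above de-duplicates loads: repeated calls for the same source share one `Asset::load`, and the call reports whether it triggered the initial load. A synchronous sketch of that idea with a plain `HashMap` in place of gpui's shared background task (names here are illustrative, not gpui API):

```rust
use std::collections::HashMap;

// Illustrative cache: repeated fetches for the same key reuse the stored
// result, so the loader closure runs exactly once per key.
struct AssetCache {
    loaded: HashMap<u64, String>,
    load_count: usize,
}

impl AssetCache {
    fn new() -> Self {
        Self { loaded: HashMap::new(), load_count: 0 }
    }

    /// Returns the asset and whether this call performed the initial load,
    /// mirroring the `(task, is_first)` pair returned by `fetch_asset`.
    fn fetch(&mut self, key: u64, load: impl FnOnce() -> String) -> (String, bool) {
        if let Some(asset) = self.loaded.get(&key) {
            return (asset.clone(), false);
        }
        self.load_count += 1;
        let asset = load();
        self.loaded.insert(key, asset.clone());
        (asset, true)
    }
}
```

In gpui the cached value is a `Shared<Task<A::Output>>` rather than the finished output, so callers that arrive mid-load await the same in-flight future instead of starting a second one.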
impl Context for AppContext {

View File

@@ -1,17 +1,14 @@
use crate::{SharedString, SharedUri, WindowContext};
use collections::FxHashMap;
use crate::{AppContext, SharedString, SharedUri};
use futures::Future;
use parking_lot::Mutex;
use std::any::TypeId;
use std::hash::{Hash, Hasher};
use std::path::PathBuf;
use std::sync::Arc;
use std::{any::Any, path::PathBuf};
#[derive(Debug, PartialEq, Eq, Hash, Clone)]
pub(crate) enum UriOrPath {
Uri(SharedUri),
Path(Arc<PathBuf>),
Asset(SharedString),
Embedded(SharedString),
}
impl From<SharedUri> for UriOrPath {
@@ -37,7 +34,7 @@ pub trait Asset {
/// Load the asset asynchronously
fn load(
source: Self::Source,
cx: &mut WindowContext,
cx: &mut AppContext,
) -> impl Future<Output = Self::Output> + Send + 'static;
}
@@ -47,42 +44,3 @@ pub fn hash<T: Hash>(data: &T) -> u64 {
data.hash(&mut hasher);
hasher.finish()
}
/// A cache for assets.
#[derive(Clone)]
pub struct AssetCache {
assets: Arc<Mutex<FxHashMap<(TypeId, u64), Box<dyn Any + Send>>>>,
}
impl AssetCache {
pub(crate) fn new() -> Self {
Self {
assets: Default::default(),
}
}
/// Get the asset from the cache, if it exists.
pub fn get<A: Asset + 'static>(&self, source: &A::Source) -> Option<A::Output> {
self.assets
.lock()
.get(&(TypeId::of::<A>(), hash(&source)))
.and_then(|task| task.downcast_ref::<A::Output>())
.cloned()
}
/// Insert the asset into the cache.
pub fn insert<A: Asset + 'static>(&mut self, source: A::Source, output: A::Output) {
self.assets
.lock()
.insert((TypeId::of::<A>(), hash(&source)), Box::new(output));
}
/// Remove an entry from the asset cache
pub fn remove<A: Asset + 'static>(&mut self, source: &A::Source) -> Option<A::Output> {
self.assets
.lock()
.remove(&(TypeId::of::<A>(), hash(&source)))
.and_then(|any| any.downcast::<A::Output>().ok())
.map(|boxed| *boxed)
}
}

View File

@@ -38,14 +38,22 @@ pub(crate) struct RenderImageParams {
pub(crate) frame_index: usize,
}
/// A cached and processed image.
pub struct ImageData {
/// A cached and processed image, in BGRA format
pub struct RenderImage {
/// The ID associated with this image
pub id: ImageId,
data: SmallVec<[Frame; 1]>,
}
impl ImageData {
impl PartialEq for RenderImage {
fn eq(&self, other: &Self) -> bool {
self.id == other.id
}
}
impl Eq for RenderImage {}
impl RenderImage {
/// Create a new image from the given data.
pub fn new(data: impl Into<SmallVec<[Frame; 1]>>) -> Self {
static NEXT_ID: AtomicUsize = AtomicUsize::new(0);
@@ -57,8 +65,10 @@ impl ImageData {
}
/// Convert this image into a byte slice.
pub fn as_bytes(&self, frame_index: usize) -> &[u8] {
&self.data[frame_index].buffer()
pub fn as_bytes(&self, frame_index: usize) -> Option<&[u8]> {
self.data
.get(frame_index)
.map(|frame| frame.buffer().as_raw().as_slice())
}
/// Get the size of this image, in pixels.
@@ -78,7 +88,7 @@ impl ImageData {
}
}
impl fmt::Debug for ImageData {
impl fmt::Debug for RenderImage {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_struct("ImageData")
.field("id", &self.id)

View File

@@ -1,16 +1,14 @@
use crate::{
point, px, size, AbsoluteLength, Asset, Bounds, DefiniteLength, DevicePixels, Element,
ElementId, GlobalElementId, Hitbox, ImageData, InteractiveElement, Interactivity, IntoElement,
LayoutId, Length, Pixels, SharedString, SharedUri, Size, StyleRefinement, Styled, SvgSize,
UriOrPath, WindowContext,
px, AbsoluteLength, AppContext, Asset, Bounds, DefiniteLength, Element, ElementId,
GlobalElementId, Hitbox, Image, InteractiveElement, Interactivity, IntoElement, LayoutId,
Length, ObjectFit, Pixels, RenderImage, SharedString, SharedUri, Size, StyleRefinement, Styled,
SvgSize, UriOrPath, WindowContext,
};
use futures::{AsyncReadExt, Future};
use http_client;
use image::{
codecs::gif::GifDecoder, AnimationDecoder, Frame, ImageBuffer, ImageError, ImageFormat,
};
#[cfg(target_os = "macos")]
use media::core_video::CVImageBuffer;
use smallvec::SmallVec;
use std::{
fs,
@@ -23,20 +21,18 @@ use thiserror::Error;
use util::ResultExt;
/// A source of image content.
#[derive(Clone, Debug)]
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum ImageSource {
/// Image content will be loaded from provided URI at render time.
Uri(SharedUri),
/// Image content will be loaded from the provided file at render time.
File(Arc<PathBuf>),
/// Cached image data
Data(Arc<ImageData>),
Render(Arc<RenderImage>),
/// Cached image data
Image(Arc<Image>),
/// Image content will be loaded from Asset at render time.
Asset(SharedString),
// TODO: move surface definitions into mac platform module
/// A CoreVideo image buffer
#[cfg(target_os = "macos")]
Surface(CVImageBuffer),
Embedded(SharedString),
}
fn is_uri(uri: &str) -> bool {
@@ -54,7 +50,7 @@ impl From<&'static str> for ImageSource {
if is_uri(&s) {
Self::Uri(s.into())
} else {
Self::Asset(s.into())
Self::Embedded(s.into())
}
}
}
@@ -64,7 +60,7 @@ impl From<String> for ImageSource {
if is_uri(&s) {
Self::Uri(s.into())
} else {
Self::Asset(s.into())
Self::Embedded(s.into())
}
}
}
@@ -74,7 +70,7 @@ impl From<SharedString> for ImageSource {
if is_uri(&s) {
Self::Uri(s.into())
} else {
Self::Asset(s)
Self::Embedded(s)
}
}
}
@@ -91,16 +87,9 @@ impl From<PathBuf> for ImageSource {
}
}
impl From<Arc<ImageData>> for ImageSource {
fn from(value: Arc<ImageData>) -> Self {
Self::Data(value)
}
}
#[cfg(target_os = "macos")]
impl From<CVImageBuffer> for ImageSource {
fn from(value: CVImageBuffer) -> Self {
Self::Surface(value)
impl From<Arc<RenderImage>> for ImageSource {
fn from(value: Arc<RenderImage>) -> Self {
Self::Render(value)
}
}
@@ -122,121 +111,6 @@ pub fn img(source: impl Into<ImageSource>) -> Img {
}
}
/// How to fit the image into the bounds of the element.
pub enum ObjectFit {
/// The image will be stretched to fill the bounds of the element.
Fill,
/// The image will be scaled to fit within the bounds of the element.
Contain,
/// The image will be scaled to cover the bounds of the element.
Cover,
/// The image will be scaled down to fit within the bounds of the element.
ScaleDown,
/// The image will maintain its original size.
None,
}
impl ObjectFit {
/// Get the bounds of the image within the given bounds.
pub fn get_bounds(
&self,
bounds: Bounds<Pixels>,
image_size: Size<DevicePixels>,
) -> Bounds<Pixels> {
let image_size = image_size.map(|dimension| Pixels::from(u32::from(dimension)));
let image_ratio = image_size.width / image_size.height;
let bounds_ratio = bounds.size.width / bounds.size.height;
let result_bounds = match self {
ObjectFit::Fill => bounds,
ObjectFit::Contain => {
let new_size = if bounds_ratio > image_ratio {
size(
image_size.width * (bounds.size.height / image_size.height),
bounds.size.height,
)
} else {
size(
bounds.size.width,
image_size.height * (bounds.size.width / image_size.width),
)
};
Bounds {
origin: point(
bounds.origin.x + (bounds.size.width - new_size.width) / 2.0,
bounds.origin.y + (bounds.size.height - new_size.height) / 2.0,
),
size: new_size,
}
}
ObjectFit::ScaleDown => {
// Check if the image is larger than the bounds in either dimension.
if image_size.width > bounds.size.width || image_size.height > bounds.size.height {
// If the image is larger, use the same logic as Contain to scale it down.
let new_size = if bounds_ratio > image_ratio {
size(
image_size.width * (bounds.size.height / image_size.height),
bounds.size.height,
)
} else {
size(
bounds.size.width,
image_size.height * (bounds.size.width / image_size.width),
)
};
Bounds {
origin: point(
bounds.origin.x + (bounds.size.width - new_size.width) / 2.0,
bounds.origin.y + (bounds.size.height - new_size.height) / 2.0,
),
size: new_size,
}
} else {
// If the image is smaller than or equal to the container, display it at its original size,
// centered within the container.
let original_size = size(image_size.width, image_size.height);
Bounds {
origin: point(
bounds.origin.x + (bounds.size.width - original_size.width) / 2.0,
bounds.origin.y + (bounds.size.height - original_size.height) / 2.0,
),
size: original_size,
}
}
}
ObjectFit::Cover => {
let new_size = if bounds_ratio > image_ratio {
size(
bounds.size.width,
image_size.height * (bounds.size.width / image_size.width),
)
} else {
size(
image_size.width * (bounds.size.height / image_size.height),
bounds.size.height,
)
};
Bounds {
origin: point(
bounds.origin.x + (bounds.size.width - new_size.width) / 2.0,
bounds.origin.y + (bounds.size.height - new_size.height) / 2.0,
),
size: new_size,
}
}
ObjectFit::None => Bounds {
origin: bounds.origin,
size: image_size,
},
};
result_bounds
}
}
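The `ObjectFit::Contain` branch above scales the image to fit entirely within the bounds while preserving aspect ratio, letting whichever dimension runs out first limit the size. A standalone sketch of just that sizing math on plain `f32` pairs (a simplification of the `Bounds`/`Size` types used above):

```rust
// Sketch of the Contain sizing rule: preserve aspect ratio and fit
// entirely inside the bounds. Tuples stand in for gpui's Size types.
fn contain_size(bounds: (f32, f32), image: (f32, f32)) -> (f32, f32) {
    let bounds_ratio = bounds.0 / bounds.1;
    let image_ratio = image.0 / image.1;
    if bounds_ratio > image_ratio {
        // Bounds are relatively wider: height is the limiting dimension.
        (image.0 * (bounds.1 / image.1), bounds.1)
    } else {
        // Bounds are relatively taller: width is the limiting dimension.
        (bounds.0, image.1 * (bounds.0 / image.0))
    }
}
```

`Cover` is the mirror image (scale until the bounds are fully covered), and `ScaleDown` applies `Contain` only when the image is larger than the bounds.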
impl Img {
/// A list of all format extensions currently supported by this img element
pub fn extensions() -> &'static [&'static str] {
@@ -291,7 +165,7 @@ impl Element for Img {
let layout_id = self
.interactivity
.request_layout(global_id, cx, |mut style, cx| {
if let Some(data) = self.source.data(cx) {
if let Some(data) = self.source.use_data(cx) {
if let Some(state) = &mut state {
let frame_count = data.frame_count();
if frame_count > 1 {
@@ -363,7 +237,7 @@ impl Element for Img {
.paint(global_id, bounds, hitbox.as_ref(), cx, |style, cx| {
let corner_radii = style.corner_radii.to_pixels(bounds.size, cx.rem_size());
if let Some(data) = source.data(cx) {
if let Some(data) = source.use_data(cx) {
let new_bounds = self.object_fit.get_bounds(bounds, data.size(*frame_index));
cx.paint_image(
new_bounds,
@@ -374,17 +248,6 @@ impl Element for Img {
)
.log_err();
}
match source {
#[cfg(target_os = "macos")]
ImageSource::Surface(surface) => {
let size = size(surface.width().into(), surface.height().into());
let new_bounds = self.object_fit.get_bounds(bounds, size);
// TODO: Add support for corner_radii and grayscale.
cx.paint_surface(new_bounds, surface);
}
_ => {}
}
})
}
}
@@ -410,39 +273,74 @@ impl InteractiveElement for Img {
}
impl ImageSource {
fn data(&self, cx: &mut WindowContext) -> Option<Arc<ImageData>> {
pub(crate) fn use_data(&self, cx: &mut WindowContext) -> Option<Arc<RenderImage>> {
match self {
ImageSource::Uri(_) | ImageSource::Asset(_) | ImageSource::File(_) => {
ImageSource::Uri(_) | ImageSource::Embedded(_) | ImageSource::File(_) => {
let uri_or_path: UriOrPath = match self {
ImageSource::Uri(uri) => uri.clone().into(),
ImageSource::File(path) => path.clone().into(),
ImageSource::Asset(path) => UriOrPath::Asset(path.clone()),
ImageSource::Embedded(path) => UriOrPath::Embedded(path.clone()),
_ => unreachable!(),
};
cx.use_cached_asset::<Image>(&uri_or_path)?.log_err()
cx.use_asset::<ImageAsset>(&uri_or_path)?.log_err()
}
ImageSource::Data(data) => Some(data.to_owned()),
#[cfg(target_os = "macos")]
ImageSource::Surface(_) => None,
ImageSource::Render(data) => Some(data.to_owned()),
ImageSource::Image(data) => cx.use_asset::<ImageDecoder>(data)?.log_err(),
}
}
/// Fetch the data associated with this source, using GPUI's asset caching
pub async fn data(&self, cx: &mut AppContext) -> Option<Arc<RenderImage>> {
match self {
ImageSource::Uri(_) | ImageSource::Embedded(_) | ImageSource::File(_) => {
let uri_or_path: UriOrPath = match self {
ImageSource::Uri(uri) => uri.clone().into(),
ImageSource::File(path) => path.clone().into(),
ImageSource::Embedded(path) => UriOrPath::Embedded(path.clone()),
_ => unreachable!(),
};
cx.fetch_asset::<ImageAsset>(&uri_or_path).0.await.log_err()
}
ImageSource::Render(data) => Some(data.to_owned()),
ImageSource::Image(data) => cx.fetch_asset::<ImageDecoder>(data).0.await.log_err(),
}
}
}
#[derive(Clone)]
enum Image {}
enum ImageDecoder {}
impl Asset for Image {
type Source = UriOrPath;
type Output = Result<Arc<ImageData>, ImageCacheError>;
impl Asset for ImageDecoder {
type Source = Arc<Image>;
type Output = Result<Arc<RenderImage>, Arc<anyhow::Error>>;
fn load(
source: Self::Source,
cx: &mut WindowContext,
cx: &mut AppContext,
) -> impl Future<Output = Self::Output> + Send + 'static {
let result = source.to_image_data(cx).map_err(Arc::new);
async { result }
}
}
#[derive(Clone)]
enum ImageAsset {}
impl Asset for ImageAsset {
type Source = UriOrPath;
type Output = Result<Arc<RenderImage>, ImageCacheError>;
fn load(
source: Self::Source,
cx: &mut AppContext,
) -> impl Future<Output = Self::Output> + Send + 'static {
let client = cx.http_client();
let scale_factor = cx.scale_factor();
// TODO: Can we make SVGs always rescale?
// let scale_factor = cx.scale_factor();
let svg_renderer = cx.svg_renderer();
let asset_source = cx.asset_source().clone();
async move {
@@ -461,7 +359,7 @@ impl Asset for Image {
}
body
}
UriOrPath::Asset(path) => {
UriOrPath::Embedded(path) => {
let data = asset_source.load(&path).ok().flatten();
if let Some(data) = data {
data.to_vec()
@@ -503,15 +401,16 @@ impl Asset for Image {
}
};
ImageData::new(data)
RenderImage::new(data)
} else {
let pixmap =
svg_renderer.render_pixmap(&bytes, SvgSize::ScaleFactor(scale_factor))?;
// TODO: Can we make svgs always rescale?
svg_renderer.render_pixmap(&bytes, SvgSize::ScaleFactor(1.0))?;
let buffer =
ImageBuffer::from_raw(pixmap.width(), pixmap.height(), pixmap.take()).unwrap();
ImageData::new(SmallVec::from_elem(Frame::new(buffer), 1))
RenderImage::new(SmallVec::from_elem(Frame::new(buffer), 1))
};
Ok(Arc::new(data))

View File

@@ -5,6 +5,7 @@ mod deferred;
mod div;
mod img;
mod list;
mod surface;
mod svg;
mod text;
mod uniform_list;
@@ -16,6 +17,7 @@ pub use deferred::*;
pub use div::*;
pub use img::*;
pub use list::*;
pub use surface::*;
pub use svg::*;
pub use text::*;
pub use uniform_list::*;

View File

@@ -0,0 +1,111 @@
use crate::{
size, Bounds, Element, ElementId, GlobalElementId, IntoElement, LayoutId, ObjectFit, Pixels,
Style, StyleRefinement, Styled, WindowContext,
};
#[cfg(target_os = "macos")]
use media::core_video::CVImageBuffer;
use refineable::Refineable;
/// A source of a surface's content.
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum SurfaceSource {
/// A macOS image buffer from CoreVideo
#[cfg(target_os = "macos")]
Surface(CVImageBuffer),
}
#[cfg(target_os = "macos")]
impl From<CVImageBuffer> for SurfaceSource {
fn from(value: CVImageBuffer) -> Self {
SurfaceSource::Surface(value)
}
}
/// A surface element.
pub struct Surface {
source: SurfaceSource,
object_fit: ObjectFit,
style: StyleRefinement,
}
/// Create a new surface element.
pub fn surface(source: impl Into<SurfaceSource>) -> Surface {
Surface {
source: source.into(),
object_fit: ObjectFit::Contain,
style: Default::default(),
}
}
impl Surface {
/// Set the object fit for the image.
pub fn object_fit(mut self, object_fit: ObjectFit) -> Self {
self.object_fit = object_fit;
self
}
}
impl Element for Surface {
type RequestLayoutState = ();
type PrepaintState = ();
fn id(&self) -> Option<ElementId> {
None
}
fn request_layout(
&mut self,
_global_id: Option<&GlobalElementId>,
cx: &mut WindowContext,
) -> (LayoutId, Self::RequestLayoutState) {
let mut style = Style::default();
style.refine(&self.style);
let layout_id = cx.request_layout(style, []);
(layout_id, ())
}
fn prepaint(
&mut self,
_global_id: Option<&GlobalElementId>,
_bounds: Bounds<Pixels>,
_request_layout: &mut Self::RequestLayoutState,
_cx: &mut WindowContext,
) -> Self::PrepaintState {
()
}
fn paint(
&mut self,
_global_id: Option<&GlobalElementId>,
bounds: Bounds<Pixels>,
_: &mut Self::RequestLayoutState,
_: &mut Self::PrepaintState,
cx: &mut WindowContext,
) {
match &self.source {
#[cfg(target_os = "macos")]
SurfaceSource::Surface(surface) => {
let size = size(surface.width().into(), surface.height().into());
let new_bounds = self.object_fit.get_bounds(bounds, size);
// TODO: Add support for corner_radii
cx.paint_surface(new_bounds, surface.clone());
}
#[allow(unreachable_patterns)]
_ => {}
}
}
}
impl IntoElement for Surface {
type Element = Self;
fn into_element(self) -> Self::Element {
self
}
}
impl Styled for Surface {
fn style(&mut self) -> &mut StyleRefinement {
&mut self.style
}
}
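The paint path above asks `object_fit.get_bounds(bounds, size)` for the surface's drawn bounds. For `ObjectFit::Contain` (the default set in `surface()`), the math can be sketched as follows, under assumed CSS-like "contain" semantics (uniform scale so the source fits entirely inside the target); this is an illustrative stand-in, not the gpui implementation:

```rust
// "Contain" fit: scale the source uniformly so it fits inside the target
// while preserving its aspect ratio.
fn contain(target: (f32, f32), source: (f32, f32)) -> (f32, f32) {
    let scale = (target.0 / source.0).min(target.1 / source.1);
    (source.0 * scale, source.1 * scale)
}
```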

View File

@@ -2447,10 +2447,24 @@ impl From<usize> for Pixels {
/// affected by the device's scale factor, `DevicePixels` always correspond to real pixels on the
/// display.
#[derive(
Add, AddAssign, Clone, Copy, Default, Div, Eq, Hash, Ord, PartialEq, PartialOrd, Sub, SubAssign,
Add,
AddAssign,
Clone,
Copy,
Default,
Div,
Eq,
Hash,
Ord,
PartialEq,
PartialOrd,
Sub,
SubAssign,
Serialize,
Deserialize,
)]
#[repr(transparent)]
pub struct DevicePixels(pub(crate) i32);
pub struct DevicePixels(pub i32);
impl DevicePixels {
/// Converts the `DevicePixels` value to the number of bytes needed to represent it in memory.

View File

@@ -20,21 +20,25 @@ mod test;
mod windows;
use crate::{
point, Action, AnyWindowHandle, AsyncWindowContext, BackgroundExecutor, Bounds, DevicePixels,
DispatchEventResult, Font, FontId, FontMetrics, FontRun, ForegroundExecutor, GPUSpecs, GlyphId,
Keymap, LineLayout, Pixels, PlatformInput, Point, RenderGlyphParams, RenderImageParams,
RenderSvgParams, Scene, SharedString, Size, Task, TaskLabel, WindowContext,
DEFAULT_WINDOW_SIZE,
point, Action, AnyWindowHandle, AppContext, AsyncWindowContext, BackgroundExecutor, Bounds,
DevicePixels, DispatchEventResult, Font, FontId, FontMetrics, FontRun, ForegroundExecutor,
GPUSpecs, GlyphId, ImageSource, Keymap, LineLayout, Pixels, PlatformInput, Point,
RenderGlyphParams, RenderImage, RenderImageParams, RenderSvgParams, Scene, SharedString, Size,
SvgSize, Task, TaskLabel, WindowContext, DEFAULT_WINDOW_SIZE,
};
use anyhow::Result;
use anyhow::{anyhow, Result};
use async_task::Runnable;
use futures::channel::oneshot;
use image::codecs::gif::GifDecoder;
use image::{AnimationDecoder as _, Frame};
use parking::Unparker;
use raw_window_handle::{HasDisplayHandle, HasWindowHandle};
use seahash::SeaHasher;
use serde::{Deserialize, Serialize};
use smallvec::SmallVec;
use std::borrow::Cow;
use std::hash::{Hash, Hasher};
use std::io::Cursor;
use std::time::{Duration, Instant};
use std::{
fmt::{self, Debug},
@@ -43,6 +47,7 @@ use std::{
rc::Rc,
sync::Arc,
};
use strum::EnumIter;
use uuid::Uuid;
pub use app_menu::*;
@@ -623,7 +628,7 @@ pub trait InputHandler: 'static {
);
/// Replace the text in the given document range with the given text,
/// and mark the given text as part of of an IME 'composing' state
/// and mark the given text as part of an IME 'composing' state
/// Corresponds to [setMarkedText(_:selectedRange:replacementRange:)](https://developer.apple.com/documentation/appkit/nstextinputclient/1438246-setmarkedtext)
///
/// range_utf16 is in terms of UTF-16 characters
@@ -969,12 +974,227 @@ impl Default for CursorStyle {
/// A clipboard item that should be copied to the clipboard
#[derive(Clone, Debug, Eq, PartialEq)]
pub struct ClipboardItem {
entries: Vec<ClipboardEntry>,
}
/// Either a ClipboardString or a ClipboardImage
#[derive(Clone, Debug, Eq, PartialEq)]
pub enum ClipboardEntry {
/// A string entry
String(ClipboardString),
/// An image entry
Image(Image),
}
impl ClipboardItem {
/// Create a new ClipboardItem with the given entries
pub fn new(entries: Vec<ClipboardEntry>) -> Self {
Self { entries }
}
/// Create a new ClipboardItem::String with no associated metadata
pub fn new_string(text: String) -> Self {
Self {
entries: vec![ClipboardEntry::String(ClipboardString::new(text))],
}
}
/// Create a new ClipboardItem::String with the given text and associated metadata
pub fn new_string_with_metadata<T: Serialize>(text: String, metadata: T) -> Self {
Self {
entries: vec![ClipboardEntry::String(
ClipboardString::new(text).with_metadata(metadata),
)],
}
}
/// Concatenates together all the ClipboardString entries in the item.
/// Returns None if there were no ClipboardString entries.
pub fn text(&self) -> Option<String> {
let mut answer = String::new();
let mut any_entries = false;
for entry in self.entries.iter() {
if let ClipboardEntry::String(ClipboardString { text, metadata: _ }) = entry {
answer.push_str(text);
any_entries = true;
}
}
if any_entries {
Some(answer)
} else {
None
}
}
/// Get the item's entries
pub fn entries(&self) -> &[ClipboardEntry] {
&self.entries
}
/// Get owned versions of the item's entries
pub fn into_entries(self) -> impl Iterator<Item = ClipboardEntry> {
self.entries.into_iter()
}
}
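The `text()` method above concatenates every string entry and returns `None` only when no string entries exist at all (an item holding only images yields `None`, not `Some("")`). A self-contained model of that logic, using illustrative stand-ins rather than the gpui types:

```rust
// Stand-in for ClipboardEntry: only Text entries contribute to the result.
enum Entry {
    Text(String),
    Image(Vec<u8>),
}

// Mirrors ClipboardItem::text(): concatenate all text entries, or None if
// there were no text entries at all.
fn concat_text(entries: &[Entry]) -> Option<String> {
    let mut answer = String::new();
    let mut any_strings = false;
    for entry in entries {
        if let Entry::Text(text) = entry {
            answer.push_str(text);
            any_strings = true;
        }
    }
    any_strings.then_some(answer)
}
```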
/// One of the editor's supported image formats (e.g. PNG, JPEG) - used when dealing with images in the clipboard
#[derive(Clone, Copy, Debug, Eq, PartialEq, EnumIter, Hash)]
pub enum ImageFormat {
// Sorted from most to least likely to be pasted into an editor,
// which matters when we iterate through them trying to see if
// clipboard content matches them.
/// .png
Png,
/// .jpeg or .jpg
Jpeg,
/// .webp
Webp,
/// .gif
Gif,
/// .svg
Svg,
/// .bmp
Bmp,
/// .tif or .tiff
Tiff,
}
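As the comment above notes, the declaration order doubles as the probe order when reading the clipboard: formats are tried most-likely-first and the first one present on the pasteboard wins. A sketch of that first-match-wins scan, flattened to plain strings:

```rust
// Probe formats in priority order; return the first one that is available.
fn first_matching<'a>(available: &[&'a str], probe_order: &[&'a str]) -> Option<&'a str> {
    probe_order.iter().copied().find(|f| available.contains(f))
}
```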
/// An image, with a format and certain bytes
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct Image {
/// The image format the bytes represent (e.g. PNG)
format: ImageFormat,
/// The raw image bytes
bytes: Vec<u8>,
id: u64,
}
impl Hash for Image {
fn hash<H: Hasher>(&self, state: &mut H) {
state.write_u64(self.id);
}
}
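The `Hash` impl above deliberately consults only the `id` field, so the (potentially large) byte payload never needs to be rehashed. A self-contained sketch of the same pattern with a stand-in type:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in for Image: identity is the precomputed id, not the bytes.
struct Img {
    id: u64,
    bytes: Vec<u8>,
}

impl Hash for Img {
    fn hash<H: Hasher>(&self, state: &mut H) {
        state.write_u64(self.id); // bytes intentionally ignored
    }
}

fn hash_of(img: &Img) -> u64 {
    let mut hasher = DefaultHasher::new();
    img.hash(&mut hasher);
    hasher.finish()
}
```

Two values with the same id hash identically even when their byte payloads differ.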
impl Image {
/// Get this image's ID
pub fn id(&self) -> u64 {
self.id
}
/// Use the GPUI `use_asset` API to make this image renderable
pub fn use_render_image(self: Arc<Self>, cx: &mut WindowContext) -> Option<Arc<RenderImage>> {
ImageSource::Image(self).use_data(cx)
}
/// Convert an `ImageData` object to a clipboard image in PNG format.
pub fn from_image_data(render_image: &RenderImage) -> Result<Self> {
// hardcode frame index 0; we don't currently support copying animated GIFs
if let Some(input_bytes) = render_image.as_bytes(0) {
let rgba_bytes = {
let mut buf = input_bytes.to_vec();
// Convert from BGRA to RGBA.
for pixel in buf.chunks_exact_mut(4) {
pixel.swap(0, 2);
}
buf
};
// Re-encode the RGBA buffer as a PNG.
let size = render_image.size(0);
let image_buffer =
image::RgbaImage::from_raw(size.width.0 as u32, size.height.0 as u32, rgba_bytes)
.ok_or_else(|| anyhow!("RGBA buffer length did not match image dimensions"))?;
let mut cursor = Cursor::new(Vec::new());
image::DynamicImage::ImageRgba8(image_buffer)
.write_to(&mut cursor, image::ImageOutputFormat::Png)?;
Ok(Self {
format: ImageFormat::Png,
bytes: cursor.into_inner(),
id: rand::random(),
})
} else {
Err(anyhow!(
"RenderImage did not have a frame 0, which should never happen."
))
}
}
/// Convert the clipboard image to a `RenderImage` object.
pub fn to_image_data(&self, cx: &AppContext) -> Result<Arc<RenderImage>> {
fn frames_for_image(
bytes: &[u8],
format: image::ImageFormat,
) -> Result<SmallVec<[Frame; 1]>> {
let mut data = image::load_from_memory_with_format(bytes, format)?.into_rgba8();
// Convert from RGBA to BGRA.
for pixel in data.chunks_exact_mut(4) {
pixel.swap(0, 2);
}
Ok(SmallVec::from_elem(Frame::new(data), 1))
}
let frames = match self.format {
ImageFormat::Gif => {
let decoder = GifDecoder::new(Cursor::new(&self.bytes))?;
let mut frames = SmallVec::new();
for frame in decoder.into_frames() {
let mut frame = frame?;
// Convert from RGBA to BGRA.
for pixel in frame.buffer_mut().chunks_exact_mut(4) {
pixel.swap(0, 2);
}
frames.push(frame);
}
frames
}
ImageFormat::Png => frames_for_image(&self.bytes, image::ImageFormat::Png)?,
ImageFormat::Jpeg => frames_for_image(&self.bytes, image::ImageFormat::Jpeg)?,
ImageFormat::Webp => frames_for_image(&self.bytes, image::ImageFormat::WebP)?,
ImageFormat::Bmp => frames_for_image(&self.bytes, image::ImageFormat::Bmp)?,
ImageFormat::Tiff => frames_for_image(&self.bytes, image::ImageFormat::Tiff)?,
ImageFormat::Svg => {
// TODO: Fix this
let pixmap = cx
.svg_renderer()
.render_pixmap(&self.bytes, SvgSize::ScaleFactor(1.0))?;
let buffer =
image::ImageBuffer::from_raw(pixmap.width(), pixmap.height(), pixmap.take())
.unwrap();
SmallVec::from_elem(Frame::new(buffer), 1)
}
};
Ok(Arc::new(RenderImage::new(frames)))
}
/// Get the format of the clipboard image
pub fn format(&self) -> ImageFormat {
self.format
}
/// Get the raw bytes of the clipboard image
pub fn bytes(&self) -> &[u8] {
self.bytes.as_slice()
}
}
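Both conversion directions above (`from_image_data` and `to_image_data`) use the same in-place channel shuffle: swapping bytes 0 and 2 of every 4-byte pixel converts BGRA to RGBA and back, since the operation is its own inverse. As a self-contained sketch:

```rust
// BGRA <-> RGBA: swap the 0th and 2nd byte of each 4-byte pixel.
// Applying it twice restores the original buffer.
fn swap_channels(buf: &mut [u8]) {
    for pixel in buf.chunks_exact_mut(4) {
        pixel.swap(0, 2);
    }
}
```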
/// A clipboard item that should be copied to the clipboard
#[derive(Clone, Debug, Eq, PartialEq)]
pub struct ClipboardString {
pub(crate) text: String,
pub(crate) metadata: Option<String>,
}
impl ClipboardItem {
/// Create a new clipboard item with the given text
impl ClipboardString {
/// Create a new clipboard string with the given text
pub fn new(text: String) -> Self {
Self {
text,
@@ -982,18 +1202,23 @@ impl ClipboardItem {
}
}
/// Create a new clipboard item with the given text and metadata
/// Return a new clipboard item with the metadata replaced by the given metadata
pub fn with_metadata<T: Serialize>(mut self, metadata: T) -> Self {
self.metadata = Some(serde_json::to_string(&metadata).unwrap());
self
}
/// Get the text of the clipboard item
/// Get the text of the clipboard string
pub fn text(&self) -> &String {
&self.text
}
/// Get the metadata of the clipboard item
/// Get the owned text of the clipboard string
pub fn into_text(self) -> String {
self.text
}
/// Get the metadata of the clipboard string
pub fn metadata<T>(&self) -> Option<T>
where
T: for<'a> Deserialize<'a>,

View File

@@ -145,7 +145,7 @@ impl Clipboard {
match unsafe { read_fd(fd) } {
Ok(v) => {
self.cached_read = Some(ClipboardItem::new(v));
self.cached_read = Some(ClipboardItem::new_string(v));
self.cached_read.clone()
}
Err(err) => {
@@ -177,7 +177,7 @@ impl Clipboard {
match unsafe { read_fd(fd) } {
Ok(v) => {
self.cached_primary_read = Some(ClipboardItem::new(v.clone()));
self.cached_primary_read = Some(ClipboardItem::new_string(v.clone()));
self.cached_primary_read.clone()
}
Err(err) => {

View File

@@ -16,6 +16,7 @@ use metal_renderer as renderer;
#[cfg(feature = "macos-blade")]
use crate::platform::blade as renderer;
mod attributed_string;
mod open_type;
mod platform;
mod text_system;

View File

@@ -0,0 +1,121 @@
use cocoa::base::id;
use cocoa::foundation::NSRange;
use objc::{class, msg_send, sel, sel_impl};
/// The `cocoa` crate does not define NSAttributedString (and related Cocoa classes),
/// which are needed for copying rich text (that is, text intermingled with images)
/// to the clipboard. This adds access to those APIs.
#[allow(non_snake_case)]
pub trait NSAttributedString: Sized {
unsafe fn alloc(_: Self) -> id {
msg_send![class!(NSAttributedString), alloc]
}
unsafe fn init_attributed_string(self, string: id) -> id;
unsafe fn appendAttributedString_(self, attr_string: id);
unsafe fn RTFDFromRange_documentAttributes_(self, range: NSRange, attrs: id) -> id;
unsafe fn RTFFromRange_documentAttributes_(self, range: NSRange, attrs: id) -> id;
unsafe fn string(self) -> id;
}
impl NSAttributedString for id {
unsafe fn init_attributed_string(self, string: id) -> id {
msg_send![self, initWithString: string]
}
unsafe fn appendAttributedString_(self, attr_string: id) {
let _: () = msg_send![self, appendAttributedString: attr_string];
}
unsafe fn RTFDFromRange_documentAttributes_(self, range: NSRange, attrs: id) -> id {
msg_send![self, RTFDFromRange: range documentAttributes: attrs]
}
unsafe fn RTFFromRange_documentAttributes_(self, range: NSRange, attrs: id) -> id {
msg_send![self, RTFFromRange: range documentAttributes: attrs]
}
unsafe fn string(self) -> id {
msg_send![self, string]
}
}
#[allow(non_snake_case)]
pub trait NSTextAttachment: Sized {
unsafe fn alloc(_: Self) -> id {
msg_send![class!(NSTextAttachment), alloc]
}
}
impl NSTextAttachment for id {}
pub trait NSMutableAttributedString: NSAttributedString {
unsafe fn alloc(_: Self) -> id {
msg_send![class!(NSMutableAttributedString), alloc]
}
}
impl NSMutableAttributedString for id {}
#[cfg(test)]
mod tests {
use super::*;
use cocoa::appkit::NSImage;
use cocoa::base::nil;
use cocoa::foundation::NSString;
#[test]
fn test_nsattributed_string() {
unsafe {
let image: id = msg_send![class!(NSImage), alloc];
image.initWithContentsOfFile_(
NSString::alloc(nil).init_str("/Users/rtfeldman/Downloads/test.jpeg"),
);
let size = image.size();
let string = NSString::alloc(nil).init_str("Test String");
let attr_string = NSMutableAttributedString::alloc(nil).init_attributed_string(string);
let hello_string = NSString::alloc(nil).init_str("Hello World");
let hello_attr_string =
NSAttributedString::alloc(nil).init_attributed_string(hello_string);
attr_string.appendAttributedString_(hello_attr_string);
let attachment = NSTextAttachment::alloc(nil);
let _: () = msg_send![attachment, setImage: image];
let image_attr_string =
msg_send![class!(NSAttributedString), attributedStringWithAttachment: attachment];
attr_string.appendAttributedString_(image_attr_string);
let another_string = NSString::alloc(nil).init_str("Another String");
let another_attr_string =
NSAttributedString::alloc(nil).init_attributed_string(another_string);
attr_string.appendAttributedString_(another_attr_string);
let len: cocoa::foundation::NSUInteger = msg_send![attr_string, length];
let rtfd_data = attr_string.RTFDFromRange_documentAttributes_(
NSRange::new(0, msg_send![attr_string, length]),
nil,
);
assert_ne!(rtfd_data, nil);
// if rtfd_data != nil {
// pasteboard.setData_forType(rtfd_data, NSPasteboardTypeRTFD);
// }
// let rtf_data = attributed_string.RTFFromRange_documentAttributes_(
// NSRange::new(0, attributed_string.length()),
// nil,
// );
// if rtf_data != nil {
// pasteboard.setData_forType(rtf_data, NSPasteboardTypeRTF);
// }
// let plain_text = attributed_string.string();
// pasteboard.setString_forType(plain_text, NSPasteboardTypeString);
}
}
}

View File

@@ -1,8 +1,8 @@
use super::metal_atlas::MetalAtlas;
use crate::{
point, size, AtlasTextureId, AtlasTextureKind, AtlasTile, Bounds, ContentMask, DevicePixels,
Hsla, MonochromeSprite, Path, PathId, PathVertex, PolychromeSprite, PrimitiveBatch, Quad,
ScaledPixels, Scene, Shadow, Size, Surface, Underline,
Hsla, MonochromeSprite, PaintSurface, Path, PathId, PathVertex, PolychromeSprite,
PrimitiveBatch, Quad, ScaledPixels, Scene, Shadow, Size, Surface, Underline,
};
use anyhow::{anyhow, Result};
use block::ConcreteBlock;
@@ -1020,7 +1020,7 @@ impl MetalRenderer {
fn draw_surfaces(
&mut self,
surfaces: &[Surface],
surfaces: &[PaintSurface],
instance_buffer: &mut InstanceBuffer,
instance_offset: &mut usize,
viewport_size: Size<DevicePixels>,

View File

@@ -1,8 +1,13 @@
use super::{events::key_to_native, BoolExt};
use super::{
attributed_string::{NSAttributedString, NSMutableAttributedString, NSTextAttachment},
events::key_to_native,
BoolExt,
};
use crate::{
Action, AnyWindowHandle, BackgroundExecutor, ClipboardItem, CursorStyle, ForegroundExecutor,
Keymap, MacDispatcher, MacDisplay, MacTextSystem, MacWindow, Menu, MenuItem, PathPromptOptions,
Platform, PlatformDisplay, PlatformTextSystem, PlatformWindow, Result, SemanticVersion, Task,
hash, Action, AnyWindowHandle, BackgroundExecutor, ClipboardEntry, ClipboardItem,
ClipboardString, CursorStyle, ForegroundExecutor, Image, ImageFormat, Keymap, MacDispatcher,
MacDisplay, MacTextSystem, MacWindow, Menu, MenuItem, PathPromptOptions, Platform,
PlatformDisplay, PlatformTextSystem, PlatformWindow, Result, SemanticVersion, Task,
WindowAppearance, WindowParams,
};
use anyhow::anyhow;
@@ -10,17 +15,18 @@ use block::ConcreteBlock;
use cocoa::{
appkit::{
NSApplication, NSApplicationActivationPolicy::NSApplicationActivationPolicyRegular,
NSEventModifierFlags, NSMenu, NSMenuItem, NSModalResponse, NSOpenPanel, NSPasteboard,
NSPasteboardTypeString, NSSavePanel, NSWindow,
NSEventModifierFlags, NSImage, NSMenu, NSMenuItem, NSModalResponse, NSOpenPanel,
NSPasteboard, NSPasteboardTypePNG, NSPasteboardTypeRTF, NSPasteboardTypeRTFD,
NSPasteboardTypeString, NSPasteboardTypeTIFF, NSSavePanel, NSWindow,
},
base::{id, nil, selector, BOOL, YES},
foundation::{
NSArray, NSAutoreleasePool, NSBundle, NSData, NSInteger, NSProcessInfo, NSString,
NSArray, NSAutoreleasePool, NSBundle, NSData, NSInteger, NSProcessInfo, NSRange, NSString,
NSUInteger, NSURL,
},
};
use core_foundation::{
base::{CFRelease, CFType, CFTypeRef, OSStatus, TCFType as _},
base::{CFRelease, CFType, CFTypeRef, OSStatus, TCFType},
boolean::CFBoolean,
data::CFData,
dictionary::{CFDictionary, CFDictionaryRef, CFMutableDictionary},
@@ -50,6 +56,7 @@ use std::{
slice, str,
sync::Arc,
};
use strum::IntoEnumIterator;
use super::renderer;
@@ -421,7 +428,7 @@ impl Platform for MacPlatform {
pool.drain();
(*app).set_ivar(MAC_PLATFORM_IVAR, null_mut::<c_void>());
(*app.delegate()).set_ivar(MAC_PLATFORM_IVAR, null_mut::<c_void>());
(*NSWindow::delegate(app)).set_ivar(MAC_PLATFORM_IVAR, null_mut::<c_void>());
}
}
@@ -749,7 +756,7 @@ impl Platform for MacPlatform {
let app: id = msg_send![APP_CLASS, sharedApplication];
let mut state = self.0.lock();
let actions = &mut state.menu_actions;
app.setMainMenu_(self.create_menu_bar(menus, app.delegate(), actions, keymap));
app.setMainMenu_(self.create_menu_bar(menus, NSWindow::delegate(app), actions, keymap));
}
}
@@ -758,7 +765,7 @@ impl Platform for MacPlatform {
let app: id = msg_send![APP_CLASS, sharedApplication];
let mut state = self.0.lock();
let actions = &mut state.menu_actions;
let new = self.create_dock_menu(menu, app.delegate(), actions, keymap);
let new = self.create_dock_menu(menu, NSWindow::delegate(app), actions, keymap);
if let Some(old) = state.dock_menu.replace(new) {
CFRelease(old as _)
}
@@ -851,79 +858,212 @@ impl Platform for MacPlatform {
}
fn write_to_clipboard(&self, item: ClipboardItem) {
let state = self.0.lock();
use crate::ClipboardEntry;
unsafe {
state.pasteboard.clearContents();
if item.entries.len() <= 1 {
// With zero or one entries we can write directly, without building an attributed string.
match item.entries.first() {
Some(entry) => match entry {
ClipboardEntry::String(string) => {
self.write_plaintext_to_clipboard(string);
}
ClipboardEntry::Image(image) => {
self.write_image_to_clipboard(image);
}
},
None => {
// Writing an empty list of entries just clears the clipboard.
let state = self.0.lock();
state.pasteboard.clearContents();
}
}
} else {
let mut any_images = false;
let attributed_string = {
let mut buf = NSMutableAttributedString::alloc(nil)
// TODO can we skip this? Or at least part of it?
.init_attributed_string(NSString::alloc(nil).init_str(""));
let text_bytes = NSData::dataWithBytes_length_(
nil,
item.text.as_ptr() as *const c_void,
item.text.len() as u64,
);
state
.pasteboard
.setData_forType(text_bytes, NSPasteboardTypeString);
for entry in item.entries {
let to_append;
if let Some(metadata) = item.metadata.as_ref() {
let hash_bytes = ClipboardItem::text_hash(&item.text).to_be_bytes();
let hash_bytes = NSData::dataWithBytes_length_(
nil,
hash_bytes.as_ptr() as *const c_void,
hash_bytes.len() as u64,
);
match entry {
ClipboardEntry::String(ClipboardString { text, metadata: _ }) => {
to_append = NSAttributedString::alloc(nil)
.init_attributed_string(NSString::alloc(nil).init_str(&text));
}
ClipboardEntry::Image(Image { format, bytes, id }) => {
use cocoa::appkit::NSImage;
any_images = true;
// Create an attachment from the image
let attachment = {
// Initialize the NSImage
let image = {
let image: id = msg_send![class!(NSImage), alloc];
NSImage::initWithContentsOfFile_(
image,
NSString::alloc(nil)
.init_str("/Users/rtfeldman/Downloads/test.jpeg"), // TODO read from clipboard bytes
)
};
let attachment = NSTextAttachment::alloc(nil);
let _: () = msg_send![attachment, setImage: image];
attachment
};
// Make a NSAttributedString with the attachment
to_append = msg_send![class!(NSAttributedString), attributedStringWithAttachment: attachment];
}
}
buf.appendAttributedString_(to_append);
}
buf
};
let state = self.0.lock();
state.pasteboard.clearContents();
// Only set rich text clipboard types if we actually have 1+ images to include.
if any_images {
let rtfd_data = attributed_string.RTFDFromRange_documentAttributes_(
NSRange::new(0, msg_send![attributed_string, length]),
nil,
);
if rtfd_data != nil {
state
.pasteboard
.setData_forType(rtfd_data, NSPasteboardTypeRTFD);
}
let rtf_data = attributed_string.RTFFromRange_documentAttributes_(
NSRange::new(0, attributed_string.length()),
nil,
);
if rtf_data != nil {
state
.pasteboard
.setData_forType(rtf_data, NSPasteboardTypeRTF);
}
}
let plain_text = attributed_string.string();
state
.pasteboard
.setData_forType(hash_bytes, state.text_hash_pasteboard_type);
let metadata_bytes = NSData::dataWithBytes_length_(
nil,
metadata.as_ptr() as *const c_void,
metadata.len() as u64,
);
state
.pasteboard
.setData_forType(metadata_bytes, state.metadata_pasteboard_type);
.setString_forType(plain_text, NSPasteboardTypeString);
}
}
}
fn read_from_clipboard(&self) -> Option<ClipboardItem> {
let state = self.0.lock();
unsafe {
if let Some(text_bytes) =
self.read_from_pasteboard(state.pasteboard, NSPasteboardTypeString)
{
let text = String::from_utf8_lossy(text_bytes).to_string();
let hash_bytes = self
.read_from_pasteboard(state.pasteboard, state.text_hash_pasteboard_type)
.and_then(|bytes| bytes.try_into().ok())
.map(u64::from_be_bytes);
let metadata_bytes = self
.read_from_pasteboard(state.pasteboard, state.metadata_pasteboard_type)
.and_then(|bytes| String::from_utf8(bytes.to_vec()).ok());
let pasteboard = state.pasteboard;
if let Some((hash, metadata)) = hash_bytes.zip(metadata_bytes) {
if hash == ClipboardItem::text_hash(&text) {
Some(ClipboardItem {
text,
metadata: Some(metadata),
})
} else {
Some(ClipboardItem {
text,
metadata: None,
})
}
// First, see if it's a string.
unsafe {
let types: id = pasteboard.types();
let string_type: id = ns_string("public.utf8-plain-text");
if msg_send![types, containsObject: string_type] {
let data = pasteboard.dataForType(string_type);
if data == nil {
return None;
} else if data.bytes().is_null() {
// https://developer.apple.com/documentation/foundation/nsdata/1410616-bytes?language=objc
// "If the length of the NSData object is 0, this property returns nil."
return Some(self.read_string_from_clipboard(&state, &[]));
} else {
Some(ClipboardItem {
text,
metadata: None,
})
let bytes =
slice::from_raw_parts(data.bytes() as *mut u8, data.length() as usize);
return Some(self.read_string_from_clipboard(&state, bytes));
}
}
// If it wasn't a string, try the various supported image types.
for format in ImageFormat::iter() {
if let Some(item) = try_clipboard_image(pasteboard, format) {
return Some(item);
}
} else {
None
}
}
// If it wasn't a string or a supported image type, give up.
None
}
fn write_credentials(&self, url: &str, username: &str, password: &[u8]) -> Task<Result<()>> {
@@ -1038,6 +1178,112 @@ impl Platform for MacPlatform {
}
}
impl MacPlatform {
unsafe fn read_string_from_clipboard(
&self,
state: &MacPlatformState,
text_bytes: &[u8],
) -> ClipboardItem {
let text = String::from_utf8_lossy(text_bytes).to_string();
let hash_bytes = self
.read_from_pasteboard(state.pasteboard, state.text_hash_pasteboard_type)
.and_then(|bytes| bytes.try_into().ok())
.map(u64::from_be_bytes);
let metadata_bytes = self
.read_from_pasteboard(state.pasteboard, state.metadata_pasteboard_type)
.and_then(|bytes| String::from_utf8(bytes.to_vec()).ok());
let opt_metadata = hash_bytes
.zip(metadata_bytes)
.and_then(|(hash, metadata)| {
(hash == ClipboardString::text_hash(&text)).then_some(metadata)
});
ClipboardItem {
entries: vec![ClipboardEntry::String(ClipboardString {
text,
metadata: opt_metadata,
})],
}
}
unsafe fn write_plaintext_to_clipboard(&self, string: &ClipboardString) {
let state = self.0.lock();
state.pasteboard.clearContents();
let text_bytes = NSData::dataWithBytes_length_(
nil,
string.text.as_ptr() as *const c_void,
string.text.len() as u64,
);
state
.pasteboard
.setData_forType(text_bytes, NSPasteboardTypeString);
if let Some(metadata) = string.metadata.as_ref() {
let hash_bytes = ClipboardString::text_hash(&string.text).to_be_bytes();
let hash_bytes = NSData::dataWithBytes_length_(
nil,
hash_bytes.as_ptr() as *const c_void,
hash_bytes.len() as u64,
);
state
.pasteboard
.setData_forType(hash_bytes, state.text_hash_pasteboard_type);
let metadata_bytes = NSData::dataWithBytes_length_(
nil,
metadata.as_ptr() as *const c_void,
metadata.len() as u64,
);
state
.pasteboard
.setData_forType(metadata_bytes, state.metadata_pasteboard_type);
}
}
unsafe fn write_image_to_clipboard(&self, image: &Image) {
let state = self.0.lock();
state.pasteboard.clearContents();
let bytes = NSData::dataWithBytes_length_(
nil,
image.bytes.as_ptr() as *const c_void,
image.bytes.len() as u64,
);
state
.pasteboard
.setData_forType(bytes, Into::<UTType>::into(image.format).inner_mut());
}
}
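The hash-gated metadata check in `read_string_from_clipboard` accepts metadata only when the hash stored alongside it matches a hash of the text actually read back from the pasteboard, guarding against metadata left over from a previous copy. A sketch of that gate, with `hash_fn` standing in for `ClipboardString::text_hash`:

```rust
// Accept metadata only if the stored hash matches the current text's hash.
fn metadata_if_valid(
    text: &str,
    stored_hash: Option<u64>,
    metadata: Option<String>,
    hash_fn: impl Fn(&str) -> u64,
) -> Option<String> {
    stored_hash
        .zip(metadata)
        .and_then(|(hash, meta)| (hash == hash_fn(text)).then_some(meta))
}
```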
fn try_clipboard_image(pasteboard: id, format: ImageFormat) -> Option<ClipboardItem> {
let mut ut_type: UTType = format.into();
unsafe {
let types: id = pasteboard.types();
if msg_send![types, containsObject: ut_type.inner()] {
let data = pasteboard.dataForType(ut_type.inner_mut());
if data == nil {
None
} else {
let bytes = Vec::from(slice::from_raw_parts(
data.bytes() as *mut u8,
data.length() as usize,
));
let id = hash(&bytes);
Some(ClipboardItem {
entries: vec![ClipboardEntry::Image(Image { format, bytes, id })],
})
}
} else {
None
}
}
}
unsafe fn path_from_objc(path: id) -> PathBuf {
let len = msg_send![path, lengthOfBytesUsingEncoding: NSUTF8StringEncoding];
let bytes = path.UTF8String() as *const u8;
@@ -1216,6 +1462,68 @@ mod security {
pub const errSecItemNotFound: OSStatus = -25300;
}
impl From<ImageFormat> for UTType {
fn from(value: ImageFormat) -> Self {
match value {
ImageFormat::Png => Self::png(),
ImageFormat::Jpeg => Self::jpeg(),
ImageFormat::Tiff => Self::tiff(),
ImageFormat::Webp => Self::webp(),
ImageFormat::Gif => Self::gif(),
ImageFormat::Bmp => Self::bmp(),
ImageFormat::Svg => Self::svg(),
}
}
}
// See https://developer.apple.com/documentation/uniformtypeidentifiers/uttype-swift.struct/
struct UTType(id);
impl UTType {
pub fn png() -> Self {
// https://developer.apple.com/documentation/uniformtypeidentifiers/uttype-swift.struct/png
Self(unsafe { NSPasteboardTypePNG }) // This is a rare case where there's a built-in NSPasteboardType
}
pub fn jpeg() -> Self {
// https://developer.apple.com/documentation/uniformtypeidentifiers/uttype-swift.struct/jpeg
Self(unsafe { ns_string("public.jpeg") })
}
pub fn gif() -> Self {
// https://developer.apple.com/documentation/uniformtypeidentifiers/uttype-swift.struct/gif
Self(unsafe { ns_string("com.compuserve.gif") })
}
pub fn webp() -> Self {
// https://developer.apple.com/documentation/uniformtypeidentifiers/uttype-swift.struct/webp
Self(unsafe { ns_string("org.webmproject.webp") })
}
pub fn bmp() -> Self {
// https://developer.apple.com/documentation/uniformtypeidentifiers/uttype-swift.struct/bmp
Self(unsafe { ns_string("com.microsoft.bmp") })
}
pub fn svg() -> Self {
// https://developer.apple.com/documentation/uniformtypeidentifiers/uttype-swift.struct/svg
Self(unsafe { ns_string("public.svg-image") })
}
pub fn tiff() -> Self {
// https://developer.apple.com/documentation/uniformtypeidentifiers/uttype-swift.struct/tiff
Self(unsafe { NSPasteboardTypeTIFF }) // This is a rare case where there's a built-in NSPasteboardType
}
fn inner(&self) -> *const Object {
self.0
}
fn inner_mut(&mut self) -> *mut Object {
self.0 as *mut _
}
}
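The `ImageFormat` to `UTType` mapping above can be summarized as a plain string table (the UTI identifiers are the ones documented by Apple; PNG and TIFF are the two formats that also have built-in `NSPasteboardType` constants). An illustrative sketch:

```rust
// Format name -> Uniform Type Identifier, mirroring the UTType constructors.
fn uti_for(format: &str) -> Option<&'static str> {
    match format {
        "png" => Some("public.png"),
        "jpeg" => Some("public.jpeg"),
        "gif" => Some("com.compuserve.gif"),
        "webp" => Some("org.webmproject.webp"),
        "bmp" => Some("com.microsoft.bmp"),
        "svg" => Some("public.svg-image"),
        "tiff" => Some("public.tiff"),
        _ => None,
    }
}
```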
#[cfg(test)]
mod tests {
use crate::ClipboardItem;
@@ -1227,11 +1535,15 @@ mod tests {
let platform = build_platform();
assert_eq!(platform.read_from_clipboard(), None);
let item = ClipboardItem::new("1".to_string());
let item = ClipboardItem::new_string("1".to_string());
platform.write_to_clipboard(item.clone());
assert_eq!(platform.read_from_clipboard(), Some(item));
let item = ClipboardItem::new("2".to_string()).with_metadata(vec![3, 4]);
let item = ClipboardItem {
entries: vec![ClipboardEntry::String(
ClipboardString::new("2".to_string()).with_metadata(vec![3, 4]),
)],
};
platform.write_to_clipboard(item.clone());
assert_eq!(platform.read_from_clipboard(), Some(item));
@@ -1250,7 +1562,7 @@ mod tests {
}
assert_eq!(
platform.read_from_clipboard(),
Some(ClipboardItem::new(text_from_other_app.to_string()))
Some(ClipboardItem::new_string(text_from_other_app.to_string()))
);
}

View File

@@ -826,15 +826,15 @@ mod tests {
#[test]
fn test_clipboard() {
let platform = WindowsPlatform::new();
let item = ClipboardItem::new("你好".to_string());
let item = ClipboardItem::new_string("你好".to_string());
platform.write_to_clipboard(item.clone());
assert_eq!(platform.read_from_clipboard(), Some(item));
let item = ClipboardItem::new("12345".to_string());
let item = ClipboardItem::new_string("12345".to_string());
platform.write_to_clipboard(item.clone());
assert_eq!(platform.read_from_clipboard(), Some(item));
let item = ClipboardItem::new("abcdef".to_string()).with_metadata(vec![3, 4]);
let item = ClipboardItem::new_string("abcdef".to_string()).with_metadata(vec![3, 4]);
platform.write_to_clipboard(item.clone());
assert_eq!(platform.read_from_clipboard(), Some(item));
}

View File

@@ -23,7 +23,7 @@ pub(crate) struct Scene {
pub(crate) underlines: Vec<Underline>,
pub(crate) monochrome_sprites: Vec<MonochromeSprite>,
pub(crate) polychrome_sprites: Vec<PolychromeSprite>,
pub(crate) surfaces: Vec<Surface>,
pub(crate) surfaces: Vec<PaintSurface>,
}
impl Scene {
@@ -183,7 +183,7 @@ pub(crate) enum Primitive {
Underline(Underline),
MonochromeSprite(MonochromeSprite),
PolychromeSprite(PolychromeSprite),
Surface(Surface),
Surface(PaintSurface),
}
impl Primitive {
@@ -231,9 +231,9 @@ struct BatchIterator<'a> {
polychrome_sprites: &'a [PolychromeSprite],
polychrome_sprites_start: usize,
polychrome_sprites_iter: Peekable<slice::Iter<'a, PolychromeSprite>>,
surfaces: &'a [Surface],
surfaces: &'a [PaintSurface],
surfaces_start: usize,
surfaces_iter: Peekable<slice::Iter<'a, Surface>>,
surfaces_iter: Peekable<slice::Iter<'a, PaintSurface>>,
}
impl<'a> Iterator for BatchIterator<'a> {
@@ -411,7 +411,7 @@ pub(crate) enum PrimitiveBatch<'a> {
texture_id: AtlasTextureId,
sprites: &'a [PolychromeSprite],
},
Surfaces(&'a [Surface]),
Surfaces(&'a [PaintSurface]),
}
#[derive(Default, Debug, Clone, Eq, PartialEq)]
@@ -673,7 +673,7 @@ impl From<PolychromeSprite> for Primitive {
}
#[derive(Clone, Debug, Eq, PartialEq)]
pub(crate) struct Surface {
pub(crate) struct PaintSurface {
pub order: DrawOrder,
pub bounds: Bounds<ScaledPixels>,
pub content_mask: ContentMask<ScaledPixels>,
@@ -681,20 +681,20 @@ pub(crate) struct Surface {
pub image_buffer: media::core_video::CVImageBuffer,
}
impl Ord for Surface {
impl Ord for PaintSurface {
fn cmp(&self, other: &Self) -> std::cmp::Ordering {
self.order.cmp(&other.order)
}
}
impl PartialOrd for Surface {
impl PartialOrd for PaintSurface {
fn partial_cmp(&self, other: &Self) -> Option<std::cmp::Ordering> {
Some(self.cmp(other))
}
}
impl From<Surface> for Primitive {
fn from(surface: Surface) -> Self {
impl From<PaintSurface> for Primitive {
fn from(surface: PaintSurface) -> Self {
Primitive::Surface(surface)
}
}

View File

@@ -5,10 +5,10 @@ use std::{
};
use crate::{
black, phi, point, quad, rems, AbsoluteLength, Bounds, ContentMask, Corners, CornersRefinement,
CursorStyle, DefiniteLength, Edges, EdgesRefinement, Font, FontFallbacks, FontFeatures,
FontStyle, FontWeight, Hsla, Length, Pixels, Point, PointRefinement, Rgba, SharedString, Size,
SizeRefinement, Styled, TextRun, WindowContext,
black, phi, point, quad, rems, size, AbsoluteLength, Bounds, ContentMask, Corners,
CornersRefinement, CursorStyle, DefiniteLength, DevicePixels, Edges, EdgesRefinement, Font,
FontFallbacks, FontFeatures, FontStyle, FontWeight, Hsla, Length, Pixels, Point,
PointRefinement, Rgba, SharedString, Size, SizeRefinement, Styled, TextRun, WindowContext,
};
use collections::HashSet;
use refineable::Refineable;
@@ -27,6 +27,121 @@ pub struct DebugBelow;
#[cfg(debug_assertions)]
impl crate::Global for DebugBelow {}
/// How to fit the image into the bounds of the element.
pub enum ObjectFit {
/// The image will be stretched to fill the bounds of the element.
Fill,
/// The image will be scaled to fit within the bounds of the element.
Contain,
/// The image will be scaled to cover the bounds of the element.
Cover,
/// The image will be scaled down to fit within the bounds of the element.
ScaleDown,
/// The image will maintain its original size.
None,
}
impl ObjectFit {
/// Get the bounds of the image within the given bounds.
pub fn get_bounds(
&self,
bounds: Bounds<Pixels>,
image_size: Size<DevicePixels>,
) -> Bounds<Pixels> {
let image_size = image_size.map(|dimension| Pixels::from(u32::from(dimension)));
let image_ratio = image_size.width / image_size.height;
let bounds_ratio = bounds.size.width / bounds.size.height;
let result_bounds = match self {
ObjectFit::Fill => bounds,
ObjectFit::Contain => {
let new_size = if bounds_ratio > image_ratio {
size(
image_size.width * (bounds.size.height / image_size.height),
bounds.size.height,
)
} else {
size(
bounds.size.width,
image_size.height * (bounds.size.width / image_size.width),
)
};
Bounds {
origin: point(
bounds.origin.x + (bounds.size.width - new_size.width) / 2.0,
bounds.origin.y + (bounds.size.height - new_size.height) / 2.0,
),
size: new_size,
}
}
ObjectFit::ScaleDown => {
// Check if the image is larger than the bounds in either dimension.
if image_size.width > bounds.size.width || image_size.height > bounds.size.height {
// If the image is larger, use the same logic as Contain to scale it down.
let new_size = if bounds_ratio > image_ratio {
size(
image_size.width * (bounds.size.height / image_size.height),
bounds.size.height,
)
} else {
size(
bounds.size.width,
image_size.height * (bounds.size.width / image_size.width),
)
};
Bounds {
origin: point(
bounds.origin.x + (bounds.size.width - new_size.width) / 2.0,
bounds.origin.y + (bounds.size.height - new_size.height) / 2.0,
),
size: new_size,
}
} else {
// If the image is smaller than or equal to the container, display it at its original size,
// centered within the container.
let original_size = size(image_size.width, image_size.height);
Bounds {
origin: point(
bounds.origin.x + (bounds.size.width - original_size.width) / 2.0,
bounds.origin.y + (bounds.size.height - original_size.height) / 2.0,
),
size: original_size,
}
}
}
ObjectFit::Cover => {
let new_size = if bounds_ratio > image_ratio {
size(
bounds.size.width,
image_size.height * (bounds.size.width / image_size.width),
)
} else {
size(
image_size.width * (bounds.size.height / image_size.height),
bounds.size.height,
)
};
Bounds {
origin: point(
bounds.origin.x + (bounds.size.width - new_size.width) / 2.0,
bounds.origin.y + (bounds.size.height - new_size.height) / 2.0,
),
size: new_size,
}
}
ObjectFit::None => Bounds {
origin: bounds.origin,
size: image_size,
},
};
result_bounds
}
}
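The `Contain` branch above picks the limiting dimension by comparing aspect ratios. As a standalone sketch of that math, using plain `f32` in place of gpui's `Pixels`/`Bounds` types (the function name and tuple representation here are illustrative, not part of the API):

```rust
// Fit an image inside bounds while preserving its aspect ratio,
// mirroring the `ObjectFit::Contain` arithmetic above.
fn contain(bounds: (f32, f32), image: (f32, f32)) -> (f32, f32) {
    let bounds_ratio = bounds.0 / bounds.1;
    let image_ratio = image.0 / image.1;
    if bounds_ratio > image_ratio {
        // Bounds are wider than the image: height is the limiting dimension.
        (image.0 * (bounds.1 / image.1), bounds.1)
    } else {
        // Bounds are taller than the image: width is the limiting dimension.
        (bounds.0, image.1 * (bounds.0 / image.0))
    }
}

fn main() {
    // A 400x300 image fit into a 200x200 box keeps its 4:3 ratio.
    assert_eq!(contain((200.0, 200.0), (400.0, 300.0)), (200.0, 150.0));
}
```

`ScaleDown` reuses this same computation but only when the image exceeds the bounds; otherwise it centers the image at its original size.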
/// The CSS styling that can be applied to an element via the `Styled` trait
#[derive(Clone, Refineable, Debug)]
#[refineable(Debug)]

View File

@@ -1,26 +1,25 @@
use crate::{
hash, point, prelude::*, px, size, transparent_black, Action, AnyDrag, AnyElement, AnyTooltip,
point, prelude::*, px, size, transparent_black, Action, AnyDrag, AnyElement, AnyTooltip,
AnyView, AppContext, Arena, Asset, AsyncWindowContext, AvailableSpace, Bounds, BoxShadow,
Context, Corners, CursorStyle, Decorations, DevicePixels, DispatchActionListener,
DispatchNodeId, DispatchTree, DisplayId, Edges, Effect, Entity, EntityId, EventEmitter,
FileDropEvent, Flatten, FontId, GPUSpecs, Global, GlobalElementId, GlyphId, Hsla, ImageData,
InputHandler, IsZero, KeyBinding, KeyContext, KeyDownEvent, KeyEvent, Keystroke,
KeystrokeEvent, LayoutId, LineLayoutIndex, Model, ModelContext, Modifiers,
ModifiersChangedEvent, MonochromeSprite, MouseButton, MouseEvent, MouseMoveEvent, MouseUpEvent,
Path, Pixels, PlatformAtlas, PlatformDisplay, PlatformInput, PlatformInputHandler,
PlatformWindow, Point, PolychromeSprite, PromptLevel, Quad, Render, RenderGlyphParams,
RenderImageParams, RenderSvgParams, Replay, ResizeEdge, ScaledPixels, Scene, Shadow,
SharedString, Size, StrikethroughStyle, Style, SubscriberSet, Subscription, TaffyLayoutEngine,
Task, TextStyle, TextStyleRefinement, TransformationMatrix, Underline, UnderlineStyle, View,
VisualContext, WeakView, WindowAppearance, WindowBackgroundAppearance, WindowBounds,
WindowControls, WindowDecorations, WindowOptions, WindowParams, WindowTextSystem,
SUBPIXEL_VARIANTS,
FileDropEvent, Flatten, FontId, GPUSpecs, Global, GlobalElementId, GlyphId, Hsla, InputHandler,
IsZero, KeyBinding, KeyContext, KeyDownEvent, KeyEvent, Keystroke, KeystrokeEvent, LayoutId,
LineLayoutIndex, Model, ModelContext, Modifiers, ModifiersChangedEvent, MonochromeSprite,
MouseButton, MouseEvent, MouseMoveEvent, MouseUpEvent, Path, Pixels, PlatformAtlas,
PlatformDisplay, PlatformInput, PlatformInputHandler, PlatformWindow, Point, PolychromeSprite,
PromptLevel, Quad, Render, RenderGlyphParams, RenderImage, RenderImageParams, RenderSvgParams,
Replay, ResizeEdge, ScaledPixels, Scene, Shadow, SharedString, Size, StrikethroughStyle, Style,
SubscriberSet, Subscription, TaffyLayoutEngine, Task, TextStyle, TextStyleRefinement,
TransformationMatrix, Underline, UnderlineStyle, View, VisualContext, WeakView,
WindowAppearance, WindowBackgroundAppearance, WindowBounds, WindowControls, WindowDecorations,
WindowOptions, WindowParams, WindowTextSystem, SUBPIXEL_VARIANTS,
};
use anyhow::{anyhow, Context as _, Result};
use collections::{FxHashMap, FxHashSet};
use derive_more::{Deref, DerefMut};
use futures::channel::oneshot;
use futures::{future::Shared, FutureExt};
use futures::FutureExt;
#[cfg(target_os = "macos")]
use media::core_video::CVImageBuffer;
use parking_lot::RwLock;
@@ -1956,36 +1955,6 @@ impl<'a> WindowContext<'a> {
self.window.requested_autoscroll.take()
}
/// Remove an asset from GPUI's cache
pub fn remove_cached_asset<A: Asset + 'static>(
&mut self,
source: &A::Source,
) -> Option<A::Output> {
self.asset_cache.remove::<A>(source)
}
/// Asynchronously load an asset, if the asset hasn't finished loading this will return None.
/// Your view will be re-drawn once the asset has finished loading.
///
/// Note that multiple calls to this method will only result in one `Asset::load` call.
/// The results of that call will be cached, and returned on subsequent uses of this API.
///
/// Use [Self::remove_cached_asset] to reload your asset.
pub fn use_cached_asset<A: Asset + 'static>(
&mut self,
source: &A::Source,
) -> Option<A::Output> {
self.asset_cache.get::<A>(source).or_else(|| {
if let Some(asset) = self.use_asset::<A>(source) {
self.asset_cache
.insert::<A>(source.to_owned(), asset.clone());
Some(asset)
} else {
None
}
})
}
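The removed `use_cached_asset` wraps `use_asset` in a get-or-insert cache. A minimal sketch of that pattern over a plain `HashMap` (the closure stands in for `Asset::load`; all names here are illustrative):

```rust
use std::collections::HashMap;

// Return a cached value if present; otherwise load it, cache it, and return it.
fn use_cached<K: std::hash::Hash + Eq + Clone, V: Clone>(
    cache: &mut HashMap<K, V>,
    key: &K,
    load: impl FnOnce() -> Option<V>,
) -> Option<V> {
    if let Some(v) = cache.get(key) {
        return Some(v.clone());
    }
    let v = load()?;
    cache.insert(key.clone(), v.clone());
    Some(v)
}

fn main() {
    let mut cache = HashMap::new();
    // First call loads and caches; second call hits the cache and never loads.
    assert_eq!(use_cached(&mut cache, &"icon", || Some(42)), Some(42));
    assert_eq!(use_cached(&mut cache, &"icon", || None), Some(42));
}
```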
/// Asynchronously load an asset, if the asset hasn't finished loading this will return None.
/// Your view will be re-drawn once the asset has finished loading.
///
@@ -1994,19 +1963,7 @@ impl<'a> WindowContext<'a> {
///
/// This asset will not be cached by default, see [Self::use_cached_asset]
pub fn use_asset<A: Asset + 'static>(&mut self, source: &A::Source) -> Option<A::Output> {
let asset_id = (TypeId::of::<A>(), hash(source));
let mut is_first = false;
let task = self
.loading_assets
.remove(&asset_id)
.map(|boxed_task| *boxed_task.downcast::<Shared<Task<A::Output>>>().unwrap())
.unwrap_or_else(|| {
is_first = true;
let future = A::load(source.clone(), self);
let task = self.background_executor().spawn(future).shared();
task
});
let (task, is_first) = self.fetch_asset::<A>(source);
task.clone().now_or_never().or_else(|| {
if is_first {
let parent_id = self.parent_view_id();
@@ -2027,12 +1984,9 @@ impl<'a> WindowContext<'a> {
.detach();
}
self.loading_assets.insert(asset_id, Box::new(task));
None
})
}
/// Obtain the current element offset. This method should only be called during the
/// prepaint phase of element drawing.
pub fn element_offset(&self) -> Point<Pixels> {
@@ -2610,13 +2564,14 @@ impl<'a> WindowContext<'a> {
}
/// Paint an image into the scene for the next frame at the current z-index.
/// This method will panic if the frame_index is not valid
///
/// This method should only be called as part of the paint phase of element drawing.
pub fn paint_image(
&mut self,
bounds: Bounds<Pixels>,
corner_radii: Corners<Pixels>,
data: Arc<ImageData>,
data: Arc<RenderImage>,
frame_index: usize,
grayscale: bool,
) -> Result<()> {
@@ -2639,7 +2594,10 @@ impl<'a> WindowContext<'a> {
.get_or_insert_with(&params.clone().into(), &mut || {
Ok(Some((
data.size(frame_index),
Cow::Borrowed(data.as_bytes(frame_index)),
Cow::Borrowed(
data.as_bytes(frame_index)
.expect("It's the caller's job to pass a valid frame index"),
),
)))
})?
.expect("Callback above only returns Some");
@@ -2665,6 +2623,8 @@ impl<'a> WindowContext<'a> {
/// This method should only be called as part of the paint phase of element drawing.
#[cfg(target_os = "macos")]
pub fn paint_surface(&mut self, bounds: Bounds<Pixels>, image_buffer: CVImageBuffer) {
use crate::PaintSurface;
debug_assert_eq!(
self.window.draw_phase,
DrawPhase::Paint,
@@ -2674,15 +2634,12 @@ impl<'a> WindowContext<'a> {
let scale_factor = self.scale_factor();
let bounds = bounds.scale(scale_factor);
let content_mask = self.content_mask().scale(scale_factor);
self.window
.next_frame
.scene
.insert_primitive(crate::Surface {
order: 0,
bounds,
content_mask,
image_buffer,
});
self.window.next_frame.scene.insert_primitive(PaintSurface {
order: 0,
bounds,
content_mask,
image_buffer,
});
}
#[must_use]

View File

@@ -50,6 +50,9 @@ theme.workspace = true
tiktoken-rs.workspace = true
ui.workspace = true
util.workspace = true
base64.workspace = true
image.workspace = true
[dev-dependencies]
ctor.workspace = true

View File

@@ -220,24 +220,44 @@ pub fn count_anthropic_tokens(
) -> BoxFuture<'static, Result<usize>> {
cx.background_executor()
.spawn(async move {
let messages = request
.messages
.into_iter()
.map(|message| tiktoken_rs::ChatCompletionRequestMessage {
role: match message.role {
Role::User => "user".into(),
Role::Assistant => "assistant".into(),
Role::System => "system".into(),
},
content: Some(message.content),
name: None,
function_call: None,
})
.collect::<Vec<_>>();
let messages = request.messages;
let mut tokens_from_images = 0;
let mut string_messages = Vec::with_capacity(messages.len());
for message in messages {
use crate::MessageContent;
let mut string_contents = String::new();
for content in message.content {
match content {
MessageContent::Text(string) => {
string_contents.push_str(&string);
}
MessageContent::Image(image) => {
tokens_from_images += image.estimate_tokens();
}
}
}
if !string_contents.is_empty() {
string_messages.push(tiktoken_rs::ChatCompletionRequestMessage {
role: match message.role {
Role::User => "user".into(),
Role::Assistant => "assistant".into(),
Role::System => "system".into(),
},
content: Some(string_contents),
name: None,
function_call: None,
});
}
}
// Tiktoken doesn't yet support these models, so we manually use the
// same tokenizer as GPT-4.
tiktoken_rs::num_tokens_from_messages("gpt-4", &messages)
tiktoken_rs::num_tokens_from_messages("gpt-4", &string_messages)
.map(|tokens| tokens + tokens_from_images)
})
.boxed()
}
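The new accounting splits each message: text parts go to the tokenizer, image parts contribute a separate per-image estimate, and the two totals are summed. A sketch under simplified types (the real code uses tiktoken-rs and `LanguageModelImage::estimate_tokens`; the rough one-token-per-four-chars stand-in below is an assumption, not the actual tokenizer):

```rust
enum Content {
    Text(String),
    Image { width: usize, height: usize },
}

fn count_tokens(contents: &[Content]) -> usize {
    let mut text_tokens = 0;
    let mut image_tokens = 0;
    for content in contents {
        match content {
            // Stand-in for the tokenizer: roughly one token per 4 chars.
            Content::Text(s) => text_tokens += s.chars().count() / 4,
            // Anthropic's published estimate: (width * height) / 750.
            Content::Image { width, height } => image_tokens += (width * height) / 750,
        }
    }
    text_tokens + image_tokens
}

fn main() {
    let msg = vec![
        Content::Text("What is in this image?".to_string()),
        Content::Image { width: 1024, height: 768 },
    ];
    assert_eq!(count_tokens(&msg), 22 / 4 + 786432 / 750);
}
```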

View File

@@ -590,7 +590,7 @@ impl LanguageModel for CloudLanguageModel {
tool_name: String,
tool_description: String,
input_schema: serde_json::Value,
_cx: &AsyncAppContext,
cx: &AsyncAppContext,
) -> BoxFuture<'static, Result<serde_json::Value>> {
match &self.model {
CloudModel::Anthropic(model) => {
@@ -605,34 +605,106 @@ impl LanguageModel for CloudLanguageModel {
input_schema,
}];
self.request_limiter
.run(async move {
let request = serde_json::to_string(&request)?;
let response = client
.request(proto::CompleteWithLanguageModel {
provider: proto::LanguageModelProvider::Anthropic as i32,
request,
})
if cx
.update(|cx| cx.has_flag::<LlmServiceFeatureFlag>())
.unwrap_or(false)
{
let llm_api_token = self.llm_api_token.clone();
self.request_limiter
.run(async move {
let response = Self::perform_llm_completion(
client.clone(),
llm_api_token,
PerformCompletionParams {
provider: client::LanguageModelProvider::Anthropic,
model: request.model.clone(),
provider_request: RawValue::from_string(
serde_json::to_string(&request)?,
)?,
},
)
.await?;
let response: anthropic::Response =
serde_json::from_str(&response.completion)?;
response
.content
.into_iter()
.find_map(|content| {
if let anthropic::Content::ToolUse { name, input, .. } = content {
if name == tool_name {
Some(input)
let mut tool_use_index = None;
let mut tool_input = String::new();
let mut body = BufReader::new(response.into_body());
let mut line = String::new();
while body.read_line(&mut line).await? > 0 {
let event: anthropic::Event = serde_json::from_str(&line)?;
line.clear();
match event {
anthropic::Event::ContentBlockStart {
content_block,
index,
} => {
if let anthropic::Content::ToolUse { name, .. } =
content_block
{
if name == tool_name {
tool_use_index = Some(index);
}
}
}
anthropic::Event::ContentBlockDelta { index, delta } => {
match delta {
anthropic::ContentDelta::TextDelta { .. } => {}
anthropic::ContentDelta::InputJsonDelta {
partial_json,
} => {
if Some(index) == tool_use_index {
tool_input.push_str(&partial_json);
}
}
}
}
anthropic::Event::ContentBlockStop { index } => {
if Some(index) == tool_use_index {
return Ok(serde_json::from_str(&tool_input)?);
}
}
_ => {}
}
}
if tool_use_index.is_some() {
Err(anyhow!("tool content incomplete"))
} else {
Err(anyhow!("tool not used"))
}
})
.boxed()
} else {
self.request_limiter
.run(async move {
let request = serde_json::to_string(&request)?;
let response = client
.request(proto::CompleteWithLanguageModel {
provider: proto::LanguageModelProvider::Anthropic as i32,
request,
})
.await?;
let response: anthropic::Response =
serde_json::from_str(&response.completion)?;
response
.content
.into_iter()
.find_map(|content| {
if let anthropic::Content::ToolUse { name, input, .. } = content
{
if name == tool_name {
Some(input)
} else {
None
}
} else {
None
}
} else {
None
}
})
.context("tool not used")
})
.boxed()
})
.context("tool not used")
})
.boxed()
}
}
CloudModel::OpenAi(model) => {
let mut request = request.into_open_ai(model.id().into());
@@ -650,56 +722,116 @@ impl LanguageModel for CloudLanguageModel {
function.description = Some(tool_description);
function.parameters = Some(input_schema);
request.tools = vec![open_ai::ToolDefinition::Function { function }];
self.request_limiter
.run(async move {
let request = serde_json::to_string(&request)?;
let response = client
.request_stream(proto::StreamCompleteWithLanguageModel {
provider: proto::LanguageModelProvider::OpenAi as i32,
request,
})
.await?;
// Call arguments will be streamed in over multiple chunks.
let mut load_state = None;
let mut response = response.map(
|item: Result<
proto::StreamCompleteWithLanguageModelResponse,
anyhow::Error,
>| {
Result::<open_ai::ResponseStreamEvent, anyhow::Error>::Ok(
serde_json::from_str(&item?.event)?,
)
},
);
while let Some(Ok(part)) = response.next().await {
for choice in part.choices {
let Some(tool_calls) = choice.delta.tool_calls else {
continue;
};
for call in tool_calls {
if let Some(func) = call.function {
if func.name.as_deref() == Some(tool_name.as_str()) {
load_state = Some((String::default(), call.index));
}
if let Some((arguments, (output, index))) =
func.arguments.zip(load_state.as_mut())
{
if call.index == *index {
output.push_str(&arguments);
if cx
.update(|cx| cx.has_flag::<LlmServiceFeatureFlag>())
.unwrap_or(false)
{
let llm_api_token = self.llm_api_token.clone();
self.request_limiter
.run(async move {
let response = Self::perform_llm_completion(
client.clone(),
llm_api_token,
PerformCompletionParams {
provider: client::LanguageModelProvider::OpenAi,
model: request.model.clone(),
provider_request: RawValue::from_string(
serde_json::to_string(&request)?,
)?,
},
)
.await?;
let mut body = BufReader::new(response.into_body());
let mut line = String::new();
let mut load_state = None;
while body.read_line(&mut line).await? > 0 {
let part: open_ai::ResponseStreamEvent =
serde_json::from_str(&line)?;
line.clear();
for choice in part.choices {
let Some(tool_calls) = choice.delta.tool_calls else {
continue;
};
for call in tool_calls {
if let Some(func) = call.function {
if func.name.as_deref() == Some(tool_name.as_str()) {
load_state = Some((String::default(), call.index));
}
if let Some((arguments, (output, index))) =
func.arguments.zip(load_state.as_mut())
{
if call.index == *index {
output.push_str(&arguments);
}
}
}
}
}
}
}
if let Some((arguments, _)) = load_state {
return Ok(serde_json::from_str(&arguments)?);
} else {
bail!("tool not used");
}
})
.boxed()
if let Some((arguments, _)) = load_state {
return Ok(serde_json::from_str(&arguments)?);
} else {
bail!("tool not used");
}
})
.boxed()
} else {
self.request_limiter
.run(async move {
let request = serde_json::to_string(&request)?;
let response = client
.request_stream(proto::StreamCompleteWithLanguageModel {
provider: proto::LanguageModelProvider::OpenAi as i32,
request,
})
.await?;
let mut load_state = None;
let mut response = response.map(
|item: Result<
proto::StreamCompleteWithLanguageModelResponse,
anyhow::Error,
>| {
Result::<open_ai::ResponseStreamEvent, anyhow::Error>::Ok(
serde_json::from_str(&item?.event)?,
)
},
);
while let Some(Ok(part)) = response.next().await {
for choice in part.choices {
let Some(tool_calls) = choice.delta.tool_calls else {
continue;
};
for call in tool_calls {
if let Some(func) = call.function {
if func.name.as_deref() == Some(tool_name.as_str()) {
load_state = Some((String::default(), call.index));
}
if let Some((arguments, (output, index))) =
func.arguments.zip(load_state.as_mut())
{
if call.index == *index {
output.push_str(&arguments);
}
}
}
}
}
}
if let Some((arguments, _)) = load_state {
return Ok(serde_json::from_str(&arguments)?);
} else {
bail!("tool not used");
}
})
.boxed()
}
}
CloudModel::Google(_) => {
future::ready(Err(anyhow!("tool use not implemented for Google AI"))).boxed()
@@ -721,56 +853,115 @@ impl LanguageModel for CloudLanguageModel {
function.description = Some(tool_description);
function.parameters = Some(input_schema);
request.tools = vec![open_ai::ToolDefinition::Function { function }];
self.request_limiter
.run(async move {
let request = serde_json::to_string(&request)?;
let response = client
.request_stream(proto::StreamCompleteWithLanguageModel {
provider: proto::LanguageModelProvider::OpenAi as i32,
request,
})
.await?;
// Call arguments will be streamed in over multiple chunks.
let mut load_state = None;
let mut response = response.map(
|item: Result<
proto::StreamCompleteWithLanguageModelResponse,
anyhow::Error,
>| {
Result::<open_ai::ResponseStreamEvent, anyhow::Error>::Ok(
serde_json::from_str(&item?.event)?,
)
},
);
while let Some(Ok(part)) = response.next().await {
for choice in part.choices {
let Some(tool_calls) = choice.delta.tool_calls else {
continue;
};
for call in tool_calls {
if let Some(func) = call.function {
if func.name.as_deref() == Some(tool_name.as_str()) {
load_state = Some((String::default(), call.index));
}
if let Some((arguments, (output, index))) =
func.arguments.zip(load_state.as_mut())
{
if call.index == *index {
output.push_str(&arguments);
if cx
.update(|cx| cx.has_flag::<LlmServiceFeatureFlag>())
.unwrap_or(false)
{
let llm_api_token = self.llm_api_token.clone();
self.request_limiter
.run(async move {
let response = Self::perform_llm_completion(
client.clone(),
llm_api_token,
PerformCompletionParams {
provider: client::LanguageModelProvider::Zed,
model: request.model.clone(),
provider_request: RawValue::from_string(
serde_json::to_string(&request)?,
)?,
},
)
.await?;
let mut body = BufReader::new(response.into_body());
let mut line = String::new();
let mut load_state = None;
while body.read_line(&mut line).await? > 0 {
let part: open_ai::ResponseStreamEvent =
serde_json::from_str(&line)?;
line.clear();
for choice in part.choices {
let Some(tool_calls) = choice.delta.tool_calls else {
continue;
};
for call in tool_calls {
if let Some(func) = call.function {
if func.name.as_deref() == Some(tool_name.as_str()) {
load_state = Some((String::default(), call.index));
}
if let Some((arguments, (output, index))) =
func.arguments.zip(load_state.as_mut())
{
if call.index == *index {
output.push_str(&arguments);
}
}
}
}
}
}
}
if let Some((arguments, _)) = load_state {
return Ok(serde_json::from_str(&arguments)?);
} else {
bail!("tool not used");
}
})
.boxed()
if let Some((arguments, _)) = load_state {
return Ok(serde_json::from_str(&arguments)?);
} else {
bail!("tool not used");
}
})
.boxed()
} else {
self.request_limiter
.run(async move {
let request = serde_json::to_string(&request)?;
let response = client
.request_stream(proto::StreamCompleteWithLanguageModel {
provider: proto::LanguageModelProvider::OpenAi as i32,
request,
})
.await?;
let mut load_state = None;
let mut response = response.map(
|item: Result<
proto::StreamCompleteWithLanguageModelResponse,
anyhow::Error,
>| {
Result::<open_ai::ResponseStreamEvent, anyhow::Error>::Ok(
serde_json::from_str(&item?.event)?,
)
},
);
while let Some(Ok(part)) = response.next().await {
for choice in part.choices {
let Some(tool_calls) = choice.delta.tool_calls else {
continue;
};
for call in tool_calls {
if let Some(func) = call.function {
if func.name.as_deref() == Some(tool_name.as_str()) {
load_state = Some((String::default(), call.index));
}
if let Some((arguments, (output, index))) =
func.arguments.zip(load_state.as_mut())
{
if call.index == *index {
output.push_str(&arguments);
}
}
}
}
}
}
if let Some((arguments, _)) = load_state {
return Ok(serde_json::from_str(&arguments)?);
} else {
bail!("tool not used");
}
})
.boxed()
}
}
}
}
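Both the legacy and the LLM-service branches above share the same accumulation logic for streamed tool calls: the first chunk naming the target tool establishes a call index, and subsequent argument fragments are appended only when their index matches. A standalone sketch with illustrative types (`Chunk` here stands in for the OpenAI delta tool-call shape):

```rust
struct Chunk {
    index: usize,
    name: Option<String>,
    arguments: Option<String>,
}

fn collect_arguments(chunks: Vec<Chunk>, tool_name: &str) -> Option<String> {
    let mut load_state: Option<(String, usize)> = None;
    for call in chunks {
        if call.name.as_deref() == Some(tool_name) {
            // Start accumulating for this call index.
            load_state = Some((String::new(), call.index));
        }
        if let Some((arguments, (output, index))) = call.arguments.zip(load_state.as_mut()) {
            // Only append fragments belonging to the tracked call.
            if call.index == *index {
                output.push_str(&arguments);
            }
        }
    }
    load_state.map(|(arguments, _)| arguments)
}

fn main() {
    let chunks = vec![
        Chunk { index: 0, name: Some("search".into()), arguments: Some("{\"q\":".into()) },
        Chunk { index: 1, name: None, arguments: Some("ignored".into()) },
        Chunk { index: 0, name: None, arguments: Some("\"rust\"}".into()) },
    ];
    assert_eq!(collect_arguments(chunks, "search"), Some("{\"q\":\"rust\"}".to_string()));
}
```

The real code then runs the accumulated string through `serde_json::from_str`, and the Anthropic branch does the analogous thing with `ContentBlockStart`/`ContentBlockDelta`/`ContentBlockStop` events.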

View File

@@ -193,7 +193,7 @@ impl LanguageModel for CopilotChatLanguageModel {
cx: &AsyncAppContext,
) -> BoxFuture<'static, Result<BoxStream<'static, Result<String>>>> {
if let Some(message) = request.messages.last() {
if message.content.trim().is_empty() {
if message.contents_empty() {
const EMPTY_PROMPT_MSG: &str =
"Empty prompts aren't allowed. Please provide a non-empty prompt.";
return futures::future::ready(Err(anyhow::anyhow!(EMPTY_PROMPT_MSG))).boxed();
@@ -270,7 +270,7 @@ impl CopilotChatLanguageModel {
Role::Assistant => CopilotChatRole::Assistant,
Role::System => CopilotChatRole::System,
},
content: msg.content,
content: msg.string_contents(),
})
.collect(),
)

View File

@@ -182,14 +182,14 @@ impl OllamaLanguageModel {
.into_iter()
.map(|msg| match msg.role {
Role::User => ChatMessage::User {
content: msg.content,
content: msg.string_contents(),
},
Role::Assistant => ChatMessage::Assistant {
content: msg.content,
content: msg.string_contents(),
tool_calls: None,
},
Role::System => ChatMessage::System {
content: msg.content,
content: msg.string_contents(),
},
})
.collect(),
@@ -257,7 +257,7 @@ impl LanguageModel for OllamaLanguageModel {
let token_count = request
.messages
.iter()
.map(|msg| msg.content.chars().count())
.map(|msg| msg.string_contents().chars().count())
.sum::<usize>()
/ 4;

View File

@@ -363,7 +363,7 @@ pub fn count_open_ai_tokens(
Role::Assistant => "assistant".into(),
Role::System => "system".into(),
},
content: Some(message.content),
content: Some(message.string_contents()),
name: None,
function_call: None,
})

View File

@@ -1,10 +1,223 @@
use std::io::{Cursor, Write};
use crate::role::Role;
use base64::write::EncoderWriter;
use gpui::{point, size, AppContext, DevicePixels, Image, ObjectFit, RenderImage, Size, Task};
use image::{codecs::png::PngEncoder, imageops::resize, DynamicImage, ImageDecoder};
use serde::{Deserialize, Serialize};
use ui::{px, SharedString};
use util::ResultExt;
#[derive(Clone, PartialEq, Eq, Serialize, Deserialize, Debug, Hash)]
pub struct LanguageModelImage {
// A base64 encoded PNG image
source: SharedString,
size: Size<DevicePixels>,
}
const ANTHROPIC_SIZE_LIMT: f32 = 1568.0; // Anthropic wants uploaded images to be smaller than this in both dimensions
impl LanguageModelImage {
pub fn from_image(data: Image, cx: &mut AppContext) -> Task<Option<Self>> {
cx.background_executor().spawn(async move {
match data.format() {
gpui::ImageFormat::Png
| gpui::ImageFormat::Jpeg
| gpui::ImageFormat::Webp
| gpui::ImageFormat::Gif => {}
_ => return None,
};
let image = image::codecs::png::PngDecoder::new(Cursor::new(data.bytes())).log_err()?;
let (width, height) = image.dimensions();
let image_size = size(DevicePixels(width as i32), DevicePixels(height as i32));
let mut base64_image = Vec::new();
{
let mut base64_encoder = EncoderWriter::new(
Cursor::new(&mut base64_image),
&base64::engine::general_purpose::STANDARD,
);
if image_size.width.0 > ANTHROPIC_SIZE_LIMT as i32
|| image_size.height.0 > ANTHROPIC_SIZE_LIMT as i32
{
let new_bounds = ObjectFit::ScaleDown.get_bounds(
gpui::Bounds {
origin: point(px(0.0), px(0.0)),
size: size(px(ANTHROPIC_SIZE_LIMT), px(ANTHROPIC_SIZE_LIMT)),
},
image_size,
);
let image = DynamicImage::from_decoder(image).log_err()?.resize(
new_bounds.size.width.0 as u32,
new_bounds.size.height.0 as u32,
image::imageops::FilterType::Triangle,
);
let mut png = Vec::new();
image
.write_with_encoder(PngEncoder::new(&mut png))
.log_err()?;
base64_encoder.write_all(png.as_slice()).log_err()?;
} else {
base64_encoder.write_all(data.bytes()).log_err()?;
}
}
// SAFETY: The base64 encoder should not produce non-UTF8
let source = unsafe { String::from_utf8_unchecked(base64_image) };
Some(LanguageModelImage {
size: image_size,
source: source.into(),
})
})
}
/// Resolves an image into an LLM-ready format (base64).
pub fn from_render_image(data: &RenderImage) -> Option<Self> {
let image_size = data.size(0);
let mut bytes = data.as_bytes(0).unwrap_or(&[]).to_vec();
// Convert from BGRA to RGBA.
for pixel in bytes.chunks_exact_mut(4) {
pixel.swap(2, 0);
}
let mut image = image::RgbaImage::from_vec(
image_size.width.0 as u32,
image_size.height.0 as u32,
bytes,
)
.expect("We already know this works");
// https://docs.anthropic.com/en/docs/build-with-claude/vision
if image_size.width.0 > ANTHROPIC_SIZE_LIMT as i32
|| image_size.height.0 > ANTHROPIC_SIZE_LIMT as i32
{
let new_bounds = ObjectFit::ScaleDown.get_bounds(
gpui::Bounds {
origin: point(px(0.0), px(0.0)),
size: size(px(ANTHROPIC_SIZE_LIMT), px(ANTHROPIC_SIZE_LIMT)),
},
image_size,
);
image = resize(
&image,
new_bounds.size.width.0 as u32,
new_bounds.size.height.0 as u32,
image::imageops::FilterType::Triangle,
);
}
let mut png = Vec::new();
image
.write_with_encoder(PngEncoder::new(&mut png))
.log_err()?;
let mut base64_image = Vec::new();
{
let mut base64_encoder = EncoderWriter::new(
Cursor::new(&mut base64_image),
&base64::engine::general_purpose::STANDARD,
);
base64_encoder.write_all(png.as_slice()).log_err()?;
}
// SAFETY: The base64 encoder should not produce non-UTF8
let source = unsafe { String::from_utf8_unchecked(base64_image) };
Some(LanguageModelImage {
size: image_size,
source: source.into(),
})
}
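The channel conversion above is a simple in-place swap: gpui stores frames as BGRA, while the PNG encoder expects RGBA, so each 4-byte pixel swaps its first and third channels. As a standalone sketch:

```rust
// Convert a BGRA byte buffer to RGBA in place, as in `from_render_image`.
fn bgra_to_rgba(bytes: &mut [u8]) {
    for pixel in bytes.chunks_exact_mut(4) {
        pixel.swap(0, 2);
    }
}

fn main() {
    // One blue-ish BGRA pixel: B=255, G=10, R=20, A=255.
    let mut bytes = vec![255, 10, 20, 255];
    bgra_to_rgba(&mut bytes);
    // Now RGBA: R=20, G=10, B=255, A=255.
    assert_eq!(bytes, vec![20, 10, 255, 255]);
}
```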
pub fn estimate_tokens(&self) -> usize {
let width = self.size.width.0.abs() as usize;
let height = self.size.height.0.abs() as usize;
// From: https://docs.anthropic.com/en/docs/build-with-claude/vision#calculate-image-costs
// Note that there are a lot of conditions on Anthropic's API, and OpenAI doesn't use this,
// so this method is more of a rough guess.
(width * height) / 750
}
}
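The heuristic `estimate_tokens` implements is Anthropic's published image-cost formula, tokens ≈ (width × height) / 750, applied with integer division:

```rust
// Anthropic's image-cost estimate: roughly one token per 750 pixels.
fn estimate_image_tokens(width: usize, height: usize) -> usize {
    (width * height) / 750
}

fn main() {
    // A 1024x768 image: 786432 / 750 = 1048 tokens (integer division).
    assert_eq!(estimate_image_tokens(1024, 768), 1048);
}
```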
#[derive(Clone, Serialize, Deserialize, Eq, PartialEq, Hash)]
pub enum MessageContent {
Text(String),
Image(LanguageModelImage),
}
impl std::fmt::Debug for MessageContent {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
MessageContent::Text(t) => f.debug_struct("MessageContent").field("text", t).finish(),
MessageContent::Image(i) => f
.debug_struct("MessageContent")
.field("image", &i.source.len())
.finish(),
}
}
}
impl MessageContent {
pub fn as_string(&self) -> &str {
match self {
MessageContent::Text(s) => s.as_str(),
MessageContent::Image(_) => "",
}
}
}
impl From<String> for MessageContent {
fn from(value: String) -> Self {
MessageContent::Text(value)
}
}
impl From<&str> for MessageContent {
fn from(value: &str) -> Self {
MessageContent::Text(value.to_string())
}
}
#[derive(Clone, Serialize, Deserialize, Debug, PartialEq, Hash)]
pub struct LanguageModelRequestMessage {
pub role: Role,
pub content: String,
pub content: Vec<MessageContent>,
}
impl LanguageModelRequestMessage {
pub fn string_contents(&self) -> String {
let mut string_buffer = String::new();
for string in self.content.iter().filter_map(|content| match content {
MessageContent::Text(s) => Some(s),
MessageContent::Image(_) => None,
}) {
string_buffer.push_str(string.as_str())
}
string_buffer
}
pub fn contents_empty(&self) -> bool {
self.content.is_empty()
|| self
.content
.get(0)
.map(|content| match content {
MessageContent::Text(s) => s.is_empty(),
MessageContent::Image(_) => true,
})
.unwrap_or(false)
}
}
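A sketch of `string_contents`/`contents_empty` over a simplified content enum: text parts are concatenated and images contribute nothing to the string form. Note that, mirroring the source, `contents_empty` inspects only the first entry, so a message that opens with an image counts as empty:

```rust
enum MessageContent {
    Text(String),
    Image(/* base64 data */ String),
}

fn string_contents(content: &[MessageContent]) -> String {
    content
        .iter()
        .filter_map(|c| match c {
            MessageContent::Text(s) => Some(s.as_str()),
            MessageContent::Image(_) => None,
        })
        .collect()
}

fn contents_empty(content: &[MessageContent]) -> bool {
    content.is_empty()
        || content
            .first()
            .map(|c| match c {
                MessageContent::Text(s) => s.is_empty(),
                MessageContent::Image(_) => true,
            })
            .unwrap_or(false)
}

fn main() {
    let content = vec![
        MessageContent::Text("hello ".into()),
        MessageContent::Image("…".into()),
        MessageContent::Text("world".into()),
    ];
    assert_eq!(string_contents(&content), "hello world");
    assert!(!contents_empty(&content));
    assert!(contents_empty(&[]));
}
```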
#[derive(Clone, Debug, Default, Serialize, Deserialize, PartialEq)]
@@ -23,14 +236,14 @@ impl LanguageModelRequest {
.into_iter()
.map(|msg| match msg.role {
Role::User => open_ai::RequestMessage::User {
content: msg.content,
content: msg.string_contents(),
},
Role::Assistant => open_ai::RequestMessage::Assistant {
content: Some(msg.content),
content: Some(msg.string_contents()),
tool_calls: Vec::new(),
},
Role::System => open_ai::RequestMessage::System {
content: msg.content,
content: msg.string_contents(),
},
})
.collect(),
@@ -51,7 +264,7 @@ impl LanguageModelRequest {
.into_iter()
.map(|msg| google_ai::Content {
parts: vec![google_ai::Part::TextPart(google_ai::TextPart {
text: msg.content,
text: msg.string_contents(),
})],
role: match msg.role {
Role::User => google_ai::Role::User,
@@ -77,7 +290,7 @@ impl LanguageModelRequest {
let mut system_message = String::new();
for message in self.messages {
if message.content.is_empty() {
if message.contents_empty() {
continue;
}
@@ -85,8 +298,11 @@ impl LanguageModelRequest {
Role::User | Role::Assistant => {
if let Some(last_message) = new_messages.last_mut() {
if last_message.role == message.role {
last_message.content.push_str("\n\n");
last_message.content.push_str(&message.content);
// TODO: is this append done properly?
last_message.content.push(MessageContent::Text(format!(
"\n\n{}",
message.string_contents()
)));
continue;
}
}
@@ -97,7 +313,7 @@ impl LanguageModelRequest {
if !system_message.is_empty() {
system_message.push_str("\n\n");
}
system_message.push_str(&message.content);
system_message.push_str(&message.string_contents());
}
}
}
@@ -113,9 +329,24 @@ impl LanguageModelRequest {
Role::Assistant => anthropic::Role::Assistant,
Role::System => return None,
},
content: vec![anthropic::Content::Text {
text: message.content,
}],
content: message
.content
.into_iter()
// TODO: filter out the empty messages in the message construction step
.filter_map(|content| match content {
MessageContent::Text(t) if !t.is_empty() => {
Some(anthropic::Content::Text { text: t })
}
MessageContent::Image(i) => Some(anthropic::Content::Image {
source: anthropic::ImageSource {
source_type: "base64".to_string(),
media_type: "image/png".to_string(),
data: i.source.to_string(),
},
}),
_ => None,
})
.collect(),
})
})
.collect(),


@@ -626,7 +626,7 @@ fn human_readable_package_name(package_directory: &Path) -> Option<String> {
// For providing local `cargo check -p $pkgid` task, we do not need most of the information we have returned.
// Output example in the root of Zed project:
// ```bash
// ```sh
// cargo pkgid zed
// path+file:///absolute/path/to/project/zed/crates/zed#0.131.0
// ```


@@ -3,7 +3,7 @@ use anyhow::{anyhow, Context, Result};
use async_trait::async_trait;
use collections::{btree_map::Entry as BTreeEntry, hash_map::Entry, BTreeMap, HashMap, HashSet};
use futures::Stream;
use gpui::{BackgroundExecutor, ImageSource};
use gpui::{BackgroundExecutor, SurfaceSource};
use live_kit_server::{proto, token};
use parking_lot::Mutex;
@@ -870,7 +870,7 @@ impl Frame {
self.height
}
pub fn image(&self) -> ImageSource {
pub fn image(&self) -> SurfaceSource {
unimplemented!("you can't call this in test mode")
}
}


@@ -150,7 +150,7 @@ impl Markdown {
return;
}
let text = text.text_for_range(self.selection.start..self.selection.end);
cx.write_to_clipboard(ClipboardItem::new(text));
cx.write_to_clipboard(ClipboardItem::new_string(text));
}
fn parse(&mut self, cx: &mut ViewContext<Self>) {
@@ -480,7 +480,7 @@ impl MarkdownElement {
{
let text = rendered_text
.text_for_range(markdown.selection.start..markdown.selection.end);
cx.write_to_primary(ClipboardItem::new(text))
cx.write_to_primary(ClipboardItem::new_string(text))
}
cx.notify();
}


@@ -1067,7 +1067,7 @@ impl OutlinePanel {
.and_then(|entry| self.abs_path(&entry, cx))
.map(|p| p.to_string_lossy().to_string())
{
cx.write_to_clipboard(ClipboardItem::new(clipboard_text));
cx.write_to_clipboard(ClipboardItem::new_string(clipboard_text));
}
}
@@ -1082,7 +1082,7 @@ impl OutlinePanel {
})
.map(|p| p.to_string_lossy().to_string())
{
cx.write_to_clipboard(ClipboardItem::new(clipboard_text));
cx.write_to_clipboard(ClipboardItem::new_string(clipboard_text));
}
}


@@ -187,13 +187,13 @@ pub fn prompts_dir() -> &'static PathBuf {
/// Returns the path to the prompt templates directory.
///
/// This is where the prompt templates for core features can be overridden with templates.
pub fn prompt_templates_dir() -> &'static PathBuf {
pub fn prompt_overrides_dir() -> &'static PathBuf {
static PROMPT_TEMPLATES_DIR: OnceLock<PathBuf> = OnceLock::new();
PROMPT_TEMPLATES_DIR.get_or_init(|| {
if cfg!(target_os = "macos") {
config_dir().join("prompts").join("templates")
config_dir().join("prompt_overrides")
} else {
support_dir().join("prompts").join("templates")
support_dir().join("prompt_overrides")
}
})
}


@@ -1357,7 +1357,7 @@ impl ProjectPanel {
fn copy_path(&mut self, _: &CopyPath, cx: &mut ViewContext<Self>) {
if let Some((worktree, entry)) = self.selected_entry(cx) {
cx.write_to_clipboard(ClipboardItem::new(
cx.write_to_clipboard(ClipboardItem::new_string(
worktree
.abs_path()
.join(&entry.path)
@@ -1369,7 +1369,9 @@ impl ProjectPanel {
fn copy_relative_path(&mut self, _: &CopyRelativePath, cx: &mut ViewContext<Self>) {
if let Some((_, entry)) = self.selected_entry(cx) {
cx.write_to_clipboard(ClipboardItem::new(entry.path.to_string_lossy().to_string()));
cx.write_to_clipboard(ClipboardItem::new_string(
entry.path.to_string_lossy().to_string(),
));
}
}


@@ -5,7 +5,7 @@ use crate::stdio::TerminalOutput;
use anyhow::Result;
use base64::prelude::*;
use gpui::{
img, percentage, Animation, AnimationExt, AnyElement, FontWeight, ImageData, Render, Task,
img, percentage, Animation, AnimationExt, AnyElement, FontWeight, Render, RenderImage, Task,
TextRun, Transformation, View,
};
use runtimelib::datatable::TableSchema;
@@ -38,7 +38,7 @@ fn rank_mime_type(mimetype: &MimeType) -> usize {
pub struct ImageView {
height: u32,
width: u32,
image: Arc<ImageData>,
image: Arc<RenderImage>,
}
impl ImageView {
@@ -76,7 +76,7 @@ impl ImageView {
let height = data.height();
let width = data.width();
let gpui_image_data = ImageData::new(vec![image::Frame::new(data)]);
let gpui_image_data = RenderImage::new(vec![image::Frame::new(data)]);
return Ok(ImageView {
height,


@@ -19,6 +19,9 @@ pub struct TerminalOutput {
handler: alacritty_terminal::Term<ZedListener>,
}
const DEFAULT_NUM_LINES: usize = 32;
const DEFAULT_NUM_COLUMNS: usize = 128;
pub fn terminal_size(cx: &mut WindowContext) -> terminal::TerminalSize {
let text_style = cx.text_style();
let text_system = cx.text_system();
@@ -33,8 +36,8 @@ pub fn terminal_size(cx: &mut WindowContext) -> terminal::TerminalSize {
.unwrap()
.width;
let num_lines = 200;
let columns = 120;
let num_lines = DEFAULT_NUM_LINES;
let columns = DEFAULT_NUM_COLUMNS;
// Reversed math from terminal::TerminalSize to get pixel width according to terminal width
let width = columns as f32 * cell_width;


@@ -117,17 +117,15 @@ impl RenderOnce for StoryContainer {
pub struct Story {}
impl Story {
pub fn container() -> Div {
div().size_full().overflow_hidden().child(
div()
.id("story_container")
.overflow_y_scroll()
.w_full()
.min_h_full()
.flex()
.flex_col()
.bg(story_color().background),
)
pub fn container() -> gpui::Stateful<Div> {
div()
.id("story_container")
.overflow_y_scroll()
.w_full()
.min_h_full()
.flex()
.flex_col()
.bg(story_color().background)
}
// TODO: Move all stories to container2, then rename


@@ -1,6 +1,5 @@
Much of element styling is now handled by an external engine.
How do I make an element hover.
There's a hover style.
@@ -15,11 +14,8 @@ struct Hoverable<E: Element> {
impl<V> Element<V> for Hoverable {
}
```
```rs
#[derive(Styled, Interactive)]
pub struct Div {
@@ -50,23 +46,12 @@ pub trait Interactive<V> {
struct Interactions<V> {
click: SmallVec<[Rc<dyn Fn(&mut V, &dyn Any)>; 1]>,
}
```
```rs
trait Stylable {
type Style;
fn with_style(self, style: Self::Style) -> Self;
}
```


@@ -25,6 +25,8 @@ impl Render for OverflowScrollStory {
.child(Story::label("`overflow_y_scroll`"))
.child(
v_flex()
.w_full()
.flex_1()
.id("overflow_y_scroll")
.gap_2()
.overflow_y_scroll()


@@ -656,13 +656,17 @@ impl Terminal {
cx.emit(Event::BreadcrumbsChanged);
}
AlacTermEvent::ClipboardStore(_, data) => {
cx.write_to_clipboard(ClipboardItem::new(data.to_string()))
cx.write_to_clipboard(ClipboardItem::new_string(data.to_string()))
}
AlacTermEvent::ClipboardLoad(_, format) => {
self.write_to_pty(
match &cx.read_from_clipboard().and_then(|item| item.text()) {
// The terminal only supports pasting strings, not images.
Some(text) => format(text),
_ => format(""),
},
)
}
AlacTermEvent::ClipboardLoad(_, format) => self.write_to_pty(format(
&cx.read_from_clipboard()
.map(|ci| ci.text().to_string())
.unwrap_or_else(|| "".to_string()),
)),
AlacTermEvent::PtyWrite(out) => self.write_to_pty(out.clone()),
AlacTermEvent::TextAreaSizeRequest(format) => {
self.write_to_pty(format(self.last_content.size.into()))
@@ -767,7 +771,7 @@ impl Terminal {
#[cfg(target_os = "linux")]
if let Some(selection_text) = term.selection_to_string() {
cx.write_to_primary(ClipboardItem::new(selection_text));
cx.write_to_primary(ClipboardItem::new_string(selection_text));
}
if let Some((_, head)) = selection {
@@ -788,7 +792,7 @@ impl Terminal {
#[cfg(target_os = "linux")]
if let Some(selection_text) = term.selection_to_string() {
cx.write_to_primary(ClipboardItem::new(selection_text));
cx.write_to_primary(ClipboardItem::new_string(selection_text));
}
self.selection_head = Some(point);
@@ -798,7 +802,7 @@ impl Terminal {
InternalEvent::Copy => {
if let Some(txt) = term.selection_to_string() {
cx.write_to_clipboard(ClipboardItem::new(txt))
cx.write_to_clipboard(ClipboardItem::new_string(txt))
}
}
InternalEvent::ScrollToAlacPoint(point) => {


@@ -488,9 +488,9 @@ impl TerminalView {
///Attempt to paste the clipboard into the terminal
fn paste(&mut self, _: &Paste, cx: &mut ViewContext<Self>) {
if let Some(item) = cx.read_from_clipboard() {
if let Some(clipboard_string) = cx.read_from_clipboard().and_then(|item| item.text()) {
self.terminal
.update(cx, |terminal, _cx| terminal.paste(item.text()));
.update(cx, |terminal, _cx| terminal.paste(&clipboard_string));
}
}


@@ -7,7 +7,7 @@ This crate provides the theme system for Zed.
A theme is a collection of colors used to build a consistent appearance for UI components across the application.
To produce a theme in Zed,
A theme is made of of two parts: A [ThemeFamily] and one or more [Theme]s.
A theme is made of two parts: A [ThemeFamily] and one or more [Theme]s.
//
A [ThemeFamily] contains metadata like theme name, author, and theme-specific [ColorScales] as well as a series of themes.


@@ -24,7 +24,7 @@ async fn test_visual_star_hash(cx: &mut gpui::TestAppContext) {
To keep CI runs fast, by default the neovim tests use a cached JSON file that records what neovim did (see crates/vim/test_data),
but while developing this test you'll need to run it with the neovim flag enabled:
```
```sh
cargo test -p vim --features neovim test_visual_star_hash
```


@@ -361,7 +361,8 @@ mod test {
Mode::Normal,
);
assert_eq!(
cx.read_from_clipboard().map(|item| item.text().clone()),
cx.read_from_clipboard()
.map(|item| item.text().map(ToOwned::to_owned).unwrap()),
Some("jumps".into())
);
cx.simulate_keystrokes("d d p");
@@ -373,10 +374,11 @@ mod test {
Mode::Normal,
);
assert_eq!(
cx.read_from_clipboard().map(|item| item.text().clone()),
cx.read_from_clipboard()
.map(|item| item.text().map(ToOwned::to_owned).unwrap()),
Some("jumps".into())
);
cx.write_to_clipboard(ClipboardItem::new("test-copy".to_string()));
cx.write_to_clipboard(ClipboardItem::new_string("test-copy".to_string()));
cx.simulate_keystrokes("shift-p");
cx.assert_state(
indoc! {"


@@ -92,6 +92,9 @@ fn scroll_editor(
let mut head = selection.head();
let top = top_anchor.to_display_point(map);
let vertical_scroll_margin =
(vertical_scroll_margin as u32).min(visible_line_count as u32 / 2);
if preserve_cursor_position {
let old_top = old_top_anchor.to_display_point(map);
let new_row = if old_top.row() == top.row() {
@@ -108,10 +111,13 @@ fn scroll_editor(
let min_row = if top.row().0 == 0 {
DisplayRow(0)
} else {
DisplayRow(top.row().0 + vertical_scroll_margin as u32)
DisplayRow(top.row().0 + vertical_scroll_margin)
};
let max_row = DisplayRow(
top.row().0 + visible_line_count as u32 - vertical_scroll_margin as u32 - 1,
top.row().0
+ (visible_line_count as u32)
.saturating_sub(vertical_scroll_margin)
.saturating_sub(1),
);
let new_head = if head.row() < min_row {
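The `saturating_sub` calls above are the heart of this fix: on `u32`, the old `visible_line_count as u32 - vertical_scroll_margin as u32 - 1` could underflow when a pane is shorter than the scroll margin. A minimal sketch of just that expression:

```rust
// Isolated form of the changed max_row computation. With plain `-` on
// u32, a 2-line pane and the usual margin of 3 would underflow (panic
// in debug builds, wrap in release); saturating_sub clamps at zero.
fn max_row(top: u32, visible_line_count: u32, margin: u32) -> u32 {
    top + visible_line_count.saturating_sub(margin).saturating_sub(1)
}

fn main() {
    // Typical case: 40 visible lines, margin 3 -> top + 36.
    assert_eq!(max_row(10, 40, 3), 46);
    // Tiny pane: the subtraction clamps instead of underflowing.
    assert_eq!(max_row(10, 2, 3), 10);
    println!("ok");
}
```

The earlier `vertical_scroll_margin.min(visible_line_count / 2)` clamp in the hunk serves the same purpose from the other direction, keeping the margin sensible for short panes.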


@@ -5,7 +5,7 @@ use crate::surrounds::SurroundsType;
use crate::{motion::Motion, object::Object};
use collections::HashMap;
use editor::{Anchor, ClipboardSelection};
use gpui::{Action, ClipboardItem, KeyContext};
use gpui::{Action, ClipboardEntry, ClipboardItem, KeyContext};
use language::{CursorShape, Selection, TransactionId};
use serde::{Deserialize, Serialize};
use ui::SharedString;
@@ -129,20 +129,24 @@ pub struct Register {
impl From<Register> for ClipboardItem {
fn from(register: Register) -> Self {
let item = ClipboardItem::new(register.text.into());
if let Some(clipboard_selections) = register.clipboard_selections {
item.with_metadata(clipboard_selections)
ClipboardItem::new_string_with_metadata(register.text.into(), clipboard_selections)
} else {
item
ClipboardItem::new_string(register.text.into())
}
}
}
impl From<ClipboardItem> for Register {
fn from(value: ClipboardItem) -> Self {
Register {
text: value.text().to_owned().into(),
clipboard_selections: value.metadata::<Vec<ClipboardSelection>>(),
fn from(item: ClipboardItem) -> Self {
// For now, we don't store metadata for multiple entries.
match item.entries().first() {
Some(ClipboardEntry::String(value)) if item.entries().len() == 1 => Register {
text: value.text().to_owned().into(),
clipboard_selections: value.metadata::<Vec<ClipboardSelection>>(),
},
// For now, registers can't store images. This could change in the future.
_ => Register::default(),
}
}
}


@@ -247,7 +247,12 @@ impl NeovimBackedTestContext {
register: '"',
state: self.shared_state().await,
neovim: self.neovim.read_register('"').await,
editor: self.read_from_clipboard().unwrap().text().clone(),
editor: self
.read_from_clipboard()
.unwrap()
.text()
.unwrap()
.to_owned(),
}
}


@@ -586,7 +586,7 @@ impl Vim {
} else {
self.workspace_state.last_yank = cx
.read_from_clipboard()
.map(|item| item.text().to_owned().into());
.and_then(|item| item.text().map(|string| string.into()))
}
self.workspace_state.registers.insert('"', content.clone());
@@ -663,7 +663,7 @@ impl Vim {
fn system_clipboard_is_newer(&self, cx: &mut AppContext) -> bool {
cx.read_from_clipboard().is_some_and(|item| {
if let Some(last_state) = &self.workspace_state.last_yank {
last_state != item.text()
Some(last_state.as_ref()) != item.text().as_deref()
} else {
true
}


@@ -927,7 +927,7 @@ mod test {
the lazy dog"});
assert_eq!(
cx.read_from_clipboard()
.map(|item| item.text().clone())
.map(|item| item.text().map(ToOwned::to_owned).unwrap().clone())
.unwrap(),
"The q"
);


@@ -342,7 +342,7 @@ impl Render for LanguageServerPrompt {
.on_click({
let message = request.message.clone();
move |_, cx| {
cx.write_to_clipboard(ClipboardItem::new(
cx.write_to_clipboard(ClipboardItem::new_string(
message.clone(),
))
}


@@ -1609,7 +1609,7 @@ impl Pane {
.and_then(|entry| entry.project_path(cx))
.map(|p| p.path.to_string_lossy().to_string())
{
cx.write_to_clipboard(ClipboardItem::new(clipboard_text));
cx.write_to_clipboard(ClipboardItem::new_string(clipboard_text));
}
}
@@ -1819,7 +1819,7 @@ impl Pane {
"Copy Path",
Some(Box::new(CopyPath)),
cx.handler_for(&pane, move |_, cx| {
cx.write_to_clipboard(ClipboardItem::new(
cx.write_to_clipboard(ClipboardItem::new_string(
abs_path.to_string_lossy().to_string(),
));
}),


@@ -7,7 +7,7 @@ use call::participant::{Frame, RemoteVideoTrack};
use client::{proto::PeerId, User};
use futures::StreamExt;
use gpui::{
div, img, AppContext, EventEmitter, FocusHandle, FocusableView, InteractiveElement,
div, surface, AppContext, EventEmitter, FocusHandle, FocusableView, InteractiveElement,
ParentElement, Render, SharedString, Styled, Task, View, ViewContext, VisualContext,
WindowContext,
};
@@ -75,7 +75,7 @@ impl Render for SharedScreen {
.children(
self.frame
.as_ref()
.map(|frame| img(frame.image()).size_full()),
.map(|frame| surface(frame.image()).size_full()),
)
}
}


@@ -4,9 +4,9 @@ Welcome to Zed's documentation.
This is built on push to `main` and published automatically to [https://zed.dev/docs](https://zed.dev/docs).
To preview the docs locally you will need to install [mdBook](https://rust-lang.github.io/mdBook/), and then run:
To preview the docs locally you will need to install [mdBook](https://rust-lang.github.io/mdBook/) (`cargo install mdbook`) and then run:
```
```sh
mdbook serve docs
```


@@ -20,3 +20,4 @@ enable = false
"/javascript.html" = "/docs/languages/javascript.html"
"/ruby.html" = "/docs/languages/ruby.html"
"/python.html" = "/docs/languages/python.html"
"/adding-new-languages.html" = "/docs/extensions/languages.html"


@@ -5,21 +5,25 @@
- [Getting Started](./getting-started.md)
- [System Requirements](./system-requirements.md)
- [Linux](./linux.md)
- [Windows](./windows.md)
- [Telemetry](./telemetry.md)
- [Additional Learning Materials](./additional-learning-materials.md)
# Configuration
- [Configuring Zed](./configuring-zed.md)
- [Configuring Languages](./configuring-languages.md)
- [Key bindings](./key-bindings.md)
- [Snippets](./snippets.md)
- [Themes](./themes.md)
<!-- - [Fonts](./fonts.md) -->
- [Vim](./vim.md)
# Using Zed
- [Multibuffers](./multibuffers.md)
- [Language model integration](./language-model-integration.md)
- [Code Completions](./completions.md)
- [Channels](./channels.md)
- [Collaboration](./collaboration.md)
- [Git](./git.md)
@@ -27,9 +31,21 @@
- [Remote Development](./remote-development.md)
- [REPL](./repl.md)
# Extensions
- [Overview](./extensions.md)
- [Installing Extensions](./extensions/installing-extensions.md)
- [Developing Extensions](./extensions/developing-extensions.md)
- [Language Extensions](./extensions/languages.md)
- [Theme Extensions](./extensions/themes.md)
# Language Support
- [All Languages](./languages.md)
- [AsciiDoc](./languages/asciidoc.md)
- [Astro](./languages/astro.md)
- [Bash](./languages/bash.md)
- [Biome](./languages/biome.md)
- [C](./languages/c.md)
- [C++](./languages/cpp.md)
- [C#](./languages/csharp.md)
@@ -37,17 +53,29 @@
- [CSS](./languages/css.md)
- [Dart](./languages/dart.md)
- [Deno](./languages/deno.md)
- [Docker](./languages/docker.md)
- [Elixir](./languages/elixir.md)
- [Elm](./languages/elm.md)
- [Emmet](./languages/emmet.md)
- [Erlang](./languages/erlang.md)
- [Fish](./languages/fish.md)
- [GDScript](./languages/gdscript.md)
- [Gleam](./languages/gleam.md)
- [GLSL](./languages/glsl.md)
- [Go](./languages/go.md)
- [Groovy](./languages/groovy.md)
- [Haskell](./languages/haskell.md)
- [HTML](./languages/html.md)
- [Java](./languages/java.md)
- [JavaScript](./languages/javascript.md)
- [Julia](./languages/julia.md)
- [JSON](./languages/json.md)
- [Kotlin](./languages/kotlin.md)
- [Lua](./languages/lua.md)
- [Luau](./languages/luau.md)
- [Makefile](./languages/makefile.md)
- [Markdown](./languages/markdown.md)
- [Nim](./languages/nim.md)
- [OCaml](./languages/ocaml.md)
- [PHP](./languages/php.md)
- [Prisma](./languages/prisma.md)
@@ -55,11 +83,16 @@
- [PureScript](./languages/purescript.md)
- [Python](./languages/python.md)
- [R](./languages/r.md)
- [ReStructuredText](./languages/rst.md)
- [Racket](./languages/racket.md)
- [Roc](./languages/roc.md)
- [Ruby](./languages/ruby.md)
- [Rust](./languages/rust.md)
- [Scala](./languages/scala.md)
- [Scheme](./languages/scheme.md)
- [Svelte](./languages/svelte.md)
- [Swift](./languages/swift.md)
- [TailwindCSS](./languages/tailwindcss.md)
- [Terraform](./languages/terraform.md)
- [TOML](./languages/toml.md)
- [TypeScript](./languages/typescript.md)
@@ -71,8 +104,7 @@
# Developing Zed
- [Adding New Languages](./adding-new-languages.md)
- [Developing Zed](./developing-zed.md)
- [Developing Zed](./development.md)
- [macOS](./development/macos.md)
- [Linux](./development/linux.md)
- [Windows](./development/windows.md)


@@ -1,83 +0,0 @@
# Adding New Languages to Zed
## LSP
Zed uses the [Language Server Protocol](https://microsoft.github.io/language-server-protocol/) to provide language support. This means, in theory, we can support any language that has an LSP server.
## Syntax Highlighting
### Defining syntax highlighting rules
We use tree-sitter queries to match certain properties to highlight.
#### Simple Example:
```scheme
(property_identifier) @property
```
```ts
const font: FontFamily = {
weight: "normal",
underline: false,
italic: false,
};
```
Match a property identifier and highlight it using the identifier `@property`. In the above example, `weight`, `underline`, and `italic` would be highlighted.
#### Complex example:
```scheme
(_
return_type: (type_annotation
[
(type_identifier) @type.return
(generic_type
name: (type_identifier) @type.return)
]))
```
```ts
function buildDefaultSyntax(colorScheme: Theme): Partial<Syntax> {
// ...
}
```
Match a function return type, and highlight the type using the identifier `@type.return`. In the above example, `Partial` would be highlighted.
#### Example - Typescript
Here is an example portion of our `highlights.scm` for TypeScript:
```scheme
; crates/zed/src/languages/typescript/highlights.scm
; Variables
(identifier) @variable
; Properties
(property_identifier) @property
; Function and method calls
(call_expression
function: (identifier) @function)
(call_expression
function: (member_expression
property: (property_identifier) @function.method))
; Function and method definitions
(function
name: (identifier) @function)
(function_declaration
name: (identifier) @function)
(method_definition
name: (property_identifier) @function.method)
; ...
```


@@ -31,23 +31,19 @@ After joining a channel, you can `Share` a project with the other people there.
When you are editing someone else's project, you still have the full power of the editor at your fingertips: you can jump to definitions, use the AI assistant, and see any diagnostic errors. This is extremely powerful for pairing, as one of you can be implementing the current method while the other is reading and researching the correct solution to the next problem. And, because you have your own config running, it feels like you're using your own machine.
### Following
See [our collaboration documentation](./collaboration.md) for more details about how this works.
You can follow someone by clicking on their avatar in the top bar, or their name in the collaboration panel. When following, your pane will show you what they are looking at, even if they are jumping between different files in the project. If you want to stop following them, you can by scrolling around, or clicking in a different part of the file.
Following is incredibly useful when you're learning a new codebase, or trying to debug together. Because you can always see what each person is looking at, there's no confusion as to what is being talked about.
As a bonus, if the other person is sharing their screen, you can follow them out of Zed and see what is going on, so you can see if the code you wrote together really works.
### Notes & Chat
### Notes
Each channel has a notes file associated with it to keep track of current status, new ideas, or to collaborate on building out the design for the feature that you're working on before diving into code.
<figure><img src="../.gitbook/assets/channels-3.png" alt=""><figcaption></figcaption></figure>
The chat is also there for quickly sharing context, or getting questions answered, that are more ephemeral in nature.
This is similar to a Google Doc, except powered by Zed's collaborative software and persisted to our servers.
Between the two, you can use Zed's collaboration mode for large-scale changes with multiple people tackling different aspects of the problem. Because you're all working on the same copy of the code, there are no merge conflicts, and because you all have access to the same notes, it's easy to track progress and keep everyone in the loop.
### Chat
The chat is also there for quickly sharing context without a microphone, getting questions answered, or however else you'd want to use a chat channel.
### Inviting people


@@ -4,8 +4,6 @@ Only collaborate with people that you trust. Since sharing a project gives them
In the future, we will do more to prevent this type of access beyond the shared project and add more control over what collaborators can do, but for now, only collaborate with people you trust.
Note: we are working on a new version of this feature called [Channels](channels/). If you'd like to be part of the private beta, please contact us!
## Adding a collaborator to a call
Before you can collaborate, you'll need to add a collaborator to your contacts. To do this:
@@ -26,13 +24,17 @@ When you invite a collaborator to a project not in a call they will receive a no
### Inviting non-Zed users
If someone you want to collaborate with has not yet signed up for Zed, they will need to [download the app](https://zed.dev/download) and sign in for the first time before you can add them.
If someone you want to collaborate with has not yet signed up for Zed, they will need to [download the app](https://zed.dev/download) and sign in for the first time before you can add them. Identity is tied to GitHub accounts, so new users will need to authenticate with GitHub in order to sign into Zed.
### Voice chat
When joining a call, Zed will automatically share your microphone with other users in the call, if your OS allows it. This isn't tied to your project. You can disable this for your client via the [`mute_on_join`](./configuring-zed.md#calls) setting.
## Collaborating on a project
### Share a project
When you invite a collaborator to join your project, a new call begins. Your Zed windows will show the call participants in the top right of the window.
When you invite a collaborator to join your project, a new call begins. Your Zed windows will show the call participants in the title bar of the window.
![A new Zed call with two collaborators](https://zed.dev/img/collaboration/new-call.png)

docs/src/completions.md Normal file

@@ -0,0 +1,91 @@
# Completions
Zed supports two sources for completions:
1. "Code Completions" provided by Language Servers (LSPs) automatically installed by Zed or via [Zed Language Extensions](languages.md).
2. "Inline Completions" provided by external APIs like [GitHub Copilot](#github-copilot) or [Supermaven](#supermaven).
## Code Completions
When there is an appropriate language server available, Zed will, by default, provide completions of variable names, functions, and other symbols in the current file. You can disable these by adding the following to your Zed `settings.json` file:
```json
"show_completions_on_input": false
```
You can manually trigger completions with `ctrl-space` or by triggering the `editor::ShowCompletions` action from the command palette.
For more information, see:
- [Configuring Supported Languages](./configuring-languages.md)
- [List of Zed Supported Languages](./languages.md).
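If you'd rather use a different binding for manual triggering, the action can be remapped in your `keymap.json`. A sketch of such an override (the `ctrl-shift-space` binding here is an arbitrary example, not a Zed default):

```json
[
  {
    "context": "Editor",
    "bindings": {
      "ctrl-shift-space": "editor::ShowCompletions"
    }
  }
]
```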
## Configuring Inline Completions
### GitHub Copilot
To use GitHub Copilot (enabled by default), add the following to your `settings.json`:
```json
{
"features": {
"inline_completion_provider": "copilot"
}
}
```
You should be able to sign in to GitHub Copilot by clicking on the Copilot icon in the status bar and following the setup instructions.
### Supermaven
To use Supermaven, add the following to your `settings.json`:
```json
{
"features": {
"inline_completion_provider": "supermaven"
}
}
```
You should be able to sign in to Supermaven by clicking on the Supermaven icon in the status bar and following the setup instructions.
## Using Inline Completions
Once you have configured an Inline Completions provider, you can start using inline completions in your code. Inline completions will appear as you type, and you can accept them by pressing `tab` or `enter` or hide them by pressing `esc`.
There are a number of actions/shortcuts available to interact with inline completions:
- `editor: accept inline completion` (`tab`): To accept the current inline completion
- `editor: accept partial inline completion` (`cmd-right`): To accept the current inline completion up to the next word boundary
- `editor: show inline completion` (`alt-\\`): Trigger an inline completion request manually
- `editor: next inline completion` (`alt-]`): To cycle to the next inline completion
- `editor: previous inline completion` (`alt-[`): To cycle to the previous inline completion
### Disabling Inline Completions
To disable completions that appear automatically as you type, add the following to your `settings.json`:
```json
{
"show_inline_completions": false
}
```
You can trigger inline completions manually by executing `editor: show inline completion` (`alt-\\`).
You can also add this as a language-specific setting in your `settings.json` to disable inline completions for a specific language:
```json
{
"language": {
"python": {
"show_inline_completions": false
}
}
}
```
## See also
You may also use the Assistant Panel or the Inline Assistant to interact with language models; see the [Language Model Integration](language-model-integration.md) documentation for more information.


@@ -0,0 +1,413 @@
# Configuring supported languages
Zed offers powerful customization options for each programming language it supports. This guide will walk you through the various ways you can tailor your coding experience to your preferences and project requirements.
Zed's language support is built on two main technologies:
1. Tree-sitter: This handles syntax highlighting and structure-based features like the outline panel.
2. Language Server Protocol (LSP): This provides semantic features such as code completion and diagnostics.
These components work together to provide Zed's language capabilities.
In this guide, we'll cover:
- Language-specific settings
- File associations
- Working with language servers
- Formatting and linting configuration
- Customizing syntax highlighting and themes
- Advanced language features
By the end of this guide, you should know how to configure and customize supported languages in Zed.
For a comprehensive list of languages supported by Zed and their specific configurations, see our [Supported Languages](./languages.md) page. To go further, you could explore developing your own extensions to add support for additional languages or enhance existing functionality. For more information on creating language extensions, see our [Language Extensions](./extensions/languages.md) guide.
## Language-specific Settings
Zed allows you to override global settings for individual languages. These custom configurations are defined in your `settings.json` file under the `languages` key.
Here's an example of language-specific settings:
```json
"languages": {
"Python": {
"tab_size": 4,
"formatter": "language_server",
"format_on_save": true
},
"JavaScript": {
"tab_size": 2,
"formatter": {
"external": {
"command": "prettier",
"arguments": ["--stdin-filepath", "{buffer_path}"]
}
}
}
}
```
You can customize a wide range of settings for each language, including:
- `tab_size`: The number of spaces for each indentation level
- `formatter`: The tool used for code formatting
- `format_on_save`: Whether to automatically format code when saving
- `enable_language_server`: Toggle language server support
- `hard_tabs`: Use tabs instead of spaces for indentation
- `preferred_line_length`: The recommended maximum line length
- `soft_wrap`: How to wrap long lines of code
These settings allow you to maintain specific coding styles across different languages and projects.
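For instance, a `languages` block combining several of these keys might look like this (the values are illustrative, not recommendations):

```json
"languages": {
  "Go": {
    "hard_tabs": true,
    "preferred_line_length": 100,
    "soft_wrap": "preferred_line_length",
    "format_on_save": true
  }
}
```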
## File Associations
Zed automatically detects file types based on their extensions, but you can customize these associations to fit your workflow.
To set up custom file associations, use the `file_types` setting in your `settings.json`:
```json
"file_types": {
"C++": ["c"],
"TOML": ["MyLockFile"],
"Dockerfile": ["Dockerfile*"]
}
```
This configuration tells Zed to:
- Treat `.c` files as C++ instead of C
- Recognize files named "MyLockFile" as TOML
- Apply Dockerfile syntax to any file starting with "Dockerfile"
You can use glob patterns for more flexible matching, allowing you to handle complex naming conventions in your projects.
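For instance, a glob-based association might look like this (the patterns are illustrative, not defaults shipped with Zed):

```json
"file_types": {
  "Shell Script": ["*.env", ".env.*"]
}
```

This would apply shell syntax highlighting to dotenv-style files regardless of where they appear in your project.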
## Working with Language Servers
Language servers are a crucial part of Zed's intelligent coding features, providing capabilities like auto-completion, go-to-definition, and real-time error checking.
### What are Language Servers?
Language servers implement the Language Server Protocol (LSP), which standardizes communication between the editor and language-specific tools. This allows Zed to support advanced features for multiple programming languages without implementing each feature separately.
Some key features provided by language servers include:
- Code completion
- Error checking and diagnostics
- Code navigation (go to definition, find references)
- Code actions (rename, extract method)
- Hover information
- Workspace symbol search
### Managing Language Servers
Zed simplifies language server management for users:
1. Automatic Download: When you open a file with a matching file type, Zed automatically downloads the appropriate language server. Zed may prompt you to install an extension for known file types.
2. Storage Location:
- macOS: `~/Library/Application Support/Zed/languages`
- Linux: `$XDG_DATA_HOME/languages`, `$FLATPAK_XDG_DATA_HOME/languages`, or `$HOME/.local/share/languages`
3. Automatic Updates: Zed keeps your language servers up-to-date, ensuring you always have the latest features and improvements.
### Choosing Language Servers
Some languages in Zed offer multiple language server options. You might have multiple extensions installed that bundle language servers targeting the same language, potentially leading to overlapping capabilities. To ensure you get the functionality you prefer, Zed allows you to prioritize which language servers are used and in what order.
You can specify your preference using the `language_servers` setting:
```json
"languages": {
"PHP": {
"language_servers": ["intelephense", "!phpactor", "..."]
}
}
```
In this example:
- `intelephense` is set as the primary language server
- `phpactor` is disabled (note the `!` prefix)
- `...` preserves any other default language server settings
This configuration allows you to tailor the language server setup to your specific needs, ensuring that you get the most suitable functionality for your development workflow.
### Configuring Language Servers
Many language servers accept custom configuration options. You can set these in the `lsp` section of your `settings.json`:
```json
"lsp": {
"rust-analyzer": {
"initialization_options": {
"checkOnSave": {
"command": "clippy"
}
}
}
}
```
This example configures rust-analyzer to run Clippy for additional linting when checking files on save.
When configuring language server options in Zed, it's important to use nested objects rather than dot-delimited strings. This is particularly relevant when working with more complex configurations. Let's look at a real-world example using the TypeScript language server:
Suppose you want to configure the following settings for TypeScript:
- Enable strict null checks
- Set the target ECMAScript version to ES2020
- Configure import organization preferences
Here's how to structure these settings in Zed's `settings.json` as nested objects. The commented-out line shows the dot-notation form, which is not supported:
```json
"lsp": {
"typescript-language-server": {
"initialization_options": {
// This is not supported:
// "preferences.strictNullChecks": true,
// You express it like this:
"preferences": {
"strictNullChecks": true
}
}
}
}
```
### Enabling or Disabling Language Servers
You can toggle language server support globally or per-language:
```json
"languages": {
"Markdown": {
"enable_language_server": false
}
}
```
This disables the language server for Markdown files, which can be useful for performance in large documentation projects. You can configure this globally in `~/.config/zed/settings.json` or per-project in a `.zed/settings.json` inside your project directory.
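For example, a project-local `.zed/settings.json` that applies the override only within that project might contain just:

```json
{
  "languages": {
    "Markdown": {
      "enable_language_server": false
    }
  }
}
```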
## Formatting and Linting
Zed provides support for code formatting and linting to maintain consistent code style and catch potential issues early.
### Configuring Formatters
Zed supports both built-in and external formatters. Configure formatters globally or per-language in your `settings.json`:
```json
"languages": {
"JavaScript": {
"formatter": {
"external": {
"command": "prettier",
"arguments": ["--stdin-filepath", "{buffer_path}"]
}
},
"format_on_save": true
},
"Rust": {
"formatter": "language_server",
"format_on_save": true
}
}
```
This example uses Prettier for JavaScript and the language server's formatter for Rust, both set to format on save.
To disable formatting for a specific language:
```json
"languages": {
"Markdown": {
"format_on_save": false
}
}
```
### Setting Up Linters
Linting in Zed is typically handled by language servers. Many language servers allow you to configure linting rules:
```json
"lsp": {
"eslint": {
"settings": {
"codeActionOnSave": {
"rules": ["import/order"]
}
}
}
}
```
This configuration sets up ESLint to organize imports on save for JavaScript files.
To run linter fixes automatically on save:
```json
"languages": {
"JavaScript": {
"code_actions_on_format": {
"source.fixAll.eslint": true
}
}
}
```
### Integrating Formatting and Linting
Zed allows you to run both formatting and linting on save. Here's an example that uses Prettier for formatting and ESLint for linting JavaScript files:
```json
"languages": {
"JavaScript": {
"formatter": {
"external": {
"command": "prettier",
"arguments": ["--stdin-filepath", "{buffer_path}"]
}
},
"code_actions_on_format": {
"source.fixAll.eslint": true
},
"format_on_save": true
}
}
```
### Troubleshooting
If you encounter issues with formatting or linting:
1. Check Zed's log file for error messages (Use the command palette: `zed: open log`)
2. Ensure external tools (formatters, linters) are correctly installed and in your PATH
3. Verify configurations in both Zed settings and language-specific config files (e.g., `.eslintrc`, `.prettierrc`)
## Syntax Highlighting and Themes
Zed offers customization options for syntax highlighting and themes, allowing you to tailor the visual appearance of your code.
### Customizing Syntax Highlighting
Zed uses Tree-sitter grammars for syntax highlighting. Override the default highlighting using the `experimental.theme_overrides` setting:
```json
"experimental.theme_overrides": {
"syntax": {
"comment": {
"font_style": "italic"
},
"string": {
"color": "#00AA00"
}
}
}
```
This example makes comments italic and changes the color of strings.
### Language-Specific Theme Overrides
Apply theme overrides for specific languages:
```json
"languages": {
"Python": {
"theme_overrides": {
"syntax": {
"function": {
"color": "#0000FF"
}
}
}
}
}
```
This configuration changes the color of function names in Python files.
### Selecting and Customizing Themes
Change your theme:
1. Use the theme selector (Cmd+K Cmd+T on macOS, Ctrl+K Ctrl+T on Linux)
2. Or set it in your `settings.json`:
```json
"theme": {
"mode": "dark",
"dark": "One Dark",
"light": "GitHub Light"
}
```
Create custom themes by creating a JSON file in `~/.config/zed/themes/`. Zed will automatically detect and make available any themes in this directory.
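As a rough sketch, a minimal theme file might look like the following. The exact schema fields and style keys should be confirmed against Zed's theme documentation; the names and colors here are placeholders:

```json
{
  "name": "My Custom Theme",
  "author": "Your Name",
  "themes": [
    {
      "name": "My Custom Dark",
      "appearance": "dark",
      "style": {
        "editor.background": "#1e1e2e",
        "text": "#cdd6f4"
      }
    }
  ]
}
```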
### Using Theme Extensions
Zed supports theme extensions. Browse and install theme extensions from the Extensions panel (Cmd+Shift+E).
To create your own theme extension, refer to the [Developing Theme Extensions](./extensions/themes.md) guide.
## Using Language Server Features
### Inlay Hints
Inlay hints provide additional information inline in your code, such as parameter names or inferred types. Configure inlay hints in your `settings.json`:
```json
"inlay_hints": {
"enabled": true,
"show_type_hints": true,
"show_parameter_hints": true,
"show_other_hints": true
}
```
For language-specific inlay hint settings, refer to the documentation for each language.
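For example, rust-analyzer exposes its own inlay hint options, which you can pass through the `lsp` section. The option names below follow rust-analyzer's configuration and should be checked against its documentation:

```json
"lsp": {
  "rust-analyzer": {
    "initialization_options": {
      "inlayHints": {
        "typeHints": { "enable": true },
        "parameterHints": { "enable": false }
      }
    }
  }
}
```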
### Code Actions
Code actions provide quick fixes and refactoring options. Access code actions using the `editor: Toggle Code Actions` command or by clicking the lightbulb icon that appears next to your cursor when actions are available.
### Go To Definition and References
Use these commands to navigate your codebase:
- `editor: Go to Definition` (F12)
- `editor: Go to Type Definition` (Cmd+F12 on macOS, Ctrl+F12 on Linux)
- `editor: Find All References` (Shift+F12)
### Rename Symbol
To rename a symbol across your project:
1. Place your cursor on the symbol
2. Use the `editor: Rename Symbol` command (F2)
3. Enter the new name and press Enter
These features depend on the capabilities of the language server for each language.
When renaming a symbol that spans multiple files, Zed will open a preview in a multibuffer. This allows you to review all the changes across your project before applying them. To confirm the rename, simply save the multibuffer. If you decide not to proceed with the rename, you can undo the changes or close the multibuffer without saving.
### Hover Information
Use the `editor: Show Hover` command to display information about the symbol under the cursor. This often includes type information, documentation, and links to relevant resources.
### Workspace Symbol Search
The `workspace: Open Symbol` command allows you to search for symbols (functions, classes, variables) across your entire project. This is useful for quickly navigating large codebases.
### Code Completion
Zed provides intelligent code completion suggestions as you type. You can manually trigger completion with the `editor: Show Completions` command. Use Tab or Enter to accept suggestions.
### Diagnostics
Language servers provide real-time diagnostics (errors, warnings, hints) as you code. View all diagnostics for your project using the `diagnostics: Toggle` command.
