Compare commits

...

49 Commits

Author SHA1 Message Date
Max Brunsfeld
8953b487ad Simplify synthesis further 2025-12-30 16:28:50 -08:00
Max Brunsfeld
196c488ed4 Switch to single-pass approach for synthesis 2025-12-30 13:23:47 -08:00
Max Brunsfeld
dfbbacec12 Write out invalid examples to failed dir 2025-12-30 09:42:54 -08:00
Max Brunsfeld
9161a23513 Fix output dir flag 2025-12-30 09:42:14 -08:00
Max Brunsfeld
9a8ccb32ac Use -o flag for synthesize output dir 2025-12-29 19:19:10 -08:00
Max Brunsfeld
5cfdfd32c6 Allow re-running from examples with changed provider 2025-12-29 19:19:00 -08:00
Max Brunsfeld
defcc2f51b Make example file naming style match example capture code in Zed 2025-12-29 19:18:37 -08:00
Max Brunsfeld
6ebe0edea0 Simplify apply_diff 2025-12-29 18:52:49 -08:00
Max Brunsfeld
1a83c0f5e4 Add synthesize subcommand 2025-12-29 18:33:14 -08:00
Max Brunsfeld
27a6d54efe Fix missing diff path headers in actual patch 2025-12-29 18:32:36 -08:00
Max Brunsfeld
a168d8f50a Restore ability to deserialize inline cursor marker 2025-12-29 11:52:18 -08:00
Max Brunsfeld
e243a658a5 Allow running teacher model predictions with no expected patch 2025-12-29 10:23:22 -08:00
Max Brunsfeld
a93fd51f35 Omit events from other worktrees from captured examples 2025-12-29 10:23:22 -08:00
Max Brunsfeld
0dcdc6d9a4 Allow multiple expected patches, remove line-based patch scoring 2025-12-29 10:23:22 -08:00
Max Brunsfeld
7e09b59fa3 Don't assume that claude prompt format matches example spec format 2025-12-29 10:23:22 -08:00
Max Brunsfeld
1e28bf8279 Represent cursor location in excerpt with a comment line 2025-12-29 10:23:22 -08:00
Max Brunsfeld
b6eec44a99 Allow example name to be specified in markdown 2025-12-29 10:23:22 -08:00
Max Brunsfeld
d83c985923 Allow full paths for cursor excerpt 2025-12-29 10:23:22 -08:00
Max Brunsfeld
74c4e25b8c Don't try to populate expected patch in capture example 2025-12-29 10:23:22 -08:00
Max Brunsfeld
2021f32947 Allow full path for cursor path 2025-12-29 10:23:22 -08:00
Max Brunsfeld
299ca2e8ac Clear last event in clear history action 2025-12-29 10:23:22 -08:00
Max Brunsfeld
c284f9086b Store repo and revision in TOML front matter 2025-12-29 10:23:22 -08:00
Finn Evers
fc89e19098 extension_ci: Move shared workflows into nested folder (#45828)
This makes the rollout easier, and makes it easier to distinguish these
shared workflows in the future.

Release Notes:

- N/A
2025-12-29 16:44:47 +00:00
Finn Evers
f53b01d5a2 ci: Grant GitHub token more granular permissions (#45825)
Release Notes:

- N/A
2025-12-29 16:16:01 +00:00
Finn Evers
bf1c8819d9 ci: Properly request token for extension repositories (#45824)
Release Notes:

- N/A
2025-12-29 15:41:56 +00:00
Xiaobo Liu
3247264288 ui: Remove stray blank lines in ButtonStyle methods (#45822)
Release Notes:

- N/A
2025-12-29 14:23:49 +00:00
Danilo Leal
6d947b7746 ui: Add a CopyButton component (#45821)
There were several places adding a copy icon button, so I thought of
encapsulating the logic to copy a given string to the clipboard (and
other small details, like swapping the icon and tooltip once copied) into a
component, making it easier to introduce this sort of functionality in
the future with fewer lines of code.

All it takes (for the simplest case) is:

```rs
CopyButton::new(your_message)
```

<img width="600" height="714" alt="Screenshot 2025-12-29 at 10  50@2x"
src="https://github.com/user-attachments/assets/e6949863-a056-4855-82d8-e4ffb5d62c90"
/>
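The component also handles the temporary "copied" feedback. A minimal, hypothetical sketch of that state logic in plain Rust (no `gpui`; `CopyState`, `icon`, and the two-second reset window are illustrative, not the component's actual API):

```rust
use std::time::{Duration, Instant};

// Hypothetical sketch of the copy-state logic a CopyButton encapsulates:
// after a click, show a check icon (and a "Copied!" tooltip) for a short
// window before reverting to the copy icon.
struct CopyState {
    copied_at: Option<Instant>,
    reset_after: Duration,
}

impl CopyState {
    fn new() -> Self {
        Self { copied_at: None, reset_after: Duration::from_secs(2) }
    }

    fn on_click(&mut self) {
        self.copied_at = Some(Instant::now());
    }

    fn icon(&self) -> &'static str {
        match self.copied_at {
            // Still inside the feedback window: show the confirmation icon.
            Some(t) if t.elapsed() < self.reset_after => "check",
            _ => "copy",
        }
    }
}

fn main() {
    let mut state = CopyState::new();
    assert_eq!(state.icon(), "copy");
    state.on_click();
    assert_eq!(state.icon(), "check");
}
```

In the real component the reset would presumably happen via a spawned timer task that flips the state back and re-renders, rather than being checked lazily as here.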

Release Notes:

- N/A
2025-12-29 11:01:19 -03:00
Finn Evers
db221ca72d Add workflow to rollout changes to the extension organization (#45579)
This PR adds a workflow that we can utilize to rollout changes to the CI
workflows for the `zed-extensions` organization.

Release Notes:

- N/A
2025-12-29 14:51:17 +01:00
Finn Evers
1d006a8cb0 extension_ci: Specify needed permissions for jobs (#45542)
GitHub flags these as security vulnerabilities. Hence, this PR specifies
the needed permissions for the workflows used in the `zed-extensions`
organization.
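For context, job-level token scoping in GitHub Actions looks roughly like this (the specific permission keys here are illustrative; the exact scopes these workflows use may differ):

```yaml
jobs:
  bump_version:
    # Grant the GITHUB_TOKEN only what this job needs, instead of the
    # broad default permissions that GitHub flags as a security risk.
    permissions:
      contents: write
      pull-requests: write
    runs-on: ubuntu-latest
```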

Release Notes:

- N/A
2025-12-29 13:19:22 +00:00
Rocky Shi
aaab9f6960 Add a button to copy diagnostic messages from the hover popover to the clipboard (#45625)
Closes https://github.com/zed-industries/zed/issues/45346

Release Notes:

- Added a button to copy diagnostic messages from the hover popover

Screenshot:

<img width="842" height="360" alt="image"
src="https://github.com/user-attachments/assets/9c00fba5-82aa-4179-95b1-afd5c1173889"
/>

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-12-29 12:31:29 +00:00
Ahmed Hesham Abdelkader
209cf0a48f ui: Make long Callout descriptions scrollable (#45792)
Fixes #43306

Long error messages from LLM providers in the Agent Panel were not
scrollable, making it impossible to read the full error content.

Changes:
- Add max_h_48() and overflow_y_scroll() to description containers
- Add element IDs required for scroll functionality
- Add min_h_0() and overflow_hidden() to parent flex container
- Add component preview example demonstrating scrollable content

Release Notes:

- Fixed long error messages in Agent Panel being unreadable by making
them scrollable
([#43306](https://github.com/zed-industries/zed/issues/43306)).

---------

Co-authored-by: Danilo Leal <daniloleal09@gmail.com>
2025-12-29 12:03:50 +00:00
Murilo Cunha
260691c99c docs: Specify that dev containers are currently preview-only (#45816)
Thanks for the cool project and for making it open source! I started using
Zed recently and I really enjoy it.

I made a tiny addition to the dev container docs to specify the version. I
wasn't able to get it to work as shown in the
[docs](https://zed.dev/docs/dev-containers) (it should "just work"). The
feature was introduced recently in [PR
44442](https://github.com/zed-industries/zed/pull/44442) and is only
available as of v0.218 (currently still in preview), while I was still
on the latest stable version.

So I thought of opening a small PR 😊 

Thanks again for the awesome project!

Release Notes:

- N/A

---------

Co-authored-by: Danilo Leal <67129314+danilo-leal@users.noreply.github.com>
2025-12-29 11:34:52 +00:00
Danilo Leal
9e88f3f33c agent_ui: Fix issues with mention crease (#45683)
This PR introduces the `MentionCrease` component, aimed at solving two
issues with mention creases in the agent panel:
- Previously, the mention crease used a button with a regular size, which is
bigger than the default buffer font line height. That made the crease look
clipped, and creases overlapped one another when multiple lines of creases
were rendered on top of each other. `MentionCrease` uses the window's line
height to set the button height, with a small one-pixel vertical padding
just for a bit of spacing.
- Previously, given that the crease used a `Label`, its font size wouldn't
scale if you changed the `agent_buffer_font_size` setting. Now
`MentionCrease` uses that font size value, which makes regular text and
crease text grow together as you'd expect.

Release Notes:

- agent: Fix a bug where mention creases didn't scale with
`agent_buffer_font_size` and got clipped/jumbled when rendered one above
the other.
2025-12-29 08:27:43 -03:00
Gabe Shahbazian
2cad6c8ef1 svg_preview: Detect SVG in single-file mode by checking file name (#45747)
Release Notes:

- Use the file's name for "is SVG" checks so that SVG previews and the toolbar
button work in single-file mode.
2025-12-27 23:29:36 +00:00
Ben Brandt
bc24ffe863 acp: Beta support for Session Config Options (#45751)
Adds beta support for the ACP draft feature of Session Config Options:
https://agentclientprotocol.com/rfds/session-config-options

Release Notes:

- N/A
2025-12-27 22:10:37 +00:00
Kirill Bulatov
1e4a970ae2 Bump glsl to 0.2.0 (#45744)
Includes https://github.com/zed-industries/zed/pull/45727

Release Notes:

- N/A
2025-12-27 19:09:52 +00:00
Thinkseal
3e656a0911 Add colorized brackets support for GLSL (#45727)
Closes #45674

before:
<img width="482" height="271" alt="Screenshot 2025-12-26 at 21 05 50"
src="https://github.com/user-attachments/assets/d1bba3e1-04f3-4b8d-a187-8da80cee7e22"
/>
after:
<img width="481" height="244" alt="Screenshot 2025-12-26 at 21 06 23"
src="https://github.com/user-attachments/assets/4778c779-7082-4701-821e-b825b05b4097"
/>


Release Notes:

- Fixes colorized brackets for the GLSL extension

---------

Co-authored-by: Finn Evers <finn.evers@outlook.de>
2025-12-27 18:54:23 +00:00
Finn Evers
57ea23d161 editor: Fix active line number regressions with relative counting (#45741)
Follow-up to https://github.com/zed-industries/zed/pull/45164, which
caused the active line number to always be `0` instead of the actual
current line number. No release notes since it's only on nightly.

Release Notes:

- N/A
2025-12-27 16:26:08 +00:00
Kirill Bulatov
a50c5b2c10 Fix Zed OOM-ing when macOS file descriptors become invalid (#45669)
Closes https://github.com/zed-industries/zed/issues/42845

Repro steps:
https://github.com/zed-industries/zed/issues/42845#issuecomment-3687413958
Initial investigation and Zed memory trace:
https://github.com/zed-industries/zed/issues/42845#issuecomment-3687877977

The PR consists of 3 commits:
*
[first](732d308c8d)
adds cosmetic fixes to remove backtraces from logs yet again and to print
paths in quotes, as file descriptors may return empty paths.
It also stubs the cause of the OOM in the project panel: that code traversed
all worktrees in `for worktree_snapshot in visible_worktrees`, "accepted"
the one with empty paths, and never called `entry_iter.advance();` in the "no
file name found for the worktree" case, thus looping endlessly and
bloating memory quite fast.

*
[second](7ebfe5da2f)
adds something that resembles a fix: `fn current_path` on macOS used the
file handle to re-fetch the worktree root file path when worktree root
canonicalization failed.
What's odd is that `libc::fcntl` returns `0` when the external
volume is not mounted, resulting in an `""` path string that is
propagated all the way up.

*
[third](1a7560cef3)
moves the fix down to the platform-related FS implementations

The "fix" now checks the only usage of this method, inside `async fn
process_events`, for an empty path and bails if that is the case.
I am not sure what a better fix would be, but this stops the memory leaks,
and given how bad the situation is now, it seems OK to merge for now with the
`TODO` comment for more clever people to fix properly later.
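A minimal sketch of that bail-out guard (the names here are illustrative, not the actual Zed code):

```rust
use std::path::Path;

// Sketch of the guard described above: bail out of filesystem event
// processing when the re-fetched worktree root path comes back empty,
// rather than looping over entries that can never resolve.
fn should_process_events(current_path: &Path) -> bool {
    // TODO: a better fix would surface a "disconnected" state to the UI.
    !current_path.as_os_str().is_empty()
}

fn main() {
    assert!(should_process_events(Path::new("/Volumes/share/project")));
    assert!(!should_process_events(Path::new("")));
}
```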

----------------

Now, when I disconnect the SMB share and reconnect it again, Zed stops
displaying any files in the project tree except the ones opened as editors.

As before, at first, when the share is unmounted, Zed fails to save any
changes because of the timeouts.

Later, when the share is reconnected, macOS Finder still hangs, but Zed
starts to react to saves, yet still only shows the files that are open as
editors.
The files can be edited and saved from then on.

Later, when Finder finally stops hanging and indicates that the share is
fully mounted, the rest of the file structure reappears in the project
panel, and all file saves are propagated, hence they can be observed in the
share in Finder.

It feels like one good improvement to add on top would be a "disconnected"
indicator that clearly shows that the file is not properly handled by
the OS.
That requires many more changes and more thought, as nothing like it exists
in Zed yet, hence it is not done here.

Release Notes:

- Fixed Zed OOM-ing when macOS file descriptors become invalid
2025-12-26 18:47:53 +00:00
Agus Zubiaga
f1b723973b mac: Delay initial find pasteboard search until ⌘G or ⌘F (#45605)
Follow up to https://github.com/zed-industries/zed/pull/45311. Instead
of searching for the string in the find pasteboard as soon as the pane
is focused, we will now wait until the search bar is either deployed or
`Select{Next|Prev}Match` is triggered.

Release Notes:

- N/A
2025-12-26 15:22:57 -03:00
Nathan Sobo
a7ce677ac3 Optimize terminal rendering when clipped by parent containers (#45537)
This brings the terminal element's viewport culling in line with the
editor optimization in PR #44995 and the fix in PR #45077.

## Problem

When a terminal is inside a scrollable container (e.g., the Agent Panel
thread view), it would render ALL cells during prepaint, even when the
terminal was entirely outside the viewport. This caused unnecessary CPU
usage when multiple terminal tool outputs existed in the Agent Panel.

## Solution

Calculate the intersection of the terminal's bounds with the current
content_mask (the visible viewport after all parent clipping). If the
intersection has zero area, skip all cell processing entirely.

### Three code paths

1. **Offscreen** (`intersection.size <= 0`): Early exit, process 0 cells
2. **Fully visible** (`intersection == bounds`): Fast path, stream cells
directly (no allocation)
3. **Partially clipped**: Group cells by line, skip/take visible rows
only
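The zero-area check can be sketched in plain Rust with a hypothetical `Rect` type standing in for gpui's bounds/content-mask types:

```rust
// Hypothetical rectangle type; gpui's actual Bounds/ContentMask differ.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Rect { x: f32, y: f32, w: f32, h: f32 }

// Intersect the terminal's bounds with the visible content mask,
// clamping negative extents to zero.
fn intersect(a: Rect, b: Rect) -> Rect {
    let x = a.x.max(b.x);
    let y = a.y.max(b.y);
    let w = (a.x + a.w).min(b.x + b.w) - x;
    let h = (a.y + a.h).min(b.y + b.h) - y;
    Rect { x, y, w: w.max(0.0), h: h.max(0.0) }
}

// A zero-area intersection means the terminal is fully offscreen,
// so prepaint can skip all cell processing.
fn is_offscreen(bounds: Rect, mask: Rect) -> bool {
    let i = intersect(bounds, mask);
    i.w <= 0.0 || i.h <= 0.0
}

fn main() {
    let mask = Rect { x: 0.0, y: 0.0, w: 800.0, h: 600.0 };
    let below = Rect { x: 0.0, y: 700.0, w: 800.0, h: 200.0 };
    assert!(is_offscreen(below, mask)); // early exit: process 0 cells
    let partial = Rect { x: 0.0, y: 500.0, w: 800.0, h: 200.0 };
    assert!(!is_offscreen(partial, mask)); // partially clipped path
}
```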

### Key insight: filter by screen position, not buffer coordinates

The previous approach tried to filter cells by `cell.point.line`
(terminal buffer coordinates), which breaks in Scrollable mode where
cells can have negative line numbers for scrollback history.

The new approach filters by **screen position** using
`chunk_by(line).skip(N).take(M)`, which works regardless of the actual
line numbers because we're filtering on enumerated line group index.

## Testing

Added comprehensive unit tests for:
- Screen-position filtering with positive lines (Inline mode)
- Screen-position filtering with negative lines (Scrollable mode with
scrollback)
- Edge cases (skip all, positioning math)
- Unified filtering works for both modes

Manually verified:
- Terminal fully visible (no clipping) ✓
- Terminal clipped from top/bottom ✓
- Terminal completely outside viewport ✓
- Scrollable terminals with scrollback history ✓
- Selection/interaction still works ✓

Release Notes:

- Improved Agent Panel performance when terminals are scrolled
offscreen.

/cc @as-cii
2025-12-26 11:20:40 -07:00
Finn Evers
ed67f246cb Fix formatting in json_schema_store.rs (#45698)
Some overly long lines make `rustfmt` unable to format the
file, which in turn makes editing and working with this file rather
hard. This PR fixes that.

Release Notes:

- N/A
2025-12-26 18:10:33 +00:00
Marshall Bowers
93f29326c4 docs: Update link to Tree-sitter Query extension (#45697)
Release Notes:

- N/A
2025-12-26 17:53:40 +00:00
Haojian Wu
85f4681299 docs: Link to Tree-sitter query extension (#45682)
Release Notes:

- N/A
2025-12-26 17:43:18 +00:00
Finn Evers
741c5d5010 Revive "good first issue" notifier (#45679)
We adjusted the labels some time ago, but never took care of the `good
first issue` notifier that posts good first issues to Discord.

This PR adjusts the label accordingly so that it notifies people again.

Release Notes:

- N/A
2025-12-26 10:16:32 +00:00
Marco Mihai Condrache
f03987fb68 search: Remove intermediate allocation (#45633)
Release Notes:

- N/A

---------

Signed-off-by: Marco Mihai Condrache <52580954+marcocondrache@users.noreply.github.com>
2025-12-25 20:29:17 +01:00
Teoh Han Hui
ca47822667 Associate devcontainer.json with JSONC language (#45593)
Release Notes:

- N/A
2025-12-23 21:23:28 +00:00
Danilo Leal
a34fe06bb1 agent_ui: Allow "token reached" callout to be dismissed (#45595)
It was previously impossible to dismiss the "token usage
reaching/reached the limit" callout.

<img width="500" height="392" alt="Screenshot 2025-12-23 at 5  49@2x"
src="https://github.com/user-attachments/assets/7fd8b126-dd3f-430b-9fea-ca05c73e5643"
/>

Release Notes:

- N/A
2025-12-23 21:14:58 +00:00
Kirill Bulatov
0ce484e66c Do not trust Docker hosts by default (#45587)
It's still possible to leak secrets by spawning odd MCP/LSP servers from
`.zed/settings.json`.

Release Notes:

- N/A
2025-12-23 19:27:09 +00:00
90 changed files with 5115 additions and 1140 deletions

View File

@@ -66,7 +66,7 @@ jobs:
if: |-
(github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions') &&
(inputs.force-bump == 'true' || needs.check_bump_needed.outputs.needs_bump == 'true')
runs-on: namespace-profile-8x16-ubuntu-2204
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: generate-token
name: extension_bump::generate_token
@@ -119,7 +119,7 @@ jobs:
needs:
- check_bump_needed
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions') && github.event_name == 'push' && github.ref == 'refs/heads/main' && needs.check_bump_needed.outputs.needs_bump == 'false'
runs-on: namespace-profile-8x16-ubuntu-2204
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: generate-token
name: extension_bump::generate_token

View File

@@ -13,7 +13,7 @@ on:
jobs:
create_release:
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-8x16-ubuntu-2204
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: generate-token
name: extension_bump::generate_token

View File

@@ -0,0 +1,106 @@
# Generated from xtask::workflows::extension_workflow_rollout
# Rebuild with `cargo xtask workflows`.
name: extension_workflow_rollout
env:
CARGO_TERM_COLOR: always
on:
workflow_dispatch: {}
jobs:
fetch_extension_repos:
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: list-repos
name: extension_workflow_rollout::fetch_extension_repos::get_repositories
uses: actions/github-script@v7
with:
script: |
const repos = await github.paginate(github.rest.repos.listForOrg, {
org: 'zed-extensions',
type: 'public',
per_page: 100,
});
const filteredRepos = repos
.filter(repo => !repo.archived)
.filter(repo => repo.name !== 'workflows' && repo.name !== 'material-icon-theme')
.map(repo => repo.name);
console.log(`Found ${filteredRepos.length} extension repos`);
return filteredRepos;
result-encoding: json
outputs:
repos: ${{ steps.list-repos.outputs.result }}
timeout-minutes: 5
rollout_workflows_to_extension:
needs:
- fetch_extension_repos
if: needs.fetch_extension_repos.outputs.repos != '[]'
runs-on: namespace-profile-2x4-ubuntu-2404
strategy:
matrix:
repo: ${{ fromJson(needs.fetch_extension_repos.outputs.repos) }}
fail-fast: false
max-parallel: 5
steps:
- id: generate-token
name: extension_bump::generate_token
uses: actions/create-github-app-token@v2
with:
app-id: ${{ secrets.ZED_ZIPPY_APP_ID }}
private-key: ${{ secrets.ZED_ZIPPY_APP_PRIVATE_KEY }}
owner: zed-extensions
repositories: ${{ matrix.repo }}
permission-pull-requests: write
permission-contents: write
permission-workflows: write
- name: checkout_zed_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
path: zed
- name: steps::checkout_repo_with_token
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
token: ${{ steps.generate-token.outputs.token }}
repository: zed-extensions/${{ matrix.repo }}
path: extension
- name: extension_workflow_rollout::rollout_workflows_to_extension::copy_workflow_files
run: |
mkdir -p extension/.github/workflows
cp zed/extensions/workflows/shared/*.yml extension/.github/workflows/
shell: bash -euxo pipefail {0}
- id: short-sha
name: extension_workflow_rollout::rollout_workflows_to_extension::get_short_sha
run: |
echo "sha_short=$(git rev-parse --short HEAD)" >> "$GITHUB_OUTPUT"
shell: bash -euxo pipefail {0}
working-directory: zed
- id: create-pr
name: extension_workflow_rollout::rollout_workflows_to_extension::create_pull_request
uses: peter-evans/create-pull-request@v7
with:
path: extension
title: Update CI workflows to zed@${{ steps.short-sha.outputs.sha_short }}
body: |
This PR updates the CI workflow files from the main Zed repository
based on the commit zed-industries/zed@${{ github.sha }}
commit-message: Update CI workflows to zed@${{ steps.short-sha.outputs.sha_short }}
branch: update-workflows
committer: zed-zippy[bot] <234243425+zed-zippy[bot]@users.noreply.github.com>
author: zed-zippy[bot] <234243425+zed-zippy[bot]@users.noreply.github.com>
base: main
delete-branch: true
token: ${{ steps.generate-token.outputs.token }}
sign-commits: true
- name: extension_workflow_rollout::rollout_workflows_to_extension::enable_auto_merge
run: |
PR_NUMBER="${{ steps.create-pr.outputs.pull-request-number }}"
if [ -n "$PR_NUMBER" ]; then
cd extension
gh pr merge "$PR_NUMBER" --auto --squash
fi
shell: bash -euxo pipefail {0}
env:
GH_TOKEN: ${{ steps.generate-token.outputs.token }}
timeout-minutes: 10

View File

@@ -6,7 +6,7 @@ on:
jobs:
handle-good-first-issue:
if: github.event.label.name == 'good first issue' && github.repository_owner == 'zed-industries'
if: github.event.label.name == '.contrib/good first issue' && github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
steps:

Cargo.lock generated
View File

@@ -268,6 +268,7 @@ dependencies = [
"client",
"collections",
"env_logger 0.11.8",
"feature_flags",
"fs",
"futures 0.3.31",
"gpui",
@@ -5253,6 +5254,7 @@ dependencies = [
"text",
"thiserror 2.0.17",
"time",
"toml 0.8.23",
"ui",
"util",
"uuid",
@@ -20964,7 +20966,7 @@ dependencies = [
[[package]]
name = "zed_glsl"
version = "0.1.0"
version = "0.2.0"
dependencies = [
"zed_extension_api 0.1.0",
]

View File

@@ -884,6 +884,7 @@ pub enum AcpThreadEvent {
Refusal,
AvailableCommandsUpdated(Vec<acp::AvailableCommand>),
ModeUpdated(acp::SessionModeId),
ConfigOptionsUpdated(Vec<acp::SessionConfigOption>),
}
impl EventEmitter<AcpThreadEvent> for AcpThread {}
@@ -1193,6 +1194,10 @@ impl AcpThread {
current_mode_id,
..
}) => cx.emit(AcpThreadEvent::ModeUpdated(current_mode_id)),
acp::SessionUpdate::ConfigOptionUpdate(acp::ConfigOptionUpdate {
config_options,
..
}) => cx.emit(AcpThreadEvent::ConfigOptionsUpdated(config_options)),
_ => {}
}
Ok(())

View File

@@ -86,6 +86,14 @@ pub trait AgentConnection {
None
}
fn session_config_options(
&self,
_session_id: &acp::SessionId,
_cx: &App,
) -> Option<Rc<dyn AgentSessionConfigOptions>> {
None
}
fn into_any(self: Rc<Self>) -> Rc<dyn Any>;
}
@@ -125,6 +133,26 @@ pub trait AgentSessionModes {
fn set_mode(&self, mode: acp::SessionModeId, cx: &mut App) -> Task<Result<()>>;
}
pub trait AgentSessionConfigOptions {
/// Get all current config options with their state
fn config_options(&self) -> Vec<acp::SessionConfigOption>;
/// Set a config option value
/// Returns the full updated list of config options
fn set_config_option(
&self,
config_id: acp::SessionConfigId,
value: acp::SessionConfigValueId,
cx: &mut App,
) -> Task<Result<Vec<acp::SessionConfigOption>>>;
/// Whenever the config options are updated the receiver will be notified.
/// Optional for agents that don't update their config options dynamically.
fn watch(&self, _cx: &mut App) -> Option<watch::Receiver<()>> {
None
}
}
#[derive(Debug)]
pub struct AuthRequired {
pub description: Option<String>,

View File

@@ -4,22 +4,20 @@ use std::{
fmt::Display,
rc::{Rc, Weak},
sync::Arc,
time::Duration,
};
use agent_client_protocol as acp;
use collections::HashMap;
use gpui::{
App, ClipboardItem, Empty, Entity, EventEmitter, FocusHandle, Focusable, Global, ListAlignment,
ListState, StyleRefinement, Subscription, Task, TextStyleRefinement, Window, actions, list,
prelude::*,
App, Empty, Entity, EventEmitter, FocusHandle, Focusable, Global, ListAlignment, ListState,
StyleRefinement, Subscription, Task, TextStyleRefinement, Window, actions, list, prelude::*,
};
use language::LanguageRegistry;
use markdown::{CodeBlockRenderer, Markdown, MarkdownElement, MarkdownStyle};
use project::Project;
use settings::Settings;
use theme::ThemeSettings;
use ui::{Tooltip, WithScrollbar, prelude::*};
use ui::{CopyButton, Tooltip, WithScrollbar, prelude::*};
use util::ResultExt as _;
use workspace::{
Item, ItemHandle, ToolbarItemEvent, ToolbarItemLocation, ToolbarItemView, Workspace,
@@ -544,15 +542,11 @@ impl Render for AcpTools {
pub struct AcpToolsToolbarItemView {
acp_tools: Option<Entity<AcpTools>>,
just_copied: bool,
}
impl AcpToolsToolbarItemView {
pub fn new() -> Self {
Self {
acp_tools: None,
just_copied: false,
}
Self { acp_tools: None }
}
}
@@ -572,37 +566,14 @@ impl Render for AcpToolsToolbarItemView {
h_flex()
.gap_2()
.child({
let acp_tools = acp_tools.clone();
IconButton::new(
"copy_all_messages",
if self.just_copied {
IconName::Check
} else {
IconName::Copy
},
)
.icon_size(IconSize::Small)
.tooltip(Tooltip::text(if self.just_copied {
"Copied!"
} else {
"Copy All Messages"
}))
.disabled(!has_messages)
.on_click(cx.listener(move |this, _, _window, cx| {
if let Some(content) = acp_tools.read(cx).serialize_observed_messages() {
cx.write_to_clipboard(ClipboardItem::new_string(content));
let message = acp_tools
.read(cx)
.serialize_observed_messages()
.unwrap_or_default();
this.just_copied = true;
cx.spawn(async move |this, cx| {
cx.background_executor().timer(Duration::from_secs(2)).await;
this.update(cx, |this, cx| {
this.just_copied = false;
cx.notify();
})
})
.detach();
}
}))
CopyButton::new(message)
.tooltip_label("Copy All Messages")
.disabled(!has_messages)
})
.child(
IconButton::new("clear_messages", IconName::Trash)

View File

@@ -21,6 +21,7 @@ acp_tools.workspace = true
acp_thread.workspace = true
action_log.workspace = true
agent-client-protocol.workspace = true
feature_flags.workspace = true
anyhow.workspace = true
async-trait.workspace = true
client.workspace = true

View File

@@ -4,6 +4,7 @@ use action_log::ActionLog;
use agent_client_protocol::{self as acp, Agent as _, ErrorCode};
use anyhow::anyhow;
use collections::HashMap;
use feature_flags::{AcpBetaFeatureFlag, FeatureFlagAppExt as _};
use futures::AsyncBufReadExt as _;
use futures::io::BufReader;
use project::Project;
@@ -38,6 +39,7 @@ pub struct AcpConnection {
agent_capabilities: acp::AgentCapabilities,
default_mode: Option<acp::SessionModeId>,
default_model: Option<acp::ModelId>,
default_config_options: HashMap<String, String>,
root_dir: PathBuf,
// NB: Don't move this into the wait_task, since we need to ensure the process is
// killed on drop (setting kill_on_drop on the command seems to not always work).
@@ -47,11 +49,29 @@ pub struct AcpConnection {
_stderr_task: Task<Result<()>>,
}
struct ConfigOptions {
config_options: Rc<RefCell<Vec<acp::SessionConfigOption>>>,
tx: Rc<RefCell<watch::Sender<()>>>,
rx: watch::Receiver<()>,
}
impl ConfigOptions {
fn new(config_options: Rc<RefCell<Vec<acp::SessionConfigOption>>>) -> Self {
let (tx, rx) = watch::channel(());
Self {
config_options,
tx: Rc::new(RefCell::new(tx)),
rx,
}
}
}
pub struct AcpSession {
thread: WeakEntity<AcpThread>,
suppress_abort_err: bool,
models: Option<Rc<RefCell<acp::SessionModelState>>>,
session_modes: Option<Rc<RefCell<acp::SessionModeState>>>,
config_options: Option<ConfigOptions>,
}
pub async fn connect(
@@ -60,6 +80,7 @@ pub async fn connect(
root_dir: &Path,
default_mode: Option<acp::SessionModeId>,
default_model: Option<acp::ModelId>,
default_config_options: HashMap<String, String>,
is_remote: bool,
cx: &mut AsyncApp,
) -> Result<Rc<dyn AgentConnection>> {
@@ -69,6 +90,7 @@ pub async fn connect(
root_dir,
default_mode,
default_model,
default_config_options,
is_remote,
cx,
)
@@ -85,6 +107,7 @@ impl AcpConnection {
root_dir: &Path,
default_mode: Option<acp::SessionModeId>,
default_model: Option<acp::ModelId>,
default_config_options: HashMap<String, String>,
is_remote: bool,
cx: &mut AsyncApp,
) -> Result<Self> {
@@ -217,6 +240,7 @@ impl AcpConnection {
agent_capabilities: response.agent_capabilities,
default_mode,
default_model,
default_config_options,
_io_task: io_task,
_wait_task: wait_task,
_stderr_task: stderr_task,
@@ -256,6 +280,7 @@ impl AgentConnection for AcpConnection {
let sessions = self.sessions.clone();
let default_mode = self.default_mode.clone();
let default_model = self.default_model.clone();
let default_config_options = self.default_config_options.clone();
let cwd = cwd.to_path_buf();
let context_server_store = project.read(cx).context_server_store().read(cx);
let mcp_servers = if project.read(cx).is_local() {
@@ -322,8 +347,21 @@ impl AgentConnection for AcpConnection {
}
})?;
let modes = response.modes.map(|modes| Rc::new(RefCell::new(modes)));
let models = response.models.map(|models| Rc::new(RefCell::new(models)));
let use_config_options = cx.update(|cx| cx.has_flag::<AcpBetaFeatureFlag>())?;
// Config options take precedence over legacy modes/models
let (modes, models, config_options) = if use_config_options && let Some(opts) = response.config_options {
(
None,
None,
Some(Rc::new(RefCell::new(opts))),
)
} else {
// Fall back to legacy modes/models
let modes = response.modes.map(|modes| Rc::new(RefCell::new(modes)));
let models = response.models.map(|models| Rc::new(RefCell::new(models)));
(modes, models, None)
};
if let Some(default_mode) = default_mode {
if let Some(modes) = modes.as_ref() {
@@ -411,6 +449,92 @@ impl AgentConnection for AcpConnection {
}
}
if let Some(config_opts) = config_options.as_ref() {
let defaults_to_apply: Vec<_> = {
let config_opts_ref = config_opts.borrow();
config_opts_ref
.iter()
.filter_map(|config_option| {
let default_value = default_config_options.get(&*config_option.id.0)?;
let is_valid = match &config_option.kind {
acp::SessionConfigKind::Select(select) => match &select.options {
acp::SessionConfigSelectOptions::Ungrouped(options) => {
options.iter().any(|opt| &*opt.value.0 == default_value.as_str())
}
acp::SessionConfigSelectOptions::Grouped(groups) => groups
.iter()
.any(|g| g.options.iter().any(|opt| &*opt.value.0 == default_value.as_str())),
_ => false,
},
_ => false,
};
if is_valid {
let initial_value = match &config_option.kind {
acp::SessionConfigKind::Select(select) => {
Some(select.current_value.clone())
}
_ => None,
};
Some((config_option.id.clone(), default_value.clone(), initial_value))
} else {
log::warn!(
"`{}` is not a valid value for config option `{}` in {}",
default_value,
config_option.id.0,
name
);
None
}
})
.collect()
};
for (config_id, default_value, initial_value) in defaults_to_apply {
cx.spawn({
let default_value_id = acp::SessionConfigValueId::new(default_value.clone());
let session_id = response.session_id.clone();
let config_id_clone = config_id.clone();
let config_opts = config_opts.clone();
let conn = conn.clone();
async move |_| {
let result = conn
.set_session_config_option(
acp::SetSessionConfigOptionRequest::new(
session_id,
config_id_clone.clone(),
default_value_id,
),
)
.await
.log_err();
if result.is_none() {
if let Some(initial) = initial_value {
let mut opts = config_opts.borrow_mut();
if let Some(opt) = opts.iter_mut().find(|o| o.id == config_id_clone) {
if let acp::SessionConfigKind::Select(select) =
&mut opt.kind
{
select.current_value = initial;
}
}
}
}
}
})
.detach();
let mut opts = config_opts.borrow_mut();
if let Some(opt) = opts.iter_mut().find(|o| o.id == config_id) {
if let acp::SessionConfigKind::Select(select) = &mut opt.kind {
select.current_value = acp::SessionConfigValueId::new(default_value);
}
}
}
}
let session_id = response.session_id;
let action_log = cx.new(|_| ActionLog::new(project.clone()))?;
let thread = cx.new(|cx| {
@@ -432,6 +556,7 @@ impl AgentConnection for AcpConnection {
suppress_abort_err: false,
session_modes: modes,
models,
config_options: config_options.map(|opts| ConfigOptions::new(opts))
};
sessions.borrow_mut().insert(session_id, session);
@@ -567,6 +692,25 @@ impl AgentConnection for AcpConnection {
}
}
fn session_config_options(
&self,
session_id: &acp::SessionId,
_cx: &App,
) -> Option<Rc<dyn acp_thread::AgentSessionConfigOptions>> {
let sessions = self.sessions.borrow();
let session = sessions.get(session_id)?;
let config_opts = session.config_options.as_ref()?;
Some(Rc::new(AcpSessionConfigOptions {
session_id: session_id.clone(),
connection: self.connection.clone(),
state: config_opts.config_options.clone(),
watch_tx: config_opts.tx.clone(),
watch_rx: config_opts.rx.clone(),
}) as _)
}
fn into_any(self: Rc<Self>) -> Rc<dyn Any> {
self
}
@@ -685,6 +829,49 @@ impl acp_thread::AgentModelSelector for AcpModelSelector {
}
}
struct AcpSessionConfigOptions {
session_id: acp::SessionId,
connection: Rc<acp::ClientSideConnection>,
state: Rc<RefCell<Vec<acp::SessionConfigOption>>>,
watch_tx: Rc<RefCell<watch::Sender<()>>>,
watch_rx: watch::Receiver<()>,
}
impl acp_thread::AgentSessionConfigOptions for AcpSessionConfigOptions {
fn config_options(&self) -> Vec<acp::SessionConfigOption> {
self.state.borrow().clone()
}
fn set_config_option(
&self,
config_id: acp::SessionConfigId,
value: acp::SessionConfigValueId,
cx: &mut App,
) -> Task<Result<Vec<acp::SessionConfigOption>>> {
let connection = self.connection.clone();
let session_id = self.session_id.clone();
let state = self.state.clone();
let watch_tx = self.watch_tx.clone();
cx.foreground_executor().spawn(async move {
let response = connection
.set_session_config_option(acp::SetSessionConfigOptionRequest::new(
session_id, config_id, value,
))
.await?;
*state.borrow_mut() = response.config_options.clone();
watch_tx.borrow_mut().send(()).ok();
Ok(response.config_options)
})
}
fn watch(&self, _cx: &mut App) -> Option<watch::Receiver<()>> {
Some(self.watch_rx.clone())
}
}
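`AcpSessionConfigOptions` pairs shared mutable state (`Rc<RefCell<…>>`) with a `watch` channel so that views can observe config changes. A minimal stand-in for that pattern, using a bumped version counter in place of the async `watch` channel (illustrative only; the types here are simplified, not the acp ones):

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Shared config state plus a change counter that plays the role of the
// watch channel: every successful set bumps the version so observers
// know to re-read the options.
#[derive(Clone)]
struct ConfigState {
    options: Rc<RefCell<Vec<(String, String)>>>, // (config id, current value)
    version: Rc<RefCell<u64>>,
}

impl ConfigState {
    fn new(options: Vec<(String, String)>) -> Self {
        Self {
            options: Rc::new(RefCell::new(options)),
            version: Rc::new(RefCell::new(0)),
        }
    }

    // Mirrors `set_config_option`: update the shared state, then notify.
    fn set(&self, id: &str, value: &str) {
        if let Some(entry) = self.options.borrow_mut().iter_mut().find(|(i, _)| i == id) {
            entry.1 = value.to_string();
            *self.version.borrow_mut() += 1;
        }
    }

    fn current(&self, id: &str) -> Option<String> {
        self.options.borrow().iter().find(|(i, _)| i == id).map(|(_, v)| v.clone())
    }

    fn version(&self) -> u64 {
        *self.version.borrow()
    }
}
```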
struct ClientDelegate {
sessions: Rc<RefCell<HashMap<acp::SessionId, AcpSession>>>,
cx: AsyncApp,
@@ -778,6 +965,21 @@ impl acp::Client for ClientDelegate {
}
}
if let acp::SessionUpdate::ConfigOptionUpdate(acp::ConfigOptionUpdate {
config_options,
..
}) = &notification.update
{
if let Some(opts) = &session.config_options {
*opts.config_options.borrow_mut() = config_options.clone();
opts.tx.borrow_mut().send(()).ok();
} else {
log::error!(
"Got a `ConfigOptionUpdate` notification, but the agent didn't specify `config_options` during session setup."
);
}
}
// Clone so we can inspect meta both before and after handing off to the thread
let update_clone = notification.update.clone();

View File

@@ -4,15 +4,13 @@ mod codex;
mod custom;
mod gemini;
#[cfg(any(test, feature = "test-support"))]
pub mod e2e_tests;
pub use claude::*;
use client::ProxySettings;
pub use codex::*;
use collections::{HashMap, HashSet};
pub use custom::*;
use fs::Fs;
pub use gemini::*;
@@ -67,7 +65,7 @@ pub trait AgentServer: Send {
fn into_any(self: Rc<Self>) -> Rc<dyn Any>;
fn default_mode(&self, _cx: &App) -> Option<agent_client_protocol::SessionModeId> {
None
}
@@ -79,7 +77,7 @@ pub trait AgentServer: Send {
) {
}
fn default_model(&self, _cx: &App) -> Option<agent_client_protocol::ModelId> {
None
}
@@ -95,6 +93,37 @@ pub trait AgentServer: Send {
HashSet::default()
}
fn default_config_option(&self, _config_id: &str, _cx: &App) -> Option<String> {
None
}
fn set_default_config_option(
&self,
_config_id: &str,
_value_id: Option<&str>,
_fs: Arc<dyn Fs>,
_cx: &mut App,
) {
}
fn favorite_config_option_value_ids(
&self,
_config_id: &agent_client_protocol::SessionConfigId,
_cx: &mut App,
) -> HashSet<agent_client_protocol::SessionConfigValueId> {
HashSet::default()
}
fn toggle_favorite_config_option_value(
&self,
_config_id: agent_client_protocol::SessionConfigId,
_value_id: agent_client_protocol::SessionConfigValueId,
_should_be_favorite: bool,
_fs: Arc<dyn Fs>,
_cx: &App,
) {
}
fn toggle_favorite_model(
&self,
_model_id: agent_client_protocol::ModelId,

View File

@@ -31,7 +31,7 @@ impl AgentServer for ClaudeCode {
ui::IconName::AiClaude
}
fn default_mode(&self, cx: &App) -> Option<acp::SessionModeId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).claude.clone()
});
@@ -52,7 +52,7 @@ impl AgentServer for ClaudeCode {
});
}
fn default_model(&self, cx: &App) -> Option<acp::ModelId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).claude.clone()
});
@@ -115,6 +115,97 @@ impl AgentServer for ClaudeCode {
});
}
fn default_config_option(&self, config_id: &str, cx: &App) -> Option<String> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).claude.clone()
});
settings
.as_ref()
.and_then(|s| s.default_config_options.get(config_id).cloned())
}
fn set_default_config_option(
&self,
config_id: &str,
value_id: Option<&str>,
fs: Arc<dyn Fs>,
cx: &mut App,
) {
let config_id = config_id.to_string();
let value_id = value_id.map(|s| s.to_string());
update_settings_file(fs, cx, move |settings, _| {
let config_options = &mut settings
.agent_servers
.get_or_insert_default()
.claude
.get_or_insert_default()
.default_config_options;
if let Some(value) = value_id.clone() {
config_options.insert(config_id.clone(), value);
} else {
config_options.remove(&config_id);
}
});
}
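`set_default_config_option` follows a simple insert-or-remove rule: `Some(value)` records a default in the settings file, `None` clears it. The same rule as a standalone helper over plain strings (a sketch, not the Zed settings types):

```rust
use std::collections::HashMap;

// `Some(value)` records a default for a config id; `None` clears it.
fn apply_default(
    defaults: &mut HashMap<String, String>,
    config_id: &str,
    value_id: Option<&str>,
) {
    match value_id {
        Some(value) => {
            defaults.insert(config_id.to_string(), value.to_string());
        }
        None => {
            defaults.remove(config_id);
        }
    }
}
```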
fn favorite_config_option_value_ids(
&self,
config_id: &acp::SessionConfigId,
cx: &mut App,
) -> HashSet<acp::SessionConfigValueId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).claude.clone()
});
settings
.as_ref()
.and_then(|s| s.favorite_config_option_values.get(config_id.0.as_ref()))
.map(|values| {
values
.iter()
.cloned()
.map(acp::SessionConfigValueId::new)
.collect()
})
.unwrap_or_default()
}
fn toggle_favorite_config_option_value(
&self,
config_id: acp::SessionConfigId,
value_id: acp::SessionConfigValueId,
should_be_favorite: bool,
fs: Arc<dyn Fs>,
cx: &App,
) {
let config_id = config_id.to_string();
let value_id = value_id.to_string();
update_settings_file(fs, cx, move |settings, _| {
let favorites = &mut settings
.agent_servers
.get_or_insert_default()
.claude
.get_or_insert_default()
.favorite_config_option_values;
let entry = favorites.entry(config_id.clone()).or_insert_with(Vec::new);
if should_be_favorite {
if !entry.iter().any(|v| v == &value_id) {
entry.push(value_id.clone());
}
} else {
entry.retain(|v| v != &value_id);
if entry.is_empty() {
favorites.remove(&config_id);
}
}
});
}
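The favorites bookkeeping in `toggle_favorite_config_option_value` adds a value id at most once and removes the whole map entry when its list becomes empty. Extracted as a pure helper over plain strings (illustrative sketch, not the settings types):

```rust
use std::collections::HashMap;

// Add-or-remove bookkeeping for favorite config option values.
fn toggle_favorite(
    favorites: &mut HashMap<String, Vec<String>>,
    config_id: &str,
    value_id: &str,
    should_be_favorite: bool,
) {
    let entry = favorites.entry(config_id.to_string()).or_default();
    if should_be_favorite {
        // Avoid duplicate entries when favoriting twice.
        if !entry.iter().any(|v| v == value_id) {
            entry.push(value_id.to_string());
        }
    } else {
        entry.retain(|v| v != value_id);
        // Drop the map entry entirely once no favorites remain.
        if entry.is_empty() {
            favorites.remove(config_id);
        }
    }
}
```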
fn connect(
&self,
root_dir: Option<&Path>,
@@ -128,6 +219,14 @@ impl AgentServer for ClaudeCode {
let extra_env = load_proxy_env(cx);
let default_mode = self.default_mode(cx);
let default_model = self.default_model(cx);
let default_config_options = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
.claude
.as_ref()
.map(|s| s.default_config_options.clone())
.unwrap_or_default()
});
cx.spawn(async move |cx| {
let (command, root_dir, login) = store
@@ -150,6 +249,7 @@ impl AgentServer for ClaudeCode {
root_dir.as_ref(),
default_mode,
default_model,
default_config_options,
is_remote,
cx,
)

View File

@@ -32,7 +32,7 @@ impl AgentServer for Codex {
ui::IconName::AiOpenAi
}
fn default_mode(&self, cx: &App) -> Option<acp::SessionModeId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).codex.clone()
});
@@ -53,7 +53,7 @@ impl AgentServer for Codex {
});
}
fn default_model(&self, cx: &App) -> Option<acp::ModelId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).codex.clone()
});
@@ -116,6 +116,97 @@ impl AgentServer for Codex {
});
}
fn default_config_option(&self, config_id: &str, cx: &App) -> Option<String> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).codex.clone()
});
settings
.as_ref()
.and_then(|s| s.default_config_options.get(config_id).cloned())
}
fn set_default_config_option(
&self,
config_id: &str,
value_id: Option<&str>,
fs: Arc<dyn Fs>,
cx: &mut App,
) {
let config_id = config_id.to_string();
let value_id = value_id.map(|s| s.to_string());
update_settings_file(fs, cx, move |settings, _| {
let config_options = &mut settings
.agent_servers
.get_or_insert_default()
.codex
.get_or_insert_default()
.default_config_options;
if let Some(value) = value_id.clone() {
config_options.insert(config_id.clone(), value);
} else {
config_options.remove(&config_id);
}
});
}
fn favorite_config_option_value_ids(
&self,
config_id: &acp::SessionConfigId,
cx: &mut App,
) -> HashSet<acp::SessionConfigValueId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings.get::<AllAgentServersSettings>(None).codex.clone()
});
settings
.as_ref()
.and_then(|s| s.favorite_config_option_values.get(config_id.0.as_ref()))
.map(|values| {
values
.iter()
.cloned()
.map(acp::SessionConfigValueId::new)
.collect()
})
.unwrap_or_default()
}
fn toggle_favorite_config_option_value(
&self,
config_id: acp::SessionConfigId,
value_id: acp::SessionConfigValueId,
should_be_favorite: bool,
fs: Arc<dyn Fs>,
cx: &App,
) {
let config_id = config_id.to_string();
let value_id = value_id.to_string();
update_settings_file(fs, cx, move |settings, _| {
let favorites = &mut settings
.agent_servers
.get_or_insert_default()
.codex
.get_or_insert_default()
.favorite_config_option_values;
let entry = favorites.entry(config_id.clone()).or_insert_with(Vec::new);
if should_be_favorite {
if !entry.iter().any(|v| v == &value_id) {
entry.push(value_id.clone());
}
} else {
entry.retain(|v| v != &value_id);
if entry.is_empty() {
favorites.remove(&config_id);
}
}
});
}
fn connect(
&self,
root_dir: Option<&Path>,
@@ -129,6 +220,14 @@ impl AgentServer for Codex {
let extra_env = load_proxy_env(cx);
let default_mode = self.default_mode(cx);
let default_model = self.default_model(cx);
let default_config_options = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
.codex
.as_ref()
.map(|s| s.default_config_options.clone())
.unwrap_or_default()
});
cx.spawn(async move |cx| {
let (command, root_dir, login) = store
@@ -152,6 +251,7 @@ impl AgentServer for Codex {
root_dir.as_ref(),
default_mode,
default_model,
default_config_options,
is_remote,
cx,
)

View File

@@ -30,7 +30,7 @@ impl AgentServer for CustomAgentServer {
IconName::Terminal
}
fn default_mode(&self, cx: &App) -> Option<acp::SessionModeId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
@@ -44,6 +44,86 @@ impl AgentServer for CustomAgentServer {
.and_then(|s| s.default_mode().map(acp::SessionModeId::new))
}
fn favorite_config_option_value_ids(
&self,
config_id: &acp::SessionConfigId,
cx: &mut App,
) -> HashSet<acp::SessionConfigValueId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
.custom
.get(&self.name())
.cloned()
});
settings
.as_ref()
.and_then(|s| s.favorite_config_option_values(config_id.0.as_ref()))
.map(|values| {
values
.iter()
.cloned()
.map(acp::SessionConfigValueId::new)
.collect()
})
.unwrap_or_default()
}
fn toggle_favorite_config_option_value(
&self,
config_id: acp::SessionConfigId,
value_id: acp::SessionConfigValueId,
should_be_favorite: bool,
fs: Arc<dyn Fs>,
cx: &App,
) {
let name = self.name();
let config_id = config_id.to_string();
let value_id = value_id.to_string();
update_settings_file(fs, cx, move |settings, _| {
let settings = settings
.agent_servers
.get_or_insert_default()
.custom
.entry(name.clone())
.or_insert_with(|| settings::CustomAgentServerSettings::Extension {
default_model: None,
default_mode: None,
favorite_models: Vec::new(),
default_config_options: Default::default(),
favorite_config_option_values: Default::default(),
});
match settings {
settings::CustomAgentServerSettings::Custom {
favorite_config_option_values,
..
}
| settings::CustomAgentServerSettings::Extension {
favorite_config_option_values,
..
} => {
let entry = favorite_config_option_values
.entry(config_id.clone())
.or_insert_with(Vec::new);
if should_be_favorite {
if !entry.iter().any(|v| v == &value_id) {
entry.push(value_id.clone());
}
} else {
entry.retain(|v| v != &value_id);
if entry.is_empty() {
favorite_config_option_values.remove(&config_id);
}
}
}
}
});
}
fn set_default_mode(&self, mode_id: Option<acp::SessionModeId>, fs: Arc<dyn Fs>, cx: &mut App) {
let name = self.name();
update_settings_file(fs, cx, move |settings, _| {
@@ -56,6 +136,8 @@ impl AgentServer for CustomAgentServer {
default_model: None,
default_mode: None,
favorite_models: Vec::new(),
default_config_options: Default::default(),
favorite_config_option_values: Default::default(),
});
match settings {
@@ -67,7 +149,7 @@ impl AgentServer for CustomAgentServer {
});
}
fn default_model(&self, cx: &App) -> Option<acp::ModelId> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
@@ -93,6 +175,8 @@ impl AgentServer for CustomAgentServer {
default_model: None,
default_mode: None,
favorite_models: Vec::new(),
default_config_options: Default::default(),
favorite_config_option_values: Default::default(),
});
match settings {
@@ -142,6 +226,8 @@ impl AgentServer for CustomAgentServer {
default_model: None,
default_mode: None,
favorite_models: Vec::new(),
default_config_options: Default::default(),
favorite_config_option_values: Default::default(),
});
let favorite_models = match settings {
@@ -164,6 +250,63 @@ impl AgentServer for CustomAgentServer {
});
}
fn default_config_option(&self, config_id: &str, cx: &App) -> Option<String> {
let settings = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
.custom
.get(&self.name())
.cloned()
});
settings
.as_ref()
.and_then(|s| s.default_config_option(config_id).map(|s| s.to_string()))
}
fn set_default_config_option(
&self,
config_id: &str,
value_id: Option<&str>,
fs: Arc<dyn Fs>,
cx: &mut App,
) {
let name = self.name();
let config_id = config_id.to_string();
let value_id = value_id.map(|s| s.to_string());
update_settings_file(fs, cx, move |settings, _| {
let settings = settings
.agent_servers
.get_or_insert_default()
.custom
.entry(name.clone())
.or_insert_with(|| settings::CustomAgentServerSettings::Extension {
default_model: None,
default_mode: None,
favorite_models: Vec::new(),
default_config_options: Default::default(),
favorite_config_option_values: Default::default(),
});
match settings {
settings::CustomAgentServerSettings::Custom {
default_config_options,
..
}
| settings::CustomAgentServerSettings::Extension {
default_config_options,
..
} => {
if let Some(value) = value_id.clone() {
default_config_options.insert(config_id.clone(), value);
} else {
default_config_options.remove(&config_id);
}
}
}
});
}
fn connect(
&self,
root_dir: Option<&Path>,
@@ -175,6 +318,23 @@ impl AgentServer for CustomAgentServer {
let is_remote = delegate.project.read(cx).is_via_remote_server();
let default_mode = self.default_mode(cx);
let default_model = self.default_model(cx);
let default_config_options = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
.custom
.get(&self.name())
.map(|s| match s {
project::agent_server_store::CustomAgentServerSettings::Custom {
default_config_options,
..
}
| project::agent_server_store::CustomAgentServerSettings::Extension {
default_config_options,
..
} => default_config_options.clone(),
})
.unwrap_or_default()
});
let store = delegate.store.downgrade();
let extra_env = load_proxy_env(cx);
cx.spawn(async move |cx| {
@@ -200,6 +360,7 @@ impl AgentServer for CustomAgentServer {
root_dir.as_ref(),
default_mode,
default_model,
default_config_options,
is_remote,
cx,
)

View File

@@ -455,22 +455,12 @@ pub async fn init_test(cx: &mut TestAppContext) -> Arc<FakeFs> {
project::agent_server_store::AllAgentServersSettings {
claude: Some(BuiltinAgentServerSettings {
path: Some("claude-code-acp".into()),
..Default::default()
}),
gemini: Some(crate::gemini::tests::local_command().into()),
codex: Some(BuiltinAgentServerSettings {
path: Some("codex-acp".into()),
..Default::default()
}),
custom: collections::HashMap::default(),
},

View File

@@ -4,9 +4,10 @@ use std::{any::Any, path::Path};
use crate::{AgentServer, AgentServerDelegate, load_proxy_env};
use acp_thread::AgentConnection;
use anyhow::{Context as _, Result};
use gpui::{App, AppContext as _, SharedString, Task};
use language_models::provider::google::GoogleLanguageModelProvider;
use project::agent_server_store::{AllAgentServersSettings, GEMINI_NAME};
use settings::SettingsStore;
#[derive(Clone)]
pub struct Gemini;
@@ -33,6 +34,14 @@ impl AgentServer for Gemini {
let mut extra_env = load_proxy_env(cx);
let default_mode = self.default_mode(cx);
let default_model = self.default_model(cx);
let default_config_options = cx.read_global(|settings: &SettingsStore, _| {
settings
.get::<AllAgentServersSettings>(None)
.gemini
.as_ref()
.map(|s| s.default_config_options.clone())
.unwrap_or_default()
});
cx.spawn(async move |cx| {
extra_env.insert("SURFACE".to_owned(), "zed".to_owned());
@@ -65,6 +74,7 @@ impl AgentServer for Gemini {
root_dir.as_ref(),
default_mode,
default_model,
default_config_options,
is_remote,
cx,
)

View File

@@ -1,3 +1,4 @@
mod config_options;
mod entry_view_state;
mod message_editor;
mod mode_selector;

View File

@@ -0,0 +1,772 @@
use std::{cmp::Reverse, rc::Rc, sync::Arc};
use acp_thread::AgentSessionConfigOptions;
use agent_client_protocol as acp;
use agent_servers::AgentServer;
use collections::HashSet;
use fs::Fs;
use fuzzy::StringMatchCandidate;
use gpui::{
BackgroundExecutor, Context, DismissEvent, Entity, Subscription, Task, Window, prelude::*,
};
use ordered_float::OrderedFloat;
use picker::popover_menu::PickerPopoverMenu;
use picker::{Picker, PickerDelegate};
use settings::SettingsStore;
use ui::{
ElevationIndex, IconButton, ListItem, ListItemSpacing, PopoverMenuHandle, Tooltip, prelude::*,
};
use util::ResultExt as _;
use crate::ui::HoldForDefault;
const PICKER_THRESHOLD: usize = 5;
pub struct ConfigOptionsView {
config_options: Rc<dyn AgentSessionConfigOptions>,
selectors: Vec<Entity<ConfigOptionSelector>>,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
config_option_ids: Vec<acp::SessionConfigId>,
_refresh_task: Task<()>,
}
impl ConfigOptionsView {
pub fn new(
config_options: Rc<dyn AgentSessionConfigOptions>,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let selectors = Self::build_selectors(&config_options, &agent_server, &fs, window, cx);
let config_option_ids = Self::config_option_ids(&config_options);
let rx = config_options.watch(cx);
let refresh_task = cx.spawn_in(window, async move |this, cx| {
if let Some(mut rx) = rx {
while let Ok(()) = rx.recv().await {
this.update_in(cx, |this, window, cx| {
this.refresh_selectors_if_needed(window, cx);
cx.notify();
})
.log_err();
}
}
});
Self {
config_options,
selectors,
agent_server,
fs,
config_option_ids,
_refresh_task: refresh_task,
}
}
fn config_option_ids(
config_options: &Rc<dyn AgentSessionConfigOptions>,
) -> Vec<acp::SessionConfigId> {
config_options
.config_options()
.into_iter()
.map(|option| option.id)
.collect()
}
fn refresh_selectors_if_needed(&mut self, window: &mut Window, cx: &mut Context<Self>) {
let current_ids = Self::config_option_ids(&self.config_options);
if current_ids != self.config_option_ids {
self.config_option_ids = current_ids;
self.rebuild_selectors(window, cx);
}
}
fn rebuild_selectors(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.selectors = Self::build_selectors(
&self.config_options,
&self.agent_server,
&self.fs,
window,
cx,
);
cx.notify();
}
fn build_selectors(
config_options: &Rc<dyn AgentSessionConfigOptions>,
agent_server: &Rc<dyn AgentServer>,
fs: &Arc<dyn Fs>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Vec<Entity<ConfigOptionSelector>> {
config_options
.config_options()
.into_iter()
.map(|option| {
let config_options = config_options.clone();
let agent_server = agent_server.clone();
let fs = fs.clone();
cx.new(|cx| {
ConfigOptionSelector::new(
config_options,
option.id.clone(),
agent_server,
fs,
window,
cx,
)
})
})
.collect()
}
}
impl Render for ConfigOptionsView {
fn render(&mut self, _window: &mut Window, _cx: &mut Context<Self>) -> impl IntoElement {
if self.selectors.is_empty() {
return div().into_any_element();
}
h_flex()
.gap_1()
.children(self.selectors.iter().cloned())
.into_any_element()
}
}
struct ConfigOptionSelector {
config_options: Rc<dyn AgentSessionConfigOptions>,
config_id: acp::SessionConfigId,
picker_handle: PopoverMenuHandle<Picker<ConfigOptionPickerDelegate>>,
picker: Entity<Picker<ConfigOptionPickerDelegate>>,
setting_value: bool,
}
impl ConfigOptionSelector {
pub fn new(
config_options: Rc<dyn AgentSessionConfigOptions>,
config_id: acp::SessionConfigId,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Self {
let option_count = config_options
.config_options()
.iter()
.find(|opt| opt.id == config_id)
.map(count_config_options)
.unwrap_or(0);
let is_searchable = option_count >= PICKER_THRESHOLD;
let picker = {
let config_options = config_options.clone();
let config_id = config_id.clone();
let agent_server = agent_server.clone();
let fs = fs.clone();
cx.new(move |picker_cx| {
let delegate = ConfigOptionPickerDelegate::new(
config_options,
config_id,
agent_server,
fs,
window,
picker_cx,
);
if is_searchable {
Picker::list(delegate, window, picker_cx)
} else {
Picker::nonsearchable_list(delegate, window, picker_cx)
}
.show_scrollbar(true)
.width(rems(20.))
.max_height(Some(rems(20.).into()))
})
};
Self {
config_options,
config_id,
picker_handle: PopoverMenuHandle::default(),
picker,
setting_value: false,
}
}
fn current_option(&self) -> Option<acp::SessionConfigOption> {
self.config_options
.config_options()
.into_iter()
.find(|opt| opt.id == self.config_id)
}
fn current_value_name(&self) -> String {
let Some(option) = self.current_option() else {
return "Unknown".to_string();
};
match &option.kind {
acp::SessionConfigKind::Select(select) => {
find_option_name(&select.options, &select.current_value)
.unwrap_or_else(|| "Unknown".to_string())
}
_ => "Unknown".to_string(),
}
}
fn render_trigger_button(&self, _window: &mut Window, _cx: &mut Context<Self>) -> Button {
let Some(option) = self.current_option() else {
return Button::new("config-option-trigger", "Unknown")
.label_size(LabelSize::Small)
.color(Color::Muted)
.disabled(true);
};
let icon = if self.picker_handle.is_deployed() {
IconName::ChevronUp
} else {
IconName::ChevronDown
};
Button::new(
ElementId::Name(format!("config-option-{}", option.id.0).into()),
self.current_value_name(),
)
.label_size(LabelSize::Small)
.color(Color::Muted)
.icon(icon)
.icon_size(IconSize::XSmall)
.icon_position(IconPosition::End)
.icon_color(Color::Muted)
.disabled(self.setting_value)
}
}
impl Render for ConfigOptionSelector {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
let Some(option) = self.current_option() else {
return div().into_any_element();
};
let trigger_button = self.render_trigger_button(window, cx);
let option_name = option.name.clone();
let option_description: Option<SharedString> = option.description.map(Into::into);
let tooltip = Tooltip::element(move |_window, _cx| {
let mut content = v_flex().gap_1().child(Label::new(option_name.clone()));
if let Some(desc) = option_description.as_ref() {
content = content.child(
Label::new(desc.clone())
.size(LabelSize::Small)
.color(Color::Muted),
);
}
content.into_any()
});
PickerPopoverMenu::new(
self.picker.clone(),
trigger_button,
tooltip,
gpui::Corner::BottomRight,
cx,
)
.with_handle(self.picker_handle.clone())
.render(window, cx)
.into_any_element()
}
}
#[derive(Clone)]
enum ConfigOptionPickerEntry {
Separator(SharedString),
Option(ConfigOptionValue),
}
#[derive(Clone)]
struct ConfigOptionValue {
value: acp::SessionConfigValueId,
name: String,
description: Option<String>,
group: Option<String>,
}
struct ConfigOptionPickerDelegate {
config_options: Rc<dyn AgentSessionConfigOptions>,
config_id: acp::SessionConfigId,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
filtered_entries: Vec<ConfigOptionPickerEntry>,
all_options: Vec<ConfigOptionValue>,
selected_index: usize,
selected_description: Option<(usize, SharedString, bool)>,
favorites: HashSet<acp::SessionConfigValueId>,
_settings_subscription: Subscription,
}
impl ConfigOptionPickerDelegate {
fn new(
config_options: Rc<dyn AgentSessionConfigOptions>,
config_id: acp::SessionConfigId,
agent_server: Rc<dyn AgentServer>,
fs: Arc<dyn Fs>,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Self {
let favorites = agent_server.favorite_config_option_value_ids(&config_id, cx);
let all_options = extract_options(&config_options, &config_id);
let filtered_entries = options_to_picker_entries(&all_options, &favorites);
let current_value = get_current_value(&config_options, &config_id);
let selected_index = current_value
.and_then(|current| {
filtered_entries.iter().position(|entry| {
matches!(entry, ConfigOptionPickerEntry::Option(opt) if opt.value == current)
})
})
.unwrap_or(0);
let agent_server_for_subscription = agent_server.clone();
let config_id_for_subscription = config_id.clone();
let settings_subscription =
cx.observe_global_in::<SettingsStore>(window, move |picker, window, cx| {
let new_favorites = agent_server_for_subscription
.favorite_config_option_value_ids(&config_id_for_subscription, cx);
if new_favorites != picker.delegate.favorites {
picker.delegate.favorites = new_favorites;
picker.refresh(window, cx);
}
});
cx.notify();
Self {
config_options,
config_id,
agent_server,
fs,
filtered_entries,
all_options,
selected_index,
selected_description: None,
favorites,
_settings_subscription: settings_subscription,
}
}
fn current_value(&self) -> Option<acp::SessionConfigValueId> {
get_current_value(&self.config_options, &self.config_id)
}
}
impl PickerDelegate for ConfigOptionPickerDelegate {
type ListItem = AnyElement;
fn match_count(&self) -> usize {
self.filtered_entries.len()
}
fn selected_index(&self) -> usize {
self.selected_index
}
fn set_selected_index(&mut self, ix: usize, _: &mut Window, cx: &mut Context<Picker<Self>>) {
self.selected_index = ix.min(self.filtered_entries.len().saturating_sub(1));
cx.notify();
}
fn can_select(
&mut self,
ix: usize,
_window: &mut Window,
_cx: &mut Context<Picker<Self>>,
) -> bool {
match self.filtered_entries.get(ix) {
Some(ConfigOptionPickerEntry::Option(_)) => true,
Some(ConfigOptionPickerEntry::Separator(_)) | None => false,
}
}
fn placeholder_text(&self, _window: &mut Window, _cx: &mut App) -> Arc<str> {
"Select an option…".into()
}
fn update_matches(
&mut self,
query: String,
window: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Task<()> {
let all_options = self.all_options.clone();
cx.spawn_in(window, async move |this, cx| {
let filtered_options = match this
.read_with(cx, |_, cx| {
if query.is_empty() {
None
} else {
Some((all_options.clone(), query.clone(), cx.background_executor().clone()))
}
})
.ok()
.flatten()
{
Some((options, q, executor)) => fuzzy_search_options(options, &q, executor).await,
None => all_options,
};
this.update_in(cx, |this, window, cx| {
this.delegate.filtered_entries =
options_to_picker_entries(&filtered_options, &this.delegate.favorites);
let current_value = this.delegate.current_value();
let new_index = current_value
.and_then(|current| {
this.delegate.filtered_entries.iter().position(|entry| {
matches!(entry, ConfigOptionPickerEntry::Option(opt) if opt.value == current)
})
})
.unwrap_or(0);
this.set_selected_index(new_index, Some(picker::Direction::Down), true, window, cx);
cx.notify();
})
.ok();
})
}
fn confirm(&mut self, _secondary: bool, window: &mut Window, cx: &mut Context<Picker<Self>>) {
if let Some(ConfigOptionPickerEntry::Option(option)) =
self.filtered_entries.get(self.selected_index)
{
if window.modifiers().secondary() {
let default_value = self
.agent_server
.default_config_option(self.config_id.0.as_ref(), cx);
let is_default = default_value.as_deref() == Some(&*option.value.0);
self.agent_server.set_default_config_option(
self.config_id.0.as_ref(),
if is_default {
None
} else {
Some(option.value.0.as_ref())
},
self.fs.clone(),
cx,
);
}
let task = self.config_options.set_config_option(
self.config_id.clone(),
option.value.clone(),
cx,
);
cx.spawn(async move |_, _| {
if let Err(err) = task.await {
log::error!("Failed to set config option: {:?}", err);
}
})
.detach();
cx.emit(DismissEvent);
}
}
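The secondary-confirm branch above toggles the default: picking the value that is already the default clears it, otherwise the picked value becomes the new default. That rule, as a pure function (sketch; `Option<String>` stands in for the settings lookup):

```rust
// Returns the default that should be written after picking `picked`.
fn next_default(current_default: Option<&str>, picked: &str) -> Option<String> {
    if current_default == Some(picked) {
        // Picking the existing default un-sets it.
        None
    } else {
        Some(picked.to_string())
    }
}
```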
fn dismissed(&mut self, window: &mut Window, cx: &mut Context<Picker<Self>>) {
cx.defer_in(window, |picker, window, cx| {
picker.set_query("", window, cx);
});
}
fn render_match(
&self,
ix: usize,
selected: bool,
_: &mut Window,
cx: &mut Context<Picker<Self>>,
) -> Option<Self::ListItem> {
match self.filtered_entries.get(ix)? {
ConfigOptionPickerEntry::Separator(title) => Some(
div()
.when(ix > 0, |this| this.mt_1())
.child(
div()
.px_2()
.py_1()
.text_xs()
.text_color(cx.theme().colors().text_muted)
.child(title.clone()),
)
.into_any_element(),
),
ConfigOptionPickerEntry::Option(option) => {
let current_value = self.current_value();
let is_selected = current_value.as_ref() == Some(&option.value);
let default_value = self
.agent_server
.default_config_option(self.config_id.0.as_ref(), cx);
let is_default = default_value.as_deref() == Some(&*option.value.0);
let is_favorite = self.favorites.contains(&option.value);
let option_name = option.name.clone();
let description = option.description.clone();
Some(
div()
.id(("config-option-picker-item", ix))
.when_some(description, |this, desc| {
let desc: SharedString = desc.into();
this.on_hover(cx.listener(move |menu, hovered, _, cx| {
if *hovered {
menu.delegate.selected_description =
Some((ix, desc.clone(), is_default));
} else if matches!(menu.delegate.selected_description, Some((id, _, _)) if id == ix)
{
menu.delegate.selected_description = None;
}
cx.notify();
}))
})
.child(
ListItem::new(ix)
.inset(true)
.spacing(ListItemSpacing::Sparse)
.toggle_state(selected)
.child(h_flex().w_full().child(Label::new(option_name).truncate()))
.end_slot(div().pr_2().when(is_selected, |this| {
this.child(Icon::new(IconName::Check).color(Color::Accent))
}))
.end_hover_slot(div().pr_1p5().child({
let (icon, color, tooltip) = if is_favorite {
(IconName::StarFilled, Color::Accent, "Unfavorite")
} else {
(IconName::Star, Color::Default, "Favorite")
};
let config_id = self.config_id.clone();
let value_id = option.value.clone();
let agent_server = self.agent_server.clone();
let fs = self.fs.clone();
IconButton::new(("toggle-favorite-config-option", ix), icon)
.layer(ElevationIndex::ElevatedSurface)
.icon_color(color)
.icon_size(IconSize::Small)
.tooltip(Tooltip::text(tooltip))
.on_click(move |_, _, cx| {
agent_server.toggle_favorite_config_option_value(
config_id.clone(),
value_id.clone(),
!is_favorite,
fs.clone(),
cx,
);
})
})),
)
.into_any_element(),
)
}
}
}
fn documentation_aside(
&self,
_window: &mut Window,
_cx: &mut Context<Picker<Self>>,
) -> Option<ui::DocumentationAside> {
self.selected_description
.as_ref()
.map(|(_, description, is_default)| {
let description = description.clone();
let is_default = *is_default;
ui::DocumentationAside::new(
ui::DocumentationSide::Left,
ui::DocumentationEdge::Top,
Rc::new(move |_| {
v_flex()
.gap_1()
.child(Label::new(description.clone()))
.child(HoldForDefault::new(is_default))
.into_any_element()
}),
)
})
}
}
fn extract_options(
config_options: &Rc<dyn AgentSessionConfigOptions>,
config_id: &acp::SessionConfigId,
) -> Vec<ConfigOptionValue> {
let Some(option) = config_options
.config_options()
.into_iter()
.find(|opt| &opt.id == config_id)
else {
return Vec::new();
};
match &option.kind {
acp::SessionConfigKind::Select(select) => match &select.options {
acp::SessionConfigSelectOptions::Ungrouped(options) => options
.iter()
.map(|opt| ConfigOptionValue {
value: opt.value.clone(),
name: opt.name.clone(),
description: opt.description.clone(),
group: None,
})
.collect(),
acp::SessionConfigSelectOptions::Grouped(groups) => groups
.iter()
.flat_map(|group| {
group.options.iter().map(|opt| ConfigOptionValue {
value: opt.value.clone(),
name: opt.name.clone(),
description: opt.description.clone(),
group: Some(group.name.clone()),
})
})
.collect(),
_ => Vec::new(),
},
_ => Vec::new(),
}
}
fn get_current_value(
config_options: &Rc<dyn AgentSessionConfigOptions>,
config_id: &acp::SessionConfigId,
) -> Option<acp::SessionConfigValueId> {
config_options
.config_options()
.into_iter()
.find(|opt| &opt.id == config_id)
.and_then(|opt| match &opt.kind {
acp::SessionConfigKind::Select(select) => Some(select.current_value.clone()),
_ => None,
})
}
fn options_to_picker_entries(
options: &[ConfigOptionValue],
favorites: &HashSet<acp::SessionConfigValueId>,
) -> Vec<ConfigOptionPickerEntry> {
let mut entries = Vec::new();
let mut favorite_options = Vec::new();
for option in options {
if favorites.contains(&option.value) {
favorite_options.push(option.clone());
}
}
if !favorite_options.is_empty() {
entries.push(ConfigOptionPickerEntry::Separator("Favorites".into()));
for option in favorite_options {
entries.push(ConfigOptionPickerEntry::Option(option));
}
// If the remaining list would start ungrouped (group == None), insert a separator so
// Favorites doesn't visually run into the main list.
if let Some(option) = options.first()
&& option.group.is_none()
{
entries.push(ConfigOptionPickerEntry::Separator("All Options".into()));
}
}
let mut current_group: Option<String> = None;
for option in options {
if option.group != current_group {
if let Some(group_name) = &option.group {
entries.push(ConfigOptionPickerEntry::Separator(
group_name.clone().into(),
));
}
current_group = option.group.clone();
}
entries.push(ConfigOptionPickerEntry::Option(option.clone()));
}
entries
}
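The separator-insertion pass above can be sketched as a standalone function (a simplified, hypothetical version using plain tuples and strings in place of the ACP types): favorites are listed first under a "Favorites" separator, and an "All Options" separator is added only when the main list would otherwise start with ungrouped entries.

```rust
// Hypothetical simplified sketch of options_to_picker_entries:
// (group, name) tuples stand in for ConfigOptionValue.
#[derive(Debug, PartialEq)]
enum Entry {
    Separator(String),
    Option(String),
}

fn to_entries(options: &[(Option<&str>, &str)], favorites: &[&str]) -> Vec<Entry> {
    let mut entries = Vec::new();
    let favs: Vec<_> = options
        .iter()
        .filter(|(_, name)| favorites.contains(name))
        .collect();
    if !favs.is_empty() {
        entries.push(Entry::Separator("Favorites".into()));
        for (_, name) in &favs {
            entries.push(Entry::Option((*name).into()));
        }
        // Keep Favorites from running into an ungrouped main list.
        if matches!(options.first(), Some((None, _))) {
            entries.push(Entry::Separator("All Options".into()));
        }
    }
    let mut current_group: Option<&str> = None;
    for (group, name) in options {
        if *group != current_group {
            if let Some(g) = group {
                entries.push(Entry::Separator((*g).into()));
            }
            current_group = *group;
        }
        entries.push(Entry::Option((*name).into()));
    }
    entries
}
```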
async fn fuzzy_search_options(
options: Vec<ConfigOptionValue>,
query: &str,
executor: BackgroundExecutor,
) -> Vec<ConfigOptionValue> {
let candidates = options
.iter()
.enumerate()
.map(|(ix, opt)| StringMatchCandidate::new(ix, &opt.name))
.collect::<Vec<_>>();
let mut matches = fuzzy::match_strings(
&candidates,
query,
false,
true,
100,
&Default::default(),
executor,
)
.await;
matches.sort_unstable_by_key(|mat| {
let candidate = &candidates[mat.candidate_id];
(Reverse(OrderedFloat(mat.score)), candidate.id)
});
matches
.into_iter()
.map(|mat| options[mat.candidate_id].clone())
.collect()
}
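The sort key above ranks matches by descending fuzzy score, falling back to ascending candidate id so equal-score results come out in a deterministic order. A hypothetical standalone version of the same ordering, using `f64::total_cmp` instead of `Reverse(OrderedFloat(..))`:

```rust
// Rank (id, score) pairs: highest score first, lowest id wins ties.
fn rank(mut scored: Vec<(usize, f64)>) -> Vec<usize> {
    scored.sort_by(|a, b| b.1.total_cmp(&a.1).then(a.0.cmp(&b.0)));
    scored.into_iter().map(|(id, _)| id).collect()
}
```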
fn find_option_name(
options: &acp::SessionConfigSelectOptions,
value_id: &acp::SessionConfigValueId,
) -> Option<String> {
match options {
acp::SessionConfigSelectOptions::Ungrouped(opts) => opts
.iter()
.find(|o| &o.value == value_id)
.map(|o| o.name.clone()),
acp::SessionConfigSelectOptions::Grouped(groups) => groups.iter().find_map(|group| {
group
.options
.iter()
.find(|o| &o.value == value_id)
.map(|o| o.name.clone())
}),
_ => None,
}
}
fn count_config_options(option: &acp::SessionConfigOption) -> usize {
match &option.kind {
acp::SessionConfigKind::Select(select) => match &select.options {
acp::SessionConfigSelectOptions::Ungrouped(options) => options.len(),
acp::SessionConfigSelectOptions::Grouped(groups) => {
groups.iter().map(|g| g.options.len()).sum()
}
_ => 0,
},
_ => 0,
}
}

View File

@@ -24,11 +24,11 @@ use file_icons::FileIcons;
use fs::Fs;
use futures::FutureExt as _;
use gpui::{
Action, Animation, AnimationExt, AnyView, App, BorderStyle, ClickEvent, ClipboardItem,
CursorStyle, EdgesRefinement, ElementId, Empty, Entity, FocusHandle, Focusable, Hsla, Length,
ListOffset, ListState, PlatformDisplay, SharedString, StyleRefinement, Subscription, Task,
TextStyle, TextStyleRefinement, UnderlineStyle, WeakEntity, Window, WindowHandle, div,
ease_in_out, linear_color_stop, linear_gradient, list, point, pulsating_between,
Action, Animation, AnimationExt, AnyView, App, BorderStyle, ClickEvent, CursorStyle,
EdgesRefinement, ElementId, Empty, Entity, FocusHandle, Focusable, Hsla, Length, ListOffset,
ListState, PlatformDisplay, SharedString, StyleRefinement, Subscription, Task, TextStyle,
TextStyleRefinement, UnderlineStyle, WeakEntity, Window, WindowHandle, div, ease_in_out,
linear_color_stop, linear_gradient, list, point, pulsating_between,
};
use language::Buffer;
@@ -47,15 +47,16 @@ use terminal_view::terminal_panel::TerminalPanel;
use text::Anchor;
use theme::{AgentFontSize, ThemeSettings};
use ui::{
Callout, CommonAnimationExt, ContextMenu, ContextMenuEntry, Disclosure, Divider, DividerColor,
ElevationIndex, KeyBinding, PopoverMenuHandle, SpinnerLabel, TintColor, Tooltip, WithScrollbar,
prelude::*, right_click_menu,
Callout, CommonAnimationExt, ContextMenu, ContextMenuEntry, CopyButton, Disclosure, Divider,
DividerColor, ElevationIndex, KeyBinding, PopoverMenuHandle, SpinnerLabel, TintColor, Tooltip,
WithScrollbar, prelude::*, right_click_menu,
};
use util::{ResultExt, size::format_file_size, time::duration_alt_display};
use workspace::{CollaboratorId, NewTerminal, Workspace};
use zed_actions::agent::{Chat, ToggleModelSelector};
use zed_actions::assistant::OpenRulesLibrary;
use super::config_options::ConfigOptionsView;
use super::entry_view_state::EntryViewState;
use crate::acp::AcpModelSelectorPopover;
use crate::acp::ModeSelector;
@@ -272,12 +273,14 @@ pub struct AcpThreadView {
message_editor: Entity<MessageEditor>,
focus_handle: FocusHandle,
model_selector: Option<Entity<AcpModelSelectorPopover>>,
config_options_view: Option<Entity<ConfigOptionsView>>,
profile_selector: Option<Entity<ProfileSelector>>,
notifications: Vec<WindowHandle<AgentNotification>>,
notification_subscriptions: HashMap<WindowHandle<AgentNotification>, Vec<Subscription>>,
thread_retry_status: Option<RetryStatus>,
thread_error: Option<ThreadError>,
thread_error_markdown: Option<Entity<Markdown>>,
token_limit_callout_dismissed: bool,
thread_feedback: ThreadFeedbackState,
list_state: ListState,
auth_task: Option<Task<()>>,
@@ -429,14 +432,15 @@ impl AcpThreadView {
login: None,
message_editor,
model_selector: None,
config_options_view: None,
profile_selector: None,
notifications: Vec::new(),
notification_subscriptions: HashMap::default(),
list_state,
thread_retry_status: None,
thread_error: None,
thread_error_markdown: None,
token_limit_callout_dismissed: false,
thread_feedback: Default::default(),
auth_task: None,
expanded_tool_calls: HashSet::default(),
@@ -613,42 +617,64 @@ impl AcpThreadView {
AgentDiff::set_active_thread(&workspace, thread.clone(), window, cx);
this.model_selector = thread
// Check for config options first
// Config options take precedence over legacy mode/model selectors
// (feature flag gating happens at the data layer)
let config_options_provider = thread
.read(cx)
.connection()
.model_selector(thread.read(cx).session_id())
.map(|selector| {
let agent_server = this.agent.clone();
let fs = this.project.read(cx).fs().clone();
cx.new(|cx| {
AcpModelSelectorPopover::new(
selector,
agent_server,
fs,
PopoverMenuHandle::default(),
this.focus_handle(cx),
window,
cx,
)
})
});
.session_config_options(thread.read(cx).session_id(), cx);
let mode_selector = thread
.read(cx)
.connection()
.session_modes(thread.read(cx).session_id(), cx)
.map(|session_modes| {
let fs = this.project.read(cx).fs().clone();
let focus_handle = this.focus_handle(cx);
cx.new(|_cx| {
ModeSelector::new(
session_modes,
this.agent.clone(),
fs,
focus_handle,
)
})
});
let mode_selector;
if let Some(config_options) = config_options_provider {
// Use config options - don't create mode_selector or model_selector
let agent_server = this.agent.clone();
let fs = this.project.read(cx).fs().clone();
this.config_options_view = Some(cx.new(|cx| {
ConfigOptionsView::new(config_options, agent_server, fs, window, cx)
}));
this.model_selector = None;
mode_selector = None;
} else {
// Fall back to legacy mode/model selectors
this.config_options_view = None;
this.model_selector = thread
.read(cx)
.connection()
.model_selector(thread.read(cx).session_id())
.map(|selector| {
let agent_server = this.agent.clone();
let fs = this.project.read(cx).fs().clone();
cx.new(|cx| {
AcpModelSelectorPopover::new(
selector,
agent_server,
fs,
PopoverMenuHandle::default(),
this.focus_handle(cx),
window,
cx,
)
})
});
mode_selector = thread
.read(cx)
.connection()
.session_modes(thread.read(cx).session_id(), cx)
.map(|session_modes| {
let fs = this.project.read(cx).fs().clone();
let focus_handle = this.focus_handle(cx);
cx.new(|_cx| {
ModeSelector::new(
session_modes,
this.agent.clone(),
fs,
focus_handle,
)
})
});
}
let mut subscriptions = vec![
cx.subscribe_in(&thread, window, Self::handle_thread_event),
@@ -1394,6 +1420,7 @@ impl AcpThreadView {
fn clear_thread_error(&mut self, cx: &mut Context<Self>) {
self.thread_error = None;
self.thread_error_markdown = None;
self.token_limit_callout_dismissed = true;
cx.notify();
}
@@ -1520,6 +1547,10 @@ impl AcpThreadView {
// The connection keeps track of the mode
cx.notify();
}
AcpThreadEvent::ConfigOptionsUpdated(_) => {
// The watch task in ConfigOptionsView handles rebuilding selectors
cx.notify();
}
}
cx.notify();
}
@@ -4415,8 +4446,12 @@ impl AcpThreadView {
.gap_1()
.children(self.render_token_usage(cx))
.children(self.profile_selector.clone())
.children(self.mode_selector().cloned())
.children(self.model_selector.clone())
// Either config_options_view OR (mode_selector + model_selector)
.children(self.config_options_view.clone())
.when(self.config_options_view.is_none(), |this| {
this.children(self.mode_selector().cloned())
.children(self.model_selector.clone())
})
.child(self.render_send_button(cx)),
),
)
@@ -5391,22 +5426,26 @@ impl AcpThreadView {
cx.notify();
}
fn render_token_limit_callout(
&self,
line_height: Pixels,
cx: &mut Context<Self>,
) -> Option<Callout> {
fn render_token_limit_callout(&self, cx: &mut Context<Self>) -> Option<Callout> {
if self.token_limit_callout_dismissed {
return None;
}
let token_usage = self.thread()?.read(cx).token_usage()?;
let ratio = token_usage.ratio();
let (severity, title) = match ratio {
let (severity, icon, title) = match ratio {
acp_thread::TokenUsageRatio::Normal => return None,
acp_thread::TokenUsageRatio::Warning => {
(Severity::Warning, "Thread reaching the token limit soon")
}
acp_thread::TokenUsageRatio::Exceeded => {
(Severity::Error, "Thread reached the token limit")
}
acp_thread::TokenUsageRatio::Warning => (
Severity::Warning,
IconName::Warning,
"Thread reaching the token limit soon",
),
acp_thread::TokenUsageRatio::Exceeded => (
Severity::Error,
IconName::XCircle,
"Thread reached the token limit",
),
};
let burn_mode_available = self.as_native_thread(cx).is_some_and(|thread| {
@@ -5426,7 +5465,7 @@ impl AcpThreadView {
Some(
Callout::new()
.severity(severity)
.line_height(line_height)
.icon(icon)
.title(title)
.description(description)
.actions_slot(
@@ -5458,7 +5497,8 @@ impl AcpThreadView {
})),
)
}),
),
)
.dismiss_action(self.dismiss_error_button(cx)),
)
}
@@ -5881,18 +5921,13 @@ impl AcpThreadView {
fn create_copy_button(&self, message: impl Into<String>) -> impl IntoElement {
let message = message.into();
IconButton::new("copy", IconName::Copy)
.icon_size(IconSize::Small)
.tooltip(Tooltip::text("Copy Error Message"))
.on_click(move |_, _, cx| {
cx.write_to_clipboard(ClipboardItem::new_string(message.clone()))
})
CopyButton::new(message).tooltip_label("Copy Error Message")
}
fn dismiss_error_button(&self, cx: &mut Context<Self>) -> impl IntoElement {
IconButton::new("dismiss", IconName::Close)
.icon_size(IconSize::Small)
.tooltip(Tooltip::text("Dismiss Error"))
.tooltip(Tooltip::text("Dismiss"))
.on_click(cx.listener({
move |this, _, _, cx| {
this.clear_thread_error(cx);
@@ -6152,7 +6187,7 @@ impl Render for AcpThreadView {
if let Some(usage_callout) = self.render_usage_callout(line_height, cx) {
Some(usage_callout.into_any_element())
} else {
self.render_token_limit_callout(line_height, cx)
self.render_token_limit_callout(cx)
.map(|token_limit_callout| token_limit_callout.into_any_element())
},
)

View File

@@ -1371,6 +1371,8 @@ async fn open_new_agent_servers_entry_in_settings_editor(
default_mode: None,
default_model: None,
favorite_models: vec![],
default_config_options: Default::default(),
favorite_config_option_values: Default::default(),
},
);
}

View File

@@ -1363,7 +1363,8 @@ impl AgentDiff {
| AcpThreadEvent::PromptCapabilitiesUpdated
| AcpThreadEvent::AvailableCommandsUpdated(_)
| AcpThreadEvent::Retry(_)
| AcpThreadEvent::ModeUpdated(_) => {}
| AcpThreadEvent::ModeUpdated(_)
| AcpThreadEvent::ConfigOptionsUpdated(_) => {}
}
}

View File

@@ -12,8 +12,8 @@ use editor::{
};
use futures::{AsyncReadExt as _, FutureExt as _, future::Shared};
use gpui::{
Animation, AnimationExt as _, AppContext, ClipboardEntry, Context, Empty, Entity, EntityId,
Image, ImageFormat, Img, SharedString, Task, WeakEntity, pulsating_between,
AppContext, ClipboardEntry, Context, Empty, Entity, EntityId, Image, ImageFormat, Img,
SharedString, Task, WeakEntity,
};
use http_client::{AsyncBody, HttpClientWithUrl};
use itertools::Either;
@@ -32,13 +32,14 @@ use std::{
path::{Path, PathBuf},
rc::Rc,
sync::Arc,
time::Duration,
};
use text::OffsetRangeExt;
use ui::{ButtonLike, Disclosure, TintColor, Toggleable, prelude::*};
use ui::{Disclosure, Toggleable, prelude::*};
use util::{ResultExt, debug_panic, rel_path::RelPath};
use workspace::{Workspace, notifications::NotifyResultExt as _};
use crate::ui::MentionCrease;
pub type MentionTask = Shared<Task<Result<Mention, String>>>;
#[derive(Debug, Clone, Eq, PartialEq)]
@@ -754,25 +755,8 @@ fn render_fold_icon_button(
.update(cx, |editor, cx| editor.is_range_selected(&fold_range, cx))
.unwrap_or_default();
ButtonLike::new(fold_id)
.style(ButtonStyle::Filled)
.selected_style(ButtonStyle::Tinted(TintColor::Accent))
.toggle_state(is_in_text_selection)
.child(
h_flex()
.gap_1()
.child(
Icon::from_path(icon_path.clone())
.size(IconSize::XSmall)
.color(Color::Muted),
)
.child(
Label::new(label.clone())
.size(LabelSize::Small)
.buffer_font(cx)
.single_line(),
),
)
MentionCrease::new(fold_id, icon_path.clone(), label.clone())
.is_toggled(is_in_text_selection)
.into_any_element()
}
})
@@ -947,12 +931,14 @@ impl Render for LoadingContext {
.editor
.update(cx, |editor, cx| editor.is_range_selected(&self.range, cx))
.unwrap_or_default();
ButtonLike::new(("loading-context", self.id))
.style(ButtonStyle::Filled)
.selected_style(ButtonStyle::Tinted(TintColor::Accent))
.toggle_state(is_in_text_selection)
.when_some(self.image.clone(), |el, image_task| {
el.hoverable_tooltip(move |_, cx| {
let id = ElementId::from(("loading_context", self.id));
MentionCrease::new(id, self.icon.clone(), self.label.clone())
.is_toggled(is_in_text_selection)
.is_loading(self.loading.is_some())
.when_some(self.image.clone(), |this, image_task| {
this.image_preview(move |_, cx| {
let image = image_task.peek().cloned().transpose().ok().flatten();
let image_task = image_task.clone();
cx.new::<ImageHover>(|cx| ImageHover {
@@ -971,35 +957,6 @@ impl Render for LoadingContext {
.into()
})
})
.child(
h_flex()
.gap_1()
.child(
Icon::from_path(self.icon.clone())
.size(IconSize::XSmall)
.color(Color::Muted),
)
.child(
Label::new(self.label.clone())
.size(LabelSize::Small)
.buffer_font(cx)
.single_line(),
)
.map(|el| {
if self.loading.is_some() {
el.with_animation(
"loading-context-crease",
Animation::new(Duration::from_secs(2))
.repeat()
.with_easing(pulsating_between(0.4, 0.8)),
|label, delta| label.opacity(delta),
)
.into_any()
} else {
el.into_any()
}
}),
)
}
}

View File

@@ -4,6 +4,7 @@ mod burn_mode_tooltip;
mod claude_code_onboarding_modal;
mod end_trial_upsell;
mod hold_for_default;
mod mention_crease;
mod model_selector_components;
mod onboarding_modal;
mod usage_callout;
@@ -14,6 +15,7 @@ pub use burn_mode_tooltip::*;
pub use claude_code_onboarding_modal::*;
pub use end_trial_upsell::*;
pub use hold_for_default::*;
pub use mention_crease::*;
pub use model_selector_components::*;
pub use onboarding_modal::*;
pub use usage_callout::*;

View File

@@ -0,0 +1,100 @@
use std::time::Duration;
use gpui::{Animation, AnimationExt, AnyView, IntoElement, Window, pulsating_between};
use settings::Settings;
use theme::ThemeSettings;
use ui::{ButtonLike, TintColor, prelude::*};
#[derive(IntoElement)]
pub struct MentionCrease {
id: ElementId,
icon: SharedString,
label: SharedString,
is_toggled: bool,
is_loading: bool,
image_preview: Option<Box<dyn Fn(&mut Window, &mut App) -> AnyView + 'static>>,
}
impl MentionCrease {
pub fn new(
id: impl Into<ElementId>,
icon: impl Into<SharedString>,
label: impl Into<SharedString>,
) -> Self {
Self {
id: id.into(),
icon: icon.into(),
label: label.into(),
is_toggled: false,
is_loading: false,
image_preview: None,
}
}
pub fn is_toggled(mut self, is_toggled: bool) -> Self {
self.is_toggled = is_toggled;
self
}
pub fn is_loading(mut self, is_loading: bool) -> Self {
self.is_loading = is_loading;
self
}
pub fn image_preview(
mut self,
builder: impl Fn(&mut Window, &mut App) -> AnyView + 'static,
) -> Self {
self.image_preview = Some(Box::new(builder));
self
}
}
impl RenderOnce for MentionCrease {
fn render(self, window: &mut Window, cx: &mut App) -> impl IntoElement {
let settings = ThemeSettings::get_global(cx);
let font_size = settings.agent_buffer_font_size(cx);
let buffer_font = settings.buffer_font.clone();
let button_height = DefiniteLength::Absolute(AbsoluteLength::Pixels(
px(window.line_height().into()) - px(1.),
));
ButtonLike::new(self.id)
.style(ButtonStyle::Outlined)
.size(ButtonSize::Compact)
.height(button_height)
.selected_style(ButtonStyle::Tinted(TintColor::Accent))
.toggle_state(self.is_toggled)
.when_some(self.image_preview, |this, image_preview| {
this.hoverable_tooltip(image_preview)
})
.child(
h_flex()
.pb_px()
.gap_1()
.font(buffer_font)
.text_size(font_size)
.child(
Icon::from_path(self.icon.clone())
.size(IconSize::XSmall)
.color(Color::Muted),
)
.child(self.label.clone())
.map(|this| {
if self.is_loading {
this.with_animation(
"loading-context-crease",
Animation::new(Duration::from_secs(2))
.repeat()
.with_easing(pulsating_between(0.4, 0.8)),
|label, delta| label.opacity(delta),
)
.into_any()
} else {
this.into_any()
}
}),
)
}
}

View File

@@ -31,9 +31,9 @@ use smallvec::SmallVec;
use std::{mem, sync::Arc};
use theme::{ActiveTheme, ThemeSettings};
use ui::{
Avatar, AvatarAvailabilityIndicator, Button, Color, ContextMenu, Facepile, HighlightedLabel,
Icon, IconButton, IconName, IconSize, Indicator, Label, ListHeader, ListItem, Tab, Tooltip,
prelude::*, tooltip_container,
Avatar, AvatarAvailabilityIndicator, Button, Color, ContextMenu, CopyButton, Facepile,
HighlightedLabel, Icon, IconButton, IconName, IconSize, Indicator, Label, ListHeader, ListItem,
Tab, Tooltip, prelude::*, tooltip_container,
};
use util::{ResultExt, TryFutureExt, maybe};
use workspace::{
@@ -2527,16 +2527,9 @@ impl CollabPanel {
let button = match section {
Section::ActiveCall => channel_link.map(|channel_link| {
let channel_link_copy = channel_link;
IconButton::new("channel-link", IconName::Copy)
.icon_size(IconSize::Small)
.size(ButtonSize::None)
CopyButton::new(channel_link)
.visible_on_hover("section-header")
.on_click(move |_, _, cx| {
let item = ClipboardItem::new_string(channel_link_copy.clone());
cx.write_to_clipboard(item)
})
.tooltip(Tooltip::text("Copy channel link"))
.tooltip_label("Copy Channel Link")
.into_any_element()
}),
Section::Contacts => Some(

View File

@@ -56,6 +56,7 @@ telemetry_events.workspace = true
text.workspace = true
thiserror.workspace = true
time.workspace = true
toml.workspace = true
ui.workspace = true
util.workspace = true
uuid.workspace = true

View File

@@ -7,15 +7,14 @@ use buffer_diff::BufferDiffSnapshot;
use collections::HashMap;
use gpui::{App, Entity, Task};
use language::{Buffer, ToPoint as _};
use project::Project;
use project::{Project, WorktreeId};
use std::{collections::hash_map, fmt::Write as _, path::Path, sync::Arc};
use text::{BufferSnapshot as TextBufferSnapshot, ToOffset as _};
use text::BufferSnapshot as TextBufferSnapshot;
pub fn capture_example(
project: Entity<Project>,
buffer: Entity<Buffer>,
cursor_anchor: language::Anchor,
last_event_is_expected_patch: bool,
cx: &mut App,
) -> Option<Task<Result<ExampleSpec>>> {
let ep_store = EditPredictionStore::try_global(cx)?;
@@ -43,8 +42,26 @@ pub fn capture_example(
let git_store = project.read(cx).git_store().clone();
Some(cx.spawn(async move |mut cx| {
let snapshots_by_path = collect_snapshots(&project, &git_store, &events, &mut cx).await?;
let cursor_excerpt = cx
let snapshots_by_path =
collect_snapshots(&project, &git_store, worktree_id, &events, &mut cx).await?;
events.retain(|stored_event| {
match stored_event.event.as_ref() {
zeta_prompt::Event::BufferChange { path, .. } => {
if !snapshots_by_path.contains_key(path) {
return false;
}
}
}
true
});
let line_comment_prefix = snapshot
.language()
.and_then(|lang| lang.config().line_comments.first())
.map(|s| s.to_string())
.unwrap_or_default();
let (cursor_excerpt, cursor_offset) = cx
.background_executor()
.spawn(async move { compute_cursor_excerpt(&snapshot, cursor_anchor) })
.await;
@@ -54,13 +71,6 @@ pub fn capture_example(
.await;
let mut edit_history = String::new();
let mut expected_patch = String::new();
if last_event_is_expected_patch {
if let Some(stored_event) = events.pop() {
zeta_prompt::write_event(&mut expected_patch, &stored_event.event);
}
}
for stored_event in &events {
zeta_prompt::write_event(&mut edit_history, &stored_event.event);
if !edit_history.ends_with('\n') {
@@ -68,57 +78,62 @@ pub fn capture_example(
}
}
let name = generate_timestamp_name();
Ok(ExampleSpec {
name,
let mut spec = ExampleSpec {
name: generate_timestamp_name(),
repository_url,
revision,
tags: Vec::new(),
reasoning: None,
uncommitted_diff,
cursor_path: cursor_path.as_std_path().into(),
cursor_position: cursor_excerpt,
cursor_position: String::new(),
edit_history,
expected_patch,
})
expected_patches: Vec::new(),
};
spec.set_cursor_excerpt(&cursor_excerpt, cursor_offset, &line_comment_prefix);
Ok(spec)
}))
}
fn compute_cursor_excerpt(
snapshot: &language::BufferSnapshot,
cursor_anchor: language::Anchor,
) -> String {
) -> (String, usize) {
use text::ToOffset as _;
let cursor_point = cursor_anchor.to_point(snapshot);
let (_editable_range, context_range) =
editable_and_context_ranges_for_cursor_position(cursor_point, snapshot, 100, 50);
let context_start_offset = context_range.start.to_offset(snapshot);
let cursor_offset = cursor_anchor.to_offset(snapshot);
let cursor_offset_in_excerpt = cursor_offset.saturating_sub(context_start_offset);
let mut excerpt = snapshot.text_for_range(context_range).collect::<String>();
if cursor_offset_in_excerpt <= excerpt.len() {
excerpt.insert_str(cursor_offset_in_excerpt, zeta_prompt::CURSOR_MARKER);
}
excerpt
let excerpt = snapshot.text_for_range(context_range).collect::<String>();
(excerpt, cursor_offset_in_excerpt)
}
async fn collect_snapshots(
project: &Entity<Project>,
git_store: &Entity<project::git_store::GitStore>,
worktree_id: WorktreeId,
events: &[StoredEvent],
cx: &mut gpui::AsyncApp,
) -> Result<HashMap<Arc<Path>, (TextBufferSnapshot, BufferDiffSnapshot)>> {
let mut snapshots_by_path = HashMap::default();
let root_name = project.read_with(cx, |project, cx| {
project
.worktree_for_id(worktree_id, cx)
.unwrap()
.read(cx)
.root_name()
.to_owned()
})?;
for stored_event in events {
let zeta_prompt::Event::BufferChange { path, .. } = stored_event.event.as_ref();
if let Some((project_path, full_path)) = project.read_with(cx, |project, cx| {
let project_path = project.find_project_path(path, cx)?;
let full_path = project
.worktree_for_id(project_path.worktree_id, cx)?
.read(cx)
.root_name()
.join(&project_path.path)
.as_std_path()
.into();
let project_path = project
.find_project_path(path, cx)
.filter(|path| path.worktree_id == worktree_id)?;
let full_path = root_name.join(&project_path.path).as_std_path().into();
Some((project_path, full_path))
})? {
if let hash_map::Entry::Vacant(entry) = snapshots_by_path.entry(full_path) {
@@ -289,9 +304,7 @@ mod tests {
cx.run_until_parked();
let mut example = cx
.update(|cx| {
capture_example(project.clone(), buffer.clone(), Anchor::MIN, false, cx).unwrap()
})
.update(|cx| capture_example(project.clone(), buffer.clone(), Anchor::MIN, cx).unwrap())
.await
.unwrap();
example.name = "test".to_string();
@@ -302,6 +315,8 @@ mod tests {
name: "test".to_string(),
repository_url: "https://github.com/test/repo.git".to_string(),
revision: "abc123def456".to_string(),
tags: Vec::new(),
reasoning: None,
uncommitted_diff: indoc! {"
--- a/project/src/main.rs
+++ b/project/src/main.rs
@@ -322,7 +337,8 @@ mod tests {
.to_string(),
cursor_path: Path::new("project/src/main.rs").into(),
cursor_position: indoc! {"
<|user_cursor|>fn main() {
fn main() {
^[CURSOR_POSITION]
// comment 1
one();
two();
@@ -355,7 +371,7 @@ mod tests {
seven();
"}
.to_string(),
expected_patch: "".to_string(),
expected_patches: Vec::new()
}
);
}

View File

@@ -688,12 +688,14 @@ impl EditPredictionStore {
pub fn clear_history(&mut self) {
for project_state in self.projects.values_mut() {
project_state.events.clear();
project_state.last_event.take();
}
}
pub fn clear_history_for_project(&mut self, project: &Entity<Project>) {
if let Some(project_state) = self.projects.get_mut(&project.entity_id()) {
project_state.events.clear();
project_state.last_event.take();
}
}
@@ -2044,7 +2046,9 @@ impl EditPredictionStore {
"Edit Prediction Rated",
rating,
inputs = prediction.inputs,
output = prediction.edit_preview.as_unified_diff(&prediction.edits),
output = prediction
.edit_preview
.as_unified_diff(prediction.snapshot.file(), &prediction.edits),
feedback
);
self.client.telemetry().flush_events().detach();

View File

@@ -1,5 +1,9 @@
use anyhow::{Context as _, Result};
use serde::{Deserialize, Serialize};
use std::{fmt::Write as _, mem, path::Path, sync::Arc};
use std::{borrow::Cow, fmt::Write as _, mem, path::Path, sync::Arc};
pub const CURSOR_POSITION_MARKER: &str = "[CURSOR_POSITION]";
pub const INLINE_CURSOR_MARKER: &str = "<|user_cursor|>";
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub struct ExampleSpec {
@@ -7,33 +11,80 @@ pub struct ExampleSpec {
pub name: String,
pub repository_url: String,
pub revision: String,
#[serde(default, skip_serializing_if = "Vec::is_empty")]
pub tags: Vec<String>,
#[serde(default, skip_serializing_if = "Option::is_none")]
pub reasoning: Option<String>,
#[serde(default)]
pub uncommitted_diff: String,
pub cursor_path: Arc<Path>,
pub cursor_position: String,
pub edit_history: String,
pub expected_patch: String,
pub expected_patches: Vec<String>,
}
const REASONING_HEADING: &str = "Reasoning";
const UNCOMMITTED_DIFF_HEADING: &str = "Uncommitted Diff";
const EDIT_HISTORY_HEADING: &str = "Edit History";
const CURSOR_POSITION_HEADING: &str = "Cursor Position";
const EXPECTED_PATCH_HEADING: &str = "Expected Patch";
const EXPECTED_CONTEXT_HEADING: &str = "Expected Context";
const REPOSITORY_URL_FIELD: &str = "repository_url";
const REVISION_FIELD: &str = "revision";
#[derive(Serialize, Deserialize)]
struct FrontMatter<'a> {
repository_url: Cow<'a, str>,
revision: Cow<'a, str>,
#[serde(default, skip_serializing_if = "Vec::is_empty")]
tags: Vec<String>,
}
impl ExampleSpec {
/// Generate a sanitized filename for this example.
pub fn filename(&self) -> String {
self.name
.chars()
.map(|c| match c {
' ' | ':' | '~' | '^' | '?' | '*' | '[' | '\\' | '@' | '{' | '/' | '<' | '>'
| '|' | '"' => '-',
c => c,
})
.collect()
}
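The sanitization in `filename` is a per-character mapping; as a free function (same character set as above, extracted here for illustration), any character that is unsafe in file names or git refs becomes `-`:

```rust
// Replace filesystem/git-unsafe characters with '-'.
fn sanitize(name: &str) -> String {
    name.chars()
        .map(|c| match c {
            ' ' | ':' | '~' | '^' | '?' | '*' | '[' | '\\' | '@' | '{' | '/' | '<' | '>'
            | '|' | '"' => '-',
            c => c,
        })
        .collect()
}
```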
/// Format this example spec as markdown.
pub fn to_markdown(&self) -> String {
use std::fmt::Write as _;
let front_matter = FrontMatter {
repository_url: Cow::Borrowed(&self.repository_url),
revision: Cow::Borrowed(&self.revision),
tags: self.tags.clone(),
};
let front_matter_toml =
toml::to_string_pretty(&front_matter).unwrap_or_else(|_| String::new());
let mut markdown = String::new();
_ = writeln!(markdown, "+++");
markdown.push_str(&front_matter_toml);
if !markdown.ends_with('\n') {
markdown.push('\n');
}
_ = writeln!(markdown, "+++");
markdown.push('\n');
_ = writeln!(markdown, "# {}", self.name);
markdown.push('\n');
_ = writeln!(markdown, "repository_url = {}", self.repository_url);
_ = writeln!(markdown, "revision = {}", self.revision);
markdown.push('\n');
if let Some(reasoning) = &self.reasoning {
_ = writeln!(markdown, "## {}", REASONING_HEADING);
markdown.push('\n');
markdown.push_str(reasoning);
if !markdown.ends_with('\n') {
markdown.push('\n');
}
markdown.push('\n');
}
if !self.uncommitted_diff.is_empty() {
_ = writeln!(markdown, "## {}", UNCOMMITTED_DIFF_HEADING);
@@ -75,34 +126,48 @@ impl ExampleSpec {
_ = writeln!(markdown, "## {}", EXPECTED_PATCH_HEADING);
markdown.push('\n');
_ = writeln!(markdown, "```diff");
markdown.push_str(&self.expected_patch);
if !markdown.ends_with('\n') {
for patch in &self.expected_patches {
_ = writeln!(markdown, "```diff");
markdown.push_str(patch);
if !markdown.ends_with('\n') {
markdown.push('\n');
}
_ = writeln!(markdown, "```");
markdown.push('\n');
}
_ = writeln!(markdown, "```");
markdown.push('\n');
markdown
}
/// Parse an example spec from markdown.
pub fn from_markdown(name: String, input: &str) -> anyhow::Result<Self> {
pub fn from_markdown(mut input: &str) -> anyhow::Result<Self> {
use pulldown_cmark::{CodeBlockKind, CowStr, Event, HeadingLevel, Parser, Tag, TagEnd};
let parser = Parser::new(input);
let mut spec = ExampleSpec {
name,
name: String::new(),
repository_url: String::new(),
revision: String::new(),
tags: Vec::new(),
reasoning: None,
uncommitted_diff: String::new(),
cursor_path: Path::new("").into(),
cursor_position: String::new(),
edit_history: String::new(),
expected_patch: String::new(),
expected_patches: Vec::new(),
};
if let Some(rest) = input.strip_prefix("+++\n")
&& let Some((front_matter, rest)) = rest.split_once("+++\n")
{
if let Ok(data) = toml::from_str::<FrontMatter<'_>>(front_matter) {
spec.repository_url = data.repository_url.into_owned();
spec.revision = data.revision.into_owned();
spec.tags = data.tags;
}
input = rest.trim_start();
}
let parser = Parser::new(input);
let mut text = String::new();
let mut block_info: CowStr = "".into();
@@ -123,20 +188,9 @@ impl ExampleSpec {
match event {
Event::Text(line) => {
text.push_str(&line);
if let Section::Start = current_section
&& let Some((field, value)) = line.split_once('=')
{
match field.trim() {
REPOSITORY_URL_FIELD => {
spec.repository_url = value.trim().to_string();
}
REVISION_FIELD => {
spec.revision = value.trim().to_string();
}
_ => {}
}
}
}
Event::End(TagEnd::Heading(HeadingLevel::H1)) => {
spec.name = mem::take(&mut text);
}
Event::End(TagEnd::Heading(HeadingLevel::H2)) => {
let title = mem::take(&mut text);
@@ -194,7 +248,7 @@ impl ExampleSpec {
mem::take(&mut text);
}
Section::ExpectedPatch => {
spec.expected_patch = mem::take(&mut text);
spec.expected_patches.push(mem::take(&mut text));
}
Section::Start | Section::Other => {}
}
@@ -209,4 +263,326 @@ impl ExampleSpec {
Ok(spec)
}
/// Returns the excerpt of text around the cursor, and the offset of the cursor within that
/// excerpt.
///
/// The cursor's position is marked with a special comment that appears
/// below the cursor line, which contains the string `[CURSOR_POSITION]`,
/// preceded by an arrow marking the cursor's column. The arrow can be
/// either:
/// - `^` - The cursor column is at the position of the `^` character (pointing up to the cursor)
/// - `<` - The cursor column is at the first non-whitespace character on that line.
pub fn cursor_excerpt(&self) -> Result<(String, usize)> {
let input = &self.cursor_position;
// Check for inline cursor marker first
if let Some(inline_offset) = input.find(INLINE_CURSOR_MARKER) {
let excerpt = input[..inline_offset].to_string()
+ &input[inline_offset + INLINE_CURSOR_MARKER.len()..];
return Ok((excerpt, inline_offset));
}
let marker_offset = input
.find(CURSOR_POSITION_MARKER)
.context("missing [CURSOR_POSITION] marker")?;
let marker_line_start = input[..marker_offset]
.rfind('\n')
.map(|pos| pos + 1)
.unwrap_or(0);
let marker_line_end = input[marker_line_start..]
.find('\n')
.map(|pos| marker_line_start + pos + 1)
.unwrap_or(input.len());
let marker_line = &input[marker_line_start..marker_line_end].trim_end_matches('\n');
let cursor_column = if let Some(cursor_offset) = marker_line.find('^') {
cursor_offset
} else if let Some(less_than_pos) = marker_line.find('<') {
marker_line
.find(|c: char| !c.is_whitespace())
.unwrap_or(less_than_pos)
} else {
anyhow::bail!(
"cursor position marker line must contain '^' or '<' before [CURSOR_POSITION]"
);
};
let mut excerpt = input[..marker_line_start].to_string() + &input[marker_line_end..];
excerpt.truncate(excerpt.trim_end_matches('\n').len());
// The cursor is on the line above the marker line.
let cursor_line_end = marker_line_start.saturating_sub(1);
let cursor_line_start = excerpt[..cursor_line_end]
.rfind('\n')
.map(|pos| pos + 1)
.unwrap_or(0);
let cursor_offset = cursor_line_start + cursor_column;
Ok((excerpt, cursor_offset))
}
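The column-resolution rules described in the doc comment above can be sketched as a standalone helper (hypothetical, not part of `ExampleSpec`): `^` points directly at the cursor column, while `<` means the cursor sits at the first non-whitespace character of the marker line.

```rust
// Hypothetical standalone sketch of the marker-line column resolution used by
// `cursor_excerpt` above; returns None when the line carries neither arrow.
fn cursor_column(marker_line: &str) -> Option<usize> {
    if let Some(caret) = marker_line.find('^') {
        // `^` points directly at the cursor column.
        Some(caret)
    } else if marker_line.contains('<') {
        // `<` means the cursor is at the first non-whitespace character.
        marker_line.find(|c: char| !c.is_whitespace())
    } else {
        None
    }
}

fn main() {
    assert_eq!(cursor_column("    //     ^[CURSOR_POSITION]"), Some(11));
    assert_eq!(cursor_column("// <[CURSOR_POSITION]"), Some(0));
    assert_eq!(cursor_column("no marker here"), None);
}
```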
/// Sets the cursor position excerpt from a plain excerpt and cursor byte offset.
///
/// The `line_comment_prefix` is used to format the marker line as a comment.
/// If the cursor column is less than the comment prefix length, the `<` format is used.
/// Otherwise, the `^` format is used.
pub fn set_cursor_excerpt(
&mut self,
excerpt: &str,
cursor_offset: usize,
line_comment_prefix: &str,
) {
// Find which line the cursor is on and its column
let cursor_line_start = excerpt[..cursor_offset]
.rfind('\n')
.map(|pos| pos + 1)
.unwrap_or(0);
let cursor_line_end = excerpt[cursor_line_start..]
.find('\n')
.map(|pos| cursor_line_start + pos + 1)
.unwrap_or(excerpt.len());
let cursor_line = &excerpt[cursor_line_start..cursor_line_end];
let cursor_line_indent = &cursor_line[..cursor_line.len() - cursor_line.trim_start().len()];
let cursor_column = cursor_offset - cursor_line_start;
// Build the marker line
let mut marker_line = String::new();
if cursor_column < line_comment_prefix.len() {
for _ in 0..cursor_column {
marker_line.push(' ');
}
marker_line.push_str(line_comment_prefix);
write!(marker_line, " <{}", CURSOR_POSITION_MARKER).unwrap();
} else {
if cursor_column >= cursor_line_indent.len() + line_comment_prefix.len() {
marker_line.push_str(cursor_line_indent);
}
marker_line.push_str(line_comment_prefix);
while marker_line.len() < cursor_column {
marker_line.push(' ');
}
write!(marker_line, "^{}", CURSOR_POSITION_MARKER).unwrap();
}
// Build the final cursor_position string
let mut result = String::with_capacity(excerpt.len() + marker_line.len() + 2);
result.push_str(&excerpt[..cursor_line_end]);
if !result.ends_with('\n') {
result.push('\n');
}
result.push_str(&marker_line);
if cursor_line_end < excerpt.len() {
result.push('\n');
result.push_str(&excerpt[cursor_line_end..]);
}
self.cursor_position = result;
}
}
#[cfg(test)]
mod tests {
use super::*;
use indoc::indoc;
#[test]
fn test_cursor_excerpt_with_caret() {
let mut spec = ExampleSpec {
name: String::new(),
repository_url: String::new(),
revision: String::new(),
tags: Vec::new(),
reasoning: None,
uncommitted_diff: String::new(),
cursor_path: Path::new("test.rs").into(),
cursor_position: String::new(),
edit_history: String::new(),
expected_patches: Vec::new(),
};
// Cursor before `42`
let excerpt = indoc! {"
fn main() {
let x = 42;
println!(\"{}\", x);
}"
};
let offset = excerpt.find("42").unwrap();
let position_string = indoc! {"
fn main() {
let x = 42;
// ^[CURSOR_POSITION]
println!(\"{}\", x);
}"
}
.to_string();
spec.set_cursor_excerpt(excerpt, offset, "//");
assert_eq!(spec.cursor_position, position_string);
assert_eq!(
spec.cursor_excerpt().unwrap(),
(excerpt.to_string(), offset)
);
// Cursor after `l` in `let`
let offset = excerpt.find("et x").unwrap();
let position_string = indoc! {"
fn main() {
let x = 42;
// ^[CURSOR_POSITION]
println!(\"{}\", x);
}"
}
.to_string();
spec.set_cursor_excerpt(excerpt, offset, "//");
assert_eq!(spec.cursor_position, position_string);
assert_eq!(
spec.cursor_excerpt().unwrap(),
(excerpt.to_string(), offset)
);
// Cursor before `let`
let offset = excerpt.find("let").unwrap();
let position_string = indoc! {"
fn main() {
let x = 42;
// ^[CURSOR_POSITION]
println!(\"{}\", x);
}"
}
.to_string();
spec.set_cursor_excerpt(excerpt, offset, "//");
assert_eq!(spec.cursor_position, position_string);
assert_eq!(
spec.cursor_excerpt().unwrap(),
(excerpt.to_string(), offset)
);
// Cursor at beginning of the line with `let`
let offset = excerpt.find(" let").unwrap();
let position_string = indoc! {"
fn main() {
let x = 42;
// <[CURSOR_POSITION]
println!(\"{}\", x);
}"
}
.to_string();
spec.set_cursor_excerpt(excerpt, offset, "//");
assert_eq!(spec.cursor_position, position_string);
assert_eq!(
spec.cursor_excerpt().unwrap(),
(excerpt.to_string(), offset)
);
// Cursor at end of line, after the semicolon
let offset = excerpt.find(';').unwrap() + 1;
let position_string = indoc! {"
fn main() {
let x = 42;
// ^[CURSOR_POSITION]
println!(\"{}\", x);
}"
}
.to_string();
spec.set_cursor_excerpt(excerpt, offset, "//");
assert_eq!(spec.cursor_position, position_string);
assert_eq!(
spec.cursor_excerpt().unwrap(),
(excerpt.to_string(), offset)
);
// Caret at end of file (no trailing newline)
let excerpt = indoc! {"
fn main() {
let x = 42;"
};
let offset = excerpt.find(';').unwrap() + 1;
let position_string = indoc! {"
fn main() {
let x = 42;
// ^[CURSOR_POSITION]"
}
.to_string();
spec.set_cursor_excerpt(excerpt, offset, "//");
assert_eq!(spec.cursor_position, position_string);
assert_eq!(
spec.cursor_excerpt().unwrap(),
(excerpt.to_string(), offset)
);
}
#[test]
fn test_cursor_excerpt_with_inline_marker() {
let mut spec = ExampleSpec {
name: String::new(),
repository_url: String::new(),
revision: String::new(),
tags: Vec::new(),
reasoning: None,
uncommitted_diff: String::new(),
cursor_path: Path::new("test.rs").into(),
cursor_position: String::new(),
edit_history: String::new(),
expected_patches: Vec::new(),
};
// Cursor before `42` using inline marker
spec.cursor_position = indoc! {"
fn main() {
let x = <|user_cursor|>42;
println!(\"{}\", x);
}"
}
.to_string();
let expected_excerpt = indoc! {"
fn main() {
let x = 42;
println!(\"{}\", x);
}"
};
let expected_offset = expected_excerpt.find("42").unwrap();
assert_eq!(
spec.cursor_excerpt().unwrap(),
(expected_excerpt.to_string(), expected_offset)
);
// Cursor at beginning of line
spec.cursor_position = indoc! {"
fn main() {
<|user_cursor|> let x = 42;
}"
}
.to_string();
let expected_excerpt = indoc! {"
fn main() {
let x = 42;
}"
};
let expected_offset = expected_excerpt.find(" let").unwrap();
assert_eq!(
spec.cursor_excerpt().unwrap(),
(expected_excerpt.to_string(), expected_offset)
);
// Cursor at end of file
spec.cursor_position = "fn main() {}<|user_cursor|>".to_string();
let expected_excerpt = "fn main() {}";
let expected_offset = expected_excerpt.len();
assert_eq!(
spec.cursor_excerpt().unwrap(),
(expected_excerpt.to_string(), expected_offset)
);
}
}


@@ -14,10 +14,8 @@ use anyhow::anyhow;
use collections::HashMap;
use gpui::AsyncApp;
use gpui::Entity;
use language::{Anchor, Buffer, OffsetRangeExt as _, TextBufferSnapshot};
use project::{Project, ProjectPath};
use util::paths::PathStyle;
use util::rel_path::RelPath;
use language::{Anchor, Buffer, OffsetRangeExt as _, TextBufferSnapshot, text_diff};
use project::Project;
#[derive(Clone, Debug)]
pub struct OpenedBuffers(#[allow(unused)] HashMap<String, Entity<Buffer>>);
@@ -30,54 +28,26 @@ pub async fn apply_diff(
) -> Result<OpenedBuffers> {
let mut included_files = HashMap::default();
let worktree_id = project.read_with(cx, |project, cx| {
anyhow::Ok(
project
.visible_worktrees(cx)
.next()
.context("no worktrees")?
.read(cx)
.id(),
)
})??;
for line in diff_str.lines() {
let diff_line = DiffLine::parse(line);
if let DiffLine::OldPath { path } = diff_line {
let buffer = project
.update(cx, |project, cx| {
let project_path = ProjectPath {
worktree_id,
path: RelPath::new(Path::new(path.as_ref()), PathStyle::Posix)?.into_arc(),
};
anyhow::Ok(project.open_buffer(project_path, cx))
})??
.await?;
included_files.insert(path.to_string(), buffer);
}
}
let ranges = [Anchor::MIN..Anchor::MAX];
let mut diff = DiffParser::new(diff_str);
let mut current_file = None;
let mut edits = vec![];
while let Some(event) = diff.next()? {
match event {
DiffEvent::Hunk {
path: file_path,
hunk,
} => {
let (buffer, ranges) = match current_file {
DiffEvent::Hunk { path, hunk } => {
let buffer = match current_file {
None => {
let buffer = included_files
.get_mut(file_path.as_ref())
.expect("Opened all files in diff");
current_file = Some((buffer, ranges.as_slice()));
let buffer = project
.update(cx, |project, cx| {
let project_path = project
.find_project_path(path.as_ref(), cx)
.context("no such path")?;
anyhow::Ok(project.open_buffer(project_path, cx))
})??
.await?;
included_files.insert(path.to_string(), buffer.clone());
current_file = Some(buffer);
current_file.as_ref().unwrap()
}
Some(ref current) => current,
@@ -85,14 +55,14 @@ pub async fn apply_diff(
buffer.read_with(cx, |buffer, _| {
edits.extend(
resolve_hunk_edits_in_buffer(hunk, buffer, ranges)
resolve_hunk_edits_in_buffer(hunk, buffer, ranges.as_slice())
.with_context(|| format!("Diff:\n{diff_str}"))?,
);
anyhow::Ok(())
})??;
}
DiffEvent::FileEnd { renamed_to } => {
let (buffer, _) = current_file
let buffer = current_file
.take()
.context("Got a FileEnd event before a Hunk event")?;
@@ -128,10 +98,69 @@ pub async fn apply_diff(
Ok(OpenedBuffers(included_files))
}
pub fn apply_diff_to_string(diff_str: &str, text: &str) -> Result<String> {
/// Extract the diff for a specific file from a multi-file diff.
/// Returns an error if the file is not found in the diff.
pub fn extract_file_diff(full_diff: &str, file_path: &str) -> Result<String> {
let mut result = String::new();
let mut in_target_file = false;
let mut found_file = false;
for line in full_diff.lines() {
if line.starts_with("diff --git") {
if in_target_file {
break;
}
in_target_file = line.contains(&format!("a/{}", file_path))
|| line.contains(&format!("b/{}", file_path));
if in_target_file {
found_file = true;
}
}
if in_target_file {
result.push_str(line);
result.push('\n');
}
}
if !found_file {
anyhow::bail!("File '{}' not found in diff", file_path);
}
Ok(result)
}
/// Strip unnecessary git metadata lines from a diff, keeping only the lines
/// needed for patch application: path headers (--- and +++), hunk headers (@@),
/// and content lines (+, -, space).
pub fn strip_diff_metadata(diff: &str) -> String {
let mut result = String::new();
for line in diff.lines() {
let diff_line = DiffLine::parse(line);
match diff_line {
// Keep path headers, hunk headers, and content lines
DiffLine::OldPath { .. }
| DiffLine::NewPath { .. }
| DiffLine::HunkHeader(_)
| DiffLine::Context(_)
| DiffLine::Deletion(_)
| DiffLine::Addition(_) => {
result.push_str(line);
result.push('\n');
}
// Skip garbage lines (diff --git, index, etc.)
DiffLine::Garbage(_) => {}
}
}
result
}
pub fn apply_diff_to_string(original: &str, diff_str: &str) -> Result<String> {
let mut diff = DiffParser::new(diff_str);
let mut text = text.to_string();
let mut text = original.to_string();
while let Some(event) = diff.next()? {
match event {
@@ -151,6 +180,51 @@ pub fn apply_diff_to_string(diff_str: &str, text: &str) -> Result<String> {
Ok(text)
}
/// Returns the individual edits that would be applied by a diff to the given content.
/// Each edit is a tuple of (byte_range_in_content, replacement_text).
/// Uses sub-line diffing to find the precise character positions of changes.
/// Returns an empty vec if the hunk context is not found or is ambiguous.
pub fn edits_for_diff(content: &str, diff_str: &str) -> Result<Vec<(Range<usize>, String)>> {
let mut diff = DiffParser::new(diff_str);
let mut result = Vec::new();
while let Some(event) = diff.next()? {
match event {
DiffEvent::Hunk { hunk, .. } => {
if hunk.context.is_empty() {
return Ok(Vec::new());
}
// Find the context in the content
let first_match = content.find(&hunk.context);
let Some(context_offset) = first_match else {
return Ok(Vec::new());
};
// Check for ambiguity - if context appears more than once, reject
if content[context_offset + 1..].contains(&hunk.context) {
return Ok(Vec::new());
}
// Use sub-line diffing to find precise edit positions
for edit in &hunk.edits {
let old_text = &content
[context_offset + edit.range.start..context_offset + edit.range.end];
let edits_within_hunk = text_diff(old_text, &edit.text);
for (inner_range, inner_text) in edits_within_hunk {
let absolute_start = context_offset + edit.range.start + inner_range.start;
let absolute_end = context_offset + edit.range.start + inner_range.end;
result.push((absolute_start..absolute_end, inner_text.to_string()));
}
}
}
DiffEvent::FileEnd { .. } => {}
}
}
Ok(result)
}
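The ambiguity rejection inside `edits_for_diff` reduces to a uniqueness check on the hunk context. A minimal standalone sketch (hypothetical helper, independent of `DiffParser`): accept a match only when the context occurs at exactly one offset.

```rust
// Standalone sketch of the ambiguity check in `edits_for_diff`: a hunk's
// context is usable only when it matches at exactly one offset in the content.
fn unique_match(content: &str, context: &str) -> Option<usize> {
    let first = content.find(context)?;
    // A second occurrence anywhere after the first makes the hunk ambiguous.
    if content[first + 1..].contains(context) {
        None
    } else {
        Some(first)
    }
}

fn main() {
    assert_eq!(unique_match("abc def abc", "def"), Some(4));
    // "abc" appears twice, so the hunk would be rejected.
    assert_eq!(unique_match("abc def abc", "abc"), None);
}
```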
struct PatchFile<'a> {
old_path: Cow<'a, str>,
new_path: Cow<'a, str>,
@@ -873,4 +947,135 @@ mod tests {
FakeFs::new(cx.background_executor.clone())
}
#[test]
fn test_extract_file_diff() {
let multi_file_diff = indoc! {r#"
diff --git a/file1.txt b/file1.txt
index 1234567..abcdefg 100644
--- a/file1.txt
+++ b/file1.txt
@@ -1,3 +1,4 @@
line1
+added line
line2
line3
diff --git a/file2.txt b/file2.txt
index 2345678..bcdefgh 100644
--- a/file2.txt
+++ b/file2.txt
@@ -1,2 +1,2 @@
-old line
+new line
unchanged
"#};
let file1_diff = extract_file_diff(multi_file_diff, "file1.txt").unwrap();
assert_eq!(
file1_diff,
indoc! {r#"
diff --git a/file1.txt b/file1.txt
index 1234567..abcdefg 100644
--- a/file1.txt
+++ b/file1.txt
@@ -1,3 +1,4 @@
line1
+added line
line2
line3
"#}
);
let file2_diff = extract_file_diff(multi_file_diff, "file2.txt").unwrap();
assert_eq!(
file2_diff,
indoc! {r#"
diff --git a/file2.txt b/file2.txt
index 2345678..bcdefgh 100644
--- a/file2.txt
+++ b/file2.txt
@@ -1,2 +1,2 @@
-old line
+new line
unchanged
"#}
);
let result = extract_file_diff(multi_file_diff, "nonexistent.txt");
assert!(result.is_err());
}
#[test]
fn test_edits_for_diff() {
let content = indoc! {"
fn main() {
let x = 1;
let y = 2;
println!(\"{} {}\", x, y);
}
"};
let diff = indoc! {"
--- a/file.rs
+++ b/file.rs
@@ -1,5 +1,5 @@
fn main() {
- let x = 1;
+ let x = 42;
let y = 2;
println!(\"{} {}\", x, y);
}
"};
let edits = edits_for_diff(content, diff).unwrap();
assert_eq!(edits.len(), 1);
let (range, replacement) = &edits[0];
// With sub-line diffing, the edit should start at "1" (the actual changed character)
let expected_start = content.find("let x = 1;").unwrap() + "let x = ".len();
assert_eq!(range.start, expected_start);
// The deleted text is just "1"
assert_eq!(range.end, expected_start + "1".len());
// The replacement text
assert_eq!(replacement, "42");
// Verify the cursor would be positioned at the column of "1"
let line_start = content[..range.start]
.rfind('\n')
.map(|p| p + 1)
.unwrap_or(0);
let cursor_column = range.start - line_start;
// " let x = " is 12 characters, so column 12
assert_eq!(cursor_column, " let x = ".len());
}
#[test]
fn test_strip_diff_metadata() {
let diff_with_metadata = indoc! {r#"
diff --git a/file.txt b/file.txt
index 1234567..abcdefg 100644
--- a/file.txt
+++ b/file.txt
@@ -1,3 +1,4 @@
context line
-removed line
+added line
more context
"#};
let stripped = strip_diff_metadata(diff_with_metadata);
assert_eq!(
stripped,
indoc! {r#"
--- a/file.txt
+++ b/file.txt
@@ -1,3 +1,4 @@
context line
-removed line
+added line
more context
"#}
);
}
}


@@ -1,8 +1,10 @@
use anthropic::{
ANTHROPIC_API_URL, Message, Request as AnthropicRequest, RequestContent,
Response as AnthropicResponse, Role, non_streaming_completion,
ANTHROPIC_API_URL, Event, Message, Request as AnthropicRequest, RequestContent,
Response as AnthropicResponse, ResponseContent, Role, non_streaming_completion,
stream_completion,
};
use anyhow::Result;
use futures::StreamExt as _;
use http_client::HttpClient;
use indoc::indoc;
use reqwest_client::ReqwestClient;
@@ -15,12 +17,12 @@ use std::path::Path;
use std::sync::Arc;
pub struct PlainLlmClient {
http_client: Arc<dyn HttpClient>,
api_key: String,
pub http_client: Arc<dyn HttpClient>,
pub api_key: String,
}
impl PlainLlmClient {
fn new() -> Result<Self> {
pub fn new() -> Result<Self> {
let http_client: Arc<dyn http_client::HttpClient> = Arc::new(ReqwestClient::new());
let api_key = std::env::var("ANTHROPIC_API_KEY")
.map_err(|_| anyhow::anyhow!("ANTHROPIC_API_KEY environment variable not set"))?;
@@ -30,7 +32,7 @@ impl PlainLlmClient {
})
}
async fn generate(
pub async fn generate(
&self,
model: &str,
max_tokens: u64,
@@ -63,6 +65,72 @@ impl PlainLlmClient {
Ok(response)
}
pub async fn generate_streaming<F>(
&self,
model: &str,
max_tokens: u64,
messages: Vec<Message>,
mut on_progress: F,
) -> Result<AnthropicResponse>
where
F: FnMut(usize, &str),
{
let request = AnthropicRequest {
model: model.to_string(),
max_tokens,
messages,
tools: Vec::new(),
thinking: None,
tool_choice: None,
system: None,
metadata: None,
stop_sequences: Vec::new(),
temperature: None,
top_k: None,
top_p: None,
};
let mut stream = stream_completion(
self.http_client.as_ref(),
ANTHROPIC_API_URL,
&self.api_key,
request,
None,
)
.await
.map_err(|e| anyhow::anyhow!("{:?}", e))?;
let mut response: Option<AnthropicResponse> = None;
let mut text_content = String::new();
while let Some(event_result) = stream.next().await {
let event = event_result.map_err(|e| anyhow::anyhow!("{:?}", e))?;
match event {
Event::MessageStart { message } => {
response = Some(message);
}
Event::ContentBlockDelta { delta, .. } => {
if let anthropic::ContentDelta::TextDelta { text } = delta {
text_content.push_str(&text);
on_progress(text_content.len(), &text_content);
}
}
_ => {}
}
}
let mut response = response.ok_or_else(|| anyhow::anyhow!("No response received"))?;
if response.content.is_empty() && !text_content.is_empty() {
response
.content
.push(ResponseContent::Text { text: text_content });
}
Ok(response)
}
}
pub struct BatchingLlmClient {
@@ -408,6 +476,29 @@ impl AnthropicClient {
}
}
#[allow(dead_code)]
pub async fn generate_streaming<F>(
&self,
model: &str,
max_tokens: u64,
messages: Vec<Message>,
on_progress: F,
) -> Result<Option<AnthropicResponse>>
where
F: FnMut(usize, &str),
{
match self {
AnthropicClient::Plain(plain_llm_client) => plain_llm_client
.generate_streaming(model, max_tokens, messages, on_progress)
.await
.map(Some),
AnthropicClient::Batch(_) => {
anyhow::bail!("Streaming not supported with batching client")
}
AnthropicClient::Dummy => panic!("Dummy LLM client is not expected to be used"),
}
}
pub async fn sync_batches(&self) -> Result<()> {
match self {
AnthropicClient::Plain(_) => Ok(()),


@@ -1,20 +1,15 @@
use anyhow::{Result, anyhow};
use anyhow::Result;
use std::mem;
use crate::example::Example;
pub async fn run_distill(example: &mut Example) -> Result<()> {
let [prediction]: [_; 1] =
mem::take(&mut example.predictions)
.try_into()
.map_err(|preds: Vec<_>| {
anyhow!(
"Example has {} predictions, but it should have exactly one",
preds.len()
)
})?;
let predictions = mem::take(&mut example.predictions)
.into_iter()
.map(|p| p.actual_patch)
.collect();
example.spec.expected_patch = prediction.actual_patch;
example.spec.expected_patches = predictions;
example.prompt = None;
example.predictions = Vec::new();
example.score = Vec::new();


@@ -1,4 +1,4 @@
use crate::{PredictionProvider, PromptFormat, metrics::ClassificationMetrics};
use crate::{PredictionProvider, PromptFormat};
use anyhow::{Context as _, Result};
use collections::HashMap;
use edit_prediction::example_spec::ExampleSpec;
@@ -87,7 +87,6 @@ pub struct ExamplePrediction {
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct ExampleScore {
pub delta_chr_f: f32,
pub line_match: ClassificationMetrics,
}
impl Example {
@@ -190,7 +189,11 @@ pub fn read_examples(inputs: &[PathBuf]) -> Vec<Example> {
.collect::<Vec<Example>>(),
),
"md" => {
examples.push(parse_markdown_example(filename, &content).unwrap());
let mut example = parse_markdown_example(&content).unwrap();
if example.spec.name.is_empty() {
example.spec.name = filename;
}
examples.push(example);
}
ext => {
panic!("{} has invalid example extension `{ext}`", path.display())
@@ -236,8 +239,8 @@ pub fn group_examples_by_repo(examples: &mut [Example]) -> Vec<Vec<&mut Example>
examples_by_repo.into_values().collect()
}
fn parse_markdown_example(name: String, input: &str) -> Result<Example> {
let spec = ExampleSpec::from_markdown(name, input)?;
fn parse_markdown_example(input: &str) -> Result<Example> {
let spec = ExampleSpec::from_markdown(input)?;
Ok(Example {
spec,
buffer: None,


@@ -30,7 +30,12 @@ pub async fn run_format_prompt(
let prompt = TeacherPrompt::format_prompt(example);
example.prompt = Some(ExamplePrompt {
input: prompt,
expected_output: example.spec.expected_patch.clone(), // TODO
expected_output: example
.spec
.expected_patches
.first()
.cloned()
.unwrap_or_default(),
format: prompt_format,
});
}
@@ -68,8 +73,15 @@ pub async fn run_format_prompt(
))
})??;
let prompt = format_zeta_prompt(&input);
let expected_output =
zeta2_output_for_patch(&input, &example.spec.expected_patch.clone())?;
let expected_output = zeta2_output_for_patch(
&input,
&example
.spec
.expected_patches
.first()
.context("expected patches is empty")?
.clone(),
)?;
example.prompt = Some(ExamplePrompt {
input: prompt,
expected_output,
@@ -86,6 +98,7 @@ impl TeacherPrompt {
const PROMPT: &str = include_str!("teacher.prompt.md");
pub(crate) const EDITABLE_REGION_START: &str = "<|editable_region_start|>\n";
pub(crate) const EDITABLE_REGION_END: &str = "<|editable_region_end|>";
pub(crate) const USER_CURSOR_MARKER: &str = "<|user_cursor|>";
/// Truncate edit history to this number of last lines
const MAX_HISTORY_LINES: usize = 128;
@@ -181,13 +194,15 @@ impl TeacherPrompt {
result.push_str(Self::EDITABLE_REGION_START);
// TODO: control number of lines around cursor
result.push_str(&example.spec.cursor_position);
if !example.spec.cursor_position.ends_with('\n') {
let (mut excerpt, offset) = example.spec.cursor_excerpt().unwrap();
excerpt.insert_str(offset, Self::USER_CURSOR_MARKER);
result.push_str(&excerpt);
if !result.ends_with('\n') {
result.push('\n');
}
result.push_str(&format!("{}\n", Self::EDITABLE_REGION_END));
result.push_str("`````");
result.push_str(Self::EDITABLE_REGION_END);
result.push_str("\n`````");
result
}


@@ -0,0 +1,110 @@
use anyhow::{Context as _, Result};
use collections::HashMap;
use futures::lock::{Mutex, OwnedMutexGuard};
use std::{
cell::RefCell,
path::{Path, PathBuf},
sync::Arc,
};
use crate::paths::REPOS_DIR;
thread_local! {
static REPO_LOCKS: RefCell<HashMap<PathBuf, Arc<Mutex<()>>>> = RefCell::new(HashMap::default());
}
#[must_use]
pub async fn lock_repo(path: impl AsRef<Path>) -> OwnedMutexGuard<()> {
REPO_LOCKS
.with(|cell| {
cell.borrow_mut()
.entry(path.as_ref().to_path_buf())
.or_default()
.clone()
})
.lock_owned()
.await
}
pub async fn run_git(repo_path: &Path, args: &[&str]) -> Result<String> {
let output = smol::process::Command::new("git")
.current_dir(repo_path)
.args(args)
.output()
.await?;
anyhow::ensure!(
output.status.success(),
"`git {}` within `{}` failed with status: {}\nstderr:\n{}\nstdout:\n{}",
args.join(" "),
repo_path.display(),
output.status,
String::from_utf8_lossy(&output.stderr),
String::from_utf8_lossy(&output.stdout),
);
Ok(String::from_utf8(output.stdout)?.trim().to_string())
}
pub fn parse_repo_url(url: &str) -> Result<(String, String)> {
if url.contains('@') {
let (_, path) = url.split_once(':').context("expected : in git url")?;
let (owner, repo) = path.split_once('/').context("expected / in git url")?;
Ok((owner.to_string(), repo.trim_end_matches(".git").to_string()))
} else {
let parsed = http_client::Url::parse(url)?;
let mut segments = parsed.path_segments().context("empty http url")?;
let owner = segments.next().context("expected owner")?;
let repo = segments.next().context("expected repo")?;
Ok((owner.to_string(), repo.trim_end_matches(".git").to_string()))
}
}
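The scp-like branch of `parse_repo_url` can be illustrated in isolation (a hypothetical re-implementation for the SSH form only; the real function delegates HTTPS URLs to `http_client::Url`):

```rust
// Hypothetical standalone version of the owner/repo extraction above, handling
// only the scp-like SSH form (git@host:owner/repo.git); the real function
// parses HTTPS URLs with a proper URL parser instead.
fn parse_ssh_repo_url(url: &str) -> Option<(String, String)> {
    let (_, path) = url.split_once(':')?;
    let (owner, repo) = path.split_once('/')?;
    Some((owner.to_string(), repo.trim_end_matches(".git").to_string()))
}

fn main() {
    assert_eq!(
        parse_ssh_repo_url("git@github.com:zed-industries/zed.git"),
        Some(("zed-industries".to_string(), "zed".to_string()))
    );
    // No colon means this is not an scp-like URL.
    assert_eq!(parse_ssh_repo_url("not-a-url"), None);
}
```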
pub fn repo_path_for_url(url: &str) -> Result<PathBuf> {
let (owner, name) = parse_repo_url(url)?;
Ok(REPOS_DIR.join(&owner).join(&name))
}
pub async fn ensure_repo_cloned(repo_url: &str) -> Result<PathBuf> {
let repo_path = repo_path_for_url(repo_url)?;
let _lock = lock_repo(&repo_path).await;
if !repo_path.is_dir() {
log::info!("Cloning {} into {:?}", repo_url, repo_path);
std::fs::create_dir_all(&repo_path)?;
run_git(&repo_path, &["init"]).await?;
run_git(&repo_path, &["remote", "add", "origin", repo_url]).await?;
}
// Always fetch to get latest commits
run_git(&repo_path, &["fetch", "origin"]).await?;
// Check if we have a valid HEAD, if not checkout FETCH_HEAD
let has_head = run_git(&repo_path, &["rev-parse", "HEAD"]).await.is_ok();
if !has_head {
// Use reset to set HEAD without needing a branch
run_git(&repo_path, &["reset", "--hard", "FETCH_HEAD"]).await?;
}
Ok(repo_path)
}
pub async fn fetch_if_needed(repo_path: &Path, revision: &str) -> Result<String> {
let resolved = run_git(
repo_path,
&["rev-parse", &format!("{}^{{commit}}", revision)],
)
.await;
if let Ok(sha) = resolved {
return Ok(sha);
}
if run_git(repo_path, &["fetch", "--depth", "1", "origin", revision])
.await
.is_err()
{
run_git(repo_path, &["fetch", "origin"]).await?;
}
run_git(repo_path, &["rev-parse", "FETCH_HEAD"]).await
}


@@ -1,29 +1,19 @@
use crate::{
example::{Example, ExampleBuffer, ExampleState},
git,
headless::EpAppState,
paths::{REPOS_DIR, WORKTREES_DIR},
paths::WORKTREES_DIR,
progress::{InfoStyle, Progress, Step, StepProgress},
};
use anyhow::{Context as _, Result};
use collections::HashMap;
use edit_prediction::EditPredictionStore;
use edit_prediction::udiff::OpenedBuffers;
use futures::{
AsyncWriteExt as _,
lock::{Mutex, OwnedMutexGuard},
};
use futures::AsyncWriteExt as _;
use gpui::{AsyncApp, Entity};
use language::{Anchor, Buffer, LanguageNotFound, ToOffset, ToPoint};
use project::Project;
use project::buffer_store::BufferStoreEvent;
use project::{Project, ProjectPath};
use std::{
cell::RefCell,
fs,
path::{Path, PathBuf},
sync::Arc,
};
use util::{paths::PathStyle, rel_path::RelPath};
use zeta_prompt::CURSOR_MARKER;
use std::{fs, path::PathBuf, sync::Arc};
pub async fn run_load_project(
example: &mut Example,
@@ -86,37 +76,22 @@ async fn cursor_position(
return Err(error);
}
let worktree = project.read_with(cx, |project, cx| {
project
.visible_worktrees(cx)
.next()
.context("No visible worktrees")
})??;
let cursor_path = RelPath::new(&example.spec.cursor_path, PathStyle::Posix)
.context("Failed to create RelPath")?
.into_arc();
let cursor_buffer = project
.update(cx, |project, cx| {
project.open_buffer(
ProjectPath {
worktree_id: worktree.read(cx).id(),
path: cursor_path,
},
cx,
)
let cursor_path = project
.read_with(cx, |project, cx| {
project.find_project_path(&example.spec.cursor_path, cx)
})?
.with_context(|| {
format!(
"failed to find cursor path {}",
example.spec.cursor_path.display()
)
})?;
let cursor_buffer = project
.update(cx, |project, cx| project.open_buffer(cursor_path, cx))?
.await?;
let cursor_offset_within_excerpt = example
.spec
.cursor_position
.find(CURSOR_MARKER)
.context("missing cursor marker")?;
let mut cursor_excerpt = example.spec.cursor_position.clone();
cursor_excerpt.replace_range(
cursor_offset_within_excerpt..(cursor_offset_within_excerpt + CURSOR_MARKER.len()),
"",
);
let (cursor_excerpt, cursor_offset_within_excerpt) = example.spec.cursor_excerpt()?;
let excerpt_offset = cursor_buffer.read_with(cx, |buffer, _cx| {
let text = buffer.text();
@@ -212,17 +187,17 @@ async fn setup_project(
async fn setup_worktree(example: &Example, step_progress: &StepProgress) -> Result<PathBuf> {
let (repo_owner, repo_name) = example.repo_name().context("failed to get repo name")?;
let repo_dir = REPOS_DIR.join(repo_owner.as_ref()).join(repo_name.as_ref());
let repo_dir = git::repo_path_for_url(&example.spec.repository_url)?;
let worktree_path = WORKTREES_DIR
.join(repo_owner.as_ref())
.join(repo_name.as_ref());
let repo_lock = lock_repo(&repo_dir).await;
let repo_lock = git::lock_repo(&repo_dir).await;
if !repo_dir.is_dir() {
step_progress.set_substatus(format!("cloning {}", repo_name));
fs::create_dir_all(&repo_dir)?;
run_git(&repo_dir, &["init"]).await?;
run_git(
git::run_git(&repo_dir, &["init"]).await?;
git::run_git(
&repo_dir,
&["remote", "add", "origin", &example.spec.repository_url],
)
@@ -230,53 +205,26 @@ async fn setup_worktree(example: &Example, step_progress: &StepProgress) -> Resu
}
// Resolve the example to a revision, fetching it if needed.
let revision = run_git(
&repo_dir,
&[
"rev-parse",
&format!("{}^{{commit}}", example.spec.revision),
],
)
.await;
let revision = if let Ok(revision) = revision {
revision
} else {
step_progress.set_substatus("fetching");
if run_git(
&repo_dir,
&["fetch", "--depth", "1", "origin", &example.spec.revision],
)
.await
.is_err()
{
run_git(&repo_dir, &["fetch", "origin"]).await?;
}
let revision = run_git(&repo_dir, &["rev-parse", "FETCH_HEAD"]).await?;
revision
};
step_progress.set_substatus("fetching");
let revision = git::fetch_if_needed(&repo_dir, &example.spec.revision).await?;
// Create the worktree for this example if needed.
step_progress.set_substatus("preparing worktree");
if worktree_path.is_dir() {
run_git(&worktree_path, &["clean", "--force", "-d"]).await?;
run_git(&worktree_path, &["reset", "--hard", "HEAD"]).await?;
run_git(&worktree_path, &["checkout", revision.as_str()]).await?;
git::run_git(&worktree_path, &["clean", "--force", "-d"]).await?;
git::run_git(&worktree_path, &["reset", "--hard", "HEAD"]).await?;
git::run_git(&worktree_path, &["checkout", revision.as_str()]).await?;
} else {
let worktree_path_string = worktree_path.to_string_lossy();
run_git(
let branch_name = example.spec.filename();
git::run_git(
&repo_dir,
&["branch", "-f", &example.spec.name, revision.as_str()],
&["branch", "-f", &branch_name, revision.as_str()],
)
.await?;
run_git(
git::run_git(
&repo_dir,
&[
"worktree",
"add",
"-f",
&worktree_path_string,
&example.spec.name,
],
&["worktree", "add", "-f", &worktree_path_string, &branch_name],
)
.await?;
}
@@ -319,39 +267,3 @@ async fn apply_edit_history(
) -> Result<OpenedBuffers> {
edit_prediction::udiff::apply_diff(&example.spec.edit_history, project, cx).await
}
thread_local! {
static REPO_LOCKS: RefCell<HashMap<PathBuf, Arc<Mutex<()>>>> = RefCell::new(HashMap::default());
}
#[must_use]
pub async fn lock_repo(path: impl AsRef<Path>) -> OwnedMutexGuard<()> {
REPO_LOCKS
.with(|cell| {
cell.borrow_mut()
.entry(path.as_ref().to_path_buf())
.or_default()
.clone()
})
.lock_owned()
.await
}
async fn run_git(repo_path: &Path, args: &[&str]) -> Result<String> {
let output = smol::process::Command::new("git")
.current_dir(repo_path)
.args(args)
.output()
.await?;
anyhow::ensure!(
output.status.success(),
"`git {}` within `{}` failed with status: {}\nstderr:\n{}\nstdout:\n{}",
args.join(" "),
repo_path.display(),
output.status,
String::from_utf8_lossy(&output.stderr),
String::from_utf8_lossy(&output.stdout),
);
Ok(String::from_utf8(output.stdout)?.trim().to_string())
}


@@ -2,6 +2,7 @@ mod anthropic_client;
mod distill;
mod example;
mod format_prompt;
mod git;
mod headless;
mod load_project;
mod metrics;
@@ -10,6 +11,7 @@ mod predict;
mod progress;
mod retrieve_context;
mod score;
mod synthesize;
use clap::{Args, CommandFactory, Parser, Subcommand, ValueEnum};
use edit_prediction::EditPredictionStore;
@@ -28,6 +30,7 @@ use crate::predict::run_prediction;
use crate::progress::Progress;
use crate::retrieve_context::run_context_retrieval;
use crate::score::run_scoring;
use crate::synthesize::{SynthesizeConfig, run_synthesize};
#[derive(Parser, Debug)]
#[command(name = "ep")]
@@ -67,6 +70,8 @@ enum Command {
Distill,
/// Print aggregated scores
Eval(PredictArgs),
/// Generate eval examples by analyzing git commits from a repository
Synthesize(SynthesizeArgs),
/// Remove git repositories and worktrees
Clean,
}
@@ -118,6 +123,9 @@ impl Display for Command {
.unwrap()
.get_name()
),
Command::Synthesize(args) => {
write!(f, "synthesize --repo={}", args.repo)
}
Command::Clean => write!(f, "clean"),
}
}
@@ -143,7 +151,7 @@ struct PredictArgs {
repetitions: usize,
}
#[derive(Clone, Copy, Debug, ValueEnum, Serialize, Deserialize)]
#[derive(Clone, Copy, Debug, PartialEq, ValueEnum, Serialize, Deserialize)]
enum PredictionProvider {
Sweep,
Mercury,
@@ -153,6 +161,29 @@ enum PredictionProvider {
TeacherNonBatching,
}
#[derive(Debug, Args)]
struct SynthesizeArgs {
/// Repository URL (git@github.com:owner/repo or https://...)
#[clap(long)]
repo: String,
/// Number of examples to generate
#[clap(long, default_value_t = 5)]
count: usize,
/// Maximum commits to scan before giving up
#[clap(long, default_value_t = 100)]
max_commits: usize,
/// Only generate examples that require retrieved context to make a correct prediction
#[clap(long)]
require_context: bool,
/// Ignore state file and reprocess all commits
#[clap(long)]
fresh: bool,
}
impl EpArgs {
fn output_path(&self) -> Option<PathBuf> {
if self.in_place {
@@ -189,6 +220,26 @@ fn main() {
std::fs::remove_dir_all(&*paths::DATA_DIR).unwrap();
return;
}
Command::Synthesize(synth_args) => {
let Some(output_dir) = args.output else {
panic!("output dir is required");
};
let config = SynthesizeConfig {
repo_url: synth_args.repo.clone(),
count: synth_args.count,
max_commits: synth_args.max_commits,
output_dir,
require_context: synth_args.require_context,
fresh: synth_args.fresh,
};
smol::block_on(async {
if let Err(e) = run_synthesize(config).await {
eprintln!("Error: {:?}", e);
std::process::exit(1);
}
});
return;
}
_ => {}
}
@@ -256,7 +307,7 @@ fn main() {
run_scoring(example, &args, app_state.clone(), cx.clone())
.await?;
}
Command::Clean => {
Command::Clean | Command::Synthesize(_) => {
unreachable!()
}
}

View File

@@ -1,34 +1,17 @@
use collections::{HashMap, HashSet};
use edit_prediction::udiff::DiffLine;
use serde::{Deserialize, Serialize};
use collections::HashMap;
type Counts = HashMap<String, usize>;
type CountsDelta = HashMap<String, isize>;
#[derive(Default, Debug, Clone, Serialize, Deserialize)]
pub struct ClassificationMetrics {
pub true_positives: usize,
pub false_positives: usize,
pub false_negatives: usize,
#[derive(Default, Debug, Clone)]
struct ClassificationMetrics {
true_positives: usize,
false_positives: usize,
false_negatives: usize,
}
impl ClassificationMetrics {
pub fn from_sets(
expected: &HashSet<String>,
actual: &HashSet<String>,
) -> ClassificationMetrics {
let true_positives = expected.intersection(actual).count();
let false_positives = actual.difference(expected).count();
let false_negatives = expected.difference(actual).count();
ClassificationMetrics {
true_positives,
false_positives,
false_negatives,
}
}
pub fn from_counts(expected: &Counts, actual: &Counts) -> ClassificationMetrics {
fn from_counts(expected: &Counts, actual: &Counts) -> ClassificationMetrics {
let mut true_positives = 0;
let mut false_positives = 0;
let mut false_negatives = 0;
@@ -56,27 +39,7 @@ impl ClassificationMetrics {
}
}
pub fn aggregate<'a>(
scores: impl Iterator<Item = &'a ClassificationMetrics>,
) -> ClassificationMetrics {
let mut true_positives = 0;
let mut false_positives = 0;
let mut false_negatives = 0;
for score in scores {
true_positives += score.true_positives;
false_positives += score.false_positives;
false_negatives += score.false_negatives;
}
ClassificationMetrics {
true_positives,
false_positives,
false_negatives,
}
}
pub fn precision(&self) -> f64 {
fn precision(&self) -> f64 {
if self.true_positives + self.false_positives == 0 {
0.0
} else {
@@ -84,42 +47,13 @@ impl ClassificationMetrics {
}
}
pub fn recall(&self) -> f64 {
fn recall(&self) -> f64 {
if self.true_positives + self.false_negatives == 0 {
0.0
} else {
self.true_positives as f64 / (self.true_positives + self.false_negatives) as f64
}
}
pub fn f1_score(&self) -> f64 {
let recall = self.recall();
let precision = self.precision();
if precision + recall == 0.0 {
0.0
} else {
2.0 * precision * recall / (precision + recall)
}
}
}
pub fn line_match_score(
expected_patch: &[DiffLine],
actual_patch: &[DiffLine],
) -> ClassificationMetrics {
let expected_change_lines = expected_patch
.iter()
.filter(|line| matches!(line, DiffLine::Addition(_) | DiffLine::Deletion(_)))
.map(|line| line.to_string())
.collect();
let actual_change_lines = actual_patch
.iter()
.filter(|line| matches!(line, DiffLine::Addition(_) | DiffLine::Deletion(_)))
.map(|line| line.to_string())
.collect();
ClassificationMetrics::from_sets(&expected_change_lines, &actual_change_lines)
}
enum ChrfWhitespace {
@@ -135,55 +69,26 @@ const CHR_F_WHITESPACE: ChrfWhitespace = ChrfWhitespace::Ignore;
/// Computes a delta-chrF score that compares two sets of edits.
///
/// This metric works by:
/// 1. Reconstructing original, golden (expected result), and actual texts from diffs
/// 2. Computing n-gram count differences (deltas) between original→golden and original→actual
/// 3. Comparing these deltas to measure how well actual edits match expected edits
pub fn delta_chr_f(expected: &[DiffLine], actual: &[DiffLine]) -> f64 {
// Reconstruct texts from diffs
let mut original_text = String::new(); // state of the text before any edits
let mut golden_text = String::new(); // text after applying golden edits
let mut actual_text = String::new(); // text after applying actual edits
for line in expected {
match line {
DiffLine::Context(s) => {
original_text.push_str(s);
golden_text.push_str(s);
}
DiffLine::Deletion(s) => {
original_text.push_str(s);
}
DiffLine::Addition(s) => {
golden_text.push_str(s);
}
_ => {}
}
}
for line in actual {
match line {
DiffLine::Context(s) | DiffLine::Addition(s) => {
actual_text.push_str(s);
}
_ => {}
}
}
// Edge case
if original_text == golden_text && golden_text == actual_text {
/// 1. Computing n-gram count differences (deltas) between original→expected and original→actual
/// 2. Comparing these deltas to measure how well actual edits match expected edits
///
/// Returns a score from 0.0 to 100.0, where 100.0 means the actual edits perfectly match
/// the expected edits.
pub fn delta_chr_f(original: &str, expected: &str, actual: &str) -> f64 {
// Edge case: if all texts are identical, the edits match perfectly
if original == expected && expected == actual {
return 100.0;
}
// Compute the metric
let original_ngrams = chr_f_ngram_counts(&original_text);
let golden_ngrams = chr_f_ngram_counts(&golden_text);
let actual_ngrams = chr_f_ngram_counts(&actual_text);
let original_ngrams = chr_f_ngram_counts(original);
let expected_ngrams = chr_f_ngram_counts(expected);
let actual_ngrams = chr_f_ngram_counts(actual);
let mut total_precision = 0.0;
let mut total_recall = 0.0;
for order in 0..CHR_F_CHAR_ORDER {
let expected_delta = compute_ngram_delta(&golden_ngrams[order], &original_ngrams[order]);
let expected_delta = compute_ngram_delta(&expected_ngrams[order], &original_ngrams[order]);
let actual_delta = compute_ngram_delta(&actual_ngrams[order], &original_ngrams[order]);
if expected_delta.is_empty() && actual_delta.is_empty() {
@@ -255,7 +160,7 @@ fn ngram_delta_to_counts(delta: &CountsDelta) -> Counts {
for (ngram, &delta) in delta {
if delta > 0 {
counts.insert(ngram.clone(), delta as usize);
} else {
} else if delta < 0 {
counts.insert(format!("¬{ngram}"), delta.unsigned_abs());
}
}
@@ -278,94 +183,68 @@ fn count_ngrams(text: &str, n: usize) -> Counts {
#[cfg(test)]
mod test {
use super::*;
use edit_prediction::udiff::DiffLine;
#[test]
fn test_delta_chr_f_perfect_match() {
let diff = vec![
DiffLine::Context("fn main() {"),
DiffLine::Deletion(" println!(\"Hello\");"),
DiffLine::Addition(" println!(\"Hello, World!\");"),
DiffLine::Context("}"),
];
let original = "fn main() { println!(\"Hello\");}";
let expected = "fn main() { println!(\"Hello, World!\");}";
let score = delta_chr_f(&diff, &diff);
let score = delta_chr_f(original, expected, expected);
assert!((score - 100.0).abs() < 1e-2);
}
#[test]
fn test_delta_chr_f_wrong_edit() {
// When the edit is wrong
let expected = vec![
DiffLine::Context("one "),
DiffLine::Deletion("two "),
DiffLine::Context("three"),
];
let actual = vec![
DiffLine::Context("one "),
DiffLine::Context("two "),
DiffLine::Deletion("three"),
DiffLine::Addition("four"),
];
let original = "one two three";
let expected = "one three"; // deleted "two "
let actual = "one two four"; // deleted "three", added "four"
// Then the score should be low
let score = delta_chr_f(&expected, &actual);
let score = delta_chr_f(original, expected, actual);
assert!(score > 20.0 && score < 40.0);
}
#[test]
fn test_delta_chr_f_partial_match() {
let expected = vec![
DiffLine::Deletion("let x = 42;"),
DiffLine::Addition("let x = 100;"),
];
let actual = vec![
DiffLine::Deletion("let x = 42;"),
DiffLine::Addition("let x = 99;"),
];
let original = "let x = 42;";
let expected = "let x = 100;";
let actual = "let x = 99;";
// We got the edit location right, but the replacement text is wrong.
// Deleted ngrams will match, bringing the score somewhere in the middle.
let score = delta_chr_f(&expected, &actual);
let score = delta_chr_f(original, expected, actual);
assert!(score > 40.0 && score < 60.0);
}
#[test]
fn test_delta_chr_f_missed_edit() {
// When the prediction makes no changes
let expected = vec![
DiffLine::Context("prefix "),
DiffLine::Deletion("old"),
DiffLine::Addition("new"),
DiffLine::Context(" suffix"),
];
let actual = vec![
DiffLine::Context("prefix "),
DiffLine::Context("old"),
DiffLine::Context(" suffix"),
];
let original = "prefix old suffix";
let expected = "prefix new suffix";
let actual = "prefix old suffix"; // no change
// Then the score should be low (all expected changes are false negatives)
let score = delta_chr_f(&expected, &actual);
let score = delta_chr_f(original, expected, actual);
assert!(score < 20.0);
}
#[test]
fn test_delta_chr_f_extra_edit() {
// When adding unexpected content
let expected = vec![DiffLine::Context("hello"), DiffLine::Context("world")];
let actual = vec![
DiffLine::Context("hello"),
DiffLine::Addition("extra"),
DiffLine::Context("world"),
];
let original = "helloworld";
let expected = "helloworld"; // no change expected
let actual = "helloextraworld"; // added "extra"
// Then the score should be low (all actual changes are false positives)
let score = delta_chr_f(&expected, &actual);
let score = delta_chr_f(original, expected, actual);
assert!(score < 20.0);
}
#[test]
fn test_delta_chr_f_no_changes() {
let text = "unchanged text";
let score = delta_chr_f(text, text, text);
assert!((score - 100.0).abs() < 1e-2);
}
}

View File

@@ -17,7 +17,11 @@ pub static RUN_DIR: LazyLock<PathBuf> = LazyLock::new(|| {
.join(chrono::Local::now().format("%d-%m-%y-%H_%M_%S").to_string())
});
pub static LATEST_EXAMPLE_RUN_DIR: LazyLock<PathBuf> = LazyLock::new(|| DATA_DIR.join("latest"));
pub static LATEST_FAILED_EXAMPLES_DIR: LazyLock<PathBuf> =
LazyLock::new(|| DATA_DIR.join("latest_failed"));
pub static LLM_CACHE_DB: LazyLock<PathBuf> = LazyLock::new(|| CACHE_DIR.join("llm_cache.sqlite"));
pub static SYNTHESIZE_STATE_FILE: LazyLock<PathBuf> =
LazyLock::new(|| DATA_DIR.join("synthesize_state.json"));
pub static FAILED_EXAMPLES_DIR: LazyLock<PathBuf> =
LazyLock::new(|| ensure_dir(&RUN_DIR.join("failed")));

View File

@@ -28,12 +28,16 @@ pub async fn run_prediction(
app_state: Arc<EpAppState>,
mut cx: AsyncApp,
) -> anyhow::Result<()> {
if !example.predictions.is_empty() {
return Ok(());
}
let provider = provider.context("provider is required")?;
if let Some(existing_prediction) = example.predictions.first() {
if existing_prediction.provider == provider {
return Ok(());
} else {
example.predictions.clear();
}
}
run_context_retrieval(example, app_state.clone(), cx.clone()).await?;
if matches!(
@@ -184,7 +188,9 @@ pub async fn run_prediction(
let actual_patch = prediction
.and_then(|prediction| {
let prediction = prediction.prediction.ok()?;
prediction.edit_preview.as_unified_diff(&prediction.edits)
prediction
.edit_preview
.as_unified_diff(prediction.snapshot.file(), &prediction.edits)
})
.unwrap_or_default();

View File

@@ -46,6 +46,7 @@ pub enum Step {
FormatPrompt,
Predict,
Score,
Synthesize,
}
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
@@ -62,6 +63,7 @@ impl Step {
Step::FormatPrompt => "Format",
Step::Predict => "Predict",
Step::Score => "Score",
Step::Synthesize => "Synthesize",
}
}
@@ -72,6 +74,7 @@ impl Step {
Step::FormatPrompt => "\x1b[34m",
Step::Predict => "\x1b[32m",
Step::Score => "\x1b[31m",
Step::Synthesize => "\x1b[36m",
}
}
}

View File

@@ -2,11 +2,12 @@ use crate::{
PredictArgs,
example::{Example, ExampleScore},
headless::EpAppState,
metrics::{self, ClassificationMetrics},
metrics,
predict::run_prediction,
progress::{Progress, Step},
};
use edit_prediction::udiff::DiffLine;
use anyhow::Context as _;
use edit_prediction::udiff::apply_diff_to_string;
use gpui::AsyncApp;
use std::sync::Arc;
@@ -27,18 +28,32 @@ pub async fn run_scoring(
let _progress = Progress::global().start(Step::Score, &example.spec.name);
let expected_patch = parse_patch(&example.spec.expected_patch);
let original_text = &example.buffer.as_ref().unwrap().content;
let expected_texts: Vec<String> = example
.spec
.expected_patches
.iter()
.map(|patch| {
apply_diff_to_string(original_text, patch)
.with_context(|| format!("Expected patch did not apply for {}", example.spec.name))
})
.collect::<Result<Vec<_>, _>>()?;
let mut scores = vec![];
for pred in &example.predictions {
let actual_patch = parse_patch(&pred.actual_patch);
let line_match = metrics::line_match_score(&expected_patch, &actual_patch);
let delta_chr_f = metrics::delta_chr_f(&expected_patch, &actual_patch) as f32;
for prediction in &example.predictions {
let actual_text = match apply_diff_to_string(original_text, &prediction.actual_patch) {
Ok(text) => text,
Err(_) => {
scores.push(ExampleScore { delta_chr_f: 0.0 });
continue;
}
};
let best_delta_chr_f = expected_texts
.iter()
.map(|expected| metrics::delta_chr_f(original_text, expected, &actual_text) as f32)
.fold(0.0, f32::max);
scores.push(ExampleScore {
delta_chr_f,
line_match,
delta_chr_f: best_delta_chr_f,
});
}
@@ -46,42 +61,25 @@ pub async fn run_scoring(
Ok(())
}
fn parse_patch(patch: &str) -> Vec<DiffLine<'_>> {
patch.lines().map(DiffLine::parse).collect()
}
pub fn print_report(examples: &[Example]) {
eprintln!(
"──────────────────────────────────────────────────────────────────────────────────────"
);
eprintln!(
"{:<30} {:>4} {:>4} {:>4} {:>10} {:>8} {:>8} {:>10}",
"Example name", "TP", "FP", "FN", "Precision", "Recall", "F1", "DeltaChrF"
);
eprintln!("{:<50} {:>10}", "Example name", "DeltaChrF");
eprintln!(
"──────────────────────────────────────────────────────────────────────────────────────"
);
let mut all_line_match_scores = Vec::new();
let mut all_delta_chr_f_scores = Vec::new();
for example in examples {
for score in example.score.iter() {
let line_match = &score.line_match;
eprintln!(
"{:<30} {:>4} {:>4} {:>4} {:>9.2}% {:>7.2}% {:>7.2}% {:>9.2}",
truncate_name(&example.spec.name, 30),
line_match.true_positives,
line_match.false_positives,
line_match.false_negatives,
line_match.precision() * 100.0,
line_match.recall() * 100.0,
line_match.f1_score() * 100.0,
"{:<50} {:>9.2}",
truncate_name(&example.spec.name, 50),
score.delta_chr_f
);
all_line_match_scores.push(line_match.clone());
all_delta_chr_f_scores.push(score.delta_chr_f);
}
}
@@ -90,22 +88,11 @@ pub fn print_report(examples: &[Example]) {
"──────────────────────────────────────────────────────────────────────────────────────"
);
if !all_line_match_scores.is_empty() {
let total_line_match = ClassificationMetrics::aggregate(all_line_match_scores.iter());
if !all_delta_chr_f_scores.is_empty() {
let avg_delta_chr_f: f32 =
all_delta_chr_f_scores.iter().sum::<f32>() / all_delta_chr_f_scores.len() as f32;
eprintln!(
"{:<30} {:>4} {:>4} {:>4} {:>9.2}% {:>7.2}% {:>7.2}% {:>9.2}",
"TOTAL",
total_line_match.true_positives,
total_line_match.false_positives,
total_line_match.false_negatives,
total_line_match.precision() * 100.0,
total_line_match.recall() * 100.0,
total_line_match.f1_score() * 100.0,
avg_delta_chr_f
);
eprintln!("{:<50} {:>9.2}", "AVERAGE", avg_delta_chr_f);
eprintln!(
"──────────────────────────────────────────────────────────────────────────────────────"
);

View File

@@ -0,0 +1,902 @@
use crate::{
anthropic_client::PlainLlmClient,
git::{ensure_repo_cloned, run_git},
paths::{FAILED_EXAMPLES_DIR, LATEST_FAILED_EXAMPLES_DIR, SYNTHESIZE_STATE_FILE},
progress::{InfoStyle, Progress, Step, StepProgress},
};
use anthropic::ResponseContent;
use anyhow::{Context as _, Result};
use chrono::Local;
use collections::{HashMap, HashSet};
use edit_prediction::{
example_spec::ExampleSpec,
udiff::{apply_diff_to_string, edits_for_diff},
};
use indoc::indoc;
use serde::{Deserialize, Serialize};
use std::{
path::{Path, PathBuf},
sync::Arc,
};
#[derive(Debug, Clone)]
pub struct SynthesizeConfig {
pub repo_url: String,
pub count: usize,
pub max_commits: usize,
pub output_dir: PathBuf,
pub require_context: bool,
pub fresh: bool,
}
#[derive(Debug, Default, Serialize, Deserialize)]
struct SynthesizeState {
repositories: HashMap<String, RepoState>,
}
#[derive(Debug, Default, Serialize, Deserialize)]
struct RepoState {
processed_commits: HashSet<String>,
examples_generated: usize,
}
impl SynthesizeState {
fn load() -> Self {
if SYNTHESIZE_STATE_FILE.exists() {
std::fs::read_to_string(&*SYNTHESIZE_STATE_FILE)
.ok()
.and_then(|s| serde_json::from_str(&s).ok())
.unwrap_or_default()
} else {
Self::default()
}
}
fn save(&self) -> Result<()> {
let content = serde_json::to_string_pretty(self)?;
std::fs::write(&*SYNTHESIZE_STATE_FILE, content)?;
Ok(())
}
fn is_processed(&self, repo_url: &str, commit_sha: &str) -> bool {
self.repositories
.get(repo_url)
.is_some_and(|repo| repo.processed_commits.contains(commit_sha))
}
fn mark_processed(&mut self, repo_url: &str, commit_sha: &str, examples_count: usize) {
let repo = self.repositories.entry(repo_url.to_string()).or_default();
repo.processed_commits.insert(commit_sha.to_string());
repo.examples_generated += examples_count;
}
}
#[derive(Debug)]
struct CommitInfo {
sha: String,
parent_sha: String,
message: String,
diff: String,
expanded_diff: String,
}
/// Claude's response parsed into structured form
#[derive(Debug)]
struct ClaudeResponse {
name: String,
reasoning: String,
edit_history_hunks: Vec<String>,
expected_patch_hunks: Vec<String>,
}
pub async fn run_synthesize(config: SynthesizeConfig) -> Result<()> {
let mut state = if config.fresh {
SynthesizeState::default()
} else {
SynthesizeState::load()
};
std::fs::create_dir_all(&config.output_dir)?;
std::fs::create_dir_all(&*FAILED_EXAMPLES_DIR)?;
// Create "latest_failed" symlink pointing to this run's failed directory
if LATEST_FAILED_EXAMPLES_DIR.is_symlink() {
std::fs::remove_file(&*LATEST_FAILED_EXAMPLES_DIR)?;
}
#[cfg(unix)]
std::os::unix::fs::symlink(&*FAILED_EXAMPLES_DIR, &*LATEST_FAILED_EXAMPLES_DIR)?;
#[cfg(windows)]
std::os::windows::fs::symlink_dir(&*FAILED_EXAMPLES_DIR, &*LATEST_FAILED_EXAMPLES_DIR)?;
let progress = Progress::global();
progress.set_total_examples(config.count);
let clone_progress = progress.start(Step::Synthesize, "clone");
let repo_path = ensure_repo_cloned(&config.repo_url).await?;
drop(clone_progress);
let client = PlainLlmClient::new()?;
let mut examples_generated = 0;
let mut commits_skipped = 0;
let batch_size = config.max_commits;
'outer: loop {
let list_progress = progress.start(Step::Synthesize, "list-commits");
let commits = list_commits(&repo_path, batch_size, commits_skipped).await?;
drop(list_progress);
if commits.is_empty() {
break;
}
commits_skipped += commits.len();
for commit in commits {
if examples_generated >= config.count {
break 'outer;
}
if !config.fresh && state.is_processed(&config.repo_url, &commit.sha) {
continue;
}
if should_skip_commit(&commit) {
continue;
}
let commit_label = format!(
"{} {}",
&commit.sha[..8],
truncate_message(&commit.message, 40)
);
let step_progress = Arc::new(progress.start(Step::Synthesize, &commit_label));
// Single Claude call to identify and copy hunks
step_progress.set_substatus("analyzing...");
let claude_response =
match analyze_commit(&client, &config, &commit, step_progress.clone()).await {
Ok(Some(response)) => response,
Ok(None) => {
step_progress.set_info("no pattern", InfoStyle::Normal);
state.mark_processed(&config.repo_url, &commit.sha, 0);
state.save()?;
continue;
}
Err(e) => {
step_progress.set_info(format!("error: {:?}", e), InfoStyle::Warning);
state.mark_processed(&config.repo_url, &commit.sha, 0);
state.save()?;
continue;
}
};
// Validate and build the example
step_progress.set_substatus("validating...");
match build_example(&config, &commit, &repo_path, &claude_response).await {
Ok(spec) => {
let timestamp = Local::now().format("%Y-%m-%d--%H-%M-%S");
let filename = format!("{}.md", timestamp);
let path = config.output_dir.join(&filename);
std::fs::write(&path, spec.to_markdown())?;
examples_generated += 1;
step_progress.set_info(filename, InfoStyle::Normal);
}
Err(rejection_reason) => {
log::debug!("Example rejected: {}", rejection_reason);
let timestamp = Local::now().format("%Y-%m-%d--%H-%M-%S%.3f");
let filename = format!("{}.md", timestamp);
let path = FAILED_EXAMPLES_DIR.join(&filename);
let content = format_rejected_example(&claude_response, &rejection_reason);
if let Err(e) = std::fs::write(&path, content) {
log::warn!("Failed to write rejected example: {:?}", e);
}
step_progress.set_info(format!("rejected: {}", filename), InfoStyle::Warning);
}
}
state.mark_processed(&config.repo_url, &commit.sha, 1);
state.save()?;
}
}
progress.finalize();
Ok(())
}
fn truncate_message(msg: &str, max_len: usize) -> String {
let first_line = msg.lines().next().unwrap_or("");
    if first_line.chars().count() <= max_len {
        first_line.to_string()
    } else {
        // Truncate on char boundaries so multi-byte UTF-8 cannot cause a
        // panic when slicing the message.
        let truncated: String = first_line.chars().take(max_len.saturating_sub(3)).collect();
        format!("{}...", truncated)
    }
}
fn should_skip_commit(commit: &CommitInfo) -> bool {
let lines_changed = commit
.diff
.lines()
.filter(|l| l.starts_with('+') || l.starts_with('-'))
.count();
lines_changed < 10
|| lines_changed > 1000
|| is_non_code_commit(commit)
|| is_rename_commit(commit)
}
fn is_non_code_commit(commit: &CommitInfo) -> bool {
let non_code_extensions = [
".md", ".txt", ".json", ".yaml", ".yml", ".toml", ".lock", ".svg", ".png", ".jpg", ".gif",
".ico", ".woff", ".ttf", ".eot",
];
let diff_files: Vec<&str> = commit
.diff
.lines()
.filter(|l| l.starts_with("+++ b/") || l.starts_with("--- a/"))
.filter_map(|l| {
l.strip_prefix("+++ b/")
.or_else(|| l.strip_prefix("--- a/"))
})
.collect();
if diff_files.is_empty() {
return false;
}
diff_files
.iter()
.all(|f| non_code_extensions.iter().any(|ext| f.ends_with(ext)))
}
fn is_rename_commit(commit: &CommitInfo) -> bool {
commit.diff.contains("similarity index")
|| commit.diff.contains("rename from")
|| commit.diff.contains("rename to")
}
async fn list_commits(
repo_path: &Path,
max_commits: usize,
skip: usize,
) -> Result<Vec<CommitInfo>> {
let output = run_git(
repo_path,
&[
"log",
"--no-merges",
&format!("--skip={}", skip),
&format!("-{}", max_commits),
"--format=%H|%P|%s",
],
)
.await?;
let mut commits = Vec::new();
for line in output.lines() {
let parts: Vec<&str> = line.splitn(3, '|').collect();
if parts.len() < 3 {
continue;
}
let sha = parts[0].to_string();
let parent_sha = parts[1].split_whitespace().next().unwrap_or("").to_string();
if parent_sha.is_empty() {
continue;
}
// Get standard diff (for skip checks)
let diff = run_git(repo_path, &["show", "--format=", &sha])
.await
.unwrap_or_default();
// Get expanded diff with 30 lines of context
let expanded_diff = run_git(repo_path, &["show", "-U30", "--format=", &sha])
.await
.unwrap_or_default();
commits.push(CommitInfo {
sha,
parent_sha,
message: parts[2].to_string(),
diff,
expanded_diff,
});
}
Ok(commits)
}
fn build_prompt(config: &SynthesizeConfig, commit: &CommitInfo) -> String {
let context_guidance = if config.require_context {
"IMPORTANT: Only identify patterns that REQUIRE reading context from other files to make the prediction. \
Single-file patterns (where the edit history and expected patch are in the same file) are NOT acceptable \
unless the pattern clearly requires understanding code from other files."
} else {
"Both single-file and multi-file patterns are acceptable."
};
format!(
indoc! {r#"
You are analyzing a git commit to construct a realistic edit prediction example.
Your goal is to tell the story of a programmer's editing session: what sequence of changes did they make, and what change logically comes next? We use these examples to train a model to predict edits, so the quality of the EDIT HISTORY is what matters most.
An edit prediction example consists of:
1. **Edit History**: 3-6 hunks showing what the programmer did BEFORE making the expected patch. This is the most important part - it must tell a coherent story of the changes leading up to the prediction.
2. **Expected Patch**: One small hunk that logically follows from the edit history.
{context_guidance}
## What Makes a Good Example
The edit history should read like a story: "First the programmer changed X, then Y, then Z, and now they need to change W."
GOOD examples (rich sequences with 3+ steps):
- Removing a parameter: docstring update → constructor change → field removal → (predict) usage site update
- Adding a feature: type definition → first usage → second usage → (predict) third usage
- Bug fix pattern: fix in file A → fix in file B → fix in file C → (predict) fix in file D
BAD examples (respond NO_PATTERN):
- Commits where all changes are independent (no narrative thread)
- Simple find-and-replace (renaming, version bumps)
- Documentation-only or config-only changes
- Changes where you can only find 1-2 hunks for the edit history
## Commit Information
Repository: {repo_url}
Commit: {sha}
Message: {message}
## Diff (30 lines context)
```diff
{expanded_diff}
```
## Your Task
First, THINK through whether this commit can support a good example:
1. What is the high-level pattern in this commit?
2. Can you identify at least 4 related hunks (3 for edit history + 1 for expected patch)?
3. What would be the narrative? (First... then... then... finally predict...)
4. Which specific hunk should be the expected patch (the "punchline")?
If you cannot construct a coherent 3+ hunk story, respond with just:
NO_PATTERN: <brief reason>
If you CAN construct a good example, respond in this format:
ANALYSIS:
Pattern: <one sentence describing the pattern>
Steps:
1. <file:line-range> - <what this hunk does>
2. <file:line-range> - <what this hunk does>
3. <file:line-range> - <what this hunk does>
4. [EXPECTED PATCH] <file:line-range> - <what this hunk does>
NAME: <short description, like a commit message, under 60 chars>
EDIT_HISTORY:
Hunk 1:
```diff
--- a/src/models/user.py
+++ b/src/models/user.py
@@ -15,7 +15,6 @@ class User:
"""A user in the system.
Attributes:
- email: The user's email address.
name: The user's display name.
"""
```
Hunk 2:
```diff
--- a/src/models/user.py
+++ b/src/models/user.py
@@ -25,10 +24,9 @@ class User:
def __init__(
self,
name: str,
- email: str,
created_at: datetime,
):
self.name = name
- self.email = email
self.created_at = created_at
```
Hunk 3:
```diff
--- a/src/api/handlers.py
+++ b/src/api/handlers.py
@@ -42,7 +42,6 @@ def create_user(request):
data = request.json()
user = User(
name=data["name"],
- email=data["email"],
created_at=datetime.now(),
)
return user.save()
```
EXPECTED_PATCH:
```diff
--- a/src/api/handlers.py
+++ b/src/api/handlers.py
@@ -58,7 +57,6 @@ def update_user(request, user_id):
user = User.get(user_id)
user.name = data.get("name", user.name)
- user.email = data.get("email", user.email)
user.save()
return user
```
## Requirements for the diffs
Edit history:
- MUST have 3-6 hunks (if you cannot find 3+, respond NO_PATTERN instead)
- Each hunk needs file headers (--- a/path and +++ b/path)
- Hunks must be valid unified diffs that apply to the parent commit
- Order hunks as a programmer would naturally make the changes
Expected patch:
- Must be a SINGLE hunk from a SINGLE file
- Must be SMALL: 1-15 changed lines (not counting context)
- Must be clearly predictable from the edit history narrative
"#},
context_guidance = context_guidance,
repo_url = config.repo_url,
sha = commit.sha,
message = commit.message,
expanded_diff = commit.expanded_diff,
)
}
async fn analyze_commit(
client: &PlainLlmClient,
config: &SynthesizeConfig,
commit: &CommitInfo,
step_progress: Arc<StepProgress>,
) -> Result<Option<ClaudeResponse>> {
use anthropic::{Message, RequestContent, Role};
let prompt = build_prompt(config, commit);
let messages = vec![Message {
role: Role::User,
content: vec![RequestContent::Text {
text: prompt,
cache_control: None,
}],
}];
let response = client
.generate_streaming("claude-sonnet-4-5", 8192, messages, |chars, _text| {
step_progress.set_substatus(format!("analyzing: {:.1}K", chars as f64 / 1000.0));
})
.await?;
// Extract text content from response
let response_text: String = response
.content
.iter()
.filter_map(|block| {
if let ResponseContent::Text { text } = block {
Some(text.as_str())
} else {
None
}
})
.collect::<Vec<_>>()
.join("\n");
parse_claude_response(&response_text)
}
fn parse_claude_response(response: &str) -> Result<Option<ClaudeResponse>> {
// Check for NO_PATTERN
if response.contains("NO_PATTERN:") {
return Ok(None);
}
// Parse NAME
let name = response
.lines()
.find(|l| l.starts_with("NAME:"))
.map(|l| l.strip_prefix("NAME:").unwrap_or("").trim().to_string())
.unwrap_or_else(|| "unnamed example".to_string());
// Parse ANALYSIS section (Claude's planning) - this is the primary reasoning
let reasoning = extract_section(
response,
"ANALYSIS:",
&["NAME:", "REASONING:", "EDIT_HISTORY:", "EXPECTED_PATCH:"],
)
.unwrap_or_default();
// Parse EDIT_HISTORY diff block
let edit_history_hunks = extract_diff_block(response, "EDIT_HISTORY:")?;
// Parse EXPECTED_PATCH diff block
let expected_patch_hunks = extract_diff_block(response, "EXPECTED_PATCH:")?;
if edit_history_hunks.is_empty() {
anyhow::bail!("No edit history hunks found in response");
}
if expected_patch_hunks.is_empty() {
anyhow::bail!("No expected patch hunks found in response");
}
Ok(Some(ClaudeResponse {
name,
reasoning,
edit_history_hunks,
expected_patch_hunks,
}))
}
fn extract_section(text: &str, start_marker: &str, end_markers: &[&str]) -> Option<String> {
let start_idx = text.find(start_marker)?;
let content_start = start_idx + start_marker.len();
let end_idx = end_markers
.iter()
.filter_map(|marker| text[content_start..].find(marker))
.min()
.map(|idx| content_start + idx)
.unwrap_or(text.len());
Some(text[content_start..end_idx].trim().to_string())
}
fn extract_diff_block(text: &str, section_marker: &str) -> Result<Vec<String>> {
let section_start = text
.find(section_marker)
.context(format!("Section {} not found", section_marker))?;
let after_marker = &text[section_start + section_marker.len()..];
// Find where the next major section starts (to bound our search)
let section_end = ["EXPECTED_PATCH:", "## "]
.iter()
.filter(|&&m| m != section_marker)
.filter_map(|marker| after_marker.find(marker))
.min()
.unwrap_or(after_marker.len());
let section_content = &after_marker[..section_end];
// Collect all ```diff blocks in this section
let mut hunks = Vec::new();
let mut search_start = 0;
while let Some(diff_start) = section_content[search_start..].find("```diff") {
let abs_diff_start = search_start + diff_start;
let block_content_start = section_content[abs_diff_start..]
.find('\n')
.map(|i| abs_diff_start + i + 1)
.unwrap_or(abs_diff_start);
if let Some(block_end_rel) = section_content[block_content_start..].find("```") {
let block_end = block_content_start + block_end_rel;
let diff_content = section_content[block_content_start..block_end].trim();
// Split this block into hunks (one ```diff block may contain several)
hunks.extend(split_into_hunks(diff_content));
search_start = block_end + 3;
} else {
break;
}
}
if hunks.is_empty() {
anyhow::bail!("No diff blocks found in section {}", section_marker);
}
Ok(hunks)
}
/// Split a diff block into individual hunks, preserving file headers
fn split_into_hunks(diff: &str) -> Vec<String> {
let mut hunks = Vec::new();
let mut current_file_header: Option<String> = None;
let mut current_hunk: Vec<String> = Vec::new();
let mut in_hunk = false;
for line in diff.lines() {
if line.starts_with("--- a/") || line.starts_with("--- /") {
// Start of file header - flush previous hunk
if in_hunk && !current_hunk.is_empty() {
let mut hunk_text = String::new();
if let Some(ref header) = current_file_header {
hunk_text.push_str(header);
hunk_text.push('\n');
}
hunk_text.push_str(&current_hunk.join("\n"));
hunks.push(hunk_text);
current_hunk.clear();
}
current_file_header = Some(line.to_string());
in_hunk = false;
} else if line.starts_with("+++ b/") || line.starts_with("+++ /") {
if let Some(ref mut header) = current_file_header {
header.push('\n');
header.push_str(line);
}
} else if line.starts_with("@@ ") {
// New hunk - flush previous
if in_hunk && !current_hunk.is_empty() {
let mut hunk_text = String::new();
if let Some(ref header) = current_file_header {
hunk_text.push_str(header);
hunk_text.push('\n');
}
hunk_text.push_str(&current_hunk.join("\n"));
hunks.push(hunk_text);
current_hunk.clear();
}
current_hunk.push(line.to_string());
in_hunk = true;
} else if in_hunk {
current_hunk.push(line.to_string());
}
}
// Flush final hunk
if !current_hunk.is_empty() {
let mut hunk_text = String::new();
if let Some(ref header) = current_file_header {
hunk_text.push_str(header);
hunk_text.push('\n');
}
hunk_text.push_str(&current_hunk.join("\n"));
hunks.push(hunk_text);
}
hunks
}
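// Illustrative sketch (hypothetical diff): each hunk receives its own copy of
// the `---`/`+++` file header so it can later be applied independently.
#[cfg(test)]
mod split_into_hunks_tests {
use super::*;
#[test]
fn repeats_file_header_for_each_hunk() {
let diff = "--- a/foo.rs\n+++ b/foo.rs\n\
@@ -1,2 +1,2 @@\n-a\n+b\n\
@@ -5,2 +5,2 @@\n-c\n+d";
let hunks = split_into_hunks(diff);
assert_eq!(hunks.len(), 2);
assert!(hunks[0].starts_with("--- a/foo.rs\n+++ b/foo.rs\n@@ -1,2"));
assert!(hunks[1].starts_with("--- a/foo.rs\n+++ b/foo.rs\n@@ -5,2"));
}
}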
/// Validate Claude's output by applying diffs and build the ExampleSpec
async fn build_example(
config: &SynthesizeConfig,
commit: &CommitInfo,
repo_path: &Path,
response: &ClaudeResponse,
) -> Result<ExampleSpec, String> {
// Validate expected patch hunks
if response.expected_patch_hunks.len() != 1 {
return Err(format!(
"Expected exactly one EXPECTED_PATCH hunk, got {}",
response.expected_patch_hunks.len()
));
}
// Parse the expected patch to determine cursor file
let expected_patch = &response.expected_patch_hunks[0];
let cursor_file = extract_file_from_hunk(expected_patch)
.ok_or_else(|| "Could not determine file from expected patch".to_string())?;
// Get the file content before the commit
let before_content = run_git(
repo_path,
&["show", &format!("{}^:{}", commit.sha, cursor_file)],
)
.await
.map_err(|e| format!("Failed to get file content for {}: {}", cursor_file, e))?;
// Build edit history diff from Claude's hunks
let edit_history = response.edit_history_hunks.join("\n");
// Apply edit history to get intermediate state (validates edit history)
let intermediate_state =
apply_edit_history_to_content(&before_content, &edit_history, &cursor_file)?;
// Validate expected patch applies to intermediate state
let expected_patch_with_header = ensure_diff_header(expected_patch, &cursor_file);
apply_diff_to_string(&intermediate_state, &expected_patch_with_header)
.map_err(|e| format!("Expected patch failed to apply: {}", e))?;
// Find where the expected patch edits would apply in the intermediate state
let edits = edits_for_diff(&intermediate_state, &expected_patch_with_header)
.map_err(|e| format!("Failed to parse expected patch: {}", e))?;
if edits.is_empty() {
return Err(
"Could not locate expected patch in file (context not found or ambiguous)".to_string(),
);
}
// Use the start of the first edit for cursor positioning
let cursor_byte_offset = edits[0].0.start;
// Extract excerpt around the edit location
let (excerpt, cursor_offset) = extract_cursor_excerpt(&intermediate_state, cursor_byte_offset)?;
// Build the ExampleSpec and use set_cursor_excerpt to format with comment marker
let comment_prefix = line_comment_prefix(&cursor_file);
let reasoning_with_source = format!(
"Source commit: {} ({})\n\n{}",
commit.sha,
truncate_message(&commit.message, 60),
response.reasoning
);
let mut spec = ExampleSpec {
name: response.name.clone(),
repository_url: config.repo_url.clone(),
revision: commit.parent_sha.clone(),
tags: Vec::new(),
reasoning: Some(reasoning_with_source),
uncommitted_diff: String::new(),
cursor_path: Arc::from(Path::new(&cursor_file)),
cursor_position: String::new(),
edit_history,
expected_patches: vec![expected_patch_with_header],
};
spec.set_cursor_excerpt(&excerpt, cursor_offset, comment_prefix);
Ok(spec)
}
/// Extract file path from a hunk (looks for --- a/path or +++ b/path)
fn extract_file_from_hunk(hunk: &str) -> Option<String> {
for line in hunk.lines() {
if let Some(path) = line.strip_prefix("+++ b/") {
return Some(path.to_string());
}
if let Some(path) = line.strip_prefix("--- a/") {
return Some(path.to_string());
}
}
None
}
/// Ensure a hunk has proper file headers
fn ensure_diff_header(hunk: &str, file_path: &str) -> String {
if hunk.contains("--- a/") || hunk.contains("+++ b/") {
return hunk.to_string();
}
format!("--- a/{}\n+++ b/{}\n{}", file_path, file_path, hunk)
}
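// Illustrative sketch (hypothetical hunk): headers are prepended only when the
// hunk does not already carry them.
#[cfg(test)]
mod ensure_diff_header_tests {
use super::*;
#[test]
fn adds_headers_only_when_missing() {
let bare = "@@ -1 +1 @@\n-a\n+b";
assert_eq!(
ensure_diff_header(bare, "foo.rs"),
"--- a/foo.rs\n+++ b/foo.rs\n@@ -1 +1 @@\n-a\n+b"
);
let with_header = "--- a/foo.rs\n+++ b/foo.rs\n@@ -1 +1 @@\n-a\n+b";
assert_eq!(ensure_diff_header(with_header, "foo.rs"), with_header);
}
}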
/// Apply edit history to file content, only if hunks affect this file
fn apply_edit_history_to_content(
content: &str,
edit_history: &str,
cursor_file: &str,
) -> Result<String, String> {
// Extract just the hunks for this file from the edit history
let file_diff = extract_file_diff_from_combined(edit_history, cursor_file);
if file_diff.is_empty() {
return Ok(content.to_string());
}
apply_diff_to_string(content, &file_diff)
.map_err(|e| format!("Failed to apply edit history: {}", e))
}
/// Extract hunks for a specific file from a combined diff
fn extract_file_diff_from_combined(combined_diff: &str, target_file: &str) -> String {
let mut result = String::new();
let mut in_target_file = false;
let mut found_header = false;
for line in combined_diff.lines() {
if line.starts_with("--- a/") {
let file = line.strip_prefix("--- a/").unwrap_or("");
in_target_file = file == target_file;
if in_target_file {
result.push_str(line);
result.push('\n');
found_header = false;
}
} else if line.starts_with("+++ b/") && in_target_file {
result.push_str(line);
result.push('\n');
found_header = true;
} else if in_target_file && found_header {
// A new "--- a/" header is caught by the first branch above, which
// resets `in_target_file`, so no extra header check is needed here.
result.push_str(line);
result.push('\n');
}
}
result
}
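// Illustrative sketch (hypothetical combined diff): only the hunks belonging
// to the target file are kept, header included.
#[cfg(test)]
mod extract_file_diff_tests {
use super::*;
#[test]
fn keeps_only_target_file_hunks() {
let combined = "--- a/x.rs\n+++ b/x.rs\n@@ -1 +1 @@\n-a\n+b\n\
--- a/y.rs\n+++ b/y.rs\n@@ -1 +1 @@\n-c\n+d";
assert_eq!(
extract_file_diff_from_combined(combined, "x.rs"),
"--- a/x.rs\n+++ b/x.rs\n@@ -1 +1 @@\n-a\n+b\n"
);
assert_eq!(
extract_file_diff_from_combined(combined, "y.rs"),
"--- a/y.rs\n+++ b/y.rs\n@@ -1 +1 @@\n-c\n+d\n"
);
}
}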
/// Extract a cursor position excerpt from content around a byte offset.
/// Returns the excerpt and the cursor offset within the excerpt.
fn extract_cursor_excerpt(
content: &str,
cursor_byte_offset: usize,
) -> Result<(String, usize), String> {
// Find the line containing the cursor
let line_start = content[..cursor_byte_offset]
.rfind('\n')
.map(|pos| pos + 1)
.unwrap_or(0);
let line_end = content[cursor_byte_offset..]
.find('\n')
.map(|pos| cursor_byte_offset + pos)
.unwrap_or(content.len());
// Get context lines before
let lines_before: Vec<&str> = content[..line_start].lines().collect();
let context_before: Vec<&str> = lines_before.iter().rev().take(3).rev().cloned().collect();
// Get context lines after
let after_line_end = if line_end < content.len() {
line_end + 1
} else {
line_end
};
let context_after: Vec<&str> = content[after_line_end..].lines().take(4).collect();
// The line containing the cursor
let cursor_line = &content[line_start..line_end];
let cursor_column = cursor_byte_offset - line_start;
// Build the excerpt
let mut excerpt = String::new();
for line in context_before {
excerpt.push_str(line);
excerpt.push('\n');
}
// Track where cursor will be in the excerpt
let cursor_offset_in_excerpt = excerpt.len() + cursor_column;
// Line containing cursor
excerpt.push_str(cursor_line);
excerpt.push('\n');
for line in context_after {
excerpt.push_str(line);
excerpt.push('\n');
}
// Trim trailing newline
if excerpt.ends_with('\n') {
excerpt.pop();
}
Ok((excerpt, cursor_offset_in_excerpt))
}
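// Illustrative sketch (hypothetical content): the excerpt spans up to three
// lines before and four lines after the cursor line, and the returned offset
// points at the cursor position within the excerpt.
#[cfg(test)]
mod extract_cursor_excerpt_tests {
use super::*;
#[test]
fn clips_context_around_cursor_line() {
let content = "a\nb\nc\nd\ne\nf\ng\nh\ni\nj";
// Byte offset 8 is the start of the line containing "e".
let (excerpt, offset) = extract_cursor_excerpt(content, 8).unwrap();
assert_eq!(excerpt, "b\nc\nd\ne\nf\ng\nh\ni");
assert_eq!(&excerpt[offset..offset + 1], "e");
}
}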
/// Get the line comment prefix for a file based on its extension
fn line_comment_prefix(file_path: &str) -> &'static str {
let extension = Path::new(file_path)
.extension()
.and_then(|ext| ext.to_str())
.unwrap_or("");
match extension {
"rs" | "c" | "cpp" | "cc" | "h" | "hpp" | "js" | "ts" | "tsx" | "jsx" | "go" | "java"
| "swift" | "kt" | "kts" | "scala" | "cs" | "m" | "mm" | "zig" | "v" | "d" => "//",
"py" | "rb" | "sh" | "bash" | "zsh" | "pl" | "pm" | "r" | "jl" | "yaml" | "yml"
| "toml" | "coffee" | "cr" | "ex" | "exs" | "elixir" => "#",
"lua" | "hs" | "sql" => "--",
"lisp" | "clj" | "cljs" | "scm" | "rkt" | "el" => ";",
"erl" | "hrl" => "%",
_ => "//",
}
}
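// Illustrative sketch: comment prefixes are chosen by file extension, with
// "//" as the fallback for unknown or extension-less paths.
#[cfg(test)]
mod line_comment_prefix_tests {
use super::*;
#[test]
fn maps_extensions_to_prefixes() {
assert_eq!(line_comment_prefix("src/main.rs"), "//");
assert_eq!(line_comment_prefix("script.py"), "#");
assert_eq!(line_comment_prefix("init.lua"), "--");
assert_eq!(line_comment_prefix("Makefile"), "//");
}
}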
fn format_rejected_example(response: &ClaudeResponse, rejection_reason: &str) -> String {
let mut content = String::new();
content.push_str("# Rejected Example\n\n");
content.push_str(&format!("## Name\n\n{}\n\n", response.name));
content.push_str(&format!("## Reasoning\n\n{}\n\n", response.reasoning));
content.push_str("## Edit History Hunks\n\n```diff\n");
for hunk in &response.edit_history_hunks {
content.push_str(hunk);
content.push_str("\n\n");
}
content.push_str("```\n\n");
content.push_str("## Expected Patch Hunks\n\n```diff\n");
for hunk in &response.expected_patch_hunks {
content.push_str(hunk);
content.push_str("\n\n");
}
content.push_str("```\n\n");
content.push_str(&format!("## Rejection Reason\n\n{}\n", rejection_reason));
content
}

View File

@@ -150,7 +150,7 @@ fn capture_example_as_markdown(
.buffer()
.read(cx)
.text_anchor_for_position(editor.selections.newest_anchor().head(), cx)?;
let example = capture_example(project.clone(), buffer, cursor_anchor, true, cx)?;
let example = capture_example(project.clone(), buffer, cursor_anchor, cx)?;
let examples_dir = AllLanguageSettings::get_global(cx)
.edit_predictions

View File

@@ -25298,36 +25298,34 @@ impl EditorSnapshot {
/// Returns the line delta from `base` to `line` in the multibuffer.
///
/// This is positive if `base` is before `line`.
fn relative_line_delta(&self, base: DisplayRow, line: DisplayRow) -> i64 {
fn relative_line_delta(
&self,
base: DisplayRow,
line: DisplayRow,
consider_wrapped_lines: bool,
) -> i64 {
let point = DisplayPoint::new(line, 0).to_point(self);
self.relative_line_delta_to_point(base, point)
self.relative_line_delta_to_point(base, point, consider_wrapped_lines)
}
/// Returns the line delta from `base` to `point` in the multibuffer, ignoring wrapped lines.
/// Returns the line delta from `base` to `point` in the multibuffer.
///
/// This is positive if `base` is before `point`.
pub fn relative_line_delta_to_point(&self, base: DisplayRow, point: Point) -> i64 {
pub fn relative_line_delta_to_point(
&self,
base: DisplayRow,
point: Point,
consider_wrapped_lines: bool,
) -> i64 {
let base_point = DisplayPoint::new(base, 0).to_point(self);
point.row as i64 - base_point.row as i64
}
/// Returns the line delta from `base` to `line` in the multibuffer, counting wrapped lines.
///
/// This is positive if `base` is before `line`.
fn relative_wrapped_line_delta(&self, base: DisplayRow, line: DisplayRow) -> i64 {
let point = DisplayPoint::new(line, 0).to_point(self);
self.relative_wrapped_line_delta_to_point(base, point)
}
/// Returns the line delta from `base` to `point` in the multibuffer, counting wrapped lines.
///
/// This is positive if `base` is before `point`.
pub fn relative_wrapped_line_delta_to_point(&self, base: DisplayRow, point: Point) -> i64 {
let base_point = DisplayPoint::new(base, 0).to_point(self);
let wrap_snapshot = self.wrap_snapshot();
let base_wrap_row = wrap_snapshot.make_wrap_point(base_point, Bias::Left).row();
let wrap_row = wrap_snapshot.make_wrap_point(point, Bias::Left).row();
wrap_row.0 as i64 - base_wrap_row.0 as i64
if consider_wrapped_lines {
let wrap_snapshot = self.wrap_snapshot();
let base_wrap_row = wrap_snapshot.make_wrap_point(base_point, Bias::Left).row();
let wrap_row = wrap_snapshot.make_wrap_point(point, Bias::Left).row();
wrap_row.0 as i64 - base_wrap_row.0 as i64
} else {
point.row as i64 - base_point.row as i64
}
}
/// Returns the unsigned relative line number to display for each row in `rows`.
@@ -25339,23 +25337,21 @@ impl EditorSnapshot {
relative_to: DisplayRow,
count_wrapped_lines: bool,
) -> HashMap<DisplayRow, u32> {
let initial_offset = if count_wrapped_lines {
self.relative_wrapped_line_delta(relative_to, rows.start)
} else {
self.relative_line_delta(relative_to, rows.start)
};
let display_row_infos = self
.row_infos(rows.start)
let initial_offset = self.relative_line_delta(relative_to, rows.start, count_wrapped_lines);
self.row_infos(rows.start)
.take(rows.len())
.enumerate()
.map(|(i, row_info)| (DisplayRow(rows.start.0 + i as u32), row_info));
display_row_infos
.map(|(i, row_info)| (DisplayRow(rows.start.0 + i as u32), row_info))
.filter(|(_row, row_info)| {
row_info.buffer_row.is_some()
|| (count_wrapped_lines && row_info.wrapped_buffer_row.is_some())
})
.enumerate()
.map(|(i, (row, _row_info))| (row, (initial_offset + i as i64).unsigned_abs() as u32))
.flat_map(|(i, (row, _row_info))| {
(row != relative_to)
.then_some((row, (initial_offset + i as i64).unsigned_abs() as u32))
})
.collect()
}
}

View File

@@ -28725,7 +28725,7 @@ fn test_relative_line_numbers(cx: &mut TestAppContext) {
assert_eq!(
relative_number,
snapshot
.relative_line_delta(display_row, base_display_row)
.relative_line_delta(display_row, base_display_row, false)
.unsigned_abs() as u32,
);
}
@@ -28735,6 +28735,7 @@ fn test_relative_line_numbers(cx: &mut TestAppContext) {
.into_iter()
.enumerate()
.map(|(i, row)| (DisplayRow(row), i.abs_diff(wrapped_base_row) as u32))
.filter(|(row, _)| *row != base_display_row)
.collect_vec();
let actual_relative_numbers = snapshot
.calculate_relative_line_numbers(
@@ -28751,7 +28752,7 @@ fn test_relative_line_numbers(cx: &mut TestAppContext) {
assert_eq!(
relative_number,
snapshot
.relative_wrapped_line_delta(display_row, base_display_row)
.relative_line_delta(display_row, base_display_row, true)
.unsigned_abs() as u32,
);
}

View File

@@ -4611,15 +4611,15 @@ impl EditorElement {
);
let line_number = show_line_numbers.then(|| {
let relative_number = relative_to.and_then(|base| match relative_line_numbers {
RelativeLineNumbers::Disabled => None,
RelativeLineNumbers::Enabled => {
Some(snapshot.relative_line_delta_to_point(base, start_point))
}
RelativeLineNumbers::Wrapped => {
Some(snapshot.relative_wrapped_line_delta_to_point(base, start_point))
}
});
let relative_number = relative_to
.filter(|_| relative_line_numbers != RelativeLineNumbers::Disabled)
.map(|base| {
snapshot.relative_line_delta_to_point(
base,
start_point,
relative_line_numbers == RelativeLineNumbers::Wrapped,
)
});
let number = relative_number
.filter(|&delta| delta != 0)
.map(|delta| delta.unsigned_abs() as u32)
@@ -9055,14 +9055,8 @@ impl Element for EditorElement {
let em_advance = window.text_system().em_advance(font_id, font_size).unwrap();
let glyph_grid_cell = size(em_advance, line_height);
let gutter_dimensions = snapshot
.gutter_dimensions(
font_id,
font_size,
style,
window,
cx,
);
let gutter_dimensions =
snapshot.gutter_dimensions(font_id, font_size, style, window, cx);
let text_width = bounds.size.width - gutter_dimensions.width;
let settings = EditorSettings::get_global(cx);
@@ -9276,10 +9270,10 @@ impl Element for EditorElement {
};
let background_color = match diff_status.kind {
DiffHunkStatusKind::Added =>
cx.theme().colors().version_control_added,
DiffHunkStatusKind::Deleted =>
cx.theme().colors().version_control_deleted,
DiffHunkStatusKind::Added => cx.theme().colors().version_control_added,
DiffHunkStatusKind::Deleted => {
cx.theme().colors().version_control_deleted
}
DiffHunkStatusKind::Modified => {
debug_panic!("modified diff status for row info");
continue;
@@ -9423,25 +9417,26 @@ impl Element for EditorElement {
);
// relative rows are based on newest selection, even outside the visible area
let relative_row_base = self.editor.update(cx, |editor, cx| {
if editor.selections.count()==0 {
return None;
}
let relative_row_base = self.editor.update(cx, |editor, cx| {
(editor.selections.count() != 0).then(|| {
let newest = editor
.selections
.newest::<Point>(&editor.display_snapshot(cx));
Some(SelectionLayout::new(
SelectionLayout::new(
newest,
editor.selections.line_mode(),
editor.cursor_offset_on_selection,
editor.cursor_shape,
&snapshot.display_snapshot,
&snapshot,
true,
true,
None,
)
.head.row())
});
.head
.row()
})
});
let mut breakpoint_rows = self.editor.update(cx, |editor, cx| {
editor.active_breakpoints(start_row..end_row, window, cx)
@@ -9601,9 +9596,10 @@ impl Element for EditorElement {
cx,
);
} else {
debug_panic!(
"skipping recursive prepaint at max depth. renderer widths may be stale."
);
debug_panic!(concat!(
"skipping recursive prepaint at max depth. ",
"renderer widths may be stale."
));
}
}
@@ -9715,9 +9711,10 @@ impl Element for EditorElement {
cx,
);
} else {
debug_panic!(
"skipping recursive prepaint at max depth. block layout may be stale."
);
debug_panic!(concat!(
"skipping recursive prepaint at max depth. ",
"block layout may be stale."
));
}
}
@@ -11723,6 +11720,7 @@ mod tests {
assert_eq!(relative_rows[&DisplayRow(1)], 2);
assert_eq!(relative_rows[&DisplayRow(2)], 1);
// current line has no relative number
assert!(!relative_rows.contains_key(&DisplayRow(3)));
assert_eq!(relative_rows[&DisplayRow(4)], 1);
assert_eq!(relative_rows[&DisplayRow(5)], 2);
@@ -11869,6 +11867,7 @@ mod tests {
assert_eq!(relative_rows[&DisplayRow(1)], 2);
assert_eq!(relative_rows[&DisplayRow(2)], 1);
// current line has no relative number
assert!(!relative_rows.contains_key(&DisplayRow(3)));
assert_eq!(relative_rows[&DisplayRow(4)], 1);
assert_eq!(relative_rows[&DisplayRow(5)], 2);
@@ -11924,6 +11923,7 @@ mod tests {
assert_eq!(relative_rows[&DisplayRow(1)], 2);
assert_eq!(relative_rows[&DisplayRow(2)], 1);
// current line, even if deleted, has no relative number
assert!(!relative_rows.contains_key(&DisplayRow(3)));
assert_eq!(relative_rows[&DisplayRow(4)], 1);
assert_eq!(relative_rows[&DisplayRow(5)], 2);
}

View File

@@ -24,7 +24,7 @@ use std::{borrow::Cow, cell::RefCell};
use std::{ops::Range, sync::Arc, time::Duration};
use std::{path::PathBuf, rc::Rc};
use theme::ThemeSettings;
use ui::{Scrollbars, WithScrollbar, prelude::*, theme_is_transparent};
use ui::{CopyButton, Scrollbars, WithScrollbar, prelude::*, theme_is_transparent};
use url::Url;
use util::TryFutureExt;
use workspace::{OpenOptions, OpenVisible, Workspace};
@@ -994,11 +994,13 @@ impl DiagnosticPopover {
.border_color(self.border_color)
.rounded_lg()
.child(
div()
h_flex()
.id("diagnostic-content-container")
.overflow_y_scroll()
.gap_1()
.items_start()
.max_w(max_size.width)
.max_h(max_size.height)
.overflow_y_scroll()
.track_scroll(&self.scroll_handle)
.child(
MarkdownElement::new(
@@ -1021,7 +1023,11 @@ impl DiagnosticPopover {
}
},
),
),
)
.child({
let message = self.local_diagnostic.diagnostic.message.clone();
CopyButton::new(message).tooltip_label("Copy Diagnostic")
}),
)
.custom_scrollbars(
Scrollbars::for_settings::<EditorSettings>()

View File

@@ -23,3 +23,9 @@ pub struct AgentV2FeatureFlag;
impl FeatureFlag for AgentV2FeatureFlag {
const NAME: &'static str = "agent-v2";
}
pub struct AcpBetaFeatureFlag;
impl FeatureFlag for AcpBetaFeatureFlag {
const NAME: &'static str = "acp-beta";
}

View File

@@ -335,12 +335,11 @@ impl FileHandle for std::fs::File {
let mut path_buf = MaybeUninit::<[u8; libc::PATH_MAX as usize]>::uninit();
let result = unsafe { libc::fcntl(fd.as_raw_fd(), libc::F_GETPATH, path_buf.as_mut_ptr()) };
if result == -1 {
anyhow::bail!("fcntl returned -1".to_string());
}
anyhow::ensure!(result != -1, "fcntl returned -1");
// SAFETY: `fcntl` will initialize the path buffer.
let c_str = unsafe { CStr::from_ptr(path_buf.as_ptr().cast()) };
anyhow::ensure!(!c_str.is_empty(), "Could not find a path for the file handle");
let path = PathBuf::from(OsStr::from_bytes(c_str.to_bytes()));
Ok(path)
}
@@ -372,12 +371,11 @@ impl FileHandle for std::fs::File {
kif.kf_structsize = libc::KINFO_FILE_SIZE;
let result = unsafe { libc::fcntl(fd.as_raw_fd(), libc::F_KINFO, kif.as_mut_ptr()) };
if result == -1 {
anyhow::bail!("fcntl returned -1".to_string());
}
anyhow::ensure!(result != -1, "fcntl returned -1");
// SAFETY: `fcntl` will initialize the kif.
let c_str = unsafe { CStr::from_ptr(kif.assume_init().kf_path.as_ptr()) };
anyhow::ensure!(!c_str.is_empty(), "Could not find a path for the file handle");
let path = PathBuf::from(OsStr::from_bytes(c_str.to_bytes()));
Ok(path)
}
@@ -398,18 +396,21 @@ impl FileHandle for std::fs::File {
// Query required buffer size (in wide chars)
let required_len =
unsafe { GetFinalPathNameByHandleW(handle, &mut [], FILE_NAME_NORMALIZED) };
if required_len == 0 {
anyhow::bail!("GetFinalPathNameByHandleW returned 0 length");
}
anyhow::ensure!(
required_len != 0,
"GetFinalPathNameByHandleW returned 0 length"
);
// Allocate buffer and retrieve the path
let mut buf: Vec<u16> = vec![0u16; required_len as usize + 1];
let written = unsafe { GetFinalPathNameByHandleW(handle, &mut buf, FILE_NAME_NORMALIZED) };
if written == 0 {
anyhow::bail!("GetFinalPathNameByHandleW failed to write path");
}
anyhow::ensure!(
written != 0,
"GetFinalPathNameByHandleW failed to write path"
);
let os_str: OsString = OsString::from_wide(&buf[..written as usize]);
anyhow::ensure!(!os_str.is_empty(), "Could not find a path for the file handle");
Ok(PathBuf::from(os_str))
}
}

View File

@@ -76,7 +76,7 @@ impl EventStream {
cf::CFRelease(cf_path);
cf::CFRelease(cf_url);
} else {
log::error!("Failed to create CFURL for path: {}", path.display());
log::error!("Failed to create CFURL for path: {path:?}");
}
}

View File

@@ -13,7 +13,7 @@ use project::{git_store::Repository, project_settings::ProjectSettings};
use settings::Settings as _;
use theme::ThemeSettings;
use time::OffsetDateTime;
use ui::{ContextMenu, Divider, prelude::*, tooltip_container};
use ui::{ContextMenu, CopyButton, Divider, prelude::*, tooltip_container};
use workspace::Workspace;
const GIT_BLAME_MAX_AUTHOR_CHARS_DISPLAYED: usize = 20;
@@ -335,18 +335,10 @@ impl BlameRenderer for GitBlameRenderer {
cx.stop_propagation();
}),
)
.child(Divider::vertical())
.child(
IconButton::new("copy-sha-button", IconName::Copy)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.on_click(move |_, _, cx| {
cx.stop_propagation();
cx.write_to_clipboard(
ClipboardItem::new_string(
sha.to_string(),
),
)
}),
CopyButton::new(sha.to_string())
.tooltip_label("Copy SHA"),
),
),
),

View File

@@ -5,7 +5,7 @@ use git::blame::BlameEntry;
use git::repository::CommitSummary;
use git::{GitRemote, commit::ParsedCommitMessage};
use gpui::{
App, Asset, ClipboardItem, Element, Entity, MouseButton, ParentElement, Render, ScrollHandle,
App, Asset, Element, Entity, MouseButton, ParentElement, Render, ScrollHandle,
StatefulInteractiveElement, WeakEntity, prelude::*,
};
use markdown::{Markdown, MarkdownElement};
@@ -14,7 +14,7 @@ use settings::Settings;
use std::hash::Hash;
use theme::ThemeSettings;
use time::{OffsetDateTime, UtcOffset};
use ui::{Avatar, Divider, IconButtonShape, prelude::*, tooltip_container};
use ui::{Avatar, CopyButton, Divider, prelude::*, tooltip_container};
use workspace::Workspace;
#[derive(Clone, Debug)]
@@ -315,8 +315,8 @@ impl Render for CommitTooltip {
cx.open_url(pr.url.as_str())
}),
)
.child(Divider::vertical())
})
.child(Divider::vertical())
.child(
Button::new(
"commit-sha-button",
@@ -342,18 +342,8 @@ impl Render for CommitTooltip {
},
),
)
.child(
IconButton::new("copy-sha-button", IconName::Copy)
.shape(IconButtonShape::Square)
.icon_size(IconSize::Small)
.icon_color(Color::Muted)
.on_click(move |_, _, cx| {
cx.stop_propagation();
cx.write_to_clipboard(
ClipboardItem::new_string(full_sha.clone()),
)
}),
),
.child(Divider::vertical())
.child(CopyButton::new(full_sha).tooltip_label("Copy SHA")),
),
),
)

View File

@@ -123,19 +123,25 @@ pub async fn resolve_schema_request_inner(
.find(|adapter| adapter.name().as_ref() as &str == lsp_name)
.with_context(|| format!("LSP adapter not found: {}", lsp_name))?;
let delegate = cx.update(|inner_cx| {
lsp_store.update(inner_cx, |lsp_store, inner_cx| {
let Some(local) = lsp_store.as_local() else {
return None;
};
let Some(worktree) = local.worktree_store.read(inner_cx).worktrees().next() else {
return None;
};
Some(LocalLspAdapterDelegate::from_local_lsp(
local, &worktree, inner_cx,
))
})
})?.context("Failed to create adapter delegate - either LSP store is not in local mode or no worktree is available")?;
let delegate = cx
.update(|inner_cx| {
lsp_store.update(inner_cx, |lsp_store, inner_cx| {
let Some(local) = lsp_store.as_local() else {
return None;
};
let Some(worktree) = local.worktree_store.read(inner_cx).worktrees().next()
else {
return None;
};
Some(LocalLspAdapterDelegate::from_local_lsp(
local, &worktree, inner_cx,
))
})
})?
.context(concat!(
"Failed to create adapter delegate - ",
"either LSP store is not in local mode or no worktree is available"
))?;
let adapter_for_schema = adapter.clone();
@@ -152,7 +158,16 @@ pub async fn resolve_schema_request_inner(
)
.await
.await
.0.with_context(|| format!("Failed to find language server {lsp_name} to generate initialization params schema"))?;
.0
.with_context(|| {
format!(
concat!(
"Failed to find language server {} ",
"to generate initialization params schema"
),
lsp_name
)
})?;
adapter_for_schema
.adapter

View File

@@ -13,7 +13,7 @@ use crate::{
},
task_context::RunnableRange,
text_diff::text_diff,
unified_diff,
unified_diff_with_offsets,
};
pub use crate::{
Grammar, Language, LanguageRegistry,
@@ -773,7 +773,11 @@ pub struct EditPreview {
}
impl EditPreview {
pub fn as_unified_diff(&self, edits: &[(Range<Anchor>, impl AsRef<str>)]) -> Option<String> {
pub fn as_unified_diff(
&self,
file: Option<&Arc<dyn File>>,
edits: &[(Range<Anchor>, impl AsRef<str>)],
) -> Option<String> {
let (first, _) = edits.first()?;
let (last, _) = edits.last()?;
@@ -788,7 +792,7 @@ impl EditPreview {
let old_end = Point::new(old_end.row + 4, 0).min(self.old_snapshot.max_point());
let new_end = Point::new(new_end.row + 4, 0).min(self.applied_edits_snapshot.max_point());
Some(unified_diff(
let diff_body = unified_diff_with_offsets(
&self
.old_snapshot
.text_for_range(start..old_end)
@@ -797,7 +801,17 @@ impl EditPreview {
.applied_edits_snapshot
.text_for_range(start..new_end)
.collect::<String>(),
))
start.row,
start.row,
);
let path = file.map(|f| f.path().as_unix_str());
let header = match path {
Some(p) => format!("--- a/{}\n+++ b/{}\n", p, p),
None => String::new(),
};
Some(format!("{}{}", header, diff_body))
}
pub fn highlight_edits(

View File

@@ -1,6 +1,6 @@
name = "JSONC"
grammar = "jsonc"
path_suffixes = ["jsonc", "bun.lock", "tsconfig.json", "pyrightconfig.json"]
path_suffixes = ["jsonc", "bun.lock", "devcontainer.json", "pyrightconfig.json", "tsconfig.json"]
line_comments = ["// "]
autoclose_before = ",]}"
brackets = [

View File

@@ -8,6 +8,7 @@ use language::LanguageName;
use log::Level;
pub use path_range::{LineCol, PathWithRange};
use ui::Checkbox;
use ui::CopyButton;
use std::borrow::Cow;
use std::iter;
@@ -32,7 +33,7 @@ use parser::{MarkdownEvent, MarkdownTag, MarkdownTagEnd, parse_links_only, parse
use pulldown_cmark::Alignment;
use sum_tree::TreeMap;
use theme::SyntaxTheme;
use ui::{ScrollAxes, Scrollbars, Tooltip, WithScrollbar, prelude::*};
use ui::{ScrollAxes, Scrollbars, WithScrollbar, prelude::*};
use util::ResultExt;
use crate::parser::CodeBlockKind;
@@ -1202,7 +1203,6 @@ impl Element for MarkdownElement {
range.end,
code,
self.markdown.clone(),
cx,
);
el.child(
h_flex()
@@ -1233,7 +1233,6 @@ impl Element for MarkdownElement {
range.end,
code,
self.markdown.clone(),
cx,
);
el.child(
h_flex()
@@ -1449,26 +1448,12 @@ fn render_copy_code_block_button(
id: usize,
code: String,
markdown: Entity<Markdown>,
cx: &App,
) -> impl IntoElement {
let id = ElementId::named_usize("copy-markdown-code", id);
let was_copied = markdown.read(cx).copied_code_blocks.contains(&id);
IconButton::new(
id.clone(),
if was_copied {
IconName::Check
} else {
IconName::Copy
},
)
.icon_color(Color::Muted)
.icon_size(IconSize::Small)
.style(ButtonStyle::Filled)
.shape(ui::IconButtonShape::Square)
.tooltip(Tooltip::text("Copy"))
.on_click({
CopyButton::new(code.clone()).custom_on_click({
let markdown = markdown;
move |_event, _window, cx| {
move |_window, cx| {
let id = id.clone();
markdown.update(cx, |this, cx| {
this.copied_code_blocks.insert(id.clone());

View File

@@ -6,10 +6,10 @@ use crate::markdown_elements::{
};
use fs::normalize_path;
use gpui::{
AbsoluteLength, AnyElement, App, AppContext as _, ClipboardItem, Context, Div, Element,
ElementId, Entity, HighlightStyle, Hsla, ImageSource, InteractiveText, IntoElement, Keystroke,
Modifiers, ParentElement, Render, Resource, SharedString, Styled, StyledText, TextStyle,
WeakEntity, Window, div, img, px, rems,
AbsoluteLength, AnyElement, App, AppContext as _, Context, Div, Element, ElementId, Entity,
HighlightStyle, Hsla, ImageSource, InteractiveText, IntoElement, Keystroke, Modifiers,
ParentElement, Render, Resource, SharedString, Styled, StyledText, TextStyle, WeakEntity,
Window, div, img, px, rems,
};
use settings::Settings;
use std::{
@@ -18,12 +18,7 @@ use std::{
vec,
};
use theme::{ActiveTheme, SyntaxTheme, ThemeSettings};
use ui::{
ButtonCommon, Clickable, Color, FluentBuilder, IconButton, IconName, IconSize,
InteractiveElement, Label, LabelCommon, LabelSize, LinkPreview, Pixels, Rems,
StatefulInteractiveElement, StyledExt, StyledImage, ToggleState, Tooltip, VisibleOnHover,
h_flex, tooltip_container, v_flex,
};
use ui::{CopyButton, LinkPreview, ToggleState, prelude::*, tooltip_container};
use workspace::{OpenOptions, OpenVisible, Workspace};
pub struct CheckboxClickedEvent {
@@ -626,15 +621,8 @@ fn render_markdown_code_block(
StyledText::new(parsed.contents.clone())
};
let copy_block_button = IconButton::new("copy-code", IconName::Copy)
.icon_size(IconSize::Small)
.on_click({
let contents = parsed.contents.clone();
move |_, _window, cx| {
cx.write_to_clipboard(ClipboardItem::new_string(contents.to_string()));
}
})
.tooltip(Tooltip::text("Copy code block"))
let copy_block_button = CopyButton::new(parsed.contents.clone())
.tooltip_label("Copy Code Block")
.visible_on_hover("markdown-block");
let font = gpui::Font {

View File

@@ -1869,6 +1869,8 @@ pub struct BuiltinAgentServerSettings {
pub default_mode: Option<String>,
pub default_model: Option<String>,
pub favorite_models: Vec<String>,
pub default_config_options: HashMap<String, String>,
pub favorite_config_option_values: HashMap<String, Vec<String>>,
}
impl BuiltinAgentServerSettings {
@@ -1893,6 +1895,8 @@ impl From<settings::BuiltinAgentServerSettings> for BuiltinAgentServerSettings {
default_mode: value.default_mode,
default_model: value.default_model,
favorite_models: value.favorite_models,
default_config_options: value.default_config_options,
favorite_config_option_values: value.favorite_config_option_values,
}
}
}
@@ -1928,6 +1932,18 @@ pub enum CustomAgentServerSettings {
///
/// Default: []
favorite_models: Vec<String>,
/// Default values for session config options.
///
/// This is a map from config option ID to value ID.
///
/// Default: {}
default_config_options: HashMap<String, String>,
/// Favorited values for session config options.
///
/// This is a map from config option ID to a list of favorited value IDs.
///
/// Default: {}
favorite_config_option_values: HashMap<String, Vec<String>>,
},
Extension {
/// The default mode to use for this agent.
@@ -1946,6 +1962,18 @@ pub enum CustomAgentServerSettings {
///
/// Default: []
favorite_models: Vec<String>,
/// Default values for session config options.
///
/// This is a map from config option ID to value ID.
///
/// Default: {}
default_config_options: HashMap<String, String>,
/// Favorited values for session config options.
///
/// This is a map from config option ID to a list of favorited value IDs.
///
/// Default: {}
favorite_config_option_values: HashMap<String, Vec<String>>,
},
}
@@ -1983,6 +2011,34 @@ impl CustomAgentServerSettings {
} => favorite_models,
}
}
pub fn default_config_option(&self, config_id: &str) -> Option<&str> {
match self {
CustomAgentServerSettings::Custom {
default_config_options,
..
}
| CustomAgentServerSettings::Extension {
default_config_options,
..
} => default_config_options.get(config_id).map(|s| s.as_str()),
}
}
pub fn favorite_config_option_values(&self, config_id: &str) -> Option<&[String]> {
match self {
CustomAgentServerSettings::Custom {
favorite_config_option_values,
..
}
| CustomAgentServerSettings::Extension {
favorite_config_option_values,
..
} => favorite_config_option_values
.get(config_id)
.map(|v| v.as_slice()),
}
}
}
impl From<settings::CustomAgentServerSettings> for CustomAgentServerSettings {
@@ -1995,6 +2051,8 @@ impl From<settings::CustomAgentServerSettings> for CustomAgentServerSettings {
default_mode,
default_model,
favorite_models,
default_config_options,
favorite_config_option_values,
} => CustomAgentServerSettings::Custom {
command: AgentServerCommand {
path: PathBuf::from(shellexpand::tilde(&path.to_string_lossy()).as_ref()),
@@ -2004,15 +2062,21 @@ impl From<settings::CustomAgentServerSettings> for CustomAgentServerSettings {
default_mode,
default_model,
favorite_models,
default_config_options,
favorite_config_option_values,
},
settings::CustomAgentServerSettings::Extension {
default_mode,
default_model,
default_config_options,
favorite_models,
favorite_config_option_values,
} => CustomAgentServerSettings::Extension {
default_mode,
default_model,
default_config_options,
favorite_models,
favorite_config_option_values,
},
}
}
@@ -2339,6 +2403,8 @@ mod extension_agent_tests {
default_mode: None,
default_model: None,
favorite_models: vec![],
default_config_options: Default::default(),
favorite_config_option_values: Default::default(),
};
let BuiltinAgentServerSettings { path, .. } = settings.into();
@@ -2356,6 +2422,8 @@ mod extension_agent_tests {
default_mode: None,
default_model: None,
favorite_models: vec![],
default_config_options: Default::default(),
favorite_config_option_values: Default::default(),
};
let converted: CustomAgentServerSettings = settings.into();
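The accessor methods added in this hunk reduce to plain `HashMap` lookups that borrow the stored strings. A minimal gpui-free sketch (the map contents here are hypothetical, standing in for the settings fields `default_config_options` and `favorite_config_option_values`):

```rust
use std::collections::HashMap;

// Stand-in for `default_config_option`: map a config option ID to its value ID.
fn default_config_option<'a>(
    map: &'a HashMap<String, String>,
    config_id: &str,
) -> Option<&'a str> {
    map.get(config_id).map(|s| s.as_str())
}

// Stand-in for `favorite_config_option_values`: borrow the favorited value IDs.
fn favorite_values<'a>(
    map: &'a HashMap<String, Vec<String>>,
    config_id: &str,
) -> Option<&'a [String]> {
    map.get(config_id).map(|v| v.as_slice())
}

fn main() {
    let mut defaults = HashMap::new();
    defaults.insert("thinking".to_string(), "high".to_string());
    let mut favorites = HashMap::new();
    favorites.insert("thinking".to_string(), vec!["low".to_string(), "high".to_string()]);

    assert_eq!(default_config_option(&defaults, "thinking"), Some("high"));
    assert_eq!(default_config_option(&defaults, "unknown"), None);
    assert_eq!(favorite_values(&favorites, "thinking").map(|v| v.len()), Some(2));
}
```

Returning `Option<&str>` / `Option<&[String]>` rather than cloning keeps callers borrowing directly from the settings, matching the `.map(|s| s.as_str())` and `.map(|v| v.as_slice())` calls in the diff.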

View File

@@ -216,7 +216,7 @@ impl ProjectEnvironment {
let shell = shell.clone();
let tx = self.environment_error_messages_tx.clone();
cx.spawn(async move |cx| {
let mut shell_env = cx
let mut shell_env = match cx
.background_spawn(load_directory_shell_environment(
shell,
abs_path.clone(),
@@ -224,7 +224,15 @@ impl ProjectEnvironment {
tx,
))
.await
.log_err();
{
Ok(shell_env) => Some(shell_env),
Err(e) => {
log::error!(
"Failed to load shell environment for directory {abs_path:?}: {e:#}"
);
None
}
};
if let Some(shell_env) = shell_env.as_mut() {
let path = shell_env
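The hunk above swaps a generic `.log_err()` for an explicit `match` so the log line can name the directory that failed. A minimal sketch of that pattern, with `eprintln!` standing in for Zed's `log::error!` and the helper name being hypothetical:

```rust
// Convert a Result into an Option, logging the error with extra context
// instead of discarding it silently.
fn ok_or_log<T, E: std::fmt::Display>(result: Result<T, E>, abs_path: &str) -> Option<T> {
    match result {
        Ok(value) => Some(value),
        Err(e) => {
            eprintln!("Failed to load shell environment for directory {abs_path:?}: {e}");
            None
        }
    }
}

fn main() {
    assert_eq!(ok_or_log(Ok::<_, String>(1), "/tmp"), Some(1));
    assert_eq!(ok_or_log(Err::<i32, _>("boom".to_string()), "/tmp"), None);
}
```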

View File

@@ -1293,34 +1293,13 @@ impl Project {
cx.subscribe(&worktree_store, Self::on_worktree_store_event)
.detach();
if init_worktree_trust {
let trust_remote_project = match &connection_options {
RemoteConnectionOptions::Ssh(..) | RemoteConnectionOptions::Wsl(..) => false,
RemoteConnectionOptions::Docker(..) => true,
};
let remote_host = RemoteHostLocation::from(connection_options);
trusted_worktrees::track_worktree_trust(
worktree_store.clone(),
Some(remote_host.clone()),
Some(RemoteHostLocation::from(connection_options)),
None,
Some((remote_proto.clone(), REMOTE_SERVER_PROJECT_ID)),
cx,
);
if trust_remote_project {
if let Some(trusted_worktres) = TrustedWorktrees::try_get_global(cx) {
trusted_worktres.update(cx, |trusted_worktres, cx| {
trusted_worktres.trust(
worktree_store
.read(cx)
.worktrees()
.map(|worktree| worktree.read(cx).id())
.map(PathTrust::Worktree)
.collect(),
Some(remote_host),
cx,
);
})
}
}
}
let weak_self = cx.weak_entity();
@@ -4559,13 +4538,19 @@ impl Project {
for worktree in worktree_store.visible_worktrees(cx) {
let worktree = worktree.read(cx);
if let Ok(path) = RelPath::new(path, path_style)
&& let Some(entry) = worktree.entry_for_path(&path)
{
return Some(ProjectPath {
worktree_id: worktree.id(),
path: entry.path.clone(),
});
if let Ok(rel_path) = RelPath::new(path, path_style) {
if let Some(entry) = worktree.entry_for_path(&rel_path) {
return Some(ProjectPath {
worktree_id: worktree.id(),
path: entry.path.clone(),
});
}
if worktree_store.visible_worktrees(cx).count() == 1 {
return Some(ProjectPath {
worktree_id: worktree.id(),
path: rel_path.into_arc(),
});
}
}
}
}

View File

@@ -3599,6 +3599,7 @@ impl ProjectPanel {
== worktree_snapshot.root_entry()
{
let Some(path_name) = worktree_abs_path.file_name() else {
entry_iter.advance();
continue;
};
let depth = 0;

View File

@@ -37,10 +37,7 @@ use editor::{Editor, MultiBuffer};
use gpui::{AnyElement, ClipboardItem, Entity, Render, WeakEntity};
use language::Buffer;
use runtimelib::{ExecutionState, JupyterMessageContent, MimeBundle, MimeType};
use ui::{
ButtonStyle, CommonAnimationExt, Context, IconButton, IconName, IntoElement, Styled, Tooltip,
Window, div, h_flex, prelude::*, v_flex,
};
use ui::{CommonAnimationExt, CopyButton, IconButton, Tooltip, prelude::*};
mod image;
use image::ImageView;
@@ -236,89 +233,62 @@ impl Output {
Self::Image { content, .. } => {
Self::render_output_controls(content.clone(), workspace, window, cx)
}
Self::ErrorOutput(err) => {
// Add buttons for the traceback section
Some(
h_flex()
.pl_1()
.child(
IconButton::new(
ElementId::Name("copy-full-error-traceback".into()),
IconName::Copy,
)
.style(ButtonStyle::Transparent)
.tooltip(Tooltip::text("Copy Full Error"))
.on_click({
let ename = err.ename.clone();
let evalue = err.evalue.clone();
let traceback = err.traceback.clone();
move |_, _window, cx| {
Self::ErrorOutput(err) => Some(
h_flex()
.pl_1()
.child({
let ename = err.ename.clone();
let evalue = err.evalue.clone();
let traceback = err.traceback.clone();
let traceback_text = traceback.read(cx).full_text();
let full_error = format!("{}: {}\n{}", ename, evalue, traceback_text);
CopyButton::new(full_error).tooltip_label("Copy Full Error")
})
.child(
IconButton::new(
ElementId::Name("open-full-error-in-buffer-traceback".into()),
IconName::FileTextOutlined,
)
.style(ButtonStyle::Transparent)
.tooltip(Tooltip::text("Open Full Error in Buffer"))
.on_click({
let ename = err.ename.clone();
let evalue = err.evalue.clone();
let traceback = err.traceback.clone();
move |_, window, cx| {
if let Some(workspace) = workspace.upgrade() {
let traceback_text = traceback.read(cx).full_text();
let full_error =
format!("{}: {}\n{}", ename, evalue, traceback_text);
let clipboard_content =
ClipboardItem::new_string(full_error);
cx.write_to_clipboard(clipboard_content);
}
}),
)
.child(
IconButton::new(
ElementId::Name("open-full-error-in-buffer-traceback".into()),
IconName::FileTextOutlined,
)
.style(ButtonStyle::Transparent)
.tooltip(Tooltip::text("Open Full Error in Buffer"))
.on_click({
let ename = err.ename.clone();
let evalue = err.evalue.clone();
let traceback = err.traceback.clone();
move |_, window, cx| {
if let Some(workspace) = workspace.upgrade() {
let traceback_text = traceback.read(cx).full_text();
let full_error = format!(
"{}: {}\n{}",
ename, evalue, traceback_text
let buffer = cx.new(|cx| {
let mut buffer = Buffer::local(full_error, cx)
.with_language(language::PLAIN_TEXT.clone(), cx);
buffer
.set_capability(language::Capability::ReadOnly, cx);
buffer
});
let editor = Box::new(cx.new(|cx| {
let multibuffer = cx.new(|cx| {
let mut multi_buffer =
MultiBuffer::singleton(buffer.clone(), cx);
multi_buffer
.set_title("Full Error".to_string(), cx);
multi_buffer
});
Editor::for_multibuffer(multibuffer, None, window, cx)
}));
workspace.update(cx, |workspace, cx| {
workspace.add_item_to_active_pane(
editor, None, true, window, cx,
);
let buffer = cx.new(|cx| {
let mut buffer = Buffer::local(full_error, cx)
.with_language(
language::PLAIN_TEXT.clone(),
cx,
);
buffer.set_capability(
language::Capability::ReadOnly,
cx,
);
buffer
});
let editor = Box::new(cx.new(|cx| {
let multibuffer = cx.new(|cx| {
let mut multi_buffer =
MultiBuffer::singleton(buffer.clone(), cx);
multi_buffer
.set_title("Full Error".to_string(), cx);
multi_buffer
});
Editor::for_multibuffer(
multibuffer,
None,
window,
cx,
)
}));
workspace.update(cx, |workspace, cx| {
workspace.add_item_to_active_pane(
editor, None, true, window, cx,
);
});
}
});
}
}),
)
.into_any_element(),
)
}
}
}),
)
.into_any_element(),
),
Self::Message(_) => None,
Self::Table { content, .. } => {
Self::render_output_controls(content.clone(), workspace, window, cx)

View File

@@ -4,6 +4,7 @@ use crate::{
FocusSearch, NextHistoryQuery, PreviousHistoryQuery, ReplaceAll, ReplaceNext, SearchOption,
SearchOptions, SearchSource, SelectAllMatches, SelectNextMatch, SelectPreviousMatch,
ToggleCaseSensitive, ToggleRegex, ToggleReplace, ToggleSelection, ToggleWholeWord,
buffer_search::registrar::WithResultsOrExternalQuery,
search_bar::{ActionButtonState, input_base_styles, render_action_button, render_text_input},
};
use any_vec::AnyVec;
@@ -43,7 +44,7 @@ use workspace::{
};
pub use registrar::DivRegistrar;
use registrar::{ForDeployed, ForDismissed, SearchActionsRegistrar, WithResults};
use registrar::{ForDeployed, ForDismissed, SearchActionsRegistrar};
const MAX_BUFFER_SEARCH_HISTORY_SIZE: usize = 50;
@@ -110,6 +111,8 @@ pub struct BufferSearchBar {
active_searchable_item_subscriptions: Option<[Subscription; 2]>,
#[cfg(not(target_os = "macos"))]
active_searchable_item_subscriptions: Option<Subscription>,
#[cfg(target_os = "macos")]
pending_external_query: Option<(String, SearchOptions)>,
active_search: Option<Arc<SearchQuery>>,
searchable_items_with_matches: HashMap<Box<dyn WeakSearchableItemHandle>, AnyVec<dyn Send>>,
pending_search: Option<Task<()>>,
@@ -510,24 +513,27 @@ impl ToolbarItemView for BufferSearchBar {
}
cx.defer_in(window, |this, window, cx| {
if let Some(item) = cx.read_from_find_pasteboard()
&& let Some(text) = item.text()
{
if this.query(cx) != text {
let search_options = item
.metadata()
.and_then(|m| m.parse().ok())
.and_then(SearchOptions::from_bits)
.unwrap_or(this.search_options);
let Some(item) = cx.read_from_find_pasteboard() else {
return;
};
let Some(text) = item.text() else {
return;
};
drop(this.search(
&text,
Some(search_options),
true,
window,
cx,
));
}
if this.query(cx) == text {
return;
}
let search_options = item
.metadata()
.and_then(|m| m.parse().ok())
.and_then(SearchOptions::from_bits)
.unwrap_or(this.search_options);
if this.dismissed {
this.pending_external_query = Some((text, search_options));
} else {
drop(this.search(&text, Some(search_options), true, window, cx));
}
});
}),
@@ -594,14 +600,16 @@ impl BufferSearchBar {
cx.propagate();
}
}));
registrar.register_handler(WithResults(|this, action: &SelectNextMatch, window, cx| {
if this.supported_options(cx).find_in_results {
cx.propagate();
} else {
this.select_next_match(action, window, cx);
}
}));
registrar.register_handler(WithResults(
registrar.register_handler(WithResultsOrExternalQuery(
|this, action: &SelectNextMatch, window, cx| {
if this.supported_options(cx).find_in_results {
cx.propagate();
} else {
this.select_next_match(action, window, cx);
}
},
));
registrar.register_handler(WithResultsOrExternalQuery(
|this, action: &SelectPreviousMatch, window, cx| {
if this.supported_options(cx).find_in_results {
cx.propagate();
@@ -610,7 +618,7 @@ impl BufferSearchBar {
}
},
));
registrar.register_handler(WithResults(
registrar.register_handler(WithResultsOrExternalQuery(
|this, action: &SelectAllMatches, window, cx| {
if this.supported_options(cx).find_in_results {
cx.propagate();
@@ -707,6 +715,8 @@ impl BufferSearchBar {
replacement_editor_focused: false,
active_searchable_item: None,
active_searchable_item_subscriptions: None,
#[cfg(target_os = "macos")]
pending_external_query: None,
active_match_index: None,
searchable_items_with_matches: Default::default(),
default_options: search_options,
@@ -852,11 +862,20 @@ impl BufferSearchBar {
self.search(&suggestion, Some(self.default_options), true, window, cx)
});
#[cfg(target_os = "macos")]
let search = search.or_else(|| {
self.pending_external_query
.take()
.map(|(query, options)| self.search(&query, Some(options), true, window, cx))
});
if let Some(search) = search {
cx.spawn_in(window, async move |this, cx| {
if search.await.is_ok() {
this.update_in(cx, |this, window, cx| {
this.activate_current_match(window, cx)
if !this.dismissed {
this.activate_current_match(window, cx)
}
})
} else {
Ok(())
@@ -1071,6 +1090,22 @@ impl BufferSearchBar {
window: &mut Window,
cx: &mut Context<Self>,
) {
#[cfg(target_os = "macos")]
if let Some((query, options)) = self.pending_external_query.take() {
let search_rx = self.search(&query, Some(options), true, window, cx);
cx.spawn_in(window, async move |this, cx| {
if search_rx.await.is_ok() {
this.update_in(cx, |this, window, cx| {
this.activate_current_match(window, cx);
})
.ok();
}
})
.detach();
return;
}
if let Some(index) = self.active_match_index
&& let Some(searchable_item) = self.active_searchable_item.as_ref()
&& let Some(matches) = self
@@ -1271,6 +1306,8 @@ impl BufferSearchBar {
let (done_tx, done_rx) = oneshot::channel();
let query = self.query(cx);
self.pending_search.take();
#[cfg(target_os = "macos")]
self.pending_external_query.take();
if let Some(active_searchable_item) = self.active_searchable_item.as_ref() {
self.query_error = None;
@@ -1367,8 +1404,8 @@ impl BufferSearchBar {
cx,
);
}
let _ = done_tx.send(());
}
let _ = done_tx.send(());
cx.notify();
}
})

View File

@@ -149,16 +149,16 @@ impl<A: Action> ActionExecutor<A> for ForDeployed<A> {
}
}
/// Run an action when the search bar has any matches, regardless of whether it
/// is visible or not.
pub struct WithResults<A>(pub(super) SearchBarActionCallback<A>);
impl<A> Clone for WithResults<A> {
/// Run an action when the search bar has any matches or a pending external query,
/// regardless of whether it is visible or not.
pub struct WithResultsOrExternalQuery<A>(pub(super) SearchBarActionCallback<A>);
impl<A> Clone for WithResultsOrExternalQuery<A> {
fn clone(&self) -> Self {
Self(self.0)
}
}
impl<A: Action> ActionExecutor<A> for WithResults<A> {
impl<A: Action> ActionExecutor<A> for WithResultsOrExternalQuery<A> {
fn execute(
&self,
search_bar: &mut BufferSearchBar,
@@ -166,7 +166,13 @@ impl<A: Action> ActionExecutor<A> for WithResults<A> {
window: &mut Window,
cx: &mut Context<BufferSearchBar>,
) -> DidHandleAction {
if search_bar.active_match_index.is_some() {
#[cfg(not(target_os = "macos"))]
let has_external_query = false;
#[cfg(target_os = "macos")]
let has_external_query = search_bar.pending_external_query.is_some();
if has_external_query || search_bar.active_match_index.is_some() {
self.0(search_bar, action, window, cx);
true
} else {

View File

@@ -1531,14 +1531,20 @@ impl ProjectSearchView {
fn update_match_index(&mut self, cx: &mut Context<Self>) {
let results_editor = self.results_editor.read(cx);
let match_ranges = self.entity.read(cx).match_ranges.clone();
let new_index = active_match_index(
Direction::Next,
&match_ranges,
&results_editor.selections.newest_anchor().head(),
&results_editor.buffer().read(cx).snapshot(cx),
);
self.highlight_matches(&match_ranges, new_index, cx);
let newest_anchor = results_editor.selections.newest_anchor().head();
let buffer_snapshot = results_editor.buffer().read(cx).snapshot(cx);
let new_index = self.entity.update(cx, |this, cx| {
let new_index = active_match_index(
Direction::Next,
&this.match_ranges,
&newest_anchor,
&buffer_snapshot,
);
self.highlight_matches(&this.match_ranges, new_index, cx);
new_index
});
if self.active_match_index != new_index {
self.active_match_index = new_index;
cx.notify();
@@ -1550,7 +1556,7 @@ impl ProjectSearchView {
&self,
match_ranges: &[Range<Anchor>],
active_index: Option<usize>,
cx: &mut Context<Self>,
cx: &mut App,
) {
self.results_editor.update(cx, |editor, cx| {
editor.highlight_background::<Self>(

View File

@@ -370,6 +370,20 @@ pub struct BuiltinAgentServerSettings {
/// Default: []
#[serde(default)]
pub favorite_models: Vec<String>,
/// Default values for session config options.
///
/// This is a map from config option ID to value ID.
///
/// Default: {}
#[serde(default)]
pub default_config_options: HashMap<String, String>,
/// Favorited values for session config options.
///
/// This is a map from config option ID to a list of favorited value IDs.
///
/// Default: {}
#[serde(default)]
pub favorite_config_option_values: HashMap<String, Vec<String>>,
}
#[with_fallible_options]
@@ -401,6 +415,20 @@ pub enum CustomAgentServerSettings {
/// Default: []
#[serde(default)]
favorite_models: Vec<String>,
/// Default values for session config options.
///
/// This is a map from config option ID to value ID.
///
/// Default: {}
#[serde(default)]
default_config_options: HashMap<String, String>,
/// Favorited values for session config options.
///
/// This is a map from config option ID to a list of favorited value IDs.
///
/// Default: {}
#[serde(default)]
favorite_config_option_values: HashMap<String, Vec<String>>,
},
Extension {
/// The default mode to use for this agent.
@@ -422,5 +450,19 @@ pub enum CustomAgentServerSettings {
/// Default: []
#[serde(default)]
favorite_models: Vec<String>,
/// Default values for session config options.
///
/// This is a map from config option ID to value ID.
///
/// Default: {}
#[serde(default)]
default_config_options: HashMap<String, String>,
/// Favorited values for session config options.
///
/// This is a map from config option ID to a list of favorited value IDs.
///
/// Default: {}
#[serde(default)]
favorite_config_option_values: HashMap<String, Vec<String>>,
},
}

View File

@@ -196,7 +196,7 @@ impl SvgPreviewView {
.as_singleton()
.and_then(|buffer| buffer.read(cx).file())
.is_some_and(|file| {
file.path()
std::path::Path::new(file.file_name(cx))
.extension()
.is_some_and(|ext| ext.eq_ignore_ascii_case("svg"))
})

View File

@@ -1065,43 +1065,73 @@ impl Element for TerminalElement {
// then have that representation be converted to the appropriate highlight data structure
let content_mode = self.terminal_view.read(cx).content_mode(window, cx);
let (rects, batched_text_runs) = match content_mode {
ContentMode::Scrollable => {
// In scrollable mode, the terminal already provides cells
// that are correctly positioned for the current viewport
// based on its display_offset. We don't need additional filtering.
TerminalElement::layout_grid(
cells.iter().cloned(),
0,
&text_style,
last_hovered_word.as_ref().map(|last_hovered_word| {
(link_style, &last_hovered_word.word_match)
}),
minimum_contrast,
cx,
)
}
ContentMode::Inline { .. } => {
let intersection = window.content_mask().bounds.intersect(&bounds);
let start_row = (intersection.top() - bounds.top()) / line_height_px;
let end_row = start_row + intersection.size.height / line_height_px;
let line_range = (start_row as i32)..=(end_row as i32);
TerminalElement::layout_grid(
cells
.iter()
.skip_while(|i| &i.point.line < line_range.start())
.take_while(|i| &i.point.line <= line_range.end())
.cloned(),
*line_range.start(),
&text_style,
last_hovered_word.as_ref().map(|last_hovered_word| {
(link_style, &last_hovered_word.word_match)
}),
minimum_contrast,
cx,
)
}
// Calculate the intersection of the terminal's bounds with the current
// content mask (the visible viewport after all parent clipping).
// This allows us to only render cells that are actually visible, which is
// critical for performance when terminals are inside scrollable containers
// like the Agent Panel thread view.
//
// This optimization is analogous to the editor optimization in PR #45077
// which fixed performance issues with large AutoHeight editors inside Lists.
let visible_bounds = window.content_mask().bounds;
let intersection = visible_bounds.intersect(&bounds);
// If the terminal is entirely outside the viewport, skip all cell processing.
// This handles the case where the terminal has been scrolled past (above or
// below the viewport), similar to the editor fix in PR #45077 where start_row
// could exceed max_row when the editor was positioned above the viewport.
let (rects, batched_text_runs) = if intersection.size.height <= px(0.)
|| intersection.size.width <= px(0.)
{
(Vec::new(), Vec::new())
} else if intersection == bounds {
// Fast path: terminal fully visible, no clipping needed.
// Avoid grouping/allocation overhead by streaming cells directly.
TerminalElement::layout_grid(
cells.iter().cloned(),
0,
&text_style,
last_hovered_word
.as_ref()
.map(|last_hovered_word| (link_style, &last_hovered_word.word_match)),
minimum_contrast,
cx,
)
} else {
// Calculate which screen rows are visible based on pixel positions.
// This works for both Scrollable and Inline modes because we filter
// by screen position (enumerated line group index), not by the cell's
// internal line number (which can be negative in Scrollable mode for
// scrollback history).
let rows_above_viewport =
((intersection.top() - bounds.top()).max(px(0.)) / line_height_px) as usize;
let visible_row_count =
(intersection.size.height / line_height_px).ceil() as usize + 1;
// Group cells by line and filter to only the visible screen rows.
// skip() and take() work on enumerated line groups (screen position),
// making this work regardless of the actual cell.point.line values.
let visible_cells: Vec<_> = cells
.iter()
.chunk_by(|c| c.point.line)
.into_iter()
.skip(rows_above_viewport)
.take(visible_row_count)
.flat_map(|(_, line_cells)| line_cells)
.cloned()
.collect();
TerminalElement::layout_grid(
visible_cells.into_iter(),
rows_above_viewport as i32,
&text_style,
last_hovered_word
.as_ref()
.map(|last_hovered_word| (link_style, &last_hovered_word.word_match)),
minimum_contrast,
cx,
)
};
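The `rows_above_viewport` / `visible_row_count` computation above is plain pixel arithmetic on the clipped intersection. A self-contained sketch of that math (function name is illustrative):

```rust
// Given the intersection of the terminal bounds with the content mask,
// compute how many screen rows to skip and how many to render.
fn visible_rows(
    intersection_top: f32,
    bounds_top: f32,
    intersection_height: f32,
    line_height: f32,
) -> (usize, usize) {
    let rows_above = ((intersection_top - bounds_top).max(0.0) / line_height) as usize;
    // ceil() plus one extra row covers partially visible lines at both edges.
    let row_count = (intersection_height / line_height).ceil() as usize + 1;
    (rows_above, row_count)
}

fn main() {
    // Terminal top is 100px above the viewport; 20px lines; 210px visible.
    assert_eq!(visible_rows(0.0, -100.0, 210.0, 20.0), (5, 12));
    // Fully visible terminal skips nothing.
    assert_eq!(visible_rows(0.0, 0.0, 200.0, 20.0), (0, 11));
}
```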
// Layout cursor. Rectangle is used for IME, so we should lay it out even
@@ -2059,4 +2089,248 @@ mod tests {
let merged2 = merge_background_regions(regions2);
assert_eq!(merged2.len(), 3);
}
#[test]
fn test_screen_position_filtering_with_positive_lines() {
// Test the unified screen-position-based filtering approach.
// This works for both Scrollable and Inline modes because we filter
// by enumerated line group index, not by cell.point.line values.
use itertools::Itertools;
use terminal::IndexedCell;
use terminal::alacritty_terminal::index::{Column, Line, Point as AlacPoint};
use terminal::alacritty_terminal::term::cell::Cell;
// Create mock cells for lines 0-23 (typical terminal with 24 visible lines)
let mut cells = Vec::new();
for line in 0..24i32 {
for col in 0..3i32 {
cells.push(IndexedCell {
point: AlacPoint::new(Line(line), Column(col as usize)),
cell: Cell::default(),
});
}
}
// Scenario: Terminal partially scrolled above viewport
// First 5 lines (0-4) are clipped, lines 5-15 should be visible
let rows_above_viewport = 5usize;
let visible_row_count = 11usize;
// Apply the same filtering logic as in the render code
let filtered: Vec<_> = cells
.iter()
.chunk_by(|c| c.point.line)
.into_iter()
.skip(rows_above_viewport)
.take(visible_row_count)
.flat_map(|(_, line_cells)| line_cells)
.collect();
// Should have lines 5-15 (11 lines * 3 cells each = 33 cells)
assert_eq!(filtered.len(), 11 * 3, "Should have 33 cells for 11 lines");
// First filtered cell should be line 5
assert_eq!(
filtered.first().unwrap().point.line,
Line(5),
"First cell should be on line 5"
);
// Last filtered cell should be line 15
assert_eq!(
filtered.last().unwrap().point.line,
Line(15),
"Last cell should be on line 15"
);
}
#[test]
fn test_screen_position_filtering_with_negative_lines() {
// This is the key test! In Scrollable mode, cells have NEGATIVE line numbers
// for scrollback history. The screen-position filtering approach works because
// we filter by enumerated line group index, not by cell.point.line values.
use itertools::Itertools;
use terminal::IndexedCell;
use terminal::alacritty_terminal::index::{Column, Line, Point as AlacPoint};
use terminal::alacritty_terminal::term::cell::Cell;
// Simulate cells from a scrolled terminal with scrollback
// These have negative line numbers representing scrollback history
let mut scrollback_cells = Vec::new();
for line in -588i32..=-578i32 {
for col in 0..80i32 {
scrollback_cells.push(IndexedCell {
point: AlacPoint::new(Line(line), Column(col as usize)),
cell: Cell::default(),
});
}
}
// Scenario: First 3 screen rows clipped, show next 5 rows
let rows_above_viewport = 3usize;
let visible_row_count = 5usize;
// Apply the same filtering logic as in the render code
let filtered: Vec<_> = scrollback_cells
.iter()
.chunk_by(|c| c.point.line)
.into_iter()
.skip(rows_above_viewport)
.take(visible_row_count)
.flat_map(|(_, line_cells)| line_cells)
.collect();
// Should have 5 lines * 80 cells = 400 cells
assert_eq!(filtered.len(), 5 * 80, "Should have 400 cells for 5 lines");
// First filtered cell should be line -585 (skipped 3 lines from -588)
assert_eq!(
filtered.first().unwrap().point.line,
Line(-585),
"First cell should be on line -585"
);
// Last filtered cell should be line -581 (5 lines: -585, -584, -583, -582, -581)
assert_eq!(
filtered.last().unwrap().point.line,
Line(-581),
"Last cell should be on line -581"
);
}
#[test]
fn test_screen_position_filtering_skip_all() {
// Test what happens when we skip more rows than exist
use itertools::Itertools;
use terminal::IndexedCell;
use terminal::alacritty_terminal::index::{Column, Line, Point as AlacPoint};
use terminal::alacritty_terminal::term::cell::Cell;
let mut cells = Vec::new();
for line in 0..10i32 {
cells.push(IndexedCell {
point: AlacPoint::new(Line(line), Column(0)),
cell: Cell::default(),
});
}
// Skip more rows than exist
let rows_above_viewport = 100usize;
let visible_row_count = 5usize;
let filtered: Vec<_> = cells
.iter()
.chunk_by(|c| c.point.line)
.into_iter()
.skip(rows_above_viewport)
.take(visible_row_count)
.flat_map(|(_, line_cells)| line_cells)
.collect();
assert_eq!(
filtered.len(),
0,
"Should have no cells when all are skipped"
);
}
#[test]
fn test_layout_grid_positioning_math() {
// Test the math that layout_grid uses for positioning.
// When we skip N rows, we pass N as start_line_offset to layout_grid,
// which positions the first visible line at screen row N.
// Scenario: Terminal at y=-100px, line_height=20px
// First 5 screen rows are above viewport (clipped)
// So we skip 5 rows and pass offset=5 to layout_grid
let terminal_origin_y = -100.0f32;
let line_height = 20.0f32;
let rows_skipped = 5;
// The first visible line (at offset 5) renders at:
// y = terminal_origin + offset * line_height = -100 + 5*20 = 0
let first_visible_y = terminal_origin_y + rows_skipped as f32 * line_height;
assert_eq!(
first_visible_y, 0.0,
"First visible line should be at viewport top (y=0)"
);
// The 6th visible line (at offset 10) renders at:
let sixth_visible_y = terminal_origin_y + (rows_skipped + 5) as f32 * line_height;
assert_eq!(
sixth_visible_y, 100.0,
"6th visible line should be at y=100"
);
}
#[test]
fn test_unified_filtering_works_for_both_modes() {
// This test proves that the unified screen-position filtering approach
// works for BOTH positive line numbers (Inline mode) and negative line
// numbers (Scrollable mode with scrollback).
//
// The key insight: we filter by enumerated line group index (screen position),
// not by cell.point.line values. This makes the filtering agnostic to the
// actual line numbers in the cells.
use itertools::Itertools;
use terminal::IndexedCell;
use terminal::alacritty_terminal::index::{Column, Line, Point as AlacPoint};
use terminal::alacritty_terminal::term::cell::Cell;
// Test with positive line numbers (Inline mode style)
let positive_cells: Vec<_> = (0..10i32)
.flat_map(|line| {
(0..3i32).map(move |col| IndexedCell {
point: AlacPoint::new(Line(line), Column(col as usize)),
cell: Cell::default(),
})
})
.collect();
// Test with negative line numbers (Scrollable mode with scrollback)
let negative_cells: Vec<_> = (-10i32..0i32)
.flat_map(|line| {
(0..3i32).map(move |col| IndexedCell {
point: AlacPoint::new(Line(line), Column(col as usize)),
cell: Cell::default(),
})
})
.collect();
let rows_to_skip = 3usize;
let rows_to_take = 4usize;
// Filter positive cells
let positive_filtered: Vec<_> = positive_cells
.iter()
.chunk_by(|c| c.point.line)
.into_iter()
.skip(rows_to_skip)
.take(rows_to_take)
.flat_map(|(_, cells)| cells)
.collect();
// Filter negative cells
let negative_filtered: Vec<_> = negative_cells
.iter()
.chunk_by(|c| c.point.line)
.into_iter()
.skip(rows_to_skip)
.take(rows_to_take)
.flat_map(|(_, cells)| cells)
.collect();
// Both should have same count: 4 lines * 3 cells = 12
assert_eq!(positive_filtered.len(), 12);
assert_eq!(negative_filtered.len(), 12);
// Positive: lines 3, 4, 5, 6
assert_eq!(positive_filtered.first().unwrap().point.line, Line(3));
assert_eq!(positive_filtered.last().unwrap().point.line, Line(6));
// Negative: lines -7, -6, -5, -4
assert_eq!(negative_filtered.first().unwrap().point.line, Line(-7));
assert_eq!(negative_filtered.last().unwrap().point.line, Line(-4));
}
}
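The tests above rely on itertools' `chunk_by`; the same screen-position filtering can be sketched with only the standard library's `slice::chunk_by` (stable since Rust 1.77). The key property is identical: skipping and taking operate on the *group index* (screen row), so negative scrollback line numbers behave exactly like positive ones.

```rust
// Group a sorted list of cell line numbers into rows, then window by
// screen position rather than by line value.
fn filter_visible(cells: &[i32], rows_to_skip: usize, rows_to_take: usize) -> Vec<i32> {
    cells
        .chunk_by(|a, b| a == b) // one chunk per line
        .skip(rows_to_skip)
        .take(rows_to_take)
        .flatten()
        .copied()
        .collect()
}

fn main() {
    // Two "cells" per line; lines -3..=2 mimic scrollback plus viewport.
    let cells = [-3, -3, -2, -2, -1, -1, 0, 0, 1, 1, 2, 2];
    // Skip 2 screen rows, take 3: lines -1, 0, 1 survive.
    assert_eq!(filter_visible(&cells, 2, 3), vec![-1, -1, 0, 0, 1, 1]);
}
```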

View File

@@ -2,6 +2,7 @@ mod button;
mod button_icon;
mod button_like;
mod button_link;
mod copy_button;
mod icon_button;
mod split_button;
mod toggle_button;
@@ -9,6 +10,7 @@ mod toggle_button;
pub use button::*;
pub use button_like::*;
pub use button_link::*;
pub use copy_button::*;
pub use icon_button::*;
pub use split_button::*;
pub use toggle_button::*;

View File

@@ -208,7 +208,6 @@ impl ButtonStyle {
pub(crate) fn enabled(
self,
elevation: Option<ElevationIndex>,
cx: &mut App,
) -> ButtonLikeStyles {
match self {
@@ -249,7 +248,6 @@ impl ButtonStyle {
pub(crate) fn hovered(
self,
elevation: Option<ElevationIndex>,
cx: &mut App,
) -> ButtonLikeStyles {
match self {

View File

@@ -0,0 +1,162 @@
use gpui::{
AnyElement, App, ClipboardItem, IntoElement, ParentElement, RenderOnce, Styled, Window,
};
use crate::{Tooltip, prelude::*};
#[derive(IntoElement, RegisterComponent)]
pub struct CopyButton {
message: SharedString,
icon_size: IconSize,
disabled: bool,
tooltip_label: SharedString,
visible_on_hover: Option<SharedString>,
custom_on_click: Option<Box<dyn Fn(&mut Window, &mut App) + 'static>>,
}
impl CopyButton {
pub fn new(message: impl Into<SharedString>) -> Self {
Self {
message: message.into(),
icon_size: IconSize::Small,
disabled: false,
tooltip_label: "Copy".into(),
visible_on_hover: None,
custom_on_click: None,
}
}
pub fn icon_size(mut self, icon_size: IconSize) -> Self {
self.icon_size = icon_size;
self
}
pub fn disabled(mut self, disabled: bool) -> Self {
self.disabled = disabled;
self
}
pub fn tooltip_label(mut self, tooltip_label: impl Into<SharedString>) -> Self {
self.tooltip_label = tooltip_label.into();
self
}
pub fn visible_on_hover(mut self, visible_on_hover: impl Into<SharedString>) -> Self {
self.visible_on_hover = Some(visible_on_hover.into());
self
}
pub fn custom_on_click(
mut self,
custom_on_click: impl Fn(&mut Window, &mut App) + 'static,
) -> Self {
self.custom_on_click = Some(Box::new(custom_on_click));
self
}
}
impl RenderOnce for CopyButton {
fn render(self, _window: &mut Window, cx: &mut App) -> impl IntoElement {
let message = self.message;
let message_clone = message.clone();
let id = format!("copy-button-{}", message_clone);
let copied = cx
.read_from_clipboard()
.map(|item| item.text().as_ref() == Some(&message_clone.into()))
.unwrap_or(false);
let (icon, color, tooltip) = if copied {
(IconName::Check, Color::Success, "Copied!".into())
} else {
(IconName::Copy, Color::Muted, self.tooltip_label)
};
let custom_on_click = self.custom_on_click;
let visible_on_hover = self.visible_on_hover;
let button = IconButton::new(id, icon)
.icon_color(color)
.icon_size(self.icon_size)
.disabled(self.disabled)
.tooltip(Tooltip::text(tooltip))
.on_click(move |_, window, cx| {
if let Some(custom_on_click) = custom_on_click.as_ref() {
(custom_on_click)(window, cx);
} else {
cx.stop_propagation();
cx.write_to_clipboard(ClipboardItem::new_string(message.clone().into()));
}
});
if let Some(visible_on_hover) = visible_on_hover {
button.visible_on_hover(visible_on_hover)
} else {
button
}
}
}
impl Component for CopyButton {
fn scope() -> ComponentScope {
ComponentScope::Input
}
fn description() -> Option<&'static str> {
Some("An icon button that encapsulates the logic to copy a string to the clipboard.")
}
fn preview(_window: &mut Window, _cx: &mut App) -> Option<AnyElement> {
let label_text = "Here's an example label";
let mut counter: usize = 0;
let mut copy_b = || {
counter += 1;
CopyButton::new(format!(
"Here's an example label (id for uniqueness: {} — ignore this)",
counter
))
};
let example = vec![
single_example(
"Default",
h_flex()
.gap_1()
.child(Label::new(label_text).size(LabelSize::Small))
.child(copy_b())
.into_any_element(),
),
single_example(
"Multiple Icon Sizes",
h_flex()
.gap_1()
.child(Label::new(label_text).size(LabelSize::Small))
.child(copy_b().icon_size(IconSize::XSmall))
.child(copy_b().icon_size(IconSize::Medium))
.child(copy_b().icon_size(IconSize::XLarge))
.into_any_element(),
),
single_example(
"Custom Tooltip Label",
h_flex()
.gap_1()
.child(Label::new(label_text).size(LabelSize::Small))
.child(copy_b().tooltip_label("Custom tooltip label"))
.into_any_element(),
),
single_example(
"Visible On Hover",
h_flex()
.group("container")
.gap_1()
.child(Label::new(label_text).size(LabelSize::Small))
.child(copy_b().visible_on_hover("container"))
.into_any_element(),
),
];
Some(example_group(example).vertical().into_any_element())
}
}


@@ -164,6 +164,7 @@ impl RenderOnce for Callout {
.child(
v_flex()
.min_w_0()
.min_h_0()
.w_full()
.child(
h_flex()
@@ -189,20 +190,19 @@ impl RenderOnce for Callout {
}),
)
.map(|this| {
let base_desc_container = div()
.id("callout-description-slot")
.w_full()
.max_h_32()
.flex_1()
.overflow_y_scroll()
.text_ui_sm(cx);
if let Some(description_slot) = self.description_slot {
this.child(
div()
.w_full()
.flex_1()
.text_ui_sm(cx)
.child(description_slot),
)
this.child(base_desc_container.child(description_slot))
} else if let Some(description) = self.description {
this.child(
div()
.w_full()
.flex_1()
.text_ui_sm(cx)
base_desc_container
.text_color(cx.theme().colors().text_muted)
.child(description),
)
@@ -276,6 +276,39 @@ impl Component for Callout {
.into_any_element(),
)
.width(px(580.)),
single_example(
"Scrollable Long Description",
Callout::new()
.severity(Severity::Error)
.icon(IconName::XCircle)
.title("Very Long API Error Description")
.description_slot(
v_flex().gap_1().children(
[
"You exceeded your current quota.",
"For more information, visit the docs.",
"Error details:",
"• Quota exceeded for metric",
"• Limit: 0",
"• Model: gemini-3-pro",
"Please retry in 26.33s.",
"Additional details:",
"- Request ID: abc123def456",
"- Timestamp: 2024-01-15T10:30:00Z",
"- Region: us-central1",
"- Service: generativelanguage.googleapis.com",
"- Error Code: RESOURCE_EXHAUSTED",
"- Retry After: 26s",
"This error occurs when you have exceeded your API quota.",
]
.into_iter()
.map(|t| Label::new(t).size(LabelSize::Small).color(Color::Muted)),
),
)
.actions_slot(single_action())
.into_any_element(),
)
.width(px(580.)),
];
let severity_examples = vec![


@@ -1,9 +1,9 @@
use crate::{SuppressNotification, Toast, Workspace};
use anyhow::Context as _;
use gpui::{
AnyView, App, AppContext as _, AsyncWindowContext, ClickEvent, ClipboardItem, Context,
DismissEvent, Entity, EventEmitter, FocusHandle, Focusable, PromptLevel, Render, ScrollHandle,
Task, TextStyleRefinement, UnderlineStyle, svg,
AnyView, App, AppContext as _, AsyncWindowContext, ClickEvent, Context, DismissEvent, Entity,
EventEmitter, FocusHandle, Focusable, PromptLevel, Render, ScrollHandle, Task,
TextStyleRefinement, UnderlineStyle, svg,
};
use markdown::{Markdown, MarkdownElement, MarkdownStyle};
use parking_lot::Mutex;
@@ -13,7 +13,7 @@ use theme::ThemeSettings;
use std::ops::Deref;
use std::sync::{Arc, LazyLock};
use std::{any::TypeId, time::Duration};
use ui::{Tooltip, prelude::*};
use ui::{CopyButton, Tooltip, prelude::*};
use util::ResultExt;
#[derive(Default)]
@@ -308,16 +308,8 @@ impl Render for LanguageServerPrompt {
h_flex()
.gap_1()
.child(
IconButton::new("copy", IconName::Copy)
.on_click({
let message = request.message.clone();
move |_, _, cx| {
cx.write_to_clipboard(
ClipboardItem::new_string(message.clone()),
)
}
})
.tooltip(Tooltip::text("Copy Description")),
CopyButton::new(request.message.clone())
.tooltip_label("Copy Description"),
)
.child(
IconButton::new(close_id, close_icon)


@@ -3995,21 +3995,27 @@ impl BackgroundScanner {
.snapshot
.root_file_handle
.clone()
.and_then(|handle| handle.current_path(&self.fs).log_err())
.and_then(|handle| match handle.current_path(&self.fs) {
Ok(new_path) => Some(new_path),
Err(e) => {
log::error!("Failed to refresh worktree root path: {e:#}");
None
}
})
.map(|path| SanitizedPath::new_arc(&path))
.filter(|new_path| *new_path != root_path);
if let Some(new_path) = new_path {
log::info!(
"root renamed from {} to {}",
root_path.as_path().display(),
new_path.as_path().display()
"root renamed from {:?} to {:?}",
root_path.as_path(),
new_path.as_path(),
);
self.status_updates_tx
.unbounded_send(ScanState::RootUpdated { new_path })
.ok();
} else {
log::warn!("root path could not be canonicalized: {:#}", err);
log::error!("root path could not be canonicalized: {err:#}");
}
return;
}
@@ -4457,7 +4463,7 @@ impl BackgroundScanner {
Ok(Some(metadata)) => metadata,
Ok(None) => continue,
Err(err) => {
log::error!("error processing {child_abs_path:?}: {err:?}");
log::error!("error processing {child_abs_path:?}: {err:#}");
continue;
}
};


@@ -6,6 +6,7 @@ If your repository includes a `.devcontainer/devcontainer.json` file, Zed can op
## Requirements
- [(Preview Only)](https://zed.dev/releases/preview): Download and install the latest Zed Preview.
- Docker must be installed and available in your `PATH`. Zed requires the `docker` command to be present. If you use Podman, you can alias it to `docker` (e.g., `alias docker=podman`).
- Your project must contain a `.devcontainer/devcontainer.json` file.
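
For reference, a minimal `.devcontainer/devcontainer.json` might look like the following (the `name` and image are illustrative — any devcontainer base image works):

```json
{
  "name": "My Project",
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu"
}
```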


@@ -155,6 +155,7 @@ Some work out-of-the box and others rely on 3rd party extensions.
- [Swift](https://github.com/zed-extensions/swift)
- [Templ](https://github.com/makifdb/zed-templ)
- [Tmux](https://github.com/dangh/zed-tmux)
- [Tree-sitter Query](https://github.com/vitallium/zed-tree-sitter-query)
- [Twig](https://github.com/YussufSassi/zed-twig)
- [Typst](https://github.com/WeetHet/typst.zed)
- [Unison](https://github.com/zetashift/unison-zed)


@@ -1,6 +1,6 @@
[package]
name = "zed_glsl"
version = "0.1.0"
version = "0.2.0"
edition.workspace = true
publish.workspace = true
license = "Apache-2.0"


@@ -1,7 +1,7 @@
id = "glsl"
name = "GLSL"
description = "GLSL support."
version = "0.1.0"
version = "0.2.0"
schema_version = 1
authors = ["Mikayla Maki <mikayla@zed.dev>"]
repository = "https://github.com/zed-industries/zed"


@@ -0,0 +1,3 @@
("[" @open "]" @close)
("{" @open "}" @close)
("(" @open ")" @close)


@@ -10,6 +10,8 @@ on:
- main
jobs:
call_extension_tests:
permissions:
contents: read
uses: zed-industries/zed/.github/workflows/extension_tests.yml@main
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}pr


@@ -13,7 +13,9 @@ on:
workflow_dispatch: {}
jobs:
determine_bump_type:
runs-on: namespace-profile-16x32-ubuntu-2204
if: (github.repository_owner == 'zed-industries' || github.repository_owner == 'zed-extensions')
runs-on: namespace-profile-2x4-ubuntu-2404
permissions: {}
steps:
- id: get-bump-type
name: extensions::bump_version::get_bump_type
@@ -40,6 +42,11 @@ jobs:
needs:
- determine_bump_type
if: github.event.action != 'labeled' || needs.determine_bump_type.outputs.bump_type != 'patch'
permissions:
actions: write
contents: write
issues: write
pull-requests: write
uses: zed-industries/zed/.github/workflows/extension_bump.yml@main
secrets:
app-id: ${{ secrets.ZED_ZIPPY_APP_ID }}


@@ -7,6 +7,9 @@ on:
- v**
jobs:
call_release_version:
permissions:
contents: write
pull-requests: write
uses: zed-industries/zed/.github/workflows/extension_release.yml@main
secrets:
app-id: ${{ secrets.ZED_ZIPPY_APP_ID }}


@@ -12,6 +12,7 @@ mod danger;
mod extension_bump;
mod extension_release;
mod extension_tests;
mod extension_workflow_rollout;
mod extensions;
mod nix_build;
mod release_nightly;
@@ -39,16 +40,29 @@ impl WorkflowFile {
r#type: WorkflowType::Zed,
}
}
fn extension(f: fn() -> Workflow) -> WorkflowFile {
WorkflowFile {
source: f,
r#type: WorkflowType::Extensions,
r#type: WorkflowType::ExtensionCI,
}
}
fn extension_shared(f: fn() -> Workflow) -> WorkflowFile {
WorkflowFile {
source: f,
r#type: WorkflowType::ExtensionsShared,
}
}
fn generate_file(&self) -> Result<()> {
let workflow = (self.source)();
let workflow_folder = self.r#type.folder_path();
fs::create_dir_all(&workflow_folder).with_context(|| {
format!("Failed to create directory: {}", workflow_folder.display())
})?;
let workflow_name = workflow
.name
.as_ref()
@@ -71,9 +85,16 @@ impl WorkflowFile {
}
}
#[derive(PartialEq, Eq)]
enum WorkflowType {
/// Workflows living in the Zed repository
Zed,
Extensions,
/// Workflows living in the `zed-extensions/workflows` repository that are
/// required workflows for PRs to the extension organization
ExtensionCI,
/// Workflows living in each extension repository to perform checks and
/// version bumps until a better, more centralized system is in place.
}
impl WorkflowType {
@@ -84,7 +105,7 @@ impl WorkflowType {
"# Rebuild with `cargo xtask workflows`.",
),
workflow_name,
matches!(self, WorkflowType::Extensions)
(*self != WorkflowType::Zed)
.then_some(" within the Zed repository.")
.unwrap_or_default(),
)
@@ -93,7 +114,8 @@ impl WorkflowType {
fn folder_path(&self) -> PathBuf {
match self {
WorkflowType::Zed => PathBuf::from(".github/workflows"),
WorkflowType::Extensions => PathBuf::from("extensions/workflows"),
WorkflowType::ExtensionCI => PathBuf::from("extensions/workflows"),
WorkflowType::ExtensionsShared => PathBuf::from("extensions/workflows/shared"),
}
}
}
@@ -102,8 +124,6 @@ pub fn run_workflows(_: GenerateWorkflowArgs) -> Result<()> {
if !Path::new("crates/zed/").is_dir() {
anyhow::bail!("xtask workflows must be run from the project root");
}
let workflow_dir = Path::new(".github/workflows");
let extension_workflow_dir = Path::new("extensions/workflows");
let workflows = [
WorkflowFile::zed(danger::danger),
@@ -121,17 +141,13 @@ pub fn run_workflows(_: GenerateWorkflowArgs) -> Result<()> {
WorkflowFile::zed(extension_tests::extension_tests),
WorkflowFile::zed(extension_bump::extension_bump),
WorkflowFile::zed(extension_release::extension_release),
WorkflowFile::zed(extension_workflow_rollout::extension_workflow_rollout),
/* workflows used for CI/CD in extension repositories */
WorkflowFile::extension(extensions::run_tests::run_tests),
WorkflowFile::extension(extensions::bump_version::bump_version),
WorkflowFile::extension(extensions::release_version::release_version),
WorkflowFile::extension_shared(extensions::bump_version::bump_version),
WorkflowFile::extension_shared(extensions::release_version::release_version),
];
for directory in [&workflow_dir, &extension_workflow_dir] {
fs::create_dir_all(directory)
.with_context(|| format!("Failed to create directory: {}", directory.display()))?;
}
for workflow_file in workflows {
workflow_file.generate_file()?;
}


@@ -101,13 +101,14 @@ fn create_version_label(
app_id: &WorkflowSecret,
app_secret: &WorkflowSecret,
) -> NamedJob {
let (generate_token, generated_token) = generate_token(app_id, app_secret, None);
let (generate_token, generated_token) =
generate_token(&app_id.to_string(), &app_secret.to_string(), None);
let job = steps::dependant_job(dependencies)
.cond(Expression::new(format!(
"{DEFAULT_REPOSITORY_OWNER_GUARD} && github.event_name == 'push' && github.ref == 'refs/heads/main' && {} == 'false'",
needs_bump.expr(),
)))
.runs_on(runners::LINUX_LARGE)
.runs_on(runners::LINUX_SMALL)
.timeout_minutes(1u32)
.add_step(generate_token)
.add_step(steps::checkout_repo())
@@ -181,7 +182,8 @@ fn bump_extension_version(
app_id: &WorkflowSecret,
app_secret: &WorkflowSecret,
) -> NamedJob {
let (generate_token, generated_token) = generate_token(app_id, app_secret, None);
let (generate_token, generated_token) =
generate_token(&app_id.to_string(), &app_secret.to_string(), None);
let (bump_version, new_version) = bump_version(current_version, bump_type);
let job = steps::dependant_job(dependencies)
@@ -190,7 +192,7 @@ fn bump_extension_version(
force_bump.expr(),
needs_bump.expr(),
)))
.runs_on(runners::LINUX_LARGE)
.runs_on(runners::LINUX_SMALL)
.timeout_minutes(1u32)
.add_step(generate_token)
.add_step(steps::checkout_repo())
@@ -202,24 +204,37 @@ fn bump_extension_version(
}
pub(crate) fn generate_token(
app_id: &WorkflowSecret,
app_secret: &WorkflowSecret,
app_id_source: &str,
app_secret_source: &str,
repository_target: Option<RepositoryTarget>,
) -> (Step<Use>, StepOutput) {
let step = named::uses("actions", "create-github-app-token", "v2")
.id("generate-token")
.add_with(
Input::default()
.add("app-id", app_id.to_string())
.add("private-key", app_secret.to_string())
.add("app-id", app_id_source)
.add("private-key", app_secret_source)
.when_some(
repository_target,
|input,
RepositoryTarget {
owner,
repositories,
permissions,
}| {
input.add("owner", owner).add("repositories", repositories)
input
.add("owner", owner)
.add("repositories", repositories)
.when_some(permissions, |input, permissions| {
permissions
.into_iter()
.fold(input, |input, (permission, level)| {
input.add(
permission,
serde_json::to_value(&level).unwrap_or_default(),
)
})
})
},
),
);
@@ -295,6 +310,7 @@ fn create_pull_request(new_version: StepOutput, generated_token: StepOutput) ->
pub(crate) struct RepositoryTarget {
owner: String,
repositories: String,
permissions: Option<Vec<(String, Level)>>,
}
impl RepositoryTarget {
@@ -302,6 +318,14 @@ impl RepositoryTarget {
Self {
owner: owner.to_string(),
repositories: repositories.join("\n"),
permissions: None,
}
}
pub fn permissions(self, permissions: impl Into<Vec<(String, Level)>>) -> Self {
Self {
permissions: Some(permissions.into()),
..self
}
}
}


@@ -27,13 +27,16 @@ pub(crate) fn extension_release() -> Workflow {
fn create_release(app_id: &WorkflowSecret, app_secret: &WorkflowSecret) -> NamedJob {
let extension_registry = RepositoryTarget::new("zed-industries", &["extensions"]);
let (generate_token, generated_token) =
generate_token(&app_id, &app_secret, Some(extension_registry));
let (generate_token, generated_token) = generate_token(
&app_id.to_string(),
&app_secret.to_string(),
Some(extension_registry),
);
let (get_extension_id, extension_id) = get_extension_id();
let job = Job::default()
.with_repository_owner_guard()
.runs_on(runners::LINUX_LARGE)
.runs_on(runners::LINUX_SMALL)
.add_step(generate_token)
.add_step(checkout_repo())
.add_step(get_extension_id)


@@ -0,0 +1,187 @@
use gh_workflow::{
Event, Expression, Job, Level, Run, Step, Strategy, Use, Workflow, WorkflowDispatch,
};
use indoc::indoc;
use serde_json::json;
use crate::tasks::workflows::{
extension_bump::{RepositoryTarget, generate_token},
runners,
steps::{self, NamedJob, named},
vars::{self, StepOutput},
};
const EXCLUDED_REPOS: &[&str] = &["workflows", "material-icon-theme"];
pub(crate) fn extension_workflow_rollout() -> Workflow {
let fetch_repos = fetch_extension_repos();
let rollout_workflows = rollout_workflows_to_extension(&fetch_repos);
named::workflow()
.on(Event::default().workflow_dispatch(WorkflowDispatch::default()))
.add_env(("CARGO_TERM_COLOR", "always"))
.add_job(fetch_repos.name, fetch_repos.job)
.add_job(rollout_workflows.name, rollout_workflows.job)
}
fn fetch_extension_repos() -> NamedJob {
fn get_repositories() -> (Step<Use>, StepOutput) {
let exclusion_filter = EXCLUDED_REPOS
.iter()
.map(|repo| format!("repo.name !== '{}'", repo))
.collect::<Vec<_>>()
.join(" && ");
let step = named::uses("actions", "github-script", "v7")
.id("list-repos")
.add_with((
"script",
format!(
indoc! {r#"
const repos = await github.paginate(github.rest.repos.listForOrg, {{
org: 'zed-extensions',
type: 'public',
per_page: 100,
}});
const filteredRepos = repos
.filter(repo => !repo.archived)
.filter(repo => {})
.map(repo => repo.name);
console.log(`Found ${{filteredRepos.length}} extension repos`);
return filteredRepos;
"#},
exclusion_filter
),
))
.add_with(("result-encoding", "json"));
let filtered_repos = StepOutput::new(&step, "result");
(step, filtered_repos)
}
let (get_org_repositories, list_repos_output) = get_repositories();
let job = Job::default()
.runs_on(runners::LINUX_SMALL)
.timeout_minutes(5u32)
.outputs([("repos".to_owned(), list_repos_output.to_string())])
.add_step(get_org_repositories);
named::job(job)
}
fn rollout_workflows_to_extension(fetch_repos_job: &NamedJob) -> NamedJob {
fn checkout_zed_repo() -> Step<Use> {
steps::checkout_repo()
.name("checkout_zed_repo")
.add_with(("path", "zed"))
}
fn checkout_extension_repo(token: &StepOutput) -> Step<Use> {
steps::checkout_repo_with_token(token)
.add_with(("repository", "zed-extensions/${{ matrix.repo }}"))
.add_with(("path", "extension"))
}
fn copy_workflow_files() -> Step<Run> {
named::bash(indoc! {r#"
mkdir -p extension/.github/workflows
cp zed/extensions/workflows/shared/*.yml extension/.github/workflows/
"#})
}
fn get_short_sha() -> (Step<Run>, StepOutput) {
let step = named::bash(indoc! {r#"
echo "sha_short=$(git rev-parse --short HEAD)" >> "$GITHUB_OUTPUT"
"#})
.id("short-sha")
.working_directory("zed");
let step_output = StepOutput::new(&step, "sha_short");
(step, step_output)
}
fn create_pull_request(token: &StepOutput, short_sha: StepOutput) -> Step<Use> {
let title = format!("Update CI workflows to zed@{}", short_sha);
named::uses("peter-evans", "create-pull-request", "v7")
.add_with(("path", "extension"))
.add_with(("title", title.clone()))
.add_with((
"body",
indoc! {r#"
This PR updates the CI workflow files from the main Zed repository
based on the commit zed-industries/zed@${{ github.sha }}
"#},
))
.add_with(("commit-message", title))
.add_with(("branch", "update-workflows"))
.add_with((
"committer",
"zed-zippy[bot] <234243425+zed-zippy[bot]@users.noreply.github.com>",
))
.add_with((
"author",
"zed-zippy[bot] <234243425+zed-zippy[bot]@users.noreply.github.com>",
))
.add_with(("base", "main"))
.add_with(("delete-branch", true))
.add_with(("token", token.to_string()))
.add_with(("sign-commits", true))
.id("create-pr")
}
fn enable_auto_merge(token: &StepOutput) -> Step<gh_workflow::Run> {
named::bash(indoc! {r#"
PR_NUMBER="${{ steps.create-pr.outputs.pull-request-number }}"
if [ -n "$PR_NUMBER" ]; then
cd extension
gh pr merge "$PR_NUMBER" --auto --squash
fi
"#})
.add_env(("GH_TOKEN", token.to_string()))
}
let (authenticate, token) = generate_token(
vars::ZED_ZIPPY_APP_ID,
vars::ZED_ZIPPY_APP_PRIVATE_KEY,
Some(
RepositoryTarget::new("zed-extensions", &["${{ matrix.repo }}"]).permissions([
("permission-pull-requests".to_owned(), Level::Write),
("permission-contents".to_owned(), Level::Write),
("permission-workflows".to_owned(), Level::Write),
]),
),
);
let (calculate_short_sha, short_sha) = get_short_sha();
let job = Job::default()
.needs([fetch_repos_job.name.clone()])
.cond(Expression::new(format!(
"needs.{}.outputs.repos != '[]'",
fetch_repos_job.name
)))
.runs_on(runners::LINUX_SMALL)
.timeout_minutes(10u32)
.strategy(
Strategy::default()
.fail_fast(false)
.max_parallel(5u32)
.matrix(json!({
"repo": format!("${{{{ fromJson(needs.{}.outputs.repos) }}}}", fetch_repos_job.name)
})),
)
.add_step(authenticate)
.add_step(checkout_zed_repo())
.add_step(checkout_extension_repo(&token))
.add_step(copy_workflow_files())
.add_step(calculate_short_sha)
.add_step(create_pull_request(&token, short_sha))
.add_step(enable_auto_merge(&token));
named::job(job)
}


@@ -1,13 +1,13 @@
use gh_workflow::{
Event, Expression, Input, Job, PullRequest, PullRequestType, Push, Run, Step, UsesJob,
Workflow, WorkflowDispatch,
Event, Expression, Input, Job, Level, Permissions, PullRequest, PullRequestType, Push, Run,
Step, UsesJob, Workflow, WorkflowDispatch,
};
use indexmap::IndexMap;
use indoc::indoc;
use crate::tasks::workflows::{
runners,
steps::{NamedJob, named},
steps::{CommonJobConditions, NamedJob, named},
vars::{self, JobOutput, StepOutput, one_workflow_per_non_main_branch_and_token},
};
@@ -40,6 +40,13 @@ pub(crate) fn call_bump_version(
"github.event.action != 'labeled' || {} != 'patch'",
bump_type.expr()
)))
.permissions(
Permissions::default()
.contents(Level::Write)
.issues(Level::Write)
.pull_requests(Level::Write)
.actions(Level::Write),
)
.uses(
"zed-industries",
"zed",
@@ -66,7 +73,9 @@ pub(crate) fn call_bump_version(
fn determine_bump_type() -> (NamedJob, StepOutput) {
let (get_bump_type, output) = get_bump_type();
let job = Job::default()
.runs_on(runners::LINUX_DEFAULT)
.with_repository_owner_guard()
.permissions(Permissions::default())
.runs_on(runners::LINUX_SMALL)
.add_step(get_bump_type)
.outputs([(output.name.to_owned(), output.to_string())]);
(named::job(job), output)


@@ -1,4 +1,4 @@
use gh_workflow::{Event, Job, Push, UsesJob, Workflow};
use gh_workflow::{Event, Job, Level, Permissions, Push, UsesJob, Workflow};
use crate::tasks::workflows::{
extensions::WithAppSecrets,
@@ -14,6 +14,11 @@ pub(crate) fn release_version() -> Workflow {
pub(crate) fn call_release_version() -> NamedJob<UsesJob> {
let job = Job::default()
.permissions(
Permissions::default()
.contents(Level::Write)
.pull_requests(Level::Write),
)
.uses(
"zed-industries",
"zed",


@@ -1,4 +1,4 @@
use gh_workflow::{Event, Job, PullRequest, Push, UsesJob, Workflow};
use gh_workflow::{Event, Job, Level, Permissions, PullRequest, Push, UsesJob, Workflow};
use crate::tasks::workflows::{
steps::{NamedJob, named},
@@ -16,12 +16,14 @@ pub(crate) fn run_tests() -> Workflow {
}
pub(crate) fn call_extension_tests() -> NamedJob<UsesJob> {
let job = Job::default().uses(
"zed-industries",
"zed",
".github/workflows/extension_tests.yml",
"main",
);
let job = Job::default()
.permissions(Permissions::default().contents(Level::Read))
.uses(
"zed-industries",
"zed",
".github/workflows/extension_tests.yml",
"main",
);
named::job(job)
}