Compare commits

...

58 Commits

Author SHA1 Message Date
Zed Bot
d846d45519 Bump to 0.211.5 for @smitbarmase 2025-11-06 15:05:46 +00:00
Smit Barmase
f70f507f61 language: Fix completion menu no longer prioritizes relevant items for Typescript and Python (#42065)
Closes #41672

Regressed in https://github.com/zed-industries/zed/pull/40242

Release Notes:

- Fixed an issue where the completion menu no longer prioritized relevant
items for TypeScript and Python.
2025-11-06 13:37:06 +05:30
Anthony Eid
e517719132 AI: Fix Github Copilot edit predictions failing to start (#41934)
Closes #41457 #41806 #41801

Copilot started using the `node:sqlite` module, which is an experimental
feature in Node v22-v23 (stable in v24). The fix is to pass the
experimental flag when Zed starts the Copilot LSP.

I tested this with v20.19.5 and v24.11.0. The fix got v20.19 working and
didn't affect v24.11 which was already working.

Release Notes:

- AI: Fix Github Copilot edit predictions failing to start
2025-11-06 13:30:28 +05:30
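The version gating described above can be sketched as follows. This is a hypothetical helper (`copilot_node_flags`), not Zed's actual code, and the real fix may simply pass the flag unconditionally:

```rust
// Sketch only (hypothetical helper, not Zed's implementation): choose the
// extra Node.js flag for the Copilot language server based on the Node
// major version. `node:sqlite` is experimental in v22-v23, so it must be
// enabled explicitly there; it is stable from v24 onward.
fn copilot_node_flags(node_major: u32) -> Vec<&'static str> {
    if (22..24).contains(&node_major) {
        vec!["--experimental-sqlite"]
    } else {
        Vec::new()
    }
}

fn main() {
    assert_eq!(copilot_node_flags(23), vec!["--experimental-sqlite"]);
    assert!(copilot_node_flags(24).is_empty());
    println!("flags for v23: {:?}", copilot_node_flags(23));
}
```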
Richard Feldman
74efd3adbc Run ACP login from same cwd as agent server (#42038)
This makes it possible to do login via things like `cmd: "node", args:
["my-node-file.js", "login"]`

Also, that command will now use Zed's managed `node` instance.

Release Notes:

- ACP extensions can now run terminal login commands using relative
paths
2025-11-06 08:11:16 +01:00
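A minimal sketch of the cwd idea, assuming a hypothetical `login_command` helper (not Zed's actual code): relative arguments such as `my-node-file.js` resolve against the agent server's working directory rather than wherever Zed was launched.

```rust
use std::path::Path;
use std::process::Command;

// Sketch (hypothetical helper): build the ACP login command with the agent
// server's working directory so relative paths in `args` resolve there.
fn login_command(program: &str, args: &[&str], agent_cwd: &Path) -> Command {
    let mut cmd = Command::new(program);
    cmd.args(args).current_dir(agent_cwd);
    cmd
}

fn main() {
    let cmd = login_command("node", &["my-node-file.js", "login"], Path::new("/srv/agent"));
    // The command will be spawned from the agent's directory, not Zed's.
    assert_eq!(cmd.get_current_dir(), Some(Path::new("/srv/agent")));
}
```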
Danilo Leal
9f1a9016b6 agent_ui: Fix how icons from external agents are displayed (#42034)
Release Notes:

- N/A
2025-11-06 08:07:59 +01:00
Danilo Leal
506f333ce1 gpui: Add support for rendering SVG from external files (#42024)
Release Notes:

- N/A

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-11-06 08:06:59 +01:00
Finn Evers
214a0bc116 svg_preview: Update preview on every buffer edit (#41270)
Closes https://github.com/zed-industries/zed/issues/39104

In the process, this also fixes an issue where the preview would not work
for remote buffers.

Release Notes:

- Fixed an issue where the SVG preview would not work in remote
scenarios.
- The SVG preview will now rerender on every keypress instead of only on
saves.
2025-11-06 08:06:54 +01:00
Conrad Irwin
d461acbc7b Revert "Don't draft release notes"
This reverts commit 62ece18dfe.
2025-11-05 23:51:19 -07:00
zed-zippy[bot]
7acefd50cc Refresh zed.dev releases page after releases (#42060) (cherry-pick to stable) (#42064)
Cherry-pick of #42060 to stable

----
Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-06 06:49:23 +00:00
Conrad Irwin
62ece18dfe Don't draft release notes 2025-11-05 10:45:11 -07:00
Joseph T. Lyons
4475689e4c v0.211.x stable 2025-11-05 12:07:10 -05:00
Ben Kunkle
5ea5b5e7a9 Fix integer underflow in autosave mode after delay in the settings (cherry-pick #41898) (#42013)
Closes #41774 

Release Notes:

- settings_ui: Fixed an integer underflow panic when attempting to hit
the `-` sign on settings item that take delays in milliseconds

Co-authored-by: Ignasius <96295999+ignasius-j-s@users.noreply.github.com>
2025-11-05 11:30:12 -05:00
Richard Feldman
d93f528a37 Use our node runtime for ACP extensions (#41955)
Release Notes:

- ACP extensions now use Zed's managed Node.js runtime

---------

Co-authored-by: Mikayla Maki <mikayla.c.maki@gmail.com>
2025-11-05 12:24:23 +01:00
Richard Feldman
44140197a6 Add ACP terminal-login via _meta field (#41954)
As discussed with @benbrandt and @mikayla-maki:

* We now tell ACP clients we support the nonstandard `terminal-auth`
`_meta` field for terminal-based authentication
* In the future, we anticipate ACP itself supporting *some* form of
terminal-based authentication, but that hasn't been designed yet or gone
through the RFD process
* For now, this unblocks terminal-based auth

Release Notes:

- Added experimental terminal-based authentication to ACP support
2025-11-05 11:40:11 +01:00
Lukas Wirth
2e746791b1 project: Fetch latest lsp data in deduplicate_range_based_lsp_requests 2025-11-05 09:24:05 +01:00
Lukas Wirth
a3f230f760 zed: Reduce number of rayon threads, spawn with bigger stacks (#41812)
We already do this for the cli and remote server but forgot to do so for
the main binary

Release Notes:

- N/A
2025-11-05 09:13:16 +01:00
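The thread-pool change can be illustrated with plain std threads (illustrative only; Zed configures rayon, and the worker count and stack size below are made up for the sketch): spawn a capped number of workers, each with a larger stack, instead of one default-stack thread per core.

```rust
use std::thread;

// Sketch (std threads, not Zed's actual rayon setup): spawn a fixed number
// of worker threads with an enlarged stack and collect their results.
fn run_workers(workers: usize) -> Vec<usize> {
    let handles: Vec<_> = (0..workers)
        .map(|i| {
            thread::Builder::new()
                .name(format!("worker-{i}"))
                // 8 MiB instead of the platform default stack size.
                .stack_size(8 * 1024 * 1024)
                .spawn(move || i * 2)
                .expect("failed to spawn worker")
        })
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    assert_eq!(run_workers(2), vec![0, 2]);
}
```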
zed-zippy[bot]
ff0eef98c9 Fetch (just) enough refs in script/cherry-pick (#41949) (cherry-pick to preview) (#41951)
Cherry-pick of #41949 to preview

----
Before this change we'd download all the tagged commits, but none of
their ancestors. This was slow and made cherry-picking fail.

Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-04 19:45:56 -07:00
Zed Bot
712f75ddb4 Bump to 0.211.4 for @ConradIrwin 2025-11-05 00:15:16 +00:00
Conrad Irwin
74031e2243 More tweaks to CI pipeline (#41941)
Closes #ISSUE

Release Notes:

- N/A
2025-11-04 16:59:39 -07:00
zed-zippy[bot]
8ea85828f5 lsp: Fix dynamic registration of document diagnostics (#41929) (cherry-pick to preview) (#41947)
Cherry-pick of #41929 to preview

----
- lsp: Fix dynamic registration of diagnostic capabilities not taking
effect when an initial capability is not specified.
The gist of the issue is the use of `.get_mut` instead of `.entry`: if no
dynamic capability had been created beforehand, we would miss the
registration entirely.

- **Determine whether to update remote caps in a smarter manner**

Release Notes:

- Fixed document diagnostics with Ty language server.

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2025-11-04 23:33:47 +00:00
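The `.get_mut` vs `.entry` difference behind that diagnostics bug can be sketched with hypothetical types (not the actual LSP capability store):

```rust
use std::collections::HashMap;

// With `get_mut`, a registration arriving before any capability entry
// exists is silently dropped; the `if let` body never runs.
fn register_with_get_mut(caps: &mut HashMap<String, Vec<String>>, server: &str, cap: &str) {
    if let Some(list) = caps.get_mut(server) {
        list.push(cap.to_string()); // never runs when the key is missing
    }
}

// With `entry(..).or_default()`, the entry is created on demand and the
// registration is recorded either way.
fn register_with_entry(caps: &mut HashMap<String, Vec<String>>, server: &str, cap: &str) {
    caps.entry(server.to_string()).or_default().push(cap.to_string());
}

fn main() {
    let mut lost = HashMap::new();
    register_with_get_mut(&mut lost, "ty", "diagnostics");
    assert!(lost.get("ty").is_none()); // registration silently lost

    let mut kept = HashMap::new();
    register_with_entry(&mut kept, "ty", "diagnostics");
    assert_eq!(kept["ty"], vec!["diagnostics".to_string()]);
}
```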
zed-zippy[bot]
7deac6cd2c Improve compare_perf.yml, cherry_pick.yml (#41606) (cherry-pick) (#41946)
Cherry-pick of #41606

----
Release Notes:

- N/A

---------

Co-authored-by: Nia Espera <nia@zed.dev>

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: Nia Espera <nia@zed.dev>
2025-11-04 23:01:35 +00:00
zed-zippy[bot]
03a0f9b944 Shell out to real tar in extension builder (#41856) (cherry-pick) (#41945)
Cherry-pick of #41856

----
We see `test_extension_store_with_test_extension` hang in untarring the
WASI SDK some times.

In lieu of trying to debug the problem, let's try shelling out for now
in the hope that the test becomes more reliable.

There's a bit of risk here because we're using async-tar for other
things (but probably not 300Mb tar files...)

Assisted-By: Zed AI

Closes #ISSUE

Release Notes:

- N/A

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-11-04 22:39:24 +00:00
Conrad Irwin
c374532318 settings_ui: Fix dropdowns after #41036 (#41920) (cherry-pick) (#41930)
Cherry-pick of #41920

----
Closes #41533

Both of the issues in the release notes fixed by this PR were caused by
incorrect usage of the `window.use_state` API.

The first issue was caused by calling `window.use_state` in a render
helper, so the element ID used to share state was the same across
different pages and the state was re-used when it should have been
re-created. The fix was to move the `window.use_state` call (and the
rendering logic) into an `impl RenderOnce` component, so that the IDs are
resolved during the render, avoiding the state conflicts.

The second issue was caused by using a `move` closure in the
`window.use_state` call, resulting in stale captured values when the
window state is re-used.

Release Notes:

- settings_ui: Fixed an issue where some dropdown menus would show
options from a different dropdown when clicked
- settings_ui: Fixed an issue where attempting to change a setting in a
dropdown back to its original value after changing it would do nothing

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-11-04 14:01:06 -05:00
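The stale-closure failure mode from that fix can be illustrated in plain Rust (this is not the gpui `use_state` API, just the capture semantics): a `move` closure that copies a value at creation time keeps serving that stale copy, while a closure reading shared state sees updates.

```rust
use std::cell::Cell;
use std::rc::Rc;

// Returns (stale closure's view, live closure's view) after an update.
fn snapshot_vs_live() -> (i32, i32) {
    let state = Rc::new(Cell::new(1));

    let snapshot = state.get();
    let stale = move || snapshot; // captured the value once, at creation

    let live_state = Rc::clone(&state);
    let live = move || live_state.get(); // reads the current state each call

    state.set(2);
    (stale(), live())
}

fn main() {
    // The stale closure still reports 1; the live one sees the update to 2.
    assert_eq!(snapshot_vs_live(), (1, 2));
}
```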
Zed Bot
a677ecc889 Bump to 0.211.3 for @ConradIrwin 2025-11-04 05:35:12 +00:00
Conrad Irwin
59213b26c9 Fix merge conflict (#41853)
Closes #ISSUE

Release Notes:

- N/A
2025-11-03 22:31:09 -07:00
Piotr Osiewicz
2b901ad3ae ci: Enable namespace caching for Linux workers (#41652)
Release Notes:

- N/A
2025-11-03 22:30:51 -07:00
Conrad Irwin
61dddc4d65 Re-use the existing bundle steps for nightly too (#41699)
One of the reasons we didn't spot that we were missing the telemetry env
vars for the production builds was that nightly (which was working) had
its own set of build steps. This re-uses those and pushes the env vars
down from the workflow to the job.

It also fixes nightly releases to upload everything in one go so that all
platforms update in sync.

Closes #41655

Release Notes:

- N/A
2025-11-03 22:30:45 -07:00
Ben Kunkle
14281179f2 gh-workflow unit evals (#41637)
Closes #ISSUE

Release Notes:

- N/A
2025-11-03 22:30:35 -07:00
Conrad Irwin
2f3c2082f4 Fix branch diff hunk expansion (#41873)
Closes #ISSUE

Release Notes:

- (preview only) Fixed a bug where hunks were not expanded when viewing
a branch diff
2025-11-03 22:30:16 -07:00
Kirill Bulatov
f8979f1ed7 Do not pull diagnostics when those are disabled (#41865)
Based on 

[hang.log](https://github.com/user-attachments/files/23319081/hang.log)


Release Notes:

- N/A
2025-11-04 00:43:07 +02:00
Kirill Bulatov
921be53241 Remove incorrectly added test
During cherry-picking of https://github.com/zed-industries/zed/pull/41859, one test was incorrectly merged in.
The test was only added in https://github.com/zed-industries/zed/pull/41342, which is not cherry-picked, hence it will fail.
2025-11-04 00:38:38 +02:00
Kirill Bulatov
af2d462bf7 Fix racy inlay hints queries (#41816)
Follow-up of https://github.com/zed-industries/zed/pull/40183

Release Notes:

- (Preview only) Fixed inlay hints duplicating when multiple editors are
open for the same buffer

---------

Co-authored-by: Lukas Wirth <lukas@zed.dev>
2025-11-04 00:17:00 +02:00
Kirill Bulatov
713903fe76 Fix incorrect search ranges when rendering search matches in the outline panel (#41859)
Closes https://github.com/zed-industries/zed/issues/41792

Release Notes:

- Fixed outline panel panicking when rendering certain search matches
2025-11-04 00:16:44 +02:00
Lukas Wirth
f4c077bdcf Remove unused import 2025-11-03 22:13:40 +01:00
Lukas Wirth
48fdd8893f file_finder: Fix highlighting panic in open path prompt (#41808)
Closes https://github.com/zed-industries/zed/issues/41249

Couldn't quite come up with a test case here but verified it works.

Release Notes:

- Fixed a panic in file finder when deleting characters
2025-11-03 22:02:10 +01:00
Lukas Wirth
93495542de editor: Fix refresh_linked_ranges panics due to old snapshot use (#41657)
Fixes ZED-29Z

Release Notes:

- Fixed panic in `refresh_linked_ranges`
2025-11-03 22:01:54 +01:00
Finn Evers
311abd0e1b extension_host: Do not try auto installing suppressed extensions (#41551)
Release Notes:

- Fixed an issue where Zed would try to install extensions specified
under `auto_install_extensions` which were moved into core.
2025-11-03 21:34:28 +01:00
Lukas Wirth
c5e298b5d8 Revert "sum_tree: Replace rayon with futures (#41586)"
This reverts commit f2ce06c7b0.
2025-11-03 20:02:37 +01:00
Zed Bot
ec813e9833 Bump to 0.211.2 for @ConradIrwin 2025-11-01 05:00:53 +00:00
Conrad Irwin
68063eaa45 Fix telemetry in release builds (#41695)
This was inadvertently broken in v0.211.1-pre when we rewrote the
release build

Release Notes:

- N/A
2025-10-31 22:56:18 -06:00
Conrad Irwin
7dadd4a240 Delete old ci.yml (#41668)
The new one is much better

Release Notes:

- N/A
2025-10-31 22:55:54 -06:00
Ben Kunkle
4f8c3782cc Fix release.yml workflow (#41675)
Closes #ISSUE

Release Notes:

- N/A
2025-10-31 16:31:11 -04:00
Conrad Irwin
6eb3cdbf20 blerg 2025-10-31 13:11:39 -06:00
Conrad Irwin
8677bb57f0 Skip release notes for a hot second 2025-10-31 13:07:53 -06:00
Zed Bot
45accd931e Bump to 0.211.1 for @ConradIrwin 2025-10-31 18:32:04 +00:00
Conrad Irwin
671c4eb133 Delete old ci.yml 2025-10-31 12:27:21 -06:00
Ben Kunkle
011e3c155b gh-workflow release (#41502)
Closes #ISSUE

Rewrite our release pipeline to be generated by `gh-workflow`

Release Notes:

- N/A
2025-10-31 12:26:32 -06:00
claytonrcarter
3eb9d7765a bundle: Restore local install on macOS (#41482)
I just pulled and ran a local build via `script/bundle-mac -l -i` but
found that the resulting bundle wasn't installed as expected. (me:
"ToggleAllDocks!! Wait! Where is it?!") Looking into, it looks like the
`-l` flag was removed in #41392, leaving the `$local_only` var orphaned,
which then left the `-i/$local_install` flag unreachable. I suspect that
this was unintentional, so this PR re-adds the `-l/$local_only` flag to
`script/bundle-mac`.

I ran the build again and confirmed that local install seemed to work as
expected. (ie "ToggleAllDocks!! 🎉")

While here, I also removed the last reference to `$local_arch`, because
all other references to that were removed in #41392.

/cc @osiewicz 

Release Notes:

- N/A

---------

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-10-31 12:26:19 -06:00
Conrad Irwin
d6c49fbc3c Add a no-op compare_perf workflow (#41605)
Testing PR for @zed-zippy

Release Notes:

- N/A
2025-10-31 12:24:42 -06:00
Conrad Irwin
7bf3f92a0a Don't skip tests in nightly release (#41573)
Release Notes:

- N/A
2025-10-31 12:24:25 -06:00
Conrad Irwin
2751512ae8 Use gh-workflow for tests (take 2) (#41420)
This re-implements the reverted commit 8b051d6cc3.

Closes #ISSUE

Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-10-31 12:09:43 -06:00
Conrad Irwin
032ab87cbe Generate dwarf files for builds again (#41651)
Closes #ISSUE

Release Notes:

- N/A
2025-10-31 11:03:48 -06:00
Lukas Wirth
dbabcfcac8 sum_tree: Replace rayon with futures (#41586)
Release Notes:

- N/A

Co-authored-by: Kate <kate@zed.dev>
2025-10-31 12:20:42 +01:00
Bennet Fenner
9436818467 agent_ui: Fix agent: Chat with follow not working (#41581)
Release Notes:

- Fixed an issue where `agent: Chat with follow` was not working anymore

Co-authored-by: Ben Brandt <benjamin.j.brandt@gmail.com>
2025-10-30 17:17:06 +01:00
Lukas Wirth
501f78aacb project: Fix inlay hints duplicating on chunk start (#41461)
Release Notes:

- N/A
2025-10-29 16:23:03 +01:00
Finn Evers
0d5bfaf7b4 Revert "Support relative line number on wrapped lines (#39268)" (#41450)
Closes #41422

This completely broke line numbering, as described in the linked issue,
and scrolling up no longer shows the correct numbers.

Release Notes:

- NOTE: The `relative_line_numbers` change
(https://github.com/zed-industries/zed/pull/39268) was reverted and did
not make the release cut!
2025-10-29 16:22:00 +01:00
Finn Evers
b1935b95a1 ui: Don't show scrollbar track in too many cases (#41455)
Follow-up to https://github.com/zed-industries/zed/pull/41354 which
introduced a small regression.

Release Notes:

- N/A
2025-10-29 16:22:00 +01:00
Conrad Irwin
f98c12513b v0.211.x preview 2025-10-28 19:43:56 -06:00
101 changed files with 5464 additions and 3261 deletions

69
.github/workflows/after_release.yml vendored Normal file

@@ -0,0 +1,69 @@
# Generated from xtask::workflows::after_release
# Rebuild with `cargo xtask workflows`.
name: after_release
on:
release:
types:
- published
jobs:
rebuild_releases_page:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: after_release::rebuild_releases_page
run: 'curl https://zed.dev/api/revalidate-releases -H "Authorization: Bearer ${RELEASE_NOTES_API_TOKEN}"'
shell: bash -euxo pipefail {0}
env:
RELEASE_NOTES_API_TOKEN: ${{ secrets.RELEASE_NOTES_API_TOKEN }}
post_to_discord:
needs:
- rebuild_releases_page
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: get-release-url
name: after_release::post_to_discord::get_release_url
run: |
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
URL="https://zed.dev/releases/preview"
else
URL="https://zed.dev/releases/stable"
fi
echo "URL=$URL" >> "$GITHUB_OUTPUT"
shell: bash -euxo pipefail {0}
- id: get-content
name: after_release::post_to_discord::get_content
uses: 2428392/gh-truncate-string-action@b3ff790d21cf42af3ca7579146eedb93c8fb0757
with:
stringToTruncate: |
📣 Zed [${{ github.event.release.tag_name }}](<${{ steps.get-release-url.outputs.URL }}>) was just released!
${{ github.event.release.body }}
maxLength: 2000
truncationSymbol: '...'
- name: after_release::post_to_discord::discord_webhook_action
uses: tsickert/discord-webhook@c840d45a03a323fbc3f7507ac7769dbd91bfb164
with:
webhook-url: ${{ secrets.DISCORD_WEBHOOK_RELEASE_NOTES }}
content: ${{ steps.get-content.outputs.string }}
publish_winget:
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- id: set-package-name
name: after_release::publish_winget::set_package_name
run: |
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
PACKAGE_NAME=ZedIndustries.Zed.Preview
else
PACKAGE_NAME=ZedIndustries.Zed
fi
echo "PACKAGE_NAME=$PACKAGE_NAME" >> "$GITHUB_OUTPUT"
shell: bash -euxo pipefail {0}
- name: after_release::publish_winget::winget_releaser
uses: vedantmgoyal9/winget-releaser@19e706d4c9121098010096f9c495a70a7518b30f
with:
identifier: ${{ steps.set-package-name.outputs.PACKAGE_NAME }}
max-versions-to-keep: 5
token: ${{ secrets.WINGET_TOKEN }}

39
.github/workflows/cherry_pick.yml vendored Normal file

@@ -0,0 +1,39 @@
# Generated from xtask::workflows::cherry_pick
# Rebuild with `cargo xtask workflows`.
name: cherry_pick
on:
workflow_dispatch:
inputs:
commit:
description: commit
required: true
type: string
branch:
description: branch
required: true
type: string
channel:
description: channel
required: true
type: string
jobs:
run_cherry_pick:
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- id: get-app-token
name: cherry_pick::run_cherry_pick::authenticate_as_zippy
uses: actions/create-github-app-token@bef1eaf1c0ac2b148ee2a0a74c65fbe6db0631f1
with:
app-id: ${{ secrets.ZED_ZIPPY_APP_ID }}
private-key: ${{ secrets.ZED_ZIPPY_APP_PRIVATE_KEY }}
- name: cherry_pick::run_cherry_pick::cherry_pick
run: ./script/cherry-pick ${{ inputs.branch }} ${{ inputs.commit }} ${{ inputs.channel }}
shell: bash -euxo pipefail {0}
env:
GIT_COMMITTER_NAME: Zed Zippy
GIT_COMMITTER_EMAIL: hi@zed.dev
GITHUB_TOKEN: ${{ steps.get-app-token.outputs.token }}

.github/workflows/ci.yml vendored

@@ -1,855 +0,0 @@
name: CI
on:
push:
branches:
- main
- "v[0-9]+.[0-9]+.x"
tags:
- "v*"
pull_request:
branches:
- "**"
concurrency:
# Allow only one workflow per any non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
jobs:
job_spec:
name: Decide which jobs to run
if: github.repository_owner == 'zed-industries'
outputs:
run_tests: ${{ steps.filter.outputs.run_tests }}
run_license: ${{ steps.filter.outputs.run_license }}
run_docs: ${{ steps.filter.outputs.run_docs }}
run_nix: ${{ steps.filter.outputs.run_nix }}
run_actionlint: ${{ steps.filter.outputs.run_actionlint }}
runs-on:
- namespace-profile-2x4-ubuntu-2404
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# 350 is arbitrary; ~10days of history on main (5secs); full history is ~25secs
fetch-depth: ${{ github.ref == 'refs/heads/main' && 2 || 350 }}
- name: Fetch git history and generate output filters
id: filter
run: |
if [ -z "$GITHUB_BASE_REF" ]; then
echo "Not in a PR context (i.e., push to main/stable/preview)"
COMPARE_REV="$(git rev-parse HEAD~1)"
else
echo "In a PR context comparing to pull_request.base.ref"
git fetch origin "$GITHUB_BASE_REF" --depth=350
COMPARE_REV="$(git merge-base "origin/${GITHUB_BASE_REF}" HEAD)"
fi
CHANGED_FILES="$(git diff --name-only "$COMPARE_REV" ${{ github.sha }})"
# Specify anything which should potentially skip full test suite in this regex:
# - docs/
# - script/update_top_ranking_issues/
# - .github/ISSUE_TEMPLATE/
# - .github/workflows/ (except .github/workflows/ci.yml)
SKIP_REGEX='^(docs/|script/update_top_ranking_issues/|\.github/(ISSUE_TEMPLATE|workflows/(?!ci)))'
echo "$CHANGED_FILES" | grep -qvP "$SKIP_REGEX" && \
echo "run_tests=true" >> "$GITHUB_OUTPUT" || \
echo "run_tests=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^docs/' && \
echo "run_docs=true" >> "$GITHUB_OUTPUT" || \
echo "run_docs=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^\.github/(workflows/|actions/|actionlint.yml)' && \
echo "run_actionlint=true" >> "$GITHUB_OUTPUT" || \
echo "run_actionlint=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^(Cargo.lock|script/.*licenses)' && \
echo "run_license=true" >> "$GITHUB_OUTPUT" || \
echo "run_license=false" >> "$GITHUB_OUTPUT"
echo "$CHANGED_FILES" | grep -qP '^(nix/|flake\.|Cargo\.|rust-toolchain.toml|\.cargo/config.toml)' && \
echo "$GITHUB_REF_NAME" | grep -qvP '^v[0-9]+\.[0-9]+\.[0-9x](-pre)?$' && \
echo "run_nix=true" >> "$GITHUB_OUTPUT" || \
echo "run_nix=false" >> "$GITHUB_OUTPUT"
migration_checks:
name: Check Postgres and Protobuf migrations, mergability
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
timeout-minutes: 60
runs-on:
- self-mini-macos
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
fetch-depth: 0 # fetch full history
- name: Remove untracked files
run: git clean -df
- name: Find modified migrations
shell: bash -euxo pipefail {0}
run: |
export SQUAWK_GITHUB_TOKEN=${{ github.token }}
. ./script/squawk
- name: Ensure fresh merge
shell: bash -euxo pipefail {0}
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
- uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- uses: bufbuild/buf-breaking-action@v1
with:
input: "crates/proto/proto/"
against: "https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/"
style:
timeout-minutes: 60
name: Check formatting and spelling
needs: [job_spec]
if: github.repository_owner == 'zed-industries'
runs-on:
- namespace-profile-4x8-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:
version: 9
- name: Prettier Check on /docs
working-directory: ./docs
run: |
pnpm dlx "prettier@${PRETTIER_VERSION}" . --check || {
echo "To fix, run from the root of the Zed repo:"
echo " cd docs && pnpm dlx prettier@${PRETTIER_VERSION} . --write && cd .."
false
}
env:
PRETTIER_VERSION: 3.5.0
- name: Prettier Check on default.json
run: |
pnpm dlx "prettier@${PRETTIER_VERSION}" assets/settings/default.json --check || {
echo "To fix, run from the root of the Zed repo:"
echo " pnpm dlx prettier@${PRETTIER_VERSION} assets/settings/default.json --write"
false
}
env:
PRETTIER_VERSION: 3.5.0
# Ensure that todo! and FIXME comments are revisited before merging.
- name: Check for todo! and FIXME comments
run: script/check-todos
- name: Check modifier use in keymaps
run: script/check-keymaps
- name: Run style checks
uses: ./.github/actions/check_style
- name: Check for typos
uses: crate-ci/typos@80c8a4945eec0f6d464eaf9e65ed98ef085283d1 # v1.38.1
with:
config: ./typos.toml
check_docs:
timeout-minutes: 60
name: Check docs
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
(needs.job_spec.outputs.run_tests == 'true' || needs.job_spec.outputs.run_docs == 'true')
runs-on:
- namespace-profile-8x16-ubuntu-2204
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Build docs
uses: ./.github/actions/build_docs
actionlint:
runs-on: namespace-profile-2x4-ubuntu-2404
if: github.repository_owner == 'zed-industries' && needs.job_spec.outputs.run_actionlint == 'true'
needs: [job_spec]
steps:
- uses: actions/checkout@v4
- name: Download actionlint
id: get_actionlint
run: bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)
shell: bash
- name: Check workflow files
run: ${{ steps.get_actionlint.outputs.executable }} -color
shell: bash
macos_tests:
timeout-minutes: 60
name: (macOS) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- self-mini-macos
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Check that Cargo.lock is up to date
run: |
cargo update --locked --workspace
- name: cargo clippy
run: ./script/clippy
- name: Install cargo-machete
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386 # v2
with:
command: install
args: cargo-machete@0.7.0
- name: Check unused dependencies
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386 # v2
with:
command: machete
- name: Check licenses
run: |
script/check-licenses
if [[ "${{ needs.job_spec.outputs.run_license }}" == "true" ]]; then
script/generate-licenses /tmp/zed_licenses_output
fi
- name: Check for new vulnerable dependencies
if: github.event_name == 'pull_request'
uses: actions/dependency-review-action@67d4f4bd7a9b17a0db54d2a7519187c65e339de8 # v4
with:
license-check: false
- name: Run tests
uses: ./.github/actions/run_tests
- name: Build collab
run: cargo build -p collab
- name: Build other binaries and features
run: |
cargo build --workspace --bins --all-features
cargo check -p gpui --features "macos-blade"
cargo check -p workspace
cargo build -p remote_server
cargo check -p gpui --examples
# Since the macOS runners are stateful, we need to remove the config file to prevent potential bugs.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
linux_tests:
timeout-minutes: 60
name: (Linux) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: cargo clippy
run: ./script/clippy
- name: Run tests
uses: ./.github/actions/run_tests
- name: Build other binaries and features
run: |
cargo build -p zed
cargo check -p workspace
cargo check -p gpui --examples
# Even though the Linux runner is not stateful and in theory there is no need for this cleanup,
# we do it anyway: if we later switch to a stateful Linux runner and forget to add cleanup
# code, the leftover config file could cause issues. It's not strictly necessary right now,
# but it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
doctests:
# Nextest currently doesn't support doctests, so run them separately and in parallel.
timeout-minutes: 60
name: (Linux) Run doctests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Run doctests
run: cargo test --workspace --doc --no-fail-fast
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
build_remote_server:
timeout-minutes: 60
name: (Linux) Build Remote Server
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Clang & Mold
run: ./script/remote-server && ./script/install-mold 2.34.0
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Build Remote Server
run: cargo build -p remote_server
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
windows_tests:
timeout-minutes: 60
name: (Windows) Run Clippy and tests
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on: [self-32vcpu-windows-2022]
steps:
- name: Environment Setup
run: |
$RunnerDir = Split-Path -Parent $env:RUNNER_WORKSPACE
Write-Output `
"RUSTUP_HOME=$RunnerDir\.rustup" `
"CARGO_HOME=$RunnerDir\.cargo" `
"PATH=$RunnerDir\.cargo\bin;$env:PATH" `
>> $env:GITHUB_ENV
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Configure CI
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
- name: cargo clippy
run: |
.\script\clippy.ps1
- name: Run tests
uses: ./.github/actions/run_tests_windows
- name: Build Zed
run: cargo build
- name: Limit target directory size
run: ./script/clear-target-dir-if-larger-than.ps1 250
- name: Clean CI config file
if: always()
run: Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
tests_pass:
name: Tests Pass
runs-on: namespace-profile-2x4-ubuntu-2404
needs:
- job_spec
- style
- check_docs
- actionlint
- migration_checks
# run_tests: If adding required tests, add them here and to script below.
- linux_tests
- build_remote_server
- macos_tests
- windows_tests
if: |
github.repository_owner == 'zed-industries' &&
always()
steps:
- name: Check all tests passed
run: |
# Check dependent jobs...
RET_CODE=0
# Always check style
[[ "${{ needs.style.result }}" != 'success' ]] && { RET_CODE=1; echo "style tests failed"; }
if [[ "${{ needs.job_spec.outputs.run_docs }}" == "true" ]]; then
[[ "${{ needs.check_docs.result }}" != 'success' ]] && { RET_CODE=1; echo "docs checks failed"; }
fi
if [[ "${{ needs.job_spec.outputs.run_actionlint }}" == "true" ]]; then
[[ "${{ needs.actionlint.result }}" != 'success' ]] && { RET_CODE=1; echo "actionlint checks failed"; }
fi
# Only check test jobs if they were supposed to run
if [[ "${{ needs.job_spec.outputs.run_tests }}" == "true" ]]; then
[[ "${{ needs.macos_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "macOS tests failed"; }
[[ "${{ needs.linux_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "Linux tests failed"; }
[[ "${{ needs.windows_tests.result }}" != 'success' ]] && { RET_CODE=1; echo "Windows tests failed"; }
[[ "${{ needs.build_remote_server.result }}" != 'success' ]] && { RET_CODE=1; echo "Remote server build failed"; }
# This check is intentionally disabled. See: https://github.com/zed-industries/zed/pull/28431
# [[ "${{ needs.migration_checks.result }}" != 'success' ]] && { RET_CODE=1; echo "Migration Checks failed"; }
fi
if [[ "$RET_CODE" -eq 0 ]]; then
echo "All tests passed successfully!"
fi
exit $RET_CODE
bundle-mac:
timeout-minutes: 120
name: Create a macOS bundle
runs-on:
- self-mini-macos
if: startsWith(github.ref, 'refs/tags/v')
needs: [macos_tests]
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "18"
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}

- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# We need to fetch more than one commit so that `script/draft-release-notes`
# is able to diff between the current and previous tag.
#
# 25 was chosen arbitrarily.
fetch-depth: 25
clean: false
ref: ${{ github.ref }}
- name: Limit target directory size
run: script/clear-target-dir-if-larger-than 300
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Draft release notes
run: |
mkdir -p target/
# Ignore any errors that occur while drafting release notes so they don't fail the build.
script/draft-release-notes "$RELEASE_VERSION" "$RELEASE_CHANNEL" > target/release-notes.md || true
script/create-draft-release target/release-notes.md
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Create macOS app bundle (aarch64)
run: script/bundle-mac aarch64-apple-darwin
- name: Create macOS app bundle (x64)
run: script/bundle-mac x86_64-apple-darwin
- name: Rename binaries
run: |
mv target/aarch64-apple-darwin/release/Zed.dmg target/aarch64-apple-darwin/release/Zed-aarch64.dmg
mv target/x86_64-apple-darwin/release/Zed.dmg target/x86_64-apple-darwin/release/Zed-x86_64.dmg
- uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
name: Upload app bundle to release
if: ${{ env.RELEASE_CHANNEL == 'preview' || env.RELEASE_CHANNEL == 'stable' }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-macos-x86_64.gz
target/zed-remote-server-macos-aarch64.gz
target/aarch64-apple-darwin/release/Zed-aarch64.dmg
target/x86_64-apple-darwin/release/Zed-x86_64.dmg
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-linux-x86_x64:
timeout-minutes: 60
name: Linux x86_64 release bundle
runs-on:
- namespace-profile-16x32-ubuntu-2004 # ubuntu 20.04 for minimal glibc
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [linux_tests]
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Install Linux dependencies
run: ./script/linux && ./script/install-mold 2.34.0
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Create Linux .tar.gz bundle
run: script/bundle-linux
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-linux-x86_64.gz
target/release/zed-linux-x86_64.tar.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-linux-aarch64:
timeout-minutes: 60
name: Linux arm64 release bundle
runs-on:
- namespace-profile-8x32-ubuntu-2004-arm-m4 # ubuntu 20.04 for minimal glibc
if: |
startsWith(github.ref, 'refs/tags/v')
needs: [linux_tests]
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Install Linux dependencies
run: ./script/linux
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
- name: Create Linux .tar.gz bundle
run: script/bundle-linux
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
target/zed-remote-server-linux-aarch64.gz
target/release/zed-linux-aarch64.tar.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
freebsd:
timeout-minutes: 60
runs-on: github-8vcpu-ubuntu-2404
if: |
false && ( startsWith(github.ref, 'refs/tags/v') )
needs: [linux_tests]
name: Build Zed on FreeBSD
steps:
- uses: actions/checkout@v4
- name: Build FreeBSD remote-server
id: freebsd-build
uses: vmactions/freebsd-vm@c3ae29a132c8ef1924775414107a97cac042aad5 # v1.2.0
with:
usesh: true
release: 13.5
copyback: true
prepare: |
pkg install -y \
bash curl jq git \
rustup-init cmake-core llvm-devel-lite pkgconf protobuf # libx11 alsa-lib rust-bindgen-cli
run: |
freebsd-version
sysctl hw.model
sysctl hw.ncpu
sysctl hw.physmem
sysctl hw.usermem
git config --global --add safe.directory /home/runner/work/zed/zed
rustup-init --profile minimal --default-toolchain none -y
. "$HOME/.cargo/env"
./script/bundle-freebsd
mkdir -p out/
mv "target/zed-remote-server-freebsd-x86_64.gz" out/
rm -rf target/
cargo clean
- name: Upload Artifact to Workflow - zed-remote-server (run-bundling)
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4
if: contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-freebsd.gz
path: out/zed-remote-server-freebsd-x86_64.gz
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
if: ${{ !(contains(github.event.pull_request.labels.*.name, 'run-bundling')) }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: |
out/zed-remote-server-freebsd-x86_64.gz
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
nix-build:
name: Build with Nix
uses: ./.github/workflows/nix_build.yml
needs: [job_spec]
if: github.repository_owner == 'zed-industries' &&
(contains(github.event.pull_request.labels.*.name, 'run-nix') ||
needs.job_spec.outputs.run_nix == 'true')
secrets: inherit
with:
flake-output: debug
# excludes the final package to only cache dependencies
cachix-filter: "-zed-editor-[0-9.]*-nightly"
bundle-windows-x64:
timeout-minutes: 120
name: Create a Windows installer for x86_64
runs-on: [self-32vcpu-windows-2022]
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: "http://timestamp.acs.microsoft.com"
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
working-directory: ${{ env.ZED_WORKSPACE }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel.ps1
- name: Build Zed installer
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: ${{ env.SETUP_PATH }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
bundle-windows-aarch64:
timeout-minutes: 120
name: Create a Windows installer for aarch64
runs-on: [self-32vcpu-windows-2022]
if: |
( startsWith(github.ref, 'refs/tags/v') )
needs: [windows_tests]
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: "http://timestamp.acs.microsoft.com"
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Setup Sentry CLI
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b #v2
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine version and release channel
working-directory: ${{ env.ZED_WORKSPACE }}
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel.ps1
- name: Build Zed installer
working-directory: ${{ env.ZED_WORKSPACE }}
run: script/bundle-windows.ps1 -Architecture aarch64
- name: Upload Artifacts to release
uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v1
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: ${{ env.SETUP_PATH }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
auto-release-preview:
name: Auto release preview
if: |
false
&& startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64, bundle-windows-x64, bundle-windows-aarch64]
runs-on:
- self-mini-macos
steps:
- name: gh release
run: gh release edit "$GITHUB_REF_NAME" --draft=false
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Create Sentry release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c # v3
env:
SENTRY_ORG: zed-dev
SENTRY_PROJECT: zed
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
with:
environment: production
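The `auto-release-preview` condition above is currently disabled by the leading `false &&`, but its tag-matching part is worth spelling out: it targets preview patch releases (e.g. `v0.211.5-pre`) while excluding the first preview of a series (`v0.212.0-pre`), which presumably still needs manual release notes. A small Python sketch of just that predicate (function name hypothetical):

```python
def auto_release_eligible(ref: str) -> bool:
    # Mirrors the expression:
    #   startsWith(github.ref, 'refs/tags/v')
    #   && endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
    return (
        ref.startswith("refs/tags/v")
        and ref.endswith("-pre")
        and not ref.endswith(".0-pre")
    )
```

This sketch covers only the tag pattern; the workflow additionally gates on the bundle jobs succeeding via `needs`.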
@@ -1,93 +0,0 @@
# IF YOU UPDATE THE NAME OF ANY GITHUB SECRET, YOU MUST CHERRY PICK THE COMMIT
# TO BOTH STABLE AND PREVIEW CHANNELS
name: Release Actions
on:
release:
types: [published]
jobs:
discord_release:
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
steps:
- name: Get release URL
id: get-release-url
run: |
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
URL="https://zed.dev/releases/preview"
else
URL="https://zed.dev/releases/stable"
fi
echo "URL=$URL" >> "$GITHUB_OUTPUT"
- name: Get content
uses: 2428392/gh-truncate-string-action@b3ff790d21cf42af3ca7579146eedb93c8fb0757 # v1.4.1
id: get-content
with:
stringToTruncate: |
📣 Zed [${{ github.event.release.tag_name }}](<${{ steps.get-release-url.outputs.URL }}>) was just released!
${{ github.event.release.body }}
maxLength: 2000
truncationSymbol: "..."
- name: Discord Webhook Action
uses: tsickert/discord-webhook@c840d45a03a323fbc3f7507ac7769dbd91bfb164 # v5.3.0
with:
webhook-url: ${{ secrets.DISCORD_WEBHOOK_RELEASE_NOTES }}
content: ${{ steps.get-content.outputs.string }}
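The truncation step above keeps the Discord message under the 2000-character hard limit. Its behavior can be approximated as follows (an approximation, not the action's exact implementation; the real action may handle edge cases differently):

```python
def truncate(s: str, max_length: int, symbol: str = "...") -> str:
    # If the string fits, return it unchanged; otherwise cut it so the
    # result, including the truncation symbol, is exactly max_length.
    if len(s) <= max_length:
        return s
    return s[: max_length - len(symbol)] + symbol
```

Truncating before posting matters because the webhook call fails outright on over-long content rather than truncating server-side.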
publish-winget:
runs-on:
- ubuntu-latest
steps:
- name: Set Package Name
id: set-package-name
run: |
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
PACKAGE_NAME=ZedIndustries.Zed.Preview
else
PACKAGE_NAME=ZedIndustries.Zed
fi
echo "PACKAGE_NAME=$PACKAGE_NAME" >> "$GITHUB_OUTPUT"
- uses: vedantmgoyal9/winget-releaser@19e706d4c9121098010096f9c495a70a7518b30f # v2
with:
identifier: ${{ steps.set-package-name.outputs.PACKAGE_NAME }}
max-versions-to-keep: 5
token: ${{ secrets.WINGET_TOKEN }}
send_release_notes_email:
if: false && github.repository_owner == 'zed-industries' && !github.event.release.prerelease
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
fetch-depth: 0
- name: Check if release was promoted from preview
id: check-promotion-from-preview
run: |
VERSION="${{ github.event.release.tag_name }}"
PREVIEW_TAG="${VERSION}-pre"
if git rev-parse "$PREVIEW_TAG" > /dev/null 2>&1; then
echo "was_promoted_from_preview=true" >> "$GITHUB_OUTPUT"
else
echo "was_promoted_from_preview=false" >> "$GITHUB_OUTPUT"
fi
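The promotion check above decides whether a stable release `vX.Y.Z` was promoted from preview by testing whether the corresponding `vX.Y.Z-pre` tag exists. A minimal Python sketch of that decision, modeling the repository's tags as an in-memory set instead of calling `git rev-parse` (function name hypothetical):

```python
def promoted_from_preview(version: str, existing_tags: set[str]) -> bool:
    # A stable tag vX.Y.Z was promoted from preview if vX.Y.Z-pre exists.
    return f"{version}-pre" in existing_tags
```

Only promoted releases trigger the email, so stable hotfixes that never shipped as a preview are skipped.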
- name: Send release notes email
if: steps.check-promotion-from-preview.outputs.was_promoted_from_preview == 'true'
run: |
TAG="${{ github.event.release.tag_name }}"
cat << 'EOF' > release_body.txt
${{ github.event.release.body }}
EOF
jq -n --arg tag "$TAG" --rawfile body release_body.txt '{version: $tag, markdown_body: $body}' \
> release_data.json
curl -X POST "https://zed.dev/api/send_release_notes_email" \
-H "Authorization: Bearer ${{ secrets.RELEASE_NOTES_API_TOKEN }}" \
-H "Content-Type: application/json" \
-d @release_data.json
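The `jq -n --arg ... --rawfile ...` invocation above builds the JSON payload without interpolating the release body into a shell string, which sidesteps quoting and injection issues with arbitrary markdown. The equivalent construction in Python, for illustration (function name hypothetical):

```python
import json

def release_payload(tag: str, body: str) -> str:
    # Same shape as the jq program: {version: $tag, markdown_body: $body}.
    # json.dumps handles escaping of newlines and quotes in the body,
    # just as jq's --rawfile does for the raw file contents.
    return json.dumps({"version": tag, "markdown_body": body})
```

Writing the body to a file first (the `cat << 'EOF'` heredoc) is what keeps backticks and `$` in release notes from being expanded by the shell.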
.github/workflows/compare_perf.yml vendored Normal file
@@ -0,0 +1,78 @@
# Generated from xtask::workflows::compare_perf
# Rebuild with `cargo xtask workflows`.
name: compare_perf
on:
workflow_dispatch:
inputs:
head:
description: head
required: true
type: string
base:
description: base
required: true
type: string
crate_name:
description: crate_name
type: string
default: ''
jobs:
run_perf:
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::install_hyperfine
run: cargo install hyperfine
shell: bash -euxo pipefail {0}
- name: steps::git_checkout
run: git fetch origin ${{ inputs.base }} && git checkout ${{ inputs.base }}
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::cargo_perf_test
run: |2-
if [ -n "${{ inputs.crate_name }}" ]; then
cargo perf-test -p ${{ inputs.crate_name }} -- --json=${{ inputs.base }};
else
cargo perf-test -p vim -- --json=${{ inputs.base }};
fi
shell: bash -euxo pipefail {0}
- name: steps::git_checkout
run: git fetch origin ${{ inputs.head }} && git checkout ${{ inputs.head }}
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::cargo_perf_test
run: |2-
if [ -n "${{ inputs.crate_name }}" ]; then
cargo perf-test -p ${{ inputs.crate_name }} -- --json=${{ inputs.head }};
else
cargo perf-test -p vim -- --json=${{ inputs.head }};
fi
shell: bash -euxo pipefail {0}
- name: compare_perf::run_perf::compare_runs
run: cargo perf-compare --save=results.md ${{ inputs.base }} ${{ inputs.head }}
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact results.md'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: results.md
path: results.md
if-no-files-found: error
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
@@ -29,10 +29,10 @@ jobs:
node-version: '20'
cache: pnpm
cache-dependency-path: script/danger/pnpm-lock.yaml
- name: danger::install_deps
- name: danger::danger_job::install_deps
run: pnpm install --dir script/danger
shell: bash -euxo pipefail {0}
- name: danger::run
- name: danger::danger_job::run
run: pnpm run --dir script/danger danger ci
shell: bash -euxo pipefail {0}
env:
@@ -1,71 +0,0 @@
name: Run Agent Eval
on:
schedule:
- cron: "0 0 * * *"
pull_request:
branches:
- "**"
types: [synchronize, reopened, labeled]
workflow_dispatch:
concurrency:
# Allow only one workflow run per non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: 1
jobs:
run_eval:
timeout-minutes: 60
name: Run Agent Eval
if: >
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Compile eval
run: cargo build --package=eval
- name: Run eval
run: cargo run --package=eval -- --repetitions=8 --concurrency=1
# Even though the Linux runner is not stateful, in theory there is no need to do this cleanup.
# But to avoid potential issues in the future, if we choose to use a stateful Linux runner and forget to add code
# to clean up the config file, I've included the cleanup code here as a precaution.
# While it's not strictly necessary at this moment, I believe it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo
@@ -1,77 +0,0 @@
# Generated from xtask::workflows::nix_build
# Rebuild with `cargo xtask workflows`.
name: nix_build
on:
workflow_call:
inputs:
flake-output:
type: string
default: default
cachix-filter:
type: string
jobs:
build_nix_linux_x86_64:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: ${{ inputs.cachix-filter }}
- name: nix_build::build
run: nix build .#${{ inputs.flake-output }} -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
build_nix_mac_aarch64:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: ${{ inputs.cachix-filter }}
- name: nix_build::build
run: nix build .#${{ inputs.flake-output }} -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
.github/workflows/release.yml vendored Normal file
@@ -0,0 +1,491 @@
# Generated from xtask::workflows::release
# Rebuild with `cargo xtask workflows`.
name: release
env:
CARGO_TERM_COLOR: always
RUST_BACKTRACE: '1'
on:
push:
tags:
- v*
jobs:
run_tests_mac:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_linux:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 250
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
shell: pwsh
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy.ps1
shell: pwsh
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: pwsh
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than.ps1 250
shell: pwsh
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: pwsh
- name: steps::cleanup_cargo_config
if: always()
run: |
Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
shell: pwsh
timeout-minutes: 60
check_scripts:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_tests::check_scripts::run_shellcheck
run: ./script/shellcheck-scripts error
shell: bash -euxo pipefail {0}
- id: get_actionlint
name: run_tests::check_scripts::download_actionlint
run: bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::run_actionlint
run: |
${{ steps.get_actionlint.outputs.executable }} -color
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::check_xtask_workflows
run: |
cargo xtask workflows
if ! git diff --exit-code .github; then
echo "Error: .github directory has uncommitted changes after running 'cargo xtask workflows'"
echo "Please run 'cargo xtask workflows' locally and commit the changes"
exit 1
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
create_draft_release:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: 25
ref: ${{ github.ref }}
- name: script/determine-release-channel
run: script/determine-release-channel
shell: bash -euxo pipefail {0}
- name: mkdir -p target/
run: mkdir -p target/
shell: bash -euxo pipefail {0}
- name: release::create_draft_release::generate_release_notes
run: node --redirect-warnings=/dev/null ./script/draft-release-notes "$RELEASE_VERSION" "$RELEASE_CHANNEL" > target/release-notes.md
shell: bash -euxo pipefail {0}
- name: release::create_draft_release::create_release
run: script/create-draft-release target/release-notes.md
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
timeout-minutes: 60
bundle_linux_aarch64:
needs:
- run_tests_linux
- check_scripts
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-linux-aarch64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-linux-aarch64.tar.gz
path: target/release/zed-linux-aarch64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-linux-aarch64.gz
path: target/zed-remote-server-linux-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_linux_x86_64:
needs:
- run_tests_linux
- check_scripts
runs-on: namespace-profile-32x64-ubuntu-2004
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-linux-x86_64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-linux-x86_64.tar.gz
path: target/release/zed-linux-x86_64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-linux-x86_64.gz
path: target/zed-remote-server-linux-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_mac_aarch64:
needs:
- run_tests_mac
- check_scripts
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed-aarch64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_mac_x86_64:
needs:
- run_tests_mac
- check_scripts
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-x86_64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed-x86_64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_windows_aarch64:
needs:
- run_tests_windows
- check_scripts
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.exe
path: target/Zed-aarch64.exe
if-no-files-found: error
timeout-minutes: 60
bundle_windows_x86_64:
needs:
- run_tests_windows
- check_scripts
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed-x86_64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.exe
path: target/Zed-x86_64.exe
if-no-files-found: error
timeout-minutes: 60
upload_release_assets:
needs:
- create_draft_release
- bundle_linux_aarch64
- bundle_linux_x86_64
- bundle_mac_aarch64
- bundle_mac_x86_64
- bundle_windows_aarch64
- bundle_windows_x86_64
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: release::download_workflow_artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
path: ./artifacts/
- name: ls -lR ./artifacts
run: ls -lR ./artifacts
shell: bash -euxo pipefail {0}
- name: release::prep_release_artifacts
run: |-
mkdir -p release-artifacts/
mv ./artifacts/Zed-aarch64.dmg/Zed-aarch64.dmg release-artifacts/Zed-aarch64.dmg
mv ./artifacts/Zed-x86_64.dmg/Zed-x86_64.dmg release-artifacts/Zed-x86_64.dmg
mv ./artifacts/zed-linux-aarch64.tar.gz/zed-linux-aarch64.tar.gz release-artifacts/zed-linux-aarch64.tar.gz
mv ./artifacts/zed-linux-x86_64.tar.gz/zed-linux-x86_64.tar.gz release-artifacts/zed-linux-x86_64.tar.gz
mv ./artifacts/Zed-x86_64.exe/Zed-x86_64.exe release-artifacts/Zed-x86_64.exe
mv ./artifacts/Zed-aarch64.exe/Zed-aarch64.exe release-artifacts/Zed-aarch64.exe
mv ./artifacts/zed-remote-server-macos-aarch64.gz/zed-remote-server-macos-aarch64.gz release-artifacts/zed-remote-server-macos-aarch64.gz
mv ./artifacts/zed-remote-server-macos-x86_64.gz/zed-remote-server-macos-x86_64.gz release-artifacts/zed-remote-server-macos-x86_64.gz
mv ./artifacts/zed-remote-server-linux-aarch64.gz/zed-remote-server-linux-aarch64.gz release-artifacts/zed-remote-server-linux-aarch64.gz
mv ./artifacts/zed-remote-server-linux-x86_64.gz/zed-remote-server-linux-x86_64.gz release-artifacts/zed-remote-server-linux-x86_64.gz
shell: bash -euxo pipefail {0}
- name: gh release upload "$GITHUB_REF_NAME" --repo=zed-industries/zed release-artifacts/*
run: gh release upload "$GITHUB_REF_NAME" --repo=zed-industries/zed release-artifacts/*
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
auto_release_preview:
needs:
- upload_release_assets
if: |
false
&& startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: gh release edit "$GITHUB_REF_NAME" --repo=zed-industries/zed --draft=false
run: gh release edit "$GITHUB_REF_NAME" --repo=zed-industries/zed --draft=false
shell: bash -euxo pipefail {0}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: release::create_sentry_release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
with:
environment: production
env:
SENTRY_ORG: zed-dev
SENTRY_PROJECT: zed
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true


@@ -3,12 +3,7 @@
name: release_nightly
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
on:
push:
tags:
@@ -32,38 +27,6 @@ jobs:
run: ./script/clippy
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_mac:
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
@@ -81,6 +44,9 @@ jobs:
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy.ps1
shell: pwsh
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: pwsh
@@ -96,176 +62,215 @@ jobs:
Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
shell: pwsh
timeout-minutes: 60
bundle_mac_nightly_x86_64:
bundle_linux_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
- run_tests_windows
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
- name: run_bundling::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly macos x86_64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
bundle_mac_nightly_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly macos aarch64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
bundle_linux_nightly_x86_64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::add_rust_to_path
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: ./script/linux
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: ./script/install-mold
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
- name: '@actions/upload-artifact zed-linux-aarch64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-linux-aarch64.tar.gz
path: target/release/zed-linux-aarch64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-linux-aarch64.gz
path: target/zed-remote-server-linux-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_linux_x86_64:
needs:
- check_style
- run_tests_windows
runs-on: namespace-profile-32x64-ubuntu-2004
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_bundling::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly linux-targz x86_64
shell: bash -euxo pipefail {0}
timeout-minutes: 60
bundle_linux_nightly_aarch64:
needs:
- check_style
- run_tests_mac
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::add_rust_to_path
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: ./script/linux
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 100
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: release_nightly::set_release_channel_to_nightly
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-linux-x86_64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-linux-x86_64.tar.gz
path: target/release/zed-linux-x86_64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-linux-x86_64.gz
path: target/zed-remote-server-linux-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_mac_aarch64:
needs:
- check_style
- run_tests_windows
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_bundling::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: release_nightly::upload_zed_nightly
run: script/upload-nightly linux-targz aarch64
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed-aarch64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_windows_nightly_x86_64:
bundle_mac_x86_64:
needs:
- check_style
- run_tests_windows
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_bundling::set_release_channel_to_nightly
run: |
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-x86_64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed-x86_64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_windows_aarch64:
needs:
- check_style
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
@@ -280,11 +285,7 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::set_release_channel_to_nightly
- name: run_bundling::set_release_channel_to_nightly
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
@@ -292,61 +293,71 @@ jobs:
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::build_zed_installer
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::upload_zed_nightly_windows
run: script/upload-nightly.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
timeout-minutes: 60
bundle_windows_nightly_aarch64:
needs:
- check_style
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: release_nightly::set_release_channel_to_nightly
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::build_zed_installer
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: release_nightly::upload_zed_nightly_windows
run: script/upload-nightly.ps1 -Architecture aarch64
- name: '@actions/upload-artifact Zed-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.exe
path: target/Zed-aarch64.exe
if-no-files-found: error
timeout-minutes: 60
bundle_windows_x86_64:
needs:
- check_style
- run_tests_windows
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_bundling::set_release_channel_to_nightly
run: |
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed-x86_64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.exe
path: target/Zed-x86_64.exe
if-no-files-found: error
timeout-minutes: 60
build_nix_linux_x86_64:
needs:
- check_style
- run_tests_mac
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
@@ -359,17 +370,17 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::install_nix
- name: nix_build::build_nix::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#default -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
@@ -377,7 +388,7 @@ jobs:
build_nix_mac_aarch64:
needs:
- check_style
- run_tests_mac
- run_tests_windows
if: github.repository_owner == 'zed-industries'
runs-on: self-mini-macos
env:
@@ -390,21 +401,21 @@ jobs:
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::set_path
- name: nix_build::build_nix::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::cachix_action
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
- name: nix_build::build
- name: nix_build::build_nix::build
run: nix build .#default -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::limit_store
- name: nix_build::build_nix::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
@@ -414,21 +425,49 @@ jobs:
continue-on-error: true
update_nightly_tag:
needs:
- bundle_mac_nightly_x86_64
- bundle_mac_nightly_aarch64
- bundle_linux_nightly_x86_64
- bundle_linux_nightly_aarch64
- bundle_windows_nightly_x86_64
- bundle_windows_nightly_aarch64
- bundle_linux_aarch64
- bundle_linux_x86_64
- bundle_mac_aarch64
- bundle_mac_x86_64
- bundle_windows_aarch64
- bundle_windows_x86_64
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: 0
- name: release_nightly::update_nightly_tag
- name: release::download_workflow_artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53
with:
path: ./artifacts/
- name: ls -lR ./artifacts
run: ls -lR ./artifacts
shell: bash -euxo pipefail {0}
- name: release::prep_release_artifacts
run: |-
mkdir -p release-artifacts/
mv ./artifacts/Zed-aarch64.dmg/Zed-aarch64.dmg release-artifacts/Zed-aarch64.dmg
mv ./artifacts/Zed-x86_64.dmg/Zed-x86_64.dmg release-artifacts/Zed-x86_64.dmg
mv ./artifacts/zed-linux-aarch64.tar.gz/zed-linux-aarch64.tar.gz release-artifacts/zed-linux-aarch64.tar.gz
mv ./artifacts/zed-linux-x86_64.tar.gz/zed-linux-x86_64.tar.gz release-artifacts/zed-linux-x86_64.tar.gz
mv ./artifacts/Zed-x86_64.exe/Zed-x86_64.exe release-artifacts/Zed-x86_64.exe
mv ./artifacts/Zed-aarch64.exe/Zed-aarch64.exe release-artifacts/Zed-aarch64.exe
mv ./artifacts/zed-remote-server-macos-aarch64.gz/zed-remote-server-macos-aarch64.gz release-artifacts/zed-remote-server-macos-aarch64.gz
mv ./artifacts/zed-remote-server-macos-x86_64.gz/zed-remote-server-macos-x86_64.gz release-artifacts/zed-remote-server-macos-x86_64.gz
mv ./artifacts/zed-remote-server-linux-aarch64.gz/zed-remote-server-linux-aarch64.gz release-artifacts/zed-remote-server-linux-aarch64.gz
mv ./artifacts/zed-remote-server-linux-x86_64.gz/zed-remote-server-linux-x86_64.gz release-artifacts/zed-remote-server-linux-x86_64.gz
shell: bash -euxo pipefail {0}
- name: ./script/upload-nightly
run: ./script/upload-nightly
shell: bash -euxo pipefail {0}
env:
DIGITALOCEAN_SPACES_ACCESS_KEY: ${{ secrets.DIGITALOCEAN_SPACES_ACCESS_KEY }}
DIGITALOCEAN_SPACES_SECRET_KEY: ${{ secrets.DIGITALOCEAN_SPACES_SECRET_KEY }}
- name: release_nightly::update_nightly_tag_job::update_nightly_tag
run: |
if [ "$(git rev-parse nightly)" = "$(git rev-parse HEAD)" ]; then
echo "Nightly tag already points to current commit. Skipping tagging."
@@ -439,7 +478,7 @@ jobs:
git tag -f nightly
git push origin nightly --force
shell: bash -euxo pipefail {0}
- name: release_nightly::create_sentry_release
- name: release::create_sentry_release
uses: getsentry/action-release@526942b68292201ac6bbb99b9a0747d4abee354c
with:
environment: production

.github/workflows/run_agent_evals.yml vendored Normal file

@@ -0,0 +1,62 @@
# Generated from xtask::workflows::run_agent_evals
# Rebuild with `cargo xtask workflows`.
name: run_agent_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_EVAL_TELEMETRY: '1'
on:
pull_request:
types:
- synchronize
- reopened
- labeled
branches:
- '**'
schedule:
- cron: 0 0 * * *
workflow_dispatch: {}
jobs:
agent_evals:
if: |
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: cargo build --package=eval
run: cargo build --package=eval
shell: bash -euxo pipefail {0}
- name: run_agent_evals::agent_evals::run_eval
run: cargo run --package=eval -- --repetitions=8 --concurrency=1
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true


@@ -3,103 +3,62 @@
name: run_bundling
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
on:
pull_request:
types:
- labeled
- synchronize
jobs:
bundle_mac_x86_64:
bundle_linux_aarch64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-mini-macos
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg'
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-linux-aarch64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed.dmg
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz'
name: zed-linux-aarch64.tar.gz
path: target/release/zed-linux-aarch64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
timeout-minutes: 60
bundle_mac_arm64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-mini-macos
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed.dmg
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
name: zed-remote-server-linux-aarch64.gz
path: target/zed-remote-server-linux-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_linux_x86_64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: namespace-profile-32x64-ubuntu-2004
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
@@ -109,31 +68,138 @@ jobs:
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: ./script/linux
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: ./script/install-mold
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
- name: '@actions/upload-artifact zed-linux-x86_64.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz'
name: zed-linux-x86_64.tar.gz
path: target/release/zed-linux-x86_64.tar.gz
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-linux-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-remote-server-*.tar.gz
name: zed-remote-server-linux-x86_64.gz
path: target/zed-remote-server-linux-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_linux_arm64:
bundle_mac_aarch64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: namespace-profile-8x32-ubuntu-2004-arm-m4
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac aarch64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-aarch64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-aarch64.dmg
path: target/aarch64-apple-darwin/release/Zed-aarch64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-aarch64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-aarch64.gz
path: target/zed-remote-server-macos-aarch64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_mac_x86_64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-mini-macos
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
MACOS_CERTIFICATE_PASSWORD: ${{ secrets.MACOS_CERTIFICATE_PASSWORD }}
APPLE_NOTARIZATION_KEY: ${{ secrets.APPLE_NOTARIZATION_KEY }}
APPLE_NOTARIZATION_KEY_ID: ${{ secrets.APPLE_NOTARIZATION_KEY_ID }}
APPLE_NOTARIZATION_ISSUER_ID: ${{ secrets.APPLE_NOTARIZATION_ISSUER_ID }}
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: run_bundling::bundle_mac::bundle_mac
run: ./script/bundle-mac x86_64-apple-darwin
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact Zed-x86_64.dmg'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed-x86_64.dmg
path: target/x86_64-apple-darwin/release/Zed-x86_64.dmg
if-no-files-found: error
- name: '@actions/upload-artifact zed-remote-server-macos-x86_64.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-macos-x86_64.gz
path: target/zed-remote-server-macos-x86_64.gz
if-no-files-found: error
timeout-minutes: 60
bundle_windows_aarch64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
@@ -143,22 +209,16 @@ jobs:
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: ./script/linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: ./script/bundle-linux
run: ./script/bundle-linux
shell: bash -euxo pipefail {0}
- name: '@actions/upload-artifact zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: '@actions/upload-artifact zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-remote-server-*.tar.gz
name: Zed-aarch64.exe
path: target/Zed-aarch64.exe
if-no-files-found: error
timeout-minutes: 60
bundle_windows_x86_64:
if: |-
@@ -166,6 +226,9 @@ jobs:
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-32vcpu-windows-2022
env:
CARGO_INCREMENTAL: 0
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
@@ -184,49 +247,16 @@ jobs:
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows
- name: run_bundling::bundle_windows::bundle_windows
run: script/bundle-windows.ps1 -Architecture x86_64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe'
- name: '@actions/upload-artifact Zed-x86_64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-x86_64.exe
path: ${{ env.SETUP_PATH }}
timeout-minutes: 60
bundle_windows_arm64:
if: |-
(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))
runs-on: self-32vcpu-windows-2022
env:
AZURE_TENANT_ID: ${{ secrets.AZURE_SIGNING_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_SIGNING_CLIENT_ID }}
AZURE_CLIENT_SECRET: ${{ secrets.AZURE_SIGNING_CLIENT_SECRET }}
ACCOUNT_NAME: ${{ vars.AZURE_SIGNING_ACCOUNT_NAME }}
CERT_PROFILE_NAME: ${{ vars.AZURE_SIGNING_CERT_PROFILE_NAME }}
ENDPOINT: ${{ vars.AZURE_SIGNING_ENDPOINT }}
FILE_DIGEST: SHA256
TIMESTAMP_DIGEST: SHA256
TIMESTAMP_SERVER: http://timestamp.acs.microsoft.com
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_sentry
uses: matbour/setup-sentry-cli@3e938c54b3018bdd019973689ef984e033b0454b
with:
token: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: run_bundling::bundle_windows
run: script/bundle-windows.ps1 -Architecture aarch64
shell: pwsh
working-directory: ${{ env.ZED_WORKSPACE }}
- name: '@actions/upload-artifact Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4
with:
name: Zed_${{ github.event.pull_request.head.sha || github.sha }}-aarch64.exe
path: ${{ env.SETUP_PATH }}
name: Zed-x86_64.exe
path: target/Zed-x86_64.exe
if-no-files-found: error
timeout-minutes: 60
concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.ref }}

.github/workflows/run_tests.yml

@@ -0,0 +1,569 @@
# Generated from xtask::workflows::run_tests
# Rebuild with `cargo xtask workflows`.
name: run_tests
env:
CARGO_TERM_COLOR: always
RUST_BACKTRACE: '1'
CARGO_INCREMENTAL: '0'
on:
pull_request:
branches:
- '**'
push:
branches:
- main
- v[0-9]+.[0-9]+.x
jobs:
orchestrate:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
fetch-depth: ${{ github.ref == 'refs/heads/main' && 2 || 350 }}
- id: filter
name: filter
run: |
if [ -z "$GITHUB_BASE_REF" ]; then
echo "Not in a PR context (i.e., push to main/stable/preview)"
COMPARE_REV="$(git rev-parse HEAD~1)"
else
echo "In a PR context comparing to pull_request.base.ref"
git fetch origin "$GITHUB_BASE_REF" --depth=350
COMPARE_REV="$(git merge-base "origin/${GITHUB_BASE_REF}" HEAD)"
fi
CHANGED_FILES="$(git diff --name-only "$COMPARE_REV" ${{ github.sha }})"
check_pattern() {
local output_name="$1"
local pattern="$2"
local grep_arg="$3"
echo "$CHANGED_FILES" | grep "$grep_arg" "$pattern" && \
echo "${output_name}=true" >> "$GITHUB_OUTPUT" || \
echo "${output_name}=false" >> "$GITHUB_OUTPUT"
}
check_pattern "run_action_checks" '^\.github/(workflows/|actions/|actionlint.yml)|tooling/xtask|script/' -qP
check_pattern "run_docs" '^docs/' -qP
check_pattern "run_licenses" '^(Cargo.lock|script/.*licenses)' -qP
check_pattern "run_nix" '^(nix/|flake\.|Cargo\.|rust-toolchain.toml|\.cargo/config.toml)' -qP
check_pattern "run_tests" '^(docs/|script/update_top_ranking_issues/|\.github/(ISSUE_TEMPLATE|workflows/(?!run_tests)))' -qvP
shell: bash -euxo pipefail {0}
outputs:
run_action_checks: ${{ steps.filter.outputs.run_action_checks }}
run_docs: ${{ steps.filter.outputs.run_docs }}
run_licenses: ${{ steps.filter.outputs.run_licenses }}
run_nix: ${{ steps.filter.outputs.run_nix }}
run_tests: ${{ steps.filter.outputs.run_tests }}
check_style:
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-4x8-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_pnpm
uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2
with:
version: '9'
- name: ./script/prettier
run: ./script/prettier
shell: bash -euxo pipefail {0}
- name: ./script/check-todos
run: ./script/check-todos
shell: bash -euxo pipefail {0}
- name: ./script/check-keymaps
run: ./script/check-keymaps
shell: bash -euxo pipefail {0}
- name: run_tests::check_style::check_for_typos
uses: crate-ci/typos@80c8a4945eec0f6d464eaf9e65ed98ef085283d1
with:
config: ./typos.toml
- name: steps::cargo_fmt
run: cargo fmt --all -- --check
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_windows:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-32vcpu-windows-2022
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
New-Item -ItemType Directory -Path "./../.cargo" -Force
Copy-Item -Path "./.cargo/ci-config.toml" -Destination "./../.cargo/config.toml"
shell: pwsh
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy.ps1
shell: pwsh
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: pwsh
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than.ps1 250
shell: pwsh
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: pwsh
- name: steps::cleanup_cargo_config
if: always()
run: |
Remove-Item -Recurse -Path "./../.cargo" -Force -ErrorAction SilentlyContinue
shell: pwsh
timeout-minutes: 60
run_tests_linux:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 250
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
run_tests_mac:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020
with:
node-version: '20'
- name: steps::clippy
run: ./script/clippy
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 300
shell: bash -euxo pipefail {0}
- name: steps::cargo_nextest
run: cargo nextest run --workspace --no-fail-fast --failure-output immediate-final
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
doctests:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- id: run_doctests
name: run_tests::doctests::run_doctests
run: |
cargo test --workspace --doc --no-fail-fast
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
check_workspace_binaries:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: namespace-profile-8x16-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: cargo build -p collab
run: cargo build -p collab
shell: bash -euxo pipefail {0}
- name: cargo build --workspace --bins --examples
run: cargo build --workspace --bins --examples
shell: bash -euxo pipefail {0}
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
timeout-minutes: 60
check_postgres_and_protobuf_migrations:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: self-mini-macos
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
fetch-depth: 0
- name: run_tests::check_postgres_and_protobuf_migrations::remove_untracked_files
run: git clean -df
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::ensure_fresh_merge
run: |
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
shell: bash -euxo pipefail {0}
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_setup_action
uses: bufbuild/buf-setup-action@v1
with:
version: v1.29.0
- name: run_tests::check_postgres_and_protobuf_migrations::bufbuild_breaking_action
uses: bufbuild/buf-breaking-action@v1
with:
input: crates/proto/proto/
against: https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/
timeout-minutes: 60
check_dependencies:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_tests == 'true'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: run_tests::check_dependencies::install_cargo_machete
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386
with:
command: install
args: cargo-machete@0.7.0
- name: run_tests::check_dependencies::run_cargo_machete
uses: clechasseur/rs-cargo@8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386
with:
command: machete
- name: run_tests::check_dependencies::check_cargo_lock
run: cargo update --locked --workspace
shell: bash -euxo pipefail {0}
- name: run_tests::check_dependencies::check_vulnerable_dependencies
if: github.event_name == 'pull_request'
uses: actions/dependency-review-action@67d4f4bd7a9b17a0db54d2a7519187c65e339de8
with:
license-check: false
timeout-minutes: 60
check_docs:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_docs == 'true'
runs-on: namespace-profile-8x16-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: run_tests::check_docs::lychee_link_check
uses: lycheeverse/lychee-action@82202e5e9c2f4ef1a55a3d02563e1cb6041e5332
with:
args: --no-progress --exclude '^http' './docs/src/**/*'
fail: true
jobSummary: false
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: run_tests::check_docs::install_mdbook
uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08
with:
mdbook-version: 0.4.37
- name: run_tests::check_docs::build_docs
run: |
mkdir -p target/deploy
mdbook build ./docs --dest-dir=../target/deploy/docs/
shell: bash -euxo pipefail {0}
- name: run_tests::check_docs::lychee_link_check
uses: lycheeverse/lychee-action@82202e5e9c2f4ef1a55a3d02563e1cb6041e5332
with:
args: --no-progress --exclude '^http' 'target/deploy/docs'
fail: true
jobSummary: false
timeout-minutes: 60
check_licenses:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_licenses == 'true'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: ./script/check-licenses
run: ./script/check-licenses
shell: bash -euxo pipefail {0}
- name: ./script/generate-licenses
run: ./script/generate-licenses
shell: bash -euxo pipefail {0}
check_scripts:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_action_checks == 'true'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: run_tests::check_scripts::run_shellcheck
run: ./script/shellcheck-scripts error
shell: bash -euxo pipefail {0}
- id: get_actionlint
name: run_tests::check_scripts::download_actionlint
run: bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::run_actionlint
run: |
${{ steps.get_actionlint.outputs.executable }} -color
shell: bash -euxo pipefail {0}
- name: run_tests::check_scripts::check_xtask_workflows
run: |
cargo xtask workflows
if ! git diff --exit-code .github; then
echo "Error: .github directory has uncommitted changes after running 'cargo xtask workflows'"
echo "Please run 'cargo xtask workflows' locally and commit the changes"
exit 1
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
build_nix_linux_x86_64:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_nix == 'true'
runs-on: namespace-profile-32x64-ubuntu-2004
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::build_nix::install_nix
uses: cachix/install-nix-action@02a151ada4993995686f9ed4f1be7cfbb229e56f
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build_nix::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
build_nix_mac_aarch64:
needs:
- orchestrate
if: needs.orchestrate.outputs.run_nix == 'true'
runs-on: self-mini-macos
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
ZED_MINIDUMP_ENDPOINT: ${{ secrets.ZED_SENTRY_MINIDUMP_ENDPOINT }}
ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON: ${{ secrets.ZED_CLOUD_PROVIDER_ADDITIONAL_MODELS_JSON }}
GIT_LFS_SKIP_SMUDGE: '1'
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: nix_build::build_nix::set_path
run: |
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
shell: bash -euxo pipefail {0}
- name: nix_build::build_nix::cachix_action
uses: cachix/cachix-action@0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad
with:
name: zed
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
cachixArgs: -v
pushFilter: -zed-editor-[0-9.]*-nightly
- name: nix_build::build_nix::build
run: nix build .#debug -L --accept-flake-config
shell: bash -euxo pipefail {0}
- name: nix_build::build_nix::limit_store
run: |-
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi
shell: bash -euxo pipefail {0}
timeout-minutes: 60
continue-on-error: true
tests_pass:
needs:
- orchestrate
- check_style
- run_tests_windows
- run_tests_linux
- run_tests_mac
- doctests
- check_workspace_binaries
- check_postgres_and_protobuf_migrations
- check_dependencies
- check_docs
- check_licenses
- check_scripts
- build_nix_linux_x86_64
- build_nix_mac_aarch64
if: github.repository_owner == 'zed-industries' && always()
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- name: run_tests::tests_pass
run: |
set +x
EXIT_CODE=0
check_result() {
echo "* $1: $2"
if [[ "$2" != "skipped" && "$2" != "success" ]]; then EXIT_CODE=1; fi
}
check_result "orchestrate" "${{ needs.orchestrate.result }}"
check_result "check_style" "${{ needs.check_style.result }}"
check_result "run_tests_windows" "${{ needs.run_tests_windows.result }}"
check_result "run_tests_linux" "${{ needs.run_tests_linux.result }}"
check_result "run_tests_mac" "${{ needs.run_tests_mac.result }}"
check_result "doctests" "${{ needs.doctests.result }}"
check_result "check_workspace_binaries" "${{ needs.check_workspace_binaries.result }}"
check_result "check_postgres_and_protobuf_migrations" "${{ needs.check_postgres_and_protobuf_migrations.result }}"
check_result "check_dependencies" "${{ needs.check_dependencies.result }}"
check_result "check_docs" "${{ needs.check_docs.result }}"
check_result "check_licenses" "${{ needs.check_licenses.result }}"
check_result "check_scripts" "${{ needs.check_scripts.result }}"
check_result "build_nix_linux_x86_64" "${{ needs.build_nix_linux_x86_64.result }}"
check_result "build_nix_mac_aarch64" "${{ needs.build_nix_mac_aarch64.result }}"
exit $EXIT_CODE
shell: bash -euxo pipefail {0}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
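The `orchestrate` job's filter step above classifies changed files with grep patterns to decide which downstream jobs run, writing one boolean per pattern to `$GITHUB_OUTPUT`. A self-contained sketch of that `check_pattern` helper, assuming hypothetical changed paths and a temp file standing in for `$GITHUB_OUTPUT`:

```shell
#!/usr/bin/env bash
set -eu
# Sketch of run_tests::orchestrate's filter: each pattern yields one
# name=true/false output line. CHANGED_FILES stands in for the
# `git diff --name-only` result in the real workflow.
CHANGED_FILES=$'docs/src/intro.md\nscript/check-todos'
GITHUB_OUTPUT=$(mktemp)

check_pattern() {
  local output_name="$1" pattern="$2" grep_arg="$3"
  if grep "$grep_arg" "$pattern" <<< "$CHANGED_FILES"; then
    echo "${output_name}=true" >> "$GITHUB_OUTPUT"
  else
    echo "${output_name}=false" >> "$GITHUB_OUTPUT"
  fi
}

check_pattern "run_docs" '^docs/' -qP
check_pattern "run_licenses" '^(Cargo.lock|script/.*licenses)' -qP
cat "$GITHUB_OUTPUT"
# run_docs=true      (docs/src/intro.md matches ^docs/)
# run_licenses=false (no Cargo.lock or licenses script changed)
```

Note that `-qP` relies on GNU grep's PCRE support, which is available on the workflow's Ubuntu runners but not on stock macOS grep.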

.github/workflows/run_unit_evals.yml

@@ -0,0 +1,63 @@
# Generated from xtask::workflows::run_agent_evals
# Rebuild with `cargo xtask workflows`.
name: run_agent_evals
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: '0'
RUST_BACKTRACE: '1'
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
on:
schedule:
- cron: 47 1 * * 2
workflow_dispatch: {}
jobs:
unit_evals:
runs-on: namespace-profile-16x32-ubuntu-2204
steps:
- name: steps::checkout_repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
with:
clean: false
- name: steps::setup_cargo_config
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
shell: bash -euxo pipefail {0}
- name: steps::cache_rust_dependencies_namespace
uses: namespacelabs/nscloud-cache-action@v1
with:
cache: rust
- name: steps::setup_linux
run: ./script/linux
shell: bash -euxo pipefail {0}
- name: steps::install_mold
run: ./script/install-mold
shell: bash -euxo pipefail {0}
- name: steps::cargo_install_nextest
run: cargo install cargo-nextest --locked
shell: bash -euxo pipefail {0}
- name: steps::clear_target_dir_if_large
run: ./script/clear-target-dir-if-larger-than 250
shell: bash -euxo pipefail {0}
- name: ./script/run-unit-evals
run: ./script/run-unit-evals
shell: bash -euxo pipefail {0}
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
- name: run_agent_evals::unit_evals::send_failure_to_slack
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
- name: steps::cleanup_cargo_config
if: always()
run: |
rm -rf ./../.cargo
shell: bash -euxo pipefail {0}
concurrency:
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true


@@ -1,21 +0,0 @@
name: Script
on:
pull_request:
paths:
- "script/**"
push:
branches:
- main
jobs:
shellcheck:
name: "ShellCheck Scripts"
if: github.repository_owner == 'zed-industries'
runs-on: namespace-profile-2x4-ubuntu-2404
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- name: Shellcheck ./scripts
run: |
./script/shellcheck-scripts error


@@ -1,86 +0,0 @@
name: Run Unit Evals
on:
schedule:
# GitHub might drop jobs at busy times, so we choose a random time in the middle of the night.
- cron: "47 1 * * 2"
workflow_dispatch:
concurrency:
# Allow only one workflow per any non-`main` branch.
group: ${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}
cancel-in-progress: true
env:
CARGO_TERM_COLOR: always
CARGO_INCREMENTAL: 0
RUST_BACKTRACE: 1
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
jobs:
unit_evals:
if: github.repository_owner == 'zed-industries'
timeout-minutes: 60
name: Run unit evals
runs-on:
- namespace-profile-16x32-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
clean: false
- name: Cache dependencies
uses: swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2
with:
save-if: ${{ github.ref == 'refs/heads/main' }}
# cache-provider: "buildjet"
- name: Install Linux dependencies
run: ./script/linux
- name: Configure CI
run: |
mkdir -p ./../.cargo
cp ./.cargo/ci-config.toml ./../.cargo/config.toml
- name: Install Rust
shell: bash -euxo pipefail {0}
run: |
cargo install cargo-nextest --locked
- name: Install Node
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4
with:
node-version: "18"
- name: Limit target directory size
shell: bash -euxo pipefail {0}
run: script/clear-target-dir-if-larger-than 100
- name: Run unit evals
shell: bash -euxo pipefail {0}
run: cargo nextest run --workspace --no-fail-fast --features unit-eval --no-capture -E 'test(::eval_)'
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
- name: Send failure message to Slack channel if needed
if: ${{ failure() }}
uses: slackapi/slack-github-action@b0fa283ad8fea605de13dc3f449259339835fc52
with:
method: chat.postMessage
token: ${{ secrets.SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN }}
payload: |
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
# Since the Linux runner is not stateful, in theory there is no need to do this cleanup.
# But, to avoid potential issues in the future if we choose to use a stateful Linux runner and forget to add code
# to clean up the config file, I've included the cleanup code here as a precaution.
# While it's not strictly necessary at this moment, I believe it's better to err on the side of caution.
- name: Clean CI config file
if: always()
run: rm -rf ./../.cargo

Cargo.lock

@@ -5832,8 +5832,6 @@ name = "extension"
version = "0.1.0"
dependencies = [
"anyhow",
"async-compression",
"async-tar",
"async-trait",
"collections",
"dap",
@@ -6957,7 +6955,7 @@ dependencies = [
[[package]]
name = "gh-workflow"
version = "0.8.0"
source = "git+https://github.com/zed-industries/gh-workflow?rev=0090c6b6ef82fff02bc8616645953e778d1acc08#0090c6b6ef82fff02bc8616645953e778d1acc08"
source = "git+https://github.com/zed-industries/gh-workflow?rev=3eaa84abca0778eb54272f45a312cb24f9a0b435#3eaa84abca0778eb54272f45a312cb24f9a0b435"
dependencies = [
"async-trait",
"derive_more 2.0.1",
@@ -6974,7 +6972,7 @@ dependencies = [
[[package]]
name = "gh-workflow-macros"
version = "0.8.0"
source = "git+https://github.com/zed-industries/gh-workflow?rev=0090c6b6ef82fff02bc8616645953e778d1acc08#0090c6b6ef82fff02bc8616645953e778d1acc08"
source = "git+https://github.com/zed-industries/gh-workflow?rev=3eaa84abca0778eb54272f45a312cb24f9a0b435#3eaa84abca0778eb54272f45a312cb24f9a0b435"
dependencies = [
"heck 0.5.0",
"quote",
@@ -16494,7 +16492,7 @@ dependencies = [
"editor",
"file_icons",
"gpui",
"multi_buffer",
"language",
"ui",
"workspace",
]
@@ -20930,6 +20928,7 @@ dependencies = [
"gh-workflow",
"indexmap 2.11.4",
"indoc",
"serde",
"toml 0.8.23",
"toml_edit 0.22.27",
]
@@ -21114,7 +21113,7 @@ dependencies = [
[[package]]
name = "zed"
version = "0.211.0"
version = "0.211.5"
dependencies = [
"acp_tools",
"activity_indicator",
@@ -21206,6 +21205,7 @@ dependencies = [
"project_symbols",
"prompt_store",
"proto",
"rayon",
"recent_projects",
"release_channel",
"remote",


@@ -508,7 +508,7 @@ fork = "0.2.0"
futures = "0.3"
futures-batch = "0.6.1"
futures-lite = "1.13"
gh-workflow = { git = "https://github.com/zed-industries/gh-workflow", rev = "0090c6b6ef82fff02bc8616645953e778d1acc08" }
gh-workflow = { git = "https://github.com/zed-industries/gh-workflow", rev = "3eaa84abca0778eb54272f45a312cb24f9a0b435" }
git2 = { version = "0.20.1", default-features = false }
globset = "0.4"
handlebars = "4.3"


@@ -592,7 +592,7 @@
// to both the horizontal and vertical delta values while scrolling. Fast scrolling
// happens when a user holds the alt or option key while scrolling.
"fast_scroll_sensitivity": 4.0,
"relative_line_numbers": "disabled",
"relative_line_numbers": false,
// If 'search_wrap' is disabled, search results do not wrap around the end of the file.
"search_wrap": true,
// Search options to enable by default when opening new project and buffer searches.


@@ -178,6 +178,7 @@ impl AcpConnection {
meta: Some(serde_json::json!({
// Experimental: Allow for rendering terminal output from the agents
"terminal_output": true,
"terminal-auth": true,
})),
},
client_info: Some(acp::Implementation {


@@ -1,4 +1,5 @@
use crate::{
ChatWithFollow,
acp::completion_provider::{ContextPickerCompletionProvider, SlashCommandCompletion},
context_picker::{ContextPickerAction, fetch_context_picker::fetch_url_content},
};
@@ -49,7 +50,7 @@ use text::OffsetRangeExt;
use theme::ThemeSettings;
use ui::{ButtonLike, TintColor, Toggleable, prelude::*};
use util::{ResultExt, debug_panic, rel_path::RelPath};
use workspace::{Workspace, notifications::NotifyResultExt as _};
use workspace::{CollaboratorId, Workspace, notifications::NotifyResultExt as _};
use zed_actions::agent::Chat;
pub struct MessageEditor {
@@ -813,6 +814,21 @@ impl MessageEditor {
self.send(cx);
}
fn chat_with_follow(
&mut self,
_: &ChatWithFollow,
window: &mut Window,
cx: &mut Context<Self>,
) {
self.workspace
.update(cx, |this, cx| {
this.follow(CollaboratorId::Agent, window, cx)
})
.log_err();
self.send(cx);
}
fn cancel(&mut self, _: &editor::actions::Cancel, _: &mut Window, cx: &mut Context<Self>) {
cx.emit(MessageEditorEvent::Cancel)
}
@@ -1276,6 +1292,7 @@ impl Render for MessageEditor {
div()
.key_context("MessageEditor")
.on_action(cx.listener(Self::chat))
.on_action(cx.listener(Self::chat_with_follow))
.on_action(cx.listener(Self::cancel))
.capture_action(cx.listener(Self::paste))
.flex_1()


@@ -1469,6 +1469,114 @@ impl AcpThreadView {
return;
};
// Check for the experimental "terminal-auth" _meta field
let auth_method = connection.auth_methods().iter().find(|m| m.id == method);
if let Some(auth_method) = auth_method {
if let Some(meta) = &auth_method.meta {
if let Some(terminal_auth) = meta.get("terminal-auth") {
// Extract terminal auth details from meta
if let (Some(command), Some(label)) = (
terminal_auth.get("command").and_then(|v| v.as_str()),
terminal_auth.get("label").and_then(|v| v.as_str()),
) {
let args = terminal_auth
.get("args")
.and_then(|v| v.as_array())
.map(|arr| {
arr.iter()
.filter_map(|v| v.as_str().map(String::from))
.collect()
})
.unwrap_or_default();
let env = terminal_auth
.get("env")
.and_then(|v| v.as_object())
.map(|obj| {
obj.iter()
.filter_map(|(k, v)| {
v.as_str().map(|val| (k.clone(), val.to_string()))
})
.collect::<HashMap<String, String>>()
})
.unwrap_or_default();
// Run SpawnInTerminal in the same dir as the ACP server
let cwd = connection
.clone()
.downcast::<agent_servers::AcpConnection>()
.map(|acp_conn| acp_conn.root_dir().to_path_buf());
// Build SpawnInTerminal from _meta
let login = task::SpawnInTerminal {
id: task::TaskId(format!("external-agent-{}-login", label)),
full_label: label.to_string(),
label: label.to_string(),
command: Some(command.to_string()),
args,
command_label: label.to_string(),
cwd,
env,
use_new_terminal: true,
allow_concurrent_runs: true,
hide: task::HideStrategy::Always,
..Default::default()
};
self.thread_error.take();
configuration_view.take();
pending_auth_method.replace(method.clone());
if let Some(workspace) = self.workspace.upgrade() {
let project = self.project.clone();
let authenticate = Self::spawn_external_agent_login(
login, workspace, project, false, true, window, cx,
);
cx.notify();
self.auth_task = Some(cx.spawn_in(window, {
let agent = self.agent.clone();
async move |this, cx| {
let result = authenticate.await;
match &result {
Ok(_) => telemetry::event!(
"Authenticate Agent Succeeded",
agent = agent.telemetry_id()
),
Err(_) => {
telemetry::event!(
"Authenticate Agent Failed",
agent = agent.telemetry_id(),
)
}
}
this.update_in(cx, |this, window, cx| {
if let Err(err) = result {
if let ThreadState::Unauthenticated {
pending_auth_method,
..
} = &mut this.thread_state
{
pending_auth_method.take();
}
this.handle_thread_error(err, cx);
} else {
this.reset(window, cx);
}
this.auth_task.take()
})
.ok();
}
}));
}
return;
}
}
}
}
if method.0.as_ref() == "gemini-api-key" {
let registry = LanguageModelRegistry::global(cx);
let provider = registry
@@ -1567,7 +1675,10 @@ impl AcpThreadView {
&& let Some(login) = self.login.clone()
{
if let Some(workspace) = self.workspace.upgrade() {
Self::spawn_external_agent_login(login, workspace, false, window, cx)
let project = self.project.clone();
Self::spawn_external_agent_login(
login, workspace, project, false, false, window, cx,
)
} else {
Task::ready(Ok(()))
}
@@ -1617,17 +1728,40 @@ impl AcpThreadView {
fn spawn_external_agent_login(
login: task::SpawnInTerminal,
workspace: Entity<Workspace>,
project: Entity<Project>,
previous_attempt: bool,
check_exit_code: bool,
window: &mut Window,
cx: &mut App,
) -> Task<Result<()>> {
let Some(terminal_panel) = workspace.read(cx).panel::<TerminalPanel>(cx) else {
return Task::ready(Ok(()));
};
let project = workspace.read(cx).project().clone();
window.spawn(cx, async move |cx| {
let mut task = login.clone();
if let Some(cmd) = &task.command {
// Have "node" command use Zed's managed Node runtime by default
if cmd == "node" {
let resolved_node_runtime = project
.update(cx, |project, cx| {
let agent_server_store = project.agent_server_store().clone();
agent_server_store.update(cx, |store, cx| {
store.node_runtime().map(|node_runtime| {
cx.background_spawn(async move {
node_runtime.binary_path().await
})
})
})
});
if let Ok(Some(resolve_task)) = resolved_node_runtime {
if let Ok(node_path) = resolve_task.await {
task.command = Some(node_path.to_string_lossy().to_string());
}
}
}
}
task.shell = task::Shell::WithArguments {
program: task.command.take().expect("login command should be set"),
args: std::mem::take(&mut task.args),
@@ -1645,44 +1779,65 @@ impl AcpThreadView {
})?;
let terminal = terminal.await?;
let mut exit_status = terminal
.read_with(cx, |terminal, cx| terminal.wait_for_completed_task(cx))?
.fuse();
let logged_in = cx
.spawn({
let terminal = terminal.clone();
async move |cx| {
loop {
cx.background_executor().timer(Duration::from_secs(1)).await;
let content =
terminal.update(cx, |terminal, _cx| terminal.get_content())?;
if content.contains("Login successful")
|| content.contains("Type your message")
{
return anyhow::Ok(());
if check_exit_code {
// For extension-based auth, wait for the process to exit and check exit code
let exit_status = terminal
.read_with(cx, |terminal, cx| terminal.wait_for_completed_task(cx))?
.await;
match exit_status {
Some(status) if status.success() => {
Ok(())
}
Some(status) => {
Err(anyhow!("Login command failed with exit code: {:?}", status.code()))
}
None => {
Err(anyhow!("Login command terminated without exit status"))
}
}
} else {
// For hardcoded agents (claude-login, gemini-cli): look for specific output
let mut exit_status = terminal
.read_with(cx, |terminal, cx| terminal.wait_for_completed_task(cx))?
.fuse();
let logged_in = cx
.spawn({
let terminal = terminal.clone();
async move |cx| {
loop {
cx.background_executor().timer(Duration::from_secs(1)).await;
let content =
terminal.update(cx, |terminal, _cx| terminal.get_content())?;
if content.contains("Login successful")
|| content.contains("Type your message")
{
return anyhow::Ok(());
}
}
}
})
.fuse();
futures::pin_mut!(logged_in);
futures::select_biased! {
result = logged_in => {
if let Err(e) = result {
log::error!("{e}");
return Err(anyhow!("exited before logging in"));
}
}
})
.fuse();
futures::pin_mut!(logged_in);
futures::select_biased! {
result = logged_in => {
if let Err(e) = result {
log::error!("{e}");
_ = exit_status => {
if !previous_attempt && project.read_with(cx, |project, _| project.is_via_remote_server())? && login.label.contains("gemini") {
return cx.update(|window, cx| Self::spawn_external_agent_login(login, workspace, project.clone(), true, false, window, cx))?.await
}
return Err(anyhow!("exited before logging in"));
}
}
_ = exit_status => {
if !previous_attempt && project.read_with(cx, |project, _| project.is_via_remote_server())? && login.label.contains("gemini") {
return cx.update(|window, cx| Self::spawn_external_agent_login(login, workspace, true, window, cx))?.await
}
return Err(anyhow!("exited before logging in"));
}
terminal.update(cx, |terminal, _| terminal.kill_active_task())?;
Ok(())
}
terminal.update(cx, |terminal, _| terminal.kill_active_task())?;
Ok(())
})
}


@@ -2040,7 +2040,7 @@ impl AgentPanel {
let mut entry =
ContextMenuEntry::new(format!("New {} Thread", agent_name));
if let Some(icon_path) = icon_path {
entry = entry.custom_icon_path(icon_path);
entry = entry.custom_icon_svg(icon_path);
} else {
entry = entry.icon(IconName::Terminal);
}
@@ -2109,7 +2109,7 @@ impl AgentPanel {
.when_some(selected_agent_custom_icon, |this, icon_path| {
let label = selected_agent_label.clone();
this.px(DynamicSpacing::Base02.rems(cx))
.child(Icon::from_path(icon_path).color(Color::Muted))
.child(Icon::from_external_svg(icon_path).color(Color::Muted))
.tooltip(move |_window, cx| {
Tooltip::with_meta(label.clone(), None, "Selected Agent", cx)
})


@@ -37,6 +37,7 @@ use std::{
Arc,
atomic::{self, AtomicBool, AtomicUsize},
},
time::Duration,
};
use text::Point;
use util::{path, rel_path::rel_path, uri};
@@ -1813,14 +1814,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
settings.project.all_languages.defaults.inlay_hints =
Some(InlayHintSettingsContent {
enabled: Some(true),
show_value_hints: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(false),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
});
@@ -1830,15 +1824,8 @@ async fn test_mutual_editor_inlay_hint_cache_update(
store.update_user_settings(cx, |settings| {
settings.project.all_languages.defaults.inlay_hints =
Some(InlayHintSettingsContent {
show_value_hints: Some(true),
enabled: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(false),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
});
@@ -1931,6 +1918,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
let fake_language_server = fake_language_servers.next().await.unwrap();
let editor_a = file_a.await.unwrap().downcast::<Editor>().unwrap();
executor.advance_clock(Duration::from_millis(100));
executor.run_until_parked();
let initial_edit = edits_made.load(atomic::Ordering::Acquire);
@@ -1951,6 +1939,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
.downcast::<Editor>()
.unwrap();
executor.advance_clock(Duration::from_millis(100));
executor.run_until_parked();
editor_b.update(cx_b, |editor, cx| {
assert_eq!(
@@ -1969,6 +1958,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
cx_b.focus(&editor_b);
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(
@@ -1992,6 +1982,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
});
cx_a.focus(&editor_a);
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(
@@ -2013,6 +2004,7 @@ async fn test_mutual_editor_inlay_hint_cache_update(
.into_response()
.expect("inlay refresh request failed");
executor.advance_clock(Duration::from_secs(1));
executor.run_until_parked();
editor_a.update(cx_a, |editor, cx| {
assert_eq!(


@@ -489,7 +489,11 @@ impl Copilot {
let node_path = node_runtime.binary_path().await?;
ensure_node_version_for_copilot(&node_path).await?;
let arguments: Vec<OsString> = vec![server_path.into(), "--stdio".into()];
let arguments: Vec<OsString> = vec![
"--experimental-sqlite".into(),
server_path.into(),
"--stdio".into(),
];
let binary = LanguageServerBinary {
path: node_path,
arguments,

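The `--experimental-sqlite` fix above exists because Copilot's LSP started using the `node:sqlite` module, which is experimental in Node v22-v23 and stable in v24. A hedged local probe (assumes a `node` binary on PATH; the flag name is taken from the diff, and results depend entirely on the installed Node version):

```shell
# Check whether this Node can load node:sqlite, and whether it needs
# the --experimental-sqlite flag that Zed now passes to the Copilot LSP.
if node -e "require('node:sqlite')" >/dev/null 2>&1; then
  SQLITE_STATUS="builtin"
elif node --experimental-sqlite -e "require('node:sqlite')" >/dev/null 2>&1; then
  SQLITE_STATUS="needs-flag"
else
  SQLITE_STATUS="unavailable"
fi
echo "node:sqlite: $SQLITE_STATUS"
```

Passing the flag unconditionally is safe because newer Nodes that ship `node:sqlite` as stable simply ignore it, which matches the PR's report that the fix helped v20 without breaking v24.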

@@ -1017,7 +1017,6 @@ impl Iterator for WrapRows<'_> {
multibuffer_row: None,
diff_status,
expand_info: None,
wrapped_buffer_row: buffer_row.buffer_row,
}
} else {
buffer_row


@@ -163,10 +163,7 @@ use rpc::{ErrorCode, ErrorExt, proto::PeerId};
use scroll::{Autoscroll, OngoingScroll, ScrollAnchor, ScrollManager};
use selections_collection::{MutableSelectionsCollection, SelectionsCollection};
use serde::{Deserialize, Serialize};
use settings::{
GitGutterSetting, RelativeLineNumbers, Settings, SettingsLocation, SettingsStore,
update_settings_file,
};
use settings::{GitGutterSetting, Settings, SettingsLocation, SettingsStore, update_settings_file};
use smallvec::{SmallVec, smallvec};
use snippet::Snippet;
use std::{
@@ -1836,9 +1833,15 @@ impl Editor {
project::Event::RefreshCodeLens => {
// we always query lens with actions, without storing them, always refreshing them
}
project::Event::RefreshInlayHints(server_id) => {
project::Event::RefreshInlayHints {
server_id,
request_id,
} => {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(*server_id),
InlayHintRefreshReason::RefreshRequested {
server_id: *server_id,
request_id: *request_id,
},
cx,
);
}
@@ -17782,6 +17785,7 @@ impl Editor {
.unwrap_or(self.diagnostics_max_severity);
if !self.inline_diagnostics_enabled()
|| !self.diagnostics_enabled()
|| !self.show_inline_diagnostics
|| max_severity == DiagnosticSeverity::Off
{
@@ -17860,7 +17864,7 @@ impl Editor {
window: &Window,
cx: &mut Context<Self>,
) -> Option<()> {
if self.ignore_lsp_data() {
if self.ignore_lsp_data() || !self.diagnostics_enabled() {
return None;
}
let pull_diagnostics_settings = ProjectSettings::get_global(cx)
@@ -19476,16 +19480,9 @@ impl Editor {
EditorSettings::get_global(cx).gutter.line_numbers
}
pub fn relative_line_numbers(&self, cx: &mut App) -> RelativeLineNumbers {
match (
self.use_relative_line_numbers,
EditorSettings::get_global(cx).relative_line_numbers,
) {
(None, setting) => setting,
(Some(false), _) => RelativeLineNumbers::Disabled,
(Some(true), RelativeLineNumbers::Wrapped) => RelativeLineNumbers::Wrapped,
(Some(true), _) => RelativeLineNumbers::Enabled,
}
pub fn should_use_relative_line_numbers(&self, cx: &mut App) -> bool {
self.use_relative_line_numbers
.unwrap_or(EditorSettings::get_global(cx).relative_line_numbers)
}
pub fn toggle_relative_line_numbers(
@@ -19494,8 +19491,8 @@ impl Editor {
_: &mut Window,
cx: &mut Context<Self>,
) {
let is_relative = self.relative_line_numbers(cx);
self.set_relative_line_number(Some(!is_relative.enabled()), cx)
let is_relative = self.should_use_relative_line_numbers(cx);
self.set_relative_line_number(Some(!is_relative), cx)
}
pub fn set_relative_line_number(&mut self, is_relative: Option<bool>, cx: &mut Context<Self>) {


@@ -3,12 +3,12 @@ use core::num;
use gpui::App;
use language::CursorShape;
use project::project_settings::DiagnosticSeverity;
use settings::Settings;
pub use settings::{
CurrentLineHighlight, DelayMs, DisplayIn, DocumentColorsRenderMode, DoubleClickInMultibuffer,
GoToDefinitionFallback, HideMouseMode, MinimapThumb, MinimapThumbBorder, MultiCursorModifier,
ScrollBeyondLastLine, ScrollbarDiagnostics, SeedQuerySetting, ShowMinimap, SnippetSortOrder,
};
use settings::{RelativeLineNumbers, Settings};
use ui::scrollbars::{ScrollbarVisibility, ShowScrollbar};
/// Imports from the VSCode settings at
@@ -33,7 +33,7 @@ pub struct EditorSettings {
pub horizontal_scroll_margin: f32,
pub scroll_sensitivity: f32,
pub fast_scroll_sensitivity: f32,
pub relative_line_numbers: RelativeLineNumbers,
pub relative_line_numbers: bool,
pub seed_search_query_from_cursor: SeedQuerySetting,
pub use_smartcase_search: bool,
pub multi_cursor_modifier: MultiCursorModifier,


@@ -764,14 +764,8 @@ impl EditorElement {
.row;
if line_numbers
.get(&MultiBufferRow(multi_buffer_row))
.is_some_and(|line_layout| {
line_layout.segments.iter().any(|segment| {
segment
.hitbox
.as_ref()
.is_some_and(|hitbox| hitbox.contains(&event.position))
})
})
.and_then(|line_number| line_number.hitbox.as_ref())
.is_some_and(|hitbox| hitbox.contains(&event.position))
{
let line_offset_from_top = display_row - position_map.scroll_position.y as u32;
@@ -3151,10 +3145,9 @@ impl EditorElement {
fn calculate_relative_line_numbers(
&self,
buffer_rows: &[RowInfo],
snapshot: &EditorSnapshot,
rows: &Range<DisplayRow>,
relative_to: Option<DisplayRow>,
count_wrapped_lines: bool,
) -> HashMap<DisplayRow, DisplayRowDelta> {
let mut relative_rows: HashMap<DisplayRow, DisplayRowDelta> = Default::default();
let Some(relative_to) = relative_to else {
@@ -3162,19 +3155,18 @@ impl EditorElement {
};
let start = rows.start.min(relative_to);
let end = rows.end.max(relative_to);
let buffer_rows = snapshot
.row_infos(start)
.take(1 + end.minus(start) as usize)
.collect::<Vec<_>>();
let head_idx = relative_to.minus(start);
let mut delta = 1;
let mut i = head_idx + 1;
let should_count_line = |row_info: &RowInfo| {
if count_wrapped_lines {
row_info.buffer_row.is_some() || row_info.wrapped_buffer_row.is_some()
} else {
row_info.buffer_row.is_some()
}
};
while i < buffer_rows.len() as u32 {
if should_count_line(&buffer_rows[i as usize]) {
if buffer_rows[i as usize].buffer_row.is_some() {
if rows.contains(&DisplayRow(i + start.0)) {
relative_rows.insert(DisplayRow(i + start.0), delta);
}
@@ -3184,13 +3176,13 @@ impl EditorElement {
}
delta = 1;
i = head_idx.min(buffer_rows.len().saturating_sub(1) as u32);
while i > 0 && buffer_rows[i as usize].buffer_row.is_none() && !count_wrapped_lines {
while i > 0 && buffer_rows[i as usize].buffer_row.is_none() {
i -= 1;
}
while i > 0 {
i -= 1;
if should_count_line(&buffer_rows[i as usize]) {
if buffer_rows[i as usize].buffer_row.is_some() {
if rows.contains(&DisplayRow(i + start.0)) {
relative_rows.insert(DisplayRow(i + start.0), delta);
}
@@ -3222,7 +3214,7 @@ impl EditorElement {
return Arc::default();
}
let (newest_selection_head, relative) = self.editor.update(cx, |editor, cx| {
let (newest_selection_head, is_relative) = self.editor.update(cx, |editor, cx| {
let newest_selection_head = newest_selection_head.unwrap_or_else(|| {
let newest = editor
.selections
@@ -3238,97 +3230,79 @@ impl EditorElement {
)
.head
});
let relative = editor.relative_line_numbers(cx);
(newest_selection_head, relative)
let is_relative = editor.should_use_relative_line_numbers(cx);
(newest_selection_head, is_relative)
});
let relative_to = if relative.enabled() {
let relative_to = if is_relative {
Some(newest_selection_head.row())
} else {
None
};
let relative_rows = self.calculate_relative_line_numbers(
&buffer_rows,
&rows,
relative_to,
relative.wrapped(),
);
let relative_rows = self.calculate_relative_line_numbers(snapshot, &rows, relative_to);
let mut line_number = String::new();
let segments = buffer_rows.iter().enumerate().flat_map(|(ix, row_info)| {
let display_row = DisplayRow(rows.start.0 + ix as u32);
line_number.clear();
let non_relative_number = if relative.wrapped() {
row_info.buffer_row.or(row_info.wrapped_buffer_row)? + 1
} else {
row_info.buffer_row? + 1
};
let number = relative_rows
.get(&display_row)
.unwrap_or(&non_relative_number);
write!(&mut line_number, "{number}").unwrap();
if row_info
.diff_status
.is_some_and(|status| status.is_deleted())
{
return None;
}
let line_numbers = buffer_rows
.iter()
.enumerate()
.flat_map(|(ix, row_info)| {
let display_row = DisplayRow(rows.start.0 + ix as u32);
line_number.clear();
let non_relative_number = row_info.buffer_row? + 1;
let number = relative_rows
.get(&display_row)
.unwrap_or(&non_relative_number);
write!(&mut line_number, "{number}").unwrap();
if row_info
.diff_status
.is_some_and(|status| status.is_deleted())
{
return None;
}
let color = active_rows
.get(&display_row)
.map(|spec| {
if spec.breakpoint {
cx.theme().colors().debugger_accent
} else {
cx.theme().colors().editor_active_line_number
}
})
.unwrap_or_else(|| cx.theme().colors().editor_line_number);
let shaped_line =
self.shape_line_number(SharedString::from(&line_number), color, window);
let scroll_top = scroll_position.y * ScrollPixelOffset::from(line_height);
let line_origin = gutter_hitbox.map(|hitbox| {
hitbox.origin
+ point(
hitbox.size.width - shaped_line.width - gutter_dimensions.right_padding,
ix as f32 * line_height
- Pixels::from(scroll_top % ScrollPixelOffset::from(line_height)),
let color = active_rows
.get(&display_row)
.map(|spec| {
if spec.breakpoint {
cx.theme().colors().debugger_accent
} else {
cx.theme().colors().editor_active_line_number
}
})
.unwrap_or_else(|| cx.theme().colors().editor_line_number);
let shaped_line =
self.shape_line_number(SharedString::from(&line_number), color, window);
let scroll_top = scroll_position.y * ScrollPixelOffset::from(line_height);
let line_origin = gutter_hitbox.map(|hitbox| {
hitbox.origin
+ point(
hitbox.size.width - shaped_line.width - gutter_dimensions.right_padding,
ix as f32 * line_height
- Pixels::from(scroll_top % ScrollPixelOffset::from(line_height)),
)
});
#[cfg(not(test))]
let hitbox = line_origin.map(|line_origin| {
window.insert_hitbox(
Bounds::new(line_origin, size(shaped_line.width, line_height)),
HitboxBehavior::Normal,
)
});
});
#[cfg(test)]
let hitbox = {
let _ = line_origin;
None
};
#[cfg(not(test))]
let hitbox = line_origin.map(|line_origin| {
window.insert_hitbox(
Bounds::new(line_origin, size(shaped_line.width, line_height)),
HitboxBehavior::Normal,
)
});
#[cfg(test)]
let hitbox = {
let _ = line_origin;
None
};
let segment = LineNumberSegment {
shaped_line,
hitbox,
};
let buffer_row = DisplayPoint::new(display_row, 0).to_point(snapshot).row;
let multi_buffer_row = MultiBufferRow(buffer_row);
Some((multi_buffer_row, segment))
});
let mut line_numbers: HashMap<MultiBufferRow, LineNumberLayout> = HashMap::default();
for (buffer_row, segment) in segments {
line_numbers
.entry(buffer_row)
.or_insert_with(|| LineNumberLayout {
segments: Default::default(),
})
.segments
.push(segment);
}
let multi_buffer_row = DisplayPoint::new(display_row, 0).to_point(snapshot).row;
let multi_buffer_row = MultiBufferRow(multi_buffer_row);
let line_number = LineNumberLayout {
shaped_line,
hitbox,
};
Some((multi_buffer_row, line_number))
})
.collect();
Arc::new(line_numbers)
}
@@ -5866,36 +5840,34 @@ impl EditorElement {
let line_height = layout.position_map.line_height;
window.set_cursor_style(CursorStyle::Arrow, &layout.gutter_hitbox);
for line_layout in layout.line_numbers.values() {
for LineNumberSegment {
shaped_line,
hitbox,
} in &line_layout.segments
{
let Some(hitbox) = hitbox else {
continue;
};
for LineNumberLayout {
shaped_line,
hitbox,
} in layout.line_numbers.values()
{
let Some(hitbox) = hitbox else {
continue;
};
let Some(()) = (if !is_singleton && hitbox.is_hovered(window) {
let color = cx.theme().colors().editor_hover_line_number;
let Some(()) = (if !is_singleton && hitbox.is_hovered(window) {
let color = cx.theme().colors().editor_hover_line_number;
let line = self.shape_line_number(shaped_line.text.clone(), color, window);
line.paint(hitbox.origin, line_height, window, cx).log_err()
} else {
shaped_line
.paint(hitbox.origin, line_height, window, cx)
.log_err()
}) else {
continue;
};
let line = self.shape_line_number(shaped_line.text.clone(), color, window);
line.paint(hitbox.origin, line_height, window, cx).log_err()
} else {
shaped_line
.paint(hitbox.origin, line_height, window, cx)
.log_err()
}) else {
continue;
};
// In singleton buffers, we select corresponding lines on the line number click, so use | -like cursor.
// In multi buffers, we open file at the line number clicked, so use a pointing hand cursor.
if is_singleton {
window.set_cursor_style(CursorStyle::IBeam, hitbox);
} else {
window.set_cursor_style(CursorStyle::PointingHand, hitbox);
}
// In singleton buffers, we select corresponding lines on the line number click, so use | -like cursor.
// In multi buffers, we open file at the line number clicked, so use a pointing hand cursor.
if is_singleton {
window.set_cursor_style(CursorStyle::IBeam, hitbox);
} else {
window.set_cursor_style(CursorStyle::PointingHand, hitbox);
}
}
}
@@ -9808,17 +9780,11 @@ impl EditorLayout {
}
}
#[derive(Debug)]
struct LineNumberSegment {
struct LineNumberLayout {
shaped_line: ShapedLine,
hitbox: Option<Hitbox>,
}
#[derive(Debug)]
struct LineNumberLayout {
segments: SmallVec<[LineNumberSegment; 1]>,
}
struct ColoredRange<T> {
start: T,
end: T,
@@ -10875,21 +10841,13 @@ mod tests {
.unwrap();
assert_eq!(layouts.len(), 6);
let get_row_infos = |snapshot: &EditorSnapshot| {
snapshot
.row_infos(DisplayRow(0))
.take(6)
.collect::<Vec<RowInfo>>()
};
let relative_rows = window
.update(cx, |editor, window, cx| {
let snapshot = editor.snapshot(window, cx);
element.calculate_relative_line_numbers(
&get_row_infos(&snapshot),
&snapshot,
&(DisplayRow(0)..DisplayRow(6)),
Some(DisplayRow(3)),
false,
)
})
.unwrap();
@@ -10905,10 +10863,9 @@ mod tests {
.update(cx, |editor, window, cx| {
let snapshot = editor.snapshot(window, cx);
element.calculate_relative_line_numbers(
&get_row_infos(&snapshot),
&snapshot,
&(DisplayRow(3)..DisplayRow(6)),
Some(DisplayRow(1)),
false,
)
})
.unwrap();
@@ -10922,10 +10879,9 @@ mod tests {
.update(cx, |editor, window, cx| {
let snapshot = editor.snapshot(window, cx);
element.calculate_relative_line_numbers(
&get_row_infos(&snapshot),
&snapshot,
&(DisplayRow(0)..DisplayRow(3)),
Some(DisplayRow(6)),
false,
)
})
.unwrap();
@@ -10935,88 +10891,6 @@ mod tests {
assert_eq!(relative_rows[&DisplayRow(2)], 3);
}
#[gpui::test]
fn test_shape_line_numbers_wrapping(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let window = cx.add_window(|window, cx| {
let buffer = MultiBuffer::build_simple(&sample_text(6, 6, 'a'), cx);
Editor::new(EditorMode::full(), buffer, None, window, cx)
});
update_test_language_settings(cx, |s| {
s.defaults.preferred_line_length = Some(5_u32);
s.defaults.soft_wrap = Some(language_settings::SoftWrap::PreferredLineLength);
});
let editor = window.root(cx).unwrap();
let style = cx.update(|cx| editor.read(cx).style().unwrap().clone());
let line_height = window
.update(cx, |_, window, _| {
style.text.line_height_in_pixels(window.rem_size())
})
.unwrap();
let element = EditorElement::new(&editor, style);
let snapshot = window
.update(cx, |editor, window, cx| editor.snapshot(window, cx))
.unwrap();
let layouts = cx
.update_window(*window, |_, window, cx| {
element.layout_line_numbers(
None,
GutterDimensions {
left_padding: Pixels::ZERO,
right_padding: Pixels::ZERO,
width: px(30.0),
margin: Pixels::ZERO,
git_blame_entries_width: None,
},
line_height,
gpui::Point::default(),
DisplayRow(0)..DisplayRow(6),
&(0..6)
.map(|row| RowInfo {
buffer_row: Some(row),
..Default::default()
})
.collect::<Vec<_>>(),
&BTreeMap::default(),
Some(DisplayPoint::new(DisplayRow(0), 0)),
&snapshot,
window,
cx,
)
})
.unwrap();
assert_eq!(layouts.len(), 3);
let relative_rows = window
.update(cx, |editor, window, cx| {
let snapshot = editor.snapshot(window, cx);
let start_row = DisplayRow(0);
let end_row = DisplayRow(6);
let row_infos = snapshot
.row_infos(start_row)
.take((start_row..end_row).len())
.collect::<Vec<RowInfo>>();
element.calculate_relative_line_numbers(
&row_infos,
&(DisplayRow(0)..DisplayRow(6)),
Some(DisplayRow(3)),
true,
)
})
.unwrap();
assert_eq!(relative_rows[&DisplayRow(0)], 3);
assert_eq!(relative_rows[&DisplayRow(1)], 2);
assert_eq!(relative_rows[&DisplayRow(2)], 1);
// current line has no relative number
assert_eq!(relative_rows[&DisplayRow(4)], 1);
assert_eq!(relative_rows[&DisplayRow(5)], 2);
}
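The assertions in the removed test encode vim-style relative numbering: every row shows its distance from the current row, and the current row itself gets no relative number. A minimal standalone sketch of that rule (a deliberate simplification of `calculate_relative_line_numbers` that ignores soft wrap and multi-buffer rows):

```rust
use std::collections::HashMap;

/// Vim-style relative numbers: each row's distance from the current row;
/// the current row itself gets no entry.
fn relative_numbers(rows: std::ops::Range<u32>, current: u32) -> HashMap<u32, u32> {
    rows.filter(|row| *row != current)
        .map(|row| (row, current.abs_diff(row)))
        .collect()
}

fn main() {
    let rel = relative_numbers(0..6, 3);
    assert_eq!(rel[&0], 3);
    assert_eq!(rel[&1], 2);
    assert_eq!(rel[&2], 1);
    assert_eq!(rel.get(&3), None); // current line has no relative number
    assert_eq!(rel[&4], 1);
    assert_eq!(rel[&5], 2);
}
```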
#[gpui::test]
async fn test_vim_visual_selections(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -11130,13 +11004,7 @@ mod tests {
state
.line_numbers
.get(&MultiBufferRow(0))
.map(|line_number| line_number
.segments
.first()
.unwrap()
.shaped_line
.text
.as_ref()),
.map(|line_number| line_number.shaped_line.text.as_ref()),
Some("1")
);
}

View File

@@ -1,5 +1,4 @@
use std::{
collections::hash_map,
ops::{ControlFlow, Range},
time::Duration,
};
@@ -49,8 +48,8 @@ pub struct LspInlayHintData {
allowed_hint_kinds: HashSet<Option<InlayHintKind>>,
invalidate_debounce: Option<Duration>,
append_debounce: Option<Duration>,
hint_refresh_tasks: HashMap<BufferId, HashMap<Vec<Range<BufferRow>>, Vec<Task<()>>>>,
hint_chunk_fetched: HashMap<BufferId, (Global, HashSet<Range<BufferRow>>)>,
hint_refresh_tasks: HashMap<BufferId, Vec<Task<()>>>,
hint_chunk_fetching: HashMap<BufferId, (Global, HashSet<Range<BufferRow>>)>,
invalidate_hints_for_buffers: HashSet<BufferId>,
pub added_hints: HashMap<InlayId, Option<InlayHintKind>>,
}
@@ -63,7 +62,7 @@ impl LspInlayHintData {
enabled_in_settings: settings.enabled,
hint_refresh_tasks: HashMap::default(),
added_hints: HashMap::default(),
hint_chunk_fetched: HashMap::default(),
hint_chunk_fetching: HashMap::default(),
invalidate_hints_for_buffers: HashSet::default(),
invalidate_debounce: debounce_value(settings.edit_debounce_ms),
append_debounce: debounce_value(settings.scroll_debounce_ms),
@@ -99,9 +98,8 @@ impl LspInlayHintData {
pub fn clear(&mut self) {
self.hint_refresh_tasks.clear();
self.hint_chunk_fetched.clear();
self.hint_chunk_fetching.clear();
self.added_hints.clear();
self.invalidate_hints_for_buffers.clear();
}
/// Checks inlay hint settings for enabled hint kinds and general enabled state.
@@ -199,7 +197,7 @@ impl LspInlayHintData {
) {
for buffer_id in removed_buffer_ids {
self.hint_refresh_tasks.remove(buffer_id);
self.hint_chunk_fetched.remove(buffer_id);
self.hint_chunk_fetching.remove(buffer_id);
}
}
}
@@ -211,7 +209,10 @@ pub enum InlayHintRefreshReason {
SettingsChange(InlayHintSettings),
NewLinesShown,
BufferEdited(BufferId),
RefreshRequested(LanguageServerId),
RefreshRequested {
server_id: LanguageServerId,
request_id: Option<usize>,
},
ExcerptsRemoved(Vec<ExcerptId>),
}
@@ -296,7 +297,7 @@ impl Editor {
| InlayHintRefreshReason::Toggle(_)
| InlayHintRefreshReason::SettingsChange(_) => true,
InlayHintRefreshReason::NewLinesShown
| InlayHintRefreshReason::RefreshRequested(_)
| InlayHintRefreshReason::RefreshRequested { .. }
| InlayHintRefreshReason::ExcerptsRemoved(_) => false,
InlayHintRefreshReason::BufferEdited(buffer_id) => {
let Some(affected_language) = self
@@ -370,48 +371,45 @@ impl Editor {
let Some(buffer) = multi_buffer.read(cx).buffer(buffer_id) else {
continue;
};
let fetched_tasks = inlay_hints.hint_chunk_fetched.entry(buffer_id).or_default();
let (fetched_for_version, fetched_chunks) = inlay_hints
.hint_chunk_fetching
.entry(buffer_id)
.or_default();
if visible_excerpts
.buffer_version
.changed_since(&fetched_tasks.0)
.changed_since(fetched_for_version)
{
fetched_tasks.1.clear();
fetched_tasks.0 = visible_excerpts.buffer_version.clone();
*fetched_for_version = visible_excerpts.buffer_version.clone();
fetched_chunks.clear();
inlay_hints.hint_refresh_tasks.remove(&buffer_id);
}
let applicable_chunks =
semantics_provider.applicable_inlay_chunks(&buffer, &visible_excerpts.ranges, cx);
let known_chunks = if ignore_previous_fetches {
None
} else {
Some((fetched_for_version.clone(), fetched_chunks.clone()))
};
match inlay_hints
let mut applicable_chunks =
semantics_provider.applicable_inlay_chunks(&buffer, &visible_excerpts.ranges, cx);
applicable_chunks.retain(|chunk| fetched_chunks.insert(chunk.clone()));
if applicable_chunks.is_empty() && !ignore_previous_fetches {
continue;
}
inlay_hints
.hint_refresh_tasks
.entry(buffer_id)
.or_default()
.entry(applicable_chunks)
{
hash_map::Entry::Occupied(mut o) => {
if invalidate_cache.should_invalidate() || ignore_previous_fetches {
o.get_mut().push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
ignore_previous_fetches,
debounce,
visible_excerpts,
cx,
));
}
}
hash_map::Entry::Vacant(v) => {
v.insert(Vec::new()).push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
ignore_previous_fetches,
debounce,
visible_excerpts,
cx,
));
}
}
.push(spawn_editor_hints_refresh(
buffer_id,
invalidate_cache,
debounce,
visible_excerpts,
known_chunks,
applicable_chunks,
cx,
));
}
}
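The new chunk bookkeeping above leans on a compact idiom: `Vec::retain` combined with `HashSet::insert` drops already-fetched chunks and records the fresh ones as fetched in a single pass, because `insert` returns `false` for entries the set already contains. A minimal sketch with plain `u32` row ranges (not Zed's actual types):

```rust
use std::collections::HashSet;
use std::ops::Range;

/// Keep only chunks not fetched before, recording them as fetched.
/// `HashSet::insert` returns false for known entries, so `retain`
/// filters and records in one pass.
fn dedup_chunks(
    mut applicable: Vec<Range<u32>>,
    fetched: &mut HashSet<Range<u32>>,
) -> Vec<Range<u32>> {
    applicable.retain(|chunk| fetched.insert(chunk.clone()));
    applicable
}

fn main() {
    let mut fetched = HashSet::from([0..50]);
    let fresh = dedup_chunks(vec![0..50, 50..100, 100..150], &mut fetched);
    assert_eq!(fresh, vec![50..100, 100..150]);
    assert_eq!(fetched.len(), 3);
    println!("fresh chunks: {fresh:?}");
}
```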
@@ -506,9 +504,13 @@ impl Editor {
}
InlayHintRefreshReason::NewLinesShown => InvalidationStrategy::None,
InlayHintRefreshReason::BufferEdited(_) => InvalidationStrategy::BufferEdited,
InlayHintRefreshReason::RefreshRequested(server_id) => {
InvalidationStrategy::RefreshRequested(*server_id)
}
InlayHintRefreshReason::RefreshRequested {
server_id,
request_id,
} => InvalidationStrategy::RefreshRequested {
server_id: *server_id,
request_id: *request_id,
},
};
match &mut self.inlay_hints {
@@ -718,44 +720,29 @@ impl Editor {
fn inlay_hints_for_buffer(
&mut self,
invalidate_cache: InvalidationStrategy,
ignore_previous_fetches: bool,
buffer_excerpts: VisibleExcerpts,
known_chunks: Option<(Global, HashSet<Range<BufferRow>>)>,
cx: &mut Context<Self>,
) -> Option<Vec<Task<(Range<BufferRow>, anyhow::Result<CacheInlayHints>)>>> {
let semantics_provider = self.semantics_provider()?;
let inlay_hints = self.inlay_hints.as_mut()?;
let buffer_id = buffer_excerpts.buffer.read(cx).remote_id();
let new_hint_tasks = semantics_provider
.inlay_hints(
invalidate_cache,
buffer_excerpts.buffer,
buffer_excerpts.ranges,
inlay_hints
.hint_chunk_fetched
.get(&buffer_id)
.filter(|_| !ignore_previous_fetches && !invalidate_cache.should_invalidate())
.cloned(),
known_chunks,
cx,
)
.unwrap_or_default();
let (known_version, known_chunks) =
inlay_hints.hint_chunk_fetched.entry(buffer_id).or_default();
if buffer_excerpts.buffer_version.changed_since(known_version) {
known_chunks.clear();
*known_version = buffer_excerpts.buffer_version;
}
let mut hint_tasks = Vec::new();
let mut hint_tasks = None;
for (row_range, new_hints_task) in new_hint_tasks {
let inserted = known_chunks.insert(row_range.clone());
if inserted || ignore_previous_fetches || invalidate_cache.should_invalidate() {
hint_tasks.push(cx.spawn(async move |_, _| (row_range, new_hints_task.await)));
}
hint_tasks
.get_or_insert_with(Vec::new)
.push(cx.spawn(async move |_, _| (row_range, new_hints_task.await)));
}
Some(hint_tasks)
hint_tasks
}
fn apply_fetched_hints(
@@ -793,20 +780,28 @@ impl Editor {
let excerpts = self.buffer.read(cx).excerpt_ids();
let hints_to_insert = new_hints
.into_iter()
.filter_map(|(chunk_range, hints_result)| match hints_result {
Ok(new_hints) => Some(new_hints),
Err(e) => {
log::error!(
"Failed to query inlays for buffer row range {chunk_range:?}, {e:#}"
);
if let Some((for_version, chunks_fetched)) =
inlay_hints.hint_chunk_fetched.get_mut(&buffer_id)
{
if for_version == &query_version {
chunks_fetched.remove(&chunk_range);
.filter_map(|(chunk_range, hints_result)| {
let chunks_fetched = inlay_hints.hint_chunk_fetching.get_mut(&buffer_id);
match hints_result {
Ok(new_hints) => {
if new_hints.is_empty() {
if let Some((_, chunks_fetched)) = chunks_fetched {
chunks_fetched.remove(&chunk_range);
}
}
Some(new_hints)
}
Err(e) => {
log::error!(
"Failed to query inlays for buffer row range {chunk_range:?}, {e:#}"
);
if let Some((for_version, chunks_fetched)) = chunks_fetched {
if for_version == &query_version {
chunks_fetched.remove(&chunk_range);
}
}
None
}
None
}
})
.flat_map(|hints| hints.into_values())
@@ -856,9 +851,10 @@ struct VisibleExcerpts {
fn spawn_editor_hints_refresh(
buffer_id: BufferId,
invalidate_cache: InvalidationStrategy,
ignore_previous_fetches: bool,
debounce: Option<Duration>,
buffer_excerpts: VisibleExcerpts,
known_chunks: Option<(Global, HashSet<Range<BufferRow>>)>,
applicable_chunks: Vec<Range<BufferRow>>,
cx: &mut Context<'_, Editor>,
) -> Task<()> {
cx.spawn(async move |editor, cx| {
@@ -869,12 +865,7 @@ fn spawn_editor_hints_refresh(
let query_version = buffer_excerpts.buffer_version.clone();
let Some(hint_tasks) = editor
.update(cx, |editor, cx| {
editor.inlay_hints_for_buffer(
invalidate_cache,
ignore_previous_fetches,
buffer_excerpts,
cx,
)
editor.inlay_hints_for_buffer(invalidate_cache, buffer_excerpts, known_chunks, cx)
})
.ok()
else {
@@ -882,6 +873,19 @@ fn spawn_editor_hints_refresh(
};
let hint_tasks = hint_tasks.unwrap_or_default();
if hint_tasks.is_empty() {
editor
.update(cx, |editor, _| {
if let Some((_, hint_chunk_fetching)) = editor
.inlay_hints
.as_mut()
.and_then(|inlay_hints| inlay_hints.hint_chunk_fetching.get_mut(&buffer_id))
{
for applicable_chunks in &applicable_chunks {
hint_chunk_fetching.remove(applicable_chunks);
}
}
})
.ok();
return;
}
let new_hints = join_all(hint_tasks).await;
@@ -1102,7 +1106,10 @@ pub mod tests {
editor
.update(cx, |editor, _window, cx| {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(fake_server.server.server_id()),
InlayHintRefreshReason::RefreshRequested {
server_id: fake_server.server.server_id(),
request_id: Some(1),
},
cx,
);
})
@@ -1958,15 +1965,8 @@ pub mod tests {
async fn test_large_buffer_inlay_requests_split(cx: &mut gpui::TestAppContext) {
init_test(cx, |settings| {
settings.defaults.inlay_hints = Some(InlayHintSettingsContent {
show_value_hints: Some(true),
enabled: Some(true),
edit_debounce_ms: Some(0),
scroll_debounce_ms: Some(0),
show_type_hints: Some(true),
show_parameter_hints: Some(true),
show_other_hints: Some(true),
show_background: Some(false),
toggle_on_modifiers_press: None,
..InlayHintSettingsContent::default()
})
});
@@ -2044,6 +2044,7 @@ pub mod tests {
cx.add_window(|window, cx| Editor::for_buffer(buffer, Some(project), window, cx));
cx.executor().run_until_parked();
let _fake_server = fake_servers.next().await.unwrap();
cx.executor().advance_clock(Duration::from_millis(100));
cx.executor().run_until_parked();
let ranges = lsp_request_ranges
@@ -2129,6 +2130,7 @@ pub mod tests {
);
})
.unwrap();
cx.executor().advance_clock(Duration::from_millis(100));
cx.executor().run_until_parked();
editor.update(cx, |_, _, _| {
let ranges = lsp_request_ranges
@@ -2145,6 +2147,7 @@ pub mod tests {
editor.handle_input("++++more text++++", window, cx);
})
.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
editor.update(cx, |editor, _window, cx| {
let mut ranges = lsp_request_ranges.lock().drain(..).collect::<Vec<_>>();
@@ -2695,7 +2698,7 @@ let c = 3;"#
),
(
"main.rs",
lsp::Range::new(lsp::Position::new(50, 0), lsp::Position::new(100, 11))
lsp::Range::new(lsp::Position::new(50, 0), lsp::Position::new(100, 0))
),
],
lsp_request_ranges
@@ -2754,7 +2757,7 @@ let c = 3;"#
),
(
"main.rs",
lsp::Range::new(lsp::Position::new(50, 0), lsp::Position::new(100, 11))
lsp::Range::new(lsp::Position::new(50, 0), lsp::Position::new(100, 0))
),
],
lsp_request_ranges
@@ -3887,7 +3890,10 @@ let c = 3;"#
editor
.update(cx, |editor, _, cx| {
editor.refresh_inlay_hints(
InlayHintRefreshReason::RefreshRequested(fake_server.server.server_id()),
InlayHintRefreshReason::RefreshRequested {
server_id: fake_server.server.server_id(),
request_id: Some(1),
},
cx,
);
})
@@ -4022,7 +4028,7 @@ let c = 3;"#
let mut all_fetched_hints = Vec::new();
for buffer in editor.buffer.read(cx).all_buffers() {
lsp_store.update(cx, |lsp_store, cx| {
let hints = &lsp_store.latest_lsp_data(&buffer, cx).inlay_hints();
let hints = lsp_store.latest_lsp_data(&buffer, cx).inlay_hints();
all_cached_labels.extend(hints.all_cached_hints().into_iter().map(|hint| {
let mut label = hint.text().to_string();
if hint.padding_left {

View File

@@ -1,5 +1,5 @@
use collections::HashMap;
use gpui::{Context, Window};
use gpui::{AppContext, Context, Window};
use itertools::Itertools;
use std::{ops::Range, time::Duration};
use text::{AnchorRangeExt, BufferId, ToPoint};
@@ -59,8 +59,9 @@ pub(super) fn refresh_linked_ranges(
let mut applicable_selections = Vec::new();
editor
.update(cx, |editor, cx| {
let selections = editor.selections.all::<usize>(&editor.display_snapshot(cx));
let snapshot = editor.buffer.read(cx).snapshot(cx);
let display_snapshot = editor.display_snapshot(cx);
let selections = editor.selections.all::<usize>(&display_snapshot);
let snapshot = display_snapshot.buffer_snapshot();
let buffer = editor.buffer.read(cx);
for selection in selections {
let cursor_position = selection.head();
@@ -90,14 +91,16 @@ pub(super) fn refresh_linked_ranges(
let highlights = project
.update(cx, |project, cx| {
let mut linked_edits_tasks = vec![];
for (buffer, start, end) in &applicable_selections {
let snapshot = buffer.read(cx).snapshot();
let buffer_id = buffer.read(cx).remote_id();
let linked_edits_task = project.linked_edits(buffer, *start, cx);
let highlights = move || async move {
let cx = cx.to_async();
let highlights = async move {
let edits = linked_edits_task.await.log_err()?;
let snapshot = cx
.read_entity(&buffer, |buffer, _| buffer.snapshot())
.ok()?;
let buffer_id = snapshot.remote_id();
// Find the range containing our current selection.
// We might not find one, because the selection contains both the start and end of the contained range
// (think of selecting <`html>foo`</html> - even though there's a matching closing tag, the selection goes beyond the range of the opening tag)
@@ -128,7 +131,7 @@ pub(super) fn refresh_linked_ranges(
siblings.sort_by(|lhs, rhs| lhs.0.cmp(&rhs.0, &snapshot));
Some((buffer_id, siblings))
};
linked_edits_tasks.push(highlights());
linked_edits_tasks.push(highlights);
}
linked_edits_tasks
})

View File

@@ -13,8 +13,6 @@ path = "src/extension.rs"
[dependencies]
anyhow.workspace = true
async-compression.workspace = true
async-tar.workspace = true
async-trait.workspace = true
collections.workspace = true
dap.workspace = true

View File

@@ -3,9 +3,7 @@ use crate::{
parse_wasm_extension_version,
};
use anyhow::{Context as _, Result, bail};
use async_compression::futures::bufread::GzipDecoder;
use async_tar::Archive;
use futures::{AsyncReadExt, io::Cursor};
use futures::AsyncReadExt;
use heck::ToSnakeCase;
use http_client::{self, AsyncBody, HttpClient};
use serde::Deserialize;
@@ -417,28 +415,48 @@ impl ExtensionBuilder {
return Ok(clang_path);
}
let mut tar_out_dir = wasi_sdk_dir.clone();
tar_out_dir.set_extension("archive");
let tar_out_dir = self.cache_dir.join("wasi-sdk-temp");
fs::remove_dir_all(&wasi_sdk_dir).ok();
fs::remove_dir_all(&tar_out_dir).ok();
fs::create_dir_all(&tar_out_dir).context("failed to create extraction directory")?;
log::info!("downloading wasi-sdk to {}", wasi_sdk_dir.display());
let mut response = self.http.get(&url, AsyncBody::default(), true).await?;
let body = GzipDecoder::new({
// stream the entire request into memory at once as the artifact is quite big (100MB+)
let mut b = vec![];
response.body_mut().read_to_end(&mut b).await?;
Cursor::new(b)
});
let tar = Archive::new(body);
log::info!("un-tarring wasi-sdk to {}", wasi_sdk_dir.display());
tar.unpack(&tar_out_dir)
// Write the response to a temporary file
let tar_gz_path = self.cache_dir.join("wasi-sdk.tar.gz");
let mut tar_gz_file =
fs::File::create(&tar_gz_path).context("failed to create temporary tar.gz file")?;
let response_body = response.body_mut();
let mut body_bytes = Vec::new();
response_body.read_to_end(&mut body_bytes).await?;
std::io::Write::write_all(&mut tar_gz_file, &body_bytes)?;
drop(tar_gz_file);
log::info!("un-tarring wasi-sdk to {}", tar_out_dir.display());
// Shell out to tar to extract the archive
let tar_output = util::command::new_smol_command("tar")
.arg("-xzf")
.arg(&tar_gz_path)
.arg("-C")
.arg(&tar_out_dir)
.output()
.await
.context("failed to unpack wasi-sdk archive")?;
.context("failed to run tar")?;
if !tar_output.status.success() {
bail!(
"failed to extract wasi-sdk archive: {}",
String::from_utf8_lossy(&tar_output.stderr)
);
}
log::info!("finished downloading wasi-sdk");
// Clean up the temporary tar.gz file
fs::remove_file(&tar_gz_path).ok();
let inner_dir = fs::read_dir(&tar_out_dir)?
.next()
.context("no content")?
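The change above replaces the in-memory `async-tar` unpack with writing the 100MB+ download to a temporary file and shelling out to the system `tar`. A blocking sketch of just that extraction step using only the standard library (paths here are hypothetical, and it assumes a `tar` binary is on PATH):

```rust
use std::{fs, path::Path, process::Command};

/// Extract a .tar.gz by shelling out to the system `tar`, as the diff now
/// does for the wasi-sdk download instead of unpacking it in memory.
fn extract_tar_gz(archive: &Path, out_dir: &Path) -> std::io::Result<()> {
    fs::create_dir_all(out_dir)?;
    let output = Command::new("tar")
        .arg("-xzf")
        .arg(archive)
        .arg("-C")
        .arg(out_dir)
        .output()?;
    if !output.status.success() {
        return Err(std::io::Error::other(
            String::from_utf8_lossy(&output.stderr).into_owned(),
        ));
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    // Build a tiny archive, then round-trip it through `extract_tar_gz`.
    let tmp = std::env::temp_dir().join("tar_extract_demo");
    let _ = fs::remove_dir_all(&tmp);
    let src = tmp.join("src");
    fs::create_dir_all(&src)?;
    fs::write(src.join("hello.txt"), "hi")?;
    let archive = tmp.join("a.tar.gz");
    let status = Command::new("tar")
        .arg("-czf")
        .arg(&archive)
        .arg("-C")
        .arg(&tmp)
        .arg("src")
        .status()?;
    assert!(status.success());
    let out = tmp.join("out");
    extract_tar_gz(&archive, &out)?;
    assert_eq!(fs::read_to_string(out.join("src/hello.txt"))?, "hi");
    println!("extracted ok");
    Ok(())
}
```

Like the diff, this leaves cleanup of the temporary archive to the caller.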

View File

@@ -164,6 +164,19 @@ pub struct AgentServerManifestEntry {
/// args = ["--serve"]
/// sha256 = "abc123..." # optional
/// ```
///
/// For Node.js-based agents, you can use "node" as the cmd to automatically
/// use Zed's managed Node.js runtime instead of relying on the user's PATH:
/// ```toml
/// [agent_servers.nodeagent.targets.darwin-aarch64]
/// archive = "https://example.com/nodeagent.zip"
/// cmd = "node"
/// args = ["index.js", "--port", "3000"]
/// ```
///
/// Note: All commands are executed with the archive extraction directory as the
/// working directory, so relative paths in args (like "index.js") will resolve
/// relative to the extracted archive contents.
pub targets: HashMap<String, TargetConfig>,
}

View File

@@ -360,7 +360,7 @@ impl ExtensionStore {
}
extension_id = reload_rx.next() => {
let Some(extension_id) = extension_id else { break; };
this.update( cx, |this, _| {
this.update(cx, |this, _| {
this.modified_extensions.extend(extension_id);
})?;
index_changed = true;
@@ -608,7 +608,7 @@ impl ExtensionStore {
.extension_index
.extensions
.contains_key(extension_id.as_ref());
!is_already_installed
!is_already_installed && !SUPPRESSED_EXTENSIONS.contains(&extension_id.as_ref())
})
.cloned()
.collect::<Vec<_>>();

View File

@@ -711,7 +711,9 @@ impl PickerDelegate for OpenPathDelegate {
match &self.directory_state {
DirectoryState::List { parent_path, .. } => {
let (label, indices) = if *parent_path == self.prompt_root {
let (label, indices) = if is_current_dir_candidate {
("open this directory".to_string(), vec![])
} else if *parent_path == self.prompt_root {
match_positions.iter_mut().for_each(|position| {
*position += self.prompt_root.len();
});
@@ -719,8 +721,6 @@ impl PickerDelegate for OpenPathDelegate {
format!("{}{}", self.prompt_root, candidate.path.string),
match_positions,
)
} else if is_current_dir_candidate {
("open this directory".to_string(), vec![])
} else {
(candidate.path.string, match_positions)
};

View File

@@ -470,11 +470,6 @@ impl ProjectDiff {
window: &mut Window,
cx: &mut Context<Self>,
) {
if self.branch_diff.read(cx).diff_base().is_merge_base() {
self.multibuffer.update(cx, |multibuffer, cx| {
multibuffer.add_diff(diff.clone(), cx);
});
}
let subscription = cx.subscribe_in(&diff, window, move |this, _, _, window, cx| {
this._task = window.spawn(cx, {
let this = cx.weak_entity();
@@ -491,8 +486,8 @@ impl ProjectDiff {
.expect("project diff editor should have a conflict addon");
let snapshot = buffer.read(cx).snapshot();
let diff = diff.read(cx);
let diff_hunk_ranges = diff
let diff_read = diff.read(cx);
let diff_hunk_ranges = diff_read
.hunks_intersecting_range(Anchor::MIN..Anchor::MAX, &snapshot, cx)
.map(|diff_hunk| diff_hunk.buffer_range);
let conflicts = conflict_addon
@@ -515,6 +510,9 @@ impl ProjectDiff {
multibuffer_context_lines(cx),
cx,
);
if self.branch_diff.read(cx).diff_base().is_merge_base() {
multibuffer.add_diff(diff.clone(), cx);
}
(was_empty, is_newly_added)
});

View File

@@ -2,14 +2,13 @@ use crate::{
AnyElement, AnyImageCache, App, Asset, AssetLogger, Bounds, DefiniteLength, Element, ElementId,
Entity, GlobalElementId, Hitbox, Image, ImageCache, InspectorElementId, InteractiveElement,
Interactivity, IntoElement, LayoutId, Length, ObjectFit, Pixels, RenderImage, Resource,
SMOOTH_SVG_SCALE_FACTOR, SharedString, SharedUri, StyleRefinement, Styled, SvgSize, Task,
Window, px, swap_rgba_pa_to_bgra,
SharedString, SharedUri, StyleRefinement, Styled, Task, Window, px,
};
use anyhow::{Context as _, Result};
use futures::{AsyncReadExt, Future};
use image::{
AnimationDecoder, DynamicImage, Frame, ImageBuffer, ImageError, ImageFormat, Rgba,
AnimationDecoder, DynamicImage, Frame, ImageError, ImageFormat, Rgba,
codecs::{gif::GifDecoder, webp::WebPDecoder},
};
use smallvec::SmallVec;
@@ -160,13 +159,15 @@ pub trait StyledImage: Sized {
self
}
/// Set the object fit for the image.
/// Set a fallback function that will be invoked to render an error view should
/// the image fail to load.
fn with_fallback(mut self, fallback: impl Fn() -> AnyElement + 'static) -> Self {
self.image_style().fallback = Some(Box::new(fallback));
self
}
/// Set the object fit for the image.
/// Set a fallback function that will be invoked to render a view while the image
/// is still being loaded.
fn with_loading(mut self, loading: impl Fn() -> AnyElement + 'static) -> Self {
self.image_style().loading = Some(Box::new(loading));
self
@@ -631,7 +632,7 @@ impl Asset for ImageAssetLoader {
}
};
let data = if let Ok(format) = image::guess_format(&bytes) {
if let Ok(format) = image::guess_format(&bytes) {
let data = match format {
ImageFormat::Gif => {
let decoder = GifDecoder::new(Cursor::new(&bytes))?;
@@ -689,25 +690,12 @@ impl Asset for ImageAssetLoader {
}
};
RenderImage::new(data)
Ok(Arc::new(RenderImage::new(data)))
} else {
let pixmap =
// TODO: Can we make svgs always rescale?
svg_renderer.render_pixmap(&bytes, SvgSize::ScaleFactor(SMOOTH_SVG_SCALE_FACTOR))?;
let mut buffer =
ImageBuffer::from_raw(pixmap.width(), pixmap.height(), pixmap.take()).unwrap();
for pixel in buffer.chunks_exact_mut(4) {
swap_rgba_pa_to_bgra(pixel);
}
let mut image = RenderImage::new(SmallVec::from_elem(Frame::new(buffer), 1));
image.scale_factor = SMOOTH_SVG_SCALE_FACTOR;
image
};
Ok(Arc::new(data))
svg_renderer
.render_single_frame(&bytes, 1.0, true)
.map_err(Into::into)
}
}
}
}

View File

@@ -1,5 +1,7 @@
use std::{fs, path::Path, sync::Arc};
use crate::{
App, Bounds, Element, GlobalElementId, Hitbox, InspectorElementId, InteractiveElement,
App, Asset, Bounds, Element, GlobalElementId, Hitbox, InspectorElementId, InteractiveElement,
Interactivity, IntoElement, LayoutId, Pixels, Point, Radians, SharedString, Size,
StyleRefinement, Styled, TransformationMatrix, Window, geometry::Negate as _, point, px,
radians, size,
@@ -11,6 +13,7 @@ pub struct Svg {
interactivity: Interactivity,
transformation: Option<Transformation>,
path: Option<SharedString>,
external_path: Option<SharedString>,
}
/// Create a new SVG element.
@@ -20,6 +23,7 @@ pub fn svg() -> Svg {
interactivity: Interactivity::new(),
transformation: None,
path: None,
external_path: None,
}
}
@@ -30,6 +34,12 @@ impl Svg {
self
}
/// Set the path to the SVG file for this element.
pub fn external_path(mut self, path: impl Into<SharedString>) -> Self {
self.external_path = Some(path.into());
self
}
/// Transform the SVG element with the given transformation.
/// Note that this won't effect the hitbox or layout of the element, only the rendering.
pub fn with_transformation(mut self, transformation: Transformation) -> Self {
@@ -117,7 +127,35 @@ impl Element for Svg {
.unwrap_or_default();
window
.paint_svg(bounds, path.clone(), transformation, color, cx)
.paint_svg(bounds, path.clone(), None, transformation, color, cx)
.log_err();
} else if let Some((path, color)) =
self.external_path.as_ref().zip(style.text.color)
{
let Some(bytes) = window
.use_asset::<SvgAsset>(path, cx)
.and_then(|asset| asset.log_err())
else {
return;
};
let transformation = self
.transformation
.as_ref()
.map(|transformation| {
transformation.into_matrix(bounds.center(), window.scale_factor())
})
.unwrap_or_default();
window
.paint_svg(
bounds,
path.clone(),
Some(&bytes),
transformation,
color,
cx,
)
.log_err();
}
},
@@ -219,3 +257,21 @@ impl Transformation {
.translate(center.scale(scale_factor).negate())
}
}
enum SvgAsset {}
impl Asset for SvgAsset {
type Source = SharedString;
type Output = Result<Arc<[u8]>, Arc<std::io::Error>>;
fn load(
source: Self::Source,
_cx: &mut App,
) -> impl Future<Output = Self::Output> + Send + 'static {
async move {
let bytes = fs::read(Path::new(source.as_ref())).map_err(|e| Arc::new(e))?;
let bytes = Arc::from(bytes);
Ok(bytes)
}
}
}

View File

@@ -95,7 +95,7 @@ pub use smol::Timer;
pub use style::*;
pub use styled::*;
pub use subscription::*;
use svg_renderer::*;
pub use svg_renderer::*;
pub(crate) use tab_stop::*;
pub use taffy::{AvailableSpace, LayoutId};
#[cfg(any(test, feature = "test-support"))]

View File

@@ -40,7 +40,7 @@ use crate::{
DEFAULT_WINDOW_SIZE, DevicePixels, DispatchEventResult, Font, FontId, FontMetrics, FontRun,
ForegroundExecutor, GlyphId, GpuSpecs, ImageSource, Keymap, LineLayout, Pixels, PlatformInput,
Point, RenderGlyphParams, RenderImage, RenderImageParams, RenderSvgParams, Scene, ShapedGlyph,
ShapedRun, SharedString, Size, SvgRenderer, SvgSize, SystemWindowTab, Task, TaskLabel, Window,
ShapedRun, SharedString, Size, SvgRenderer, SystemWindowTab, Task, TaskLabel, Window,
WindowControlArea, hash, point, px, size,
};
use anyhow::Result;
@@ -1817,13 +1817,9 @@ impl Image {
ImageFormat::Tiff => frames_for_image(&self.bytes, image::ImageFormat::Tiff)?,
ImageFormat::Ico => frames_for_image(&self.bytes, image::ImageFormat::Ico)?,
ImageFormat::Svg => {
let pixmap = svg_renderer.render_pixmap(&self.bytes, SvgSize::ScaleFactor(1.0))?;
let buffer =
image::ImageBuffer::from_raw(pixmap.width(), pixmap.height(), pixmap.take())
.unwrap();
SmallVec::from_elem(Frame::new(buffer), 1)
return svg_renderer
.render_single_frame(&self.bytes, 1.0, false)
.map_err(Into::into);
}
};

View File

@@ -1,5 +1,10 @@
use crate::{AssetSource, DevicePixels, IsZero, Result, SharedString, Size};
use crate::{
AssetSource, DevicePixels, IsZero, RenderImage, Result, SharedString, Size,
swap_rgba_pa_to_bgra,
};
use image::Frame;
use resvg::tiny_skia::Pixmap;
use smallvec::SmallVec;
use std::{
hash::Hash,
sync::{Arc, LazyLock},
@@ -15,17 +20,22 @@ pub(crate) struct RenderSvgParams {
}
#[derive(Clone)]
/// A struct holding everything necessary to render SVGs.
pub struct SvgRenderer {
asset_source: Arc<dyn AssetSource>,
usvg_options: Arc<usvg::Options<'static>>,
}
/// The size in which to render the SVG.
pub enum SvgSize {
/// An absolute size in device pixels.
Size(Size<DevicePixels>),
/// A scaling factor to apply to the size provided by the SVG.
ScaleFactor(f32),
}
impl SvgRenderer {
/// Creates a new SVG renderer with the provided asset source.
pub fn new(asset_source: Arc<dyn AssetSource>) -> Self {
static FONT_DB: LazyLock<Arc<usvg::fontdb::Database>> = LazyLock::new(|| {
let mut db = usvg::fontdb::Database::new();
@@ -54,33 +64,68 @@ impl SvgRenderer {
}
}
pub(crate) fn render(
/// Renders the given bytes into an image buffer.
pub fn render_single_frame(
&self,
bytes: &[u8],
scale_factor: f32,
to_brga: bool,
) -> Result<Arc<RenderImage>, usvg::Error> {
self.render_pixmap(
bytes,
SvgSize::ScaleFactor(scale_factor * SMOOTH_SVG_SCALE_FACTOR),
)
.map(|pixmap| {
let mut buffer =
image::ImageBuffer::from_raw(pixmap.width(), pixmap.height(), pixmap.take())
.unwrap();
if to_brga {
for pixel in buffer.chunks_exact_mut(4) {
swap_rgba_pa_to_bgra(pixel);
}
}
let mut image = RenderImage::new(SmallVec::from_const([Frame::new(buffer)]));
image.scale_factor = SMOOTH_SVG_SCALE_FACTOR;
Arc::new(image)
})
}
pub(crate) fn render_alpha_mask(
&self,
params: &RenderSvgParams,
bytes: Option<&[u8]>,
) -> Result<Option<(Size<DevicePixels>, Vec<u8>)>> {
anyhow::ensure!(!params.size.is_zero(), "can't render at a zero size");
// Load the tree.
let Some(bytes) = self.asset_source.load(&params.path)? else {
return Ok(None);
let render_pixmap = |bytes| {
let pixmap = self.render_pixmap(bytes, SvgSize::Size(params.size))?;
// Convert the pixmap's pixels into an alpha mask.
let size = Size::new(
DevicePixels(pixmap.width() as i32),
DevicePixels(pixmap.height() as i32),
);
let alpha_mask = pixmap
.pixels()
.iter()
.map(|p| p.alpha())
.collect::<Vec<_>>();
Ok(Some((size, alpha_mask)))
};
let pixmap = self.render_pixmap(&bytes, SvgSize::Size(params.size))?;
// Convert the pixmap's pixels into an alpha mask.
let size = Size::new(
DevicePixels(pixmap.width() as i32),
DevicePixels(pixmap.height() as i32),
);
let alpha_mask = pixmap
.pixels()
.iter()
.map(|p| p.alpha())
.collect::<Vec<_>>();
Ok(Some((size, alpha_mask)))
if let Some(bytes) = bytes {
render_pixmap(bytes)
} else if let Some(bytes) = self.asset_source.load(&params.path)? {
render_pixmap(&bytes)
} else {
Ok(None)
}
}
pub fn render_pixmap(&self, bytes: &[u8], size: SvgSize) -> Result<Pixmap, usvg::Error> {
fn render_pixmap(&self, bytes: &[u8], size: SvgSize) -> Result<Pixmap, usvg::Error> {
let tree = usvg::Tree::from_data(bytes, &self.usvg_options)?;
let svg_size = tree.size();
let scale = match size {

View File

@@ -3084,6 +3084,7 @@ impl Window {
&mut self,
bounds: Bounds<Pixels>,
path: SharedString,
mut data: Option<&[u8]>,
transformation: TransformationMatrix,
color: Hsla,
cx: &App,
@@ -3104,7 +3105,8 @@ impl Window {
let Some(tile) =
self.sprite_atlas
.get_or_insert_with(&params.clone().into(), &mut || {
let Some((size, bytes)) = cx.svg_renderer.render(&params)? else {
let Some((size, bytes)) = cx.svg_renderer.render_alpha_mask(&params, data)?
else {
return Ok(None);
};
Ok(Some((size, Cow::Owned(bytes))))

View File

@@ -2324,17 +2324,19 @@ impl CodeLabel {
}
pub fn plain(text: String, filter_text: Option<&str>) -> Self {
Self::filtered(text, filter_text, Vec::new())
Self::filtered(text.clone(), text.len(), filter_text, Vec::new())
}
pub fn filtered(
text: String,
label_len: usize,
filter_text: Option<&str>,
runs: Vec<(Range<usize>, HighlightId)>,
) -> Self {
assert!(label_len <= text.len());
let filter_range = filter_text
.and_then(|filter| text.find(filter).map(|ix| ix..ix + filter.len()))
-            .unwrap_or(0..text.len());
+            .unwrap_or(0..label_len);
Self::new(text, filter_range, runs)
}
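The `label_len` parameter above narrows the default filter range: when `filter_text` is absent or not found, only the label portion of the text (not any appended detail such as a type annotation) participates in fuzzy matching. A simplified, free-standing sketch of that fallback:

```rust
use std::ops::Range;

// Hypothetical standalone version of the `filter_range` computation in
// `CodeLabel::filtered`: `text` may be longer than the label itself.
fn filter_range(text: &str, label_len: usize, filter_text: Option<&str>) -> Range<usize> {
    assert!(label_len <= text.len());
    filter_text
        // An explicit filter string wins if it occurs in the text.
        .and_then(|filter| text.find(filter).map(|ix| ix..ix + filter.len()))
        // Otherwise match only against the label, excluding trailing detail.
        .unwrap_or(0..label_len)
}

fn main() {
    // Label "len" with detail ": int" appended; old behavior was 0..10.
    assert_eq!(filter_range("len(): int", 3, None), 0..3);
    assert_eq!(filter_range("len(): int", 3, Some("len")), 0..3);
}
```

This is why the TypeScript and Python adapters below now thread `label_len` through: it keeps the detail suffix from skewing completion ranking.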

View File

@@ -406,6 +406,7 @@ impl LspAdapter for PyrightLspAdapter {
language: &Arc<language::Language>,
) -> Option<language::CodeLabel> {
let label = &item.label;
let label_len = label.len();
let grammar = language.grammar()?;
let highlight_id = match item.kind? {
lsp::CompletionItemKind::METHOD => grammar.highlight_id_for_name("function.method"),
@@ -427,9 +428,10 @@ impl LspAdapter for PyrightLspAdapter {
}
Some(language::CodeLabel::filtered(
text,
label_len,
item.filter_text.as_deref(),
highlight_id
-                .map(|id| (0..label.len(), id))
+                .map(|id| (0..label_len, id))
.into_iter()
.collect(),
))
@@ -1466,6 +1468,7 @@ impl LspAdapter for PyLspAdapter {
language: &Arc<language::Language>,
) -> Option<language::CodeLabel> {
let label = &item.label;
let label_len = label.len();
let grammar = language.grammar()?;
let highlight_id = match item.kind? {
lsp::CompletionItemKind::METHOD => grammar.highlight_id_for_name("function.method")?,
@@ -1476,6 +1479,7 @@ impl LspAdapter for PyLspAdapter {
};
Some(language::CodeLabel::filtered(
label.clone(),
label_len,
item.filter_text.as_deref(),
vec![(0..label.len(), highlight_id)],
))
@@ -1741,6 +1745,7 @@ impl LspAdapter for BasedPyrightLspAdapter {
language: &Arc<language::Language>,
) -> Option<language::CodeLabel> {
let label = &item.label;
let label_len = label.len();
let grammar = language.grammar()?;
let highlight_id = match item.kind? {
lsp::CompletionItemKind::METHOD => grammar.highlight_id_for_name("function.method"),
@@ -1762,6 +1767,7 @@ impl LspAdapter for BasedPyrightLspAdapter {
}
Some(language::CodeLabel::filtered(
text,
label_len,
item.filter_text.as_deref(),
highlight_id
.map(|id| (0..label.len(), id))

View File

@@ -754,7 +754,7 @@ impl LspAdapter for TypeScriptLspAdapter {
language: &Arc<language::Language>,
) -> Option<language::CodeLabel> {
use lsp::CompletionItemKind as Kind;
-        let len = item.label.len();
+        let label_len = item.label.len();
let grammar = language.grammar()?;
let highlight_id = match item.kind? {
Kind::CLASS | Kind::INTERFACE | Kind::ENUM => grammar.highlight_id_for_name("type"),
@@ -779,8 +779,9 @@ impl LspAdapter for TypeScriptLspAdapter {
};
Some(language::CodeLabel::filtered(
text,
label_len,
item.filter_text.as_deref(),
-            vec![(0..len, highlight_id)],
+            vec![(0..label_len, highlight_id)],
))
}

View File

@@ -178,7 +178,7 @@ impl LspAdapter for VtslsLspAdapter {
language: &Arc<language::Language>,
) -> Option<language::CodeLabel> {
use lsp::CompletionItemKind as Kind;
-        let len = item.label.len();
+        let label_len = item.label.len();
let grammar = language.grammar()?;
let highlight_id = match item.kind? {
Kind::CLASS | Kind::INTERFACE | Kind::ENUM => grammar.highlight_id_for_name("type"),
@@ -203,8 +203,9 @@ impl LspAdapter for VtslsLspAdapter {
};
Some(language::CodeLabel::filtered(
text,
label_len,
item.filter_text.as_deref(),
-            vec![(0..len, highlight_id)],
+            vec![(0..label_len, highlight_id)],
))
}

View File

@@ -129,9 +129,3 @@ pub(crate) mod m_2025_10_17 {
pub(crate) use settings::make_file_finder_include_ignored_an_enum;
}
pub(crate) mod m_2025_10_21 {
mod settings;
pub(crate) use settings::make_relative_line_numbers_an_enum;
}

View File

@@ -1,16 +0,0 @@
use anyhow::Result;
use serde_json::Value;
pub fn make_relative_line_numbers_an_enum(value: &mut Value) -> Result<()> {
let Some(relative_line_numbers) = value.get_mut("relative_line_numbers") else {
return Ok(());
};
*relative_line_numbers = match relative_line_numbers {
Value::Bool(true) => Value::String("enabled".to_string()),
Value::Bool(false) => Value::String("disabled".to_string()),
Value::String(s) if s == "enabled" || s == "disabled" || s == "wrapped" => return Ok(()),
_ => anyhow::bail!("Expected relative_line_numbers to be a boolean"),
};
Ok(())
}

View File

@@ -214,7 +214,6 @@ pub fn migrate_settings(text: &str) -> Result<Option<String>> {
),
MigrationType::Json(migrations::m_2025_10_16::restore_code_actions_on_format),
MigrationType::Json(migrations::m_2025_10_17::make_file_finder_include_ignored_an_enum),
MigrationType::Json(migrations::m_2025_10_21::make_relative_line_numbers_an_enum),
];
run_migrations(text, migrations)
}

View File

@@ -340,7 +340,6 @@ pub struct RowInfo {
pub multibuffer_row: Option<MultiBufferRow>,
pub diff_status: Option<buffer_diff::DiffHunkStatus>,
pub expand_info: Option<ExpandInfo>,
pub wrapped_buffer_row: Option<u32>,
}
/// A slice into a [`Buffer`] that is being edited in a [`MultiBuffer`].
@@ -6651,7 +6650,6 @@ impl Iterator for MultiBufferRows<'_> {
multibuffer_row: Some(MultiBufferRow(0)),
diff_status: None,
expand_info: None,
wrapped_buffer_row: None,
});
}
@@ -6709,7 +6707,6 @@ impl Iterator for MultiBufferRows<'_> {
buffer_row: Some(last_row),
multibuffer_row: Some(multibuffer_row),
diff_status: None,
wrapped_buffer_row: None,
expand_info,
});
} else {
@@ -6754,7 +6751,6 @@ impl Iterator for MultiBufferRows<'_> {
.diff_hunk_status
.filter(|_| self.point < region.range.end),
expand_info,
wrapped_buffer_row: None,
});
self.point += Point::new(1, 0);
result

View File

@@ -32,7 +32,6 @@ fn test_empty_singleton(cx: &mut App) {
multibuffer_row: Some(MultiBufferRow(0)),
diff_status: None,
expand_info: None,
wrapped_buffer_row: None,
}]
);
}
@@ -2433,8 +2432,6 @@ impl ReferenceMultibuffer {
buffer_id: region.buffer_id,
diff_status: region.status,
buffer_row,
wrapped_buffer_row: None,
multibuffer_row: Some(MultiBufferRow(
text[..ix].matches('\n').count() as u32
)),

View File

@@ -465,14 +465,8 @@ impl SearchData {
let match_offset_range = match_range.to_offset(multi_buffer_snapshot);
let mut search_match_indices = vec![
-            multi_buffer_snapshot.clip_offset(
-                match_offset_range.start - context_offset_range.start,
-                Bias::Left,
-            )
-                ..multi_buffer_snapshot.clip_offset(
-                    match_offset_range.end - context_offset_range.start,
-                    Bias::Right,
-                ),
+            match_offset_range.start - context_offset_range.start
+                ..match_offset_range.end - context_offset_range.start,
];
let entire_context_text = multi_buffer_snapshot
@@ -509,14 +503,8 @@ impl SearchData {
.next()
.is_some_and(|c| !c.is_whitespace());
search_match_indices.iter_mut().for_each(|range| {
-            range.start = multi_buffer_snapshot.clip_offset(
-                range.start.saturating_sub(left_whitespaces_offset),
-                Bias::Left,
-            );
-            range.end = multi_buffer_snapshot.clip_offset(
-                range.end.saturating_sub(left_whitespaces_offset),
-                Bias::Right,
-            );
+            range.start = range.start.saturating_sub(left_whitespaces_offset);
+            range.end = range.end.saturating_sub(left_whitespaces_offset);
});
let trimmed_row_offset_range =
@@ -5226,10 +5214,13 @@ mod tests {
use language::{Language, LanguageConfig, LanguageMatcher, tree_sitter_rust};
use pretty_assertions::assert_eq;
use project::FakeFs;
-    use search::project_search::{self, perform_project_search};
+    use search::{
+        buffer_search,
+        project_search::{self, perform_project_search},
+    };
    use serde_json::json;
    use util::path;
-    use workspace::{OpenOptions, OpenVisible};
+    use workspace::{OpenOptions, OpenVisible, ToolbarItemView};
use super::*;
@@ -5292,25 +5283,28 @@ mod tests {
ide/src/
inlay_hints/
fn_lifetime_fn.rs
-            search: match config.param_names_for_lifetime_elision_hints {
-            search: allocated_lifetimes.push(if config.param_names_for_lifetime_elision_hints {
-            search: Some(it) if config.param_names_for_lifetime_elision_hints => {
-            search: InlayHintsConfig { param_names_for_lifetime_elision_hints: true, ..TEST_CONFIG },
+            search: match config.«param_names_for_lifetime_elision_hints» {
+            search: allocated_lifetimes.push(if config.«param_names_for_lifetime_elision_hints» {
+            search: Some(it) if config.«param_names_for_lifetime_elision_hints» => {
+            search: InlayHintsConfig { «param_names_for_lifetime_elision_hints»: true, ..TEST_CONFIG },
inlay_hints.rs
-            search: pub param_names_for_lifetime_elision_hints: bool,
-            search: param_names_for_lifetime_elision_hints: self
+            search: pub «param_names_for_lifetime_elision_hints»: bool,
+            search: «param_names_for_lifetime_elision_hints»: self
        static_index.rs
-            search: param_names_for_lifetime_elision_hints: false,
+            search: «param_names_for_lifetime_elision_hints»: false,
rust-analyzer/src/
cli/
analysis_stats.rs
-            search: param_names_for_lifetime_elision_hints: true,
+            search: «param_names_for_lifetime_elision_hints»: true,
        config.rs
-            search: param_names_for_lifetime_elision_hints: self"#
+            search: «param_names_for_lifetime_elision_hints»: self"#
.to_string();
let select_first_in_all_matches = |line_to_select: &str| {
-            assert!(all_matches.contains(line_to_select));
+            assert!(
+                all_matches.contains(line_to_select),
+                "`{line_to_select}` was not found in all matches `{all_matches}`"
+            );
all_matches.replacen(
line_to_select,
&format!("{line_to_select}{SELECTED_MARKER}"),
@@ -5331,7 +5325,7 @@ mod tests {
cx,
),
select_first_in_all_matches(
-                "search: match config.param_names_for_lifetime_elision_hints {"
+                "search: match config.«param_names_for_lifetime_elision_hints» {"
)
);
});
@@ -5371,16 +5365,16 @@ mod tests {
inlay_hints/
fn_lifetime_fn.rs{SELECTED_MARKER}
inlay_hints.rs
-            search: pub param_names_for_lifetime_elision_hints: bool,
-            search: param_names_for_lifetime_elision_hints: self
+            search: pub «param_names_for_lifetime_elision_hints»: bool,
+            search: «param_names_for_lifetime_elision_hints»: self
        static_index.rs
-            search: param_names_for_lifetime_elision_hints: false,
+            search: «param_names_for_lifetime_elision_hints»: false,
rust-analyzer/src/
cli/
analysis_stats.rs
-            search: param_names_for_lifetime_elision_hints: true,
+            search: «param_names_for_lifetime_elision_hints»: true,
        config.rs
-            search: param_names_for_lifetime_elision_hints: self"#,
+            search: «param_names_for_lifetime_elision_hints»: self"#,
)
);
});
@@ -5441,9 +5435,9 @@ mod tests {
rust-analyzer/src/
cli/
analysis_stats.rs
-            search: param_names_for_lifetime_elision_hints: true,
+            search: «param_names_for_lifetime_elision_hints»: true,
        config.rs
-            search: param_names_for_lifetime_elision_hints: self"#,
+            search: «param_names_for_lifetime_elision_hints»: self"#,
)
);
});
@@ -5523,21 +5517,21 @@ mod tests {
ide/src/
inlay_hints/
fn_lifetime_fn.rs
-            search: match config.param_names_for_lifetime_elision_hints {
-            search: allocated_lifetimes.push(if config.param_names_for_lifetime_elision_hints {
-            search: Some(it) if config.param_names_for_lifetime_elision_hints => {
-            search: InlayHintsConfig { param_names_for_lifetime_elision_hints: true, ..TEST_CONFIG },
+            search: match config.«param_names_for_lifetime_elision_hints» {
+            search: allocated_lifetimes.push(if config.«param_names_for_lifetime_elision_hints» {
+            search: Some(it) if config.«param_names_for_lifetime_elision_hints» => {
+            search: InlayHintsConfig { «param_names_for_lifetime_elision_hints»: true, ..TEST_CONFIG },
        inlay_hints.rs
-            search: pub param_names_for_lifetime_elision_hints: bool,
-            search: param_names_for_lifetime_elision_hints: self
+            search: pub «param_names_for_lifetime_elision_hints»: bool,
+            search: «param_names_for_lifetime_elision_hints»: self
        static_index.rs
-            search: param_names_for_lifetime_elision_hints: false,
+            search: «param_names_for_lifetime_elision_hints»: false,
    rust-analyzer/src/
        cli/
            analysis_stats.rs
-                search: param_names_for_lifetime_elision_hints: true,
+                search: «param_names_for_lifetime_elision_hints»: true,
        config.rs
-            search: param_names_for_lifetime_elision_hints: self"#
+            search: «param_names_for_lifetime_elision_hints»: self"#
.to_string();
cx.executor()
@@ -5662,30 +5656,40 @@ mod tests {
ide/src/
inlay_hints/
fn_lifetime_fn.rs
-            search: match config.param_names_for_lifetime_elision_hints {
-            search: allocated_lifetimes.push(if config.param_names_for_lifetime_elision_hints {
-            search: Some(it) if config.param_names_for_lifetime_elision_hints => {
-            search: InlayHintsConfig { param_names_for_lifetime_elision_hints: true, ..TEST_CONFIG },
+            search: match config.«param_names_for_lifetime_elision_hints» {
+            search: allocated_lifetimes.push(if config.«param_names_for_lifetime_elision_hints» {
+            search: Some(it) if config.«param_names_for_lifetime_elision_hints» => {
+            search: InlayHintsConfig { «param_names_for_lifetime_elision_hints»: true, ..TEST_CONFIG },
        inlay_hints.rs
-            search: pub param_names_for_lifetime_elision_hints: bool,
-            search: param_names_for_lifetime_elision_hints: self
+            search: pub «param_names_for_lifetime_elision_hints»: bool,
+            search: «param_names_for_lifetime_elision_hints»: self
        static_index.rs
-            search: param_names_for_lifetime_elision_hints: false,
+            search: «param_names_for_lifetime_elision_hints»: false,
    rust-analyzer/src/
        cli/
            analysis_stats.rs
-                search: param_names_for_lifetime_elision_hints: true,
+                search: «param_names_for_lifetime_elision_hints»: true,
        config.rs
-            search: param_names_for_lifetime_elision_hints: self"#
+            search: «param_names_for_lifetime_elision_hints»: self"#
.to_string();
let select_first_in_all_matches = |line_to_select: &str| {
-            assert!(all_matches.contains(line_to_select));
+            assert!(
+                all_matches.contains(line_to_select),
+                "`{line_to_select}` was not found in all matches `{all_matches}`"
+            );
all_matches.replacen(
line_to_select,
&format!("{line_to_select}{SELECTED_MARKER}"),
1,
)
};
let clear_outline_metadata = |input: &str| {
input
.replace("search: ", "")
.replace("«", "")
.replace("»", "")
};
cx.executor()
.advance_clock(UPDATE_DEBOUNCE + Duration::from_millis(100));
cx.run_until_parked();
@@ -5696,7 +5700,7 @@ mod tests {
.expect("should have an active editor open")
});
let initial_outline_selection =
-            "search: match config.param_names_for_lifetime_elision_hints {";
+            "search: match config.«param_names_for_lifetime_elision_hints» {";
outline_panel.update_in(cx, |outline_panel, window, cx| {
assert_eq!(
display_entries(
@@ -5710,7 +5714,7 @@ mod tests {
);
assert_eq!(
selected_row_text(&active_editor, cx),
-                initial_outline_selection.replace("search: ", ""), // Clear outline metadata prefixes
+                clear_outline_metadata(initial_outline_selection),
"Should place the initial editor selection on the corresponding search result"
);
@@ -5719,7 +5723,7 @@ mod tests {
});
let navigated_outline_selection =
-            "search: Some(it) if config.param_names_for_lifetime_elision_hints => {";
+            "search: Some(it) if config.«param_names_for_lifetime_elision_hints» => {";
outline_panel.update(cx, |outline_panel, cx| {
assert_eq!(
display_entries(
@@ -5737,7 +5741,7 @@ mod tests {
outline_panel.update(cx, |_, cx| {
assert_eq!(
selected_row_text(&active_editor, cx),
-                navigated_outline_selection.replace("search: ", ""), // Clear outline metadata prefixes
+                clear_outline_metadata(navigated_outline_selection),
"Should still have the initial caret position after SelectNext calls"
);
});
@@ -5748,7 +5752,7 @@ mod tests {
outline_panel.update(cx, |_outline_panel, cx| {
assert_eq!(
selected_row_text(&active_editor, cx),
-                navigated_outline_selection.replace("search: ", ""), // Clear outline metadata prefixes
+                clear_outline_metadata(navigated_outline_selection),
"After opening, should move the caret to the opened outline entry's position"
);
});
@@ -5756,7 +5760,7 @@ mod tests {
outline_panel.update_in(cx, |outline_panel, window, cx| {
outline_panel.select_next(&SelectNext, window, cx);
});
-        let next_navigated_outline_selection = "search: InlayHintsConfig { param_names_for_lifetime_elision_hints: true, ..TEST_CONFIG },";
+        let next_navigated_outline_selection = "search: InlayHintsConfig { «param_names_for_lifetime_elision_hints»: true, ..TEST_CONFIG },";
outline_panel.update(cx, |outline_panel, cx| {
assert_eq!(
display_entries(
@@ -5774,7 +5778,7 @@ mod tests {
outline_panel.update(cx, |_outline_panel, cx| {
assert_eq!(
selected_row_text(&active_editor, cx),
-                next_navigated_outline_selection.replace("search: ", ""), // Clear outline metadata prefixes
+                clear_outline_metadata(next_navigated_outline_selection),
"Should again preserve the selection after another SelectNext call"
);
});
@@ -5807,7 +5811,7 @@ mod tests {
);
assert_eq!(
selected_row_text(&new_active_editor, cx),
-                next_navigated_outline_selection.replace("search: ", ""), // Clear outline metadata prefixes
+                clear_outline_metadata(next_navigated_outline_selection),
"When opening the excerpt, should navigate to the place corresponding the outline entry"
);
});
@@ -5909,11 +5913,11 @@ mod tests {
format!(
r#"one/
a.txt
-        search: aaa aaa  <==== selected
-        search: aaa aaa
+        search: «aaa» aaa  <==== selected
+        search: aaa «aaa»
two/
    b.txt
-        search: a aaa"#,
+        search: a «aaa»"#,
),
);
});
@@ -5939,7 +5943,7 @@ two/
a.txt <==== selected
two/
b.txt
-        search: a aaa"#,
+        search: a «aaa»"#,
),
);
});
@@ -5988,7 +5992,7 @@ two/ <==== selected"#,
a.txt
two/ <==== selected
b.txt
-        search: a aaa"#,
+        search: a «aaa»"#,
)
);
});
@@ -6453,18 +6457,18 @@ outline: struct OutlineEntryExcerpt
r#"frontend-project/
public/lottie/
syntax-tree.json
-            search: {{ "something": "static" }}  <==== selected
+            search: {{ "something": "«static»" }}  <==== selected
    src/
        app/(site)/
            (about)/jobs/[slug]/
                page.tsx
-                    search: static
+                    search: «static»
            (blog)/post/[slug]/
                page.tsx
-                    search: static
+                    search: «static»
        components/
            ErrorBoundary.tsx
-                search: static"#
+                search: «static»"#
)
);
});
@@ -6492,12 +6496,12 @@ outline: struct OutlineEntryExcerpt
r#"frontend-project/
public/lottie/
syntax-tree.json
-            search: {{ "something": "static" }}
+            search: {{ "something": "«static»" }}
    src/
        app/(site)/  <==== selected
        components/
            ErrorBoundary.tsx
-                search: static"#
+                search: «static»"#
)
);
});
@@ -6522,12 +6526,12 @@ outline: struct OutlineEntryExcerpt
r#"frontend-project/
public/lottie/
syntax-tree.json
-            search: {{ "something": "static" }}
+            search: {{ "something": "«static»" }}
    src/
        app/(site)/
        components/
            ErrorBoundary.tsx
-                search: static  <==== selected"#
+                search: «static»  <==== selected"#
)
);
});
@@ -6556,7 +6560,7 @@ outline: struct OutlineEntryExcerpt
r#"frontend-project/
public/lottie/
syntax-tree.json
-            search: {{ "something": "static" }}
+            search: {{ "something": "«static»" }}
src/
app/(site)/
components/
@@ -6589,12 +6593,66 @@ outline: struct OutlineEntryExcerpt
r#"frontend-project/
public/lottie/
syntax-tree.json
-            search: {{ "something": "static" }}
+            search: {{ "something": "«static»" }}
    src/
        app/(site)/
        components/
            ErrorBoundary.tsx  <==== selected
-                search: static"#
+                search: «static»"#
)
);
});
outline_panel.update_in(cx, |outline_panel, window, cx| {
outline_panel.collapse_all_entries(&CollapseAllEntries, window, cx);
});
cx.executor()
.advance_clock(UPDATE_DEBOUNCE + Duration::from_millis(100));
cx.run_until_parked();
outline_panel.update(cx, |outline_panel, cx| {
assert_eq!(
display_entries(
&project,
&snapshot(outline_panel, cx),
&outline_panel.cached_entries,
outline_panel.selected_entry(),
cx,
),
format!(r#"frontend-project/"#)
);
});
outline_panel.update_in(cx, |outline_panel, window, cx| {
outline_panel.expand_all_entries(&ExpandAllEntries, window, cx);
});
cx.executor()
.advance_clock(UPDATE_DEBOUNCE + Duration::from_millis(100));
cx.run_until_parked();
outline_panel.update(cx, |outline_panel, cx| {
assert_eq!(
display_entries(
&project,
&snapshot(outline_panel, cx),
&outline_panel.cached_entries,
outline_panel.selected_entry(),
cx,
),
format!(
r#"frontend-project/
public/lottie/
syntax-tree.json
search: {{ "something": "«static»" }}
src/
app/(site)/
(about)/jobs/[slug]/
page.tsx
search: «static»
(blog)/post/[slug]/
page.tsx
search: «static»
components/
ErrorBoundary.tsx <==== selected
search: «static»"#
)
);
});
@@ -6700,16 +6758,21 @@ outline: struct OutlineEntryExcerpt
}
},
PanelEntry::Search(search_entry) => {
-                        format!(
-                            "search: {}",
-                            search_entry
-                                .render_data
-                                .get_or_init(|| SearchData::new(
-                                    &search_entry.match_range,
-                                    multi_buffer_snapshot
-                                ))
-                                .context_text
-                        )
+                        let search_data = search_entry.render_data.get_or_init(|| {
+                            SearchData::new(&search_entry.match_range, multi_buffer_snapshot)
+                        });
+                        let mut search_result = String::new();
+                        let mut last_end = 0;
+                        for range in &search_data.search_match_indices {
+                            search_result.push_str(&search_data.context_text[last_end..range.start]);
+                            search_result.push('«');
+                            search_result.push_str(&search_data.context_text[range.start..range.end]);
+                            search_result.push('»');
+                            last_end = range.end;
+                        }
+                        search_result.push_str(&search_data.context_text[last_end..]);
+                        format!("search: {search_result}")
}
};
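The marker insertion above can be factored into a small, stdlib-only helper: wrap each match range in the context text with `«…»` so tests can assert on highlight positions. This is a simplified sketch (sorted, non-overlapping byte ranges are assumed, as in the cached `search_match_indices`):

```rust
use std::ops::Range;

// Wrap each matched byte range of `context_text` in «…» markers.
fn mark_matches(context_text: &str, match_ranges: &[Range<usize>]) -> String {
    let mut out = String::new();
    let mut last_end = 0;
    for range in match_ranges {
        out.push_str(&context_text[last_end..range.start]);
        out.push('«');
        out.push_str(&context_text[range.start..range.end]);
        out.push('»');
        last_end = range.end;
    }
    // Append whatever follows the final match.
    out.push_str(&context_text[last_end..]);
    out
}

fn main() {
    // Mirrors the fixture strings used in the tests above.
    assert_eq!(mark_matches("aaa aaa", &[0..3]), "«aaa» aaa");
    assert_eq!(mark_matches("a aaa", &[2..5]), "a «aaa»");
}
```

Rendering the match positions into the expected strings is what lets the new assertions catch off-by-one regressions in `search_match_indices`, not just in the surrounding context text.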
@@ -6732,6 +6795,7 @@ outline: struct OutlineEntryExcerpt
workspace::init_settings(cx);
Project::init_settings(cx);
project_search::init(cx);
buffer_search::init(cx);
super::init(cx);
});
}
@@ -7510,4 +7574,102 @@ outline: fn main()"
);
});
}
#[gpui::test]
async fn test_buffer_search(cx: &mut TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.background_executor.clone());
fs.insert_tree(
"/test",
json!({
"foo.txt": r#"<_constitution>
</_constitution>
## 📊 Output
| Field | Meaning |
"#
}),
)
.await;
let project = Project::test(fs.clone(), ["/test".as_ref()], cx).await;
let workspace = add_outline_panel(&project, cx).await;
let cx = &mut VisualTestContext::from_window(*workspace, cx);
let editor = workspace
.update(cx, |workspace, window, cx| {
workspace.open_abs_path(
PathBuf::from("/test/foo.txt"),
OpenOptions {
visible: Some(OpenVisible::All),
..OpenOptions::default()
},
window,
cx,
)
})
.unwrap()
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
let search_bar = workspace
.update(cx, |_, window, cx| {
cx.new(|cx| {
let mut search_bar = BufferSearchBar::new(None, window, cx);
search_bar.set_active_pane_item(Some(&editor), window, cx);
search_bar.show(window, cx);
search_bar
})
})
.unwrap();
let outline_panel = outline_panel(&workspace, cx);
outline_panel.update_in(cx, |outline_panel, window, cx| {
outline_panel.set_active(true, window, cx)
});
search_bar
.update_in(cx, |search_bar, window, cx| {
search_bar.search(" ", None, true, window, cx)
})
.await
.unwrap();
cx.executor()
.advance_clock(UPDATE_DEBOUNCE + Duration::from_millis(500));
cx.run_until_parked();
outline_panel.update(cx, |outline_panel, cx| {
assert_eq!(
display_entries(
&project,
&snapshot(outline_panel, cx),
&outline_panel.cached_entries,
outline_panel.selected_entry(),
cx,
),
"search: | Field« » | Meaning | <==== selected
search: | Field « » | Meaning |
search: | Field « » | Meaning |
search: | Field « » | Meaning |
search: | Field « »| Meaning |
search: | Field | Meaning« » |
search: | Field | Meaning « » |
search: | Field | Meaning « » |
search: | Field | Meaning « » |
search: | Field | Meaning « » |
search: | Field | Meaning « » |
search: | Field | Meaning « » |
search: | Field | Meaning « »|"
);
});
}
}

View File

@@ -267,6 +267,7 @@ impl AgentServerStore {
// Insert agent servers from extension manifests
match &self.state {
AgentServerStoreState::Local {
node_runtime,
project_environment,
fs,
http_client,
@@ -297,6 +298,7 @@ impl AgentServerStore {
Box::new(LocalExtensionArchiveAgent {
fs: fs.clone(),
http_client: http_client.clone(),
node_runtime: node_runtime.clone(),
project_environment: project_environment.clone(),
extension_id: Arc::from(ext_id),
agent_id: agent_name.clone(),
@@ -444,6 +446,13 @@ impl AgentServerStore {
cx.emit(AgentServersUpdated);
}
pub fn node_runtime(&self) -> Option<NodeRuntime> {
match &self.state {
AgentServerStoreState::Local { node_runtime, .. } => Some(node_runtime.clone()),
_ => None,
}
}
pub fn local(
node_runtime: NodeRuntime,
fs: Arc<dyn Fs>,
@@ -1364,6 +1373,7 @@ fn asset_name(version: &str) -> Option<String> {
struct LocalExtensionArchiveAgent {
fs: Arc<dyn Fs>,
http_client: Arc<dyn HttpClient>,
node_runtime: NodeRuntime,
project_environment: Entity<ProjectEnvironment>,
extension_id: Arc<str>,
agent_id: Arc<str>,
@@ -1387,6 +1397,7 @@ impl ExternalAgentServer for LocalExtensionArchiveAgent {
) -> Task<Result<(AgentServerCommand, String, Option<task::SpawnInTerminal>)>> {
let fs = self.fs.clone();
let http_client = self.http_client.clone();
let node_runtime = self.node_runtime.clone();
let project_environment = self.project_environment.downgrade();
let extension_id = self.extension_id.clone();
let agent_id = self.agent_id.clone();
@@ -1534,23 +1545,29 @@ impl ExternalAgentServer for LocalExtensionArchiveAgent {
// Validate and resolve cmd path
let cmd = &target_config.cmd;
-            if cmd.contains("..") {
-                anyhow::bail!("command path cannot contain '..': {}", cmd);
-            }
-            let cmd_path = if cmd.starts_with("./") || cmd.starts_with(".\\") {
-                // Relative to extraction directory
-                version_dir.join(&cmd[2..])
-            } else {
-                // On PATH
-                anyhow::bail!("command must be relative (start with './'): {}", cmd);
-            };
-            anyhow::ensure!(
-                fs.is_file(&cmd_path).await,
-                "Missing command {} after extraction",
-                cmd_path.to_string_lossy()
-            );
+            let cmd_path = if cmd == "node" {
+                // Use Zed's managed Node.js runtime
+                node_runtime.binary_path().await?
+            } else {
+                if cmd.contains("..") {
+                    anyhow::bail!("command path cannot contain '..': {}", cmd);
+                }
+                if cmd.starts_with("./") || cmd.starts_with(".\\") {
+                    // Relative to extraction directory
+                    let cmd_path = version_dir.join(&cmd[2..]);
+                    anyhow::ensure!(
+                        fs.is_file(&cmd_path).await,
+                        "Missing command {} after extraction",
+                        cmd_path.to_string_lossy()
+                    );
+                    cmd_path
+                } else {
+                    // On PATH
+                    anyhow::bail!("command must be relative (start with './'): {}", cmd);
+                }
+            };
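The command resolution logic above has three outcomes: a literal `node` maps to the editor-managed runtime binary, a `./…` path resolves inside the archive extraction directory, and anything else (or any path containing `..`) is rejected. A simplified, synchronous sketch (the real code also verifies the resolved file exists; the paths in `main` are illustrative):

```rust
use std::path::{Path, PathBuf};

// Resolve an extension agent's `cmd` against the extraction directory.
fn resolve_cmd(cmd: &str, version_dir: &Path, managed_node: &Path) -> Result<PathBuf, String> {
    if cmd == "node" {
        // Use the managed Node.js runtime rather than whatever is on PATH.
        return Ok(managed_node.to_path_buf());
    }
    if cmd.contains("..") {
        return Err(format!("command path cannot contain '..': {cmd}"));
    }
    if cmd.starts_with("./") || cmd.starts_with(".\\") {
        // Relative to the archive extraction directory.
        Ok(version_dir.join(&cmd[2..]))
    } else {
        Err(format!("command must be relative (start with './'): {cmd}"))
    }
}

fn main() {
    let dir = Path::new("/cache/ext/1.0.0");
    let node = Path::new("/managed/node/bin/node");
    assert_eq!(resolve_cmd("node", dir, node).unwrap(), node);
    assert_eq!(
        resolve_cmd("./release-agent", dir, node).unwrap(),
        PathBuf::from("/cache/ext/1.0.0/release-agent")
    );
    assert!(resolve_cmd("../evil", dir, node).is_err());
    assert!(resolve_cmd("python", dir, node).is_err());
}
```

Rejecting `..` before joining is what keeps a malicious manifest from escaping the extraction directory; returning `version_dir` as the working directory is what lets relative `args` like `./config.json` resolve correctly.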
let command = AgentServerCommand {
path: cmd_path,
@@ -1558,7 +1575,7 @@ impl ExternalAgentServer for LocalExtensionArchiveAgent {
env: Some(env),
};
-            Ok((command, root_dir.to_string_lossy().into_owned(), None))
+            Ok((command, version_dir.to_string_lossy().into_owned(), None))
})
}
@@ -1829,6 +1846,7 @@ mod extension_agent_tests {
let agent = LocalExtensionArchiveAgent {
fs,
http_client,
node_runtime: node_runtime::NodeRuntime::unavailable(),
project_environment,
extension_id: Arc::from("my-extension"),
agent_id: Arc::from("my-agent"),
@@ -1893,4 +1911,85 @@ mod extension_agent_tests {
let target = manifest_entry.targets.get("linux-x86_64").unwrap();
assert_eq!(target.cmd, "./release-agent");
}
#[gpui::test]
async fn test_node_command_uses_managed_runtime(cx: &mut TestAppContext) {
let fs = fs::FakeFs::new(cx.background_executor.clone());
let http_client = http_client::FakeHttpClient::with_404_response();
let node_runtime = NodeRuntime::unavailable();
let project_environment = cx.new(|cx| crate::ProjectEnvironment::new(None, cx));
let agent = LocalExtensionArchiveAgent {
fs,
http_client,
node_runtime,
project_environment,
extension_id: Arc::from("node-extension"),
agent_id: Arc::from("node-agent"),
targets: {
let mut map = HashMap::default();
map.insert(
"darwin-aarch64".to_string(),
extension::TargetConfig {
archive: "https://example.com/node-agent.zip".into(),
cmd: "node".into(),
args: vec!["index.js".into()],
sha256: None,
},
);
map
},
env: HashMap::default(),
};
// Verify that when cmd is "node", it attempts to use the node runtime
assert_eq!(agent.extension_id.as_ref(), "node-extension");
assert_eq!(agent.agent_id.as_ref(), "node-agent");
let target = agent.targets.get("darwin-aarch64").unwrap();
assert_eq!(target.cmd, "node");
assert_eq!(target.args, vec!["index.js"]);
}
#[gpui::test]
async fn test_commands_run_in_extraction_directory(cx: &mut TestAppContext) {
let fs = fs::FakeFs::new(cx.background_executor.clone());
let http_client = http_client::FakeHttpClient::with_404_response();
let node_runtime = NodeRuntime::unavailable();
let project_environment = cx.new(|cx| crate::ProjectEnvironment::new(None, cx));
let agent = LocalExtensionArchiveAgent {
fs,
http_client,
node_runtime,
project_environment,
extension_id: Arc::from("test-ext"),
agent_id: Arc::from("test-agent"),
targets: {
let mut map = HashMap::default();
map.insert(
"darwin-aarch64".to_string(),
extension::TargetConfig {
archive: "https://example.com/test.zip".into(),
cmd: "node".into(),
args: vec![
"server.js".into(),
"--config".into(),
"./config.json".into(),
],
sha256: None,
},
);
map
},
env: HashMap::default(),
};
// Verify the agent is configured with relative paths in args
let target = agent.targets.get("darwin-aarch64").unwrap();
assert_eq!(target.args[0], "server.js");
assert_eq!(target.args[2], "./config.json");
// These relative paths will resolve relative to the extraction directory
// when the command is executed
}
}

View File

@@ -190,7 +190,7 @@ pub struct DocumentDiagnostics {
version: Option<i32>,
}
-#[derive(Default)]
+#[derive(Default, Debug)]
struct DynamicRegistrations {
did_change_watched_files: HashMap<String, Vec<FileSystemWatcher>>,
diagnostics: HashMap<Option<String>, DiagnosticServerCapabilities>,
@@ -804,23 +804,32 @@ impl LocalLspStore {
language_server
.on_request::<lsp::request::InlayHintRefreshRequest, _, _>({
let lsp_store = lsp_store.clone();
+                let request_id = Arc::new(AtomicUsize::new(0));
                move |(), cx| {
-                    let this = lsp_store.clone();
+                    let lsp_store = lsp_store.clone();
+                    let request_id = request_id.clone();
                    let mut cx = cx.clone();
                    async move {
-                        this.update(&mut cx, |lsp_store, cx| {
-                            cx.emit(LspStoreEvent::RefreshInlayHints(server_id));
-                            lsp_store
-                                .downstream_client
-                                .as_ref()
-                                .map(|(client, project_id)| {
-                                    client.send(proto::RefreshInlayHints {
-                                        project_id: *project_id,
-                                        server_id: server_id.to_proto(),
-                                    })
-                                })
-                        })?
-                        .transpose()?;
+                        lsp_store
+                            .update(&mut cx, |lsp_store, cx| {
+                                let request_id =
+                                    Some(request_id.fetch_add(1, atomic::Ordering::AcqRel));
+                                cx.emit(LspStoreEvent::RefreshInlayHints {
+                                    server_id,
+                                    request_id,
+                                });
+                                lsp_store
+                                    .downstream_client
+                                    .as_ref()
+                                    .map(|(client, project_id)| {
+                                        client.send(proto::RefreshInlayHints {
+                                            project_id: *project_id,
+                                            server_id: server_id.to_proto(),
+                                            request_id: request_id.map(|id| id as u64),
+                                        })
+                                    })
+                            })?
+                            .transpose()?;
Ok(())
}
}
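The hunk above stamps every server-initiated inlay-hint refresh with a monotonically increasing id drawn from an `AtomicUsize`. A hedged sketch of how such an id can make cache invalidation idempotent when the same refresh is delivered more than once — the `HintCache` type and its field are illustrative, not Zed's actual cache:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

struct HintCache {
    // Highest request id the cache has already been invalidated for.
    last_invalidated_request: Option<usize>,
}

impl HintCache {
    // Returns true only the first time a given request id is seen.
    fn invalidate_for_server_refresh(&mut self, request_id: Option<usize>) -> bool {
        match (request_id, self.last_invalidated_request) {
            (Some(new), Some(seen)) if new <= seen => false,
            (new, _) => {
                self.last_invalidated_request = new.or(self.last_invalidated_request);
                true
            }
        }
    }
}

fn main() {
    let counter = AtomicUsize::new(0);
    let mut cache = HintCache { last_invalidated_request: None };
    let first = Some(counter.fetch_add(1, Ordering::AcqRel));
    assert!(cache.invalidate_for_server_refresh(first)); // id 0: invalidates
    assert!(!cache.invalidate_for_server_refresh(first)); // same id: skipped
    let second = Some(counter.fetch_add(1, Ordering::AcqRel));
    assert!(cache.invalidate_for_server_refresh(second)); // id 1: invalidates
}
```

The id also travels over the wire (`request_id: request_id.map(|id| id as u64)` in `proto::RefreshInlayHints`), so remote peers can apply the same dedup.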
@@ -3610,7 +3619,10 @@ pub enum LspStoreEvent {
new_language: Option<Arc<Language>>,
},
Notification(String),
-    RefreshInlayHints(LanguageServerId),
+    RefreshInlayHints {
+        server_id: LanguageServerId,
+        request_id: Option<usize>,
+    },
RefreshCodeLens,
DiagnosticsUpdated {
server_id: LanguageServerId,
@@ -6583,14 +6595,22 @@ impl LspStore {
cx: &mut Context<Self>,
) -> HashMap<Range<BufferRow>, Task<Result<CacheInlayHints>>> {
let buffer_snapshot = buffer.read(cx).snapshot();
-        let for_server = if let InvalidationStrategy::RefreshRequested(server_id) = invalidate {
+        let next_hint_id = self.next_hint_id.clone();
+        let lsp_data = self.latest_lsp_data(&buffer, cx);
+        let mut lsp_refresh_requested = false;
+        let for_server = if let InvalidationStrategy::RefreshRequested {
+            server_id,
+            request_id,
+        } = invalidate
+        {
+            let invalidated = lsp_data
+                .inlay_hints
+                .invalidate_for_server_refresh(server_id, request_id);
+            lsp_refresh_requested = invalidated;
            Some(server_id)
        } else {
            None
        };
-        let invalidate_cache = invalidate.should_invalidate();
-        let next_hint_id = self.next_hint_id.clone();
-        let lsp_data = self.latest_lsp_data(&buffer, cx);
let existing_inlay_hints = &mut lsp_data.inlay_hints;
let known_chunks = known_chunks
.filter(|(known_version, _)| !lsp_data.buffer_version.changed_since(known_version))
@@ -6598,8 +6618,8 @@ impl LspStore {
.unwrap_or_default();
let mut hint_fetch_tasks = Vec::new();
-        let mut cached_inlay_hints = HashMap::default();
-        let mut ranges_to_query = Vec::new();
+        let mut cached_inlay_hints = None;
+        let mut ranges_to_query = None;
let applicable_chunks = existing_inlay_hints
.applicable_chunks(ranges.as_slice())
.filter(|chunk| !known_chunks.contains(&(chunk.start..chunk.end)))
@@ -6608,39 +6628,38 @@ impl LspStore {
return HashMap::default();
}
-        let last_chunk_number = applicable_chunks.len() - 1;
+        let last_chunk_number = existing_inlay_hints.buffer_chunks_len() - 1;
-        for (i, row_chunk) in applicable_chunks.into_iter().enumerate() {
+        for row_chunk in applicable_chunks {
match (
existing_inlay_hints
.cached_hints(&row_chunk)
-                    .filter(|_| !invalidate_cache)
+                    .filter(|_| !lsp_refresh_requested)
.cloned(),
existing_inlay_hints
.fetched_hints(&row_chunk)
.as_ref()
-                    .filter(|_| !invalidate_cache)
+                    .filter(|_| !lsp_refresh_requested)
.cloned(),
) {
(None, None) => {
-                    let end = if last_chunk_number == i {
+                    let end = if last_chunk_number == row_chunk.id {
Point::new(row_chunk.end, buffer_snapshot.line_len(row_chunk.end))
} else {
Point::new(row_chunk.end, 0)
};
-                    ranges_to_query.push((
+                    ranges_to_query.get_or_insert_with(Vec::new).push((
row_chunk,
buffer_snapshot.anchor_before(Point::new(row_chunk.start, 0))
..buffer_snapshot.anchor_after(end),
));
}
(None, Some(fetched_hints)) => {
hint_fetch_tasks.push((row_chunk, fetched_hints.clone()))
}
(None, Some(fetched_hints)) => hint_fetch_tasks.push((row_chunk, fetched_hints)),
(Some(cached_hints), None) => {
for (server_id, cached_hints) in cached_hints {
if for_server.is_none_or(|for_server| for_server == server_id) {
cached_inlay_hints
.get_or_insert_with(HashMap::default)
.entry(row_chunk.start..row_chunk.end)
.or_insert_with(HashMap::default)
.entry(server_id)
@@ -6650,10 +6669,11 @@ impl LspStore {
}
}
(Some(cached_hints), Some(fetched_hints)) => {
hint_fetch_tasks.push((row_chunk, fetched_hints.clone()));
hint_fetch_tasks.push((row_chunk, fetched_hints));
for (server_id, cached_hints) in cached_hints {
if for_server.is_none_or(|for_server| for_server == server_id) {
cached_inlay_hints
.get_or_insert_with(HashMap::default)
.entry(row_chunk.start..row_chunk.end)
.or_insert_with(HashMap::default)
.entry(server_id)
@@ -6665,18 +6685,18 @@ impl LspStore {
}
}
let cached_chunk_data = cached_inlay_hints
.into_iter()
.map(|(row_chunk, hints)| (row_chunk, Task::ready(Ok(hints))))
.collect();
if hint_fetch_tasks.is_empty() && ranges_to_query.is_empty() {
cached_chunk_data
if hint_fetch_tasks.is_empty()
&& ranges_to_query
.as_ref()
.is_none_or(|ranges| ranges.is_empty())
&& let Some(cached_inlay_hints) = cached_inlay_hints
{
cached_inlay_hints
.into_iter()
.map(|(row_chunk, hints)| (row_chunk, Task::ready(Ok(hints))))
.collect()
} else {
if invalidate_cache {
lsp_data.inlay_hints.clear();
}
for (chunk, range_to_query) in ranges_to_query {
for (chunk, range_to_query) in ranges_to_query.into_iter().flatten() {
let next_hint_id = next_hint_id.clone();
let buffer = buffer.clone();
let new_inlay_hints = cx
@@ -6692,31 +6712,38 @@ impl LspStore {
let update_cache = !lsp_data
.buffer_version
.changed_since(&buffer.read(cx).version());
new_hints_by_server
.into_iter()
.map(|(server_id, new_hints)| {
let new_hints = new_hints
.into_iter()
.map(|new_hint| {
(
InlayId::Hint(next_hint_id.fetch_add(
1,
atomic::Ordering::AcqRel,
)),
new_hint,
)
})
.collect::<Vec<_>>();
if update_cache {
lsp_data.inlay_hints.insert_new_hints(
chunk,
server_id,
new_hints.clone(),
);
}
(server_id, new_hints)
})
.collect()
if new_hints_by_server.is_empty() {
if update_cache {
lsp_data.inlay_hints.invalidate_for_chunk(chunk);
}
HashMap::default()
} else {
new_hints_by_server
.into_iter()
.map(|(server_id, new_hints)| {
let new_hints = new_hints
.into_iter()
.map(|new_hint| {
(
InlayId::Hint(next_hint_id.fetch_add(
1,
atomic::Ordering::AcqRel,
)),
new_hint,
)
})
.collect::<Vec<_>>();
if update_cache {
lsp_data.inlay_hints.insert_new_hints(
chunk,
server_id,
new_hints.clone(),
);
}
(server_id, new_hints)
})
.collect()
}
})
})
.map_err(Arc::new)
@@ -6728,22 +6755,25 @@ impl LspStore {
hint_fetch_tasks.push((chunk, new_inlay_hints));
}
let mut combined_data = cached_chunk_data;
combined_data.extend(hint_fetch_tasks.into_iter().map(|(chunk, hints_fetch)| {
(
chunk.start..chunk.end,
cx.spawn(async move |_, _| {
hints_fetch.await.map_err(|e| {
if e.error_code() != ErrorCode::Internal {
anyhow!(e.error_code())
} else {
anyhow!("{e:#}")
}
})
}),
)
}));
combined_data
cached_inlay_hints
.unwrap_or_default()
.into_iter()
.map(|(row_chunk, hints)| (row_chunk, Task::ready(Ok(hints))))
.chain(hint_fetch_tasks.into_iter().map(|(chunk, hints_fetch)| {
(
chunk.start..chunk.end,
cx.spawn(async move |_, _| {
hints_fetch.await.map_err(|e| {
if e.error_code() != ErrorCode::Internal {
anyhow!(e.error_code())
} else {
anyhow!("{e:#}")
}
})
}),
)
}))
.collect()
}
}
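The hunk above pairs each newly fetched hint with a process-unique `InlayId` drawn from a shared counter via `next_hint_id.fetch_add(1, atomic::Ordering::AcqRel)`. A stdlib-only sketch of that allocation pattern (the `InlayId` and `assign_ids` names here are stand-ins, not Zed's API):

```rust
use std::sync::Arc;
use std::sync::atomic::{AtomicUsize, Ordering};

// Hypothetical stand-in for Zed's `InlayId::Hint`.
#[derive(Debug, PartialEq)]
struct InlayId(usize);

// Pair each fetched hint with a unique id, as the diff does with
// `next_hint_id.fetch_add(1, atomic::Ordering::AcqRel)`.
fn assign_ids(next_id: &Arc<AtomicUsize>, hints: Vec<&'static str>) -> Vec<(InlayId, &'static str)> {
    hints
        .into_iter()
        .map(|hint| (InlayId(next_id.fetch_add(1, Ordering::AcqRel)), hint))
        .collect()
}

fn main() {
    let counter = Arc::new(AtomicUsize::new(0));
    let first = assign_ids(&counter, vec![": i32", ": u8"]);
    let second = assign_ids(&counter, vec![": bool"]);
    assert_eq!(first[0].0, InlayId(0));
    assert_eq!(first[1].0, InlayId(1));
    // Ids keep increasing across batches, so hints cached earlier and hints
    // fetched later can never collide in `hints_by_id`.
    assert_eq!(second[0].0, InlayId(2));
}
```

Because `fetch_add` returns the previous value atomically, concurrent fetch tasks holding clones of the same `Arc<AtomicUsize>` each get disjoint ids without any locking.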
@@ -9542,7 +9572,10 @@ impl LspStore {
if let Some(work) = status.pending_work.remove(&token)
&& !work.is_disk_based_diagnostics_progress
{
cx.emit(LspStoreEvent::RefreshInlayHints(language_server_id));
cx.emit(LspStoreEvent::RefreshInlayHints {
server_id: language_server_id,
request_id: None,
});
}
cx.notify();
}
@@ -9679,9 +9712,10 @@ impl LspStore {
mut cx: AsyncApp,
) -> Result<proto::Ack> {
lsp_store.update(&mut cx, |_, cx| {
cx.emit(LspStoreEvent::RefreshInlayHints(
LanguageServerId::from_proto(envelope.payload.server_id),
));
cx.emit(LspStoreEvent::RefreshInlayHints {
server_id: LanguageServerId::from_proto(envelope.payload.server_id),
request_id: envelope.payload.request_id.map(|id| id as usize),
});
})?;
Ok(proto::Ack {})
}
@@ -10900,7 +10934,6 @@ impl LspStore {
language_server.name(),
Some(key.worktree_id),
));
cx.emit(LspStoreEvent::RefreshInlayHints(server_id));
let server_capabilities = language_server.capabilities();
if let Some((downstream_client, project_id)) = self.downstream_client.as_ref() {
@@ -11931,14 +11964,11 @@ impl LspStore {
.context("Could not obtain Language Servers state")?;
local
.language_server_dynamic_registrations
.get_mut(&server_id)
.and_then(|registrations| {
registrations
.diagnostics
.insert(Some(reg.id.clone()), caps.clone())
});
.entry(server_id)
.or_default()
.diagnostics
.insert(Some(reg.id.clone()), caps.clone());
let mut can_now_provide_diagnostics = false;
if let LanguageServerState::Running {
workspace_diagnostics_refresh_tasks,
..
@@ -11951,20 +11981,42 @@ impl LspStore {
)
{
workspace_diagnostics_refresh_tasks.insert(Some(reg.id), task);
can_now_provide_diagnostics = true;
}
// We don't actually care about capabilities.diagnostic_provider, but it IS relevant for the remote peer
// to know that there's at least one provider. Otherwise, it will never ask us to issue documentDiagnostic calls on their behalf,
// as it'll think that they're not supported.
if can_now_provide_diagnostics {
server.update_capabilities(|capabilities| {
debug_assert!(capabilities.diagnostic_provider.is_none());
let mut did_update_caps = false;
server.update_capabilities(|capabilities| {
if capabilities.diagnostic_provider.as_ref().is_none_or(
|current_caps| {
let supports_workspace_diagnostics =
|capabilities: &DiagnosticServerCapabilities| {
match capabilities {
DiagnosticServerCapabilities::Options(
diagnostic_options,
) => diagnostic_options.workspace_diagnostics,
DiagnosticServerCapabilities::RegistrationOptions(
diagnostic_registration_options,
) => {
diagnostic_registration_options
.diagnostic_options
.workspace_diagnostics
}
}
};
// We don't actually care about capabilities.diagnostic_provider, but it IS relevant for the remote peer
// to know that there's at least one provider. Otherwise, it will never ask us to issue documentDiagnostic calls on their behalf,
// as it'll think that they're not supported.
// If we did not support any workspace diagnostics up to this point but now do, let's update.
!supports_workspace_diagnostics(current_caps)
&& supports_workspace_diagnostics(&caps)
},
) {
did_update_caps = true;
capabilities.diagnostic_provider = Some(caps);
});
}
});
if did_update_caps {
notify_server_capabilities_updated(&server, cx);
}
notify_server_capabilities_updated(&server, cx);
}
}
"textDocument/documentColor" => {
@@ -12199,10 +12251,7 @@ impl LspStore {
.update(cx, |buffer, _| buffer.wait_for_version(version))?
.await?;
lsp_store.update(cx, |lsp_store, cx| {
let lsp_data = lsp_store
.lsp_data
.entry(buffer_id)
.or_insert_with(|| BufferLspData::new(&buffer, cx));
let lsp_data = lsp_store.latest_lsp_data(&buffer, cx);
let chunks_queried_for = lsp_data
.inlay_hints
.applicable_chunks(&[range])


@@ -19,7 +19,10 @@ pub enum InvalidationStrategy {
/// Demands that all needed inlay hints be re-queried and all cached entries invalidated, but does not require an instant update upon invalidation.
///
/// Although nothing forbids a language server from sending this request on every edit, it is expected only on certain internal server state updates that are otherwise invisible to the editor.
RefreshRequested(LanguageServerId),
RefreshRequested {
server_id: LanguageServerId,
request_id: Option<usize>,
},
/// Multibuffer excerpt(s) and/or singleton buffer(s) were edited in at least one place.
/// Neither the editor nor the LSP can tell which open files' hints are unaffected, so all of them have to be invalidated and re-queried, quickly enough to stay responsive, yet debounced to avoid fetching hints on every fast keystroke sequence.
BufferEdited,
@@ -36,7 +39,7 @@ impl InvalidationStrategy {
pub fn should_invalidate(&self) -> bool {
matches!(
self,
InvalidationStrategy::RefreshRequested(_) | InvalidationStrategy::BufferEdited
InvalidationStrategy::RefreshRequested { .. } | InvalidationStrategy::BufferEdited
)
}
}
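The refactor above turns `RefreshRequested(LanguageServerId)` into a struct variant carrying an optional `request_id`, while `should_invalidate` keeps matching it with `{ .. }`. A minimal compilable sketch of the new shape (the `LanguageServerId` stand-in and the `None` variant are assumptions to make the example self-contained):

```rust
// Hypothetical stand-in for Zed's `LanguageServerId` (the real type lives elsewhere).
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct LanguageServerId(usize);

// Mirror of the refactored enum from the diff: `RefreshRequested` now carries
// an optional `request_id` so duplicate server refreshes can be deduplicated.
enum InvalidationStrategy {
    RefreshRequested {
        server_id: LanguageServerId,
        request_id: Option<usize>,
    },
    BufferEdited,
    None, // assumed extra variant so `should_invalidate` is non-trivial
}

impl InvalidationStrategy {
    fn should_invalidate(&self) -> bool {
        matches!(
            self,
            InvalidationStrategy::RefreshRequested { .. } | InvalidationStrategy::BufferEdited
        )
    }
}

fn main() {
    let refresh = InvalidationStrategy::RefreshRequested {
        server_id: LanguageServerId(1),
        request_id: Some(7),
    };
    // Both refreshes and edits invalidate; only the no-op strategy does not.
    assert!(refresh.should_invalidate());
    assert!(InvalidationStrategy::BufferEdited.should_invalidate());
    assert!(!InvalidationStrategy::None.should_invalidate());
}
```

The `{ .. }` pattern is what lets the rest of the diff add fields to the variant without touching every `matches!` site.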
@@ -47,6 +50,7 @@ pub struct BufferInlayHints {
hints_by_chunks: Vec<Option<CacheInlayHints>>,
fetches_by_chunks: Vec<Option<CacheInlayHintsTask>>,
hints_by_id: HashMap<InlayId, HintForId>,
latest_invalidation_requests: HashMap<LanguageServerId, Option<usize>>,
pub(super) hint_resolves: HashMap<InlayId, Shared<Task<()>>>,
}
@@ -67,7 +71,7 @@ struct HintForId {
/// <https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#inlayHintParams>
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct BufferChunk {
id: usize,
pub id: usize,
pub start: BufferRow,
pub end: BufferRow,
}
@@ -104,6 +108,7 @@ impl BufferInlayHints {
Self {
hints_by_chunks: vec![None; buffer_chunks.len()],
fetches_by_chunks: vec![None; buffer_chunks.len()],
latest_invalidation_requests: HashMap::default(),
hints_by_id: HashMap::default(),
hint_resolves: HashMap::default(),
snapshot,
@@ -176,6 +181,7 @@ impl BufferInlayHints {
self.fetches_by_chunks = vec![None; self.buffer_chunks.len()];
self.hints_by_id.clear();
self.hint_resolves.clear();
self.latest_invalidation_requests.clear();
}
pub fn insert_new_hints(
@@ -218,4 +224,52 @@ impl BufferInlayHints {
debug_assert_eq!(*hint_id, id, "Invalid pointer {hint_for_id:?}");
Some(hint)
}
pub fn buffer_chunks_len(&self) -> usize {
self.buffer_chunks.len()
}
pub(crate) fn invalidate_for_server_refresh(
&mut self,
for_server: LanguageServerId,
request_id: Option<usize>,
) -> bool {
match self.latest_invalidation_requests.entry(for_server) {
hash_map::Entry::Occupied(mut o) => {
if request_id > *o.get() {
o.insert(request_id);
} else {
return false;
}
}
hash_map::Entry::Vacant(v) => {
v.insert(request_id);
}
}
for (chunk_id, chunk_data) in self.hints_by_chunks.iter_mut().enumerate() {
if let Some(removed_hints) = chunk_data
.as_mut()
.and_then(|chunk_data| chunk_data.remove(&for_server))
{
for (id, _) in removed_hints {
self.hints_by_id.remove(&id);
self.hint_resolves.remove(&id);
}
self.fetches_by_chunks[chunk_id] = None;
}
}
true
}
pub(crate) fn invalidate_for_chunk(&mut self, chunk: BufferChunk) {
self.fetches_by_chunks[chunk.id] = None;
if let Some(hints_by_server) = self.hints_by_chunks[chunk.id].take() {
for (hint_id, _) in hints_by_server.into_values().flatten() {
self.hints_by_id.remove(&hint_id);
self.hint_resolves.remove(&hint_id);
}
}
}
}
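`invalidate_for_server_refresh` above relies on `Option<usize>`'s derived ordering (`None < Some(0) < Some(1) < …`) to accept only strictly newer request ids per server. A stdlib sketch of that dedup logic under the same assumption (`accept_refresh` and the plain `usize` server key are stand-ins for the real types):

```rust
use std::collections::HashMap;
use std::collections::hash_map::Entry;

// Stand-in for the per-server bookkeeping in `BufferInlayHints`:
// remember the newest `request_id` seen per server and reject stale repeats.
fn accept_refresh(
    latest: &mut HashMap<usize, Option<usize>>,
    server_id: usize,
    request_id: Option<usize>,
) -> bool {
    match latest.entry(server_id) {
        Entry::Occupied(mut o) => {
            // `Option` orders as None < Some(0) < Some(1) < ..., so a refresh
            // without an id never outranks one that carried an id.
            if request_id > *o.get() {
                o.insert(request_id);
                true
            } else {
                false
            }
        }
        Entry::Vacant(v) => {
            v.insert(request_id);
            true
        }
    }
}

fn main() {
    let mut latest = HashMap::new();
    assert!(accept_refresh(&mut latest, 1, Some(1))); // first request wins
    assert!(!accept_refresh(&mut latest, 1, Some(1))); // duplicate is ignored
    assert!(accept_refresh(&mut latest, 1, Some(2))); // newer id is accepted
    assert!(!accept_refresh(&mut latest, 1, None)); // id-less refresh is stale
}
```

Only when this returns `true` does the real method go on to drop the server's cached hints, which is exactly the `invalidated`/`lsp_refresh_requested` flag computed earlier in `LspStore`.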


@@ -337,7 +337,10 @@ pub enum Event {
HostReshared,
Reshared,
Rejoined,
RefreshInlayHints(LanguageServerId),
RefreshInlayHints {
server_id: LanguageServerId,
request_id: Option<usize>,
},
RefreshCodeLens,
RevealInProjectPanel(ProjectEntryId),
SnippetEdit(BufferId, Vec<(lsp::Range, Snippet)>),
@@ -3074,9 +3077,13 @@ impl Project {
return;
};
}
LspStoreEvent::RefreshInlayHints(server_id) => {
cx.emit(Event::RefreshInlayHints(*server_id))
}
LspStoreEvent::RefreshInlayHints {
server_id,
request_id,
} => cx.emit(Event::RefreshInlayHints {
server_id: *server_id,
request_id: *request_id,
}),
LspStoreEvent::RefreshCodeLens => cx.emit(Event::RefreshCodeLens),
LspStoreEvent::LanguageServerPrompt(prompt) => {
cx.emit(Event::LanguageServerPrompt(prompt.clone()))


@@ -1815,10 +1815,6 @@ async fn test_disk_based_diagnostics_progress(cx: &mut gpui::TestAppContext) {
fake_server
.start_progress(format!("{}/0", progress_token))
.await;
assert_eq!(
events.next().await.unwrap(),
Event::RefreshInlayHints(fake_server.server.server_id())
);
assert_eq!(
events.next().await.unwrap(),
Event::DiskBasedDiagnosticsStarted {
@@ -1957,10 +1953,6 @@ async fn test_restarting_server_with_diagnostics_running(cx: &mut gpui::TestAppC
Some(worktree_id)
)
);
assert_eq!(
events.next().await.unwrap(),
Event::RefreshInlayHints(fake_server.server.server_id())
);
fake_server.start_progress(progress_token).await;
assert_eq!(
events.next().await.unwrap(),


@@ -466,6 +466,7 @@ message ResolveInlayHintResponse {
message RefreshInlayHints {
uint64 project_id = 1;
uint64 server_id = 2;
optional uint64 request_id = 3;
}
message CodeLens {


@@ -2338,7 +2338,15 @@ pub fn perform_project_search(
#[cfg(test)]
pub mod tests {
use std::{ops::Deref as _, sync::Arc, time::Duration};
use std::{
ops::Deref as _,
path::PathBuf,
sync::{
Arc,
atomic::{self, AtomicUsize},
},
time::Duration,
};
use super::*;
use editor::{DisplayPoint, display_map::DisplayRow};
@@ -4239,6 +4247,8 @@ pub mod tests {
)
.await;
let requests_count = Arc::new(AtomicUsize::new(0));
let closure_requests_count = requests_count.clone();
let project = Project::test(fs.clone(), [path!("/dir").as_ref()], cx).await;
let language_registry = project.read_with(cx, |project, _| project.languages().clone());
let language = rust_lang();
@@ -4250,21 +4260,26 @@ pub mod tests {
inlay_hint_provider: Some(lsp::OneOf::Left(true)),
..lsp::ServerCapabilities::default()
},
initializer: Some(Box::new(|fake_server| {
fake_server.set_request_handler::<lsp::request::InlayHintRequest, _, _>(
move |_, _| async move {
Ok(Some(vec![lsp::InlayHint {
position: lsp::Position::new(0, 17),
label: lsp::InlayHintLabel::String(": i32".to_owned()),
kind: Some(lsp::InlayHintKind::TYPE),
text_edits: None,
tooltip: None,
padding_left: None,
padding_right: None,
data: None,
}]))
},
);
initializer: Some(Box::new(move |fake_server| {
let requests_count = closure_requests_count.clone();
fake_server.set_request_handler::<lsp::request::InlayHintRequest, _, _>({
move |_, _| {
let requests_count = requests_count.clone();
async move {
requests_count.fetch_add(1, atomic::Ordering::Release);
Ok(Some(vec![lsp::InlayHint {
position: lsp::Position::new(0, 17),
label: lsp::InlayHintLabel::String(": i32".to_owned()),
kind: Some(lsp::InlayHintKind::TYPE),
text_edits: None,
tooltip: None,
padding_left: None,
padding_right: None,
data: None,
}]))
}
}
});
})),
..FakeLspAdapter::default()
},
@@ -4278,7 +4293,7 @@ pub mod tests {
});
perform_search(search_view, "let ", cx);
let _fake_server = fake_servers.next().await.unwrap();
let fake_server = fake_servers.next().await.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
search_view
@@ -4291,11 +4306,127 @@ pub mod tests {
);
})
.unwrap();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
1,
"New hints should have been queried",
);
// Can do the 2nd search without any panics
perform_search(search_view, "let ", cx);
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
search_view
.update(cx, |search_view, _, cx| {
assert_eq!(
search_view
.results_editor
.update(cx, |editor, cx| editor.display_text(cx)),
"\n\nfn main() { let a: i32 = 2; }\n"
);
})
.unwrap();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
2,
"We dropped the previous buffer when clearing the old project search results, hence another query was made",
);
let singleton_editor = window
.update(cx, |workspace, window, cx| {
workspace.open_abs_path(
PathBuf::from(path!("/dir/main.rs")),
workspace::OpenOptions::default(),
window,
cx,
)
})
.unwrap()
.await
.unwrap()
.downcast::<Editor>()
.unwrap();
cx.executor().advance_clock(Duration::from_millis(100));
cx.executor().run_until_parked();
singleton_editor.update(cx, |editor, cx| {
assert_eq!(
editor.display_text(cx),
"fn main() { let a: i32 = 2; }\n",
"Newly opened editor should have the correct text with hints",
);
});
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
2,
"Opening the same buffer again should reuse the cached hints",
);
window
.update(cx, |_, window, cx| {
singleton_editor.update(cx, |editor, cx| {
editor.handle_input("test", window, cx);
});
})
.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
singleton_editor.update(cx, |editor, cx| {
assert_eq!(
editor.display_text(cx),
"testfn main() { l: i32et a = 2; }\n",
"Newly opened editor should have the correct text with hints",
);
});
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
3,
"We have edited the buffer and should send a new request",
);
window
.update(cx, |_, window, cx| {
singleton_editor.update(cx, |editor, cx| {
editor.undo(&editor::actions::Undo, window, cx);
});
})
.unwrap();
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
4,
"We have edited the buffer again and should send a new request again",
);
singleton_editor.update(cx, |editor, cx| {
assert_eq!(
editor.display_text(cx),
"fn main() { let a: i32 = 2; }\n",
"Newly opened editor should have the correct text with hints",
);
});
project.update(cx, |_, cx| {
cx.emit(project::Event::RefreshInlayHints {
server_id: fake_server.server.server_id(),
request_id: Some(1),
});
});
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
5,
"After a simulated server refresh request, we should have sent another request",
);
perform_search(search_view, "let ", cx);
cx.executor().advance_clock(Duration::from_secs(1));
cx.executor().run_until_parked();
assert_eq!(
requests_count.load(atomic::Ordering::Acquire),
5,
"New project search should reuse the cached hints",
);
search_view
.update(cx, |search_view, _, cx| {
assert_eq!(


@@ -97,11 +97,9 @@ pub struct EditorSettingsContent {
#[serde(serialize_with = "crate::serialize_optional_f32_with_two_decimal_places")]
pub fast_scroll_sensitivity: Option<f32>,
/// Whether the line numbers on editors gutter are relative or not.
/// When "enabled" shows relative number of buffer lines, when "wrapped" shows
/// relative number of display lines.
///
/// Default: "disabled"
pub relative_line_numbers: Option<RelativeLineNumbers>,
/// Default: false
pub relative_line_numbers: Option<bool>,
/// When to populate a new search's query based on the text under the cursor.
///
/// Default: always
@@ -201,41 +199,6 @@ pub struct EditorSettingsContent {
pub lsp_document_colors: Option<DocumentColorsRenderMode>,
}
#[derive(
Debug,
Clone,
Copy,
Serialize,
Deserialize,
JsonSchema,
MergeFrom,
PartialEq,
Eq,
strum::VariantArray,
strum::VariantNames,
)]
#[serde(rename_all = "snake_case")]
pub enum RelativeLineNumbers {
Disabled,
Enabled,
Wrapped,
}
impl RelativeLineNumbers {
pub fn enabled(&self) -> bool {
match self {
RelativeLineNumbers::Enabled | RelativeLineNumbers::Wrapped => true,
RelativeLineNumbers::Disabled => false,
}
}
pub fn wrapped(&self) -> bool {
match self {
RelativeLineNumbers::Enabled | RelativeLineNumbers::Disabled => false,
RelativeLineNumbers::Wrapped => true,
}
}
}
// Toolbar related settings
#[skip_serializing_none]
#[derive(Debug, Clone, Default, Serialize, Deserialize, JsonSchema, MergeFrom, PartialEq, Eq)]


@@ -275,7 +275,7 @@ impl VsCodeSettings {
}),
redact_private_values: None,
relative_line_numbers: self.read_enum("editor.lineNumbers", |s| match s {
"relative" => Some(RelativeLineNumbers::Enabled),
"relative" => Some(true),
_ => None,
}),
rounded_selection: self.read_bool("editor.roundedSelection"),


@@ -1,8 +1,10 @@
mod dropdown;
mod font_picker;
mod icon_theme_picker;
mod input_field;
mod theme_picker;
pub use dropdown::*;
pub use font_picker::font_picker;
pub use icon_theme_picker::icon_theme_picker;
pub use input_field::*;


@@ -0,0 +1,108 @@
use std::rc::Rc;
use gpui::{App, ElementId, IntoElement, RenderOnce};
use heck::ToTitleCase as _;
use ui::{
ButtonSize, ContextMenu, DropdownMenu, DropdownStyle, FluentBuilder as _, IconPosition, px,
};
#[derive(IntoElement)]
pub struct EnumVariantDropdown<T>
where
T: strum::VariantArray + strum::VariantNames + Copy + PartialEq + Send + Sync + 'static,
{
id: ElementId,
current_value: T,
variants: &'static [T],
labels: &'static [&'static str],
should_do_title_case: bool,
tab_index: Option<isize>,
on_change: Rc<dyn Fn(T, &mut App) + 'static>,
}
impl<T> EnumVariantDropdown<T>
where
T: strum::VariantArray + strum::VariantNames + Copy + PartialEq + Send + Sync + 'static,
{
pub fn new(
id: impl Into<ElementId>,
current_value: T,
variants: &'static [T],
labels: &'static [&'static str],
on_change: impl Fn(T, &mut App) + 'static,
) -> Self {
Self {
id: id.into(),
current_value,
variants,
labels,
should_do_title_case: true,
tab_index: None,
on_change: Rc::new(on_change),
}
}
pub fn title_case(mut self, title_case: bool) -> Self {
self.should_do_title_case = title_case;
self
}
pub fn tab_index(mut self, tab_index: isize) -> Self {
self.tab_index = Some(tab_index);
self
}
}
impl<T> RenderOnce for EnumVariantDropdown<T>
where
T: strum::VariantArray + strum::VariantNames + Copy + PartialEq + Send + Sync + 'static,
{
fn render(self, window: &mut ui::Window, cx: &mut ui::App) -> impl gpui::IntoElement {
let current_value_label = self.labels[self
.variants
.iter()
.position(|v| *v == self.current_value)
.unwrap()];
let context_menu = window.use_keyed_state(current_value_label, cx, |window, cx| {
ContextMenu::new(window, cx, move |mut menu, _, _| {
for (&value, &label) in std::iter::zip(self.variants, self.labels) {
let on_change = self.on_change.clone();
let current_value = self.current_value;
menu = menu.toggleable_entry(
if self.should_do_title_case {
label.to_title_case()
} else {
label.to_string()
},
value == current_value,
IconPosition::End,
None,
move |_, cx| {
on_change(value, cx);
},
);
}
menu
})
});
DropdownMenu::new(
self.id,
if self.should_do_title_case {
current_value_label.to_title_case()
} else {
current_value_label.to_string()
},
context_menu,
)
.when_some(self.tab_index, |elem, tab_index| elem.tab_index(tab_index))
.trigger_size(ButtonSize::Medium)
.style(DropdownStyle::Outlined)
.offset(gpui::Point {
x: px(0.0),
y: px(2.0),
})
.into_any_element()
}
}
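The `EnumVariantDropdown` above walks two parallel slices, `variants` and `labels`, with `std::iter::zip`, and looks up the trigger label by the current value's position. A stdlib-only sketch of that pairing, with no `strum` or `gpui` (the `CursorStyle` enum, `menu_rows`, and `trigger_label` are hypothetical names; in Zed the slices come from the `strum::VariantArray` / `strum::VariantNames` derives):

```rust
// Hypothetical enum standing in for any strum-derived settings enum.
#[derive(Debug, Clone, Copy, PartialEq)]
enum CursorStyle { Bar, Block, Hollow }

// Parallel slices, as produced by strum's VariantArray / VariantNames derives.
const VARIANTS: &[CursorStyle] = &[CursorStyle::Bar, CursorStyle::Block, CursorStyle::Hollow];
const LABELS: &[&str] = &["bar", "block", "hollow"];

// Build (label, is_selected) rows the way the menu closure does.
fn menu_rows(current: CursorStyle) -> Vec<(String, bool)> {
    std::iter::zip(VARIANTS, LABELS)
        .map(|(&value, &label)| (label.to_string(), value == current))
        .collect()
}

// The trigger label is found by position, mirroring
// `labels[variants.iter().position(|v| *v == current_value).unwrap()]`.
fn trigger_label(current: CursorStyle) -> &'static str {
    LABELS[VARIANTS.iter().position(|v| *v == current).unwrap()]
}

fn main() {
    assert_eq!(trigger_label(CursorStyle::Hollow), "hollow");
    let rows = menu_rows(CursorStyle::Block);
    assert_eq!(rows[1], ("block".to_string(), true));
    assert!(!rows[0].1);
}
```

The positional lookup only panics if the two slices fall out of sync, which the derive macros guarantee cannot happen; that is why the component can `unwrap()` safely.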


@@ -1503,7 +1503,7 @@ pub(crate) fn settings_data(cx: &App) -> Vec<SettingsPage> {
}),
SettingsPageItem::SettingItem(SettingItem {
title: "Relative Line Numbers",
description: "Controls line number display in the editor's gutter. \"disabled\" shows absolute line numbers, \"enabled\" shows relative line numbers for each absolute line, and \"wrapped\" shows relative line numbers for every line, absolute or wrapped.",
description: "Whether the line numbers in the editor's gutter are relative or not.",
field: Box::new(SettingField {
json_path: Some("relative_line_numbers"),
pick: |settings_content| {


@@ -11,7 +11,6 @@ use gpui::{
TitlebarOptions, UniformListScrollHandle, Window, WindowBounds, WindowHandle, WindowOptions,
actions, div, list, point, prelude::*, px, uniform_list,
};
use heck::ToTitleCase as _;
use project::{Project, WorktreeId};
use release_channel::ReleaseChannel;
use schemars::JsonSchema;
@@ -38,7 +37,9 @@ use util::{ResultExt as _, paths::PathStyle, rel_path::RelPath};
use workspace::{AppState, OpenOptions, OpenVisible, Workspace, client_side_decorations};
use zed_actions::{OpenSettings, OpenSettingsAt};
use crate::components::{SettingsInputField, font_picker, icon_theme_picker, theme_picker};
use crate::components::{
EnumVariantDropdown, SettingsInputField, font_picker, icon_theme_picker, theme_picker,
};
const NAVBAR_CONTAINER_TAB_INDEX: isize = 0;
const NAVBAR_GROUP_TAB_INDEX: isize = 1;
@@ -491,7 +492,6 @@ fn init_renderers(cx: &mut App) {
.add_basic_renderer::<settings::IncludeIgnoredContent>(render_dropdown)
.add_basic_renderer::<settings::ShowIndentGuides>(render_dropdown)
.add_basic_renderer::<settings::ShellDiscriminants>(render_dropdown)
.add_basic_renderer::<settings::RelativeLineNumbers>(render_dropdown)
// please semicolon stay on next line
;
}
@@ -3326,7 +3326,7 @@ fn render_dropdown<T>(
field: SettingField<T>,
file: SettingsUiFile,
metadata: Option<&SettingsFieldMetadata>,
window: &mut Window,
_window: &mut Window,
cx: &mut App,
) -> AnyElement
where
@@ -3342,56 +3342,19 @@ where
SettingsStore::global(cx).get_value_from_file(file.to_settings(), field.pick);
let current_value = current_value.copied().unwrap_or(variants()[0]);
let current_value_label =
labels()[variants().iter().position(|v| *v == current_value).unwrap()];
DropdownMenu::new(
"dropdown",
if should_do_titlecase {
current_value_label.to_title_case()
} else {
current_value_label.to_string()
},
window.use_state(cx, |window, cx| {
ContextMenu::new(window, cx, move |mut menu, _, _| {
for (&value, &label) in std::iter::zip(variants(), labels()) {
let file = file.clone();
menu = menu.toggleable_entry(
if should_do_titlecase {
label.to_title_case()
} else {
label.to_string()
},
value == current_value,
IconPosition::End,
None,
move |_, cx| {
if value == current_value {
return;
}
update_settings_file(
file.clone(),
field.json_path,
cx,
move |settings, _cx| {
(field.write)(settings, Some(value));
},
)
.log_err(); // todo(settings_ui) don't log err
},
);
}
menu
EnumVariantDropdown::new("dropdown", current_value, variants(), labels(), {
move |value, cx| {
if value == current_value {
return;
}
update_settings_file(file.clone(), field.json_path, cx, move |settings, _cx| {
(field.write)(settings, Some(value));
})
}),
)
.tab_index(0)
.trigger_size(ButtonSize::Medium)
.style(DropdownStyle::Outlined)
.offset(gpui::Point {
x: px(0.0),
y: px(2.0),
.log_err(); // todo(settings_ui) don't log err
}
})
.tab_index(0)
.title_case(should_do_titlecase)
.into_any_element()
}


@@ -15,6 +15,6 @@ path = "src/svg_preview.rs"
editor.workspace = true
file_icons.workspace = true
gpui.workspace = true
multi_buffer.workspace = true
language.workspace = true
ui.workspace = true
workspace.workspace = true


@@ -1,13 +1,13 @@
use std::path::PathBuf;
use std::mem;
use std::sync::Arc;
use editor::Editor;
use file_icons::FileIcons;
use gpui::{
App, Context, Entity, EventEmitter, FocusHandle, Focusable, ImageSource, IntoElement,
ParentElement, Render, Resource, RetainAllImageCache, Styled, Subscription, WeakEntity, Window,
div, img,
App, Context, Entity, EventEmitter, FocusHandle, Focusable, IntoElement, ParentElement, Render,
RenderImage, Styled, Subscription, Task, WeakEntity, Window, div, img,
};
use multi_buffer::{Event as MultiBufferEvent, MultiBuffer};
use language::{Buffer, BufferEvent};
use ui::prelude::*;
use workspace::item::Item;
use workspace::{Pane, Workspace};
@@ -16,9 +16,10 @@ use crate::{OpenFollowingPreview, OpenPreview, OpenPreviewToTheSide};
pub struct SvgPreviewView {
focus_handle: FocusHandle,
svg_path: Option<PathBuf>,
image_cache: Entity<RetainAllImageCache>,
_buffer_subscription: Subscription,
buffer: Option<Entity<Buffer>>,
current_svg: Option<Result<Arc<RenderImage>, SharedString>>,
_refresh: Task<()>,
_buffer_subscription: Option<Subscription>,
_workspace_subscription: Option<Subscription>,
}
@@ -31,6 +32,182 @@ pub enum SvgPreviewMode {
}
impl SvgPreviewView {
pub fn new(
mode: SvgPreviewMode,
active_editor: Entity<Editor>,
workspace_handle: WeakEntity<Workspace>,
window: &mut Window,
cx: &mut Context<Workspace>,
) -> Entity<Self> {
cx.new(|cx| {
let workspace_subscription = if mode == SvgPreviewMode::Follow
&& let Some(workspace) = workspace_handle.upgrade()
{
Some(Self::subscribe_to_workspace(workspace, window, cx))
} else {
None
};
let buffer = active_editor
.read(cx)
.buffer()
.clone()
.read_with(cx, |buffer, _cx| buffer.as_singleton());
let subscription = buffer
.as_ref()
.map(|buffer| Self::create_buffer_subscription(buffer, window, cx));
let mut this = Self {
focus_handle: cx.focus_handle(),
buffer,
current_svg: None,
_buffer_subscription: subscription,
_workspace_subscription: workspace_subscription,
_refresh: Task::ready(()),
};
this.render_image(window, cx);
this
})
}
fn subscribe_to_workspace(
workspace: Entity<Workspace>,
window: &Window,
cx: &mut Context<Self>,
) -> Subscription {
cx.subscribe_in(
&workspace,
window,
move |this: &mut SvgPreviewView, workspace, event: &workspace::Event, window, cx| {
if let workspace::Event::ActiveItemChanged = event {
let workspace = workspace.read(cx);
if let Some(active_item) = workspace.active_item(cx)
&& let Some(editor) = active_item.downcast::<Editor>()
&& Self::is_svg_file(&editor, cx)
{
let Some(buffer) = editor.read(cx).buffer().read(cx).as_singleton() else {
return;
};
if this.buffer.as_ref() != Some(&buffer) {
this._buffer_subscription =
Some(Self::create_buffer_subscription(&buffer, window, cx));
this.buffer = Some(buffer);
this.render_image(window, cx);
cx.notify();
}
} else {
this.set_current(None, window, cx);
}
}
},
)
}
fn render_image(&mut self, window: &Window, cx: &mut Context<Self>) {
let Some(buffer) = self.buffer.as_ref() else {
return;
};
const SCALE_FACTOR: f32 = 1.0;
let renderer = cx.svg_renderer();
let content = buffer.read(cx).snapshot();
let background_task = cx.background_spawn(async move {
renderer.render_single_frame(content.text().as_bytes(), SCALE_FACTOR, true)
});
self._refresh = cx.spawn_in(window, async move |this, cx| {
let result = background_task.await;
this.update_in(cx, |view, window, cx| {
let current = result.map_err(|e| e.to_string().into());
view.set_current(Some(current), window, cx);
})
.ok();
});
}
fn set_current(
&mut self,
image: Option<Result<Arc<RenderImage>, SharedString>>,
window: &mut Window,
cx: &mut Context<Self>,
) {
if let Some(Ok(image)) = mem::replace(&mut self.current_svg, image) {
window.drop_image(image).ok();
}
cx.notify();
}
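`set_current` above swaps the new render result in with `mem::replace` and hands the previous successful image back to `window.drop_image` so GPU resources are released promptly. A stdlib sketch of that swap-and-release pattern (the `Preview` struct and `String` payloads are stand-ins for `Arc<RenderImage>` / `SharedString`, not Zed's API):

```rust
use std::mem;

// Sketch of the swap-in pattern from `SvgPreviewView::set_current`:
// replace the cached result, releasing the previous success value.
struct Preview {
    current: Option<Result<String, String>>, // stands in for Arc<RenderImage> / SharedString
    dropped: Vec<String>,                    // records what `window.drop_image` would free
}

impl Preview {
    fn set_current(&mut self, image: Option<Result<String, String>>) {
        // Only a previously *successful* render holds a resource worth freeing;
        // an old error or empty state is simply overwritten.
        if let Some(Ok(old)) = mem::replace(&mut self.current, image) {
            self.dropped.push(old); // real code: window.drop_image(old).ok();
        }
    }
}

fn main() {
    let mut p = Preview { current: None, dropped: Vec::new() };
    p.set_current(Some(Ok("frame-1".into())));
    p.set_current(Some(Err("parse error".into())));
    // The prior successful frame was released when the error replaced it.
    assert_eq!(p.dropped, vec!["frame-1".to_string()]);
    // Replacing an error result frees nothing.
    p.set_current(None);
    assert_eq!(p.dropped.len(), 1);
}
```

`mem::replace` keeps this a single move with no clone, which matters when the payload is a reference-counted GPU image rather than a small string.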
fn find_existing_preview_item_idx(
pane: &Pane,
editor: &Entity<Editor>,
cx: &App,
) -> Option<usize> {
let buffer_id = editor.read(cx).buffer().entity_id();
pane.items_of_type::<SvgPreviewView>()
.find(|view| {
view.read(cx)
.buffer
.as_ref()
.is_some_and(|buffer| buffer.entity_id() == buffer_id)
})
.and_then(|view| pane.index_for_item(&view))
}
pub fn resolve_active_item_as_svg_editor(
workspace: &Workspace,
cx: &mut Context<Workspace>,
) -> Option<Entity<Editor>> {
workspace
.active_item(cx)?
.act_as::<Editor>(cx)
.filter(|editor| Self::is_svg_file(&editor, cx))
}
fn create_svg_view(
mode: SvgPreviewMode,
workspace: &mut Workspace,
editor: Entity<Editor>,
window: &mut Window,
cx: &mut Context<Workspace>,
) -> Entity<SvgPreviewView> {
let workspace_handle = workspace.weak_handle();
SvgPreviewView::new(mode, editor, workspace_handle, window, cx)
}
fn create_buffer_subscription(
buffer: &Entity<Buffer>,
window: &Window,
cx: &mut Context<Self>,
) -> Subscription {
cx.subscribe_in(
buffer,
window,
move |this, _buffer, event: &BufferEvent, window, cx| match event {
BufferEvent::Edited | BufferEvent::Saved => {
this.render_image(window, cx);
}
_ => {}
},
)
}
pub fn is_svg_file(editor: &Entity<Editor>, cx: &App) -> bool {
editor
.read(cx)
.buffer()
.read(cx)
.as_singleton()
.and_then(|buffer| buffer.read(cx).file())
.is_some_and(|file| {
file.path()
.extension()
.is_some_and(|ext| ext.eq_ignore_ascii_case("svg"))
})
}
pub fn register(workspace: &mut Workspace, _window: &mut Window, _cx: &mut Context<Workspace>) {
workspace.register_action(move |workspace, _: &OpenPreview, window, cx| {
if let Some(editor) = Self::resolve_active_item_as_svg_editor(workspace, cx)
@@ -104,154 +281,6 @@ impl SvgPreviewView {
}
});
}
fn find_existing_preview_item_idx(
pane: &Pane,
editor: &Entity<Editor>,
cx: &App,
) -> Option<usize> {
let editor_path = Self::get_svg_path(editor.read(cx).buffer(), cx);
pane.items_of_type::<SvgPreviewView>()
.find(|view| {
let view_read = view.read(cx);
view_read.svg_path.is_some() && view_read.svg_path == editor_path
})
.and_then(|view| pane.index_for_item(&view))
}
pub fn resolve_active_item_as_svg_editor(
workspace: &Workspace,
cx: &mut Context<Workspace>,
) -> Option<Entity<Editor>> {
let editor = workspace.active_item(cx)?.act_as::<Editor>(cx)?;
if Self::is_svg_file(&editor, cx) {
Some(editor)
} else {
None
}
}
fn create_svg_view(
mode: SvgPreviewMode,
workspace: &mut Workspace,
editor: Entity<Editor>,
window: &mut Window,
cx: &mut Context<Workspace>,
) -> Entity<SvgPreviewView> {
let workspace_handle = workspace.weak_handle();
SvgPreviewView::new(mode, editor, workspace_handle, window, cx)
}
pub fn new(
mode: SvgPreviewMode,
active_editor: Entity<Editor>,
workspace_handle: WeakEntity<Workspace>,
window: &mut Window,
cx: &mut Context<Workspace>,
) -> Entity<Self> {
cx.new(|cx| {
let image_cache = RetainAllImageCache::new(cx);
let buffer = active_editor.read(cx).buffer();
let svg_path = Self::get_svg_path(buffer, cx);
let subscription = Self::create_buffer_subscription(&buffer.clone(), window, cx);
// Subscribe to workspace active item changes to follow SVG files
let workspace_subscription = if mode == SvgPreviewMode::Follow {
workspace_handle.upgrade().map(|workspace_handle| {
cx.subscribe_in(
&workspace_handle,
window,
|this: &mut SvgPreviewView,
workspace,
event: &workspace::Event,
window,
cx| {
if let workspace::Event::ActiveItemChanged = event {
let workspace_read = workspace.read(cx);
if let Some(active_item) = workspace_read.active_item(cx)
&& let Some(editor) = active_item.downcast::<Editor>()
&& Self::is_svg_file(&editor, cx)
{
let buffer = editor.read(cx).buffer();
let new_path = Self::get_svg_path(&buffer, cx);
if this.svg_path != new_path {
this.svg_path = new_path;
this._buffer_subscription =
Self::create_buffer_subscription(
&buffer.clone(),
window,
cx,
);
cx.notify();
}
}
}
},
)
})
} else {
None
};
Self {
focus_handle: cx.focus_handle(),
svg_path,
image_cache,
_buffer_subscription: subscription,
_workspace_subscription: workspace_subscription,
}
})
}
fn create_buffer_subscription(
active_buffer: &Entity<MultiBuffer>,
window: &mut Window,
cx: &mut Context<Self>,
) -> Subscription {
cx.subscribe_in(
active_buffer,
window,
|this: &mut SvgPreviewView, buffer, event: &MultiBufferEvent, window, cx| {
let potential_path_change = event == &MultiBufferEvent::FileHandleChanged;
if event == &MultiBufferEvent::Saved || potential_path_change {
// Remove cached image to force reload
if let Some(svg_path) = &this.svg_path {
let resource = Resource::Path(svg_path.clone().into());
this.image_cache.update(cx, |cache, cx| {
cache.remove(&resource, window, cx);
});
}
if potential_path_change {
this.svg_path = Self::get_svg_path(buffer, cx);
}
cx.notify();
}
},
)
}
pub fn is_svg_file(editor: &Entity<Editor>, cx: &App) -> bool {
let buffer = editor.read(cx).buffer().read(cx);
if let Some(buffer) = buffer.as_singleton()
&& let Some(file) = buffer.read(cx).file()
{
return file
.path()
.extension()
.map(|ext| ext.eq_ignore_ascii_case("svg"))
.unwrap_or(false);
}
false
}
fn get_svg_path(buffer: &Entity<MultiBuffer>, cx: &App) -> Option<PathBuf> {
let buffer = buffer.read(cx).as_singleton()?;
let file = buffer.read(cx).file()?;
let local_file = file.as_local()?;
Some(local_file.abs_path(cx))
}
}
impl Render for SvgPreviewView {
@@ -265,20 +294,19 @@ impl Render for SvgPreviewView {
.flex()
.justify_center()
.items_center()
.child(if let Some(svg_path) = &self.svg_path {
img(ImageSource::from(svg_path.clone()))
.image_cache(&self.image_cache)
.max_w_full()
.max_h_full()
.with_fallback(|| {
div()
.map(|this| match self.current_svg.clone() {
Some(Ok(image)) => {
this.child(img(image).max_w_full().max_h_full().with_fallback(|| {
h_flex()
.p_4()
.child("Failed to load SVG file")
.gap_2()
.child(Icon::new(IconName::Warning))
.child("Failed to load SVG image")
.into_any_element()
})
.into_any_element()
} else {
div().p_4().child("No SVG file selected").into_any_element()
}))
}
Some(Err(e)) => this.child(div().p_4().child(e).into_any_element()),
None => this.child(div().p_4().child("No SVG file selected")),
})
}
}
@@ -295,20 +323,19 @@ impl Item for SvgPreviewView {
type Event = ();
fn tab_icon(&self, _window: &Window, cx: &App) -> Option<Icon> {
// Use the same icon as SVG files in the file tree
self.svg_path
self.buffer
.as_ref()
.and_then(|svg_path| FileIcons::get_icon(svg_path, cx))
.and_then(|buffer| buffer.read(cx).file())
.and_then(|file| FileIcons::get_icon(file.path().as_std_path(), cx))
.map(Icon::from_path)
.or_else(|| Some(Icon::new(IconName::Image)))
}
fn tab_content_text(&self, _detail: usize, _cx: &App) -> SharedString {
self.svg_path
fn tab_content_text(&self, _detail: usize, cx: &App) -> SharedString {
self.buffer
.as_ref()
.and_then(|svg_path| svg_path.file_name())
.map(|name| name.to_string_lossy())
.map(|name| format!("Preview {}", name).into())
.and_then(|svg_path| svg_path.read(cx).file())
.map(|name| format!("Preview {}", name.file_name(cx)).into())
.unwrap_or_else(|| "SVG Preview".into())
}

View File

@@ -2312,6 +2312,7 @@ impl BufferSnapshot {
self.visible_text.len()
} else {
debug_assert!(anchor.buffer_id == Some(self.remote_id));
debug_assert!(self.version.observed(anchor.timestamp));
let anchor_key = InsertionFragmentKey {
timestamp: anchor.timestamp,
split_offset: anchor.offset,
@@ -2335,10 +2336,7 @@ impl BufferSnapshot {
.item()
.filter(|insertion| insertion.timestamp == anchor.timestamp)
else {
panic!(
"invalid anchor {:?}. buffer id: {}, version: {:?}",
anchor, self.remote_id, self.version
);
self.panic_bad_anchor(anchor);
};
let (start, _, item) = self
@@ -2357,13 +2355,29 @@ impl BufferSnapshot {
}
}
fn fragment_id_for_anchor(&self, anchor: &Anchor) -> &Locator {
self.try_fragment_id_for_anchor(anchor).unwrap_or_else(|| {
#[cold]
fn panic_bad_anchor(&self, anchor: &Anchor) -> ! {
if anchor.buffer_id.is_some_and(|id| id != self.remote_id) {
panic!(
"invalid anchor - buffer id does not match: anchor {anchor:?}; buffer id: {}, version: {:?}",
self.remote_id, self.version
);
} else if !self.version.observed(anchor.timestamp) {
panic!(
"invalid anchor - snapshot has not observed lamport: {:?}; version: {:?}",
anchor, self.version
);
} else {
panic!(
"invalid anchor {:?}. buffer id: {}, version: {:?}",
anchor, self.remote_id, self.version,
)
})
anchor, self.remote_id, self.version
);
}
}
fn fragment_id_for_anchor(&self, anchor: &Anchor) -> &Locator {
self.try_fragment_id_for_anchor(anchor)
.unwrap_or_else(|| self.panic_bad_anchor(anchor))
}
fn try_fragment_id_for_anchor(&self, anchor: &Anchor) -> Option<&Locator> {
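The change above moves the panic formatting out of `fragment_id_for_anchor` into a `#[cold]`, never-returning helper, keeping the hot lookup path compact. A sketch of the same pattern on a toy lookup (the `Table` type and names are illustrative, not Zed's):

```rust
struct Table {
    values: Vec<i32>,
}

impl Table {
    /// Cold, never-returning failure path, kept out of line so the
    /// happy path in `get` stays small for the optimizer.
    #[cold]
    fn panic_missing(&self, index: usize) -> ! {
        panic!(
            "invalid index {index}; table has {} entries",
            self.values.len()
        );
    }

    fn try_get(&self, index: usize) -> Option<i32> {
        self.values.get(index).copied()
    }

    fn get(&self, index: usize) -> i32 {
        // `!` coerces to `i32`, so the closure type-checks.
        self.try_get(index)
            .unwrap_or_else(|| self.panic_missing(index))
    }
}

fn main() {
    let table = Table { values: vec![10, 20, 30] };
    assert_eq!(table.get(1), 20);
    println!("ok");
}
```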

View File

@@ -48,6 +48,7 @@ pub struct ContextMenuEntry {
label: SharedString,
icon: Option<IconName>,
custom_icon_path: Option<SharedString>,
custom_icon_svg: Option<SharedString>,
icon_position: IconPosition,
icon_size: IconSize,
icon_color: Option<Color>,
@@ -68,6 +69,7 @@ impl ContextMenuEntry {
label: label.into(),
icon: None,
custom_icon_path: None,
custom_icon_svg: None,
icon_position: IconPosition::Start,
icon_size: IconSize::Small,
icon_color: None,
@@ -94,7 +96,15 @@ impl ContextMenuEntry {
pub fn custom_icon_path(mut self, path: impl Into<SharedString>) -> Self {
self.custom_icon_path = Some(path.into());
self.icon = None; // Clear IconName if custom path is set
self.custom_icon_svg = None; // Clear other icon sources if custom path is set
self.icon = None;
self
}
pub fn custom_icon_svg(mut self, svg: impl Into<SharedString>) -> Self {
self.custom_icon_svg = Some(svg.into());
self.custom_icon_path = None; // Clear other icon sources if custom SVG is set
self.icon = None;
self
}
@@ -396,6 +406,7 @@ impl ContextMenu {
handler: Rc::new(move |_, window, cx| handler(window, cx)),
icon: None,
custom_icon_path: None,
custom_icon_svg: None,
icon_position: IconPosition::End,
icon_size: IconSize::Small,
icon_color: None,
@@ -425,6 +436,7 @@ impl ContextMenu {
handler: Rc::new(move |_, window, cx| handler(window, cx)),
icon: None,
custom_icon_path: None,
custom_icon_svg: None,
icon_position: IconPosition::End,
icon_size: IconSize::Small,
icon_color: None,
@@ -454,6 +466,7 @@ impl ContextMenu {
handler: Rc::new(move |_, window, cx| handler(window, cx)),
icon: None,
custom_icon_path: None,
custom_icon_svg: None,
icon_position: IconPosition::End,
icon_size: IconSize::Small,
icon_color: None,
@@ -482,6 +495,7 @@ impl ContextMenu {
handler: Rc::new(move |_, window, cx| handler(window, cx)),
icon: None,
custom_icon_path: None,
custom_icon_svg: None,
icon_position: position,
icon_size: IconSize::Small,
icon_color: None,
@@ -541,6 +555,7 @@ impl ContextMenu {
}),
icon: None,
custom_icon_path: None,
custom_icon_svg: None,
icon_position: IconPosition::End,
icon_size: IconSize::Small,
icon_color: None,
@@ -572,6 +587,7 @@ impl ContextMenu {
}),
icon: None,
custom_icon_path: None,
custom_icon_svg: None,
icon_size: IconSize::Small,
icon_position: IconPosition::End,
icon_color: None,
@@ -593,6 +609,7 @@ impl ContextMenu {
handler: Rc::new(move |_, window, cx| window.dispatch_action(action.boxed_clone(), cx)),
icon: Some(IconName::ArrowUpRight),
custom_icon_path: None,
custom_icon_svg: None,
icon_size: IconSize::XSmall,
icon_position: IconPosition::End,
icon_color: None,
@@ -913,6 +930,7 @@ impl ContextMenu {
handler,
icon,
custom_icon_path,
custom_icon_svg,
icon_position,
icon_size,
icon_color,
@@ -965,6 +983,28 @@ impl ContextMenu {
)
})
.into_any_element()
} else if let Some(custom_icon_svg) = custom_icon_svg {
h_flex()
.gap_1p5()
.when(
*icon_position == IconPosition::Start && toggle.is_none(),
|flex| {
flex.child(
Icon::from_external_svg(custom_icon_svg.clone())
.size(*icon_size)
.color(icon_color),
)
},
)
.child(Label::new(label.clone()).color(label_color).truncate())
.when(*icon_position == IconPosition::End, |flex| {
flex.child(
Icon::from_external_svg(custom_icon_svg.clone())
.size(*icon_size)
.color(icon_color),
)
})
.into_any_element()
} else if let Some(icon_name) = icon {
h_flex()
.gap_1p5()

View File

@@ -115,24 +115,24 @@ impl From<IconName> for Icon {
/// The source of an icon.
enum IconSource {
/// An SVG embedded in the Zed binary.
Svg(SharedString),
Embedded(SharedString),
/// An image file located at the specified path.
///
/// Currently our SVG renderer is missing support for the following features:
/// 1. Loading SVGs from external files.
/// 2. Rendering polychrome SVGs.
/// Currently our SVG renderer is missing support for rendering polychrome SVGs.
///
/// In order to support icon themes, we render the icons as images instead.
Image(Arc<Path>),
External(Arc<Path>),
/// An SVG not embedded in the Zed binary.
ExternalSvg(SharedString),
}
impl IconSource {
fn from_path(path: impl Into<SharedString>) -> Self {
let path = path.into();
if path.starts_with("icons/") {
Self::Svg(path)
Self::Embedded(path)
} else {
Self::Image(Arc::from(PathBuf::from(path.as_ref())))
Self::External(Arc::from(PathBuf::from(path.as_ref())))
}
}
}
@@ -148,7 +148,7 @@ pub struct Icon {
impl Icon {
pub fn new(icon: IconName) -> Self {
Self {
source: IconSource::Svg(icon.path().into()),
source: IconSource::Embedded(icon.path().into()),
color: Color::default(),
size: IconSize::default().rems(),
transformation: Transformation::default(),
@@ -164,6 +164,15 @@ impl Icon {
}
}
pub fn from_external_svg(svg: SharedString) -> Self {
Self {
source: IconSource::ExternalSvg(svg),
color: Color::default(),
size: IconSize::default().rems(),
transformation: Transformation::default(),
}
}
pub fn color(mut self, color: Color) -> Self {
self.color = color;
self
@@ -193,14 +202,21 @@ impl Transformable for Icon {
impl RenderOnce for Icon {
fn render(self, _: &mut Window, cx: &mut App) -> impl IntoElement {
match self.source {
IconSource::Svg(path) => svg()
IconSource::Embedded(path) => svg()
.with_transformation(self.transformation)
.size(self.size)
.flex_none()
.path(path)
.text_color(self.color.color(cx))
.into_any_element(),
IconSource::Image(path) => img(path)
IconSource::ExternalSvg(path) => svg()
.external_path(path)
.with_transformation(self.transformation)
.size(self.size)
.flex_none()
.text_color(self.color.color(cx))
.into_any_element(),
IconSource::External(path) => img(path)
.size(self.size)
.flex_none()
.text_color(self.color.color(cx))

View File

@@ -1171,12 +1171,14 @@ impl<T: ScrollableHandle> Element for ScrollbarElement<T> {
.apply_along(axis, |_| thumb_end - thumb_offset),
);
let needs_scroll_track = reserved_space.needs_scroll_track();
ScrollbarLayout {
thumb_bounds,
track_bounds: padded_bounds,
axis,
cursor_hitbox: window.insert_hitbox(
if reserved_space.needs_scroll_track() {
if needs_scroll_track {
padded_bounds
} else {
thumb_bounds
@@ -1184,6 +1186,7 @@ impl<T: ScrollableHandle> Element for ScrollbarElement<T> {
HitboxBehavior::BlockMouseExceptScroll,
),
track_background: track_color
.filter(|_| needs_scroll_track)
.map(|color| (padded_bounds.dilate(SCROLLBAR_PADDING), color)),
reserved_space,
}
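The scrollbar fix above gates `track_background` with `Option::filter`, so no track quad is produced when no scroll track was reserved. A standalone sketch of that `filter` pattern (toy values, not the real layout types):

```rust
fn main() {
    let track_color: Option<&str> = Some("gray");

    // With a reserved track, the color survives the filter.
    let needs_scroll_track = true;
    let background = track_color.filter(|_| needs_scroll_track);
    assert_eq!(background, Some("gray"));

    // Without one, `filter` collapses the Some into None,
    // so the later paint step has nothing to draw.
    let needs_scroll_track = false;
    let background = track_color.filter(|_| needs_scroll_track);
    assert_eq!(background, None);

    println!("ok");
}
```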
@@ -1292,10 +1295,15 @@ impl<T: ScrollableHandle> Element for ScrollbarElement<T> {
}
if let Some((track_bounds, color)) = track_background {
let mut color = *color;
if let Some(fade) = autohide_fade {
color.fade_out(fade);
}
window.paint_quad(quad(
*track_bounds,
Corners::default(),
*color,
color,
Edges::default(),
Hsla::transparent_black(),
BorderStyle::default(),

View File

@@ -31,7 +31,7 @@ pub trait NumberFieldType: Display + Copy + Clone + Sized + PartialOrd + FromStr
fn saturating_sub(self, rhs: Self) -> Self;
}
macro_rules! impl_newtype_numeric_stepper {
macro_rules! impl_newtype_numeric_stepper_float {
($type:ident, $default:expr, $large:expr, $small:expr, $min:expr, $max:expr) => {
impl NumberFieldType for $type {
fn default_step() -> Self {
@@ -65,13 +65,47 @@ macro_rules! impl_newtype_numeric_stepper {
};
}
macro_rules! impl_newtype_numeric_stepper_int {
($type:ident, $default:expr, $large:expr, $small:expr, $min:expr, $max:expr) => {
impl NumberFieldType for $type {
fn default_step() -> Self {
$default.into()
}
fn large_step() -> Self {
$large.into()
}
fn small_step() -> Self {
$small.into()
}
fn min_value() -> Self {
$min.into()
}
fn max_value() -> Self {
$max.into()
}
fn saturating_add(self, rhs: Self) -> Self {
$type(self.0.saturating_add(rhs.0).min(Self::max_value().0))
}
fn saturating_sub(self, rhs: Self) -> Self {
$type(self.0.saturating_sub(rhs.0).max(Self::min_value().0))
}
}
};
}
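The macro above is split into float and int variants because Rust's primitive floats have no `saturating_add`, while the integer flavor can clamp with it directly. A small sketch contrasting the two approaches (the wrapper types here are simplified stand-ins for the real settings types):

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
struct DelayMs(u64);

impl DelayMs {
    const MAX: DelayMs = DelayMs(2000);

    /// Integer flavor: lean on built-in saturating arithmetic, then clamp.
    fn saturating_add(self, rhs: Self) -> Self {
        DelayMs(self.0.saturating_add(rhs.0).min(Self::MAX.0))
    }
}

#[derive(Debug, Clone, Copy, PartialEq)]
struct CodeFade(f32);

impl CodeFade {
    const MAX: CodeFade = CodeFade(0.9);

    /// Float flavor: `f32` has no `saturating_add`, so add and clamp manually.
    fn saturating_add(self, rhs: Self) -> Self {
        CodeFade((self.0 + rhs.0).min(Self::MAX.0))
    }
}

fn main() {
    assert_eq!(DelayMs(1990).saturating_add(DelayMs(100)), DelayMs(2000));
    assert_eq!(CodeFade(0.85).saturating_add(CodeFade(0.2)), CodeFade(0.9));
    println!("ok");
}
```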
#[rustfmt::skip]
impl_newtype_numeric_stepper!(FontWeight, 50., 100., 10., FontWeight::THIN, FontWeight::BLACK);
impl_newtype_numeric_stepper!(CodeFade, 0.1, 0.2, 0.05, 0.0, 0.9);
impl_newtype_numeric_stepper!(InactiveOpacity, 0.1, 0.2, 0.05, 0.0, 1.0);
impl_newtype_numeric_stepper!(MinimumContrast, 1., 10., 0.5, 0.0, 106.0);
impl_newtype_numeric_stepper!(DelayMs, 100, 500, 10, 0, 2000);
impl_newtype_numeric_stepper!(
impl_newtype_numeric_stepper_float!(FontWeight, 50., 100., 10., FontWeight::THIN, FontWeight::BLACK);
impl_newtype_numeric_stepper_float!(CodeFade, 0.1, 0.2, 0.05, 0.0, 0.9);
impl_newtype_numeric_stepper_float!(InactiveOpacity, 0.1, 0.2, 0.05, 0.0, 1.0);
impl_newtype_numeric_stepper_float!(MinimumContrast, 1., 10., 0.5, 0.0, 106.0);
impl_newtype_numeric_stepper_int!(DelayMs, 100, 500, 10, 0, 2000);
impl_newtype_numeric_stepper_float!(
CenteredPaddingSettings,
0.05,
0.2,

View File

@@ -2,7 +2,7 @@
description = "The fast, collaborative code editor."
edition.workspace = true
name = "zed"
version = "0.211.0"
version = "0.211.5"
publish.workspace = true
license = "GPL-3.0-or-later"
authors = ["Zed Team <hi@zed.dev>"]
@@ -73,6 +73,7 @@ gpui = { workspace = true, features = [
"windows-manifest",
] }
gpui_tokio.workspace = true
rayon.workspace = true
edit_prediction_button.workspace = true
http_client.workspace = true

View File

@@ -1 +1 @@
dev
stable

View File

@@ -257,6 +257,13 @@ pub fn main() {
return;
}
rayon::ThreadPoolBuilder::new()
.num_threads(4)
.stack_size(10 * 1024 * 1024)
.thread_name(|ix| format!("RayonWorker{}", ix))
.build_global()
.unwrap();
log::info!(
"========== starting zed version {}, sha {} ==========",
app_version,

View File

@@ -41,6 +41,9 @@ const DEFAULT_FILTERS: &[(&str, log::LevelFilter)] = &[
("blade_graphics", log::LevelFilter::Warn),
#[cfg(any(target_os = "linux", target_os = "freebsd", target_os = "windows"))]
("naga::back::spv::writer", log::LevelFilter::Warn),
// usvg prints a lot of warnings on rendering an SVG with partial errors, which
// can happen a lot with the SVG preview
("usvg::parser::style", log::LevelFilter::Error),
];
pub fn init_env_filter(filter: env_config::EnvFilter) {

View File

@@ -2994,33 +2994,11 @@ List of `string` glob patterns
- Description: Whether to show relative line numbers in the gutter
- Setting: `relative_line_numbers`
- Default: `"disabled"`
- Default: `false`
**Options**
1. Show relative line numbers in the gutter whilst counting wrapped lines as one line:
```json [settings]
{
"relative_line_numbers": "enabled"
}
```
2. Show relative line numbers in the gutter, including wrapped lines in the counting:
```json [settings]
{
"relative_line_numbers": "wrapped"
}
```
2. Do not use relative line numbers:
```json [settings]
{
"relative_line_numbers": "disabled"
}
```
`boolean` values
## Remove Trailing Whitespace On Save

View File

@@ -606,7 +606,7 @@ Here are a few general Zed settings that can help you fine-tune your Vim experie
| Property | Description | Default Value |
| ----------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------- |
| cursor_blink | If `true`, the cursor blinks. | `true` |
| relative_line_numbers | If `"enabled"`, line numbers in the left gutter are relative to the cursor. If `"wrapped"`, they also display for wrapped lines. | `"disabled"` |
| relative_line_numbers | If `true`, line numbers in the left gutter are relative to the cursor. | `false` |
| scrollbar | Object that controls the scrollbar display. Set to `{ "show": "never" }` to hide the scroll bar. | `{ "show": "auto" }` |
| scroll_beyond_last_line | If set to `"one_page"`, allows scrolling up to one page beyond the last line. Set to `"off"` to prevent this behavior. | `"one_page"` |
| vertical_scroll_margin | The number of lines to keep above or below the cursor when scrolling. Set to `0` to allow the cursor to go up to the edges of the screen vertically. | `3` |
@@ -620,7 +620,7 @@ Here's an example of these settings changed:
// Disable cursor blink
"cursor_blink": false,
// Use relative line numbers
"relative_line_numbers": "enabled",
"relative_line_numbers": true,
// Hide the scroll bar
"scrollbar": { "show": "never" },
// Prevent the buffer from scrolling beyond the last line
@@ -628,7 +628,7 @@ Here's an example of these settings changed:
// Allow the cursor to reach the edges of the screen
"vertical_scroll_margin": 0,
"gutter": {
// Disable line numbers completely:
// Disable line numbers completely
"line_numbers": false
},
"command_aliases": {

View File

@@ -204,7 +204,7 @@ TBD: Centered layout related settings
"folds": true, // Show/hide show fold buttons in the gutter.
"min_line_number_digits": 4 // Reserve space for N digit line numbers
},
"relative_line_numbers": "enabled", // Show relative line numbers in gutter
"relative_line_numbers": false, // Show relative line numbers in gutter
// Indent guides
"indent_guides": {

View File

@@ -170,12 +170,7 @@ cp "assets/licenses.md" "${zed_dir}/licenses.md"
# Create archive out of everything that's in the temp directory
arch=$(uname -m)
target="linux-${arch}"
if [[ "$channel" == "dev" ]]; then
archive="zed-${commit}-${target}.tar.gz"
else
archive="zed-${target}.tar.gz"
fi
archive="zed-linux-${arch}.tar.gz"
rm -rf "${archive}"
remove_match="zed(-[a-zA-Z0-9]+)?-linux-$(uname -m)\.tar\.gz"

View File

@@ -6,8 +6,6 @@ source script/lib/blob-store.sh
build_flag="--release"
target_dir="release"
open_result=false
local_arch=false
local_only=false
local_install=false
can_code_sign=false
@@ -72,12 +70,12 @@ target_triple=${host_line#*: }
if [[ $# -gt 0 && -n "$1" ]]; then
target_triple="$1"
fi
remote_server_arch=""
arch_suffix=""
if [[ "$target_triple" = "x86_64-apple-darwin" ]]; then
remote_server_arch="x86_64"
arch_suffix="x86_64"
elif [[ "$target_triple" = "aarch64-apple-darwin" ]]; then
remote_server_arch="aarch64"
arch_suffix="aarch64"
else
echo "Unsupported architecture $target_triple"
exit 1
@@ -196,10 +194,6 @@ function sign_app_binaries() {
/usr/bin/codesign --force --timestamp --options runtime --entitlements crates/zed/resources/zed.entitlements --sign "$IDENTITY" "${app_path}" -v
else
echo "One or more of the following variables are missing: MACOS_CERTIFICATE, MACOS_CERTIFICATE_PASSWORD, APPLE_NOTARIZATION_KEY, APPLE_NOTARIZATION_KEY_ID, APPLE_NOTARIZATION_ISSUER_ID"
if [[ "$local_only" = false ]]; then
echo "To create a self-signed local build use ./scripts/build.sh -ldf"
exit 1
fi
echo "====== WARNING ======"
echo "This bundle is being signed without all entitlements, some features (e.g. universal links) will not work"
@@ -215,7 +209,7 @@ function sign_app_binaries() {
codesign --force --deep --entitlements "${app_path}/Contents/Resources/zed.entitlements" --sign ${MACOS_SIGNING_KEY:- -} "${app_path}" -v
fi
if [[ "$target_dir" = "debug" && "$local_only" = false ]]; then
if [[ "$target_dir" = "debug" ]]; then
if [ "$open_result" = true ]; then
open "$app_path"
else
@@ -227,25 +221,18 @@ function sign_app_binaries() {
bundle_name=$(basename "$app_path")
if [ "$local_only" = true ]; then
if [ "$local_install" = true ]; then
rm -rf "/Applications/$bundle_name"
mv "$app_path" "/Applications/$bundle_name"
echo "Installed application bundle: /Applications/$bundle_name"
if [ "$open_result" = true ]; then
echo "Opening /Applications/$bundle_name"
open "/Applications/$bundle_name"
fi
else
if [ "$open_result" = true ]; then
echo "Opening $app_path"
open "$app_path"
fi
if [ "$local_install" = true ]; then
rm -rf "/Applications/$bundle_name"
mv "$app_path" "/Applications/$bundle_name"
echo "Installed application bundle: /Applications/$bundle_name"
if [ "$open_result" = true ]; then
echo "Opening /Applications/$bundle_name"
open "/Applications/$bundle_name"
fi
else
dmg_target_directory="target/${target_triple}/${target_dir}"
dmg_source_directory="${dmg_target_directory}/dmg"
dmg_file_path="${dmg_target_directory}/Zed.dmg"
dmg_file_path="${dmg_target_directory}/Zed-${arch_suffix}.dmg"
xcode_bin_dir_path="$(xcode-select -p)/usr/bin"
rm -rf ${dmg_source_directory}
@@ -291,30 +278,36 @@ function sign_binary() {
/usr/bin/codesign --deep --force --timestamp --options runtime --entitlements crates/zed/resources/zed.entitlements --sign "$IDENTITY" "${binary_path}" -v
fi
}
cp target/${target_triple}/${target_dir}/zed "${app_path}/Contents/MacOS/zed"
cp target/${target_triple}/${target_dir}/cli "${app_path}/Contents/MacOS/cli"
sign_app_binaries
sign_binary "target/$target_triple/release/remote_server"
gzip -f --stdout --best target/$target_triple/release/remote_server > target/zed-remote-server-macos-$remote_server_arch.gz
function upload_debug_info() {
function upload_debug_symbols() {
if [[ -n "${SENTRY_AUTH_TOKEN:-}" ]]; then
echo "Uploading zed debug symbols to sentry..."
exe_path="target/${target_triple}/release/Zed"
if ! dsymutil --flat "target/${target_triple}/${target_dir}/zed" 2> target/dsymutil.log; then
echo "dsymutil failed"
cat target/dsymutil.log
exit 1
fi
if ! dsymutil --flat "target/${target_triple}/${target_dir}/remote_server" 2> target/dsymutil.log; then
echo "dsymutil failed"
cat target/dsymutil.log
exit 1
fi
# note: this uploads the unstripped binary which is needed because it contains
# .eh_frame data for stack unwinding. see https://github.com/getsentry/symbolic/issues/783
sentry-cli debug-files upload --include-sources --wait -p zed -o zed-dev \
"target/${target_triple}/${target_dir}/zed" \
"target/${target_triple}/${target_dir}/remote_server" \
"target/${target_triple}/${target_dir}/zed.dwarf"
"target/${target_triple}/${target_dir}/zed.dwarf" \
"target/${target_triple}/${target_dir}/remote_server.dwarf"
else
echo "missing SENTRY_AUTH_TOKEN. skipping sentry upload."
fi
}
if command -v sentry-cli >/dev/null 2>&1; then
upload_debug_info
else
echo "sentry-cli not found. skipping sentry upload."
echo "install with: 'curl -sL https://sentry.io/get-cli | bash'"
fi
upload_debug_symbols
cp target/${target_triple}/${target_dir}/zed "${app_path}/Contents/MacOS/zed"
cp target/${target_triple}/${target_dir}/cli "${app_path}/Contents/MacOS/cli"
sign_app_binaries
sign_binary "target/$target_triple/release/remote_server"
gzip -f --stdout --best target/$target_triple/release/remote_server > target/zed-remote-server-macos-$arch_suffix.gz

script/cherry-pick Executable file
View File

@@ -0,0 +1,33 @@
#!/bin/bash
set -euxo pipefail
if [ "$#" -ne 3 ]; then
echo "Usage: $0 <branch-name> <commit-sha> <channel>"
exit 1
fi
BRANCH_NAME="$1"
COMMIT_SHA="$2"
CHANNEL="$3"
SHORT_SHA="${COMMIT_SHA:0:8}"
NEW_BRANCH="cherry-pick-${BRANCH_NAME}-${SHORT_SHA}"
git fetch --depth 2 origin +${COMMIT_SHA} ${BRANCH_NAME}
git checkout --force "origin/$BRANCH_NAME" -B "$NEW_BRANCH"
git cherry-pick "$COMMIT_SHA"
git push origin -f "$NEW_BRANCH"
COMMIT_TITLE=$(git log -1 --pretty=format:"%s" "$COMMIT_SHA")
COMMIT_BODY=$(git log -1 --pretty=format:"%b" "$COMMIT_SHA")
# Check if commit title ends with (#number)
if [[ "$COMMIT_TITLE" =~ \(#([0-9]+)\)$ ]]; then
PR_NUMBER="${BASH_REMATCH[1]}"
PR_BODY="Cherry-pick of #${PR_NUMBER} to ${CHANNEL}"$'\n'$'\n'"----"$'\n'"${COMMIT_BODY}"
else
PR_BODY="Cherry-pick of ${COMMIT_SHA} to ${CHANNEL}"$'\n'$'\n'"----"$'\n'"${COMMIT_BODY}"
fi
# Create a pull request
gh pr create --base "$BRANCH_NAME" --head "$NEW_BRANCH" --title "$COMMIT_TITLE (cherry-pick to $CHANNEL)" --body "$PR_BODY"
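The script above checks whether the commit title ends in a `(#<number>)` suffix to decide which PR body to generate. The same parse can be sketched in Rust without a regex (the `trailing_pr_number` name is mine):

```rust
/// Returns the PR number when `title` ends with a "(#1234)"-style suffix,
/// matching the `\(#([0-9]+)\)$` check in the shell script above.
fn trailing_pr_number(title: &str) -> Option<u64> {
    let title = title.trim_end();
    let inner = title.strip_suffix(')')?;
    let (_, after_hash) = inner.rsplit_once("(#")?;
    // Reject suffixes like "(#12a)" where non-digits sit before the ")".
    if after_hash.is_empty() || !after_hash.bytes().all(|b| b.is_ascii_digit()) {
        return None;
    }
    after_hash.parse().ok()
}

fn main() {
    assert_eq!(trailing_pr_number("Fix SVG preview (#41270)"), Some(41270));
    assert_eq!(trailing_pr_number("Bump to 0.211.5"), None);
    assert_eq!(trailing_pr_number("Mentions (#12) early, not at end"), None);
    println!("ok");
}
```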

script/prettier Executable file
View File

@@ -0,0 +1,17 @@
#!/bin/bash
set -euxo pipefail
PRETTIER_VERSION=3.5.0
pnpm dlx "prettier@${PRETTIER_VERSION}" assets/settings/default.json --check || {
echo "To fix, run from the root of the Zed repo:"
echo " pnpm dlx prettier@${PRETTIER_VERSION} assets/settings/default.json --write"
false
}
cd docs
pnpm dlx "prettier@${PRETTIER_VERSION}" . --check || {
echo "To fix, run from the root of the Zed repo:"
echo " cd docs && pnpm dlx prettier@${PRETTIER_VERSION} . --write && cd .."
false
}

script/run-unit-evals Executable file
View File

@@ -0,0 +1,5 @@
#!/usr/bin/env bash
set -euxo pipefail
cargo nextest run --workspace --no-fail-fast --features unit-eval --no-capture -E 'test(::eval_)'

View File

@@ -1,75 +1,18 @@
#!/usr/bin/env bash
# Based on the template in: https://docs.digitalocean.com/reference/api/spaces-api/
set -euo pipefail
source script/lib/blob-store.sh
allowed_targets=("linux-targz" "macos" "freebsd")
is_allowed_target() {
for val in "${allowed_targets[@]}"; do
if [[ "$1" == "$val" ]]; then
return 0
fi
done
return 1
}
allowed_arch=("x86_64" "aarch64")
is_allowed_arch() {
for val in "${allowed_arch[@]}"; do
if [[ "$1" == "$val" ]]; then
return 0
fi
done
return 1
}
if is_allowed_target "$1"; then
target="$1"
else
echo "Error: Target '$1' is not allowed"
echo "Usage: $0 [${allowed_targets[*]}] {arch}"
exit 1
fi
if is_allowed_arch "$2"; then
arch="$2"
else
echo "Error: Arch '$2' is not allowed"
echo "Usage: $0 $1 [${allowed_arch[*]}]"
exit 1
fi
echo "Uploading nightly for target: $target $arch"
bucket_name="zed-nightly-host"
sha=$(git rev-parse HEAD)
echo ${sha} > target/latest-sha
find target -type f -name "zed-remote-server-*.gz" -print0 | while IFS= read -r -d '' file_to_upload; do
for file_to_upload in ./release-artifacts/*; do
[ -f "$file_to_upload" ] || continue
upload_to_blob_store $bucket_name "$file_to_upload" "nightly/$(basename "$file_to_upload")"
upload_to_blob_store $bucket_name "$file_to_upload" "${GITHUB_SHA}/$(basename "$file_to_upload")"
rm -f "$file_to_upload"
done
case "$target" in
macos)
upload_to_blob_store $bucket_name "target/$arch-apple-darwin/release/Zed.dmg" "nightly/Zed-$arch.dmg"
upload_to_blob_store $bucket_name "target/latest-sha" "nightly/latest-sha"
rm -f "target/$arch-apple-darwin/release/Zed.dmg" "target/release/Zed.dmg"
rm -f "target/latest-sha"
;;
linux-targz)
find . -type f -name "zed-*.tar.gz" -print0 | while IFS= read -r -d '' file_to_upload; do
upload_to_blob_store $bucket_name "$file_to_upload" "nightly/$(basename "$file_to_upload")"
rm -f "$file_to_upload"
done
upload_to_blob_store $bucket_name "target/latest-sha" "nightly/latest-sha-linux-targz"
rm -f "target/latest-sha"
;;
freebsd)
echo "No freebsd client build (yet)."
;;
*)
echo "Error: Unknown target '$target'"
exit 1
;;
esac
sha=$(git rev-parse HEAD)
echo -n ${sha} > ./release-artifacts/latest-sha
upload_to_blob_store $bucket_name "release-artifacts/latest-sha" "nightly/latest-sha"

View File

@@ -17,5 +17,6 @@ clap = { workspace = true, features = ["derive"] }
toml.workspace = true
indoc.workspace = true
indexmap.workspace = true
serde.workspace = true
toml_edit.workspace = true
gh-workflow.workspace = true

View File

@@ -3,11 +3,17 @@ use clap::Parser;
use std::fs;
use std::path::Path;
mod after_release;
mod cherry_pick;
mod compare_perf;
mod danger;
mod nix_build;
mod release_nightly;
mod run_bundling;
mod release;
mod run_agent_evals;
mod run_tests;
mod runners;
mod steps;
mod vars;
@@ -20,11 +26,15 @@ pub fn run_workflows(_: GenerateWorkflowArgs) -> Result<()> {
let workflows = vec![
("danger.yml", danger::danger()),
("nix_build.yml", nix_build::nix_build()),
("run_bundling.yml", run_bundling::run_bundling()),
("release_nightly.yml", release_nightly::release_nightly()),
// ("run_tests.yml", run_tests::run_tests()),
// ("release.yml", release::release()),
("run_tests.yml", run_tests::run_tests()),
("release.yml", release::release()),
("cherry_pick.yml", cherry_pick::cherry_pick()),
("compare_perf.yml", compare_perf::compare_perf()),
("run_unit_evals.yml", run_agent_evals::run_unit_evals()),
("run_agent_evals.yml", run_agent_evals::run_agent_evals()),
("after_release.yml", after_release::after_release()),
];
fs::create_dir_all(dir)
.with_context(|| format!("Failed to create directory: {}", dir.display()))?;

View File

@@ -0,0 +1,123 @@
use gh_workflow::*;
use crate::tasks::workflows::{
runners,
steps::{NamedJob, dependant_job, named},
vars::{self, StepOutput},
};
pub fn after_release() -> Workflow {
let refresh_zed_dev = rebuild_releases_page();
let post_to_discord = post_to_discord(&[&refresh_zed_dev]);
let publish_winget = publish_winget();
named::workflow()
.on(Event::default().release(Release::default().types(vec![ReleaseType::Published])))
.add_job(refresh_zed_dev.name, refresh_zed_dev.job)
.add_job(post_to_discord.name, post_to_discord.job)
.add_job(publish_winget.name, publish_winget.job)
}
fn rebuild_releases_page() -> NamedJob {
named::job(
Job::default()
.runs_on(runners::LINUX_SMALL)
.cond(Expression::new(
"github.repository_owner == 'zed-industries'",
))
.add_step(named::bash(
"curl https://zed.dev/api/revalidate-releases -H \"Authorization: Bearer ${RELEASE_NOTES_API_TOKEN}\"",
).add_env(("RELEASE_NOTES_API_TOKEN", vars::RELEASE_NOTES_API_TOKEN))),
)
}
fn post_to_discord(deps: &[&NamedJob]) -> NamedJob {
fn get_release_url() -> Step<Run> {
named::bash(indoc::indoc! {r#"
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
URL="https://zed.dev/releases/preview"
else
URL="https://zed.dev/releases/stable"
fi
echo "URL=$URL" >> "$GITHUB_OUTPUT"
"#})
.id("get-release-url")
}
fn get_content() -> Step<Use> {
named::uses(
"2428392",
"gh-truncate-string-action",
"b3ff790d21cf42af3ca7579146eedb93c8fb0757", // v1.4.1
)
.id("get-content")
.add_with((
"stringToTruncate",
indoc::indoc! {r#"
📣 Zed [${{ github.event.release.tag_name }}](<${{ steps.get-release-url.outputs.URL }}>) was just released!
${{ github.event.release.body }}
"#},
))
.add_with(("maxLength", 2000))
.add_with(("truncationSymbol", "..."))
}
fn discord_webhook_action() -> Step<Use> {
named::uses(
"tsickert",
"discord-webhook",
"c840d45a03a323fbc3f7507ac7769dbd91bfb164", // v5.3.0
)
.add_with(("webhook-url", vars::DISCORD_WEBHOOK_RELEASE_NOTES))
.add_with(("content", "${{ steps.get-content.outputs.string }}"))
}
let job = dependant_job(deps)
.runs_on(runners::LINUX_SMALL)
.cond(Expression::new(
"github.repository_owner == 'zed-industries'",
))
.add_step(get_release_url())
.add_step(get_content())
.add_step(discord_webhook_action());
named::job(job)
}
fn publish_winget() -> NamedJob {
fn set_package_name() -> (Step<Run>, StepOutput) {
let step = named::bash(indoc::indoc! {r#"
if [ "${{ github.event.release.prerelease }}" == "true" ]; then
PACKAGE_NAME=ZedIndustries.Zed.Preview
else
PACKAGE_NAME=ZedIndustries.Zed
fi
echo "PACKAGE_NAME=$PACKAGE_NAME" >> "$GITHUB_OUTPUT"
"#})
.id("set-package-name");
let output = StepOutput::new(&step, "PACKAGE_NAME");
(step, output)
}
fn winget_releaser(package_name: &StepOutput) -> Step<Use> {
named::uses(
"vedantmgoyal9",
"winget-releaser",
"19e706d4c9121098010096f9c495a70a7518b30f", // v2
)
.add_with(("identifier", package_name.to_string()))
.add_with(("max-versions-to-keep", 5))
.add_with(("token", vars::WINGET_TOKEN))
}
let (set_package_name, package_name) = set_package_name();
named::job(
Job::default()
.runs_on(runners::LINUX_SMALL)
.add_step(set_package_name)
.add_step(winget_releaser(&package_name)),
)
}
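The winget job above threads `package_name` (a `StepOutput`) into the `identifier` input of a later step via `package_name.to_string()`. A minimal sketch of how that likely works, assuming `StepOutput`'s `Display` impl renders the GitHub Actions step-output expression (this is an illustrative stand-in, not the actual `gh_workflow` source):

```rust
use std::fmt;

// Hypothetical stand-in for the crate's `StepOutput` type: turning it into
// a string yields `${{ steps.<step-id>.outputs.<name> }}`, so a later step
// can reference an output produced by an earlier step's `$GITHUB_OUTPUT`.
struct StepOutput {
    step_id: String,
    name: String,
}

impl fmt::Display for StepOutput {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "${{{{ steps.{}.outputs.{} }}}}", self.step_id, self.name)
    }
}

fn main() {
    let package_name = StepOutput {
        step_id: "set-package-name".into(),
        name: "PACKAGE_NAME".into(),
    };
    // The rendered expression is what the `identifier` input receives.
    assert_eq!(
        package_name.to_string(),
        "${{ steps.set-package-name.outputs.PACKAGE_NAME }}"
    );
}
```

This matches the `.id("set-package-name")` / `StepOutput::new(&step, "PACKAGE_NAME")` pairing above: the step id and output key together name the value in the workflow's expression syntax.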

@@ -0,0 +1,59 @@
use gh_workflow::*;
use crate::tasks::workflows::{
runners,
steps::{self, NamedJob, named},
vars::{self, Input, StepOutput},
};
pub fn cherry_pick() -> Workflow {
let branch = Input::string("branch", None);
let commit = Input::string("commit", None);
let channel = Input::string("channel", None);
let cherry_pick = run_cherry_pick(&branch, &commit, &channel);
named::workflow()
.on(Event::default().workflow_dispatch(
WorkflowDispatch::default()
.add_input(commit.name, commit.input())
.add_input(branch.name, branch.input())
.add_input(channel.name, channel.input()),
))
.add_job(cherry_pick.name, cherry_pick.job)
}
fn run_cherry_pick(branch: &Input, commit: &Input, channel: &Input) -> NamedJob {
fn authenticate_as_zippy() -> (Step<Use>, StepOutput) {
let step = named::uses(
"actions",
"create-github-app-token",
"bef1eaf1c0ac2b148ee2a0a74c65fbe6db0631f1",
) // v2
.add_with(("app-id", vars::ZED_ZIPPY_APP_ID))
.add_with(("private-key", vars::ZED_ZIPPY_APP_PRIVATE_KEY))
.id("get-app-token");
let output = StepOutput::new(&step, "token");
(step, output)
}
fn cherry_pick(
branch: &Input,
commit: &Input,
channel: &Input,
token: &StepOutput,
) -> Step<Run> {
named::bash(&format!("./script/cherry-pick {branch} {commit} {channel}"))
.add_env(("GIT_COMMITTER_NAME", "Zed Zippy"))
.add_env(("GIT_COMMITTER_EMAIL", "hi@zed.dev"))
.add_env(("GITHUB_TOKEN", token))
}
let (authenticate, token) = authenticate_as_zippy();
named::job(
Job::default()
.runs_on(runners::LINUX_SMALL)
.add_step(steps::checkout_repo())
.add_step(authenticate)
.add_step(cherry_pick(branch, commit, channel, &token)),
)
}
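The `cherry_pick` step interpolates `Input` values straight into a shell string with `format!("./script/cherry-pick {branch} {commit} {channel}")`. That only works if `Input`'s `Display` impl renders the workflow-dispatch input expression. A minimal sketch under that assumption (a hypothetical stand-in, not the actual `gh_workflow` source):

```rust
use std::fmt;

// Hypothetical stand-in for the crate's `Input` type: interpolating it
// into a string renders `${{ inputs.<name> }}`, the GitHub Actions
// expression for a workflow_dispatch input.
struct Input {
    name: &'static str,
}

impl fmt::Display for Input {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "${{{{ inputs.{} }}}}", self.name)
    }
}

fn main() {
    let branch = Input { name: "branch" };
    let commit = Input { name: "commit" };
    let channel = Input { name: "channel" };
    // Mirrors the cherry-pick step's command construction.
    let cmd = format!("./script/cherry-pick {branch} {commit} {channel}");
    assert_eq!(
        cmd,
        "./script/cherry-pick ${{ inputs.branch }} ${{ inputs.commit }} ${{ inputs.channel }}"
    );
}
```

The expressions are expanded by GitHub Actions at run time, so the generated YAML stays parameterized rather than baking in concrete values.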

@@ -0,0 +1,63 @@
use gh_workflow::*;
use crate::tasks::workflows::run_bundling::upload_artifact;
use crate::tasks::workflows::steps::FluentBuilder;
use crate::tasks::workflows::{
runners,
steps::{self, NamedJob, named},
vars::Input,
};
pub fn compare_perf() -> Workflow {
let head = Input::string("head", None);
let base = Input::string("base", None);
let crate_name = Input::string("crate_name", Some("".to_owned()));
let run_perf = run_perf(&base, &head, &crate_name);
named::workflow()
.on(Event::default().workflow_dispatch(
WorkflowDispatch::default()
.add_input(head.name, head.input())
.add_input(base.name, base.input())
.add_input(crate_name.name, crate_name.input()),
))
.add_job(run_perf.name, run_perf.job)
}
pub fn run_perf(base: &Input, head: &Input, crate_name: &Input) -> NamedJob {
fn cargo_perf_test(ref_name: &Input, crate_name: &Input) -> Step<Run> {
named::bash(&format!(
"
if [ -n \"{crate_name}\" ]; then
cargo perf-test -p {crate_name} -- --json={ref_name};
else
cargo perf-test -p vim -- --json={ref_name};
fi"
))
}
fn install_hyperfine() -> Step<Run> {
named::bash("cargo install hyperfine")
}
fn compare_runs(head: &Input, base: &Input) -> Step<Run> {
named::bash(&format!(
"cargo perf-compare --save=results.md {base} {head}"
))
}
named::job(
Job::default()
.runs_on(runners::LINUX_DEFAULT)
.add_step(steps::checkout_repo())
.add_step(steps::setup_cargo_config(runners::Platform::Linux))
.map(steps::install_linux_dependencies)
.add_step(install_hyperfine())
.add_step(steps::git_checkout(base))
.add_step(cargo_perf_test(base, crate_name))
.add_step(steps::git_checkout(head))
.add_step(cargo_perf_test(head, crate_name))
.add_step(compare_runs(head, base))
.add_step(upload_artifact("results.md"))
.add_step(steps::cleanup_cargo_config(runners::Platform::Linux)),
)
}
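The bash step in `cargo_perf_test` falls back to the `vim` crate when the `crate_name` input is empty. The same branching, expressed as a plain Rust helper for clarity (`perf_test_command` is a hypothetical illustration, not part of the workflow code):

```rust
// Sketch of the fallback encoded in the bash step above: an empty
// `crate_name` input means the perf run defaults to the `vim` crate.
fn perf_test_command(crate_name: &str, ref_name: &str) -> String {
    let package = if crate_name.is_empty() { "vim" } else { crate_name };
    format!("cargo perf-test -p {package} -- --json={ref_name}")
}

fn main() {
    // Empty input: default package.
    assert_eq!(
        perf_test_command("", "main"),
        "cargo perf-test -p vim -- --json=main"
    );
    // Explicit input: run perf tests for that crate.
    assert_eq!(
        perf_test_command("editor", "head"),
        "cargo perf-test -p editor -- --json=head"
    );
}
```

Defaulting `crate_name` to `Some("".to_owned())` in the workflow input, then branching on emptiness in the script, keeps the dispatch form usable without any required fields.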

@@ -1,11 +1,13 @@
use gh_workflow::*;
use crate::tasks::workflows::steps::named;
use crate::tasks::workflows::steps::{NamedJob, named};
use super::{runners, steps};
/// Generates the danger.yml workflow
pub fn danger() -> Workflow {
let danger = danger_job();
named::workflow()
.on(
Event::default().pull_request(PullRequest::default().add_branch("main").types([
@@ -15,39 +17,43 @@ pub fn danger() -> Workflow {
PullRequestType::Edited,
])),
)
.add_job(
"danger",
Job::default()
.cond(Expression::new(
"github.repository_owner == 'zed-industries'",
))
.runs_on(runners::LINUX_CHEAP)
.add_step(steps::checkout_repo())
.add_step(steps::setup_pnpm())
.add_step(
steps::setup_node()
.add_with(("cache", "pnpm"))
.add_with(("cache-dependency-path", "script/danger/pnpm-lock.yaml")),
)
.add_step(install_deps())
.add_step(run()),
)
.add_job(danger.name, danger.job)
}
pub fn install_deps() -> Step<Run> {
named::bash("pnpm install --dir script/danger")
}
fn danger_job() -> NamedJob {
pub fn install_deps() -> Step<Run> {
named::bash("pnpm install --dir script/danger")
}
pub fn run() -> Step<Run> {
named::bash("pnpm run --dir script/danger danger ci")
// This GitHub token is not used, but the value needs to be here to prevent
// Danger from throwing an error.
.add_env(("GITHUB_TOKEN", "not_a_real_token"))
// All requests are instead proxied through an instance of
// https://github.com/maxdeviant/danger-proxy that allows Danger to securely
// authenticate with GitHub while still being able to run on PRs from forks.
.add_env((
"DANGER_GITHUB_API_BASE_URL",
"https://danger-proxy.fly.dev/github",
))
pub fn run() -> Step<Run> {
named::bash("pnpm run --dir script/danger danger ci")
// This GitHub token is not used, but the value needs to be here to prevent
// Danger from throwing an error.
.add_env(("GITHUB_TOKEN", "not_a_real_token"))
// All requests are instead proxied through an instance of
// https://github.com/maxdeviant/danger-proxy that allows Danger to securely
// authenticate with GitHub while still being able to run on PRs from forks.
.add_env((
"DANGER_GITHUB_API_BASE_URL",
"https://danger-proxy.fly.dev/github",
))
}
NamedJob {
name: "danger".to_string(),
job: Job::default()
.cond(Expression::new(
"github.repository_owner == 'zed-industries'",
))
.runs_on(runners::LINUX_SMALL)
.add_step(steps::checkout_repo())
.add_step(steps::setup_pnpm())
.add_step(
steps::setup_node()
.add_with(("cache", "pnpm"))
.add_with(("cache-dependency-path", "script/danger/pnpm-lock.yaml")),
)
.add_step(install_deps())
.add_step(run()),
}
}

@@ -7,52 +7,6 @@ use super::{runners, steps, steps::named, vars};
use gh_workflow::*;
use indoc::indoc;
/// Generates the nix.yml workflow
pub fn nix_build() -> Workflow {
// todo(ci) instead of having these as optional YAML inputs,
// should we just generate two copies of the job (one for release-nightly
// and one for CI?)
let (input_flake_output, flake_output) = vars::input(
"flake-output",
WorkflowCallInput {
input_type: "string".into(),
default: Some("default".into()),
..Default::default()
},
);
let (input_cachix_filter, cachix_filter) = vars::input(
"cachix-filter",
WorkflowCallInput {
input_type: "string".into(),
..Default::default()
},
);
let linux_x86 = build_nix(
Platform::Linux,
Arch::X86_64,
&input_flake_output,
Some(&input_cachix_filter),
&[],
);
let mac_arm = build_nix(
Platform::Mac,
Arch::ARM64,
&input_flake_output,
Some(&input_cachix_filter),
&[],
);
named::workflow()
.on(Event::default().workflow_call(
WorkflowCall::default()
.add_input(flake_output.0, flake_output.1)
.add_input(cachix_filter.0, cachix_filter.1),
))
.add_job(linux_x86.name, linux_x86.job)
.add_job(mac_arm.name, mac_arm.job)
}
pub(crate) fn build_nix(
platform: Platform,
arch: Arch,
@@ -60,6 +14,55 @@ pub(crate) fn build_nix(
cachix_filter: Option<&str>,
deps: &[&NamedJob],
) -> NamedJob {
// on our macs we manually install nix. for some reason the cachix action is running
// under a non-login /bin/bash shell which doesn't source the proper script to add the
// nix profile to PATH, so we manually add them here
pub fn set_path() -> Step<Run> {
named::bash(indoc! {r#"
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
"#})
}
pub fn install_nix() -> Step<Use> {
named::uses(
"cachix",
"install-nix-action",
"02a151ada4993995686f9ed4f1be7cfbb229e56f", // v31
)
.add_with(("github_access_token", vars::GITHUB_TOKEN))
}
pub fn cachix_action(cachix_filter: Option<&str>) -> Step<Use> {
let mut step = named::uses(
"cachix",
"cachix-action",
"0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad", // v16
)
.add_with(("name", "zed"))
.add_with(("authToken", vars::CACHIX_AUTH_TOKEN))
.add_with(("cachixArgs", "-v"));
if let Some(cachix_filter) = cachix_filter {
step = step.add_with(("pushFilter", cachix_filter));
}
step
}
pub fn build(flake_output: &str) -> Step<Run> {
named::bash(&format!(
"nix build .#{} -L --accept-flake-config",
flake_output
))
}
pub fn limit_store() -> Step<Run> {
named::bash(indoc! {r#"
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi"#
})
}
let runner = match platform {
Platform::Windows => unimplemented!(),
Platform::Linux => runners::LINUX_X86_BUNDLER,
@@ -101,52 +104,3 @@ pub(crate) fn build_nix(
job,
}
}
// on our macs we manually install nix. for some reason the cachix action is running
// under a non-login /bin/bash shell which doesn't source the proper script to add the
// nix profile to PATH, so we manually add them here
pub fn set_path() -> Step<Run> {
named::bash(indoc! {r#"
echo "/nix/var/nix/profiles/default/bin" >> "$GITHUB_PATH"
echo "/Users/administrator/.nix-profile/bin" >> "$GITHUB_PATH"
"#})
}
pub fn install_nix() -> Step<Use> {
named::uses(
"cachix",
"install-nix-action",
"02a151ada4993995686f9ed4f1be7cfbb229e56f", // v31
)
.add_with(("github_access_token", vars::GITHUB_TOKEN))
}
pub fn cachix_action(cachix_filter: Option<&str>) -> Step<Use> {
let mut step = named::uses(
"cachix",
"cachix-action",
"0fc020193b5a1fa3ac4575aa3a7d3aa6a35435ad", // v16
)
.add_with(("name", "zed"))
.add_with(("authToken", vars::CACHIX_AUTH_TOKEN))
.add_with(("cachixArgs", "-v"));
if let Some(cachix_filter) = cachix_filter {
step = step.add_with(("pushFilter", cachix_filter));
}
step
}
pub fn build(flake_output: &str) -> Step<Run> {
named::bash(&format!(
"nix build .#{} -L --accept-flake-config",
flake_output
))
}
pub fn limit_store() -> Step<Run> {
named::bash(indoc! {r#"
if [ "$(du -sm /nix/store | cut -f1)" -gt 50000 ]; then
nix-collect-garbage -d || true
fi"#
})
}

@@ -0,0 +1,184 @@
use gh_workflow::{Event, Expression, Push, Run, Step, Use, Workflow};
use crate::tasks::workflows::{
run_bundling::{bundle_linux, bundle_mac, bundle_windows},
run_tests,
runners::{self, Arch},
steps::{self, FluentBuilder, NamedJob, dependant_job, named, release_job},
vars::{self, assets},
};
pub(crate) fn release() -> Workflow {
let macos_tests = run_tests::run_platform_tests(runners::Platform::Mac);
let linux_tests = run_tests::run_platform_tests(runners::Platform::Linux);
let windows_tests = run_tests::run_platform_tests(runners::Platform::Windows);
let check_scripts = run_tests::check_scripts();
let create_draft_release = create_draft_release();
let bundle = ReleaseBundleJobs {
linux_aarch64: bundle_linux(Arch::AARCH64, None, &[&linux_tests, &check_scripts]),
linux_x86_64: bundle_linux(Arch::X86_64, None, &[&linux_tests, &check_scripts]),
mac_aarch64: bundle_mac(Arch::AARCH64, None, &[&macos_tests, &check_scripts]),
mac_x86_64: bundle_mac(Arch::X86_64, None, &[&macos_tests, &check_scripts]),
windows_aarch64: bundle_windows(Arch::AARCH64, None, &[&windows_tests, &check_scripts]),
windows_x86_64: bundle_windows(Arch::X86_64, None, &[&windows_tests, &check_scripts]),
};
let upload_release_assets = upload_release_assets(&[&create_draft_release], &bundle);
let auto_release_preview = auto_release_preview(&[&upload_release_assets]);
named::workflow()
.on(Event::default().push(Push::default().tags(vec!["v*".to_string()])))
.concurrency(vars::one_workflow_per_non_main_branch())
.add_env(("CARGO_TERM_COLOR", "always"))
.add_env(("RUST_BACKTRACE", "1"))
.add_job(macos_tests.name, macos_tests.job)
.add_job(linux_tests.name, linux_tests.job)
.add_job(windows_tests.name, windows_tests.job)
.add_job(check_scripts.name, check_scripts.job)
.add_job(create_draft_release.name, create_draft_release.job)
.map(|mut workflow| {
for job in bundle.into_jobs() {
workflow = workflow.add_job(job.name, job.job);
}
workflow
})
.add_job(upload_release_assets.name, upload_release_assets.job)
.add_job(auto_release_preview.name, auto_release_preview.job)
}
pub(crate) struct ReleaseBundleJobs {
pub linux_aarch64: NamedJob,
pub linux_x86_64: NamedJob,
pub mac_aarch64: NamedJob,
pub mac_x86_64: NamedJob,
pub windows_aarch64: NamedJob,
pub windows_x86_64: NamedJob,
}
impl ReleaseBundleJobs {
pub fn jobs(&self) -> Vec<&NamedJob> {
vec![
&self.linux_aarch64,
&self.linux_x86_64,
&self.mac_aarch64,
&self.mac_x86_64,
&self.windows_aarch64,
&self.windows_x86_64,
]
}
pub fn into_jobs(self) -> Vec<NamedJob> {
vec![
self.linux_aarch64,
self.linux_x86_64,
self.mac_aarch64,
self.mac_x86_64,
self.windows_aarch64,
self.windows_x86_64,
]
}
}
pub(crate) fn create_sentry_release() -> Step<Use> {
named::uses(
"getsentry",
"action-release",
"526942b68292201ac6bbb99b9a0747d4abee354c", // v3
)
.add_env(("SENTRY_ORG", "zed-dev"))
.add_env(("SENTRY_PROJECT", "zed"))
.add_env(("SENTRY_AUTH_TOKEN", vars::SENTRY_AUTH_TOKEN))
.add_with(("environment", "production"))
}
fn auto_release_preview(deps: &[&NamedJob; 1]) -> NamedJob {
named::job(
dependant_job(deps)
.runs_on(runners::LINUX_SMALL)
.cond(Expression::new(indoc::indoc!(
r#"
false
&& startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
"# // todo(ci-release) enable
)))
.add_step(
steps::script(
r#"gh release edit "$GITHUB_REF_NAME" --repo=zed-industries/zed --draft=false"#,
)
.add_env(("GITHUB_TOKEN", vars::GITHUB_TOKEN)),
)
.add_step(create_sentry_release()),
)
}
pub(crate) fn download_workflow_artifacts() -> Step<Use> {
named::uses(
"actions",
"download-artifact",
"018cc2cf5baa6db3ef3c5f8a56943fffe632ef53", // v6.0.0
)
.add_with(("path", "./artifacts/"))
}
pub(crate) fn prep_release_artifacts() -> Step<Run> {
let mut script_lines = vec!["mkdir -p release-artifacts/\n".to_string()];
for asset in assets::all() {
let mv_command = format!("mv ./artifacts/{asset}/{asset} release-artifacts/{asset}");
script_lines.push(mv_command)
}
named::bash(&script_lines.join("\n"))
}
fn upload_release_assets(deps: &[&NamedJob], bundle: &ReleaseBundleJobs) -> NamedJob {
let mut deps = deps.to_vec();
deps.extend(bundle.jobs());
named::job(
dependant_job(&deps)
.runs_on(runners::LINUX_MEDIUM)
.add_step(download_workflow_artifacts())
.add_step(steps::script("ls -lR ./artifacts"))
.add_step(prep_release_artifacts())
.add_step(
steps::script("gh release upload \"$GITHUB_REF_NAME\" --repo=zed-industries/zed release-artifacts/*")
.add_env(("GITHUB_TOKEN", vars::GITHUB_TOKEN)),
),
)
}
fn create_draft_release() -> NamedJob {
fn generate_release_notes() -> Step<Run> {
named::bash(
r#"node --redirect-warnings=/dev/null ./script/draft-release-notes "$RELEASE_VERSION" "$RELEASE_CHANNEL" > target/release-notes.md"#,
)
}
fn create_release() -> Step<Run> {
named::bash("script/create-draft-release target/release-notes.md")
.add_env(("GITHUB_TOKEN", vars::GITHUB_TOKEN))
}
named::job(
release_job(&[])
.runs_on(runners::LINUX_SMALL)
// We need to fetch more than one commit so that `script/draft-release-notes`
// is able to diff between the current and previous tag.
//
// 25 was chosen arbitrarily.
.add_step(
steps::checkout_repo()
.add_with(("fetch-depth", 25))
.add_with(("clean", false))
.add_with(("ref", "${{ github.ref }}")),
)
.add_step(steps::script("script/determine-release-channel")) // export RELEASE_CHANNEL and RELEASE_VERSION
.add_step(steps::script("mkdir -p target/"))
.add_step(generate_release_notes())
.add_step(create_release()),
)
}
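`prep_release_artifacts` builds its shell script programmatically: a `mkdir` preamble plus one `mv` line per asset, joined with newlines. A self-contained sketch of that assembly (the asset name below is hypothetical; the real list comes from `assets::all()`):

```rust
// Mirrors how `prep_release_artifacts` assembles its shell script. Each
// downloaded artifact lands in `./artifacts/<name>/<name>`, so every asset
// gets one `mv` into the flat `release-artifacts/` directory.
fn prep_release_script(assets: &[&str]) -> String {
    let mut lines = vec!["mkdir -p release-artifacts/\n".to_string()];
    for asset in assets {
        lines.push(format!(
            "mv ./artifacts/{asset}/{asset} release-artifacts/{asset}"
        ));
    }
    lines.join("\n")
}

fn main() {
    let script = prep_release_script(&["zed-linux-x86_64.tar.gz"]);
    // The embedded `\n` in the preamble plus the join separator leaves a
    // blank line between the mkdir and the first mv.
    assert_eq!(
        script,
        "mkdir -p release-artifacts/\n\nmv ./artifacts/zed-linux-x86_64.tar.gz/zed-linux-x86_64.tar.gz release-artifacts/zed-linux-x86_64.tar.gz"
    );
}
```

The nested `<name>/<name>` path reflects `download-artifact` with a `path` of `./artifacts/`: each artifact is extracted into a directory named after itself.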

@@ -1,45 +1,33 @@
use crate::tasks::workflows::{
nix_build::build_nix,
run_bundling::bundle_mac,
runners::{Arch, Platform},
steps::NamedJob,
vars::{mac_bundle_envs, windows_bundle_envs},
release::{
ReleaseBundleJobs, create_sentry_release, download_workflow_artifacts,
prep_release_artifacts,
},
run_bundling::{bundle_linux, bundle_mac, bundle_windows},
run_tests::run_platform_tests,
runners::{Arch, Platform, ReleaseChannel},
steps::{FluentBuilder, NamedJob},
};
use super::{runners, steps, steps::named, vars};
use gh_workflow::*;
use indexmap::IndexMap;
/// Generates the release_nightly.yml workflow
pub fn release_nightly() -> Workflow {
let env: IndexMap<_, _> = [
("CARGO_TERM_COLOR", "always"),
("CARGO_INCREMENTAL", "0"),
("RUST_BACKTRACE", "1"),
("ZED_CLIENT_CHECKSUM_SEED", vars::ZED_CLIENT_CHECKSUM_SEED),
("ZED_MINIDUMP_ENDPOINT", vars::ZED_SENTRY_MINIDUMP_ENDPOINT),
(
"DIGITALOCEAN_SPACES_ACCESS_KEY",
vars::DIGITALOCEAN_SPACES_ACCESS_KEY,
),
(
"DIGITALOCEAN_SPACES_SECRET_KEY",
vars::DIGITALOCEAN_SPACES_SECRET_KEY,
),
]
.into_iter()
.map(|(key, value)| (key.into(), value.into()))
.collect();
let style = check_style();
let tests = run_tests(Platform::Mac);
let windows_tests = run_tests(Platform::Windows);
let bundle_mac_x86 = bundle_mac_nightly(Arch::X86_64, &[&style, &tests]);
let bundle_mac_arm = bundle_mac_nightly(Arch::ARM64, &[&style, &tests]);
let linux_x86 = bundle_linux_nightly(Arch::X86_64, &[&style, &tests]);
let linux_arm = bundle_linux_nightly(Arch::ARM64, &[&style, &tests]);
let windows_x86 = bundle_windows_nightly(Arch::X86_64, &[&style, &windows_tests]);
let windows_arm = bundle_windows_nightly(Arch::ARM64, &[&style, &windows_tests]);
    // Run only on Windows, as that's our fastest platform right now.
let tests = run_platform_tests(Platform::Windows);
let nightly = Some(ReleaseChannel::Nightly);
let bundle = ReleaseBundleJobs {
linux_aarch64: bundle_linux(Arch::AARCH64, nightly, &[&style, &tests]),
linux_x86_64: bundle_linux(Arch::X86_64, nightly, &[&style, &tests]),
mac_aarch64: bundle_mac(Arch::AARCH64, nightly, &[&style, &tests]),
mac_x86_64: bundle_mac(Arch::X86_64, nightly, &[&style, &tests]),
windows_aarch64: bundle_windows(Arch::AARCH64, nightly, &[&style, &tests]),
windows_x86_64: bundle_windows(Arch::X86_64, nightly, &[&style, &tests]),
};
let nix_linux_x86 = build_nix(
Platform::Linux,
@@ -50,35 +38,28 @@ pub fn release_nightly() -> Workflow {
);
let nix_mac_arm = build_nix(
Platform::Mac,
Arch::ARM64,
Arch::AARCH64,
"default",
None,
&[&style, &tests],
);
let update_nightly_tag = update_nightly_tag_job(&[
&bundle_mac_x86,
&bundle_mac_arm,
&linux_x86,
&linux_arm,
&windows_x86,
&windows_arm,
]);
let update_nightly_tag = update_nightly_tag_job(&bundle);
named::workflow()
.on(Event::default()
// Fire every day at 7:00am UTC (Roughly before EU workday and after US workday)
.schedule([Schedule::new("0 7 * * *")])
.push(Push::default().add_tag("nightly")))
.envs(env)
.add_env(("CARGO_TERM_COLOR", "always"))
.add_env(("RUST_BACKTRACE", "1"))
.add_job(style.name, style.job)
.add_job(tests.name, tests.job)
.add_job(windows_tests.name, windows_tests.job)
.add_job(bundle_mac_x86.name, bundle_mac_x86.job)
.add_job(bundle_mac_arm.name, bundle_mac_arm.job)
.add_job(linux_x86.name, linux_x86.job)
.add_job(linux_arm.name, linux_arm.job)
.add_job(windows_x86.name, windows_x86.job)
.add_job(windows_arm.name, windows_arm.job)
.map(|mut workflow| {
for job in bundle.into_jobs() {
workflow = workflow.add_job(job.name, job.job);
}
workflow
})
.add_job(nix_linux_x86.name, nix_linux_x86.job)
.add_job(nix_mac_arm.name, nix_mac_arm.job)
.add_job(update_nightly_tag.name, update_nightly_tag.job)
@@ -111,166 +92,40 @@ fn release_job(deps: &[&NamedJob]) -> Job {
}
}
fn run_tests(platform: Platform) -> NamedJob {
let runner = match platform {
Platform::Windows => runners::WINDOWS_DEFAULT,
Platform::Linux => runners::LINUX_DEFAULT,
Platform::Mac => runners::MAC_DEFAULT,
};
NamedJob {
name: format!("run_tests_{platform}"),
job: release_job(&[])
.runs_on(runner)
.add_step(steps::checkout_repo())
.add_step(steps::setup_cargo_config(platform))
.add_step(steps::setup_node())
.add_step(steps::cargo_install_nextest(platform))
.add_step(steps::clear_target_dir_if_large(platform))
.add_step(steps::cargo_nextest(platform))
.add_step(steps::cleanup_cargo_config(platform)),
fn update_nightly_tag_job(bundle: &ReleaseBundleJobs) -> NamedJob {
fn update_nightly_tag() -> Step<Run> {
named::bash(indoc::indoc! {r#"
if [ "$(git rev-parse nightly)" = "$(git rev-parse HEAD)" ]; then
echo "Nightly tag already points to current commit. Skipping tagging."
exit 0
fi
git config user.name github-actions
git config user.email github-actions@github.com
git tag -f nightly
git push origin nightly --force
"#})
}
}
fn bundle_mac_nightly(arch: Arch, deps: &[&NamedJob]) -> NamedJob {
let platform = Platform::Mac;
NamedJob {
name: format!("bundle_mac_nightly_{arch}"),
job: release_job(deps)
.runs_on(runners::MAC_DEFAULT)
.envs(mac_bundle_envs())
.add_step(steps::checkout_repo())
.add_step(steps::setup_node())
.add_step(steps::setup_sentry())
.add_step(steps::clear_target_dir_if_large(platform))
.add_step(set_release_channel_to_nightly(platform))
.add_step(bundle_mac(arch))
.add_step(upload_zed_nightly(platform, arch)),
}
}
fn bundle_linux_nightly(arch: Arch, deps: &[&NamedJob]) -> NamedJob {
let platform = Platform::Linux;
let mut job = release_job(deps)
.runs_on(arch.linux_bundler())
.add_step(steps::checkout_repo())
.add_step(steps::setup_sentry())
.add_step(add_rust_to_path())
.add_step(steps::script("./script/linux"));
// todo(ci) can we do this on arm too?
if arch == Arch::X86_64 {
job = job.add_step(steps::script("./script/install-mold"));
}
job = job
.add_step(steps::clear_target_dir_if_large(platform))
.add_step(set_release_channel_to_nightly(platform))
.add_step(steps::script("./script/bundle-linux"))
.add_step(upload_zed_nightly(platform, arch));
NamedJob {
name: format!("bundle_linux_nightly_{arch}"),
job,
}
}
fn bundle_windows_nightly(arch: Arch, deps: &[&NamedJob]) -> NamedJob {
let platform = Platform::Windows;
NamedJob {
name: format!("bundle_windows_nightly_{arch}"),
job: release_job(deps)
.runs_on(runners::WINDOWS_DEFAULT)
.envs(windows_bundle_envs())
.add_step(steps::checkout_repo())
.add_step(steps::setup_sentry())
.add_step(set_release_channel_to_nightly(platform))
.add_step(build_zed_installer(arch))
.add_step(upload_zed_nightly_windows(arch)),
}
}
fn update_nightly_tag_job(deps: &[&NamedJob]) -> NamedJob {
NamedJob {
name: "update_nightly_tag".to_owned(),
job: release_job(deps)
.runs_on(runners::LINUX_CHEAP)
job: steps::release_job(&bundle.jobs())
.runs_on(runners::LINUX_MEDIUM)
.add_step(steps::checkout_repo().add_with(("fetch-depth", 0)))
.add_step(download_workflow_artifacts())
.add_step(steps::script("ls -lR ./artifacts"))
.add_step(prep_release_artifacts())
.add_step(
steps::script("./script/upload-nightly")
.add_env((
"DIGITALOCEAN_SPACES_ACCESS_KEY",
vars::DIGITALOCEAN_SPACES_ACCESS_KEY,
))
.add_env((
"DIGITALOCEAN_SPACES_SECRET_KEY",
vars::DIGITALOCEAN_SPACES_SECRET_KEY,
)),
)
.add_step(update_nightly_tag())
.add_step(create_sentry_release()),
}
}
fn set_release_channel_to_nightly(platform: Platform) -> Step<Run> {
match platform {
Platform::Linux | Platform::Mac => named::bash(indoc::indoc! {r#"
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
"#}),
Platform::Windows => named::pwsh(indoc::indoc! {r#"
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
"#})
.working_directory("${{ env.ZED_WORKSPACE }}"),
}
}
fn add_rust_to_path() -> Step<Run> {
named::bash(r#"echo "$HOME/.cargo/bin" >> "$GITHUB_PATH""#)
}
fn upload_zed_nightly(platform: Platform, arch: Arch) -> Step<Run> {
match platform {
Platform::Linux => named::bash(&format!("script/upload-nightly linux-targz {arch}")),
Platform::Mac => named::bash(&format!("script/upload-nightly macos {arch}")),
Platform::Windows => {
let cmd = match arch {
Arch::X86_64 => "script/upload-nightly.ps1 -Architecture x86_64",
Arch::ARM64 => "script/upload-nightly.ps1 -Architecture aarch64",
};
named::pwsh(cmd).working_directory("${{ env.ZED_WORKSPACE }}")
}
}
}
fn build_zed_installer(arch: Arch) -> Step<Run> {
let cmd = match arch {
Arch::X86_64 => "script/bundle-windows.ps1 -Architecture x86_64",
Arch::ARM64 => "script/bundle-windows.ps1 -Architecture aarch64",
};
named::pwsh(cmd).working_directory("${{ env.ZED_WORKSPACE }}")
}
fn upload_zed_nightly_windows(arch: Arch) -> Step<Run> {
let cmd = match arch {
Arch::X86_64 => "script/upload-nightly.ps1 -Architecture x86_64",
Arch::ARM64 => "script/upload-nightly.ps1 -Architecture aarch64",
};
named::pwsh(cmd).working_directory("${{ env.ZED_WORKSPACE }}")
}
fn update_nightly_tag() -> Step<Run> {
named::bash(indoc::indoc! {r#"
if [ "$(git rev-parse nightly)" = "$(git rev-parse HEAD)" ]; then
echo "Nightly tag already points to current commit. Skipping tagging."
exit 0
fi
git config user.name github-actions
git config user.email github-actions@github.com
git tag -f nightly
git push origin nightly --force
"#})
}
fn create_sentry_release() -> Step<Use> {
named::uses(
"getsentry",
"action-release",
"526942b68292201ac6bbb99b9a0747d4abee354c", // v3
)
.add_env(("SENTRY_ORG", "zed-dev"))
.add_env(("SENTRY_PROJECT", "zed"))
.add_env(("SENTRY_AUTH_TOKEN", vars::SENTRY_AUTH_TOKEN))
.add_with(("environment", "production"))
}

@@ -0,0 +1,107 @@
use gh_workflow::{
Event, Expression, Job, PullRequest, PullRequestType, Run, Schedule, Step, Use, Workflow,
WorkflowDispatch,
};
use crate::tasks::workflows::{
runners::{self, Platform},
steps::{self, FluentBuilder as _, NamedJob, named, setup_cargo_config},
vars,
};
pub(crate) fn run_agent_evals() -> Workflow {
let agent_evals = agent_evals();
named::workflow()
.on(Event::default()
.schedule([Schedule::default().cron("0 0 * * *")])
.pull_request(PullRequest::default().add_branch("**").types([
PullRequestType::Synchronize,
PullRequestType::Reopened,
PullRequestType::Labeled,
]))
.workflow_dispatch(WorkflowDispatch::default()))
.concurrency(vars::one_workflow_per_non_main_branch())
.add_env(("CARGO_TERM_COLOR", "always"))
.add_env(("CARGO_INCREMENTAL", 0))
.add_env(("RUST_BACKTRACE", 1))
.add_env(("ANTHROPIC_API_KEY", vars::ANTHROPIC_API_KEY))
.add_env(("ZED_CLIENT_CHECKSUM_SEED", vars::ZED_CLIENT_CHECKSUM_SEED))
.add_env(("ZED_EVAL_TELEMETRY", 1))
.add_job(agent_evals.name, agent_evals.job)
}
fn agent_evals() -> NamedJob {
fn run_eval() -> Step<Run> {
named::bash("cargo run --package=eval -- --repetitions=8 --concurrency=1")
}
named::job(
Job::default()
.cond(Expression::new(indoc::indoc!{r#"
github.repository_owner == 'zed-industries' &&
(github.event_name != 'pull_request' || contains(github.event.pull_request.labels.*.name, 'run-eval'))
"#}))
.runs_on(runners::LINUX_DEFAULT)
.timeout_minutes(60_u32)
.add_step(steps::checkout_repo())
.add_step(steps::cache_rust_dependencies_namespace())
.map(steps::install_linux_dependencies)
.add_step(setup_cargo_config(Platform::Linux))
.add_step(steps::script("cargo build --package=eval"))
.add_step(run_eval())
.add_step(steps::cleanup_cargo_config(Platform::Linux))
)
}
pub(crate) fn run_unit_evals() -> Workflow {
let unit_evals = unit_evals();
named::workflow()
.on(Event::default()
.schedule([
// GitHub might drop jobs at busy times, so we choose a random time in the middle of the night.
Schedule::default().cron("47 1 * * 2"),
])
.workflow_dispatch(WorkflowDispatch::default()))
.concurrency(vars::one_workflow_per_non_main_branch())
.add_env(("CARGO_TERM_COLOR", "always"))
.add_env(("CARGO_INCREMENTAL", 0))
.add_env(("RUST_BACKTRACE", 1))
.add_env(("ZED_CLIENT_CHECKSUM_SEED", vars::ZED_CLIENT_CHECKSUM_SEED))
.add_job(unit_evals.name, unit_evals.job)
}
fn unit_evals() -> NamedJob {
fn send_failure_to_slack() -> Step<Use> {
named::uses(
"slackapi",
"slack-github-action",
"b0fa283ad8fea605de13dc3f449259339835fc52",
)
.if_condition(Expression::new("${{ failure() }}"))
.add_with(("method", "chat.postMessage"))
.add_with(("token", vars::SLACK_APP_ZED_UNIT_EVALS_BOT_TOKEN))
.add_with(("payload", indoc::indoc!{r#"
channel: C04UDRNNJFQ
text: "Unit Evals Failed: https://github.com/zed-industries/zed/actions/runs/${{ github.run_id }}"
"#}))
}
named::job(
Job::default()
.runs_on(runners::LINUX_DEFAULT)
.add_step(steps::checkout_repo())
.add_step(steps::setup_cargo_config(Platform::Linux))
.add_step(steps::cache_rust_dependencies_namespace())
.map(steps::install_linux_dependencies)
.add_step(steps::cargo_install_nextest(Platform::Linux))
.add_step(steps::clear_target_dir_if_large(Platform::Linux))
.add_step(
steps::script("./script/run-unit-evals")
.add_env(("ANTHROPIC_API_KEY", vars::ANTHROPIC_API_KEY)),
)
.add_step(send_failure_to_slack())
.add_step(steps::cleanup_cargo_config(Platform::Linux)),
)
}

@@ -1,12 +1,24 @@
use std::path::Path;
use crate::tasks::workflows::{
steps::named,
vars::{mac_bundle_envs, windows_bundle_envs},
release::ReleaseBundleJobs,
runners::{Arch, Platform, ReleaseChannel},
steps::{FluentBuilder, NamedJob, dependant_job, named},
vars::{assets, bundle_envs},
};
use super::{runners, steps, vars};
use super::{runners, steps};
use gh_workflow::*;
pub fn run_bundling() -> Workflow {
let bundle = ReleaseBundleJobs {
linux_aarch64: bundle_linux(Arch::AARCH64, None, &[]),
linux_x86_64: bundle_linux(Arch::X86_64, None, &[]),
mac_aarch64: bundle_mac(Arch::AARCH64, None, &[]),
mac_x86_64: bundle_mac(Arch::X86_64, None, &[]),
windows_aarch64: bundle_windows(Arch::AARCH64, None, &[]),
windows_x86_64: bundle_windows(Arch::X86_64, None, &[]),
};
named::workflow()
.on(Event::default().pull_request(
PullRequest::default().types([PullRequestType::Labeled, PullRequestType::Synchronize]),
@@ -18,102 +30,164 @@ pub fn run_bundling() -> Workflow {
.cancel_in_progress(true),
)
.add_env(("CARGO_TERM_COLOR", "always"))
.add_env(("CARGO_INCREMENTAL", "0"))
.add_env(("RUST_BACKTRACE", "1"))
.add_env(("ZED_CLIENT_CHECKSUM_SEED", vars::ZED_CLIENT_CHECKSUM_SEED))
.add_env(("ZED_MINIDUMP_ENDPOINT", vars::ZED_SENTRY_MINIDUMP_ENDPOINT))
.add_job("bundle_mac_x86_64", bundle_mac_job(runners::Arch::X86_64))
.add_job("bundle_mac_arm64", bundle_mac_job(runners::Arch::ARM64))
.add_job("bundle_linux_x86_64", bundle_linux(runners::Arch::X86_64))
.add_job("bundle_linux_arm64", bundle_linux(runners::Arch::ARM64))
.add_job(
"bundle_windows_x86_64",
bundle_windows_job(runners::Arch::X86_64),
)
.add_job(
"bundle_windows_arm64",
bundle_windows_job(runners::Arch::ARM64),
)
.map(|mut workflow| {
for job in bundle.into_jobs() {
workflow = workflow.add_job(job.name, job.job);
}
workflow
})
}
fn bundle_job() -> Job {
Job::default()
.cond(Expression::new(
fn bundle_job(deps: &[&NamedJob]) -> Job {
dependant_job(deps)
.when(deps.is_empty(), |job|
job.cond(Expression::new(
"(github.event.action == 'labeled' && github.event.label.name == 'run-bundling') ||
(github.event.action == 'synchronize' && contains(github.event.pull_request.labels.*.name, 'run-bundling'))",
))
)))
.timeout_minutes(60u32)
}
-fn bundle_mac_job(arch: runners::Arch) -> Job {
-    use vars::GITHUB_SHA;
-    bundle_job()
-        .runs_on(runners::MAC_DEFAULT)
-        .envs(mac_bundle_envs())
-        .add_step(steps::checkout_repo())
-        .add_step(steps::setup_node())
-        .add_step(steps::setup_sentry())
-        .add_step(steps::clear_target_dir_if_large(runners::Platform::Mac))
-        .add_step(bundle_mac(arch))
-        .add_step(steps::upload_artifact(
-            &format!("Zed_{GITHUB_SHA}-{arch}.dmg"),
-            &format!("target/{arch}-apple-darwin/release/Zed.dmg"),
-        ))
-        .add_step(steps::upload_artifact(
-            &format!("zed-remote-server-{GITHUB_SHA}-macos-{arch}.gz"),
-            &format!("target/zed-remote-server-macos-{arch}.gz"),
-        ))
-}
-pub fn bundle_mac(arch: runners::Arch) -> Step<Run> {
-    named::bash(&format!("./script/bundle-mac {arch}-apple-darwin"))
-}
-fn bundle_linux(arch: runners::Arch) -> Job {
-    let artifact_name = format!("zed-{}-{}.tar.gz", vars::GITHUB_SHA, arch.triple());
-    let remote_server_artifact_name = format!(
-        "zed-remote-server-{}-{}.tar.gz",
-        vars::GITHUB_SHA,
-        arch.triple()
-    );
-    let mut job = bundle_job()
-        .runs_on(arch.linux_bundler())
-        .add_step(steps::checkout_repo())
-        .add_step(steps::setup_sentry())
-        .add_step(steps::script("./script/linux"));
-    // todo(ci) can we do this on arm too?
-    if arch == runners::Arch::X86_64 {
-        job = job.add_step(steps::script("./script/install-mold"));
-    }
-    job.add_step(steps::script("./script/bundle-linux"))
-        .add_step(steps::upload_artifact(
-            &artifact_name,
-            "target/release/zed-*.tar.gz",
-        ))
-        .add_step(steps::upload_artifact(
-            &remote_server_artifact_name,
-            "target/release/zed-remote-server-*.tar.gz",
-        ))
-}
-fn bundle_windows_job(arch: runners::Arch) -> Job {
-    use vars::GITHUB_SHA;
-    bundle_job()
-        .runs_on(runners::WINDOWS_DEFAULT)
-        .envs(windows_bundle_envs())
-        .add_step(steps::checkout_repo())
-        .add_step(steps::setup_sentry())
-        .add_step(bundle_windows(arch))
-        .add_step(steps::upload_artifact(
-            &format!("Zed_{GITHUB_SHA}-{arch}.exe"),
-            "${{ env.SETUP_PATH }}",
-        ))
-}
-fn bundle_windows(arch: runners::Arch) -> Step<Run> {
-    let step = match arch {
-        runners::Arch::X86_64 => named::pwsh("script/bundle-windows.ps1 -Architecture x86_64"),
-        runners::Arch::ARM64 => named::pwsh("script/bundle-windows.ps1 -Architecture aarch64"),
-    };
-    step.working_directory("${{ env.ZED_WORKSPACE }}")
-}
+pub(crate) fn bundle_mac(
+    arch: Arch,
+    release_channel: Option<ReleaseChannel>,
+    deps: &[&NamedJob],
+) -> NamedJob {
+    pub fn bundle_mac(arch: Arch) -> Step<Run> {
+        named::bash(&format!("./script/bundle-mac {arch}-apple-darwin"))
+    }
+    let platform = Platform::Mac;
+    let artifact_name = match arch {
+        Arch::X86_64 => assets::MAC_X86_64,
+        Arch::AARCH64 => assets::MAC_AARCH64,
+    };
+    let remote_server_artifact_name = match arch {
+        Arch::X86_64 => assets::REMOTE_SERVER_MAC_X86_64,
+        Arch::AARCH64 => assets::REMOTE_SERVER_MAC_AARCH64,
+    };
+    NamedJob {
+        name: format!("bundle_mac_{arch}"),
+        job: bundle_job(deps)
+            .runs_on(runners::MAC_DEFAULT)
+            .envs(bundle_envs(platform))
+            .add_step(steps::checkout_repo())
+            .when_some(release_channel, |job, release_channel| {
+                job.add_step(set_release_channel(platform, release_channel))
+            })
+            .add_step(steps::setup_node())
+            .add_step(steps::setup_sentry())
+            .add_step(steps::clear_target_dir_if_large(runners::Platform::Mac))
+            .add_step(bundle_mac(arch))
+            .add_step(upload_artifact(&format!(
+                "target/{arch}-apple-darwin/release/{artifact_name}"
+            )))
+            .add_step(upload_artifact(&format!(
+                "target/{remote_server_artifact_name}"
+            ))),
+    }
+}
pub fn upload_artifact(path: &str) -> Step<Use> {
let name = Path::new(path).file_name().unwrap().to_str().unwrap();
Step::new(format!("@actions/upload-artifact {}", name))
.uses(
"actions",
"upload-artifact",
"330a01c490aca151604b8cf639adc76d48f6c5d4", // v5
)
// N.B. "name" is the name for the asset. The uploaded
// file retains its filename.
.add_with(("name", name))
.add_with(("path", path))
.add_with(("if-no-files-found", "error"))
}
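The helper above names the upload step after the final component of the artifact path, so callers only pass a single path. A minimal standalone sketch of that derivation (the `artifact_step_name` function here is hypothetical, for illustration only):

```rust
use std::path::Path;

/// Derive an upload-artifact step name from the artifact path's final
/// component, mirroring the `upload_artifact` helper above.
fn artifact_step_name(path: &str) -> String {
    let name = Path::new(path)
        .file_name()
        .expect("artifact path must end in a file name")
        .to_str()
        .expect("artifact path must be valid UTF-8");
    format!("@actions/upload-artifact {name}")
}

fn main() {
    let step = artifact_step_name("target/aarch64-apple-darwin/release/Zed.dmg");
    assert_eq!(step, "@actions/upload-artifact Zed.dmg");
    println!("{step}");
}
```

Because the uploaded file keeps its filename, only the step's `name` input needs this derivation.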
pub(crate) fn bundle_linux(
arch: Arch,
release_channel: Option<ReleaseChannel>,
deps: &[&NamedJob],
) -> NamedJob {
let platform = Platform::Linux;
let artifact_name = match arch {
Arch::X86_64 => assets::LINUX_X86_64,
Arch::AARCH64 => assets::LINUX_AARCH64,
};
let remote_server_artifact_name = match arch {
Arch::X86_64 => assets::REMOTE_SERVER_LINUX_X86_64,
Arch::AARCH64 => assets::REMOTE_SERVER_LINUX_AARCH64,
};
NamedJob {
name: format!("bundle_linux_{arch}"),
job: bundle_job(deps)
.runs_on(arch.linux_bundler())
.envs(bundle_envs(platform))
.add_step(steps::checkout_repo())
.when_some(release_channel, |job, release_channel| {
job.add_step(set_release_channel(platform, release_channel))
})
.add_step(steps::setup_sentry())
.map(steps::install_linux_dependencies)
.add_step(steps::script("./script/bundle-linux"))
.add_step(upload_artifact(&format!("target/release/{artifact_name}")))
.add_step(upload_artifact(&format!(
"target/{remote_server_artifact_name}"
))),
}
}
pub(crate) fn bundle_windows(
arch: Arch,
release_channel: Option<ReleaseChannel>,
deps: &[&NamedJob],
) -> NamedJob {
let platform = Platform::Windows;
pub fn bundle_windows(arch: Arch) -> Step<Run> {
let step = match arch {
Arch::X86_64 => named::pwsh("script/bundle-windows.ps1 -Architecture x86_64"),
Arch::AARCH64 => named::pwsh("script/bundle-windows.ps1 -Architecture aarch64"),
};
step.working_directory("${{ env.ZED_WORKSPACE }}")
}
let artifact_name = match arch {
Arch::X86_64 => assets::WINDOWS_X86_64,
Arch::AARCH64 => assets::WINDOWS_AARCH64,
};
NamedJob {
name: format!("bundle_windows_{arch}"),
job: bundle_job(deps)
.runs_on(runners::WINDOWS_DEFAULT)
.envs(bundle_envs(platform))
.add_step(steps::checkout_repo())
.when_some(release_channel, |job, release_channel| {
job.add_step(set_release_channel(platform, release_channel))
})
.add_step(steps::setup_sentry())
.add_step(bundle_windows(arch))
.add_step(upload_artifact(&format!("target/{artifact_name}"))),
}
}
fn set_release_channel(platform: Platform, release_channel: ReleaseChannel) -> Step<Run> {
match release_channel {
ReleaseChannel::Nightly => set_release_channel_to_nightly(platform),
}
}
fn set_release_channel_to_nightly(platform: Platform) -> Step<Run> {
match platform {
Platform::Linux | Platform::Mac => named::bash(indoc::indoc! {r#"
set -eu
version=$(git rev-parse --short HEAD)
echo "Publishing version: ${version} on release channel nightly"
echo "nightly" > crates/zed/RELEASE_CHANNEL
"#}),
Platform::Windows => named::pwsh(indoc::indoc! {r#"
$ErrorActionPreference = "Stop"
$version = git rev-parse --short HEAD
Write-Host "Publishing version: $version on release channel nightly"
"nightly" | Set-Content -Path "crates/zed/RELEASE_CHANNEL"
"#})
.working_directory("${{ env.ZED_WORKSPACE }}"),
}
}


@@ -0,0 +1,483 @@
use gh_workflow::{
Concurrency, Event, Expression, Job, PullRequest, Push, Run, Step, Use, Workflow,
};
use indexmap::IndexMap;
use crate::tasks::workflows::{
nix_build::build_nix, runners::Arch, steps::BASH_SHELL, vars::PathCondition,
};
use super::{
runners::{self, Platform},
steps::{self, FluentBuilder, NamedJob, named, release_job},
};
pub(crate) fn run_tests() -> Workflow {
    // Specify anything which should potentially skip the full test suite in this regex:
    // - docs/
    // - script/update_top_ranking_issues/
    // - .github/ISSUE_TEMPLATE/
    // - .github/workflows/ (except the run_tests workflow itself, per the regex below)
let should_run_tests = PathCondition::inverted(
"run_tests",
r"^(docs/|script/update_top_ranking_issues/|\.github/(ISSUE_TEMPLATE|workflows/(?!run_tests)))",
);
let should_check_docs = PathCondition::new("run_docs", r"^docs/");
let should_check_scripts = PathCondition::new(
"run_action_checks",
r"^\.github/(workflows/|actions/|actionlint.yml)|tooling/xtask|script/",
);
let should_check_licences =
PathCondition::new("run_licenses", r"^(Cargo.lock|script/.*licenses)");
let should_build_nix = PathCondition::new(
"run_nix",
r"^(nix/|flake\.|Cargo\.|rust-toolchain.toml|\.cargo/config.toml)",
);
let orchestrate = orchestrate(&[
&should_check_scripts,
&should_check_docs,
&should_check_licences,
&should_build_nix,
&should_run_tests,
]);
let jobs = [
orchestrate,
check_style(),
should_run_tests.guard(run_platform_tests(Platform::Windows)),
should_run_tests.guard(run_platform_tests(Platform::Linux)),
should_run_tests.guard(run_platform_tests(Platform::Mac)),
should_run_tests.guard(doctests()),
should_run_tests.guard(check_workspace_binaries()),
should_run_tests.guard(check_postgres_and_protobuf_migrations()), // could be more specific here?
should_run_tests.guard(check_dependencies()), // could be more specific here?
should_check_docs.guard(check_docs()),
should_check_licences.guard(check_licenses()),
should_check_scripts.guard(check_scripts()),
should_build_nix.guard(build_nix(
Platform::Linux,
Arch::X86_64,
"debug",
// *don't* cache the built output
Some("-zed-editor-[0-9.]*-nightly"),
&[],
)),
should_build_nix.guard(build_nix(
Platform::Mac,
Arch::AARCH64,
"debug",
// *don't* cache the built output
Some("-zed-editor-[0-9.]*-nightly"),
&[],
)),
];
let tests_pass = tests_pass(&jobs);
named::workflow()
.add_event(Event::default()
.push(
Push::default()
.add_branch("main")
.add_branch("v[0-9]+.[0-9]+.x")
)
.pull_request(PullRequest::default().add_branch("**"))
)
.concurrency(Concurrency::default()
.group("${{ github.workflow }}-${{ github.ref_name }}-${{ github.ref_name == 'main' && github.sha || 'anysha' }}")
.cancel_in_progress(true)
)
.add_env(( "CARGO_TERM_COLOR", "always" ))
.add_env(( "RUST_BACKTRACE", 1 ))
.add_env(( "CARGO_INCREMENTAL", 0 ))
.map(|mut workflow| {
for job in jobs {
workflow = workflow.add_job(job.name, job.job)
}
workflow
})
.add_job(tests_pass.name, tests_pass.job)
}
// Generates a bash script that checks changed files against regex patterns
// and sets GitHub output variables accordingly
fn orchestrate(rules: &[&PathCondition]) -> NamedJob {
let name = "orchestrate".to_owned();
let step_name = "filter".to_owned();
let mut script = String::new();
script.push_str(indoc::indoc! {r#"
if [ -z "$GITHUB_BASE_REF" ]; then
echo "Not in a PR context (i.e., push to main/stable/preview)"
COMPARE_REV="$(git rev-parse HEAD~1)"
else
echo "In a PR context comparing to pull_request.base.ref"
git fetch origin "$GITHUB_BASE_REF" --depth=350
COMPARE_REV="$(git merge-base "origin/${GITHUB_BASE_REF}" HEAD)"
fi
CHANGED_FILES="$(git diff --name-only "$COMPARE_REV" ${{ github.sha }})"
check_pattern() {
local output_name="$1"
local pattern="$2"
local grep_arg="$3"
echo "$CHANGED_FILES" | grep "$grep_arg" "$pattern" && \
echo "${output_name}=true" >> "$GITHUB_OUTPUT" || \
echo "${output_name}=false" >> "$GITHUB_OUTPUT"
}
"#});
let mut outputs = IndexMap::new();
for rule in rules {
assert!(
rule.set_by_step
.borrow_mut()
.replace(name.clone())
.is_none()
);
assert!(
outputs
.insert(
rule.name.to_owned(),
format!("${{{{ steps.{}.outputs.{} }}}}", step_name, rule.name)
)
.is_none()
);
let grep_arg = if rule.invert { "-qvP" } else { "-qP" };
script.push_str(&format!(
"check_pattern \"{}\" '{}' {}\n",
rule.name, rule.pattern, grep_arg
));
}
let job = Job::default()
.runs_on(runners::LINUX_SMALL)
.cond(Expression::new(
"github.repository_owner == 'zed-industries'",
))
.outputs(outputs)
.add_step(steps::checkout_repo().add_with((
"fetch-depth",
"${{ github.ref == 'refs/heads/main' && 2 || 350 }}",
)))
.add_step(
Step::new(step_name.clone())
.run(script)
.id(step_name)
.shell(BASH_SHELL),
);
NamedJob { name, job }
}
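The loop above turns each `PathCondition` into one `check_pattern` line of the generated filter script, flipping `grep` between `-qP` and `-qvP` for inverted rules. A self-contained sketch of just that string generation (the `Rule` struct and `filter_script` name are stand-ins for `PathCondition` and the inline loop):

```rust
/// Toy stand-in for `PathCondition`: a named regex, optionally inverted.
struct Rule {
    name: &'static str,
    pattern: &'static str,
    invert: bool,
}

/// Emit one `check_pattern` call per rule, as the orchestrate job does.
fn filter_script(rules: &[Rule]) -> String {
    let mut script = String::new();
    for rule in rules {
        // Inverted rules match "any changed file NOT hitting the pattern".
        let grep_arg = if rule.invert { "-qvP" } else { "-qP" };
        script.push_str(&format!(
            "check_pattern \"{}\" '{}' {}\n",
            rule.name, rule.pattern, grep_arg
        ));
    }
    script
}

fn main() {
    let script = filter_script(&[
        Rule { name: "run_docs", pattern: r"^docs/", invert: false },
        Rule { name: "run_tests", pattern: r"^(docs/|script/)", invert: true },
    ]);
    assert!(script.contains("check_pattern \"run_docs\" '^docs/' -qP\n"));
    assert!(script.contains("check_pattern \"run_tests\" '^(docs/|script/)' -qvP\n"));
}
```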
pub(crate) fn tests_pass(jobs: &[NamedJob]) -> NamedJob {
let mut script = String::from(indoc::indoc! {r#"
set +x
EXIT_CODE=0
check_result() {
echo "* $1: $2"
if [[ "$2" != "skipped" && "$2" != "success" ]]; then EXIT_CODE=1; fi
}
"#});
script.push_str(
&jobs
.iter()
.map(|job| {
format!(
"check_result \"{}\" \"${{{{ needs.{}.result }}}}\"",
job.name, job.name
)
})
.collect::<Vec<_>>()
.join("\n"),
);
script.push_str("\n\nexit $EXIT_CODE\n");
let job = Job::default()
.runs_on(runners::LINUX_SMALL)
.needs(
jobs.iter()
.map(|j| j.name.to_string())
.collect::<Vec<String>>(),
)
.cond(Expression::new(
"github.repository_owner == 'zed-industries' && always()",
))
.add_step(named::bash(&script));
named::job(job)
}
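The gate job above builds one `check_result` invocation per needed job, each reading that job's `needs.<name>.result` expression. A standalone sketch of that line generation (the `check_result_lines` function is a hypothetical extraction of the inline `map`/`join`):

```rust
/// Generate the `check_result` lines used by the tests_pass gate script.
fn check_result_lines(job_names: &[&str]) -> String {
    job_names
        .iter()
        // `{{`/`}}` escape literal braces, so this emits `${{ needs.<name>.result }}`.
        .map(|name| format!("check_result \"{name}\" \"${{{{ needs.{name}.result }}}}\""))
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let lines = check_result_lines(&["check_style", "doctests"]);
    assert_eq!(
        lines,
        "check_result \"check_style\" \"${{ needs.check_style.result }}\"\n\
         check_result \"doctests\" \"${{ needs.doctests.result }}\""
    );
}
```

Combined with `always()` in the job condition, this lets the gate report failures even when some upstream jobs were skipped.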
fn check_style() -> NamedJob {
fn check_for_typos() -> Step<Use> {
named::uses(
"crate-ci",
"typos",
"80c8a4945eec0f6d464eaf9e65ed98ef085283d1",
) // v1.38.1
.with(("config", "./typos.toml"))
}
named::job(
release_job(&[])
.runs_on(runners::LINUX_MEDIUM)
.add_step(steps::checkout_repo())
.add_step(steps::cache_rust_dependencies_namespace())
.add_step(steps::setup_pnpm())
.add_step(steps::script("./script/prettier"))
.add_step(steps::script("./script/check-todos"))
.add_step(steps::script("./script/check-keymaps"))
.add_step(check_for_typos())
.add_step(steps::cargo_fmt()),
)
}
fn check_dependencies() -> NamedJob {
fn install_cargo_machete() -> Step<Use> {
named::uses(
"clechasseur",
"rs-cargo",
"8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386", // v2
)
.add_with(("command", "install"))
.add_with(("args", "cargo-machete@0.7.0"))
}
fn run_cargo_machete() -> Step<Use> {
named::uses(
"clechasseur",
"rs-cargo",
"8435b10f6e71c2e3d4d3b7573003a8ce4bfc6386", // v2
)
.add_with(("command", "machete"))
}
fn check_cargo_lock() -> Step<Run> {
named::bash("cargo update --locked --workspace")
}
fn check_vulnerable_dependencies() -> Step<Use> {
named::uses(
"actions",
"dependency-review-action",
"67d4f4bd7a9b17a0db54d2a7519187c65e339de8", // v4
)
.if_condition(Expression::new("github.event_name == 'pull_request'"))
.with(("license-check", false))
}
named::job(
release_job(&[])
.runs_on(runners::LINUX_SMALL)
.add_step(steps::checkout_repo())
.add_step(steps::cache_rust_dependencies_namespace())
.add_step(install_cargo_machete())
.add_step(run_cargo_machete())
.add_step(check_cargo_lock())
.add_step(check_vulnerable_dependencies()),
)
}
fn check_workspace_binaries() -> NamedJob {
named::job(
release_job(&[])
.runs_on(runners::LINUX_LARGE)
.add_step(steps::checkout_repo())
.add_step(steps::setup_cargo_config(Platform::Linux))
.map(steps::install_linux_dependencies)
.add_step(steps::cache_rust_dependencies_namespace())
.add_step(steps::script("cargo build -p collab"))
.add_step(steps::script("cargo build --workspace --bins --examples"))
.add_step(steps::cleanup_cargo_config(Platform::Linux)),
)
}
pub(crate) fn run_platform_tests(platform: Platform) -> NamedJob {
let runner = match platform {
Platform::Windows => runners::WINDOWS_DEFAULT,
Platform::Linux => runners::LINUX_DEFAULT,
Platform::Mac => runners::MAC_DEFAULT,
};
NamedJob {
name: format!("run_tests_{platform}"),
job: release_job(&[])
.runs_on(runner)
.add_step(steps::checkout_repo())
.add_step(steps::setup_cargo_config(platform))
.when(
platform == Platform::Linux,
steps::install_linux_dependencies,
)
.when(platform == Platform::Linux, |this| {
this.add_step(steps::cache_rust_dependencies_namespace())
})
.add_step(steps::setup_node())
.add_step(steps::clippy(platform))
.add_step(steps::cargo_install_nextest(platform))
.add_step(steps::clear_target_dir_if_large(platform))
.add_step(steps::cargo_nextest(platform))
.add_step(steps::cleanup_cargo_config(platform)),
}
}
pub(crate) fn check_postgres_and_protobuf_migrations() -> NamedJob {
fn remove_untracked_files() -> Step<Run> {
named::bash("git clean -df")
}
fn ensure_fresh_merge() -> Step<Run> {
named::bash(indoc::indoc! {r#"
if [ -z "$GITHUB_BASE_REF" ];
then
echo "BUF_BASE_BRANCH=$(git merge-base origin/main HEAD)" >> "$GITHUB_ENV"
else
git checkout -B temp
git merge -q "origin/$GITHUB_BASE_REF" -m "merge main into temp"
echo "BUF_BASE_BRANCH=$GITHUB_BASE_REF" >> "$GITHUB_ENV"
fi
"#})
}
fn bufbuild_setup_action() -> Step<Use> {
named::uses("bufbuild", "buf-setup-action", "v1").add_with(("version", "v1.29.0"))
}
fn bufbuild_breaking_action() -> Step<Use> {
named::uses("bufbuild", "buf-breaking-action", "v1").add_with(("input", "crates/proto/proto/"))
.add_with(("against", "https://github.com/${GITHUB_REPOSITORY}.git#branch=${BUF_BASE_BRANCH},subdir=crates/proto/proto/"))
}
named::job(
release_job(&[])
.runs_on(runners::MAC_DEFAULT)
.add_step(steps::checkout_repo().with(("fetch-depth", 0))) // fetch full history
.add_step(remove_untracked_files())
.add_step(ensure_fresh_merge())
.add_step(bufbuild_setup_action())
.add_step(bufbuild_breaking_action()),
)
}
fn doctests() -> NamedJob {
fn run_doctests() -> Step<Run> {
named::bash(indoc::indoc! {r#"
cargo test --workspace --doc --no-fail-fast
"#})
.id("run_doctests")
}
named::job(
release_job(&[])
.runs_on(runners::LINUX_DEFAULT)
.add_step(steps::checkout_repo())
.add_step(steps::cache_rust_dependencies_namespace())
.map(steps::install_linux_dependencies)
.add_step(steps::setup_cargo_config(Platform::Linux))
.add_step(run_doctests())
.add_step(steps::cleanup_cargo_config(Platform::Linux)),
)
}
fn check_licenses() -> NamedJob {
named::job(
Job::default()
.runs_on(runners::LINUX_SMALL)
.add_step(steps::checkout_repo())
.add_step(steps::cache_rust_dependencies_namespace())
.add_step(steps::script("./script/check-licenses"))
.add_step(steps::script("./script/generate-licenses")),
)
}
fn check_docs() -> NamedJob {
fn lychee_link_check(dir: &str) -> Step<Use> {
named::uses(
"lycheeverse",
"lychee-action",
"82202e5e9c2f4ef1a55a3d02563e1cb6041e5332",
) // v2.4.1
.add_with(("args", format!("--no-progress --exclude '^http' '{dir}'")))
.add_with(("fail", true))
.add_with(("jobSummary", false))
}
fn install_mdbook() -> Step<Use> {
named::uses(
"peaceiris",
"actions-mdbook",
"ee69d230fe19748b7abf22df32acaa93833fad08", // v2
)
.with(("mdbook-version", "0.4.37"))
}
fn build_docs() -> Step<Run> {
named::bash(indoc::indoc! {r#"
mkdir -p target/deploy
mdbook build ./docs --dest-dir=../target/deploy/docs/
"#})
}
named::job(
release_job(&[])
.runs_on(runners::LINUX_LARGE)
.add_step(steps::checkout_repo())
.add_step(steps::setup_cargo_config(Platform::Linux))
// todo(ci): un-inline build_docs/action.yml here
.add_step(steps::cache_rust_dependencies_namespace())
.add_step(
lychee_link_check("./docs/src/**/*"), // check markdown links
)
.map(steps::install_linux_dependencies)
.add_step(install_mdbook())
.add_step(build_docs())
.add_step(
lychee_link_check("target/deploy/docs"), // check links in generated html
),
)
}
pub(crate) fn check_scripts() -> NamedJob {
fn download_actionlint() -> Step<Run> {
named::bash(
"bash <(curl https://raw.githubusercontent.com/rhysd/actionlint/main/scripts/download-actionlint.bash)",
)
}
fn run_actionlint() -> Step<Run> {
named::bash(indoc::indoc! {r#"
${{ steps.get_actionlint.outputs.executable }} -color
"#})
}
fn run_shellcheck() -> Step<Run> {
named::bash("./script/shellcheck-scripts error")
}
fn check_xtask_workflows() -> Step<Run> {
named::bash(indoc::indoc! {r#"
cargo xtask workflows
if ! git diff --exit-code .github; then
echo "Error: .github directory has uncommitted changes after running 'cargo xtask workflows'"
echo "Please run 'cargo xtask workflows' locally and commit the changes"
exit 1
fi
"#})
}
named::job(
release_job(&[])
.runs_on(runners::LINUX_SMALL)
.add_step(steps::checkout_repo())
.add_step(run_shellcheck())
.add_step(download_actionlint().id("get_actionlint"))
.add_step(run_actionlint())
.add_step(check_xtask_workflows()),
)
}


@@ -1,5 +1,8 @@
-pub const LINUX_CHEAP: Runner = Runner("namespace-profile-2x4-ubuntu-2404");
-pub const LINUX_DEFAULT: Runner = Runner("namespace-profile-16x32-ubuntu-2204");
+pub const LINUX_SMALL: Runner = Runner("namespace-profile-2x4-ubuntu-2404");
+pub const LINUX_DEFAULT: Runner = LINUX_XL;
+pub const LINUX_XL: Runner = Runner("namespace-profile-16x32-ubuntu-2204");
 pub const LINUX_LARGE: Runner = Runner("namespace-profile-8x16-ubuntu-2204");
 pub const LINUX_MEDIUM: Runner = Runner("namespace-profile-4x8-ubuntu-2204");
// Using Ubuntu 20.04 for minimal glibc version
pub const LINUX_X86_BUNDLER: Runner = Runner("namespace-profile-32x64-ubuntu-2004");
@@ -19,30 +22,23 @@ impl Into<gh_workflow::RunsOn> for Runner {
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum Arch {
X86_64,
-    ARM64,
+    AARCH64,
}
impl std::fmt::Display for Arch {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Arch::X86_64 => write!(f, "x86_64"),
-            Arch::ARM64 => write!(f, "aarch64"),
+            Arch::AARCH64 => write!(f, "aarch64"),
}
}
}
impl Arch {
-    pub fn triple(&self) -> &'static str {
-        match self {
-            Arch::X86_64 => "x86_64-unknown-linux-gnu",
-            Arch::ARM64 => "aarch64-unknown-linux-gnu",
-        }
-    }
pub fn linux_bundler(&self) -> Runner {
match self {
Arch::X86_64 => LINUX_X86_BUNDLER,
-            Arch::ARM64 => LINUX_ARM_BUNDLER,
+            Arch::AARCH64 => LINUX_ARM_BUNDLER,
}
}
}
@@ -63,3 +59,8 @@ impl std::fmt::Display for Platform {
}
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum ReleaseChannel {
Nightly,
}


@@ -2,9 +2,9 @@ use gh_workflow::*;
use crate::tasks::workflows::{runners::Platform, vars};
-const BASH_SHELL: &str = "bash -euxo pipefail {0}";
+pub const BASH_SHELL: &str = "bash -euxo pipefail {0}";
// https://docs.github.com/en/actions/reference/workflows-and-actions/workflow-syntax#jobsjob_idstepsshell
-const PWSH_SHELL: &str = "pwsh";
+pub const PWSH_SHELL: &str = "pwsh";
pub fn checkout_repo() -> Step<Use> {
named::uses(
@@ -86,25 +86,37 @@ pub fn cleanup_cargo_config(platform: Platform) -> Step<Run> {
step.if_condition(Expression::new("always()"))
}
-pub fn upload_artifact(name: &str, path: &str) -> Step<Use> {
-    Step::new(format!("@actions/upload-artifact {}", name))
-        .uses(
-            "actions",
-            "upload-artifact",
-            "330a01c490aca151604b8cf639adc76d48f6c5d4", // v5
-        )
-        .add_with(("name", name))
-        .add_with(("path", path))
-}
pub fn clear_target_dir_if_large(platform: Platform) -> Step<Run> {
match platform {
Platform::Windows => named::pwsh("./script/clear-target-dir-if-larger-than.ps1 250"),
-        Platform::Linux => named::bash("./script/clear-target-dir-if-larger-than 100"),
+        Platform::Linux => named::bash("./script/clear-target-dir-if-larger-than 250"),
Platform::Mac => named::bash("./script/clear-target-dir-if-larger-than 300"),
}
}
pub(crate) fn clippy(platform: Platform) -> Step<Run> {
match platform {
Platform::Windows => named::pwsh("./script/clippy.ps1"),
_ => named::bash("./script/clippy"),
}
}
pub(crate) fn cache_rust_dependencies_namespace() -> Step<Use> {
named::uses("namespacelabs", "nscloud-cache-action", "v1").add_with(("cache", "rust"))
}
fn setup_linux() -> Step<Run> {
named::bash("./script/linux")
}
fn install_mold() -> Step<Run> {
named::bash("./script/install-mold")
}
pub(crate) fn install_linux_dependencies(job: Job) -> Job {
job.add_step(setup_linux()).add_step(install_mold())
}
pub fn script(name: &str) -> Step<Run> {
if name.ends_with(".ps1") {
Step::new(name).run(name).shell(PWSH_SHELL)
@@ -118,6 +130,91 @@ pub(crate) struct NamedJob {
pub job: Job,
}
// impl NamedJob {
// pub fn map(self, f: impl FnOnce(Job) -> Job) -> Self {
// NamedJob {
// name: self.name,
// job: f(self.job),
// }
// }
// }
pub(crate) fn release_job(deps: &[&NamedJob]) -> Job {
dependant_job(deps)
.cond(Expression::new(
"github.repository_owner == 'zed-industries'",
))
.timeout_minutes(60u32)
}
pub(crate) fn dependant_job(deps: &[&NamedJob]) -> Job {
let job = Job::default();
if deps.len() > 0 {
job.needs(deps.iter().map(|j| j.name.clone()).collect::<Vec<_>>())
} else {
job
}
}
impl FluentBuilder for Job {}
impl FluentBuilder for Workflow {}
/// A helper trait for building complex objects with imperative conditionals in a fluent style.
/// Copied from GPUI to avoid adding GPUI as dependency
/// todo(ci) just put this in gh-workflow
#[allow(unused)]
pub(crate) trait FluentBuilder {
/// Imperatively modify self with the given closure.
fn map<U>(self, f: impl FnOnce(Self) -> U) -> U
where
Self: Sized,
{
f(self)
}
/// Conditionally modify self with the given closure.
fn when(self, condition: bool, then: impl FnOnce(Self) -> Self) -> Self
where
Self: Sized,
{
self.map(|this| if condition { then(this) } else { this })
}
/// Conditionally modify self with the given closure.
fn when_else(
self,
condition: bool,
then: impl FnOnce(Self) -> Self,
else_fn: impl FnOnce(Self) -> Self,
) -> Self
where
Self: Sized,
{
self.map(|this| if condition { then(this) } else { else_fn(this) })
}
/// Conditionally unwrap and modify self with the given closure, if the given option is Some.
fn when_some<T>(self, option: Option<T>, then: impl FnOnce(Self, T) -> Self) -> Self
where
Self: Sized,
{
self.map(|this| {
if let Some(value) = option {
then(this, value)
} else {
this
}
})
}
/// Conditionally unwrap and modify self with the given closure, if the given option is None.
fn when_none<T>(self, option: &Option<T>, then: impl FnOnce(Self) -> Self) -> Self
where
Self: Sized,
{
self.map(|this| if option.is_some() { this } else { then(this) })
}
}
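The trait above keeps long builder chains linear even when steps are conditional. A compact, self-contained re-implementation of the same combinators on a toy builder (the `Pipeline` type is hypothetical; the real code applies these to `Job` and `Workflow`):

```rust
/// Same combinators as the FluentBuilder trait above, on a toy builder.
trait FluentBuilder: Sized {
    fn map<U>(self, f: impl FnOnce(Self) -> U) -> U {
        f(self)
    }
    fn when(self, condition: bool, then: impl FnOnce(Self) -> Self) -> Self {
        if condition { then(self) } else { self }
    }
    fn when_some<T>(self, option: Option<T>, then: impl FnOnce(Self, T) -> Self) -> Self {
        match option {
            Some(value) => then(self, value),
            None => self,
        }
    }
}

#[derive(Default, Debug)]
struct Pipeline {
    steps: Vec<String>,
}

impl Pipeline {
    fn add_step(mut self, step: &str) -> Self {
        self.steps.push(step.to_owned());
        self
    }
}

impl FluentBuilder for Pipeline {}

fn main() {
    // Conditional steps stay in one chain, as in bundle_mac/bundle_linux above.
    let pipeline = Pipeline::default()
        .add_step("checkout")
        .when(true, |p| p.add_step("install-mold"))
        .when_some(Some("nightly"), |p, channel| {
            p.add_step(&format!("set-channel {channel}"))
        });
    assert_eq!(pipeline.steps, ["checkout", "install-mold", "set-channel nightly"]);
}
```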
// (janky) helper to generate steps with a name that corresponds
// to the name of the calling function.
pub(crate) mod named {
@@ -201,3 +298,9 @@ pub(crate) mod named {
.join("::")
}
}
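Most of the `named` module body is elided in this diff; the visible `.join("::")` suggests it derives step names from the trailing segments of a fully qualified function path. A minimal sketch of that assumed mechanism (the `short_name` function is hypothetical):

```rust
/// Keep the last two `::`-separated segments of a fully qualified path,
/// e.g. for use as a step name. (Assumed mechanism; the real helper in
/// the `named` module is elided in this diff.)
fn short_name(full_path: &str) -> String {
    let segments: Vec<&str> = full_path.split("::").collect();
    segments[segments.len().saturating_sub(2)..].join("::")
}

fn main() {
    assert_eq!(
        short_name("xtask::tasks::workflows::steps::checkout_repo"),
        "steps::checkout_repo"
    );
    assert_eq!(short_name("checkout_repo"), "checkout_repo");
}
```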
pub fn git_checkout(ref_name: &dyn std::fmt::Display) -> Step<Run> {
named::bash(&format!(
"git fetch origin {ref_name} && git checkout {ref_name}"
))
}
