Compare commits


38 Commits

Author SHA1 Message Date
Mikayla Maki
aac6168d74 Add the concept of external tasks, to expose the fallibility of GPUI tasks 2025-03-13 20:35:23 -07:00
Mikayla Maki
91da475b64 Remove foreground task and option return type 2025-03-13 20:26:23 -07:00
Mikayla
fcc82019fb ContextTask -> ForegroundTask 2025-03-13 19:33:07 -07:00
Mikayla
adce2dda14 Restore executor cloning APIs for Zed's test infrastructure
but hide them from general use, as using these APIs can cause panics
2025-03-13 19:31:53 -07:00
Mikayla
51390ab00c Document ForegroundContext 2025-03-13 19:31:53 -07:00
Mikayla
d6f960b68c Add Weak variants of AsyncApp and AsyncWindow context 2025-03-13 19:31:51 -07:00
Mikayla
c5688fde73 Move thread id check inside spawned futures.
For some reason, it seems like the metadata drop can sometimes happen on other threads
2025-03-13 19:26:50 -07:00
Mikayla
66eb3d6dc4 Fix some potential runtime panics with rust lifetimes 2025-03-13 19:26:49 -07:00
Mikayla
5231937f3c Add a ContextTask type, and thread it through GPUI 2025-03-13 19:25:30 -07:00
Mikayla Maki
d3ff40ef01 Proof of concept: move the window and app context checks into the executor 2025-03-13 19:20:36 -07:00
Mikayla Maki
0081b816fe Fix a bug where the modal layer could not be dismissed by the mouse 2025-03-12 16:44:16 -07:00
Peter Tripp
21949bcf1a ci: Fix tests not-running on main (#26613)
Follow-up to #26551 

Fix for tests being skipped on main.
Also fetch less history: [example
run](https://github.com/zed-industries/zed/actions/runs/13822318758/job/38670334893)

Release Notes:

- N/A
2025-03-12 19:16:23 -04:00
renovate[bot]
ee7ed6d5b8 Update Rust crate anyhow to v1.0.97 (#26576)
This PR contains the following updates:

| Package | Type | Update | Change |
|---|---|---|---|
| [anyhow](https://redirect.github.com/dtolnay/anyhow) |
workspace.dependencies | patch | `1.0.96` -> `1.0.97` |

---

### Release Notes

<details>
<summary>dtolnay/anyhow (anyhow)</summary>

###
[`v1.0.97`](https://redirect.github.com/dtolnay/anyhow/releases/tag/1.0.97)

[Compare
Source](https://redirect.github.com/dtolnay/anyhow/compare/1.0.96...1.0.97)

-   Documentation improvements

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "after 3pm on Wednesday" in timezone
America/New_York, Automerge - At any time (no schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

Release Notes:

- N/A


Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Marshall Bowers <git@maxdeviant.com>
2025-03-12 23:01:04 +00:00
Marshall Bowers
07b67c1bd3 assistant2: Add ability to enable/disable all tools from a context server (#26610)
This PR adds an option to enable/disable all tools from a specific
context server:

<img width="1297" alt="Screenshot 2025-03-12 at 5 55 45 PM"
src="https://github.com/user-attachments/assets/af6c169e-0462-4a99-9bec-48fbf83dd08a"
/>

Release Notes:

- N/A
2025-03-12 22:14:31 +00:00
Mikayla Maki
f116b44ae8 Rename the editor::ToggleGitBlame action to git::Blame (#26565)
Release Notes:

- Git Beta: Renamed `editor::ToggleGitBlame` to `git::Blame`

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
Co-authored-by: Cole Miller <m@cole-miller.net>
Co-authored-by: Nathan Sobo <nathan@zed.dev>
Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
2025-03-12 22:12:42 +00:00
Nate Butler
43ab7fe0e2 theme: Fix incorrect version control keys in One themes (#26606)
While the `.{variants}` of the theme keys _were_ incorrect, they are
actually more consistent with our current theme keys (thanks AI!), so we
will keep them and fix the incorrect usages in the One themes and
elsewhere.

Old description:

> This PR fixes an issue where we specified the incorrect theme keys
> (thanks AI!) in the theme schema. The following keys have been changed
> to their correct versions:
>
> | Before                   | After                    |
> |--------------------------|--------------------------|
> | version_control.added    | version_control_added    |
> | version_control.deleted  | version_control_deleted  |
> | version_control.modified | version_control_modified |
> | version_control.renamed  | version_control_renamed  |
> | version_control.conflict | version_control_conflict |
> | version_control.ignored  | version_control_ignored  |
>
> Please use the after versions in your themes, as they are correct!
>
> We won't be adding secondary keys to fix this automatically, as git
> only officially launched today.
>
> Due to this change, we've also updated the version control keys in the
> One themes to keep the default diff hunks' look from changing.

Closes #26572

Release Notes:

- theme: Fixed an issue where version control colors weren't applying
correctly.
2025-03-12 22:07:04 +00:00
Richard Feldman
6044773043 Add path search glob tool (#26567)
<img width="638" alt="Screenshot 2025-03-12 at 1 33 31 PM"
src="https://github.com/user-attachments/assets/f29b9dae-59eb-4d7a-bc26-aa4721cb829a"
/>

Release Notes:

- N/A
2025-03-12 22:00:54 +00:00
Conrad Irwin
81af2c0bed Fix overflow in create branch label (#26591)
Closes #ISSUE

Release Notes:

- N/A
2025-03-12 21:55:31 +00:00
Peter Tripp
ab199fda47 ci: GitHub actions refactor (#26551)
Refactor GitHub actions CI workflow.
- Single combined 'tests_pass' action so we only need one mandatory
check for merge queue
- Add new `job_spec` job which determines what needs to be run (+5secs)
  - Do not run full CI for docs only changes (~30secs vs 10+mins)
- Only run `script/generate-licenses` if Cargo.lock changed (saves
~23secs on mac_test)
- Move prettier /docs check to ci.yml and remove docs.yml 
- Run Windows tests on every PR commit
- Added new Windows runners named to reflect their OS/capacity
(windows-2025-64, windows-2025-32, windows-2025-16)

Release Notes:

- N/A
2025-03-12 17:32:38 -04:00
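The docs-only and Cargo.lock filters this refactor introduces boil down to two path predicates. As a rough sketch (the function name and shape here are illustrative, not the workflow's):

```rust
// Sketch of the `job_spec` filter: run tests only when some changed path is
// outside docs/, and regenerate licenses only when Cargo.lock changed.
fn job_spec(changed_paths: &[&str]) -> (bool, bool) {
    let run_tests = changed_paths.iter().any(|p| !p.starts_with("docs/"));
    let run_license = changed_paths.iter().any(|p| *p == "Cargo.lock");
    (run_tests, run_license)
}

fn main() {
    // A docs-only change skips tests and license generation.
    assert_eq!(job_spec(&["docs/linux.md"]), (false, false));
    // A source change plus a lockfile update runs both.
    assert_eq!(job_spec(&["crates/zed/src/main.rs", "Cargo.lock"]), (true, true));
}
```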
Marshall Bowers
e60e8f3a0a assistant_tool: Reduce locking in ToolWorkingSet (#26605)
This PR updates the `ToolWorkingSet` to reduce the amount of locking we
need to do.

A number of the methods have had corresponding versions moved to the
`ToolWorkingSetState` so that we can take out the lock once and do a
number of operations without needing to continually acquire and release
the lock.

Release Notes:

- N/A
2025-03-12 21:26:26 +00:00
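The acquire-once pattern described above can be sketched like this; the types and method names are illustrative, not the actual `ToolWorkingSet` API:

```rust
use std::sync::Mutex;

struct State {
    enabled: Vec<String>,
    disabled: Vec<String>,
}

struct WorkingSet {
    state: Mutex<State>,
}

impl WorkingSet {
    // Before: every query takes the lock separately, so answering two
    // questions acquires and releases the mutex twice.
    fn enabled_count(&self) -> usize {
        self.state.lock().unwrap().enabled.len()
    }
    fn disabled_count(&self) -> usize {
        self.state.lock().unwrap().disabled.len()
    }

    // After: take the lock once and answer both questions against the same
    // guard, mirroring the move of methods onto `ToolWorkingSetState`.
    fn counts(&self) -> (usize, usize) {
        let state = self.state.lock().unwrap();
        (state.enabled.len(), state.disabled.len())
    }
}

fn main() {
    let set = WorkingSet {
        state: Mutex::new(State {
            enabled: vec!["bash".into(), "glob".into()],
            disabled: vec!["delete-path".into()],
        }),
    };
    assert_eq!(set.counts(), (2, 1));
    assert_eq!((set.enabled_count(), set.disabled_count()), (2, 1));
}
```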
brian tan
edeed7b619 workspace::Open: Highlight fuzzy matches (#26320)
Partial: https://github.com/zed-industries/zed/issues/15398

Changes:
Adds highlighting to the matches when using `"use_system_path_prompts":
false`

| before | after |
|---|---|

|![image](https://github.com/user-attachments/assets/60a385a0-abb0-49c5-935c-e71149161562)|![image](https://github.com/user-attachments/assets/d66ce980-cea9-4c22-8e6a-9720344be39a)|

Release Notes:

- N/A
2025-03-12 22:54:38 +02:00
Richard Feldman
9be7934f12 Add Bash tool (#26597)
<img width="636" alt="Screenshot 2025-03-12 at 4 24 18 PM"
src="https://github.com/user-attachments/assets/6f317031-f495-4a5a-8260-79a56b10d628"
/>

<img width="634" alt="Screenshot 2025-03-12 at 4 24 36 PM"
src="https://github.com/user-attachments/assets/27283432-4f94-49f3-9d61-a0a9c737de40"
/>


Release Notes:

- N/A
2025-03-12 20:51:29 +00:00
Peter Tripp
009b90291e Fix formatting in linux.md (#26598)
Merge queue did not require docs tests to pass:
-
https://github.com/zed-industries/zed/actions/runs/13820880465/job/38665664419

This will be fixed with:
- https://github.com/zed-industries/zed/pull/26551

cc: @ConradIrwin 

Release Notes:

- N/A
2025-03-12 16:33:11 -04:00
Michael Kaplan
8b17dc66f6 docs: Document linker issue & workarounds with GCC >= 14 (#26579)
Closes #24880

Documents issues with `aws-lc-rs` and GCC >= 14 on Linux, and provides a
workaround until the issues are fixed in `aws-lc-rs`.
2025-03-12 20:26:08 +00:00
Conrad Irwin
de07b712fd Fix message on push (#26588)
Instead of saying "Successfully pushed new branch", we now say "Pushed x
to y".

Release Notes:

- N/A
2025-03-12 20:18:28 +00:00
Richard Feldman
be8f3b3791 Add delete-path tool (#26590)
Release Notes:

- N/A
2025-03-12 20:16:26 +00:00
Richard Feldman
3131b0459f Return which files were touched in the edit tool (#26564)
<img width="631" alt="Screenshot 2025-03-12 at 12 56 43 PM"
src="https://github.com/user-attachments/assets/9ab84a53-829a-4943-ae76-b1d97ee31f55"
/>

<img width="908" alt="Screenshot 2025-03-12 at 12 57 12 PM"
src="https://github.com/user-attachments/assets/bd246231-6c92-4266-b61e-5293adfe2ba0"
/>

Release Notes:

- N/A
2025-03-12 15:56:23 -04:00
Marshall Bowers
3ec323ce0d uiua: Extract to zed-extensions/uiua repository (#26587)
This PR extracts the Uiua extension to the
[zed-extensions/uiua](https://github.com/zed-extensions/uiua)
repository.

Release Notes:

- N/A
2025-03-12 19:55:37 +00:00
Conrad Irwin
c8b782d870 git: Hard wrap in editor (#26507)
This adds the ability for the editor to implement hard wrap (similar to
`textwidth` in vim).

If you are typing and your line extends beyond the limit, a newline is
inserted before the most recent space on the line. If you are otherwise
editing the line, pasting, etc., then you will need to rewrap manually.

Release Notes:

- git: Commit messages are now wrapped "as you type" to 72 characters.
2025-03-12 13:48:13 -06:00
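The as-you-type wrapping rule above can be sketched as a small function; this is an illustration of the described behavior, not Zed's implementation, and it assumes ASCII input for simplicity:

```rust
// Break a line that has grown past `limit` columns at its most recent space,
// as the commit describes; a line without a usable space is left alone.
fn hard_wrap(line: &str, limit: usize) -> String {
    if line.len() <= limit {
        return line.to_string();
    }
    match line[..limit].rfind(' ') {
        Some(idx) => format!("{}\n{}", &line[..idx], &line[idx + 1..]),
        None => line.to_string(),
    }
}

fn main() {
    assert_eq!(hard_wrap("fix: wrap commit messages", 10), "fix: wrap\ncommit messages");
    assert_eq!(hard_wrap("short line", 72), "short line");
}
```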
Conrad Irwin
7bca15704b Git on main thread (#26573)
This moves spawning of the git subprocess to the main thread. We're not
yet sure why, but when we spawn a process using GCD's background queues,
sub-processes like git-credential-manager fail to open windows.

This seems to be fixable either by using the main thread or by using a
standard background thread, but for now we use the main thread.


Release Notes:

- Git: Fix git-credential-manager

---------

Co-authored-by: Max Brunsfeld <maxbrunsfeld@gmail.com>
Co-authored-by: Kirill Bulatov <mail4score@gmail.com>
2025-03-12 19:39:30 +00:00
Kirill Bulatov
5268e74315 Properly handle goto single file worktrees during terminal cmd-clicks (#26582)
Closes https://github.com/zed-industries/zed/issues/26431
Follow-up of https://github.com/zed-industries/zed/pull/26174

`path_with_position.path.strip_prefix(&worktree_root)` used in the PR is
wrong for single-file worktrees, where it returns empty paths that
result in incorrect project and FS entries being accessed.

Release Notes:

- Fixed goto for single-file worktrees during terminal cmd-clicks
2025-03-12 19:38:21 +00:00
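The failure mode is easy to reproduce with `std::path` directly: for a single-file worktree the root *is* the clicked path, so `strip_prefix` yields an empty relative path. A hypothetical guard (not the PR's exact fix):

```rust
use std::path::Path;

fn main() {
    // In a single-file worktree, the worktree root is the file itself.
    let worktree_root = Path::new("/home/user/notes.txt");
    let clicked = Path::new("/home/user/notes.txt");

    // Stripping the root from the same path yields "", which then looks
    // like a (nonexistent) entry inside the worktree.
    let relative = clicked.strip_prefix(worktree_root).unwrap();
    assert_eq!(relative, Path::new(""));

    // Detect the single-file case instead of treating "" as a real
    // project path.
    let is_single_file_worktree = relative.as_os_str().is_empty();
    assert!(is_single_file_worktree);
}
```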
Kirill Bulatov
91c209900b Support word-based completions (#26410)
Closes https://github.com/zed-industries/zed/issues/4957


https://github.com/user-attachments/assets/ff491378-376d-48ec-b552-6cc80f74200b

Adds a `"completions"` language settings section to configure LSP and
word-based completions per language.
Word-based completions may be turned on never, always (returned along
with the LSP ones), or as a fallback when no LSP completion items are
returned.

Future work:

* Words are matched with the same fuzzy-matching code as the rest of the
completions. This might worsen the completion menu's usability even
more, and will require work on better completion sorting.

* Completion entries currently have no icons or other way to indicate
whether they come from the LSP, from word search, or from something else.

* We may work with language scopes more intelligently, grouping words by
scope and distinguishing them during completion.

Release Notes:

- Supported word-based completions

---------

Co-authored-by: Max Brunsfeld <max@zed.dev>
2025-03-12 21:27:10 +02:00
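The three modes described above (never, always, fallback) reduce to a small merge rule; the enum and function here are an illustrative sketch, not Zed's actual types:

```rust
enum WordsMode {
    Enabled,  // always return document words alongside LSP items
    Fallback, // use document words only when the LSP returned nothing
    Disabled, // never use document words
}

fn merge_completions(mode: WordsMode, lsp: Vec<String>, words: Vec<String>) -> Vec<String> {
    match mode {
        WordsMode::Enabled => lsp.into_iter().chain(words).collect(),
        WordsMode::Fallback if lsp.is_empty() => words,
        WordsMode::Fallback | WordsMode::Disabled => lsp,
    }
}

fn main() {
    let words = || vec!["worktree".to_string()];
    let lsp = || vec!["Worktree::new".to_string()];
    assert_eq!(merge_completions(WordsMode::Fallback, vec![], words()), words());
    assert_eq!(merge_completions(WordsMode::Fallback, lsp(), words()), lsp());
    assert_eq!(merge_completions(WordsMode::Enabled, lsp(), words()).len(), 2);
    assert!(merge_completions(WordsMode::Disabled, vec![], words()).is_empty());
}
```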
Anthony Eid
74c29f1818 Fix unstage/stage in project diff not working when git panel isn't open (#26575)
Closes #ISSUE

Release Notes:

- Fixed a bug where unstage/stage all in the project diff wouldn't work
while the git panel was closed

Co-authored-by: Conrad Irwin <conrad.irwin@gmail.com>
2025-03-12 19:07:51 +00:00
Marshall Bowers
5858e61327 purescript: Extract to zed-extensions/purescript repository (#26571)
This PR extracts the PureScript extension to the
[zed-extensions/purescript](https://github.com/zed-extensions/purescript)
repository.

Release Notes:

- N/A
2025-03-12 18:42:12 +00:00
Martim Aires de Sousa
21cf2e38c5 Fix pane magnification causing mouse to drag tabs unexpectedly (#26383)
Previously, if a user clicked a button and moved the cursor out before
releasing, the click event was correctly prevented, but the pending
mouse-down state remained.
This caused unintended drags when the UI shifted due to magnification
settings.

Now, mouse-up clears the pending state:
- If over the button → clear state and trigger click handlers.
- If outside the button → clear state without triggering a click.

This avoids accidental drags while preserving expected click behavior.

Closes #24600

Release Notes:

- N/A

---------

Co-authored-by: Ben Kunkle <ben@zed.dev>
2025-03-12 13:32:42 -05:00
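The fix's state machine can be sketched as follows (hypothetical types; the real change lives in GPUI's mouse handling):

```rust
// Mouse-up always clears the pending mouse-down state; a click only fires
// when the cursor is still over the element, as the commit describes.
#[derive(Default)]
struct ClickState {
    pending_mouse_down: bool,
    clicks: usize,
}

impl ClickState {
    fn mouse_down(&mut self) {
        self.pending_mouse_down = true;
    }

    fn mouse_up(&mut self, over_element: bool) {
        // take() resets the flag unconditionally, so a later layout shift
        // (e.g. pane magnification) can no longer start an accidental drag.
        if std::mem::take(&mut self.pending_mouse_down) && over_element {
            self.clicks += 1;
        }
    }
}

fn main() {
    let mut state = ClickState::default();
    state.mouse_down();
    state.mouse_up(false); // released outside: no click, state cleared
    assert_eq!(state.clicks, 0);
    assert!(!state.pending_mouse_down);

    state.mouse_down();
    state.mouse_up(true); // released over the element: click fires
    assert_eq!(state.clicks, 1);
}
```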
Marshall Bowers
a3ca5554fd zig: Extract to zed-extensions/zig repository (#26569)
This PR extracts the Zig extension to the
[zed-extensions/zig](https://github.com/zed-extensions/zig) repository.

Release Notes:

- N/A
2025-03-12 18:28:26 +00:00
Marshall Bowers
acf9b22466 extension: Add ExtensionEvents for listening to extension-related events (#26562)
This PR adds a new `ExtensionEvents` event bus that can be used to
listen for extension-related events throughout the app.

Today you need to have a handle to the `ExtensionStore` (which entails
depending on `extension_host`) in order to listen for extension events.

With this change, subscribers only need to depend on `extension`, which
has a leaner dependency graph.

Release Notes:

- N/A
2025-03-12 17:01:52 +00:00
Joseph T. Lyons
ffcd023f83 Bump Zed to v0.179 (#26563)
Release Notes:

- N/A
2025-03-12 12:53:37 -04:00
98 changed files with 3597 additions and 2733 deletions


@@ -23,9 +23,47 @@ env:
RUST_BACKTRACE: 1
jobs:
job_spec:
name: Decide which jobs to run
if: github.repository_owner == 'zed-industries'
outputs:
run_tests: ${{ steps.filter.outputs.run_tests }}
runs-on:
- ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
# 350 is arbitrary; ~10days of history on main (5secs); full history is ~25secs
fetch-depth: ${{ github.ref == 'refs/heads/main' && 2 || 350 }}
- name: Fetch git history and generate output filters
id: filter
run: |
if [ -z "$GITHUB_BASE_REF" ]; then
echo "Not in a PR context (i.e., push to main/stable/preview)"
COMPARE_REV=$(git rev-parse HEAD~1)
else
echo "In a PR context comparing to pull_request.base.ref"
git fetch origin "$GITHUB_BASE_REF" --depth=350
COMPARE_REV=$(git merge-base "origin/${GITHUB_BASE_REF}" HEAD)
fi
if [[ $(git diff --name-only $COMPARE_REV ${{ github.sha }} | grep -v "^docs/") ]]; then
echo "run_tests=true" >> $GITHUB_OUTPUT
else
echo "run_tests=false" >> $GITHUB_OUTPUT
fi
if [[ $(git diff --name-only $COMPARE_REV ${{ github.sha }} | grep '^Cargo.lock') ]]; then
echo "run_license=true" >> $GITHUB_OUTPUT
else
echo "run_license=false" >> $GITHUB_OUTPUT
fi
migration_checks:
name: Check Postgres and Protobuf migrations, mergability
if: github.repository_owner == 'zed-industries'
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
timeout-minutes: 60
runs-on:
- self-hosted
@@ -69,6 +107,7 @@ jobs:
style:
timeout-minutes: 60
name: Check formatting and spelling
needs: [job_spec]
if: github.repository_owner == 'zed-industries'
runs-on:
- buildjet-8vcpu-ubuntu-2204
@@ -76,6 +115,21 @@ jobs:
- name: Checkout repo
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:
version: 9
- name: Prettier Check on /docs
working-directory: ./docs
run: |
pnpm dlx prettier@${PRETTIER_VERSION} . --check || {
echo "To fix, run from the root of the zed repo:"
echo " cd docs && pnpm dlx prettier@${PRETTIER_VERSION} . --write && cd .."
false
}
env:
PRETTIER_VERSION: 3.5.0
# To support writing comments that they will certainly be revisited.
- name: Check for todo! and FIXME comments
run: script/check-todos
@@ -91,7 +145,10 @@ jobs:
macos_tests:
timeout-minutes: 60
name: (macOS) Run Clippy and tests
if: github.repository_owner == 'zed-industries'
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- self-hosted
- test
@@ -123,7 +180,9 @@ jobs:
- name: Check licenses
run: |
script/check-licenses
script/generate-licenses /tmp/zed_licenses_output
if [[ "${{ needs.job_spec.outputs.run_license }}" == "true" ]]; then
script/generate-licenses /tmp/zed_licenses_output
fi
- name: Check for new vulnerable dependencies
if: github.event_name == 'pull_request'
@@ -154,7 +213,10 @@ jobs:
linux_tests:
timeout-minutes: 60
name: (Linux) Run Clippy and tests
if: github.repository_owner == 'zed-industries'
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- buildjet-16vcpu-ubuntu-2204
steps:
@@ -203,9 +265,12 @@ jobs:
build_remote_server:
timeout-minutes: 60
name: (Linux) Build Remote Server
if: github.repository_owner == 'zed-industries'
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on:
- buildjet-16vcpu-ubuntu-2204
- buildjet-8vcpu-ubuntu-2204
steps:
- name: Add Rust to the PATH
run: echo "$HOME/.cargo/bin" >> $GITHUB_PATH
@@ -239,21 +304,12 @@ jobs:
windows_clippy:
timeout-minutes: 60
name: (Windows) Run Clippy
if: github.repository_owner == 'zed-industries'
runs-on: hosted-windows-2
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on: windows-2025-16
steps:
# Temporarily Collect some metadata about the hardware behind our runners.
- name: GHA Runner Info
run: |
Invoke-RestMethod -Headers @{"Metadata"="true"} -Method GET -Uri "http://169.254.169.254/metadata/instance/compute?api-version=2023-07-01" |
ConvertTo-Json -Depth 10 |
jq "{ vm_size: .vmSize, location: .location, os_disk_gb: (.storageProfile.osDisk.diskSizeGB | tonumber), rs_disk_gb: (.storageProfile.resourceDisk.size | tonumber / 1024) }"
@{
Cores = (Get-CimInstance Win32_Processor).NumberOfCores
vCPUs = (Get-CimInstance Win32_Processor).NumberOfLogicalProcessors
RamGb = [math]::Round((Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory / 1GB, 2)
cpuid = (Get-CimInstance Win32_Processor).Name.Trim()
} | ConvertTo-Json
# more info here:- https://github.com/rust-lang/cargo/issues/13020
- name: Enable longer pathnames for git
run: git config --system core.longpaths true
@@ -306,21 +362,12 @@ jobs:
windows_tests:
timeout-minutes: 60
name: (Windows) Run Tests
if: ${{ github.repository_owner == 'zed-industries' && (github.ref == 'refs/heads/main' || contains(github.event.pull_request.labels.*.name, 'windows')) }}
runs-on: hosted-windows-2
needs: [job_spec]
if: |
github.repository_owner == 'zed-industries' &&
needs.job_spec.outputs.run_tests == 'true'
runs-on: windows-2025-64
steps:
# Temporarily Collect some metadata about the hardware behind our runners.
- name: GHA Runner Info
run: |
Invoke-RestMethod -Headers @{"Metadata"="true"} -Method GET -Uri "http://169.254.169.254/metadata/instance/compute?api-version=2023-07-01" |
ConvertTo-Json -Depth 10 |
jq "{ vm_size: .vmSize, location: .location, os_disk_gb: (.storageProfile.osDisk.diskSizeGB | tonumber), rs_disk_gb: (.storageProfile.resourceDisk.size | tonumber / 1024) }"
@{
Cores = (Get-CimInstance Win32_Processor).NumberOfCores
vCPUs = (Get-CimInstance Win32_Processor).NumberOfLogicalProcessors
RamGb = [math]::Round((Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory / 1GB, 2)
cpuid = (Get-CimInstance Win32_Processor).Name.Trim()
} | ConvertTo-Json
# more info here:- https://github.com/rust-lang/cargo/issues/13020
- name: Enable longer pathnames for git
run: git config --system core.longpaths true
@@ -372,13 +419,44 @@ jobs:
Remove-Item -Path "${{ env.CARGO_HOME }}/config.toml" -Force
}
tests_pass:
name: Tests Pass
runs-on: ubuntu-latest
needs:
- job_spec
- style
- migration_checks
- linux_tests
- build_remote_server
- macos_tests
- windows_clippy
- windows_tests
if: |
always() && (
needs.style.result == 'success'
&& (
needs.job_spec.outputs.run_tests == 'false'
|| (needs.macos_tests.result == 'success'
&& needs.linux_tests.result == 'success'
&& needs.windows_tests.result == 'success'
&& needs.windows_clippy.result == 'success'
&& needs.build_remote_server.result == 'success'
&& needs.migration_checks.result == 'success')
)
)
steps:
- name: All tests passed
run: echo "All tests passed successfully!"
bundle-mac:
timeout-minutes: 120
name: Create a macOS bundle
runs-on:
- self-hosted
- bundle
if: ${{ startsWith(github.ref, 'refs/tags/v') || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
if: |
startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
needs: [macos_tests]
env:
MACOS_CERTIFICATE: ${{ secrets.MACOS_CERTIFICATE }}
@@ -468,7 +546,9 @@ jobs:
name: Linux x86_x64 release bundle
runs-on:
- buildjet-16vcpu-ubuntu-2004
if: ${{ startsWith(github.ref, 'refs/tags/v') || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
if: |
startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
needs: [linux_tests]
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
@@ -485,7 +565,7 @@ jobs:
run: ./script/linux && ./script/install-mold 2.34.0
- name: Determine version and release channel
if: ${{ startsWith(github.ref, 'refs/tags/v') }}
if: startsWith(github.ref, 'refs/tags/v')
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
@@ -495,14 +575,18 @@ jobs:
- name: Upload Linux bundle to workflow run if main branch or specific label
uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
if: |
github.ref == 'refs/heads/main'
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: Upload Linux remote server to workflow run if main branch or specific label
uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
if: |
github.ref == 'refs/heads/main'
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.gz
path: target/zed-remote-server-linux-x86_64.gz
@@ -523,7 +607,9 @@ jobs:
name: Linux arm64 release bundle
runs-on:
- buildjet-16vcpu-ubuntu-2204-arm
if: ${{ startsWith(github.ref, 'refs/tags/v') || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
if: |
startsWith(github.ref, 'refs/tags/v')
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
needs: [linux_tests]
env:
ZED_CLIENT_CHECKSUM_SEED: ${{ secrets.ZED_CLIENT_CHECKSUM_SEED }}
@@ -540,7 +626,7 @@ jobs:
run: ./script/linux
- name: Determine version and release channel
if: ${{ startsWith(github.ref, 'refs/tags/v') }}
if: startsWith(github.ref, 'refs/tags/v')
run: |
# This exports RELEASE_CHANNEL into env (GITHUB_ENV)
script/determine-release-channel
@@ -550,14 +636,18 @@ jobs:
- name: Upload Linux bundle to workflow run if main branch or specific label
uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
if: |
github.ref == 'refs/heads/main'
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.tar.gz
path: target/release/zed-*.tar.gz
- name: Upload Linux remote server to workflow run if main branch or specific label
uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4
if: ${{ github.ref == 'refs/heads/main' }} || contains(github.event.pull_request.labels.*.name, 'run-bundling') }}
if: |
github.ref == 'refs/heads/main'
|| contains(github.event.pull_request.labels.*.name, 'run-bundling')
with:
name: zed-remote-server-${{ github.event.pull_request.head.sha || github.sha }}-aarch64-unknown-linux-gnu.gz
path: target/zed-remote-server-linux-aarch64.gz
@@ -575,7 +665,9 @@ jobs:
auto-release-preview:
name: Auto release preview
if: ${{ startsWith(github.ref, 'refs/tags/v') && endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre') }}
if: |
startsWith(github.ref, 'refs/tags/v')
&& endsWith(github.ref, '-pre') && !endsWith(github.ref, '.0-pre')
needs: [bundle-mac, bundle-linux-x86_x64, bundle-linux-aarch64]
runs-on:
- self-hosted


@@ -1,39 +0,0 @@
name: Docs
on:
pull_request:
paths:
- "docs/**"
push:
branches:
- main
jobs:
check_formatting:
name: "Check formatting"
if: github.repository_owner == 'zed-industries'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
- uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:
version: 9
- name: Prettier Check on /docs
working-directory: ./docs
run: |
pnpm dlx prettier@${PRETTIER_VERSION} . --check || {
echo "To fix, run from the root of the zed repo:"
echo " cd docs && pnpm dlx prettier@${PRETTIER_VERSION} . --write && cd .."
false
}
env:
PRETTIER_VERSION: 3.5.0
- name: Check for Typos with Typos-CLI
uses: crate-ci/typos@8e6a4285bcbde632c5d79900a7779746e8b7ea3f # v1.24.6
with:
config: ./typos.toml
files: ./docs/

Cargo.lock (generated), 382 changes: file diff suppressed because it is too large.


@@ -174,14 +174,11 @@ members = [
"extensions/html",
"extensions/perplexity",
"extensions/proto",
"extensions/purescript",
"extensions/ruff",
"extensions/slash-commands-example",
"extensions/snippets",
"extensions/test-extension",
"extensions/toml",
"extensions/uiua",
"extensions/zig",
#
# Tooling


@@ -336,14 +336,14 @@
"active_line_width": 1,
// Determines how indent guides are colored.
// This setting can take the following three values:
///
//
// 1. "disabled"
// 2. "fixed"
// 3. "indent_aware"
"coloring": "fixed",
// Determines how indent guide backgrounds are colored.
// This setting can take the following two values:
///
//
// 1. "disabled"
// 2. "indent_aware"
"background_coloring": "disabled"
@@ -402,8 +402,8 @@
// Time to wait after scrolling the buffer, before requesting the hints,
// set to 0 to disable debouncing.
"scroll_debounce_ms": 50,
/// A set of modifiers which, when pressed, will toggle the visibility of inlay hints.
/// If the set if empty or not all the modifiers specified are pressed, inlay hints will not be toggled.
// A set of modifiers which, when pressed, will toggle the visibility of inlay hints.
// If the set if empty or not all the modifiers specified are pressed, inlay hints will not be toggled.
"toggle_on_modifiers_press": {
"control": false,
"shift": false,
@@ -440,7 +440,7 @@
"scrollbar": {
// When to show the scrollbar in the project panel.
// This setting can take five values:
///
//
// 1. null (default): Inherit editor settings
// 2. Show the scrollbar if there's important information or
// follow the system's configured behavior (default):
@@ -455,7 +455,7 @@
},
// Which files containing diagnostic errors/warnings to mark in the project panel.
// This setting can take the following three values:
///
//
// 1. Do not mark any files:
// "off"
// 2. Only mark files with errors:
@@ -512,7 +512,7 @@
"scrollbar": {
// When to show the scrollbar in the project panel.
// This setting can take five values:
///
//
// 1. null (default): Inherit editor settings
// 2. Show the scrollbar if there's important information or
// follow the system's configured behavior (default):
@@ -686,7 +686,7 @@
// Which files containing diagnostic errors/warnings to mark in the tabs.
// Diagnostics are only shown when file icons are also active.
// This setting only works when can take the following three values:
///
//
// 1. Do not mark any files:
// "off"
// 2. Only mark files with errors:
@@ -1014,7 +1014,7 @@
"scrollbar": {
// When to show the scrollbar in the terminal.
// This setting can take five values:
///
//
// 1. null (default): Inherit editor settings
// 2. Show the scrollbar if there's important information or
// follow the system's configured behavior (default):
@@ -1085,6 +1085,31 @@
"auto_install_extensions": {
"html": true
},
// Controls how completions are processed for this language.
"completions": {
// Controls how words are completed.
// For large documents, not all words may be fetched for completion.
//
// May take 3 values:
// 1. "enabled"
// Always fetch document's words for completions.
// 2. "fallback"
// Only if LSP response errors/times out/is empty, use document's words to show completions.
// 3. "disabled"
// Never fetch or complete document's words for completions.
//
// Default: fallback
"words": "fallback",
// Whether to fetch LSP completions or not.
//
// Default: true
"lsp": true,
// When fetching LSP completions, determines how long to wait for a response of a particular server.
// When set to 0, waits indefinitely.
//
// Default: 500
"lsp_fetch_timeout_ms": 500
},
// Different settings for specific languages.
"languages": {
"Astro": {


@@ -96,9 +96,9 @@
"terminal.ansi.bright_white": "#dce0e5ff",
"terminal.ansi.dim_white": "#575d65ff",
"link_text.hover": "#74ade8ff",
"version_control_added": "#a7c088ff",
"version_control_modified": "#dec184ff",
"version_control_deleted": "#d07277ff",
"version_control.added": "#27a657ff",
"version_control.modified": "#d3b020ff",
"version_control.deleted": "#e06c76ff",
"conflict": "#dec184ff",
"conflict.background": "#dec1841a",
"conflict.border": "#5d4c2fff",
@@ -475,9 +475,9 @@
"terminal.ansi.bright_white": "#242529ff",
"terminal.ansi.dim_white": "#97979aff",
"link_text.hover": "#5c78e2ff",
"version_control_added": "#669f59ff",
"version_control_modified": "#a48819ff",
"version_control_deleted": "#d36151ff",
"version_control.added": "#27a657ff",
"version_control.modified": "#d3b020ff",
"version_control.deleted": "#e06c76ff",
"conflict": "#a48819ff",
"conflict.background": "#faf2e6ff",
"conflict.border": "#f4e7d1ff",


@@ -20,13 +20,14 @@ impl ToolSelector {
cx: &mut Context<Self>,
) -> Entity<ContextMenu> {
ContextMenu::build(window, cx, |mut menu, _window, cx| {
let icon_position = IconPosition::End;
let tools_by_source = self.tools.tools_by_source(cx);
let all_tools_enabled = self.tools.are_all_tools_enabled();
menu = menu.header("Tools").toggleable_entry(
"All Tools",
all_tools_enabled,
IconPosition::End,
icon_position,
None,
{
let tools = self.tools.clone();
@@ -61,31 +62,51 @@ impl ToolSelector {
tools.sort_by(|(_, name_a, _), (_, name_b, _)| name_a.cmp(name_b));
}
menu = match source {
menu = match &source {
ToolSource::Native => menu.header("Zed"),
ToolSource::ContextServer { id } => menu.separator().header(id),
ToolSource::ContextServer { id } => {
let all_tools_from_source_enabled =
self.tools.are_all_tools_from_source_enabled(&source);
menu.separator().header(id).toggleable_entry(
"All Tools",
all_tools_from_source_enabled,
icon_position,
None,
{
let tools = self.tools.clone();
let source = source.clone();
move |_window, cx| {
if all_tools_from_source_enabled {
tools.disable_source(source.clone(), cx);
} else {
tools.enable_source(&source);
}
}
},
)
}
};
for (source, name, is_enabled) in tools {
menu =
menu.toggleable_entry(name.clone(), is_enabled, IconPosition::End, None, {
let tools = self.tools.clone();
move |_window, _cx| {
if name.as_ref() == ScriptingTool::NAME {
if is_enabled {
tools.disable_scripting_tool();
} else {
tools.enable_scripting_tool();
}
menu = menu.toggleable_entry(name.clone(), is_enabled, icon_position, None, {
let tools = self.tools.clone();
move |_window, _cx| {
if name.as_ref() == ScriptingTool::NAME {
if is_enabled {
tools.disable_scripting_tool();
} else {
if is_enabled {
tools.disable(source.clone(), &[name.clone()]);
} else {
tools.enable(source.clone(), &[name.clone()]);
}
tools.enable_scripting_tool();
}
} else {
if is_enabled {
tools.disable(source.clone(), &[name.clone()]);
} else {
tools.enable(source.clone(), &[name.clone()]);
}
}
});
}
});
}
}


@@ -46,56 +46,119 @@ impl ToolWorkingSet {
}
pub fn tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
let mut tools = ToolRegistry::global(cx).tools();
tools.extend(
self.state
.lock()
.context_server_tools_by_id
.values()
.cloned(),
);
self.state.lock().tools(cx)
}
tools
pub fn tools_by_source(&self, cx: &App) -> IndexMap<ToolSource, Vec<Arc<dyn Tool>>> {
self.state.lock().tools_by_source(cx)
}
pub fn are_all_tools_enabled(&self) -> bool {
let state = self.state.lock();
state.disabled_tools_by_source.is_empty() && !state.is_scripting_tool_disabled
}
pub fn are_all_tools_from_source_enabled(&self, source: &ToolSource) -> bool {
let state = self.state.lock();
!state.disabled_tools_by_source.contains_key(source)
}
pub fn enabled_tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
self.state.lock().enabled_tools(cx)
}
pub fn enable_all_tools(&self) {
let mut state = self.state.lock();
state.disabled_tools_by_source.clear();
state.is_scripting_tool_disabled = false;
state.enable_scripting_tool();
}
pub fn disable_all_tools(&self, cx: &App) {
let tools = self.tools_by_source(cx);
for (source, tools) in tools {
let tool_names = tools
.into_iter()
.map(|tool| tool.name().into())
.collect::<Vec<_>>();
self.disable(source, &tool_names);
}
self.disable_scripting_tool();
let mut state = self.state.lock();
state.disable_all_tools(cx);
}
pub fn enabled_tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
let all_tools = self.tools(cx);
all_tools
.into_iter()
.filter(|tool| self.is_enabled(&tool.source(), &tool.name().into()))
.collect()
pub fn enable_source(&self, source: &ToolSource) {
let mut state = self.state.lock();
state.enable_source(source);
}
pub fn tools_by_source(&self, cx: &App) -> IndexMap<ToolSource, Vec<Arc<dyn Tool>>> {
pub fn disable_source(&self, source: ToolSource, cx: &App) {
let mut state = self.state.lock();
state.disable_source(source, cx);
}
pub fn insert(&self, tool: Arc<dyn Tool>) -> ToolId {
let mut state = self.state.lock();
let tool_id = state.next_tool_id;
state.next_tool_id.0 += 1;
state
.context_server_tools_by_id
.insert(tool_id, tool.clone());
state.tools_changed();
tool_id
}
pub fn is_enabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
self.state.lock().is_enabled(source, name)
}
pub fn is_disabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
self.state.lock().is_disabled(source, name)
}
pub fn enable(&self, source: ToolSource, tools_to_enable: &[Arc<str>]) {
let mut state = self.state.lock();
state.enable(source, tools_to_enable);
}
pub fn disable(&self, source: ToolSource, tools_to_disable: &[Arc<str>]) {
let mut state = self.state.lock();
state.disable(source, tools_to_disable);
}
pub fn remove(&self, tool_ids_to_remove: &[ToolId]) {
let mut state = self.state.lock();
state
.context_server_tools_by_id
.retain(|id, _| !tool_ids_to_remove.contains(id));
state.tools_changed();
}
pub fn is_scripting_tool_enabled(&self) -> bool {
let state = self.state.lock();
!state.is_scripting_tool_disabled
}
pub fn enable_scripting_tool(&self) {
let mut state = self.state.lock();
state.enable_scripting_tool();
}
pub fn disable_scripting_tool(&self) {
let mut state = self.state.lock();
state.disable_scripting_tool();
}
}
impl WorkingSetState {
fn tools_changed(&mut self) {
self.context_server_tools_by_name.clear();
self.context_server_tools_by_name.extend(
self.context_server_tools_by_id
.values()
.map(|tool| (tool.name(), tool.clone())),
);
}
fn tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
let mut tools = ToolRegistry::global(cx).tools();
tools.extend(self.context_server_tools_by_id.values().cloned());
tools
}
fn tools_by_source(&self, cx: &App) -> IndexMap<ToolSource, Vec<Arc<dyn Tool>>> {
let mut tools_by_source = IndexMap::default();
for tool in self.tools(cx) {
@@ -114,78 +177,78 @@ impl ToolWorkingSet {
tools_by_source
}
pub fn insert(&self, tool: Arc<dyn Tool>) -> ToolId {
let mut state = self.state.lock();
let tool_id = state.next_tool_id;
state.next_tool_id.0 += 1;
state
.context_server_tools_by_id
.insert(tool_id, tool.clone());
state.tools_changed();
tool_id
fn enabled_tools(&self, cx: &App) -> Vec<Arc<dyn Tool>> {
let all_tools = self.tools(cx);
all_tools
.into_iter()
.filter(|tool| self.is_enabled(&tool.source(), &tool.name().into()))
.collect()
}
pub fn is_enabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
fn is_enabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
!self.is_disabled(source, name)
}
pub fn is_disabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
let state = self.state.lock();
state
.disabled_tools_by_source
fn is_disabled(&self, source: &ToolSource, name: &Arc<str>) -> bool {
self.disabled_tools_by_source
.get(source)
.map_or(false, |disabled_tools| disabled_tools.contains(name))
}
pub fn enable(&self, source: ToolSource, tools_to_enable: &[Arc<str>]) {
let mut state = self.state.lock();
state
.disabled_tools_by_source
fn enable(&mut self, source: ToolSource, tools_to_enable: &[Arc<str>]) {
self.disabled_tools_by_source
.entry(source)
.or_default()
.retain(|name| !tools_to_enable.contains(name));
}
pub fn disable(&self, source: ToolSource, tools_to_disable: &[Arc<str>]) {
let mut state = self.state.lock();
state
.disabled_tools_by_source
fn disable(&mut self, source: ToolSource, tools_to_disable: &[Arc<str>]) {
self.disabled_tools_by_source
.entry(source)
.or_default()
.extend(tools_to_disable.into_iter().cloned());
}
pub fn remove(&self, tool_ids_to_remove: &[ToolId]) {
let mut state = self.state.lock();
state
.context_server_tools_by_id
.retain(|id, _| !tool_ids_to_remove.contains(id));
state.tools_changed();
fn enable_source(&mut self, source: &ToolSource) {
self.disabled_tools_by_source.remove(source);
}
pub fn is_scripting_tool_enabled(&self) -> bool {
let state = self.state.lock();
!state.is_scripting_tool_disabled
}
fn disable_source(&mut self, source: ToolSource, cx: &App) {
let tools_by_source = self.tools_by_source(cx);
let Some(tools) = tools_by_source.get(&source) else {
return;
};
pub fn enable_scripting_tool(&self) {
let mut state = self.state.lock();
state.is_scripting_tool_disabled = false;
}
pub fn disable_scripting_tool(&self) {
let mut state = self.state.lock();
state.is_scripting_tool_disabled = true;
}
}
impl WorkingSetState {
fn tools_changed(&mut self) {
self.context_server_tools_by_name.clear();
self.context_server_tools_by_name.extend(
self.context_server_tools_by_id
.values()
.map(|tool| (tool.name(), tool.clone())),
self.disabled_tools_by_source.insert(
source,
tools
.into_iter()
.map(|tool| tool.name().into())
.collect::<HashSet<_>>(),
);
}
fn disable_all_tools(&mut self, cx: &App) {
let tools = self.tools_by_source(cx);
for (source, tools) in tools {
let tool_names = tools
.into_iter()
.map(|tool| tool.name().into())
.collect::<Vec<_>>();
self.disable(source, &tool_names);
}
self.disable_scripting_tool();
}
fn enable_scripting_tool(&mut self) {
self.is_scripting_tool_disabled = false;
}
fn disable_scripting_tool(&mut self) {
self.is_scripting_tool_disabled = true;
}
}
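The refactor in this file moves locking out of every public method: `ToolWorkingSet` acquires the mutex once and delegates to `&mut self` methods on the inner `WorkingSetState`. A minimal sketch of that pattern, with hypothetical simplified types (`Tools`/`State` stand in for `ToolWorkingSet`/`WorkingSetState`, and a plain `Vec` for the disabled-tools map):

```rust
use std::sync::Mutex;

// Public wrapper: locks the state once per call, then delegates.
struct Tools {
    state: Mutex<State>,
}

// Inner state: plain `&mut self` methods, no locking concerns.
#[derive(Default)]
struct State {
    disabled: Vec<String>,
}

impl Tools {
    fn disable(&self, name: &str) {
        self.state.lock().unwrap().disable(name);
    }
    fn is_disabled(&self, name: &str) -> bool {
        self.state.lock().unwrap().is_disabled(name)
    }
}

impl State {
    fn disable(&mut self, name: &str) {
        self.disabled.push(name.to_string());
    }
    fn is_disabled(&self, name: &str) -> bool {
        self.disabled.iter().any(|n| n == name)
    }
}

fn main() {
    let tools = Tools { state: Mutex::new(State::default()) };
    tools.disable("bash");
    assert!(tools.is_disabled("bash"));
    assert!(!tools.is_disabled("read-file"));
}
```

Keeping the logic on the inner state makes compound operations (like `disable_all_tools`) hold the lock once instead of re-locking per sub-call.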


@@ -1,15 +1,21 @@
mod bash_tool;
mod delete_path_tool;
mod edit_files_tool;
mod list_directory_tool;
mod now_tool;
mod path_search_tool;
mod read_file_tool;
mod regex_search;
use assistant_tool::ToolRegistry;
use gpui::App;
use crate::bash_tool::BashTool;
use crate::delete_path_tool::DeletePathTool;
use crate::edit_files_tool::EditFilesTool;
use crate::list_directory_tool::ListDirectoryTool;
use crate::now_tool::NowTool;
use crate::path_search_tool::PathSearchTool;
use crate::read_file_tool::ReadFileTool;
use crate::regex_search::RegexSearchTool;
@@ -21,5 +27,8 @@ pub fn init(cx: &mut App) {
registry.register_tool(ReadFileTool);
registry.register_tool(ListDirectoryTool);
registry.register_tool(EditFilesTool);
registry.register_tool(PathSearchTool);
registry.register_tool(RegexSearchTool);
registry.register_tool(DeletePathTool);
registry.register_tool(BashTool);
}


@@ -0,0 +1,70 @@
use anyhow::{anyhow, Result};
use assistant_tool::Tool;
use gpui::{App, Entity, Task};
use language_model::LanguageModelRequestMessage;
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::sync::Arc;
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct BashToolInput {
/// The bash command to execute as a one-liner.
command: String,
}
pub struct BashTool;
impl Tool for BashTool {
fn name(&self) -> String {
"bash".to_string()
}
fn description(&self) -> String {
include_str!("./bash_tool/description.md").to_string()
}
fn input_schema(&self) -> serde_json::Value {
let schema = schemars::schema_for!(BashToolInput);
serde_json::to_value(&schema).unwrap()
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
_messages: &[LanguageModelRequestMessage],
_project: Entity<Project>,
cx: &mut App,
) -> Task<Result<String>> {
let input: BashToolInput = match serde_json::from_value(input) {
Ok(input) => input,
Err(err) => return Task::ready(Err(anyhow!(err))),
};
cx.spawn(|_| async move {
// Add 2>&1 to merge stderr into stdout for proper interleaving
let command = format!("{} 2>&1", input.command);
// Spawn a blocking task to execute the command
let output = futures::executor::block_on(async {
std::process::Command::new("bash")
.arg("-c")
.arg(&command)
.output()
.map_err(|err| anyhow!("Failed to execute bash command: {}", err))
})?;
let output_string = String::from_utf8_lossy(&output.stdout).to_string();
if output.status.success() {
Ok(output_string)
} else {
Ok(format!(
"Command failed with exit code {}\n{}",
output.status.code().unwrap_or(-1),
&output_string
))
}
})
}
}


@@ -0,0 +1 @@
Executes a bash one-liner and returns the combined output. This tool spawns a bash process, combines stdout and stderr into one interleaved stream as they are produced (preserving the order of writes), and captures that stream into a string which is returned. Use this tool when you need to run shell commands to get information about the system or process files.
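The behavior this description promises can be sketched as a standalone function (a hedged sketch, not the tool's exact code: the real tool runs inside a GPUI task and uses `anyhow` errors, and this assumes `bash` is on `PATH`):

```rust
use std::process::Command;

fn run_bash_one_liner(command: &str) -> std::io::Result<String> {
    // `2>&1` merges stderr into stdout so the two streams stay interleaved
    // in the order the process wrote them.
    let merged = format!("{command} 2>&1");
    let output = Command::new("bash").arg("-c").arg(&merged).output()?;
    let text = String::from_utf8_lossy(&output.stdout).to_string();
    if output.status.success() {
        Ok(text)
    } else {
        // Like the tool, report a non-zero exit as normal output, not an error.
        Ok(format!(
            "Command failed with exit code {}\n{}",
            output.status.code().unwrap_or(-1),
            text
        ))
    }
}

fn main() -> std::io::Result<()> {
    let out = run_bash_one_liner("echo hello; echo oops >&2")?;
    // stdout and stderr writes come back in the order they happened.
    assert_eq!(out, "hello\noops\n");
    Ok(())
}
```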


@@ -0,0 +1,165 @@
use anyhow::{anyhow, Result};
use assistant_tool::Tool;
use gpui::{App, Entity, Task};
use language_model::LanguageModelRequestMessage;
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::{fs, path::PathBuf, sync::Arc};
use util::paths::PathMatcher;
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct DeletePathToolInput {
/// The glob to match files in the project to delete.
///
/// <example>
/// If the project has the following files:
///
/// - directory1/a/something.txt
/// - directory2/a/things.txt
/// - directory3/a/other.txt
///
/// You can delete the first two files by providing a glob of "*thing*.txt"
/// </example>
pub glob: String,
}
pub struct DeletePathTool;
impl Tool for DeletePathTool {
fn name(&self) -> String {
"delete-path".into()
}
fn description(&self) -> String {
include_str!("./delete_path_tool/description.md").into()
}
fn input_schema(&self) -> serde_json::Value {
let schema = schemars::schema_for!(DeletePathToolInput);
serde_json::to_value(&schema).unwrap()
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
_messages: &[LanguageModelRequestMessage],
project: Entity<Project>,
cx: &mut App,
) -> Task<Result<String>> {
let glob = match serde_json::from_value::<DeletePathToolInput>(input) {
Ok(input) => input.glob,
Err(err) => return Task::ready(Err(anyhow!(err))),
};
let path_matcher = match PathMatcher::new(&[glob.clone()]) {
Ok(matcher) => matcher,
Err(err) => return Task::ready(Err(anyhow!("Invalid glob: {}", err))),
};
struct Match {
display_path: String,
path: PathBuf,
}
let mut matches = Vec::new();
let mut deleted_paths = Vec::new();
let mut errors = Vec::new();
for worktree_handle in project.read(cx).worktrees(cx) {
let worktree = worktree_handle.read(cx);
let worktree_root = worktree.abs_path().to_path_buf();
// Don't consider ignored entries.
for entry in worktree.entries(false, 0) {
if path_matcher.is_match(&entry.path) {
matches.push(Match {
path: worktree_root.join(&entry.path),
display_path: entry.path.display().to_string(),
});
}
}
}
if matches.is_empty() {
return Task::ready(Ok(format!("No paths in the project matched {glob:?}")));
}
let paths_matched = matches.len();
// Delete the files
for Match { path, display_path } in matches {
match fs::remove_file(&path) {
Ok(()) => {
deleted_paths.push(display_path);
}
Err(file_err) => {
// Try to remove directory if it's not a file. Retrying as a directory
// on error saves a syscall compared to checking whether it's
// a directory up front for every single file.
if let Err(dir_err) = fs::remove_dir_all(&path) {
let error = if path.is_dir() {
format!("Failed to delete directory {}: {dir_err}", display_path)
} else {
format!("Failed to delete file {}: {file_err}", display_path)
};
errors.push(error);
} else {
deleted_paths.push(display_path);
}
}
}
}
if errors.is_empty() {
// 0 deleted paths should never happen if there were no errors;
// we already returned if matches was empty.
let answer = if deleted_paths.len() == 1 {
format!(
"Deleted {}",
deleted_paths.first().unwrap_or(&String::new())
)
} else {
// Sort to group entries in the same directory together
deleted_paths.sort();
let mut buf = format!("Deleted these {} paths:\n", deleted_paths.len());
for path in deleted_paths.iter() {
buf.push('\n');
buf.push_str(path);
}
buf
};
Task::ready(Ok(answer))
} else {
if deleted_paths.is_empty() {
Task::ready(Err(anyhow!(
"{glob:?} matched {} deleted because of {}:\n{}",
if paths_matched == 1 {
"1 path, but it was not".to_string()
} else {
format!("{} paths, but none were", paths_matched)
},
if errors.len() == 1 {
"this error".to_string()
} else {
format!("{} errors", errors.len())
},
errors.join("\n")
)))
} else {
// Sort to group entries in the same directory together
deleted_paths.sort();
Task::ready(Ok(format!(
"Deleted {} paths matching glob {glob:?}:\n{}\n\nErrors:\n{}",
deleted_paths.len(),
deleted_paths.join("\n"),
errors.join("\n")
)))
}
}
}
}


@@ -0,0 +1 @@
Deletes all files and directories in the project which match the given glob, and returns a list of the paths that were deleted.
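The interesting part of the deletion above is the fallback order: try `remove_file` first and retry as a directory on error, which saves a metadata syscall versus checking `is_dir()` up front for every path. A minimal sketch of just that strategy (hypothetical standalone helper, not the tool's exact code):

```rust
use std::fs;
use std::io;
use std::path::Path;

// Delete a path that may be a file or a directory. Trying `remove_file`
// first and falling back to `remove_dir_all` avoids a stat per path.
fn delete_path(path: &Path) -> io::Result<()> {
    match fs::remove_file(path) {
        Ok(()) => Ok(()),
        // Not a plain file (or some other failure): retry as a directory,
        // surfacing the original error if the retry also fails.
        Err(file_err) => fs::remove_dir_all(path).map_err(|_| file_err),
    }
}

fn main() -> io::Result<()> {
    let base = std::env::temp_dir().join("delete_path_demo");
    fs::create_dir_all(base.join("nested"))?;
    fs::write(base.join("nested/file.txt"), "x")?;
    delete_path(&base.join("nested/file.txt"))?; // removes a file
    delete_path(&base)?; // removes a directory tree
    assert!(!base.exists());
    Ok(())
}
```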


@@ -12,6 +12,7 @@ use language_model::{
use project::{Project, ProjectPath};
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::fmt::Write;
use std::sync::Arc;
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
@@ -145,17 +146,29 @@ impl Tool for EditFilesTool {
}
}
let mut answer = match changed_buffers.len() {
0 => "No files were edited.".to_string(),
1 => "Successfully edited ".to_string(),
_ => "Successfully edited these files:\n\n".to_string(),
};
// Save each buffer once at the end
for buffer in changed_buffers {
project
.update(&mut cx, |project, cx| project.save_buffer(buffer, cx))?
.update(&mut cx, |project, cx| {
if let Some(file) = buffer.read(&cx).file() {
let _ = write!(&mut answer, "{}\n\n", &file.path().display());
}
project.save_buffer(buffer, cx)
})?
.await?;
}
let errors = parser.errors();
if errors.is_empty() {
Ok("Successfully applied all edits".into())
Ok(answer.trim_end().to_string())
} else {
let error_message = errors
.iter()


@@ -0,0 +1,88 @@
use anyhow::{anyhow, Result};
use assistant_tool::Tool;
use gpui::{App, Entity, Task};
use language_model::LanguageModelRequestMessage;
use project::Project;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
use std::{path::PathBuf, sync::Arc};
use util::paths::PathMatcher;
#[derive(Debug, Serialize, Deserialize, JsonSchema)]
pub struct PathSearchToolInput {
/// The glob to search all project paths for.
///
/// <example>
/// If the project has the following top-level directories:
///
/// - directory1/a/something.txt
/// - directory2/a/things.txt
/// - directory3/a/other.txt
///
/// You can get back the first two paths by providing a glob of "*thing*.txt"
/// </example>
pub glob: String,
}
pub struct PathSearchTool;
impl Tool for PathSearchTool {
fn name(&self) -> String {
"path-search".into()
}
fn description(&self) -> String {
include_str!("./path_search_tool/description.md").into()
}
fn input_schema(&self) -> serde_json::Value {
let schema = schemars::schema_for!(PathSearchToolInput);
serde_json::to_value(&schema).unwrap()
}
fn run(
self: Arc<Self>,
input: serde_json::Value,
_messages: &[LanguageModelRequestMessage],
project: Entity<Project>,
cx: &mut App,
) -> Task<Result<String>> {
let glob = match serde_json::from_value::<PathSearchToolInput>(input) {
Ok(input) => input.glob,
Err(err) => return Task::ready(Err(anyhow!(err))),
};
let path_matcher = match PathMatcher::new(&[glob.clone()]) {
Ok(matcher) => matcher,
Err(err) => return Task::ready(Err(anyhow!("Invalid glob: {}", err))),
};
let mut matches = Vec::new();
for worktree_handle in project.read(cx).worktrees(cx) {
let worktree = worktree_handle.read(cx);
let root_name = worktree.root_name();
// Don't consider ignored entries.
for entry in worktree.entries(false, 0) {
if path_matcher.is_match(&entry.path) {
matches.push(
PathBuf::from(root_name)
.join(&entry.path)
.to_string_lossy()
.to_string(),
);
}
}
}
if matches.is_empty() {
Task::ready(Ok(format!(
"No paths in the project matched the glob {glob:?}"
)))
} else {
// Sort to group entries in the same directory together.
matches.sort();
Task::ready(Ok(matches.join("\n")))
}
}
}


@@ -0,0 +1 @@
Returns all the paths in the project which match the given glob.
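The matching itself is done by Zed's `PathMatcher`; to illustrate the idea, here is a deliberately simplified stand-in that supports only `*` (any run of characters) rather than full glob syntax:

```rust
// Simplified glob matcher: `*` matches any run of characters.
// (Illustrative only; the real PathMatcher handles full glob syntax.)
fn glob_match(pattern: &str, text: &str) -> bool {
    if !pattern.contains('*') {
        return pattern == text;
    }
    let parts: Vec<&str> = pattern.split('*').collect();
    let last = parts.len() - 1;
    let mut pos = 0;
    for (i, part) in parts.iter().enumerate() {
        if part.is_empty() {
            continue;
        }
        if i == 0 {
            // Literal prefix before the first `*`.
            if !text.starts_with(part) {
                return false;
            }
            pos = part.len();
        } else if i == last {
            // Literal suffix after the final `*`; it must not overlap
            // text already consumed by earlier fragments.
            return text.ends_with(part) && text.len() - part.len() >= pos;
        } else {
            // Middle fragment: advance to its next occurrence.
            match text[pos..].find(part) {
                Some(idx) => pos += idx + part.len(),
                None => return false,
            }
        }
    }
    true
}

fn main() {
    // Matches the example in the input schema above.
    assert!(glob_match("*thing*.txt", "directory1/a/something.txt"));
    assert!(glob_match("*thing*.txt", "directory2/a/things.txt"));
    assert!(!glob_match("*thing*.txt", "directory3/a/other.txt"));
}
```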


@@ -2038,7 +2038,7 @@ async fn test_git_blame_is_forwarded(cx_a: &mut TestAppContext, cx_b: &mut TestA
// client_b now requests git blame for the open buffer
editor_b.update_in(cx_b, |editor_b, window, cx| {
assert!(editor_b.blame().is_none());
editor_b.toggle_git_blame(&editor::actions::ToggleGitBlame {}, window, cx);
editor_b.toggle_git_blame(&git::Blame {}, window, cx);
});
cx_a.executor().run_until_parked();


@@ -6770,7 +6770,7 @@ async fn test_remote_git_branches(
assert_eq!(branches_b, branches_set);
cx_b.update(|cx| repo_b.read(cx).change_branch(new_branch))
cx_b.update(|cx| repo_b.read(cx).change_branch(new_branch.to_string()))
.await
.unwrap()
.unwrap();
@@ -6790,15 +6790,23 @@ async fn test_remote_git_branches(
assert_eq!(host_branch.name, branches[2]);
// Also try creating a new branch
cx_b.update(|cx| repo_b.read(cx).create_branch("totally-new-branch"))
.await
.unwrap()
.unwrap();
cx_b.update(|cx| {
repo_b
.read(cx)
.create_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
cx_b.update(|cx| repo_b.read(cx).change_branch("totally-new-branch"))
.await
.unwrap()
.unwrap();
cx_b.update(|cx| {
repo_b
.read(cx)
.change_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
executor.run_until_parked();


@@ -294,7 +294,7 @@ async fn test_ssh_collaboration_git_branches(
assert_eq!(&branches_b, &branches_set);
cx_b.update(|cx| repo_b.read(cx).change_branch(new_branch))
cx_b.update(|cx| repo_b.read(cx).change_branch(new_branch.to_string()))
.await
.unwrap()
.unwrap();
@@ -316,15 +316,23 @@ async fn test_ssh_collaboration_git_branches(
assert_eq!(server_branch.name, branches[2]);
// Also try creating a new branch
cx_b.update(|cx| repo_b.read(cx).create_branch("totally-new-branch"))
.await
.unwrap()
.unwrap();
cx_b.update(|cx| {
repo_b
.read(cx)
.create_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
cx_b.update(|cx| repo_b.read(cx).change_branch("totally-new-branch"))
.await
.unwrap()
.unwrap();
cx_b.update(|cx| {
repo_b
.read(cx)
.change_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
executor.run_until_parked();


@@ -271,7 +271,10 @@ mod tests {
use gpui::{AppContext as _, BackgroundExecutor, TestAppContext, UpdateGlobal};
use indoc::indoc;
use language::{
language_settings::{AllLanguageSettings, AllLanguageSettingsContent},
language_settings::{
AllLanguageSettings, AllLanguageSettingsContent, CompletionSettings,
WordsCompletionMode,
},
Point,
};
use project::Project;
@@ -286,7 +289,13 @@ mod tests {
#[gpui::test(iterations = 10)]
async fn test_copilot(executor: BackgroundExecutor, cx: &mut TestAppContext) {
// flaky
init_test(cx, |_| {});
init_test(cx, |settings| {
settings.defaults.completions = Some(CompletionSettings {
words: WordsCompletionMode::Disabled,
lsp: true,
lsp_fetch_timeout_ms: 0,
});
});
let (copilot, copilot_lsp) = Copilot::fake(cx);
let mut cx = EditorLspTestContext::new_rust(
@@ -511,7 +520,13 @@ mod tests {
cx: &mut TestAppContext,
) {
// flaky
init_test(cx, |_| {});
init_test(cx, |settings| {
settings.defaults.completions = Some(CompletionSettings {
words: WordsCompletionMode::Disabled,
lsp: true,
lsp_fetch_timeout_ms: 0,
});
});
let (copilot, copilot_lsp) = Copilot::fake(cx);
let mut cx = EditorLspTestContext::new_rust(


@@ -412,7 +412,6 @@ gpui::actions!(
Tab,
Backtab,
ToggleAutoSignatureHelp,
ToggleGitBlame,
ToggleGitBlameInline,
ToggleIndentGuides,
ToggleInlayHints,


@@ -101,6 +101,7 @@ use itertools::Itertools;
use language::{
language_settings::{
self, all_language_settings, language_settings, InlayHintSettings, RewrapBehavior,
WordsCompletionMode,
},
point_from_lsp, text_diff_with_options, AutoindentMode, BracketMatch, BracketPair, Buffer,
Capability, CharKind, CodeLabel, CursorShape, Diagnostic, DiffOptions, EditPredictionsMode,
@@ -607,12 +608,6 @@ pub trait Addon: 'static {
fn to_any(&self) -> &dyn std::any::Any;
}
#[derive(Debug, Copy, Clone, PartialEq, Eq)]
pub enum IsVimMode {
Yes,
No,
}
/// Zed's primary implementation of text input, allowing users to edit a [`MultiBuffer`].
///
/// See the [module level documentation](self) for more information.
@@ -644,6 +639,7 @@ pub struct Editor {
inline_diagnostics_enabled: bool,
inline_diagnostics: Vec<(Anchor, InlineDiagnostic)>,
soft_wrap_mode_override: Option<language_settings::SoftWrap>,
hard_wrap: Option<usize>,
// TODO: make this an access method
pub project: Option<Entity<Project>>,
@@ -1355,6 +1351,7 @@ impl Editor {
inline_diagnostics_update: Task::ready(()),
inline_diagnostics: Vec::new(),
soft_wrap_mode_override,
hard_wrap: None,
completion_provider: project.clone().map(|project| Box::new(project) as _),
semantics_provider: project.clone().map(|project| Rc::new(project) as _),
collaboration_hub: project.clone().map(|project| Box::new(project) as _),
@@ -3192,6 +3189,19 @@ impl Editor {
let trigger_in_words =
this.show_edit_predictions_in_menu() || !had_active_inline_completion;
if this.hard_wrap.is_some() {
let latest: Range<Point> = this.selections.newest(cx).range();
if latest.is_empty()
&& this
.buffer()
.read(cx)
.snapshot(cx)
.line_len(MultiBufferRow(latest.start.row))
== latest.start.column
{
this.rewrap_impl(true, cx)
}
}
this.trigger_completion_on_input(&text, trigger_in_words, window, cx);
linked_editing_ranges::refresh_linked_ranges(this, window, cx);
this.refresh_inline_completion(true, false, window, cx);
@@ -4012,9 +4022,8 @@ impl Editor {
} else {
return;
};
let show_completion_documentation = buffer
.read(cx)
.snapshot()
let buffer_snapshot = buffer.read(cx).snapshot();
let show_completion_documentation = buffer_snapshot
.settings_at(buffer_position, cx)
.show_completion_documentation;
@@ -4038,6 +4047,51 @@ impl Editor {
};
let completions =
provider.completions(&buffer, buffer_position, completion_context, window, cx);
let (old_range, word_kind) = buffer_snapshot.surrounding_word(buffer_position);
let (old_range, word_to_exclude) = if word_kind == Some(CharKind::Word) {
let word_to_exclude = buffer_snapshot
.text_for_range(old_range.clone())
.collect::<String>();
(
buffer_snapshot.anchor_before(old_range.start)
..buffer_snapshot.anchor_after(old_range.end),
Some(word_to_exclude),
)
} else {
(buffer_position..buffer_position, None)
};
let completion_settings = language_settings(
buffer_snapshot
.language_at(buffer_position)
.map(|language| language.name()),
buffer_snapshot.file(),
cx,
)
.completions;
// The document can be large, so stay in reasonable bounds when searching for words,
// otherwise completion pop-up might be slow to appear.
const WORD_LOOKUP_ROWS: u32 = 5_000;
let buffer_row = text::ToPoint::to_point(&buffer_position, &buffer_snapshot).row;
let min_word_search = buffer_snapshot.clip_point(
Point::new(buffer_row.saturating_sub(WORD_LOOKUP_ROWS), 0),
Bias::Left,
);
let max_word_search = buffer_snapshot.clip_point(
Point::new(buffer_row + WORD_LOOKUP_ROWS, 0).min(buffer_snapshot.max_point()),
Bias::Right,
);
let word_search_range = buffer_snapshot.point_to_offset(min_word_search)
..buffer_snapshot.point_to_offset(max_word_search);
let words = match completion_settings.words {
WordsCompletionMode::Disabled => Task::ready(HashMap::default()),
WordsCompletionMode::Enabled | WordsCompletionMode::Fallback => {
cx.background_spawn(async move {
buffer_snapshot.words_in_range(None, word_search_range)
})
}
};
let sort_completions = provider.sort_completions();
let id = post_inc(&mut self.next_completion_id);
@@ -4046,8 +4100,55 @@ impl Editor {
editor.update(&mut cx, |this, _| {
this.completion_tasks.retain(|(task_id, _)| *task_id >= id);
})?;
let completions = completions.await.log_err();
let menu = if let Some(completions) = completions {
let mut completions = completions.await.log_err().unwrap_or_default();
match completion_settings.words {
WordsCompletionMode::Enabled => {
completions.extend(
words
.await
.into_iter()
.filter(|(word, _)| word_to_exclude.as_ref() != Some(word))
.map(|(word, word_range)| Completion {
old_range: old_range.clone(),
new_text: word.clone(),
label: CodeLabel::plain(word, None),
documentation: None,
source: CompletionSource::BufferWord {
word_range,
resolved: false,
},
confirm: None,
}),
);
}
WordsCompletionMode::Fallback => {
if completions.is_empty() {
completions.extend(
words
.await
.into_iter()
.filter(|(word, _)| word_to_exclude.as_ref() != Some(word))
.map(|(word, word_range)| Completion {
old_range: old_range.clone(),
new_text: word.clone(),
label: CodeLabel::plain(word, None),
documentation: None,
source: CompletionSource::BufferWord {
word_range,
resolved: false,
},
confirm: None,
}),
);
}
}
WordsCompletionMode::Disabled => {}
}
let menu = if completions.is_empty() {
None
} else {
let mut menu = CompletionsMenu::new(
id,
sort_completions,
@@ -4061,8 +4162,6 @@ impl Editor {
.await;
menu.visible().then_some(menu)
} else {
None
};
editor.update_in(&mut cx, |editor, window, cx| {
@@ -8507,10 +8606,10 @@ impl Editor {
}
pub fn rewrap(&mut self, _: &Rewrap, _: &mut Window, cx: &mut Context<Self>) {
self.rewrap_impl(IsVimMode::No, cx)
self.rewrap_impl(false, cx)
}
pub fn rewrap_impl(&mut self, is_vim_mode: IsVimMode, cx: &mut Context<Self>) {
pub fn rewrap_impl(&mut self, override_language_settings: bool, cx: &mut Context<Self>) {
let buffer = self.buffer.read(cx).snapshot(cx);
let selections = self.selections.all::<Point>(cx);
let mut selections = selections.iter().peekable();
@@ -8584,7 +8683,9 @@ impl Editor {
RewrapBehavior::Anywhere => true,
};
let should_rewrap = is_vim_mode == IsVimMode::Yes || allow_rewrap_based_on_language;
let should_rewrap = override_language_settings
|| allow_rewrap_based_on_language
|| self.hard_wrap.is_some();
if !should_rewrap {
continue;
}
@@ -8632,9 +8733,11 @@ impl Editor {
continue;
};
let wrap_column = buffer
.language_settings_at(Point::new(start_row, 0), cx)
.preferred_line_length as usize;
let wrap_column = self.hard_wrap.unwrap_or_else(|| {
buffer
.language_settings_at(Point::new(start_row, 0), cx)
.preferred_line_length as usize
});
let wrapped_text = wrap_with_prefix(
line_prefix,
lines_without_prefixes.join(" "),
@@ -8645,7 +8748,7 @@ impl Editor {
// TODO: should always use char-based diff while still supporting cursor behavior that
// matches vim.
let mut diff_options = DiffOptions::default();
if is_vim_mode == IsVimMode::Yes {
if override_language_settings {
diff_options.max_word_diff_len = 0;
diff_options.max_word_diff_line_count = 0;
} else {
@@ -14215,6 +14318,11 @@ impl Editor {
cx.notify();
}
pub fn set_hard_wrap(&mut self, hard_wrap: Option<usize>, cx: &mut Context<Self>) {
self.hard_wrap = hard_wrap;
cx.notify();
}
pub fn set_text_style_refinement(&mut self, style: TextStyleRefinement) {
self.text_style_refinement = Some(style);
}
@@ -14497,7 +14605,7 @@ impl Editor {
pub fn toggle_git_blame(
&mut self,
_: &ToggleGitBlame,
_: &::git::Blame,
window: &mut Window,
cx: &mut Context<Self>,
) {

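The `WORD_LOOKUP_ROWS` comment in the editor diff above bounds the word search to a band of rows around the cursor so large buffers don't slow the completion menu. The clamp can be sketched as (hypothetical standalone helper; `max_row` stands in for the buffer's last row):

```rust
// Clamp a word-search window to `lookup_rows` rows on either side of the
// cursor, staying within [0, max_row].
fn word_search_rows(cursor_row: u32, max_row: u32, lookup_rows: u32) -> (u32, u32) {
    let min = cursor_row.saturating_sub(lookup_rows);
    let max = cursor_row.saturating_add(lookup_rows).min(max_row);
    (min, max)
}

fn main() {
    // A cursor near the top of a large buffer clamps at row 0.
    assert_eq!(word_search_rows(2, 100_000, 5_000), (0, 5_002));
    // A cursor near the end clamps at the last row.
    assert_eq!(word_search_rows(99_000, 100_000, 5_000), (94_000, 100_000));
}
```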

@@ -16,7 +16,8 @@ use gpui::{
use indoc::indoc;
use language::{
language_settings::{
AllLanguageSettings, AllLanguageSettingsContent, LanguageSettingsContent, PrettierSettings,
AllLanguageSettings, AllLanguageSettingsContent, CompletionSettings,
LanguageSettingsContent, PrettierSettings,
},
BracketPairConfig,
Capability::ReadWrite,
@@ -30,7 +31,7 @@ use pretty_assertions::{assert_eq, assert_ne};
use project::project_settings::{LspSettings, ProjectSettings};
use project::FakeFs;
use serde_json::{self, json};
use std::{cell::RefCell, future::Future, rc::Rc, time::Instant};
use std::{cell::RefCell, future::Future, rc::Rc, sync::atomic::AtomicBool, time::Instant};
use std::{
iter,
sync::atomic::{self, AtomicUsize},
@@ -4737,6 +4738,31 @@ async fn test_rewrap(cx: &mut TestAppContext) {
}
}
#[gpui::test]
async fn test_hard_wrap(cx: &mut TestAppContext) {
init_test(cx, |_| {});
let mut cx = EditorTestContext::new(cx).await;
cx.update_editor(|editor, _, cx| {
editor.set_hard_wrap(Some(14), cx);
});
cx.set_state(indoc!(
"
one two three ˇ
"
));
cx.simulate_input("four");
cx.run_until_parked();
cx.assert_editor_state(indoc!(
"
one two three
fourˇ
"
));
}
#[gpui::test]
async fn test_clipboard(cx: &mut TestAppContext) {
init_test(cx, |_| {});
@@ -9169,6 +9195,101 @@ async fn test_completion(cx: &mut TestAppContext) {
apply_additional_edits.await.unwrap();
}
#[gpui::test]
async fn test_words_completion(cx: &mut TestAppContext) {
let lsp_fetch_timeout_ms = 10;
init_test(cx, |language_settings| {
language_settings.defaults.completions = Some(CompletionSettings {
words: WordsCompletionMode::Fallback,
lsp: true,
lsp_fetch_timeout_ms: 10,
});
});
let mut cx = EditorLspTestContext::new_rust(
lsp::ServerCapabilities {
completion_provider: Some(lsp::CompletionOptions {
trigger_characters: Some(vec![".".to_string(), ":".to_string()]),
..lsp::CompletionOptions::default()
}),
signature_help_provider: Some(lsp::SignatureHelpOptions::default()),
..lsp::ServerCapabilities::default()
},
cx,
)
.await;
let throttle_completions = Arc::new(AtomicBool::new(false));
let lsp_throttle_completions = throttle_completions.clone();
let _completion_requests_handler =
cx.lsp
.server
.on_request::<lsp::request::Completion, _, _>(move |_, cx| {
let lsp_throttle_completions = lsp_throttle_completions.clone();
async move {
if lsp_throttle_completions.load(atomic::Ordering::Acquire) {
cx.background_executor()
.timer(Duration::from_millis(lsp_fetch_timeout_ms * 10))
.await;
}
Ok(Some(lsp::CompletionResponse::Array(vec![
lsp::CompletionItem {
label: "first".into(),
..Default::default()
},
lsp::CompletionItem {
label: "last".into(),
..Default::default()
},
])))
}
});
cx.set_state(indoc! {"
oneˇ
two
three
"});
cx.simulate_keystroke(".");
cx.executor().run_until_parked();
cx.condition(|editor, _| editor.context_menu_visible())
.await;
cx.update_editor(|editor, window, cx| {
if let Some(CodeContextMenu::Completions(menu)) = editor.context_menu.borrow_mut().as_ref()
{
assert_eq!(
completion_menu_entries(&menu),
&["first", "last"],
"When LSP server is fast to reply, no fallback word completions are used"
);
} else {
panic!("expected completion menu to be open");
}
editor.cancel(&Cancel, window, cx);
});
cx.executor().run_until_parked();
cx.condition(|editor, _| !editor.context_menu_visible())
.await;
throttle_completions.store(true, atomic::Ordering::Release);
cx.simulate_keystroke(".");
cx.executor()
.advance_clock(Duration::from_millis(lsp_fetch_timeout_ms * 2));
cx.executor().run_until_parked();
cx.condition(|editor, _| editor.context_menu_visible())
.await;
cx.update_editor(|editor, _, _| {
if let Some(CodeContextMenu::Completions(menu)) = editor.context_menu.borrow_mut().as_ref()
{
assert_eq!(completion_menu_entries(&menu), &["one", "three", "two"],
"When LSP server is slow, document words can be shown instead, if configured accordingly");
} else {
panic!("expected completion menu to be open");
}
});
}
#[gpui::test]
async fn test_multiline_completion(cx: &mut TestAppContext) {
init_test(cx, |_| {});

View File

@@ -1,4 +1,5 @@
pub mod extension_builder;
mod extension_events;
mod extension_host_proxy;
mod extension_manifest;
mod types;
@@ -14,12 +15,14 @@ use gpui::{App, Task};
use language::LanguageName;
use semantic_version::SemanticVersion;
pub use crate::extension_events::*;
pub use crate::extension_host_proxy::*;
pub use crate::extension_manifest::*;
pub use crate::types::*;
/// Initializes the `extension` crate.
pub fn init(cx: &mut App) {
extension_events::init(cx);
ExtensionHostProxy::default_global(cx);
}

View File

@@ -0,0 +1,35 @@
use gpui::{App, AppContext as _, Context, Entity, EventEmitter, Global, ReadGlobal as _};
pub fn init(cx: &mut App) {
let extension_events = cx.new(ExtensionEvents::new);
cx.set_global(GlobalExtensionEvents(extension_events));
}
struct GlobalExtensionEvents(Entity<ExtensionEvents>);
impl Global for GlobalExtensionEvents {}
/// An event bus for broadcasting extension-related events throughout the app.
pub struct ExtensionEvents;
impl ExtensionEvents {
/// Returns the global [`ExtensionEvents`].
pub fn global(cx: &App) -> Entity<Self> {
GlobalExtensionEvents::global(cx).0.clone()
}
fn new(_cx: &mut Context<Self>) -> Self {
Self
}
pub fn emit(&mut self, event: Event, cx: &mut Context<Self>) {
cx.emit(event)
}
}
#[derive(Clone)]
pub enum Event {
ExtensionsUpdated,
}
impl EventEmitter<Event> for ExtensionEvents {}
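Stripped of GPUI's `Entity`/`Context`/`EventEmitter` machinery, the global event bus introduced above boils down to a broadcast pattern: one shared emitter, many subscribers notified per event. A minimal plain-Rust sketch of that idea (names and callback storage here are illustrative assumptions, not GPUI's API):

```rust
// Broadcast-style event bus sketch: subscribers register callbacks,
// `emit` invokes each of them with a reference to the event.
#[derive(Clone, Debug, PartialEq)]
enum Event {
    ExtensionsUpdated,
}

struct ExtensionEvents {
    subscribers: Vec<Box<dyn Fn(&Event)>>,
}

impl ExtensionEvents {
    fn new() -> Self {
        Self { subscribers: Vec::new() }
    }

    fn subscribe(&mut self, f: impl Fn(&Event) + 'static) {
        self.subscribers.push(Box::new(f));
    }

    fn emit(&self, event: Event) {
        for subscriber in &self.subscribers {
            subscriber(&event);
        }
    }
}

fn main() {
    use std::{cell::Cell, rc::Rc};
    let seen = Rc::new(Cell::new(0));
    let mut bus = ExtensionEvents::new();
    let counter = seen.clone();
    bus.subscribe(move |_| counter.set(counter.get() + 1));
    bus.emit(Event::ExtensionsUpdated);
    assert_eq!(seen.get(), 1);
    println!("{}", seen.get());
}
```

In GPUI the subscriber bookkeeping and dispatch are handled by `cx.subscribe_in` and `cx.emit`, as the `ExtensionsPage` diff below this file shows.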

View File

@@ -14,7 +14,7 @@ use collections::{btree_map, BTreeMap, BTreeSet, HashMap, HashSet};
use extension::extension_builder::{CompileExtensionOptions, ExtensionBuilder};
pub use extension::ExtensionManifest;
use extension::{
ExtensionContextServerProxy, ExtensionGrammarProxy, ExtensionHostProxy,
ExtensionContextServerProxy, ExtensionEvents, ExtensionGrammarProxy, ExtensionHostProxy,
ExtensionIndexedDocsProviderProxy, ExtensionLanguageProxy, ExtensionLanguageServerProxy,
ExtensionSlashCommandProxy, ExtensionSnippetProxy, ExtensionThemeProxy,
};
@@ -127,7 +127,6 @@ pub enum ExtensionOperation {
#[derive(Clone)]
pub enum Event {
ExtensionsUpdated,
StartedReloading,
ExtensionInstalled(Arc<str>),
ExtensionFailedToLoad(Arc<str>),
@@ -1214,7 +1213,9 @@ impl ExtensionStore {
self.extension_index = new_index;
cx.notify();
cx.emit(Event::ExtensionsUpdated);
ExtensionEvents::global(cx).update(cx, |this, cx| {
this.emit(extension::Event::ExtensionsUpdated, cx)
});
cx.spawn(|this, mut cx| async move {
cx.background_spawn({

View File

@@ -780,6 +780,7 @@ fn init_test(cx: &mut TestAppContext) {
let store = SettingsStore::test(cx);
cx.set_global(store);
release_channel::init(SemanticVersion::default(), cx);
extension::init(cx);
theme::init(theme::LoadThemes::JustBase, cx);
Project::init_settings(cx);
ExtensionSettings::register(cx);

View File

@@ -17,6 +17,7 @@ client.workspace = true
collections.workspace = true
db.workspace = true
editor.workspace = true
extension.workspace = true
extension_host.workspace = true
feature_flags.workspace = true
fs.workspace = true

View File

@@ -9,6 +9,7 @@ use std::{ops::Range, sync::Arc};
use client::{ExtensionMetadata, ExtensionProvides};
use collections::{BTreeMap, BTreeSet};
use editor::{Editor, EditorElement, EditorStyle};
use extension::ExtensionEvents;
use extension_host::{ExtensionManifest, ExtensionOperation, ExtensionStore};
use feature_flags::FeatureFlagAppExt as _;
use fuzzy::{match_strings, StringMatchCandidate};
@@ -212,7 +213,7 @@ pub struct ExtensionsPage {
query_editor: Entity<Editor>,
query_contains_error: bool,
provides_filter: Option<ExtensionProvides>,
_subscriptions: [gpui::Subscription; 2],
_subscriptions: Vec<gpui::Subscription>,
extension_fetch_task: Option<Task<()>>,
upsells: BTreeSet<Feature>,
}
@@ -226,15 +227,12 @@ impl ExtensionsPage {
cx.new(|cx| {
let store = ExtensionStore::global(cx);
let workspace_handle = workspace.weak_handle();
let subscriptions = [
let subscriptions = vec![
cx.observe(&store, |_: &mut Self, _, cx| cx.notify()),
cx.subscribe_in(
&store,
window,
move |this, _, event, window, cx| match event {
extension_host::Event::ExtensionsUpdated => {
this.fetch_extensions_debounced(cx)
}
extension_host::Event::ExtensionInstalled(extension_id) => this
.on_extension_installed(
workspace_handle.clone(),
@@ -245,6 +243,15 @@ impl ExtensionsPage {
_ => {}
},
),
cx.subscribe_in(
&ExtensionEvents::global(cx),
window,
move |this, _, event, _window, cx| match event {
extension::Event::ExtensionsUpdated => {
this.fetch_extensions_debounced(cx);
}
},
),
];
let query_editor = cx.new(|cx| {

View File

@@ -1,5 +1,5 @@
use futures::channel::oneshot;
use fuzzy::StringMatchCandidate;
use fuzzy::{StringMatch, StringMatchCandidate};
use picker::{Picker, PickerDelegate};
use project::DirectoryLister;
use std::{
@@ -9,7 +9,7 @@ use std::{
Arc,
},
};
use ui::{prelude::*, LabelLike, ListItemSpacing};
use ui::{prelude::*, HighlightedLabel, ListItemSpacing};
use ui::{Context, ListItem, Window};
use util::{maybe, paths::compare_paths};
use workspace::Workspace;
@@ -22,6 +22,7 @@ pub struct OpenPathDelegate {
selected_index: usize,
directory_state: Option<DirectoryState>,
matches: Vec<usize>,
string_matches: Vec<StringMatch>,
cancel_flag: Arc<AtomicBool>,
should_dismiss: bool,
}
@@ -34,6 +35,7 @@ impl OpenPathDelegate {
selected_index: 0,
directory_state: None,
matches: Vec::new(),
string_matches: Vec::new(),
cancel_flag: Arc::new(AtomicBool::new(false)),
should_dismiss: true,
}
@@ -223,6 +225,7 @@ impl PickerDelegate for OpenPathDelegate {
if suffix == "" {
this.update(&mut cx, |this, cx| {
this.delegate.matches.clear();
this.delegate.string_matches.clear();
this.delegate
.matches
.extend(match_candidates.iter().map(|m| m.path.id));
@@ -249,6 +252,7 @@ impl PickerDelegate for OpenPathDelegate {
this.update(&mut cx, |this, cx| {
this.delegate.matches.clear();
this.delegate.string_matches = matches.clone();
this.delegate
.matches
.extend(matches.into_iter().map(|m| m.candidate_id));
@@ -337,13 +341,22 @@ impl PickerDelegate for OpenPathDelegate {
let m = self.matches.get(ix)?;
let directory_state = self.directory_state.as_ref()?;
let candidate = directory_state.match_candidates.get(*m)?;
let highlight_positions = self
.string_matches
.iter()
.find(|string_match| string_match.candidate_id == *m)
.map(|string_match| string_match.positions.clone())
.unwrap_or_default();
Some(
ListItem::new(ix)
.spacing(ListItemSpacing::Sparse)
.inset(true)
.toggle_state(selected)
.child(LabelLike::new().child(candidate.path.string.clone())),
.child(HighlightedLabel::new(
candidate.path.string.clone(),
highlight_positions,
)),
)
}

View File

@@ -2,8 +2,8 @@ use crate::commit::get_messages;
use crate::Oid;
use anyhow::{anyhow, Context as _, Result};
use collections::{HashMap, HashSet};
use futures::AsyncWriteExt;
use serde::{Deserialize, Serialize};
use std::io::Write;
use std::process::Stdio;
use std::{ops::Range, path::Path};
use text::Rope;
@@ -21,14 +21,14 @@ pub struct Blame {
}
impl Blame {
pub fn for_path(
pub async fn for_path(
git_binary: &Path,
working_directory: &Path,
path: &Path,
content: &Rope,
remote_url: Option<String>,
) -> Result<Self> {
let output = run_git_blame(git_binary, working_directory, path, content)?;
let output = run_git_blame(git_binary, working_directory, path, content).await?;
let mut entries = parse_git_blame(&output)?;
entries.sort_unstable_by(|a, b| a.range.start.cmp(&b.range.start));
@@ -39,8 +39,9 @@ impl Blame {
}
let shas = unique_shas.into_iter().collect::<Vec<_>>();
let messages =
get_messages(working_directory, &shas).context("failed to get commit messages")?;
let messages = get_messages(working_directory, &shas)
.await
.context("failed to get commit messages")?;
Ok(Self {
entries,
@@ -53,13 +54,13 @@ impl Blame {
const GIT_BLAME_NO_COMMIT_ERROR: &str = "fatal: no such ref: HEAD";
const GIT_BLAME_NO_PATH: &str = "fatal: no such path";
fn run_git_blame(
async fn run_git_blame(
git_binary: &Path,
working_directory: &Path,
path: &Path,
contents: &Rope,
) -> Result<String> {
let child = util::command::new_std_command(git_binary)
let mut child = util::command::new_smol_command(git_binary)
.current_dir(working_directory)
.arg("blame")
.arg("--incremental")
@@ -72,18 +73,19 @@ fn run_git_blame(
.spawn()
.map_err(|e| anyhow!("Failed to start git blame process: {}", e))?;
let mut stdin = child
let stdin = child
.stdin
.as_ref()
.as_mut()
.context("failed to get pipe to stdin of git blame command")?;
for chunk in contents.chunks() {
stdin.write_all(chunk.as_bytes())?;
stdin.write_all(chunk.as_bytes()).await?;
}
stdin.flush()?;
stdin.flush().await?;
let output = child
.wait_with_output()
.output()
.await
.map_err(|e| anyhow!("Failed to read git blame output: {}", e))?;
if !output.status.success() {

View File

@@ -3,20 +3,21 @@ use anyhow::{anyhow, Result};
use collections::HashMap;
use std::path::Path;
pub fn get_messages(working_directory: &Path, shas: &[Oid]) -> Result<HashMap<Oid, String>> {
pub async fn get_messages(working_directory: &Path, shas: &[Oid]) -> Result<HashMap<Oid, String>> {
if shas.is_empty() {
return Ok(HashMap::default());
}
const MARKER: &str = "<MARKER>";
let output = util::command::new_std_command("git")
let output = util::command::new_smol_command("git")
.current_dir(working_directory)
.arg("show")
.arg("-s")
.arg(format!("--format=%B{}", MARKER))
.args(shas.iter().map(ToString::to_string))
.output()
.await
.map_err(|e| anyhow!("Failed to start git show process: {}", e))?;
anyhow::ensure!(

View File

@@ -54,8 +54,10 @@ actions!(
Init,
]
);
action_with_deprecated_aliases!(git, RestoreFile, ["editor::RevertFile"]);
action_with_deprecated_aliases!(git, Restore, ["editor::RevertSelectedHunks"]);
action_with_deprecated_aliases!(git, Blame, ["editor::ToggleGitBlame"]);
/// The length of a Git short SHA.
pub const SHORT_SHA_LENGTH: usize = 7;

File diff suppressed because it is too large

View File

@@ -205,9 +205,9 @@ impl BranchListDelegate {
return;
};
cx.spawn(|_, cx| async move {
cx.update(|cx| repo.read(cx).create_branch(&new_branch_name))?
cx.update(|cx| repo.read(cx).create_branch(new_branch_name.to_string()))?
.await??;
cx.update(|cx| repo.read(cx).change_branch(&new_branch_name))?
cx.update(|cx| repo.read(cx).change_branch(new_branch_name.to_string()))?
.await??;
Ok(())
})
@@ -358,7 +358,7 @@ impl PickerDelegate for BranchListDelegate {
let cx = cx.to_async();
anyhow::Ok(async move {
cx.update(|cx| repo.read(cx).change_branch(&branch.name))?
cx.update(|cx| repo.read(cx).change_branch(branch.name.to_string()))?
.await?
})
})??;
@@ -434,6 +434,7 @@ impl PickerDelegate for BranchListDelegate {
"Create branch \"{}\"",
entry.branch.name
))
.single_line()
.into_any_element()
} else {
HighlightedLabel::new(

View File

@@ -367,6 +367,7 @@ pub(crate) fn commit_message_editor(
commit_editor.set_show_gutter(false, cx);
commit_editor.set_show_wrap_guides(false, cx);
commit_editor.set_show_indent_guides(false, cx);
commit_editor.set_hard_wrap(Some(72), cx);
let placeholder = placeholder.unwrap_or("Enter commit message");
commit_editor.set_placeholder_text(placeholder, cx);
commit_editor
@@ -1501,15 +1502,17 @@ impl GitPanel {
telemetry::event!("Git Uncommitted");
let confirmation = self.check_for_pushed_commits(window, cx);
let prior_head = self.load_commit_details("HEAD", cx);
let prior_head = self.load_commit_details("HEAD".to_string(), cx);
let task = cx.spawn_in(window, |this, mut cx| async move {
let result = maybe!(async {
if let Ok(true) = confirmation.await {
let prior_head = prior_head.await?;
repo.update(&mut cx, |repo, cx| repo.reset("HEAD^", ResetMode::Soft, cx))?
.await??;
repo.update(&mut cx, |repo, cx| {
repo.reset("HEAD^".to_string(), ResetMode::Soft, cx)
})?
.await??;
Ok(Some(prior_head))
} else {
@@ -3401,7 +3404,7 @@ impl GitPanel {
fn load_commit_details(
&self,
sha: &str,
sha: String,
cx: &mut Context<Self>,
) -> Task<anyhow::Result<CommitDetails>> {
let Some(repo) = self.active_repository.clone() else {
@@ -3911,7 +3914,7 @@ impl GitPanelMessageTooltip {
cx.spawn_in(window, |this, mut cx| async move {
let details = git_panel
.update(&mut cx, |git_panel, cx| {
git_panel.load_commit_details(&sha, cx)
git_panel.load_commit_details(sha.to_string(), cx)
})?
.await?;

View File

@@ -815,23 +815,30 @@ impl ProjectDiffToolbar {
cx.dispatch_action(action.as_ref());
})
}
fn dispatch_panel_action(
&self,
action: &dyn Action,
window: &mut Window,
cx: &mut Context<Self>,
) {
fn stage_all(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.workspace
.read_with(cx, |workspace, cx| {
.update(cx, |workspace, cx| {
if let Some(panel) = workspace.panel::<GitPanel>(cx) {
panel.focus_handle(cx).focus(window)
panel.update(cx, |panel, cx| {
panel.stage_all(&Default::default(), window, cx);
});
}
})
.ok();
let action = action.boxed_clone();
cx.defer(move |cx| {
cx.dispatch_action(action.as_ref());
})
}
fn unstage_all(&mut self, window: &mut Window, cx: &mut Context<Self>) {
self.workspace
.update(cx, |workspace, cx| {
let Some(panel) = workspace.panel::<GitPanel>(cx) else {
return;
};
panel.update(cx, |panel, cx| {
panel.unstage_all(&Default::default(), window, cx);
});
})
.ok();
}
}
@@ -985,7 +992,7 @@ impl Render for ProjectDiffToolbar {
&focus_handle,
))
.on_click(cx.listener(|this, _, window, cx| {
this.dispatch_panel_action(&UnstageAll, window, cx)
this.unstage_all(window, cx)
})),
)
},
@@ -1005,7 +1012,7 @@ impl Render for ProjectDiffToolbar {
&focus_handle,
))
.on_click(cx.listener(|this, _, window, cx| {
this.dispatch_panel_action(&StageAll, window, cx)
this.stage_all(window, cx)
})),
),
)

View File

@@ -143,7 +143,7 @@ pub fn format_output(action: &RemoteAction, output: RemoteCommandOutput) -> Succ
}
} else {
SuccessMessage {
message: "Successfully pushed new branch".to_owned(),
message: format!("Pushed {} to {}", branch_name, remote_ref.name),
style: SuccessStyle::ToastWithLog { output },
}
}

View File

@@ -3,7 +3,8 @@ use std::{fs, path::PathBuf, time::Duration};
use anyhow::Result;
use gpui::{
div, hsla, img, point, prelude::*, px, rgb, size, svg, App, Application, AssetSource, Bounds,
BoxShadow, ClickEvent, Context, SharedString, Task, Timer, Window, WindowBounds, WindowOptions,
BoxShadow, ClickEvent, Context, ForegroundTask, SharedString, Timer, Window, WindowBounds,
WindowOptions,
};
struct Assets {
@@ -34,7 +35,7 @@ impl AssetSource for Assets {
}
struct HelloWorld {
_task: Option<Task<()>>,
_task: Option<ForegroundTask<()>>,
opacity: f32,
}

View File

@@ -32,12 +32,12 @@ use util::ResultExt;
use crate::{
current_platform, hash, init_app_menus, Action, ActionBuildError, ActionRegistry, Any, AnyView,
AnyWindowHandle, AppContext, Asset, AssetSource, BackgroundExecutor, Bounds, ClipboardItem,
DispatchPhase, DisplayId, EventEmitter, FocusHandle, FocusMap, ForegroundExecutor, Global,
KeyBinding, Keymap, Keystroke, LayoutId, Menu, MenuItem, OwnedMenu, PathPromptOptions, Pixels,
Platform, PlatformDisplay, Point, PromptBuilder, PromptHandle, PromptLevel, Render,
RenderablePromptHandle, Reservation, ScreenCaptureSource, SharedString, SubscriberSet,
Subscription, SvgRenderer, Task, TextSystem, Window, WindowAppearance, WindowHandle, WindowId,
WindowInvalidator,
DispatchPhase, DisplayId, EventEmitter, FocusHandle, FocusMap, ForegroundContext,
ForegroundExecutor, Global, KeyBinding, Keymap, Keystroke, LayoutId, Menu, MenuItem, OwnedMenu,
PathPromptOptions, Pixels, Platform, PlatformDisplay, Point, PromptBuilder, PromptHandle,
PromptLevel, Render, RenderablePromptHandle, Reservation, ScreenCaptureSource, SharedString,
SubscriberSet, Subscription, SvgRenderer, Task, TextSystem, Window, WindowAppearance,
WindowHandle, WindowId, WindowInvalidator,
};
mod async_context;
@@ -1026,10 +1026,10 @@ impl App {
Ok(result)
})
}
/// Creates an `AsyncApp`, which can be cloned and has a static lifetime
/// Creates a `WeakAsyncApp`, which can be cloned and has a static lifetime
/// so it can be held across `await` points.
pub fn to_async(&self) -> AsyncApp {
AsyncApp {
pub fn to_async(&self) -> WeakAsyncApp {
WeakAsyncApp {
app: self.this.clone(),
background_executor: self.background_executor.clone(),
foreground_executor: self.foreground_executor.clone(),
@@ -1054,7 +1054,11 @@ impl App {
Fut: Future<Output = R> + 'static,
R: 'static,
{
self.foreground_executor.spawn(f(self.to_async()))
let async_app = self.to_async();
self.foreground_executor.spawn_with_context(
ForegroundContext::app(&async_app.app),
f(async_app.upgrade()),
)
}
/// Schedules the given function to be run at the end of the current effect cycle, allowing entities
@@ -1596,8 +1600,6 @@ impl App {
}
impl AppContext for App {
type Result<T> = T;
/// Build an entity that is owned by the application. The given function will be invoked with
/// a `Context` and must return an object representing the entity. A `Entity` handle will be returned,
/// which can be used to access the entity in a context.
@@ -1621,7 +1623,7 @@ impl AppContext for App {
})
}
fn reserve_entity<T: 'static>(&mut self) -> Self::Result<Reservation<T>> {
fn reserve_entity<T: 'static>(&mut self) -> Reservation<T> {
Reservation(self.entities.reserve())
}
@@ -1629,7 +1631,7 @@ impl AppContext for App {
&mut self,
reservation: Reservation<T>,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>> {
) -> Entity<T> {
self.update(|cx| {
let slot = reservation.0;
let entity = build_entity(&mut Context::new_context(cx, slot.downgrade()));
@@ -1655,11 +1657,7 @@ impl AppContext for App {
})
}
fn read_entity<T, R>(
&self,
handle: &Entity<T>,
read: impl FnOnce(&T, &App) -> R,
) -> Self::Result<R>
fn read_entity<T, R>(&self, handle: &Entity<T>, read: impl FnOnce(&T, &App) -> R) -> R
where
T: 'static,
{
@@ -1704,7 +1702,7 @@ impl AppContext for App {
self.background_executor.spawn(future)
}
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> Self::Result<R>
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> R
where
G: Global,
{
@@ -1842,3 +1840,38 @@ impl HttpClient for NullHttpClient {
type_name::<Self>()
}
}
/// The AsyncApp, AsyncWindowContext, and executor types must not be `Clone`, in order to
/// protect against panics caused by moving an AsyncApp across an
/// await point in a task that doesn't share its context.
///
/// For example, an issue with AsyncWindowContext:
///
/// ```rs
/// fn test(cx: AsyncWindowContext) {
/// let cx_2 = cx.clone();
/// cx.update(|_, app: &mut App| {
/// app.spawn(|_| {
/// my_async_thing().await;
/// cx_2.update(/**/) // 💥 Window wasn't checked after `.await`, this might panic
/// })
/// })
/// }
/// ```
///
/// Or using ForegroundExecutor:
///
/// ```rs
/// fn test(cx: AsyncApp) {
/// let executor = cx.foreground_executor().clone();
/// executor.spawn(async {
/// my_async_thing().await;
/// cx.update(/**/) // 💥 Didn't check that the app still existed after await, this might panic
/// })
/// }
/// ```
///
/// Making these types `!Clone` ensures that they'll be borrowed behind a `&`
/// when calling `spawn`, which means they can't be moved into the spawned
/// future (which has a `+ 'static` bound).
pub(crate) struct NotClone;
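The mechanism the comment describes can be seen in miniature outside GPUI: a private, zero-sized, non-`Clone` marker field makes the whole context type `!Clone`, so a `spawn`-style API that takes the context by `&` forces `'static` futures to capture only data extracted from the context, never an owned copy of it. The types below are illustrative assumptions, not GPUI's real `AsyncApp`:

```rust
// Private marker with no Clone impl; embedding it makes AsyncCtx !Clone.
struct NotClone;

struct AsyncCtx {
    generation: u64,
    _not_clone: NotClone,
}

// Taking `cx` by reference means a + 'static closure cannot move or
// clone the context inside itself; it can only carry plain data out.
fn spawn(cx: &AsyncCtx, work: impl FnOnce(u64) -> u64 + 'static) -> u64 {
    work(cx.generation)
}

fn main() {
    let cx = AsyncCtx { generation: 1, _not_clone: NotClone };
    let next = spawn(&cx, |g| g + 1);
    assert_eq!(next, 2);
    println!("{}", next);
}
```

Attempting `spawn(&cx, move |_| { let _stolen = cx; 0 })` fails to compile, which is exactly the class of use-after-drop panic the GPUI change is designed to rule out statically.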

View File

@@ -1,95 +1,63 @@
use crate::{
AnyView, AnyWindowHandle, App, AppCell, AppContext, BackgroundExecutor, BorrowAppContext,
Entity, Focusable, ForegroundExecutor, Global, PromptLevel, Render, Reservation, Result, Task,
VisualContext, Window, WindowHandle,
Entity, Focusable, ForegroundContext, ForegroundExecutor, Global, PromptLevel, Render,
Reservation, Result, Task, VisualContext, Window, WindowHandle,
};
use anyhow::{anyhow, Context as _};
use anyhow::Context as _;
use derive_more::{Deref, DerefMut};
use futures::channel::oneshot;
use std::{future::Future, rc::Weak};
use super::Context;
use super::{Context, NotClone};
/// An async-friendly version of [App] with a static lifetime so it can be held across `await` points in async code.
/// You're provided with an instance when calling [App::spawn], and you can also create one with [App::to_async].
/// Internally, this holds a weak reference to an `App`, so its methods are fallible to protect against cases where the [App] is dropped.
#[derive(Clone)]
pub struct AsyncApp {
pub(crate) app: Weak<AppCell>,
pub(crate) background_executor: BackgroundExecutor,
pub(crate) foreground_executor: ForegroundExecutor,
pub(crate) app: WeakAsyncApp,
_not_clone: NotClone,
}
impl AppContext for AsyncApp {
type Result<T> = Result<T>;
fn new<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut app = app.borrow_mut();
Ok(app.new(build_entity))
) -> Entity<T> {
self.app.new(build_entity).unwrap()
}
fn reserve_entity<T: 'static>(&mut self) -> Result<Reservation<T>> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut app = app.borrow_mut();
Ok(app.reserve_entity())
fn reserve_entity<T: 'static>(&mut self) -> Reservation<T> {
self.app.reserve_entity().unwrap()
}
fn insert_entity<T: 'static>(
&mut self,
reservation: Reservation<T>,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Result<Entity<T>> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut app = app.borrow_mut();
Ok(app.insert_entity(reservation, build_entity))
) -> Entity<T> {
self.app.insert_entity(reservation, build_entity).unwrap()
}
fn update_entity<T: 'static, R>(
&mut self,
handle: &Entity<T>,
update: impl FnOnce(&mut T, &mut Context<'_, T>) -> R,
) -> Self::Result<R> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut app = app.borrow_mut();
Ok(app.update_entity(handle, update))
) -> R {
self.app.update_entity(handle, update).unwrap()
}
fn read_entity<T, R>(
&self,
handle: &Entity<T>,
callback: impl FnOnce(&T, &App) -> R,
) -> Self::Result<R>
fn read_entity<T, R>(&self, handle: &Entity<T>, callback: impl FnOnce(&T, &App) -> R) -> R
where
T: 'static,
{
let app = self.app.upgrade().context("app was released")?;
let lock = app.borrow();
Ok(lock.read_entity(handle, callback))
self.app.read_entity(handle, callback).unwrap()
}
fn update_window<T, F>(&mut self, window: AnyWindowHandle, f: F) -> Result<T>
where
F: FnOnce(AnyView, &mut Window, &mut App) -> T,
{
let app = self.app.upgrade().context("app was released")?;
let mut lock = app.borrow_mut();
lock.update_window(window, f)
self.app.update_window(window, f).unwrap()
}
fn read_window<T, R>(
@@ -100,58 +68,44 @@ impl AppContext for AsyncApp {
where
T: 'static,
{
let app = self.app.upgrade().context("app was released")?;
let lock = app.borrow();
lock.read_window(window, read)
self.app.read_window(window, read).unwrap()
}
fn background_spawn<R>(&self, future: impl Future<Output = R> + Send + 'static) -> Task<R>
where
R: Send + 'static,
{
self.background_executor.spawn(future)
self.app.background_spawn(future)
}
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> Self::Result<R>
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> R
where
G: Global,
{
let app = self.app.upgrade().context("app was released")?;
let mut lock = app.borrow_mut();
Ok(lock.update(|this| this.read_global(callback)))
self.app.read_global(callback).unwrap()
}
}
impl AsyncApp {
/// Schedules all windows in the application to be redrawn.
pub fn refresh(&self) -> Result<()> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut lock = app.borrow_mut();
lock.refresh_windows();
Ok(())
pub fn refresh(&self) {
self.app.refresh().unwrap()
}
/// Get an executor which can be used to spawn futures in the background.
pub fn background_executor(&self) -> &BackgroundExecutor {
&self.background_executor
self.app.background_executor()
}
/// Get an executor which can be used to spawn futures in the foreground.
pub fn foreground_executor(&self) -> &ForegroundExecutor {
&self.foreground_executor
self.app.foreground_executor()
}
/// Invoke the given function in the context of the app, then flush any effects produced during its invocation.
pub fn update<R>(&self, f: impl FnOnce(&mut App) -> R) -> Result<R> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut lock = app.borrow_mut();
Ok(lock.update(f))
/// Panics if the app has been dropped since this was created.
pub fn update<R>(&self, f: impl FnOnce(&mut App) -> R) -> R {
self.app.update(f).unwrap()
}
/// Open a window with the given options based on the root view returned by the given function.
@@ -163,12 +117,7 @@ impl AsyncApp {
where
V: 'static + Render,
{
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut lock = app.borrow_mut();
lock.open_window(options, build_root_view)
self.app.open_window(options, build_root_view).unwrap()
}
/// Schedule a future to be polled in the background.
@@ -178,31 +127,21 @@ impl AsyncApp {
Fut: Future<Output = R> + 'static,
R: 'static,
{
self.foreground_executor.spawn(f(self.clone()))
self.app.spawn(f)
}
/// Determine whether global state of the specified type has been assigned.
/// Returns an error if the `App` has been dropped.
pub fn has_global<G: Global>(&self) -> Result<bool> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let app = app.borrow_mut();
Ok(app.has_global::<G>())
pub fn has_global<G: Global>(&self) -> bool {
self.app.has_global::<G>().unwrap()
}
/// Reads the global state of the specified type, passing it to the given callback.
///
/// Panics if no global state of the specified type has been assigned.
/// Returns an error if the `App` has been dropped.
pub fn read_global<G: Global, R>(&self, read: impl FnOnce(&G, &App) -> R) -> Result<R> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let app = app.borrow_mut();
Ok(read(app.global(), &app))
pub fn read_global<G: Global, R>(&self, read: impl FnOnce(&G, &App) -> R) -> R {
self.app.read_global(read).unwrap()
}
/// Reads the global state of the specified type, passing it to the given callback.
@@ -212,39 +151,32 @@ impl AsyncApp {
///
/// Returns an error if no state of the specified type has been assigned or the `App` has been dropped.
pub fn try_read_global<G: Global, R>(&self, read: impl FnOnce(&G, &App) -> R) -> Option<R> {
let app = self.app.upgrade()?;
let app = app.borrow_mut();
Some(read(app.try_global()?, &app))
self.app.try_read_global(read).unwrap()
}
/// A convenience method for [App::update_global]
/// for updating the global state of the specified type.
pub fn update_global<G: Global, R>(
&self,
update: impl FnOnce(&mut G, &mut App) -> R,
) -> Result<R> {
let app = self
.app
.upgrade()
.ok_or_else(|| anyhow!("app was released"))?;
let mut app = app.borrow_mut();
Ok(app.update(|cx| cx.update_global(update)))
pub fn update_global<G: Global, R>(&self, update: impl FnOnce(&mut G, &mut App) -> R) -> R {
self.app.update_global(update).unwrap()
}
}
/// A cloneable, owned handle to the application context,
/// composed with the window associated with the current task.
#[derive(Clone, Deref, DerefMut)]
#[derive(Deref, DerefMut)]
pub struct AsyncWindowContext {
#[deref]
#[deref_mut]
app: AsyncApp,
window: AnyWindowHandle,
cx: WeakAsyncWindowContext,
_not_clone: NotClone,
}
impl AsyncWindowContext {
pub(crate) fn new_context(app: AsyncApp, window: AnyWindowHandle) -> Self {
Self { app, window }
Self {
cx: WeakAsyncWindowContext::new_context(app.app, window),
_not_clone: NotClone,
}
}
/// Get the handle of the window this context is associated with.
@@ -253,33 +185,26 @@ impl AsyncWindowContext {
}
/// A convenience method for [`App::update_window`].
pub fn update<R>(&mut self, update: impl FnOnce(&mut Window, &mut App) -> R) -> Result<R> {
self.app
.update_window(self.window, |_, window, cx| update(window, cx))
pub fn update<R>(&mut self, update: impl FnOnce(&mut Window, &mut App) -> R) -> R {
self.cx.update(update).unwrap()
}
/// A convenience method for [`App::update_window`].
pub fn update_root<R>(
&mut self,
update: impl FnOnce(AnyView, &mut Window, &mut App) -> R,
) -> Result<R> {
self.app.update_window(self.window, update)
) -> R {
self.cx.update_root(update).unwrap()
}
/// A convenience method for [`Window::on_next_frame`].
pub fn on_next_frame(&mut self, f: impl FnOnce(&mut Window, &mut App) + 'static) {
self.window
.update(self, |_, window, _| window.on_next_frame(f))
.ok();
self.cx.on_next_frame(f).unwrap()
}
/// A convenience method for [`App::global`].
pub fn read_global<G: Global, R>(
&mut self,
read: impl FnOnce(&G, &Window, &App) -> R,
) -> Result<R> {
self.window
.update(self, |_, window, cx| read(cx.global(), window, cx))
pub fn read_global<G: Global, R>(&mut self, read: impl FnOnce(&G, &Window, &App) -> R) -> R {
self.cx.read_global(read).unwrap()
}
/// A convenience method for [`App::update_global`].
@@ -287,13 +212,11 @@ impl AsyncWindowContext {
pub fn update_global<G, R>(
&mut self,
update: impl FnOnce(&mut G, &mut Window, &mut App) -> R,
) -> Result<R>
) -> R
where
G: Global,
{
self.window.update(self, |_, window, cx| {
cx.update_global(|global, cx| update(global, window, cx))
})
self.cx.update_global(update).unwrap()
}
/// Schedule a future to be executed on the main thread. This is used for collecting
@@ -304,7 +227,7 @@ impl AsyncWindowContext {
Fut: Future<Output = R> + 'static,
R: 'static,
{
self.foreground_executor.spawn(f(self.clone()))
self.cx.spawn(f)
}
/// Present a platform dialog.
@@ -321,58 +244,50 @@ impl AsyncWindowContext {
.update(self, |_, window, cx| {
window.prompt(level, message, detail, answers, cx)
})
.unwrap_or_else(|_| oneshot::channel().1)
.unwrap()
}
}
impl AppContext for AsyncWindowContext {
type Result<T> = Result<T>;
fn new<T>(&mut self, build_entity: impl FnOnce(&mut Context<'_, T>) -> T) -> Result<Entity<T>>
fn new<T>(&mut self, build_entity: impl FnOnce(&mut Context<'_, T>) -> T) -> Entity<T>
where
T: 'static,
{
self.window.update(self, |_, _, cx| cx.new(build_entity))
self.cx.new(build_entity).unwrap()
}
fn reserve_entity<T: 'static>(&mut self) -> Result<Reservation<T>> {
self.window.update(self, |_, _, cx| cx.reserve_entity())
fn reserve_entity<T: 'static>(&mut self) -> Reservation<T> {
self.cx.reserve_entity().unwrap()
}
fn insert_entity<T: 'static>(
&mut self,
reservation: Reservation<T>,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>> {
self.window
.update(self, |_, _, cx| cx.insert_entity(reservation, build_entity))
) -> Entity<T> {
self.cx.insert_entity(reservation, build_entity).unwrap()
}
fn update_entity<T: 'static, R>(
&mut self,
handle: &Entity<T>,
update: impl FnOnce(&mut T, &mut Context<'_, T>) -> R,
) -> Result<R> {
self.window
.update(self, |_, _, cx| cx.update_entity(handle, update))
) -> R {
self.cx.update_entity(handle, update).unwrap()
}
fn read_entity<T, R>(
&self,
handle: &Entity<T>,
read: impl FnOnce(&T, &App) -> R,
) -> Self::Result<R>
fn read_entity<T, R>(&self, handle: &Entity<T>, read: impl FnOnce(&T, &App) -> R) -> R
where
T: 'static,
{
self.app.read_entity(handle, read)
self.cx.read_entity(handle, read).unwrap()
}
fn update_window<T, F>(&mut self, window: AnyWindowHandle, update: F) -> Result<T>
where
F: FnOnce(AnyView, &mut Window, &mut App) -> T,
{
self.app.update_window(window, update)
self.cx.update_window(window, update)
}
fn read_window<T, R>(
@@ -383,21 +298,21 @@ impl AppContext for AsyncWindowContext {
where
T: 'static,
{
self.app.read_window(window, read)
self.cx.read_window(window, read)
}
fn background_spawn<R>(&self, future: impl Future<Output = R> + Send + 'static) -> Task<R>
where
R: Send + 'static,
{
self.app.background_executor.spawn(future)
self.cx.background_spawn(future)
}
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> Result<R>
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> R
where
G: Global,
{
self.app.read_global(callback)
self.app.read_global(callback).unwrap()
}
}
@@ -409,38 +324,432 @@ impl VisualContext for AsyncWindowContext {
fn new_window_entity<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Window, &mut Context<T>) -> T,
) -> Self::Result<Entity<T>> {
self.window
.update(self, |_, window, cx| cx.new(|cx| build_entity(window, cx)))
) -> Entity<T> {
self.cx.new_window_entity(build_entity).unwrap()
}
fn update_window_entity<T: 'static, R>(
&mut self,
view: &Entity<T>,
update: impl FnOnce(&mut T, &mut Window, &mut Context<T>) -> R,
) -> Self::Result<R> {
self.window.update(self, |_, window, cx| {
view.update(cx, |entity, cx| update(entity, window, cx))
})
) -> R {
self.cx.update_window_entity(view, update).unwrap()
}
fn replace_root_view<V>(
&mut self,
build_view: impl FnOnce(&mut Window, &mut Context<V>) -> V,
) -> Self::Result<Entity<V>>
) -> Entity<V>
where
V: 'static + Render,
{
self.window
.update(self, |_, window, cx| window.replace_root(cx, build_view))
self.cx.replace_root_view(build_view).unwrap()
}
fn focus<V>(&mut self, view: &Entity<V>) -> Self::Result<()>
fn focus<V>(&mut self, view: &Entity<V>)
where
V: Focusable,
{
self.window.update(self, |_, window, cx| {
view.read(cx).focus_handle(cx).clone().focus(window);
self.cx.focus(view).unwrap()
}
}
/// An async-friendly version of [App] with a static lifetime so it can be held across `await` points in async code.
/// You're provided with an instance when calling [App::spawn], and you can also create one with [App::to_async].
/// Internally, this holds a weak reference to an `App`, so its methods are fallible to protect against cases where the [App] is dropped.
pub struct WeakAsyncApp {
pub(crate) app: Weak<AppCell>,
pub(crate) background_executor: BackgroundExecutor,
pub(crate) foreground_executor: ForegroundExecutor,
}
impl Clone for WeakAsyncApp {
fn clone(&self) -> Self {
Self {
app: self.app.clone(),
background_executor: self.background_executor.clone(),
foreground_executor: self.foreground_executor.clone(),
}
}
}
impl WeakAsyncApp {
pub(crate) fn upgrade(self) -> AsyncApp {
AsyncApp {
app: self,
_not_clone: NotClone,
}
}
fn new<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Result<Entity<T>> {
let app = self.app.upgrade().context("App dropped")?;
let mut app = app.borrow_mut();
Ok(app.new(build_entity))
}
fn reserve_entity<T: 'static>(&mut self) -> Result<Reservation<T>> {
let app = self.app.upgrade().context("App dropped")?;
let mut app = app.borrow_mut();
Ok(app.reserve_entity())
}
fn insert_entity<T: 'static>(
&mut self,
reservation: Reservation<T>,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Result<Entity<T>> {
let app = self.app.upgrade().context("App dropped")?;
let mut app = app.borrow_mut();
Ok(app.insert_entity(reservation, build_entity))
}
fn update_entity<T: 'static, R>(
&mut self,
handle: &Entity<T>,
update: impl FnOnce(&mut T, &mut Context<'_, T>) -> R,
) -> Result<R> {
let app = self.app.upgrade().context("App dropped")?;
let mut app = app.borrow_mut();
Ok(app.update_entity(handle, update))
}
fn read_entity<T, R>(
&self,
handle: &Entity<T>,
callback: impl FnOnce(&T, &App) -> R,
) -> Result<R>
where
T: 'static,
{
let app = self.app.upgrade().context("App dropped")?;
let lock = app.borrow();
Ok(lock.read_entity(handle, callback))
}
fn update_window<T, F>(&mut self, window: AnyWindowHandle, f: F) -> Result<Result<T>>
where
F: FnOnce(AnyView, &mut Window, &mut App) -> T,
{
let app = self.app.upgrade().context("App dropped")?;
let mut lock = app.borrow_mut();
Ok(lock.update_window(window, f))
}
fn read_window<T, R>(
&self,
window: &WindowHandle<T>,
read: impl FnOnce(Entity<T>, &App) -> R,
) -> Result<Result<R>>
where
T: 'static,
{
let app = self.app.upgrade().context("App dropped")?;
let lock = app.borrow();
Ok(lock.read_window(window, read))
}
fn background_spawn<R>(&self, future: impl Future<Output = R> + Send + 'static) -> Task<R>
where
R: Send + 'static,
{
self.background_executor.spawn(future)
}
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> Result<R>
where
G: Global,
{
let app = self.app.upgrade().context("App dropped")?;
let mut lock = app.borrow_mut();
Ok(lock.update(|this| this.read_global(callback)))
}
/// Schedules all windows in the application to be redrawn.
pub fn refresh(&self) -> Result<()> {
let app = self.app.upgrade().context("App dropped")?;
let mut lock = app.borrow_mut();
Ok(lock.refresh_windows())
}
/// Get an executor which can be used to spawn futures in the background.
pub fn background_executor(&self) -> &BackgroundExecutor {
&self.background_executor
}
/// Get an executor which can be used to spawn futures in the foreground.
pub fn foreground_executor(&self) -> &ForegroundExecutor {
&self.foreground_executor
}
/// Invoke the given function in the context of the app, then flush any effects produced during its invocation.
/// Returns an error if the app has been dropped since this was created.
pub fn update<R>(&self, f: impl FnOnce(&mut App) -> R) -> Result<R> {
let app = self.app.upgrade().context("App dropped")?;
let mut lock = app.borrow_mut();
Ok(f(&mut lock))
}
/// Open a window with the given options based on the root view returned by the given function.
pub fn open_window<V>(
&self,
options: crate::WindowOptions,
build_root_view: impl FnOnce(&mut Window, &mut App) -> Entity<V>,
) -> Result<Result<WindowHandle<V>>>
where
V: 'static + Render,
{
let app = self.app.upgrade().context("App dropped")?;
let mut lock = app.borrow_mut();
Ok(lock.open_window(options, build_root_view))
}
/// Schedule a future to be polled in the background.
#[track_caller]
pub fn spawn<Fut, R>(&self, f: impl FnOnce(AsyncApp) -> Fut) -> Task<R>
where
Fut: Future<Output = R> + 'static,
R: 'static,
{
self.foreground_executor
.spawn_with_context(ForegroundContext::app(&self.app), f(self.clone().upgrade()))
}
/// Determine whether global state of the specified type has been assigned.
/// Returns an error if the `App` has been dropped.
pub fn has_global<G: Global>(&self) -> Result<bool> {
let app = self.app.upgrade().context("App dropped")?;
let app = app.borrow_mut();
Ok(app.has_global::<G>())
}
/// Reads the global state of the specified type, passing it to the given callback.
///
/// Similar to [`AsyncApp::read_global`], but returns `None` instead of panicking
/// if no state of the specified type has been assigned.
///
/// Returns an error if the `App` has been dropped.
pub fn try_read_global<G: Global, R>(
&self,
read: impl FnOnce(&G, &App) -> R,
) -> Result<Option<R>> {
let app = self.app.upgrade().context("App dropped")?;
let app = app.borrow_mut();
let Some(global) = app.try_global::<G>() else {
return Ok(None);
};
Ok(Some(read(global, &app)))
}
/// A convenience method for [App::update_global]
/// for updating the global state of the specified type.
pub fn update_global<G: Global, R>(
&self,
update: impl FnOnce(&mut G, &mut App) -> R,
) -> Result<R> {
let app = self.app.upgrade().context("App dropped")?;
let mut app = app.borrow_mut();
Ok(app.update(|cx| cx.update_global(update)))
}
}
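
Every method on `WeakAsyncApp` above follows the same upgrade-then-borrow shape: upgrade the weak pointer (erroring if the `App` was dropped), borrow the cell, run the callback. A self-contained sketch of that shape using plain `Rc`/`Weak` — the `App` and `WeakHandle` names here are stand-ins for illustration, not the GPUI types:

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

struct App {
    counter: u32,
}

// Hypothetical mirror of WeakAsyncApp: hold a weak pointer and make
// every method fallible, failing once the App has been dropped.
struct WeakHandle {
    app: Weak<RefCell<App>>,
}

impl WeakHandle {
    fn update<R>(&self, f: impl FnOnce(&mut App) -> R) -> Result<R, &'static str> {
        // Upgrade-or-error, then borrow mutably and run the callback.
        let app = self.app.upgrade().ok_or("App dropped")?;
        let mut app = app.borrow_mut();
        Ok(f(&mut app))
    }
}

fn main() {
    let app = Rc::new(RefCell::new(App { counter: 0 }));
    let handle = WeakHandle { app: Rc::downgrade(&app) };
    assert_eq!(handle.update(|a| { a.counter += 1; a.counter }), Ok(1));
    drop(app); // once the last strong reference is gone...
    assert!(handle.update(|a| a.counter).is_err()); // ...updates fail
}
```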
/// A cloneable, owned handle to the application context,
/// composed with the window associated with the current task.
#[derive(Clone, Deref, DerefMut)]
pub struct WeakAsyncWindowContext {
#[deref]
#[deref_mut]
app: WeakAsyncApp,
window: AnyWindowHandle,
}
impl WeakAsyncWindowContext {
pub(crate) fn new_context(app: WeakAsyncApp, window: AnyWindowHandle) -> Self {
Self { app, window }
}
pub(crate) fn upgrade(self) -> AsyncWindowContext {
AsyncWindowContext {
cx: self,
_not_clone: NotClone,
}
}
/// Get the handle of the window this context is associated with.
pub fn window_handle(&self) -> AnyWindowHandle {
self.window
}
/// A convenience method for [`App::update_window`].
pub fn update<R>(&mut self, update: impl FnOnce(&mut Window, &mut App) -> R) -> Result<R> {
crate::Flatten::flatten(
self.app
.update_window(self.window, |_, window, cx| update(window, cx)),
)
}
/// A convenience method for [`App::update_window`].
pub fn update_root<R>(
&mut self,
update: impl FnOnce(AnyView, &mut Window, &mut App) -> R,
) -> Result<R> {
crate::Flatten::flatten(self.app.update_window(self.window, update))
}
/// A convenience method for [`Window::on_next_frame`].
pub fn on_next_frame(&mut self, f: impl FnOnce(&mut Window, &mut App) + 'static) -> Result<()> {
self.update_window(self.window, |_, window, _| window.on_next_frame(f))
}
/// A convenience method for [`App::global`].
pub fn read_global<G: Global, R>(
&mut self,
read: impl FnOnce(&G, &Window, &App) -> R,
) -> Result<R> {
self.update_window(self.window, |_, window, cx| read(cx.global(), window, cx))
}
/// A convenience method for [`App::update_global`]
/// for updating the global state of the specified type.
pub fn update_global<G, R>(
&mut self,
update: impl FnOnce(&mut G, &mut Window, &mut App) -> R,
) -> Result<R>
where
G: Global,
{
self.update_window(self.window, |_, window, cx| {
cx.update_global(|global, cx| update(global, window, cx))
})
}
/// Schedule a future to be executed on the main thread. This is used for collecting
/// the results of background tasks and updating the UI.
#[track_caller]
pub fn spawn<Fut, R>(&self, f: impl FnOnce(AsyncWindowContext) -> Fut) -> Task<R>
where
Fut: Future<Output = R> + 'static,
R: 'static,
{
self.foreground_executor.spawn_with_context(
ForegroundContext::window(&self.app.app, self.window.id),
f(self.clone().upgrade()),
)
}
/// Present a platform dialog.
/// The provided message will be presented, along with buttons for each answer.
/// When a button is clicked, the returned Receiver will receive the index of the clicked button.
pub fn prompt(
&mut self,
level: PromptLevel,
message: &str,
detail: Option<&str>,
answers: &[&str],
) -> Result<oneshot::Receiver<usize>> {
self.update_window(self.window, |_, window, cx| {
window.prompt(level, message, detail, answers, cx)
})
}
fn new<T>(&mut self, build_entity: impl FnOnce(&mut Context<'_, T>) -> T) -> Result<Entity<T>>
where
T: 'static,
{
self.update_window(self.window, |_, _, cx| cx.new(build_entity))
}
fn reserve_entity<T: 'static>(&mut self) -> Result<Reservation<T>> {
self.update_window(self.window, |_, _, cx| cx.reserve_entity())
}
fn insert_entity<T: 'static>(
&mut self,
reservation: Reservation<T>,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Result<Entity<T>> {
self.update_window(self.window, |_, _, cx| {
cx.insert_entity(reservation, build_entity)
})
}
fn update_entity<T: 'static, R>(
&mut self,
handle: &Entity<T>,
update: impl FnOnce(&mut T, &mut Context<'_, T>) -> R,
) -> Result<R> {
self.update_window(self.window, |_, _, cx| cx.update_entity(handle, update))
}
fn read_entity<T, R>(&self, handle: &Entity<T>, read: impl FnOnce(&T, &App) -> R) -> Result<R>
where
T: 'static,
{
self.app.read_entity(handle, read)
}
fn update_window<T, F>(&mut self, window: AnyWindowHandle, update: F) -> Result<T>
where
F: FnOnce(AnyView, &mut Window, &mut App) -> T,
{
crate::Flatten::flatten(self.app.update_window(window, update))
}
fn read_window<T, R>(
&self,
window: &WindowHandle<T>,
read: impl FnOnce(Entity<T>, &App) -> R,
) -> Result<R>
where
T: 'static,
{
crate::Flatten::flatten(self.app.read_window(window, read))
}
fn background_spawn<R>(&self, future: impl Future<Output = R> + Send + 'static) -> Task<R>
where
R: Send + 'static,
{
self.app.background_executor.spawn(future)
}
fn new_window_entity<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Window, &mut Context<T>) -> T,
) -> Result<Entity<T>> {
self.update_window(self.window, |_, window, cx| {
cx.new(|cx| build_entity(window, cx))
})
}
fn update_window_entity<T: 'static, R>(
&mut self,
view: &Entity<T>,
update: impl FnOnce(&mut T, &mut Window, &mut Context<T>) -> R,
) -> Result<R> {
self.update(|window, cx| view.update(cx, |entity, cx| update(entity, window, cx)))
}
fn replace_root_view<V>(
&mut self,
build_view: impl FnOnce(&mut Window, &mut Context<V>) -> V,
) -> Result<Entity<V>>
where
V: 'static + Render,
{
self.update_window(self.window, |_, window, cx| {
window.replace_root(cx, build_view)
})
}
fn focus<V>(&mut self, view: &Entity<V>) -> Result<()>
where
V: Focusable,
{
self.update_window(self.window, |_, window, cx| {
view.read(cx).focus_handle(cx).clone().focus(window)
})
}
}
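
The window methods above route through `update_window`, which produces a nested `Result<Result<T>>` — the outer layer fails if the app was dropped, the inner if the window is gone — and `crate::Flatten::flatten` collapses the two into one `Result`. A minimal sketch of that collapse, using `String` as a stand-in error type:

```rust
// Hypothetical stand-in for crate::Flatten::flatten: collapse
// Result<Result<T>> (app alive? / window alive?) into one Result.
fn flatten<T>(nested: Result<Result<T, String>, String>) -> Result<T, String> {
    match nested {
        Ok(inner) => inner, // outer succeeded; propagate the inner result
        Err(e) => Err(e),   // outer failed (e.g. "app dropped")
    }
}

fn main() {
    assert_eq!(flatten::<i32>(Ok(Ok(5))), Ok(5));
    assert_eq!(
        flatten::<i32>(Ok(Err("window closed".into()))),
        Err("window closed".to_string())
    );
    assert_eq!(
        flatten::<i32>(Err("app dropped".into())),
        Err("app dropped".to_string())
    );
}
```

This is equivalent to `nested.and_then(|inner| inner)`; the explicit match just makes the two failure layers visible.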


@@ -680,8 +680,6 @@ impl<T> Context<'_, T> {
}
impl<T> AppContext for Context<'_, T> {
type Result<U> = U;
fn new<U: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Context<'_, U>) -> U,
@@ -697,7 +695,7 @@ impl<T> AppContext for Context<'_, T> {
&mut self,
reservation: Reservation<U>,
build_entity: impl FnOnce(&mut Context<'_, U>) -> U,
) -> Self::Result<Entity<U>> {
) -> Entity<U> {
self.app.insert_entity(reservation, build_entity)
}
@@ -709,11 +707,7 @@ impl<T> AppContext for Context<'_, T> {
self.app.update_entity(handle, update)
}
fn read_entity<U, R>(
&self,
handle: &Entity<U>,
read: impl FnOnce(&U, &App) -> R,
) -> Self::Result<R>
fn read_entity<U, R>(&self, handle: &Entity<U>, read: impl FnOnce(&U, &App) -> R) -> R
where
U: 'static,
{
@@ -745,7 +739,7 @@ impl<T> AppContext for Context<'_, T> {
self.app.background_executor.spawn(future)
}
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> Self::Result<R>
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> R
where
G: Global,
{


@@ -418,11 +418,7 @@ impl<T: 'static> Entity<T> {
}
/// Read the entity referenced by this handle with the given function.
pub fn read_with<R, C: AppContext>(
&self,
cx: &C,
f: impl FnOnce(&T, &App) -> R,
) -> C::Result<R> {
pub fn read_with<R, C: AppContext>(&self, cx: &C, f: impl FnOnce(&T, &App) -> R) -> R {
cx.read_entity(self, f)
}
@@ -435,7 +431,7 @@ impl<T: 'static> Entity<T> {
&self,
cx: &mut C,
update: impl FnOnce(&mut T, &mut Context<'_, T>) -> R,
) -> C::Result<R>
) -> R
where
C: AppContext,
{
@@ -449,7 +445,7 @@ impl<T: 'static> Entity<T> {
&self,
cx: &mut C,
update: impl FnOnce(&mut T, &mut Window, &mut Context<'_, T>) -> R,
) -> C::Result<R>
) -> R
where
C: VisualContext,
{
@@ -646,13 +642,10 @@ impl<T: 'static> WeakEntity<T> {
) -> Result<R>
where
C: AppContext,
Result<C::Result<R>>: crate::Flatten<R>,
{
crate::Flatten::flatten(
self.upgrade()
.ok_or_else(|| anyhow!("entity released"))
.map(|this| cx.update_entity(&this, update)),
)
self.upgrade()
.ok_or_else(|| anyhow!("entity released"))
.map(|this| cx.update_entity(&this, update))
}
/// Updates the entity referenced by this handle with the given function if
@@ -665,14 +658,13 @@ impl<T: 'static> WeakEntity<T> {
) -> Result<R>
where
C: VisualContext,
Result<C::Result<R>>: crate::Flatten<R>,
{
let window = cx.window_handle();
let this = self.upgrade().ok_or_else(|| anyhow!("entity released"))?;
crate::Flatten::flatten(window.update(cx, |_, window, cx| {
window.update(cx, |_, window, cx| {
this.update(cx, |entity, cx| update(entity, window, cx))
}))
})
}
/// Reads the entity referenced by this handle with the given function if
@@ -681,13 +673,10 @@ impl<T: 'static> WeakEntity<T> {
pub fn read_with<C, R>(&self, cx: &C, read: impl FnOnce(&T, &App) -> R) -> Result<R>
where
C: AppContext,
Result<C::Result<R>>: crate::Flatten<R>,
{
crate::Flatten::flatten(
self.upgrade()
.ok_or_else(|| anyhow!("entity release"))
.map(|this| cx.read_entity(&this, read)),
)
self.upgrade()
.ok_or_else(|| anyhow!("entity release"))
.map(|this| cx.read_entity(&this, read))
}
}
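
The rewritten `WeakEntity` methods above drop the `Flatten` bound entirely: with `AppContext` methods now infallible, the only failure left is the upgrade itself. A minimal stdlib sketch of that upgrade-or-error shape (`update_weak` is a hypothetical name, not a GPUI API):

```rust
use std::rc::{Rc, Weak};

// Sketch of the post-change WeakEntity::update shape: no Flatten,
// just upgrade-or-error followed by an (now infallible) callback.
fn update_weak<T, R>(weak: &Weak<T>, f: impl FnOnce(&T) -> R) -> Result<R, String> {
    weak.upgrade()
        .ok_or_else(|| "entity released".to_string())
        .map(|strong| f(&strong))
}

fn main() {
    let entity = Rc::new(10);
    let weak = Rc::downgrade(&entity);
    assert_eq!(update_weak(&weak, |v| v + 1), Ok(11));
    drop(entity); // entity released: the upgrade now fails
    assert!(update_weak(&weak, |v| v + 1).is_err());
}
```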


@@ -1,9 +1,9 @@
use crate::{
Action, AnyView, AnyWindowHandle, App, AppCell, AppContext, AsyncApp, AvailableSpace,
BackgroundExecutor, BorrowAppContext, Bounds, ClipboardItem, DrawPhase, Drawable, Element,
Empty, EventEmitter, ForegroundExecutor, Global, InputEvent, Keystroke, Modifiers,
ModifiersChangedEvent, MouseButton, MouseDownEvent, MouseMoveEvent, MouseUpEvent, Pixels,
Platform, Point, Render, Result, Size, Task, TestDispatcher, TestPlatform,
Empty, EventEmitter, ForegroundContext, ForegroundExecutor, Global, InputEvent, Keystroke,
Modifiers, ModifiersChangedEvent, MouseButton, MouseDownEvent, MouseMoveEvent, MouseUpEvent,
Pixels, Platform, Point, Render, Result, Size, Task, TestDispatcher, TestPlatform,
TestScreenCaptureSource, TestWindow, TextSystem, VisualContext, Window, WindowBounds,
WindowHandle, WindowOptions,
};
@@ -13,7 +13,6 @@ use std::{cell::RefCell, future::Future, ops::Deref, rc::Rc, sync::Arc, time::Du
/// A TestAppContext is provided to tests created with `#[gpui::test]`; it provides
/// an implementation of `Context` with additional methods that are useful in tests.
#[derive(Clone)]
pub struct TestAppContext {
#[doc(hidden)]
pub app: Rc<AppCell>,
@@ -29,18 +28,31 @@ pub struct TestAppContext {
on_quit: Rc<RefCell<Vec<Box<dyn FnOnce() + 'static>>>>,
}
impl AppContext for TestAppContext {
type Result<T> = T;
impl Clone for TestAppContext {
fn clone(&self) -> Self {
Self {
app: self.app.clone(),
background_executor: self.background_executor.clone(),
foreground_executor: self.foreground_executor.clone(),
dispatcher: self.dispatcher.clone(),
test_platform: self.test_platform.clone(),
text_system: self.text_system.clone(),
fn_name: self.fn_name.clone(),
on_quit: self.on_quit.clone(),
}
}
}
impl AppContext for TestAppContext {
fn new<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>> {
) -> Entity<T> {
let mut app = self.app.borrow_mut();
app.new(build_entity)
}
fn reserve_entity<T: 'static>(&mut self) -> Self::Result<crate::Reservation<T>> {
fn reserve_entity<T: 'static>(&mut self) -> crate::Reservation<T> {
let mut app = self.app.borrow_mut();
app.reserve_entity()
}
@@ -49,7 +61,7 @@ impl AppContext for TestAppContext {
&mut self,
reservation: crate::Reservation<T>,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>> {
) -> Entity<T> {
let mut app = self.app.borrow_mut();
app.insert_entity(reservation, build_entity)
}
@@ -58,16 +70,12 @@ impl AppContext for TestAppContext {
&mut self,
handle: &Entity<T>,
update: impl FnOnce(&mut T, &mut Context<'_, T>) -> R,
) -> Self::Result<R> {
) -> R {
let mut app = self.app.borrow_mut();
app.update_entity(handle, update)
}
fn read_entity<T, R>(
&self,
handle: &Entity<T>,
read: impl FnOnce(&T, &App) -> R,
) -> Self::Result<R>
fn read_entity<T, R>(&self, handle: &Entity<T>, read: impl FnOnce(&T, &App) -> R) -> R
where
T: 'static,
{
@@ -102,7 +110,7 @@ impl AppContext for TestAppContext {
self.background_executor.spawn(future)
}
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> Self::Result<R>
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> R
where
G: Global,
{
@@ -329,7 +337,10 @@ impl TestAppContext {
Fut: Future<Output = R> + 'static,
R: 'static,
{
self.foreground_executor.spawn(f(self.to_async()))
self.foreground_executor.spawn_with_context(
ForegroundContext::app(&Rc::downgrade(&self.app)),
f(self.to_async()),
)
}
/// true if the given global is defined
@@ -363,14 +374,15 @@ impl TestAppContext {
lock.update(|cx| cx.update_global(update))
}
/// Returns an `AsyncApp` which can be used to run tasks that expect to be on a background
/// Returns an `AsyncApp` which can be used to run tasks that expect to be on a background
/// thread on the current thread in tests.
pub fn to_async(&self) -> AsyncApp {
AsyncApp {
WeakAsyncApp {
app: Rc::downgrade(&self.app),
background_executor: self.background_executor.clone(),
foreground_executor: self.foreground_executor.clone(),
}
.upgrade()
}
/// Wait until there are no more pending tasks.
@@ -644,7 +656,7 @@ impl<V> Entity<V> {
use derive_more::{Deref, DerefMut};
use super::{Context, Entity};
use super::{Context, Entity, WeakAsyncApp};
#[derive(Deref, DerefMut, Clone)]
/// A VisualTestContext is the test-equivalent of a `Window` and `App`. It allows you to
/// run window-specific test code. It can be dereferenced to a `TestAppContext`.
@@ -868,16 +880,14 @@ impl VisualTestContext {
}
impl AppContext for VisualTestContext {
type Result<T> = <TestAppContext as AppContext>::Result<T>;
fn new<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>> {
) -> Entity<T> {
self.cx.new(build_entity)
}
fn reserve_entity<T: 'static>(&mut self) -> Self::Result<crate::Reservation<T>> {
fn reserve_entity<T: 'static>(&mut self) -> crate::Reservation<T> {
self.cx.reserve_entity()
}
@@ -885,7 +895,7 @@ impl AppContext for VisualTestContext {
&mut self,
reservation: crate::Reservation<T>,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>> {
) -> Entity<T> {
self.cx.insert_entity(reservation, build_entity)
}
@@ -893,18 +903,14 @@ impl AppContext for VisualTestContext {
&mut self,
handle: &Entity<T>,
update: impl FnOnce(&mut T, &mut Context<'_, T>) -> R,
) -> Self::Result<R>
) -> R
where
T: 'static,
{
self.cx.update_entity(handle, update)
}
fn read_entity<T, R>(
&self,
handle: &Entity<T>,
read: impl FnOnce(&T, &App) -> R,
) -> Self::Result<R>
fn read_entity<T, R>(&self, handle: &Entity<T>, read: impl FnOnce(&T, &App) -> R) -> R
where
T: 'static,
{
@@ -936,7 +942,7 @@ impl AppContext for VisualTestContext {
self.cx.background_spawn(future)
}
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> Self::Result<R>
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> R
where
G: Global,
{
@@ -953,7 +959,7 @@ impl VisualContext for VisualTestContext {
fn new_window_entity<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Window, &mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>> {
) -> Entity<T> {
self.window
.update(&mut self.cx, |_, window, cx| {
cx.new(|cx| build_entity(window, cx))
@@ -965,7 +971,7 @@ impl VisualContext for VisualTestContext {
&mut self,
view: &Entity<V>,
update: impl FnOnce(&mut V, &mut Window, &mut Context<V>) -> R,
) -> Self::Result<R> {
) -> R {
self.window
.update(&mut self.cx, |_, window, cx| {
view.update(cx, |v, cx| update(v, window, cx))
@@ -976,7 +982,7 @@ impl VisualContext for VisualTestContext {
fn replace_root_view<V>(
&mut self,
build_view: impl FnOnce(&mut Window, &mut Context<V>) -> V,
) -> Self::Result<Entity<V>>
) -> Entity<V>
where
V: 'static + Render,
{
@@ -987,7 +993,7 @@ impl VisualContext for VisualTestContext {
.unwrap()
}
fn focus<V: crate::Focusable>(&mut self, view: &Entity<V>) -> Self::Result<()> {
fn focus<V: crate::Focusable>(&mut self, view: &Entity<V>) {
self.window
.update(&mut self.cx, |_, window, cx| {
view.read(cx).focus_handle(cx).clone().focus(window)


@@ -1948,6 +1948,14 @@ impl Interactivity {
if pending_mouse_down.is_some() && hitbox.is_hovered(window) {
captured_mouse_down = pending_mouse_down.take();
window.refresh();
} else if pending_mouse_down.is_some() {
// Clear the pending mouse down event (without firing click handlers)
// if the hitbox is not being hovered.
// This avoids dragging elements that changed their position
// immediately after being clicked.
// See https://github.com/zed-industries/zed/issues/24600 for more details
pending_mouse_down.take();
window.refresh();
}
}
// Fire click handlers during the bubble phase.
@@ -2501,8 +2509,7 @@ fn handle_tooltip_mouse_move(
});
*active_tooltip.borrow_mut() = new_tooltip;
window.refresh();
})
.ok();
});
}
});
active_tooltip
@@ -2569,7 +2576,7 @@ fn handle_tooltip_check_visible_and_update(
.timer(HOVERABLE_TOOLTIP_HIDE_DELAY)
.await;
if active_tooltip.borrow_mut().take().is_some() {
cx.update(|window, _cx| window.refresh()).ok();
cx.update(|window, _cx| window.refresh());
}
}
});


@@ -361,8 +361,7 @@ impl Element for Img {
cx.background_executor().timer(LOADING_DELAY).await;
cx.update(move |_, cx| {
cx.notify(current_view);
})
.ok();
});
});
state.started_loading = Some((Instant::now(), task));
}


@@ -1,14 +1,11 @@
use crate::{App, PlatformDispatcher};
use async_task::Runnable;
use crate::{App, ForegroundContext, NotClone, PlatformDispatcher};
use async_task::Builder;
use futures::channel::mpsc;
use smol::prelude::*;
use std::mem::ManuallyDrop;
use std::panic::Location;
use std::thread::{self, ThreadId};
use std::{
fmt::Debug,
marker::PhantomData,
mem,
mem::{self, ManuallyDrop},
num::NonZeroUsize,
pin::Pin,
rc::Rc,
@@ -17,6 +14,7 @@ use std::{
Arc,
},
task::{Context, Poll},
thread::ThreadId,
time::{Duration, Instant},
};
use util::TryFutureExt;
@@ -27,10 +25,10 @@ use rand::rngs::StdRng;
/// A pointer to the executor that is currently running,
/// for spawning background tasks.
#[derive(Clone)]
pub struct BackgroundExecutor {
#[doc(hidden)]
pub dispatcher: Arc<dyn PlatformDispatcher>,
_not_clone: NotClone,
}
/// A pointer to the executor that is currently running,
@@ -40,10 +38,10 @@ pub struct BackgroundExecutor {
/// `ForegroundExecutor::spawn` does not require `Send` but checks at runtime that the future is
/// only polled from the same thread it was spawned from. These checks would fail when spawning
/// foreground tasks from background threads.
#[derive(Clone)]
pub struct ForegroundExecutor {
#[doc(hidden)]
pub dispatcher: Arc<dyn PlatformDispatcher>,
_not_clone: NotClone,
not_send: PhantomData<Rc<()>>,
}
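
The thread-affinity guarantee described above — foreground futures may only be polled (or dropped) on the thread that spawned them — boils down to recording and comparing `ThreadId`s. A stripped-down, stdlib-only sketch of that check (the `Checked` name echoes the wrapper later in this diff, but this version is hypothetical and only reports a boolean rather than panicking):

```rust
use std::thread::{self, ThreadId};

// Record the spawning thread's id; the real wrapper asserts this
// invariant on every poll and on drop, panicking with the spawn location.
struct Checked {
    id: ThreadId,
}

impl Checked {
    fn new() -> Self {
        Checked { id: thread::current().id() }
    }

    fn same_thread(&self) -> bool {
        self.id == thread::current().id()
    }
}

fn main() {
    let checked = Checked::new();
    assert!(checked.same_thread()); // polled on the spawning thread: ok

    let moved = checked;
    let ok = thread::spawn(move || moved.same_thread()).join().unwrap();
    assert!(!ok); // a different thread fails the check
}
```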
@@ -55,15 +53,18 @@ pub struct ForegroundExecutor {
/// the task to continue running, but with no way to return a value.
#[must_use]
#[derive(Debug)]
pub struct Task<T>(TaskState<T>);
pub struct Task<T>(TaskState<T, async_task::Task<T, ForegroundContext>, async_task::Task<T>>);
#[derive(Debug)]
enum TaskState<T> {
enum TaskState<T, F, B> {
/// A task that is ready to return a value
Ready(Option<T>),
/// A task that is currently running.
Spawned(async_task::Task<T>),
/// A task that is currently running on the foreground.
ForegroundSpawned(F),
/// A task that is currently running on the background
BackgroundSpawned(B),
}
impl<T> Task<T> {
@@ -76,7 +77,23 @@ impl<T> Task<T> {
pub fn detach(self) {
match self {
Task(TaskState::Ready(_)) => {}
Task(TaskState::Spawned(task)) => task.detach(),
Task(TaskState::ForegroundSpawned(task)) => task.detach(),
Task(TaskState::BackgroundSpawned(task)) => task.detach(),
}
}
/// Create a task that can be safely awaited outside of GPUI.
/// If the app quits while this task is incomplete, it will
/// produce None instead of panicking.
pub fn external(self) -> ExternalTask<T> {
match self.0 {
TaskState::Ready(r) => ExternalTask(TaskState::Ready(r)),
TaskState::ForegroundSpawned(task) => {
ExternalTask(TaskState::ForegroundSpawned(task.fallible()))
}
TaskState::BackgroundSpawned(task) => {
ExternalTask(TaskState::BackgroundSpawned(task.fallible()))
}
}
}
}
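
The contract `Task::external` documents above — `Some(value)` if the task completed, `None` (rather than a panic) if the app quit first — can be modeled with any channel whose sender may disappear. A stdlib-only sketch, with all names hypothetical:

```rust
use std::sync::mpsc;

// Stand-in for awaiting an ExternalTask: a dropped sender models
// "the app shut down before the task produced a value".
fn external_result<T>(rx: mpsc::Receiver<T>) -> Option<T> {
    rx.recv().ok()
}

fn main() {
    // Task completed: the awaiter sees Some(value).
    let (tx, rx) = mpsc::channel();
    tx.send(42).unwrap();
    assert_eq!(external_result(rx), Some(42));

    // App quit first (sender dropped): None instead of a panic.
    let (tx2, rx2) = mpsc::channel::<i32>();
    drop(tx2);
    assert_eq!(external_result(rx2), None);
}
```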
@@ -96,14 +113,34 @@ where
.detach();
}
}
impl<T> Future for Task<T> {
type Output = T;
fn poll(self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output> {
match unsafe { self.get_unchecked_mut() } {
Task(TaskState::Ready(val)) => Poll::Ready(val.take().unwrap()),
Task(TaskState::Spawned(task)) => task.poll(cx),
Task(TaskState::ForegroundSpawned(task)) => task.poll(cx),
Task(TaskState::BackgroundSpawned(task)) => task.poll(cx),
}
}
}
/// An ExternalTask is the same as a task, except it can be safely
/// awaited outside of GPUI, without panicking.
#[must_use]
#[derive(Debug)]
pub struct ExternalTask<T>(
TaskState<T, async_task::FallibleTask<T, ForegroundContext>, async_task::FallibleTask<T>>,
);
impl<T> Future for ExternalTask<T> {
type Output = Option<T>;
fn poll(self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output> {
match unsafe { self.get_unchecked_mut() } {
ExternalTask(TaskState::Ready(val)) => Poll::Ready(val.take()),
ExternalTask(TaskState::ForegroundSpawned(task)) => task.poll(cx),
ExternalTask(TaskState::BackgroundSpawned(task)) => task.poll(cx),
}
}
}
@@ -127,8 +164,6 @@ impl TaskLabel {
}
}
type AnyLocalFuture<R> = Pin<Box<dyn 'static + Future<Output = R>>>;
type AnyFuture<R> = Pin<Box<dyn 'static + Send + Future<Output = R>>>;
/// BackgroundExecutor lets you run things on background threads.
@@ -138,7 +173,20 @@ type AnyFuture<R> = Pin<Box<dyn 'static + Send + Future<Output = R>>>;
impl BackgroundExecutor {
#[doc(hidden)]
pub fn new(dispatcher: Arc<dyn PlatformDispatcher>) -> Self {
Self { dispatcher }
Self {
dispatcher,
_not_clone: NotClone,
}
}
/// Cloning executors can cause runtime panics; see the documentation on `NotClone` for details.
/// Use this power wisely.
#[doc(hidden)]
pub fn clone(&self) -> Self {
Self {
dispatcher: self.dispatcher.clone(),
_not_clone: NotClone,
}
}
/// Enqueues the given future to be run to completion on a background thread.
@@ -171,7 +219,7 @@ impl BackgroundExecutor {
let (runnable, task) =
async_task::spawn(future, move |runnable| dispatcher.dispatch(runnable, label));
runnable.schedule();
Task(TaskState::Spawned(task))
Task(TaskState::BackgroundSpawned(task))
}
/// Used by the test harness to run an async test in a synchronous fashion.
@@ -348,7 +396,7 @@ impl BackgroundExecutor {
move |runnable| dispatcher.dispatch_after(duration, runnable)
});
runnable.schedule();
Task(TaskState::Spawned(task))
Task(TaskState::BackgroundSpawned(task))
}
/// in tests, start_waiting lets you indicate which task is waiting (for debugging only)
@@ -447,6 +495,18 @@ impl ForegroundExecutor {
pub fn new(dispatcher: Arc<dyn PlatformDispatcher>) -> Self {
Self {
dispatcher,
_not_clone: NotClone,
not_send: PhantomData,
}
}
/// Cloning executors can cause runtime panics; see the documentation on `NotClone` for details.
/// Use this power wisely.
#[doc(hidden)]
pub fn clone(&self) -> Self {
Self {
dispatcher: self.dispatcher.clone(),
_not_clone: NotClone,
not_send: PhantomData,
}
}
@@ -457,86 +517,92 @@ impl ForegroundExecutor {
where
R: 'static,
{
let dispatcher = self.dispatcher.clone();
let mut context = ForegroundContext::none();
let task = self.spawn_internal(future, context);
Task(TaskState::ForegroundSpawned(task))
}
#[track_caller]
fn inner<R: 'static>(
dispatcher: Arc<dyn PlatformDispatcher>,
future: AnyLocalFuture<R>,
) -> Task<R> {
let (runnable, task) = spawn_local_with_source_location(future, move |runnable| {
dispatcher.dispatch_on_main_thread(runnable)
});
runnable.schedule();
Task(TaskState::Spawned(task))
/// Enqueues the given Task to run on the main thread at some point in the future,
/// with a context parameter that will be checked before each poll.
#[track_caller]
pub(crate) fn spawn_with_context<R>(
&self,
mut context: ForegroundContext,
future: impl Future<Output = R> + 'static,
) -> Task<R>
where
R: 'static,
{
let task = self.spawn_internal(future, context);
Task(TaskState::ForegroundSpawned(task))
}
#[track_caller]
fn spawn_internal<R>(
&self,
future: impl Future<Output = R> + 'static,
mut context: ForegroundContext,
) -> smol::Task<R, ForegroundContext>
where
R: 'static,
{
/// Declarations here are copy-modified from:
/// https://github.com/smol-rs/async-task/blob/ca9dbe1db9c422fd765847fa91306e30a6bb58a9/src/runnable.rs#L405
#[inline]
pub(crate) fn thread_id() -> ThreadId {
std::thread_local! {
static ID: ThreadId = std::thread::current().id();
}
ID.try_with(|id| *id)
.unwrap_or_else(|_| std::thread::current().id())
}
inner::<R>(dispatcher, Box::pin(future))
}
}
/// Variant of `async_task::spawn_local` that includes the source location of the spawn in panics.
///
/// Copy-modified from:
/// https://github.com/smol-rs/async-task/blob/ca9dbe1db9c422fd765847fa91306e30a6bb58a9/src/runnable.rs#L405
#[track_caller]
fn spawn_local_with_source_location<Fut, S>(
future: Fut,
schedule: S,
) -> (Runnable<()>, async_task::Task<Fut::Output, ()>)
where
Fut: Future + 'static,
Fut::Output: 'static,
S: async_task::Schedule<()> + Send + Sync + 'static,
{
#[inline]
fn thread_id() -> ThreadId {
std::thread_local! {
static ID: ThreadId = thread::current().id();
struct Checked<F> {
id: ThreadId,
location: core::panic::Location<'static>,
inner: ManuallyDrop<F>,
}
ID.try_with(|id| *id)
.unwrap_or_else(|_| thread::current().id())
}
struct Checked<F> {
id: ThreadId,
inner: ManuallyDrop<F>,
location: &'static Location<'static>,
}
impl<F> Drop for Checked<F> {
fn drop(&mut self) {
assert!(
self.id == thread_id(),
"local task dropped by a thread that didn't spawn it. Task spawned at {}",
self.location
);
unsafe {
ManuallyDrop::drop(&mut self.inner);
impl<F> Drop for Checked<F> {
fn drop(&mut self) {
assert!(
self.id == thread_id(),
"local task dropped by a thread that didn't spawn it. Task spawned at {}",
self.location
);
unsafe {
ManuallyDrop::drop(&mut self.inner);
}
}
}
}
impl<F: Future> Future for Checked<F> {
type Output = F::Output;
impl<F: Future> Future for Checked<F> {
type Output = F::Output;
fn poll(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
assert!(
self.id == thread_id(),
"local task polled by a thread that didn't spawn it. Task spawned at {}",
self.location
);
unsafe { self.map_unchecked_mut(|c| &mut *c.inner).poll(cx) }
fn poll(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
assert!(
self.id == thread_id(),
"local task polled by a thread that didn't spawn it. Task spawned at {}",
self.location
);
unsafe { self.map_unchecked_mut(|c| &mut *c.inner).poll(cx) }
}
}
let checked = Checked {
id: thread_id(),
location: *core::panic::Location::caller(),
inner: ManuallyDrop::new(future),
};
let dispatcher = self.dispatcher.clone();
let (runnable, task) = Builder::new().metadata(context).spawn_local(
|_| checked,
move |runnable| dispatcher.dispatch_on_main_thread(runnable),
);
runnable.schedule();
task
}
// Wrap the future into one that checks which thread it's on.
let future = Checked {
id: thread_id(),
inner: ManuallyDrop::new(future),
location: Location::caller(),
};
unsafe { async_task::spawn_unchecked(future, schedule) }
}
/// Scope manages a set of tasks that are enqueued and waited on together. See [`BackgroundExecutor::scoped`].
@@ -161,19 +161,13 @@ use taffy::TaffyLayoutEngine;
/// The context trait, allows the different contexts in GPUI to be used
/// interchangeably for certain operations.
pub trait AppContext {
/// The result type for this context, used for async contexts that
/// can't hold a direct reference to the application context.
type Result<T>;
/// Create a new entity in the app context.
fn new<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>>;
fn new<T: 'static>(&mut self, build_entity: impl FnOnce(&mut Context<'_, T>) -> T)
-> Entity<T>;
/// Reserve a slot for an entity to be inserted later.
/// The returned [Reservation] allows you to obtain the [EntityId] for the future entity.
fn reserve_entity<T: 'static>(&mut self) -> Self::Result<Reservation<T>>;
fn reserve_entity<T: 'static>(&mut self) -> Reservation<T>;
/// Insert a new entity in the app context based on a [Reservation] previously obtained from [`reserve_entity`].
///
@@ -182,23 +176,19 @@ pub trait AppContext {
&mut self,
reservation: Reservation<T>,
build_entity: impl FnOnce(&mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>>;
) -> Entity<T>;
/// Update an entity in the app context.
fn update_entity<T, R>(
&mut self,
handle: &Entity<T>,
update: impl FnOnce(&mut T, &mut Context<'_, T>) -> R,
) -> Self::Result<R>
) -> R
where
T: 'static;
/// Read an entity from the app context.
fn read_entity<T, R>(
&self,
handle: &Entity<T>,
read: impl FnOnce(&T, &App) -> R,
) -> Self::Result<R>
fn read_entity<T, R>(&self, handle: &Entity<T>, read: impl FnOnce(&T, &App) -> R) -> R
where
T: 'static;
@@ -222,7 +212,7 @@ pub trait AppContext {
R: Send + 'static;
/// Read a global from this app context
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> Self::Result<R>
fn read_global<G, R>(&self, callback: impl FnOnce(&G, &App) -> R) -> R
where
G: Global;
}
@@ -249,24 +239,24 @@ pub trait VisualContext: AppContext {
&mut self,
entity: &Entity<T>,
update: impl FnOnce(&mut T, &mut Window, &mut Context<T>) -> R,
) -> Self::Result<R>;
) -> R;
/// Update a view with the given callback
fn new_window_entity<T: 'static>(
&mut self,
build_entity: impl FnOnce(&mut Window, &mut Context<'_, T>) -> T,
) -> Self::Result<Entity<T>>;
) -> Entity<T>;
/// Replace the root view of a window with a new view.
fn replace_root_view<V>(
&mut self,
build_view: impl FnOnce(&mut Window, &mut Context<V>) -> V,
) -> Self::Result<Entity<V>>
) -> Entity<V>
where
V: 'static + Render;
/// Focus an entity in the window, if it implements the [`Focusable`] trait.
fn focus<V>(&mut self, entity: &Entity<V>) -> Self::Result<()>
fn focus<V>(&mut self, entity: &Entity<V>)
where
V: Focusable;
}
@@ -27,11 +27,11 @@ mod test;
mod windows;
use crate::{
point, Action, AnyWindowHandle, App, AsyncWindowContext, BackgroundExecutor, Bounds,
DevicePixels, DispatchEventResult, Font, FontId, FontMetrics, FontRun, ForegroundExecutor,
GlyphId, GpuSpecs, ImageSource, Keymap, LineLayout, Pixels, PlatformInput, Point,
RenderGlyphParams, RenderImage, RenderImageParams, RenderSvgParams, ScaledPixels, Scene,
SharedString, Size, SvgRenderer, SvgSize, Task, TaskLabel, Window, DEFAULT_WINDOW_SIZE,
point, Action, AnyWindowHandle, App, AppCell, BackgroundExecutor, Bounds, DevicePixels,
DispatchEventResult, Font, FontId, FontMetrics, FontRun, ForegroundExecutor, GlyphId, GpuSpecs,
ImageSource, Keymap, LineLayout, Pixels, PlatformInput, Point, RenderGlyphParams, RenderImage,
RenderImageParams, RenderSvgParams, ScaledPixels, Scene, SharedString, Size, SvgRenderer,
SvgSize, Task, TaskLabel, WeakAsyncWindowContext, Window, WindowId, DEFAULT_WINDOW_SIZE,
};
use anyhow::{anyhow, Result};
use async_task::Runnable;
@@ -47,6 +47,7 @@ use std::borrow::Cow;
use std::hash::{Hash, Hasher};
use std::io::Cursor;
use std::ops;
use std::rc::Weak;
use std::time::{Duration, Instant};
use std::{
fmt::{self, Debug},
@@ -450,13 +451,80 @@ pub(crate) trait PlatformWindow: HasWindowHandle + HasDisplayHandle {
}
}
#[derive(Debug)]
enum Contexts {
App {
app: Weak<AppCell>,
},
Window {
app: Weak<AppCell>,
window: WindowId,
},
}
/// The context of a foreground future, to be checked before polling the future
#[derive(Debug)]
pub struct ForegroundContext {
inner: Option<Contexts>,
}
// SAFETY: These fields are only accessed during `Runnable::run()` calls on the foreground,
// as enforced by the GPUI foreground executor and platform dispatcher implementations.
unsafe impl Send for ForegroundContext {}
unsafe impl Sync for ForegroundContext {}
impl ForegroundContext {
/// This ForegroundContext will enforce that the app exists before polling
#[track_caller]
pub fn app(app: &Weak<AppCell>) -> Self {
Self {
inner: Some(Contexts::App { app: app.clone() }),
}
}
/// This ForegroundContext will enforce that the app and window exist before polling
#[track_caller]
pub fn window(app: &Weak<AppCell>, window: WindowId) -> Self {
Self {
inner: Some(Contexts::Window {
app: app.clone(),
window,
}),
}
}
/// This foreground context will do nothing
#[track_caller]
pub fn none() -> Self {
Self { inner: None }
}
/// Check if the context is currently valid
pub fn context_is_valid(&self) -> bool {
match &self.inner {
Some(Contexts::App { app }) => app.upgrade().is_some(),
Some(Contexts::Window { app, window }) => {
if let Some(app) = app.upgrade() {
let lock = app.borrow();
let result = lock.windows.contains_key(*window);
drop(lock);
result
} else {
false
}
}
None => true,
}
}
}
/// This type is public so that our test macro can generate and use it, but it should not
/// be considered part of our public API.
#[doc(hidden)]
pub trait PlatformDispatcher: Send + Sync {
fn is_main_thread(&self) -> bool;
fn dispatch(&self, runnable: Runnable, label: Option<TaskLabel>);
fn dispatch_on_main_thread(&self, runnable: Runnable);
fn dispatch_on_main_thread(&self, runnable: Runnable<ForegroundContext>);
fn dispatch_after(&self, duration: Duration, runnable: Runnable);
fn park(&self, timeout: Option<Duration>) -> bool;
fn unparker(&self) -> Unparker;
@@ -683,7 +751,7 @@ impl From<TileId> for etagere::AllocId {
}
pub(crate) struct PlatformInputHandler {
cx: AsyncWindowContext,
cx: WeakAsyncWindowContext,
handler: Box<dyn InputHandler>,
}
@@ -695,7 +763,7 @@ pub(crate) struct PlatformInputHandler {
allow(dead_code)
)]
impl PlatformInputHandler {
pub fn new(cx: AsyncWindowContext, handler: Box<dyn InputHandler>) -> Self {
pub fn new(cx: WeakAsyncWindowContext, handler: Box<dyn InputHandler>) -> Self {
Self { cx, handler }
}
@@ -1,6 +1,7 @@
use crate::{Action, App, Platform, SharedString};
use util::ResultExt;
use crate::{Action, App, Platform, SharedString};
/// A menu of the application, either a main menu or a submenu
pub struct Menu {
/// The name of the menu
@@ -2,7 +2,7 @@
#![allow(non_camel_case_types)]
#![allow(non_snake_case)]
use crate::{PlatformDispatcher, TaskLabel};
use crate::{ForegroundContext, PlatformDispatcher, TaskLabel};
use async_task::Runnable;
use objc::{
class, msg_send,
@@ -63,12 +63,12 @@ impl PlatformDispatcher for MacDispatcher {
}
}
fn dispatch_on_main_thread(&self, runnable: Runnable) {
fn dispatch_on_main_thread(&self, runnable: Runnable<ForegroundContext>) {
unsafe {
dispatch_async_f(
dispatch_get_main_queue(),
runnable.into_raw().as_ptr() as *mut c_void,
Some(trampoline),
Some(context_trampoline),
);
}
}
@@ -105,3 +105,15 @@ extern "C" fn trampoline(runnable: *mut c_void) {
let task = unsafe { Runnable::<()>::from_raw(NonNull::new_unchecked(runnable as *mut ())) };
task.run();
}
extern "C" fn context_trampoline(runnable: *mut c_void) {
let task = unsafe {
Runnable::<ForegroundContext>::from_raw(NonNull::new_unchecked(runnable as *mut ()))
};
if task.metadata().context_is_valid() {
task.run();
} else {
drop(task);
}
}
@@ -1,4 +1,4 @@
use crate::{PlatformDispatcher, TaskLabel};
use crate::{platform::ForegroundContext, PlatformDispatcher, TaskLabel};
use async_task::Runnable;
use backtrace::Backtrace;
use collections::{HashMap, HashSet, VecDeque};
@@ -26,9 +26,13 @@ pub struct TestDispatcher {
unparker: Unparker,
}
// SAFETY: The test dispatcher is only ever run single threaded
unsafe impl Send for TestDispatcher {}
unsafe impl Sync for TestDispatcher {}
struct TestDispatcherState {
random: StdRng,
foreground: HashMap<TestDispatcherId, VecDeque<Runnable>>,
foreground: HashMap<TestDispatcherId, VecDeque<Runnable<ForegroundContext>>>,
background: Vec<Runnable>,
deprioritized_background: Vec<Runnable>,
delayed: Vec<(Duration, Runnable)>,
@@ -135,7 +139,8 @@ impl TestDispatcher {
};
let background_len = state.background.len();
let runnable;
let mut background_runnable = None;
let mut foreground_runnable = None;
let main_thread;
if foreground_len == 0 && background_len == 0 {
let deprioritized_background_len = state.deprioritized_background.len();
@@ -144,7 +149,7 @@ impl TestDispatcher {
}
let ix = state.random.gen_range(0..deprioritized_background_len);
main_thread = false;
runnable = state.deprioritized_background.swap_remove(ix);
background_runnable = Some(state.deprioritized_background.swap_remove(ix));
} else {
main_thread = state.random.gen_ratio(
foreground_len as u32,
@@ -152,24 +157,36 @@ impl TestDispatcher {
);
if main_thread {
let state = &mut *state;
runnable = state
.foreground
.values_mut()
.filter(|runnables| !runnables.is_empty())
.choose(&mut state.random)
.unwrap()
.pop_front()
.unwrap();
foreground_runnable = Some(
state
.foreground
.values_mut()
.filter(|runnables| !runnables.is_empty())
.choose(&mut state.random)
.unwrap()
.pop_front()
.unwrap(),
);
} else {
let ix = state.random.gen_range(0..background_len);
runnable = state.background.swap_remove(ix);
background_runnable = Some(state.background.swap_remove(ix));
};
};
let was_main_thread = state.is_main_thread;
state.is_main_thread = main_thread;
drop(state);
runnable.run();
if let Some(background_runnable) = background_runnable {
background_runnable.run();
} else if let Some(foreground_runnable) = foreground_runnable {
if foreground_runnable.metadata().context_is_valid() {
foreground_runnable.run();
} else {
drop(foreground_runnable);
}
}
self.state.lock().is_main_thread = was_main_thread;
true
@@ -272,7 +289,7 @@ impl PlatformDispatcher for TestDispatcher {
self.unparker.unpark();
}
fn dispatch_on_main_thread(&self, runnable: Runnable) {
fn dispatch_on_main_thread(&self, runnable: Runnable<ForegroundContext>) {
self.state
.lock()
.foreground
@@ -1,19 +1,20 @@
use crate::{
point, prelude::*, px, size, transparent_black, Action, AnyDrag, AnyElement, AnyTooltip,
AnyView, App, AppContext, Arena, Asset, AsyncWindowContext, AvailableSpace, Background, Bounds,
BoxShadow, Context, Corners, CursorStyle, Decorations, DevicePixels, DispatchActionListener,
DispatchNodeId, DispatchTree, DisplayId, Edges, Effect, Entity, EntityId, EventEmitter,
FileDropEvent, FontId, Global, GlobalElementId, GlyphId, GpuSpecs, Hsla, InputHandler, IsZero,
KeyBinding, KeyContext, KeyDownEvent, KeyEvent, Keystroke, KeystrokeEvent, LayoutId,
LineLayoutIndex, Modifiers, ModifiersChangedEvent, MonochromeSprite, MouseButton, MouseEvent,
MouseMoveEvent, MouseUpEvent, Path, Pixels, PlatformAtlas, PlatformDisplay, PlatformInput,
PlatformInputHandler, PlatformWindow, Point, PolychromeSprite, PromptLevel, Quad, Render,
RenderGlyphParams, RenderImage, RenderImageParams, RenderSvgParams, Replay, ResizeEdge,
ScaledPixels, Scene, Shadow, SharedString, Size, StrikethroughStyle, Style, SubscriberSet,
Subscription, TaffyLayoutEngine, Task, TextStyle, TextStyleRefinement, TransformationMatrix,
Underline, UnderlineStyle, WindowAppearance, WindowBackgroundAppearance, WindowBounds,
point, prelude::*, px, size, svg_renderer::SMOOTH_SVG_SCALE_FACTOR, transparent_black, Action,
AnyDrag, AnyElement, AnyTooltip, AnyView, App, AppContext, Arena, Asset, AsyncWindowContext,
AvailableSpace, Background, Bounds, BoxShadow, Context, Corners, CursorStyle, Decorations,
DevicePixels, DispatchActionListener, DispatchNodeId, DispatchTree, DisplayId, Edges, Effect,
Entity, EntityId, EventEmitter, FileDropEvent, FontId, Global, GlobalElementId, GlyphId,
GpuSpecs, Hsla, InputHandler, IsZero, KeyBinding, KeyContext, KeyDownEvent, KeyEvent,
Keystroke, KeystrokeEvent, LayoutId, LineLayoutIndex, Modifiers, ModifiersChangedEvent,
MonochromeSprite, MouseButton, MouseEvent, MouseMoveEvent, MouseUpEvent, Path, Pixels,
PlatformAtlas, PlatformDisplay, PlatformInput, PlatformInputHandler, PlatformWindow, Point,
PolychromeSprite, PromptLevel, Quad, Render, RenderGlyphParams, RenderImage, RenderImageParams,
RenderSvgParams, Replay, ResizeEdge, ScaledPixels, Scene, Shadow, SharedString, Size,
StrikethroughStyle, Style, SubscriberSet, Subscription, TaffyLayoutEngine, Task, TextStyle,
TextStyleRefinement, TransformationMatrix, Underline, UnderlineStyle, WeakAsyncApp,
WeakAsyncWindowContext, WindowAppearance, WindowBackgroundAppearance, WindowBounds,
WindowControls, WindowDecorations, WindowOptions, WindowParams, WindowTextSystem,
SMOOTH_SVG_SCALE_FACTOR, SUBPIXEL_VARIANTS,
SUBPIXEL_VARIANTS,
};
use anyhow::{anyhow, Context as _, Result};
use collections::{FxHashMap, FxHashSet};
@@ -765,7 +766,7 @@ impl Window {
platform_window.on_close(Box::new({
let mut cx = cx.to_async();
move || {
let _ = handle.update(&mut cx, |_, window, _| window.remove_window());
let _ = handle.update_weak(&mut cx, |_, window, _| window.remove_window());
}
}));
platform_window.on_request_frame(Box::new({
@@ -779,7 +780,7 @@ impl Window {
let next_frame_callbacks = next_frame_callbacks.take();
if !next_frame_callbacks.is_empty() {
handle
.update(&mut cx, |_, window, cx| {
.update_weak(&mut cx, |_, window, cx| {
for callback in next_frame_callbacks {
callback(window, cx);
}
@@ -797,7 +798,7 @@ impl Window {
if invalidator.is_dirty() {
measure("frame duration", || {
handle
.update(&mut cx, |_, window, cx| {
.update_weak(&mut cx, |_, window, cx| {
window.draw(cx);
window.present();
})
@@ -805,12 +806,12 @@ impl Window {
})
} else if needs_present {
handle
.update(&mut cx, |_, window, _| window.present())
.update_weak(&mut cx, |_, window, _| window.present())
.log_err();
}
handle
.update(&mut cx, |_, window, _| {
.update_weak(&mut cx, |_, window, _| {
window.complete_frame();
})
.log_err();
@@ -820,7 +821,7 @@ impl Window {
let mut cx = cx.to_async();
move |_, _| {
handle
.update(&mut cx, |_, window, cx| window.bounds_changed(cx))
.update_weak(&mut cx, |_, window, cx| window.bounds_changed(cx))
.log_err();
}
}));
@@ -828,7 +829,7 @@ impl Window {
let mut cx = cx.to_async();
move || {
handle
.update(&mut cx, |_, window, cx| window.bounds_changed(cx))
.update_weak(&mut cx, |_, window, cx| window.bounds_changed(cx))
.log_err();
}
}));
@@ -836,7 +837,7 @@ impl Window {
let mut cx = cx.to_async();
move || {
handle
.update(&mut cx, |_, window, cx| window.appearance_changed(cx))
.update_weak(&mut cx, |_, window, cx| window.appearance_changed(cx))
.log_err();
}
}));
@@ -844,7 +845,7 @@ impl Window {
let mut cx = cx.to_async();
move |active| {
handle
.update(&mut cx, |_, window, cx| {
.update_weak(&mut cx, |_, window, cx| {
window.active.set(active);
window
.activation_observers
@@ -859,7 +860,7 @@ impl Window {
let mut cx = cx.to_async();
move |active| {
handle
.update(&mut cx, |_, window, _| {
.update_weak(&mut cx, |_, window, _| {
window.hovered.set(active);
window.refresh();
})
@@ -870,7 +871,7 @@ impl Window {
let mut cx = cx.to_async();
Box::new(move |event| {
handle
.update(&mut cx, |_, window, cx| window.dispatch_event(event, cx))
.update_weak(&mut cx, |_, window, cx| window.dispatch_event(event, cx))
.log_err()
.unwrap_or(DispatchEventResult::default())
})
@@ -1268,8 +1269,8 @@ impl Window {
/// Creates an [`AsyncWindowContext`], which has a static lifetime and can be held across
/// await points in async code.
pub fn to_async(&self, cx: &App) -> AsyncWindowContext {
AsyncWindowContext::new_context(cx.to_async(), self.handle)
pub fn to_async(&self, cx: &App) -> WeakAsyncWindowContext {
WeakAsyncWindowContext::new_context(cx.to_async(), self.handle)
}
/// Schedule the given closure to be run directly after the current frame is rendered.
@@ -3260,8 +3261,7 @@ impl Window {
.flush_dispatch(currently_pending.keystrokes, &dispatch_path);
window.replay_pending_input(to_replay, cx)
})
.log_err();
});
}));
self.pending_input = Some(currently_pending);
self.pending_input_changed(cx);
@@ -3946,6 +3946,17 @@ impl AnyWindowHandle {
cx.update_window(self, update)
}
/// Updates the state of the root view of this window.
///
/// This will fail if the window has been closed.
pub fn update_weak<R>(
self,
cx: &mut WeakAsyncApp,
update: impl FnOnce(AnyView, &mut Window, &mut App) -> R,
) -> Result<R> {
cx.update(|cx| cx.update_window(self, update))?
}
/// Read the state of the root view of this window.
///
/// This will fail if the window has been closed.
@@ -4145,6 +4145,63 @@ impl BufferSnapshot {
None
}
}
pub fn words_in_range(
&self,
query: Option<&str>,
range: Range<usize>,
) -> HashMap<String, Range<Anchor>> {
if query.map_or(false, |query| query.is_empty()) {
return HashMap::default();
}
let classifier = CharClassifier::new(self.language.clone().map(|language| LanguageScope {
language,
override_id: None,
}));
let mut query_ix = 0;
let query = query.map(|query| query.chars().collect::<Vec<_>>());
let query_len = query.as_ref().map_or(0, |query| query.len());
let mut words = HashMap::default();
let mut current_word_start_ix = None;
let mut chunk_ix = range.start;
for chunk in self.chunks(range, false) {
for (i, c) in chunk.text.char_indices() {
let ix = chunk_ix + i;
if classifier.is_word(c) {
if current_word_start_ix.is_none() {
current_word_start_ix = Some(ix);
}
if let Some(query) = &query {
if query_ix < query_len {
let query_c = query.get(query_ix).expect(
"query_ix is a vec of chars, which we access only if before the end",
);
if c.to_lowercase().eq(query_c.to_lowercase()) {
query_ix += 1;
}
}
}
continue;
} else if let Some(word_start) = current_word_start_ix.take() {
if query_ix == query_len {
let word_range = self.anchor_before(word_start)..self.anchor_after(ix);
words.insert(
self.text_for_range(word_start..ix).collect::<String>(),
word_range,
);
}
}
query_ix = 0;
}
chunk_ix += chunk.text.len();
}
words
}
}
fn indent_size_for_line(text: &text::BufferSnapshot, row: u32) -> IndentSize {
@@ -13,6 +13,7 @@ use proto::deserialize_operation;
use rand::prelude::*;
use regex::RegexBuilder;
use settings::SettingsStore;
use std::collections::BTreeSet;
use std::{
env,
ops::Range,
@@ -3140,6 +3141,93 @@ fn test_trailing_whitespace_ranges(mut rng: StdRng) {
);
}
#[gpui::test]
fn test_words_in_range(cx: &mut gpui::App) {
init_settings(cx, |_| {});
let contents = r#"let word=öäpple.bar你 Öäpple word2-öÄpPlE-Pizza-word ÖÄPPLE word"#;
let buffer = cx.new(|cx| {
let buffer = Buffer::local(contents, cx).with_language(Arc::new(rust_lang()), cx);
assert_eq!(buffer.text(), contents);
buffer.check_invariants();
buffer
});
buffer.update(cx, |buffer, _| {
let snapshot = buffer.snapshot();
assert_eq!(
BTreeSet::from_iter(["Pizza".to_string()]),
snapshot
.words_in_range(Some("piz"), 0..snapshot.len())
.into_keys()
.collect::<BTreeSet<_>>()
);
assert_eq!(
BTreeSet::from_iter([
"öäpple".to_string(),
"Öäpple".to_string(),
"öÄpPlE".to_string(),
"ÖÄPPLE".to_string(),
]),
snapshot
.words_in_range(Some("öp"), 0..snapshot.len())
.into_keys()
.collect::<BTreeSet<_>>()
);
assert_eq!(
BTreeSet::from_iter([
"öÄpPlE".to_string(),
"Öäpple".to_string(),
"ÖÄPPLE".to_string(),
"öäpple".to_string(),
]),
snapshot
.words_in_range(Some("öÄ"), 0..snapshot.len())
.into_keys()
.collect::<BTreeSet<_>>()
);
assert_eq!(
BTreeSet::default(),
snapshot
.words_in_range(Some("öÄ好"), 0..snapshot.len())
.into_keys()
.collect::<BTreeSet<_>>()
);
assert_eq!(
BTreeSet::from_iter(["bar你".to_string(),]),
snapshot
.words_in_range(Some(""), 0..snapshot.len())
.into_keys()
.collect::<BTreeSet<_>>()
);
assert_eq!(
BTreeSet::default(),
snapshot
.words_in_range(Some(""), 0..snapshot.len())
.into_keys()
.collect::<BTreeSet<_>>()
);
assert_eq!(
BTreeSet::from_iter([
"bar你".to_string(),
"öÄpPlE".to_string(),
"Öäpple".to_string(),
"ÖÄPPLE".to_string(),
"öäpple".to_string(),
"let".to_string(),
"Pizza".to_string(),
"word".to_string(),
"word2".to_string(),
]),
snapshot
.words_in_range(None, 0..snapshot.len())
.into_keys()
.collect::<BTreeSet<_>>()
);
});
}
fn ruby_lang() -> Language {
Language::new(
LanguageConfig {
@@ -79,10 +79,10 @@ pub struct LanguageSettings {
/// The column at which to soft-wrap lines, for buffers where soft-wrap
/// is enabled.
pub preferred_line_length: u32,
// Whether to show wrap guides (vertical rulers) in the editor.
// Setting this to true will show a guide at the 'preferred_line_length' value
// if softwrap is set to 'preferred_line_length', and will show any
// additional guides as specified by the 'wrap_guides' setting.
/// Whether to show wrap guides (vertical rulers) in the editor.
/// Setting this to true will show a guide at the 'preferred_line_length' value
/// if softwrap is set to 'preferred_line_length', and will show any
/// additional guides as specified by the 'wrap_guides' setting.
pub show_wrap_guides: bool,
/// Character counts at which to show wrap guides (vertical rulers) in the editor.
pub wrap_guides: Vec<usize>,
@@ -137,7 +137,7 @@ pub struct LanguageSettings {
pub use_on_type_format: bool,
/// Whether indentation of pasted content should be adjusted based on the context.
pub auto_indent_on_paste: bool,
// Controls how the editor handles the autoclosed characters.
/// Controls how the editor handles the autoclosed characters.
pub always_treat_brackets_as_autoclosed: bool,
/// Which code actions to run on save
pub code_actions_on_format: HashMap<String, bool>,
@@ -151,6 +151,8 @@ pub struct LanguageSettings {
/// Whether to display inline and alongside documentation for items in the
/// completions menu.
pub show_completion_documentation: bool,
/// Completion settings for this language.
pub completions: CompletionSettings,
}
impl LanguageSettings {
@@ -306,6 +308,50 @@ pub struct AllLanguageSettingsContent {
pub file_types: HashMap<Arc<str>, Vec<String>>,
}
/// Controls how completions are processed for this language.
#[derive(Copy, Clone, Debug, Serialize, Deserialize, PartialEq, Eq, JsonSchema)]
#[serde(rename_all = "snake_case")]
pub struct CompletionSettings {
/// Controls how words are completed.
/// For large documents, not all words may be fetched for completion.
///
/// Default: `fallback`
#[serde(default = "default_words_completion_mode")]
pub words: WordsCompletionMode,
/// Whether to fetch LSP completions or not.
///
/// Default: true
#[serde(default = "default_true")]
pub lsp: bool,
/// When fetching LSP completions, determines how long to wait for a response from a particular server.
/// When set to 0, waits indefinitely.
///
/// Default: 500
#[serde(default = "lsp_fetch_timeout_ms")]
pub lsp_fetch_timeout_ms: u64,
}
/// Controls how document's words are completed.
#[derive(Copy, Clone, Debug, Serialize, Deserialize, PartialEq, Eq, JsonSchema)]
#[serde(rename_all = "snake_case")]
pub enum WordsCompletionMode {
/// Always fetch document's words for completions.
Enabled,
/// Only if LSP response errors/times out/is empty,
/// use document's words to show completions.
Fallback,
/// Never fetch or complete document's words for completions.
Disabled,
}
fn default_words_completion_mode() -> WordsCompletionMode {
WordsCompletionMode::Fallback
}
fn lsp_fetch_timeout_ms() -> u64 {
500
}
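Given the serde attributes above (`rename_all = "snake_case"` plus the per-field defaults), the new `completions` settings deserialize from a block shaped like the following. This is a sketch of the JSON implied by the struct; where this block nests inside the full settings file is an assumption:

```json
{
  "completions": {
    "words": "fallback",
    "lsp": true,
    "lsp_fetch_timeout_ms": 500
  }
}
```

Each field is optional: omitting `words` falls back to `"fallback"`, omitting `lsp` to `true`, and omitting `lsp_fetch_timeout_ms` to `500`, per the `default` attributes.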
/// The settings for a particular language.
#[derive(Debug, Clone, Default, PartialEq, Serialize, Deserialize, JsonSchema)]
pub struct LanguageSettingsContent {
@@ -478,6 +524,8 @@ pub struct LanguageSettingsContent {
///
/// Default: true
pub show_completion_documentation: Option<bool>,
/// Controls how completions are processed for this language.
pub completions: Option<CompletionSettings>,
}
/// The behavior of `editor::Rewrap`.
@@ -1381,6 +1429,7 @@ fn merge_settings(settings: &mut LanguageSettings, src: &LanguageSettingsContent
&mut settings.show_completion_documentation,
src.show_completion_documentation,
);
merge(&mut settings.completions, src.completions);
}
/// Allows to enable/disable formatting with Prettier
@@ -837,52 +837,63 @@ impl LocalBufferStore {
let snapshot =
worktree_handle.update(&mut cx, |tree, _| tree.as_local().unwrap().snapshot())?;
let diff_bases_changes_by_buffer = cx
.background_spawn(async move {
diff_state_updates
.into_iter()
.filter_map(|(buffer, path, current_index_text, current_head_text)| {
let local_repo = snapshot.local_repo_for_path(&path)?;
let relative_path = local_repo.relativize(&path).ok()?;
let index_text = if current_index_text.is_some() {
local_repo.repo().load_index_text(&relative_path)
} else {
None
};
let head_text = if current_head_text.is_some() {
local_repo.repo().load_committed_text(&relative_path)
} else {
None
};
.spawn(async move |cx| {
let mut results = Vec::new();
for (buffer, path, current_index_text, current_head_text) in diff_state_updates
{
let Some(local_repo) = snapshot.local_repo_for_path(&path) else {
continue;
};
let Some(relative_path) = local_repo.relativize(&path).ok() else {
continue;
};
let index_text = if current_index_text.is_some() {
local_repo
.repo()
.load_index_text(relative_path.clone(), cx.clone())
.await
} else {
None
};
let head_text = if current_head_text.is_some() {
local_repo
.repo()
.load_committed_text(relative_path, cx.clone())
.await
} else {
None
};
// Avoid triggering a diff update if the base text has not changed.
if let Some((current_index, current_head)) =
current_index_text.as_ref().zip(current_head_text.as_ref())
// Avoid triggering a diff update if the base text has not changed.
if let Some((current_index, current_head)) =
current_index_text.as_ref().zip(current_head_text.as_ref())
{
if current_index.as_deref() == index_text.as_ref()
&& current_head.as_deref() == head_text.as_ref()
{
if current_index.as_deref() == index_text.as_ref()
&& current_head.as_deref() == head_text.as_ref()
{
return None;
}
continue;
}
}
let diff_bases_change =
match (current_index_text.is_some(), current_head_text.is_some()) {
(true, true) => Some(if index_text == head_text {
DiffBasesChange::SetBoth(head_text)
} else {
DiffBasesChange::SetEach {
index: index_text,
head: head_text,
}
}),
(true, false) => Some(DiffBasesChange::SetIndex(index_text)),
(false, true) => Some(DiffBasesChange::SetHead(head_text)),
(false, false) => None,
};
let diff_bases_change =
match (current_index_text.is_some(), current_head_text.is_some()) {
(true, true) => Some(if index_text == head_text {
DiffBasesChange::SetBoth(head_text)
} else {
DiffBasesChange::SetEach {
index: index_text,
head: head_text,
}
}),
(true, false) => Some(DiffBasesChange::SetIndex(index_text)),
(false, true) => Some(DiffBasesChange::SetHead(head_text)),
(false, false) => None,
};
Some((buffer, diff_bases_change))
})
.collect::<Vec<_>>()
results.push((buffer, diff_bases_change))
}
results
})
.await;
@@ -1620,11 +1631,12 @@ impl BufferStore {
anyhow::Ok(Some((repo, relative_path, content)))
});
cx.background_spawn(async move {
cx.spawn(|cx| async move {
let Some((repo, relative_path, content)) = blame_params? else {
return Ok(None);
};
repo.blame(&relative_path, content)
repo.blame(relative_path.clone(), content, cx)
.await
.with_context(|| format!("Failed to blame {:?}", relative_path.0))
.map(Some)
})
@@ -401,7 +401,7 @@ impl GitStore {
if let Some((repo, path)) = this.repository_and_path_for_buffer_id(buffer_id, cx) {
let recv = repo.update(cx, |repo, cx| {
repo.set_index_text(
&path,
path,
new_index_text.as_ref().map(|rope| rope.to_string()),
cx,
)
@@ -715,7 +715,7 @@ impl GitStore {
repository_handle
.update(&mut cx, |repository_handle, cx| {
repository_handle.set_index_text(
&RepoPath::from_str(&envelope.payload.path),
RepoPath::from_str(&envelope.payload.path),
envelope.payload.text,
cx,
)
@@ -808,7 +808,7 @@ impl GitStore {
repository_handle
.update(&mut cx, |repository_handle, _| {
repository_handle.create_branch(&branch_name)
repository_handle.create_branch(branch_name)
})?
.await??;
@@ -828,7 +828,7 @@ impl GitStore {
repository_handle
.update(&mut cx, |repository_handle, _| {
repository_handle.change_branch(&branch_name)
repository_handle.change_branch(branch_name)
})?
.await??;
@@ -847,7 +847,7 @@ impl GitStore {
let commit = repository_handle
.update(&mut cx, |repository_handle, _| {
repository_handle.show(&envelope.payload.commit)
repository_handle.show(envelope.payload.commit)
})?
.await??;
Ok(proto::GitCommitDetails {
@@ -876,7 +876,7 @@ impl GitStore {
repository_handle
.update(&mut cx, |repository_handle, cx| {
repository_handle.reset(&envelope.payload.commit, mode, cx)
repository_handle.reset(envelope.payload.commit, mode, cx)
})?
.await??;
Ok(proto::Ack {})
@@ -1081,8 +1081,8 @@ impl Repository {
fn send_job<F, Fut, R>(&self, job: F) -> oneshot::Receiver<R>
where
F: FnOnce(GitRepo) -> Fut + 'static,
Fut: Future<Output = R> + Send + 'static,
F: FnOnce(GitRepo, AsyncApp) -> Fut + 'static,
Fut: Future<Output = R> + 'static,
R: Send + 'static,
{
self.send_keyed_job(None, job)
@@ -1090,8 +1090,8 @@ impl Repository {
fn send_keyed_job<F, Fut, R>(&self, key: Option<GitJobKey>, job: F) -> oneshot::Receiver<R>
where
F: FnOnce(GitRepo) -> Fut + 'static,
Fut: Future<Output = R> + Send + 'static,
F: FnOnce(GitRepo, AsyncApp) -> Fut + 'static,
Fut: Future<Output = R> + 'static,
R: Send + 'static,
{
let (result_tx, result_rx) = futures::channel::oneshot::channel();
@@ -1100,8 +1100,8 @@ impl Repository {
.unbounded_send(GitJob {
key,
job: Box::new(|cx: &mut AsyncApp| {
let job = job(git_repo);
cx.background_spawn(async move {
let job = job(git_repo, cx.clone());
cx.spawn(|_| async move {
let result = job.await;
result_tx.send(result).ok();
})
@@ -1292,9 +1292,9 @@ impl Repository {
let commit = commit.to_string();
let env = self.worktree_environment(cx);
self.send_job(|git_repo| async move {
self.send_job(|git_repo, _| async move {
match git_repo {
GitRepo::Local(repo) => repo.checkout_files(&commit, &paths, &env.await),
GitRepo::Local(repo) => repo.checkout_files(commit, paths, env.await).await,
GitRepo::Remote {
project_id,
client,
@@ -1322,17 +1322,17 @@ impl Repository {
pub fn reset(
&self,
commit: &str,
commit: String,
reset_mode: ResetMode,
cx: &mut App,
) -> oneshot::Receiver<Result<()>> {
let commit = commit.to_string();
let env = self.worktree_environment(cx);
self.send_job(|git_repo| async move {
self.send_job(|git_repo, _| async move {
match git_repo {
GitRepo::Local(git_repo) => {
let env = env.await;
git_repo.reset(&commit, reset_mode, &env)
git_repo.reset(commit, reset_mode, env).await
}
GitRepo::Remote {
project_id,
@@ -1359,11 +1359,10 @@ impl Repository {
})
}
pub fn show(&self, commit: &str) -> oneshot::Receiver<Result<CommitDetails>> {
let commit = commit.to_string();
self.send_job(|git_repo| async move {
pub fn show(&self, commit: String) -> oneshot::Receiver<Result<CommitDetails>> {
self.send_job(|git_repo, cx| async move {
match git_repo {
GitRepo::Local(git_repository) => git_repository.show(&commit),
GitRepo::Local(git_repository) => git_repository.show(commit, cx).await,
GitRepo::Remote {
project_id,
client,
@@ -1433,9 +1432,9 @@ impl Repository {
let env = env.await;
this.update(&mut cx, |this, _| {
this.send_job(|git_repo| async move {
this.send_job(|git_repo, cx| async move {
match git_repo {
GitRepo::Local(repo) => repo.stage_paths(&entries, &env),
GitRepo::Local(repo) => repo.stage_paths(entries, env, cx).await,
GitRepo::Remote {
project_id,
client,
@@ -1504,9 +1503,9 @@ impl Repository {
let env = env.await;
this.update(&mut cx, |this, _| {
this.send_job(|git_repo| async move {
this.send_job(|git_repo, cx| async move {
match git_repo {
GitRepo::Local(repo) => repo.unstage_paths(&entries, &env),
GitRepo::Local(repo) => repo.unstage_paths(entries, env, cx).await,
GitRepo::Remote {
project_id,
client,
@@ -1587,17 +1586,11 @@ impl Repository {
cx: &mut App,
) -> oneshot::Receiver<Result<()>> {
let env = self.worktree_environment(cx);
self.send_job(|git_repo| async move {
self.send_job(|git_repo, cx| async move {
match git_repo {
GitRepo::Local(repo) => {
let env = env.await;
repo.commit(
message.as_ref(),
name_and_email
.as_ref()
.map(|(name, email)| (name.as_ref(), email.as_ref())),
&env,
)
repo.commit(message, name_and_email, env, cx).await
}
GitRepo::Remote {
project_id,
@@ -1634,12 +1627,12 @@ impl Repository {
let askpass_id = util::post_inc(&mut self.latest_askpass_id);
let env = self.worktree_environment(cx);
self.send_job(move |git_repo| async move {
self.send_job(move |git_repo, cx| async move {
match git_repo {
GitRepo::Local(git_repository) => {
let askpass = AskPassSession::new(&executor, askpass).await?;
let env = env.await;
git_repository.fetch(askpass, &env)
git_repository.fetch(askpass, env, cx).await
}
GitRepo::Remote {
project_id,
@@ -1685,12 +1678,21 @@ impl Repository {
let askpass_id = util::post_inc(&mut self.latest_askpass_id);
let env = self.worktree_environment(cx);
self.send_job(move |git_repo| async move {
self.send_job(move |git_repo, cx| async move {
match git_repo {
GitRepo::Local(git_repository) => {
let env = env.await;
let askpass = AskPassSession::new(&executor, askpass).await?;
git_repository.push(&branch, &remote, options, askpass, &env)
git_repository
.push(
branch.to_string(),
remote.to_string(),
options,
askpass,
env,
cx,
)
.await
}
GitRepo::Remote {
project_id,
@@ -1740,12 +1742,14 @@ impl Repository {
let askpass_id = util::post_inc(&mut self.latest_askpass_id);
let env = self.worktree_environment(cx);
self.send_job(move |git_repo| async move {
self.send_job(move |git_repo, cx| async move {
match git_repo {
GitRepo::Local(git_repository) => {
let askpass = AskPassSession::new(&executor, askpass).await?;
let env = env.await;
git_repository.pull(&branch, &remote, askpass, &env)
git_repository
.pull(branch.to_string(), remote.to_string(), askpass, env, cx)
.await
}
GitRepo::Remote {
project_id,
@@ -1781,18 +1785,17 @@ impl Repository {
fn set_index_text(
&self,
path: &RepoPath,
path: RepoPath,
content: Option<String>,
cx: &mut App,
) -> oneshot::Receiver<anyhow::Result<()>> {
let path = path.clone();
let env = self.worktree_environment(cx);
self.send_keyed_job(
Some(GitJobKey::WriteIndex(path.clone())),
|git_repo| async move {
|git_repo, cx| async move {
match git_repo {
GitRepo::Local(repo) => repo.set_index_text(&path, content, &env.await),
GitRepo::Local(repo) => repo.set_index_text(path, content, env.await, cx).await,
GitRepo::Remote {
project_id,
client,
@@ -1819,11 +1822,9 @@ impl Repository {
&self,
branch_name: Option<String>,
) -> oneshot::Receiver<Result<Vec<Remote>>> {
self.send_job(|repo| async move {
self.send_job(|repo, cx| async move {
match repo {
GitRepo::Local(git_repository) => {
git_repository.get_remotes(branch_name.as_deref())
}
GitRepo::Local(git_repository) => git_repository.get_remotes(branch_name, cx).await,
GitRepo::Remote {
project_id,
client,
@@ -1854,9 +1855,13 @@ impl Repository {
}
pub fn branches(&self) -> oneshot::Receiver<Result<Vec<Branch>>> {
self.send_job(|repo| async move {
self.send_job(|repo, cx| async move {
match repo {
GitRepo::Local(git_repository) => git_repository.branches(),
GitRepo::Local(git_repository) => {
let git_repository = git_repository.clone();
cx.background_spawn(async move { git_repository.branches().await })
.await
}
GitRepo::Remote {
project_id,
client,
@@ -1884,9 +1889,9 @@ impl Repository {
}
pub fn diff(&self, diff_type: DiffType, _cx: &App) -> oneshot::Receiver<Result<String>> {
self.send_job(|repo| async move {
self.send_job(|repo, cx| async move {
match repo {
GitRepo::Local(git_repository) => git_repository.diff(diff_type),
GitRepo::Local(git_repository) => git_repository.diff(diff_type, cx).await,
GitRepo::Remote {
project_id,
client,
@@ -1916,11 +1921,12 @@ impl Repository {
})
}
pub fn create_branch(&self, branch_name: &str) -> oneshot::Receiver<Result<()>> {
let branch_name = branch_name.to_owned();
self.send_job(|repo| async move {
pub fn create_branch(&self, branch_name: String) -> oneshot::Receiver<Result<()>> {
self.send_job(|repo, cx| async move {
match repo {
GitRepo::Local(git_repository) => git_repository.create_branch(&branch_name),
GitRepo::Local(git_repository) => {
git_repository.create_branch(branch_name, cx).await
}
GitRepo::Remote {
project_id,
client,
@@ -1942,11 +1948,12 @@ impl Repository {
})
}
pub fn change_branch(&self, branch_name: &str) -> oneshot::Receiver<Result<()>> {
let branch_name = branch_name.to_owned();
self.send_job(|repo| async move {
pub fn change_branch(&self, branch_name: String) -> oneshot::Receiver<Result<()>> {
self.send_job(|repo, cx| async move {
match repo {
GitRepo::Local(git_repository) => git_repository.change_branch(&branch_name),
GitRepo::Local(git_repository) => {
git_repository.change_branch(branch_name, cx).await
}
GitRepo::Remote {
project_id,
client,
@@ -1969,9 +1976,9 @@ impl Repository {
}
pub fn check_for_pushed_commits(&self) -> oneshot::Receiver<Result<Vec<SharedString>>> {
self.send_job(|repo| async move {
self.send_job(|repo, cx| async move {
match repo {
GitRepo::Local(git_repository) => git_repository.check_for_pushed_commit(),
GitRepo::Local(git_repository) => git_repository.check_for_pushed_commit(cx).await,
GitRepo::Remote {
project_id,
client,

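The `send_job` hunks above change the job closure to receive an `AsyncApp` and drop the `Send` bound on the returned future, while callers still get back a `oneshot::Receiver` to await. As a rough, std-only sketch of that closure-to-receiver shape (the name `send_job` here is a simplified stand-in, not GPUI's actual executor):

```rust
use std::sync::mpsc;
use std::thread;

// Simplified stand-in for the `send_job` shape in the diff above: accept a
// job closure, run it off the caller's thread, and hand back a receiver the
// caller can block on (mirroring `oneshot::Receiver<R>`). The real version
// also threads an `AsyncApp` into the closure and runs on GPUI's executor.
fn send_job<R, F>(job: F) -> mpsc::Receiver<R>
where
    F: FnOnce() -> R + Send + 'static,
    R: Send + 'static,
{
    let (result_tx, result_rx) = mpsc::channel();
    thread::spawn(move || {
        // Run the job and forward its result; a dropped receiver is ignored.
        result_tx.send(job()).ok();
    });
    result_rx
}
```

Callers then `recv()` the result, much as the diff's callers `.await` the returned `oneshot::Receiver`.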

@@ -23,13 +23,13 @@ use client::{proto, TypedEnvelope};
use collections::{btree_map, BTreeMap, BTreeSet, HashMap, HashSet};
use futures::{
future::{join_all, Shared},
select,
select, select_biased,
stream::FuturesUnordered,
AsyncWriteExt, Future, FutureExt, StreamExt,
};
use globset::{Glob, GlobBuilder, GlobMatcher, GlobSet, GlobSetBuilder};
use gpui::{
App, AppContext as _, AsyncApp, Context, Entity, EventEmitter, PromptLevel, SharedString, Task,
App, AppContext, AsyncApp, Context, Entity, EventEmitter, PromptLevel, SharedString, Task,
WeakEntity,
};
use http_client::HttpClient;
@@ -4325,6 +4325,15 @@ impl LspStore {
let offset = position.to_offset(&snapshot);
let scope = snapshot.language_scope_at(offset);
let language = snapshot.language().cloned();
let completion_settings = language_settings(
language.as_ref().map(|language| language.name()),
buffer.read(cx).file(),
cx,
)
.completions;
if !completion_settings.lsp {
return Task::ready(Ok(Vec::new()));
}
let server_ids: Vec<_> = buffer.update(cx, |buffer, cx| {
local
@@ -4341,23 +4350,51 @@ impl LspStore {
});
let buffer = buffer.clone();
let lsp_timeout = completion_settings.lsp_fetch_timeout_ms;
let lsp_timeout = if lsp_timeout > 0 {
Some(Duration::from_millis(lsp_timeout))
} else {
None
};
cx.spawn(move |this, mut cx| async move {
let mut tasks = Vec::with_capacity(server_ids.len());
this.update(&mut cx, |this, cx| {
this.update(&mut cx, |lsp_store, cx| {
for server_id in server_ids {
let lsp_adapter = this.language_server_adapter_for_id(server_id);
tasks.push((
lsp_adapter,
this.request_lsp(
buffer.clone(),
LanguageServerToQuery::Other(server_id),
GetCompletions {
position,
context: context.clone(),
let lsp_adapter = lsp_store.language_server_adapter_for_id(server_id);
let lsp_timeout = lsp_timeout
.map(|lsp_timeout| cx.background_executor().timer(lsp_timeout));
let mut timeout = cx.background_spawn(async move {
match lsp_timeout {
Some(lsp_timeout) => {
lsp_timeout.await;
true
},
cx,
),
));
None => false,
}
}).fuse();
let mut lsp_request = lsp_store.request_lsp(
buffer.clone(),
LanguageServerToQuery::Other(server_id),
GetCompletions {
position,
context: context.clone(),
},
cx,
).fuse();
let new_task = cx.background_spawn(async move {
select_biased! {
response = lsp_request => response,
timeout_happened = timeout => {
if timeout_happened {
log::warn!("Fetching completions from server {server_id} timed out, timeout ms: {}", completion_settings.lsp_fetch_timeout_ms);
return anyhow::Ok(Vec::new())
} else {
lsp_request.await
}
},
}
});
tasks.push((lsp_adapter, new_task));
}
})?;
@@ -4416,47 +4453,58 @@ impl LspStore {
{
did_resolve = true;
}
} else {
resolve_word_completion(
&buffer_snapshot,
&mut completions.borrow_mut()[completion_index],
);
}
}
} else {
for completion_index in completion_indices {
let Some(server_id) = completions.borrow()[completion_index].source.server_id()
else {
continue;
let server_id = {
let completion = &completions.borrow()[completion_index];
completion.source.server_id()
};
if let Some(server_id) = server_id {
let server_and_adapter = this
.read_with(&cx, |lsp_store, _| {
let server = lsp_store.language_server_for_id(server_id)?;
let adapter =
lsp_store.language_server_adapter_for_id(server.server_id())?;
Some((server, adapter))
})
.ok()
.flatten();
let Some((server, adapter)) = server_and_adapter else {
continue;
};
let server_and_adapter = this
.read_with(&cx, |lsp_store, _| {
let server = lsp_store.language_server_for_id(server_id)?;
let adapter =
lsp_store.language_server_adapter_for_id(server.server_id())?;
Some((server, adapter))
})
.ok()
.flatten();
let Some((server, adapter)) = server_and_adapter else {
continue;
};
let resolved = Self::resolve_completion_local(
server,
&buffer_snapshot,
completions.clone(),
completion_index,
)
.await
.log_err()
.is_some();
if resolved {
Self::regenerate_completion_labels(
adapter,
let resolved = Self::resolve_completion_local(
server,
&buffer_snapshot,
completions.clone(),
completion_index,
)
.await
.log_err();
did_resolve = true;
.log_err()
.is_some();
if resolved {
Self::regenerate_completion_labels(
adapter,
&buffer_snapshot,
completions.clone(),
completion_index,
)
.await
.log_err();
did_resolve = true;
}
} else {
resolve_word_completion(
&buffer_snapshot,
&mut completions.borrow_mut()[completion_index],
);
}
}
}
@@ -4500,7 +4548,9 @@ impl LspStore {
);
server.request::<lsp::request::ResolveCompletionItem>(*lsp_completion.clone())
}
CompletionSource::Custom => return Ok(()),
CompletionSource::BufferWord { .. } | CompletionSource::Custom => {
return Ok(());
}
}
};
let resolved_completion = request.await?;
@@ -4641,7 +4691,9 @@ impl LspStore {
}
serde_json::to_string(lsp_completion).unwrap().into_bytes()
}
CompletionSource::Custom => return Ok(()),
CompletionSource::Custom | CompletionSource::BufferWord { .. } => {
return Ok(());
}
}
};
let request = proto::ResolveCompletionDocumentation {
@@ -8172,51 +8224,54 @@ impl LspStore {
}
pub(crate) fn serialize_completion(completion: &CoreCompletion) -> proto::Completion {
let (source, server_id, lsp_completion, lsp_defaults, resolved) = match &completion.source {
let mut serialized_completion = proto::Completion {
old_start: Some(serialize_anchor(&completion.old_range.start)),
old_end: Some(serialize_anchor(&completion.old_range.end)),
new_text: completion.new_text.clone(),
..proto::Completion::default()
};
match &completion.source {
CompletionSource::Lsp {
server_id,
lsp_completion,
lsp_defaults,
resolved,
} => (
proto::completion::Source::Lsp as i32,
server_id.0 as u64,
serde_json::to_vec(lsp_completion).unwrap(),
lsp_defaults
} => {
serialized_completion.source = proto::completion::Source::Lsp as i32;
serialized_completion.server_id = server_id.0 as u64;
serialized_completion.lsp_completion = serde_json::to_vec(lsp_completion).unwrap();
serialized_completion.lsp_defaults = lsp_defaults
.as_deref()
.map(|lsp_defaults| serde_json::to_vec(lsp_defaults).unwrap()),
*resolved,
),
CompletionSource::Custom => (
proto::completion::Source::Custom as i32,
0,
Vec::new(),
None,
true,
),
};
proto::Completion {
old_start: Some(serialize_anchor(&completion.old_range.start)),
old_end: Some(serialize_anchor(&completion.old_range.end)),
new_text: completion.new_text.clone(),
server_id,
lsp_completion,
lsp_defaults,
resolved,
source,
.map(|lsp_defaults| serde_json::to_vec(lsp_defaults).unwrap());
serialized_completion.resolved = *resolved;
}
CompletionSource::BufferWord {
word_range,
resolved,
} => {
serialized_completion.source = proto::completion::Source::BufferWord as i32;
serialized_completion.buffer_word_start = Some(serialize_anchor(&word_range.start));
serialized_completion.buffer_word_end = Some(serialize_anchor(&word_range.end));
serialized_completion.resolved = *resolved;
}
CompletionSource::Custom => {
serialized_completion.source = proto::completion::Source::Custom as i32;
serialized_completion.resolved = true;
}
}
serialized_completion
}
pub(crate) fn deserialize_completion(completion: proto::Completion) -> Result<CoreCompletion> {
let old_start = completion
.old_start
.and_then(deserialize_anchor)
.ok_or_else(|| anyhow!("invalid old start"))?;
.context("invalid old start")?;
let old_end = completion
.old_end
.and_then(deserialize_anchor)
.ok_or_else(|| anyhow!("invalid old end"))?;
.context("invalid old end")?;
Ok(CoreCompletion {
old_range: old_start..old_end,
new_text: completion.new_text,
@@ -8232,6 +8287,20 @@ impl LspStore {
.transpose()?,
resolved: completion.resolved,
},
Some(proto::completion::Source::BufferWord) => {
let word_range = completion
.buffer_word_start
.and_then(deserialize_anchor)
.context("invalid buffer word start")?
..completion
.buffer_word_end
.and_then(deserialize_anchor)
.context("invalid buffer word end")?;
CompletionSource::BufferWord {
word_range,
resolved: completion.resolved,
}
}
_ => anyhow::bail!("Unexpected completion source {}", completion.source),
},
})
@@ -8296,6 +8365,40 @@ impl LspStore {
}
}
fn resolve_word_completion(snapshot: &BufferSnapshot, completion: &mut Completion) {
let CompletionSource::BufferWord {
word_range,
resolved,
} = &mut completion.source
else {
return;
};
if *resolved {
return;
}
if completion.new_text
!= snapshot
.text_for_range(word_range.clone())
.collect::<String>()
{
return;
}
let mut offset = 0;
for chunk in snapshot.chunks(word_range.clone(), true) {
let end_offset = offset + chunk.text.len();
if let Some(highlight_id) = chunk.syntax_highlight_id {
completion
.label
.runs
.push((offset..end_offset, highlight_id));
}
offset = end_offset;
}
*resolved = true;
}
impl EventEmitter<LspStoreEvent> for LspStore {}
fn remove_empty_hover_blocks(mut hover: Hover) -> Option<Hover> {

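The completions change above races each LSP request against an optional timer (`lsp_fetch_timeout_ms`, where `0` is mapped to `None`, i.e. no timeout) using `select_biased!`, logging a warning and returning an empty list when the timer wins. A std-only sketch of the same race, with the helper name purely illustrative:

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Illustrative analog of the completion-fetch timeout above: run `work` on
// another thread and give up after `timeout`, or wait indefinitely when no
// timeout is configured (the diff maps `lsp_fetch_timeout_ms == 0` to `None`).
fn fetch_with_timeout<T, F>(work: F, timeout: Option<Duration>) -> Option<T>
where
    F: FnOnce() -> T + Send + 'static,
    T: Send + 'static,
{
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        tx.send(work()).ok();
    });
    match timeout {
        // Deadline won the race: report "no result", as the diff returns an
        // empty completion list after logging a warning.
        Some(limit) => rx.recv_timeout(limit).ok(),
        None => rx.recv().ok(),
    }
}
```

The `select_biased!` in the diff plays the same role as `recv_timeout` here: whichever future resolves first decides whether the LSP response is used.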

@@ -388,6 +388,10 @@ pub enum CompletionSource {
resolved: bool,
},
Custom,
BufferWord {
word_range: Range<Anchor>,
resolved: bool,
},
}
impl CompletionSource {


@@ -1002,10 +1002,13 @@ message Completion {
bool resolved = 6;
Source source = 7;
optional bytes lsp_defaults = 8;
optional Anchor buffer_word_start = 9;
optional Anchor buffer_word_end = 10;
enum Source {
Lsp = 0;
Custom = 1;
BufferWord = 2;
}
}


@@ -1361,7 +1361,7 @@ async fn test_remote_git_branches(cx: &mut TestAppContext, server_cx: &mut TestA
assert_eq!(&remote_branches, &branches_set);
cx.update(|cx| repository.read(cx).change_branch(new_branch))
cx.update(|cx| repository.read(cx).change_branch(new_branch.to_string()))
.await
.unwrap()
.unwrap();
@@ -1383,15 +1383,23 @@ async fn test_remote_git_branches(cx: &mut TestAppContext, server_cx: &mut TestA
assert_eq!(server_branch.name, branches[2]);
// Also try creating a new branch
cx.update(|cx| repository.read(cx).create_branch("totally-new-branch"))
.await
.unwrap()
.unwrap();
cx.update(|cx| {
repository
.read(cx)
.create_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
cx.update(|cx| repository.read(cx).change_branch("totally-new-branch"))
.await
.unwrap()
.unwrap();
cx.update(|cx| {
repository
.read(cx)
.change_branch("totally-new-branch".to_string())
})
.await
.unwrap()
.unwrap();
cx.run_until_parked();


@@ -1064,18 +1064,36 @@ fn possible_open_target(
for worktree in &worktree_candidates {
let worktree_root = worktree.read(cx).abs_path();
let paths_to_check = potential_paths
.iter()
.map(|path_with_position| PathWithPosition {
path: path_with_position
.path
.strip_prefix(&worktree_root)
.unwrap_or(&path_with_position.path)
.to_owned(),
row: path_with_position.row,
column: path_with_position.column,
})
.collect::<Vec<_>>();
let mut paths_to_check = Vec::with_capacity(potential_paths.len());
for path_with_position in &potential_paths {
if worktree_root.ends_with(&path_with_position.path) {
let root_path_with_position = PathWithPosition {
path: worktree_root.to_path_buf(),
row: path_with_position.row,
column: path_with_position.column,
};
match worktree.read(cx).root_entry() {
Some(root_entry) => {
return Task::ready(Some(OpenTarget::Worktree(
root_path_with_position,
root_entry.clone(),
)))
}
None => paths_to_check.push(root_path_with_position),
}
} else {
paths_to_check.push(PathWithPosition {
path: path_with_position
.path
.strip_prefix(&worktree_root)
.unwrap_or(&path_with_position.path)
.to_owned(),
row: path_with_position.row,
column: path_with_position.column,
});
};
}
let mut traversal = worktree
.read(cx)


@@ -1,6 +1,6 @@
use crate::{motion::Motion, object::Object, state::Mode, Vim};
use collections::HashMap;
use editor::{display_map::ToDisplayPoint, scroll::Autoscroll, Bias, Editor, IsVimMode};
use editor::{display_map::ToDisplayPoint, scroll::Autoscroll, Bias, Editor};
use gpui::{actions, Context, Window};
use language::SelectionGoal;
@@ -14,7 +14,7 @@ pub(crate) fn register(editor: &mut Editor, cx: &mut Context<Vim>) {
vim.update_editor(window, cx, |vim, editor, window, cx| {
editor.transact(window, cx, |editor, window, cx| {
let mut positions = vim.save_selection_starts(editor, cx);
editor.rewrap_impl(IsVimMode::Yes, cx);
editor.rewrap_impl(true, cx);
editor.change_selections(Some(Autoscroll::fit()), window, cx, |s| {
s.move_with(|map, selection| {
if let Some(anchor) = positions.remove(&selection.id) {
@@ -52,7 +52,7 @@ impl Vim {
motion.expand_selection(map, selection, times, false, &text_layout_details);
});
});
editor.rewrap_impl(IsVimMode::Yes, cx);
editor.rewrap_impl(true, cx);
editor.change_selections(None, window, cx, |s| {
s.move_with(|map, selection| {
let anchor = selection_starts.remove(&selection.id).unwrap();
@@ -83,7 +83,7 @@ impl Vim {
object.expand_selection(map, selection, around);
});
});
editor.rewrap_impl(IsVimMode::Yes, cx);
editor.rewrap_impl(true, cx);
editor.change_selections(None, window, cx, |s| {
s.move_with(|map, selection| {
let anchor = original_positions.remove(&selection.id).unwrap();


@@ -1,4 +1,7 @@
use gpui::{AnyView, DismissEvent, Entity, FocusHandle, Focusable as _, ManagedView, Subscription};
use gpui::{
AnyView, DismissEvent, Entity, FocusHandle, Focusable as _, ManagedView, MouseButton,
Subscription,
};
use ui::prelude::*;
#[derive(Debug)]
@@ -172,11 +175,13 @@ impl Render for ModalLayer {
let mut background = cx.theme().colors().elevated_surface_background;
background.fade_out(0.2);
el.bg(background)
.occlude()
.on_mouse_down_out(cx.listener(|this, _, window, cx| {
this.hide_modal(window, cx);
}))
})
.on_mouse_down(
MouseButton::Left,
cx.listener(|this, _, window, cx| {
this.hide_modal(window, cx);
}),
)
.child(
v_flex()
.h(px(0.0))
@@ -185,7 +190,14 @@ impl Render for ModalLayer {
.flex_col()
.items_center()
.track_focus(&active_modal.focus_handle)
.child(h_flex().occlude().child(active_modal.modal.view())),
.child(
h_flex()
.occlude()
.child(div().child(active_modal.modal.view()))
.on_mouse_down(MouseButton::Left, |_, _, cx| {
cx.stop_propagation();
}),
),
)
}
}


@@ -1055,13 +1055,13 @@ impl Worktree {
Worktree::Local(this) => {
let path = Arc::from(path);
let snapshot = this.snapshot();
cx.background_spawn(async move {
cx.spawn(|cx| async move {
if let Some(repo) = snapshot.repository_for_path(&path) {
if let Some(repo_path) = repo.relativize(&path).log_err() {
if let Some(git_repo) =
snapshot.git_repositories.get(&repo.work_directory_id)
{
return Ok(git_repo.repo_ptr.load_index_text(&repo_path));
return Ok(git_repo.repo_ptr.load_index_text(repo_path, cx).await);
}
}
}
@@ -1079,13 +1079,16 @@ impl Worktree {
Worktree::Local(this) => {
let path = Arc::from(path);
let snapshot = this.snapshot();
cx.background_spawn(async move {
cx.spawn(|cx| async move {
if let Some(repo) = snapshot.repository_for_path(&path) {
if let Some(repo_path) = repo.relativize(&path).log_err() {
if let Some(git_repo) =
snapshot.git_repositories.get(&repo.work_directory_id)
{
return Ok(git_repo.repo_ptr.load_committed_text(&repo_path));
return Ok(git_repo
.repo_ptr
.load_committed_text(repo_path, cx)
.await);
}
}
}
@@ -5520,7 +5523,9 @@ impl BackgroundScanner {
state.repository_scans.insert(
path_key.clone(),
self.executor.spawn(async move {
update_branches(&job_state, &mut local_repository).log_err();
update_branches(&job_state, &mut local_repository)
.await
.log_err();
log::trace!("updating git statuses for repo {repository_name}",);
let t0 = Instant::now();
@@ -5665,11 +5670,11 @@ fn send_status_update_inner(
.is_ok()
}
fn update_branches(
async fn update_branches(
state: &Mutex<BackgroundScannerState>,
repository: &mut LocalRepositoryEntry,
) -> Result<()> {
let branches = repository.repo().branches()?;
let branches = repository.repo().branches().await?;
let snapshot = state.lock().snapshot.snapshot.clone();
let mut repository = snapshot
.repository(repository.work_directory.path_key())


@@ -2,7 +2,7 @@
description = "The fast, collaborative code editor."
edition.workspace = true
name = "zed"
version = "0.178.0"
version = "0.179.0"
publish.workspace = true
license = "GPL-3.0-or-later"
authors = ["Zed Team <hi@zed.dev>"]


@@ -383,14 +383,14 @@ impl Render for QuickActionBar {
"Column Git Blame",
show_git_blame_gutter,
IconPosition::Start,
Some(editor::actions::ToggleGitBlame.boxed_clone()),
Some(git::Blame.boxed_clone()),
{
let editor = editor.clone();
move |window, cx| {
editor
.update(cx, |editor, cx| {
editor.toggle_git_blame(
&editor::actions::ToggleGitBlame,
&git::Blame,
window,
cx,
)


@@ -63,6 +63,31 @@ You can install a local build on your machine with:
This will build zed and the cli in release mode and make them available at `~/.local/bin/zed`, installing .desktop files to `~/.local/share`.
> **_Note_**: If you encounter linker errors similar to the following:
>
> ```bash
> error: linking with `cc` failed: exit status: 1 ...
> = note: /usr/bin/ld: /tmp/rustcISMaod/libaws_lc_sys-79f08eb6d32e546e.rlib(f8e4fd781484bd36-bcm.o): in function `aws_lc_0_25_0_handle_cpu_env':
> /aws-lc/crypto/fipsmodule/cpucap/cpu_intel.c:(.text.aws_lc_0_25_0_handle_cpu_env+0x63): undefined reference to `__isoc23_sscanf'
> /usr/bin/ld: /tmp/rustcISMaod/libaws_lc_sys-79f08eb6d32e546e.rlib(f8e4fd781484bd36-bcm.o): in function `pkey_rsa_ctrl_str':
> /aws-lc/crypto/fipsmodule/evp/p_rsa.c:741:(.text.pkey_rsa_ctrl_str+0x20d): undefined reference to `__isoc23_strtol'
> /usr/bin/ld: /aws-lc/crypto/fipsmodule/evp/p_rsa.c:752:(.text.pkey_rsa_ctrl_str+0x258): undefined reference to `__isoc23_strtol'
> collect2: error: ld returned 1 exit status
> = note: some `extern` functions couldn't be found; some native libraries may need to be installed or have their path specified
> = note: use the `-l` flag to specify native libraries to link
> = note: use the `cargo:rustc-link-lib` directive to specify the native libraries to link with Cargo (see https://doc.rust-lang.org/cargo/reference/build-scripts.html#rustc-link-lib)
> error: could not compile `remote_server` (bin "remote_server") due to 1 previous error
> ```
>
> **Cause**:
> This is caused by known bugs in aws-lc-rs (it doesn't support GCC >= 14): [FIPS fails to build with GCC >= 14](https://github.com/aws/aws-lc-rs/issues/569)
> & [GCC-14 - build failure for FIPS module](https://github.com/aws/aws-lc/issues/2010)
>
> You can refer to [linux: Linker error for remote_server when using script/install-linux](https://github.com/zed-industries/zed/issues/24880) for more information.
>
> **Workarounds**:
> Set the remote server target to `x86_64-unknown-linux-gnu` like so `export REMOTE_SERVER_TARGET=x86_64-unknown-linux-gnu; script/install-linux`
## Wayland & X11
Zed supports both X11 and Wayland. By default, we pick whichever we can find at runtime. If you're on Wayland and want to run in X11 mode, use the environment variable `WAYLAND_DISPLAY=''`.


@@ -1,6 +1,6 @@
# PureScript
PureScript support is available through the [PureScript extension](https://github.com/zed-industries/zed/tree/main/extensions/purescript).
PureScript support is available through the [PureScript extension](https://github.com/zed-extensions/purescript).
- Tree-sitter: [postsolar/tree-sitter-purescript](https://github.com/postsolar/tree-sitter-purescript)
- Language-Server: [nwolverson/purescript-language-server](https://github.com/nwolverson/purescript-language-server)


@@ -2,7 +2,7 @@
[Uiua](https://www.uiua.org/) is a general purpose, stack-based, array-oriented programming language with a focus on simplicity, beauty, and tacit code.
Uiua support is available through the [Uiua extension](https://github.com/zed-industries/zed/tree/main/extensions/uiua).
Uiua support is available through the [Uiua extension](https://github.com/zed-extensions/uiua).
- Tree-sitter: [shnarazk/tree-sitter-uiua](https://github.com/shnarazk/tree-sitter-uiua)
- Language Server: [uiua-lang/uiua](https://github.com/uiua-lang/uiua/)


@@ -1,6 +1,6 @@
# Zig
Zig support is available through the [Zig extension](https://github.com/zed-industries/zed/tree/main/extensions/zig).
Zig support is available through the [Zig extension](https://github.com/zed-extensions/zig).
- Tree-sitter: [tree-sitter-zig](https://github.com/tree-sitter-grammars/tree-sitter-zig)
- Language Server: [zls](https://github.com/zigtools/zls)


@@ -1,16 +0,0 @@
[package]
name = "zed_purescript"
version = "0.1.0"
edition.workspace = true
publish.workspace = true
license = "Apache-2.0"
[lints]
workspace = true
[lib]
path = "src/purescript.rs"
crate-type = ["cdylib"]
[dependencies]
zed_extension_api = "0.1.0"


@@ -1 +0,0 @@
../../LICENSE-APACHE


@@ -1,15 +0,0 @@
id = "purescript"
name = "PureScript"
description = "PureScript support."
version = "0.1.0"
schema_version = 1
authors = ["Iván Molina Rebolledo <ivanmolinarebolledo@gmail.com>"]
repository = "https://github.com/zed-industries/zed"
[language_servers.purescript-language-server]
name = "PureScript Language Server"
language = "PureScript"
[grammars.purescript]
repository = "https://github.com/postsolar/tree-sitter-purescript"
commit = "0554811a512b9cec08b5a83ce9096eb22da18213"


@@ -1,3 +0,0 @@
("(" @open ")" @close)
("[" @open "]" @close)
("{" @open "}" @close)


@@ -1,14 +0,0 @@
name = "PureScript"
grammar = "purescript"
path_suffixes = ["purs"]
autoclose_before = ",=)}]"
line_comments = ["-- "]
block_comment = ["{- ", " -}"]
brackets = [
{ start = "{", end = "}", close = true, newline = true },
{ start = "[", end = "]", close = true, newline = true },
{ start = "(", end = ")", close = true, newline = true },
{ start = "\"", end = "\"", close = true, newline = false },
{ start = "'", end = "'", close = true, newline = false },
{ start = "`", end = "`", close = true, newline = false },
]


@@ -1,144 +0,0 @@
;; Copyright 2022 nvim-treesitter
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------
;; Literals and comments
(integer) @number
(exp_negation) @number
(exp_literal (number)) @float
(char) @string
[
(string)
(triple_quote_string)
] @string
(comment) @comment
;; ----------------------------------------------------------------------------
;; Punctuation
[
"("
")"
"{"
"}"
"["
"]"
] @punctuation.bracket
[
(comma)
";"
] @punctuation.delimiter
;; ----------------------------------------------------------------------------
;; Keywords, operators, includes
[
"forall"
"∀"
] @keyword
;; (pragma) @constant
[
"if"
"then"
"else"
"case"
"of"
] @keyword
[
"import"
"module"
] @keyword
[
(operator)
(constructor_operator)
(type_operator)
(qualified_module) ; grabs the `.` (dot), ex: import System.IO
(all_names)
(wildcard)
"="
"|"
"::"
"=>"
"->"
"<-"
"\\"
"`"
"@"
"∷"
"⇒"
"<="
"⇐"
"→"
"←"
] @operator
(module) @title
[
(where)
"let"
"in"
"class"
"instance"
"derive"
"foreign"
"data"
"newtype"
"type"
"as"
"hiding"
"do"
"ado"
"infix"
"infixl"
"infixr"
] @keyword
;; ----------------------------------------------------------------------------
;; Functions and variables
(variable) @variable
(pat_wildcard) @variable
(signature name: (variable) @type)
(function
name: (variable) @function
patterns: (patterns))
(exp_infix (exp_name) @function (#set! "priority" 101))
(exp_apply . (exp_name (variable) @function))
(exp_apply . (exp_name (qualified_variable (variable) @function)))
;; ----------------------------------------------------------------------------
;; Types
(type) @type
(type_variable) @type
(constructor) @constructor
; True or False
((constructor) @_bool (#match? @_bool "(True|False)")) @boolean


@@ -1,3 +0,0 @@
(_ "[" "]" @end) @indent
(_ "{" "}" @end) @indent
(_ "(" ")" @end) @indent


@@ -1,97 +0,0 @@
use std::{env, fs};
use zed_extension_api::{self as zed, serde_json, Result};
const SERVER_PATH: &str = "node_modules/.bin/purescript-language-server";
const PACKAGE_NAME: &str = "purescript-language-server";
struct PurescriptExtension {
did_find_server: bool,
}
impl PurescriptExtension {
fn server_exists(&self) -> bool {
fs::metadata(SERVER_PATH).map_or(false, |stat| stat.is_file())
}
fn server_script_path(&mut self, language_server_id: &zed::LanguageServerId) -> Result<String> {
let server_exists = self.server_exists();
if self.did_find_server && server_exists {
return Ok(SERVER_PATH.to_string());
}
zed::set_language_server_installation_status(
language_server_id,
&zed::LanguageServerInstallationStatus::CheckingForUpdate,
);
let version = zed::npm_package_latest_version(PACKAGE_NAME)?;
if !server_exists
|| zed::npm_package_installed_version(PACKAGE_NAME)?.as_ref() != Some(&version)
{
zed::set_language_server_installation_status(
language_server_id,
&zed::LanguageServerInstallationStatus::Downloading,
);
let result = zed::npm_install_package(PACKAGE_NAME, &version);
match result {
Ok(()) => {
if !self.server_exists() {
Err(format!(
"installed package '{PACKAGE_NAME}' did not contain expected path '{SERVER_PATH}'",
))?;
}
}
Err(error) => {
if !self.server_exists() {
Err(error)?;
}
}
}
}
self.did_find_server = true;
Ok(SERVER_PATH.to_string())
}
}
impl zed::Extension for PurescriptExtension {
fn new() -> Self {
Self {
did_find_server: false,
}
}
fn language_server_command(
&mut self,
language_server_id: &zed::LanguageServerId,
_worktree: &zed::Worktree,
) -> Result<zed::Command> {
let server_path = self.server_script_path(language_server_id)?;
Ok(zed::Command {
command: zed::node_binary_path()?,
args: vec![
env::current_dir()
.unwrap()
.join(&server_path)
.to_string_lossy()
.to_string(),
"--stdio".to_string(),
],
env: Default::default(),
})
}
fn language_server_initialization_options(
&mut self,
_language_server_id: &zed::LanguageServerId,
_worktree: &zed::Worktree,
) -> Result<Option<serde_json::Value>> {
Ok(Some(serde_json::json!({
"purescript": {
"addSpagoSources": true
}
})))
}
}
zed::register_extension!(PurescriptExtension);


@@ -1,16 +0,0 @@
[package]
name = "zed_uiua"
version = "0.0.1"
edition.workspace = true
publish.workspace = true
license = "Apache-2.0"
[lints]
workspace = true
[lib]
path = "src/uiua.rs"
crate-type = ["cdylib"]
[dependencies]
zed_extension_api = "0.1.0"


@@ -1 +0,0 @@
../../LICENSE-APACHE


@@ -1,15 +0,0 @@
id = "uiua"
name = "Uiua"
description = "Uiua support."
version = "0.0.1"
schema_version = 1
authors = ["Max Brunsfeld <max@zed.dev>"]
repository = "https://github.com/zed-industries/zed"
[language_servers.uiua]
name = "Uiua LSP"
language = "Uiua"
[grammars.uiua]
repository = "https://github.com/shnarazk/tree-sitter-uiua"
commit = "21dc2db39494585bf29a3f86d5add6e9d11a22ba"


@@ -1,11 +0,0 @@
name = "Uiua"
grammar = "uiua"
path_suffixes = ["ua"]
line_comments = ["# "]
autoclose_before = ")]}\""
brackets = [
{ start = "{", end = "}", close = true, newline = false},
{ start = "[", end = "]", close = true, newline = false },
{ start = "(", end = ")", close = true, newline = false },
{ start = "\"", end = "\"", close = true, newline = false, not_in = ["string"] },
]


@@ -1,50 +0,0 @@
[
(openParen)
(closeParen)
(openCurly)
(closeCurly)
(openBracket)
(closeBracket)
] @punctuation.bracket
[
(branchSeparator)
(underscore)
] @constructor
; ] @punctuation.delimiter
[ (character) ] @constant.character
[ (comment) ] @comment
[ (constant) ] @constant.numeric
[ (identifier) ] @variable
[ (leftArrow) ] @keyword
[ (function) ] @function
[ (modifier1) ] @operator
[ (modifier2) ] @operator
[ (number) ] @constant.numeric
[ (placeHolder) ] @special
[ (otherConstant) ] @string.special
[ (signature) ] @type
[ (system) ] @function.builtin
[ (tripleMinus) ] @module
; planet
[
"id"
"identity"
"∘"
"dip"
"⊙"
"gap"
"⋅"
] @tag
[
(string)
(multiLineString)
] @string
; [
; (deprecated)
; (identifierDeprecated)
; ] @warning


@@ -1,3 +0,0 @@
[
(array)
] @indent


@@ -1,27 +0,0 @@
use zed_extension_api::{self as zed, Result};
struct UiuaExtension;
impl zed::Extension for UiuaExtension {
fn new() -> Self {
Self
}
fn language_server_command(
&mut self,
_language_server_id: &zed::LanguageServerId,
worktree: &zed::Worktree,
) -> Result<zed::Command> {
let path = worktree
.which("uiua")
.ok_or_else(|| "uiua is not installed".to_string())?;
Ok(zed::Command {
command: path,
args: vec!["lsp".to_string()],
env: Default::default(),
})
}
}
zed::register_extension!(UiuaExtension);


@@ -1,16 +0,0 @@
[package]
name = "zed_zig"
version = "0.3.3"
edition.workspace = true
publish.workspace = true
license = "Apache-2.0"
[lints]
workspace = true
[lib]
path = "src/zig.rs"
crate-type = ["cdylib"]
[dependencies]
zed_extension_api = "0.1.0"


@@ -1 +0,0 @@
../../LICENSE-APACHE


@@ -1,15 +0,0 @@
id = "zig"
name = "Zig"
description = "Zig support."
version = "0.3.3"
schema_version = 1
authors = ["Allan Calix <contact@acx.dev>"]
repository = "https://github.com/zed-industries/zed"
[language_servers.zls]
name = "zls"
language = "Zig"
[grammars.zig]
repository = "https://github.com/tree-sitter-grammars/tree-sitter-zig"
commit = "eb7d58c2dc4fbeea4745019dee8df013034ae66b"


@@ -1,3 +0,0 @@
("(" @open ")" @close)
("[" @open "]" @close)
("{" @open "}" @close)


@@ -1,12 +0,0 @@
name = "Zig"
grammar = "zig"
path_suffixes = ["zig", "zon"]
line_comments = ["// ", "/// ", "//! "]
autoclose_before = ";:.,=}])"
brackets = [
{ start = "{", end = "}", close = true, newline = true },
{ start = "[", end = "]", close = true, newline = true },
{ start = "(", end = ")", close = true, newline = true },
{ start = "\"", end = "\"", close = true, newline = false, not_in = ["string"] },
{ start = "'", end = "'", close = true, newline = false, not_in = ["string", "comment"] },
]


@@ -1,295 +0,0 @@
; Variables
(identifier) @variable
; Parameters
(parameter
name: (identifier) @variable.parameter)
; Types
(parameter
type: (identifier) @type)
((identifier) @type
(#match? @type "^[A-Z_][a-zA-Z0-9_]*"))
(variable_declaration
(identifier) @type
"="
[
(struct_declaration)
(enum_declaration)
(union_declaration)
(opaque_declaration)
])
[
(builtin_type)
"anyframe"
] @type.builtin
; Constants
((identifier) @constant
(#match? @constant "^[A-Z][A-Z_0-9]+$"))
[
"null"
"unreachable"
"undefined"
] @constant.builtin
(field_expression
.
member: (identifier) @constant)
(enum_declaration
(container_field
type: (identifier) @constant))
; Labels
(block_label (identifier) @label)
(break_label (identifier) @label)
; Fields
(field_initializer
.
(identifier) @variable.member)
(field_expression
(_)
member: (identifier) @property)
(field_expression
(_)
member: (identifier) @type (#match? @type "^[A-Z_][a-zA-Z0-9_]*"))
(container_field
name: (identifier) @property)
(initializer_list
(assignment_expression
left: (field_expression
.
member: (identifier) @property)))
; Functions
(builtin_identifier) @function.builtin
(call_expression
function: (identifier) @function.call)
(call_expression
function: (field_expression
member: (identifier) @function.call))
(function_declaration
name: (identifier) @function)
; Modules
(variable_declaration
(identifier) @module
(builtin_function
(builtin_identifier) @keyword.import
(#any-of? @keyword.import "@import" "@cImport")))
; Builtins
[
"c"
"..."
] @variable.builtin
((identifier) @variable.builtin
(#eq? @variable.builtin "_"))
(calling_convention
(identifier) @variable.builtin)
; Keywords
[
"asm"
"defer"
"errdefer"
"test"
"error"
"const"
"var"
] @keyword
[
"struct"
"union"
"enum"
"opaque"
] @keyword.type
[
"async"
"await"
"suspend"
"nosuspend"
"resume"
] @keyword.coroutine
"fn" @keyword.function
[
"and"
"or"
"orelse"
] @keyword.operator
"return" @keyword.return
[
"if"
"else"
"switch"
] @keyword.conditional
[
"for"
"while"
"break"
"continue"
] @keyword.repeat
[
"usingnamespace"
"export"
] @keyword.import
[
"try"
"catch"
] @keyword.exception
[
"volatile"
"allowzero"
"noalias"
"addrspace"
"align"
"callconv"
"linksection"
"pub"
"inline"
"noinline"
"extern"
"comptime"
"packed"
"threadlocal"
] @keyword.modifier
; Operator
[
"="
"*="
"*%="
"*|="
"/="
"%="
"+="
"+%="
"+|="
"-="
"-%="
"-|="
"<<="
"<<|="
">>="
"&="
"^="
"|="
"!"
"~"
"-"
"-%"
"&"
"=="
"!="
">"
">="
"<="
"<"
"&"
"^"
"|"
"<<"
">>"
"<<|"
"+"
"++"
"+%"
"-%"
"+|"
"-|"
"*"
"/"
"%"
"**"
"*%"
"*|"
"||"
".*"
".?"
"?"
".."
] @operator
; Literals
(character) @string
([
(string)
(multiline_string)
] @string
(#set! "priority" 95))
(integer) @number
(float) @number.float
(boolean) @boolean
(escape_sequence) @string.escape
; Punctuation
[
"["
"]"
"("
")"
"{"
"}"
] @punctuation.bracket
[
";"
"."
","
":"
"=>"
"->"
] @punctuation.delimiter
(payload "|" @punctuation.bracket)
; Comments
(comment) @comment
((comment) @comment.documentation
(#match? @comment.documentation "^//(/|!)"))


@@ -1,17 +0,0 @@
[
(block)
(switch_expression)
(initializer_list)
] @indent.begin
(block
"}" @indent.end)
(_ "[" "]" @end) @indent
(_ "{" "}" @end) @indent
(_ "(" ")" @end) @indent
[
(comment)
(multiline_string)
] @indent.ignore


@@ -1,10 +0,0 @@
((comment) @injection.content
(#set! injection.language "comment"))
; TODO: add when asm is added
; (asm_output_item (string) @injection.content
; (#set! injection.language "asm"))
; (asm_input_item (string) @injection.content
; (#set! injection.language "asm"))
; (asm_clobbers (string) @injection.content
; (#set! injection.language "asm"))


@@ -1,50 +0,0 @@
(test_declaration
"test" @context
[
(string)
(identifier)
] @name) @item
(function_declaration
"pub"? @context
[
"extern"
"export"
"inline"
"noinline"
]? @context
"fn" @context
name: (_) @name) @item
(source_file
(variable_declaration
"pub"? @context
(identifier) @name
"=" (_) @context) @item)
(struct_declaration
(variable_declaration
"pub"? @context
(identifier) @name
"=" (_) @context) @item)
(union_declaration
(variable_declaration
"pub"? @context
(identifier) @name
"=" (_) @context) @item)
(enum_declaration
(variable_declaration
"pub"? @context
(identifier) @name
"=" (_) @context) @item)
(opaque_declaration
(variable_declaration
"pub"? @context
(identifier) @name
"=" (_) @context) @item)
(container_field
. (_) @name) @item


@@ -1,27 +0,0 @@
(function_declaration
body: (_
"{"
(_)* @function.inside
"}")) @function.around
(test_declaration
(block
"{"
(_)* @function.inside
"}")) @function.around
(variable_declaration
(struct_declaration
"struct"
"{"
[(_) ","]* @class.inside
"}")) @class.around
(variable_declaration
(enum_declaration
"enum"
"{"
(_)* @class.inside
"}")) @class.around
(comment)+ @comment.around


@@ -1,171 +0,0 @@
use std::fs;
use zed_extension_api::{self as zed, serde_json, settings::LspSettings, LanguageServerId, Result};
struct ZigExtension {
cached_binary_path: Option<String>,
}
#[derive(Clone)]
struct ZlsBinary {
path: String,
args: Option<Vec<String>>,
environment: Option<Vec<(String, String)>>,
}
impl ZigExtension {
fn language_server_binary(
&mut self,
language_server_id: &LanguageServerId,
worktree: &zed::Worktree,
) -> Result<ZlsBinary> {
let mut args: Option<Vec<String>> = None;
let (platform, arch) = zed::current_platform();
let environment = match platform {
zed::Os::Mac | zed::Os::Linux => Some(worktree.shell_env()),
zed::Os::Windows => None,
};
if let Ok(lsp_settings) = LspSettings::for_worktree("zls", worktree) {
if let Some(binary) = lsp_settings.binary {
args = binary.arguments;
if let Some(path) = binary.path {
return Ok(ZlsBinary {
path: path.clone(),
args,
environment,
});
}
}
}
if let Some(path) = worktree.which("zls") {
return Ok(ZlsBinary {
path,
args,
environment,
});
}
if let Some(path) = &self.cached_binary_path {
if fs::metadata(path).map_or(false, |stat| stat.is_file()) {
return Ok(ZlsBinary {
path: path.clone(),
args,
environment,
});
}
}
zed::set_language_server_installation_status(
language_server_id,
&zed::LanguageServerInstallationStatus::CheckingForUpdate,
);
// Note that in github releases and on zlstools.org the tar.gz asset is not shown
// but is available at https://builds.zigtools.org/zls-{os}-{arch}-{version}.tar.gz
let release = zed::latest_github_release(
"zigtools/zls",
zed::GithubReleaseOptions {
require_assets: true,
pre_release: false,
},
)?;
let arch: &str = match arch {
zed::Architecture::Aarch64 => "aarch64",
zed::Architecture::X86 => "x86",
zed::Architecture::X8664 => "x86_64",
};
let os: &str = match platform {
zed::Os::Mac => "macos",
zed::Os::Linux => "linux",
zed::Os::Windows => "windows",
};
let extension: &str = match platform {
zed::Os::Mac | zed::Os::Linux => "tar.gz",
zed::Os::Windows => "zip",
};
let asset_name: String = format!("zls-{}-{}-{}.{}", os, arch, release.version, extension);
let download_url = format!("https://builds.zigtools.org/{}", asset_name);
let version_dir = format!("zls-{}", release.version);
let binary_path = match platform {
zed::Os::Mac | zed::Os::Linux => format!("{version_dir}/zls"),
zed::Os::Windows => format!("{version_dir}/zls.exe"),
};
if !fs::metadata(&binary_path).map_or(false, |stat| stat.is_file()) {
zed::set_language_server_installation_status(
language_server_id,
&zed::LanguageServerInstallationStatus::Downloading,
);
zed::download_file(
&download_url,
&version_dir,
match platform {
zed::Os::Mac | zed::Os::Linux => zed::DownloadedFileType::GzipTar,
zed::Os::Windows => zed::DownloadedFileType::Zip,
},
)
.map_err(|e| format!("failed to download file: {e}"))?;
zed::make_file_executable(&binary_path)?;
let entries =
fs::read_dir(".").map_err(|e| format!("failed to list working directory {e}"))?;
for entry in entries {
let entry = entry.map_err(|e| format!("failed to load directory entry {e}"))?;
if entry.file_name().to_str() != Some(&version_dir) {
fs::remove_dir_all(entry.path()).ok();
}
}
}
self.cached_binary_path = Some(binary_path.clone());
Ok(ZlsBinary {
path: binary_path,
args,
environment,
})
}
}
impl zed::Extension for ZigExtension {
fn new() -> Self {
Self {
cached_binary_path: None,
}
}
fn language_server_command(
&mut self,
language_server_id: &LanguageServerId,
worktree: &zed::Worktree,
) -> Result<zed::Command> {
let zls_binary = self.language_server_binary(language_server_id, worktree)?;
Ok(zed::Command {
command: zls_binary.path,
args: zls_binary.args.unwrap_or_default(),
env: zls_binary.environment.unwrap_or_default(),
})
}
fn language_server_workspace_configuration(
&mut self,
_language_server_id: &zed::LanguageServerId,
worktree: &zed::Worktree,
) -> Result<Option<serde_json::Value>> {
let settings = LspSettings::for_worktree("zls", worktree)
.ok()
.and_then(|lsp_settings| lsp_settings.settings.clone())
.unwrap_or_default();
Ok(Some(settings))
}
}
zed::register_extension!(ZigExtension);