Compare commits


41 Commits

Author SHA1 Message Date
Mikayla
9eb92d4599 Make project panel settings live update 2024-04-12 11:17:51 -07:00
Mikayla
88a1070b13 Merge branch 'main' into auto-folded-dirs 2024-04-12 10:54:55 -07:00
Mikayla
60e1ffbce5 Default auto-folding to off, restore previous tests 2024-04-12 10:41:51 -07:00
Kyle Kelley
49371b44cb Semantic Index (#10329)
This introduces semantic indexing in Zed based on chunking text from
files in the developer's workspace and creating vector embeddings using
an embedding model. As part of this, we've created an embeddings
provider trait that allows us to work with OpenAI, a local Ollama model,
or a Zed hosted embedding.

The semantic index is built by breaking down text for known
(programming) languages into manageable chunks that are smaller than the
max token size. Each chunk is then fed to a language model to create a
high dimensional vector which is then normalized to a unit vector to
allow fast comparison with other vectors with a simple dot product.
Alongside the vector, we store the path of the file and the range within
the document where the vector was sourced from.
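The unit-vector trick described above can be sketched in a few lines of Rust (a minimal illustration of the math, not the actual `semantic_index` code):

```rust
// Normalize an embedding to unit length so that cosine similarity between
// any two normalized embeddings reduces to a plain dot product.
fn normalize(v: &[f32]) -> Vec<f32> {
    let norm = v.iter().map(|x| x * x).sum::<f32>().sqrt();
    v.iter().map(|x| x / norm).collect()
}

// Dot product of two equal-length vectors; on unit vectors this is the
// cosine of the angle between them (1.0 for identical directions).
fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}
```

Normalizing once at indexing time means each query costs only one dot product per stored vector.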

Zed will soon grok contextual similarity across different text snippets,
allowing for natural language search beyond keyword matching. This is
being put together both for human-based search as well as providing
results to Large Language Models to allow them to refine how they help
developers.

Remaining todo:

* [x] Change `provider` to `model` within the zed hosted embeddings
database (as it's currently a combo of the provider and the model in one
name)


Release Notes:

- N/A

---------

Co-authored-by: Nathan Sobo <nathan@zed.dev>
Co-authored-by: Antonio Scandurra <me@as-cii.com>
Co-authored-by: Conrad Irwin <conrad@zed.dev>
Co-authored-by: Marshall Bowers <elliott.codes@gmail.com>
Co-authored-by: Antonio <antonio@zed.dev>
2024-04-12 11:40:59 -06:00
Maxime Forveille
4b40e83b8b gpui: Fix window title special characters display on X11 (#9994)
Before:

![image](https://github.com/zed-industries/zed/assets/13511978/f12a144a-5c41-44e9-8422-aa73ea54fb9c)

After:

![image](https://github.com/zed-industries/zed/assets/13511978/45e9b701-77a8-4e63-9481-dab895a347f7)

Release Notes:

- Fixed window title special characters display on X11.

---------

Co-authored-by: Mikayla Maki <mikayla@zed.dev>
2024-04-12 09:49:31 -07:00
Conrad Irwin
dffddaec4c Revert "Revert "language: Remove buffer fingerprinting (#9007)"" (#9671)
This reverts commit caed275fbf.

NOTE: this should not be merged until #9668 is on stable and the
`ZedVersion#can_collaborate` is updated to exclude all clients without
that change.

Release Notes:

- N/A

---------

Co-authored-by: Piotr Osiewicz <24362066+osiewicz@users.noreply.github.com>
2024-04-12 18:40:35 +02:00
Marshall Bowers
a4d6c5da7c toml: Bump to v0.1.0 (#10482)
This PR bumps the TOML extension to v0.1.0.

This version of the extension has been updated to use v0.0.6 of the
`zed_extension_api`.

Release Notes:

- N/A
2024-04-12 12:39:43 -04:00
Hans
3ea17248c8 Adjust left movement when soft_wrap mode is used (#10464)
Release Notes:

- Added/Fixed #10350
2024-04-12 10:36:31 -06:00
Marshall Bowers
e0e1103228 zig: Bump to v0.1.0 (#10481)
This PR bumps the Zig extension to v0.1.0.

This version of the extension has been updated to use v0.0.6 of the
`zed_extension_api`.

It also adds support for treating `.zig.zon` files as Zig files
(#10012).

Release Notes:

- N/A
2024-04-12 12:30:29 -04:00
Marshall Bowers
65c9e7d3d1 php: Bump to v0.0.2 (#10480)
This PR bumps the PHP extension to v0.0.2.

This version of the PHP extension adds the `language_ids` mappings from
#10053.

Release Notes:

- N/A
2024-04-12 12:19:01 -04:00
Marshall Bowers
b5b872656b Extract Terraform extension (#10479)
This PR extracts Terraform support into an extension and removes the
built-in Terraform support from Zed.

Release Notes:

- Removed built-in support for Terraform, in favor of making it available
as an extension. The Terraform extension will be suggested for download
when you open a `.tf`, `.tfvars`, or `.hcl` file.
2024-04-12 11:49:49 -04:00
Bennet Bo Fenner
f4d9a97195 preview tabs: Support find all references (#10470)
`FindAllReferences` will now open a preview tab, jumping to a definition
will also open a preview tab.


https://github.com/zed-industries/zed/assets/53836821/fa3db1fd-ccb3-4559-b3d2-b1fe57f86481

Note: One thing I would like to improve here is also adding support for
reopening `FindAllReferences` using the navigation history. As of now the
navigation history lacks support for reopening items other than project
files, which needs to be implemented first.

Release Notes:

- N/A
2024-04-12 17:22:12 +02:00
Bennet Bo Fenner
7b01a29f5a preview tabs: Fix tab selection getting out of sync (#10478)
There was an edge case where the project panel selection would not be
updated when opening a lot of tabs quickly using the preview tab
feature.
I spent way too long debugging this, thankfully @ConradIrwin spotted it
in like 5 minutes 🎉

Release Notes:

- N/A
2024-04-12 17:20:30 +02:00
张小白
04e89c4c51 Use workspace uuid (#10475)
Release Notes:

- N/A
2024-04-12 10:53:10 -04:00
Conrad Irwin
0ab5a524b0 Fix overlap (#10474)
Although I liked the symmetry of the count in the middle of the arrows,
it's tricky to make the buttons not occlude the count on hover, so go back
to this arrangement.

Release Notes:

- N/A
2024-04-12 08:25:09 -06:00
Bennet Bo Fenner
cd5ddfe34b chat panel: Add timestamp in tooltip to edited message (#10444)
Hovering over the `(edited)` text inside a message displays a tooltip
with the timestamp of when the message was last edited:


![image](https://github.com/zed-industries/zed/assets/53836821/be6d68c2-7447-42bc-bd5e-7a9053b3c980)

---

Also removed the `fade_out` style for the `(edited)` text, as this was
causing tooltips to fade out as well:


![image](https://github.com/zed-industries/zed/assets/53836821/91d3cf6a-db58-4e1d-b257-663b2ce1aca4)

Instead it uses `theme().text_muted` now.


Release Notes:

- Hovering over an edited message now displays a tooltip revealing the
timestamp of the last edit.
2024-04-12 14:26:41 +02:00
Bennet Bo Fenner
0a4c3488dd Fix typo in README (#10471)
Fixes a typo in the README which I believe was accidentally committed
yesterday in #10459


Release Notes:

- N/A
2024-04-12 14:18:54 +02:00
Piotr Osiewicz
a1cbc23fee task: use full task label to distinguish a terminal (#10469)
Spotted by @SomeoneToIgnore, in #10468 I've used a shortened task label,
which might lead to collisions.

Release Notes:

- N/A
2024-04-12 13:25:46 +02:00
Piotr Osiewicz
298e9c9387 task: Allow Rerun action to override properties of task being rerun (#10468)
For example:
```
"alt-t": [
    "task::Rerun",
     { "reevaluate_context": true, "allow_concurrent_runs": true }
],
```
Overriding `allow_concurrent_runs` to `true` by itself should terminate the
current instance of the task, if there is one.
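The override semantics can be sketched as merging optional overrides onto the stored task properties (hypothetical struct and field set for illustration, not Zed's actual types):

```rust
#[derive(Clone, Debug, PartialEq)]
struct TaskSpec {
    reevaluate_context: bool,
    allow_concurrent_runs: bool,
    use_new_terminal: bool,
}

// Each field is Some(..) only if the keybinding explicitly overrides it.
#[derive(Default)]
struct RerunOverrides {
    reevaluate_context: Option<bool>,
    allow_concurrent_runs: Option<bool>,
    use_new_terminal: Option<bool>,
}

// Produce the effective spec: overridden fields win, the rest fall back
// to whatever the original task specified.
fn apply(spec: &TaskSpec, o: &RerunOverrides) -> TaskSpec {
    TaskSpec {
        reevaluate_context: o.reevaluate_context.unwrap_or(spec.reevaluate_context),
        allow_concurrent_runs: o.allow_concurrent_runs.unwrap_or(spec.allow_concurrent_runs),
        use_new_terminal: o.use_new_terminal.unwrap_or(spec.use_new_terminal),
    }
}
```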

This PR also fixes task deduplication in the terminal panel to use the
expanded label and not the id, which depends on the task context. It kinda
aligns with how task rerun worked prior to #10341. That's omitted in the
release notes though, as it's not in Preview yet.

Release Notes:

- `Task::Rerun` action can now override the `allow_concurrent_runs` and
`use_new_terminal` properties of the task that is being rerun.
2024-04-12 12:44:50 +02:00
Thorsten Ball
6e1ba7e936 Allow hovering over tooltips in git blame sidebar (#10466)
This introduces a new API on `StatefulInteractiveElement` to create a
tooltip that can be hovered, scrolled inside, and clicked:
`.hoverable_tooltip`.

Right now we only use it in the `git blame` gutter, but the plan is to
use the new hover/click/scroll behavior in #10398 to introduce new
git-blame-tooltips.

Release Notes:

- N/A

---------

Co-authored-by: Antonio <antonio@zed.dev>
2024-04-12 11:47:32 +02:00
Thorsten Ball
bc0c2e0cae Extend Vim default keybindings (#10461)
This implements some of #10457.

Release notes:

- Added `g c c` and `g c` to Vim keybindings to toggle comments in
normal and visual mode respectively.
- Added `g ]` and `g [` to Vim keybindings to go to next and previous
diagnostic error.
- Changed `[ x` and `] x` (which select larger/smaller syntax node) in
Vim mode to also work in visual mode.
2024-04-12 08:05:38 +02:00
Mehmet Efe Akça
29a50573a9 Add git blame error reporting with notification (#10408)
<img width="1035" alt="Screenshot 2024-04-11 at 13 13 44"
src="https://github.com/zed-industries/zed/assets/13402668/cd0e96a0-41c6-4757-8840-97d15a75c511">

Release Notes:

- Added a notification to show possible `git blame` errors if it fails to run.

Caveats:
- ~git blame now executes in foreground executor (required since the Fut is !Send)~

TODOs:
- After a failed toggle, the app thinks the blame is shown. This means
toggling again will do nothing instead of retrying. (Caused by
editor.show_git_blame being set to true before the git blame is generated.)
- ~(Maybe) Trim error?~ Done

---------

Co-authored-by: Thorsten Ball <mrnugget@gmail.com>
2024-04-12 07:20:34 +02:00
Conrad Irwin
08786fa7bf Make BufferSearch less wide (#10459)
This also adds some "responsiveness" so that UI elements are hidden
before everything has to be occluded

Release Notes:

- Improved search UI. It now works in narrower panes, and avoids
scrolling the editor on open.

<img width="899" alt="Screenshot 2024-04-11 at 21 33 17"
src="https://github.com/zed-industries/zed/assets/94272/44b95d4f-08d6-4c40-a175-0e594402ca01">
<img width="508" alt="Screenshot 2024-04-11 at 21 33 45"
src="https://github.com/zed-industries/zed/assets/94272/baf4638d-427b-43e6-ad67-13d43f0f18a2">
<img width="361" alt="Screenshot 2024-04-11 at 21 34 00"
src="https://github.com/zed-industries/zed/assets/94272/ff60b561-2f77-49c0-9df7-e26227fe9225">
<img width="348" alt="Screenshot 2024-04-11 at 21 37 03"
src="https://github.com/zed-industries/zed/assets/94272/a2a700a2-ce99-41bd-bf47-9b14d7082b0e">
2024-04-11 23:07:29 -06:00
Hans
f2d61f3ea5 Add feature to display commands for vim mode (#10349)
Release Notes:

- Added the current operator stack to the Vim status bar at the bottom
of the editor. #4447

This commit introduces a new feature that displays the current partial
command in vim mode, similar to the behavior of the Vim plugin. This
helps users keep track of the commands they're entering.
2024-04-12 06:39:57 +02:00
Abykhodau Yury
b82dda64e1 Merge github.com:zed-industries/zed into auto-folded-dirs 2024-04-03 21:49:18 +03:00
Abykhodau Yury
dfbf6b9ee9 Improve performance of project panel 2024-04-03 21:49:13 +03:00
Abykhodau Yury
dbd2a247db fix Snapshot import issue 2024-03-28 23:19:51 +02:00
Abykhodau Yury
da4af5a651 Merge github.com:zed-industries/zed into auto-folded-dirs 2024-03-28 20:44:56 +02:00
Abykhodau Yury
e9c33bd819 Stop collecting the vector in is_foldable check 2024-03-28 20:41:41 +02:00
Abykhodau Yury
3dd6a7c6e5 Merge branch 'main' of github.com:zed-industries/zed into auto-folded-dirs 2024-02-28 22:10:32 +02:00
Abykhodau Yury
b946797390 Solve performance issues with project panel auto folding dirs 2024-02-28 22:10:21 +02:00
Abykhodau Yury
fd32cc4679 Merge branch 'main' of github.com:zed-industries/zed into auto-folded-dirs 2024-02-26 10:58:26 +02:00
Abykhodau Yury
4d5509be6b Refactor project panel tests according to changes in auto folding of directories 2024-02-26 10:56:56 +02:00
Abykhodau Yury
17cea27bbc Add support for fold/unfold directory functionality 2024-02-25 22:39:16 +02:00
Abykhodau Yury
fe4b744ae9 Refactor project_panel to get rid of redundant state 2024-02-25 14:46:45 +02:00
Abykhodau Yury
a5b8b5fdb3 Adjust tests after adding functionality of auto collapsed dir paths 2024-02-12 11:00:06 +02:00
Abykhodau Yury
4275283201 Merge branch 'main' of github.com:zed-industries/zed into auto-folded-dirs 2024-02-12 08:33:07 +02:00
Abykhodau Yury
6de4aa5990 Adding test for auto collapsing dir paths 2024-02-11 18:32:03 +02:00
Abykhodau Yury
b375001228 Add 'auto_collapse_dirs' parameter to project_panel settings 2024-02-10 21:36:25 +02:00
Abykhodau Yury
00dd254ba9 Add support for fold/unfold action for directories 2024-02-10 19:26:01 +02:00
Abykhodau Yury
00d8a92326 Add support for collapsed paths in nested directories within the project panel 2024-02-10 15:00:02 +02:00
105 changed files with 3705 additions and 532 deletions

Cargo.lock generated
View File

@@ -1578,17 +1578,6 @@ dependencies = [
"workspace",
]
[[package]]
name = "bromberg_sl2"
version = "0.6.0"
source = "git+https://github.com/zed-industries/bromberg_sl2?rev=950bc5482c216c395049ae33ae4501e08975f17f#950bc5482c216c395049ae33ae4501e08975f17f"
dependencies = [
"digest 0.9.0",
"lazy_static",
"rayon",
"seq-macro",
]
[[package]]
name = "bstr"
version = "1.6.2"
@@ -3276,6 +3265,15 @@ version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9ea835d29036a4087793836fa931b08837ad5e957da9e23886b29586fb9b6650"
[[package]]
name = "doxygen-rs"
version = "0.4.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "415b6ec780d34dcf624666747194393603d0373b7141eef01d12ee58881507d9"
dependencies = [
"phf",
]
[[package]]
name = "dwrote"
version = "0.11.0"
@@ -4096,6 +4094,17 @@ dependencies = [
"futures-util",
]
[[package]]
name = "futures-batch"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6f444c45a1cb86f2a7e301469fd50a82084a60dadc25d94529a8312276ecb71a"
dependencies = [
"futures 0.3.28",
"futures-timer",
"pin-utils",
]
[[package]]
name = "futures-channel"
version = "0.3.30"
@@ -4191,6 +4200,12 @@ version = "0.3.30"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "38d84fa142264698cdce1a9f9172cf383a0c82de1bddcf3092901442c4097004"
[[package]]
name = "futures-timer"
version = "3.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f288b0a4f20f9a56b5d1da57e2227c661b7b16168e2f72365f57b63326e29b24"
[[package]]
name = "futures-util"
version = "0.3.30"
@@ -4670,6 +4685,41 @@ dependencies = [
"unicode-segmentation",
]
[[package]]
name = "heed"
version = "0.20.0-alpha.9"
source = "git+https://github.com/meilisearch/heed?rev=036ac23f73a021894974b9adc815bc95b3e0482a#036ac23f73a021894974b9adc815bc95b3e0482a"
dependencies = [
"bitflags 2.4.2",
"byteorder",
"heed-traits",
"heed-types",
"libc",
"lmdb-master-sys",
"once_cell",
"page_size",
"serde",
"synchronoise",
"url",
]
[[package]]
name = "heed-traits"
version = "0.20.0-alpha.9"
source = "git+https://github.com/meilisearch/heed?rev=036ac23f73a021894974b9adc815bc95b3e0482a#036ac23f73a021894974b9adc815bc95b3e0482a"
[[package]]
name = "heed-types"
version = "0.20.0-alpha.9"
source = "git+https://github.com/meilisearch/heed?rev=036ac23f73a021894974b9adc815bc95b3e0482a#036ac23f73a021894974b9adc815bc95b3e0482a"
dependencies = [
"bincode",
"byteorder",
"heed-traits",
"serde",
"serde_json",
]
[[package]]
name = "hermit-abi"
version = "0.1.19"
@@ -5463,7 +5513,6 @@ dependencies = [
"tree-sitter-go",
"tree-sitter-gomod",
"tree-sitter-gowork",
"tree-sitter-hcl",
"tree-sitter-heex",
"tree-sitter-jsdoc",
"tree-sitter-json 0.20.0",
@@ -5676,6 +5725,16 @@ dependencies = [
"sha2 0.10.7",
]
[[package]]
name = "lmdb-master-sys"
version = "0.1.0"
source = "git+https://github.com/meilisearch/heed?rev=036ac23f73a021894974b9adc815bc95b3e0482a#036ac23f73a021894974b9adc815bc95b3e0482a"
dependencies = [
"cc",
"doxygen-rs",
"libc",
]
[[package]]
name = "lock_api"
version = "0.4.10"
@@ -6695,6 +6754,16 @@ dependencies = [
"sha2 0.10.7",
]
[[package]]
name = "page_size"
version = "0.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "30d5b2194ed13191c1999ae0704b7839fb18384fa22e49b57eeaa97d79ce40da"
dependencies = [
"libc",
"winapi",
]
[[package]]
name = "palette"
version = "0.7.5"
@@ -6868,9 +6937,33 @@ version = "0.11.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ade2d8b8f33c7333b51bcf0428d37e217e9f32192ae4772156f65063b8ce03dc"
dependencies = [
"phf_macros",
"phf_shared",
]
[[package]]
name = "phf_generator"
version = "0.11.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "48e4cc64c2ad9ebe670cb8fd69dd50ae301650392e81c05f9bfcb2d5bdbc24b0"
dependencies = [
"phf_shared",
"rand 0.8.5",
]
[[package]]
name = "phf_macros"
version = "0.11.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3444646e286606587e49f3bcf1679b8cef1dc2c5ecc29ddacaffc305180d464b"
dependencies = [
"phf_generator",
"phf_shared",
"proc-macro2",
"quote",
"syn 2.0.48",
]
[[package]]
name = "phf_shared"
version = "0.11.2"
@@ -7943,7 +8036,6 @@ name = "rope"
version = "0.1.0"
dependencies = [
"arrayvec",
"bromberg_sl2",
"criterion",
"gpui",
"log",
@@ -8486,6 +8578,35 @@ version = "1.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "58bf37232d3bb9a2c4e641ca2a11d83b5062066f88df7fed36c28772046d65ba"
[[package]]
name = "semantic_index"
version = "0.1.0"
dependencies = [
"anyhow",
"client",
"clock",
"collections",
"env_logger",
"fs",
"futures 0.3.28",
"futures-batch",
"gpui",
"heed",
"language",
"languages",
"log",
"open_ai",
"project",
"serde",
"serde_json",
"settings",
"sha2 0.10.7",
"smol",
"tempfile",
"util",
"worktree",
]
[[package]]
name = "semantic_version"
version = "0.1.0"
@@ -8500,12 +8621,6 @@ version = "1.0.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b0293b4b29daaf487284529cc2f5675b8e57c61f70167ba415a463651fd6a918"
[[package]]
name = "seq-macro"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a9f47faea3cad316faa914d013d24f471cd90bfca1a0c70f05a3f42c6441e99"
[[package]]
name = "serde"
version = "1.0.196"
@@ -9497,6 +9612,15 @@ version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2047c6ded9c721764247e62cd3b03c09ffc529b2ba5b10ec482ae507a4a70160"
[[package]]
name = "synchronoise"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3dbc01390fc626ce8d1cffe3376ded2b72a11bb70e1c75f404a210e4daa4def2"
dependencies = [
"crossbeam-queue",
]
[[package]]
name = "sys-locale"
version = "0.3.1"
@@ -10417,15 +10541,6 @@ dependencies = [
"tree-sitter",
]
[[package]]
name = "tree-sitter-hcl"
version = "0.0.1"
source = "git+https://github.com/MichaHoffmann/tree-sitter-hcl?rev=v1.1.0#636dbe70301ecbab8f353c8c78b3406fe4f185f5"
dependencies = [
"cc",
"tree-sitter",
]
[[package]]
name = "tree-sitter-heex"
version = "0.0.1"
@@ -12657,7 +12772,7 @@ dependencies = [
[[package]]
name = "zed_php"
version = "0.0.1"
version = "0.0.2"
dependencies = [
"zed_extension_api 0.0.4",
]
@@ -12684,10 +12799,17 @@ dependencies = [
]
[[package]]
name = "zed_toml"
version = "0.0.2"
name = "zed_terraform"
version = "0.0.1"
dependencies = [
"zed_extension_api 0.0.5",
"zed_extension_api 0.0.6 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "zed_toml"
version = "0.1.0"
dependencies = [
"zed_extension_api 0.0.6 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
@@ -12699,9 +12821,9 @@ dependencies = [
[[package]]
name = "zed_zig"
version = "0.0.1"
version = "0.1.0"
dependencies = [
"zed_extension_api 0.0.5",
"zed_extension_api 0.0.6 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]

View File

@@ -73,6 +73,7 @@ members = [
"crates/task",
"crates/tasks_ui",
"crates/search",
"crates/semantic_index",
"crates/semantic_version",
"crates/settings",
"crates/snippet",
@@ -117,6 +118,7 @@ members = [
"extensions/prisma",
"extensions/purescript",
"extensions/svelte",
"extensions/terraform",
"extensions/toml",
"extensions/uiua",
"extensions/zig",
@@ -252,9 +254,11 @@ derive_more = "0.99.17"
emojis = "0.6.1"
env_logger = "0.9"
futures = "0.3"
futures-batch = "0.6.1"
futures-lite = "1.13"
git2 = { version = "0.15", default-features = false }
globset = "0.4"
heed = { git = "https://github.com/meilisearch/heed", rev = "036ac23f73a021894974b9adc815bc95b3e0482a", features = ["read-txn-no-tls"] }
hex = "0.4.3"
ignore = "0.4.22"
indoc = "1"
@@ -322,7 +326,6 @@ tree-sitter-embedded-template = "0.20.0"
tree-sitter-go = { git = "https://github.com/tree-sitter/tree-sitter-go", rev = "aeb2f33b366fd78d5789ff104956ce23508b85db" }
tree-sitter-gomod = { git = "https://github.com/camdencheek/tree-sitter-go-mod" }
tree-sitter-gowork = { git = "https://github.com/d1y/tree-sitter-go-work" }
tree-sitter-hcl = { git = "https://github.com/MichaHoffmann/tree-sitter-hcl", rev = "v1.1.0" }
rustc-demangle = "0.1.23"
tree-sitter-heex = { git = "https://github.com/phoenixframework/tree-sitter-heex", rev = "2e1348c3cf2c9323e87c2744796cf3f3868aa82a" }
tree-sitter-html = "0.19.0"
@@ -343,7 +346,7 @@ unindent = "0.1.7"
unicase = "2.6"
unicode-segmentation = "1.10"
url = "2.2"
uuid = { version = "1.1.2", features = ["v4"] }
uuid = { version = "1.1.2", features = ["v4", "v5"] }
wasmparser = "0.201"
wasm-encoder = "0.201"
wasmtime = { version = "19.0.0", default-features = false, features = [

View File

@@ -1,6 +1,6 @@
# Zed
[![CI](https://github.com/zed-industries/zed/actions/workflows/ci.yml/badge.svg)](https://github.com/zed-industries/ze34actions/workflows/ci.yml)
[![CI](https://github.com/zed-industries/zed/actions/workflows/ci.yml/badge.svg)](https://github.com/zed-industries/zed/actions/workflows/ci.yml)
Welcome to Zed, a high-performance, multiplayer code editor from the creators of [Atom](https://github.com/atom/atom) and [Tree-sitter](https://github.com/tree-sitter/tree-sitter).

View File

@@ -234,6 +234,8 @@
"displayLines": true
}
],
"g ]": "editor::GoToDiagnostic",
"g [": "editor::GoToPrevDiagnostic",
"shift-h": "vim::WindowTop",
"shift-m": "vim::WindowMiddle",
"shift-l": "vim::WindowBottom",
@@ -367,6 +369,15 @@
"< <": "vim::Outdent",
"ctrl-pagedown": "pane::ActivateNextItem",
"ctrl-pageup": "pane::ActivatePrevItem",
// tree-sitter related commands
"[ x": "editor::SelectLargerSyntaxNode",
"] x": "editor::SelectSmallerSyntaxNode"
}
},
{
"context": "Editor && vim_mode == visual && vim_operator == none && !VimWaiting",
"bindings": {
// tree-sitter related commands
"[ x": "editor::SelectLargerSyntaxNode",
"] x": "editor::SelectSmallerSyntaxNode"
}
@@ -532,6 +543,18 @@
]
}
},
{
"context": "Editor && vim_mode == normal",
"bindings": {
"g c c": "editor::ToggleComments"
}
},
{
"context": "Editor && vim_mode == visual",
"bindings": {
"g c": "editor::ToggleComments"
}
},
{
"context": "Editor && vim_mode == insert",
"bindings": {

View File

@@ -214,7 +214,10 @@
// Whether to reveal it in the project panel automatically,
// when a corresponding project entry becomes active.
// Gitignored entries are never auto revealed.
"auto_reveal_entries": true
"auto_reveal_entries": true,
// Whether to fold directories automatically
// when a directory has only one directory inside.
"auto_fold_dirs": false
},
"collaboration_panel": {
// Whether to show the collaboration panel button in the status bar.

View File

@@ -264,7 +264,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
);
assert_eq!(
channel.next_event(cx),
channel.next_event(cx).await,
ChannelChatEvent::MessagesUpdated {
old_range: 2..2,
new_count: 1,
@@ -317,7 +317,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
);
assert_eq!(
channel.next_event(cx),
channel.next_event(cx).await,
ChannelChatEvent::MessagesUpdated {
old_range: 0..0,
new_count: 2,

View File

@@ -0,0 +1,9 @@
CREATE TABLE IF NOT EXISTS "embeddings" (
"model" TEXT,
"digest" BYTEA,
"dimensions" FLOAT4[1536],
"retrieved_at" TIMESTAMP NOT NULL DEFAULT now(),
PRIMARY KEY ("model", "digest")
);
CREATE INDEX IF NOT EXISTS "idx_retrieved_at_on_embeddings" ON "embeddings" ("retrieved_at");
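The migration above backs a simple look-aside cache: rows are keyed by (model, digest-of-text) and stamped with `retrieved_at` so stale rows can be purged, which the collab code later in this diff does after 60 days. A minimal in-memory sketch of the same scheme (illustrative only; the real keys are SHA-256 digests and the store is Postgres):

```rust
use std::collections::HashMap;
use std::time::{Duration, SystemTime};

// Embeddings keyed by (model, digest), each stamped with when it was
// last retrieved so old entries can be purged.
struct EmbeddingCache {
    rows: HashMap<(String, Vec<u8>), (Vec<f32>, SystemTime)>,
}

impl EmbeddingCache {
    fn new() -> Self {
        Self { rows: HashMap::new() }
    }

    fn save(&mut self, model: &str, digest: Vec<u8>, dims: Vec<f32>) {
        self.rows
            .insert((model.to_string(), digest), (dims, SystemTime::now()));
    }

    fn get(&mut self, model: &str, digest: &[u8]) -> Option<Vec<f32>> {
        let key = (model.to_string(), digest.to_vec());
        if let Some((dims, retrieved_at)) = self.rows.get_mut(&key) {
            // Refresh the timestamp on hit, like the UPDATE in get_embeddings.
            *retrieved_at = SystemTime::now();
            return Some(dims.clone());
        }
        None
    }

    // Drop every row last retrieved more than `max_age` ago.
    fn purge_older_than(&mut self, max_age: Duration) {
        let cutoff = SystemTime::now() - max_age;
        self.rows.retain(|_, (_, retrieved_at)| *retrieved_at > cutoff);
    }
}
```

The `retrieved_at` index in the migration exists so the equivalent `DELETE ... WHERE retrieved_at <= cutoff` purge doesn't scan the whole table.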

View File

@@ -6,6 +6,7 @@ pub mod channels;
pub mod contacts;
pub mod contributors;
pub mod dev_servers;
pub mod embeddings;
pub mod extensions;
pub mod hosted_projects;
pub mod messages;

View File

@@ -0,0 +1,94 @@
use super::*;
use time::Duration;
use time::OffsetDateTime;
impl Database {
pub async fn get_embeddings(
&self,
model: &str,
digests: &[Vec<u8>],
) -> Result<HashMap<Vec<u8>, Vec<f32>>> {
self.weak_transaction(|tx| async move {
let embeddings = {
let mut db_embeddings = embedding::Entity::find()
.filter(
embedding::Column::Model.eq(model).and(
embedding::Column::Digest
.is_in(digests.iter().map(|digest| digest.as_slice())),
),
)
.stream(&*tx)
.await?;
let mut embeddings = HashMap::default();
while let Some(db_embedding) = db_embeddings.next().await {
let db_embedding = db_embedding?;
embeddings.insert(db_embedding.digest, db_embedding.dimensions);
}
embeddings
};
if !embeddings.is_empty() {
let now = OffsetDateTime::now_utc();
let retrieved_at = PrimitiveDateTime::new(now.date(), now.time());
embedding::Entity::update_many()
.filter(
embedding::Column::Digest
.is_in(embeddings.keys().map(|digest| digest.as_slice())),
)
.col_expr(embedding::Column::RetrievedAt, Expr::value(retrieved_at))
.exec(&*tx)
.await?;
}
Ok(embeddings)
})
.await
}
pub async fn save_embeddings(
&self,
model: &str,
embeddings: &HashMap<Vec<u8>, Vec<f32>>,
) -> Result<()> {
self.weak_transaction(|tx| async move {
embedding::Entity::insert_many(embeddings.iter().map(|(digest, dimensions)| {
let now_offset_datetime = OffsetDateTime::now_utc();
let retrieved_at =
PrimitiveDateTime::new(now_offset_datetime.date(), now_offset_datetime.time());
embedding::ActiveModel {
model: ActiveValue::set(model.to_string()),
digest: ActiveValue::set(digest.clone()),
dimensions: ActiveValue::set(dimensions.clone()),
retrieved_at: ActiveValue::set(retrieved_at),
}
}))
.on_conflict(
OnConflict::columns([embedding::Column::Model, embedding::Column::Digest])
.do_nothing()
.to_owned(),
)
.exec_without_returning(&*tx)
.await?;
Ok(())
})
.await
}
pub async fn purge_old_embeddings(&self) -> Result<()> {
self.weak_transaction(|tx| async move {
embedding::Entity::delete_many()
.filter(
embedding::Column::RetrievedAt
.lte(OffsetDateTime::now_utc() - Duration::days(60)),
)
.exec(&*tx)
.await?;
Ok(())
})
.await
}
}

View File

@@ -11,6 +11,7 @@ pub mod channel_message_mention;
pub mod contact;
pub mod contributor;
pub mod dev_server;
pub mod embedding;
pub mod extension;
pub mod extension_version;
pub mod feature_flag;

View File

@@ -0,0 +1,18 @@
use sea_orm::entity::prelude::*;
use time::PrimitiveDateTime;
#[derive(Clone, Debug, PartialEq, DeriveEntityModel)]
#[sea_orm(table_name = "embeddings")]
pub struct Model {
#[sea_orm(primary_key)]
pub model: String,
#[sea_orm(primary_key)]
pub digest: Vec<u8>,
pub dimensions: Vec<f32>,
pub retrieved_at: PrimitiveDateTime,
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveRelation)]
pub enum Relation {}
impl ActiveModelBehavior for ActiveModel {}

View File

@@ -2,6 +2,7 @@ mod buffer_tests;
mod channel_tests;
mod contributor_tests;
mod db_tests;
mod embedding_tests;
mod extension_tests;
mod feature_flag_tests;
mod message_tests;

View File

@@ -0,0 +1,84 @@
use super::TestDb;
use crate::db::embedding;
use collections::HashMap;
use sea_orm::{sea_query::Expr, ColumnTrait, EntityTrait, QueryFilter};
use std::ops::Sub;
use time::{Duration, OffsetDateTime, PrimitiveDateTime};
// SQLite does not support array arguments, so we only test this against a real postgres instance
#[gpui::test]
async fn test_get_embeddings_postgres(cx: &mut gpui::TestAppContext) {
let test_db = TestDb::postgres(cx.executor().clone());
let db = test_db.db();
let provider = "test_model";
let digest1 = vec![1, 2, 3];
let digest2 = vec![4, 5, 6];
let embeddings = HashMap::from_iter([
(digest1.clone(), vec![0.1, 0.2, 0.3]),
(digest2.clone(), vec![0.4, 0.5, 0.6]),
]);
// Save embeddings
db.save_embeddings(provider, &embeddings).await.unwrap();
// Retrieve embeddings
let retrieved_embeddings = db
.get_embeddings(provider, &[digest1.clone(), digest2.clone()])
.await
.unwrap();
assert_eq!(retrieved_embeddings.len(), 2);
assert!(retrieved_embeddings.contains_key(&digest1));
assert!(retrieved_embeddings.contains_key(&digest2));
// Check if the retrieved embeddings are correct
assert_eq!(retrieved_embeddings[&digest1], vec![0.1, 0.2, 0.3]);
assert_eq!(retrieved_embeddings[&digest2], vec![0.4, 0.5, 0.6]);
}
#[gpui::test]
async fn test_purge_old_embeddings(cx: &mut gpui::TestAppContext) {
let test_db = TestDb::postgres(cx.executor().clone());
let db = test_db.db();
let model = "test_model";
let digest = vec![7, 8, 9];
let embeddings = HashMap::from_iter([(digest.clone(), vec![0.7, 0.8, 0.9])]);
// Save old embeddings
db.save_embeddings(model, &embeddings).await.unwrap();
// Reach into the DB and change the retrieved at to be > 60 days
db.weak_transaction(|tx| {
let digest = digest.clone();
async move {
let sixty_days_ago = OffsetDateTime::now_utc().sub(Duration::days(61));
let retrieved_at = PrimitiveDateTime::new(sixty_days_ago.date(), sixty_days_ago.time());
embedding::Entity::update_many()
.filter(
embedding::Column::Model
.eq(model)
.and(embedding::Column::Digest.eq(digest)),
)
.col_expr(embedding::Column::RetrievedAt, Expr::value(retrieved_at))
.exec(&*tx)
.await
.unwrap();
Ok(())
}
})
.await
.unwrap();
// Purge old embeddings
db.purge_old_embeddings().await.unwrap();
// Try to retrieve the purged embeddings
let retrieved_embeddings = db.get_embeddings(model, &[digest.clone()]).await.unwrap();
assert!(
retrieved_embeddings.is_empty(),
"Old embeddings should have been purged"
);
}

View File

@@ -6,8 +6,8 @@ use axum::{
Extension, Router,
};
use collab::{
api::fetch_extensions_from_blob_store_periodically, db, env, executor::Executor, AppState,
Config, RateLimiter, Result,
api::fetch_extensions_from_blob_store_periodically, db, env, executor::Executor,
rpc::ResultExt, AppState, Config, RateLimiter, Result,
};
use db::Database;
use std::{
@@ -23,7 +23,7 @@ use tower_http::trace::TraceLayer;
use tracing_subscriber::{
filter::EnvFilter, fmt::format::JsonFields, util::SubscriberInitExt, Layer,
};
use util::ResultExt;
use util::ResultExt as _;
const VERSION: &str = env!("CARGO_PKG_VERSION");
const REVISION: Option<&'static str> = option_env!("GITHUB_SHA");
@@ -90,6 +90,7 @@ async fn main() -> Result<()> {
};
if is_collab {
state.db.purge_old_embeddings().await.trace_err();
RateLimiter::save_periodically(state.rate_limiter.clone(), state.executor.clone());
}

View File

@@ -32,6 +32,8 @@ use axum::{
use collections::{HashMap, HashSet};
pub use connection_pool::{ConnectionPool, ZedVersion};
use core::fmt::{self, Debug, Formatter};
use open_ai::{OpenAiEmbeddingModel, OPEN_AI_API_URL};
use sha2::Digest;
use futures::{
channel::oneshot,
@@ -568,6 +570,22 @@ impl Server {
app_state.config.google_ai_api_key.clone(),
)
})
})
.add_request_handler({
user_handler(move |request, response, session| {
get_cached_embeddings(request, response, session)
})
})
.add_request_handler({
let app_state = app_state.clone();
user_handler(move |request, response, session| {
compute_embeddings(
request,
response,
session,
app_state.config.openai_api_key.clone(),
)
})
});
Arc::new(server)
@@ -4021,8 +4039,6 @@ async fn complete_with_open_ai(
session: UserSession,
api_key: Arc<str>,
) -> Result<()> {
const OPEN_AI_API_URL: &str = "https://api.openai.com/v1";
let mut completion_stream = open_ai::stream_completion(
&session.http_client,
OPEN_AI_API_URL,
@@ -4276,6 +4292,128 @@ async fn count_tokens_with_language_model(
Ok(())
}
struct ComputeEmbeddingsRateLimit;
impl RateLimit for ComputeEmbeddingsRateLimit {
fn capacity() -> usize {
std::env::var("EMBED_TEXTS_RATE_LIMIT_PER_HOUR")
.ok()
.and_then(|v| v.parse().ok())
.unwrap_or(120) // Picked arbitrarily
}
fn refill_duration() -> chrono::Duration {
chrono::Duration::hours(1)
}
fn db_name() -> &'static str {
"compute-embeddings"
}
}
async fn compute_embeddings(
request: proto::ComputeEmbeddings,
response: Response<proto::ComputeEmbeddings>,
session: UserSession,
api_key: Option<Arc<str>>,
) -> Result<()> {
let api_key = api_key.context("no OpenAI API key configured on the server")?;
authorize_access_to_language_models(&session).await?;
session
.rate_limiter
.check::<ComputeEmbeddingsRateLimit>(session.user_id())
.await?;
let embeddings = match request.model.as_str() {
"openai/text-embedding-3-small" => {
open_ai::embed(
&session.http_client,
OPEN_AI_API_URL,
&api_key,
OpenAiEmbeddingModel::TextEmbedding3Small,
request.texts.iter().map(|text| text.as_str()),
)
.await?
}
provider => return Err(anyhow!("unsupported embedding provider {:?}", provider))?,
};
let embeddings = request
.texts
.iter()
.map(|text| {
let mut hasher = sha2::Sha256::new();
hasher.update(text.as_bytes());
let result = hasher.finalize();
result.to_vec()
})
.zip(
embeddings
.data
.into_iter()
.map(|embedding| embedding.embedding),
)
.collect::<HashMap<_, _>>();
let db = session.db().await;
db.save_embeddings(&request.model, &embeddings)
.await
.context("failed to save embeddings")
.trace_err();
response.send(proto::ComputeEmbeddingsResponse {
embeddings: embeddings
.into_iter()
.map(|(digest, dimensions)| proto::Embedding { digest, dimensions })
.collect(),
})?;
Ok(())
}
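The pairing step in `compute_embeddings` above can be sketched in isolation: each input text is hashed, and the digests are zipped with the embeddings the provider returned (which arrive in request order) to form the digest-to-embedding map that gets persisted. The real code uses SHA-256 via the `sha2` crate; std's `DefaultHasher` is a stand-in here so the sketch stays dependency-free.

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

// Stand-in digest: the real code uses SHA-256 (`sha2` crate).
pub fn digest(text: &str) -> Vec<u8> {
    let mut hasher = DefaultHasher::new();
    text.hash(&mut hasher);
    hasher.finish().to_be_bytes().to_vec()
}

// Zip each text's digest with the embedding returned for it, mirroring
// the `.zip(...)` in `compute_embeddings` above.
pub fn pair_embeddings(texts: &[&str], embeddings: Vec<Vec<f32>>) -> HashMap<Vec<u8>, Vec<f32>> {
    texts.iter().map(|text| digest(text)).zip(embeddings).collect()
}
```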
struct GetCachedEmbeddingsRateLimit;
impl RateLimit for GetCachedEmbeddingsRateLimit {
fn capacity() -> usize {
std::env::var("EMBED_TEXTS_RATE_LIMIT_PER_HOUR")
.ok()
.and_then(|v| v.parse().ok())
.unwrap_or(120) // Picked arbitrarily
}
fn refill_duration() -> chrono::Duration {
chrono::Duration::hours(1)
}
fn db_name() -> &'static str {
"get-cached-embeddings"
}
}
async fn get_cached_embeddings(
request: proto::GetCachedEmbeddings,
response: Response<proto::GetCachedEmbeddings>,
session: UserSession,
) -> Result<()> {
authorize_access_to_language_models(&session).await?;
session
.rate_limiter
.check::<GetCachedEmbeddingsRateLimit>(session.user_id())
.await?;
let db = session.db().await;
let embeddings = db.get_embeddings(&request.model, &request.digests).await?;
response.send(proto::GetCachedEmbeddingsResponse {
embeddings: embeddings
.into_iter()
.map(|(digest, dimensions)| proto::Embedding { digest, dimensions })
.collect(),
})?;
Ok(())
}
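A client of `get_cached_embeddings` would typically split its digests into cache hits and misses, sending only the misses on to `compute_embeddings`. A hypothetical sketch of that client-side partition (this helper is not in the diff):

```rust
use std::collections::HashMap;

// Hypothetical client-side helper: split requested digests into cached
// (digest, embedding) pairs and digests that still need computing.
pub fn partition_cached(
    digests: &[Vec<u8>],
    cache: &HashMap<Vec<u8>, Vec<f32>>,
) -> (Vec<(Vec<u8>, Vec<f32>)>, Vec<Vec<u8>>) {
    let mut hits = Vec::new();
    let mut misses = Vec::new();
    for digest in digests {
        match cache.get(digest) {
            Some(embedding) => hits.push((digest.clone(), embedding.clone())),
            None => misses.push(digest.clone()),
        }
    }
    (hits, misses)
}
```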
async fn authorize_access_to_language_models(session: &UserSession) -> Result<(), Error> {
let db = session.db().await;
let flags = db.get_user_flags(session.user_id()).await?;

@@ -31,7 +31,7 @@ impl fmt::Display for ZedVersion {
impl ZedVersion {
pub fn can_collaborate(&self) -> bool {
self.0 >= SemanticVersion::new(0, 127, 3)
self.0 >= SemanticVersion::new(0, 129, 2)
}
}
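The version gate above reduces to a lexicographic (major, minor, patch) comparison, which derived `Ord` on a tuple struct gives for free. A sketch with illustrative names (not gpui's actual types):

```rust
// Illustrative stand-in for SemanticVersion: derived Ord compares the
// fields of a tuple struct in order, i.e. (major, minor, patch).
#[derive(PartialEq, Eq, PartialOrd, Ord)]
pub struct SemVer(pub u32, pub u32, pub u32);

pub fn can_collaborate(version: SemVer) -> bool {
    // The diff above raises the minimum collaborating version
    // from 0.127.3 to 0.129.2.
    version >= SemVer(0, 129, 2)
}
```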

@@ -1347,13 +1347,11 @@ impl RandomizedTest for ProjectCollaborationTest {
client.username
);
let host_saved_version_fingerprint =
host_buffer.read_with(host_cx, |b, _| b.saved_version_fingerprint());
let guest_saved_version_fingerprint =
guest_buffer.read_with(client_cx, |b, _| b.saved_version_fingerprint());
let host_is_dirty = host_buffer.read_with(host_cx, |b, _| b.is_dirty());
let guest_is_dirty = guest_buffer.read_with(client_cx, |b, _| b.is_dirty());
assert_eq!(
guest_saved_version_fingerprint, host_saved_version_fingerprint,
"guest {} saved fingerprint does not match host's for path {path:?} in project {project_id}",
guest_is_dirty, host_is_dirty,
"guest {} dirty state does not match host's for path {path:?} in project {project_id}",
client.username
);

@@ -531,6 +531,8 @@ impl ChatPanel {
&self.languages,
self.client.id(),
&message,
self.local_timezone,
cx,
)
});
el.child(
@@ -744,6 +746,8 @@ impl ChatPanel {
language_registry: &Arc<LanguageRegistry>,
current_user_id: u64,
message: &channel::ChannelMessage,
local_timezone: UtcOffset,
cx: &AppContext,
) -> RichText {
let mentions = message
.mentions
@@ -754,24 +758,39 @@ impl ChatPanel {
})
.collect::<Vec<_>>();
const MESSAGE_UPDATED: &str = " (edited)";
const MESSAGE_EDITED: &str = " (edited)";
let mut body = message.body.clone();
if message.edited_at.is_some() {
body.push_str(MESSAGE_UPDATED);
body.push_str(MESSAGE_EDITED);
}
let mut rich_text = rich_text::render_rich_text(body, &mentions, language_registry, None);
if message.edited_at.is_some() {
let range = (rich_text.text.len() - MESSAGE_EDITED.len())..rich_text.text.len();
rich_text.highlights.push((
(rich_text.text.len() - MESSAGE_UPDATED.len())..rich_text.text.len(),
range.clone(),
Highlight::Highlight(HighlightStyle {
fade_out: Some(0.8),
color: Some(cx.theme().colors().text_muted),
..Default::default()
}),
));
if let Some(edit_timestamp) = message.edited_at {
let edit_timestamp_text = time_format::format_localized_timestamp(
edit_timestamp,
OffsetDateTime::now_utc(),
local_timezone,
time_format::TimestampFormat::Absolute,
);
rich_text.custom_ranges.push(range);
rich_text.set_tooltip_builder_for_custom_ranges(move |_, _, cx| {
Some(Tooltip::text(edit_timestamp_text.clone(), cx))
})
}
}
rich_text
}
@@ -1176,7 +1195,13 @@ mod tests {
edited_at: None,
};
let message = ChatPanel::render_markdown_with_mentions(&language_registry, 102, &message);
let message = ChatPanel::render_markdown_with_mentions(
&language_registry,
102,
&message,
UtcOffset::UTC,
cx,
);
// Note that the "'" was replaced with "’" due to smart punctuation.
let (body, ranges) = marked_text_ranges("«hi», «@abc», lets «call» «@fgh»", false);
@@ -1224,7 +1249,13 @@ mod tests {
edited_at: None,
};
let message = ChatPanel::render_markdown_with_mentions(&language_registry, 102, &message);
let message = ChatPanel::render_markdown_with_mentions(
&language_registry,
102,
&message,
UtcOffset::UTC,
cx,
);
// Note that the "'" was replaced with "’" due to smart punctuation.
let (body, ranges) =
@@ -1265,7 +1296,13 @@ mod tests {
edited_at: None,
};
let message = ChatPanel::render_markdown_with_mentions(&language_registry, 102, &message);
let message = ChatPanel::render_markdown_with_mentions(
&language_registry,
102,
&message,
UtcOffset::UTC,
cx,
);
// Note that the "'" was replaced with "’" due to smart punctuation.
let (body, ranges) = marked_text_ranges(

@@ -1255,7 +1255,6 @@ mod tests {
&self,
_: BufferId,
_: &clock::Global,
_: language::RopeFingerprint,
_: language::LineEnding,
_: Option<std::time::SystemTime>,
_: &mut AppContext,

@@ -129,6 +129,7 @@ use ui::{
Tooltip,
};
use util::{defer, maybe, post_inc, RangeExt, ResultExt, TryFutureExt};
use workspace::item::ItemHandle;
use workspace::notifications::NotificationId;
use workspace::Toast;
use workspace::{
@@ -471,6 +472,8 @@ pub struct Editor {
+ Fn(&mut Self, DisplayPoint, &mut ViewContext<Self>) -> Option<View<ui::ContextMenu>>,
>,
>,
last_bounds: Option<Bounds<Pixels>>,
expect_bounds_change: Option<Bounds<Pixels>>,
}
#[derive(Clone)]
@@ -1485,6 +1488,8 @@ impl Editor {
inlay_hint_cache: InlayHintCache::new(inlay_hint_settings),
gutter_hovered: false,
pixel_position_of_newest_cursor: None,
last_bounds: None,
expect_bounds_change: None,
gutter_width: Default::default(),
style: None,
show_cursor_names: false,
@@ -7998,11 +8003,16 @@ impl Editor {
cx,
);
});
let item = Box::new(editor);
if split {
workspace.split_item(SplitDirection::Right, Box::new(editor), cx);
workspace.split_item(SplitDirection::Right, item.clone(), cx);
} else {
workspace.add_item_to_active_pane(Box::new(editor), cx);
workspace.add_item_to_active_pane(item.clone(), cx);
}
workspace.active_pane().clone().update(cx, |pane, cx| {
let item_id = item.item_id();
pane.set_preview_item_id(Some(item_id), cx);
});
}
pub fn rename(&mut self, _: &Rename, cx: &mut ViewContext<Self>) -> Option<Task<Result<()>>> {
@@ -9517,7 +9527,11 @@ impl Editor {
cx.spawn(|_, mut cx| async move {
let workspace = workspace.ok_or_else(|| anyhow!("cannot jump without workspace"))?;
let editor = workspace.update(&mut cx, |workspace, cx| {
workspace.open_path(path, None, true, cx)
// Reset the preview item id before opening the new item
workspace.active_pane().update(cx, |pane, cx| {
pane.set_preview_item_id(None, cx);
});
workspace.open_path_preview(path, None, true, true, cx)
})?;
let editor = editor
.await?

@@ -2969,7 +2969,7 @@ fn render_blame_entry(
cx.open_url(url.as_str())
})
})
.tooltip(move |cx| {
.hoverable_tooltip(move |cx| {
BlameEntryTooltip::new(
sha_color.cursor,
commit_message.clone(),
@@ -3371,6 +3371,7 @@ impl Element for EditorElement {
let overscroll = size(em_width, px(0.));
snapshot = self.editor.update(cx, |editor, cx| {
editor.last_bounds = Some(bounds);
editor.gutter_width = gutter_dimensions.width;
editor.set_visible_line_count(bounds.size.height / line_height, cx);
@@ -3419,7 +3420,7 @@ impl Element for EditorElement {
let autoscroll_horizontally = self.editor.update(cx, |editor, cx| {
let autoscroll_horizontally =
editor.autoscroll_vertically(bounds.size.height, line_height, cx);
editor.autoscroll_vertically(bounds, line_height, cx);
snapshot = editor.snapshot(cx);
autoscroll_horizontally
});

@@ -256,7 +256,7 @@ impl GitBlame {
let blame = self.project.read(cx).blame_buffer(&self.buffer, None, cx);
self.task = cx.spawn(|this, mut cx| async move {
let (entries, permalinks, messages) = cx
let result = cx
.background_executor()
.spawn({
let snapshot = snapshot.clone();
@@ -304,16 +304,23 @@ impl GitBlame {
anyhow::Ok((entries, permalinks, messages))
}
})
.await?;
.await;
this.update(&mut cx, |this, cx| {
this.buffer_edits = buffer_edits;
this.buffer_snapshot = snapshot;
this.entries = entries;
this.permalinks = permalinks;
this.messages = messages;
this.generated = true;
cx.notify();
this.update(&mut cx, |this, cx| match result {
Ok((entries, permalinks, messages)) => {
this.buffer_edits = buffer_edits;
this.buffer_snapshot = snapshot;
this.entries = entries;
this.permalinks = permalinks;
this.messages = messages;
this.generated = true;
cx.notify();
}
Err(error) => this.project.update(cx, |_, cx| {
log::error!("failed to get git blame data: {error:?}");
let notification = format!("{:#}", error).trim().to_string();
cx.emit(project::Event::Notification(notification));
}),
})
});
}
@@ -359,6 +366,54 @@ mod tests {
});
}
#[gpui::test]
async fn test_blame_error_notifications(cx: &mut gpui::TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor());
fs.insert_tree(
"/my-repo",
json!({
".git": {},
"file.txt": r#"
irrelevant contents
"#
.unindent()
}),
)
.await;
// Creating a GitBlame without a corresponding blame state
// will result in an error.
let project = Project::test(fs, ["/my-repo".as_ref()], cx).await;
let buffer = project
.update(cx, |project, cx| {
project.open_local_buffer("/my-repo/file.txt", cx)
})
.await
.unwrap();
let blame = cx.new_model(|cx| GitBlame::new(buffer.clone(), project.clone(), cx));
let event = project.next_event(cx).await;
assert_eq!(
event,
project::Event::Notification(
"Failed to blame \"file.txt\": failed to get blame for \"file.txt\"".to_string()
)
);
blame.update(cx, |blame, cx| {
assert_eq!(
blame
.blame_for_rows((0..1).map(Some), cx)
.collect::<Vec<_>>(),
vec![None]
);
});
}
#[gpui::test]
async fn test_blame_for_rows(cx: &mut gpui::TestAppContext) {
init_test(cx);

@@ -735,9 +735,8 @@ impl Item for Editor {
buffer
.update(&mut cx, |buffer, cx| {
let version = buffer.saved_version().clone();
let fingerprint = buffer.saved_version_fingerprint();
let mtime = buffer.saved_mtime();
buffer.did_save(version, fingerprint, mtime, cx);
buffer.did_save(version, mtime, cx);
})
.ok();
}
@@ -1167,6 +1166,10 @@ impl SearchableItem for Editor {
&self.buffer().read(cx).snapshot(cx),
)
}
fn search_bar_visibility_changed(&mut self, _visible: bool, _cx: &mut ViewContext<Self>) {
self.expect_bounds_change = self.last_bounds;
}
}
pub fn active_match_index(

@@ -47,6 +47,12 @@ pub fn left(map: &DisplaySnapshot, mut point: DisplayPoint) -> DisplayPoint {
pub fn saturating_left(map: &DisplaySnapshot, mut point: DisplayPoint) -> DisplayPoint {
if point.column() > 0 {
*point.column_mut() -= 1;
} else if point.column() == 0 {
// When soft wrap is in effect, a display column of 0 does not
// necessarily correspond to the actual beginning of the line
if map.display_point_to_fold_point(point, Bias::Left).column() > 0 {
return left(map, point);
}
}
map.clip_point(point, Bias::Left)
}

@@ -1,6 +1,6 @@
use std::{cmp, f32};
use gpui::{px, Pixels, ViewContext};
use gpui::{px, Bounds, Pixels, ViewContext};
use language::Point;
use crate::{display_map::ToDisplayPoint, Editor, EditorMode, LineWithInvisibles};
@@ -63,13 +63,23 @@ impl AutoscrollStrategy {
impl Editor {
pub fn autoscroll_vertically(
&mut self,
viewport_height: Pixels,
bounds: Bounds<Pixels>,
line_height: Pixels,
cx: &mut ViewContext<Editor>,
) -> bool {
let viewport_height = bounds.size.height;
let visible_lines = viewport_height / line_height;
let display_map = self.display_map.update(cx, |map, cx| map.snapshot(cx));
let mut scroll_position = self.scroll_manager.scroll_position(&display_map);
let original_y = scroll_position.y;
if let Some(last_bounds) = self.expect_bounds_change.take() {
if scroll_position.y != 0. {
scroll_position.y += (bounds.top() - last_bounds.top()) / line_height;
if scroll_position.y < 0. {
scroll_position.y = 0.;
}
}
}
let max_scroll_top = if matches!(self.mode, EditorMode::AutoHeight { .. }) {
(display_map.max_point().row() as f32 - visible_lines + 1.).max(0.)
} else {
@@ -77,6 +87,9 @@ impl Editor {
};
if scroll_position.y > max_scroll_top {
scroll_position.y = max_scroll_top;
}
if original_y != scroll_position.y {
self.set_scroll_position(scroll_position, cx);
}
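The adjustment above can be sketched as a pure 1-D function: when the editor's bounds shift (e.g. a search bar appears or disappears), nudge the scroll position by the bounds delta expressed in line heights, then clamp it into the valid range. The zero check mirrors the code: a buffer scrolled to the top stays at the top.

```rust
// 1-D sketch of the autoscroll adjustment above; pixel units are f32
// here instead of gpui's `Pixels` newtype.
pub fn adjust_scroll(scroll_y: f32, bounds_delta_px: f32, line_height_px: f32, max_scroll_top: f32) -> f32 {
    let mut y = scroll_y;
    if y != 0.0 {
        // Shift by the bounds delta, converted to line units.
        y += bounds_delta_px / line_height_px;
    }
    y.clamp(0.0, max_scroll_top)
}
```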

@@ -10,7 +10,7 @@ use gpui::AsyncAppContext;
use language::{
CodeLabel, HighlightId, Language, LanguageServerName, LspAdapter, LspAdapterDelegate,
};
use lsp::LanguageServerBinary;
use lsp::{CodeActionKind, LanguageServerBinary};
use serde::Serialize;
use serde_json::Value;
use std::ops::Range;
@@ -129,6 +129,24 @@ impl LspAdapter for ExtensionLspAdapter {
None
}
fn code_action_kinds(&self) -> Option<Vec<CodeActionKind>> {
if self.extension.manifest.id.as_ref() == "terraform" {
// This is taken from the original Terraform implementation, including
// the TODOs:
// TODO: file issue for server supported code actions
// TODO: reenable default actions / delete override
return Some(vec![]);
}
Some(vec![
CodeActionKind::EMPTY,
CodeActionKind::QUICKFIX,
CodeActionKind::REFACTOR,
CodeActionKind::REFACTOR_EXTRACT,
CodeActionKind::SOURCE,
])
}
fn language_ids(&self) -> HashMap<String, String> {
// TODO: The language IDs can be provided via the language server options
// in `extension.toml` now, but we're leaving these existing usages in place temporarily

@@ -59,6 +59,7 @@ const SUGGESTIONS_BY_EXTENSION_ID: &[(&str, &[&str])] = &[
("svelte", &["svelte"]),
("swift", &["swift"]),
("templ", &["templ"]),
("terraform", &["tf", "tfvars", "hcl"]),
("toml", &["Cargo.lock", "toml"]),
("wgsl", &["wgsl"]),
("zig", &["zig"]),

@@ -78,6 +78,7 @@ fn run_git_blame(
.arg(path.as_os_str())
.stdin(Stdio::piped())
.stdout(Stdio::piped())
.stderr(Stdio::piped())
.spawn()
.map_err(|e| anyhow!("Failed to start git blame process: {}", e))?;

@@ -66,7 +66,7 @@ taffy = { git = "https://github.com/DioxusLabs/taffy", rev = "1876f72bee5e376023
thiserror.workspace = true
time.workspace = true
util.workspace = true
uuid = { version = "1.1.2", features = ["v4", "v5"] }
uuid.workspace = true
waker-fn = "1.1.0"
[dev-dependencies]

@@ -1416,8 +1416,8 @@ pub struct AnyTooltip {
/// The view used to display the tooltip
pub view: AnyView,
/// The offset from the cursor to use, relative to the parent view
pub cursor_offset: Point<Pixels>,
/// The absolute position of the mouse when the tooltip was deployed.
pub mouse_position: Point<Pixels>,
}
/// A keystroke event, and potentially the associated action

@@ -7,7 +7,7 @@ use crate::{
TextSystem, View, ViewContext, VisualContext, WindowContext, WindowHandle, WindowOptions,
};
use anyhow::{anyhow, bail};
use futures::{Stream, StreamExt};
use futures::{channel::oneshot, Stream, StreamExt};
use std::{cell::RefCell, future::Future, ops::Deref, rc::Rc, sync::Arc, time::Duration};
/// A TestAppContext is provided to tests created with `#[gpui::test]`, it provides
@@ -479,31 +479,26 @@ impl TestAppContext {
impl<T: 'static> Model<T> {
/// Block until the next event is emitted by the model, then return it.
pub fn next_event<Evt>(&self, cx: &mut TestAppContext) -> Evt
pub fn next_event<Event>(&self, cx: &mut TestAppContext) -> impl Future<Output = Event>
where
Evt: Send + Clone + 'static,
T: EventEmitter<Evt>,
Event: Send + Clone + 'static,
T: EventEmitter<Event>,
{
let (tx, mut rx) = futures::channel::mpsc::unbounded();
let _subscription = self.update(cx, |_, cx| {
let (tx, mut rx) = oneshot::channel();
let mut tx = Some(tx);
let subscription = self.update(cx, |_, cx| {
cx.subscribe(self, move |_, _, event, _| {
tx.unbounded_send(event.clone()).ok();
if let Some(tx) = tx.take() {
_ = tx.send(event.clone());
}
})
});
// Run other tasks until the event is emitted.
loop {
match rx.try_next() {
Ok(Some(event)) => return event,
Ok(None) => panic!("model was dropped"),
Err(_) => {
if !cx.executor().tick() {
break;
}
}
}
async move {
let event = rx.await.expect("no event emitted");
drop(subscription);
event
}
panic!("no event received")
}
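The core of the rewritten `next_event` is a take-once sender: the subscription callback may fire many times, but only the first event is forwarded, because the sender is moved out of an `Option` on first use. A sketch of that pattern using std's mpsc channel as a stand-in for the futures oneshot channel:

```rust
use std::sync::mpsc;

// Returns a callback that forwards only the first event it receives,
// plus the receiver to collect it from.
pub fn forwarder() -> (Box<dyn FnMut(u32)>, mpsc::Receiver<u32>) {
    let (tx, rx) = mpsc::channel();
    let mut tx = Some(tx);
    let callback = move |event: u32| {
        // The sender is taken (and dropped) on the first call, so later
        // events are silently ignored.
        if let Some(tx) = tx.take() {
            let _ = tx.send(event);
        }
    };
    (Box::new(callback), rx)
}
```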
/// Returns a future that resolves when the model notifies.

@@ -21,7 +21,7 @@ use crate::{
HitboxId, IntoElement, IsZero, KeyContext, KeyDownEvent, KeyUpEvent, LayoutId,
ModifiersChangedEvent, MouseButton, MouseDownEvent, MouseMoveEvent, MouseUpEvent,
ParentElement, Pixels, Point, Render, ScrollWheelEvent, SharedString, Size, Style,
StyleRefinement, Styled, Task, View, Visibility, WindowContext,
StyleRefinement, Styled, Task, TooltipId, View, Visibility, WindowContext,
};
use collections::HashMap;
use refineable::Refineable;
@@ -483,7 +483,29 @@ impl Interactivity {
self.tooltip_builder.is_none(),
"calling tooltip more than once on the same element is not supported"
);
self.tooltip_builder = Some(Rc::new(build_tooltip));
self.tooltip_builder = Some(TooltipBuilder {
build: Rc::new(build_tooltip),
hoverable: false,
});
}
/// Use the given callback to construct a new tooltip view when the mouse hovers over this element.
/// The tooltip itself is also hoverable and won't disappear when the user moves the mouse into
/// the tooltip. The imperative API equivalent to [`InteractiveElement::hoverable_tooltip`]
pub fn hoverable_tooltip(
&mut self,
build_tooltip: impl Fn(&mut WindowContext) -> AnyView + 'static,
) where
Self: Sized,
{
debug_assert!(
self.tooltip_builder.is_none(),
"calling tooltip more than once on the same element is not supported"
);
self.tooltip_builder = Some(TooltipBuilder {
build: Rc::new(build_tooltip),
hoverable: true,
});
}
/// Block the mouse from interacting with this element or any of its children
@@ -973,6 +995,20 @@ pub trait StatefulInteractiveElement: InteractiveElement {
self.interactivity().tooltip(build_tooltip);
self
}
/// Use the given callback to construct a new tooltip view when the mouse hovers over this element.
/// The tooltip itself is also hoverable and won't disappear when the user moves the mouse into
/// the tooltip. The fluent API equivalent to [`Interactivity::hoverable_tooltip`]
fn hoverable_tooltip(
mut self,
build_tooltip: impl Fn(&mut WindowContext) -> AnyView + 'static,
) -> Self
where
Self: Sized,
{
self.interactivity().hoverable_tooltip(build_tooltip);
self
}
}
/// A trait for providing focus related APIs to interactive elements
@@ -1015,7 +1051,10 @@ type DropListener = Box<dyn Fn(&dyn Any, &mut WindowContext) + 'static>;
type CanDropPredicate = Box<dyn Fn(&dyn Any, &mut WindowContext) -> bool + 'static>;
pub(crate) type TooltipBuilder = Rc<dyn Fn(&mut WindowContext) -> AnyView + 'static>;
pub(crate) struct TooltipBuilder {
build: Rc<dyn Fn(&mut WindowContext) -> AnyView + 'static>,
hoverable: bool,
}
pub(crate) type KeyDownListener =
Box<dyn Fn(&KeyDownEvent, DispatchPhase, &mut WindowContext) + 'static>;
@@ -1188,6 +1227,7 @@ pub struct Interactivity {
/// Whether the element was hovered. This will only be present after paint if a hitbox
/// was created for the interactive element.
pub hovered: Option<bool>,
pub(crate) tooltip_id: Option<TooltipId>,
pub(crate) content_size: Size<Pixels>,
pub(crate) key_context: Option<KeyContext>,
pub(crate) focusable: bool,
@@ -1321,7 +1361,7 @@ impl Interactivity {
if let Some(active_tooltip) = element_state.active_tooltip.as_ref() {
if let Some(active_tooltip) = active_tooltip.borrow().as_ref() {
if let Some(tooltip) = active_tooltip.tooltip.clone() {
cx.set_tooltip(tooltip);
self.tooltip_id = Some(cx.set_tooltip(tooltip));
}
}
}
@@ -1806,6 +1846,7 @@ impl Interactivity {
}
if let Some(tooltip_builder) = self.tooltip_builder.take() {
let tooltip_is_hoverable = tooltip_builder.hoverable;
let active_tooltip = element_state
.active_tooltip
.get_or_insert_with(Default::default)
@@ -1818,11 +1859,17 @@ impl Interactivity {
cx.on_mouse_event({
let active_tooltip = active_tooltip.clone();
let hitbox = hitbox.clone();
let tooltip_id = self.tooltip_id;
move |_: &MouseMoveEvent, phase, cx| {
let is_hovered =
pending_mouse_down.borrow().is_none() && hitbox.is_hovered(cx);
if !is_hovered {
active_tooltip.borrow_mut().take();
let tooltip_is_hovered =
tooltip_id.map_or(false, |tooltip_id| tooltip_id.is_hovered(cx));
if !is_hovered && (!tooltip_is_hoverable || !tooltip_is_hovered) {
if active_tooltip.borrow_mut().take().is_some() {
cx.refresh();
}
return;
}
@@ -1833,15 +1880,14 @@ impl Interactivity {
if active_tooltip.borrow().is_none() {
let task = cx.spawn({
let active_tooltip = active_tooltip.clone();
let tooltip_builder = tooltip_builder.clone();
let build_tooltip = tooltip_builder.build.clone();
move |mut cx| async move {
cx.background_executor().timer(TOOLTIP_DELAY).await;
cx.update(|cx| {
active_tooltip.borrow_mut().replace(ActiveTooltip {
tooltip: Some(AnyTooltip {
view: tooltip_builder(cx),
cursor_offset: cx.mouse_position(),
view: build_tooltip(cx),
mouse_position: cx.mouse_position(),
}),
_task: None,
});
@@ -1860,15 +1906,30 @@ impl Interactivity {
cx.on_mouse_event({
let active_tooltip = active_tooltip.clone();
move |_: &MouseDownEvent, _, _| {
active_tooltip.borrow_mut().take();
let tooltip_id = self.tooltip_id;
move |_: &MouseDownEvent, _, cx| {
let tooltip_is_hovered =
tooltip_id.map_or(false, |tooltip_id| tooltip_id.is_hovered(cx));
if !tooltip_is_hoverable || !tooltip_is_hovered {
if active_tooltip.borrow_mut().take().is_some() {
cx.refresh();
}
}
}
});
cx.on_mouse_event({
let active_tooltip = active_tooltip.clone();
move |_: &ScrollWheelEvent, _, _| {
active_tooltip.borrow_mut().take();
let tooltip_id = self.tooltip_id;
move |_: &ScrollWheelEvent, _, cx| {
let tooltip_is_hovered =
tooltip_id.map_or(false, |tooltip_id| tooltip_id.is_hovered(cx));
if !tooltip_is_hoverable || !tooltip_is_hovered {
if active_tooltip.borrow_mut().take().is_some() {
cx.refresh();
}
}
}
})
}

@@ -553,7 +553,7 @@ impl Element for InteractiveText {
ActiveTooltip {
tooltip: Some(AnyTooltip {
view: tooltip,
cursor_offset: cx.mouse_position(),
mouse_position: cx.mouse_position(),
}),
_task: None,
}

@@ -372,7 +372,7 @@ impl BackgroundExecutor {
self.dispatcher.as_test().unwrap().rng()
}
/// How many CPUs are available to the dispatcher
/// How many CPUs are available to the dispatcher.
pub fn num_cpus(&self) -> usize {
num_cpus::get()
}
@@ -440,6 +440,11 @@ impl<'a> Scope<'a> {
}
}
/// How many CPUs are available to the dispatcher.
pub fn num_cpus(&self) -> usize {
self.executor.num_cpus()
}
/// Spawn a future into this scope.
pub fn spawn<F>(&mut self, f: F)
where

@@ -33,8 +33,10 @@ use super::X11Display;
x11rb::atom_manager! {
pub XcbAtoms: AtomsCookie {
UTF8_STRING,
WM_PROTOCOLS,
WM_DELETE_WINDOW,
_NET_WM_NAME,
_NET_WM_STATE,
_NET_WM_STATE_MAXIMIZED_VERT,
_NET_WM_STATE_MAXIMIZED_HORZ,
@@ -76,7 +78,7 @@ pub struct Callbacks {
pub(crate) struct X11WindowState {
raw: RawWindow,
atoms: XcbAtoms,
bounds: Bounds<i32>,
scale_factor: f32,
renderer: BladeRenderer,
@@ -238,6 +240,7 @@ impl X11WindowState {
bounds: params.bounds.map(|v| v.0),
scale_factor: 1.0,
renderer: BladeRenderer::new(gpu, gpu_extent),
atoms: *atoms,
input_handler: None,
}
@@ -442,6 +445,16 @@ impl PlatformWindow for X11Window {
title.as_bytes(),
)
.unwrap();
self.xcb_connection
.change_property8(
xproto::PropMode::REPLACE,
self.x_window,
self.state.borrow().atoms._NET_WM_NAME,
self.state.borrow().atoms.UTF8_STRING,
title.as_bytes(),
)
.unwrap();
}
// todo(linux)

@@ -287,6 +287,8 @@ pub struct Window {
pub(crate) rendered_frame: Frame,
pub(crate) next_frame: Frame,
pub(crate) next_hitbox_id: HitboxId,
pub(crate) next_tooltip_id: TooltipId,
pub(crate) tooltip_bounds: Option<TooltipBounds>,
next_frame_callbacks: Rc<RefCell<Vec<FrameCallback>>>,
pub(crate) dirty_views: FxHashSet<EntityId>,
pub(crate) focus_handles: Arc<RwLock<SlotMap<FocusId, AtomicUsize>>>,
@@ -551,6 +553,8 @@ impl Window {
next_frame: Frame::new(DispatchTree::new(cx.keymap.clone(), cx.actions.clone())),
next_frame_callbacks,
next_hitbox_id: HitboxId::default(),
next_tooltip_id: TooltipId::default(),
tooltip_bounds: None,
dirty_views: FxHashSet::default(),
focus_handles: Arc::new(RwLock::new(SlotMap::with_key())),
focus_listeners: SubscriberSet::new(),

@@ -15,7 +15,7 @@
use std::{
any::{Any, TypeId},
borrow::{Borrow, BorrowMut, Cow},
mem,
cmp, mem,
ops::Range,
rc::Rc,
sync::Arc,
@@ -28,17 +28,18 @@ use futures::{future::Shared, FutureExt};
#[cfg(target_os = "macos")]
use media::core_video::CVImageBuffer;
use smallvec::SmallVec;
use util::post_inc;
use crate::{
hash, prelude::*, size, AnyElement, AnyTooltip, AppContext, Asset, AvailableSpace, Bounds,
BoxShadow, ContentMask, Corners, CursorStyle, DevicePixels, DispatchNodeId, DispatchPhase,
DispatchTree, DrawPhase, ElementId, ElementStateBox, EntityId, FocusHandle, FocusId, FontId,
GlobalElementId, GlyphId, Hsla, ImageData, InputHandler, IsZero, KeyContext, KeyEvent,
LayoutId, LineLayoutIndex, ModifiersChangedEvent, MonochromeSprite, MouseEvent, PaintQuad,
Path, Pixels, PlatformInputHandler, Point, PolychromeSprite, Quad, RenderGlyphParams,
RenderImageParams, RenderSvgParams, Scene, Shadow, SharedString, Size, StrikethroughStyle,
Style, Task, TextStyleRefinement, TransformationMatrix, Underline, UnderlineStyle, Window,
WindowContext, SUBPIXEL_VARIANTS,
hash, point, prelude::*, px, size, AnyElement, AnyTooltip, AppContext, Asset, AvailableSpace,
Bounds, BoxShadow, ContentMask, Corners, CursorStyle, DevicePixels, DispatchNodeId,
DispatchPhase, DispatchTree, DrawPhase, ElementId, ElementStateBox, EntityId, FocusHandle,
FocusId, FontId, GlobalElementId, GlyphId, Hsla, ImageData, InputHandler, IsZero, KeyContext,
KeyEvent, LayoutId, LineLayoutIndex, ModifiersChangedEvent, MonochromeSprite, MouseEvent,
PaintQuad, Path, Pixels, PlatformInputHandler, Point, PolychromeSprite, Quad,
RenderGlyphParams, RenderImageParams, RenderSvgParams, Scene, Shadow, SharedString, Size,
StrikethroughStyle, Style, Task, TextStyleRefinement, TransformationMatrix, Underline,
UnderlineStyle, Window, WindowContext, SUBPIXEL_VARIANTS,
};
pub(crate) type AnyMouseListener =
@@ -84,6 +85,33 @@ impl Hitbox {
#[derive(Default, Eq, PartialEq)]
pub(crate) struct HitTest(SmallVec<[HitboxId; 8]>);
/// An identifier for a tooltip.
#[derive(Copy, Clone, Debug, Default, Eq, PartialEq)]
pub struct TooltipId(usize);
impl TooltipId {
/// Checks if the tooltip is currently hovered.
pub fn is_hovered(&self, cx: &WindowContext) -> bool {
cx.window
.tooltip_bounds
.as_ref()
.map_or(false, |tooltip_bounds| {
tooltip_bounds.id == *self && tooltip_bounds.bounds.contains(&cx.mouse_position())
})
}
}
pub(crate) struct TooltipBounds {
id: TooltipId,
bounds: Bounds<Pixels>,
}
#[derive(Clone)]
pub(crate) struct TooltipRequest {
id: TooltipId,
tooltip: AnyTooltip,
}
pub(crate) struct DeferredDraw {
priority: usize,
parent_node: DispatchNodeId,
@@ -108,7 +136,7 @@ pub(crate) struct Frame {
pub(crate) content_mask_stack: Vec<ContentMask<Pixels>>,
pub(crate) element_offset_stack: Vec<Point<Pixels>>,
pub(crate) input_handlers: Vec<Option<PlatformInputHandler>>,
pub(crate) tooltip_requests: Vec<Option<AnyTooltip>>,
pub(crate) tooltip_requests: Vec<Option<TooltipRequest>>,
pub(crate) cursor_styles: Vec<CursorStyleRequest>,
#[cfg(any(test, feature = "test-support"))]
pub(crate) debug_bounds: FxHashMap<String, Bounds<Pixels>>,
@@ -364,6 +392,7 @@ impl<'a> VisualContext for ElementContext<'a> {
impl<'a> ElementContext<'a> {
pub(crate) fn draw_roots(&mut self) {
self.window.draw_phase = DrawPhase::Layout;
self.window.tooltip_bounds.take();
// Layout all root elements.
let mut root_element = self.window.root_view.as_ref().unwrap().clone().into_any();
@@ -388,14 +417,8 @@ impl<'a> ElementContext<'a> {
element.layout(offset, AvailableSpace::min_size(), self);
active_drag_element = Some(element);
self.app.active_drag = Some(active_drag);
} else if let Some(tooltip_request) =
self.window.next_frame.tooltip_requests.last().cloned()
{
let tooltip_request = tooltip_request.unwrap();
let mut element = tooltip_request.view.clone().into_any();
let offset = tooltip_request.cursor_offset;
element.layout(offset, AvailableSpace::min_size(), self);
tooltip_element = Some(element);
} else {
tooltip_element = self.layout_tooltip();
}
self.window.mouse_hit_test = self.window.next_frame.hit_test(self.window.mouse_position);
@@ -415,6 +438,52 @@ impl<'a> ElementContext<'a> {
}
}
fn layout_tooltip(&mut self) -> Option<AnyElement> {
let tooltip_request = self.window.next_frame.tooltip_requests.last().cloned()?;
let tooltip_request = tooltip_request.unwrap();
let mut element = tooltip_request.tooltip.view.clone().into_any();
let mouse_position = tooltip_request.tooltip.mouse_position;
let tooltip_size = element.measure(AvailableSpace::min_size(), self);
let mut tooltip_bounds = Bounds::new(mouse_position + point(px(1.), px(1.)), tooltip_size);
let window_bounds = Bounds {
origin: Point::default(),
size: self.viewport_size(),
};
if tooltip_bounds.right() > window_bounds.right() {
let new_x = mouse_position.x - tooltip_bounds.size.width - px(1.);
if new_x >= Pixels::ZERO {
tooltip_bounds.origin.x = new_x;
} else {
tooltip_bounds.origin.x = cmp::max(
Pixels::ZERO,
tooltip_bounds.origin.x - tooltip_bounds.right() - window_bounds.right(),
);
}
}
if tooltip_bounds.bottom() > window_bounds.bottom() {
let new_y = mouse_position.y - tooltip_bounds.size.height - px(1.);
if new_y >= Pixels::ZERO {
tooltip_bounds.origin.y = new_y;
} else {
tooltip_bounds.origin.y = cmp::max(
Pixels::ZERO,
tooltip_bounds.origin.y - tooltip_bounds.bottom() - window_bounds.bottom(),
);
}
}
self.with_absolute_element_offset(tooltip_bounds.origin, |cx| element.after_layout(cx));
self.window.tooltip_bounds = Some(TooltipBounds {
id: tooltip_request.id,
bounds: tooltip_bounds,
});
Some(element)
}
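The placement logic in `layout_tooltip` above can be sketched per axis as a pure function (the real code does this for both x and y with `Pixels`, and its edge clamping is slightly more involved than shown here): put the tooltip just past the mouse; if it would overflow the window, flip it to the other side of the cursor when that side fits, otherwise clamp to the window edge.

```rust
// 1-D sketch of the tooltip placement above (simplified clamping).
pub fn place(mouse: f32, size: f32, window: f32) -> f32 {
    // Offset by 1px so the tooltip doesn't sit directly under the cursor.
    let mut origin = mouse + 1.0;
    if origin + size > window {
        // Flip to the other side of the cursor if it fits there.
        let flipped = mouse - size - 1.0;
        origin = if flipped >= 0.0 { flipped } else { 0.0 };
    }
    origin
}
```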
fn layout_deferred_draws(&mut self, deferred_draw_indices: &[usize]) {
assert_eq!(self.window.element_id_stack.len(), 0);
@@ -604,8 +673,13 @@ impl<'a> ElementContext<'a> {
}
/// Sets a tooltip to be rendered for the upcoming frame
pub fn set_tooltip(&mut self, tooltip: AnyTooltip) {
self.window.next_frame.tooltip_requests.push(Some(tooltip));
pub fn set_tooltip(&mut self, tooltip: AnyTooltip) -> TooltipId {
let id = TooltipId(post_inc(&mut self.window.next_tooltip_id.0));
self.window
.next_frame
.tooltip_requests
.push(Some(TooltipRequest { id, tooltip }));
id
}
/// Pushes the given element id onto the global stack and invokes the given closure

@@ -45,9 +45,9 @@ use text::operation_queue::OperationQueue;
use text::*;
pub use text::{
Anchor, Bias, Buffer as TextBuffer, BufferId, BufferSnapshot as TextBufferSnapshot, Edit,
OffsetRangeExt, OffsetUtf16, Patch, Point, PointUtf16, Rope, RopeFingerprint, Selection,
SelectionGoal, Subscription, TextDimension, TextSummary, ToOffset, ToOffsetUtf16, ToPoint,
ToPointUtf16, Transaction, TransactionId, Unclipped,
OffsetRangeExt, OffsetUtf16, Patch, Point, PointUtf16, Rope, Selection, SelectionGoal,
Subscription, TextDimension, TextSummary, ToOffset, ToOffsetUtf16, ToPoint, ToPointUtf16,
Transaction, TransactionId, Unclipped,
};
use theme::SyntaxTheme;
#[cfg(any(test, feature = "test-support"))]
@@ -87,8 +87,6 @@ pub struct Buffer {
/// The version vector when this buffer was last loaded from
/// or saved to disk.
saved_version: clock::Global,
/// A hash of the current contents of the buffer's file.
file_fingerprint: RopeFingerprint,
transaction_depth: usize,
was_dirty_before_starting_transaction: Option<bool>,
reload_task: Option<Task<Result<()>>>,
@@ -379,7 +377,6 @@ pub trait LocalFile: File {
&self,
buffer_id: BufferId,
version: &clock::Global,
fingerprint: RopeFingerprint,
line_ending: LineEnding,
mtime: Option<SystemTime>,
cx: &mut AppContext,
@@ -562,7 +559,6 @@ impl Buffer {
diff_base: self.diff_base.as_ref().map(|h| h.to_string()),
line_ending: proto::serialize_line_ending(self.line_ending()) as i32,
saved_version: proto::serialize_version(&self.saved_version),
saved_version_fingerprint: proto::serialize_fingerprint(self.file_fingerprint),
saved_mtime: self.saved_mtime.map(|time| time.into()),
}
}
@@ -642,7 +638,6 @@ impl Buffer {
Self {
saved_mtime,
saved_version: buffer.version(),
file_fingerprint: buffer.as_rope().fingerprint(),
reload_task: None,
transaction_depth: 0,
was_dirty_before_starting_transaction: None,
@@ -717,11 +712,6 @@ impl Buffer {
&self.saved_version
}
/// The fingerprint of the buffer's text when the buffer was last saved or reloaded from disk.
pub fn saved_version_fingerprint(&self) -> RopeFingerprint {
self.file_fingerprint
}
/// The mtime of the buffer's file when the buffer was last saved or reloaded from disk.
pub fn saved_mtime(&self) -> Option<SystemTime> {
self.saved_mtime
@@ -754,13 +744,11 @@ impl Buffer {
pub fn did_save(
&mut self,
version: clock::Global,
fingerprint: RopeFingerprint,
mtime: Option<SystemTime>,
cx: &mut ModelContext<Self>,
) {
self.saved_version = version;
self.has_conflict = false;
self.file_fingerprint = fingerprint;
self.saved_mtime = mtime;
cx.emit(Event::Saved);
cx.notify();
@@ -792,13 +780,7 @@ impl Buffer {
this.apply_diff(diff, cx);
tx.send(this.finalize_last_transaction().cloned()).ok();
this.has_conflict = false;
this.did_reload(
this.version(),
this.as_rope().fingerprint(),
this.line_ending(),
new_mtime,
cx,
);
this.did_reload(this.version(), this.line_ending(), new_mtime, cx);
} else {
if !diff.edits.is_empty()
|| this
@@ -809,13 +791,7 @@ impl Buffer {
this.has_conflict = true;
}
this.did_reload(
prev_version,
Rope::text_fingerprint(&new_text),
this.line_ending(),
this.saved_mtime,
cx,
);
this.did_reload(prev_version, this.line_ending(), this.saved_mtime, cx);
}
this.reload_task.take();
@@ -828,20 +804,17 @@ impl Buffer {
pub fn did_reload(
&mut self,
version: clock::Global,
fingerprint: RopeFingerprint,
line_ending: LineEnding,
mtime: Option<SystemTime>,
cx: &mut ModelContext<Self>,
) {
self.saved_version = version;
self.file_fingerprint = fingerprint;
self.text.set_line_ending(line_ending);
self.saved_mtime = mtime;
if let Some(file) = self.file.as_ref().and_then(|f| f.as_local()) {
file.buffer_reloaded(
self.remote_id(),
&self.saved_version,
self.file_fingerprint,
self.line_ending(),
self.saved_mtime,
cx,

@@ -72,7 +72,7 @@ pub use lsp::LanguageServerId;
pub use outline::{Outline, OutlineItem};
pub use syntax_map::{OwnedSyntaxLayer, SyntaxLayer};
pub use text::LineEnding;
pub use tree_sitter::{Parser, Tree};
pub use tree_sitter::{Node, Parser, Tree, TreeCursor};
use crate::language_settings::SoftWrap;
@@ -91,6 +91,16 @@ thread_local! {
};
}
pub fn with_parser<F, R>(func: F) -> R
where
F: FnOnce(&mut Parser) -> R,
{
PARSER.with(|parser| {
let mut parser = parser.borrow_mut();
func(&mut parser)
})
}
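The `with_parser` helper above borrows a thread-local tree-sitter `Parser` behind a `RefCell` for the duration of a closure. The same borrow pattern with a plain counter, as a standalone illustration:

```rust
use std::cell::RefCell;

thread_local! {
    // Stand-in for the thread-local PARSER above.
    static COUNTER: RefCell<u32> = RefCell::new(0);
}

// Mirrors with_parser: mutably borrow the thread-local value only for the
// duration of the closure, so nothing is ever shared across threads.
fn with_counter<F, R>(func: F) -> R
where
    F: FnOnce(&mut u32) -> R,
{
    COUNTER.with(|counter| {
        let mut counter = counter.borrow_mut();
        func(&mut counter)
    })
}

fn main() {
    with_counter(|c| *c += 1);
    with_counter(|c| *c += 1);
    assert_eq!(with_counter(|c| *c), 2);
}
```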
lazy_static! {
static ref NEXT_LANGUAGE_ID: AtomicUsize = Default::default();
static ref NEXT_GRAMMAR_ID: AtomicUsize = Default::default();

@@ -10,11 +10,6 @@ use text::*;
pub use proto::{BufferState, Operation};
/// Serializes a [`RopeFingerprint`] to be sent over RPC.
pub fn serialize_fingerprint(fingerprint: RopeFingerprint) -> String {
fingerprint.to_hex()
}
/// Deserializes a [`text::LineEnding`] from the RPC representation.
pub fn deserialize_line_ending(message: proto::LineEnding) -> text::LineEnding {
match message {

@@ -45,7 +45,6 @@ tree-sitter-embedded-template.workspace = true
tree-sitter-go.workspace = true
tree-sitter-gomod.workspace = true
tree-sitter-gowork.workspace = true
tree-sitter-hcl.workspace = true
tree-sitter-heex.workspace = true
tree-sitter-jsdoc.workspace = true
tree-sitter-json.workspace = true

@@ -23,7 +23,6 @@ mod python;
mod ruby;
mod rust;
mod tailwind;
mod terraform;
mod typescript;
mod vue;
mod yaml;
@@ -63,7 +62,6 @@ pub fn init(
("go", tree_sitter_go::language()),
("gomod", tree_sitter_gomod::language()),
("gowork", tree_sitter_gowork::language()),
("hcl", tree_sitter_hcl::language()),
("heex", tree_sitter_heex::language()),
("jsdoc", tree_sitter_jsdoc::language()),
("json", tree_sitter_json::language()),
@@ -280,12 +278,6 @@ pub fn init(
]
);
language!("proto");
language!("terraform", vec![Arc::new(terraform::TerraformLspAdapter)]);
language!(
"terraform-vars",
vec![Arc::new(terraform::TerraformLspAdapter)]
);
language!("hcl", vec![]);
languages.register_secondary_lsp_adapter(
"Astro".into(),

@@ -1,181 +0,0 @@
use anyhow::{anyhow, Context, Result};
use async_trait::async_trait;
use collections::HashMap;
use futures::StreamExt;
pub use language::*;
use lsp::{CodeActionKind, LanguageServerBinary};
use smol::fs::{self, File};
use std::{any::Any, ffi::OsString, path::PathBuf};
use util::{
fs::remove_matching,
github::{latest_github_release, GitHubLspBinaryVersion},
maybe, ResultExt,
};
fn terraform_ls_binary_arguments() -> Vec<OsString> {
vec!["serve".into()]
}
pub struct TerraformLspAdapter;
#[async_trait(?Send)]
impl LspAdapter for TerraformLspAdapter {
fn name(&self) -> LanguageServerName {
LanguageServerName("terraform-ls".into())
}
async fn fetch_latest_server_version(
&self,
delegate: &dyn LspAdapterDelegate,
) -> Result<Box<dyn 'static + Send + Any>> {
// TODO: maybe use release API instead
// https://api.releases.hashicorp.com/v1/releases/terraform-ls?limit=1
let release = latest_github_release(
"hashicorp/terraform-ls",
false,
false,
delegate.http_client(),
)
.await?;
Ok(Box::new(GitHubLspBinaryVersion {
name: release.tag_name,
url: Default::default(),
}))
}
async fn fetch_server_binary(
&self,
version: Box<dyn 'static + Send + Any>,
container_dir: PathBuf,
delegate: &dyn LspAdapterDelegate,
) -> Result<LanguageServerBinary> {
let version = version.downcast::<GitHubLspBinaryVersion>().unwrap();
let zip_path = container_dir.join(format!("terraform-ls_{}.zip", version.name));
let version_dir = container_dir.join(format!("terraform-ls_{}", version.name));
let binary_path = version_dir.join("terraform-ls");
let url = build_download_url(version.name)?;
if fs::metadata(&binary_path).await.is_err() {
let mut response = delegate
.http_client()
.get(&url, Default::default(), true)
.await
.context("error downloading release")?;
let mut file = File::create(&zip_path).await?;
if !response.status().is_success() {
Err(anyhow!(
"download failed with status {}",
response.status().to_string()
))?;
}
futures::io::copy(response.body_mut(), &mut file).await?;
let unzip_status = smol::process::Command::new("unzip")
.current_dir(&container_dir)
.arg(&zip_path)
.arg("-d")
.arg(&version_dir)
.output()
.await?
.status;
if !unzip_status.success() {
Err(anyhow!("failed to unzip Terraform LS archive"))?;
}
remove_matching(&container_dir, |entry| entry != version_dir).await;
}
Ok(LanguageServerBinary {
path: binary_path,
env: None,
arguments: terraform_ls_binary_arguments(),
})
}
async fn cached_server_binary(
&self,
container_dir: PathBuf,
_: &dyn LspAdapterDelegate,
) -> Option<LanguageServerBinary> {
get_cached_server_binary(container_dir).await
}
async fn installation_test_binary(
&self,
container_dir: PathBuf,
) -> Option<LanguageServerBinary> {
get_cached_server_binary(container_dir)
.await
.map(|mut binary| {
binary.arguments = vec!["version".into()];
binary
})
}
fn code_action_kinds(&self) -> Option<Vec<CodeActionKind>> {
// TODO: file issue for server supported code actions
// TODO: reenable default actions / delete override
Some(vec![])
}
fn language_ids(&self) -> HashMap<String, String> {
HashMap::from_iter([
("Terraform".into(), "terraform".into()),
("Terraform Vars".into(), "terraform-vars".into()),
])
}
}
fn build_download_url(version: String) -> Result<String> {
let v = version.strip_prefix('v').unwrap_or(&version);
let os = match std::env::consts::OS {
"linux" => "linux",
"macos" => "darwin",
"win" => "windows",
_ => Err(anyhow!("unsupported OS {}", std::env::consts::OS))?,
}
.to_string();
let arch = match std::env::consts::ARCH {
"x86" => "386",
"x86_64" => "amd64",
"arm" => "arm",
"aarch64" => "arm64",
_ => Err(anyhow!("unsupported ARCH {}", std::env::consts::ARCH))?,
}
.to_string();
let url = format!(
"https://releases.hashicorp.com/terraform-ls/{v}/terraform-ls_{v}_{os}_{arch}.zip",
);
Ok(url)
}
async fn get_cached_server_binary(container_dir: PathBuf) -> Option<LanguageServerBinary> {
maybe!(async {
let mut last = None;
let mut entries = fs::read_dir(&container_dir).await?;
while let Some(entry) = entries.next().await {
last = Some(entry?.path());
}
match last {
Some(path) if path.is_dir() => {
let binary = path.join("terraform-ls");
if fs::metadata(&binary).await.is_ok() {
return Ok(LanguageServerBinary {
path: binary,
env: None,
arguments: terraform_ls_binary_arguments(),
});
}
}
_ => {}
}
Err(anyhow!("no cached binary"))
})
.await
.log_err()
}
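The deleted adapter's `build_download_url` composes a HashiCorp releases URL from version, OS, and arch. A standalone sketch of that URL scheme with the OS and arch passed in as parameters (the real function reads `std::env::consts`, so its output varies by host):

```rust
// Standalone sketch of the URL scheme from build_download_url above; os and
// arch are explicit parameters here so the output is deterministic.
fn download_url(version: &str, os: &str, arch: &str) -> String {
    // Tags like "v0.32.0" are normalized to "0.32.0", as in the original.
    let v = version.strip_prefix('v').unwrap_or(version);
    format!("https://releases.hashicorp.com/terraform-ls/{v}/terraform-ls_{v}_{os}_{arch}.zip")
}

fn main() {
    let url = download_url("v0.32.0", "darwin", "arm64");
    assert_eq!(
        url,
        "https://releases.hashicorp.com/terraform-ls/0.32.0/terraform-ls_0.32.0_darwin_arm64.zip"
    );
    println!("{url}");
}
```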

@@ -1,9 +1,11 @@
use anyhow::{anyhow, Result};
use anyhow::{anyhow, Context, Result};
use futures::{io::BufReader, stream::BoxStream, AsyncBufReadExt, AsyncReadExt, StreamExt};
use serde::{Deserialize, Serialize};
use std::convert::TryFrom;
use std::{convert::TryFrom, future::Future};
use util::http::{AsyncBody, HttpClient, Method, Request as HttpRequest};
pub const OPEN_AI_API_URL: &str = "https://api.openai.com/v1";
#[derive(Clone, Copy, Serialize, Deserialize, Debug, Eq, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum Role {
@@ -188,3 +190,68 @@ pub async fn stream_completion(
}
}
}
#[derive(Copy, Clone, Serialize, Deserialize)]
pub enum OpenAiEmbeddingModel {
#[serde(rename = "text-embedding-3-small")]
TextEmbedding3Small,
#[serde(rename = "text-embedding-3-large")]
TextEmbedding3Large,
}
#[derive(Serialize)]
struct OpenAiEmbeddingRequest<'a> {
model: OpenAiEmbeddingModel,
input: Vec<&'a str>,
}
#[derive(Deserialize)]
pub struct OpenAiEmbeddingResponse {
pub data: Vec<OpenAiEmbedding>,
}
#[derive(Deserialize)]
pub struct OpenAiEmbedding {
pub embedding: Vec<f32>,
}
pub fn embed<'a>(
client: &dyn HttpClient,
api_url: &str,
api_key: &str,
model: OpenAiEmbeddingModel,
texts: impl IntoIterator<Item = &'a str>,
) -> impl 'static + Future<Output = Result<OpenAiEmbeddingResponse>> {
let uri = format!("{api_url}/embeddings");
let request = OpenAiEmbeddingRequest {
model,
input: texts.into_iter().collect(),
};
let body = AsyncBody::from(serde_json::to_string(&request).unwrap());
let request = HttpRequest::builder()
.method(Method::POST)
.uri(uri)
.header("Content-Type", "application/json")
.header("Authorization", format!("Bearer {}", api_key))
.body(body)
.map(|request| client.send(request));
async move {
let mut response = request?.await?;
let mut body = String::new();
response.body_mut().read_to_string(&mut body).await?;
if response.status().is_success() {
let response: OpenAiEmbeddingResponse =
serde_json::from_str(&body).context("failed to parse OpenAI embedding response")?;
Ok(response)
} else {
Err(anyhow!(
"error during embedding, status: {:?}, body: {:?}",
response.status(),
body
))
}
}
}
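For reference, the request body that `embed` serializes from `OpenAiEmbeddingRequest` via serde has the shape sketched below. This hand-rolled version skips serde and performs no JSON string escaping, so it is illustrative only:

```rust
// Toy construction of the embeddings request body built above with serde.
// No JSON escaping is performed, so inputs must not contain quotes or
// backslashes.
fn embedding_request_body(model: &str, inputs: &[&str]) -> String {
    let quoted: Vec<String> = inputs.iter().map(|s| format!("\"{s}\"")).collect();
    format!("{{\"model\":\"{model}\",\"input\":[{}]}}", quoted.join(","))
}

fn main() {
    let body = embedding_request_body("text-embedding-3-small", &["hello world"]);
    assert_eq!(
        body,
        "{\"model\":\"text-embedding-3-small\",\"input\":[\"hello world\"]}"
    );
}
```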

@@ -97,7 +97,7 @@ use std::{
};
use task::static_source::{StaticSource, TrackedFile};
use terminals::Terminals;
use text::{Anchor, BufferId, RopeFingerprint};
use text::{Anchor, BufferId};
use util::{
debug_panic, defer,
http::{HttpClient, Url},
@@ -978,6 +978,50 @@ impl Project {
}
}
#[cfg(any(test, feature = "test-support"))]
pub async fn example(
root_paths: impl IntoIterator<Item = &Path>,
cx: &mut AsyncAppContext,
) -> Model<Project> {
use clock::FakeSystemClock;
let fs = Arc::new(RealFs::default());
let languages = LanguageRegistry::test(cx.background_executor().clone());
let clock = Arc::new(FakeSystemClock::default());
let http_client = util::http::FakeHttpClient::with_404_response();
let client = cx
.update(|cx| client::Client::new(clock, http_client.clone(), cx))
.unwrap();
let user_store = cx
.new_model(|cx| UserStore::new(client.clone(), cx))
.unwrap();
let project = cx
.update(|cx| {
Project::local(
client,
node_runtime::FakeNodeRuntime::new(),
user_store,
Arc::new(languages),
fs,
cx,
)
})
.unwrap();
for path in root_paths {
let (tree, _) = project
.update(cx, |project, cx| {
project.find_or_create_local_worktree(path, true, cx)
})
.unwrap()
.await
.unwrap();
tree.update(cx, |tree, _| tree.as_local().unwrap().scan_complete())
.unwrap()
.await;
}
project
}
#[cfg(any(test, feature = "test-support"))]
pub async fn test(
fs: Arc<dyn Fs>,
@@ -1146,6 +1190,10 @@ impl Project {
self.user_store.clone()
}
pub fn node_runtime(&self) -> Option<&Arc<dyn NodeRuntime>> {
self.node.as_ref()
}
pub fn opened_buffers(&self) -> Vec<Model<Buffer>> {
self.opened_buffers
.values()
@@ -7734,6 +7782,7 @@ impl Project {
let (repo, relative_path, content) = blame_params?;
let lock = repo.lock();
lock.blame(&relative_path, content)
.with_context(|| format!("Failed to blame {relative_path:?}"))
})
} else {
let project_id = self.remote_id();
@@ -8525,7 +8574,6 @@ impl Project {
buffer_id: buffer_id.into(),
version: serialize_version(buffer.saved_version()),
mtime: buffer.saved_mtime().map(|time| time.into()),
fingerprint: language::proto::serialize_fingerprint(buffer.saved_version_fingerprint()),
})
}
@@ -8618,9 +8666,6 @@ impl Project {
buffer_id: buffer_id.into(),
version: language::proto::serialize_version(buffer.saved_version()),
mtime: buffer.saved_mtime().map(|time| time.into()),
fingerprint: language::proto::serialize_fingerprint(
buffer.saved_version_fingerprint(),
),
line_ending: language::proto::serialize_line_ending(
buffer.line_ending(),
) as i32,
@@ -9609,7 +9654,6 @@ impl Project {
_: Arc<Client>,
mut cx: AsyncAppContext,
) -> Result<()> {
let fingerprint = Default::default();
let version = deserialize_version(&envelope.payload.version);
let buffer_id = BufferId::new(envelope.payload.buffer_id)?;
let mtime = envelope.payload.mtime.map(|time| time.into());
@@ -9622,7 +9666,7 @@ impl Project {
.or_else(|| this.incomplete_remote_buffers.get(&buffer_id).cloned());
if let Some(buffer) = buffer {
buffer.update(cx, |buffer, cx| {
buffer.did_save(version, fingerprint, mtime, cx);
buffer.did_save(version, mtime, cx);
});
}
Ok(())
@@ -9637,7 +9681,6 @@ impl Project {
) -> Result<()> {
let payload = envelope.payload;
let version = deserialize_version(&payload.version);
let fingerprint = RopeFingerprint::default();
let line_ending = deserialize_line_ending(
proto::LineEnding::from_i32(payload.line_ending)
.ok_or_else(|| anyhow!("missing line ending"))?,
@@ -9652,7 +9695,7 @@ impl Project {
.or_else(|| this.incomplete_remote_buffers.get(&buffer_id).cloned());
if let Some(buffer) = buffer {
buffer.update(cx, |buffer, cx| {
buffer.did_reload(version, fingerprint, line_ending, mtime, cx);
buffer.did_reload(version, line_ending, mtime, cx);
});
}
Ok(())

@@ -2661,7 +2661,7 @@ async fn test_file_changes_multiple_times_on_disk(cx: &mut gpui::TestAppContext)
)
.await
.unwrap();
worktree.next_event(cx);
worktree.next_event(cx).await;
// Change the buffer's file again. Depending on the random seed, the
// previous file change may still be in progress.
@@ -2672,7 +2672,7 @@ async fn test_file_changes_multiple_times_on_disk(cx: &mut gpui::TestAppContext)
)
.await
.unwrap();
worktree.next_event(cx);
worktree.next_event(cx).await;
cx.executor().run_until_parked();
let on_disk_text = fs.load(Path::new("/dir/file1")).await.unwrap();
@@ -2716,7 +2716,7 @@ async fn test_edit_buffer_while_it_reloads(cx: &mut gpui::TestAppContext) {
)
.await
.unwrap();
worktree.next_event(cx);
worktree.next_event(cx).await;
cx.executor()
.spawn(cx.executor().simulate_random_delay())
@@ -3122,12 +3122,7 @@ async fn test_buffer_is_dirty(cx: &mut gpui::TestAppContext) {
&[language::Event::Edited, language::Event::DirtyChanged]
);
events.lock().clear();
buffer.did_save(
buffer.version(),
buffer.as_rope().fingerprint(),
buffer.file().unwrap().mtime(),
cx,
);
buffer.did_save(buffer.version(), buffer.file().unwrap().mtime(), cx);
});
// after saving, the buffer is not dirty, and emits a saved event.

@@ -52,6 +52,7 @@ impl Project {
(
Some(TaskState {
id: spawn_task.id,
full_label: spawn_task.full_label,
label: spawn_task.label,
status: TaskStatus::Running,
completion_rx,

@@ -1,6 +1,6 @@
mod project_panel_settings;
use client::{ErrorCode, ErrorExt};
use settings::Settings;
use settings::{Settings, SettingsStore};
use db::kvp::KEY_VALUE_STORE;
use editor::{actions::Cancel, items::entry_git_aware_label_color, scroll::Autoscroll, Editor};
@@ -24,6 +24,7 @@ use project_panel_settings::{ProjectPanelDockPosition, ProjectPanelSettings};
use serde::{Deserialize, Serialize};
use std::{
cmp::Ordering,
collections::HashSet,
ffi::OsStr,
ops::Range,
path::{Path, PathBuf},
@@ -50,6 +51,7 @@ pub struct ProjectPanel {
visible_entries: Vec<(WorktreeId, Vec<Entry>)>,
last_worktree_root_id: Option<ProjectEntryId>,
expanded_dir_ids: HashMap<WorktreeId, Vec<ProjectEntryId>>,
unfolded_dir_ids: HashSet<ProjectEntryId>,
selection: Option<Selection>,
context_menu: Option<(View<ContextMenu>, Point<Pixels>, Subscription)>,
edit_state: Option<EditState>,
@@ -133,6 +135,8 @@ actions!(
OpenPermanent,
ToggleFocus,
NewSearchInDirectory,
UnfoldDirectory,
FoldDirectory,
]
);
@@ -235,6 +239,16 @@ impl ProjectPanel {
})
.detach();
let mut project_panel_settings = *ProjectPanelSettings::get_global(cx);
cx.observe_global::<SettingsStore>(move |_, cx| {
let new_settings = *ProjectPanelSettings::get_global(cx);
if project_panel_settings != new_settings {
project_panel_settings = new_settings;
cx.notify();
}
})
.detach();
let mut this = Self {
project: project.clone(),
fs: workspace.app_state().fs.clone(),
@@ -243,6 +257,7 @@ impl ProjectPanel {
visible_entries: Default::default(),
last_worktree_root_id: Default::default(),
expanded_dir_ids: Default::default(),
unfolded_dir_ids: Default::default(),
selection: None,
edit_state: None,
context_menu: None,
@@ -403,8 +418,11 @@ impl ProjectPanel {
});
if let Some((worktree, entry)) = self.selected_entry(cx) {
let auto_fold_dirs = ProjectPanelSettings::get_global(cx).auto_fold_dirs;
let is_root = Some(entry) == worktree.root_entry();
let is_dir = entry.is_dir();
let is_foldable = auto_fold_dirs && self.is_foldable(entry, worktree);
let is_unfoldable = auto_fold_dirs && self.is_unfoldable(entry, worktree);
let worktree_id = worktree.id();
let is_local = project.is_local();
let is_read_only = project.is_read_only();
@@ -430,6 +448,12 @@ impl ProjectPanel {
menu.separator()
.action("Find in Folder…", Box::new(NewSearchInDirectory))
})
.when(is_unfoldable, |menu| {
menu.action("Unfold Directory", Box::new(UnfoldDirectory))
})
.when(is_foldable, |menu| {
menu.action("Fold Directory", Box::new(FoldDirectory))
})
.separator()
.action("Cut", Box::new(Cut))
.action("Copy", Box::new(Copy))
@@ -482,6 +506,37 @@ impl ProjectPanel {
cx.notify();
}
fn is_unfoldable(&self, entry: &Entry, worktree: &Worktree) -> bool {
if !entry.is_dir() || self.unfolded_dir_ids.contains(&entry.id) {
return false;
}
if let Some(parent_path) = entry.path.parent() {
let snapshot = worktree.snapshot();
let mut child_entries = snapshot.child_entries(&parent_path);
if let Some(child) = child_entries.next() {
if child_entries.next().is_none() {
return child.kind.is_dir();
}
}
};
false
}
fn is_foldable(&self, entry: &Entry, worktree: &Worktree) -> bool {
if entry.is_dir() {
let snapshot = worktree.snapshot();
let mut child_entries = snapshot.child_entries(&entry.path);
if let Some(child) = child_entries.next() {
if child_entries.next().is_none() {
return child.kind.is_dir();
}
}
}
false
}
fn expand_selected_entry(&mut self, _: &ExpandSelectedEntry, cx: &mut ViewContext<Self>) {
if let Some((worktree, entry)) = self.selected_entry(cx) {
if entry.is_dir() {
@@ -859,6 +914,59 @@ impl ProjectPanel {
});
}
fn unfold_directory(&mut self, _: &UnfoldDirectory, cx: &mut ViewContext<Self>) {
if let Some((worktree, entry)) = self.selected_entry(cx) {
self.unfolded_dir_ids.insert(entry.id);
let snapshot = worktree.snapshot();
let mut parent_path = entry.path.parent();
while let Some(path) = parent_path {
if let Some(parent_entry) = worktree.entry_for_path(path) {
let mut children_iter = snapshot.child_entries(path);
if children_iter.by_ref().take(2).count() > 1 {
break;
}
self.unfolded_dir_ids.insert(parent_entry.id);
parent_path = path.parent();
} else {
break;
}
}
self.update_visible_entries(None, cx);
self.autoscroll(cx);
cx.notify();
}
}
fn fold_directory(&mut self, _: &FoldDirectory, cx: &mut ViewContext<Self>) {
if let Some((worktree, entry)) = self.selected_entry(cx) {
self.unfolded_dir_ids.remove(&entry.id);
let snapshot = worktree.snapshot();
let mut path = &*entry.path;
loop {
let mut child_entries_iter = snapshot.child_entries(path);
if let Some(child) = child_entries_iter.next() {
if child_entries_iter.next().is_none() && child.is_dir() {
self.unfolded_dir_ids.remove(&child.id);
path = &*child.path;
} else {
break;
}
} else {
break;
}
}
self.update_visible_entries(None, cx);
self.autoscroll(cx);
cx.notify();
}
}
fn select_next(&mut self, _: &SelectNext, cx: &mut ViewContext<Self>) {
if let Some(selection) = self.selection {
let (mut worktree_ix, mut entry_ix, _) =
@@ -1153,6 +1261,7 @@ impl ProjectPanel {
new_selected_entry: Option<(WorktreeId, ProjectEntryId)>,
cx: &mut ViewContext<Self>,
) {
let auto_collapse_dirs = ProjectPanelSettings::get_global(cx).auto_fold_dirs;
let project = self.project.read(cx);
self.last_worktree_root_id = project
.visible_worktrees(cx)
@@ -1194,8 +1303,25 @@ impl ProjectPanel {
let mut visible_worktree_entries = Vec::new();
let mut entry_iter = snapshot.entries(true);
while let Some(entry) = entry_iter.entry() {
if auto_collapse_dirs
&& entry.kind.is_dir()
&& !self.unfolded_dir_ids.contains(&entry.id)
{
if let Some(root_path) = snapshot.root_entry() {
let mut child_entries = snapshot.child_entries(&entry.path);
if let Some(child) = child_entries.next() {
if entry.path != root_path.path
&& child_entries.next().is_none()
&& child.kind.is_dir()
{
entry_iter.advance();
continue;
}
}
}
}
visible_worktree_entries.push(entry.clone());
if Some(entry.id) == new_entry_parent_id {
visible_worktree_entries.push(Entry {
@@ -1367,16 +1493,32 @@ impl ProjectPanel {
}
};
let mut details = EntryDetails {
filename: entry
let (depth, difference) = ProjectPanel::calculate_depth_and_difference(
entry,
visible_worktree_entries,
);
let filename = match difference {
diff if diff > 1 => entry
.path
.iter()
.skip(entry.path.components().count() - diff)
.collect::<PathBuf>()
.to_str()
.unwrap_or_default()
.to_string(),
_ => entry
.path
.file_name()
.unwrap_or(root_name)
.to_string_lossy()
.to_string(),
.map(|name| name.to_string_lossy().into_owned())
.unwrap_or_else(|| root_name.to_string_lossy().to_string()),
};
let mut details = EntryDetails {
filename,
icon,
path: entry.path.clone(),
depth: entry.path.components().count(),
depth,
kind: entry.kind,
is_ignored: entry.is_ignored,
is_expanded,
@@ -1420,6 +1562,45 @@ impl ProjectPanel {
}
}
fn calculate_depth_and_difference(
entry: &Entry,
visible_worktree_entries: &Vec<Entry>,
) -> (usize, usize) {
let visible_worktree_paths: HashSet<Arc<Path>> = visible_worktree_entries
.iter()
.map(|e| e.path.clone())
.collect();
let (depth, difference) = entry
.path
.ancestors()
.skip(1) // Skip the entry itself
.find_map(|ancestor| {
if visible_worktree_paths.contains(ancestor) {
let parent_entry = visible_worktree_entries
.iter()
.find(|&e| &*e.path == ancestor)
.unwrap();
let entry_path_components_count = entry.path.components().count();
let parent_path_components_count = parent_entry.path.components().count();
let difference = entry_path_components_count - parent_path_components_count;
let depth = parent_entry
.path
.ancestors()
.skip(1)
.filter(|ancestor| visible_worktree_paths.contains(*ancestor))
.count();
Some((depth + 1, difference))
} else {
None
}
})
.unwrap_or((0, 0));
(depth, difference)
}
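A toy recreation of `calculate_depth_and_difference` above, using `/`-joined strings in place of worktree entries: `difference` is how many path components are folded into a single rendered row, and `depth` is the row's indent level.

```rust
use std::collections::HashSet;

// Toy analogue of calculate_depth_and_difference: walk up to the nearest
// visible ancestor (as the find_map in the original does), then count
// component and indent differences against it.
fn depth_and_difference(path: &str, visible: &[&str]) -> (usize, usize) {
    let visible: HashSet<&str> = visible.iter().copied().collect();
    let components = |p: &str| p.split('/').count();
    let visible_ancestors = |p: &str| {
        let mut count = 0;
        let mut a = p;
        while let Some(i) = a.rfind('/') {
            a = &a[..i];
            if visible.contains(a) {
                count += 1;
            }
        }
        count
    };
    let mut ancestor = path;
    while let Some(i) = ancestor.rfind('/') {
        ancestor = &ancestor[..i];
        if visible.contains(ancestor) {
            let difference = components(path) - components(ancestor);
            return (visible_ancestors(ancestor) + 1, difference);
        }
    }
    (0, 0)
}

fn main() {
    // "root/a/b" is rendered as one folded row, so the file two components
    // below it sits at depth 2 with a difference of 2.
    let visible = ["root", "root/a/b", "root/a/b/c/d.txt"];
    assert_eq!(depth_and_difference("root/a/b/c/d.txt", &visible), (2, 2));
    assert_eq!(depth_and_difference("root/a/b", &visible), (1, 2));
}
```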
fn render_entry(
&self,
entry_id: ProjectEntryId,
@@ -1572,6 +1753,8 @@ impl Render for ProjectPanel {
.on_action(cx.listener(Self::copy_path))
.on_action(cx.listener(Self::copy_relative_path))
.on_action(cx.listener(Self::new_search_in_directory))
.on_action(cx.listener(Self::unfold_directory))
.on_action(cx.listener(Self::fold_directory))
.when(!project.is_read_only(), |el| {
el.on_action(cx.listener(Self::new_file))
.on_action(cx.listener(Self::new_directory))
@@ -1983,6 +2166,125 @@ mod tests {
);
}
#[gpui::test]
async fn test_auto_collapse_dir_paths(cx: &mut gpui::TestAppContext) {
init_test(cx);
let fs = FakeFs::new(cx.executor().clone());
fs.insert_tree(
"/root1",
json!({
"dir_1": {
"nested_dir_1": {
"nested_dir_2": {
"nested_dir_3": {
"file_a.java": "// File contents",
"file_b.java": "// File contents",
"file_c.java": "// File contents",
"nested_dir_4": {
"nested_dir_5": {
"file_d.java": "// File contents",
}
}
}
}
}
}
}),
)
.await;
fs.insert_tree(
"/root2",
json!({
"dir_2": {
"file_1.java": "// File contents",
}
}),
)
.await;
let project = Project::test(fs.clone(), ["/root1".as_ref(), "/root2".as_ref()], cx).await;
let workspace = cx.add_window(|cx| Workspace::test_new(project.clone(), cx));
let cx = &mut VisualTestContext::from_window(*workspace, cx);
cx.update(|cx| {
let settings = *ProjectPanelSettings::get_global(cx);
ProjectPanelSettings::override_global(
ProjectPanelSettings {
auto_fold_dirs: true,
..settings
},
cx,
);
});
let panel = workspace
.update(cx, |workspace, cx| ProjectPanel::new(workspace, cx))
.unwrap();
assert_eq!(
visible_entries_as_strings(&panel, 0..10, cx),
&[
"v root1",
" > dir_1/nested_dir_1/nested_dir_2/nested_dir_3",
"v root2",
" > dir_2",
]
);
toggle_expand_dir(
&panel,
"root1/dir_1/nested_dir_1/nested_dir_2/nested_dir_3",
cx,
);
assert_eq!(
visible_entries_as_strings(&panel, 0..10, cx),
&[
"v root1",
" v dir_1/nested_dir_1/nested_dir_2/nested_dir_3 <== selected",
" > nested_dir_4/nested_dir_5",
" file_a.java",
" file_b.java",
" file_c.java",
"v root2",
" > dir_2",
]
);
toggle_expand_dir(
&panel,
"root1/dir_1/nested_dir_1/nested_dir_2/nested_dir_3/nested_dir_4/nested_dir_5",
cx,
);
assert_eq!(
visible_entries_as_strings(&panel, 0..10, cx),
&[
"v root1",
" v dir_1/nested_dir_1/nested_dir_2/nested_dir_3",
" v nested_dir_4/nested_dir_5 <== selected",
" file_d.java",
" file_a.java",
" file_b.java",
" file_c.java",
"v root2",
" > dir_2",
]
);
toggle_expand_dir(&panel, "root2/dir_2", cx);
assert_eq!(
visible_entries_as_strings(&panel, 0..10, cx),
&[
"v root1",
" v dir_1/nested_dir_1/nested_dir_2/nested_dir_3",
" v nested_dir_4/nested_dir_5",
" file_d.java",
" file_a.java",
" file_b.java",
" file_c.java",
"v root2",
" v dir_2 <== selected",
" file_1.java",
]
);
}
#[gpui::test(iterations = 30)]
async fn test_editing_files(cx: &mut gpui::TestAppContext) {
init_test(cx);

@@ -4,14 +4,14 @@ use schemars::JsonSchema;
use serde_derive::{Deserialize, Serialize};
use settings::{Settings, SettingsSources};
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema)]
#[derive(Clone, Debug, Serialize, Deserialize, JsonSchema, Copy, PartialEq)]
#[serde(rename_all = "snake_case")]
pub enum ProjectPanelDockPosition {
Left,
Right,
}
#[derive(Deserialize, Debug)]
#[derive(Deserialize, Debug, Clone, Copy, PartialEq)]
pub struct ProjectPanelSettings {
pub default_width: Pixels,
pub dock: ProjectPanelDockPosition,
@@ -20,6 +20,7 @@ pub struct ProjectPanelSettings {
pub git_status: bool,
pub indent_size: f32,
pub auto_reveal_entries: bool,
pub auto_fold_dirs: bool,
}
#[derive(Clone, Default, Serialize, Deserialize, JsonSchema, Debug)]
@@ -54,6 +55,11 @@ pub struct ProjectPanelSettingsContent {
///
/// Default: true
pub auto_reveal_entries: Option<bool>,
/// Whether to fold directories automatically
/// when a directory has only one directory inside.
///
/// Default: false
pub auto_fold_dirs: Option<bool>,
}
impl Settings for ProjectPanelSettings {

@@ -1,7 +1,7 @@
use futures::FutureExt;
use gpui::{
AnyElement, ElementId, FontStyle, FontWeight, HighlightStyle, InteractiveText, IntoElement,
SharedString, StrikethroughStyle, StyledText, UnderlineStyle, WindowContext,
AnyElement, AnyView, ElementId, FontStyle, FontWeight, HighlightStyle, InteractiveText,
IntoElement, SharedString, StrikethroughStyle, StyledText, UnderlineStyle, WindowContext,
};
use language::{HighlightId, Language, LanguageRegistry};
use std::{ops::Range, sync::Arc};
@@ -31,12 +31,16 @@ impl From<HighlightId> for Highlight {
}
}
#[derive(Debug, Clone)]
#[derive(Clone)]
pub struct RichText {
pub text: SharedString,
pub highlights: Vec<(Range<usize>, Highlight)>,
pub link_ranges: Vec<Range<usize>>,
pub link_urls: Arc<[String]>,
pub custom_ranges: Vec<Range<usize>>,
custom_ranges_tooltip_fn:
Option<Arc<dyn Fn(usize, Range<usize>, &mut WindowContext) -> Option<AnyView>>>,
}
/// Allows one to specify extra links to the rendered markdown, which can be used
@@ -48,7 +52,14 @@ pub struct Mention {
}
impl RichText {
pub fn element(&self, id: ElementId, cx: &WindowContext) -> AnyElement {
pub fn set_tooltip_builder_for_custom_ranges(
&mut self,
f: impl Fn(usize, Range<usize>, &mut WindowContext) -> Option<AnyView> + 'static,
) {
self.custom_ranges_tooltip_fn = Some(Arc::new(f));
}
pub fn element(&self, id: ElementId, cx: &mut WindowContext) -> AnyElement {
let theme = cx.theme();
let code_background = theme.colors().surface_background;
@@ -111,12 +122,21 @@ impl RichText {
.tooltip({
let link_ranges = self.link_ranges.clone();
let link_urls = self.link_urls.clone();
let custom_tooltip_ranges = self.custom_ranges.clone();
let custom_tooltip_fn = self.custom_ranges_tooltip_fn.clone();
move |idx, cx| {
for (ix, range) in link_ranges.iter().enumerate() {
if range.contains(&idx) {
return Some(LinkPreview::new(&link_urls[ix], cx));
}
}
for range in &custom_tooltip_ranges {
if range.contains(&idx) {
if let Some(f) = &custom_tooltip_fn {
return f(idx, range.clone(), cx);
}
}
}
None
}
})
@@ -354,6 +374,8 @@ pub fn render_rich_text(
link_urls: link_urls.into(),
link_ranges,
highlights,
custom_ranges: Vec::new(),
custom_ranges_tooltip_fn: None,
}
}

@@ -13,7 +13,6 @@ path = "src/rope.rs"
[dependencies]
arrayvec = "0.7.1"
bromberg_sl2 = { git = "https://github.com/zed-industries/bromberg_sl2", rev = "950bc5482c216c395049ae33ae4501e08975f17f" }
log.workspace = true
smallvec.workspace = true
sum_tree.workspace = true

@@ -4,7 +4,6 @@ mod point_utf16;
mod unclipped;
use arrayvec::ArrayString;
use bromberg_sl2::HashMatrix;
use smallvec::SmallVec;
use std::{
cmp, fmt, io, mem,
@@ -26,12 +25,6 @@ const CHUNK_BASE: usize = 6;
#[cfg(not(test))]
const CHUNK_BASE: usize = 64;
/// Type alias to [`HashMatrix`], an implementation of a homomorphic hash function. Two [`Rope`] instances
/// containing the same text will produce the same fingerprint. This hash function is special in that
/// it allows us to hash individual chunks and aggregate them up the [`Rope`]'s tree, with the resulting
/// hash being equivalent to hashing all the text contained in the [`Rope`] at once.
pub type RopeFingerprint = HashMatrix;
#[derive(Clone, Default)]
pub struct Rope {
chunks: SumTree<Chunk>,
@@ -42,10 +35,6 @@ impl Rope {
Self::default()
}
pub fn text_fingerprint(text: &str) -> RopeFingerprint {
bromberg_sl2::hash_strict(text.as_bytes())
}
pub fn append(&mut self, rope: Rope) {
let mut chunks = rope.chunks.cursor::<()>();
chunks.next(&());
@@ -424,10 +413,6 @@ impl Rope {
self.clip_point(Point::new(row, u32::MAX), Bias::Left)
.column
}
pub fn fingerprint(&self) -> RopeFingerprint {
self.chunks.summary().fingerprint
}
}
impl<'a> From<&'a str> for Rope {
@@ -994,14 +979,12 @@ impl sum_tree::Item for Chunk {
#[derive(Clone, Debug, Default, Eq, PartialEq)]
pub struct ChunkSummary {
text: TextSummary,
fingerprint: RopeFingerprint,
}
impl<'a> From<&'a str> for ChunkSummary {
fn from(text: &'a str) -> Self {
Self {
text: TextSummary::from(text),
fingerprint: Rope::text_fingerprint(text),
}
}
}
@@ -1011,7 +994,6 @@ impl sum_tree::Summary for ChunkSummary {
fn add_summary(&mut self, summary: &Self, _: &()) {
self.text += &summary.text;
self.fingerprint = self.fingerprint * summary.fingerprint;
}
}

View File

@@ -204,6 +204,11 @@ message Envelope {
LanguageModelResponse language_model_response = 167;
CountTokensWithLanguageModel count_tokens_with_language_model = 168;
CountTokensResponse count_tokens_response = 169;
GetCachedEmbeddings get_cached_embeddings = 189;
GetCachedEmbeddingsResponse get_cached_embeddings_response = 190;
ComputeEmbeddings compute_embeddings = 191;
ComputeEmbeddingsResponse compute_embeddings_response = 192; // current max
UpdateChannelMessage update_channel_message = 170;
ChannelMessageUpdate channel_message_update = 171;
@@ -216,7 +221,7 @@ message Envelope {
MultiLspQueryResponse multi_lsp_query_response = 176;
CreateRemoteProject create_remote_project = 177;
CreateRemoteProjectResponse create_remote_project_response = 188; // current max
CreateRemoteProjectResponse create_remote_project_response = 188;
CreateDevServer create_dev_server = 178;
CreateDevServerResponse create_dev_server_response = 179;
ShutdownDevServer shutdown_dev_server = 180;
@@ -750,7 +755,7 @@ message BufferSaved {
uint64 buffer_id = 2;
repeated VectorClockEntry version = 3;
Timestamp mtime = 4;
string fingerprint = 5;
reserved 5;
}
message BufferReloaded {
@@ -758,7 +763,7 @@ message BufferReloaded {
uint64 buffer_id = 2;
repeated VectorClockEntry version = 3;
Timestamp mtime = 4;
string fingerprint = 5;
reserved 5;
LineEnding line_ending = 6;
}
@@ -1605,7 +1610,7 @@ message BufferState {
optional string diff_base = 4;
LineEnding line_ending = 5;
repeated VectorClockEntry saved_version = 6;
string saved_version_fingerprint = 7;
reserved 7;
Timestamp saved_mtime = 8;
}
@@ -1892,6 +1897,29 @@ message CountTokensResponse {
uint32 token_count = 1;
}
message GetCachedEmbeddings {
string model = 1;
repeated bytes digests = 2;
}
message GetCachedEmbeddingsResponse {
repeated Embedding embeddings = 1;
}
message ComputeEmbeddings {
string model = 1;
repeated string texts = 2;
}
message ComputeEmbeddingsResponse {
repeated Embedding embeddings = 1;
}
message Embedding {
bytes digest = 1;
repeated float dimensions = 2;
}
message BlameBuffer {
uint64 project_id = 1;
uint64 buffer_id = 2;

View File

@@ -151,6 +151,8 @@ messages!(
(ChannelMessageSent, Foreground),
(ChannelMessageUpdate, Foreground),
(CompleteWithLanguageModel, Background),
(ComputeEmbeddings, Background),
(ComputeEmbeddingsResponse, Background),
(CopyProjectEntry, Foreground),
(CountTokensWithLanguageModel, Background),
(CountTokensResponse, Background),
@@ -174,6 +176,8 @@ messages!(
(FormatBuffers, Foreground),
(FormatBuffersResponse, Foreground),
(FuzzySearchUsers, Foreground),
(GetCachedEmbeddings, Background),
(GetCachedEmbeddingsResponse, Background),
(GetChannelMembers, Foreground),
(GetChannelMembersResponse, Foreground),
(GetChannelMessages, Background),
@@ -325,6 +329,7 @@ request_messages!(
(CancelCall, Ack),
(CopyProjectEntry, ProjectEntryResponse),
(CompleteWithLanguageModel, LanguageModelResponse),
(ComputeEmbeddings, ComputeEmbeddingsResponse),
(CountTokensWithLanguageModel, CountTokensResponse),
(CreateChannel, CreateChannelResponse),
(CreateProjectEntry, ProjectEntryResponse),
@@ -336,6 +341,7 @@ request_messages!(
(Follow, FollowResponse),
(FormatBuffers, FormatBuffersResponse),
(FuzzySearchUsers, UsersResponse),
(GetCachedEmbeddings, GetCachedEmbeddingsResponse),
(GetChannelMembers, GetChannelMembersResponse),
(GetChannelMessages, GetChannelMessagesResponse),
(GetChannelMessagesById, GetChannelMessagesResponse),

View File

@@ -272,28 +272,21 @@ impl Render for BufferSearchBar {
"Select previous match",
&SelectPrevMatch,
))
.when(!narrow_mode, |this| {
this.child(
h_flex()
.mx(rems_from_px(-4.0))
.min_w(rems_from_px(40.))
.justify_center()
.items_center()
.child(Label::new(match_text).color(
if self.active_match_index.is_some() {
Color::Default
} else {
Color::Disabled
},
)),
)
})
.child(render_nav_button(
ui::IconName::ChevronRight,
self.active_match_index.is_some(),
"Select next match",
&SelectNextMatch,
)),
))
.when(!narrow_mode, |this| {
this.child(h_flex().min_w(rems_from_px(40.)).child(
Label::new(match_text).color(if self.active_match_index.is_some() {
Color::Default
} else {
Color::Disabled
}),
))
}),
);
let replace_line = should_show_replace_input.then(|| {
@@ -531,6 +524,7 @@ impl BufferSearchBar {
}
}
if let Some(active_editor) = self.active_searchable_item.as_ref() {
active_editor.search_bar_visibility_changed(false, cx);
let handle = active_editor.focus_handle(cx);
cx.focus(&handle);
}
@@ -572,10 +566,12 @@ impl BufferSearchBar {
}
pub fn show(&mut self, cx: &mut ViewContext<Self>) -> bool {
if self.active_searchable_item.is_none() {
let Some(handle) = self.active_searchable_item.as_ref() else {
return false;
}
};
self.dismissed = false;
handle.search_bar_visibility_changed(true, cx);
cx.notify();
cx.emit(Event::UpdateLocation);
cx.emit(ToolbarItemEvent::ChangeLocation(
@@ -1868,8 +1864,7 @@ mod tests {
// Let's turn on regex mode.
search_bar
.update(cx, |search_bar, cx| {
search_bar.enable_search_option(SearchOptions::REGEX, cx);
search_bar.search("\\[([^\\]]+)\\]", None, cx)
search_bar.search("\\[([^\\]]+)\\]", Some(SearchOptions::REGEX), cx)
})
.await
.unwrap();
@@ -1892,8 +1887,11 @@ mod tests {
// Now with a whole-word twist.
search_bar
.update(cx, |search_bar, cx| {
search_bar.enable_search_option(SearchOptions::REGEX, cx);
search_bar.search("a\\w+s", Some(SearchOptions::WHOLE_WORD), cx)
search_bar.search(
"a\\w+s",
Some(SearchOptions::REGEX | SearchOptions::WHOLE_WORD),
cx,
)
})
.await
.unwrap();

View File

@@ -1437,20 +1437,6 @@ impl Render for ProjectSearchBar {
Tooltip::for_action("Go to previous match", &SelectPrevMatch, cx)
}),
)
.child(
h_flex()
.mx(rems_from_px(-4.0))
.min_w(rems_from_px(40.))
.justify_center()
.items_center()
.child(
Label::new(match_text).color(if search.active_match_index.is_some() {
Color::Default
} else {
Color::Disabled
}),
),
)
.child(
IconButton::new("project-search-next-match", IconName::ChevronRight)
.disabled(search.active_match_index.is_none())
@@ -1463,6 +1449,17 @@ impl Render for ProjectSearchBar {
}))
.tooltip(|cx| Tooltip::for_action("Go to next match", &SelectNextMatch, cx)),
)
.child(
h_flex()
.min_w(rems_from_px(40.))
.child(
Label::new(match_text).color(if search.active_match_index.is_some() {
Color::Default
} else {
Color::Disabled
}),
),
)
.when(limit_reached, |this| {
this.child(
div()

View File

@@ -0,0 +1,48 @@
[package]
name = "semantic_index"
description = "Process, chunk, and embed text as vectors for semantic search."
version = "0.1.0"
edition = "2021"
publish = false
license = "GPL-3.0-or-later"
[lib]
path = "src/semantic_index.rs"
[dependencies]
anyhow.workspace = true
client.workspace = true
clock.workspace = true
collections.workspace = true
fs.workspace = true
futures.workspace = true
futures-batch.workspace = true
gpui.workspace = true
language.workspace = true
log.workspace = true
heed.workspace = true
open_ai.workspace = true
project.workspace = true
settings.workspace = true
serde.workspace = true
serde_json.workspace = true
sha2.workspace = true
smol.workspace = true
util.workspace = true
worktree.workspace = true
[dev-dependencies]
env_logger.workspace = true
client = { workspace = true, features = ["test-support"] }
fs = { workspace = true, features = ["test-support"] }
futures.workspace = true
gpui = { workspace = true, features = ["test-support"] }
language = { workspace = true, features = ["test-support"] }
languages.workspace = true
project = { workspace = true, features = ["test-support"] }
tempfile.workspace = true
util = { workspace = true, features = ["test-support"] }
worktree = { workspace = true, features = ["test-support"] }
[lints]
workspace = true

View File

@@ -0,0 +1 @@
../../LICENSE-GPL

View File

@@ -0,0 +1,140 @@
use client::Client;
use futures::channel::oneshot;
use gpui::{App, Global, TestAppContext};
use language::language_settings::AllLanguageSettings;
use project::Project;
use semantic_index::{OpenAiEmbeddingModel, OpenAiEmbeddingProvider, SemanticIndex};
use settings::SettingsStore;
use std::{path::Path, sync::Arc};
use util::http::HttpClientWithUrl;
pub fn init_test(cx: &mut TestAppContext) {
_ = cx.update(|cx| {
let store = SettingsStore::test(cx);
cx.set_global(store);
language::init(cx);
Project::init_settings(cx);
SettingsStore::update(cx, |store, cx| {
store.update_user_settings::<AllLanguageSettings>(cx, |_| {});
});
});
}
fn main() {
env_logger::init();
use clock::FakeSystemClock;
App::new().run(|cx| {
let store = SettingsStore::test(cx);
cx.set_global(store);
language::init(cx);
Project::init_settings(cx);
SettingsStore::update(cx, |store, cx| {
store.update_user_settings::<AllLanguageSettings>(cx, |_| {});
});
let clock = Arc::new(FakeSystemClock::default());
let http = Arc::new(HttpClientWithUrl::new("http://localhost:11434"));
let client = client::Client::new(clock, http.clone(), cx);
Client::set_global(client.clone(), cx);
let args: Vec<String> = std::env::args().collect();
if args.len() < 2 {
eprintln!("Usage: cargo run --example index -p semantic_index -- <project_path>");
cx.quit();
return;
}
// let embedding_provider = semantic_index::FakeEmbeddingProvider;
let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY not set");
let embedding_provider = OpenAiEmbeddingProvider::new(
http.clone(),
OpenAiEmbeddingModel::TextEmbedding3Small,
open_ai::OPEN_AI_API_URL.to_string(),
api_key,
);
let semantic_index = SemanticIndex::new(
Path::new("/tmp/semantic-index-db.mdb"),
Arc::new(embedding_provider),
cx,
);
cx.spawn(|mut cx| async move {
let mut semantic_index = semantic_index.await.unwrap();
let project_path = Path::new(&args[1]);
let project = Project::example([project_path], &mut cx).await;
cx.update(|cx| {
let language_registry = project.read(cx).languages().clone();
let node_runtime = project.read(cx).node_runtime().unwrap().clone();
languages::init(language_registry, node_runtime, cx);
})
.unwrap();
let project_index = cx
.update(|cx| semantic_index.project_index(project.clone(), cx))
.unwrap();
let (tx, rx) = oneshot::channel();
let mut tx = Some(tx);
let subscription = cx.update(|cx| {
cx.subscribe(&project_index, move |_, event, _| {
if let Some(tx) = tx.take() {
_ = tx.send(*event);
}
})
});
let index_start = std::time::Instant::now();
rx.await.expect("no event emitted");
drop(subscription);
println!("Index time: {:?}", index_start.elapsed());
let results = cx
.update(|cx| {
let project_index = project_index.read(cx);
let query = "converting an anchor to a point";
project_index.search(query, 4, cx)
})
.unwrap()
.await;
for search_result in results {
let path = search_result.path.clone();
let content = cx
.update(|cx| {
let worktree = search_result.worktree.read(cx);
let entry_abs_path = worktree.abs_path().join(search_result.path.clone());
let fs = project.read(cx).fs().clone();
cx.spawn(|_| async move { fs.load(&entry_abs_path).await.unwrap() })
})
.unwrap()
.await;
let range = search_result.range.clone();
let content = content[search_result.range].to_owned();
println!(
"✄✄✄✄✄✄✄✄✄✄✄✄✄✄ {:?} @ {} ✄✄✄✄✄✄✄✄✄✄✄✄✄✄",
path, search_result.score
);
println!("{:?}:{:?}:{:?}", path, range.start, range.end);
println!("{}", content);
}
cx.background_executor()
.timer(std::time::Duration::from_secs(100000))
.await;
cx.update(|cx| cx.quit()).unwrap();
})
.detach();
});
}

View File

@@ -0,0 +1,3 @@
fn main() {
println!("Hello Indexer!");
}

View File

@@ -0,0 +1,43 @@
# Searching for a needle in a haystack
When you have a large amount of text, it can be useful to search for a specific word or phrase. This is often referred to as "finding a needle in a haystack." In this markdown document, we're "hiding" a key phrase for our text search to find. Can you find it?
## Instructions
1. Use the search functionality in your text editor or markdown viewer to find the hidden phrase in this document.
2. Once you've found the **phrase**, write it down and proceed to the next step.
Honestly, I just want to fill up plenty of characters so that we chunk this markdown into several chunks.
## Tips
- Relax
- Take a deep breath
- Focus on the task at hand
- Don't get distracted by other text
- Use the search functionality to your advantage
## Example code
```python
def search_for_needle(haystack, needle):
if needle in haystack:
return True
else:
return False
```
```javascript
function searchForNeedle(haystack, needle) {
return haystack.includes(needle);
}
```
## Background
When creating an index for a book or searching for a specific term in a large document, the ability to quickly find a specific word or phrase is essential. This is where search functionality comes in handy. However, one should _remember_ that the search is only as good as the index that was built. As they say, garbage in, garbage out!
## Conclusion
Searching for a needle in a haystack can be a challenging task, but with the right tools and techniques, it becomes much easier. Whether you're looking for a specific word in a document or trying to find a key piece of information in a large dataset, the ability to search efficiently is a valuable skill to have.

View File

@@ -0,0 +1,409 @@
use language::{with_parser, Grammar, Tree};
use serde::{Deserialize, Serialize};
use sha2::{Digest, Sha256};
use std::{cmp, ops::Range, sync::Arc};
const CHUNK_THRESHOLD: usize = 1500;
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Chunk {
pub range: Range<usize>,
pub digest: [u8; 32],
}
pub fn chunk_text(text: &str, grammar: Option<&Arc<Grammar>>) -> Vec<Chunk> {
if let Some(grammar) = grammar {
let tree = with_parser(|parser| {
parser
.set_language(&grammar.ts_language)
.expect("incompatible grammar");
parser.parse(&text, None).expect("invalid language")
});
chunk_parse_tree(tree, &text, CHUNK_THRESHOLD)
} else {
chunk_lines(&text)
}
}
fn chunk_parse_tree(tree: Tree, text: &str, chunk_threshold: usize) -> Vec<Chunk> {
let mut chunk_ranges = Vec::new();
let mut cursor = tree.walk();
let mut range = 0..0;
loop {
let node = cursor.node();
// If adding the node to the current chunk exceeds the threshold
if node.end_byte() - range.start > chunk_threshold {
// Try to descend into its first child. If we can't, flush the current
// range and try again.
if cursor.goto_first_child() {
continue;
} else if !range.is_empty() {
chunk_ranges.push(range.clone());
range.start = range.end;
continue;
}
// If we get here, the node itself has no children but is larger than the threshold.
// Break its text into arbitrary chunks.
split_text(text, range.clone(), node.end_byte(), &mut chunk_ranges);
}
range.end = node.end_byte();
// If we get here, we consumed the node. Advance to the next child, ascending if there isn't one.
while !cursor.goto_next_sibling() {
if !cursor.goto_parent() {
if !range.is_empty() {
chunk_ranges.push(range);
}
return chunk_ranges
.into_iter()
.map(|range| {
let digest = Sha256::digest(&text[range.clone()]).into();
Chunk { range, digest }
})
.collect();
}
}
}
}
fn chunk_lines(text: &str) -> Vec<Chunk> {
let mut chunk_ranges = Vec::new();
let mut range = 0..0;
let mut newlines = text.match_indices('\n').peekable();
while let Some((newline_ix, _)) = newlines.peek() {
let newline_ix = newline_ix + 1;
if newline_ix - range.start <= CHUNK_THRESHOLD {
range.end = newline_ix;
newlines.next();
} else {
if range.is_empty() {
split_text(text, range, newline_ix, &mut chunk_ranges);
range = newline_ix..newline_ix;
} else {
chunk_ranges.push(range.clone());
range.start = range.end;
}
}
}
if !range.is_empty() {
chunk_ranges.push(range);
}
chunk_ranges
.into_iter()
.map(|range| {
let mut hasher = Sha256::new();
hasher.update(&text[range.clone()]);
let mut digest = [0u8; 32];
digest.copy_from_slice(hasher.finalize().as_slice());
Chunk { range, digest }
})
.collect()
}
fn split_text(
text: &str,
mut range: Range<usize>,
max_end: usize,
chunk_ranges: &mut Vec<Range<usize>>,
) {
while range.start < max_end {
range.end = cmp::min(range.start + CHUNK_THRESHOLD, max_end);
while !text.is_char_boundary(range.end) {
range.end -= 1;
}
chunk_ranges.push(range.clone());
range.start = range.end;
}
}
#[cfg(test)]
mod tests {
use super::*;
use language::{tree_sitter_rust, Language, LanguageConfig, LanguageMatcher};
// This example comes from crates/gpui/examples/window_positioning.rs which
// has the property of being CHUNK_THRESHOLD < TEXT.len() < 2*CHUNK_THRESHOLD
static TEXT: &str = r#"
use gpui::*;
struct WindowContent {
text: SharedString,
}
impl Render for WindowContent {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
div()
.flex()
.bg(rgb(0x1e2025))
.size_full()
.justify_center()
.items_center()
.text_xl()
.text_color(rgb(0xffffff))
.child(self.text.clone())
}
}
fn main() {
App::new().run(|cx: &mut AppContext| {
// Create several new windows, positioned in the top right corner of each screen
for screen in cx.displays() {
let options = {
let popup_margin_width = DevicePixels::from(16);
let popup_margin_height = DevicePixels::from(-0) - DevicePixels::from(48);
let window_size = Size {
width: px(400.),
height: px(72.),
};
let screen_bounds = screen.bounds();
let size: Size<DevicePixels> = window_size.into();
let bounds = gpui::Bounds::<DevicePixels> {
origin: screen_bounds.upper_right()
- point(size.width + popup_margin_width, popup_margin_height),
size: window_size.into(),
};
WindowOptions {
// Set the bounds of the window in screen coordinates
bounds: Some(bounds),
// Specify the display_id to ensure the window is created on the correct screen
display_id: Some(screen.id()),
titlebar: None,
window_background: WindowBackgroundAppearance::default(),
focus: false,
show: true,
kind: WindowKind::PopUp,
is_movable: false,
fullscreen: false,
}
};
cx.open_window(options, |cx| {
cx.new_view(|_| WindowContent {
text: format!("{:?}", screen.id()).into(),
})
});
}
});
}"#;
fn setup_rust_language() -> Language {
Language::new(
LanguageConfig {
name: "Rust".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["rs".to_string()],
..Default::default()
},
..Default::default()
},
Some(tree_sitter_rust::language()),
)
}
#[test]
fn test_chunk_text() {
let text = "a\n".repeat(1000);
let chunks = chunk_text(&text, None);
assert_eq!(
chunks.len(),
((2000_f64) / (CHUNK_THRESHOLD as f64)).ceil() as usize
);
}
#[test]
fn test_chunk_text_grammar() {
// Let's set up a big text with some known segments
// We'll then chunk it and verify that the chunks are correct
let language = setup_rust_language();
let chunks = chunk_text(TEXT, language.grammar());
assert_eq!(chunks.len(), 2);
assert_eq!(chunks[0].range.start, 0);
assert_eq!(chunks[0].range.end, 1498);
// The break between chunks is right before the "Specify the display_id" comment
assert_eq!(chunks[1].range.start, 1498);
assert_eq!(chunks[1].range.end, 2396);
}
#[test]
fn test_chunk_parse_tree() {
let language = setup_rust_language();
let grammar = language.grammar().unwrap();
let tree = with_parser(|parser| {
parser
.set_language(&grammar.ts_language)
.expect("incompatible grammar");
parser.parse(TEXT, None).expect("invalid language")
});
let chunks = chunk_parse_tree(tree, TEXT, 250);
assert_eq!(chunks.len(), 11);
}
#[test]
fn test_chunk_unparsable() {
// Even if a chunk is unparsable, we should still be able to chunk it
let language = setup_rust_language();
let grammar = language.grammar().unwrap();
let text = r#"fn main() {"#;
let tree = with_parser(|parser| {
parser
.set_language(&grammar.ts_language)
.expect("incompatible grammar");
parser.parse(text, None).expect("invalid language")
});
let chunks = chunk_parse_tree(tree, text, 250);
assert_eq!(chunks.len(), 1);
assert_eq!(chunks[0].range.start, 0);
assert_eq!(chunks[0].range.end, 11);
}
#[test]
fn test_empty_text() {
let language = setup_rust_language();
let grammar = language.grammar().unwrap();
let tree = with_parser(|parser| {
parser
.set_language(&grammar.ts_language)
.expect("incompatible grammar");
parser.parse("", None).expect("invalid language")
});
let chunks = chunk_parse_tree(tree, "", CHUNK_THRESHOLD);
assert!(chunks.is_empty(), "Chunks should be empty for empty text");
}
#[test]
fn test_single_large_node() {
let large_text = "static ".to_owned() + "a".repeat(CHUNK_THRESHOLD - 1).as_str() + " = 2";
let language = setup_rust_language();
let grammar = language.grammar().unwrap();
let tree = with_parser(|parser| {
parser
.set_language(&grammar.ts_language)
.expect("incompatible grammar");
parser.parse(&large_text, None).expect("invalid language")
});
let chunks = chunk_parse_tree(tree, &large_text, CHUNK_THRESHOLD);
assert_eq!(
chunks.len(),
3,
"Large chunks are broken up according to grammar as best as possible"
);
// Expect chunks to be static, aaaaaa..., and = 2
assert_eq!(chunks[0].range.start, 0);
assert_eq!(chunks[0].range.end, "static".len());
assert_eq!(chunks[1].range.start, "static".len());
assert_eq!(chunks[1].range.end, "static".len() + CHUNK_THRESHOLD);
assert_eq!(chunks[2].range.start, "static".len() + CHUNK_THRESHOLD);
assert_eq!(chunks[2].range.end, large_text.len());
}
#[test]
fn test_multiple_small_nodes() {
let small_text = "a b c d e f g h i j k l m n o p q r s t u v w x y z";
let language = setup_rust_language();
let grammar = language.grammar().unwrap();
let tree = with_parser(|parser| {
parser
.set_language(&grammar.ts_language)
.expect("incompatible grammar");
parser.parse(small_text, None).expect("invalid language")
});
let chunks = chunk_parse_tree(tree, small_text, 5);
assert!(
chunks.len() > 1,
"Should have multiple chunks for multiple small nodes"
);
}
#[test]
fn test_node_with_children() {
let nested_text = "fn main() { let a = 1; let b = 2; }";
let language = setup_rust_language();
let grammar = language.grammar().unwrap();
let tree = with_parser(|parser| {
parser
.set_language(&grammar.ts_language)
.expect("incompatible grammar");
parser.parse(nested_text, None).expect("invalid language")
});
let chunks = chunk_parse_tree(tree, nested_text, 10);
assert!(
chunks.len() > 1,
"Should have multiple chunks for a node with children"
);
}
#[test]
fn test_text_with_unparsable_sections() {
// This test uses purposefully hit-or-miss sizing of 11 characters per likely chunk
let mixed_text = "fn main() { let a = 1; let b = 2; } unparsable bits here";
let language = setup_rust_language();
let grammar = language.grammar().unwrap();
let tree = with_parser(|parser| {
parser
.set_language(&grammar.ts_language)
.expect("incompatible grammar");
parser.parse(mixed_text, None).expect("invalid language")
});
let chunks = chunk_parse_tree(tree, mixed_text, 11);
assert!(
chunks.len() > 1,
"Should handle both parsable and unparsable sections correctly"
);
let expected_chunks = [
"fn main() {",
" let a = 1;",
" let b = 2;",
" }",
" unparsable",
" bits here",
];
for (i, chunk) in chunks.iter().enumerate() {
assert_eq!(
&mixed_text[chunk.range.clone()],
expected_chunks[i],
"Chunk {} should match",
i
);
}
}
}

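The `chunk_lines` fallback above (used when no grammar is available) can be sketched in isolation. This is a standalone illustration, not the committed code: the SHA-256 digest step is dropped so the splitting logic stands alone, and the threshold is shrunk to a hypothetical 8 bytes so the example actually splits.

```rust
use std::ops::Range;

// Hypothetical tiny threshold so the example produces several chunks;
// the diff uses CHUNK_THRESHOLD = 1500.
const CHUNK_THRESHOLD: usize = 8;

/// Greedily extend a chunk line by line; once the next newline would push the
/// chunk past the threshold, flush it. A single oversized line is split at
/// char boundaries so no chunk ever exceeds the threshold.
fn chunk_lines(text: &str) -> Vec<Range<usize>> {
    let mut chunk_ranges = Vec::new();
    let mut range = 0..0;
    let mut newlines = text.match_indices('\n').peekable();
    while let Some((newline_ix, _)) = newlines.peek() {
        let newline_ix = newline_ix + 1;
        if newline_ix - range.start <= CHUNK_THRESHOLD {
            range.end = newline_ix;
            newlines.next();
        } else if range.is_empty() {
            // The line alone is too big: split it arbitrarily.
            split_text(text, range, newline_ix, &mut chunk_ranges);
            range = newline_ix..newline_ix;
        } else {
            chunk_ranges.push(range.clone());
            range.start = range.end;
        }
    }
    if !range.is_empty() {
        chunk_ranges.push(range);
    }
    chunk_ranges
}

/// Split `text[range.start..max_end]` into threshold-sized pieces, backing
/// off so every cut lands on a UTF-8 char boundary.
fn split_text(text: &str, mut range: Range<usize>, max_end: usize, out: &mut Vec<Range<usize>>) {
    while range.start < max_end {
        range.end = std::cmp::min(range.start + CHUNK_THRESHOLD, max_end);
        while !text.is_char_boundary(range.end) {
            range.end -= 1;
        }
        out.push(range.clone());
        range.start = range.end;
    }
}

fn main() {
    // Two short lines fit one chunk; the long line gets split.
    for range in chunk_lines("ab\ncd\nef\nan overlong line\n") {
        println!("{:?}", range);
    }
}
```

Note the char-boundary backoff: chunk ranges are byte offsets, so cutting mid-codepoint would make the later `&text[range]` slicing panic.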
View File

@@ -0,0 +1,125 @@
mod cloud;
mod ollama;
mod open_ai;
pub use cloud::*;
pub use ollama::*;
pub use open_ai::*;
use sha2::{Digest, Sha256};
use anyhow::Result;
use futures::{future::BoxFuture, FutureExt};
use serde::{Deserialize, Serialize};
use std::{fmt, future};
#[derive(Debug, Default, Clone, PartialEq, Serialize, Deserialize)]
pub struct Embedding(Vec<f32>);
impl Embedding {
pub fn new(mut embedding: Vec<f32>) -> Self {
let len = embedding.len();
let mut norm = 0f32;
for i in 0..len {
norm += embedding[i] * embedding[i];
}
norm = norm.sqrt();
for dimension in &mut embedding {
*dimension /= norm;
}
Self(embedding)
}
fn len(&self) -> usize {
self.0.len()
}
pub fn similarity(self, other: &Embedding) -> f32 {
debug_assert_eq!(self.0.len(), other.0.len());
self.0
.iter()
.copied()
.zip(other.0.iter().copied())
.map(|(a, b)| a * b)
.sum()
}
}
impl fmt::Display for Embedding {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let digits_to_display = 3;
// Start the Embedding display format
write!(f, "Embedding(sized: {}; values: [", self.len())?;
for (index, value) in self.0.iter().enumerate().take(digits_to_display) {
// Lead with comma if not the first element
if index != 0 {
write!(f, ", ")?;
}
write!(f, "{:.3}", value)?;
}
if self.len() > digits_to_display {
write!(f, "...")?;
}
write!(f, "])")
}
}
/// Trait for embedding providers. Texts in, vectors out.
pub trait EmbeddingProvider: Sync + Send {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>>;
fn batch_size(&self) -> usize;
}
#[derive(Debug)]
pub struct TextToEmbed<'a> {
pub text: &'a str,
pub digest: [u8; 32],
}
impl<'a> TextToEmbed<'a> {
pub fn new(text: &'a str) -> Self {
let digest = Sha256::digest(text.as_bytes());
Self {
text,
digest: digest.into(),
}
}
}
pub struct FakeEmbeddingProvider;
impl EmbeddingProvider for FakeEmbeddingProvider {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>> {
let embeddings = texts
.iter()
.map(|_text| {
let mut embedding = vec![0f32; 1536];
for i in 0..embedding.len() {
embedding[i] = i as f32;
}
Embedding::new(embedding)
})
.collect();
future::ready(Ok(embeddings)).boxed()
}
fn batch_size(&self) -> usize {
16
}
}
#[cfg(test)]
mod test {
use super::*;
#[gpui::test]
fn test_normalize_embedding() {
let normalized = Embedding::new(vec![1.0, 1.0, 1.0]);
let value: f32 = 1.0 / 3.0_f32.sqrt();
assert_eq!(normalized, Embedding(vec![value; 3]));
}
}

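The point of `Embedding::new` normalizing to unit length is that cosine similarity then reduces to a plain dot product, which is what `similarity` computes. A minimal free-function sketch of the same math, with plain `Vec<f32>` standing in for the `Embedding` newtype:

```rust
/// Scale a vector to unit length (L2 norm of 1).
fn normalize(mut v: Vec<f32>) -> Vec<f32> {
    let norm = v.iter().map(|x| x * x).sum::<f32>().sqrt();
    for x in &mut v {
        *x /= norm;
    }
    v
}

/// Dot product of two equal-length vectors. On unit vectors this equals the
/// cosine of the angle between them, so it lies in [-1, 1].
fn similarity(a: &[f32], b: &[f32]) -> f32 {
    debug_assert_eq!(a.len(), b.len());
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn main() {
    let a = normalize(vec![1.0, 1.0, 1.0]); // each component becomes 1/sqrt(3)
    let c = normalize(vec![1.0, 0.0, 0.0]);
    println!("{:.3}", similarity(&a, &a)); // same direction: ~1.000
    println!("{:.3}", similarity(&a, &c)); // 1/sqrt(3): ~0.577
}
```

This is why the search path can rank results with a single dot product per candidate instead of a full cosine computation.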
View File

@@ -0,0 +1,88 @@
use crate::{Embedding, EmbeddingProvider, TextToEmbed};
use anyhow::{anyhow, Context, Result};
use client::{proto, Client};
use collections::HashMap;
use futures::{future::BoxFuture, FutureExt};
use std::sync::Arc;
pub struct CloudEmbeddingProvider {
model: String,
client: Arc<Client>,
}
impl CloudEmbeddingProvider {
pub fn new(client: Arc<Client>) -> Self {
Self {
model: "openai/text-embedding-3-small".into(),
client,
}
}
}
impl EmbeddingProvider for CloudEmbeddingProvider {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>> {
// First, fetch any embeddings that are cached based on the requested texts' digests
// Then compute any embeddings that are missing.
async move {
let cached_embeddings = self.client.request(proto::GetCachedEmbeddings {
model: self.model.clone(),
digests: texts
.iter()
.map(|to_embed| to_embed.digest.to_vec())
.collect(),
});
let mut embeddings = cached_embeddings
.await
.context("failed to fetch cached embeddings via cloud model")?
.embeddings
.into_iter()
.map(|embedding| {
let digest: [u8; 32] = embedding
.digest
.try_into()
.map_err(|_| anyhow!("invalid digest for cached embedding"))?;
Ok((digest, embedding.dimensions))
})
.collect::<Result<HashMap<_, _>>>()?;
let compute_embeddings_request = proto::ComputeEmbeddings {
model: self.model.clone(),
texts: texts
.iter()
.filter_map(|to_embed| {
if embeddings.contains_key(&to_embed.digest) {
None
} else {
Some(to_embed.text.to_string())
}
})
.collect(),
};
if !compute_embeddings_request.texts.is_empty() {
let missing_embeddings = self.client.request(compute_embeddings_request).await?;
for embedding in missing_embeddings.embeddings {
let digest: [u8; 32] = embedding
.digest
.try_into()
.map_err(|_| anyhow!("invalid digest for cached embedding"))?;
embeddings.insert(digest, embedding.dimensions);
}
}
texts
.iter()
.map(|to_embed| {
let dimensions = embeddings.remove(&to_embed.digest).with_context(|| {
format!("server did not return an embedding for {:?}", to_embed)
})?;
Ok(Embedding::new(dimensions))
})
.collect()
}
.boxed()
}
fn batch_size(&self) -> usize {
2048
}
}

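The cloud provider above runs a two-phase flow: fetch cached embeddings by content digest, compute only the misses, then return results in the original request order. A stdlib-only sketch of that partition-and-merge shape — the `digest` stub and the length-based "embedding" are illustrative stand-ins for the real SHA-256 digest and the `ComputeEmbeddings` RPC:

```rust
use std::collections::HashMap;

type Digest = u64;

// Stand-in for the SHA-256 content digest used by the real provider.
fn digest(text: &str) -> Digest {
    use std::hash::{Hash, Hasher};
    let mut h = std::collections::hash_map::DefaultHasher::new();
    text.hash(&mut h);
    h.finish()
}

/// Two-phase embed: serve cache hits by digest, "compute" only the misses,
/// then stitch everything back together in request order.
fn embed(texts: &[&str], cache: &mut HashMap<Digest, Vec<f32>>) -> Vec<Vec<f32>> {
    let missing: Vec<&str> = texts
        .iter()
        .copied()
        .filter(|t| !cache.contains_key(&digest(t)))
        .collect();
    for text in missing {
        // Stub for the ComputeEmbeddings request: embed by text length.
        cache.insert(digest(text), vec![text.len() as f32]);
    }
    texts
        .iter()
        .map(|t| cache[&digest(t)].clone())
        .collect()
}

fn main() {
    let mut cache = HashMap::new();
    cache.insert(digest("cached"), vec![42.0]); // pre-warmed cache entry
    let out = embed(&["cached", "fresh"], &mut cache);
    println!("{:?}", out); // hit returned as-is, miss freshly computed
}
```

Keying the cache by content digest rather than by path means identical text anywhere in any project hits the cache, and nothing has to be re-embedded when a file is merely moved.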
View File

@@ -0,0 +1,74 @@
use anyhow::{Context as _, Result};
use futures::{future::BoxFuture, AsyncReadExt, FutureExt};
use serde::{Deserialize, Serialize};
use std::sync::Arc;
use util::http::HttpClient;
use crate::{Embedding, EmbeddingProvider, TextToEmbed};
pub enum OllamaEmbeddingModel {
NomicEmbedText,
MxbaiEmbedLarge,
}
pub struct OllamaEmbeddingProvider {
client: Arc<dyn HttpClient>,
model: OllamaEmbeddingModel,
}
#[derive(Serialize)]
struct OllamaEmbeddingRequest {
model: String,
prompt: String,
}
#[derive(Deserialize)]
struct OllamaEmbeddingResponse {
embedding: Vec<f32>,
}
impl OllamaEmbeddingProvider {
pub fn new(client: Arc<dyn HttpClient>, model: OllamaEmbeddingModel) -> Self {
Self { client, model }
}
}
impl EmbeddingProvider for OllamaEmbeddingProvider {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>> {
//
let model = match self.model {
OllamaEmbeddingModel::NomicEmbedText => "nomic-embed-text",
OllamaEmbeddingModel::MxbaiEmbedLarge => "mxbai-embed-large",
};
futures::future::try_join_all(texts.into_iter().map(|to_embed| {
let request = OllamaEmbeddingRequest {
model: model.to_string(),
prompt: to_embed.text.to_string(),
};
let request = serde_json::to_string(&request).unwrap();
async {
let response = self
.client
.post_json("http://localhost:11434/api/embeddings", request.into())
.await?;
let mut body = String::new();
response.into_body().read_to_string(&mut body).await?;
let response: OllamaEmbeddingResponse =
serde_json::from_str(&body).context("Unable to pull response")?;
Ok(Embedding::new(response.embedding))
}
}))
.boxed()
}
fn batch_size(&self) -> usize {
// TODO: Figure out decent value
10
}
}

View File

@@ -0,0 +1,55 @@
use crate::{Embedding, EmbeddingProvider, TextToEmbed};
use anyhow::Result;
use futures::{future::BoxFuture, FutureExt};
pub use open_ai::OpenAiEmbeddingModel;
use std::sync::Arc;
use util::http::HttpClient;
pub struct OpenAiEmbeddingProvider {
client: Arc<dyn HttpClient>,
model: OpenAiEmbeddingModel,
api_url: String,
api_key: String,
}
impl OpenAiEmbeddingProvider {
pub fn new(
client: Arc<dyn HttpClient>,
model: OpenAiEmbeddingModel,
api_url: String,
api_key: String,
) -> Self {
Self {
client,
model,
api_url,
api_key,
}
}
}
impl EmbeddingProvider for OpenAiEmbeddingProvider {
fn embed<'a>(&'a self, texts: &'a [TextToEmbed<'a>]) -> BoxFuture<'a, Result<Vec<Embedding>>> {
let embed = open_ai::embed(
self.client.as_ref(),
&self.api_url,
&self.api_key,
self.model,
texts.iter().map(|to_embed| to_embed.text),
);
async move {
let response = embed.await?;
Ok(response
.data
.into_iter()
.map(|data| Embedding::new(data.embedding))
.collect())
}
.boxed()
}
fn batch_size(&self) -> usize {
// From https://platform.openai.com/docs/api-reference/embeddings/create
2048
}
}

View File

@@ -0,0 +1,954 @@
mod chunking;
mod embedding;
use anyhow::{anyhow, Context as _, Result};
use chunking::{chunk_text, Chunk};
use collections::{Bound, HashMap};
pub use embedding::*;
use fs::Fs;
use futures::stream::StreamExt;
use futures_batch::ChunksTimeoutStreamExt;
use gpui::{
AppContext, AsyncAppContext, Context, EntityId, EventEmitter, Global, Model, ModelContext,
Subscription, Task, WeakModel,
};
use heed::types::{SerdeBincode, Str};
use language::LanguageRegistry;
use project::{Entry, Project, UpdatedEntriesSet, Worktree};
use serde::{Deserialize, Serialize};
use smol::channel;
use std::{
cmp::Ordering,
future::Future,
ops::Range,
path::Path,
sync::Arc,
time::{Duration, SystemTime},
};
use util::ResultExt;
use worktree::LocalSnapshot;
pub struct SemanticIndex {
embedding_provider: Arc<dyn EmbeddingProvider>,
db_connection: heed::Env,
project_indices: HashMap<WeakModel<Project>, Model<ProjectIndex>>,
}
impl Global for SemanticIndex {}
impl SemanticIndex {
pub fn new(
db_path: &Path,
embedding_provider: Arc<dyn EmbeddingProvider>,
cx: &mut AppContext,
) -> Task<Result<Self>> {
let db_path = db_path.to_path_buf();
cx.spawn(|cx| async move {
let db_connection = cx
.background_executor()
.spawn(async move {
unsafe {
heed::EnvOpenOptions::new()
.map_size(1024 * 1024 * 1024)
.max_dbs(3000)
.open(db_path)
}
})
.await?;
Ok(SemanticIndex {
db_connection,
embedding_provider,
project_indices: HashMap::default(),
})
})
}
pub fn project_index(
&mut self,
project: Model<Project>,
cx: &mut AppContext,
) -> Model<ProjectIndex> {
self.project_indices
.entry(project.downgrade())
.or_insert_with(|| {
cx.new_model(|cx| {
ProjectIndex::new(
project,
self.db_connection.clone(),
self.embedding_provider.clone(),
cx,
)
})
})
.clone()
}
}
pub struct ProjectIndex {
db_connection: heed::Env,
project: Model<Project>,
worktree_indices: HashMap<EntityId, WorktreeIndexHandle>,
language_registry: Arc<LanguageRegistry>,
fs: Arc<dyn Fs>,
last_status: Status,
embedding_provider: Arc<dyn EmbeddingProvider>,
_subscription: Subscription,
}
enum WorktreeIndexHandle {
Loading {
_task: Task<Result<()>>,
},
Loaded {
index: Model<WorktreeIndex>,
_subscription: Subscription,
},
}
impl ProjectIndex {
fn new(
project: Model<Project>,
db_connection: heed::Env,
embedding_provider: Arc<dyn EmbeddingProvider>,
cx: &mut ModelContext<Self>,
) -> Self {
let language_registry = project.read(cx).languages().clone();
let fs = project.read(cx).fs().clone();
let mut this = ProjectIndex {
db_connection,
project: project.clone(),
worktree_indices: HashMap::default(),
language_registry,
fs,
last_status: Status::Idle,
embedding_provider,
_subscription: cx.subscribe(&project, Self::handle_project_event),
};
this.update_worktree_indices(cx);
this
}
fn handle_project_event(
&mut self,
_: Model<Project>,
event: &project::Event,
cx: &mut ModelContext<Self>,
) {
match event {
project::Event::WorktreeAdded | project::Event::WorktreeRemoved(_) => {
self.update_worktree_indices(cx);
}
_ => {}
}
}
fn update_worktree_indices(&mut self, cx: &mut ModelContext<Self>) {
let worktrees = self
.project
.read(cx)
.visible_worktrees(cx)
.filter_map(|worktree| {
if worktree.read(cx).is_local() {
Some((worktree.entity_id(), worktree))
} else {
None
}
})
.collect::<HashMap<_, _>>();
self.worktree_indices
.retain(|worktree_id, _| worktrees.contains_key(worktree_id));
for (worktree_id, worktree) in worktrees {
self.worktree_indices.entry(worktree_id).or_insert_with(|| {
let worktree_index = WorktreeIndex::load(
worktree.clone(),
self.db_connection.clone(),
self.language_registry.clone(),
self.fs.clone(),
self.embedding_provider.clone(),
cx,
);
let load_worktree = cx.spawn(|this, mut cx| async move {
if let Some(index) = worktree_index.await.log_err() {
this.update(&mut cx, |this, cx| {
this.worktree_indices.insert(
worktree_id,
WorktreeIndexHandle::Loaded {
_subscription: cx
.observe(&index, |this, _, cx| this.update_status(cx)),
index,
},
);
})?;
} else {
this.update(&mut cx, |this, _cx| {
this.worktree_indices.remove(&worktree_id)
})?;
}
this.update(&mut cx, |this, cx| this.update_status(cx))
});
WorktreeIndexHandle::Loading {
_task: load_worktree,
}
});
}
self.update_status(cx);
}
fn update_status(&mut self, cx: &mut ModelContext<Self>) {
let mut status = Status::Idle;
for index in self.worktree_indices.values() {
match index {
WorktreeIndexHandle::Loading { .. } => {
status = Status::Scanning;
break;
}
WorktreeIndexHandle::Loaded { index, .. } => {
if index.read(cx).status == Status::Scanning {
status = Status::Scanning;
break;
}
}
}
}
if status != self.last_status {
self.last_status = status;
cx.emit(status);
}
}
pub fn search(&self, query: &str, limit: usize, cx: &AppContext) -> Task<Vec<SearchResult>> {
let mut worktree_searches = Vec::new();
for worktree_index in self.worktree_indices.values() {
if let WorktreeIndexHandle::Loaded { index, .. } = worktree_index {
worktree_searches
.push(index.read_with(cx, |index, cx| index.search(query, limit, cx)));
}
}
cx.spawn(|_| async move {
let mut results = Vec::new();
let worktree_searches = futures::future::join_all(worktree_searches).await;
for worktree_search_results in worktree_searches {
if let Some(worktree_search_results) = worktree_search_results.log_err() {
results.extend(worktree_search_results);
}
}
results
.sort_unstable_by(|a, b| b.score.partial_cmp(&a.score).unwrap_or(Ordering::Equal));
results.truncate(limit);
results
})
}
}
pub struct SearchResult {
pub worktree: Model<Worktree>,
pub path: Arc<Path>,
pub range: Range<usize>,
pub score: f32,
}
#[derive(Copy, Clone, Debug, Eq, PartialEq)]
pub enum Status {
Idle,
Scanning,
}
impl EventEmitter<Status> for ProjectIndex {}
struct WorktreeIndex {
worktree: Model<Worktree>,
db_connection: heed::Env,
db: heed::Database<Str, SerdeBincode<EmbeddedFile>>,
language_registry: Arc<LanguageRegistry>,
fs: Arc<dyn Fs>,
embedding_provider: Arc<dyn EmbeddingProvider>,
status: Status,
_index_entries: Task<Result<()>>,
_subscription: Subscription,
}
impl WorktreeIndex {
pub fn load(
worktree: Model<Worktree>,
db_connection: heed::Env,
language_registry: Arc<LanguageRegistry>,
fs: Arc<dyn Fs>,
embedding_provider: Arc<dyn EmbeddingProvider>,
cx: &mut AppContext,
) -> Task<Result<Model<Self>>> {
let worktree_abs_path = worktree.read(cx).abs_path();
cx.spawn(|mut cx| async move {
let db = cx
.background_executor()
.spawn({
let db_connection = db_connection.clone();
async move {
let mut txn = db_connection.write_txn()?;
let db_name = worktree_abs_path.to_string_lossy();
let db = db_connection.create_database(&mut txn, Some(&db_name))?;
txn.commit()?;
anyhow::Ok(db)
}
})
.await?;
cx.new_model(|cx| {
Self::new(
worktree,
db_connection,
db,
language_registry,
fs,
embedding_provider,
cx,
)
})
})
}
fn new(
worktree: Model<Worktree>,
db_connection: heed::Env,
db: heed::Database<Str, SerdeBincode<EmbeddedFile>>,
language_registry: Arc<LanguageRegistry>,
fs: Arc<dyn Fs>,
embedding_provider: Arc<dyn EmbeddingProvider>,
cx: &mut ModelContext<Self>,
) -> Self {
let (updated_entries_tx, updated_entries_rx) = channel::unbounded();
let _subscription = cx.subscribe(&worktree, move |_this, _worktree, event, _cx| {
if let worktree::Event::UpdatedEntries(update) = event {
_ = updated_entries_tx.try_send(update.clone());
}
});
Self {
db_connection,
db,
worktree,
language_registry,
fs,
embedding_provider,
status: Status::Idle,
_index_entries: cx.spawn(|this, cx| Self::index_entries(this, updated_entries_rx, cx)),
_subscription,
}
}
async fn index_entries(
this: WeakModel<Self>,
updated_entries: channel::Receiver<UpdatedEntriesSet>,
mut cx: AsyncAppContext,
) -> Result<()> {
let index = this.update(&mut cx, |this, cx| {
cx.notify();
this.status = Status::Scanning;
this.index_entries_changed_on_disk(cx)
})?;
index.await.log_err();
this.update(&mut cx, |this, cx| {
this.status = Status::Idle;
cx.notify();
})?;
while let Ok(updated_entries) = updated_entries.recv().await {
let index = this.update(&mut cx, |this, cx| {
cx.notify();
this.status = Status::Scanning;
this.index_updated_entries(updated_entries, cx)
})?;
index.await.log_err();
this.update(&mut cx, |this, cx| {
this.status = Status::Idle;
cx.notify();
})?;
}
Ok(())
}
fn index_entries_changed_on_disk(&self, cx: &AppContext) -> impl Future<Output = Result<()>> {
let worktree = self.worktree.read(cx).as_local().unwrap().snapshot();
let worktree_abs_path = worktree.abs_path().clone();
let scan = self.scan_entries(worktree.clone(), cx);
let chunk = self.chunk_files(worktree_abs_path, scan.updated_entries, cx);
let embed = self.embed_files(chunk.files, cx);
let persist = self.persist_embeddings(scan.deleted_entry_ranges, embed.files, cx);
async move {
futures::try_join!(scan.task, chunk.task, embed.task, persist)?;
Ok(())
}
}
fn index_updated_entries(
&self,
updated_entries: UpdatedEntriesSet,
cx: &AppContext,
) -> impl Future<Output = Result<()>> {
let worktree = self.worktree.read(cx).as_local().unwrap().snapshot();
let worktree_abs_path = worktree.abs_path().clone();
let scan = self.scan_updated_entries(worktree, updated_entries, cx);
let chunk = self.chunk_files(worktree_abs_path, scan.updated_entries, cx);
let embed = self.embed_files(chunk.files, cx);
let persist = self.persist_embeddings(scan.deleted_entry_ranges, embed.files, cx);
async move {
futures::try_join!(scan.task, chunk.task, embed.task, persist)?;
Ok(())
}
}
fn scan_entries(&self, worktree: LocalSnapshot, cx: &AppContext) -> ScanEntries {
let (updated_entries_tx, updated_entries_rx) = channel::bounded(512);
let (deleted_entry_ranges_tx, deleted_entry_ranges_rx) = channel::bounded(128);
let db_connection = self.db_connection.clone();
let db = self.db;
let task = cx.background_executor().spawn(async move {
let txn = db_connection
.read_txn()
.context("failed to create read transaction")?;
let mut db_entries = db
.iter(&txn)
.context("failed to create iterator")?
.move_between_keys()
.peekable();
let mut deletion_range: Option<(Bound<&str>, Bound<&str>)> = None;
for entry in worktree.files(false, 0) {
let entry_db_key = db_key_for_path(&entry.path);
let mut saved_mtime = None;
while let Some(db_entry) = db_entries.peek() {
match db_entry {
Ok((db_path, db_embedded_file)) => match (*db_path).cmp(&entry_db_key) {
Ordering::Less => {
if let Some(deletion_range) = deletion_range.as_mut() {
deletion_range.1 = Bound::Included(db_path);
} else {
deletion_range =
Some((Bound::Included(db_path), Bound::Included(db_path)));
}
db_entries.next();
}
Ordering::Equal => {
if let Some(deletion_range) = deletion_range.take() {
deleted_entry_ranges_tx
.send((
deletion_range.0.map(ToString::to_string),
deletion_range.1.map(ToString::to_string),
))
.await?;
}
saved_mtime = db_embedded_file.mtime;
db_entries.next();
break;
}
Ordering::Greater => {
break;
}
},
Err(_) => return Err(db_entries.next().unwrap().unwrap_err())?,
}
}
if entry.mtime != saved_mtime {
updated_entries_tx.send(entry.clone()).await?;
}
}
if let Some(db_entry) = db_entries.next() {
let (db_path, _) = db_entry?;
deleted_entry_ranges_tx
.send((Bound::Included(db_path.to_string()), Bound::Unbounded))
.await?;
}
Ok(())
});
ScanEntries {
updated_entries: updated_entries_rx,
deleted_entry_ranges: deleted_entry_ranges_rx,
task,
}
}
fn scan_updated_entries(
&self,
worktree: LocalSnapshot,
updated_entries: UpdatedEntriesSet,
cx: &AppContext,
) -> ScanEntries {
let (updated_entries_tx, updated_entries_rx) = channel::bounded(512);
let (deleted_entry_ranges_tx, deleted_entry_ranges_rx) = channel::bounded(128);
let task = cx.background_executor().spawn(async move {
for (path, entry_id, status) in updated_entries.iter() {
match status {
project::PathChange::Added
| project::PathChange::Updated
| project::PathChange::AddedOrUpdated => {
if let Some(entry) = worktree.entry_for_id(*entry_id) {
updated_entries_tx.send(entry.clone()).await?;
}
}
project::PathChange::Removed => {
let db_path = db_key_for_path(path);
deleted_entry_ranges_tx
.send((Bound::Included(db_path.clone()), Bound::Included(db_path)))
.await?;
}
project::PathChange::Loaded => {
// Do nothing.
}
}
}
Ok(())
});
ScanEntries {
updated_entries: updated_entries_rx,
deleted_entry_ranges: deleted_entry_ranges_rx,
task,
}
}
fn chunk_files(
&self,
worktree_abs_path: Arc<Path>,
entries: channel::Receiver<Entry>,
cx: &AppContext,
) -> ChunkFiles {
let language_registry = self.language_registry.clone();
let fs = self.fs.clone();
let (chunked_files_tx, chunked_files_rx) = channel::bounded(2048);
let task = cx.spawn(|cx| async move {
cx.background_executor()
.scoped(|cx| {
for _ in 0..cx.num_cpus() {
cx.spawn(async {
while let Ok(entry) = entries.recv().await {
let entry_abs_path = worktree_abs_path.join(&entry.path);
let Some(text) = fs.load(&entry_abs_path).await.log_err() else {
continue;
};
let language = language_registry
.language_for_file_path(&entry.path)
.await
.ok();
let grammar =
language.as_ref().and_then(|language| language.grammar());
let chunked_file = ChunkedFile {
worktree_root: worktree_abs_path.clone(),
chunks: chunk_text(&text, grammar),
entry,
text,
};
if chunked_files_tx.send(chunked_file).await.is_err() {
return;
}
}
});
}
})
.await;
Ok(())
});
ChunkFiles {
files: chunked_files_rx,
task,
}
}
fn embed_files(
&self,
chunked_files: channel::Receiver<ChunkedFile>,
cx: &AppContext,
) -> EmbedFiles {
let embedding_provider = self.embedding_provider.clone();
let (embedded_files_tx, embedded_files_rx) = channel::bounded(512);
let task = cx.background_executor().spawn(async move {
let mut chunked_file_batches =
chunked_files.chunks_timeout(512, Duration::from_secs(2));
while let Some(chunked_files) = chunked_file_batches.next().await {
// Flatten the batch of files into a single vec of chunks that can be
// subdivided into batch-sized pieces; once embedded, reassemble the
// results back into the files they belong to.
let chunks = chunked_files
.iter()
.flat_map(|file| {
file.chunks.iter().map(|chunk| TextToEmbed {
text: &file.text[chunk.range.clone()],
digest: chunk.digest,
})
})
.collect::<Vec<_>>();
let mut embeddings = Vec::new();
for embedding_batch in chunks.chunks(embedding_provider.batch_size()) {
// todo!("add a retry facility")
embeddings.extend(embedding_provider.embed(embedding_batch).await?);
}
let mut embeddings = embeddings.into_iter();
for chunked_file in chunked_files {
let chunk_embeddings = embeddings
.by_ref()
.take(chunked_file.chunks.len())
.collect::<Vec<_>>();
let embedded_chunks = chunked_file
.chunks
.into_iter()
.zip(chunk_embeddings)
.map(|(chunk, embedding)| EmbeddedChunk { chunk, embedding })
.collect();
let embedded_file = EmbeddedFile {
path: chunked_file.entry.path.clone(),
mtime: chunked_file.entry.mtime,
chunks: embedded_chunks,
};
embedded_files_tx.send(embedded_file).await?;
}
}
Ok(())
});
EmbedFiles {
files: embedded_files_rx,
task,
}
}
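The flatten → batch → reassemble step in `embed_files` can be sketched in isolation. A hedged, single-threaded toy (names are illustrative) showing how `by_ref().take(n)` hands the flat embedding results back out per file:

```rust
/// Give each file back as many results as it contributed chunks, consuming
/// the flat result list in order via `by_ref().take(n)`.
fn reassemble(files: &[Vec<&str>], flat_results: Vec<usize>) -> Vec<Vec<usize>> {
    let mut results = flat_results.into_iter();
    files
        .iter()
        .map(|chunks| results.by_ref().take(chunks.len()).collect())
        .collect()
}
```

Because `by_ref()` borrows the iterator, each `take(n)` resumes where the previous file left off, so per-file order is preserved without any index bookkeeping.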
fn persist_embeddings(
&self,
mut deleted_entry_ranges: channel::Receiver<(Bound<String>, Bound<String>)>,
embedded_files: channel::Receiver<EmbeddedFile>,
cx: &AppContext,
) -> Task<Result<()>> {
let db_connection = self.db_connection.clone();
let db = self.db;
cx.background_executor().spawn(async move {
while let Some(deletion_range) = deleted_entry_ranges.next().await {
let mut txn = db_connection.write_txn()?;
let start = deletion_range.0.as_ref().map(|start| start.as_str());
let end = deletion_range.1.as_ref().map(|end| end.as_str());
log::debug!("deleting embeddings in range {:?}", &(start, end));
db.delete_range(&mut txn, &(start, end))?;
txn.commit()?;
}
let mut embedded_files = embedded_files.chunks_timeout(4096, Duration::from_secs(2));
while let Some(embedded_files) = embedded_files.next().await {
let mut txn = db_connection.write_txn()?;
for file in embedded_files {
log::debug!("saving embedding for file {:?}", file.path);
let key = db_key_for_path(&file.path);
db.put(&mut txn, &key, &file)?;
}
txn.commit()?;
log::debug!("committed");
}
Ok(())
})
}
fn search(
&self,
query: &str,
limit: usize,
cx: &AppContext,
) -> Task<Result<Vec<SearchResult>>> {
let (chunks_tx, chunks_rx) = channel::bounded(1024);
let db_connection = self.db_connection.clone();
let db = self.db;
let scan_chunks = cx.background_executor().spawn({
async move {
let txn = db_connection
.read_txn()
.context("failed to create read transaction")?;
let db_entries = db.iter(&txn).context("failed to iterate database")?;
for db_entry in db_entries {
let (_, db_embedded_file) = db_entry?;
for chunk in db_embedded_file.chunks {
chunks_tx
.send((db_embedded_file.path.clone(), chunk))
.await?;
}
}
anyhow::Ok(())
}
});
let query = query.to_string();
let embedding_provider = self.embedding_provider.clone();
let worktree = self.worktree.clone();
cx.spawn(|cx| async move {
#[cfg(debug_assertions)]
let embedding_query_start = std::time::Instant::now();
let mut query_embeddings = embedding_provider
.embed(&[TextToEmbed::new(&query)])
.await?;
let query_embedding = query_embeddings
.pop()
.ok_or_else(|| anyhow!("no embedding for query"))?;
let mut workers = Vec::new();
for _ in 0..cx.background_executor().num_cpus() {
workers.push(Vec::<SearchResult>::new());
}
#[cfg(debug_assertions)]
let search_start = std::time::Instant::now();
cx.background_executor()
.scoped(|cx| {
for worker_results in workers.iter_mut() {
cx.spawn(async {
while let Ok((path, embedded_chunk)) = chunks_rx.recv().await {
let score = embedded_chunk.embedding.similarity(&query_embedding);
let ix = match worker_results.binary_search_by(|probe| {
score.partial_cmp(&probe.score).unwrap_or(Ordering::Equal)
}) {
Ok(ix) | Err(ix) => ix,
};
worker_results.insert(
ix,
SearchResult {
worktree: worktree.clone(),
path: path.clone(),
range: embedded_chunk.chunk.range.clone(),
score,
},
);
worker_results.truncate(limit);
}
});
}
})
.await;
scan_chunks.await?;
let mut search_results = Vec::with_capacity(workers.len() * limit);
for worker_results in workers {
search_results.extend(worker_results);
}
search_results
.sort_unstable_by(|a, b| b.score.partial_cmp(&a.score).unwrap_or(Ordering::Equal));
search_results.truncate(limit);
#[cfg(debug_assertions)]
{
let search_elapsed = search_start.elapsed();
log::debug!(
"searched {} entries in {:?}",
search_results.len(),
search_elapsed
);
let embedding_query_elapsed = embedding_query_start.elapsed();
log::debug!("embedding query took {:?}", embedding_query_elapsed);
}
Ok(search_results)
})
}
}
struct ScanEntries {
updated_entries: channel::Receiver<Entry>,
deleted_entry_ranges: channel::Receiver<(Bound<String>, Bound<String>)>,
task: Task<Result<()>>,
}
struct ChunkFiles {
files: channel::Receiver<ChunkedFile>,
task: Task<Result<()>>,
}
struct ChunkedFile {
#[allow(dead_code)]
pub worktree_root: Arc<Path>,
pub entry: Entry,
pub text: String,
pub chunks: Vec<Chunk>,
}
struct EmbedFiles {
files: channel::Receiver<EmbeddedFile>,
task: Task<Result<()>>,
}
#[derive(Debug, Serialize, Deserialize)]
struct EmbeddedFile {
path: Arc<Path>,
mtime: Option<SystemTime>,
chunks: Vec<EmbeddedChunk>,
}
#[derive(Debug, Serialize, Deserialize)]
struct EmbeddedChunk {
chunk: Chunk,
embedding: Embedding,
}
fn db_key_for_path(path: &Arc<Path>) -> String {
path.to_string_lossy().replace('/', "\0")
}
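Replacing `/` with NUL in the key makes plain string comparison agree with component-wise path ordering, which the merge-join in `scan_entries` appears to depend on: a directory's children must sort before a sibling like `dir.txt`. A small sketch over `&str` (the real function takes `&Arc<Path>`):

```rust
/// Simplified key builder: NUL (0x00) sorts before every printable byte,
/// so "dir\0file" < "dir.txt", matching path-component order.
fn db_key(path: &str) -> String {
    path.replace('/', "\0")
}
```

With `/` (0x2F) left intact, `"dir/file"` would sort after `"dir.txt"`, because `/` compares greater than `.` (0x2E).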
#[cfg(test)]
mod tests {
use super::*;
use futures::channel::oneshot;
use futures::{future::BoxFuture, FutureExt};
use gpui::{Global, TestAppContext};
use language::language_settings::AllLanguageSettings;
use project::Project;
use settings::SettingsStore;
use std::{future, path::Path, sync::Arc};
fn init_test(cx: &mut TestAppContext) {
_ = cx.update(|cx| {
let store = SettingsStore::test(cx);
cx.set_global(store);
language::init(cx);
Project::init_settings(cx);
SettingsStore::update(cx, |store, cx| {
store.update_user_settings::<AllLanguageSettings>(cx, |_| {});
});
});
}
pub struct TestEmbeddingProvider;
impl EmbeddingProvider for TestEmbeddingProvider {
fn embed<'a>(
&'a self,
texts: &'a [TextToEmbed<'a>],
) -> BoxFuture<'a, Result<Vec<Embedding>>> {
let embeddings = texts
.iter()
.map(|text| {
let mut embedding = vec![0f32; 2];
// if the text contains "garbage in", give it a high score (0.9) in the first dimension
if text.text.contains("garbage in") {
embedding[0] = 0.9;
} else {
embedding[0] = -0.9;
}
if text.text.contains("garbage out") {
embedding[1] = 0.9;
} else {
embedding[1] = -0.9;
}
Embedding::new(embedding)
})
.collect();
future::ready(Ok(embeddings)).boxed()
}
fn batch_size(&self) -> usize {
16
}
}
#[gpui::test]
async fn test_search(cx: &mut TestAppContext) {
cx.executor().allow_parking();
init_test(cx);
let temp_dir = tempfile::tempdir().unwrap();
let mut semantic_index = cx
.update(|cx| {
let semantic_index = SemanticIndex::new(
Path::new(temp_dir.path()),
Arc::new(TestEmbeddingProvider),
cx,
);
semantic_index
})
.await
.unwrap();
// todo!(): use a fixture
let project_path = Path::new("./fixture");
let project = cx
.spawn(|mut cx| async move { Project::example([project_path], &mut cx).await })
.await;
cx.update(|cx| {
let language_registry = project.read(cx).languages().clone();
let node_runtime = project.read(cx).node_runtime().unwrap().clone();
languages::init(language_registry, node_runtime, cx);
});
let project_index = cx.update(|cx| semantic_index.project_index(project.clone(), cx));
let (tx, rx) = oneshot::channel();
let mut tx = Some(tx);
let subscription = cx.update(|cx| {
cx.subscribe(&project_index, move |_, event, _| {
if let Some(tx) = tx.take() {
_ = tx.send(*event);
}
})
});
rx.await.expect("no event emitted");
drop(subscription);
let results = cx
.update(|cx| {
let project_index = project_index.read(cx);
let query = "garbage in, garbage out";
project_index.search(query, 4, cx)
})
.await;
assert!(results.len() > 1, "should have found some results");
for result in &results {
println!("result: {:?}", result.path);
println!("score: {:?}", result.score);
}
// Find a result whose score is greater than 0.9
let search_result = results.iter().find(|result| result.score > 0.9).unwrap();
assert_eq!(search_result.path.to_string_lossy(), "needle.md");
let content = cx
.update(|cx| {
let worktree = search_result.worktree.read(cx);
let entry_abs_path = worktree.abs_path().join(search_result.path.clone());
let fs = project.read(cx).fs().clone();
cx.spawn(|_| async move { fs.load(&entry_abs_path).await.unwrap() })
})
.await;
let range = search_result.range.clone();
let content = content[range.clone()].to_owned();
assert!(content.contains("garbage in, garbage out"));
}
}
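The scatter-gather ranking in `search` — each worker keeps a sorted, `limit`-capped buffer, then all buffers are merged and truncated — can be sketched over bare scores. `push_top_k` here uses `partition_point` where the original uses `binary_search_by`; both find the insertion index in a descending vec:

```rust
use std::cmp::Ordering;

/// Insert `score` into a descending-sorted buffer, keeping at most `limit` entries.
fn push_top_k(results: &mut Vec<f32>, score: f32, limit: usize) {
    let ix = results.partition_point(|probe| *probe > score);
    results.insert(ix, score);
    results.truncate(limit);
}

/// Merge per-worker buffers into one final descending top-`limit` list.
fn merge_top_k(workers: Vec<Vec<f32>>, limit: usize) -> Vec<f32> {
    let mut all: Vec<f32> = workers.into_iter().flatten().collect();
    all.sort_unstable_by(|a, b| b.partial_cmp(a).unwrap_or(Ordering::Equal));
    all.truncate(limit);
    all
}
```

Keeping each worker's buffer capped at `limit` bounds memory to `workers × limit` results regardless of how many chunks are scanned.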

View File

@@ -25,6 +25,8 @@ pub struct TaskId(pub String);
pub struct SpawnInTerminal {
/// Id of the task to use when determining task tab affinity.
pub id: TaskId,
/// Full unshortened form of `label` field.
pub full_label: String,
/// Human readable name of the terminal tab.
pub label: String,
/// Executable command to spawn.

View File

@@ -102,7 +102,8 @@ impl TaskTemplate {
let full_label = substitute_all_template_variables_in_str(&self.label, &task_variables)?;
let command = substitute_all_template_variables_in_str(&self.command, &task_variables)?;
let args = substitute_all_template_variables_in_vec(self.args.clone(), &task_variables)?;
let task_hash = to_hex_hash(self)
let task_hash = to_hex_hash(&self)
.context("hashing task template")
.log_err()?;
let variables_hash = to_hex_hash(&task_variables)
@@ -114,10 +115,11 @@ impl TaskTemplate {
Some(ResolvedTask {
id: id.clone(),
original_task: self.clone(),
resolved_label: full_label,
resolved_label: full_label.clone(),
resolved: Some(SpawnInTerminal {
id,
cwd,
full_label,
label: shortened_label,
command,
args,

View File

@@ -23,13 +23,19 @@ pub fn init(cx: &mut AppContext) {
workspace
.register_action(spawn_task_or_modal)
.register_action(move |workspace, action: &modal::Rerun, cx| {
if let Some((task_source_kind, last_scheduled_task)) =
if let Some((task_source_kind, mut last_scheduled_task)) =
workspace.project().update(cx, |project, cx| {
project.task_inventory().read(cx).last_scheduled_task()
})
{
if action.reevaluate_context {
let original_task = last_scheduled_task.original_task;
let mut original_task = last_scheduled_task.original_task;
if let Some(allow_concurrent_runs) = action.allow_concurrent_runs {
original_task.allow_concurrent_runs = allow_concurrent_runs;
}
if let Some(use_new_terminal) = action.use_new_terminal {
original_task.use_new_terminal = use_new_terminal;
}
let cwd = task_cwd(workspace, cx).log_err().flatten();
let task_context = task_context(workspace, cwd, cx);
schedule_task(
@@ -41,6 +47,15 @@ pub fn init(cx: &mut AppContext) {
cx,
)
} else {
if let Some(resolved) = last_scheduled_task.resolved.as_mut() {
if let Some(allow_concurrent_runs) = action.allow_concurrent_runs {
resolved.allow_concurrent_runs = allow_concurrent_runs;
}
if let Some(use_new_terminal) = action.use_new_terminal {
resolved.use_new_terminal = use_new_terminal;
}
}
schedule_resolved_task(
workspace,
task_source_kind,

View File

@@ -38,12 +38,20 @@ impl Spawn {
/// Rerun last task
#[derive(PartialEq, Clone, Deserialize, Default)]
pub struct Rerun {
#[serde(default)]
/// Controls whether the task context is reevaluated prior to execution of a task.
/// If it is not, environment variables such as ZED_COLUMN and ZED_FILE will be the same as in the last execution of the task.
/// If it is, these variables will be updated to reflect the current state of the editor at the time task::Rerun is executed.
/// default: false
#[serde(default)]
pub reevaluate_context: bool,
/// Overrides the `allow_concurrent_runs` property of the task being rerun.
/// Default: null
#[serde(default)]
pub allow_concurrent_runs: Option<bool>,
/// Overrides the `use_new_terminal` property of the task being rerun.
/// Default: null
#[serde(default)]
pub use_new_terminal: Option<bool>,
}
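The two `Option<bool>` fields above follow a common override pattern: `Some` replaces the stored property, `None` leaves it untouched, mirroring the `if let Some(...)` assignments in the rerun handler. A hypothetical, minimal version (struct names are illustrative, not the crate's types):

```rust
#[derive(Default)]
struct TaskProps {
    allow_concurrent_runs: bool,
    use_new_terminal: bool,
}

#[derive(Default)]
struct RerunAction {
    allow_concurrent_runs: Option<bool>,
    use_new_terminal: Option<bool>,
}

/// Apply only the overrides that were explicitly set on the action.
fn apply_overrides(task: &mut TaskProps, action: &RerunAction) {
    if let Some(v) = action.allow_concurrent_runs {
        task.allow_concurrent_runs = v;
    }
    if let Some(v) = action.use_new_terminal {
        task.use_new_terminal = v;
    }
}
```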
impl_actions!(task, [Rerun, Spawn]);

View File

@@ -288,6 +288,7 @@ impl Display for TerminalError {
pub struct SpawnTask {
pub id: TaskId,
pub full_label: String,
pub label: String,
pub command: String,
pub args: Vec<String>,
@@ -594,6 +595,7 @@ pub struct Terminal {
pub struct TaskState {
pub id: TaskId,
pub full_label: String,
pub label: String,
pub status: TaskStatus,
pub completion_rx: Receiver<()>,
@@ -1363,7 +1365,7 @@ impl Terminal {
if truncate {
truncate_and_trailoff(&task_state.label, MAX_CHARS)
} else {
task_state.label.clone()
task_state.full_label.clone()
}
}
None => self

View File

@@ -296,9 +296,10 @@ impl TerminalPanel {
})
}
pub fn spawn_task(&mut self, spawn_in_terminal: &SpawnInTerminal, cx: &mut ViewContext<Self>) {
fn spawn_task(&mut self, spawn_in_terminal: &SpawnInTerminal, cx: &mut ViewContext<Self>) {
let mut spawn_task = SpawnTask {
id: spawn_in_terminal.id.clone(),
full_label: spawn_in_terminal.full_label.clone(),
label: spawn_in_terminal.label.clone(),
command: spawn_in_terminal.command.clone(),
args: spawn_in_terminal.args.clone(),
@@ -334,7 +335,7 @@ impl TerminalPanel {
return;
}
let terminals_for_task = self.terminals_for_task(&spawn_in_terminal.id, cx);
let terminals_for_task = self.terminals_for_task(&spawn_in_terminal.full_label, cx);
if terminals_for_task.is_empty() {
self.spawn_in_new_terminal(spawn_task, working_directory, cx);
return;
@@ -435,7 +436,7 @@ impl TerminalPanel {
fn terminals_for_task(
&self,
id: &TaskId,
label: &str,
cx: &mut AppContext,
) -> Vec<(usize, View<TerminalView>)> {
self.pane
@@ -445,7 +446,7 @@ impl TerminalPanel {
.filter_map(|(index, item)| Some((index, item.act_as::<TerminalView>(cx)?)))
.filter_map(|(index, terminal_view)| {
let task_state = terminal_view.read(cx).terminal().read(cx).task()?;
if &task_state.id == id {
if &task_state.full_label == label {
Some((index, terminal_view))
} else {
None

View File

@@ -1,4 +1,4 @@
use gpui::{anchored, Action, AnyView, IntoElement, Render, VisualContext};
use gpui::{Action, AnyView, IntoElement, Render, VisualContext};
use settings::Settings;
use theme::ThemeSettings;
@@ -90,18 +90,17 @@ pub fn tooltip_container<V>(
f: impl FnOnce(Div, &mut ViewContext<V>) -> Div,
) -> impl IntoElement {
let ui_font = ThemeSettings::get_global(cx).ui_font.family.clone();
// padding to avoid mouse cursor
anchored().child(
div().pl_2().pt_2p5().child(
v_flex()
.elevation_2(cx)
.font(ui_font)
.text_ui()
.text_color(cx.theme().colors().text)
.py_1()
.px_2()
.map(|el| f(el, cx)),
),
// padding to avoid tooltip appearing right below the mouse cursor
div().pl_2().pt_2p5().child(
v_flex()
.elevation_2(cx)
.font(ui_font)
.text_ui()
.text_color(cx.theme().colors().text)
.py_1()
.px_2()
.map(|el| f(el, cx)),
)
}

View File

@@ -71,19 +71,28 @@ impl HttpClientWithUrl {
}
impl HttpClient for Arc<HttpClientWithUrl> {
fn send(&self, req: Request<AsyncBody>) -> BoxFuture<Result<Response<AsyncBody>, Error>> {
fn send(
&self,
req: Request<AsyncBody>,
) -> BoxFuture<'static, Result<Response<AsyncBody>, Error>> {
self.client.send(req)
}
}
impl HttpClient for HttpClientWithUrl {
fn send(&self, req: Request<AsyncBody>) -> BoxFuture<Result<Response<AsyncBody>, Error>> {
fn send(
&self,
req: Request<AsyncBody>,
) -> BoxFuture<'static, Result<Response<AsyncBody>, Error>> {
self.client.send(req)
}
}
pub trait HttpClient: Send + Sync {
fn send(&self, req: Request<AsyncBody>) -> BoxFuture<Result<Response<AsyncBody>, Error>>;
fn send(
&self,
req: Request<AsyncBody>,
) -> BoxFuture<'static, Result<Response<AsyncBody>, Error>>;
fn get<'a>(
&'a self,
@@ -135,8 +144,12 @@ pub fn client() -> Arc<dyn HttpClient> {
}
impl HttpClient for isahc::HttpClient {
fn send(&self, req: Request<AsyncBody>) -> BoxFuture<Result<Response<AsyncBody>, Error>> {
Box::pin(async move { self.send_async(req).await })
fn send(
&self,
req: Request<AsyncBody>,
) -> BoxFuture<'static, Result<Response<AsyncBody>, Error>> {
let client = self.clone();
Box::pin(async move { client.send_async(req).await })
}
}
@@ -196,7 +209,10 @@ impl fmt::Debug for FakeHttpClient {
#[cfg(feature = "test-support")]
impl HttpClient for FakeHttpClient {
fn send(&self, req: Request<AsyncBody>) -> BoxFuture<Result<Response<AsyncBody>, Error>> {
fn send(
&self,
req: Request<AsyncBody>,
) -> BoxFuture<'static, Result<Response<AsyncBody>, Error>> {
let future = (self.handler)(req);
Box::pin(async move { future.await.map(Into::into) })
}
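The recurring change in this file is the move from a borrowed `BoxFuture<'_, _>` to `BoxFuture<'static, _>`: cloning the client before the `async move` block means the future owns everything it needs and no longer borrows `&self`. A standalone sketch of that lifetime technique (the `Client` type and the tiny `block_on` executor are illustrative, not part of the crate):

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

type BoxFuture<'a, T> = Pin<Box<dyn Future<Output = T> + Send + 'a>>;

#[derive(Clone)]
struct Client;

impl Client {
    /// Cloning first lets the async block own its client, so the returned
    /// future can be 'static instead of borrowing `&self`.
    fn send(&self, req: String) -> BoxFuture<'static, String> {
        let client = self.clone();
        Box::pin(async move { client.do_send(req) })
    }

    fn do_send(&self, req: String) -> String {
        format!("sent: {req}")
    }
}

/// Minimal executor for the demo: polls the (immediately ready) future
/// with a no-op waker until it completes.
fn block_on<F: Future>(fut: F) -> F::Output {
    fn raw() -> RawWaker {
        static VTABLE: RawWakerVTable = RawWakerVTable::new(|_| raw(), |_| {}, |_| {}, |_| {});
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    let waker = unsafe { Waker::from_raw(raw()) };
    let mut cx = Context::from_waker(&waker);
    let mut fut = Box::pin(fut);
    loop {
        if let Poll::Ready(out) = fut.as_mut().poll(&mut cx) {
            return out;
        }
    }
}
```

Without the clone, `async move { self.send_async(req).await }` would capture `&self`, tying the future's lifetime to the borrow and making the `'static` signature impossible.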

View File

@@ -6,6 +6,7 @@ use crate::{state::Mode, Vim};
/// The ModeIndicator displays the current mode in the status bar.
pub struct ModeIndicator {
pub(crate) mode: Option<Mode>,
pub(crate) operators: String,
_subscription: Subscription,
}
@@ -15,6 +16,7 @@ impl ModeIndicator {
let _subscription = cx.observe_global::<Vim>(|this, cx| this.update_mode(cx));
let mut this = Self {
mode: None,
operators: "".to_string(),
_subscription,
};
this.update_mode(cx);
@@ -29,10 +31,20 @@ impl ModeIndicator {
if vim.enabled {
self.mode = Some(vim.state().mode);
self.operators = self.current_operators_description(&vim);
} else {
self.mode = None;
}
}
fn current_operators_description(&self, vim: &Vim) -> String {
vim.state()
.operator_stack
.iter()
.map(|item| item.id())
.collect::<Vec<_>>()
.join("")
}
}
impl Render for ModeIndicator {
@@ -41,7 +53,7 @@ impl Render for ModeIndicator {
return div().into_any();
};
Label::new(format!("-- {} --", mode))
Label::new(format!("{} -- {} --", self.operators, mode))
.size(LabelSize::Small)
.into_any_element()
}

View File

@@ -517,6 +517,17 @@ impl Vim {
) {
self.start_recording(cx)
};
// Since these operations can only be entered with pre-operators,
// clear any previous operators when pushing so that the operator
// stack accurately reflects the current state.
if matches!(
operator,
Operator::AddSurrounds { .. }
| Operator::ChangeSurrounds { .. }
| Operator::DeleteSurrounds
) {
self.clear_operator(cx);
};
self.update_state(|state| state.operator_stack.push(operator));
self.sync_vim_settings(cx);
}

View File

@@ -818,11 +818,10 @@ impl Pane {
if let Some(prev_item) = self.items.get(prev_active_item_ix) {
prev_item.deactivated(cx);
}
cx.emit(Event::ActivateItem {
local: activate_pane,
});
}
cx.emit(Event::ActivateItem {
local: activate_pane,
});
if let Some(newly_active_item) = self.items.get(index) {
self.activation_history

View File

@@ -55,6 +55,8 @@ pub trait SearchableItem: Item + EventEmitter<SearchEvent> {
}
}
fn search_bar_visibility_changed(&mut self, _visible: bool, _cx: &mut ViewContext<Self>) {}
fn clear_matches(&mut self, cx: &mut ViewContext<Self>);
fn update_matches(&mut self, matches: &[Self::Match], cx: &mut ViewContext<Self>);
fn query_suggestion(&mut self, cx: &mut ViewContext<Self>) -> String;
@@ -131,6 +133,7 @@ pub trait SearchableItemHandle: ItemHandle {
matches: &AnyVec<dyn Send>,
cx: &mut WindowContext,
) -> Option<usize>;
fn search_bar_visibility_changed(&self, visible: bool, cx: &mut WindowContext);
}
impl<T: SearchableItem> SearchableItemHandle for View<T> {
@@ -227,6 +230,12 @@ impl<T: SearchableItem> SearchableItemHandle for View<T> {
let mat = mat.downcast_ref().unwrap();
self.update(cx, |this, cx| this.replace(mat, query, cx))
}
fn search_bar_visibility_changed(&self, visible: bool, cx: &mut WindowContext) {
self.update(cx, |this, cx| {
this.search_bar_visibility_changed(visible, cx)
});
}
}
impl From<Box<dyn SearchableItemHandle>> for AnyView {

View File

@@ -31,9 +31,8 @@ use gpui::{
use ignore::IgnoreStack;
use itertools::Itertools;
use language::{
proto::{deserialize_version, serialize_fingerprint, serialize_line_ending, serialize_version},
Buffer, Capability, DiagnosticEntry, File as _, LineEnding, PointUtf16, Rope, RopeFingerprint,
Unclipped,
proto::{deserialize_version, serialize_line_ending, serialize_version},
Buffer, Capability, DiagnosticEntry, File as _, LineEnding, PointUtf16, Rope, Unclipped,
};
use lsp::{DiagnosticSeverity, LanguageServerId};
use parking_lot::Mutex;
@@ -1175,7 +1174,6 @@ impl LocalWorktree {
}
let text = buffer.as_rope().clone();
let fingerprint = text.fingerprint();
let version = buffer.version();
let save = self.write_file(path.as_ref(), text, buffer.line_ending(), cx);
let fs = Arc::clone(&self.fs);
@@ -1238,12 +1236,11 @@ impl LocalWorktree {
buffer_id,
version: serialize_version(&version),
mtime: mtime.map(|time| time.into()),
fingerprint: serialize_fingerprint(fingerprint),
})?;
}
buffer_handle.update(&mut cx, |buffer, cx| {
buffer.did_save(version.clone(), fingerprint, mtime, cx);
buffer.did_save(version.clone(), mtime, cx);
})?;
Ok(())
@@ -1644,11 +1641,10 @@ impl RemoteWorktree {
})
.await?;
let version = deserialize_version(&response.version);
let fingerprint = RopeFingerprint::default();
let mtime = response.mtime.map(|mtime| mtime.into());
buffer_handle.update(&mut cx, |buffer, cx| {
buffer.did_save(version.clone(), fingerprint, mtime, cx);
buffer.did_save(version.clone(), mtime, cx);
})?;
Ok(())
@@ -2083,7 +2079,7 @@ impl Snapshot {
.map(|entry| &entry.path)
}
fn child_entries<'a>(&'a self, parent_path: &'a Path) -> ChildEntriesIter<'a> {
pub fn child_entries<'a>(&'a self, parent_path: &'a Path) -> ChildEntriesIter<'a> {
let mut cursor = self.entries_by_path.cursor();
cursor.seek(&TraversalTarget::Path(parent_path), Bias::Right, &());
let traversal = Traversal {
@@ -3030,7 +3026,6 @@ impl language::LocalFile for File {
&self,
buffer_id: BufferId,
version: &clock::Global,
fingerprint: RopeFingerprint,
line_ending: LineEnding,
mtime: Option<SystemTime>,
cx: &mut AppContext,
@@ -3044,7 +3039,6 @@ impl language::LocalFile for File {
buffer_id: buffer_id.into(),
version: serialize_version(version),
mtime: mtime.map(|time| time.into()),
fingerprint: serialize_fingerprint(fingerprint),
line_ending: serialize_line_ending(line_ending) as i32,
})
.log_err();
@@ -4712,7 +4706,7 @@ impl<'a, 'b> SeekTarget<'a, EntrySummary, (TraversalProgress<'a>, GitStatuses)>
}
}
struct ChildEntriesIter<'a> {
pub struct ChildEntriesIter<'a> {
parent_path: &'a Path,
traversal: Traversal<'a>,
}


@@ -1,6 +1,6 @@
[package]
name = "zed_php"
version = "0.0.1"
version = "0.0.2"
edition = "2021"
publish = false
license = "Apache-2.0"


@@ -1,7 +1,7 @@
id = "php"
name = "PHP"
description = "PHP support."
version = "0.0.1"
version = "0.0.2"
schema_version = 1
authors = ["Piotr Osiewicz <piotr@zed.dev>"]
repository = "https://github.com/zed-industries/zed"


@@ -0,0 +1,16 @@
[package]
name = "zed_terraform"
version = "0.0.1"
edition = "2021"
publish = false
license = "Apache-2.0"
[lints]
workspace = true
[lib]
path = "src/terraform.rs"
crate-type = ["cdylib"]
[dependencies]
zed_extension_api = "0.0.6"


@@ -0,0 +1 @@
../../LICENSE-APACHE


@@ -0,0 +1,16 @@
id = "terraform"
name = "Terraform"
description = "Terraform support."
version = "0.0.1"
schema_version = 1
authors = ["Caius Durling <dev@caius.name>", "Daniel Banck <dbanck@users.noreply.github.com>"]
repository = "https://github.com/zed-industries/zed"
[language_servers.terraform-ls]
name = "Terraform Language Server"
languages = ["Terraform", "Terraform Vars"]
language_ids = { Terraform = "terraform", "Terraform Vars" = "terraform-vars" }
[grammars.hcl]
repository = "https://github.com/MichaHoffmann/tree-sitter-hcl"
commit = "e936d3fef8bac884661472dce71ad82284761eb1"


@@ -0,0 +1,101 @@
use std::fs;
use zed::LanguageServerId;
use zed_extension_api::{self as zed, Result};
struct TerraformExtension {
cached_binary_path: Option<String>,
}
impl TerraformExtension {
fn language_server_binary_path(
&mut self,
language_server_id: &LanguageServerId,
worktree: &zed::Worktree,
) -> Result<String> {
if let Some(path) = &self.cached_binary_path {
if fs::metadata(path).map_or(false, |stat| stat.is_file()) {
return Ok(path.clone());
}
}
if let Some(path) = worktree.which("terraform-ls") {
self.cached_binary_path = Some(path.clone());
return Ok(path);
}
zed::set_language_server_installation_status(
&language_server_id,
&zed::LanguageServerInstallationStatus::CheckingForUpdate,
);
let release = zed::latest_github_release(
"hashicorp/terraform-ls",
zed::GithubReleaseOptions {
require_assets: false,
pre_release: false,
},
)?;
let (platform, arch) = zed::current_platform();
let download_url = format!(
"https://releases.hashicorp.com/terraform-ls/{version}/terraform-ls_{version}_{os}_{arch}.zip",
version = release.version.strip_prefix('v').unwrap_or(&release.version),
os = match platform {
zed::Os::Mac => "darwin",
zed::Os::Linux => "linux",
zed::Os::Windows => "windows",
},
arch = match arch {
zed::Architecture::Aarch64 => "arm64",
zed::Architecture::X86 => "386",
zed::Architecture::X8664 => "amd64",
},
);
let version_dir = format!("terraform-ls-{}", release.version);
let binary_path = format!("{version_dir}/terraform-ls");
if !fs::metadata(&binary_path).map_or(false, |stat| stat.is_file()) {
zed::set_language_server_installation_status(
&language_server_id,
&zed::LanguageServerInstallationStatus::Downloading,
);
zed::download_file(&download_url, &version_dir, zed::DownloadedFileType::Zip)
.map_err(|e| format!("failed to download file: {e}"))?;
let entries =
fs::read_dir(".").map_err(|e| format!("failed to list working directory {e}"))?;
for entry in entries {
let entry = entry.map_err(|e| format!("failed to load directory entry {e}"))?;
if entry.file_name().to_str() != Some(&version_dir) {
fs::remove_dir_all(&entry.path()).ok();
}
}
}
self.cached_binary_path = Some(binary_path.clone());
Ok(binary_path)
}
}
impl zed::Extension for TerraformExtension {
fn new() -> Self {
Self {
cached_binary_path: None,
}
}
fn language_server_command(
&mut self,
language_server_id: &LanguageServerId,
worktree: &zed::Worktree,
) -> Result<zed::Command> {
Ok(zed::Command {
command: self.language_server_binary_path(language_server_id, worktree)?,
args: vec!["serve".to_string()],
env: Default::default(),
})
}
}
zed::register_extension!(TerraformExtension);
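The download URL in `language_server_binary_path` above maps Zed's platform and architecture enums onto HashiCorp's release asset naming (`darwin`/`linux`/`windows`, `arm64`/`386`/`amd64`) and strips a leading `v` from the tag. A standalone sketch of just that URL construction, using a hypothetical version string:

```rust
// Mirrors the format! call in the extension; os/arch are the already-mapped
// HashiCorp names, version may or may not carry a leading 'v'.
fn release_url(version: &str, os: &str, arch: &str) -> String {
    let version = version.strip_prefix('v').unwrap_or(version);
    format!(
        "https://releases.hashicorp.com/terraform-ls/{version}/terraform-ls_{version}_{os}_{arch}.zip"
    )
}

fn main() {
    // Hypothetical release tag "v0.32.7" on a darwin/arm64 host.
    assert_eq!(
        release_url("v0.32.7", "darwin", "arm64"),
        "https://releases.hashicorp.com/terraform-ls/0.32.7/terraform-ls_0.32.7_darwin_arm64.zip"
    );
}
```

Note the `strip_prefix('v')` guard: GitHub tags are usually `vX.Y.Z`, while HashiCorp's release paths use the bare version, so the prefix must be dropped in both path segments.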


@@ -1,6 +1,6 @@
[package]
name = "zed_toml"
version = "0.0.2"
version = "0.1.0"
edition = "2021"
publish = false
license = "Apache-2.0"
@@ -13,4 +13,4 @@ path = "src/toml.rs"
crate-type = ["cdylib"]
[dependencies]
zed_extension_api = "0.0.5"
zed_extension_api = "0.0.6"


@@ -1,7 +1,7 @@
id = "toml"
name = "TOML"
description = "TOML support."
version = "0.0.2"
version = "0.1.0"
schema_version = 1
authors = [
"Max Brunsfeld <max@zed.dev>",

Some files were not shown because too many files have changed in this diff.