Compare commits


15 Commits

Author SHA1 Message Date
Conrad Irwin
284ad179d7 Don't auto-start during install flow 2024-04-30 16:09:19 -06:00
Conrad Irwin
30b55279b5 Bundle linux preview releases too 2024-04-30 16:04:42 -06:00
Conrad Irwin
28bcc95468 installer (#11229)
Release Notes:

- N/A

---------

Co-authored-by: Mikayla <mikayla@zed.dev>
2024-04-30 16:01:07 -06:00
Marshall Bowers
d55b637b7e assistant2: Fix the height of collapsed messages (#11230)
This PR fixes the height of collapsed messages, addressing the
associated TODO comment.
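
For context, the fix replaces the hard-coded `rems(2.875)` with a height derived from the content's top padding plus 1.5 line heights (see the `ChatMessage` render diff further down). A rough back-of-the-envelope check, using hypothetical pixel values for the rem size and line height:

```rust
fn main() {
    // Hypothetical values for illustration; the real ones come from the theme.
    let rem_size_px = 16.0_f32;
    let line_height_px = 24.0_f32;
    let content_padding_px = 1.0 * rem_size_px; // the message body uses p(rems(1.))
    // Clamp a collapsed message to its top padding plus 1.5 lines of text.
    let collapsed_height_px = content_padding_px + 1.5 * line_height_px;
    assert_eq!(collapsed_height_px, 52.0); // vs. the old fixed 2.875 rem = 46 px
}
```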

Release Notes:

- N/A
2024-04-30 17:44:51 -04:00
Marshall Bowers
f2a1226e18 assistant2: Setup storybook (#11228)
This PR sets up the `assistant2` crate with the storybook so that UI
elements can be iterated on in isolation.

To start, we have some stories for the `ChatMessage` component:

```sh
cargo run -p storybook -- components/assistant_chat_message
```

<img width="1233" alt="Screenshot 2024-04-30 at 5 20 03 PM"
src="https://github.com/zed-industries/zed/assets/1486634/510967ea-0e9b-4fa9-94fb-421ee74bcc45">

Release Notes:

- N/A
2024-04-30 17:33:32 -04:00
Marshall Bowers
96b1fc4650 assistant2: Align chat messages with composer (#11227)
This PR adds some additional spacing to the composer so it aligns with
the chat messages.

Release Notes:

- N/A
2024-04-30 17:00:23 -04:00
Conrad Irwin
1d4814e5b6 installer (#11224)
Release Notes:

- N/A

---------

Co-authored-by: Mikayla <mikayla@zed.dev>
2024-04-30 14:25:12 -06:00
Conrad Irwin
cb7350174e Don't allow dropping files on remote projects (#11218)
Release Notes:

- Fixed a panic when a remote participant dropped a local file on your
project
2024-04-30 14:24:47 -06:00
Marshall Bowers
f11a7811f3 assistant2: Use composer for editing inline messages (#11222)
This PR updates the assistant to render historical user messages the
same as ones from the assistant.

Double-clicking on a message will open a composer inline for editing.
Pressing `Esc` will cancel the edit.

We don't yet restore the previous state of the message upon canceling.
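
The mechanics boil down to a small piece of state: a nullable `editing_message_id` that a double-click sets and `Esc` clears. A minimal, self-contained sketch of that flow (hypothetical types; the real handlers live in the `assistant2.rs` diff below):

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
struct MessageId(usize);

#[derive(Default)]
struct EditState {
    editing_message_id: Option<MessageId>,
}

impl EditState {
    // A double-click on a user message opens the inline composer for it.
    fn on_click(&mut self, id: MessageId, click_count: usize) {
        if click_count == 2 {
            self.editing_message_id = Some(id);
        }
    }

    // Esc cancels the edit; the previous message state is not yet restored.
    fn on_escape(&mut self) {
        self.editing_message_id = None;
    }
}

fn main() {
    let mut state = EditState::default();
    state.on_click(MessageId(3), 1); // single click: no edit
    assert_eq!(state.editing_message_id, None);
    state.on_click(MessageId(3), 2); // double click: edit message 3
    assert_eq!(state.editing_message_id, Some(MessageId(3)));
    state.on_escape();
    assert_eq!(state.editing_message_id, None);
}
```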

<img width="401" alt="Screenshot 2024-04-30 at 4 04 01 PM"
src="https://github.com/zed-industries/zed/assets/1486634/5f253fa8-6578-4054-be30-c495e326d700">

<img width="401" alt="Screenshot 2024-04-30 at 4 04 28 PM"
src="https://github.com/zed-industries/zed/assets/1486634/edf25cea-d97e-44d2-8772-3690eac017a4">


Release Notes:

- N/A
2024-04-30 16:12:20 -04:00
Marshall Bowers
e8ee0131f1 Remove unused assistant prompt (#11221)
This PR removes an unused assistant prompt.

Release Notes:

- N/A

Co-authored-by: Nathan <nathan@zed.dev>
2024-04-30 15:32:11 -04:00
Max Brunsfeld
38b9d5cc36 Fix some semantic index issues (#11216)
* [x] Fixed an issue where embeddings would be assigned incorrectly to
files if a subset of embedding batches failed
* [x] Added a command to debug which paths are present in the semantic
index
* [x] Determined why so many paths are often missing from the semantic
index:
  * We were erroring out if an embedding batch contained multiple texts that
    were the same, which can happen if a worktree contains multiple copies of
    the same text (e.g. a license). See the sketch below.
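
The first issue was positional: embeddings came back as one flat vector, so a batch that failed (for example, because of those duplicate texts) silently shifted every later file's embeddings. A simplified, self-contained sketch of the alignment strategy now used, with hypothetical helper names (the real change is in `embed_files` in the `semantic_index` diff near the end):

```rust
#[derive(Clone, Debug, PartialEq)]
struct Embedding(Vec<f32>);

// Stand-in for the provider call; a failed batch yields None.
fn embed_batch(texts: &[&str]) -> Option<Vec<Embedding>> {
    if texts.iter().any(|t| t.contains("garbage")) {
        None
    } else {
        Some(texts.iter().map(|t| Embedding(vec![t.len() as f32])).collect())
    }
}

fn embed_all(chunks: &[&str], batch_size: usize) -> Vec<Option<Embedding>> {
    let mut embeddings = Vec::with_capacity(chunks.len());
    for batch in chunks.chunks(batch_size) {
        match embed_batch(batch) {
            // Keep results aligned with their chunks: a failed (or short) batch
            // contributes one None per chunk instead of shifting later results.
            Some(batch_embeddings) if batch_embeddings.len() == batch.len() => {
                embeddings.extend(batch_embeddings.into_iter().map(Some));
            }
            _ => embeddings.extend(std::iter::repeat(None).take(batch.len())),
        }
    }
    embeddings
}

fn main() {
    let chunks = ["fn main() {}", "garbage in", "struct Foo;"];
    let embeddings = embed_all(&chunks, 1);
    // The middle chunk failed, but the other two still line up with their own
    // results. A file containing any failed chunk is then skipped rather than
    // persisted with another file's embeddings.
    assert!(embeddings[0].is_some());
    assert!(embeddings[1].is_none());
    assert!(embeddings[2].is_some());
}
```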

Release Notes:

- N/A

---------

Co-authored-by: Marshall <marshall@zed.dev>
Co-authored-by: Nathan <nathan@zed.dev>
Co-authored-by: Kyle <kylek@zed.dev>
Co-authored-by: Kyle Kelley <rgbkrk@gmail.com>
2024-04-30 10:55:38 -07:00
Marshall Bowers
d01428e69c assistant2: Add create buffer tool (#11219)
This PR adds a new tool to the `assistant2` crate that allows the
assistant to create a new buffer with some content.

Release Notes:

- N/A

---------

Co-authored-by: Nathan <nathan@zed.dev>
2024-04-30 13:43:25 -04:00
Piotr Osiewicz
ada2791fa3 editor: Move code actions menu closer to the indicator when it is deployed via click (#11214)
Before:

![image](https://github.com/zed-industries/zed/assets/24362066/98d633a7-c982-4522-b4dc-b944b70b8081)

After: 

![image](https://github.com/zed-industries/zed/assets/24362066/79931e12-0e6c-4ece-b734-5af7d02f7e50)
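
In other words, when the menu is deployed from the gutter indicator it is now anchored relative to the gutter rather than at column zero of the text. A small sketch of that decision, with hypothetical fields and pixel values (the real `ContextMenuOrigin` enum and layout math are in the editor diffs below):

```rust
enum ContextMenuOrigin {
    // Triggered from the buffer: anchor at the cursor's x position, as before.
    EditorPoint { cursor_x_px: f32 },
    // Triggered by clicking the gutter indicator.
    GutterIndicator,
}

fn menu_x_px(origin: &ContextMenuOrigin, gutter_overshoot_px: f32) -> f32 {
    match origin {
        ContextMenuOrigin::EditorPoint { cursor_x_px } => *cursor_x_px,
        // Pull the menu left toward the indicator instead of snapping to
        // the first text column.
        ContextMenuOrigin::GutterIndicator => -gutter_overshoot_px,
    }
}

fn main() {
    assert_eq!(menu_x_px(&ContextMenuOrigin::EditorPoint { cursor_x_px: 120.0 }, 8.0), 120.0);
    assert_eq!(menu_x_px(&ContextMenuOrigin::GutterIndicator, 8.0), -8.0);
}
```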

Release Notes:

- N/A
2024-04-30 18:51:20 +02:00
Marshall Bowers
713c314d67 Clean up doc comments on LanguageModelTool trait (#11217)
This PR cleans up the doc comments on the `LanguageModelTool` trait to
follow Rust documentation conventions.
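
Concretely, the convention is a one-line summary, a blank line, and then further detail. An illustrative example, adapted from the `LanguageModelTool` diff below:

```rust
pub trait Tool {
    /// Returns the name of the tool.
    ///
    /// This name is exposed to the language model so it can pick which tool
    /// to use; it must be unique within a tool registry.
    fn name(&self) -> String;
}
```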

Release Notes:

- N/A
2024-04-30 12:32:59 -04:00
Marshall Bowers
e3de440715 assistant2: Restructure tools in preparation for adding more (#11213)
This PR does a slight restructuring of how tools are defined in the
`assistant2` crate to make it more amenable to adding more tools in the
near future.

Release Notes:

- N/A
2024-04-30 11:12:44 -04:00
33 changed files with 1081 additions and 534 deletions

View File

@@ -323,14 +323,13 @@ jobs:
name: zed-${{ github.event.pull_request.head.sha || github.sha }}-x86_64-unknown-linux-gnu.tar.gz
path: zed-*.tar.gz
# TODO linux : make it stable enough to be uploaded as a release
# - uses: softprops/action-gh-release@v1
# name: Upload app bundle to release
# if: ${{ env.RELEASE_CHANNEL == 'preview' || env.RELEASE_CHANNEL == 'stable' }}
# with:
# draft: true
# prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
# files: target/release/Zed.dmg
# body: ""
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Upload app bundle to release
uses: softprops/action-gh-release@v1
if: ${{ env.RELEASE_CHANNEL == 'preview' }}
with:
draft: true
prerelease: ${{ env.RELEASE_CHANNEL == 'preview' }}
files: target/release/zed-linux-x86_64.tar.gz
body: ""
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

Cargo.lock generated
View File

@@ -401,6 +401,7 @@ dependencies = [
"serde",
"serde_json",
"settings",
"story",
"theme",
"ui",
"util",
@@ -9498,6 +9499,7 @@ name = "storybook"
version = "0.1.0"
dependencies = [
"anyhow",
"assistant2",
"clap 4.4.4",
"collab_ui",
"ctrlc",

View File

@@ -1,6 +0,0 @@
User input begins on a line starting with /.
Don't apologize ever.
Never say "I apologize".
Use simple language and don't flatter the users.
Keep it short.
Risk being rude.

View File

@@ -5,9 +5,16 @@ edition = "2021"
publish = false
license = "GPL-3.0-or-later"
[lints]
workspace = true
[lib]
path = "src/assistant2.rs"
[features]
default = []
stories = ["dep:story"]
[dependencies]
anyhow.workspace = true
assistant_tooling.workspace = true
@@ -29,6 +36,7 @@ semantic_index.workspace = true
serde.workspace = true
serde_json.workspace = true
settings.workspace = true
story = { workspace = true, optional = true }
theme.workspace = true
ui.workspace = true
util.workspace = true
@@ -49,6 +57,3 @@ settings = { workspace = true, features = ["test-support"] }
theme = { workspace = true, features = ["test-support"] }
util = { workspace = true, features = ["test-support"] }
workspace = { workspace = true, features = ["test-support"] }
[lints]
workspace = true

View File

@@ -122,7 +122,11 @@ impl LanguageModelTool for RollDiceTool {
"Rolls N many dice and returns the results.".to_string()
}
fn execute(&self, input: &Self::Input, _cx: &AppContext) -> Task<gpui::Result<Self::Output>> {
fn execute(
&self,
input: &Self::Input,
_cx: &mut WindowContext,
) -> Task<gpui::Result<Self::Output>> {
let rolls = (0..input.num_dice)
.map(|_| {
let die_type = input.die_type.as_ref().unwrap_or(&Die::D6).clone();
@@ -223,7 +227,11 @@ impl LanguageModelTool for FileBrowserTool {
"A tool for browsing the filesystem.".to_string()
}
fn execute(&self, input: &Self::Input, cx: &AppContext) -> Task<gpui::Result<Self::Output>> {
fn execute(
&self,
input: &Self::Input,
cx: &mut WindowContext,
) -> Task<gpui::Result<Self::Output>> {
cx.spawn({
let fs = self.fs.clone();
let root_dir = self.root_dir.clone();
@@ -357,7 +365,7 @@ impl Example {
) -> Self {
Self {
assistant_panel: cx.new_view(|cx| {
AssistantPanel::new(language_registry, tool_registry, user_store, cx)
AssistantPanel::new(language_registry, tool_registry, user_store, None, cx)
}),
}
}

View File

@@ -1,7 +1,7 @@
mod assistant_settings;
mod completion_provider;
pub mod tools;
mod ui;
mod tools;
pub mod ui;
use ::ui::{div, prelude::*, Color, ViewContext};
use anyhow::{Context, Result};
@@ -13,17 +13,16 @@ use editor::Editor;
use feature_flags::FeatureFlagAppExt as _;
use futures::{future::join_all, StreamExt};
use gpui::{
list, AnyElement, AppContext, AsyncWindowContext, EventEmitter, FocusHandle, FocusableView,
ListAlignment, ListState, Model, Render, Task, View, WeakView,
list, AnyElement, AppContext, AsyncWindowContext, ClickEvent, EventEmitter, FocusHandle,
FocusableView, ListAlignment, ListState, Model, Render, Task, View, WeakView,
};
use language::{language_settings::SoftWrap, LanguageRegistry};
use open_ai::{FunctionContent, ToolCall, ToolCallContent};
use rich_text::RichText;
use semantic_index::{CloudEmbeddingProvider, SemanticIndex};
use semantic_index::{CloudEmbeddingProvider, ProjectIndex, SemanticIndex};
use serde::Deserialize;
use settings::Settings;
use std::sync::Arc;
use tools::ProjectIndexTool;
use ui::Composer;
use util::{paths::EMBEDDINGS_DIR, ResultExt};
use workspace::{
@@ -33,6 +32,7 @@ use workspace::{
pub use assistant_settings::AssistantSettings;
use crate::tools::{CreateBufferTool, ProjectIndexTool};
use crate::ui::UserOrAssistant;
const MAX_COMPLETION_CALLS_PER_SUBMISSION: usize = 5;
@@ -51,7 +51,7 @@ pub enum SubmitMode {
Codebase,
}
gpui::actions!(assistant2, [Cancel, ToggleFocus]);
gpui::actions!(assistant2, [Cancel, ToggleFocus, DebugProjectIndex]);
gpui::impl_actions!(assistant2, [Submit]);
pub fn init(client: Arc<Client>, cx: &mut AppContext) {
@@ -121,10 +121,23 @@ impl AssistantPanel {
)
.context("failed to register ProjectIndexTool")
.log_err();
tool_registry
.register(
CreateBufferTool::new(workspace.clone(), project.clone()),
cx,
)
.context("failed to register CreateBufferTool")
.log_err();
let tool_registry = Arc::new(tool_registry);
Self::new(app_state.languages.clone(), tool_registry, user_store, cx)
Self::new(
app_state.languages.clone(),
tool_registry,
user_store,
Some(project_index),
cx,
)
})
})
}
@@ -133,6 +146,7 @@ impl AssistantPanel {
language_registry: Arc<LanguageRegistry>,
tool_registry: Arc<ToolRegistry>,
user_store: Model<UserStore>,
project_index: Option<Model<ProjectIndex>>,
cx: &mut ViewContext<Self>,
) -> Self {
let chat = cx.new_view(|cx| {
@@ -140,6 +154,7 @@ impl AssistantPanel {
language_registry.clone(),
tool_registry.clone(),
user_store,
project_index,
cx,
)
});
@@ -207,7 +222,7 @@ impl FocusableView for AssistantPanel {
}
}
struct AssistantChat {
pub struct AssistantChat {
model: String,
messages: Vec<ChatMessage>,
list_state: ListState,
@@ -216,8 +231,10 @@ struct AssistantChat {
user_store: Model<UserStore>,
next_message_id: MessageId,
collapsed_messages: HashMap<MessageId, bool>,
editing_message_id: Option<MessageId>,
pending_completion: Option<Task<()>>,
tool_registry: Arc<ToolRegistry>,
project_index: Option<Model<ProjectIndex>>,
}
impl AssistantChat {
@@ -225,6 +242,7 @@ impl AssistantChat {
language_registry: Arc<LanguageRegistry>,
tool_registry: Arc<ToolRegistry>,
user_store: Model<UserStore>,
project_index: Option<Model<ProjectIndex>>,
cx: &mut ViewContext<Self>,
) -> Self {
let model = CompletionProvider::get(cx).default_model();
@@ -251,7 +269,9 @@ impl AssistantChat {
list_state,
user_store,
language_registry,
project_index,
next_message_id: MessageId(0),
editing_message_id: None,
collapsed_messages: HashMap::default(),
pending_completion: None,
tool_registry,
@@ -270,6 +290,9 @@ impl AssistantChat {
}
fn cancel(&mut self, _: &Cancel, cx: &mut ViewContext<Self>) {
// If we're currently editing a message, cancel the edit.
self.editing_message_id.take();
if self.pending_completion.take().is_none() {
cx.propagate();
return;
@@ -335,6 +358,14 @@ impl AssistantChat {
self.pending_completion.is_none()
}
fn debug_project_index(&mut self, _: &DebugProjectIndex, cx: &mut ViewContext<Self>) {
if let Some(index) = &self.project_index {
index.update(cx, |project_index, cx| {
project_index.debug(cx).detach_and_log_err(cx)
});
}
}
async fn request_completion(
this: WeakView<Self>,
mode: SubmitMode,
@@ -538,19 +569,52 @@ impl AssistantChat {
match &self.messages[ix] {
ChatMessage::User(UserMessage { id, body }) => div()
.id(SharedString::from(format!("message-{}-container", id.0)))
.when(!is_last, |element| element.mb_2())
.child(crate::ui::ChatMessage::new(
*id,
UserOrAssistant::User(self.user_store.read(cx).current_user()),
body.clone().into_any_element(),
self.is_message_collapsed(id),
Box::new(cx.listener({
let id = *id;
move |assistant_chat, _event, _cx| {
assistant_chat.toggle_message_collapsed(id)
}
})),
))
.map(|element| {
if self.editing_message_id.as_ref() == Some(id) {
element.child(Composer::new(
body.clone(),
self.user_store.read(cx).current_user(),
self.can_submit(),
self.tool_registry.clone(),
crate::ui::ModelSelector::new(
cx.view().downgrade(),
self.model.clone(),
)
.into_any_element(),
))
} else {
element
.on_click(cx.listener({
let id = *id;
move |assistant_chat, event: &ClickEvent, _cx| {
if event.up.click_count == 2 {
assistant_chat.editing_message_id = Some(id);
}
}
}))
.child(crate::ui::ChatMessage::new(
*id,
UserOrAssistant::User(self.user_store.read(cx).current_user()),
Some(
RichText::new(
body.read(cx).text(cx),
&[],
&self.language_registry,
)
.element(ElementId::from(id.0), cx),
),
self.is_message_collapsed(id),
Box::new(cx.listener({
let id = *id;
move |assistant_chat, _event, _cx| {
assistant_chat.toggle_message_collapsed(id)
}
})),
))
}
})
.into_any(),
ChatMessage::Assistant(AssistantMessage {
id,
@@ -559,10 +623,15 @@ impl AssistantChat {
tool_calls,
..
}) => {
let assistant_body = if body.text.is_empty() && !tool_calls.is_empty() {
div()
let assistant_body = if body.text.is_empty() {
None
} else {
div().p_2().child(body.element(ElementId::from(id.0), cx))
Some(
div()
.p_2()
.child(body.element(ElementId::from(id.0), cx))
.into_any_element(),
)
};
div()
@@ -570,7 +639,7 @@ impl AssistantChat {
.child(crate::ui::ChatMessage::new(
*id,
UserOrAssistant::Assistant,
assistant_body.into_any_element(),
assistant_body,
self.is_message_collapsed(id),
Box::new(cx.listener({
let id = *id;
@@ -674,21 +743,22 @@ impl Render for AssistantChat {
.key_context("AssistantChat")
.on_action(cx.listener(Self::submit))
.on_action(cx.listener(Self::cancel))
.on_action(cx.listener(Self::debug_project_index))
.text_color(Color::Default.color(cx))
.child(list(self.list_state.clone()).flex_1())
.child(Composer::new(
cx.view().downgrade(),
self.model.clone(),
self.composer_editor.clone(),
self.user_store.read(cx).current_user(),
self.can_submit(),
self.tool_registry.clone(),
crate::ui::ModelSelector::new(cx.view().downgrade(), self.model.clone())
.into_any_element(),
))
}
}
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord, Hash, Clone, Copy)]
struct MessageId(usize);
pub struct MessageId(usize);
impl MessageId {
fn post_inc(&mut self) -> Self {

View File

@@ -1,267 +1,5 @@
use anyhow::Result;
use assistant_tooling::LanguageModelTool;
use gpui::{prelude::*, AnyView, AppContext, Model, Task};
use project::Fs;
use schemars::JsonSchema;
use semantic_index::{ProjectIndex, Status};
use serde::Deserialize;
use std::sync::Arc;
use ui::{
div, prelude::*, CollapsibleContainer, Color, Icon, IconName, Label, SharedString,
WindowContext,
};
use util::ResultExt as _;
mod create_buffer;
mod project_index;
const DEFAULT_SEARCH_LIMIT: usize = 20;
#[derive(Clone)]
pub struct CodebaseExcerpt {
path: SharedString,
text: SharedString,
score: f32,
element_id: ElementId,
expanded: bool,
}
// Note: Comments on a `LanguageModelTool::Input` become descriptions on the generated JSON schema as shown to the language model.
// Any changes or deletions to the `CodebaseQuery` comments will change model behavior.
#[derive(Deserialize, JsonSchema)]
pub struct CodebaseQuery {
/// Semantic search query
query: String,
/// Maximum number of results to return, defaults to 20
limit: Option<usize>,
}
pub struct ProjectIndexView {
input: CodebaseQuery,
output: Result<ProjectIndexOutput>,
}
impl ProjectIndexView {
fn toggle_expanded(&mut self, element_id: ElementId, cx: &mut ViewContext<Self>) {
if let Ok(output) = &mut self.output {
if let Some(excerpt) = output
.excerpts
.iter_mut()
.find(|excerpt| excerpt.element_id == element_id)
{
excerpt.expanded = !excerpt.expanded;
cx.notify();
}
}
}
}
impl Render for ProjectIndexView {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let query = self.input.query.clone();
let result = &self.output;
let output = match result {
Err(err) => {
return div().child(Label::new(format!("Error: {}", err)).color(Color::Error));
}
Ok(output) => output,
};
div()
.v_flex()
.gap_2()
.child(
div()
.p_2()
.rounded_md()
.bg(cx.theme().colors().editor_background)
.child(
h_flex()
.child(Label::new("Query: ").color(Color::Modified))
.child(Label::new(query).color(Color::Muted)),
),
)
.children(output.excerpts.iter().map(|excerpt| {
let element_id = excerpt.element_id.clone();
let expanded = excerpt.expanded;
CollapsibleContainer::new(element_id.clone(), expanded)
.start_slot(
h_flex()
.gap_1()
.child(Icon::new(IconName::File).color(Color::Muted))
.child(Label::new(excerpt.path.clone()).color(Color::Muted)),
)
.on_click(cx.listener(move |this, _, cx| {
this.toggle_expanded(element_id.clone(), cx);
}))
.child(
div()
.p_2()
.rounded_md()
.bg(cx.theme().colors().editor_background)
.child(excerpt.text.clone()),
)
}))
}
}
pub struct ProjectIndexTool {
project_index: Model<ProjectIndex>,
fs: Arc<dyn Fs>,
}
pub struct ProjectIndexOutput {
excerpts: Vec<CodebaseExcerpt>,
status: Status,
}
impl ProjectIndexTool {
pub fn new(project_index: Model<ProjectIndex>, fs: Arc<dyn Fs>) -> Self {
// Listen for project index status and update the ProjectIndexTool directly
// TODO: setup a better description based on the user's current codebase.
Self { project_index, fs }
}
}
impl LanguageModelTool for ProjectIndexTool {
type Input = CodebaseQuery;
type Output = ProjectIndexOutput;
type View = ProjectIndexView;
fn name(&self) -> String {
"query_codebase".to_string()
}
fn description(&self) -> String {
"Semantic search against the user's current codebase, returning excerpts related to the query by computing a dot product against embeddings of chunks and an embedding of the query".to_string()
}
fn execute(&self, query: &Self::Input, cx: &AppContext) -> Task<Result<Self::Output>> {
let project_index = self.project_index.read(cx);
let status = project_index.status();
let results = project_index.search(
query.query.as_str(),
query.limit.unwrap_or(DEFAULT_SEARCH_LIMIT),
cx,
);
let fs = self.fs.clone();
cx.spawn(|cx| async move {
let results = results.await;
let excerpts = results.into_iter().map(|result| {
let abs_path = result
.worktree
.read_with(&cx, |worktree, _| worktree.abs_path().join(&result.path));
let fs = fs.clone();
async move {
let path = result.path.clone();
let text = fs.load(&abs_path?).await?;
let mut start = result.range.start;
let mut end = result.range.end.min(text.len());
while !text.is_char_boundary(start) {
start += 1;
}
while !text.is_char_boundary(end) {
end -= 1;
}
anyhow::Ok(CodebaseExcerpt {
element_id: ElementId::Name(nanoid::nanoid!().into()),
expanded: false,
path: path.to_string_lossy().to_string().into(),
text: SharedString::from(text[start..end].to_string()),
score: result.score,
})
}
});
let excerpts = futures::future::join_all(excerpts)
.await
.into_iter()
.filter_map(|result| result.log_err())
.collect();
anyhow::Ok(ProjectIndexOutput { excerpts, status })
})
}
fn output_view(
_tool_call_id: String,
input: Self::Input,
output: Result<Self::Output>,
cx: &mut WindowContext,
) -> gpui::View<Self::View> {
cx.new_view(|_cx| ProjectIndexView { input, output })
}
fn status_view(&self, cx: &mut WindowContext) -> Option<AnyView> {
Some(
cx.new_view(|cx| ProjectIndexStatusView::new(self.project_index.clone(), cx))
.into(),
)
}
fn format(_input: &Self::Input, output: &Result<Self::Output>) -> String {
match &output {
Ok(output) => {
let mut body = "Semantic search results:\n".to_string();
if output.status != Status::Idle {
body.push_str("Still indexing. Results may be incomplete.\n");
}
if output.excerpts.is_empty() {
body.push_str("No results found");
return body;
}
for excerpt in &output.excerpts {
body.push_str("Excerpt from ");
body.push_str(excerpt.path.as_ref());
body.push_str(", score ");
body.push_str(&excerpt.score.to_string());
body.push_str(":\n");
body.push_str("~~~\n");
body.push_str(excerpt.text.as_ref());
body.push_str("~~~\n");
}
body
}
Err(err) => format!("Error: {}", err),
}
}
}
struct ProjectIndexStatusView {
project_index: Model<ProjectIndex>,
}
impl ProjectIndexStatusView {
pub fn new(project_index: Model<ProjectIndex>, cx: &mut ViewContext<Self>) -> Self {
cx.subscribe(&project_index, |_this, _, _status: &Status, cx| {
cx.notify();
})
.detach();
Self { project_index }
}
}
impl Render for ProjectIndexStatusView {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let status = self.project_index.read(cx).status();
h_flex().gap_2().map(|element| match status {
Status::Idle => element.child(Label::new("Project index ready")),
Status::Loading => element.child(Label::new("Project index loading...")),
Status::Scanning { remaining_count } => element.child(Label::new(format!(
"Project index scanning: {remaining_count} remaining..."
))),
})
}
}
pub use create_buffer::*;
pub use project_index::*;

View File

@@ -0,0 +1,111 @@
use anyhow::Result;
use assistant_tooling::LanguageModelTool;
use editor::Editor;
use gpui::{prelude::*, Model, Task, View, WeakView};
use project::Project;
use schemars::JsonSchema;
use serde::Deserialize;
use ui::prelude::*;
use util::ResultExt;
use workspace::Workspace;
pub struct CreateBufferTool {
workspace: WeakView<Workspace>,
project: Model<Project>,
}
impl CreateBufferTool {
pub fn new(workspace: WeakView<Workspace>, project: Model<Project>) -> Self {
Self { workspace, project }
}
}
#[derive(Debug, Deserialize, JsonSchema)]
pub struct CreateBufferInput {
/// The contents of the buffer.
text: String,
/// The name of the language to use for the buffer.
///
/// This should be a human-readable name, like "Rust", "JavaScript", or "Python".
language: String,
}
pub struct CreateBufferOutput {}
impl LanguageModelTool for CreateBufferTool {
type Input = CreateBufferInput;
type Output = CreateBufferOutput;
type View = CreateBufferView;
fn name(&self) -> String {
"create_buffer".to_string()
}
fn description(&self) -> String {
"Create a new buffer in the current codebase".to_string()
}
fn execute(&self, input: &Self::Input, cx: &mut WindowContext) -> Task<Result<Self::Output>> {
cx.spawn({
let workspace = self.workspace.clone();
let project = self.project.clone();
let text = input.text.clone();
let language_name = input.language.clone();
|mut cx| async move {
let language = cx
.update(|cx| {
project
.read(cx)
.languages()
.language_for_name(&language_name)
})?
.await?;
let buffer = cx.update(|cx| {
project.update(cx, |project, cx| {
project.create_buffer(&text, Some(language), cx)
})
})??;
workspace
.update(&mut cx, |workspace, cx| {
workspace.add_item_to_active_pane(
Box::new(
cx.new_view(|cx| Editor::for_buffer(buffer, Some(project), cx)),
),
None,
cx,
);
})
.log_err();
Ok(CreateBufferOutput {})
}
})
}
fn format(input: &Self::Input, output: &Result<Self::Output>) -> String {
match output {
Ok(_) => format!("Created a new {} buffer", input.language),
Err(err) => format!("Failed to create buffer: {err:?}"),
}
}
fn output_view(
_tool_call_id: String,
_input: Self::Input,
_output: Result<Self::Output>,
cx: &mut WindowContext,
) -> View<Self::View> {
cx.new_view(|_cx| CreateBufferView {})
}
}
pub struct CreateBufferView {}
impl Render for CreateBufferView {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
div().child("Opening a buffer")
}
}

View File

@@ -0,0 +1,267 @@
use anyhow::Result;
use assistant_tooling::LanguageModelTool;
use gpui::{prelude::*, AnyView, Model, Task};
use project::Fs;
use schemars::JsonSchema;
use semantic_index::{ProjectIndex, Status};
use serde::Deserialize;
use std::sync::Arc;
use ui::{
div, prelude::*, CollapsibleContainer, Color, Icon, IconName, Label, SharedString,
WindowContext,
};
use util::ResultExt as _;
const DEFAULT_SEARCH_LIMIT: usize = 20;
#[derive(Clone)]
pub struct CodebaseExcerpt {
path: SharedString,
text: SharedString,
score: f32,
element_id: ElementId,
expanded: bool,
}
// Note: Comments on a `LanguageModelTool::Input` become descriptions on the generated JSON schema as shown to the language model.
// Any changes or deletions to the `CodebaseQuery` comments will change model behavior.
#[derive(Deserialize, JsonSchema)]
pub struct CodebaseQuery {
/// Semantic search query
query: String,
/// Maximum number of results to return, defaults to 20
limit: Option<usize>,
}
pub struct ProjectIndexView {
input: CodebaseQuery,
output: Result<ProjectIndexOutput>,
}
impl ProjectIndexView {
fn toggle_expanded(&mut self, element_id: ElementId, cx: &mut ViewContext<Self>) {
if let Ok(output) = &mut self.output {
if let Some(excerpt) = output
.excerpts
.iter_mut()
.find(|excerpt| excerpt.element_id == element_id)
{
excerpt.expanded = !excerpt.expanded;
cx.notify();
}
}
}
}
impl Render for ProjectIndexView {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let query = self.input.query.clone();
let result = &self.output;
let output = match result {
Err(err) => {
return div().child(Label::new(format!("Error: {}", err)).color(Color::Error));
}
Ok(output) => output,
};
div()
.v_flex()
.gap_2()
.child(
div()
.p_2()
.rounded_md()
.bg(cx.theme().colors().editor_background)
.child(
h_flex()
.child(Label::new("Query: ").color(Color::Modified))
.child(Label::new(query).color(Color::Muted)),
),
)
.children(output.excerpts.iter().map(|excerpt| {
let element_id = excerpt.element_id.clone();
let expanded = excerpt.expanded;
CollapsibleContainer::new(element_id.clone(), expanded)
.start_slot(
h_flex()
.gap_1()
.child(Icon::new(IconName::File).color(Color::Muted))
.child(Label::new(excerpt.path.clone()).color(Color::Muted)),
)
.on_click(cx.listener(move |this, _, cx| {
this.toggle_expanded(element_id.clone(), cx);
}))
.child(
div()
.p_2()
.rounded_md()
.bg(cx.theme().colors().editor_background)
.child(excerpt.text.clone()),
)
}))
}
}
pub struct ProjectIndexTool {
project_index: Model<ProjectIndex>,
fs: Arc<dyn Fs>,
}
pub struct ProjectIndexOutput {
excerpts: Vec<CodebaseExcerpt>,
status: Status,
}
impl ProjectIndexTool {
pub fn new(project_index: Model<ProjectIndex>, fs: Arc<dyn Fs>) -> Self {
// Listen for project index status and update the ProjectIndexTool directly
// TODO: setup a better description based on the user's current codebase.
Self { project_index, fs }
}
}
impl LanguageModelTool for ProjectIndexTool {
type Input = CodebaseQuery;
type Output = ProjectIndexOutput;
type View = ProjectIndexView;
fn name(&self) -> String {
"query_codebase".to_string()
}
fn description(&self) -> String {
"Semantic search against the user's current codebase, returning excerpts related to the query by computing a dot product against embeddings of chunks and an embedding of the query".to_string()
}
fn execute(&self, query: &Self::Input, cx: &mut WindowContext) -> Task<Result<Self::Output>> {
let project_index = self.project_index.read(cx);
let status = project_index.status();
let results = project_index.search(
query.query.as_str(),
query.limit.unwrap_or(DEFAULT_SEARCH_LIMIT),
cx,
);
let fs = self.fs.clone();
cx.spawn(|cx| async move {
let results = results.await;
let excerpts = results.into_iter().map(|result| {
let abs_path = result
.worktree
.read_with(&cx, |worktree, _| worktree.abs_path().join(&result.path));
let fs = fs.clone();
async move {
let path = result.path.clone();
let text = fs.load(&abs_path?).await?;
let mut start = result.range.start;
let mut end = result.range.end.min(text.len());
while !text.is_char_boundary(start) {
start += 1;
}
while !text.is_char_boundary(end) {
end -= 1;
}
anyhow::Ok(CodebaseExcerpt {
element_id: ElementId::Name(nanoid::nanoid!().into()),
expanded: false,
path: path.to_string_lossy().to_string().into(),
text: SharedString::from(text[start..end].to_string()),
score: result.score,
})
}
});
let excerpts = futures::future::join_all(excerpts)
.await
.into_iter()
.filter_map(|result| result.log_err())
.collect();
anyhow::Ok(ProjectIndexOutput { excerpts, status })
})
}
fn output_view(
_tool_call_id: String,
input: Self::Input,
output: Result<Self::Output>,
cx: &mut WindowContext,
) -> gpui::View<Self::View> {
cx.new_view(|_cx| ProjectIndexView { input, output })
}
fn status_view(&self, cx: &mut WindowContext) -> Option<AnyView> {
Some(
cx.new_view(|cx| ProjectIndexStatusView::new(self.project_index.clone(), cx))
.into(),
)
}
fn format(_input: &Self::Input, output: &Result<Self::Output>) -> String {
match &output {
Ok(output) => {
let mut body = "Semantic search results:\n".to_string();
if output.status != Status::Idle {
body.push_str("Still indexing. Results may be incomplete.\n");
}
if output.excerpts.is_empty() {
body.push_str("No results found");
return body;
}
for excerpt in &output.excerpts {
body.push_str("Excerpt from ");
body.push_str(excerpt.path.as_ref());
body.push_str(", score ");
body.push_str(&excerpt.score.to_string());
body.push_str(":\n");
body.push_str("~~~\n");
body.push_str(excerpt.text.as_ref());
body.push_str("~~~\n");
}
body
}
Err(err) => format!("Error: {}", err),
}
}
}
struct ProjectIndexStatusView {
project_index: Model<ProjectIndex>,
}
impl ProjectIndexStatusView {
pub fn new(project_index: Model<ProjectIndex>, cx: &mut ViewContext<Self>) -> Self {
cx.subscribe(&project_index, |_this, _, _status: &Status, cx| {
cx.notify();
})
.detach();
Self { project_index }
}
}
impl Render for ProjectIndexStatusView {
fn render(&mut self, cx: &mut ViewContext<Self>) -> impl IntoElement {
let status = self.project_index.read(cx).status();
h_flex().gap_2().map(|element| match status {
Status::Idle => element.child(Label::new("Project index ready")),
Status::Loading => element.child(Label::new("Project index loading...")),
Status::Scanning { remaining_count } => element.child(Label::new(format!(
"Project index scanning: {remaining_count} remaining..."
))),
})
}
}

View File

@@ -1,5 +1,11 @@
mod chat_message;
mod composer;
#[cfg(feature = "stories")]
mod stories;
pub use chat_message::*;
pub use composer::*;
#[cfg(feature = "stories")]
pub use stories::*;

View File

@@ -15,7 +15,7 @@ pub enum UserOrAssistant {
pub struct ChatMessage {
id: MessageId,
player: UserOrAssistant,
message: AnyElement,
message: Option<AnyElement>,
collapsed: bool,
on_collapse_handle_click: Box<dyn Fn(&ClickEvent, &mut WindowContext) + 'static>,
}
@@ -24,7 +24,7 @@ impl ChatMessage {
pub fn new(
id: MessageId,
player: UserOrAssistant,
message: AnyElement,
message: Option<AnyElement>,
collapsed: bool,
on_collapse_handle_click: Box<dyn Fn(&ClickEvent, &mut WindowContext) + 'static>,
) -> Self {
@@ -40,10 +40,6 @@ impl ChatMessage {
impl RenderOnce for ChatMessage {
fn render(self, cx: &mut WindowContext) -> impl IntoElement {
// TODO: This should be top padding + 1.5x line height
// Set the message height to cut off at exactly 1.5 lines when collapsed
let collapsed_height = rems(2.875);
let collapse_handle_id = SharedString::from(format!("{}_collapse_handle", self.id.0));
let collapse_handle = h_flex()
.id(collapse_handle_id.clone())
@@ -65,19 +61,26 @@ impl RenderOnce for ChatMessage {
this.bg(cx.theme().colors().element_hover)
}),
);
let content = div()
.overflow_hidden()
.w_full()
.p_4()
.rounded_lg()
.when(self.collapsed, |this| this.h(collapsed_height))
.bg(cx.theme().colors().surface_background)
.child(self.message);
let content_padding = rems(1.);
// Clamp the message height to exactly 1.5 lines when collapsed.
let collapsed_height = content_padding.to_pixels(cx.rem_size()) + cx.line_height() * 1.5;
let content = self.message.map(|message| {
div()
.overflow_hidden()
.w_full()
.p(content_padding)
.rounded_lg()
.when(self.collapsed, |this| this.h(collapsed_height))
.bg(cx.theme().colors().surface_background)
.child(message)
});
v_flex()
.gap_1()
.child(ChatMessageHeader::new(self.player))
.child(h_flex().gap_3().child(collapse_handle).child(content))
.child(h_flex().gap_3().child(collapse_handle).children(content))
}
}

View File

@@ -1,7 +1,7 @@
use assistant_tooling::ToolRegistry;
use client::User;
use editor::{Editor, EditorElement, EditorStyle};
use gpui::{FontStyle, FontWeight, TextStyle, View, WeakView, WhiteSpace};
use gpui::{AnyElement, FontStyle, FontWeight, TextStyle, View, WeakView, WhiteSpace};
use settings::Settings;
use std::sync::Arc;
use theme::ThemeSettings;
@@ -11,30 +11,27 @@ use crate::{AssistantChat, CompletionProvider, Submit, SubmitMode};
#[derive(IntoElement)]
pub struct Composer {
assistant_chat: WeakView<AssistantChat>,
model: String,
editor: View<Editor>,
player: Option<Arc<User>>,
can_submit: bool,
tool_registry: Arc<ToolRegistry>,
model_selector: AnyElement,
}
impl Composer {
pub fn new(
assistant_chat: WeakView<AssistantChat>,
model: String,
editor: View<Editor>,
player: Option<Arc<User>>,
can_submit: bool,
tool_registry: Arc<ToolRegistry>,
model_selector: AnyElement,
) -> Self {
Self {
assistant_chat,
model,
editor,
player,
can_submit,
tool_registry,
model_selector,
}
}
}
@@ -61,6 +58,7 @@ impl RenderOnce for Composer {
v_flex()
.size_full()
.gap_1()
.pr_4()
.child(
v_flex()
.w_full()
@@ -149,7 +147,7 @@ impl RenderOnce for Composer {
h_flex()
.w_full()
.justify_between()
.child(ModelSelector::new(self.assistant_chat, self.model))
.child(self.model_selector)
.children(self.tool_registry.status_views().iter().cloned()),
),
)
@@ -157,7 +155,7 @@ impl RenderOnce for Composer {
}
#[derive(IntoElement)]
struct ModelSelector {
pub struct ModelSelector {
assistant_chat: WeakView<AssistantChat>,
model: String,
}

View File

@@ -0,0 +1,3 @@
mod chat_message;
pub use chat_message::*;

View File

@@ -0,0 +1,101 @@
use std::sync::Arc;
use client::User;
use story::{StoryContainer, StoryItem, StorySection};
use ui::prelude::*;
use crate::ui::{ChatMessage, UserOrAssistant};
use crate::MessageId;
pub struct ChatMessageStory;
impl Render for ChatMessageStory {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
let user_1 = Arc::new(User {
id: 12345,
github_login: "iamnbutler".into(),
avatar_uri: "https://avatars.githubusercontent.com/u/1714999?v=4".into(),
});
StoryContainer::new(
"ChatMessage Story",
"crates/assistant2/src/ui/stories/chat_message.rs",
)
.child(
StorySection::new()
.child(StoryItem::new(
"User chat message",
ChatMessage::new(
MessageId(0),
UserOrAssistant::User(Some(user_1.clone())),
Some(div().child("What can I do here?").into_any_element()),
false,
Box::new(|_, _| {}),
),
))
.child(StoryItem::new(
"User chat message (collapsed)",
ChatMessage::new(
MessageId(0),
UserOrAssistant::User(Some(user_1.clone())),
Some(div().child("What can I do here?").into_any_element()),
true,
Box::new(|_, _| {}),
),
)),
)
.child(
StorySection::new()
.child(StoryItem::new(
"Assistant chat message",
ChatMessage::new(
MessageId(0),
UserOrAssistant::Assistant,
Some(div().child("You can talk to me!").into_any_element()),
false,
Box::new(|_, _| {}),
),
))
.child(StoryItem::new(
"Assistant chat message (collapsed)",
ChatMessage::new(
MessageId(0),
UserOrAssistant::Assistant,
Some(div().child(MULTI_LINE_MESSAGE).into_any_element()),
true,
Box::new(|_, _| {}),
),
)),
)
.child(
StorySection::new().child(StoryItem::new(
"Conversation between user and assistant",
v_flex()
.gap_2()
.child(ChatMessage::new(
MessageId(0),
UserOrAssistant::User(Some(user_1.clone())),
Some(div().child("What is Rust??").into_any_element()),
false,
Box::new(|_, _| {}),
))
.child(ChatMessage::new(
MessageId(0),
UserOrAssistant::Assistant,
Some(div().child("Rust is a multi-paradigm programming language focused on performance and safety").into_any_element()),
false,
Box::new(|_, _| {}),
))
.child(ChatMessage::new(
MessageId(0),
UserOrAssistant::User(Some(user_1)),
Some(div().child("Sounds pretty cool!").into_any_element()),
false,
Box::new(|_, _| {}),
)),
)),
)
}
}
const MULTI_LINE_MESSAGE: &str = "In 2010, the movies nominated for the 82nd Academy Awards, for films released in 2009, were as follows. Note that 2010 nominees were announced for the ceremony happening in that year, but they honor movies from the previous year";

View File

@@ -120,8 +120,8 @@ impl ToolRegistry {
#[cfg(test)]
mod test {
use super::*;
use gpui::View;
use gpui::{div, prelude::*, Render, TestAppContext};
use gpui::{EmptyView, View};
use schemars::schema_for;
use schemars::JsonSchema;
use serde::{Deserialize, Serialize};
@@ -170,7 +170,7 @@ mod test {
fn execute(
&self,
input: &Self::Input,
_cx: &gpui::AppContext,
_cx: &mut WindowContext,
) -> Task<Result<Self::Output>> {
let _location = input.location.clone();
let _unit = input.unit.clone();
@@ -200,6 +200,7 @@ mod test {
#[gpui::test]
async fn test_openai_weather_example(cx: &mut TestAppContext) {
cx.background_executor.run_until_parked();
let (_, cx) = cx.add_window_view(|_cx| EmptyView);
let tool = WeatherTool {
current_weather: WeatherResult {

View File

@@ -1,5 +1,5 @@
use anyhow::Result;
use gpui::{AnyElement, AnyView, AppContext, IntoElement as _, Render, Task, View, WindowContext};
use gpui::{AnyElement, AnyView, IntoElement as _, Render, Task, View, WindowContext};
use schemars::{schema::RootSchema, schema_for, JsonSchema};
use serde::Deserialize;
use std::fmt::Display;
@@ -70,16 +70,19 @@ pub trait LanguageModelTool {
type View: Render;
/// The name of the tool is exposed to the language model to allow
/// the model to pick which tools to use. As this name is used to
/// identify the tool within a tool registry, it should be unique.
/// Returns the name of the tool.
///
/// This name is exposed to the language model to allow the model to pick
/// which tools to use. As this name is used to identify the tool within a
/// tool registry, it should be unique.
fn name(&self) -> String;
/// A description of the tool that can be used to _prompt_ the model
/// as to what the tool does.
/// Returns the description of the tool.
///
/// This can be used to _prompt_ the model as to what the tool does.
fn description(&self) -> String;
/// The OpenAI Function definition for the tool, for direct use with OpenAI's API.
/// Returns the OpenAI Function definition for the tool, for direct use with OpenAI's API.
fn definition(&self) -> ToolFunctionDefinition {
let root_schema = schema_for!(Self::Input);
@@ -90,8 +93,8 @@ pub trait LanguageModelTool {
}
}
/// Execute the tool
fn execute(&self, input: &Self::Input, cx: &AppContext) -> Task<Result<Self::Output>>;
/// Executes the tool with the given input.
fn execute(&self, input: &Self::Input, cx: &mut WindowContext) -> Task<Result<Self::Output>>;
fn format(input: &Self::Input, output: &Result<Self::Output>) -> String;

View File

@@ -764,10 +764,10 @@ impl ContextMenu {
max_height: Pixels,
workspace: Option<WeakView<Workspace>>,
cx: &mut ViewContext<Editor>,
) -> (DisplayPoint, AnyElement) {
) -> (ContextMenuOrigin, AnyElement) {
match self {
ContextMenu::Completions(menu) => (
cursor_position,
ContextMenuOrigin::EditorPoint(cursor_position),
menu.render(style, max_height, workspace, cx),
),
ContextMenu::CodeActions(menu) => menu.render(cursor_position, style, max_height, cx),
@@ -775,6 +775,11 @@ impl ContextMenu {
}
}
enum ContextMenuOrigin {
EditorPoint(DisplayPoint),
GutterIndicator(u32),
}
#[derive(Clone)]
struct CompletionsMenu {
id: CompletionId,
@@ -1208,11 +1213,11 @@ impl CodeActionsMenu {
fn render(
&self,
mut cursor_position: DisplayPoint,
cursor_position: DisplayPoint,
_style: &EditorStyle,
max_height: Pixels,
cx: &mut ViewContext<Editor>,
) -> (DisplayPoint, AnyElement) {
) -> (ContextMenuOrigin, AnyElement) {
let actions = self.actions.clone();
let selected_item = self.selected_item;
@@ -1277,10 +1282,11 @@ impl CodeActionsMenu {
)
.into_any_element();
if self.deployed_from_indicator {
*cursor_position.column_mut() = 0;
}
let cursor_position = if self.deployed_from_indicator {
ContextMenuOrigin::GutterIndicator(cursor_position.row())
} else {
ContextMenuOrigin::EditorPoint(cursor_position)
};
(cursor_position, element)
}
}
@@ -4247,13 +4253,13 @@ impl Editor {
.map_or(false, |menu| menu.visible())
}
pub fn render_context_menu(
fn render_context_menu(
&self,
cursor_position: DisplayPoint,
style: &EditorStyle,
max_height: Pixels,
cx: &mut ViewContext<Editor>,
) -> Option<(DisplayPoint, AnyElement)> {
) -> Option<(ContextMenuOrigin, AnyElement)> {
self.context_menu.read().as_ref().map(|menu| {
menu.render(
cursor_position,

View File

@@ -1949,6 +1949,7 @@ impl EditorElement {
scroll_pixel_position: gpui::Point<Pixels>,
line_layouts: &[LineWithInvisibles],
newest_selection_head: DisplayPoint,
gutter_overshoot: Pixels,
cx: &mut WindowContext,
) -> bool {
let max_height = cmp::min(
@@ -1968,9 +1969,23 @@ impl EditorElement {
let available_space = size(AvailableSpace::MinContent, AvailableSpace::MinContent);
let context_menu_size = context_menu.layout_as_root(available_space, cx);
let cursor_row_layout = &line_layouts[(position.row() - start_row) as usize].line;
let x = cursor_row_layout.x_for_index(position.column() as usize) - scroll_pixel_position.x;
let y = (position.row() + 1) as f32 * line_height - scroll_pixel_position.y;
let (x, y) = match position {
crate::ContextMenuOrigin::EditorPoint(point) => {
let cursor_row_layout = &line_layouts[(point.row() - start_row) as usize].line;
let x = cursor_row_layout.x_for_index(point.column() as usize)
- scroll_pixel_position.x;
let y = (point.row() + 1) as f32 * line_height - scroll_pixel_position.y;
(x, y)
}
crate::ContextMenuOrigin::GutterIndicator(row) => {
// Context menu was spawned via a click on a gutter. Ensure it's a bit closer to the indicator than just a plain first column of the
// text field.
let x = -gutter_overshoot;
let y = (row + 1) as f32 * line_height - scroll_pixel_position.y;
(x, y)
}
};
let mut list_origin = content_origin + point(x, y);
let list_width = context_menu_size.width;
let list_height = context_menu_size.height;
@@ -3826,6 +3841,7 @@ impl Element for EditorElement {
scroll_pixel_position,
&line_layouts,
newest_selection_head,
gutter_dimensions.width - gutter_dimensions.left_padding,
cx,
);
if gutter_settings.code_actions {

View File

@@ -218,7 +218,7 @@ impl TestAppContext {
/// Adds a new window, and returns its root view and a `VisualTestContext` which can be used
/// as a `WindowContext` for the rest of the test. Typically you would shadow this context with
/// the returned one. `let (view, cx) = cx.add_window_view(...);`
pub fn add_window_view<F, V>(&mut self, build_window: F) -> (View<V>, &mut VisualTestContext)
pub fn add_window_view<F, V>(&mut self, build_root_view: F) -> (View<V>, &mut VisualTestContext)
where
F: FnOnce(&mut ViewContext<V>) -> V,
V: 'static + Render,
@@ -230,7 +230,7 @@ impl TestAppContext {
bounds: Some(bounds),
..Default::default()
},
|cx| cx.new_view(build_window),
|cx| cx.new_view(build_root_view),
);
drop(cx);
let view = window.root_view(self).unwrap();

View File

@@ -1,3 +1,4 @@
use crate::Empty;
use crate::{
seal::Sealed, AnyElement, AnyModel, AnyWeakModel, AppContext, Bounds, ContentMask, Element,
ElementId, Entity, EntityId, Flatten, FocusHandle, FocusableView, GlobalElementId, IntoElement,
@@ -457,3 +458,12 @@ mod any_view {
view.update(cx, |view, cx| view.render(cx).into_any_element())
}
}
/// A view that renders nothing
pub struct EmptyView;
impl Render for EmptyView {
fn render(&mut self, _cx: &mut ViewContext<Self>) -> impl IntoElement {
Empty
}
}

View File

@@ -8525,11 +8525,13 @@ impl Project {
OpenBuffer::Weak(_) => {}
},
hash_map::Entry::Vacant(e) => {
assert!(
is_remote,
"received buffer update from {:?}",
envelope.original_sender_id
);
if !is_remote {
debug_panic!(
"received buffer update from {:?}",
envelope.original_sender_id
);
return Err(anyhow!("received buffer update for non-remote project"));
}
e.insert(OpenBuffer::Operations(ops));
}
}

View File

@@ -72,10 +72,11 @@ impl EmbeddingProvider for CloudEmbeddingProvider {
texts
.iter()
.map(|to_embed| {
let dimensions = embeddings.remove(&to_embed.digest).with_context(|| {
format!("server did not return an embedding for {:?}", to_embed)
})?;
Ok(Embedding::new(dimensions))
let embedding =
embeddings.get(&to_embed.digest).cloned().with_context(|| {
format!("server did not return an embedding for {:?}", to_embed)
})?;
Ok(Embedding::new(embedding))
})
.collect()
}

View File

@@ -21,6 +21,7 @@ use smol::channel;
use std::{
cmp::Ordering,
future::Future,
iter,
num::NonZeroUsize,
ops::Range,
path::{Path, PathBuf},
@@ -295,6 +296,28 @@ impl ProjectIndex {
}
Ok(result)
}
pub fn debug(&self, cx: &mut ModelContext<Self>) -> Task<Result<()>> {
let indices = self
.worktree_indices
.values()
.filter_map(|worktree_index| {
if let WorktreeIndexHandle::Loaded { index, .. } = worktree_index {
Some(index.clone())
} else {
None
}
})
.collect::<Vec<_>>();
cx.spawn(|_, mut cx| async move {
eprintln!("semantic index contents:");
for index in indices {
index.update(&mut cx, |index, cx| index.debug(cx))?.await?
}
Ok(())
})
}
}
pub struct SearchResult {
@@ -419,7 +442,7 @@ impl WorktreeIndex {
let worktree_abs_path = worktree.abs_path().clone();
let scan = self.scan_entries(worktree.clone(), cx);
let chunk = self.chunk_files(worktree_abs_path, scan.updated_entries, cx);
let embed = self.embed_files(chunk.files, cx);
let embed = Self::embed_files(self.embedding_provider.clone(), chunk.files, cx);
let persist = self.persist_embeddings(scan.deleted_entry_ranges, embed.files, cx);
async move {
futures::try_join!(scan.task, chunk.task, embed.task, persist)?;
@@ -436,7 +459,7 @@ impl WorktreeIndex {
let worktree_abs_path = worktree.abs_path().clone();
let scan = self.scan_updated_entries(worktree, updated_entries.clone(), cx);
let chunk = self.chunk_files(worktree_abs_path, scan.updated_entries, cx);
let embed = self.embed_files(chunk.files, cx);
let embed = Self::embed_files(self.embedding_provider.clone(), chunk.files, cx);
let persist = self.persist_embeddings(scan.deleted_entry_ranges, embed.files, cx);
async move {
futures::try_join!(scan.task, chunk.task, embed.task, persist)?;
@@ -500,7 +523,7 @@ impl WorktreeIndex {
}
if entry.mtime != saved_mtime {
let handle = entries_being_indexed.insert(&entry);
let handle = entries_being_indexed.insert(entry.id);
updated_entries_tx.send((entry.clone(), handle)).await?;
}
}
@@ -539,7 +562,7 @@ impl WorktreeIndex {
| project::PathChange::AddedOrUpdated => {
if let Some(entry) = worktree.entry_for_id(*entry_id) {
if entry.is_file() {
let handle = entries_being_indexed.insert(&entry);
let handle = entries_being_indexed.insert(entry.id);
updated_entries_tx.send((entry.clone(), handle)).await?;
}
}
@@ -601,7 +624,8 @@ impl WorktreeIndex {
let chunked_file = ChunkedFile {
chunks: chunk_text(&text, grammar),
handle,
entry,
path: entry.path,
mtime: entry.mtime,
text,
};
@@ -623,11 +647,11 @@ impl WorktreeIndex {
}
fn embed_files(
&self,
embedding_provider: Arc<dyn EmbeddingProvider>,
chunked_files: channel::Receiver<ChunkedFile>,
cx: &AppContext,
) -> EmbedFiles {
let embedding_provider = self.embedding_provider.clone();
let embedding_provider = embedding_provider.clone();
let (embedded_files_tx, embedded_files_rx) = channel::bounded(512);
let task = cx.background_executor().spawn(async move {
let mut chunked_file_batches =
@@ -635,9 +659,10 @@ impl WorktreeIndex {
while let Some(chunked_files) = chunked_file_batches.next().await {
// View the batch of files as a vec of chunks
// Flatten out to a vec of chunks that we can subdivide into batch sized pieces
// Once those are done, reassemble it back into which files they belong to
// Once those are done, reassemble them back into the files in which they belong
// If any embeddings fail for a file, the entire file is discarded
let chunks = chunked_files
let chunks: Vec<TextToEmbed> = chunked_files
.iter()
.flat_map(|file| {
file.chunks.iter().map(|chunk| TextToEmbed {
@@ -647,36 +672,50 @@ impl WorktreeIndex {
})
.collect::<Vec<_>>();
let mut embeddings = Vec::new();
let mut embeddings: Vec<Option<Embedding>> = Vec::new();
for embedding_batch in chunks.chunks(embedding_provider.batch_size()) {
if let Some(batch_embeddings) =
embedding_provider.embed(embedding_batch).await.log_err()
{
embeddings.extend_from_slice(&batch_embeddings);
if batch_embeddings.len() == embedding_batch.len() {
embeddings.extend(batch_embeddings.into_iter().map(Some));
continue;
}
log::error!(
"embedding provider returned unexpected embedding count {}, expected {}",
batch_embeddings.len(), embedding_batch.len()
);
}
embeddings.extend(iter::repeat(None).take(embedding_batch.len()));
}
let mut embeddings = embeddings.into_iter();
for chunked_file in chunked_files {
let chunk_embeddings = embeddings
.by_ref()
.take(chunked_file.chunks.len())
.collect::<Vec<_>>();
let embedded_chunks = chunked_file
.chunks
.into_iter()
.zip(chunk_embeddings)
.map(|(chunk, embedding)| EmbeddedChunk { chunk, embedding })
.collect();
let embedded_file = EmbeddedFile {
path: chunked_file.entry.path.clone(),
mtime: chunked_file.entry.mtime,
chunks: embedded_chunks,
let mut embedded_file = EmbeddedFile {
path: chunked_file.path,
mtime: chunked_file.mtime,
chunks: Vec::new(),
};
embedded_files_tx
.send((embedded_file, chunked_file.handle))
.await?;
let mut embedded_all_chunks = true;
for (chunk, embedding) in
chunked_file.chunks.into_iter().zip(embeddings.by_ref())
{
if let Some(embedding) = embedding {
embedded_file
.chunks
.push(EmbeddedChunk { chunk, embedding });
} else {
embedded_all_chunks = false;
}
}
if embedded_all_chunks {
embedded_files_tx
.send((embedded_file, chunked_file.handle))
.await?;
}
}
}
Ok(())
@@ -826,6 +865,21 @@ impl WorktreeIndex {
})
}
fn debug(&mut self, cx: &mut ModelContext<Self>) -> Task<Result<()>> {
let connection = self.db_connection.clone();
let db = self.db;
cx.background_executor().spawn(async move {
let tx = connection
.read_txn()
.context("failed to create read transaction")?;
for record in db.iter(&tx)? {
let (key, _) = record?;
eprintln!("{}", path_for_db_key(key));
}
Ok(())
})
}
#[cfg(test)]
fn path_count(&self) -> Result<u64> {
let txn = self
@@ -848,7 +902,8 @@ struct ChunkFiles {
}
struct ChunkedFile {
pub entry: Entry,
pub path: Arc<Path>,
pub mtime: Option<SystemTime>,
pub handle: IndexingEntryHandle,
pub text: String,
pub chunks: Vec<Chunk>,
@@ -872,11 +927,14 @@ struct EmbeddedChunk {
embedding: Embedding,
}
/// The set of entries that are currently being indexed.
struct IndexingEntrySet {
entry_ids: Mutex<HashSet<ProjectEntryId>>,
tx: channel::Sender<()>,
}
/// When dropped, removes the entry from the set of entries that are being indexed.
#[derive(Clone)]
struct IndexingEntryHandle {
entry_id: ProjectEntryId,
set: Weak<IndexingEntrySet>,
@@ -890,11 +948,11 @@ impl IndexingEntrySet {
}
}
fn insert(self: &Arc<Self>, entry: &project::Entry) -> IndexingEntryHandle {
self.entry_ids.lock().insert(entry.id);
fn insert(self: &Arc<Self>, entry_id: ProjectEntryId) -> IndexingEntryHandle {
self.entry_ids.lock().insert(entry_id);
self.tx.send_blocking(()).ok();
IndexingEntryHandle {
entry_id: entry.id,
entry_id,
set: Arc::downgrade(self),
}
}
@@ -917,6 +975,10 @@ fn db_key_for_path(path: &Arc<Path>) -> String {
path.to_string_lossy().replace('/', "\0")
}
fn path_for_db_key(key: &str) -> String {
key.replace('\0', "/")
}
#[cfg(test)]
mod tests {
use super::*;
@@ -939,7 +1001,22 @@ mod tests {
});
}
pub struct TestEmbeddingProvider;
pub struct TestEmbeddingProvider {
batch_size: usize,
compute_embedding: Box<dyn Fn(&str) -> Result<Embedding> + Send + Sync>,
}
impl TestEmbeddingProvider {
pub fn new(
batch_size: usize,
compute_embedding: impl 'static + Fn(&str) -> Result<Embedding> + Send + Sync,
) -> Self {
return Self {
batch_size,
compute_embedding: Box::new(compute_embedding),
};
}
}
impl EmbeddingProvider for TestEmbeddingProvider {
fn embed<'a>(
@@ -948,29 +1025,13 @@ mod tests {
) -> BoxFuture<'a, Result<Vec<Embedding>>> {
let embeddings = texts
.iter()
.map(|text| {
let mut embedding = vec![0f32; 2];
// if the text contains garbage, give it a 1 in the first dimension
if text.text.contains("garbage in") {
embedding[0] = 0.9;
} else {
embedding[0] = -0.9;
}
if text.text.contains("garbage out") {
embedding[1] = 0.9;
} else {
embedding[1] = -0.9;
}
Embedding::new(embedding)
})
.map(|to_embed| (self.compute_embedding)(to_embed.text))
.collect();
future::ready(Ok(embeddings)).boxed()
future::ready(embeddings).boxed()
}
fn batch_size(&self) -> usize {
16
self.batch_size
}
}
@@ -984,7 +1045,23 @@ mod tests {
let mut semantic_index = SemanticIndex::new(
temp_dir.path().into(),
Arc::new(TestEmbeddingProvider),
Arc::new(TestEmbeddingProvider::new(16, |text| {
let mut embedding = vec![0f32; 2];
// if the text contains garbage, give it a 1 in the first dimension
if text.contains("garbage in") {
embedding[0] = 0.9;
} else {
embedding[0] = -0.9;
}
if text.contains("garbage out") {
embedding[1] = 0.9;
} else {
embedding[1] = -0.9;
}
Ok(Embedding::new(embedding))
})),
&mut cx.to_async(),
)
.await
@@ -1046,4 +1123,82 @@ mod tests {
assert!(content.contains("garbage in, garbage out"));
}
#[gpui::test]
async fn test_embed_files(cx: &mut TestAppContext) {
cx.executor().allow_parking();
let provider = Arc::new(TestEmbeddingProvider::new(3, |text| {
if text.contains('g') {
Err(anyhow!("cannot embed text containing a 'g' character"))
} else {
Ok(Embedding::new(
('a'..'z')
.map(|char| text.chars().filter(|c| *c == char).count() as f32)
.collect(),
))
}
}));
let (indexing_progress_tx, _) = channel::unbounded();
let indexing_entries = Arc::new(IndexingEntrySet::new(indexing_progress_tx));
let (chunked_files_tx, chunked_files_rx) = channel::unbounded::<ChunkedFile>();
chunked_files_tx
.send_blocking(ChunkedFile {
path: Path::new("test1.md").into(),
mtime: None,
handle: indexing_entries.insert(ProjectEntryId::from_proto(0)),
text: "abcdefghijklmnop".to_string(),
chunks: [0..4, 4..8, 8..12, 12..16]
.into_iter()
.map(|range| Chunk {
range,
digest: Default::default(),
})
.collect(),
})
.unwrap();
chunked_files_tx
.send_blocking(ChunkedFile {
path: Path::new("test2.md").into(),
mtime: None,
handle: indexing_entries.insert(ProjectEntryId::from_proto(1)),
text: "qrstuvwxyz".to_string(),
chunks: [0..4, 4..8, 8..10]
.into_iter()
.map(|range| Chunk {
range,
digest: Default::default(),
})
.collect(),
})
.unwrap();
chunked_files_tx.close();
let embed_files_task =
cx.update(|cx| WorktreeIndex::embed_files(provider.clone(), chunked_files_rx, cx));
embed_files_task.task.await.unwrap();
let mut embedded_files_rx = embed_files_task.files;
let mut embedded_files = Vec::new();
while let Some((embedded_file, _)) = embedded_files_rx.next().await {
embedded_files.push(embedded_file);
}
assert_eq!(embedded_files.len(), 1);
assert_eq!(embedded_files[0].path.as_ref(), Path::new("test2.md"));
assert_eq!(
embedded_files[0]
.chunks
.iter()
.map(|embedded_chunk| { embedded_chunk.embedding.clone() })
.collect::<Vec<Embedding>>(),
vec![
(provider.compute_embedding)("qrst").unwrap(),
(provider.compute_embedding)("uvwx").unwrap(),
(provider.compute_embedding)("yz").unwrap(),
],
);
}
}

View File

@@ -14,6 +14,7 @@ path = "src/storybook.rs"
[dependencies]
anyhow.workspace = true
assistant2 = { workspace = true, features = ["stories"] }
clap = { workspace = true, features = ["derive", "string"] }
collab_ui = { workspace = true, features = ["stories"] }
ctrlc = "3.4"

View File

@@ -12,6 +12,7 @@ use ui::prelude::*;
#[derive(Debug, PartialEq, Eq, Clone, Copy, strum::Display, EnumString, EnumIter)]
#[strum(serialize_all = "snake_case")]
pub enum ComponentStory {
AssistantChatMessage,
AutoHeightEditor,
Avatar,
Button,
@@ -42,6 +43,9 @@ pub enum ComponentStory {
impl ComponentStory {
pub fn story(&self, cx: &mut WindowContext) -> AnyView {
match self {
Self::AssistantChatMessage => {
cx.new_view(|_cx| assistant2::ui::ChatMessageStory).into()
}
Self::AutoHeightEditor => AutoHeightEditorStory::new(cx).into(),
Self::Avatar => cx.new_view(|_| ui::AvatarStory).into(),
Self::Button => cx.new_view(|_| ui::ButtonStory).into(),

View File

@@ -447,8 +447,6 @@ pub(crate) fn motion(motion: Motion, cx: &mut WindowContext) {
vim.clear_operator(cx);
if let Some(operator) = waiting_operator {
vim.push_operator(operator, cx);
dbg!(count);
vim.update_state(|state| state.pre_count = count)
}
});
}
@@ -757,7 +755,7 @@ impl Motion {
},
NextLineStart => (next_line_start(map, point, times), SelectionGoal::None),
StartOfLineDownward => (next_line_start(map, point, times - 1), SelectionGoal::None),
EndOfLineDownward => (last_non_whitespace(map, point, times), SelectionGoal::None),
EndOfLineDownward => (next_line_end(map, point, times), SelectionGoal::None),
GoToColumn => (go_to_column(map, point, times), SelectionGoal::None),
WindowTop => window_top(map, point, &text_layout_details, times - 1),
WindowMiddle => window_middle(map, point, &text_layout_details),
@@ -1424,28 +1422,6 @@ pub(crate) fn first_non_whitespace(
start_offset.to_display_point(map)
}
pub(crate) fn last_non_whitespace(
map: &DisplaySnapshot,
from: DisplayPoint,
count: usize,
) -> DisplayPoint {
let mut end_of_line = end_of_line(map, false, from, count).to_offset(map, Bias::Left);
let scope = map.buffer_snapshot.language_scope_at(from.to_point(map));
dbg!(end_of_line);
for (ch, offset) in map.reverse_buffer_chars_at(end_of_line) {
if ch == '\n' {
break;
}
end_of_line = offset;
dbg!(ch, offset);
if char_kind(&scope, ch) != CharKind::Whitespace || ch == '\n' {
break;
}
}
end_of_line.to_display_point(map)
}
pub(crate) fn start_of_line(
map: &DisplaySnapshot,
display_lines: bool,
@@ -1923,16 +1899,6 @@ mod test {
cx.assert_shared_state("one\n ˇtwo\nthree").await;
}
#[gpui::test]
async fn test_end_of_line_downward(cx: &mut gpui::TestAppContext) {
let mut cx = NeovimBackedTestContext::new(cx).await;
cx.set_shared_state("ˇ one \n two \nthree").await;
cx.simulate_shared_keystrokes(["g", "_"]).await;
cx.assert_shared_state(" onˇe \n two \nthree").await;
cx.simulate_shared_keystrokes(["2", "g", "_"]).await;
cx.assert_shared_state(" one \n twˇo \nthree").await;
}
#[gpui::test]
async fn test_window_top(cx: &mut gpui::TestAppContext) {
let mut cx = NeovimBackedTestContext::new(cx).await;

View File

@@ -1,9 +1,4 @@
use crate::{
motion::{self, Motion},
object::Object,
state::Mode,
Vim,
};
use crate::{motion::Motion, object::Object, state::Mode, Vim};
use editor::{movement, scroll::Autoscroll, Bias};
use gpui::WindowContext;
use language::BracketPair;
@@ -28,7 +23,6 @@ impl<'de> Deserialize<'de> for SurroundsType {
pub fn add_surrounds(text: Arc<str>, target: SurroundsType, cx: &mut WindowContext) {
Vim::update(cx, |vim, cx| {
vim.stop_recording();
let count = vim.take_count(cx);
vim.update_active_editor(cx, |_, editor, cx| {
let text_layout_details = editor.text_layout_details(cx);
editor.transact(cx, |editor, cx| {
@@ -58,26 +52,22 @@ pub fn add_surrounds(text: Arc<str>, target: SurroundsType, cx: &mut WindowConte
.range(
&display_map,
selection.clone(),
count,
Some(1),
true,
&text_layout_details,
)
.map(|mut range| {
// The Motion::CurrentLine operation will contain the newline of the current line and leading/trailing whitespace
// The Motion::CurrentLine operation will contain the newline of the current line,
// so we need to deal with this edge case
if let Motion::CurrentLine = motion {
range.start = motion::first_non_whitespace(
&display_map,
false,
range.start,
);
range.end = movement::saturating_right(
&display_map,
motion::last_non_whitespace(
&display_map,
movement::left(&display_map, range.end),
1,
),
);
let offset = range.end.to_offset(&display_map, Bias::Left);
if let Some((last_ch, _)) =
display_map.reverse_buffer_chars_at(offset).next()
{
if last_ch == '\n' {
range.end = movement::left(&display_map, range.end);
}
}
}
range
});
@@ -637,30 +627,6 @@ mod test {
the lazy dog."},
Mode::Normal,
);
cx.set_state(
indoc! {"
The quˇick brown•
fox jumps over
the lazy dog."},
Mode::Normal,
);
cx.simulate_keystrokes(["y", "s", "s", "{"]);
cx.assert_state(
indoc! {"
ˇ{ The quick brown }•
fox jumps over
the lazy dog."},
Mode::Normal,
);
cx.simulate_keystrokes(["2", "y", "s", "s", ")"]);
cx.assert_state(
indoc! {"
ˇ({ The quick brown }•
fox jumps over)
the lazy dog."},
Mode::Normal,
);
}
#[gpui::test]

View File

@@ -526,7 +526,7 @@ impl Vim {
| Operator::ChangeSurrounds { .. }
| Operator::DeleteSurrounds
) {
self.update_state(|state| state.operator_stack.clear());
self.clear_operator(cx);
};
self.update_state(|state| state.operator_stack.push(operator));
self.sync_vim_settings(cx);

View File

@@ -1,8 +0,0 @@
{"Put":{"state":"ˇ one \n two \nthree"}}
{"Key":"g"}
{"Key":"_"}
{"Get":{"state":" onˇe \n two \nthree","mode":"Normal"}}
{"Key":"2"}
{"Key":"g"}
{"Key":"_"}
{"Get":{"state":" one \n twˇo \nthree","mode":"Normal"}}

View File

@@ -1891,6 +1891,24 @@ impl Pane {
let mut to_pane = cx.view().clone();
let mut split_direction = self.drag_split_direction;
let paths = paths.paths().to_vec();
let is_remote = self
.workspace
.update(cx, |workspace, cx| {
if workspace.project().read(cx).is_remote() {
workspace.show_error(
&anyhow::anyhow!("Cannot drop files on a remote project"),
cx,
);
true
} else {
false
}
})
.unwrap_or(true);
if is_remote {
return;
}
self.workspace
.update(cx, |workspace, cx| {
let fs = Arc::clone(workspace.project().read(cx).fs());

View File

@@ -40,37 +40,48 @@ cargo build --release --target "${target_triple}" --package zed
# Later, we probably want to do something like this: https://github.com/GabrielMajeri/separate-symbols
strip "target/${target_triple}/release/Zed"
suffix=""
if [ "$channel" != "stable" ]; then
suffix="-$channel"
fi
# Move everything that should end up in the final package
# into a temp directory.
temp_dir=$(mktemp -d)
zed_dir="${temp_dir}/zed$suffix.app"
# Binary
mkdir "${temp_dir}/bin"
cp "target/${target_triple}/release/Zed" "${temp_dir}/bin/zed"
mkdir -p "${zed_dir}/bin"
cp "target/${target_triple}/release/Zed" "${zed_dir}/zed"
# Icons
mkdir -p "${temp_dir}/share/icons/hicolor/512x512/apps"
cp "crates/zed/resources/app-icon-nightly.png" "${temp_dir}/share/icons/hicolor/512x512/apps/zed.png"
mkdir -p "${temp_dir}/share/icons/hicolor/1024x1024/apps"
cp "crates/zed/resources/app-icon-nightly@2x.png" "${temp_dir}/share/icons/hicolor/1024x1024/apps/zed.png"
mkdir -p "${zed_dir}/share/icons/hicolor/512x512/apps"
cp "crates/zed/resources/app-icon$suffix.png" "${zed_dir}/share/icons/hicolor/512x512/apps/zed.png"
mkdir -p "${zed_dir}/share/icons/hicolor/1024x1024/apps"
cp "crates/zed/resources/app-icon$suffix/share/icons/hicolor/1024x1024/apps/zed.png"
# .desktop
mkdir -p "${temp_dir}/share/applications"
cp "crates/zed/resources/zed.desktop" "${temp_dir}/share/applications/zed.desktop"
mkdir -p "${zed_dir}/share/applications"
cp "crates/zed/resources/zed$suffix.desktop" "${zed_dir}/share/applications/zed$suffix.desktop"
if [[ "$channel" == "preview" ]]; then
sed -i "s|Name=Zed|Name=Zed Preview|g" "${zed_dir}/share/applications/zed$suffix.desktop"
elif [[ "$channel" == "nightly" ]]; then
sed -i "s|Name=Zed|Name=Zed Nightly|g" "${zed_dir}/share/applications/zed$suffix.desktop"
fi
# Licenses
cp "assets/licenses.md" "${temp_dir}/licenses.md"
cp "assets/licenses.md" "${zed_dir}/licenses.md"
# Create archive out of everything that's in the temp directory
target="linux-$(uname -m)"
if [[ "$channel" == "nightly" ]]; then
archive="zed-${channel}-${target_triple}.tar.gz"
archive="zed-${target}.tar.gz"
elif [[ "$channel" == "dev" ]]; then
archive="zed-${channel}-${commit}-${target_triple}.tar.gz"
archive="zed-${commit}-${target}.tar.gz"
else
archive="zed-${version}-${target_triple}.tar.gz"
archive="zed-${target}.tar.gz"
fi
rm -rf "${archive}"
tar -czvf $archive -C ${temp_dir} .
tar -czvf "$archive" -C "${temp_dir}" "zed$suffix.app"
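A quick way to sanity-check the archive this script produces is to list its top-level contents; the filename below is illustrative only (it varies by channel, commit, and architecture, as computed above):

```sh
# List the bundle contents; there should be a single zed$suffix.app directory
# containing bin/, share/icons/, share/applications/, and licenses.md.
tar -tzf zed-linux-x86_64.tar.gz | head -n 10
```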

script/install.sh (90 lines changed) Normal file → Executable file
View File

@@ -1 +1,89 @@
#!/bin/sh
#!/usr/bin/env bash
set -euo pipefail
main() {
platform="$(uname -s)"
arch="$(uname -m)"
channel="stable"
temp="$(mktemp -d "/tmp/zed-XXXXX")"
if [[ $platform == "Darwin" ]]; then
platform="macos"
elif [[ $platform == "Linux" ]]; then
platform="linux"
channel="nightly"
else
echo "Unsupported platform $platform"
exit 1
fi
if [[ $platform == "macos" ]] && [[ $arch == arm64* ]]; then
arch="aarch64"
elif [[ $arch = x86* || $arch == i686* ]]; then
arch="x86_64"
else
echo "Unsupported architecture $arch"
exit 1
fi
if which curl >/dev/null 2>&1; then
curl () {
command curl -fL "$@"
}
elif which wget >/dev/null 2>&1; then
curl () {
wget -O- "$@"
}
else
echo "Could not find 'curl' or 'wget' in your path"
exit 1
fi
"$platform" "$@"
}
linux() {
echo "Downloading Zed"
curl "https://zed.dev/api/releases/$channel/latest/zed-linux-$arch.tar.gz" > "$temp/zed-linux-$arch.tar.gz"
suffix=""
if [[ $channel != "stable" ]]; then
suffix="-$channel"
fi
mkdir -p "$HOME/.local/zed$suffix.app"
tar -xzf "$temp/zed-linux-$arch.tar.gz" -C "$HOME/.local/"
mkdir -p "$HOME/.local/bin" "$HOME/.local/share/applications"
ln -sf ~/.local/zed$suffix.app/bin/zed ~/.local/bin/
cp ~/.local/zed$suffix.app/share/applications/zed$suffix.desktop ~/.local/share/applications/
sed -i "s|Icon=zed|Icon=$HOME/.local/zed$suffix.app/share/icons/hicolor/512x512/apps/zed.png|g" ~/.local/share/applications/zed$suffix.desktop
sed -i "s|Exec=zed|Exec=$HOME/.local/zed$suffix.app/bin/zed|g" ~/.local/share/applications/zed.desktop
if which zed >/dev/null 2>&1; then
echo "zed successfully installed. You can run it with 'zed'"
else
echo "To run zed from your terminal, you must add ~/.local/bin to your PATH"
echo "Run:"
echo " echo 'export PATH=\$HOME/.local/bin:\$PATH' >> ~/.bashrc"
echo " source ~/.bashrc"
echo "Otherwise run '~/.local/bin/zed'"
fi
}
macos() {
echo "Downloading Zed"
curl "https://zed.dev/api/releases/$channel/latest/Zed-$arch.dmg" > "$temp/Zed-$arch.dmg"
hdiutil attach -quiet "$temp/Zed-$arch.dmg" -mountpoint "$temp/mount"
app="$(cd "$temp/mount/"; echo *.app)"
echo "Installing $app"
if [[ -d "/Applications/$app" ]]; then
echo "Removing existing $app"
rm -rf "/Applications/$app"
fi
ditto -v "$temp/mount/$app" "/Applications/$app"
hdiutil detach -quiet "$temp/mount"
echo "Zed is installed. You can run it with 'open /Applications/$app'"
}
main "$@"

View File

@@ -13,6 +13,8 @@ maysudo=$(command -v sudo || command -v doas || true)
apt=$(command -v apt-get || true)
if [[ -n $apt ]]; then
deps=(
gcc
g++
libasound2-dev
libfontconfig-dev
libwayland-dev