Compare commits

...

10 Commits

Author SHA1 Message Date
Nathan Sobo
d104df955e WIP 2025-06-04 09:50:06 -07:00
Mikayla Maki
45de5c080a remove old comment 2025-05-31 22:17:44 -07:00
Mikayla Maki
03bed065e7 Slightly optimize the noop reordering case 2025-05-31 22:15:51 -07:00
Mikayla Maki
a3e7c4e961 Add a test showing that the channels can be in any order 2025-05-31 21:09:45 -07:00
Mikayla Maki
24ccd8268d Add missing channel moving tests
Fix a bug in the channel reordering when moving a channel into another
2025-05-31 20:38:50 -07:00
Mikayla Maki
9168a94c6a Remove unnecessary complexity.
Fix a test that wasn't actually testing anything when ordering
constraints were removed.
2025-05-31 19:55:15 -07:00
Mikayla Maki
b96463e5bb remove unnecessary channel ordering 2025-05-31 19:31:44 -07:00
Nathan Sobo
0e12e31edc Fix channel reordering to return only swapped channels
The reorder_channel function was returning all sibling channels instead
of just the two that were swapped, causing test failures that expected
exactly 2 channels to be returned.

This change modifies the function to return only the two channels that
had their order values swapped, which is more efficient and matches
the test expectations.
2025-05-31 15:53:45 -06:00
Nathan Sobo
a8287e4289 Fix CollabPanel channel reordering keybindings
The channel reordering keybindings (Cmd/Ctrl+Up/Down) were non-functional because the CollabPanel wasn't properly setting the key context to distinguish between editing and non-editing states.

## Problem

The keymaps specified 'CollabPanel && not_editing' as the required context for the reordering actions, but CollabPanel was only setting 'CollabPanel' without the editing state modifier. This meant the keybindings were never matched.

## Solution

Added a dispatch_context method to CollabPanel that:
- Checks if the channel_name_editor is focused
- Adds 'editing' context when renaming/creating channels
- Adds 'not_editing' context during normal navigation
- Follows the same pattern used by ProjectPanel and OutlinePanel

This allows the keybindings to work correctly while preventing conflicts between text editing operations (where arrows move the cursor) and channel navigation operations (where arrows reorder channels).
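The matching failure described above can be illustrated with a self-contained sketch. This uses plain string contexts and a hypothetical `context_matches` helper, not the actual GPUI key-dispatch API: an expression like `CollabPanel && not_editing` only matches when every named identifier is present in the panel's context set, so a panel that provides only `CollabPanel` never satisfies it.

```rust
use std::collections::HashSet;

// Sketch of conjunctive key-context matching (hypothetical helper; GPUI's
// real key-context machinery is more involved). Every identifier joined
// by `&&` must be present in the panel's context set for a match.
fn context_matches(expression: &str, context: &HashSet<&str>) -> bool {
    expression
        .split("&&")
        .map(str::trim)
        .all(|ident| context.contains(ident))
}

fn main() {
    // Before the fix: the panel only set "CollabPanel".
    let old_context: HashSet<&str> = ["CollabPanel"].into_iter().collect();
    // After the fix: dispatch_context also adds the editing state.
    let new_context: HashSet<&str> =
        ["CollabPanel", "not_editing"].into_iter().collect();

    assert!(!context_matches("CollabPanel && not_editing", &old_context));
    assert!(context_matches("CollabPanel && not_editing", &new_context));
}
```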
2025-05-31 14:49:51 -06:00
Nathan Sobo
8019a3925d Add channel reordering functionality
This change introduces the ability for channel administrators to reorder channels within their parent context, providing better organizational control over channel hierarchies. Channels can now be moved up or down relative to their siblings through keyboard shortcuts.

## Problem

Previously, channels were displayed in alphabetical order with no way to customize their arrangement. This made it difficult for teams to organize channels in a logical order that reflected their workflow or importance, forcing users to prefix channel names with numbers or special characters as a workaround.

## Solution

The implementation adds a persistent `channel_order` field to channels that determines their display order within their parent. Channels with the same parent are sorted by this field rather than alphabetically.
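The new sort key can be sketched with a plain in-memory type (hypothetical; the real sorting lives in `ChannelIndex` and uses the parent path as a prefix): siblings compare by `channel_order`, with the channel id as a tie-breaker, so names no longer influence position.

```rust
// Sketch of the new sibling ordering: sort by channel_order, then id,
// instead of alphabetically by name. The struct here is a hypothetical
// stand-in for the real Channel type.
#[derive(Debug)]
struct Channel {
    id: u64,
    name: &'static str,
    channel_order: i32,
}

fn sort_siblings(siblings: &mut [Channel]) {
    siblings.sort_by_key(|c| (c.channel_order, c.id));
}

fn main() {
    // "beta" has a lower order than "alpha", so it sorts first even
    // though it is alphabetically later.
    let mut siblings = vec![
        Channel { id: 2, name: "alpha", channel_order: 2 },
        Channel { id: 1, name: "beta", channel_order: 1 },
    ];
    sort_siblings(&mut siblings);
    let names: Vec<_> = siblings.iter().map(|c| c.name).collect();
    assert_eq!(names, vec!["beta", "alpha"]);
}
```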

## Technical Implementation

**Database Schema:**
- Added `channel_order` INTEGER column to the channels table with a default value of 1
- Created a compound index on `(parent_path, channel_order)` for efficient ordering queries
- Migration updates existing channels with deterministic ordering based on their current alphabetical position

**Core Changes:**
- Extended the `Channel` struct across the codebase to include the `channel_order` field
- Modified `ChannelIndex` sorting logic to use channel_order instead of alphabetical name comparison
- Channels now maintain their relative position among siblings

**RPC Protocol:**
- Added `ReorderChannel` RPC message with channel_id and direction (Up/Down)
- Extended existing Channel proto message to include the channel_order field
- Server validates that users have admin permissions before allowing reordering

**Reordering Logic:**
- When moving a channel up/down, the server finds the adjacent sibling in that direction
- Swaps the `channel_order` values between the two channels
- Broadcasts the updated channel list to all affected users
- Handles edge cases: channels at boundaries, channels with gaps in ordering
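The swap at the heart of these steps can be sketched with plain in-memory values (hypothetical types; the real implementation runs inside a database transaction). Note how finding the *nearest* order in the given direction handles gaps, and how a missing sibling at a boundary becomes a no-op:

```rust
// Minimal sketch of reorder-by-swap on an in-memory slice. Types are
// hypothetical stand-ins for the real Channel model.
#[derive(Debug, Clone, PartialEq)]
struct Channel {
    id: u64,
    parent: Option<u64>,
    order: i32,
}

#[derive(Clone, Copy)]
enum Direction {
    Up,
    Down,
}

/// Swap `order` with the adjacent sibling in `direction`.
/// Returns the ids of the two changed channels, or None at a boundary.
fn reorder(channels: &mut [Channel], id: u64, direction: Direction) -> Option<(u64, u64)> {
    let current = channels.iter().position(|c| c.id == id)?;
    let (parent, order) = (channels[current].parent, channels[current].order);
    // Nearest sibling above (largest smaller order) or below (smallest
    // larger order); gaps in the ordering are tolerated naturally.
    let sibling = channels
        .iter()
        .enumerate()
        .filter(|(_, c)| c.parent == parent && c.id != id)
        .filter(|(_, c)| match direction {
            Direction::Up => c.order < order,
            Direction::Down => c.order > order,
        })
        .min_by_key(|(_, c)| (c.order - order).abs())
        .map(|(ix, _)| ix)?;
    let sibling_id = channels[sibling].id;
    let sibling_order = channels[sibling].order;
    channels[sibling].order = order;
    channels[current].order = sibling_order;
    Some((id, sibling_id))
}

fn main() {
    let mut chs = vec![
        Channel { id: 1, parent: None, order: 1 },
        Channel { id: 2, parent: None, order: 2 },
        Channel { id: 3, parent: None, order: 4 }, // gap after order 2
    ];
    // Moving id 3 up swaps with id 2 despite the gap in order values.
    assert_eq!(reorder(&mut chs, 3, Direction::Up), Some((3, 2)));
    assert_eq!(chs[1].order, 4);
    assert_eq!(chs[2].order, 2);
    // Moving the first channel up is a no-op at the boundary.
    assert_eq!(reorder(&mut chs, 1, Direction::Up), None);
}
```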

**User Interface:**
- Added keyboard shortcuts:
  - macOS: `Cmd+Up/Down` to move channels
  - Linux: `Ctrl+Up/Down` to move channels
- Added `MoveChannelUp` and `MoveChannelDown` actions to the collab panel
- Reordering is only available when a channel is selected (not during editing)

**Error Handling:**
The change also improves error handling practices throughout the codebase by ensuring that errors from async operations propagate correctly to the UI layer instead of being silently discarded with `let _ =`.

**Testing:**
Comprehensive tests were added to verify:
- Basic reordering functionality up and down
- Boundary conditions (first/last channels)
- Permission checks (non-admins cannot reorder)
- Ordering persistence across server restarts
- Correct broadcasting of changes to channel members

**Known Issue:**
The keybindings are currently non-functional due to missing context setup in CollabPanel. The keymaps expect 'CollabPanel && not_editing' context, but the panel only provides 'CollabPanel'. This will be fixed in a follow-up commit.
2025-05-31 14:45:13 -06:00
22 changed files with 2162 additions and 70 deletions

6
.rules
View File

@@ -5,6 +5,12 @@
* Prefer implementing functionality in existing files unless it is a new logical component. Avoid creating many small files.
* Avoid using functions that panic like `unwrap()`, instead use mechanisms like `?` to propagate errors.
* Be careful with operations like indexing which may panic if the indexes are out of bounds.
* Never silently discard errors with `let _ =` on fallible operations. Always handle errors appropriately:
- Propagate errors with `?` when the calling function should handle them
- Use `.log_err()` or similar when you need to ignore errors but want visibility
- Use explicit error handling with `match` or `if let Err(...)` when you need custom logic
- Example: avoid `let _ = client.request(...).await?;` - use `client.request(...).await?;` instead
* When implementing async operations that may fail, ensure errors propagate to the UI layer so users get meaningful feedback.
* Never create files with `mod.rs` paths - prefer `src/some_module.rs` instead of `src/some_module/mod.rs`.
# GPUI

View File

@@ -897,7 +897,9 @@
"context": "CollabPanel && not_editing",
"bindings": {
"ctrl-backspace": "collab_panel::Remove",
"space": "menu::Confirm"
"space": "menu::Confirm",
"ctrl-up": "collab_panel::MoveChannelUp",
"ctrl-down": "collab_panel::MoveChannelDown"
}
},
{

View File

@@ -951,7 +951,9 @@
"use_key_equivalents": true,
"bindings": {
"ctrl-backspace": "collab_panel::Remove",
"space": "menu::Confirm"
"space": "menu::Confirm",
"cmd-up": "collab_panel::MoveChannelUp",
"cmd-down": "collab_panel::MoveChannelDown"
}
},
{

View File

@@ -56,6 +56,7 @@ pub struct Channel {
pub name: SharedString,
pub visibility: proto::ChannelVisibility,
pub parent_path: Vec<ChannelId>,
pub channel_order: i32,
}
#[derive(Default, Debug)]
@@ -614,7 +615,24 @@ impl ChannelStore {
to: to.0,
})
.await?;
Ok(())
})
}
pub fn reorder_channel(
&mut self,
channel_id: ChannelId,
direction: proto::reorder_channel::Direction,
cx: &mut Context<Self>,
) -> Task<Result<()>> {
let client = self.client.clone();
cx.spawn(async move |_, _| {
client
.request(proto::ReorderChannel {
channel_id: channel_id.0,
direction: direction.into(),
})
.await?;
Ok(())
})
}
@@ -1026,6 +1044,18 @@ impl ChannelStore {
});
}
#[cfg(any(test, feature = "test-support"))]
pub fn reset(&mut self) {
self.channel_invitations.clear();
self.channel_index.clear();
self.channel_participants.clear();
self.outgoing_invites.clear();
self.opened_buffers.clear();
self.opened_chats.clear();
self.disconnect_channel_buffers_task = None;
self.channel_states.clear();
}
pub(crate) fn update_channels(
&mut self,
payload: proto::UpdateChannels,
@@ -1050,6 +1080,7 @@ impl ChannelStore {
visibility: channel.visibility(),
name: channel.name.into(),
parent_path: channel.parent_path.into_iter().map(ChannelId).collect(),
channel_order: channel.channel_order,
}),
),
}

View File

@@ -61,11 +61,13 @@ impl ChannelPathsInsertGuard<'_> {
ret = existing_channel.visibility != channel_proto.visibility()
|| existing_channel.name != channel_proto.name
|| existing_channel.parent_path != parent_path;
|| existing_channel.parent_path != parent_path
|| existing_channel.channel_order != channel_proto.channel_order;
existing_channel.visibility = channel_proto.visibility();
existing_channel.name = channel_proto.name.into();
existing_channel.parent_path = parent_path;
existing_channel.channel_order = channel_proto.channel_order;
} else {
self.channels_by_id.insert(
ChannelId(channel_proto.id),
@@ -74,6 +76,7 @@ impl ChannelPathsInsertGuard<'_> {
visibility: channel_proto.visibility(),
name: channel_proto.name.into(),
parent_path,
channel_order: channel_proto.channel_order,
}),
);
self.insert_root(ChannelId(channel_proto.id));
@@ -100,17 +103,18 @@ impl Drop for ChannelPathsInsertGuard<'_> {
fn channel_path_sorting_key(
id: ChannelId,
channels_by_id: &BTreeMap<ChannelId, Arc<Channel>>,
) -> impl Iterator<Item = (&str, ChannelId)> {
let (parent_path, name) = channels_by_id
.get(&id)
.map_or((&[] as &[_], None), |channel| {
(
channel.parent_path.as_slice(),
Some((channel.name.as_ref(), channel.id)),
)
});
) -> impl Iterator<Item = (i32, ChannelId)> {
let (parent_path, order_and_id) =
channels_by_id
.get(&id)
.map_or((&[] as &[_], None), |channel| {
(
channel.parent_path.as_slice(),
Some((channel.channel_order, channel.id)),
)
});
parent_path
.iter()
.filter_map(|id| Some((channels_by_id.get(id)?.name.as_ref(), *id)))
.chain(name)
.filter_map(|id| Some((channels_by_id.get(id)?.channel_order, *id)))
.chain(order_and_id)
}

View File

@@ -21,12 +21,14 @@ fn test_update_channels(cx: &mut App) {
name: "b".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: Vec::new(),
channel_order: 1,
},
proto::Channel {
id: 2,
name: "a".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: Vec::new(),
channel_order: 2,
},
],
..Default::default()
@@ -37,8 +39,8 @@ fn test_update_channels(cx: &mut App) {
&channel_store,
&[
//
(0, "a".to_string()),
(0, "b".to_string()),
(0, "a".to_string()),
],
cx,
);
@@ -52,12 +54,14 @@ fn test_update_channels(cx: &mut App) {
name: "x".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![1],
channel_order: 1,
},
proto::Channel {
id: 4,
name: "y".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![2],
channel_order: 1,
},
],
..Default::default()
@@ -67,15 +71,111 @@ fn test_update_channels(cx: &mut App) {
assert_channels(
&channel_store,
&[
(0, "a".to_string()),
(1, "y".to_string()),
(0, "b".to_string()),
(1, "x".to_string()),
(0, "a".to_string()),
(1, "y".to_string()),
],
cx,
);
}
#[gpui::test]
fn test_update_channels_order_independent(cx: &mut App) {
/// Based on: https://stackoverflow.com/a/59939809
fn unique_permutations<T: Clone>(items: Vec<T>) -> Vec<Vec<T>> {
if items.len() == 1 {
vec![items]
} else {
let mut output: Vec<Vec<T>> = vec![];
for (ix, first) in items.iter().enumerate() {
let mut remaining_elements = items.clone();
remaining_elements.remove(ix);
for mut permutation in unique_permutations(remaining_elements) {
permutation.insert(0, first.clone());
output.push(permutation);
}
}
output
}
}
let test_data = vec![
proto::Channel {
id: 6,
name: "β".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![1, 3],
channel_order: 1,
},
proto::Channel {
id: 5,
name: "α".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![1],
channel_order: 2,
},
proto::Channel {
id: 3,
name: "x".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![1],
channel_order: 1,
},
proto::Channel {
id: 4,
name: "y".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![2],
channel_order: 1,
},
proto::Channel {
id: 1,
name: "b".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: Vec::new(),
channel_order: 1,
},
proto::Channel {
id: 2,
name: "a".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: Vec::new(),
channel_order: 2,
},
];
let channel_store = init_test(cx);
let permutations = unique_permutations(test_data);
for test_instance in permutations {
channel_store.update(cx, |channel_store, _| channel_store.reset());
update_channels(
&channel_store,
proto::UpdateChannels {
channels: test_instance,
..Default::default()
},
cx,
);
assert_channels(
&channel_store,
&[
(0, "b".to_string()),
(1, "x".to_string()),
(2, "β".to_string()),
(1, "α".to_string()),
(0, "a".to_string()),
(1, "y".to_string()),
],
cx,
);
}
}
#[gpui::test]
fn test_dangling_channel_paths(cx: &mut App) {
let channel_store = init_test(cx);
@@ -89,18 +189,21 @@ fn test_dangling_channel_paths(cx: &mut App) {
name: "a".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![],
channel_order: 1,
},
proto::Channel {
id: 1,
name: "b".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![0],
channel_order: 1,
},
proto::Channel {
id: 2,
name: "c".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![0, 1],
channel_order: 1,
},
],
..Default::default()
@@ -147,6 +250,7 @@ async fn test_channel_messages(cx: &mut TestAppContext) {
name: "the-channel".to_string(),
visibility: proto::ChannelVisibility::Members as i32,
parent_path: vec![],
channel_order: 1,
}],
..Default::default()
});

View File

@@ -266,11 +266,14 @@ CREATE TABLE "channels" (
"created_at" TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
"visibility" VARCHAR NOT NULL,
"parent_path" TEXT NOT NULL,
"requires_zed_cla" BOOLEAN NOT NULL DEFAULT FALSE
"requires_zed_cla" BOOLEAN NOT NULL DEFAULT FALSE,
"channel_order" INTEGER NOT NULL DEFAULT 1
);
CREATE INDEX "index_channels_on_parent_path" ON "channels" ("parent_path");
CREATE INDEX "index_channels_on_parent_path_and_order" ON "channels" ("parent_path", "channel_order");
CREATE TABLE IF NOT EXISTS "channel_chat_participants" (
"id" INTEGER PRIMARY KEY AUTOINCREMENT,
"user_id" INTEGER NOT NULL REFERENCES users (id),

View File

@@ -0,0 +1,16 @@
-- Add channel_order column to channels table with default value
ALTER TABLE channels ADD COLUMN channel_order INTEGER NOT NULL DEFAULT 1;
-- Update channel_order for existing channels using ROW_NUMBER for deterministic ordering
UPDATE channels
SET channel_order = (
SELECT ROW_NUMBER() OVER (
PARTITION BY parent_path
ORDER BY name, id
)
FROM channels c2
WHERE c2.id = channels.id
);
-- Create index for efficient ordering queries
CREATE INDEX "index_channels_on_parent_path_and_order" ON "channels" ("parent_path", "channel_order");

View File

@@ -582,6 +582,7 @@ pub struct Channel {
pub visibility: ChannelVisibility,
/// parent_path is the channel ids from the root to this one (not including this one)
pub parent_path: Vec<ChannelId>,
pub channel_order: i32,
}
impl Channel {
@@ -591,6 +592,7 @@ impl Channel {
visibility: value.visibility,
name: value.clone().name,
parent_path: value.ancestors().collect(),
channel_order: value.channel_order,
}
}
@@ -600,8 +602,13 @@ impl Channel {
name: self.name.clone(),
visibility: self.visibility.into(),
parent_path: self.parent_path.iter().map(|c| c.to_proto()).collect(),
channel_order: self.channel_order,
}
}
pub fn root_id(&self) -> ChannelId {
self.parent_path.first().copied().unwrap_or(self.id)
}
}
#[derive(Debug, PartialEq, Eq, Hash)]

View File

@@ -4,7 +4,7 @@ use rpc::{
ErrorCode, ErrorCodeExt,
proto::{ChannelBufferVersion, VectorClockEntry, channel_member::Kind},
};
use sea_orm::{DbBackend, TryGetableMany};
use sea_orm::{ActiveValue, DbBackend, TryGetableMany};
impl Database {
#[cfg(test)]
@@ -59,16 +59,28 @@ impl Database {
parent = Some(parent_channel);
}
let parent_path = parent
.as_ref()
.map_or(String::new(), |parent| parent.path());
// Find the maximum channel_order among siblings to set the new channel at the end
let max_order = max_order(&parent_path, &tx).await?;
log::info!(
"Creating channel '{}' with parent_path='{}', max_order={}, new_order={}",
name,
parent_path,
max_order,
max_order + 1
);
let channel = channel::ActiveModel {
id: ActiveValue::NotSet,
name: ActiveValue::Set(name.to_string()),
visibility: ActiveValue::Set(ChannelVisibility::Members),
parent_path: ActiveValue::Set(
parent
.as_ref()
.map_or(String::new(), |parent| parent.path()),
),
parent_path: ActiveValue::Set(parent_path),
requires_zed_cla: ActiveValue::NotSet,
channel_order: ActiveValue::Set(max_order + 1),
}
.insert(&*tx)
.await?;
@@ -531,11 +543,7 @@ impl Database {
.get_channel_descendants_excluding_self(channels.iter(), tx)
.await?;
for channel in channels {
if let Err(ix) = descendants.binary_search_by_key(&channel.path(), |c| c.path()) {
descendants.insert(ix, channel);
}
}
descendants.extend(channels);
let roles_by_channel_id = channel_memberships
.iter()
@@ -952,11 +960,14 @@ impl Database {
}
let root_id = channel.root_id();
let new_parent_path = new_parent.path();
let old_path = format!("{}{}/", channel.parent_path, channel.id);
let new_path = format!("{}{}/", new_parent.path(), channel.id);
let new_path = format!("{}{}/", &new_parent_path, channel.id);
let new_order = max_order(&new_parent_path, &tx).await? + 1;
let mut model = channel.into_active_model();
model.parent_path = ActiveValue::Set(new_parent.path());
model.channel_order = ActiveValue::Set(new_order);
let channel = model.update(&*tx).await?;
let descendent_ids =
@@ -986,6 +997,132 @@ impl Database {
})
.await
}
pub async fn reorder_channel(
&self,
channel_id: ChannelId,
direction: proto::reorder_channel::Direction,
user_id: UserId,
) -> Result<Vec<Channel>> {
self.transaction(|tx| async move {
let mut channel = self.get_channel_internal(channel_id, &tx).await?;
log::info!(
"Reordering channel {} (parent_path: '{}', order: {})",
channel.id,
channel.parent_path,
channel.channel_order
);
// Check if user is admin of the channel
self.check_user_is_channel_admin(&channel, user_id, &tx)
.await?;
// Find the sibling channel to swap with
let sibling_channel = match direction {
proto::reorder_channel::Direction::Up => {
log::info!(
"Looking for sibling with parent_path='{}' and order < {}",
channel.parent_path,
channel.channel_order
);
// Find channel with highest order less than current
channel::Entity::find()
.filter(
channel::Column::ParentPath
.eq(&channel.parent_path)
.and(channel::Column::ChannelOrder.lt(channel.channel_order)),
)
.order_by_desc(channel::Column::ChannelOrder)
.one(&*tx)
.await?
}
proto::reorder_channel::Direction::Down => {
log::info!(
"Looking for sibling with parent_path='{}' and order > {}",
channel.parent_path,
channel.channel_order
);
// Find channel with lowest order greater than current
channel::Entity::find()
.filter(
channel::Column::ParentPath
.eq(&channel.parent_path)
.and(channel::Column::ChannelOrder.gt(channel.channel_order)),
)
.order_by_asc(channel::Column::ChannelOrder)
.one(&*tx)
.await?
}
};
let mut sibling_channel = match sibling_channel {
Some(sibling) => {
log::info!(
"Found sibling {} (parent_path: '{}', order: {})",
sibling.id,
sibling.parent_path,
sibling.channel_order
);
sibling
}
None => {
log::warn!("No sibling found to swap with");
// No sibling to swap with
return Ok(vec![]);
}
};
let current_order = channel.channel_order;
let sibling_order = sibling_channel.channel_order;
channel::ActiveModel {
id: ActiveValue::Unchanged(sibling_channel.id),
channel_order: ActiveValue::Set(current_order),
..Default::default()
}
.update(&*tx)
.await?;
sibling_channel.channel_order = current_order;
channel::ActiveModel {
id: ActiveValue::Unchanged(channel.id),
channel_order: ActiveValue::Set(sibling_order),
..Default::default()
}
.update(&*tx)
.await?;
channel.channel_order = sibling_order;
log::info!(
"Reorder complete. Swapped channels {} and {}",
channel.id,
sibling_channel.id
);
let swapped_channels = vec![
Channel::from_model(channel),
Channel::from_model(sibling_channel),
];
Ok(swapped_channels)
})
.await
}
}
async fn max_order(parent_path: &str, tx: &TransactionHandle) -> Result<i32> {
let max_order = channel::Entity::find()
.filter(channel::Column::ParentPath.eq(parent_path))
.select_only()
.column_as(channel::Column::ChannelOrder.max(), "max_order")
.into_tuple::<Option<i32>>()
.one(&**tx)
.await?
.flatten()
.unwrap_or(0);
Ok(max_order)
}
#[derive(Copy, Clone, Debug, EnumIter, DeriveColumn)]

View File

@@ -10,6 +10,9 @@ pub struct Model {
pub visibility: ChannelVisibility,
pub parent_path: String,
pub requires_zed_cla: bool,
/// The order of this channel relative to its siblings within the same parent.
/// Lower values appear first. Channels are sorted by parent_path first, then by channel_order.
pub channel_order: i32,
}
impl Model {

View File

@@ -172,16 +172,35 @@ impl Drop for TestDb {
}
}
fn assert_channel_tree_matches(actual: Vec<Channel>, expected: Vec<Channel>) {
let expected_channels = expected.into_iter().collect::<HashSet<_>>();
let actual_channels = actual.into_iter().collect::<HashSet<_>>();
pretty_assertions::assert_eq!(expected_channels, actual_channels);
}
fn channel_tree(channels: &[(ChannelId, &[ChannelId], &'static str)]) -> Vec<Channel> {
channels
.iter()
.map(|(id, parent_path, name)| Channel {
use std::collections::HashMap;
let mut result = Vec::new();
let mut order_by_parent: HashMap<Vec<ChannelId>, i32> = HashMap::new();
for (id, parent_path, name) in channels {
let parent_key = parent_path.to_vec();
let order = *order_by_parent
.entry(parent_key.clone())
.and_modify(|e| *e += 1)
.or_insert(1);
result.push(Channel {
id: *id,
name: name.to_string(),
visibility: ChannelVisibility::Members,
parent_path: parent_path.to_vec(),
})
.collect()
parent_path: parent_key,
channel_order: order,
});
}
result
}
static GITHUB_USER_ID: AtomicI32 = AtomicI32::new(5);

View File

@@ -1,15 +1,15 @@
use crate::{
db::{
Channel, ChannelId, ChannelRole, Database, NewUserParams, RoomId, UserId,
tests::{channel_tree, new_test_connection, new_test_user},
tests::{assert_channel_tree_matches, channel_tree, new_test_connection, new_test_user},
},
test_both_dbs,
};
use rpc::{
ConnectionId,
proto::{self},
proto::{self, reorder_channel},
};
use std::sync::Arc;
use std::{collections::HashSet, sync::Arc};
test_both_dbs!(test_channels, test_channels_postgres, test_channels_sqlite);
@@ -59,28 +59,28 @@ async fn test_channels(db: &Arc<Database>) {
.unwrap();
let result = db.get_channels_for_user(a_id).await.unwrap();
assert_eq!(
assert_channel_tree_matches(
result.channels,
channel_tree(&[
(zed_id, &[], "zed"),
(crdb_id, &[zed_id], "crdb"),
(livestreaming_id, &[zed_id], "livestreaming",),
(livestreaming_id, &[zed_id], "livestreaming"),
(replace_id, &[zed_id], "replace"),
(rust_id, &[], "rust"),
(cargo_id, &[rust_id], "cargo"),
(cargo_ra_id, &[rust_id, cargo_id], "cargo-ra",)
],)
(cargo_ra_id, &[rust_id, cargo_id], "cargo-ra"),
]),
);
let result = db.get_channels_for_user(b_id).await.unwrap();
assert_eq!(
assert_channel_tree_matches(
result.channels,
channel_tree(&[
(zed_id, &[], "zed"),
(crdb_id, &[zed_id], "crdb"),
(livestreaming_id, &[zed_id], "livestreaming",),
(replace_id, &[zed_id], "replace")
],)
(livestreaming_id, &[zed_id], "livestreaming"),
(replace_id, &[zed_id], "replace"),
]),
);
// Update member permissions
@@ -94,14 +94,14 @@ async fn test_channels(db: &Arc<Database>) {
assert!(set_channel_admin.is_ok());
let result = db.get_channels_for_user(b_id).await.unwrap();
assert_eq!(
assert_channel_tree_matches(
result.channels,
channel_tree(&[
(zed_id, &[], "zed"),
(crdb_id, &[zed_id], "crdb"),
(livestreaming_id, &[zed_id], "livestreaming",),
(replace_id, &[zed_id], "replace")
],)
(livestreaming_id, &[zed_id], "livestreaming"),
(replace_id, &[zed_id], "replace"),
]),
);
// Remove a single channel
@@ -313,8 +313,8 @@ async fn test_channel_renames(db: &Arc<Database>) {
test_both_dbs!(
test_db_channel_moving,
test_channels_moving_postgres,
test_channels_moving_sqlite
test_db_channel_moving_postgres,
test_db_channel_moving_sqlite
);
async fn test_db_channel_moving(db: &Arc<Database>) {
@@ -343,16 +343,14 @@ async fn test_db_channel_moving(db: &Arc<Database>) {
.await
.unwrap();
let livestreaming_dag_id = db
.create_sub_channel("livestreaming_dag", livestreaming_id, a_id)
let livestreaming_sub_id = db
.create_sub_channel("livestreaming_sub", livestreaming_id, a_id)
.await
.unwrap();
// ========================================================================
// sanity check
// Initial DAG:
// /- gpui2
// zed -- crdb - livestreaming - livestreaming_dag
// zed -- crdb - livestreaming - livestreaming_sub
let result = db.get_channels_for_user(a_id).await.unwrap();
assert_channel_tree(
result.channels,
@@ -360,10 +358,242 @@ async fn test_db_channel_moving(db: &Arc<Database>) {
(zed_id, &[]),
(crdb_id, &[zed_id]),
(livestreaming_id, &[zed_id, crdb_id]),
(livestreaming_dag_id, &[zed_id, crdb_id, livestreaming_id]),
(livestreaming_sub_id, &[zed_id, crdb_id, livestreaming_id]),
(gpui2_id, &[zed_id]),
],
);
// Check that we can do a simple leaf -> leaf move
db.move_channel(livestreaming_sub_id, crdb_id, a_id)
.await
.unwrap();
// /- gpui2
// zed -- crdb -- livestreaming
// \- livestreaming_sub
let result = db.get_channels_for_user(a_id).await.unwrap();
assert_channel_tree(
result.channels,
&[
(zed_id, &[]),
(crdb_id, &[zed_id]),
(livestreaming_id, &[zed_id, crdb_id]),
(livestreaming_sub_id, &[zed_id, crdb_id]),
(gpui2_id, &[zed_id]),
],
);
// Check that we can move a whole subtree at once
db.move_channel(crdb_id, gpui2_id, a_id).await.unwrap();
// zed -- gpui2 -- crdb -- livestreaming
// \- livestreaming_sub
let result = db.get_channels_for_user(a_id).await.unwrap();
assert_channel_tree(
result.channels,
&[
(zed_id, &[]),
(gpui2_id, &[zed_id]),
(crdb_id, &[zed_id, gpui2_id]),
(livestreaming_id, &[zed_id, gpui2_id, crdb_id]),
(livestreaming_sub_id, &[zed_id, gpui2_id, crdb_id]),
],
);
}
test_both_dbs!(
test_channel_reordering,
test_channel_reordering_postgres,
test_channel_reordering_sqlite
);
async fn test_channel_reordering(db: &Arc<Database>) {
let admin_id = db
.create_user(
"admin@example.com",
None,
false,
NewUserParams {
github_login: "admin".into(),
github_user_id: 1,
},
)
.await
.unwrap()
.user_id;
let user_id = db
.create_user(
"user@example.com",
None,
false,
NewUserParams {
github_login: "user".into(),
github_user_id: 2,
},
)
.await
.unwrap()
.user_id;
// Create a root channel with some sub-channels
let root_id = db.create_root_channel("root", admin_id).await.unwrap();
// Invite user to root channel so they can see the sub-channels
db.invite_channel_member(root_id, user_id, admin_id, ChannelRole::Member)
.await
.unwrap();
db.respond_to_channel_invite(root_id, user_id, true)
.await
.unwrap();
let alpha_id = db
.create_sub_channel("alpha", root_id, admin_id)
.await
.unwrap();
let beta_id = db
.create_sub_channel("beta", root_id, admin_id)
.await
.unwrap();
let gamma_id = db
.create_sub_channel("gamma", root_id, admin_id)
.await
.unwrap();
// Initial order should be: root, alpha (order=1), beta (order=2), gamma (order=3)
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(alpha_id, &[root_id], 1),
(beta_id, &[root_id], 2),
(gamma_id, &[root_id], 3),
],
);
// Test moving beta up (should swap with alpha)
let updated_channels = db
.reorder_channel(beta_id, reorder_channel::Direction::Up, admin_id)
.await
.unwrap();
// Verify that beta and alpha were returned as updated
assert_eq!(updated_channels.len(), 2);
let updated_ids: std::collections::HashSet<_> = updated_channels.iter().map(|c| c.id).collect();
assert!(updated_ids.contains(&alpha_id));
assert!(updated_ids.contains(&beta_id));
// Now order should be: root, beta (order=1), alpha (order=2), gamma (order=3)
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(alpha_id, &[root_id], 2),
(gamma_id, &[root_id], 3),
],
);
// Test moving gamma down (should be no-op since it's already last)
let updated_channels = db
.reorder_channel(gamma_id, reorder_channel::Direction::Down, admin_id)
.await
.unwrap();
// Should return nothing
assert_eq!(updated_channels.len(), 0);
// Test moving alpha down (should swap with gamma)
let updated_channels = db
.reorder_channel(alpha_id, reorder_channel::Direction::Down, admin_id)
.await
.unwrap();
// Verify that alpha and gamma were returned as updated
assert_eq!(updated_channels.len(), 2);
let updated_ids: std::collections::HashSet<_> = updated_channels.iter().map(|c| c.id).collect();
assert!(updated_ids.contains(&alpha_id));
assert!(updated_ids.contains(&gamma_id));
// Now order should be: root, beta (order=1), gamma (order=2), alpha (order=3)
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(gamma_id, &[root_id], 2),
(alpha_id, &[root_id], 3),
],
);
// Test that non-admin cannot reorder
let reorder_result = db
.reorder_channel(beta_id, reorder_channel::Direction::Up, user_id)
.await;
assert!(reorder_result.is_err());
// Test moving beta up (should be no-op since it's already first)
let updated_channels = db
.reorder_channel(beta_id, reorder_channel::Direction::Up, admin_id)
.await
.unwrap();
// Should return nothing
assert_eq!(updated_channels.len(), 0);
// Adding a channel to an existing ordering should add it to the end
let delta_id = db
.create_sub_channel("delta", root_id, admin_id)
.await
.unwrap();
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(gamma_id, &[root_id], 2),
(alpha_id, &[root_id], 3),
(delta_id, &[root_id], 4),
],
);
// And moving a channel into an existing ordering should add it to the end
let eta_id = db
.create_sub_channel("eta", delta_id, admin_id)
.await
.unwrap();
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(gamma_id, &[root_id], 2),
(alpha_id, &[root_id], 3),
(delta_id, &[root_id], 4),
(eta_id, &[root_id, delta_id], 1),
],
);
db.move_channel(eta_id, root_id, admin_id).await.unwrap();
let result = db.get_channels_for_user(admin_id).await.unwrap();
assert_channel_tree_order(
result.channels,
&[
(root_id, &[], 1),
(beta_id, &[root_id], 1),
(gamma_id, &[root_id], 2),
(alpha_id, &[root_id], 3),
(delta_id, &[root_id], 4),
(eta_id, &[root_id], 5),
],
);
}
test_both_dbs!(
@@ -422,6 +652,20 @@ async fn test_db_channel_moving_bugs(db: &Arc<Database>) {
(livestreaming_id, &[zed_id, projects_id]),
],
);
// Can't un-root a root channel
db.move_channel(zed_id, livestreaming_id, user_id)
.await
.unwrap_err();
let result = db.get_channels_for_user(user_id).await.unwrap();
assert_channel_tree(
result.channels,
&[
(zed_id, &[]),
(projects_id, &[zed_id]),
(livestreaming_id, &[zed_id, projects_id]),
],
);
}
test_both_dbs!(
@@ -745,10 +989,29 @@ fn assert_channel_tree(actual: Vec<Channel>, expected: &[(ChannelId, &[ChannelId
let actual = actual
.iter()
.map(|channel| (channel.id, channel.parent_path.as_slice()))
.collect::<Vec<_>>();
pretty_assertions::assert_eq!(
actual,
expected.to_vec(),
"wrong channel ids and parent paths"
);
.collect::<HashSet<_>>();
let expected = expected
.iter()
.map(|(id, parents)| (*id, *parents))
.collect::<HashSet<_>>();
pretty_assertions::assert_eq!(actual, expected, "wrong channel ids and parent paths");
}
#[track_caller]
fn assert_channel_tree_order(actual: Vec<Channel>, expected: &[(ChannelId, &[ChannelId], i32)]) {
let actual = actual
.iter()
.map(|channel| {
(
channel.id,
channel.parent_path.as_slice(),
channel.channel_order,
)
})
.collect::<HashSet<_>>();
let expected = expected
.iter()
.map(|(id, parents, order)| (*id, *parents, *order))
.collect::<HashSet<_>>();
pretty_assertions::assert_eq!(actual, expected, "wrong channel ids, parent paths, or order");
}

View File

@@ -384,6 +384,7 @@ impl Server {
.add_request_handler(get_notifications)
.add_request_handler(mark_notification_as_read)
.add_request_handler(move_channel)
.add_request_handler(reorder_channel)
.add_request_handler(follow)
.add_message_handler(unfollow)
.add_message_handler(update_followers)
@@ -3195,6 +3196,51 @@ async fn move_channel(
Ok(())
}
async fn reorder_channel(
request: proto::ReorderChannel,
response: Response<proto::ReorderChannel>,
session: Session,
) -> Result<()> {
let channel_id = ChannelId::from_proto(request.channel_id);
let direction = request.direction();
let updated_channels = session
.db()
.await
.reorder_channel(channel_id, direction, session.user_id())
.await?;
if let Some(root_id) = updated_channels.first().map(|channel| channel.root_id()) {
let connection_pool = session.connection_pool().await;
for (connection_id, role) in connection_pool.channel_connection_ids(root_id) {
let channels = updated_channels
.iter()
.filter_map(|channel| {
if role.can_see_channel(channel.visibility) {
Some(channel.to_proto())
} else {
None
}
})
.collect::<Vec<_>>();
if channels.is_empty() {
continue;
}
let update = proto::UpdateChannels {
channels,
..Default::default()
};
session.peer.send(connection_id, update.clone())?;
}
}
response.send(Ack {})?;
Ok(())
}
/// Get the list of channel members
async fn get_channel_members(
request: proto::GetChannelMembers,

View File

@@ -0,0 +1,455 @@
//! Link folding support for channel notes.
//!
//! This module provides functionality to automatically fold markdown links in channel notes,
//! displaying only the link text with an underline decoration. When the cursor is positioned
//! inside a link, the crease is temporarily removed to show the full markdown syntax.
//!
//! Example:
//! - When rendered: [Example](https://example.com) becomes "Example" with underline
//! - When cursor is inside: Full markdown syntax is shown
//!
//! The main components are:
//! - `parse_markdown_links`: Extracts markdown links from text
//! - `create_link_creases`: Creates visual creases for links
//! - `LinkFoldingManager`: Manages dynamic showing/hiding of link creases based on cursor position
use editor::{
Anchor, Editor, FoldPlaceholder,
display_map::{Crease, CreaseId},
};
use gpui::{App, Entity, Window};
use std::{collections::HashMap, ops::Range, sync::Arc};
use ui::prelude::*;
#[derive(Debug, Clone, PartialEq)]
pub struct MarkdownLink {
pub text: String,
pub url: String,
pub range: Range<usize>,
}
pub fn parse_markdown_links(text: &str) -> Vec<MarkdownLink> {
let mut links = Vec::new();
let mut chars = text.char_indices();
while let Some((start, ch)) = chars.next() {
if ch == '[' {
// Look for the closing bracket
let mut bracket_depth = 1;
let text_start = start + 1;
let mut text_end = None;
for (i, ch) in chars.by_ref() {
match ch {
'[' => bracket_depth += 1,
']' => {
bracket_depth -= 1;
if bracket_depth == 0 {
text_end = Some(i);
break;
}
}
_ => {}
}
}
if let Some(text_end) = text_end {
// Check if the next character is '('
if let Some((_, '(')) = chars.next() {
// Look for the closing parenthesis
let url_start = text_end + 2;
let mut url_end = None;
for (i, ch) in chars.by_ref() {
if ch == ')' {
url_end = Some(i);
break;
}
}
if let Some(url_end) = url_end {
links.push(MarkdownLink {
text: text[text_start..text_end].to_string(),
url: text[url_start..url_end].to_string(),
range: start..url_end + 1,
});
}
}
}
}
}
links
}
pub fn create_link_creases(
links: &[MarkdownLink],
buffer_snapshot: &editor::MultiBufferSnapshot,
) -> Vec<Crease<Anchor>> {
links
.iter()
.map(|link| {
// Convert byte offsets to Points first
let start_point = buffer_snapshot.offset_to_point(link.range.start);
let end_point = buffer_snapshot.offset_to_point(link.range.end);
// Create anchors from points
let start = buffer_snapshot.anchor_before(start_point);
let end = buffer_snapshot.anchor_after(end_point);
let link_text = link.text.clone();
Crease::simple(
start..end,
FoldPlaceholder {
render: Arc::new(move |_fold_id, _range, cx| {
div()
.child(link_text.clone())
.text_decoration_1()
.text_decoration_solid()
.text_color(cx.theme().colors().link_text_hover)
.into_any_element()
}),
constrain_width: false,
merge_adjacent: false,
type_tag: None,
},
)
})
.collect()
}
pub struct LinkFoldingManager {
editor: Entity<Editor>,
folded_links: Vec<MarkdownLink>,
link_creases: HashMap<CreaseId, Range<usize>>,
}
impl LinkFoldingManager {
pub fn new(editor: Entity<Editor>, window: &mut Window, cx: &mut App) -> Self {
let mut manager = Self {
editor: editor.clone(),
folded_links: Vec::new(),
link_creases: HashMap::default(),
};
manager.refresh_links(window, cx);
manager
}
pub fn handle_selections_changed(&mut self, window: &mut Window, cx: &mut App) {
self.update_link_visibility(window, cx);
}
pub fn handle_edited(&mut self, window: &mut Window, cx: &mut App) {
// Re-parse links and update folds
// Note: In a production implementation, this could be debounced
// to avoid updating on every keystroke
self.refresh_links(window, cx);
}
fn refresh_links(&mut self, window: &mut Window, cx: &mut App) {
// Remove existing creases
if !self.link_creases.is_empty() {
self.editor.update(cx, |editor, cx| {
let crease_ids: Vec<_> = self.link_creases.keys().copied().collect();
editor.remove_creases(crease_ids, cx);
});
self.link_creases.clear();
}
// Parse new links
let buffer_text = self.editor.read(cx).buffer().read(cx).snapshot(cx).text();
let links = parse_markdown_links(&buffer_text);
self.folded_links = links;
// Insert creases for all links
if !self.folded_links.is_empty() {
self.editor.update(cx, |editor, cx| {
let buffer = editor.buffer().read(cx).snapshot(cx);
let creases = create_link_creases(&self.folded_links, &buffer);
let crease_ids = editor.insert_creases(creases.clone(), cx);
// Store the mapping of crease IDs to link ranges
for (crease_id, link) in crease_ids.into_iter().zip(self.folded_links.iter()) {
self.link_creases.insert(crease_id, link.range.clone());
}
// Fold the creases to activate the custom placeholder
editor.fold_creases(creases, true, window, cx);
});
}
// Update visibility based on cursor position
self.update_link_visibility(window, cx);
}
fn update_link_visibility(&mut self, window: &mut Window, cx: &mut App) {
if self.folded_links.is_empty() {
return;
}
self.editor.update(cx, |editor, cx| {
let buffer = editor.buffer().read(cx).snapshot(cx);
let selections = editor.selections.all::<usize>(cx);
// Find which links should be visible (cursor inside or adjacent)
// Remove creases where cursor is inside or adjacent to the link
let mut creases_to_remove = Vec::new();
for (crease_id, link_range) in &self.link_creases {
let cursor_near_link = selections.iter().any(|selection| {
let cursor_offset = selection.head();
// Check if cursor is inside the link
if link_range.contains(&cursor_offset) {
return true;
}
// Check if cursor is adjacent (immediately before or after)
cursor_offset == link_range.start || cursor_offset == link_range.end
});
if cursor_near_link {
creases_to_remove.push(*crease_id);
}
}
if !creases_to_remove.is_empty() {
editor.remove_creases(creases_to_remove.clone(), cx);
for crease_id in creases_to_remove {
self.link_creases.remove(&crease_id);
}
}
// Re-add creases for links where cursor is not present or adjacent
let links_to_recreate: Vec<_> = self
.folded_links
.iter()
.filter(|link| {
!selections.iter().any(|selection| {
let cursor_offset = selection.head();
// Check if cursor is inside or adjacent to the link
link.range.contains(&cursor_offset)
|| cursor_offset == link.range.start
|| cursor_offset == link.range.end
})
})
.filter(|link| {
// Only recreate if not already present
!self.link_creases.values().any(|range| range == &link.range)
})
.cloned()
.collect();
if !links_to_recreate.is_empty() {
let creases = create_link_creases(&links_to_recreate, &buffer);
let crease_ids = editor.insert_creases(creases.clone(), cx);
for (crease_id, link) in crease_ids.into_iter().zip(links_to_recreate.iter()) {
self.link_creases.insert(crease_id, link.range.clone());
}
// Fold the creases to activate the custom placeholder
editor.fold_creases(creases, true, window, cx);
}
});
}
/// Checks if a position (in buffer coordinates) is within a folded link
/// and returns the link if found
pub fn link_at_position(&self, position: usize) -> Option<&MarkdownLink> {
// Check if the position falls within any folded link that has an active crease
self.folded_links.iter().find(|link| {
link.range.contains(&position)
&& self.link_creases.values().any(|range| range == &link.range)
})
}
/// Opens a link URL in the default browser
pub fn open_link(&self, url: &str, cx: &mut App) {
cx.open_url(url);
}
}
#[cfg(test)]
mod tests {
use super::*;
#[gpui::test]
fn test_parse_markdown_links() {
let text = "This is a [link](https://example.com) and another [test](https://test.com).";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 2);
assert_eq!(links[0].text, "link");
assert_eq!(links[0].url, "https://example.com");
assert_eq!(links[0].range, 10..37);
assert_eq!(links[1].text, "test");
assert_eq!(links[1].url, "https://test.com");
assert_eq!(links[1].range, 50..74);
}
#[gpui::test]
fn test_parse_nested_brackets() {
let text = "A [[nested] link](https://example.com) here.";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 1);
assert_eq!(links[0].text, "[nested] link");
assert_eq!(links[0].url, "https://example.com");
}
#[gpui::test]
fn test_parse_no_links() {
let text = "This text has no links.";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 0);
}
#[gpui::test]
fn test_parse_incomplete_links() {
let text = "This [link has no url] and [this](incomplete";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 0);
}
#[gpui::test]
fn test_link_parsing_and_folding() {
// Test comprehensive link parsing
let text = "Check out [Zed](https://zed.dev) and [GitHub](https://github.com)!";
let links = parse_markdown_links(text);
assert_eq!(links.len(), 2);
assert_eq!(links[0].text, "Zed");
assert_eq!(links[0].url, "https://zed.dev");
assert_eq!(links[0].range, 10..32);
assert_eq!(links[1].text, "GitHub");
assert_eq!(links[1].url, "https://github.com");
assert_eq!(links[1].range, 37..65);
}
#[gpui::test]
fn test_link_detection_when_typing() {
// Test that links are detected as they're typed
let text1 = "Check out ";
let links1 = parse_markdown_links(text1);
assert_eq!(links1.len(), 0, "No links in plain text");
let text2 = "Check out [";
let links2 = parse_markdown_links(text2);
assert_eq!(links2.len(), 0, "Incomplete link not detected");
let text3 = "Check out [Zed]";
let links3 = parse_markdown_links(text3);
assert_eq!(links3.len(), 0, "Link without URL not detected");
let text4 = "Check out [Zed](";
let links4 = parse_markdown_links(text4);
assert_eq!(links4.len(), 0, "Link with incomplete URL not detected");
let text5 = "Check out [Zed](https://zed.dev)";
let links5 = parse_markdown_links(text5);
assert_eq!(links5.len(), 1, "Complete link should be detected");
assert_eq!(links5[0].text, "Zed");
assert_eq!(links5[0].url, "https://zed.dev");
// Test link detection in middle of text
let text6 = "Check out [Zed](https://zed.dev) for coding!";
let links6 = parse_markdown_links(text6);
assert_eq!(links6.len(), 1, "Link in middle of text should be detected");
assert_eq!(links6[0].range, 10..32);
}
#[gpui::test]
fn test_link_position_detection() {
// Test the logic for determining if a position is within a link
let links = vec![
MarkdownLink {
text: "Zed".to_string(),
url: "https://zed.dev".to_string(),
range: 10..32,
},
MarkdownLink {
text: "GitHub".to_string(),
url: "https://github.com".to_string(),
range: 50..78,
},
];
// Test positions inside the first link
assert!(links[0].range.contains(&10), "Start of first link");
assert!(links[0].range.contains(&20), "Middle of first link");
assert!(links[0].range.contains(&31), "Near end of first link");
// Test positions inside the second link
assert!(links[1].range.contains(&50), "Start of second link");
assert!(links[1].range.contains(&65), "Middle of second link");
assert!(links[1].range.contains(&77), "Near end of second link");
// Test positions outside any link
assert!(!links[0].range.contains(&9), "Before first link");
assert!(!links[0].range.contains(&32), "After first link");
assert!(!links[1].range.contains(&49), "Before second link");
assert!(!links[1].range.contains(&78), "After second link");
// Test finding a link at a specific position
let link_at_20 = links.iter().find(|link| link.range.contains(&20));
assert!(link_at_20.is_some());
assert_eq!(link_at_20.unwrap().text, "Zed");
assert_eq!(link_at_20.unwrap().url, "https://zed.dev");
let link_at_65 = links.iter().find(|link| link.range.contains(&65));
assert!(link_at_65.is_some());
assert_eq!(link_at_65.unwrap().text, "GitHub");
assert_eq!(link_at_65.unwrap().url, "https://github.com");
}
#[gpui::test]
fn test_cursor_adjacent_link_expansion() {
// Test the logic for determining if cursor is inside or adjacent to a link
let link = MarkdownLink {
text: "Example".to_string(),
url: "https://example.com".to_string(),
range: 10..37,
};
// Helper function to check if cursor should expand the link
let should_expand_link = |cursor_pos: usize, link: &MarkdownLink| -> bool {
// Link should be expanded if cursor is inside or adjacent
link.range.contains(&cursor_pos)
|| cursor_pos == link.range.start
|| cursor_pos == link.range.end
};
// Test cursor positions
assert!(
should_expand_link(10, &link),
"Cursor at position 10 (link start) should expand link"
);
assert!(
should_expand_link(37, &link),
"Cursor at position 37 (link end) should expand link"
);
assert!(
should_expand_link(20, &link),
"Cursor at position 20 (inside link) should expand link"
);
assert!(
!should_expand_link(9, &link),
"Cursor at position 9 (before link) should not expand link"
);
assert!(
!should_expand_link(38, &link),
"Cursor at position 38 (after link) should not expand link"
);
// Test the edge cases
assert_eq!(link.range.start, 10, "Link starts at position 10");
assert_eq!(link.range.end, 37, "Link ends at position 37");
assert!(link.range.contains(&10), "Range includes start position");
assert!(!link.range.contains(&37), "Range excludes end position");
}
}

View File

@@ -14,9 +14,9 @@ use fuzzy::{StringMatchCandidate, match_strings};
use gpui::{
AnyElement, App, AsyncWindowContext, Bounds, ClickEvent, ClipboardItem, Context, DismissEvent,
Div, Entity, EventEmitter, FocusHandle, Focusable, FontStyle, InteractiveElement, IntoElement,
KeyContext, ListOffset, ListState, MouseDownEvent, ParentElement, Pixels, Point, PromptLevel,
Render, SharedString, Styled, Subscription, Task, TextStyle, WeakEntity, Window, actions,
anchored, canvas, deferred, div, fill, list, point, prelude::*, px,
};
use menu::{Cancel, Confirm, SecondaryConfirm, SelectNext, SelectPrevious};
use project::{Fs, Project};
@@ -52,6 +52,8 @@ actions!(
StartMoveChannel,
MoveSelected,
InsertSpace,
MoveChannelUp,
MoveChannelDown,
]
);
@@ -1961,6 +1963,33 @@ impl CollabPanel {
})
}
fn move_channel_up(&mut self, _: &MoveChannelUp, window: &mut Window, cx: &mut Context<Self>) {
if let Some(channel) = self.selected_channel() {
self.channel_store.update(cx, |store, cx| {
store
.reorder_channel(channel.id, proto::reorder_channel::Direction::Up, cx)
.detach_and_prompt_err("Failed to move channel up", window, cx, |_, _, _| None)
});
}
}
fn move_channel_down(
&mut self,
_: &MoveChannelDown,
window: &mut Window,
cx: &mut Context<Self>,
) {
if let Some(channel) = self.selected_channel() {
self.channel_store.update(cx, |store, cx| {
store
.reorder_channel(channel.id, proto::reorder_channel::Direction::Down, cx)
.detach_and_prompt_err("Failed to move channel down", window, cx, |_, _, _| {
None
})
});
}
}
fn open_channel_notes(
&mut self,
channel_id: ChannelId,
@@ -1974,7 +2003,7 @@ impl CollabPanel {
fn show_inline_context_menu(
&mut self,
_: &menu::SecondaryConfirm,
window: &mut Window,
cx: &mut Context<Self>,
) {
@@ -2003,6 +2032,21 @@ impl CollabPanel {
}
}
fn dispatch_context(&self, window: &Window, cx: &Context<Self>) -> KeyContext {
let mut dispatch_context = KeyContext::new_with_defaults();
dispatch_context.add("CollabPanel");
dispatch_context.add("menu");
let identifier = if self.channel_name_editor.focus_handle(cx).is_focused(window) {
"editing"
} else {
"not_editing"
};
dispatch_context.add(identifier);
dispatch_context
}
fn selected_channel(&self) -> Option<&Arc<Channel>> {
self.selection
.and_then(|ix| self.entries.get(ix))
@@ -2965,7 +3009,7 @@ fn render_tree_branch(
impl Render for CollabPanel {
fn render(&mut self, window: &mut Window, cx: &mut Context<Self>) -> impl IntoElement {
v_flex()
.key_context(self.dispatch_context(window, cx))
.on_action(cx.listener(CollabPanel::cancel))
.on_action(cx.listener(CollabPanel::select_next))
.on_action(cx.listener(CollabPanel::select_previous))
@@ -2977,6 +3021,8 @@ impl Render for CollabPanel {
.on_action(cx.listener(CollabPanel::collapse_selected_channel))
.on_action(cx.listener(CollabPanel::expand_selected_channel))
.on_action(cx.listener(CollabPanel::start_move_selected_channel))
.on_action(cx.listener(CollabPanel::move_channel_up))
.on_action(cx.listener(CollabPanel::move_channel_down))
.track_focus(&self.focus_handle(cx))
.size_full()
.child(if self.user_store.read(cx).current_user().is_none() {

View File

@@ -0,0 +1,578 @@
use super::{
fold_map::{FoldPlaceholder, FoldSnapshot},
inlay_map::{InlayEdit, InlaySnapshot},
};
use collections::{HashMap, HashSet};
use gpui::{AnyElement, App, Context, Entity};
use language::{Anchor, Buffer, BufferSnapshot, Edit, Point};
use multi_buffer::{ExcerptId, MultiBuffer, MultiBufferSnapshot, ToOffset, ToPoint};
use parking_lot::Mutex;
use serde::{Deserialize, Serialize};
use std::{any::TypeId, ops::Range, sync::Arc};
use sum_tree::TreeMap;
use tree_sitter::{Node, Query, QueryCapture, QueryCursor};
use ui::{Color, IntoElement, Label};
use util::ResultExt;
/// Defines how to find foldable regions using Tree-sitter queries
#[derive(Clone, Debug)]
pub struct FoldingQuery {
pub query: Query,
pub auto_fold: bool, // Should regions auto-fold on load?
pub display_capture_ix: Option<u32>, // Which capture to show when folded
pub action_capture_ix: Option<u32>, // Which capture contains action data
pub proximity_expand: bool, // Expand when cursor is near?
}
/// Represents a foldable region found by queries
#[derive(Clone, Debug)]
pub struct SyntaxFold {
pub id: SyntaxFoldId,
pub range: Range<Anchor>,
pub display_text: Option<String>, // Text to show when folded
pub action_data: Option<FoldAction>, // What happens on cmd+click
pub proximity_expand: bool,
pub auto_fold: bool,
pub query_index: usize, // Which query created this fold
}
/// Unique identifier for a syntax fold
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
pub struct SyntaxFoldId(usize);
/// Possible actions when clicking folded content
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
pub enum FoldAction {
OpenUrl(String), // Open URL in browser
GoToFile(String), // Open file in editor
RunCommand(String), // Run system command
Custom(String, serde_json::Value), // Extensibility point
}
/// Configuration for syntax folding
#[derive(Clone, Debug, Default)]
pub struct SyntaxFoldConfig {
pub enabled: bool,
pub auto_fold_on_open: bool,
pub proximity_expand_distance: u32, // Characters away from fold to trigger expansion
}
/// Snapshot of the syntax fold state at a point in time
pub struct SyntaxFoldSnapshot {
buffer_snapshot: MultiBufferSnapshot,
folds: TreeMap<Range<Anchor>, SyntaxFold>,
queries: Arc<Vec<FoldingQuery>>,
config: SyntaxFoldConfig,
}
/// Map that tracks syntax-driven folds in the buffer
pub struct SyntaxFoldMap {
buffer: Entity<MultiBuffer>,
queries: Arc<Vec<FoldingQuery>>,
config: SyntaxFoldConfig,
// All detected syntax folds
folds: TreeMap<Range<Anchor>, SyntaxFold>,
next_fold_id: usize,
// Track which folds are currently applied to the fold map
applied_folds: HashMap<SyntaxFoldId, Range<Anchor>>,
// Track which folds are temporarily expanded due to cursor proximity
proximity_expanded: Arc<Mutex<HashSet<SyntaxFoldId>>>,
// Track which folds have been manually toggled by the user
user_toggled: Arc<Mutex<HashSet<SyntaxFoldId>>>,
}
impl SyntaxFoldMap {
pub fn new(buffer: Entity<MultiBuffer>, cx: &mut App) -> Self {
let mut this = Self {
buffer: buffer.clone(),
queries: Arc::new(Vec::new()),
config: SyntaxFoldConfig::default(),
folds: TreeMap::default(),
next_fold_id: 0,
applied_folds: HashMap::default(),
proximity_expanded: Arc::new(Mutex::new(HashSet::default())),
user_toggled: Arc::new(Mutex::new(HashSet::default())),
};
// Initialize queries from languages
this.update_queries(cx);
// Perform initial fold detection if auto-fold is enabled
if this.config.enabled && this.config.auto_fold_on_open {
this.detect_folds(cx);
}
this
}
pub fn snapshot(&self, buffer_snapshot: MultiBufferSnapshot) -> SyntaxFoldSnapshot {
SyntaxFoldSnapshot {
buffer_snapshot,
folds: self.folds.clone(),
queries: self.queries.clone(),
config: self.config.clone(),
}
}
/// Returns ranges that should be folded/unfolded based on syntax analysis
pub fn sync(
&mut self,
buffer_snapshot: MultiBufferSnapshot,
buffer_edits: Vec<Edit<Point>>,
cx: &mut App,
) -> (Vec<Range<Anchor>>, Vec<Range<Anchor>>) {
// Re-detect folds in edited regions
for edit in &buffer_edits {
self.detect_folds_in_range(edit.new.clone(), &buffer_snapshot, cx);
}
// Determine which folds to apply/remove
let mut to_fold = Vec::new();
let mut to_unfold = Vec::new();
for (range, fold) in self.folds.iter() {
let should_be_folded = fold.auto_fold
&& !self.user_toggled.lock().contains(&fold.id)
&& !self.proximity_expanded.lock().contains(&fold.id);
let is_folded = self.applied_folds.contains_key(&fold.id);
if should_be_folded && !is_folded {
to_fold.push(range.clone());
self.applied_folds.insert(fold.id, range.clone());
} else if !should_be_folded && is_folded {
to_unfold.push(range.clone());
self.applied_folds.remove(&fold.id);
}
}
(to_fold, to_unfold)
}
/// Update queries from the current languages in the buffer
fn update_queries(&mut self, cx: &App) {
let mut queries = Vec::new();
let buffer = self.buffer.read(cx);
for excerpt in buffer.excerpts() {
if let Some(language) = excerpt.buffer.read(cx).language() {
if let Some(folding_query_str) = language.folding_query() {
if let Ok(query) =
Query::new(&language.grammar().unwrap().ts_language, folding_query_str)
{
// Parse query metadata from captures
let mut auto_fold = false;
let mut display_capture_ix = None;
let mut action_capture_ix = None;
let mut proximity_expand = false;
for (ix, name) in query.capture_names().iter().enumerate() {
match name.as_str() {
"fold.auto" => auto_fold = true,
"fold.text" | "fold.display" => {
display_capture_ix = Some(ix as u32)
}
"fold.action" | "fold.url" => action_capture_ix = Some(ix as u32),
"fold.proximity" => proximity_expand = true,
_ => {}
}
}
queries.push(FoldingQuery {
query,
auto_fold,
display_capture_ix,
action_capture_ix,
proximity_expand,
});
}
}
}
}
self.queries = Arc::new(queries);
}
/// Detect all folds in the buffer
fn detect_folds(&mut self, cx: &mut App) {
let buffer = self.buffer.read(cx);
let snapshot = buffer.snapshot(cx);
for (excerpt_id, excerpt) in snapshot.excerpts() {
if let Some(tree) = excerpt.tree() {
let excerpt_range = excerpt.range.to_point(&excerpt.buffer);
self.detect_folds_in_excerpt(
excerpt_id,
&excerpt.buffer,
tree.root_node(),
excerpt_range,
cx,
);
}
}
}
/// Detect folds in a specific range
fn detect_folds_in_range(
&mut self,
range: Range<Point>,
buffer_snapshot: &MultiBufferSnapshot,
cx: &mut App,
) {
// Remove existing folds in this range
let range_anchors =
buffer_snapshot.anchor_before(range.start)..buffer_snapshot.anchor_after(range.end);
let mut folds_to_remove = Vec::new();
for (fold_range, fold) in self.folds.iter() {
if fold_range
.start
.cmp(&range_anchors.end, buffer_snapshot)
.is_lt()
&& fold_range
.end
.cmp(&range_anchors.start, buffer_snapshot)
.is_gt()
{
folds_to_remove.push(fold.id);
}
}
for fold_id in folds_to_remove {
self.remove_fold(fold_id);
}
// Re-detect folds in affected excerpts
for (excerpt_id, excerpt) in buffer_snapshot.excerpts() {
let excerpt_range = excerpt.range.to_point(&excerpt.buffer);
if excerpt_range.start < range.end && excerpt_range.end > range.start {
if let Some(tree) = excerpt.tree() {
self.detect_folds_in_excerpt(
excerpt_id,
&excerpt.buffer,
tree.root_node(),
excerpt_range,
cx,
);
}
}
}
}
/// Detect folds in a specific excerpt using tree-sitter queries
fn detect_folds_in_excerpt(
&mut self,
excerpt_id: ExcerptId,
buffer: &BufferSnapshot,
root: Node,
range: Range<Point>,
cx: &mut App,
) {
let mut cursor = QueryCursor::new();
cursor.set_point_range(range.clone());
for (query_index, folding_query) in self.queries.iter().enumerate() {
let matches = cursor.matches(
&folding_query.query,
root,
buffer
.as_rope()
.chunks_in_range(range.clone())
.collect::<String>()
.as_bytes(),
);
for match_ in matches {
if let Some(fold) = self.create_fold_from_match(
match_.captures,
query_index,
folding_query,
excerpt_id,
buffer,
cx,
) {
// Store the fold
self.folds.insert(fold.range.clone(), fold);
}
}
}
}
/// Create a SyntaxFold from a tree-sitter match
fn create_fold_from_match(
&mut self,
captures: &[QueryCapture],
query_index: usize,
query: &FoldingQuery,
excerpt_id: ExcerptId,
buffer: &BufferSnapshot,
cx: &App,
) -> Option<SyntaxFold> {
// Find the main fold capture (@fold or @fold.auto)
let fold_capture = captures.iter().find(|c| {
let name = query.query.capture_names()[c.index as usize].as_str();
name == "fold" || name == "fold.auto"
})?;
let fold_node = fold_capture.node;
let start = Point::new(
fold_node.start_position().row as u32,
fold_node.start_position().column as u32,
);
let end = Point::new(
fold_node.end_position().row as u32,
fold_node.end_position().column as u32,
);
// Convert to anchors in the multi-buffer
let buffer_snapshot = self.buffer.read(cx).snapshot(cx);
let start_anchor = buffer_snapshot.anchor_in_excerpt(excerpt_id, start)?;
let end_anchor = buffer_snapshot.anchor_in_excerpt(excerpt_id, end)?;
// Extract display text if specified
let display_text = if let Some(display_ix) = query.display_capture_ix {
captures
.iter()
.find(|c| c.index == display_ix)
.and_then(|c| {
let node = c.node;
let start = node.start_byte();
let end = node.end_byte();
buffer.text_for_range(start..end).collect::<String>().into()
})
} else {
None
};
// Extract action data if specified
let action_data = if let Some(action_ix) = query.action_capture_ix {
captures
.iter()
.find(|c| c.index == action_ix)
.and_then(|c| {
let node = c.node;
let start = node.start_byte();
let end = node.end_byte();
let text = buffer.text_for_range(start..end).collect::<String>();
// Parse action based on capture name
let capture_name = &query.query.capture_names()[c.index as usize];
match capture_name.as_str() {
"fold.url" => Some(FoldAction::OpenUrl(text)),
"fold.file" => Some(FoldAction::GoToFile(text)),
"fold.command" => Some(FoldAction::RunCommand(text)),
_ => None,
}
})
} else {
None
};
let fold_id = SyntaxFoldId(self.next_fold_id);
self.next_fold_id += 1;
Some(SyntaxFold {
id: fold_id,
range: start_anchor..end_anchor,
display_text,
action_data,
proximity_expand: query.proximity_expand,
auto_fold: query.auto_fold,
query_index,
})
}
/// Create a fold placeholder for syntax folds
pub fn create_fold_placeholder(fold: &SyntaxFold) -> FoldPlaceholder {
let display_text = fold.display_text.clone();
FoldPlaceholder {
render: Arc::new(move |_id, _range, _cx| {
if let Some(text) = &display_text {
// Render the display text as a clickable element
Label::new(text.clone())
.color(Color::Accent)
.into_any_element()
} else {
// Default ellipsis
Label::new("⋯").color(Color::Muted).into_any_element()
}
}),
constrain_width: true,
merge_adjacent: false,
type_tag: Some(TypeId::of::<SyntaxFold>()),
}
}
/// Remove a syntax fold
fn remove_fold(&mut self, fold_id: SyntaxFoldId) {
let mut range_to_remove = None;
for (range, fold) in self.folds.iter() {
if fold.id == fold_id {
range_to_remove = Some(range.clone());
break;
}
}
if let Some(range) = range_to_remove {
self.folds.remove(&range);
self.applied_folds.remove(&fold_id);
}
}
/// Handle cursor movement for proximity-based expansion
pub fn handle_cursor_moved(
&mut self,
cursor_offset: usize,
buffer_snapshot: &MultiBufferSnapshot,
) -> (Vec<Range<Anchor>>, Vec<Range<Anchor>>) {
let cursor_point = cursor_offset.to_point(buffer_snapshot);
let mut to_expand = Vec::new();
let mut to_collapse = Vec::new();
// Check each fold for proximity
for (range, fold) in self.folds.iter() {
if !fold.proximity_expand {
continue;
}
let fold_start = range.start.to_point(buffer_snapshot);
let fold_end = range.end.to_point(buffer_snapshot);
// Check if cursor is near the fold
let near_fold = Self::is_point_near_range(
cursor_point,
fold_start..fold_end,
self.config.proximity_expand_distance,
);
let is_expanded = self.proximity_expanded.lock().contains(&fold.id);
if near_fold && !is_expanded {
// Mark for expansion
self.proximity_expanded.lock().insert(fold.id);
to_expand.push(range.clone());
} else if !near_fold && is_expanded {
// Mark for collapse
self.proximity_expanded.lock().remove(&fold.id);
to_collapse.push(range.clone());
}
}
(to_expand, to_collapse)
}
/// Check if a point is near a range
fn is_point_near_range(point: Point, range: Range<Point>, distance: u32) -> bool {
// Check if point is within the range
if point >= range.start && point <= range.end {
return true;
}
// Check distance before range
if point.row == range.start.row
&& range.start.column.saturating_sub(point.column) <= distance
{
return true;
}
// Check distance after range
if point.row == range.end.row && point.column.saturating_sub(range.end.column) <= distance {
return true;
}
false
}
/// Get fold at a specific position
pub fn fold_at_position(
&self,
offset: usize,
buffer_snapshot: &MultiBufferSnapshot,
) -> Option<&SyntaxFold> {
let point = offset.to_point(buffer_snapshot);
for (range, fold) in self.folds.iter() {
let start = range.start.to_point(buffer_snapshot);
let end = range.end.to_point(buffer_snapshot);
if point >= start && point <= end {
return Some(fold);
}
}
None
}
/// Execute the action associated with a fold
pub fn execute_fold_action(&self, fold: &SyntaxFold, cx: &mut App) {
if let Some(action) = &fold.action_data {
match action {
FoldAction::OpenUrl(url) => {
// Use platform open to open URLs
cx.open_url(url.as_str());
}
FoldAction::GoToFile(path) => {
// This would be handled by the workspace
log::info!("Go to file: {}", path);
}
FoldAction::RunCommand(cmd) => {
log::info!("Run command: {}", cmd);
}
FoldAction::Custom(name, data) => {
log::info!("Custom action {}: {:?}", name, data);
}
}
}
}
}
impl Clone for SyntaxFoldSnapshot {
fn clone(&self) -> Self {
Self {
buffer_snapshot: self.buffer_snapshot.clone(),
folds: self.folds.clone(),
queries: self.queries.clone(),
config: self.config.clone(),
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use gpui::{Context as _, TestAppContext};
use language::{Buffer, Language, LanguageConfig, LanguageMatcher};
use multi_buffer::MultiBuffer;
use std::sync::Arc;
// Note: These tests are placeholders showing the intended API.
// Real tests would require actual tree-sitter language support.
#[gpui::test]
async fn test_syntax_fold_api(cx: &mut TestAppContext) {
cx.update(|cx| {
// Create a buffer
let buffer = cx.new_model(|cx| Buffer::local("[Example](https://example.com)", cx));
let multibuffer = cx.new_model(|cx| {
let mut multibuffer = MultiBuffer::new(0, language::Capability::ReadWrite);
multibuffer.push_buffer(buffer.clone(), cx);
multibuffer
});
// Create syntax fold map
let mut syntax_fold_map = SyntaxFoldMap::new(multibuffer.clone(), cx);
// Test basic API
syntax_fold_map.config.enabled = true;
syntax_fold_map.detect_folds(cx);
// Verify the API works
let buffer_snapshot = multibuffer.read(cx).snapshot(cx);
let snapshot = syntax_fold_map.snapshot(buffer_snapshot);
assert_eq!(snapshot.folds.len(), 0); // No folds without proper tree-sitter
});
}
}
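
The proximity rule that drives `proximity_expand_distance` can be exercised on its own. This is a minimal sketch: it re-declares a standalone `Point` (the editor's real type lives elsewhere) and mirrors the `is_point_near_range` helper above, so it is illustrative rather than the editor's actual code.

```rust
use std::ops::Range;

// Lexicographic ordering by (row, column), matching editor point semantics.
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Point {
    row: u32,
    column: u32,
}

fn is_point_near_range(point: Point, range: Range<Point>, distance: u32) -> bool {
    // Inside the range counts as near.
    if point >= range.start && point <= range.end {
        return true;
    }
    // Within `distance` columns before the start, same row only.
    if point < range.start
        && point.row == range.start.row
        && range.start.column - point.column <= distance
    {
        return true;
    }
    // Within `distance` columns after the end, same row only.
    if point > range.end
        && point.row == range.end.row
        && point.column - range.end.column <= distance
    {
        return true;
    }
    false
}
```

With `proximity_expand_distance: 2`, a cursor two columns away from a folded link unfolds it, while a cursor on another row never does.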

View File

@@ -0,0 +1,320 @@
use super::*;
use crate::{
    display_map::{DisplayMap, FoldMap, InlayMap},
    test::editor_test_context::EditorTestContext,
};
use gpui::{Context, TestAppContext};
use indoc::indoc;
use language::{Language, LanguageConfig, LanguageMatcher, LanguageRegistry};
use multi_buffer::MultiBuffer;
use project::Project;
use std::sync::Arc;
#[gpui::test]
async fn test_markdown_link_folding(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
// Set up markdown language with folding query
cx.update_buffer(|buffer, cx| {
let registry = LanguageRegistry::test(cx.background_executor().clone());
let markdown = Arc::new(Language::new(
LanguageConfig {
name: "Markdown".into(),
matcher: LanguageMatcher {
path_suffixes: vec!["md".to_string()],
..Default::default()
},
..Default::default()
},
None,
));
registry.add(markdown.clone());
buffer.set_language_registry(registry);
buffer.set_language(Some(markdown), cx);
});
// Add markdown content with links
cx.set_state(indoc! {"
# Channel Notes
Check out [our website](https://example.com) for more info.
Here's a [link to docs](https://docs.example.com/guide) that you should read.
And another [reference](https://reference.example.com).
"});
cx.update_editor(|editor, cx| {
// Enable syntax folding
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
// Verify folds were created for each link
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(fold_count, 3, "Should have 3 folds for 3 links");
});
}
#[gpui::test]
async fn test_link_proximity_expansion(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
// Set up markdown with a single link
cx.set_state(indoc! {"
Check out [this link](https://example.com) for details.
"});
cx.update_editor(|editor, cx| {
// Enable syntax folding with proximity expansion
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 3,
};
map.detect_syntax_folds(cx);
});
});
// Move cursor near the link
cx.set_selections_state(indoc! {"
Check out |[this link](https://example.com) for details.
"});
cx.update_editor(|editor, cx| {
    // Compute the cursor position before borrowing the display map mutably;
    // reading selections inside the update closure would double-borrow `editor`.
    let cursor_offset = editor.selections.newest::<usize>(cx).head();
    editor.display_map.update(cx, |map, cx| {
        map.handle_cursor_movement(cursor_offset, cx);
    });
// Verify link is expanded due to proximity
let snapshot = editor.snapshot(cx);
let expanded_count = snapshot.expanded_fold_count();
assert_eq!(
expanded_count, 1,
"Link should be expanded when cursor is near"
);
});
// Move cursor away from the link
cx.set_selections_state(indoc! {"
Check out [this link](https://example.com) for details.|
"});
cx.update_editor(|editor, cx| {
    let cursor_offset = editor.selections.newest::<usize>(cx).head();
    editor.display_map.update(cx, |map, cx| {
        map.handle_cursor_movement(cursor_offset, cx);
    });
// Verify link is folded again
let snapshot = editor.snapshot(cx);
let expanded_count = snapshot.expanded_fold_count();
assert_eq!(
expanded_count, 0,
"Link should be folded when cursor moves away"
);
});
}
#[gpui::test]
async fn test_link_click_action(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
cx.set_state(indoc! {"
Visit [GitHub](https://github.com) for code.
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
});
// Simulate clicking on the folded link
cx.update_editor(|editor, cx| {
    // Use the cursor's buffer offset as the click position; a pixel position
    // cannot be passed to an offset-based lookup directly.
    let click_offset = editor.selections.newest::<usize>(cx).head();
    // Find the fold at the click position, cloning it so the read borrow ends
    // before we mutate the display map below.
    let fold = editor
        .display_map
        .read(cx)
        .syntax_fold_at_offset(click_offset)
        .cloned();
    if let Some(fold) = fold {
        // Verify the fold has the correct URL action
        assert!(matches!(
            fold.action_data,
            Some(FoldAction::OpenUrl(ref url)) if url == "https://github.com"
        ));
        // Execute the action (in tests, this would be mocked)
        editor
            .display_map
            .update(cx, |map, cx| map.execute_fold_action(&fold, cx));
    } else {
        panic!("No fold found at click position");
    }
});
}
#[gpui::test]
async fn test_nested_markdown_structures(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
// Test with nested brackets and complex markdown
cx.set_state(indoc! {"
# Documentation
See [the [nested] guide](https://example.com/guide) for details.
![Image description](https://example.com/image.png)
Code: `[not a link](just code)`
> Quote with [link in quote](https://quoted.com)
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
// Verify correct number of folds (should not fold code block content)
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(
fold_count, 3,
"Should have 3 folds: nested link, image, and quote link"
);
});
}
#[gpui::test]
async fn test_fold_persistence_across_edits(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
cx.set_state(indoc! {"
First [link one](https://one.com) here.
Second [link two](https://two.com) here.
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
});
// Edit text between links
cx.set_selections_state(indoc! {"
First [link one](https://one.com) here|.
Second [link two](https://two.com) here.
"});
cx.update_editor(|editor, cx| editor.handle_input(" and more text", cx));
cx.update_editor(|editor, cx| {
// Verify folds are maintained after edit
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(fold_count, 2, "Both folds should persist after edit");
});
// Add a new link
cx.update_editor(|editor, cx| {
    editor.handle_input("\nThird [link three](https://three.com) added.", cx)
});
cx.update_editor(|editor, cx| {
// Verify new fold is detected
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(fold_count, 3, "New fold should be detected for added link");
});
}
#[gpui::test]
async fn test_channel_notes_integration(cx: &mut TestAppContext) {
init_test(cx);
let mut cx = EditorTestContext::new(cx).await;
// Simulate channel notes content with many links
cx.set_state(indoc! {"
# Project Resources
## Documentation
- [API Reference](https://api.example.com/docs)
- [User Guide](https://guide.example.com)
- [Developer Handbook](https://dev.example.com/handbook)
## Tools
- [Build Status](https://ci.example.com/status)
- [Issue Tracker](https://issues.example.com)
- [Code Review](https://review.example.com)
## External Links
- [Blog Post](https://blog.example.com/announcement)
- [Video Tutorial](https://youtube.com/watch?v=demo)
- [Community Forum](https://forum.example.com)
"});
cx.update_editor(|editor, cx| {
editor.display_map.update(cx, |map, cx| {
map.syntax_fold_config = SyntaxFoldConfig {
enabled: true,
auto_fold_on_open: true,
proximity_expand_distance: 2,
};
map.detect_syntax_folds(cx);
});
// Verify all links are folded
let snapshot = editor.snapshot(cx);
let fold_count = snapshot.fold_count();
assert_eq!(fold_count, 9, "All 9 links should be folded");
// Verify visual appearance (folded links show only text)
let display_text = snapshot.display_text();
assert!(display_text.contains("API Reference"));
assert!(!display_text.contains("https://api.example.com/docs"));
});
}
fn init_test(cx: &mut TestAppContext) {
cx.update(|cx| {
let settings = crate::test::test_settings(cx);
cx.set_global(settings);
theme::init(theme::LoadThemes::JustBase, cx);
language::init(cx);
crate::init(cx);
});
}

View File

@@ -0,0 +1,37 @@
; Fold markdown links to show only the link text.
; Note: `#set!` predicates must sit inside the same top-level pattern they
; configure; standalone predicates are invalid tree-sitter query syntax.
((link
  (link_text) @fold.text
  (link_destination) @fold.url) @fold.auto
 (#set! fold.display "text")
 (#set! fold.action "open_url:url")
 (#set! fold.proximity_expand true))

; Fold image links, showing the alt text
((image
  (image_description) @fold.text
  (link_destination) @fold.url) @fold.auto
 (#set! fold.display "text")
 (#set! fold.action "open_url:url")
 (#set! fold.proximity_expand true))

; Fold reference links
((reference_link
  (link_text) @fold.text
  (link_label)? @fold.label) @fold.auto
 (#set! fold.display "text")
 (#set! fold.proximity_expand true))

; Fold autolinks (bare URLs)
((uri_autolink) @fold.auto @fold.url
 (#set! fold.display "<link>")
 (#set! fold.action "open_url:url")
 (#set! fold.proximity_expand true))

; Fold email autolinks
((email_autolink) @fold.auto @fold.url
 (#set! fold.display "<email>")
 (#set! fold.action "open_url:url")
 (#set! fold.proximity_expand true))
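
An action spec like `"open_url:url"` pairs an action kind with the name of the capture whose text supplies its argument. The following is a hedged sketch of how such a spec might be resolved: the `FoldAction` variants come from the fold map above, but `resolve_fold_action` and the capture-lookup closure are illustrative assumptions, not the editor's actual API.

```rust
#[derive(Debug, PartialEq)]
enum FoldAction {
    OpenUrl(String),
    GoToFile(String),
    RunCommand(String),
    Custom(String, Option<String>),
}

/// Resolve a "<kind>:<capture>" spec, looking up the text captured under
/// `@fold.<capture>` for the current query match.
fn resolve_fold_action(
    spec: &str,
    capture_text: impl Fn(&str) -> Option<String>,
) -> Option<FoldAction> {
    let (kind, capture) = spec.split_once(':')?;
    let text = capture_text(capture)?;
    match kind {
        "open_url" => Some(FoldAction::OpenUrl(text)),
        "go_to_file" => Some(FoldAction::GoToFile(text)),
        "run_command" => Some(FoldAction::RunCommand(text)),
        // Unknown kinds fall through to a custom action.
        other => Some(FoldAction::Custom(other.to_string(), Some(text))),
    }
}
```

For the link pattern above, the `@fold.url` capture would hold the `link_destination` text, so `"open_url:url"` resolves to `FoldAction::OpenUrl` with that URL.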

View File

@@ -8,6 +8,7 @@ message Channel {
uint64 id = 1;
string name = 2;
ChannelVisibility visibility = 3;
int32 channel_order = 4;
repeated uint64 parent_path = 5;
}
@@ -207,6 +208,15 @@ message MoveChannel {
uint64 to = 2;
}
message ReorderChannel {
uint64 channel_id = 1;
enum Direction {
Up = 0;
Down = 1;
}
Direction direction = 2;
}
message JoinChannelBuffer {
uint64 channel_id = 1;
}
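
Per the commit history, the server handles `ReorderChannel` by swapping `channel_order` with the adjacent sibling and returning only the two channels whose order changed (with the edge case treated as a no-op). This is a minimal sketch of that swap under those assumptions; the `Channel` struct and `reorder_channel` signature here are illustrative, not the collab server's actual types.

```rust
#[derive(Debug, Clone, PartialEq)]
struct Channel {
    id: u64,
    channel_order: i32,
}

enum Direction {
    Up,
    Down,
}

/// Swap `channel_order` between `channel_id` and its adjacent sibling,
/// returning the ids of the two channels that changed (empty for a no-op).
fn reorder_channel(siblings: &mut [Channel], channel_id: u64, direction: Direction) -> Vec<u64> {
    // Sort so that index adjacency matches the displayed order.
    siblings.sort_by_key(|c| c.channel_order);
    let Some(ix) = siblings.iter().position(|c| c.id == channel_id) else {
        return Vec::new();
    };
    let other_ix = match direction {
        Direction::Up if ix > 0 => ix - 1,
        Direction::Down if ix + 1 < siblings.len() => ix + 1,
        // Already at the edge: nothing to swap.
        _ => return Vec::new(),
    };
    // Swap only the order values; each id keeps its own channel.
    let tmp = siblings[ix].channel_order;
    siblings[ix].channel_order = siblings[other_ix].channel_order;
    siblings[other_ix].channel_order = tmp;
    vec![siblings[ix].id, siblings[other_ix].id]
}
```

Returning exactly the two swapped channels keeps the response payload minimal and lets clients patch their local ordering without refetching every sibling.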

View File

@@ -190,6 +190,7 @@ message Envelope {
GetChannelMessagesById get_channel_messages_by_id = 144;
MoveChannel move_channel = 147;
ReorderChannel reorder_channel = 349;
SetChannelVisibility set_channel_visibility = 148;
AddNotification add_notification = 149;

View File

@@ -176,6 +176,7 @@ messages!(
(LspExtClearFlycheck, Background),
(MarkNotificationRead, Foreground),
(MoveChannel, Foreground),
(ReorderChannel, Foreground),
(MultiLspQuery, Background),
(MultiLspQueryResponse, Background),
(OnTypeFormatting, Background),
@@ -389,6 +390,7 @@ request_messages!(
(RemoveContact, Ack),
(RenameChannel, RenameChannelResponse),
(RenameProjectEntry, ProjectEntryResponse),
(ReorderChannel, Ack),
(RequestContact, Ack),
(
ResolveCompletionDocumentation,