mixa
2026-02-19 07:49:31 +03:00
parent 6f13bb03aa
commit 78e1efeaea
17 changed files with 2377 additions and 858 deletions

AGENTS.md Normal file

@@ -0,0 +1,83 @@
# P2P Chat Codebase Guide for Agents
This document outlines the development workflow, code style, and architectural patterns for the `p2p-chat` repository.
## 1. Build, Test, and Run Commands
### Basic Commands
* **Build**: `cargo build`
* **Run TUI (default)**: `cargo run`
* **Run GUI**: `cargo run -- --gui`
* **Check**: `cargo check`
* **Format**: `cargo fmt`
* **Lint**: `cargo clippy`
### Testing
* **Run all tests**: `cargo test`
* **Run a specific test**: `cargo test test_name`
* **Run tests with output**: `cargo test -- --nocapture`
### Debugging
* **Logs**:
* TUI mode logs to `p2p-chat.log` in the working directory.
* GUI mode logs to `stdout` (INFO/DEBUG) and `stderr` (WARN/ERROR).
* **Environment Variables**:
* `RUST_LOG`: Control logging levels (e.g., `RUST_LOG=p2p_chat=debug,iroh=info`).
## 2. Code Style & Architecture
### General Rust Guidelines
* **Edition**: Rust 2021.
* **Formatting**: Strictly follow `rustfmt`.
* **Error Handling**: Use `anyhow::Result` for application-level errors and main functions. Use specific `thiserror` enums for libraries/modules when precise error handling is required.
* **Async/Await**: The project is built on the `tokio` runtime. Use `async/await` for I/O-bound tasks.
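The two-layer error convention above can be sketched with std only (a hypothetical `NetError` stands in for a real module error; `thiserror` would derive the `Display`/`Error` impls and `anyhow::Result` would replace `Box<dyn Error>` at the application layer):

```rust
use std::error::Error;
use std::fmt;

// Module-level error enum. In the real codebase this would derive
// `thiserror::Error`; the impls are spelled out so the sketch compiles
// with std alone.
#[derive(Debug)]
enum NetError {
    PeerNotFound(String),
    Timeout,
}

impl fmt::Display for NetError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            NetError::PeerNotFound(id) => write!(f, "peer not found: {id}"),
            NetError::Timeout => write!(f, "operation timed out"),
        }
    }
}

impl Error for NetError {}

// Module API returns the precise error type so callers can match on it.
fn lookup_peer(id: &str) -> Result<u32, NetError> {
    if id == "alice" {
        Ok(7)
    } else {
        Err(NetError::PeerNotFound(id.into()))
    }
}

// Application layer erases the concrete type; `anyhow::Result` plays this
// role in the real code, `Box<dyn Error>` is the std stand-in.
fn app_level(id: &str) -> Result<u32, Box<dyn Error>> {
    let n = lookup_peer(id)?; // NetError converts via the blanket From impl
    Ok(n * 2)
}

fn main() {
    assert_eq!(app_level("alice").unwrap(), 14);
    assert!(app_level("bob").is_err());
}
```

The point of the split: modules keep errors matchable, while `main` and other application-level code only need `?` and a readable message.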
### Project Structure
* `src/main.rs`: Application entry point, runtime setup, and main event loop.
* `src/app_logic.rs`: Core business logic (`AppLogic` struct). Handles network events and application commands. It is agnostic of the UI (TUI vs. GUI).
* `src/gui.rs`: Iced-based GUI implementation following the Elm architecture (Model-View-Update).
* `src/tui/`: Ratatui-based TUI implementation.
* `src/net/`: Networking layer wrapping `iroh` and `iroh-gossip`.
* `src/media/`: Audio/Video capture and playback (GStreamer, cpal, FFmpeg).
* `src/protocol/`: Data structures and serialization (`serde`, `bincode`, `postcard`) for network messages.
### Naming Conventions
* **Variables/Functions**: `snake_case`.
* **Types/Traits**: `UpperCamelCase`.
* **Constants**: `SCREAMING_SNAKE_CASE`.
* **Files**: `snake_case.rs`.
### Architectural Patterns
1. **State Management**:
* `AppLogic` holds the source of truth for application state (`ChatState`, `MediaState`, `NetworkManager`, etc.).
* `FrontendState` is a simplified struct derived from `AppLogic` to pass data to the UI (GUI/TUI) for rendering.
* Do not put core business logic inside `gui.rs` or `tui/`.
2. **Concurrency**:
* Use `tokio::spawn` for background tasks.
* Use `tokio::sync::mpsc` channels for communicating between the UI and the backend logic.
* Use `tokio::sync::broadcast` for one-to-many event distribution (e.g., video frames).
* Use `Arc<Mutex<...>>` or `Arc<Atomic...>` for shared state when channels are insufficient, but prefer message passing.
3. **GUI (Iced)**:
* **Messages**: Define all UI interactions in the `Message` enum in `src/gui.rs`.
* **Update**: Handle `Message`s in the `update` function.
* **View**: Keep the `view` function strictly for rendering based on `self.state`.
* **Subscriptions**: Use `subscription` to listen for external events (e.g., backend state updates, video frames).
4. **Networking**:
* Built on `iroh` (QUIC transport), with `iroh-gossip` for topic-based message distribution.
* Events are received in the main loop and processed by `AppLogic::handle_net_event`.
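The command/state round-trip from patterns 1–2 can be sketched with std channels (hypothetical, trimmed-down `AppCommand`/`FrontendState`; the real code uses `tokio::sync::mpsc` with the same shape):

```rust
use std::sync::mpsc;
use std::thread;

// Trimmed stand-ins for the real AppCommand and FrontendState types.
#[derive(Debug)]
enum AppCommand {
    SendText(String),
    SetMasterVolume(f32),
    Quit,
}

#[derive(Debug, Clone, Default)]
struct FrontendState {
    chat_history: Vec<String>,
    master_volume: f32,
}

fn main() {
    let (cmd_tx, cmd_rx) = mpsc::channel::<AppCommand>();
    let (state_tx, state_rx) = mpsc::channel::<FrontendState>();

    // Backend loop: owns the state, reacts to commands, publishes snapshots.
    let backend = thread::spawn(move || {
        let mut state = FrontendState { master_volume: 1.0, ..Default::default() };
        while let Ok(cmd) = cmd_rx.recv() {
            match cmd {
                AppCommand::SendText(t) => state.chat_history.push(t),
                AppCommand::SetMasterVolume(v) => state.master_volume = v,
                AppCommand::Quit => break,
            }
            let _ = state_tx.send(state.clone());
        }
    });

    // "UI" side: sends commands, receives derived state for rendering.
    cmd_tx.send(AppCommand::SendText("hello".into())).unwrap();
    cmd_tx.send(AppCommand::SetMasterVolume(0.5)).unwrap();
    cmd_tx.send(AppCommand::Quit).unwrap();

    let last = state_rx.iter().last().unwrap();
    backend.join().unwrap();
    assert_eq!(last.chat_history, vec!["hello".to_string()]);
    assert_eq!(last.master_volume, 0.5);
}
```

Because the UI only ever sees snapshots, the backend never needs to lock state for rendering; `Arc<Mutex<...>>` stays reserved for the few hot paths where copying is too expensive.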
### Dependencies
* **Networking**: `iroh`, `iroh-gossip`.
* **Runtime**: `tokio`.
* **UI**: `iced` (GUI), `ratatui` + `crossterm` (TUI).
* **Media**: `gstreamer` (video capture), `cpal` (audio I/O), `ffmpeg-next` (video encoding).
### Common Tasks
* **Adding a new feature**:
1. Update `protocol/mod.rs` if it involves new network messages.
2. Update `AppLogic` to handle the logic.
3. Update `FrontendState` to expose data to the UI.
4. Update `gui.rs` and `tui/` to render the new state.
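Step 4 follows the Elm pattern from the GUI section; a framework-free sketch (illustrative names, `iced` omitted so it runs standalone — in the real `src/gui.rs` these are the `Message` enum, `update`, and `view`):

```rust
// Model-View-Update without a framework: `update` is the only place state
// mutates, `view` is pure rendering from state.
#[derive(Debug, Clone)]
enum Message {
    InputChanged(String),
    SendPressed,
}

#[derive(Default)]
struct Model {
    input: String,
    sent: Vec<String>,
}

fn update(model: &mut Model, msg: Message) {
    match msg {
        Message::InputChanged(s) => model.input = s,
        Message::SendPressed => {
            if !model.input.is_empty() {
                // take() clears the draft while moving it into history
                model.sent.push(std::mem::take(&mut model.input));
            }
        }
    }
}

// In iced this would build widgets; a String stands in for the widget tree.
fn view(model: &Model) -> String {
    format!("{} message(s), draft: {:?}", model.sent.len(), model.input)
}

fn main() {
    let mut m = Model::default();
    update(&mut m, Message::InputChanged("hi".into()));
    update(&mut m, Message::SendPressed);
    assert_eq!(m.sent, vec!["hi".to_string()]);
    println!("{}", view(&m));
}
```

Keeping the new feature's state changes in `update` (and its backend logic in `AppLogic`) is what lets the TUI render the same `FrontendState` without duplicating logic.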

Cargo.lock generated

@@ -335,6 +335,24 @@ dependencies = [
"zbus 4.4.0",
]
[[package]]
name = "ashpd"
version = "0.9.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4d43c03d9e36dd40cab48435be0b09646da362c278223ca535493877b2c1dee9"
dependencies = [
"async-fs",
"async-net",
"enumflags2",
"futures-channel",
"futures-util",
"rand 0.8.5",
"serde",
"serde_repr",
"url",
"zbus 4.4.0",
]
[[package]]
name = "async-broadcast"
version = "0.7.2"
@@ -536,6 +554,12 @@ version = "1.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1505bd5d3d116872e7271a6d4e16d81d0c8570876c8de68093a09ac269d8aac0"
[[package]]
name = "atomic_refcell"
version = "0.1.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "41e67cd8309bbd06cd603a9e693a784ac2e5d1e955f11286e355089fcab3047c"
[[package]]
name = "attohttpc"
version = "0.30.1"
@@ -617,7 +641,7 @@ dependencies = [
"log",
"num-rational",
"num-traits",
"pastey",
"pastey 0.1.1",
"rayon",
"thiserror 2.0.18",
"v_frame",
@@ -2986,6 +3010,16 @@ dependencies = [
"polyval",
]
[[package]]
name = "gif"
version = "0.13.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4ae047235e33e2829703574b54fdec96bfbad892062d97fed2f76022287de61b"
dependencies = [
"color_quant",
"weezl",
]
[[package]]
name = "gif"
version = "0.14.1"
@@ -2996,6 +3030,19 @@ dependencies = [
"weezl",
]
[[package]]
name = "gio-sys"
version = "0.21.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0071fe88dba8e40086c8ff9bbb62622999f49628344b1d1bf490a48a29d80f22"
dependencies = [
"glib-sys",
"gobject-sys",
"libc",
"system-deps",
"windows-sys 0.61.2",
]
[[package]]
name = "gl"
version = "0.14.0"
@@ -3022,6 +3069,50 @@ version = "0.25.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "151665d9be52f9bb40fc7966565d39666f2d1e69233571b71b87791c7e0528b3"
[[package]]
name = "glib"
version = "0.21.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "16de123c2e6c90ce3b573b7330de19be649080ec612033d397d72da265f1bd8b"
dependencies = [
"bitflags 2.10.0",
"futures-channel",
"futures-core",
"futures-executor",
"futures-task",
"futures-util",
"gio-sys",
"glib-macros",
"glib-sys",
"gobject-sys",
"libc",
"memchr",
"smallvec",
]
[[package]]
name = "glib-macros"
version = "0.21.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cf59b675301228a696fe01c3073974643365080a76cc3ed5bc2cbc466ad87f17"
dependencies = [
"heck 0.5.0",
"proc-macro-crate",
"proc-macro2",
"quote",
"syn 2.0.114",
]
[[package]]
name = "glib-sys"
version = "0.21.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2d95e1a3a19ae464a7286e14af9a90683c64d70c02532d88d87ce95056af3e6c"
dependencies = [
"libc",
"system-deps",
]
[[package]]
name = "glob"
version = "0.3.3"
@@ -3061,6 +3152,17 @@ dependencies = [
"gl_generator",
]
[[package]]
name = "gobject-sys"
version = "0.21.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2dca35da0d19a18f4575f3cb99fe1c9e029a2941af5662f326f738a21edaf294"
dependencies = [
"glib-sys",
"libc",
"system-deps",
]
[[package]]
name = "gpu-alloc"
version = "0.6.0"
@@ -3113,6 +3215,129 @@ dependencies = [
"bitflags 2.10.0",
]
[[package]]
name = "gstreamer"
version = "0.24.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0bed73742c5d54cb48533be608b67d89f96e1ebbba280be7823f1ef995e3a9d7"
dependencies = [
"cfg-if",
"futures-channel",
"futures-core",
"futures-util",
"glib",
"gstreamer-sys",
"itertools 0.14.0",
"kstring",
"libc",
"muldiv",
"num-integer",
"num-rational",
"option-operations",
"pastey 0.2.1",
"pin-project-lite",
"smallvec",
"thiserror 2.0.18",
]
[[package]]
name = "gstreamer-app"
version = "0.24.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "895753fb0f976693f321e6b9d68f746ef9095f1a5b8277c11d85d807a949fbfc"
dependencies = [
"futures-core",
"futures-sink",
"glib",
"gstreamer",
"gstreamer-app-sys",
"gstreamer-base",
"libc",
]
[[package]]
name = "gstreamer-app-sys"
version = "0.24.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7719cee28afda1a48ab1ee93769628bd0653d3c5be1923bce9a8a4550fcc980"
dependencies = [
"glib-sys",
"gstreamer-base-sys",
"gstreamer-sys",
"libc",
"system-deps",
]
[[package]]
name = "gstreamer-base"
version = "0.24.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4dd15c7e37d306573766834a5cbdd8ee711265f217b060f40a9a8eda45298488"
dependencies = [
"atomic_refcell",
"cfg-if",
"glib",
"gstreamer",
"gstreamer-base-sys",
"libc",
]
[[package]]
name = "gstreamer-base-sys"
version = "0.24.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "27a2eda2c61e13c11883bf19b290d07ea6b53d04fd8bfeb7af64b6006c6c9ee6"
dependencies = [
"glib-sys",
"gobject-sys",
"gstreamer-sys",
"libc",
"system-deps",
]
[[package]]
name = "gstreamer-sys"
version = "0.24.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5d88630697e757c319e7bcec7b13919ba80492532dd3238481c1c4eee05d4904"
dependencies = [
"cfg-if",
"glib-sys",
"gobject-sys",
"libc",
"system-deps",
]
[[package]]
name = "gstreamer-video"
version = "0.24.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "33987f6a6a99750a07b0341d6288bac89b9b301be4672a209935203d4608d547"
dependencies = [
"cfg-if",
"futures-channel",
"glib",
"gstreamer",
"gstreamer-base",
"gstreamer-video-sys",
"libc",
"thiserror 2.0.18",
]
[[package]]
name = "gstreamer-video-sys"
version = "0.24.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a00c28faad96cd40a7b7592433051199691b131b08f622ed5d51c54e049792d3"
dependencies = [
"glib-sys",
"gobject-sys",
"gstreamer-base-sys",
"gstreamer-sys",
"libc",
"system-deps",
]
[[package]]
name = "guillotiere"
version = "0.6.2"
@@ -3585,6 +3810,7 @@ dependencies = [
"iced_renderer",
"iced_widget",
"iced_winit",
"image 0.24.9",
"thiserror 1.0.69",
]
@@ -3618,6 +3844,7 @@ dependencies = [
"iced_core",
"log",
"rustc-hash 2.1.1",
"tokio",
"wasm-bindgen-futures",
"wasm-timer",
]
@@ -3647,6 +3874,8 @@ dependencies = [
"half",
"iced_core",
"iced_futures",
"image 0.24.9",
"kamadak-exif",
"log",
"once_cell",
"raw-window-handle",
@@ -3887,6 +4116,24 @@ dependencies = [
"xmltree",
]
[[package]]
name = "image"
version = "0.24.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5690139d2f55868e080017335e4b94cb7414274c74f1669c84fb5feba2c9f69d"
dependencies = [
"bytemuck",
"byteorder",
"color_quant",
"exr",
"gif 0.13.3",
"jpeg-decoder",
"num-traits",
"png 0.17.16",
"qoi",
"tiff 0.9.1",
]
[[package]]
name = "image"
version = "0.25.9"
@@ -3897,7 +4144,7 @@ dependencies = [
"byteorder-lite",
"color_quant",
"exr",
"gif",
"gif 0.14.1",
"image-webp",
"moxcms",
"num-traits",
@@ -3906,7 +4153,7 @@ dependencies = [
"ravif",
"rayon",
"rgb",
"tiff",
"tiff 0.10.3",
"zune-core 0.5.1",
"zune-jpeg 0.5.12",
]
@@ -4379,6 +4626,15 @@ dependencies = [
"libc",
]
[[package]]
name = "jpeg-decoder"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "00810f1d8b74be64b13dbf3db89ac67740615d6c891f0e7b6179326533011a07"
dependencies = [
"rayon",
]
[[package]]
name = "js-sys"
version = "0.3.85"
@@ -4389,6 +4645,15 @@ dependencies = [
"wasm-bindgen",
]
[[package]]
name = "kamadak-exif"
version = "0.5.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ef4fc70d0ab7e5b6bafa30216a6b48705ea964cdfc29c050f2412295eba58077"
dependencies = [
"mutate_once",
]
[[package]]
name = "kasuari"
version = "0.4.11"
@@ -4417,6 +4682,15 @@ version = "3.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e2db585e1d738fc771bf08a151420d3ed193d9d895a36df7f6f8a9456b911ddc"
[[package]]
name = "kstring"
version = "2.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "558bf9508a558512042d3095138b1f7b8fe90c5467d94f9f1da28b3731c5dbd1"
dependencies = [
"static_assertions",
]
[[package]]
name = "kurbo"
version = "0.10.4"
@@ -4535,7 +4809,7 @@ dependencies = [
"drm",
"gbm",
"gl",
"image",
"image 0.25.9",
"khronos-egl",
"memmap2",
"rustix 1.1.3",
@@ -4834,6 +5108,18 @@ dependencies = [
"pxfm",
]
[[package]]
name = "muldiv"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "956787520e75e9bd233246045d19f42fb73242759cc57fba9611d940ae96d4b0"
[[package]]
name = "mutate_once"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "13d2233c9842d08cfe13f9eac96e207ca6a2ea10b80259ebe8ad0268be27d2af"
[[package]]
name = "n0-error"
version = "0.1.3"
@@ -5870,6 +6156,15 @@ version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "04744f49eae99ab78e0d5c0b603ab218f515ea8cfe5a456d7629ad883a3b6e7d"
[[package]]
name = "option-operations"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aca39cf52b03268400c16eeb9b56382ea3c3353409309b63f5c8f0b1faf42754"
dependencies = [
"pastey 0.2.1",
]
[[package]]
name = "orbclient"
version = "0.3.50"
@@ -5938,6 +6233,7 @@ name = "p2p-chat"
version = "0.1.0"
dependencies = [
"anyhow",
"ashpd 0.9.2",
"audiopus 0.2.0",
"axum",
"bincode",
@@ -5950,10 +6246,13 @@ dependencies = [
"dashmap",
"directories",
"futures",
"gstreamer",
"gstreamer-app",
"gstreamer-video",
"hex",
"iced",
"iced_futures",
"image",
"image 0.25.9",
"iroh",
"iroh-gossip",
"mime_guess",
@@ -6079,6 +6378,12 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "35fb2e5f958ec131621fdd531e9fc186ed768cbe395337403ae56c17a74c68ec"
[[package]]
name = "pastey"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b867cad97c0791bbd3aaa6472142568c6c9e8f71937e98379f584cfb0cf35bec"
[[package]]
name = "patricia_tree"
version = "0.8.0"
@@ -7126,7 +7431,7 @@ version = "0.14.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "25a73a7337fc24366edfca76ec521f51877b114e42dab584008209cca6719251"
dependencies = [
"ashpd",
"ashpd 0.8.1",
"block",
"dispatch",
"js-sys",
@@ -8594,6 +8899,17 @@ dependencies = [
"cfg-if",
]
[[package]]
name = "tiff"
version = "0.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ba1310fcea54c6a9a4fd1aad794ecc02c31682f6bfbecdf460bf19533eed1e3e"
dependencies = [
"flate2",
"jpeg-decoder",
"weezl",
]
[[package]]
name = "tiff"
version = "0.10.3"
@@ -10746,7 +11062,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3b65b5b36b7abe56a64f6dd46e55e26ca1ed5937a84f57fc86e8f2dd37b136f1"
dependencies = [
"dispatch2",
"image",
"image 0.25.9",
"lazy_static",
"libwayshot-xcap",
"log",

Cargo.toml

@@ -42,7 +42,7 @@ directories = "5.0"
songbird = { version = "0.4", features = ["builtin-queue"] }
audiopus = "0.2"
rfd = "0.14"
iced = "0.13"
iced = { version = "0.13", features = ["image", "wgpu", "tokio"] }
iced_futures = "0.13"
crossbeam-channel = "0.5"
@@ -55,10 +55,14 @@ mime_guess = "2.0.5"
hex = "0.4.3"
cpal = { version = "0.17.1", features = ["jack"] }
xcap = "0.8.2"
ashpd = "0.9"
image = "0.25.9"
ringbuf = "0.4.8"
nnnoiseless = "0.5"
dashmap = "5"
gstreamer = "0.24.4"
gstreamer-app = "0.24.4"
gstreamer-video = "0.24.4"
[profile.dev]
opt-level = 0

View File

@@ -1,6 +1,6 @@
use crate::chat::{ChatEntry, ChatState};
use crate::config::AppConfig;
use crate::file_transfer::FileTransferManager;
use crate::file_transfer::{FileTransferManager, TransferInfo};
use crate::media::MediaState;
use crate::net::{NetEvent, NetworkManager, PeerInfo};
use crate::protocol::{self, GossipMessage};
@@ -11,6 +11,7 @@ use std::path::PathBuf;
pub struct FrontendState {
pub chat_history: Vec<ChatEntry>,
pub peers: Vec<PeerInfo>,
pub transfers: Vec<TransferInfo>,
pub our_name: String,
pub our_id: String,
pub our_id_full: String,
@@ -18,6 +19,7 @@ pub struct FrontendState {
pub input_device_name: Option<String>,
pub output_device_name: Option<String>,
pub master_volume: f32,
pub mic_volume: f32,
pub noise_suppression: bool,
}
@@ -26,6 +28,7 @@ impl Default for FrontendState {
Self {
chat_history: Vec::new(),
peers: Vec::new(),
transfers: Vec::new(),
our_name: "Unknown".to_string(),
our_id: "".to_string(),
our_id_full: "".to_string(),
@@ -33,6 +36,7 @@ impl Default for FrontendState {
input_device_name: None,
output_device_name: None,
master_volume: 1.0,
mic_volume: 1.0,
noise_suppression: true,
}
}
@@ -52,6 +56,7 @@ pub enum AppCommand {
SetInputDevice(String),
SetOutputDevice(String),
SetMasterVolume(f32),
SetMicVolume(f32),
ToggleNoiseCancel,
SetBitrate(u32),
@@ -68,6 +73,8 @@ pub struct AppLogic {
pub our_name: String,
pub our_id_short: String,
pub connected: bool,
pub media_event_rx: tokio::sync::mpsc::Receiver<crate::media::InternalMediaEvent>,
pub net_event_rx: tokio::sync::mpsc::Receiver<crate::net::NetEvent>,
}
impl AppLogic {
@@ -78,6 +85,8 @@ impl AppLogic {
net: NetworkManager,
our_name: String,
our_id_short: String,
media_event_rx: tokio::sync::mpsc::Receiver<crate::media::InternalMediaEvent>,
net_event_rx: tokio::sync::mpsc::Receiver<crate::net::NetEvent>,
) -> Self {
Self {
chat,
@@ -87,6 +96,8 @@ impl AppLogic {
our_name,
our_id_short,
connected: false,
media_event_rx,
net_event_rx,
}
}
@@ -109,38 +120,49 @@ impl AppLogic {
self.chat.add_system_message(status.to_string());
}
AppCommand::SetInputDevice(device_name) => {
self.media.set_input_device(device_name.clone());
self.chat.add_system_message(format!("Microphone set to: {}", device_name));
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.input_device = Some(device_name);
let _ = cfg.save();
}
self.media.set_input_device(device_name.clone());
self.chat
.add_system_message(format!("Microphone set to: {}", device_name));
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.input_device = Some(device_name);
let _ = cfg.save();
}
}
AppCommand::SetOutputDevice(device_name) => {
self.media.set_output_device(device_name.clone());
self.chat.add_system_message(format!("Output set to: {}", device_name));
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.output_device = Some(device_name);
let _ = cfg.save();
}
self.media.set_output_device(device_name.clone());
self.chat
.add_system_message(format!("Output set to: {}", device_name));
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.output_device = Some(device_name);
let _ = cfg.save();
}
}
AppCommand::SetMasterVolume(vol) => {
self.media.set_volume(vol);
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.master_volume = vol;
let _ = cfg.save();
}
cfg.media.master_volume = vol;
let _ = cfg.save();
}
}
AppCommand::SetMicVolume(vol) => {
self.media.set_mic_volume(vol);
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.mic_volume = vol;
let _ = cfg.save();
}
}
AppCommand::ToggleNoiseCancel => {
if let Some(enabled) = self.media.toggle_denoise() {
let status = if enabled { "enabled" } else { "disabled" };
self.chat.add_system_message(format!("Noise cancellation {}", status));
self.chat
.add_system_message(format!("Noise cancellation {}", status));
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.noise_suppression = enabled;
let _ = cfg.save();
}
cfg.media.noise_suppression = enabled;
let _ = cfg.save();
}
} else {
self.chat.add_system_message("Voice chat not active".to_string());
self.chat
.add_system_message("Voice chat not active".to_string());
}
}
AppCommand::Quit => {
@@ -510,17 +532,29 @@ impl AppLogic {
let mut peers: Vec<PeerInfo> = peers_map.values().cloned().collect();
peers.sort_by(|a, b| a.id.to_string().cmp(&b.id.to_string()));
// Sync audio levels
// Sync audio levels and video status
let levels = self.media.get_peer_levels();
let video_sessions = &self.media.active_video_sessions;
for peer in &mut peers {
if let Some(level) = levels.get(&peer.id) {
peer.audio_level = *level;
}
peer.is_streaming_video = video_sessions.contains(&peer.id);
}
// Check timeouts/cleanups
self.file_mgr.check_timeouts();
let transfers = self.file_mgr.active_transfers();
FrontendState {
chat_history: self.chat.history.clone(),
peers,
transfers,
our_name: self.our_name.clone(),
our_id: self.our_id_short.clone(),
our_id_full: self.net.our_id.to_string(),
@@ -528,6 +562,7 @@ impl AppLogic {
input_device_name: self.media.input_device.clone(),
output_device_name: self.media.output_device.clone(),
master_volume: self.media.get_volume(),
mic_volume: self.media.get_mic_volume(),
noise_suppression: self.media.is_denoise_enabled(),
}
}

src/app_logic.rs

@@ -67,6 +67,8 @@ pub struct MediaConfig {
pub output_device: Option<String>,
#[serde(default = "default_volume")]
pub master_volume: f32,
#[serde(default = "default_volume")]
pub mic_volume: f32,
#[serde(default = "default_true")]
pub noise_suppression: bool,
}
@@ -90,6 +92,7 @@ impl Default for MediaConfig {
input_device: None,
output_device: None,
master_volume: 1.0,
mic_volume: 1.0,
noise_suppression: true,
}
}

src/config.rs

@@ -197,10 +197,6 @@ impl FileTransferManager {
// `execute_send` logic needs a timeout on `decode_framed`.
}
/// Execute the sending side of a file transfer over a QUIC bi-stream.
#[allow(dead_code)]
pub async fn execute_send(

File diff suppressed because it is too large.

src/main.rs

@@ -64,29 +64,19 @@ struct Cli {
gui: bool,
}
#[tokio::main]
async fn main() -> Result<()> {
// ... tracing init ...
// Initialize tracing to file (not stdout, since we use TUI)
let _tracing_guard = tracing_subscriber::fmt()
.with_env_filter(
tracing_subscriber::EnvFilter::from_default_env()
.add_directive("p2p_chat=debug".parse()?)
.add_directive("iroh=warn".parse()?)
.add_directive("iroh_gossip=warn".parse()?),
)
.with_writer(|| -> Box<dyn io::Write + Send> {
match std::fs::OpenOptions::new()
.create(true)
.append(true)
.open("p2p-chat.log")
{
Ok(f) => Box::new(f),
Err(_) => Box::new(io::sink()),
}
})
.with_ansi(false)
.init();
// Remove #[tokio::main] and use manual runtime builder
fn main() -> Result<()> {
// Setup runtime
let rt = tokio::runtime::Builder::new_multi_thread()
.enable_all()
.build()?;
// Block on async main logic
rt.block_on(async_main())
}
async fn async_main() -> Result<()> {
let cli = Cli::parse();
// Load config
let config = AppConfig::load().unwrap_or_else(|e| {
@@ -94,9 +84,66 @@ async fn main() -> Result<()> {
AppConfig::default()
});
let cli = Cli::parse();
// Initialize tracing
if cli.gui {
// GUI Mode: Log to stdout (INFO/DEBUG) and stderr (WARN/ERROR).
// tracing_subscriber has no built-in split writer, so implement
// MakeWriter by hand and route by level in make_writer_for.
struct SplitWriter;
impl<'a> tracing_subscriber::fmt::MakeWriter<'a> for SplitWriter {
type Writer = Box<dyn io::Write + Send + 'a>;
fn make_writer(&'a self) -> Self::Writer {
Box::new(io::stdout())
}
fn make_writer_for(&'a self, meta: &tracing::Metadata<'_>) -> Self::Writer {
if *meta.level() <= tracing::Level::WARN {
Box::new(io::stderr())
} else {
Box::new(io::stdout())
}
}
}
tracing_subscriber::fmt()
.with_env_filter(
tracing_subscriber::EnvFilter::from_default_env()
.add_directive("p2p_chat=debug".parse()?)
.add_directive("iroh=warn".parse()?)
.add_directive("iroh_gossip=warn".parse()?),
)
.with_writer(SplitWriter)
.with_ansi(true)
.init();
} else {
// TUI Mode: Log to file to avoid breaking UI
tracing_subscriber::fmt()
.with_env_filter(
tracing_subscriber::EnvFilter::from_default_env()
.add_directive("p2p_chat=debug".parse()?)
.add_directive("iroh=warn".parse()?)
.add_directive("iroh_gossip=warn".parse()?),
)
.with_writer(|| -> Box<dyn io::Write + Send> {
match std::fs::OpenOptions::new()
.create(true)
.append(true)
.open("p2p-chat.log")
{
Ok(f) => Box::new(f),
Err(_) => Box::new(io::sink()),
}
})
.with_ansi(false)
.init();
}
// Topic: CLI > Config > Default
let topic_str = cli
@@ -108,7 +155,7 @@ async fn main() -> Result<()> {
// ... networking init ...
// Initialize networking
let (mut net_mgr, _net_tx, mut net_rx) = NetworkManager::new(topic_bytes)
let (mut net_mgr, _net_tx, net_event_rx) = NetworkManager::new(topic_bytes)
.await
.context("Failed to start networking")?;
@@ -134,12 +181,16 @@ async fn main() -> Result<()> {
let file_mgr = FileTransferManager::new(download_path);
// Pass mic name from config if present
let (media_event_tx, media_event_rx) = tokio::sync::mpsc::channel(10);
let media = MediaState::new(
config.media.mic_bitrate,
config.media.input_device.clone(),
config.media.output_device.clone(),
config.media.master_volume,
config.media.mic_volume,
config.media.noise_suppression,
Some(media_event_tx),
);
// Initialize App with Theme
@@ -184,6 +235,7 @@ async fn main() -> Result<()> {
capabilities: None,
is_self: true,
audio_level: 0.0,
is_streaming_video: false,
},
);
}
@@ -205,8 +257,6 @@ async fn main() -> Result<()> {
);
}
// Start Web Interface
tokio::spawn(crate::web::start_web_server(
media.broadcast_tx.clone(),
@@ -222,6 +272,8 @@ async fn main() -> Result<()> {
net_mgr,
cli.name.clone(),
our_id_short,
media_event_rx,
net_event_rx,
);
if cli.gui {
@@ -232,55 +284,59 @@ async fn main() -> Result<()> {
// Channel for AppLogic -> GUI state updates
let (gui_state_tx, gui_state_rx) = mpsc::channel(100);
// Subscribe to video frames
let video_rx = app_logic.media.video_frame_tx.subscribe();
// Get initial state
let initial_state = app_logic.get_frontend_state().await;
// Spawn AppLogic loop
tokio::spawn(async move {
let mut interval = tokio::time::interval(std::time::Duration::from_millis(100));
let loop_handle = tokio::spawn(async move {
loop {
let mut state_changed = false;
tokio::select! {
_ = interval.tick() => {
app_logic.file_mgr.check_timeouts();
Some(event) = app_logic.net_event_rx.recv() => {
app_logic.handle_net_event(event).await;
// Send state update to GUI
let state = app_logic.get_frontend_state().await;
let _ = gui_state_tx.send(state).await;
}
Some(cmd) = gui_cmd_rx.recv() => {
match app_logic.handle_command(cmd).await {
Ok(true) => { // Quit command
cmd_opt = gui_cmd_rx.recv() => {
match cmd_opt {
Some(cmd) => {
// Handle command from GUI
if let Ok(should_quit) = app_logic.handle_command(cmd).await {
// Update state even if quitting
let state = app_logic.get_frontend_state().await;
let _ = gui_state_tx.send(state).await;
if should_quit {
break;
}
}
}
None => {
// GUI channel closed, exit loop
break;
}
Ok(false) => {
state_changed = true;
}
Err(e) => {
tracing::error!("Command error: {}", e);
}
}
}
Some(event) = net_rx.recv() => {
app_logic.handle_net_event(event).await;
state_changed = true;
}
Some(event) = gossip_event_rx.recv() => {
app_logic.handle_net_event(event).await;
state_changed = true;
}
_ = tokio::signal::ctrl_c() => {
break;
}
}
if state_changed {
let new_state = app_logic.get_frontend_state().await;
if gui_state_tx.send(new_state).await.is_err() {
break;
Some(_media_evt) = app_logic.media_event_rx.recv() => {
// Media state changed (video started/stopped), update GUI
let state = app_logic.get_frontend_state().await;
let _ = gui_state_tx.send(state).await;
}
}
}
// Shutdown logic
let _ = app_logic.net.shutdown().await;
app_logic.media.shutdown();
// Offload entire app_logic shutdown to blocking thread to avoid async drop panics
let _ = tokio::task::spawn_blocking(move || {
// Shutdown media explicitly (stops threads)
app_logic.media.shutdown();
// app_logic is dropped here, on a blocking thread.
// This includes NetworkManager, ChatState, etc.
})
.await;
});
// Run GUI
@@ -288,9 +344,14 @@ async fn main() -> Result<()> {
initial_state,
command_sender: gui_cmd_tx,
state_receiver: gui_state_rx,
video_receiver: video_rx,
};
crate::gui::run(flags)?;
// Wait for background task to finish cleanup
let _ = loop_handle.await;
return Ok(());
}
@@ -306,7 +367,6 @@ async fn main() -> Result<()> {
&mut terminal,
&mut app,
&mut app_logic,
&mut net_rx,
&mut gossip_event_rx,
)
.await;
@@ -326,7 +386,6 @@ async fn run_event_loop(
terminal: &mut Terminal<CrosstermBackend<io::Stdout>>,
app: &mut App,
logic: &mut crate::app_logic::AppLogic,
net_rx: &mut mpsc::Receiver<NetEvent>,
gossip_rx: &mut mpsc::Receiver<NetEvent>,
) -> Result<()> {
let mut event_stream = EventStream::new();
@@ -386,7 +445,7 @@ async fn run_event_loop(
}
// Network events from file transfer acceptor
Some(event) = net_rx.recv() => {
Some(event) = logic.net_event_rx.recv() => {
logic.handle_net_event(event).await;
}
@@ -444,5 +503,3 @@ fn parse_topic(hex_str: &str) -> Result<[u8; 32]> {
Ok(bytes)
}
}

src/media/capture.rs Normal file

@@ -0,0 +1,135 @@
use anyhow::Result;
use gstreamer::prelude::*;
use gstreamer::{Pipeline, State};
use gstreamer_app::{AppSink, AppSinkCallbacks};
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;
pub struct GStreamerCapture {
pipeline: Pipeline,
sink: AppSink,
}
impl GStreamerCapture {
pub fn new(pipeline_str: &str) -> Result<Self> {
// Initialize GStreamer
// We only need to init once, but calling it multiple times is safe.
// gstreamer::init() checks internally.
gstreamer::init()?;
// Create pipeline from string
let pipeline = gstreamer::parse::launch(pipeline_str)?
.downcast::<Pipeline>()
.map_err(|_| anyhow::anyhow!("Expected a pipeline"))?;
// Get the appsink
let sink = pipeline
.by_name("sink")
.ok_or_else(|| anyhow::anyhow!("Pipeline must have an appsink named 'sink'"))?
.downcast::<AppSink>()
.map_err(|_| anyhow::anyhow!("'sink' element is not an appsink"))?;
Ok(Self { pipeline, sink })
}
pub fn set_callback<F>(&self, on_sample: F)
where
F: Fn(&[u8]) + Send + Sync + 'static,
{
let count = Arc::new(AtomicUsize::new(0));
let callbacks = AppSinkCallbacks::builder()
.new_sample(move |sink| {
let sample = sink.pull_sample().map_err(|_| gstreamer::FlowError::Eos)?;
let buffer = sample.buffer().ok_or(gstreamer::FlowError::Error)?;
let map = buffer
.map_readable()
.map_err(|_| gstreamer::FlowError::Error)?;
let c = count.fetch_add(1, Ordering::Relaxed);
if c % 10 == 0 {
tracing::info!("GStreamer captured frame #{}", c);
}
on_sample(map.as_slice());
Ok(gstreamer::FlowSuccess::Ok)
})
.build();
self.sink.set_callbacks(callbacks);
}
pub fn start(&self) -> Result<()> {
self.pipeline.set_state(State::Playing)?;
// Spawn a bus watcher
let bus = self.pipeline.bus().expect("Pipeline has no bus");
std::thread::spawn(move || {
for msg in bus.iter_timed(gstreamer::ClockTime::NONE) {
use gstreamer::MessageView;
match msg.view() {
MessageView::Error(err) => {
tracing::error!("GStreamer Error: {} ({:?})", err.error(), err.debug());
break;
}
MessageView::Warning(warn) => {
tracing::warn!("GStreamer Warning: {} ({:?})", warn.error(), warn.debug());
}
MessageView::Eos(..) => {
tracing::info!("GStreamer EndOfStream");
break;
}
_ => (),
}
}
});
Ok(())
}
pub fn stop(&self) -> Result<()> {
let _ = self.pipeline.set_state(State::Null);
Ok(())
}
}
impl Drop for GStreamerCapture {
fn drop(&mut self) {
let _ = self.stop();
}
}
#[cfg(test)]
mod tests {
use super::*;
use std::sync::{Arc, Mutex};
use std::time::Duration;
#[test]
fn test_appsink_pattern() {
// Simple videotestsrc pipeline for testing
// videotestsrc ! videoconvert ! video/x-raw,format=I420 ! appsink name=sink
let pipeline_str = "videotestsrc num-buffers=10 ! videoconvert ! video/x-raw,format=I420 ! appsink name=sink";
let capture = GStreamerCapture::new(pipeline_str).expect("Failed to create capture");
let frame_count = Arc::new(Mutex::new(0));
let frame_count_clone = frame_count.clone();
capture.set_callback(move |data| {
let mut count = frame_count_clone.lock().unwrap();
*count += 1;
println!("Received frame of size: {}", data.len());
});
capture.start().expect("Failed to start");
// Wait for frames
std::thread::sleep(Duration::from_secs(2));
capture.stop().expect("Failed to stop");
let count = *frame_count.lock().unwrap();
assert!(count > 0, "Should have received at least one frame");
println!("Total frames received: {}", count);
}
}
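The test above pins the appsink caps to I420, so every frame handed to the callback has a predictable size. A minimal sketch of that relationship (the helper name is hypothetical, not part of this crate):

```rust
// Hypothetical helper: expected byte length of one I420 frame.
// I420 is planar YUV 4:2:0: a full-resolution Y plane plus two
// quarter-resolution chroma planes, i.e. width * height * 3 / 2 bytes.
fn i420_frame_len(width: usize, height: usize) -> usize {
    width * height * 3 / 2
}

fn main() {
    // videotestsrc defaults to 320x240 when no capsfilter fixes the size.
    assert_eq!(i420_frame_len(320, 240), 115_200);
    assert_eq!(i420_frame_len(1920, 1080), 3_110_400);
}
```

Asserting `data.len()` against this inside the test callback would catch caps negotiation surprises early.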

File diff suppressed because it is too large


@@ -0,0 +1,43 @@
use anyhow::Result;
use ashpd::desktop::{
screencast::{CursorMode, Screencast, SourceType},
PersistMode,
};
pub struct AshpdKeeper {
pub _proxy: Screencast<'static>,
pub _session: ashpd::desktop::Session<'static, Screencast<'static>>,
}
pub async fn select_wayland_source() -> Result<(u32, AshpdKeeper)> {
let proxy = Screencast::new().await?;
let session = proxy.create_session().await?;
proxy
.select_sources(
&session,
CursorMode::Embedded,
SourceType::Monitor | SourceType::Window,
false,
None,
PersistMode::DoNot,
)
.await?;
let request = proxy
.start(&session, &ashpd::WindowIdentifier::default())
.await?;
let streams = request.response()?;
if let Some(stream) = streams.streams().iter().next() {
Ok((
stream.pipe_wire_node_id(),
AshpdKeeper {
_proxy: proxy,
_session: session,
},
))
} else {
Err(anyhow::anyhow!("No pipewire stream returned from portal"))
}
}
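The node id returned by the portal is what a GStreamer capture pipeline consumes via `pipewiresrc path=<id>`. A hedged sketch of wiring the two together (the pipeline string and helper are assumptions for illustration, not code from this repo):

```rust
// Sketch (assumption): building a GStreamer capture pipeline from the
// PipeWire node id that select_wayland_source() returns. `pipewiresrc
// path=<id>` attaches to the stream the portal negotiated; the AshpdKeeper
// must stay alive or the session (and the stream) is torn down.
fn pipewire_pipeline(node_id: u32) -> String {
    format!(
        "pipewiresrc path={} ! videoconvert ! video/x-raw,format=I420 ! appsink name=sink",
        node_id
    )
}

fn main() {
    let node_id = 42u32; // stand-in for select_wayland_source().await?.0
    assert!(pipewire_pipeline(node_id).starts_with("pipewiresrc path=42 !"));
}
```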

View File

@@ -7,6 +7,7 @@
//! Each feature is runtime-toggleable and runs on dedicated threads/tasks.
pub mod capture;
pub mod playback;
pub mod voice;
use iroh::EndpointId;
@@ -36,6 +37,11 @@ pub enum WebMediaEvent {
},
}
pub enum InternalMediaEvent {
VideoStart(EndpointId),
VideoStop(EndpointId),
}
/// Tracks all active media sessions.
pub struct MediaState {
/// Active voice chat session (if any).
@@ -50,18 +56,32 @@ pub struct MediaState {
// Input channels (from Web -> MediaState -> Peers)
pub mic_broadcast: tokio::sync::broadcast::Sender<Vec<f32>>,
pub screen_broadcast: tokio::sync::broadcast::Sender<Vec<u8>>,
// Channel for integrated video player frames (RGBA, width, height)
pub video_frame_tx: tokio::sync::broadcast::Sender<(Vec<u8>, u32, u32)>,
pub mic_bitrate: Arc<AtomicU32>,
pub input_device: Option<String>,
pub output_device: Option<String>,
pub initial_master_volume: f32,
pub initial_mic_volume: f32,
pub initial_noise_suppression: bool,
pub active_video_sessions: Arc<dashmap::DashSet<EndpointId>>,
pub media_event_tx: Option<tokio::sync::mpsc::Sender<InternalMediaEvent>>,
}
impl MediaState {
pub fn new(mic_bitrate: u32, input_device: Option<String>, output_device: Option<String>, master_volume: f32, noise_suppression: bool) -> Self {
pub fn new(
mic_bitrate: u32,
input_device: Option<String>,
output_device: Option<String>,
master_volume: f32,
mic_volume: f32,
noise_suppression: bool,
media_event_tx: Option<tokio::sync::mpsc::Sender<InternalMediaEvent>>,
) -> Self {
let (broadcast_tx, _) = tokio::sync::broadcast::channel(100);
let (mic_broadcast, _) = tokio::sync::broadcast::channel(100);
let (screen_broadcast, _) = tokio::sync::broadcast::channel(100);
let (video_frame_tx, _) = tokio::sync::broadcast::channel(10); // Low buffer for live video
Self {
voice: None,
screen: None,
@@ -69,11 +89,15 @@ impl MediaState {
broadcast_tx,
mic_broadcast,
screen_broadcast,
video_frame_tx,
mic_bitrate: Arc::new(AtomicU32::new(mic_bitrate)),
input_device,
output_device,
initial_master_volume: master_volume,
initial_mic_volume: mic_volume,
initial_noise_suppression: noise_suppression,
active_video_sessions: Arc::new(dashmap::DashSet::new()),
media_event_tx,
}
}
@@ -91,35 +115,36 @@ impl MediaState {
pub fn get_input_devices(&self) -> Vec<String> {
let mut names = Vec::new();
// Prioritize JACK if available, otherwise ALSA/Pulse/WASAPI
let available_hosts = cpal::available_hosts();
let mut hosts = Vec::new();
// Push JACK first if available
if available_hosts.contains(&cpal::HostId::Jack) {
hosts.push(cpal::host_from_id(cpal::HostId::Jack).unwrap());
}
// Then default host
hosts.push(cpal::default_host());
for host in hosts {
if let Ok(devices) = host.input_devices() {
for device in devices {
if let Ok(name) = device.name() {
// Filter out common noise/unusable devices
if name.contains("dmix") || name.contains("dsnoop") || name.contains("null")
{
continue;
}
// Clean up ALSA names
// Example: "sysdefault:CARD=PCH" -> "PCH (sysdefault)"
// Example: "front:CARD=Microphone,DEV=0" -> "Microphone (front)"
let clean_name = if let Some(start) = name.find("CARD=") {
let rest = &name[start + 5..];
let card_name = rest.split(',').next().unwrap_or(rest);
let prefix = name.split(':').next().unwrap_or("Unknown");
format!("{} ({})", card_name, prefix)
} else if name.contains("HDA Intel PCH") {
@@ -128,13 +153,13 @@ impl MediaState {
} else {
name
};
names.push(clean_name);
}
}
}
}
}
// Dedup and sort
names.sort();
names.dedup();
@@ -143,24 +168,25 @@ impl MediaState {
pub fn get_output_devices(&self) -> Vec<String> {
let mut names = Vec::new();
// Prioritize JACK if available
let available_hosts = cpal::available_hosts();
let mut hosts = Vec::new();
if available_hosts.contains(&cpal::HostId::Jack) {
hosts.push(cpal::host_from_id(cpal::HostId::Jack).unwrap());
}
hosts.push(cpal::default_host());
for host in hosts {
if let Ok(devices) = host.output_devices() {
for device in devices {
if let Ok(name) = device.name() {
if name.contains("dmix") || name.contains("dsnoop") || name.contains("null")
{
continue;
}
let clean_name = if let Some(start) = name.find("CARD=") {
let rest = &name[start + 5..];
let card_name = rest.split(',').next().unwrap_or(rest);
@@ -169,13 +195,13 @@ impl MediaState {
} else {
name
};
names.push(clean_name);
}
}
}
}
}
names.sort();
names.dedup();
names
@@ -191,6 +217,12 @@ impl MediaState {
}
}
pub fn set_mic_volume(&self, volume: f32) {
if let Some(voice) = &self.voice {
voice.set_input_volume(volume);
}
}
pub fn get_volume(&self) -> f32 {
if let Some(voice) = &self.voice {
voice.get_volume()
@@ -199,6 +231,12 @@ impl MediaState {
}
}
pub fn get_mic_volume(&self) -> f32 {
// VoiceChat stores the input volume in an Atomic but does not yet expose a
// getter, so report the initial value for now.
self.initial_mic_volume // TODO: fetch from VoiceChat when a session is active.
}
pub fn is_denoise_enabled(&self) -> bool {
if let Some(voice) = &self.voice {
voice.is_denoise_enabled()
@@ -258,6 +296,7 @@ impl MediaState {
self.input_device.clone(),
self.output_device.clone(), // Added output device
self.initial_master_volume,
self.initial_mic_volume, // Pass initial mic volume
self.initial_noise_suppression,
) {
Ok(vc) => {
@@ -281,7 +320,7 @@ impl MediaState {
} else {
// Start
let peers = net.peers.lock().await;
// Use Native Capture (FFmpeg)
match VideoCapture::start_native(
MediaKind::Screen,
@@ -317,7 +356,10 @@ impl MediaState {
mut recv: iroh::endpoint::RecvStream,
) {
let broadcast_tx = self.broadcast_tx.clone();
let video_frame_tx = self.video_frame_tx.clone();
let active_sessions = self.active_video_sessions.clone();
let media_event_tx = self.media_event_tx.clone();
// Spawn a task to determine stream type and handle it
let handle = tokio::spawn(async move {
// Read first message to determine type.
@@ -325,16 +367,36 @@ impl MediaState {
Ok(msg) => match msg {
MediaStreamMessage::AudioStart { .. } => {
// DEPRECATED in Native Datagram mode
// We could support stream fallback, but for now we ignore or log.
// Or we can close it.
tracing::warn!(
"Received Audio stream from {} (unexpected in datagram mode)",
from
);
}
MediaStreamMessage::VideoStart { .. } => {
tracing::info!("Accepted Video stream from {:?}", from);
if let Err(e) =
VideoCapture::handle_incoming_video_native(from, msg, recv, broadcast_tx)
.await
{
active_sessions.insert(from);
// Notify start
if let Some(tx) = &media_event_tx {
let _ = tx.send(InternalMediaEvent::VideoStart(from)).await;
}
let result = VideoCapture::handle_incoming_video_native(
from,
msg,
recv,
broadcast_tx,
video_frame_tx,
)
.await;
active_sessions.remove(&from);
// Notify stop
if let Some(tx) = &media_event_tx {
let _ = tx.send(InternalMediaEvent::VideoStop(from)).await;
}
if let Err(e) = result {
tracing::error!("Video native playback error: {}", e);
}
}
@@ -362,15 +424,18 @@ impl MediaState {
/// Handle an incoming datagram (unreliable audio/video).
pub fn handle_incoming_datagram(&mut self, from: EndpointId, data: bytes::Bytes) {
if data.is_empty() {
return;
}
// Check first byte for type
match data[0] {
1 => {
// Audio
if let Some(voice) = &mut self.voice {
voice.handle_datagram(from, data);
}
}
// 2 => Video?
_ => {
// tracing::trace!("Unknown datagram type: {}", data[0]);
@@ -410,9 +475,3 @@ impl MediaState {
}
}
}
impl Drop for MediaState {
fn drop(&mut self) {
self.shutdown();
}
}
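The ALSA name cleanup in `get_input_devices`/`get_output_devices` can be isolated as a pure function, which makes the documented examples ("sysdefault:CARD=PCH" -> "PCH (sysdefault)") directly testable. A sketch with a hypothetical helper name:

```rust
// Sketch of the device-name cleanup used when enumerating cpal devices.
// ALSA names like "front:CARD=Microphone,DEV=0" become "Microphone (front)".
fn clean_alsa_name(name: &str) -> String {
    if let Some(start) = name.find("CARD=") {
        let rest = &name[start + 5..];
        // Card name runs up to the next comma-separated field (e.g. ",DEV=0").
        let card_name = rest.split(',').next().unwrap_or(rest);
        // The prefix before ':' is the ALSA PCM type (sysdefault, front, ...).
        let prefix = name.split(':').next().unwrap_or("Unknown");
        format!("{} ({})", card_name, prefix)
    } else {
        name.to_string()
    }
}

fn main() {
    assert_eq!(clean_alsa_name("sysdefault:CARD=PCH"), "PCH (sysdefault)");
    assert_eq!(clean_alsa_name("front:CARD=Microphone,DEV=0"), "Microphone (front)");
    assert_eq!(clean_alsa_name("default"), "default");
}
```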

src/media/playback.rs (new file, 98 lines)

@@ -0,0 +1,98 @@
use anyhow::Result;
use gstreamer::prelude::*;
use gstreamer::{ElementFactory, Pipeline, State};
use gstreamer_app::{AppSink, AppSrc};
use std::sync::{Arc, Mutex};
pub struct VideoPlayer {
pipeline: Pipeline,
appsrc: AppSrc,
appsink: AppSink,
}
impl VideoPlayer {
pub fn new<F>(on_frame: F) -> Result<Self>
where
F: Fn(Vec<u8>, u32, u32) + Send + Sync + 'static,
{
gstreamer::init()?;
// Pipeline: appsrc -> decodebin -> videoconvert -> appsink
// We use decodebin to handle both H.264 and HEVC automatically
// output format RGBA for Iced
let pipeline_str = "appsrc name=src is-live=true format=time do-timestamp=true ! \
parsebin ! \
decodebin ! \
videoconvert ! \
video/x-raw,format=RGBA ! \
appsink name=sink drop=true max-buffers=1 sync=false";
let pipeline = gstreamer::parse::launch(pipeline_str)?
.downcast::<Pipeline>()
.map_err(|_| anyhow::anyhow!("Expected pipeline"))?;
let appsrc = pipeline
.by_name("src")
.ok_or_else(|| anyhow::anyhow!("Missing src"))?
.downcast::<AppSrc>()
.map_err(|_| anyhow::anyhow!("src is not appsrc"))?;
let appsink = pipeline
.by_name("sink")
.ok_or_else(|| anyhow::anyhow!("Missing sink"))?
.downcast::<AppSink>()
.map_err(|_| anyhow::anyhow!("sink is not appsink"))?;
// Set up callback
let callbacks = gstreamer_app::AppSinkCallbacks::builder()
.new_sample(move |sink| {
let sample = sink.pull_sample().map_err(|_| gstreamer::FlowError::Eos)?;
let buffer = sample.buffer().ok_or(gstreamer::FlowError::Error)?;
// Get caps to know width/height
let caps = sample.caps().ok_or(gstreamer::FlowError::Error)?;
let structure = caps.structure(0).ok_or(gstreamer::FlowError::Error)?;
let width = structure
.get::<i32>("width")
.map_err(|_| gstreamer::FlowError::Error)? as u32;
let height = structure
.get::<i32>("height")
.map_err(|_| gstreamer::FlowError::Error)? as u32;
let map = buffer
.map_readable()
.map_err(|_| gstreamer::FlowError::Error)?;
on_frame(map.to_vec(), width, height);
Ok(gstreamer::FlowSuccess::Ok)
})
.build();
appsink.set_callbacks(callbacks);
Ok(Self {
pipeline,
appsrc,
appsink,
})
}
pub fn start(&self) -> Result<()> {
self.pipeline.set_state(State::Playing)?;
Ok(())
}
pub fn stop(&self) -> Result<()> {
let _ = self.pipeline.set_state(State::Null);
Ok(())
}
pub fn push_data(&self, data: &[u8]) -> Result<()> {
let buffer = gstreamer::Buffer::from_slice(data.to_vec());
self.appsrc.push_buffer(buffer)?;
Ok(())
}
}
// Remove Drop impl to prevent panic on shutdown. Rely on explicit stop() or process exit.


@@ -6,11 +6,10 @@ use std::collections::HashMap;
use std::sync::atomic::{AtomicBool, AtomicU32, Ordering};
use std::sync::Arc;
use std::thread;
use std::time::Duration;
use std::time::{Duration, Instant};
use anyhow::{anyhow, Result};
use audiopus::{
coder::Decoder as OpusDecoder, coder::Encoder as OpusEncoder, Application, Bitrate, Channels,
SampleRate,
@@ -18,9 +17,10 @@ use audiopus::{
use bytes::Bytes;
use cpal::traits::{DeviceTrait, HostTrait, StreamTrait};
use crossbeam_channel::{unbounded, Receiver, Sender};
use dashmap::DashMap;
use iroh::EndpointId;
use nnnoiseless::DenoiseState;
use ringbuf::{traits::*, HeapRb};
use crate::media::WebMediaEvent;
const PACKET_TYPE_AUDIO: u8 = 1;
const FRAME_SIZE_SAMPLES: usize = 960; // 20ms at 48kHz
@@ -51,46 +51,53 @@ impl std::ops::DerefMut for SyncAudioProducer {
type AudioProducer = SyncAudioProducer;
type AudioConsumer = ringbuf::HeapCons<f32>;
/// Main voice chat coordination.
pub struct VoiceChat {
running: Arc<AtomicBool>,
tasks: Vec<tokio::task::JoinHandle<()>>,
// Capture and Playback threads
capture_thread: Option<thread::JoinHandle<()>>,
playback_thread: Option<thread::JoinHandle<()>>,
// Per-peer state: Decoder + Jitter Buffer Producer
peer_audio_sinks: HashMap<EndpointId, (SendDecoder, AudioProducer)>,
// Channel to notify playback thread of new peers
new_peer_tx: Sender<(EndpointId, AudioConsumer)>,
// Audio processing controls
pub denoise_enabled: Arc<AtomicBool>,
pub output_volume: Arc<AtomicU32>, // stored as f32 bits
pub input_volume: Arc<AtomicU32>, // stored as f32 bits
pub peer_levels: Arc<DashMap<EndpointId, f32>>,
}
// impl Drop for VoiceChat {
// fn drop(&mut self) {
// self.stop();
// }
// }
impl VoiceChat {
/// Start voice chat session (Native Version with CPAL + QUIC Datagrams).
pub fn start_native(
net: crate::net::NetworkManager,
peers: Vec<EndpointId>,
mic_tx: tokio::sync::broadcast::Sender<Vec<f32>>,
_mic_rx: tokio::sync::broadcast::Receiver<Vec<f32>>,
_broadcast_tx: tokio::sync::broadcast::Sender<WebMediaEvent>,
mic_bitrate: Arc<AtomicU32>,
input_device_name: Option<String>,
output_device_name: Option<String>,
initial_volume: f32,
initial_mic_volume: f32,
initial_denoise: bool,
) -> Result<Self> {
let running = Arc::new(AtomicBool::new(true));
let denoise_enabled = Arc::new(AtomicBool::new(initial_denoise));
let output_volume = Arc::new(AtomicU32::new(initial_volume.to_bits()));
let input_volume = Arc::new(AtomicU32::new(initial_mic_volume.to_bits()));
let peer_levels = Arc::new(DashMap::new());
tracing::info!("Starting Native Voice Chat...");
@@ -101,9 +108,15 @@ impl VoiceChat {
let playback_device_name = output_device_name.clone();
let playback_volume = output_volume.clone();
let playback_levels = peer_levels.clone();
let playback_thread = thread::spawn(move || {
run_playback_loop(
playback_running,
new_peer_rx,
playback_device_name,
playback_volume,
playback_levels,
);
});
// 2. Setup Capture Thread (CPAL Input)
@@ -111,9 +124,16 @@ impl VoiceChat {
let mic_tx_capture = mic_tx.clone();
let capture_device_name = input_device_name.clone();
let capture_denoise = denoise_enabled.clone();
let capture_volume = input_volume.clone();
let capture_thread = thread::spawn(move || {
run_capture_loop(capture_running, mic_tx_capture, capture_device_name, capture_denoise);
run_capture_loop(
capture_running,
mic_tx_capture,
capture_device_name,
capture_denoise,
capture_volume,
);
});
// 3. Setup Network Sender Task (Opus -> Datagrams)
@@ -130,7 +150,8 @@ impl VoiceChat {
mic_rx_sender,
sender_running,
mic_bitrate_clone,
)
.await;
});
tasks.push(sender_task);
@@ -143,12 +164,18 @@ impl VoiceChat {
new_peer_tx,
denoise_enabled,
output_volume,
input_volume,
peer_levels,
})
}
pub fn set_volume(&self, volume: f32) {
self.output_volume
.store(volume.to_bits(), Ordering::Relaxed);
}
pub fn set_input_volume(&self, volume: f32) {
self.input_volume.store(volume.to_bits(), Ordering::Relaxed);
}
pub fn get_volume(&self) -> f32 {
@@ -166,9 +193,12 @@ impl VoiceChat {
}
pub fn get_peer_levels(&self) -> HashMap<EndpointId, f32> {
self.peer_levels
.iter()
.map(|entry| (*entry.key(), *entry.value()))
.collect()
}
// Kept for compatibility but unused in Native mode
pub fn start_web(
_net: crate::net::NetworkManager,
@@ -177,7 +207,7 @@ impl VoiceChat {
_broadcast_tx: tokio::sync::broadcast::Sender<WebMediaEvent>,
_mic_bitrate: Arc<AtomicU32>,
) -> Result<Self> {
Err(anyhow!("Web voice not supported in this native build"))
}
/// Stop voice chat.
@@ -187,7 +217,8 @@ impl VoiceChat {
task.abort();
}
self.tasks.clear();
// Spawn a thread to join audio threads to avoid blocking async context if they are stuck
if let Some(t) = self.capture_thread.take() {
t.thread().unpark();
}
@@ -213,7 +244,7 @@ impl VoiceChat {
// Get or create decoder/producer for this peer
let (decoder, producer) = self.peer_audio_sinks.entry(from).or_insert_with(|| {
tracing::info!("New voice peer detected: {}", from);
// Create Jitter Buffer (RingBuf)
// 48kHz * 1s buffer
let rb = HeapRb::<f32>::new(48000);
@@ -226,7 +257,7 @@ impl VoiceChat {
let decoder = OpusDecoder::new(SampleRate::Hz48000, Channels::Mono)
.expect("Failed to create Opus decoder");
(SendDecoder(decoder), SyncAudioProducer(prod))
});
@@ -254,18 +285,26 @@ fn run_capture_loop(
mic_tx: tokio::sync::broadcast::Sender<Vec<f32>>,
device_name: Option<String>,
denoise_enabled: Arc<AtomicBool>,
input_volume: Arc<AtomicU32>,
) {
let host = cpal::default_host();
// Find device
let device = if let Some(ref name) = device_name {
tracing::info!("Requesting input device: '{}'", name);
host.input_devices()
.ok()
.and_then(|mut ds| ds.find(|d| d.name().map(|n| n == *name).unwrap_or(false)))
} else {
tracing::info!("Requesting default input device");
host.default_input_device()
};
let device = match device {
Some(d) => d,
Some(d) => {
tracing::info!("Found input device: {:?}", d.name());
d
}
None => {
tracing::error!("No input device found");
return;
@@ -281,10 +320,10 @@ fn run_capture_loop(
return;
}
};
// We try to stick to default but standardise to 1 channel if possible.
let stream_config: cpal::StreamConfig = config.clone().into();
tracing::info!("Input config: {:?}", stream_config);
// Initialize RNNoise
@@ -293,6 +332,9 @@ fn run_capture_loop(
let mut processing_buffer: Vec<f32> = Vec::with_capacity(480 * 2);
let mut out_buf = [0.0f32; DenoiseState::FRAME_SIZE];
let mut last_log = Instant::now();
let mut packet_count = 0;
let err_fn = |err| tracing::error!("Input stream error: {}", err);
let stream = match config.sample_format() {
@@ -302,21 +344,36 @@ fn run_capture_loop(
&stream_config,
move |data: &[f32], _: &_| {
if !running_clone.load(Ordering::Relaxed) { return; }
packet_count += 1;
if last_log.elapsed() >= Duration::from_secs(1) {
tracing::info!("Microphone input active: {} callbacks/sec, {} samples in last callback", packet_count, data.len());
packet_count = 0;
last_log = Instant::now();
}
// Convert to Mono
let channels = stream_config.channels as usize;
let mono_samples: Vec<f32> = if channels == 1 {
let mut mono_samples: Vec<f32> = if channels == 1 {
data.to_vec()
} else {
data.chunks(channels).map(|chunk| chunk.iter().sum::<f32>() / channels as f32).collect()
};
// Apply Input Gain
let gain = f32::from_bits(input_volume.load(Ordering::Relaxed));
if gain != 1.0 {
for sample in &mut mono_samples {
*sample *= gain;
}
}
if !mono_samples.is_empty() {
let use_denoise = denoise_enabled.load(Ordering::Relaxed);
if use_denoise {
processing_buffer.extend_from_slice(&mono_samples);
while processing_buffer.len() >= DenoiseState::FRAME_SIZE {
let chunk: Vec<f32> = processing_buffer.drain(0..DenoiseState::FRAME_SIZE).collect();
denoise_state.process_frame(&mut out_buf, &chunk);
@@ -331,10 +388,10 @@ fn run_capture_loop(
err_fn,
None
)
}
_ => {
tracing::error!("Input device does not support F32 samples");
return;
}
};
@@ -343,7 +400,7 @@ fn run_capture_loop(
tracing::error!("Failed to play input stream: {}", e);
}
tracing::info!("Voice started (Capture)");
// Keep thread alive
while running.load(Ordering::Relaxed) {
thread::sleep(Duration::from_millis(100));
@@ -361,15 +418,22 @@ fn run_playback_loop(
peer_levels: Arc<DashMap<EndpointId, f32>>,
) {
let host = cpal::default_host();
let device = if let Some(ref name) = device_name {
tracing::info!("Requesting output device: '{}'", name);
host.output_devices()
.ok()
.and_then(|mut ds| ds.find(|d| d.name().map(|n| n == *name).unwrap_or(false)))
} else {
tracing::info!("Requesting default output device");
host.default_output_device()
};
let device = match device {
Some(d) => d,
Some(d) => {
tracing::info!("Found output device: {:?}", d.name());
d
}
None => {
tracing::error!("No output device found");
return;
@@ -391,14 +455,26 @@ fn run_playback_loop(
let err_fn = |err| tracing::error!("Output stream error: {}", err);
let mut last_log = Instant::now();
let mut packet_count = 0;
let stream = match config.sample_format() {
cpal::SampleFormat::F32 => {
let running_clone = running.clone();
device.build_output_stream(
&stream_config,
move |data: &mut [f32], _: &_| {
if !running_clone.load(Ordering::Relaxed) {
return;
}
packet_count += 1;
if last_log.elapsed() >= Duration::from_secs(1) {
tracing::info!("Speaker output active: {} callbacks/sec", packet_count);
packet_count = 0;
last_log = Instant::now();
}
let master_vol = f32::from_bits(output_volume.load(Ordering::Relaxed));
// Check for new peers non-blocking
@@ -409,10 +485,10 @@ fn run_playback_loop(
// Mix
let channels = stream_config.channels as usize;
// We assume we are filling interleaved buffer.
// Our ringbufs are Mono. We duplicate mono to all channels.
// Pre-allocate level accumulators for this frame
// We'll calculate RMS over the whole buffer size for UI visualization
let mut peer_sums: HashMap<EndpointId, f32> = HashMap::new();
@@ -421,43 +497,43 @@ fn run_playback_loop(
// Iterate output buffer frame by frame (all channels per sample time)
for frame in data.chunks_mut(channels) {
let mut sum: f32 = 0.0;
// Sum up all peers
for (id, c) in consumers.iter_mut() {
if let Some(sample) = c.try_pop() {
sum += sample;
// Accumulate squared sample for RMS
*peer_sums.entry(*id).or_default() += sample * sample;
*peer_counts.entry(*id).or_default() += 1;
}
}
// Apply master volume
sum *= master_vol;
// Soft clip
let mixed = sum.clamp(-1.0, 1.0);
// Assign to all channels
for sample in frame.iter_mut() {
*sample = mixed;
}
}
// Update peer levels in shared map
for (id, sq_sum) in peer_sums {
let count = peer_counts.get(&id).unwrap_or(&1);
let rms = (sq_sum / *count as f32).sqrt();
// Smooth decay could be implemented here, but for now just raw RMS
peer_levels.insert(id, rms);
}
},
err_fn,
None,
)
}
_ => {
tracing::error!("Output device does not support F32 samples");
return;
@@ -468,7 +544,7 @@ fn run_playback_loop(
if let Err(e) = s.play() {
tracing::error!("Failed to play output stream: {}", e);
}
while running.load(Ordering::Relaxed) {
thread::sleep(Duration::from_millis(100));
}
@@ -484,7 +560,9 @@ async fn run_network_sender(
running: Arc<AtomicBool>,
mic_bitrate: Arc<AtomicU32>,
) {
if peers.is_empty() {
return;
}
// Initialize connections
let mut connections = Vec::new();
@@ -502,9 +580,11 @@ async fn run_network_sender(
let mut encoder = OpusEncoder::new(SampleRate::Hz48000, Channels::Mono, Application::Voip)
.expect("Failed to create Opus encoder");
// Initial bitrate
let _ = encoder.set_bitrate(Bitrate::BitsPerSecond(
mic_bitrate.load(Ordering::Relaxed) as i32
));
let mut pcm_buffer: Vec<f32> = Vec::with_capacity(FRAME_SIZE_SAMPLES * 2);
let mut opus_buffer = vec![0u8; 1500];
@@ -522,18 +602,18 @@ async fn run_network_sender(
while pcm_buffer.len() >= FRAME_SIZE_SAMPLES {
let chunk: Vec<f32> = pcm_buffer.drain(0..FRAME_SIZE_SAMPLES).collect();
match encoder.encode_float(&chunk, &mut opus_buffer) {
Ok(len) => {
let opus_packet = &opus_buffer[..len];
// Construct Datagram: [TYPE=1][OPUS]
let mut datagram = Vec::with_capacity(1 + len);
datagram.push(PACKET_TYPE_AUDIO);
datagram.extend_from_slice(opus_packet);
let bytes = Bytes::from(datagram);
// Send to all peers
for conn in &mut connections {
if let Err(e) = conn.send_datagram(bytes.clone()) {
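The datagram layout built here, `[TYPE=1][OPUS]`, is what `handle_incoming_datagram` dispatches on via `data[0]`. A self-contained sketch of the framing (helper name is hypothetical):

```rust
// Sketch of the audio datagram framing sent over QUIC: one type byte
// (PACKET_TYPE_AUDIO = 1) followed by the raw Opus packet.
const PACKET_TYPE_AUDIO: u8 = 1;

fn frame_audio_datagram(opus_packet: &[u8]) -> Vec<u8> {
    let mut datagram = Vec::with_capacity(1 + opus_packet.len());
    datagram.push(PACKET_TYPE_AUDIO);
    datagram.extend_from_slice(opus_packet);
    datagram
}

fn main() {
    let d = frame_audio_datagram(&[0xAA, 0xBB]);
    assert_eq!(d, vec![1, 0xAA, 0xBB]);
    // The receiver strips d[0] and hands the rest to the Opus decoder.
    assert_eq!(d[0], PACKET_TYPE_AUDIO);
}
```

Keeping the type byte out of band like this leaves room for a video datagram type (`2`) later, as the `// 2 => Video?` comment in `handle_incoming_datagram` suggests.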


@@ -73,6 +73,8 @@ pub struct PeerInfo {
pub is_self: bool,
#[serde(skip)]
pub audio_level: f32,
#[serde(skip)]
pub is_streaming_video: bool,
}
/// Manages the iroh networking stack.
@@ -207,6 +209,7 @@ impl NetworkManager {
capabilities: None,
is_self: false,
audio_level: 0.0,
is_streaming_video: false,
});
}
let _ = event_tx.send(NetEvent::PeerUp(peer_id)).await;


@@ -64,9 +64,11 @@ pub async fn start_web_server(
axum::serve(listener, app).await.unwrap();
}
// --- AUDIO ---
async fn ws_audio_handler(
ws: WebSocketUpgrade,
State(state): State<AppState>,
) -> impl IntoResponse {
ws.on_upgrade(move |socket| handle_audio_socket(socket, state))
}
@@ -77,7 +79,11 @@ async fn handle_audio_socket(socket: WebSocket, state: AppState) {
// Outgoing (Server -> Browser)
tokio::spawn(async move {
while let Ok(event) = rx.recv().await {
if let WebMediaEvent::Audio {
peer_id,
data: samples,
} = event
{
// Protocol: [IDLen] [ID] [f32...]
let id_bytes = peer_id.as_bytes();
let id_len = id_bytes.len() as u8;
@@ -87,7 +93,11 @@ async fn handle_audio_socket(socket: WebSocket, state: AppState) {
for s in samples {
payload.extend_from_slice(&s.to_ne_bytes());
}
if sender
.send(Message::Binary(Bytes::from(payload)))
.await
.is_err()
{
break;
}
}
@@ -97,7 +107,7 @@ async fn handle_audio_socket(socket: WebSocket, state: AppState) {
// Incoming (Browser -> Server)
while let Some(msg) = receiver.next().await {
if let Ok(Message::Binary(data)) = msg {
// Protocol: [f32...]
// (the leading header byte `3` was dropped from this protocol)
if data.len() % 4 == 0 {
let samples: Vec<f32> = data
@@ -111,7 +121,10 @@ async fn handle_audio_socket(socket: WebSocket, state: AppState) {
}
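The wire format handled above, `[IDLen][ID][f32...]` outgoing and bare `[f32...]` incoming, can be sketched as a pair of pure functions (names are hypothetical; native endianness, matching the handler's `to_ne_bytes`):

```rust
// Sketch of the WebSocket audio frame layout: [IDLen: u8][ID bytes][f32 samples].
fn encode_audio_frame(peer_id: &str, samples: &[f32]) -> Vec<u8> {
    let id = peer_id.as_bytes();
    let mut payload = Vec::with_capacity(1 + id.len() + samples.len() * 4);
    payload.push(id.len() as u8);
    payload.extend_from_slice(id);
    for s in samples {
        payload.extend_from_slice(&s.to_ne_bytes());
    }
    payload
}

fn decode_audio_frame(data: &[u8]) -> Option<(String, Vec<f32>)> {
    let id_len = *data.first()? as usize;
    let id = std::str::from_utf8(data.get(1..1 + id_len)?).ok()?.to_string();
    let rest = &data[1 + id_len..];
    // Sample payload must be a whole number of f32s, as the handler checks.
    if rest.len() % 4 != 0 {
        return None;
    }
    let samples = rest
        .chunks_exact(4)
        .map(|c| f32::from_ne_bytes(c.try_into().unwrap()))
        .collect();
    Some((id, samples))
}

fn main() {
    let frame = encode_audio_frame("peer1", &[0.5, -0.25]);
    let (id, samples) = decode_audio_frame(&frame).unwrap();
    assert_eq!(id, "peer1");
    assert_eq!(samples, vec![0.5, -0.25]);
}
```

Note that native endianness is only safe while the browser client runs on the same architecture assumptions; a fixed endianness would be more robust across platforms.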
// --- SCREEN ---
async fn ws_screen_handler(
ws: WebSocketUpgrade,
State(state): State<AppState>,
) -> impl IntoResponse {
ws.on_upgrade(move |socket| handle_screen_socket(socket, state))
}
@@ -122,7 +135,12 @@ async fn handle_screen_socket(socket: WebSocket, state: AppState) {
// Outgoing (Server -> Browser)
tokio::spawn(async move {
while let Ok(event) = rx.recv().await {
if let WebMediaEvent::Video {
peer_id,
kind,
data,
} = event
{
if matches!(kind, MediaKind::Screen) {
let id_bytes = peer_id.as_bytes();
let id_len = id_bytes.len() as u8;
@@ -131,11 +149,15 @@ async fn handle_screen_socket(socket: WebSocket, state: AppState) {
payload.extend_from_slice(id_bytes);
payload.extend_from_slice(&data);
if sender
.send(Message::Binary(Bytes::from(payload)))
.await
.is_err()
{
break;
}
}
}
}
}
});


@@ -290,8 +290,9 @@ screen_resolution = "1920x1080"
// ============================================================================
mod tui_tests {
use crossterm::event::{KeyCode, KeyEvent, KeyModifiers};
use p2p_chat::app_logic::AppCommand;
use p2p_chat::config::{Theme, UiConfig};
use p2p_chat::tui::{App, InputMode, TuiCommand};
use p2p_chat::tui::{App, InputMode};
fn make_app() -> App {
let theme: Theme = UiConfig::default().into();
@@ -320,7 +321,7 @@ mod tui_tests {
let mut app = make_app();
assert_eq!(app.input_mode, InputMode::Editing);
let cmd = app.handle_key(key(KeyCode::Esc));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
assert_eq!(app.input_mode, InputMode::Normal);
}
@@ -329,7 +330,7 @@ mod tui_tests {
let mut app = make_app();
app.input_mode = InputMode::Normal;
let cmd = app.handle_key(key(KeyCode::Char('i')));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
assert_eq!(app.input_mode, InputMode::Editing);
}
@@ -338,7 +339,7 @@ mod tui_tests {
let mut app = make_app();
app.input_mode = InputMode::Normal;
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
assert_eq!(app.input_mode, InputMode::Editing);
}
@@ -347,7 +348,7 @@ mod tui_tests {
let mut app = make_app();
app.input_mode = InputMode::Normal;
let cmd = app.handle_key(key(KeyCode::Char('/')));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
assert_eq!(app.input_mode, InputMode::Editing);
assert_eq!(app.input, "/");
assert_eq!(app.cursor_position, 1);
@@ -358,7 +359,7 @@ mod tui_tests {
let mut app = make_app();
app.input_mode = InputMode::Normal;
let cmd = app.handle_key(key(KeyCode::Char('q')));
assert!(matches!(cmd, TuiCommand::Quit));
assert!(matches!(cmd, AppCommand::Quit));
}
#[test]
@@ -388,7 +389,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char('h')));
app.handle_key(key(KeyCode::Char('i')));
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SendMessage(ref s) if s == "hi"));
assert!(matches!(cmd, AppCommand::SendMessage(ref s) if s == "hi"));
assert_eq!(app.input, "");
assert_eq!(app.cursor_position, 0);
}
@@ -397,7 +398,7 @@ mod tui_tests {
fn enter_on_empty_input_does_nothing() {
let mut app = make_app();
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
}
#[test]
@@ -407,7 +408,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::Quit));
assert!(matches!(cmd, AppCommand::Quit));
}
#[test]
@@ -417,7 +418,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SystemMessage(_)));
assert!(matches!(cmd, AppCommand::SystemMessage(_)));
}
#[test]
@@ -427,7 +428,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SystemMessage(_)));
assert!(matches!(cmd, AppCommand::SystemMessage(_)));
}
#[test]
@@ -437,7 +438,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::ChangeNick(ref s) if s == "alice"));
assert!(matches!(cmd, AppCommand::ChangeNick(ref s) if s == "alice"));
}
#[test]
@@ -447,7 +448,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SystemMessage(_)));
assert!(matches!(cmd, AppCommand::SystemMessage(_)));
}
#[test]
@@ -457,7 +458,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::Connect(ref s) if s == "abc123"));
assert!(matches!(cmd, AppCommand::Connect(ref s) if s == "abc123"));
}
#[test]
@@ -467,17 +468,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::ToggleVoice));
}
#[test]
fn camera_command() {
let mut app = make_app();
for c in "/camera".chars() {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::ToggleCamera));
assert!(matches!(cmd, AppCommand::ToggleVoice));
}
#[test]
@@ -487,7 +478,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::ToggleScreen));
assert!(matches!(cmd, AppCommand::ToggleScreen));
}
#[test]
@@ -497,7 +488,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::Leave));
assert!(matches!(cmd, AppCommand::Leave));
}
#[test]
@@ -507,7 +498,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SystemMessage(_)));
assert!(matches!(cmd, AppCommand::SystemMessage(_)));
}
#[test]
@@ -518,7 +509,7 @@ mod tui_tests {
}
let cmd = app.handle_key(key(KeyCode::Enter));
match cmd {
TuiCommand::SendFile(path) => {
AppCommand::SendFile(path) => {
assert_eq!(path.to_str().unwrap(), "/tmp/test.txt");
}
_ => panic!("Expected SendFile, got {:?}", cmd),