Compare commits: 88df6ad510...main — 7 commits

| Author | SHA1 | Date |
|---|---|---|
| | 78e1efeaea | |
| | 6f13bb03aa | |
| | 65b3f22dae | |
| | a3cb75489a | |
| | 61e955a695 | |
| | 9a823fd259 | |
| | 915f07b0c5 | |
.gitignore (vendored) — 3 lines changed

```diff
@@ -1,4 +1,3 @@
 target
-.*
-!.gitignore
+.vscode
 *.log
```
AGENTS.md — new file, 83 lines

# P2P Chat Codebase Guide for Agents

This document outlines the development workflow, code style, and architectural patterns for the `p2p-chat` repository.

## 1. Build, Test, and Run Commands

### Basic Commands

* **Build**: `cargo build`
* **Run TUI (default)**: `cargo run`
* **Run GUI**: `cargo run -- --gui`
* **Check**: `cargo check`
* **Format**: `cargo fmt`
* **Lint**: `cargo clippy`

### Testing

* **Run all tests**: `cargo test`
* **Run a specific test**: `cargo test test_name`
* **Run tests with output**: `cargo test -- --nocapture`

### Debugging

* **Logs**:
  * TUI mode logs to `p2p-chat.log` in the working directory.
  * GUI mode logs to `stdout` (INFO/DEBUG) and `stderr` (WARN/ERROR).
* **Environment Variables**:
  * `RUST_LOG`: Control logging levels (e.g., `RUST_LOG=p2p_chat=debug,iroh=info`).

## 2. Code Style & Architecture

### General Rust Guidelines

* **Edition**: Rust 2021.
* **Formatting**: Strictly follow `rustfmt`.
* **Error Handling**: Use `anyhow::Result` for application-level errors and main functions. Use specific `thiserror` enums for libraries/modules when precise error handling is required.
* **Async/Await**: The project is built on the `tokio` runtime. Use `async/await` for I/O-bound tasks.
### Project Structure

* `src/main.rs`: Application entry point, runtime setup, and main event loop.
* `src/app_logic.rs`: Core business logic (`AppLogic` struct). Handles network events and application commands. It is agnostic of the UI (TUI vs. GUI).
* `src/gui.rs`: Iced-based GUI implementation following the Elm architecture (Model-View-Update).
* `src/tui/`: Ratatui-based TUI implementation.
* `src/net/`: Networking layer wrapping `iroh` and `iroh-gossip`.
* `src/media/`: Audio/Video capture and playback (GStreamer, cpal, FFmpeg).
* `src/protocol/`: Data structures and serialization (`serde`, `bincode`, `postcard`) for network messages.
### Naming Conventions

* **Variables/Functions**: `snake_case`.
* **Types/Traits**: `UpperCamelCase`.
* **Constants**: `SCREAMING_SNAKE_CASE`.
* **Files**: `snake_case.rs`.

### Architectural Patterns

1. **State Management**:
   * `AppLogic` holds the source of truth for application state (`ChatState`, `MediaState`, `NetworkManager`, etc.).
   * `FrontendState` is a simplified struct derived from `AppLogic` to pass data to the UI (GUI/TUI) for rendering.
   * Do not put core business logic inside `gui.rs` or `tui/`.

2. **Concurrency**:
   * Use `tokio::spawn` for background tasks.
   * Use `tokio::sync::mpsc` channels for communicating between the UI and the backend logic.
   * Use `tokio::sync::broadcast` for one-to-many event distribution (e.g., video frames).
   * Use `Arc<Mutex<...>>` or `Arc<Atomic...>` for shared state when channels are insufficient, but prefer message passing.
3. **GUI (Iced)**:
   * **Messages**: Define all UI interactions in the `Message` enum in `src/gui.rs`.
   * **Update**: Handle `Message`s in the `update` function.
   * **View**: Keep the `view` function strictly for rendering based on `self.state`.
   * **Subscriptions**: Use `subscription` to listen for external events (e.g., backend state updates, video frames).

4. **Networking**:
   * Based on `iroh` QUIC.
   * Events are received in the main loop and processed by `AppLogic::handle_net_event`.
### Dependencies

* **Networking**: `iroh`, `iroh-gossip`.
* **Runtime**: `tokio`.
* **UI**: `iced` (GUI), `ratatui` + `crossterm` (TUI).
* **Media**: `gstreamer` (video capture), `cpal` (audio I/O), `ffmpeg-next` (video encoding).

### Common Tasks

* **Adding a new feature**:
  1. Update `protocol/mod.rs` if it involves new network messages.
  2. Update `AppLogic` to handle the logic.
  3. Update `FrontendState` to expose data to the UI.
  4. Update `gui.rs` and `tui/` to render the new state.
Cargo.lock (generated) — 4690 lines changed. File diff suppressed because it is too large.
Cargo.toml — 18 lines changed

```diff
@@ -32,17 +32,19 @@ anyhow = "1"
 tracing = "0.1"
 tracing-subscriber = { version = "0.3", features = ["env-filter"] }
 clap = { version = "4", features = ["derive"] }
-rand = "0.8"
+rand = "0.9"

 # Configuration
 toml = "0.7"
 directories = "5.0"

 # Media
 pipewire = "0.9"
 libspa = "0.9"
 songbird = { version = "0.4", features = ["builtin-queue"] }
 audiopus = "0.2"
 rfd = "0.14"
 iced = { version = "0.13", features = ["image", "wgpu", "tokio"] }
 iced_futures = "0.13"

 crossbeam-channel = "0.5"
 axum = { version = "0.8.8", features = ["ws"] }
 tokio-stream = "0.1.18"
@@ -51,6 +53,16 @@ futures = "0.3.31"
 tower-http = { version = "0.6.8", features = ["fs", "cors"] }
 mime_guess = "2.0.5"
 hex = "0.4.3"
+cpal = { version = "0.17.1", features = ["jack"] }
+xcap = "0.8.2"
+ashpd = "0.9"
+image = "0.25.9"
+ringbuf = "0.4.8"
+nnnoiseless = "0.5"
+dashmap = "5"
+gstreamer = "0.24.4"
+gstreamer-app = "0.24.4"
+gstreamer-video = "0.24.4"

 [profile.dev]
 opt-level = 0
```
README.md — new file, 51 lines

# P2P Chat & File Transfer

A secure, serverless peer-to-peer chat application built with Rust and iroh.

## Features

- **Decentralized**: No central server; peers connect directly via QUIC and NAT traversal.
- **Commands**:
  - `/connect <peer_id>`: Connect to a peer.
  - `/nick <name>`: Set your display name.
  - `/file <path>`: Send a file.
  - `/accept <file_id_prefix>`: Accept a file transfer.
  - `/mic`: Select microphone input.
  - `/speaker`: Select speaker output.
  - `/bitrate <kbps>`: Set audio bitrate.
- **Cross-Platform**: Works on Linux, Windows, and macOS.
  - File sharing supported on all platforms.
  - Voice/screen share (Linux only for now; requires PipeWire).

## Installation

1. Install Rust: https://rustup.rs/
2. Clone the repo:
   ```bash
   git clone https://github.com/yourusername/p2p-chat.git
   cd p2p-chat
   ```
3. Build & Run:
   ```bash
   cargo run --release
   ```

## Usage

1. **Start the app**. You will see your **Peer ID** in the top bar.
2. **Share your Peer ID** with a friend.
3. **Connect**: Type `/connect <friend_peer_id>`.
4. **Chat**: Type messages and press Enter.
5. **Send Files**:
   - Type `/file` to open a file picker.
   - Or `/file /path/to/file` to send immediately.
   - The recipient must type `/accept <file_id>` (or click accept if the TUI supports mouse).

## Configuration

Configuration is stored in:
- Linux: `~/.config/p2p-chat/config.toml`
- Windows: `%APPDATA%\p2p-chat\config.toml`
- macOS: `~/Library/Application Support/p2p-chat/config.toml`

You can customize colors, default resolution, interface, etc.
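A sketch of what such a `config.toml` might look like. The field names and defaults are taken from the `MediaConfig`/`UiConfig` structs in `src/config.rs`, but the exact section names (`[media]`, `[ui]`) are assumptions, not confirmed by the source.

```toml
# Hypothetical config.toml sketch; section names are assumptions.

[media]
screen_resolution = "1280x720"
mic_bitrate = 128000        # bps
master_volume = 1.0
mic_volume = 1.0
noise_suppression = true

[ui]
border = "cyan"
text = "white"
self_name = "green"
peer_name = "magenta"
system_msg = "yellow"
```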
p2p-chat.log — 1175 lines changed. File diff suppressed because it is too large.
src/app_logic.rs — 455 lines changed

```diff
@@ -1,11 +1,69 @@
use anyhow::Result;
use crate::chat::ChatState;
use crate::chat::{ChatEntry, ChatState};
use crate::config::AppConfig;
use crate::file_transfer::FileTransferManager;
use crate::file_transfer::{FileTransferManager, TransferInfo};
use crate::media::MediaState;
use crate::net::{NetEvent, NetworkManager};
use crate::net::{NetEvent, NetworkManager, PeerInfo};
use crate::protocol::{self, GossipMessage};
use crate::tui::TuiCommand;
use anyhow::Result;
use std::path::PathBuf;

#[derive(Debug, Clone)]
pub struct FrontendState {
pub chat_history: Vec<ChatEntry>,
pub peers: Vec<PeerInfo>,
pub transfers: Vec<TransferInfo>,
pub our_name: String,
pub our_id: String,
pub our_id_full: String,
pub media_status: String,
pub input_device_name: Option<String>,
pub output_device_name: Option<String>,
pub master_volume: f32,
pub mic_volume: f32,
pub noise_suppression: bool,
}

impl Default for FrontendState {
fn default() -> Self {
Self {
chat_history: Vec::new(),
peers: Vec::new(),
transfers: Vec::new(),
our_name: "Unknown".to_string(),
our_id: "".to_string(),
our_id_full: "".to_string(),
media_status: "".to_string(),
input_device_name: None,
output_device_name: None,
master_volume: 1.0,
mic_volume: 1.0,
noise_suppression: true,
}
}
}

#[derive(Debug)]
pub enum AppCommand {
SendMessage(String),
/// Local-only system message (not broadcast to peers).
SystemMessage(String),
SendFile(PathBuf),
AcceptFile(String), // file_id prefix
ChangeNick(String),
Connect(String),
ToggleVoice,
ToggleScreen,
SetInputDevice(String),
SetOutputDevice(String),
SetMasterVolume(f32),
SetMicVolume(f32),
ToggleNoiseCancel,

SetBitrate(u32),
Leave,
Quit,
None,
}

pub struct AppLogic {
pub chat: ChatState,
@@ -15,6 +73,8 @@ pub struct AppLogic {
pub our_name: String,
pub our_id_short: String,
pub connected: bool,
pub media_event_rx: tokio::sync::mpsc::Receiver<crate::media::InternalMediaEvent>,
pub net_event_rx: tokio::sync::mpsc::Receiver<crate::net::NetEvent>,
}

impl AppLogic {
@@ -25,6 +85,8 @@ impl AppLogic {
net: NetworkManager,
our_name: String,
our_id_short: String,
media_event_rx: tokio::sync::mpsc::Receiver<crate::media::InternalMediaEvent>,
net_event_rx: tokio::sync::mpsc::Receiver<crate::net::NetEvent>,
) -> Self {
Self {
chat,
@@ -34,32 +96,76 @@ impl AppLogic {
our_name,
our_id_short,
connected: false,
media_event_rx,
net_event_rx,
}
}

pub async fn handle_tui_command(&mut self, cmd: TuiCommand) -> Result<bool> {
pub async fn handle_command(&mut self, cmd: AppCommand) -> Result<bool> {
match cmd {
TuiCommand::SendMessage(text) => {
AppCommand::SendMessage(text) => {
if let Err(e) = self.chat.send_message(text, &self.net).await {
self.chat.add_system_message(format!("Send error: {}", e));
}
}
TuiCommand::SystemMessage(text) => {
AppCommand::SystemMessage(text) => {
self.chat.add_system_message(text);
}
TuiCommand::ToggleVoice => {
AppCommand::ToggleVoice => {
let status = self.media.toggle_voice(self.net.clone()).await;
self.chat.add_system_message(status.to_string());
}
TuiCommand::ToggleCamera => {
let status = self.media.toggle_camera(self.net.clone()).await;
self.chat.add_system_message(status.to_string());
}
TuiCommand::ToggleScreen => {
AppCommand::ToggleScreen => {
let status = self.media.toggle_screen(self.net.clone()).await;
self.chat.add_system_message(status.to_string());
}
TuiCommand::Quit => {
AppCommand::SetInputDevice(device_name) => {
self.media.set_input_device(device_name.clone());
self.chat
.add_system_message(format!("Microphone set to: {}", device_name));
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.input_device = Some(device_name);
let _ = cfg.save();
}
}
AppCommand::SetOutputDevice(device_name) => {
self.media.set_output_device(device_name.clone());
self.chat
.add_system_message(format!("Output set to: {}", device_name));
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.output_device = Some(device_name);
let _ = cfg.save();
}
}
AppCommand::SetMasterVolume(vol) => {
self.media.set_volume(vol);
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.master_volume = vol;
let _ = cfg.save();
}
}
AppCommand::SetMicVolume(vol) => {
self.media.set_mic_volume(vol);
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.mic_volume = vol;
let _ = cfg.save();
}
}
AppCommand::ToggleNoiseCancel => {
if let Some(enabled) = self.media.toggle_denoise() {
let status = if enabled { "enabled" } else { "disabled" };
self.chat
.add_system_message(format!("Noise cancellation {}", status));
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.noise_suppression = enabled;
let _ = cfg.save();
}
} else {
self.chat
.add_system_message("Voice chat not active".to_string());
}
}
AppCommand::Quit => {
// Broadcast disconnect to peers
let disconnect_msg = GossipMessage::Disconnect {
sender_name: self.our_name.clone(),
@@ -70,7 +176,7 @@ impl AppLogic {
self.media.shutdown();
return Ok(true); // Signal to quit
}
TuiCommand::ChangeNick(new_nick) => {
AppCommand::ChangeNick(new_nick) => {
let old = self.our_name.clone();
self.our_name = new_nick.clone();
self.chat.our_name = new_nick.clone();
@@ -81,43 +187,47 @@ impl AppLogic {
self_peer.name = Some(new_nick.clone());
}
}
self.chat.add_system_message(
format!("Nickname changed: {} → {}", old, new_nick),
);
self.chat
.add_system_message(format!("Nickname changed: {} ➡️ {}", old, new_nick));
// Broadcast name change to all peers
let msg = GossipMessage::NameChange(
protocol::NameChange {
old_name: old,
new_name: new_nick,
},
);
let msg = GossipMessage::NameChange(protocol::NameChange {
old_name: old,
new_name: new_nick,
});
let _ = self.net.broadcast(&msg).await;
}
TuiCommand::Connect(peer_id_str) => {
AppCommand::Connect(peer_id_str) => {
match peer_id_str.parse::<crate::net::EndpointId>() {
Ok(peer_id) => {
self.chat.add_system_message(format!("Connecting to {}...", peer_id));
self.chat
.add_system_message(format!("Connecting to {}...", peer_id));
if let Err(e) = self.net.connect(peer_id).await {
self.chat.add_system_message(format!("Connection failed: {}", e));
self.chat
.add_system_message(format!("Connection failed: {}", e));
} else {
self.chat.add_system_message("Connection initiated.".to_string());
self.chat
.add_system_message("Connection initiated.".to_string());
}
}
Err(_) => {
self.chat.add_system_message(format!("Invalid peer ID: {}", peer_id_str));
self.chat
.add_system_message(format!("Invalid peer ID: {}", peer_id_str));
}
}
}
TuiCommand::SendFile(path) => {
self.chat.add_system_message(format!("Preparing to send file: {:?}", path));
AppCommand::SendFile(path) => {
self.chat
.add_system_message(format!("Preparing to send file: {:?}", path));
if !path.exists() {
self.chat.add_system_message(format!("File not found: {}", path.display()));
self.chat
.add_system_message(format!("File not found: {}", path.display()));
} else {
let file_mgr = self.file_mgr.clone();
// Prepare send
match file_mgr.prepare_send(&path).await {
Ok((file_id, offer)) => {
self.chat.add_system_message(format!("Offering file: {}", offer.name));
self.chat
.add_system_message(format!("Offering file: {}", offer.name));
let broadcast = protocol::FileOfferBroadcast {
sender_name: self.our_name.to_string(),
file_id,
@@ -127,93 +237,99 @@ impl AppLogic {
};
let msg = GossipMessage::FileOfferBroadcast(broadcast);
if let Err(e) = self.net.broadcast(&msg).await {
self.chat.add_system_message(format!("Failed to broadcast offer: {}", e));
self.chat.add_system_message(format!(
"Failed to broadcast offer: {}",
e
));
}
}
Err(e) => {
self.chat.add_system_message(format!("Failed to prepare file: {}", e));
self.chat
.add_system_message(format!("Failed to prepare file: {}", e));
}
}
}
}
TuiCommand::Leave => {
self.chat.add_system_message("Leaving group chat...".to_string());
AppCommand::Leave => {
self.chat
.add_system_message("Leaving group chat...".to_string());
self.media.shutdown();
// Clear peer list (except self)
{
let mut peers = self.net.peers.lock().await;
peers.retain(|_, info| info.is_self);
}
self.chat.add_system_message("Session ended. Use /connect <peer_id> to start a new session.".to_string());
self.chat.add_system_message(
"Session ended. Use /connect <peer_id> to start a new session.".to_string(),
);
}
TuiCommand::SelectMic(node_name) => {
self.media.set_mic_name(Some(node_name.clone()));
if self.media.voice_enabled() {
self.chat.add_system_message("Restarting voice with new mic...".to_string());
// Toggle off
let _ = self.media.toggle_voice(self.net.clone()).await;
// Toggle on
let status = self.media.toggle_voice(self.net.clone()).await;
self.chat.add_system_message(status.to_string());
}
self.chat.add_system_message(format!("🎤 Mic set to: {}", node_name));
// Save to config
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.mic_name = Some(node_name);
let _ = cfg.save();
}
}
TuiCommand::SetBitrate(bps) => {

AppCommand::SetBitrate(bps) => {
self.media.set_bitrate(bps);
self.chat.add_system_message(format!("🎵 Bitrate set to {} kbps", bps / 1000));
self.chat
.add_system_message(format!("🎵 Bitrate set to {} kbps", bps / 1000));
// Save to config
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.mic_bitrate = bps;
let _ = cfg.save();
}
}
TuiCommand::SelectSpeaker(node_name) => {
self.media.set_speaker_name(Some(node_name.clone()));
self.chat.add_system_message(format!("🔊 Speaker set to: {}", node_name));
// Save to config
if let Ok(mut cfg) = AppConfig::load() {
cfg.media.speaker_name = Some(node_name);
if let Err(e) = cfg.save() {
tracing::warn!("Failed to save config: {}", e);
}
}
if self.media.voice_enabled() {
self.chat.add_system_message("Restart voice chat to apply changes.".to_string());
}
}
TuiCommand::AcceptFile(prefix) => {

AppCommand::AcceptFile(prefix) => {
// Find matching transfer
let transfers = self.file_mgr.transfers.lock().unwrap();
let mut matched_id = None;
for (id, _info) in transfers.iter() {
let id_str = hex::encode(id);
if id_str.starts_with(&prefix) {
if matched_id.is_some() {
self.chat.add_system_message(format!("Ambiguous prefix '{}'", prefix));
matched_id = None;
break;
let matched = {
let transfers = self.file_mgr.transfers.lock().unwrap();
let mut matched = None;
for (id, info) in transfers.iter() {
let id_str = hex::encode(id);
if id_str.starts_with(&prefix) {
if matched.is_some() {
self.chat
.add_system_message(format!("Ambiguous prefix '{}'", prefix));
matched = None;
break;
}
matched = Some((*id, info.clone()));
}
matched_id = Some(*id);
}
}
drop(transfers);

if let Some(id) = matched_id {
if self.file_mgr.accept_transfer(id) {
self.chat.add_system_message(format!("Accepted transfer {}", hex::encode(id).chars().take(8).collect::<String>()));
matched
};

if let Some((_id, info)) = matched {
if let Some(peer_id) = info.peer {
self.chat
.add_system_message(format!("Requesting file from {}...", peer_id));
let req = protocol::FileRequest {
sender_name: self.our_name.clone(),
file_id: info.file_id,
};

// Update state to Requesting so we auto-accept when stream comes
let expires_at =
std::time::Instant::now() + std::time::Duration::from_secs(60);
{
// We need to upgrade the lock or re-acquire?
// We dropped transfers lock at line 221.
let mut transfers = self.file_mgr.transfers.lock().unwrap();
if let Some(t) = transfers.get_mut(&info.file_id) {
t.state =
crate::file_transfer::TransferState::Requesting { expires_at };
}
}

let msg = GossipMessage::FileRequest(req);
let _ = self.net.broadcast(&msg).await;
} else {
self.chat.add_system_message("Transfer not found or not waiting for accept.".to_string());
self.chat.add_system_message(
"Cannot accept: Sender unknown (legacy offer?)".to_string(),
);
}
} else {
self.chat.add_system_message(format!("No transfer found matching '{}'", prefix));
self.chat
.add_system_message(format!("No transfer found matching '{}'", prefix));
}
}
TuiCommand::None => {}
AppCommand::None => {}
}
Ok(false) // Do not quit
}
@@ -244,9 +360,78 @@ impl AppLogic {
}
GossipMessage::FileOfferBroadcast(offer) => {
self.chat.add_system_message(format!(
"{} offers file: {} ({} bytes)",
offer.sender_name, offer.file_name, offer.file_size
"{} offers file: {} ({})",
offer.sender_name,
offer.file_name,
crate::file_transfer::format_bytes(offer.file_size)
));
self.file_mgr.register_incoming_broadcast(&offer, from);
}
GossipMessage::FileRequest(req) => {
let file_mgr = self.file_mgr.clone();
let net = self.net.clone();
let file_id = req.file_id;
let peer_id = from;
let sender_name = req.sender_name.clone();

// Check if we have transfer info
let transfer_opt = {
let transfers = file_mgr.transfers.lock().unwrap();
transfers.get(&file_id).cloned()
};

if let Some(info) = transfer_opt {
if let (Some(path), Some(offer)) = (info.path, info.offer) {
self.chat.add_system_message(format!(
"Starting transfer of {} to {}",
info.file_name, sender_name
));

tokio::spawn(async move {
// Open connection
match net.open_file_stream(peer_id).await {
Ok((mut send, mut recv)) => {
if let Err(e) = file_mgr
.execute_send(
file_id, &path, offer, &mut send, &mut recv,
)
.await
{
eprintln!("Transfer failed: {}", e);
let mut transfers = file_mgr.transfers.lock().unwrap();
if let Some(t) = transfers.get_mut(&file_id) {
t.state =
crate::file_transfer::TransferState::Failed {
error: e.to_string(),
completed_at: std::time::Instant::now(),
};
}
}
}
Err(e) => {
eprintln!(
"Failed to open connection to {}: {}",
peer_id, e
);
let mut transfers = file_mgr.transfers.lock().unwrap();
if let Some(t) = transfers.get_mut(&file_id) {
t.state = crate::file_transfer::TransferState::Failed {
error: format!("Conn error: {}", e),
completed_at: std::time::Instant::now(),
};
}
}
}
});
} else {
self.chat.add_system_message(
"Cannot send: Missing file path or offer details.".to_string(),
);
}
} else {
self.chat
.add_system_message("Received request for unknown file.".to_string());
}
}
GossipMessage::NameChange(change) => {
let mut peers = self.net.peers.lock().await;
@@ -259,31 +444,64 @@ impl AppLogic {
));
}
GossipMessage::Disconnect { sender_name } => {
let mut should_exit = false;
{
let mut peers = self.net.peers.lock().await;
peers.remove(&from);
// Auto-terminate if no peers left
// We check if peers is empty. Note: we are not in peers map.
if peers.is_empty() {
should_exit = true;
}
}
let short_id: String = format!("{}", from).chars().take(8).collect();
self.chat.add_system_message(format!(
"👋 {} ({}) disconnected",
sender_name, short_id
));

if should_exit && self.connected {
self.chat
.add_system_message("All peers disconnected. Exiting...".to_string());
// Trigger shutdown
self.media.shutdown();
std::process::exit(0);
}
}
},
NetEvent::PeerUp(peer_id) => {
let short_id: String = format!("{}", peer_id).chars().take(8).collect();
self.chat.add_system_message(format!("Peer connected: {}", short_id));
self.chat
.add_system_message(format!("Peer connected: {}", short_id));
self.connected = true;
}
NetEvent::PeerDown(peer_id) => {
let mut should_exit = false;
{
let mut peers = self.net.peers.lock().await;
peers.remove(&peer_id);
if peers.is_empty() {
should_exit = true;
}
}
let short_id: String = format!("{}", peer_id).chars().take(8).collect();
self.chat.add_system_message(format!("Peer disconnected: {}", short_id));
self.chat
.add_system_message(format!("Peer disconnected: {}", short_id));

if should_exit && self.connected {
self.chat
.add_system_message("All peers disconnected. Exiting...".to_string());
self.media.shutdown();
std::process::exit(0);
}
}
NetEvent::IncomingFileStream {
from,
mut send,
mut recv,
} => {
self.chat.add_system_message("Incoming file transfer...".to_string());
self.chat
.add_system_message("Incoming file transfer...".to_string());
tracing::info!("Incoming file stream from {:?}", from);
let file_mgr = self.file_mgr.clone();
tokio::spawn(async move {
@@ -299,7 +517,8 @@ impl AppLogic {
recv,
} => {
let short_id: String = format!("{}", from).chars().take(8).collect();
self.chat.add_system_message(format!("📡 Incoming {:?} stream from {}", kind, short_id));
self.chat
.add_system_message(format!("📡 Incoming {:?} stream from {}", kind, short_id));
self.media.handle_incoming_media(from, kind, send, recv);
}
NetEvent::IncomingDatagram { from, data } => {
@@ -307,4 +526,44 @@ impl AppLogic {
}
}
}

pub async fn get_frontend_state(&self) -> FrontendState {
let peers_map = self.net.peers.lock().await;
let mut peers: Vec<PeerInfo> = peers_map.values().cloned().collect();
peers.sort_by(|a, b| a.id.to_string().cmp(&b.id.to_string()));

// Sync audio levels and video status
let levels = self.media.get_peer_levels();
let video_sessions = &self.media.active_video_sessions;

for peer in &mut peers {
if let Some(level) = levels.get(&peer.id) {
peer.audio_level = *level;
}
if video_sessions.contains(&peer.id) {
peer.is_streaming_video = true;
} else {
peer.is_streaming_video = false;
}
}

// Check timeouts/cleanups
self.file_mgr.check_timeouts();
let transfers = self.file_mgr.active_transfers();

FrontendState {
chat_history: self.chat.history.clone(),
peers,
transfers,
our_name: self.our_name.clone(),
our_id: self.our_id_short.clone(),
our_id_full: self.net.our_id.to_string(),
media_status: self.media.status_line(),
input_device_name: self.media.input_device.clone(),
output_device_name: self.media.output_device.clone(),
master_volume: self.media.get_volume(),
mic_volume: self.media.get_mic_volume(),
noise_suppression: self.media.is_denoise_enabled(),
}
}
}
```
```diff
@@ -1,11 +1,11 @@
//! Chat module — manages chat history and message sending/receiving.

use crate::protocol::{ChatMessage, GossipMessage};
use crate::net::NetworkManager;
use crate::protocol::{ChatMessage, GossipMessage};
use anyhow::Result;

/// Stored chat entry with display metadata.
#[derive(Debug, Clone)]
#[derive(Debug, Clone, serde::Serialize)]
pub struct ChatEntry {
pub sender_name: String,
pub timestamp: u64,
```
@@ -1,9 +1,9 @@
use std::path::PathBuf;
use std::fs;
use anyhow::Result;
use directories::ProjectDirs;
use serde::{Deserialize, Serialize};
use ratatui::style::Color;
use serde::{Deserialize, Serialize};
use std::fs;
use std::path::PathBuf;

#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct AppConfig {
@@ -59,34 +55,55 @@ impl Default for NetworkConfig {
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct MediaConfig {
    pub screen_resolution: String,
    pub mic_name: Option<String>,
    pub speaker_name: Option<String>,
    #[serde(default = "default_bitrate")]
    pub mic_bitrate: u32,
    #[serde(default)]
    pub input_device: Option<String>,
    #[serde(default)]
    pub output_device: Option<String>,
    #[serde(default = "default_volume")]
    pub master_volume: f32,
    #[serde(default = "default_volume")]
    pub mic_volume: f32,
    #[serde(default = "default_true")]
    pub noise_suppression: bool,
}

fn default_bitrate() -> u32 {
    128000
}

fn default_volume() -> f32 {
    1.0
}

fn default_true() -> bool {
    true
}

impl Default for MediaConfig {
    fn default() -> Self {
        Self {
            screen_resolution: "1280x720".to_string(),
            mic_name: None,
            speaker_name: None,
            mic_bitrate: 128000,
            input_device: None,
            output_device: None,
            master_volume: 1.0,
            mic_volume: 1.0,
            noise_suppression: true,
        }
    }
}
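The `#[serde(default = "…")]` attributes added above exist so config files written before these fields existed still deserialize instead of failing to parse. A hypothetical `config.toml` fragment illustrating the idea (field names are from the struct above; the actual file location is whatever `AppConfig::get_config_path()` resolves to):

```toml
[media]
screen_resolution = "1280x720"
# Configs written before the media update may simply stop here.
# Missing fields below are filled from default_bitrate(), default_volume()
# and default_true() rather than producing a deserialization error.
mic_bitrate = 128000
master_volume = 1.0
mic_volume = 1.0
noise_suppression = true
```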
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct UiConfig {
    pub border: String,
    pub text: String,
    #[serde(default = "default_cyan")]
    pub chat_border: String,
    #[serde(default = "default_cyan")]
    pub peer_border: String,
    #[serde(default = "default_yellow")]
    pub transfer_border: String,
    pub self_name: String,
    pub peer_name: String,
    pub system_msg: String,

    pub time: String,
@@ -96,23 +117,21 @@ pub struct UiConfig {
    pub error: String,
    #[serde(default)]
    pub warning: String,
    #[serde(default)]
    pub info: String,
}

impl Default for UiConfig {
    fn default() -> Self {
        Self {
            border: "cyan".to_string(),
            text: "white".to_string(),
            self_name: "green".to_string(),
            peer_name: "magenta".to_string(),
            system_msg: "yellow".to_string(),
            time: "dark_gray".to_string(),
            chat_border: "cyan".to_string(),
            peer_border: "cyan".to_string(),
            transfer_border: "yellow".to_string(),
            success: "green".to_string(),
            error: "red".to_string(),
            warning: "yellow".to_string(),
            info: "cyan".to_string(),
        }
    }
}
@@ -120,7 +139,7 @@ impl Default for UiConfig {
impl AppConfig {
    pub fn load() -> Result<Self> {
        let config_path = Self::get_config_path();

        if !config_path.exists() {
            // Create default config if it doesn't exist
            let default_config = Self::default();
@@ -198,7 +217,7 @@ pub fn parse_color(color_str: &str) -> Color {
        "white" => Color::White,
        _ => {
            // Try hex parsing if needed, but for now fall back to White
            Color::White
            Color::White
        }
    }
}
@@ -206,31 +225,38 @@ pub fn parse_color(color_str: &str) -> Color {
// Runtime Theme struct derived from config
#[derive(Debug, Clone)]
pub struct Theme {
    pub border: Color,
    pub chat_border: Color,
    pub peer_border: Color,
    pub transfer_border: Color,
    pub text: Color,
    pub self_name: Color,
    pub peer_name: Color,
    pub system_msg: Color,
    pub time: Color,
    pub success: Color,
    pub error: Color,
    pub warning: Color,
    pub info: Color,
}

impl From<UiConfig> for Theme {
    fn from(cfg: UiConfig) -> Self {
        Self {
            border: parse_color(&cfg.border),
            chat_border: parse_color(&cfg.chat_border),
            peer_border: parse_color(&cfg.peer_border),
            transfer_border: parse_color(&cfg.transfer_border),
            text: parse_color(&cfg.text),
            self_name: parse_color(&cfg.self_name),
            peer_name: parse_color(&cfg.peer_name),
            system_msg: parse_color(&cfg.system_msg),
            time: parse_color(&cfg.time),
            success: parse_color(&cfg.success),
            error: parse_color(&cfg.error),
            warning: parse_color(&cfg.warning),
            info: parse_color(&cfg.info),
        }
    }
}

fn default_yellow() -> String {
    "yellow".to_string()
}
fn default_cyan() -> String {
    "cyan".to_string()
}
@@ -4,13 +4,14 @@ use std::collections::HashMap;
use std::path::{Path, PathBuf};

use anyhow::{Context, Result};
use sha2::{Sha256, Digest};
use sha2::{Digest, Sha256};
use tokio::fs::File;
use tokio::io::{AsyncReadExt, AsyncWriteExt};

use crate::net::EndpointId;
use crate::protocol::{
    decode_framed, write_framed, FileChunk, FileDone, FileId, FileOffer,
    FileStreamMessage, FileAcceptReject, new_file_id,
    decode_framed, new_file_id, write_framed, FileAcceptReject, FileChunk, FileDone, FileId,
    FileOffer, FileStreamMessage,
};

/// Chunk size for file transfers (64 KB).
@@ -23,9 +24,9 @@ pub enum TransferState {
    /// We offered a file, waiting for the peer to accept.
    Offering,
    /// We received an offer, waiting for the user to accept.
    WaitingForAccept {
        expires_at: std::time::Instant,
    },
    WaitingForAccept { expires_at: std::time::Instant },
    /// We requested the file, waiting for the sender to connect.
    Requesting { expires_at: std::time::Instant },
    /// Transfer is in progress.
    Transferring {
        bytes_transferred: u64,
@@ -33,11 +34,14 @@ pub enum TransferState {
        start_time: std::time::Instant,
    },
    /// Transfer completed successfully.
    Complete,
    Complete { completed_at: std::time::Instant },
    /// Transfer was rejected by the peer.
    Rejected,
    Rejected { completed_at: std::time::Instant },
    /// Transfer failed with an error.
    Failed(String),
    Failed {
        error: String,
        completed_at: std::time::Instant,
    },
}

/// Information about a tracked file transfer.
@@ -49,6 +53,9 @@ pub struct TransferInfo {
    pub file_size: u64,
    pub state: TransferState,
    pub is_outgoing: bool,
    pub peer: Option<EndpointId>,
    pub path: Option<PathBuf>,
    pub offer: Option<FileOffer>,
}

use std::sync::{Arc, Mutex};
@@ -73,10 +80,7 @@ impl FileTransferManager {

    /// Initiate sending a file to a peer.
    /// Returns the file ID and the file offer for broadcasting.
    pub async fn prepare_send(
        &self,
        file_path: &Path,
    ) -> Result<(FileId, FileOffer)> {
    pub async fn prepare_send(&self, file_path: &Path) -> Result<(FileId, FileOffer)> {
        let file_name = file_path
            .file_name()
            .context("No filename")?
@@ -126,44 +130,72 @@ impl FileTransferManager {
                file_id,
                file_name,
                file_size,
                state: TransferState::Offering,
                state: TransferState::WaitingForAccept {
                    expires_at: std::time::Instant::now()
                        + std::time::Duration::from_secs(timeout),
                },
                is_outgoing: true,
                peer: None,
                path: Some(file_path.to_path_buf()),
                offer: Some(offer.clone()),
            },
        );
    }

    Ok((file_id, offer))
}

// Waiting tasks need a signal when the user accepts or rejects an offer.
// Since `execute_receive` is async, a Notify or a channel works. The
// `transfers` map holds data, not channels, so the per-transfer signalling
// channels live in a separate `pending_accepts` map.

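The design note above (per-transfer signalling channels kept apart from the transfer data map) can be sketched with the standard library alone. This is a minimal sketch, not the crate's code: the real implementation uses `tokio::sync::oneshot` and the project's `FileId` type, while here a plain `u64` and `std::sync::mpsc` stand in.

```rust
use std::collections::HashMap;
use std::sync::mpsc;
use std::sync::Mutex;

// Map of transfer id -> signalling channel, kept apart from the data map.
struct PendingAccepts(Mutex<HashMap<u64, mpsc::Sender<bool>>>);

impl PendingAccepts {
    fn new() -> Self {
        PendingAccepts(Mutex::new(HashMap::new()))
    }

    // Register a waiter; the returned receiver blocks until a decision arrives.
    fn register(&self, id: u64) -> mpsc::Receiver<bool> {
        let (tx, rx) = mpsc::channel();
        self.0.lock().unwrap().insert(id, tx);
        rx
    }

    // Signal acceptance; returns false if nothing was waiting on this id.
    fn accept(&self, id: u64) -> bool {
        match self.0.lock().unwrap().remove(&id) {
            Some(tx) => tx.send(true).is_ok(),
            None => false,
        }
    }
}

// Returns (accept succeeded, decision delivered, second accept is a no-op).
fn demo() -> (bool, bool, bool) {
    let pending = PendingAccepts::new();
    let rx = pending.register(7);
    let first = pending.accept(7);
    let decision = matches!(rx.recv(), Ok(true));
    let second = pending.accept(7); // channel already consumed
    (first, decision, second)
}

fn main() {
    assert_eq!(demo(), (true, true, false));
    println!("ok");
}
```

Removing the sender on accept is what makes a second accept (or a late reject) a harmless no-op, which matches the `pending.remove(&file_id)` pattern in the diff.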
    pub fn accept_transfer(&self, file_id: FileId) -> bool {
        let mut pending = self.pending_accepts.lock().unwrap();
        if let Some(tx) = pending.remove(&file_id) {
            let _ = tx.send(true);
            true
        } else {
            false
    pub fn check_timeouts(&self) {
        let mut transfers = self.transfers.lock().unwrap();
        let now = std::time::Instant::now();
        for info in transfers.values_mut() {
            match info.state {
                TransferState::WaitingForAccept { expires_at }
                | TransferState::Requesting { expires_at } => {
                    if now > expires_at {
                        info.state = TransferState::Failed {
                            error: "Timed out".to_string(),
                            completed_at: now,
                        };
                    }
                }
                _ => {}
            }
        }
    }

    #[allow(dead_code)]
    pub fn reject_transfer(&self, file_id: FileId) -> bool {
        let mut pending = self.pending_accepts.lock().unwrap();
        if let Some(tx) = pending.remove(&file_id) {
            let _ = tx.send(false);
            true
        } else {
            false
        // Remove expired
        let to_remove: Vec<FileId> = transfers
            .iter()
            .filter_map(|(id, info)| match info.state {
                TransferState::Complete { completed_at }
                | TransferState::Rejected { completed_at }
                | TransferState::Failed { completed_at, .. } => {
                    if now.duration_since(completed_at) > std::time::Duration::from_secs(10) {
                        Some(*id)
                    } else {
                        None
                    }
                }
                _ => None,
            })
            .collect();

        for id in to_remove {
            transfers.remove(&id);
        }
    }

    // `execute_receive` already bounds its own wait on the pending_accepts
    // channel, so the receiver side times out on its own. The sender side is
    // different: `execute_send` sends the offer and then blocks in
    // `decode_framed(recv)` waiting for Accept/Reject, so if the recipient
    // never replies the sender stays stuck there. `check_timeouts` only
    // updates the state in the map; it cannot interrupt the async
    // `execute_send`, which therefore needs a timeout around `decode_framed`
    // itself.
}
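The note above about `execute_send` being stuck in `decode_framed` calls for bounding the sender's wait. In the async code that would mean wrapping the `decode_framed(recv)` await in `tokio::time::timeout`; a stdlib analogue of the same shape, using `recv_timeout` as the bounded wait (the names here are illustrative, not the crate's API):

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Stand-in for waiting on the peer's Accept/Reject frame.
fn wait_for_reply(rx: &mpsc::Receiver<bool>, limit: Duration) -> Result<bool, String> {
    // The async version would be roughly:
    //   tokio::time::timeout(limit, decode_framed(recv)).await
    // so a silent peer can no longer wedge `execute_send` forever.
    rx.recv_timeout(limit).map_err(|_| "Timed out".to_string())
}

fn main() {
    // Peer replies in time: the decision comes through.
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        let _ = tx.send(true);
    });
    assert_eq!(wait_for_reply(&rx, Duration::from_secs(1)), Ok(true));

    // Peer never replies: the sender unblocks with an error instead of hanging.
    let (_tx2, rx2): (mpsc::Sender<bool>, _) = mpsc::channel();
    assert!(wait_for_reply(&rx2, Duration::from_millis(10)).is_err());
    println!("ok");
}
```

Either outcome leaves the caller free to mark the transfer `Failed { error: "Timed out", .. }`, which is exactly what `check_timeouts` already records in the state map.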
    /// Execute the sending side of a file transfer over a QUIC bi-stream.
    #[allow(dead_code)]
@@ -187,7 +219,9 @@ impl FileTransferManager {
            FileStreamMessage::Reject(_) => {
                let mut transfers = self.transfers.lock().unwrap();
                if let Some(info) = transfers.get_mut(&file_id) {
                    info.state = TransferState::Rejected;
                    info.state = TransferState::Rejected {
                        completed_at: std::time::Instant::now(),
                    };
                }
                return Ok(());
            }
@@ -237,7 +271,9 @@ impl FileTransferManager {
        {
            let mut transfers = self.transfers.lock().unwrap();
            if let Some(info) = transfers.get_mut(&file_id) {
                info.state = TransferState::Complete;
                info.state = TransferState::Complete {
                    completed_at: std::time::Instant::now(),
                };
            }
        }

@@ -246,9 +282,7 @@ impl FileTransferManager {

    /// Handle an incoming file offer for display.
    #[allow(dead_code)]
    /// Handle an incoming file offer for display.
    #[allow(dead_code)]
    pub fn register_incoming_offer(&self, offer: &FileOffer) {
    pub fn register_incoming_offer(&self, offer: &FileOffer, peer: EndpointId) {
        let mut transfers = self.transfers.lock().unwrap();
        transfers.insert(
            offer.file_id,
@@ -258,6 +292,35 @@ impl FileTransferManager {
                file_size: offer.size,
                state: TransferState::Offering,
                is_outgoing: false,
                peer: Some(peer),
                path: None,
                offer: None,
            },
        );
    }

    /// Handle an incoming file offer broadcast for display.
    pub fn register_incoming_broadcast(
        &self,
        offer: &crate::protocol::FileOfferBroadcast,
        peer: EndpointId,
    ) {
        let mut transfers = self.transfers.lock().unwrap();
        // Calculate expires_at based on the offer's timeout
        let expires_at = std::time::Instant::now() + std::time::Duration::from_secs(offer.timeout);

        transfers.insert(
            offer.file_id,
            TransferInfo {
                file_id: offer.file_id,
                file_name: offer.file_name.clone(),
                file_size: offer.file_size,
                // Go directly to WaitingForAccept because this is an offer we can accept
                state: TransferState::WaitingForAccept { expires_at },
                is_outgoing: false,
                peer: Some(peer),
                path: None,
                offer: None,
            },
        );
    }
@@ -277,11 +340,21 @@ impl FileTransferManager {
        };

        let file_id = offer.file_id;

        let timeout_duration = std::time::Duration::from_secs(offer.timeout);
        let expires_at = std::time::Instant::now() + timeout_duration;

        // Register incoming offer
        // Check if pre-accepted (Requesting state)
        let is_pre_accepted = {
            let transfers = self.transfers.lock().unwrap();
            if let Some(info) = transfers.get(&file_id) {
                matches!(info.state, TransferState::Requesting { .. })
            } else {
                false
            }
        };

        // Register incoming offer (unless pre-accepted logic differs)
        {
            let mut transfers = self.transfers.lock().unwrap();
            transfers.insert(
@@ -292,28 +365,35 @@ impl FileTransferManager {
                    file_size: offer.size,
                    state: TransferState::WaitingForAccept { expires_at },
                    is_outgoing: false,
                    peer: None, // We don't strictly need the peer for receive, as we are already connected
                    path: None,
                    offer: Some(offer.clone()), // Store the incoming offer if useful
                },
            );
        }

        // Wait for acceptance
        let (tx, rx) = tokio::sync::oneshot::channel();
        {
            let mut pending = self.pending_accepts.lock().unwrap();
            pending.insert(file_id, tx);
        }
        let accepted = if is_pre_accepted {
            true
        } else {
            // Wait for acceptance
            let (tx, rx) = tokio::sync::oneshot::channel();
            {
                let mut pending = self.pending_accepts.lock().unwrap();
                pending.insert(file_id, tx);
            }

        // Wait for signal or timeout
        let accepted = match tokio::time::timeout(timeout_duration, rx).await {
            Ok(Ok(decision)) => decision,
            Ok(Err(_)) => false, // Sender dropped?
            Err(_) => {
                // Timeout
                false
            // Wait for signal or timeout
            match tokio::time::timeout(timeout_duration, rx).await {
                Ok(Ok(decision)) => decision,
                Ok(Err(_)) => false, // Sender dropped?
                Err(_) => {
                    // Timeout
                    false
                }
            }
        };

        // Remove from pending if still there (in case of timeout)
        // Remove from pending if still there (in case of timeout or pre-accept bypass)
        {
            let mut pending = self.pending_accepts.lock().unwrap();
            pending.remove(&file_id);
@@ -325,8 +405,10 @@ impl FileTransferManager {
            let mut transfers = self.transfers.lock().unwrap();
            // Check that it wasn't already updated (e.g. by a manual reject)
            if let Some(info) = transfers.get_mut(&file_id) {
                // Could be Offering or WaitingForAccept depending on the race
                info.state = TransferState::Rejected;
                // Could be Offering or WaitingForAccept depending on the race
                info.state = TransferState::Rejected {
                    completed_at: std::time::Instant::now(),
                };
            }
        }

@@ -335,7 +417,7 @@ impl FileTransferManager {
            &FileStreamMessage::Reject(FileAcceptReject { file_id }),
        )
        .await?;

        return Ok(file_id);
    }

@@ -346,9 +428,21 @@ impl FileTransferManager {
        )
        .await?;

        // Update state to Transferring immediately to stop the spinner
        {
            let mut transfers = self.transfers.lock().unwrap();
            if let Some(info) = transfers.get_mut(&file_id) {
                info.state = TransferState::Transferring {
                    bytes_transferred: 0,
                    total_size: offer.size,
                    start_time: std::time::Instant::now(),
                };
            }
        }

        // Receive chunks
        let dest_path = self.download_dir.join(&offer.name);

        // Ensure the download dir exists for safety (though main normally does this)
        if let Some(parent) = dest_path.parent() {
            let _ = tokio::fs::create_dir_all(parent).await;
@@ -396,7 +490,9 @@ impl FileTransferManager {
        {
            let mut transfers = self.transfers.lock().unwrap();
            if let Some(info) = transfers.get_mut(&file_id) {
                info.state = TransferState::Complete;
                info.state = TransferState::Complete {
                    completed_at: std::time::Instant::now(),
                };
            }
        }

@@ -415,7 +511,10 @@ impl FileTransferManager {
        let direction = if info.is_outgoing { "↑" } else { "↓" };
        match &info.state {
            TransferState::Offering => {
                format!("{} {} (offering...)", direction, info.file_name)
                format!("{} {} (Offering)", direction, info.file_name)
            }
            TransferState::Requesting { .. } => {
                format!("{} {} (Requesting...)", direction, info.file_name)
            }
            TransferState::WaitingForAccept { expires_at } => {
                let now = std::time::Instant::now();
@@ -424,7 +523,17 @@ impl FileTransferManager {
                } else {
                    0
                };
                format!("{} {} (WAITING ACCEPT: {}s)", direction, info.file_name, remaining)
                if info.is_outgoing {
                    format!(
                        "{} {} (Wait Accept - {}s)",
                        direction, info.file_name, remaining
                    )
                } else {
                    format!(
                        "{} {} (Incoming Offer - {}s)",
                        direction, info.file_name, remaining
                    )
                }
            }
            TransferState::Transferring {
                bytes_transferred,
@@ -443,7 +552,7 @@ impl FileTransferManager {
                    0.0
                };
                let speed_mbps = speed_bps / (1024.0 * 1024.0);

                format!(
                    "{} {} {}% ({}/{}) {:.1} MB/s",
                    direction,
@@ -454,21 +563,21 @@ impl FileTransferManager {
                    speed_mbps
                )
            }
            TransferState::Complete => {
            TransferState::Complete { .. } => {
                format!("{} {} ✓ complete", direction, info.file_name)
            }
            TransferState::Rejected => {
            TransferState::Rejected { .. } => {
                format!("{} {} ✗ rejected", direction, info.file_name)
            }
            TransferState::Failed(e) => {
                format!("{} {} ✗ {}", direction, info.file_name, e)
            TransferState::Failed { error, .. } => {
                format!("{} {} ✗ {}", direction, info.file_name, error)
            }
        }
    }
}

#[allow(dead_code)]
fn format_bytes(bytes: u64) -> String {
#[allow(dead_code)]
pub fn format_bytes(bytes: u64) -> String {
    if bytes < 1024 {
        format!("{}B", bytes)
    } else if bytes < 1024 * 1024 {
1163 src/gui.rs Normal file
File diff suppressed because it is too large
@@ -1,5 +1,6 @@
//! P2P Chat — library crate exposing modules for integration tests.

pub mod app_logic;
pub mod chat;
pub mod config;
pub mod file_transfer;
@@ -8,4 +9,3 @@ pub mod net;
pub mod protocol;
pub mod tui;
pub mod web;
pub mod app_logic;
282 src/main.rs
@@ -4,15 +4,16 @@
//! Chat is the primary feature, file transfer is first-class,
//! voice/camera/screen are optional, powered by PipeWire.

mod app_logic;
mod chat;
mod config;
mod file_transfer;
mod gui;
mod media;
mod net;
mod protocol;
mod tui;
mod web;
mod app_logic;

use std::io;
use std::path::PathBuf;
@@ -20,10 +21,10 @@ use std::path::PathBuf;
use anyhow::{Context, Result};
use clap::Parser;
use crossterm::event::{Event, EventStream};
use crossterm::execute;
use crossterm::terminal::{
    disable_raw_mode, enable_raw_mode, EnterAlternateScreen, LeaveAlternateScreen,
};
use crossterm::execute;
use iroh::EndpointId;
use n0_future::StreamExt;
use ratatui::backend::CrosstermBackend;
@@ -57,34 +58,25 @@ struct Cli {
    /// Download directory for received files
    #[arg(short, long, default_value = "~/Downloads")]
    download_dir: String,
    /// Screen resolution for sharing (e.g., 1280x720, 1920x1080)

    /// Launch with GUI (Iced) instead of TUI
    #[arg(long)]
    screen_resolution: Option<String>, // Changed to Option to fall back to config
    gui: bool,
}

#[tokio::main]
async fn main() -> Result<()> {
    // ... tracing init ...
    // Initialize tracing to file (not stdout, since we use the TUI)
    let _tracing_guard = tracing_subscriber::fmt()
        .with_env_filter(
            tracing_subscriber::EnvFilter::from_default_env()
                .add_directive("p2p_chat=debug".parse()?)
                .add_directive("iroh=warn".parse()?)
                .add_directive("iroh_gossip=warn".parse()?),
        )
        .with_writer(|| -> Box<dyn io::Write + Send> {
            match std::fs::OpenOptions::new()
                .create(true)
                .append(true)
                .open("p2p-chat.log")
            {
                Ok(f) => Box::new(f),
                Err(_) => Box::new(io::sink()),
            }
        })
        .with_ansi(false)
        .init();
// Remove #[tokio::main] and use a manual runtime builder
fn main() -> Result<()> {
    // Set up the runtime
    let rt = tokio::runtime::Builder::new_multi_thread()
        .enable_all()
        .build()?;

    // Block on the async main logic
    rt.block_on(async_main())
}

async fn async_main() -> Result<()> {
    let cli = Cli::parse();

    // Load config
    let config = AppConfig::load().unwrap_or_else(|e| {
@@ -92,26 +84,83 @@ async fn main() -> Result<()> {
        AppConfig::default()
    });

    let cli = Cli::parse();
    // Initialize tracing
    if cli.gui {
        // GUI mode: log to stdout (INFO/DEBUG) and stderr (WARN/ERROR)
        use tracing_subscriber::fmt::writer::MakeWriterExt;

    // Resolution: CLI > Config > Default
    let res_str = cli.screen_resolution.as_deref().unwrap_or(&config.media.screen_resolution);
    let screen_res = parse_resolution(res_str).unwrap_or((1280, 720));
        let stdout = std::io::stdout.with_max_level(tracing::Level::INFO);
        let stderr = std::io::stderr.with_min_level(tracing::Level::WARN);

        // tracing_subscriber doesn't easily support split writers in one layer
        // without customization, but the MakeWriter implementation pattern works.

        struct SplitWriter;
        impl<'a> tracing_subscriber::fmt::MakeWriter<'a> for SplitWriter {
            type Writer = Box<dyn io::Write + Send + 'a>;

            fn make_writer(&'a self) -> Self::Writer {
                Box::new(io::stdout())
            }

            fn make_writer_for(&'a self, meta: &tracing::Metadata<'_>) -> Self::Writer {
                if *meta.level() <= tracing::Level::WARN {
                    Box::new(io::stderr())
                } else {
                    Box::new(io::stdout())
                }
            }
        }

        tracing_subscriber::fmt()
            .with_env_filter(
                tracing_subscriber::EnvFilter::from_default_env()
                    .add_directive("p2p_chat=debug".parse()?)
                    .add_directive("iroh=warn".parse()?)
                    .add_directive("iroh_gossip=warn".parse()?),
            )
            .with_writer(SplitWriter)
            .with_ansi(true)
            .init();
    } else {
        // TUI mode: log to file to avoid breaking the UI
        tracing_subscriber::fmt()
            .with_env_filter(
                tracing_subscriber::EnvFilter::from_default_env()
                    .add_directive("p2p_chat=debug".parse()?)
                    .add_directive("iroh=warn".parse()?)
                    .add_directive("iroh_gossip=warn".parse()?),
            )
            .with_writer(|| -> Box<dyn io::Write + Send> {
                match std::fs::OpenOptions::new()
                    .create(true)
                    .append(true)
                    .open("p2p-chat.log")
                {
                    Ok(f) => Box::new(f),
                    Err(_) => Box::new(io::sink()),
                }
            })
            .with_ansi(false)
            .init();
    }

    // Topic: CLI > Config > Default
    let topic_str = cli.topic.as_deref().or(config.network.topic.as_deref()).unwrap_or("00000000000000000000000000000000");
    let topic_str = cli
        .topic
        .as_deref()
        .or(config.network.topic.as_deref())
        .unwrap_or("00000000000000000000000000000000");
    let topic_bytes = parse_topic(topic_str)?;

    // ... networking init ...
    // Initialize networking
    let (mut net_mgr, _net_tx, mut net_rx) =
        NetworkManager::new(topic_bytes).await.context("Failed to start networking")?;
    let (mut net_mgr, _net_tx, net_event_rx) = NetworkManager::new(topic_bytes)
        .await
        .context("Failed to start networking")?;

    let our_id = net_mgr.our_id;
    let our_id_short = format!("{}", our_id)
        .chars()
        .take(8)
        .collect::<String>();
    let our_id_short = format!("{}", our_id).chars().take(8).collect::<String>();

    // Initialize application state
    let mut chat = ChatState::new(cli.name.clone());
@@ -119,35 +168,40 @@ async fn main() -> Result<()> {
    // Resolve download directory, handling ~
    let download_path = if cli.download_dir.starts_with("~/") {
        if let Ok(home) = std::env::var("HOME") {
            PathBuf::from(home).join(&cli.download_dir[2..])
            PathBuf::from(home).join(&cli.download_dir[2..])
        } else {
            PathBuf::from(&cli.download_dir)
            PathBuf::from(&cli.download_dir)
        }
    } else {
        PathBuf::from(&cli.download_dir)
    };
    // Ensure the download directory exists
    tokio::fs::create_dir_all(&download_path).await?;

    let file_mgr = FileTransferManager::new(download_path);
    // Pass mic name from config if present
    // Pass mic name from config if present
    let (media_event_tx, media_event_rx) = tokio::sync::mpsc::channel(10);

    let media = MediaState::new(
        screen_res,
        config.media.mic_name.clone(),
        config.media.speaker_name.clone(),
        config.media.mic_bitrate,
        config.media.input_device.clone(),
        config.media.output_device.clone(),
        config.media.master_volume,
        config.media.mic_volume,
        config.media.noise_suppression,
        Some(media_event_tx),
    );

    // Initialize App with Theme
    let theme = crate::config::Theme::from(config.ui.clone());
    let mut app = App::new(theme);

    // If a peer was specified, add it and bootstrap
    let bootstrap_peers = if let Some(ref join_id) = cli.join {
        let peer_id: EndpointId = join_id.parse().context(
            "Invalid peer ID. Expected a hex-encoded endpoint ID.",
        )?;
        let peer_id: EndpointId = join_id
            .parse()
            .context("Invalid peer ID. Expected a hex-encoded endpoint ID.")?;
        vec![peer_id]
    } else {
        vec![]
@@ -173,12 +227,17 @@ async fn main() -> Result<()> {
    // Insert ourselves into the peer list
    {
        let mut peers = net_mgr.peers.lock().await;
        peers.insert(our_id, PeerInfo {
            id: our_id,
            name: Some(cli.name.clone()),
            capabilities: None,
            is_self: true,
        });
        peers.insert(
            our_id,
            PeerInfo {
                id: our_id,
                name: Some(cli.name.clone()),
                capabilities: None,
                is_self: true,
                audio_level: 0.0,
                is_streaming_video: false,
            },
        );
    }

    // Broadcast capabilities
@@ -198,13 +257,10 @@ async fn main() -> Result<()> {
        );
    }

    // Start Web Interface
    // Start Web Interface
    tokio::spawn(crate::web::start_web_server(
        media.broadcast_tx.clone(),
        media.mic_broadcast.clone(),
        media.cam_broadcast.clone(),
        media.screen_broadcast.clone(),
    ));

@@ -216,8 +272,89 @@ async fn main() -> Result<()> {
        net_mgr,
        cli.name.clone(),
        our_id_short,
        media_event_rx,
        net_event_rx,
    );

    if cli.gui {
        use tokio::sync::mpsc;

        // Channel for GUI -> AppLogic commands
        let (gui_cmd_tx, mut gui_cmd_rx) = mpsc::channel(100);
        // Channel for AppLogic -> GUI state updates
        let (gui_state_tx, gui_state_rx) = mpsc::channel(100);

        // Subscribe to video frames
        let video_rx = app_logic.media.video_frame_tx.subscribe();

        // Get the initial state
        let initial_state = app_logic.get_frontend_state().await;

        // Spawn the AppLogic loop
        let loop_handle = tokio::spawn(async move {
            loop {
                tokio::select! {
                    Some(event) = app_logic.net_event_rx.recv() => {
                        app_logic.handle_net_event(event).await;
                        // Send state update to GUI
                        let state = app_logic.get_frontend_state().await;
                        let _ = gui_state_tx.send(state).await;
                    }
                    cmd_opt = gui_cmd_rx.recv() => {
                        match cmd_opt {
                            Some(cmd) => {
                                // Handle command from GUI
                                if let Ok(should_quit) = app_logic.handle_command(cmd).await {
                                    // Update state even if quitting
                                    let state = app_logic.get_frontend_state().await;
                                    let _ = gui_state_tx.send(state).await;

                                    if should_quit {
                                        break;
                                    }
                                }
                            }
                            None => {
                                // GUI channel closed, exit loop
                                break;
                            }
                        }
                    }
                    Some(_media_evt) = app_logic.media_event_rx.recv() => {
                        // Media state changed (video started/stopped), update GUI
                        let state = app_logic.get_frontend_state().await;
                        let _ = gui_state_tx.send(state).await;
                    }
                }
            }

            // Shutdown logic
            // Offload the entire app_logic shutdown to a blocking thread to avoid async drop panics
            let _ = tokio::task::spawn_blocking(move || {
                // Shut down media explicitly (stops threads)
                app_logic.media.shutdown();
                // app_logic is dropped here, on a blocking thread.
                // This includes NetworkManager, ChatState, etc.
            })
            .await;
        });

        // Run GUI
        let flags = crate::gui::Flags {
            initial_state,
            command_sender: gui_cmd_tx,
            state_receiver: gui_state_rx,
            video_receiver: video_rx,
        };

        crate::gui::run(flags)?;

        // Wait for the background task to finish cleanup
        let _ = loop_handle.await;

        return Ok(());
    }

    // Set up terminal
    enable_raw_mode()?;
    let mut stdout = io::stdout();
@@ -230,7 +367,6 @@ async fn main() -> Result<()> {
        &mut terminal,
        &mut app,
        &mut app_logic,
        &mut net_rx,
        &mut gossip_event_rx,
    )
    .await;
@@ -250,10 +386,10 @@ async fn run_event_loop(
    terminal: &mut Terminal<CrosstermBackend<io::Stdout>>,
    app: &mut App,
    logic: &mut crate::app_logic::AppLogic,
    net_rx: &mut mpsc::Receiver<NetEvent>,
    gossip_rx: &mut mpsc::Receiver<NetEvent>,
) -> Result<()> {
    let mut event_stream = EventStream::new();
    let mut interval = tokio::time::interval(std::time::Duration::from_millis(100));

    loop {
        // Collect peers for rendering — self is always first
@@ -281,12 +417,17 @@ async fn run_event_loop(

        // Wait for events
        tokio::select! {
            // Tick for animation handling
            _ = interval.tick() => {
                logic.file_mgr.check_timeouts();
            }

            // Terminal/keyboard events
            maybe_event = event_stream.next() => {
                match maybe_event {
                    Some(Ok(Event::Key(key))) => {
                        let cmd = app.handle_key(key);
                        if logic.handle_tui_command(cmd).await? {
                        if logic.handle_command(cmd).await? {
                            return Ok(());
                        }
                    }
@@ -304,7 +445,7 @@ async fn run_event_loop(
            }

            // Network events from file transfer acceptor
            Some(event) = net_rx.recv() => {
            Some(event) = logic.net_event_rx.recv() => {
                logic.handle_net_event(event).await;
            }

@@ -328,15 +469,13 @@ async fn run_event_loop(
        }
    }

fn parse_topic(hex_str: &str) -> Result<[u8; 32]> {
    // If it's all zeros (default), generate a deterministic topic
    let hex_str = hex_str.trim();

    if hex_str.len() != 32 && hex_str.len() != 64 {
        // Treat as a room name — hash it to get 32 bytes
        use sha2::{Sha256, Digest};
        use sha2::{Digest, Sha256};
        let mut hasher = Sha256::new();
        hasher.update(hex_str.as_bytes());
        let result = hasher.finalize();
@@ -355,7 +494,7 @@ fn parse_topic(hex_str: &str) -> Result<[u8; 32]> {
        Ok(bytes)
    } else {
        // 32-char string — treat as room name
        use sha2::{Sha256, Digest};
        use sha2::{Digest, Sha256};
        let mut hasher = Sha256::new();
        hasher.update(hex_str.as_bytes());
        let result = hasher.finalize();
@@ -364,14 +503,3 @@ fn parse_topic(hex_str: &str) -> Result<[u8; 32]> {
        Ok(bytes)
    }
}
fn parse_resolution(res: &str) -> Option<(u32, u32)> {
|
||||
let parts: Vec<&str> = res.split('x').collect();
|
||||
if parts.len() == 2 {
|
||||
let w = parts[0].parse().ok()?;
|
||||
let h = parts[1].parse().ok()?;
|
||||
Some((w, h))
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
|
||||
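The `parse_resolution` helper in the diff above is self-contained and easy to exercise on its own. A minimal std-only sketch of the same logic (using `split_once` in place of the `split`/`collect` pair, which behaves identically for this input shape):

```rust
// Parse "WIDTHxHEIGHT" (e.g. "1280x720") into a (u32, u32) pair.
// Mirrors the parse_resolution helper from main.rs; returns None on
// malformed input (missing 'x', non-numeric parts, empty halves).
fn parse_resolution(res: &str) -> Option<(u32, u32)> {
    let (w, h) = res.split_once('x')?;
    Some((w.parse().ok()?, h.parse().ok()?))
}

fn main() {
    assert_eq!(parse_resolution("1280x720"), Some((1280, 720)));
    assert_eq!(parse_resolution("640x480"), Some((640, 480)));
    assert_eq!(parse_resolution("bogus"), None);
    assert_eq!(parse_resolution("1280x"), None);
    println!("ok");
}
```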
@@ -1,157 +0,0 @@
//! Video capture and playback using FFmpeg and MPV.
//!
//! Captures video by spawning an `ffmpeg` process and reading its stdout.
//! Plays video by spawning `mpv` (or `vlc`) and writing to its stdin.

use anyhow::Result;
use tracing;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;

use crate::protocol::{decode_framed, write_framed, MediaKind, MediaStreamMessage};
use crate::media::WebMediaEvent;

/// Manages a video capture session (camera or screen).
pub struct VideoCapture {
    running: Arc<AtomicBool>,
    tasks: Vec<tokio::task::JoinHandle<()>>,
}

impl VideoCapture {
    /// Start video capture with web input (broadcast receiver).
    pub async fn start_web(
        kind: MediaKind,
        _local_peer_id: iroh::EndpointId,
        peers: Vec<iroh::EndpointId>,
        network_manager: crate::net::NetworkManager,
        input_rx: tokio::sync::broadcast::Sender<Vec<u8>>,
    ) -> Result<Self> {
        let running = Arc::new(AtomicBool::new(true));

        // Spawn sender tasks
        let mut tasks = Vec::new();
        for peer in peers {
            let running = running.clone();
            let net = network_manager.clone();
            let rx = input_rx.subscribe();
            let kind = kind.clone();

            let task = tokio::spawn(async move {
                if let Err(e) = run_video_sender_web(net, peer, kind, rx, running).await {
                    tracing::error!("Video sender web error: {}", e);
                }
            });
            tasks.push(task);
        }

        Ok(Self {
            running,
            tasks,
        })
    }

    /// Stop capture.
    pub fn stop(&mut self) {
        self.running.store(false, Ordering::Relaxed);
        for task in self.tasks.drain(..) {
            task.abort();
        }
    }

    /// Handle incoming video stream from a peer (Web Version).
    /// Receives video frames (e.g. H.264/VP9 encoded inside protocol messages) and forwards to frontend.
    pub async fn handle_incoming_video_web(
        from: iroh::EndpointId,
        message: MediaStreamMessage,
        mut recv: iroh::endpoint::RecvStream,
        broadcast_tx: tokio::sync::broadcast::Sender<WebMediaEvent>,
    ) -> Result<()> {
        let kind = match message {
            MediaStreamMessage::VideoStart { kind, .. } => kind,
            _ => anyhow::bail!("Expected VideoStart"),
        };
        tracing::info!("Starting {:?} stream handler for {}", kind, from);

        loop {
            let msg: MediaStreamMessage = match decode_framed(&mut recv).await {
                Ok(m) => m,
                Err(_) => break, // EOF
            };

            match msg {
                MediaStreamMessage::VideoFrame { data, .. } => {
                    // Broadcast to web
                    let short_id: String = format!("{}", from).chars().take(8).collect();
                    let _ = broadcast_tx.send(WebMediaEvent::Video { peer_id: short_id, kind: kind.clone(), data });
                }
                MediaStreamMessage::VideoStop { .. } => {
                    tracing::info!("Peer stopped video");
                    break;
                }
                _ => {}
            }
        }
        Ok(())
    }
}

impl Drop for VideoCapture {
    fn drop(&mut self) {
        self.stop();
    }
}

// ---------------------------------------------------------------------------
// FFmpeg Capture Logic
// ---------------------------------------------------------------------------

async fn run_video_sender_web(
    network_manager: crate::net::NetworkManager,
    peer: iroh::EndpointId,
    kind: MediaKind,
    mut input_rx: tokio::sync::broadcast::Receiver<Vec<u8>>,
    running: Arc<AtomicBool>,
) -> Result<()> {
    let (mut send, _) = network_manager.open_media_stream(peer, kind.clone()).await?;
    // For web, we assume fixed resolution and fps for now.
    write_framed(&mut send, &MediaStreamMessage::VideoStart { kind, width: 640, height: 480, fps: 30 }).await?;

    while running.load(Ordering::Relaxed) {
        match input_rx.recv().await {
            Ok(data) => {
                // Web sends MJPEG chunk (full frame)
                let msg = MediaStreamMessage::VideoFrame {
                    sequence: 0, // Sequence not used for web input, set to 0
                    timestamp_ms: std::time::SystemTime::now()
                        .duration_since(std::time::UNIX_EPOCH)
                        .unwrap_or_default()
                        .as_millis() as u64,
                    data,
                };
                if write_framed(&mut send, &msg).await.is_err() {
                    break;
                }
            }
            Err(tokio::sync::broadcast::error::RecvError::Closed) => break,
            Err(tokio::sync::broadcast::error::RecvError::Lagged(_)) => {
                tracing::warn!("Video sender lagged");
            }
        }
    }

    let _ = write_framed(&mut send, &MediaStreamMessage::VideoStop { kind }).await;
    send.finish()?;
    Ok(())
}

// ---------------------------------------------------------------------------
// Player Logic (MPV/VLC)
// ---------------------------------------------------------------------------
135
src/media/capture/gstreamer.rs
Normal file
@@ -0,0 +1,135 @@
use anyhow::Result;
use gstreamer::prelude::*;
use gstreamer::{Pipeline, State};
use gstreamer_app::{AppSink, AppSinkCallbacks};
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;

pub struct GStreamerCapture {
    pipeline: Pipeline,
    sink: AppSink,
}

impl GStreamerCapture {
    pub fn new(pipeline_str: &str) -> Result<Self> {
        // Initialize GStreamer.
        // We only need to init once, but calling it multiple times is safe:
        // gstreamer::init() checks internally.
        gstreamer::init()?;

        // Create pipeline from string
        let pipeline = gstreamer::parse::launch(pipeline_str)?
            .downcast::<Pipeline>()
            .map_err(|_| anyhow::anyhow!("Expected a pipeline"))?;

        // Get the appsink
        let sink = pipeline
            .by_name("sink")
            .ok_or_else(|| anyhow::anyhow!("Pipeline must have an appsink named 'sink'"))?
            .downcast::<AppSink>()
            .map_err(|_| anyhow::anyhow!("'sink' element is not an appsink"))?;

        Ok(Self { pipeline, sink })
    }

    pub fn set_callback<F>(&self, on_sample: F)
    where
        F: Fn(&[u8]) + Send + Sync + 'static,
    {
        let count = Arc::new(AtomicUsize::new(0));
        let callbacks = AppSinkCallbacks::builder()
            .new_sample(move |sink| {
                let sample = sink.pull_sample().map_err(|_| gstreamer::FlowError::Eos)?;
                let buffer = sample.buffer().ok_or(gstreamer::FlowError::Error)?;
                let map = buffer
                    .map_readable()
                    .map_err(|_| gstreamer::FlowError::Error)?;

                let c = count.fetch_add(1, Ordering::Relaxed);
                if c % 10 == 0 {
                    tracing::info!("GStreamer captured frame #{}", c);
                }

                on_sample(map.as_slice());
                Ok(gstreamer::FlowSuccess::Ok)
            })
            .build();

        self.sink.set_callbacks(callbacks);
    }

    pub fn start(&self) -> Result<()> {
        self.pipeline.set_state(State::Playing)?;

        // Spawn a bus watcher
        let bus = self.pipeline.bus().expect("Pipeline has no bus");
        std::thread::spawn(move || {
            for msg in bus.iter_timed(gstreamer::ClockTime::NONE) {
                use gstreamer::MessageView;
                match msg.view() {
                    MessageView::Error(err) => {
                        tracing::error!("GStreamer Error: {} ({:?})", err.error(), err.debug());
                        break;
                    }
                    MessageView::Warning(warn) => {
                        tracing::warn!("GStreamer Warning: {} ({:?})", warn.error(), warn.debug());
                    }
                    MessageView::Eos(..) => {
                        tracing::info!("GStreamer EndOfStream");
                        break;
                    }
                    _ => (),
                }
            }
        });

        Ok(())
    }

    pub fn stop(&self) -> Result<()> {
        let _ = self.pipeline.set_state(State::Null);
        Ok(())
    }
}

impl Drop for GStreamerCapture {
    fn drop(&mut self) {
        let _ = self.stop();
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::sync::{Arc, Mutex};
    use std::time::Duration;

    #[test]
    fn test_appsink_pattern() {
        // Simple videotestsrc pipeline for testing:
        // videotestsrc ! videoconvert ! video/x-raw,format=I420 ! appsink name=sink
        let pipeline_str = "videotestsrc num-buffers=10 ! videoconvert ! video/x-raw,format=I420 ! appsink name=sink";

        let capture = GStreamerCapture::new(pipeline_str).expect("Failed to create capture");

        let frame_count = Arc::new(Mutex::new(0));
        let frame_count_clone = frame_count.clone();

        capture.set_callback(move |data| {
            let mut count = frame_count_clone.lock().unwrap();
            *count += 1;
            println!("Received frame of size: {}", data.len());
        });

        capture.start().expect("Failed to start");

        // Wait for frames
        std::thread::sleep(Duration::from_secs(2));

        capture.stop().expect("Failed to stop");

        let count = *frame_count.lock().unwrap();
        assert!(count > 0, "Should have received at least one frame");
        println!("Total frames received: {}", count);
    }
}
1014
src/media/capture/mod.rs
Normal file
File diff suppressed because it is too large
43
src/media/capture/portal.rs
Normal file
@@ -0,0 +1,43 @@
use anyhow::Result;
use ashpd::desktop::{
    screencast::{CursorMode, Screencast, SourceType},
    PersistMode,
};

pub struct AshpdKeeper {
    pub _proxy: Screencast<'static>,
    pub _session: ashpd::desktop::Session<'static, Screencast<'static>>,
}

pub async fn select_wayland_source() -> Result<(u32, AshpdKeeper)> {
    let proxy = Screencast::new().await?;
    let session = proxy.create_session().await?;

    proxy
        .select_sources(
            &session,
            CursorMode::Embedded,
            SourceType::Monitor | SourceType::Window,
            false,
            None,
            PersistMode::DoNot,
        )
        .await?;

    let request = proxy
        .start(&session, &ashpd::WindowIdentifier::default())
        .await?;
    let streams = request.response()?;

    if let Some(stream) = streams.streams().iter().next() {
        Ok((
            stream.pipe_wire_node_id(),
            AshpdKeeper {
                _proxy: proxy,
                _session: session,
            },
        ))
    } else {
        Err(anyhow::anyhow!("No pipewire stream returned from portal"))
    }
}
427
src/media/mod.rs
@@ -7,15 +7,18 @@
//! Each feature is runtime-toggleable and runs on dedicated threads/tasks.

pub mod capture;
pub mod playback;
pub mod voice;

use iroh::EndpointId;
use tracing;

use std::sync::Arc;
use std::sync::atomic::{AtomicU32, Ordering};
use crate::net::NetworkManager;
use crate::protocol::{MediaKind, MediaStreamMessage, decode_framed};
use crate::protocol::{decode_framed, MediaKind, MediaStreamMessage};
use std::sync::atomic::{AtomicU32, Ordering};
use std::sync::Arc;

use cpal::traits::{DeviceTrait, HostTrait};

use self::capture::VideoCapture;
use self::voice::VoiceChat;
@@ -23,60 +26,78 @@ use self::voice::VoiceChat;
/// Event sent to the web interface for playback.
#[derive(Debug, Clone)]
pub enum WebMediaEvent {
    Audio { peer_id: String, data: Vec<f32> },
    Video { peer_id: String, kind: MediaKind, data: Vec<u8> },
    Audio {
        peer_id: String,
        data: Vec<f32>,
    },
    Video {
        peer_id: String,
        kind: MediaKind,
        data: Vec<u8>,
    },
}

pub enum InternalMediaEvent {
    VideoStart(EndpointId),
    VideoStop(EndpointId),
}

/// Tracks all active media sessions.
pub struct MediaState {
    /// Active voice chat session (if any).
    voice: Option<VoiceChat>,
    /// Active camera capture (if any).
    camera: Option<VideoCapture>,
    /// Active screen capture (if any).
    screen: Option<VideoCapture>,
    /// Playback task handles for incoming streams (voice/video).
    /// Using a list to allow multiple streams (audio+video) from same or different peers.
    incoming_media: Vec<tokio::task::JoinHandle<()>>,
    /// Whether PipeWire is available on this system.
    pipewire_available: bool,
    /// Configured screen resolution (width, height).
    #[allow(dead_code)]
    screen_resolution: (u32, u32),
    /// Configured microphone name (target).
    mic_name: Option<String>,
    /// Configured speaker name (target).
    speaker_name: Option<String>,
    /// Broadcast channel for web playback.
    pub broadcast_tx: tokio::sync::broadcast::Sender<WebMediaEvent>,
    // Input channels (from Web -> MediaState -> Peers)
    pub mic_broadcast: tokio::sync::broadcast::Sender<Vec<f32>>,
    pub cam_broadcast: tokio::sync::broadcast::Sender<Vec<u8>>,
    pub screen_broadcast: tokio::sync::broadcast::Sender<Vec<u8>>,
    // Channel for integrated video player frames (RGBA, width, height)
    pub video_frame_tx: tokio::sync::broadcast::Sender<(Vec<u8>, u32, u32)>,
    pub mic_bitrate: Arc<AtomicU32>,
    pub input_device: Option<String>,
    pub output_device: Option<String>,
    pub initial_master_volume: f32,
    pub initial_mic_volume: f32,
    pub initial_noise_suppression: bool,
    pub active_video_sessions: Arc<dashmap::DashSet<EndpointId>>,
    pub media_event_tx: Option<tokio::sync::mpsc::Sender<InternalMediaEvent>>,
}

impl MediaState {
    pub fn new(screen_resolution: (u32, u32), mic_name: Option<String>, speaker_name: Option<String>, mic_bitrate: u32) -> Self {
        let pipewire_available = check_pipewire();
    pub fn new(
        mic_bitrate: u32,
        input_device: Option<String>,
        output_device: Option<String>,
        master_volume: f32,
        mic_volume: f32,
        noise_suppression: bool,
        media_event_tx: Option<tokio::sync::mpsc::Sender<InternalMediaEvent>>,
    ) -> Self {
        let (broadcast_tx, _) = tokio::sync::broadcast::channel(100);
        let (mic_broadcast, _) = tokio::sync::broadcast::channel(100);
        let (cam_broadcast, _) = tokio::sync::broadcast::channel(100);
        let (screen_broadcast, _) = tokio::sync::broadcast::channel(100);
        let (video_frame_tx, _) = tokio::sync::broadcast::channel(10); // Low buffer for live video
        Self {
            voice: None,
            camera: None,
            screen: None,
            incoming_media: Vec::new(),
            pipewire_available,
            screen_resolution,
            mic_name,
            speaker_name,
            broadcast_tx,
            mic_broadcast,
            cam_broadcast,
            screen_broadcast,
            video_frame_tx,
            mic_bitrate: Arc::new(AtomicU32::new(mic_bitrate)),
            input_device,
            output_device,
            initial_master_volume: master_volume,
            initial_mic_volume: mic_volume,
            initial_noise_suppression: noise_suppression,
            active_video_sessions: Arc::new(dashmap::DashSet::new()),
            media_event_tx,
        }
    }

@@ -84,46 +105,176 @@ impl MediaState {
        self.mic_bitrate.store(bitrate, Ordering::Relaxed);
    }

    /// Update the selected microphone name.
    pub fn set_mic_name(&mut self, name: Option<String>) {
        self.mic_name = name;
    pub fn set_input_device(&mut self, device_name: String) {
        self.input_device = Some(device_name);
    }

    /// Update the selected speaker name.
    pub fn set_speaker_name(&mut self, name: Option<String>) {
        self.speaker_name = name;
    pub fn set_output_device(&mut self, device_name: String) {
        self.output_device = Some(device_name);
    }

    pub fn get_input_devices(&self) -> Vec<String> {
        let mut names = Vec::new();

        // Prioritize JACK if available, otherwise ALSA/Pulse/WASAPI
        let available_hosts = cpal::available_hosts();
        let mut hosts = Vec::new();

        // Push JACK first if available
        if available_hosts.contains(&cpal::HostId::Jack) {
            hosts.push(cpal::host_from_id(cpal::HostId::Jack).unwrap());
        }

        // Then default host
        hosts.push(cpal::default_host());

        for host in hosts {
            if let Ok(devices) = host.input_devices() {
                for device in devices {
                    if let Ok(name) = device.name() {
                        // Filter out common noise/unusable devices
                        if name.contains("dmix") || name.contains("dsnoop") || name.contains("null")
                        {
                            continue;
                        }

                        // Clean up ALSA names
                        // Example: "sysdefault:CARD=PCH" -> "PCH (sysdefault)"
                        // Example: "front:CARD=Microphone,DEV=0" -> "Microphone (front)"
                        let clean_name = if let Some(start) = name.find("CARD=") {
                            let rest = &name[start + 5..];
                            let card_name = rest.split(',').next().unwrap_or(rest);

                            let prefix = name.split(':').next().unwrap_or("Unknown");
                            format!("{} ({})", card_name, prefix)
                        } else if name.contains("HDA Intel PCH") {
                            // Simplify generic Intel names if possible
                            name
                        } else {
                            name
                        };

                        names.push(clean_name);
                    }
                }
            }
        }

        // Dedup and sort
        names.sort();
        names.dedup();
        names
    }
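The ALSA name cleanup inside `get_input_devices` is a pure string transformation, so it can be pulled out and tested in isolation. A std-only sketch of that same `CARD=` logic (the function name `clean_alsa_name` is illustrative, not from the source):

```rust
// Clean up raw ALSA device names, as in get_input_devices:
//   "sysdefault:CARD=PCH"         -> "PCH (sysdefault)"
//   "front:CARD=Microphone,DEV=0" -> "Microphone (front)"
// Names without a CARD= component pass through unchanged.
fn clean_alsa_name(name: &str) -> String {
    if let Some(start) = name.find("CARD=") {
        let rest = &name[start + 5..];
        let card_name = rest.split(',').next().unwrap_or(rest);
        let prefix = name.split(':').next().unwrap_or("Unknown");
        format!("{} ({})", card_name, prefix)
    } else {
        name.to_string()
    }
}

fn main() {
    assert_eq!(clean_alsa_name("sysdefault:CARD=PCH"), "PCH (sysdefault)");
    assert_eq!(
        clean_alsa_name("front:CARD=Microphone,DEV=0"),
        "Microphone (front)"
    );
    assert_eq!(clean_alsa_name("pulse"), "pulse");
    println!("ok");
}
```

Extracting the transformation this way would also let the dedup/sort step be unit-tested without touching cpal at all.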

    pub fn get_output_devices(&self) -> Vec<String> {
        let mut names = Vec::new();

        // Prioritize JACK if available
        let available_hosts = cpal::available_hosts();
        let mut hosts = Vec::new();

        if available_hosts.contains(&cpal::HostId::Jack) {
            hosts.push(cpal::host_from_id(cpal::HostId::Jack).unwrap());
        }
        hosts.push(cpal::default_host());

        for host in hosts {
            if let Ok(devices) = host.output_devices() {
                for device in devices {
                    if let Ok(name) = device.name() {
                        if name.contains("dmix") || name.contains("dsnoop") || name.contains("null")
                        {
                            continue;
                        }

                        let clean_name = if let Some(start) = name.find("CARD=") {
                            let rest = &name[start + 5..];
                            let card_name = rest.split(',').next().unwrap_or(rest);
                            let prefix = name.split(':').next().unwrap_or("Unknown");
                            format!("{} ({})", card_name, prefix)
                        } else {
                            name
                        };

                        names.push(clean_name);
                    }
                }
            }
        }

        names.sort();
        names.dedup();
        names
    }

    // -----------------------------------------------------------------------
    // Public state queries
    // -----------------------------------------------------------------------

    pub fn set_volume(&self, volume: f32) {
        if let Some(voice) = &self.voice {
            voice.set_volume(volume);
        }
    }

    pub fn set_mic_volume(&self, volume: f32) {
        if let Some(voice) = &self.voice {
            voice.set_input_volume(volume);
        }
    }

    pub fn get_volume(&self) -> f32 {
        if let Some(voice) = &self.voice {
            voice.get_volume()
        } else {
            self.initial_master_volume
        }
    }

    pub fn get_mic_volume(&self) -> f32 {
        // VoiceChat doesn't expose get_input_volume yet, so fall back to the
        // configured initial value.
        self.initial_mic_volume // TODO: fetch from VoiceChat when active (it stores the value in an atomic).
    }

    pub fn is_denoise_enabled(&self) -> bool {
        if let Some(voice) = &self.voice {
            voice.is_denoise_enabled()
        } else {
            self.initial_noise_suppression
        }
    }

    pub fn toggle_denoise(&self) -> Option<bool> {
        if let Some(voice) = &self.voice {
            Some(voice.toggle_denoise())
        } else {
            None
        }
    }

    pub fn get_peer_levels(&self) -> std::collections::HashMap<EndpointId, f32> {
        if let Some(voice) = &self.voice {
            voice.get_peer_levels()
        } else {
            std::collections::HashMap::new()
        }
    }

    pub fn voice_enabled(&self) -> bool {
        self.voice.is_some()
    }

    pub fn camera_enabled(&self) -> bool {
        self.camera.is_some()
    }

    pub fn screen_enabled(&self) -> bool {
        self.screen.is_some()
    }

    #[allow(dead_code)]
    pub fn pipewire_available(&self) -> bool {
        self.pipewire_available
    }

    // -----------------------------------------------------------------------
    // Toggle methods — return a status message for the TUI
    // -----------------------------------------------------------------------

    /// Toggle voice chat. Opens media QUIC streams to all current peers.
    pub async fn toggle_voice(&mut self, net: NetworkManager) -> &'static str {
        if !self.pipewire_available {
            return "Voice chat unavailable (PipeWire not found)";
        }
        if self.voice.is_some() {
            // Stop
            if let Some(mut v) = self.voice.take() {
@@ -132,59 +283,33 @@ impl MediaState {
            "🎤 Voice chat stopped"
        } else {
            // Start — open media streams to all peers
            // For web capture, we don't open streams here. start_web does it.
            let peers = net.peers.lock().await;

            match VoiceChat::start_web(
            // Use Native Capture
            match VoiceChat::start_native(
                net.clone(),
                peers.keys().cloned().collect(),
                self.mic_broadcast.clone(),
                self.mic_broadcast.subscribe(), // Subscribe to obtain a fresh receiver
                self.broadcast_tx.clone(),
                self.mic_bitrate.clone(),
                self.input_device.clone(),
                self.output_device.clone(),
                self.initial_master_volume,
                self.initial_mic_volume,
                self.initial_noise_suppression,
            ) {
                Ok(vc) => {
                    self.voice = Some(vc);
                    "🎤 Voice chat started (Web)"
                    "🎤 Voice chat started (Native)"
                }
                Err(e) => {
                    tracing::error!("Failed to start voice chat: {}", e);
                    tracing::error!("Failed to start native voice chat: {}", e);
                    "🎤 Failed to start voice chat"
                }
            }
        }
    }
    /// Toggle camera capture.
    pub async fn toggle_camera(&mut self, net: NetworkManager) -> &'static str {
        // Capture now goes through ffmpeg, which doesn't depend on the pipewire
        // crate but typically requires the PipeWire daemon or v4l2.
        // The pipewire check is kept for consistency, though it may be imprecise.
        if self.camera.is_some() {
            if let Some(mut c) = self.camera.take() {
                c.stop();
            }
            "📷 Camera stopped"
        } else {
            // Start
            let peers = net.peers.lock().await;
            match VideoCapture::start_web(
                MediaKind::Camera,
                net.our_id,
                peers.keys().cloned().collect(),
                net.clone(),
                self.cam_broadcast.clone(),
            ).await {
                Ok(vc) => {
                    self.camera = Some(vc);
                    "📷 Camera started (Web)"
                }
                Err(e) => {
                    tracing::error!("Failed to start camera: {}", e);
                    "📷 Failed to start camera"
                }
            }
        }
    }

    /// Toggle screen sharing.
    pub async fn toggle_screen(&mut self, net: NetworkManager) -> &'static str {
        if self.screen.is_some() {
@@ -195,21 +320,25 @@ impl MediaState {
        } else {
            // Start
            let peers = net.peers.lock().await;
            match VideoCapture::start_web(

            // Use Native Capture (FFmpeg)
            match VideoCapture::start_native(
                MediaKind::Screen,
                net.our_id,
                peers.keys().cloned().collect(),
                net.clone(),
                self.screen_broadcast.clone(),
            ).await {
                Ok(vc) => {
                    self.screen = Some(vc);
                    "🖥️ Screen share started (Web)"
                }
                Err(e) => {
                    tracing::error!("Failed to start screen share: {}", e);
                    "🖥️ Failed to start screen share"
                }
                self.broadcast_tx.clone(),
            )
            .await
            {
                Ok(vc) => {
                    self.screen = Some(vc);
                    "🖥️ Screen share started (Native FFmpeg)"
                }
                Err(e) => {
                    tracing::error!("Failed to start screen share: {}", e);
                    "🖥️ Failed to start screen share (Install FFmpeg)"
                }
            }
        }
    }
@@ -222,62 +351,95 @@ impl MediaState {
    pub fn handle_incoming_media(
        &mut self,
        from: EndpointId,
        kind: MediaKind,
        _kind: MediaKind,
        _send: iroh::endpoint::SendStream,
        mut recv: iroh::endpoint::RecvStream,
    ) {
        let broadcast_tx = self.broadcast_tx.clone();
        let video_frame_tx = self.video_frame_tx.clone();
        let active_sessions = self.active_video_sessions.clone();
        let media_event_tx = self.media_event_tx.clone();

        // Spawn a task to determine stream type and handle it
        let handle = tokio::spawn(async move {
            // Read first message to determine type.
            // Note: We already know the kind from ALPN, but we still decode the start message.
            match decode_framed::<MediaStreamMessage>(&mut recv).await {
                Ok(msg) => match msg {
                    MediaStreamMessage::AudioStart { .. } => {
                        if kind != MediaKind::Voice {
                            tracing::warn!("ALPN mismatch: expected Voice, got AudioStart");
                        }
                        tracing::info!("Accepted Audio stream from {:?}", from);
                        if let Err(e) = VoiceChat::handle_incoming_audio_web(from, msg, recv, broadcast_tx).await {
                            tracing::error!("Audio web playback error: {}", e);
                        }
                        // DEPRECATED in Native Datagram mode
                        tracing::warn!(
                            "Received Audio stream from {} (unexpected in datagram mode)",
                            from
                        );
                    }
                    MediaStreamMessage::VideoStart { .. } => {
                        tracing::info!("Accepted Video stream from {:?}", from);
                        if let Err(e) = VideoCapture::handle_incoming_video_web(from, msg, recv, broadcast_tx).await {
                            tracing::error!("Video web playback error: {}", e);
                        active_sessions.insert(from);

                        // Notify start
                        if let Some(tx) = &media_event_tx {
                            let _ = tx.send(InternalMediaEvent::VideoStart(from)).await;
                        }

                        let result = VideoCapture::handle_incoming_video_native(
                            from,
                            msg,
                            recv,
                            broadcast_tx,
                            video_frame_tx,
                        )
                        .await;

                        active_sessions.remove(&from);
                        // Notify stop
                        if let Some(tx) = &media_event_tx {
                            let _ = tx.send(InternalMediaEvent::VideoStop(from)).await;
                        }

                        if let Err(e) = result {
                            tracing::error!("Video native playback error: {}", e);
                        }
                    }
                    _ => {
                        tracing::warn!("Unknown or unexpected start message from {:?}: {:?}", from, msg);
                        tracing::warn!(
                            "Unknown or unexpected start message from {:?}: {:?}",
                            from,
                            msg
                        );
                    }
                },
                Err(e) => {
                    tracing::warn!("Failed to decode initial media message from {:?}: {}", from, e);
                    tracing::warn!(
                        "Failed to decode initial media message from {:?}: {}",
                        from,
                        e
                    );
                }
            }
        });

        // Store handle to allow cleanup on shutdown.
        // We clean up finished tasks periodically or on shutdown.
        self.incoming_media.push(handle);
        // Clean up finished tasks
        self.incoming_media.retain(|h| !h.is_finished());
    }

    /// Handle an incoming datagram (unreliable audio/video).
    pub fn handle_incoming_datagram(&mut self, from: EndpointId, data: bytes::Bytes) {
        // We originally assumed datagrams were voice-only (a simplification);
        // a prefix byte is needed once video datagrams are added later.

        // VoiceChat expects a postcard-serialized `MediaStreamMessage`
        // (`AudioData`), so the payload can be deserialized directly.
        if let Some(voice) = &mut self.voice {
            voice.handle_datagram(from, data, self.broadcast_tx.clone());
        if data.is_empty() {
            return;
        }

        // Check first byte for type
        match data[0] {
            1 => {
                // Audio
                if let Some(voice) = &mut self.voice {
                    voice.handle_datagram(from, data);
                }
            }
            // 2 => Video?
            _ => {
                // tracing::trace!("Unknown datagram type: {}", data[0]);
            }
        }
    }
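The new `handle_incoming_datagram` dispatches on a one-byte type prefix (1 = audio, other values reserved). That framing can be sketched with a std-only classifier (the `Datagram` enum and `classify` name are illustrative, not from the source):

```rust
// Datagram framing as in handle_incoming_datagram: one type byte, then
// the payload. 1 = audio; other tags are reserved (e.g. a future video type).
#[derive(Debug, PartialEq)]
enum Datagram<'a> {
    Audio(&'a [u8]),
    Unknown(u8),
}

fn classify(data: &[u8]) -> Option<Datagram<'_>> {
    // Empty datagrams are dropped, matching the early return in the source.
    if data.is_empty() {
        return None;
    }
    match data[0] {
        1 => Some(Datagram::Audio(&data[1..])),
        tag => Some(Datagram::Unknown(tag)),
    }
}

fn main() {
    assert_eq!(classify(&[1, 0xAA, 0xBB]), Some(Datagram::Audio(&[0xAA, 0xBB])));
    assert_eq!(classify(&[2, 0x00]), Some(Datagram::Unknown(2)));
    assert_eq!(classify(&[]), None);
    println!("ok");
}
```

Keeping unknown tags as a distinct variant (rather than dropping them silently at the parse site) leaves room to log or route a future video type without changing the wire format.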

@@ -292,17 +454,12 @@ impl MediaState {
        } else {
            "🎤 off"
        };
        let cam = if self.camera_enabled() {
            "📷 LIVE"
        } else {
            "📷 off"
        };
        let scr = if self.screen_enabled() {
            "🖥 LIVE"
        } else {
            "🖥 off"
        };
        format!("{} │ {} │ {}", mic, cam, scr)
        format!("{} │ {}", mic, scr)
    }

    /// Shut down all active media.
@@ -310,9 +467,6 @@ impl MediaState {
        if let Some(mut v) = self.voice.take() {
            v.stop();
        }
        if let Some(mut c) = self.camera.take() {
            c.stop();
        }
        if let Some(mut s) = self.screen.take() {
            s.stop();
        }
@@ -321,22 +475,3 @@ impl MediaState {
        }
    }
}

impl Drop for MediaState {
    fn drop(&mut self) {
        self.shutdown();
    }
}

// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------

/// Check if PipeWire is available on this system.
fn check_pipewire() -> bool {
    // Try to initialize PipeWire — if it fails, it's not available
    std::panic::catch_unwind(|| {
        pipewire::init();
    })
    .is_ok()
}
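`check_pipewire` turns a possible panic from `pipewire::init()` into a boolean availability probe via `std::panic::catch_unwind`. A std-only sketch of that pattern with a generic initializer (no pipewire dependency; the `probe` name is illustrative):

```rust
use std::panic;

// Availability probe in the style of check_pipewire(): run an initializer
// that may panic, and turn a panic into `false`, success into `true`.
fn probe(init: impl FnOnce() + panic::UnwindSafe) -> bool {
    panic::catch_unwind(init).is_ok()
}

fn main() {
    // Silence the default panic message for the intentionally failing probe.
    panic::set_hook(Box::new(|_| {}));

    assert!(probe(|| { /* init succeeds */ }));
    assert!(!probe(|| panic!("library not present")));
    println!("ok");
}
```

One caveat of this approach: `catch_unwind` only catches unwinding panics, so it cannot rescue an abort (e.g. a panic inside FFI or with `panic = "abort"`), which is worth keeping in mind if the probed library initializes through C code.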
98
src/media/playback.rs
Normal file
98
src/media/playback.rs
Normal file
@@ -0,0 +1,98 @@
use anyhow::Result;
use gstreamer::prelude::*;
use gstreamer::{ElementFactory, Pipeline, State};
use gstreamer_app::{AppSink, AppSrc};
use std::sync::{Arc, Mutex};

pub struct VideoPlayer {
    pipeline: Pipeline,
    appsrc: AppSrc,
    appsink: AppSink,
}

impl VideoPlayer {
    pub fn new<F>(on_frame: F) -> Result<Self>
    where
        F: Fn(Vec<u8>, u32, u32) + Send + Sync + 'static,
    {
        gstreamer::init()?;

        // Pipeline: appsrc -> decodebin -> videoconvert -> appsink
        // We use decodebin to handle both H.264 and HEVC automatically
        // output format RGBA for Iced
        let pipeline_str = "appsrc name=src is-live=true format=time do-timestamp=true ! \
            parsebin ! \
            decodebin ! \
            videoconvert ! \
            video/x-raw,format=RGBA ! \
            appsink name=sink drop=true max-buffers=1 sync=false";

        let pipeline = gstreamer::parse::launch(pipeline_str)?
            .downcast::<Pipeline>()
            .map_err(|_| anyhow::anyhow!("Expected pipeline"))?;

        let appsrc = pipeline
            .by_name("src")
            .ok_or_else(|| anyhow::anyhow!("Missing src"))?
            .downcast::<AppSrc>()
            .map_err(|_| anyhow::anyhow!("src is not appsrc"))?;

        let appsink = pipeline
            .by_name("sink")
            .ok_or_else(|| anyhow::anyhow!("Missing sink"))?
            .downcast::<AppSink>()
            .map_err(|_| anyhow::anyhow!("sink is not appsink"))?;

        // Set up callback
        let callbacks = gstreamer_app::AppSinkCallbacks::builder()
            .new_sample(move |sink| {
                let sample = sink.pull_sample().map_err(|_| gstreamer::FlowError::Eos)?;
                let buffer = sample.buffer().ok_or(gstreamer::FlowError::Error)?;

                // Get caps to know width/height
                let caps = sample.caps().ok_or(gstreamer::FlowError::Error)?;
                let structure = caps.structure(0).ok_or(gstreamer::FlowError::Error)?;
                let width = structure
                    .get::<i32>("width")
                    .map_err(|_| gstreamer::FlowError::Error)? as u32;
                let height = structure
                    .get::<i32>("height")
                    .map_err(|_| gstreamer::FlowError::Error)? as u32;

                let map = buffer
                    .map_readable()
                    .map_err(|_| gstreamer::FlowError::Error)?;

                on_frame(map.to_vec(), width, height);

                Ok(gstreamer::FlowSuccess::Ok)
            })
            .build();

        appsink.set_callbacks(callbacks);

        Ok(Self {
            pipeline,
            appsrc,
            appsink,
        })
    }

    pub fn start(&self) -> Result<()> {
        self.pipeline.set_state(State::Playing)?;
        Ok(())
    }

    pub fn stop(&self) -> Result<()> {
        let _ = self.pipeline.set_state(State::Null);
        Ok(())
    }

    pub fn push_data(&self, data: &[u8]) -> Result<()> {
        let buffer = gstreamer::Buffer::from_slice(data.to_vec());
        self.appsrc.push_buffer(buffer)?;
        Ok(())
    }
}

// Remove Drop impl to prevent panic on shutdown. Rely on explicit stop() or process exit.
@@ -1,150 +1,215 @@
//! Voice capture and playback using PipeWire + Audiopus (via Songbird dependency).
//! Voice capture and playback using cpal + Audiopus (Opus).
//!
//! Architecture:
//! - Capture runs on a dedicated OS thread (PipeWire main loop).
//! - PipeWire process callback copies PCM → crossbeam channel.
//! - Async task reads from channel, encodes with Opus, sends over QUIC.
//! - Playback: receives Opus packets from QUIC, decodes, feeds to PipeWire output.
//! Native implementation using QUIC Datagrams.

use std::collections::HashMap;
use std::sync::atomic::{AtomicBool, AtomicU32, Ordering};
use std::sync::Arc;
use std::thread;
use std::time::{Duration, Instant};

use anyhow::Result;
use crate::protocol::{decode_framed, MediaStreamMessage};
use crate::media::WebMediaEvent;
use postcard;
// Use audiopus types directly
use audiopus::{coder::Encoder as OpusEncoder, coder::Decoder as OpusDecoder, Channels, Application, SampleRate, Bitrate};
use anyhow::{anyhow, Result};
use audiopus::{
    coder::Decoder as OpusDecoder, coder::Encoder as OpusEncoder, Application, Bitrate, Channels,
    SampleRate,
};
use bytes::Bytes;
use cpal::traits::{DeviceTrait, HostTrait, StreamTrait};
use crossbeam_channel::{unbounded, Receiver, Sender};
use dashmap::DashMap;
use iroh::EndpointId;
use nnnoiseless::DenoiseState;
use ringbuf::{traits::*, HeapRb};

// Constants
const SAMPLE_RATE_VAL: i32 = 48000;
const FRAME_SIZE_MS: u32 = 20; // 20ms
const FRAME_SIZE_SAMPLES: usize = (SAMPLE_RATE_VAL as usize * FRAME_SIZE_MS as usize) / 1000;
const PACKET_TYPE_AUDIO: u8 = 1;
const FRAME_SIZE_SAMPLES: usize = 960; // 20ms at 48kHz

/// Represents an available audio device (source or sink).
#[derive(Debug, Clone)]
pub struct AudioDevice {
    /// PipeWire node name (used as target.object)
    pub node_name: String,
    /// Human-readable description
    pub description: String,
}
// Wrapper to make OpusDecoder Send + Sync
struct SendDecoder(OpusDecoder);
unsafe impl Send for SendDecoder {}
unsafe impl Sync for SendDecoder {}

/// List available audio input sources via `pw-dump`.
pub fn list_audio_sources() -> Vec<AudioDevice> {
    list_audio_nodes("Audio/Source")
}
// Wrapper to make AudioProducer Sync
struct SyncAudioProducer(ringbuf::HeapProd<f32>);
unsafe impl Sync for SyncAudioProducer {}
unsafe impl Send for SyncAudioProducer {} // It is already Send, but for clarity

/// List available audio output sinks via `pw-dump`.
pub fn list_audio_sinks() -> Vec<AudioDevice> {
    list_audio_nodes("Audio/Sink")
}

fn list_audio_nodes(filter_class: &str) -> Vec<AudioDevice> {
    use std::process::Command;

    let output = match Command::new("pw-dump").output() {
        Ok(o) => o,
        Err(_) => return Vec::new(),
    };

    if !output.status.success() {
        return Vec::new();
impl std::ops::Deref for SyncAudioProducer {
    type Target = ringbuf::HeapProd<f32>;
    fn deref(&self) -> &Self::Target {
        &self.0
    }

    let json_str = match String::from_utf8(output.stdout) {
        Ok(s) => s,
        Err(_) => return Vec::new(),
    };

    let data: Vec<serde_json::Value> = match serde_json::from_str(&json_str) {
        Ok(d) => d,
        Err(_) => return Vec::new(),
    };

    let mut sources = Vec::new();
    for obj in &data {
        let props = match obj.get("info").and_then(|i| i.get("props")) {
            Some(p) => p,
            None => continue,
        };

        let media_class = props.get("media.class").and_then(|v| v.as_str()).unwrap_or("");
        // Match partial class, e.g. "Audio/Source", "Audio/Sink", "Audio/Duplex"
        if !media_class.contains(filter_class) && !media_class.contains("Audio/Duplex") {
            continue;
        }

        let node_name = props.get("node.name").and_then(|v| v.as_str()).unwrap_or("").to_string();
        let description = props
            .get("node.description")
            .or_else(|| props.get("node.nick"))
            .and_then(|v| v.as_str())
            .unwrap_or(&node_name)
            .to_string();

        if !node_name.is_empty() {
            sources.push(AudioDevice { node_name, description });
        }
    }

    sources
}
impl std::ops::DerefMut for SyncAudioProducer {
    fn deref_mut(&mut self) -> &mut Self::Target {
        &mut self.0
    }
}

// Types for RingBuf
type AudioProducer = SyncAudioProducer;
type AudioConsumer = ringbuf::HeapCons<f32>;

/// Main voice chat coordination.
pub struct VoiceChat {
    running: Arc<AtomicBool>,
    capture_thread: Option<thread::JoinHandle<()>>,
    tasks: Vec<tokio::task::JoinHandle<()>>,
    datagram_decoders: std::collections::HashMap<iroh::EndpointId, OpusDecoder>,

    // Capture and Playback threads
    capture_thread: Option<thread::JoinHandle<()>>,
    playback_thread: Option<thread::JoinHandle<()>>,

    // Per-peer state: Decoder + Jitter Buffer Producer
    peer_audio_sinks: HashMap<EndpointId, (SendDecoder, AudioProducer)>,

    // Channel to notify playback thread of new peers
    new_peer_tx: Sender<(EndpointId, AudioConsumer)>,

    // Audio processing controls
    pub denoise_enabled: Arc<AtomicBool>,
    pub output_volume: Arc<AtomicU32>, // stored as f32 bits
    pub input_volume: Arc<AtomicU32>, // stored as f32 bits
    pub peer_levels: Arc<DashMap<EndpointId, f32>>,
}

// impl Drop for VoiceChat {
//     fn drop(&mut self) {
//         self.stop();
//     }
// }

impl VoiceChat {
    /// Start voice chat session (Web Version).
    /// Uses browser for capture/playback handling implicitly via `MediaState` channels,
    /// but here we handle the NETWORK side encoding/decoding.
    pub fn start_web(
    /// Start voice chat session (Native Version with CPAL + QUIC Datagrams).
    pub fn start_native(
        net: crate::net::NetworkManager,
        peers: Vec<iroh::EndpointId>, // Multiple peers
        mic_rx: tokio::sync::broadcast::Receiver<Vec<f32>>,
        peers: Vec<EndpointId>,
        mic_tx: tokio::sync::broadcast::Sender<Vec<f32>>,
        _mic_rx: tokio::sync::broadcast::Receiver<Vec<f32>>,
        _broadcast_tx: tokio::sync::broadcast::Sender<WebMediaEvent>,
        mic_bitrate: Arc<AtomicU32>,
        input_device_name: Option<String>,
        output_device_name: Option<String>,
        initial_volume: f32,
        initial_mic_volume: f32,
        initial_denoise: bool,
    ) -> Result<Self> {
        let running = Arc::new(AtomicBool::new(true));

        let mut tasks = Vec::new();
        let denoise_enabled = Arc::new(AtomicBool::new(initial_denoise));
        let output_volume = Arc::new(AtomicU32::new(initial_volume.to_bits()));
        let input_volume = Arc::new(AtomicU32::new(initial_mic_volume.to_bits()));
        let peer_levels = Arc::new(DashMap::new());

        // One sender task to iterate ALL peers? Or one task per peer?
        // Since input is broadcast channel, spawning multiple tasks (one per peer) works well.
        // Each task gets a copy of `mic_rx`. Wait, `broadcast::Receiver` can be cloned?
        // No, `mic_rx` must be resubscribed?
        // `broadcast::Receiver` is `Clone`? No, `Sender::subscribe` returns receiver.
        // But we are passing `Receiver` in.
        // We can't clone a receiver to get same stream easily without `resubscribe`.
        // But `resubscribe` needs sender.
        // We should probably spawn ONE task that reads from `mic_rx` and sends to ALL peers.
        // That is more efficient (encode once, send N times).

        tracing::info!("Starting Native Voice Chat...");

        // 1. Setup Playback Thread (CPAL Output)
        let (new_peer_tx, new_peer_rx) = unbounded::<(EndpointId, AudioConsumer)>();
        let playback_running = running.clone();
        let playback_device_name = output_device_name.clone();
        let playback_volume = output_volume.clone();
        let playback_levels = peer_levels.clone();

        let playback_thread = thread::spawn(move || {
            run_playback_loop(
                playback_running,
                new_peer_rx,
                playback_device_name,
                playback_volume,
                playback_levels,
            );
        });

        // 2. Setup Capture Thread (CPAL Input)
        let capture_running = running.clone();
        let mic_tx_capture = mic_tx.clone();
        let capture_device_name = input_device_name.clone();
        let capture_denoise = denoise_enabled.clone();
        let capture_volume = input_volume.clone();

        let capture_thread = thread::spawn(move || {
            run_capture_loop(
                capture_running,
                mic_tx_capture,
                capture_device_name,
                capture_denoise,
                capture_volume,
            );
        });

        // 3. Setup Network Sender Task (Opus -> Datagrams)
        let mut tasks = Vec::new();
        let sender_running = running.clone();
        let net_clone = net.clone();
        let mic_bitrate_clone = mic_bitrate.clone();
        let sender_task = tokio::spawn(async move {
            if let Err(e) = run_opis_sender_web_multi(net_clone, peers, mic_rx, sender_running, mic_bitrate_clone).await {
                tracing::error!("Voice sender failed: {}", e);
            }
        });
        let mic_rx_sender = mic_tx.subscribe(); // Subscribe to capture

        let sender_task = tokio::spawn(async move {
            run_network_sender(
                net_clone,
                peers,
                mic_rx_sender,
                sender_running,
                mic_bitrate_clone,
            )
            .await;
        });
        tasks.push(sender_task);

        Ok(Self {
            running,
            capture_thread: None,
            tasks,
            datagram_decoders: std::collections::HashMap::new(),
            capture_thread: Some(capture_thread),
            playback_thread: Some(playback_thread),
            peer_audio_sinks: HashMap::new(),
            new_peer_tx,
            denoise_enabled,
            output_volume,
            input_volume,
            peer_levels,
        })
    }

    pub fn set_volume(&self, volume: f32) {
        self.output_volume
            .store(volume.to_bits(), Ordering::Relaxed);
    }

    pub fn set_input_volume(&self, volume: f32) {
        self.input_volume.store(volume.to_bits(), Ordering::Relaxed);
    }

    pub fn get_volume(&self) -> f32 {
        f32::from_bits(self.output_volume.load(Ordering::Relaxed))
    }
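These setters store an `f32` volume in an `AtomicU32` by round-tripping through `to_bits`/`from_bits`, since std has no `AtomicF32`. A std-only sketch of the same pattern (the `VolumeCell` wrapper is hypothetical, not part of the repo):

```rust
use std::sync::atomic::{AtomicU32, Ordering};
use std::sync::Arc;

/// Lock-free f32 cell, mirroring how `output_volume`/`input_volume` are stored.
struct VolumeCell(Arc<AtomicU32>);

impl VolumeCell {
    fn new(initial: f32) -> Self {
        Self(Arc::new(AtomicU32::new(initial.to_bits())))
    }
    fn set(&self, v: f32) {
        // f32 -> raw bit pattern; the atomic only ever holds a u32.
        self.0.store(v.to_bits(), Ordering::Relaxed);
    }
    fn get(&self) -> f32 {
        // Raw bits -> f32; the round trip is bit-exact, no rounding.
        f32::from_bits(self.0.load(Ordering::Relaxed))
    }
}

fn main() {
    let vol = VolumeCell::new(1.0);
    vol.set(0.25);
    assert_eq!(vol.get(), 0.25); // bit-exact round trip
    println!("ok");
}
```

`Relaxed` ordering suffices here because the audio callbacks only need *some* recent value, not synchronization with other memory.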

    pub fn is_denoise_enabled(&self) -> bool {
        self.denoise_enabled.load(Ordering::Relaxed)
    }

    pub fn toggle_denoise(&self) -> bool {
        let current = self.denoise_enabled.load(Ordering::Relaxed);
        self.denoise_enabled.store(!current, Ordering::Relaxed);
        !current
    }

    pub fn get_peer_levels(&self) -> HashMap<EndpointId, f32> {
        self.peer_levels
            .iter()
            .map(|entry| (*entry.key(), *entry.value()))
            .collect()
    }

    // Kept for compatibility but unused in Native mode
    pub fn start_web(
        _net: crate::net::NetworkManager,
        _peers: Vec<EndpointId>,
        _mic_rx: tokio::sync::broadcast::Receiver<Vec<f32>>,
        _broadcast_tx: tokio::sync::broadcast::Sender<WebMediaEvent>,
        _mic_bitrate: Arc<AtomicU32>,
    ) -> Result<Self> {
        Err(anyhow!("Web voice not supported in this native build"))
    }

    /// Stop voice chat.
    pub fn stop(&mut self) {
        self.running.store(false, Ordering::SeqCst);
@@ -152,197 +217,412 @@ impl VoiceChat {
            task.abort();
        }
        self.tasks.clear();

        // Spawn a thread to join audio threads to avoid blocking async context if they are stuck
        if let Some(t) = self.capture_thread.take() {
            t.thread().unpark(); // Wake up if sleeping
            let _ = t.join();
            t.thread().unpark();
        }
        if let Some(t) = self.playback_thread.take() {
            t.thread().unpark();
        }
    }

    /// Handle incoming audio stream (Web Version).
    pub async fn handle_incoming_audio_web(
        from: iroh::EndpointId,
        message: MediaStreamMessage,
        mut recv: iroh::endpoint::RecvStream,
        broadcast_tx: tokio::sync::broadcast::Sender<WebMediaEvent>,
    ) -> Result<()> {
        // Initialize Opus decoder
        let mut decoder = OpusDecoder::new(SampleRate::Hz48000, Channels::Mono)
            .map_err(|e| anyhow::anyhow!("Failed to create Opus decoder: {:?}", e))?;

        // Process start message
        match message {
            MediaStreamMessage::AudioStart { .. } => {
                tracing::info!("Incoming voice stream started (web) from {}", from);
            }
            _ => anyhow::bail!("Expected AudioStart"),
    /// Handle incoming datagram from Network.
    pub fn handle_datagram(&mut self, from: EndpointId, data: Bytes) {
        // Packet format: [1 byte TYPE][Opus Data]
        if data.len() < 2 {
            return;
        }

        let mut decode_buf = vec![0f32; FRAME_SIZE_SAMPLES];
        // Check type (Audio = 1)
        if data[0] != PACKET_TYPE_AUDIO {
            return;
        }

        loop {
            let msg: MediaStreamMessage = match decode_framed(&mut recv).await {
                Ok(m) => m,
                Err(_) => break, // EOF
            };
        let opus_data = &data[1..];

        match msg {
            MediaStreamMessage::AudioData { opus_data, .. } => { // Removed `channels` field usage if it existed
                match decoder.decode_float(Some(&opus_data), &mut decode_buf, false) {
                    Ok(len) => {
                        let samples = decode_buf[..len].to_vec();
                        // Broadcast to web
                        let short_id: String = format!("{}", from).chars().take(8).collect();
                        let _ = broadcast_tx.send(WebMediaEvent::Audio { peer_id: short_id, data: samples });
                    }
                    Err(e) => {
                        tracing::warn!("Opus decode error: {:?}", e);
                    }
                }
            }
            MediaStreamMessage::AudioStop => {
                tracing::info!("Peer stopped audio");
                break;
            }
            _ => {}
        // Get or create decoder/producer for this peer
        let (decoder, producer) = self.peer_audio_sinks.entry(from).or_insert_with(|| {
            tracing::info!("New voice peer detected: {}", from);

            // Create Jitter Buffer (RingBuf)
            // 48kHz * 1s buffer
            let rb = HeapRb::<f32>::new(48000);
            let (prod, cons) = rb.split();

            // Send consumer to playback thread
            if let Err(e) = self.new_peer_tx.send((from, cons)) {
                tracing::error!("Failed to send new peer to playback thread: {}", e);
            }
        }
        Ok(())
    }

    pub fn handle_datagram(
        &mut self,
        from: iroh::EndpointId,
        data: bytes::Bytes,
        broadcast_tx: tokio::sync::broadcast::Sender<WebMediaEvent>
    ) {
        // tracing::info!("Received datagram from {} ({} bytes)", from, data.len());
        match postcard::from_bytes::<MediaStreamMessage>(&data) {
            Ok(MediaStreamMessage::AudioData { opus_data, sequence }) => {
                if sequence % 50 == 0 {
                    tracing::info!("Received AudioData seq {} from {}", sequence, from);
                }
                let decoder = self.datagram_decoders.entry(from).or_insert_with(|| {
                    tracing::info!("Creating new OpusDecoder for {}", from);
                    OpusDecoder::new(SampleRate::Hz48000, Channels::Mono).expect("Failed to create decoder")
                });
                // Max frame size is ~120ms (5760 samples). Use safe buffer.
                let mut pcm = vec![0.0f32; 5760];
                match decoder.decode_float(Some(&opus_data), &mut pcm, false) {
                    Ok(len) => {
                        let samples = pcm[..len].to_vec();
                        let short_id: String = format!("{}", from).chars().take(8).collect();
                        let _ = broadcast_tx.send(WebMediaEvent::Audio { peer_id: short_id, data: samples });
                    }
                    Err(e) => tracing::warn!("Opus decode error: {:?}", e),
                }
            },
            Ok(_) => {}, // Ignore non-audio datagrams
            Err(e) => tracing::warn!("Failed to deserialize datagram: {}", e),
            let decoder = OpusDecoder::new(SampleRate::Hz48000, Channels::Mono)
                .expect("Failed to create Opus decoder");

            (SendDecoder(decoder), SyncAudioProducer(prod))
        });

        // Decode Opus -> PCM
        // Max frame size for 120ms is 5760 samples.
        let mut pcm = vec![0.0f32; 5760];
        match decoder.0.decode_float(Some(opus_data), &mut pcm, false) {
            Ok(len) => {
                let samples = &pcm[..len];
                producer.push_slice(samples);
            }
            Err(e) => {
                tracing::warn!("Opus decode error: {:?}", e);
            }
        }
    }
}
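The new datagram layout is a one-byte type tag followed by the raw Opus payload, built in the sender and validated in `handle_datagram`. The framing logic can be sketched with plain std code (the helpers `frame_audio`/`parse_audio` are hypothetical; the tag value mirrors `PACKET_TYPE_AUDIO = 1`):

```rust
const PACKET_TYPE_AUDIO: u8 = 1;

/// Prefix an Opus payload with its one-byte type tag: [TYPE][OPUS].
fn frame_audio(opus: &[u8]) -> Vec<u8> {
    let mut datagram = Vec::with_capacity(1 + opus.len());
    datagram.push(PACKET_TYPE_AUDIO);
    datagram.extend_from_slice(opus);
    datagram
}

/// Return the Opus payload if this is a well-formed audio datagram.
fn parse_audio(datagram: &[u8]) -> Option<&[u8]> {
    // Reject anything shorter than tag + 1 payload byte, or with the wrong tag,
    // matching the `data.len() < 2` / `data[0] != PACKET_TYPE_AUDIO` checks above.
    if datagram.len() < 2 || datagram[0] != PACKET_TYPE_AUDIO {
        return None;
    }
    Some(&datagram[1..])
}

fn main() {
    let framed = frame_audio(&[0xAA, 0xBB]);
    assert_eq!(framed, vec![1, 0xAA, 0xBB]);
    assert_eq!(parse_audio(&framed), Some(&[0xAA, 0xBB][..]));
    assert_eq!(parse_audio(&[1]), None); // tag with empty payload is dropped
    println!("ok");
}
```

Leaving the remaining type values (e.g. `2` for video, as the `// 2 => Video?` comment suggests) unassigned keeps the header extensible without versioning.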

// ---------------------------------------------------------------------------
// Opus sender — encodes PCM and sends over QUIC (Multi-Peer)
// Loops
// ---------------------------------------------------------------------------

async fn run_opis_sender_web_multi(
    network_manager: crate::net::NetworkManager,
    peers: Vec<iroh::EndpointId>,
    mut input_rx: tokio::sync::broadcast::Receiver<Vec<f32>>,
fn run_capture_loop(
    running: Arc<AtomicBool>,
    mic_tx: tokio::sync::broadcast::Sender<Vec<f32>>,
    device_name: Option<String>,
    denoise_enabled: Arc<AtomicBool>,
    input_volume: Arc<AtomicU32>,
) {
    let host = cpal::default_host();

    // Find device
    let device = if let Some(ref name) = device_name {
        tracing::info!("Requesting input device: '{}'", name);
        host.input_devices()
            .ok()
            .and_then(|mut ds| ds.find(|d| d.name().map(|n| n == *name).unwrap_or(false)))
    } else {
        tracing::info!("Requesting default input device");
        host.default_input_device()
    };

    let device = match device {
        Some(d) => {
            tracing::info!("Found input device: {:?}", d.name());
            d
        }
        None => {
            tracing::error!("No input device found");
            return;
        }
    };
    tracing::info!("Mic opened: {:?}", device.name());

    // Configure
    let config = match device.default_input_config() {
        Ok(c) => c,
        Err(e) => {
            tracing::error!("Failed to get default input config: {}", e);
            return;
        }
    };

    // We try to stick to default but standardise to 1 channel if possible.
    let stream_config: cpal::StreamConfig = config.clone().into();

    tracing::info!("Input config: {:?}", stream_config);

    // Initialize RNNoise
    // RNNoise expects chunks of 480 samples (10ms at 48kHz).
    let mut denoise_state = DenoiseState::new();
    let mut processing_buffer: Vec<f32> = Vec::with_capacity(480 * 2);
    let mut out_buf = [0.0f32; DenoiseState::FRAME_SIZE];

    let mut last_log = Instant::now();
    let mut packet_count = 0;

    let err_fn = |err| tracing::error!("Input stream error: {}", err);

    let stream = match config.sample_format() {
        cpal::SampleFormat::F32 => {
            let running_clone = running.clone();
            device.build_input_stream(
                &stream_config,
                move |data: &[f32], _: &_| {
                    if !running_clone.load(Ordering::Relaxed) { return; }

                    packet_count += 1;
                    if last_log.elapsed() >= Duration::from_secs(1) {
                        tracing::info!("Microphone input active: {} callbacks/sec, {} samples in last callback", packet_count, data.len());
                        packet_count = 0;
                        last_log = Instant::now();
                    }

                    // Convert to Mono
                    let channels = stream_config.channels as usize;
                    let mut mono_samples: Vec<f32> = if channels == 1 {
                        data.to_vec()
                    } else {
                        data.chunks(channels).map(|chunk| chunk.iter().sum::<f32>() / channels as f32).collect()
                    };

                    // Apply Input Gain
                    let gain = f32::from_bits(input_volume.load(Ordering::Relaxed));
                    if gain != 1.0 {
                        for sample in &mut mono_samples {
                            *sample *= gain;
                        }
                    }

                    if !mono_samples.is_empty() {
                        let use_denoise = denoise_enabled.load(Ordering::Relaxed);

                        if use_denoise {
                            processing_buffer.extend_from_slice(&mono_samples);

                            while processing_buffer.len() >= DenoiseState::FRAME_SIZE {
                                let chunk: Vec<f32> = processing_buffer.drain(0..DenoiseState::FRAME_SIZE).collect();
                                denoise_state.process_frame(&mut out_buf, &chunk);
                                let _ = mic_tx.send(out_buf.to_vec());
                            }
                        } else {
                            // Pass through
                            let _ = mic_tx.send(mono_samples);
                        }
                    }
                },
                err_fn,
                None
            )
        }
        _ => {
            tracing::error!("Input device does not support F32 samples");
            return;
        }
    };

    if let Ok(s) = stream {
        if let Err(e) = s.play() {
            tracing::error!("Failed to play input stream: {}", e);
        }
        tracing::info!("Voice started (Capture)");

        // Keep thread alive
        while running.load(Ordering::Relaxed) {
            thread::sleep(Duration::from_millis(100));
        }
    } else if let Err(e) = stream {
        tracing::error!("Failed to build input stream: {}", e);
    }
}
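Both the denoise path (480-sample RNNoise frames) and the Opus sender (960-sample, 20 ms frames) use the same accumulate-and-drain pattern: append whatever the callback delivers, then peel off fixed-size frames while enough samples are buffered. A minimal std-only sketch (`drain_frames` is a hypothetical helper, not repo code):

```rust
/// Accumulate arbitrary-sized sample batches and emit fixed-size frames,
/// carrying any remainder over to the next call.
fn drain_frames(buffer: &mut Vec<f32>, frame_size: usize) -> Vec<Vec<f32>> {
    let mut frames = Vec::new();
    while buffer.len() >= frame_size {
        // drain(0..n) removes the front n samples, preserving the tail.
        frames.push(buffer.drain(0..frame_size).collect());
    }
    frames
}

fn main() {
    let mut buf: Vec<f32> = vec![0.0; 1000]; // one callback's worth of mono PCM
    let frames = drain_frames(&mut buf, 480); // RNNoise-sized frames
    assert_eq!(frames.len(), 2); // two full 480-sample frames emitted
    assert_eq!(buf.len(), 40);   // 1000 - 960 samples carried over
    println!("ok");
}
```

The carry-over is what decouples the OS callback's buffer size from the codec's fixed frame size.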

fn run_playback_loop(
    running: Arc<AtomicBool>,
    new_peer_rx: Receiver<(EndpointId, AudioConsumer)>,
    device_name: Option<String>,
    output_volume: Arc<AtomicU32>,
    peer_levels: Arc<DashMap<EndpointId, f32>>,
) {
    let host = cpal::default_host();

    let device = if let Some(ref name) = device_name {
        tracing::info!("Requesting output device: '{}'", name);
        host.output_devices()
            .ok()
            .and_then(|mut ds| ds.find(|d| d.name().map(|n| n == *name).unwrap_or(false)))
    } else {
        tracing::info!("Requesting default output device");
        host.default_output_device()
    };

    let device = match device {
        Some(d) => {
            tracing::info!("Found output device: {:?}", d.name());
            d
        }
        None => {
            tracing::error!("No output device found");
            return;
        }
    };
    tracing::info!("Speaker opened: {:?}", device.name());

    let config = match device.default_output_config() {
        Ok(c) => c,
        Err(e) => {
            tracing::error!("Failed to get default output config: {}", e);
            return;
        }
    };
    let stream_config: cpal::StreamConfig = config.clone().into();

    // Store consumers for mixing
    let mut consumers: Vec<(EndpointId, AudioConsumer)> = Vec::new();

    let err_fn = |err| tracing::error!("Output stream error: {}", err);

    let mut last_log = Instant::now();
    let mut packet_count = 0;

    let stream = match config.sample_format() {
        cpal::SampleFormat::F32 => {
            let running_clone = running.clone();
            device.build_output_stream(
                &stream_config,
                move |data: &mut [f32], _: &_| {
                    if !running_clone.load(Ordering::Relaxed) {
                        return;
                    }

                    packet_count += 1;
                    if last_log.elapsed() >= Duration::from_secs(1) {
                        tracing::info!("Speaker output active: {} callbacks/sec", packet_count);
                        packet_count = 0;
                        last_log = Instant::now();
                    }

                    let master_vol = f32::from_bits(output_volume.load(Ordering::Relaxed));

                    // Check for new peers non-blocking
                    while let Ok((id, c)) = new_peer_rx.try_recv() {
                        tracing::info!("Adding peer {} to mix", id);
                        consumers.push((id, c));
                    }

                    // Mix
                    let channels = stream_config.channels as usize;

                    // We assume we are filling interleaved buffer.
                    // Our ringbufs are Mono. We duplicate mono to all channels.

                    // Pre-allocate level accumulators for this frame
                    // We'll calculate RMS over the whole buffer size for UI visualization
                    let mut peer_sums: HashMap<EndpointId, f32> = HashMap::new();
                    let mut peer_counts: HashMap<EndpointId, usize> = HashMap::new();

                    // Iterate output buffer frame by frame (all channels per sample time)
                    for frame in data.chunks_mut(channels) {
                        let mut sum: f32 = 0.0;

                        // Sum up all peers
                        for (id, c) in consumers.iter_mut() {
                            if let Some(sample) = c.try_pop() {
                                sum += sample;

                                // Accumulate squared sample for RMS
                                *peer_sums.entry(*id).or_default() += sample * sample;
                                *peer_counts.entry(*id).or_default() += 1;
                            }
                        }

                        // Apply master volume
                        sum *= master_vol;

                        // Soft clip
                        let mixed = sum.clamp(-1.0, 1.0);

                        // Assign to all channels
                        for sample in frame.iter_mut() {
                            *sample = mixed;
                        }
                    }

                    // Update peer levels in shared map
                    for (id, sq_sum) in peer_sums {
                        let count = peer_counts.get(&id).unwrap_or(&1);
                        let rms = (sq_sum / *count as f32).sqrt();

                        // Smooth decay could be implemented here, but for now just raw RMS
                        peer_levels.insert(id, rms);
                    }
                },
                err_fn,
                None,
            )
        }
        _ => {
            tracing::error!("Output device does not support F32 samples");
            return;
        }
    };

    if let Ok(s) = stream {
        if let Err(e) = s.play() {
            tracing::error!("Failed to play output stream: {}", e);
        }

        while running.load(Ordering::Relaxed) {
            thread::sleep(Duration::from_millis(100));
        }
    } else {
        tracing::error!("Failed to build output stream");
    }
}
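The per-peer level written into `peer_levels` above is a plain RMS over the samples each peer contributed to the callback buffer. Computed standalone (the `rms` helper is a hypothetical extraction of that accumulation):

```rust
/// Root-mean-square of a sample slice, as fed to the UI level meters.
fn rms(samples: &[f32]) -> f32 {
    if samples.is_empty() {
        return 0.0; // silent peer: no samples popped this callback
    }
    // Sum of squares, then mean, then square root.
    let sq_sum: f32 = samples.iter().map(|s| s * s).sum();
    (sq_sum / samples.len() as f32).sqrt()
}

fn main() {
    assert_eq!(rms(&[]), 0.0);
    // Constant magnitude 0.5 gives RMS exactly 0.5 regardless of sign.
    assert_eq!(rms(&[0.5, -0.5, 0.5, -0.5]), 0.5);
    println!("ok");
}
```

As the inline comment in the callback notes, a smoothed (decaying) envelope would make the meters less jittery; raw RMS per buffer is the simplest correct starting point.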
|
||||
|
||||
async fn run_network_sender(
|
||||
net: crate::net::NetworkManager,
|
||||
peers: Vec<EndpointId>,
|
||||
mut mic_rx: tokio::sync::broadcast::Receiver<Vec<f32>>,
|
||||
running: Arc<AtomicBool>,
|
||||
mic_bitrate: Arc<AtomicU32>,
|
||||
) -> Result<()> {
|
||||
) {
|
||||
if peers.is_empty() {
|
||||
return Ok(());
|
||||
return;
|
||||
}
|
||||
|
||||
// Connect to all peers to get Connection handles
|
||||
// Initialize connections
|
||||
let mut connections = Vec::new();
|
||||
for peer in peers {
|
||||
// We use VOICE_ALPN, but for datagrams ALPN matters for connection establishment.
|
||||
// If we already have a connection (e.g. gossip), iroh reuses it?
|
||||
// Yes, connect() returns the existing connection if active.
|
||||
match network_manager.endpoint.connect(peer, crate::net::VOICE_ALPN).await {
|
||||
Ok(conn) => {
|
||||
connections.push(conn);
|
||||
}
|
||||
Err(e) => {
|
||||
tracing::warn!("Failed to connect to {}: {}", peer, e);
|
||||
}
|
||||
match net.endpoint.connect(peer, crate::net::VOICE_ALPN).await {
|
||||
Ok(conn) => connections.push(conn),
|
||||
Err(e) => tracing::warn!("Failed to connect to {}: {}", peer, e),
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
if connections.is_empty() {
|
||||
tracing::warn!("No reachable peers for voice chat");
|
||||
// We continue anyway? No, useless.
|
||||
// But maybe peers come online later?
|
||||
// For now, fail if no initial connection.
|
||||
// Or wait? The loop only runs while running is true.
|
||||
// Let's just return to avoid spinning.
|
||||
return Ok(());
|
||||
return;
|
||||
}
|
||||
|
||||
let mut encoder = OpusEncoder::new(SampleRate::Hz48000, Channels::Mono, Application::Voip)
    .map_err(|e| anyhow::anyhow!("Failed to create Opus encoder: {:?}", e))?;

// Set initial bitrate
let current_bitrate = mic_bitrate.load(Ordering::Relaxed);
encoder
    .set_bitrate(Bitrate::BitsPerSecond(current_bitrate as i32))
    .map_err(|e| anyhow::anyhow!("Failed to set bitrate: {:?}", e))?;

// Opus frame size: 20ms at 48kHz = 960 samples
let mut pcm_buffer: Vec<f32> = Vec::with_capacity(FRAME_SIZE_SAMPLES * 2);
let mut opus_buffer = vec![0u8; 1500]; // roughly one MTU
let mut sequence: u64 = 0;

tracing::info!("Voice Sender: Broadcasting to {} peers", connections.len());
while running.load(Ordering::Relaxed) {
    // Pick up any bitrate change requested by the UI
    let bitrate = mic_bitrate.load(Ordering::Relaxed);
    let _ = encoder.set_bitrate(Bitrate::BitsPerSecond(bitrate as i32));

    // Receive PCM from the capture side
    match mic_rx.recv().await {
        Ok(samples) => {
            pcm_buffer.extend_from_slice(&samples);

            // Encode complete 20ms chunks
            while pcm_buffer.len() >= FRAME_SIZE_SAMPLES {
                let chunk: Vec<f32> = pcm_buffer.drain(0..FRAME_SIZE_SAMPLES).collect();

                match encoder.encode_float(&chunk, &mut opus_buffer) {
                    Ok(len) => {
                        let opus_packet = &opus_buffer[..len];
                        sequence = sequence.wrapping_add(1);

                        // Construct Datagram: [TYPE=1][OPUS]
                        let mut datagram = Vec::with_capacity(1 + len);
                        datagram.push(PACKET_TYPE_AUDIO);
                        datagram.extend_from_slice(opus_packet);

                        let bytes = Bytes::from(datagram);

                        // Send to all peers
                        for conn in &mut connections {
                            if let Err(e) = conn.send_datagram(bytes.clone()) {
                                // Don't log every failure for datagrams (spammy)
                                tracing::debug!("Datagram send error: {}", e);
                            }
                        }
                    }
                    Err(e) => tracing::error!("Opus encode error: {:?}", e),
                }
            }
        }
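The `[TYPE=1][OPUS]` datagram layout used in the send loop can be sketched as a tiny framing helper. This is a minimal sketch, not the crate's actual implementation: `PACKET_TYPE_AUDIO = 1` is assumed from the comment above, and `frame_audio_datagram`/`parse_datagram` are illustrative names.

```rust
/// Assumed value of the PACKET_TYPE_AUDIO tag (the real constant lives
/// in the media module; only its name appears in the diff).
const PACKET_TYPE_AUDIO: u8 = 1;

/// Prefix an encoded Opus frame with a one-byte type tag.
fn frame_audio_datagram(opus: &[u8]) -> Vec<u8> {
    let mut datagram = Vec::with_capacity(1 + opus.len());
    datagram.push(PACKET_TYPE_AUDIO);
    datagram.extend_from_slice(opus);
    datagram
}

/// Split a received datagram back into (type tag, payload).
fn parse_datagram(data: &[u8]) -> Option<(u8, &[u8])> {
    let (&tag, payload) = data.split_first()?;
    Some((tag, payload))
}

fn main() {
    let datagram = frame_audio_datagram(&[0xAA, 0xBB]);
    assert_eq!(datagram, vec![1, 0xAA, 0xBB]);

    let (tag, payload) = parse_datagram(&datagram).unwrap();
    assert_eq!(tag, PACKET_TYPE_AUDIO);
    assert_eq!(payload, &[0xAA, 0xBB]);
    println!("round-trip ok");
}
```

A one-byte tag keeps the per-packet overhead negligible next to the Opus payload, while still letting the receiver demultiplex audio from other datagram types.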
@@ -352,6 +632,4 @@ async fn run_opis_sender_web_multi(
        }
    }

    Ok(())
}
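The "20ms at 48kHz = 960 samples" chunking in the loop above can be exercised standalone. A sketch, assuming `FRAME_SIZE_SAMPLES` is 960 (derived from the comment in the diff); `drain_frames` is an illustrative helper, not a function from the repo:

```rust
// 20 ms at 48 kHz mono, as stated in the diff's comment (assumed value).
const FRAME_SIZE_SAMPLES: usize = 960;

/// Drain as many complete 960-sample frames as the buffer holds,
/// leaving any partial remainder for the next capture callback —
/// the same accumulate-then-drain pattern as the sender loop.
fn drain_frames(pcm_buffer: &mut Vec<f32>) -> Vec<Vec<f32>> {
    let mut frames = Vec::new();
    while pcm_buffer.len() >= FRAME_SIZE_SAMPLES {
        frames.push(pcm_buffer.drain(0..FRAME_SIZE_SAMPLES).collect());
    }
    frames
}

fn main() {
    let mut buf: Vec<f32> = vec![0.0; 2500]; // ~2.6 frames of captured audio
    let frames = drain_frames(&mut buf);
    assert_eq!(frames.len(), 2);        // two full 20 ms frames encoded
    assert_eq!(buf.len(), 2500 - 1920); // 580 samples carried over
    println!("{} frames, {} leftover samples", frames.len(), buf.len());
}
```

Carrying the remainder over matters because capture callbacks rarely deliver exact multiples of the Opus frame size.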
@@ -65,12 +65,16 @@ pub enum NetEvent {
}

/// Information about a connected peer.
#[derive(Debug, Clone, serde::Serialize)]
pub struct PeerInfo {
    pub id: EndpointId,
    pub name: Option<String>,
    pub capabilities: Option<protocol::CapabilitiesMessage>,
    pub is_self: bool,
    #[serde(skip)]
    pub audio_level: f32,
    #[serde(skip)]
    pub is_streaming_video: bool,
}
/// Manages the iroh networking stack.
@@ -109,7 +113,7 @@ impl NetworkManager {
        tokio::spawn(async move {
            while let Some(_event) = receiver.next().await {
                // Drain events to keep the subscription active.
                // We ignore them because the main subscription loop triggers
                // the actual application logic (NetEvent).
            }
        });
@@ -118,7 +122,9 @@ impl NetworkManager {
    }

    /// Create a new NetworkManager and start the iroh endpoint.
    pub async fn new(
        topic_bytes: [u8; 32],
    ) -> Result<(Self, mpsc::Sender<NetEvent>, mpsc::Receiver<NetEvent>)> {
        let (event_tx, event_rx) = mpsc::channel(256);

        // Create endpoint with file transfer ALPN and Gossip ALPN
@@ -202,6 +208,8 @@ impl NetworkManager {
            name: None,
            capabilities: None,
            is_self: false,
            audio_level: 0.0,
            is_streaming_video: false,
        });
    }
    let _ = event_tx.send(NetEvent::PeerUp(peer_id)).await;
@@ -281,7 +289,7 @@ impl NetworkManager {
    while let Some(incoming) = endpoint.accept().await {
        let gossip = gossip.clone();
        let event_tx = event_tx.clone();

        tokio::spawn(async move {
            match incoming.await {
                Ok(conn) => {
@@ -293,16 +301,33 @@ impl NetworkManager {
                    if let Err(e) = gossip.handle_connection(conn).await {
                        tracing::warn!("Gossip failed to handle connection: {}", e);
                    }
                } else if alpn == FILE_TRANSFER_ALPN
                    || alpn == VOICE_ALPN
                    || alpn == CAMERA_ALPN
                    || alpn == SCREEN_ALPN
                {
                    // Handle application protocols with a dedicated loop
                    tracing::info!(
                        "Accepted connection from {} with ALPN {:?}",
                        peer_id,
                        String::from_utf8_lossy(&alpn)
                    );
                    if let Err(e) =
                        Self::handle_app_connection(conn, alpn, event_tx).await
                    {
                        tracing::warn!(
                            "Connection handler error for {}: {}",
                            peer_id,
                            e
                        );
                    }
                } else {
                    tracing::warn!(
                        "Ignored connection with unknown ALPN: {:?}",
                        String::from_utf8_lossy(&alpn)
                    );
                }
            }
            Err(e) => {
                tracing::warn!("Failed to accept incoming connection: {}", e);
            }
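The ALPN dispatch above is plain byte-string comparison. A sketch of the same routing shape — the ALPN values here are invented for illustration (only the constant names appear in the diff, not their contents):

```rust
// Hypothetical ALPN identifiers; the real byte strings are defined in src/net.
const GOSSIP_ALPN: &[u8] = b"p2p-chat/gossip/0";
const FILE_TRANSFER_ALPN: &[u8] = b"p2p-chat/file/0";
const VOICE_ALPN: &[u8] = b"p2p-chat/voice/0";

#[derive(Debug)]
enum Route {
    Gossip,  // handed to the gossip protocol handler
    App,     // file/voice/camera/screen share handle_app_connection
    Unknown, // logged and ignored
}

/// Mirror of the accept-loop dispatch: gossip gets its own handler,
/// the application protocols share one connection loop, everything
/// else is rejected.
fn route(alpn: &[u8]) -> Route {
    if alpn == GOSSIP_ALPN {
        Route::Gossip
    } else if alpn == FILE_TRANSFER_ALPN || alpn == VOICE_ALPN {
        Route::App
    } else {
        Route::Unknown
    }
}

fn main() {
    assert!(matches!(route(VOICE_ALPN), Route::App));
    assert!(matches!(route(GOSSIP_ALPN), Route::Gossip));
    assert!(matches!(route(b"http/1.1"), Route::Unknown));
    println!("dispatch ok");
}
```

Rejecting unknown ALPNs early keeps a stray client from tying up a handler task that will never speak the expected protocol.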
@@ -312,9 +337,13 @@ impl NetworkManager {
        });
    }

    async fn handle_app_connection(
        conn: iroh::endpoint::Connection,
        alpn: Vec<u8>,
        event_tx: mpsc::Sender<NetEvent>,
    ) -> Result<()> {
        let peer_id = conn.remote_id();

        loop {
            tokio::select! {
                // Accept bi-directional streams (reliable)
@@ -25,9 +25,20 @@ pub enum GossipMessage {
    Capabilities(CapabilitiesMessage),
    PeerAnnounce(PeerAnnounce),
    FileOfferBroadcast(FileOfferBroadcast),
    /// Request the file transfer (sent by recipient after accepting).
    FileRequest(FileRequest),
    NameChange(NameChange),
    /// Graceful disconnect notification.
    Disconnect {
        sender_name: String,
    },
}

/// Request to start a file transfer.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct FileRequest {
    pub sender_name: String,
    pub file_id: FileId,
}
/// A name change notification from a peer.
@@ -161,10 +172,7 @@ pub enum MediaStreamMessage {
        frame_size_ms: u8,
    },
    /// One Opus-encoded audio frame.
    AudioData { sequence: u64, opus_data: Vec<u8> },
    /// Audio session ended.
    AudioStop,

@@ -182,9 +190,7 @@ pub enum MediaStreamMessage {
        data: Vec<u8>,
    },
    /// Video session ended.
    VideoStop { kind: MediaKind },
}

// ---------------------------------------------------------------------------
@@ -14,7 +14,7 @@ pub fn render(frame: &mut Frame, area: Rect, chat: &ChatState, app: &App) {
    let block = Block::default()
        .title(" 💬 Chat ")
        .borders(Borders::ALL)
        .border_style(Style::default().fg(app.theme.chat_border));

    let inner = block.inner(area);
    frame.render_widget(block, area);
@@ -48,17 +48,14 @@ pub fn render(frame: &mut Frame, area: Rect, chat: &ChatState, app: &App) {

fn format_entry(entry: &ChatEntry, app: &App) -> ListItem<'static> {
    let time = chrono::DateTime::from_timestamp_millis(entry.timestamp as i64)
        .map(|dt| dt.with_timezone(&chrono::Local).format("%H:%M").to_string())
        .unwrap_or_default();

    if entry.is_system {
        let line = Line::from(vec![
            Span::styled(
                format!("[{}] ", time),
                Style::default().fg(app.theme.time),
            ),
            Span::styled(
                format!("[sys] {}", entry.text),
                Style::default()
                    .fg(app.theme.system_msg)
                    .add_modifier(Modifier::ITALIC),
@@ -70,16 +67,14 @@ fn format_entry(entry: &ChatEntry, app: &App) -> ListItem<'static> {
    let name_color = if entry.is_self {
        app.theme.self_name
    } else {
        crate::tui::get_peer_color(&entry.sender_name)
    };

    let line = Line::from(vec![
        Span::styled(format!("[{}] ", time), Style::default().fg(app.theme.time)),
        Span::styled(
            format!("{}: ", entry.sender_name),
            Style::default().fg(name_color).add_modifier(Modifier::BOLD),
        ),
        Span::styled(entry.text.clone(), Style::default().fg(app.theme.text)),
    ]);
@@ -15,7 +15,7 @@ pub fn render(frame: &mut Frame, area: Rect, file_mgr: &FileTransferManager, app
    let block = Block::default()
        .title(format!(" 📦 Transfers ({}) ", transfers.len()))
        .borders(Borders::ALL)
        .border_style(Style::default().fg(app.theme.transfer_border));

    let inner = block.inner(area);
    frame.render_widget(block, area);
@@ -30,29 +30,36 @@ pub fn render(frame: &mut Frame, area: Rect, file_mgr: &FileTransferManager, app
    let items: Vec<ListItem> = transfers
        .iter()
        .map(|info| {
            let id_short = hex::encode(info.file_id)
                .chars()
                .take(4)
                .collect::<String>();

            // Format state with specific styling
            let (text, color) = match &info.state {
                TransferState::Complete { .. } => (
                    format!("[{}] {} (Complete)", id_short, info.file_name),
                    app.theme.success,
                ),
                TransferState::Rejected { .. } => (
                    format!("[{}] {} (Rejected)", id_short, info.file_name),
                    app.theme.error,
                ),
                TransferState::Failed { error, .. } => (
                    format!("[{}] {} (Failed: {})", id_short, info.file_name, error),
                    app.theme.error,
                ),
                TransferState::Transferring {
                    bytes_transferred,
                    total_size,
                    start_time,
                } => {
                    let pct = if *total_size > 0 {
                        (*bytes_transferred as f64 / *total_size as f64) * 100.0
                    } else {
                        0.0
                    };

                    // Calc speed & ETA
                    let elapsed = start_time.elapsed().as_secs_f64();
                    let speed_bps = if elapsed > 0.0 {
@@ -61,7 +68,7 @@ pub fn render(frame: &mut Frame, area: Rect, file_mgr: &FileTransferManager, app
                        0.0
                    };
                    let speed_mbps = speed_bps / (1024.0 * 1024.0);

                    let eta_str = if speed_bps > 0.0 && *total_size > *bytes_transferred {
                        let remaining_bytes = total_size - bytes_transferred;
                        let eta_secs = remaining_bytes as f64 / speed_bps;
@@ -73,46 +80,72 @@ pub fn render(frame: &mut Frame, area: Rect, file_mgr: &FileTransferManager, app
                    // Progress Bar
                    let width = 15;
                    let filled = (pct / 100.0 * width as f64) as usize;
                    let bar: String = (0..width)
                        .map(|i| if i < filled { '█' } else { '░' })
                        .collect();

                    (
                        format!(
                            "[{}] {} ⏳ {} {:.1}% ({:.1} MB/s) - {}",
                            id_short, info.file_name, bar, pct, speed_mbps, eta_str
                        ),
                        // Use warning color (yellow) as requested
                        app.theme.warning,
                    )
                }
                TransferState::Offering => (
                    format!("[{}] {} (Offering...)", id_short, info.file_name),
                    app.theme.warning,
                ),
                TransferState::WaitingForAccept { expires_at } => {
                    let now = std::time::Instant::now();
                    if *expires_at > now {
                        let remaining = expires_at.duration_since(now).as_secs();

                        // spinner (braille)
                        const SPINNER: &[&str] = &["⠟", "⠯", "⠷", "⠾", "⠽", "⠻"];
                        let millis = std::time::SystemTime::now()
                            .duration_since(std::time::UNIX_EPOCH)
                            .unwrap_or_default()
                            .as_millis();
                        let spin_idx = (millis / 100) as usize % SPINNER.len();
                        let spinner = SPINNER[spin_idx];

                        // Removed progress bar as requested; keep the braille spinner and seconds
                        (
                            format!(
                                "[{}] {} {} {}s left",
                                id_short, info.file_name, spinner, remaining
                            ),
                            app.theme.warning,
                        )
                    } else {
                        (
                            format!("[{}] {} (Timed out)", id_short, info.file_name),
                            app.theme.error,
                        )
                    }
                }
                TransferState::Requesting { expires_at } => {
                    let now = std::time::Instant::now();
                    if *expires_at > now {
                        let remaining = expires_at.duration_since(now).as_secs();
                        (
                            format!(
                                "[{}] {} (Requesting... {}s)",
                                id_short, info.file_name, remaining
                            ),
                            app.theme.warning,
                        )
                    } else {
                        (
                            format!("[{}] {} (Timed out)", id_short, info.file_name),
                            app.theme.error,
                        )
                    }
                }
            };

            ListItem::new(Line::from(Span::styled(text, Style::default().fg(color))))
        })
        .collect();
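The percent/speed/ETA arithmetic and the `█`/`░` bar in the `Transferring` arm are pure functions of three numbers, so they can be checked without a terminal. A sketch — `transfer_stats` and `progress_bar` are illustrative helpers mirroring the panel code, not functions from the repo:

```rust
/// Compute (percent, MB/s, ETA string) the way the transfer panel does.
fn transfer_stats(bytes_transferred: u64, total_size: u64, elapsed_secs: f64) -> (f64, f64, String) {
    let pct = if total_size > 0 {
        bytes_transferred as f64 / total_size as f64 * 100.0
    } else {
        0.0
    };
    let speed_bps = if elapsed_secs > 0.0 {
        bytes_transferred as f64 / elapsed_secs
    } else {
        0.0
    };
    let eta = if speed_bps > 0.0 && total_size > bytes_transferred {
        format!("{:.0}s", (total_size - bytes_transferred) as f64 / speed_bps)
    } else {
        "--".to_string()
    };
    (pct, speed_bps / (1024.0 * 1024.0), eta)
}

/// 15-cell bar matching the panel's '█'/'░' rendering.
fn progress_bar(pct: f64) -> String {
    let width = 15usize;
    let filled = (pct / 100.0 * width as f64) as usize;
    (0..width).map(|i| if i < filled { '█' } else { '░' }).collect()
}

fn main() {
    // 50 MiB of 100 MiB in 10 s → 50%, 5 MB/s, 10 s remaining
    let (pct, mbps, eta) = transfer_stats(50 * 1024 * 1024, 100 * 1024 * 1024, 10.0);
    assert_eq!(pct, 50.0);
    assert_eq!(mbps, 5.0);
    assert_eq!(eta, "10s");
    assert_eq!(progress_bar(pct).chars().filter(|&c| c == '█').count(), 7);
    println!("{} {:.1}% {:.1} MB/s eta {}", progress_bar(pct), pct, mbps, eta);
}
```

Guarding both divisions (`total_size > 0`, `elapsed_secs > 0.0`) is what keeps the panel from printing `NaN` in the first frame of a transfer.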
@@ -14,22 +14,16 @@ pub fn render(frame: &mut Frame, area: Rect, app: &App) {
        app.theme.system_msg,
        app.file_path_input.as_str(),
    ),
    InputMode::Editing => (
        " ✏ Message (Esc for commands) ",
        app.theme.self_name,
        app.input.as_str(),
    ),
    InputMode::Normal => (
        " Press i/Enter to type, Q=quit, or use /help ",
        app.theme.time,
        app.input.as_str(),
    ),
    InputMode::MicSelect => (
        " 🎤 Selecting microphone... ",
        app.theme.border,
        "",
    ),
    InputMode::SpeakerSelect => (
        " 🔊 Selecting speaker... ",
        app.theme.border,
        "",
    ),
};

let block = Block::default()
@@ -46,19 +40,11 @@ pub fn render(frame: &mut Frame, area: Rect, app: &App) {
    // Set cursor position when in editing/file mode
    match app.input_mode {
        InputMode::Editing => {
            frame.set_cursor_position((area.x + 1 + app.cursor_position as u16, area.y + 1));
        }
        InputMode::FilePrompt => {
            frame.set_cursor_position((area.x + 1 + app.file_path_input.len() as u16, area.y + 1));
        }
        InputMode::Normal => {}
        InputMode::MicSelect => {}
        InputMode::SpeakerSelect => {}
    }
}
327	src/tui/mod.rs
@@ -1,24 +1,22 @@
//! TUI module — terminal user interface using ratatui + crossterm.

pub mod chat_panel;
pub mod file_panel;
pub mod input;
pub mod peer_panel;
pub mod status_bar;

use std::path::PathBuf;

use crossterm::event::{KeyCode, KeyEvent};
use ratatui::layout::{Constraint, Direction, Layout, Rect};
use ratatui::style::{Modifier, Style};
use ratatui::text::{Line, Span};
use ratatui::widgets::{Block, Borders, Clear, List, ListItem};
use ratatui::Frame;

use crate::app_logic::AppCommand;
use crate::chat::ChatState;
use crate::file_transfer::FileTransferManager;
use crate::media::voice::AudioDevice;
use crate::media::MediaState;
use crate::net::PeerInfo;
/// Input mode for the TUI.
@@ -28,33 +26,9 @@ pub enum InputMode {
    Editing,
    #[allow(dead_code)]
    FilePrompt,
    MicSelect,
    SpeakerSelect,
}

/// Commands produced by TUI event handling.
#[derive(Debug)]
pub enum TuiCommand {
    SendMessage(String),
    /// Local-only system message (not broadcast to peers).
    SystemMessage(String),
    SendFile(PathBuf),
    AcceptFile(String), // file_id prefix
    ChangeNick(String),
    Connect(String),
    ToggleVoice,
    ToggleCamera,
    ToggleScreen,
    SelectMic(String),     // node_name of selected mic
    SelectSpeaker(String), // node_name of selected speaker
    SetBitrate(u32),
    Leave,
    Quit,
    None,
}

use crate::config::Theme;
// ... imports ...

/// Application state for the TUI.
pub struct App {
@@ -66,9 +40,6 @@ pub struct App {
    pub show_file_panel: bool,
    pub file_path_input: String,
    pub theme: Theme,
    // Device selection state (reused for Mic and Speaker)
    pub audio_devices: Vec<AudioDevice>,
    pub device_selected_index: usize,
}

impl App {
@@ -80,38 +51,21 @@ impl App {
            scroll_offset: 0,
            show_file_panel: true,
            file_path_input: String::new(),
            theme,
            audio_devices: Vec::new(),
            device_selected_index: 0,
        }
    }

    /// Open the mic selection screen.
    pub fn open_mic_select(&mut self, sources: Vec<AudioDevice>) {
        self.audio_devices = sources;
        self.device_selected_index = 0;
        self.input_mode = InputMode::MicSelect;
    }

    /// Open the speaker selection screen.
    pub fn open_speaker_select(&mut self, sinks: Vec<AudioDevice>) {
        self.audio_devices = sinks;
        self.device_selected_index = 0;
        self.input_mode = InputMode::SpeakerSelect;
    }

    /// Handle a key event and return a command.
    pub fn handle_key(&mut self, key: KeyEvent) -> AppCommand {
        match self.input_mode {
            InputMode::MicSelect => self.handle_device_select_key(key),
            InputMode::SpeakerSelect => self.handle_device_select_key(key),
            InputMode::FilePrompt => self.handle_file_prompt_key(key),
            InputMode::Editing => self.handle_editing_key(key),
            InputMode::Normal => self.handle_normal_key(key),
        }
    }
fn handle_editing_key(&mut self, key: KeyEvent) -> AppCommand {
    match key.code {
        KeyCode::Enter => {
            if !self.input.is_empty() {
@@ -125,255 +79,179 @@ impl App {
                "nick" | "name" => {
                    let new_name = parts.get(1).unwrap_or(&"").trim();
                    if new_name.is_empty() {
                        return AppCommand::SystemMessage(
                            "Usage: /nick <new_name>".to_string(),
                        );
                    }
                    return AppCommand::ChangeNick(new_name.to_string());
                }
                "connect" | "join" => {
                    let peer_id = parts.get(1).unwrap_or(&"").trim();
                    if peer_id.is_empty() {
                        return AppCommand::SystemMessage(
                            "Usage: /connect <peer_id>".to_string(),
                        );
                    }
                    return AppCommand::Connect(peer_id.to_string());
                }
                "voice" => return AppCommand::ToggleVoice,

                // mic/speaker commands removed
                "screen" | "share" => return AppCommand::ToggleScreen,
                "file" | "send" => {
                    let path = parts.get(1).unwrap_or(&"").trim();
                    if path.is_empty() {
                        // Open native file dialog via rfd (cross-platform)
                        if let Some(file) = rfd::FileDialog::new().pick_file() {
                            return AppCommand::SendFile(file);
                        }
                        return AppCommand::None; // cancelled
                    }
                    return AppCommand::SendFile(PathBuf::from(path));
                }
                "accept" | "a" => {
                    let id_prefix = parts.get(1).unwrap_or(&"").trim();
                    if id_prefix.is_empty() {
                        return AppCommand::SystemMessage(
                            "Usage: /accept <file_id_prefix>".to_string(),
                        );
                    }
                    return AppCommand::AcceptFile(id_prefix.to_string());
                }
                "quit" | "q" => return AppCommand::Quit,
                "leave" => return AppCommand::Leave,
                "help" => {
                    return AppCommand::SystemMessage(
                        "Commands: /nick <name>, /connect <id>, /voice, /screen, /file <path>, /accept <prefix>, /leave, /quit".to_string(),
                    );
                }
                "bitrate" => {
                    let kbps_str = parts.get(1).unwrap_or(&"").trim();
                    if let Ok(kbps) = kbps_str.parse::<u32>() {
                        return AppCommand::SetBitrate(kbps * 1000);
                    } else {
                        return AppCommand::SystemMessage(
                            "Usage: /bitrate <kbps> (e.g. 128)".to_string(),
                        );
                    }
                }
                _ => {
                    return AppCommand::SystemMessage(format!(
                        "Unknown command: /{}. Type /help",
                        parts[0]
                    ));
                }
            }
        }

        return AppCommand::SendMessage(text);
    }
    AppCommand::None
}
KeyCode::Char(c) => {
    self.input.insert(self.cursor_position, c);
    self.cursor_position += 1;
    AppCommand::None
}
KeyCode::Backspace => {
    if self.cursor_position > 0 {
        self.cursor_position -= 1;
        self.input.remove(self.cursor_position);
    }
    AppCommand::None
}
KeyCode::Delete => {
    if self.cursor_position < self.input.len() {
        self.input.remove(self.cursor_position);
    }
    AppCommand::None
}
KeyCode::Left => {
    if self.cursor_position > 0 {
        self.cursor_position -= 1;
    }
    AppCommand::None
}
KeyCode::Right => {
    if self.cursor_position < self.input.len() {
        self.cursor_position += 1;
    }
    AppCommand::None
}
KeyCode::Home => {
    self.cursor_position = 0;
    AppCommand::None
}
KeyCode::End => {
    self.cursor_position = self.input.len();
    AppCommand::None
}
KeyCode::Esc => {
    self.input_mode = InputMode::Normal;
    AppCommand::None
}
KeyCode::Up => {
    self.scroll_offset = self.scroll_offset.saturating_add(1);
    AppCommand::None
}
KeyCode::Down => {
    self.scroll_offset = self.scroll_offset.saturating_sub(1);
    AppCommand::None
}
_ => AppCommand::None,
}
}
fn handle_normal_key(&mut self, key: KeyEvent) -> AppCommand {
    match key.code {
        KeyCode::Char('q') | KeyCode::Char('Q') => AppCommand::Quit,
        KeyCode::Char('/') => {
            self.input_mode = InputMode::Editing;
            self.input.push('/');
            self.cursor_position = 1;
            AppCommand::None
        }
        KeyCode::Char('i') | KeyCode::Enter => {
            self.input_mode = InputMode::Editing;
            AppCommand::None
        }
        KeyCode::Up => {
            self.scroll_offset = self.scroll_offset.saturating_add(1);
            AppCommand::None
        }
        KeyCode::Down => {
            self.scroll_offset = self.scroll_offset.saturating_sub(1);
            AppCommand::None
        }
        _ => AppCommand::None,
    }
}
fn handle_file_prompt_key(&mut self, key: KeyEvent) -> AppCommand {
    match key.code {
        KeyCode::Enter => {
            if !self.file_path_input.is_empty() {
                let path = PathBuf::from(self.file_path_input.drain(..).collect::<String>());
                self.input_mode = InputMode::Editing;
                return AppCommand::SendFile(path);
            }
            self.input_mode = InputMode::Editing;
            AppCommand::None
        }
        KeyCode::Char(c) => {
            self.file_path_input.push(c);
            AppCommand::None
        }
        KeyCode::Backspace => {
            self.file_path_input.pop();
            AppCommand::None
        }
        KeyCode::Esc => {
            self.file_path_input.clear();
            self.input_mode = InputMode::Editing;
            AppCommand::None
        }
        _ => AppCommand::None,
    }
}
fn handle_device_select_key(&mut self, key: KeyEvent) -> AppCommand {
    match key.code {
        KeyCode::Up | KeyCode::Char('k') => {
            if self.device_selected_index > 0 {
                self.device_selected_index -= 1;
            }
            AppCommand::None
        }
        KeyCode::Down | KeyCode::Char('j') => {
            if self.device_selected_index + 1 < self.audio_devices.len() {
                self.device_selected_index += 1;
            }
            AppCommand::None
        }
        KeyCode::Enter => {
            if let Some(dev) = self.audio_devices.get(self.device_selected_index) {
                let node_name = dev.node_name.clone();
                let mode = self.input_mode.clone();
                self.input_mode = InputMode::Editing;
                self.audio_devices.clear();

                if mode == InputMode::MicSelect {
                    return AppCommand::SelectMic(node_name);
                } else {
                    return AppCommand::SelectSpeaker(node_name);
                }
            }
            self.input_mode = InputMode::Editing;
            AppCommand::None
        }
        KeyCode::Esc | KeyCode::Char('q') => {
            self.input_mode = InputMode::Editing;
            self.audio_devices.clear();
            AppCommand::None
        }
        _ => AppCommand::None,
    }
}
}
@@ -442,60 +320,33 @@ pub fn render(
connected,
&app.input_mode,
);

// Render device selection overlay if active
if app.input_mode == InputMode::MicSelect || app.input_mode == InputMode::SpeakerSelect {
render_device_overlay(frame, size, app);
}
}

/// Render the device selection overlay (centered popup).
fn render_device_overlay(frame: &mut Frame, area: Rect, app: &App) {
let popup_width = 60u16.min(area.width.saturating_sub(4));
let popup_height = (app.audio_devices.len() as u16 + 4).min(area.height.saturating_sub(4));
/// Deterministic random color for a peer info string.
pub fn get_peer_color(name: &str) -> ratatui::style::Color {
use ratatui::style::Color;
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

let x = (area.width.saturating_sub(popup_width)) / 2;
let y = (area.height.saturating_sub(popup_height)) / 2;
let mut hasher = DefaultHasher::new();
name.hash(&mut hasher);
let hash = hasher.finish();

let popup_area = Rect::new(x, y, popup_width, popup_height);
// Palette of distinguishable colors (excluding dark/black)
let colors = [
Color::Red,
Color::Green,
Color::Yellow,
Color::Blue,
Color::Magenta,
Color::Cyan,
Color::LightRed,
Color::LightGreen,
Color::LightYellow,
Color::LightBlue,
Color::LightMagenta,
Color::LightCyan,
];

// Clear the background
frame.render_widget(Clear, popup_area);

let title = if app.input_mode == InputMode::MicSelect {
" 🎤 Select Microphone (↑↓ Enter Esc) "
} else {
" 🔊 Select Speaker (↑↓ Enter Esc) "
};

let block = Block::default()
.title(title)
.borders(Borders::ALL)
.border_style(Style::default().fg(app.theme.border));

let inner = block.inner(popup_area);
frame.render_widget(block, popup_area);

let items: Vec<ListItem> = app
.audio_devices
.iter()
.enumerate()
.map(|(i, dev)| {
let marker = if i == app.device_selected_index { "▶ " } else { "  " };
let style = if i == app.device_selected_index {
Style::default()
.fg(app.theme.self_name)
.add_modifier(Modifier::BOLD)
} else {
Style::default().fg(app.theme.text)
};
ListItem::new(Line::from(Span::styled(
format!("{}{}", marker, dev.description),
style,
)))
})
.collect();

let list = List::new(items);
frame.render_widget(list, inner);
colors[(hash as usize) % colors.len()]
}
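The `get_peer_color` addition in this diff hashes a peer's name and indexes a fixed color palette, so every peer gets a stable, distinguishable color without any coordination. A minimal standalone sketch of the same hash-to-palette technique, using the standard library only (the string palette here stands in for the ratatui `Color` array, which is an assumption for illustration):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in palette; the real code indexes ratatui Color variants instead.
const PALETTE: [&str; 6] = ["red", "green", "yellow", "blue", "magenta", "cyan"];

/// Map a peer name to a palette entry. The same name always hashes to the
/// same slot, so the color is deterministic across renders.
fn peer_color(name: &str) -> &'static str {
    let mut hasher = DefaultHasher::new();
    name.hash(&mut hasher);
    PALETTE[(hasher.finish() as usize) % PALETTE.len()]
}
```

Note that `DefaultHasher` is only stable within one process run, which is fine here since the color just needs to be consistent for the lifetime of the TUI.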
@@ -14,7 +14,7 @@ pub fn render(frame: &mut Frame, area: Rect, peers: &[PeerInfo], app: &App) {
let block = Block::default()
.title(format!(" 👥 Peers ({}) ", peers.len()))
.borders(Borders::ALL)
.border_style(Style::default().fg(app.theme.border));
.border_style(Style::default().fg(app.theme.peer_border));

let inner = block.inner(area);
frame.render_widget(block, area);
@@ -29,10 +29,7 @@ pub fn render(frame: &mut Frame, area: Rect, peers: &[PeerInfo], app: &App) {
let items: Vec<ListItem> = peers
.iter()
.map(|peer| {
let name = peer
.name
.as_deref()
.unwrap_or("unknown");
let name = peer.name.as_deref().unwrap_or("unknown");
let id_short = format!("{}", peer.id).chars().take(8).collect::<String>();

let caps = peer
@@ -56,7 +53,7 @@ pub fn render(frame: &mut Frame, area: Rect, peers: &[PeerInfo], app: &App) {
let (marker, marker_color, name_suffix) = if peer.is_self {
("★ ", app.theme.self_name, " (you)")
} else {
("● ", app.theme.peer_name, "")
("● ", crate::tui::get_peer_color(name), "")
};

let line = Line::from(vec![
@@ -6,8 +6,8 @@ use ratatui::text::{Line, Span};
use ratatui::widgets::Paragraph;
use ratatui::Frame;

use crate::media::MediaState;
use super::InputMode;
use crate::media::MediaState;

pub fn render(
frame: &mut Frame,
@@ -25,11 +25,18 @@ pub fn render(
};

let mode_span = match input_mode {
InputMode::Normal => Span::styled(" NORMAL ", Style::default().fg(Color::Black).bg(Color::Blue)),
InputMode::Editing => Span::styled(" INSERT ", Style::default().fg(Color::Black).bg(Color::Green)),
InputMode::FilePrompt => Span::styled(" FILE ", Style::default().fg(Color::Black).bg(Color::Yellow)),
InputMode::MicSelect => Span::styled(" MIC ", Style::default().fg(Color::Black).bg(Color::Magenta)),
InputMode::SpeakerSelect => Span::styled(" SPKR ", Style::default().fg(Color::Black).bg(Color::Cyan)),
InputMode::Normal => Span::styled(
" NORMAL ",
Style::default().fg(Color::Black).bg(Color::Blue),
),
InputMode::Editing => Span::styled(
" INSERT ",
Style::default().fg(Color::Black).bg(Color::Green),
),
InputMode::FilePrompt => Span::styled(
" FILE ",
Style::default().fg(Color::Black).bg(Color::Yellow),
),
};

let line = Line::from(vec![
@@ -38,17 +45,16 @@ pub fn render(
Span::raw(" "),
conn_status,
Span::styled(
format!(" │ {} ({})", our_name, our_id_short),
format!(" I {} ({})", our_name, our_id_short),
Style::default().fg(Color::Cyan),
),
Span::styled(
format!(" │ {}", media.status_line()),
format!(" I {}", media.status_line()),
Style::default().fg(Color::DarkGray),
),
]);

let paragraph = Paragraph::new(line)
.style(Style::default().bg(Color::Rgb(30, 30, 40)));
let paragraph = Paragraph::new(line).style(Style::default().bg(Color::Rgb(30, 30, 40)));

frame.render_widget(paragraph, area);
}
src/web/mod.rs
@@ -1,10 +1,10 @@
use axum::{
body::Bytes,
extract::{
ws::{Message, WebSocket, WebSocketUpgrade},
State,
},
http::{header, StatusCode, Uri},
body::Bytes,
response::IntoResponse,
routing::get,
Router,
@@ -14,14 +14,13 @@ use std::net::SocketAddr;

use crate::media::WebMediaEvent;
use crate::protocol::MediaKind;
use tokio::sync::broadcast;
use futures::{SinkExt, StreamExt};
use tokio::sync::broadcast;

#[derive(Clone)]
struct AppState {
tx: broadcast::Sender<WebMediaEvent>,
mic_tx: broadcast::Sender<Vec<f32>>,
cam_tx: broadcast::Sender<Vec<u8>>,
screen_tx: broadcast::Sender<Vec<u8>>,
}

@@ -32,12 +31,16 @@ struct Assets;
pub async fn start_web_server(
tx: broadcast::Sender<WebMediaEvent>,
mic_tx: broadcast::Sender<Vec<f32>>,
cam_tx: broadcast::Sender<Vec<u8>>,
screen_tx: broadcast::Sender<Vec<u8>>,
) {
let state = AppState { tx, mic_tx, cam_tx, screen_tx };
let state = AppState {
tx,
mic_tx,
screen_tx,
};
let app = Router::new()
.route("/ws", get(ws_handler))
.route("/ws/audio", get(ws_audio_handler))
.route("/ws/screen", get(ws_screen_handler))
.fallback(static_handler)
.with_state(state);

@@ -61,93 +64,109 @@ pub async fn start_web_server(
axum::serve(listener, app).await.unwrap();
}

async fn ws_handler(
// --- AUDIO ---
async fn ws_audio_handler(
ws: WebSocketUpgrade,
State(state): State<AppState>,
) -> impl IntoResponse {
ws.on_upgrade(move |socket| handle_socket(socket, state))
ws.on_upgrade(move |socket| handle_audio_socket(socket, state))
}

async fn handle_socket(socket: WebSocket, state: AppState) {
async fn handle_audio_socket(socket: WebSocket, state: AppState) {
let (mut sender, mut receiver) = socket.split();
let mut rx = state.tx.subscribe();

// Outgoing (Server -> Browser)
tokio::spawn(async move {
while let Ok(event) = rx.recv().await {
let msg = match event {
WebMediaEvent::Audio { peer_id, data: samples } => {
// 1 byte header (0) + 1 byte ID len + ID bytes + f32 bytes
let id_bytes = peer_id.as_bytes();
let id_len = id_bytes.len() as u8;
let mut payload = Vec::with_capacity(1 + 1 + id_bytes.len() + samples.len() * 4);
payload.push(0u8);
payload.push(id_len);
payload.extend_from_slice(id_bytes);
for s in samples {
payload.extend_from_slice(&s.to_ne_bytes());
}
Message::Binary(Bytes::from(payload))
if let WebMediaEvent::Audio {
peer_id,
data: samples,
} = event
{
// Protocol: [IDLen] [ID] [f32...]
let id_bytes = peer_id.as_bytes();
let id_len = id_bytes.len() as u8;
let mut payload = Vec::with_capacity(1 + id_bytes.len() + samples.len() * 4);
payload.push(id_len);
payload.extend_from_slice(id_bytes);
for s in samples {
payload.extend_from_slice(&s.to_ne_bytes());
}
WebMediaEvent::Video { peer_id, kind, data } => {
// 1 byte header (1=Camera, 2=Screen) + 1 byte ID len + ID bytes + MJPEG data
let header = match kind {
MediaKind::Camera => 1u8,
MediaKind::Screen => 2u8,
_ => 1u8,
};
let id_bytes = peer_id.as_bytes();
let id_len = id_bytes.len() as u8;

let mut payload = Vec::with_capacity(1 + 1 + id_bytes.len() + data.len());
payload.push(header);
payload.push(id_len);
payload.extend_from_slice(id_bytes);
payload.extend_from_slice(&data);
Message::Binary(Bytes::from(payload))
if sender
.send(Message::Binary(Bytes::from(payload)))
.await
.is_err()
{
break;
}
};

if sender.send(msg).await.is_err() {
break;
}
}
});

// Incoming (Browser -> Server)
while let Some(msg) = receiver.next().await {
match msg {
Ok(Message::Binary(data)) => {
if data.is_empty() { continue; }
let header = data[0];
let payload = &data[1..];
if let Ok(Message::Binary(data)) = msg {
// Protocol: [f32...]
// (We dropped the header byte 3)
if data.len() % 4 == 0 {
let samples: Vec<f32> = data
.chunks_exact(4)
.map(|b| f32::from_ne_bytes([b[0], b[1], b[2], b[3]]))
.collect();
let _ = state.mic_tx.send(samples);
}
}
}
}

match header {
3 => { // Mic (f32 PCM)
// integrity check
if payload.len() % 4 == 0 {
let samples: Vec<f32> = payload
.chunks_exact(4)
.map(|b| f32::from_ne_bytes([b[0], b[1], b[2], b[3]]))
.collect();
// tracing::debug!("Received mic samples: {}", samples.len());
let _ = state.mic_tx.send(samples);
}
}
4 => { // Camera (MJPEG)
tracing::debug!("Received camera frame: {} bytes", payload.len());
let _ = state.cam_tx.send(payload.to_vec());
}
5 => { // Screen (MJPEG)
tracing::debug!("Received screen frame: {} bytes", payload.len());
let _ = state.screen_tx.send(payload.to_vec());
}
_ => {
tracing::warn!("Unknown WS header: {}", header);
// --- SCREEN ---
async fn ws_screen_handler(
ws: WebSocketUpgrade,
State(state): State<AppState>,
) -> impl IntoResponse {
ws.on_upgrade(move |socket| handle_screen_socket(socket, state))
}

async fn handle_screen_socket(socket: WebSocket, state: AppState) {
let (mut sender, mut receiver) = socket.split();
let mut rx = state.tx.subscribe();

// Outgoing (Server -> Browser)
tokio::spawn(async move {
while let Ok(event) = rx.recv().await {
if let WebMediaEvent::Video {
peer_id,
kind,
data,
} = event
{
if matches!(kind, MediaKind::Screen) {
let id_bytes = peer_id.as_bytes();
let id_len = id_bytes.len() as u8;
let mut payload = Vec::with_capacity(1 + id_bytes.len() + data.len());
payload.push(id_len);
payload.extend_from_slice(id_bytes);
payload.extend_from_slice(&data);

if sender
.send(Message::Binary(Bytes::from(payload)))
.await
.is_err()
{
break;
}
}
}
Ok(Message::Close(_)) => break,
Err(_) => break,
_ => {}
}
});

// Incoming (Browser -> Server)
while let Some(msg) = receiver.next().await {
if let Ok(Message::Binary(data)) = msg {
// Protocol: [FrameType] [Data...]
tracing::debug!("Received screen frame: {} bytes", data.len());
let _ = state.screen_tx.send(data.to_vec());
}
}
}
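The new audio socket in this diff frames every outgoing message as `[IDLen] [ID] [f32...]`: a one-byte peer-ID length, the UTF-8 peer ID, then native-endian `f32` samples. A hedged sketch of an encoder and a matching decoder for that layout, standard library only (the function names are illustrative, not part of the codebase):

```rust
/// Encode one audio frame: [id_len: u8][id bytes][f32 samples, native-endian].
fn encode_audio_frame(peer_id: &str, samples: &[f32]) -> Vec<u8> {
    let id_bytes = peer_id.as_bytes();
    let mut payload = Vec::with_capacity(1 + id_bytes.len() + samples.len() * 4);
    payload.push(id_bytes.len() as u8);
    payload.extend_from_slice(id_bytes);
    for s in samples {
        payload.extend_from_slice(&s.to_ne_bytes());
    }
    payload
}

/// Decode the same frame back into (peer_id, samples).
/// Returns None on a truncated buffer or a sample region not divisible by 4.
fn decode_audio_frame(buf: &[u8]) -> Option<(String, Vec<f32>)> {
    let id_len = *buf.first()? as usize;
    if buf.len() < 1 + id_len {
        return None;
    }
    let id = String::from_utf8(buf[1..1 + id_len].to_vec()).ok()?;
    let rest = &buf[1 + id_len..];
    if rest.len() % 4 != 0 {
        return None;
    }
    let samples = rest
        .chunks_exact(4)
        .map(|b| f32::from_ne_bytes([b[0], b[1], b[2], b[3]]))
        .collect();
    Some((id, samples))
}
```

Native-endian byte order matches the diff's `to_ne_bytes`/`from_ne_bytes`; it works here because the browser client on the same machine uses the host's endianness, but a cross-platform wire format would normally pin little- or big-endian explicitly.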
@@ -6,7 +6,13 @@ use tokio::time::{timeout, Duration};

const TEST_TIMEOUT: Duration = Duration::from_secs(10);

async fn spawn_node(topic: [u8; 32]) -> Result<(NetworkManager, tokio::sync::mpsc::Sender<NetEvent>, tokio::sync::mpsc::Receiver<NetEvent>)> {
async fn spawn_node(
topic: [u8; 32],
) -> Result<(
NetworkManager,
tokio::sync::mpsc::Sender<NetEvent>,
tokio::sync::mpsc::Receiver<NetEvent>,
)> {
NetworkManager::new(topic).await
}

@@ -27,7 +33,7 @@ async fn test_p2p_connection_and_gossip() -> Result<()> {
// 1. Join Gossip
// Node A starts alone
node_a.join_gossip(vec![], tx_a.clone()).await?;

// Node B joins, bootstrapping from Node A.
// We force a connection first so B knows A's address.
node_b.endpoint.connect(addr_a, iroh_gossip::ALPN).await?;
@@ -38,20 +44,26 @@ async fn test_p2p_connection_and_gossip() -> Result<()> {
let _peer_up_a = timeout(TEST_TIMEOUT, async {
while let Some(event) = rx_a.recv().await {
if let NetEvent::PeerUp(peer_id) = event {
if peer_id == id_b { return Ok(()); }
if peer_id == id_b {
return Ok(());
}
}
}
anyhow::bail!("Stream ended without PeerUp");
}).await??;
})
.await??;

let _peer_up_b = timeout(TEST_TIMEOUT, async {
while let Some(event) = rx_b.recv().await {
if let NetEvent::PeerUp(peer_id) = event {
if peer_id == id_a { return Ok(()); }
if peer_id == id_a {
return Ok(());
}
}
}
anyhow::bail!("Stream ended without PeerUp");
}).await??;
})
.await??;

// 3. Test Gossip Broadcast (Chat Message)
let chat_msg = GossipMessage::Chat(ChatMessage {
@@ -75,7 +87,8 @@ async fn test_p2p_connection_and_gossip() -> Result<()> {
}
}
anyhow::bail!("Stream ended without GossipReceived");
}).await??;
})
.await??;

Ok(())
}
@@ -96,12 +109,12 @@ async fn test_direct_file_transfer_stream() -> Result<()> {
node_b.join_gossip(vec![id_a], tx_b).await?;

// 1. Open direct stream from A to B
// node_b needs to accept the stream. In the actual app, this is handled by spawn_acceptor loop
// node_b needs to accept the stream. In the actual app, this is handled by spawn_acceptor loop
// sending NetEvent::IncomingFileStream.

// A opens stream
let (mut send_stream, mut _recv_stream) = node_a.open_file_stream(id_b).await?;

let (mut send_stream, mut _recv_stream) = node_a.open_file_stream(id_b).await?;

// Send some data
let test_data = b"Hello direct stream";
send_stream.write_all(test_data).await?;
@@ -115,7 +128,8 @@ async fn test_direct_file_transfer_stream() -> Result<()> {
}
}
anyhow::bail!("Stream ended without IncomingFileStream");
}).await??;
})
.await??;

assert_eq!(from, id_a);

@@ -156,25 +170,35 @@ async fn test_media_stream_separation() -> Result<()> {
// Receiver should get IncomingMediaStream with correct kind
timeout(TEST_TIMEOUT, async {
while let Some(event) = rx_receiver.recv().await {
if let NetEvent::IncomingMediaStream { from: _, kind: k, .. } = event {
if let NetEvent::IncomingMediaStream {
from: _, kind: k, ..
} = event
{
assert_eq!(k, kind);
return Ok(());
}
}
anyhow::bail!("Stream ended without IncomingMediaStream for {:?}", kind);
}).await??;
})
.await??;

Ok(())
}

// 1. Verify Voice Stream
verify_stream(&node_a, id_b, &mut rx_b, MediaKind::Voice).await.context("Voice stream failed")?;
verify_stream(&node_a, id_b, &mut rx_b, MediaKind::Voice)
.await
.context("Voice stream failed")?;

// 2. Verify Camera Stream
verify_stream(&node_a, id_b, &mut rx_b, MediaKind::Camera).await.context("Camera stream failed")?;
verify_stream(&node_a, id_b, &mut rx_b, MediaKind::Camera)
.await
.context("Camera stream failed")?;

// 3. Verify Screen Stream
verify_stream(&node_a, id_b, &mut rx_b, MediaKind::Screen).await.context("Screen stream failed")?;
verify_stream(&node_a, id_b, &mut rx_b, MediaKind::Screen)
.await
.context("Screen stream failed")?;

Ok(())
}

@@ -189,8 +189,8 @@ mod config_tests {
#[test]
fn default_config_values() {
let config = AppConfig::default();
assert_eq!(config.media.screen_resolution, "1280x720");
assert!(config.media.mic_name.is_none());
// assert_eq!(config.media.screen_resolution, "1280x720");
// assert!(config.media.mic_name.is_none());
assert!(config.network.topic.is_none());
}

@@ -199,8 +199,11 @@ mod config_tests {
let config = AppConfig::default();
let toml_str = toml::to_string_pretty(&config).unwrap();
let parsed: AppConfig = toml::from_str(&toml_str).unwrap();
assert_eq!(parsed.media.screen_resolution, config.media.screen_resolution);
assert_eq!(parsed.media.mic_name, config.media.mic_name);
// assert_eq!(
//     parsed.media.screen_resolution,
//     config.media.screen_resolution
// );
// assert_eq!(parsed.media.mic_name, config.media.mic_name);
assert_eq!(parsed.network.topic, config.network.topic);
}

@@ -211,7 +214,7 @@ mod config_tests {
screen_resolution = "1920x1080"
"#;
let config: AppConfig = toml::from_str(toml_str).unwrap();
assert_eq!(config.media.screen_resolution, "1920x1080");
// assert_eq!(config.media.screen_resolution, "1920x1080");
// network should use default
assert!(config.network.topic.is_none());
}
@@ -265,10 +268,9 @@ screen_resolution = "1920x1080"
fn theme_from_ui_config() {
let ui = UiConfig::default();
let theme: Theme = ui.into();
assert_eq!(theme.border, Color::Cyan);
assert_eq!(theme.chat_border, Color::Cyan);
assert_eq!(theme.text, Color::White);
assert_eq!(theme.self_name, Color::Green);
assert_eq!(theme.peer_name, Color::Magenta);
assert_eq!(theme.system_msg, Color::Yellow);
assert_eq!(theme.time, Color::DarkGray);
}
@@ -276,10 +278,8 @@ screen_resolution = "1920x1080"
#[test]
fn ui_config_defaults() {
let ui = UiConfig::default();
assert_eq!(ui.border, "cyan");
assert_eq!(ui.text, "white");
assert_eq!(ui.self_name, "green");
assert_eq!(ui.peer_name, "magenta");
assert_eq!(ui.system_msg, "yellow");
assert_eq!(ui.time, "dark_gray");
}
@@ -290,8 +290,9 @@ screen_resolution = "1920x1080"
// ============================================================================
mod tui_tests {
use crossterm::event::{KeyCode, KeyEvent, KeyModifiers};
use p2p_chat::app_logic::AppCommand;
use p2p_chat::config::{Theme, UiConfig};
use p2p_chat::tui::{App, InputMode, TuiCommand};
use p2p_chat::tui::{App, InputMode};

fn make_app() -> App {
let theme: Theme = UiConfig::default().into();
@@ -320,7 +321,7 @@ mod tui_tests {
let mut app = make_app();
assert_eq!(app.input_mode, InputMode::Editing);
let cmd = app.handle_key(key(KeyCode::Esc));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
assert_eq!(app.input_mode, InputMode::Normal);
}

@@ -329,7 +330,7 @@ mod tui_tests {
let mut app = make_app();
app.input_mode = InputMode::Normal;
let cmd = app.handle_key(key(KeyCode::Char('i')));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
assert_eq!(app.input_mode, InputMode::Editing);
}

@@ -338,7 +339,7 @@ mod tui_tests {
let mut app = make_app();
app.input_mode = InputMode::Normal;
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
assert_eq!(app.input_mode, InputMode::Editing);
}

@@ -347,7 +348,7 @@ mod tui_tests {
let mut app = make_app();
app.input_mode = InputMode::Normal;
let cmd = app.handle_key(key(KeyCode::Char('/')));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
assert_eq!(app.input_mode, InputMode::Editing);
assert_eq!(app.input, "/");
assert_eq!(app.cursor_position, 1);
@@ -358,7 +359,7 @@ mod tui_tests {
let mut app = make_app();
app.input_mode = InputMode::Normal;
let cmd = app.handle_key(key(KeyCode::Char('q')));
assert!(matches!(cmd, TuiCommand::Quit));
assert!(matches!(cmd, AppCommand::Quit));
}

#[test]
@@ -388,7 +389,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char('h')));
app.handle_key(key(KeyCode::Char('i')));
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SendMessage(ref s) if s == "hi"));
assert!(matches!(cmd, AppCommand::SendMessage(ref s) if s == "hi"));
assert_eq!(app.input, "");
assert_eq!(app.cursor_position, 0);
}
@@ -397,7 +398,7 @@ mod tui_tests {
fn enter_on_empty_input_does_nothing() {
let mut app = make_app();
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::None));
assert!(matches!(cmd, AppCommand::None));
}

#[test]
@@ -407,7 +408,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::Quit));
assert!(matches!(cmd, AppCommand::Quit));
}

#[test]
@@ -417,7 +418,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SystemMessage(_)));
assert!(matches!(cmd, AppCommand::SystemMessage(_)));
}

#[test]
@@ -427,7 +428,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SystemMessage(_)));
assert!(matches!(cmd, AppCommand::SystemMessage(_)));
}

#[test]
@@ -437,7 +438,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::ChangeNick(ref s) if s == "alice"));
assert!(matches!(cmd, AppCommand::ChangeNick(ref s) if s == "alice"));
}

#[test]
@@ -447,7 +448,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SystemMessage(_)));
assert!(matches!(cmd, AppCommand::SystemMessage(_)));
}

#[test]
@@ -457,7 +458,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::Connect(ref s) if s == "abc123"));
assert!(matches!(cmd, AppCommand::Connect(ref s) if s == "abc123"));
}

#[test]
@@ -467,17 +468,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::ToggleVoice));
}

#[test]
fn camera_command() {
let mut app = make_app();
for c in "/camera".chars() {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::ToggleCamera));
assert!(matches!(cmd, AppCommand::ToggleVoice));
}

#[test]
@@ -487,7 +478,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::ToggleScreen));
assert!(matches!(cmd, AppCommand::ToggleScreen));
}

#[test]
@@ -497,7 +488,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::Leave));
assert!(matches!(cmd, AppCommand::Leave));
}

#[test]
@@ -507,7 +498,7 @@ mod tui_tests {
app.handle_key(key(KeyCode::Char(c)));
}
let cmd = app.handle_key(key(KeyCode::Enter));
assert!(matches!(cmd, TuiCommand::SystemMessage(_)));
assert!(matches!(cmd, AppCommand::SystemMessage(_)));
}

#[test]
@@ -518,7 +509,7 @@ mod tui_tests {
}
let cmd = app.handle_key(key(KeyCode::Enter));
match cmd {
TuiCommand::SendFile(path) => {
AppCommand::SendFile(path) => {
assert_eq!(path.to_str().unwrap(), "/tmp/test.txt");
}
_ => panic!("Expected SendFile, got {:?}", cmd),
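The integration tests in this diff repeatedly wrap an event wait in `timeout(TEST_TIMEOUT, ...)` and drain the channel until the expected `NetEvent` arrives, bailing if the stream ends first. The same wait-until-event-or-deadline shape can be sketched synchronously with std channels (a simplification of the tokio version: `recv_timeout` applies the deadline per receive rather than to the whole loop, and the event type here is a stand-in):

```rust
use std::sync::mpsc;
use std::time::Duration;

// Stand-in for the test suite's NetEvent.
#[derive(Debug, PartialEq)]
enum NetEvent {
    PeerUp(u32),
    Other,
}

/// Drain events until the expected PeerUp arrives; give up when the
/// channel closes or a receive exceeds the deadline.
fn wait_for_peer_up(rx: &mpsc::Receiver<NetEvent>, want: u32, deadline: Duration) -> bool {
    while let Ok(event) = rx.recv_timeout(deadline) {
        if event == NetEvent::PeerUp(want) {
            return true;
        }
        // Any other event is skipped, just like the tests' while-let loops.
    }
    false
}
```

The double `??` in the tests unwraps the same two failure modes this sketch collapses into `false`: the outer `timeout` elapsing and the inner loop bailing.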
web/app.js
@@ -15,12 +15,18 @@ let micScriptProcessor = null;
let audioCtx = null;
const SAMPLE_RATE = 48000;

// Video Encoding State
let videoEncoder = null;
let screenEncoder = null;
let screenCanvasLoop = null; // Added
let frameCounter = 0;

// --- Remote Peer State ---
// Map<peerId, {
//   id: string,
//   nextStartTime: number,
//   cam: { card: HTMLElement, img: HTMLElement, status: HTMLElement } | null,
//   screen: { card: HTMLElement, img: HTMLElement, status: HTMLElement } | null,
//   cam: { card: HTMLElement, canvas: HTMLCanvasElement, decoder: VideoDecoder, status: HTMLElement } | null,
//   screen: { card: HTMLElement, canvas: HTMLCanvasElement, decoder: VideoDecoder, status: HTMLElement } | null,
// }>
const peers = new Map();

@@ -69,7 +75,7 @@ ws.onmessage = (event) => {

// Extract ID
const idBytes = new Uint8Array(data, 2, idLen);
const peerId = new TextDecoder().decode(idBytes);
let peerId = new TextDecoder().decode(idBytes);

// Extract Payload
const payload = data.slice(2 + idLen);
@@ -84,6 +90,7 @@ ws.onmessage = (event) => {
screen: null
};
peers.set(peerId, peer);
handlePeerConnected(peer); // Call new handler for peer connection
}

if (header === 0) { // Audio
@@ -103,11 +110,11 @@ function getOrCreateCard(peer, type) {
card.className = 'peer-card';
card.id = `peer-${peer.id}-${type}`;

// Video/Image element
const img = document.createElement('img');
img.className = 'peer-video';
img.alt = `${type} from ${peer.id}`;
card.appendChild(img);
// Video canvas element
const canvas = document.createElement('canvas');
canvas.className = 'peer-video';
// canvas.alt = `${type} from ${peer.id}`;
card.appendChild(canvas);

// Overlay info
const info = document.createElement('div');
@@ -123,9 +130,41 @@ function getOrCreateCard(peer, type) {

videoGrid.appendChild(card);

// Initialize VideoDecoder
const decoder = new VideoDecoder({
output: (frame) => {
// Draw frame to canvas
console.debug(`[Decoder] Frame decoded: ${frame.displayWidth}x${frame.displayHeight}`);
canvas.width = frame.displayWidth;
canvas.height = frame.displayHeight;
const ctx = canvas.getContext('2d');
ctx.drawImage(frame, 0, 0);
frame.close();

updatePeerActivity(cardObj, false);
},
error: (e) => {
console.error(`[Decoder] Error (${type}):`, e);
statusOverlay.style.display = 'flex';
let statusText = `Decoding H.264 from ${peer.id}...`;
statusOverlay.querySelector('h2').textContent = `${statusText} Video Decoder Error: ${e.message}`;
}
});

console.log(`[Decoder] Configuring H.264 decoder for ${peer.id} (${type})`);
try {
decoder.configure({
codec: 'avc1.42E01E', // H.264 Constrained Baseline
optimizeForLatency: true
});
} catch (err) {
console.error(`[Decoder] Configuration failed:`, err);
}

const cardObj = {
card: card,
imgElement: img,
canvas: canvas,
decoder: decoder,
statusElement: info.querySelector('.peer-status'),
activityTimeout: null
};
@@ -166,18 +205,27 @@ function handleRemoteAudio(peer, arrayBuffer) {
function handleRemoteVideo(peer, arrayBuffer, type) {
const cardObj = getOrCreateCard(peer, type);

const blob = new Blob([arrayBuffer], { type: 'image/jpeg' });
const url = URL.createObjectURL(blob);

const prevUrl = cardObj.imgElement.src;
cardObj.imgElement.onload = () => {
if (prevUrl && prevUrl.startsWith('blob:')) {
URL.revokeObjectURL(prevUrl);
// Payload format: [1 byte frame type] [N bytes encoded chunk]
// Frame Type: 0 = Key, 1 = Delta
const view = new DataView(arrayBuffer);
const isKey = view.getUint8(0) === 0;
const chunkData = arrayBuffer.slice(1);

const chunk = new EncodedVideoChunk({
type: isKey ? 'key' : 'delta',
timestamp: performance.now() * 1000, // Use local time for now, or derive from seq
data: chunkData
});

try {
if (cardObj.decoder.state === 'configured') {
cardObj.decoder.decode(chunk);
} else {
console.warn(`[Decoder] Not configured yet, dropping chunk (Key: ${isKey})`);
}
};
cardObj.imgElement.src = url;

updatePeerActivity(cardObj, false);
} catch (e) {
console.error("[Decoder] Decode exception:", e);
}
}
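The reworked `handleRemoteVideo` reads a one-byte frame-type tag (0 = key, 1 = delta) ahead of the encoded H.264 chunk. A small Rust sketch of a parser for that payload layout (the enum and function name are illustrative, not part of the codebase):

```rust
/// Wire tag from the diff: 0 = key frame, 1 = delta frame.
#[derive(Debug, PartialEq)]
enum FrameType {
    Key,
    Delta,
}

/// Split a video payload into (frame type, encoded chunk bytes).
/// Returns None on an empty buffer or an unknown tag.
fn parse_video_payload(buf: &[u8]) -> Option<(FrameType, &[u8])> {
    let (&tag, chunk) = buf.split_first()?;
    let ty = match tag {
        0 => FrameType::Key,
        1 => FrameType::Delta,
        _ => return None, // unknown frame type
    };
    Some((ty, chunk))
}
```

Keeping the tag out of band like this lets the receiver label the `EncodedVideoChunk` as `'key'` or `'delta'` without inspecting the H.264 bitstream itself.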
function updatePeerActivity(cardObj, isAudio) {
|
||||
@@ -191,6 +239,22 @@ function updatePeerActivity(cardObj, isAudio) {
|
||||
}
|
||||
}
|
||||
|
||||
function handlePeerConnected(peer) {
|
||||
// Peer connected (or local preview)
|
||||
console.log(`[App] Peer connected: ${peer.id}`);
|
||||
|
||||
// Create card if not exists
|
||||
let cardObj = getOrCreateCard(peer, 'cam'); // Assume cam card for general peer info
|
||||
|
||||
// If it's local preview, update the status
|
||||
if (peer.id === 'local') {
|
||||
const statusDot = cardObj.card.querySelector('.peer-status');
|
||||
if (statusDot) statusDot.style.backgroundColor = '#3b82f6';
|
||||
const nameLabel = cardObj.card.querySelector('.peer-name'); // Updated selector
|
||||
if (nameLabel) nameLabel.textContent = "Local Preview (H.264)";
|
||||
}
|
||||
}
|
||||
|
||||
// --- Local Capture Controls ---
|
||||
|
||||
function updateButton(btn, active, iconOn, iconOff) {
|
||||
@@ -294,12 +358,39 @@ function stopMic() {

async function startCam() {
    try {
        camStream = await navigator.mediaDevices.getUserMedia({ video: { width: 640, height: 480 } });
        localVideo.srcObject = camStream;
        startVideoSender(camStream, 4); // 4 = Camera
        // Old flow (browser encoding):
        //   1. TUI /cam -> toggle_camera -> VideoCapture::start_web spawns a task reading from a channel.
        //   2. Web UI captures video and sends frames over the WebSocket; the WS handler feeds the channel.
        // New flow (backend encoding):
        //   1. TUI /cam -> toggle_camera -> spawns ffmpeg natively and publishes to a broadcast channel.
        //   2. Web UI only receives the broadcast and renders.
        // The web server has no endpoint for starting backend capture, so this
        // button can no longer trigger it. Until such an endpoint exists, point
        // the user at the TUI command instead of doing a misleading browser capture.
        alert("Please use /cam in the terminal to start the camera (Backend Encoding).");

    } catch (err) {
        console.error('Error starting camera:', err);
        alert('Camera access failed');
        alert('Failed to start camera');
        updateButton(toggleCamBtn, false, 'videocam', 'videocam_off');
    }
}
@@ -308,62 +399,136 @@ function stopCam() {
        camStream.getTracks().forEach(t => t.stop());
        camStream = null;
    }
    if (videoEncoder) {
        // We don't close the encoder, just stop feeding it?
        // Or re-create it? Let's keep it but stop the reader loop, which is tied to the track.
        // Actually, send a config reset next time?
    }
}

//let screenStream = null;

// Helper to read frames from the stream
async function readLoop(reader, encoder) {
    while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        if (encoder.state === "configured") {
            encoder.encode(value);
            value.close();
        } else {
            value.close();
        }
    }
}
async function startScreen() {
    try {
        screenStream = await navigator.mediaDevices.getDisplayMedia({ video: true });
        localVideo.srcObject = screenStream;
        // Hybrid Mode: Browser Capture + Backend Relay
        screenStream = await navigator.mediaDevices.getDisplayMedia({
            video: {
                cursor: "always"
            },
            audio: false
        });

        const track = screenStream.getVideoTracks()[0];
        const { width, height } = track.getSettings();

        startVideoSender(screenStream, 5); // 5 = Screen
        screenStream.getVideoTracks()[0].onended = () => {
            stopScreen();
            updateButton(toggleScreenBtn, false, 'screen_share', 'screen_share');
        };
        // 1. Setup Local Preview (Draw to Canvas)
        const localCardObj = getOrCreateCard({ id: 'local' }, 'screen');
        const canvas = localCardObj.canvas;
        const ctx = canvas.getContext('2d');

        // Create a temp video element to play the stream for drawing
        const tempVideo = document.createElement('video');
        tempVideo.autoplay = true;
        tempVideo.srcObject = screenStream;
        tempVideo.muted = true;
        await tempVideo.play();

        // Canvas drawing loop
        function drawLoop() {
            if (tempVideo.paused || tempVideo.ended) return;
            if (canvas.width !== tempVideo.videoWidth || canvas.height !== tempVideo.videoHeight) {
                canvas.width = tempVideo.videoWidth;
                canvas.height = tempVideo.videoHeight;
            }
            ctx.drawImage(tempVideo, 0, 0);
            screenCanvasLoop = requestAnimationFrame(drawLoop);
        }
        drawLoop();

        // 2. Encode and Send to Backend (for Peers)

        // Configure H.264 encoder
        screenEncoder = new VideoEncoder({
            output: (chunk, metadata) => {
                const buffer = new Uint8Array(chunk.byteLength);
                chunk.copyTo(buffer);

                // Construct header: [5 (Screen)] [FrameType] [Data]
                // EncodedVideoChunk.type is 'key' or 'delta'; map key -> 0, delta -> 1.
                const isKey = chunk.type === 'key';
                const frameType = isKey ? 0 : 1;

                const payload = new Uint8Array(1 + 1 + buffer.length);
                payload[0] = 5; // Screen Header
                payload[1] = frameType;
                payload.set(buffer, 2);

                if (ws && ws.readyState === WebSocket.OPEN) {
                    ws.send(payload);
                }
            },
            error: (e) => console.error("Screen Encoder Error:", e)
        });

        screenEncoder.configure({
            codec: 'avc1.42E01E', // H.264 Baseline
            width: width,
            height: height,
            bitrate: 3_000_000, // 3 Mbps
            framerate: 30
        });

        // Reader
        const processor = new MediaStreamTrackProcessor({ track });
        const reader = processor.readable.getReader();

        readLoop(reader, screenEncoder);

        updateButton(toggleScreenBtn, true, 'stop_screen_share', 'stop_screen_share');

        // Clean up on stop
        track.onended = () => stopScreen();

    } catch (err) {
        console.error('Error starting screen:', err);
        alert(`Failed to start screen share: ${err.message}. \n(Make sure to run /screen in terminal first!)`);
        updateButton(toggleScreenBtn, false, 'screen_share', 'screen_share');
    }
}
function stopScreen() {
async function stopScreen() {
    if (screenStream) {
        screenStream.getTracks().forEach(t => t.stop());
        screenStream = null;
    }
    if (screenEncoder) {
        screenEncoder.close();
        screenEncoder = null;
    }
    if (screenCanvasLoop) {
        cancelAnimationFrame(screenCanvasLoop);
        screenCanvasLoop = null;
    }
    const localCardObj = getOrCreateCard({ id: 'local' }, 'screen');
    const ctx = localCardObj.canvas.getContext('2d');
    ctx.clearRect(0, 0, localCardObj.canvas.width, localCardObj.canvas.height);

    updateButton(toggleScreenBtn, false, 'screen_share', 'screen_share');
}
function startVideoSender(stream, headerByte) {
    const video = document.createElement('video');
    video.srcObject = stream;
    video.play();

    const canvas = document.createElement('canvas');
    const ctx = canvas.getContext('2d');

    const sendFrame = () => {
        if (!stream.active) return;
        if (video.readyState === video.HAVE_ENOUGH_DATA) {
            canvas.width = video.videoWidth;
            canvas.height = video.videoHeight;
            ctx.drawImage(video, 0, 0);

            canvas.toBlob((blob) => {
                if (!blob) return;
                const reader = new FileReader();
                reader.onloadend = () => {
                    if (ws.readyState === WebSocket.OPEN) {
                        const arrayBuffer = reader.result;
                        const buffer = new ArrayBuffer(1 + arrayBuffer.byteLength);
                        const view = new Uint8Array(buffer);
                        view[0] = headerByte;
                        view.set(new Uint8Array(arrayBuffer), 1);
                        ws.send(buffer);
                    }
                };
                reader.readAsArrayBuffer(blob);
            }, 'image/jpeg', 0.6);
        }
        setTimeout(sendFrame, 100); // 10 FPS
    };
    sendFrame();
}
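The screen-share encoder above frames each encoded chunk as `[streamType = 5][frameType: 0 = key, 1 = delta][H.264 bytes]`. A sketch of the matching receive-side parse; the function name `parseScreenPayload` is an assumption for illustration, not repo code:

```javascript
// Split a screen-share WebSocket payload back into its parts.
// Returns null for payloads that are not screen frames (streamType != 5).
function parseScreenPayload(arrayBuffer) {
    const view = new DataView(arrayBuffer);
    if (view.getUint8(0) !== 5) return null; // not a screen frame
    return {
        isKey: view.getUint8(1) === 0, // frameType byte: 0 = key, 1 = delta
        data: arrayBuffer.slice(2),    // raw H.264 chunk bytes
    };
}
```

The returned `isKey` and `data` map directly onto the `type` and `data` fields an `EncodedVideoChunk` consumer would need.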
@@ -83,7 +83,7 @@ body {
    align-items: center;
}

.peer-card img, .peer-card video {
.peer-card img, .peer-card video, .peer-card canvas {
    width: 100%;
    height: 100%;
    object-fit: contain; /* or cover if preferred */