Mirror of https://github.com/openai/codex.git, synced 2026-05-16 09:12:54 +00:00.
Fixes #18203.

## Why

Remote TUI clients connected through `codex app-server --listen ws://...` can receive short bursts of outbound turn and tool-output notifications. The WebSocket transport previously used the shared 128-message channel capacity for its outbound writer queue, so a healthy client that briefly lagged during normal output streaming could fill the queue and be disconnected immediately.

This is a smaller mitigation than #18265: instead of adding a new overflow/backpressure pipeline, keep the existing non-blocking router behavior and give WebSocket clients enough bounded headroom for realistic bursts.

## What Changed

- Added a WebSocket-only outbound writer capacity of `64 * 1024` messages.
- Used that larger capacity only for the WebSocket data writer queue in `codex-rs/app-server/src/transport/websocket.rs`.
- Left the shared `CHANNEL_CAPACITY` and the existing disconnect-on-full behavior unchanged for internal/control channels and genuinely stuck clients.

## Verification

- `cargo test -p codex-app-server transport::tests::broadcast_does_not_block_on_slow_connection`
- Manually retried the #18203 repro prompt against the remote TUI and confirmed it stayed connected.