## Why

Remote TUI resume uses the app-server websocket client. That client inherited tungstenite's default `16 MiB` frame limit, so a large saved session could make `thread/resume` return a single JSON-RPC response frame that the client rejected before the TUI could deserialize or render it.

Fixes #19837

## What Changed

- Configure the remote app-server websocket client with a bounded `128 MiB` max frame/message size.
- Preserve the concrete remote worker exit reason when completing pending requests after a transport/read failure, instead of replacing it with a generic channel-closed error.
- Add a regression test that sends a single `>16 MiB` JSON-RPC response frame and verifies the typed request succeeds.

Note: This isn't a perfect fix; it only moves the limit to a much larger value. I looked at several other potential fixes (both server-side and client-side), and they all involved significant complexity, backward-compatibility impact, or performance costs for common use cases. This simple fix should address the vast majority of remote use cases.

## Verification

I reproduced the problem locally using a long rollout and verified that the fix addresses the connection drop.
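The first change above can be sketched roughly as follows. This is a minimal, self-contained sketch: the struct below is a local stand-in mirroring tungstenite's `WebSocketConfig` fields (`max_message_size` / `max_frame_size`, both `Option<usize>`), and the function name and call site are assumptions, not the actual client code.

```rust
// Local stand-in for tungstenite's `WebSocketConfig` (assumption: the real
// client would pass the crate's own struct when opening the connection).
#[derive(Debug)]
struct WebSocketConfig {
    max_message_size: Option<usize>, // default in tungstenite is smaller
    max_frame_size: Option<usize>,   // default in tungstenite is 16 MiB
}

/// Bounded limit applied to both the message and frame size.
const MAX_WS_SIZE: usize = 128 * 1024 * 1024; // 128 MiB

// Hypothetical helper producing the remote app-server client's config.
fn remote_client_ws_config() -> WebSocketConfig {
    WebSocketConfig {
        max_message_size: Some(MAX_WS_SIZE),
        max_frame_size: Some(MAX_WS_SIZE),
    }
}

fn main() {
    let cfg = remote_client_ws_config();
    // A single >16 MiB JSON-RPC response frame now fits within the bound,
    // where the old 16 MiB default would have rejected it.
    let big_frame: usize = 20 * 1024 * 1024;
    assert!(big_frame <= cfg.max_frame_size.unwrap());
    println!("max frame size: {} bytes", cfg.max_frame_size.unwrap());
}
```

The limit is bounded rather than unlimited (`None`) so a misbehaving server still cannot force the client to buffer arbitrarily large frames.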