mirror of
https://github.com/openai/codex.git
synced 2026-04-28 00:25:56 +00:00
[fix] app server flaky send_messages test (#8874)
Fixes a flaky CI test: https://github.com/openai/codex/actions/runs/20350530276/job/58473691434?pr=8282

This PR does two things:
1. moves the flaky test to use the Responses API instead of the Chat Completions API
2. makes mcp_process agnostic to the order in which responses/notifications/requests arrive, by buffering messages that have not yet been read
@@ -554,6 +554,7 @@ async fn turn_start_updates_sandbox_and_cwd_between_turns_v2() -> Result<()> {
         mcp.read_stream_until_notification_message("codex/event/task_complete"),
     )
     .await??;
+    mcp.clear_message_buffer();

     // second turn with workspace-write and second_cwd, ensure exec begins in second_cwd
     let second_turn = mcp