## Summary

This is the `exec-server` follow-up to #21759. #21759 fixed the Windows `taskkill` output leak for the `rmcp-client` MCP teardown path, but #22050 showed that `exec-server` still had a parallel `taskkill /T /F` cleanup path in `exec-server/src/connection.rs`. Because that command inherited the parent stdio handles, Windows could still print `SUCCESS:` lines into the user's terminal during stdio child cleanup.

This change silences that remaining `exec-server` callsite by redirecting `taskkill` stdin, stdout, and stderr to `Stdio::null()`.

## What Changed

- Add a Windows-only `Stdio` import in `exec-server/src/connection.rs`.
- Redirect the `taskkill` command in `kill_windows_process_tree` to `Stdio::null()` for stdin, stdout, and stderr.
- Keep the existing kill semantics unchanged by still checking `.status()` and preserving the existing fallback/logging behavior.

## How to Test

Manual validation is Windows-only, so I did not run the UI repro path locally here.

1. On Windows, use a Codex build from this branch.
2. Exercise an `exec-server` stdio flow that spawns a child process tree and then triggers transport cleanup.
3. Confirm the child process tree is still torn down.
4. Confirm the terminal no longer shows `SUCCESS: The process with PID ... has been terminated.` lines during cleanup.

Targeted tests:

```shell
cargo test -p codex-exec-server client::tests::dropping_stdio_client_terminates_spawned_process -- --exact
cargo test -p codex-exec-server client::tests::malformed_stdio_message_terminates_spawned_process -- --exact
```

Notes:

- `cargo test -p codex-exec-server` still hits unrelated local macOS `sandbox-exec: sandbox_apply: Operation not permitted` failures in `tests/file_system.rs`.

## References

- Fixes the remaining callsite discussed in #22050
- Related earlier fix: #21759
```shell
npm i -g @openai/codex
```

or

```shell
brew install --cask codex
```
Codex CLI is a coding agent from OpenAI that runs locally on your computer.
If you want Codex in your code editor (VS Code, Cursor, Windsurf), install in your IDE.
If you want the desktop app experience, run `codex app` or visit the Codex App page.
If you are looking for the cloud-based agent from OpenAI, Codex Web, go to chatgpt.com/codex.
## Quickstart

### Installing and running Codex CLI
Install globally with your preferred package manager:
```shell
# Install using npm
npm install -g @openai/codex

# Install using Homebrew
brew install --cask codex
```

Then simply run `codex` to get started.
You can also go to the latest GitHub Release and download the appropriate binary for your platform.
Each GitHub Release contains many executables, but in practice, you likely want one of these:
- macOS
  - Apple Silicon/arm64: `codex-aarch64-apple-darwin.tar.gz`
  - x86_64 (older Mac hardware): `codex-x86_64-apple-darwin.tar.gz`
- Linux
  - x86_64: `codex-x86_64-unknown-linux-musl.tar.gz`
  - arm64: `codex-aarch64-unknown-linux-musl.tar.gz`
Each archive contains a single entry with the platform baked into the name (e.g., `codex-x86_64-unknown-linux-musl`), so you likely want to rename it to `codex` after extracting it.
### Using Codex with your ChatGPT plan

Run `codex` and select **Sign in with ChatGPT**. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Business, Edu, or Enterprise plan. Learn more about what's included in your ChatGPT plan.
You can also use Codex with an API key, but this requires additional setup.
## Docs
This repository is licensed under the Apache-2.0 License.
