## New Features
- Add a unified `WebSearchMode` setting (disabled/cached/live) across config, protocol v2, CLI/TUI, and the TypeScript SDK, superseding legacy web-search flags (#9216)
- Improve startup UX by queuing input until session configuration/model selection is ready and showing a loading state in the TUIs (#9191)
- Add experimental collaboration-tool prompting plus lifecycle events so clients can track tool start/finish (#9208, #9095)
- Make tool execution more robust by falling back to a piped process when PTYs aren’t available (#8797)
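The unified web-search setting could be expressed in `config.toml` roughly like this. The key name and placement are assumptions for illustration, not the shipped schema; only the three mode values come from the release notes:

```toml
# Hypothetical sketch: one enum value replaces the legacy boolean
# web-search flags. Modes per the release notes: "disabled", "cached", "live".
web_search = "cached"
```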

## Bug Fixes
- Prevent Responses SSE connections from lingering by emitting `response.completed` immediately on completion (#9170)
- Fix Linux sandbox UID/GID mapping after `unshare` for more reliable sandboxed execution (#9234)
- Make TUI2 “queue vs submit” keybindings consistent (Tab queues; Enter behavior aligns with Steer mode) (#9218)
- Adjust quit shortcuts to reduce accidental exits while the exit/interrupt flow is being refined (#8936, #9220)
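For context on the UID/GID fix: after `unshare(CLONE_NEWUSER)` creates a user namespace, IDs inside it resolve via map files such as `/proc/<pid>/uid_map`, each line holding an "inside outside count" triple. This is a minimal sketch of building such a line, not Codex's actual sandbox code; the uid 1000 is an assumption for illustration:

```rust
// Build one "inside outside count" mapping line in the format that
// /proc/<pid>/uid_map and /proc/<pid>/gid_map expect.
fn map_line(inside: u32, outside: u32, count: u32) -> String {
    format!("{inside} {outside} {count}\n")
}

fn main() {
    // Map root inside the namespace to uid 1000 (an assumed invoking user)
    // outside it. In a real flow this string is written to
    // /proc/<pid>/uid_map (and, after writing "deny" to
    // /proc/<pid>/setgroups, to /proc/<pid>/gid_map).
    print!("{}", map_line(0, 1000, 1));
}
```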

## Documentation
- Document the evolving exit confirmation / interrupt design and update related config guidance (#8936)
- Improve issue reporting by asking for terminal emulator details in the issue template (#9231)

## Chores
- Add trace-mode logging of request headers to aid debugging (#9214)
- Rename MCP-related config fields and agent metadata for clearer naming (#9212, #9215)
- Speed up CI by moving Rust workflows onto larger runners (#9106)

## Changelog

Full Changelog: https://github.com/openai/codex/compare/rust-v0.81.0...rust-v0.82.0

- #8936 tui: double-press Ctrl+C/Ctrl+D to quit @joshka-oai
- #9095 feat: emit events around collab tools @jif-oai
- #9208 feat: add collab prompt @jif-oai
- #9170 fix: Emit response.completed immediately for Responses SSE @celia-oai
- #9191 Get model on session configured @aibrahim-oai
- #9214 Log headers in trace mode @pakrym-oai
- #9212 s/mcp_server_requirements/mcp_servers @gt-oai
- #8797 feat: adding piped process to replace PTY when needed @jif-oai
- #9215 Rename hierarchical_agents to child_agents_md @pakrym-oai
- #9218 fix(tui2): align Steer submit keys @joshka-oai
- #9220 fix(tui): disable double-press quit shortcut @joshka-oai
- #9216 add WebSearchMode enum @sayan-oai
- #9231 Updated issue template to ask for terminal emulator @etraut-openai
- #9106 upgrade runners in rust-ci.yml to use the larger runners @willwang-openai
- #9234 fix: correct linux sandbox uid/gid mapping after unshare @viyatb-oai
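The piped-process fallback in #8797 can be sketched with nothing but the standard library. This is a hedged illustration of the idea (run the tool as a plain child process with piped stdio when a PTY can't be allocated), not the actual Codex implementation:

```rust
use std::process::{Command, Stdio};

// Fallback path: run the tool as an ordinary piped child process and
// capture its stdout, instead of attaching it to a pseudo-terminal.
fn run_piped(program: &str, args: &[&str]) -> std::io::Result<String> {
    let output = Command::new(program)
        .args(args)
        .stdin(Stdio::null())
        .stdout(Stdio::piped())
        .stderr(Stdio::piped())
        .output()?;
    Ok(String::from_utf8_lossy(&output.stdout).into_owned())
}

fn main() -> std::io::Result<()> {
    // Pretend PTY allocation failed and go straight to the piped path.
    let out = run_piped("echo", &["hello"])?;
    print!("{out}");
    Ok(())
}
```

The trade-off is that without a PTY the child sees non-interactive stdio, so tools that change behavior on a terminal (color, paging, prompts) may act differently.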

```shell
npm i -g @openai/codex
# or
brew install --cask codex
```

Codex CLI is a coding agent from OpenAI that runs locally on your computer.

*(Codex CLI splash screenshot)*


If you want Codex in your code editor (VS Code, Cursor, Windsurf), install in your IDE.
If you are looking for the cloud-based agent from OpenAI, Codex Web, go to chatgpt.com/codex.


## Quickstart

### Installing and running Codex CLI

Install globally with your preferred package manager:

```shell
# Install using npm
npm install -g @openai/codex
# Install using Homebrew
brew install --cask codex
```

Then run `codex` to get started.

You can also go to the latest GitHub Release and download the appropriate binary for your platform.

Each GitHub Release contains many executables, but in practice, you likely want one of these:

- macOS
  - Apple Silicon/arm64: `codex-aarch64-apple-darwin.tar.gz`
  - x86_64 (older Mac hardware): `codex-x86_64-apple-darwin.tar.gz`
- Linux
  - x86_64: `codex-x86_64-unknown-linux-musl.tar.gz`
  - arm64: `codex-aarch64-unknown-linux-musl.tar.gz`

Each archive contains a single entry with the platform baked into the name (e.g., `codex-x86_64-unknown-linux-musl`), so you likely want to rename it to `codex` after extracting it.

### Using Codex with your ChatGPT plan

Run codex and select Sign in with ChatGPT. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Team, Edu, or Enterprise plan. Learn more about what's included in your ChatGPT plan.

You can also use Codex with an API key, but this requires additional setup.

## Docs

This repository is licensed under the Apache-2.0 License.
