## New Features
- Added a `/permissions` command with a shorter approval set while keeping `/approvals` for compatibility. (#9561)
- Added a `/skill` UI to enable or disable individual skills. (#9627)
- Improved slash-command selection by prioritizing exact and prefix matches over fuzzy matches. (#9629)
- App server now supports `thread/read` and can filter archived threads in `thread/list`. (#9569, #9571)
- App server clients now support layered `config.toml` resolution and `config/read` can compute effective config from a given cwd. (#9510)
- Release artifacts now include a stable URL for the published config schema. (#9572)
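The slash-command selection change above (exact match first, then prefix match, then fuzzy) can be sketched as a tiered sort. This is a conceptual illustration of the ranking order, not Codex's actual matcher; the function name and tiers are assumptions:

```rust
/// Rank candidate slash commands against user input: exact matches
/// first, then prefix matches, then everything else (the fuzzy tier).
/// Illustrative sketch only, not the Codex implementation.
fn rank_commands<'a>(input: &str, commands: &[&'a str]) -> Vec<&'a str> {
    let mut ranked: Vec<&'a str> = commands.to_vec();
    // `sort_by_key` is stable, so commands within the same tier
    // keep their original relative order.
    ranked.sort_by_key(|c| {
        if *c == input {
            0 // exact match
        } else if c.starts_with(input) {
            1 // prefix match
        } else {
            2 // fall through to fuzzy ranking
        }
    });
    ranked
}
```

With this ordering, typing `/approvals` surfaces the exact command ahead of any prefix or fuzzy candidates.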

## Bug Fixes
- Prevented tilde expansion from escaping HOME on paths like `~//...`. (#9621)
- TUI turn timing now resets between assistant messages so elapsed time reflects the latest response. (#9599)
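The tilde-expansion fix above guards against inputs like `~//etc/passwd`, where the remainder after stripping `~/` is itself an absolute path and would otherwise replace the home directory entirely (`PathBuf::push` with an absolute path discards the base). A minimal sketch of that kind of guard, not the actual Codex implementation:

```rust
use std::path::PathBuf;

/// Expand a leading `~` to `home`, rejecting inputs such as `~//etc`
/// whose remainder is absolute and would escape the home directory.
/// Hypothetical sketch; not the code merged in #9621.
fn expand_tilde(path: &str, home: &str) -> Option<PathBuf> {
    match path.strip_prefix('~') {
        None => Some(PathBuf::from(path)),
        Some(rest) => {
            // Consume the single separator after `~`, if present.
            let rest = rest.strip_prefix('/').unwrap_or(rest);
            // A remainder that is still absolute (e.g. from `~//etc`)
            // must not be pushed: PathBuf::push would drop `home`.
            if rest.starts_with('/') {
                return None;
            }
            let mut out = PathBuf::from(home);
            if !rest.is_empty() {
                out.push(rest);
            }
            Some(out)
        }
    }
}
```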

## Documentation
- Updated MCP subcommand docs to match current CLI behavior. (#9622)
- Refreshed the `skills/list` protocol README example to match the latest response shape. (#9623)

## Chores
- Removed the TUI2 experiment and its related config/docs, keeping Codex on the terminal-native UI. (#9640)

## Changelog

Full Changelog: https://github.com/openai/codex/compare/rust-v0.88.0...rust-v0.89.0

- #9576 [bazel] Upgrade to bazel9 @zbarsky-openai
- #9606 nit: ui on interruption @jif-oai
- #9609 chore: defensive shell snapshot @jif-oai
- #9621 fix: Fix tilde expansion to avoid absolute-path escape @tiffanycitra
- #9573 define/emit some metrics for windows sandbox setup @iceweasel-oai
- #9622 docs: fix outdated MCP subcommands documentation @htiennv
- #9623 Update skills/list protocol readme @gverma-openai
- #9616 [bazel] Upgrade llvm toolchain and enable remote repo cache @zbarsky-openai
- #9624 forgot to add some windows sandbox nux events. @iceweasel-oai
- #9633 Add websockets logging @pakrym-oai
- #9592 Chore: update plan mode output in prompt @shijie-oai
- #9583 Add collaboration_mode to TurnContextItem @charley-oai
- #9510 Add layered config.toml support to app server @etraut-openai
- #9629 feat: better sorting of shell commands @jif-oai
- #9599 fix(tui) turn timing incremental @dylan-hurd-oai
- #9572 feat: publish config schema on release @sayan-oai
- #9549 Reduce burst testing flake @charley-oai
- #9640 feat(tui): retire the tui2 experiment @joshka-oai
- #9597 feat(core) ModelInfo.model_instructions_template @dylan-hurd-oai
- #9627 Add UI for skill enable/disable. @xl-openai
- #9650 chore: tweak AGENTS.md @dylan-hurd-oai
- #9656 Add tui.experimental_mode setting @pakrym-oai
- #9561 feat(tui) /permissions flow @dylan-hurd-oai
- #9653 Fix: Lower log level for closed-channel send @Kbediako
- #9659 Chore: add cmd related info to exec approval request @shijie-oai
- #9693 Revert "feat: support proxy for ws connection" @pakrym-oai
- #9698 Support end_turn flag @pakrym-oai
- #9645 Modes label below textarea @charley-oai
- #9644 feat(core) update Personality on turn @dylan-hurd-oai
- #9569 feat(app-server): thread/read API @owenlin0
- #9571 feat(app-server): support archived threads in thread/list @owenlin0

```shell
npm i -g @openai/codex
# or
brew install --cask codex
```

Codex CLI is a coding agent from OpenAI that runs locally on your computer.



If you want Codex in your code editor (VS Code, Cursor, Windsurf), install in your IDE.
If you are looking for the cloud-based agent from OpenAI, Codex Web, go to chatgpt.com/codex.


## Quickstart

### Installing and running Codex CLI

Install globally with your preferred package manager:

```shell
# Install using npm
npm install -g @openai/codex
# Install using Homebrew
brew install --cask codex
```

Then simply run `codex` to get started.

You can also go to the latest GitHub Release and download the appropriate binary for your platform.

Each GitHub Release contains many executables, but in practice, you likely want one of these:

- macOS
  - Apple Silicon/arm64: `codex-aarch64-apple-darwin.tar.gz`
  - x86_64 (older Mac hardware): `codex-x86_64-apple-darwin.tar.gz`
- Linux
  - x86_64: `codex-x86_64-unknown-linux-musl.tar.gz`
  - arm64: `codex-aarch64-unknown-linux-musl.tar.gz`

Each archive contains a single entry with the platform baked into the name (e.g., `codex-x86_64-unknown-linux-musl`), so you likely want to rename it to `codex` after extracting it.
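The extract-and-rename step can be sketched as follows. So the commands are runnable anywhere, this example builds a local stand-in archive first; with a real download from the Releases page you would start at the second `tar` line, and the asset name is just an example for your platform:

```shell
# Stand-in for a downloaded release asset (real archives come from
# the GitHub Releases page; this asset name is only an example).
ASSET=codex-x86_64-unknown-linux-musl
mkdir -p demo && cd demo
printf '#!/bin/sh\necho codex-demo\n' > "$ASSET"
chmod +x "$ASSET"
tar -czf "$ASSET.tar.gz" "$ASSET"
rm "$ASSET"

# The actual steps after downloading: extract, then rename to `codex`.
tar -xzf "$ASSET.tar.gz"
mv "$ASSET" codex
./codex
```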

### Using Codex with your ChatGPT plan

Run codex and select Sign in with ChatGPT. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Team, Edu, or Enterprise plan. Learn more about what's included in your ChatGPT plan.

You can also use Codex with an API key, but this requires additional setup.

## Docs

This repository is licensed under the Apache-2.0 License.
