## Why

The TUI currently treats Markdown tables as ordinary wrapped text, which makes table-heavy responses hard to read and brittle across narrow panes and terminal resizes. This change teaches the TUI to render Markdown tables responsively while preserving the raw Markdown source needed to re-render streamed and finalized transcript content after width changes. The goal is to keep tables legible during streaming, after a resize, and once a turn has finished, without corrupting scrollback ordering.

## What Changed

- Add table detection and responsive table rendering in the Markdown renderer.
- Render standard tables with Unicode box-drawing borders when the pane is wide enough.
- Add a vertical readability fallback for constrained or dense tables so narrow panes still show each row clearly.
- Keep links and `<br>` content inside table cells instead of leaking text outside the table.
- Avoid table normalization inside fenced or indented code blocks.
- Preserve raw streamed Markdown source and keep the active table as a mutable tail until finalization.
- Consolidate finalized streamed content into source-backed transcript cells so post-resize re-rendering stays correct.
- Add snapshot and targeted streaming/resize regression coverage for the new table behavior.

## How to Test

1. Start the Codex TUI from this branch.
2. Paste this exact prompt: `This is a session to test codex, no need to do any thinking, just end different markdown tables, with columns exploring different markdown contents, like links, bold italic, code, etc. Make them different sizes, some 30+ rows, some not and intertwine them with some paragraphs with complex formatting as well.`
3. Confirm the response includes several Markdown tables mixed with richly formatted paragraphs.
4. Confirm that wide-enough tables render with box-drawing borders instead of plain wrapped pipe text.
5. Resize the terminal narrower while the answer is still streaming and confirm the in-progress table stays coherent instead of duplicating headers or leaving broken scrollback behind.
6. Resize again after the turn finishes and confirm the finalized transcript re-renders cleanly at the new width.
7. In a narrow pane, verify that dense tables fall back to the vertical per-row layout instead of producing unreadable wrapped columns.
8. Verify that pipe-heavy fenced code blocks still render as code, not as tables.

Targeted tests:

- `cargo test -p codex-tui table_readability_fallback --no-fail-fast`
- `cargo test -p codex-tui markdown_render --no-fail-fast`
- `cargo test -p codex-tui streaming::controller --no-fail-fast`
- `cargo test -p codex-tui table_resize_lifecycle --no-fail-fast`

## Docs

No developer docs update appears necessary.
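The two decisions at the heart of the change — spotting pipe-delimited table rows outside code blocks, and choosing between the bordered and vertical layouts — can be sketched roughly as follows. This is a simplified illustration with invented names, not the actual `codex-tui` API:

```rust
// Hypothetical sketch (names invented for illustration): detect Markdown
// table rows, and pick a layout from the table's natural width vs. pane width.

/// A line is treated as a potential table row when it is pipe-delimited
/// (ignoring surrounding whitespace) and we are not inside a code block.
fn is_table_row(line: &str, in_code_block: bool) -> bool {
    if in_code_block {
        return false; // pipe-heavy fenced code must never be normalized as a table
    }
    let trimmed = line.trim();
    trimmed.starts_with('|') && trimmed.ends_with('|') && trimmed.matches('|').count() >= 2
}

#[derive(Debug, PartialEq)]
enum TableLayout {
    /// Unicode box-drawing borders; used when the whole table fits the pane.
    Bordered,
    /// One block per row; the readability fallback for narrow or dense tables.
    Vertical,
}

/// Fall back to the vertical per-row form when the table's natural width
/// (content plus borders) exceeds the current pane width.
fn choose_layout(natural_width: u16, pane_width: u16) -> TableLayout {
    if natural_width <= pane_width {
        TableLayout::Bordered
    } else {
        TableLayout::Vertical
    }
}

fn main() {
    assert!(is_table_row("| a | b |", false));
    assert!(!is_table_row("| not a table |", true)); // inside a fence: stays code
    assert_eq!(choose_layout(40, 80), TableLayout::Bordered);
    assert_eq!(choose_layout(120, 60), TableLayout::Vertical);
    println!("ok");
}
```

The real renderer also has to handle the streaming case, where the active table remains a mutable tail that is re-laid-out from raw source on each resize rather than being committed to scrollback early.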
```shell
npm i -g @openai/codex
```

or

```shell
brew install --cask codex
```
Codex CLI is a coding agent from OpenAI that runs locally on your computer.
If you want Codex in your code editor (VS Code, Cursor, Windsurf), install in your IDE.
If you want the desktop app experience, run `codex app` or visit the Codex App page.
If you are looking for the cloud-based agent from OpenAI, Codex Web, go to chatgpt.com/codex.
## Quickstart

### Installing and running Codex CLI
Install globally with your preferred package manager:
```shell
# Install using npm
npm install -g @openai/codex

# Install using Homebrew
brew install --cask codex
```
Then simply run `codex` to get started.
You can also go to the latest GitHub Release and download the appropriate binary for your platform.
Each GitHub Release contains many executables, but in practice, you likely want one of these:
- macOS
  - Apple Silicon/arm64: `codex-aarch64-apple-darwin.tar.gz`
  - x86_64 (older Mac hardware): `codex-x86_64-apple-darwin.tar.gz`
- Linux
  - x86_64: `codex-x86_64-unknown-linux-musl.tar.gz`
  - arm64: `codex-aarch64-unknown-linux-musl.tar.gz`
Each archive contains a single entry with the platform baked into the name (e.g., `codex-x86_64-unknown-linux-musl`), so you likely want to rename it to `codex` after extracting it.
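The extract-and-rename steps look like this. To keep the example self-contained it fabricates a stand-in archive in a temp directory; with a real download from the GitHub Release you would start at the `tar -xzf` line:

```shell
# Set up: work in a temp dir and create a stand-in archive so the steps below
# are runnable as-is (a real download replaces this block).
cd "$(mktemp -d)"
printf '#!/bin/sh\necho codex\n' > codex-x86_64-unknown-linux-musl
tar -czf codex-x86_64-unknown-linux-musl.tar.gz codex-x86_64-unknown-linux-musl
rm codex-x86_64-unknown-linux-musl

# The actual steps: extract the archive's single entry, rename it to `codex`,
# and make sure it is executable.
tar -xzf codex-x86_64-unknown-linux-musl.tar.gz
mv codex-x86_64-unknown-linux-musl codex
chmod +x codex
./codex
```

Adjust the archive name to match the binary for your platform from the list above.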
### Using Codex with your ChatGPT plan
Run `codex` and select **Sign in with ChatGPT**. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Business, Edu, or Enterprise plan. Learn more about what's included in your ChatGPT plan.
You can also use Codex with an API key, but this requires additional setup.
## Docs
This repository is licensed under the Apache-2.0 License.
