Compare commits

...

23 Commits

Author SHA1 Message Date
Michael Bolin
0a68853d8b feat: build the standalone codex-exec binary as part of the GitHub Release 2025-09-29 09:54:54 -07:00
Ed Bayes
c4120a265b [CODEX-3595] Remove period when copying highlighted text in iTerm (#4419) 2025-09-28 20:50:04 -07:00
Michael Bolin
618a42adf5 feat: introduce npm module for codex-responses-api-proxy (#4417)
This PR expands `.github/workflows/rust-release.yml` so that it also
builds and publishes the `npm` module for
`@openai/codex-responses-api-proxy` in addition to `@openai/codex`. Note
that both `npm` modules are similar, in that each contains a single `.js`
file that is a thin launcher around the appropriate native executable; a
sketch of such a launcher follows this entry. (Since we have only a
minimal dependency on Node.js, I also lowered the minimum Node version
from 20 to 16 and verified that it works on my machine.)

As part of this change, we tighten up some of the docs around
`codex-responses-api-proxy` and ensure the details regarding protecting
the `OPENAI_API_KEY` in memory match the implementation.

To test the `npm` build process, I ran:

```
./codex-cli/scripts/build_npm_package.py --package codex-responses-api-proxy --version 0.43.0-alpha.3
```

which stages the `npm` module for `@openai/codex-responses-api-proxy` in
a temp directory, using the binary artifacts from
https://github.com/openai/codex/releases/tag/rust-v0.43.0-alpha.3.
2025-09-28 19:34:06 -07:00
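The commit above describes each npm package as a thin `.js` launcher over a native binary. Below is a hypothetical sketch of such a launcher in TypeScript: the `vendor/<target>/codex-responses-api-proxy/` layout is inferred from the `install_native_deps.py` changes later in this diff, but the platform-to-triple table and file names are assumptions, not the package's confirmed contents.

```
#!/usr/bin/env node
// Hypothetical thin launcher: resolve the platform-specific native binary
// shipped inside the package, then run it with the caller's arguments.
import { spawn } from "node:child_process";
import path from "node:path";
import { fileURLToPath } from "node:url";

const here = path.dirname(fileURLToPath(import.meta.url));

// Assumed mapping from Node's platform/arch to the Rust target triple.
const triples: Record<string, string> = {
  "darwin-arm64": "aarch64-apple-darwin",
  "darwin-x64": "x86_64-apple-darwin",
  "linux-x64": "x86_64-unknown-linux-musl",
  "linux-arm64": "aarch64-unknown-linux-musl",
  "win32-x64": "x86_64-pc-windows-msvc",
  "win32-arm64": "aarch64-pc-windows-msvc",
};
const target = triples[`${process.platform}-${process.arch}`];
if (!target) {
  console.error(`unsupported platform: ${process.platform}-${process.arch}`);
  process.exit(1);
}

const exe = process.platform === "win32" ? ".exe" : "";
const binary = path.join(
  here, "..", "vendor", target,
  "codex-responses-api-proxy", `codex-responses-api-proxy${exe}`,
);

// Inherit stdio and forward the exit status so the wrapper stays invisible.
const child = spawn(binary, process.argv.slice(2), { stdio: "inherit" });
child.on("exit", (code, signal) => process.exit(code ?? (signal ? 1 : 0)));
```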
dedrisian-oai
a9d54b9e92 Add /review to main commands (#4416)
<img width="517" height="249" alt="Screenshot 2025-09-28 at 4 33 26 PM"
src="https://github.com/user-attachments/assets/6d34734e-fe3c-4b88-8239-a6436bcb9fa5"
/>
2025-09-28 16:59:39 -07:00
Michael Bolin
79e51dd607 fix: clean up some minor issues with .github/workflows/ci.yml (#4408) 2025-09-28 15:31:17 -07:00
Michael Bolin
ff6dbff0b6 feat: build codex-responses-api-proxy for all platforms as part of the GitHub Release (#4406)
This should make the `codex-responses-api-proxy` binaries available for
all platforms in a GitHub Release as well as a corresponding DotSlash
file.

Making `codex-responses-api-proxy` available as an `npm` module will be
done in a follow-up PR.

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/4404).
* __->__ #4406
* #4404
* #4403
2025-09-28 15:25:15 -07:00
Michael Bolin
99841332e2 chore: remove responses-api-proxy from the multitool (#4404)
This removes the `codex responses-api-proxy` subcommand in favor of
running it as a standalone CLI.

As part of this change, we:

- remove the dependency on `tokio`/`async/await` as well as `codex_arg0`
- introduce the use of `pre_main_hardening()` so `CODEX_SECURE_MODE=1`
is not required

---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/4404).
* #4406
* __->__ #4404
* #4403
2025-09-28 15:22:27 -07:00
Michael Bolin
7407469791 chore: lower logging level from error to info for MCP startup (#4412) 2025-09-28 15:13:44 -07:00
Michael Bolin
43615becf0 chore: move pre_main_hardening() utility into its own crate (#4403) 2025-09-28 14:35:14 -07:00
dedrisian-oai
9ee6e6f342 Improve update nudge (#4405)
Makes the update nudge larger and adds a link to see the latest release:

<img width="542" height="337" alt="Screenshot 2025-09-28 at 11 19 05 AM"
src="https://github.com/user-attachments/assets/1facce96-72f0-4a97-910a-df8b5b8b07af"
/>
2025-09-28 11:46:15 -07:00
Thibault Sottiaux
d7286e9829 chore: remove model upgrade popup (#4332) 2025-09-27 13:25:09 -07:00
Fouad Matin
bcf2bc0aa5 fix(tui): make ? work again (#4362)
Revert #4330 #4316
2025-09-27 12:18:33 -07:00
Michael Bolin
68765214b3 fix: remove default timeout of 30s in the proxy (#4336)
This is likely the reason that I saw some conversations "freeze up" when
using the proxy.

Note the client in `core` does not specify a timeout when making
requests to the Responses API, so the proxy should not, either.
2025-09-27 07:54:32 -07:00
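Illustratively, the fix above amounts to not setting any client-side deadline when forwarding to the Responses API. A minimal TypeScript sketch of the intent (the real proxy is Rust; `forwardToResponsesApi` is a hypothetical helper, not its API):

```
// Deliberately no AbortSignal.timeout(...): match the client in `core`,
// which sets no timeout, and let the upstream end long-running streams
// itself instead of cutting them off after a fixed 30s.
async function forwardToResponsesApi(
  url: string,
  body: string,
  apiKey: string,
): Promise<Response> {
  return fetch(url, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      authorization: `Bearer ${apiKey}`,
    },
    body,
  });
}
```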
Ahmed Ibrahim
5c67dc3af1 Edit the spacing in shortcuts (#4330)
<img width="739" height="132" alt="image"
src="https://github.com/user-attachments/assets/e8d40abb-ac41-49a2-abc4-ddc6decef989"
/>
2025-09-26 22:54:38 -07:00
Jeremy Rose
c0960c0f49 tui: separator above final agent message (#4324)
Adds a separator line before the final agent message

<img width="1011" height="884" alt="Screenshot 2025-09-26 at 4 55 01 PM"
src="https://github.com/user-attachments/assets/7c91adbf-6035-4578-8b88-a6921f11bcbc"
/>
2025-09-26 22:49:59 -07:00
Thibault Sottiaux
90c3a5650c fix: set gpt-5-codex medium preset reasoning (#4335)
Otherwise it shows up as none, see
https://github.com/openai/codex/issues/4321
2025-09-26 22:31:39 -07:00
Thibault Sottiaux
a3254696c8 docs: refresh README under codex-rs (#4333) 2025-09-26 21:45:46 -07:00
Ahmed Ibrahim
2719fdd12a Add "? for shortcuts" (#4316)
https://github.com/user-attachments/assets/9e61b197-024b-4cbc-b40d-c446b448e759
2025-09-26 18:24:26 -07:00
Gabriel Peal
3a1be084f9 [MCP] Add experimental support for streamable HTTP MCP servers (#4317)
This PR adds support for streamable HTTP MCP servers when the
`experimental_use_rmcp_client` config option is enabled.

To set one up, add a new MCP server config with a `url`:
```
[mcp_servers.figma]
url = "http://127.0.0.1:3845/mcp"
```

It also supports an optional `bearer_token`, which is sent in an
`Authorization` header. The full OAuth flow is not supported yet.

The config parsing will throw if it detects that the user mixed and
matched config fields (like `command` + `bearer_token` or `url` + `env`);
a sketch of this check follows this entry.

The best way to review this is to start with `core/src` and then
`rmcp-client/src/rmcp_client.rs`; the rest is tests and propagating the
`Transport` struct around the codebase.

Example with the Figma MCP:
<img width="5084" height="1614" alt="CleanShot 2025-09-26 at 13 35 40"
src="https://github.com/user-attachments/assets/eaf2771e-df3e-4300-816b-184d7dec5a28"
/>
2025-09-26 21:24:01 -04:00
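The mixed-field validation described above boils down to: an entry must be either pure stdio (`command`/`args`/`env`) or pure streamable HTTP (`url`/`bearer_token`). A rough TypeScript sketch of that check — the actual implementation is the serde `Deserialize` impl for `McpServerConfig` shown later in this diff:

```
type McpServerEntry = {
  command?: string;
  args?: string[];
  env?: Record<string, string>;
  url?: string;
  bearer_token?: string;
};

// Decide which transport an entry configures, rejecting mixed fields.
function classifyTransport(entry: McpServerEntry): "stdio" | "streamable_http" {
  if (entry.command !== undefined) {
    // stdio transport: HTTP-only fields are not allowed.
    if (entry.url !== undefined) throw new Error("url is not supported for stdio");
    if (entry.bearer_token !== undefined)
      throw new Error("bearer_token is not supported for stdio");
    return "stdio";
  }
  if (entry.url !== undefined) {
    // streamable HTTP transport: stdio-only fields are not allowed.
    if (entry.args !== undefined) throw new Error("args is not supported for streamable_http");
    if (entry.env !== undefined) throw new Error("env is not supported for streamable_http");
    return "streamable_http";
  }
  throw new Error("invalid transport: expected either command or url");
}
```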
Jeremy Rose
43b63ccae8 update composer + user message styling (#4240)
Changes:

- the composer and user messages now have a colored background that
stretches the entire width of the terminal.
- the prompt character was changed from a cyan `▌` to a bold `›`.
- the "working" shimmer now follows the "dark gray" color of the
terminal, better matching the terminal's color scheme

| Terminal + Background        | Screenshot |
|------------------------------|------------|
| iTerm with dark bg | <img width="810" height="641" alt="Screenshot
2025-09-25 at 11 44 52 AM"
src="https://github.com/user-attachments/assets/1317e579-64a9-4785-93e6-98b0258f5d92"
/> |
| iTerm with light bg | <img width="845" height="540" alt="Screenshot
2025-09-25 at 11 46 29 AM"
src="https://github.com/user-attachments/assets/e671d490-c747-4460-af0b-3f8d7f7a6b8e"
/> |
| iTerm with color bg | <img width="825" height="564" alt="Screenshot
2025-09-25 at 11 47 12 AM"
src="https://github.com/user-attachments/assets/141cda1b-1164-41d5-87da-3be11e6a3063"
/> |
| Terminal.app with dark bg | <img width="577" height="367"
alt="Screenshot 2025-09-25 at 11 45 22 AM"
src="https://github.com/user-attachments/assets/93fc4781-99f7-4ee7-9c8e-3db3cd854fe5"
/> |
| Terminal.app with light bg | <img width="577" height="367"
alt="Screenshot 2025-09-25 at 11 46 04 AM"
src="https://github.com/user-attachments/assets/19bf6a3c-91e0-447b-9667-b8033f512219"
/> |
| Terminal.app with color bg | <img width="577" height="367"
alt="Screenshot 2025-09-25 at 11 45 50 AM"
src="https://github.com/user-attachments/assets/dd7c4b5b-342e-4028-8140-f4e65752bd0b"
/> |
2025-09-26 16:35:56 -07:00
pakrym-oai
cc1b21e47f Add turn started/completed events and correct exit code on error (#4309)
Adds new turn started/completed events; the `turn.completed` event
includes usage. Also ensures we return exit code 1 on failures.
```
{
  "type": "session.created",
  "session_id": "019987a7-93e7-7b20-9e05-e90060e411ea"
}
{
  "type": "turn.started"
}
...
{
  "type": "turn.completed",
  "usage": {
    "input_tokens": 78913,
    "cached_input_tokens": 65280,
    "output_tokens": 1099
  }
}
```
2025-09-26 16:21:50 -07:00
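A minimal sketch of a consumer for this event stream, assuming events arrive as one JSON object per line (the sample above is pretty-printed) and that failures surface as an `error`-typed event — both assumptions, not a documented contract:

```
import readline from "node:readline";

const rl = readline.createInterface({ input: process.stdin });
rl.on("line", (line) => {
  if (!line.trim()) return;
  const event = JSON.parse(line);
  switch (event.type) {
    case "session.created":
      console.error(`session ${event.session_id} created`);
      break;
    case "turn.completed": {
      const u = event.usage;
      console.error(
        `tokens: input=${u.input_tokens} cached=${u.cached_input_tokens} output=${u.output_tokens}`,
      );
      break;
    }
    case "error": // assumed shape; the commit only guarantees exit code 1
      process.exitCode = 1;
      break;
  }
});
```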
iceweasel-oai
55801700de reject dangerous commands for AskForApproval::Never (#4307)
If we detect a dangerous command but approval_policy is Never, simply
reject the command.
2025-09-26 14:08:28 -07:00
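The decision logic above is small: a dangerous command either triggers an approval prompt or, when prompting is impossible, an outright rejection. A hedged sketch, where `isKnownDangerous` and the policy variant names are illustrative stand-ins for the real detector and enum:

```
type ApprovalPolicy = "untrusted" | "on-failure" | "on-request" | "never";

function decideExec(
  command: string[],
  policy: ApprovalPolicy,
  isKnownDangerous: (cmd: string[]) => boolean, // hypothetical detector
): "run" | "ask-user" | "reject" {
  if (!isKnownDangerous(command)) return "run";
  // With AskForApproval::Never there is nobody to ask, so reject outright.
  return policy === "never" ? "reject" : "ask-user";
}
```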
Ahmed Ibrahim
1fba99ed85 /status followup (#4304)
- Render `send a message to load usage data` at the beginning of the
session
- Render `data not available yet` if no rate limits have been received
- Fix a casing nit
- Deleted stale snapshots that were moved to
`codex-rs/tui/src/status/snapshots`
2025-09-26 18:16:54 +00:00
113 changed files with 3367 additions and 1503 deletions

View File

@@ -27,6 +27,34 @@
"path": "codex.exe"
}
}
},
"codex-responses-api-proxy": {
"platforms": {
"macos-aarch64": {
"regex": "^codex-responses-api-proxy-aarch64-apple-darwin\\.zst$",
"path": "codex-responses-api-proxy"
},
"macos-x86_64": {
"regex": "^codex-responses-api-proxy-x86_64-apple-darwin\\.zst$",
"path": "codex-responses-api-proxy"
},
"linux-x86_64": {
"regex": "^codex-responses-api-proxy-x86_64-unknown-linux-musl\\.zst$",
"path": "codex-responses-api-proxy"
},
"linux-aarch64": {
"regex": "^codex-responses-api-proxy-aarch64-unknown-linux-musl\\.zst$",
"path": "codex-responses-api-proxy"
},
"windows-x86_64": {
"regex": "^codex-responses-api-proxy-x86_64-pc-windows-msvc\\.exe\\.zst$",
"path": "codex-responses-api-proxy.exe"
},
"windows-aarch64": {
"regex": "^codex-responses-api-proxy-aarch64-pc-windows-msvc\\.exe\\.zst$",
"path": "codex-responses-api-proxy.exe"
}
}
}
}
}
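The manifest above pairs each platform with a regex over release asset names and the path the binary should be installed as. A rough sketch of how such a manifest could be resolved against a release's assets — an illustration only, not DotSlash's actual implementation:

```
type PlatformEntry = { regex: string; path: string };
type Manifest = Record<string, { platforms: Record<string, PlatformEntry> }>;

// Given a tool name, a platform key like "macos-aarch64", and the release's
// asset names, find the artifact to download and the name to install it as.
function resolveAsset(
  manifest: Manifest,
  tool: string,
  platform: string,
  assetNames: string[],
): { asset: string; installAs: string } {
  const entry = manifest[tool]?.platforms[platform];
  if (!entry) throw new Error(`no ${tool} entry for ${platform}`);
  const re = new RegExp(entry.regex);
  const asset = assetNames.find((name) => re.test(name));
  if (!asset) throw new Error(`no release asset matches ${entry.regex}`);
  return { asset, installAs: entry.path };
}
```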

View File

@@ -1,7 +1,7 @@
name: ci
on:
pull_request: { branches: [main] }
pull_request: {}
push: { branches: [main] }
jobs:
@@ -31,6 +31,7 @@ jobs:
- uses: facebook/install-dotslash@v2
- name: Stage npm package
id: stage_npm_package
env:
GH_TOKEN: ${{ github.token }}
run: |
@@ -40,13 +41,13 @@ jobs:
python3 ./codex-cli/scripts/build_npm_package.py \
--release-version "$CODEX_VERSION" \
--pack-output "$PACK_OUTPUT"
echo "PACK_OUTPUT=$PACK_OUTPUT" >> "$GITHUB_ENV"
echo "pack_output=$PACK_OUTPUT" >> "$GITHUB_OUTPUT"
- name: Upload staged npm package artifact
uses: actions/upload-artifact@v4
with:
name: codex-npm-staging
path: ${{ env.PACK_OUTPUT }}
path: ${{ steps.stage_npm_package.outputs.pack_output }}
- name: Ensure root README.md contains only ASCII and certain Unicode code points
run: ./scripts/asciicheck.py README.md

View File

@@ -97,7 +97,7 @@ jobs:
sudo apt install -y musl-tools pkg-config
- name: Cargo build
run: cargo build --target ${{ matrix.target }} --release --bin codex
run: cargo build --target ${{ matrix.target }} --release --bin codex --bin codex-responses-api-proxy --bin codex-exec
- name: Stage artifacts
shell: bash
@@ -107,8 +107,10 @@ jobs:
if [[ "${{ matrix.runner }}" == windows* ]]; then
cp target/${{ matrix.target }}/release/codex.exe "$dest/codex-${{ matrix.target }}.exe"
cp target/${{ matrix.target }}/release/codex-responses-api-proxy.exe "$dest/codex-responses-api-proxy-${{ matrix.target }}.exe"
else
cp target/${{ matrix.target }}/release/codex "$dest/codex-${{ matrix.target }}"
cp target/${{ matrix.target }}/release/codex-responses-api-proxy "$dest/codex-responses-api-proxy-${{ matrix.target }}"
fi
- if: ${{ matrix.runner == 'windows-11-arm' }}
@@ -216,17 +218,30 @@ jobs:
# build_npm_package.py requires DotSlash when staging releases.
- uses: facebook/install-dotslash@v2
- name: Stage npm package
- name: Stage codex CLI npm package
env:
GH_TOKEN: ${{ github.token }}
run: |
set -euo pipefail
TMP_DIR="${RUNNER_TEMP}/npm-stage"
./codex-cli/scripts/build_npm_package.py \
--package codex \
--release-version "${{ steps.release_name.outputs.name }}" \
--staging-dir "${TMP_DIR}" \
--pack-output "${GITHUB_WORKSPACE}/dist/npm/codex-npm-${{ steps.release_name.outputs.name }}.tgz"
- name: Stage responses API proxy npm package
env:
GH_TOKEN: ${{ github.token }}
run: |
set -euo pipefail
TMP_DIR="${RUNNER_TEMP}/npm-stage-responses"
./codex-cli/scripts/build_npm_package.py \
--package codex-responses-api-proxy \
--release-version "${{ steps.release_name.outputs.name }}" \
--staging-dir "${TMP_DIR}" \
--pack-output "${GITHUB_WORKSPACE}/dist/npm/codex-responses-api-proxy-npm-${{ steps.release_name.outputs.name }}.tgz"
- name: Create GitHub Release
uses: softprops/action-gh-release@v2
with:
@@ -269,7 +284,7 @@ jobs:
- name: Update npm
run: npm install -g npm@latest
- name: Download npm tarball from release
- name: Download npm tarballs from release
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
@@ -281,6 +296,10 @@ jobs:
--repo "${GITHUB_REPOSITORY}" \
--pattern "codex-npm-${version}.tgz" \
--dir dist/npm
gh release download "$tag" \
--repo "${GITHUB_REPOSITORY}" \
--pattern "codex-responses-api-proxy-npm-${version}.tgz" \
--dir dist/npm
# No NODE_AUTH_TOKEN needed because we use OIDC.
- name: Publish to npm
@@ -294,7 +313,14 @@ jobs:
tag_args+=(--tag "${NPM_TAG}")
fi
npm publish "${GITHUB_WORKSPACE}/dist/npm/codex-npm-${VERSION}.tgz" "${tag_args[@]}"
tarballs=(
"codex-npm-${VERSION}.tgz"
"codex-responses-api-proxy-npm-${VERSION}.tgz"
)
for tarball in "${tarballs[@]}"; do
npm publish "${GITHUB_WORKSPACE}/dist/npm/${tarball}" "${tag_args[@]}"
done
update-branch:
name: Update latest-alpha-cli branch

View File

@@ -1,4 +1,3 @@
<h1 align="center">OpenAI Codex CLI</h1>
<p align="center"><code>npm i -g @openai/codex</code><br />or <code>brew install codex</code></p>
@@ -102,4 +101,3 @@ Codex CLI supports a rich set of configuration options, with preferences stored
## License
This repository is licensed under the [Apache-2.0 License](LICENSE).

View File

@@ -208,7 +208,7 @@ The hardening mechanism Codex uses depends on your OS:
| Requirement | Details |
| --------------------------- | --------------------------------------------------------------- |
| Operating systems | macOS 12+, Ubuntu 20.04+/Debian 10+, or Windows 11 **via WSL2** |
| Node.js | **22 or newer** (LTS recommended) |
| Node.js | **16 or newer** (Node 20 LTS recommended) |
| Git (optional, recommended) | 2.23+ for built-in PR helpers |
| RAM | 4-GB minimum (8-GB recommended) |
@@ -513,7 +513,7 @@ Codex runs model-generated commands in a sandbox. If a proposed command or file
<details>
<summary>Does it work on Windows?</summary>
Not directly. It requires [Windows Subsystem for Linux (WSL2)](https://learn.microsoft.com/en-us/windows/wsl/install) - Codex has been tested on macOS and Linux with Node 22.
Not directly. It requires [Windows Subsystem for Linux (WSL2)](https://learn.microsoft.com/en-us/windows/wsl/install) - Codex is regularly tested on macOS and Linux with Node 20+, and also supports Node 16.
</details>

View File

@@ -1,6 +1,7 @@
#!/usr/bin/env node
// Unified entry point for the Codex CLI.
import { spawn } from "node:child_process";
import { existsSync } from "fs";
import path from "path";
import { fileURLToPath } from "url";
@@ -68,7 +69,6 @@ const binaryPath = path.join(archRoot, "codex", codexBinaryName);
// executing. This allows us to forward those signals to the child process
// and guarantees that when either the child terminates or the parent
// receives a fatal signal, both processes exit in a predictable manner.
const { spawn } = await import("child_process");
function getUpdatedPath(newDirs) {
const pathSep = process.platform === "win32" ? ";" : ":";

View File

@@ -11,7 +11,7 @@
"codex": "bin/codex.js"
},
"engines": {
"node": ">=20"
"node": ">=16"
}
}
}

View File

@@ -7,7 +7,7 @@
},
"type": "module",
"engines": {
"node": ">=20"
"node": ">=16"
},
"files": [
"bin",

View File

@@ -13,6 +13,7 @@ from pathlib import Path
SCRIPT_DIR = Path(__file__).resolve().parent
CODEX_CLI_ROOT = SCRIPT_DIR.parent
REPO_ROOT = CODEX_CLI_ROOT.parent
RESPONSES_API_PROXY_NPM_ROOT = REPO_ROOT / "codex-rs" / "responses-api-proxy" / "npm"
GITHUB_REPO = "openai/codex"
# The docs are not clear on what the expected value/format of
@@ -23,6 +24,12 @@ WORKFLOW_NAME = ".github/workflows/rust-release.yml"
def parse_args() -> argparse.Namespace:
parser = argparse.ArgumentParser(description="Build or stage the Codex CLI npm package.")
parser.add_argument(
"--package",
choices=("codex", "codex-responses-api-proxy"),
default="codex",
help="Which npm package to stage (default: codex).",
)
parser.add_argument(
"--version",
help="Version number to write to package.json inside the staged package.",
@@ -63,6 +70,7 @@ def parse_args() -> argparse.Namespace:
def main() -> int:
args = parse_args()
package = args.package
version = args.version
release_version = args.release_version
if release_version:
@@ -76,7 +84,7 @@ def main() -> int:
staging_dir, created_temp = prepare_staging_dir(args.staging_dir)
try:
stage_sources(staging_dir, version)
stage_sources(staging_dir, version, package)
workflow_url = args.workflow_url
resolved_head_sha: str | None = None
@@ -100,16 +108,23 @@ def main() -> int:
if not workflow_url:
raise RuntimeError("Unable to determine workflow URL for native binaries.")
install_native_binaries(staging_dir, workflow_url)
install_native_binaries(staging_dir, workflow_url, package)
if release_version:
staging_dir_str = str(staging_dir)
print(
f"Staged version {version} for release in {staging_dir_str}\n\n"
"Verify the CLI:\n"
f" node {staging_dir_str}/bin/codex.js --version\n"
f" node {staging_dir_str}/bin/codex.js --help\n\n"
)
if package == "codex":
print(
f"Staged version {version} for release in {staging_dir_str}\n\n"
"Verify the CLI:\n"
f" node {staging_dir_str}/bin/codex.js --version\n"
f" node {staging_dir_str}/bin/codex.js --help\n\n"
)
else:
print(
f"Staged version {version} for release in {staging_dir_str}\n\n"
"Verify the responses API proxy:\n"
f" node {staging_dir_str}/bin/codex-responses-api-proxy.js --help\n\n"
)
else:
print(f"Staged package in {staging_dir}")
@@ -136,20 +151,34 @@ def prepare_staging_dir(staging_dir: Path | None) -> tuple[Path, bool]:
return temp_dir, True
def stage_sources(staging_dir: Path, version: str) -> None:
def stage_sources(staging_dir: Path, version: str, package: str) -> None:
bin_dir = staging_dir / "bin"
bin_dir.mkdir(parents=True, exist_ok=True)
shutil.copy2(CODEX_CLI_ROOT / "bin" / "codex.js", bin_dir / "codex.js")
rg_manifest = CODEX_CLI_ROOT / "bin" / "rg"
if rg_manifest.exists():
shutil.copy2(rg_manifest, bin_dir / "rg")
if package == "codex":
shutil.copy2(CODEX_CLI_ROOT / "bin" / "codex.js", bin_dir / "codex.js")
rg_manifest = CODEX_CLI_ROOT / "bin" / "rg"
if rg_manifest.exists():
shutil.copy2(rg_manifest, bin_dir / "rg")
readme_src = REPO_ROOT / "README.md"
if readme_src.exists():
shutil.copy2(readme_src, staging_dir / "README.md")
readme_src = REPO_ROOT / "README.md"
if readme_src.exists():
shutil.copy2(readme_src, staging_dir / "README.md")
with open(CODEX_CLI_ROOT / "package.json", "r", encoding="utf-8") as fh:
package_json_path = CODEX_CLI_ROOT / "package.json"
elif package == "codex-responses-api-proxy":
launcher_src = RESPONSES_API_PROXY_NPM_ROOT / "bin" / "codex-responses-api-proxy.js"
shutil.copy2(launcher_src, bin_dir / "codex-responses-api-proxy.js")
readme_src = RESPONSES_API_PROXY_NPM_ROOT / "README.md"
if readme_src.exists():
shutil.copy2(readme_src, staging_dir / "README.md")
package_json_path = RESPONSES_API_PROXY_NPM_ROOT / "package.json"
else:
raise RuntimeError(f"Unknown package '{package}'.")
with open(package_json_path, "r", encoding="utf-8") as fh:
package_json = json.load(fh)
package_json["version"] = version
@@ -158,10 +187,19 @@ def stage_sources(staging_dir: Path, version: str) -> None:
out.write("\n")
def install_native_binaries(staging_dir: Path, workflow_url: str | None) -> None:
cmd = ["./scripts/install_native_deps.py"]
if workflow_url:
cmd.extend(["--workflow-url", workflow_url])
def install_native_binaries(staging_dir: Path, workflow_url: str, package: str) -> None:
package_components = {
"codex": ["codex", "rg"],
"codex-responses-api-proxy": ["codex-responses-api-proxy"],
}
components = package_components.get(package)
if components is None:
raise RuntimeError(f"Unknown package '{package}'.")
cmd = ["./scripts/install_native_deps.py", "--workflow-url", workflow_url]
for component in components:
cmd.extend(["--component", component])
cmd.append(str(staging_dir))
subprocess.check_call(cmd, cwd=CODEX_CLI_ROOT)

View File

@@ -9,6 +9,7 @@ import subprocess
import tarfile
import tempfile
import zipfile
from dataclasses import dataclass
from concurrent.futures import ThreadPoolExecutor, as_completed
from pathlib import Path
from typing import Iterable, Sequence
@@ -20,7 +21,7 @@ CODEX_CLI_ROOT = SCRIPT_DIR.parent
DEFAULT_WORKFLOW_URL = "https://github.com/openai/codex/actions/runs/17952349351" # rust-v0.40.0
VENDOR_DIR_NAME = "vendor"
RG_MANIFEST = CODEX_CLI_ROOT / "bin" / "rg"
CODEX_TARGETS = (
BINARY_TARGETS = (
"x86_64-unknown-linux-musl",
"aarch64-unknown-linux-musl",
"x86_64-apple-darwin",
@@ -29,6 +30,27 @@ CODEX_TARGETS = (
"aarch64-pc-windows-msvc",
)
@dataclass(frozen=True)
class BinaryComponent:
artifact_prefix: str # matches the artifact filename prefix (e.g. codex-<target>.zst)
dest_dir: str # directory under vendor/<target>/ where the binary is installed
binary_basename: str # executable name inside dest_dir (before optional .exe)
BINARY_COMPONENTS = {
"codex": BinaryComponent(
artifact_prefix="codex",
dest_dir="codex",
binary_basename="codex",
),
"codex-responses-api-proxy": BinaryComponent(
artifact_prefix="codex-responses-api-proxy",
dest_dir="codex-responses-api-proxy",
binary_basename="codex-responses-api-proxy",
),
}
RG_TARGET_PLATFORM_PAIRS: list[tuple[str, str]] = [
("x86_64-unknown-linux-musl", "linux-x86_64"),
("aarch64-unknown-linux-musl", "linux-aarch64"),
@@ -50,6 +72,16 @@ def parse_args() -> argparse.Namespace:
"known good run when omitted."
),
)
parser.add_argument(
"--component",
dest="components",
action="append",
choices=tuple(list(BINARY_COMPONENTS) + ["rg"]),
help=(
"Limit installation to the specified components."
" May be repeated. Defaults to 'codex' and 'rg'."
),
)
parser.add_argument(
"root",
nargs="?",
@@ -69,18 +101,28 @@ def main() -> int:
vendor_dir = codex_cli_root / VENDOR_DIR_NAME
vendor_dir.mkdir(parents=True, exist_ok=True)
components = args.components or ["codex", "rg"]
workflow_url = (args.workflow_url or DEFAULT_WORKFLOW_URL).strip()
if not workflow_url:
workflow_url = DEFAULT_WORKFLOW_URL
workflow_id = workflow_url.rstrip("/").split("/")[-1]
print(f"Downloading native artifacts from workflow {workflow_id}...")
with tempfile.TemporaryDirectory(prefix="codex-native-artifacts-") as artifacts_dir_str:
artifacts_dir = Path(artifacts_dir_str)
_download_artifacts(workflow_id, artifacts_dir)
install_codex_binaries(artifacts_dir, vendor_dir, CODEX_TARGETS)
install_binary_components(
artifacts_dir,
vendor_dir,
BINARY_TARGETS,
[name for name in components if name in BINARY_COMPONENTS],
)
fetch_rg(vendor_dir, DEFAULT_RG_TARGETS, manifest_path=RG_MANIFEST)
if "rg" in components:
print("Fetching ripgrep binaries...")
fetch_rg(vendor_dir, DEFAULT_RG_TARGETS, manifest_path=RG_MANIFEST)
print(f"Installed native dependencies into {vendor_dir}")
return 0
@@ -124,6 +166,8 @@ def fetch_rg(
results: dict[str, Path] = {}
max_workers = min(len(task_configs), max(1, (os.cpu_count() or 1)))
print("Installing ripgrep binaries for targets: " + ", ".join(targets))
with ThreadPoolExecutor(max_workers=max_workers) as executor:
future_map = {
executor.submit(
@@ -140,6 +184,7 @@ def fetch_rg(
for future in as_completed(future_map):
target = future_map[future]
results[target] = future.result()
print(f" installed ripgrep for {target}")
return [results[target] for target in targets]
@@ -158,40 +203,60 @@ def _download_artifacts(workflow_id: str, dest_dir: Path) -> None:
subprocess.check_call(cmd)
def install_codex_binaries(
artifacts_dir: Path, vendor_dir: Path, targets: Iterable[str]
) -> list[Path]:
def install_binary_components(
artifacts_dir: Path,
vendor_dir: Path,
targets: Iterable[str],
component_names: Sequence[str],
) -> None:
selected_components = [BINARY_COMPONENTS[name] for name in component_names if name in BINARY_COMPONENTS]
if not selected_components:
return
targets = list(targets)
if not targets:
return []
return
results: dict[str, Path] = {}
max_workers = min(len(targets), max(1, (os.cpu_count() or 1)))
with ThreadPoolExecutor(max_workers=max_workers) as executor:
future_map = {
executor.submit(_install_single_codex_binary, artifacts_dir, vendor_dir, target): target
for target in targets
}
for future in as_completed(future_map):
target = future_map[future]
results[target] = future.result()
return [results[target] for target in targets]
for component in selected_components:
print(
f"Installing {component.binary_basename} binaries for targets: "
+ ", ".join(targets)
)
max_workers = min(len(targets), max(1, (os.cpu_count() or 1)))
with ThreadPoolExecutor(max_workers=max_workers) as executor:
futures = {
executor.submit(
_install_single_binary,
artifacts_dir,
vendor_dir,
target,
component,
): target
for target in targets
}
for future in as_completed(futures):
installed_path = future.result()
print(f" installed {installed_path}")
def _install_single_codex_binary(artifacts_dir: Path, vendor_dir: Path, target: str) -> Path:
def _install_single_binary(
artifacts_dir: Path,
vendor_dir: Path,
target: str,
component: BinaryComponent,
) -> Path:
artifact_subdir = artifacts_dir / target
archive_name = _archive_name_for_target(target)
archive_name = _archive_name_for_target(component.artifact_prefix, target)
archive_path = artifact_subdir / archive_name
if not archive_path.exists():
raise FileNotFoundError(f"Expected artifact not found: {archive_path}")
dest_dir = vendor_dir / target / "codex"
dest_dir = vendor_dir / target / component.dest_dir
dest_dir.mkdir(parents=True, exist_ok=True)
binary_name = "codex.exe" if "windows" in target else "codex"
binary_name = (
f"{component.binary_basename}.exe" if "windows" in target else component.binary_basename
)
dest = dest_dir / binary_name
dest.unlink(missing_ok=True)
extract_archive(archive_path, "zst", None, dest)
@@ -200,10 +265,10 @@ def _install_single_codex_binary(artifacts_dir: Path, vendor_dir: Path, target:
return dest
def _archive_name_for_target(target: str) -> str:
def _archive_name_for_target(artifact_prefix: str, target: str) -> str:
if "windows" in target:
return f"codex-{target}.exe.zst"
return f"codex-{target}.zst"
return f"{artifact_prefix}-{target}.exe.zst"
return f"{artifact_prefix}-{target}.zst"
def _fetch_single_rg(

codex-rs/Cargo.lock (generated, 95 changed lines)
View File

@@ -333,6 +333,54 @@ version = "1.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c08606f8c3cbf4ce6ec8e28fb0014a2c086708fe954eaa885384a6165172e7e8"
[[package]]
name = "axum"
version = "0.8.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "021e862c184ae977658b36c4500f7feac3221ca5da43e3f25bd04ab6c79a29b5"
dependencies = [
"axum-core",
"bytes",
"futures-util",
"http",
"http-body",
"http-body-util",
"hyper",
"hyper-util",
"itoa",
"matchit",
"memchr",
"mime",
"percent-encoding",
"pin-project-lite",
"rustversion",
"serde",
"sync_wrapper",
"tokio",
"tower",
"tower-layer",
"tower-service",
]
[[package]]
name = "axum-core"
version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "68464cd0412f486726fb3373129ef5d2993f90c34bc2bc1c1e9943b2f4fc7ca6"
dependencies = [
"bytes",
"futures-core",
"http",
"http-body",
"http-body-util",
"mime",
"pin-project-lite",
"rustversion",
"sync_wrapper",
"tower-layer",
"tower-service",
]
[[package]]
name = "backtrace"
version = "0.3.75"
@@ -644,12 +692,11 @@ dependencies = [
"codex-exec",
"codex-login",
"codex-mcp-server",
"codex-process-hardening",
"codex-protocol",
"codex-protocol-ts",
"codex-responses-api-proxy",
"codex-tui",
"ctor 0.5.0",
"libc",
"owo-colors",
"predicates",
"pretty_assertions",
@@ -901,6 +948,13 @@ dependencies = [
"wiremock",
]
[[package]]
name = "codex-process-hardening"
version = "0.0.0"
dependencies = [
"libc",
]
[[package]]
name = "codex-protocol"
version = "0.0.0"
@@ -941,13 +995,13 @@ version = "0.0.0"
dependencies = [
"anyhow",
"clap",
"codex-arg0",
"codex-process-hardening",
"ctor 0.5.0",
"libc",
"reqwest",
"serde",
"serde_json",
"tiny_http",
"tokio",
"zeroize",
]
@@ -956,8 +1010,11 @@ name = "codex-rmcp-client"
version = "0.0.0"
dependencies = [
"anyhow",
"axum",
"futures",
"mcp-types",
"pretty_assertions",
"reqwest",
"rmcp",
"serde",
"serde_json",
@@ -2883,6 +2940,12 @@ dependencies = [
"regex-automata",
]
[[package]]
name = "matchit"
version = "0.8.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "47e1ffaa40ddd1f3ed91f717a33c8c0ee23fff369e3aa8772b9605cc1d22f4c3"
[[package]]
name = "mcp-types"
version = "0.0.0"
@@ -3669,7 +3732,7 @@ dependencies = [
"once_cell",
"socket2",
"tracing",
"windows-sys 0.52.0",
"windows-sys 0.60.2",
]
[[package]]
@@ -3907,20 +3970,29 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "534fd1cd0601e798ac30545ff2b7f4a62c6f14edd4aaed1cc5eb1e85f69f09af"
dependencies = [
"base64",
"bytes",
"chrono",
"futures",
"http",
"http-body",
"http-body-util",
"paste",
"pin-project-lite",
"process-wrap",
"rand",
"reqwest",
"rmcp-macros",
"schemars 1.0.4",
"serde",
"serde_json",
"sse-stream",
"thiserror 2.0.16",
"tokio",
"tokio-stream",
"tokio-util",
"tower-service",
"tracing",
"uuid",
]
[[package]]
@@ -4468,6 +4540,19 @@ dependencies = [
"windows-sys 0.59.0",
]
[[package]]
name = "sse-stream"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "eb4dc4d33c68ec1f27d386b5610a351922656e1fdf5c05bbaad930cd1519479a"
dependencies = [
"bytes",
"futures-util",
"http-body",
"http-body-util",
"pin-project-lite",
]
[[package]]
name = "stable_deref_trait"
version = "1.2.0"

View File

@@ -16,6 +16,7 @@ members = [
"mcp-server",
"mcp-types",
"ollama",
"process-hardening",
"protocol",
"protocol-ts",
"rmcp-client",
@@ -49,10 +50,10 @@ codex-login = { path = "login" }
codex-mcp-client = { path = "mcp-client" }
codex-mcp-server = { path = "mcp-server" }
codex-ollama = { path = "ollama" }
codex-process-hardening = { path = "process-hardening" }
codex-protocol = { path = "protocol" }
codex-rmcp-client = { path = "rmcp-client" }
codex-protocol-ts = { path = "protocol-ts" }
codex-responses-api-proxy = { path = "responses-api-proxy" }
codex-rmcp-client = { path = "rmcp-client" }
codex-tui = { path = "tui" }
codex-utils-readiness = { path = "utils/readiness" }
core_test_support = { path = "core/tests/common" }
@@ -83,8 +84,8 @@ dirs = "6"
dotenvy = "0.15.7"
env-flags = "0.1.1"
env_logger = "0.11.5"
eventsource-stream = "0.2.3"
escargot = "0.5"
eventsource-stream = "0.2.3"
futures = "0.3"
icu_decimal = "2.0.0"
icu_locale_core = "2.0.0"

View File

@@ -4,18 +4,18 @@ We provide Codex CLI as a standalone, native executable to ensure a zero-depende
## Installing Codex
Today, the easiest way to install Codex is via `npm`, though we plan to publish Codex to other package managers soon.
Today, the easiest way to install Codex is via `npm`:
```shell
npm i -g @openai/codex@native
npm i -g @openai/codex
codex
```
You can also download a platform-specific release directly from our [GitHub Releases](https://github.com/openai/codex/releases).
You can also install via Homebrew (`brew install codex`) or download a platform-specific release directly from our [GitHub Releases](https://github.com/openai/codex/releases).
## What's new in the Rust CLI
While we are [working to close the gap between the TypeScript and Rust implementations of Codex CLI](https://github.com/openai/codex/issues/1262), note that the Rust CLI has a number of features that the TypeScript CLI does not!
The Rust implementation is now the maintained Codex CLI and serves as the default experience. It includes a number of features that the legacy TypeScript CLI never supported.
### Config

View File

@@ -25,9 +25,9 @@ codex-core = { workspace = true }
codex-exec = { workspace = true }
codex-login = { workspace = true }
codex-mcp-server = { workspace = true }
codex-process-hardening = { workspace = true }
codex-protocol = { workspace = true }
codex-protocol-ts = { workspace = true }
codex-responses-api-proxy = { workspace = true }
codex-tui = { workspace = true }
ctor = { workspace = true }
owo-colors = { workspace = true }
@@ -43,15 +43,6 @@ tokio = { workspace = true, features = [
tracing = { workspace = true }
tracing-subscriber = { workspace = true }
[target.'cfg(target_os = "linux")'.dependencies]
libc = { workspace = true }
[target.'cfg(target_os = "android")'.dependencies]
libc = { workspace = true }
[target.'cfg(target_os = "macos")'.dependencies]
libc = { workspace = true }
[dev-dependencies]
assert_cmd = { workspace = true }
predicates = { workspace = true }

View File

@@ -1,4 +1,3 @@
use anyhow::Context;
use clap::CommandFactory;
use clap::Parser;
use clap_complete::Shell;
@@ -15,7 +14,6 @@ use codex_cli::login::run_logout;
use codex_cli::proto;
use codex_common::CliConfigOverrides;
use codex_exec::Cli as ExecCli;
use codex_responses_api_proxy::Args as ResponsesApiProxyArgs;
use codex_tui::AppExitInfo;
use codex_tui::Cli as TuiCli;
use owo_colors::OwoColorize;
@@ -23,7 +21,6 @@ use std::path::PathBuf;
use supports_color::Stream;
mod mcp_cmd;
mod pre_main_hardening;
use crate::mcp_cmd::McpCli;
use crate::proto::ProtoCli;
@@ -88,10 +85,6 @@ enum Subcommand {
/// Internal: generate TypeScript protocol bindings.
#[clap(hide = true)]
GenerateTs(GenerateTsCommand),
/// Internal: run the responses API proxy.
#[clap(hide = true)]
ResponsesApiProxy(ResponsesApiProxyArgs),
}
#[derive(Debug, Parser)]
@@ -188,7 +181,7 @@ fn format_exit_messages(exit_info: AppExitInfo, color_enabled: bool) -> Vec<Stri
} else {
resume_cmd
};
lines.push(format!("To continue this session, run {command}."));
lines.push(format!("To continue this session, run {command}"));
}
lines
@@ -213,14 +206,7 @@ fn pre_main_hardening() {
};
if secure_mode == "1" {
#[cfg(any(target_os = "linux", target_os = "android"))]
crate::pre_main_hardening::pre_main_hardening_linux();
#[cfg(target_os = "macos")]
crate::pre_main_hardening::pre_main_hardening_macos();
#[cfg(windows)]
crate::pre_main_hardening::pre_main_hardening_windows();
codex_process_hardening::pre_main_hardening();
}
// Always clear this env var so child processes don't inherit it.
@@ -347,11 +333,6 @@ async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()
Some(Subcommand::GenerateTs(gen_cli)) => {
codex_protocol_ts::generate_ts(&gen_cli.out_dir, gen_cli.prettier.as_deref())?;
}
Some(Subcommand::ResponsesApiProxy(args)) => {
tokio::task::spawn_blocking(move || codex_responses_api_proxy::run_main(args))
.await
.context("responses-api-proxy blocking task panicked")??;
}
}
Ok(())
@@ -500,7 +481,7 @@ mod tests {
lines,
vec![
"Token usage: total=2 input=0 output=2".to_string(),
"To continue this session, run codex resume 123e4567-e89b-12d3-a456-426614174000."
"To continue this session, run codex resume 123e4567-e89b-12d3-a456-426614174000"
.to_string(),
]
);

View File

@@ -1,4 +1,3 @@
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::path::PathBuf;
@@ -13,6 +12,7 @@ use codex_core::config::find_codex_home;
use codex_core::config::load_global_mcp_servers;
use codex_core::config::write_global_mcp_servers;
use codex_core::config_types::McpServerConfig;
use codex_core::config_types::McpServerTransportConfig;
/// [experimental] Launch Codex as an MCP server or manage configured MCP servers.
///
@@ -145,9 +145,11 @@ fn run_add(config_overrides: &CliConfigOverrides, add_args: AddArgs) -> Result<(
.with_context(|| format!("failed to load MCP servers from {}", codex_home.display()))?;
let new_entry = McpServerConfig {
command: command_bin,
args: command_args,
env: env_map,
transport: McpServerTransportConfig::Stdio {
command: command_bin,
args: command_args,
env: env_map,
},
startup_timeout_sec: None,
tool_timeout_sec: None,
};
@@ -201,16 +203,25 @@ fn run_list(config_overrides: &CliConfigOverrides, list_args: ListArgs) -> Resul
let json_entries: Vec<_> = entries
.into_iter()
.map(|(name, cfg)| {
let env = cfg.env.as_ref().map(|env| {
env.iter()
.map(|(k, v)| (k.clone(), v.clone()))
.collect::<BTreeMap<_, _>>()
});
let transport = match &cfg.transport {
McpServerTransportConfig::Stdio { command, args, env } => serde_json::json!({
"type": "stdio",
"command": command,
"args": args,
"env": env,
}),
McpServerTransportConfig::StreamableHttp { url, bearer_token } => {
serde_json::json!({
"type": "streamable_http",
"url": url,
"bearer_token": bearer_token,
})
}
};
serde_json::json!({
"name": name,
"command": cfg.command,
"args": cfg.args,
"env": env,
"transport": transport,
"startup_timeout_sec": cfg
.startup_timeout_sec
.map(|timeout| timeout.as_secs_f64()),
@@ -230,62 +241,111 @@ fn run_list(config_overrides: &CliConfigOverrides, list_args: ListArgs) -> Resul
return Ok(());
}
let mut rows: Vec<[String; 4]> = Vec::new();
let mut stdio_rows: Vec<[String; 4]> = Vec::new();
let mut http_rows: Vec<[String; 3]> = Vec::new();
for (name, cfg) in entries {
let args = if cfg.args.is_empty() {
"-".to_string()
} else {
cfg.args.join(" ")
};
let env = match cfg.env.as_ref() {
None => "-".to_string(),
Some(map) if map.is_empty() => "-".to_string(),
Some(map) => {
let mut pairs: Vec<_> = map.iter().collect();
pairs.sort_by(|(a, _), (b, _)| a.cmp(b));
pairs
.into_iter()
.map(|(k, v)| format!("{k}={v}"))
.collect::<Vec<_>>()
.join(", ")
match &cfg.transport {
McpServerTransportConfig::Stdio { command, args, env } => {
let args_display = if args.is_empty() {
"-".to_string()
} else {
args.join(" ")
};
let env_display = match env.as_ref() {
None => "-".to_string(),
Some(map) if map.is_empty() => "-".to_string(),
Some(map) => {
let mut pairs: Vec<_> = map.iter().collect();
pairs.sort_by(|(a, _), (b, _)| a.cmp(b));
pairs
.into_iter()
.map(|(k, v)| format!("{k}={v}"))
.collect::<Vec<_>>()
.join(", ")
}
};
stdio_rows.push([name.clone(), command.clone(), args_display, env_display]);
}
McpServerTransportConfig::StreamableHttp { url, bearer_token } => {
let has_bearer = if bearer_token.is_some() {
"True"
} else {
"False"
};
http_rows.push([name.clone(), url.clone(), has_bearer.into()]);
}
};
rows.push([name.clone(), cfg.command.clone(), args, env]);
}
let mut widths = ["Name".len(), "Command".len(), "Args".len(), "Env".len()];
for row in &rows {
for (i, cell) in row.iter().enumerate() {
widths[i] = widths[i].max(cell.len());
}
}
println!(
"{:<name_w$} {:<cmd_w$} {:<args_w$} {:<env_w$}",
"Name",
"Command",
"Args",
"Env",
name_w = widths[0],
cmd_w = widths[1],
args_w = widths[2],
env_w = widths[3],
);
if !stdio_rows.is_empty() {
let mut widths = ["Name".len(), "Command".len(), "Args".len(), "Env".len()];
for row in &stdio_rows {
for (i, cell) in row.iter().enumerate() {
widths[i] = widths[i].max(cell.len());
}
}
for row in rows {
println!(
"{:<name_w$} {:<cmd_w$} {:<args_w$} {:<env_w$}",
row[0],
row[1],
row[2],
row[3],
"Name",
"Command",
"Args",
"Env",
name_w = widths[0],
cmd_w = widths[1],
args_w = widths[2],
env_w = widths[3],
);
for row in &stdio_rows {
println!(
"{:<name_w$} {:<cmd_w$} {:<args_w$} {:<env_w$}",
row[0],
row[1],
row[2],
row[3],
name_w = widths[0],
cmd_w = widths[1],
args_w = widths[2],
env_w = widths[3],
);
}
}
if !stdio_rows.is_empty() && !http_rows.is_empty() {
println!();
}
if !http_rows.is_empty() {
let mut widths = ["Name".len(), "Url".len(), "Has Bearer Token".len()];
for row in &http_rows {
for (i, cell) in row.iter().enumerate() {
widths[i] = widths[i].max(cell.len());
}
}
println!(
"{:<name_w$} {:<url_w$} {:<token_w$}",
"Name",
"Url",
"Has Bearer Token",
name_w = widths[0],
url_w = widths[1],
token_w = widths[2],
);
for row in &http_rows {
println!(
"{:<name_w$} {:<url_w$} {:<token_w$}",
row[0],
row[1],
row[2],
name_w = widths[0],
url_w = widths[1],
token_w = widths[2],
);
}
}
Ok(())
@@ -301,16 +361,22 @@ fn run_get(config_overrides: &CliConfigOverrides, get_args: GetArgs) -> Result<(
};
if get_args.json {
let env = server.env.as_ref().map(|env| {
env.iter()
.map(|(k, v)| (k.clone(), v.clone()))
.collect::<BTreeMap<_, _>>()
});
let transport = match &server.transport {
McpServerTransportConfig::Stdio { command, args, env } => serde_json::json!({
"type": "stdio",
"command": command,
"args": args,
"env": env,
}),
McpServerTransportConfig::StreamableHttp { url, bearer_token } => serde_json::json!({
"type": "streamable_http",
"url": url,
"bearer_token": bearer_token,
}),
};
let output = serde_json::to_string_pretty(&serde_json::json!({
"name": get_args.name,
"command": server.command,
"args": server.args,
"env": env,
"transport": transport,
"startup_timeout_sec": server
.startup_timeout_sec
.map(|timeout| timeout.as_secs_f64()),
@@ -323,27 +389,38 @@ fn run_get(config_overrides: &CliConfigOverrides, get_args: GetArgs) -> Result<(
}
println!("{}", get_args.name);
println!(" command: {}", server.command);
let args = if server.args.is_empty() {
"-".to_string()
} else {
server.args.join(" ")
};
println!(" args: {args}");
let env_display = match server.env.as_ref() {
None => "-".to_string(),
Some(map) if map.is_empty() => "-".to_string(),
Some(map) => {
let mut pairs: Vec<_> = map.iter().collect();
pairs.sort_by(|(a, _), (b, _)| a.cmp(b));
pairs
.into_iter()
.map(|(k, v)| format!("{k}={v}"))
.collect::<Vec<_>>()
.join(", ")
match &server.transport {
McpServerTransportConfig::Stdio { command, args, env } => {
println!(" transport: stdio");
println!(" command: {command}");
let args_display = if args.is_empty() {
"-".to_string()
} else {
args.join(" ")
};
println!(" args: {args_display}");
let env_display = match env.as_ref() {
None => "-".to_string(),
Some(map) if map.is_empty() => "-".to_string(),
Some(map) => {
let mut pairs: Vec<_> = map.iter().collect();
pairs.sort_by(|(a, _), (b, _)| a.cmp(b));
pairs
.into_iter()
.map(|(k, v)| format!("{k}={v}"))
.collect::<Vec<_>>()
.join(", ")
}
};
println!(" env: {env_display}");
}
};
println!(" env: {env_display}");
McpServerTransportConfig::StreamableHttp { url, bearer_token } => {
println!(" transport: streamable_http");
println!(" url: {url}");
let bearer = bearer_token.as_deref().unwrap_or("-");
println!(" bearer_token: {bearer}");
}
}
if let Some(timeout) = server.startup_timeout_sec {
println!(" startup_timeout_sec: {}", timeout.as_secs_f64());
}

View File

@@ -2,6 +2,7 @@ use std::path::Path;
use anyhow::Result;
use codex_core::config::load_global_mcp_servers;
use codex_core::config_types::McpServerTransportConfig;
use predicates::str::contains;
use pretty_assertions::assert_eq;
use tempfile::TempDir;
@@ -26,9 +27,14 @@ fn add_and_remove_server_updates_global_config() -> Result<()> {
let servers = load_global_mcp_servers(codex_home.path())?;
assert_eq!(servers.len(), 1);
let docs = servers.get("docs").expect("server should exist");
assert_eq!(docs.command, "echo");
assert_eq!(docs.args, vec!["hello".to_string()]);
assert!(docs.env.is_none());
match &docs.transport {
McpServerTransportConfig::Stdio { command, args, env } => {
assert_eq!(command, "echo");
assert_eq!(args, &vec!["hello".to_string()]);
assert!(env.is_none());
}
other => panic!("unexpected transport: {other:?}"),
}
let mut remove_cmd = codex_command(codex_home.path())?;
remove_cmd
@@ -76,7 +82,10 @@ fn add_with_env_preserves_key_order_and_values() -> Result<()> {
let servers = load_global_mcp_servers(codex_home.path())?;
let envy = servers.get("envy").expect("server should exist");
let env = envy.env.as_ref().expect("env should be present");
let env = match &envy.transport {
McpServerTransportConfig::Stdio { env: Some(env), .. } => env,
other => panic!("unexpected transport: {other:?}"),
};
assert_eq!(env.len(), 2);
assert_eq!(env.get("FOO"), Some(&"bar".to_string()));

View File

@@ -4,6 +4,7 @@ use anyhow::Result;
use predicates::str::contains;
use pretty_assertions::assert_eq;
use serde_json::Value as JsonValue;
use serde_json::json;
use tempfile::TempDir;
fn codex_command(codex_home: &Path) -> Result<assert_cmd::Command> {
@@ -58,38 +59,35 @@ fn list_and_get_render_expected_output() -> Result<()> {
assert!(json_output.status.success());
let stdout = String::from_utf8(json_output.stdout)?;
let parsed: JsonValue = serde_json::from_str(&stdout)?;
let array = parsed.as_array().expect("expected array");
assert_eq!(array.len(), 1);
let entry = &array[0];
assert_eq!(entry.get("name"), Some(&JsonValue::String("docs".into())));
assert_eq!(
entry.get("command"),
Some(&JsonValue::String("docs-server".into()))
);
let args = entry
.get("args")
.and_then(|v| v.as_array())
.expect("args array");
assert_eq!(
args,
&vec![
JsonValue::String("--port".into()),
JsonValue::String("4000".into())
parsed,
json!([
{
"name": "docs",
"transport": {
"type": "stdio",
"command": "docs-server",
"args": [
"--port",
"4000"
],
"env": {
"TOKEN": "secret"
}
},
"startup_timeout_sec": null,
"tool_timeout_sec": null
}
]
)
);
let env = entry
.get("env")
.and_then(|v| v.as_object())
.expect("env map");
assert_eq!(env.get("TOKEN"), Some(&JsonValue::String("secret".into())));
let mut get_cmd = codex_command(codex_home.path())?;
let get_output = get_cmd.args(["mcp", "get", "docs"]).output()?;
assert!(get_output.status.success());
let stdout = String::from_utf8(get_output.stdout)?;
assert!(stdout.contains("docs"));
assert!(stdout.contains("transport: stdio"));
assert!(stdout.contains("command: docs-server"));
assert!(stdout.contains("args: --port 4000"));
assert!(stdout.contains("env: TOKEN=secret"));

View File

@@ -29,7 +29,7 @@ const PRESETS: &[ModelPreset] = &[
label: "gpt-5-codex medium",
description: "",
model: "gpt-5-codex",
effort: None,
effort: Some(ReasoningEffort::Medium),
},
ModelPreset {
id: "gpt-5-codex-high",

View File

@@ -559,10 +559,6 @@ fn parse_rate_limit_snapshot(headers: &HeaderMap) -> Option<RateLimitSnapshot> {
"x-codex-secondary-reset-after-seconds",
);
if primary.is_none() && secondary.is_none() {
return None;
}
Some(RateLimitSnapshot { primary, secondary })
}

View File

@@ -1,6 +1,7 @@
use crate::config_profile::ConfigProfile;
use crate::config_types::History;
use crate::config_types::McpServerConfig;
use crate::config_types::McpServerTransportConfig;
use crate::config_types::Notifications;
use crate::config_types::ReasoningSummaryFormat;
use crate::config_types::SandboxWorkspaceWrite;
@@ -314,27 +315,37 @@ pub fn write_global_mcp_servers(
for (name, config) in servers {
let mut entry = TomlTable::new();
entry.set_implicit(false);
entry["command"] = toml_edit::value(config.command.clone());
match &config.transport {
McpServerTransportConfig::Stdio { command, args, env } => {
entry["command"] = toml_edit::value(command.clone());
if !config.args.is_empty() {
let mut args = TomlArray::new();
for arg in &config.args {
args.push(arg.clone());
}
entry["args"] = TomlItem::Value(args.into());
}
if !args.is_empty() {
let mut args_array = TomlArray::new();
for arg in args {
args_array.push(arg.clone());
}
entry["args"] = TomlItem::Value(args_array.into());
}
if let Some(env) = &config.env
&& !env.is_empty()
{
let mut env_table = TomlTable::new();
env_table.set_implicit(false);
let mut pairs: Vec<_> = env.iter().collect();
pairs.sort_by(|(a, _), (b, _)| a.cmp(b));
for (key, value) in pairs {
env_table.insert(key, toml_edit::value(value.clone()));
if let Some(env) = env
&& !env.is_empty()
{
let mut env_table = TomlTable::new();
env_table.set_implicit(false);
let mut pairs: Vec<_> = env.iter().collect();
pairs.sort_by(|(a, _), (b, _)| a.cmp(b));
for (key, value) in pairs {
env_table.insert(key, toml_edit::value(value.clone()));
}
entry["env"] = TomlItem::Table(env_table);
}
}
McpServerTransportConfig::StreamableHttp { url, bearer_token } => {
entry["url"] = toml_edit::value(url.clone());
if let Some(token) = bearer_token {
entry["bearer_token"] = toml_edit::value(token.clone());
}
}
entry["env"] = TomlItem::Table(env_table);
}
if let Some(timeout) = config.startup_timeout_sec {
@@ -1294,9 +1305,11 @@ exclude_slash_tmp = true
servers.insert(
"docs".to_string(),
McpServerConfig {
command: "echo".to_string(),
args: vec!["hello".to_string()],
env: None,
transport: McpServerTransportConfig::Stdio {
command: "echo".to_string(),
args: vec!["hello".to_string()],
env: None,
},
startup_timeout_sec: Some(Duration::from_secs(3)),
tool_timeout_sec: Some(Duration::from_secs(5)),
},
@@ -1307,8 +1320,14 @@ exclude_slash_tmp = true
let loaded = load_global_mcp_servers(codex_home.path())?;
assert_eq!(loaded.len(), 1);
let docs = loaded.get("docs").expect("docs entry");
assert_eq!(docs.command, "echo");
assert_eq!(docs.args, vec!["hello".to_string()]);
match &docs.transport {
McpServerTransportConfig::Stdio { command, args, env } => {
assert_eq!(command, "echo");
assert_eq!(args, &vec!["hello".to_string()]);
assert!(env.is_none());
}
other => panic!("unexpected transport {other:?}"),
}
assert_eq!(docs.startup_timeout_sec, Some(Duration::from_secs(3)));
assert_eq!(docs.tool_timeout_sec, Some(Duration::from_secs(5)));
@@ -1342,6 +1361,134 @@ startup_timeout_ms = 2500
Ok(())
}
#[test]
fn write_global_mcp_servers_serializes_env_sorted() -> anyhow::Result<()> {
let codex_home = TempDir::new()?;
let servers = BTreeMap::from([(
"docs".to_string(),
McpServerConfig {
transport: McpServerTransportConfig::Stdio {
command: "docs-server".to_string(),
args: vec!["--verbose".to_string()],
env: Some(HashMap::from([
("ZIG_VAR".to_string(), "3".to_string()),
("ALPHA_VAR".to_string(), "1".to_string()),
])),
},
startup_timeout_sec: None,
tool_timeout_sec: None,
},
)]);
write_global_mcp_servers(codex_home.path(), &servers)?;
let config_path = codex_home.path().join(CONFIG_TOML_FILE);
let serialized = std::fs::read_to_string(&config_path)?;
assert_eq!(
serialized,
r#"[mcp_servers.docs]
command = "docs-server"
args = ["--verbose"]
[mcp_servers.docs.env]
ALPHA_VAR = "1"
ZIG_VAR = "3"
"#
);
let loaded = load_global_mcp_servers(codex_home.path())?;
let docs = loaded.get("docs").expect("docs entry");
match &docs.transport {
McpServerTransportConfig::Stdio { command, args, env } => {
assert_eq!(command, "docs-server");
assert_eq!(args, &vec!["--verbose".to_string()]);
let env = env
.as_ref()
.expect("env should be preserved for stdio transport");
assert_eq!(env.get("ALPHA_VAR"), Some(&"1".to_string()));
assert_eq!(env.get("ZIG_VAR"), Some(&"3".to_string()));
}
other => panic!("unexpected transport {other:?}"),
}
Ok(())
}
#[test]
fn write_global_mcp_servers_serializes_streamable_http() -> anyhow::Result<()> {
let codex_home = TempDir::new()?;
let mut servers = BTreeMap::from([(
"docs".to_string(),
McpServerConfig {
transport: McpServerTransportConfig::StreamableHttp {
url: "https://example.com/mcp".to_string(),
bearer_token: Some("secret-token".to_string()),
},
startup_timeout_sec: Some(Duration::from_secs(2)),
tool_timeout_sec: None,
},
)]);
write_global_mcp_servers(codex_home.path(), &servers)?;
let config_path = codex_home.path().join(CONFIG_TOML_FILE);
let serialized = std::fs::read_to_string(&config_path)?;
assert_eq!(
serialized,
r#"[mcp_servers.docs]
url = "https://example.com/mcp"
bearer_token = "secret-token"
startup_timeout_sec = 2.0
"#
);
let loaded = load_global_mcp_servers(codex_home.path())?;
let docs = loaded.get("docs").expect("docs entry");
match &docs.transport {
McpServerTransportConfig::StreamableHttp { url, bearer_token } => {
assert_eq!(url, "https://example.com/mcp");
assert_eq!(bearer_token.as_deref(), Some("secret-token"));
}
other => panic!("unexpected transport {other:?}"),
}
assert_eq!(docs.startup_timeout_sec, Some(Duration::from_secs(2)));
servers.insert(
"docs".to_string(),
McpServerConfig {
transport: McpServerTransportConfig::StreamableHttp {
url: "https://example.com/mcp".to_string(),
bearer_token: None,
},
startup_timeout_sec: None,
tool_timeout_sec: None,
},
);
write_global_mcp_servers(codex_home.path(), &servers)?;
let serialized = std::fs::read_to_string(&config_path)?;
assert_eq!(
serialized,
r#"[mcp_servers.docs]
url = "https://example.com/mcp"
"#
);
let loaded = load_global_mcp_servers(codex_home.path())?;
let docs = loaded.get("docs").expect("docs entry");
match &docs.transport {
McpServerTransportConfig::StreamableHttp { url, bearer_token } => {
assert_eq!(url, "https://example.com/mcp");
assert!(bearer_token.is_none());
}
other => panic!("unexpected transport {other:?}"),
}
Ok(())
}
#[tokio::test]
async fn persist_model_selection_updates_defaults() -> anyhow::Result<()> {
let codex_home = TempDir::new()?;

View File

@@ -3,25 +3,20 @@
// Note this file should generally be restricted to simple struct/enum
// definitions that do not contain business logic.
use serde::Deserializer;
use std::collections::HashMap;
use std::path::PathBuf;
use std::time::Duration;
use wildmatch::WildMatchPattern;
use serde::Deserialize;
use serde::Deserializer;
use serde::Serialize;
use serde::de::Error as SerdeError;
#[derive(Serialize, Debug, Clone, PartialEq)]
pub struct McpServerConfig {
pub command: String,
#[serde(default)]
pub args: Vec<String>,
#[serde(default)]
pub env: Option<HashMap<String, String>>,
#[serde(flatten)]
pub transport: McpServerTransportConfig,
/// Startup timeout in seconds for initializing MCP server & initially listing tools.
#[serde(
@@ -43,11 +38,15 @@ impl<'de> Deserialize<'de> for McpServerConfig {
{
#[derive(Deserialize)]
struct RawMcpServerConfig {
command: String,
command: Option<String>,
#[serde(default)]
args: Vec<String>,
args: Option<Vec<String>>,
#[serde(default)]
env: Option<HashMap<String, String>>,
url: Option<String>,
bearer_token: Option<String>,
#[serde(default)]
startup_timeout_sec: Option<f64>,
#[serde(default)]
@@ -67,16 +66,81 @@ impl<'de> Deserialize<'de> for McpServerConfig {
(None, None) => None,
};
fn throw_if_set<E, T>(transport: &str, field: &str, value: Option<&T>) -> Result<(), E>
where
E: SerdeError,
{
if value.is_none() {
return Ok(());
}
Err(E::custom(format!(
"{field} is not supported for {transport}",
)))
}
let transport = match raw {
RawMcpServerConfig {
command: Some(command),
args,
env,
url,
bearer_token,
..
} => {
throw_if_set("stdio", "url", url.as_ref())?;
throw_if_set("stdio", "bearer_token", bearer_token.as_ref())?;
McpServerTransportConfig::Stdio {
command,
args: args.unwrap_or_default(),
env,
}
}
RawMcpServerConfig {
url: Some(url),
bearer_token,
command,
args,
env,
..
} => {
throw_if_set("streamable_http", "command", command.as_ref())?;
throw_if_set("streamable_http", "args", args.as_ref())?;
throw_if_set("streamable_http", "env", env.as_ref())?;
McpServerTransportConfig::StreamableHttp { url, bearer_token }
}
_ => return Err(SerdeError::custom("invalid transport")),
};
Ok(Self {
command: raw.command,
args: raw.args,
env: raw.env,
transport,
startup_timeout_sec,
tool_timeout_sec: raw.tool_timeout_sec,
})
}
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq)]
#[serde(untagged, deny_unknown_fields, rename_all = "snake_case")]
pub enum McpServerTransportConfig {
/// https://modelcontextprotocol.io/specification/2025-06-18/basic/transports#stdio
Stdio {
command: String,
#[serde(default)]
args: Vec<String>,
#[serde(default, skip_serializing_if = "Option::is_none")]
env: Option<HashMap<String, String>>,
},
/// https://modelcontextprotocol.io/specification/2025-06-18/basic/transports#streamable-http
StreamableHttp {
url: String,
/// A plain text bearer token to use for authentication.
/// This bearer token will be included in the HTTP request header as an `Authorization: Bearer <token>` header.
/// This should be used with caution because it lives on disk in clear text.
#[serde(default, skip_serializing_if = "Option::is_none")]
bearer_token: Option<String>,
},
}
mod option_duration_secs {
use serde::Deserialize;
use serde::Deserializer;
@@ -303,3 +367,139 @@ pub enum ReasoningSummaryFormat {
None,
Experimental,
}
#[cfg(test)]
mod tests {
use super::*;
use pretty_assertions::assert_eq;
#[test]
fn deserialize_stdio_command_server_config() {
let cfg: McpServerConfig = toml::from_str(
r#"
command = "echo"
"#,
)
.expect("should deserialize command config");
assert_eq!(
cfg.transport,
McpServerTransportConfig::Stdio {
command: "echo".to_string(),
args: vec![],
env: None
}
);
}
#[test]
fn deserialize_stdio_command_server_config_with_args() {
let cfg: McpServerConfig = toml::from_str(
r#"
command = "echo"
args = ["hello", "world"]
"#,
)
.expect("should deserialize command config");
assert_eq!(
cfg.transport,
McpServerTransportConfig::Stdio {
command: "echo".to_string(),
args: vec!["hello".to_string(), "world".to_string()],
env: None
}
);
}
#[test]
fn deserialize_stdio_command_server_config_with_arg_with_args_and_env() {
let cfg: McpServerConfig = toml::from_str(
r#"
command = "echo"
args = ["hello", "world"]
env = { "FOO" = "BAR" }
"#,
)
.expect("should deserialize command config");
assert_eq!(
cfg.transport,
McpServerTransportConfig::Stdio {
command: "echo".to_string(),
args: vec!["hello".to_string(), "world".to_string()],
env: Some(HashMap::from([("FOO".to_string(), "BAR".to_string())]))
}
);
}
#[test]
fn deserialize_streamable_http_server_config() {
let cfg: McpServerConfig = toml::from_str(
r#"
url = "https://example.com/mcp"
"#,
)
.expect("should deserialize http config");
assert_eq!(
cfg.transport,
McpServerTransportConfig::StreamableHttp {
url: "https://example.com/mcp".to_string(),
bearer_token: None
}
);
}
#[test]
fn deserialize_streamable_http_server_config_with_bearer_token() {
let cfg: McpServerConfig = toml::from_str(
r#"
url = "https://example.com/mcp"
bearer_token = "secret"
"#,
)
.expect("should deserialize http config");
assert_eq!(
cfg.transport,
McpServerTransportConfig::StreamableHttp {
url: "https://example.com/mcp".to_string(),
bearer_token: Some("secret".to_string())
}
);
}
#[test]
fn deserialize_rejects_command_and_url() {
toml::from_str::<McpServerConfig>(
r#"
command = "echo"
url = "https://example.com"
"#,
)
.expect_err("should reject command+url");
}
#[test]
fn deserialize_rejects_env_for_http_transport() {
toml::from_str::<McpServerConfig>(
r#"
url = "https://example.com"
env = { "FOO" = "BAR" }
"#,
)
.expect_err("should reject env for http transport");
}
#[test]
fn deserialize_rejects_bearer_token_for_stdio_transport() {
toml::from_str::<McpServerConfig>(
r#"
command = "echo"
bearer_token = "secret"
"#,
)
.expect_err("should reject bearer token for stdio transport");
}
}

View File

@@ -1,89 +0,0 @@
use anyhow::Context;
use serde::Deserialize;
use serde::Serialize;
use std::io::ErrorKind;
use std::path::Path;
use std::path::PathBuf;
pub(crate) const INTERNAL_STORAGE_FILE: &str = "internal_storage.json";
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct InternalStorage {
#[serde(skip)]
storage_path: PathBuf,
#[serde(default = "default_gpt_5_codex_model_prompt_seen")]
pub gpt_5_codex_model_prompt_seen: bool,
}
const fn default_gpt_5_codex_model_prompt_seen() -> bool {
true
}
impl Default for InternalStorage {
fn default() -> Self {
Self {
storage_path: PathBuf::new(),
gpt_5_codex_model_prompt_seen: default_gpt_5_codex_model_prompt_seen(),
}
}
}
// TODO(jif) generalise all the file writers and build proper async channel inserters.
impl InternalStorage {
pub fn load(codex_home: &Path) -> Self {
let storage_path = codex_home.join(INTERNAL_STORAGE_FILE);
match std::fs::read_to_string(&storage_path) {
Ok(serialized) => match serde_json::from_str::<Self>(&serialized) {
Ok(mut storage) => {
storage.storage_path = storage_path;
storage
}
Err(error) => {
tracing::warn!("failed to parse internal storage: {error:?}");
Self::empty(storage_path)
}
},
Err(error) => {
if error.kind() == ErrorKind::NotFound {
tracing::debug!(
"internal storage not found at {}; initializing defaults",
storage_path.display()
);
} else {
tracing::warn!("failed to read internal storage: {error:?}");
}
Self::empty(storage_path)
}
}
}
fn empty(storage_path: PathBuf) -> Self {
Self {
storage_path,
..Default::default()
}
}
pub async fn persist(&self) -> anyhow::Result<()> {
let serialized = serde_json::to_string_pretty(self)?;
if let Some(parent) = self.storage_path.parent() {
tokio::fs::create_dir_all(parent).await.with_context(|| {
format!(
"failed to create internal storage directory at {}",
parent.display()
)
})?;
}
tokio::fs::write(&self.storage_path, serialized)
.await
.with_context(|| {
format!(
"failed to persist internal storage at {}",
self.storage_path.display()
)
})
}
}

View File

@@ -29,7 +29,6 @@ mod exec_command;
pub mod exec_env;
mod flags;
pub mod git_info;
pub mod internal_storage;
pub mod landlock;
mod mcp_connection_manager;
mod mcp_tool_call;

View File

@@ -29,6 +29,7 @@ use tracing::info;
use tracing::warn;
use crate::config_types::McpServerConfig;
use crate::config_types::McpServerTransportConfig;
/// Delimiter used to separate the server name from the tool name in a fully
/// qualified tool name.
@@ -107,7 +108,7 @@ impl McpClientAdapter {
params: mcp_types::InitializeRequestParams,
startup_timeout: Duration,
) -> Result<Self> {
tracing::error!(
info!(
"new_stdio_client use_rmcp_client: {use_rmcp_client} program: {program:?} args: {args:?} env: {env:?} params: {params:?} startup_timeout: {startup_timeout:?}"
);
if use_rmcp_client {
@@ -121,6 +122,17 @@ impl McpClientAdapter {
}
}
async fn new_streamable_http_client(
url: String,
bearer_token: Option<String>,
params: mcp_types::InitializeRequestParams,
startup_timeout: Duration,
) -> Result<Self> {
let client = Arc::new(RmcpClient::new_streamable_http_client(url, bearer_token)?);
client.initialize(params, Some(startup_timeout)).await?;
Ok(McpClientAdapter::Rmcp(client))
}
async fn list_tools(
&self,
params: Option<mcp_types::ListToolsRequestParams>,
@@ -176,8 +188,6 @@ impl McpConnectionManager {
return Ok((Self::default(), ClientStartErrors::default()));
}
tracing::error!("new mcp_servers: {mcp_servers:?} use_rmcp_client: {use_rmcp_client}");
// Launch all configured servers concurrently.
let mut join_set = JoinSet::new();
let mut errors = ClientStartErrors::new();
@@ -193,16 +203,24 @@ impl McpConnectionManager {
continue;
}
if matches!(
cfg.transport,
McpServerTransportConfig::StreamableHttp { .. }
) && !use_rmcp_client
{
info!(
"skipping MCP server `{}` configured with url because rmcp client is disabled",
server_name
);
continue;
}
let startup_timeout = cfg.startup_timeout_sec.unwrap_or(DEFAULT_STARTUP_TIMEOUT);
let tool_timeout = cfg.tool_timeout_sec.unwrap_or(DEFAULT_TOOL_TIMEOUT);
let use_rmcp_client_flag = use_rmcp_client;
join_set.spawn(async move {
let McpServerConfig {
command, args, env, ..
} = cfg;
let command_os: OsString = command.into();
let args_os: Vec<OsString> = args.into_iter().map(Into::into).collect();
let McpServerConfig { transport, .. } = cfg;
let params = mcp_types::InitializeRequestParams {
capabilities: ClientCapabilities {
experimental: None,
@@ -224,15 +242,30 @@ impl McpConnectionManager {
protocol_version: mcp_types::MCP_SCHEMA_VERSION.to_owned(),
};
let client = McpClientAdapter::new_stdio_client(
use_rmcp_client_flag,
command_os,
args_os,
env,
params,
startup_timeout,
)
.await
let client = match transport {
McpServerTransportConfig::Stdio { command, args, env } => {
let command_os: OsString = command.into();
let args_os: Vec<OsString> = args.into_iter().map(Into::into).collect();
McpClientAdapter::new_stdio_client(
use_rmcp_client_flag,
command_os,
args_os,
env,
params.clone(),
startup_timeout,
)
.await
}
McpServerTransportConfig::StreamableHttp { url, bearer_token } => {
McpClientAdapter::new_streamable_http_client(
url,
bearer_token,
params,
startup_timeout,
)
.await
}
}
.map(|c| (c, startup_timeout));
((server_name, tool_timeout), client)

View File

@@ -89,8 +89,15 @@ pub fn assess_command_safety(
) -> SafetyCheck {
// Some commands look dangerous. Even if they are run inside a sandbox,
// unless the user has explicitly approved them, we should ask,
// regardless of the approval policy and sandbox policy.
// or reject if the approval_policy tells us not to ask.
if command_might_be_dangerous(command) && !approved.contains(command) {
if approval_policy == AskForApproval::Never {
return SafetyCheck::Reject {
reason: "dangerous command detected; rejected by user approval settings"
.to_string(),
};
}
return SafetyCheck::AskUser;
}
@@ -376,7 +383,13 @@ mod tests {
request_escalated_privileges,
);
assert_eq!(safety_check, SafetyCheck::AskUser);
assert_eq!(
safety_check,
SafetyCheck::Reject {
reason: "dangerous command detected; rejected by user approval settings"
.to_string(),
}
);
}
#[test]

View File

@@ -361,6 +361,7 @@ async fn includes_conversation_id_and_model_headers_in_request() {
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn includes_base_instructions_override_in_request() {
skip_if_no_network!();
// Mock server
let server = MockServer::start().await;
@@ -558,6 +559,7 @@ async fn prefers_apikey_when_config_prefers_apikey_even_with_chatgpt_tokens() {
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn includes_user_instructions_message_in_request() {
skip_if_no_network!();
let server = MockServer::start().await;
let first = ResponseTemplate::new(200)
@@ -755,6 +757,7 @@ async fn azure_responses_request_includes_store_and_reasoning_ids() {
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn token_count_includes_rate_limits_snapshot() {
skip_if_no_network!();
let server = MockServer::start().await;
let sse_body = responses::sse(vec![responses::ev_completed_with_tokens("resp_rate", 123)]);
@@ -899,6 +902,7 @@ async fn token_count_includes_rate_limits_snapshot() {
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn usage_limit_error_emits_rate_limit_event() -> anyhow::Result<()> {
skip_if_no_network!(Ok(()));
let server = MockServer::start().await;
let response = ResponseTemplate::new(429)
@@ -978,6 +982,7 @@ async fn usage_limit_error_emits_rate_limit_event() -> anyhow::Result<()> {
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn azure_overrides_assign_properties_used_for_responses_url() {
skip_if_no_network!();
let existing_env_var_with_random_value = if cfg!(windows) { "USERNAME" } else { "USER" };
// Mock server
@@ -1054,6 +1059,7 @@ async fn azure_overrides_assign_properties_used_for_responses_url() {
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn env_var_overrides_loaded_auth() {
skip_if_no_network!();
let existing_env_var_with_random_value = if cfg!(windows) { "USERNAME" } else { "USER" };
// Mock server

View File

@@ -1,7 +1,10 @@
use std::collections::HashMap;
use std::net::TcpListener;
use std::time::Duration;
use codex_core::config_types::McpServerConfig;
use codex_core::config_types::McpServerTransportConfig;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::EventMsg;
use codex_core::protocol::InputItem;
@@ -16,10 +19,15 @@ use core_test_support::wait_for_event;
use core_test_support::wait_for_event_with_timeout;
use escargot::CargoBuild;
use serde_json::Value;
use tokio::net::TcpStream;
use tokio::process::Child;
use tokio::process::Command;
use tokio::time::Instant;
use tokio::time::sleep;
use wiremock::matchers::any;
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn rmcp_tool_call_round_trip() -> anyhow::Result<()> {
#[tokio::test(flavor = "multi_thread", worker_threads = 1)]
async fn stdio_server_round_trip() -> anyhow::Result<()> {
skip_if_no_network!(Ok(()));
let server = responses::start_mock_server().await;
@@ -54,7 +62,7 @@ async fn rmcp_tool_call_round_trip() -> anyhow::Result<()> {
let expected_env_value = "propagated-env";
let rmcp_test_server_bin = CargoBuild::new()
.package("codex-rmcp-client")
.bin("rmcp_test_server")
.bin("test_stdio_server")
.run()?
.path()
.to_string_lossy()
@@ -66,12 +74,14 @@ async fn rmcp_tool_call_round_trip() -> anyhow::Result<()> {
config.mcp_servers.insert(
server_name.to_string(),
McpServerConfig {
command: rmcp_test_server_bin.clone(),
args: Vec::new(),
env: Some(HashMap::from([(
"MCP_TEST_VALUE".to_string(),
expected_env_value.to_string(),
)])),
transport: McpServerTransportConfig::Stdio {
command: rmcp_test_server_bin.clone(),
args: Vec::new(),
env: Some(HashMap::from([(
"MCP_TEST_VALUE".to_string(),
expected_env_value.to_string(),
)])),
},
startup_timeout_sec: Some(Duration::from_secs(10)),
tool_timeout_sec: None,
},
@@ -97,18 +107,13 @@ async fn rmcp_tool_call_round_trip() -> anyhow::Result<()> {
})
.await?;
eprintln!("waiting for mcp tool call begin event");
let begin_event = wait_for_event_with_timeout(
&fixture.codex,
|ev| {
eprintln!("ev: {ev:?}");
matches!(ev, EventMsg::McpToolCallBegin(_))
},
|ev| matches!(ev, EventMsg::McpToolCallBegin(_)),
Duration::from_secs(10),
)
.await;
eprintln!("mcp tool call begin event: {begin_event:?}");
let EventMsg::McpToolCallBegin(begin) = begin_event else {
unreachable!("event guard guarantees McpToolCallBegin");
};
@@ -119,7 +124,6 @@ async fn rmcp_tool_call_round_trip() -> anyhow::Result<()> {
matches!(ev, EventMsg::McpToolCallEnd(_))
})
.await;
eprintln!("end_event: {end_event:?}");
let EventMsg::McpToolCallEnd(end) = end_event else {
unreachable!("event guard guarantees McpToolCallEnd");
};
@@ -145,18 +149,223 @@ async fn rmcp_tool_call_round_trip() -> anyhow::Result<()> {
.get("echo")
.and_then(Value::as_str)
.expect("echo payload present");
assert_eq!(echo_value, "ping");
assert_eq!(echo_value, "ECHOING: ping");
let env_value = map
.get("env")
.and_then(Value::as_str)
.expect("env snapshot inserted");
assert_eq!(env_value, expected_env_value);
let task_complete_event =
wait_for_event(&fixture.codex, |ev| matches!(ev, EventMsg::TaskComplete(_))).await;
eprintln!("task_complete_event: {task_complete_event:?}");
wait_for_event(&fixture.codex, |ev| matches!(ev, EventMsg::TaskComplete(_))).await;
server.verify().await;
Ok(())
}
#[tokio::test(flavor = "multi_thread", worker_threads = 1)]
async fn streamable_http_tool_call_round_trip() -> anyhow::Result<()> {
skip_if_no_network!(Ok(()));
let server = responses::start_mock_server().await;
let call_id = "call-456";
let server_name = "rmcp_http";
let tool_name = format!("{server_name}__echo");
mount_sse_once(
&server,
any(),
responses::sse(vec![
serde_json::json!({
"type": "response.created",
"response": {"id": "resp-1"}
}),
responses::ev_function_call(call_id, &tool_name, "{\"message\":\"ping\"}"),
responses::ev_completed("resp-1"),
]),
)
.await;
mount_sse_once(
&server,
any(),
responses::sse(vec![
responses::ev_assistant_message(
"msg-1",
"rmcp streamable http echo tool completed successfully.",
),
responses::ev_completed("resp-2"),
]),
)
.await;
let expected_env_value = "propagated-env-http";
let rmcp_http_server_bin = CargoBuild::new()
.package("codex-rmcp-client")
.bin("test_streamable_http_server")
.run()?
.path()
.to_string_lossy()
.into_owned();
let listener = TcpListener::bind("127.0.0.1:0")?;
let port = listener.local_addr()?.port();
drop(listener);
let bind_addr = format!("127.0.0.1:{port}");
let server_url = format!("http://{bind_addr}/mcp");
let mut http_server_child = Command::new(&rmcp_http_server_bin)
.kill_on_drop(true)
.env("MCP_STREAMABLE_HTTP_BIND_ADDR", &bind_addr)
.env("MCP_TEST_VALUE", expected_env_value)
.spawn()?;
wait_for_streamable_http_server(&mut http_server_child, &bind_addr, Duration::from_secs(5))
.await?;
let fixture = test_codex()
.with_config(move |config| {
config.use_experimental_use_rmcp_client = true;
config.mcp_servers.insert(
server_name.to_string(),
McpServerConfig {
transport: McpServerTransportConfig::StreamableHttp {
url: server_url,
bearer_token: None,
},
startup_timeout_sec: Some(Duration::from_secs(10)),
tool_timeout_sec: None,
},
);
})
.build(&server)
.await?;
let session_model = fixture.session_configured.model.clone();
fixture
.codex
.submit(Op::UserTurn {
items: vec![InputItem::Text {
text: "call the rmcp streamable http echo tool".into(),
}],
final_output_json_schema: None,
cwd: fixture.cwd.path().to_path_buf(),
approval_policy: AskForApproval::Never,
sandbox_policy: SandboxPolicy::DangerFullAccess,
model: session_model,
effort: None,
summary: ReasoningSummary::Auto,
})
.await?;
let begin_event = wait_for_event_with_timeout(
&fixture.codex,
|ev| matches!(ev, EventMsg::McpToolCallBegin(_)),
Duration::from_secs(10),
)
.await;
let EventMsg::McpToolCallBegin(begin) = begin_event else {
unreachable!("event guard guarantees McpToolCallBegin");
};
assert_eq!(begin.invocation.server, server_name);
assert_eq!(begin.invocation.tool, "echo");
let end_event = wait_for_event(&fixture.codex, |ev| {
matches!(ev, EventMsg::McpToolCallEnd(_))
})
.await;
let EventMsg::McpToolCallEnd(end) = end_event else {
unreachable!("event guard guarantees McpToolCallEnd");
};
let result = end
.result
.as_ref()
.expect("rmcp echo tool should return success");
assert_eq!(result.is_error, Some(false));
assert!(
result.content.is_empty(),
"content should default to an empty array"
);
let structured = result
.structured_content
.as_ref()
.expect("structured content");
let Value::Object(map) = structured else {
panic!("structured content should be an object: {structured:?}");
};
let echo_value = map
.get("echo")
.and_then(Value::as_str)
.expect("echo payload present");
assert_eq!(echo_value, "ECHOING: ping");
let env_value = map
.get("env")
.and_then(Value::as_str)
.expect("env snapshot inserted");
assert_eq!(env_value, expected_env_value);
wait_for_event(&fixture.codex, |ev| matches!(ev, EventMsg::TaskComplete(_))).await;
server.verify().await;
match http_server_child.try_wait() {
Ok(Some(_)) => {}
Ok(None) => {
let _ = http_server_child.kill().await;
}
Err(error) => {
eprintln!("failed to check streamable http server status: {error}");
let _ = http_server_child.kill().await;
}
}
if let Err(error) = http_server_child.wait().await {
eprintln!("failed to await streamable http server shutdown: {error}");
}
Ok(())
}
async fn wait_for_streamable_http_server(
server_child: &mut Child,
address: &str,
timeout: Duration,
) -> anyhow::Result<()> {
let deadline = Instant::now() + timeout;
loop {
if let Some(status) = server_child.try_wait()? {
return Err(anyhow::anyhow!(
"streamable HTTP server exited early with status {status}"
));
}
let remaining = deadline.saturating_duration_since(Instant::now());
if remaining.is_zero() {
return Err(anyhow::anyhow!(
"timed out waiting for streamable HTTP server at {address}: deadline reached"
));
}
match tokio::time::timeout(remaining, TcpStream::connect(address)).await {
Ok(Ok(_)) => return Ok(()),
Ok(Err(error)) => {
if Instant::now() >= deadline {
return Err(anyhow::anyhow!(
"timed out waiting for streamable HTTP server at {address}: {error}"
));
}
}
Err(_) => {
return Err(anyhow::anyhow!(
"timed out waiting for streamable HTTP server at {address}: connect call timed out"
));
}
}
sleep(Duration::from_millis(50)).await;
}
}

View File

@@ -8,6 +8,10 @@ use ts_rs::TS;
pub enum ConversationEvent {
#[serde(rename = "session.created")]
SessionCreated(SessionCreatedEvent),
#[serde(rename = "turn.started")]
TurnStarted(TurnStartedEvent),
#[serde(rename = "turn.completed")]
TurnCompleted(TurnCompletedEvent),
#[serde(rename = "item.started")]
ItemStarted(ItemStartedEvent),
#[serde(rename = "item.updated")]
@@ -23,6 +27,22 @@ pub struct SessionCreatedEvent {
pub session_id: String,
}
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq, TS, Default)]
pub struct TurnStartedEvent {}
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq, TS)]
pub struct TurnCompletedEvent {
pub usage: Usage,
}
/// Minimal usage summary for a turn.
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq, TS, Default)]
pub struct Usage {
pub input_tokens: u64,
pub cached_input_tokens: u64,
pub output_tokens: u64,
}
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq, TS)]
pub struct ItemStartedEvent {
pub item: ConversationItem,

View File

@@ -23,6 +23,9 @@ use crate::exec_events::ReasoningItem;
use crate::exec_events::SessionCreatedEvent;
use crate::exec_events::TodoItem;
use crate::exec_events::TodoListItem;
use crate::exec_events::TurnCompletedEvent;
use crate::exec_events::TurnStartedEvent;
use crate::exec_events::Usage;
use codex_core::config::Config;
use codex_core::plan_tool::StepStatus;
use codex_core::plan_tool::UpdatePlanArgs;
@@ -37,6 +40,7 @@ use codex_core::protocol::PatchApplyBeginEvent;
use codex_core::protocol::PatchApplyEndEvent;
use codex_core::protocol::SessionConfiguredEvent;
use codex_core::protocol::TaskCompleteEvent;
use codex_core::protocol::TaskStartedEvent;
use tracing::error;
use tracing::warn;
@@ -48,6 +52,7 @@ pub struct ExperimentalEventProcessorWithJsonOutput {
running_patch_applies: HashMap<String, PatchApplyBeginEvent>,
// Tracks the todo list for the current turn (at most one per turn).
running_todo_list: Option<RunningTodoList>,
last_total_token_usage: Option<codex_core::protocol::TokenUsage>,
}
#[derive(Debug, Clone)]
@@ -70,6 +75,7 @@ impl ExperimentalEventProcessorWithJsonOutput {
running_commands: HashMap::new(),
running_patch_applies: HashMap::new(),
running_todo_list: None,
last_total_token_usage: None,
}
}
@@ -82,6 +88,14 @@ impl ExperimentalEventProcessorWithJsonOutput {
EventMsg::ExecCommandEnd(ev) => self.handle_exec_command_end(ev),
EventMsg::PatchApplyBegin(ev) => self.handle_patch_apply_begin(ev),
EventMsg::PatchApplyEnd(ev) => self.handle_patch_apply_end(ev),
EventMsg::TokenCount(ev) => {
if let Some(info) = &ev.info {
self.last_total_token_usage = Some(info.total_token_usage.clone());
}
Vec::new()
}
EventMsg::TaskStarted(ev) => self.handle_task_started(ev),
EventMsg::TaskComplete(_) => self.handle_task_complete(),
EventMsg::Error(ev) => vec![ConversationEvent::Error(ConversationErrorEvent {
message: ev.message.clone(),
})],
@@ -89,7 +103,6 @@ impl ExperimentalEventProcessorWithJsonOutput {
message: ev.message.clone(),
})],
EventMsg::PlanUpdate(ev) => self.handle_plan_update(ev),
EventMsg::TaskComplete(_) => self.handle_task_complete(),
_ => Vec::new(),
}
}
@@ -283,7 +296,23 @@ impl ExperimentalEventProcessorWithJsonOutput {
vec![ConversationEvent::ItemStarted(ItemStartedEvent { item })]
}
fn handle_task_started(&self, _: &TaskStartedEvent) -> Vec<ConversationEvent> {
vec![ConversationEvent::TurnStarted(TurnStartedEvent {})]
}
fn handle_task_complete(&mut self) -> Vec<ConversationEvent> {
let usage = if let Some(u) = &self.last_total_token_usage {
Usage {
input_tokens: u.input_tokens,
cached_input_tokens: u.cached_input_tokens,
output_tokens: u.output_tokens,
}
} else {
Usage::default()
};
let mut items = Vec::new();
if let Some(running) = self.running_todo_list.take() {
let item = ConversationItem {
id: running.item_id,
@@ -291,11 +320,16 @@ impl ExperimentalEventProcessorWithJsonOutput {
items: running.items,
}),
};
return vec![ConversationEvent::ItemCompleted(ItemCompletedEvent {
items.push(ConversationEvent::ItemCompleted(ItemCompletedEvent {
item,
})];
}));
}
Vec::new()
items.push(ConversationEvent::TurnCompleted(TurnCompletedEvent {
usage,
}));
items
}
}

View File

@@ -331,7 +331,13 @@ pub async fn run_main(cli: Cli, codex_linux_sandbox_exe: Option<PathBuf>) -> any
info!("Sent prompt with event ID: {initial_prompt_task_id}");
// Run the loop until the task is complete.
// Track whether a fatal error was reported by the server so we can
// exit with a non-zero status for automation-friendly signaling.
let mut error_seen = false;
while let Some(event) = rx.recv().await {
if matches!(event.msg, EventMsg::Error(_)) {
error_seen = true;
}
let shutdown: CodexStatus = event_processor.process_event(event);
match shutdown {
CodexStatus::Running => continue,
@@ -343,6 +349,9 @@ pub async fn run_main(cli: Cli, codex_linux_sandbox_exe: Option<PathBuf>) -> any
}
}
}
if error_seen {
std::process::exit(1);
}
Ok(())
}

View File

@@ -24,6 +24,9 @@ use codex_exec::exec_events::ReasoningItem;
use codex_exec::exec_events::SessionCreatedEvent;
use codex_exec::exec_events::TodoItem as ExecTodoItem;
use codex_exec::exec_events::TodoListItem as ExecTodoListItem;
use codex_exec::exec_events::TurnCompletedEvent;
use codex_exec::exec_events::TurnStartedEvent;
use codex_exec::exec_events::Usage;
use codex_exec::experimental_event_processor_with_json_output::ExperimentalEventProcessorWithJsonOutput;
use pretty_assertions::assert_eq;
use std::path::PathBuf;
@@ -65,6 +68,22 @@ fn session_configured_produces_session_created_event() {
);
}
#[test]
fn task_started_produces_turn_started_event() {
let mut ep = ExperimentalEventProcessorWithJsonOutput::new(None);
let out = ep.collect_conversation_events(&event(
"t1",
EventMsg::TaskStarted(codex_core::protocol::TaskStartedEvent {
model_context_window: Some(32_000),
}),
));
assert_eq!(
out,
vec![ConversationEvent::TurnStarted(TurnStartedEvent {})]
);
}
#[test]
fn plan_update_emits_todo_list_started_updated_and_completed() {
use codex_core::plan_tool::PlanItemArg;
@@ -161,23 +180,28 @@ fn plan_update_emits_todo_list_started_updated_and_completed() {
let out_complete = ep.collect_conversation_events(&complete);
assert_eq!(
out_complete,
vec![ConversationEvent::ItemCompleted(ItemCompletedEvent {
item: ConversationItem {
id: "item_0".to_string(),
details: ConversationItemDetails::TodoList(ExecTodoListItem {
items: vec![
ExecTodoItem {
text: "step one".to_string(),
completed: true
},
ExecTodoItem {
text: "step two".to_string(),
completed: false
},
],
}),
},
})]
vec![
ConversationEvent::ItemCompleted(ItemCompletedEvent {
item: ConversationItem {
id: "item_0".to_string(),
details: ConversationItemDetails::TodoList(ExecTodoListItem {
items: vec![
ExecTodoItem {
text: "step one".to_string(),
completed: true
},
ExecTodoItem {
text: "step two".to_string(),
completed: false
},
],
}),
},
}),
ConversationEvent::TurnCompleted(TurnCompletedEvent {
usage: Usage::default(),
}),
]
);
}
@@ -585,3 +609,52 @@ fn patch_apply_failure_produces_item_completed_patchapply_failed() {
other => panic!("unexpected event: {other:?}"),
}
}
#[test]
fn task_complete_produces_turn_completed_with_usage() {
let mut ep = ExperimentalEventProcessorWithJsonOutput::new(None);
// First, feed a TokenCount event with known totals.
let usage = codex_core::protocol::TokenUsage {
input_tokens: 1200,
cached_input_tokens: 200,
output_tokens: 345,
reasoning_output_tokens: 0,
total_tokens: 0,
};
let info = codex_core::protocol::TokenUsageInfo {
total_token_usage: usage.clone(),
last_token_usage: usage,
model_context_window: None,
};
let token_count_event = event(
"e1",
EventMsg::TokenCount(codex_core::protocol::TokenCountEvent {
info: Some(info),
rate_limits: None,
}),
);
assert!(
ep.collect_conversation_events(&token_count_event)
.is_empty()
);
// Then TaskComplete should produce turn.completed with the captured usage.
let complete_event = event(
"e2",
EventMsg::TaskComplete(codex_core::protocol::TaskCompleteEvent {
last_agent_message: Some("done".to_string()),
}),
);
let out = ep.collect_conversation_events(&complete_event);
assert_eq!(
out,
vec![ConversationEvent::TurnCompleted(TurnCompletedEvent {
usage: Usage {
input_tokens: 1200,
cached_input_tokens: 200,
output_tokens: 345,
},
})]
);
}

View File

@@ -3,3 +3,4 @@ mod apply_patch;
mod output_schema;
mod resume;
mod sandbox;
mod server_error_exit;

View File

@@ -0,0 +1,34 @@
#![cfg(not(target_os = "windows"))]
#![allow(clippy::expect_used, clippy::unwrap_used)]
use core_test_support::responses;
use core_test_support::test_codex_exec::test_codex_exec;
use wiremock::matchers::any;
/// Verify that when the server reports an error, `codex-exec` exits with a
/// non-zero status code so automation can detect failures.
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn exits_non_zero_when_server_reports_error() -> anyhow::Result<()> {
let test = test_codex_exec();
// Mock a simple Responses API SSE stream that immediately reports a
// `response.failed` event with an error message.
let server = responses::start_mock_server().await;
let body = responses::sse(vec![serde_json::json!({
"type": "response.failed",
"response": {
"id": "resp_err_1",
"error": {"code": "rate_limit_exceeded", "message": "synthetic server error"}
}
})]);
responses::mount_sse_once(&server, any(), body).await;
test.cmd_with_server(&server)
.arg("--skip-git-repo-check")
.arg("tell me something")
.arg("--experimental-json")
.assert()
.code(1);
Ok(())
}

View File

@@ -5,6 +5,7 @@ help:
just -l
# `codex`
alias c := codex
codex *args:
cargo run --bin codex -- "$@"
@@ -27,6 +28,9 @@ fmt:
fix *args:
cargo clippy --fix --all-features --tests --allow-dirty "$@"
clippy:
cargo clippy --all-features --tests "$@"
install:
rustup show active-toolchain
cargo fetch

View File

@@ -0,0 +1,21 @@
[package]
edition = "2024"
name = "codex-process-hardening"
version = { workspace = true }
[lib]
name = "codex_process_hardening"
path = "src/lib.rs"
[lints]
workspace = true
[dependencies]
[target.'cfg(target_os = "linux")'.dependencies]
libc = { workspace = true }
[target.'cfg(target_os = "android")'.dependencies]
libc = { workspace = true }
[target.'cfg(target_os = "macos")'.dependencies]
libc = { workspace = true }

View File

@@ -0,0 +1,7 @@
# codex-process-hardening
This crate provides `pre_main_hardening()`, which is designed to be called pre-`main()` (using `#[ctor::ctor]`) to perform various process-hardening steps, such as the following (a minimal wiring sketch appears after the list):
- disabling core dumps
- disabling ptrace attach on Linux and macOS
- removing dangerous environment variables such as `LD_PRELOAD` and `DYLD_*`
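A minimal wiring sketch, matching how `codex-responses-api-proxy` consumes this crate later in this diff:

```rust
// Run the hardening steps before main() is entered, via the `ctor` crate.
#[ctor::ctor]
fn pre_main() {
    codex_process_hardening::pre_main_hardening();
}

fn main() {
    // By this point core dumps are disabled, ptrace attach is blocked on
    // Linux/macOS, and LD_PRELOAD / DYLD_* have been scrubbed from the env.
}
```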

View File

@@ -1,3 +1,19 @@
/// This is designed to be called pre-main() (using `#[ctor::ctor]`) to perform
/// various process hardening steps, such as:
/// - disabling core dumps
/// - disabling ptrace attach on Linux and macOS
/// - removing dangerous environment variables such as LD_PRELOAD and DYLD_*
pub fn pre_main_hardening() {
#[cfg(any(target_os = "linux", target_os = "android"))]
pre_main_hardening_linux();
#[cfg(target_os = "macos")]
pre_main_hardening_macos();
#[cfg(windows)]
pre_main_hardening_windows();
}
#[cfg(any(target_os = "linux", target_os = "android"))]
const PRCTL_FAILED_EXIT_CODE: i32 = 5;

View File

@@ -8,7 +8,7 @@ name = "codex_responses_api_proxy"
path = "src/lib.rs"
[[bin]]
name = "responses-api-proxy"
name = "codex-responses-api-proxy"
path = "src/main.rs"
[lints]
@@ -17,11 +17,11 @@ workspace = true
[dependencies]
anyhow = { workspace = true }
clap = { workspace = true, features = ["derive"] }
codex-arg0 = { workspace = true }
codex-process-hardening = { workspace = true }
ctor = { workspace = true }
libc = { workspace = true }
reqwest = { workspace = true, features = ["blocking", "json", "rustls-tls"] }
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true }
tiny_http = { workspace = true }
tokio = { workspace = true }
zeroize = { workspace = true }

View File

@@ -4,12 +4,12 @@ A strict HTTP proxy that only forwards `POST` requests to `/v1/responses` to the
## Expected Usage
**IMPORTANT:** This is designed to be used with `CODEX_SECURE_MODE=1` so that an unprivileged user cannot inspect or tamper with this process. Though if `--http-shutdown` is specified, an unprivileged user _can_ shutdown the server.
**IMPORTANT:** `codex-responses-api-proxy` is designed to be run by a privileged user with access to `OPENAI_API_KEY` so that an unprivileged user cannot inspect or tamper with the process. Though if `--http-shutdown` is specified, an unprivileged user _can_ make a `GET` request to `/shutdown` to shut down the server, as an unprivileged user cannot send `SIGTERM` to kill the process.
A privileged user (i.e., `root` or a user with `sudo`) who has access to `OPENAI_API_KEY` would run the following to start the server:
A privileged user (i.e., `root` or a user with `sudo`) who has access to `OPENAI_API_KEY` would run the following to start the server, as `codex-responses-api-proxy` reads the auth token from `stdin`:
```shell
printenv OPENAI_API_KEY | CODEX_SECURE_MODE=1 codex responses-api-proxy --http-shutdown --server-info /tmp/server-info.json
printenv OPENAI_API_KEY | codex-responses-api-proxy --http-shutdown --server-info /tmp/server-info.json
```
A non-privileged user would then run Codex as follows, specifying the `model_provider` dynamically:
@@ -22,7 +22,7 @@ codex exec -c "model_providers.openai-proxy={ name = 'OpenAI Proxy', base_url =
'Your prompt here'
```
When the unprivileged user was finished, they could shutdown the server using `curl` (since `kill -9` is not an option):
When the unprivileged user was finished, they could shut down the server using `curl` (since `kill -SIGTERM` is not an option):
```shell
curl --fail --silent --show-error "${PROXY_BASE_URL}/shutdown"
@@ -30,17 +30,17 @@ curl --fail --silent --show-error "${PROXY_BASE_URL}/shutdown"
## Behavior
- Reads the API key from `stdin`. All callers should pipe the key in (for example, `printenv OPENAI_API_KEY | codex responses-api-proxy`).
- Reads the API key from `stdin`. All callers should pipe the key in (for example, `printenv OPENAI_API_KEY | codex-responses-api-proxy`).
- Formats the header value as `Bearer <key>` and attempts to `mlock(2)` the memory holding that header so it is not swapped to disk.
- Listens on the provided port or an ephemeral port if `--port` is not specified.
- Accepts exactly `POST /v1/responses` (no query string). The request body is forwarded to `https://api.openai.com/v1/responses` with `Authorization: Bearer <key>` set. All original request headers (except any incoming `Authorization`) are forwarded upstream. For other requests, it responds with `403`.
- Optionally writes a single-line JSON file with server info, currently `{ "port": <u16> }`.
- Optional `--http-shutdown` enables `GET /shutdown` to terminate the process with exit code 0. This allows one user (e.g., root) to start the proxy and another unprivileged user on the host to shut it down.
- Optional `--http-shutdown` enables `GET /shutdown` to terminate the process with exit code 0. This allows one user (e.g., `root`) to start the proxy and another unprivileged user on the host to shut it down.
## CLI
```
responses-api-proxy [--port <PORT>] [--server-info <FILE>] [--http-shutdown]
codex-responses-api-proxy [--port <PORT>] [--server-info <FILE>] [--http-shutdown]
```
- `--port <PORT>`: Port to bind on `127.0.0.1`. If omitted, an ephemeral port is chosen.
@@ -51,3 +51,19 @@ responses-api-proxy [--port <PORT>] [--server-info <FILE>] [--http-shutdown]
- Only `POST /v1/responses` is permitted. No query strings are allowed.
- All request headers are forwarded to the upstream call (aside from overriding `Authorization`). Response status and content-type are mirrored from upstream.
## Hardening Details
Care is taken to restrict access to (and copying of) the value of `OPENAI_API_KEY` retained in memory; the flow is sketched after this list:
- We leverage [`codex_process_hardening`](https://github.com/openai/codex/blob/main/codex-rs/process-hardening/README.md) so `codex-responses-api-proxy` is run with standard process-hardening techniques.
- At startup, we allocate a `1024`-byte buffer on the stack and write `"Bearer "` as the first `7` bytes.
- We then read from `stdin`, copying the contents into the buffer after `"Bearer "`.
- After verifying the key matches `/^[a-zA-Z0-9_-]+$/` (and does not exceed the buffer), we create a `String` from that buffer (so the data is now on the heap).
- We zero out the stack-allocated buffer using https://crates.io/crates/zeroize so the zeroing is not optimized away by the compiler.
- We invoke `.leak()` on the `String` so we can treat its contents as a `&'static str`, as it will live for the rest of the process.
- On UNIX, we `mlock(2)` the memory backing the `&'static str`.
- When using the `&'static str` when building an HTTP request, we use `HeaderValue::from_static()` to avoid copying the `&str`.
- We also invoke `.set_sensitive(true)` on the `HeaderValue`, which in theory indicates to other parts of the HTTP stack that the header should be treated with "special care" to avoid leakage:
https://github.com/hyperium/http/blob/439d1c50d71e3be3204b6c4a1bf2255ed78e1f93/src/header/value.rs#L346-L376
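A condensed sketch of that flow; the helper name, buffer size, and error strings here are illustrative rather than the exact implementation:

```rust
use std::io::Read;

use anyhow::Context;
use anyhow::Result;
use anyhow::bail;
use zeroize::Zeroize;

// Sketch only: read "Bearer <key>" into a stack buffer, validate it, move it
// to the heap, wipe the stack copy, and leak the heap copy for 'static use.
fn read_auth_header() -> Result<&'static str> {
    let mut buf = [0u8; 1024];
    buf[..7].copy_from_slice(b"Bearer ");
    let n = std::io::stdin()
        .read(&mut buf[7..])
        .context("reading OPENAI_API_KEY from stdin")?;
    let mut total = 7 + n;
    // Tolerate a single trailing newline from piped input.
    if total > 7 && buf[total - 1] == b'\n' {
        total -= 1;
    }
    if total == 7 {
        bail!("no OPENAI_API_KEY provided on stdin");
    }
    // Enforce /^[A-Za-z0-9_-]+$/ before promoting the bytes to a String.
    if !buf[7..total]
        .iter()
        .all(|b| b.is_ascii_alphanumeric() || matches!(b, b'-' | b'_'))
    {
        buf.zeroize();
        bail!("OPENAI_API_KEY may only contain ASCII letters, numbers, '-' or '_'");
    }
    let header = std::str::from_utf8(&buf[..total])?.to_owned(); // heap copy
    buf.zeroize(); // wipe the stack copy; zeroize resists compiler elision
    // The real code then mlock(2)s this memory on Unix and builds the header
    // with HeaderValue::from_static(...) plus .set_sensitive(true).
    Ok(header.leak())
}
```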

View File

@@ -0,0 +1,13 @@
# @openai/codex-responses-api-proxy
<p align="center"><code>npm i -g @openai/codex-responses-api-proxy</code> to install <code>codex-responses-api-proxy</code></p>
This package distributes the prebuilt [Codex Responses API proxy binary](https://github.com/openai/codex/tree/main/codex-rs/responses-api-proxy) for macOS, Linux, and Windows.
To see available options, run:
```
node ./bin/codex-responses-api-proxy.js --help
```
Refer to [`codex-rs/responses-api-proxy/README.md`](https://github.com/openai/codex/blob/main/codex-rs/responses-api-proxy/README.md) for detailed documentation.
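Since `package.json` maps the `codex-responses-api-proxy` bin to this launcher, a global install should also put it on `PATH`, so after `npm i -g` the equivalent is likely just:

```
codex-responses-api-proxy --help
```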

View File

@@ -0,0 +1,97 @@
#!/usr/bin/env node
// Entry point for the Codex responses API proxy binary.
import { spawn } from "node:child_process";
import path from "path";
import { fileURLToPath } from "url";
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
function determineTargetTriple(platform, arch) {
switch (platform) {
case "linux":
case "android":
if (arch === "x64") {
return "x86_64-unknown-linux-musl";
}
if (arch === "arm64") {
return "aarch64-unknown-linux-musl";
}
break;
case "darwin":
if (arch === "x64") {
return "x86_64-apple-darwin";
}
if (arch === "arm64") {
return "aarch64-apple-darwin";
}
break;
case "win32":
if (arch === "x64") {
return "x86_64-pc-windows-msvc";
}
if (arch === "arm64") {
return "aarch64-pc-windows-msvc";
}
break;
default:
break;
}
return null;
}
const targetTriple = determineTargetTriple(process.platform, process.arch);
if (!targetTriple) {
throw new Error(
`Unsupported platform: ${process.platform} (${process.arch})`,
);
}
const vendorRoot = path.join(__dirname, "..", "vendor");
const archRoot = path.join(vendorRoot, targetTriple);
const binaryBaseName = "codex-responses-api-proxy";
const binaryPath = path.join(
archRoot,
binaryBaseName,
process.platform === "win32" ? `${binaryBaseName}.exe` : binaryBaseName,
);
const child = spawn(binaryPath, process.argv.slice(2), {
stdio: "inherit",
});
child.on("error", (err) => {
console.error(err);
process.exit(1);
});
const forwardSignal = (signal) => {
if (!child.killed) {
try {
child.kill(signal);
} catch {
/* ignore */
}
}
};
["SIGINT", "SIGTERM", "SIGHUP"].forEach((sig) => {
process.on(sig, () => forwardSignal(sig));
});
const childResult = await new Promise((resolve) => {
child.on("exit", (code, signal) => {
if (signal) {
resolve({ type: "signal", signal });
} else {
resolve({ type: "code", exitCode: code ?? 1 });
}
});
});
if (childResult.type === "signal") {
process.kill(process.pid, childResult.signal);
} else {
process.exit(childResult.exitCode);
}

View File

@@ -0,0 +1,21 @@
{
"name": "@openai/codex-responses-api-proxy",
"version": "0.0.0-dev",
"license": "Apache-2.0",
"bin": {
"codex-responses-api-proxy": "bin/codex-responses-api-proxy.js"
},
"type": "module",
"engines": {
"node": ">=16"
},
"files": [
"bin",
"vendor"
],
"repository": {
"type": "git",
"url": "git+https://github.com/openai/codex.git",
"directory": "codex-rs/responses-api-proxy/npm"
}
}

View File

@@ -6,6 +6,7 @@ use std::net::TcpListener;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use std::time::Duration;
use anyhow::Context;
use anyhow::Result;
@@ -62,6 +63,8 @@ pub fn run_main(args: Args) -> Result<()> {
.map_err(|err| anyhow!("creating HTTP server: {err}"))?;
let client = Arc::new(
Client::builder()
// Disable reqwest's 30s default so long-lived response streams keep flowing.
.timeout(None::<Duration>)
.build()
.context("building reqwest client")?,
);

View File

@@ -1,14 +1,12 @@
use anyhow::Context;
use clap::Parser;
use codex_arg0::arg0_dispatch_or_else;
use codex_responses_api_proxy::Args as ResponsesApiProxyArgs;
pub fn main() -> anyhow::Result<()> {
arg0_dispatch_or_else(|_codex_linux_sandbox_exe| async move {
let args = ResponsesApiProxyArgs::parse();
tokio::task::spawn_blocking(move || codex_responses_api_proxy::run_main(args))
.await
.context("responses-api-proxy blocking task panicked")??;
Ok(())
})
#[ctor::ctor]
fn pre_main() {
codex_process_hardening::pre_main_hardening();
}
pub fn main() -> anyhow::Result<()> {
let args = ResponsesApiProxyArgs::parse();
codex_responses_api_proxy::run_main(args)
}

View File

@@ -54,9 +54,16 @@ where
));
}
if let Err(err) = validate_auth_header_bytes(&buf[AUTH_HEADER_PREFIX.len()..total]) {
buf.zeroize();
return Err(err);
}
let header_str = match std::str::from_utf8(&buf[..total]) {
Ok(value) => value,
Err(err) => {
// In theory, validate_auth_header_bytes() should have caught
// any invalid UTF-8 sequences, but just in case...
buf.zeroize();
return Err(err).context("reading Authorization header from stdin as UTF-8");
}
@@ -113,6 +120,21 @@ fn mlock_str(value: &str) {
#[cfg(not(unix))]
fn mlock_str(_value: &str) {}
/// The key should match /^[A-Za-z0-9\-_]+$/. Ensure there is no funny business
/// with NUL characters and whatnot.
fn validate_auth_header_bytes(key_bytes: &[u8]) -> Result<()> {
if key_bytes
.iter()
.all(|byte| byte.is_ascii_alphanumeric() || matches!(byte, b'-' | b'_'))
{
return Ok(());
}
Err(anyhow!(
"OPENAI_API_KEY may only contain ASCII letters, numbers, '-' or '_'"
))
}
#[cfg(test)]
mod tests {
use super::*;
@@ -158,7 +180,7 @@ mod tests {
})
.unwrap_err();
let message = format!("{err:#}");
assert!(message.contains("too large"));
assert!(message.contains("OPENAI_API_KEY is too large to fit in the 512-byte buffer"));
}
#[test]
@@ -180,6 +202,23 @@ mod tests {
.unwrap_err();
let message = format!("{err:#}");
assert!(message.contains("UTF-8"));
assert!(
message.contains("OPENAI_API_KEY may only contain ASCII letters, numbers, '-' or '_'")
);
}
#[test]
fn errors_on_invalid_characters() {
let err = read_auth_header_with(|buf| {
let data = b"sk-abc!23";
buf[..data.len()].copy_from_slice(data);
Ok(data.len())
})
.unwrap_err();
let message = format!("{err:#}");
assert!(
message.contains("OPENAI_API_KEY may only contain ASCII letters, numbers, '-' or '_'")
);
}
}

View File

@@ -16,6 +16,15 @@ rmcp = { version = "0.7.0", default-features = false, features = [
"schemars",
"server",
"transport-child-process",
"transport-streamable-http-client-reqwest",
"transport-streamable-http-server",
] }
axum = { version = "0.8", default-features = false, features = ["http1", "tokio"] }
futures = { version = "0.3", default-features = false, features = ["std"] }
reqwest = { version = "0.12", default-features = false, features = [
"json",
"stream",
"rustls-tls",
] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"

View File

@@ -0,0 +1,142 @@
use std::borrow::Cow;
use std::collections::HashMap;
use std::sync::Arc;
use rmcp::ErrorData as McpError;
use rmcp::ServiceExt;
use rmcp::handler::server::ServerHandler;
use rmcp::model::CallToolRequestParam;
use rmcp::model::CallToolResult;
use rmcp::model::JsonObject;
use rmcp::model::ListToolsResult;
use rmcp::model::PaginatedRequestParam;
use rmcp::model::ServerCapabilities;
use rmcp::model::ServerInfo;
use rmcp::model::Tool;
use serde::Deserialize;
use serde_json::json;
use tokio::task;
#[derive(Clone)]
struct TestToolServer {
tools: Arc<Vec<Tool>>,
}
pub fn stdio() -> (tokio::io::Stdin, tokio::io::Stdout) {
(tokio::io::stdin(), tokio::io::stdout())
}
impl TestToolServer {
fn new() -> Self {
let tools = vec![Self::echo_tool()];
Self {
tools: Arc::new(tools),
}
}
fn echo_tool() -> Tool {
#[expect(clippy::expect_used)]
let schema: JsonObject = serde_json::from_value(json!({
"type": "object",
"properties": {
"message": { "type": "string" },
"env_var": { "type": "string" }
},
"required": ["message"],
"additionalProperties": false
}))
.expect("echo tool schema should deserialize");
Tool::new(
Cow::Borrowed("echo"),
Cow::Borrowed("Echo back the provided message and include environment data."),
Arc::new(schema),
)
}
}
#[derive(Deserialize)]
struct EchoArgs {
message: String,
#[allow(dead_code)]
env_var: Option<String>,
}
impl ServerHandler for TestToolServer {
fn get_info(&self) -> ServerInfo {
ServerInfo {
capabilities: ServerCapabilities::builder()
.enable_tools()
.enable_tool_list_changed()
.build(),
..ServerInfo::default()
}
}
fn list_tools(
&self,
_request: Option<PaginatedRequestParam>,
_context: rmcp::service::RequestContext<rmcp::service::RoleServer>,
) -> impl std::future::Future<Output = Result<ListToolsResult, McpError>> + Send + '_ {
let tools = self.tools.clone();
async move {
Ok(ListToolsResult {
tools: (*tools).clone(),
next_cursor: None,
})
}
}
async fn call_tool(
&self,
request: CallToolRequestParam,
_context: rmcp::service::RequestContext<rmcp::service::RoleServer>,
) -> Result<CallToolResult, McpError> {
match request.name.as_ref() {
"echo" => {
let args: EchoArgs = match request.arguments {
Some(arguments) => serde_json::from_value(serde_json::Value::Object(
arguments.into_iter().collect(),
))
.map_err(|err| McpError::invalid_params(err.to_string(), None))?,
None => {
return Err(McpError::invalid_params(
"missing arguments for echo tool",
None,
));
}
};
let env_snapshot: HashMap<String, String> = std::env::vars().collect();
let structured_content = json!({
"echo": format!("ECHOING: {}", args.message),
"env": env_snapshot.get("MCP_TEST_VALUE"),
});
Ok(CallToolResult {
content: Vec::new(),
structured_content: Some(structured_content),
is_error: Some(false),
meta: None,
})
}
other => Err(McpError::invalid_params(
format!("unknown tool: {other}"),
None,
)),
}
}
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
eprintln!("starting rmcp test server");
// Run the server with STDIO transport. If the client disconnects we simply
// bubble up the error so the process exits.
let service = TestToolServer::new();
let running = service.serve(stdio()).await?;
// Wait for the client to finish interacting with the server.
running.waiting().await?;
// Drain background tasks to ensure clean shutdown.
task::yield_now().await;
Ok(())
}

View File

@@ -0,0 +1,167 @@
use std::borrow::Cow;
use std::collections::HashMap;
use std::io::ErrorKind;
use std::net::SocketAddr;
use std::sync::Arc;
use axum::Router;
use rmcp::ErrorData as McpError;
use rmcp::handler::server::ServerHandler;
use rmcp::model::CallToolRequestParam;
use rmcp::model::CallToolResult;
use rmcp::model::JsonObject;
use rmcp::model::ListToolsResult;
use rmcp::model::PaginatedRequestParam;
use rmcp::model::ServerCapabilities;
use rmcp::model::ServerInfo;
use rmcp::model::Tool;
use rmcp::transport::StreamableHttpServerConfig;
use rmcp::transport::StreamableHttpService;
use rmcp::transport::streamable_http_server::session::local::LocalSessionManager;
use serde::Deserialize;
use serde_json::json;
use tokio::task;
#[derive(Clone)]
struct TestToolServer {
tools: Arc<Vec<Tool>>,
}
impl TestToolServer {
fn new() -> Self {
let tools = vec![Self::echo_tool()];
Self {
tools: Arc::new(tools),
}
}
fn echo_tool() -> Tool {
#[expect(clippy::expect_used)]
let schema: JsonObject = serde_json::from_value(json!({
"type": "object",
"properties": {
"message": { "type": "string" },
"env_var": { "type": "string" }
},
"required": ["message"],
"additionalProperties": false
}))
.expect("echo tool schema should deserialize");
Tool::new(
Cow::Borrowed("echo"),
Cow::Borrowed("Echo back the provided message and include environment data."),
Arc::new(schema),
)
}
}
#[derive(Deserialize)]
struct EchoArgs {
message: String,
#[allow(dead_code)]
env_var: Option<String>,
}
impl ServerHandler for TestToolServer {
fn get_info(&self) -> ServerInfo {
ServerInfo {
capabilities: ServerCapabilities::builder()
.enable_tools()
.enable_tool_list_changed()
.build(),
..ServerInfo::default()
}
}
fn list_tools(
&self,
_request: Option<PaginatedRequestParam>,
_context: rmcp::service::RequestContext<rmcp::service::RoleServer>,
) -> impl std::future::Future<Output = Result<ListToolsResult, McpError>> + Send + '_ {
let tools = self.tools.clone();
async move {
Ok(ListToolsResult {
tools: (*tools).clone(),
next_cursor: None,
})
}
}
async fn call_tool(
&self,
request: CallToolRequestParam,
_context: rmcp::service::RequestContext<rmcp::service::RoleServer>,
) -> Result<CallToolResult, McpError> {
match request.name.as_ref() {
"echo" => {
let args: EchoArgs = match request.arguments {
Some(arguments) => serde_json::from_value(serde_json::Value::Object(
arguments.into_iter().collect(),
))
.map_err(|err| McpError::invalid_params(err.to_string(), None))?,
None => {
return Err(McpError::invalid_params(
"missing arguments for echo tool",
None,
));
}
};
let env_snapshot: HashMap<String, String> = std::env::vars().collect();
let structured_content = json!({
"echo": format!("ECHOING: {}", args.message),
"env": env_snapshot.get("MCP_TEST_VALUE"),
});
Ok(CallToolResult {
content: Vec::new(),
structured_content: Some(structured_content),
is_error: Some(false),
meta: None,
})
}
other => Err(McpError::invalid_params(
format!("unknown tool: {other}"),
None,
)),
}
}
}
fn parse_bind_addr() -> Result<SocketAddr, Box<dyn std::error::Error>> {
let default_addr = "127.0.0.1:3920";
let bind_addr = std::env::var("MCP_STREAMABLE_HTTP_BIND_ADDR")
.or_else(|_| std::env::var("BIND_ADDR"))
.unwrap_or_else(|_| default_addr.to_string());
Ok(bind_addr.parse()?)
}
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
let bind_addr = parse_bind_addr()?;
let listener = match tokio::net::TcpListener::bind(&bind_addr).await {
Ok(listener) => listener,
Err(err) if err.kind() == ErrorKind::PermissionDenied => {
eprintln!(
"failed to bind to {bind_addr}: {err}. make sure the process has network access"
);
return Ok(());
}
Err(err) => return Err(err.into()),
};
eprintln!("starting rmcp streamable http test server on http://{bind_addr}/mcp");
let router = Router::new().nest_service(
"/mcp",
StreamableHttpService::new(
|| Ok(TestToolServer::new()),
Arc::new(LocalSessionManager::default()),
StreamableHttpServerConfig::default(),
),
);
axum::serve(listener, router).await?;
task::yield_now().await;
Ok(())
}

View File

@@ -7,6 +7,7 @@ use std::time::Duration;
use anyhow::Result;
use anyhow::anyhow;
use futures::FutureExt;
use mcp_types::CallToolRequestParams;
use mcp_types::CallToolResult;
use mcp_types::InitializeRequestParams;
@@ -19,7 +20,9 @@ use rmcp::model::PaginatedRequestParam;
use rmcp::service::RoleClient;
use rmcp::service::RunningService;
use rmcp::service::{self};
use rmcp::transport::StreamableHttpClientTransport;
use rmcp::transport::child_process::TokioChildProcess;
use rmcp::transport::streamable_http_client::StreamableHttpClientTransportConfig;
use tokio::io::AsyncBufReadExt;
use tokio::io::BufReader;
use tokio::process::Command;
@@ -35,9 +38,14 @@ use crate::utils::convert_to_rmcp;
use crate::utils::create_env_for_mcp_server;
use crate::utils::run_with_timeout;
enum PendingTransport {
ChildProcess(TokioChildProcess),
StreamableHttp(StreamableHttpClientTransport<reqwest::Client>),
}
enum ClientState {
Connecting {
transport: Option<TokioChildProcess>,
transport: Option<PendingTransport>,
},
Ready {
service: Arc<RunningService<RoleClient, LoggingClientHandler>>,
@@ -90,7 +98,22 @@ impl RmcpClient {
Ok(Self {
state: Mutex::new(ClientState::Connecting {
transport: Some(transport),
transport: Some(PendingTransport::ChildProcess(transport)),
}),
})
}
pub fn new_streamable_http_client(url: String, bearer_token: Option<String>) -> Result<Self> {
let mut config = StreamableHttpClientTransportConfig::with_uri(url);
if let Some(token) = bearer_token {
config = config.auth_header(format!("Bearer {token}"));
}
let transport = StreamableHttpClientTransport::from_config(config);
Ok(Self {
state: Mutex::new(ClientState::Connecting {
transport: Some(PendingTransport::StreamableHttp(transport)),
}),
})
}
@@ -116,7 +139,14 @@ impl RmcpClient {
let client_info = convert_to_rmcp::<_, InitializeRequestParam>(params.clone())?;
let client_handler = LoggingClientHandler::new(client_info);
let service_future = service::serve_client(client_handler, transport);
let service_future = match transport {
PendingTransport::ChildProcess(transport) => {
service::serve_client(client_handler.clone(), transport).boxed()
}
PendingTransport::StreamableHttp(transport) => {
service::serve_client(client_handler, transport).boxed()
}
};
let service = match timeout {
Some(duration) => time::timeout(duration, service_future)

View File

@@ -40,23 +40,20 @@ codex-login = { workspace = true }
codex-ollama = { workspace = true }
codex-protocol = { workspace = true }
color-eyre = { workspace = true }
crossterm = { workspace = true, features = [
"bracketed-paste",
"event-stream",
] }
dirs = { workspace = true }
crossterm = { workspace = true, features = ["bracketed-paste", "event-stream"] }
diffy = { workspace = true }
image = { workspace = true, features = [
"jpeg",
"png",
] }
dirs = { workspace = true }
image = { workspace = true, features = ["jpeg", "png"] }
itertools = { workspace = true }
lazy_static = { workspace = true }
mcp-types = { workspace = true }
path-clean = { workspace = true }
pathdiff = { workspace = true }
pulldown-cmark = { workspace = true }
rand = { workspace = true }
ratatui = { workspace = true, features = [
"scrolling-regions",
"unstable-backend-writer",
"unstable-rendered-line-info",
"unstable-widget-ref",
] }
@@ -80,11 +77,9 @@ tokio-stream = { workspace = true }
tracing = { workspace = true, features = ["log"] }
tracing-appender = { workspace = true }
tracing-subscriber = { workspace = true, features = ["env-filter"] }
pulldown-cmark = { workspace = true }
unicode-segmentation = { workspace = true }
unicode-width = { workspace = true }
url = { workspace = true }
pathdiff = { workspace = true }
[target.'cfg(unix)'.dependencies]
libc = { workspace = true }

View File

@@ -447,7 +447,7 @@ mod tests {
.iter()
.map(|span| span.content.as_ref())
.collect();
assert_eq!(intro_text, "> intro");
assert_eq!(intro_text, " intro");
}
#[test]
@@ -479,7 +479,7 @@ mod tests {
.iter()
.map(|span| span.content.as_ref())
.collect();
assert_eq!(intro_text, "> intro");
assert_eq!(intro_text, " intro");
let user_first = cells[1]
.as_any()

View File

@@ -90,6 +90,7 @@ impl AsciiAnimation {
true
}
#[allow(dead_code)]
pub(crate) fn request_frame(&self) {
self.request_frame.schedule_frame();
}

View File

@@ -8,15 +8,10 @@ use ratatui::layout::Constraint;
use ratatui::layout::Layout;
use ratatui::layout::Margin;
use ratatui::layout::Rect;
use ratatui::style::Color;
use ratatui::style::Modifier;
use ratatui::style::Style;
use ratatui::style::Stylize;
use ratatui::text::Line;
use ratatui::text::Span;
use ratatui::widgets::Block;
use ratatui::widgets::BorderType;
use ratatui::widgets::Borders;
use ratatui::widgets::StatefulWidgetRef;
use ratatui::widgets::WidgetRef;
@@ -30,6 +25,8 @@ use super::paste_burst::CharDecision;
use super::paste_burst::PasteBurst;
use crate::bottom_pane::paste_burst::FlushResult;
use crate::slash_command::SlashCommand;
use crate::style::user_message_style;
use crate::terminal_palette;
use codex_protocol::custom_prompts::CustomPrompt;
use crate::app_event::AppEvent;
@@ -97,8 +94,6 @@ enum ActivePopup {
}
const FOOTER_HINT_HEIGHT: u16 = 1;
const FOOTER_SPACING_HEIGHT: u16 = 1;
const FOOTER_HEIGHT_WITH_HINT: u16 = FOOTER_HINT_HEIGHT + FOOTER_SPACING_HEIGHT;
impl ChatComposer {
pub fn new(
@@ -137,30 +132,40 @@ impl ChatComposer {
}
pub fn desired_height(&self, width: u16) -> u16 {
// Leave 1 column for the left border and 1 column for left padding
self.textarea
.desired_height(width.saturating_sub(LIVE_PREFIX_COLS))
+ 2
+ match &self.active_popup {
ActivePopup::None => FOOTER_HEIGHT_WITH_HINT,
ActivePopup::None => FOOTER_HINT_HEIGHT,
ActivePopup::Command(c) => c.calculate_required_height(width),
ActivePopup::File(c) => c.calculate_required_height(),
}
}
pub fn cursor_pos(&self, area: Rect) -> Option<(u16, u16)> {
fn layout_areas(&self, area: Rect) -> [Rect; 3] {
let popup_constraint = match &self.active_popup {
ActivePopup::Command(popup) => {
Constraint::Max(popup.calculate_required_height(area.width))
}
ActivePopup::File(popup) => Constraint::Max(popup.calculate_required_height()),
ActivePopup::None => Constraint::Max(FOOTER_HEIGHT_WITH_HINT),
ActivePopup::None => Constraint::Max(FOOTER_HINT_HEIGHT),
};
let [textarea_rect, _] =
let mut area = area;
// Leave an empty row at the top, unless there isn't room.
if area.height > 1 {
area.height -= 1;
area.y += 1;
}
let [composer_rect, popup_rect] =
Layout::vertical([Constraint::Min(1), popup_constraint]).areas(area);
let mut textarea_rect = textarea_rect;
// Leave 1 for border and 1 for padding
let mut textarea_rect = composer_rect;
textarea_rect.width = textarea_rect.width.saturating_sub(LIVE_PREFIX_COLS);
textarea_rect.x = textarea_rect.x.saturating_add(LIVE_PREFIX_COLS);
[composer_rect, textarea_rect, popup_rect]
}
pub fn cursor_pos(&self, area: Rect) -> Option<(u16, u16)> {
let [_, textarea_rect, _] = self.layout_areas(area);
let state = *self.textarea_state.borrow();
self.textarea.cursor_pos_with_state(textarea_rect, state)
}
@@ -1232,19 +1237,13 @@ impl ChatComposer {
impl WidgetRef for ChatComposer {
fn render_ref(&self, area: Rect, buf: &mut Buffer) {
let (popup_constraint, hint_spacing) = match &self.active_popup {
ActivePopup::Command(popup) => (
Constraint::Max(popup.calculate_required_height(area.width)),
0,
),
ActivePopup::File(popup) => (Constraint::Max(popup.calculate_required_height()), 0),
ActivePopup::None => (
Constraint::Length(FOOTER_HEIGHT_WITH_HINT),
FOOTER_SPACING_HEIGHT,
),
};
let [textarea_rect, popup_rect] =
Layout::vertical([Constraint::Min(1), popup_constraint]).areas(area);
let [composer_rect, textarea_rect, popup_rect] = self.layout_areas(area);
if !matches!(self.active_popup, ActivePopup::None) {
buf.set_style(
popup_rect,
user_message_style(terminal_palette::default_bg()),
);
}
match &self.active_popup {
ActivePopup::Command(popup) => {
popup.render_ref(popup_rect, buf);
@@ -1253,16 +1252,9 @@ impl WidgetRef for ChatComposer {
popup.render_ref(popup_rect, buf);
}
ActivePopup::None => {
let hint_rect = if hint_spacing > 0 {
let [_, hint_rect] = Layout::vertical([
Constraint::Length(hint_spacing),
Constraint::Length(FOOTER_HINT_HEIGHT),
])
.areas(popup_rect);
hint_rect
} else {
popup_rect
};
let mut hint_rect = popup_rect;
hint_rect.x += 2;
hint_rect.width = hint_rect.width.saturating_sub(2);
render_footer(
hint_rect,
buf,
@@ -1276,23 +1268,17 @@ impl WidgetRef for ChatComposer {
);
}
}
let border_style = if self.has_focus {
Style::default().fg(Color::Cyan)
} else {
Style::default().add_modifier(Modifier::DIM)
};
Block::default()
.borders(Borders::LEFT)
.border_type(BorderType::QuadrantOutside)
.border_style(border_style)
.render_ref(
Rect::new(textarea_rect.x, textarea_rect.y, 1, textarea_rect.height),
buf,
);
let mut textarea_rect = textarea_rect;
// Leave 1 for border and 1 for padding
textarea_rect.width = textarea_rect.width.saturating_sub(LIVE_PREFIX_COLS);
textarea_rect.x = textarea_rect.x.saturating_add(LIVE_PREFIX_COLS);
let style = user_message_style(terminal_palette::default_bg());
let mut block_rect = composer_rect;
block_rect.y = composer_rect.y.saturating_sub(1);
block_rect.height = composer_rect.height.saturating_add(1);
Block::default().style(style).render_ref(block_rect, buf);
buf.set_span(
composer_rect.x,
composer_rect.y,
&"".bold(),
composer_rect.width,
);
let mut state = self.textarea_state.borrow_mut();
StatefulWidgetRef::render_ref(&(&self.textarea), textarea_rect, buf, &mut state);
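
The refactor above centralizes geometry in `layout_areas`: it reserves one blank spacer row at the top (when there's room), then splits the rest into the composer rows and the popup/footer region, so `desired_height`, `cursor_pos`, and `render_ref` can no longer drift apart. A minimal standalone sketch of that split, assuming a `Constraint::Max(2)` footer purely for illustration:

```
use ratatui::layout::{Constraint, Layout, Rect};

fn split(area: Rect) -> (Rect, Rect) {
    let mut area = area;
    // Leave an empty row at the top, unless there isn't room.
    if area.height > 1 {
        area.height -= 1;
        area.y += 1;
    }
    let [composer_rect, popup_rect] =
        Layout::vertical([Constraint::Min(1), Constraint::Max(2)]).areas(area);
    (composer_rect, popup_rect)
}

fn main() {
    let (composer, popup) = split(Rect::new(0, 0, 80, 10));
    assert_eq!((composer.y, composer.height), (1, 7)); // spacer row consumed
    assert_eq!((popup.y, popup.height), (8, 2)); // footer pinned below
}
```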


@@ -209,8 +209,8 @@ impl WidgetRef for CommandPopup {
&rows,
&self.state,
MAX_POPUP_ROWS,
false,
"no matches",
false,
);
}
}


@@ -144,8 +144,8 @@ impl WidgetRef for &FileSearchPopup {
&rows_all,
&self.state,
MAX_POPUP_ROWS,
false,
empty_message,
false,
);
}
}


@@ -454,8 +454,8 @@ impl BottomPaneView for ListSelectionView {
&rows,
&self.state,
MAX_POPUP_ROWS,
true,
"no matches",
true,
);
}


@@ -103,6 +103,10 @@ impl BottomPane {
}
}
pub fn status_widget(&self) -> Option<&StatusIndicatorWidget> {
self.status.as_ref()
}
fn active_view(&self) -> Option<&dyn BottomPaneView> {
self.view_stack.last().map(std::convert::AsRef::as_ref)
}
@@ -150,7 +154,9 @@ impl BottomPane {
let status_height = self
.status
.as_ref()
.map_or(0, |status| status.desired_height(area.width));
.map_or(0, |status| status.desired_height(area.width))
.min(area.height.saturating_sub(1));
Layout::vertical([Constraint::Max(status_height), Constraint::Min(1)]).areas(area)
}
}
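
The added `.min(area.height.saturating_sub(1))` clamps the status row so it can never swallow the whole pane; combined with `Constraint::Min(1)`, at least one row always remains for the composer. Worked through with hypothetical numbers:

```
// Pane is 3 rows tall; the status widget asks for 5 rows.
let status_height = 5u16.min(3u16.saturating_sub(1));
assert_eq!(status_height, 2); // the remaining row goes to the composer
```
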
@@ -616,7 +622,7 @@ mod tests {
// Composer placeholder should be visible somewhere below.
let mut found_composer = false;
for y in 1..area.height.saturating_sub(2) {
for y in 1..area.height {
let mut row = String::new();
for x in 0..area.width {
row.push(buf[(x, y)].symbol().chars().next().unwrap_or(' '));


@@ -8,9 +8,6 @@ use ratatui::style::Style;
use ratatui::style::Stylize;
use ratatui::text::Line;
use ratatui::text::Span;
use ratatui::widgets::Block;
use ratatui::widgets::BorderType;
use ratatui::widgets::Borders;
use ratatui::widgets::Paragraph;
use ratatui::widgets::Widget;
use unicode_width::UnicodeWidthChar;
@@ -119,15 +116,21 @@ pub(crate) fn render_rows(
rows_all: &[GenericDisplayRow],
state: &ScrollState,
max_results: usize,
_dim_non_selected: bool,
empty_message: &str,
include_border: bool,
) {
// Always draw a dim left border to match other popups.
let block = Block::default()
.borders(Borders::LEFT)
.border_type(BorderType::QuadrantOutside)
.border_style(Style::default().add_modifier(Modifier::DIM));
block.render(area, buf);
if include_border {
use ratatui::widgets::Block;
use ratatui::widgets::BorderType;
use ratatui::widgets::Borders;
// Always draw a dim left border to match other popups.
let block = Block::default()
.borders(Borders::LEFT)
.border_type(BorderType::QuadrantOutside)
.border_style(Style::default().add_modifier(Modifier::DIM));
block.render(area, buf);
}
// Content renders to the right of the border with the same live prefix
// padding used by the composer so the popup aligns with the input text.


@@ -2,13 +2,13 @@
source: tui/src/bottom_pane/chat_composer.rs
expression: terminal.backend()
---
"▌ [Pasted Content 1002 chars][Pasted Content 1004 chars] "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
" "
"⏎ send ⌃J newline ⌃T transcript ⌃C quit "
" [Pasted Content 1002 chars][Pasted Content 1004 chars] "
" "
" "
" "
" "
" "
" "
" "
" ⏎ send ⌃J newline ⌃T transcript ⌃C quit "


@@ -2,13 +2,13 @@
source: tui/src/bottom_pane/chat_composer.rs
expression: terminal.backend()
---
"▌ Ask Codex to do anything "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
" "
"⏎ send ⌃J newline ⌃T transcript ⌃C quit "
" Ask Codex to do anything "
" "
" "
" "
" "
" "
" "
" "
" ⏎ send ⌃J newline ⌃T transcript ⌃C quit "


@@ -2,13 +2,13 @@
source: tui/src/bottom_pane/chat_composer.rs
expression: terminal.backend()
---
"▌ [Pasted Content 1005 chars] "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
" "
"⏎ send ⌃J newline ⌃T transcript ⌃C quit "
" [Pasted Content 1005 chars] "
" "
" "
" "
" "
" "
" "
" "
" ⏎ send ⌃J newline ⌃T transcript ⌃C quit "


@@ -2,13 +2,13 @@
source: tui/src/bottom_pane/chat_composer.rs
expression: terminal.backend()
---
"▌ [Pasted Content 1003 chars][Pasted Content 1007 chars] another short paste "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
" "
"⏎ send ⌃J newline ⌃T transcript ⌃C quit "
" [Pasted Content 1003 chars][Pasted Content 1007 chars] another short paste "
" "
" "
" "
" "
" "
" "
" "
" ⏎ send ⌃J newline ⌃T transcript ⌃C quit "


@@ -2,7 +2,7 @@
source: tui/src/bottom_pane/chat_composer.rs
expression: terminal.backend()
---
"▌ /mo "
" "
" /model choose what model and reasoning effort to use "
" /mention mention a file "
" "
" /mo "
" /model choose what model and reasoning effort to use "
" /mention mention a file "


@@ -2,13 +2,13 @@
source: tui/src/bottom_pane/chat_composer.rs
expression: terminal.backend()
---
"▌ short "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
"▌ "
" "
" send ⌃J newline ⌃T transcript ⌃C quit "
" short "
" "
" "
" "
" "
" "
" "
" "
" ⏎ send ⌃J newline ⌃T transcript ⌃C quit "


@@ -59,6 +59,7 @@ use tracing::debug;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use crate::bottom_pane::ApprovalRequest;
use crate::bottom_pane::BottomPane;
use crate::bottom_pane::BottomPaneParams;
use crate::bottom_pane::CancellationEvent;
@@ -79,13 +80,11 @@ use crate::history_cell::AgentMessageCell;
use crate::history_cell::HistoryCell;
use crate::history_cell::McpToolCallCell;
use crate::history_cell::PatchEventType;
use crate::history_cell::RateLimitSnapshotDisplay;
use crate::markdown::append_markdown;
use crate::slash_command::SlashCommand;
use crate::status::RateLimitSnapshotDisplay;
use crate::text_formatting::truncate_text;
use crate::tui::FrameRequester;
// streaming internals are provided by crate::streaming and crate::markdown_stream
use crate::bottom_pane::ApprovalRequest;
mod interrupts;
use self::interrupts::InterruptManager;
mod agent;
@@ -255,6 +254,8 @@ pub(crate) struct ChatWidget {
// List of ghost commits corresponding to each turn.
ghost_snapshots: Vec<GhostCommit>,
ghost_snapshots_disabled: bool,
// Whether to add a final message separator after the last message
needs_final_message_separator: bool,
}
struct UserMessage {
@@ -417,7 +418,7 @@ impl ChatWidget {
.and_then(|window| window.window_minutes),
);
let display = history_cell::rate_limit_snapshot_display(&snapshot, Local::now());
let display = crate::status::rate_limit_snapshot_display(&snapshot, Local::now());
self.rate_limit_snapshot = Some(display);
if !warnings.is_empty() {
@@ -650,6 +651,14 @@ impl ChatWidget {
self.flush_active_cell();
if self.stream_controller.is_none() {
if self.needs_final_message_separator {
let elapsed_seconds = self
.bottom_pane
.status_widget()
.map(super::status_indicator_widget::StatusIndicatorWidget::elapsed_seconds);
self.add_to_history(history_cell::FinalMessageSeparator::new(elapsed_seconds));
self.needs_final_message_separator = false;
}
self.stream_controller = Some(StreamController::new(self.config.clone()));
}
if let Some(controller) = self.stream_controller.as_mut()
@@ -903,6 +912,7 @@ impl ChatWidget {
is_review_mode: false,
ghost_snapshots: Vec::new(),
ghost_snapshots_disabled: true,
needs_final_message_separator: false,
}
}
@@ -964,6 +974,7 @@ impl ChatWidget {
is_review_mode: false,
ghost_snapshots: Vec::new(),
ghost_snapshots_disabled: true,
needs_final_message_separator: false,
}
}
@@ -1190,6 +1201,7 @@ impl ChatWidget {
fn flush_active_cell(&mut self) {
if let Some(active) = self.active_cell.take() {
self.needs_final_message_separator = true;
self.app_event_tx.send(AppEvent::InsertHistoryCell(active));
}
}
@@ -1202,6 +1214,7 @@ impl ChatWidget {
if !cell.display_lines(u16::MAX).is_empty() {
// Only break exec grouping if the cell renders visible lines.
self.flush_active_cell();
self.needs_final_message_separator = true;
}
self.app_event_tx.send(AppEvent::InsertHistoryCell(cell));
}
@@ -1243,6 +1256,7 @@ impl ChatWidget {
if !text.is_empty() {
self.add_to_history(history_cell::new_user_prompt(text));
}
self.needs_final_message_separator = false;
}
fn capture_ghost_snapshot(&mut self) {
@@ -1542,7 +1556,7 @@ impl ChatWidget {
default_usage = TokenUsage::default();
&default_usage
};
self.add_to_history(history_cell::new_status_output(
self.add_to_history(crate::status::new_status_output(
&self.config,
usage_ref,
&self.conversation_id,


@@ -1,8 +1,18 @@
---
source: tui/src/chatwidget/tests.rs
expression: visible_after
assertion_line: 1152
expression: "lines[start_idx..].join(\"\\n\")"
---
> I’m going to scan the workspace and Cargo manifests to see build profiles and
I need to check the codex-rs repository to explain why the project's binaries
are large. The user is likely seeking specifics about the setup: are Rust
builds static, what features are enabled, and is debug information included?
It could be due to static linking, included OpenSSL, or how panic handling
is set up. I should look into the Cargo.toml file to confirm features and
profiles without needing to edit any code. Let's get started on this!
─ Worked for 0s ────────────────────────────────────────────────────────────────
• I’m going to scan the workspace and Cargo manifests to see build profiles and
dependencies that impact binary size. Then I’ll summarize the main causes.
• Explored
@@ -103,7 +113,9 @@ expression: visible_after
"Main Causes" and "Build-Mode Notes." I can also include brief suggestions for
reducing size, but I want to stay focused on answering the user's question.
> Here’s what’s driving size in this workspace’s binaries.
─ Worked for 0s ────────────────────────────────────────────────────────────────
• Here’s what’s driving size in this workspace’s binaries.
Main Causes


@@ -2,4 +2,4 @@
source: tui/src/chatwidget/tests.rs
expression: terminal.backend()
---
" Ask Codex to do anything "
" Ask Codex to do anything "


@@ -2,5 +2,5 @@
source: tui/src/chatwidget/tests.rs
expression: terminal.backend()
---
"▌ Ask Codex to do anything "
" "
" Ask Codex to do anything "


@@ -3,5 +3,5 @@ source: tui/src/chatwidget/tests.rs
expression: terminal.backend()
---
" "
" Ask Codex to do anything "
" Ask Codex to do anything "
" "


@@ -2,4 +2,4 @@
source: tui/src/chatwidget/tests.rs
expression: terminal.backend()
---
" Ask Codex to do anything "
" Ask Codex to do anything "


@@ -3,4 +3,4 @@ source: tui/src/chatwidget/tests.rs
expression: terminal.backend()
---
" Thinking (0s • Esc to interrupt) "
" Ask Codex to do anything "
" Ask Codex to do anything "


@@ -3,5 +3,5 @@ source: tui/src/chatwidget/tests.rs
expression: terminal.backend()
---
" "
" Ask Codex to do anything "
" Ask Codex to do anything "
" "


@@ -1,8 +1,8 @@
---
source: tui/src/chatwidget/tests.rs
expression: visual
expression: term.backend().vt100().screen().contents()
---
> I’m going to search the repo for where “Change Approved” is rendered to update
I’m going to search the repo for where “Change Approved” is rendered to update
that view.
• Explored
@@ -11,6 +11,7 @@ expression: visual
Investigating rendering code (0s • Esc to interrupt)
▌ Summarize recent commits
⏎ send ⌃J newline ⌃T transcript ⌃C quit
Summarize recent commits
⏎ send ⌃J newline ⌃T transcript ⌃C quit


@@ -2,7 +2,7 @@
source: tui/src/chatwidget/tests.rs
expression: visual
---
> -- Indented code block (4 spaces)
-- Indented code block (4 spaces)
SELECT *
FROM "users"
WHERE "email" LIKE '%@example.com';


@@ -2,4 +2,4 @@
source: tui/src/chatwidget/tests.rs
expression: combined
---
> Here is the result.
Here is the result.


@@ -2,4 +2,4 @@
source: tui/src/chatwidget/tests.rs
expression: combined
---
> Here is the result.
Here is the result.


@@ -5,7 +5,8 @@ expression: terminal.backend()
" "
" Analyzing (0s • Esc to interrupt) "
" "
"▌ Ask Codex to do anything "
" "
"⏎ send ⌃J newline ⌃T transcript ⌃C quit "
" Ask Codex to do anything "
" "
" ⏎ send ⌃J newline ⌃T transcript ⌃C quit "
" "


@@ -1,6 +1,7 @@
use super::*;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use crate::test_backend::VT100Backend;
use codex_core::AuthManager;
use codex_core::CodexAuth;
use codex_core::config::Config;
@@ -81,6 +82,7 @@ fn upgrade_event_payload_for_tests(mut payload: serde_json::Value) -> serde_json
payload
}
/*
#[test]
fn final_answer_without_newline_is_flushed_immediately() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
@@ -138,6 +140,7 @@ fn final_answer_without_newline_is_flushed_immediately() {
"expected final answer text to be flushed to history"
);
}
*/
#[test]
fn resumed_initial_messages_render_history() {
@@ -337,6 +340,7 @@ fn make_chatwidget_manual() -> (
is_review_mode: false,
ghost_snapshots: Vec::new(),
ghost_snapshots_disabled: false,
needs_final_message_separator: false,
};
(widget, rx, op_rx)
}
@@ -1001,7 +1005,7 @@ async fn binary_size_transcript_snapshot() {
let width: u16 = 80;
let height: u16 = 2000;
let viewport = Rect::new(0, height - 1, width, 1);
let backend = ratatui::backend::TestBackend::new(width, height);
let backend = VT100Backend::new(width, height);
let mut terminal = crate::custom_terminal::Terminal::with_options(backend)
.expect("failed to construct terminal");
terminal.set_viewport_area(viewport);
@@ -1010,7 +1014,6 @@ async fn binary_size_transcript_snapshot() {
let file = open_fixture("binary-size-log.jsonl");
let reader = BufReader::new(file);
let mut transcript = String::new();
let mut ansi: Vec<u8> = Vec::new();
let mut has_emitted_history = false;
for line in reader.lines() {
@@ -1071,11 +1074,7 @@ async fn binary_size_transcript_snapshot() {
}
has_emitted_history = true;
transcript.push_str(&lines_to_single_string(&lines));
crate::insert_history::insert_history_lines_to_writer(
&mut terminal,
&mut ansi,
lines,
);
crate::insert_history::insert_history_lines(&mut terminal, lines);
}
}
}
@@ -1096,11 +1095,7 @@ async fn binary_size_transcript_snapshot() {
}
has_emitted_history = true;
transcript.push_str(&lines_to_single_string(&lines));
crate::insert_history::insert_history_lines_to_writer(
&mut terminal,
&mut ansi,
lines,
);
crate::insert_history::insert_history_lines(&mut terminal, lines);
}
}
}
@@ -1111,13 +1106,12 @@ async fn binary_size_transcript_snapshot() {
// Build the final VT100 visual by parsing the ANSI stream. Trim trailing spaces per line
// and drop trailing empty lines so the shape matches the ideal fixture exactly.
let mut parser = vt100::Parser::new(height, width, 0);
parser.process(&ansi);
let screen = terminal.backend().vt100().screen();
let mut lines: Vec<String> = Vec::with_capacity(height as usize);
for row in 0..height {
let mut s = String::with_capacity(width as usize);
for col in 0..width {
if let Some(cell) = parser.screen().cell(row, col) {
if let Some(cell) = screen.cell(row, col) {
if let Some(ch) = cell.contents().chars().next() {
s.push(ch);
} else {
@@ -1144,7 +1138,7 @@ async fn binary_size_transcript_snapshot() {
// Prefer the first assistant content line (blockquote '>' prefix) after the marker;
// fallback to the first non-empty, non-'thinking' line.
let start_idx = (last_marker_line_idx + 1..lines.len())
.find(|&idx| lines[idx].trim_start().starts_with('>'))
.find(|&idx| lines[idx].trim_start().starts_with('•'))
.unwrap_or_else(|| {
(last_marker_line_idx + 1..lines.len())
.find(|&idx| {
@@ -1154,28 +1148,8 @@ async fn binary_size_transcript_snapshot() {
.expect("no content line found after marker")
});
let mut compare_lines: Vec<String> = Vec::new();
// Ensure the first line is trimmed-left to match the fixture shape.
compare_lines.push(lines[start_idx].trim_start().to_string());
compare_lines.extend(lines[(start_idx + 1)..].iter().cloned());
let visible_after = compare_lines.join("\n");
// Normalize: drop a leading 'thinking' line if present to avoid coupling
// to whether the reasoning header is rendered in history.
fn drop_leading_thinking(s: &str) -> String {
let mut it = s.lines();
let first = it.next();
let rest = it.collect::<Vec<_>>().join("\n");
if first.is_some_and(|l| l.trim() == "thinking") {
rest
} else {
s.to_string()
}
}
let visible_after = drop_leading_thinking(&visible_after);
// Snapshot the normalized visible transcript following the banner.
assert_snapshot!("binary_size_ideal_response", visible_after);
assert_snapshot!("binary_size_ideal_response", lines[start_idx..].join("\n"));
}
//
@@ -2104,66 +2078,25 @@ fn chatwidget_exec_and_status_layout_vt100_snapshot() {
chat.bottom_pane
.set_composer_text("Summarize recent commits".to_string());
// Dimensions
let width: u16 = 80;
let ui_height: u16 = chat.desired_height(width);
let vt_height: u16 = 40;
let viewport = Rect::new(0, vt_height - ui_height, width, ui_height);
let viewport = Rect::new(0, vt_height - ui_height - 1, width, ui_height);
// Use TestBackend for the terminal (no real ANSI emitted by drawing),
// but capture VT100 escape stream for history insertion with a separate writer.
let backend = ratatui::backend::TestBackend::new(width, vt_height);
let backend = VT100Backend::new(width, vt_height);
let mut term = crate::custom_terminal::Terminal::with_options(backend).expect("terminal");
term.set_viewport_area(viewport);
// 1) Apply any pending history insertions by emitting ANSI to a buffer via insert_history_lines_to_writer
let mut ansi: Vec<u8> = Vec::new();
for lines in drain_insert_history(&mut rx) {
crate::insert_history::insert_history_lines_to_writer(&mut term, &mut ansi, lines);
crate::insert_history::insert_history_lines(&mut term, lines);
}
// 2) Render the ChatWidget UI into an off-screen buffer using WidgetRef directly
let mut ui_buf = Buffer::empty(viewport);
(&chat).render_ref(viewport, &mut ui_buf);
term.draw(|f| {
(&chat).render_ref(f.area(), f.buffer_mut());
})
.unwrap();
// 3) Build VT100 visual from the captured ANSI
let mut parser = vt100::Parser::new(vt_height, width, 0);
parser.process(&ansi);
let mut vt_lines: Vec<String> = (0..vt_height)
.map(|row| {
let mut s = String::with_capacity(width as usize);
for col in 0..width {
if let Some(cell) = parser.screen().cell(row, col) {
if let Some(ch) = cell.contents().chars().next() {
s.push(ch);
} else {
s.push(' ');
}
} else {
s.push(' ');
}
}
s.trim_end().to_string()
})
.collect();
// 4) Overlay UI buffer content into the viewport region of the VT output
for rel_y in 0..viewport.height {
let y = viewport.y + rel_y;
let mut line = String::with_capacity(width as usize);
for x in 0..viewport.width {
let ch = ui_buf[(viewport.x + x, viewport.y + rel_y)]
.symbol()
.chars()
.next()
.unwrap_or(' ');
line.push(ch);
}
vt_lines[y as usize] = line.trim_end().to_string();
}
let visual = vt_lines.join("\n");
assert_snapshot!(visual);
assert_snapshot!(term.backend().vt100().screen().contents());
}
// E2E vt100 snapshot for complex markdown with indented and nested fenced code blocks
@@ -2182,13 +2115,11 @@ fn chatwidget_markdown_code_blocks_vt100_snapshot() {
// Build a vt100 visual from the history insertions only (no UI overlay)
let width: u16 = 80;
let height: u16 = 50;
let backend = ratatui::backend::TestBackend::new(width, height);
let backend = VT100Backend::new(width, height);
let mut term = crate::custom_terminal::Terminal::with_options(backend).expect("terminal");
// Place viewport at the last line so that history lines insert above it
term.set_viewport_area(Rect::new(0, height - 1, width, 1));
let mut ansi: Vec<u8> = Vec::new();
// Simulate streaming via AgentMessageDelta in 2-character chunks (no final AgentMessage).
let source: &str = r#"
@@ -2234,9 +2165,7 @@ printf 'fenced within fenced\n'
while let Ok(app_ev) = rx.try_recv() {
if let AppEvent::InsertHistoryCell(cell) = app_ev {
let lines = cell.display_lines(width);
crate::insert_history::insert_history_lines_to_writer(
&mut term, &mut ansi, lines,
);
crate::insert_history::insert_history_lines(&mut term, lines);
inserted_any = true;
}
}
@@ -2254,34 +2183,8 @@ printf 'fenced within fenced\n'
}),
});
for lines in drain_insert_history(&mut rx) {
crate::insert_history::insert_history_lines_to_writer(&mut term, &mut ansi, lines);
crate::insert_history::insert_history_lines(&mut term, lines);
}
let mut parser = vt100::Parser::new(height, width, 0);
parser.process(&ansi);
let mut vt_lines: Vec<String> = (0..height)
.map(|row| {
let mut s = String::with_capacity(width as usize);
for col in 0..width {
if let Some(cell) = parser.screen().cell(row, col) {
if let Some(ch) = cell.contents().chars().next() {
s.push(ch);
} else {
s.push(' ');
}
} else {
s.push(' ');
}
}
s.trim_end().to_string()
})
.collect();
// Compact trailing blank rows for a stable snapshot
while matches!(vt_lines.last(), Some(l) if l.trim().is_empty()) {
vt_lines.pop();
}
let visual = vt_lines.join("\n");
assert_snapshot!(visual);
assert_snapshot!(term.backend().vt100().screen().contents());
}

codex-rs/tui/src/color.rs (new file, 75 lines)

@@ -0,0 +1,75 @@
pub(crate) fn is_light(bg: (u8, u8, u8)) -> bool {
let (r, g, b) = bg;
let y = 0.299 * r as f32 + 0.587 * g as f32 + 0.114 * b as f32;
y > 128.0
}
pub(crate) fn blend(fg: (u8, u8, u8), bg: (u8, u8, u8), alpha: f32) -> (u8, u8, u8) {
let r = (fg.0 as f32 * alpha + bg.0 as f32 * (1.0 - alpha)) as u8;
let g = (fg.1 as f32 * alpha + bg.1 as f32 * (1.0 - alpha)) as u8;
let b = (fg.2 as f32 * alpha + bg.2 as f32 * (1.0 - alpha)) as u8;
(r, g, b)
}
/// Returns the perceptual color distance between two RGB colors.
/// Uses the CIE76 formula (Euclidean distance in Lab space approximation).
pub(crate) fn perceptual_distance(a: (u8, u8, u8), b: (u8, u8, u8)) -> f32 {
// Convert sRGB to linear RGB
fn srgb_to_linear(c: u8) -> f32 {
let c = c as f32 / 255.0;
if c <= 0.04045 {
c / 12.92
} else {
((c + 0.055) / 1.055).powf(2.4)
}
}
// Convert RGB to XYZ
fn rgb_to_xyz(r: u8, g: u8, b: u8) -> (f32, f32, f32) {
let r = srgb_to_linear(r);
let g = srgb_to_linear(g);
let b = srgb_to_linear(b);
let x = r * 0.4124 + g * 0.3576 + b * 0.1805;
let y = r * 0.2126 + g * 0.7152 + b * 0.0722;
let z = r * 0.0193 + g * 0.1192 + b * 0.9505;
(x, y, z)
}
// Convert XYZ to Lab
fn xyz_to_lab(x: f32, y: f32, z: f32) -> (f32, f32, f32) {
// D65 reference white
let xr = x / 0.95047;
let yr = y / 1.00000;
let zr = z / 1.08883;
fn f(t: f32) -> f32 {
if t > 0.008856 {
t.powf(1.0 / 3.0)
} else {
7.787 * t + 16.0 / 116.0
}
}
let fx = f(xr);
let fy = f(yr);
let fz = f(zr);
let l = 116.0 * fy - 16.0;
let a = 500.0 * (fx - fy);
let b = 200.0 * (fy - fz);
(l, a, b)
}
let (x1, y1, z1) = rgb_to_xyz(a.0, a.1, a.2);
let (x2, y2, z2) = rgb_to_xyz(b.0, b.1, b.2);
let (l1, a1, b1) = xyz_to_lab(x1, y1, z1);
let (l2, a2, b2) = xyz_to_lab(x2, y2, z2);
let dl = l1 - l2;
let da = a1 - a2;
let db = b1 - b2;
(dl * dl + da * da + db * db).sqrt()
}
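
A quick sketch of how these helpers behave, as a hypothetical test module (not part of the diff):

```
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn color_helpers() {
        // Blending white over black at 50% alpha lands on mid-gray.
        assert_eq!(blend((255, 255, 255), (0, 0, 0), 0.5), (127, 127, 127));
        // Luma threshold: white reads as light, black does not.
        assert!(is_light((255, 255, 255)));
        assert!(!is_light((0, 0, 0)));
        // Identical colors are zero distance apart in Lab space.
        assert!(perceptual_distance((10, 20, 30), (10, 20, 30)) < 1e-4);
    }
}
```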


@@ -22,13 +22,25 @@
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
// SOFTWARE.
use std::io;
use std::io::Write;
use crossterm::cursor::MoveTo;
use crossterm::queue;
use crossterm::style::Colors;
use crossterm::style::Print;
use crossterm::style::SetAttribute;
use crossterm::style::SetBackgroundColor;
use crossterm::style::SetColors;
use crossterm::style::SetForegroundColor;
use crossterm::terminal::Clear;
use ratatui::backend::Backend;
use ratatui::backend::ClearType;
use ratatui::buffer::Buffer;
use ratatui::layout::Position;
use ratatui::layout::Rect;
use ratatui::layout::Size;
use ratatui::style::Color;
use ratatui::style::Modifier;
use ratatui::widgets::WidgetRef;
#[derive(Debug, Hash)]
@@ -90,7 +102,7 @@ impl Frame<'_> {
#[derive(Debug, Default, Clone, Eq, PartialEq, Hash)]
pub struct Terminal<B>
where
B: Backend,
B: Backend + Write,
{
/// The backend used to interface with the terminal
backend: B,
@@ -113,6 +125,7 @@ where
impl<B> Drop for Terminal<B>
where
B: Backend,
B: Write,
{
#[allow(clippy::print_stderr)]
fn drop(&mut self) {
@@ -128,6 +141,7 @@ where
impl<B> Terminal<B>
where
B: Backend,
B: Write,
{
/// Creates a new [`Terminal`] with the given [`Backend`] and [`TerminalOptions`].
pub fn with_options(mut backend: B) -> io::Result<Self> {
@@ -176,11 +190,15 @@ where
pub fn flush(&mut self) -> io::Result<()> {
let previous_buffer = &self.buffers[1 - self.current];
let current_buffer = &self.buffers[self.current];
let updates = previous_buffer.diff(current_buffer);
if let Some((col, row, _)) = updates.last() {
self.last_known_cursor_pos = Position { x: *col, y: *row };
let updates = diff_buffers(previous_buffer, current_buffer);
if let Some(DrawCommand::Put { x, y, .. }) = updates
.iter()
.rev()
.find(|cmd| matches!(cmd, DrawCommand::Put { .. }))
{
self.last_known_cursor_pos = Position { x: *x, y: *y };
}
self.backend.draw(updates.into_iter())
draw(&mut self.backend, updates.into_iter())
}
/// Updates the Terminal so that internal buffers match the requested area.
@@ -307,8 +325,7 @@ where
self.swap_buffers();
// Flush
self.backend.flush()?;
ratatui::backend::Backend::flush(&mut self.backend)?;
Ok(())
}
@@ -367,3 +384,189 @@ where
self.backend.size()
}
}
use ratatui::buffer::Cell;
use unicode_width::UnicodeWidthStr;
#[derive(Debug)]
enum DrawCommand<'a> {
Put { x: u16, y: u16, cell: &'a Cell },
ClearToEnd { x: u16, y: u16, bg: Color },
}
fn diff_buffers<'a>(a: &'a Buffer, b: &'a Buffer) -> Vec<DrawCommand<'a>> {
let previous_buffer = &a.content;
let next_buffer = &b.content;
let mut updates = vec![];
let mut last_nonblank_column = vec![0; a.area.height as usize];
for y in 0..a.area.height {
let row_start = y as usize * a.area.width as usize;
let row_end = row_start + a.area.width as usize;
let row = &next_buffer[row_start..row_end];
let bg = row.last().map(|cell| cell.bg).unwrap_or(Color::Reset);
let x = row
.iter()
.rposition(|cell| cell.symbol() != " " || cell.bg != bg)
.unwrap_or(0);
last_nonblank_column[y as usize] = x as u16;
let (x_abs, y_abs) = a.pos_of(row_start + x + 1);
updates.push(DrawCommand::ClearToEnd {
x: x_abs,
y: y_abs,
bg,
});
}
// Cells invalidated by drawing/replacing preceding multi-width characters:
let mut invalidated: usize = 0;
// Cells from the current buffer to skip due to preceding multi-width characters taking
// their place (the skipped cells should be blank anyway), or due to per-cell-skipping:
let mut to_skip: usize = 0;
for (i, (current, previous)) in next_buffer.iter().zip(previous_buffer.iter()).enumerate() {
if !current.skip && (current != previous || invalidated > 0) && to_skip == 0 {
let (x, y) = a.pos_of(i);
let row = i / a.area.width as usize;
if x <= last_nonblank_column[row] {
updates.push(DrawCommand::Put {
x,
y,
cell: &next_buffer[i],
});
}
}
to_skip = current.symbol().width().saturating_sub(1);
let affected_width = std::cmp::max(current.symbol().width(), previous.symbol().width());
invalidated = std::cmp::max(affected_width, invalidated).saturating_sub(1);
}
updates
}
fn draw<'a, I>(writer: &mut impl Write, commands: I) -> io::Result<()>
where
I: Iterator<Item = DrawCommand<'a>>,
{
let mut fg = Color::Reset;
let mut bg = Color::Reset;
let mut modifier = Modifier::empty();
let mut last_pos: Option<Position> = None;
for command in commands {
let (x, y) = match command {
DrawCommand::Put { x, y, .. } => (x, y),
DrawCommand::ClearToEnd { x, y, .. } => (x, y),
};
// Move the cursor if the previous location was not (x - 1, y)
if !matches!(last_pos, Some(p) if x == p.x + 1 && y == p.y) {
queue!(writer, MoveTo(x, y))?;
}
last_pos = Some(Position { x, y });
match command {
DrawCommand::Put { cell, .. } => {
if cell.modifier != modifier {
let diff = ModifierDiff {
from: modifier,
to: cell.modifier,
};
diff.queue(writer)?;
modifier = cell.modifier;
}
if cell.fg != fg || cell.bg != bg {
queue!(
writer,
SetColors(Colors::new(cell.fg.into(), cell.bg.into()))
)?;
fg = cell.fg;
bg = cell.bg;
}
queue!(writer, Print(cell.symbol()))?;
}
DrawCommand::ClearToEnd { bg: clear_bg, .. } => {
queue!(writer, SetAttribute(crossterm::style::Attribute::Reset))?;
modifier = Modifier::empty();
queue!(writer, SetBackgroundColor(clear_bg.into()))?;
bg = clear_bg;
queue!(writer, Clear(crossterm::terminal::ClearType::UntilNewLine))?;
}
}
}
queue!(
writer,
SetForegroundColor(crossterm::style::Color::Reset),
SetBackgroundColor(crossterm::style::Color::Reset),
SetAttribute(crossterm::style::Attribute::Reset),
)?;
Ok(())
}
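
Together, `diff_buffers` and `draw` replace the ratatui backend's built-in diffing so the update stream can carry whole-row `ClearToEnd` commands in addition to per-cell `Put`s. A rough usage sketch (the area and cell contents here are assumptions):

```
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;
use ratatui::style::Style;

fn demo() -> std::io::Result<()> {
    let area = Rect::new(0, 0, 20, 2);
    let prev = Buffer::empty(area);
    let mut next = Buffer::empty(area);
    next.set_string(0, 0, "hello", Style::default());
    // Queue MoveTo/SetColors/Print/Clear for the changed cells only;
    // any io::Write sink works (here, an in-memory buffer).
    let mut sink: Vec<u8> = Vec::new();
    draw(&mut sink, diff_buffers(&prev, &next).into_iter())
}
```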
/// The `ModifierDiff` struct is used to calculate the difference between two `Modifier`
/// values. This is useful when updating the terminal display, as it allows for more
/// efficient updates by only sending the necessary changes.
struct ModifierDiff {
pub from: Modifier,
pub to: Modifier,
}
impl ModifierDiff {
fn queue<W: io::Write>(self, w: &mut W) -> io::Result<()> {
use crossterm::style::Attribute as CAttribute;
let removed = self.from - self.to;
if removed.contains(Modifier::REVERSED) {
queue!(w, SetAttribute(CAttribute::NoReverse))?;
}
if removed.contains(Modifier::BOLD) {
queue!(w, SetAttribute(CAttribute::NormalIntensity))?;
if self.to.contains(Modifier::DIM) {
queue!(w, SetAttribute(CAttribute::Dim))?;
}
}
if removed.contains(Modifier::ITALIC) {
queue!(w, SetAttribute(CAttribute::NoItalic))?;
}
if removed.contains(Modifier::UNDERLINED) {
queue!(w, SetAttribute(CAttribute::NoUnderline))?;
}
if removed.contains(Modifier::DIM) {
queue!(w, SetAttribute(CAttribute::NormalIntensity))?;
}
if removed.contains(Modifier::CROSSED_OUT) {
queue!(w, SetAttribute(CAttribute::NotCrossedOut))?;
}
if removed.contains(Modifier::SLOW_BLINK) || removed.contains(Modifier::RAPID_BLINK) {
queue!(w, SetAttribute(CAttribute::NoBlink))?;
}
let added = self.to - self.from;
if added.contains(Modifier::REVERSED) {
queue!(w, SetAttribute(CAttribute::Reverse))?;
}
if added.contains(Modifier::BOLD) {
queue!(w, SetAttribute(CAttribute::Bold))?;
}
if added.contains(Modifier::ITALIC) {
queue!(w, SetAttribute(CAttribute::Italic))?;
}
if added.contains(Modifier::UNDERLINED) {
queue!(w, SetAttribute(CAttribute::Underlined))?;
}
if added.contains(Modifier::DIM) {
queue!(w, SetAttribute(CAttribute::Dim))?;
}
if added.contains(Modifier::CROSSED_OUT) {
queue!(w, SetAttribute(CAttribute::CrossedOut))?;
}
if added.contains(Modifier::SLOW_BLINK) {
queue!(w, SetAttribute(CAttribute::SlowBlink))?;
}
if added.contains(Modifier::RAPID_BLINK) {
queue!(w, SetAttribute(CAttribute::RapidBlink))?;
}
Ok(())
}
}
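
For instance, a hypothetical transition from bold to italic queues exactly the two attribute changes needed, rather than a full reset:

```
use ratatui::style::Modifier;

fn demo() -> std::io::Result<()> {
    let mut out: Vec<u8> = Vec::new();
    let diff = ModifierDiff {
        from: Modifier::BOLD,
        to: Modifier::ITALIC,
    };
    // Queues NormalIntensity (drop BOLD), then Italic (add ITALIC).
    diff.queue(&mut out)
}
```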


@@ -200,7 +200,7 @@ impl ExecCell {
if self.is_active() {
spinner(self.active_start_time())
} else {
"".bold()
"".into()
},
" ".into(),
if self.is_active() {


@@ -9,9 +9,8 @@ use crate::exec_command::strip_bash_lc_and_escape;
use crate::markdown::append_markdown;
use crate::render::line_utils::line_to_static;
use crate::render::line_utils::prefix_lines;
pub(crate) use crate::status::RateLimitSnapshotDisplay;
pub(crate) use crate::status::new_status_output;
pub(crate) use crate::status::rate_limit_snapshot_display;
use crate::style::user_message_style;
use crate::terminal_palette::default_bg;
use crate::text_formatting::format_and_truncate_tool_result;
use crate::ui_consts::LIVE_PREFIX_COLS;
use crate::wrapping::RtOptions;
@@ -19,6 +18,7 @@ use crate::wrapping::word_wrap_line;
use crate::wrapping::word_wrap_lines;
use base64::Engine;
use codex_core::config::Config;
use codex_core::config_types::McpServerTransportConfig;
use codex_core::config_types::ReasoningSummaryFormat;
use codex_core::plan_tool::PlanItemArg;
use codex_core::plan_tool::StepStatus;
@@ -97,17 +97,24 @@ impl HistoryCell for UserHistoryCell {
fn display_lines(&self, width: u16) -> Vec<Line<'static>> {
let mut lines: Vec<Line<'static>> = Vec::new();
// Wrap the content first, then prefix each wrapped line with the marker.
// Use ratatui-aware word wrapping and prefixing to avoid lifetime issues.
let wrap_width = width.saturating_sub(LIVE_PREFIX_COLS); // account for the ▌ prefix and trailing space
let wrapped = textwrap::wrap(
&self.message,
textwrap::Options::new(wrap_width as usize)
.wrap_algorithm(textwrap::WrapAlgorithm::FirstFit), // Match textarea wrap
let style = user_message_style(default_bg());
// Use our ratatui wrapping helpers for correct styling and lifetimes.
let wrapped = word_wrap_lines(
&self
.message
.lines()
.map(|l| Line::from(l).style(style))
.collect::<Vec<_>>(),
RtOptions::new(wrap_width as usize),
);
for line in wrapped {
lines.push(vec![" ".cyan().dim(), line.to_string().dim()].into());
}
lines.push(Line::from("").style(style));
lines.extend(prefix_lines(wrapped, "▌ ".bold().dim(), "▌ ".into()));
lines.push(Line::from("").style(style));
lines
}
@@ -139,7 +146,21 @@ impl HistoryCell for ReasoningSummaryCell {
let summary_lines = self
.content
.iter()
.map(|l| l.clone().dim().italic())
.map(|line| {
Line::from(
line.spans
.iter()
.map(|span| {
Span::styled(
span.content.clone().into_owned(),
span.style
.add_modifier(Modifier::ITALIC)
.add_modifier(Modifier::DIM),
)
})
.collect::<Vec<_>>(),
)
})
.collect::<Vec<_>>();
word_wrap_lines(
@@ -179,7 +200,7 @@ impl HistoryCell for AgentMessageCell {
&self.lines,
RtOptions::new(width as usize)
.initial_indent(if self.is_first_line {
"> ".into()
" ".into()
} else {
" ".into()
})
@@ -393,6 +414,11 @@ pub(crate) fn new_session_info(
"/model".into(),
" - choose what model and reasoning effort to use".dim(),
]),
Line::from(vec![
" ".into(),
"/review".into(),
" - review any changes and find issues".dim(),
]),
];
CompositeHistoryCell {
@@ -848,10 +874,19 @@ pub(crate) fn new_mcp_tools_output(
lines.push(vec![" • Server: ".into(), server.clone().into()].into());
if !cfg.command.is_empty() {
let cmd_display = format!("{} {}", cfg.command, cfg.args.join(" "));
lines.push(vec![" • Command: ".into(), cmd_display.into()].into());
match &cfg.transport {
McpServerTransportConfig::Stdio { command, args, .. } => {
let args_suffix = if args.is_empty() {
String::new()
} else {
format!(" {}", args.join(" "))
};
let cmd_display = format!("{command}{args_suffix}");
lines.push(vec![" • Command: ".into(), cmd_display.into()].into());
}
McpServerTransportConfig::StreamableHttp { url, .. } => {
lines.push(vec![" • URL: ".into(), url.clone().into()].into());
}
}
if names.is_empty() {
@@ -866,7 +901,7 @@ pub(crate) fn new_mcp_tools_output(
}
pub(crate) fn new_info_event(message: String, hint: Option<String>) -> PlainHistoryCell {
let mut line = vec!["> ".into(), message.into()];
let mut line = vec![" ".into(), message.into()];
if let Some(hint) = hint {
line.push(" ".into());
line.push(hint.dark_gray());
@@ -1060,6 +1095,40 @@ pub(crate) fn new_reasoning_summary_block(
Box::new(new_reasoning_block(full_reasoning_buffer, config))
}
#[derive(Debug)]
pub struct FinalMessageSeparator {
elapsed_seconds: Option<u64>,
}
impl FinalMessageSeparator {
pub(crate) fn new(elapsed_seconds: Option<u64>) -> Self {
Self { elapsed_seconds }
}
}
impl HistoryCell for FinalMessageSeparator {
fn display_lines(&self, width: u16) -> Vec<Line<'static>> {
let elapsed_seconds = self
.elapsed_seconds
.map(super::status_indicator_widget::fmt_elapsed_compact);
if let Some(elapsed_seconds) = elapsed_seconds {
let worked_for = format!("─ Worked for {elapsed_seconds}");
let worked_for_width = worked_for.width();
vec![
Line::from_iter([
worked_for,
"".repeat((width as usize).saturating_sub(worked_for_width)),
])
.dim(),
]
} else {
vec![Line::from_iter(["".repeat(width as usize).dim()])]
}
}
fn transcript_lines(&self) -> Vec<Line<'static>> {
vec![]
}
}
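
In other words, the separator renders as a single dimmed horizontal rule, with the elapsed time spliced into the left edge when a timer ran (`─ Worked for 0s ────…`, as in the snapshot above). A minimal check with an assumed width:

```
// Hypothetical: a 40-column separator with no elapsed time recorded.
let sep = FinalMessageSeparator::new(None);
let lines = sep.display_lines(40);
assert_eq!(lines.len(), 1);
assert_eq!(lines[0].width(), 40); // a full-width dimmed "─" rule
```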
fn format_mcp_invocation<'a>(invocation: McpInvocation) -> Line<'a> {
let args_str = invocation
.arguments


@@ -2,7 +2,6 @@ use std::fmt;
use std::io;
use std::io::Write;
use crate::tui;
use crate::wrapping::word_wrap_lines_borrowed;
use crossterm::Command;
use crossterm::cursor::MoveTo;
@@ -14,7 +13,10 @@ use crossterm::style::SetAttribute;
use crossterm::style::SetBackgroundColor;
use crossterm::style::SetColors;
use crossterm::style::SetForegroundColor;
use crossterm::terminal::Clear;
use crossterm::terminal::ClearType;
use ratatui::layout::Size;
use ratatui::prelude::Backend;
use ratatui::style::Color;
use ratatui::style::Modifier;
use ratatui::text::Line;
@@ -22,24 +24,16 @@ use ratatui::text::Span;
/// Insert `lines` above the viewport using the terminal's backend writer
/// (avoids direct stdout references).
pub(crate) fn insert_history_lines(terminal: &mut tui::Terminal, lines: Vec<Line>) {
let mut out = std::io::stdout();
insert_history_lines_to_writer(terminal, &mut out, lines);
}
/// Like `insert_history_lines`, but writes ANSI to the provided writer. This
/// is intended for testing where a capture buffer is used instead of stdout.
pub fn insert_history_lines_to_writer<B, W>(
terminal: &mut crate::custom_terminal::Terminal<B>,
writer: &mut W,
lines: Vec<Line>,
) where
B: ratatui::backend::Backend,
W: Write,
pub fn insert_history_lines<B>(terminal: &mut crate::custom_terminal::Terminal<B>, lines: Vec<Line>)
where
B: Backend + Write,
{
let screen_size = terminal.backend().size().unwrap_or(Size::new(0, 0));
let mut area = terminal.viewport_area;
let mut should_update_area = false;
let last_cursor_pos = terminal.last_known_cursor_pos;
let writer = terminal.backend_mut();
// Pre-wrap lines using word-aware wrapping so terminal scrollback sees the same
// formatting as the TUI. This avoids character-level hard wrapping by the terminal.
@@ -67,7 +61,7 @@ pub fn insert_history_lines_to_writer<B, W>(
let cursor_top = area.top().saturating_sub(1);
area.y += scroll_amount;
terminal.set_viewport_area(area);
should_update_area = true;
cursor_top
} else {
area.top().saturating_sub(1)
@@ -97,6 +91,21 @@ pub fn insert_history_lines_to_writer<B, W>(
for line in wrapped {
queue!(writer, Print("\r\n")).ok();
queue!(
writer,
SetColors(Colors::new(
line.style
.fg
.map(std::convert::Into::into)
.unwrap_or(CColor::Reset),
line.style
.bg
.map(std::convert::Into::into)
.unwrap_or(CColor::Reset)
))
)
.ok();
queue!(writer, Clear(ClearType::UntilNewLine)).ok();
// Merge line-level style into each span so that ANSI colors reflect
// line styles (e.g., blockquotes with green fg).
let merged_spans: Vec<Span> = line
@@ -113,14 +122,12 @@ pub fn insert_history_lines_to_writer<B, W>(
queue!(writer, ResetScrollRegion).ok();
// Restore the cursor position to where it was before we started.
queue!(
writer,
MoveTo(
terminal.last_known_cursor_pos.x,
terminal.last_known_cursor_pos.y
)
)
.ok();
queue!(writer, MoveTo(last_cursor_pos.x, last_cursor_pos.y)).ok();
let _ = writer;
if should_update_area {
terminal.set_viewport_area(area);
}
}
#[derive(Debug, Clone, PartialEq, Eq)]
@@ -275,9 +282,9 @@ where
mod tests {
use super::*;
use crate::markdown_render::render_markdown_text;
use crate::test_backend::VT100Backend;
use ratatui::layout::Rect;
use ratatui::style::Color;
use vt100::Parser;
#[test]
fn writes_bold_then_regular_spans() {
@@ -312,7 +319,7 @@ mod tests {
// Set up a small off-screen terminal
let width: u16 = 40;
let height: u16 = 10;
let backend = ratatui::backend::TestBackend::new(width, height);
let backend = VT100Backend::new(width, height);
let mut term = crate::custom_terminal::Terminal::with_options(backend).expect("terminal");
// Place viewport on the last line so history inserts scroll upward
let viewport = Rect::new(0, height - 1, width, 1);
@@ -321,17 +328,12 @@ mod tests {
// Build a blockquote-like line: apply line-level green style and prefix "> "
let mut line: Line<'static> = Line::from(vec!["> ".into(), "Hello world".into()]);
line = line.style(Color::Green);
let mut ansi: Vec<u8> = Vec::new();
insert_history_lines_to_writer(&mut term, &mut ansi, vec![line]);
// Parse ANSI using vt100 and assert at least one non-default fg color appears
let mut parser = Parser::new(height, width, 0);
parser.process(&ansi);
insert_history_lines(&mut term, vec![line]);
let mut saw_colored = false;
'outer: for row in 0..height {
for col in 0..width {
if let Some(cell) = parser.screen().cell(row, col)
if let Some(cell) = term.backend().vt100().screen().cell(row, col)
&& cell.has_contents()
&& cell.fgcolor() != vt100::Color::Default
{
@@ -351,7 +353,7 @@ mod tests {
// Force wrapping by using a narrow viewport width and a long blockquote line.
let width: u16 = 20;
let height: u16 = 8;
let backend = ratatui::backend::TestBackend::new(width, height);
let backend = VT100Backend::new(width, height);
let mut term = crate::custom_terminal::Terminal::with_options(backend).expect("terminal");
// Viewport is the last line so history goes directly above it.
let viewport = Rect::new(0, height - 1, width, 1);
@@ -364,13 +366,10 @@ mod tests {
]);
line = line.style(Color::Green);
let mut ansi: Vec<u8> = Vec::new();
insert_history_lines_to_writer(&mut term, &mut ansi, vec![line]);
insert_history_lines(&mut term, vec![line]);
// Parse and inspect the final screen buffer.
let mut parser = Parser::new(height, width, 0);
parser.process(&ansi);
let screen = parser.screen();
let screen = term.backend().vt100().screen();
// Collect rows that are non-empty; these should correspond to our wrapped lines.
let mut non_empty_rows: Vec<u16> = Vec::new();
@@ -418,7 +417,7 @@ mod tests {
fn vt100_colored_prefix_then_plain_text_resets_color() {
let width: u16 = 40;
let height: u16 = 6;
let backend = ratatui::backend::TestBackend::new(width, height);
let backend = VT100Backend::new(width, height);
let mut term = crate::custom_terminal::Terminal::with_options(backend).expect("terminal");
let viewport = Rect::new(0, height - 1, width, 1);
term.set_viewport_area(viewport);
@@ -429,12 +428,9 @@ mod tests {
Span::raw("Hello world"),
]);
let mut ansi: Vec<u8> = Vec::new();
insert_history_lines_to_writer(&mut term, &mut ansi, vec![line]);
insert_history_lines(&mut term, vec![line]);
let mut parser = Parser::new(height, width, 0);
parser.process(&ansi);
let screen = parser.screen();
let screen = term.backend().vt100().screen();
// Find the first non-empty row; verify first three cells are colored, following cells default.
'rows: for row in 0..height {
@@ -483,35 +479,17 @@ mod tests {
let width: u16 = 60;
let height: u16 = 12;
let backend = ratatui::backend::TestBackend::new(width, height);
let backend = VT100Backend::new(width, height);
let mut term = crate::custom_terminal::Terminal::with_options(backend).expect("terminal");
let viewport = ratatui::layout::Rect::new(0, height - 1, width, 1);
term.set_viewport_area(viewport);
let mut ansi: Vec<u8> = Vec::new();
insert_history_lines_to_writer(&mut term, &mut ansi, lines);
insert_history_lines(&mut term, lines);
let mut parser = Parser::new(height, width, 0);
parser.process(&ansi);
let screen = parser.screen();
let screen = term.backend().vt100().screen();
// Reconstruct screen rows as strings to locate the 3rd level line.
let mut rows: Vec<String> = Vec::with_capacity(height as usize);
for row in 0..height {
let mut s = String::with_capacity(width as usize);
for col in 0..width {
if let Some(cell) = screen.cell(row, col) {
if let Some(ch) = cell.contents().chars().next() {
s.push(ch);
} else {
s.push(' ');
}
} else {
s.push(' ');
}
}
rows.push(s.trim_end().to_string());
}
let rows: Vec<String> = screen.rows(0, width).collect();
let needle = "1. Third level (ordered)";
let row_idx = rows


@@ -1,5 +1,5 @@
use ratatui::style::Color;
use ratatui::style::Style;
use ratatui::style::Stylize;
use ratatui::text::Span;
use std::fmt::Display;
@@ -25,7 +25,7 @@ const SHIFT_PREFIX: &str = "⇧";
const SHIFT_PREFIX: &str = "Shift+";
fn key_hint_style() -> Style {
Style::default().fg(Color::Cyan)
Style::default().bold()
}
fn modifier_span(prefix: &str, key: impl Display) -> Span<'static> {


@@ -12,10 +12,8 @@ use codex_core::RolloutRecorder;
use codex_core::config::Config;
use codex_core::config::ConfigOverrides;
use codex_core::config::ConfigToml;
use codex_core::config::GPT_5_CODEX_MEDIUM_MODEL;
use codex_core::config::find_codex_home;
use codex_core::config::load_config_as_toml_with_cli_overrides;
use codex_core::config::persist_model_selection;
use codex_core::find_conversation_path_by_id_str;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::SandboxPolicy;
@@ -39,6 +37,7 @@ mod chatwidget;
mod citation_regex;
mod cli;
mod clipboard_paste;
mod color;
pub mod custom_terminal;
mod diff_render;
mod exec_cell;
@@ -53,7 +52,6 @@ pub mod live_wrap;
mod markdown;
mod markdown_render;
mod markdown_stream;
mod new_model_popup;
pub mod onboarding;
mod pager_overlay;
mod render;
@@ -64,23 +62,25 @@ mod slash_command;
mod status;
mod status_indicator_widget;
mod streaming;
mod style;
mod terminal_palette;
mod text_formatting;
mod tui;
mod ui_consts;
mod version;
mod wrapping;
#[cfg(test)]
pub mod test_backend;
#[cfg(not(debug_assertions))]
mod updates;
use crate::new_model_popup::ModelUpgradeDecision;
use crate::new_model_popup::run_model_upgrade_popup;
use crate::onboarding::TrustDirectorySelection;
use crate::onboarding::onboarding_screen::OnboardingScreenArgs;
use crate::onboarding::onboarding_screen::run_onboarding_app;
use crate::tui::Tui;
pub use cli::Cli;
use codex_core::internal_storage::InternalStorage;
// (tests access modules directly within the crate)
@@ -198,8 +198,6 @@ pub async fn run_main(
cli_profile_override,
)?;
let internal_storage = InternalStorage::load(&config.codex_home);
let log_dir = codex_core::config::log_dir(&config)?;
std::fs::create_dir_all(&log_dir)?;
// Open (or create) your log file, appending to it.
@@ -243,21 +241,14 @@ pub async fn run_main(
let _ = tracing_subscriber::registry().with(file_layer).try_init();
run_ratatui_app(
cli,
config,
internal_storage,
active_profile,
should_show_trust_screen,
)
.await
.map_err(|err| std::io::Error::other(err.to_string()))
run_ratatui_app(cli, config, active_profile, should_show_trust_screen)
.await
.map_err(|err| std::io::Error::other(err.to_string()))
}
async fn run_ratatui_app(
cli: Cli,
config: Config,
mut internal_storage: InternalStorage,
active_profile: Option<String>,
should_show_trust_screen: bool,
) -> color_eyre::Result<AppExitInfo> {
@@ -282,6 +273,8 @@ async fn run_ratatui_app(
// within the TUI scrollback. Building spans keeps styling consistent.
#[cfg(not(debug_assertions))]
if let Some(latest_version) = updates::get_upgrade_version(&config) {
use crate::history_cell::padded_emoji;
use crate::history_cell::with_border_with_inner_width;
use ratatui::style::Stylize as _;
use ratatui::text::Line;
@@ -289,16 +282,27 @@ async fn run_ratatui_app(
let exe = std::env::current_exe()?;
let managed_by_npm = std::env::var_os("CODEX_MANAGED_BY_NPM").is_some();
let mut lines: Vec<Line<'static>> = Vec::new();
lines.push(Line::from(vec![
"✨⬆️ Update available!".bold().cyan(),
" ".into(),
format!("{current_version} -> {latest_version}.").into(),
]));
let mut content_lines: Vec<Line<'static>> = vec![
Line::from(vec![
padded_emoji("").bold().cyan(),
"Update available!".bold().cyan(),
" ".into(),
format!("{current_version} -> {latest_version}.").bold(),
]),
Line::from(""),
Line::from("See full release notes:"),
Line::from(""),
Line::from(
"https://github.com/openai/codex/releases/latest"
.cyan()
.underlined(),
),
Line::from(""),
];
if managed_by_npm {
let npm_cmd = "npm install -g @openai/codex@latest";
lines.push(Line::from(vec![
content_lines.push(Line::from(vec![
"Run ".into(),
npm_cmd.cyan(),
" to update.".into(),
@@ -307,19 +311,22 @@ async fn run_ratatui_app(
&& (exe.starts_with("/opt/homebrew") || exe.starts_with("/usr/local"))
{
let brew_cmd = "brew upgrade codex";
lines.push(Line::from(vec![
content_lines.push(Line::from(vec![
"Run ".into(),
brew_cmd.cyan(),
" to update.".into(),
]));
} else {
lines.push(Line::from(vec![
content_lines.push(Line::from(vec![
"See ".into(),
"https://github.com/openai/codex/releases/latest".cyan(),
" for the latest releases and installation options.".into(),
"https://github.com/openai/codex".cyan().underlined(),
" for installation options.".into(),
]));
}
let viewport_width = tui.terminal.viewport_area.width as usize;
let inner_width = viewport_width.saturating_sub(4).max(1);
let mut lines = with_border_with_inner_width(content_lines, inner_width);
lines.push("".into());
tui.insert_history_lines(lines);
}
@@ -383,36 +390,6 @@ async fn run_ratatui_app(
resume_picker::ResumeSelection::StartFresh
};
if should_show_model_rollout_prompt(
&cli,
&config,
active_profile.as_deref(),
internal_storage.gpt_5_codex_model_prompt_seen,
) {
internal_storage.gpt_5_codex_model_prompt_seen = true;
if let Err(e) = internal_storage.persist().await {
error!("Failed to persist internal storage: {e:?}");
}
let upgrade_decision = run_model_upgrade_popup(&mut tui).await?;
let switch_to_new_model = upgrade_decision == ModelUpgradeDecision::Switch;
if switch_to_new_model {
config.model = GPT_5_CODEX_MEDIUM_MODEL.to_owned();
config.model_reasoning_effort = None;
if let Err(e) = persist_model_selection(
&config.codex_home,
active_profile.as_deref(),
&config.model,
config.model_reasoning_effort,
)
.await
{
error!("Failed to persist model selection: {e:?}");
}
}
}
let Cli { prompt, images, .. } = cli;
let app_result = App::run(
@@ -526,140 +503,3 @@ fn should_show_login_screen(login_status: LoginStatus, config: &Config) -> bool
login_status == LoginStatus::NotAuthenticated
}
fn should_show_model_rollout_prompt(
cli: &Cli,
config: &Config,
active_profile: Option<&str>,
gpt_5_codex_model_prompt_seen: bool,
) -> bool {
let login_status = get_login_status(config);
active_profile.is_none()
&& cli.model.is_none()
&& !gpt_5_codex_model_prompt_seen
&& config.model_provider.requires_openai_auth
&& matches!(login_status, LoginStatus::AuthMode(AuthMode::ChatGPT))
&& !cli.oss
}
#[cfg(test)]
mod tests {
use super::*;
use clap::Parser;
use codex_core::auth::AuthDotJson;
use codex_core::auth::get_auth_file;
use codex_core::auth::login_with_api_key;
use codex_core::auth::write_auth_json;
use codex_core::token_data::IdTokenInfo;
use codex_core::token_data::TokenData;
use std::sync::atomic::AtomicUsize;
use std::sync::atomic::Ordering;
fn get_next_codex_home() -> PathBuf {
static NEXT_CODEX_HOME_ID: AtomicUsize = AtomicUsize::new(0);
let mut codex_home = std::env::temp_dir();
let unique_suffix = format!(
"codex_tui_test_{}_{}",
std::process::id(),
NEXT_CODEX_HOME_ID.fetch_add(1, Ordering::Relaxed)
);
codex_home.push(unique_suffix);
codex_home
}
fn make_config() -> Config {
// Create a unique CODEX_HOME per test to isolate auth.json writes.
let codex_home = get_next_codex_home();
std::fs::create_dir_all(&codex_home).expect("create unique CODEX_HOME");
Config::load_from_base_config_with_overrides(
ConfigToml::default(),
ConfigOverrides::default(),
codex_home,
)
.expect("load default config")
}
/// Test helper to write an `auth.json` with the requested auth mode into the
/// provided CODEX_HOME directory. This ensures `get_login_status()` reads the
/// intended mode deterministically.
fn set_auth_method(codex_home: &std::path::Path, mode: AuthMode) {
match mode {
AuthMode::ApiKey => {
login_with_api_key(codex_home, "sk-test-key").expect("write api key auth.json");
}
AuthMode::ChatGPT => {
// Minimal valid JWT payload: header.payload.signature (all base64url, no padding)
const FAKE_JWT: &str = "eyJhbGciOiJub25lIiwidHlwIjoiSldUIn0.e30.c2ln"; // {"alg":"none","typ":"JWT"}.{}."sig"
let mut id_info = IdTokenInfo::default();
id_info.raw_jwt = FAKE_JWT.to_string();
let auth = AuthDotJson {
openai_api_key: None,
tokens: Some(TokenData {
id_token: id_info,
access_token: "access-token".to_string(),
refresh_token: "refresh-token".to_string(),
account_id: None,
}),
last_refresh: None,
};
let file = get_auth_file(codex_home);
write_auth_json(&file, &auth).expect("write chatgpt auth.json");
}
}
}
#[test]
fn shows_login_when_not_authenticated() {
let cfg = make_config();
assert!(should_show_login_screen(
LoginStatus::NotAuthenticated,
&cfg
));
}
#[test]
fn shows_model_rollout_prompt_for_default_model() {
let cli = Cli::parse_from(["codex"]);
let cfg = make_config();
set_auth_method(&cfg.codex_home, AuthMode::ChatGPT);
assert!(should_show_model_rollout_prompt(&cli, &cfg, None, false,));
}
#[test]
fn hides_model_rollout_prompt_when_api_auth_mode() {
let cli = Cli::parse_from(["codex"]);
let cfg = make_config();
set_auth_method(&cfg.codex_home, AuthMode::ApiKey);
assert!(!should_show_model_rollout_prompt(&cli, &cfg, None, false,));
}
#[test]
fn hides_model_rollout_prompt_when_marked_seen() {
let cli = Cli::parse_from(["codex"]);
let cfg = make_config();
set_auth_method(&cfg.codex_home, AuthMode::ChatGPT);
assert!(!should_show_model_rollout_prompt(&cli, &cfg, None, true,));
}
#[test]
fn hides_model_rollout_prompt_when_cli_overrides_model() {
let cli = Cli::parse_from(["codex", "--model", "gpt-4.1"]);
let cfg = make_config();
set_auth_method(&cfg.codex_home, AuthMode::ChatGPT);
assert!(!should_show_model_rollout_prompt(&cli, &cfg, None, false,));
}
#[test]
fn hides_model_rollout_prompt_when_profile_active() {
let cli = Cli::parse_from(["codex"]);
let cfg = make_config();
set_auth_method(&cfg.codex_home, AuthMode::ChatGPT);
assert!(!should_show_model_rollout_prompt(
&cli,
&cfg,
Some("gpt5"),
false,
));
}
}


@@ -1,179 +0,0 @@
use crate::ascii_animation::AsciiAnimation;
use crate::tui::FrameRequester;
use crate::tui::Tui;
use crate::tui::TuiEvent;
use color_eyre::eyre::Result;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;
use ratatui::prelude::Widget;
use ratatui::style::Stylize;
use ratatui::text::Line;
use ratatui::widgets::Clear;
use ratatui::widgets::Paragraph;
use ratatui::widgets::WidgetRef;
use ratatui::widgets::Wrap;
use tokio_stream::StreamExt;
const MIN_ANIMATION_HEIGHT: u16 = 24;
const MIN_ANIMATION_WIDTH: u16 = 60;
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub(crate) enum ModelUpgradeDecision {
Switch,
KeepCurrent,
}
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum ModelUpgradeOption {
TryNewModel,
KeepCurrent,
}
struct ModelUpgradePopup {
highlighted: ModelUpgradeOption,
decision: Option<ModelUpgradeDecision>,
animation: AsciiAnimation,
}
impl ModelUpgradePopup {
fn new(request_frame: FrameRequester) -> Self {
Self {
highlighted: ModelUpgradeOption::TryNewModel,
decision: None,
animation: AsciiAnimation::new(request_frame),
}
}
fn handle_key_event(&mut self, key_event: KeyEvent) {
match key_event.code {
KeyCode::Up | KeyCode::Char('k') => self.highlight(ModelUpgradeOption::TryNewModel),
KeyCode::Down | KeyCode::Char('j') => self.highlight(ModelUpgradeOption::KeepCurrent),
KeyCode::Char('1') => self.select(ModelUpgradeOption::TryNewModel),
KeyCode::Char('2') => self.select(ModelUpgradeOption::KeepCurrent),
KeyCode::Enter => self.select(self.highlighted),
KeyCode::Esc => self.select(ModelUpgradeOption::KeepCurrent),
KeyCode::Char('.') => {
if key_event.modifiers.contains(KeyModifiers::CONTROL) {
let _ = self.animation.pick_random_variant();
}
}
_ => {}
}
}
fn highlight(&mut self, option: ModelUpgradeOption) {
if self.highlighted != option {
self.highlighted = option;
self.animation.request_frame();
}
}
fn select(&mut self, option: ModelUpgradeOption) {
self.decision = Some(option.into());
self.animation.request_frame();
}
}
impl From<ModelUpgradeOption> for ModelUpgradeDecision {
fn from(option: ModelUpgradeOption) -> Self {
match option {
ModelUpgradeOption::TryNewModel => ModelUpgradeDecision::Switch,
ModelUpgradeOption::KeepCurrent => ModelUpgradeDecision::KeepCurrent,
}
}
}
impl WidgetRef for &ModelUpgradePopup {
fn render_ref(&self, area: Rect, buf: &mut Buffer) {
Clear.render(area, buf);
self.animation.schedule_next_frame();
// Skip the animation entirely when the viewport is too small so we don't clip frames.
let show_animation =
area.height >= MIN_ANIMATION_HEIGHT && area.width >= MIN_ANIMATION_WIDTH;
let mut lines: Vec<Line> = Vec::new();
if show_animation {
let frame = self.animation.current_frame();
lines.extend(frame.lines().map(Into::into));
// Spacer between animation and text content.
lines.push("".into());
}
lines.push(Line::from(vec![
" ".into(),
"Introducing GPT-5-Codex".bold(),
]));
lines.push("".into());
lines.push(
" GPT-5-Codex works faster through easy tasks and harder on complex tasks,".into(),
);
lines.push(" improves on code quality, and is more steerable with AGENTS.md.".into());
lines.push("".into());
let create_option =
|index: usize, option: ModelUpgradeOption, text: &str| -> Line<'static> {
if self.highlighted == option {
Line::from(vec![
format!("> {}. ", index + 1).cyan(),
text.to_owned().cyan(),
])
} else {
format!(" {}. {text}", index + 1).into()
}
};
lines.push(create_option(
0,
ModelUpgradeOption::TryNewModel,
"Try the new GPT-5-Codex model",
));
lines.push("".into());
lines.push(create_option(
1,
ModelUpgradeOption::KeepCurrent,
"Continue using current model",
));
lines.push("".into());
lines.push(
" Press Enter to confirm or Esc to keep your current model"
.dim()
.into(),
);
Paragraph::new(lines)
.wrap(Wrap { trim: false })
.render(area, buf);
}
}
pub(crate) async fn run_model_upgrade_popup(tui: &mut Tui) -> Result<ModelUpgradeDecision> {
let mut popup = ModelUpgradePopup::new(tui.frame_requester());
tui.draw(u16::MAX, |frame| {
frame.render_widget_ref(&popup, frame.area());
})?;
let events = tui.event_stream();
tokio::pin!(events);
while popup.decision.is_none() {
if let Some(event) = events.next().await {
match event {
TuiEvent::Key(key_event) => popup.handle_key_event(key_event),
TuiEvent::Draw => {
let _ = tui.draw(u16::MAX, |frame| {
frame.render_widget_ref(&popup, frame.area());
});
}
_ => {}
}
} else {
break;
}
}
Ok(popup.decision.unwrap_or(ModelUpgradeDecision::KeepCurrent))
}
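Note how the deleted popup resolves Esc to `KeepCurrent` rather than leaving the decision unset, so dismissing the dialog can never switch models. A minimal usage sketch of that behavior, assuming a `FrameRequester` handle named `requester` is in scope:

```
// Sketch: Esc routes through select(ModelUpgradeOption::KeepCurrent).
let mut popup = ModelUpgradePopup::new(requester);
popup.handle_key_event(KeyEvent::new(KeyCode::Esc, KeyModifiers::NONE));
assert_eq!(popup.decision, Some(ModelUpgradeDecision::KeepCurrent));
```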

View File

@@ -53,7 +53,7 @@ pub fn prefix_lines(
subsequent_prefix.clone()
});
spans.extend(l.spans);
Line::from(spans)
Line::from(spans).style(l.style)
})
.collect()
}
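This one-line fix matters because ratatui's `Line` carries its own `Style` in addition to per-span styles, and rebuilding with `Line::from(spans)` silently drops it. A minimal sketch of the difference, assuming a recent ratatui where `Line` exposes public `style` and `spans` fields:

```
use ratatui::style::Stylize;
use ratatui::text::Line;

let styled: Line = Line::from("body").dim();    // style set on the Line itself
let dropped = Line::from(styled.spans.clone()); // line-level style is lost
assert_ne!(dropped.style, styled.style);
let kept = Line::from(styled.spans.clone()).style(styled.style); // the fix
assert_eq!(kept.style, styled.style);
```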

View File

@@ -7,6 +7,12 @@ use ratatui::style::Modifier;
use ratatui::style::Style;
use ratatui::text::Span;
use crate::color::blend;
use crate::terminal_palette::default_fg;
use crate::terminal_palette::terminal_palette;
const FALLBACK_DARK_GRAY: (u8, u8, u8) = (103, 103, 103);
static PROCESS_START: OnceLock<Instant> = OnceLock::new();
fn elapsed_since_start() -> Duration {
@@ -32,6 +38,8 @@ pub(crate) fn shimmer_spans(text: &str) -> Vec<Span<'static>> {
let band_half_width = 3.0;
let mut spans: Vec<Span<'static>> = Vec::with_capacity(chars.len());
let default_fg = default_fg();
let palette_dark_gray = terminal_palette().map(|palette| palette[8]);
for (i, ch) in chars.iter().enumerate() {
let i_pos = i as isize + padding as isize;
let pos = pos as isize;
@@ -43,31 +51,33 @@ pub(crate) fn shimmer_spans(text: &str) -> Vec<Span<'static>> {
} else {
0.0
};
let brightness = 0.4 + 0.6 * t;
let level = (brightness * 255.0).clamp(0.0, 255.0) as u8;
let style = if has_true_color {
let base = palette_dark_gray
.or(default_fg)
.unwrap_or(FALLBACK_DARK_GRAY);
let highlight = t.clamp(0.0, 1.0);
let (r, g, b) = blend((255, 255, 255), base, highlight);
// Allow custom RGB colors, as the implementation is thoughtfully
// adjusting the level of the default foreground color.
#[allow(clippy::disallowed_methods)]
{
Style::default()
.fg(Color::Rgb(level, level, level))
.fg(Color::Rgb(r, g, b))
.add_modifier(Modifier::BOLD)
}
} else {
color_for_level(level)
color_for_level(t)
};
spans.push(Span::styled(ch.to_string(), style));
}
spans
}
fn color_for_level(level: u8) -> Style {
// Tune thresholds so the edges of the shimmer band appear dim
// in fallback mode (no true color support).
if level < 160 {
fn color_for_level(intensity: f32) -> Style {
// Tune fallback styling so the shimmer band reads even without RGB support.
if intensity < 0.2 {
Style::default().add_modifier(Modifier::DIM)
} else if level < 224 {
} else if intensity < 0.6 {
Style::default()
} else {
Style::default().add_modifier(Modifier::BOLD)
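With this hunk the true-color branch interpolates from the terminal's palette dark gray (or its default foreground) toward white, rather than hard-coding gray levels. `blend` comes from `crate::color` and is not shown here; a plausible stand-in, assuming a straight linear interpolation where `t = 1.0` yields the first color:

```
// Hypothetical stand-in for crate::color::blend -- the real one may differ.
// blend((255, 255, 255), base, 1.0) is white (the band center);
// blend((255, 255, 255), base, 0.0) is `base` (outside the band).
fn blend(a: (u8, u8, u8), b: (u8, u8, u8), t: f32) -> (u8, u8, u8) {
    let lerp = |x: u8, y: u8| (x as f32 * t + y as f32 * (1.0 - t)).round() as u8;
    (lerp(a.0, b.0), lerp(a.1, b.1), lerp(a.2, b.2))
}
```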

View File

@@ -2,7 +2,7 @@
source: tui/src/history_cell.rs
expression: rendered
---
one two
three four
five six
seven
one two
three four
five six
seven

View File

@@ -1,19 +0,0 @@
---
source: tui/src/status.rs
expression: sanitized
---
/status
╭──────────────────────────────────────────────────────────────────╮
│ >_ OpenAI Codex (v0.0.0) │
│ │
│ Model : gpt-5-codex (reasoning high, summaries detailed) │
│ Directory : /workspace/tests │
│ Approval : on-request │
│ Sandbox : workspace-write │
│ Agents.md : <none> │
│ │
│ Token Usage : 1.9K total (1K input + 900 output) │
│ 5h limit : [███████████████░░░░░] 72% used · resets 03:14 │
│ Weekly limit : [█████████░░░░░░░░░░░] 45% used · resets 03:24 │
╰──────────────────────────────────────────────────────────────────╯

View File

@@ -1,19 +0,0 @@
---
source: tui/src/status.rs
expression: sanitized
---
/status
╭────────────────────────────────────────────╮
│ >_ OpenAI Codex (v0.0.0) │
│ │
│ Model : gpt-5-codex (reasoning high │
│ Directory : /workspace/tests │
│ Approval : on-request │
│ Sandbox : read-only │
│ Agents.md : <none> │
│ │
│ Token Usage : 1.9K total (1K input + 900 │
│ 5h limit : [███████████████░░░░░] 72% │
│ · resets 03:14 │
╰────────────────────────────────────────────╯

View File

@@ -23,7 +23,6 @@ use super::helpers::compose_agents_summary;
use super::helpers::compose_model_display;
use super::helpers::format_directory_display;
use super::helpers::format_tokens_compact;
use super::rate_limits::RESET_BULLET;
use super::rate_limits::RateLimitSnapshotDisplay;
use super::rate_limits::StatusRateLimitData;
use super::rate_limits::compose_rate_limit_data;
@@ -149,8 +148,7 @@ impl StatusHistoryCell {
let base_line = Line::from(base_spans.clone());
if let Some(resets_at) = row.resets_at.as_ref() {
let resets_span =
Span::from(format!("{RESET_BULLET} resets {resets_at}")).dim();
let resets_span = Span::from(format!("(resets {resets_at})")).dim();
let mut inline_spans = base_spans.clone();
inline_spans.push(Span::from(" ").dim());
inline_spans.push(resets_span.clone());
@@ -171,7 +169,10 @@ impl StatusHistoryCell {
lines
}
StatusRateLimitData::Missing => {
vec![formatter.line("Limits", vec![Span::from("data not available yet").dim()])]
vec![formatter.line(
"Limits",
vec![Span::from("send a message to load usage data").dim()],
)]
}
}
}
@@ -233,7 +234,7 @@ impl HistoryCell for StatusHistoryCell {
if self.session_id.is_some() {
push_label(&mut labels, &mut seen, "Session");
}
push_label(&mut labels, &mut seen, "Token Usage");
push_label(&mut labels, &mut seen, "Token usage");
self.collect_rate_limit_labels(&mut seen, &mut labels);
let formatter = FieldFormatter::from_labels(labels.iter().map(String::as_str));
@@ -263,7 +264,7 @@ impl HistoryCell for StatusHistoryCell {
}
lines.push(Line::from(Vec::<Span<'static>>::new()));
lines.push(formatter.line("Token Usage", self.token_usage_spans()));
lines.push(formatter.line("Token usage", self.token_usage_spans()));
lines.extend(self.rate_limit_lines(available_inner_width, &formatter));
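The hunks above collect every label ("Session", "Token usage", the rate-limit labels) before constructing `FieldFormatter::from_labels`, which implies the formatter pads labels to a common width so values land in one column, as the snapshots below show. A hypothetical sketch of that alignment (the real `FieldFormatter` API is not visible in this diff):

```
// Hypothetical alignment sketch, not the actual FieldFormatter.
fn main() {
    let labels = ["Model", "Directory", "Token usage"];
    // +1 accounts for the trailing ':' on each label.
    let width = labels.iter().map(|l| l.len() + 1).max().unwrap_or(0);
    for label in labels {
        println!("{:<width$} <value>", format!("{label}:"));
    }
}
```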

View File

@@ -162,7 +162,7 @@ pub(crate) fn format_reset_timestamp(dt: DateTime<Local>, captured_at: DateTime<
if dt.date_naive() == captured_at.date_naive() {
time
} else {
format!("{} ({time})", dt.format("%-d %b"))
format!("{time} on {}", dt.format("%-d %b"))
}
}
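After this change the reset time always leads: same-day resets stay as a bare time ("03:14"), while cross-day resets render as "07:08 on 7 May" instead of the old "7 May (07:08)". A self-contained sketch of the branch, assuming `time` is formatted as `%H:%M` upstream (the diff does not show that part):

```
use chrono::{DateTime, Local};

fn format_reset_timestamp(dt: DateTime<Local>, captured_at: DateTime<Local>) -> String {
    let time = dt.format("%H:%M").to_string(); // assumption: not shown in the hunk
    if dt.date_naive() == captured_at.date_naive() {
        time // e.g. "03:14"
    } else {
        format!("{time} on {}", dt.format("%-d %b")) // e.g. "07:08 on 7 May"
    }
}
```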

View File

@@ -11,7 +11,6 @@ use std::convert::TryFrom;
const STATUS_LIMIT_BAR_SEGMENTS: usize = 20;
const STATUS_LIMIT_BAR_FILLED: &str = "█";
const STATUS_LIMIT_BAR_EMPTY: &str = "░";
pub(crate) const RESET_BULLET: &str = "·";
#[derive(Debug, Clone)]
pub(crate) struct StatusRateLimitRow {
@@ -105,7 +104,7 @@ pub(crate) fn compose_rate_limit_data(
}
if rows.is_empty() {
StatusRateLimitData::Missing
StatusRateLimitData::Available(vec![])
} else {
StatusRateLimitData::Available(rows)
}
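After this hunk an empty-but-present snapshot becomes `Available(vec![])` rather than `Missing`, reserving `Missing` for "nothing fetched yet", which the status card now renders as "send a message to load usage data". A sketch of the three states a renderer can distinguish (the match arms are illustrative, not code from this diff):

```
// Illustrative only: the three cases after the compose_rate_limit_data change.
match data {
    StatusRateLimitData::Missing => {
        // no usage data fetched yet -> "send a message to load usage data"
    }
    StatusRateLimitData::Available(rows) if rows.is_empty() => {
        // fetched, but no limit windows reported -> render no bar rows
    }
    StatusRateLimitData::Available(_rows) => {
        // render one progress bar per StatusRateLimitRow
    }
}
```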

View File

@@ -4,15 +4,15 @@ expression: sanitized
---
/status
╭──────────────────────────────────────────────────────────────────────────╮
│ >_ OpenAI Codex (v0.0.0) │
│ │
│ Model: gpt-5-codex (reasoning none, summaries auto) │
│ Directory: [[workspace]] │
│ Approval: on-request │
│ Sandbox: read-only │
│ Agents.md: <none> │
│ │
│ Token Usage: 1.2K total (800 input + 400 output) │
│ Monthly limit: [██░░░░░░░░░░░░░░░░░░] 12% used · resets 7 May (07:08) │
╰──────────────────────────────────────────────────────────────────────────╯
╭──────────────────────────────────────────────────────────────────────────╮
│ >_ OpenAI Codex (v0.0.0) │
│ Model: gpt-5-codex (reasoning none, summaries auto) │
│ Directory: [[workspace]] │
│ Approval: on-request │
│ Sandbox: read-only │
│ Agents.md: <none> │
│ Token usage: 1.2K total (800 input + 400 output) │
│ Monthly limit: [██░░░░░░░░░░░░░░░░░░] 12% used (resets 07:08 on 7 May) │
╰──────────────────────────────────────────────────────────────────────────╯

View File

@@ -13,7 +13,7 @@ expression: sanitized
│ Sandbox: workspace-write │
│ Agents.md: <none> │
│ │
│ Token Usage: 1.9K total (1K input + 900 output) │
│ 5h limit: [███████████████░░░░░] 72% used · resets 03:14 │
│ Weekly limit: [█████████░░░░░░░░░░░] 45% used · resets 03:24 │
│ Token usage: 1.9K total (1K input + 900 output) │
│ 5h limit: [███████████████░░░░░] 72% used (resets 03:14) │
│ Weekly limit: [█████████░░░░░░░░░░░] 45% used (resets 03:24) │
╰───────────────────────────────────────────────────────────────────╯

View File

@@ -0,0 +1,18 @@
---
source: tui/src/status/tests.rs
expression: sanitized
---
/status
╭──────────────────────────────────────────────────────────────╮
│ >_ OpenAI Codex (v0.0.0) │
│ │
│ Model: gpt-5-codex (reasoning none, summaries auto) │
│ Directory: [[workspace]] │
│ Approval: on-request │
│ Sandbox: read-only │
│ Agents.md: <none> │
│ │
│ Token usage: 750 total (500 input + 250 output) │
│ Limits: data not available yet │
╰──────────────────────────────────────────────────────────────╯

View File

@@ -13,6 +13,6 @@ expression: sanitized
│ Sandbox: read-only │
│ Agents.md: <none> │
│ │
│ Token Usage: 750 total (500 input + 250 output) │
│ Limits: data not available yet
│ Token usage: 750 total (500 input + 250 output) │
│ Limits: send a message to load usage data
╰──────────────────────────────────────────────────────────────╯

View File

@@ -13,7 +13,7 @@ expression: sanitized
│ Sandbox: read-only │
│ Agents.md: <none> │
│ │
│ Token Usage: 1.9K total (1K input + 90 │
│ Token usage: 1.9K total (1K input + 90 │
│ 5h limit: [███████████████░░░░░] 72% │
· resets 03:14 │
(resets 03:14) │
╰────────────────────────────────────────────╯

Some files were not shown because too many files have changed in this diff.