Compare commits

..

5 Commits

Author SHA1 Message Date
Felipe Coury
1c6e06381f feat(tui): add guided update remediation flows 2026-05-09 16:55:03 -03:00
Felipe Coury
f12a36c166 fix(tui): annotate update prompt literal arguments 2026-05-09 16:55:02 -03:00
Felipe Coury
d0a905a557 fix(tui): avoid update loops for mismatched npm installs 2026-05-09 16:55:02 -03:00
Michael Bolin
0c70698e24 tests: cover sandbox link write behavior (#21819)
## Why

[PR #1705](https://github.com/openai/codex/pull/1705) moved
`apply_patch` execution under the configured sandbox and called out the
need for integration coverage. We already covered textual `../` escapes,
but had no coverage for links that live inside the writable workspace
while pointing at, or aliasing, files outside it.

This PR locks in the current sandbox boundary without changing
production write semantics. Symlink escapes into a read-only outside
root should fail and leave the outside file unchanged. Existing hard
links are characterized separately: if a user-created hard link already
exists inside the writable root, sandboxed writes preserve normal
hard-link semantics rather than replacing the link and silently breaking
that relationship.
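
The preserved semantics can be sketched with plain filesystem calls (a hedged Python illustration, not the actual test harness):

```python
import os
import tempfile

root = tempfile.mkdtemp()
outside = os.path.join(root, "victim.txt")
link = os.path.join(root, "hard-link.txt")

with open(outside, "w") as f:
    f.write("original outside content\n")
os.link(outside, link)  # user-created hard link inside the writable root

# An in-place write through the link keeps normal hard-link semantics:
# both directory entries still name the same inode, so the outside path
# observes the update too.
with open(link, "w") as f:
    f.write("updated through existing hard link\n")

shared = open(outside).read() == open(link).read()
same_inode = os.stat(outside).st_ino == os.stat(link).st_ino
# Had the write replaced the link (temp file + rename), the two paths
# would diverge and same_inode would be False.
print(shared, same_inode)  # True True
```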

## What Changed

- Added
`apply_patch_cli_does_not_write_through_symlink_escape_outside_workspace`
to verify `apply_patch` cannot update a symlink that targets a file
outside the writable workspace.
- Added `apply_patch_cli_preserves_existing_hard_link_outside_workspace`
to verify `apply_patch` intentionally writes through an existing hard
link and does not unlink or replace it.
- Added `file_system_sandboxed_write_preserves_existing_hard_link` to
verify sandboxed `fs/writeFile` preserves an existing hard link and
writes the shared inode.
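
For intuition, the boundary these tests pin down can be sketched as a resolve-then-contain check (an assumed model in Python; the real sandbox implementation may differ):

```python
import os
import tempfile

def sandboxed_write_allowed(path, writable_root):
    # Resolve symlinks before the containment check: a symlink inside
    # the workspace that targets a file outside it is an escape. Hard
    # links are invisible to realpath, so an existing hard link passes,
    # matching the characterized write-through behavior.
    resolved = os.path.realpath(path)
    root = os.path.realpath(writable_root)
    return os.path.commonpath([resolved, root]) == root

base = tempfile.mkdtemp()
work = os.path.join(base, "work")
outside_dir = os.path.join(base, "outside")
os.makedirs(work)
os.makedirs(outside_dir)
victim = os.path.join(outside_dir, "victim.txt")
with open(victim, "w") as f:
    f.write("original outside content\n")

soft = os.path.join(work, "soft-link.txt")
os.symlink(victim, soft)
hard = os.path.join(work, "hard-link.txt")
os.link(victim, hard)

print(sandboxed_write_allowed(soft, work))  # False: symlink escape
print(sandboxed_write_allowed(hard, work))  # True: shared inode stays
```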

## Testing

- `cargo test -p codex-exec-server file_system_sandboxed_write`
- `cargo test -p codex-core
apply_patch_cli_does_not_write_through_symlink_escape_outside_workspace`
- `cargo test -p codex-core
apply_patch_cli_preserves_existing_hard_link_outside_workspace`
- `just fix -p codex-exec-server -p codex-core`
- `just fix -p codex-core`



---
[//]: # (BEGIN SAPLING FOOTER)
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/21819).
* #21845
* __->__ #21819
2026-05-09 08:28:15 -07:00
Ahmed Ibrahim
fca81eeb5b [codex] Lowercase TUI service tier commands (#21906)
## Why

Service-tier slash commands are built from model-catalog metadata. If
the catalog returns a name like `Fast`, the TUI currently exposes
`/Fast` and exact dispatch expects that casing, which is inconsistent
with the lowercase command style used elsewhere.

## What

- Lowercase service-tier command names when converting catalog tiers
into `ServiceTierCommand` values.
- Add regression coverage that seeds a catalog tier named `Fast` and
expects the generated command to be `fast`.
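
The conversion amounts to lowercasing at construction time; a hedged Python sketch with hypothetical tier data (the real change is in the Rust `ServiceTierCommand` mapping):

```python
from dataclasses import dataclass

@dataclass
class ServiceTierCommand:
    id: str
    name: str
    description: str

def tier_commands(catalog_tiers):
    # Lowercase the catalog's display name so slash dispatch is
    # case-consistent: a tier named "Fast" yields the command "/fast".
    return [
        ServiceTierCommand(
            id=tier["id"],
            name=tier["name"].lower(),
            description=tier["description"],
        )
        for tier in catalog_tiers
    ]

commands = tier_commands(
    [{"id": "fast", "name": "Fast", "description": "Lower latency"}]
)
print(commands[0].name)  # fast
```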

## Testing

Not run locally per repo instructions; PR CI should exercise the new
`service_tier_commands_lowercase_catalog_names` test.
2026-05-09 14:29:12 +03:00
37 changed files with 1677 additions and 1133 deletions

View File

@@ -220,48 +220,6 @@ jobs:
"$dest/${binary}-${{ matrix.target }}.exe"
done
- name: Build Python runtime wheel
shell: bash
run: |
set -euo pipefail
case "${{ matrix.target }}" in
aarch64-pc-windows-msvc)
platform_tag="win_arm64"
;;
x86_64-pc-windows-msvc)
platform_tag="win_amd64"
;;
*)
echo "No Python runtime wheel platform tag for ${{ matrix.target }}"
exit 1
;;
esac
python -m venv "${RUNNER_TEMP}/python-runtime-build-venv"
"${RUNNER_TEMP}/python-runtime-build-venv/Scripts/python.exe" -m pip install build
stage_dir="${RUNNER_TEMP}/openai-codex-cli-bin-${{ matrix.target }}"
wheel_dir="${GITHUB_WORKSPACE}/python-runtime-dist/${{ matrix.target }}"
# Keep the helpers next to codex.exe in the runtime wheel so Windows
# sandbox/elevation lookup matches the standalone release zip.
python "${GITHUB_WORKSPACE}/sdk/python/scripts/update_sdk_artifacts.py" \
stage-runtime \
"$stage_dir" \
"${GITHUB_WORKSPACE}/codex-rs/target/${{ matrix.target }}/release/codex.exe" \
--codex-version "${GITHUB_REF_NAME}" \
--platform-tag "$platform_tag" \
--resource-binary "${GITHUB_WORKSPACE}/codex-rs/target/${{ matrix.target }}/release/codex-command-runner.exe" \
--resource-binary "${GITHUB_WORKSPACE}/codex-rs/target/${{ matrix.target }}/release/codex-windows-sandbox-setup.exe"
"${RUNNER_TEMP}/python-runtime-build-venv/Scripts/python.exe" -m build --wheel --outdir "$wheel_dir" "$stage_dir"
- name: Upload Python runtime wheel
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: python-runtime-wheel-${{ matrix.target }}
path: python-runtime-dist/${{ matrix.target }}/*.whl
if-no-files-found: error
- name: Install DotSlash
uses: facebook/install-dotslash@1e4e7b3e07eaca387acb98f1d4720e0bee8dbb6a # v2

View File

@@ -399,65 +399,6 @@ jobs:
cp target/${{ matrix.target }}/release/codex-${{ matrix.target }}.dmg "$dest/codex-${{ matrix.target }}.dmg"
fi
- name: Build Python runtime wheel
if: ${{ matrix.bundle == 'primary' }}
shell: bash
run: |
set -euo pipefail
case "${{ matrix.target }}" in
aarch64-apple-darwin)
platform_tag="macosx_11_0_arm64"
;;
x86_64-apple-darwin)
platform_tag="macosx_10_9_x86_64"
;;
aarch64-unknown-linux-musl)
platform_tag="musllinux_1_1_aarch64"
;;
x86_64-unknown-linux-musl)
platform_tag="musllinux_1_1_x86_64"
;;
*)
echo "No Python runtime wheel platform tag for ${{ matrix.target }}"
exit 1
;;
esac
python3 -m venv "${RUNNER_TEMP}/python-runtime-build-venv"
# Do not install into the runner's system Python; macOS runners mark
# the Homebrew Python as externally managed under PEP 668.
"${RUNNER_TEMP}/python-runtime-build-venv/bin/python" -m pip install build
stage_dir="${RUNNER_TEMP}/openai-codex-cli-bin-${{ matrix.target }}"
wheel_dir="${GITHUB_WORKSPACE}/python-runtime-dist/${{ matrix.target }}"
stage_runtime_args=(
"${GITHUB_WORKSPACE}/sdk/python/scripts/update_sdk_artifacts.py"
stage-runtime
"$stage_dir"
"${GITHUB_WORKSPACE}/codex-rs/target/${{ matrix.target }}/release/codex"
--codex-version "${GITHUB_REF_NAME}"
--platform-tag "$platform_tag"
)
if [[ "${{ matrix.target }}" == *linux* ]]; then
# Keep bwrap in the runtime wheel so Linux sandbox fallback behavior
# matches the standalone release bundle on hosts without system bwrap.
stage_runtime_args+=(
--resource-binary
"${GITHUB_WORKSPACE}/codex-rs/target/${{ matrix.target }}/release/bwrap"
)
fi
python3 "${stage_runtime_args[@]}"
"${RUNNER_TEMP}/python-runtime-build-venv/bin/python" -m build --wheel --outdir "$wheel_dir" "$stage_dir"
- name: Upload Python runtime wheel
if: ${{ matrix.bundle == 'primary' }}
uses: actions/upload-artifact@bbbca2ddaa5d8feaa63e36b76fdaad77386f024f # v7.0.0
with:
name: python-runtime-wheel-${{ matrix.target }}
path: python-runtime-dist/${{ matrix.target }}/*.whl
if-no-files-found: error
- name: Compress artifacts
shell: bash
run: |
@@ -537,7 +478,6 @@ jobs:
tag: ${{ github.ref_name }}
should_publish_npm: ${{ steps.npm_publish_settings.outputs.should_publish }}
npm_tag: ${{ steps.npm_publish_settings.outputs.npm_tag }}
should_publish_python_runtime: ${{ steps.python_runtime_publish_settings.outputs.should_publish }}
steps:
- name: Checkout repository
@@ -614,22 +554,6 @@ jobs:
echo "npm_tag=" >> "$GITHUB_OUTPUT"
fi
- name: Determine Python runtime publish settings
id: python_runtime_publish_settings
env:
VERSION: ${{ steps.release_name.outputs.name }}
run: |
set -euo pipefail
version="${VERSION}"
if [[ "${version}" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
echo "should_publish=true" >> "$GITHUB_OUTPUT"
elif [[ "${version}" =~ ^[0-9]+\.[0-9]+\.[0-9]+-alpha\.[0-9]+$ ]]; then
echo "should_publish=true" >> "$GITHUB_OUTPUT"
else
echo "should_publish=false" >> "$GITHUB_OUTPUT"
fi
- name: Setup pnpm
uses: pnpm/action-setup@a8198c4bff370c8506180b035930dea56dbd5288 # v5
with:
@@ -863,48 +787,6 @@ jobs:
exit "${publish_status}"
done
# Publish the platform-specific Python runtime wheels using PyPI trusted publishing.
# PyPI project configuration must trust this workflow and job. Keep this
# non-blocking while the Python runtime publishing path is new; failures still
# need release follow-up, but should not invalidate the Rust release itself.
publish-python-runtime:
# Publish to PyPI for stable releases and alpha pre-releases with numeric suffixes.
if: ${{ needs.release.outputs.should_publish_python_runtime == 'true' }}
name: publish-python-runtime
needs: release
runs-on: ubuntu-latest
continue-on-error: true
environment: pypi
permissions:
id-token: write # Required for PyPI trusted publishing.
contents: read
steps:
- name: Download Python runtime wheels from release
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
RELEASE_TAG: ${{ needs.release.outputs.tag }}
RELEASE_VERSION: ${{ needs.release.outputs.version }}
run: |
set -euo pipefail
python_version="$RELEASE_VERSION"
python_version="${python_version/-alpha./a}"
python_version="${python_version/-beta./b}"
python_version="${python_version/-rc./rc}"
mkdir -p dist/python-runtime
gh release download "$RELEASE_TAG" \
--repo "${GITHUB_REPOSITORY}" \
--pattern "openai_codex_cli_bin-${python_version}-*.whl" \
--dir dist/python-runtime
ls -lh dist/python-runtime
- name: Publish Python runtime wheels to PyPI
uses: pypa/gh-action-pypi-publish@ed0c53931b1dc9bd32cbe73a98c7f6766f8a527e # v1.13.0
with:
packages-dir: dist/python-runtime
skip-existing: true
winget:
name: winget
needs: release

View File

@@ -2,7 +2,7 @@
// Unified entry point for the Codex CLI.
import { spawn } from "node:child_process";
import { existsSync } from "fs";
import { existsSync, realpathSync } from "fs";
import { createRequire } from "node:module";
import path from "path";
import { fileURLToPath } from "url";
@@ -171,6 +171,12 @@ const packageManagerEnvVar =
? "CODEX_MANAGED_BY_BUN"
: "CODEX_MANAGED_BY_NPM";
env[packageManagerEnvVar] = "1";
try {
env.CODEX_MANAGED_PACKAGE_ROOT = realpathSync(path.join(__dirname, ".."));
} catch {
// Best effort only. Older or unusual package layouts can omit this extra
// provenance without preventing Codex from starting.
}
const child = spawn(binaryPath, process.argv.slice(2), {
stdio: "inherit",

View File

@@ -34,7 +34,10 @@ use codex_state::state_db_path;
use codex_tui::AppExitInfo;
use codex_tui::Cli as TuiCli;
use codex_tui::ExitReason;
use codex_tui::PromptedUpdate;
use codex_tui::UpdateAction;
#[cfg(not(debug_assertions))]
use codex_tui::UpdateActionStatus;
use codex_utils_absolute_path::AbsolutePathBuf;
use codex_utils_cli::CliConfigOverrides;
use owo_colors::OwoColorize;
@@ -618,13 +621,13 @@ fn handle_app_exit(exit_info: AppExitInfo) -> anyhow::Result<()> {
ExitReason::UserRequested => { /* normal exit */ }
}
let update_action = exit_info.update_action;
let prompted_update = exit_info.update_action.clone();
let color_enabled = supports_color::on(Stream::Stdout).is_some();
for line in format_exit_messages(exit_info, color_enabled) {
println!("{line}");
}
if let Some(action) = update_action {
run_update_action(action)?;
if let Some(prompted_update) = prompted_update {
run_prompted_update(prompted_update)?;
}
Ok(())
}
@@ -672,6 +675,20 @@ fn run_update_action(action: UpdateAction) -> anyhow::Result<()> {
Ok(())
}
fn run_prompted_update(prompted_update: PromptedUpdate) -> anyhow::Result<()> {
run_update_action(prompted_update.action)?;
#[cfg(not(debug_assertions))]
{
if let Err(err) = codex_tui::record_successful_prompt_update_attempt(
&prompted_update.version_file,
&prompted_update.target_version,
) {
tracing::warn!("Failed to record successful prompted update attempt: {err}");
}
}
Ok(())
}
fn run_update_command() -> anyhow::Result<()> {
#[cfg(debug_assertions)]
{
@@ -682,12 +699,13 @@ fn run_update_command() -> anyhow::Result<()> {
#[cfg(not(debug_assertions))]
{
let Some(action) = codex_tui::get_update_action() else {
anyhow::bail!(
match codex_tui::get_update_action_status() {
UpdateActionStatus::Ready(action) => run_update_action(action),
UpdateActionStatus::Blocked(blocker) => anyhow::bail!("{blocker}"),
UpdateActionStatus::Unavailable => anyhow::bail!(
"Could not detect the Codex installation method. Please update manually: https://developers.openai.com/codex/cli/"
);
};
run_update_action(action)
),
}
}
}

View File

@@ -383,9 +383,16 @@ impl TestCodexBuilder {
.exec_server_url
.clone()
.or_else(|| test_env.exec_server_url.clone());
#[cfg(target_os = "linux")]
let codex_linux_sandbox_exe = Some(
crate::find_codex_linux_sandbox_exe()
.context("should find binary for codex-linux-sandbox")?,
);
#[cfg(not(target_os = "linux"))]
let codex_linux_sandbox_exe = None;
let local_runtime_paths = codex_exec_server::ExecServerRuntimePaths::new(
std::env::current_exe()?,
/*codex_linux_sandbox_exe*/ None,
codex_linux_sandbox_exe,
)?;
let environment_manager = Arc::new(
codex_exec_server::EnvironmentManager::create_for_tests(

View File

@@ -15,6 +15,11 @@ use std::time::Duration;
use codex_exec_server::CreateDirectoryOptions;
use codex_features::Feature;
use codex_protocol::models::PermissionProfile;
use codex_protocol::permissions::FileSystemAccessMode;
use codex_protocol::permissions::FileSystemPath;
use codex_protocol::permissions::FileSystemSandboxEntry;
use codex_protocol::permissions::FileSystemSandboxPolicy;
use codex_protocol::permissions::FileSystemSpecialPath;
use codex_protocol::permissions::NetworkSandboxPolicy;
use codex_protocol::protocol::AskForApproval;
use codex_protocol::protocol::EventMsg;
@@ -23,6 +28,7 @@ use codex_protocol::protocol::SandboxPolicy;
use codex_protocol::user_input::UserInput;
#[cfg(target_os = "linux")]
use codex_sandboxing::landlock::CODEX_LINUX_SANDBOX_ARG0;
use codex_utils_absolute_path::AbsolutePathBuf;
use core_test_support::assert_regex_match;
use core_test_support::responses::ev_assistant_message;
use core_test_support::responses::ev_completed;
@@ -112,6 +118,45 @@ fn restrictive_workspace_write_profile() -> PermissionProfile {
)
}
fn workspace_write_with_read_only_root(read_only_root: AbsolutePathBuf) -> PermissionProfile {
let file_system_sandbox_policy = FileSystemSandboxPolicy::restricted(vec![
FileSystemSandboxEntry {
path: FileSystemPath::Path {
path: read_only_root,
},
access: FileSystemAccessMode::Read,
},
FileSystemSandboxEntry {
path: FileSystemPath::Special {
value: FileSystemSpecialPath::project_roots(/*subpath*/ None),
},
access: FileSystemAccessMode::Write,
},
]);
PermissionProfile::from_runtime_permissions(
&file_system_sandbox_policy,
NetworkSandboxPolicy::Restricted,
)
}
#[cfg(unix)]
fn create_file_symlink(source: &std::path::Path, link: &std::path::Path) -> std::io::Result<()> {
std::os::unix::fs::symlink(source, link)
}
#[cfg(windows)]
fn create_file_symlink(source: &std::path::Path, link: &std::path::Path) -> std::io::Result<()> {
std::os::windows::fs::symlink_file(source, link)
}
#[cfg(not(any(unix, windows)))]
fn create_file_symlink(_source: &std::path::Path, _link: &std::path::Path) -> std::io::Result<()> {
Err(std::io::Error::new(
std::io::ErrorKind::Unsupported,
"file symlinks are unsupported on this platform",
))
}
pub async fn mount_apply_patch(
harness: &TestCodexHarness,
call_id: &str,
@@ -675,6 +720,181 @@ async fn apply_patch_cli_rejects_path_traversal_outside_workspace(
Ok(())
}
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
#[test_case(ApplyPatchModelOutput::Freeform ; "freeform")]
#[test_case(ApplyPatchModelOutput::Shell ; "shell")]
#[test_case(ApplyPatchModelOutput::ShellViaHeredoc ; "shell_heredoc")]
#[test_case(ApplyPatchModelOutput::ShellCommandViaHeredoc ; "shell_command_heredoc")]
async fn apply_patch_cli_does_not_write_through_symlink_escape_outside_workspace(
model_output: ApplyPatchModelOutput,
) -> Result<()> {
skip_if_no_network!(Ok(()));
skip_if_remote!(
Ok(()),
"link escape setup needs local filesystem link creation"
);
let test_root = tempfile::tempdir_in(std::env::current_dir()?)?;
let work_dir = AbsolutePathBuf::try_from(test_root.path().join("work"))?;
let outside_dir = AbsolutePathBuf::try_from(test_root.path().join("outside"))?;
std::fs::create_dir_all(work_dir.as_path())?;
std::fs::create_dir_all(outside_dir.as_path())?;
let harness_work_dir = work_dir.clone();
let harness = apply_patch_harness_with(move |builder| {
builder.with_config(move |config| {
config.cwd = harness_work_dir;
})
})
.await?;
let original_contents = "original outside content\n";
let outside_file = outside_dir.join("victim.txt");
std::fs::write(&outside_file, original_contents)?;
let link_rel = "soft-link.txt";
let link_path = harness.path(link_rel);
match create_file_symlink(&outside_file, &link_path) {
Ok(()) => {}
Err(error) if cfg!(windows) => {
eprintln!("Skipping Windows symlink apply_patch sandbox test: {error}");
return Ok(());
}
Err(error) => return Err(error.into()),
}
let patch = format!(
r#"*** Begin Patch
*** Update File: {link_rel}
@@
-original outside content
+pwned
*** End Patch"#
);
let call_id = "apply-symlink-escape";
mount_apply_patch(&harness, call_id, &patch, "fail", model_output).await;
harness
.submit_with_permission_profile(
"attempt to escape workspace via apply_patch link",
workspace_write_with_read_only_root(outside_dir.clone()),
)
.await?;
let out = harness.apply_patch_output(call_id, model_output).await;
assert_eq!(
std::fs::read_to_string(&outside_file)?,
original_contents,
"symlink escape should not modify the outside victim; tool output: {out}",
);
let metadata = std::fs::symlink_metadata(&link_path)?;
assert!(metadata.file_type().is_symlink());
Ok(())
}
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
#[test_case(ApplyPatchModelOutput::Freeform ; "freeform")]
#[test_case(ApplyPatchModelOutput::Shell ; "shell")]
#[test_case(ApplyPatchModelOutput::ShellViaHeredoc ; "shell_heredoc")]
#[test_case(ApplyPatchModelOutput::ShellCommandViaHeredoc ; "shell_command_heredoc")]
async fn apply_patch_cli_preserves_existing_hard_link_outside_workspace(
model_output: ApplyPatchModelOutput,
) -> Result<()> {
skip_if_no_network!(Ok(()));
skip_if_remote!(
Ok(()),
"link setup needs local filesystem hard link creation"
);
let test_root = tempfile::tempdir_in(std::env::current_dir()?)?;
let work_dir = AbsolutePathBuf::try_from(test_root.path().join("work"))?;
let outside_dir = AbsolutePathBuf::try_from(test_root.path().join("outside"))?;
std::fs::create_dir_all(work_dir.as_path())?;
std::fs::create_dir_all(outside_dir.as_path())?;
let harness_work_dir = work_dir.clone();
let harness = apply_patch_harness_with(move |builder| {
builder.with_config(move |config| {
config.cwd = harness_work_dir;
})
})
.await?;
let outside_file = outside_dir.join("victim.txt");
std::fs::write(&outside_file, "original outside content\n")?;
let link_rel = "hard-link.txt";
let link_path = harness.path(link_rel);
std::fs::hard_link(&outside_file, &link_path)?;
let patch = format!(
r#"*** Begin Patch
*** Update File: {link_rel}
@@
-original outside content
+updated through existing hard link
*** End Patch"#
);
let call_id = "apply-hard-link";
mount_apply_patch(&harness, call_id, &patch, "ok", model_output).await;
harness
.submit_with_permission_profile(
"update existing hard link via apply_patch",
workspace_write_with_read_only_root(outside_dir.clone()),
)
.await?;
let out = harness.apply_patch_output(call_id, model_output).await;
if cfg!(windows) {
assert!(
out.contains("patch rejected: writing outside of the project"),
"Windows sandboxing intentionally rejects writes through existing hard links to files outside the workspace; tool output: {out}"
);
assert_eq!(
std::fs::read_to_string(&outside_file)?,
"original outside content\n",
"Windows rejection must leave the outside hard-link target unchanged"
);
assert_eq!(
std::fs::read_to_string(&link_path)?,
"original outside content\n",
"Windows rejection must leave the workspace hard-link path unchanged"
);
std::fs::write(&outside_file, "post-reject outside write\n")?;
assert_eq!(
std::fs::read_to_string(&link_path)?,
"post-reject outside write\n",
"Windows rejection must not unlink or replace an existing hard link"
);
return Ok(());
}
assert!(
out.contains("Success. Updated the following files:"),
"apply_patch should intentionally allow updates through existing hard links; tool output: {out}"
);
assert_eq!(
std::fs::read_to_string(&outside_file)?,
"updated through existing hard link\n",
"apply_patch intentionally preserves existing hard-link semantics; the outside path observes the shared inode update"
);
assert_eq!(
std::fs::read_to_string(&link_path)?,
"updated through existing hard link\n",
"apply_patch intentionally preserves existing hard-link semantics; the workspace path observes the same update"
);
std::fs::write(&outside_file, "post-apply outside write\n")?;
assert_eq!(
std::fs::read_to_string(&link_path)?,
"post-apply outside write\n",
"apply_patch must not unlink or replace an existing hard link; later writes through either path should still be visible"
);
Ok(())
}
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
#[test_case(ApplyPatchModelOutput::Freeform)]
#[test_case(ApplyPatchModelOutput::Shell)]

View File

@@ -2,6 +2,7 @@
mod common;
use std::os::unix::fs::MetadataExt;
#[cfg(target_os = "linux")]
use std::os::unix::fs::PermissionsExt;
use std::os::unix::fs::symlink;
@@ -732,6 +733,53 @@ async fn file_system_sandboxed_write_rejects_symlink_escape(use_remote: bool) ->
Ok(())
}
#[test_case(false ; "local")]
#[test_case(true ; "remote")]
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn file_system_sandboxed_write_preserves_existing_hard_link(use_remote: bool) -> Result<()> {
let context = create_file_system_context(use_remote).await?;
let file_system = context.file_system;
let tmp = TempDir::new()?;
let allowed_dir = tmp.path().join("allowed");
let outside_dir = tmp.path().join("outside");
std::fs::create_dir_all(&allowed_dir)?;
std::fs::create_dir_all(&outside_dir)?;
let outside_file = outside_dir.join("outside.txt");
let hard_link = allowed_dir.join("hard-link.txt");
std::fs::write(&outside_file, "outside\n")?;
std::fs::hard_link(&outside_file, &hard_link)?;
let sandbox = workspace_write_sandbox(allowed_dir);
file_system
.write_file(
&absolute_path(hard_link.clone()),
b"updated through existing hard link\n".to_vec(),
Some(&sandbox),
)
.await
.with_context(|| format!("mode={use_remote}"))?;
assert_eq!(
std::fs::read_to_string(&outside_file)?,
"updated through existing hard link\n"
);
assert_eq!(
std::fs::read_to_string(&hard_link)?,
"updated through existing hard link\n"
);
let outside_metadata = std::fs::metadata(&outside_file)?;
let link_metadata = std::fs::metadata(&hard_link)?;
assert_eq!(
(link_metadata.dev(), link_metadata.ino()),
(outside_metadata.dev(), outside_metadata.ino())
);
Ok(())
}
#[test_case(false ; "local")]
#[test_case(true ; "remote")]
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]

View File

@@ -40,6 +40,8 @@ use crate::history_cell;
use crate::history_cell::HistoryCell;
#[cfg(not(debug_assertions))]
use crate::history_cell::UpdateAvailableHistoryCell;
#[cfg(not(debug_assertions))]
use crate::history_cell::UpdateRemediationWarningHistoryCell;
use crate::key_hint::KeyBindingListExt;
use crate::keymap::RuntimeKeymap;
use crate::legacy_core::config::Config;
@@ -73,7 +75,9 @@ use crate::token_usage::TokenUsage;
use crate::transcript_reflow::TranscriptReflowState;
use crate::tui;
use crate::tui::TuiEvent;
use crate::update_action::UpdateAction;
use crate::update_action::PromptedUpdate;
#[cfg(not(debug_assertions))]
use crate::updates::UpgradeHistoryNotice;
use crate::version::CODEX_CLI_VERSION;
use crate::workspace_command::AppServerWorkspaceCommandRunner;
use crate::workspace_command::WorkspaceCommandRunner;
@@ -335,7 +339,7 @@ pub struct AppExitInfo {
pub token_usage: TokenUsage,
pub thread_id: Option<ThreadId>,
pub thread_name: Option<String>,
pub update_action: Option<UpdateAction>,
pub update_action: Option<PromptedUpdate>,
pub exit_reason: ExitReason,
}
@@ -478,7 +482,7 @@ pub(crate) struct App {
remote_app_server_url: Option<String>,
remote_app_server_auth_token: Option<String>,
/// Set when the user confirms an update; propagated on exit.
pub(crate) pending_update_action: Option<UpdateAction>,
pub(crate) pending_update_action: Option<PromptedUpdate>,
/// Tracks the thread we intentionally shut down while exiting the app.
///
@@ -863,7 +867,13 @@ See the Codex keymap documentation for supported actions and examples."
)
})?;
#[cfg(not(debug_assertions))]
let upgrade_version = crate::updates::get_upgrade_version(&config);
let upgrade_notice = crate::updates::get_upgrade_notice_for_history(
&config,
initial_prompt
.as_ref()
.is_some_and(|prompt| !prompt.is_empty()),
crate::update_action::get_update_action_status(),
);
let mut app = Self {
model_catalog,
@@ -968,16 +978,24 @@ See the Codex keymap documentation for supported actions and examples."
let mut waiting_for_initial_session_configured = wait_for_initial_session_configured;
#[cfg(not(debug_assertions))]
let pre_loop_exit_reason = if let Some(latest_version) = upgrade_version {
let pre_loop_exit_reason = if let Some(upgrade_notice) = upgrade_notice {
let cell: Box<dyn HistoryCell> = match upgrade_notice {
UpgradeHistoryNotice::Available {
latest_version,
update_action,
} => Box::new(UpdateAvailableHistoryCell::new(
latest_version,
update_action,
)),
UpgradeHistoryNotice::BlockedWarning(blocker) => {
Box::new(UpdateRemediationWarningHistoryCell::Blocked(blocker))
}
UpgradeHistoryNotice::NoOpUpdateWarning { latest_version } => {
Box::new(UpdateRemediationWarningHistoryCell::NoOpUpdate { latest_version })
}
};
let control = app
.handle_event(
tui,
&mut app_server,
AppEvent::InsertHistoryCell(Box::new(UpdateAvailableHistoryCell::new(
latest_version,
crate::update_action::get_update_action(),
))),
)
.handle_event(tui, &mut app_server, AppEvent::InsertHistoryCell(cell))
.await?;
match control {
AppRunControl::Continue => None,
@@ -1078,7 +1096,7 @@ See the Codex keymap documentation for supported actions and examples."
token_usage: app.token_usage(),
thread_id: resumable_thread.as_ref().map(|thread| thread.thread_id),
thread_name: resumable_thread.and_then(|thread| thread.thread_name),
update_action: app.pending_update_action,
update_action: app.pending_update_action.clone(),
exit_reason,
})
}

View File

@@ -88,7 +88,7 @@ impl ChatWidget {
.into_iter()
.map(|tier| ServiceTierCommand {
id: tier.id,
name: tier.name,
name: tier.name.to_lowercase(),
description: tier.description,
})
.collect()

View File

@@ -62,6 +62,24 @@ fn next_add_to_history_event(rx: &mut tokio::sync::mpsc::UnboundedReceiver<AppEv
}
}
#[tokio::test]
async fn service_tier_commands_lowercase_catalog_names() {
let (mut chat, _rx, _op_rx) = make_chatwidget_manual(Some("gpt-5.4")).await;
let mut preset = get_available_model(&chat, "gpt-5.4");
let expected_description = preset.service_tiers[0].description.clone();
preset.service_tiers[0].name = "Fast".to_string();
chat.model_catalog = std::sync::Arc::new(ModelCatalog::new(vec![preset]));
assert_eq!(
chat.current_model_service_tier_commands(),
vec![ServiceTierCommand {
id: ServiceTier::Fast.request_value().to_string(),
name: "fast".to_string(),
description: expected_description,
}]
);
}
#[tokio::test]
async fn slash_compact_eagerly_queues_follow_up_before_turn_start() {
let (mut chat, mut rx, mut op_rx) = make_chatwidget_manual(/*model_override*/ None).await;

View File

@@ -41,6 +41,7 @@ use crate::text_formatting::truncate_text;
use crate::tooltips;
use crate::ui_consts::LIVE_PREFIX_COLS;
use crate::update_action::UpdateAction;
use crate::update_action::UpdateBlocker;
use crate::version::CODEX_CLI_VERSION;
use crate::wrapping::RtOptions;
use crate::wrapping::adaptive_wrap_line;
@@ -656,6 +657,40 @@ pub(crate) struct UpdateAvailableHistoryCell {
update_action: Option<UpdateAction>,
}
#[cfg_attr(debug_assertions, allow(dead_code))]
#[derive(Debug)]
pub(crate) enum UpdateRemediationWarningHistoryCell {
Blocked(UpdateBlocker),
NoOpUpdate { latest_version: String },
}
impl HistoryCell for UpdateRemediationWarningHistoryCell {
fn display_lines(&self, _width: u16) -> Vec<Line<'static>> {
self.raw_lines()
}
fn raw_lines(&self) -> Vec<Line<'static>> {
match self {
Self::Blocked(UpdateBlocker::NpmGlobalRootMismatch {
running_package_root,
npm_package_root,
}) => vec![Line::from(format!(
"Warning: Codex is running from {}, but npm would update {}.",
running_package_root.display(),
npm_package_root.display(),
))],
Self::NoOpUpdate { latest_version } => vec![
Line::from(
"Warning: The previous update completed, but this Codex executable did not change.",
),
Line::from(format!(
"Codex is still running {CODEX_CLI_VERSION} while {latest_version} is available."
)),
],
}
}
}
#[cfg_attr(debug_assertions, allow(dead_code))]
impl UpdateAvailableHistoryCell {
pub(crate) fn new(latest_version: String, update_action: Option<UpdateAction>) -> Self {

View File

@@ -177,13 +177,20 @@ mod transcript_reflow;
mod tui;
mod ui_consts;
pub(crate) mod update_action;
pub use update_action::PromptedUpdate;
pub use update_action::UpdateAction;
pub use update_action::UpdateActionStatus;
pub use update_action::UpdateBlocker;
#[cfg(not(debug_assertions))]
pub use update_action::get_update_action;
#[cfg(any(not(debug_assertions), test))]
pub use update_action::get_update_action_status;
mod update_prompt;
#[cfg(any(not(debug_assertions), test))]
mod update_versions;
mod updates;
#[cfg(not(debug_assertions))]
pub use updates::record_successful_prompt_update_attempt;
mod version;
#[cfg(not(target_os = "linux"))]
mod voice;
@@ -1094,8 +1101,9 @@ pub async fn run_main(
}
#[allow(clippy::too_many_arguments)]
#[allow(unused_mut)]
async fn run_ratatui_app(
cli: Cli,
mut cli: Cli,
arg0_paths: Arg0DispatchPaths,
loader_overrides: LoaderOverrides,
app_server_target: AppServerTarget,
@@ -1149,6 +1157,9 @@ async fn run_ratatui_app(
exit_reason: ExitReason::UserRequested,
});
}
UpdatePromptOutcome::StartRepairSession(prompt) => {
cli.prompt = Some(prompt);
}
}
}
}

View File

@@ -1,7 +1,8 @@
use serde::Deserialize;
use std::collections::HashMap;
#[cfg(not(debug_assertions))]
#[cfg(any(not(debug_assertions), test))]
#[cfg_attr(test, allow(dead_code))]
pub(crate) const PACKAGE_URL: &str = "https://registry.npmjs.org/@openai%2fcodex";
#[derive(Deserialize, Debug, Clone)]

View File

@@ -2,6 +2,7 @@
source: tui/src/update_prompt.rs
expression: terminal.backend()
---
Update available! 0.0.0 -> 9.9.9
Release notes: https://github.com/openai/codex/releases/latest

View File

@@ -0,0 +1,17 @@
---
source: tui/src/update_prompt.rs
assertion_line: 592
expression: terminal.backend()
---
Update needs attention
You are running Codex from:
/prefix-a/lib/node_modules/@openai/codex
but `npm install -g @openai/codex@latest` would update:
/prefix-b/lib/node_modules/@openai/codex
Fix your shell PATH or remove the stale Codex install, then restart Codex.
1. Help me fix this
2. Later
3. Don't remind me about this version

View File

@@ -2,6 +2,15 @@
use codex_install_context::InstallContext;
#[cfg(any(not(debug_assertions), test))]
use codex_install_context::StandalonePlatform;
use std::fmt;
#[cfg(any(not(debug_assertions), test))]
use std::path::Path;
use std::path::PathBuf;
#[cfg(any(not(debug_assertions), test))]
use std::process::Command;
#[cfg(any(not(debug_assertions), test))]
const MANAGED_PACKAGE_ROOT_ENV: &str = "CODEX_MANAGED_PACKAGE_ROOT";
/// Update action the CLI should perform after the TUI exits.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
@@ -18,6 +27,62 @@ pub enum UpdateAction {
StandaloneWindows,
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum UpdateActionStatus {
Ready(UpdateAction),
Blocked(UpdateBlocker),
Unavailable,
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum UpdateBlocker {
NpmGlobalRootMismatch {
running_package_root: PathBuf,
npm_package_root: PathBuf,
},
}
#[derive(Debug, Clone)]
pub struct PromptedUpdate {
pub action: UpdateAction,
pub target_version: String,
pub version_file: PathBuf,
}
impl UpdateBlocker {
pub fn remediation_lines(&self) -> Vec<String> {
match self {
Self::NpmGlobalRootMismatch {
running_package_root,
npm_package_root,
} => vec![
"You are running Codex from:".to_string(),
format!(" {}", running_package_root.display()),
"but `npm install -g @openai/codex@latest` would update:".to_string(),
format!(" {}", npm_package_root.display()),
"Fix your shell PATH or remove the stale Codex install, then restart Codex."
.to_string(),
],
}
}
}
impl fmt::Display for UpdateBlocker {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
Self::NpmGlobalRootMismatch {
running_package_root,
npm_package_root,
} => write!(
f,
"You are running Codex from {}, but `npm install -g @openai/codex@latest` would update {}. Fix your shell PATH or remove the stale Codex install, then restart Codex.",
running_package_root.display(),
npm_package_root.display(),
),
}
}
}
impl UpdateAction {
#[cfg(any(not(debug_assertions), test))]
pub(crate) fn from_install_context(context: &InstallContext) -> Option<Self> {
@@ -36,8 +101,8 @@ impl UpdateAction {
/// Returns the list of command-line arguments for invoking the update.
pub fn command_args(self) -> (&'static str, &'static [&'static str]) {
match self {
UpdateAction::NpmGlobalLatest => ("npm", &["install", "-g", "@openai/codex"]),
UpdateAction::BunGlobalLatest => ("bun", &["install", "-g", "@openai/codex"]),
UpdateAction::NpmGlobalLatest => ("npm", &["install", "-g", "@openai/codex@latest"]),
UpdateAction::BunGlobalLatest => ("bun", &["install", "-g", "@openai/codex@latest"]),
UpdateAction::BrewUpgrade => ("brew", &["upgrade", "--cask", "codex"]),
UpdateAction::StandaloneUnix => (
"sh",
@@ -58,11 +123,79 @@ impl UpdateAction {
}
}
#[cfg(not(debug_assertions))]
#[cfg(any(not(debug_assertions), test))]
#[cfg_attr(test, allow(dead_code))]
pub fn get_update_action() -> Option<UpdateAction> {
UpdateAction::from_install_context(InstallContext::current())
}
#[cfg(any(not(debug_assertions), test))]
pub fn get_update_action_status() -> UpdateActionStatus {
let Some(action) = UpdateAction::from_install_context(InstallContext::current()) else {
return UpdateActionStatus::Unavailable;
};
if let Some(blocker) = update_blocker(action) {
UpdateActionStatus::Blocked(blocker)
} else {
UpdateActionStatus::Ready(action)
}
}
#[cfg(any(not(debug_assertions), test))]
fn update_blocker(action: UpdateAction) -> Option<UpdateBlocker> {
match action {
UpdateAction::NpmGlobalLatest => npm_global_root_mismatch(),
UpdateAction::BunGlobalLatest
| UpdateAction::BrewUpgrade
| UpdateAction::StandaloneUnix
| UpdateAction::StandaloneWindows => None,
}
}
#[cfg(any(not(debug_assertions), test))]
fn npm_global_root_mismatch() -> Option<UpdateBlocker> {
let running_package_root = std::env::var_os(MANAGED_PACKAGE_ROOT_ENV)?;
let running_package_root = std::fs::canonicalize(PathBuf::from(running_package_root)).ok()?;
let npm_global_root = npm_global_root()?;
mismatch_from_npm_roots(&running_package_root, &npm_global_root)
}
#[cfg(any(not(debug_assertions), test))]
fn npm_global_root() -> Option<PathBuf> {
#[cfg(windows)]
let output = Command::new("cmd")
.args(["/C", "npm", "root", "-g"])
.output()
.ok()?;
#[cfg(not(windows))]
let output = Command::new("npm").args(["root", "-g"]).output().ok()?;
if !output.status.success() {
return None;
}
let stdout = String::from_utf8(output.stdout).ok()?;
let npm_global_root = stdout.trim();
if npm_global_root.is_empty() {
return None;
}
std::fs::canonicalize(npm_global_root).ok()
}
#[cfg(any(not(debug_assertions), test))]
fn mismatch_from_npm_roots(
running_package_root: &Path,
npm_global_root: &Path,
) -> Option<UpdateBlocker> {
let npm_package_root = npm_global_root.join("@openai").join("codex");
(running_package_root != npm_package_root.as_path()).then(|| {
UpdateBlocker::NpmGlobalRootMismatch {
running_package_root: running_package_root.to_path_buf(),
npm_package_root,
}
})
}
#[cfg(test)]
mod tests {
use super::*;
@@ -124,4 +257,29 @@ mod tests {
)
);
}
#[test]
fn npm_root_mismatch_is_blocked_when_update_targets_another_install() {
assert_eq!(
mismatch_from_npm_roots(
Path::new("/prefix-a/lib/node_modules/@openai/codex"),
Path::new("/prefix-b/lib/node_modules"),
),
Some(UpdateBlocker::NpmGlobalRootMismatch {
running_package_root: PathBuf::from("/prefix-a/lib/node_modules/@openai/codex"),
npm_package_root: PathBuf::from("/prefix-b/lib/node_modules/@openai/codex"),
})
);
}
#[test]
fn npm_root_match_keeps_update_available() {
assert_eq!(
mismatch_from_npm_roots(
Path::new("/prefix-a/lib/node_modules/@openai/codex"),
Path::new("/prefix-a/lib/node_modules"),
),
None
);
}
}

View File

@@ -1,4 +1,5 @@
#![cfg(not(debug_assertions))]
#![cfg(any(not(debug_assertions), test))]
#![cfg_attr(test, allow(dead_code))]
use crate::history_cell::padded_emoji;
use crate::key_hint;
@@ -11,8 +12,12 @@ use crate::selection_list::selection_option_row;
use crate::tui::FrameRequester;
use crate::tui::Tui;
use crate::tui::TuiEvent;
use crate::update_action::PromptedUpdate;
use crate::update_action::UpdateAction;
use crate::update_action::UpdateActionStatus;
use crate::update_action::UpdateBlocker;
use crate::updates;
use crate::updates::UpgradeNotice;
use color_eyre::Result;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
@@ -29,20 +34,55 @@ use tokio_stream::StreamExt;
pub(crate) enum UpdatePromptOutcome {
Continue,
RunUpdate(UpdateAction),
RunUpdate(PromptedUpdate),
StartRepairSession(String),
}
pub(crate) async fn run_update_prompt_if_needed(
tui: &mut Tui,
config: &Config,
) -> Result<UpdatePromptOutcome> {
let Some(latest_version) = updates::get_upgrade_version_for_popup(config) else {
return Ok(UpdatePromptOutcome::Continue);
};
let Some(update_action) = crate::update_action::get_update_action() else {
let Some(notice) = updates::get_upgrade_notice_for_popup(config) else {
return Ok(UpdatePromptOutcome::Continue);
};
match notice {
UpgradeNotice::Available(latest_version) => {
let update_action = match crate::update_action::get_update_action_status() {
UpdateActionStatus::Ready(update_action) => update_action,
UpdateActionStatus::Blocked(blocker) => {
return run_remediation_prompt(
tui,
config,
latest_version,
RemediationPromptReason::Blocked(blocker),
)
.await;
}
UpdateActionStatus::Unavailable => return Ok(UpdatePromptOutcome::Continue),
};
run_standard_update_prompt(tui, config, latest_version, update_action).await
}
UpgradeNotice::RemediationNeeded(latest_version) => {
run_remediation_prompt(
tui,
config,
latest_version.clone(),
RemediationPromptReason::NoOpUpdate {
latest_version: latest_version.clone(),
},
)
.await
}
}
}
async fn run_standard_update_prompt(
tui: &mut Tui,
config: &Config,
latest_version: String,
update_action: UpdateAction,
) -> Result<UpdatePromptOutcome> {
let mut screen =
UpdatePromptScreen::new(tui.frame_requester(), latest_version.clone(), update_action);
tui.draw(u16::MAX, |frame| {
@@ -71,7 +111,11 @@ pub(crate) async fn run_update_prompt_if_needed(
match screen.selection() {
Some(UpdateSelection::UpdateNow) => {
tui.terminal.clear()?;
Ok(UpdatePromptOutcome::RunUpdate(update_action))
Ok(UpdatePromptOutcome::RunUpdate(PromptedUpdate {
action: update_action,
target_version: latest_version,
version_file: updates::version_filepath(config),
}))
}
Some(UpdateSelection::NotNow) | None => Ok(UpdatePromptOutcome::Continue),
Some(UpdateSelection::DontRemind) => {
@@ -83,6 +127,49 @@ pub(crate) async fn run_update_prompt_if_needed(
}
}
async fn run_remediation_prompt(
tui: &mut Tui,
config: &Config,
latest_version: String,
reason: RemediationPromptReason,
) -> Result<UpdatePromptOutcome> {
let mut screen = RemediationPromptScreen::new(tui.frame_requester(), reason.clone());
tui.draw(u16::MAX, |frame| {
frame.render_widget_ref(&screen, frame.area());
})?;
let events = tui.event_stream();
tokio::pin!(events);
while !screen.is_done() {
if let Some(event) = events.next().await {
match event {
TuiEvent::Key(key_event) => screen.handle_key(key_event),
TuiEvent::Paste(_) => {}
TuiEvent::Draw | TuiEvent::Resize => {
tui.draw(u16::MAX, |frame| {
frame.render_widget_ref(&screen, frame.area());
})?;
}
}
} else {
break;
}
}
match screen.selection() {
Some(RemediationSelection::HelpMeFixThis) => Ok(UpdatePromptOutcome::StartRepairSession(
reason.repair_prompt(),
)),
Some(RemediationSelection::DontRemind) => {
if let Err(err) = updates::dismiss_version(config, &latest_version).await {
tracing::error!("Failed to persist update remediation dismissal: {err}");
}
Ok(UpdatePromptOutcome::Continue)
}
Some(RemediationSelection::Later) | None => Ok(UpdatePromptOutcome::Continue),
}
}
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum UpdateSelection {
UpdateNow,
@@ -208,21 +295,23 @@ impl WidgetRef for &UpdatePromptScreen {
.dim()
.underlined(),
])
.inset(Insets::tlbr(0, 2, 0, 0)),
.inset(Insets::tlbr(
/*top*/ 0, /*left*/ 2, /*bottom*/ 0, /*right*/ 0,
)),
);
column.push("");
column.push(selection_option_row(
0,
/*index*/ 0,
format!("Update now (runs `{update_command}`)"),
self.highlighted == UpdateSelection::UpdateNow,
));
column.push(selection_option_row(
1,
/*index*/ 1,
"Skip".to_string(),
self.highlighted == UpdateSelection::NotNow,
));
column.push(selection_option_row(
2,
/*index*/ 2,
"Skip until next version".to_string(),
self.highlighted == UpdateSelection::DontRemind,
));
@@ -233,12 +322,186 @@ impl WidgetRef for &UpdatePromptScreen {
key_hint::plain(KeyCode::Enter).into(),
" to continue".dim(),
])
.inset(Insets::tlbr(0, 2, 0, 0)),
.inset(Insets::tlbr(
/*top*/ 0, /*left*/ 2, /*bottom*/ 0, /*right*/ 0,
)),
);
column.render(area, buf);
}
}
#[derive(Clone, Debug)]
enum RemediationPromptReason {
Blocked(UpdateBlocker),
NoOpUpdate { latest_version: String },
}
impl RemediationPromptReason {
fn lines(&self) -> Vec<String> {
match self {
Self::Blocked(blocker) => blocker.remediation_lines(),
Self::NoOpUpdate { latest_version } => vec![
"The previous update command completed, but this Codex executable did not change."
.to_string(),
format!(
"Codex is still running {} while {latest_version} is available.",
env!("CARGO_PKG_VERSION")
),
"Check which `codex` your shell runs before trying again.".to_string(),
],
}
}
fn repair_prompt(&self) -> String {
match self {
Self::Blocked(UpdateBlocker::NpmGlobalRootMismatch {
running_package_root,
npm_package_root,
}) => format!(
"Help me fix my Codex install. Codex is currently running from {}, but `npm install -g @openai/codex@latest` would update {}. Please inspect my shell PATH and Codex installs, explain the safest fix, and ask before making any destructive changes.",
running_package_root.display(),
npm_package_root.display(),
),
Self::NoOpUpdate { latest_version } => format!(
"Help me fix my Codex install. A previous update command completed, but this Codex executable is still running {} while {latest_version} is available. Please inspect my shell PATH and Codex installs, explain the safest fix, and ask before making any destructive changes.",
env!("CARGO_PKG_VERSION"),
),
}
}
}
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum RemediationSelection {
HelpMeFixThis,
Later,
DontRemind,
}
struct RemediationPromptScreen {
request_frame: FrameRequester,
reason: RemediationPromptReason,
highlighted: RemediationSelection,
selection: Option<RemediationSelection>,
}
impl RemediationPromptScreen {
fn new(request_frame: FrameRequester, reason: RemediationPromptReason) -> Self {
Self {
request_frame,
reason,
highlighted: RemediationSelection::HelpMeFixThis,
selection: None,
}
}
fn handle_key(&mut self, key_event: KeyEvent) {
if key_event.kind == KeyEventKind::Release {
return;
}
if key_event.modifiers.contains(KeyModifiers::CONTROL)
&& matches!(key_event.code, KeyCode::Char('c') | KeyCode::Char('d'))
{
self.select(RemediationSelection::Later);
return;
}
match key_event.code {
KeyCode::Up | KeyCode::Char('k') => self.set_highlight(self.highlighted.prev()),
KeyCode::Down | KeyCode::Char('j') => self.set_highlight(self.highlighted.next()),
KeyCode::Char('1') => self.select(RemediationSelection::HelpMeFixThis),
KeyCode::Char('2') => self.select(RemediationSelection::Later),
KeyCode::Char('3') => self.select(RemediationSelection::DontRemind),
KeyCode::Enter => self.select(self.highlighted),
KeyCode::Esc => self.select(RemediationSelection::Later),
_ => {}
}
}
fn set_highlight(&mut self, selection: RemediationSelection) {
self.highlighted = selection;
self.request_frame.schedule_frame();
}
fn select(&mut self, selection: RemediationSelection) {
self.highlighted = selection;
self.selection = Some(selection);
self.request_frame.schedule_frame();
}
fn is_done(&self) -> bool {
self.selection.is_some()
}
fn selection(&self) -> Option<RemediationSelection> {
self.selection
}
}
impl WidgetRef for &RemediationPromptScreen {
fn render_ref(&self, area: Rect, buf: &mut Buffer) {
Clear.render(area, buf);
let mut column = ColumnRenderable::new();
column.push("");
column.push(Line::from(vec![
padded_emoji("").bold().cyan(),
"Update needs attention".bold(),
]));
column.push("");
for line in self.reason.lines() {
column.push(Line::from(line).inset(Insets::tlbr(
/*top*/ 0, /*left*/ 2, /*bottom*/ 0, /*right*/ 0,
)));
}
column.push("");
column.push(remediation_selection_line(
/*index*/ 0,
"Help me fix this",
self.highlighted == RemediationSelection::HelpMeFixThis,
));
column.push(remediation_selection_line(
/*index*/ 1,
"Later",
self.highlighted == RemediationSelection::Later,
));
column.push(remediation_selection_line(
/*index*/ 2,
"Don't remind me about this version",
self.highlighted == RemediationSelection::DontRemind,
));
column.render(area, buf);
}
}
fn remediation_selection_line(index: usize, label: &str, selected: bool) -> Line<'static> {
if selected {
Line::from(vec![
format!(" {}. ", index + 1).cyan(),
label.to_string().cyan(),
])
} else {
Line::from(format!(" {}. {label}", index + 1))
}
}
impl RemediationSelection {
fn next(self) -> Self {
match self {
Self::HelpMeFixThis => Self::Later,
Self::Later => Self::DontRemind,
Self::DontRemind => Self::HelpMeFixThis,
}
}
fn prev(self) -> Self {
match self {
Self::HelpMeFixThis => Self::DontRemind,
Self::Later => Self::HelpMeFixThis,
Self::DontRemind => Self::Later,
}
}
}
#[cfg(test)]
mod tests {
use super::*;
@@ -260,7 +523,8 @@ mod tests {
#[test]
fn update_prompt_snapshot() {
let screen = new_prompt();
let mut terminal = Terminal::new(VT100Backend::new(80, 12)).expect("terminal");
let mut terminal =
Terminal::new(VT100Backend::new(/*width*/ 80, /*height*/ 12)).expect("terminal");
terminal
.draw(|frame| frame.render_widget_ref(&screen, frame.area()))
.expect("render update prompt");
@@ -310,4 +574,40 @@ mod tests {
screen.handle_key(KeyEvent::new(KeyCode::Down, KeyModifiers::NONE));
assert_eq!(screen.highlighted, UpdateSelection::UpdateNow);
}
#[test]
fn update_remediation_prompt_snapshot() {
let screen = RemediationPromptScreen::new(
FrameRequester::test_dummy(),
RemediationPromptReason::Blocked(UpdateBlocker::NpmGlobalRootMismatch {
running_package_root: "/prefix-a/lib/node_modules/@openai/codex".into(),
npm_package_root: "/prefix-b/lib/node_modules/@openai/codex".into(),
}),
);
let mut terminal =
Terminal::new(VT100Backend::new(/*width*/ 96, /*height*/ 12)).expect("terminal");
terminal
.draw(|frame| frame.render_widget_ref(&screen, frame.area()))
.expect("render update remediation prompt");
insta::assert_snapshot!("update_remediation_prompt_modal", terminal.backend());
}
#[test]
fn remediation_prompt_help_selection_starts_repair_flow() {
let reason = RemediationPromptReason::Blocked(UpdateBlocker::NpmGlobalRootMismatch {
running_package_root: "/prefix-a/lib/node_modules/@openai/codex".into(),
npm_package_root: "/prefix-b/lib/node_modules/@openai/codex".into(),
});
assert!(
reason
.repair_prompt()
.contains("/prefix-a/lib/node_modules/@openai/codex")
);
assert!(
reason
.repair_prompt()
.contains("/prefix-b/lib/node_modules/@openai/codex")
);
}
}

View File

@@ -1,10 +1,13 @@
#![cfg(not(debug_assertions))]
#![cfg(any(not(debug_assertions), test))]
#![cfg_attr(test, allow(dead_code))]
use crate::legacy_core::config::Config;
use crate::npm_registry;
use crate::npm_registry::NpmPackageInfo;
use crate::update_action;
use crate::update_action::UpdateAction;
use crate::update_action::UpdateActionStatus;
use crate::update_action::UpdateBlocker;
use crate::update_versions::extract_version_from_latest_tag;
use crate::update_versions::is_newer;
use crate::update_versions::is_source_build_version;
@@ -19,6 +22,24 @@ use std::path::PathBuf;
use crate::version::CODEX_CLI_VERSION;
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum UpgradeNotice {
Available(String),
RemediationNeeded(String),
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum UpgradeHistoryNotice {
Available {
latest_version: String,
update_action: Option<UpdateAction>,
},
BlockedWarning(UpdateBlocker),
NoOpUpdateWarning {
latest_version: String,
},
}
pub fn get_upgrade_version(config: &Config) -> Option<String> {
if !config.check_for_update_on_startup || is_source_build_version(CODEX_CLI_VERSION) {
return None;
@@ -42,13 +63,52 @@ pub fn get_upgrade_version(config: &Config) -> Option<String> {
});
}
info.and_then(|info| {
if is_newer(&info.latest_version, CODEX_CLI_VERSION).unwrap_or(false) {
Some(info.latest_version)
} else {
None
info.and_then(latest_upgrade_version)
}
pub fn get_upgrade_notice_for_history(
config: &Config,
prompt_launch: bool,
action_status: UpdateActionStatus,
) -> Option<UpgradeHistoryNotice> {
let latest = get_upgrade_version(config)?;
let version_file = version_filepath(config);
let info = read_version_info(&version_file).ok();
if info
.as_ref()
.is_some_and(|info| should_show_prompt_update_remediation(info, &latest))
{
if prompt_launch
&& info
.as_ref()
.and_then(|info| info.no_op_inline_notice_shown_version.as_deref())
!= Some(latest.as_str())
{
if let Err(err) = record_no_op_inline_notice_shown(&version_file, &latest) {
tracing::warn!("Failed to persist no-op update inline notice state: {err}");
}
return Some(UpgradeHistoryNotice::NoOpUpdateWarning {
latest_version: latest,
});
}
})
return None;
}
match action_status {
UpdateActionStatus::Ready(update_action) => Some(UpgradeHistoryNotice::Available {
latest_version: latest,
update_action: Some(update_action),
}),
UpdateActionStatus::Unavailable => Some(UpgradeHistoryNotice::Available {
latest_version: latest,
update_action: None,
}),
UpdateActionStatus::Blocked(blocker) if prompt_launch => {
Some(UpgradeHistoryNotice::BlockedWarning(blocker))
}
UpdateActionStatus::Blocked(_) => None,
}
}
#[derive(Serialize, Deserialize, Debug, Clone)]
@@ -58,6 +118,18 @@ struct VersionInfo {
last_checked_at: DateTime<Utc>,
#[serde(default)]
dismissed_version: Option<String>,
#[serde(default)]
successful_prompt_update: Option<SuccessfulPromptUpdate>,
#[serde(default)]
suppressed_version: Option<String>,
#[serde(default)]
no_op_inline_notice_shown_version: Option<String>,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq)]
struct SuccessfulPromptUpdate {
from_version: String,
target_version: String,
}
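Assuming serde's default `Option` handling (absent state serialized as `null`), a persisted `version.json` record after a prompt-triggered update might look roughly like this (all values are illustrative):

```json
{
  "latest_version": "9.9.9",
  "last_checked_at": "2026-05-09T19:55:03Z",
  "dismissed_version": null,
  "successful_prompt_update": {
    "from_version": "0.0.0",
    "target_version": "9.9.9"
  },
  "suppressed_version": null,
  "no_op_inline_notice_shown_version": null
}
```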
const VERSION_FILENAME: &str = "version.json";
@@ -75,7 +147,7 @@ struct HomebrewCaskInfo {
version: String,
}
fn version_filepath(config: &Config) -> PathBuf {
pub(crate) fn version_filepath(config: &Config) -> PathBuf {
config.codex_home.join(VERSION_FILENAME).into_path_buf()
}
@@ -113,12 +185,20 @@ async fn check_for_update(version_file: &Path, action: Option<UpdateAction>) ->
}
};
// Preserve any previously dismissed version if present.
// Preserve local prompt state across version refreshes.
let prev_info = read_version_info(version_file).ok();
let info = VersionInfo {
latest_version,
last_checked_at: Utc::now(),
dismissed_version: prev_info.and_then(|p| p.dismissed_version),
dismissed_version: prev_info.as_ref().and_then(|p| p.dismissed_version.clone()),
successful_prompt_update: prev_info
.as_ref()
.and_then(|p| p.successful_prompt_update.clone()),
suppressed_version: prev_info
.as_ref()
.and_then(|p| p.suppressed_version.clone()),
no_op_inline_notice_shown_version: prev_info
.and_then(|p| p.no_op_inline_notice_shown_version),
};
let json_line = format!("{}\n", serde_json::to_string(&info)?);
@@ -142,22 +222,27 @@ async fn fetch_latest_github_release_version() -> anyhow::Result<String> {
extract_version_from_latest_tag(&latest_tag_name)
}
/// Returns the latest version to show in a popup, if it should be shown.
/// Returns the upgrade notice to show in a popup, if one should be shown.
/// This respects the user's dismissal choice for the current latest version.
pub fn get_upgrade_version_for_popup(config: &Config) -> Option<String> {
pub fn get_upgrade_notice_for_popup(config: &Config) -> Option<UpgradeNotice> {
if !config.check_for_update_on_startup || is_source_build_version(CODEX_CLI_VERSION) {
return None;
}
let version_file = version_filepath(config);
let latest = get_upgrade_version(config)?;
// If the user dismissed this exact version previously, do not show the popup.
if let Ok(info) = read_version_info(&version_file)
&& info.dismissed_version.as_deref() == Some(latest.as_str())
{
let Ok(info) = read_version_info(&version_file) else {
return Some(UpgradeNotice::Available(latest));
};
if info.dismissed_version.as_deref() == Some(latest.as_str()) {
return None;
}
Some(latest)
if should_show_prompt_update_remediation(&info, &latest) {
Some(UpgradeNotice::RemediationNeeded(latest))
} else {
Some(UpgradeNotice::Available(latest))
}
}
/// Persist a dismissal for the current latest version so we don't show
@@ -176,3 +261,159 @@ pub async fn dismiss_version(config: &Config, version: &str) -> anyhow::Result<(
tokio::fs::write(version_file, json_line).await?;
Ok(())
}
/// Persist a successful prompt-triggered update attempt so the next launch can
/// detect whether the running executable actually changed.
pub fn record_successful_prompt_update_attempt(
version_file: &Path,
target_version: &str,
) -> anyhow::Result<()> {
let mut info = read_version_info(version_file)?;
info.successful_prompt_update = Some(SuccessfulPromptUpdate {
from_version: CODEX_CLI_VERSION.to_string(),
target_version: target_version.to_string(),
});
write_version_info_sync(version_file, &info)
}
fn record_no_op_inline_notice_shown(version_file: &Path, version: &str) -> anyhow::Result<()> {
let mut info = read_version_info(version_file)?;
info.no_op_inline_notice_shown_version = Some(version.to_string());
write_version_info_sync(version_file, &info)
}
/// Suppress future notices for the current latest version after we explain a
/// likely no-op update once.
pub async fn suppress_version_after_remediation(
config: &Config,
version: &str,
) -> anyhow::Result<()> {
let version_file = version_filepath(config);
let mut info = match read_version_info(&version_file) {
Ok(info) => info,
Err(_) => return Ok(()),
};
info.suppressed_version = Some(version.to_string());
let json_line = format!("{}\n", serde_json::to_string(&info)?);
if let Some(parent) = version_file.parent() {
tokio::fs::create_dir_all(parent).await?;
}
tokio::fs::write(version_file, json_line).await?;
Ok(())
}
fn should_show_prompt_update_remediation(info: &VersionInfo, latest: &str) -> bool {
matches!(
info.successful_prompt_update.as_ref(),
Some(SuccessfulPromptUpdate {
from_version,
target_version,
}) if from_version == CODEX_CLI_VERSION && target_version == latest
)
}
fn latest_upgrade_version(info: VersionInfo) -> Option<String> {
if info.suppressed_version.as_deref() == Some(info.latest_version.as_str()) {
return None;
}
if is_newer(&info.latest_version, CODEX_CLI_VERSION).unwrap_or(/*default*/ false) {
Some(info.latest_version)
} else {
None
}
}
fn write_version_info_sync(version_file: &Path, info: &VersionInfo) -> anyhow::Result<()> {
let json_line = format!("{}\n", serde_json::to_string(info)?);
if let Some(parent) = version_file.parent() {
std::fs::create_dir_all(parent)?;
}
std::fs::write(version_file, json_line)?;
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
fn version_info(latest_version: &str) -> VersionInfo {
VersionInfo {
latest_version: latest_version.to_string(),
last_checked_at: Utc::now(),
dismissed_version: None,
successful_prompt_update: None,
suppressed_version: None,
no_op_inline_notice_shown_version: None,
}
}
#[test]
fn successful_prompt_update_for_current_binary_triggers_remediation() {
let mut info = version_info("9.9.9");
info.successful_prompt_update = Some(SuccessfulPromptUpdate {
from_version: CODEX_CLI_VERSION.to_string(),
target_version: "9.9.9".to_string(),
});
assert!(should_show_prompt_update_remediation(&info, "9.9.9"));
assert!(!should_show_prompt_update_remediation(&info, "9.9.10"));
}
#[test]
fn remediation_suppression_hides_the_same_latest_version() {
let mut info = version_info("9.9.9");
info.suppressed_version = Some("9.9.9".to_string());
assert_eq!(latest_upgrade_version(info), None);
}
#[test]
fn newer_latest_version_ignores_stale_remediation_suppression() {
let mut info = version_info("9.9.10");
info.suppressed_version = Some("9.9.9".to_string());
assert_eq!(latest_upgrade_version(info), Some("9.9.10".to_string()));
}
#[test]
fn successful_prompt_update_attempt_is_persisted() -> anyhow::Result<()> {
let tempdir = tempfile::tempdir()?;
let version_file = tempdir.path().join(VERSION_FILENAME);
write_version_info_sync(&version_file, &version_info("9.9.9"))?;
record_successful_prompt_update_attempt(&version_file, "9.9.9")?;
let info = read_version_info(&version_file)?;
assert_eq!(
info.successful_prompt_update,
Some(SuccessfulPromptUpdate {
from_version: CODEX_CLI_VERSION.to_string(),
target_version: "9.9.9".to_string(),
})
);
Ok(())
}
#[test]
fn no_op_inline_notice_marker_is_persisted_without_clearing_remediation() -> anyhow::Result<()>
{
let tempdir = tempfile::tempdir()?;
let version_file = tempdir.path().join(VERSION_FILENAME);
let mut info = version_info("9.9.9");
info.successful_prompt_update = Some(SuccessfulPromptUpdate {
from_version: CODEX_CLI_VERSION.to_string(),
target_version: "9.9.9".to_string(),
});
write_version_info_sync(&version_file, &info)?;
record_no_op_inline_notice_shown(&version_file, "9.9.9")?;
let info = read_version_info(&version_file)?;
assert_eq!(
info.no_op_inline_notice_shown_version.as_deref(),
Some("9.9.9")
);
assert!(should_show_prompt_update_remediation(&info, "9.9.9"));
Ok(())
}
}

View File

@@ -2,9 +2,7 @@
Experimental Python SDK for `codex app-server` JSON-RPC v2 over stdio, with a small default surface optimized for real scripts and apps.
The generated wire-model layer is sourced from the pinned `openai-codex-cli-bin`
runtime package and exposed as Pydantic models with snake_case Python fields
that serialize back to the app-server's camelCase wire format.
The generated wire-model layer is currently sourced from the bundled v2 schema and exposed as Pydantic models with snake_case Python fields that serialize back to the app-server's camelCase wire format.
## Install
@@ -70,7 +68,6 @@ notebook bootstrap the pinned runtime package automatically.
```bash
cd sdk/python
uv sync
python scripts/update_sdk_artifacts.py generate-types
python scripts/update_sdk_artifacts.py \
stage-sdk \
@@ -94,7 +91,7 @@ This supports the CI release flow:
- run `generate-types` before packaging
- stage `openai-codex-app-server-sdk` once with an exact `openai-codex-cli-bin==...` dependency
- stage `openai-codex-cli-bin` on each supported platform runner with the same pinned runtime version
- build and publish `openai-codex-cli-bin` as platform wheels only through PyPI trusted publishing; do not publish an sdist
- build and publish `openai-codex-cli-bin` as platform wheels only; do not publish an sdist
## Compatibility and versioning

View File

@@ -27,22 +27,16 @@ class RuntimeSetupError(RuntimeError):
def pinned_runtime_version() -> str:
"""Return the exact runtime version pinned by the SDK package dependency."""
source_pin = _source_tree_runtime_dependency_version()
if source_pin is not None:
return _normalized_package_version(source_pin)
source_version = _source_tree_project_version()
if source_version is not None:
return _normalized_package_version(source_version)
try:
installed_pin = _installed_sdk_runtime_dependency_version()
return _normalized_package_version(importlib.metadata.version(SDK_PACKAGE_NAME))
except importlib.metadata.PackageNotFoundError as exc:
raise RuntimeSetupError(
f"Unable to resolve {SDK_PACKAGE_NAME} metadata for runtime pinning."
f"Unable to resolve {SDK_PACKAGE_NAME} version for runtime pinning."
) from exc
if installed_pin is None:
raise RuntimeSetupError(
f"Unable to resolve {PACKAGE_NAME} dependency pin from {SDK_PACKAGE_NAME}."
)
return _normalized_package_version(installed_pin)
def ensure_runtime_package_installed(
@@ -405,33 +399,20 @@ def _release_tag(version: str) -> str:
return f"rust-v{_codex_release_version(version)}"
def _source_tree_runtime_dependency_version() -> str | None:
"""Read the runtime dependency pin when the SDK is running from a checkout."""
def _source_tree_project_version() -> str | None:
pyproject_path = Path(__file__).resolve().parent / "pyproject.toml"
if not pyproject_path.exists():
return None
match = re.search(_runtime_dependency_pin_pattern(), pyproject_path.read_text())
match = re.search(
r'(?m)^version = "([^"]+)"$',
pyproject_path.read_text(encoding="utf-8"),
)
if match is None:
return None
return match.group(1)
def _installed_sdk_runtime_dependency_version() -> str | None:
"""Read the runtime dependency pin from installed package metadata."""
requirements = importlib.metadata.requires(SDK_PACKAGE_NAME) or []
for requirement in requirements:
match = re.search(_runtime_dependency_pin_pattern(), requirement)
if match is not None:
return match.group(1)
return None
def _runtime_dependency_pin_pattern() -> str:
"""Match the exact runtime dependency pin in TOML and wheel metadata."""
return rf'{re.escape(PACKAGE_NAME)}\s*==\s*"?([^",;\s]+)"?'
__all__ = [
"PACKAGE_NAME",
"SDK_PACKAGE_NAME",

View File

@@ -28,7 +28,7 @@ will download the matching GitHub release artifact, stage a temporary local
`openai-codex-cli-bin` package, install it into your active interpreter, and clean up
the temporary files afterward.
The pinned runtime version comes from the SDK package dependency.
The pinned runtime version comes from the SDK package version.
## Run examples

View File

@@ -4,7 +4,7 @@ build-backend = "hatchling.build"
[project]
name = "openai-codex-app-server-sdk"
version = "0.131.0a4"
version = "0.116.0a1"
description = "Python SDK for Codex app-server v2"
readme = "README.md"
requires-python = ">=3.10"
@@ -22,7 +22,7 @@ classifiers = [
"Programming Language :: Python :: 3.13",
"Topic :: Software Development :: Libraries :: Python Modules",
]
dependencies = ["pydantic>=2.12", "openai-codex-cli-bin==0.131.0a4"]
dependencies = ["pydantic>=2.12"]
[project.urls]
Homepage = "https://github.com/openai/codex"
@@ -63,10 +63,8 @@ testpaths = ["tests"]
[tool.uv]
exclude-newer = "7 days"
exclude-newer-package = { openai-codex-cli-bin = "2026-05-10T00:00:00Z" }
index-strategy = "first-index"
[tool.uv.pip]
exclude-newer = "7 days"
exclude-newer-package = { openai-codex-cli-bin = "2026-05-10T00:00:00Z" }
index-strategy = "first-index"

View File

@@ -3,7 +3,6 @@ from __future__ import annotations
import argparse
import importlib
import importlib.metadata
import json
import platform
import re
@@ -34,14 +33,19 @@ def python_runtime_root() -> Path:
return repo_root() / "sdk" / "python-runtime"
def sdk_pyproject_path() -> Path:
"""Return the SDK pyproject file that owns package pins and versions."""
return sdk_root() / "pyproject.toml"
def schema_bundle_path() -> Path:
return (
repo_root()
/ "codex-rs"
/ "app-server-protocol"
/ "schema"
/ "json"
/ "codex_app_server_protocol.v2.schemas.json"
)
def schema_bundle_path(schema_dir: Path) -> Path:
"""Return the aggregate v2 schema bundle emitted by the runtime binary."""
return schema_dir / "codex_app_server_protocol.v2.schemas.json"
def schema_root_dir() -> Path:
return repo_root() / "codex-rs" / "app-server-protocol" / "schema" / "json"
def _is_windows() -> bool:
@@ -57,7 +61,6 @@ def staged_runtime_bin_path(root: Path) -> Path:
def staged_runtime_resource_path(root: Path, resource: Path) -> Path:
"""Stage runtime helper binaries beside the main bundled Codex binary."""
# Runtime wheels include the whole bin/ directory, so helper executables
# should be staged beside the main Codex binary instead of changing the
# package template for each platform.
@@ -75,7 +78,7 @@ def run_python_module(module: str, args: list[str], cwd: Path) -> None:
def current_sdk_version() -> str:
match = re.search(
r'^version = "([^"]+)"$',
sdk_pyproject_path().read_text(),
(sdk_root() / "pyproject.toml").read_text(),
flags=re.MULTILINE,
)
if match is None:
@@ -83,59 +86,6 @@ def current_sdk_version() -> str:
return match.group(1)
def pinned_runtime_version() -> str:
"""Read the exact runtime package pin used for schema generation."""
pyproject_text = sdk_pyproject_path().read_text()
match = re.search(r"(?ms)^dependencies = \[(.*?)\]$", pyproject_text)
if match is None:
raise RuntimeError(
"Could not find dependencies array in sdk/python/pyproject.toml"
)
pins = re.findall(
rf'"{re.escape(RUNTIME_DISTRIBUTION_NAME)}==([^"]+)"',
match.group(1),
)
if len(pins) != 1:
raise RuntimeError(
f"Expected exactly one {RUNTIME_DISTRIBUTION_NAME} dependency pin "
"in sdk/python/pyproject.toml"
)
return normalize_codex_version(pins[0])
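The removed `pinned_runtime_version()` helper above extracts a single exact `==` pin from the `dependencies` array of a pyproject file using a multiline, dot-matches-newline regex. A minimal self-contained sketch of that parsing pattern (the sample pyproject text and the `extract_runtime_pin` name are illustrative, not part of the SDK):

```python
import re

RUNTIME_DISTRIBUTION_NAME = "openai-codex-cli-bin"

def extract_runtime_pin(pyproject_text: str) -> str:
    """Pull the single exact '==' pin for the runtime package out of the
    [project] dependencies array, mirroring the removed helper's regexes."""
    # (?ms): ^/$ match per line, and .*? can span the multi-line array body.
    match = re.search(r"(?ms)^dependencies = \[(.*?)\]$", pyproject_text)
    if match is None:
        raise RuntimeError("Could not find dependencies array")
    pins = re.findall(
        rf'"{re.escape(RUNTIME_DISTRIBUTION_NAME)}==([^"]+)"',
        match.group(1),
    )
    if len(pins) != 1:
        raise RuntimeError("Expected exactly one runtime pin")
    return pins[0]

sample = """\
[project]
dependencies = [
    "pydantic>=2.12",
    "openai-codex-cli-bin==0.131.0a4",
]
"""
pin = extract_runtime_pin(sample)
```

The non-greedy `(.*?)` keeps the match from running past the first `]` that closes the array on its own line.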
def pinned_runtime_codex_path() -> Path:
"""Return the bundled Codex binary from the installed pinned runtime wheel."""
expected_version = pinned_runtime_version()
try:
installed_version = importlib.metadata.version(RUNTIME_DISTRIBUTION_NAME)
except importlib.metadata.PackageNotFoundError as exc:
raise RuntimeError(
f"Install {RUNTIME_DISTRIBUTION_NAME}=={expected_version} before "
"generating Python SDK types."
) from exc
normalized_installed_version = normalize_codex_version(installed_version)
if normalized_installed_version != expected_version:
raise RuntimeError(
f"Expected {RUNTIME_DISTRIBUTION_NAME}=={expected_version}, "
f"but found {installed_version}."
)
try:
from codex_cli_bin import bundled_codex_path
except ImportError as exc:
raise RuntimeError(
f"Installed {RUNTIME_DISTRIBUTION_NAME} package does not expose "
"bundled_codex_path."
) from exc
codex_path = bundled_codex_path()
if not codex_path.exists():
raise RuntimeError(f"Pinned Codex runtime binary not found at {codex_path}.")
return codex_path
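The removed `pinned_runtime_codex_path()` checks the installed distribution's version via `importlib.metadata` before trusting its bundled binary. A small sketch of that version-check pattern, collapsed into a boolean helper (the function name is hypothetical; the real helper raised `RuntimeError` with install instructions instead of returning `False`):

```python
import importlib.metadata

def installed_version_matches(distribution: str, expected: str) -> bool:
    """Return True only when `distribution` is installed at exactly
    `expected`; a missing package counts as a mismatch."""
    try:
        installed = importlib.metadata.version(distribution)
    except importlib.metadata.PackageNotFoundError:
        # Not installed at all — the removed helper raised here with a
        # "pip install <dist>==<expected>" style message.
        return False
    return installed == expected

missing = installed_version_matches("no-such-distribution-for-this-sketch", "1.0.0")
```

`importlib.metadata.version` reads the installed wheel's metadata, so this catches both "never installed" and "stale install" drift against the pin.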
def normalize_codex_version(version: str) -> str:
normalized = version.strip()
if normalized.startswith("rust-v"):
@@ -537,28 +487,8 @@ def _annotate_schema(value: Any, base: str | None = None) -> None:
_annotate_schema(child, base)
def generate_schema_from_pinned_runtime(schema_dir: Path) -> Path:
"""Generate app-server schemas by invoking the installed pinned runtime binary."""
codex_path = pinned_runtime_codex_path()
if schema_dir.exists():
shutil.rmtree(schema_dir)
schema_dir.mkdir(parents=True)
run(
[
str(codex_path),
"app-server",
"generate-json-schema",
"--out",
str(schema_dir),
],
cwd=sdk_root(),
)
return schema_dir
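`generate_schema_from_pinned_runtime` follows a recreate-then-generate shape: wipe the output directory, recreate it, and let an external command fill it. A runnable sketch of that shape, substituting a stand-in `python -c` generator for the real `codex app-server generate-json-schema` invocation (the `regenerate_into` name is illustrative):

```python
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path

def regenerate_into(schema_dir: Path, cmd: list[str]) -> Path:
    """Recreate schema_dir from scratch, then run the generator command."""
    if schema_dir.exists():
        shutil.rmtree(schema_dir)  # never mix stale and fresh outputs
    schema_dir.mkdir(parents=True)
    subprocess.run(cmd, check=True)  # check=True surfaces generator failures
    return schema_dir

tmp = Path(tempfile.mkdtemp())
out = regenerate_into(
    tmp / "schema",
    [
        sys.executable,
        "-c",
        f"import pathlib; pathlib.Path(r'{tmp}/schema/bundle.json').write_text('{{}}')",
    ],
)
```

Deleting the directory first means a generator that stops emitting a file cannot leave the old copy behind to mask the regression.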
def _normalized_schema_bundle_text(schema_dir: Path) -> str:
"""Normalize the schema bundle before feeding it to the Python type generator."""
schema = json.loads(schema_bundle_path(schema_dir).read_text())
def _normalized_schema_bundle_text() -> str:
schema = json.loads(schema_bundle_path().read_text())
definitions = schema.get("definitions", {})
if isinstance(definitions, dict):
for definition in definitions.values():
@@ -570,8 +500,7 @@ def _normalized_schema_bundle_text(schema_dir: Path) -> str:
return json.dumps(schema, indent=2, sort_keys=True) + "\n"
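The normalized bundle text above is serialized with sorted keys and fixed indentation so that regenerated output is byte-stable regardless of the key order the runtime emitted. A tiny sketch of that determinism property:

```python
import json

def normalized_json_text(schema: dict) -> str:
    """Serialize deterministically: sorted keys, two-space indent,
    trailing newline — matching the generator's output shape."""
    return json.dumps(schema, indent=2, sort_keys=True) + "\n"

# Two dicts with the same content in different insertion order
# must produce identical text, so diffs stay noise-free.
a = normalized_json_text({"b": 1, "a": 2})
b = normalized_json_text({"a": 2, "b": 1})
```

Without `sort_keys=True`, dict insertion order would leak runtime nondeterminism straight into generated-file diffs.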
def generate_v2_all(schema_dir: Path) -> None:
"""Regenerate the Pydantic v2 protocol model module from runtime schemas."""
def generate_v2_all() -> None:
out_path = sdk_root() / "src" / "codex_app_server" / "generated" / "v2_all.py"
out_dir = out_path.parent
old_package_dir = out_dir / "v2_all"
@@ -579,8 +508,8 @@ def generate_v2_all(schema_dir: Path) -> None:
shutil.rmtree(old_package_dir)
out_dir.mkdir(parents=True, exist_ok=True)
with tempfile.TemporaryDirectory() as td:
normalized_bundle = Path(td) / schema_bundle_path(schema_dir).name
normalized_bundle.write_text(_normalized_schema_bundle_text(schema_dir))
normalized_bundle = Path(td) / schema_bundle_path().name
normalized_bundle.write_text(_normalized_schema_bundle_text())
run_python_module(
"datamodel_code_generator",
[
@@ -617,10 +546,9 @@ def generate_v2_all(schema_dir: Path) -> None:
_normalize_generated_timestamps(out_path)
def _notification_specs(schema_dir: Path) -> list[tuple[str, str]]:
"""Map each server notification method to its generated payload model class."""
def _notification_specs() -> list[tuple[str, str]]:
server_notifications = json.loads(
(schema_dir / "ServerNotification.json").read_text()
(schema_root_dir() / "ServerNotification.json").read_text()
)
one_of = server_notifications.get("oneOf", [])
generated_source = (
@@ -658,12 +586,10 @@ def _notification_specs(schema_dir: Path) -> list[tuple[str, str]]:
def _notification_turn_id_specs(
schema_dir: Path,
specs: list[tuple[str, str]],
) -> tuple[list[str], list[str]]:
"""Classify notification payloads by where their turn id is carried."""
server_notifications = json.loads(
(schema_dir / "ServerNotification.json").read_text()
(schema_root_dir() / "ServerNotification.json").read_text()
)
definitions = server_notifications.get("definitions", {})
if not isinstance(definitions, dict):
@@ -689,7 +615,6 @@ def _notification_turn_id_specs(
def _type_tuple_source(class_names: list[str]) -> str:
"""Render a generated tuple literal for notification payload classes."""
if not class_names:
return "()"
if len(class_names) == 1:
@@ -697,8 +622,7 @@ def _type_tuple_source(class_names: list[str]) -> str:
return "(\n" + "".join(f" {class_name},\n" for class_name in class_names) + ")"
def generate_notification_registry(schema_dir: Path) -> None:
"""Regenerate notification dispatch metadata from the runtime notification schema."""
def generate_notification_registry() -> None:
out = (
sdk_root()
/ "src"
@@ -706,12 +630,9 @@ def generate_notification_registry(schema_dir: Path) -> None:
/ "generated"
/ "notification_registry.py"
)
specs = _notification_specs(schema_dir)
specs = _notification_specs()
class_names = sorted({class_name for _, class_name in specs})
direct_turn_id_types, nested_turn_types = _notification_turn_id_specs(
schema_dir,
specs,
)
direct_turn_id_types, nested_turn_types = _notification_turn_id_specs(specs)
lines = [
"# Auto-generated by scripts/update_sdk_artifacts.py",
@@ -745,7 +666,6 @@ def generate_notification_registry(schema_dir: Path) -> None:
"",
"",
"def notification_turn_id(payload: BaseModel) -> str | None:",
' """Return the turn id carried by generated notification payload metadata."""',
" if isinstance(payload, DIRECT_TURN_ID_NOTIFICATION_TYPES):",
" return payload.turn_id if isinstance(payload.turn_id, str) else None",
" if isinstance(payload, NESTED_TURN_NOTIFICATION_TYPES):",
@@ -832,12 +752,8 @@ def _camel_to_snake(name: str) -> str:
def _load_public_fields(
module_name: str, class_name: str, *, exclude: set[str] | None = None
) -> list[PublicFieldSpec]:
"""Load generated model fields used to render the ergonomic public methods."""
exclude = exclude or set()
if module_name == "codex_app_server.generated.v2_all":
module = _load_generated_v2_all_module()
else:
module = importlib.import_module(module_name)
module = importlib.import_module(module_name)
model = getattr(module, class_name)
fields: list[PublicFieldSpec] = []
for name, field in model.model_fields.items():
@@ -859,20 +775,6 @@ def _load_public_fields(
return fields
def _load_generated_v2_all_module() -> types.ModuleType:
"""Import the freshly generated v2_all module without importing package init."""
module_name = "_codex_app_server_generated_v2_all_for_artifacts"
sys.modules.pop(module_name, None)
module_path = sdk_root() / "src" / "codex_app_server" / "generated" / "v2_all.py"
spec = importlib.util.spec_from_file_location(module_name, module_path)
if spec is None or spec.loader is None:
raise RuntimeError(f"Failed to load generated module from {module_path}")
module = importlib.util.module_from_spec(spec)
sys.modules[module_name] = module
spec.loader.exec_module(module)
return module
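The removed `_load_generated_v2_all_module` (and the test helpers later in this diff) import a file directly by path, bypassing the package `__init__`, so a half-regenerated package cannot poison the import. A self-contained sketch of that `importlib.util` pattern against a throwaway module file:

```python
import importlib.util
import sys
import tempfile
import types
from pathlib import Path

def load_module_from_path(module_name: str, module_path: Path) -> types.ModuleType:
    """Import a single file under a private name without touching its package."""
    sys.modules.pop(module_name, None)  # drop any stale copy first
    spec = importlib.util.spec_from_file_location(module_name, module_path)
    if spec is None or spec.loader is None:
        raise RuntimeError(f"Failed to load module from {module_path}")
    module = importlib.util.module_from_spec(spec)
    sys.modules[module_name] = module  # register before exec, as importlib expects
    spec.loader.exec_module(module)
    return module

with tempfile.TemporaryDirectory() as td:
    path = Path(td) / "demo_mod.py"
    path.write_text("ANSWER = 42\n")
    mod = load_module_from_path("_demo_mod_for_sketch", path)
```

Using a private, prefixed module name (as the removed helper did) keeps the loaded copy from shadowing the real package in `sys.modules`.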
def _kw_signature_lines(fields: list[PublicFieldSpec]) -> list[str]:
lines: list[str] = []
for field in fields:
@@ -1082,7 +984,6 @@ def _render_async_thread_block(
def generate_public_api_flat_methods() -> None:
"""Regenerate the public convenience methods from generated protocol models."""
src_dir = sdk_root() / "src"
public_api_path = src_dir / "codex_app_server" / "api.py"
if not public_api_path.exists():
@@ -1148,22 +1049,13 @@ def generate_public_api_flat_methods() -> None:
_render_async_thread_block(turn_start_fields),
)
public_api_path.write_text(source)
run_python_module("ruff", ["format", str(public_api_path)], cwd=sdk_root())
def generate_types_from_schema_dir(schema_dir: Path) -> None:
"""Regenerate every SDK artifact derived from an existing schema directory."""
# v2_all is the authoritative generated surface.
generate_v2_all(schema_dir)
generate_notification_registry(schema_dir)
generate_public_api_flat_methods()
def generate_types() -> None:
"""Generate schemas from the pinned runtime and then refresh SDK artifacts."""
with tempfile.TemporaryDirectory(prefix="codex-python-schema-") as td:
schema_dir = generate_schema_from_pinned_runtime(Path(td) / "schema")
generate_types_from_schema_dir(schema_dir)
# v2_all is the authoritative generated surface.
generate_v2_all()
generate_notification_registry()
generate_public_api_flat_methods()
def build_parser() -> argparse.ArgumentParser:

View File

@@ -22,12 +22,12 @@ from .generated.v2_all import (
ReasoningSummary,
SandboxMode,
SandboxPolicy,
ServiceTier,
ThreadItem,
ThreadForkParams,
ThreadListParams,
ThreadResumeParams,
ThreadSortKey,
ThreadSource,
ThreadSourceKind,
ThreadStartParams,
ThreadTokenUsageUpdatedNotification,
@@ -86,11 +86,11 @@ __all__ = [
"ReasoningSummary",
"SandboxMode",
"SandboxPolicy",
"ServiceTier",
"ThreadStartParams",
"ThreadResumeParams",
"ThreadListParams",
"ThreadSortKey",
"ThreadSource",
"ThreadSourceKind",
"ThreadForkParams",
"TurnStatus",

View File

@@ -22,7 +22,6 @@ class MessageRouter:
"""
def __init__(self) -> None:
"""Create empty response, turn, and global notification queues."""
self._lock = threading.Lock()
self._response_waiters: dict[str, queue.Queue[ResponseQueueItem]] = {}
self._turn_notifications: dict[str, queue.Queue[NotificationQueueItem]] = {}
@@ -145,7 +144,6 @@ class MessageRouter:
self._global_notifications.put(exc)
def _notification_turn_id(self, notification: Notification) -> str | None:
"""Extract routing ids from known generated payloads or raw unknown payloads."""
payload = notification.payload
if isinstance(payload, UnknownNotification):
raw_turn_id = payload.params.get("turnId")

View File

@@ -15,6 +15,7 @@ from .generated.v2_all import (
ReasoningSummary,
SandboxMode,
SandboxPolicy,
ServiceTier,
SortDirection,
ThreadArchiveResponse,
ThreadCompactStartResponse,
@@ -26,7 +27,6 @@ from .generated.v2_all import (
ThreadResumeParams,
ThreadSetNameResponse,
ThreadSortKey,
ThreadSource,
ThreadSourceKind,
ThreadStartSource,
ThreadStartParams,
@@ -152,9 +152,8 @@ class Codex:
personality: Personality | None = None,
sandbox: SandboxMode | None = None,
service_name: str | None = None,
service_tier: str | None = None,
service_tier: ServiceTier | None = None,
session_start_source: ThreadStartSource | None = None,
thread_source: ThreadSource | None = None,
) -> Thread:
params = ThreadStartParams(
approval_policy=approval_policy,
@@ -171,7 +170,6 @@ class Codex:
service_name=service_name,
service_tier=service_tier,
session_start_source=session_start_source,
thread_source=thread_source,
)
started = self._client.thread_start(params)
return Thread(self._client, started.thread.id)
@@ -218,7 +216,7 @@ class Codex:
model_provider: str | None = None,
personality: Personality | None = None,
sandbox: SandboxMode | None = None,
service_tier: str | None = None,
service_tier: ServiceTier | None = None,
) -> Thread:
params = ThreadResumeParams(
thread_id=thread_id,
@@ -251,8 +249,7 @@ class Codex:
model: str | None = None,
model_provider: str | None = None,
sandbox: SandboxMode | None = None,
service_tier: str | None = None,
thread_source: ThreadSource | None = None,
service_tier: ServiceTier | None = None,
) -> Thread:
params = ThreadForkParams(
thread_id=thread_id,
@@ -267,7 +264,6 @@ class Codex:
model_provider=model_provider,
sandbox=sandbox,
service_tier=service_tier,
thread_source=thread_source,
)
forked = self._client.thread_fork(thread_id, params)
return Thread(self._client, forked.thread.id)
@@ -353,9 +349,8 @@ class AsyncCodex:
personality: Personality | None = None,
sandbox: SandboxMode | None = None,
service_name: str | None = None,
service_tier: str | None = None,
service_tier: ServiceTier | None = None,
session_start_source: ThreadStartSource | None = None,
thread_source: ThreadSource | None = None,
) -> AsyncThread:
await self._ensure_initialized()
params = ThreadStartParams(
@@ -373,7 +368,6 @@ class AsyncCodex:
service_name=service_name,
service_tier=service_tier,
session_start_source=session_start_source,
thread_source=thread_source,
)
started = await self._client.thread_start(params)
return AsyncThread(self, started.thread.id)
@@ -421,7 +415,7 @@ class AsyncCodex:
model_provider: str | None = None,
personality: Personality | None = None,
sandbox: SandboxMode | None = None,
service_tier: str | None = None,
service_tier: ServiceTier | None = None,
) -> AsyncThread:
await self._ensure_initialized()
params = ThreadResumeParams(
@@ -455,8 +449,7 @@ class AsyncCodex:
model: str | None = None,
model_provider: str | None = None,
sandbox: SandboxMode | None = None,
service_tier: str | None = None,
thread_source: ThreadSource | None = None,
service_tier: ServiceTier | None = None,
) -> AsyncThread:
await self._ensure_initialized()
params = ThreadForkParams(
@@ -472,7 +465,6 @@ class AsyncCodex:
model_provider=model_provider,
sandbox=sandbox,
service_tier=service_tier,
thread_source=thread_source,
)
forked = await self._client.thread_fork(thread_id, params)
return AsyncThread(self, forked.thread.id)
@@ -510,7 +502,7 @@ class Thread:
output_schema: JsonObject | None = None,
personality: Personality | None = None,
sandbox_policy: SandboxPolicy | None = None,
service_tier: str | None = None,
service_tier: ServiceTier | None = None,
summary: ReasoningSummary | None = None,
) -> RunResult:
turn = self.turn(
@@ -545,7 +537,7 @@ class Thread:
output_schema: JsonObject | None = None,
personality: Personality | None = None,
sandbox_policy: SandboxPolicy | None = None,
service_tier: str | None = None,
service_tier: ServiceTier | None = None,
summary: ReasoningSummary | None = None,
) -> TurnHandle:
wire_input = _to_wire_input(input)
@@ -595,7 +587,7 @@ class AsyncThread:
output_schema: JsonObject | None = None,
personality: Personality | None = None,
sandbox_policy: SandboxPolicy | None = None,
service_tier: str | None = None,
service_tier: ServiceTier | None = None,
summary: ReasoningSummary | None = None,
) -> RunResult:
turn = await self.turn(
@@ -630,7 +622,7 @@ class AsyncThread:
output_schema: JsonObject | None = None,
personality: Personality | None = None,
sandbox_policy: SandboxPolicy | None = None,
service_tier: str | None = None,
service_tier: ServiceTier | None = None,
summary: ReasoningSummary | None = None,
) -> AsyncTurnHandle:
await self._codex._ensure_initialized()
@@ -686,7 +678,6 @@ class TurnHandle:
return self._client.turn_interrupt(self.thread_id, self.id)
def stream(self) -> Iterator[Notification]:
"""Yield only notifications routed to this turn handle."""
self._client.register_turn_notifications(self.id)
try:
while True:
@@ -739,7 +730,6 @@ class AsyncTurnHandle:
return await self._codex._client.turn_interrupt(self.thread_id, self.id)
async def stream(self) -> AsyncIterator[Notification]:
"""Yield only notifications routed to this async turn handle."""
await self._codex._ensure_initialized()
self._codex._client.register_turn_notifications(self.id)
try:

View File

@@ -40,16 +40,13 @@ class AsyncAppServerClient:
"""Async wrapper around AppServerClient using thread offloading."""
def __init__(self, config: AppServerConfig | None = None) -> None:
"""Create the wrapped sync client that owns the transport process."""
self._sync = AppServerClient(config=config)
async def __aenter__(self) -> "AsyncAppServerClient":
"""Start the app-server process when entering an async context."""
await self.start()
return self
async def __aexit__(self, _exc_type, _exc, _tb) -> None:
"""Close the app-server process when leaving an async context."""
await self.close()
async def _call_sync(
@@ -59,37 +56,30 @@ class AsyncAppServerClient:
*args: ParamsT.args,
**kwargs: ParamsT.kwargs,
) -> ReturnT:
"""Run a blocking sync-client operation without blocking the event loop."""
return await asyncio.to_thread(fn, *args, **kwargs)
@staticmethod
def _next_from_iterator(
iterator: Iterator[AgentMessageDeltaNotification],
) -> tuple[bool, AgentMessageDeltaNotification | None]:
"""Convert StopIteration into a value that can cross asyncio.to_thread."""
try:
return True, next(iterator)
except StopIteration:
return False, None
async def start(self) -> None:
"""Start the wrapped sync client in a worker thread."""
await self._call_sync(self._sync.start)
async def close(self) -> None:
"""Close the wrapped sync client in a worker thread."""
await self._call_sync(self._sync.close)
async def initialize(self) -> InitializeResponse:
"""Initialize the app-server session."""
return await self._call_sync(self._sync.initialize)
def register_turn_notifications(self, turn_id: str) -> None:
"""Register a turn notification queue on the wrapped sync client."""
self._sync.register_turn_notifications(turn_id)
def unregister_turn_notifications(self, turn_id: str) -> None:
"""Unregister a turn notification queue on the wrapped sync client."""
self._sync.unregister_turn_notifications(turn_id)
async def request(
@@ -99,7 +89,6 @@ class AsyncAppServerClient:
*,
response_model: type[ModelT],
) -> ModelT:
"""Send a typed JSON-RPC request through the wrapped sync client."""
return await self._call_sync(
self._sync.request,
method,
@@ -110,7 +99,6 @@ class AsyncAppServerClient:
async def thread_start(
self, params: V2ThreadStartParams | JsonObject | None = None
) -> ThreadStartResponse:
"""Start a thread using the wrapped sync client."""
return await self._call_sync(self._sync.thread_start, params)
async def thread_resume(
@@ -118,19 +106,16 @@ class AsyncAppServerClient:
thread_id: str,
params: V2ThreadResumeParams | JsonObject | None = None,
) -> ThreadResumeResponse:
"""Resume a thread using the wrapped sync client."""
return await self._call_sync(self._sync.thread_resume, thread_id, params)
async def thread_list(
self, params: V2ThreadListParams | JsonObject | None = None
) -> ThreadListResponse:
"""List threads using the wrapped sync client."""
return await self._call_sync(self._sync.thread_list, params)
async def thread_read(
self, thread_id: str, include_turns: bool = False
) -> ThreadReadResponse:
"""Read a thread using the wrapped sync client."""
return await self._call_sync(self._sync.thread_read, thread_id, include_turns)
async def thread_fork(
@@ -138,23 +123,18 @@ class AsyncAppServerClient:
thread_id: str,
params: V2ThreadForkParams | JsonObject | None = None,
) -> ThreadForkResponse:
"""Fork a thread using the wrapped sync client."""
return await self._call_sync(self._sync.thread_fork, thread_id, params)
async def thread_archive(self, thread_id: str) -> ThreadArchiveResponse:
"""Archive a thread using the wrapped sync client."""
return await self._call_sync(self._sync.thread_archive, thread_id)
async def thread_unarchive(self, thread_id: str) -> ThreadUnarchiveResponse:
"""Unarchive a thread using the wrapped sync client."""
return await self._call_sync(self._sync.thread_unarchive, thread_id)
async def thread_set_name(self, thread_id: str, name: str) -> ThreadSetNameResponse:
"""Rename a thread using the wrapped sync client."""
return await self._call_sync(self._sync.thread_set_name, thread_id, name)
async def thread_compact(self, thread_id: str) -> ThreadCompactStartResponse:
"""Start thread compaction using the wrapped sync client."""
return await self._call_sync(self._sync.thread_compact, thread_id)
async def turn_start(
@@ -163,7 +143,6 @@ class AsyncAppServerClient:
input_items: list[JsonObject] | JsonObject | str,
params: V2TurnStartParams | JsonObject | None = None,
) -> TurnStartResponse:
"""Start a turn using the wrapped sync client."""
return await self._call_sync(
self._sync.turn_start, thread_id, input_items, params
)
@@ -171,7 +150,6 @@ class AsyncAppServerClient:
async def turn_interrupt(
self, thread_id: str, turn_id: str
) -> TurnInterruptResponse:
"""Interrupt a turn using the wrapped sync client."""
return await self._call_sync(self._sync.turn_interrupt, thread_id, turn_id)
async def turn_steer(
@@ -180,7 +158,6 @@ class AsyncAppServerClient:
expected_turn_id: str,
input_items: list[JsonObject] | JsonObject | str,
) -> TurnSteerResponse:
"""Send steering input to a turn using the wrapped sync client."""
return await self._call_sync(
self._sync.turn_steer,
thread_id,
@@ -189,7 +166,6 @@ class AsyncAppServerClient:
)
async def model_list(self, include_hidden: bool = False) -> ModelListResponse:
"""List models using the wrapped sync client."""
return await self._call_sync(self._sync.model_list, include_hidden)
async def request_with_retry_on_overload(
@@ -202,7 +178,6 @@ class AsyncAppServerClient:
initial_delay_s: float = 0.25,
max_delay_s: float = 2.0,
) -> ModelT:
"""Send a typed request with the sync client's overload retry policy."""
return await self._call_sync(
self._sync.request_with_retry_on_overload,
method,
@@ -214,15 +189,12 @@ class AsyncAppServerClient:
)
async def next_notification(self) -> Notification:
"""Wait for the next global notification without blocking the event loop."""
return await self._call_sync(self._sync.next_notification)
async def next_turn_notification(self, turn_id: str) -> Notification:
"""Wait for the next notification routed to one turn."""
return await self._call_sync(self._sync.next_turn_notification, turn_id)
async def wait_for_turn_completed(self, turn_id: str) -> TurnCompletedNotification:
"""Wait for the completion notification routed to one turn."""
return await self._call_sync(self._sync.wait_for_turn_completed, turn_id)
async def stream_text(
@@ -231,7 +203,6 @@ class AsyncAppServerClient:
text: str,
params: V2TurnStartParams | JsonObject | None = None,
) -> AsyncIterator[AgentMessageDeltaNotification]:
"""Stream text deltas from one turn without monopolizing the event loop."""
iterator = self._sync.stream_text(thread_id, text, params)
while True:
has_value, chunk = await asyncio.to_thread(

View File

@@ -243,7 +243,6 @@ class AppServerClient:
return response_model.model_validate(result)
def _request_raw(self, method: str, params: JsonObject | None = None) -> JsonValue:
"""Send a JSON-RPC request and wait for the reader thread to route its response."""
request_id = str(uuid.uuid4())
waiter = self._router.create_response_waiter(request_id)
@@ -261,23 +260,18 @@ class AppServerClient:
return item
def notify(self, method: str, params: JsonObject | None = None) -> None:
"""Send a JSON-RPC notification without waiting for a response."""
self._write_message({"method": method, "params": params or {}})
def next_notification(self) -> Notification:
"""Return the next notification that is not scoped to an active turn."""
return self._router.next_global_notification()
def register_turn_notifications(self, turn_id: str) -> None:
"""Start routing notifications for one turn into its dedicated queue."""
self._router.register_turn(turn_id)
def unregister_turn_notifications(self, turn_id: str) -> None:
"""Stop routing notifications for one turn into its dedicated queue."""
self._router.unregister_turn(turn_id)
def next_turn_notification(self, turn_id: str) -> Notification:
"""Return the next routed notification for the requested turn id."""
return self._router.next_turn_notification(turn_id)
def thread_start(
@@ -355,7 +349,6 @@ class AppServerClient:
input_items: list[JsonObject] | JsonObject | str,
params: V2TurnStartParams | JsonObject | None = None,
) -> TurnStartResponse:
"""Start a turn and register its notification queue as early as possible."""
payload = {
**_params_dict(params),
"threadId": thread_id,
@@ -413,7 +406,6 @@ class AppServerClient:
)
def wait_for_turn_completed(self, turn_id: str) -> TurnCompletedNotification:
"""Block on the routed turn stream until the matching completion arrives."""
self.register_turn_notifications(turn_id)
try:
while True:
@@ -433,7 +425,6 @@ class AppServerClient:
text: str,
params: V2TurnStartParams | JsonObject | None = None,
) -> Iterator[AgentMessageDeltaNotification]:
"""Start a text turn and yield only its agent-message delta payloads."""
started = self.turn_start(thread_id, text, params=params)
turn_id = started.turn.id
self.register_turn_notifications(turn_id)
@@ -486,7 +477,6 @@ class AppServerClient:
def _default_approval_handler(
self, method: str, params: JsonObject | None
) -> JsonObject:
"""Accept approval requests when the caller did not provide a handler."""
if method == "item/commandExecution/requestApproval":
return {"decision": "accept"}
if method == "item/fileChange/requestApproval":
@@ -508,7 +498,6 @@ class AppServerClient:
self._stderr_thread.start()
def _start_reader_thread(self) -> None:
"""Start the sole stdout reader that fans messages into router queues."""
if self._proc is None or self._proc.stdout is None:
return
@@ -516,7 +505,6 @@ class AppServerClient:
self._reader_thread.start()
def _reader_loop(self) -> None:
"""Continuously classify transport messages into requests, responses, and events."""
try:
while True:
msg = self._read_message()

View File

@@ -35,8 +35,6 @@ from .v2_all import McpToolCallProgressNotification
from .v2_all import ModelReroutedNotification
from .v2_all import ModelVerificationNotification
from .v2_all import PlanDeltaNotification
from .v2_all import ProcessExitedNotification
from .v2_all import ProcessOutputDeltaNotification
from .v2_all import ReasoningSummaryPartAddedNotification
from .v2_all import ReasoningSummaryTextDeltaNotification
from .v2_all import ReasoningTextDeltaNotification
@@ -103,8 +101,6 @@ NOTIFICATION_MODELS: dict[str, type[BaseModel]] = {
"mcpServer/startupStatus/updated": McpServerStatusUpdatedNotification,
"model/rerouted": ModelReroutedNotification,
"model/verification": ModelVerificationNotification,
"process/exited": ProcessExitedNotification,
"process/outputDelta": ProcessOutputDeltaNotification,
"remoteControl/status/changed": RemoteControlStatusChangedNotification,
"serverRequest/resolved": ServerRequestResolvedNotification,
"skills/changed": SkillsChangedNotification,
@@ -169,7 +165,6 @@ NESTED_TURN_NOTIFICATION_TYPES: tuple[type[BaseModel], ...] = (
def notification_turn_id(payload: BaseModel) -> str | None:
"""Return the turn id carried by generated notification payload metadata."""
if isinstance(payload, DIRECT_TURN_ID_NOTIFICATION_TYPES):
return payload.turn_id if isinstance(payload.turn_id, str) else None
if isinstance(payload, NESTED_TURN_NOTIFICATION_TYPES):

File diff suppressed because it is too large

View File

@@ -16,7 +16,6 @@ ROOT = Path(__file__).resolve().parents[1]
def _load_update_script_module():
"""Load the maintenance script as a module so tests exercise real helpers."""
script_path = ROOT / "scripts" / "update_sdk_artifacts.py"
spec = importlib.util.spec_from_file_location("update_sdk_artifacts", script_path)
if spec is None or spec.loader is None:
@@ -28,7 +27,6 @@ def _load_update_script_module():
def _load_runtime_setup_module():
"""Load runtime setup without importing the SDK package under test."""
runtime_setup_path = ROOT / "_runtime_setup.py"
spec = importlib.util.spec_from_file_location("_runtime_setup", runtime_setup_path)
if spec is None or spec.loader is None:
@@ -42,13 +40,11 @@ def _load_runtime_setup_module():
def test_generation_has_single_maintenance_entrypoint_script() -> None:
"""Keep artifact workflows routed through one script instead of side entrypoints."""
scripts = sorted(p.name for p in (ROOT / "scripts").glob("*.py"))
assert scripts == ["update_sdk_artifacts.py"]
def test_generate_types_wires_all_generation_steps() -> None:
"""The type generation command should refresh every schema-derived artifact."""
source = (ROOT / "scripts" / "update_sdk_artifacts.py").read_text()
tree = ast.parse(source)
@@ -56,8 +52,7 @@ def test_generate_types_wires_all_generation_steps() -> None:
(
node
for node in tree.body
if isinstance(node, ast.FunctionDef)
and node.name == "generate_types_from_schema_dir"
if isinstance(node, ast.FunctionDef) and node.name == "generate_types"
),
None,
)
@@ -77,19 +72,19 @@ def test_generate_types_wires_all_generation_steps() -> None:
]
def _load_runtime_schema_bundle(tmp_path: Path) -> dict:
"""Ask the pinned runtime package for a real schema bundle used by tests."""
def test_schema_normalization_only_flattens_string_literal_oneofs() -> None:
script = _load_update_script_module()
schema_dir = script.generate_schema_from_pinned_runtime(tmp_path / "schema")
return json.loads(script.schema_bundle_path(schema_dir).read_text())
schema = json.loads(
(
ROOT.parent.parent
/ "codex-rs"
/ "app-server-protocol"
/ "schema"
/ "json"
/ "codex_app_server_protocol.v2.schemas.json"
).read_text()
)
def test_schema_normalization_only_flattens_string_literal_oneofs(
tmp_path: Path,
) -> None:
"""Schema normalization should only flatten the enum-shaped oneOf variants."""
script = _load_update_script_module()
schema = _load_runtime_schema_bundle(tmp_path)
definitions = schema["definitions"]
flattened = [
name
@@ -99,23 +94,27 @@ def test_schema_normalization_only_flattens_string_literal_oneofs(
]
assert flattened == [
"MessagePhase",
"TurnItemsView",
"PluginAvailability",
"AuthMode",
"InputModality",
"ExperimentalFeatureStage",
"CommandExecOutputStream",
"ProcessOutputStream",
"ExperimentalFeatureStage",
"InputModality",
"MessagePhase",
]
def test_python_codegen_schema_annotation_adds_stable_variant_titles(
tmp_path: Path,
) -> None:
"""Schema annotations should give generated protocol classes stable names."""
def test_python_codegen_schema_annotation_adds_stable_variant_titles() -> None:
script = _load_update_script_module()
schema = _load_runtime_schema_bundle(tmp_path)
schema = json.loads(
(
ROOT.parent.parent
/ "codex-rs"
/ "app-server-protocol"
/ "schema"
/ "json"
/ "codex_app_server_protocol.v2.schemas.json"
).read_text()
)
script._annotate_schema(schema)
definitions = schema["definitions"]
@@ -164,9 +163,8 @@ def test_runtime_package_template_has_no_checked_in_binaries() -> None:
def test_examples_readme_points_to_runtime_version_source_of_truth() -> None:
"""Document that examples should point at the dependency pin, not release lore."""
readme = (ROOT / "examples" / "README.md").read_text()
assert "The pinned runtime version comes from the SDK package dependency." in readme
assert "The pinned runtime version comes from the SDK package version." in readme
def test_runtime_distribution_name_is_consistent() -> None:
@@ -187,25 +185,6 @@ def test_runtime_distribution_name_is_consistent() -> None:
)
def test_source_sdk_package_pins_published_runtime() -> None:
"""The source package metadata should pin the runtime wheel that ships schemas."""
script = _load_update_script_module()
pyproject = tomllib.loads((ROOT / "pyproject.toml").read_text())
assert {
"sdk_version": pyproject["project"]["version"],
"runtime_pin": script.pinned_runtime_version(),
"dependencies": pyproject["project"]["dependencies"],
} == {
"sdk_version": "0.131.0a4",
"runtime_pin": "0.131.0a4",
"dependencies": [
"pydantic>=2.12",
"openai-codex-cli-bin==0.131.0a4",
],
}
def test_release_metadata_retries_without_invalid_auth(
monkeypatch: pytest.MonkeyPatch,
) -> None:
@@ -233,16 +212,11 @@ def test_release_metadata_retries_without_invalid_auth(
def test_runtime_setup_uses_pep440_package_version_and_codex_release_tags() -> None:
"""The SDK uses PEP 440 package pins and converts only when fetching releases."""
runtime_setup = _load_runtime_setup_module()
pyproject = tomllib.loads((ROOT / "pyproject.toml").read_text())
assert runtime_setup.PACKAGE_NAME == "openai-codex-cli-bin"
assert runtime_setup.pinned_runtime_version() == pyproject["project"]["version"]
assert (
f"{runtime_setup.PACKAGE_NAME}=={pyproject['project']['version']}"
in pyproject["project"]["dependencies"]
)
assert (
runtime_setup._normalized_package_version("rust-v0.116.0-alpha.1")
== "0.116.0a1"
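The conversion asserted above (release tag `rust-v0.116.0-alpha.1` normalizing to `0.116.0a1`) can be sketched as below. `normalize_release_tag` is a hypothetical stand-in for the SDK's `_normalized_package_version`, not its actual implementation:

```python
import re

def normalize_release_tag(tag: str) -> str:
    # Strip the repo-specific prefix, then rewrite pre-release
    # suffixes into PEP 440 form: -alpha.N -> aN, -beta.N -> bN,
    # -rc.N -> rcN. Plain releases pass through unchanged.
    version = tag.removeprefix("rust-v")
    version = re.sub(r"-alpha\.(\d+)", r"a\1", version)
    version = re.sub(r"-beta\.(\d+)", r"b\1", version)
    version = re.sub(r"-rc\.(\d+)", r"rc\1", version)
    return version

print(normalize_release_tag("rust-v0.116.0-alpha.1"))  # 0.116.0a1
```

Converting only at release-fetch time keeps the package metadata purely PEP 440 while still matching the upstream tag scheme.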
@@ -378,7 +352,6 @@ def test_stage_runtime_release_can_pin_wheel_platform_tag(tmp_path: Path) -> Non
def test_stage_runtime_release_copies_resource_binaries(tmp_path: Path) -> None:
"""Runtime staging should copy every helper binary into the wheel bin dir."""
script = _load_update_script_module()
fake_binary = tmp_path / script.runtime_binary_name()
helper = tmp_path / "helper"
@@ -409,7 +382,6 @@ def test_stage_runtime_release_copies_resource_binaries(tmp_path: Path) -> None:
def test_runtime_resource_binaries_are_included_by_wheel_config(
tmp_path: Path,
) -> None:
"""The runtime wheel config should include helper binaries beside Codex."""
script = _load_update_script_module()
fake_binary = tmp_path / script.runtime_binary_name()
helper = tmp_path / "helper"
@@ -426,7 +398,9 @@ def test_runtime_resource_binaries_are_included_by_wheel_config(
pyproject = tomllib.loads((staged / "pyproject.toml").read_text())
assert {
"include": pyproject["tool"]["hatch"]["build"]["targets"]["wheel"]["include"],
"helper": (staged / "src" / "codex_cli_bin" / "bin" / "helper").read_text(),
"helper": (
staged / "src" / "codex_cli_bin" / "bin" / "helper"
).read_text(),
} == {
"include": ["src/codex_cli_bin/bin/**"],
"helper": "fake helper\n",


@@ -13,15 +13,12 @@ from codex_app_server.models import Notification, UnknownNotification
def test_async_client_allows_concurrent_transport_calls() -> None:
"""Async wrappers should offload sync calls so concurrent awaits can overlap."""
async def scenario() -> int:
"""Run two blocking sync calls and report peak overlap."""
client = AsyncAppServerClient()
active = 0
max_active = 0
def fake_model_list(include_hidden: bool = False) -> bool:
"""Simulate a blocking sync transport call."""
nonlocal active, max_active
active += 1
max_active = max(max_active, active)
@@ -37,20 +34,16 @@ def test_async_client_allows_concurrent_transport_calls() -> None:
def test_async_stream_text_is_incremental_without_blocking_parallel_calls() -> None:
"""Async text streaming should yield incrementally without blocking other calls."""
async def scenario() -> tuple[str, list[str], bool]:
"""Start a stream, then prove another async client call can finish."""
client = AsyncAppServerClient()
def fake_stream_text(thread_id: str, text: str, params=None): # type: ignore[no-untyped-def]
"""Yield one item before sleeping so the async wrapper can interleave."""
yield "first"
time.sleep(0.03)
yield "second"
yield "third"
def fake_model_list(include_hidden: bool = False) -> str:
"""Return immediately to prove the event loop was not monopolized."""
return "done"
client._sync.stream_text = fake_stream_text # type: ignore[method-assign]
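The interleaving this test proves can be sketched with a generic adapter: pull each item of a blocking sync generator on a worker thread so the event loop stays free between yields. `aiter_blocking` is an illustrative helper, not the SDK's streaming code:

```python
import asyncio

def blocking_stream():
    # A sync generator that would block the event loop if iterated directly.
    yield "first"
    yield "second"

async def aiter_blocking(gen):
    # Fetch each item on a worker thread; the sentinel signals exhaustion
    # without letting StopIteration escape the thread boundary.
    sentinel = object()
    it = iter(gen)
    while True:
        item = await asyncio.to_thread(next, it, sentinel)
        if item is sentinel:
            return
        yield item

async def main():
    return [item async for item in aiter_blocking(blocking_stream())]

print(asyncio.run(main()))  # ['first', 'second']
```

Between `await`s inside the loop, other coroutines (like the `fake_model_list` call in the test) get a chance to run.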
@@ -77,9 +70,7 @@ def test_async_stream_text_is_incremental_without_blocking_parallel_calls() -> N
def test_async_client_turn_notification_methods_delegate_to_sync_client() -> None:
"""Async turn routing methods should preserve sync-client registration semantics."""
async def scenario() -> tuple[list[tuple[str, str]], Notification, str]:
"""Record the sync-client calls made by async turn notification wrappers."""
client = AsyncAppServerClient()
event = Notification(
method="unknown/direct",
@@ -94,20 +85,16 @@ def test_async_client_turn_notification_methods_delegate_to_sync_client() -> Non
calls: list[tuple[str, str]] = []
def fake_register(turn_id: str) -> None:
"""Record turn registration through the wrapped sync client."""
calls.append(("register", turn_id))
def fake_unregister(turn_id: str) -> None:
"""Record turn unregistration through the wrapped sync client."""
calls.append(("unregister", turn_id))
def fake_next(turn_id: str) -> Notification:
"""Return one routed notification through the wrapped sync client."""
calls.append(("next", turn_id))
return event
def fake_wait(turn_id: str) -> TurnCompletedNotification:
"""Return one completion through the wrapped sync client."""
calls.append(("wait", turn_id))
return completed
@@ -145,9 +132,7 @@ def test_async_client_turn_notification_methods_delegate_to_sync_client() -> Non
def test_async_stream_text_uses_sync_turn_routing() -> None:
"""Async text streaming should consume the same per-turn routing path as sync."""
async def scenario() -> tuple[list[tuple[str, str]], list[str]]:
"""Record routing calls while streaming two deltas and one completion."""
client = AsyncAppServerClient()
notifications = [
Notification(
@@ -185,21 +170,17 @@ def test_async_stream_text_uses_sync_turn_routing() -> None:
calls: list[tuple[str, str]] = []
def fake_turn_start(thread_id: str, text: str, *, params=None): # type: ignore[no-untyped-def]
"""Return a started turn id while recording the request thread."""
calls.append(("turn_start", thread_id))
return SimpleNamespace(turn=SimpleNamespace(id="turn-1"))
def fake_register(turn_id: str) -> None:
"""Record stream registration for the started turn."""
calls.append(("register", turn_id))
def fake_next(turn_id: str) -> Notification:
"""Return the next queued turn notification."""
calls.append(("next", turn_id))
return notifications.pop(0)
def fake_unregister(turn_id: str) -> None:
"""Record stream cleanup for the started turn."""
calls.append(("unregister", turn_id))
client._sync.turn_start = fake_turn_start # type: ignore[method-assign]


@@ -50,7 +50,6 @@ def test_generated_v2_bundle_has_single_shared_plan_type_definition() -> None:
def test_thread_resume_response_accepts_auto_review_reviewer() -> None:
"""Generated response models should keep accepting the auto review enum value."""
response = ThreadResumeResponse.model_validate(
{
"approvalPolicy": "on-request",
@@ -67,8 +66,6 @@ def test_thread_resume_response_accepts_auto_review_reviewer() -> None:
"id": "thread-1",
"modelProvider": "openai",
"preview": "",
# The pinned runtime schema requires the session id on threads.
"sessionId": "session-1",
"source": "cli",
"status": {"type": "idle"},
"turns": [],
@@ -138,7 +135,6 @@ def test_invalid_notification_payload_falls_back_to_unknown() -> None:
def test_generated_notification_turn_id_handles_known_payload_shapes() -> None:
"""Generated routing metadata should cover direct, nested, and unscoped payloads."""
direct = AgentMessageDeltaNotification.model_validate(
{
"delta": "hello",
@@ -163,7 +159,6 @@ def test_generated_notification_turn_id_handles_known_payload_shapes() -> None:
def test_turn_notification_router_demuxes_registered_turns() -> None:
"""The router should deliver out-of-order turn events to the matching queues."""
client = AppServerClient()
client.register_turn_notifications("turn-1")
client.register_turn_notifications("turn-2")
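The demux behavior under test can be sketched with a minimal per-turn queue router; the class and method names here are illustrative, modeled loosely on the registration API the tests exercise:

```python
from collections import deque

class TurnRouter:
    """Minimal sketch of per-turn demultiplexing (hypothetical API)."""

    def __init__(self) -> None:
        self._queues: dict[str, deque] = {}

    def register(self, turn_id: str) -> None:
        # Each registered turn gets its own FIFO queue.
        self._queues.setdefault(turn_id, deque())

    def route(self, turn_id: str, event: object) -> None:
        # setdefault buffers events that arrive before registration,
        # so early notifications replay once the turn registers.
        self._queues.setdefault(turn_id, deque()).append(event)

    def next_event(self, turn_id: str):
        return self._queues[turn_id].popleft()

router = TurnRouter()
router.register("turn-1")
router.register("turn-2")
# Out-of-order delivery still lands in the matching queues.
router.route("turn-2", "b1")
router.route("turn-1", "a1")
router.route("turn-1", "a2")
print(router.next_event("turn-1"), router.next_event("turn-2"))  # a1 b1
```

The same `setdefault` buffering covers the pre-registration case tested further below; a production router would additionally drop the queue once a turn completes unregistered.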
@@ -206,7 +201,6 @@ def test_turn_notification_router_demuxes_registered_turns() -> None:
def test_client_reader_routes_interleaved_turn_notifications_by_turn_id() -> None:
"""Reader-loop routing should preserve order within each interleaved turn stream."""
client = AppServerClient()
client.register_turn_notifications("turn-1")
client.register_turn_notifications("turn-2")
@@ -251,7 +245,6 @@ def test_client_reader_routes_interleaved_turn_notifications_by_turn_id() -> Non
]
def fake_read_message() -> dict[str, object]:
"""Feed the reader loop a realistic interleaved stdout sequence."""
if messages:
return messages.pop(0)
raise EOFError
@@ -285,7 +278,6 @@ def test_client_reader_routes_interleaved_turn_notifications_by_turn_id() -> Non
def test_turn_notification_router_buffers_events_before_registration() -> None:
"""Early turn events should be replayed once their TurnHandle registers."""
client = AppServerClient()
client._router.route_notification(
client._coerce_notification(
@@ -310,7 +302,6 @@ def test_turn_notification_router_buffers_events_before_registration() -> None:
def test_turn_notification_router_clears_unregistered_turn_when_completed() -> None:
"""A completed unregistered turn should not leave a pending queue behind."""
client = AppServerClient()
client._router.route_notification(
client._coerce_notification(
@@ -337,7 +328,6 @@ def test_turn_notification_router_clears_unregistered_turn_when_completed() -> N
def test_turn_notification_router_routes_unknown_turn_notifications() -> None:
"""Unknown notifications should still route when their raw params carry a turn id."""
client = AppServerClient()
client.register_turn_notifications("turn-1")
client.register_turn_notifications("turn-2")


@@ -1,6 +1,5 @@
from __future__ import annotations
import importlib.metadata
import os
import subprocess
import sys
@@ -15,7 +14,6 @@ GENERATED_TARGETS = [
def _snapshot_target(root: Path, rel_path: Path) -> dict[str, bytes] | bytes | None:
"""Capture one generated artifact so regeneration drift is easy to compare."""
target = root / rel_path
if not target.exists():
return None
@@ -30,22 +28,16 @@ def _snapshot_target(root: Path, rel_path: Path) -> dict[str, bytes] | bytes | N
def _snapshot_targets(root: Path) -> dict[str, dict[str, bytes] | bytes | None]:
"""Capture all checked-in generated artifacts before and after regeneration."""
return {
str(rel_path): _snapshot_target(root, rel_path)
for rel_path in GENERATED_TARGETS
str(rel_path): _snapshot_target(root, rel_path) for rel_path in GENERATED_TARGETS
}
def test_generated_files_are_up_to_date():
"""Regenerating from the pinned runtime package should leave artifacts unchanged."""
before = _snapshot_targets(ROOT)
# Regenerate contract artifacts via the pinned runtime package, not a local
# app-server binary from the checkout or CI environment.
assert importlib.metadata.version("openai-codex-cli-bin") == "0.131.0a4"
# Regenerate contract artifacts via a single maintenance entrypoint.
env = os.environ.copy()
env.pop("CODEX_EXEC_PATH", None)
python_bin = str(Path(sys.executable).parent)
env["PATH"] = f"{python_bin}{os.pathsep}{env.get('PATH', '')}"


@@ -82,7 +82,6 @@ def _item_completed_notification(
text: str = "final text",
phase: MessagePhase | None = None,
) -> Notification:
"""Build a realistic completed-item notification accepted by generated models."""
item: dict[str, object] = {
"id": "item-1",
"text": text,
@@ -94,8 +93,6 @@ def _item_completed_notification(
method="item/completed",
payload=ItemCompletedNotification.model_validate(
{
# The pinned runtime schema requires completion timestamps.
"completedAtMs": 1,
"item": item,
"threadId": thread_id,
"turnId": turn_id,
@@ -230,7 +227,6 @@ def test_async_codex_initializes_only_once_under_concurrency() -> None:
def test_turn_streams_can_consume_multiple_turns_on_one_client() -> None:
"""Two sync TurnHandle streams should advance independently on one client."""
client = AppServerClient()
notifications: dict[str, deque[Notification]] = {
"turn-1": deque(
@@ -261,13 +257,10 @@ def test_turn_streams_can_consume_multiple_turns_on_one_client() -> None:
def test_async_turn_streams_can_consume_multiple_turns_on_one_client() -> None:
"""Two async TurnHandle streams should advance independently on one client."""
async def scenario() -> None:
"""Interleave two async streams backed by separate per-turn queues."""
codex = AsyncCodex()
async def fake_ensure_initialized() -> None:
"""Avoid starting a real app-server process for this stream test."""
return None
notifications: dict[str, deque[Notification]] = {
@@ -286,7 +279,6 @@ def test_async_turn_streams_can_consume_multiple_turns_on_one_client() -> None:
}
async def fake_next_notification(turn_id: str) -> Notification:
"""Return the next notification from the requested per-turn queue."""
return notifications[turn_id].popleft()
codex._ensure_initialized = fake_ensure_initialized # type: ignore[method-assign]
@@ -476,7 +468,6 @@ def test_thread_run_raises_on_failed_turn() -> None:
def test_stream_text_registers_and_consumes_turn_notifications() -> None:
"""stream_text should register, consume, and unregister one turn queue."""
client = AppServerClient()
notifications: deque[Notification] = deque(
[
@@ -491,16 +482,13 @@ def test_stream_text_registers_and_consumes_turn_notifications() -> None:
)
def fake_register(turn_id: str) -> None:
"""Record registration for the turn created by stream_text."""
calls.append(("register", turn_id))
def fake_next(turn_id: str) -> Notification:
"""Return the next queued notification for stream_text."""
calls.append(("next", turn_id))
return notifications.popleft()
def fake_unregister(turn_id: str) -> None:
"""Record cleanup for the turn created by stream_text."""
calls.append(("unregister", turn_id))
client.register_turn_notifications = fake_register # type: ignore[method-assign]
@@ -522,13 +510,10 @@ def test_stream_text_registers_and_consumes_turn_notifications() -> None:
def test_async_thread_run_accepts_string_input_and_returns_run_result() -> None:
"""Async Thread.run should normalize string input and collect routed results."""
async def scenario() -> None:
"""Feed item, usage, and completion events through the async turn stream."""
codex = AsyncCodex()
async def fake_ensure_initialized() -> None:
"""Avoid starting a real app-server process for this run test."""
return None
item_notification = _item_completed_notification(text="Hello async.")
@@ -543,14 +528,12 @@ def test_async_thread_run_accepts_string_input_and_returns_run_result() -> None:
seen: dict[str, object] = {}
async def fake_turn_start(thread_id: str, wire_input: object, *, params=None): # noqa: ANN001,ANN202
"""Capture normalized input and return a synthetic turn id."""
seen["thread_id"] = thread_id
seen["wire_input"] = wire_input
seen["params"] = params
return SimpleNamespace(turn=SimpleNamespace(id="turn-1"))
async def fake_next_notification(_turn_id: str) -> Notification:
"""Return the next queued notification for the synthetic turn."""
return notifications.popleft()
codex._ensure_initialized = fake_ensure_initialized # type: ignore[method-assign]
@@ -573,13 +556,10 @@ def test_async_thread_run_accepts_string_input_and_returns_run_result() -> None:
def test_async_thread_run_uses_last_completed_assistant_message_as_final_response() -> (
None
):
"""Async run should use the last final assistant message as the response text."""
async def scenario() -> None:
"""Feed two completed agent messages through the async per-turn stream."""
codex = AsyncCodex()
async def fake_ensure_initialized() -> None:
"""Avoid starting a real app-server process for this run test."""
return None
first_item_notification = _item_completed_notification(
@@ -597,11 +577,9 @@ def test_async_thread_run_uses_last_completed_assistant_message_as_final_respons
)
async def fake_turn_start(thread_id: str, wire_input: object, *, params=None): # noqa: ANN001,ANN202,ARG001
"""Return a synthetic turn id after AsyncThread.run builds input."""
return SimpleNamespace(turn=SimpleNamespace(id="turn-1"))
async def fake_next_notification(_turn_id: str) -> Notification:
"""Return the next queued notification for that synthetic turn."""
return notifications.popleft()
codex._ensure_initialized = fake_ensure_initialized # type: ignore[method-assign]
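The selection rule this test pins down, "last final-phase assistant message wins, commentary is ignored", can be sketched as below; the dict shapes and `final_response` helper are illustrative, not the SDK's generated models:

```python
items = [
    {"type": "agent_message", "phase": "commentary", "text": "thinking..."},
    {"type": "agent_message", "phase": "final", "text": "First answer."},
    {"type": "agent_message", "phase": "final", "text": "Second answer."},
]

def final_response(completed_items):
    # Walk completed items in order; the last final-phase agent
    # message overwrites earlier candidates. Commentary never counts,
    # so a commentary-only turn yields None.
    text = None
    for item in completed_items:
        if item.get("type") == "agent_message" and item.get("phase") == "final":
            text = item["text"]
    return text

print(final_response(items))  # Second answer.
```

This matches the commentary-only test below, where the run result's final text is expected to be `None`.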
@@ -620,13 +598,10 @@ def test_async_thread_run_uses_last_completed_assistant_message_as_final_respons
def test_async_thread_run_returns_none_when_only_commentary_messages_complete() -> None:
"""Async Thread.run should ignore commentary-only messages for final text."""
async def scenario() -> None:
"""Feed a commentary item and completion through the async turn stream."""
codex = AsyncCodex()
async def fake_ensure_initialized() -> None:
"""Avoid starting a real app-server process for this run test."""
return None
commentary_notification = _item_completed_notification(
@@ -641,11 +616,9 @@ def test_async_thread_run_returns_none_when_only_commentary_messages_complete()
)
async def fake_turn_start(thread_id: str, wire_input: object, *, params=None): # noqa: ANN001,ANN202,ARG001
"""Return a synthetic turn id for commentary-only output."""
return SimpleNamespace(turn=SimpleNamespace(id="turn-1"))
async def fake_next_notification(_turn_id: str) -> Notification:
"""Return the next queued commentary/completion notification."""
return notifications.popleft()
codex._ensure_initialized = fake_ensure_initialized # type: ignore[method-assign]


@@ -54,7 +54,6 @@ def test_package_includes_py_typed_marker() -> None:
def test_generated_public_signatures_are_snake_case_and_typed() -> None:
"""Generated convenience methods should expose typed Pythonic keyword names."""
expected = {
Codex.thread_start: [
"approval_policy",
@@ -71,7 +70,6 @@ def test_generated_public_signatures_are_snake_case_and_typed() -> None:
"service_name",
"service_tier",
"session_start_source",
"thread_source",
],
Codex.thread_list: [
"archived",
@@ -110,7 +108,6 @@ def test_generated_public_signatures_are_snake_case_and_typed() -> None:
"model_provider",
"sandbox",
"service_tier",
"thread_source",
],
Thread.turn: [
"approval_policy",
@@ -151,7 +148,6 @@ def test_generated_public_signatures_are_snake_case_and_typed() -> None:
"service_name",
"service_tier",
"session_start_source",
"thread_source",
],
AsyncCodex.thread_list: [
"archived",
@@ -190,7 +186,6 @@ def test_generated_public_signatures_are_snake_case_and_typed() -> None:
"model_provider",
"sandbox",
"service_tier",
"thread_source",
],
AsyncThread.turn: [
"approval_policy",

sdk/python/uv.lock generated

@@ -3,12 +3,9 @@ revision = 3
requires-python = ">=3.10"
[options]
exclude-newer = "2026-05-02T06:28:46.47929Z"
exclude-newer = "2026-04-20T18:19:27.620299Z"
exclude-newer-span = "P7D"
[options.exclude-newer-package]
openai-codex-cli-bin = "2026-05-10T00:00:00Z"
[[package]]
name = "annotated-types"
version = "0.7.0"
@@ -282,10 +279,9 @@ wheels = [
[[package]]
name = "openai-codex-app-server-sdk"
version = "0.131.0a4"
version = "0.116.0a1"
source = { editable = "." }
dependencies = [
{ name = "openai-codex-cli-bin" },
{ name = "pydantic" },
]
@@ -299,26 +295,12 @@ dev = [
[package.metadata]
requires-dist = [
{ name = "datamodel-code-generator", marker = "extra == 'dev'", specifier = "==0.31.2" },
{ name = "openai-codex-cli-bin", specifier = "==0.131.0a4" },
{ name = "pydantic", specifier = ">=2.12" },
{ name = "pytest", marker = "extra == 'dev'", specifier = ">=8.0" },
{ name = "ruff", marker = "extra == 'dev'", specifier = ">=0.11" },
]
provides-extras = ["dev"]
[[package]]
name = "openai-codex-cli-bin"
version = "0.131.0a4"
source = { registry = "https://pypi.org/simple" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6b/9f/f9fc4bb1b2b7a20d4d65143ebb4c4dcd2301a718183b539ecb5b1c0ac3ec/openai_codex_cli_bin-0.131.0a4-py3-none-macosx_10_9_x86_64.whl", hash = "sha256:db0f3cb7dda310641ac04fbaf3f128693a3817ab83ae59b67a3c9c74bd53f8b8", size = 88367585, upload-time = "2026-05-09T06:14:09.453Z" },
{ url = "https://files.pythonhosted.org/packages/dc/39/eb95ed0e8156669e895a192dec760be07dabe891c3c6340f7c6487b9a976/openai_codex_cli_bin-0.131.0a4-py3-none-macosx_11_0_arm64.whl", hash = "sha256:6cae5af6edca7f6d3f0bcbbd93cfc8a6dc3e33fb5955af21ae492b6d5d0dcb72", size = 79245567, upload-time = "2026-05-09T06:14:13.581Z" },
{ url = "https://files.pythonhosted.org/packages/0c/92/ade176fa78d746d5ff7a6e371d64740c0d95ab299b0dd58a5404b89b3915/openai_codex_cli_bin-0.131.0a4-py3-none-musllinux_1_1_aarch64.whl", hash = "sha256:5728f9887baf62d7e72f4f242093b3ff81e26c81d80d346fe1eef7eda6838aa8", size = 77758628, upload-time = "2026-05-09T06:14:18.374Z" },
{ url = "https://files.pythonhosted.org/packages/28/e6/bfe6c65f8e3e5499f71b24c3b6e8d07e4d426543d25e429b9b141b544e5f/openai_codex_cli_bin-0.131.0a4-py3-none-musllinux_1_1_x86_64.whl", hash = "sha256:d7a47fd3667fbcc216593839c202deffa056e9b3d46c6933e72594d461f4fea0", size = 84535509, upload-time = "2026-05-09T06:14:22.851Z" },
{ url = "https://files.pythonhosted.org/packages/bd/b7/53dc094a691ab6f2ca079e8e865b122843809ac4fad51cac4d59021e599d/openai_codex_cli_bin-0.131.0a4-py3-none-win_amd64.whl", hash = "sha256:c61bcf029672494c4c7fdc8567dbaa659a48bb75641d91c2ade27c1e46803434", size = 88185543, upload-time = "2026-05-09T06:14:27.282Z" },
{ url = "https://files.pythonhosted.org/packages/82/99/e0852ffcf9b4d2794fef83e0c3a267b3c773a776f136e9f7ce19f0c8df42/openai_codex_cli_bin-0.131.0a4-py3-none-win_arm64.whl", hash = "sha256:bbde750186861f102e346ac066f4e9608f515f7b71b16a6e8b7ef1ddc02a97a5", size = 81196380, upload-time = "2026-05-09T06:14:32.103Z" },
]
[[package]]
name = "packaging"
version = "26.1"