## Why

The high-level SDK should expose the approval behavior it actually supports instead of leaking generated app-server routing fields. New work should have two clear choices: default auto review, or explicitly deny escalated permission requests. Existing threads and subsequent turns should preserve their current approval behavior unless the caller passes an override.

## What

- Add the public `ApprovalMode` enum with `auto_review` and `deny_all`.
- Default new thread creation to `ApprovalMode.auto_review`.
- Preserve existing approval settings by default for resume, fork, run, and turn helpers.
- Remove raw `approval_policy` / `approvals_reviewer` kwargs from high-level SDK wrappers.
- Update generated wrapper output, docs, examples, notebooks, and tests for the high-level approval mode API.

## Stack

1. #21891 `[1/8]` Pin Python SDK runtime dependency
2. #21893 `[2/8]` Generate Python SDK types from pinned runtime
3. #21895 `[3/8]` Run Python SDK tests in CI
4. #21896 `[4/8]` Define Python SDK public API surface
5. #21905 `[5/8]` Rename Python SDK package to `openai-codex`
6. This PR `[6/8]` Add high-level Python SDK approval mode
7. #22014 `[7/8]` Add Python SDK app-server integration harness
8. #22021 `[8/8]` Add Python SDK Ruff formatting

## Verification

- Added approval-mode mapping/default tests for new threads, existing threads, forks, resumes, and subsequent turns.

---------

Co-authored-by: Codex <noreply@openai.com>
# OpenAI Codex Python SDK (Experimental)

Experimental Python SDK for the codex `app-server` (JSON-RPC v2 over stdio), with a small default surface optimized for real scripts and apps.
The generated wire-model layer is sourced from the pinned `openai-codex-cli-bin` runtime package and exposed as Pydantic models with snake_case Python fields that serialize back to the app-server's camelCase wire format.

The package root exports the ergonomic client API; public app-server value and event types live in `openai_codex.types`.
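The snake_case/camelCase aliasing described above can be illustrated with plain helpers. This is a sketch of the naming convention only, not the SDK's implementation (which relies on Pydantic field aliases); neither function below exists in the package:

```python
import re


def snake_to_camel(name: str) -> str:
    """Convert a snake_case field name to its camelCase wire alias."""
    head, *rest = name.split("_")
    return head + "".join(part.title() for part in rest)


def camel_to_snake(name: str) -> str:
    """Invert the alias: a camelCase wire key back to a snake_case field."""
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()
```

So a wire key like `finalResponse` corresponds to the Python field `final_response`, and round-tripping a field name is lossless.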
## Install

```shell
cd sdk/python
uv sync
source .venv/bin/activate
```
Published SDK builds pin an exact `openai-codex-cli-bin` runtime dependency with the same version as the SDK. For local repo development, either pass `AppServerConfig(codex_bin=...)` to point at a local build explicitly, or use the repo examples/notebook bootstrap, which installs the pinned runtime package automatically.
## Quickstart

```python
from openai_codex import Codex

with Codex() as codex:
    thread = codex.thread_start(model="gpt-5")
    result = thread.run("Say hello in one sentence.")
    print(result.final_response)
    print(len(result.items))
```

`result.final_response` is `None` when the turn completes without a final-answer or phase-less assistant message item.
## Docs map

- Golden path tutorial: `docs/getting-started.md`
- API reference (signatures + behavior): `docs/api-reference.md`
- Common decisions and pitfalls: `docs/faq.md`
- Runnable examples index: `examples/README.md`
- Jupyter walkthrough notebook: `notebooks/sdk_walkthrough.ipynb`
## Examples

Start here:

```shell
cd sdk/python
python examples/01_quickstart_constructor/sync.py
python examples/01_quickstart_constructor/async.py
```
## Runtime packaging

The repo no longer checks codex binaries into `sdk/python`. Published SDK builds are pinned to an exact `openai-codex-cli-bin` package version, and that runtime package carries the platform-specific binary for the target wheel. The SDK package version and runtime package version must match.

For local repo development, the checked-in `sdk/python-runtime` package is only a template for staged release artifacts. Editable installs should use an explicit `codex_bin` override for manual SDK usage; the repo examples and notebook bootstrap the pinned runtime package automatically.
## Maintainer workflow

```shell
cd sdk/python
uv sync
python scripts/update_sdk_artifacts.py generate-types
python scripts/update_sdk_artifacts.py \
  stage-sdk \
  /tmp/codex-python-release/openai-codex \
  --codex-version <codex-release-tag-or-pep440-version>
python scripts/update_sdk_artifacts.py \
  stage-runtime \
  /tmp/codex-python-release/openai-codex-cli-bin \
  /path/to/codex \
  --codex-version <codex-release-tag-or-pep440-version>
```
Pass `--platform-tag ...` to `stage-runtime` when the wheel should be tagged for a Rust target that differs from the Python build host. The intended one-off matrix is `macosx_11_0_arm64`, `macosx_10_9_x86_64`, `musllinux_1_1_aarch64`, `musllinux_1_1_x86_64`, `win_arm64`, and `win_amd64`.
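The matrix above can be written down as a lookup table. The wheel tags come from this document; the Rust target triples paired with them are assumptions for illustration, not taken from the repo's release scripts:

```python
# Illustrative mapping from Rust target triple to wheel platform tag.
# Triples are assumed pairings; only the tags are stated in this README.
RUST_TARGET_TO_WHEEL_TAG = {
    "aarch64-apple-darwin": "macosx_11_0_arm64",
    "x86_64-apple-darwin": "macosx_10_9_x86_64",
    "aarch64-unknown-linux-musl": "musllinux_1_1_aarch64",
    "x86_64-unknown-linux-musl": "musllinux_1_1_x86_64",
    "aarch64-pc-windows-msvc": "win_arm64",
    "x86_64-pc-windows-msvc": "win_amd64",
}
```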
This supports the CI release flow:

- run `generate-types` before packaging
- stage `openai-codex` once with an exact `openai-codex-cli-bin==...` dependency
- stage `openai-codex-cli-bin` on each supported platform runner with the same pinned runtime version
- build and publish `openai-codex-cli-bin` as platform wheels only through PyPI trusted publishing; do not publish an sdist
## Compatibility and versioning

- Package: `openai-codex`
- Runtime package: `openai-codex-cli-bin`
- Python: `>=3.10`
- Target protocol: Codex `app-server` JSON-RPC v2
- Versioning rule: the SDK package version is the underlying Codex runtime version
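Because the versioning rule makes the SDK and runtime versions identical, the pinned dependency specifier can be derived mechanically. The helper and the example version string are hypothetical, shown only to make the rule concrete:

```python
def runtime_requirement(sdk_version: str) -> str:
    """Illustrative helper: the SDK pins the runtime package to its own
    exact version, so the specifier is a simple `==` pin."""
    return f"openai-codex-cli-bin=={sdk_version}"
```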
## Notes

- `Codex()` is eager and performs startup + `initialize` in the constructor.
- Use context managers (`with Codex() as codex:`) to ensure shutdown.
- Prefer `thread.run("...")` for the common case. Use `thread.turn(...)` when you need streaming, steering, or interrupt control.
- For transient overload, use `retry_on_overload` from the package root.
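The general pattern such an overload-retry helper implements can be sketched as follows. This is not the SDK's actual `retry_on_overload`: the exception class is a stand-in and the real export may differ in signature and in which errors it treats as overload:

```python
import time


class OverloadedError(Exception):
    """Stand-in for a transient overload failure (hypothetical name)."""


def retry_on_overload(fn, *, attempts: int = 3, base_delay: float = 0.01):
    """Call fn, retrying with exponential backoff on OverloadedError."""
    for attempt in range(attempts):
        try:
            return fn()
        except OverloadedError:
            if attempt == attempts - 1:
                raise  # exhausted: surface the last overload error
            time.sleep(base_delay * (2**attempt))
```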