## Why

The SDK should publish under the reserved public distribution name `openai-codex`, and its import module should match that name in the Python style. Since package names can contain hyphens but import modules cannot, the public import path becomes `openai_codex`. Keeping the rename separate from the public API surface change makes the naming change easy to review and avoids mixing it with API curation.

## What

- Rename the SDK distribution from `openai-codex-app-server-sdk` to `openai-codex`.
- Rename the import package from `codex_app_server` to `openai_codex`.
- Keep the runtime wheel as the separate `openai-codex-cli-bin` dependency.
- Update docs, examples, notebooks, artifact scripts, lockfile metadata, and tests for the new distribution/module names.

## Stack

1. #21891 `[1/8]` Pin Python SDK runtime dependency
2. #21893 `[2/8]` Generate Python SDK types from pinned runtime
3. #21895 `[3/8]` Run Python SDK tests in CI
4. #21896 `[4/8]` Define Python SDK public API surface
5. This PR `[5/8]` Rename Python SDK package to `openai-codex`
6. #21910 `[6/8]` Add high-level Python SDK approval mode
7. #22014 `[7/8]` Add Python SDK app-server integration harness
8. #22021 `[8/8]` Add Python SDK Ruff formatting

## Verification

- Updated package metadata and public API tests to assert the distribution and import names.

Co-authored-by: Codex <noreply@openai.com>
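The hyphen-to-underscore mapping described above can be illustrated in a few lines (a generic sketch of the naming convention, not code from this repo; `import_name` is a hypothetical helper):

```python
def import_name(dist_name: str) -> str:
    """Map a PyPI distribution name to its conventional import module name.

    Distribution names may contain hyphens, but a hyphen in an ``import``
    statement parses as subtraction, so import modules use underscores.
    """
    return dist_name.replace("-", "_")


print(import_name("openai-codex"))  # openai_codex
```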
# Python SDK Examples

Each example folder contains runnable versions:

- `sync.py` (public sync surface: `Codex`)
- `async.py` (public async surface: `AsyncCodex`)

All examples intentionally use only public SDK exports from `openai_codex`
and `openai_codex.types`.
## Prerequisites

- Python >= 3.10
- Install SDK dependencies for the same Python interpreter you will use to run examples

Recommended setup (from `sdk/python`):

```shell
uv sync
source .venv/bin/activate
```
When running examples from this repo checkout, the SDK source uses the local
tree and does not bundle a runtime binary. The helper in `examples/_bootstrap.py`
uses the installed `openai-codex-cli-bin` runtime package.

If the pinned `openai-codex-cli-bin` runtime is not already installed, the bootstrap
will download the matching GitHub release artifact, stage a temporary local
`openai-codex-cli-bin` package, install it into your active interpreter, and clean up
the temporary files afterward. The pinned runtime version comes from the SDK
package dependency.
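The "install if missing" check at the heart of that flow can be sketched as follows. This is a simplified illustration, not the actual `_bootstrap.py` code: the real helper downloads and stages a GitHub release artifact rather than installing from a package index.

```python
import importlib.metadata
import subprocess
import sys

RUNTIME_DIST = "openai-codex-cli-bin"


def runtime_installed(dist: str) -> bool:
    """Return True if the distribution is installed for this interpreter."""
    try:
        importlib.metadata.version(dist)
        return True
    except importlib.metadata.PackageNotFoundError:
        return False


def ensure_runtime(dist: str = RUNTIME_DIST) -> None:
    """Install the runtime into the *active* interpreter if it is missing.

    Simplified: the real bootstrap stages a temporary local package built
    from a downloaded release artifact, then cleans it up afterward.
    """
    if not runtime_installed(dist):
        subprocess.check_call([sys.executable, "-m", "pip", "install", dist])
```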
## Run examples

From `sdk/python`:

```shell
python examples/<example-folder>/sync.py
python examples/<example-folder>/async.py
```
The examples bootstrap local imports from `sdk/python/src` automatically, so no
SDK wheel install is required. You only need the Python dependencies for your
active interpreter and an installed `openai-codex-cli-bin` runtime package (either
already present or automatically provisioned by the bootstrap).
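The local-import part of the bootstrap amounts to putting `sdk/python/src` ahead of installed packages on `sys.path`. A minimal sketch of that step (illustrative only; `bootstrap_src` is a hypothetical name, not the real `_bootstrap.py` helper):

```python
import sys
from pathlib import Path


def bootstrap_src(example_file: str) -> str:
    """Prepend the checkout's ``src`` tree to ``sys.path``.

    Given an example at ``sdk/python/examples/<folder>/sync.py``, the SDK
    sources live two directories up, in ``sdk/python/src``. Inserting that
    path first makes ``import openai_codex`` resolve to the local tree
    instead of requiring an installed SDK wheel.
    """
    src = str(Path(example_file).resolve().parents[2] / "src")
    if src not in sys.path:
        sys.path.insert(0, src)
    return src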
## Recommended first run

```shell
python examples/01_quickstart_constructor/sync.py
python examples/01_quickstart_constructor/async.py
```
## Index

- `01_quickstart_constructor/` - first run / sanity check
- `02_turn_run/` - inspect full turn output fields
- `03_turn_stream_events/` - stream a turn with a small curated event view
- `04_models_and_metadata/` - discover visible models for the connected runtime
- `05_existing_thread/` - resume a real existing thread (created in-script)
- `06_thread_lifecycle_and_controls/` - thread lifecycle + control calls
- `07_image_and_text/` - remote image URL + text multimodal turn
- `08_local_image_and_text/` - local image + text multimodal turn using a generated temporary sample image
- `09_async_parity/` - parity-style sync flow (see async parity in other examples)
- `10_error_handling_and_retry/` - overload retry pattern + typed error handling structure
- `11_cli_mini_app/` - interactive chat loop
- `12_turn_params_kitchen_sink/` - structured output with a curated advanced `turn(...)` configuration
- `13_model_select_and_turn_params/` - list models, pick highest model + highest supported reasoning effort, run turns, print message and usage
- `14_turn_controls/` - separate best-effort `steer()` and `interrupt()` demos with concise summaries