[4/8] Define Python SDK public API surface (#21896)

## Why

The SDK package root should be the ergonomic public client API, not a
dump of every generated app-server schema type. Generated models still
need a supported import path, but callers should be able to tell which
names are high-level SDK entrypoints and which names are protocol value
models.

## What

- Define a curated root `__all__` for clients, handles, input helpers,
retry helpers, config, and public errors.
- Add a `types` module as the supported home for generated app-server
response, event, enum, and helper models.
- Update docs and examples to import protocol/value models from the
  `types` module.
- Add tests that lock root exports, type-module exports, star-import
behavior, and example import hygiene.
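
The root/`types` split described above can be sketched as follows. This is an illustrative toy, not the real `codex_app_server` layout: `toy_sdk` and its members are stand-ins, and only the `__all__` mechanics are the point.

```python
import sys
import types

# Build a toy package in memory: the root exposes a curated __all__ with
# client entrypoints, while generated protocol models would live in a
# submodule. All names here are illustrative stand-ins.
root = types.ModuleType("toy_sdk")
exec(
    "class Codex: ...\n"        # high-level client (curated export)
    "class TextInput: ...\n"    # input helper (curated export)
    "class TurnStatus: ...\n"   # generated protocol enum (not curated)
    "__all__ = ['Codex', 'TurnStatus'.replace('TurnStatus', 'TextInput')]\n"
    "__all__ = ['Codex', 'TextInput']\n",
    root.__dict__,
)
sys.modules["toy_sdk"] = root

# A star import binds exactly the curated names, not every module global.
ns = {}
exec("from toy_sdk import *", ns)
star_names = sorted(n for n in ns if not n.startswith("__"))
print(star_names)  # ['Codex', 'TextInput'] -- TurnStatus is not dragged in
```

In the real package, `TurnStatus` would instead be importable from `codex_app_server.types`, keeping the root namespace limited to high-level entrypoints.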

## Stack

1. #21891 `[1/8]` Pin Python SDK runtime dependency
2. #21893 `[2/8]` Generate Python SDK types from pinned runtime
3. #21895 `[3/8]` Run Python SDK tests in CI
4. This PR `[4/8]` Define Python SDK public API surface
5. #21905 `[5/8]` Rename Python SDK package to `openai-codex`
6. #21910 `[6/8]` Add high-level Python SDK approval mode
7. #22014 `[7/8]` Add Python SDK app-server integration harness
8. #22021 `[8/8]` Add Python SDK Ruff formatting

## Verification

- Added public API signature tests for root exports, `types` exports,
and example imports.
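
A minimal sketch of the kind of "lock the export surface" check described above. It is demonstrated against the stdlib `json` module because this snippet cannot assume `codex_app_server` is installed; the real tests target the SDK package and its `types` module.

```python
import importlib


def star_import_names(module_name: str) -> set[str]:
    """Names that `from <module_name> import *` actually binds."""
    ns: dict = {}
    exec(f"from {module_name} import *", ns)
    return {name for name in ns if not name.startswith("__")}


def assert_star_matches_all(module_name: str) -> None:
    """Star-import behavior must agree with the module's declared __all__."""
    module = importlib.import_module(module_name)
    assert star_import_names(module_name) == set(module.__all__)


# Stand-in target; the SDK tests would call this on codex_app_server
# and codex_app_server.types instead.
assert_star_matches_all("json")
```

Pinning the export list in a test means any accidental addition to (or removal from) the public surface fails CI instead of silently shipping.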

---------

Co-authored-by: Codex <noreply@openai.com>
Author: Ahmed Ibrahim
Date: 2026-05-12 00:57:44 +03:00
Committed by: GitHub
Parent: 3e2936dd0e
Commit: b4bc02439f
16 changed files with 274 additions and 78 deletions


@@ -24,9 +24,9 @@ from codex_app_server import (
    JsonRpcError,
    ServerBusyError,
    TextInput,
-    TurnStatus,
    is_retryable_error,
)
+from codex_app_server.types import TurnStatus
ResultT = TypeVar("ResultT")


@@ -19,9 +19,9 @@ from codex_app_server import (
    JsonRpcError,
    ServerBusyError,
    TextInput,
-    TurnStatus,
    retry_on_overload,
)
+from codex_app_server.types import TurnStatus

with Codex(config=runtime_config()) as codex:
    thread = codex.thread_start(model="gpt-5.4", config={"model_reasoning_effort": "high"})


@@ -14,6 +14,8 @@ import asyncio
from codex_app_server import (
    AsyncCodex,
    TextInput,
)
from codex_app_server.types import (
    ThreadTokenUsageUpdatedNotification,
    TurnCompletedNotification,
)


@@ -12,6 +12,8 @@ ensure_local_sdk_src()
from codex_app_server import (
    Codex,
    TextInput,
)
from codex_app_server.types import (
    ThreadTokenUsageUpdatedNotification,
    TurnCompletedNotification,
)


@@ -18,11 +18,13 @@ ensure_local_sdk_src()
import asyncio
from codex_app_server import (
    AskForApproval,
    AsyncCodex,
    TextInput,
)
from codex_app_server.types import (
    AskForApproval,
    Personality,
    ReasoningSummary,
    TextInput,
)
OUTPUT_SCHEMA = {


@@ -16,11 +16,13 @@ from _bootstrap import (
ensure_local_sdk_src()
from codex_app_server import (
    AskForApproval,
    Codex,
    TextInput,
)
from codex_app_server.types import (
    AskForApproval,
    Personality,
    ReasoningSummary,
    TextInput,
)
OUTPUT_SCHEMA = {


@@ -12,13 +12,15 @@ ensure_local_sdk_src()
import asyncio
from codex_app_server import (
    AskForApproval,
    AsyncCodex,
    TextInput,
)
from codex_app_server.types import (
    AskForApproval,
    Personality,
    ReasoningEffort,
    ReasoningSummary,
    SandboxPolicy,
    TextInput,
)
REASONING_RANK = {


@@ -10,13 +10,15 @@ from _bootstrap import assistant_text_from_turn, ensure_local_sdk_src, find_turn
ensure_local_sdk_src()
from codex_app_server import (
    AskForApproval,
    Codex,
    TextInput,
)
from codex_app_server.types import (
    AskForApproval,
    Personality,
    ReasoningEffort,
    ReasoningSummary,
    SandboxPolicy,
    TextInput,
)
REASONING_RANK = {


@@ -5,7 +5,8 @@ Each example folder contains runnable versions:
- `sync.py` (public sync surface: `Codex`)
- `async.py` (public async surface: `AsyncCodex`)
-All examples intentionally use only public SDK exports from `codex_app_server`.
+All examples intentionally use only public SDK exports from `codex_app_server`
+and `codex_app_server.types`.
## Prerequisites