## Why

[#21736](https://github.com/openai/codex/pull/21736) introduces the typed extension API, but the runtime does not yet carry a registry through thread/session startup or give contributors host-owned stores to read from. This PR wires that host-side path so later feature migrations can move product-specific behavior behind typed contributions without adding another bespoke seam directly to `codex-core`.

## What changed

- Thread `ExtensionRegistry<Config>` through `ThreadManager`, `CodexSpawnArgs`, `Session`, and sub-agent spawn paths.
- Wire the `ThreadStartContributor` and `ContextContributor` contribution points into the host-side startup path.
- Expose the small supporting surface needed by non-core callers that construct threads directly, including `empty_extension_registry()` through `codex-core-api`.

This PR lands the host plumbing only: the app-server registry is still empty, and concrete feature migrations are intended to follow separately.
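To make the shape of the change concrete, here is a minimal, self-contained sketch of a typed extension registry with host-owned contribution points. This is illustrative only: the trait names echo the contribution points listed above, but every signature, method, and the `Greeter` example are hypothetical, not the real `codex-core` API.

```rust
// Hypothetical sketch of a typed extension registry; names and
// signatures are illustrative, not the actual codex-core surface.

/// Contribution point invoked when a thread starts.
trait ThreadStartContributor {
    fn on_thread_start(&self, thread_id: &str) -> Vec<String>;
}

/// Contribution point that injects extra context into a turn.
trait ContextContributor {
    fn contribute_context(&self) -> Vec<String>;
}

/// Host-owned registry, threaded through thread/session startup.
#[derive(Default)]
struct ExtensionRegistry {
    thread_start: Vec<Box<dyn ThreadStartContributor>>,
    context: Vec<Box<dyn ContextContributor>>,
}

impl ExtensionRegistry {
    /// Analogue of `empty_extension_registry()`: nothing registered.
    fn empty() -> Self {
        Self::default()
    }

    fn register_thread_start(&mut self, c: Box<dyn ThreadStartContributor>) {
        self.thread_start.push(c);
    }

    /// Collected by the host at thread startup.
    fn thread_start_notes(&self, thread_id: &str) -> Vec<String> {
        self.thread_start
            .iter()
            .flat_map(|c| c.on_thread_start(thread_id))
            .collect()
    }

    /// Collected by the host when assembling turn context.
    fn context_items(&self) -> Vec<String> {
        self.context
            .iter()
            .flat_map(|c| c.contribute_context())
            .collect()
    }
}

/// Example contributor a feature crate might register.
struct Greeter;

impl ThreadStartContributor for Greeter {
    fn on_thread_start(&self, thread_id: &str) -> Vec<String> {
        vec![format!("greeter saw thread {thread_id}")]
    }
}

fn main() {
    let mut registry = ExtensionRegistry::empty();
    registry.register_thread_start(Box::new(Greeter));
    for note in registry.thread_start_notes("t-1") {
        println!("{note}");
    }
}
```

The point of this shape is that `codex-core` only iterates trait objects; product-specific behavior lives in the contributors, so adding a feature means registering a new implementor rather than cutting a new seam into core.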
## ThreadManager Sample

A small one-shot binary that starts a Codex thread with `ThreadManager` from `codex-core-api`, submits a single user turn, and prints the final assistant message.

```shell
cargo run -p codex-thread-manager-sample -- "Say hello"
```

Use `--model` to override the configured default model:

```shell
cargo run -p codex-thread-manager-sample -- --model gpt-5.2 "Say hello"
```

The prompt can also be piped through stdin:

```shell
printf 'Say hello\n' | cargo run -p codex-thread-manager-sample
```
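The CLI surface above (positional prompt, `--model` override, stdin fallback) can be sketched without touching the thread-manager API itself. The argument handling below is a hand-rolled illustration, not the sample's actual implementation; the call into `ThreadManager` is deliberately stubbed.

```rust
// Illustrative sketch of the sample's CLI handling only. The real
// binary would hand the resolved prompt to ThreadManager from
// codex-core-api; that part is stubbed here.
use std::io::Read;

struct Args {
    model: Option<String>,
    prompt: Option<String>,
}

/// Hand-rolled parsing for `[--model <name>] [prompt]`.
fn parse_args(argv: &[String]) -> Args {
    let mut model = None;
    let mut prompt = None;
    let mut it = argv.iter();
    while let Some(arg) = it.next() {
        if arg == "--model" {
            model = it.next().cloned();
        } else {
            prompt = Some(arg.clone());
        }
    }
    Args { model, prompt }
}

/// Use the positional prompt if given, otherwise read stdin,
/// matching the `printf ... |` usage shown above.
fn resolve_prompt(args: &Args) -> String {
    match &args.prompt {
        Some(p) => p.clone(),
        None => {
            let mut buf = String::new();
            std::io::stdin()
                .read_to_string(&mut buf)
                .expect("failed to read prompt from stdin");
            buf.trim_end().to_string()
        }
    }
}

fn main() {
    let argv: Vec<String> = std::env::args().skip(1).collect();
    let args = parse_args(&argv);
    let prompt = resolve_prompt(&args);
    // Stub: the real sample starts a thread, submits `prompt` as a
    // single user turn, and prints the final assistant message.
    println!("model = {}", args.model.as_deref().unwrap_or("default"));
    println!("prompt = {prompt}");
}
```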