mirror of
https://github.com/openai/codex.git
synced 2026-05-12 23:32:44 +00:00
## Why

After stdio transports and provider-owned defaults exist, Codex needs a config-backed provider that can describe more than the single legacy `CODEX_EXEC_SERVER_URL` remote. This PR adds that provider without activating it in product entrypoints yet, keeping parser/validation review separate from runtime wiring.

**Stack position:** this is PR 4 of 5. It builds on PR 3's provider/default model and adds the `environments.toml` provider used by PR 5.

## What Changed

- Add `environment_toml.rs` as the TOML-specific home for parsing, validation, and provider construction.
- Keep the TOML schema/provider structs private; the public constructor added here is `EnvironmentManager::from_codex_home(...)`.
- Add `TomlEnvironmentProvider`, including validation for:
  - reserved ids such as `local` and `none`
  - duplicate ids
  - unknown explicit defaults
  - empty programs or URLs
  - exactly one of `url` or `program` per configured environment
- Support websocket environments with `url = "ws://..."` / `"wss://..."`.
- Support stdio-command environments with `program = "..."`.
- Add helpers to load `environments.toml` from `CODEX_HOME`, but do not wire entrypoints to call them yet.
- Add the `toml` dependency for parsing.

## Stack

- 1. https://github.com/openai/codex/pull/20663 - Add stdio exec-server listener
- 2. https://github.com/openai/codex/pull/20664 - Add stdio exec-server client transport
- 3. https://github.com/openai/codex/pull/20665 - Make environment providers own default selection
- **4. This PR:** https://github.com/openai/codex/pull/20666 - Add CODEX_HOME environments TOML provider
- 5. https://github.com/openai/codex/pull/20667 - Load configured environments from CODEX_HOME

Split from original draft: https://github.com/openai/codex/pull/20508

## Validation

Not run locally; this was split out of the original draft stack.

## Documentation

This introduces the config shape for `environments.toml`; user-facing documentation should be added before this stack is treated as a documented public workflow.

---------

Co-authored-by: Codex <noreply@openai.com>
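Based on the shape described above, a configured `environments.toml` could look roughly like the sketch below. This is illustrative only: the PR text confirms the `url`/`program` keys, an explicit default selector, and that `local`/`none` are reserved ids, but the exact table names (`environments.*`) and the `default` key name are assumptions.

```toml
# Hypothetical example of ~/.codex/environments.toml; table and key
# names other than `url` and `program` are assumptions, not confirmed schema.
default = "staging"   # must match a configured id, or validation rejects it

# Websocket remote: exactly one of `url` / `program` per environment.
[environments.staging]
url = "wss://exec.example.com/codex"

# Stdio-command environment, spawned locally over stdin/stdout.
[environments.builder]
program = "/usr/local/bin/codex-exec-server"
```

Ids `local` and `none` are reserved and would be rejected, as would an entry setting both `url` and `program`, or neither.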
18 lines
557 B
Python
load("//:defs.bzl", "codex_rust_crate")

codex_rust_crate(
    name = "exec-server",
    crate_name = "codex_exec_server",
    deps_extra = [
        "@crates//:toml",
    ],
    # Keep the crate's integration tests single-threaded under Bazel because
    # they install process-global test-binary dispatch state, and the remote
    # exec-server cases already rely on serialization around the full CLI path.
    integration_test_args = ["--test-threads=1"],
    extra_binaries = [
        "//codex-rs/bwrap:bwrap",
    ],
    test_tags = ["no-sandbox"],
)
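The BUILD file above pulls in the `toml` crate for the new provider. The validation rules listed in the PR description can be sketched in std-only Rust as follows; every type and function name here is hypothetical (the real code lives in `environment_toml.rs` behind private structs), and the entries are assumed to be already parsed from TOML.

```rust
// Hypothetical sketch of the validation rules the PR describes; the names
// `EnvEntry` and `validate` are illustrative, not the actual codex types.
use std::collections::HashSet;

/// One configured environment entry, as it might look after TOML parsing.
struct EnvEntry {
    id: String,
    url: Option<String>,     // websocket remote: must be ws:// or wss://
    program: Option<String>, // stdio command to spawn
}

/// Ids the provider refuses to let a config file claim.
const RESERVED_IDS: &[&str] = &["local", "none"];

fn validate(entries: &[EnvEntry], default_id: Option<&str>) -> Result<(), String> {
    let mut seen = HashSet::new();
    for e in entries {
        // Reserved ids such as `local` and `none` are rejected.
        if RESERVED_IDS.contains(&e.id.as_str()) {
            return Err(format!("reserved id: {}", e.id));
        }
        // Duplicate ids are rejected.
        if !seen.insert(e.id.as_str()) {
            return Err(format!("duplicate id: {}", e.id));
        }
        // Exactly one of `url` / `program`, and neither may be empty.
        match (&e.url, &e.program) {
            (Some(u), None) => {
                if u.is_empty() {
                    return Err(format!("empty url for {}", e.id));
                }
                if !u.starts_with("ws://") && !u.starts_with("wss://") {
                    return Err(format!("url must be ws:// or wss:// for {}", e.id));
                }
            }
            (None, Some(p)) => {
                if p.is_empty() {
                    return Err(format!("empty program for {}", e.id));
                }
            }
            _ => {
                return Err(format!("exactly one of url/program required for {}", e.id));
            }
        }
    }
    // An explicit default must name a configured environment.
    if let Some(d) = default_id {
        if !seen.contains(d) {
            return Err(format!("unknown default id: {}", d));
        }
    }
    Ok(())
}
```

Keeping this logic in one place mirrors the PR's stated goal: the parser and its validation can be reviewed in isolation before any entrypoint is wired to call it.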