Compare commits


44 Commits

Author SHA1 Message Date
jif-oai
58154a1d76 Merge branch 'main' into codex/validate-remote-exec-cwd 2026-04-14 14:27:50 +01:00
jif-oai
e6947f85f6 feat: add context percent to status line (#17637)
Co-authored-by: Codex <noreply@openai.com>
2026-04-14 14:27:24 +01:00
jif-oai
34a9ca083e nit: feature flag (#17777) 2026-04-14 13:44:01 +01:00
Ahmed Ibrahim
2f6fc7c137 Add realtime output modality and transcript events (#17701)
- Add outputModality to thread/realtime/start and wire text/audio output
selection through app-server, core, API, and TUI.
- Rename the realtime transcript delta notification and add a separate
transcript done notification that forwards final text from item done
without correlating it with deltas.
2026-04-14 00:13:13 -07:00
Ahmed Ibrahim
a6b03a22cc Log realtime call location (#17761)
Add a trace-level log for the realtime call Location header when
decoding the call id.
2026-04-13 23:33:51 -07:00
rhan-oai
b704df85b8 [codex-analytics] feature plumbing and emittance (#16640)
---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed
with [ReviewStack](https://reviewstack.dev/openai/codex/pull/16640).
* #16870
* #16706
* #16641
* __->__ #16640
2026-04-13 23:11:49 -07:00
Thibault Sottiaux
05c5829923 [codex] drain mailbox only at request boundaries (#17749)
This changes multi-agent v2 mailbox handling so incoming inter-agent
messages no longer preempt an in-flight sampling stream at reasoning or
commentary output-item boundaries.
2026-04-13 22:09:51 -07:00
pakrym-oai
ad37389c18 [codex] Initialize ICU data for code mode V8 (#17709)
Link ICU data into code mode, otherwise locale-dependent methods cause a
panic and a crash.
2026-04-13 22:01:58 -07:00
pakrym-oai
3b24a9a532 Refactor plugin loading to async (#17747)
Simplifies skills migration.
2026-04-13 21:52:56 -07:00
xli-oai
ff584c5a4b [codex] Refactor marketplace add into shared core flow (#17717)
## Summary

Move `codex marketplace add` onto a shared core implementation so the
CLI and app-server path can use one source of truth.

This change:
- adds shared marketplace-add orchestration in `codex-core`
- switches the CLI command to call that shared implementation
- removes duplicated CLI-only marketplace add helpers
- preserves focused parser and add-path coverage while moving the shared
behavior into core tests

## Why

The new `marketplace/add` RPC should reuse the same underlying
marketplace-add flow as the CLI. This refactor lands that consolidation
first so the follow-up app-server PR can be mostly protocol and handler
wiring.

## Validation

- `cargo test -p codex-core marketplace_add`
- `cargo test -p codex-cli marketplace_cmd`
- `just fix -p codex-core`
- `just fix -p codex-cli`
- `just fmt`
2026-04-13 20:37:11 -07:00
viyatb-oai
d9a385ac8c fix: pin inputs (#17471)
## Summary
- Pin Rust git patch dependencies to immutable revisions and make
cargo-deny reject unknown git and registry sources unless explicitly
allowlisted.
- Add checked-in SHA-256 coverage for the current rusty_v8 release
assets, wire those hashes into Bazel, and verify CI override downloads
before use.
- Add rusty_v8 MODULE.bazel update/check tooling plus a Bazel CI guard
so future V8 bumps cannot drift from the checked-in checksum manifest.
- Pin release/lint cargo installs and all external GitHub Actions refs
to immutable inputs.

## Future V8 bump flow
Run these after updating the resolved `v8` crate version and checksum
manifest:

```bash
python3 .github/scripts/rusty_v8_bazel.py update-module-bazel
python3 .github/scripts/rusty_v8_bazel.py check-module-bazel
```

The update command rewrites the matching `rusty_v8_<crate_version>`
`http_file` SHA-256 values in `MODULE.bazel` from
`third_party/v8/rusty_v8_<crate_version>.sha256`. The check command is
also wired into Bazel CI to block drift.
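The drift check described above can be sketched as a comparison between the manifest's `<sha256>  <filename>` entries and the `sha256` attributes in `MODULE.bazel`. This is a minimal illustration, not the actual `rusty_v8_bazel.py` logic; the regex and function names are assumptions.

```python
import re


def manifest_hashes(manifest_text: str) -> dict[str, str]:
    """Parse `<sha256>  <filename>` lines from a checked-in manifest."""
    hashes = {}
    for line in manifest_text.strip().splitlines():
        digest, name = line.split()
        hashes[name] = digest
    return hashes


def module_bazel_drift(module_text: str, manifest: dict[str, str]) -> list[str]:
    """Return asset names whose sha256 in MODULE.bazel disagrees with the manifest.

    Assumes http_file stanzas of the form:
        urls = [".../librusty_v8_release_<target>.a.gz"], sha256 = "<64 hex chars>"
    """
    drifted = []
    for name, digest in re.findall(
        r'/([^/"]+)"\]\s*,\s*sha256\s*=\s*"([0-9a-f]{64})"', module_text
    ):
        if name in manifest and manifest[name] != digest:
            drifted.append(name)
    return drifted
```

A CI guard built this way fails the build when `module_bazel_drift(...)` is non-empty, forcing bumps to go through the manifest.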

## Notes
- This intentionally excludes RustSec dependency upgrades and
bubblewrap-related changes per request.
- The branch was rebased onto the latest origin/main before opening the
PR.

## Validation
- cargo fetch --locked
- cargo deny check advisories
- cargo deny check
- cargo deny check sources
- python3 .github/scripts/rusty_v8_bazel.py check-module-bazel
- python3 .github/scripts/rusty_v8_bazel.py update-module-bazel
- python3 -m unittest discover -s .github/scripts -p
'test_rusty_v8_bazel.py'
- python3 -m py_compile .github/scripts/rusty_v8_bazel.py
.github/scripts/rusty_v8_module_bazel.py
.github/scripts/test_rusty_v8_bazel.py
- repo-wide GitHub Actions `uses:` audit: all external action refs are
pinned to 40-character SHAs
- yq eval on touched workflows and local actions
- git diff --check
- just bazel-lock-check
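The repo-wide `uses:` audit mentioned above amounts to flagging any external action ref not pinned to a 40-character commit SHA. A rough sketch of that check, assuming the exact script and skip rules differ:

```python
import re

PINNED = re.compile(r"^[0-9a-f]{40}$")


def unpinned_uses(workflow_text: str) -> list[str]:
    """Return external `uses:` refs not pinned to a 40-char commit SHA.

    Local actions (./...) and docker refs are skipped; this mirrors the
    kind of audit described above, not the exact tool used.
    """
    offenders = []
    for match in re.finditer(r"uses:\s*([^\s#]+)", workflow_text):
        ref = match.group(1)
        if ref.startswith("./") or ref.startswith("docker://"):
            continue
        _action, _, rev = ref.partition("@")
        if not PINNED.fullmatch(rev):
            offenders.append(ref)
    return offenders
```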

## Hash verification
- Confirmed `MODULE.bazel` hashes match
`third_party/v8/rusty_v8_146_4_0.sha256`.
- Confirmed GitHub release asset digests for denoland/rusty_v8
`v146.4.0` and openai/codex `rusty-v8-v146.4.0` match the checked-in
hashes.
- Streamed and SHA-256 hashed all 10 `MODULE.bazel` rusty_v8 asset URLs
locally; every downloaded byte stream matched both `MODULE.bazel` and
the checked-in manifest.

## Pin verification
- Confirmed signing-action pins match the peeled commits for their tag
comments: `sigstore/cosign-installer@v3.7.0`, `azure/login@v2`, and
`azure/trusted-signing-action@v0`.
- Pinned the remaining tag-based action refs in Bazel CI/setup:
`actions/setup-node@v6`, `facebook/install-dotslash@v2`,
`bazelbuild/setup-bazelisk@v3`, and `actions/cache/restore@v5`.
- Normalized all `bazelbuild/setup-bazelisk@v3` refs to the peeled
commit behind the annotated tag.
- Audited Cargo git dependencies: every manifest git dependency uses
`rev` only, every `Cargo.lock` git source has `?rev=<sha>#<same-sha>`,
and `cargo deny check sources` passes with `required-git-spec = "rev"`.
- Shallow-fetched each distinct git dependency repo at its pinned SHA
and verified Git reports each object as a commit.
2026-04-14 01:45:41 +00:00
pakrym-oai
0c8f3173e4 [codex] Remove unused Rust helpers (#17146)
## Summary

Removes high-confidence unused Rust helper functions and exports across
`codex-tui`, `codex-shell-command`, and utility crates.

The cleanup includes dead TUI helper methods, unused
path/string/elapsed/fuzzy-match utilities, an unused Windows PowerShell
lookup helper, and the unused terminal palette version counter. This
keeps the remaining public surface smaller without changing behavior.

## Validation

- `just fmt`
- `cargo test -p codex-tui -p codex-shell-command -p codex-utils-elapsed
-p codex-utils-fuzzy-match -p codex-utils-string -p codex-utils-path`
- `just fix -p codex-tui -p codex-shell-command -p codex-utils-elapsed
-p codex-utils-fuzzy-match -p codex-utils-string -p codex-utils-path`
- `git diff --check`
2026-04-13 18:27:00 -07:00
pakrym-oai
f3cbe3d385 [codex] Add symlink flag to fs metadata (#17719)
Add `is_symlink` to FsMetadata struct.
2026-04-13 17:46:56 -07:00
Won Park
495ed22dfb guardian timeout fix pr 3 - ux touch for timeouts (#17557)
This PR teaches the TUI to render guardian review timeouts as explicit
terminal history entries instead of dropping them from the live
timeline.
It adds timeout-specific history cells for command, patch, MCP tool, and
network approval reviews.
It also adds snapshot tests covering both the direct guardian event path
and the app-server notification path.
2026-04-13 17:43:19 -07:00
starr-openai
280a4a6d42 Stabilize exec-server filesystem tests in CI (#17671)
## Summary
- add an exec-server package-local test helper binary that can run
exec-server and fs-helper flows
- route exec-server filesystem tests through that helper instead of
cross-crate codex helper binaries
- stop relying on Bazel-only extra binary wiring for these tests

## Testing
- not run (per repo guidance for codex changes)

---------

Co-authored-by: Codex <noreply@openai.com>
2026-04-13 16:53:42 -07:00
pakrym-oai
d4be06adea Add turn item injection API (#17703)
## Summary
- Add `turn/inject_items` app-server v2 request support for appending
raw Responses API items to a loaded thread history without starting a
turn.
- Generate JSON schema and TypeScript protocol artifacts for the new
params and empty response.
- Document the new endpoint and include a request/response example.
- Preserve compatibility with the typo alias `turn/injet_items` while
returning the canonical method name.
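The alias compatibility described above can be pictured as a small mapping applied at dispatch time. The dispatch shape here is illustrative; only the `turn/injet_items` alias itself comes from the PR.

```python
# Known typo aliases accepted for compatibility, mapped to canonical names.
METHOD_ALIASES = {"turn/injet_items": "turn/inject_items"}


def canonical_method(method: str) -> str:
    """Map an incoming RPC method name to its canonical form."""
    return METHOD_ALIASES.get(method, method)
```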

## Testing
- Not run (not requested)
2026-04-13 16:11:05 -07:00
josiah-openai
937dd3812d Add supports_parallel_tool_calls flag to included mcps (#17667)
## Why

For more advanced MCP usage, we want the model to be able to emit
parallel MCP tool calls and have Codex execute eligible ones
concurrently, instead of forcing all MCP calls through the serial block.

The main design choice was where to thread the config. I made this
server-level because parallel safety depends on the MCP server
implementation. Codex reads the flag from `mcp_servers`, threads the
opted-in server names into `ToolRouter`, and checks the parsed
`ToolPayload::Mcp { server, .. }` at execution time. That avoids relying
on model-visible tool names, which can be incomplete in
deferred/search-tool paths or ambiguous for similarly named
servers/tools.

## What was added

Added `supports_parallel_tool_calls` for MCP servers.

Before:

```toml
[mcp_servers.docs]
command = "docs-server"
```

After:

```toml
[mcp_servers.docs]
command = "docs-server"
supports_parallel_tool_calls = true
```

MCP calls remain serial by default. Only tools from opted-in servers are
eligible to run in parallel. Docs also now warn to enable this only when
the server’s tools are safe to run concurrently, especially around
shared state or read/write races.
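The serial-by-default routing above can be sketched as a partition keyed on the parsed payload's server name, not the model-visible tool name. Type and function names here are illustrative, not the actual `ToolRouter` API.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class McpCall:
    server: str
    tool: str


def partition_calls(
    calls: list[McpCall], opted_in: set[str]
) -> tuple[list[McpCall], list[McpCall]]:
    """Split a turn's MCP calls: calls to servers that set
    supports_parallel_tool_calls may run concurrently; everything
    else stays on the default serial path."""
    parallel = [c for c in calls if c.server in opted_in]
    serial = [c for c in calls if c.server not in opted_in]
    return parallel, serial
```

Keying on the payload's `server` field avoids the ambiguity of tool-name matching noted above.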

## Testing

Tested with a local stdio MCP server exposing real delay tools. The
model/Responses side was mocked only to deterministically emit two MCP
calls in the same turn.

Each test called `query_with_delay` and `query_with_delay_2` with `{
"seconds": 25 }`.

| Build/config | Observed | Wall time |
| --- | --- | --- |
| main with flag enabled | serial | `58.79s` |
| PR with flag enabled | parallel | `31.73s` |
| PR without flag | serial | `56.70s` |

PR with flag enabled showed both tools start before either completed;
main and PR-without-flag completed the first delay before starting the
second.

Also added an integration test.

Additional checks:

- `cargo test -p codex-tools` passed
- `cargo test -p codex-core
mcp_parallel_support_uses_exact_payload_server` passed
- `git diff --check` passed
2026-04-13 15:16:34 -07:00
Ahmed Ibrahim
0e31dc0d4a change realtime tool description (#17699)
2026-04-13 14:31:31 -07:00
Ahmed Ibrahim
ec0133f5f8 Cap realtime mirrored user turns (#17685)
Cap mirrored user text sent to realtime with the existing 300-token turn
budget while preserving the full model turn.

Adds integration coverage for capped realtime mirror payloads.
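The cap above can be sketched as a simple budget-based truncation. Whitespace tokenization is an assumption for illustration; only the 300-token budget comes from the change.

```python
def cap_mirrored_text(text: str, budget: int = 300) -> str:
    """Truncate mirrored user text to a token budget, leaving the
    full text untouched when it already fits.

    Real tokenization is model-specific; str.split() stands in here.
    """
    tokens = text.split()
    if len(tokens) <= budget:
        return text
    return " ".join(tokens[:budget])
```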

---------

Co-authored-by: Codex <noreply@openai.com>
2026-04-13 14:31:18 -07:00
Kevin Liu
ecdd733a48 Remove unnecessary tests (#17395)
2026-04-13 21:02:12 +00:00
Kevin Liu
ec72b1ced9 Update phase 2 memory model to gpt-5.4 (#17384)
### Motivation
- Switch the default model used for memory Phase 2 (consolidation) to
the newer `gpt-5.4` model.

### Description
- Change the Phase 2 model constant from `"gpt-5.3-codex"` to
`"gpt-5.4"` in `codex-rs/core/src/memories/mod.rs`.

### Testing
- Ran `just fmt`, which completed successfully.
- Attempted `cargo test -p codex-core`, but the build failed in this
environment because the `codex-linux-sandbox` crate requires the system
`libcap` pkg-config entry and the required system packages could not be
installed, so the test run was blocked.

------
[Codex
Task](https://chatgpt.com/codex/cloud/tasks/task_i_69d977693b48832a967e78d73c66dc8e)
2026-04-13 20:59:03 +00:00
David Z Hao
7c43f8bb5e Fix tui compilation (#17691)
The recent release build broke; Codex suggested this as the fix.

Source failure:
https://github.com/openai/codex/actions/runs/24362949066/job/71147202092

Probably from
ac82443d07

For why it got in:
```
The relevant setup:

.github/workflows/rust-ci.yml (line 1) runs on PRs, but for codex-rs it only does:

cargo fmt --check
cargo shear
argument-comment lint via Bazel
no cargo check, no cargo clippy over the workspace, no cargo test over codex-tui
.github/workflows/rust-ci-full.yml (line 1) runs on pushes to main and branches matching **full-ci**. That one does compile TUI because:

codex-rs/Cargo.toml includes "tui" as a workspace member
lint_build runs cargo clippy --target ... --tests --profile ...
the matrix includes both dev and release profiles
tests runs cargo nextest run ..., but only dev-profile tests
Release CI also compiles it indirectly. .github/workflows/rust-release.yml (line 235) builds --bin codex, and cli/Cargo.toml (line 46) depends on codex-tui.
```

Codex tested locally with `cargo check -p codex-tui --release` and was
able to repro, and verified that this fixed it
2026-04-13 21:43:33 +01:00
iceweasel-oai
7b5e1ad3dc only specify remote ports when the rule needs them (#17669)
Windows gives an error when you combine `protocol = ANY` with
`SetRemotePorts`
This fixes that
2026-04-13 12:28:26 -07:00
Ruslan Nigmatullin
a5507b59c4 app-server: Only unload threads which were unused for some time (#17398)
Currently app-server may unload actively running threads once the last
connection disconnects, which is not expected.
Instead, track when the last active turn ran and when the thread last had
subscribers, and add a 30-minute idle/no-subscribers timer to reduce the
churn.
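The unload condition can be sketched as requiring both idleness and absence of subscribers for the full timeout. Field and function names are illustrative; only the 30-minute timer and the two tracked timestamps come from the change.

```python
from datetime import datetime, timedelta

IDLE_TIMEOUT = timedelta(minutes=30)


def may_unload(
    last_turn_end: datetime,
    last_subscriber_seen: datetime,
    now: datetime,
    has_subscribers: bool,
) -> bool:
    """Unload a thread only when it has been both idle and unobserved
    for the whole timeout window; never while subscribers are attached."""
    if has_subscribers:
        return False
    idle = now - last_turn_end >= IDLE_TIMEOUT
    unobserved = now - last_subscriber_seen >= IDLE_TIMEOUT
    return idle and unobserved
```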
2026-04-13 12:25:26 -07:00
jif-oai
d905376628 feat: Avoid reloading curated marketplaces for tool-suggest discovera… (#17638)
- stop `list_tool_suggest_discoverable_plugins()` from reloading the
curated marketplace for each discoverable plugin
- reuse a direct plugin-detail loader against the already-resolved
marketplace entry


The trigger was to stop this log spam:
```
d=019d81cf-6f69-7230-98aa-74294ff2dc5a}:submission_dispatch{otel.name="op.dispatch.user_input" submission.id="019d86c8-0a8e-7013-b442-109aabbf75c9" codex.op="user_input"}:turn{otel.name="session_task.turn" thread.id=019d81cf-6f69-7230-98aa-74294ff2dc5a turn.id=019d86c8-0a8e-7013-b442-109aabbf75c9 model=gpt-5.4}: ignoring interface.defaultPrompt: prompt must be at most 128 characters path=/Users/jif/.codex/.tmp/plugins/plugins/life-science-research/.codex-plugin/plugin.json
2026-04-13T12:27:30.402Z WARN  [019d81cf-6f69-7230-98aa-74294ff2dc5a] codex_core::plugins::manifest - session_loop{thread_id=019d81cf-6f69-7230-98aa-74294ff2dc5a}:submission_dispatch{otel.name="op.dispatch.user_input" submission.id="019d86c8-0a8e-7013-b442-109aabbf75c9" codex.op="user_input"}:turn{otel.name="session_task.turn" thread.id=019d81cf-6f69-7230-98aa-74294ff2dc5a turn.id=019d86c8-0a8e-7013-b442-109aabbf75c9 model=gpt-5.4}: ignoring interface.defaultPrompt: prompt must be at most 128 characters path=/Users/jif/.codex/.tmp/plugins/plugins/build-ios-apps/.codex-plugin/plugin.json
2026-04-13T12:27:30.402Z WARN  [019d81cf-6f69-7230-98aa-74294ff2dc5a] codex_core::plugins::manifest - session_loop{thread_id=019d81cf-6f69-7230-98aa-74294ff2dc5a}:submission_dispatch{otel.name="op.dispatch.user_input" submission.id="019d86c8-0a8e-7013-b442-109aabbf75c9" codex.op="user_input"}:turn{otel.name="session_task.turn" thread.id=019d81cf-6f69-7230-98aa-74294ff2dc5a turn.id=019d86c8-0a8e-7013-b442-109aabbf75c9 model=gpt-5.4}: ignoring interface.defaultPrompt: prompt must be at most 128 characters path=/Users/jif/.codex/.tmp/plugins/plugins/life-science-research/.codex-plugin/plugin.json
```
2026-04-13 19:08:43 +00:00
iceweasel-oai
0131f99fd5 Include legacy deny paths in elevated Windows sandbox setup (#17365)
## Summary

This updates the Windows elevated sandbox setup/refresh path to include
the legacy `compute_allow_paths(...).deny` protected children in the
same deny-write payload pipe added for split filesystem carveouts.

Concretely, elevated setup and elevated refresh now both build
deny-write payload paths from:

- explicit split-policy deny-write paths, preserving missing paths so
setup can materialize them before applying ACLs
- legacy `compute_allow_paths(...).deny`, which includes existing
`.git`, `.codex`, and `.agents` children under writable roots

This lets the elevated backend protect `.git` consistently with the
unelevated/restricted-token path, and removes the old janky hard-coded
`.codex` / `.agents` elevated setup helpers in favor of the shared
payload path.

## Root Cause

The landed split-carveout PR threaded a `deny_write_paths` pipe through
elevated setup/refresh, but the legacy workspace-write deny set from
`compute_allow_paths(...).deny` was not included in that payload. As a
result, elevated workspace-write did not apply the intended deny-write
ACLs for existing protected children like `<cwd>/.git`.

## Notes

The legacy protected children still only enter the deny set if they
already exist, because `compute_allow_paths` filters `.git`, `.codex`,
and `.agents` with `exists()`. Missing explicit split-policy deny paths
are preserved separately because setup intentionally materializes those
before applying ACLs.
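The legacy deny-set construction described above can be sketched as collecting protected children under each writable root, filtered on existence. The `exists` parameter is injectable here purely so the sketch is testable; the real `compute_allow_paths` checks the disk.

```python
from pathlib import Path

PROTECTED_CHILDREN = (".git", ".codex", ".agents")


def legacy_deny_paths(writable_roots, exists=Path.exists):
    """Collect protected children under writable roots, keeping only
    those that already exist — mirroring the exists() filter noted
    above, which is why missing split-policy paths need separate
    handling."""
    deny = []
    for root in writable_roots:
        for child in PROTECTED_CHILDREN:
            candidate = root / child
            if exists(candidate):
                deny.append(candidate)
    return deny
```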

## Validation

- `cargo fmt --check -p codex-windows-sandbox`
- `cargo test -p codex-windows-sandbox`
- `cargo build -p codex-cli -p codex-windows-sandbox --bins`
- Elevated `codex exec` smoke with `windows.sandbox='elevated'`: fresh
git repo, attempted append to `.git/config`, observed `Access is
denied`, marker not written, Deny ACE present on `.git`
- Unelevated `codex exec` smoke with `windows.sandbox='unelevated'`:
fresh git repo, attempted append to `.git/config`, observed `Access is
denied`, marker not written, Deny ACE present on `.git`
2026-04-13 10:49:42 -07:00
jif-oai
46a266cd6a feat: disable memory endpoint (#17626) 2026-04-13 18:29:49 +01:00
pakrym-oai
ac82443d07 Use AbsolutePathBuf in skill loading and codex_home (#17407)
Helps with FS migration later
2026-04-13 10:26:51 -07:00
Eric Traut
d25a9822a7 Do not fail thread start when trust persistence fails (#17595)
Addresses #17593

Problem: A regression introduced in
https://github.com/openai/codex/pull/16492 made thread/start fail when
Codex could not persist trusted project state, which crashes startup for
users with read-only config.toml.

Solution: Treat trusted project persistence as best effort and keep the
current thread's config trusted in memory when writing config.toml
fails.
2026-04-13 10:03:21 -07:00
Eric Traut
313ad29ad7 Fix TUI compaction item replay (#17657)
Problem: PR #17601 updated context-compaction replay to call a new
ChatWidget handler, but the handler was never implemented, breaking
codex-tui compilation on main.

Solution: Render context-compaction replay through the existing
info-message path, preserving the intended `Context compacted` UI marker
without adding a one-off handler.
2026-04-13 09:20:10 -07:00
Eric Traut
7c797c6544 Suppress duplicate compaction and terminal wait events (#17601)
Addresses #17514

Problem: PR #16966 made the TUI render the deprecated context-compaction
notification, while v2 could also receive legacy unified-exec
interaction items alongside terminal-interaction notifications, causing
duplicate "Context compacted" and "Waited for background terminal"
messages.

Solution: Suppress deprecated context-compaction notifications and
legacy unified-exec interaction command items from the app-server v2
projection, and render canonical context-compaction items through the
existing TUI info-event path.
2026-04-13 08:59:19 -07:00
Eric Traut
370be363f1 Wrap status reset timestamps in narrow layouts (#17481)
Addresses #17453

Problem: /status rate-limit reset timestamps can be truncated in narrow
layouts, leaving users with partial times or dates.

Solution: Let narrow rate-limit rows drop the fixed progress bar to
preserve the percent summary, and wrap reset timestamps onto
continuation lines instead of truncating them.
2026-04-13 08:53:37 -07:00
Eric Traut
ce5ad7b295 Emit plan-mode prompt notifications for questionnaires (#17417)
Addresses #17252

Problem: Plan-mode clarification questionnaires used the generic
user-input notification type, so configs listening for plan-mode-prompt
did not fire when request_user_input waited for an answer.

Solution: Map request_user_input prompts to the plan-mode-prompt
notification and remove the obsolete user-input TUI notification
variant.
2026-04-13 08:52:14 -07:00
Eric Traut
a5783f90c9 Fix custom tool output cleanup on stream failure (#17470)
Addresses #16255

Problem: Incomplete Responses streams could leave completed custom tool
outputs out of cleanup and retry prompts, making persisted history
inconsistent and retries stale.

Solution: Route stream and output-item errors through shared cleanup,
and rebuild retry prompts from fresh session history after the first
attempt.
2026-04-13 08:35:17 -07:00
friel-openai
776246c3f5 Make forked agent spawns keep parent model config (#17247)
## Summary

When a `spawn_agent` call does a full-history fork, keep the parent's
effective agent type and model configuration instead of applying child
role/model overrides.

This is the minimal config-inheritance slice of #16055. Prompt-cache key
inheritance and MCP tool-surface stability are split into follow-up PRs.

## Design

- Reject `agent_type`, `model`, and `reasoning_effort` for v1
`fork_context` spawns.
- Reject `agent_type`, `model`, and `reasoning_effort` for v2
`fork_turns = "all"` spawns.
- Keep v2 partial-history forks (`fork_turns = "N"`) configurable;
requested model/reasoning overrides and role config still apply there.
- Keep non-forked spawn behavior unchanged.

## Tests

- `cargo +1.93.1 test -p codex-core spawn_agent_fork_context --lib`
- `cargo +1.93.1 test -p codex-core multi_agent_v2_spawn_fork_turns
--lib`
- `cargo +1.93.1 test -p codex-core
multi_agent_v2_spawn_partial_fork_turns_allows_agent_type_override
--lib`
2026-04-13 15:28:40 +00:00
jif-oai
3f62b5cc61 fix: dedup compact (#17643) 2026-04-13 16:08:53 +01:00
jif-oai
7f877548ef Merge branch 'main' into codex/validate-remote-exec-cwd 2026-04-13 15:46:41 +01:00
jif-oai
49ca7c9f24 fix: stability exec server (#17640) 2026-04-13 14:52:12 +01:00
jif-oai
86bd0bc95c nit: change consolidation model (#17633) 2026-04-13 13:02:07 +01:00
jif-oai
4ed82dfdf3 Merge branch 'main' into codex/validate-remote-exec-cwd 2026-04-13 10:56:55 +01:00
jif-oai
bacb92b1d7 Build remote exec env from exec-server policy (#17216)
## Summary
- add an exec-server `envPolicy` field; when present, the server starts
from its own process env and applies the shell environment policy there
- keep `env` as the exact environment for local/embedded starts, but
make it an overlay for remote unified-exec starts
- move the shell-environment-policy builder into `codex-config` so Core
and exec-server share the inherit/filter/set/include behavior
- overlay only runtime/sandbox/network deltas from Core onto the
exec-server-derived env

## Why
Remote unified exec was materializing the shell env inside Core and
forwarding the whole map to exec-server, so remote processes could
inherit the orchestrator machine's `HOME`, `PATH`, etc. This keeps the
base env on the executor while preserving Core-owned runtime additions
like `CODEX_THREAD_ID`, unified-exec defaults, network proxy env, and
sandbox marker env.
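The merge direction above can be pictured as a plain dict overlay: the exec-server's policy-derived env is the base, and only Core-owned runtime deltas are layered on top. Key names are examples; this is a sketch of the direction of the merge, not the actual implementation.

```python
def remote_exec_env(
    server_base_env: dict[str, str], core_overlay: dict[str, str]
) -> dict[str, str]:
    """Start from the exec-server's own policy-derived environment and
    overlay only Core-owned runtime deltas (thread id, proxy, sandbox
    markers), so HOME/PATH come from the executor, not the orchestrator."""
    env = dict(server_base_env)  # executor-local base env
    env.update(core_overlay)     # Core deltas win for the keys it owns
    return env
```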

## Validation
- `just fmt`
- `git diff --check`
- `cargo test -p codex-exec-server --lib`
- `cargo test -p codex-core --lib unified_exec::process_manager::tests`
- `cargo test -p codex-core --lib exec_env::tests`
- `cargo test -p codex-core --lib exec_env_tests` (compile-only; filter
matched 0 tests)
- `cargo test -p codex-config --lib shell_environment` (compile-only;
filter matched 0 tests)
- `just bazel-lock-update`

## Known local validation issue
- `just bazel-lock-check` is not runnable in this checkout: it invokes
`./scripts/check-module-bazel-lock.sh`, which is missing.

---------

Co-authored-by: Codex <noreply@openai.com>
Co-authored-by: pakrym-oai <pakrym@openai.com>
2026-04-13 09:59:08 +01:00
jif-oai
4ffe6c2ce6 feat: ignore keyring on 0.0.0 (#17221)
To prevent this spammy prompt:
<img width="424" height="172" alt="Screenshot 2026-04-09 at 13 36 16"
src="https://github.com/user-attachments/assets/b5ece9e3-c561-422f-87ec-041e7bd6813d"
/>
2026-04-13 09:58:47 +01:00
Eric Traut
6550007cca Stabilize exec-server process tests (#17605)
Problem: After #17294 switched exec-server tests to launch the top-level
`codex exec-server` command, parallel remote exec-process cases can
flake while waiting for the child server's listen URL or transport
shutdown.

Solution: Serialize remote exec-server-backed process tests and harden
the harness so spawned servers are killed on drop and shutdown waits for
the child process to exit.
2026-04-13 00:31:13 -07:00
jif-oai
310be316ff Validate remote exec-server cwd during startup
Fail fast when a remote exec-server cannot see the configured thread cwd. This catches the most dangerous split-brain state before user instructions, skills, or tool processes are resolved from the wrong filesystem.

Co-authored-by: Codex <noreply@openai.com>
2026-04-09 12:15:58 +01:00
307 changed files with 11359 additions and 3246 deletions


@@ -12,7 +12,7 @@ runs:
using: composite
steps:
- name: Install cosign
-        uses: sigstore/cosign-installer@v3.7.0
+        uses: sigstore/cosign-installer@dc72c7d5c4d10cd6bcb8cf6e3fd625a9e5e537da # v3.7.0
- name: Cosign Linux artifacts
shell: bash


@@ -18,7 +18,7 @@ runs:
steps:
- name: Set up Node.js for js_repl tests
if: inputs.install-test-prereqs == 'true'
-        uses: actions/setup-node@v6
+        uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
with:
node-version-file: codex-rs/node-version.txt
@@ -26,7 +26,7 @@ runs:
# See https://github.com/openai/codex/pull/7617.
- name: Install DotSlash
if: inputs.install-test-prereqs == 'true'
-        uses: facebook/install-dotslash@v2
+        uses: facebook/install-dotslash@1e4e7b3e07eaca387acb98f1d4720e0bee8dbb6a # v2
- name: Make DotSlash available in PATH (Unix)
if: inputs.install-test-prereqs == 'true' && runner.os != 'Windows'
@@ -39,7 +39,7 @@ runs:
run: Copy-Item (Get-Command dotslash).Source -Destination "$env:LOCALAPPDATA\Microsoft\WindowsApps\dotslash.exe"
- name: Set up Bazel
-        uses: bazelbuild/setup-bazelisk@v3
+        uses: bazelbuild/setup-bazelisk@b39c379c82683a5f25d34f0d062761f62693e0b2 # v3
- name: Configure Bazel repository cache
id: configure_bazel_repository_cache


@@ -0,0 +1,49 @@
name: setup-rusty-v8-musl
description: Download and verify musl rusty_v8 artifacts for Cargo builds.
inputs:
target:
description: Rust musl target triple.
required: true
runs:
using: composite
steps:
- name: Configure musl rusty_v8 artifact overrides and verify checksums
shell: bash
env:
TARGET: ${{ inputs.target }}
run: |
set -euo pipefail
case "${TARGET}" in
x86_64-unknown-linux-musl|aarch64-unknown-linux-musl)
;;
*)
echo "Unsupported musl rusty_v8 target: ${TARGET}" >&2
exit 1
;;
esac
version="$(python3 "${GITHUB_WORKSPACE}/.github/scripts/rusty_v8_bazel.py" resolved-v8-crate-version)"
release_tag="rusty-v8-v${version}"
base_url="https://github.com/openai/codex/releases/download/${release_tag}"
binding_dir="${RUNNER_TEMP}/rusty_v8"
archive_path="${binding_dir}/librusty_v8_release_${TARGET}.a.gz"
binding_path="${binding_dir}/src_binding_release_${TARGET}.rs"
checksums_path="${binding_dir}/rusty_v8_release_${TARGET}.sha256"
checksums_source="${GITHUB_WORKSPACE}/third_party/v8/rusty_v8_${version//./_}.sha256"
mkdir -p "${binding_dir}"
curl -fsSL "${base_url}/librusty_v8_release_${TARGET}.a.gz" -o "${archive_path}"
curl -fsSL "${base_url}/src_binding_release_${TARGET}.rs" -o "${binding_path}"
grep -E " (librusty_v8_release_${TARGET}[.]a[.]gz|src_binding_release_${TARGET}[.]rs)$" \
"${checksums_source}" > "${checksums_path}"
if [[ "$(wc -l < "${checksums_path}")" -ne 2 ]]; then
echo "Expected exactly two checksums for ${TARGET} in ${checksums_source}" >&2
exit 1
fi
(cd "${binding_dir}" && sha256sum -c "${checksums_path}")
echo "RUSTY_V8_ARCHIVE=${archive_path}" >> "${GITHUB_ENV}"
echo "RUSTY_V8_SRC_BINDING_PATH=${binding_path}" >> "${GITHUB_ENV}"


@@ -27,14 +27,14 @@ runs:
using: composite
steps:
- name: Azure login for Trusted Signing (OIDC)
-        uses: azure/login@v2
+        uses: azure/login@a457da9ea143d694b1b9c7c869ebb04ebe844ef5 # v2
with:
client-id: ${{ inputs.client-id }}
tenant-id: ${{ inputs.tenant-id }}
subscription-id: ${{ inputs.subscription-id }}
- name: Sign Windows binaries with Azure Trusted Signing
-        uses: azure/trusted-signing-action@v0
+        uses: azure/trusted-signing-action@1d365fec12862c4aa68fcac418143d73f0cea293 # v0
with:
endpoint: ${{ inputs.endpoint }}
trusted-signing-account-name: ${{ inputs.account-name }}


@@ -4,6 +4,7 @@ from __future__ import annotations
import argparse
import gzip
import hashlib
import re
import shutil
import subprocess
@@ -12,8 +13,16 @@ import tempfile
import tomllib
from pathlib import Path
from rusty_v8_module_bazel import (
RustyV8ChecksumError,
check_module_bazel,
update_module_bazel,
)
ROOT = Path(__file__).resolve().parents[2]
MODULE_BAZEL = ROOT / "MODULE.bazel"
RUSTY_V8_CHECKSUMS_DIR = ROOT / "third_party" / "v8"
MUSL_RUNTIME_ARCHIVE_LABELS = [
"@llvm//runtimes/libcxx:libcxx.static",
"@llvm//runtimes/libcxx:libcxxabi.static",
@@ -146,6 +155,24 @@ def resolved_v8_crate_version() -> str:
return matches[0]
def rusty_v8_checksum_manifest_path(version: str) -> Path:
return RUSTY_V8_CHECKSUMS_DIR / f"rusty_v8_{version.replace('.', '_')}.sha256"
def command_version(version: str | None) -> str:
if version is not None:
return version
return resolved_v8_crate_version()
def command_manifest_path(manifest: Path | None, version: str) -> Path:
if manifest is None:
return rusty_v8_checksum_manifest_path(version)
if manifest.is_absolute():
return manifest
return ROOT / manifest
def staged_archive_name(target: str, source_path: Path) -> str:
if source_path.suffix == ".lib":
return f"rusty_v8_release_{target}.lib.gz"
@@ -244,8 +271,18 @@ def stage_release_pair(
shutil.copyfile(binding_path, staged_binding)
staged_checksums = output_dir / f"rusty_v8_release_{target}.sha256"
with staged_checksums.open("w", encoding="utf-8") as checksums:
for path in [staged_library, staged_binding]:
digest = hashlib.sha256()
with path.open("rb") as artifact:
for chunk in iter(lambda: artifact.read(1024 * 1024), b""):
digest.update(chunk)
checksums.write(f"{digest.hexdigest()} {path.name}\n")
print(staged_library)
print(staged_binding)
print(staged_checksums)
def parse_args() -> argparse.Namespace:
@@ -264,6 +301,24 @@ def parse_args() -> argparse.Namespace:
subparsers.add_parser("resolved-v8-crate-version")
check_module_bazel_parser = subparsers.add_parser("check-module-bazel")
check_module_bazel_parser.add_argument("--version")
check_module_bazel_parser.add_argument("--manifest", type=Path)
check_module_bazel_parser.add_argument(
"--module-bazel",
type=Path,
default=MODULE_BAZEL,
)
update_module_bazel_parser = subparsers.add_parser("update-module-bazel")
update_module_bazel_parser.add_argument("--version")
update_module_bazel_parser.add_argument("--manifest", type=Path)
update_module_bazel_parser.add_argument(
"--module-bazel",
type=Path,
default=MODULE_BAZEL,
)
return parser.parse_args()
@@ -280,6 +335,22 @@ def main() -> int:
if args.command == "resolved-v8-crate-version":
print(resolved_v8_crate_version())
return 0
if args.command == "check-module-bazel":
version = command_version(args.version)
manifest_path = command_manifest_path(args.manifest, version)
try:
check_module_bazel(args.module_bazel, manifest_path, version)
except RustyV8ChecksumError as exc:
raise SystemExit(str(exc)) from exc
return 0
if args.command == "update-module-bazel":
version = command_version(args.version)
manifest_path = command_manifest_path(args.manifest, version)
try:
update_module_bazel(args.module_bazel, manifest_path, version)
except RustyV8ChecksumError as exc:
raise SystemExit(str(exc)) from exc
return 0
raise SystemExit(f"unsupported command: {args.command}")

.github/scripts/rusty_v8_module_bazel.py vendored Normal file
View File

@@ -0,0 +1,230 @@
#!/usr/bin/env python3
from __future__ import annotations
import re
from dataclasses import dataclass
from pathlib import Path
SHA256_RE = re.compile(r"[0-9a-f]{64}")
HTTP_FILE_BLOCK_RE = re.compile(r"(?ms)^http_file\(\n.*?^\)\n?")
class RustyV8ChecksumError(ValueError):
pass
@dataclass(frozen=True)
class RustyV8HttpFile:
start: int
end: int
block: str
name: str
downloaded_file_path: str
sha256: str | None
def parse_checksum_manifest(path: Path) -> dict[str, str]:
try:
lines = path.read_text(encoding="utf-8").splitlines()
except FileNotFoundError as exc:
raise RustyV8ChecksumError(f"missing checksum manifest: {path}") from exc
checksums: dict[str, str] = {}
for line_number, line in enumerate(lines, 1):
if not line.strip():
continue
parts = line.split()
if len(parts) != 2:
raise RustyV8ChecksumError(
f"{path}:{line_number}: expected '<sha256> <filename>'"
)
checksum, filename = parts
if not SHA256_RE.fullmatch(checksum):
raise RustyV8ChecksumError(
f"{path}:{line_number}: invalid SHA-256 digest for {filename}"
)
if not filename or filename in {".", ".."} or "/" in filename:
raise RustyV8ChecksumError(
f"{path}:{line_number}: expected a bare artifact filename"
)
if filename in checksums:
raise RustyV8ChecksumError(
f"{path}:{line_number}: duplicate checksum for {filename}"
)
checksums[filename] = checksum
if not checksums:
raise RustyV8ChecksumError(f"empty checksum manifest: {path}")
return checksums
def string_field(block: str, field: str) -> str | None:
# Matches one-line string fields inside http_file blocks, e.g. `sha256 = "...",`.
match = re.search(rf'^\s*{re.escape(field)}\s*=\s*"([^"]+)",\s*$', block, re.M)
if match:
return match.group(1)
return None
def rusty_v8_http_files(module_bazel: str, version: str) -> list[RustyV8HttpFile]:
version_slug = version.replace(".", "_")
name_prefix = f"rusty_v8_{version_slug}_"
entries = []
for match in HTTP_FILE_BLOCK_RE.finditer(module_bazel):
block = match.group(0)
name = string_field(block, "name")
if not name or not name.startswith(name_prefix):
continue
downloaded_file_path = string_field(block, "downloaded_file_path")
if not downloaded_file_path:
raise RustyV8ChecksumError(
f"MODULE.bazel {name} is missing downloaded_file_path"
)
entries.append(
RustyV8HttpFile(
start=match.start(),
end=match.end(),
block=block,
name=name,
downloaded_file_path=downloaded_file_path,
sha256=string_field(block, "sha256"),
)
)
return entries
def module_entry_set_errors(
entries: list[RustyV8HttpFile],
checksums: dict[str, str],
version: str,
) -> list[str]:
errors = []
if not entries:
errors.append(f"MODULE.bazel has no rusty_v8 http_file entries for {version}")
return errors
module_files: dict[str, RustyV8HttpFile] = {}
duplicate_files = set()
for entry in entries:
if entry.downloaded_file_path in module_files:
duplicate_files.add(entry.downloaded_file_path)
module_files[entry.downloaded_file_path] = entry
for filename in sorted(duplicate_files):
errors.append(f"MODULE.bazel has duplicate http_file entries for {filename}")
for filename in sorted(set(module_files) - set(checksums)):
entry = module_files[filename]
errors.append(f"MODULE.bazel {entry.name} has no checksum in the manifest")
for filename in sorted(set(checksums) - set(module_files)):
errors.append(f"manifest has {filename}, but MODULE.bazel has no http_file")
return errors
def module_checksum_errors(
entries: list[RustyV8HttpFile],
checksums: dict[str, str],
) -> list[str]:
errors = []
for entry in entries:
expected = checksums.get(entry.downloaded_file_path)
if expected is None:
continue
if entry.sha256 is None:
errors.append(f"MODULE.bazel {entry.name} is missing sha256")
elif entry.sha256 != expected:
errors.append(
f"MODULE.bazel {entry.name} has sha256 {entry.sha256}, "
f"expected {expected}"
)
return errors
def raise_checksum_errors(message: str, errors: list[str]) -> None:
if errors:
formatted_errors = "\n".join(f"- {error}" for error in errors)
raise RustyV8ChecksumError(f"{message}:\n{formatted_errors}")
def check_module_bazel_text(
module_bazel: str,
checksums: dict[str, str],
version: str,
) -> None:
entries = rusty_v8_http_files(module_bazel, version)
errors = [
*module_entry_set_errors(entries, checksums, version),
*module_checksum_errors(entries, checksums),
]
raise_checksum_errors("rusty_v8 MODULE.bazel checksum drift", errors)
def block_with_sha256(block: str, checksum: str) -> str:
sha256_line_re = re.compile(r'(?m)^(\s*)sha256\s*=\s*"[0-9a-f]+",\s*$')
if sha256_line_re.search(block):
return sha256_line_re.sub(
lambda match: f'{match.group(1)}sha256 = "{checksum}",',
block,
count=1,
)
downloaded_file_path_match = re.search(
r'(?m)^(\s*)downloaded_file_path\s*=\s*"[^"]+",\n',
block,
)
if not downloaded_file_path_match:
raise RustyV8ChecksumError("http_file block is missing downloaded_file_path")
insert_at = downloaded_file_path_match.end()
indent = downloaded_file_path_match.group(1)
return f'{block[:insert_at]}{indent}sha256 = "{checksum}",\n{block[insert_at:]}'
def update_module_bazel_text(
module_bazel: str,
checksums: dict[str, str],
version: str,
) -> str:
entries = rusty_v8_http_files(module_bazel, version)
errors = module_entry_set_errors(entries, checksums, version)
raise_checksum_errors("cannot update rusty_v8 MODULE.bazel checksums", errors)
updated = []
previous_end = 0
for entry in entries:
updated.append(module_bazel[previous_end : entry.start])
updated.append(
block_with_sha256(entry.block, checksums[entry.downloaded_file_path])
)
previous_end = entry.end
updated.append(module_bazel[previous_end:])
return "".join(updated)
def check_module_bazel(
module_bazel_path: Path,
manifest_path: Path,
version: str,
) -> None:
checksums = parse_checksum_manifest(manifest_path)
module_bazel = module_bazel_path.read_text(encoding="utf-8")
check_module_bazel_text(module_bazel, checksums, version)
print(f"{module_bazel_path} rusty_v8 {version} checksums match {manifest_path}")
def update_module_bazel(
module_bazel_path: Path,
manifest_path: Path,
version: str,
) -> None:
checksums = parse_checksum_manifest(manifest_path)
module_bazel = module_bazel_path.read_text(encoding="utf-8")
updated_module_bazel = update_module_bazel_text(module_bazel, checksums, version)
if updated_module_bazel == module_bazel:
print(f"{module_bazel_path} rusty_v8 {version} checksums are already current")
return
module_bazel_path.write_text(updated_module_bazel, encoding="utf-8")
print(f"updated {module_bazel_path} rusty_v8 {version} checksums")

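The replace branch of `block_with_sha256` above works by substituting the existing `sha256 = "…",` line in place while preserving its indentation. A minimal standalone sketch of that substitution, with hypothetical all-zeros/all-ones digests:

```python
import re

# One-line sha256 field inside an http_file block, capturing its indent.
SHA256_LINE_RE = re.compile(r'(?m)^(\s*)sha256\s*=\s*"[0-9a-f]+",\s*$')

def replace_sha256(block: str, checksum: str) -> str:
    """Swap an existing sha256 line inside an http_file block, keeping indent."""
    return SHA256_LINE_RE.sub(
        lambda m: f'{m.group(1)}sha256 = "{checksum}",', block, count=1
    )

block = (
    'http_file(\n'
    '    name = "example_archive",\n'
    '    sha256 = "' + "0" * 64 + '",\n'
    ')\n'
)
updated = replace_sha256(block, "1" * 64)
print(updated)
```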
.github/scripts/test_rusty_v8_bazel.py vendored Normal file
View File

@@ -0,0 +1,126 @@
#!/usr/bin/env python3
from __future__ import annotations
import textwrap
import unittest
import rusty_v8_module_bazel
class RustyV8BazelTest(unittest.TestCase):
def test_update_module_bazel_replaces_and_inserts_sha256(self) -> None:
module_bazel = textwrap.dedent(
"""\
http_file(
name = "rusty_v8_146_4_0_x86_64_unknown_linux_gnu_archive",
downloaded_file_path = "librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
sha256 = "0000000000000000000000000000000000000000000000000000000000000000",
urls = [
"https://example.test/librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
],
)
http_file(
name = "rusty_v8_146_4_0_x86_64_unknown_linux_musl_binding",
downloaded_file_path = "src_binding_release_x86_64-unknown-linux-musl.rs",
urls = [
"https://example.test/src_binding_release_x86_64-unknown-linux-musl.rs",
],
)
http_file(
name = "rusty_v8_145_0_0_x86_64_unknown_linux_gnu_archive",
downloaded_file_path = "librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
sha256 = "ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
urls = [
"https://example.test/old.gz",
],
)
"""
)
checksums = {
"librusty_v8_release_x86_64-unknown-linux-gnu.a.gz": (
"1111111111111111111111111111111111111111111111111111111111111111"
),
"src_binding_release_x86_64-unknown-linux-musl.rs": (
"2222222222222222222222222222222222222222222222222222222222222222"
),
}
updated = rusty_v8_module_bazel.update_module_bazel_text(
module_bazel,
checksums,
"146.4.0",
)
self.assertEqual(
textwrap.dedent(
"""\
http_file(
name = "rusty_v8_146_4_0_x86_64_unknown_linux_gnu_archive",
downloaded_file_path = "librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
sha256 = "1111111111111111111111111111111111111111111111111111111111111111",
urls = [
"https://example.test/librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
],
)
http_file(
name = "rusty_v8_146_4_0_x86_64_unknown_linux_musl_binding",
downloaded_file_path = "src_binding_release_x86_64-unknown-linux-musl.rs",
sha256 = "2222222222222222222222222222222222222222222222222222222222222222",
urls = [
"https://example.test/src_binding_release_x86_64-unknown-linux-musl.rs",
],
)
http_file(
name = "rusty_v8_145_0_0_x86_64_unknown_linux_gnu_archive",
downloaded_file_path = "librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
sha256 = "ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
urls = [
"https://example.test/old.gz",
],
)
"""
),
updated,
)
rusty_v8_module_bazel.check_module_bazel_text(updated, checksums, "146.4.0")
def test_check_module_bazel_rejects_manifest_drift(self) -> None:
module_bazel = textwrap.dedent(
"""\
http_file(
name = "rusty_v8_146_4_0_x86_64_unknown_linux_gnu_archive",
downloaded_file_path = "librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
sha256 = "1111111111111111111111111111111111111111111111111111111111111111",
urls = [
"https://example.test/librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
],
)
"""
)
checksums = {
"librusty_v8_release_x86_64-unknown-linux-gnu.a.gz": (
"1111111111111111111111111111111111111111111111111111111111111111"
),
"orphan.gz": (
"2222222222222222222222222222222222222222222222222222222222222222"
),
}
with self.assertRaisesRegex(
rusty_v8_module_bazel.RustyV8ChecksumError,
"manifest has orphan.gz",
):
rusty_v8_module_bazel.check_module_bazel_text(
module_bazel,
checksums,
"146.4.0",
)
if __name__ == "__main__":
unittest.main()

View File

@@ -51,6 +51,13 @@ jobs:
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Check rusty_v8 MODULE.bazel checksums
if: matrix.os == 'ubuntu-24.04' && matrix.target == 'x86_64-unknown-linux-gnu'
shell: bash
run: |
python3 .github/scripts/rusty_v8_bazel.py check-module-bazel
python3 -m unittest discover -s .github/scripts -p test_rusty_v8_bazel.py
- name: Set up Bazel CI
id: setup_bazel
uses: ./.github/actions/setup-bazel-ci
@@ -65,7 +72,7 @@ jobs:
- name: Restore bazel repository cache
id: cache_bazel_repository_restore
continue-on-error: true
uses: actions/cache/restore@v5
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5
with:
path: ${{ steps.setup_bazel.outputs.repository-cache-path }}
key: bazel-cache-${{ matrix.target }}-${{ hashFiles('MODULE.bazel', 'codex-rs/Cargo.lock', 'codex-rs/Cargo.toml') }}
@@ -168,7 +175,7 @@ jobs:
- name: Restore bazel repository cache
id: cache_bazel_repository_restore
continue-on-error: true
uses: actions/cache/restore@v5
uses: actions/cache/restore@668228422ae6a00e4ad889ee87cd7109ec5666a7 # v5
with:
path: ${{ steps.setup_bazel.outputs.repository-cache-path }}
key: bazel-cache-${{ matrix.target }}-${{ hashFiles('MODULE.bazel', 'codex-rs/Cargo.lock', 'codex-rs/Cargo.toml') }}

View File

@@ -43,6 +43,9 @@ jobs:
argument_comment_lint_package:
name: Argument comment lint package
runs-on: ubuntu-24.04
env:
CARGO_DYLINT_VERSION: 5.0.0
DYLINT_LINK_VERSION: 5.0.0
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- uses: dtolnay/rust-toolchain@a0b273b48ed29de4470960879e8381ff45632f26 # 1.93.0
@@ -59,10 +62,13 @@ jobs:
~/.cargo/registry/index
~/.cargo/registry/cache
~/.cargo/git/db
key: argument-comment-lint-${{ runner.os }}-${{ hashFiles('tools/argument-comment-lint/Cargo.lock', 'tools/argument-comment-lint/rust-toolchain', '.github/workflows/rust-ci.yml', '.github/workflows/rust-ci-full.yml') }}
key: argument-comment-lint-${{ runner.os }}-${{ env.CARGO_DYLINT_VERSION }}-${{ env.DYLINT_LINK_VERSION }}-${{ hashFiles('tools/argument-comment-lint/Cargo.lock', 'tools/argument-comment-lint/rust-toolchain', '.github/workflows/rust-ci.yml', '.github/workflows/rust-ci-full.yml') }}
- name: Install cargo-dylint tooling
if: ${{ steps.cargo_dylint_cache.outputs.cache-hit != 'true' }}
run: cargo install --locked cargo-dylint dylint-link
shell: bash
run: |
cargo install --locked cargo-dylint --version "$CARGO_DYLINT_VERSION"
cargo install --locked dylint-link --version "$DYLINT_LINK_VERSION"
- name: Check Python wrapper syntax
run: python3 -m py_compile tools/argument-comment-lint/wrapper_common.py tools/argument-comment-lint/run.py tools/argument-comment-lint/run-prebuilt-linter.py tools/argument-comment-lint/test_wrapper_common.py
- name: Test Python wrapper helpers
@@ -415,22 +421,10 @@ jobs:
echo "CXXFLAGS=${cxxflags}" >> "$GITHUB_ENV"
- if: ${{ matrix.target == 'x86_64-unknown-linux-musl' || matrix.target == 'aarch64-unknown-linux-musl' }}
name: Configure musl rusty_v8 artifact overrides
env:
TARGET: ${{ matrix.target }}
shell: bash
run: |
set -euo pipefail
version="$(python3 "${GITHUB_WORKSPACE}/.github/scripts/rusty_v8_bazel.py" resolved-v8-crate-version)"
release_tag="rusty-v8-v${version}"
base_url="https://github.com/openai/codex/releases/download/${release_tag}"
archive="https://github.com/openai/codex/releases/download/rusty-v8-v${version}/librusty_v8_release_${TARGET}.a.gz"
binding_dir="${RUNNER_TEMP}/rusty_v8"
binding_path="${binding_dir}/src_binding_release_${TARGET}.rs"
mkdir -p "${binding_dir}"
curl -fsSL "${base_url}/src_binding_release_${TARGET}.rs" -o "${binding_path}"
echo "RUSTY_V8_ARCHIVE=${archive}" >> "$GITHUB_ENV"
echo "RUSTY_V8_SRC_BINDING_PATH=${binding_path}" >> "$GITHUB_ENV"
name: Configure musl rusty_v8 artifact overrides and verify checksums
uses: ./.github/actions/setup-rusty-v8-musl
with:
target: ${{ matrix.target }}
- name: Install cargo-chef
if: ${{ matrix.profile == 'release' }}

View File

@@ -90,6 +90,9 @@ jobs:
runs-on: ubuntu-24.04
needs: changed
if: ${{ needs.changed.outputs.argument_comment_lint_package == 'true' }}
env:
CARGO_DYLINT_VERSION: 5.0.0
DYLINT_LINK_VERSION: 5.0.0
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- uses: dtolnay/rust-toolchain@a0b273b48ed29de4470960879e8381ff45632f26 # 1.93.0
@@ -113,10 +116,13 @@ jobs:
~/.cargo/registry/index
~/.cargo/registry/cache
~/.cargo/git/db
key: argument-comment-lint-${{ runner.os }}-${{ hashFiles('tools/argument-comment-lint/Cargo.lock', 'tools/argument-comment-lint/rust-toolchain', '.github/workflows/rust-ci.yml', '.github/workflows/rust-ci-full.yml') }}
key: argument-comment-lint-${{ runner.os }}-${{ env.CARGO_DYLINT_VERSION }}-${{ env.DYLINT_LINK_VERSION }}-${{ hashFiles('tools/argument-comment-lint/Cargo.lock', 'tools/argument-comment-lint/rust-toolchain', '.github/workflows/rust-ci.yml', '.github/workflows/rust-ci-full.yml') }}
- name: Install cargo-dylint tooling
if: ${{ steps.cargo_dylint_cache.outputs.cache-hit != 'true' }}
run: cargo install --locked cargo-dylint dylint-link
shell: bash
run: |
cargo install --locked cargo-dylint --version "$CARGO_DYLINT_VERSION"
cargo install --locked dylint-link --version "$DYLINT_LINK_VERSION"
- name: Check Python wrapper syntax
run: python3 -m py_compile tools/argument-comment-lint/wrapper_common.py tools/argument-comment-lint/run.py tools/argument-comment-lint/run-prebuilt-linter.py tools/argument-comment-lint/test_wrapper_common.py
- name: Test Python wrapper helpers

View File

@@ -19,6 +19,9 @@ jobs:
name: Build - ${{ matrix.runner }} - ${{ matrix.target }}
runs-on: ${{ matrix.runs_on || matrix.runner }}
timeout-minutes: 60
env:
CARGO_DYLINT_VERSION: 5.0.0
DYLINT_LINK_VERSION: 5.0.0
strategy:
fail-fast: false
@@ -65,8 +68,8 @@ jobs:
shell: bash
run: |
install_root="${RUNNER_TEMP}/argument-comment-lint-tools"
cargo install --locked cargo-dylint --root "$install_root"
cargo install --locked dylint-link
cargo install --locked cargo-dylint --version "$CARGO_DYLINT_VERSION" --root "$install_root"
cargo install --locked dylint-link --version "$DYLINT_LINK_VERSION"
echo "INSTALL_ROOT=$install_root" >> "$GITHUB_ENV"
- name: Cargo build

View File

@@ -211,22 +211,10 @@ jobs:
echo "CXXFLAGS=${cxxflags}" >> "$GITHUB_ENV"
- if: ${{ matrix.target == 'x86_64-unknown-linux-musl' || matrix.target == 'aarch64-unknown-linux-musl' }}
name: Configure musl rusty_v8 artifact overrides
env:
TARGET: ${{ matrix.target }}
shell: bash
run: |
set -euo pipefail
version="$(python3 "${GITHUB_WORKSPACE}/.github/scripts/rusty_v8_bazel.py" resolved-v8-crate-version)"
release_tag="rusty-v8-v${version}"
base_url="https://github.com/openai/codex/releases/download/${release_tag}"
archive="https://github.com/openai/codex/releases/download/rusty-v8-v${version}/librusty_v8_release_${TARGET}.a.gz"
binding_dir="${RUNNER_TEMP}/rusty_v8"
binding_path="${binding_dir}/src_binding_release_${TARGET}.rs"
mkdir -p "${binding_dir}"
curl -fsSL "${base_url}/src_binding_release_${TARGET}.rs" -o "${binding_path}"
echo "RUSTY_V8_ARCHIVE=${archive}" >> "$GITHUB_ENV"
echo "RUSTY_V8_SRC_BINDING_PATH=${binding_path}" >> "$GITHUB_ENV"
name: Configure musl rusty_v8 artifact overrides and verify checksums
uses: ./.github/actions/setup-rusty-v8-musl
with:
target: ${{ matrix.target }}
- name: Cargo build
shell: bash

View File

@@ -78,7 +78,7 @@ jobs:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Bazel
uses: bazelbuild/setup-bazelisk@6ecf4fd8b7d1f9721785f1dd656a689acf9add47 # v3
uses: bazelbuild/setup-bazelisk@b39c379c82683a5f25d34f0d062761f62693e0b2 # v3
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6

View File

@@ -75,7 +75,7 @@ jobs:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Bazel
uses: bazelbuild/setup-bazelisk@6ecf4fd8b7d1f9721785f1dd656a689acf9add47 # v3
uses: bazelbuild/setup-bazelisk@b39c379c82683a5f25d34f0d062761f62693e0b2 # v3
- name: Set up Python
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6

View File

@@ -423,6 +423,7 @@ http_archive(
http_file(
name = "rusty_v8_146_4_0_aarch64_apple_darwin_archive",
downloaded_file_path = "librusty_v8_release_aarch64-apple-darwin.a.gz",
sha256 = "bfe2c9be32a56c28546f0f965825ee68fbf606405f310cc4e17b448a568cf98a",
urls = [
"https://github.com/denoland/rusty_v8/releases/download/v146.4.0/librusty_v8_release_aarch64-apple-darwin.a.gz",
],
@@ -431,6 +432,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_aarch64_unknown_linux_gnu_archive",
downloaded_file_path = "librusty_v8_release_aarch64-unknown-linux-gnu.a.gz",
sha256 = "dbf165b07c81bdb054bc046b43d23e69fcf7bcc1a4c1b5b4776983a71062ecd8",
urls = [
"https://github.com/denoland/rusty_v8/releases/download/v146.4.0/librusty_v8_release_aarch64-unknown-linux-gnu.a.gz",
],
@@ -439,6 +441,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_aarch64_pc_windows_msvc_archive",
downloaded_file_path = "rusty_v8_release_aarch64-pc-windows-msvc.lib.gz",
sha256 = "ed13363659c6d08583ac8fdc40493445c5767d8b94955a4d5d7bb8d5a81f6bf8",
urls = [
"https://github.com/denoland/rusty_v8/releases/download/v146.4.0/rusty_v8_release_aarch64-pc-windows-msvc.lib.gz",
],
@@ -447,6 +450,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_x86_64_apple_darwin_archive",
downloaded_file_path = "librusty_v8_release_x86_64-apple-darwin.a.gz",
sha256 = "630cd240f1bbecdb071417dc18387ab81cf67c549c1c515a0b4fcf9eba647bb7",
urls = [
"https://github.com/denoland/rusty_v8/releases/download/v146.4.0/librusty_v8_release_x86_64-apple-darwin.a.gz",
],
@@ -455,6 +459,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_x86_64_unknown_linux_gnu_archive",
downloaded_file_path = "librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
sha256 = "e64b4d99e4ae293a2e846244a89b80178ba10382c13fb591c1fa6968f5291153",
urls = [
"https://github.com/denoland/rusty_v8/releases/download/v146.4.0/librusty_v8_release_x86_64-unknown-linux-gnu.a.gz",
],
@@ -463,6 +468,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_x86_64_pc_windows_msvc_archive",
downloaded_file_path = "rusty_v8_release_x86_64-pc-windows-msvc.lib.gz",
sha256 = "90a9a2346acd3685a355e98df85c24dbe406cb124367d16259a4b5d522621862",
urls = [
"https://github.com/denoland/rusty_v8/releases/download/v146.4.0/rusty_v8_release_x86_64-pc-windows-msvc.lib.gz",
],
@@ -471,6 +477,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_aarch64_unknown_linux_musl_archive",
downloaded_file_path = "librusty_v8_release_aarch64-unknown-linux-musl.a.gz",
sha256 = "27a08ed26c34297bfd93e514692ccc44b85f8b15c6aa39cf34e784f84fb37e8e",
urls = [
"https://github.com/openai/codex/releases/download/rusty-v8-v146.4.0/librusty_v8_release_aarch64-unknown-linux-musl.a.gz",
],
@@ -479,6 +486,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_aarch64_unknown_linux_musl_binding",
downloaded_file_path = "src_binding_release_aarch64-unknown-linux-musl.rs",
sha256 = "09f8900ced8297c229246c7a50b2e0ec23c54d0a554f369619cc29863f38dd1a",
urls = [
"https://github.com/openai/codex/releases/download/rusty-v8-v146.4.0/src_binding_release_aarch64-unknown-linux-musl.rs",
],
@@ -487,6 +495,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_x86_64_unknown_linux_musl_archive",
downloaded_file_path = "librusty_v8_release_x86_64-unknown-linux-musl.a.gz",
sha256 = "20d8271ad712323d352c1383c36e3c4b755abc41ece35819c49c75ec7134d2f8",
urls = [
"https://github.com/openai/codex/releases/download/rusty-v8-v146.4.0/librusty_v8_release_x86_64-unknown-linux-musl.a.gz",
],
@@ -495,6 +504,7 @@ http_file(
http_file(
name = "rusty_v8_146_4_0_x86_64_unknown_linux_musl_binding",
downloaded_file_path = "src_binding_release_x86_64-unknown-linux-musl.rs",
sha256 = "09f8900ced8297c229246c7a50b2e0ec23c54d0a554f369619cc29863f38dd1a",
urls = [
"https://github.com/openai/codex/releases/download/rusty-v8-v146.4.0/src_binding_release_x86_64-unknown-linux-musl.rs",
],

MODULE.bazel.lock generated
View File

@@ -782,6 +782,7 @@
"debugid_0.8.0": "{\"dependencies\":[{\"name\":\"serde\",\"optional\":true,\"req\":\"^1.0.85\"},{\"kind\":\"dev\",\"name\":\"serde_json\",\"req\":\"^1.0.37\"},{\"name\":\"uuid\",\"req\":\"^1.0.0\"}],\"features\":{}}",
"debugserver-types_0.5.0": "{\"dependencies\":[{\"name\":\"schemafy\",\"req\":\"^0.5.0\"},{\"features\":[\"derive\"],\"name\":\"serde\",\"req\":\"^1.0\"},{\"name\":\"serde_json\",\"req\":\"^1.0\"}],\"features\":{}}",
"deflate64_0.1.10": "{\"dependencies\":[{\"features\":[\"derive\"],\"kind\":\"dev\",\"name\":\"bytemuck\",\"req\":\"^1.13.1\"},{\"kind\":\"dev\",\"name\":\"proptest\",\"req\":\"^1.2.0\"},{\"kind\":\"dev\",\"name\":\"tempfile\",\"req\":\"^3.7.1\"}],\"features\":{}}",
"deno_core_icudata_0.77.0": "{\"dependencies\":[],\"features\":{}}",
"der-parser_10.0.0": "{\"dependencies\":[{\"name\":\"asn1-rs\",\"req\":\"^0.7\"},{\"name\":\"bitvec\",\"optional\":true,\"req\":\"^1.0\"},{\"name\":\"cookie-factory\",\"optional\":true,\"req\":\"^0.3.0\"},{\"default_features\":false,\"name\":\"displaydoc\",\"req\":\"^0.2\"},{\"kind\":\"dev\",\"name\":\"hex-literal\",\"req\":\"^0.4\"},{\"name\":\"nom\",\"req\":\"^7.0\"},{\"name\":\"num-bigint\",\"optional\":true,\"req\":\"^0.4\"},{\"name\":\"num-traits\",\"req\":\"^0.2\"},{\"kind\":\"dev\",\"name\":\"pretty_assertions\",\"req\":\"^1.0\"},{\"name\":\"rusticata-macros\",\"req\":\"^4.0\"},{\"kind\":\"dev\",\"name\":\"test-case\",\"req\":\"^3.0\"}],\"features\":{\"as_bitvec\":[\"bitvec\"],\"bigint\":[\"num-bigint\"],\"default\":[\"std\"],\"serialize\":[\"std\",\"cookie-factory\"],\"std\":[],\"unstable\":[]}}",
"der_0.7.10": "{\"dependencies\":[{\"features\":[\"derive\"],\"name\":\"arbitrary\",\"optional\":true,\"req\":\"^1.3\"},{\"default_features\":false,\"name\":\"bytes\",\"optional\":true,\"req\":\"^1\"},{\"name\":\"const-oid\",\"optional\":true,\"req\":\"^0.9.2\"},{\"name\":\"der_derive\",\"optional\":true,\"req\":\"^0.7.2\"},{\"name\":\"flagset\",\"optional\":true,\"req\":\"^0.4.3\"},{\"kind\":\"dev\",\"name\":\"hex-literal\",\"req\":\"^0.4.1\"},{\"features\":[\"alloc\"],\"name\":\"pem-rfc7468\",\"optional\":true,\"req\":\"^0.7\"},{\"kind\":\"dev\",\"name\":\"proptest\",\"req\":\"^1\"},{\"default_features\":false,\"name\":\"time\",\"optional\":true,\"req\":\"^0.3.4\"},{\"default_features\":false,\"name\":\"zeroize\",\"optional\":true,\"req\":\"^1.5\"}],\"features\":{\"alloc\":[\"zeroize?/alloc\"],\"arbitrary\":[\"dep:arbitrary\",\"const-oid?/arbitrary\",\"std\"],\"bytes\":[\"dep:bytes\",\"alloc\"],\"derive\":[\"dep:der_derive\"],\"oid\":[\"dep:const-oid\"],\"pem\":[\"dep:pem-rfc7468\",\"alloc\",\"zeroize\"],\"real\":[],\"std\":[\"alloc\"]}}",
"deranged_0.5.5": "{\"dependencies\":[{\"name\":\"deranged-macros\",\"optional\":true,\"req\":\"=0.3.0\"},{\"default_features\":false,\"name\":\"num-traits\",\"optional\":true,\"req\":\"^0.2.15\"},{\"default_features\":false,\"name\":\"powerfmt\",\"optional\":true,\"req\":\"^0.2.0\"},{\"default_features\":false,\"name\":\"quickcheck\",\"optional\":true,\"req\":\"^1.0.3\"},{\"default_features\":false,\"name\":\"rand08\",\"optional\":true,\"package\":\"rand\",\"req\":\"^0.8.4\"},{\"kind\":\"dev\",\"name\":\"rand08\",\"package\":\"rand\",\"req\":\"^0.8.4\"},{\"default_features\":false,\"name\":\"rand09\",\"optional\":true,\"package\":\"rand\",\"req\":\"^0.9.0\"},{\"kind\":\"dev\",\"name\":\"rand09\",\"package\":\"rand\",\"req\":\"^0.9.0\"},{\"default_features\":false,\"name\":\"serde_core\",\"optional\":true,\"req\":\"^1.0.220\"},{\"kind\":\"dev\",\"name\":\"serde_json\",\"req\":\"^1.0.86\"}],\"features\":{\"alloc\":[],\"default\":[],\"macros\":[\"dep:deranged-macros\"],\"num\":[\"dep:num-traits\"],\"powerfmt\":[\"dep:powerfmt\"],\"quickcheck\":[\"dep:quickcheck\",\"alloc\"],\"rand\":[\"rand08\",\"rand09\"],\"rand08\":[\"dep:rand08\"],\"rand09\":[\"dep:rand09\"],\"serde\":[\"dep:serde_core\"]}}",
@@ -903,8 +904,8 @@
"git+https://github.com/juberti-oai/rust-sdks.git?rev=e2d1d1d230c6fc9df171ccb181423f957bb3c1f0#e2d1d1d230c6fc9df171ccb181423f957bb3c1f0_livekit-runtime": "{\"dependencies\":[{\"default_features\":true,\"features\":[],\"name\":\"async-io\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"async-std\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"async-task\",\"optional\":true},{\"name\":\"futures\",\"optional\":true},{\"default_features\":false,\"features\":[\"net\",\"rt\",\"rt-multi-thread\",\"time\"],\"name\":\"tokio\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"tokio-stream\",\"optional\":true}],\"features\":{\"async\":[\"dep:async-std\",\"dep:futures\",\"dep:async-io\"],\"default\":[\"tokio\"],\"dispatcher\":[\"dep:futures\",\"dep:async-io\",\"dep:async-std\",\"dep:async-task\"],\"tokio\":[\"dep:tokio\",\"dep:tokio-stream\"]},\"strip_prefix\":\"livekit-runtime\"}",
"git+https://github.com/juberti-oai/rust-sdks.git?rev=e2d1d1d230c6fc9df171ccb181423f957bb3c1f0#e2d1d1d230c6fc9df171ccb181423f957bb3c1f0_webrtc-sys": "{\"dependencies\":[{\"name\":\"cxx\"},{\"name\":\"log\"},{\"kind\":\"build\",\"name\":\"cc\"},{\"kind\":\"build\",\"name\":\"cxx-build\"},{\"kind\":\"build\",\"name\":\"glob\"},{\"kind\":\"build\",\"name\":\"pkg-config\"},{\"default_features\":true,\"features\":[],\"kind\":\"build\",\"name\":\"webrtc-sys-build\",\"optional\":false}],\"features\":{\"default\":[]},\"strip_prefix\":\"webrtc-sys\"}",
"git+https://github.com/juberti-oai/rust-sdks.git?rev=e2d1d1d230c6fc9df171ccb181423f957bb3c1f0#e2d1d1d230c6fc9df171ccb181423f957bb3c1f0_webrtc-sys-build": "{\"dependencies\":[{\"name\":\"anyhow\"},{\"name\":\"fs2\"},{\"name\":\"regex\"},{\"default_features\":false,\"features\":[\"rustls-tls-native-roots\",\"blocking\"],\"name\":\"reqwest\",\"optional\":false},{\"name\":\"scratch\"},{\"name\":\"semver\"},{\"name\":\"zip\"}],\"features\":{},\"strip_prefix\":\"webrtc-sys/build\"}",
"git+https://github.com/nornagon/crossterm?branch=nornagon%2Fcolor-query#87db8bfa6dc99427fd3b071681b07fc31c6ce995_crossterm": "{\"dependencies\":[{\"default_features\":true,\"features\":[],\"name\":\"bitflags\",\"optional\":false},{\"default_features\":false,\"features\":[],\"name\":\"futures-core\",\"optional\":true},{\"name\":\"parking_lot\"},{\"default_features\":true,\"features\":[\"derive\"],\"name\":\"serde\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"filedescriptor\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":false,\"features\":[],\"name\":\"libc\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":true,\"features\":[\"os-poll\"],\"name\":\"mio\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":false,\"features\":[\"std\",\"stdio\",\"termios\"],\"name\":\"rustix\",\"optional\":false,\"target\":\"cfg(unix)\"},{\"default_features\":true,\"features\":[],\"name\":\"signal-hook\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":true,\"features\":[\"support-v1_0\"],\"name\":\"signal-hook-mio\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":true,\"features\":[],\"name\":\"crossterm_winapi\",\"optional\":true,\"target\":\"cfg(windows)\"},{\"default_features\":true,\"features\":[\"winuser\",\"winerror\"],\"name\":\"winapi\",\"optional\":true,\"target\":\"cfg(windows)\"}],\"features\":{\"bracketed-paste\":[],\"default\":[\"bracketed-paste\",\"windows\",\"events\"],\"event-stream\":[\"dep:futures-core\",\"events\"],\"events\":[\"dep:mio\",\"dep:signal-hook\",\"dep:signal-hook-mio\"],\"serde\":[\"dep:serde\",\"bitflags/serde\"],\"use-dev-tty\":[\"filedescriptor\",\"rustix/process\"],\"windows\":[\"dep:winapi\",\"dep:crossterm_winapi\"]},\"strip_prefix\":\"\"}",
"git+https://github.com/nornagon/ratatui?branch=nornagon-v0.29.0-patch#9b2ad1298408c45918ee9f8241a6f95498cdbed2_ratatui": "{\"dependencies\":[{\"name\":\"bitflags\"},{\"name\":\"cassowary\"},{\"name\":\"compact_str\"},{\"default_features\":true,\"features\":[],\"name\":\"crossterm\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"document-features\",\"optional\":true},{\"name\":\"indoc\"},{\"name\":\"instability\"},{\"name\":\"itertools\"},{\"name\":\"lru\"},{\"default_features\":true,\"features\":[],\"name\":\"palette\",\"optional\":true},{\"name\":\"paste\"},{\"default_features\":true,\"features\":[\"derive\"],\"name\":\"serde\",\"optional\":true},{\"default_features\":true,\"features\":[\"derive\"],\"name\":\"strum\",\"optional\":false},{\"default_features\":true,\"features\":[],\"name\":\"termwiz\",\"optional\":true},{\"default_features\":true,\"features\":[\"local-offset\"],\"name\":\"time\",\"optional\":true},{\"name\":\"unicode-segmentation\"},{\"name\":\"unicode-truncate\"},{\"name\":\"unicode-width\"},{\"default_features\":true,\"features\":[],\"name\":\"termion\",\"optional\":true,\"target\":\"cfg(not(windows))\"}],\"features\":{\"all-widgets\":[\"widget-calendar\"],\"crossterm\":[\"dep:crossterm\"],\"default\":[\"crossterm\",\"underline-color\"],\"macros\":[],\"palette\":[\"dep:palette\"],\"scrolling-regions\":[],\"serde\":[\"dep:serde\",\"bitflags/serde\",\"compact_str/serde\"],\"termion\":[\"dep:termion\"],\"termwiz\":[\"dep:termwiz\"],\"underline-color\":[\"dep:crossterm\"],\"unstable\":[\"unstable-rendered-line-info\",\"unstable-widget-ref\",\"unstable-backend-writer\"],\"unstable-backend-writer\":[],\"unstable-rendered-line-info\":[],\"unstable-widget-ref\":[],\"widget-calendar\":[\"dep:time\"]},\"strip_prefix\":\"\"}",
"git+https://github.com/nornagon/crossterm?rev=87db8bfa6dc99427fd3b071681b07fc31c6ce995#87db8bfa6dc99427fd3b071681b07fc31c6ce995_crossterm": "{\"dependencies\":[{\"default_features\":true,\"features\":[],\"name\":\"bitflags\",\"optional\":false},{\"default_features\":false,\"features\":[],\"name\":\"futures-core\",\"optional\":true},{\"name\":\"parking_lot\"},{\"default_features\":true,\"features\":[\"derive\"],\"name\":\"serde\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"filedescriptor\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":false,\"features\":[],\"name\":\"libc\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":true,\"features\":[\"os-poll\"],\"name\":\"mio\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":false,\"features\":[\"std\",\"stdio\",\"termios\"],\"name\":\"rustix\",\"optional\":false,\"target\":\"cfg(unix)\"},{\"default_features\":true,\"features\":[],\"name\":\"signal-hook\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":true,\"features\":[\"support-v1_0\"],\"name\":\"signal-hook-mio\",\"optional\":true,\"target\":\"cfg(unix)\"},{\"default_features\":true,\"features\":[],\"name\":\"crossterm_winapi\",\"optional\":true,\"target\":\"cfg(windows)\"},{\"default_features\":true,\"features\":[\"winuser\",\"winerror\"],\"name\":\"winapi\",\"optional\":true,\"target\":\"cfg(windows)\"}],\"features\":{\"bracketed-paste\":[],\"default\":[\"bracketed-paste\",\"windows\",\"events\"],\"event-stream\":[\"dep:futures-core\",\"events\"],\"events\":[\"dep:mio\",\"dep:signal-hook\",\"dep:signal-hook-mio\"],\"serde\":[\"dep:serde\",\"bitflags/serde\"],\"use-dev-tty\":[\"filedescriptor\",\"rustix/process\"],\"windows\":[\"dep:winapi\",\"dep:crossterm_winapi\"]},\"strip_prefix\":\"\"}",
"git+https://github.com/nornagon/ratatui?rev=9b2ad1298408c45918ee9f8241a6f95498cdbed2#9b2ad1298408c45918ee9f8241a6f95498cdbed2_ratatui": "{\"dependencies\":[{\"name\":\"bitflags\"},{\"name\":\"cassowary\"},{\"name\":\"compact_str\"},{\"default_features\":true,\"features\":[],\"name\":\"crossterm\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"document-features\",\"optional\":true},{\"name\":\"indoc\"},{\"name\":\"instability\"},{\"name\":\"itertools\"},{\"name\":\"lru\"},{\"default_features\":true,\"features\":[],\"name\":\"palette\",\"optional\":true},{\"name\":\"paste\"},{\"default_features\":true,\"features\":[\"derive\"],\"name\":\"serde\",\"optional\":true},{\"default_features\":true,\"features\":[\"derive\"],\"name\":\"strum\",\"optional\":false},{\"default_features\":true,\"features\":[],\"name\":\"termwiz\",\"optional\":true},{\"default_features\":true,\"features\":[\"local-offset\"],\"name\":\"time\",\"optional\":true},{\"name\":\"unicode-segmentation\"},{\"name\":\"unicode-truncate\"},{\"name\":\"unicode-width\"},{\"default_features\":true,\"features\":[],\"name\":\"termion\",\"optional\":true,\"target\":\"cfg(not(windows))\"}],\"features\":{\"all-widgets\":[\"widget-calendar\"],\"crossterm\":[\"dep:crossterm\"],\"default\":[\"crossterm\",\"underline-color\"],\"macros\":[],\"palette\":[\"dep:palette\"],\"scrolling-regions\":[],\"serde\":[\"dep:serde\",\"bitflags/serde\",\"compact_str/serde\"],\"termion\":[\"dep:termion\"],\"termwiz\":[\"dep:termwiz\"],\"underline-color\":[\"dep:crossterm\"],\"unstable\":[\"unstable-rendered-line-info\",\"unstable-widget-ref\",\"unstable-backend-writer\"],\"unstable-backend-writer\":[],\"unstable-rendered-line-info\":[],\"unstable-widget-ref\":[],\"widget-calendar\":[\"dep:time\"]},\"strip_prefix\":\"\"}",
"git+https://github.com/openai-oss-forks/tokio-tungstenite?rev=132f5b39c862e3a970f731d709608b3e6276d5f6#132f5b39c862e3a970f731d709608b3e6276d5f6_tokio-tungstenite": "{\"dependencies\":[{\"default_features\":false,\"features\":[\"sink\",\"std\"],\"name\":\"futures-util\",\"optional\":false},{\"name\":\"log\"},{\"default_features\":true,\"features\":[],\"name\":\"native-tls-crate\",\"optional\":true,\"package\":\"native-tls\"},{\"default_features\":false,\"features\":[],\"name\":\"rustls\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"rustls-native-certs\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"rustls-pki-types\",\"optional\":true},{\"default_features\":false,\"features\":[\"io-util\"],\"name\":\"tokio\",\"optional\":false},{\"default_features\":true,\"features\":[],\"name\":\"tokio-native-tls\",\"optional\":true},{\"default_features\":false,\"features\":[],\"name\":\"tokio-rustls\",\"optional\":true},{\"default_features\":false,\"features\":[],\"name\":\"tungstenite\",\"optional\":false},{\"default_features\":true,\"features\":[],\"name\":\"webpki-roots\",\"optional\":true}],\"features\":{\"__rustls-tls\":[\"rustls\",\"rustls-pki-types\",\"tokio-rustls\",\"stream\",\"tungstenite/__rustls-tls\",\"handshake\"],\"connect\":[\"stream\",\"tokio/net\",\"handshake\"],\"default\":[\"connect\",\"handshake\"],\"handshake\":[\"tungstenite/handshake\"],\"native-tls\":[\"native-tls-crate\",\"tokio-native-tls\",\"stream\",\"tungstenite/native-tls\",\"handshake\"],\"native-tls-vendored\":[\"native-tls\",\"native-tls-crate/vendored\",\"tungstenite/native-tls-vendored\"],\"proxy\":[\"tungstenite/proxy\",\"tokio/net\",\"handshake\"],\"rustls-tls-native-roots\":[\"__rustls-tls\",\"rustls-native-certs\"],\"rustls-tls-webpki-roots\":[\"__rustls-tls\",\"webpki-roots\"],\"stream\":[],\"url\":[\"tungstenite/url\"]},\"strip_prefix\":\"\"}",
"git+https://github.com/openai-oss-forks/tungstenite-rs?rev=9200079d3b54a1ff51072e24d81fd354f085156f#9200079d3b54a1ff51072e24d81fd354f085156f_tungstenite": "{\"dependencies\":[{\"name\":\"bytes\"},{\"default_features\":true,\"features\":[],\"name\":\"data-encoding\",\"optional\":true},{\"default_features\":false,\"features\":[\"zlib\"],\"name\":\"flate2\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"headers\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"http\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"httparse\",\"optional\":true},{\"name\":\"log\"},{\"default_features\":true,\"features\":[],\"name\":\"native-tls-crate\",\"optional\":true,\"package\":\"native-tls\"},{\"name\":\"rand\"},{\"default_features\":false,\"features\":[\"std\"],\"name\":\"rustls\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"rustls-native-certs\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"rustls-pki-types\",\"optional\":true},{\"default_features\":true,\"features\":[],\"name\":\"sha1\",\"optional\":true},{\"name\":\"thiserror\"},{\"default_features\":true,\"features\":[],\"name\":\"url\",\"optional\":true},{\"name\":\"utf-8\"},{\"default_features\":true,\"features\":[],\"name\":\"webpki-roots\",\"optional\":true}],\"features\":{\"__rustls-tls\":[\"rustls\",\"rustls-pki-types\"],\"default\":[\"handshake\"],\"deflate\":[\"headers\",\"flate2\"],\"handshake\":[\"data-encoding\",\"headers\",\"httparse\",\"sha1\"],\"headers\":[\"http\",\"dep:headers\"],\"native-tls\":[\"native-tls-crate\"],\"native-tls-vendored\":[\"native-tls\",\"native-tls-crate/vendored\"],\"proxy\":[\"handshake\"],\"rustls-tls-native-roots\":[\"__rustls-tls\",\"rustls-native-certs\"],\"rustls-tls-webpki-roots\":[\"__rustls-tls\",\"webpki-roots\"],\"url\":[\"dep:url\"]},\"strip_prefix\":\"\"}",
"git+https://github.com/rust-lang/rust-clippy?rev=20ce69b9a63bcd2756cd906fe0964d1e901e042a#20ce69b9a63bcd2756cd906fe0964d1e901e042a_clippy_utils": "{\"dependencies\":[{\"default_features\":false,\"features\":[],\"name\":\"arrayvec\",\"optional\":false},{\"name\":\"itertools\"},{\"name\":\"rustc_apfloat\"},{\"default_features\":true,\"features\":[\"derive\"],\"name\":\"serde\",\"optional\":false}],\"features\":{},\"strip_prefix\":\"clippy_utils\"}",


@@ -1,6 +1,9 @@
[advisories]
# Reviewed 2026-04-11. Keep this list in sync with ../deny.toml.
ignore = [
"RUSTSEC-2024-0388", # derivative 2.2.0 via starlark; upstream crate is unmaintained
"RUSTSEC-2025-0057", # fxhash 0.2.1 via starlark_map; upstream crate is unmaintained
"RUSTSEC-2024-0436", # paste 1.0.15 via starlark/ratatui; upstream crate is unmaintained
"RUSTSEC-2024-0320", # yaml-rust via syntect; remove when syntect drops or updates it
"RUSTSEC-2025-0141", # bincode via syntect; remove when syntect drops or updates it
]

codex-rs/Cargo.lock generated

@@ -1828,6 +1828,7 @@ name = "codex-code-mode"
version = "0.0.0"
dependencies = [
"async-trait",
"deno_core_icudata",
"pretty_assertions",
"serde",
"serde_json",
@@ -1902,7 +1903,6 @@ dependencies = [
"codex-api",
"codex-app-server-protocol",
"codex-apply-patch",
"codex-arg0",
"codex-async-utils",
"codex-code-mode",
"codex-config",
@@ -1932,6 +1932,7 @@ dependencies = [
"codex-shell-escalation",
"codex-state",
"codex-terminal-detection",
"codex-test-binary-support",
"codex-tools",
"codex-utils-absolute-path",
"codex-utils-cache",
@@ -2098,15 +2099,18 @@ dependencies = [
"async-trait",
"base64 0.22.1",
"codex-app-server-protocol",
"codex-config",
"codex-protocol",
"codex-sandboxing",
"codex-test-binary-support",
"codex-utils-absolute-path",
"codex-utils-cargo-bin",
"codex-utils-pty",
"ctor 0.6.3",
"futures",
"pretty_assertions",
"serde",
"serde_json",
"serial_test",
"tempfile",
"test-case",
"thiserror 2.0.18",
@@ -2349,6 +2353,7 @@ dependencies = [
"codex-plugin",
"codex-protocol",
"codex-rmcp-client",
"codex-utils-absolute-path",
"codex-utils-plugins",
"futures",
"pretty_assertions",
@@ -2810,6 +2815,14 @@ dependencies = [
"tracing",
]
[[package]]
name = "codex-test-binary-support"
version = "0.0.0"
dependencies = [
"codex-arg0",
"tempfile",
]
[[package]]
name = "codex-tools"
version = "0.0.0"
@@ -2987,6 +3000,7 @@ version = "0.0.0"
name = "codex-utils-home-dir"
version = "0.0.0"
dependencies = [
"codex-utils-absolute-path",
"dirs",
"pretty_assertions",
"tempfile",
@@ -3523,7 +3537,7 @@ checksum = "d0a5c400df2834b80a4c3327b3aad3a4c4cd4de0629063962b03235697506a28"
[[package]]
name = "crossterm"
version = "0.28.1"
source = "git+https://github.com/nornagon/crossterm?branch=nornagon%2Fcolor-query#87db8bfa6dc99427fd3b071681b07fc31c6ce995"
source = "git+https://github.com/nornagon/crossterm?rev=87db8bfa6dc99427fd3b071681b07fc31c6ce995#87db8bfa6dc99427fd3b071681b07fc31c6ce995"
dependencies = [
"bitflags 2.10.0",
"crossterm_winapi",
@@ -3886,6 +3900,12 @@ version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "26bf8fc351c5ed29b5c2f0cbbac1b209b74f60ecd62e675a998df72c49af5204"
[[package]]
name = "deno_core_icudata"
version = "0.77.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a9efff8990a82c1ae664292507e1a5c6749ddd2312898cdf9cd7cb1fd4bc64c6"
[[package]]
name = "der"
version = "0.7.10"
@@ -8450,7 +8470,7 @@ dependencies = [
[[package]]
name = "ratatui"
version = "0.29.0"
source = "git+https://github.com/nornagon/ratatui?branch=nornagon-v0.29.0-patch#9b2ad1298408c45918ee9f8241a6f95498cdbed2"
source = "git+https://github.com/nornagon/ratatui?rev=9b2ad1298408c45918ee9f8241a6f95498cdbed2#9b2ad1298408c45918ee9f8241a6f95498cdbed2"
dependencies = [
"bitflags 2.10.0",
"cassowary",


@@ -86,6 +86,7 @@ members = [
"codex-api",
"state",
"terminal-detection",
"test-binary-support",
"codex-experimental-api-macros",
"plugin",
]
@@ -163,6 +164,7 @@ codex-skills = { path = "skills" }
codex-state = { path = "state" }
codex-stdio-to-uds = { path = "stdio-to-uds" }
codex-terminal-detection = { path = "terminal-detection" }
codex-test-binary-support = { path = "test-binary-support" }
codex-tools = { path = "tools" }
codex-tui = { path = "tui" }
codex-utils-absolute-path = { path = "utils/absolute-path" }
@@ -218,6 +220,7 @@ crossbeam-channel = "0.5.15"
crossterm = "0.28.1"
csv = "1.3.1"
ctor = "0.6.3"
deno_core_icudata = "0.77.0"
derive_more = "2"
diffy = "0.4.2"
dirs = "6"
@@ -427,8 +430,8 @@ opt-level = 0
[patch.crates-io]
# Uncomment to debug local changes.
# ratatui = { path = "../../ratatui" }
crossterm = { git = "https://github.com/nornagon/crossterm", branch = "nornagon/color-query" }
ratatui = { git = "https://github.com/nornagon/ratatui", branch = "nornagon-v0.29.0-patch" }
crossterm = { git = "https://github.com/nornagon/crossterm", rev = "87db8bfa6dc99427fd3b071681b07fc31c6ce995" }
ratatui = { git = "https://github.com/nornagon/ratatui", rev = "9b2ad1298408c45918ee9f8241a6f95498cdbed2" }
tokio-tungstenite = { git = "https://github.com/openai-oss-forks/tokio-tungstenite", rev = "132f5b39c862e3a970f731d709608b3e6276d5f6" }
tungstenite = { git = "https://github.com/openai-oss-forks/tungstenite-rs", rev = "9200079d3b54a1ff51072e24d81fd354f085156f" }

File diff suppressed because it is too large.


@@ -4,6 +4,7 @@ use crate::events::TrackEventRequest;
use crate::events::TrackEventsRequest;
use crate::events::current_runtime_metadata;
use crate::facts::AnalyticsFact;
use crate::facts::AnalyticsJsonRpcError;
use crate::facts::AppInvocation;
use crate::facts::AppMentionedInput;
use crate::facts::AppUsedInput;
@@ -14,9 +15,15 @@ use crate::facts::SkillInvocation;
use crate::facts::SkillInvokedInput;
use crate::facts::SubAgentThreadStartedInput;
use crate::facts::TrackEventsContext;
use crate::facts::TurnResolvedConfigFact;
use crate::facts::TurnTokenUsageFact;
use crate::reducer::AnalyticsReducer;
use codex_app_server_protocol::ClientRequest;
use codex_app_server_protocol::ClientResponse;
use codex_app_server_protocol::InitializeParams;
use codex_app_server_protocol::JSONRPCErrorError;
use codex_app_server_protocol::RequestId;
use codex_app_server_protocol::ServerNotification;
use codex_login::AuthManager;
use codex_login::default_client::create_client;
use codex_plugin::PluginTelemetryMetadata;
@@ -167,6 +174,14 @@ impl AnalyticsEventsClient {
)));
}
pub fn track_request(&self, connection_id: u64, request_id: RequestId, request: ClientRequest) {
self.record_fact(AnalyticsFact::Request {
connection_id,
request_id,
request: Box::new(request),
});
}
pub fn track_app_used(&self, tracking: TrackEventsContext, app: AppInvocation) {
if !self.queue.should_enqueue_app_used(&tracking, &app) {
return;
@@ -191,6 +206,18 @@ impl AnalyticsEventsClient {
)));
}
pub fn track_turn_resolved_config(&self, fact: TurnResolvedConfigFact) {
self.record_fact(AnalyticsFact::Custom(
CustomAnalyticsFact::TurnResolvedConfig(Box::new(fact)),
));
}
pub fn track_turn_token_usage(&self, fact: TurnTokenUsageFact) {
self.record_fact(AnalyticsFact::Custom(CustomAnalyticsFact::TurnTokenUsage(
Box::new(fact),
)));
}
pub fn track_plugin_installed(&self, plugin: PluginTelemetryMetadata) {
self.record_fact(AnalyticsFact::Custom(
CustomAnalyticsFact::PluginStateChanged(PluginStateChangedInput {
@@ -240,6 +267,25 @@ impl AnalyticsEventsClient {
response: Box::new(response),
});
}
pub fn track_error_response(
&self,
connection_id: u64,
request_id: RequestId,
error: JSONRPCErrorError,
error_type: Option<AnalyticsJsonRpcError>,
) {
self.record_fact(AnalyticsFact::ErrorResponse {
connection_id,
request_id,
error,
error_type,
});
}
pub fn track_notification(&self, notification: ServerNotification) {
self.record_fact(AnalyticsFact::Notification(Box::new(notification)));
}
}
async fn send_track_events(


@@ -3,7 +3,13 @@ use crate::facts::CodexCompactionEvent;
use crate::facts::InvocationType;
use crate::facts::PluginState;
use crate::facts::SubAgentThreadStartedInput;
use crate::facts::ThreadInitializationMode;
use crate::facts::TrackEventsContext;
use crate::facts::TurnStatus;
use crate::facts::TurnSteerRejectionReason;
use crate::facts::TurnSteerResult;
use crate::facts::TurnSubmissionType;
use codex_app_server_protocol::CodexErrorInfo;
use codex_login::default_client::originator;
use codex_plugin::PluginTelemetryMetadata;
use codex_protocol::approvals::NetworkApprovalProtocol;
@@ -21,14 +27,6 @@ pub enum AppServerRpcTransport {
InProcess,
}
#[derive(Clone, Copy, Debug, Serialize)]
#[serde(rename_all = "snake_case")]
pub(crate) enum ThreadInitializationMode {
New,
Forked,
Resumed,
}
#[derive(Serialize)]
pub(crate) struct TrackEventsRequest {
pub(crate) events: Vec<TrackEventRequest>,
@@ -43,6 +41,8 @@ pub(crate) enum TrackEventRequest {
AppMentioned(CodexAppMentionedEventRequest),
AppUsed(CodexAppUsedEventRequest),
Compaction(Box<CodexCompactionEventRequest>),
TurnEvent(Box<CodexTurnEventRequest>),
TurnSteer(CodexTurnSteerEventRequest),
PluginUsed(CodexPluginUsedEventRequest),
PluginInstalled(CodexPluginEventRequest),
PluginUninstalled(CodexPluginEventRequest),
@@ -330,6 +330,84 @@ pub(crate) struct CodexCompactionEventRequest {
pub(crate) event_params: CodexCompactionEventParams,
}
#[derive(Serialize)]
pub(crate) struct CodexTurnEventParams {
pub(crate) thread_id: String,
pub(crate) turn_id: String,
// TODO(rhan-oai): Populate once queued/default submission type is plumbed from
// the turn/start callsites instead of always being reported as None.
pub(crate) submission_type: Option<TurnSubmissionType>,
pub(crate) app_server_client: CodexAppServerClientMetadata,
pub(crate) runtime: CodexRuntimeMetadata,
pub(crate) ephemeral: bool,
pub(crate) thread_source: Option<String>,
pub(crate) initialization_mode: ThreadInitializationMode,
pub(crate) subagent_source: Option<String>,
pub(crate) parent_thread_id: Option<String>,
pub(crate) model: Option<String>,
pub(crate) model_provider: String,
pub(crate) sandbox_policy: Option<&'static str>,
pub(crate) reasoning_effort: Option<String>,
pub(crate) reasoning_summary: Option<String>,
pub(crate) service_tier: String,
pub(crate) approval_policy: String,
pub(crate) approvals_reviewer: String,
pub(crate) sandbox_network_access: bool,
pub(crate) collaboration_mode: Option<&'static str>,
pub(crate) personality: Option<String>,
pub(crate) num_input_images: usize,
pub(crate) is_first_turn: bool,
pub(crate) status: Option<TurnStatus>,
pub(crate) turn_error: Option<CodexErrorInfo>,
pub(crate) steer_count: Option<usize>,
// TODO(rhan-oai): Populate these once tool-call accounting is emitted from
// core; the schema is reserved but these fields are currently always None.
pub(crate) total_tool_call_count: Option<usize>,
pub(crate) shell_command_count: Option<usize>,
pub(crate) file_change_count: Option<usize>,
pub(crate) mcp_tool_call_count: Option<usize>,
pub(crate) dynamic_tool_call_count: Option<usize>,
pub(crate) subagent_tool_call_count: Option<usize>,
pub(crate) web_search_count: Option<usize>,
pub(crate) image_generation_count: Option<usize>,
pub(crate) input_tokens: Option<i64>,
pub(crate) cached_input_tokens: Option<i64>,
pub(crate) output_tokens: Option<i64>,
pub(crate) reasoning_output_tokens: Option<i64>,
pub(crate) total_tokens: Option<i64>,
pub(crate) duration_ms: Option<u64>,
pub(crate) started_at: Option<u64>,
pub(crate) completed_at: Option<u64>,
}
#[derive(Serialize)]
pub(crate) struct CodexTurnEventRequest {
pub(crate) event_type: &'static str,
pub(crate) event_params: CodexTurnEventParams,
}
#[derive(Serialize)]
pub(crate) struct CodexTurnSteerEventParams {
pub(crate) thread_id: String,
pub(crate) expected_turn_id: Option<String>,
pub(crate) accepted_turn_id: Option<String>,
pub(crate) app_server_client: CodexAppServerClientMetadata,
pub(crate) runtime: CodexRuntimeMetadata,
pub(crate) thread_source: Option<String>,
pub(crate) subagent_source: Option<String>,
pub(crate) parent_thread_id: Option<String>,
pub(crate) num_input_images: usize,
pub(crate) result: TurnSteerResult,
pub(crate) rejection_reason: Option<TurnSteerRejectionReason>,
pub(crate) created_at: u64,
}
#[derive(Serialize)]
pub(crate) struct CodexTurnSteerEventRequest {
pub(crate) event_type: &'static str,
pub(crate) event_params: CodexTurnSteerEventParams,
}
#[derive(Serialize)]
pub(crate) struct CodexPluginMetadata {
pub(crate) plugin_id: Option<String>,


@@ -4,11 +4,22 @@ use crate::events::GuardianReviewEventParams;
use codex_app_server_protocol::ClientRequest;
use codex_app_server_protocol::ClientResponse;
use codex_app_server_protocol::InitializeParams;
use codex_app_server_protocol::JSONRPCErrorError;
use codex_app_server_protocol::RequestId;
use codex_app_server_protocol::ServerNotification;
use codex_plugin::PluginTelemetryMetadata;
use codex_protocol::config_types::ApprovalsReviewer;
use codex_protocol::config_types::ModeKind;
use codex_protocol::config_types::Personality;
use codex_protocol::config_types::ReasoningSummary;
use codex_protocol::config_types::ServiceTier;
use codex_protocol::openai_models::ReasoningEffort;
use codex_protocol::protocol::AskForApproval;
use codex_protocol::protocol::SandboxPolicy;
use codex_protocol::protocol::SessionSource;
use codex_protocol::protocol::SkillScope;
use codex_protocol::protocol::SubAgentSource;
use codex_protocol::protocol::TokenUsage;
use serde::Serialize;
use std::path::PathBuf;
@@ -31,6 +42,126 @@ pub fn build_track_events_context(
}
}
#[derive(Clone, Copy, Debug, Serialize)]
#[serde(rename_all = "snake_case")]
pub enum TurnSubmissionType {
Default,
Queued,
}
#[derive(Clone)]
pub struct TurnResolvedConfigFact {
pub turn_id: String,
pub thread_id: String,
pub num_input_images: usize,
pub submission_type: Option<TurnSubmissionType>,
pub ephemeral: bool,
pub session_source: SessionSource,
pub model: String,
pub model_provider: String,
pub sandbox_policy: SandboxPolicy,
pub reasoning_effort: Option<ReasoningEffort>,
pub reasoning_summary: Option<ReasoningSummary>,
pub service_tier: Option<ServiceTier>,
pub approval_policy: AskForApproval,
pub approvals_reviewer: ApprovalsReviewer,
pub sandbox_network_access: bool,
pub collaboration_mode: ModeKind,
pub personality: Option<Personality>,
pub is_first_turn: bool,
}
#[derive(Clone, Copy, Debug, Serialize)]
#[serde(rename_all = "snake_case")]
pub enum ThreadInitializationMode {
New,
Forked,
Resumed,
}
#[derive(Clone)]
pub struct TurnTokenUsageFact {
pub turn_id: String,
pub thread_id: String,
pub token_usage: TokenUsage,
}
#[derive(Clone, Copy, Debug, Serialize)]
#[serde(rename_all = "snake_case")]
pub enum TurnStatus {
Completed,
Failed,
Interrupted,
}
#[derive(Clone, Copy, Debug, Serialize)]
#[serde(rename_all = "snake_case")]
pub enum TurnSteerResult {
Accepted,
Rejected,
}
#[derive(Clone, Copy, Debug, Serialize)]
#[serde(rename_all = "snake_case")]
pub enum TurnSteerRejectionReason {
NoActiveTurn,
ExpectedTurnMismatch,
NonSteerableReview,
NonSteerableCompact,
EmptyInput,
InputTooLarge,
}
#[derive(Clone)]
pub struct CodexTurnSteerEvent {
pub expected_turn_id: Option<String>,
pub accepted_turn_id: Option<String>,
pub num_input_images: usize,
pub result: TurnSteerResult,
pub rejection_reason: Option<TurnSteerRejectionReason>,
pub created_at: u64,
}
#[derive(Clone, Copy, Debug)]
pub enum AnalyticsJsonRpcError {
TurnSteer(TurnSteerRequestError),
Input(InputError),
}
#[derive(Clone, Copy, Debug)]
pub enum TurnSteerRequestError {
NoActiveTurn,
ExpectedTurnMismatch,
NonSteerableReview,
NonSteerableCompact,
}
#[derive(Clone, Copy, Debug)]
pub enum InputError {
Empty,
TooLarge,
}
impl From<TurnSteerRequestError> for TurnSteerRejectionReason {
fn from(error: TurnSteerRequestError) -> Self {
match error {
TurnSteerRequestError::NoActiveTurn => Self::NoActiveTurn,
TurnSteerRequestError::ExpectedTurnMismatch => Self::ExpectedTurnMismatch,
TurnSteerRequestError::NonSteerableReview => Self::NonSteerableReview,
TurnSteerRequestError::NonSteerableCompact => Self::NonSteerableCompact,
}
}
}
impl From<InputError> for TurnSteerRejectionReason {
fn from(error: InputError) -> Self {
match error {
InputError::Empty => Self::EmptyInput,
InputError::TooLarge => Self::InputTooLarge,
}
}
}
#[derive(Clone, Debug)]
pub struct SkillInvocation {
pub skill_name: String,
@@ -146,6 +277,12 @@ pub(crate) enum AnalyticsFact {
connection_id: u64,
response: Box<ClientResponse>,
},
ErrorResponse {
connection_id: u64,
request_id: RequestId,
error: JSONRPCErrorError,
error_type: Option<AnalyticsJsonRpcError>,
},
Notification(Box<ServerNotification>),
// Facts that do not naturally exist on the app-server protocol surface, or
// would require non-trivial protocol reshaping on this branch.
@@ -156,6 +293,8 @@ pub(crate) enum CustomAnalyticsFact {
SubAgentThreadStarted(SubAgentThreadStartedInput),
Compaction(Box<CodexCompactionEvent>),
GuardianReview(Box<GuardianReviewEventParams>),
TurnResolvedConfig(Box<TurnResolvedConfigFact>),
TurnTokenUsage(Box<TurnTokenUsageFact>),
SkillInvoked(SkillInvokedInput),
AppMentioned(AppMentionedInput),
AppUsed(AppUsedInput),


@@ -3,6 +3,9 @@ mod events;
mod facts;
mod reducer;
use std::time::SystemTime;
use std::time::UNIX_EPOCH;
pub use client::AnalyticsEventsClient;
pub use events::AppServerRpcTransport;
pub use events::GuardianApprovalRequestSource;
@@ -16,19 +19,36 @@ pub use events::GuardianReviewSessionKind;
pub use events::GuardianReviewTerminalStatus;
pub use events::GuardianReviewUserAuthorization;
pub use events::GuardianReviewedAction;
pub use facts::AnalyticsJsonRpcError;
pub use facts::AppInvocation;
pub use facts::CodexCompactionEvent;
pub use facts::CodexTurnSteerEvent;
pub use facts::CompactionImplementation;
pub use facts::CompactionPhase;
pub use facts::CompactionReason;
pub use facts::CompactionStatus;
pub use facts::CompactionStrategy;
pub use facts::CompactionTrigger;
pub use facts::InputError;
pub use facts::InvocationType;
pub use facts::SkillInvocation;
pub use facts::SubAgentThreadStartedInput;
pub use facts::ThreadInitializationMode;
pub use facts::TrackEventsContext;
pub use facts::TurnResolvedConfigFact;
pub use facts::TurnStatus;
pub use facts::TurnSteerRejectionReason;
pub use facts::TurnSteerRequestError;
pub use facts::TurnSteerResult;
pub use facts::TurnTokenUsageFact;
pub use facts::build_track_events_context;
#[cfg(test)]
mod analytics_client_tests;
pub fn now_unix_seconds() -> u64 {
SystemTime::now()
.duration_since(UNIX_EPOCH)
.unwrap_or_default()
.as_secs()
}
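The `now_unix_seconds` helper added in this hunk can be sanity-checked in isolation. A minimal standalone sketch follows (the function body is copied verbatim from the diff; the `main` wrapper and the epoch threshold used in the assertion are illustrative assumptions, not part of the change):

```rust
use std::time::SystemTime;
use std::time::UNIX_EPOCH;

// Mirrors the helper introduced above: whole seconds since the Unix epoch,
// falling back to 0 if the system clock reads earlier than 1970.
pub fn now_unix_seconds() -> u64 {
    SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap_or_default()
        .as_secs()
}

fn main() {
    let now = now_unix_seconds();
    // 1_700_000_000 is roughly 2023-11-14; any correctly set clock is past it.
    assert!(now > 1_700_000_000);
    println!("{now}");
}
```

The `unwrap_or_default()` fallback means callers (e.g. the `created_at` field on `PendingTurnSteerState`) never panic on a pre-epoch clock, at the cost of silently reporting 0 in that case.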

View File

@@ -6,12 +6,15 @@ use crate::events::CodexCompactionEventRequest;
use crate::events::CodexPluginEventRequest;
use crate::events::CodexPluginUsedEventRequest;
use crate::events::CodexRuntimeMetadata;
use crate::events::CodexTurnEventParams;
use crate::events::CodexTurnEventRequest;
use crate::events::CodexTurnSteerEventParams;
use crate::events::CodexTurnSteerEventRequest;
use crate::events::GuardianReviewEventParams;
use crate::events::GuardianReviewEventPayload;
use crate::events::GuardianReviewEventRequest;
use crate::events::SkillInvocationEventParams;
use crate::events::SkillInvocationEventRequest;
use crate::events::ThreadInitializationMode;
use crate::events::ThreadInitializedEvent;
use crate::events::ThreadInitializedEventParams;
use crate::events::TrackEventRequest;
@@ -25,6 +28,7 @@ use crate::events::subagent_source_name;
use crate::events::subagent_thread_started_event_request;
use crate::events::thread_source_name;
use crate::facts::AnalyticsFact;
use crate::facts::AnalyticsJsonRpcError;
use crate::facts::AppMentionedInput;
use crate::facts::AppUsedInput;
use crate::facts::CodexCompactionEvent;
@@ -34,19 +38,39 @@ use crate::facts::PluginStateChangedInput;
use crate::facts::PluginUsedInput;
use crate::facts::SkillInvokedInput;
use crate::facts::SubAgentThreadStartedInput;
use crate::facts::ThreadInitializationMode;
use crate::facts::TurnResolvedConfigFact;
use crate::facts::TurnStatus;
use crate::facts::TurnSteerRejectionReason;
use crate::facts::TurnSteerResult;
use crate::facts::TurnTokenUsageFact;
use crate::now_unix_seconds;
use codex_app_server_protocol::ClientRequest;
use codex_app_server_protocol::ClientResponse;
use codex_app_server_protocol::CodexErrorInfo;
use codex_app_server_protocol::InitializeParams;
use codex_app_server_protocol::RequestId;
use codex_app_server_protocol::ServerNotification;
use codex_app_server_protocol::TurnSteerResponse;
use codex_app_server_protocol::UserInput;
use codex_git_utils::collect_git_info;
use codex_git_utils::get_git_repo_root;
use codex_login::default_client::originator;
use codex_protocol::config_types::ModeKind;
use codex_protocol::config_types::Personality;
use codex_protocol::config_types::ReasoningSummary;
use codex_protocol::protocol::SandboxPolicy;
use codex_protocol::protocol::SessionSource;
use codex_protocol::protocol::SkillScope;
use codex_protocol::protocol::TokenUsage;
use sha1::Digest;
use std::collections::HashMap;
use std::path::Path;
#[derive(Default)]
pub(crate) struct AnalyticsReducer {
requests: HashMap<(u64, RequestId), RequestState>,
turns: HashMap<String, TurnState>,
connections: HashMap<u64, ConnectionState>,
thread_connections: HashMap<String, u64>,
thread_metadata: HashMap<String, ThreadMetadataState>,
@@ -60,12 +84,16 @@ struct ConnectionState {
#[derive(Clone)]
struct ThreadMetadataState {
thread_source: Option<&'static str>,
initialization_mode: ThreadInitializationMode,
subagent_source: Option<String>,
parent_thread_id: Option<String>,
}
impl ThreadMetadataState {
fn from_session_source(session_source: &SessionSource) -> Self {
fn from_thread_metadata(
session_source: &SessionSource,
initialization_mode: ThreadInitializationMode,
) -> Self {
let (subagent_source, parent_thread_id) = match session_source {
SessionSource::SubAgent(subagent_source) => (
Some(subagent_source_name(subagent_source)),
@@ -80,12 +108,49 @@ impl ThreadMetadataState {
};
Self {
thread_source: thread_source_name(session_source),
initialization_mode,
subagent_source,
parent_thread_id,
}
}
}
enum RequestState {
TurnStart(PendingTurnStartState),
TurnSteer(PendingTurnSteerState),
}
struct PendingTurnStartState {
thread_id: String,
num_input_images: usize,
}
struct PendingTurnSteerState {
thread_id: String,
expected_turn_id: String,
num_input_images: usize,
created_at: u64,
}
#[derive(Clone)]
struct CompletedTurnState {
status: Option<TurnStatus>,
turn_error: Option<CodexErrorInfo>,
completed_at: u64,
duration_ms: Option<u64>,
}
struct TurnState {
connection_id: Option<u64>,
thread_id: Option<String>,
num_input_images: Option<usize>,
resolved_config: Option<TurnResolvedConfigFact>,
started_at: Option<u64>,
token_usage: Option<TokenUsage>,
completed: Option<CompletedTurnState>,
steer_count: usize,
}
impl AnalyticsReducer {
pub(crate) async fn ingest(&mut self, input: AnalyticsFact, out: &mut Vec<TrackEventRequest>) {
match input {
@@ -105,17 +170,29 @@ impl AnalyticsReducer {
);
}
AnalyticsFact::Request {
connection_id,
request_id,
request,
} => {
self.ingest_request(connection_id, request_id, *request);
}
AnalyticsFact::Response {
connection_id,
response,
} => {
self.ingest_response(connection_id, *response, out);
}
AnalyticsFact::ErrorResponse {
connection_id,
request_id,
error: _,
error_type,
} => {
self.ingest_error_response(connection_id, request_id, error_type, out);
}
AnalyticsFact::Notification(notification) => {
self.ingest_notification(*notification, out);
}
AnalyticsFact::Custom(input) => match input {
CustomAnalyticsFact::SubAgentThreadStarted(input) => {
self.ingest_subagent_thread_started(input, out);
@@ -126,6 +203,12 @@ impl AnalyticsReducer {
CustomAnalyticsFact::GuardianReview(input) => {
self.ingest_guardian_review(*input, out);
}
CustomAnalyticsFact::TurnResolvedConfig(input) => {
self.ingest_turn_resolved_config(*input, out);
}
CustomAnalyticsFact::TurnTokenUsage(input) => {
self.ingest_turn_token_usage(*input, out);
}
CustomAnalyticsFact::SkillInvoked(input) => {
self.ingest_skill_invoked(input, out).await;
}
@@ -216,6 +299,82 @@ impl AnalyticsReducer {
)));
}
fn ingest_request(
&mut self,
connection_id: u64,
request_id: RequestId,
request: ClientRequest,
) {
match request {
ClientRequest::TurnStart { params, .. } => {
self.requests.insert(
(connection_id, request_id),
RequestState::TurnStart(PendingTurnStartState {
thread_id: params.thread_id,
num_input_images: num_input_images(&params.input),
}),
);
}
ClientRequest::TurnSteer { params, .. } => {
self.requests.insert(
(connection_id, request_id),
RequestState::TurnSteer(PendingTurnSteerState {
thread_id: params.thread_id,
expected_turn_id: params.expected_turn_id,
num_input_images: num_input_images(&params.input),
created_at: now_unix_seconds(),
}),
);
}
_ => {}
}
}
fn ingest_turn_resolved_config(
&mut self,
input: TurnResolvedConfigFact,
out: &mut Vec<TrackEventRequest>,
) {
let turn_id = input.turn_id.clone();
let thread_id = input.thread_id.clone();
let num_input_images = input.num_input_images;
let turn_state = self.turns.entry(turn_id.clone()).or_insert(TurnState {
connection_id: None,
thread_id: None,
num_input_images: None,
resolved_config: None,
started_at: None,
token_usage: None,
completed: None,
steer_count: 0,
});
turn_state.thread_id = Some(thread_id);
turn_state.num_input_images = Some(num_input_images);
turn_state.resolved_config = Some(input);
self.maybe_emit_turn_event(&turn_id, out);
}
fn ingest_turn_token_usage(
&mut self,
input: TurnTokenUsageFact,
out: &mut Vec<TrackEventRequest>,
) {
let turn_id = input.turn_id.clone();
let turn_state = self.turns.entry(turn_id.clone()).or_insert(TurnState {
connection_id: None,
thread_id: None,
num_input_images: None,
resolved_config: None,
started_at: None,
token_usage: None,
completed: None,
steer_count: 0,
});
turn_state.thread_id = Some(input.thread_id);
turn_state.token_usage = Some(input.token_usage);
self.maybe_emit_turn_event(&turn_id, out);
}
async fn ingest_skill_invoked(
&mut self,
input: SkillInvokedInput,
@@ -316,30 +475,193 @@ impl AnalyticsReducer {
response: ClientResponse,
out: &mut Vec<TrackEventRequest>,
) {
match response {
ClientResponse::ThreadStart { response, .. } => {
self.emit_thread_initialized(
connection_id,
response.thread,
response.model,
ThreadInitializationMode::New,
out,
);
}
ClientResponse::ThreadResume { response, .. } => {
self.emit_thread_initialized(
connection_id,
response.thread,
response.model,
ThreadInitializationMode::Resumed,
out,
);
}
ClientResponse::ThreadFork { response, .. } => {
self.emit_thread_initialized(
connection_id,
response.thread,
response.model,
ThreadInitializationMode::Forked,
out,
);
}
ClientResponse::TurnStart {
request_id,
response,
} => {
let turn_id = response.turn.id;
let Some(RequestState::TurnStart(pending_request)) =
self.requests.remove(&(connection_id, request_id))
else {
return;
};
let turn_state = self.turns.entry(turn_id.clone()).or_insert(TurnState {
connection_id: None,
thread_id: None,
num_input_images: None,
resolved_config: None,
started_at: None,
token_usage: None,
completed: None,
steer_count: 0,
});
turn_state.connection_id = Some(connection_id);
turn_state.thread_id = Some(pending_request.thread_id);
turn_state.num_input_images = Some(pending_request.num_input_images);
self.maybe_emit_turn_event(&turn_id, out);
}
ClientResponse::TurnSteer {
request_id,
response,
} => {
self.ingest_turn_steer_response(connection_id, request_id, response, out);
}
_ => {}
}
}
fn ingest_error_response(
&mut self,
connection_id: u64,
request_id: RequestId,
error_type: Option<AnalyticsJsonRpcError>,
out: &mut Vec<TrackEventRequest>,
) {
let Some(request) = self.requests.remove(&(connection_id, request_id)) else {
return;
};
self.ingest_request_error_response(connection_id, request, error_type, out);
}
fn ingest_request_error_response(
&mut self,
connection_id: u64,
request: RequestState,
error_type: Option<AnalyticsJsonRpcError>,
out: &mut Vec<TrackEventRequest>,
) {
match request {
RequestState::TurnStart(_) => {}
RequestState::TurnSteer(pending_request) => {
self.ingest_turn_steer_error_response(
connection_id,
pending_request,
error_type,
out,
);
}
}
}
fn ingest_turn_steer_error_response(
&mut self,
connection_id: u64,
pending_request: PendingTurnSteerState,
error_type: Option<AnalyticsJsonRpcError>,
out: &mut Vec<TrackEventRequest>,
) {
self.emit_turn_steer_event(
connection_id,
pending_request,
/*accepted_turn_id*/ None,
TurnSteerResult::Rejected,
rejection_reason_from_error_type(error_type),
out,
);
}
fn ingest_notification(
&mut self,
notification: ServerNotification,
out: &mut Vec<TrackEventRequest>,
) {
match notification {
ServerNotification::TurnStarted(notification) => {
let turn_state = self.turns.entry(notification.turn.id).or_insert(TurnState {
connection_id: None,
thread_id: None,
num_input_images: None,
resolved_config: None,
started_at: None,
token_usage: None,
completed: None,
steer_count: 0,
});
turn_state.started_at = notification
.turn
.started_at
.and_then(|started_at| u64::try_from(started_at).ok());
}
ServerNotification::TurnCompleted(notification) => {
let turn_state =
self.turns
.entry(notification.turn.id.clone())
.or_insert(TurnState {
connection_id: None,
thread_id: None,
num_input_images: None,
resolved_config: None,
started_at: None,
token_usage: None,
completed: None,
steer_count: 0,
});
turn_state.completed = Some(CompletedTurnState {
status: analytics_turn_status(notification.turn.status),
turn_error: notification
.turn
.error
.and_then(|error| error.codex_error_info),
completed_at: notification
.turn
.completed_at
.and_then(|completed_at| u64::try_from(completed_at).ok())
.unwrap_or_default(),
duration_ms: notification
.turn
.duration_ms
.and_then(|duration_ms| u64::try_from(duration_ms).ok()),
});
let turn_id = notification.turn.id;
self.maybe_emit_turn_event(&turn_id, out);
}
_ => {}
}
}
fn emit_thread_initialized(
&mut self,
connection_id: u64,
thread: codex_app_server_protocol::Thread,
model: String,
initialization_mode: ThreadInitializationMode,
out: &mut Vec<TrackEventRequest>,
) {
let thread_source: SessionSource = thread.source.into();
let thread_id = thread.id;
let Some(connection_state) = self.connections.get(&connection_id) else {
return;
};
let thread_metadata =
ThreadMetadataState::from_thread_metadata(&thread_source, initialization_mode);
self.thread_connections
.insert(thread_id.clone(), connection_id);
self.thread_metadata
@@ -403,6 +725,275 @@ impl AnalyticsReducer {
},
)));
}
fn ingest_turn_steer_response(
&mut self,
connection_id: u64,
request_id: RequestId,
response: TurnSteerResponse,
out: &mut Vec<TrackEventRequest>,
) {
let Some(RequestState::TurnSteer(pending_request)) =
self.requests.remove(&(connection_id, request_id))
else {
return;
};
if let Some(turn_state) = self.turns.get_mut(&response.turn_id) {
turn_state.steer_count += 1;
}
self.emit_turn_steer_event(
connection_id,
pending_request,
Some(response.turn_id),
TurnSteerResult::Accepted,
/*rejection_reason*/ None,
out,
);
}
fn emit_turn_steer_event(
&mut self,
connection_id: u64,
pending_request: PendingTurnSteerState,
accepted_turn_id: Option<String>,
result: TurnSteerResult,
rejection_reason: Option<TurnSteerRejectionReason>,
out: &mut Vec<TrackEventRequest>,
) {
let Some(connection_state) = self.connections.get(&connection_id) else {
return;
};
let Some(thread_metadata) = self.thread_metadata.get(&pending_request.thread_id) else {
tracing::warn!(
thread_id = %pending_request.thread_id,
"dropping turn steer analytics event: missing thread lifecycle metadata"
);
return;
};
out.push(TrackEventRequest::TurnSteer(CodexTurnSteerEventRequest {
event_type: "codex_turn_steer_event",
event_params: CodexTurnSteerEventParams {
thread_id: pending_request.thread_id,
expected_turn_id: Some(pending_request.expected_turn_id),
accepted_turn_id,
app_server_client: connection_state.app_server_client.clone(),
runtime: connection_state.runtime.clone(),
thread_source: thread_metadata.thread_source.map(str::to_string),
subagent_source: thread_metadata.subagent_source.clone(),
parent_thread_id: thread_metadata.parent_thread_id.clone(),
num_input_images: pending_request.num_input_images,
result,
rejection_reason,
created_at: pending_request.created_at,
},
}));
}
fn maybe_emit_turn_event(&mut self, turn_id: &str, out: &mut Vec<TrackEventRequest>) {
let Some(turn_state) = self.turns.get(turn_id) else {
return;
};
if turn_state.thread_id.is_none()
|| turn_state.num_input_images.is_none()
|| turn_state.resolved_config.is_none()
|| turn_state.completed.is_none()
{
return;
}
let connection_metadata = turn_state
.connection_id
.and_then(|connection_id| self.connections.get(&connection_id))
.map(|connection_state| {
(
connection_state.app_server_client.clone(),
connection_state.runtime.clone(),
)
});
let Some((app_server_client, runtime)) = connection_metadata else {
if let Some(connection_id) = turn_state.connection_id {
tracing::warn!(
turn_id,
connection_id,
"dropping turn analytics event: missing connection metadata"
);
}
return;
};
let Some(thread_id) = turn_state.thread_id.as_ref() else {
return;
};
let Some(thread_metadata) = self.thread_metadata.get(thread_id) else {
tracing::warn!(
thread_id,
turn_id,
"dropping turn analytics event: missing thread lifecycle metadata"
);
return;
};
out.push(TrackEventRequest::TurnEvent(Box::new(
CodexTurnEventRequest {
event_type: "codex_turn_event",
event_params: codex_turn_event_params(
app_server_client,
runtime,
turn_id.to_string(),
turn_state,
thread_metadata,
),
},
)));
self.turns.remove(turn_id);
}
}
fn codex_turn_event_params(
app_server_client: CodexAppServerClientMetadata,
runtime: CodexRuntimeMetadata,
turn_id: String,
turn_state: &TurnState,
thread_metadata: &ThreadMetadataState,
) -> CodexTurnEventParams {
let (Some(thread_id), Some(num_input_images), Some(resolved_config), Some(completed)) = (
turn_state.thread_id.clone(),
turn_state.num_input_images,
turn_state.resolved_config.clone(),
turn_state.completed.clone(),
) else {
unreachable!("turn event params require a fully populated turn state");
};
let started_at = turn_state.started_at;
let TurnResolvedConfigFact {
turn_id: _resolved_turn_id,
thread_id: _resolved_thread_id,
num_input_images: _resolved_num_input_images,
submission_type,
ephemeral,
session_source: _session_source,
model,
model_provider,
sandbox_policy,
reasoning_effort,
reasoning_summary,
service_tier,
approval_policy,
approvals_reviewer,
sandbox_network_access,
collaboration_mode,
personality,
is_first_turn,
} = resolved_config;
let token_usage = turn_state.token_usage.clone();
CodexTurnEventParams {
thread_id,
turn_id,
app_server_client,
runtime,
submission_type,
ephemeral,
thread_source: thread_metadata.thread_source.map(str::to_string),
initialization_mode: thread_metadata.initialization_mode,
subagent_source: thread_metadata.subagent_source.clone(),
parent_thread_id: thread_metadata.parent_thread_id.clone(),
model: Some(model),
model_provider,
sandbox_policy: Some(sandbox_policy_mode(&sandbox_policy)),
reasoning_effort: reasoning_effort.map(|value| value.to_string()),
reasoning_summary: reasoning_summary_mode(reasoning_summary),
service_tier: service_tier
.map(|value| value.to_string())
.unwrap_or_else(|| "default".to_string()),
approval_policy: approval_policy.to_string(),
approvals_reviewer: approvals_reviewer.to_string(),
sandbox_network_access,
collaboration_mode: Some(collaboration_mode_mode(collaboration_mode)),
personality: personality_mode(personality),
num_input_images,
is_first_turn,
status: completed.status,
turn_error: completed.turn_error,
steer_count: Some(turn_state.steer_count),
total_tool_call_count: None,
shell_command_count: None,
file_change_count: None,
mcp_tool_call_count: None,
dynamic_tool_call_count: None,
subagent_tool_call_count: None,
web_search_count: None,
image_generation_count: None,
input_tokens: token_usage
.as_ref()
.map(|token_usage| token_usage.input_tokens),
cached_input_tokens: token_usage
.as_ref()
.map(|token_usage| token_usage.cached_input_tokens),
output_tokens: token_usage
.as_ref()
.map(|token_usage| token_usage.output_tokens),
reasoning_output_tokens: token_usage
.as_ref()
.map(|token_usage| token_usage.reasoning_output_tokens),
total_tokens: token_usage
.as_ref()
.map(|token_usage| token_usage.total_tokens),
duration_ms: completed.duration_ms,
started_at,
completed_at: Some(completed.completed_at),
}
}
fn sandbox_policy_mode(sandbox_policy: &SandboxPolicy) -> &'static str {
match sandbox_policy {
SandboxPolicy::DangerFullAccess => "full_access",
SandboxPolicy::ReadOnly { .. } => "read_only",
SandboxPolicy::WorkspaceWrite { .. } => "workspace_write",
SandboxPolicy::ExternalSandbox { .. } => "external_sandbox",
}
}
fn collaboration_mode_mode(mode: ModeKind) -> &'static str {
match mode {
ModeKind::Plan => "plan",
ModeKind::Default | ModeKind::PairProgramming | ModeKind::Execute => "default",
}
}
fn reasoning_summary_mode(summary: Option<ReasoningSummary>) -> Option<String> {
match summary {
Some(ReasoningSummary::None) | None => None,
Some(summary) => Some(summary.to_string()),
}
}
fn personality_mode(personality: Option<Personality>) -> Option<String> {
match personality {
Some(Personality::None) | None => None,
Some(personality) => Some(personality.to_string()),
}
}
fn analytics_turn_status(status: codex_app_server_protocol::TurnStatus) -> Option<TurnStatus> {
match status {
codex_app_server_protocol::TurnStatus::Completed => Some(TurnStatus::Completed),
codex_app_server_protocol::TurnStatus::Failed => Some(TurnStatus::Failed),
codex_app_server_protocol::TurnStatus::Interrupted => Some(TurnStatus::Interrupted),
codex_app_server_protocol::TurnStatus::InProgress => None,
}
}
fn num_input_images(input: &[UserInput]) -> usize {
input
.iter()
.filter(|item| matches!(item, UserInput::Image { .. } | UserInput::LocalImage { .. }))
.count()
}
fn rejection_reason_from_error_type(
error_type: Option<AnalyticsJsonRpcError>,
) -> Option<TurnSteerRejectionReason> {
match error_type? {
AnalyticsJsonRpcError::TurnSteer(error) => Some(error.into()),
AnalyticsJsonRpcError::Input(error) => Some(error.into()),
}
}
pub(crate) fn skill_id_for_local_skill(

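The reducer above accumulates turn facts from several independent sources (resolved config, token usage, notifications) and only flushes a `codex_turn_event` once every required piece has arrived. A minimal sketch of that "emit once complete" gating, using hypothetical simplified types rather than the actual codex structs:

```rust
// Simplified sketch of the reducer's gating pattern: a turn event is flushed
// only after the thread id, input metadata, resolved config, and completion
// record have all arrived, in any order. Types are illustrative placeholders.
#[derive(Default)]
struct TurnState {
    thread_id: Option<String>,
    num_input_images: Option<usize>,
    resolved_config: Option<String>, // stands in for TurnResolvedConfigFact
    completed: Option<u64>,          // stands in for CompletedTurnState
}

impl TurnState {
    fn ready(&self) -> bool {
        self.thread_id.is_some()
            && self.num_input_images.is_some()
            && self.resolved_config.is_some()
            && self.completed.is_some()
    }
}

fn maybe_emit(state: &TurnState, out: &mut Vec<String>) -> bool {
    // Mirrors maybe_emit_turn_event: bail out until the state is complete.
    if !state.ready() {
        return false;
    }
    out.push(format!(
        "turn_event thread={} images={}",
        state.thread_id.as_deref().unwrap_or_default(),
        state.num_input_images.unwrap_or_default(),
    ));
    true
}
```

Each ingest path (`ingest_turn_resolved_config`, `ingest_turn_token_usage`, `TurnCompleted`) updates its slice of the state and then calls the same gate, so arrival order does not matter.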
View File

@@ -1233,6 +1233,32 @@
}
]
},
"MarketplaceAddParams": {
"properties": {
"refName": {
"type": [
"string",
"null"
]
},
"source": {
"type": "string"
},
"sparsePaths": {
"items": {
"type": "string"
},
"type": [
"array",
"null"
]
}
},
"required": [
"source"
],
"type": "object"
},
"McpResourceReadParams": {
"properties": {
"server": {
@@ -1499,6 +1525,13 @@
}
]
},
"RealtimeOutputModality": {
"enum": [
"text",
"audio"
],
"type": "string"
},
"RealtimeVoice": {
"enum": [
"alloy",
@@ -2715,6 +2748,23 @@
],
"type": "object"
},
"ThreadInjectItemsParams": {
"properties": {
"items": {
"description": "Raw Responses API items to append to the thread's model-visible history.",
"items": true,
"type": "array"
},
"threadId": {
"type": "string"
}
},
"required": [
"items",
"threadId"
],
"type": "object"
},
"ThreadListParams": {
"properties": {
"archived": {
@@ -2809,6 +2859,13 @@
},
"type": "object"
},
"ThreadMemoryMode": {
"enum": [
"enabled",
"disabled"
],
"type": "string"
},
"ThreadMetadataGitInfoUpdateParams": {
"properties": {
"branch": {
@@ -3952,6 +4009,31 @@
"title": "Thread/readRequest",
"type": "object"
},
{
"description": "Append raw Responses API items to the thread history without starting a user turn.",
"properties": {
"id": {
"$ref": "#/definitions/RequestId"
},
"method": {
"enum": [
"thread/inject_items"
],
"title": "Thread/injectItemsRequestMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/ThreadInjectItemsParams"
}
},
"required": [
"id",
"method",
"params"
],
"title": "Thread/injectItemsRequest",
"type": "object"
},
{
"properties": {
"id": {
@@ -3976,6 +4058,30 @@
"title": "Skills/listRequest",
"type": "object"
},
{
"properties": {
"id": {
"$ref": "#/definitions/RequestId"
},
"method": {
"enum": [
"marketplace/add"
],
"title": "Marketplace/addRequestMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/MarketplaceAddParams"
}
},
"required": [
"id",
"method",
"params"
],
"title": "Marketplace/addRequest",
"type": "object"
},
{
"properties": {
"id": {

View File

@@ -3384,13 +3384,35 @@
],
"type": "object"
},
"ThreadRealtimeTranscriptUpdatedNotification": {
"ThreadRealtimeTranscriptDeltaNotification": {
"description": "EXPERIMENTAL - flat transcript delta emitted whenever realtime transcript text changes.",
"properties": {
"delta": {
"description": "Live transcript delta from the realtime event.",
"type": "string"
},
"role": {
"type": "string"
},
"threadId": {
"type": "string"
}
},
"required": [
"delta",
"role",
"threadId"
],
"type": "object"
},
"ThreadRealtimeTranscriptDoneNotification": {
"description": "EXPERIMENTAL - final transcript text emitted when realtime completes a transcript part.",
"properties": {
"role": {
"type": "string"
},
"text": {
"description": "Final complete text for the transcript part.",
"type": "string"
},
"threadId": {
@@ -4949,20 +4971,40 @@
"properties": {
"method": {
"enum": [
"thread/realtime/transcriptUpdated"
"thread/realtime/transcript/delta"
],
"title": "Thread/realtime/transcriptUpdatedNotificationMethod",
"title": "Thread/realtime/transcript/deltaNotificationMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/ThreadRealtimeTranscriptUpdatedNotification"
"$ref": "#/definitions/ThreadRealtimeTranscriptDeltaNotification"
}
},
"required": [
"method",
"params"
],
"title": "Thread/realtime/transcriptUpdatedNotification",
"title": "Thread/realtime/transcript/deltaNotification",
"type": "object"
},
{
"properties": {
"method": {
"enum": [
"thread/realtime/transcript/done"
],
"title": "Thread/realtime/transcript/doneNotificationMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/ThreadRealtimeTranscriptDoneNotification"
}
},
"required": [
"method",
"params"
],
"title": "Thread/realtime/transcript/doneNotification",
"type": "object"
},
{

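The schema above splits realtime transcripts into incremental `thread/realtime/transcript/delta` notifications and a separate `thread/realtime/transcript/done` carrying the final text. A hypothetical client-side sketch of handling the pair: buffer deltas per (thread, role) for live display, and treat the `done` event's `text` as the authoritative final string (the schema does not guarantee that concatenated deltas equal the final text):

```rust
use std::collections::HashMap;

// Illustrative consumer of the transcript delta/done notification pair.
#[derive(Default)]
struct TranscriptBuffer {
    live: HashMap<(String, String), String>,
}

impl TranscriptBuffer {
    // thread/realtime/transcript/delta: append to the live buffer.
    fn on_delta(&mut self, thread_id: &str, role: &str, delta: &str) {
        self.live
            .entry((thread_id.to_string(), role.to_string()))
            .or_default()
            .push_str(delta);
    }

    // thread/realtime/transcript/done: discard the live buffer and keep
    // the final text carried by the done notification.
    fn on_done(&mut self, thread_id: &str, role: &str, text: &str) -> String {
        self.live.remove(&(thread_id.to_string(), role.to_string()));
        text.to_string()
    }
}
```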
View File

@@ -578,6 +578,31 @@
"title": "Thread/readRequest",
"type": "object"
},
{
"description": "Append raw Responses API items to the thread history without starting a user turn.",
"properties": {
"id": {
"$ref": "#/definitions/v2/RequestId"
},
"method": {
"enum": [
"thread/inject_items"
],
"title": "Thread/injectItemsRequestMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/v2/ThreadInjectItemsParams"
}
},
"required": [
"id",
"method",
"params"
],
"title": "Thread/injectItemsRequest",
"type": "object"
},
{
"properties": {
"id": {
@@ -602,6 +627,30 @@
"title": "Skills/listRequest",
"type": "object"
},
{
"properties": {
"id": {
"$ref": "#/definitions/v2/RequestId"
},
"method": {
"enum": [
"marketplace/add"
],
"title": "Marketplace/addRequestMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/v2/MarketplaceAddParams"
}
},
"required": [
"id",
"method",
"params"
],
"title": "Marketplace/addRequest",
"type": "object"
},
{
"properties": {
"id": {
@@ -4330,20 +4379,40 @@
"properties": {
"method": {
"enum": [
"thread/realtime/transcriptUpdated"
"thread/realtime/transcript/delta"
],
"title": "Thread/realtime/transcriptUpdatedNotificationMethod",
"title": "Thread/realtime/transcript/deltaNotificationMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/v2/ThreadRealtimeTranscriptUpdatedNotification"
"$ref": "#/definitions/v2/ThreadRealtimeTranscriptDeltaNotification"
}
},
"required": [
"method",
"params"
],
"title": "Thread/realtime/transcriptUpdatedNotification",
"title": "Thread/realtime/transcript/deltaNotification",
"type": "object"
},
{
"properties": {
"method": {
"enum": [
"thread/realtime/transcript/done"
],
"title": "Thread/realtime/transcript/doneNotificationMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/v2/ThreadRealtimeTranscriptDoneNotification"
}
},
"required": [
"method",
"params"
],
"title": "Thread/realtime/transcript/doneNotification",
"type": "object"
},
{
@@ -7688,11 +7757,15 @@
"type": "integer"
},
"isDirectory": {
"description": "Whether the path currently resolves to a directory.",
"description": "Whether the path resolves to a directory.",
"type": "boolean"
},
"isFile": {
"description": "Whether the path currently resolves to a regular file.",
"description": "Whether the path resolves to a regular file.",
"type": "boolean"
},
"isSymlink": {
"description": "Whether the path itself is a symbolic link.",
"type": "boolean"
},
"modifiedAtMs": {
@@ -7705,6 +7778,7 @@
"createdAtMs",
"isDirectory",
"isFile",
"isSymlink",
"modifiedAtMs"
],
"title": "FsGetMetadataResponse",
@@ -9030,6 +9104,55 @@
"title": "LogoutAccountResponse",
"type": "object"
},
"MarketplaceAddParams": {
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"refName": {
"type": [
"string",
"null"
]
},
"source": {
"type": "string"
},
"sparsePaths": {
"items": {
"type": "string"
},
"type": [
"array",
"null"
]
}
},
"required": [
"source"
],
"title": "MarketplaceAddParams",
"type": "object"
},
"MarketplaceAddResponse": {
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"alreadyAdded": {
"type": "boolean"
},
"installedRoot": {
"$ref": "#/definitions/v2/AbsolutePathBuf"
},
"marketplaceName": {
"type": "string"
}
},
"required": [
"alreadyAdded",
"installedRoot",
"marketplaceName"
],
"title": "MarketplaceAddResponse",
"type": "object"
},
"MarketplaceInterface": {
"properties": {
"displayName": {
@@ -10615,6 +10738,13 @@
],
"type": "string"
},
"RealtimeOutputModality": {
"enum": [
"text",
"audio"
],
"type": "string"
},
"RealtimeVoice": {
"enum": [
"alloy",
@@ -12086,7 +12216,7 @@
"type": "string"
},
"path": {
"type": "string"
"$ref": "#/definitions/v2/AbsolutePathBuf"
},
"scope": {
"$ref": "#/definitions/v2/SkillScope"
@@ -12139,7 +12269,7 @@
"type": "string"
},
"path": {
"type": "string"
"$ref": "#/definitions/v2/AbsolutePathBuf"
},
"shortDescription": {
"type": [
@@ -12862,6 +12992,30 @@
"ThreadId": {
"type": "string"
},
"ThreadInjectItemsParams": {
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"items": {
"description": "Raw Responses API items to append to the thread's model-visible history.",
"items": true,
"type": "array"
},
"threadId": {
"type": "string"
}
},
"required": [
"items",
"threadId"
],
"title": "ThreadInjectItemsParams",
"type": "object"
},
"ThreadInjectItemsResponse": {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "ThreadInjectItemsResponse",
"type": "object"
},
"ThreadItem": {
"oneOf": [
{
@@ -13647,6 +13801,13 @@
"title": "ThreadLoadedListResponse",
"type": "object"
},
"ThreadMemoryMode": {
"enum": [
"enabled",
"disabled"
],
"type": "string"
},
"ThreadMetadataGitInfoUpdateParams": {
"properties": {
"branch": {
@@ -13954,14 +14115,38 @@
"title": "ThreadRealtimeStartedNotification",
"type": "object"
},
"ThreadRealtimeTranscriptUpdatedNotification": {
"ThreadRealtimeTranscriptDeltaNotification": {
"$schema": "http://json-schema.org/draft-07/schema#",
"description": "EXPERIMENTAL - flat transcript delta emitted whenever realtime transcript text changes.",
"properties": {
"delta": {
"description": "Live transcript delta from the realtime event.",
"type": "string"
},
"role": {
"type": "string"
},
"threadId": {
"type": "string"
}
},
"required": [
"delta",
"role",
"threadId"
],
"title": "ThreadRealtimeTranscriptDeltaNotification",
"type": "object"
},
"ThreadRealtimeTranscriptDoneNotification": {
"$schema": "http://json-schema.org/draft-07/schema#",
"description": "EXPERIMENTAL - final transcript text emitted when realtime completes a transcript part.",
"properties": {
"role": {
"type": "string"
},
"text": {
"description": "Final complete text for the transcript part.",
"type": "string"
},
"threadId": {
@@ -13973,7 +14158,7 @@
"text",
"threadId"
],
"title": "ThreadRealtimeTranscriptUpdatedNotification",
"title": "ThreadRealtimeTranscriptDoneNotification",
"type": "object"
},
"ThreadResumeParams": {

View File

@@ -1160,6 +1160,31 @@
"title": "Thread/readRequest",
"type": "object"
},
{
"description": "Append raw Responses API items to the thread history without starting a user turn.",
"properties": {
"id": {
"$ref": "#/definitions/RequestId"
},
"method": {
"enum": [
"thread/inject_items"
],
"title": "Thread/injectItemsRequestMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/ThreadInjectItemsParams"
}
},
"required": [
"id",
"method",
"params"
],
"title": "Thread/injectItemsRequest",
"type": "object"
},
{
"properties": {
"id": {
@@ -1184,6 +1209,30 @@
"title": "Skills/listRequest",
"type": "object"
},
{
"properties": {
"id": {
"$ref": "#/definitions/RequestId"
},
"method": {
"enum": [
"marketplace/add"
],
"title": "Marketplace/addRequestMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/MarketplaceAddParams"
}
},
"required": [
"id",
"method",
"params"
],
"title": "Marketplace/addRequest",
"type": "object"
},
{
"properties": {
"id": {
@@ -4329,11 +4378,15 @@
"type": "integer"
},
"isDirectory": {
"description": "Whether the path currently resolves to a directory.",
"description": "Whether the path resolves to a directory.",
"type": "boolean"
},
"isFile": {
"description": "Whether the path currently resolves to a regular file.",
"description": "Whether the path resolves to a regular file.",
"type": "boolean"
},
"isSymlink": {
"description": "Whether the path itself is a symbolic link.",
"type": "boolean"
},
"modifiedAtMs": {
@@ -4346,6 +4399,7 @@
"createdAtMs",
"isDirectory",
"isFile",
"isSymlink",
"modifiedAtMs"
],
"title": "FsGetMetadataResponse",
@@ -5826,6 +5880,55 @@
"title": "LogoutAccountResponse",
"type": "object"
},
"MarketplaceAddParams": {
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"refName": {
"type": [
"string",
"null"
]
},
"source": {
"type": "string"
},
"sparsePaths": {
"items": {
"type": "string"
},
"type": [
"array",
"null"
]
}
},
"required": [
"source"
],
"title": "MarketplaceAddParams",
"type": "object"
},
"MarketplaceAddResponse": {
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"alreadyAdded": {
"type": "boolean"
},
"installedRoot": {
"$ref": "#/definitions/AbsolutePathBuf"
},
"marketplaceName": {
"type": "string"
}
},
"required": [
"alreadyAdded",
"installedRoot",
"marketplaceName"
],
"title": "MarketplaceAddResponse",
"type": "object"
},
"MarketplaceInterface": {
"properties": {
"displayName": {
@@ -7411,6 +7514,13 @@
],
"type": "string"
},
"RealtimeOutputModality": {
"enum": [
"text",
"audio"
],
"type": "string"
},
"RealtimeVoice": {
"enum": [
"alloy",
@@ -9580,20 +9690,40 @@
"properties": {
"method": {
"enum": [
"thread/realtime/transcriptUpdated"
"thread/realtime/transcript/delta"
],
"title": "Thread/realtime/transcriptUpdatedNotificationMethod",
"title": "Thread/realtime/transcript/deltaNotificationMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/ThreadRealtimeTranscriptUpdatedNotification"
"$ref": "#/definitions/ThreadRealtimeTranscriptDeltaNotification"
}
},
"required": [
"method",
"params"
],
"title": "Thread/realtime/transcriptUpdatedNotification",
"title": "Thread/realtime/transcript/deltaNotification",
"type": "object"
},
{
"properties": {
"method": {
"enum": [
"thread/realtime/transcript/done"
],
"title": "Thread/realtime/transcript/doneNotificationMethod",
"type": "string"
},
"params": {
"$ref": "#/definitions/ThreadRealtimeTranscriptDoneNotification"
}
},
"required": [
"method",
"params"
],
"title": "Thread/realtime/transcript/doneNotification",
"type": "object"
},
{
@@ -9934,7 +10064,7 @@
"type": "string"
},
"path": {
"type": "string"
"$ref": "#/definitions/AbsolutePathBuf"
},
"scope": {
"$ref": "#/definitions/SkillScope"
@@ -9987,7 +10117,7 @@
"type": "string"
},
"path": {
"type": "string"
"$ref": "#/definitions/AbsolutePathBuf"
},
"shortDescription": {
"type": [
@@ -10710,6 +10840,30 @@
"ThreadId": {
"type": "string"
},
"ThreadInjectItemsParams": {
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"items": {
"description": "Raw Responses API items to append to the thread's model-visible history.",
"items": true,
"type": "array"
},
"threadId": {
"type": "string"
}
},
"required": [
"items",
"threadId"
],
"title": "ThreadInjectItemsParams",
"type": "object"
},
"ThreadInjectItemsResponse": {
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "ThreadInjectItemsResponse",
"type": "object"
},
"ThreadItem": {
"oneOf": [
{
@@ -11495,6 +11649,13 @@
"title": "ThreadLoadedListResponse",
"type": "object"
},
"ThreadMemoryMode": {
"enum": [
"enabled",
"disabled"
],
"type": "string"
},
"ThreadMetadataGitInfoUpdateParams": {
"properties": {
"branch": {
@@ -11802,14 +11963,38 @@
"title": "ThreadRealtimeStartedNotification",
"type": "object"
},
"ThreadRealtimeTranscriptUpdatedNotification": {
"ThreadRealtimeTranscriptDeltaNotification": {
"$schema": "http://json-schema.org/draft-07/schema#",
"description": "EXPERIMENTAL - flat transcript delta emitted whenever realtime transcript text changes.",
"properties": {
"delta": {
"description": "Live transcript delta from the realtime event.",
"type": "string"
},
"role": {
"type": "string"
},
"threadId": {
"type": "string"
}
},
"required": [
"delta",
"role",
"threadId"
],
"title": "ThreadRealtimeTranscriptDeltaNotification",
"type": "object"
},
"ThreadRealtimeTranscriptDoneNotification": {
"$schema": "http://json-schema.org/draft-07/schema#",
"description": "EXPERIMENTAL - final transcript text emitted when realtime completes a transcript part.",
"properties": {
"role": {
"type": "string"
},
"text": {
"description": "Final complete text for the transcript part.",
"type": "string"
},
"threadId": {
@@ -11821,7 +12006,7 @@
"text",
"threadId"
],
"title": "ThreadRealtimeTranscriptUpdatedNotification",
"title": "ThreadRealtimeTranscriptDoneNotification",
"type": "object"
},
"ThreadResumeParams": {

View File

@@ -8,11 +8,15 @@
"type": "integer"
},
"isDirectory": {
"description": "Whether the path currently resolves to a directory.",
"description": "Whether the path resolves to a directory.",
"type": "boolean"
},
"isFile": {
"description": "Whether the path currently resolves to a regular file.",
"description": "Whether the path resolves to a regular file.",
"type": "boolean"
},
"isSymlink": {
"description": "Whether the path itself is a symbolic link.",
"type": "boolean"
},
"modifiedAtMs": {
@@ -25,6 +29,7 @@
"createdAtMs",
"isDirectory",
"isFile",
"isSymlink",
"modifiedAtMs"
],
"title": "FsGetMetadataResponse",


@@ -0,0 +1,28 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"refName": {
"type": [
"string",
"null"
]
},
"source": {
"type": "string"
},
"sparsePaths": {
"items": {
"type": "string"
},
"type": [
"array",
"null"
]
}
},
"required": [
"source"
],
"title": "MarketplaceAddParams",
"type": "object"
}


@@ -0,0 +1,27 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"definitions": {
"AbsolutePathBuf": {
"description": "A path that is guaranteed to be absolute and normalized (though it is not guaranteed to be canonicalized or exist on the filesystem).\n\nIMPORTANT: When deserializing an `AbsolutePathBuf`, a base path must be set using [AbsolutePathBufGuard::new]. If no base path is set, the deserialization will fail unless the path being deserialized is already absolute.",
"type": "string"
}
},
"properties": {
"alreadyAdded": {
"type": "boolean"
},
"installedRoot": {
"$ref": "#/definitions/AbsolutePathBuf"
},
"marketplaceName": {
"type": "string"
}
},
"required": [
"alreadyAdded",
"installedRoot",
"marketplaceName"
],
"title": "MarketplaceAddResponse",
"type": "object"
}


@@ -335,7 +335,7 @@
"type": "string"
},
"path": {
"type": "string"
"$ref": "#/definitions/AbsolutePathBuf"
},
"shortDescription": {
"type": [


@@ -1,6 +1,10 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"definitions": {
"AbsolutePathBuf": {
"description": "A path that is guaranteed to be absolute and normalized (though it is not guaranteed to be canonicalized or exist on the filesystem).\n\nIMPORTANT: When deserializing an `AbsolutePathBuf`, a base path must be set using [AbsolutePathBufGuard::new]. If no base path is set, the deserialization will fail unless the path being deserialized is already absolute.",
"type": "string"
},
"SkillDependencies": {
"properties": {
"tools": {
@@ -103,7 +107,7 @@
"type": "string"
},
"path": {
"type": "string"
"$ref": "#/definitions/AbsolutePathBuf"
},
"scope": {
"$ref": "#/definitions/SkillScope"


@@ -0,0 +1,19 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"items": {
"description": "Raw Responses API items to append to the thread's model-visible history.",
"items": true,
"type": "array"
},
"threadId": {
"type": "string"
}
},
"required": [
"items",
"threadId"
],
"title": "ThreadInjectItemsParams",
"type": "object"
}


@@ -0,0 +1,5 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "ThreadInjectItemsResponse",
"type": "object"
}


@@ -2,10 +2,11 @@
"$schema": "http://json-schema.org/draft-07/schema#",
"description": "EXPERIMENTAL - flat transcript delta emitted whenever realtime transcript text changes.",
"properties": {
"role": {
"delta": {
"description": "Live transcript delta from the realtime event.",
"type": "string"
},
"text": {
"role": {
"type": "string"
},
"threadId": {
@@ -13,10 +14,10 @@
}
},
"required": [
"delta",
"role",
"text",
"threadId"
],
"title": "ThreadRealtimeTranscriptUpdatedNotification",
"title": "ThreadRealtimeTranscriptDeltaNotification",
"type": "object"
}


@@ -0,0 +1,23 @@
{
"$schema": "http://json-schema.org/draft-07/schema#",
"description": "EXPERIMENTAL - final transcript text emitted when realtime completes a transcript part.",
"properties": {
"role": {
"type": "string"
},
"text": {
"description": "Final complete text for the transcript part.",
"type": "string"
},
"threadId": {
"type": "string"
}
},
"required": [
"role",
"text",
"threadId"
],
"title": "ThreadRealtimeTranscriptDoneNotification",
"type": "object"
}

File diff suppressed because one or more lines are too long


@@ -0,0 +1,5 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type RealtimeOutputModality = "text" | "audio";
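The new `RealtimeOutputModality` type feeds into `thread/realtime/start` params, where `outputModality` is now a required field. A minimal sketch of a start-params object, mirroring the JSON examples in the Rust test hunk of this diff (the `"text"` modality and the inline type copy are illustrative, not taken from a real client):

```typescript
// Inline copy of the generated union so the sketch stands alone.
type RealtimeOutputModality = "text" | "audio";

// Illustrative ThreadRealtimeStartParams payload; field names match the
// serialized JSON shown in the protocol tests in this diff.
const startParams = {
  threadId: "thr_123",
  outputModality: "text" as RealtimeOutputModality,
  prompt: "You are on a call",
  sessionId: null,
  transport: null,
  voice: null,
};
```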


@@ -43,7 +43,8 @@ import type { ThreadRealtimeItemAddedNotification } from "./v2/ThreadRealtimeIte
import type { ThreadRealtimeOutputAudioDeltaNotification } from "./v2/ThreadRealtimeOutputAudioDeltaNotification";
import type { ThreadRealtimeSdpNotification } from "./v2/ThreadRealtimeSdpNotification";
import type { ThreadRealtimeStartedNotification } from "./v2/ThreadRealtimeStartedNotification";
import type { ThreadRealtimeTranscriptUpdatedNotification } from "./v2/ThreadRealtimeTranscriptUpdatedNotification";
import type { ThreadRealtimeTranscriptDeltaNotification } from "./v2/ThreadRealtimeTranscriptDeltaNotification";
import type { ThreadRealtimeTranscriptDoneNotification } from "./v2/ThreadRealtimeTranscriptDoneNotification";
import type { ThreadStartedNotification } from "./v2/ThreadStartedNotification";
import type { ThreadStatusChangedNotification } from "./v2/ThreadStatusChangedNotification";
import type { ThreadTokenUsageUpdatedNotification } from "./v2/ThreadTokenUsageUpdatedNotification";
@@ -58,4 +59,4 @@ import type { WindowsWorldWritableWarningNotification } from "./v2/WindowsWorldW
/**
* Notification sent from the server to the client.
*/
export type ServerNotification = { "method": "error", "params": ErrorNotification } | { "method": "thread/started", "params": ThreadStartedNotification } | { "method": "thread/status/changed", "params": ThreadStatusChangedNotification } | { "method": "thread/archived", "params": ThreadArchivedNotification } | { "method": "thread/unarchived", "params": ThreadUnarchivedNotification } | { "method": "thread/closed", "params": ThreadClosedNotification } | { "method": "skills/changed", "params": SkillsChangedNotification } | { "method": "thread/name/updated", "params": ThreadNameUpdatedNotification } | { "method": "thread/tokenUsage/updated", "params": ThreadTokenUsageUpdatedNotification } | { "method": "turn/started", "params": TurnStartedNotification } | { "method": "hook/started", "params": HookStartedNotification } | { "method": "turn/completed", "params": TurnCompletedNotification } | { "method": "hook/completed", "params": HookCompletedNotification } | { "method": "turn/diff/updated", "params": TurnDiffUpdatedNotification } | { "method": "turn/plan/updated", "params": TurnPlanUpdatedNotification } | { "method": "item/started", "params": ItemStartedNotification } | { "method": "item/autoApprovalReview/started", "params": ItemGuardianApprovalReviewStartedNotification } | { "method": "item/autoApprovalReview/completed", "params": ItemGuardianApprovalReviewCompletedNotification } | { "method": "item/completed", "params": ItemCompletedNotification } | { "method": "rawResponseItem/completed", "params": RawResponseItemCompletedNotification } | { "method": "item/agentMessage/delta", "params": AgentMessageDeltaNotification } | { "method": "item/plan/delta", "params": PlanDeltaNotification } | { "method": "command/exec/outputDelta", "params": CommandExecOutputDeltaNotification } | { "method": "item/commandExecution/outputDelta", "params": CommandExecutionOutputDeltaNotification } | { "method": "item/commandExecution/terminalInteraction", "params": 
TerminalInteractionNotification } | { "method": "item/fileChange/outputDelta", "params": FileChangeOutputDeltaNotification } | { "method": "serverRequest/resolved", "params": ServerRequestResolvedNotification } | { "method": "item/mcpToolCall/progress", "params": McpToolCallProgressNotification } | { "method": "mcpServer/oauthLogin/completed", "params": McpServerOauthLoginCompletedNotification } | { "method": "mcpServer/startupStatus/updated", "params": McpServerStatusUpdatedNotification } | { "method": "account/updated", "params": AccountUpdatedNotification } | { "method": "account/rateLimits/updated", "params": AccountRateLimitsUpdatedNotification } | { "method": "app/list/updated", "params": AppListUpdatedNotification } | { "method": "fs/changed", "params": FsChangedNotification } | { "method": "item/reasoning/summaryTextDelta", "params": ReasoningSummaryTextDeltaNotification } | { "method": "item/reasoning/summaryPartAdded", "params": ReasoningSummaryPartAddedNotification } | { "method": "item/reasoning/textDelta", "params": ReasoningTextDeltaNotification } | { "method": "thread/compacted", "params": ContextCompactedNotification } | { "method": "model/rerouted", "params": ModelReroutedNotification } | { "method": "deprecationNotice", "params": DeprecationNoticeNotification } | { "method": "configWarning", "params": ConfigWarningNotification } | { "method": "fuzzyFileSearch/sessionUpdated", "params": FuzzyFileSearchSessionUpdatedNotification } | { "method": "fuzzyFileSearch/sessionCompleted", "params": FuzzyFileSearchSessionCompletedNotification } | { "method": "thread/realtime/started", "params": ThreadRealtimeStartedNotification } | { "method": "thread/realtime/itemAdded", "params": ThreadRealtimeItemAddedNotification } | { "method": "thread/realtime/transcriptUpdated", "params": ThreadRealtimeTranscriptUpdatedNotification } | { "method": "thread/realtime/outputAudio/delta", "params": ThreadRealtimeOutputAudioDeltaNotification } | { "method": 
"thread/realtime/sdp", "params": ThreadRealtimeSdpNotification } | { "method": "thread/realtime/error", "params": ThreadRealtimeErrorNotification } | { "method": "thread/realtime/closed", "params": ThreadRealtimeClosedNotification } | { "method": "windows/worldWritableWarning", "params": WindowsWorldWritableWarningNotification } | { "method": "windowsSandbox/setupCompleted", "params": WindowsSandboxSetupCompletedNotification } | { "method": "account/login/completed", "params": AccountLoginCompletedNotification };
export type ServerNotification = { "method": "error", "params": ErrorNotification } | { "method": "thread/started", "params": ThreadStartedNotification } | { "method": "thread/status/changed", "params": ThreadStatusChangedNotification } | { "method": "thread/archived", "params": ThreadArchivedNotification } | { "method": "thread/unarchived", "params": ThreadUnarchivedNotification } | { "method": "thread/closed", "params": ThreadClosedNotification } | { "method": "skills/changed", "params": SkillsChangedNotification } | { "method": "thread/name/updated", "params": ThreadNameUpdatedNotification } | { "method": "thread/tokenUsage/updated", "params": ThreadTokenUsageUpdatedNotification } | { "method": "turn/started", "params": TurnStartedNotification } | { "method": "hook/started", "params": HookStartedNotification } | { "method": "turn/completed", "params": TurnCompletedNotification } | { "method": "hook/completed", "params": HookCompletedNotification } | { "method": "turn/diff/updated", "params": TurnDiffUpdatedNotification } | { "method": "turn/plan/updated", "params": TurnPlanUpdatedNotification } | { "method": "item/started", "params": ItemStartedNotification } | { "method": "item/autoApprovalReview/started", "params": ItemGuardianApprovalReviewStartedNotification } | { "method": "item/autoApprovalReview/completed", "params": ItemGuardianApprovalReviewCompletedNotification } | { "method": "item/completed", "params": ItemCompletedNotification } | { "method": "rawResponseItem/completed", "params": RawResponseItemCompletedNotification } | { "method": "item/agentMessage/delta", "params": AgentMessageDeltaNotification } | { "method": "item/plan/delta", "params": PlanDeltaNotification } | { "method": "command/exec/outputDelta", "params": CommandExecOutputDeltaNotification } | { "method": "item/commandExecution/outputDelta", "params": CommandExecutionOutputDeltaNotification } | { "method": "item/commandExecution/terminalInteraction", "params": 
TerminalInteractionNotification } | { "method": "item/fileChange/outputDelta", "params": FileChangeOutputDeltaNotification } | { "method": "serverRequest/resolved", "params": ServerRequestResolvedNotification } | { "method": "item/mcpToolCall/progress", "params": McpToolCallProgressNotification } | { "method": "mcpServer/oauthLogin/completed", "params": McpServerOauthLoginCompletedNotification } | { "method": "mcpServer/startupStatus/updated", "params": McpServerStatusUpdatedNotification } | { "method": "account/updated", "params": AccountUpdatedNotification } | { "method": "account/rateLimits/updated", "params": AccountRateLimitsUpdatedNotification } | { "method": "app/list/updated", "params": AppListUpdatedNotification } | { "method": "fs/changed", "params": FsChangedNotification } | { "method": "item/reasoning/summaryTextDelta", "params": ReasoningSummaryTextDeltaNotification } | { "method": "item/reasoning/summaryPartAdded", "params": ReasoningSummaryPartAddedNotification } | { "method": "item/reasoning/textDelta", "params": ReasoningTextDeltaNotification } | { "method": "thread/compacted", "params": ContextCompactedNotification } | { "method": "model/rerouted", "params": ModelReroutedNotification } | { "method": "deprecationNotice", "params": DeprecationNoticeNotification } | { "method": "configWarning", "params": ConfigWarningNotification } | { "method": "fuzzyFileSearch/sessionUpdated", "params": FuzzyFileSearchSessionUpdatedNotification } | { "method": "fuzzyFileSearch/sessionCompleted", "params": FuzzyFileSearchSessionCompletedNotification } | { "method": "thread/realtime/started", "params": ThreadRealtimeStartedNotification } | { "method": "thread/realtime/itemAdded", "params": ThreadRealtimeItemAddedNotification } | { "method": "thread/realtime/transcript/delta", "params": ThreadRealtimeTranscriptDeltaNotification } | { "method": "thread/realtime/transcript/done", "params": ThreadRealtimeTranscriptDoneNotification } | { "method": 
"thread/realtime/outputAudio/delta", "params": ThreadRealtimeOutputAudioDeltaNotification } | { "method": "thread/realtime/sdp", "params": ThreadRealtimeSdpNotification } | { "method": "thread/realtime/error", "params": ThreadRealtimeErrorNotification } | { "method": "thread/realtime/closed", "params": ThreadRealtimeClosedNotification } | { "method": "windows/worldWritableWarning", "params": WindowsWorldWritableWarningNotification } | { "method": "windowsSandbox/setupCompleted", "params": WindowsSandboxSetupCompletedNotification } | { "method": "account/login/completed", "params": AccountLoginCompletedNotification };


@@ -0,0 +1,5 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type ThreadMemoryMode = "enabled" | "disabled";
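`ThreadMemoryMode` is the payload enum for the experimental `thread/memoryMode/set` request added in this diff. A small sketch of building that request on the client side, assuming the params shape from the Rust `ThreadMemoryModeSetParams` hunk below (the `setMemoryMode` helper is hypothetical):

```typescript
// Inline copies of the generated types so the sketch is self-contained.
type ThreadMemoryMode = "enabled" | "disabled";
type ThreadMemoryModeSetParams = { threadId: string; mode: ThreadMemoryMode };

// Hypothetical helper building the experimental request envelope.
function setMemoryMode(
  threadId: string,
  mode: ThreadMemoryMode,
): { method: string; params: ThreadMemoryModeSetParams } {
  return { method: "thread/memoryMode/set", params: { threadId, mode } };
}

const req = setMemoryMode("thr_123", "disabled");
```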


@@ -49,6 +49,7 @@ export type { ParsedCommand } from "./ParsedCommand";
export type { Personality } from "./Personality";
export type { PlanType } from "./PlanType";
export type { RealtimeConversationVersion } from "./RealtimeConversationVersion";
export type { RealtimeOutputModality } from "./RealtimeOutputModality";
export type { RealtimeVoice } from "./RealtimeVoice";
export type { RealtimeVoicesList } from "./RealtimeVoicesList";
export type { ReasoningEffort } from "./ReasoningEffort";
@@ -68,6 +69,7 @@ export type { SessionSource } from "./SessionSource";
export type { Settings } from "./Settings";
export type { SubAgentSource } from "./SubAgentSource";
export type { ThreadId } from "./ThreadId";
export type { ThreadMemoryMode } from "./ThreadMemoryMode";
export type { Tool } from "./Tool";
export type { Verbosity } from "./Verbosity";
export type { WebSearchAction } from "./WebSearchAction";


@@ -7,13 +7,17 @@
*/
export type FsGetMetadataResponse = {
/**
* Whether the path currently resolves to a directory.
* Whether the path resolves to a directory.
*/
isDirectory: boolean,
/**
* Whether the path currently resolves to a regular file.
* Whether the path resolves to a regular file.
*/
isFile: boolean,
/**
* Whether the path itself is a symbolic link.
*/
isSymlink: boolean,
/**
* File creation time in Unix milliseconds when available, otherwise `0`.
*/


@@ -0,0 +1,5 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type MarketplaceAddParams = { source: string, refName?: string | null, sparsePaths?: Array<string> | null, };
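Only `source` is required in `MarketplaceAddParams`; `refName` and `sparsePaths` are nullable, matching the serialization test in the Rust hunk of this diff. A sketch of constructing the params (the type is redeclared inline so the snippet stands alone; `buildMarketplaceAddParams` is a hypothetical helper):

```typescript
// Inline copy of the generated type.
type MarketplaceAddParams = {
  source: string;
  refName?: string | null;
  sparsePaths?: Array<string> | null;
};

// Hypothetical helper: omitted optionals are normalized to null, matching
// the JSON produced by the Rust serialization test in this diff.
function buildMarketplaceAddParams(
  source: string,
  refName?: string,
  sparsePaths?: string[],
): MarketplaceAddParams {
  return { source, refName: refName ?? null, sparsePaths: sparsePaths ?? null };
}

const params = buildMarketplaceAddParams("owner/repo", "main", ["plugins/foo"]);
```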


@@ -0,0 +1,6 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { AbsolutePathBuf } from "../AbsolutePathBuf";
export type MarketplaceAddResponse = { marketplaceName: string, installedRoot: AbsolutePathBuf, alreadyAdded: boolean, };


@@ -1,6 +1,7 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { AbsolutePathBuf } from "../AbsolutePathBuf";
import type { SkillDependencies } from "./SkillDependencies";
import type { SkillInterface } from "./SkillInterface";
import type { SkillScope } from "./SkillScope";
@@ -9,4 +10,4 @@ export type SkillMetadata = { name: string, description: string,
/**
* Legacy short_description from SKILL.md. Prefer SKILL.json interface.short_description.
*/
shortDescription?: string, interface?: SkillInterface, dependencies?: SkillDependencies, path: string, scope: SkillScope, enabled: boolean, };
shortDescription?: string, interface?: SkillInterface, dependencies?: SkillDependencies, path: AbsolutePathBuf, scope: SkillScope, enabled: boolean, };


@@ -1,6 +1,7 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { AbsolutePathBuf } from "../AbsolutePathBuf";
import type { SkillInterface } from "./SkillInterface";
export type SkillSummary = { name: string, description: string, shortDescription: string | null, interface: SkillInterface | null, path: string, enabled: boolean, };
export type SkillSummary = { name: string, description: string, shortDescription: string | null, interface: SkillInterface | null, path: AbsolutePathBuf, enabled: boolean, };


@@ -0,0 +1,10 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { JsonValue } from "../serde_json/JsonValue";
export type ThreadInjectItemsParams = { threadId: string,
/**
* Raw Responses API items to append to the thread's model-visible history.
*/
items: Array<JsonValue>, };
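`thread/inject_items` appends raw Responses API items to a thread's model-visible history without starting a user turn. A sketch of the params (types redeclared inline so the snippet stands alone; the message item shape is illustrative, not prescribed by this diff):

```typescript
// Inline stand-ins for the generated types.
type JsonValue = unknown;
type ThreadInjectItemsParams = { threadId: string; items: Array<JsonValue> };

// Illustrative raw item appended to the thread's model-visible history.
const injectParams: ThreadInjectItemsParams = {
  threadId: "thr_123",
  items: [
    {
      type: "message",
      role: "user",
      content: [{ type: "input_text", text: "context note" }],
    },
  ],
};

const request = { method: "thread/inject_items", params: injectParams };
```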


@@ -0,0 +1,5 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type ThreadInjectItemsResponse = Record<string, never>;


@@ -6,4 +6,8 @@
* EXPERIMENTAL - flat transcript delta emitted whenever realtime
* transcript text changes.
*/
export type ThreadRealtimeTranscriptUpdatedNotification = { threadId: string, role: string, text: string, };
export type ThreadRealtimeTranscriptDeltaNotification = { threadId: string, role: string,
/**
* Live transcript delta from the realtime event.
*/
delta: string, };


@@ -0,0 +1,13 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
/**
* EXPERIMENTAL - final transcript text emitted when realtime completes
* a transcript part.
*/
export type ThreadRealtimeTranscriptDoneNotification = { threadId: string, role: string,
/**
* Final complete text for the transcript part.
*/
text: string, };
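Per the PR description, `done` forwards final text from item done without correlating it with deltas, so a client should treat `text` as authoritative rather than comparing it against accumulated deltas. A sketch of one way to consume the pair of notifications (the `TranscriptBuffer` class is hypothetical; the payload types are copied inline so the snippet stands alone):

```typescript
// Inline copies of the generated notification payloads.
type ThreadRealtimeTranscriptDeltaNotification = { threadId: string; role: string; delta: string };
type ThreadRealtimeTranscriptDoneNotification = { threadId: string; role: string; text: string };

// Hypothetical accumulator: deltas build up live text per thread/role;
// the done notification's final text replaces whatever was accumulated.
class TranscriptBuffer {
  private live = new Map<string, string>();

  private key(threadId: string, role: string): string {
    return `${threadId}:${role}`;
  }

  liveText(threadId: string, role: string): string {
    return this.live.get(this.key(threadId, role)) ?? "";
  }

  onDelta(n: ThreadRealtimeTranscriptDeltaNotification): void {
    const k = this.key(n.threadId, n.role);
    this.live.set(k, (this.live.get(k) ?? "") + n.delta);
  }

  onDone(n: ThreadRealtimeTranscriptDoneNotification): string {
    this.live.delete(this.key(n.threadId, n.role));
    return n.text; // authoritative final text, not correlated with deltas
  }
}

const buf = new TranscriptBuffer();
buf.onDelta({ threadId: "thr_1", role: "assistant", delta: "Hel" });
buf.onDelta({ threadId: "thr_1", role: "assistant", delta: "lo" });
const finalText = buf.onDone({ threadId: "thr_1", role: "assistant", text: "Hello" });
```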


@@ -149,6 +149,8 @@ export type { ListMcpServerStatusResponse } from "./ListMcpServerStatusResponse"
export type { LoginAccountParams } from "./LoginAccountParams";
export type { LoginAccountResponse } from "./LoginAccountResponse";
export type { LogoutAccountResponse } from "./LogoutAccountResponse";
export type { MarketplaceAddParams } from "./MarketplaceAddParams";
export type { MarketplaceAddResponse } from "./MarketplaceAddResponse";
export type { MarketplaceInterface } from "./MarketplaceInterface";
export type { MarketplaceLoadErrorInfo } from "./MarketplaceLoadErrorInfo";
export type { McpAuthStatus } from "./McpAuthStatus";
@@ -282,6 +284,8 @@ export type { ThreadCompactStartParams } from "./ThreadCompactStartParams";
export type { ThreadCompactStartResponse } from "./ThreadCompactStartResponse";
export type { ThreadForkParams } from "./ThreadForkParams";
export type { ThreadForkResponse } from "./ThreadForkResponse";
export type { ThreadInjectItemsParams } from "./ThreadInjectItemsParams";
export type { ThreadInjectItemsResponse } from "./ThreadInjectItemsResponse";
export type { ThreadItem } from "./ThreadItem";
export type { ThreadListParams } from "./ThreadListParams";
export type { ThreadListResponse } from "./ThreadListResponse";
@@ -301,7 +305,8 @@ export type { ThreadRealtimeOutputAudioDeltaNotification } from "./ThreadRealtim
export type { ThreadRealtimeSdpNotification } from "./ThreadRealtimeSdpNotification";
export type { ThreadRealtimeStartTransport } from "./ThreadRealtimeStartTransport";
export type { ThreadRealtimeStartedNotification } from "./ThreadRealtimeStartedNotification";
export type { ThreadRealtimeTranscriptUpdatedNotification } from "./ThreadRealtimeTranscriptUpdatedNotification";
export type { ThreadRealtimeTranscriptDeltaNotification } from "./ThreadRealtimeTranscriptDeltaNotification";
export type { ThreadRealtimeTranscriptDoneNotification } from "./ThreadRealtimeTranscriptDoneNotification";
export type { ThreadResumeParams } from "./ThreadResumeParams";
export type { ThreadResumeResponse } from "./ThreadResumeResponse";
export type { ThreadRollbackParams } from "./ThreadRollbackParams";


@@ -284,6 +284,11 @@ client_request_definitions! {
params: v2::ThreadMetadataUpdateParams,
response: v2::ThreadMetadataUpdateResponse,
},
#[experimental("thread/memoryMode/set")]
ThreadMemoryModeSet => "thread/memoryMode/set" {
params: v2::ThreadMemoryModeSetParams,
response: v2::ThreadMemoryModeSetResponse,
},
ThreadUnarchive => "thread/unarchive" {
params: v2::ThreadUnarchiveParams,
response: v2::ThreadUnarchiveResponse,
@@ -317,10 +322,19 @@ client_request_definitions! {
params: v2::ThreadReadParams,
response: v2::ThreadReadResponse,
},
/// Append raw Responses API items to the thread history without starting a user turn.
ThreadInjectItems => "thread/inject_items" {
params: v2::ThreadInjectItemsParams,
response: v2::ThreadInjectItemsResponse,
},
SkillsList => "skills/list" {
params: v2::SkillsListParams,
response: v2::SkillsListResponse,
},
MarketplaceAdd => "marketplace/add" {
params: v2::MarketplaceAddParams,
response: v2::MarketplaceAddResponse,
},
PluginList => "plugin/list" {
params: v2::PluginListParams,
response: v2::PluginListResponse,
@@ -1012,8 +1026,10 @@ server_notification_definitions! {
ThreadRealtimeStarted => "thread/realtime/started" (v2::ThreadRealtimeStartedNotification),
#[experimental("thread/realtime/itemAdded")]
ThreadRealtimeItemAdded => "thread/realtime/itemAdded" (v2::ThreadRealtimeItemAddedNotification),
#[experimental("thread/realtime/transcriptUpdated")]
ThreadRealtimeTranscriptUpdated => "thread/realtime/transcriptUpdated" (v2::ThreadRealtimeTranscriptUpdatedNotification),
#[experimental("thread/realtime/transcript/delta")]
ThreadRealtimeTranscriptDelta => "thread/realtime/transcript/delta" (v2::ThreadRealtimeTranscriptDeltaNotification),
#[experimental("thread/realtime/transcript/done")]
ThreadRealtimeTranscriptDone => "thread/realtime/transcript/done" (v2::ThreadRealtimeTranscriptDoneNotification),
#[experimental("thread/realtime/outputAudio/delta")]
ThreadRealtimeOutputAudioDelta => "thread/realtime/outputAudio/delta" (v2::ThreadRealtimeOutputAudioDeltaNotification),
#[experimental("thread/realtime/sdp")]
@@ -1046,6 +1062,8 @@ mod tests {
use codex_protocol::account::PlanType;
use codex_protocol::parse_command::ParsedCommand;
use codex_protocol::protocol::RealtimeConversationVersion;
use codex_protocol::protocol::RealtimeOutputModality;
use codex_protocol::protocol::RealtimeVoice;
use codex_utils_absolute_path::AbsolutePathBuf;
use pretty_assertions::assert_eq;
use serde_json::json;
@@ -1774,10 +1792,11 @@ mod tests {
request_id: RequestId::Integer(9),
params: v2::ThreadRealtimeStartParams {
thread_id: "thr_123".to_string(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("You are on a call".to_string())),
session_id: Some("sess_456".to_string()),
transport: None,
voice: Some(codex_protocol::protocol::RealtimeVoice::Marin),
voice: Some(RealtimeVoice::Marin),
},
};
assert_eq!(
@@ -1786,6 +1805,7 @@ mod tests {
"id": 9,
"params": {
"threadId": "thr_123",
"outputModality": "audio",
"prompt": "You are on a call",
"sessionId": "sess_456",
"transport": null,
@@ -1803,6 +1823,7 @@ mod tests {
request_id: RequestId::Integer(9),
params: v2::ThreadRealtimeStartParams {
thread_id: "thr_123".to_string(),
output_modality: RealtimeOutputModality::Audio,
prompt: None,
session_id: None,
transport: None,
@@ -1815,6 +1836,7 @@ mod tests {
"id": 9,
"params": {
"threadId": "thr_123",
"outputModality": "audio",
"sessionId": null,
"transport": null,
"voice": null
@@ -1827,6 +1849,7 @@ mod tests {
request_id: RequestId::Integer(9),
params: v2::ThreadRealtimeStartParams {
thread_id: "thr_123".to_string(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(None),
session_id: None,
transport: None,
@@ -1839,6 +1862,7 @@ mod tests {
"id": 9,
"params": {
"threadId": "thr_123",
"outputModality": "audio",
"prompt": null,
"sessionId": null,
"transport": null,
@@ -1853,6 +1877,7 @@ mod tests {
"id": 9,
"params": {
"threadId": "thr_123",
"outputModality": "audio",
"sessionId": null,
"transport": null,
"voice": null
@@ -1868,6 +1893,7 @@ mod tests {
"id": 9,
"params": {
"threadId": "thr_123",
"outputModality": "audio",
"prompt": null,
"sessionId": null,
"transport": null,
@@ -1952,6 +1978,7 @@ mod tests {
request_id: RequestId::Integer(1),
params: v2::ThreadRealtimeStartParams {
thread_id: "thr_123".to_string(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("You are on a call".to_string())),
session_id: None,
transport: None,


@@ -74,6 +74,7 @@ use codex_protocol::protocol::RateLimitWindow as CoreRateLimitWindow;
use codex_protocol::protocol::ReadOnlyAccess as CoreReadOnlyAccess;
use codex_protocol::protocol::RealtimeAudioFrame as CoreRealtimeAudioFrame;
use codex_protocol::protocol::RealtimeConversationVersion;
use codex_protocol::protocol::RealtimeOutputModality;
use codex_protocol::protocol::RealtimeVoice;
use codex_protocol::protocol::RealtimeVoicesList;
use codex_protocol::protocol::ReviewDecision as CoreReviewDecision;
@@ -2320,10 +2321,12 @@ pub struct FsGetMetadataParams {
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct FsGetMetadataResponse {
/// Whether the path currently resolves to a directory.
/// Whether the path resolves to a directory.
pub is_directory: bool,
/// Whether the path currently resolves to a regular file.
/// Whether the path resolves to a regular file.
pub is_file: bool,
/// Whether the path itself is a symbolic link.
pub is_symlink: bool,
/// File creation time in Unix milliseconds when available, otherwise `0`.
#[ts(type = "number")]
pub created_at_ms: i64,
@@ -3049,6 +3052,43 @@ pub struct ThreadMetadataUpdateResponse {
pub thread: Thread,
}
#[derive(Serialize, Deserialize, Debug, Clone, Copy, PartialEq, Eq, JsonSchema, TS)]
#[serde(rename_all = "lowercase")]
#[ts(rename_all = "lowercase")]
pub enum ThreadMemoryMode {
Enabled,
Disabled,
}
impl ThreadMemoryMode {
pub fn as_str(self) -> &'static str {
match self {
Self::Enabled => "enabled",
Self::Disabled => "disabled",
}
}
pub fn to_core(self) -> codex_protocol::protocol::ThreadMemoryMode {
match self {
Self::Enabled => codex_protocol::protocol::ThreadMemoryMode::Enabled,
Self::Disabled => codex_protocol::protocol::ThreadMemoryMode::Disabled,
}
}
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct ThreadMemoryModeSetParams {
pub thread_id: String,
pub mode: ThreadMemoryMode,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct ThreadMemoryModeSetResponse {}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
@@ -3287,6 +3327,26 @@ pub struct SkillsListResponse {
pub data: Vec<SkillsListEntry>,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct MarketplaceAddParams {
pub source: String,
#[ts(optional = nullable)]
pub ref_name: Option<String>,
#[ts(optional = nullable)]
pub sparse_paths: Option<Vec<String>>,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct MarketplaceAddResponse {
pub marketplace_name: String,
pub installed_root: AbsolutePathBuf,
pub already_added: bool,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
@@ -3363,7 +3423,7 @@ pub struct SkillMetadata {
#[serde(default, skip_serializing_if = "Option::is_none")]
#[ts(optional)]
pub dependencies: Option<SkillDependencies>,
pub path: PathBuf,
pub path: AbsolutePathBuf,
pub scope: SkillScope,
pub enabled: bool,
}
@@ -3509,7 +3569,7 @@ pub struct SkillSummary {
pub description: String,
pub short_description: Option<String>,
pub interface: Option<SkillInterface>,
pub path: PathBuf,
pub path: AbsolutePathBuf,
pub enabled: bool,
}
@@ -3917,11 +3977,14 @@ impl From<ThreadRealtimeAudioChunk> for CoreRealtimeAudioFrame {
}
/// EXPERIMENTAL - start a thread-scoped realtime session.
#[derive(Serialize, Deserialize, Debug, Default, Clone, PartialEq, JsonSchema, TS)]
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct ThreadRealtimeStartParams {
pub thread_id: String,
/// Selects text or audio output for the realtime session. Transport and voice stay
/// independent so clients can choose how they connect separately from what the model emits.
pub output_modality: RealtimeOutputModality,
#[serde(
default,
deserialize_with = "super::serde_helpers::deserialize_double_option",
@@ -4039,9 +4102,22 @@ pub struct ThreadRealtimeItemAddedNotification {
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct ThreadRealtimeTranscriptUpdatedNotification {
pub struct ThreadRealtimeTranscriptDeltaNotification {
pub thread_id: String,
pub role: String,
/// Live transcript delta from the realtime event.
pub delta: String,
}
/// EXPERIMENTAL - final transcript text emitted when realtime completes
/// a transcript part.
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct ThreadRealtimeTranscriptDoneNotification {
pub thread_id: String,
pub role: String,
/// Final complete text for the transcript part.
pub text: String,
}
@@ -4214,6 +4290,20 @@ pub struct TurnStartResponse {
pub turn: Turn,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct ThreadInjectItemsParams {
pub thread_id: String,
/// Raw Responses API items to append to the thread's model-visible history.
pub items: Vec<JsonValue>,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]
pub struct ThreadInjectItemsResponse {}
#[derive(
Serialize, Deserialize, Debug, Default, Clone, PartialEq, JsonSchema, TS, ExperimentalApi,
)]
@@ -6714,6 +6804,7 @@ mod tests {
let response = FsGetMetadataResponse {
is_directory: false,
is_file: true,
is_symlink: false,
created_at_ms: 123,
modified_at_ms: 456,
};
@@ -6724,6 +6815,7 @@ mod tests {
json!({
"isDirectory": false,
"isFile": true,
"isSymlink": false,
"createdAtMs": 123,
"modifiedAtMs": 456,
})
@@ -8297,6 +8389,37 @@ mod tests {
);
}
#[test]
fn marketplace_add_params_serialization_uses_optional_ref_name_and_sparse_paths() {
assert_eq!(
serde_json::to_value(MarketplaceAddParams {
source: "owner/repo".to_string(),
ref_name: None,
sparse_paths: None,
})
.unwrap(),
json!({
"source": "owner/repo",
"refName": null,
"sparsePaths": null,
}),
);
assert_eq!(
serde_json::to_value(MarketplaceAddParams {
source: "owner/repo".to_string(),
ref_name: Some("main".to_string()),
sparse_paths: Some(vec!["plugins/foo".to_string()]),
})
.unwrap(),
json!({
"source": "owner/repo",
"refName": "main",
"sparsePaths": ["plugins/foo"],
}),
);
}
#[test]
fn plugin_install_params_serialization_uses_force_remote_sync() {
let marketplace_path = if cfg!(windows) {

View File

@@ -140,9 +140,10 @@ Example with notification opt-out:
- `thread/loaded/list` — list the thread ids currently loaded in memory.
- `thread/read` — read a stored thread by id without resuming it; optionally include turns via `includeTurns`. The returned `thread` includes `status` (`ThreadStatus`), defaulting to `notLoaded` when the thread is not currently loaded.
- `thread/metadata/update` — patch stored thread metadata in sqlite; currently supports updating persisted `gitInfo` fields and returns the refreshed `thread`.
- `thread/memoryMode/set` — experimental; set a thread's persisted memory eligibility to `"enabled"` or `"disabled"` for either a loaded thread or a stored rollout; returns `{}` on success.
- `thread/status/changed` — notification emitted when a loaded thread's status changes (`threadId` + new `status`).
- `thread/archive` — move a thread's rollout file into the archived directory; returns `{}` on success and emits `thread/archived`.
-- `thread/unsubscribe` — unsubscribe this connection from thread turn/item events. If this was the last subscriber, the server shuts down and unloads the thread, then emits `thread/closed`.
+- `thread/unsubscribe` — unsubscribe this connection from thread turn/item events. If this was the last subscriber, the server keeps the thread loaded and unloads it only after it has had no subscribers and no thread activity for 30 minutes, then emits `thread/closed`.
- `thread/name/set` — set or update a thread's user-facing name for either a loaded thread or a persisted rollout; returns `{}` on success and emits `thread/name/updated` to initialized, opted-in clients. Thread names are not required to be unique; name lookups resolve to the most recently updated thread.
- `thread/unarchive` — move an archived rollout file back into the sessions directory; returns the restored `thread` on success and emits `thread/unarchived`.
- `thread/compact/start` — trigger conversation history compaction for a thread; returns `{}` immediately while progress streams through standard turn/item notifications.
@@ -150,9 +151,10 @@ Example with notification opt-out:
- `thread/backgroundTerminals/clean` — terminate all running background terminals for a thread (experimental; requires `capabilities.experimentalApi`); returns `{}` when the cleanup request is accepted.
- `thread/rollback` — drop the last N turns from the agent's in-memory context and persist a rollback marker in the rollout so future resumes see the pruned history; returns the updated `thread` (with `turns` populated) on success.
- `turn/start` — add user input to a thread and begin Codex generation; responds with the initial `turn` object and streams `turn/started`, `item/*`, and `turn/completed` notifications. For `collaborationMode`, `settings.developer_instructions: null` means "use built-in instructions for the selected mode".
- `thread/inject_items` — append raw Responses API items to a loaded thread's model-visible history without starting a user turn; returns `{}` on success.
- `turn/steer` — add user input to an already in-flight regular turn without starting a new turn; returns the active `turnId` that accepted the input. Review and manual compaction turns reject `turn/steer`.
- `turn/interrupt` — request cancellation of an in-flight turn by `(thread_id, turn_id)`; success is an empty `{}` response and the turn finishes with `status: "interrupted"`.
-- `thread/realtime/start` — start a thread-scoped realtime session (experimental); returns `{}` and streams `thread/realtime/*` notifications. Omit `transport` for the websocket transport, or pass `{ "type": "webrtc", "sdp": "..." }` to create a WebRTC session from a browser-generated SDP offer; the remote answer SDP is emitted as `thread/realtime/sdp`.
+- `thread/realtime/start` — start a thread-scoped realtime session (experimental); pass `outputModality: "text"` or `outputModality: "audio"` to choose model output; returns `{}` and streams `thread/realtime/*` notifications. Omit `transport` for the websocket transport, or pass `{ "type": "webrtc", "sdp": "..." }` to create a WebRTC session from a browser-generated SDP offer; the remote answer SDP is emitted as `thread/realtime/sdp`.
- `thread/realtime/appendAudio` — append an input audio chunk to the active realtime session (experimental); returns `{}`.
- `thread/realtime/appendText` — append text input to the active realtime session (experimental); returns `{}`.
- `thread/realtime/stop` — stop the active realtime session for the thread (experimental); returns `{}`.
@@ -165,7 +167,7 @@ Example with notification opt-out:
- `fs/readFile` — read an absolute file path and return `{ dataBase64 }`.
- `fs/writeFile` — write an absolute file path from base64-encoded `{ dataBase64 }`; returns `{}`.
- `fs/createDirectory` — create an absolute directory path; `recursive` defaults to `true`.
-- `fs/getMetadata` — return metadata for an absolute path: `isDirectory`, `isFile`, `createdAtMs`, and `modifiedAtMs`.
+- `fs/getMetadata` — return metadata for an absolute path: `isDirectory`, `isFile`, `isSymlink`, `createdAtMs`, and `modifiedAtMs`.
- `fs/readDirectory` — list direct child entries for an absolute directory path; each entry contains `fileName`, `isDirectory`, and `isFile`, and `fileName` is just the child name, not a path.
- `fs/remove` — remove an absolute file or directory tree; `recursive` and `force` default to `true`.
- `fs/copy` — copy between absolute paths; directory copies require `recursive: true`.
@@ -177,6 +179,7 @@ Example with notification opt-out:
- `experimentalFeature/enablement/set` — patch the in-memory process-wide runtime feature enablement for the currently supported feature keys (`apps`, `plugins`). For each feature, precedence is: cloud requirements > --enable <feature_name> > config.toml > experimentalFeature/enablement/set (new) > code default.
- `collaborationMode/list` — list available collaboration mode presets (experimental, no pagination). This response omits built-in developer instructions; clients should either pass `settings.developer_instructions: null` when setting a mode to use Codex's built-in instructions, or provide their own instructions explicitly.
- `skills/list` — list skills for one or more `cwd` values (optional `forceReload`).
- `marketplace/add` — add a remote plugin marketplace from an HTTP(S) Git URL, SSH Git URL, or GitHub `owner/repo` shorthand, then persist it into the user marketplace config. Returns the installed root path plus whether the marketplace was already present.
- `plugin/list` — list discovered plugin marketplaces and plugin state, including effective marketplace install/auth policy metadata, fail-open `marketplaceLoadErrors` entries for marketplace files that could not be parsed or loaded, and best-effort `featuredPluginIds` for the official curated marketplace. `interface.category` uses the marketplace category when present; otherwise it falls back to the plugin manifest category. Pass `forceRemoteSync: true` to refresh curated plugin state before listing (**under development; do not call from production clients yet**).
- `plugin/read` — read one plugin by `marketplacePath` plus `pluginName`, returning marketplace info, a list-style `summary`, manifest descriptions/interface metadata, and bundled skills/apps/MCP server names. Returned plugin skills include their current `enabled` state after local config filtering. Plugin app summaries also include `needsAuth` when the server can determine connector accessibility (**under development; do not call from production clients yet**).
- `skills/changed` — notification emitted when watched local skill files change.
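The `marketplace/add` request uses the same JSON-RPC shape as the other methods; a minimal illustrative call (the request `id` and values are placeholders, and the camelCase field names follow the `MarketplaceAddParams` serialization shown in the tests below):

```json
{ "method": "marketplace/add", "id": 30, "params": {
  "source": "owner/repo",
  "refName": "main",
  "sparsePaths": ["plugins/foo"]
} }
```

`refName` and `sparsePaths` may be `null` to clone the default branch with a full checkout.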
@@ -337,11 +340,16 @@ When `nextCursor` is `null`, you've reached the final page.
- `notSubscribed` when the connection was not subscribed to that thread.
- `notLoaded` when the thread is not loaded.
-If this was the last subscriber, the server unloads the thread and emits `thread/closed` and a `thread/status/changed` transition to `notLoaded`.
+If this was the last subscriber, the server does not unload the thread immediately. It unloads the thread after the thread has had no subscribers and no thread activity for 30 minutes, then emits `thread/closed` and a `thread/status/changed` transition to `notLoaded`.
```json
{ "method": "thread/unsubscribe", "id": 22, "params": { "threadId": "thr_123" } }
{ "id": 22, "result": { "status": "unsubscribed" } }
```
Later, after the idle unload timeout:
```json
{ "method": "thread/status/changed", "params": {
"threadId": "thr_123",
"status": { "type": "notLoaded" }
@@ -395,6 +403,16 @@ Use `thread/metadata/update` to patch sqlite-backed metadata for a thread withou
} }
```
Experimental: use `thread/memoryMode/set` to change whether a thread remains eligible for future memory generation.
```json
{ "method": "thread/memoryMode/set", "id": 26, "params": {
"threadId": "thr_123",
"mode": "disabled"
} }
{ "id": 26, "result": {} }
```
### Example: Archive a thread
Use `thread/archive` to move the persisted rollout (stored as a JSONL file on disk) into the archived sessions directory.
@@ -565,6 +583,24 @@ Invoke a plugin by including a UI mention token such as `@sample` in the text in
} } }
```
### Example: Inject raw history items
Use `thread/inject_items` to append prebuilt Responses API items to a loaded thread's prompt history without starting a user turn. These items are persisted to the rollout and included in subsequent model requests.
```json
{ "method": "thread/inject_items", "id": 36, "params": {
"threadId": "thr_123",
"items": [
{
"type": "message",
"role": "assistant",
"content": [{ "type": "output_text", "text": "Previously computed context." }]
}
]
} }
{ "id": 36, "result": {} }
```
### Example: Start realtime with WebRTC
Use `thread/realtime/start` with `transport.type: "webrtc"` when a browser or webview owns the `RTCPeerConnection` and app-server should create the server-side realtime session. The transport `sdp` must be the offer SDP produced by `RTCPeerConnection.createOffer()`, not a hand-written or minimal SDP string.
@@ -592,6 +628,7 @@ Then send `offer.sdp` to app-server. Core uses `experimental_realtime_ws_backend
```json
{ "method": "thread/realtime/start", "id": 40, "params": {
"threadId": "thr_123",
"outputModality": "audio",
"prompt": "You are on a call.",
"sessionId": null,
"transport": { "type": "webrtc", "sdp": "v=0\r\no=..." }
@@ -843,6 +880,7 @@ All filesystem paths in this section must be absolute.
{ "id": 42, "result": {
"isDirectory": false,
"isFile": true,
"isSymlink": false,
"createdAtMs": 1730910000000,
"modifiedAtMs": 1730910000000
} }
@@ -854,7 +892,7 @@ All filesystem paths in this section must be absolute.
} }
```
-- `fs/getMetadata` returns whether the path currently resolves to a directory or regular file, plus `createdAtMs` and `modifiedAtMs` in Unix milliseconds. If a timestamp is unavailable on the current platform, that field is `0`.
+- `fs/getMetadata` returns whether the path resolves to a directory or regular file, whether the path itself is a symlink, plus `createdAtMs` and `modifiedAtMs` in Unix milliseconds. If a timestamp is unavailable on the current platform, that field is `0`.
- `fs/createDirectory` defaults `recursive` to `true` when omitted.
- `fs/remove` defaults both `recursive` and `force` to `true` when omitted.
- `fs/readFile` always returns base64 bytes via `dataBase64`, and `fs/writeFile` always expects base64 bytes in `dataBase64`.
@@ -915,7 +953,8 @@ The thread realtime API emits thread-scoped notifications for session lifecycle
- `thread/realtime/started``{ threadId, sessionId }` once realtime starts for the thread (experimental).
- `thread/realtime/itemAdded``{ threadId, item }` for raw non-audio realtime items that do not have a dedicated typed app-server notification, including `handoff_request` (experimental). `item` is forwarded as raw JSON while the upstream websocket item schema remains unstable.
-- `thread/realtime/transcriptUpdated` — `{ threadId, role, text }` whenever realtime transcript text changes (experimental). This forwards the live transcript delta from that realtime event, not the full accumulated transcript.
+- `thread/realtime/transcript/delta` — `{ threadId, role, delta }` for live realtime transcript deltas (experimental).
+- `thread/realtime/transcript/done` — `{ threadId, role, text }` when realtime emits the final full text for a transcript part (experimental).
- `thread/realtime/outputAudio/delta``{ threadId, audio }` for streamed output audio chunks (experimental). `audio` uses camelCase fields (`data`, `sampleRate`, `numChannels`, `samplesPerChannel`).
- `thread/realtime/error``{ threadId, message }` when realtime encounters a transport or backend error (experimental).
- `thread/realtime/closed``{ threadId, reason }` when the realtime transport closes (experimental).
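As an illustration (the thread id and text are placeholders), one assistant transcript part streamed over these notifications might look like:

```json
{ "method": "thread/realtime/transcript/delta", "params": {
  "threadId": "thr_123", "role": "assistant", "delta": "Hel"
} }
{ "method": "thread/realtime/transcript/delta", "params": {
  "threadId": "thr_123", "role": "assistant", "delta": "lo."
} }
{ "method": "thread/realtime/transcript/done", "params": {
  "threadId": "thr_123", "role": "assistant", "text": "Hello."
} }
```

Note that `transcript/done` forwards the final text from the realtime item done event; it is not produced by concatenating the preceding deltas, so clients should prefer it as the authoritative text for the part.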

View File

@@ -12,6 +12,7 @@ use crate::thread_state::TurnSummary;
use crate::thread_state::resolve_server_request_on_thread_listener;
use crate::thread_status::ThreadWatchActiveGuard;
use crate::thread_status::ThreadWatchManager;
use codex_analytics::AnalyticsEventsClient;
use codex_app_server_protocol::AccountRateLimitsUpdatedNotification;
use codex_app_server_protocol::AdditionalPermissionProfile as V2AdditionalPermissionProfile;
use codex_app_server_protocol::AgentMessageDeltaNotification;
@@ -82,7 +83,8 @@ use codex_app_server_protocol::ThreadRealtimeItemAddedNotification;
use codex_app_server_protocol::ThreadRealtimeOutputAudioDeltaNotification;
use codex_app_server_protocol::ThreadRealtimeSdpNotification;
use codex_app_server_protocol::ThreadRealtimeStartedNotification;
-use codex_app_server_protocol::ThreadRealtimeTranscriptUpdatedNotification;
+use codex_app_server_protocol::ThreadRealtimeTranscriptDeltaNotification;
+use codex_app_server_protocol::ThreadRealtimeTranscriptDoneNotification;
use codex_app_server_protocol::ThreadRollbackResponse;
use codex_app_server_protocol::ThreadTokenUsage;
use codex_app_server_protocol::ThreadTokenUsageUpdatedNotification;
@@ -167,6 +169,7 @@ pub(crate) async fn apply_bespoke_event_handling(
conversation_id: ThreadId,
conversation: Arc<CodexThread>,
thread_manager: Arc<ThreadManager>,
analytics_events_client: Option<AnalyticsEventsClient>,
outgoing: ThreadScopedOutgoingMessageSender,
thread_state: Arc<tokio::sync::Mutex<ThreadState>>,
thread_watch_manager: ThreadWatchManager,
@@ -202,6 +205,10 @@ pub(crate) async fn apply_bespoke_event_handling(
thread_id: conversation_id.to_string(),
turn,
};
if let Some(analytics_events_client) = analytics_events_client.as_ref() {
analytics_events_client
.track_notification(ServerNotification::TurnStarted(notification.clone()));
}
outgoing
.send_server_notification(ServerNotification::TurnStarted(notification))
.await;
@@ -218,6 +225,7 @@ pub(crate) async fn apply_bespoke_event_handling(
conversation_id,
event_turn_id,
turn_complete_event,
analytics_events_client.as_ref(),
&outgoing,
&thread_state,
)
@@ -401,26 +409,50 @@ pub(crate) async fn apply_bespoke_event_handling(
.await;
}
RealtimeEvent::InputTranscriptDelta(event) => {
-let notification = ThreadRealtimeTranscriptUpdatedNotification {
+let notification = ThreadRealtimeTranscriptDeltaNotification {
thread_id: conversation_id.to_string(),
role: "user".to_string(),
-text: event.delta,
+delta: event.delta,
};
outgoing
.send_server_notification(
-ServerNotification::ThreadRealtimeTranscriptUpdated(notification),
+ServerNotification::ThreadRealtimeTranscriptDelta(notification),
)
.await;
}
RealtimeEvent::InputTranscriptDone(event) => {
let notification = ThreadRealtimeTranscriptDoneNotification {
thread_id: conversation_id.to_string(),
role: "user".to_string(),
text: event.text,
};
outgoing
.send_server_notification(
ServerNotification::ThreadRealtimeTranscriptDone(notification),
)
.await;
}
RealtimeEvent::OutputTranscriptDelta(event) => {
-let notification = ThreadRealtimeTranscriptUpdatedNotification {
+let notification = ThreadRealtimeTranscriptDeltaNotification {
thread_id: conversation_id.to_string(),
role: "assistant".to_string(),
-text: event.delta,
+delta: event.delta,
};
outgoing
.send_server_notification(
-ServerNotification::ThreadRealtimeTranscriptUpdated(notification),
+ServerNotification::ThreadRealtimeTranscriptDelta(notification),
)
.await;
}
RealtimeEvent::OutputTranscriptDone(event) => {
let notification = ThreadRealtimeTranscriptDoneNotification {
thread_id: conversation_id.to_string(),
role: "assistant".to_string(),
text: event.text,
};
outgoing
.send_server_notification(
ServerNotification::ThreadRealtimeTranscriptDone(notification),
)
.await;
}
@@ -1303,6 +1335,11 @@ pub(crate) async fn apply_bespoke_event_handling(
.await;
}
EventMsg::ContextCompacted(..) => {
// Core still fans out this deprecated event for legacy clients;
// v2 clients receive the canonical ContextCompaction item instead.
if matches!(api_version, ApiVersion::V2) {
return;
}
let notification = ContextCompactedNotification {
thread_id: conversation_id.to_string(),
turn_id: event_turn_id.clone(),
@@ -1599,6 +1636,17 @@ pub(crate) async fn apply_bespoke_event_handling(
.await;
}
EventMsg::ExecCommandBegin(exec_command_begin_event) => {
if matches!(api_version, ApiVersion::V2)
&& matches!(
exec_command_begin_event.source,
codex_protocol::protocol::ExecCommandSource::UnifiedExecInteraction
)
{
// TerminalInteraction is the v2 surface for unified exec
// stdin/poll events. Suppress the legacy CommandExecution
// item so clients do not render the same wait twice.
return;
}
let item_id = exec_command_begin_event.call_id.clone();
let command_actions = exec_command_begin_event
.parsed_cmd
@@ -1702,6 +1750,17 @@ pub(crate) async fn apply_bespoke_event_handling(
.command_execution_started
.remove(&call_id);
}
if matches!(api_version, ApiVersion::V2)
&& matches!(
exec_command_end_event.source,
codex_protocol::protocol::ExecCommandSource::UnifiedExecInteraction
)
{
// The paired begin event is suppressed above; keep the
// completion out of v2 as well so no orphan legacy item is
// emitted for unified exec interactions.
return;
}
let item = build_command_execution_end_item(&exec_command_end_event);
@@ -1746,6 +1805,7 @@ pub(crate) async fn apply_bespoke_event_handling(
conversation_id,
event_turn_id,
turn_aborted_event,
analytics_events_client.as_ref(),
&outgoing,
&thread_state,
)
@@ -1923,6 +1983,7 @@ async fn emit_turn_completed_with_status(
conversation_id: ThreadId,
event_turn_id: String,
turn_completion_metadata: TurnCompletionMetadata,
analytics_events_client: Option<&AnalyticsEventsClient>,
outgoing: &ThreadScopedOutgoingMessageSender,
) {
let notification = TurnCompletedNotification {
@@ -1937,6 +1998,10 @@ async fn emit_turn_completed_with_status(
duration_ms: turn_completion_metadata.duration_ms,
},
};
if let Some(analytics_events_client) = analytics_events_client {
analytics_events_client
.track_notification(ServerNotification::TurnCompleted(notification.clone()));
}
outgoing
.send_server_notification(ServerNotification::TurnCompleted(notification))
.await;
@@ -2129,6 +2194,7 @@ async fn handle_turn_complete(
conversation_id: ThreadId,
event_turn_id: String,
turn_complete_event: TurnCompleteEvent,
analytics_events_client: Option<&AnalyticsEventsClient>,
outgoing: &ThreadScopedOutgoingMessageSender,
thread_state: &Arc<Mutex<ThreadState>>,
) {
@@ -2149,6 +2215,7 @@ async fn handle_turn_complete(
completed_at: turn_complete_event.completed_at,
duration_ms: turn_complete_event.duration_ms,
},
analytics_events_client,
outgoing,
)
.await;
@@ -2158,6 +2225,7 @@ async fn handle_turn_interrupted(
conversation_id: ThreadId,
event_turn_id: String,
turn_aborted_event: TurnAbortedEvent,
analytics_events_client: Option<&AnalyticsEventsClient>,
outgoing: &ThreadScopedOutgoingMessageSender,
thread_state: &Arc<Mutex<ThreadState>>,
) {
@@ -2173,6 +2241,7 @@ async fn handle_turn_interrupted(
completed_at: turn_aborted_event.completed_at,
duration_ms: turn_aborted_event.duration_ms,
},
analytics_events_client,
outgoing,
)
.await;
@@ -2913,6 +2982,7 @@ mod tests {
use codex_app_server_protocol::GuardianApprovalReviewStatus;
use codex_app_server_protocol::JSONRPCErrorError;
use codex_app_server_protocol::TurnPlanStepStatus;
use codex_login::AuthManager;
use codex_login::CodexAuth;
use codex_protocol::items::HookPromptFragment;
use codex_protocol::items::build_hook_prompt_message;
@@ -3043,6 +3113,7 @@ mod tests {
outgoing: ThreadScopedOutgoingMessageSender,
thread_state: Arc<Mutex<ThreadState>>,
thread_watch_manager: ThreadWatchManager,
analytics_events_client: AnalyticsEventsClient,
codex_home: PathBuf,
}
@@ -3057,6 +3128,7 @@ mod tests {
self.conversation_id,
self.conversation.clone(),
self.thread_manager.clone(),
Some(self.analytics_events_client.clone()),
self.outgoing.clone(),
self.thread_state.clone(),
self.thread_watch_manager.clone(),
@@ -3356,7 +3428,7 @@ mod tests {
codex_core::test_support::thread_manager_with_models_provider_and_home(
CodexAuth::create_dummy_chatgpt_auth_for_testing(),
config.model_provider.clone(),
-config.codex_home.clone(),
+config.codex_home.to_path_buf(),
Arc::new(codex_exec_server::EnvironmentManager::new(
/*exec_server_url*/ None,
)),
@@ -3383,6 +3455,13 @@ mod tests {
outgoing: outgoing.clone(),
thread_state: thread_state.clone(),
thread_watch_manager: thread_watch_manager.clone(),
analytics_events_client: AnalyticsEventsClient::new(
AuthManager::from_auth_for_testing(
CodexAuth::create_dummy_chatgpt_auth_for_testing(),
),
"http://localhost".to_string(),
Some(false),
),
codex_home: codex_home.path().to_path_buf(),
};
@@ -3833,6 +3912,7 @@ mod tests {
conversation_id,
event_turn_id.clone(),
turn_complete_event(&event_turn_id),
/*analytics_events_client*/ None,
&outgoing,
&thread_state,
)
@@ -3881,6 +3961,7 @@ mod tests {
conversation_id,
event_turn_id.clone(),
turn_aborted_event(&event_turn_id),
/*analytics_events_client*/ None,
&outgoing,
&thread_state,
)
@@ -3928,6 +4009,7 @@ mod tests {
conversation_id,
event_turn_id.clone(),
turn_complete_event(&event_turn_id),
/*analytics_events_client*/ None,
&outgoing,
&thread_state,
)
@@ -4194,6 +4276,7 @@ mod tests {
conversation_a,
a_turn1.clone(),
turn_complete_event(&a_turn1),
/*analytics_events_client*/ None,
&outgoing,
&thread_state,
)
@@ -4215,6 +4298,7 @@ mod tests {
conversation_b,
b_turn1.clone(),
turn_complete_event(&b_turn1),
/*analytics_events_client*/ None,
&outgoing,
&thread_state,
)
@@ -4226,6 +4310,7 @@ mod tests {
conversation_a,
a_turn2.clone(),
turn_complete_event(&a_turn2),
/*analytics_events_client*/ None,
&outgoing,
&thread_state,
)

File diff suppressed because it is too large

View File

@@ -210,7 +210,7 @@ impl ConfigApi {
.write_value(params)
.await
.map_err(map_error)?;
-self.emit_plugin_toggle_events(pending_changes);
+self.emit_plugin_toggle_events(pending_changes).await;
Ok(response)
}
@@ -230,7 +230,7 @@ impl ConfigApi {
.batch_write(params)
.await
.map_err(map_error)?;
-self.emit_plugin_toggle_events(pending_changes);
+self.emit_plugin_toggle_events(pending_changes).await;
if reload_user_config {
self.user_config_reloader.reload_user_config().await;
}
@@ -299,13 +299,16 @@ impl ConfigApi {
Ok(ExperimentalFeatureEnablementSetResponse { enablement })
}
-fn emit_plugin_toggle_events(&self, pending_changes: std::collections::BTreeMap<String, bool>) {
+async fn emit_plugin_toggle_events(
+&self,
+pending_changes: std::collections::BTreeMap<String, bool>,
+) {
for (plugin_id, enabled) in pending_changes {
let Ok(plugin_id) = PluginId::parse(&plugin_id) else {
continue;
};
let metadata =
-installed_plugin_telemetry_metadata(self.codex_home.as_path(), &plugin_id);
+installed_plugin_telemetry_metadata(self.codex_home.as_path(), &plugin_id).await;
if enabled {
self.analytics_events_client.track_plugin_enabled(metadata);
} else {

View File

@@ -99,6 +99,7 @@ impl FsApi {
Ok(FsGetMetadataResponse {
is_directory: metadata.is_directory,
is_file: metadata.is_file,
is_symlink: metadata.is_symlink,
created_at_ms: metadata.created_at_ms,
modified_at_ms: metadata.modified_at_ms,
})

View File

@@ -410,7 +410,7 @@ pub async fn run_main_with_transport(
cloud_requirements_loader(
auth_manager,
config.chatgpt_base_url,
-config.codex_home.clone(),
+config.codex_home.to_path_buf(),
)
}
Err(err) => {

View File

@@ -12,6 +12,7 @@ use std::path::PathBuf;
// Debug-only test hook: lets integration tests point the server at a temporary
// managed config file without writing to /etc.
const MANAGED_CONFIG_PATH_ENV_VAR: &str = "CODEX_APP_SERVER_MANAGED_CONFIG_PATH";
const DISABLE_MANAGED_CONFIG_ENV_VAR: &str = "CODEX_APP_SERVER_DISABLE_MANAGED_CONFIG";
#[derive(Debug, Parser)]
struct AppServerArgs {
@@ -40,10 +41,13 @@ struct AppServerArgs {
fn main() -> anyhow::Result<()> {
arg0_dispatch_or_else(|arg0_paths: Arg0DispatchPaths| async move {
let args = AppServerArgs::parse();
-let managed_config_path = managed_config_path_from_debug_env();
-let loader_overrides = LoaderOverrides {
-managed_config_path,
-..Default::default()
+let loader_overrides = if disable_managed_config_from_debug_env() {
+LoaderOverrides::without_managed_config_for_tests()
+} else {
+LoaderOverrides {
+managed_config_path: managed_config_path_from_debug_env(),
+..Default::default()
+}
};
let transport = args.listen;
let session_source = args.session_source;
@@ -63,6 +67,17 @@ fn main() -> anyhow::Result<()> {
})
}
fn disable_managed_config_from_debug_env() -> bool {
#[cfg(debug_assertions)]
{
if let Ok(value) = std::env::var(DISABLE_MANAGED_CONFIG_ENV_VAR) {
return matches!(value.as_str(), "1" | "true" | "TRUE" | "yes" | "YES");
}
}
false
}
fn managed_config_path_from_debug_env() -> Option<PathBuf> {
#[cfg(debug_assertions)]
{

View File

@@ -9,7 +9,6 @@ use std::sync::atomic::Ordering;
use crate::codex_message_processor::CodexMessageProcessor;
use crate::codex_message_processor::CodexMessageProcessorArgs;
use crate::config_api::ConfigApi;
-use crate::error_code::INTERNAL_ERROR_CODE;
use crate::error_code::INVALID_REQUEST_ERROR_CODE;
use crate::external_agent_config_api::ExternalAgentConfigApi;
use crate::fs_api::FsApi;
@@ -266,7 +265,7 @@ impl MessageProcessor {
.plugins_manager()
.maybe_start_plugin_startup_tasks_for_config(&config, auth_manager.clone());
let config_api = ConfigApi::new(
-config.codex_home.clone(),
+config.codex_home.to_path_buf(),
cli_overrides,
runtime_feature_enablement,
loader_overrides,
@@ -274,7 +273,8 @@ impl MessageProcessor {
thread_manager,
analytics_events_client.clone(),
);
-let external_agent_config_api = ExternalAgentConfigApi::new(config.codex_home.clone());
+let external_agent_config_api =
+ExternalAgentConfigApi::new(config.codex_home.to_path_buf());
let fs_api = FsApi::default();
let fs_watch_manager = FsWatchManager::new(outgoing.clone());
@@ -620,21 +620,9 @@ impl MessageProcessor {
}
let user_agent = get_codex_user_agent();
-let codex_home = match self.config.codex_home.clone().try_into() {
-Ok(codex_home) => codex_home,
-Err(err) => {
-let error = JSONRPCErrorError {
-code: INTERNAL_ERROR_CODE,
-message: format!("Invalid CODEX_HOME: {err}"),
-data: None,
-};
-self.outgoing.send_error(connection_request_id, error).await;
-return;
-}
-};
let response = InitializeResponse {
user_agent,
-codex_home,
+codex_home: self.config.codex_home.clone(),
platform_family: std::env::consts::FAMILY.to_string(),
platform_os: std::env::consts::OS.to_string(),
};
@@ -677,6 +665,16 @@ impl MessageProcessor {
self.outgoing.send_error(connection_request_id, error).await;
return;
}
if self.config.features.enabled(Feature::GeneralAnalytics)
&& let ClientRequest::TurnStart { request_id, .. }
| ClientRequest::TurnSteer { request_id, .. } = &codex_request
{
self.analytics_events_client.track_request(
connection_id.0,
request_id.clone(),
codex_request.clone(),
);
}
match codex_request {
ClientRequest::ConfigRead { request_id, params } => {

View File

@@ -16,6 +16,7 @@ use std::sync::Weak;
use tokio::sync::Mutex;
use tokio::sync::mpsc;
use tokio::sync::oneshot;
use tokio::sync::watch;
use tracing::error;
type PendingInterruptQueue = Vec<(
@@ -159,6 +160,7 @@ pub(crate) async fn resolve_server_request_on_thread_listener(
struct ThreadEntry {
state: Arc<Mutex<ThreadState>>,
connection_ids: HashSet<ConnectionId>,
has_connections_watcher: watch::Sender<bool>,
}
impl Default for ThreadEntry {
@@ -166,10 +168,21 @@ impl Default for ThreadEntry {
Self {
state: Arc::new(Mutex::new(ThreadState::default())),
connection_ids: HashSet::new(),
has_connections_watcher: watch::channel(false).0,
}
}
}
impl ThreadEntry {
fn update_has_connections(&self) {
let _ = self.has_connections_watcher.send_if_modified(|current| {
let prev = *current;
*current = !self.connection_ids.is_empty();
prev != *current
});
}
}
#[derive(Default)]
struct ThreadStateManagerInner {
live_connections: HashSet<ConnectionId>,
@@ -286,12 +299,14 @@ impl ThreadStateManager {
}
if let Some(thread_entry) = state.threads.get_mut(&thread_id) {
thread_entry.connection_ids.remove(&connection_id);
thread_entry.update_has_connections();
}
};
true
}
#[cfg(test)]
pub(crate) async fn has_subscribers(&self, thread_id: ThreadId) -> bool {
self.state
.lock()
@@ -319,6 +334,7 @@ impl ThreadStateManager {
.insert(thread_id);
let thread_entry = state.threads.entry(thread_id).or_default();
thread_entry.connection_ids.insert(connection_id);
thread_entry.update_has_connections();
thread_entry.state.clone()
};
{
@@ -344,12 +360,9 @@ impl ThreadStateManager {
.entry(connection_id)
.or_default()
.insert(thread_id);
-state
-.threads
-.entry(thread_id)
-.or_default()
-.connection_ids
-.insert(connection_id);
+let thread_entry = state.threads.entry(thread_id).or_default();
+thread_entry.connection_ids.insert(connection_id);
+thread_entry.update_has_connections();
true
}
@@ -364,6 +377,7 @@ impl ThreadStateManager {
for thread_id in &thread_ids {
if let Some(thread_entry) = state.threads.get_mut(thread_id) {
thread_entry.connection_ids.remove(&connection_id);
thread_entry.update_has_connections();
}
}
thread_ids
@@ -377,4 +391,15 @@ impl ThreadStateManager {
.collect::<Vec<_>>()
}
}
pub(crate) async fn subscribe_to_has_connections(
&self,
thread_id: ThreadId,
) -> Option<watch::Receiver<bool>> {
let state = self.state.lock().await;
state
.threads
.get(&thread_id)
.map(|thread_entry| thread_entry.has_connections_watcher.subscribe())
}
}

View File

@@ -8,6 +8,7 @@ use codex_app_server_protocol::Thread;
use codex_app_server_protocol::ThreadActiveFlag;
use codex_app_server_protocol::ThreadStatus;
use codex_app_server_protocol::ThreadStatusChangedNotification;
use codex_protocol::ThreadId;
use std::collections::HashMap;
#[cfg(test)]
use std::path::PathBuf;
@@ -244,6 +245,13 @@ impl ThreadWatchManager {
}
}
pub(crate) async fn subscribe(
&self,
thread_id: ThreadId,
) -> Option<watch::Receiver<ThreadStatus>> {
Some(self.state.lock().await.subscribe(thread_id.to_string()))
}
async fn note_active_guard_released(
&self,
thread_id: String,
@@ -295,6 +303,7 @@ pub(crate) fn resolve_thread_status(
#[derive(Default)]
struct ThreadWatchState {
runtime_by_thread_id: HashMap<String, RuntimeFacts>,
status_watcher_by_thread_id: HashMap<String, watch::Sender<ThreadStatus>>,
}
impl ThreadWatchState {
@@ -309,6 +318,7 @@ impl ThreadWatchState {
.entry(thread_id.clone())
.or_default();
runtime.is_loaded = true;
self.update_status_watcher_for_thread(&thread_id);
if emit_notification {
self.status_changed_notification(thread_id, previous_status)
} else {
@@ -319,6 +329,7 @@ impl ThreadWatchState {
fn remove_thread(&mut self, thread_id: &str) -> Option<ThreadStatusChangedNotification> {
let previous_status = self.status_for(thread_id);
self.runtime_by_thread_id.remove(thread_id);
self.update_status_watcher(thread_id, &ThreadStatus::NotLoaded);
if previous_status.is_some() && previous_status != Some(ThreadStatus::NotLoaded) {
Some(ThreadStatusChangedNotification {
thread_id: thread_id.to_string(),
@@ -344,6 +355,7 @@ impl ThreadWatchState {
.or_default();
runtime.is_loaded = true;
mutate(runtime);
self.update_status_watcher_for_thread(thread_id);
self.status_changed_notification(thread_id.to_string(), previous_status)
}
@@ -358,6 +370,40 @@ impl ThreadWatchState {
.unwrap_or(ThreadStatus::NotLoaded)
}
fn subscribe(&mut self, thread_id: String) -> watch::Receiver<ThreadStatus> {
let status = self.loaded_status_for_thread(&thread_id);
let sender = self
.status_watcher_by_thread_id
.entry(thread_id)
.or_insert_with(|| watch::channel(status.clone()).0);
sender.subscribe()
}
fn update_status_watcher_for_thread(&mut self, thread_id: &str) {
let status = self.loaded_status_for_thread(thread_id);
self.update_status_watcher(thread_id, &status);
}
fn update_status_watcher(&mut self, thread_id: &str, status: &ThreadStatus) {
let remove_watcher = if let Some(sender) = self.status_watcher_by_thread_id.get(thread_id) {
let status = status.clone();
let _ = sender.send_if_modified(|current| {
if *current == status {
false
} else {
*current = status;
true
}
});
sender.receiver_count() == 0
} else {
false
};
if remove_watcher {
self.status_watcher_by_thread_id.remove(thread_id);
}
}
fn status_changed_notification(
&self,
thread_id: String,
@@ -752,6 +798,55 @@ mod tests {
);
}
#[tokio::test]
async fn status_watchers_receive_only_their_thread_updates() {
let manager = ThreadWatchManager::new();
manager
.upsert_thread(test_thread(
INTERACTIVE_THREAD_ID,
codex_app_server_protocol::SessionSource::Cli,
))
.await;
manager
.upsert_thread(test_thread(
NON_INTERACTIVE_THREAD_ID,
codex_app_server_protocol::SessionSource::AppServer,
))
.await;
let interactive_thread_id = ThreadId::from_string(INTERACTIVE_THREAD_ID)
.expect("interactive thread id should parse");
let non_interactive_thread_id = ThreadId::from_string(NON_INTERACTIVE_THREAD_ID)
.expect("non-interactive thread id should parse");
let mut interactive_rx = manager
.subscribe(interactive_thread_id)
.await
.expect("interactive status watcher should subscribe");
let mut non_interactive_rx = manager
.subscribe(non_interactive_thread_id)
.await
.expect("non-interactive status watcher should subscribe");
manager.note_turn_started(INTERACTIVE_THREAD_ID).await;
timeout(Duration::from_secs(1), interactive_rx.changed())
.await
.expect("timed out waiting for interactive status update")
.expect("interactive status watcher should remain open");
assert_eq!(
*interactive_rx.borrow(),
ThreadStatus::Active {
active_flags: vec![],
},
);
assert!(
timeout(Duration::from_millis(100), non_interactive_rx.changed())
.await
.is_err(),
"unrelated thread watcher should not receive an update"
);
assert_eq!(*non_interactive_rx.borrow(), ThreadStatus::Idle);
}
async fn wait_for_status(
manager: &ThreadWatchManager,
thread_id: &str,


@@ -78,3 +78,31 @@ model_provider = "{model_provider_id}"
),
)
}
pub fn write_mock_responses_config_toml_with_chatgpt_base_url(
codex_home: &Path,
server_uri: &str,
chatgpt_base_url: &str,
) -> std::io::Result<()> {
let config_toml = codex_home.join("config.toml");
std::fs::write(
config_toml,
format!(
r#"
model = "mock-model"
approval_policy = "never"
sandbox_mode = "read-only"
chatgpt_base_url = "{chatgpt_base_url}"
model_provider = "mock_provider"
[model_providers.mock_provider]
name = "Mock provider for test"
base_url = "{server_uri}/v1"
wire_api = "responses"
request_max_retries = 0
stream_max_retries = 0
"#
),
)
}
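With hypothetical placeholder values (`server_uri = "http://127.0.0.1:4100"`, `chatgpt_base_url = "http://127.0.0.1:4100/chatgpt"` — both made up here for illustration), the helper above renders a `config.toml` like:

```toml
model = "mock-model"
approval_policy = "never"
sandbox_mode = "read-only"
chatgpt_base_url = "http://127.0.0.1:4100/chatgpt"
model_provider = "mock_provider"

[model_providers.mock_provider]
name = "Mock provider for test"
base_url = "http://127.0.0.1:4100/v1"
wire_api = "responses"
request_max_retries = 0
stream_max_retries = 0
```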


@@ -14,6 +14,7 @@ pub use auth_fixtures::encode_id_token;
pub use auth_fixtures::write_chatgpt_auth;
use codex_app_server_protocol::JSONRPCResponse;
pub use config::write_mock_responses_config_toml;
pub use config::write_mock_responses_config_toml_with_chatgpt_base_url;
pub use core_test_support::format_with_current_shell;
pub use core_test_support::format_with_current_shell_display;
pub use core_test_support::format_with_current_shell_display_non_login;


@@ -47,6 +47,7 @@ use codex_app_server_protocol::JSONRPCRequest;
use codex_app_server_protocol::JSONRPCResponse;
use codex_app_server_protocol::ListMcpServerStatusParams;
use codex_app_server_protocol::LoginAccountParams;
use codex_app_server_protocol::MarketplaceAddParams;
use codex_app_server_protocol::McpResourceReadParams;
use codex_app_server_protocol::McpServerToolCallParams;
use codex_app_server_protocol::MockExperimentalMethodParams;
@@ -62,8 +63,10 @@ use codex_app_server_protocol::SkillsListParams;
use codex_app_server_protocol::ThreadArchiveParams;
use codex_app_server_protocol::ThreadCompactStartParams;
use codex_app_server_protocol::ThreadForkParams;
use codex_app_server_protocol::ThreadInjectItemsParams;
use codex_app_server_protocol::ThreadListParams;
use codex_app_server_protocol::ThreadLoadedListParams;
use codex_app_server_protocol::ThreadMemoryModeSetParams;
use codex_app_server_protocol::ThreadMetadataUpdateParams;
use codex_app_server_protocol::ThreadReadParams;
use codex_app_server_protocol::ThreadRealtimeAppendAudioParams;
@@ -99,12 +102,17 @@ pub struct McpProcess {
}
pub const DEFAULT_CLIENT_NAME: &str = "codex-app-server-tests";
const DISABLE_MANAGED_CONFIG_ENV_VAR: &str = "CODEX_APP_SERVER_DISABLE_MANAGED_CONFIG";
impl McpProcess {
pub async fn new(codex_home: &Path) -> anyhow::Result<Self> {
Self::new_with_env_and_args(codex_home, &[], &[]).await
}
pub async fn new_without_managed_config(codex_home: &Path) -> anyhow::Result<Self> {
Self::new_with_env(codex_home, &[(DISABLE_MANAGED_CONFIG_ENV_VAR, Some("1"))]).await
}
pub async fn new_with_args(codex_home: &Path, args: &[&str]) -> anyhow::Result<Self> {
Self::new_with_env_and_args(codex_home, &[], args).await
}
@@ -512,6 +520,15 @@ impl McpProcess {
self.send_request("skills/list", params).await
}
/// Send a `marketplace/add` JSON-RPC request.
pub async fn send_marketplace_add_request(
&mut self,
params: MarketplaceAddParams,
) -> anyhow::Result<i64> {
let params = Some(serde_json::to_value(params)?);
self.send_request("marketplace/add", params).await
}
/// Send a `plugin/install` JSON-RPC request.
pub async fn send_plugin_install_request(
&mut self,
@@ -583,6 +600,15 @@ impl McpProcess {
self.send_request("mock/experimentalMethod", params).await
}
/// Send a `thread/memoryMode/set` JSON-RPC request (v2, experimental).
pub async fn send_thread_memory_mode_set_request(
&mut self,
params: ThreadMemoryModeSetParams,
) -> anyhow::Result<i64> {
let params = Some(serde_json::to_value(params)?);
self.send_request("thread/memoryMode/set", params).await
}
/// Send a `turn/start` JSON-RPC request (v2).
pub async fn send_turn_start_request(
&mut self,
@@ -592,6 +618,15 @@ impl McpProcess {
self.send_request("turn/start", params).await
}
/// Send a `thread/inject_items` JSON-RPC request (v2).
pub async fn send_thread_inject_items_request(
&mut self,
params: ThreadInjectItemsParams,
) -> anyhow::Result<i64> {
let params = Some(serde_json::to_value(params)?);
self.send_request("thread/inject_items", params).await
}
/// Send a `command/exec` JSON-RPC request (v2).
pub async fn send_command_exec_request(
&mut self,


@@ -80,6 +80,24 @@ async fn app_server_default_analytics_enabled_with_flag() -> Result<()> {
}
pub(crate) async fn enable_analytics_capture(server: &MockServer, codex_home: &Path) -> Result<()> {
let config_path = codex_home.join("config.toml");
let config_toml = std::fs::read_to_string(&config_path)?;
if !config_toml.contains("[features]") {
std::fs::write(
&config_path,
format!("{config_toml}\n[features]\ngeneral_analytics = true\n"),
)?;
} else if !config_toml.contains("general_analytics") {
std::fs::write(
&config_path,
config_toml.replace("[features]\n", "[features]\ngeneral_analytics = true\n"),
)?;
}
mount_analytics_capture(server, codex_home).await
}
pub(crate) async fn mount_analytics_capture(server: &MockServer, codex_home: &Path) -> Result<()> {
Mock::given(method("POST"))
.and(path("/codex/analytics-events/events"))
.respond_with(ResponseTemplate::new(200))
@@ -120,6 +138,41 @@ pub(crate) async fn wait_for_analytics_payload(
serde_json::from_slice(&body).map_err(|err| anyhow::anyhow!("invalid analytics payload: {err}"))
}
pub(crate) async fn wait_for_analytics_event(
server: &MockServer,
read_timeout: Duration,
event_type: &str,
) -> Result<Value> {
timeout(read_timeout, async {
loop {
let Some(requests) = server.received_requests().await else {
tokio::time::sleep(Duration::from_millis(25)).await;
continue;
};
for request in &requests {
if request.method != "POST"
|| request.url.path() != "/codex/analytics-events/events"
{
continue;
}
let payload: Value = serde_json::from_slice(&request.body)
.map_err(|err| anyhow::anyhow!("invalid analytics payload: {err}"))?;
let Some(events) = payload["events"].as_array() else {
continue;
};
if let Some(event) = events
.iter()
.find(|event| event["event_type"] == event_type)
{
return Ok::<Value, anyhow::Error>(event.clone());
}
}
tokio::time::sleep(Duration::from_millis(25)).await;
}
})
.await?
}
pub(crate) fn thread_initialized_event(payload: &Value) -> Result<&Value> {
let events = payload["events"]
.as_array()


@@ -338,7 +338,8 @@ async fn websocket_transport_allows_unauthenticated_non_loopback_startup_by_defa
}
#[tokio::test]
async fn websocket_disconnect_unloads_last_subscribed_thread() -> Result<()> {
async fn websocket_disconnect_keeps_last_subscribed_thread_loaded_until_idle_timeout() -> Result<()>
{
let server = create_mock_responses_server_sequence_unchecked(Vec::new()).await;
let codex_home = TempDir::new()?;
create_config_toml(codex_home.path(), &server.uri(), "never")?;
@@ -359,7 +360,7 @@ async fn websocket_disconnect_unloads_last_subscribed_thread() -> Result<()> {
send_initialize_request(&mut ws2, /*id*/ 4, "ws_reconnect_client").await?;
read_response_for_id(&mut ws2, /*id*/ 4).await?;
wait_for_loaded_threads(&mut ws2, /*first_id*/ 5, &[]).await?;
wait_for_loaded_threads(&mut ws2, /*first_id*/ 5, &[thread_id.as_str()]).await?;
process
.kill()


@@ -11,10 +11,13 @@ use codex_app_server_protocol::JSONRPCMessage;
use codex_app_server_protocol::JSONRPCResponse;
use codex_app_server_protocol::MockExperimentalMethodParams;
use codex_app_server_protocol::RequestId;
use codex_app_server_protocol::ThreadMemoryMode;
use codex_app_server_protocol::ThreadMemoryModeSetParams;
use codex_app_server_protocol::ThreadRealtimeStartParams;
use codex_app_server_protocol::ThreadRealtimeStartTransport;
use codex_app_server_protocol::ThreadStartParams;
use codex_app_server_protocol::ThreadStartResponse;
use codex_protocol::protocol::RealtimeOutputModality;
use pretty_assertions::assert_eq;
use std::path::Path;
use std::time::Duration;
@@ -74,6 +77,7 @@ async fn realtime_conversation_start_requires_experimental_api_capability() -> R
let request_id = mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: "thr_123".to_string(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("hello".to_string())),
session_id: None,
transport: None,
@@ -89,6 +93,39 @@ async fn realtime_conversation_start_requires_experimental_api_capability() -> R
Ok(())
}
#[tokio::test]
async fn thread_memory_mode_set_requires_experimental_api_capability() -> Result<()> {
let codex_home = TempDir::new()?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
let init = mcp
.initialize_with_capabilities(
default_client_info(),
Some(InitializeCapabilities {
experimental_api: false,
opt_out_notification_methods: None,
}),
)
.await?;
let JSONRPCMessage::Response(_) = init else {
anyhow::bail!("expected initialize response, got {init:?}");
};
let request_id = mcp
.send_thread_memory_mode_set_request(ThreadMemoryModeSetParams {
thread_id: "thr_123".to_string(),
mode: ThreadMemoryMode::Disabled,
})
.await?;
let error = timeout(
DEFAULT_TIMEOUT,
mcp.read_stream_until_error_message(RequestId::Integer(request_id)),
)
.await??;
assert_experimental_capability_error(error, "thread/memoryMode/set");
Ok(())
}
#[tokio::test]
async fn realtime_webrtc_start_requires_experimental_api_capability() -> Result<()> {
let codex_home = TempDir::new()?;
@@ -110,6 +147,7 @@ async fn realtime_webrtc_start_requires_experimental_api_capability() -> Result<
let request_id = mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: "thr_123".to_string(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("hello".to_string())),
session_id: None,
transport: Some(ThreadRealtimeStartTransport::Webrtc {


@@ -89,6 +89,7 @@ async fn fs_get_metadata_returns_only_used_fields() -> Result<()> {
"createdAtMs".to_string(),
"isDirectory".to_string(),
"isFile".to_string(),
"isSymlink".to_string(),
"modifiedAtMs".to_string(),
]
);
@@ -99,6 +100,7 @@ async fn fs_get_metadata_returns_only_used_fields() -> Result<()> {
FsGetMetadataResponse {
is_directory: false,
is_file: true,
is_symlink: false,
created_at_ms: stat.created_at_ms,
modified_at_ms: stat.modified_at_ms,
}
@@ -111,6 +113,35 @@ async fn fs_get_metadata_returns_only_used_fields() -> Result<()> {
Ok(())
}
#[cfg(unix)]
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn fs_get_metadata_reports_symlink() -> Result<()> {
let codex_home = TempDir::new()?;
let file_path = codex_home.path().join("note.txt");
let symlink_path = codex_home.path().join("note-link.txt");
std::fs::write(&file_path, "hello")?;
symlink(&file_path, &symlink_path)?;
let mut mcp = initialized_mcp(&codex_home).await?;
let request_id = mcp
.send_fs_get_metadata_request(codex_app_server_protocol::FsGetMetadataParams {
path: absolute_path(symlink_path),
})
.await?;
let response = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(request_id)),
)
.await??;
let stat: FsGetMetadataResponse = to_response(response)?;
assert!(!stat.is_directory);
assert!(stat.is_file);
assert!(stat.is_symlink);
Ok(())
}
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn fs_methods_cover_current_fs_utils_surface() -> Result<()> {
let codex_home = TempDir::new()?;


@@ -0,0 +1,40 @@
use anyhow::Result;
use app_test_support::McpProcess;
use codex_app_server_protocol::MarketplaceAddParams;
use codex_app_server_protocol::RequestId;
use tempfile::TempDir;
use tokio::time::Duration;
use tokio::time::timeout;
const DEFAULT_TIMEOUT: Duration = Duration::from_secs(10);
#[tokio::test]
async fn marketplace_add_rejects_local_directory_source() -> Result<()> {
let codex_home = TempDir::new()?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
timeout(DEFAULT_TIMEOUT, mcp.initialize()).await??;
let request_id = mcp
.send_marketplace_add_request(MarketplaceAddParams {
source: "./marketplace".to_string(),
ref_name: None,
sparse_paths: None,
})
.await?;
let err = timeout(
DEFAULT_TIMEOUT,
mcp.read_stream_until_error_message(RequestId::Integer(request_id)),
)
.await??;
assert_eq!(err.error.code, -32600);
assert!(
err.error.message.contains(
"local marketplace sources are not supported yet; use an HTTP(S) Git URL, SSH Git URL, or GitHub owner/repo"
),
"unexpected error: {}",
err.error.message
);
Ok(())
}


@@ -15,6 +15,7 @@ mod experimental_api;
mod experimental_feature_list;
mod fs;
mod initialize;
mod marketplace_add;
mod mcp_resource;
mod mcp_server_elicitation;
mod mcp_server_status;
@@ -35,8 +36,10 @@ mod safety_check_downgrade;
mod skills_list;
mod thread_archive;
mod thread_fork;
mod thread_inject_items;
mod thread_list;
mod thread_loaded_list;
mod thread_memory_mode_set;
mod thread_metadata_update;
mod thread_name_websocket;
mod thread_read;


@@ -31,7 +31,8 @@ use codex_app_server_protocol::ThreadRealtimeStartTransport;
use codex_app_server_protocol::ThreadRealtimeStartedNotification;
use codex_app_server_protocol::ThreadRealtimeStopParams;
use codex_app_server_protocol::ThreadRealtimeStopResponse;
use codex_app_server_protocol::ThreadRealtimeTranscriptUpdatedNotification;
use codex_app_server_protocol::ThreadRealtimeTranscriptDeltaNotification;
use codex_app_server_protocol::ThreadRealtimeTranscriptDoneNotification;
use codex_app_server_protocol::ThreadStartParams;
use codex_app_server_protocol::ThreadStartResponse;
use codex_app_server_protocol::TurnCompletedNotification;
@@ -39,6 +40,7 @@ use codex_app_server_protocol::TurnStartedNotification;
use codex_features::FEATURES;
use codex_features::Feature;
use codex_protocol::protocol::RealtimeConversationVersion;
use codex_protocol::protocol::RealtimeOutputModality;
use codex_protocol::protocol::RealtimeVoice;
use codex_protocol::protocol::RealtimeVoicesList;
use core_test_support::responses;
@@ -301,6 +303,7 @@ impl RealtimeE2eHarness {
.mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: self.thread_id.clone(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("backend prompt".to_string())),
session_id: None,
transport: Some(ThreadRealtimeStartTransport::Webrtc {
@@ -478,6 +481,15 @@ async fn realtime_conversation_streams_v2_notifications() -> Result<()> {
"type": "response.output_text.delta",
"delta": "working"
}),
json!({
"type": "conversation.item.done",
"item": {
"id": "item_assistant_1",
"type": "message",
"role": "assistant",
"content": [{ "type": "output_text", "text": "working on it" }]
}
}),
json!({
"type": "conversation.item.done",
"item": {
@@ -523,6 +535,7 @@ async fn realtime_conversation_streams_v2_notifications() -> Result<()> {
let start_request_id = mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: thread_start.thread.id.clone(),
output_modality: RealtimeOutputModality::Audio,
prompt: None,
session_id: None,
transport: None,
@@ -554,6 +567,10 @@ async fn realtime_conversation_streams_v2_notifications() -> Result<()> {
startup_context_request.body_json()["session"]["audio"]["output"]["voice"],
"cedar"
);
assert_eq!(
startup_context_request.body_json()["session"]["output_modalities"],
json!(["audio"])
);
let startup_context_instructions =
startup_context_request.body_json()["session"]["instructions"]
.as_str()
@@ -612,24 +629,32 @@ async fn realtime_conversation_streams_v2_notifications() -> Result<()> {
assert_eq!(item_added.thread_id, output_audio.thread_id);
assert_eq!(item_added.item["type"], json!("message"));
let first_transcript_update = read_notification::<ThreadRealtimeTranscriptUpdatedNotification>(
let first_transcript_delta = read_notification::<ThreadRealtimeTranscriptDeltaNotification>(
&mut mcp,
"thread/realtime/transcriptUpdated",
"thread/realtime/transcript/delta",
)
.await?;
assert_eq!(first_transcript_update.thread_id, output_audio.thread_id);
assert_eq!(first_transcript_update.role, "user");
assert_eq!(first_transcript_update.text, "delegate now");
assert_eq!(first_transcript_delta.thread_id, output_audio.thread_id);
assert_eq!(first_transcript_delta.role, "user");
assert_eq!(first_transcript_delta.delta, "delegate now");
let second_transcript_update =
read_notification::<ThreadRealtimeTranscriptUpdatedNotification>(
&mut mcp,
"thread/realtime/transcriptUpdated",
)
.await?;
assert_eq!(second_transcript_update.thread_id, output_audio.thread_id);
assert_eq!(second_transcript_update.role, "assistant");
assert_eq!(second_transcript_update.text, "working");
let second_transcript_delta = read_notification::<ThreadRealtimeTranscriptDeltaNotification>(
&mut mcp,
"thread/realtime/transcript/delta",
)
.await?;
assert_eq!(second_transcript_delta.thread_id, output_audio.thread_id);
assert_eq!(second_transcript_delta.role, "assistant");
assert_eq!(second_transcript_delta.delta, "working");
let final_transcript_done = read_notification::<ThreadRealtimeTranscriptDoneNotification>(
&mut mcp,
"thread/realtime/transcript/done",
)
.await?;
assert_eq!(final_transcript_done.thread_id, output_audio.thread_id);
assert_eq!(final_transcript_done.role, "assistant");
assert_eq!(final_transcript_done.text, "working on it");
let handoff_item_added = read_notification::<ThreadRealtimeItemAddedNotification>(
&mut mcp,
@@ -693,6 +718,140 @@ async fn realtime_conversation_streams_v2_notifications() -> Result<()> {
Ok(())
}
#[tokio::test]
async fn realtime_text_output_modality_requests_text_output_and_final_transcript() -> Result<()> {
skip_if_no_network!(Ok(()));
let responses_server = create_mock_responses_server_sequence_unchecked(Vec::new()).await;
let realtime_server = start_websocket_server(vec![vec![vec![
json!({
"type": "session.updated",
"session": { "id": "sess_text", "instructions": "backend prompt" }
}),
json!({
"type": "response.output_text.delta",
"delta": "hello "
}),
json!({
"type": "response.output_text.delta",
"delta": "world"
}),
json!({
"type": "response.output_audio_transcript.done",
"transcript": "hello world"
}),
json!({
"type": "conversation.item.done",
"item": {
"id": "item_output_1",
"type": "message",
"role": "assistant",
"content": [{"type": "output_text", "text": "hello world"}]
}
}),
]]])
.await;
let codex_home = TempDir::new()?;
create_config_toml(
codex_home.path(),
&responses_server.uri(),
realtime_server.uri(),
/*realtime_enabled*/ true,
StartupContextConfig::Generated,
)?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
mcp.initialize().await?;
login_with_api_key(&mut mcp, "sk-test-key").await?;
let thread_start_request_id = mcp
.send_thread_start_request(ThreadStartParams::default())
.await?;
let thread_start_response: JSONRPCResponse = timeout(
DEFAULT_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(thread_start_request_id)),
)
.await??;
let thread_start: ThreadStartResponse = to_response(thread_start_response)?;
let start_request_id = mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: thread_start.thread.id.clone(),
output_modality: RealtimeOutputModality::Text,
prompt: None,
session_id: None,
transport: None,
voice: None,
})
.await?;
let start_response: JSONRPCResponse = timeout(
DEFAULT_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(start_request_id)),
)
.await??;
let _: ThreadRealtimeStartResponse = to_response(start_response)?;
let session_update = realtime_server
.wait_for_request(/*connection_index*/ 0, /*request_index*/ 0)
.await;
assert_eq!(
session_update.body_json()["session"]["output_modalities"],
json!(["text"])
);
let first_delta = read_notification::<ThreadRealtimeTranscriptDeltaNotification>(
&mut mcp,
"thread/realtime/transcript/delta",
)
.await?;
let second_delta = read_notification::<ThreadRealtimeTranscriptDeltaNotification>(
&mut mcp,
"thread/realtime/transcript/delta",
)
.await?;
let done = read_notification::<ThreadRealtimeTranscriptDoneNotification>(
&mut mcp,
"thread/realtime/transcript/done",
)
.await?;
assert_eq!(
vec![first_delta, second_delta],
vec![
ThreadRealtimeTranscriptDeltaNotification {
thread_id: thread_start.thread.id.clone(),
role: "assistant".to_string(),
delta: "hello ".to_string(),
},
ThreadRealtimeTranscriptDeltaNotification {
thread_id: thread_start.thread.id.clone(),
role: "assistant".to_string(),
delta: "world".to_string(),
},
]
);
assert_eq!(
done,
ThreadRealtimeTranscriptDoneNotification {
thread_id: thread_start.thread.id,
role: "assistant".to_string(),
text: "hello world".to_string(),
}
);
assert!(
timeout(
Duration::from_millis(200),
mcp.read_stream_until_notification_message("thread/realtime/transcript/done"),
)
.await
.is_err(),
"should not emit duplicate transcript done from audio transcript done"
);
realtime_server.shutdown().await;
Ok(())
}
#[tokio::test]
async fn realtime_list_voices_returns_supported_names() -> Result<()> {
let codex_home = TempDir::new()?;
@@ -793,6 +952,7 @@ async fn realtime_conversation_stop_emits_closed_notification() -> Result<()> {
let start_request_id = mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: thread_start.thread.id.clone(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("backend prompt".to_string())),
session_id: None,
transport: None,
@@ -889,6 +1049,7 @@ async fn realtime_webrtc_start_emits_sdp_notification() -> Result<()> {
let start_request_id = mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: thread_id.clone(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("backend prompt".to_string())),
session_id: None,
transport: Some(ThreadRealtimeStartTransport::Webrtc {
@@ -973,7 +1134,7 @@ async fn realtime_webrtc_start_emits_sdp_notification() -> Result<()> {
Some("multipart/form-data; boundary=codex-realtime-call-boundary")
);
let body = String::from_utf8(request.body).context("multipart body should be utf-8")?;
let session = r#"{"tool_choice":"auto","type":"realtime","model":"gpt-realtime-1.5","instructions":"backend prompt\n\nstartup context","output_modalities":["audio"],"audio":{"input":{"format":{"type":"audio/pcm","rate":24000},"noise_reduction":{"type":"near_field"},"turn_detection":{"type":"server_vad","interrupt_response":true,"create_response":true}},"output":{"format":{"type":"audio/pcm","rate":24000},"voice":"marin"}},"tools":[{"type":"function","name":"background_agent","description":"Send a user request to the background agent. Use this as the default action. If the background agent is idle, this starts a new task and returns the final result to the user. If the background agent is already working on a task, this sends the request as guidance to steer that previous task. If the user asks to do something next, later, after this, or once current work finishes, call this tool so the work is actually queued instead of merely promising to do it later.","parameters":{"type":"object","properties":{"prompt":{"type":"string","description":"The user request to delegate to the background agent."}},"required":["prompt"],"additionalProperties":false}}]}"#;
let session = r#"{"tool_choice":"auto","type":"realtime","model":"gpt-realtime-1.5","instructions":"backend prompt\n\nstartup context","output_modalities":["audio"],"audio":{"input":{"format":{"type":"audio/pcm","rate":24000},"noise_reduction":{"type":"near_field"},"turn_detection":{"type":"server_vad","interrupt_response":true,"create_response":true}},"output":{"format":{"type":"audio/pcm","rate":24000},"voice":"marin"}},"tools":[{"type":"function","name":"background_agent","description":"Send a user request to the background agent. Use this as the default action. Do not rephrase the user's ask or rewrite it in your own words; pass along the user's own words. If the background agent is idle, this starts a new task and returns the final result to the user. If the background agent is already working on a task, this sends the request as guidance to steer that previous task. If the user asks to do something next, later, after this, or once current work finishes, call this tool so the work is actually queued instead of merely promising to do it later.","parameters":{"type":"object","properties":{"prompt":{"type":"string","description":"The user request to delegate to the background agent."}},"required":["prompt"],"additionalProperties":false}}]}"#;
assert_eq!(
body,
format!(
@@ -1163,11 +1324,11 @@ async fn webrtc_v2_forwards_audio_and_text_between_client_and_sideband() -> Resu
harness.append_text(thread_id, "hello").await?;
let transcript = harness
.read_notification::<ThreadRealtimeTranscriptUpdatedNotification>(
"thread/realtime/transcriptUpdated",
.read_notification::<ThreadRealtimeTranscriptDeltaNotification>(
"thread/realtime/transcript/delta",
)
.await?;
assert_eq!(transcript.text, "transcribed audio");
assert_eq!(transcript.delta, "transcribed audio");
let output_audio = harness
.read_notification::<ThreadRealtimeOutputAudioDeltaNotification>(
"thread/realtime/outputAudio/delta",
@@ -1252,11 +1413,11 @@ async fn webrtc_v2_text_input_is_append_only_while_response_is_active() -> Resul
"first",
);
let transcript = harness
.read_notification::<ThreadRealtimeTranscriptUpdatedNotification>(
"thread/realtime/transcriptUpdated",
.read_notification::<ThreadRealtimeTranscriptDeltaNotification>(
"thread/realtime/transcript/delta",
)
.await?;
assert_eq!(transcript.text, "active response started");
assert_eq!(transcript.delta, "active response started");
// Phase 3: send a second text turn while `resp_active` is still open. The
// user message must reach realtime without requesting another response.
@@ -1736,6 +1897,7 @@ async fn realtime_webrtc_start_surfaces_backend_error() -> Result<()> {
let start_request_id = mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: thread_start.thread.id,
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("backend prompt".to_string())),
session_id: None,
transport: Some(ThreadRealtimeStartTransport::Webrtc {
@@ -1794,6 +1956,7 @@ async fn realtime_conversation_requires_feature_flag() -> Result<()> {
let start_request_id = mcp
.send_thread_realtime_start_request(ThreadRealtimeStartParams {
thread_id: thread_start.thread.id.clone(),
output_modality: RealtimeOutputModality::Audio,
prompt: Some(Some("backend prompt".to_string())),
session_id: None,
transport: None,


@@ -98,6 +98,35 @@ async fn skills_list_rejects_relative_extra_user_roots() -> Result<()> {
Ok(())
}
#[tokio::test]
async fn skills_list_accepts_relative_cwds() -> Result<()> {
let codex_home = TempDir::new()?;
let relative_cwd = std::path::PathBuf::from("relative-cwd");
std::fs::create_dir_all(codex_home.path().join(&relative_cwd))?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
timeout(DEFAULT_TIMEOUT, mcp.initialize()).await??;
let request_id = mcp
.send_skills_list_request(SkillsListParams {
cwds: vec![relative_cwd.clone()],
force_reload: true,
per_cwd_extra_user_roots: None,
})
.await?;
let response: JSONRPCResponse = timeout(
DEFAULT_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(request_id)),
)
.await??;
let SkillsListResponse { data } = to_response(response)?;
assert_eq!(data.len(), 1);
assert_eq!(data[0].cwd, relative_cwd);
assert_eq!(data[0].errors, Vec::new());
Ok(())
}
#[tokio::test]
async fn skills_list_ignores_per_cwd_extra_roots_for_unknown_cwd() -> Result<()> {
let codex_home = TempDir::new()?;


@@ -205,7 +205,7 @@ async fn thread_fork_tracks_thread_initialized_analytics() -> Result<()> {
/*git_info*/ None,
)?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
let mut mcp = McpProcess::new_without_managed_config(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let fork_id = mcp
@@ -565,7 +565,7 @@ fn create_config_toml_with_chatgpt_base_url(
let general_analytics_toml = if general_analytics_enabled {
"\ngeneral_analytics = true".to_string()
} else {
String::new()
"\ngeneral_analytics = false".to_string()
};
let config_toml = codex_home.join("config.toml");
std::fs::write(


@@ -0,0 +1,288 @@
use anyhow::Context;
use anyhow::Result;
use app_test_support::McpProcess;
use app_test_support::to_response;
use codex_app_server_protocol::JSONRPCResponse;
use codex_app_server_protocol::RequestId;
use codex_app_server_protocol::ThreadInjectItemsParams;
use codex_app_server_protocol::ThreadInjectItemsResponse;
use codex_app_server_protocol::ThreadStartParams;
use codex_app_server_protocol::ThreadStartResponse;
use codex_app_server_protocol::TurnStartParams;
use codex_app_server_protocol::UserInput as V2UserInput;
use codex_core::RolloutRecorder;
use codex_protocol::models::ContentItem;
use codex_protocol::models::ResponseItem;
use codex_protocol::protocol::InitialHistory;
use codex_protocol::protocol::RolloutItem;
use core_test_support::responses;
use serde_json::Value;
use std::path::Path;
use tempfile::TempDir;
use tokio::time::timeout;
const DEFAULT_READ_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(10);
#[tokio::test]
async fn thread_inject_items_adds_raw_response_items_to_thread_history() -> Result<()> {
let server = responses::start_mock_server().await;
let body = responses::sse(vec![
responses::ev_response_created("resp-1"),
responses::ev_assistant_message("msg-1", "Done"),
responses::ev_completed("resp-1"),
]);
let response_mock = responses::mount_sse_once(&server, body).await;
let codex_home = TempDir::new()?;
create_config_toml(codex_home.path(), &server.uri())?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let thread_req = mcp
.send_thread_start_request(ThreadStartParams {
model: Some("mock-model".to_string()),
..Default::default()
})
.await?;
let thread_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(thread_req)),
)
.await??;
let ThreadStartResponse { thread, .. } = to_response::<ThreadStartResponse>(thread_resp)?;
let injected_text = "Injected assistant context";
let injected_item = ResponseItem::Message {
id: None,
role: "assistant".to_string(),
content: vec![ContentItem::OutputText {
text: injected_text.to_string(),
}],
end_turn: None,
phase: None,
};
let inject_req = mcp
.send_thread_inject_items_request(ThreadInjectItemsParams {
thread_id: thread.id.clone(),
items: vec![serde_json::to_value(&injected_item)?],
})
.await?;
let inject_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(inject_req)),
)
.await??;
let _response: ThreadInjectItemsResponse =
to_response::<ThreadInjectItemsResponse>(inject_resp)?;
let rollout_path = thread.path.as_ref().context("thread path missing")?;
let history = RolloutRecorder::get_rollout_history(rollout_path).await?;
let InitialHistory::Resumed(resumed_history) = history else {
panic!("expected resumed rollout history");
};
assert!(
resumed_history
.history
.iter()
.any(|item| matches!(item, RolloutItem::ResponseItem(response_item) if response_item == &injected_item)),
"injected item should be persisted in rollout history"
);
let turn_req = mcp
.send_turn_start_request(TurnStartParams {
thread_id: thread.id.clone(),
input: vec![V2UserInput::Text {
text: "Hello".to_string(),
text_elements: Vec::new(),
}],
..Default::default()
})
.await?;
timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(turn_req)),
)
.await??;
timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_notification_message("turn/completed"),
)
.await??;
let injected_value = serde_json::to_value(&injected_item)?;
let model_input = response_mock.single_request().input();
let environment_context_index =
response_item_text_position(&model_input, "<environment_context>")
.expect("environment context should be injected before the first user turn");
let injected_index = model_input
.iter()
.position(|item| item == &injected_value)
.expect("injected item should be sent in the next model request");
let user_prompt_index = response_item_text_position(&model_input, "Hello")
.expect("user prompt should be sent in the next model request");
assert!(
environment_context_index < injected_index,
"standard initial context should be sent before injected items"
);
assert!(
injected_index < user_prompt_index,
"injected items should be sent before the user prompt"
);
Ok(())
}
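The ordering assertions above work by finding each item's index in the outgoing request and comparing the indices. A minimal std-only sketch of that pattern over plain strings (`position_of` is an illustrative stand-in for the JSON-based lookups in the test):

```rust
// Illustrative helper: find the first element containing `needle`,
// the same shape as `Iterator::position` used in the test above.
fn position_of(items: &[&str], needle: &str) -> Option<usize> {
    items.iter().position(|item| item.contains(needle))
}
```

Comparing the returned indices then asserts relative order without caring about absolute positions in the request.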
#[tokio::test]
async fn thread_inject_items_adds_raw_response_items_after_a_turn() -> Result<()> {
let server = responses::start_mock_server().await;
let first_body = responses::sse(vec![
responses::ev_response_created("resp-1"),
responses::ev_assistant_message("msg-1", "First done"),
responses::ev_completed("resp-1"),
]);
let second_body = responses::sse(vec![
responses::ev_response_created("resp-2"),
responses::ev_assistant_message("msg-2", "Second done"),
responses::ev_completed("resp-2"),
]);
let response_mock = responses::mount_sse_sequence(&server, vec![first_body, second_body]).await;
let codex_home = TempDir::new()?;
create_config_toml(codex_home.path(), &server.uri())?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let thread_req = mcp
.send_thread_start_request(ThreadStartParams {
model: Some("mock-model".to_string()),
..Default::default()
})
.await?;
let thread_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(thread_req)),
)
.await??;
let ThreadStartResponse { thread, .. } = to_response::<ThreadStartResponse>(thread_resp)?;
let first_turn_req = mcp
.send_turn_start_request(TurnStartParams {
thread_id: thread.id.clone(),
input: vec![V2UserInput::Text {
text: "First turn".to_string(),
text_elements: Vec::new(),
}],
..Default::default()
})
.await?;
timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(first_turn_req)),
)
.await??;
timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_notification_message("turn/completed"),
)
.await??;
let injected_item = ResponseItem::Message {
id: None,
role: "assistant".to_string(),
content: vec![ContentItem::OutputText {
text: "Injected after first turn".to_string(),
}],
end_turn: None,
phase: None,
};
let injected_value = serde_json::to_value(&injected_item)?;
let inject_req = mcp
.send_thread_inject_items_request(ThreadInjectItemsParams {
thread_id: thread.id.clone(),
items: vec![injected_value.clone()],
})
.await?;
let inject_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(inject_req)),
)
.await??;
let _response: ThreadInjectItemsResponse =
to_response::<ThreadInjectItemsResponse>(inject_resp)?;
let second_turn_req = mcp
.send_turn_start_request(TurnStartParams {
thread_id: thread.id.clone(),
input: vec![V2UserInput::Text {
text: "Second turn".to_string(),
text_elements: Vec::new(),
}],
..Default::default()
})
.await?;
timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(second_turn_req)),
)
.await??;
timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_notification_message("turn/completed"),
)
.await??;
let requests = response_mock.requests();
assert_eq!(requests.len(), 2);
assert!(
!requests[0].input().contains(&injected_value),
"injected item should not be sent before it is injected"
);
assert!(
requests[1].input().contains(&injected_value),
"injected item should be sent after being injected into existing history"
);
Ok(())
}
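The final assertions above check an "absent before, present after" property across the two recorded requests. A std-only sketch of that check over plain strings (the helper name and types are illustrative, not from the codebase):

```rust
// Illustrative helper: true only when `value` is missing from the first
// request and present in the second, mirroring the two asserts above.
fn injected_only_in_later_request(requests: &[Vec<String>], value: &str) -> bool {
    match requests {
        [first, second] => {
            !first.iter().any(|v| v.as_str() == value)
                && second.iter().any(|v| v.as_str() == value)
        }
        _ => false,
    }
}
```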
fn create_config_toml(codex_home: &Path, server_uri: &str) -> std::io::Result<()> {
let config_toml = codex_home.join("config.toml");
std::fs::write(
config_toml,
format!(
r#"
model = "mock-model"
approval_policy = "never"
sandbox_mode = "read-only"
model_provider = "mock_provider"
[model_providers.mock_provider]
name = "Mock provider for test"
base_url = "{server_uri}/v1"
wire_api = "responses"
request_max_retries = 0
stream_max_retries = 0
"#
),
)
}
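The helper above templates `config.toml` with `format!` over a raw string, pointing the mock provider at the wiremock server. A self-contained sketch of the same pattern (the keys mirror the test's config; treat them as illustrative outside this file):

```rust
// Illustrative sketch of the config-templating pattern used by
// `create_config_toml`: inline format args inside a raw string literal.
fn build_config(server_uri: &str) -> String {
    format!(
        r#"
model = "mock-model"
approval_policy = "never"
sandbox_mode = "read-only"
model_provider = "mock_provider"

[model_providers.mock_provider]
name = "Mock provider for test"
base_url = "{server_uri}/v1"
wire_api = "responses"
"#
    )
}
```

Raw strings keep the TOML quoting readable, and `format!`'s inline identifier capture (`{server_uri}`) avoids positional arguments.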
fn response_item_text_position(items: &[Value], needle: &str) -> Option<usize> {
items.iter().position(|item| {
item.get("content")
.and_then(Value::as_array)
.into_iter()
.flatten()
.any(|content| {
content
.get("text")
.and_then(Value::as_str)
.is_some_and(|text| text.contains(needle))
})
})
}
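`response_item_text_position` walks each item's `content` array looking for a text fragment. The same shape can be sketched over a simplified item type (a std-only stand-in for `serde_json::Value`, for illustration):

```rust
// Simplified stand-in for a response item: just its text contents.
struct Item<'a> {
    content: Vec<&'a str>,
}

// Same search shape as `response_item_text_position`: the index of the
// first item whose content contains `needle`.
fn text_position(items: &[Item<'_>], needle: &str) -> Option<usize> {
    items
        .iter()
        .position(|item| item.content.iter().any(|text| text.contains(needle)))
}
```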

@@ -0,0 +1,138 @@
use anyhow::Result;
use app_test_support::McpProcess;
use app_test_support::create_fake_rollout;
use app_test_support::create_mock_responses_server_repeating_assistant;
use app_test_support::to_response;
use codex_app_server_protocol::JSONRPCResponse;
use codex_app_server_protocol::RequestId;
use codex_app_server_protocol::ThreadMemoryMode;
use codex_app_server_protocol::ThreadMemoryModeSetParams;
use codex_app_server_protocol::ThreadMemoryModeSetResponse;
use codex_app_server_protocol::ThreadStartParams;
use codex_app_server_protocol::ThreadStartResponse;
use codex_protocol::ThreadId;
use codex_state::StateRuntime;
use pretty_assertions::assert_eq;
use std::path::Path;
use std::sync::Arc;
use tempfile::TempDir;
use tokio::time::timeout;
const DEFAULT_READ_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(10);
#[tokio::test]
async fn thread_memory_mode_set_updates_loaded_thread_state() -> Result<()> {
let server = create_mock_responses_server_repeating_assistant("Done").await;
let codex_home = TempDir::new()?;
create_config_toml(codex_home.path(), &server.uri())?;
let state_db = init_state_db(codex_home.path()).await?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let start_id = mcp
.send_thread_start_request(ThreadStartParams {
model: Some("mock-model".to_string()),
..Default::default()
})
.await?;
let start_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(start_id)),
)
.await??;
let ThreadStartResponse { thread, .. } = to_response::<ThreadStartResponse>(start_resp)?;
let thread_uuid = ThreadId::from_string(&thread.id)?;
let set_id = mcp
.send_thread_memory_mode_set_request(ThreadMemoryModeSetParams {
thread_id: thread.id,
mode: ThreadMemoryMode::Disabled,
})
.await?;
let set_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(set_id)),
)
.await??;
let _: ThreadMemoryModeSetResponse = to_response::<ThreadMemoryModeSetResponse>(set_resp)?;
let memory_mode = state_db.get_thread_memory_mode(thread_uuid).await?;
assert_eq!(memory_mode.as_deref(), Some("disabled"));
Ok(())
}
#[tokio::test]
async fn thread_memory_mode_set_updates_stored_thread_state() -> Result<()> {
let server = create_mock_responses_server_repeating_assistant("Done").await;
let codex_home = TempDir::new()?;
create_config_toml(codex_home.path(), &server.uri())?;
let state_db = init_state_db(codex_home.path()).await?;
let thread_id = create_fake_rollout(
codex_home.path(),
"2025-01-06T08-30-00",
"2025-01-06T08:30:00Z",
"Stored thread preview",
Some("mock_provider"),
/*git_info*/ None,
)?;
let thread_uuid = ThreadId::from_string(&thread_id)?;
let mut mcp = McpProcess::new(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
for mode in [ThreadMemoryMode::Disabled, ThreadMemoryMode::Enabled] {
let set_id = mcp
.send_thread_memory_mode_set_request(ThreadMemoryModeSetParams {
thread_id: thread_id.clone(),
mode,
})
.await?;
let set_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(set_id)),
)
.await??;
let _: ThreadMemoryModeSetResponse = to_response::<ThreadMemoryModeSetResponse>(set_resp)?;
}
let memory_mode = state_db.get_thread_memory_mode(thread_uuid).await?;
assert_eq!(memory_mode.as_deref(), Some("enabled"));
Ok(())
}
async fn init_state_db(codex_home: &Path) -> Result<Arc<StateRuntime>> {
let state_db = StateRuntime::init(codex_home.to_path_buf(), "mock_provider".into()).await?;
state_db
.mark_backfill_complete(/*last_watermark*/ None)
.await?;
Ok(state_db)
}
fn create_config_toml(codex_home: &Path, server_uri: &str) -> std::io::Result<()> {
let config_toml = codex_home.join("config.toml");
std::fs::write(
config_toml,
format!(
r#"
model = "mock-model"
approval_policy = "never"
sandbox_mode = "read-only"
model_provider = "mock_provider"
suppress_unstable_features_warning = true
[features]
sqlite = true
[model_providers.mock_provider]
name = "Mock provider for test"
base_url = "{server_uri}/v1"
wire_api = "responses"
request_max_retries = 0
stream_max_retries = 0
"#
),
)
}

@@ -178,7 +178,7 @@ async fn thread_resume_tracks_thread_initialized_analytics() -> Result<()> {
/*git_info*/ None,
)?;
-let mut mcp = McpProcess::new(codex_home.path()).await?;
+let mut mcp = McpProcess::new_without_managed_config(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let resume_id = mcp
@@ -1901,7 +1901,7 @@ fn create_config_toml_with_chatgpt_base_url(
let general_analytics_toml = if general_analytics_enabled {
"\ngeneral_analytics = true".to_string()
} else {
-String::new()
+"\ngeneral_analytics = false".to_string()
};
let config_toml = codex_home.join("config.toml");
std::fs::write(

@@ -40,8 +40,9 @@ use wiremock::matchers::method;
use wiremock::matchers::path;
use super::analytics::assert_basic_thread_initialized_event;
-use super::analytics::enable_analytics_capture;
+use super::analytics::mount_analytics_capture;
use super::analytics::thread_initialized_event;
+use super::analytics::wait_for_analytics_event;
use super::analytics::wait_for_analytics_payload;
const DEFAULT_READ_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(10);
@@ -232,9 +233,9 @@ async fn thread_start_tracks_thread_initialized_analytics() -> Result<()> {
&server.uri(),
/*general_analytics_enabled*/ true,
)?;
-enable_analytics_capture(&server, codex_home.path()).await?;
+mount_analytics_capture(&server, codex_home.path()).await?;
-let mut mcp = McpProcess::new(codex_home.path()).await?;
+let mut mcp = McpProcess::new_without_managed_config(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let req_id = mcp
@@ -265,9 +266,9 @@ async fn thread_start_does_not_track_thread_initialized_analytics_without_featur
&server.uri(),
/*general_analytics_enabled*/ false,
)?;
-enable_analytics_capture(&server, codex_home.path()).await?;
+mount_analytics_capture(&server, codex_home.path()).await?;
-let mut mcp = McpProcess::new(codex_home.path()).await?;
+let mut mcp = McpProcess::new_without_managed_config(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let req_id = mcp
@@ -280,7 +281,12 @@ async fn thread_start_does_not_track_thread_initialized_analytics_without_featur
.await??;
let _ = to_response::<ThreadStartResponse>(resp)?;
-let payload = wait_for_analytics_payload(&server, Duration::from_millis(250)).await;
+let payload = wait_for_analytics_event(
+&server,
+Duration::from_millis(250),
+"codex_thread_initialized",
+)
+.await;
assert!(
payload.is_err(),
"thread analytics should be gated off when general_analytics is disabled"
@@ -888,7 +894,7 @@ fn create_config_toml_with_chatgpt_base_url(
let general_analytics_toml = if general_analytics_enabled {
"\ngeneral_analytics = true".to_string()
} else {
-String::new()
+"\ngeneral_analytics = false".to_string()
};
let config_toml = codex_home.join("config.toml");
std::fs::write(

@@ -7,10 +7,8 @@ use app_test_support::create_mock_responses_server_sequence_unchecked;
use app_test_support::create_shell_command_sse_response;
use app_test_support::to_response;
use codex_app_server_protocol::ItemStartedNotification;
-use codex_app_server_protocol::JSONRPCNotification;
use codex_app_server_protocol::JSONRPCResponse;
use codex_app_server_protocol::RequestId;
-use codex_app_server_protocol::ServerNotification;
use codex_app_server_protocol::ThreadItem;
use codex_app_server_protocol::ThreadLoadedListParams;
use codex_app_server_protocol::ThreadLoadedListResponse;
@@ -21,7 +19,6 @@ use codex_app_server_protocol::ThreadResumeResponse;
use codex_app_server_protocol::ThreadStartParams;
use codex_app_server_protocol::ThreadStartResponse;
use codex_app_server_protocol::ThreadStatus;
-use codex_app_server_protocol::ThreadStatusChangedNotification;
use codex_app_server_protocol::ThreadUnsubscribeParams;
use codex_app_server_protocol::ThreadUnsubscribeResponse;
use codex_app_server_protocol::ThreadUnsubscribeStatus;
@@ -81,7 +78,7 @@ async fn wait_for_responses_request_count_to_stabilize(
}
#[tokio::test]
-async fn thread_unsubscribe_unloads_thread_and_emits_thread_closed_notification() -> Result<()> {
+async fn thread_unsubscribe_keeps_thread_loaded_until_idle_timeout() -> Result<()> {
let server = create_mock_responses_server_repeating_assistant("Done").await;
let codex_home = TempDir::new()?;
create_config_toml(codex_home.path(), &server.uri())?;
@@ -104,20 +101,14 @@ async fn thread_unsubscribe_unloads_thread_and_emits_thread_closed_notification(
let unsubscribe = to_response::<ThreadUnsubscribeResponse>(unsubscribe_resp)?;
assert_eq!(unsubscribe.status, ThreadUnsubscribeStatus::Unsubscribed);
-let closed_notif: JSONRPCNotification = timeout(
-DEFAULT_READ_TIMEOUT,
-mcp.read_stream_until_notification_message("thread/closed"),
-)
-.await??;
-let parsed: ServerNotification = closed_notif.try_into()?;
-let ServerNotification::ThreadClosed(payload) = parsed else {
-anyhow::bail!("expected thread/closed notification");
-};
-assert_eq!(payload.thread_id, thread_id);
-let status_changed = wait_for_thread_status_not_loaded(&mut mcp, &payload.thread_id).await?;
-assert_eq!(status_changed.thread_id, payload.thread_id);
-assert_eq!(status_changed.status, ThreadStatus::NotLoaded);
+assert!(
+timeout(
+std::time::Duration::from_millis(250),
+mcp.read_stream_until_notification_message("thread/closed"),
+)
+.await
+.is_err()
+);
let list_id = mcp
.send_thread_loaded_list_request(ThreadLoadedListParams::default())
@@ -129,22 +120,22 @@ async fn thread_unsubscribe_unloads_thread_and_emits_thread_closed_notification(
.await??;
let ThreadLoadedListResponse { data, next_cursor } =
to_response::<ThreadLoadedListResponse>(list_resp)?;
-assert_eq!(data, Vec::<String>::new());
+assert_eq!(data, vec![thread_id]);
assert_eq!(next_cursor, None);
Ok(())
}
#[tokio::test]
-async fn thread_unsubscribe_during_turn_interrupts_turn_and_emits_thread_closed() -> Result<()> {
+async fn thread_unsubscribe_during_turn_keeps_turn_running() -> Result<()> {
#[cfg(target_os = "windows")]
let shell_command = vec![
"powershell".to_string(),
"-Command".to_string(),
-"Start-Sleep -Seconds 10".to_string(),
+"Start-Sleep -Seconds 1".to_string(),
];
#[cfg(not(target_os = "windows"))]
-let shell_command = vec!["sleep".to_string(), "10".to_string()];
+let shell_command = vec!["sleep".to_string(), "1".to_string()];
let tmp = TempDir::new()?;
let codex_home = tmp.path().join("codex_home");
@@ -206,20 +197,18 @@ async fn thread_unsubscribe_during_turn_interrupts_turn_and_emits_thread_closed(
let unsubscribe = to_response::<ThreadUnsubscribeResponse>(unsubscribe_resp)?;
assert_eq!(unsubscribe.status, ThreadUnsubscribeStatus::Unsubscribed);
-let closed_notif: JSONRPCNotification = timeout(
-DEFAULT_READ_TIMEOUT,
-mcp.read_stream_until_notification_message("thread/closed"),
-)
-.await??;
-let parsed: ServerNotification = closed_notif.try_into()?;
-let ServerNotification::ThreadClosed(payload) = parsed else {
-anyhow::bail!("expected thread/closed notification");
-};
-assert_eq!(payload.thread_id, thread_id);
+assert!(
+timeout(
+std::time::Duration::from_millis(250),
+mcp.read_stream_until_notification_message("thread/closed"),
+)
+.await
+.is_err()
+);
wait_for_responses_request_count_to_stabilize(
&server,
-/*expected_count*/ 1,
+/*expected_count*/ 2,
std::time::Duration::from_millis(200),
)
.await?;
@@ -228,7 +217,7 @@ async fn thread_unsubscribe_during_turn_interrupts_turn_and_emits_thread_closed(
}
#[tokio::test]
-async fn thread_unsubscribe_clears_cached_status_before_resume() -> Result<()> {
+async fn thread_unsubscribe_preserves_cached_status_before_idle_unload() -> Result<()> {
let server = responses::start_mock_server().await;
let _response_mock = responses::mount_sse_once(
&server,
@@ -291,11 +280,14 @@ async fn thread_unsubscribe_clears_cached_status_before_resume() -> Result<()> {
.await??;
let unsubscribe = to_response::<ThreadUnsubscribeResponse>(unsubscribe_resp)?;
assert_eq!(unsubscribe.status, ThreadUnsubscribeStatus::Unsubscribed);
-timeout(
-DEFAULT_READ_TIMEOUT,
-mcp.read_stream_until_notification_message("thread/closed"),
-)
-.await??;
+assert!(
+timeout(
+std::time::Duration::from_millis(250),
+mcp.read_stream_until_notification_message("thread/closed"),
+)
+.await
+.is_err()
+);
let resume_id = mcp
.send_thread_resume_request(ThreadResumeParams {
@@ -309,13 +301,13 @@ async fn thread_unsubscribe_clears_cached_status_before_resume() -> Result<()> {
)
.await??;
let resume: ThreadResumeResponse = to_response::<ThreadResumeResponse>(resume_resp)?;
-assert_eq!(resume.thread.status, ThreadStatus::Idle);
+assert_eq!(resume.thread.status, ThreadStatus::SystemError);
Ok(())
}
#[tokio::test]
-async fn thread_unsubscribe_reports_not_loaded_after_thread_is_unloaded() -> Result<()> {
+async fn thread_unsubscribe_reports_not_subscribed_before_idle_unload() -> Result<()> {
let server = create_mock_responses_server_repeating_assistant("Done").await;
let codex_home = TempDir::new()?;
create_config_toml(codex_home.path(), &server.uri())?;
@@ -341,12 +333,6 @@ async fn thread_unsubscribe_reports_not_loaded_after_thread_is_unloaded() -> Res
ThreadUnsubscribeStatus::Unsubscribed
);
-timeout(
-DEFAULT_READ_TIMEOUT,
-mcp.read_stream_until_notification_message("thread/closed"),
-)
-.await??;
let second_unsubscribe_id = mcp
.send_thread_unsubscribe_request(ThreadUnsubscribeParams { thread_id })
.await?;
@@ -358,7 +344,7 @@ async fn thread_unsubscribe_reports_not_loaded_after_thread_is_unloaded() -> Res
let second_unsubscribe = to_response::<ThreadUnsubscribeResponse>(second_unsubscribe_resp)?;
assert_eq!(
second_unsubscribe.status,
-ThreadUnsubscribeStatus::NotLoaded
+ThreadUnsubscribeStatus::NotSubscribed
);
Ok(())
@@ -377,28 +363,6 @@ async fn wait_for_command_execution_item_started(mcp: &mut McpProcess) -> Result
}
}
-async fn wait_for_thread_status_not_loaded(
-mcp: &mut McpProcess,
-thread_id: &str,
-) -> Result<ThreadStatusChangedNotification> {
-loop {
-let status_changed_notif: JSONRPCNotification = timeout(
-DEFAULT_READ_TIMEOUT,
-mcp.read_stream_until_notification_message("thread/status/changed"),
-)
-.await??;
-let status_changed_params = status_changed_notif
-.params
-.context("thread/status/changed params must be present")?;
-let status_changed: ThreadStatusChangedNotification =
-serde_json::from_value(status_changed_params)?;
-if status_changed.thread_id == thread_id && status_changed.status == ThreadStatus::NotLoaded
-{
-return Ok(status_changed);
-}
-}
-}
fn create_config_toml(codex_home: &std::path::Path, server_uri: &str) -> std::io::Result<()> {
let config_toml = codex_home.join("config.toml");
std::fs::write(

View File

@@ -3,6 +3,7 @@
use anyhow::Result;
use app_test_support::McpProcess;
use app_test_support::create_mock_responses_server_sequence;
+use app_test_support::create_mock_responses_server_sequence_unchecked;
use app_test_support::create_shell_command_sse_response;
use app_test_support::to_response;
use codex_app_server_protocol::JSONRPCNotification;
@@ -43,14 +44,15 @@ async fn turn_interrupt_aborts_running_turn() -> Result<()> {
std::fs::create_dir(&working_directory)?;
// Mock server: long-running shell command then (after abort) nothing else needed.
-let server = create_mock_responses_server_sequence(vec![create_shell_command_sse_response(
-shell_command.clone(),
-Some(&working_directory),
-Some(10_000),
-"call_sleep",
-)?])
-.await;
-create_config_toml(&codex_home, &server.uri(), "never", "danger-full-access")?;
+let server =
+create_mock_responses_server_sequence_unchecked(vec![create_shell_command_sse_response(
+shell_command.clone(),
+Some(&working_directory),
+Some(10_000),
+"call_sleep",
+)?])
+.await;
+create_config_toml(&codex_home, &server.uri(), "never", "workspace-write")?;
let mut mcp = McpProcess::new(&codex_home).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
@@ -87,6 +89,7 @@ async fn turn_interrupt_aborts_running_turn() -> Result<()> {
)
.await??;
let TurnStartResponse { turn } = to_response::<TurnStartResponse>(turn_resp)?;
+let turn_id = turn.id.clone();
// Give the command a brief moment to start.
tokio::time::sleep(std::time::Duration::from_secs(1)).await;
@@ -96,7 +99,7 @@ async fn turn_interrupt_aborts_running_turn() -> Result<()> {
let interrupt_id = mcp
.send_turn_interrupt_request(TurnInterruptParams {
thread_id: thread_id.clone(),
-turn_id: turn.id,
+turn_id: turn_id.clone(),
})
.await?;
let interrupt_resp: JSONRPCResponse = timeout(

@@ -1,4 +1,5 @@
use anyhow::Result;
+use app_test_support::DEFAULT_CLIENT_NAME;
use app_test_support::McpProcess;
use app_test_support::create_apply_patch_sse_response;
use app_test_support::create_exec_command_sse_response;
@@ -9,6 +10,7 @@ use app_test_support::create_mock_responses_server_sequence_unchecked;
use app_test_support::create_shell_command_sse_response;
use app_test_support::format_with_current_shell_display;
use app_test_support::to_response;
+use app_test_support::write_mock_responses_config_toml_with_chatgpt_base_url;
use codex_app_server::INPUT_TOO_LARGE_ERROR_CODE;
use codex_app_server::INVALID_PARAMS_ERROR_CODE;
use codex_app_server_protocol::ByteRange;
@@ -64,6 +66,10 @@ use std::path::Path;
use tempfile::TempDir;
use tokio::time::timeout;
+use super::analytics::enable_analytics_capture;
+use super::analytics::mount_analytics_capture;
+use super::analytics::wait_for_analytics_event;
#[cfg(windows)]
const DEFAULT_READ_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(25);
#[cfg(not(windows))]
@@ -328,6 +334,163 @@ async fn thread_start_omits_empty_instruction_overrides_from_model_request() ->
Ok(())
}
#[tokio::test]
async fn turn_start_tracks_turn_event_analytics() -> Result<()> {
let responses = vec![create_final_assistant_message_sse_response("Done")?];
let server = create_mock_responses_server_sequence_unchecked(responses).await;
let codex_home = TempDir::new()?;
write_mock_responses_config_toml_with_chatgpt_base_url(
codex_home.path(),
&server.uri(),
&server.uri(),
)?;
enable_analytics_capture(&server, codex_home.path()).await?;
let mut mcp = McpProcess::new_without_managed_config(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let thread_req = mcp
.send_thread_start_request(ThreadStartParams {
model: Some("mock-model".to_string()),
..Default::default()
})
.await?;
let thread_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(thread_req)),
)
.await??;
let ThreadStartResponse { thread, .. } = to_response::<ThreadStartResponse>(thread_resp)?;
let turn_req = mcp
.send_turn_start_request(TurnStartParams {
thread_id: thread.id.clone(),
input: vec![V2UserInput::Image {
url: "https://example.com/a.png".to_string(),
}],
..Default::default()
})
.await?;
let turn_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(turn_req)),
)
.await??;
let TurnStartResponse { turn } = to_response::<TurnStartResponse>(turn_resp)?;
timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_notification_message("turn/completed"),
)
.await??;
let event = wait_for_analytics_event(&server, DEFAULT_READ_TIMEOUT, "codex_turn_event").await?;
assert_eq!(event["event_params"]["thread_id"], thread.id);
assert_eq!(event["event_params"]["turn_id"], turn.id);
assert_eq!(
event["event_params"]["app_server_client"]["product_client_id"],
DEFAULT_CLIENT_NAME
);
assert_eq!(event["event_params"]["model"], "mock-model");
assert_eq!(event["event_params"]["model_provider"], "mock_provider");
assert_eq!(event["event_params"]["sandbox_policy"], "read_only");
assert_eq!(event["event_params"]["ephemeral"], false);
assert_eq!(event["event_params"]["thread_source"], "user");
assert_eq!(event["event_params"]["initialization_mode"], "new");
assert_eq!(
event["event_params"]["subagent_source"],
serde_json::Value::Null
);
assert_eq!(
event["event_params"]["parent_thread_id"],
serde_json::Value::Null
);
assert_eq!(event["event_params"]["num_input_images"], 1);
assert_eq!(event["event_params"]["status"], "completed");
assert!(event["event_params"]["started_at"].as_u64().is_some());
assert!(event["event_params"]["completed_at"].as_u64().is_some());
assert!(event["event_params"]["duration_ms"].as_u64().is_some());
assert_eq!(event["event_params"]["input_tokens"], 0);
assert_eq!(event["event_params"]["cached_input_tokens"], 0);
assert_eq!(event["event_params"]["output_tokens"], 0);
assert_eq!(event["event_params"]["reasoning_output_tokens"], 0);
assert_eq!(event["event_params"]["total_tokens"], 0);
Ok(())
}
#[tokio::test]
async fn turn_start_does_not_track_turn_event_analytics_without_feature() -> Result<()> {
let responses = vec![create_final_assistant_message_sse_response("Done")?];
let server = create_mock_responses_server_sequence_unchecked(responses).await;
let codex_home = TempDir::new()?;
write_mock_responses_config_toml_with_chatgpt_base_url(
codex_home.path(),
&server.uri(),
&server.uri(),
)?;
let config_path = codex_home.path().join("config.toml");
let config_toml = std::fs::read_to_string(&config_path)?;
std::fs::write(
&config_path,
format!("{config_toml}\n[features]\ngeneral_analytics = false\n"),
)?;
mount_analytics_capture(&server, codex_home.path()).await?;
let mut mcp = McpProcess::new_without_managed_config(codex_home.path()).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let thread_req = mcp
.send_thread_start_request(ThreadStartParams {
model: Some("mock-model".to_string()),
..Default::default()
})
.await?;
let thread_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(thread_req)),
)
.await??;
let ThreadStartResponse { thread, .. } = to_response::<ThreadStartResponse>(thread_resp)?;
let turn_req = mcp
.send_turn_start_request(TurnStartParams {
thread_id: thread.id,
input: vec![V2UserInput::Text {
text: "hello".to_string(),
text_elements: Vec::new(),
}],
..Default::default()
})
.await?;
let turn_resp: JSONRPCResponse = timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_response_message(RequestId::Integer(turn_req)),
)
.await??;
let _ = to_response::<TurnStartResponse>(turn_resp)?;
timeout(
DEFAULT_READ_TIMEOUT,
mcp.read_stream_until_notification_message("turn/completed"),
)
.await??;
let turn_event = wait_for_analytics_event(
&server,
std::time::Duration::from_millis(250),
"codex_turn_event",
)
.await;
assert!(
turn_event.is_err(),
"turn analytics should be gated off when general_analytics is disabled"
);
Ok(())
}
#[tokio::test]
async fn turn_start_accepts_text_at_limit_with_mention_item() -> Result<()> {
let responses = vec![create_final_assistant_message_sse_response("Done")?];

@@ -6,6 +6,7 @@ use app_test_support::create_mock_responses_server_sequence;
use app_test_support::create_mock_responses_server_sequence_unchecked;
use app_test_support::create_shell_command_sse_response;
use app_test_support::to_response;
+use app_test_support::write_mock_responses_config_toml_with_chatgpt_base_url;
use codex_app_server::INPUT_TOO_LARGE_ERROR_CODE;
use codex_app_server::INVALID_PARAMS_ERROR_CODE;
use codex_app_server_protocol::JSONRPCError;
@@ -23,6 +24,9 @@ use codex_protocol::user_input::MAX_USER_INPUT_TEXT_CHARS;
use tempfile::TempDir;
use tokio::time::timeout;
+use super::analytics::enable_analytics_capture;
+use super::analytics::wait_for_analytics_event;
const DEFAULT_READ_TIMEOUT: std::time::Duration = std::time::Duration::from_secs(10);
#[tokio::test]
@@ -32,9 +36,14 @@ async fn turn_steer_requires_active_turn() -> Result<()> {
std::fs::create_dir(&codex_home)?;
let server = create_mock_responses_server_sequence(vec![]).await;
-create_config_toml(&codex_home, &server.uri())?;
+write_mock_responses_config_toml_with_chatgpt_base_url(
+&codex_home,
+&server.uri(),
+&server.uri(),
+)?;
+enable_analytics_capture(&server, &codex_home).await?;
-let mut mcp = McpProcess::new(&codex_home).await?;
+let mut mcp = McpProcess::new_without_managed_config(&codex_home).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let thread_req = mcp
@@ -52,7 +61,7 @@ async fn turn_steer_requires_active_turn() -> Result<()> {
let steer_req = mcp
.send_turn_steer_request(TurnSteerParams {
-thread_id: thread.id,
+thread_id: thread.id.clone(),
input: vec![V2UserInput::Text {
text: "steer".to_string(),
text_elements: Vec::new(),
@@ -68,6 +77,21 @@ async fn turn_steer_requires_active_turn() -> Result<()> {
.await??;
assert_eq!(steer_err.error.code, -32600);
+let event =
+wait_for_analytics_event(&server, DEFAULT_READ_TIMEOUT, "codex_turn_steer_event").await?;
+assert_eq!(event["event_params"]["thread_id"], thread.id);
+assert_eq!(event["event_params"]["result"], "rejected");
+assert_eq!(event["event_params"]["num_input_images"], 0);
+assert_eq!(
+event["event_params"]["expected_turn_id"],
+"turn-does-not-exist"
+);
+assert_eq!(
+event["event_params"]["accepted_turn_id"],
+serde_json::Value::Null
+);
+assert_eq!(event["event_params"]["rejection_reason"], "no_active_turn");
Ok(())
}
@@ -96,9 +120,14 @@ async fn turn_steer_rejects_oversized_text_input() -> Result<()> {
"call_sleep",
)?])
.await;
-create_config_toml(&codex_home, &server.uri())?;
+write_mock_responses_config_toml_with_chatgpt_base_url(
+&codex_home,
+&server.uri(),
+&server.uri(),
+)?;
+enable_analytics_capture(&server, &codex_home).await?;
-let mut mcp = McpProcess::new(&codex_home).await?;
+let mut mcp = McpProcess::new_without_managed_config(&codex_home).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let thread_req = mcp
@@ -200,9 +229,14 @@ async fn turn_steer_returns_active_turn_id() -> Result<()> {
"call_sleep",
)?])
.await;
-create_config_toml(&codex_home, &server.uri())?;
+write_mock_responses_config_toml_with_chatgpt_base_url(
+&codex_home,
+&server.uri(),
+&server.uri(),
+)?;
+enable_analytics_capture(&server, &codex_home).await?;
-let mut mcp = McpProcess::new(&codex_home).await?;
+let mut mcp = McpProcess::new_without_managed_config(&codex_home).await?;
timeout(DEFAULT_READ_TIMEOUT, mcp.initialize()).await??;
let thread_req = mcp
@@ -261,31 +295,20 @@ async fn turn_steer_returns_active_turn_id() -> Result<()> {
let steer: TurnSteerResponse = to_response::<TurnSteerResponse>(steer_resp)?;
assert_eq!(steer.turn_id, turn.id);
+let event =
+wait_for_analytics_event(&server, DEFAULT_READ_TIMEOUT, "codex_turn_steer_event").await?;
+assert_eq!(event["event_params"]["thread_id"], thread.id);
+assert_eq!(event["event_params"]["result"], "accepted");
+assert_eq!(event["event_params"]["num_input_images"], 0);
+assert_eq!(event["event_params"]["expected_turn_id"], turn.id);
+assert_eq!(event["event_params"]["accepted_turn_id"], turn.id);
+assert_eq!(
+event["event_params"]["rejection_reason"],
+serde_json::Value::Null
+);
mcp.interrupt_turn_and_wait_for_aborted(thread.id, steer.turn_id, DEFAULT_READ_TIMEOUT)
.await?;
Ok(())
}
-fn create_config_toml(codex_home: &std::path::Path, server_uri: &str) -> std::io::Result<()> {
-let config_toml = codex_home.join("config.toml");
-std::fs::write(
-config_toml,
-format!(
-r#"
-model = "mock-model"
-approval_policy = "never"
-sandbox_mode = "danger-full-access"
-model_provider = "mock_provider"
-[model_providers.mock_provider]
-name = "Mock provider for test"
-base_url = "{server_uri}/v1"
-wire_api = "responses"
-request_max_retries = 0
-stream_max_retries = 0
-"#
-),
-)
-}

View File

@@ -29,7 +29,7 @@ const DIRECTORY_CONNECTORS_TIMEOUT: Duration = Duration::from_secs(60);
async fn apps_enabled(config: &Config) -> bool {
let auth_manager = AuthManager::shared(
-config.codex_home.clone(),
+config.codex_home.to_path_buf(),
/*enable_codex_api_key_env*/ false,
config.cli_auth_credentials_store_mode,
);
@@ -73,10 +73,9 @@ pub async fn list_cached_all_connectors(config: &Config) -> Option<Vec<AppInfo>>
}
let token_data = get_chatgpt_token_data()?;
let cache_key = all_connectors_cache_key(config, &token_data);
-codex_connectors::cached_all_connectors(&cache_key).map(|connectors| {
-let connectors = merge_plugin_apps(connectors, plugin_apps_for_config(config));
-filter_disallowed_connectors(connectors)
-})
+let connectors = codex_connectors::cached_all_connectors(&cache_key)?;
+let connectors = merge_plugin_apps(connectors, plugin_apps_for_config(config).await);
+Some(filter_disallowed_connectors(connectors))
}
pub async fn list_all_connectors_with_options(
@@ -106,7 +105,7 @@ pub async fn list_all_connectors_with_options(
},
)
.await?;
-let connectors = merge_plugin_apps(connectors, plugin_apps_for_config(config));
+let connectors = merge_plugin_apps(connectors, plugin_apps_for_config(config).await);
Ok(filter_disallowed_connectors(connectors))
}
@@ -119,9 +118,10 @@ fn all_connectors_cache_key(config: &Config, token_data: &TokenData) -> AllConne
)
}
-fn plugin_apps_for_config(config: &Config) -> Vec<codex_core::plugins::AppConnectorId> {
-PluginsManager::new(config.codex_home.clone())
+async fn plugin_apps_for_config(config: &Config) -> Vec<codex_core::plugins::AppConnectorId> {
+PluginsManager::new(config.codex_home.to_path_buf())
.plugins_for_config(config)
+.await
.effective_apps()
}

View File

@@ -141,7 +141,7 @@ pub async fn run_login_with_chatgpt(cli_config_overrides: CliConfigOverrides) ->
let forced_chatgpt_workspace_id = config.forced_chatgpt_workspace_id.clone();
match login_with_chatgpt(
-config.codex_home,
+config.codex_home.to_path_buf(),
forced_chatgpt_workspace_id,
config.cli_auth_credentials_store_mode,
)
@@ -229,7 +229,7 @@ pub async fn run_login_with_device_code(
}
let forced_chatgpt_workspace_id = config.forced_chatgpt_workspace_id.clone();
let mut opts = ServerOptions::new(
-config.codex_home,
+config.codex_home.to_path_buf(),
client_id.unwrap_or(CLIENT_ID.to_string()),
forced_chatgpt_workspace_id,
config.cli_auth_credentials_store_mode,
@@ -268,7 +268,7 @@ pub async fn run_login_with_device_code_fallback_to_browser(
let forced_chatgpt_workspace_id = config.forced_chatgpt_workspace_id.clone();
let mut opts = ServerOptions::new(
-config.codex_home,
+config.codex_home.to_path_buf(),
client_id.unwrap_or(CLIENT_ID.to_string()),
forced_chatgpt_workspace_id,
config.cli_auth_credentials_store_mode,

View File

@@ -1,22 +1,10 @@
use anyhow::Context;
use anyhow::Result;
use anyhow::bail;
use clap::Parser;
-use codex_config::MarketplaceConfigUpdate;
-use codex_config::record_user_marketplace;
use codex_core::config::find_codex_home;
-use codex_core::plugins::OPENAI_CURATED_MARKETPLACE_NAME;
-use codex_core::plugins::marketplace_install_root;
-use codex_core::plugins::validate_marketplace_root;
-use codex_core::plugins::validate_plugin_segment;
+use codex_core::plugins::MarketplaceAddRequest;
+use codex_core::plugins::add_marketplace;
use codex_utils_cli::CliConfigOverrides;
-use std::fs;
-use std::path::Path;
-use std::time::SystemTime;
-use std::time::UNIX_EPOCH;
-mod metadata;
-mod ops;
#[derive(Debug, Parser)]
pub struct MarketplaceCli {
@@ -51,14 +39,6 @@ struct AddMarketplaceArgs {
sparse_paths: Vec<String>,
}
-#[derive(Debug, PartialEq, Eq)]
-pub(super) enum MarketplaceSource {
-Git {
-url: String,
-ref_name: Option<String>,
-},
-}
impl MarketplaceCli {
pub async fn run(self) -> Result<()> {
let MarketplaceCli {
@@ -87,449 +67,41 @@ async fn run_add(args: AddMarketplaceArgs) -> Result<()> {
sparse_paths,
} = args;
let source = parse_marketplace_source(&source, ref_name)?;
let codex_home = find_codex_home().context("failed to resolve CODEX_HOME")?;
let install_root = marketplace_install_root(&codex_home);
fs::create_dir_all(&install_root).with_context(|| {
format!(
"failed to create marketplace install directory {}",
install_root.display()
)
})?;
let install_metadata =
metadata::MarketplaceInstallMetadata::from_source(&source, &sparse_paths);
if let Some(existing_root) = metadata::installed_marketplace_root_for_source(
&codex_home,
&install_root,
&install_metadata,
)? {
let marketplace_name = validate_marketplace_root(&existing_root).with_context(|| {
format!(
"failed to validate installed marketplace at {}",
existing_root.display()
)
})?;
record_added_marketplace(&codex_home, &marketplace_name, &install_metadata)?;
let outcome = add_marketplace(
codex_home.to_path_buf(),
MarketplaceAddRequest {
source,
ref_name,
sparse_paths,
},
)
.await?;
if outcome.already_added {
println!(
"Marketplace `{marketplace_name}` is already added from {}.",
source.display()
"Marketplace `{}` is already added from {}.",
outcome.marketplace_name, outcome.source_display
);
println!("Installed marketplace root: {}", existing_root.display());
return Ok(());
}
let staging_root = ops::marketplace_staging_root(&install_root);
fs::create_dir_all(&staging_root).with_context(|| {
format!(
"failed to create marketplace staging directory {}",
staging_root.display()
)
})?;
let staged_dir = tempfile::Builder::new()
.prefix("marketplace-add-")
.tempdir_in(&staging_root)
.with_context(|| {
format!(
"failed to create temporary marketplace directory in {}",
staging_root.display()
)
})?;
let staged_root = staged_dir.path().to_path_buf();
let MarketplaceSource::Git { url, ref_name } = &source;
ops::clone_git_source(url, ref_name.as_deref(), &sparse_paths, &staged_root)?;
let marketplace_name = validate_marketplace_source_root(&staged_root)
.with_context(|| format!("failed to validate marketplace from {}", source.display()))?;
if marketplace_name == OPENAI_CURATED_MARKETPLACE_NAME {
bail!(
"marketplace `{OPENAI_CURATED_MARKETPLACE_NAME}` is reserved and cannot be added from {}",
source.display()
);
}
let destination = install_root.join(safe_marketplace_dir_name(&marketplace_name)?);
ensure_marketplace_destination_is_inside_install_root(&install_root, &destination)?;
if destination.exists() {
bail!(
"marketplace `{marketplace_name}` is already added from a different source; remove it before adding {}",
source.display()
);
}
ops::replace_marketplace_root(&staged_root, &destination)
.with_context(|| format!("failed to install marketplace at {}", destination.display()))?;
if let Err(err) = record_added_marketplace(&codex_home, &marketplace_name, &install_metadata) {
if let Err(rollback_err) = fs::rename(&destination, &staged_root) {
bail!(
"{err}; additionally failed to roll back installed marketplace at {}: {rollback_err}",
destination.display()
);
}
return Err(err);
}
println!(
"Added marketplace `{marketplace_name}` from {}.",
source.display()
);
println!("Installed marketplace root: {}", destination.display());
Ok(())
}
fn record_added_marketplace(
codex_home: &Path,
marketplace_name: &str,
install_metadata: &metadata::MarketplaceInstallMetadata,
) -> Result<()> {
let source = install_metadata.config_source();
let last_updated = utc_timestamp_now()?;
let update = MarketplaceConfigUpdate {
last_updated: &last_updated,
source_type: install_metadata.config_source_type(),
source: &source,
ref_name: install_metadata.ref_name(),
sparse_paths: install_metadata.sparse_paths(),
};
record_user_marketplace(codex_home, marketplace_name, &update).with_context(|| {
format!("failed to add marketplace `{marketplace_name}` to user config.toml")
})?;
Ok(())
}
fn validate_marketplace_source_root(root: &Path) -> Result<String> {
let marketplace_name = validate_marketplace_root(root)?;
validate_plugin_segment(&marketplace_name, "marketplace name").map_err(anyhow::Error::msg)?;
Ok(marketplace_name)
}
fn parse_marketplace_source(
source: &str,
explicit_ref: Option<String>,
) -> Result<MarketplaceSource> {
let source = source.trim();
if source.is_empty() {
bail!("marketplace source must not be empty");
}
let (base_source, parsed_ref) = split_source_ref(source);
let ref_name = explicit_ref.or(parsed_ref);
if looks_like_local_path(&base_source) {
bail!(
"local marketplace sources are not supported yet; use an HTTP(S) Git URL, SSH Git URL, or GitHub owner/repo"
);
}
if is_ssh_git_url(&base_source) || is_git_url(&base_source) {
let url = normalize_git_url(&base_source);
return Ok(MarketplaceSource::Git { url, ref_name });
}
if looks_like_github_shorthand(&base_source) {
let url = format!("https://github.com/{base_source}.git");
return Ok(MarketplaceSource::Git { url, ref_name });
}
bail!("invalid marketplace source format: {source}");
}
fn split_source_ref(source: &str) -> (String, Option<String>) {
if let Some((base, ref_name)) = source.rsplit_once('#') {
return (base.to_string(), non_empty_ref(ref_name));
}
if !source.contains("://")
&& !is_ssh_git_url(source)
&& let Some((base, ref_name)) = source.rsplit_once('@')
{
return (base.to_string(), non_empty_ref(ref_name));
}
(source.to_string(), None)
}
fn non_empty_ref(ref_name: &str) -> Option<String> {
let ref_name = ref_name.trim();
(!ref_name.is_empty()).then(|| ref_name.to_string())
}
fn normalize_git_url(url: &str) -> String {
let url = url.trim_end_matches('/');
if url.starts_with("https://github.com/") && !url.ends_with(".git") {
format!("{url}.git")
} else {
url.to_string()
}
}
fn looks_like_local_path(source: &str) -> bool {
source.starts_with("./")
|| source.starts_with("../")
|| source.starts_with('/')
|| source.starts_with("~/")
|| source == "."
|| source == ".."
}
fn is_ssh_git_url(source: &str) -> bool {
source.starts_with("ssh://") || source.starts_with("git@") && source.contains(':')
}
fn is_git_url(source: &str) -> bool {
source.starts_with("http://") || source.starts_with("https://")
}
fn looks_like_github_shorthand(source: &str) -> bool {
let mut segments = source.split('/');
let owner = segments.next();
let repo = segments.next();
let extra = segments.next();
owner.is_some_and(is_github_shorthand_segment)
&& repo.is_some_and(is_github_shorthand_segment)
&& extra.is_none()
}
fn is_github_shorthand_segment(segment: &str) -> bool {
!segment.is_empty()
&& segment
.chars()
.all(|ch| ch.is_ascii_alphanumeric() || matches!(ch, '-' | '_' | '.'))
}
fn safe_marketplace_dir_name(marketplace_name: &str) -> Result<String> {
let safe = marketplace_name
.chars()
.map(|ch| {
if ch.is_ascii_alphanumeric() || matches!(ch, '-' | '_' | '.') {
ch
} else {
'-'
}
})
.collect::<String>();
let safe = safe.trim_matches('.').to_string();
if safe.is_empty() || safe == ".." {
bail!("marketplace name `{marketplace_name}` cannot be used as an install directory");
}
Ok(safe)
}
fn ensure_marketplace_destination_is_inside_install_root(
install_root: &Path,
destination: &Path,
) -> Result<()> {
let install_root = install_root.canonicalize().with_context(|| {
format!(
"failed to resolve marketplace install root {}",
install_root.display()
)
})?;
let destination_parent = destination
.parent()
.context("marketplace destination has no parent")?
.canonicalize()
.with_context(|| {
format!(
"failed to resolve marketplace destination parent {}",
destination.display()
)
})?;
if !destination_parent.starts_with(&install_root) {
bail!(
"marketplace destination {} is outside install root {}",
destination.display(),
install_root.display()
println!(
"Added marketplace `{}` from {}.",
outcome.marketplace_name, outcome.source_display
);
}
println!(
"Installed marketplace root: {}",
outcome.installed_root.as_path().display()
);
Ok(())
}
fn utc_timestamp_now() -> Result<String> {
let duration = SystemTime::now()
.duration_since(UNIX_EPOCH)
.context("system clock is before Unix epoch")?;
Ok(format_utc_timestamp(duration.as_secs() as i64))
}
fn format_utc_timestamp(seconds_since_epoch: i64) -> String {
const SECONDS_PER_DAY: i64 = 86_400;
let days = seconds_since_epoch.div_euclid(SECONDS_PER_DAY);
let seconds_of_day = seconds_since_epoch.rem_euclid(SECONDS_PER_DAY);
let (year, month, day) = civil_from_days(days);
let hour = seconds_of_day / 3_600;
let minute = (seconds_of_day % 3_600) / 60;
let second = seconds_of_day % 60;
format!("{year:04}-{month:02}-{day:02}T{hour:02}:{minute:02}:{second:02}Z")
}
fn civil_from_days(days_since_epoch: i64) -> (i64, i64, i64) {
let days = days_since_epoch + 719_468;
let era = if days >= 0 { days } else { days - 146_096 } / 146_097;
let day_of_era = days - era * 146_097;
let year_of_era =
(day_of_era - day_of_era / 1_460 + day_of_era / 36_524 - day_of_era / 146_096) / 365;
let mut year = year_of_era + era * 400;
let day_of_year = day_of_era - (365 * year_of_era + year_of_era / 4 - year_of_era / 100);
let month_prime = (5 * day_of_year + 2) / 153;
let day = day_of_year - (153 * month_prime + 2) / 5 + 1;
let month = month_prime + if month_prime < 10 { 3 } else { -9 };
year += if month <= 2 { 1 } else { 0 };
(year, month, day)
}
impl MarketplaceSource {
fn display(&self) -> String {
match self {
Self::Git { url, ref_name } => {
if let Some(ref_name) = ref_name {
format!("{url}#{ref_name}")
} else {
url.clone()
}
}
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use pretty_assertions::assert_eq;
#[test]
fn github_shorthand_parses_ref_suffix() {
assert_eq!(
parse_marketplace_source("owner/repo@main", /*explicit_ref*/ None).unwrap(),
MarketplaceSource::Git {
url: "https://github.com/owner/repo.git".to_string(),
ref_name: Some("main".to_string()),
}
);
}
#[test]
fn git_url_parses_fragment_ref() {
assert_eq!(
parse_marketplace_source(
"https://example.com/team/repo.git#v1",
/*explicit_ref*/ None,
)
.unwrap(),
MarketplaceSource::Git {
url: "https://example.com/team/repo.git".to_string(),
ref_name: Some("v1".to_string()),
}
);
}
#[test]
fn explicit_ref_overrides_source_ref() {
assert_eq!(
parse_marketplace_source(
"owner/repo@main",
/*explicit_ref*/ Some("release".to_string()),
)
.unwrap(),
MarketplaceSource::Git {
url: "https://github.com/owner/repo.git".to_string(),
ref_name: Some("release".to_string()),
}
);
}
#[test]
fn github_shorthand_and_git_url_normalize_to_same_source() {
let shorthand = parse_marketplace_source("owner/repo", /*explicit_ref*/ None).unwrap();
let git_url = parse_marketplace_source(
"https://github.com/owner/repo.git",
/*explicit_ref*/ None,
)
.unwrap();
assert_eq!(shorthand, git_url);
assert_eq!(
shorthand,
MarketplaceSource::Git {
url: "https://github.com/owner/repo.git".to_string(),
ref_name: None,
}
);
}
#[test]
fn github_url_with_trailing_slash_normalizes_without_extra_path_segment() {
assert_eq!(
parse_marketplace_source("https://github.com/owner/repo/", /*explicit_ref*/ None)
.unwrap(),
MarketplaceSource::Git {
url: "https://github.com/owner/repo.git".to_string(),
ref_name: None,
}
);
}
#[test]
fn non_github_https_source_parses_as_git_url() {
assert_eq!(
parse_marketplace_source("https://gitlab.com/owner/repo", /*explicit_ref*/ None)
.unwrap(),
MarketplaceSource::Git {
url: "https://gitlab.com/owner/repo".to_string(),
ref_name: None,
}
);
}
#[test]
fn file_url_source_is_rejected() {
let err =
parse_marketplace_source("file:///tmp/marketplace.git", /*explicit_ref*/ None)
.unwrap_err();
assert!(
err.to_string()
.contains("invalid marketplace source format"),
"unexpected error: {err}"
);
}
#[test]
fn local_path_source_is_rejected() {
let err = parse_marketplace_source("./marketplace", /*explicit_ref*/ None).unwrap_err();
assert!(
err.to_string()
.contains("local marketplace sources are not supported yet"),
"unexpected error: {err}"
);
}
#[test]
fn ssh_url_parses_as_git_url() {
assert_eq!(
parse_marketplace_source(
"ssh://git@github.com/owner/repo.git#main",
/*explicit_ref*/ None,
)
.unwrap(),
MarketplaceSource::Git {
url: "ssh://git@github.com/owner/repo.git".to_string(),
ref_name: Some("main".to_string()),
}
);
}
#[test]
fn utc_timestamp_formats_unix_epoch_as_rfc3339_utc() {
assert_eq!(
format_utc_timestamp(/*seconds_since_epoch*/ 0),
"1970-01-01T00:00:00Z"
);
assert_eq!(
format_utc_timestamp(/*seconds_since_epoch*/ 1_775_779_200),
"2026-04-10T00:00:00Z"
);
}
#[test]
fn sparse_paths_parse_before_or_after_source() {
let sparse_before_source =

View File

@@ -1,150 +0,0 @@
use super::MarketplaceSource;
use anyhow::Context;
use anyhow::Result;
use codex_config::CONFIG_TOML_FILE;
use codex_core::plugins::validate_marketplace_root;
use std::io::ErrorKind;
use std::path::Path;
use std::path::PathBuf;
#[derive(Debug, Clone, PartialEq, Eq)]
pub(super) struct MarketplaceInstallMetadata {
source: InstalledMarketplaceSource,
}
#[derive(Debug, Clone, PartialEq, Eq)]
enum InstalledMarketplaceSource {
Git {
url: String,
ref_name: Option<String>,
sparse_paths: Vec<String>,
},
}
pub(super) fn installed_marketplace_root_for_source(
codex_home: &Path,
install_root: &Path,
install_metadata: &MarketplaceInstallMetadata,
) -> Result<Option<PathBuf>> {
let config_path = codex_home.join(CONFIG_TOML_FILE);
let config = match std::fs::read_to_string(&config_path) {
Ok(config) => config,
Err(err) if err.kind() == ErrorKind::NotFound => return Ok(None),
Err(err) => {
return Err(err)
.with_context(|| format!("failed to read user config {}", config_path.display()));
}
};
let config: toml::Value = toml::from_str(&config)
.with_context(|| format!("failed to parse user config {}", config_path.display()))?;
let Some(marketplaces) = config.get("marketplaces").and_then(toml::Value::as_table) else {
return Ok(None);
};
for (marketplace_name, marketplace) in marketplaces {
if !install_metadata.matches_config(marketplace) {
continue;
}
let root = install_root.join(marketplace_name);
if validate_marketplace_root(&root).is_ok() {
return Ok(Some(root));
}
}
Ok(None)
}
impl MarketplaceInstallMetadata {
pub(super) fn from_source(source: &MarketplaceSource, sparse_paths: &[String]) -> Self {
let source = match source {
MarketplaceSource::Git { url, ref_name } => InstalledMarketplaceSource::Git {
url: url.clone(),
ref_name: ref_name.clone(),
sparse_paths: sparse_paths.to_vec(),
},
};
Self { source }
}
pub(super) fn config_source_type(&self) -> &'static str {
match &self.source {
InstalledMarketplaceSource::Git { .. } => "git",
}
}
pub(super) fn config_source(&self) -> String {
match &self.source {
InstalledMarketplaceSource::Git { url, .. } => url.clone(),
}
}
pub(super) fn ref_name(&self) -> Option<&str> {
match &self.source {
InstalledMarketplaceSource::Git { ref_name, .. } => ref_name.as_deref(),
}
}
pub(super) fn sparse_paths(&self) -> &[String] {
match &self.source {
InstalledMarketplaceSource::Git { sparse_paths, .. } => sparse_paths,
}
}
fn matches_config(&self, marketplace: &toml::Value) -> bool {
marketplace.get("source_type").and_then(toml::Value::as_str)
== Some(self.config_source_type())
&& marketplace.get("source").and_then(toml::Value::as_str)
== Some(self.config_source().as_str())
&& marketplace.get("ref").and_then(toml::Value::as_str) == self.ref_name()
&& config_sparse_paths(marketplace) == self.sparse_paths()
}
}
fn config_sparse_paths(marketplace: &toml::Value) -> Vec<String> {
marketplace
.get("sparse_paths")
.and_then(toml::Value::as_array)
.map(|paths| {
paths
.iter()
.filter_map(toml::Value::as_str)
.map(str::to_string)
.collect()
})
.unwrap_or_default()
}
#[cfg(test)]
mod tests {
use super::*;
use pretty_assertions::assert_eq;
use tempfile::TempDir;
#[test]
fn installed_marketplace_root_for_source_propagates_config_read_errors() -> Result<()> {
let codex_home = TempDir::new()?;
let config_path = codex_home.path().join(CONFIG_TOML_FILE);
std::fs::create_dir(&config_path)?;
let install_root = codex_home.path().join("marketplaces");
let source = MarketplaceSource::Git {
url: "https://github.com/owner/repo.git".to_string(),
ref_name: None,
};
let install_metadata = MarketplaceInstallMetadata::from_source(&source, &[]);
let err = installed_marketplace_root_for_source(
codex_home.path(),
&install_root,
&install_metadata,
)
.unwrap_err();
assert_eq!(
err.to_string(),
format!("failed to read user config {}", config_path.display())
);
Ok(())
}
}

View File

@@ -1,118 +0,0 @@
use anyhow::Context;
use anyhow::Result;
use anyhow::bail;
use std::fs;
use std::path::Path;
use std::path::PathBuf;
use std::process::Command;
pub(super) fn clone_git_source(
url: &str,
ref_name: Option<&str>,
sparse_paths: &[String],
destination: &Path,
) -> Result<()> {
let destination = destination.to_string_lossy().to_string();
if sparse_paths.is_empty() {
run_git(&["clone", url, destination.as_str()], /*cwd*/ None)?;
if let Some(ref_name) = ref_name {
run_git(&["checkout", ref_name], Some(Path::new(&destination)))?;
}
return Ok(());
}
run_git(
&[
"clone",
"--filter=blob:none",
"--no-checkout",
url,
destination.as_str(),
],
/*cwd*/ None,
)?;
let mut sparse_args = vec!["sparse-checkout", "set"];
sparse_args.extend(sparse_paths.iter().map(String::as_str));
let destination = Path::new(&destination);
run_git(&sparse_args, Some(destination))?;
run_git(&["checkout", ref_name.unwrap_or("HEAD")], Some(destination))?;
Ok(())
}
fn run_git(args: &[&str], cwd: Option<&Path>) -> Result<()> {
let mut command = Command::new("git");
command.args(args);
command.env("GIT_TERMINAL_PROMPT", "0");
if let Some(cwd) = cwd {
command.current_dir(cwd);
}
let output = command
.output()
.with_context(|| format!("failed to run git {}", args.join(" ")))?;
if output.status.success() {
return Ok(());
}
let stderr = String::from_utf8_lossy(&output.stderr);
let stdout = String::from_utf8_lossy(&output.stdout);
bail!(
"git {} failed with status {}\nstdout:\n{}\nstderr:\n{}",
args.join(" "),
output.status,
stdout.trim(),
stderr.trim()
);
}
pub(super) fn replace_marketplace_root(staged_root: &Path, destination: &Path) -> Result<()> {
if let Some(parent) = destination.parent() {
fs::create_dir_all(parent)?;
}
if destination.exists() {
bail!(
"marketplace destination already exists: {}",
destination.display()
);
}
fs::rename(staged_root, destination).map_err(Into::into)
}
pub(super) fn marketplace_staging_root(install_root: &Path) -> PathBuf {
install_root.join(".staging")
}
#[cfg(test)]
mod tests {
use super::*;
use pretty_assertions::assert_eq;
use tempfile::TempDir;
#[test]
fn replace_marketplace_root_rejects_existing_destination() {
let temp_dir = TempDir::new().unwrap();
let staged_root = temp_dir.path().join("staged");
let destination = temp_dir.path().join("destination");
fs::create_dir_all(&staged_root).unwrap();
fs::write(staged_root.join("marker.txt"), "staged").unwrap();
fs::create_dir_all(&destination).unwrap();
fs::write(destination.join("marker.txt"), "installed").unwrap();
let err = replace_marketplace_root(&staged_root, &destination).unwrap_err();
assert!(
err.to_string()
.contains("marketplace destination already exists"),
"unexpected error: {err}"
);
assert_eq!(
fs::read_to_string(staged_root.join("marker.txt")).unwrap(),
"staged"
);
assert_eq!(
fs::read_to_string(destination.join("marker.txt")).unwrap(),
"installed"
);
}
}

View File

@@ -299,6 +299,7 @@ async fn run_add(config_overrides: &CliConfigOverrides, add_args: AddArgs) -> Re
transport: transport.clone(),
enabled: true,
required: false,
+supports_parallel_tool_calls: false,
disabled_reason: None,
startup_timeout_sec: None,
tool_timeout_sec: None,
@@ -390,8 +391,10 @@ async fn run_login(config_overrides: &CliConfigOverrides, login_args: LoginArgs)
let config = Config::load_with_cli_overrides(overrides)
.await
.context("failed to load configuration")?;
-let mcp_manager = McpManager::new(Arc::new(PluginsManager::new(config.codex_home.clone())));
-let mcp_servers = mcp_manager.effective_servers(&config, /*auth*/ None);
+let mcp_manager = McpManager::new(Arc::new(PluginsManager::new(
+config.codex_home.to_path_buf(),
+)));
+let mcp_servers = mcp_manager.effective_servers(&config, /*auth*/ None).await;
let LoginArgs { name, scopes } = login_args;
@@ -441,8 +444,10 @@ async fn run_logout(config_overrides: &CliConfigOverrides, logout_args: LogoutAr
let config = Config::load_with_cli_overrides(overrides)
.await
.context("failed to load configuration")?;
-let mcp_manager = McpManager::new(Arc::new(PluginsManager::new(config.codex_home.clone())));
-let mcp_servers = mcp_manager.effective_servers(&config, /*auth*/ None);
+let mcp_manager = McpManager::new(Arc::new(PluginsManager::new(
+config.codex_home.to_path_buf(),
+)));
+let mcp_servers = mcp_manager.effective_servers(&config, /*auth*/ None).await;
let LogoutArgs { name } = logout_args;
@@ -471,8 +476,10 @@ async fn run_list(config_overrides: &CliConfigOverrides, list_args: ListArgs) ->
let config = Config::load_with_cli_overrides(overrides)
.await
.context("failed to load configuration")?;
-let mcp_manager = McpManager::new(Arc::new(PluginsManager::new(config.codex_home.clone())));
-let mcp_servers = mcp_manager.effective_servers(&config, /*auth*/ None);
+let mcp_manager = McpManager::new(Arc::new(PluginsManager::new(
+config.codex_home.to_path_buf(),
+)));
+let mcp_servers = mcp_manager.effective_servers(&config, /*auth*/ None).await;
let mut entries: Vec<_> = mcp_servers.iter().collect();
entries.sort_by(|(a, _), (b, _)| a.cmp(b));
@@ -720,8 +727,10 @@ async fn run_get(config_overrides: &CliConfigOverrides, get_args: GetArgs) -> Re
let config = Config::load_with_cli_overrides(overrides)
.await
.context("failed to load configuration")?;
-let mcp_manager = McpManager::new(Arc::new(PluginsManager::new(config.codex_home.clone())));
-let mcp_servers = mcp_manager.effective_servers(&config, /*auth*/ None);
+let mcp_manager = McpManager::new(Arc::new(PluginsManager::new(
+config.codex_home.to_path_buf(),
+)));
+let mcp_servers = mcp_manager.effective_servers(&config, /*auth*/ None).await;
let Some(server) = mcp_servers.get(&get_args.name) else {
bail!("No MCP server named '{name}' found.", name = get_args.name);

View File

@@ -63,7 +63,7 @@ pub async fn load_auth_manager() -> Option<AuthManager> {
// TODO: pass in cli overrides once cloud tasks properly support them.
let config = Config::load_with_cli_overrides(Vec::new()).await.ok()?;
Some(AuthManager::new(
-config.codex_home,
+config.codex_home.to_path_buf(),
/*enable_codex_api_key_env*/ false,
config.cli_auth_credentials_store_mode,
))

View File

@@ -14,6 +14,7 @@ workspace = true
[dependencies]
async-trait = { workspace = true }
+deno_core_icudata = { workspace = true }
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true }
tokio = { workspace = true, features = ["macros", "rt", "sync", "time"] }

View File

@@ -104,6 +104,8 @@ pub(crate) fn spawn_runtime(
request: ExecuteRequest,
event_tx: mpsc::UnboundedSender<RuntimeEvent>,
) -> Result<(std_mpsc::Sender<RuntimeCommand>, v8::IsolateHandle), String> {
+initialize_v8()?;
let (command_tx, command_rx) = std_mpsc::channel();
let runtime_command_tx = command_tx.clone();
let (isolate_handle_tx, isolate_handle_rx) = std_mpsc::sync_channel(1);
@@ -164,15 +166,20 @@ pub(super) enum CompletionState {
},
}
fn initialize_v8() {
static PLATFORM: OnceLock<v8::SharedRef<v8::Platform>> = OnceLock::new();
fn initialize_v8() -> Result<(), String> {
static PLATFORM: OnceLock<Result<v8::SharedRef<v8::Platform>, String>> = OnceLock::new();
let _ = PLATFORM.get_or_init(|| {
match PLATFORM.get_or_init(|| {
v8::icu::set_common_data_77(deno_core_icudata::ICU_DATA)
.map_err(|error_code| format!("failed to initialize ICU data: {error_code}"))?;
let platform = v8::new_default_platform(0, false).make_shared();
v8::V8::initialize_platform(platform.clone());
v8::V8::initialize();
platform
});
Ok(platform)
}) {
Ok(_) => Ok(()),
Err(error_text) => Err(error_text.clone()),
}
}
fn run_runtime(
@@ -182,8 +189,6 @@ fn run_runtime(
isolate_handle_tx: std_mpsc::SyncSender<v8::IsolateHandle>,
runtime_command_tx: std_mpsc::Sender<RuntimeCommand>,
) {
-initialize_v8();
let isolate = &mut v8::Isolate::new(v8::CreateParams::default());
let isolate_handle = isolate.thread_safe_handle();
if isolate_handle_tx.send(isolate_handle).is_err() {

View File

@@ -561,6 +561,85 @@ mod tests {
);
}
#[tokio::test]
async fn date_locale_string_formats_with_icu_data() {
let service = CodeModeService::new();
let response = service
.execute(ExecuteRequest {
source: r#"
const value = new Date("2025-01-02T03:04:05Z")
.toLocaleString("fr-FR", {
weekday: "long",
month: "long",
day: "numeric",
hour: "2-digit",
minute: "2-digit",
second: "2-digit",
hour12: false,
timeZone: "UTC",
});
text(value);
"#
.to_string(),
yield_time_ms: None,
..execute_request("")
})
.await
.unwrap();
assert_eq!(
response,
RuntimeResponse::Result {
cell_id: "1".to_string(),
content_items: vec![FunctionCallOutputContentItem::InputText {
text: "jeudi 2 janvier \u{e0} 03:04:05".to_string(),
}],
stored_values: HashMap::new(),
error_text: None,
}
);
}
#[tokio::test]
async fn intl_date_time_format_formats_with_icu_data() {
let service = CodeModeService::new();
let response = service
.execute(ExecuteRequest {
source: r#"
const formatter = new Intl.DateTimeFormat("fr-FR", {
weekday: "long",
month: "long",
day: "numeric",
hour: "2-digit",
minute: "2-digit",
second: "2-digit",
hour12: false,
timeZone: "UTC",
});
text(formatter.format(new Date("2025-01-02T03:04:05Z")));
"#
.to_string(),
yield_time_ms: None,
..execute_request("")
})
.await
.unwrap();
assert_eq!(
response,
RuntimeResponse::Result {
cell_id: "1".to_string(),
content_items: vec![FunctionCallOutputContentItem::InputText {
text: "jeudi 2 janvier \u{e0} 03:04:05".to_string(),
}],
stored_values: HashMap::new(),
error_text: None,
}
);
}
#[tokio::test]
async fn output_helpers_return_undefined() {
let service = CodeModeService::new();


@@ -13,6 +13,7 @@ pub use models::ModelsClient;
pub use realtime_call::RealtimeCallClient;
pub use realtime_call::RealtimeCallResponse;
pub use realtime_websocket::RealtimeEventParser;
pub use realtime_websocket::RealtimeOutputModality;
pub use realtime_websocket::RealtimeSessionConfig;
pub use realtime_websocket::RealtimeSessionMode;
pub use realtime_websocket::RealtimeWebsocketClient;


@@ -19,6 +19,7 @@ use serde_json::to_string;
use serde_json::to_value;
use std::sync::Arc;
use tracing::instrument;
use tracing::trace;
const MULTIPART_BOUNDARY: &str = "codex-realtime-call-boundary";
const MULTIPART_CONTENT_TYPE: &str = "multipart/form-data; boundary=codex-realtime-call-boundary";
@@ -200,6 +201,7 @@ fn decode_call_id_from_location(headers: &HeaderMap) -> Result<String, ApiError>
.ok_or_else(|| ApiError::Stream("realtime call response missing Location".to_string()))?
.to_str()
.map_err(|err| ApiError::Stream(format!("invalid realtime call Location: {err}")))?;
trace!("realtime call Location: {location}");
location
.split('?')
@@ -219,6 +221,7 @@ fn decode_call_id_from_location(headers: &HeaderMap) -> Result<String, ApiError>
mod tests {
use super::*;
use crate::endpoint::realtime_websocket::RealtimeEventParser;
use crate::endpoint::realtime_websocket::RealtimeOutputModality;
use crate::endpoint::realtime_websocket::RealtimeSessionMode;
use crate::provider::RetryConfig;
use async_trait::async_trait;
@@ -309,6 +312,7 @@ mod tests {
session_id: Some(session_id.to_string()),
event_parser: RealtimeEventParser::RealtimeV2,
session_mode: RealtimeSessionMode::Conversational,
output_modality: RealtimeOutputModality::Audio,
voice: RealtimeVoice::Marin,
}
}


@@ -7,9 +7,9 @@ use crate::endpoint::realtime_websocket::protocol::RealtimeAudioFrame;
use crate::endpoint::realtime_websocket::protocol::RealtimeEvent;
use crate::endpoint::realtime_websocket::protocol::RealtimeEventParser;
use crate::endpoint::realtime_websocket::protocol::RealtimeOutboundMessage;
+use crate::endpoint::realtime_websocket::protocol::RealtimeOutputModality;
use crate::endpoint::realtime_websocket::protocol::RealtimeSessionConfig;
use crate::endpoint::realtime_websocket::protocol::RealtimeSessionMode;
-use crate::endpoint::realtime_websocket::protocol::RealtimeTranscriptDelta;
use crate::endpoint::realtime_websocket::protocol::RealtimeTranscriptEntry;
use crate::endpoint::realtime_websocket::protocol::RealtimeVoice;
use crate::endpoint::realtime_websocket::protocol::parse_realtime_event;
@@ -17,6 +17,7 @@ use crate::error::ApiError;
use crate::provider::Provider;
use codex_client::backoff;
use codex_client::maybe_build_rustls_client_config_with_custom_ca;
+use codex_protocol::protocol::RealtimeTranscriptDelta;
use codex_utils_rustls_provider::ensure_rustls_crypto_provider;
use futures::SinkExt;
use futures::StreamExt;
@@ -307,10 +308,17 @@ impl RealtimeWebsocketWriter {
&self,
instructions: String,
session_mode: RealtimeSessionMode,
output_modality: RealtimeOutputModality,
voice: RealtimeVoice,
) -> Result<(), ApiError> {
let session_mode = normalized_session_mode(self.event_parser, session_mode);
-let session = session_update_session(self.event_parser, instructions, session_mode, voice);
+let session = session_update_session(
+self.event_parser,
+instructions,
+session_mode,
+output_modality,
+voice,
+);
self.send_json(&RealtimeOutboundMessage::SessionUpdate { session })
.await
}
@@ -406,10 +414,10 @@ impl RealtimeWebsocketEvents {
let mut active_transcript = self.active_transcript.lock().await;
match event {
RealtimeEvent::InputAudioSpeechStarted(_) => {}
-RealtimeEvent::InputTranscriptDelta(RealtimeTranscriptDelta { delta }) => {
+RealtimeEvent::InputTranscriptDelta(RealtimeTranscriptDelta { delta, .. }) => {
append_transcript_delta(&mut active_transcript.entries, "user", delta);
}
-RealtimeEvent::OutputTranscriptDelta(RealtimeTranscriptDelta { delta }) => {
+RealtimeEvent::OutputTranscriptDelta(RealtimeTranscriptDelta { delta, .. }) => {
append_transcript_delta(&mut active_transcript.entries, "assistant", delta);
}
RealtimeEvent::HandoffRequested(handoff) => {
@@ -418,6 +426,8 @@ impl RealtimeWebsocketEvents {
}
}
RealtimeEvent::SessionUpdated { .. }
| RealtimeEvent::InputTranscriptDone(_)
| RealtimeEvent::OutputTranscriptDone(_)
| RealtimeEvent::AudioOut(_)
| RealtimeEvent::ResponseCreated(_)
| RealtimeEvent::ResponseCancelled(_)
@@ -581,7 +591,12 @@ impl RealtimeWebsocketClient {
);
connection
.writer
-.send_session_update(config.instructions, config.session_mode, config.voice)
+.send_session_update(
+config.instructions,
+config.session_mode,
+config.output_modality,
+config.voice,
+)
.await?;
Ok(connection)
}
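The delta handling above routes user and assistant deltas into `append_transcript_delta` keyed by role. That function's body is not shown in this diff, but a common shape for per-role transcript accumulation is to extend the last entry when the speaker is unchanged and start a new entry otherwise; a purely hypothetical standalone sketch (not the crate's implementation):

```rust
// Hypothetical transcript entry; the real protocol types differ.
#[derive(Debug, PartialEq)]
struct Entry {
    role: &'static str,
    text: String,
}

// Append a delta: extend the last entry if the role matches, otherwise
// push a new entry for the new speaker.
fn append_delta(entries: &mut Vec<Entry>, role: &'static str, delta: &str) {
    let same_role = entries.last().map_or(false, |last| last.role == role);
    if same_role {
        if let Some(last) = entries.last_mut() {
            last.text.push_str(delta);
        }
    } else {
        entries.push(Entry { role, text: delta.to_string() });
    }
}

fn main() {
    let mut entries: Vec<Entry> = Vec::new();
    append_delta(&mut entries, "user", "hel");
    append_delta(&mut entries, "user", "lo");
    append_delta(&mut entries, "assistant", "hi");
    assert_eq!(entries.len(), 2);
    assert_eq!(entries[0].text, "hello");
    assert_eq!(entries[1].role, "assistant");
}
```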
@@ -721,13 +736,14 @@ fn normalize_realtime_path(url: &mut Url) {
#[cfg(test)]
mod tests {
use super::*;
-use crate::endpoint::realtime_websocket::protocol::RealtimeTranscriptDelta;
use crate::endpoint::realtime_websocket::protocol::RealtimeTranscriptEntry;
use codex_protocol::protocol::RealtimeHandoffRequested;
use codex_protocol::protocol::RealtimeInputAudioSpeechStarted;
use codex_protocol::protocol::RealtimeResponseCancelled;
use codex_protocol::protocol::RealtimeResponseCreated;
use codex_protocol::protocol::RealtimeResponseDone;
+use codex_protocol::protocol::RealtimeTranscriptDelta;
+use codex_protocol::protocol::RealtimeTranscriptDone;
use codex_protocol::protocol::RealtimeVoice;
use http::HeaderValue;
use pretty_assertions::assert_eq;
@@ -894,6 +910,8 @@ mod tests {
fn parse_realtime_v2_input_audio_transcription_delta_event() {
let payload = json!({
"type": "conversation.item.input_audio_transcription.delta",
"item_id": "item_input_1",
"content_index": 0,
"delta": "hello"
})
.to_string();
@@ -908,6 +926,32 @@ mod tests {
);
}
#[test]
fn parse_realtime_v2_item_done_output_text_event() {
let payload = json!({
"type": "conversation.item.done",
"item": {
"id": "item_output_1",
"type": "message",
"role": "assistant",
"content": [
{"type": "output_text", "text": "hello"},
{"type": "output_text", "text": " world"}
]
}
})
.to_string();
assert_eq!(
parse_realtime_event(payload.as_str(), RealtimeEventParser::RealtimeV2),
Some(RealtimeEvent::OutputTranscriptDone(
RealtimeTranscriptDone {
text: "hello world".to_string(),
}
))
);
}
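The expectation in this test, that `"hello"` and `" world"` collapse into one `OutputTranscriptDone` text, amounts to concatenating the `output_text` parts of the assistant message in order. A standalone sketch of that reduction (the `ContentItem` type here is an illustrative stand-in, not the real protocol struct):

```rust
// Illustrative stand-in for a message content part.
struct ContentItem {
    kind: &'static str,
    text: &'static str,
}

// Join the text of all "output_text" parts, in order, skipping other kinds.
fn final_transcript(items: &[ContentItem]) -> String {
    items
        .iter()
        .filter(|item| item.kind == "output_text")
        .map(|item| item.text)
        .collect()
}

fn main() {
    let items = [
        ContentItem { kind: "output_text", text: "hello" },
        ContentItem { kind: "output_text", text: " world" },
        ContentItem { kind: "other", text: "ignored" },
    ];
    assert_eq!(final_transcript(&items), "hello world");
}
```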
#[test]
fn parse_realtime_v2_output_audio_delta_defaults_audio_shape() {
let payload = json!({
@@ -1374,6 +1418,7 @@ mod tests {
session_id: Some("conv_1".to_string()),
event_parser: RealtimeEventParser::V1,
session_mode: RealtimeSessionMode::Conversational,
output_modality: RealtimeOutputModality::Audio,
voice: RealtimeVoice::Breeze,
},
HeaderMap::new(),
@@ -1648,6 +1693,7 @@ mod tests {
session_id: Some("conv_1".to_string()),
event_parser: RealtimeEventParser::RealtimeV2,
session_mode: RealtimeSessionMode::Conversational,
output_modality: RealtimeOutputModality::Audio,
voice: RealtimeVoice::Cedar,
},
HeaderMap::new(),
@@ -1753,6 +1799,7 @@ mod tests {
session_id: Some("conv_1".to_string()),
event_parser: RealtimeEventParser::RealtimeV2,
session_mode: RealtimeSessionMode::Transcription,
output_modality: RealtimeOutputModality::Audio,
voice: RealtimeVoice::Marin,
},
HeaderMap::new(),
@@ -1856,6 +1903,7 @@ mod tests {
session_id: Some("conv_1".to_string()),
event_parser: RealtimeEventParser::V1,
session_mode: RealtimeSessionMode::Transcription,
output_modality: RealtimeOutputModality::Audio,
voice: RealtimeVoice::Cove,
},
HeaderMap::new(),
@@ -1945,6 +1993,7 @@ mod tests {
session_id: Some("conv_1".to_string()),
event_parser: RealtimeEventParser::V1,
session_mode: RealtimeSessionMode::Conversational,
output_modality: RealtimeOutputModality::Audio,
voice: RealtimeVoice::Cove,
},
HeaderMap::new(),


@@ -8,6 +8,7 @@ use crate::endpoint::realtime_websocket::methods_v2::session_update_session as v
use crate::endpoint::realtime_websocket::methods_v2::websocket_intent as v2_websocket_intent;
use crate::endpoint::realtime_websocket::protocol::RealtimeEventParser;
use crate::endpoint::realtime_websocket::protocol::RealtimeOutboundMessage;
use crate::endpoint::realtime_websocket::protocol::RealtimeOutputModality;
use crate::endpoint::realtime_websocket::protocol::RealtimeSessionConfig;
use crate::endpoint::realtime_websocket::protocol::RealtimeSessionMode;
use crate::endpoint::realtime_websocket::protocol::RealtimeVoice;
@@ -57,13 +58,14 @@ pub(super) fn session_update_session(
event_parser: RealtimeEventParser,
instructions: String,
session_mode: RealtimeSessionMode,
output_modality: RealtimeOutputModality,
voice: RealtimeVoice,
) -> SessionUpdateSession {
let session_mode = normalized_session_mode(event_parser, session_mode);
match event_parser {
RealtimeEventParser::V1 => v1_session_update_session(instructions, voice),
RealtimeEventParser::RealtimeV2 => {
-v2_session_update_session(instructions, session_mode, voice)
+v2_session_update_session(instructions, session_mode, output_modality, voice)
}
}
}
@@ -73,6 +75,7 @@ pub fn session_update_session_json(config: RealtimeSessionConfig) -> JsonResult<
config.event_parser,
config.instructions,
config.session_mode,
config.output_modality,
config.voice,
);
session.id = config.session_id;


@@ -9,6 +9,7 @@ use crate::endpoint::realtime_websocket::protocol::ConversationMessageItem;
use crate::endpoint::realtime_websocket::protocol::ConversationRole;
use crate::endpoint::realtime_websocket::protocol::NoiseReductionType;
use crate::endpoint::realtime_websocket::protocol::RealtimeOutboundMessage;
use crate::endpoint::realtime_websocket::protocol::RealtimeOutputModality;
use crate::endpoint::realtime_websocket::protocol::RealtimeSessionMode;
use crate::endpoint::realtime_websocket::protocol::RealtimeVoice;
use crate::endpoint::realtime_websocket::protocol::SessionAudio;
@@ -26,9 +27,10 @@ use crate::endpoint::realtime_websocket::protocol::TurnDetectionType;
use serde_json::json;
const REALTIME_V2_OUTPUT_MODALITY_AUDIO: &str = "audio";
const REALTIME_V2_OUTPUT_MODALITY_TEXT: &str = "text";
const REALTIME_V2_TOOL_CHOICE: &str = "auto";
const REALTIME_V2_BACKGROUND_AGENT_TOOL_NAME: &str = "background_agent";
-const REALTIME_V2_BACKGROUND_AGENT_TOOL_DESCRIPTION: &str = "Send a user request to the background agent. Use this as the default action. If the background agent is idle, this starts a new task and returns the final result to the user. If the background agent is already working on a task, this sends the request as guidance to steer that previous task. If the user asks to do something next, later, after this, or once current work finishes, call this tool so the work is actually queued instead of merely promising to do it later.";
+const REALTIME_V2_BACKGROUND_AGENT_TOOL_DESCRIPTION: &str = "Send a user request to the background agent. Use this as the default action. Do not rephrase the user's ask or rewrite it in your own words; pass along the user's own words. If the background agent is idle, this starts a new task and returns the final result to the user. If the background agent is already working on a task, this sends the request as guidance to steer that previous task. If the user asks to do something next, later, after this, or once current work finishes, call this tool so the work is actually queued instead of merely promising to do it later.";
pub(super) fn conversation_item_create_message(text: String) -> RealtimeOutboundMessage {
RealtimeOutboundMessage::ConversationItemCreate {
@@ -59,6 +61,7 @@ pub(super) fn conversation_handoff_append_message(
pub(super) fn session_update_session(
instructions: String,
session_mode: RealtimeSessionMode,
output_modality: RealtimeOutputModality,
voice: RealtimeVoice,
) -> SessionUpdateSession {
match session_mode {
@@ -67,7 +70,7 @@ pub(super) fn session_update_session(
r#type: SessionType::Realtime,
model: None,
instructions: Some(instructions),
-output_modalities: Some(vec![REALTIME_V2_OUTPUT_MODALITY_AUDIO.to_string()]),
+output_modalities: Some(vec![output_modality_value(output_modality).to_string()]),
audio: SessionAudio {
input: SessionAudioInput {
format: SessionAudioFormat {
@@ -132,6 +135,13 @@ pub(super) fn session_update_session(
}
}
fn output_modality_value(output_modality: RealtimeOutputModality) -> &'static str {
match output_modality {
RealtimeOutputModality::Text => REALTIME_V2_OUTPUT_MODALITY_TEXT,
RealtimeOutputModality::Audio => REALTIME_V2_OUTPUT_MODALITY_AUDIO,
}
}
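`output_modality_value` above picks the wire string, and the session payload then wraps it as a one-element `output_modalities` list. A hypothetical standalone sketch of that selection (local `OutputModality` enum and `output_modalities` helper are illustrative, not the crate's types):

```rust
// Illustrative stand-in for the protocol's output-modality enum.
#[derive(Clone, Copy)]
enum OutputModality {
    Text,
    Audio,
}

// Map the enum to its wire string and wrap it as the payload's
// single-element output_modalities list.
fn output_modalities(modality: OutputModality) -> Vec<String> {
    let value = match modality {
        OutputModality::Text => "text",
        OutputModality::Audio => "audio",
    };
    vec![value.to_string()]
}

fn main() {
    assert_eq!(output_modalities(OutputModality::Text), vec!["text".to_string()]);
    assert_eq!(output_modalities(OutputModality::Audio), vec!["audio".to_string()]);
}
```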
pub(super) fn websocket_intent() -> Option<&'static str> {
None
}

Some files were not shown because too many files have changed in this diff.