Compare commits

..

7 Commits

Author SHA1 Message Date
aibrahim-oai
21593bd36c Update models.json 2026-05-11 12:57:08 +00:00
jif-oai
7e15e6db9e [codex] default unknown contributed tools to mutating (#22143)
## Summary
- make the shared `ToolExecutor::is_mutating` default conservative by
returning `true`
- update the trait docs to say read-only tools should opt out explicitly
- add a regression test covering the default behavior

## Why
Hosts use this signal for serialization and approval policy. Treating
unknown contributed tools as read-only lets a write-capable tool
accidentally bypass mutating-tool safeguards if it forgets to override
the hook.

## Validation
- not run, per request
2026-05-11 14:39:21 +02:00
jif-oai
ebd3d53451 feat: drop CodexExtension (#22140)
Drop `CodexExtension`, as it is not needed for now.
2026-05-11 14:19:51 +02:00
jif-oai
95bfea847d refactor: extract executable tool contracts into codex-tool-api (#22138)
## Why
The tool-extraction work needs one shared executable-tool seam that
hosts and tool owners can depend on without reaching into `codex-core`.
Landing that seam first makes the later tool-family ports incremental
and keeps the reusable contract separate from any one migration.

## What changed
- add a new `codex-tool-api` crate and workspace wiring
- move the common executable-tool contracts into that crate:
`ToolBundle`, `ToolDefinition`, `ToolExecutor`, `ToolCall`, `ToolInput`,
`ToolOutput`, `JsonToolOutput`, and `ToolError`
- keep host state generic through `ToolBundle<C>` / `ToolCall<C>` so
later integrations can provide their own runtime context without baking
core types into the API
- carry the host signals the runtime will need later, including
parallel-call support and mutability probing
- leave existing tool families in place for now; this PR only
establishes the reusable API surface
- add the Bazel target and lockfile updates for the new crate

## Testing
- `cargo test -p codex-tool-api`
2026-05-11 13:56:59 +02:00
jif-oai
569ff6a1c4 extension: move git attribution into an extension (#21738)
## Why

Git commit attribution is prompt policy, not session orchestration.
After #21737 adds the extension-registry seam, this moves that
prompt-only behavior out of `codex-core` so `Session` can consume
extension-contributed prompt fragments instead of owning a one-off
policy path itself.

Before this PR, `Session` injected the trailer instruction directly from
`codex-core` ([session
assembly](a57a747eb6/codex-rs/core/src/session/mod.rs (L2733-L2739)),
[helper
module](a57a747eb6/codex-rs/core/src/commit_attribution.rs (L1-L33))).
This branch moves that same responsibility into
[`codex-git-attribution`](b5029a6736/codex-rs/ext/git-attribution/src/lib.rs (L14-L100)).

## What changed

- Added the `codex-git-attribution` extension crate.
- Snapshot `CodexGitCommit` plus `commit_attribution` at thread start,
then contribute the developer-policy fragment through the extension
registry.
- Register the extension in app-server thread extensions.
- Remove the old `codex-core` helper module and direct `Session`
injection path.

This keeps the existing behavior intact: the prompt is only contributed
when `CodexGitCommit` is enabled, blank attribution still disables the
trailer, and the default remains `Codex <noreply@openai.com>`.

## Stack

- Stacked on #21737.
2026-05-11 12:53:15 +02:00
jif-oai
436c0df658 extension: wire extension registries into sessions (#21737)
## Why

[#21736](https://github.com/openai/codex/pull/21736) introduces the
typed extension API, but the runtime does not yet carry a registry
through thread/session startup or give contributors host-owned stores to
read from. This PR wires that host-side path so later feature migrations
can move product-specific behavior behind typed contributions without
adding another bespoke seam directly to `codex-core`.

## What changed

- Thread `ExtensionRegistry<Config>` through `ThreadManager`,
`CodexSpawnArgs`, `Session`, and sub-agent spawn paths.
- Wire `ThreadStartContributor` and `ContextContributor`.
- Expose the small supporting surface needed by non-core callers that
construct threads directly, including `empty_extension_registry()`
through `codex-core-api`.

This PR lands the host plumbing only: the app-server registry is still
empty, and concrete feature migrations are intended to follow
separately.
2026-05-11 11:38:18 +02:00
jif-oai
d2c3ebac1f extension: add initial typed extension API (#21736)
## Why

`codex-core` still owns a growing amount of product-specific behavior.
This PR starts the extraction path by introducing a small, typed
first-party extension seam: features can install the contribution
families they actually own, while the host keeps lifecycle and state
ownership instead of pushing a broad service locator into the API.

See the `examples/` directory for illustration.

## Known limitations
* Tool contract definition will be shared with core
* Fragments must be extracted
* Missing some contributors
2026-05-11 11:06:24 +02:00
70 changed files with 1332 additions and 462 deletions

codex-rs/Cargo.lock generated
View File

@@ -1895,12 +1895,14 @@ dependencies = [
"codex-core",
"codex-core-plugins",
"codex-exec-server",
"codex-extension-api",
"codex-external-agent-migration",
"codex-external-agent-sessions",
"codex-features",
"codex-feedback",
"codex-file-search",
"codex-file-watcher",
"codex-git-attribution",
"codex-git-utils",
"codex-hooks",
"codex-login",
@@ -2497,6 +2499,7 @@ dependencies = [
"codex-core-skills",
"codex-exec-server",
"codex-execpolicy",
"codex-extension-api",
"codex-features",
"codex-feedback",
"codex-git-utils",
@@ -2600,6 +2603,7 @@ dependencies = [
"codex-config",
"codex-core",
"codex-exec-server",
"codex-extension-api",
"codex-features",
"codex-login",
"codex-model-provider-info",
@@ -2816,6 +2820,16 @@ dependencies = [
"syn 2.0.114",
]
[[package]]
name = "codex-extension-api"
version = "0.0.0"
dependencies = [
"codex-protocol",
"codex-tools",
"serde_json",
"thiserror 2.0.18",
]
[[package]]
name = "codex-external-agent-migration"
version = "0.0.0"
@@ -2905,6 +2919,16 @@ dependencies = [
"tracing",
]
[[package]]
name = "codex-git-attribution"
version = "0.0.0"
dependencies = [
"codex-core",
"codex-extension-api",
"codex-features",
"pretty_assertions",
]
[[package]]
name = "codex-git-utils"
version = "0.0.0"
@@ -3090,6 +3114,7 @@ dependencies = [
"codex-config",
"codex-core",
"codex-exec-server",
"codex-extension-api",
"codex-login",
"codex-protocol",
"codex-shell-command",
@@ -3681,6 +3706,18 @@ dependencies = [
"uuid",
]
[[package]]
name = "codex-tool-api"
version = "0.0.0"
dependencies = [
"codex-protocol",
"codex-tools",
"pretty_assertions",
"serde",
"serde_json",
"thiserror 2.0.18",
]
[[package]]
name = "codex-tools"
version = "0.0.0"
@@ -4297,6 +4334,7 @@ dependencies = [
"codex-config",
"codex-core",
"codex-exec-server",
"codex-extension-api",
"codex-features",
"codex-hooks",
"codex-login",

View File

@@ -45,6 +45,8 @@ members = [
"exec-server",
"execpolicy",
"execpolicy-legacy",
"ext/extension-api",
"ext/git-attribution",
"external-agent-migration",
"external-agent-sessions",
"keyring-store",
@@ -106,6 +108,7 @@ members = [
"test-binary-support",
"thread-manager-sample",
"thread-store",
"tool-api",
"uds",
"codex-experimental-api-macros",
"plugin",
@@ -160,6 +163,8 @@ codex-exec = { path = "exec" }
codex-file-system = { path = "file-system" }
codex-exec-server = { path = "exec-server" }
codex-execpolicy = { path = "execpolicy" }
codex-extension-api = { path = "ext/extension-api" }
codex-git-attribution = { path = "ext/git-attribution" }
codex-external-agent-migration = { path = "external-agent-migration" }
codex-external-agent-sessions = { path = "external-agent-sessions" }
codex-experimental-api-macros = { path = "codex-experimental-api-macros" }
@@ -205,6 +210,7 @@ codex-stdio-to-uds = { path = "stdio-to-uds" }
codex-terminal-detection = { path = "terminal-detection" }
codex-test-binary-support = { path = "test-binary-support" }
codex-thread-store = { path = "thread-store" }
codex-tool-api = { path = "tool-api" }
codex-tools = { path = "tools" }
codex-tui = { path = "tui" }
codex-uds = { path = "uds" }
@@ -468,6 +474,7 @@ unwrap_used = "deny"
[workspace.metadata.cargo-shear]
ignored = [
"codex-agent-graph-store",
"codex-tool-api",
"icu_provider",
"openssl-sys",
"codex-v8-poc",

View File

@@ -7714,31 +7714,6 @@
"title": "ConfigWriteResponse",
"type": "object"
},
"ConfiguredHookCommand": {
"anyOf": [
{
"type": "string"
},
{
"$ref": "#/definitions/v2/ConfiguredHookCommandByPlatform"
}
]
},
"ConfiguredHookCommandByPlatform": {
"properties": {
"unix": {
"type": "string"
},
"windows": {
"type": "string"
}
},
"required": [
"unix",
"windows"
],
"type": "object"
},
"ConfiguredHookHandler": {
"oneOf": [
{
@@ -7747,7 +7722,7 @@
"type": "boolean"
},
"command": {
"$ref": "#/definitions/v2/ConfiguredHookCommand"
"type": "string"
},
"statusMessage": {
"type": [

View File

@@ -4103,31 +4103,6 @@
"title": "ConfigWriteResponse",
"type": "object"
},
"ConfiguredHookCommand": {
"anyOf": [
{
"type": "string"
},
{
"$ref": "#/definitions/ConfiguredHookCommandByPlatform"
}
]
},
"ConfiguredHookCommandByPlatform": {
"properties": {
"unix": {
"type": "string"
},
"windows": {
"type": "string"
}
},
"required": [
"unix",
"windows"
],
"type": "object"
},
"ConfiguredHookHandler": {
"oneOf": [
{
@@ -4136,7 +4111,7 @@
"type": "boolean"
},
"command": {
"$ref": "#/definitions/ConfiguredHookCommand"
"type": "string"
},
"statusMessage": {
"type": [

View File

@@ -111,31 +111,6 @@
},
"type": "object"
},
"ConfiguredHookCommand": {
"anyOf": [
{
"type": "string"
},
{
"$ref": "#/definitions/ConfiguredHookCommandByPlatform"
}
]
},
"ConfiguredHookCommandByPlatform": {
"properties": {
"unix": {
"type": "string"
},
"windows": {
"type": "string"
}
},
"required": [
"unix",
"windows"
],
"type": "object"
},
"ConfiguredHookHandler": {
"oneOf": [
{
@@ -144,7 +119,7 @@
"type": "boolean"
},
"command": {
"$ref": "#/definitions/ConfiguredHookCommand"
"type": "string"
},
"statusMessage": {
"type": [

View File

@@ -1,6 +0,0 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { ConfiguredHookCommandByPlatform } from "./ConfiguredHookCommandByPlatform";
export type ConfiguredHookCommand = string | ConfiguredHookCommandByPlatform;

View File

@@ -1,5 +0,0 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
export type ConfiguredHookCommandByPlatform = { unix: string, windows: string, };

View File

@@ -1,6 +1,5 @@
// GENERATED CODE! DO NOT MODIFY BY HAND!
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { ConfiguredHookCommand } from "./ConfiguredHookCommand";
export type ConfiguredHookHandler = { "type": "command", command: ConfiguredHookCommand, timeoutSec: bigint | null, async: boolean, statusMessage: string | null, } | { "type": "prompt", } | { "type": "agent", };
export type ConfiguredHookHandler = { "type": "command", command: string, timeoutSec: bigint | null, async: boolean, statusMessage: string | null, } | { "type": "prompt", } | { "type": "agent", };

View File

@@ -76,8 +76,6 @@ export type { ConfigRequirementsReadResponse } from "./ConfigRequirementsReadRes
export type { ConfigValueWriteParams } from "./ConfigValueWriteParams";
export type { ConfigWarningNotification } from "./ConfigWarningNotification";
export type { ConfigWriteResponse } from "./ConfigWriteResponse";
export type { ConfiguredHookCommand } from "./ConfiguredHookCommand";
export type { ConfiguredHookCommandByPlatform } from "./ConfiguredHookCommandByPlatform";
export type { ConfiguredHookHandler } from "./ConfiguredHookHandler";
export type { ConfiguredHookMatcherGroup } from "./ConfiguredHookMatcherGroup";
export type { ContextCompactedNotification } from "./ContextCompactedNotification";

View File

@@ -412,7 +412,7 @@ pub enum ConfiguredHookHandler {
#[serde(rename = "command")]
#[ts(rename = "command")]
Command {
command: ConfiguredHookCommand,
command: String,
#[serde(rename = "timeoutSec")]
#[ts(rename = "timeoutSec")]
timeout_sec: Option<u64>,
@@ -429,22 +429,6 @@ pub enum ConfiguredHookHandler {
Agent {},
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, JsonSchema, TS)]
#[serde(untagged)]
#[ts(export_to = "v2/")]
pub enum ConfiguredHookCommand {
Single(String),
ByPlatform(ConfiguredHookCommandByPlatform),
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(rename_all = "camelCase", export_to = "v2/")]
pub struct ConfiguredHookCommandByPlatform {
pub unix: String,
pub windows: String,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export_to = "v2/")]

View File

@@ -37,9 +37,11 @@ codex-config = { workspace = true }
codex-core = { workspace = true }
codex-core-plugins = { workspace = true }
codex-exec-server = { workspace = true }
codex-extension-api = { workspace = true }
codex-external-agent-migration = { workspace = true }
codex-external-agent-sessions = { workspace = true }
codex-features = { workspace = true }
codex-git-attribution = { workspace = true }
codex-git-utils = { workspace = true }
codex-file-watcher = { workspace = true }
codex-hooks = { workspace = true }

View File

@@ -0,0 +1,11 @@
use std::sync::Arc;
use codex_core::config::Config;
use codex_extension_api::ExtensionRegistry;
use codex_extension_api::ExtensionRegistryBuilder;
pub(crate) fn thread_extensions() -> Arc<ExtensionRegistry<Config>> {
let mut builder = ExtensionRegistryBuilder::<Config>::new();
codex_git_attribution::install(&mut builder);
Arc::new(builder.build())
}

View File

@@ -83,6 +83,7 @@ mod config_manager_service;
mod connection_rpc_gate;
mod dynamic_tools;
mod error_code;
mod extensions;
mod filters;
mod fs_watch;
mod fuzzy_file_search;

View File

@@ -99,6 +99,7 @@ async fn queue_refresh(
#[cfg(test)]
mod tests {
use super::*;
use crate::extensions::thread_extensions;
use async_trait::async_trait;
use codex_arg0::Arg0DispatchPaths;
use codex_config::CloudRequirementsLoader;
@@ -183,6 +184,7 @@ mod tests {
auth_manager,
SessionSource::Exec,
Arc::new(EnvironmentManager::default_for_tests()),
thread_extensions(),
/*analytics_events_client*/ None,
thread_store,
Some(state_db.clone()),

View File

@@ -8,6 +8,7 @@ use crate::attestation::app_server_attestation_provider;
use crate::config_manager::ConfigManager;
use crate::connection_rpc_gate::ConnectionRpcGate;
use crate::error_code::invalid_request;
use crate::extensions::thread_extensions;
use crate::fs_watch::FsWatchManager;
use crate::outgoing_message::ConnectionId;
use crate::outgoing_message::ConnectionRequestId;
@@ -303,6 +304,7 @@ impl MessageProcessor {
auth_manager.clone(),
session_source,
environment_manager,
thread_extensions(),
Some(analytics_events_client.clone()),
Arc::clone(&thread_store),
state_db.clone(),

View File

@@ -18,8 +18,6 @@ use codex_app_server_protocol::ConfigRequirementsReadResponse;
use codex_app_server_protocol::ConfigValueWriteParams;
use codex_app_server_protocol::ConfigWriteErrorCode;
use codex_app_server_protocol::ConfigWriteResponse;
use codex_app_server_protocol::ConfiguredHookCommand;
use codex_app_server_protocol::ConfiguredHookCommandByPlatform;
use codex_app_server_protocol::ConfiguredHookHandler;
use codex_app_server_protocol::ConfiguredHookMatcherGroup;
use codex_app_server_protocol::ExperimentalFeatureEnablementSetParams;
@@ -34,8 +32,6 @@ use codex_app_server_protocol::SandboxMode;
use codex_app_server_protocol::ServerNotification;
use codex_chatgpt::connectors;
use codex_config::ConfigRequirementsToml;
use codex_config::HookCommandByPlatformConfig;
use codex_config::HookCommandConfig;
use codex_config::HookEventsToml;
use codex_config::HookHandlerConfig as CoreHookHandlerConfig;
use codex_config::ManagedHooksRequirementsToml;
@@ -522,7 +518,7 @@ fn map_hook_handler_to_api(handler: CoreHookHandlerConfig) -> ConfiguredHookHand
r#async,
status_message,
} => ConfiguredHookHandler::Command {
command: map_hook_command_to_api(command),
command,
timeout_sec,
r#async,
status_message,
@@ -532,15 +528,6 @@ fn map_hook_handler_to_api(handler: CoreHookHandlerConfig) -> ConfiguredHookHand
}
}
fn map_hook_command_to_api(command: HookCommandConfig) -> ConfiguredHookCommand {
match command {
HookCommandConfig::Single(command) => ConfiguredHookCommand::Single(command),
HookCommandConfig::ByPlatform(HookCommandByPlatformConfig { unix, windows }) => {
ConfiguredHookCommand::ByPlatform(ConfiguredHookCommandByPlatform { unix, windows })
}
}
}
fn map_sandbox_mode_requirement_to_api(mode: CoreSandboxModeRequirement) -> Option<SandboxMode> {
match mode {
CoreSandboxModeRequirement::ReadOnly => Some(SandboxMode::ReadOnly),

View File

@@ -52,7 +52,7 @@ fn command_hook_hash(
group: codex_config::MatcherGroup {
matcher: matcher.map(ToOwned::to_owned),
hooks: vec![codex_config::HookHandlerConfig::Command {
command: codex_config::HookCommandConfig::Single(command.to_string()),
command: command.to_string(),
timeout_sec: Some(timeout_sec),
r#async: false,
status_message: status_message.map(ToOwned::to_owned),

View File

@@ -125,7 +125,7 @@ pub struct MatcherGroup {
pub enum HookHandlerConfig {
#[serde(rename = "command")]
Command {
command: HookCommandConfig,
command: String,
#[serde(default, rename = "timeout")]
timeout_sec: Option<u64>,
#[serde(default)]
@@ -139,42 +139,6 @@ pub enum HookHandlerConfig {
Agent {},
}
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize, JsonSchema)]
#[serde(untagged)]
pub enum HookCommandConfig {
Single(String),
ByPlatform(HookCommandByPlatformConfig),
}
impl HookCommandConfig {
pub fn for_current_platform(&self) -> &str {
match self {
Self::Single(command) => command,
Self::ByPlatform(command) => command.for_current_platform(),
}
}
}
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize, JsonSchema)]
pub struct HookCommandByPlatformConfig {
pub unix: String,
pub windows: String,
}
impl HookCommandByPlatformConfig {
fn for_current_platform(&self) -> &str {
#[cfg(windows)]
{
&self.windows
}
#[cfg(not(windows))]
{
&self.unix
}
}
}
#[derive(Debug, Default, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct ManagedHooksRequirementsToml {
pub managed_dir: Option<PathBuf>,

View File

@@ -2,8 +2,6 @@ use pretty_assertions::assert_eq;
use std::collections::BTreeMap;
use super::HookCommandByPlatformConfig;
use super::HookCommandConfig;
use super::HookEventsToml;
use super::HookHandlerConfig;
use super::HooksFile;
@@ -41,7 +39,7 @@ fn hooks_file_deserializes_existing_json_shape() {
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single("python3 /tmp/pre.py".to_string()),
command: "python3 /tmp/pre.py".to_string(),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),
@@ -75,7 +73,7 @@ statusMessage = "checking"
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single("python3 /tmp/pre.py".to_string()),
command: "python3 /tmp/pre.py".to_string(),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),
@@ -111,7 +109,7 @@ command = "python3 /tmp/pre.py"
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single("python3 /tmp/pre.py".to_string()),
command: "python3 /tmp/pre.py".to_string(),
timeout_sec: None,
r#async: false,
status_message: None,
@@ -155,9 +153,7 @@ command = "python3 /enterprise/place/pre.py"
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single(
"python3 /enterprise/place/pre.py".to_string(),
),
command: "python3 /enterprise/place/pre.py".to_string(),
timeout_sec: None,
r#async: false,
status_message: None,
@@ -168,37 +164,3 @@ command = "python3 /enterprise/place/pre.py"
}
);
}
#[test]
fn hook_events_deserialize_platform_command_from_toml() {
let parsed: HookEventsToml = toml::from_str(
r#"
[[PreToolUse]]
matcher = "^Bash$"
[[PreToolUse.hooks]]
type = "command"
command = { unix = "bash /enterprise/hooks/pre.sh", windows = "powershell -File C:\\enterprise\\hooks\\pre.ps1" }
"#,
)
.expect("platform hook command TOML should deserialize");
assert_eq!(
parsed,
HookEventsToml {
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::ByPlatform(HookCommandByPlatformConfig {
unix: "bash /enterprise/hooks/pre.sh".to_string(),
windows: r"powershell -File C:\enterprise\hooks\pre.ps1".to_string(),
}),
timeout_sec: None,
r#async: false,
status_message: None,
}],
}],
..Default::default()
}
);
}

View File

@@ -72,8 +72,6 @@ pub use diagnostics::format_config_error;
pub use diagnostics::format_config_error_with_source;
pub use diagnostics::io_error_from_config_error;
pub use fingerprint::version_for_toml;
pub use hook_config::HookCommandByPlatformConfig;
pub use hook_config::HookCommandConfig;
pub use hook_config::HookEventsToml;
pub use hook_config::HookHandlerConfig;
pub use hook_config::HookStateToml;

View File

@@ -19,6 +19,7 @@ codex-arg0 = { workspace = true }
codex-analytics = { workspace = true }
codex-config = { workspace = true }
codex-core = { workspace = true }
codex-extension-api = { workspace = true }
codex-exec-server = { workspace = true }
codex-features = { workspace = true }
codex-login = { workspace = true }

View File

@@ -45,6 +45,7 @@ pub use codex_core::skills::SkillsManager;
pub use codex_core::thread_store_from_config;
pub use codex_exec_server::EnvironmentManager;
pub use codex_exec_server::ExecServerRuntimePaths;
pub use codex_extension_api::empty_extension_registry;
pub use codex_features::Feature;
pub use codex_features::Features;
pub use codex_login::AuthManager;

View File

@@ -35,6 +35,7 @@ codex-config = { workspace = true }
codex-core-plugins = { workspace = true }
codex-core-skills = { workspace = true }
codex-exec-server = { workspace = true }
codex-extension-api = { workspace = true }
codex-features = { workspace = true }
codex-feedback = { workspace = true }
codex-login = { workspace = true }

View File

@@ -83,6 +83,7 @@ pub(crate) async fn run_codex_thread_interactive(
skills_manager: Arc::clone(&parent_session.services.skills_manager),
plugins_manager: Arc::clone(&parent_session.services.plugins_manager),
mcp_manager: Arc::clone(&parent_session.services.mcp_manager),
extensions: Arc::clone(&parent_session.services.extensions),
conversation_history: initial_history.unwrap_or(InitialHistory::New),
session_source: SessionSource::SubAgent(subagent_source.clone()),
thread_source: Some(ThreadSource::Subagent),

View File

@@ -1,33 +0,0 @@
const DEFAULT_ATTRIBUTION_VALUE: &str = "Codex <noreply@openai.com>";
fn build_commit_message_trailer(config_attribution: Option<&str>) -> Option<String> {
let value = resolve_attribution_value(config_attribution)?;
Some(format!("Co-authored-by: {value}"))
}
pub(crate) fn commit_message_trailer_instruction(
config_attribution: Option<&str>,
) -> Option<String> {
let trailer = build_commit_message_trailer(config_attribution)?;
Some(format!(
"When you write or edit a git commit message, ensure the message ends with this trailer exactly once:\n{trailer}\n\nRules:\n- Keep existing trailers and append this trailer at the end if missing.\n- Do not duplicate this trailer if it already exists.\n- Keep one blank line between the commit body and trailer block."
))
}
fn resolve_attribution_value(config_attribution: Option<&str>) -> Option<String> {
match config_attribution {
Some(value) => {
let trimmed = value.trim();
if trimmed.is_empty() {
None
} else {
Some(trimmed.to_string())
}
}
None => Some(DEFAULT_ATTRIBUTION_VALUE.to_string()),
}
}
#[cfg(test)]
#[path = "commit_attribution_tests.rs"]
mod tests;

View File

@@ -1,43 +0,0 @@
use super::build_commit_message_trailer;
use super::commit_message_trailer_instruction;
use super::resolve_attribution_value;
#[test]
fn blank_attribution_disables_trailer_prompt() {
assert_eq!(build_commit_message_trailer(Some("")), None);
assert_eq!(commit_message_trailer_instruction(Some(" ")), None);
}
#[test]
fn default_attribution_uses_codex_trailer() {
assert_eq!(
build_commit_message_trailer(/*config_attribution*/ None).as_deref(),
Some("Co-authored-by: Codex <noreply@openai.com>")
);
}
#[test]
fn resolve_value_handles_default_custom_and_blank() {
assert_eq!(
resolve_attribution_value(/*config_attribution*/ None),
Some("Codex <noreply@openai.com>".to_string())
);
assert_eq!(
resolve_attribution_value(Some("MyAgent <me@example.com>")),
Some("MyAgent <me@example.com>".to_string())
);
assert_eq!(
resolve_attribution_value(Some("MyAgent")),
Some("MyAgent".to_string())
);
assert_eq!(resolve_attribution_value(Some(" ")), None);
}
#[test]
fn instruction_mentions_trailer_and_omits_generated_with() {
let instruction = commit_message_trailer_instruction(Some("AgentX <agent@example.com>"))
.expect("instruction expected");
assert!(instruction.contains("Co-authored-by: AgentX <agent@example.com>"));
assert!(instruction.contains("exactly once"));
assert!(!instruction.contains("Generated-with"));
}

View File

@@ -1156,10 +1156,7 @@ async fn load_config_layers_includes_cloud_hook_requirements() -> anyhow::Result
pre_tool_use: vec![codex_config::MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![codex_config::HookHandlerConfig::Command {
command: codex_config::HookCommandConfig::Single(format!(
"python3 {}/pre.py",
managed_dir.display()
)),
command: format!("python3 {}/pre.py", managed_dir.display()),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),

View File

@@ -22,11 +22,11 @@ mod config_lock;
pub use codex_thread::CodexThread;
pub use codex_thread::CodexThreadTurnContextOverrides;
pub use codex_thread::ThreadConfigSnapshot;
pub use session::turn_context::TurnContext;
mod agent;
mod attestation;
mod codex_delegate;
mod command_canonicalization;
mod commit_attribution;
pub mod config;
pub mod connectors;
pub mod context;

View File

@@ -20,6 +20,7 @@ use crate::session::turn::built_tools;
use crate::state_db_bridge::StateDbHandle;
use crate::thread_manager::ThreadManager;
use crate::thread_manager::thread_store_from_config;
use codex_extension_api::empty_extension_registry;
/// Build the model-visible `input` list for a single debug turn.
#[doc(hidden)]
@@ -49,6 +50,7 @@ pub async fn build_prompt_input(
.await
.map_err(|err| CodexErr::Fatal(err.to_string()))?,
),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store,
state_db.clone(),

View File

@@ -16,7 +16,6 @@ use crate::agent::agent_status_from_event;
use crate::agent::status::is_final;
use crate::attestation::AttestationProvider;
use crate::build_available_skills;
use crate::commit_attribution::commit_message_trailer_instruction;
use crate::compact;
use crate::config::ManagedFeatures;
use crate::config::resolve_tool_suggest_config_from_layer_stack;
@@ -53,6 +52,7 @@ use codex_config::types::OAuthCredentialsStoreMode;
use codex_exec_server::Environment;
use codex_exec_server::EnvironmentManager;
use codex_exec_server::FileSystemSandboxContext;
use codex_extension_api::PromptSlot;
use codex_features::FEATURES;
use codex_features::Feature;
use codex_features::unstable_features_warning_event;
@@ -392,6 +392,7 @@ pub(crate) struct CodexSpawnArgs {
pub(crate) skills_manager: Arc<SkillsManager>,
pub(crate) plugins_manager: Arc<PluginsManager>,
pub(crate) mcp_manager: Arc<McpManager>,
pub(crate) extensions: Arc<codex_extension_api::ExtensionRegistry<crate::config::Config>>,
pub(crate) conversation_history: InitialHistory,
pub(crate) session_source: SessionSource,
pub(crate) thread_source: Option<ThreadSource>,
@@ -455,6 +456,7 @@ impl Codex {
skills_manager,
plugins_manager,
mcp_manager,
extensions,
conversation_history,
session_source,
thread_source,
@@ -650,6 +652,7 @@ impl Codex {
skills_manager,
plugins_manager,
mcp_manager.clone(),
extensions,
agent_control,
environment_manager,
analytics_events_client,
@@ -2570,6 +2573,7 @@ impl Session {
) -> Vec<ResponseItem> {
let mut developer_sections = Vec::<String>::with_capacity(8);
let mut contextual_user_sections = Vec::<String>::with_capacity(2);
let mut separate_developer_sections = Vec::<String>::new();
let (
reference_context_item,
previous_turn_settings,
@@ -2707,12 +2711,23 @@ impl Session {
{
developer_sections.push(plugin_instructions.render());
}
if turn_context.features.enabled(Feature::CodexGitCommit)
&& let Some(commit_message_instruction) = commit_message_trailer_instruction(
turn_context.config.commit_attribution.as_deref(),
)
{
developer_sections.push(commit_message_instruction);
for contributor in self.services.extensions.context_contributors() {
for fragment in contributor.contribute(
&self.services.session_extension_data,
&self.services.thread_extension_data,
) {
match fragment.slot() {
PromptSlot::DeveloperPolicy | PromptSlot::DeveloperCapabilities => {
developer_sections.push(fragment.text().to_string());
}
PromptSlot::ContextualUser => {
contextual_user_sections.push(fragment.text().to_string());
}
PromptSlot::SeparateDeveloper => {
separate_developer_sections.push(fragment.text().to_string());
}
}
}
}
if let Some(user_instructions) = turn_context.user_instructions.as_deref() {
contextual_user_sections.push(
@@ -2746,6 +2761,13 @@ impl Session {
{
items.push(developer_message);
}
for section in separate_developer_sections {
if let Some(developer_message) =
crate::context_manager::updates::build_developer_update_item(vec![section])
{
items.push(developer_message);
}
}
if let Some(usage_hint_text) = multi_agent_v2_usage_hint_text
&& let Some(usage_hint_message) =
crate::context_manager::updates::build_developer_update_item(vec![

View File

@@ -364,6 +364,7 @@ impl Session {
skills_manager: Arc<SkillsManager>,
plugins_manager: Arc<PluginsManager>,
mcp_manager: Arc<McpManager>,
extensions: Arc<codex_extension_api::ExtensionRegistry<crate::config::Config>>,
agent_control: AgentControl,
environment_manager: Arc<EnvironmentManager>,
analytics_events_client: Option<AnalyticsEventsClient>,
@@ -810,6 +811,16 @@ impl Session {
SessionId::from(thread_id)
};
let agent_control = agent_control.with_session_id(session_id);
let session_extension_data = codex_extension_api::ExtensionData::new();
let thread_extension_data = codex_extension_api::ExtensionData::new();
for contributor in extensions.thread_start_contributors() {
contributor.contribute(
config.as_ref(),
&session_extension_data,
&thread_extension_data,
);
}
let services = SessionServices {
// Initialize the MCP connection manager with an uninitialized
// instance. It will be replaced with one created via
@@ -845,6 +856,10 @@ impl Session {
skills_manager,
plugins_manager: Arc::clone(&plugins_manager),
mcp_manager: Arc::clone(&mcp_manager),
extensions,
// TODO(jif): extract session to share between sub-agents
session_extension_data,
thread_extension_data,
agent_control,
network_proxy,
network_approval: Arc::clone(&network_approval),

View File

@@ -1311,9 +1311,7 @@ async fn refresh_runtime_config_refreshes_hooks() -> anyhow::Result<()> {
group: codex_config::MatcherGroup {
matcher: None,
hooks: vec![codex_config::HookHandlerConfig::Command {
command: codex_config::HookCommandConfig::Single(
"python3 /tmp/user.py".to_string(),
),
command: "python3 /tmp/user.py".to_string(),
timeout_sec: Some(600),
r#async: false,
status_message: None,
@@ -3727,6 +3725,7 @@ async fn session_new_fails_when_zsh_fork_enabled_without_zsh_path() {
skills_manager,
plugins_manager,
mcp_manager,
Arc::new(codex_extension_api::ExtensionRegistryBuilder::new().build()),
AgentControl::default(),
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
/*analytics_events_client*/ None,
@@ -3873,6 +3872,9 @@ pub(crate) async fn make_session_and_context() -> (Session, TurnContext) {
skills_manager,
plugins_manager,
mcp_manager,
extensions: Arc::new(codex_extension_api::ExtensionRegistryBuilder::new().build()),
session_extension_data: codex_extension_api::ExtensionData::new(),
thread_extension_data: codex_extension_api::ExtensionData::new(),
agent_control,
network_proxy: None,
network_approval: Arc::clone(&network_approval),
@@ -4063,6 +4065,7 @@ async fn make_session_with_config_and_rx(
skills_manager,
plugins_manager,
mcp_manager,
Arc::new(codex_extension_api::ExtensionRegistryBuilder::new().build()),
AgentControl::default(),
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
/*analytics_events_client*/ None,
@@ -4165,6 +4168,7 @@ async fn make_session_with_history_source_and_agent_control_and_rx(
skills_manager,
plugins_manager,
mcp_manager,
Arc::new(codex_extension_api::ExtensionRegistryBuilder::new().build()),
agent_control,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
/*analytics_events_client*/ None,
@@ -5588,6 +5592,9 @@ where
skills_manager,
plugins_manager,
mcp_manager,
extensions: Arc::new(codex_extension_api::ExtensionRegistryBuilder::new().build()),
session_extension_data: codex_extension_api::ExtensionData::new(),
thread_extension_data: codex_extension_api::ExtensionData::new(),
agent_control,
network_proxy: None,
network_approval: Arc::clone(&network_approval),
@@ -6135,6 +6142,73 @@ async fn make_multi_agent_v2_usage_hint_test_session(
(session, turn_context)
}
struct GitAttributionTestContributor;
struct GitAttributionTestState;
impl codex_extension_api::ContextContributor for GitAttributionTestContributor {
fn contribute(
&self,
_session_store: &codex_extension_api::ExtensionData,
thread_store: &codex_extension_api::ExtensionData,
) -> Vec<codex_extension_api::PromptFragment> {
thread_store
.get::<GitAttributionTestState>()
.is_some()
.then(|| {
codex_extension_api::PromptFragment::developer_policy(
"git attribution extension enabled",
)
})
.into_iter()
.collect()
}
}
fn git_attribution_test_registry()
-> Arc<codex_extension_api::ExtensionRegistry<crate::config::Config>> {
let mut builder = codex_extension_api::ExtensionRegistryBuilder::new();
builder.prompt_contributor(Arc::new(GitAttributionTestContributor));
Arc::new(builder.build())
}
#[tokio::test]
async fn build_initial_context_includes_git_attribution_from_extensions() {
let (mut session, turn_context) = make_session_and_context().await;
session.services.extensions = git_attribution_test_registry();
session
.services
.thread_extension_data
.insert(GitAttributionTestState);
let initial_context = session.build_initial_context(&turn_context).await;
let developer_messages = developer_message_texts(&initial_context);
assert!(
developer_messages
.iter()
.flatten()
.any(|text| *text == "git attribution extension enabled"),
"expected git attribution developer text, got {developer_messages:?}"
);
}
#[tokio::test]
async fn build_initial_context_omits_git_attribution_when_feature_is_disabled() {
let (mut session, turn_context) = make_session_and_context().await;
session.services.extensions = git_attribution_test_registry();
let initial_context = session.build_initial_context(&turn_context).await;
let developer_messages = developer_message_texts(&initial_context);
assert!(
!developer_messages
.iter()
.flatten()
.any(|text| *text == "git attribution extension enabled"),
"did not expect git attribution developer text, got {developer_messages:?}"
);
}
#[tokio::test]
async fn build_initial_context_adds_multi_agent_v2_root_usage_hint_as_developer_message() {
let (session, turn_context) =

View File

@@ -742,6 +742,7 @@ async fn guardian_subagent_does_not_inherit_parent_exec_policy_rules() {
skills_manager,
plugins_manager,
mcp_manager,
extensions: codex_extension_api::empty_extension_registry(),
conversation_history: InitialHistory::New,
session_source: SessionSource::SubAgent(SubAgentSource::Other(
GUARDIAN_REVIEWER_NAME.to_string(),

View File

@@ -52,11 +52,11 @@ impl TurnEnvironment {
/// The context needed for a single turn of the thread.
#[derive(Debug)]
pub(crate) struct TurnContext {
pub struct TurnContext {
pub(crate) sub_id: String,
pub(crate) trace_id: Option<String>,
pub(crate) realtime_active: bool,
pub(crate) config: Arc<Config>,
pub config: Arc<Config>,
pub(crate) auth_manager: Option<Arc<AuthManager>>,
pub(crate) model_info: ModelInfo,
pub(crate) session_telemetry: SessionTelemetry,
@@ -84,7 +84,7 @@ pub(crate) struct TurnContext {
pub(crate) windows_sandbox_level: WindowsSandboxLevel,
pub(crate) shell_environment_policy: ShellEnvironmentPolicy,
pub(crate) tools_config: ToolsConfig,
pub(crate) features: ManagedFeatures,
pub features: ManagedFeatures,
pub(crate) ghost_snapshot: GhostSnapshotConfig,
pub(crate) final_output_json_schema: Option<Value>,
pub(crate) codex_self_exe: Option<PathBuf>,

View File

@@ -18,6 +18,8 @@ use arc_swap::ArcSwap;
use codex_analytics::AnalyticsEventsClient;
use codex_core_plugins::PluginsManager;
use codex_exec_server::EnvironmentManager;
use codex_extension_api::ExtensionData;
use codex_extension_api::ExtensionRegistry;
use codex_hooks::Hooks;
use codex_login::AuthManager;
use codex_mcp::McpConnectionManager;
@@ -59,6 +61,9 @@ pub(crate) struct SessionServices {
pub(crate) skills_manager: Arc<SkillsManager>,
pub(crate) plugins_manager: Arc<PluginsManager>,
pub(crate) mcp_manager: Arc<McpManager>,
pub(crate) extensions: Arc<ExtensionRegistry<crate::config::Config>>,
pub(crate) session_extension_data: ExtensionData,
pub(crate) thread_extension_data: ExtensionData,
pub(crate) agent_control: AgentControl,
pub(crate) network_proxy: Option<StartedNetworkProxy>,
pub(crate) network_approval: Arc<NetworkApprovalService>,

View File

@@ -20,6 +20,8 @@ use codex_app_server_protocol::ThreadHistoryBuilder;
use codex_app_server_protocol::TurnStatus;
use codex_core_plugins::PluginsManager;
use codex_exec_server::EnvironmentManager;
use codex_extension_api::ExtensionRegistry;
use codex_extension_api::empty_extension_registry;
use codex_login::AuthManager;
use codex_login::CodexAuth;
use codex_model_provider::create_model_provider;
@@ -201,6 +203,7 @@ pub(crate) struct ThreadManagerState {
skills_manager: Arc<SkillsManager>,
plugins_manager: Arc<PluginsManager>,
mcp_manager: Arc<McpManager>,
extensions: Arc<ExtensionRegistry<Config>>,
thread_store: Arc<dyn ThreadStore>,
attestation_provider: Option<Arc<dyn AttestationProvider>>,
session_source: SessionSource,
@@ -242,6 +245,7 @@ impl ThreadManager {
auth_manager: Arc<AuthManager>,
session_source: SessionSource,
environment_manager: Arc<EnvironmentManager>,
extensions: Arc<ExtensionRegistry<Config>>,
analytics_events_client: Option<AnalyticsEventsClient>,
thread_store: Arc<dyn ThreadStore>,
state_db: Option<StateDbHandle>,
@@ -270,6 +274,7 @@ impl ThreadManager {
skills_manager,
plugins_manager,
mcp_manager,
extensions,
thread_store,
attestation_provider,
auth_manager,
@@ -370,6 +375,7 @@ impl ThreadManager {
skills_manager,
plugins_manager,
mcp_manager,
extensions: empty_extension_registry(),
thread_store,
attestation_provider: None,
auth_manager,
@@ -1129,6 +1135,7 @@ impl ThreadManagerState {
skills_manager: Arc::clone(&self.skills_manager),
plugins_manager: Arc::clone(&self.plugins_manager),
mcp_manager: Arc::clone(&self.mcp_manager),
extensions: Arc::clone(&self.extensions),
conversation_history: initial_history,
session_source,
thread_source,

View File

@@ -7,6 +7,7 @@ use crate::session::session::SessionSettingsUpdate;
use crate::session::tests::make_session_and_context;
use crate::tasks::InterruptedTurnHistoryMarker;
use crate::tasks::interrupted_turn_history_marker;
use codex_extension_api::empty_extension_registry;
use codex_features::Feature;
use codex_models_manager::manager::RefreshStrategy;
use codex_protocol::models::ContentItem;
@@ -495,6 +496,7 @@ async fn resume_and_fork_do_not_restore_thread_environments_from_rollout() {
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, /*state_db*/ None),
/*state_db*/ None,
@@ -612,6 +614,7 @@ async fn explicit_installation_id_skips_codex_home_file() {
auth_manager,
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store,
state_db.clone(),
@@ -650,6 +653,7 @@ async fn resume_active_thread_from_rollout_returns_running_thread() {
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, /*state_db*/ None),
/*state_db*/ None,
@@ -706,6 +710,7 @@ async fn resume_stopped_thread_from_rollout_spawns_new_thread() {
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, /*state_db*/ None),
/*state_db*/ None,
@@ -769,6 +774,7 @@ async fn resume_stopped_thread_from_rollout_preserves_thread_source() {
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store,
state_db.clone(),
@@ -858,6 +864,7 @@ async fn rollout_path_resume_and_fork_read_history_through_thread_store() {
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store.clone(),
state_db,
@@ -960,6 +967,7 @@ async fn new_uses_active_provider_for_model_refresh() {
auth_manager,
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, /*state_db*/ None),
/*state_db*/ None,
@@ -1175,6 +1183,7 @@ async fn interrupted_fork_snapshot_does_not_synthesize_turn_id_for_legacy_histor
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, state_db.clone()),
state_db.clone(),
@@ -1282,6 +1291,7 @@ async fn interrupted_fork_snapshot_preserves_explicit_turn_id() {
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, state_db.clone()),
state_db.clone(),
@@ -1378,6 +1388,7 @@ async fn interrupted_fork_snapshot_uses_persisted_mid_turn_history_without_live_
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, state_db.clone()),
state_db.clone(),
@@ -1520,6 +1531,7 @@ async fn resumed_thread_keeps_paused_goal_paused() -> anyhow::Result<()> {
auth_manager.clone(),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, state_db.clone()),
state_db.clone(),

View File

@@ -15,6 +15,7 @@ use crate::tools::handlers::multi_agents_v2::SendMessageHandler as SendMessageHa
use crate::tools::handlers::multi_agents_v2::SpawnAgentHandler as SpawnAgentHandlerV2;
use crate::tools::handlers::multi_agents_v2::WaitAgentHandler as WaitAgentHandlerV2;
use crate::turn_diff_tracker::TurnDiffTracker;
use codex_extension_api::empty_extension_registry;
use codex_features::Feature;
use codex_login::AuthManager;
use codex_login::CodexAuth;
@@ -3163,6 +3164,7 @@ async fn tool_handlers_cascade_close_and_resume_and_keep_explicitly_closed_subtr
AuthManager::from_auth_for_testing(CodexAuth::from_api_key("dummy")),
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, state_db.clone()),
state_db.clone(),

View File

@@ -18,6 +18,7 @@ base64 = { workspace = true }
codex-arg0 = { workspace = true }
codex-config = { workspace = true }
codex-core = { workspace = true }
codex-extension-api = { workspace = true }
codex-exec-server = { workspace = true }
codex-features = { workspace = true }
codex-hooks = { workspace = true }

View File

@@ -23,6 +23,7 @@ use codex_core::thread_store_from_config;
use codex_exec_server::CreateDirectoryOptions;
use codex_exec_server::ExecutorFileSystem;
use codex_exec_server::RemoveOptions;
use codex_extension_api::empty_extension_registry;
use codex_features::Feature;
use codex_login::CodexAuth;
use codex_model_provider_info::ModelProviderInfo;
@@ -437,6 +438,7 @@ impl TestCodexBuilder {
codex_core::test_support::auth_manager_from_auth(auth.clone()),
SessionSource::Exec,
Arc::clone(&environment_manager),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store,
state_db.clone(),

View File

@@ -7,6 +7,7 @@ use codex_core::ResponseEvent;
use codex_core::ThreadManager;
use codex_core::resolve_installation_id;
use codex_core::thread_store_from_config;
use codex_extension_api::empty_extension_registry;
use codex_features::Feature;
use codex_login::AuthManager;
use codex_login::CodexAuth;
@@ -1137,6 +1138,7 @@ async fn prefers_apikey_when_config_prefers_apikey_even_with_chatgpt_tokens() {
auth_manager,
SessionSource::Exec,
Arc::new(codex_exec_server::EnvironmentManager::default_for_tests()),
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(&config, /*state_db*/ None),
/*state_db*/ None,

View File

@@ -0,0 +1,6 @@
load("//:defs.bzl", "codex_rust_crate")
codex_rust_crate(
name = "extension-api",
crate_name = "codex_extension_api",
)

View File

@@ -0,0 +1,20 @@
[package]
edition.workspace = true
license.workspace = true
name = "codex-extension-api"
version.workspace = true
[lib]
name = "codex_extension_api"
path = "src/lib.rs"
test = false
doctest = false
[lints]
workspace = true
[dependencies]
codex-protocol = { workspace = true }
codex-tools = { workspace = true }
serde_json = { workspace = true }
thiserror = { workspace = true }

View File

@@ -0,0 +1,62 @@
#[path = "enabled_extensions/shared_state_extension.rs"]
mod shared_state_extension;
use codex_extension_api::ExtensionData;
use codex_extension_api::ExtensionRegistryBuilder;
use shared_state_extension::recorded_style_contributions;
use shared_state_extension::recorded_usage_contributions;
fn main() {
// 1. Install the contributors for the thread-start input type this host exposes.
let mut builder = ExtensionRegistryBuilder::<()>::new();
shared_state_extension::install(&mut builder);
let registry = builder.build();
// 2. The host decides which stores are shared.
let session_store = ExtensionData::new();
let first_thread_store = ExtensionData::new();
let second_thread_store = ExtensionData::new();
// 3. Reusing the same session store shares session state across threads.
let first_thread_fragments = contribute_prompt(&registry, &session_store, &first_thread_store);
contribute_prompt(&registry, &session_store, &first_thread_store);
contribute_prompt(&registry, &session_store, &second_thread_store);
println!("first prompt fragments: {}", first_thread_fragments.len());
println!(
"session style contributions: {}",
recorded_style_contributions(&session_store)
);
println!(
"session usage contributions: {}",
recorded_usage_contributions(&session_store)
);
println!(
"first thread style contributions: {}",
recorded_style_contributions(&first_thread_store)
);
println!(
"first thread usage contributions: {}",
recorded_usage_contributions(&first_thread_store)
);
println!(
"second thread style contributions: {}",
recorded_style_contributions(&second_thread_store)
);
println!(
"second thread usage contributions: {}",
recorded_usage_contributions(&second_thread_store)
);
}
fn contribute_prompt(
registry: &codex_extension_api::ExtensionRegistry<()>,
session_store: &ExtensionData,
thread_store: &ExtensionData,
) -> Vec<codex_extension_api::PromptFragment> {
registry
.context_contributors()
.iter()
.flat_map(|contributor| contributor.contribute(session_store, thread_store))
.collect()
}

View File

@@ -0,0 +1,94 @@
use std::sync::Arc;
use std::sync::atomic::AtomicU64;
use std::sync::atomic::Ordering;
use codex_extension_api::ContextContributor;
use codex_extension_api::ExtensionData;
use codex_extension_api::ExtensionRegistryBuilder;
use codex_extension_api::PromptFragment;
/// Installs the tutorial contributors used by the example host.
pub fn install(registry: &mut ExtensionRegistryBuilder<()>) {
registry.prompt_contributor(Arc::new(StyleContributor));
registry.prompt_contributor(Arc::new(UsageContributor));
}
#[derive(Debug)]
struct StyleContributor;
impl ContextContributor for StyleContributor {
fn contribute(
&self,
session_store: &ExtensionData,
thread_store: &ExtensionData,
) -> Vec<PromptFragment> {
contribution_counts(session_store).record_style();
contribution_counts(thread_store).record_style();
vec![PromptFragment::developer_policy(
"Prefer short answers unless the user asks for detail.",
)]
}
}
#[derive(Debug)]
struct UsageContributor;
impl ContextContributor for UsageContributor {
fn contribute(
&self,
session_store: &ExtensionData,
thread_store: &ExtensionData,
) -> Vec<PromptFragment> {
contribution_counts(session_store).record_usage();
contribution_counts(thread_store).record_usage();
vec![PromptFragment::developer_capability(
"This extension can contribute more than one prompt fragment.",
)]
}
}
/// Returns how many style contributions were recorded in `store`.
pub fn recorded_style_contributions(store: &ExtensionData) -> u64 {
store
.get::<ContributionCounts>()
.map(|counts| counts.style())
.unwrap_or_default()
}
/// Returns how many usage contributions were recorded in `store`.
pub fn recorded_usage_contributions(store: &ExtensionData) -> u64 {
store
.get::<ContributionCounts>()
.map(|counts| counts.usage())
.unwrap_or_default()
}
#[derive(Debug, Default)]
struct ContributionCounts {
style: AtomicU64,
usage: AtomicU64,
}
impl ContributionCounts {
fn record_style(&self) {
self.style.fetch_add(1, Ordering::Relaxed);
}
fn record_usage(&self) {
self.usage.fetch_add(1, Ordering::Relaxed);
}
fn style(&self) -> u64 {
self.style.load(Ordering::Relaxed)
}
fn usage(&self) -> u64 {
self.usage.load(Ordering::Relaxed)
}
}
fn contribution_counts(store: &ExtensionData) -> Arc<ContributionCounts> {
store.get_or_init::<ContributionCounts>(Default::default)
}

View File

@@ -0,0 +1,14 @@
If everything becomes a contributor, which contributor kinds does each feature need?
git attribution: Context
memories: Context + Tool + Output
guardian: Context + Request
goal: Tool + Runtime
image generation: Tool + Output
skills: Context + Turn
personality: Context
plugins / apps / connectors: Context + Turn
shell snapshot: Runtime
web search: Tool
AGENTS.md: Context (Runtime too, but only if you want eager refresh/cache behavior)
future sandboxing: probably Request + Runtime
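As a hypothetical sketch of the taxonomy above (the trait shapes are redeclared locally in heavily simplified form; the real definitions live in `codex-extension-api`), a feature like web search would implement only the Tool kind, while personality would implement only the Context kind:

```rust
// Local, simplified stand-ins for the contributor traits (illustrative only).
trait ContextContributor {
    fn contribute(&self) -> Vec<String>; // prompt fragments, reduced to plain text
}
trait ToolContributor {
    fn tools(&self) -> Vec<&'static str>; // tool contributions, reduced to names
}

// "personality" from the table above: Context only.
struct Personality;
impl ContextContributor for Personality {
    fn contribute(&self) -> Vec<String> {
        vec!["Prefer a concise, friendly tone.".to_string()]
    }
}

// "web search" from the table above: Tool only.
struct WebSearch;
impl ToolContributor for WebSearch {
    fn tools(&self) -> Vec<&'static str> {
        vec!["web_search"]
    }
}

fn main() {
    // A host would collect each kind into its registry slot.
    let fragments = Personality.contribute();
    let tool_names = WebSearch.tools();
    println!("{} fragment(s), {} tool(s)", fragments.len(), tool_names.len());
}
```

The point of the table is exactly this split: most features need only one or two of the contributor kinds, so each kind can stay a small, independent trait.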

View File

@@ -0,0 +1,65 @@
use std::future::Future;
use codex_protocol::items::TurnItem;
use crate::ExtensionData;
mod prompt;
mod tool;
pub use prompt::PromptFragment;
pub use prompt::PromptSlot;
pub use tool::ToolCallError;
pub use tool::ToolContribution;
pub use tool::ToolHandler;
/// Contributor that receives host-owned thread-start input before later
/// contributors read from extension stores.
pub trait ThreadStartContributor<C>: Send + Sync {
fn contribute(&self, input: &C, session_store: &ExtensionData, thread_store: &ExtensionData);
}
/// Extension contribution that adds prompt fragments during prompt assembly.
pub trait ContextContributor: Send + Sync {
fn contribute(
&self,
session_store: &ExtensionData,
thread_store: &ExtensionData,
) -> Vec<PromptFragment>;
}
/// Extension contribution that exposes native tools owned by a feature.
pub trait ToolContributor: Send + Sync {
/// Returns the native tools visible for the supplied runtime context.
fn tools(&self, thread_store: &ExtensionData) -> Vec<ToolContribution>;
}
/// Future returned by one ordered turn-item contribution.
pub type TurnItemContributionFuture<'a> =
std::pin::Pin<Box<dyn Future<Output = Result<(), String>> + Send + 'a>>;
/// Ordered post-processing contribution for one parsed turn item.
///
/// Implementations may mutate the item before it is emitted and may use the
/// explicitly exposed thread- and turn-lifetime stores when they need durable
/// extension-private state.
pub trait TurnItemContributor: Send + Sync {
fn contribute<'a>(
&'a self,
thread_store: &'a ExtensionData,
turn_store: &'a ExtensionData,
item: &'a mut TurnItem,
) -> TurnItemContributionFuture<'a>;
}
// TODO: WIP (do not consider)
/// Extension contribution that can claim approval requests for a runtime context.
/// (Ideally this could be replaced with a session-lifecycle hook or a request contributor.)
pub trait ApprovalInterceptorContributor: Send + Sync {
/// Returns whether this contributor should intercept approvals in `context`.
fn intercepts_approvals(
&self,
thread_store: &ExtensionData,
turn_store: &ExtensionData,
) -> bool;
}

View File

@@ -0,0 +1,50 @@
// This entire file should eventually be replaced by the existing fragment implementation.
#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]
pub enum PromptSlot {
DeveloperPolicy,
DeveloperCapabilities,
ContextualUser,
SeparateDeveloper,
}
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct PromptFragment {
slot: PromptSlot,
text: String,
}
impl PromptFragment {
/// Creates a prompt fragment for the given slot.
pub fn new(slot: PromptSlot, text: impl Into<String>) -> Self {
Self {
slot,
text: text.into(),
}
}
/// Creates a developer-policy prompt fragment.
pub fn developer_policy(text: impl Into<String>) -> Self {
Self::new(PromptSlot::DeveloperPolicy, text)
}
/// Creates a developer-capabilities prompt fragment.
pub fn developer_capability(text: impl Into<String>) -> Self {
Self::new(PromptSlot::DeveloperCapabilities, text)
}
/// Creates a separate top-level developer prompt fragment.
pub fn separate_developer(text: impl Into<String>) -> Self {
Self::new(PromptSlot::SeparateDeveloper, text)
}
/// Returns the target prompt slot.
pub fn slot(&self) -> PromptSlot {
self.slot
}
/// Returns the model-visible text.
pub fn text(&self) -> &str {
&self.text
}
}

View File

@@ -0,0 +1,68 @@
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;
use codex_tools::ResponsesApiTool;
use serde_json::Value;
use thiserror::Error;
// Temporary until the fully extracted tools land.
#[derive(Clone)]
pub struct ToolContribution {
spec: ResponsesApiTool,
handler: Arc<dyn ToolHandler>,
supports_parallel_tool_calls: bool,
}
impl ToolContribution {
pub fn new(spec: ResponsesApiTool, handler: Arc<dyn ToolHandler>) -> Self {
Self {
spec,
handler,
supports_parallel_tool_calls: false,
}
}
#[must_use]
pub fn allow_parallel_calls(mut self) -> Self {
self.supports_parallel_tool_calls = true;
self
}
pub fn spec(&self) -> &ResponsesApiTool {
&self.spec
}
pub fn supports_parallel_tool_calls(&self) -> bool {
self.supports_parallel_tool_calls
}
pub fn handler(&self) -> Arc<dyn ToolHandler> {
Arc::clone(&self.handler)
}
}
//////// Just to make it compile ////////////////////////////////
pub trait ToolHandler: Send + Sync {
/// Handles one JSON-encoded invocation for this tool.
fn handle<'a>(
&'a self,
arguments: Value,
) -> Pin<Box<dyn Future<Output = Result<Value, ToolCallError>> + Send + 'a>>;
}
/// Error returned by a contributed native tool handler.
#[derive(Clone, Debug, Error, PartialEq, Eq)]
#[error("{message}")]
pub struct ToolCallError {
message: String,
}
impl ToolCallError {
/// Creates a contributed-tool error with the supplied model-visible text.
pub fn new(message: impl Into<String>) -> Self {
Self {
message: message.into(),
}
}
}

View File

@@ -0,0 +1,19 @@
mod contributors;
mod registry;
mod state;
pub use contributors::ApprovalInterceptorContributor;
pub use contributors::ContextContributor;
pub use contributors::PromptFragment;
pub use contributors::PromptSlot;
pub use contributors::ThreadStartContributor;
pub use contributors::ToolCallError;
pub use contributors::ToolContribution;
pub use contributors::ToolContributor;
pub use contributors::ToolHandler;
pub use contributors::TurnItemContributionFuture;
pub use contributors::TurnItemContributor;
pub use registry::ExtensionRegistry;
pub use registry::ExtensionRegistryBuilder;
pub use registry::empty_extension_registry;
pub use state::ExtensionData;

View File

@@ -0,0 +1,115 @@
use std::sync::Arc;
use crate::ApprovalInterceptorContributor;
use crate::ContextContributor;
use crate::ThreadStartContributor;
use crate::ToolContributor;
use crate::TurnItemContributor;
/// Mutable registry used while hosts register typed runtime contributions.
pub struct ExtensionRegistryBuilder<C> {
thread_start_contributors: Vec<Arc<dyn ThreadStartContributor<C>>>,
context_contributors: Vec<Arc<dyn ContextContributor>>,
tool_contributors: Vec<Arc<dyn ToolContributor>>,
turn_item_contributors: Vec<Arc<dyn TurnItemContributor>>,
approval_interceptor_contributors: Vec<Arc<dyn ApprovalInterceptorContributor>>,
}
impl<C> Default for ExtensionRegistryBuilder<C> {
fn default() -> Self {
Self {
thread_start_contributors: Vec::new(),
approval_interceptor_contributors: Vec::new(),
context_contributors: Vec::new(),
tool_contributors: Vec::new(),
turn_item_contributors: Vec::new(),
}
}
}
impl<C> ExtensionRegistryBuilder<C> {
/// Creates an empty registry builder.
pub fn new() -> Self {
Self::default()
}
/// Registers one approval interceptor contributor.
pub fn approval_interceptor_contributor(
&mut self,
contributor: Arc<dyn ApprovalInterceptorContributor>,
) {
self.approval_interceptor_contributors.push(contributor);
}
/// Registers one thread-start contributor.
pub fn thread_start_contributor(&mut self, contributor: Arc<dyn ThreadStartContributor<C>>) {
self.thread_start_contributors.push(contributor);
}
/// Registers one prompt contributor.
pub fn prompt_contributor(&mut self, contributor: Arc<dyn ContextContributor>) {
self.context_contributors.push(contributor);
}
/// Registers one native tool contributor.
pub fn tool_contributor(&mut self, contributor: Arc<dyn ToolContributor>) {
self.tool_contributors.push(contributor);
}
/// Registers one ordered turn-item contributor.
pub fn turn_item_contributor(&mut self, contributor: Arc<dyn TurnItemContributor>) {
self.turn_item_contributors.push(contributor);
}
/// Finishes construction and returns the immutable registry.
pub fn build(self) -> ExtensionRegistry<C> {
ExtensionRegistry {
thread_start_contributors: self.thread_start_contributors,
approval_interceptor_contributors: self.approval_interceptor_contributors,
context_contributors: self.context_contributors,
tool_contributors: self.tool_contributors,
turn_item_contributors: self.turn_item_contributors,
}
}
}
/// Immutable typed registry produced after extensions are installed.
pub struct ExtensionRegistry<C> {
thread_start_contributors: Vec<Arc<dyn ThreadStartContributor<C>>>,
context_contributors: Vec<Arc<dyn ContextContributor>>,
tool_contributors: Vec<Arc<dyn ToolContributor>>,
turn_item_contributors: Vec<Arc<dyn TurnItemContributor>>,
approval_interceptor_contributors: Vec<Arc<dyn ApprovalInterceptorContributor>>,
}
impl<C> ExtensionRegistry<C> {
/// Returns the registered thread-start contributors.
pub fn thread_start_contributors(&self) -> &[Arc<dyn ThreadStartContributor<C>>] {
&self.thread_start_contributors
}
/// Returns the registered approval interceptor contributors.
pub fn approval_interceptor_contributors(&self) -> &[Arc<dyn ApprovalInterceptorContributor>] {
&self.approval_interceptor_contributors
}
/// Returns the registered prompt contributors.
pub fn context_contributors(&self) -> &[Arc<dyn ContextContributor>] {
&self.context_contributors
}
/// Returns the registered native tool contributors.
pub fn tool_contributors(&self) -> &[Arc<dyn ToolContributor>] {
&self.tool_contributors
}
/// Returns the registered ordered turn-item contributors.
pub fn turn_item_contributors(&self) -> &[Arc<dyn TurnItemContributor>] {
&self.turn_item_contributors
}
}
/// Creates an empty shared registry for hosts that do not register contributions.
pub fn empty_extension_registry<C>() -> Arc<ExtensionRegistry<C>> {
Arc::new(ExtensionRegistryBuilder::new().build())
}

View File

@@ -0,0 +1,77 @@
use std::any::Any;
use std::any::TypeId;
use std::collections::HashMap;
use std::sync::Arc;
use std::sync::Mutex;
use std::sync::PoisonError;
type ErasedData = Arc<dyn Any + Send + Sync>;
/// Typed extension-owned data attached to one host object.
#[derive(Default, Debug)]
pub struct ExtensionData {
entries: Mutex<HashMap<TypeId, ErasedData>>,
}
impl ExtensionData {
/// Creates an empty attachment map.
pub fn new() -> Self {
Self::default()
}
/// Returns the attached value of type `T`, if one exists.
pub fn get<T>(&self) -> Option<Arc<T>>
where
T: Any + Send + Sync,
{
let value = self.entries().get(&TypeId::of::<T>())?.clone();
Some(downcast_data(value))
}
/// Returns the attached value of type `T`, inserting one from `init` when absent.
///
/// The initializer runs while this map is locked, so it should stay cheap;
/// heavyweight lazy work belongs inside the attached value itself.
pub fn get_or_init<T>(&self, init: impl FnOnce() -> T) -> Arc<T>
where
T: Any + Send + Sync,
{
let mut entries = self.entries();
let value = entries
.entry(TypeId::of::<T>())
.or_insert_with(|| Arc::new(init()));
downcast_data(Arc::clone(value))
}
/// Stores `value` as the attachment of type `T`, returning any previous value.
pub fn insert<T>(&self, value: T) -> Option<Arc<T>>
where
T: Any + Send + Sync,
{
self.entries()
.insert(TypeId::of::<T>(), Arc::new(value))
.map(downcast_data)
}
/// Removes and returns the attached value of type `T`, if one exists.
pub fn remove<T>(&self) -> Option<Arc<T>>
where
T: Any + Send + Sync,
{
self.entries().remove(&TypeId::of::<T>()).map(downcast_data)
}
fn entries(&self) -> std::sync::MutexGuard<'_, HashMap<TypeId, ErasedData>> {
self.entries.lock().unwrap_or_else(PoisonError::into_inner)
}
}
fn downcast_data<T>(value: ErasedData) -> Arc<T>
where
T: Any + Send + Sync,
{
let Ok(value) = value.downcast::<T>() else {
unreachable!("typed extension data stored an incompatible value");
};
value
}

View File

@@ -0,0 +1,6 @@
load("//:defs.bzl", "codex_rust_crate")
codex_rust_crate(
name = "git-attribution",
crate_name = "codex_git_attribution",
)

View File

@@ -0,0 +1,21 @@
[package]
edition.workspace = true
license.workspace = true
name = "codex-git-attribution"
version.workspace = true
[lib]
name = "codex_git_attribution"
path = "src/lib.rs"
doctest = false
[lints]
workspace = true
[dependencies]
codex-core = { workspace = true }
codex-extension-api = { workspace = true }
codex-features = { workspace = true }
[dev-dependencies]
pretty_assertions = { workspace = true }

View File

@@ -0,0 +1,136 @@
use std::sync::Arc;
use codex_core::config::Config;
use codex_extension_api::ContextContributor;
use codex_extension_api::ExtensionData;
use codex_extension_api::ExtensionRegistryBuilder;
use codex_extension_api::PromptFragment;
use codex_extension_api::ThreadStartContributor;
use codex_features::Feature;
const DEFAULT_ATTRIBUTION_VALUE: &str = "Codex <noreply@openai.com>";
/// Contributes the configured git-attribution instruction.
#[derive(Clone, Copy, Debug, Default)]
pub struct GitAttributionExtension;
impl ContextContributor for GitAttributionExtension {
fn contribute(
&self,
_session_store: &ExtensionData,
thread_store: &ExtensionData,
) -> Vec<PromptFragment> {
let Some(config_store) = thread_store.get::<GitAttributionConfig>() else {
return Vec::new();
};
if !config_store.enabled {
return Vec::new();
}
build_instruction(config_store.prompt.as_deref())
.map(PromptFragment::developer_policy)
.into_iter()
.collect()
}
}
#[derive(Clone, Debug, Default)]
struct GitAttributionConfig {
enabled: bool,
prompt: Option<String>,
}
impl ThreadStartContributor<Config> for GitAttributionExtension {
fn contribute(
&self,
config: &Config,
_session_store: &ExtensionData,
thread_store: &ExtensionData,
) {
thread_store.insert(GitAttributionConfig {
enabled: config.features.enabled(Feature::CodexGitCommit),
prompt: config.commit_attribution.clone(),
});
}
}
/// Installs the git-attribution contributors into the extension registry.
pub fn install(registry: &mut ExtensionRegistryBuilder<Config>) {
let extension = Arc::new(GitAttributionExtension);
registry.thread_start_contributor(extension.clone());
registry.prompt_contributor(extension);
}
fn build_commit_message_trailer(config_attribution: Option<&str>) -> Option<String> {
let value = resolve_attribution_value(config_attribution)?;
Some(format!("Co-authored-by: {value}"))
}
fn build_instruction(config_attribution: Option<&str>) -> Option<String> {
let trailer = build_commit_message_trailer(config_attribution)?;
Some(format!(
"When you write or edit a git commit message, ensure the message ends with this trailer exactly once:\n{trailer}\n\nRules:\n- Keep existing trailers and append this trailer at the end if missing.\n- Do not duplicate this trailer if it already exists.\n- Keep one blank line between the commit body and trailer block."
))
}
fn resolve_attribution_value(config_attribution: Option<&str>) -> Option<String> {
match config_attribution {
Some(value) => {
let trimmed = value.trim();
if trimmed.is_empty() {
None
} else {
Some(trimmed.to_string())
}
}
None => Some(DEFAULT_ATTRIBUTION_VALUE.to_string()),
}
}
#[cfg(test)]
mod tests {
use pretty_assertions::assert_eq;
use super::build_commit_message_trailer;
use super::build_instruction;
use super::resolve_attribution_value;
#[test]
fn blank_attribution_disables_trailer_prompt() {
assert_eq!(build_commit_message_trailer(Some("")), None);
assert_eq!(build_instruction(Some(" ")), None);
}
#[test]
fn default_attribution_uses_codex_trailer() {
assert_eq!(
build_commit_message_trailer(/*config_attribution*/ None).as_deref(),
Some("Co-authored-by: Codex <noreply@openai.com>")
);
}
#[test]
fn resolve_value_handles_default_custom_and_blank() {
assert_eq!(
resolve_attribution_value(/*config_attribution*/ None),
Some("Codex <noreply@openai.com>".to_string())
);
assert_eq!(
resolve_attribution_value(Some("MyAgent <me@example.com>")),
Some("MyAgent <me@example.com>".to_string())
);
assert_eq!(
resolve_attribution_value(Some("MyAgent")),
Some("MyAgent".to_string())
);
assert_eq!(resolve_attribution_value(Some(" ")), None);
}
#[test]
fn instruction_mentions_trailer_and_omits_generated_with() {
let instruction =
build_instruction(Some("AgentX <agent@example.com>")).expect("instruction expected");
assert!(instruction.contains("Co-authored-by: AgentX <agent@example.com>"));
assert!(instruction.contains("exactly once"));
assert!(!instruction.contains("Generated-with"));
}
}


@@ -38,7 +38,6 @@ pub(crate) fn plugin_hook_key_source(plugin_id: &str, source_relative_path: &str
#[cfg(test)]
mod tests {
use codex_config::HookCommandConfig;
use codex_config::HookEventsToml;
use codex_config::HookHandlerConfig;
use codex_config::MatcherGroup;
@@ -65,7 +64,7 @@ mod tests {
hooks: vec![
HookHandlerConfig::Prompt {},
HookHandlerConfig::Command {
command: HookCommandConfig::Single("echo hi".to_string()),
command: "echo hi".to_string(),
timeout_sec: None,
r#async: false,
status_message: None,


@@ -392,7 +392,6 @@ fn append_matcher_groups(
r#async,
status_message,
} => {
let command = command.for_current_platform().to_string();
if r#async {
warnings.push(format!(
"skipping async hook in {}: async hooks are not supported yet",
@@ -409,7 +408,7 @@ fn append_matcher_groups(
}
let timeout_sec = timeout_sec.unwrap_or(600).max(1);
let normalized_handler = HookHandlerConfig::Command {
command: codex_config::HookCommandConfig::Single(command.clone()),
command: command.clone(),
timeout_sec: Some(timeout_sec),
r#async,
status_message: status_message.clone(),
@@ -566,8 +565,6 @@ fn hook_source_for_requirement_source(source: Option<&RequirementSource>) -> Hoo
mod tests {
use codex_config::ConfigLayerEntry;
use codex_config::ConfigLayerSource;
use codex_config::HookCommandByPlatformConfig;
use codex_config::HookCommandConfig;
use codex_config::HookEventsToml;
use codex_protocol::protocol::HookEventName;
use codex_protocol::protocol::HookSource;
@@ -610,7 +607,7 @@ mod tests {
MatcherGroup {
matcher: matcher.map(str::to_string),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single("echo hello".to_string()),
command: "echo hello".to_string(),
timeout_sec: None,
r#async: false,
status_message: None,
@@ -755,7 +752,7 @@ mod tests {
session_start: vec![MatcherGroup {
matcher: None,
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single("echo hello".to_string()),
command: "echo hello".to_string(),
timeout_sec: None,
r#async: false,
status_message: None,
@@ -766,47 +763,6 @@ mod tests {
);
}
#[test]
fn pre_tool_use_resolves_platform_specific_command_during_discovery() {
let mut handlers = Vec::new();
let mut warnings = Vec::new();
let mut display_order = 0;
let source_path = source_path();
let hook_states = std::collections::HashMap::new();
append_matcher_groups(
&mut handlers,
&mut Vec::new(),
&mut warnings,
&mut display_order,
&hook_handler_source(&source_path, &hook_states),
HookEventName::PreToolUse,
vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::ByPlatform(HookCommandByPlatformConfig {
unix: "echo unix".to_string(),
windows: "echo windows".to_string(),
}),
timeout_sec: None,
r#async: false,
status_message: None,
}],
}],
);
assert_eq!(warnings, Vec::<String>::new());
assert_eq!(handlers.len(), 1);
assert_eq!(
handlers[0].command,
if cfg!(windows) {
"echo windows"
} else {
"echo unix"
}
);
}
fn config_with_malformed_state_and_session_start_hook() -> TomlValue {
serde_json::from_value(serde_json::json!({
"hooks": {


@@ -10,8 +10,6 @@ use codex_config::ConfigRequirements;
use codex_config::ConfigRequirementsToml;
use codex_config::Constrained;
use codex_config::ConstrainedWithSource;
use codex_config::HookCommandByPlatformConfig;
use codex_config::HookCommandConfig;
use codex_config::HookEventsToml;
use codex_config::HookHandlerConfig;
use codex_config::ManagedHooksRequirementsToml;
@@ -86,10 +84,7 @@ with Path(r"{log_path}").open("a", encoding="utf-8") as handle:
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single(format!(
"python3 {}",
script_path.display()
)),
command: format!("python3 {}", script_path.display()),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),
@@ -174,95 +169,6 @@ with Path(r"{log_path}").open("a", encoding="utf-8") as handle:
assert!(log_contents.contains("\"hook_event_name\": \"PreToolUse\""));
}
#[tokio::test]
async fn requirements_managed_hooks_execute_platform_specific_command() {
let temp = tempdir().expect("create temp dir");
let managed_dir =
AbsolutePathBuf::try_from(temp.path().join("managed-hooks")).expect("absolute path");
fs::create_dir_all(managed_dir.as_path()).expect("create managed hooks dir");
let unix_log_path = managed_dir.join("unix_pre_tool_use_log.jsonl");
let windows_log_path = managed_dir.join("windows_pre_tool_use_log.jsonl");
let python_command = |log_path: &Path| {
format!(
r#"python3 -c 'import json, pathlib, sys; payload = json.load(sys.stdin); pathlib.Path(r"{log_path}").write_text(json.dumps(payload), encoding="utf-8")'"#,
log_path = log_path.display()
)
};
let managed_hooks = managed_hooks_for_current_platform(
managed_dir,
HookEventsToml {
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::ByPlatform(HookCommandByPlatformConfig {
unix: python_command(unix_log_path.as_path()),
windows: python_command(windows_log_path.as_path()),
}),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),
}],
}],
..Default::default()
},
);
let config_layer_stack = ConfigLayerStack::new(
Vec::new(),
ConfigRequirements {
managed_hooks: Some(ConstrainedWithSource::new(
Constrained::allow_any(managed_hooks.clone()),
Some(RequirementSource::CloudRequirements),
)),
..ConfigRequirements::default()
},
ConfigRequirementsToml {
hooks: Some(managed_hooks),
..ConfigRequirementsToml::default()
},
)
.expect("config layer stack");
let engine = ClaudeHooksEngine::new(
/*enabled*/ true,
Some(&config_layer_stack),
Vec::new(),
Vec::new(),
CommandShell {
program: String::new(),
args: Vec::new(),
},
);
let outcome = engine
.run_pre_tool_use(PreToolUseRequest {
session_id: ThreadId::new(),
turn_id: "turn-1".to_string(),
cwd: cwd(),
transcript_path: None,
model: "gpt-test".to_string(),
permission_mode: "default".to_string(),
tool_name: "Bash".to_string(),
matcher_aliases: Vec::new(),
tool_use_id: "tool-1".to_string(),
tool_input: serde_json::json!({ "command": "echo hello" }),
})
.await;
assert!(!outcome.should_block);
assert_eq!(
unix_log_path.exists(),
cfg!(not(windows)),
"only the Unix command should run off Windows"
);
assert_eq!(
windows_log_path.exists(),
cfg!(windows),
"only the Windows command should run on Windows"
);
}
#[test]
fn unknown_requirement_source_hooks_stay_managed() {
let temp = tempdir().expect("create temp dir");
@@ -275,7 +181,7 @@ fn unknown_requirement_source_hooks_stay_managed() {
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single("python3 /tmp/managed.py".to_string()),
command: "python3 /tmp/managed.py".to_string(),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),
@@ -337,7 +243,7 @@ fn user_disablement_filters_non_managed_hooks_but_not_managed_hooks() {
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single("python3 /tmp/managed.py".to_string()),
command: "python3 /tmp/managed.py".to_string(),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),
@@ -556,10 +462,7 @@ fn requirements_managed_hooks_warn_when_managed_dir_is_missing() {
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single(format!(
"python3 {}",
missing_dir.join("pre.py").display()
)),
command: format!("python3 {}", missing_dir.join("pre.py").display()),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),
@@ -770,10 +673,7 @@ print(json.dumps({
pre_tool_use: vec![MatcherGroup {
matcher: Some("Bash".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single(format!(
"python3 {}",
script_path.display()
)),
command: format!("python3 {}", script_path.display()),
timeout_sec: Some(10),
r#async: false,
status_message: None,
@@ -880,10 +780,8 @@ fn plugin_hook_sources_expand_plugin_placeholders() {
pre_tool_use: vec![MatcherGroup {
matcher: Some("Bash".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: HookCommandConfig::Single(
"run ${PLUGIN_ROOT} ${CLAUDE_PLUGIN_ROOT} ${PLUGIN_DATA} ${CLAUDE_PLUGIN_DATA}"
.to_string(),
),
command: "run ${PLUGIN_ROOT} ${CLAUDE_PLUGIN_ROOT} ${PLUGIN_DATA} ${CLAUDE_PLUGIN_DATA}"
.to_string(),
timeout_sec: Some(5),
r#async: false,
status_message: None,


@@ -22,6 +22,7 @@ codex-arg0 = { workspace = true }
codex-config = { workspace = true }
codex-core = { workspace = true }
codex-exec-server = { workspace = true }
codex-extension-api = { workspace = true }
codex-login = { workspace = true }
codex-protocol = { workspace = true }
codex-utils-cli = { workspace = true }


@@ -7,6 +7,7 @@ use codex_core::ThreadManager;
use codex_core::config::Config;
use codex_core::thread_store_from_config;
use codex_exec_server::EnvironmentManager;
use codex_extension_api::empty_extension_registry;
use codex_login::AuthManager;
use codex_login::default_client::USER_AGENT_SUFFIX;
use codex_login::default_client::get_codex_user_agent;
@@ -68,6 +69,7 @@ impl MessageProcessor {
auth_manager,
SessionSource::Mcp,
environment_manager,
empty_extension_registry(),
/*analytics_events_client*/ None,
thread_store_from_config(config.as_ref(), state_db.clone()),
state_db.clone(),


@@ -1,6 +1,6 @@
{
"models": [
{
{
"prefer_websockets": true,
"support_verbosity": true,
"default_verbosity": "low",


@@ -53,6 +53,7 @@ use codex_core_api::UserInput;
use codex_core_api::WebSearchMode;
use codex_core_api::arg0_dispatch_or_else;
use codex_core_api::built_in_model_providers;
use codex_core_api::empty_extension_registry;
use codex_core_api::find_codex_home;
use codex_core_api::init_state_db;
use codex_core_api::item_event_to_server_notification;
@@ -122,6 +123,7 @@ async fn run_main(arg0_paths: Arg0DispatchPaths) -> anyhow::Result<()> {
auth_manager,
SessionSource::Exec,
environment_manager,
empty_extension_registry(),
/*analytics_events_client*/ None,
Arc::clone(&thread_store),
state_db,


@@ -0,0 +1,6 @@
load("//:defs.bzl", "codex_rust_crate")
codex_rust_crate(
name = "tool-api",
crate_name = "codex_tool_api",
)


@@ -0,0 +1,23 @@
[package]
edition.workspace = true
license.workspace = true
name = "codex-tool-api"
version.workspace = true
[lib]
name = "codex_tool_api"
path = "src/lib.rs"
doctest = false
[lints]
workspace = true
[dependencies]
codex-protocol = { workspace = true }
codex-tools = { workspace = true }
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true }
thiserror = { workspace = true }
[dev-dependencies]
pretty_assertions = { workspace = true }


@@ -0,0 +1,127 @@
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;
use codex_tools::ToolName;
use codex_tools::ToolSpec;
use crate::ToolCall;
use crate::ToolError;
use crate::ToolOutput;
/// Future returned by one executable-tool invocation.
pub type ToolFuture<'a> =
Pin<Box<dyn Future<Output = Result<Box<dyn ToolOutput>, ToolError>> + Send + 'a>>;
/// Future returned by one mutability probe.
pub type BoolFuture<'a> = Pin<Box<dyn Future<Output = bool> + Send + 'a>>;
/// Model-visible definition plus executable implementation for one tool.
#[derive(Clone)]
pub struct ToolBundle<C> {
definition: ToolDefinition,
executor: Arc<dyn ToolExecutor<C>>,
}
impl<C> ToolBundle<C> {
/// Creates one executable tool bundle.
pub fn new(name: ToolName, spec: ToolSpec, executor: Arc<dyn ToolExecutor<C>>) -> Self {
Self {
definition: ToolDefinition {
name,
spec,
supports_parallel_tool_calls: false,
},
executor,
}
}
/// Marks this tool as safe for the host to run in parallel with peers.
#[must_use]
pub fn allow_parallel_calls(mut self) -> Self {
self.definition.supports_parallel_tool_calls = true;
self
}
/// Returns the model-visible tool definition.
pub fn definition(&self) -> &ToolDefinition {
&self.definition
}
/// Returns the executable implementation.
pub fn executor(&self) -> Arc<dyn ToolExecutor<C>> {
Arc::clone(&self.executor)
}
}
/// Model-visible metadata owned by an executable tool bundle.
#[derive(Clone)]
pub struct ToolDefinition {
pub name: ToolName,
pub spec: ToolSpec,
pub supports_parallel_tool_calls: bool,
}
/// Executable behavior for one contributed tool.
///
/// Implementations should keep host-specific needs inside `C`; tool owners that
/// do not require host state can implement the trait for any `C`.
pub trait ToolExecutor<C>: Send + Sync {
fn execute<'a>(&'a self, call: ToolCall<C>) -> ToolFuture<'a>;
/// Returns whether the call may mutate user state.
///
/// Hosts can use this conservative signal for serialization or approval
/// policy. Read-only tools should override this default.
fn is_mutating<'a>(&'a self, _call: &'a ToolCall<C>) -> BoolFuture<'a> {
Box::pin(async { true })
}
}
#[cfg(test)]
mod tests {
use std::sync::Arc;
use std::task::Context;
use std::task::Poll;
use std::task::Wake;
use std::task::Waker;
use super::*;
use crate::JsonToolOutput;
use crate::ToolInput;
struct DefaultMutatingExecutor;
impl ToolExecutor<()> for DefaultMutatingExecutor {
fn execute<'a>(&'a self, _call: ToolCall<()>) -> ToolFuture<'a> {
Box::pin(async {
Ok(Box::new(JsonToolOutput::new(serde_json::json!(null))) as Box<dyn ToolOutput>)
})
}
}
struct NoopWaker;
impl Wake for NoopWaker {
fn wake(self: Arc<Self>) {}
}
#[test]
fn contributed_tools_default_to_mutating() {
let call = ToolCall {
context: (),
call_id: "call-default-mutating".to_string(),
input: ToolInput::Function {
arguments: "{}".to_string(),
},
};
let mut future = DefaultMutatingExecutor.is_mutating(&call);
let waker = Waker::from(Arc::new(NoopWaker));
let mut context = Context::from_waker(&waker);
assert!(matches!(
future.as_mut().poll(&mut context),
Poll::Ready(true)
));
}
}
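The `is_mutating` default above is the conservative signal described in the commit message: a contributed tool counts as mutating unless it explicitly opts out. A standalone sketch of that contract with simplified stand-in types (no codex crates assumed; `Executor`, `ReadOnlyTool`, `UnknownTool`, and `poll_ready` are illustrative names):

```rust
use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};

/// Future returned by one mutability probe, mirroring `BoolFuture` above.
pub type BoolFuture<'a> = Pin<Box<dyn Future<Output = bool> + Send + 'a>>;

pub trait Executor: Send + Sync {
    /// Unknown tools are assumed to mutate user state unless they opt out.
    fn is_mutating(&self) -> BoolFuture<'_> {
        Box::pin(async { true })
    }
}

/// A read-only tool must override the default explicitly.
pub struct ReadOnlyTool;

impl Executor for ReadOnlyTool {
    fn is_mutating(&self) -> BoolFuture<'_> {
        Box::pin(async { false })
    }
}

/// A tool that keeps the default is treated as mutating by the host.
pub struct UnknownTool;

impl Executor for UnknownTool {}

/// Resolves an immediately-ready probe without a runtime via a no-op waker,
/// the same trick the regression test above uses.
pub fn poll_ready(mut fut: BoolFuture<'_>) -> bool {
    struct Noop;
    impl Wake for Noop {
        fn wake(self: Arc<Self>) {}
    }
    let waker = Waker::from(Arc::new(Noop));
    match fut.as_mut().poll(&mut Context::from_waker(&waker)) {
        Poll::Ready(value) => value,
        Poll::Pending => unreachable!("probe future is immediately ready"),
    }
}
```

Defaulting to `true` means a write-capable tool that forgets the override is serialized and gated by approval policy rather than silently bypassing mutating-tool safeguards.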


@@ -0,0 +1,14 @@
/// One executable tool call delivered to a contributed tool.
pub struct ToolCall<C> {
pub context: C,
pub call_id: String,
pub input: ToolInput,
}
/// Model-supplied input for the executable tool families currently exposed by
/// the shared tool seam.
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum ToolInput {
Function { arguments: String },
Freeform { input: String },
}
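A host dispatches on these two variants: function tools receive a JSON argument string, freeform tools the raw text. A minimal standalone sketch of that dispatch (the `Input` enum mirrors `ToolInput` so the example compiles without the crate):

```rust
/// Stand-in for `ToolInput` with the same two variants.
pub enum Input {
    Function { arguments: String },
    Freeform { input: String },
}

/// A dispatcher only needs to match on the variant to pick a decode path.
pub fn describe(input: &Input) -> String {
    match input {
        Input::Function { arguments } => format!("function args: {arguments}"),
        Input::Freeform { input } => format!("freeform: {input}"),
    }
}
```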


@@ -0,0 +1,22 @@
use thiserror::Error;
/// Error returned by a contributed executable tool.
#[derive(Clone, Debug, Error, PartialEq, Eq)]
pub enum ToolError {
#[error("{0}")]
RespondToModel(String),
#[error("fatal tool error: {0}")]
Fatal(String),
}
impl ToolError {
/// Creates a model-visible tool error.
pub fn respond_to_model(message: impl Into<String>) -> Self {
Self::RespondToModel(message.into())
}
/// Creates a host-fatal tool error.
pub fn fatal(message: impl Into<String>) -> Self {
Self::Fatal(message.into())
}
}


@@ -0,0 +1,17 @@
//! Reusable executable-tool contracts shared between hosts and tool owners.
mod bundle;
mod call;
mod error;
mod output;
pub use bundle::BoolFuture;
pub use bundle::ToolBundle;
pub use bundle::ToolDefinition;
pub use bundle::ToolExecutor;
pub use bundle::ToolFuture;
pub use call::ToolCall;
pub use call::ToolInput;
pub use error::ToolError;
pub use output::JsonToolOutput;
pub use output::ToolOutput;


@@ -0,0 +1,113 @@
use codex_protocol::models::FunctionCallOutputBody;
use codex_protocol::models::FunctionCallOutputPayload;
use codex_protocol::models::ResponseInputItem;
use serde::Serialize;
use serde_json::Value;
use crate::ToolError;
use crate::ToolInput;
/// Tool-owned output rendering for each host-facing boundary.
pub trait ToolOutput: Send {
fn log_preview(&self) -> String;
fn success_for_logging(&self) -> bool;
fn to_response_item(&self, call_id: &str, input: &ToolInput) -> ResponseInputItem;
/// Returns the stable value exposed to post-tool-use hook integration when a
/// host chooses to wire that surface for this tool.
fn post_tool_use_response(&self, _call_id: &str, _input: &ToolInput) -> Option<Value> {
None
}
fn code_mode_result(&self, input: &ToolInput) -> Value;
}
/// Convenience output for ordinary JSON-returning function tools.
#[derive(Clone, Debug)]
pub struct JsonToolOutput {
value: Value,
}
impl JsonToolOutput {
/// Creates a JSON output from a serializable value.
pub fn from_serializable(value: impl Serialize) -> Result<Self, ToolError> {
serde_json::to_value(value).map(Self::new).map_err(|err| {
ToolError::respond_to_model(format!("failed to serialize output: {err}"))
})
}
/// Creates a JSON output from an already materialized value.
pub fn new(value: Value) -> Self {
Self { value }
}
}
impl ToolOutput for JsonToolOutput {
fn log_preview(&self) -> String {
self.value.to_string()
}
fn success_for_logging(&self) -> bool {
true
}
fn to_response_item(&self, call_id: &str, _input: &ToolInput) -> ResponseInputItem {
ResponseInputItem::FunctionCallOutput {
call_id: call_id.to_string(),
output: FunctionCallOutputPayload {
body: FunctionCallOutputBody::Text(self.value.to_string()),
success: Some(true),
},
}
}
fn post_tool_use_response(&self, _call_id: &str, _input: &ToolInput) -> Option<Value> {
Some(self.value.clone())
}
fn code_mode_result(&self, _input: &ToolInput) -> Value {
self.value.clone()
}
}
#[cfg(test)]
mod tests {
use codex_protocol::models::FunctionCallOutputBody;
use codex_protocol::models::FunctionCallOutputPayload;
use codex_protocol::models::ResponseInputItem;
use pretty_assertions::assert_eq;
use serde_json::json;
use super::JsonToolOutput;
use super::ToolOutput;
use crate::ToolInput;
#[test]
fn json_tool_output_renders_function_output() {
let input = ToolInput::Function {
arguments: "{}".to_string(),
};
let output = JsonToolOutput::from_serializable(json!({ "ok": true }))
.expect("serializable value should produce json output");
assert_eq!(output.log_preview(), "{\"ok\":true}");
assert!(output.success_for_logging());
assert_eq!(
output.to_response_item("call-1", &input),
ResponseInputItem::FunctionCallOutput {
call_id: "call-1".to_string(),
output: FunctionCallOutputPayload {
body: FunctionCallOutputBody::Text("{\"ok\":true}".to_string()),
success: Some(true),
},
}
);
assert_eq!(
output.post_tool_use_response("call-1", &input),
Some(json!({ "ok": true }))
);
assert_eq!(output.code_mode_result(&input), json!({ "ok": true }));
}
}
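Beyond `JsonToolOutput`, any tool can own its rendering at each host boundary by implementing the trait. A simplified standalone mirror with the protocol types replaced by plain strings (`Output` and `TextOutput` are illustrative stand-ins), showing the default hook payload staying `None` unless a tool opts in:

```rust
/// Simplified mirror of the output seam: each tool renders itself per boundary.
pub trait Output {
    /// Short representation for host logs.
    fn log_preview(&self) -> String;

    /// Hook payload; `None` unless the tool opts in, matching the default above.
    fn post_tool_use_response(&self) -> Option<String> {
        None
    }
}

/// A plain-text output that only customizes the log rendering.
pub struct TextOutput(pub String);

impl Output for TextOutput {
    fn log_preview(&self) -> String {
        // Keep log lines short: preview at most 64 characters.
        if self.0.chars().count() > 64 {
            let head: String = self.0.chars().take(64).collect();
            format!("{head}...")
        } else {
            self.0.clone()
        }
    }
}
```

Keeping the hook surface as a defaulted method means only tools with a stable, hook-worthy payload (like `JsonToolOutput` above) pay the cost of defining one.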


@@ -929,9 +929,7 @@ approval_policy = "never"
pre_tool_use: vec![MatcherGroup {
matcher: Some("^Bash$".to_string()),
hooks: vec![HookHandlerConfig::Command {
command: codex_config::HookCommandConfig::Single(
"python3 /enterprise/hooks/pre.py".to_string(),
),
command: "python3 /enterprise/hooks/pre.py".to_string(),
timeout_sec: Some(10),
r#async: false,
status_message: Some("checking".to_string()),