mirror of https://github.com/openai/codex.git
synced 2026-02-05 08:23:41 +00:00

Compare commits: file-notif ... patch-guar
15 Commits
| Author | SHA1 | Date |
|---|---|---|
|  | 27018edc50 |  |
|  | 8a281cd1f4 |  |
|  | e8863b233b |  |
|  | 8fed0b53c4 |  |
|  | 00debb6399 |  |
|  | 0144fb4fab |  |
|  | 0a0a10d8b3 |  |
|  | 13035561cd |  |
|  | 9be704a934 |  |
|  | f7b4e29609 |  |
|  | d6c5df9a0a |  |
|  | 8662162f45 |  |
|  | 57584d6f34 |  |
|  | 268a10f917 |  |
|  | 5346cc422d |  |
.github/ISSUE_TEMPLATE/2-bug-report.yml (vendored, 22 changes)
@@ -20,6 +20,14 @@ body:
     attributes:
       label: What version of Codex is running?
       description: Copy the output of `codex --version`
+    validations:
+      required: true
+  - type: input
+    id: plan
+    attributes:
+      label: What subscription do you have?
+    validations:
+      required: true
   - type: input
     id: model
     attributes:
@@ -32,11 +40,18 @@ body:
       description: |
         For MacOS and Linux: copy the output of `uname -mprs`
         For Windows: copy the output of `"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"` in the PowerShell console
+  - type: textarea
+    id: actual
+    attributes:
+      label: What issue are you seeing?
+      description: Please include the full error messages and prompts with PII redacted. If possible, please provide text instead of a screenshot.
+    validations:
+      required: true
   - type: textarea
     id: steps
     attributes:
       label: What steps can reproduce the bug?
-      description: Explain the bug and provide a code snippet that can reproduce it.
+      description: Explain the bug and provide a code snippet that can reproduce it. Please include session id, token limit usage, context window usage if applicable.
     validations:
       required: true
   - type: textarea
@@ -44,11 +59,6 @@ body:
     attributes:
       label: What is the expected behavior?
       description: If possible, please provide text instead of a screenshot.
-  - type: textarea
-    id: actual
-    attributes:
-      label: What do you see instead?
-      description: If possible, please provide text instead of a screenshot.
   - type: textarea
     id: notes
     attributes:
.github/ISSUE_TEMPLATE/5-vs-code-extension.yml (vendored, 24 changes)
@@ -14,11 +14,21 @@ body:
     id: version
     attributes:
       label: What version of the VS Code extension are you using?
+    validations:
+      required: true
+  - type: input
+    id: plan
+    attributes:
+      label: What subscription do you have?
+    validations:
+      required: true
   - type: input
     id: ide
     attributes:
       label: Which IDE are you using?
       description: Like `VS Code`, `Cursor`, `Windsurf`, etc.
+    validations:
+      required: true
   - type: input
     id: platform
     attributes:
@@ -26,11 +36,18 @@ body:
       description: |
         For MacOS and Linux: copy the output of `uname -mprs`
         For Windows: copy the output of `"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"` in the PowerShell console
+  - type: textarea
+    id: actual
+    attributes:
+      label: What issue are you seeing?
+      description: Please include the full error messages and prompts with PII redacted. If possible, please provide text instead of a screenshot.
+    validations:
+      required: true
   - type: textarea
     id: steps
     attributes:
       label: What steps can reproduce the bug?
-      description: Explain the bug and provide a code snippet that can reproduce it.
+      description: Explain the bug and provide a code snippet that can reproduce it. Please include session id, token limit usage, context window usage if applicable.
     validations:
       required: true
   - type: textarea
@@ -38,11 +55,6 @@ body:
     attributes:
       label: What is the expected behavior?
       description: If possible, please provide text instead of a screenshot.
-  - type: textarea
-    id: actual
-    attributes:
-      label: What do you see instead?
-      description: If possible, please provide text instead of a screenshot.
   - type: textarea
     id: notes
     attributes:
codex-cli/bin/codex.js (Executable file → Normal file, 35 changes)
@@ -80,6 +80,32 @@ function getUpdatedPath(newDirs) {
   return updatedPath;
 }
 
+/**
+ * Use heuristics to detect the package manager that was used to install Codex
+ * in order to give the user a hint about how to update it.
+ */
+function detectPackageManager() {
+  const userAgent = process.env.npm_config_user_agent || "";
+  if (/\bbun\//.test(userAgent)) {
+    return "bun";
+  }
+
+  const execPath = process.env.npm_execpath || "";
+  if (execPath.includes("bun")) {
+    return "bun";
+  }
+
+  if (
+    process.env.BUN_INSTALL ||
+    process.env.BUN_INSTALL_GLOBAL_DIR ||
+    process.env.BUN_INSTALL_BIN_DIR
+  ) {
+    return "bun";
+  }
+
+  return userAgent ? "npm" : null;
+}
+
 const additionalDirs = [];
 const pathDir = path.join(archRoot, "path");
 if (existsSync(pathDir)) {
@@ -87,9 +113,16 @@ if (existsSync(pathDir)) {
 }
 const updatedPath = getUpdatedPath(additionalDirs);
 
+const env = { ...process.env, PATH: updatedPath };
+const packageManagerEnvVar =
+  detectPackageManager() === "bun"
+    ? "CODEX_MANAGED_BY_BUN"
+    : "CODEX_MANAGED_BY_NPM";
+env[packageManagerEnvVar] = "1";
+
 const child = spawn(binaryPath, process.argv.slice(2), {
   stdio: "inherit",
-  env: { ...process.env, PATH: updatedPath, CODEX_MANAGED_BY_NPM: "1" },
+  env,
 });
 
 child.on("error", (err) => {
codex-rs/Cargo.lock (generated, 77 changes)
@@ -1051,6 +1051,7 @@ dependencies = [
  "escargot",
  "eventsource-stream",
  "futures",
+ "ignore",
  "indexmap 2.10.0",
  "landlock",
  "libc",
@@ -1069,6 +1070,7 @@ dependencies = [
  "serde_json",
  "serial_test",
  "sha1",
+ "sha2",
  "shlex",
  "similar",
  "strum_macros 0.27.2",
@@ -1577,10 +1579,12 @@ dependencies = [
  "anyhow",
  "assert_cmd",
  "codex-core",
+ "notify",
  "regex-lite",
  "serde_json",
  "tempfile",
  "tokio",
+ "walkdir",
  "wiremock",
 ]
@@ -2371,6 +2375,15 @@ dependencies = [
  "percent-encoding",
 ]
 
+[[package]]
+name = "fsevent-sys"
+version = "4.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "76ee7a02da4d231650c7cea31349b889be2f45ddb3ef3032d2ec8185f6313fd2"
+dependencies = [
+ "libc",
+]
+
 [[package]]
 name = "futures"
 version = "0.3.31"
@@ -3057,6 +3070,26 @@ version = "2.0.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "f4c7245a08504955605670dbf141fceab975f15ca21570696aebe9d2e71576bd"
 
+[[package]]
+name = "inotify"
+version = "0.11.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "f37dccff2791ab604f9babef0ba14fbe0be30bd368dc541e2b08d07c8aa908f3"
+dependencies = [
+ "bitflags 2.9.1",
+ "inotify-sys",
+ "libc",
+]
+
+[[package]]
+name = "inotify-sys"
+version = "0.1.5"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "e05c02b5e89bff3b946cedeca278abc628fe811e604f027c45a8aa3cf793d0eb"
+dependencies = [
+ "libc",
+]
+
 [[package]]
 name = "inout"
 version = "0.1.4"
@@ -3257,6 +3290,26 @@ dependencies = [
  "zeroize",
 ]
 
+[[package]]
+name = "kqueue"
+version = "1.1.1"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "eac30106d7dce88daf4a3fcb4879ea939476d5074a9b7ddd0fb97fa4bed5596a"
+dependencies = [
+ "kqueue-sys",
+ "libc",
+]
+
+[[package]]
+name = "kqueue-sys"
+version = "1.0.4"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "ed9625ffda8729b85e45cf04090035ac368927b8cebc34898e7c120f52e4838b"
+dependencies = [
+ "bitflags 1.3.2",
+ "libc",
+]
+
 [[package]]
 name = "lalrpop"
 version = "0.19.12"
@@ -3656,6 +3709,30 @@ version = "0.3.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "61807f77802ff30975e01f4f071c8ba10c022052f98b3294119f3e615d13e5be"
 
+[[package]]
+name = "notify"
+version = "8.2.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "4d3d07927151ff8575b7087f245456e549fea62edf0ec4e565a5ee50c8402bc3"
+dependencies = [
+ "bitflags 2.9.1",
+ "fsevent-sys",
+ "inotify",
+ "kqueue",
+ "libc",
+ "log",
+ "mio",
+ "notify-types",
+ "walkdir",
+ "windows-sys 0.60.2",
+]
+
+[[package]]
+name = "notify-types"
+version = "2.0.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "5e0826a989adedc2a244799e823aece04662b66609d96af8dff7ac6df9a8925d"
+
 [[package]]
 name = "nu-ansi-term"
 version = "0.50.1"
@@ -122,6 +122,7 @@ log = "0.4"
|
||||
maplit = "1.0.2"
|
||||
mime_guess = "2.0.5"
|
||||
multimap = "0.10.0"
|
||||
notify = "8.2.0"
|
||||
nucleo-matcher = "0.3.1"
|
||||
openssl-sys = "*"
|
||||
opentelemetry = "0.30.0"
|
||||
|
||||
@@ -26,6 +26,8 @@ use supports_color::Stream;
 mod mcp_cmd;
 
 use crate::mcp_cmd::McpCli;
+use codex_core::config::Config;
+use codex_core::config::ConfigOverrides;
 
 /// Codex CLI
 ///
@@ -45,6 +47,9 @@ struct MultitoolCli {
     #[clap(flatten)]
     pub config_overrides: CliConfigOverrides,
 
+    #[clap(flatten)]
+    pub feature_toggles: FeatureToggles,
+
     #[clap(flatten)]
     interactive: TuiCli,
 
@@ -97,6 +102,9 @@ enum Subcommand {
     /// Internal: run the responses API proxy.
     #[clap(hide = true)]
     ResponsesApiProxy(ResponsesApiProxyArgs),
+
+    /// Inspect feature flags.
+    Features(FeaturesCli),
 }
 
 #[derive(Debug, Parser)]
@@ -231,6 +239,53 @@ fn print_exit_messages(exit_info: AppExitInfo) {
     }
 }
 
+#[derive(Debug, Default, Parser, Clone)]
+struct FeatureToggles {
+    /// Enable a feature (repeatable). Equivalent to `-c features.<name>=true`.
+    #[arg(long = "enable", value_name = "FEATURE", action = clap::ArgAction::Append, global = true)]
+    enable: Vec<String>,
+
+    /// Disable a feature (repeatable). Equivalent to `-c features.<name>=false`.
+    #[arg(long = "disable", value_name = "FEATURE", action = clap::ArgAction::Append, global = true)]
+    disable: Vec<String>,
+}
+
+impl FeatureToggles {
+    fn to_overrides(&self) -> Vec<String> {
+        let mut v = Vec::new();
+        for k in &self.enable {
+            v.push(format!("features.{k}=true"));
+        }
+        for k in &self.disable {
+            v.push(format!("features.{k}=false"));
+        }
+        v
+    }
+}
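With the toggles above, `codex --enable web_search --disable plan_tool` behaves like `-c features.web_search=true -c features.plan_tool=false`. A minimal sketch of the mapping (the feature names here are hypothetical, not taken from this diff):

// Sketch only: "web_search" and "plan_tool" are made-up feature names.
let toggles = FeatureToggles {
    enable: vec!["web_search".to_string()],
    disable: vec!["plan_tool".to_string()],
};
assert_eq!(
    toggles.to_overrides(),
    vec![
        "features.web_search=true".to_string(),
        "features.plan_tool=false".to_string(),
    ]
);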
+#[derive(Debug, Parser)]
+struct FeaturesCli {
+    #[command(subcommand)]
+    sub: FeaturesSubcommand,
+}
+
+#[derive(Debug, Parser)]
+enum FeaturesSubcommand {
+    /// List known features with their stage and effective state.
+    List,
+}
+
+fn stage_str(stage: codex_core::features::Stage) -> &'static str {
+    use codex_core::features::Stage;
+    match stage {
+        Stage::Experimental => "experimental",
+        Stage::Beta => "beta",
+        Stage::Stable => "stable",
+        Stage::Deprecated => "deprecated",
+        Stage::Removed => "removed",
+    }
+}
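Given `stage_str` and the `Features` handler further down, `codex features list` prints one tab-separated `name stage enabled` row per feature. A hypothetical sample (feature names assumed, not from this diff):

web_search	stable	true
unified_exec	experimental	false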
 /// As early as possible in the process lifecycle, apply hardening measures. We
 /// skip this in debug builds to avoid interfering with debugging.
 #[ctor::ctor]
@@ -248,11 +303,17 @@ fn main() -> anyhow::Result<()> {
 
 async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()> {
     let MultitoolCli {
-        config_overrides: root_config_overrides,
+        config_overrides: mut root_config_overrides,
+        feature_toggles,
         mut interactive,
         subcommand,
     } = MultitoolCli::parse();
 
+    // Fold --enable/--disable into config overrides so they flow to all subcommands.
+    root_config_overrides
+        .raw_overrides
+        .extend(feature_toggles.to_overrides());
+
     match subcommand {
         None => {
             prepend_config_flags(
@@ -381,6 +442,30 @@ async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()
         Some(Subcommand::GenerateTs(gen_cli)) => {
             codex_protocol_ts::generate_ts(&gen_cli.out_dir, gen_cli.prettier.as_deref())?;
         }
+        Some(Subcommand::Features(FeaturesCli { sub })) => match sub {
+            FeaturesSubcommand::List => {
+                // Respect root-level `-c` overrides plus top-level flags like `--profile`.
+                let cli_kv_overrides = root_config_overrides
+                    .parse_overrides()
+                    .map_err(|e| anyhow::anyhow!(e))?;
+
+                // Thread through relevant top-level flags (at minimum, `--profile`).
+                // Also honor `--search` since it maps to a feature toggle.
+                let overrides = ConfigOverrides {
+                    config_profile: interactive.config_profile.clone(),
+                    tools_web_search_request: interactive.web_search.then_some(true),
+                    ..Default::default()
+                };
+
+                let config = Config::load_with_cli_overrides(cli_kv_overrides, overrides).await?;
+                for def in codex_core::features::FEATURES.iter() {
+                    let name = def.key;
+                    let stage = stage_str(def.stage);
+                    let enabled = config.features.enabled(def.id);
+                    println!("{name}\t{stage}\t{enabled}");
+                }
+            }
+        },
     }
 
     Ok(())
@@ -484,6 +569,7 @@ mod tests {
         interactive,
         config_overrides: root_overrides,
         subcommand,
+        feature_toggles: _,
     } = cli;
 
     let Subcommand::Resume(ResumeCommand {
@@ -13,10 +13,12 @@ use codex_core::config::load_global_mcp_servers;
 use codex_core::config::write_global_mcp_servers;
 use codex_core::config_types::McpServerConfig;
 use codex_core::config_types::McpServerTransportConfig;
+use codex_core::features::Feature;
+use codex_core::mcp::auth::compute_auth_statuses;
 use codex_core::protocol::McpAuthStatus;
 use codex_rmcp_client::delete_oauth_tokens;
 use codex_rmcp_client::perform_oauth_login;
 use codex_rmcp_client::supports_oauth_login;
 
 /// [experimental] Launch Codex as an MCP server or manage configured MCP servers.
 ///
@@ -189,7 +191,10 @@ impl McpCli {
 
 async fn run_add(config_overrides: &CliConfigOverrides, add_args: AddArgs) -> Result<()> {
-    // Validate any provided overrides even though they are not currently applied.
-    config_overrides.parse_overrides().map_err(|e| anyhow!(e))?;
+    let overrides = config_overrides.parse_overrides().map_err(|e| anyhow!(e))?;
+    let config = Config::load_with_cli_overrides(overrides, ConfigOverrides::default())
+        .await
+        .context("failed to load configuration")?;
 
     let AddArgs {
         name,
@@ -225,17 +230,21 @@ async fn run_add(config_overrides: &CliConfigOverrides, add_args: AddArgs) -> Re
             }
         }
         AddMcpTransportArgs {
-            streamable_http: Some(streamable_http),
+            streamable_http:
+                Some(AddMcpStreamableHttpArgs {
+                    url,
+                    bearer_token_env_var,
+                }),
             ..
         } => McpServerTransportConfig::StreamableHttp {
-            url: streamable_http.url,
-            bearer_token_env_var: streamable_http.bearer_token_env_var,
+            url,
+            bearer_token_env_var,
        },
         AddMcpTransportArgs { .. } => bail!("exactly one of --command or --url must be provided"),
     };
 
     let new_entry = McpServerConfig {
-        transport,
+        transport: transport.clone(),
         enabled: true,
         startup_timeout_sec: None,
         tool_timeout_sec: None,
@@ -248,6 +257,17 @@ async fn run_add(config_overrides: &CliConfigOverrides, add_args: AddArgs) -> Re
 
     println!("Added global MCP server '{name}'.");
 
+    if let McpServerTransportConfig::StreamableHttp {
+        url,
+        bearer_token_env_var: None,
+    } = transport
+        && matches!(supports_oauth_login(&url).await, Ok(true))
+    {
+        println!("Detected OAuth support. Starting OAuth flow…");
+        perform_oauth_login(&name, &url, config.mcp_oauth_credentials_store_mode).await?;
+        println!("Successfully logged in.");
+    }
+
     Ok(())
 }
 
@@ -285,7 +305,7 @@ async fn run_login(config_overrides: &CliConfigOverrides, login_args: LoginArgs)
         .await
         .context("failed to load configuration")?;
 
-    if !config.use_experimental_use_rmcp_client {
+    if !config.features.enabled(Feature::RmcpClient) {
         bail!(
             "OAuth login is only supported when experimental_use_rmcp_client is true in config.toml."
         );
@@ -1,3 +1,4 @@
+use clap::Args;
 use clap::Parser;
 use codex_common::CliConfigOverrides;
 
@@ -6,4 +7,43 @@ use codex_common::CliConfigOverrides;
 pub struct Cli {
     #[clap(skip)]
     pub config_overrides: CliConfigOverrides,
+
+    #[command(subcommand)]
+    pub command: Option<Command>,
 }
+
+#[derive(Debug, clap::Subcommand)]
+pub enum Command {
+    /// Submit a new Codex Cloud task without launching the TUI.
+    Exec(ExecCommand),
+}
+
+#[derive(Debug, Args)]
+pub struct ExecCommand {
+    /// Task prompt to run in Codex Cloud.
+    #[arg(value_name = "QUERY")]
+    pub query: Option<String>,
+
+    /// Target environment identifier (see `codex cloud` to browse).
+    #[arg(long = "env", value_name = "ENV_ID")]
+    pub environment: String,
+
+    /// Number of assistant attempts (best-of-N).
+    #[arg(
+        long = "attempts",
+        default_value_t = 1usize,
+        value_parser = parse_attempts
+    )]
+    pub attempts: usize,
+}
+
+fn parse_attempts(input: &str) -> Result<usize, String> {
+    let value: usize = input
+        .parse()
+        .map_err(|_| "attempts must be an integer between 1 and 4".to_string())?;
+    if (1..=4).contains(&value) {
+        Ok(value)
+    } else {
+        Err("attempts must be between 1 and 4".to_string())
+    }
+}
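A quick sketch of how `parse_attempts` behaves:

// Accepts only integers in 1..=4.
assert_eq!(parse_attempts("1"), Ok(1));
assert_eq!(parse_attempts("4"), Ok(4));
assert!(parse_attempts("0").is_err());    // parses, but out of range
assert!(parse_attempts("five").is_err()); // not an integer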
@@ -7,7 +7,9 @@ mod ui;
 pub mod util;
 pub use cli::Cli;
 
+use anyhow::anyhow;
 use std::io::IsTerminal;
+use std::io::Read;
 use std::path::PathBuf;
 use std::sync::Arc;
 use std::time::Duration;
@@ -23,6 +25,175 @@ struct ApplyJob {
     diff_override: Option<String>,
 }
 
+struct BackendContext {
+    backend: Arc<dyn codex_cloud_tasks_client::CloudBackend>,
+    base_url: String,
+}
+
+async fn init_backend(user_agent_suffix: &str) -> anyhow::Result<BackendContext> {
+    let use_mock = matches!(
+        std::env::var("CODEX_CLOUD_TASKS_MODE").ok().as_deref(),
+        Some("mock") | Some("MOCK")
+    );
+    let base_url = std::env::var("CODEX_CLOUD_TASKS_BASE_URL")
+        .unwrap_or_else(|_| "https://chatgpt.com/backend-api".to_string());
+
+    set_user_agent_suffix(user_agent_suffix);
+
+    if use_mock {
+        return Ok(BackendContext {
+            backend: Arc::new(codex_cloud_tasks_client::MockClient),
+            base_url,
+        });
+    }
+
+    let ua = codex_core::default_client::get_codex_user_agent();
+    let mut http = codex_cloud_tasks_client::HttpClient::new(base_url.clone())?.with_user_agent(ua);
+    let style = if base_url.contains("/backend-api") {
+        "wham"
+    } else {
+        "codex-api"
+    };
+    append_error_log(format!("startup: base_url={base_url} path_style={style}"));
+
+    let auth = match codex_core::config::find_codex_home()
+        .ok()
+        .map(|home| codex_login::AuthManager::new(home, false))
+        .and_then(|am| am.auth())
+    {
+        Some(auth) => auth,
+        None => {
+            eprintln!(
+                "Not signed in. Please run 'codex login' to sign in with ChatGPT, then re-run 'codex cloud'."
+            );
+            std::process::exit(1);
+        }
+    };
+
+    if let Some(acc) = auth.get_account_id() {
+        append_error_log(format!("auth: mode=ChatGPT account_id={acc}"));
+    }
+
+    let token = match auth.get_token().await {
+        Ok(t) if !t.is_empty() => t,
+        _ => {
+            eprintln!(
+                "Not signed in. Please run 'codex login' to sign in with ChatGPT, then re-run 'codex cloud'."
+            );
+            std::process::exit(1);
+        }
+    };
+
+    http = http.with_bearer_token(token.clone());
+    if let Some(acc) = auth
+        .get_account_id()
+        .or_else(|| util::extract_chatgpt_account_id(&token))
+    {
+        append_error_log(format!("auth: set ChatGPT-Account-Id header: {acc}"));
+        http = http.with_chatgpt_account_id(acc);
+    }
+
+    Ok(BackendContext {
+        backend: Arc::new(http),
+        base_url,
+    })
+}
+
+async fn run_exec_command(args: crate::cli::ExecCommand) -> anyhow::Result<()> {
+    let crate::cli::ExecCommand {
+        query,
+        environment,
+        attempts,
+    } = args;
+    let ctx = init_backend("codex_cloud_tasks_exec").await?;
+    let prompt = resolve_query_input(query)?;
+    let env_id = resolve_environment_id(&ctx, &environment).await?;
+    let created = codex_cloud_tasks_client::CloudBackend::create_task(
+        &*ctx.backend,
+        &env_id,
+        &prompt,
+        "main",
+        false,
+        attempts,
+    )
+    .await?;
+    let url = util::task_url(&ctx.base_url, &created.id.0);
+    println!("{url}");
+    Ok(())
+}
+
+async fn resolve_environment_id(ctx: &BackendContext, requested: &str) -> anyhow::Result<String> {
+    let trimmed = requested.trim();
+    if trimmed.is_empty() {
+        return Err(anyhow!("environment id must not be empty"));
+    }
+    let normalized = util::normalize_base_url(&ctx.base_url);
+    let headers = util::build_chatgpt_headers().await;
+    let environments = crate::env_detect::list_environments(&normalized, &headers).await?;
+    if environments.is_empty() {
+        return Err(anyhow!(
+            "no cloud environments are available for this workspace"
+        ));
+    }
+
+    if let Some(row) = environments.iter().find(|row| row.id == trimmed) {
+        return Ok(row.id.clone());
+    }
+
+    let label_matches = environments
+        .iter()
+        .filter(|row| {
+            row.label
+                .as_deref()
+                .map(|label| label.eq_ignore_ascii_case(trimmed))
+                .unwrap_or(false)
+        })
+        .collect::<Vec<_>>();
+    match label_matches.as_slice() {
+        [] => Err(anyhow!(
+            "environment '{trimmed}' not found; run `codex cloud` to list available environments"
+        )),
+        [single] => Ok(single.id.clone()),
+        [first, rest @ ..] => {
+            let first_id = &first.id;
+            if rest.iter().all(|row| row.id == *first_id) {
+                Ok(first_id.clone())
+            } else {
+                Err(anyhow!(
+                    "environment label '{trimmed}' is ambiguous; run `codex cloud` to pick the desired environment id"
+                ))
+            }
+        }
+    }
+}
+
+fn resolve_query_input(query_arg: Option<String>) -> anyhow::Result<String> {
+    match query_arg {
+        Some(q) if q != "-" => Ok(q),
+        maybe_dash => {
+            let force_stdin = matches!(maybe_dash.as_deref(), Some("-"));
+            if std::io::stdin().is_terminal() && !force_stdin {
+                return Err(anyhow!(
+                    "no query provided. Pass one as an argument or pipe it via stdin."
+                ));
+            }
+            if !force_stdin {
+                eprintln!("Reading query from stdin...");
+            }
+            let mut buffer = String::new();
+            std::io::stdin()
+                .read_to_string(&mut buffer)
+                .map_err(|e| anyhow!("failed to read query from stdin: {e}"))?;
+            if buffer.trim().is_empty() {
+                return Err(anyhow!(
+                    "no query provided via stdin (received empty input)."
+                ));
+            }
+            Ok(buffer)
+        }
+    }
+}
+
 fn level_from_status(status: codex_cloud_tasks_client::ApplyStatus) -> app::ApplyResultLevel {
     match status {
         codex_cloud_tasks_client::ApplyStatus::Success => app::ApplyResultLevel::Success,
@@ -148,7 +319,14 @@ fn spawn_apply(
 // (no standalone patch summarizer needed – UI displays raw diffs)
 
 /// Entry point for the `codex cloud` subcommand.
-pub async fn run_main(_cli: Cli, _codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()> {
+pub async fn run_main(cli: Cli, _codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()> {
+    if let Some(command) = cli.command {
+        return match command {
+            crate::cli::Command::Exec(args) => run_exec_command(args).await,
+        };
+    }
+    let Cli { .. } = cli;
 
     // Very minimal logging setup; mirrors other crates' pattern.
     let default_level = "error";
     let _ = tracing_subscriber::fmt()
@@ -162,72 +340,8 @@ pub async fn run_main(_cli: Cli, _codex_linux_sandbox_exe: Option<PathBuf>) -> a
     .try_init();
 
     info!("Launching Cloud Tasks list UI");
-    set_user_agent_suffix("codex_cloud_tasks_tui");
-
-    // Default to online unless explicitly configured to use mock.
-    let use_mock = matches!(
-        std::env::var("CODEX_CLOUD_TASKS_MODE").ok().as_deref(),
-        Some("mock") | Some("MOCK")
-    );
-
-    let backend: Arc<dyn codex_cloud_tasks_client::CloudBackend> = if use_mock {
-        Arc::new(codex_cloud_tasks_client::MockClient)
-    } else {
-        // Build an HTTP client against the configured (or default) base URL.
-        let base_url = std::env::var("CODEX_CLOUD_TASKS_BASE_URL")
-            .unwrap_or_else(|_| "https://chatgpt.com/backend-api".to_string());
-        let ua = codex_core::default_client::get_codex_user_agent();
-        let mut http =
-            codex_cloud_tasks_client::HttpClient::new(base_url.clone())?.with_user_agent(ua);
-        // Log which base URL and path style we're going to use.
-        let style = if base_url.contains("/backend-api") {
-            "wham"
-        } else {
-            "codex-api"
-        };
-        append_error_log(format!("startup: base_url={base_url} path_style={style}"));
-
-        // Require ChatGPT login (SWIC). Exit with a clear message if missing.
-        let _token = match codex_core::config::find_codex_home()
-            .ok()
-            .map(|home| codex_login::AuthManager::new(home, false))
-            .and_then(|am| am.auth())
-        {
-            Some(auth) => {
-                // Log account context for debugging workspace selection.
-                if let Some(acc) = auth.get_account_id() {
-                    append_error_log(format!("auth: mode=ChatGPT account_id={acc}"));
-                }
-                match auth.get_token().await {
-                    Ok(t) if !t.is_empty() => {
-                        // Attach token and ChatGPT-Account-Id header if available
-                        http = http.with_bearer_token(t.clone());
-                        if let Some(acc) = auth
-                            .get_account_id()
-                            .or_else(|| util::extract_chatgpt_account_id(&t))
-                        {
-                            append_error_log(format!("auth: set ChatGPT-Account-Id header: {acc}"));
-                            http = http.with_chatgpt_account_id(acc);
-                        }
-                        t
-                    }
-                    _ => {
-                        eprintln!(
-                            "Not signed in. Please run 'codex login' to sign in with ChatGPT, then re-run 'codex cloud'."
-                        );
-                        std::process::exit(1);
-                    }
-                }
-            }
-            None => {
-                eprintln!(
-                    "Not signed in. Please run 'codex login' to sign in with ChatGPT, then re-run 'codex cloud'."
-                );
-                std::process::exit(1);
-            }
-        };
-        Arc::new(http)
-    };
+    let BackendContext { backend, .. } = init_backend("codex_cloud_tasks_tui").await?;
+    let backend = backend;
 
     // Terminal setup
     use crossterm::ExecutableCommand;
@@ -91,3 +91,18 @@ pub async fn build_chatgpt_headers() -> HeaderMap {
     }
     headers
 }
+
+/// Construct a browser-friendly task URL for the given backend base URL.
+pub fn task_url(base_url: &str, task_id: &str) -> String {
+    let normalized = normalize_base_url(base_url);
+    if let Some(root) = normalized.strip_suffix("/backend-api") {
+        return format!("{root}/codex/tasks/{task_id}");
+    }
+    if let Some(root) = normalized.strip_suffix("/api/codex") {
+        return format!("{root}/codex/tasks/{task_id}");
+    }
+    if normalized.ends_with("/codex") {
+        return format!("{normalized}/tasks/{task_id}");
+    }
+    format!("{normalized}/codex/tasks/{task_id}")
+}
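Assuming `normalize_base_url` (not shown in this diff) only trims the URL, the branches map the known base-URL shapes like so; the task id is made up:

// Sketch: one example per branch of task_url.
assert_eq!(
    task_url("https://chatgpt.com/backend-api", "task_123"),
    "https://chatgpt.com/codex/tasks/task_123"
);
assert_eq!(
    task_url("https://example.com/api/codex", "task_123"),
    "https://example.com/codex/tasks/task_123"
);
assert_eq!(
    task_url("https://example.com/codex", "task_123"),
    "https://example.com/codex/tasks/task_123"
);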
@@ -33,6 +33,7 @@ env-flags = { workspace = true }
 eventsource-stream = { workspace = true }
 futures = { workspace = true }
 indexmap = { workspace = true }
+ignore = { workspace = true }
 libc = { workspace = true }
 mcp-types = { workspace = true }
 os_info = { workspace = true }
@@ -43,6 +44,7 @@ reqwest = { workspace = true, features = ["json", "stream"] }
 serde = { workspace = true, features = ["derive"] }
 serde_json = { workspace = true }
 sha1 = { workspace = true }
+sha2 = { workspace = true }
 shlex = { workspace = true }
 similar = { workspace = true }
 strum_macros = { workspace = true }
@@ -47,6 +47,7 @@ use crate::openai_tools::create_tools_json_for_responses_api;
 use crate::protocol::RateLimitSnapshot;
 use crate::protocol::RateLimitWindow;
 use crate::protocol::TokenUsage;
+use crate::state::TaskKind;
 use crate::token_data::PlanType;
 use crate::util::backoff;
 use codex_otel::otel_event_manager::OtelEventManager;
@@ -123,8 +124,16 @@ impl ModelClient {
     /// the provider config. Public callers always invoke `stream()` – the
     /// specialised helpers are private to avoid accidental misuse.
     pub async fn stream(&self, prompt: &Prompt) -> Result<ResponseStream> {
+        self.stream_with_task_kind(prompt, TaskKind::Regular).await
+    }
+
+    pub(crate) async fn stream_with_task_kind(
+        &self,
+        prompt: &Prompt,
+        task_kind: TaskKind,
+    ) -> Result<ResponseStream> {
         match self.provider.wire_api {
-            WireApi::Responses => self.stream_responses(prompt).await,
+            WireApi::Responses => self.stream_responses(prompt, task_kind).await,
             WireApi::Chat => {
                 // Create the raw streaming connection first.
                 let response_stream = stream_chat_completions(
@@ -165,7 +174,11 @@ impl ModelClient {
     }
 
     /// Implementation for the OpenAI *Responses* experimental API.
-    async fn stream_responses(&self, prompt: &Prompt) -> Result<ResponseStream> {
+    async fn stream_responses(
+        &self,
+        prompt: &Prompt,
+        task_kind: TaskKind,
+    ) -> Result<ResponseStream> {
         if let Some(path) = &*CODEX_RS_SSE_FIXTURE {
             // short circuit for tests
             warn!(path, "Streaming from fixture");
@@ -244,7 +257,7 @@ impl ModelClient {
         let max_attempts = self.provider.request_max_retries();
         for attempt in 0..=max_attempts {
             match self
-                .attempt_stream_responses(attempt, &payload_json, &auth_manager)
+                .attempt_stream_responses(attempt, &payload_json, &auth_manager, task_kind)
                 .await
             {
                 Ok(stream) => {
@@ -272,6 +285,7 @@ impl ModelClient {
         attempt: u64,
         payload_json: &Value,
         auth_manager: &Option<Arc<AuthManager>>,
+        task_kind: TaskKind,
     ) -> std::result::Result<ResponseStream, StreamAttemptError> {
         // Always fetch the latest auth in case a prior attempt refreshed the token.
         let auth = auth_manager.as_ref().and_then(|m| m.auth());
@@ -294,6 +308,7 @@ impl ModelClient {
             .header("conversation_id", self.conversation_id.to_string())
             .header("session_id", self.conversation_id.to_string())
             .header(reqwest::header::ACCEPT, "text/event-stream")
+            .header("Codex-Task-Type", task_kind.header_value())
             .json(payload_json);
 
         if let Some(auth) = auth.as_ref()
codex-rs/core/src/codebase_change_notice.rs (new file, 168 lines)
@@ -0,0 +1,168 @@
use std::fmt::Write;

use codex_protocol::models::ContentItem;
use codex_protocol::models::ResponseItem;

use crate::codebase_snapshot::SnapshotDiff;

pub(crate) const CODEBASE_CHANGE_NOTICE_MAX_PATHS: usize = 40;

#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) struct CodebaseChangeNotice {
    added: Vec<String>,
    removed: Vec<String>,
    modified: Vec<String>,
    truncated: bool,
}

impl CodebaseChangeNotice {
    pub(crate) fn new(diff: SnapshotDiff, limit: usize) -> Self {
        let mut remaining = limit;
        let mut truncated = false;

        let added = take_paths(diff.added, &mut remaining, &mut truncated);
        let removed = take_paths(diff.removed, &mut remaining, &mut truncated);
        let modified = take_paths(diff.modified, &mut remaining, &mut truncated);

        Self {
            added,
            removed,
            modified,
            truncated,
        }
    }

    pub(crate) fn is_empty(&self) -> bool {
        self.added.is_empty() && self.removed.is_empty() && self.modified.is_empty()
    }

    pub(crate) fn serialize_to_xml(&self) -> String {
        let mut output = String::new();
        if self.truncated {
            let _ = writeln!(output, "<codebase_changes truncated=\"true\">");
        } else {
            let _ = writeln!(output, "<codebase_changes>");
        }

        let mut summary_parts = Vec::new();
        if !self.added.is_empty() {
            summary_parts.push(format!("added {}", self.added.len()));
        }
        if !self.removed.is_empty() {
            summary_parts.push(format!("removed {}", self.removed.len()));
        }
        if !self.modified.is_empty() {
            summary_parts.push(format!("modified {}", self.modified.len()));
        }

        if summary_parts.is_empty() {
            let _ = writeln!(output, " <summary>no changes</summary>");
        } else {
            let summary = summary_parts.join(", ");
            let _ = writeln!(output, " <summary>{summary}</summary>");
        }

        serialize_section(&mut output, "added", &self.added);
        serialize_section(&mut output, "removed", &self.removed);
        serialize_section(&mut output, "modified", &self.modified);
        if self.truncated {
            let _ = writeln!(output, " <note>additional paths omitted</note>");
        }

        let _ = writeln!(output, "</codebase_changes>");
        output
    }
}
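For a diff with one added and one modified path (as in the `serializes_notice` test further down), `serialize_to_xml` produces output along these lines:

<codebase_changes>
 <summary>added 1, modified 1</summary>
 <added>
 <path>src/lib.rs</path>
 </added>
 <modified>
 <path>src/main.rs</path>
 </modified>
</codebase_changes>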
fn take_paths(mut paths: Vec<String>, remaining: &mut usize, truncated: &mut bool) -> Vec<String> {
    if *remaining == 0 {
        if !paths.is_empty() {
            *truncated = true;
        }
        return Vec::new();
    }

    if paths.len() > *remaining {
        paths.truncate(*remaining);
        *truncated = true;
    }

    *remaining -= paths.len();
    paths
}
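The budget logic in `take_paths`, sketched:

// With a remaining budget of 1, only the first path survives, the notice
// is flagged truncated, and the budget drops to zero.
let mut remaining = 1;
let mut truncated = false;
let kept = take_paths(
    vec!["a.rs".to_string(), "b.rs".to_string()],
    &mut remaining,
    &mut truncated,
);
assert_eq!(kept, vec!["a.rs".to_string()]);
assert!(truncated);
assert_eq!(remaining, 0);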
fn serialize_section(output: &mut String, tag: &str, paths: &[String]) {
    if paths.is_empty() {
        return;
    }

    let _ = writeln!(output, " <{tag}>");
    for path in paths {
        let _ = writeln!(output, " <path>{}</path>", escape_xml(path));
    }
    let _ = writeln!(output, " </{tag}>");
}

fn escape_xml(value: &str) -> String {
    let mut escaped = String::with_capacity(value.len());
    for ch in value.chars() {
        match ch {
            '&' => escaped.push_str("&amp;"),
            '<' => escaped.push_str("&lt;"),
            '>' => escaped.push_str("&gt;"),
            '"' => escaped.push_str("&quot;"),
            '\'' => escaped.push_str("&#39;"),
            other => escaped.push(other),
        }
    }
    escaped
}

impl From<CodebaseChangeNotice> for ResponseItem {
    fn from(notice: CodebaseChangeNotice) -> Self {
        ResponseItem::Message {
            id: None,
            role: "user".to_string(),
            content: vec![ContentItem::InputText {
                text: notice.serialize_to_xml(),
            }],
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use pretty_assertions::assert_eq;

    #[test]
    fn constructs_notice_with_limit() {
        let diff = SnapshotDiff {
            added: vec!["a.rs".to_string(), "b.rs".to_string()],
            removed: vec!["c.rs".to_string()],
            modified: vec!["d.rs".to_string(), "e.rs".to_string()],
        };

        let notice = CodebaseChangeNotice::new(diff, 3);
        assert!(notice.truncated);
        assert_eq!(
            notice.added.len() + notice.removed.len() + notice.modified.len(),
            3
        );
    }

    #[test]
    fn serializes_notice() {
        let diff = SnapshotDiff {
            added: vec!["src/lib.rs".to_string()],
            removed: Vec::new(),
            modified: vec!["src/main.rs".to_string()],
        };
        let notice = CodebaseChangeNotice::new(diff, CODEBASE_CHANGE_NOTICE_MAX_PATHS);
        let xml = notice.serialize_to_xml();
        assert!(xml.contains("<added>"));
        assert!(xml.contains("<modified>"));
        assert!(xml.contains("src/lib.rs"));
        assert!(xml.contains("src/main.rs"));
    }
}
codex-rs/core/src/codebase_snapshot.rs (new file, 278 lines)
@@ -0,0 +1,278 @@
use std::borrow::Cow;
use std::collections::BTreeMap;
use std::fs::File;
use std::io::Read;
use std::path::Path;
use std::path::PathBuf;
use std::time::SystemTime;

use anyhow::Context;
use anyhow::Result;
use ignore::WalkBuilder;
use sha2::Digest;
use sha2::Sha256;
use tokio::task;
use tracing::warn;

#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) struct CodebaseSnapshot {
    root: PathBuf,
    entries: BTreeMap<String, EntryFingerprint>,
    root_digest: DigestBytes,
}

#[derive(Clone, Debug, PartialEq, Eq)]
pub(crate) struct EntryFingerprint {
    pub kind: EntryKind,
    pub digest: DigestBytes,
    pub size: u64,
    pub modified_millis: Option<u128>,
}

#[derive(Clone, Copy, Debug, PartialEq, Eq)]
#[repr(u8)]
pub(crate) enum EntryKind {
    File,
    Symlink,
}

#[derive(Clone, Debug, PartialEq, Eq, Default)]
pub(crate) struct SnapshotDiff {
    pub added: Vec<String>,
    pub removed: Vec<String>,
    pub modified: Vec<String>,
}

impl SnapshotDiff {
    pub fn is_empty(&self) -> bool {
        self.added.is_empty() && self.removed.is_empty() && self.modified.is_empty()
    }
}

pub(crate) type DigestBytes = [u8; 32];

impl CodebaseSnapshot {
    pub(crate) async fn capture(root: PathBuf) -> Result<Self> {
        task::spawn_blocking(move || Self::from_disk(&root))
            .await
            .map_err(|e| anyhow::anyhow!("codebase snapshot task failed: {e}"))?
    }

    pub(crate) fn from_disk(root: &Path) -> Result<Self> {
        if !root.exists() {
            return Ok(Self::empty(root));
        }

        let mut entries: BTreeMap<String, EntryFingerprint> = BTreeMap::new();

        let mut walker = WalkBuilder::new(root);
        walker
            .hidden(false)
            .git_ignore(true)
            .git_exclude(true)
            .parents(true)
            .ignore(true)
            .follow_links(false);

        for result in walker.build() {
            let entry = match result {
                Ok(entry) => entry,
                Err(err) => {
                    warn!("codebase snapshot failed to read entry: {err}");
                    continue;
                }
            };

            let path = entry.path();
            if entry.depth() == 0 {
                continue;
            }

            let relative = match path.strip_prefix(root) {
                Ok(rel) => rel,
                Err(_) => continue,
            };
            if relative.as_os_str().is_empty() {
                continue;
            }
            let rel_string = normalize_rel_path(relative);

            let file_type = match entry.file_type() {
                Some(file_type) => file_type,
                None => continue,
            };

            if file_type.is_dir() {
                continue;
            }

            if file_type.is_file() {
                match fingerprint_file(path) {
                    Ok(fp) => {
                        entries.insert(rel_string, fp);
                    }
                    Err(err) => {
                        warn!(
                            "codebase snapshot failed to hash file {}: {err}",
                            path.display()
                        );
                    }
                }
                continue;
            }

            if file_type.is_symlink() {
                match fingerprint_symlink(path) {
                    Ok(fp) => {
                        entries.insert(rel_string, fp);
                    }
                    Err(err) => {
                        warn!(
                            "codebase snapshot failed to hash symlink {}: {err}",
                            path.display()
                        );
                    }
                }
                continue;
            }
        }

        let root_digest = compute_root_digest(&entries);

        Ok(Self {
            root: root.to_path_buf(),
            entries,
            root_digest,
        })
    }

    pub(crate) fn diff(&self, newer: &CodebaseSnapshot) -> SnapshotDiff {
        let mut diff = SnapshotDiff::default();

        for (path, fingerprint) in &newer.entries {
            match self.entries.get(path) {
                None => diff.added.push(path.clone()),
                Some(existing) if existing != fingerprint => diff.modified.push(path.clone()),
                _ => {}
            }
        }

        for path in self.entries.keys() {
            if !newer.entries.contains_key(path) {
                diff.removed.push(path.clone());
            }
        }

        diff
    }

    pub(crate) fn root(&self) -> &Path {
        &self.root
    }

    fn empty(root: &Path) -> Self {
        Self {
            root: root.to_path_buf(),
            entries: BTreeMap::new(),
            root_digest: Sha256::digest(b"").into(),
        }
    }
}

fn fingerprint_file(path: &Path) -> Result<EntryFingerprint> {
    let metadata = path
        .metadata()
        .with_context(|| format!("metadata {}", path.display()))?;
    let mut file = File::open(path).with_context(|| format!("open {}", path.display()))?;

    let mut hasher = Sha256::new();
    let mut buf = [0u8; 64 * 1024];
    loop {
        let read = file.read(&mut buf)?;
        if read == 0 {
            break;
        }
        hasher.update(&buf[..read]);
    }

    Ok(EntryFingerprint {
        kind: EntryKind::File,
        digest: hasher.finalize().into(),
        size: metadata.len(),
        modified_millis: metadata.modified().ok().and_then(system_time_to_millis),
    })
}

fn fingerprint_symlink(path: &Path) -> Result<EntryFingerprint> {
    let target =
        std::fs::read_link(path).with_context(|| format!("read_link {}", path.display()))?;
    let mut hasher = Sha256::new();
    let target_str = normalize_rel_path(&target);
    hasher.update(target_str.as_bytes());
    Ok(EntryFingerprint {
        kind: EntryKind::Symlink,
        digest: hasher.finalize().into(),
        size: 0,
        modified_millis: None,
    })
}

fn compute_root_digest(entries: &BTreeMap<String, EntryFingerprint>) -> DigestBytes {
    let mut hasher = Sha256::new();
    for (path, fingerprint) in entries {
        hasher.update(path.as_bytes());
        hasher.update(fingerprint.digest);
        hasher.update([fingerprint.kind as u8]);
        hasher.update(fingerprint.size.to_le_bytes());
        if let Some(modified) = fingerprint.modified_millis {
            hasher.update(modified.to_le_bytes());
        }
    }
    hasher.finalize().into()
}

fn normalize_rel_path(path: &Path) -> String {
    let s = path_to_cow(path);
    if s.is_empty() {
        String::new()
    } else {
        s.replace('\\', "/")
    }
}
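Behavior of `normalize_rel_path`, sketched (the backslash rewrite matters for Windows-style separators):

use std::path::Path;

assert_eq!(normalize_rel_path(Path::new("src/lib.rs")), "src/lib.rs");
// A backslash in the lossy string form is rewritten to a forward slash.
assert_eq!(normalize_rel_path(Path::new("src\\main.rs")), "src/main.rs");
assert_eq!(normalize_rel_path(Path::new("")), "");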
fn path_to_cow(path: &Path) -> Cow<'_, str> {
    path.to_string_lossy()
}

fn system_time_to_millis(ts: SystemTime) -> Option<u128> {
    ts.duration_since(SystemTime::UNIX_EPOCH)
        .map(|duration| duration.as_millis())
        .ok()
}

#[cfg(test)]
mod tests {
    use super::*;
    use pretty_assertions::assert_eq;
    use tempfile::tempdir;

    #[test]
    fn diff_tracks_added_modified_removed() {
        let dir = tempdir().unwrap();
        let root = dir.path();

        std::fs::write(root.join("file_a.txt"), "alpha").unwrap();
        std::fs::write(root.join("file_b.txt"), "bravo").unwrap();
        let snapshot_one = CodebaseSnapshot::from_disk(root).unwrap();

        std::fs::write(root.join("file_a.txt"), "alpha-updated").unwrap();
        std::fs::remove_file(root.join("file_b.txt")).unwrap();
        std::fs::write(root.join("file_c.txt"), "charlie").unwrap();
        let snapshot_two = CodebaseSnapshot::from_disk(root).unwrap();

        let diff = snapshot_one.diff(&snapshot_two);
        assert_eq!(diff.added, vec!["file_c.txt".to_string()]);
        assert_eq!(diff.modified, vec!["file_a.txt".to_string()]);
        assert_eq!(diff.removed, vec!["file_b.txt".to_string()]);
    }
}
@@ -1,5 +1,6 @@
 use std::borrow::Cow;
 use std::fmt::Debug;
+use std::path::Path;
 use std::path::PathBuf;
 use std::sync::Arc;
 use std::sync::atomic::AtomicU64;
@@ -17,6 +18,7 @@ use codex_apply_patch::ApplyPatchAction;
 use codex_protocol::ConversationId;
 use codex_protocol::protocol::ConversationPathResponseEvent;
 use codex_protocol::protocol::ExitedReviewModeEvent;
+use codex_protocol::protocol::McpAuthStatus;
 use codex_protocol::protocol::ReviewRequest;
 use codex_protocol::protocol::RolloutItem;
 use codex_protocol::protocol::SessionSource;
@@ -42,6 +44,9 @@ use crate::apply_patch::convert_apply_patch_to_protocol;
 use crate::client::ModelClient;
 use crate::client_common::Prompt;
 use crate::client_common::ResponseEvent;
+use crate::codebase_change_notice::CODEBASE_CHANGE_NOTICE_MAX_PATHS;
+use crate::codebase_change_notice::CodebaseChangeNotice;
+use crate::codebase_snapshot::CodebaseSnapshot;
 use crate::config::Config;
 use crate::config_types::ShellEnvironmentPolicy;
 use crate::conversation_history::ConversationHistory;
@@ -99,6 +104,7 @@ use crate::rollout::RolloutRecorderParams;
 use crate::shell;
 use crate::state::ActiveTurn;
 use crate::state::SessionServices;
+use crate::state::TaskKind;
 use crate::tasks::CompactTask;
 use crate::tasks::RegularTask;
 use crate::tasks::ReviewTask;
@@ -364,15 +370,32 @@ impl Session {
 
         let mcp_fut = McpConnectionManager::new(
             config.mcp_servers.clone(),
-            config.use_experimental_use_rmcp_client,
+            config
+                .features
+                .enabled(crate::features::Feature::RmcpClient),
             config.mcp_oauth_credentials_store_mode,
         );
         let default_shell_fut = shell::default_user_shell();
         let history_meta_fut = crate::message_history::history_metadata(&config);
+        let auth_statuses_fut = compute_auth_statuses(
+            config.mcp_servers.iter(),
+            config.mcp_oauth_credentials_store_mode,
+        );
 
         // Join all independent futures.
-        let (rollout_recorder, mcp_res, default_shell, (history_log_id, history_entry_count)) =
-            tokio::join!(rollout_fut, mcp_fut, default_shell_fut, history_meta_fut);
+        let (
+            rollout_recorder,
+            mcp_res,
+            default_shell,
+            (history_log_id, history_entry_count),
+            auth_statuses,
+        ) = tokio::join!(
+            rollout_fut,
+            mcp_fut,
+            default_shell_fut,
+            history_meta_fut,
+            auth_statuses_fut
+        );
 
         let rollout_recorder = rollout_recorder.map_err(|e| {
             error!("failed to initialize rollout recorder: {e:#}");
@@ -399,11 +422,24 @@ impl Session {
         // Surface individual client start-up failures to the user.
         if !failed_clients.is_empty() {
             for (server_name, err) in failed_clients {
-                let message = format!("MCP client for `{server_name}` failed to start: {err:#}");
-                error!("{message}");
+                let log_message =
+                    format!("MCP client for `{server_name}` failed to start: {err:#}");
+                error!("{log_message}");
+                let display_message = if matches!(
+                    auth_statuses.get(&server_name),
+                    Some(McpAuthStatus::NotLoggedIn)
+                ) {
+                    format!(
+                        "The {server_name} MCP server is not logged in. Run `codex mcp login {server_name}` to log in."
+                    )
+                } else {
+                    log_message
+                };
                 post_session_configured_error_events.push(Event {
                     id: INITIAL_SUBMIT_ID.to_owned(),
-                    msg: EventMsg::Error(ErrorEvent { message }),
+                    msg: EventMsg::Error(ErrorEvent {
+                        message: display_message,
+                    }),
                 });
             }
         }
@@ -446,12 +482,7 @@ impl Session {
             client,
             tools_config: ToolsConfig::new(&ToolsConfigParams {
                 model_family: &config.model_family,
-                include_plan_tool: config.include_plan_tool,
-                include_apply_patch_tool: config.include_apply_patch_tool,
-                include_web_search_request: config.tools_web_search_request,
-                use_streamable_shell_tool: config.use_experimental_streamable_shell_tool,
-                include_view_image_tool: config.include_view_image_tool,
-                experimental_unified_exec_tool: config.use_experimental_unified_exec_tool,
+                features: &config.features,
             }),
             user_instructions,
             base_instructions,
@@ -719,6 +750,73 @@ impl Session {
         self.persist_rollout_items(&rollout_items).await;
     }
 
+    async fn stored_snapshot_for_root(&self, root: &Path) -> Option<CodebaseSnapshot> {
+        let state = self.state.lock().await;
+        state
+            .codebase_snapshot
+            .as_ref()
+            .filter(|snapshot| snapshot.root() == root)
+            .cloned()
+    }
+
+    async fn set_codebase_snapshot(&self, snapshot: CodebaseSnapshot) {
+        let mut state = self.state.lock().await;
+        state.codebase_snapshot = Some(snapshot);
+    }
+
+    pub(crate) async fn emit_codebase_delta_if_changed(
+        &self,
+        turn_context: &TurnContext,
+        sub_id: &str,
+    ) -> anyhow::Result<()> {
+        let cwd = turn_context.cwd.clone();
+        let previous = self.stored_snapshot_for_root(&cwd).await;
+        let latest = CodebaseSnapshot::capture(cwd.clone()).await?;
+
+        if let Some(previous_snapshot) = previous {
+            let diff = previous_snapshot.diff(&latest);
+            if diff.is_empty() {
+                self.set_codebase_snapshot(latest).await;
+                return Ok(());
+            }
+
+            let notice = CodebaseChangeNotice::new(diff, CODEBASE_CHANGE_NOTICE_MAX_PATHS);
+            if notice.is_empty() {
+                self.set_codebase_snapshot(latest).await;
+                return Ok(());
+            }
+
+            let response_item: ResponseItem = notice.into();
+            self.record_conversation_items(std::slice::from_ref(&response_item))
+                .await;
+
+            for msg in
+                map_response_item_to_event_messages(&response_item, self.show_raw_agent_reasoning())
+            {
+                let event = Event {
+                    id: sub_id.to_string(),
+                    msg,
+                };
+                self.send_event(event).await;
+            }
+
+            self.set_codebase_snapshot(latest).await;
+            return Ok(());
+        }
+
+        self.set_codebase_snapshot(latest).await;
+        Ok(())
+    }
+
+    pub(crate) async fn refresh_codebase_snapshot(
+        &self,
+        turn_context: &TurnContext,
+    ) -> anyhow::Result<()> {
+        let snapshot = CodebaseSnapshot::capture(turn_context.cwd.clone()).await?;
+        self.set_codebase_snapshot(snapshot).await;
+        Ok(())
+    }
+
     pub(crate) fn build_initial_context(&self, turn_context: &TurnContext) -> Vec<ResponseItem> {
         let mut items = Vec::<ResponseItem>::with_capacity(2);
         if let Some(user_instructions) = turn_context.user_instructions.as_deref() {
@@ -1195,12 +1293,7 @@ async fn submission_loop(
 
                 let tools_config = ToolsConfig::new(&ToolsConfigParams {
                     model_family: &effective_family,
-                    include_plan_tool: config.include_plan_tool,
-                    include_apply_patch_tool: config.include_apply_patch_tool,
-                    include_web_search_request: config.tools_web_search_request,
-                    use_streamable_shell_tool: config.use_experimental_streamable_shell_tool,
-                    include_view_image_tool: config.include_view_image_tool,
-                    experimental_unified_exec_tool: config.use_experimental_unified_exec_tool,
+                    features: &config.features,
                 });
 
                 let new_turn_context = TurnContext {
@@ -1297,14 +1390,7 @@ async fn submission_loop(
                     client,
                     tools_config: ToolsConfig::new(&ToolsConfigParams {
                         model_family: &model_family,
-                        include_plan_tool: config.include_plan_tool,
-                        include_apply_patch_tool: config.include_apply_patch_tool,
-                        include_web_search_request: config.tools_web_search_request,
-                        use_streamable_shell_tool: config
-                            .use_experimental_streamable_shell_tool,
-                        include_view_image_tool: config.include_view_image_tool,
-                        experimental_unified_exec_tool: config
-                            .use_experimental_unified_exec_tool,
+                        features: &config.features,
                     }),
                     user_instructions: turn_context.user_instructions.clone(),
                     base_instructions: turn_context.base_instructions.clone(),
@@ -1536,14 +1622,15 @@ async fn spawn_review_thread(
     let model = config.review_model.clone();
     let review_model_family = find_family_for_model(&model)
         .unwrap_or_else(|| parent_turn_context.client.get_model_family());
+    // For reviews, disable plan, web_search, view_image regardless of global settings.
+    let mut review_features = config.features.clone();
+    review_features.disable(crate::features::Feature::PlanTool);
+    review_features.disable(crate::features::Feature::WebSearchRequest);
+    review_features.disable(crate::features::Feature::ViewImageTool);
+    review_features.disable(crate::features::Feature::StreamableShell);
     let tools_config = ToolsConfig::new(&ToolsConfigParams {
         model_family: &review_model_family,
-        include_plan_tool: false,
-        include_apply_patch_tool: config.include_apply_patch_tool,
-        include_web_search_request: false,
-        use_streamable_shell_tool: false,
-        include_view_image_tool: false,
-        experimental_unified_exec_tool: config.use_experimental_unified_exec_tool,
+        features: &review_features,
     });
 
     let base_instructions = REVIEW_PROMPT.to_string();
@@ -1634,6 +1721,7 @@ pub(crate) async fn run_task(
     turn_context: Arc<TurnContext>,
     sub_id: String,
     input: Vec<InputItem>,
+    task_kind: TaskKind,
 ) -> Option<String> {
     if input.is_empty() {
         return None;
@@ -1661,6 +1749,14 @@ pub(crate) async fn run_task(
             .await;
     }
 
+    if !is_review_mode
+        && let Err(err) = sess
+            .emit_codebase_delta_if_changed(turn_context.as_ref(), &sub_id)
+            .await
+    {
+        warn!(error = ?err, "failed to compute codebase changes");
+    }
+
     let mut last_agent_message: Option<String> = None;
     // Although from the perspective of codex.rs, TurnDiffTracker has the lifecycle of a Task which contains
     // many turns, from the perspective of the user, it is a single turn.
@@ -1717,6 +1813,7 @@ pub(crate) async fn run_task(
                 Arc::clone(&turn_diff_tracker),
                 sub_id.clone(),
                 turn_input,
+                task_kind,
             )
             .await
             {
@@ -1869,6 +1966,7 @@ pub(crate) async fn run_task(
         );
         sess.notifier()
             .notify(&UserNotification::AgentTurnComplete {
                 thread_id: sess.conversation_id.to_string(),
                 turn_id: sub_id.clone(),
                 input_messages: turn_input_messages,
                 last_assistant_message: last_agent_message.clone(),
@@ -1892,6 +1990,11 @@ pub(crate) async fn run_task(
         }
     }
 
+    if !is_review_mode && let Err(err) = sess.refresh_codebase_snapshot(turn_context.as_ref()).await
+    {
+        warn!(error = ?err, "failed to refresh codebase snapshot");
+    }
+
     // If this was a review thread and we have a final assistant message,
     // try to parse it as a ReviewOutput.
     //
@@ -1942,6 +2045,7 @@ async fn run_turn(
     turn_diff_tracker: SharedTurnDiffTracker,
     sub_id: String,
    input: Vec<ResponseItem>,
+    task_kind: TaskKind,
 ) -> CodexResult<TurnRunResult> {
     let mcp_tools = sess.services.mcp_connection_manager.list_all_tools();
     let router = Arc::new(ToolRouter::from_config(
@@ -1971,6 +2075,7 @@ async fn run_turn(
             Arc::clone(&turn_diff_tracker),
             &sub_id,
             &prompt,
+            task_kind,
         )
         .await
         {
@@ -2044,6 +2149,7 @@ async fn try_run_turn(
     turn_diff_tracker: SharedTurnDiffTracker,
     sub_id: &str,
     prompt: &Prompt,
+    task_kind: TaskKind,
 ) -> CodexResult<TurnRunResult> {
     // call_ids that are part of this response.
     let completed_call_ids = prompt
@@ -2109,7 +2215,11 @@ async fn try_run_turn(
         summary: turn_context.client.get_reasoning_summary(),
     });
     sess.persist_rollout_items(&[rollout_item]).await;
-    let mut stream = turn_context.client.clone().stream(&prompt).await?;
+    let mut stream = turn_context
+        .client
+        .clone()
+        .stream_with_task_kind(prompt.as_ref(), task_kind)
+        .await?;
 
     let tool_runtime = ToolCallRuntime::new(
         Arc::clone(&router),
@@ -2748,12 +2858,7 @@ mod tests {
         );
         let tools_config = ToolsConfig::new(&ToolsConfigParams {
             model_family: &config.model_family,
-            include_plan_tool: config.include_plan_tool,
-            include_apply_patch_tool: config.include_apply_patch_tool,
-            include_web_search_request: config.tools_web_search_request,
-            use_streamable_shell_tool: config.use_experimental_streamable_shell_tool,
-            include_view_image_tool: config.include_view_image_tool,
-            experimental_unified_exec_tool: config.use_experimental_unified_exec_tool,
+            features: &config.features,
         });
         let turn_context = TurnContext {
            client,
@@ -2821,12 +2926,7 @@ mod tests {
         );
         let tools_config = ToolsConfig::new(&ToolsConfigParams {
             model_family: &config.model_family,
-            include_plan_tool: config.include_plan_tool,
||||
include_apply_patch_tool: config.include_apply_patch_tool,
|
||||
include_web_search_request: config.tools_web_search_request,
|
||||
use_streamable_shell_tool: config.use_experimental_streamable_shell_tool,
|
||||
include_view_image_tool: config.include_view_image_tool,
|
||||
experimental_unified_exec_tool: config.use_experimental_unified_exec_tool,
|
||||
features: &config.features,
|
||||
});
|
||||
let turn_context = Arc::new(TurnContext {
|
||||
client,
|
||||
|
||||
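Note on the two `!is_review_mode` guards above: they use if/let chains, which only compile on toolchains where let-chains are stable. A desugared equivalent, for readers following along on older editions (a sketch, not part of this diff):

// Desugared form of the let-chain guard in run_task above.
if !is_review_mode {
    if let Err(err) = sess
        .emit_codebase_delta_if_changed(turn_context.as_ref(), &sub_id)
        .await
    {
        warn!(error = ?err, "failed to compute codebase changes");
    }
}
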
@@ -16,6 +16,7 @@ use crate::protocol::InputItem;
use crate::protocol::InputMessageKind;
use crate::protocol::TaskStartedEvent;
use crate::protocol::TurnContextItem;
use crate::state::TaskKind;
use crate::truncate::truncate_middle;
use crate::util::backoff;
use askama::Template;
@@ -258,7 +259,11 @@ async fn drain_to_completed(
    sub_id: &str,
    prompt: &Prompt,
) -> CodexResult<()> {
    let mut stream = turn_context.client.clone().stream(prompt).await?;
    let mut stream = turn_context
        .client
        .clone()
        .stream_with_task_kind(prompt, TaskKind::Compact)
        .await?;
    loop {
        let maybe_event = stream.next().await;
        let Some(event) = maybe_event else {

@@ -17,6 +17,10 @@ use crate::config_types::ShellEnvironmentPolicy;
use crate::config_types::ShellEnvironmentPolicyToml;
use crate::config_types::Tui;
use crate::config_types::UriBasedFileOpener;
use crate::features::Feature;
use crate::features::FeatureOverrides;
use crate::features::Features;
use crate::features::FeaturesToml;
use crate::git_info::resolve_root_git_project_for_trust;
use crate::model_family::ModelFamily;
use crate::model_family::derive_default_model_family;
@@ -218,6 +222,9 @@ pub struct Config {
    /// Include the `view_image` tool that lets the agent attach a local image path to context.
    pub include_view_image_tool: bool,

    /// Centralized feature flags; source of truth for feature gating.
    pub features: Features,

    /// The active profile name used to derive this `Config` (if any).
    pub active_profile: Option<String>,

@@ -794,19 +801,15 @@ pub struct ConfigToml {
    /// Base URL for requests to ChatGPT (as opposed to the OpenAI API).
    pub chatgpt_base_url: Option<String>,

    /// Experimental path to a file whose contents replace the built-in BASE_INSTRUCTIONS.
    pub experimental_instructions_file: Option<PathBuf>,

    pub experimental_use_exec_command_tool: Option<bool>,
    pub experimental_use_unified_exec_tool: Option<bool>,
    pub experimental_use_rmcp_client: Option<bool>,
    pub experimental_use_freeform_apply_patch: Option<bool>,

    pub projects: Option<HashMap<String, ProjectConfig>>,

    /// Nested tools section for feature toggles
    pub tools: Option<ToolsToml>,

    /// Centralized feature flags (new). Prefer this over individual toggles.
    #[serde(default)]
    pub features: Option<FeaturesToml>,

    /// When true, disables burst-paste detection for typed input entirely.
    /// All characters are inserted as they are received, and no buffering
    /// or placeholder replacement will occur for fast keypress bursts.
@@ -817,6 +820,13 @@ pub struct ConfigToml {

    /// Tracks whether the Windows onboarding screen has been acknowledged.
    pub windows_wsl_setup_acknowledged: Option<bool>,

    /// Legacy, now use features
    pub experimental_instructions_file: Option<PathBuf>,
    pub experimental_use_exec_command_tool: Option<bool>,
    pub experimental_use_unified_exec_tool: Option<bool>,
    pub experimental_use_rmcp_client: Option<bool>,
    pub experimental_use_freeform_apply_patch: Option<bool>,
}

impl From<ConfigToml> for UserSavedConfig {
@@ -980,9 +990,9 @@ impl Config {
            config_profile: config_profile_key,
            codex_linux_sandbox_exe,
            base_instructions,
            include_plan_tool,
            include_apply_patch_tool,
            include_view_image_tool,
            include_plan_tool: include_plan_tool_override,
            include_apply_patch_tool: include_apply_patch_tool_override,
            include_view_image_tool: include_view_image_tool_override,
            show_raw_agent_reasoning,
            tools_web_search_request: override_tools_web_search_request,
        } = overrides;
@@ -1005,6 +1015,15 @@ impl Config {
            None => ConfigProfile::default(),
        };

        let feature_overrides = FeatureOverrides {
            include_plan_tool: include_plan_tool_override,
            include_apply_patch_tool: include_apply_patch_tool_override,
            include_view_image_tool: include_view_image_tool_override,
            web_search_request: override_tools_web_search_request,
        };

        let features = Features::from_config(&cfg, &config_profile, feature_overrides);

        let sandbox_policy = cfg.derive_sandbox_policy(sandbox_mode);

        let mut model_providers = built_in_model_providers();
@@ -1050,13 +1069,13 @@ impl Config {

        let history = cfg.history.unwrap_or_default();

        let tools_web_search_request = override_tools_web_search_request
            .or(cfg.tools.as_ref().and_then(|t| t.web_search))
            .unwrap_or(false);

        let include_view_image_tool = include_view_image_tool
            .or(cfg.tools.as_ref().and_then(|t| t.view_image))
            .unwrap_or(true);
        let include_plan_tool_flag = features.enabled(Feature::PlanTool);
        let include_apply_patch_tool_flag = features.enabled(Feature::ApplyPatchFreeform);
        let include_view_image_tool_flag = features.enabled(Feature::ViewImageTool);
        let tools_web_search_request = features.enabled(Feature::WebSearchRequest);
        let use_experimental_streamable_shell_tool = features.enabled(Feature::StreamableShell);
        let use_experimental_unified_exec_tool = features.enabled(Feature::UnifiedExec);
        let use_experimental_use_rmcp_client = features.enabled(Feature::RmcpClient);

        let model = model
            .or(config_profile.model)
@@ -1164,19 +1183,14 @@ impl Config {
                .chatgpt_base_url
                .or(cfg.chatgpt_base_url)
                .unwrap_or("https://chatgpt.com/backend-api/".to_string()),
            include_plan_tool: include_plan_tool.unwrap_or(false),
            include_apply_patch_tool: include_apply_patch_tool
                .or(cfg.experimental_use_freeform_apply_patch)
                .unwrap_or(false),
            include_plan_tool: include_plan_tool_flag,
            include_apply_patch_tool: include_apply_patch_tool_flag,
            tools_web_search_request,
            use_experimental_streamable_shell_tool: cfg
                .experimental_use_exec_command_tool
                .unwrap_or(false),
            use_experimental_unified_exec_tool: cfg
                .experimental_use_unified_exec_tool
                .unwrap_or(false),
            use_experimental_use_rmcp_client: cfg.experimental_use_rmcp_client.unwrap_or(false),
            include_view_image_tool,
            use_experimental_streamable_shell_tool,
            use_experimental_unified_exec_tool,
            use_experimental_use_rmcp_client,
            include_view_image_tool: include_view_image_tool_flag,
            features,
            active_profile: active_profile_name,
            windows_wsl_setup_acknowledged: cfg.windows_wsl_setup_acknowledged.unwrap_or(false),
            disable_paste_burst: cfg.disable_paste_burst.unwrap_or(false),
@@ -1309,6 +1323,7 @@ pub fn log_dir(cfg: &Config) -> std::io::Result<PathBuf> {
mod tests {
    use crate::config_types::HistoryPersistence;
    use crate::config_types::Notifications;
    use crate::features::Feature;

    use super::*;
    use pretty_assertions::assert_eq;
@@ -1436,6 +1451,93 @@ exclude_slash_tmp = true
        Ok(())
    }

    #[test]
    fn profile_legacy_toggles_override_base() -> std::io::Result<()> {
        let codex_home = TempDir::new()?;
        let mut profiles = HashMap::new();
        profiles.insert(
            "work".to_string(),
            ConfigProfile {
                include_plan_tool: Some(true),
                include_view_image_tool: Some(false),
                ..Default::default()
            },
        );
        let cfg = ConfigToml {
            profiles,
            profile: Some("work".to_string()),
            ..Default::default()
        };

        let config = Config::load_from_base_config_with_overrides(
            cfg,
            ConfigOverrides::default(),
            codex_home.path().to_path_buf(),
        )?;

        assert!(config.features.enabled(Feature::PlanTool));
        assert!(!config.features.enabled(Feature::ViewImageTool));
        assert!(config.include_plan_tool);
        assert!(!config.include_view_image_tool);

        Ok(())
    }

    #[test]
    fn feature_table_overrides_legacy_flags() -> std::io::Result<()> {
        let codex_home = TempDir::new()?;
        let mut entries = BTreeMap::new();
        entries.insert("plan_tool".to_string(), false);
        entries.insert("apply_patch_freeform".to_string(), false);
        let cfg = ConfigToml {
            features: Some(crate::features::FeaturesToml { entries }),
            ..Default::default()
        };

        let config = Config::load_from_base_config_with_overrides(
            cfg,
            ConfigOverrides::default(),
            codex_home.path().to_path_buf(),
        )?;

        assert!(!config.features.enabled(Feature::PlanTool));
        assert!(!config.features.enabled(Feature::ApplyPatchFreeform));
        assert!(!config.include_plan_tool);
        assert!(!config.include_apply_patch_tool);

        Ok(())
    }

    #[test]
    fn legacy_toggles_map_to_features() -> std::io::Result<()> {
        let codex_home = TempDir::new()?;
        let cfg = ConfigToml {
            experimental_use_exec_command_tool: Some(true),
            experimental_use_unified_exec_tool: Some(true),
            experimental_use_rmcp_client: Some(true),
            experimental_use_freeform_apply_patch: Some(true),
            ..Default::default()
        };

        let config = Config::load_from_base_config_with_overrides(
            cfg,
            ConfigOverrides::default(),
            codex_home.path().to_path_buf(),
        )?;

        assert!(config.features.enabled(Feature::ApplyPatchFreeform));
        assert!(config.features.enabled(Feature::StreamableShell));
        assert!(config.features.enabled(Feature::UnifiedExec));
        assert!(config.features.enabled(Feature::RmcpClient));

        assert!(config.include_apply_patch_tool);
        assert!(config.use_experimental_streamable_shell_tool);
        assert!(config.use_experimental_unified_exec_tool);
        assert!(config.use_experimental_use_rmcp_client);

        Ok(())
    }

    #[test]
    fn config_honors_explicit_file_oauth_store_mode() -> std::io::Result<()> {
        let codex_home = TempDir::new()?;
@@ -2120,6 +2222,7 @@ model_verbosity = "high"
            use_experimental_unified_exec_tool: false,
            use_experimental_use_rmcp_client: false,
            include_view_image_tool: true,
            features: Features::with_defaults(),
            active_profile: Some("o3".to_string()),
            windows_wsl_setup_acknowledged: false,
            disable_paste_burst: false,
@@ -2183,6 +2286,7 @@ model_verbosity = "high"
            use_experimental_unified_exec_tool: false,
            use_experimental_use_rmcp_client: false,
            include_view_image_tool: true,
            features: Features::with_defaults(),
            active_profile: Some("gpt3".to_string()),
            windows_wsl_setup_acknowledged: false,
            disable_paste_burst: false,
@@ -2261,6 +2365,7 @@ model_verbosity = "high"
            use_experimental_unified_exec_tool: false,
            use_experimental_use_rmcp_client: false,
            include_view_image_tool: true,
            features: Features::with_defaults(),
            active_profile: Some("zdr".to_string()),
            windows_wsl_setup_acknowledged: false,
            disable_paste_burst: false,
@@ -2325,6 +2430,7 @@ model_verbosity = "high"
            use_experimental_unified_exec_tool: false,
            use_experimental_use_rmcp_client: false,
            include_view_image_tool: true,
            features: Features::with_defaults(),
            active_profile: Some("gpt5".to_string()),
            windows_wsl_setup_acknowledged: false,
            disable_paste_burst: false,

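The net effect of these config changes is that feature resolution happens once, in Features::from_config, in a fixed order: built-in defaults, base legacy toggles, the base [features] table, profile legacy toggles, the profile [features] table, then CLI-style overrides. A sketch in the same test style as above, showing a later layer winning over an earlier one (crate-internal types):

// Sketch: the base [features] table is applied after the legacy toggle, so it wins.
use std::collections::BTreeMap;

let codex_home = TempDir::new()?;
let mut entries = BTreeMap::new();
entries.insert("unified_exec".to_string(), true);
let cfg = ConfigToml {
    // Legacy base toggle says "off"...
    experimental_use_unified_exec_tool: Some(false),
    // ...but the newer [features] table is applied later and wins.
    features: Some(crate::features::FeaturesToml { entries }),
    ..Default::default()
};
let config = Config::load_from_base_config_with_overrides(
    cfg,
    ConfigOverrides::default(),
    codex_home.path().to_path_buf(),
)?;
assert!(config.features.enabled(Feature::UnifiedExec));
assert!(config.use_experimental_unified_exec_tool);
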
@@ -20,6 +20,18 @@ pub struct ConfigProfile {
    pub model_verbosity: Option<Verbosity>,
    pub chatgpt_base_url: Option<String>,
    pub experimental_instructions_file: Option<PathBuf>,
    pub include_plan_tool: Option<bool>,
    pub include_apply_patch_tool: Option<bool>,
    pub include_view_image_tool: Option<bool>,
    pub experimental_use_unified_exec_tool: Option<bool>,
    pub experimental_use_exec_command_tool: Option<bool>,
    pub experimental_use_rmcp_client: Option<bool>,
    pub experimental_use_freeform_apply_patch: Option<bool>,
    pub tools_web_search: Option<bool>,
    pub tools_view_image: Option<bool>,
    /// Optional feature toggles scoped to this profile.
    #[serde(default)]
    pub features: Option<crate::features::FeaturesToml>,
}

impl From<ConfigProfile> for codex_app_server_protocol::Profile {

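For illustration, a hedged sketch of a profile opting into a feature through the new per-profile [features] table (only fields declared above are used):

// Sketch: profile-scoped feature toggle.
use std::collections::BTreeMap;

let mut entries = BTreeMap::new();
entries.insert("web_search_request".to_string(), true);
let profile = ConfigProfile {
    features: Some(crate::features::FeaturesToml { entries }),
    ..Default::default()
};
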
@@ -6,6 +6,7 @@ use async_trait::async_trait;
use crate::CODEX_APPLY_PATCH_ARG1;
use crate::apply_patch::ApplyPatchExec;
use crate::exec::ExecParams;
use crate::executor::ExecutorConfig;
use crate::function_tool::FunctionCallError;

pub(crate) enum ExecutionMode {
@@ -22,6 +23,7 @@ pub(crate) trait ExecutionBackend: Send + Sync {
        params: ExecParams,
        // Required for downcasting the apply_patch.
        mode: &ExecutionMode,
        config: &ExecutorConfig,
    ) -> Result<ExecParams, FunctionCallError>;

    fn stream_stdout(&self, _mode: &ExecutionMode) -> bool {
@@ -47,6 +49,7 @@ impl ExecutionBackend for ShellBackend {
        &self,
        params: ExecParams,
        mode: &ExecutionMode,
        _config: &ExecutorConfig,
    ) -> Result<ExecParams, FunctionCallError> {
        match mode {
            ExecutionMode::Shell => Ok(params),
@@ -65,17 +68,22 @@ impl ExecutionBackend for ApplyPatchBackend {
        &self,
        params: ExecParams,
        mode: &ExecutionMode,
        config: &ExecutorConfig,
    ) -> Result<ExecParams, FunctionCallError> {
        match mode {
            ExecutionMode::ApplyPatch(exec) => {
                let path_to_codex = env::current_exe()
                    .ok()
                    .map(|p| p.to_string_lossy().to_string())
                    .ok_or_else(|| {
                        FunctionCallError::RespondToModel(
                            "failed to determine path to codex executable".to_string(),
                        )
                    })?;
                let path_to_codex = if let Some(exe_path) = &config.codex_exe {
                    exe_path.to_string_lossy().to_string()
                } else {
                    env::current_exe()
                        .ok()
                        .map(|p| p.to_string_lossy().to_string())
                        .ok_or_else(|| {
                            FunctionCallError::RespondToModel(
                                "failed to determine path to codex executable".to_string(),
                            )
                        })?
                };

                let patch = exec.action.patch.clone();
                Ok(ExecParams {

@@ -30,19 +30,19 @@ use codex_otel::otel_event_manager::ToolDecisionSource;
pub(crate) struct ExecutorConfig {
    pub(crate) sandbox_policy: SandboxPolicy,
    pub(crate) sandbox_cwd: PathBuf,
    codex_linux_sandbox_exe: Option<PathBuf>,
    pub(crate) codex_exe: Option<PathBuf>,
}

impl ExecutorConfig {
    pub(crate) fn new(
        sandbox_policy: SandboxPolicy,
        sandbox_cwd: PathBuf,
        codex_linux_sandbox_exe: Option<PathBuf>,
        codex_exe: Option<PathBuf>,
    ) -> Self {
        Self {
            sandbox_policy,
            sandbox_cwd,
            codex_linux_sandbox_exe,
            codex_exe,
        }
    }
}
@@ -86,7 +86,14 @@ impl Executor {
            maybe_translate_shell_command(request.params, session, request.use_shell_profile);
        }

        // Step 1: Normalise parameters via the selected backend.
        // Step 1: Snapshot sandbox configuration so it stays stable for this run.
        let config = self
            .config
            .read()
            .map_err(|_| ExecError::rejection("executor config poisoned"))?
            .clone();

        // Step 2: Normalise parameters via the selected backend.
        let backend = backend_for_mode(&request.mode);
        let stdout_stream = if backend.stream_stdout(&request.mode) {
            request.stdout_stream.clone()
@@ -94,16 +101,9 @@ impl Executor {
            None
        };
        request.params = backend
            .prepare(request.params, &request.mode)
            .prepare(request.params, &request.mode, &config)
            .map_err(ExecError::from)?;

        // Step 2: Snapshot sandbox configuration so it stays stable for this run.
        let config = self
            .config
            .read()
            .map_err(|_| ExecError::rejection("executor config poisoned"))?
            .clone();

        // Step 3: Decide sandbox placement, prompting for approval when needed.
        let sandbox_decision = select_sandbox(
            &request,
@@ -227,7 +227,7 @@ impl Executor {
            sandbox,
            &config.sandbox_policy,
            &config.sandbox_cwd,
            &config.codex_linux_sandbox_exe,
            &config.codex_exe,
            stdout_stream,
        )
        .await

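Call sites constructing ExecutorConfig gain one argument. A sketch (crate-internal types; the path is a made-up example) showing the new codex_exe override, which ApplyPatchBackend::prepare consults before falling back to env::current_exe():

// Sketch only: the fourth argument is the new codex_exe override.
fn make_executor_config(sandbox_policy: SandboxPolicy, cwd: PathBuf) -> ExecutorConfig {
    ExecutorConfig::new(
        sandbox_policy,
        cwd,
        None,                                        // codex_linux_sandbox_exe
        Some(PathBuf::from("/usr/local/bin/codex")), // hypothetical codex_exe override
    )
}
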
250
codex-rs/core/src/features.rs
Normal file
@@ -0,0 +1,250 @@
//! Centralized feature flags and metadata.
//!
//! This module defines a small set of toggles that gate experimental and
//! optional behavior across the codebase. Instead of wiring individual
//! booleans through multiple types, call sites consult a single `Features`
//! container attached to `Config`.

use crate::config::ConfigToml;
use crate::config_profile::ConfigProfile;
use serde::Deserialize;
use std::collections::BTreeMap;
use std::collections::BTreeSet;

mod legacy;
pub(crate) use legacy::LegacyFeatureToggles;

/// High-level lifecycle stage for a feature.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Stage {
    Experimental,
    Beta,
    Stable,
    Deprecated,
    Removed,
}

/// Unique features toggled via configuration.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub enum Feature {
    /// Use the single unified PTY-backed exec tool.
    UnifiedExec,
    /// Use the streamable exec-command/write-stdin tool pair.
    StreamableShell,
    /// Use the official Rust MCP client (rmcp).
    RmcpClient,
    /// Include the plan tool.
    PlanTool,
    /// Include the freeform apply_patch tool.
    ApplyPatchFreeform,
    /// Include the view_image tool.
    ViewImageTool,
    /// Allow the model to request web searches.
    WebSearchRequest,
}

impl Feature {
    pub fn key(self) -> &'static str {
        self.info().key
    }

    pub fn stage(self) -> Stage {
        self.info().stage
    }

    pub fn default_enabled(self) -> bool {
        self.info().default_enabled
    }

    fn info(self) -> &'static FeatureSpec {
        FEATURES
            .iter()
            .find(|spec| spec.id == self)
            .unwrap_or_else(|| unreachable!("missing FeatureSpec for {:?}", self))
    }
}

/// Holds the effective set of enabled features.
#[derive(Debug, Clone, Default, PartialEq)]
pub struct Features {
    enabled: BTreeSet<Feature>,
}

#[derive(Debug, Clone, Default)]
pub struct FeatureOverrides {
    pub include_plan_tool: Option<bool>,
    pub include_apply_patch_tool: Option<bool>,
    pub include_view_image_tool: Option<bool>,
    pub web_search_request: Option<bool>,
}

impl FeatureOverrides {
    fn apply(self, features: &mut Features) {
        LegacyFeatureToggles {
            include_plan_tool: self.include_plan_tool,
            include_apply_patch_tool: self.include_apply_patch_tool,
            include_view_image_tool: self.include_view_image_tool,
            tools_web_search: self.web_search_request,
            ..Default::default()
        }
        .apply(features);
    }
}

impl Features {
    /// Starts with built-in defaults.
    pub fn with_defaults() -> Self {
        let mut set = BTreeSet::new();
        for spec in FEATURES {
            if spec.default_enabled {
                set.insert(spec.id);
            }
        }
        Self { enabled: set }
    }

    pub fn enabled(&self, f: Feature) -> bool {
        self.enabled.contains(&f)
    }

    pub fn enable(&mut self, f: Feature) {
        self.enabled.insert(f);
    }

    pub fn disable(&mut self, f: Feature) {
        self.enabled.remove(&f);
    }

    /// Apply a table of key -> bool toggles (e.g. from TOML).
    pub fn apply_map(&mut self, m: &BTreeMap<String, bool>) {
        for (k, v) in m {
            match feature_for_key(k) {
                Some(feat) => {
                    if *v {
                        self.enable(feat);
                    } else {
                        self.disable(feat);
                    }
                }
                None => {
                    tracing::warn!("unknown feature key in config: {k}");
                }
            }
        }
    }

    pub fn from_config(
        cfg: &ConfigToml,
        config_profile: &ConfigProfile,
        overrides: FeatureOverrides,
    ) -> Self {
        let mut features = Features::with_defaults();

        let base_legacy = LegacyFeatureToggles {
            experimental_use_freeform_apply_patch: cfg.experimental_use_freeform_apply_patch,
            experimental_use_exec_command_tool: cfg.experimental_use_exec_command_tool,
            experimental_use_unified_exec_tool: cfg.experimental_use_unified_exec_tool,
            experimental_use_rmcp_client: cfg.experimental_use_rmcp_client,
            tools_web_search: cfg.tools.as_ref().and_then(|t| t.web_search),
            tools_view_image: cfg.tools.as_ref().and_then(|t| t.view_image),
            ..Default::default()
        };
        base_legacy.apply(&mut features);

        if let Some(base_features) = cfg.features.as_ref() {
            features.apply_map(&base_features.entries);
        }

        let profile_legacy = LegacyFeatureToggles {
            include_plan_tool: config_profile.include_plan_tool,
            include_apply_patch_tool: config_profile.include_apply_patch_tool,
            include_view_image_tool: config_profile.include_view_image_tool,
            experimental_use_freeform_apply_patch: config_profile
                .experimental_use_freeform_apply_patch,
            experimental_use_exec_command_tool: config_profile.experimental_use_exec_command_tool,
            experimental_use_unified_exec_tool: config_profile.experimental_use_unified_exec_tool,
            experimental_use_rmcp_client: config_profile.experimental_use_rmcp_client,
            tools_web_search: config_profile.tools_web_search,
            tools_view_image: config_profile.tools_view_image,
        };
        profile_legacy.apply(&mut features);
        if let Some(profile_features) = config_profile.features.as_ref() {
            features.apply_map(&profile_features.entries);
        }

        overrides.apply(&mut features);

        features
    }
}

/// Keys accepted in `[features]` tables.
fn feature_for_key(key: &str) -> Option<Feature> {
    for spec in FEATURES {
        if spec.key == key {
            return Some(spec.id);
        }
    }
    legacy::feature_for_key(key)
}

/// Deserializable features table for TOML.
#[derive(Deserialize, Debug, Clone, Default, PartialEq)]
pub struct FeaturesToml {
    #[serde(flatten)]
    pub entries: BTreeMap<String, bool>,
}

/// Single, easy-to-read registry of all feature definitions.
#[derive(Debug, Clone, Copy)]
pub struct FeatureSpec {
    pub id: Feature,
    pub key: &'static str,
    pub stage: Stage,
    pub default_enabled: bool,
}

pub const FEATURES: &[FeatureSpec] = &[
    FeatureSpec {
        id: Feature::UnifiedExec,
        key: "unified_exec",
        stage: Stage::Experimental,
        default_enabled: false,
    },
    FeatureSpec {
        id: Feature::StreamableShell,
        key: "streamable_shell",
        stage: Stage::Experimental,
        default_enabled: false,
    },
    FeatureSpec {
        id: Feature::RmcpClient,
        key: "rmcp_client",
        stage: Stage::Experimental,
        default_enabled: false,
    },
    FeatureSpec {
        id: Feature::PlanTool,
        key: "plan_tool",
        stage: Stage::Stable,
        default_enabled: false,
    },
    FeatureSpec {
        id: Feature::ApplyPatchFreeform,
        key: "apply_patch_freeform",
        stage: Stage::Beta,
        default_enabled: false,
    },
    FeatureSpec {
        id: Feature::ViewImageTool,
        key: "view_image_tool",
        stage: Stage::Stable,
        default_enabled: true,
    },
    FeatureSpec {
        id: Feature::WebSearchRequest,
        key: "web_search_request",
        stage: Stage::Stable,
        default_enabled: false,
    },
];

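For a feel of the API, a small test-style sketch (crate-internal types) of resolving a [features] table by hand with apply_map; the unknown key is logged and skipped, matching the tracing::warn! branch above:

// Sketch: manual feature resolution with the API defined above.
use std::collections::BTreeMap;

let mut features = Features::with_defaults(); // view_image_tool is on by default
let mut entries = BTreeMap::new();
entries.insert("view_image_tool".to_string(), false);
entries.insert("unified_exec".to_string(), true);
entries.insert("not_a_real_feature".to_string(), true); // warned about, then ignored
features.apply_map(&entries);

assert!(!features.enabled(Feature::ViewImageTool));
assert!(features.enabled(Feature::UnifiedExec));
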
158
codex-rs/core/src/features/legacy.rs
Normal file
@@ -0,0 +1,158 @@
use super::Feature;
use super::Features;
use tracing::info;

#[derive(Clone, Copy)]
struct Alias {
    legacy_key: &'static str,
    feature: Feature,
}

const ALIASES: &[Alias] = &[
    Alias {
        legacy_key: "experimental_use_unified_exec_tool",
        feature: Feature::UnifiedExec,
    },
    Alias {
        legacy_key: "experimental_use_exec_command_tool",
        feature: Feature::StreamableShell,
    },
    Alias {
        legacy_key: "experimental_use_rmcp_client",
        feature: Feature::RmcpClient,
    },
    Alias {
        legacy_key: "experimental_use_freeform_apply_patch",
        feature: Feature::ApplyPatchFreeform,
    },
    Alias {
        legacy_key: "include_apply_patch_tool",
        feature: Feature::ApplyPatchFreeform,
    },
    Alias {
        legacy_key: "include_plan_tool",
        feature: Feature::PlanTool,
    },
    Alias {
        legacy_key: "include_view_image_tool",
        feature: Feature::ViewImageTool,
    },
    Alias {
        legacy_key: "web_search",
        feature: Feature::WebSearchRequest,
    },
];

pub(crate) fn feature_for_key(key: &str) -> Option<Feature> {
    ALIASES
        .iter()
        .find(|alias| alias.legacy_key == key)
        .map(|alias| {
            log_alias(alias.legacy_key, alias.feature);
            alias.feature
        })
}

#[derive(Debug, Default)]
pub struct LegacyFeatureToggles {
    pub include_plan_tool: Option<bool>,
    pub include_apply_patch_tool: Option<bool>,
    pub include_view_image_tool: Option<bool>,
    pub experimental_use_freeform_apply_patch: Option<bool>,
    pub experimental_use_exec_command_tool: Option<bool>,
    pub experimental_use_unified_exec_tool: Option<bool>,
    pub experimental_use_rmcp_client: Option<bool>,
    pub tools_web_search: Option<bool>,
    pub tools_view_image: Option<bool>,
}

impl LegacyFeatureToggles {
    pub fn apply(self, features: &mut Features) {
        set_if_some(
            features,
            Feature::PlanTool,
            self.include_plan_tool,
            "include_plan_tool",
        );
        set_if_some(
            features,
            Feature::ApplyPatchFreeform,
            self.include_apply_patch_tool,
            "include_apply_patch_tool",
        );
        set_if_some(
            features,
            Feature::ApplyPatchFreeform,
            self.experimental_use_freeform_apply_patch,
            "experimental_use_freeform_apply_patch",
        );
        set_if_some(
            features,
            Feature::StreamableShell,
            self.experimental_use_exec_command_tool,
            "experimental_use_exec_command_tool",
        );
        set_if_some(
            features,
            Feature::UnifiedExec,
            self.experimental_use_unified_exec_tool,
            "experimental_use_unified_exec_tool",
        );
        set_if_some(
            features,
            Feature::RmcpClient,
            self.experimental_use_rmcp_client,
            "experimental_use_rmcp_client",
        );
        set_if_some(
            features,
            Feature::WebSearchRequest,
            self.tools_web_search,
            "tools.web_search",
        );
        set_if_some(
            features,
            Feature::ViewImageTool,
            self.include_view_image_tool,
            "include_view_image_tool",
        );
        set_if_some(
            features,
            Feature::ViewImageTool,
            self.tools_view_image,
            "tools.view_image",
        );
    }
}

fn set_if_some(
    features: &mut Features,
    feature: Feature,
    maybe_value: Option<bool>,
    alias_key: &'static str,
) {
    if let Some(enabled) = maybe_value {
        set_feature(features, feature, enabled);
        log_alias(alias_key, feature);
    }
}

fn set_feature(features: &mut Features, feature: Feature, enabled: bool) {
    if enabled {
        features.enable(feature);
    } else {
        features.disable(feature);
    }
}

fn log_alias(alias: &str, feature: Feature) {
    let canonical = feature.key();
    if alias == canonical {
        return;
    }
    info!(
        %alias,
        canonical,
        "legacy feature toggle detected; prefer `[features].{canonical}`"
    );
}

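A sketch of the intended flow (crate-internal types): a legacy toggle flips the corresponding Feature through set_if_some and emits the log_alias deprecation hint:

// Sketch: applying legacy toggles on top of the defaults.
let mut features = Features::with_defaults();
LegacyFeatureToggles {
    experimental_use_unified_exec_tool: Some(true), // alias for unified_exec
    tools_view_image: Some(false),                  // alias for view_image_tool
    ..Default::default()
}
.apply(&mut features);

assert!(features.enabled(Feature::UnifiedExec));
assert!(!features.enabled(Feature::ViewImageTool));
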
@@ -11,6 +11,8 @@ pub mod bash;
mod chat_completions;
mod client;
mod client_common;
mod codebase_change_notice;
mod codebase_snapshot;
pub mod codex;
mod codex_conversation;
pub mod token_data;
@@ -29,6 +31,7 @@ pub mod exec;
mod exec_command;
pub mod exec_env;
pub mod executor;
pub mod features;
mod flags;
pub mod git_info;
pub mod landlock;

@@ -2,6 +2,7 @@

use codex_protocol::models::ResponseItem;

use crate::codebase_snapshot::CodebaseSnapshot;
use crate::conversation_history::ConversationHistory;
use crate::protocol::RateLimitSnapshot;
use crate::protocol::TokenUsage;
@@ -13,6 +14,7 @@ pub(crate) struct SessionState {
    pub(crate) history: ConversationHistory,
    pub(crate) token_info: Option<TokenUsageInfo>,
    pub(crate) latest_rate_limits: Option<RateLimitSnapshot>,
    pub(crate) codebase_snapshot: Option<CodebaseSnapshot>,
}

impl SessionState {

@@ -34,6 +34,16 @@ pub(crate) enum TaskKind {
    Compact,
}

impl TaskKind {
    pub(crate) fn header_value(self) -> &'static str {
        match self {
            TaskKind::Regular => "standard",
            TaskKind::Review => "review",
            TaskKind::Compact => "compact",
        }
    }
}

#[derive(Clone)]
pub(crate) struct RunningTask {
    pub(crate) handle: AbortHandle,
@@ -113,3 +123,15 @@ impl ActiveTurn {
        }
    }
}

#[cfg(test)]
mod tests {
    use super::TaskKind;

    #[test]
    fn header_value_matches_expected_labels() {
        assert_eq!(TaskKind::Regular.header_value(), "standard");
        assert_eq!(TaskKind::Review.header_value(), "review");
        assert_eq!(TaskKind::Compact.header_value(), "compact");
    }
}

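These labels surface as the Codex-Task-Type request header asserted in the new responses_headers.rs test further down; the actual wiring lives in the client's stream_with_task_kind path. A hedged sketch of the idea, assuming a reqwest-style builder (not the crate's actual client code):

// Sketch only: attach the task-type label to an outgoing request.
fn tag_request(req: reqwest::RequestBuilder, task_kind: TaskKind) -> reqwest::RequestBuilder {
    req.header("Codex-Task-Type", task_kind.header_value())
}
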
@@ -27,6 +27,6 @@ impl SessionTask for RegularTask {
        input: Vec<InputItem>,
    ) -> Option<String> {
        let sess = session.clone_session();
        run_task(sess, ctx, sub_id, input).await
        run_task(sess, ctx, sub_id, input, TaskKind::Regular).await
    }
}

@@ -28,7 +28,7 @@ impl SessionTask for ReviewTask {
        input: Vec<InputItem>,
    ) -> Option<String> {
        let sess = session.clone_session();
        run_task(sess, ctx, sub_id, input).await
        run_task(sess, ctx, sub_id, input, TaskKind::Review).await
    }

    async fn abort(&self, session: Arc<SessionTaskContext>, sub_id: &str) {

@@ -1,5 +1,7 @@
use crate::client_common::tools::ResponsesApiTool;
use crate::client_common::tools::ToolSpec;
use crate::features::Feature;
use crate::features::Features;
use crate::model_family::ModelFamily;
use crate::tools::handlers::PLAN_TOOL;
use crate::tools::handlers::apply_patch::ApplyPatchToolType;
@@ -33,26 +35,23 @@ pub(crate) struct ToolsConfig {

pub(crate) struct ToolsConfigParams<'a> {
    pub(crate) model_family: &'a ModelFamily,
    pub(crate) include_plan_tool: bool,
    pub(crate) include_apply_patch_tool: bool,
    pub(crate) include_web_search_request: bool,
    pub(crate) use_streamable_shell_tool: bool,
    pub(crate) include_view_image_tool: bool,
    pub(crate) experimental_unified_exec_tool: bool,
    pub(crate) features: &'a Features,
}

impl ToolsConfig {
    pub fn new(params: &ToolsConfigParams) -> Self {
        let ToolsConfigParams {
            model_family,
            include_plan_tool,
            include_apply_patch_tool,
            include_web_search_request,
            use_streamable_shell_tool,
            include_view_image_tool,
            experimental_unified_exec_tool,
            features,
        } = params;
        let shell_type = if *use_streamable_shell_tool {
        let use_streamable_shell_tool = features.enabled(Feature::StreamableShell);
        let experimental_unified_exec_tool = features.enabled(Feature::UnifiedExec);
        let include_plan_tool = features.enabled(Feature::PlanTool);
        let include_apply_patch_tool = features.enabled(Feature::ApplyPatchFreeform);
        let include_web_search_request = features.enabled(Feature::WebSearchRequest);
        let include_view_image_tool = features.enabled(Feature::ViewImageTool);

        let shell_type = if use_streamable_shell_tool {
            ConfigShellToolType::Streamable
        } else if model_family.uses_local_shell_tool {
            ConfigShellToolType::Local
@@ -64,7 +63,7 @@ impl ToolsConfig {
            Some(ApplyPatchToolType::Freeform) => Some(ApplyPatchToolType::Freeform),
            Some(ApplyPatchToolType::Function) => Some(ApplyPatchToolType::Function),
            None => {
                if *include_apply_patch_tool {
                if include_apply_patch_tool {
                    Some(ApplyPatchToolType::Freeform)
                } else {
                    None
@@ -74,11 +73,11 @@ impl ToolsConfig {

        Self {
            shell_type,
            plan_tool: *include_plan_tool,
            plan_tool: include_plan_tool,
            apply_patch_tool_type,
            web_search_request: *include_web_search_request,
            include_view_image_tool: *include_view_image_tool,
            experimental_unified_exec_tool: *experimental_unified_exec_tool,
            web_search_request: include_web_search_request,
            include_view_image_tool,
            experimental_unified_exec_tool,
            experimental_supported_tools: model_family.experimental_supported_tools.clone(),
        }
    }
@@ -906,14 +905,13 @@ mod tests {
    fn test_build_specs() {
        let model_family = find_family_for_model("codex-mini-latest")
            .expect("codex-mini-latest should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::PlanTool);
        features.enable(Feature::WebSearchRequest);
        features.enable(Feature::UnifiedExec);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: true,
            include_apply_patch_tool: false,
            include_web_search_request: true,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });
        let (tools, _) = build_specs(&config, Some(HashMap::new())).build();

@@ -926,14 +924,13 @@ mod tests {
    #[test]
    fn test_build_specs_default_shell() {
        let model_family = find_family_for_model("o3").expect("o3 should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::PlanTool);
        features.enable(Feature::WebSearchRequest);
        features.enable(Feature::UnifiedExec);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: true,
            include_apply_patch_tool: false,
            include_web_search_request: true,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });
        let (tools, _) = build_specs(&config, Some(HashMap::new())).build();

@@ -948,14 +945,12 @@ mod tests {
    fn test_parallel_support_flags() {
        let model_family = find_family_for_model("gpt-5-codex")
            .expect("codex-mini-latest should be a valid model family");
        let mut features = Features::with_defaults();
        features.disable(Feature::ViewImageTool);
        features.enable(Feature::UnifiedExec);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: false,
            include_web_search_request: false,
            use_streamable_shell_tool: false,
            include_view_image_tool: false,
            experimental_unified_exec_tool: true,
            features: &features,
        });
        let (tools, _) = build_specs(&config, None).build();

@@ -969,14 +964,11 @@ mod tests {
    fn test_test_model_family_includes_sync_tool() {
        let model_family = find_family_for_model("test-gpt-5-codex")
            .expect("test-gpt-5-codex should be a valid model family");
        let mut features = Features::with_defaults();
        features.disable(Feature::ViewImageTool);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: false,
            include_web_search_request: false,
            use_streamable_shell_tool: false,
            include_view_image_tool: false,
            experimental_unified_exec_tool: false,
            features: &features,
        });
        let (tools, _) = build_specs(&config, None).build();

@@ -1001,14 +993,12 @@ mod tests {
    #[test]
    fn test_build_specs_mcp_tools() {
        let model_family = find_family_for_model("o3").expect("o3 should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::UnifiedExec);
        features.enable(Feature::WebSearchRequest);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: false,
            include_web_search_request: true,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });
        let (tools, _) = build_specs(
            &config,
@@ -1106,14 +1096,11 @@ mod tests {
    #[test]
    fn test_build_specs_mcp_tools_sorted_by_name() {
        let model_family = find_family_for_model("o3").expect("o3 should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::UnifiedExec);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: false,
            include_web_search_request: false,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });

        // Intentionally construct a map with keys that would sort alphabetically.
@@ -1183,14 +1170,12 @@ mod tests {
    fn test_mcp_tool_property_missing_type_defaults_to_string() {
        let model_family = find_family_for_model("gpt-5-codex")
            .expect("gpt-5-codex should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::UnifiedExec);
        features.enable(Feature::WebSearchRequest);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: false,
            include_web_search_request: true,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });

        let (tools, _) = build_specs(
@@ -1252,14 +1237,12 @@ mod tests {
    fn test_mcp_tool_integer_normalized_to_number() {
        let model_family = find_family_for_model("gpt-5-codex")
            .expect("gpt-5-codex should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::UnifiedExec);
        features.enable(Feature::WebSearchRequest);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: false,
            include_web_search_request: true,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });

        let (tools, _) = build_specs(
@@ -1316,14 +1299,13 @@ mod tests {
    fn test_mcp_tool_array_without_items_gets_default_string_items() {
        let model_family = find_family_for_model("gpt-5-codex")
            .expect("gpt-5-codex should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::UnifiedExec);
        features.enable(Feature::WebSearchRequest);
        features.enable(Feature::ApplyPatchFreeform);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: true,
            include_web_search_request: true,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });

        let (tools, _) = build_specs(
@@ -1383,14 +1365,12 @@ mod tests {
    fn test_mcp_tool_anyof_defaults_to_string() {
        let model_family = find_family_for_model("gpt-5-codex")
            .expect("gpt-5-codex should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::UnifiedExec);
        features.enable(Feature::WebSearchRequest);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: false,
            include_web_search_request: true,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });

        let (tools, _) = build_specs(
@@ -1462,14 +1442,12 @@ mod tests {
    fn test_get_openai_tools_mcp_tools_with_additional_properties_schema() {
        let model_family = find_family_for_model("gpt-5-codex")
            .expect("gpt-5-codex should be a valid model family");
        let mut features = Features::with_defaults();
        features.enable(Feature::UnifiedExec);
        features.enable(Feature::WebSearchRequest);
        let config = ToolsConfig::new(&ToolsConfigParams {
            model_family: &model_family,
            include_plan_tool: false,
            include_apply_patch_tool: false,
            include_web_search_request: true,
            use_streamable_shell_tool: false,
            include_view_image_tool: true,
            experimental_unified_exec_tool: true,
            features: &features,
        });
        let (tools, _) = build_specs(
            &config,

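One behavioral note on ToolsConfig::new above: the legacy booleans are still part of ToolsConfigParams, but the constructor now derives every toggle from the Features handle and ignores them. A sketch in the style of the tests above (crate-internal types):

// Sketch: the legacy boolean below says "no web search", but Features wins.
let model_family = find_family_for_model("o3").expect("o3 should be a valid model family");
let mut features = Features::with_defaults();
features.enable(Feature::WebSearchRequest);
let config = ToolsConfig::new(&ToolsConfigParams {
    model_family: &model_family,
    include_plan_tool: false,
    include_apply_patch_tool: false,
    include_web_search_request: false, // ignored by the new constructor
    use_streamable_shell_tool: false,
    include_view_image_tool: true,
    experimental_unified_exec_tool: false,
    features: &features,
});
assert!(config.web_search_request); // derived from Features, not the legacy flag
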
@@ -49,6 +49,7 @@ impl UserNotifier {
pub(crate) enum UserNotification {
    #[serde(rename_all = "kebab-case")]
    AgentTurnComplete {
        thread_id: String,
        turn_id: String,

        /// Messages that the user sent to the agent to initiate the turn.
@@ -67,6 +68,7 @@ mod tests {
    #[test]
    fn test_user_notification() -> Result<()> {
        let notification = UserNotification::AgentTurnComplete {
            thread_id: "b5f6c1c2-1111-2222-3333-444455556666".to_string(),
            turn_id: "12345".to_string(),
            input_messages: vec!["Rename `foo` to `bar` and update the callsites.".to_string()],
            last_assistant_message: Some(
@@ -76,7 +78,7 @@ mod tests {
        let serialized = serde_json::to_string(&notification)?;
        assert_eq!(
            serialized,
            r#"{"type":"agent-turn-complete","turn-id":"12345","input-messages":["Rename `foo` to `bar` and update the callsites."],"last-assistant-message":"Rename complete and verified `cargo build` succeeds."}"#
            r#"{"type":"agent-turn-complete","thread-id":"b5f6c1c2-1111-2222-3333-444455556666","turn-id":"12345","input-messages":["Rename `foo` to `bar` and update the callsites."],"last-assistant-message":"Rename complete and verified `cargo build` succeeds."}"#
        );
        Ok(())
    }

@@ -10,8 +10,10 @@ path = "lib.rs"
anyhow = { workspace = true }
assert_cmd = { workspace = true }
codex-core = { workspace = true }
notify = { workspace = true }
regex-lite = { workspace = true }
serde_json = { workspace = true }
tempfile = { workspace = true }
tokio = { workspace = true, features = ["time"] }
walkdir = { workspace = true }
wiremock = { workspace = true }

@@ -164,6 +164,149 @@ pub fn sandbox_network_env_var() -> &'static str {
    codex_core::spawn::CODEX_SANDBOX_NETWORK_DISABLED_ENV_VAR
}

pub mod fs_wait {
    use anyhow::Result;
    use anyhow::anyhow;
    use notify::RecursiveMode;
    use notify::Watcher;
    use std::path::Path;
    use std::path::PathBuf;
    use std::sync::mpsc;
    use std::sync::mpsc::RecvTimeoutError;
    use std::time::Duration;
    use std::time::Instant;
    use tokio::task;
    use walkdir::WalkDir;

    pub async fn wait_for_path_exists(
        path: impl Into<PathBuf>,
        timeout: Duration,
    ) -> Result<PathBuf> {
        let path = path.into();
        task::spawn_blocking(move || wait_for_path_exists_blocking(path, timeout)).await?
    }

    pub async fn wait_for_matching_file(
        root: impl Into<PathBuf>,
        timeout: Duration,
        predicate: impl FnMut(&Path) -> bool + Send + 'static,
    ) -> Result<PathBuf> {
        let root = root.into();
        task::spawn_blocking(move || {
            let mut predicate = predicate;
            blocking_find_matching_file(root, timeout, &mut predicate)
        })
        .await?
    }

    fn wait_for_path_exists_blocking(path: PathBuf, timeout: Duration) -> Result<PathBuf> {
        if path.exists() {
            return Ok(path);
        }

        let watch_root = nearest_existing_ancestor(&path);
        let (tx, rx) = mpsc::channel();
        let mut watcher = notify::recommended_watcher(move |res| {
            let _ = tx.send(res);
        })?;
        watcher.watch(&watch_root, RecursiveMode::Recursive)?;

        let deadline = Instant::now() + timeout;
        loop {
            if path.exists() {
                return Ok(path.clone());
            }
            let now = Instant::now();
            if now >= deadline {
                break;
            }
            let remaining = deadline.saturating_duration_since(now);
            match rx.recv_timeout(remaining) {
                Ok(Ok(_event)) => {
                    if path.exists() {
                        return Ok(path.clone());
                    }
                }
                Ok(Err(err)) => return Err(err.into()),
                Err(RecvTimeoutError::Timeout) => break,
                Err(RecvTimeoutError::Disconnected) => break,
            }
        }

        if path.exists() {
            Ok(path)
        } else {
            Err(anyhow!("timed out waiting for {:?}", path))
        }
    }

    fn blocking_find_matching_file(
        root: PathBuf,
        timeout: Duration,
        predicate: &mut impl FnMut(&Path) -> bool,
    ) -> Result<PathBuf> {
        let root = wait_for_path_exists_blocking(root, timeout)?;

        if let Some(found) = scan_for_match(&root, predicate) {
            return Ok(found);
        }

        let (tx, rx) = mpsc::channel();
        let mut watcher = notify::recommended_watcher(move |res| {
            let _ = tx.send(res);
        })?;
        watcher.watch(&root, RecursiveMode::Recursive)?;

        let deadline = Instant::now() + timeout;

        while Instant::now() < deadline {
            let remaining = deadline.saturating_duration_since(Instant::now());
            match rx.recv_timeout(remaining) {
                Ok(Ok(_event)) => {
                    if let Some(found) = scan_for_match(&root, predicate) {
                        return Ok(found);
                    }
                }
                Ok(Err(err)) => return Err(err.into()),
                Err(RecvTimeoutError::Timeout) => break,
                Err(RecvTimeoutError::Disconnected) => break,
            }
        }

        if let Some(found) = scan_for_match(&root, predicate) {
            Ok(found)
        } else {
            Err(anyhow!("timed out waiting for matching file in {:?}", root))
        }
    }

    fn scan_for_match(root: &Path, predicate: &mut impl FnMut(&Path) -> bool) -> Option<PathBuf> {
        for entry in WalkDir::new(root).into_iter().filter_map(Result::ok) {
            let path = entry.path();
            if !entry.file_type().is_file() {
                continue;
            }
            if predicate(path) {
                return Some(path.to_path_buf());
            }
        }
        None
    }

    fn nearest_existing_ancestor(path: &Path) -> PathBuf {
        let mut current = path;
        loop {
            if current.exists() {
                return current.to_path_buf();
            }
            match current.parent() {
                Some(parent) => current = parent,
                None => return PathBuf::from("."),
            }
        }
    }
}

#[macro_export]
macro_rules! skip_if_sandbox {
    () => {{

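Typical use of the new fs_wait helpers from an async test; the .jsonl predicate is a hypothetical example, not taken from this diff:

// Sketch: wait for a directory, then for the first file matching a predicate.
use core_test_support::fs_wait;
use std::time::Duration;

let home = tempfile::TempDir::new()?;
let sessions_dir = home.path().join("sessions");
fs_wait::wait_for_path_exists(&sessions_dir, Duration::from_secs(5)).await?;
let rollout = fs_wait::wait_for_matching_file(
    &sessions_dir,
    Duration::from_secs(10),
    |path| path.extension().is_some_and(|ext| ext == "jsonl"), // hypothetical predicate
)
.await?;
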
@@ -1,4 +1,5 @@
use std::mem::swap;
use std::path::PathBuf;
use std::sync::Arc;

use codex_core::CodexAuth;
@@ -39,6 +40,12 @@ impl TestCodexBuilder {
        let mut config = load_default_config_for_test(&home);
        config.cwd = cwd.path().to_path_buf();
        config.model_provider = model_provider;
        config.codex_linux_sandbox_exe = Some(PathBuf::from(
            assert_cmd::Command::cargo_bin("codex")?
                .get_program()
                .to_os_string(),
        ));

        let mut mutators = vec![];
        swap(&mut self.config_mutators, &mut mutators);

102
codex-rs/core/tests/responses_headers.rs
Normal file
@@ -0,0 +1,102 @@
use std::sync::Arc;

use codex_app_server_protocol::AuthMode;
use codex_core::ContentItem;
use codex_core::ModelClient;
use codex_core::ModelProviderInfo;
use codex_core::Prompt;
use codex_core::ResponseEvent;
use codex_core::ResponseItem;
use codex_core::WireApi;
use codex_otel::otel_event_manager::OtelEventManager;
use codex_protocol::ConversationId;
use core_test_support::load_default_config_for_test;
use core_test_support::responses;
use futures::StreamExt;
use tempfile::TempDir;
use wiremock::matchers::header;

#[tokio::test]
async fn responses_stream_includes_task_type_header() {
    core_test_support::skip_if_no_network!();

    let server = responses::start_mock_server().await;
    let response_body = responses::sse(vec![
        responses::ev_response_created("resp-1"),
        responses::ev_completed("resp-1"),
    ]);

    let request_recorder = responses::mount_sse_once_match(
        &server,
        header("Codex-Task-Type", "standard"),
        response_body,
    )
    .await;

    let provider = ModelProviderInfo {
        name: "mock".into(),
        base_url: Some(format!("{}/v1", server.uri())),
        env_key: None,
        env_key_instructions: None,
        wire_api: WireApi::Responses,
        query_params: None,
        http_headers: None,
        env_http_headers: None,
        request_max_retries: Some(0),
        stream_max_retries: Some(0),
        stream_idle_timeout_ms: Some(5_000),
        requires_openai_auth: false,
    };

    let codex_home = TempDir::new().expect("failed to create TempDir");
    let mut config = load_default_config_for_test(&codex_home);
    config.model_provider_id = provider.name.clone();
    config.model_provider = provider.clone();
    let effort = config.model_reasoning_effort;
    let summary = config.model_reasoning_summary;
    let config = Arc::new(config);

    let conversation_id = ConversationId::new();

    let otel_event_manager = OtelEventManager::new(
        conversation_id,
        config.model.as_str(),
        config.model_family.slug.as_str(),
        None,
        Some(AuthMode::ChatGPT),
        false,
        "test".to_string(),
    );

    let client = ModelClient::new(
        Arc::clone(&config),
        None,
        otel_event_manager,
        provider,
        effort,
        summary,
        conversation_id,
    );

    let mut prompt = Prompt::default();
    prompt.input = vec![ResponseItem::Message {
        id: None,
        role: "user".into(),
        content: vec![ContentItem::InputText {
            text: "hello".into(),
        }],
    }];

    let mut stream = client.stream(&prompt).await.expect("stream failed");
    while let Some(event) = stream.next().await {
        if matches!(event, Ok(ResponseEvent::Completed { .. })) {
            break;
        }
    }

    let request = request_recorder.single_request();
    assert_eq!(
        request.header("Codex-Task-Type").as_deref(),
        Some("standard")
    );
}

@@ -1,12 +1,11 @@
use assert_cmd::Command as AssertCommand;
use codex_core::RolloutRecorder;
use codex_core::protocol::GitInfo;
+use core_test_support::fs_wait;
use core_test_support::skip_if_no_network;
use std::time::Duration;
-use std::time::Instant;
use tempfile::TempDir;
use uuid::Uuid;
-use walkdir::WalkDir;
use wiremock::Mock;
use wiremock::MockServer;
use wiremock::ResponseTemplate;
@@ -211,12 +210,12 @@ async fn responses_api_stream_cli() {

/// End-to-end: create a session (writes rollout), verify the file, then resume and confirm append.
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
-async fn integration_creates_and_checks_session_file() {
+async fn integration_creates_and_checks_session_file() -> anyhow::Result<()> {
    // Honor sandbox network restrictions for CI parity with the other tests.
-    skip_if_no_network!();
+    skip_if_no_network!(Ok(()));

    // 1. Temp home so we read/write isolated session files.
-    let home = TempDir::new().unwrap();
+    let home = TempDir::new()?;

    // 2. Unique marker we'll look for in the session log.
    let marker = format!("integration-test-{}", Uuid::new_v4());
@@ -254,63 +253,20 @@ async fn integration_creates_and_checks_session_file() {

    // Wait for sessions dir to appear.
    let sessions_dir = home.path().join("sessions");
-    let dir_deadline = Instant::now() + Duration::from_secs(5);
-    while !sessions_dir.exists() && Instant::now() < dir_deadline {
-        std::thread::sleep(Duration::from_millis(50));
-    }
-    assert!(sessions_dir.exists(), "sessions directory never appeared");
+    fs_wait::wait_for_path_exists(&sessions_dir, Duration::from_secs(5)).await?;

    // Find the session file that contains `marker`.
-    let deadline = Instant::now() + Duration::from_secs(10);
-    let mut matching_path: Option<std::path::PathBuf> = None;
-    while Instant::now() < deadline && matching_path.is_none() {
-        for entry in WalkDir::new(&sessions_dir) {
-            let entry = match entry {
-                Ok(e) => e,
-                Err(_) => continue,
-            };
-            if !entry.file_type().is_file() {
-                continue;
-            }
-            if !entry.file_name().to_string_lossy().ends_with(".jsonl") {
-                continue;
-            }
-            let path = entry.path();
-            let Ok(content) = std::fs::read_to_string(path) else {
-                continue;
-            };
-            let mut lines = content.lines();
-            if lines.next().is_none() {
-                continue;
-            }
-            for line in lines {
-                if line.trim().is_empty() {
-                    continue;
-                }
-                let item: serde_json::Value = match serde_json::from_str(line) {
-                    Ok(v) => v,
-                    Err(_) => continue,
-                };
-                if item.get("type").and_then(|t| t.as_str()) == Some("response_item")
-                    && let Some(payload) = item.get("payload")
-                    && payload.get("type").and_then(|t| t.as_str()) == Some("message")
-                    && let Some(c) = payload.get("content")
-                    && c.to_string().contains(&marker)
-                {
-                    matching_path = Some(path.to_path_buf());
-                    break;
-                }
-            }
-        }
-        if matching_path.is_none() {
-            std::thread::sleep(Duration::from_millis(50));
-        }
-    }
-
-    let path = match matching_path {
-        Some(p) => p,
-        None => panic!("No session file containing the marker was found"),
-    };
+    let marker_clone = marker.clone();
+    let path = fs_wait::wait_for_matching_file(&sessions_dir, Duration::from_secs(10), move |p| {
+        if p.extension().and_then(|ext| ext.to_str()) != Some("jsonl") {
+            return false;
+        }
+        let Ok(content) = std::fs::read_to_string(p) else {
+            return false;
+        };
+        content.contains(&marker_clone)
+    })
+    .await?;

    // Basic sanity checks on location and metadata.
    let rel = match path.strip_prefix(&sessions_dir) {
@@ -418,42 +374,25 @@ async fn integration_creates_and_checks_session_file() {
    assert!(output2.status.success(), "resume codex-cli run failed");

    // Find the new session file containing the resumed marker.
-    let deadline = Instant::now() + Duration::from_secs(10);
-    let mut resumed_path: Option<std::path::PathBuf> = None;
-    while Instant::now() < deadline && resumed_path.is_none() {
-        for entry in WalkDir::new(&sessions_dir) {
-            let entry = match entry {
-                Ok(e) => e,
-                Err(_) => continue,
-            };
-            if !entry.file_type().is_file() {
-                continue;
-            }
-            if !entry.file_name().to_string_lossy().ends_with(".jsonl") {
-                continue;
-            }
-            let p = entry.path();
-            let Ok(c) = std::fs::read_to_string(p) else {
-                continue;
-            };
-            if c.contains(&marker2) {
-                resumed_path = Some(p.to_path_buf());
-                break;
-            }
-        }
-        if resumed_path.is_none() {
-            std::thread::sleep(Duration::from_millis(50));
-        }
-    }
-
-    let resumed_path = resumed_path.expect("No resumed session file found containing the marker2");
+    let marker2_clone = marker2.clone();
+    let resumed_path =
+        fs_wait::wait_for_matching_file(&sessions_dir, Duration::from_secs(10), move |p| {
+            if p.extension().and_then(|ext| ext.to_str()) != Some("jsonl") {
+                return false;
+            }
+            std::fs::read_to_string(p)
+                .map(|content| content.contains(&marker2_clone))
+                .unwrap_or(false)
+        })
+        .await?;
    // Resume should write to the existing log file.
    assert_eq!(
        resumed_path, path,
        "resume should create a new session file"
    );

-    let resumed_content = std::fs::read_to_string(&resumed_path).unwrap();
+    let resumed_content = std::fs::read_to_string(&resumed_path)?;
    assert!(
        resumed_content.contains(&marker),
        "resumed file missing original marker"
@@ -462,6 +401,7 @@ async fn integration_creates_and_checks_session_file() {
        resumed_content.contains(&marker2),
        "resumed file missing resumed marker"
    );
+    Ok(())
}

/// Integration test to verify git info is collected and recorded in session files.
@@ -4,6 +4,7 @@ use codex_core::CodexAuth;
use codex_core::ConversationManager;
use codex_core::ModelProviderInfo;
use codex_core::built_in_model_providers;
+use codex_core::features::Feature;
use codex_core::model_family::find_family_for_model;
use codex_core::protocol::EventMsg;
use codex_core::protocol::InputItem;
@@ -56,12 +57,12 @@ async fn collect_tool_identifiers_for_model(model: &str) -> Vec<String> {
    config.model = model.to_string();
    config.model_family =
        find_family_for_model(model).unwrap_or_else(|| panic!("unknown model family for {model}"));
-    config.include_plan_tool = false;
-    config.include_apply_patch_tool = false;
-    config.include_view_image_tool = false;
-    config.tools_web_search_request = false;
-    config.use_experimental_streamable_shell_tool = false;
-    config.use_experimental_unified_exec_tool = false;
+    config.features.disable(Feature::PlanTool);
+    config.features.disable(Feature::ApplyPatchFreeform);
+    config.features.disable(Feature::ViewImageTool);
+    config.features.disable(Feature::WebSearchRequest);
+    config.features.disable(Feature::StreamableShell);
+    config.features.disable(Feature::UnifiedExec);

    let conversation_manager =
        ConversationManager::with_auth(CodexAuth::from_api_key("Test API Key"));
@@ -5,6 +5,7 @@ use codex_core::ConversationManager;
use codex_core::ModelProviderInfo;
use codex_core::built_in_model_providers;
use codex_core::config::OPENAI_DEFAULT_MODEL;
+use codex_core::features::Feature;
use codex_core::model_family::find_family_for_model;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::EventMsg;
@@ -99,10 +100,10 @@ async fn codex_mini_latest_tools() {
    config.cwd = cwd.path().to_path_buf();
    config.model_provider = model_provider;
    config.user_instructions = Some("be consistent and helpful".to_string());
-    config.include_apply_patch_tool = false;
+    config.features.disable(Feature::ApplyPatchFreeform);
    config.model = "codex-mini-latest".to_string();
    config.model_family = find_family_for_model("codex-mini-latest").unwrap();

    let conversation_manager =
        ConversationManager::with_auth(CodexAuth::from_api_key("Test API Key"));
@@ -185,7 +186,7 @@ async fn prompt_tools_are_consistent_across_requests() {
    config.cwd = cwd.path().to_path_buf();
    config.model_provider = model_provider;
    config.user_instructions = Some("be consistent and helpful".to_string());
-    config.include_plan_tool = true;
+    config.features.enable(Feature::PlanTool);

    let conversation_manager =
        ConversationManager::with_auth(CodexAuth::from_api_key("Test API Key"));
@@ -9,6 +9,7 @@ use std::time::UNIX_EPOCH;

use codex_core::config_types::McpServerConfig;
use codex_core::config_types::McpServerTransportConfig;
+use codex_core::features::Feature;

use codex_core::protocol::AskForApproval;
use codex_core::protocol::EventMsg;
@@ -74,7 +75,7 @@ async fn stdio_server_round_trip() -> anyhow::Result<()> {

    let fixture = test_codex()
        .with_config(move |config| {
-            config.use_experimental_use_rmcp_client = true;
+            config.features.enable(Feature::RmcpClient);
            config.mcp_servers.insert(
                server_name.to_string(),
                McpServerConfig {
@@ -227,7 +228,7 @@ async fn streamable_http_tool_call_round_trip() -> anyhow::Result<()> {

    let fixture = test_codex()
        .with_config(move |config| {
-            config.use_experimental_use_rmcp_client = true;
+            config.features.enable(Feature::RmcpClient);
            config.mcp_servers.insert(
                server_name.to_string(),
                McpServerConfig {
@@ -408,7 +409,7 @@ async fn streamable_http_with_oauth_round_trip() -> anyhow::Result<()> {

    let fixture = test_codex()
        .with_config(move |config| {
-            config.use_experimental_use_rmcp_client = true;
+            config.features.enable(Feature::RmcpClient);
            config.mcp_servers.insert(
                server_name.to_string(),
                McpServerConfig {
@@ -1,6 +1,7 @@
#![cfg(not(target_os = "windows"))]

use anyhow::Result;
+use codex_core::features::Feature;
use codex_core::model_family::find_family_for_model;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::EventMsg;
@@ -9,9 +10,12 @@ use codex_core::protocol::Op;
use codex_core::protocol::SandboxPolicy;
use codex_protocol::config_types::ReasoningSummary;
use core_test_support::assert_regex_match;
+use core_test_support::responses::ev_apply_patch_function_call;
use core_test_support::responses::ev_assistant_message;
use core_test_support::responses::ev_completed;
+use core_test_support::responses::ev_custom_tool_call;
use core_test_support::responses::ev_function_call;
+use core_test_support::responses::ev_local_shell_call;
use core_test_support::responses::ev_response_created;
use core_test_support::responses::mount_sse_sequence;
use core_test_support::responses::sse;
@@ -20,8 +24,11 @@ use core_test_support::skip_if_no_network;
use core_test_support::test_codex::TestCodex;
use core_test_support::test_codex::test_codex;
use core_test_support::wait_for_event;
+use pretty_assertions::assert_eq;
+use regex_lite::Regex;
use serde_json::Value;
use serde_json::json;
+use std::fs;

async fn submit_turn(test: &TestCodex, prompt: &str, sandbox_policy: SandboxPolicy) -> Result<()> {
    let session_model = test.session_configured.model.clone();
@@ -71,13 +78,28 @@ fn find_function_call_output<'a>(bodies: &'a [Value], call_id: &str) -> Option<&
    None
}

fn find_custom_tool_call_output<'a>(bodies: &'a [Value], call_id: &str) -> Option<&'a Value> {
    for body in bodies {
        if let Some(items) = body.get("input").and_then(Value::as_array) {
            for item in items {
                if item.get("type").and_then(Value::as_str) == Some("custom_tool_call_output")
                    && item.get("call_id").and_then(Value::as_str) == Some(call_id)
                {
                    return Some(item);
                }
            }
        }
    }
    None
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn shell_output_stays_json_without_freeform_apply_patch() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
-        config.include_apply_patch_tool = false;
+        config.features.disable(Feature::ApplyPatchFreeform);
        config.model = "gpt-5".to_string();
        config.model_family = find_family_for_model("gpt-5").expect("gpt-5 is a model family");
    });
@@ -119,7 +141,12 @@ async fn shell_output_stays_json_without_freeform_apply_patch() -> Result<()> {
        .and_then(Value::as_str)
        .expect("shell output string");

-    let parsed: Value = serde_json::from_str(output)?;
+    let mut parsed: Value = serde_json::from_str(output)?;
+    if let Some(metadata) = parsed.get_mut("metadata").and_then(Value::as_object_mut) {
+        // duration_seconds is non-deterministic; remove it for deep equality
+        let _ = metadata.remove("duration_seconds");
+    }

    assert_eq!(
        parsed
            .get("metadata")
@@ -143,7 +170,7 @@ async fn shell_output_is_structured_with_freeform_apply_patch() -> Result<()> {

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
-        config.include_apply_patch_tool = true;
+        config.features.enable(Feature::ApplyPatchFreeform);
    });
    let test = builder.build(&server).await?;

@@ -198,6 +225,83 @@ freeform shell
    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn shell_output_for_freeform_tool_records_duration() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
        config.include_apply_patch_tool = true;
    });
    let test = builder.build(&server).await?;

    #[cfg(target_os = "linux")]
    let sleep_cmd = vec!["/bin/bash", "-c", "sleep 1"];

    #[cfg(target_os = "macos")]
    let sleep_cmd = vec!["/bin/bash", "-c", "sleep 1"];

    #[cfg(windows)]
    let sleep_cmd = "timeout 1";

    let call_id = "shell-structured";
    let args = json!({
        "command": sleep_cmd,
        "timeout_ms": 2_000,
    });
    let responses = vec![
        sse(vec![
            json!({"type": "response.created", "response": {"id": "resp-1"}}),
            ev_function_call(call_id, "shell", &serde_json::to_string(&args)?),
            ev_completed("resp-1"),
        ]),
        sse(vec![
            ev_assistant_message("msg-1", "done"),
            ev_completed("resp-2"),
        ]),
    ];
    mount_sse_sequence(&server, responses).await;

    submit_turn(
        &test,
        "run the structured shell command",
        SandboxPolicy::DangerFullAccess,
    )
    .await?;

    let requests = server
        .received_requests()
        .await
        .expect("recorded requests present");
    let bodies = request_bodies(&requests)?;
    let output_item =
        find_function_call_output(&bodies, call_id).expect("structured output present");
    let output = output_item
        .get("output")
        .and_then(Value::as_str)
        .expect("structured output string");

    let expected_pattern = r#"(?s)^Exit code: 0
Wall time: [0-9]+(?:\.[0-9]+)? seconds
Output:
$"#;
    assert_regex_match(expected_pattern, output);

    let wall_time_regex = Regex::new(r"(?m)^Wall (?:time|Clock): ([0-9]+(?:\.[0-9]+)?) seconds$")
        .expect("compile wall time regex");
    let wall_time_seconds = wall_time_regex
        .captures(output)
        .and_then(|caps| caps.get(1))
        .and_then(|value| value.as_str().parse::<f32>().ok())
        .expect("expected structured shell output to contain wall time seconds");
    assert!(
        wall_time_seconds > 0.5,
        "expected wall time to be greater than zero seconds, got {wall_time_seconds}"
    );

    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn shell_output_reserializes_truncated_content() -> Result<()> {
    skip_if_no_network!(Ok(()));
@@ -275,3 +379,428 @@ $"#;

    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn apply_patch_custom_tool_output_is_structured() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
        config.include_apply_patch_tool = true;
    });
    let test = builder.build(&server).await?;

    let call_id = "apply-patch-structured";
    let file_name = "structured.txt";
    let patch = format!(
        r#"*** Begin Patch
*** Add File: {file_name}
+from custom tool
*** End Patch
"#
    );
    let responses = vec![
        sse(vec![
            json!({"type": "response.created", "response": {"id": "resp-1"}}),
            ev_custom_tool_call(call_id, "apply_patch", &patch),
            ev_completed("resp-1"),
        ]),
        sse(vec![
            ev_assistant_message("msg-1", "done"),
            ev_completed("resp-2"),
        ]),
    ];
    mount_sse_sequence(&server, responses).await;

    submit_turn(
        &test,
        "apply the patch via custom tool",
        SandboxPolicy::DangerFullAccess,
    )
    .await?;

    let requests = server
        .received_requests()
        .await
        .expect("recorded requests present");
    let bodies = request_bodies(&requests)?;
    let output_item =
        find_custom_tool_call_output(&bodies, call_id).expect("apply_patch output present");
    let output = output_item
        .get("output")
        .and_then(Value::as_str)
        .expect("apply_patch output string");

    let expected_pattern = format!(
        r"(?s)^Exit code: 0
Wall time: [0-9]+(?:\.[0-9]+)? seconds
Output:
Success. Updated the following files:
A {file_name}
?$"
    );
    assert_regex_match(&expected_pattern, output);

    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn apply_patch_custom_tool_call_creates_file() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
        config.include_apply_patch_tool = true;
    });
    let test = builder.build(&server).await?;

    let call_id = "apply-patch-add-file";
    let file_name = "custom_tool_apply_patch.txt";
    let patch = format!(
        "*** Begin Patch\n*** Add File: {file_name}\n+custom tool content\n*** End Patch\n"
    );
    let responses = vec![
        sse(vec![
            json!({"type": "response.created", "response": {"id": "resp-1"}}),
            ev_custom_tool_call(call_id, "apply_patch", &patch),
            ev_completed("resp-1"),
        ]),
        sse(vec![
            ev_assistant_message("msg-1", "apply_patch done"),
            ev_completed("resp-2"),
        ]),
    ];
    mount_sse_sequence(&server, responses).await;

    submit_turn(
        &test,
        "apply the patch via custom tool to create a file",
        SandboxPolicy::DangerFullAccess,
    )
    .await?;

    let requests = server
        .received_requests()
        .await
        .expect("recorded requests present");
    let bodies = request_bodies(&requests)?;
    let output_item =
        find_custom_tool_call_output(&bodies, call_id).expect("apply_patch output present");
    let output = output_item
        .get("output")
        .and_then(Value::as_str)
        .expect("apply_patch output string");

    let expected_pattern = format!(
        r"(?s)^Exit code: 0
Wall time: [0-9]+(?:\.[0-9]+)? seconds
Output:
Success. Updated the following files:
A {file_name}
?$"
    );
    assert_regex_match(&expected_pattern, output);

    let new_file_path = test.cwd.path().join(file_name);
    let created_contents = fs::read_to_string(&new_file_path)?;
    assert_eq!(
        created_contents, "custom tool content\n",
        "expected file contents for {file_name}"
    );

    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn apply_patch_custom_tool_call_updates_existing_file() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
        config.include_apply_patch_tool = true;
    });
    let test = builder.build(&server).await?;

    let call_id = "apply-patch-update-file";
    let file_name = "custom_tool_apply_patch_existing.txt";
    let file_path = test.cwd.path().join(file_name);
    fs::write(&file_path, "before\n")?;
    let patch = format!(
        "*** Begin Patch\n*** Update File: {file_name}\n@@\n-before\n+after\n*** End Patch\n"
    );
    let responses = vec![
        sse(vec![
            json!({"type": "response.created", "response": {"id": "resp-1"}}),
            ev_custom_tool_call(call_id, "apply_patch", &patch),
            ev_completed("resp-1"),
        ]),
        sse(vec![
            ev_assistant_message("msg-1", "apply_patch update done"),
            ev_completed("resp-2"),
        ]),
    ];
    mount_sse_sequence(&server, responses).await;

    submit_turn(
        &test,
        "apply the patch via custom tool to update a file",
        SandboxPolicy::DangerFullAccess,
    )
    .await?;

    let requests = server
        .received_requests()
        .await
        .expect("recorded requests present");
    let bodies = request_bodies(&requests)?;
    let output_item =
        find_custom_tool_call_output(&bodies, call_id).expect("apply_patch output present");
    let output = output_item
        .get("output")
        .and_then(Value::as_str)
        .expect("apply_patch output string");

    let expected_pattern = format!(
        r"(?s)^Exit code: 0
Wall time: [0-9]+(?:\.[0-9]+)? seconds
Output:
Success. Updated the following files:
M {file_name}
?$"
    );
    assert_regex_match(&expected_pattern, output);

    let updated_contents = fs::read_to_string(file_path)?;
    assert_eq!(updated_contents, "after\n", "expected updated file content");

    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn apply_patch_custom_tool_call_reports_failure_output() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
        config.include_apply_patch_tool = true;
    });
    let test = builder.build(&server).await?;

    let call_id = "apply-patch-failure";
    let missing_file = "missing_custom_tool_apply_patch.txt";
    let patch = format!(
        "*** Begin Patch\n*** Update File: {missing_file}\n@@\n-before\n+after\n*** End Patch\n"
    );
    let responses = vec![
        sse(vec![
            json!({"type": "response.created", "response": {"id": "resp-1"}}),
            ev_custom_tool_call(call_id, "apply_patch", &patch),
            ev_completed("resp-1"),
        ]),
        sse(vec![
            ev_assistant_message("msg-1", "apply_patch failure done"),
            ev_completed("resp-2"),
        ]),
    ];
    mount_sse_sequence(&server, responses).await;

    submit_turn(
        &test,
        "attempt a failing apply_patch via custom tool",
        SandboxPolicy::DangerFullAccess,
    )
    .await?;

    let requests = server
        .received_requests()
        .await
        .expect("recorded requests present");
    let bodies = request_bodies(&requests)?;
    let output_item =
        find_custom_tool_call_output(&bodies, call_id).expect("apply_patch output present");
    let output = output_item
        .get("output")
        .and_then(Value::as_str)
        .expect("apply_patch output string");

    let expected_output = format!(
        "apply_patch verification failed: Failed to read file to update {}/{missing_file}: No such file or directory (os error 2)",
        test.cwd.path().to_string_lossy()
    );
    assert_eq!(output, expected_output);

    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn apply_patch_function_call_output_is_structured() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
        config.include_apply_patch_tool = true;
    });
    let test = builder.build(&server).await?;

    let call_id = "apply-patch-function";
    let file_name = "function_apply_patch.txt";
    let patch =
        format!("*** Begin Patch\n*** Add File: {file_name}\n+via function call\n*** End Patch\n");
    let responses = vec![
        sse(vec![
            json!({"type": "response.created", "response": {"id": "resp-1"}}),
            ev_apply_patch_function_call(call_id, &patch),
            ev_completed("resp-1"),
        ]),
        sse(vec![
            ev_assistant_message("msg-1", "apply_patch function done"),
            ev_completed("resp-2"),
        ]),
    ];
    mount_sse_sequence(&server, responses).await;

    submit_turn(
        &test,
        "apply the patch via function-call apply_patch",
        SandboxPolicy::DangerFullAccess,
    )
    .await?;

    let requests = server
        .received_requests()
        .await
        .expect("recorded requests present");
    let bodies = request_bodies(&requests)?;
    let output_item =
        find_function_call_output(&bodies, call_id).expect("apply_patch function output present");
    let output = output_item
        .get("output")
        .and_then(Value::as_str)
        .expect("apply_patch output string");

    let expected_pattern = format!(
        r"(?s)^Exit code: 0
Wall time: [0-9]+(?:\.[0-9]+)? seconds
Output:
Success. Updated the following files:
A {file_name}
?$"
    );
    assert_regex_match(&expected_pattern, output);

    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn shell_output_is_structured_for_nonzero_exit() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
        config.model = "gpt-5-codex".to_string();
        config.model_family =
            find_family_for_model("gpt-5-codex").expect("gpt-5-codex is a model family");
        config.include_apply_patch_tool = true;
    });
    let test = builder.build(&server).await?;

    let call_id = "shell-nonzero-exit";
    let args = json!({
        "command": ["/bin/sh", "-c", "exit 42"],
        "timeout_ms": 1_000,
    });
    let responses = vec![
        sse(vec![
            json!({"type": "response.created", "response": {"id": "resp-1"}}),
            ev_function_call(call_id, "shell", &serde_json::to_string(&args)?),
            ev_completed("resp-1"),
        ]),
        sse(vec![
            ev_assistant_message("msg-1", "shell failure handled"),
            ev_completed("resp-2"),
        ]),
    ];
    mount_sse_sequence(&server, responses).await;

    submit_turn(
        &test,
        "run the failing shell command",
        SandboxPolicy::DangerFullAccess,
    )
    .await?;

    let requests = server
        .received_requests()
        .await
        .expect("recorded requests present");
    let bodies = request_bodies(&requests)?;
    let output_item = find_function_call_output(&bodies, call_id).expect("shell output present");
    let output = output_item
        .get("output")
        .and_then(Value::as_str)
        .expect("shell output string");

    let expected_pattern = r"(?s)^Exit code: 42
Wall time: [0-9]+(?:\.[0-9]+)? seconds
Output:
?$";
    assert_regex_match(expected_pattern, output);

    Ok(())
}

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn local_shell_call_output_is_structured() -> Result<()> {
    skip_if_no_network!(Ok(()));

    let server = start_mock_server().await;
    let mut builder = test_codex().with_config(|config| {
        config.model = "gpt-5-codex".to_string();
        config.model_family =
            find_family_for_model("gpt-5-codex").expect("gpt-5-codex is a model family");
        config.include_apply_patch_tool = true;
    });
    let test = builder.build(&server).await?;

    let call_id = "local-shell-call";
    let responses = vec![
        sse(vec![
            json!({"type": "response.created", "response": {"id": "resp-1"}}),
            ev_local_shell_call(call_id, "completed", vec!["/bin/echo", "local shell"]),
            ev_completed("resp-1"),
        ]),
        sse(vec![
            ev_assistant_message("msg-1", "local shell done"),
            ev_completed("resp-2"),
        ]),
    ];
    mount_sse_sequence(&server, responses).await;

    submit_turn(
        &test,
        "run the local shell command",
        SandboxPolicy::DangerFullAccess,
    )
    .await?;

    let requests = server
        .received_requests()
        .await
        .expect("recorded requests present");
    let bodies = request_bodies(&requests)?;
    let output_item =
        find_function_call_output(&bodies, call_id).expect("local shell output present");
    let output = output_item
        .get("output")
        .and_then(Value::as_str)
        .expect("local shell output string");

    let expected_pattern = r"(?s)^Exit code: 0
Wall time: [0-9]+(?:\.[0-9]+)? seconds
Output:
local shell
?$";
    assert_regex_match(expected_pattern, output);

    Ok(())
}
@@ -1,6 +1,9 @@
#![cfg(not(target_os = "windows"))]

+use std::fs;
+
use assert_matches::assert_matches;
+use codex_core::features::Feature;
use codex_core::model_family::find_family_for_model;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::EventMsg;
@@ -104,7 +107,7 @@ async fn update_plan_tool_emits_plan_update_event() -> anyhow::Result<()> {
    let server = start_mock_server().await;

    let mut builder = test_codex().with_config(|config| {
-        config.include_plan_tool = true;
+        config.features.enable(Feature::PlanTool);
    });
    let TestCodex {
        codex,
@@ -191,7 +194,7 @@ async fn update_plan_tool_rejects_malformed_payload() -> anyhow::Result<()> {
    let server = start_mock_server().await;

    let mut builder = test_codex().with_config(|config| {
-        config.include_plan_tool = true;
+        config.features.enable(Feature::PlanTool);
    });
    let TestCodex {
        codex,
@@ -285,7 +288,7 @@ async fn apply_patch_tool_executes_and_emits_patch_events() -> anyhow::Result<()
    let server = start_mock_server().await;

    let mut builder = test_codex().with_config(|config| {
-        config.include_apply_patch_tool = true;
+        config.features.enable(Feature::ApplyPatchFreeform);
    });
    let TestCodex {
        codex,
@@ -294,15 +297,19 @@ async fn apply_patch_tool_executes_and_emits_patch_events() -> anyhow::Result<()
        ..
    } = builder.build(&server).await?;

+    let file_name = "notes.txt";
+    let file_path = cwd.path().join(file_name);
    let call_id = "apply-patch-call";
-    let patch_content = r#"*** Begin Patch
-*** Add File: notes.txt
+    let patch_content = format!(
+        r#"*** Begin Patch
+*** Add File: {file_name}
+Tool harness apply patch
-*** End Patch"#;
+*** End Patch"#
+    );

    let first_response = sse(vec![
        ev_response_created("resp-1"),
-        ev_apply_patch_function_call(call_id, patch_content),
+        ev_apply_patch_function_call(call_id, &patch_content),
        ev_completed("resp-1"),
    ]);
    responses::mount_sse_once_match(&server, any(), first_response).await;
@@ -351,6 +358,7 @@ async fn apply_patch_tool_executes_and_emits_patch_events() -> anyhow::Result<()
    assert!(saw_patch_begin, "expected PatchApplyBegin event");
    let patch_end_success =
        patch_end_success.expect("expected PatchApplyEnd event to capture success flag");
+    assert!(patch_end_success);

    let req = second_mock.single_request();
    let output_item = req.function_call_output(call_id);
@@ -360,38 +368,21 @@ async fn apply_patch_tool_executes_and_emits_patch_events() -> anyhow::Result<()
    );
    let output_text = extract_output_text(&output_item).expect("output text present");

-    if let Ok(exec_output) = serde_json::from_str::<Value>(output_text) {
-        let exit_code = exec_output["metadata"]["exit_code"]
-            .as_i64()
-            .expect("exit_code present");
-        let summary = exec_output["output"].as_str().expect("output field");
-        assert_eq!(
-            exit_code, 0,
-            "expected apply_patch exit_code=0, got {exit_code}, summary: {summary:?}"
-        );
-        assert!(
-            patch_end_success,
-            "expected PatchApplyEnd success flag, summary: {summary:?}"
-        );
-        assert!(
-            summary.contains("Success."),
-            "expected apply_patch summary to note success, got {summary:?}"
-        );
-        let patched_path = cwd.path().join("notes.txt");
-        let contents = std::fs::read_to_string(&patched_path)
-            .unwrap_or_else(|e| panic!("failed reading {}: {e}", patched_path.display()));
-        assert_eq!(contents, "Tool harness apply patch\n");
-    } else {
-        assert!(
-            output_text.contains("codex-run-as-apply-patch"),
-            "expected apply_patch failure message to mention codex-run-as-apply-patch, got {output_text:?}"
-        );
-        assert!(
-            !patch_end_success,
-            "expected PatchApplyEnd to report success=false when apply_patch invocation fails"
-        );
-    }
+    let expected_pattern = format!(
+        r"(?s)^Exit code: 0
Wall time: [0-9]+(?:\.[0-9]+)? seconds
Output:
Success. Updated the following files:
A {file_name}
?$"
+    );
+    assert_regex_match(&expected_pattern, output_text);
+
+    let updated_contents = fs::read_to_string(file_path)?;
+    assert_eq!(
+        updated_contents, "Tool harness apply patch\n",
+        "expected updated file content"
+    );

    Ok(())
}
@@ -403,7 +394,7 @@ async fn apply_patch_reports_parse_diagnostics() -> anyhow::Result<()> {
    let server = start_mock_server().await;

    let mut builder = test_codex().with_config(|config| {
-        config.include_apply_patch_tool = true;
+        config.features.enable(Feature::ApplyPatchFreeform);
    });
    let TestCodex {
        codex,
@@ -2,6 +2,7 @@
#![allow(clippy::unwrap_used, clippy::expect_used)]

use anyhow::Result;
+use codex_core::features::Feature;
use codex_core::model_family::find_family_for_model;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::EventMsg;
@@ -293,7 +294,11 @@ async fn collect_tools(use_unified_exec: bool) -> Result<Vec<String>> {
    let mock = mount_sse_sequence(&server, responses).await;

    let mut builder = test_codex().with_config(move |config| {
-        config.use_experimental_unified_exec_tool = use_unified_exec;
+        if use_unified_exec {
+            config.features.enable(Feature::UnifiedExec);
+        } else {
+            config.features.disable(Feature::UnifiedExec);
+        }
    });
    let test = builder.build(&server).await?;

@@ -3,6 +3,7 @@
use std::collections::HashMap;

use anyhow::Result;
+use codex_core::features::Feature;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::EventMsg;
use codex_core::protocol::InputItem;
@@ -42,7 +43,13 @@ fn collect_tool_outputs(bodies: &[Value]) -> Result<HashMap<String, Value>> {
        if let Some(call_id) = item.get("call_id").and_then(Value::as_str) {
            let content = extract_output_text(item)
                .ok_or_else(|| anyhow::anyhow!("missing tool output content"))?;
-            let parsed: Value = serde_json::from_str(content)?;
+            let trimmed = content.trim();
+            if trimmed.is_empty() {
+                continue;
+            }
+            let parsed: Value = serde_json::from_str(trimmed).map_err(|err| {
+                anyhow::anyhow!("failed to parse tool output content {trimmed:?}: {err}")
+            })?;
            outputs.insert(call_id.to_string(), parsed);
        }
    }
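For context on why this guard matters: serde_json refuses to parse an empty string, so a tool call that produced no output would previously abort the whole collection with an opaque error. A tiny illustration of the failure mode the trim-and-skip avoids (illustrative snippet, not from the source):

fn main() {
    // Empty or whitespace-only tool output is not valid JSON.
    let parsed: Result<serde_json::Value, serde_json::Error> = serde_json::from_str("");
    assert!(parsed.is_err()); // fails with "EOF while parsing a value"
    // The new guard trims first and skips empty outputs instead of erroring.
    assert!("  \n".trim().is_empty());
}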
@@ -59,7 +66,7 @@ async fn unified_exec_reuses_session_via_stdin() -> Result<()> {
    let server = start_mock_server().await;

    let mut builder = test_codex().with_config(|config| {
-        config.use_experimental_unified_exec_tool = true;
+        config.features.enable(Feature::UnifiedExec);
    });
    let TestCodex {
        codex,
@@ -176,6 +183,7 @@ async fn unified_exec_streams_after_lagged_output() -> Result<()> {

    let mut builder = test_codex().with_config(|config| {
        config.use_experimental_unified_exec_tool = true;
+        config.features.enable(Feature::UnifiedExec);
    });
    let TestCodex {
        codex,
@@ -300,7 +308,7 @@ async fn unified_exec_timeout_and_followup_poll() -> Result<()> {
    let server = start_mock_server().await;

    let mut builder = test_codex().with_config(|config| {
-        config.use_experimental_unified_exec_tool = true;
+        config.features.enable(Feature::UnifiedExec);
    });
    let TestCodex {
        codex,
@@ -5,6 +5,7 @@ use std::os::unix::fs::PermissionsExt;
use codex_core::protocol::EventMsg;
use codex_core::protocol::InputItem;
use codex_core::protocol::Op;
+use core_test_support::fs_wait;
use core_test_support::responses;
use core_test_support::skip_if_no_network;
use core_test_support::test_codex::TestCodex;
@@ -17,8 +18,7 @@ use responses::ev_assistant_message;
use responses::ev_completed;
use responses::sse;
use responses::start_mock_server;
-use tokio::time::Duration;
-use tokio::time::sleep;
+use std::time::Duration;

#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn summarize_context_three_requests_and_instructions() -> anyhow::Result<()> {
@@ -60,14 +60,7 @@ echo -n "${@: -1}" > $(dirname "${0}")/notify.txt"#,
    wait_for_event(&codex, |ev| matches!(ev, EventMsg::TaskComplete(_))).await;

    // We fork the notify script, so we need to wait for it to write to the file.
-    for _ in 0..100u32 {
-        if notify_file.exists() {
-            break;
-        }
-        sleep(Duration::from_millis(100)).await;
-    }
-
-    assert!(notify_file.exists());
+    fs_wait::wait_for_path_exists(&notify_file, Duration::from_secs(5)).await?;

    Ok(())
}
@@ -44,7 +44,7 @@ pub async fn determine_streamable_http_auth_status(
}

/// Attempt to determine whether a streamable HTTP MCP server advertises OAuth login.
-async fn supports_oauth_login(url: &str) -> Result<bool> {
+pub async fn supports_oauth_login(url: &str) -> Result<bool> {
    let base_url = Url::parse(url)?;
    let client = Client::builder().timeout(DISCOVERY_TIMEOUT).build()?;

@@ -7,6 +7,7 @@ mod rmcp_client;
mod utils;

pub use auth_status::determine_streamable_http_auth_status;
+pub use auth_status::supports_oauth_login;
pub use codex_protocol::protocol::McpAuthStatus;
pub use oauth::OAuthCredentialsStoreMode;
pub use oauth::StoredOAuthTokens;
@@ -165,8 +165,9 @@ impl ChatComposer {
            .unwrap_or_else(|| footer_height(footer_props));
        let footer_spacing = Self::footer_spacing(footer_hint_height);
        let footer_total_height = footer_hint_height + footer_spacing;
+        const COLS_WITH_MARGIN: u16 = LIVE_PREFIX_COLS + 1;
        self.textarea
-            .desired_height(width.saturating_sub(LIVE_PREFIX_COLS))
+            .desired_height(width.saturating_sub(COLS_WITH_MARGIN))
            + 2
            + match &self.active_popup {
                ActivePopup::None => footer_total_height,
@@ -197,7 +198,9 @@ impl ChatComposer {
        let [composer_rect, popup_rect] =
            Layout::vertical([Constraint::Min(1), popup_constraint]).areas(area);
        let mut textarea_rect = composer_rect;
-        textarea_rect.width = textarea_rect.width.saturating_sub(LIVE_PREFIX_COLS);
+        textarea_rect.width = textarea_rect.width.saturating_sub(
+            LIVE_PREFIX_COLS + 1, /* keep a one-column right margin for wrapping */
+        );
        textarea_rect.x = textarea_rect.x.saturating_add(LIVE_PREFIX_COLS);
        [composer_rect, textarea_rect, popup_rect]
    }
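The two hunks above reserve one extra column past the input prefix so wrapped lines keep a right margin. A quick sanity check of the arithmetic (illustrative snippet; the value of LIVE_PREFIX_COLS is an assumption, not taken from the source):

const LIVE_PREFIX_COLS: u16 = 2; // assumed width of the "› " prefix
const COLS_WITH_MARGIN: u16 = LIVE_PREFIX_COLS + 1;

fn wrap_width(total: u16) -> u16 {
    // Mirrors the new saturating_sub(LIVE_PREFIX_COLS + 1) call sites.
    total.saturating_sub(COLS_WITH_MARGIN).max(1)
}

fn main() {
    assert_eq!(wrap_width(80), 77); // 2 prefix columns + 1 right-margin column
    assert_eq!(wrap_width(2), 1);   // saturates instead of underflowing
}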
@@ -962,6 +965,7 @@ impl ChatComposer {
        }
        let mut text = self.textarea.text().to_string();
        let original_input = text.clone();
+        let input_starts_with_space = original_input.starts_with(' ');
        self.textarea.set_text("");

        // Replace all pending pastes in the text
@@ -975,6 +979,35 @@ impl ChatComposer {
        // If there is neither text nor attachments, suppress submission entirely.
        let has_attachments = !self.attached_images.is_empty();
        text = text.trim().to_string();
+        if let Some((name, _rest)) = parse_slash_name(&text) {
+            let treat_as_plain_text = input_starts_with_space || name.contains('/');
+            if !treat_as_plain_text {
+                let is_builtin = built_in_slash_commands()
+                    .into_iter()
+                    .any(|(command_name, _)| command_name == name);
+                let prompt_prefix = format!("{PROMPTS_CMD_PREFIX}:");
+                let is_known_prompt = name
+                    .strip_prefix(&prompt_prefix)
+                    .map(|prompt_name| {
+                        self.custom_prompts
+                            .iter()
+                            .any(|prompt| prompt.name == prompt_name)
+                    })
+                    .unwrap_or(false);
+                if !is_builtin && !is_known_prompt {
+                    let message = format!(
+                        r#"Unrecognized command '/{name}'. Type "/" for a list of supported commands."#
+                    );
+                    self.app_event_tx.send(AppEvent::InsertHistoryCell(Box::new(
+                        history_cell::new_info_event(message, None),
+                    )));
+                    self.textarea.set_text(&original_input);
+                    self.textarea.set_cursor(original_input.len());
+                    return (InputResult::None, true);
+                }
+            }
+        }

        let expanded_prompt = match expand_custom_prompt(&text, &self.custom_prompts) {
            Ok(expanded) => expanded,
            Err(err) => {
@@ -2877,6 +2910,76 @@ mod tests {
        assert!(composer.textarea.is_empty());
    }

    #[test]
    fn slash_path_input_submits_without_command_error() {
        use crossterm::event::KeyCode;
        use crossterm::event::KeyEvent;
        use crossterm::event::KeyModifiers;

        let (tx, mut rx) = unbounded_channel::<AppEvent>();
        let sender = AppEventSender::new(tx);
        let mut composer = ChatComposer::new(
            true,
            sender,
            false,
            "Ask Codex to do anything".to_string(),
            false,
        );

        composer
            .textarea
            .set_text("/Users/example/project/src/main.rs");

        let (result, _needs_redraw) =
            composer.handle_key_event(KeyEvent::new(KeyCode::Enter, KeyModifiers::NONE));

        if let InputResult::Submitted(text) = result {
            assert_eq!(text, "/Users/example/project/src/main.rs");
        } else {
            panic!("expected Submitted");
        }
        assert!(composer.textarea.is_empty());
        match rx.try_recv() {
            Ok(event) => panic!("unexpected event: {event:?}"),
            Err(tokio::sync::mpsc::error::TryRecvError::Empty) => {}
            Err(err) => panic!("unexpected channel state: {err:?}"),
        }
    }

    #[test]
    fn slash_with_leading_space_submits_as_text() {
        use crossterm::event::KeyCode;
        use crossterm::event::KeyEvent;
        use crossterm::event::KeyModifiers;

        let (tx, mut rx) = unbounded_channel::<AppEvent>();
        let sender = AppEventSender::new(tx);
        let mut composer = ChatComposer::new(
            true,
            sender,
            false,
            "Ask Codex to do anything".to_string(),
            false,
        );

        composer.textarea.set_text(" /this-looks-like-a-command");

        let (result, _needs_redraw) =
            composer.handle_key_event(KeyEvent::new(KeyCode::Enter, KeyModifiers::NONE));

        if let InputResult::Submitted(text) = result {
            assert_eq!(text, "/this-looks-like-a-command");
        } else {
            panic!("expected Submitted");
        }
        assert!(composer.textarea.is_empty());
        match rx.try_recv() {
            Ok(event) => panic!("unexpected event: {event:?}"),
            Err(tokio::sync::mpsc::error::TryRecvError::Empty) => {}
            Err(err) => panic!("unexpected channel state: {err:?}"),
        }
    }

    #[test]
    fn custom_prompt_invalid_args_reports_error() {
        let (tx, mut rx) = unbounded_channel::<AppEvent>();
@@ -2,4 +2,4 @@
source: tui/src/bottom_pane/mod.rs
expression: "render_snapshot(&pane, area1)"
---
-› Ask Codex to do an
+› Ask Codex to do a

@@ -3,4 +3,4 @@ source: tui/src/bottom_pane/mod.rs
expression: "render_snapshot(&pane, area2)"
---

-› Ask Codex to do an
+› Ask Codex to do a
@@ -1,6 +1,5 @@
---
source: tui/src/chatwidget/tests.rs
-assertion_line: 1152
expression: "lines[start_idx..].join(\"\\n\")"
---
• I need to check the codex-rs repository to explain why the project's binaries
@@ -33,7 +32,7 @@ expression: "lines[start_idx..].join(\"\\n\")"
│ … +1 lines
└ --- ansi-escape/Cargo.toml
  [package]
-  … +7 lines
+  … +243 lines
  ] }
  tracing = { version
@@ -48,10 +48,16 @@ pub(crate) fn new_active_exec_command(
    })
}

+#[derive(Clone)]
+pub(crate) struct OutputLines {
+    pub(crate) lines: Vec<Line<'static>>,
+    pub(crate) omitted: Option<usize>,
+}
+
pub(crate) fn output_lines(
    output: Option<&CommandOutput>,
    params: OutputLinesParams,
-) -> Vec<Line<'static>> {
+) -> OutputLines {
    let OutputLinesParams {
        only_err,
        include_angle_pipe,
@@ -63,9 +69,19 @@ pub(crate) fn output_lines(
        stderr,
        ..
    } = match output {
-        Some(output) if only_err && output.exit_code == 0 => return vec![],
+        Some(output) if only_err && output.exit_code == 0 => {
+            return OutputLines {
+                lines: Vec::new(),
+                omitted: None,
+            };
+        }
        Some(output) => output,
-        None => return vec![],
+        None => {
+            return OutputLines {
+                lines: Vec::new(),
+                omitted: None,
+            };
+        }
    };

    let src = if *exit_code == 0 { stdout } else { stderr };
@@ -73,7 +89,7 @@ pub(crate) fn output_lines(
    let total = lines.len();
    let limit = TOOL_CALL_MAX_LINES;

-    let mut out = Vec::new();
+    let mut out: Vec<Line<'static>> = Vec::new();

    let head_end = total.min(limit);
    for (i, raw) in lines[..head_end].iter().enumerate() {
@@ -93,6 +109,11 @@ pub(crate) fn output_lines(
    }

    let show_ellipsis = total > 2 * limit;
+    let omitted = if show_ellipsis {
+        Some(total - 2 * limit)
+    } else {
+        None
+    };
    if show_ellipsis {
        let omitted = total - 2 * limit;
        out.push(format!("… +{omitted} lines").into());
@@ -114,7 +135,10 @@ pub(crate) fn output_lines(
        out.push(line);
    }

-    out
+    OutputLines {
+        lines: out,
+        omitted,
+    }
}

pub(crate) fn spinner(start_time: Option<Instant>) -> Span<'static> {
@@ -371,7 +395,7 @@ impl ExecCell {
        }

        if let Some(output) = call.output.as_ref() {
-            let raw_output_lines = output_lines(
+            let raw_output = output_lines(
                Some(output),
                OutputLinesParams {
                    only_err: false,
@@ -380,15 +404,18 @@ impl ExecCell {
                },
            );

-            if raw_output_lines.is_empty() {
+            if raw_output.lines.is_empty() {
                lines.extend(prefix_lines(
                    vec![Line::from("(no output)".dim())],
                    Span::from(layout.output_block.initial_prefix).dim(),
                    Span::from(layout.output_block.subsequent_prefix),
                ));
            } else {
-                let trimmed_output =
-                    Self::truncate_lines_middle(&raw_output_lines, layout.output_max_lines);
+                let trimmed_output = Self::truncate_lines_middle(
+                    &raw_output.lines,
+                    layout.output_max_lines,
+                    raw_output.omitted,
+                );

                let mut wrapped_output: Vec<Line<'static>> = Vec::new();
                let output_wrap_width = layout.output_block.wrap_width(width);
@@ -427,7 +454,11 @@ impl ExecCell {
        out
    }

-    fn truncate_lines_middle(lines: &[Line<'static>], max: usize) -> Vec<Line<'static>> {
+    fn truncate_lines_middle(
+        lines: &[Line<'static>],
+        max: usize,
+        omitted_hint: Option<usize>,
+    ) -> Vec<Line<'static>> {
        if max == 0 {
            return Vec::new();
        }
@@ -435,7 +466,17 @@ impl ExecCell {
            return lines.to_vec();
        }
        if max == 1 {
-            return vec![Self::ellipsis_line(lines.len())];
+            // Carry forward any previously omitted count and add any
+            // additionally hidden content lines from this truncation.
+            let base = omitted_hint.unwrap_or(0);
+            // When an existing ellipsis is present, `lines` already includes
+            // that single representation line; exclude it from the count of
+            // additionally omitted content lines.
+            let extra = lines
+                .len()
+                .saturating_sub(usize::from(omitted_hint.is_some()));
+            let omitted = base + extra;
+            return vec![Self::ellipsis_line(omitted)];
        }

        let head = (max - 1) / 2;
@@ -446,8 +487,12 @@ impl ExecCell {
            out.extend(lines[..head].iter().cloned());
        }

-        let omitted = lines.len().saturating_sub(head + tail);
-        out.push(Self::ellipsis_line(omitted));
+        let base = omitted_hint.unwrap_or(0);
+        let additional = lines
+            .len()
+            .saturating_sub(head + tail)
+            .saturating_sub(usize::from(omitted_hint.is_some()));
+        out.push(Self::ellipsis_line(base + additional));

        if tail > 0 {
            out.extend(lines[lines.len() - tail..].iter().cloned());
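The omitted_hint plumbing keeps the "… +N lines" count honest when output is truncated twice: once by output_lines (head/tail of TOOL_CALL_MAX_LINES) and again by truncate_lines_middle. A worked example of the bookkeeping (illustrative numbers only; it assumes the conventional split head = (max - 1) / 2 and tail = max - 1 - head, which the elided lines suggest):

fn main() {
    // First pass: 100 lines, keep 10 head + 10 tail, report omitted = Some(80);
    // the result is 21 lines (20 content lines + 1 ellipsis line).
    let omitted_hint = Some(80usize);
    let lines_len = 21usize;

    // Second pass with max = 5: head = 2, tail = 2, plus 1 ellipsis line.
    let (head, tail) = (2usize, 2usize);
    let base = omitted_hint.unwrap_or(0);
    let additional = lines_len
        .saturating_sub(head + tail)
        .saturating_sub(usize::from(omitted_hint.is_some())); // don't count the old ellipsis
    assert_eq!(base + additional, 96); // 100 original lines minus the 4 still shown
}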
@@ -114,7 +114,11 @@ impl HistoryCell for UserHistoryCell {
    fn display_lines(&self, width: u16) -> Vec<Line<'static>> {
        let mut lines: Vec<Line<'static>> = Vec::new();

-        let wrap_width = width.saturating_sub(LIVE_PREFIX_COLS);
+        let wrap_width = width
+            .saturating_sub(
+                LIVE_PREFIX_COLS + 1, /* keep a one-column right margin for wrapping */
+            )
+            .max(1);

        let style = user_message_style();

@@ -125,7 +129,8 @@ impl HistoryCell for UserHistoryCell {
                .map(|l| Line::from(l).style(style))
                .collect::<Vec<_>>(),
            // Wrap algorithm matches textarea.rs.
-            RtOptions::new(wrap_width as usize).wrap_algorithm(textwrap::WrapAlgorithm::FirstFit),
+            RtOptions::new(usize::from(wrap_width))
+                .wrap_algorithm(textwrap::WrapAlgorithm::FirstFit),
        );

        lines.push(Line::from("").style(style));
@@ -1148,7 +1153,7 @@ pub(crate) fn new_patch_apply_failure(stderr: String) -> PlainHistoryCell {
    lines.push(Line::from("✘ Failed to apply patch".magenta().bold()));

    if !stderr.trim().is_empty() {
-        lines.extend(output_lines(
+        let output = output_lines(
            Some(&CommandOutput {
                exit_code: 1,
                stdout: String::new(),
@@ -1160,7 +1165,8 @@ pub(crate) fn new_patch_apply_failure(stderr: String) -> PlainHistoryCell {
                include_angle_pipe: true,
                include_prefix: true,
            },
-        ));
+        );
+        lines.extend(output.lines);
    }

    PlainHistoryCell { lines }
@@ -310,6 +310,7 @@ async fn run_ratatui_app(

    let current_version = env!("CARGO_PKG_VERSION");
    let exe = std::env::current_exe()?;
+    let managed_by_bun = std::env::var_os("CODEX_MANAGED_BY_BUN").is_some();
    let managed_by_npm = std::env::var_os("CODEX_MANAGED_BY_NPM").is_some();

    let mut content_lines: Vec<Line<'static>> = vec![
@@ -330,7 +331,14 @@ async fn run_ratatui_app(
        Line::from(""),
    ];

-    if managed_by_npm {
+    if managed_by_bun {
+        let bun_cmd = "bun install -g @openai/codex@latest";
+        content_lines.push(Line::from(vec![
+            "Run ".into(),
+            bun_cmd.cyan(),
+            " to update.".into(),
+        ]));
+    } else if managed_by_npm {
        let npm_cmd = "npm install -g @openai/codex@latest";
        content_lines.push(Line::from(vec![
            "Run ".into(),
@@ -3,6 +3,6 @@ source: tui/src/history_cell.rs
expression: rendered
---
› one two
-  three four
-  five six
-  seven
+  three
+  four five
+  six seven
@@ -602,6 +602,7 @@ Specify a program that will be executed to get notified about events generated b
```json
{
  "type": "agent-turn-complete",
+  "thread-id": "b5f6c1c2-1111-2222-3333-444455556666",
  "turn-id": "12345",
  "input-messages": ["Rename `foo` to `bar` and update the callsites."],
  "last-assistant-message": "Rename complete and verified `cargo build` succeeds."
@@ -610,6 +611,8 @@ Specify a program that will be executed to get notified about events generated b

The `"type"` property will always be set. Currently, `"agent-turn-complete"` is the only notification type that is supported.

+`"thread-id"` contains a string that identifies the Codex session that produced the notification; you can use it to correlate multiple turns that belong to the same task.
+
As an example, here is a Python script that parses the JSON and decides whether to show a desktop push notification using [terminal-notifier](https://github.com/julienXX/terminal-notifier) on macOS:

```python
@@ -644,6 +647,8 @@ def main() -> int:
        print(f"not sending a push notification for: {notification_type}")
        return 0

+    thread_id = notification.get("thread-id", "")
+
    subprocess.check_output(
        [
            "terminal-notifier",
@@ -652,7 +657,7 @@ def main() -> int:
            "-message",
            message,
            "-group",
-            "codex",
+            "codex-" + thread_id,
            "-ignoreDnD",
            "-activate",
            "com.googlecode.iterm2",