Compare commits


1 Commit

Author: Michael Bolin | SHA1: 7c94e4cab9 | Message: feat: exec-command-mcp | Date: 2025-08-20 13:59:42 -07:00
98 changed files with 2821 additions and 3669 deletions

View File

@@ -24,7 +24,3 @@ updates:
directory: /
schedule:
interval: weekly
- package-ecosystem: rust-toolchain
directory: codex-rs
schedule:
interval: weekly

View File

@@ -52,7 +52,7 @@ jobs:
~/.cargo/registry/cache/
~/.cargo/git/db/
${{ github.workspace }}/codex-rs/target/
key: cargo-ubuntu-24.04-x86_64-unknown-linux-gnu-dev-${{ hashFiles('**/Cargo.lock') }}
key: cargo-ubuntu-24.04-x86_64-unknown-linux-gnu-${{ hashFiles('**/Cargo.lock') }}
# Note it is possible that the `verify` step internal to Run Codex will
# fail, in which case the work to set up the repo was worthless :(

View File

@@ -8,7 +8,7 @@ In the codex-rs folder where the rust code lives:
- You operate in a sandbox where `CODEX_SANDBOX_NETWORK_DISABLED=1` will be set whenever you use the `shell` tool. Any existing code that uses `CODEX_SANDBOX_NETWORK_DISABLED_ENV_VAR` was authored with this fact in mind. It is often used to early exit out of tests that the author knew you would not be able to run given your sandbox limitations.
- Similarly, when you spawn a process using Seatbelt (`/usr/bin/sandbox-exec`), `CODEX_SANDBOX=seatbelt` will be set on the child process. Integration tests that want to run Seatbelt themselves cannot be run under Seatbelt, so checks for `CODEX_SANDBOX=seatbelt` are also often used to early exit out of tests, as appropriate.
Before finalizing a change to `codex-rs`, run `just fmt` (in `codex-rs` directory) to format the code and `just fix -p <project>` (in `codex-rs` directory) to fix any linter issues in the code. Additionally, run the tests:
Before finalizing a change to `codex-rs`, run `just fmt` (in `codex-rs` directory) to format the code and `just fix` (in `codex-rs` directory) to fix any linter issues in the code. Additionally, run the tests:
1. Run the test for the specific project that was changed. For example, if changes were made in `codex-rs/tui`, run `cargo test -p codex-tui`.
2. Once those pass, if any changes were made in common, core, or protocol, run the complete test suite with `cargo test --all-features`.
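For context on the sandbox guideline above, here is a minimal sketch of the early-exit pattern that `CODEX_SANDBOX_NETWORK_DISABLED` enables; the test name and body are hypothetical, not taken from the repository:

```rust
#[test]
fn fetches_release_metadata_over_network() {
    // Set by the harness whenever the `shell` tool runs inside the sandbox;
    // skip rather than fail when outbound network access is unavailable.
    if std::env::var("CODEX_SANDBOX_NETWORK_DISABLED").is_ok() {
        return;
    }
    // ... test body that requires network access ...
}
```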

View File

@@ -383,13 +383,6 @@ base_url = "http://my-ollama.example.com:11434/v1"
### Platform sandboxing details
By default, Codex CLI runs code and shell commands inside a restricted sandbox to protect your system.
> [!IMPORTANT]
> Not all tool calls are sandboxed. Specifically, **trusted Model Context Protocol (MCP) tool calls** are executed outside of the sandbox.
> This is intentional: MCP tools are explicitly configured and trusted by you, and they often need to connect to **external applications or services** (e.g. issue trackers, databases, messaging systems).
> Running them outside the sandbox allows Codex to integrate with these external systems without being blocked by sandbox restrictions.
The mechanism Codex uses to implement the sandbox policy depends on your OS:
- **macOS 12+** uses **Apple Seatbelt** and runs commands using `sandbox-exec` with a profile (`-p`) that corresponds to the `--sandbox` that was specified.
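As a rough illustration of the Seatbelt mechanism described above, the sketch below shows how a command could be wrapped with `sandbox-exec`; the helper name and profile argument are placeholders, not the actual codex-rs implementation:

```rust
use std::process::{Child, Command};

// Hypothetical helper: run `argv` confined by Apple Seatbelt with the given profile.
fn spawn_under_seatbelt(profile: &str, argv: &[String]) -> std::io::Result<Child> {
    Command::new("/usr/bin/sandbox-exec")
        .arg("-p")
        .arg(profile) // profile text corresponding to the selected `--sandbox` policy
        .args(argv)
        .env("CODEX_SANDBOX", "seatbelt") // marker set on child processes, per AGENTS.md above
        .spawn()
}
```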

codex-rs/Cargo.lock (generated, 106 changed lines)
View File

@@ -737,7 +737,6 @@ dependencies = [
"tree-sitter-bash",
"uuid",
"walkdir",
"which",
"whoami",
"wildmatch",
"wiremock",
@@ -823,7 +822,6 @@ version = "0.0.0"
dependencies = [
"base64 0.22.1",
"chrono",
"codex-protocol",
"pretty_assertions",
"rand 0.8.5",
"reqwest",
@@ -928,7 +926,6 @@ name = "codex-tui"
version = "0.0.0"
dependencies = [
"anyhow",
"async-stream",
"base64 0.22.1",
"chrono",
"clap",
@@ -964,7 +961,6 @@ dependencies = [
"supports-color",
"textwrap 0.16.2",
"tokio",
"tokio-stream",
"tracing",
"tracing-appender",
"tracing-subscriber",
@@ -1165,7 +1161,6 @@ checksum = "829d955a0bb380ef178a640b91779e3987da38c9aea133b20614cfed8cdea9c6"
dependencies = [
"bitflags 2.9.1",
"crossterm_winapi",
"futures-core",
"mio",
"parking_lot",
"rustix 0.38.44",
@@ -1443,6 +1438,12 @@ version = "0.15.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1aaf95b3e5c8f23aa320147307562d361db0ae0d51242340f558153b4eb2439b"
[[package]]
name = "downcast-rs"
version = "1.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75b325c5dbd37f80359721ad39aca5a29fb04c89279657cffdda8736d0c0b9d2"
[[package]]
name = "dupe"
version = "0.9.1"
@@ -1637,6 +1638,21 @@ dependencies = [
"pin-project-lite",
]
[[package]]
name = "exec-command-mcp"
version = "0.0.0"
dependencies = [
"anyhow",
"mcp-types",
"portable-pty",
"schemars 0.8.22",
"serde",
"serde_json",
"tokio",
"tracing",
"tracing-subscriber",
]
[[package]]
name = "exr"
version = "1.73.0"
@@ -1688,6 +1704,17 @@ dependencies = [
"simd-adler32",
]
[[package]]
name = "filedescriptor"
version = "0.8.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e40758ed24c9b2eeb76c35fb0aebc66c626084edd827e07e1552279814c6682d"
dependencies = [
"libc",
"thiserror 1.0.69",
"winapi",
]
[[package]]
name = "fixedbitset"
version = "0.4.2"
@@ -3345,6 +3372,27 @@ dependencies = [
"portable-atomic",
]
[[package]]
name = "portable-pty"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b4a596a2b3d2752d94f51fac2d4a96737b8705dddd311a32b9af47211f08671e"
dependencies = [
"anyhow",
"bitflags 1.3.2",
"downcast-rs",
"filedescriptor",
"lazy_static",
"libc",
"log",
"nix",
"serial2",
"shared_library",
"shell-words",
"winapi",
"winreg",
]
[[package]]
name = "potential_utf"
version = "0.1.2"
@@ -4272,6 +4320,17 @@ dependencies = [
"syn 2.0.104",
]
[[package]]
name = "serial2"
version = "0.2.31"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "26e1e5956803a69ddd72ce2de337b577898801528749565def03515f82bad5bb"
dependencies = [
"cfg-if",
"libc",
"winapi",
]
[[package]]
name = "sha1"
version = "0.10.6"
@@ -4303,6 +4362,22 @@ dependencies = [
"lazy_static",
]
[[package]]
name = "shared_library"
version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a9e7e0f2bfae24d8a5b5a66c5b257a83c7412304311512a0c054cd5e619da11"
dependencies = [
"lazy_static",
"libc",
]
[[package]]
name = "shell-words"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24188a676b6ae68c3b2cb3a01be17fbf7240ce009799bb56d5b1409051e78fde"
[[package]]
name = "shlex"
version = "1.3.0"
@@ -5601,18 +5676,6 @@ version = "0.1.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a751b3277700db47d3e574514de2eced5e54dc8a5436a3bf7a0b248b2cee16f3"
[[package]]
name = "which"
version = "6.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b4ee928febd44d98f2f459a4a79bd4d928591333a494a10a868418ac1b39cf1f"
dependencies = [
"either",
"home",
"rustix 0.38.44",
"winsafe",
]
[[package]]
name = "whoami"
version = "1.6.0"
@@ -6026,10 +6089,13 @@ dependencies = [
]
[[package]]
name = "winsafe"
version = "0.0.19"
name = "winreg"
version = "0.10.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d135d17ab770252ad95e9a872d365cf3090e3be864a34ab46f48555993efc904"
checksum = "80d0f4e272c85def139476380b12f9ac60926689dd2e01d4923222f40580869d"
dependencies = [
"winapi",
]
[[package]]
name = "wiremock"

View File

@@ -7,6 +7,7 @@ members = [
"common",
"core",
"exec",
"exec-command-mcp",
"execpolicy",
"file-search",
"linux-sandbox",

View File

@@ -22,8 +22,6 @@ use tree_sitter_bash::LANGUAGE as BASH;
/// Detailed instructions for gpt-4.1 on how to use the `apply_patch` tool.
pub const APPLY_PATCH_TOOL_INSTRUCTIONS: &str = include_str!("../apply_patch_tool_instructions.md");
const APPLY_PATCH_COMMANDS: [&str; 2] = ["apply_patch", "applypatch"];
#[derive(Debug, Error, PartialEq)]
pub enum ApplyPatchError {
#[error(transparent)]
@@ -84,6 +82,7 @@ pub struct ApplyPatchArgs {
}
pub fn maybe_parse_apply_patch(argv: &[String]) -> MaybeApplyPatch {
const APPLY_PATCH_COMMANDS: [&str; 2] = ["apply_patch", "applypatch"];
match argv {
[cmd, body] if APPLY_PATCH_COMMANDS.contains(&cmd.as_str()) => match parse_patch(body) {
Ok(source) => MaybeApplyPatch::Body(source),
@@ -92,9 +91,7 @@ pub fn maybe_parse_apply_patch(argv: &[String]) -> MaybeApplyPatch {
[bash, flag, script]
if bash == "bash"
&& flag == "-lc"
&& APPLY_PATCH_COMMANDS
.iter()
.any(|cmd| script.trim_start().starts_with(cmd)) =>
&& script.trim_start().starts_with("apply_patch") =>
{
match extract_heredoc_body_from_apply_patch_command(script) {
Ok(body) => match parse_patch(&body) {
@@ -265,10 +262,7 @@ pub fn maybe_parse_apply_patch_verified(argv: &[String], cwd: &Path) -> MaybeApp
fn extract_heredoc_body_from_apply_patch_command(
src: &str,
) -> std::result::Result<String, ExtractHeredocError> {
if !APPLY_PATCH_COMMANDS
.iter()
.any(|cmd| src.trim_start().starts_with(cmd))
{
if !src.trim_start().starts_with("apply_patch") {
return Err(ExtractHeredocError::CommandDidNotStartWithApplyPatch);
}
@@ -779,33 +773,6 @@ PATCH"#,
}
}
#[test]
fn test_heredoc_applypatch() {
let args = strs_to_strings(&[
"bash",
"-lc",
r#"applypatch <<'PATCH'
*** Begin Patch
*** Add File: foo
+hi
*** End Patch
PATCH"#,
]);
match maybe_parse_apply_patch(&args) {
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, patch: _ }) => {
assert_eq!(
hunks,
vec![Hunk::AddFile {
path: PathBuf::from("foo"),
contents: "hi\n".to_string()
}]
);
}
result => panic!("expected MaybeApplyPatch::Body got {result:?}"),
}
}
#[test]
fn test_add_file_hunk_creates_file_with_contents() {
let dir = tempdir().unwrap();

View File

@@ -159,7 +159,7 @@ async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()
codex_exec::run_main(exec_cli, codex_linux_sandbox_exe).await?;
}
Some(Subcommand::Mcp) => {
codex_mcp_server::run_main(codex_linux_sandbox_exe, cli.config_overrides).await?;
codex_mcp_server::run_main(codex_linux_sandbox_exe).await?;
}
Some(Subcommand::Login(mut login_cli)) => {
prepend_config_flags(&mut login_cli.config_overrides, cli.config_overrides);

View File

@@ -1,46 +0,0 @@
use codex_core::protocol::AskForApproval;
use codex_core::protocol::SandboxPolicy;
/// A simple preset pairing an approval policy with a sandbox policy.
#[derive(Debug, Clone)]
pub struct ApprovalPreset {
/// Stable identifier for the preset.
pub id: &'static str,
/// Display label shown in UIs.
pub label: &'static str,
/// Short human description shown next to the label in UIs.
pub description: &'static str,
/// Approval policy to apply.
pub approval: AskForApproval,
/// Sandbox policy to apply.
pub sandbox: SandboxPolicy,
}
/// Built-in list of approval presets that pair approval and sandbox policy.
///
/// Keep this UI-agnostic so it can be reused by both TUI and MCP server.
pub fn builtin_approval_presets() -> Vec<ApprovalPreset> {
vec![
ApprovalPreset {
id: "read-only",
label: "Read Only",
description: "Codex can read files and answer questions. Codex requires approval to make edits, run commands, or access network",
approval: AskForApproval::OnRequest,
sandbox: SandboxPolicy::ReadOnly,
},
ApprovalPreset {
id: "auto",
label: "Auto",
description: "Codex can read files, make edits, and run commands in the workspace. Codex requires approval to work outside the workspace or access network",
approval: AskForApproval::OnRequest,
sandbox: SandboxPolicy::new_workspace_write_policy(),
},
ApprovalPreset {
id: "full-access",
label: "Full Access",
description: "Codex can read files, make edits, and run commands with network access, without approval. Exercise caution",
approval: AskForApproval::Never,
sandbox: SandboxPolicy::DangerFullAccess,
},
]
}

View File

@@ -31,6 +31,3 @@ pub use config_summary::create_config_summary_entries;
pub mod fuzzy_match;
// Shared model presets used by TUI and MCP server
pub mod model_presets;
// Shared approval presets (AskForApproval + Sandbox) used by TUI and MCP server
// Not to be confused with AskForApproval, which we should probably rename to EscalationPolicy.
pub mod approval_presets;

View File

@@ -24,28 +24,28 @@ pub fn builtin_model_presets() -> &'static [ModelPreset] {
ModelPreset {
id: "gpt-5-minimal",
label: "gpt-5 minimal",
description: "fastest responses with limited reasoning; ideal for coding, instructions, or lightweight tasks",
description: "Fastest responses with very limited reasoning; ideal for coding, instructions, or lightweight tasks.",
model: "gpt-5",
effort: ReasoningEffort::Minimal,
},
ModelPreset {
id: "gpt-5-low",
label: "gpt-5 low",
description: "balances speed with some reasoning; useful for straightforward queries and short explanations",
description: "Balances speed with some reasoning; useful for straightforward queries and short explanations.",
model: "gpt-5",
effort: ReasoningEffort::Low,
},
ModelPreset {
id: "gpt-5-medium",
label: "gpt-5 medium",
description: "default setting; provides a solid balance of reasoning depth and latency for general-purpose tasks",
description: "Default setting; provides a solid balance of reasoning depth and latency for general-purpose tasks.",
model: "gpt-5",
effort: ReasoningEffort::Medium,
},
ModelPreset {
id: "gpt-5-high",
label: "gpt-5 high",
description: "maximizes reasoning depth for complex or ambiguous problems",
description: "Maximizes reasoning depth for complex or ambiguous problems.",
model: "gpt-5",
effort: ReasoningEffort::High,
},

View File

@@ -300,16 +300,6 @@ This is reasonable to use if Codex is running in an environment that provides it
Though using this option may also be necessary if you try to use Codex in environments where its native sandboxing mechanisms are unsupported, such as older Linux kernels or on Windows.
## Approval presets
Codex provides three main Approval Presets:
- Read Only: Codex can read files and answer questions; edits, running commands, and network access require approval.
- Auto: Codex can read files, make edits, and run commands in the workspace without approval; asks for approval outside the workspace or for network access.
- Full Access: Full disk and network access without prompts; extremely risky.
You can further customize how Codex runs at the command line using the `--ask-for-approval` and `--sandbox` options.
## mcp_servers
Defines the list of MCP servers that Codex can consult for tool use. Currently, only servers that are launched by executing a program and that communicate over stdio are supported. For servers that use the SSE transport, consider an adapter like [mcp-proxy](https://github.com/sparfenyuk/mcp-proxy).

View File

@@ -71,9 +71,6 @@ openssl-sys = { version = "*", features = ["vendored"] }
[target.aarch64-unknown-linux-musl.dependencies]
openssl-sys = { version = "*", features = ["vendored"] }
[target.'cfg(target_os = "windows")'.dependencies]
which = "6"
[dev-dependencies]
assert_cmd = "2"
core_test_support = { path = "tests/common" }

View File

@@ -265,10 +265,7 @@ For casual greetings, acknowledgements, or other one-off conversational messages
## Shell commands
When using the shell, you must adhere to the following guidelines:
- When searching for text or files, prefer using `rg` or `rg --files` respectively because `rg` is much faster than alternatives like `grep`. (If the `rg` command is not found, then use alternatives.)
- Read files in chunks with a max chunk size of 250 lines. Do not use python scripts to attempt to output larger chunks of a file. Command line output will be truncated after 10 kilobytes or 256 lines of output, regardless of the command used.
Do NOT use `shell`. Use only `functions_exec_command` and `functions_write_stdin`.
## `apply_patch`

View File

@@ -208,7 +208,11 @@ impl ModelClient {
req_builder = req_builder.header("chatgpt-account-id", account_id);
}
let originator = &self.config.responses_originator_header;
let originator = self
.config
.internal_originator
.as_deref()
.unwrap_or("codex_cli_rs");
req_builder = req_builder.header("originator", originator);
req_builder = req_builder.header("User-Agent", get_codex_user_agent(Some(originator)));
@@ -248,12 +252,6 @@ impl ModelClient {
.and_then(|v| v.to_str().ok())
.and_then(|s| s.parse::<u64>().ok());
if status == StatusCode::UNAUTHORIZED
&& let Some(a) = auth.as_ref()
{
let _ = a.refresh_token().await;
}
// The OpenAI Responses endpoint returns structured JSON bodies even for 4xx/5xx
// errors. When we bubble early with only the HTTP status the caller sees an opaque
// "unexpected status 400 Bad Request" which makes debugging nearly impossible.
@@ -261,10 +259,7 @@ impl ModelClient {
// exact error message (e.g. "Unknown parameter: 'input[0].metadata'"). The body is
// small and this branch only runs on error paths so the extra allocation is
// negligible.
if !(status == StatusCode::TOO_MANY_REQUESTS
|| status == StatusCode::UNAUTHORIZED
|| status.is_server_error())
{
if !(status == StatusCode::TOO_MANY_REQUESTS || status.is_server_error()) {
// Surface the error body to callers. Use `unwrap_or_default` per Clippy.
let body = res.text().await.unwrap_or_default();
return Err(CodexErr::UnexpectedStatus(status, body));

View File

@@ -94,7 +94,6 @@ use crate::protocol::PatchApplyEndEvent;
use crate::protocol::ReviewDecision;
use crate::protocol::SandboxPolicy;
use crate::protocol::SessionConfiguredEvent;
use crate::protocol::StreamErrorEvent;
use crate::protocol::Submission;
use crate::protocol::TaskCompleteEvent;
use crate::protocol::TurnDiffEvent;
@@ -481,6 +480,7 @@ impl Session {
sandbox_policy.clone(),
config.include_plan_tool,
config.include_apply_patch_tool,
config.experimental_disable_built_in_shell_tool,
),
user_instructions,
base_instructions,
@@ -509,10 +509,9 @@ impl Session {
conversation_items.push(Prompt::format_user_instructions_message(user_instructions));
}
conversation_items.push(ResponseItem::from(EnvironmentContext::new(
Some(turn_context.cwd.clone()),
Some(turn_context.approval_policy),
Some(turn_context.sandbox_policy.clone()),
Some(sess.user_shell.clone()),
turn_context.cwd.to_path_buf(),
turn_context.approval_policy,
turn_context.sandbox_policy.clone(),
)));
sess.record_conversation_items(&conversation_items).await;
@@ -816,16 +815,6 @@ impl Session {
let _ = self.tx_event.send(event).await;
}
async fn notify_stream_error(&self, sub_id: &str, message: impl Into<String>) {
let event = Event {
id: sub_id.to_string(),
msg: EventMsg::StreamError(StreamErrorEvent {
message: message.into(),
}),
};
let _ = self.tx_event.send(event).await;
}
/// Build the full turn input by concatenating the current conversation
/// history with additional items for this turn.
pub fn turn_input_with_history(&self, extra: Vec<ResponseItem>) -> Vec<ResponseItem> {
@@ -1061,6 +1050,7 @@ async fn submission_loop(
new_sandbox_policy.clone(),
config.include_plan_tool,
config.include_apply_patch_tool,
config.experimental_disable_built_in_shell_tool,
);
let new_turn_context = TurnContext {
@@ -1079,11 +1069,9 @@ async fn submission_loop(
turn_context = Arc::new(new_turn_context);
if cwd.is_some() || approval_policy.is_some() || sandbox_policy.is_some() {
sess.record_conversation_items(&[ResponseItem::from(EnvironmentContext::new(
cwd,
approval_policy,
sandbox_policy,
// Shell is not configurable from turn to turn
None,
new_cwd,
new_approval_policy,
new_sandbox_policy,
))])
.await;
}
@@ -1139,6 +1127,7 @@ async fn submission_loop(
sandbox_policy.clone(),
config.include_plan_tool,
config.include_apply_patch_tool,
config.experimental_disable_built_in_shell_tool,
),
user_instructions: turn_context.user_instructions.clone(),
base_instructions: turn_context.base_instructions.clone(),
@@ -1534,7 +1523,7 @@ async fn run_turn(
// Surface retry information to any UI/frontend so the
// user understands what is happening instead of staring
// at a seemingly frozen screen.
sess.notify_stream_error(
sess.notify_background_event(
&sub_id,
format!(
"stream error: {e}; retrying {retries}/{max_retries} in {delay:?}"
@@ -1769,7 +1758,7 @@ async fn run_compact_task(
if retries < max_retries {
retries += 1;
let delay = backoff(retries);
sess.notify_stream_error(
sess.notify_background_event(
&sub_id,
format!(
"stream error: {e}; retrying {retries}/{max_retries} in {delay:?}"
@@ -2065,20 +2054,18 @@ pub struct ExecInvokeArgs<'a> {
pub stdout_stream: Option<StdoutStream>,
}
fn maybe_translate_shell_command(
fn maybe_run_with_user_profile(
params: ExecParams,
sess: &Session,
turn_context: &TurnContext,
) -> ExecParams {
let should_translate = matches!(sess.user_shell, crate::shell::Shell::PowerShell(_))
|| turn_context.shell_environment_policy.use_profile;
if should_translate
&& let Some(command) = sess
if turn_context.shell_environment_policy.use_profile {
let command = sess
.user_shell
.format_default_shell_invocation(params.command.clone())
{
return ExecParams { command, ..params };
.format_default_shell_invocation(params.command.clone());
if let Some(command) = command {
return ExecParams { command, ..params };
}
}
params
}
@@ -2243,7 +2230,7 @@ async fn handle_container_exec_with_params(
),
};
let params = maybe_translate_shell_command(params, sess, turn_context);
let params = maybe_run_with_user_profile(params, sess, turn_context);
let output_result = sess
.run_exec_with_events(
turn_diff_tracker,

View File

@@ -35,8 +35,6 @@ pub(crate) const PROJECT_DOC_MAX_BYTES: usize = 32 * 1024; // 32 KiB
const CONFIG_TOML_FILE: &str = "config.toml";
const DEFAULT_RESPONSES_ORIGINATOR_HEADER: &str = "codex_cli_rs";
/// Application configuration loaded from disk and merged with overrides.
#[derive(Debug, Clone, PartialEq)]
pub struct Config {
@@ -164,8 +162,10 @@ pub struct Config {
/// model family's default preference.
pub include_apply_patch_tool: bool,
pub experimental_disable_built_in_shell_tool: bool,
/// The value for the `originator` header included with Responses API requests.
pub responses_originator_header: String,
pub internal_originator: Option<String>,
/// If set to `true`, the API key will be signed with the `originator` header.
pub preferred_auth_method: AuthMode,
@@ -261,8 +261,7 @@ pub fn set_project_trusted(codex_home: &Path, project_path: &Path) -> anyhow::Re
// Mark the project as trusted. toml_edit is very good at handling
// missing properties
let normalized = crate::util::normalized_trust_project_root(project_path);
let project_key = normalized.to_string_lossy().to_string();
let project_key = project_path.to_string_lossy().to_string();
doc["projects"][project_key.as_str()]["trust_level"] = toml_edit::value("trusted");
// ensure codex_home exists
@@ -412,8 +411,10 @@ pub struct ConfigToml {
/// Experimental path to a file whose contents replace the built-in BASE_INSTRUCTIONS.
pub experimental_instructions_file: Option<PathBuf>,
pub experimental_disable_built_in_shell_tool: Option<bool>,
/// The value for the `originator` header included with Responses API requests.
pub responses_originator_header_internal_override: Option<String>,
pub internal_originator: Option<String>,
pub projects: Option<HashMap<String, ProjectConfig>>,
@@ -455,13 +456,8 @@ impl ConfigToml {
pub fn is_cwd_trusted(&self, resolved_cwd: &Path) -> bool {
let projects = self.projects.clone().unwrap_or_default();
// Prefer a normalized project key that points to the main git repo root when applicable.
let normalized_root = crate::util::normalized_trust_project_root(resolved_cwd);
let normalized_key = normalized_root.to_string_lossy().to_string();
projects
.get(&normalized_key)
.or_else(|| projects.get(&resolved_cwd.to_string_lossy().to_string()))
.get(&resolved_cwd.to_string_lossy().to_string())
.map(|p| p.trust_level.clone().unwrap_or("".to_string()) == "trusted")
.unwrap_or(false)
}
@@ -633,10 +629,6 @@ impl Config {
let include_apply_patch_tool_val =
include_apply_patch_tool.unwrap_or(model_family.uses_apply_patch_tool);
let responses_originator_header: String = cfg
.responses_originator_header_internal_override
.unwrap_or(DEFAULT_RESPONSES_ORIGINATOR_HEADER.to_owned());
let config = Self {
model,
model_family,
@@ -690,7 +682,10 @@ impl Config {
experimental_resume,
include_plan_tool: include_plan_tool.unwrap_or(false),
include_apply_patch_tool: include_apply_patch_tool_val,
responses_originator_header,
experimental_disable_built_in_shell_tool: cfg
.experimental_disable_built_in_shell_tool
.unwrap_or(false),
internal_originator: cfg.internal_originator,
preferred_auth_method: cfg.preferred_auth_method.unwrap_or(AuthMode::ChatGPT),
};
Ok(config)
@@ -1055,7 +1050,8 @@ disable_response_storage = true
base_instructions: None,
include_plan_tool: false,
include_apply_patch_tool: false,
responses_originator_header: "codex_cli_rs".to_string(),
experimental_disable_built_in_shell_tool: false,
internal_originator: None,
preferred_auth_method: AuthMode::ChatGPT,
},
o3_profile_config
@@ -1108,7 +1104,8 @@ disable_response_storage = true
base_instructions: None,
include_plan_tool: false,
include_apply_patch_tool: false,
responses_originator_header: "codex_cli_rs".to_string(),
experimental_disable_built_in_shell_tool: false,
internal_originator: None,
preferred_auth_method: AuthMode::ChatGPT,
};
@@ -1176,7 +1173,8 @@ disable_response_storage = true
base_instructions: None,
include_plan_tool: false,
include_apply_patch_tool: false,
responses_originator_header: "codex_cli_rs".to_string(),
experimental_disable_built_in_shell_tool: false,
internal_originator: None,
preferred_auth_method: AuthMode::ChatGPT,
};

View File

@@ -6,12 +6,12 @@ use crate::models::ContentItem;
use crate::models::ResponseItem;
use crate::protocol::AskForApproval;
use crate::protocol::SandboxPolicy;
use crate::shell::Shell;
use codex_protocol::config_types::SandboxMode;
use std::fmt::Display;
use std::path::PathBuf;
/// wraps environment context message in a tag for the model to parse more easily.
pub(crate) const ENVIRONMENT_CONTEXT_START: &str = "<environment_context>";
pub(crate) const ENVIRONMENT_CONTEXT_START: &str = "<environment_context>\n";
pub(crate) const ENVIRONMENT_CONTEXT_END: &str = "</environment_context>";
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, DeriveDisplay)]
@@ -24,87 +24,52 @@ pub enum NetworkAccess {
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
#[serde(rename = "environment_context", rename_all = "snake_case")]
pub(crate) struct EnvironmentContext {
pub cwd: Option<PathBuf>,
pub approval_policy: Option<AskForApproval>,
pub sandbox_mode: Option<SandboxMode>,
pub network_access: Option<NetworkAccess>,
pub shell: Option<Shell>,
pub cwd: PathBuf,
pub approval_policy: AskForApproval,
pub sandbox_mode: SandboxMode,
pub network_access: NetworkAccess,
}
impl EnvironmentContext {
pub fn new(
cwd: Option<PathBuf>,
approval_policy: Option<AskForApproval>,
sandbox_policy: Option<SandboxPolicy>,
shell: Option<Shell>,
cwd: PathBuf,
approval_policy: AskForApproval,
sandbox_policy: SandboxPolicy,
) -> Self {
Self {
cwd,
approval_policy,
sandbox_mode: match sandbox_policy {
Some(SandboxPolicy::DangerFullAccess) => Some(SandboxMode::DangerFullAccess),
Some(SandboxPolicy::ReadOnly) => Some(SandboxMode::ReadOnly),
Some(SandboxPolicy::WorkspaceWrite { .. }) => Some(SandboxMode::WorkspaceWrite),
None => None,
SandboxPolicy::DangerFullAccess => SandboxMode::DangerFullAccess,
SandboxPolicy::ReadOnly => SandboxMode::ReadOnly,
SandboxPolicy::WorkspaceWrite { .. } => SandboxMode::WorkspaceWrite,
},
network_access: match sandbox_policy {
Some(SandboxPolicy::DangerFullAccess) => Some(NetworkAccess::Enabled),
Some(SandboxPolicy::ReadOnly) => Some(NetworkAccess::Restricted),
Some(SandboxPolicy::WorkspaceWrite { network_access, .. }) => {
SandboxPolicy::DangerFullAccess => NetworkAccess::Enabled,
SandboxPolicy::ReadOnly => NetworkAccess::Restricted,
SandboxPolicy::WorkspaceWrite { network_access, .. } => {
if network_access {
Some(NetworkAccess::Enabled)
NetworkAccess::Enabled
} else {
Some(NetworkAccess::Restricted)
NetworkAccess::Restricted
}
}
None => None,
},
shell,
}
}
}
impl EnvironmentContext {
/// Serializes the environment context to XML. Libraries like `quick-xml`
/// require custom macros to handle Enums with newtypes, so we just do it
/// manually, to keep things simple. Output looks like:
///
/// ```xml
/// <environment_context>
/// <cwd>...</cwd>
/// <approval_policy>...</approval_policy>
/// <sandbox_mode>...</sandbox_mode>
/// <network_access>...</network_access>
/// <shell>...</shell>
/// </environment_context>
/// ```
pub fn serialize_to_xml(self) -> String {
let mut lines = vec![ENVIRONMENT_CONTEXT_START.to_string()];
if let Some(cwd) = self.cwd {
lines.push(format!(" <cwd>{}</cwd>", cwd.to_string_lossy()));
}
if let Some(approval_policy) = self.approval_policy {
lines.push(format!(
" <approval_policy>{}</approval_policy>",
approval_policy
));
}
if let Some(sandbox_mode) = self.sandbox_mode {
lines.push(format!(" <sandbox_mode>{}</sandbox_mode>", sandbox_mode));
}
if let Some(network_access) = self.network_access {
lines.push(format!(
" <network_access>{}</network_access>",
network_access
));
}
if let Some(shell) = self.shell
&& let Some(shell_name) = shell.name()
{
lines.push(format!(" <shell>{}</shell>", shell_name));
}
lines.push(ENVIRONMENT_CONTEXT_END.to_string());
lines.join("\n")
impl Display for EnvironmentContext {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
writeln!(
f,
"Current working directory: {}",
self.cwd.to_string_lossy()
)?;
writeln!(f, "Approval policy: {}", self.approval_policy)?;
writeln!(f, "Sandbox mode: {}", self.sandbox_mode)?;
writeln!(f, "Network access: {}", self.network_access)?;
Ok(())
}
}
@@ -114,7 +79,7 @@ impl From<EnvironmentContext> for ResponseItem {
id: None,
role: "user".to_string(),
content: vec![ContentItem::InputText {
text: ec.serialize_to_xml(),
text: format!("{ENVIRONMENT_CONTEXT_START}{ec}{ENVIRONMENT_CONTEXT_END}"),
}],
}
}
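For reference, after this change the environment-context message sent to the model looks roughly like the following; the field labels come from the `Display` impl above, while the specific path and policy/mode values are illustrative placeholders:

```
<environment_context>
Current working directory: /home/user/project
Approval policy: on-request
Sandbox mode: workspace-write
Network access: restricted
</environment_context>
```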

View File

@@ -1,16 +1,11 @@
use std::collections::HashSet;
use std::path::Path;
use codex_protocol::mcp_protocol::GitSha;
use futures::future::join_all;
use serde::Deserialize;
use serde::Serialize;
use tokio::process::Command;
use tokio::time::Duration as TokioDuration;
use tokio::time::timeout;
use crate::util::is_inside_git_repo;
/// Timeout for git commands to prevent freezing on large repositories
const GIT_COMMAND_TIMEOUT: TokioDuration = TokioDuration::from_secs(5);
@@ -27,12 +22,6 @@ pub struct GitInfo {
pub repository_url: Option<String>,
}
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct GitDiffToRemote {
pub sha: GitSha,
pub diff: String,
}
/// Collect git repository information from the given working directory using command-line git.
/// Returns None if no git repository is found or if git operations fail.
/// Uses timeouts to prevent freezing on large repositories.
@@ -91,23 +80,6 @@ pub async fn collect_git_info(cwd: &Path) -> Option<GitInfo> {
Some(git_info)
}
/// Returns the closest git sha to HEAD that is on a remote as well as the diff to that sha.
pub async fn git_diff_to_remote(cwd: &Path) -> Option<GitDiffToRemote> {
if !is_inside_git_repo(cwd) {
return None;
}
let remotes = get_git_remotes(cwd).await?;
let branches = branch_ancestry(cwd).await?;
let base_sha = find_closest_sha(cwd, &branches, &remotes).await?;
let diff = diff_against_sha(cwd, &base_sha).await?;
Some(GitDiffToRemote {
sha: base_sha,
diff,
})
}
/// Run a git command with a timeout to prevent blocking on large repositories
async fn run_git_command_with_timeout(args: &[&str], cwd: &Path) -> Option<std::process::Output> {
let result = timeout(
@@ -122,309 +94,6 @@ async fn run_git_command_with_timeout(args: &[&str], cwd: &Path) -> Option<std::
}
}
async fn get_git_remotes(cwd: &Path) -> Option<Vec<String>> {
let output = run_git_command_with_timeout(&["remote"], cwd).await?;
if !output.status.success() {
return None;
}
let mut remotes: Vec<String> = String::from_utf8(output.stdout)
.ok()?
.lines()
.map(|s| s.to_string())
.collect();
if let Some(pos) = remotes.iter().position(|r| r == "origin") {
let origin = remotes.remove(pos);
remotes.insert(0, origin);
}
Some(remotes)
}
/// Attempt to determine the repository's default branch name.
///
/// Preference order:
/// 1) The symbolic ref at `refs/remotes/<remote>/HEAD` for the first remote (origin prioritized)
/// 2) `git remote show <remote>` parsed for "HEAD branch: <name>"
/// 3) Local fallback to existing `main` or `master` if present
async fn get_default_branch(cwd: &Path) -> Option<String> {
// Prefer the first remote (with origin prioritized)
let remotes = get_git_remotes(cwd).await.unwrap_or_default();
for remote in remotes {
// Try symbolic-ref, which returns something like: refs/remotes/origin/main
if let Some(symref_output) = run_git_command_with_timeout(
&[
"symbolic-ref",
"--quiet",
&format!("refs/remotes/{remote}/HEAD"),
],
cwd,
)
.await
&& symref_output.status.success()
&& let Ok(sym) = String::from_utf8(symref_output.stdout)
{
let trimmed = sym.trim();
if let Some((_, name)) = trimmed.rsplit_once('/') {
return Some(name.to_string());
}
}
// Fall back to parsing `git remote show <remote>` output
if let Some(show_output) =
run_git_command_with_timeout(&["remote", "show", &remote], cwd).await
&& show_output.status.success()
&& let Ok(text) = String::from_utf8(show_output.stdout)
{
for line in text.lines() {
let line = line.trim();
if let Some(rest) = line.strip_prefix("HEAD branch:") {
let name = rest.trim();
if !name.is_empty() {
return Some(name.to_string());
}
}
}
}
}
// No remote-derived default; try common local defaults if they exist
for candidate in ["main", "master"] {
if let Some(verify) = run_git_command_with_timeout(
&[
"rev-parse",
"--verify",
"--quiet",
&format!("refs/heads/{candidate}"),
],
cwd,
)
.await
&& verify.status.success()
{
return Some(candidate.to_string());
}
}
None
}
/// Build an ancestry of branches starting at the current branch and ending at the
/// repository's default branch (if determinable).
async fn branch_ancestry(cwd: &Path) -> Option<Vec<String>> {
// Discover current branch (ignore detached HEAD by treating it as None)
let current_branch = run_git_command_with_timeout(&["rev-parse", "--abbrev-ref", "HEAD"], cwd)
.await
.and_then(|o| {
if o.status.success() {
String::from_utf8(o.stdout).ok()
} else {
None
}
})
.map(|s| s.trim().to_string())
.filter(|s| s != "HEAD");
// Discover default branch
let default_branch = get_default_branch(cwd).await;
let mut ancestry: Vec<String> = Vec::new();
let mut seen: HashSet<String> = HashSet::new();
if let Some(cb) = current_branch.clone() {
seen.insert(cb.clone());
ancestry.push(cb);
}
if let Some(db) = default_branch
&& !seen.contains(&db)
{
seen.insert(db.clone());
ancestry.push(db);
}
// Expand candidates: include any remote branches that already contain HEAD.
// This addresses cases where we're on a new local-only branch forked from a
// remote branch that isn't the repository default. We prioritize remotes in
// the order returned by get_git_remotes (origin first).
let remotes = get_git_remotes(cwd).await.unwrap_or_default();
for remote in remotes {
if let Some(output) = run_git_command_with_timeout(
&[
"for-each-ref",
"--format=%(refname:short)",
"--contains=HEAD",
&format!("refs/remotes/{remote}"),
],
cwd,
)
.await
&& output.status.success()
&& let Ok(text) = String::from_utf8(output.stdout)
{
for line in text.lines() {
let short = line.trim();
// Expect format like: "origin/feature"; extract the branch path after "remote/"
if let Some(stripped) = short.strip_prefix(&format!("{remote}/"))
&& !stripped.is_empty()
&& !seen.contains(stripped)
{
seen.insert(stripped.to_string());
ancestry.push(stripped.to_string());
}
}
}
}
// Ensure we return Some vector, even if empty, to allow caller logic to proceed
Some(ancestry)
}
// Helper for a single branch: return the remote SHA if present on any remote
// and the distance (commits ahead of HEAD) for that branch. The first item is
// None if the branch is not present on any remote. Returns None if distance
// could not be computed due to git errors/timeouts.
async fn branch_remote_and_distance(
cwd: &Path,
branch: &str,
remotes: &[String],
) -> Option<(Option<GitSha>, usize)> {
// Try to find the first remote ref that exists for this branch (origin prioritized by caller).
let mut found_remote_sha: Option<GitSha> = None;
let mut found_remote_ref: Option<String> = None;
for remote in remotes {
let remote_ref = format!("refs/remotes/{remote}/{branch}");
let Some(verify_output) =
run_git_command_with_timeout(&["rev-parse", "--verify", "--quiet", &remote_ref], cwd)
.await
else {
// Mirror previous behavior: if the verify call times out/fails at the process level,
// treat the entire branch as unusable.
return None;
};
if !verify_output.status.success() {
continue;
}
let Ok(sha) = String::from_utf8(verify_output.stdout) else {
// Mirror previous behavior and skip the entire branch on parse failure.
return None;
};
found_remote_sha = Some(GitSha::new(sha.trim()));
found_remote_ref = Some(remote_ref);
break;
}
// Compute distance as the number of commits HEAD is ahead of the branch.
// Prefer local branch name if it exists; otherwise fall back to the remote ref (if any).
let count_output = if let Some(local_count) =
run_git_command_with_timeout(&["rev-list", "--count", &format!("{branch}..HEAD")], cwd)
.await
{
if local_count.status.success() {
local_count
} else if let Some(remote_ref) = &found_remote_ref {
match run_git_command_with_timeout(
&["rev-list", "--count", &format!("{remote_ref}..HEAD")],
cwd,
)
.await
{
Some(remote_count) => remote_count,
None => return None,
}
} else {
return None;
}
} else if let Some(remote_ref) = &found_remote_ref {
match run_git_command_with_timeout(
&["rev-list", "--count", &format!("{remote_ref}..HEAD")],
cwd,
)
.await
{
Some(remote_count) => remote_count,
None => return None,
}
} else {
return None;
};
if !count_output.status.success() {
return None;
}
let Ok(distance_str) = String::from_utf8(count_output.stdout) else {
return None;
};
let Ok(distance) = distance_str.trim().parse::<usize>() else {
return None;
};
Some((found_remote_sha, distance))
}
// Finds the closest sha that exists on any of the branches and also exists on any of the remotes.
async fn find_closest_sha(cwd: &Path, branches: &[String], remotes: &[String]) -> Option<GitSha> {
// A sha and how many commits away from HEAD it is.
let mut closest_sha: Option<(GitSha, usize)> = None;
for branch in branches {
let Some((maybe_remote_sha, distance)) =
branch_remote_and_distance(cwd, branch, remotes).await
else {
continue;
};
let Some(remote_sha) = maybe_remote_sha else {
// Preserve existing behavior: skip branches that are not present on a remote.
continue;
};
match &closest_sha {
None => closest_sha = Some((remote_sha, distance)),
Some((_, best_distance)) if distance < *best_distance => {
closest_sha = Some((remote_sha, distance));
}
_ => {}
}
}
closest_sha.map(|(sha, _)| sha)
}
async fn diff_against_sha(cwd: &Path, sha: &GitSha) -> Option<String> {
let output = run_git_command_with_timeout(&["diff", &sha.0], cwd).await?;
// 0 is success and no diff.
// 1 is success but there is a diff.
let exit_ok = output.status.code().is_some_and(|c| c == 0 || c == 1);
if !exit_ok {
return None;
}
let mut diff = String::from_utf8(output.stdout).ok()?;
if let Some(untracked_output) =
run_git_command_with_timeout(&["ls-files", "--others", "--exclude-standard"], cwd).await
&& untracked_output.status.success()
{
let untracked: Vec<String> = String::from_utf8(untracked_output.stdout)
.ok()?
.lines()
.map(|s| s.to_string())
.filter(|s| !s.is_empty())
.collect();
if !untracked.is_empty() {
let futures_iter = untracked.into_iter().map(|file| async move {
let file_owned = file;
let args_vec: Vec<&str> =
vec!["diff", "--binary", "--no-index", "/dev/null", &file_owned];
run_git_command_with_timeout(&args_vec, cwd).await
});
let results = join_all(futures_iter).await;
for extra in results.into_iter().flatten() {
if extra.status.code().is_some_and(|c| c == 0 || c == 1)
&& let Ok(s) = String::from_utf8(extra.stdout)
{
diff.push_str(&s);
}
}
}
}
Some(diff)
}
#[cfg(test)]
mod tests {
use super::*;
@@ -435,8 +104,7 @@ mod tests {
// Helper function to create a test git repository
async fn create_test_git_repo(temp_dir: &TempDir) -> PathBuf {
let repo_path = temp_dir.path().join("repo");
fs::create_dir(&repo_path).expect("Failed to create repo dir");
let repo_path = temp_dir.path().to_path_buf();
let envs = vec![
("GIT_CONFIG_GLOBAL", "/dev/null"),
("GIT_CONFIG_NOSYSTEM", "1"),
@@ -491,41 +159,6 @@ mod tests {
repo_path
}
async fn create_test_git_repo_with_remote(temp_dir: &TempDir) -> (PathBuf, String) {
let repo_path = create_test_git_repo(temp_dir).await;
let remote_path = temp_dir.path().join("remote.git");
Command::new("git")
.args(["init", "--bare", remote_path.to_str().unwrap()])
.output()
.await
.expect("Failed to init bare remote");
Command::new("git")
.args(["remote", "add", "origin", remote_path.to_str().unwrap()])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to add remote");
let output = Command::new("git")
.args(["rev-parse", "--abbrev-ref", "HEAD"])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to get branch");
let branch = String::from_utf8(output.stdout).unwrap().trim().to_string();
Command::new("git")
.args(["push", "-u", "origin", &branch])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to push initial commit");
(repo_path, branch)
}
#[tokio::test]
async fn test_collect_git_info_non_git_directory() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
@@ -639,136 +272,6 @@ mod tests {
assert_eq!(git_info.branch, Some("feature-branch".to_string()));
}
#[tokio::test]
async fn test_get_git_working_tree_state_clean_repo() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let (repo_path, branch) = create_test_git_repo_with_remote(&temp_dir).await;
let remote_sha = Command::new("git")
.args(["rev-parse", &format!("origin/{branch}")])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to rev-parse remote");
let remote_sha = String::from_utf8(remote_sha.stdout)
.unwrap()
.trim()
.to_string();
let state = git_diff_to_remote(&repo_path)
.await
.expect("Should collect working tree state");
assert_eq!(state.sha, GitSha::new(&remote_sha));
assert!(state.diff.is_empty());
}
#[tokio::test]
async fn test_get_git_working_tree_state_with_changes() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let (repo_path, branch) = create_test_git_repo_with_remote(&temp_dir).await;
let tracked = repo_path.join("test.txt");
fs::write(&tracked, "modified").unwrap();
fs::write(repo_path.join("untracked.txt"), "new").unwrap();
let remote_sha = Command::new("git")
.args(["rev-parse", &format!("origin/{branch}")])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to rev-parse remote");
let remote_sha = String::from_utf8(remote_sha.stdout)
.unwrap()
.trim()
.to_string();
let state = git_diff_to_remote(&repo_path)
.await
.expect("Should collect working tree state");
assert_eq!(state.sha, GitSha::new(&remote_sha));
assert!(state.diff.contains("test.txt"));
assert!(state.diff.contains("untracked.txt"));
}
#[tokio::test]
async fn test_get_git_working_tree_state_branch_fallback() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let (repo_path, _branch) = create_test_git_repo_with_remote(&temp_dir).await;
Command::new("git")
.args(["checkout", "-b", "feature"])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to create feature branch");
Command::new("git")
.args(["push", "-u", "origin", "feature"])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to push feature branch");
Command::new("git")
.args(["checkout", "-b", "local-branch"])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to create local branch");
let remote_sha = Command::new("git")
.args(["rev-parse", "origin/feature"])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to rev-parse remote");
let remote_sha = String::from_utf8(remote_sha.stdout)
.unwrap()
.trim()
.to_string();
let state = git_diff_to_remote(&repo_path)
.await
.expect("Should collect working tree state");
assert_eq!(state.sha, GitSha::new(&remote_sha));
}
#[tokio::test]
async fn test_get_git_working_tree_state_unpushed_commit() {
let temp_dir = TempDir::new().expect("Failed to create temp dir");
let (repo_path, branch) = create_test_git_repo_with_remote(&temp_dir).await;
let remote_sha = Command::new("git")
.args(["rev-parse", &format!("origin/{branch}")])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to rev-parse remote");
let remote_sha = String::from_utf8(remote_sha.stdout)
.unwrap()
.trim()
.to_string();
fs::write(repo_path.join("test.txt"), "updated").unwrap();
Command::new("git")
.args(["add", "test.txt"])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to add file");
Command::new("git")
.args(["commit", "-m", "local change"])
.current_dir(&repo_path)
.output()
.await
.expect("Failed to commit");
let state = git_diff_to_remote(&repo_path)
.await
.expect("Should collect working tree state");
assert_eq!(state.sha, GitSha::new(&remote_sha));
assert!(state.diff.contains("updated"));
}
#[test]
fn test_git_info_serialization() {
let git_info = GitInfo {

View File

@@ -43,13 +43,12 @@ mod models;
mod openai_model_info;
mod openai_tools;
pub mod plan_tool;
pub mod project_doc;
mod project_doc;
mod rollout;
pub(crate) mod safety;
pub mod seatbelt;
pub mod shell;
pub mod spawn;
pub mod terminal;
pub mod turn_diff_tracker;
pub mod user_agent;
mod user_notification;

View File

@@ -37,6 +37,7 @@ pub enum ConfigShellToolType {
DefaultShell,
ShellWithRequest { sandbox_policy: SandboxPolicy },
LocalShell,
NoBuiltInShellTool,
}
#[derive(Debug, Clone)]
@@ -53,8 +54,11 @@ impl ToolsConfig {
sandbox_policy: SandboxPolicy,
include_plan_tool: bool,
include_apply_patch_tool: bool,
experimental_disable_built_in_shell_tool: bool,
) -> Self {
let mut shell_type = if model_family.uses_local_shell_tool {
let mut shell_type = if experimental_disable_built_in_shell_tool {
ConfigShellToolType::NoBuiltInShellTool
} else if model_family.uses_local_shell_tool {
ConfigShellToolType::LocalShell
} else {
ConfigShellToolType::DefaultShell
@@ -533,6 +537,9 @@ pub(crate) fn get_openai_tools(
ConfigShellToolType::LocalShell => {
tools.push(OpenAiTool::LocalShell {});
}
ConfigShellToolType::NoBuiltInShellTool => {
// Do not add a shell tool
}
}
if config.plan_tool {
@@ -597,6 +604,7 @@ mod tests {
SandboxPolicy::ReadOnly,
true,
model_family.uses_apply_patch_tool,
/*experimental_disable_built_in_shell_tool*/ false,
);
let tools = get_openai_tools(&config, Some(HashMap::new()));
@@ -612,6 +620,7 @@ mod tests {
SandboxPolicy::ReadOnly,
true,
model_family.uses_apply_patch_tool,
/*experimental_disable_built_in_shell_tool*/ false,
);
let tools = get_openai_tools(&config, Some(HashMap::new()));
@@ -627,6 +636,7 @@ mod tests {
SandboxPolicy::ReadOnly,
false,
model_family.uses_apply_patch_tool,
/*experimental_disable_built_in_shell_tool*/ false,
);
let tools = get_openai_tools(
&config,
@@ -721,6 +731,7 @@ mod tests {
SandboxPolicy::ReadOnly,
false,
model_family.uses_apply_patch_tool,
/*experimental_disable_built_in_shell_tool*/ false,
);
let tools = get_openai_tools(
@@ -777,6 +788,7 @@ mod tests {
SandboxPolicy::ReadOnly,
false,
model_family.uses_apply_patch_tool,
/*experimental_disable_built_in_shell_tool*/ false,
);
let tools = get_openai_tools(
@@ -828,6 +840,7 @@ mod tests {
SandboxPolicy::ReadOnly,
false,
model_family.uses_apply_patch_tool,
/*experimental_disable_built_in_shell_tool*/ false,
);
let tools = get_openai_tools(
@@ -882,6 +895,7 @@ mod tests {
SandboxPolicy::ReadOnly,
false,
model_family.uses_apply_patch_tool,
/*experimental_disable_built_in_shell_tool*/ false,
);
let tools = get_openai_tools(

View File

@@ -1,19 +1,18 @@
//! Project-level documentation discovery.
//!
//! Project-level documentation can be stored in files named `AGENTS.md`.
//! We include the concatenation of all files found along the path from the
//! repository root to the current working directory as follows:
//! Project-level documentation can be stored in a file named `AGENTS.md`.
//! Currently, we include only the contents of the first file found as follows:
//!
//! 1. Determine the Git repository root by walking upwards from the current
//! working directory until a `.git` directory or file is found. If no Git
//! root is found, only the current working directory is considered.
//! 2. Collect every `AGENTS.md` found from the repository root down to the
//! current working directory (inclusive) and concatenate their contents in
//! that order.
//! 3. We do **not** walk past the Git root.
//! 1. Look for the doc file in the current working directory (as determined
//! by the `Config`).
//! 2. If not found, walk *upwards* until the Git repository root is reached
//! (detected by the presence of a `.git` directory/file), or failing that,
//! the filesystem root.
//! 3. If the Git root is encountered, look for the doc file there. If it
//! exists, the search stops; we do **not** walk past the Git root.
use crate::config::Config;
use std::path::PathBuf;
use std::path::Path;
use tokio::io::AsyncReadExt;
use tracing::error;
@@ -27,7 +26,7 @@ const PROJECT_DOC_SEPARATOR: &str = "\n\n--- project-doc ---\n\n";
/// Combines `Config::instructions` and `AGENTS.md` (if present) into a single
/// string of instructions.
pub(crate) async fn get_user_instructions(config: &Config) -> Option<String> {
match read_project_docs(config).await {
match find_project_doc(config).await {
Ok(Some(project_doc)) => match &config.user_instructions {
Some(original_instructions) => Some(format!(
"{original_instructions}{PROJECT_DOC_SEPARATOR}{project_doc}"
@@ -42,135 +41,95 @@ pub(crate) async fn get_user_instructions(config: &Config) -> Option<String> {
}
}
/// Attempt to locate and load the project documentation.
/// Attempt to locate and load the project documentation. Currently, the search
/// starts from `Config::cwd`, though we may want to consider other directories
/// in the future, e.g., additional writable directories in the `SandboxPolicy`.
///
/// On success returns `Ok(Some(contents))` where `contents` is the
/// concatenation of all discovered docs. If no documentation file is found the
/// function returns `Ok(None)`. Unexpected I/O failures bubble up as `Err` so
/// callers can decide how to handle them.
pub async fn read_project_docs(config: &Config) -> std::io::Result<Option<String>> {
let max_total = config.project_doc_max_bytes;
/// On success returns `Ok(Some(contents))`. If no documentation file is found
/// the function returns `Ok(None)`. Unexpected I/O failures bubble up as
/// `Err` so callers can decide how to handle them.
async fn find_project_doc(config: &Config) -> std::io::Result<Option<String>> {
let max_bytes = config.project_doc_max_bytes;
if max_total == 0 {
return Ok(None);
// Attempt to load from the working directory first.
if let Some(doc) = load_first_candidate(&config.cwd, CANDIDATE_FILENAMES, max_bytes).await? {
return Ok(Some(doc));
}
let paths = discover_project_doc_paths(config)?;
if paths.is_empty() {
return Ok(None);
}
let mut remaining: u64 = max_total as u64;
let mut parts: Vec<String> = Vec::new();
for p in paths {
if remaining == 0 {
break;
}
let file = match tokio::fs::File::open(&p).await {
Ok(f) => f,
Err(e) if e.kind() == std::io::ErrorKind::NotFound => continue,
Err(e) => return Err(e),
};
let size = file.metadata().await?.len();
let mut reader = tokio::io::BufReader::new(file).take(remaining);
let mut data: Vec<u8> = Vec::new();
reader.read_to_end(&mut data).await?;
if size > remaining {
tracing::warn!(
"Project doc `{}` exceeds remaining budget ({} bytes) - truncating.",
p.display(),
remaining,
);
}
let text = String::from_utf8_lossy(&data).to_string();
if !text.trim().is_empty() {
parts.push(text);
remaining = remaining.saturating_sub(data.len() as u64);
}
}
if parts.is_empty() {
Ok(None)
} else {
Ok(Some(parts.join("\n\n")))
}
}
/// Discover the list of AGENTS.md files using the same search rules as
/// `read_project_docs`, but return the file paths instead of concatenated
/// contents. The list is ordered from repository root to the current working
/// directory (inclusive). Symlinks are allowed. When `project_doc_max_bytes`
/// is zero, returns an empty list.
pub fn discover_project_doc_paths(config: &Config) -> std::io::Result<Vec<PathBuf>> {
// Walk up towards the filesystem root, stopping once we encounter the Git
// repository root. The presence of **either** a `.git` *file* or
// *directory* counts.
let mut dir = config.cwd.clone();
// Canonicalize the path so that we do not end up in an infinite loop when
// `cwd` contains `..` components.
if let Ok(canon) = dir.canonicalize() {
dir = canon;
}
// Build chain from cwd upwards and detect git root.
let mut chain: Vec<PathBuf> = vec![dir.clone()];
let mut git_root: Option<PathBuf> = None;
let mut cursor = dir.clone();
while let Some(parent) = cursor.parent() {
let git_marker = cursor.join(".git");
let git_exists = match std::fs::metadata(&git_marker) {
while let Some(parent) = dir.parent() {
// `.git` can be a *file* (for worktrees or submodules) or a *dir*.
let git_marker = dir.join(".git");
let git_exists = match tokio::fs::metadata(&git_marker).await {
Ok(_) => true,
Err(e) if e.kind() == std::io::ErrorKind::NotFound => false,
Err(e) => return Err(e),
};
if git_exists {
git_root = Some(cursor.clone());
// We are at the repo root; attempt one final load.
if let Some(doc) = load_first_candidate(&dir, CANDIDATE_FILENAMES, max_bytes).await? {
return Ok(Some(doc));
}
break;
}
chain.push(parent.to_path_buf());
cursor = parent.to_path_buf();
dir = parent.to_path_buf();
}
let search_dirs: Vec<PathBuf> = if let Some(root) = git_root {
let mut dirs: Vec<PathBuf> = Vec::new();
let mut saw_root = false;
for p in chain.iter().rev() {
if !saw_root {
if p == &root {
saw_root = true;
} else {
continue;
}
}
dirs.push(p.clone());
}
dirs
} else {
vec![config.cwd.clone()]
};
Ok(None)
}
let mut found: Vec<PathBuf> = Vec::new();
for d in search_dirs {
for name in CANDIDATE_FILENAMES {
let candidate = d.join(name);
match std::fs::symlink_metadata(&candidate) {
Ok(md) => {
let ft = md.file_type();
// Allow regular files and symlinks; opening will later fail for dangling links.
if ft.is_file() || ft.is_symlink() {
found.push(candidate);
break;
}
}
Err(e) if e.kind() == std::io::ErrorKind::NotFound => continue,
Err(e) => return Err(e),
}
/// Attempt to load the first candidate file found in `dir`. Returns the file
/// contents (truncated if it exceeds `max_bytes`) when successful.
async fn load_first_candidate(
dir: &Path,
names: &[&str],
max_bytes: usize,
) -> std::io::Result<Option<String>> {
for name in names {
let candidate = dir.join(name);
let file = match tokio::fs::File::open(&candidate).await {
Err(e) if e.kind() == std::io::ErrorKind::NotFound => continue,
Err(e) => return Err(e),
Ok(f) => f,
};
let size = file.metadata().await?.len();
let reader = tokio::io::BufReader::new(file);
let mut data = Vec::with_capacity(std::cmp::min(size as usize, max_bytes));
let mut limited = reader.take(max_bytes as u64);
limited.read_to_end(&mut data).await?;
if size as usize > max_bytes {
tracing::warn!(
"Project doc `{}` exceeds {max_bytes} bytes - truncating.",
candidate.display(),
);
}
let contents = String::from_utf8_lossy(&data).to_string();
if contents.trim().is_empty() {
// Empty file; treat as not found.
continue;
}
return Ok(Some(contents));
}
Ok(found)
Ok(None)
}
#[cfg(test)]
@@ -319,32 +278,4 @@ mod tests {
assert_eq!(res, Some(INSTRUCTIONS.to_string()));
}
/// When both the repository root and the working directory contain
/// AGENTS.md files, their contents are concatenated from root to cwd.
#[tokio::test]
async fn concatenates_root_and_cwd_docs() {
let repo = tempfile::tempdir().expect("tempdir");
// Simulate a git repository.
std::fs::write(
repo.path().join(".git"),
"gitdir: /path/to/actual/git/dir\n",
)
.unwrap();
// Repo root doc.
fs::write(repo.path().join("AGENTS.md"), "root doc").unwrap();
// Nested working directory with its own doc.
let nested = repo.path().join("workspace/crate_a");
std::fs::create_dir_all(&nested).unwrap();
fs::write(nested.join("AGENTS.md"), "crate doc").unwrap();
let mut cfg = make_config(&repo, 4096, None);
cfg.cwd = nested;
let res = get_user_instructions(&cfg).await.expect("doc expected");
assert_eq!(res, "root doc\n\ncrate doc");
}
}

View File

@@ -1,24 +1,14 @@
use serde::Deserialize;
use serde::Serialize;
use shlex;
use std::path::PathBuf;
#[derive(Debug, PartialEq, Eq, Clone, Serialize, Deserialize)]
#[derive(Debug, PartialEq, Eq)]
pub struct ZshShell {
shell_path: String,
zshrc_path: String,
}
#[derive(Debug, PartialEq, Eq, Clone, Serialize, Deserialize)]
pub struct PowerShellConfig {
exe: String, // Executable name or path, e.g. "pwsh" or "powershell.exe".
bash_exe_fallback: Option<PathBuf>, // In case the model generates a bash command.
}
#[derive(Debug, PartialEq, Eq, Clone, Serialize, Deserialize)]
#[derive(Debug, PartialEq, Eq)]
pub enum Shell {
Zsh(ZshShell),
PowerShell(PowerShellConfig),
Unknown,
}
@@ -43,61 +33,6 @@ impl Shell {
}
Some(result)
}
Shell::PowerShell(ps) => {
// If model generated a bash command, prefer a detected bash fallback
if let Some(script) = strip_bash_lc(&command) {
return match &ps.bash_exe_fallback {
Some(bash) => Some(vec![
bash.to_string_lossy().to_string(),
"-lc".to_string(),
script,
]),
// No bash fallback → run the script under PowerShell.
// It will likely fail (except for some simple commands), but the error
// should give the model a clue that it is running under PowerShell so it can fix the command on retry.
None => Some(vec![
ps.exe.clone(),
"-NoProfile".to_string(),
"-Command".to_string(),
script,
]),
};
}
// Not a bash command. If model did not generate a PowerShell command,
// turn it into a PowerShell command.
let first = command.first().map(String::as_str);
if first != Some(ps.exe.as_str()) {
// TODO (CODEX_2900): Handle escaping newlines.
if command.iter().any(|a| a.contains('\n') || a.contains('\r')) {
return Some(command);
}
let joined = shlex::try_join(command.iter().map(|s| s.as_str())).ok();
return joined.map(|arg| {
vec![
ps.exe.clone(),
"-NoProfile".to_string(),
"-Command".to_string(),
arg,
]
});
}
// Model generated a PowerShell command. Run it.
Some(command)
}
Shell::Unknown => None,
}
}
pub fn name(&self) -> Option<String> {
match self {
Shell::Zsh(zsh) => std::path::Path::new(&zsh.shell_path)
.file_name()
.map(|s| s.to_string_lossy().to_string()),
Shell::PowerShell(ps) => Some(ps.exe.clone()),
Shell::Unknown => None,
}
}
@@ -151,51 +86,11 @@ pub async fn default_user_shell() -> Shell {
}
}
#[cfg(all(not(target_os = "macos"), not(target_os = "windows")))]
#[cfg(not(target_os = "macos"))]
pub async fn default_user_shell() -> Shell {
Shell::Unknown
}
#[cfg(target_os = "windows")]
pub async fn default_user_shell() -> Shell {
use tokio::process::Command;
// Prefer PowerShell 7+ (`pwsh`) if available, otherwise fall back to Windows PowerShell.
let has_pwsh = Command::new("pwsh")
.arg("-NoLogo")
.arg("-NoProfile")
.arg("-Command")
.arg("$PSVersionTable.PSVersion.Major")
.output()
.await
.map(|o| o.status.success())
.unwrap_or(false);
let bash_exe = if Command::new("bash.exe")
.arg("--version")
.output()
.await
.ok()
.map(|o| o.status.success())
.unwrap_or(false)
{
which::which("bash.exe").ok()
} else {
None
};
if has_pwsh {
Shell::PowerShell(PowerShellConfig {
exe: "pwsh.exe".to_string(),
bash_exe_fallback: bash_exe,
})
} else {
Shell::PowerShell(PowerShellConfig {
exe: "powershell.exe".to_string(),
bash_exe_fallback: bash_exe,
})
}
}
#[cfg(test)]
#[cfg(target_os = "macos")]
mod tests {
@@ -336,97 +231,3 @@ mod tests {
}
}
}
#[cfg(test)]
#[cfg(target_os = "windows")]
mod tests_windows {
use super::*;
#[test]
fn test_format_default_shell_invocation_powershell() {
let cases = vec![
(
Shell::PowerShell(PowerShellConfig {
exe: "pwsh.exe".to_string(),
bash_exe_fallback: None,
}),
vec!["bash", "-lc", "echo hello"],
vec!["pwsh.exe", "-NoProfile", "-Command", "echo hello"],
),
(
Shell::PowerShell(PowerShellConfig {
exe: "powershell.exe".to_string(),
bash_exe_fallback: None,
}),
vec!["bash", "-lc", "echo hello"],
vec!["powershell.exe", "-NoProfile", "-Command", "echo hello"],
),
(
Shell::PowerShell(PowerShellConfig {
exe: "pwsh.exe".to_string(),
bash_exe_fallback: Some(PathBuf::from("bash.exe")),
}),
vec!["bash", "-lc", "echo hello"],
vec!["bash.exe", "-lc", "echo hello"],
),
(
Shell::PowerShell(PowerShellConfig {
exe: "pwsh.exe".to_string(),
bash_exe_fallback: Some(PathBuf::from("bash.exe")),
}),
vec![
"bash",
"-lc",
"apply_patch <<'EOF'\n*** Begin Patch\n*** Update File: destination_file.txt\n-original content\n+modified content\n*** End Patch\nEOF",
],
vec![
"bash.exe",
"-lc",
"apply_patch <<'EOF'\n*** Begin Patch\n*** Update File: destination_file.txt\n-original content\n+modified content\n*** End Patch\nEOF",
],
),
(
Shell::PowerShell(PowerShellConfig {
exe: "pwsh.exe".to_string(),
bash_exe_fallback: Some(PathBuf::from("bash.exe")),
}),
vec!["echo", "hello"],
vec!["pwsh.exe", "-NoProfile", "-Command", "echo hello"],
),
(
Shell::PowerShell(PowerShellConfig {
exe: "pwsh.exe".to_string(),
bash_exe_fallback: Some(PathBuf::from("bash.exe")),
}),
vec!["pwsh.exe", "-NoProfile", "-Command", "echo hello"],
vec!["pwsh.exe", "-NoProfile", "-Command", "echo hello"],
),
(
// TODO (CODEX_2900): Handle escaping newlines for powershell invocation.
Shell::PowerShell(PowerShellConfig {
exe: "powershell.exe".to_string(),
bash_exe_fallback: Some(PathBuf::from("bash.exe")),
}),
vec![
"codex-mcp-server.exe",
"--codex-run-as-apply-patch",
"*** Begin Patch\n*** Update File: C:\\Users\\person\\destination_file.txt\n-original content\n+modified content\n*** End Patch",
],
vec![
"codex-mcp-server.exe",
"--codex-run-as-apply-patch",
"*** Begin Patch\n*** Update File: C:\\Users\\person\\destination_file.txt\n-original content\n+modified content\n*** End Patch",
],
),
];
for (shell, input, expected_cmd) in cases {
let actual_cmd = shell
.format_default_shell_invocation(input.iter().map(|s| s.to_string()).collect());
assert_eq!(
actual_cmd,
Some(expected_cmd.iter().map(|s| s.to_string()).collect())
);
}
}
}

View File

@@ -1,72 +0,0 @@
use std::sync::OnceLock;
static TERMINAL: OnceLock<String> = OnceLock::new();
pub fn user_agent() -> String {
TERMINAL.get_or_init(detect_terminal).to_string()
}
/// Returns true for characters that are allowed in a User-Agent header value.
fn is_valid_header_value_char(c: char) -> bool {
c.is_ascii_alphanumeric() || c == '-' || c == '_' || c == '.' || c == '/'
}
/// Sanitize a header value to be used in a User-Agent string.
///
/// Replaces any character that is not allowed in a User-Agent string with an underscore.
///
/// # Arguments
///
/// * `value` - The value to sanitize.
fn sanitize_header_value(value: String) -> String {
value.replace(|c| !is_valid_header_value_char(c), "_")
}
fn detect_terminal() -> String {
sanitize_header_value(
if let Ok(tp) = std::env::var("TERM_PROGRAM")
&& !tp.trim().is_empty()
{
let ver = std::env::var("TERM_PROGRAM_VERSION").ok();
match ver {
Some(v) if !v.trim().is_empty() => format!("{tp}/{v}"),
_ => tp,
}
} else if let Ok(v) = std::env::var("WEZTERM_VERSION") {
if !v.trim().is_empty() {
format!("WezTerm/{v}")
} else {
"WezTerm".to_string()
}
} else if std::env::var("KITTY_WINDOW_ID").is_ok()
|| std::env::var("TERM")
.map(|t| t.contains("kitty"))
.unwrap_or(false)
{
"kitty".to_string()
} else if std::env::var("ALACRITTY_SOCKET").is_ok()
|| std::env::var("TERM")
.map(|t| t == "alacritty")
.unwrap_or(false)
{
"Alacritty".to_string()
} else if let Ok(v) = std::env::var("KONSOLE_VERSION") {
if !v.trim().is_empty() {
format!("Konsole/{v}")
} else {
"Konsole".to_string()
}
} else if std::env::var("GNOME_TERMINAL_SCREEN").is_ok() {
return "gnome-terminal".to_string();
} else if let Ok(v) = std::env::var("VTE_VERSION") {
if !v.trim().is_empty() {
format!("VTE/{v}")
} else {
"VTE".to_string()
}
} else if std::env::var("WT_SESSION").is_ok() {
return "WindowsTerminal".to_string();
} else {
std::env::var("TERM").unwrap_or_else(|_| "unknown".to_string())
},
)
}

View File

@@ -4,12 +4,11 @@ pub fn get_codex_user_agent(originator: Option<&str>) -> String {
let build_version = env!("CARGO_PKG_VERSION");
let os_info = os_info::get();
format!(
"{}/{build_version} ({} {}; {}) {}",
"{}/{build_version} ({} {}; {})",
originator.unwrap_or(DEFAULT_ORIGINATOR),
os_info.os_type(),
os_info.version(),
os_info.architecture().unwrap_or("unknown"),
crate::terminal::user_agent()
)
}
@@ -28,10 +27,9 @@ mod tests {
fn test_macos() {
use regex_lite::Regex;
let user_agent = get_codex_user_agent(None);
let re = Regex::new(
r"^codex_cli_rs/\d+\.\d+\.\d+ \(Mac OS \d+\.\d+\.\d+; (x86_64|arm64)\) (\S+)$",
)
.unwrap();
let re =
Regex::new(r"^codex_cli_rs/\d+\.\d+\.\d+ \(Mac OS \d+\.\d+\.\d+; (x86_64|arm64)\)$")
.unwrap();
assert!(re.is_match(&user_agent));
}
}

View File

@@ -1,6 +1,4 @@
use std::fs;
use std::path::Path;
use std::path::PathBuf;
use std::time::Duration;
use rand::Rng;
@@ -44,87 +42,3 @@ pub fn is_inside_git_repo(base_dir: &Path) -> bool {
false
}
/// Try to resolve the main git repository root for `base_dir`.
///
/// - For a normal repo (where `.git` is a directory), returns the directory
/// that contains the `.git` directory.
/// - For a worktree (where `.git` is a file with a `gitdir:` pointer), reads
/// the referenced git directory and, if present, its `commondir` file to
/// locate the common `.git` directory of the main repository. Returns the
/// parent of that common directory.
/// - Returns `None` when no enclosing repo is found.
pub fn git_main_repo_root(base_dir: &Path) -> Option<PathBuf> {
// Walk up from base_dir to find the first ancestor containing a `.git` entry.
let mut dir = base_dir.to_path_buf();
loop {
let dot_git = dir.join(".git");
if dot_git.is_dir() {
// Standard repository. The repo root is the directory containing `.git`.
return Some(dir);
} else if dot_git.is_file() {
// Worktree case: `.git` is a file like: `gitdir: /path/to/worktrees/<name>`
if let Ok(contents) = fs::read_to_string(&dot_git) {
// Extract the path after `gitdir:` and trim whitespace.
let gitdir_prefix = "gitdir:";
let line = contents
.lines()
.find(|l| l.trim_start().starts_with(gitdir_prefix));
if let Some(line) = line {
let path_part = line.split_once(':').map(|(_, r)| r.trim());
if let Some(gitdir_str) = path_part {
// Resolve relative paths against the directory containing `.git` (the worktree root).
let gitdir_path = Path::new(gitdir_str);
let gitdir_abs = if gitdir_path.is_absolute() {
gitdir_path.to_path_buf()
} else {
dir.join(gitdir_path)
};
// In worktrees, the per-worktree gitdir typically contains a `commondir`
// file that points (possibly relatively) to the common `.git` directory.
let commondir_path = gitdir_abs.join("commondir");
if let Ok(common_dir_rel) = fs::read_to_string(&commondir_path) {
let common_dir_rel = common_dir_rel.trim();
let common_dir_path = Path::new(common_dir_rel);
let common_dir_abs = if common_dir_path.is_absolute() {
common_dir_path.to_path_buf()
} else {
gitdir_abs.join(common_dir_path)
};
// The main repo root is the parent of the common `.git` directory.
if let Some(parent) = common_dir_abs.parent() {
return Some(parent.to_path_buf());
}
} else {
// Fallback: if no commondir file, use the parent of `gitdir_abs` if it looks like a `.git` dir.
if let Some(parent) = gitdir_abs.parent() {
return Some(parent.to_path_buf());
}
}
}
}
}
// If parsing fails, continue the walk upwards in case of nested repos (rare).
}
if !dir.pop() {
break;
}
}
None
}
/// Normalize a path for trust configuration lookups.
///
/// If inside a git repo, returns the main repository root; otherwise returns the
/// canonicalized `base_dir` (or `base_dir` if canonicalization fails).
pub fn normalized_trust_project_root(base_dir: &Path) -> PathBuf {
if let Some(repo_root) = git_main_repo_root(base_dir) {
return repo_root.canonicalize().unwrap_or(repo_root);
}
base_dir
.canonicalize()
.unwrap_or_else(|_| base_dir.to_path_buf())
}

View File

@@ -260,7 +260,7 @@ async fn originator_config_override_is_used() {
let codex_home = TempDir::new().unwrap();
let mut config = load_default_config_for_test(&codex_home);
config.model_provider = model_provider;
config.responses_originator_header = "my_override".to_owned();
config.internal_originator = Some("my_override".to_string());
let conversation_manager = ConversationManager::default();
let codex = conversation_manager

View File

@@ -8,7 +8,6 @@ use codex_core::protocol::Op;
use codex_core::protocol::SandboxPolicy;
use codex_core::protocol_config_types::ReasoningEffort;
use codex_core::protocol_config_types::ReasoningSummary;
use codex_core::shell::default_user_shell;
use codex_login::CodexAuth;
use core_test_support::load_default_config_for_test;
use core_test_support::load_sse_fixture_with_id;
@@ -86,20 +85,9 @@ async fn prefixes_context_and_instructions_once_and_consistently_across_requests
let requests = server.received_requests().await.unwrap();
assert_eq!(requests.len(), 2, "expected two POST requests");
let shell = default_user_shell().await;
let expected_env_text = format!(
r#"<environment_context>
<cwd>{}</cwd>
<approval_policy>on-request</approval_policy>
<sandbox_mode>read-only</sandbox_mode>
<network_access>restricted</network_access>
{}</environment_context>"#,
cwd.path().to_string_lossy(),
match shell.name() {
Some(name) => format!(" <shell>{}</shell>\n", name),
None => String::new(),
}
"<environment_context>\nCurrent working directory: {}\nApproval policy: on-request\nSandbox mode: read-only\nNetwork access: restricted\n</environment_context>",
cwd.path().to_string_lossy()
);
let expected_ui_text =
"<user_instructions>\n\nbe consistent and helpful\n\n</user_instructions>";
@@ -195,10 +183,12 @@ async fn overrides_turn_context_but_keeps_cached_prefix_and_key_constant() {
.unwrap();
wait_for_event(&codex, |ev| matches!(ev, EventMsg::TaskComplete(_))).await;
// Change everything about the turn context.
let new_cwd = TempDir::new().unwrap();
let writable = TempDir::new().unwrap();
codex
.submit(Op::OverrideTurnContext {
cwd: None,
cwd: Some(new_cwd.path().to_path_buf()),
approval_policy: Some(AskForApproval::Never),
sandbox_policy: Some(SandboxPolicy::WorkspaceWrite {
writable_roots: vec![writable.path().to_path_buf()],
@@ -230,6 +220,7 @@ async fn overrides_turn_context_but_keeps_cached_prefix_and_key_constant() {
let body1 = requests[0].body_json::<serde_json::Value>().unwrap();
let body2 = requests[1].body_json::<serde_json::Value>().unwrap();
// prompt_cache_key should remain constant across overrides
assert_eq!(
body1["prompt_cache_key"], body2["prompt_cache_key"],
@@ -245,13 +236,11 @@ async fn overrides_turn_context_but_keeps_cached_prefix_and_key_constant() {
"content": [ { "type": "input_text", "text": "hello 2" } ]
});
// After overriding the turn context, the environment context should be emitted again
// reflecting the new approval policy and sandbox settings. Omit cwd because it did
// not change.
let expected_env_text_2 = r#"<environment_context>
<approval_policy>never</approval_policy>
<sandbox_mode>workspace-write</sandbox_mode>
<network_access>enabled</network_access>
</environment_context>"#;
// reflecting the new cwd, approval policy and sandbox settings.
let expected_env_text_2 = format!(
"<environment_context>\nCurrent working directory: {}\nApproval policy: never\nSandbox mode: workspace-write\nNetwork access: enabled\n</environment_context>",
new_cwd.path().to_string_lossy()
);
let expected_env_msg_2 = serde_json::json!({
"type": "message",
"id": serde_json::Value::Null,

View File

@@ -0,0 +1,35 @@
[package]
edition = "2024"
name = "exec-command-mcp"
version = { workspace = true }
[[bin]]
name = "exec-command-mcp"
path = "src/main.rs"
[lib]
name = "exec_command_mcp"
path = "src/lib.rs"
[lints]
workspace = true
[dependencies]
anyhow = "1"
mcp-types = { path = "../mcp-types" }
portable-pty = "0.9.0"
schemars = "0.8.22"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
tokio = { version = "1", features = [
"io-std",
"io-util",
"macros",
"process",
"rt-multi-thread",
"time",
"sync",
"signal",
] }
tracing = { version = "0.1.41", features = ["log"] }
tracing-subscriber = { version = "0.3", features = ["env-filter", "fmt"] }

View File

@@ -0,0 +1,2 @@
pub(crate) const INVALID_REQUEST_ERROR_CODE: i64 = -32600;
pub(crate) const INTERNAL_ERROR_CODE: i64 = -32603;

View File

@@ -0,0 +1,59 @@
use schemars::JsonSchema;
use serde::Deserialize;
use serde::Serialize;
use crate::session_id::SessionId;
#[allow(dead_code)]
#[derive(Debug, Clone, Deserialize, JsonSchema)]
pub(crate) struct ExecCommandParams {
pub(crate) cmd: String,
#[serde(default = "default_yield_time")]
pub(crate) yield_time_ms: u64,
#[serde(default = "max_output_tokens")]
pub(crate) max_output_tokens: u64,
#[serde(default = "default_shell")]
pub(crate) shell: String,
#[serde(default = "default_login")]
pub(crate) login: bool,
}
fn default_yield_time() -> u64 {
10_000
}
fn max_output_tokens() -> u64 {
10_000
}
fn default_login() -> bool {
true
}
fn default_shell() -> String {
"/bin/bash".to_string()
}
#[derive(Debug, Deserialize, Serialize, JsonSchema)]
pub(crate) struct WriteStdinParams {
pub(crate) session_id: SessionId,
pub(crate) chars: String,
#[serde(default = "write_stdin_default_yield_time_ms")]
pub(crate) yield_time_ms: u64,
#[serde(default = "write_stdin_default_max_output_tokens")]
pub(crate) max_output_tokens: u64,
}
fn write_stdin_default_yield_time_ms() -> u64 {
250
}
fn write_stdin_default_max_output_tokens() -> u64 {
10_000
}
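// Example sketch: a request that only supplies `cmd` picks up the serde
// defaults declared above for every other field.
#[cfg(test)]
mod exec_command_params_defaults {
    use super::*;

    #[test]
    fn deserializes_with_defaults() {
        let params: ExecCommandParams =
            serde_json::from_str(r#"{"cmd": "ls -la"}"#).expect("valid params");
        assert_eq!(params.cmd, "ls -la");
        assert_eq!(params.yield_time_ms, 10_000);
        assert_eq!(params.max_output_tokens, 10_000);
        assert_eq!(params.shell, "/bin/bash");
        assert!(params.login);
    }
}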

View File

@@ -0,0 +1,54 @@
use std::sync::Arc;
use tokio::sync::Mutex;
use tokio::sync::mpsc;
use crate::session_id::SessionId;
#[allow(dead_code)]
#[derive(Debug)]
pub(crate) struct ExecCommandSession {
pub(crate) id: SessionId,
/// Queue for writing bytes to the process stdin (PTY master write side).
writer_tx: mpsc::Sender<Vec<u8>>,
/// Stream of output chunks read from the PTY. Wrapped in Mutex so callers can
/// `await` receiving without needing `&mut self`.
output_rx: Arc<Mutex<mpsc::Receiver<Vec<u8>>>>,
}
#[allow(dead_code)]
impl ExecCommandSession {
pub(crate) fn new(
id: SessionId,
writer_tx: mpsc::Sender<Vec<u8>>,
output_rx: mpsc::Receiver<Vec<u8>>,
) -> Self {
Self {
id,
writer_tx,
output_rx: Arc::new(Mutex::new(output_rx)),
}
}
/// Enqueue bytes to be written to the process stdin (PTY master).
pub(crate) async fn write_stdin(&self, bytes: impl AsRef<[u8]>) -> anyhow::Result<()> {
self.writer_tx
.send(bytes.as_ref().to_vec())
.await
.map_err(|e| anyhow::anyhow!("failed to send to writer: {e}"))
}
/// Receive the next chunk of output from the process. Returns `None` when the
/// output stream is closed (process exited or reader finished).
pub(crate) async fn recv_output_chunk(&self) -> Option<Vec<u8>> {
self.output_rx.lock().await.recv().await
}
pub(crate) fn writer_sender(&self) -> mpsc::Sender<Vec<u8>> {
self.writer_tx.clone()
}
pub(crate) fn output_receiver(&self) -> Arc<Mutex<mpsc::Receiver<Vec<u8>>>> {
self.output_rx.clone()
}
}
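// Example sketch: the session is a thin wrapper over two channels, so a stdin
// write surfaces on the writer channel and anything pushed into the output
// channel becomes visible via `recv_output_chunk`.
#[cfg(test)]
mod exec_command_session_channels {
    use super::*;

    #[tokio::test]
    async fn round_trips_stdin_and_output() {
        let (writer_tx, mut writer_rx) = mpsc::channel::<Vec<u8>>(8);
        let (output_tx, output_rx) = mpsc::channel::<Vec<u8>>(8);
        let session = ExecCommandSession::new(SessionId(0), writer_tx, output_rx);

        // Bytes queued via `write_stdin` are forwarded to the writer channel.
        session.write_stdin(b"echo hi\n").await.unwrap();
        assert_eq!(writer_rx.recv().await, Some(b"echo hi\n".to_vec()));

        // Output chunks fed into the output channel are observable by callers.
        output_tx.send(b"hi\n".to_vec()).await.unwrap();
        assert_eq!(session.recv_output_chunk().await, Some(b"hi\n".to_vec()));
    }
}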

View File

@@ -0,0 +1,118 @@
#![deny(clippy::print_stdout, clippy::print_stderr)]
use mcp_types::JSONRPCMessage;
use std::io::Result as IoResult;
use tokio::io::AsyncBufReadExt;
use tokio::io::AsyncWriteExt;
use tokio::io::BufReader;
use tokio::sync::mpsc;
use tracing::debug;
use tracing::error;
use tracing::info;
use tracing_subscriber::EnvFilter;
use crate::message_processor::MessageProcessor;
use crate::outgoing_message::OutgoingMessage;
use crate::outgoing_message_sender::OutgoingMessageSender;
mod error_code;
mod exec_command;
mod exec_command_session;
mod message_processor;
mod outgoing_message;
mod outgoing_message_sender;
mod session_id;
mod session_manager;
/// Size of the bounded channels used to communicate between tasks. The value
/// is a balance between throughput and memory usage; 128 messages should be
/// plenty for an interactive CLI.
const CHANNEL_CAPACITY: usize = 128;
pub async fn run_main() -> IoResult<()> {
// Honor `RUST_LOG`.
tracing_subscriber::fmt()
.with_writer(std::io::stderr)
.with_env_filter(EnvFilter::from_default_env())
.init();
// Set up channels.
let (incoming_tx, mut incoming_rx) = mpsc::channel::<JSONRPCMessage>(CHANNEL_CAPACITY);
let (outgoing_tx, mut outgoing_rx) = mpsc::channel::<OutgoingMessage>(CHANNEL_CAPACITY);
// Task: read from stdin, push to `incoming_tx`.
let stdin_reader_handle = tokio::spawn({
let incoming_tx = incoming_tx.clone();
async move {
let stdin = tokio::io::stdin();
let reader = BufReader::new(stdin);
let mut lines = reader.lines();
while let Some(line) = lines.next_line().await.unwrap_or_default() {
match serde_json::from_str::<JSONRPCMessage>(&line) {
Ok(msg) => {
if incoming_tx.send(msg).await.is_err() {
// Receiver gone; nothing left to do.
break;
}
}
Err(e) => error!("Failed to deserialize JSONRPCMessage: {e}"),
}
}
debug!("stdin reader finished (EOF)");
}
});
// Task: process incoming messages.
let processor_handle = tokio::spawn({
let outgoing_message_sender = OutgoingMessageSender::new(outgoing_tx);
let mut processor = MessageProcessor::new(outgoing_message_sender);
async move {
while let Some(msg) = incoming_rx.recv().await {
match msg {
JSONRPCMessage::Request(request) => processor.process_request(request).await,
JSONRPCMessage::Response(_response) => {}
JSONRPCMessage::Notification(_notification) => {}
JSONRPCMessage::Error(_error) => {}
}
}
info!("processor task exited (channel closed)");
}
});
// Task: write outgoing messages to stdout.
let stdout_writer_handle = tokio::spawn(async move {
let mut stdout = tokio::io::stdout();
while let Some(outgoing_message) = outgoing_rx.recv().await {
let msg: JSONRPCMessage = outgoing_message.into();
match serde_json::to_string(&msg) {
Ok(json) => {
if let Err(e) = stdout.write_all(json.as_bytes()).await {
error!("Failed to write to stdout: {e}");
break;
}
if let Err(e) = stdout.write_all(b"\n").await {
error!("Failed to write newline to stdout: {e}");
break;
}
if let Err(e) = stdout.flush().await {
error!("Failed to flush stdout: {e}");
break;
}
}
Err(e) => error!("Failed to serialize JSONRPCMessage: {e}"),
}
}
info!("stdout writer exited (channel closed)");
});
// Wait for all tasks to finish. The typical exit path is the stdin reader
// hitting EOF which, once it drops `incoming_tx`, propagates shutdown to
// the processor and then to the stdout task.
let _ = tokio::join!(stdin_reader_handle, processor_handle, stdout_writer_handle);
Ok(())
}

View File

@@ -0,0 +1,7 @@
use exec_command_mcp::run_main;
#[tokio::main]
async fn main() -> anyhow::Result<()> {
run_main().await?;
Ok(())
}

View File

@@ -0,0 +1,289 @@
use std::sync::Arc;
use mcp_types::CallToolRequestParams;
use mcp_types::CallToolResult;
use mcp_types::ClientRequest as McpClientRequest;
use mcp_types::ContentBlock;
use mcp_types::JSONRPCErrorError;
use mcp_types::JSONRPCRequest;
use mcp_types::ListToolsResult;
use mcp_types::ModelContextProtocolRequest;
use mcp_types::RequestId;
use mcp_types::ServerCapabilitiesTools;
use mcp_types::TextContent;
use mcp_types::Tool;
use mcp_types::ToolInputSchema;
use schemars::r#gen::SchemaSettings;
use crate::error_code::INVALID_REQUEST_ERROR_CODE;
use crate::error_code::{self};
use crate::exec_command::ExecCommandParams;
use crate::exec_command::WriteStdinParams;
use crate::outgoing_message_sender::OutgoingMessageSender;
use crate::session_manager::SessionManager;
#[derive(Debug)]
pub(crate) struct MessageProcessor {
initialized: bool,
outgoing: Arc<OutgoingMessageSender>,
session_manager: Arc<SessionManager>,
}
impl MessageProcessor {
pub(crate) fn new(outgoing: OutgoingMessageSender) -> Self {
Self {
initialized: false,
outgoing: Arc::new(outgoing),
session_manager: Arc::new(SessionManager::default()),
}
}
pub(crate) async fn process_request(&mut self, request: JSONRPCRequest) {
let request_id = request.id.clone();
let client_request = match McpClientRequest::try_from(request) {
Ok(client_request) => client_request,
Err(e) => {
self.outgoing
.send_error(
request_id,
JSONRPCErrorError {
code: error_code::INVALID_REQUEST_ERROR_CODE,
message: format!("Invalid request: {e}"),
data: None,
},
)
.await;
return;
}
};
match client_request {
McpClientRequest::InitializeRequest(params) => {
self.handle_initialize(request_id, params).await;
}
McpClientRequest::ListToolsRequest(params) => {
self.handle_list_tools(request_id, params).await;
}
McpClientRequest::CallToolRequest(params) => {
self.handle_call_tool(request_id, params).await;
}
_ => {
tracing::warn!("Unhandled client request: {client_request:?}");
}
}
}
async fn handle_initialize(
&mut self,
id: RequestId,
params: <mcp_types::InitializeRequest as ModelContextProtocolRequest>::Params,
) {
tracing::info!("initialize -> params: {:?}", params);
if self.initialized {
// Already initialized: send a JSON-RPC error response.
let error = JSONRPCErrorError {
code: INVALID_REQUEST_ERROR_CODE,
message: "initialize called more than once".to_string(),
data: None,
};
self.outgoing.send_error(id, error).await;
return;
}
self.initialized = true;
// Build a minimal InitializeResult. Fill with placeholders.
let result = mcp_types::InitializeResult {
capabilities: mcp_types::ServerCapabilities {
completions: None,
experimental: None,
logging: None,
prompts: None,
resources: None,
tools: Some(ServerCapabilitiesTools {
list_changed: Some(true),
}),
},
instructions: None,
protocol_version: params.protocol_version.clone(),
server_info: mcp_types::Implementation {
name: "exec-command-mcp".to_string(),
version: env!("CARGO_PKG_VERSION").to_string(),
title: Some("Codex exec_command".to_string()),
},
};
self.send_response::<mcp_types::InitializeRequest>(id, result)
.await;
}
async fn handle_list_tools(
&self,
request_id: RequestId,
params: <mcp_types::ListToolsRequest as mcp_types::ModelContextProtocolRequest>::Params,
) {
tracing::trace!("tools/list ({request_id:?}) -> {params:?}");
// Generate the tool schemas eagerly in a short-lived scope to avoid holding
// a non-Send schemars generator across an await point.
let result = {
let generator = SchemaSettings::draft2019_09()
.with(|s| {
s.inline_subschemas = true;
s.option_add_null_type = false;
})
.into_generator();
let exec_schema = generator
.clone()
.into_root_schema_for::<ExecCommandParams>();
let write_schema = generator.into_root_schema_for::<WriteStdinParams>();
#[expect(clippy::expect_used)]
let exec_schema_json =
serde_json::to_value(&exec_schema).expect("exec_command schema should serialize");
#[expect(clippy::expect_used)]
let write_schema_json =
serde_json::to_value(&write_schema).expect("write_stdin schema should serialize");
let exec_input_schema = serde_json::from_value::<ToolInputSchema>(exec_schema_json)
.unwrap_or_else(|e| {
panic!("failed to create Tool from schema: {e}");
});
let write_input_schema = serde_json::from_value::<ToolInputSchema>(write_schema_json)
.unwrap_or_else(|e| {
panic!("failed to create Tool from schema: {e}");
});
let tools = vec![
Tool {
name: "functions_exec_command".to_string(),
title: Some("Exec Command".to_string()),
description: Some("Start a PTY-backed shell command; returns early on timeout or completion.".to_string()),
input_schema: exec_input_schema,
output_schema: None,
annotations: None,
},
Tool {
name: "functions_write_stdin".to_string(),
title: Some("Write Stdin".to_string()),
description: Some("Write characters to a running exec session and collect output for a short window.".to_string()),
input_schema: write_input_schema,
output_schema: None,
annotations: None,
},
];
ListToolsResult {
tools,
next_cursor: None,
}
};
self.send_response::<mcp_types::ListToolsRequest>(request_id, result)
.await;
}
async fn handle_call_tool(
&self,
request_id: RequestId,
params: <mcp_types::CallToolRequest as mcp_types::ModelContextProtocolRequest>::Params,
) {
tracing::info!("tools/call -> params: {params:?}");
let CallToolRequestParams { name, arguments } = params;
match name.as_str() {
"functions_exec_command" => match extract_exec_command_params(arguments).await {
Ok(params) => {
tracing::info!("functions_exec_command -> params: {params:?}");
let session_manager = self.session_manager.clone();
let outgoing = self.outgoing.clone();
tokio::spawn(async move {
session_manager
.handle_exec_command_request(request_id, params, outgoing)
.await;
});
}
Err(jsonrpc_error) => {
self.outgoing.send_error(request_id, jsonrpc_error).await;
}
},
"functions_write_stdin" => match extract_write_stdin_params(arguments).await {
Ok(params) => {
tracing::info!("functions_write_stdin -> params: {params:?}");
let session_manager = self.session_manager.clone();
let outgoing = self.outgoing.clone();
tokio::spawn(async move {
session_manager
.handle_write_stdin_request(request_id, params, outgoing)
.await;
});
}
Err(jsonrpc_error) => {
self.outgoing.send_error(request_id, jsonrpc_error).await;
}
},
_ => {
let result = CallToolResult {
content: vec![ContentBlock::TextContent(TextContent {
r#type: "text".to_string(),
text: format!("Unknown tool '{name}'"),
annotations: None,
})],
is_error: Some(true),
structured_content: None,
};
self.send_response::<mcp_types::CallToolRequest>(request_id, result)
.await;
}
}
}
async fn send_response<T>(&self, id: RequestId, result: T::Result)
where
T: ModelContextProtocolRequest,
{
self.outgoing.send_response(id, result).await;
}
}
async fn extract_exec_command_params(
args: Option<serde_json::Value>,
) -> Result<ExecCommandParams, JSONRPCErrorError> {
match args {
Some(value) => match serde_json::from_value::<ExecCommandParams>(value) {
Ok(params) => Ok(params),
Err(e) => Err(JSONRPCErrorError {
code: error_code::INVALID_REQUEST_ERROR_CODE,
message: format!("Invalid request: {e}"),
data: None,
}),
},
None => Err(JSONRPCErrorError {
code: error_code::INVALID_REQUEST_ERROR_CODE,
message: "Missing arguments".to_string(),
data: None,
}),
}
}
async fn extract_write_stdin_params(
args: Option<serde_json::Value>,
) -> Result<WriteStdinParams, JSONRPCErrorError> {
match args {
Some(value) => match serde_json::from_value::<WriteStdinParams>(value) {
Ok(params) => Ok(params),
Err(e) => Err(JSONRPCErrorError {
code: error_code::INVALID_REQUEST_ERROR_CODE,
message: format!("Invalid request: {e}"),
data: None,
}),
},
None => Err(JSONRPCErrorError {
code: error_code::INVALID_REQUEST_ERROR_CODE,
message: "Missing arguments".to_string(),
data: None,
}),
}
}
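// Example sketch: a `tools/call` without arguments is rejected with an
// invalid-request error rather than panicking, mirroring the handling above.
#[cfg(test)]
mod extract_params_validation {
    use super::*;

    #[tokio::test]
    async fn missing_arguments_is_an_error() {
        let err = extract_exec_command_params(None).await.unwrap_err();
        assert_eq!(err.code, INVALID_REQUEST_ERROR_CODE);
        assert_eq!(err.message, "Missing arguments");
    }
}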

View File

@@ -0,0 +1,46 @@
use mcp_types::JSONRPC_VERSION;
use mcp_types::JSONRPCError;
use mcp_types::JSONRPCErrorError;
use mcp_types::JSONRPCMessage;
use mcp_types::JSONRPCResponse;
use mcp_types::RequestId;
use mcp_types::Result;
use serde::Serialize;
/// Outgoing message from the server to the client.
pub(crate) enum OutgoingMessage {
Response(OutgoingResponse),
Error(OutgoingError),
}
impl From<OutgoingMessage> for JSONRPCMessage {
fn from(val: OutgoingMessage) -> Self {
use OutgoingMessage::*;
match val {
Response(OutgoingResponse { id, result }) => {
JSONRPCMessage::Response(JSONRPCResponse {
jsonrpc: JSONRPC_VERSION.into(),
id,
result,
})
}
Error(OutgoingError { id, error }) => JSONRPCMessage::Error(JSONRPCError {
jsonrpc: JSONRPC_VERSION.into(),
id,
error,
}),
}
}
}
#[derive(Debug, Clone, PartialEq, Serialize)]
pub(crate) struct OutgoingResponse {
pub id: RequestId,
pub result: Result,
}
#[derive(Debug, Clone, PartialEq, Serialize)]
pub(crate) struct OutgoingError {
pub error: JSONRPCErrorError,
pub id: RequestId,
}
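// Example sketch: the `Error` variant maps onto `JSONRPCMessage::Error`,
// carrying the original request id and error payload unchanged. Assumes the
// `RequestId::Integer` variant from mcp-types for a numeric JSON-RPC id.
#[cfg(test)]
mod outgoing_message_conversion {
    use super::*;

    #[test]
    fn converts_error_variant() {
        let msg = OutgoingMessage::Error(OutgoingError {
            id: RequestId::Integer(1),
            error: JSONRPCErrorError {
                code: -32600,
                message: "bad request".to_string(),
                data: None,
            },
        });
        match JSONRPCMessage::from(msg) {
            JSONRPCMessage::Error(err) => {
                assert_eq!(err.id, RequestId::Integer(1));
                assert_eq!(err.error.code, -32600);
            }
            _ => panic!("expected JSONRPCMessage::Error"),
        }
    }
}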

View File

@@ -0,0 +1,47 @@
use mcp_types::JSONRPCErrorError;
use mcp_types::RequestId;
use serde::Serialize;
use tokio::sync::mpsc;
use crate::outgoing_message::OutgoingError;
use crate::outgoing_message::OutgoingMessage;
use crate::outgoing_message::OutgoingResponse;
use crate::error_code::INTERNAL_ERROR_CODE;
/// Sends messages to the client and manages request callbacks.
#[derive(Debug)]
pub(crate) struct OutgoingMessageSender {
sender: mpsc::Sender<OutgoingMessage>,
}
impl OutgoingMessageSender {
pub(crate) fn new(sender: mpsc::Sender<OutgoingMessage>) -> Self {
Self { sender }
}
pub(crate) async fn send_response<T: Serialize>(&self, id: RequestId, response: T) {
match serde_json::to_value(response) {
Ok(result) => {
let outgoing_message = OutgoingMessage::Response(OutgoingResponse { id, result });
let _ = self.sender.send(outgoing_message).await;
}
Err(err) => {
self.send_error(
id,
JSONRPCErrorError {
code: INTERNAL_ERROR_CODE,
message: format!("failed to serialize response: {err}"),
data: None,
},
)
.await;
}
}
}
pub(crate) async fn send_error(&self, id: RequestId, error: JSONRPCErrorError) {
let outgoing_message = OutgoingMessage::Error(OutgoingError { id, error });
let _ = self.sender.send(outgoing_message).await;
}
}

View File

@@ -0,0 +1,6 @@
use schemars::JsonSchema;
use serde::Deserialize;
use serde::Serialize;
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash, Serialize, Deserialize, JsonSchema)]
pub(crate) struct SessionId(pub u32);
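// Example sketch: the newtype serializes as a bare number, which is what the
// `sessionId` field in structured tool results relies on.
#[cfg(test)]
mod session_id_serialization {
    use super::*;

    #[test]
    fn serializes_as_plain_number() {
        let json = serde_json::to_string(&SessionId(7)).unwrap();
        assert_eq!(json, "7");
    }
}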

View File

@@ -0,0 +1,324 @@
use std::collections::HashMap;
use std::io::Read;
use std::sync::Arc;
use std::sync::Mutex as StdMutex;
use std::sync::atomic::AtomicU32;
use mcp_types::CallToolResult;
use mcp_types::ContentBlock;
use mcp_types::JSONRPCErrorError;
use mcp_types::RequestId;
use mcp_types::TextContent;
use portable_pty::CommandBuilder;
use portable_pty::PtySize;
use portable_pty::native_pty_system;
use serde_json::json;
use tokio::sync::Mutex;
use tokio::sync::mpsc;
use tokio::sync::oneshot;
use tokio::time::Duration;
use tokio::time::Instant;
use tokio::time::timeout;
use crate::error_code;
use crate::exec_command::ExecCommandParams;
use crate::exec_command::WriteStdinParams;
use crate::exec_command_session::ExecCommandSession;
use crate::outgoing_message_sender::OutgoingMessageSender;
use crate::session_id::SessionId;
#[derive(Debug, Default)]
pub(crate) struct SessionManager {
next_session_id: AtomicU32,
sessions: Mutex<HashMap<SessionId, ExecCommandSession>>,
}
impl SessionManager {
/// Processes the request and is required to send a response via `outgoing`.
pub(crate) async fn handle_exec_command_request(
&self,
request_id: RequestId,
params: ExecCommandParams,
outgoing: Arc<OutgoingMessageSender>,
) {
// Allocate a session id.
let session_id = SessionId(
self.next_session_id
.fetch_add(1, std::sync::atomic::Ordering::SeqCst),
);
let result = create_exec_command_session(session_id, params.clone()).await;
match result {
Ok((session, mut exit_rx)) => {
// Insert into session map.
let output_receiver = session.output_receiver();
self.sessions.lock().await.insert(session_id, session);
// Collect output until either timeout expires or process exits.
// Cap by assuming 4 bytes per token (TODO: use a real tokenizer).
let cap_bytes_u64 = params.max_output_tokens.saturating_mul(4);
let cap_bytes: usize = cap_bytes_u64.min(usize::MAX as u64) as usize;
let cap_hint = cap_bytes.clamp(1024, 8192);
let mut collected: Vec<u8> = Vec::with_capacity(cap_hint);
let deadline = Instant::now() + Duration::from_millis(params.yield_time_ms);
let mut exit_code: Option<i32> = None;
loop {
if Instant::now() >= deadline {
break;
}
let remaining = deadline.saturating_duration_since(Instant::now());
tokio::select! {
biased;
exit = &mut exit_rx => {
exit_code = exit.ok();
// Small grace period to pull remaining buffered output
let grace_deadline = Instant::now() + Duration::from_millis(25);
while Instant::now() < grace_deadline {
let recv_next = async {
let mut rx = output_receiver.lock().await;
rx.recv().await
};
if let Ok(Some(chunk)) = timeout(Duration::from_millis(1), recv_next).await {
let available = cap_bytes.saturating_sub(collected.len());
if available == 0 { break; }
let take = available.min(chunk.len());
collected.extend_from_slice(&chunk[..take]);
} else {
break;
}
}
break;
}
chunk = timeout(remaining, async {
let mut rx = output_receiver.lock().await;
rx.recv().await
}) => {
match chunk {
Ok(Some(chunk)) => {
let available = cap_bytes.saturating_sub(collected.len());
if available == 0 { /* keep draining, but don't store */ }
else {
let take = available.min(chunk.len());
collected.extend_from_slice(&chunk[..take]);
}
}
Ok(None) => { break; }
Err(_) => { break; }
}
}
}
}
let text = String::from_utf8_lossy(&collected).to_string();
let mut structured = json!({ "sessionId": session_id });
if let Some(code) = exit_code {
structured["exitCode"] = json!(code);
}
let result = CallToolResult {
content: vec![ContentBlock::TextContent(TextContent {
r#type: "text".to_string(),
text,
annotations: None,
})],
is_error: None,
structured_content: Some(structured),
};
outgoing.send_response(request_id, result).await;
}
Err(err) => {
outgoing
.send_error(
request_id,
JSONRPCErrorError {
code: error_code::INTERNAL_ERROR_CODE,
message: format!("failed to start exec session: {err}"),
data: None,
},
)
.await;
}
}
}
/// Write characters to a session's stdin and collect combined output for up to `yield_time_ms`.
pub(crate) async fn handle_write_stdin_request(
&self,
request_id: RequestId,
params: WriteStdinParams,
outgoing: Arc<OutgoingMessageSender>,
) {
let WriteStdinParams {
session_id,
chars,
yield_time_ms,
max_output_tokens,
} = params;
// Grab handles without holding the sessions lock across await points.
let (writer_tx, output_rx) = {
let sessions = self.sessions.lock().await;
match sessions.get(&session_id) {
Some(session) => (session.writer_sender(), session.output_receiver()),
None => {
outgoing
.send_error(
request_id,
JSONRPCErrorError {
code: error_code::INVALID_REQUEST_ERROR_CODE,
message: format!("unknown session id {}", session_id.0),
data: None,
},
)
.await;
return;
}
}
};
// Write stdin if provided.
if !chars.is_empty() && writer_tx.send(chars.into_bytes()).await.is_err() {
outgoing
.send_error(
request_id,
JSONRPCErrorError {
code: error_code::INTERNAL_ERROR_CODE,
message: "failed to write to stdin".to_string(),
data: None,
},
)
.await;
return;
}
// Collect output up to yield_time_ms, truncating to max_output_tokens bytes.
let mut collected: Vec<u8> = Vec::with_capacity(4096);
let deadline = Instant::now() + Duration::from_millis(yield_time_ms);
loop {
let now = Instant::now();
if now >= deadline {
break;
}
let remaining = deadline - now;
match timeout(remaining, output_rx.lock().await.recv()).await {
Ok(Some(chunk)) => {
// Respect token/byte limit; keep draining but drop once full.
let available =
max_output_tokens.saturating_sub(collected.len() as u64) as usize;
if available > 0 {
let take = available.min(chunk.len());
collected.extend_from_slice(&chunk[..take]);
}
// Continue loop to drain further within time.
}
Ok(None) => break, // channel closed
Err(_) => break, // timeout
}
}
// Return text output as a CallToolResult
let text = String::from_utf8_lossy(&collected).to_string();
let result = CallToolResult {
content: vec![ContentBlock::TextContent(TextContent {
r#type: "text".to_string(),
text,
annotations: None,
})],
is_error: None,
structured_content: None,
};
outgoing.send_response(request_id, result).await;
}
}
/// Spawn a PTY-backed child process for the command and wire up its stdin and
/// output channels for the session.
async fn create_exec_command_session(
session_id: SessionId,
params: ExecCommandParams,
) -> anyhow::Result<(ExecCommandSession, oneshot::Receiver<i32>)> {
let ExecCommandParams {
cmd,
yield_time_ms: _,
max_output_tokens: _,
shell,
login,
} = params;
// Use the native pty implementation for the system
let pty_system = native_pty_system();
// Create a new pty
let pair = pty_system.openpty(PtySize {
rows: 24,
cols: 80,
pixel_width: 0,
pixel_height: 0,
})?;
// Spawn a shell into the pty
let mut command_builder = CommandBuilder::new(shell);
let shell_mode_opt = if login { "-lc" } else { "-c" };
command_builder.arg(shell_mode_opt);
command_builder.arg(cmd);
let mut child = pair.slave.spawn_command(command_builder)?;
// Channel to forward write requests to the PTY writer.
let (writer_tx, mut writer_rx) = mpsc::channel::<Vec<u8>>(128);
// Channel for streaming PTY output to readers.
let (output_tx, output_rx) = mpsc::channel::<Vec<u8>>(256);
// Reader task: drain PTY and forward chunks to output channel.
let mut reader = pair.master.try_clone_reader()?;
let output_tx_clone = output_tx.clone();
tokio::task::spawn_blocking(move || {
let mut buf = [0u8; 8192];
loop {
match reader.read(&mut buf) {
Ok(0) => break, // EOF
Ok(n) => {
// Forward; block if receiver is slow to avoid dropping output.
let _ = output_tx_clone.blocking_send(buf[..n].to_vec());
}
Err(_) => break,
}
}
});
// Writer task: apply stdin writes to the PTY writer.
let writer = pair.master.take_writer()?;
let writer = Arc::new(StdMutex::new(writer));
tokio::spawn({
let writer = writer.clone();
async move {
while let Some(bytes) = writer_rx.recv().await {
let writer = writer.clone();
// Perform blocking write on a blocking thread.
let _ = tokio::task::spawn_blocking(move || {
if let Ok(mut guard) = writer.lock() {
use std::io::Write;
let _ = guard.write_all(&bytes);
let _ = guard.flush();
}
})
.await;
}
}
});
// Keep the child alive until it exits, then signal exit code.
let (exit_tx, exit_rx) = oneshot::channel::<i32>();
tokio::task::spawn_blocking(move || {
let code = match child.wait() {
Ok(status) => status.exit_code() as i32,
Err(_) => -1,
};
let _ = exit_tx.send(code);
});
// Create and store the session with channels.
let session = ExecCommandSession::new(session_id, writer_tx, output_rx);
Ok((session, exit_rx))
}
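// Example sketch (Unix-only): exercise the PTY plumbing end to end with a
// trivial command. Assumes `/bin/bash` is present on the host.
#[cfg(all(test, unix))]
mod create_session_smoke {
    use super::*;

    #[tokio::test]
    async fn runs_a_trivial_command() {
        let params = ExecCommandParams {
            cmd: "echo hello".to_string(),
            yield_time_ms: 1_000,
            max_output_tokens: 1_000,
            shell: "/bin/bash".to_string(),
            login: false,
        };
        let (session, exit_rx) = create_exec_command_session(SessionId(0), params)
            .await
            .expect("session should start");

        // Some output should arrive on the session's output channel.
        let first_chunk = timeout(Duration::from_secs(5), session.recv_output_chunk())
            .await
            .ok()
            .flatten();
        assert!(first_chunk.is_some());

        // The child exits cleanly and reports its status via the oneshot channel.
        assert_eq!(exit_rx.await.unwrap_or(-1), 0);
    }
}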

View File

@@ -20,7 +20,6 @@ use codex_core::protocol::McpToolCallEndEvent;
use codex_core::protocol::PatchApplyBeginEvent;
use codex_core::protocol::PatchApplyEndEvent;
use codex_core::protocol::SessionConfiguredEvent;
use codex_core::protocol::StreamErrorEvent;
use codex_core::protocol::TaskCompleteEvent;
use codex_core::protocol::TurnAbortReason;
use codex_core::protocol::TurnDiffEvent;
@@ -175,9 +174,6 @@ impl EventProcessor for EventProcessorWithHumanOutput {
EventMsg::BackgroundEvent(BackgroundEventEvent { message }) => {
ts_println!(self, "{}", message.style(self.dimmed));
}
EventMsg::StreamError(StreamErrorEvent { message }) => {
ts_println!(self, "{}", message.style(self.dimmed));
}
EventMsg::TaskStarted => {
// Ignore.
}

View File

@@ -24,8 +24,8 @@ file-search *args:
fmt:
cargo fmt -- --config imports_granularity=Item
fix *args:
cargo clippy --fix --all-features --tests --allow-dirty "$@"
fix:
cargo clippy --fix --all-features --tests --allow-dirty
install:
rustup show active-toolchain

View File

@@ -9,7 +9,6 @@ workspace = true
[dependencies]
base64 = "0.22"
chrono = { version = "0.4", features = ["serde"] }
codex-protocol = { path = "../protocol" }
rand = "0.8"
reqwest = { version = "0.12", features = ["json", "blocking"] }
serde = { version = "1", features = ["derive"] }

View File

@@ -29,7 +29,13 @@ mod token_data;
pub const CLIENT_ID: &str = "app_EMoamEEZ73f0CkXaXp7hrann";
pub const OPENAI_API_KEY_ENV_VAR: &str = "OPENAI_API_KEY";
pub use codex_protocol::mcp_protocol::AuthMode;
#[derive(Clone, Debug, PartialEq, Copy, Eq, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum AuthMode {
ApiKey,
ChatGPT,
}
#[derive(Debug, Clone)]
pub struct CodexAuth {
@@ -56,39 +62,6 @@ impl CodexAuth {
}
}
pub async fn refresh_token(&self) -> Result<String, std::io::Error> {
let token_data = self
.get_current_token_data()
.ok_or(std::io::Error::other("Token data is not available."))?;
let token = token_data.refresh_token;
let refresh_response = try_refresh_token(token)
.await
.map_err(std::io::Error::other)?;
let updated = update_tokens(
&self.auth_file,
refresh_response.id_token,
refresh_response.access_token,
refresh_response.refresh_token,
)
.await?;
if let Ok(mut auth_lock) = self.auth_dot_json.lock() {
*auth_lock = Some(updated.clone());
}
let access = match updated.tokens {
Some(t) => t.access_token,
None => {
return Err(std::io::Error::other(
"Token data is not available after refresh.",
));
}
};
Ok(access)
}
/// Loads the available auth information from the auth.json or
/// OPENAI_API_KEY environment variable.
pub fn from_codex_home(
@@ -236,7 +209,7 @@ fn load_auth(
// "refreshable" even if we are using the API key for auth?
match &tokens {
Some(tokens) => {
if tokens.should_use_api_key(preferred_auth_method, tokens.is_openai_email()) {
if tokens.should_use_api_key(preferred_auth_method) {
return Ok(Some(CodexAuth::from_api_key(api_key)));
} else {
// Ignore the API key and fall through to ChatGPT auth.

View File

@@ -25,31 +25,16 @@ pub struct TokenData {
impl TokenData {
/// Returns true if this is a plan that should use the traditional
/// "metered" billing via an API key.
pub(crate) fn should_use_api_key(
&self,
preferred_auth_method: AuthMode,
is_openai_email: bool,
) -> bool {
pub(crate) fn should_use_api_key(&self, preferred_auth_method: AuthMode) -> bool {
if preferred_auth_method == AuthMode::ApiKey {
return true;
}
// If the email is an OpenAI email, use AuthMode::ChatGPT unless preferred_auth_method is AuthMode::ApiKey.
if is_openai_email {
return false;
}
self.id_token
.chatgpt_plan_type
.as_ref()
.is_none_or(|plan| plan.is_plan_that_should_use_api_key())
}
pub fn is_openai_email(&self) -> bool {
self.id_token
.email
.as_deref()
.is_some_and(|email| email.trim().to_ascii_lowercase().ends_with("@openai.com"))
}
}
/// Flat subset of useful claims in id_token from auth.json.

View File

@@ -17,7 +17,7 @@ workspace = true
[dependencies]
anyhow = "1"
codex-arg0 = { path = "../arg0" }
codex-common = { path = "../common", features = ["cli"] }
codex-common = { path = "../common" }
codex-core = { path = "../core" }
codex-login = { path = "../login" }
codex-protocol = { path = "../protocol" }

View File

@@ -8,14 +8,11 @@ use codex_core::ConversationManager;
use codex_core::NewConversation;
use codex_core::config::Config;
use codex_core::config::ConfigOverrides;
use codex_core::git_info::git_diff_to_remote;
use codex_core::protocol::ApplyPatchApprovalRequestEvent;
use codex_core::protocol::Event;
use codex_core::protocol::EventMsg;
use codex_core::protocol::ExecApprovalRequestEvent;
use codex_core::protocol::ReviewDecision;
use codex_protocol::mcp_protocol::AuthMode;
use codex_protocol::mcp_protocol::GitDiffToRemoteResponse;
use mcp_types::JSONRPCErrorError;
use mcp_types::RequestId;
use tokio::sync::Mutex;
@@ -31,17 +28,14 @@ use crate::outgoing_message::OutgoingNotification;
use codex_core::protocol::InputItem as CoreInputItem;
use codex_core::protocol::Op;
use codex_login::CLIENT_ID;
use codex_login::CodexAuth;
use codex_login::ServerOptions as LoginServerOptions;
use codex_login::ShutdownHandle;
use codex_login::logout;
use codex_login::run_login_server;
use codex_protocol::mcp_protocol::APPLY_PATCH_APPROVAL_METHOD;
use codex_protocol::mcp_protocol::AddConversationListenerParams;
use codex_protocol::mcp_protocol::AddConversationSubscriptionResponse;
use codex_protocol::mcp_protocol::ApplyPatchApprovalParams;
use codex_protocol::mcp_protocol::ApplyPatchApprovalResponse;
use codex_protocol::mcp_protocol::AuthStatusChangeNotification;
use codex_protocol::mcp_protocol::ClientRequest;
use codex_protocol::mcp_protocol::ConversationId;
use codex_protocol::mcp_protocol::EXEC_COMMAND_APPROVAL_METHOD;
@@ -50,6 +44,7 @@ use codex_protocol::mcp_protocol::ExecCommandApprovalResponse;
use codex_protocol::mcp_protocol::InputItem as WireInputItem;
use codex_protocol::mcp_protocol::InterruptConversationParams;
use codex_protocol::mcp_protocol::InterruptConversationResponse;
use codex_protocol::mcp_protocol::LOGIN_CHATGPT_COMPLETE_EVENT;
use codex_protocol::mcp_protocol::LoginChatGptCompleteNotification;
use codex_protocol::mcp_protocol::LoginChatGptResponse;
use codex_protocol::mcp_protocol::NewConversationParams;
@@ -60,7 +55,6 @@ use codex_protocol::mcp_protocol::SendUserMessageParams;
use codex_protocol::mcp_protocol::SendUserMessageResponse;
use codex_protocol::mcp_protocol::SendUserTurnParams;
use codex_protocol::mcp_protocol::SendUserTurnResponse;
use codex_protocol::mcp_protocol::ServerNotification;
// Duration before a ChatGPT login attempt is abandoned.
const LOGIN_CHATGPT_TIMEOUT: Duration = Duration::from_secs(10 * 60);
@@ -81,7 +75,6 @@ pub(crate) struct CodexMessageProcessor {
conversation_manager: Arc<ConversationManager>,
outgoing: Arc<OutgoingMessageSender>,
codex_linux_sandbox_exe: Option<PathBuf>,
config: Arc<Config>,
conversation_listeners: HashMap<Uuid, oneshot::Sender<()>>,
active_login: Arc<Mutex<Option<ActiveLogin>>>,
// Queue of pending interrupt requests per conversation. We reply when TurnAborted arrives.
@@ -93,13 +86,11 @@ impl CodexMessageProcessor {
conversation_manager: Arc<ConversationManager>,
outgoing: Arc<OutgoingMessageSender>,
codex_linux_sandbox_exe: Option<PathBuf>,
config: Arc<Config>,
) -> Self {
Self {
conversation_manager,
outgoing,
codex_linux_sandbox_exe,
config,
conversation_listeners: HashMap::new(),
active_login: Arc::new(Mutex::new(None)),
pending_interrupts: Arc::new(Mutex::new(HashMap::new())),
@@ -135,20 +126,23 @@ impl CodexMessageProcessor {
ClientRequest::CancelLoginChatGpt { request_id, params } => {
self.cancel_login_chatgpt(request_id, params.login_id).await;
}
ClientRequest::LogoutChatGpt { request_id } => {
self.logout_chatgpt(request_id).await;
}
ClientRequest::GetAuthStatus { request_id } => {
self.get_auth_status(request_id).await;
}
ClientRequest::GitDiffToRemote { request_id, params } => {
self.git_diff_to_origin(request_id, params.cwd).await;
}
}
}
async fn login_chatgpt(&mut self, request_id: RequestId) {
let config = self.config.as_ref();
let config =
match Config::load_with_cli_overrides(Default::default(), ConfigOverrides::default()) {
Ok(cfg) => cfg,
Err(err) => {
let error = JSONRPCErrorError {
code: INTERNAL_ERROR_CODE,
message: format!("error loading config for login: {err}"),
data: None,
};
self.outgoing.send_error(request_id, error).await;
return;
}
};
let opts = LoginServerOptions {
open_browser: false,
@@ -200,25 +194,19 @@ impl CodexMessageProcessor {
(false, Some("Login timed out".to_string()))
}
};
let payload = LoginChatGptCompleteNotification {
let notification = LoginChatGptCompleteNotification {
login_id,
success,
error: error_msg,
};
let params = serde_json::to_value(&notification).ok();
outgoing_clone
.send_server_notification(ServerNotification::LoginChatGptComplete(payload))
.send_notification(OutgoingNotification {
method: LOGIN_CHATGPT_COMPLETE_EVENT.to_string(),
params,
})
.await;
// Send an auth status change notification.
if success {
let payload = AuthStatusChangeNotification {
auth_method: Some(AuthMode::ChatGPT),
};
outgoing_clone
.send_server_notification(ServerNotification::AuthStatusChange(payload))
.await;
}
// Clear the active login if it matches this attempt. It may have been replaced or cancelled.
let mut guard = active_login.lock().await;
if guard.as_ref().map(|l| l.login_id) == Some(login_id) {
@@ -267,78 +255,6 @@ impl CodexMessageProcessor {
}
}
async fn logout_chatgpt(&mut self, request_id: RequestId) {
{
// Cancel any active login attempt.
let mut guard = self.active_login.lock().await;
if let Some(active) = guard.take() {
active.drop();
}
}
// Load config to locate codex_home for persistent logout.
let config = self.config.as_ref();
if let Err(err) = logout(&config.codex_home) {
let error = JSONRPCErrorError {
code: INTERNAL_ERROR_CODE,
message: format!("logout failed: {err}"),
data: None,
};
self.outgoing.send_error(request_id, error).await;
return;
}
self.outgoing
.send_response(
request_id,
codex_protocol::mcp_protocol::LogoutChatGptResponse {},
)
.await;
// Send auth status change notification.
let payload = AuthStatusChangeNotification { auth_method: None };
self.outgoing
.send_server_notification(ServerNotification::AuthStatusChange(payload))
.await;
}
async fn get_auth_status(&self, request_id: RequestId) {
// Load config to determine codex_home and preferred auth method.
let config = self.config.as_ref();
let preferred_auth_method: AuthMode = config.preferred_auth_method;
let response =
match CodexAuth::from_codex_home(&config.codex_home, config.preferred_auth_method) {
Ok(Some(auth)) => {
// Verify that the current auth mode has a valid, non-empty token.
// If token acquisition fails or is empty, treat as unauthenticated.
let reported_auth_method = match auth.get_token().await {
Ok(token) if !token.is_empty() => Some(auth.mode),
Ok(_) => None, // Empty token
Err(err) => {
tracing::warn!("failed to get token for auth status: {err}");
None
}
};
codex_protocol::mcp_protocol::GetAuthStatusResponse {
auth_method: reported_auth_method,
preferred_auth_method,
}
}
Ok(None) => codex_protocol::mcp_protocol::GetAuthStatusResponse {
auth_method: None,
preferred_auth_method,
},
Err(_) => codex_protocol::mcp_protocol::GetAuthStatusResponse {
auth_method: None,
preferred_auth_method,
},
};
self.outgoing.send_response(request_id, response).await;
}
async fn process_new_conversation(&self, request_id: RequestId, params: NewConversationParams) {
let config = match derive_config_from_params(params, self.codex_linux_sandbox_exe.clone()) {
Ok(config) => config,
@@ -598,27 +514,6 @@ impl CodexMessageProcessor {
}
}
}
async fn git_diff_to_origin(&self, request_id: RequestId, cwd: PathBuf) {
let diff = git_diff_to_remote(&cwd).await;
match diff {
Some(value) => {
let response = GitDiffToRemoteResponse {
sha: value.sha,
diff: value.diff,
};
self.outgoing.send_response(request_id, response).await;
}
None => {
let error = JSONRPCErrorError {
code: INVALID_REQUEST_ERROR_CODE,
message: format!("failed to compute git diff to remote for cwd: {cwd:?}"),
data: None,
};
self.outgoing.send_error(request_id, error).await;
}
}
}
}
async fn apply_bespoke_event_handling(

View File

@@ -268,7 +268,6 @@ async fn run_codex_tool_session_inner(
| EventMsg::ExecCommandOutputDelta(_)
| EventMsg::ExecCommandEnd(_)
| EventMsg::BackgroundEvent(_)
| EventMsg::StreamError(_)
| EventMsg::PatchApplyBegin(_)
| EventMsg::PatchApplyEnd(_)
| EventMsg::TurnDiff(_)

View File

@@ -1,14 +1,9 @@
//! Prototype MCP server.
#![deny(clippy::print_stdout, clippy::print_stderr)]
use std::io::ErrorKind;
use std::io::Result as IoResult;
use std::path::PathBuf;
use codex_common::CliConfigOverrides;
use codex_core::config::Config;
use codex_core::config::ConfigOverrides;
use mcp_types::JSONRPCMessage;
use tokio::io::AsyncBufReadExt;
use tokio::io::AsyncWriteExt;
@@ -46,10 +41,7 @@ pub use crate::patch_approval::PatchApprovalResponse;
/// plenty for an interactive CLI.
const CHANNEL_CAPACITY: usize = 128;
pub async fn run_main(
codex_linux_sandbox_exe: Option<PathBuf>,
cli_config_overrides: CliConfigOverrides,
) -> IoResult<()> {
pub async fn run_main(codex_linux_sandbox_exe: Option<PathBuf>) -> IoResult<()> {
// Install a simple subscriber so `tracing` output is visible. Users can
// control the log level with `RUST_LOG`.
tracing_subscriber::fmt()
@@ -85,27 +77,10 @@ pub async fn run_main(
}
});
// Parse CLI overrides once and derive the base Config eagerly so later
// components do not need to work with raw TOML values.
let cli_kv_overrides = cli_config_overrides.parse_overrides().map_err(|e| {
std::io::Error::new(
ErrorKind::InvalidInput,
format!("error parsing -c overrides: {e}"),
)
})?;
let config = Config::load_with_cli_overrides(cli_kv_overrides, ConfigOverrides::default())
.map_err(|e| {
std::io::Error::new(ErrorKind::InvalidData, format!("error loading config: {e}"))
})?;
// Task: process incoming messages.
let processor_handle = tokio::spawn({
let outgoing_message_sender = OutgoingMessageSender::new(outgoing_tx);
let mut processor = MessageProcessor::new(
outgoing_message_sender,
codex_linux_sandbox_exe,
std::sync::Arc::new(config),
);
let mut processor = MessageProcessor::new(outgoing_message_sender, codex_linux_sandbox_exe);
async move {
while let Some(msg) = incoming_rx.recv().await {
match msg {

View File

@@ -1,10 +1,9 @@
use codex_arg0::arg0_dispatch_or_else;
use codex_common::CliConfigOverrides;
use codex_mcp_server::run_main;
fn main() -> anyhow::Result<()> {
arg0_dispatch_or_else(|codex_linux_sandbox_exe| async move {
run_main(codex_linux_sandbox_exe, CliConfigOverrides::default()).await?;
run_main(codex_linux_sandbox_exe).await?;
Ok(())
})
}

View File

@@ -1,5 +1,6 @@
use std::collections::HashMap;
use std::path::PathBuf;
use std::sync::Arc;
use crate::codex_message_processor::CodexMessageProcessor;
use crate::codex_tool_config::CodexToolCallParam;
@@ -11,7 +12,7 @@ use crate::outgoing_message::OutgoingMessageSender;
use codex_protocol::mcp_protocol::ClientRequest;
use codex_core::ConversationManager;
use codex_core::config::Config;
use codex_core::config::Config as CodexConfig;
use codex_core::protocol::Submission;
use mcp_types::CallToolRequestParams;
use mcp_types::CallToolResult;
@@ -29,7 +30,6 @@ use mcp_types::ServerCapabilitiesTools;
use mcp_types::ServerNotification;
use mcp_types::TextContent;
use serde_json::json;
use std::sync::Arc;
use tokio::sync::Mutex;
use tokio::task;
use uuid::Uuid;
@@ -49,7 +49,6 @@ impl MessageProcessor {
pub(crate) fn new(
outgoing: OutgoingMessageSender,
codex_linux_sandbox_exe: Option<PathBuf>,
config: Arc<Config>,
) -> Self {
let outgoing = Arc::new(outgoing);
let conversation_manager = Arc::new(ConversationManager::default());
@@ -57,7 +56,6 @@ impl MessageProcessor {
conversation_manager.clone(),
outgoing.clone(),
codex_linux_sandbox_exe.clone(),
config,
);
Self {
codex_message_processor,
@@ -346,7 +344,7 @@ impl MessageProcessor {
}
}
async fn handle_tool_call_codex(&self, id: RequestId, arguments: Option<serde_json::Value>) {
let (initial_prompt, config): (String, Config) = match arguments {
let (initial_prompt, config): (String, CodexConfig) = match arguments {
Some(json_val) => match serde_json::from_value::<CodexToolCallParam>(json_val) {
Ok(tool_cfg) => match tool_cfg.into_config(self.codex_linux_sandbox_exe.clone()) {
Ok(cfg) => cfg,

View File

@@ -3,7 +3,6 @@ use std::sync::atomic::AtomicI64;
use std::sync::atomic::Ordering;
use codex_core::protocol::Event;
use codex_protocol::mcp_protocol::ServerNotification;
use mcp_types::JSONRPC_VERSION;
use mcp_types::JSONRPCError;
use mcp_types::JSONRPCErrorError;
@@ -122,17 +121,6 @@ impl OutgoingMessageSender {
.await;
}
pub(crate) async fn send_server_notification(&self, notification: ServerNotification) {
let method = format!("codex/event/{}", notification);
let params = match serde_json::to_value(&notification) {
Ok(serde_json::Value::Object(mut map)) => map.remove("data"),
_ => None,
};
let outgoing_message =
OutgoingMessage::Notification(OutgoingNotification { method, params });
let _ = self.sender.send(outgoing_message).await;
}
pub(crate) async fn send_notification(&self, notification: OutgoingNotification) {
let outgoing_message = OutgoingMessage::Notification(notification);
let _ = self.sender.send(outgoing_message).await;

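Aside, not part of the diff: a minimal sketch of what the removed send_server_notification did, assuming the serde/strum derives that appear later in mcp_protocol.rs (tag = "type", content = "data", snake_case Display) and a payload struct simplified for illustration. The variant name supplies the codex/event/... method and the serialized "data" field becomes the notification params.

use serde::Serialize;
use strum_macros::Display;

#[derive(Serialize)]
struct AuthStatusChangeNotification {
    auth_method: Option<String>, // simplified stand-in for the real AuthMode field
}

#[derive(Serialize, Display)]
#[serde(tag = "type", content = "data", rename_all = "snake_case")]
#[strum(serialize_all = "snake_case")]
enum ServerNotification {
    AuthStatusChange(AuthStatusChangeNotification),
}

fn main() {
    let notification = ServerNotification::AuthStatusChange(AuthStatusChangeNotification {
        auth_method: Some("chatgpt".to_string()),
    });
    // Display (derived via strum) yields the snake_case variant name.
    let method = format!("codex/event/{notification}");
    // Serialize the adjacently tagged enum, then pull out the "data" object as params.
    let params = match serde_json::to_value(&notification) {
        Ok(serde_json::Value::Object(mut map)) => map.remove("data"),
        _ => None,
    };
    assert_eq!(method, "codex/event/auth_status_change");
    println!("{method}: {params:?}");
}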
View File

@@ -36,12 +36,10 @@ pub fn generate_ts(out_dir: &Path, prettier: Option<&Path>) -> Result<()> {
codex_protocol::mcp_protocol::LoginChatGptCompleteNotification::export_all_to(out_dir)?;
codex_protocol::mcp_protocol::CancelLoginChatGptParams::export_all_to(out_dir)?;
codex_protocol::mcp_protocol::CancelLoginChatGptResponse::export_all_to(out_dir)?;
codex_protocol::mcp_protocol::GitDiffToRemoteParams::export_all_to(out_dir)?;
codex_protocol::mcp_protocol::ApplyPatchApprovalParams::export_all_to(out_dir)?;
codex_protocol::mcp_protocol::ApplyPatchApprovalResponse::export_all_to(out_dir)?;
codex_protocol::mcp_protocol::ExecCommandApprovalParams::export_all_to(out_dir)?;
codex_protocol::mcp_protocol::ExecCommandApprovalResponse::export_all_to(out_dir)?;
codex_protocol::mcp_protocol::ServerNotification::export_all_to(out_dir)?;
// Prepend header to each generated .ts file
let ts_files = ts_files_in(out_dir)?;

View File

@@ -13,7 +13,6 @@ use crate::protocol::TurnAbortReason;
use mcp_types::RequestId;
use serde::Deserialize;
use serde::Serialize;
use strum_macros::Display;
use ts_rs::TS;
use uuid::Uuid;
@@ -27,23 +26,6 @@ impl Display for ConversationId {
}
}
#[derive(Serialize, Deserialize, Clone, Debug, PartialEq, TS)]
#[ts(type = "string")]
pub struct GitSha(pub String);
impl GitSha {
pub fn new(sha: &str) -> Self {
Self(sha.to_string())
}
}
#[derive(Serialize, Deserialize, Debug, Clone, Copy, PartialEq, Eq, TS)]
#[serde(rename_all = "lowercase")]
pub enum AuthMode {
ApiKey,
ChatGPT,
}
/// Request from the client to the server.
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(tag = "method", rename_all = "camelCase")]
@@ -87,19 +69,6 @@ pub enum ClientRequest {
request_id: RequestId,
params: CancelLoginChatGptParams,
},
LogoutChatGpt {
#[serde(rename = "id")]
request_id: RequestId,
},
GetAuthStatus {
#[serde(rename = "id")]
request_id: RequestId,
},
GitDiffToRemote {
#[serde(rename = "id")]
request_id: RequestId,
params: GitDiffToRemoteParams,
},
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Default, TS)]
@@ -170,11 +139,16 @@ pub struct LoginChatGptResponse {
pub auth_url: String,
}
// Event name for notifying client of login completion or failure.
pub const LOGIN_CHATGPT_COMPLETE_EVENT: &str = "codex/event/login_chatgpt_complete";
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct GitDiffToRemoteResponse {
pub sha: GitSha,
pub diff: String,
pub struct LoginChatGptCompleteNotification {
pub login_id: Uuid,
pub success: bool,
#[serde(skip_serializing_if = "Option::is_none")]
pub error: Option<String>,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
@@ -183,40 +157,10 @@ pub struct CancelLoginChatGptParams {
pub login_id: Uuid,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct GitDiffToRemoteParams {
pub cwd: PathBuf,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct CancelLoginChatGptResponse {}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct LogoutChatGptParams {
pub login_id: Uuid,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct LogoutChatGptResponse {}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct GetAuthStatusParams {
pub login_id: Uuid,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct GetAuthStatusResponse {
#[serde(skip_serializing_if = "Option::is_none")]
pub auth_method: Option<AuthMode>,
pub preferred_auth_method: AuthMode,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct SendUserMessageParams {
@@ -349,34 +293,6 @@ pub struct ApplyPatchApprovalResponse {
pub decision: ReviewDecision,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct LoginChatGptCompleteNotification {
pub login_id: Uuid,
pub success: bool,
#[serde(skip_serializing_if = "Option::is_none")]
pub error: Option<String>,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS)]
#[serde(rename_all = "camelCase")]
pub struct AuthStatusChangeNotification {
/// Current authentication method; omitted if signed out.
#[serde(skip_serializing_if = "Option::is_none")]
pub auth_method: Option<AuthMode>,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, TS, Display)]
#[serde(tag = "type", content = "data", rename_all = "snake_case")]
#[strum(serialize_all = "snake_case")]
pub enum ServerNotification {
/// Authentication status changed
AuthStatusChange(AuthStatusChangeNotification),
/// ChatGPT login flow completed
LoginChatGptComplete(LoginChatGptCompleteNotification),
}
#[cfg(test)]
mod tests {
use super::*;

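Aside, not part of the diff: how the internally tagged ClientRequest enum above lands on the wire, shown with the removed GitDiffToRemote variant since its params struct is visible here. RequestId is simplified to a u64 purely for illustration; the real type comes from mcp_types.

use serde::Serialize;
use std::path::PathBuf;

#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct GitDiffToRemoteParams {
    cwd: PathBuf,
}

#[derive(Serialize)]
#[serde(tag = "method", rename_all = "camelCase")]
enum ClientRequest {
    GitDiffToRemote {
        #[serde(rename = "id")]
        request_id: u64, // simplified; the crate uses mcp_types::RequestId
        params: GitDiffToRemoteParams,
    },
}

fn main() {
    let request = ClientRequest::GitDiffToRemote {
        request_id: 7,
        params: GitDiffToRemoteParams {
            cwd: PathBuf::from("/repo"),
        },
    };
    // Prints {"method":"gitDiffToRemote","id":7,"params":{"cwd":"/repo"}}
    println!("{}", serde_json::to_string(&request).unwrap());
}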
View File

@@ -2,7 +2,6 @@ use serde::Deserialize;
use serde::Serialize;
#[derive(Debug, Clone, PartialEq, Eq, Deserialize, Serialize)]
#[serde(tag = "type", rename_all = "snake_case")]
pub enum ParsedCommand {
Read {
cmd: String,

View File

@@ -446,10 +446,6 @@ pub enum EventMsg {
BackgroundEvent(BackgroundEventEvent),
/// Notification that a model stream experienced an error or disconnect
/// and the system is handling it (e.g., retrying with backoff).
StreamError(StreamErrorEvent),
/// Notification that the agent is about to apply a code patch. Mirrors
/// `ExecCommandBegin` so frontends can show progress indicators.
PatchApplyBegin(PatchApplyBeginEvent),
@@ -725,11 +721,6 @@ pub struct BackgroundEventEvent {
pub message: String,
}
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct StreamErrorEvent {
pub message: String,
}
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct PatchApplyBeginEvent {
/// Identifier so this can be paired with the PatchApplyEnd event.

View File

@@ -22,7 +22,6 @@ workspace = true
[dependencies]
anyhow = "1"
async-stream = "0.3.6"
base64 = "0.22.1"
chrono = { version = "0.4", features = ["serde"] }
clap = { version = "4", features = ["derive"] }
@@ -34,19 +33,18 @@ codex-common = { path = "../common", features = [
"sandbox_summary",
] }
codex-core = { path = "../core" }
codex-protocol = { path = "../protocol" }
codex-file-search = { path = "../file-search" }
codex-login = { path = "../login" }
codex-ollama = { path = "../ollama" }
codex-protocol = { path = "../protocol" }
color-eyre = "0.6.3"
crossterm = { version = "0.28.1", features = ["bracketed-paste", "event-stream"] }
crossterm = { version = "0.28.1", features = ["bracketed-paste"] }
diffy = "0.4.2"
image = { version = "^0.25.6", default-features = false, features = ["jpeg"] }
lazy_static = "1"
mcp-types = { path = "../mcp-types" }
once_cell = "1"
mcp-types = { path = "../mcp-types" }
path-clean = "1.0.1"
rand = "0.9"
ratatui = { version = "0.29.0", features = [
"scrolling-regions",
"unstable-rendered-line-info",
@@ -69,7 +67,6 @@ tokio = { version = "1", features = [
"rt-multi-thread",
"signal",
] }
tokio-stream = "0.1.17"
tracing = { version = "0.1.41", features = ["log"] }
tracing-appender = "0.2.3"
tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
@@ -78,6 +75,7 @@ tui-markdown = "0.3.3"
unicode-segmentation = "1.12.0"
unicode-width = "0.1"
uuid = "1"
rand = "0.9"
[target.'cfg(unix)'.dependencies]
libc = "0.2"

View File

@@ -1,318 +1,714 @@
use crate::LoginStatus;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use crate::chatwidget::ChatWidget;
use crate::file_search::FileSearchManager;
use crate::transcript_app::TranscriptApp;
use crate::get_git_diff::get_git_diff;
use crate::get_login_status;
use crate::onboarding::onboarding_screen::KeyboardHandler;
use crate::onboarding::onboarding_screen::OnboardingScreen;
use crate::onboarding::onboarding_screen::OnboardingScreenArgs;
use crate::slash_command::SlashCommand;
use crate::tui;
use crate::tui::TuiEvent;
use codex_core::ConversationManager;
use codex_core::config::Config;
use codex_core::protocol::TokenUsage;
use codex_core::protocol::Event;
use codex_core::protocol::Op;
use color_eyre::eyre::Result;
use crossterm::SynchronizedUpdate;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyEventKind;
use crossterm::execute;
use crossterm::terminal::EnterAlternateScreen;
use crossterm::terminal::LeaveAlternateScreen;
use crossterm::terminal::supports_keyboard_enhancement;
use ratatui::layout::Rect;
use ratatui::layout::Offset;
use ratatui::prelude::Backend;
use ratatui::text::Line;
use std::path::PathBuf;
use std::sync::Arc;
use std::sync::atomic::AtomicBool;
use std::sync::atomic::Ordering;
use std::sync::mpsc::Receiver;
use std::sync::mpsc::channel;
use std::thread;
use std::time::Duration;
use tokio::select;
use tokio::sync::mpsc::unbounded_channel;
use std::time::Instant;
pub(crate) struct App {
/// Time window for debouncing redraw requests.
const REDRAW_DEBOUNCE: Duration = Duration::from_millis(1);
/// Top-level application state: which full-screen view is currently active.
#[allow(clippy::large_enum_variant)]
enum AppState<'a> {
Onboarding {
screen: OnboardingScreen,
},
/// The main chat UI is visible.
Chat {
/// Boxed to avoid a large enum variant and reduce the overall size of
/// `AppState`.
widget: Box<ChatWidget<'a>>,
},
}
pub(crate) struct App<'a> {
server: Arc<ConversationManager>,
app_event_tx: AppEventSender,
chat_widget: ChatWidget,
app_event_rx: Receiver<AppEvent>,
app_state: AppState<'a>,
/// Config is stored here so we can recreate ChatWidgets as needed.
config: Config,
file_search: FileSearchManager,
transcript_lines: Vec<Line<'static>>,
// Transcript overlay state
transcript_overlay: Option<TranscriptApp>,
deferred_history_lines: Vec<Line<'static>>,
transcript_saved_viewport: Option<Rect>,
pending_history_lines: Vec<Line<'static>>,
enhanced_keys_supported: bool,
/// Controls the animation thread that sends CommitTick events.
commit_anim_running: Arc<AtomicBool>,
/// Channel to schedule one-shot animation frames; coalesced by a single
/// scheduler thread.
frame_schedule_tx: std::sync::mpsc::Sender<Instant>,
}
impl App {
pub async fn run(
tui: &mut tui::Tui,
/// Aggregate parameters needed to create a `ChatWidget`, as creation may be
/// deferred until after the Git warning screen is dismissed.
#[derive(Clone, Debug)]
pub(crate) struct ChatWidgetArgs {
pub(crate) config: Config,
initial_prompt: Option<String>,
initial_images: Vec<PathBuf>,
enhanced_keys_supported: bool,
}
impl App<'_> {
pub(crate) fn new(
config: Config,
initial_prompt: Option<String>,
initial_images: Vec<PathBuf>,
) -> Result<TokenUsage> {
use tokio_stream::StreamExt;
let (app_event_tx, mut app_event_rx) = unbounded_channel();
let app_event_tx = AppEventSender::new(app_event_tx);
initial_images: Vec<std::path::PathBuf>,
show_trust_screen: bool,
) -> Self {
let conversation_manager = Arc::new(ConversationManager::default());
let (app_event_tx, app_event_rx) = channel();
let app_event_tx = AppEventSender::new(app_event_tx);
let enhanced_keys_supported = supports_keyboard_enhancement().unwrap_or(false);
let chat_widget = ChatWidget::new(
config.clone(),
conversation_manager.clone(),
tui.frame_requester(),
app_event_tx.clone(),
initial_prompt,
initial_images,
enhanced_keys_supported,
);
// Spawn a dedicated thread that reads crossterm events and
// re-publishes them as AppEvents, as appropriate.
{
let app_event_tx = app_event_tx.clone();
std::thread::spawn(move || {
loop {
// This timeout is necessary to avoid holding the event lock
// that crossterm::event::read() acquires. In particular,
// reading the cursor position (crossterm::cursor::position())
// needs to acquire the event lock, and so will fail if it
// can't acquire it within 2 sec. Resizing the terminal
// crashes the app if the cursor position can't be read.
if let Ok(true) = crossterm::event::poll(Duration::from_millis(100)) {
if let Ok(event) = crossterm::event::read() {
match event {
crossterm::event::Event::Key(key_event) => {
app_event_tx.send(AppEvent::KeyEvent(key_event));
}
crossterm::event::Event::Resize(_, _) => {
app_event_tx.send(AppEvent::RequestRedraw);
}
crossterm::event::Event::Paste(pasted) => {
// Many terminals convert newlines to \r when pasting (e.g., iTerm2),
// but tui-textarea expects \n. Normalize CR to LF.
// [tui-textarea]: https://github.com/rhysd/tui-textarea/blob/4d18622eeac13b309e0ff6a55a46ac6706da68cf/src/textarea.rs#L782-L783
// [iTerm2]: https://github.com/gnachman/iTerm2/blob/5d0c0d9f68523cbd0494dad5422998964a2ecd8d/sources/iTermPasteHelper.m#L206-L216
let pasted = pasted.replace("\r", "\n");
app_event_tx.send(AppEvent::Paste(pasted));
}
_ => {
// Ignore any other events.
}
}
}
} else {
// Timeout expired, no `Event` is available
}
}
});
}
let login_status = get_login_status(&config);
let should_show_onboarding =
should_show_onboarding(login_status, &config, show_trust_screen);
let app_state = if should_show_onboarding {
let show_login_screen = should_show_login_screen(login_status, &config);
let chat_widget_args = ChatWidgetArgs {
config: config.clone(),
initial_prompt,
initial_images,
enhanced_keys_supported,
};
AppState::Onboarding {
screen: OnboardingScreen::new(OnboardingScreenArgs {
event_tx: app_event_tx.clone(),
codex_home: config.codex_home.clone(),
cwd: config.cwd.clone(),
show_trust_screen,
show_login_screen,
chat_widget_args,
login_status,
}),
}
} else {
let chat_widget = ChatWidget::new(
config.clone(),
conversation_manager.clone(),
app_event_tx.clone(),
initial_prompt,
initial_images,
enhanced_keys_supported,
);
AppState::Chat {
widget: Box::new(chat_widget),
}
};
let file_search = FileSearchManager::new(config.cwd.clone(), app_event_tx.clone());
let mut app = Self {
// Spawn a single scheduler thread that coalesces both debounced redraw
// requests and animation frame requests, and emits a single Redraw event
// at the earliest requested time.
let (frame_tx, frame_rx) = channel::<Instant>();
{
let app_event_tx = app_event_tx.clone();
std::thread::spawn(move || {
use std::sync::mpsc::RecvTimeoutError;
let mut next_deadline: Option<Instant> = None;
loop {
if next_deadline.is_none() {
match frame_rx.recv() {
Ok(deadline) => next_deadline = Some(deadline),
Err(_) => break,
}
}
#[expect(clippy::expect_used)]
let deadline = next_deadline.expect("deadline set");
let now = Instant::now();
let timeout = if deadline > now {
deadline - now
} else {
Duration::from_millis(0)
};
match frame_rx.recv_timeout(timeout) {
Ok(new_deadline) => {
next_deadline =
Some(next_deadline.map_or(new_deadline, |d| d.min(new_deadline)));
}
Err(RecvTimeoutError::Timeout) => {
app_event_tx.send(AppEvent::Redraw);
next_deadline = None;
}
Err(RecvTimeoutError::Disconnected) => break,
}
}
});
}
Self {
server: conversation_manager,
app_event_tx,
chat_widget,
pending_history_lines: Vec::new(),
app_event_rx,
app_state,
config,
file_search,
enhanced_keys_supported,
transcript_lines: Vec::new(),
transcript_overlay: None,
deferred_history_lines: Vec::new(),
transcript_saved_viewport: None,
commit_anim_running: Arc::new(AtomicBool::new(false)),
};
let tui_events = tui.event_stream();
tokio::pin!(tui_events);
tui.frame_requester().schedule_frame();
while select! {
Some(event) = app_event_rx.recv() => {
app.handle_event(tui, event)?
}
Some(event) = tui_events.next() => {
app.handle_tui_event(tui, event).await?
}
} {}
tui.terminal.clear()?;
Ok(app.token_usage())
frame_schedule_tx: frame_tx,
}
}
pub(crate) async fn handle_tui_event(
&mut self,
tui: &mut tui::Tui,
event: TuiEvent,
) -> Result<bool> {
if let Some(overlay) = &mut self.transcript_overlay {
overlay.handle_event(tui, event)?;
if overlay.is_done {
// Exit alternate screen and restore viewport.
let _ = execute!(tui.terminal.backend_mut(), LeaveAlternateScreen);
if let Some(saved) = self.transcript_saved_viewport.take() {
tui.terminal.set_viewport_area(saved);
}
if !self.deferred_history_lines.is_empty() {
let lines = std::mem::take(&mut self.deferred_history_lines);
tui.insert_history_lines(lines);
}
self.transcript_overlay = None;
}
tui.frame_requester().schedule_frame();
} else {
fn schedule_frame_in(&self, dur: Duration) {
let _ = self.frame_schedule_tx.send(Instant::now() + dur);
}
pub(crate) fn run(&mut self, terminal: &mut tui::Tui) -> Result<()> {
// Schedule the first render immediately.
let _ = self.frame_schedule_tx.send(Instant::now());
while let Ok(event) = self.app_event_rx.recv() {
match event {
TuiEvent::Key(key_event) => {
self.handle_key_event(tui, key_event).await;
AppEvent::InsertHistory(lines) => {
self.pending_history_lines.extend(lines);
self.app_event_tx.send(AppEvent::RequestRedraw);
}
TuiEvent::Paste(pasted) => {
// Many terminals convert newlines to \r when pasting (e.g., iTerm2),
// but tui-textarea expects \n. Normalize CR to LF.
// [tui-textarea]: https://github.com/rhysd/tui-textarea/blob/4d18622eeac13b309e0ff6a55a46ac6706da68cf/src/textarea.rs#L782-L783
// [iTerm2]: https://github.com/gnachman/iTerm2/blob/5d0c0d9f68523cbd0494dad5422998964a2ecd8d/sources/iTermPasteHelper.m#L206-L216
let pasted = pasted.replace("\r", "\n");
self.chat_widget.handle_paste(pasted);
AppEvent::RequestRedraw => {
self.schedule_frame_in(REDRAW_DEBOUNCE);
}
TuiEvent::Draw => {
tui.draw(
self.chat_widget.desired_height(tui.terminal.size()?.width),
|frame| {
frame.render_widget_ref(&self.chat_widget, frame.area());
if let Some((x, y)) = self.chat_widget.cursor_pos(frame.area()) {
frame.set_cursor_position((x, y));
AppEvent::ScheduleFrameIn(dur) => {
self.schedule_frame_in(dur);
}
AppEvent::Redraw => {
std::io::stdout().sync_update(|_| self.draw_next_frame(terminal))??;
}
AppEvent::StartCommitAnimation => {
if self
.commit_anim_running
.compare_exchange(false, true, Ordering::Acquire, Ordering::Relaxed)
.is_ok()
{
let tx = self.app_event_tx.clone();
let running = self.commit_anim_running.clone();
thread::spawn(move || {
while running.load(Ordering::Relaxed) {
thread::sleep(Duration::from_millis(50));
tx.send(AppEvent::CommitTick);
}
});
}
}
AppEvent::StopCommitAnimation => {
self.commit_anim_running.store(false, Ordering::Release);
}
AppEvent::CommitTick => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.on_commit_tick();
}
}
AppEvent::KeyEvent(key_event) => {
match key_event {
KeyEvent {
code: KeyCode::Char('c'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
} => match &mut self.app_state {
AppState::Chat { widget } => {
widget.on_ctrl_c();
}
AppState::Onboarding { .. } => {
self.app_event_tx.send(AppEvent::ExitRequest);
}
},
)?;
KeyEvent {
code: KeyCode::Char('z'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
} => {
#[cfg(unix)]
{
self.suspend(terminal)?;
}
// No-op on non-Unix platforms.
}
KeyEvent {
code: KeyCode::Char('d'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
} => {
match &mut self.app_state {
AppState::Chat { widget } => {
if widget.composer_is_empty() {
self.app_event_tx.send(AppEvent::ExitRequest);
} else {
// Treat Ctrl+D as a normal key event when the composer
// is not empty so that it doesn't quit the application
// prematurely.
self.dispatch_key_event(key_event);
}
}
AppState::Onboarding { .. } => {
self.app_event_tx.send(AppEvent::ExitRequest);
}
}
}
KeyEvent {
kind: KeyEventKind::Press | KeyEventKind::Repeat,
..
} => {
self.dispatch_key_event(key_event);
}
_ => {
// Ignore Release key events.
}
};
}
#[cfg(unix)]
TuiEvent::ResumeFromSuspend => {
let cursor_pos = tui.terminal.get_cursor_position()?;
tui.terminal.set_viewport_area(ratatui::layout::Rect::new(
0,
cursor_pos.y,
0,
0,
));
AppEvent::Paste(text) => {
self.dispatch_paste_event(text);
}
}
}
Ok(true)
}
AppEvent::CodexEvent(event) => {
self.dispatch_codex_event(event);
}
AppEvent::ExitRequest => {
break;
}
AppEvent::CodexOp(op) => match &mut self.app_state {
AppState::Chat { widget } => widget.submit_op(op),
AppState::Onboarding { .. } => {}
},
AppEvent::DiffResult(text) => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.add_diff_output(text);
}
}
AppEvent::DispatchCommand(command) => match command {
SlashCommand::New => {
// Start a fresh conversation by replacing the current chat widget.
let new_widget = Box::new(ChatWidget::new(
self.config.clone(),
self.server.clone(),
self.app_event_tx.clone(),
None,
Vec::new(),
self.enhanced_keys_supported,
));
self.app_state = AppState::Chat { widget: new_widget };
self.app_event_tx.send(AppEvent::RequestRedraw);
}
SlashCommand::Init => {
// Guard: do not run if a task is active.
if let AppState::Chat { widget } = &mut self.app_state {
const INIT_PROMPT: &str = include_str!("../prompt_for_init_command.md");
widget.submit_text_message(INIT_PROMPT.to_string());
}
}
SlashCommand::Compact => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.clear_token_usage();
self.app_event_tx.send(AppEvent::CodexOp(Op::Compact));
}
}
SlashCommand::Model => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.open_model_popup();
}
}
SlashCommand::Quit => {
break;
}
SlashCommand::Logout => {
if let Err(e) = codex_login::logout(&self.config.codex_home) {
tracing::error!("failed to logout: {e}");
}
break;
}
SlashCommand::Diff => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.add_diff_in_progress();
}
fn handle_event(&mut self, tui: &mut tui::Tui, event: AppEvent) -> Result<bool> {
match event {
AppEvent::NewSession => {
self.chat_widget = ChatWidget::new(
self.config.clone(),
self.server.clone(),
tui.frame_requester(),
self.app_event_tx.clone(),
None,
Vec::new(),
self.enhanced_keys_supported,
);
tui.frame_requester().schedule_frame();
}
AppEvent::InsertHistoryLines(lines) => {
if let Some(overlay) = &mut self.transcript_overlay {
overlay.insert_lines(lines.clone());
tui.frame_requester().schedule_frame();
let tx = self.app_event_tx.clone();
tokio::spawn(async move {
let text = match get_git_diff().await {
Ok((is_git_repo, diff_text)) => {
if is_git_repo {
diff_text
} else {
"`/diff` — _not inside a git repository_".to_string()
}
}
Err(e) => format!("Failed to compute diff: {e}"),
};
tx.send(AppEvent::DiffResult(text));
});
}
SlashCommand::Mention => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.insert_str("@");
}
}
SlashCommand::Status => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.add_status_output();
}
}
SlashCommand::Mcp => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.add_mcp_output();
}
}
#[cfg(debug_assertions)]
SlashCommand::TestApproval => {
use codex_core::protocol::EventMsg;
use std::collections::HashMap;
use codex_core::protocol::ApplyPatchApprovalRequestEvent;
use codex_core::protocol::FileChange;
self.app_event_tx.send(AppEvent::CodexEvent(Event {
id: "1".to_string(),
// msg: EventMsg::ExecApprovalRequest(ExecApprovalRequestEvent {
// call_id: "1".to_string(),
// command: vec!["git".into(), "apply".into()],
// cwd: self.config.cwd.clone(),
// reason: Some("test".to_string()),
// }),
msg: EventMsg::ApplyPatchApprovalRequest(
ApplyPatchApprovalRequestEvent {
call_id: "1".to_string(),
changes: HashMap::from([
(
PathBuf::from("/tmp/test.txt"),
FileChange::Add {
content: "test".to_string(),
},
),
(
PathBuf::from("/tmp/test2.txt"),
FileChange::Update {
unified_diff: "+test\n-test2".to_string(),
move_path: None,
},
),
]),
reason: None,
grant_root: Some(PathBuf::from("/tmp")),
},
),
}));
}
},
AppEvent::OnboardingAuthComplete(result) => {
if let AppState::Onboarding { screen } = &mut self.app_state {
screen.on_auth_complete(result);
}
}
self.transcript_lines.extend(lines.clone());
if self.transcript_overlay.is_some() {
self.deferred_history_lines.extend(lines);
} else {
tui.insert_history_lines(lines);
AppEvent::OnboardingComplete(ChatWidgetArgs {
config,
enhanced_keys_supported,
initial_images,
initial_prompt,
}) => {
self.app_state = AppState::Chat {
widget: Box::new(ChatWidget::new(
config,
self.server.clone(),
self.app_event_tx.clone(),
initial_prompt,
initial_images,
enhanced_keys_supported,
)),
}
}
}
AppEvent::InsertHistoryCell(cell) => {
if let Some(overlay) = &mut self.transcript_overlay {
overlay.insert_lines(cell.transcript_lines());
tui.frame_requester().schedule_frame();
AppEvent::StartFileSearch(query) => {
if !query.is_empty() {
self.file_search.on_user_query(query);
}
}
self.transcript_lines.extend(cell.transcript_lines());
let display = cell.display_lines();
if !display.is_empty() {
if self.transcript_overlay.is_some() {
self.deferred_history_lines.extend(display);
} else {
tui.insert_history_lines(display);
AppEvent::FileSearchResult { query, matches } => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.apply_file_search_result(query, matches);
}
}
AppEvent::UpdateReasoningEffort(effort) => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.set_reasoning_effort(effort);
}
}
AppEvent::UpdateModel(model) => {
if let AppState::Chat { widget } = &mut self.app_state {
widget.set_model(model);
}
}
}
AppEvent::StartCommitAnimation => {
if self
.commit_anim_running
.compare_exchange(false, true, Ordering::Acquire, Ordering::Relaxed)
.is_ok()
{
let tx = self.app_event_tx.clone();
let running = self.commit_anim_running.clone();
thread::spawn(move || {
while running.load(Ordering::Relaxed) {
thread::sleep(Duration::from_millis(50));
tx.send(AppEvent::CommitTick);
}
});
}
}
AppEvent::StopCommitAnimation => {
self.commit_anim_running.store(false, Ordering::Release);
}
AppEvent::CommitTick => {
self.chat_widget.on_commit_tick();
}
AppEvent::CodexEvent(event) => {
self.chat_widget.handle_codex_event(event);
}
AppEvent::ExitRequest => {
return Ok(false);
}
AppEvent::CodexOp(op) => self.chat_widget.submit_op(op),
AppEvent::DiffResult(text) => {
self.chat_widget.add_diff_output(text);
}
AppEvent::StartFileSearch(query) => {
if !query.is_empty() {
self.file_search.on_user_query(query);
}
}
AppEvent::FileSearchResult { query, matches } => {
self.chat_widget.apply_file_search_result(query, matches);
}
AppEvent::UpdateReasoningEffort(effort) => {
self.chat_widget.set_reasoning_effort(effort);
}
AppEvent::UpdateModel(model) => {
self.chat_widget.set_model(model);
}
AppEvent::UpdateAskForApprovalPolicy(policy) => {
self.chat_widget.set_approval_policy(policy);
}
AppEvent::UpdateSandboxPolicy(policy) => {
self.chat_widget.set_sandbox_policy(policy);
}
}
Ok(true)
terminal.clear()?;
Ok(())
}
#[cfg(unix)]
fn suspend(&mut self, terminal: &mut tui::Tui) -> Result<()> {
tui::restore()?;
// SAFETY: Unix-only code path. We intentionally send SIGTSTP to the
// current process group (pid 0) to trigger standard job-control
// suspension semantics. This FFI does not involve any raw pointers,
// is not called from a signal handler, and uses a constant signal.
// Errors from kill are acceptable (e.g., if already stopped) — the
// subsequent re-init path will still leave the terminal in a good state.
// We considered `nix`, but didn't think it was worth pulling in for this one call.
unsafe { libc::kill(0, libc::SIGTSTP) };
*terminal = tui::init(&self.config)?;
terminal.clear()?;
self.app_event_tx.send(AppEvent::RequestRedraw);
Ok(())
}
pub(crate) fn token_usage(&self) -> codex_core::protocol::TokenUsage {
self.chat_widget.token_usage().clone()
match &self.app_state {
AppState::Chat { widget } => widget.token_usage().clone(),
AppState::Onboarding { .. } => codex_core::protocol::TokenUsage::default(),
}
}
async fn handle_key_event(&mut self, tui: &mut tui::Tui, key_event: KeyEvent) {
match key_event {
KeyEvent {
code: KeyCode::Char('c'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
} => {
self.chat_widget.on_ctrl_c();
}
KeyEvent {
code: KeyCode::Char('d'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
} if self.chat_widget.composer_is_empty() => {
self.app_event_tx.send(AppEvent::ExitRequest);
}
KeyEvent {
code: KeyCode::Char('t'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
} => {
// Enter alternate screen and set viewport to full size.
let _ = execute!(tui.terminal.backend_mut(), EnterAlternateScreen);
if let Ok(size) = tui.terminal.size() {
self.transcript_saved_viewport = Some(tui.terminal.viewport_area);
tui.terminal
.set_viewport_area(Rect::new(0, 0, size.width, size.height));
let _ = tui.terminal.clear();
}
fn draw_next_frame(&mut self, terminal: &mut tui::Tui) -> Result<()> {
if matches!(self.app_state, AppState::Onboarding { .. }) {
terminal.clear()?;
}
self.transcript_overlay = Some(TranscriptApp::new(self.transcript_lines.clone()));
tui.frame_requester().schedule_frame();
}
KeyEvent {
kind: KeyEventKind::Press | KeyEventKind::Repeat,
..
} => {
self.chat_widget.handle_key_event(key_event);
}
_ => {
// Ignore Release key events.
let screen_size = terminal.size()?;
let last_known_screen_size = terminal.last_known_screen_size;
if screen_size != last_known_screen_size {
let cursor_pos = terminal.get_cursor_position()?;
let last_known_cursor_pos = terminal.last_known_cursor_pos;
if cursor_pos.y != last_known_cursor_pos.y {
// The terminal was resized. The only point of reference we have for where our viewport
// was moved is the cursor position.
// NB this assumes that the cursor was not wrapped as part of the resize.
let cursor_delta = cursor_pos.y as i32 - last_known_cursor_pos.y as i32;
let new_viewport_area = terminal.viewport_area.offset(Offset {
x: 0,
y: cursor_delta,
});
terminal.set_viewport_area(new_viewport_area);
terminal.clear()?;
}
}
let size = terminal.size()?;
let desired_height = match &self.app_state {
AppState::Chat { widget } => widget.desired_height(size.width),
AppState::Onboarding { .. } => size.height,
};
let mut area = terminal.viewport_area;
area.height = desired_height.min(size.height);
area.width = size.width;
if area.bottom() > size.height {
terminal
.backend_mut()
.scroll_region_up(0..area.top(), area.bottom() - size.height)?;
area.y = size.height - area.height;
}
if area != terminal.viewport_area {
terminal.clear()?;
terminal.set_viewport_area(area);
}
if !self.pending_history_lines.is_empty() {
crate::insert_history::insert_history_lines(
terminal,
self.pending_history_lines.clone(),
);
self.pending_history_lines.clear();
}
terminal.draw(|frame| match &mut self.app_state {
AppState::Chat { widget } => {
if let Some((x, y)) = widget.cursor_pos(frame.area()) {
frame.set_cursor_position((x, y));
}
frame.render_widget_ref(&**widget, frame.area())
}
AppState::Onboarding { screen } => frame.render_widget_ref(&*screen, frame.area()),
})?;
Ok(())
}
/// Dispatch a KeyEvent to the current view and let it decide what to do
/// with it.
fn dispatch_key_event(&mut self, key_event: KeyEvent) {
match &mut self.app_state {
AppState::Chat { widget } => {
widget.handle_key_event(key_event);
}
AppState::Onboarding { screen } => match key_event.code {
KeyCode::Char('q') => {
self.app_event_tx.send(AppEvent::ExitRequest);
}
_ => screen.handle_key_event(key_event),
},
}
}
fn dispatch_paste_event(&mut self, pasted: String) {
match &mut self.app_state {
AppState::Chat { widget } => widget.handle_paste(pasted),
AppState::Onboarding { .. } => {}
}
}
fn dispatch_codex_event(&mut self, event: Event) {
match &mut self.app_state {
AppState::Chat { widget } => widget.handle_codex_event(event),
AppState::Onboarding { .. } => {}
}
}
}
fn should_show_onboarding(
login_status: LoginStatus,
config: &Config,
show_trust_screen: bool,
) -> bool {
if show_trust_screen {
return true;
}
should_show_login_screen(login_status, config)
}
fn should_show_login_screen(login_status: LoginStatus, config: &Config) -> bool {
match login_status {
LoginStatus::NotAuthenticated => true,
LoginStatus::AuthMode(method) => method != config.preferred_auth_method,
}
}
#[cfg(test)]
mod tests {
use super::*;
use codex_core::config::ConfigOverrides;
use codex_core::config::ConfigToml;
use codex_login::AuthMode;
fn make_config(preferred: AuthMode) -> Config {
let mut cfg = Config::load_from_base_config_with_overrides(
ConfigToml::default(),
ConfigOverrides::default(),
std::env::temp_dir(),
)
.expect("load default config");
cfg.preferred_auth_method = preferred;
cfg
}
#[test]
fn shows_login_when_not_authenticated() {
let cfg = make_config(AuthMode::ChatGPT);
assert!(should_show_login_screen(
LoginStatus::NotAuthenticated,
&cfg
));
}
#[test]
fn shows_login_when_api_key_but_prefers_chatgpt() {
let cfg = make_config(AuthMode::ChatGPT);
assert!(should_show_login_screen(
LoginStatus::AuthMode(AuthMode::ApiKey),
&cfg
))
}
#[test]
fn hides_login_when_api_key_and_prefers_api_key() {
let cfg = make_config(AuthMode::ApiKey);
assert!(!should_show_login_screen(
LoginStatus::AuthMode(AuthMode::ApiKey),
&cfg
))
}
#[test]
fn hides_login_when_chatgpt_and_prefers_chatgpt() {
let cfg = make_config(AuthMode::ChatGPT);
assert!(!should_show_login_screen(
LoginStatus::AuthMode(AuthMode::ChatGPT),
&cfg
))
}
}

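Aside, not part of the diff: the frame-scheduler pattern App introduces above, reduced to a standalone sketch. A single thread owns the channel of requested deadlines, keeps only the earliest one, and emits one redraw when it elapses, so bursts of RequestRedraw and ScheduleFrameIn requests coalesce. The names spawn_scheduler and on_redraw are illustrative, not from the crate.

use std::sync::mpsc::{channel, RecvTimeoutError, Sender};
use std::thread;
use std::time::{Duration, Instant};

fn spawn_scheduler(on_redraw: impl Fn() + Send + 'static) -> Sender<Instant> {
    let (tx, rx) = channel::<Instant>();
    thread::spawn(move || {
        let mut next_deadline: Option<Instant> = None;
        loop {
            if next_deadline.is_none() {
                match rx.recv() {
                    Ok(deadline) => next_deadline = Some(deadline),
                    Err(_) => break, // all senders dropped
                }
            }
            let Some(deadline) = next_deadline else { continue };
            let timeout = deadline.saturating_duration_since(Instant::now());
            match rx.recv_timeout(timeout) {
                // Another request arrived while waiting: keep the earliest deadline.
                Ok(new) => next_deadline = Some(deadline.min(new)),
                // Deadline reached: emit a single redraw and go idle.
                Err(RecvTimeoutError::Timeout) => {
                    on_redraw();
                    next_deadline = None;
                }
                Err(RecvTimeoutError::Disconnected) => break,
            }
        }
    });
    tx
}

fn main() {
    let tx = spawn_scheduler(|| println!("redraw"));
    // Three requests inside the debounce window produce one redraw.
    for _ in 0..3 {
        let _ = tx.send(Instant::now() + Duration::from_millis(1));
    }
    thread::sleep(Duration::from_millis(20));
}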
View File

@@ -1,11 +1,11 @@
use codex_core::protocol::Event;
use codex_file_search::FileMatch;
use crossterm::event::KeyEvent;
use ratatui::text::Line;
use std::time::Duration;
use crate::history_cell::HistoryCell;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::SandboxPolicy;
use crate::app::ChatWidgetArgs;
use crate::slash_command::SlashCommand;
use codex_core::protocol_config_types::ReasoningEffort;
#[allow(clippy::large_enum_variant)]
@@ -13,8 +13,20 @@ use codex_core::protocol_config_types::ReasoningEffort;
pub(crate) enum AppEvent {
CodexEvent(Event),
/// Start a new session.
NewSession,
/// Request a redraw which will be debounced by the [`App`].
RequestRedraw,
/// Actually draw the next frame.
Redraw,
/// Schedule a one-shot animation frame roughly after the given duration.
/// Multiple requests are coalesced by the central frame scheduler.
ScheduleFrameIn(Duration),
KeyEvent(KeyEvent),
/// Text pasted from the terminal clipboard.
Paste(String),
/// Request to exit the application gracefully.
ExitRequest,
@@ -23,6 +35,10 @@ pub(crate) enum AppEvent {
/// bubbling channels through layers of widgets.
CodexOp(codex_core::protocol::Op),
/// Dispatch a recognized slash command from the UI (composer) to the app
/// layer so it can be handled centrally.
DispatchCommand(SlashCommand),
/// Kick off an asynchronous file search for the given query (text after
/// the `@`). Previous searches may be cancelled by the app layer so there
/// is at most one in-flight search.
@@ -39,22 +55,19 @@ pub(crate) enum AppEvent {
/// Result of computing a `/diff` command.
DiffResult(String),
InsertHistoryLines(Vec<Line<'static>>),
InsertHistoryCell(Box<dyn HistoryCell>),
InsertHistory(Vec<Line<'static>>),
StartCommitAnimation,
StopCommitAnimation,
CommitTick,
/// Onboarding: result of login_with_chatgpt.
OnboardingAuthComplete(Result<(), String>),
OnboardingComplete(ChatWidgetArgs),
/// Update the current reasoning effort in the running app and widget.
UpdateReasoningEffort(ReasoningEffort),
/// Update the current model slug in the running app and widget.
UpdateModel(String),
/// Update the current approval policy in the running app and widget.
UpdateAskForApprovalPolicy(AskForApproval),
/// Update the current sandbox policy in the running app and widget.
UpdateSandboxPolicy(SandboxPolicy),
}

View File

@@ -1,15 +1,15 @@
use tokio::sync::mpsc::UnboundedSender;
use std::sync::mpsc::Sender;
use crate::app_event::AppEvent;
use crate::session_log;
#[derive(Clone, Debug)]
pub(crate) struct AppEventSender {
pub app_event_tx: UnboundedSender<AppEvent>,
pub app_event_tx: Sender<AppEvent>,
}
impl AppEventSender {
pub(crate) fn new(app_event_tx: UnboundedSender<AppEvent>) -> Self {
pub(crate) fn new(app_event_tx: Sender<AppEvent>) -> Self {
Self { app_event_tx }
}

View File
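Aside, not part of the diff: the event-channel shape after this change, in miniature. With a std::sync::mpsc::Sender inside AppEventSender, producers on any thread send without blocking (channel() is unbounded) and the UI thread drains events with a blocking recv loop, mirroring the while let Ok(event) loop in App::run above. The AppEvent enum is trimmed to two variants for brevity.

use std::sync::mpsc::channel;
use std::thread;

#[derive(Debug)]
enum AppEvent {
    RequestRedraw,
    ExitRequest,
}

fn main() {
    let (tx, rx) = channel::<AppEvent>();
    thread::spawn(move || {
        // A producer thread, standing in for the crossterm reader or an agent task.
        let _ = tx.send(AppEvent::RequestRedraw);
        let _ = tx.send(AppEvent::ExitRequest);
    });
    // The UI thread owns the receiver and processes events until asked to exit.
    while let Ok(event) = rx.recv() {
        println!("{event:?}");
        if matches!(event, AppEvent::ExitRequest) {
            break;
        }
    }
}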

@@ -12,13 +12,13 @@ use super::BottomPaneView;
use super::CancellationEvent;
/// Modal overlay asking the user to approve/deny a sequence of requests.
pub(crate) struct ApprovalModalView {
current: UserApprovalWidget,
pub(crate) struct ApprovalModalView<'a> {
current: UserApprovalWidget<'a>,
queue: Vec<ApprovalRequest>,
app_event_tx: AppEventSender,
}
impl ApprovalModalView {
impl ApprovalModalView<'_> {
pub fn new(request: ApprovalRequest, app_event_tx: AppEventSender) -> Self {
Self {
current: UserApprovalWidget::new(request, app_event_tx.clone()),
@@ -41,13 +41,13 @@ impl ApprovalModalView {
}
}
impl BottomPaneView for ApprovalModalView {
fn handle_key_event(&mut self, _pane: &mut BottomPane, key_event: KeyEvent) {
impl<'a> BottomPaneView<'a> for ApprovalModalView<'a> {
fn handle_key_event(&mut self, _pane: &mut BottomPane<'a>, key_event: KeyEvent) {
self.current.handle_key_event(key_event);
self.maybe_advance();
}
fn on_ctrl_c(&mut self, _pane: &mut BottomPane) -> CancellationEvent {
fn on_ctrl_c(&mut self, _pane: &mut BottomPane<'a>) -> CancellationEvent {
self.current.on_ctrl_c();
self.queue.clear();
CancellationEvent::Handled
@@ -75,7 +75,7 @@ impl BottomPaneView for ApprovalModalView {
mod tests {
use super::*;
use crate::app_event::AppEvent;
use tokio::sync::mpsc::unbounded_channel;
use std::sync::mpsc::channel;
fn make_exec_request() -> ApprovalRequest {
ApprovalRequest::Exec {
@@ -87,16 +87,15 @@ mod tests {
#[test]
fn ctrl_c_aborts_and_clears_queue() {
let (tx, _rx) = unbounded_channel::<AppEvent>();
let tx = AppEventSender::new(tx);
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let first = make_exec_request();
let mut view = ApprovalModalView::new(first, tx);
view.enqueue_request(make_exec_request());
let (tx2, _rx2) = unbounded_channel::<AppEvent>();
let (tx_raw2, _rx2) = channel::<AppEvent>();
let mut pane = BottomPane::new(super::super::BottomPaneParams {
app_event_tx: AppEventSender::new(tx2),
frame_requester: crate::tui::FrameRequester::test_dummy(),
app_event_tx: AppEventSender::new(tx_raw2),
has_input_focus: true,
enhanced_keys_supported: false,
placeholder_text: "Ask Codex to do anything".to_string(),

View File

@@ -7,10 +7,10 @@ use super::BottomPane;
use super::CancellationEvent;
/// Trait implemented by every view that can be shown in the bottom pane.
pub(crate) trait BottomPaneView {
pub(crate) trait BottomPaneView<'a> {
/// Handle a key event while the view is active. A redraw is always
/// scheduled after this call.
fn handle_key_event(&mut self, _pane: &mut BottomPane, _key_event: KeyEvent) {}
fn handle_key_event(&mut self, _pane: &mut BottomPane<'a>, _key_event: KeyEvent) {}
/// Return `true` if the view has finished and should be removed.
fn is_complete(&self) -> bool {
@@ -18,7 +18,7 @@ pub(crate) trait BottomPaneView {
}
/// Handle Ctrl-C while this view is active.
fn on_ctrl_c(&mut self, _pane: &mut BottomPane) -> CancellationEvent {
fn on_ctrl_c(&mut self, _pane: &mut BottomPane<'a>) -> CancellationEvent {
CancellationEvent::Ignored
}
@@ -28,11 +28,6 @@ pub(crate) trait BottomPaneView {
/// Render the view: this will be displayed in place of the composer.
fn render(&self, area: Rect, buf: &mut Buffer);
/// Update the status indicator animated header. Default no-op.
fn update_status_header(&mut self, _header: String) {
// no-op
}
/// Called when task completes to check if the view should be hidden.
fn should_hide_when_task_is_done(&mut self) -> bool {
false

View File

@@ -23,7 +23,6 @@ use ratatui::widgets::WidgetRef;
use super::chat_composer_history::ChatComposerHistory;
use super::command_popup::CommandPopup;
use super::file_search_popup::FileSearchPopup;
use crate::slash_command::SlashCommand;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
@@ -39,7 +38,6 @@ const LARGE_PASTE_CHAR_THRESHOLD: usize = 1000;
/// Result returned when the user interacts with the text area.
pub enum InputResult {
Submitted(String),
Command(SlashCommand),
None,
}
@@ -277,11 +275,6 @@ impl ChatComposer {
self.textarea.set_text(&format!("/{} ", cmd.command()));
self.textarea.set_cursor(self.textarea.text().len());
}
// After completing the command, move cursor to the end.
if !self.textarea.text().is_empty() {
let end = self.textarea.text().len();
self.textarea.set_cursor(end);
}
}
(InputResult::None, true)
}
@@ -291,15 +284,15 @@ impl ChatComposer {
..
} => {
if let Some(cmd) = popup.selected_command() {
// Send command to the app layer.
self.app_event_tx.send(AppEvent::DispatchCommand(*cmd));
// Clear textarea so no residual text remains.
self.textarea.set_text("");
let result = (InputResult::Command(*cmd), true);
// Hide popup since the command has been dispatched.
self.active_popup = ActivePopup::None;
return result;
return (InputResult::None, true);
}
// Fallback to default newline handling if no command selected.
self.handle_key_event_without_popup(key_event)
@@ -752,7 +745,6 @@ mod tests {
use crate::bottom_pane::InputResult;
use crate::bottom_pane::chat_composer::LARGE_PASTE_CHAR_THRESHOLD;
use crate::bottom_pane::textarea::TextArea;
use tokio::sync::mpsc::unbounded_channel;
#[test]
fn test_current_at_token_basic_cases() {
@@ -909,7 +901,7 @@ mod tests {
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, _rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());
@@ -933,7 +925,7 @@ mod tests {
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, _rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());
@@ -963,7 +955,7 @@ mod tests {
use crossterm::event::KeyModifiers;
let large = "y".repeat(LARGE_PASTE_CHAR_THRESHOLD + 1);
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, _rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());
@@ -985,7 +977,7 @@ mod tests {
use ratatui::Terminal;
use ratatui::backend::TestBackend;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, _rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut terminal = match Terminal::new(TestBackend::new(100, 10)) {
Ok(t) => t,
@@ -1041,8 +1033,9 @@ mod tests {
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
use std::sync::mpsc::TryRecvError;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());
@@ -1058,18 +1051,25 @@ mod tests {
let (result, _needs_redraw) =
composer.handle_key_event(KeyEvent::new(KeyCode::Enter, KeyModifiers::NONE));
// When a slash command is dispatched, the composer should return a
// Command result (not submit literal text) and clear its textarea.
// When a slash command is dispatched, the composer should not submit
// literal text and should clear its textarea.
match result {
InputResult::Command(cmd) => {
assert_eq!(cmd.command(), "init");
}
InputResult::None => {}
InputResult::Submitted(text) => {
panic!("expected command dispatch, but composer submitted literal text: {text}")
}
InputResult::None => panic!("expected Command result for '/init'"),
}
assert!(composer.textarea.is_empty(), "composer should be cleared");
// Verify a DispatchCommand event for the "init" command was sent.
match rx.try_recv() {
Ok(AppEvent::DispatchCommand(cmd)) => {
assert_eq!(cmd.command(), "init");
}
Ok(_other) => panic!("unexpected app event"),
Err(TryRecvError::Empty) => panic!("expected a DispatchCommand event for '/init'"),
Err(TryRecvError::Disconnected) => panic!("app event channel disconnected"),
}
}
#[test]
@@ -1078,7 +1078,7 @@ mod tests {
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, _rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());
@@ -1099,8 +1099,9 @@ mod tests {
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
use std::sync::mpsc::TryRecvError;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());
@@ -1113,16 +1114,24 @@ mod tests {
composer.handle_key_event(KeyEvent::new(KeyCode::Enter, KeyModifiers::NONE));
match result {
InputResult::Command(cmd) => {
assert_eq!(cmd.command(), "mention");
}
InputResult::None => {}
InputResult::Submitted(text) => {
panic!("expected command dispatch, but composer submitted literal text: {text}")
}
InputResult::None => panic!("expected Command result for '/mention'"),
}
assert!(composer.textarea.is_empty(), "composer should be cleared");
composer.insert_str("@");
match rx.try_recv() {
Ok(AppEvent::DispatchCommand(cmd)) => {
assert_eq!(cmd.command(), "mention");
composer.insert_str("@");
}
Ok(_other) => panic!("unexpected app event"),
Err(TryRecvError::Empty) => panic!("expected a DispatchCommand event for '/mention'"),
Err(TryRecvError::Disconnected) => {
panic!("app event channel disconnected")
}
}
assert_eq!(composer.textarea.text(), "@");
}
@@ -1132,7 +1141,7 @@ mod tests {
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, _rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());
@@ -1206,7 +1215,7 @@ mod tests {
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, _rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());
@@ -1273,7 +1282,7 @@ mod tests {
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let (tx, _rx) = std::sync::mpsc::channel();
let sender = AppEventSender::new(tx);
let mut composer =
ChatComposer::new(true, sender, false, "Ask Codex to do anything".to_string());

View File

@@ -192,7 +192,7 @@ mod tests {
use super::*;
use crate::app_event::AppEvent;
use codex_core::protocol::Op;
use tokio::sync::mpsc::unbounded_channel;
use std::sync::mpsc::channel;
#[test]
fn duplicate_submissions_are_not_recorded() {
@@ -219,7 +219,7 @@ mod tests {
#[test]
fn navigation_with_async_fetch() {
let (tx, mut rx) = unbounded_channel::<AppEvent>();
let (tx, rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx);
let mut history = ChatComposerHistory::new();

View File

@@ -105,8 +105,8 @@ impl ListSelectionView {
}
}
impl BottomPaneView for ListSelectionView {
fn handle_key_event(&mut self, _pane: &mut BottomPane, key_event: KeyEvent) {
impl BottomPaneView<'_> for ListSelectionView {
fn handle_key_event(&mut self, _pane: &mut BottomPane<'_>, key_event: KeyEvent) {
match key_event {
KeyEvent {
code: KeyCode::Up, ..
@@ -131,7 +131,7 @@ impl BottomPaneView for ListSelectionView {
self.complete
}
fn on_ctrl_c(&mut self, _pane: &mut BottomPane) -> CancellationEvent {
fn on_ctrl_c(&mut self, _pane: &mut BottomPane<'_>) -> CancellationEvent {
self.complete = true;
CancellationEvent::Handled
}

View File

@@ -1,7 +1,7 @@
//! Bottom pane: shows the ChatComposer or a BottomPaneView, if one is active.
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use crate::tui::FrameRequester;
use crate::user_approval_widget::ApprovalRequest;
use bottom_pane_view::BottomPaneView;
use codex_core::protocol::TokenUsage;
@@ -39,17 +39,15 @@ pub(crate) use list_selection_view::SelectionItem;
use status_indicator_view::StatusIndicatorView;
/// Pane displayed in the lower half of the chat UI.
pub(crate) struct BottomPane {
pub(crate) struct BottomPane<'a> {
/// Composer is retained even when a BottomPaneView is displayed so the
/// input state is preserved when the view is closed.
composer: ChatComposer,
/// If present, this is displayed instead of the `composer`.
active_view: Option<Box<dyn BottomPaneView>>,
active_view: Option<Box<dyn BottomPaneView<'a> + 'a>>,
app_event_tx: AppEventSender,
frame_requester: FrameRequester,
has_input_focus: bool,
is_task_running: bool,
ctrl_c_quit_hint: bool,
@@ -61,13 +59,12 @@ pub(crate) struct BottomPane {
pub(crate) struct BottomPaneParams {
pub(crate) app_event_tx: AppEventSender,
pub(crate) frame_requester: FrameRequester,
pub(crate) has_input_focus: bool,
pub(crate) enhanced_keys_supported: bool,
pub(crate) placeholder_text: String,
}
impl BottomPane {
impl BottomPane<'_> {
const BOTTOM_PAD_LINES: u16 = 2;
pub fn new(params: BottomPaneParams) -> Self {
let enhanced_keys_supported = params.enhanced_keys_supported;
@@ -80,7 +77,6 @@ impl BottomPane {
),
active_view: None,
app_event_tx: params.app_event_tx,
frame_requester: params.frame_requester,
has_input_focus: params.has_input_focus,
is_task_running: false,
ctrl_c_quit_hint: false,
@@ -117,10 +113,7 @@ impl BottomPane {
if !view.is_complete() {
self.active_view = Some(view);
} else if self.is_task_running {
let mut v = StatusIndicatorView::new(
self.app_event_tx.clone(),
self.frame_requester.clone(),
);
let mut v = StatusIndicatorView::new(self.app_event_tx.clone());
v.update_text("waiting for model".to_string());
self.active_view = Some(Box::new(v));
self.status_view_active = true;
@@ -151,10 +144,7 @@ impl BottomPane {
self.active_view = Some(view);
} else if self.is_task_running {
// Modal aborted but task still running; restore the status indicator.
let mut v = StatusIndicatorView::new(
self.app_event_tx.clone(),
self.frame_requester.clone(),
);
let mut v = StatusIndicatorView::new(self.app_event_tx.clone());
v.update_text("waiting for model".to_string());
self.active_view = Some(Box::new(v));
self.status_view_active = true;
@@ -182,17 +172,6 @@ impl BottomPane {
self.request_redraw();
}
/// Update the animated header shown to the left of the brackets in the
/// status indicator (defaults to "Working"). This will update the active
/// StatusIndicatorView if present; otherwise, if a live overlay is active,
/// it will update that. If neither is present, this call is a no-op.
pub(crate) fn update_status_header(&mut self, header: String) {
if let Some(view) = self.active_view.as_mut() {
view.update_status_header(header.clone());
self.request_redraw();
}
}
pub(crate) fn show_ctrl_c_quit_hint(&mut self) {
self.ctrl_c_quit_hint = true;
self.composer
@@ -220,7 +199,6 @@ impl BottomPane {
if self.active_view.is_none() {
self.active_view = Some(Box::new(StatusIndicatorView::new(
self.app_event_tx.clone(),
self.frame_requester.clone(),
)));
self.status_view_active = true;
}
@@ -314,7 +292,7 @@ impl BottomPane {
/// Height (terminal rows) required by the current bottom pane.
pub(crate) fn request_redraw(&self) {
self.frame_requester.schedule_frame();
self.app_event_tx.send(AppEvent::RequestRedraw)
}
// --- History helpers ---
@@ -344,7 +322,7 @@ impl BottomPane {
}
}
impl WidgetRef for &BottomPane {
impl WidgetRef for &BottomPane<'_> {
fn render_ref(&self, area: Rect, buf: &mut Buffer) {
if let Some(view) = &self.active_view {
// Reserve bottom padding lines; keep at least 1 line for the view.
@@ -381,7 +359,7 @@ mod tests {
use crate::app_event::AppEvent;
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;
use tokio::sync::mpsc::unbounded_channel;
use std::sync::mpsc::channel;
fn exec_request() -> ApprovalRequest {
ApprovalRequest::Exec {
@@ -393,11 +371,10 @@ mod tests {
#[test]
fn ctrl_c_on_modal_consumes_and_shows_quit_hint() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut pane = BottomPane::new(BottomPaneParams {
app_event_tx: tx,
frame_requester: crate::tui::FrameRequester::test_dummy(),
has_input_focus: true,
enhanced_keys_supported: false,
placeholder_text: "Ask Codex to do anything".to_string(),
@@ -412,11 +389,10 @@ mod tests {
#[test]
fn overlay_not_shown_above_approval_modal() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut pane = BottomPane::new(BottomPaneParams {
app_event_tx: tx,
frame_requester: crate::tui::FrameRequester::test_dummy(),
has_input_focus: true,
enhanced_keys_supported: false,
placeholder_text: "Ask Codex to do anything".to_string(),
@@ -442,11 +418,10 @@ mod tests {
#[test]
fn composer_not_shown_after_denied_if_task_running() {
let (tx_raw, rx) = unbounded_channel::<AppEvent>();
let (tx_raw, rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut pane = BottomPane::new(BottomPaneParams {
app_event_tx: tx.clone(),
frame_requester: crate::tui::FrameRequester::test_dummy(),
has_input_focus: true,
enhanced_keys_supported: false,
placeholder_text: "Ask Codex to do anything".to_string(),
@@ -493,11 +468,10 @@ mod tests {
#[test]
fn status_indicator_visible_during_command_execution() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut pane = BottomPane::new(BottomPaneParams {
app_event_tx: tx,
frame_requester: crate::tui::FrameRequester::test_dummy(),
has_input_focus: true,
enhanced_keys_supported: false,
placeholder_text: "Ask Codex to do anything".to_string(),
@@ -526,11 +500,10 @@ mod tests {
#[test]
fn bottom_padding_present_for_status_view() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut pane = BottomPane::new(BottomPaneParams {
app_event_tx: tx,
frame_requester: crate::tui::FrameRequester::test_dummy(),
has_input_focus: true,
enhanced_keys_supported: false,
placeholder_text: "Ask Codex to do anything".to_string(),
@@ -579,11 +552,10 @@ mod tests {
#[test]
fn bottom_padding_shrinks_when_tiny() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut pane = BottomPane::new(BottomPaneParams {
app_event_tx: tx,
frame_requester: crate::tui::FrameRequester::test_dummy(),
has_input_focus: true,
enhanced_keys_supported: false,
placeholder_text: "Ask Codex to do anything".to_string(),

View File

@@ -6,7 +6,6 @@ use ratatui::widgets::WidgetRef;
use crate::app_event_sender::AppEventSender;
use crate::bottom_pane::BottomPane;
use crate::status_indicator_widget::StatusIndicatorWidget;
use crate::tui::FrameRequester;
use super::BottomPaneView;
@@ -15,26 +14,18 @@ pub(crate) struct StatusIndicatorView {
}
impl StatusIndicatorView {
pub fn new(app_event_tx: AppEventSender, frame_requester: FrameRequester) -> Self {
pub fn new(app_event_tx: AppEventSender) -> Self {
Self {
view: StatusIndicatorWidget::new(app_event_tx, frame_requester),
view: StatusIndicatorWidget::new(app_event_tx),
}
}
pub fn update_text(&mut self, text: String) {
self.view.update_text(text);
}
pub fn update_header(&mut self, header: String) {
self.view.update_header(header);
}
}
impl BottomPaneView for StatusIndicatorView {
fn update_status_header(&mut self, header: String) {
self.update_header(header);
}
impl BottomPaneView<'_> for StatusIndicatorView {
fn should_hide_when_task_is_done(&mut self) -> bool {
true
}
@@ -47,7 +38,7 @@ impl BottomPaneView for StatusIndicatorView {
self.view.render_ref(area, buf);
}
fn handle_key_event(&mut self, _pane: &mut BottomPane, key_event: KeyEvent) {
fn handle_key_event(&mut self, _pane: &mut BottomPane<'_>, key_event: KeyEvent) {
if key_event.code == KeyCode::Esc {
self.view.interrupt();
}

View File

@@ -23,7 +23,6 @@ use codex_core::protocol::McpToolCallBeginEvent;
use codex_core::protocol::McpToolCallEndEvent;
use codex_core::protocol::Op;
use codex_core::protocol::PatchApplyBeginEvent;
use codex_core::protocol::StreamErrorEvent;
use codex_core::protocol::TaskCompleteEvent;
use codex_core::protocol::TokenUsage;
use codex_core::protocol::TurnDiffEvent;
@@ -48,14 +47,11 @@ use crate::bottom_pane::CancellationEvent;
use crate::bottom_pane::InputResult;
use crate::bottom_pane::SelectionAction;
use crate::bottom_pane::SelectionItem;
use crate::get_git_diff::get_git_diff;
use crate::history_cell;
use crate::history_cell::CommandOutput;
use crate::history_cell::ExecCell;
use crate::history_cell::HistoryCell;
use crate::history_cell::PatchEventType;
use crate::slash_command::SlashCommand;
use crate::tui::FrameRequester;
// streaming internals are provided by crate::streaming and crate::markdown_stream
use crate::user_approval_widget::ApprovalRequest;
mod interrupts;
@@ -64,13 +60,9 @@ mod agent;
use self::agent::spawn_agent;
use crate::streaming::controller::AppEventHistorySink;
use crate::streaming::controller::StreamController;
use codex_common::approval_presets::ApprovalPreset;
use codex_common::approval_presets::builtin_approval_presets;
use codex_common::model_presets::ModelPreset;
use codex_common::model_presets::builtin_model_presets;
use codex_core::ConversationManager;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::SandboxPolicy;
use codex_core::protocol_config_types::ReasoningEffort as ReasoningEffortConfig;
use codex_file_search::FileMatch;
use uuid::Uuid;
@@ -81,10 +73,10 @@ struct RunningCommand {
parsed_cmd: Vec<ParsedCommand>,
}
pub(crate) struct ChatWidget {
pub(crate) struct ChatWidget<'a> {
app_event_tx: AppEventSender,
codex_op_tx: UnboundedSender<Op>,
bottom_pane: BottomPane,
bottom_pane: BottomPane<'a>,
active_exec_cell: Option<ExecCell>,
config: Config,
initial_user_message: Option<UserMessage>,
@@ -92,6 +84,8 @@ pub(crate) struct ChatWidget {
last_token_usage: TokenUsage,
// Stream lifecycle controller
stream: StreamController,
// Track the most recently active stream kind in the current turn
last_stream_kind: Option<StreamKind>,
running_commands: HashMap<String, RunningCommand>,
pending_exec_completions: Vec<(Vec<String>, Vec<ParsedCommand>, CommandOutput)>,
task_complete_pending: bool,
@@ -99,12 +93,7 @@ pub(crate) struct ChatWidget {
interrupts: InterruptManager,
// Whether a redraw is needed after handling the current event
needs_redraw: bool,
// Accumulates the current reasoning block text to extract a header
reasoning_buffer: String,
// Accumulates full reasoning content for transcript-only recording
full_reasoning_buffer: String,
session_id: Option<Uuid>,
frame_requester: FrameRequester,
}
struct UserMessage {
@@ -112,6 +101,8 @@ struct UserMessage {
image_paths: Vec<PathBuf>,
}
use crate::streaming::StreamKind;
impl From<String> for UserMessage {
fn from(text: String) -> Self {
Self {
@@ -129,21 +120,21 @@ fn create_initial_user_message(text: String, image_paths: Vec<PathBuf>) -> Optio
}
}
impl ChatWidget {
impl ChatWidget<'_> {
#[inline]
fn mark_needs_redraw(&mut self) {
self.needs_redraw = true;
}
fn flush_answer_stream_with_separator(&mut self) {
let sink = AppEventHistorySink(self.app_event_tx.clone());
let _ = self.stream.finalize(true, &sink);
let _ = self.stream.finalize(StreamKind::Answer, true, &sink);
}
// --- Small event handlers ---
fn on_session_configured(&mut self, event: codex_core::protocol::SessionConfiguredEvent) {
self.bottom_pane
.set_history_metadata(event.history_log_id, event.history_entry_count);
self.session_id = Some(event.session_id);
self.add_to_history(history_cell::new_session_info(&self.config, event, true));
self.add_to_history(&history_cell::new_session_info(&self.config, event, true));
if let Some(user_message) = self.initial_user_message.take() {
self.submit_user_message(user_message);
}
@@ -153,48 +144,30 @@ impl ChatWidget {
fn on_agent_message(&mut self, message: String) {
let sink = AppEventHistorySink(self.app_event_tx.clone());
let finished = self.stream.apply_final_answer(&message, &sink);
self.last_stream_kind = Some(StreamKind::Answer);
self.handle_if_stream_finished(finished);
self.mark_needs_redraw();
}
fn on_agent_message_delta(&mut self, delta: String) {
self.handle_streaming_delta(delta);
self.handle_streaming_delta(StreamKind::Answer, delta);
}
fn on_agent_reasoning_delta(&mut self, delta: String) {
// For reasoning deltas, do not stream to history. Accumulate the
// current reasoning block and extract the first bold element
// (between **/**) as the chunk header. Show this header as status.
self.reasoning_buffer.push_str(&delta);
if let Some(header) = extract_first_bold(&self.reasoning_buffer) {
// Update the shimmer header to the extracted reasoning chunk header.
self.bottom_pane.update_status_header(header);
} else {
// Fallback while we don't yet have a bold header: leave existing header as-is.
}
self.mark_needs_redraw();
self.handle_streaming_delta(StreamKind::Reasoning, delta);
}
fn on_agent_reasoning_final(&mut self) {
// At the end of a reasoning block, record transcript-only content.
self.full_reasoning_buffer.push_str(&self.reasoning_buffer);
if !self.full_reasoning_buffer.is_empty() {
self.add_to_history(history_cell::new_reasoning_block(
self.full_reasoning_buffer.clone(),
&self.config,
));
}
self.reasoning_buffer.clear();
self.full_reasoning_buffer.clear();
fn on_agent_reasoning_final(&mut self, text: String) {
let sink = AppEventHistorySink(self.app_event_tx.clone());
let finished = self.stream.apply_final_reasoning(&text, &sink);
self.last_stream_kind = Some(StreamKind::Reasoning);
self.handle_if_stream_finished(finished);
self.mark_needs_redraw();
}
fn on_reasoning_section_break(&mut self) {
// Start a new reasoning block for header extraction and accumulate transcript.
self.full_reasoning_buffer.push_str(&self.reasoning_buffer);
self.full_reasoning_buffer.push_str("\n\n");
self.reasoning_buffer.clear();
let sink = AppEventHistorySink(self.app_event_tx.clone());
self.stream.insert_reasoning_section_break(&sink);
}
// Raw reasoning uses the same flow as summarized reasoning
@@ -203,8 +176,7 @@ impl ChatWidget {
self.bottom_pane.clear_ctrl_c_quit_hint();
self.bottom_pane.set_task_running(true);
self.stream.reset_headers_for_new_turn();
self.full_reasoning_buffer.clear();
self.reasoning_buffer.clear();
self.last_stream_kind = None;
self.mark_needs_redraw();
}
@@ -213,7 +185,9 @@ impl ChatWidget {
// without emitting stray headers for other streams.
if self.stream.is_write_cycle_active() {
let sink = AppEventHistorySink(self.app_event_tx.clone());
let _ = self.stream.finalize(true, &sink);
if let Some(kind) = self.last_stream_kind {
let _ = self.stream.finalize(kind, true, &sink);
}
}
// Mark task stopped and request redraw now that all content is in history.
self.bottom_pane.set_task_running(false);
@@ -232,7 +206,7 @@ impl ChatWidget {
}
fn on_error(&mut self, message: String) {
self.add_to_history(history_cell::new_error_event(message));
self.add_to_history(&history_cell::new_error_event(message));
self.bottom_pane.set_task_running(false);
self.running_commands.clear();
self.stream.clear_all();
@@ -240,7 +214,7 @@ impl ChatWidget {
}
fn on_plan_update(&mut self, update: codex_core::plan_tool::UpdatePlanArgs) {
self.add_to_history(history_cell::new_plan_update(update));
self.add_to_history(&history_cell::new_plan_update(update));
}
fn on_exec_approval_request(&mut self, id: String, ev: ExecApprovalRequestEvent) {
@@ -275,7 +249,7 @@ impl ChatWidget {
}
fn on_patch_apply_begin(&mut self, event: PatchApplyBeginEvent) {
self.add_to_history(history_cell::new_patch_event(
self.add_to_history(&history_cell::new_patch_event(
PatchEventType::ApplyBegin {
auto_approved: event.auto_approved,
},
@@ -330,12 +304,6 @@ impl ChatWidget {
fn on_background_event(&mut self, message: String) {
debug!("BackgroundEvent: {message}");
}
fn on_stream_error(&mut self, message: String) {
// Show stream errors in the transcript so users see retry/backoff info.
self.add_to_history(history_cell::new_stream_error_event(message));
self.mark_needs_redraw();
}
/// Periodic tick to commit at most one queued line to history with a small delay,
/// animating the output.
pub(crate) fn on_commit_tick(&mut self) {
@@ -381,9 +349,10 @@ impl ChatWidget {
}
#[inline]
fn handle_streaming_delta(&mut self, delta: String) {
fn handle_streaming_delta(&mut self, kind: StreamKind, delta: String) {
let sink = AppEventHistorySink(self.app_event_tx.clone());
self.stream.begin(&sink);
self.stream.begin(kind, &sink);
self.last_stream_kind = Some(kind);
self.stream.push_and_maybe_commit(&delta, &sink);
self.mark_needs_redraw();
}
@@ -408,7 +377,7 @@ impl ChatWidget {
self.active_exec_cell = None;
let pending = std::mem::take(&mut self.pending_exec_completions);
for (command, parsed, output) in pending {
self.add_to_history(history_cell::new_completed_exec_command(
self.add_to_history(&history_cell::new_completed_exec_command(
command, parsed, output,
));
}
@@ -420,9 +389,9 @@ impl ChatWidget {
event: codex_core::protocol::PatchApplyEndEvent,
) {
if event.success {
self.add_to_history(history_cell::new_patch_apply_success(event.stdout));
self.add_to_history(&history_cell::new_patch_apply_success(event.stdout));
} else {
self.add_to_history(history_cell::new_patch_apply_failure(event.stderr));
self.add_to_history(&history_cell::new_patch_apply_failure(event.stderr));
}
}
@@ -444,7 +413,7 @@ impl ChatWidget {
ev: ApplyPatchApprovalRequestEvent,
) {
self.flush_answer_stream_with_separator();
self.add_to_history(history_cell::new_patch_event(
self.add_to_history(&history_cell::new_patch_event(
PatchEventType::ApprovalRequest,
ev.changes.clone(),
));
@@ -486,11 +455,11 @@ impl ChatWidget {
pub(crate) fn handle_mcp_begin_now(&mut self, ev: McpToolCallBeginEvent) {
self.flush_answer_stream_with_separator();
self.add_to_history(history_cell::new_active_mcp_tool_call(ev.invocation));
self.add_to_history(&history_cell::new_active_mcp_tool_call(ev.invocation));
}
pub(crate) fn handle_mcp_end_now(&mut self, ev: McpToolCallEndEvent) {
self.flush_answer_stream_with_separator();
self.add_boxed_history(history_cell::new_completed_mcp_tool_call(
self.add_to_history(&*history_cell::new_completed_mcp_tool_call(
80,
ev.invocation,
ev.duration,
@@ -527,7 +496,6 @@ impl ChatWidget {
pub(crate) fn new(
config: Config,
conversation_manager: Arc<ConversationManager>,
frame_requester: FrameRequester,
app_event_tx: AppEventSender,
initial_prompt: Option<String>,
initial_images: Vec<PathBuf>,
@@ -539,10 +507,8 @@ impl ChatWidget {
Self {
app_event_tx: app_event_tx.clone(),
frame_requester: frame_requester.clone(),
codex_op_tx,
bottom_pane: BottomPane::new(BottomPaneParams {
frame_requester,
app_event_tx,
has_input_focus: true,
enhanced_keys_supported,
@@ -557,13 +523,12 @@ impl ChatWidget {
total_token_usage: TokenUsage::default(),
last_token_usage: TokenUsage::default(),
stream: StreamController::new(config),
last_stream_kind: None,
running_commands: HashMap::new(),
pending_exec_completions: Vec::new(),
task_complete_pending: false,
interrupts: InterruptManager::new(),
needs_redraw: false,
reasoning_buffer: String::new(),
full_reasoning_buffer: String::new(),
session_id: None,
}
}
@@ -585,109 +550,10 @@ impl ChatWidget {
InputResult::Submitted(text) => {
self.submit_user_message(text.into());
}
InputResult::Command(cmd) => {
self.dispatch_command(cmd);
}
InputResult::None => {}
}
}
fn dispatch_command(&mut self, cmd: SlashCommand) {
match cmd {
SlashCommand::New => {
self.app_event_tx.send(AppEvent::NewSession);
}
SlashCommand::Init => {
// Guard: do not run if a task is active.
const INIT_PROMPT: &str = include_str!("../prompt_for_init_command.md");
self.submit_text_message(INIT_PROMPT.to_string());
}
SlashCommand::Compact => {
self.clear_token_usage();
self.app_event_tx.send(AppEvent::CodexOp(Op::Compact));
}
SlashCommand::Model => {
self.open_model_popup();
}
SlashCommand::Approvals => {
self.open_approvals_popup();
}
SlashCommand::Quit => {
self.app_event_tx.send(AppEvent::ExitRequest);
}
SlashCommand::Logout => {
if let Err(e) = codex_login::logout(&self.config.codex_home) {
tracing::error!("failed to logout: {e}");
}
self.app_event_tx.send(AppEvent::ExitRequest);
}
SlashCommand::Diff => {
self.add_diff_in_progress();
let tx = self.app_event_tx.clone();
tokio::spawn(async move {
let text = match get_git_diff().await {
Ok((is_git_repo, diff_text)) => {
if is_git_repo {
diff_text
} else {
"`/diff` — _not inside a git repository_".to_string()
}
}
Err(e) => format!("Failed to compute diff: {e}"),
};
tx.send(AppEvent::DiffResult(text));
});
}
SlashCommand::Mention => {
self.insert_str("@");
}
SlashCommand::Status => {
self.add_status_output();
}
SlashCommand::Mcp => {
self.add_mcp_output();
}
#[cfg(debug_assertions)]
SlashCommand::TestApproval => {
use codex_core::protocol::EventMsg;
use std::collections::HashMap;
use codex_core::protocol::ApplyPatchApprovalRequestEvent;
use codex_core::protocol::FileChange;
self.app_event_tx.send(AppEvent::CodexEvent(Event {
id: "1".to_string(),
// msg: EventMsg::ExecApprovalRequest(ExecApprovalRequestEvent {
// call_id: "1".to_string(),
// command: vec!["git".into(), "apply".into()],
// cwd: self.config.cwd.clone(),
// reason: Some("test".to_string()),
// }),
msg: EventMsg::ApplyPatchApprovalRequest(ApplyPatchApprovalRequestEvent {
call_id: "1".to_string(),
changes: HashMap::from([
(
PathBuf::from("/tmp/test.txt"),
FileChange::Add {
content: "test".to_string(),
},
),
(
PathBuf::from("/tmp/test2.txt"),
FileChange::Update {
unified_diff: "+test\n-test2".to_string(),
move_path: None,
},
),
]),
reason: None,
grant_root: Some(PathBuf::from("/tmp")),
}),
}));
}
}
}
pub(crate) fn handle_paste(&mut self, text: String) {
self.bottom_pane.handle_paste(text);
}
@@ -695,19 +561,14 @@ impl ChatWidget {
fn flush_active_exec_cell(&mut self) {
if let Some(active) = self.active_exec_cell.take() {
self.app_event_tx
.send(AppEvent::InsertHistoryCell(Box::new(active)));
.send(AppEvent::InsertHistory(active.display_lines()));
}
}
fn add_to_history(&mut self, cell: impl HistoryCell + 'static) {
fn add_to_history(&mut self, cell: &dyn HistoryCell) {
self.flush_active_exec_cell();
self.app_event_tx
.send(AppEvent::InsertHistoryCell(Box::new(cell)));
}
fn add_boxed_history(&mut self, cell: Box<dyn HistoryCell>) {
self.flush_active_exec_cell();
self.app_event_tx.send(AppEvent::InsertHistoryCell(cell));
.send(AppEvent::InsertHistory(cell.display_lines()));
}
fn submit_user_message(&mut self, user_message: UserMessage) {
@@ -743,7 +604,7 @@ impl ChatWidget {
// Only show the text portion in conversation history.
if !text.is_empty() {
self.add_to_history(history_cell::new_user_prompt(text.clone()));
self.add_to_history(&history_cell::new_user_prompt(text.clone()));
}
}
@@ -771,9 +632,9 @@ impl ChatWidget {
| EventMsg::AgentReasoningRawContentDelta(AgentReasoningRawContentDeltaEvent {
delta,
}) => self.on_agent_reasoning_delta(delta),
EventMsg::AgentReasoning(AgentReasoningEvent { .. })
| EventMsg::AgentReasoningRawContent(AgentReasoningRawContentEvent { .. }) => {
self.on_agent_reasoning_final()
EventMsg::AgentReasoning(AgentReasoningEvent { text })
| EventMsg::AgentReasoningRawContent(AgentReasoningRawContentEvent { text }) => {
self.on_agent_reasoning_final(text)
}
EventMsg::AgentReasoningSectionBreak(_) => self.on_reasoning_section_break(),
EventMsg::TaskStarted => self.on_task_started(),
@@ -798,7 +659,6 @@ impl ChatWidget {
EventMsg::BackgroundEvent(BackgroundEventEvent { message }) => {
self.on_background_event(message)
}
EventMsg::StreamError(StreamErrorEvent { message }) => self.on_stream_error(message),
}
// Coalesce redraws: issue at most one after handling the event
if self.needs_redraw {
@@ -808,7 +668,7 @@ impl ChatWidget {
}
fn request_redraw(&mut self) {
self.frame_requester.schedule_frame();
self.app_event_tx.send(AppEvent::RequestRedraw);
}
pub(crate) fn add_diff_in_progress(&mut self) {
@@ -820,12 +680,12 @@ impl ChatWidget {
pub(crate) fn add_diff_output(&mut self, diff_output: String) {
self.bottom_pane.set_task_running(false);
self.add_to_history(history_cell::new_diff_output(diff_output));
self.add_to_history(&history_cell::new_diff_output(diff_output));
self.mark_needs_redraw();
}
pub(crate) fn add_status_output(&mut self) {
self.add_to_history(history_cell::new_status_output(
self.add_to_history(&history_cell::new_status_output(
&self.config,
&self.total_token_usage,
&self.session_id,
@@ -873,57 +733,6 @@ impl ChatWidget {
);
}
/// Open a popup to choose the approvals mode (ask for approval policy + sandbox policy).
pub(crate) fn open_approvals_popup(&mut self) {
let current_approval = self.config.approval_policy;
let current_sandbox = self.config.sandbox_policy.clone();
let mut items: Vec<SelectionItem> = Vec::new();
let presets: Vec<ApprovalPreset> = builtin_approval_presets();
for preset in presets.into_iter() {
let is_current =
current_approval == preset.approval && current_sandbox == preset.sandbox;
let approval = preset.approval;
let sandbox = preset.sandbox.clone();
let name = preset.label.to_string();
let description = Some(preset.description.to_string());
let actions: Vec<SelectionAction> = vec![Box::new(move |tx| {
tx.send(AppEvent::CodexOp(Op::OverrideTurnContext {
cwd: None,
approval_policy: Some(approval),
sandbox_policy: Some(sandbox.clone()),
model: None,
effort: None,
summary: None,
}));
tx.send(AppEvent::UpdateAskForApprovalPolicy(approval));
tx.send(AppEvent::UpdateSandboxPolicy(sandbox.clone()));
})];
items.push(SelectionItem {
name,
description,
is_current,
actions,
});
}
self.bottom_pane.show_selection_view(
"Select Approval Mode".to_string(),
None,
Some("Press Enter to confirm or Esc to go back".to_string()),
items,
);
}
/// Set the approval policy in the widget's config copy.
pub(crate) fn set_approval_policy(&mut self, policy: AskForApproval) {
self.config.approval_policy = policy;
}
/// Set the sandbox policy in the widget's config copy.
pub(crate) fn set_sandbox_policy(&mut self, policy: SandboxPolicy) {
self.config.sandbox_policy = policy;
}
/// Set the reasoning effort in the widget's config copy.
pub(crate) fn set_reasoning_effort(&mut self, effort: ReasoningEffortConfig) {
self.config.model_reasoning_effort = effort;
@@ -936,7 +745,7 @@ impl ChatWidget {
pub(crate) fn add_mcp_output(&mut self) {
if self.config.mcp_servers.is_empty() {
self.add_to_history(history_cell::empty_mcp_output());
self.add_to_history(&history_cell::empty_mcp_output());
} else {
self.submit_op(Op::ListMcpTools);
}
@@ -984,7 +793,7 @@ impl ChatWidget {
}
fn on_list_mcp_tools(&mut self, ev: McpListToolsResponseEvent) {
self.add_to_history(history_cell::new_mcp_tools_output(&self.config, ev.tools));
self.add_to_history(&history_cell::new_mcp_tools_output(&self.config, ev.tools));
}
/// Programmatically submit a user text message as if typed in the
@@ -1016,7 +825,7 @@ impl ChatWidget {
}
}
impl WidgetRef for &ChatWidget {
impl WidgetRef for &ChatWidget<'_> {
fn render_ref(&self, area: Rect, buf: &mut Buffer) {
let [active_cell_area, bottom_pane_area] = self.layout_areas(area);
(&self.bottom_pane).render(bottom_pane_area, buf);
@@ -1063,35 +872,5 @@ fn add_token_usage(current_usage: &TokenUsage, new_usage: &TokenUsage) -> TokenU
}
}
// Extract the first bold (Markdown) element in the form **...** from `s`.
// Returns the inner text if found; otherwise `None`.
fn extract_first_bold(s: &str) -> Option<String> {
let bytes = s.as_bytes();
let mut i = 0usize;
while i + 1 < bytes.len() {
if bytes[i] == b'*' && bytes[i + 1] == b'*' {
let start = i + 2;
let mut j = start;
while j + 1 < bytes.len() {
if bytes[j] == b'*' && bytes[j + 1] == b'*' {
// Found closing **
let inner = &s[start..j];
let trimmed = inner.trim();
if !trimmed.is_empty() {
return Some(trimmed.to_string());
} else {
return None;
}
}
j += 1;
}
// No closing; stop searching (wait for more deltas)
return None;
}
i += 1;
}
None
}
#[cfg(test)]
mod tests;
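
As a quick reference for the `extract_first_bold` helper shown above, two illustrative calls (inputs invented for illustration, not taken from the crate's tests):

    assert_eq!(extract_first_bold("**Planning** next steps"), Some("Planning".to_string()));
    // No closing `**` yet, so the caller waits for more streamed deltas.
    assert_eq!(extract_first_bold("still streaming **unclosed"), None);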

View File

@@ -71,7 +71,7 @@ impl InterruptManager {
self.queue.push_back(QueuedInterrupt::PatchEnd(ev));
}
pub(crate) fn flush_all(&mut self, chat: &mut ChatWidget) {
pub(crate) fn flush_all(&mut self, chat: &mut ChatWidget<'_>) {
while let Some(q) = self.queue.pop_front() {
match q {
QueuedInterrupt::ExecApproval(id, ev) => chat.handle_exec_approval_now(id, ev),

View File

@@ -1,6 +1,10 @@
---
source: tui/src/chatwidget/tests.rs
assertion_line: 886
expression: combined
---
thinking
I will first analyze the request.
codex
Here is the result.

View File

@@ -2,5 +2,8 @@
source: tui/src/chatwidget/tests.rs
expression: combined
---
thinking
I will first analyze the request.
codex
Here is the result.

View File

@@ -19,7 +19,6 @@ use codex_core::protocol::ExecCommandEndEvent;
use codex_core::protocol::FileChange;
use codex_core::protocol::PatchApplyBeginEvent;
use codex_core::protocol::PatchApplyEndEvent;
use codex_core::protocol::StreamErrorEvent;
use codex_core::protocol::TaskCompleteEvent;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
@@ -31,6 +30,7 @@ use std::io::BufRead;
use std::io::BufReader;
use std::io::Read;
use std::path::PathBuf;
use std::sync::mpsc::channel;
use tokio::sync::mpsc::unbounded_channel;
fn test_config() -> Config {
@@ -45,7 +45,7 @@ fn test_config() -> Config {
#[test]
fn final_answer_without_newline_is_flushed_immediately() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Set up a VT100 test terminal to capture ANSI visual output
let width: u16 = 80;
@@ -73,7 +73,7 @@ fn final_answer_without_newline_is_flushed_immediately() {
});
// Drain history insertions and verify the final line is present.
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
assert!(
cells.iter().any(|lines| {
let s = lines
@@ -101,36 +101,27 @@ fn final_answer_without_newline_is_flushed_immediately() {
#[tokio::test(flavor = "current_thread")]
async fn helpers_are_available_and_do_not_panic() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let cfg = test_config();
let conversation_manager = Arc::new(ConversationManager::default());
let mut w = ChatWidget::new(
cfg,
conversation_manager,
crate::tui::FrameRequester::test_dummy(),
tx,
None,
Vec::new(),
false,
);
let mut w = ChatWidget::new(cfg, conversation_manager, tx, None, Vec::new(), false);
// Basic construction sanity.
let _ = &mut w;
}
// --- Helpers for tests that need direct construction and event draining ---
fn make_chatwidget_manual() -> (
ChatWidget,
tokio::sync::mpsc::UnboundedReceiver<AppEvent>,
ChatWidget<'static>,
std::sync::mpsc::Receiver<AppEvent>,
tokio::sync::mpsc::UnboundedReceiver<Op>,
) {
let (tx_raw, rx) = unbounded_channel::<AppEvent>();
let (tx_raw, rx) = channel::<AppEvent>();
let app_event_tx = AppEventSender::new(tx_raw);
let (op_tx, op_rx) = unbounded_channel::<Op>();
let cfg = test_config();
let bottom = BottomPane::new(BottomPaneParams {
app_event_tx: app_event_tx.clone(),
frame_requester: crate::tui::FrameRequester::test_dummy(),
has_input_focus: true,
enhanced_keys_supported: false,
placeholder_text: "Ask Codex to do anything".to_string(),
@@ -145,28 +136,24 @@ fn make_chatwidget_manual() -> (
total_token_usage: TokenUsage::default(),
last_token_usage: TokenUsage::default(),
stream: StreamController::new(cfg),
last_stream_kind: None,
running_commands: HashMap::new(),
pending_exec_completions: Vec::new(),
task_complete_pending: false,
interrupts: InterruptManager::new(),
needs_redraw: false,
reasoning_buffer: String::new(),
full_reasoning_buffer: String::new(),
session_id: None,
frame_requester: crate::tui::FrameRequester::test_dummy(),
};
(widget, rx, op_rx)
}
fn drain_insert_history(
rx: &mut tokio::sync::mpsc::UnboundedReceiver<AppEvent>,
rx: &std::sync::mpsc::Receiver<AppEvent>,
) -> Vec<Vec<ratatui::text::Line<'static>>> {
let mut out = Vec::new();
while let Ok(ev) = rx.try_recv() {
match ev {
AppEvent::InsertHistoryLines(lines) => out.push(lines),
AppEvent::InsertHistoryCell(cell) => out.push(cell.display_lines()),
_ => {}
if let AppEvent::InsertHistory(lines) = ev {
out.push(lines);
}
}
out
@@ -209,7 +196,7 @@ fn open_fixture(name: &str) -> std::fs::File {
#[test]
fn exec_history_cell_shows_working_then_completed() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Begin command
chat.handle_codex_event(Event {
@@ -239,7 +226,7 @@ fn exec_history_cell_shows_working_then_completed() {
}),
});
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
assert_eq!(
cells.len(),
1,
@@ -254,7 +241,7 @@ fn exec_history_cell_shows_working_then_completed() {
#[test]
fn exec_history_cell_shows_working_then_failed() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Begin command
chat.handle_codex_event(Event {
@@ -284,7 +271,7 @@ fn exec_history_cell_shows_working_then_failed() {
}),
});
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
assert_eq!(
cells.len(),
1,
@@ -299,7 +286,7 @@ fn exec_history_cell_shows_working_then_failed() {
#[tokio::test(flavor = "current_thread")]
async fn binary_size_transcript_matches_ideal_fixture() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Set up a VT100 test terminal to capture ANSI visual output
let width: u16 = 80;
@@ -340,25 +327,13 @@ async fn binary_size_transcript_matches_ideal_fixture() {
let ev: Event = serde_json::from_value(payload.clone()).expect("parse");
chat.handle_codex_event(ev);
while let Ok(app_ev) = rx.try_recv() {
match app_ev {
AppEvent::InsertHistoryLines(lines) => {
transcript.push_str(&lines_to_single_string(&lines));
crate::insert_history::insert_history_lines_to_writer(
&mut terminal,
&mut ansi,
lines,
);
}
AppEvent::InsertHistoryCell(cell) => {
let lines = cell.display_lines();
transcript.push_str(&lines_to_single_string(&lines));
crate::insert_history::insert_history_lines_to_writer(
&mut terminal,
&mut ansi,
lines,
);
}
_ => {}
if let AppEvent::InsertHistory(lines) = app_ev {
transcript.push_str(&lines_to_single_string(&lines));
crate::insert_history::insert_history_lines_to_writer(
&mut terminal,
&mut ansi,
lines,
);
}
}
}
@@ -369,25 +344,13 @@ async fn binary_size_transcript_matches_ideal_fixture() {
{
chat.on_commit_tick();
while let Ok(app_ev) = rx.try_recv() {
match app_ev {
AppEvent::InsertHistoryLines(lines) => {
transcript.push_str(&lines_to_single_string(&lines));
crate::insert_history::insert_history_lines_to_writer(
&mut terminal,
&mut ansi,
lines,
);
}
AppEvent::InsertHistoryCell(cell) => {
let lines = cell.display_lines();
transcript.push_str(&lines_to_single_string(&lines));
crate::insert_history::insert_history_lines_to_writer(
&mut terminal,
&mut ansi,
lines,
);
}
_ => {}
if let AppEvent::InsertHistory(lines) = app_ev {
transcript.push_str(&lines_to_single_string(&lines));
crate::insert_history::insert_history_lines_to_writer(
&mut terminal,
&mut ansi,
lines,
);
}
}
}
@@ -403,11 +366,6 @@ async fn binary_size_transcript_matches_ideal_fixture() {
.expect("read ideal-binary-response.txt");
// Normalize line endings for Windows vs. Unix checkouts
let ideal = ideal.replace("\r\n", "\n");
let ideal_first_line = ideal
.lines()
.find(|l| !l.trim().is_empty())
.unwrap_or("")
.to_string();
// Build the final VT100 visual by parsing the ANSI stream. Trim trailing spaces per line
// and drop trailing empty lines so the shape matches the ideal fixture exactly.
@@ -433,68 +391,22 @@ async fn binary_size_transcript_matches_ideal_fixture() {
while lines.last().is_some_and(|l| l.is_empty()) {
lines.pop();
}
// Compare only after the last session banner marker. Skip the transient
// 'thinking' header if present, and start from the first non-empty line
// of content that follows.
// Compare only after the last session banner marker, and start at the next 'thinking' line.
const MARKER_PREFIX: &str = ">_ You are using OpenAI Codex in ";
let last_marker_line_idx = lines
.iter()
.rposition(|l| l.starts_with(MARKER_PREFIX))
.expect("marker not found in visible output");
// Anchor to the first ideal line if present; otherwise use heuristics.
let start_idx = (last_marker_line_idx + 1..lines.len())
.find(|&idx| lines[idx].trim_start() == ideal_first_line)
.or_else(|| {
// Prefer the first assistant content line (blockquote '>' prefix) after the marker.
(last_marker_line_idx + 1..lines.len())
.find(|&idx| lines[idx].trim_start().starts_with('>'))
})
.unwrap_or_else(|| {
// Fallback: first non-empty, non-'thinking' line
(last_marker_line_idx + 1..lines.len())
.find(|&idx| {
let t = lines[idx].trim_start();
!t.is_empty() && t != "thinking"
})
.expect("no content line found after marker")
});
let thinking_line_idx = (last_marker_line_idx + 1..lines.len())
.find(|&idx| lines[idx].trim_start() == "thinking")
.expect("no 'thinking' line found after marker");
let mut compare_lines: Vec<String> = Vec::new();
// Ensure the first line is trimmed-left to match the fixture shape.
compare_lines.push(lines[start_idx].trim_start().to_string());
compare_lines.extend(lines[(start_idx + 1)..].iter().cloned());
// Ensure the first line is exactly 'thinking' without leading spaces to match the fixture
compare_lines.push(lines[thinking_line_idx].trim_start().to_string());
compare_lines.extend(lines[(thinking_line_idx + 1)..].iter().cloned());
let visible_after = compare_lines.join("\n");
// Normalize: drop a leading 'thinking' line if present in either side to
// avoid coupling to whether the reasoning header is rendered in history.
fn drop_leading_thinking(s: &str) -> String {
let mut it = s.lines();
let first = it.next();
let rest = it.collect::<Vec<_>>().join("\n");
if first.is_some_and(|l| l.trim() == "thinking") {
rest
} else {
s.to_string()
}
}
let visible_after = drop_leading_thinking(&visible_after);
let ideal = drop_leading_thinking(&ideal);
// Normalize: strip leading Markdown blockquote markers ('>' or '> ') which
// may be present in rendered transcript lines but not in the ideal text.
fn strip_blockquotes(s: &str) -> String {
s.lines()
.map(|l| {
l.strip_prefix("> ")
.or_else(|| l.strip_prefix('>'))
.unwrap_or(l)
})
.collect::<Vec<_>>()
.join("\n")
}
let visible_after = strip_blockquotes(&visible_after);
let ideal = strip_blockquotes(&ideal);
// Optionally update the fixture when env var is set
if std::env::var("UPDATE_IDEAL").as_deref() == Ok("1") {
let mut p = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
@@ -511,7 +423,7 @@ async fn binary_size_transcript_matches_ideal_fixture() {
#[test]
fn apply_patch_events_emit_history_cells() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// 1) Approval request -> proposed patch summary cell
let mut changes = HashMap::new();
@@ -531,7 +443,7 @@ fn apply_patch_events_emit_history_cells() {
id: "s1".into(),
msg: EventMsg::ApplyPatchApprovalRequest(ev),
});
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
assert!(!cells.is_empty(), "expected pending patch cell to be sent");
let blob = lines_to_single_string(cells.last().unwrap());
assert!(
@@ -556,7 +468,7 @@ fn apply_patch_events_emit_history_cells() {
id: "s1".into(),
msg: EventMsg::PatchApplyBegin(begin),
});
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
assert!(!cells.is_empty(), "expected applying patch cell to be sent");
let blob = lines_to_single_string(cells.last().unwrap());
assert!(
@@ -575,7 +487,7 @@ fn apply_patch_events_emit_history_cells() {
id: "s1".into(),
msg: EventMsg::PatchApplyEnd(end),
});
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
assert!(!cells.is_empty(), "expected applied patch cell to be sent");
let blob = lines_to_single_string(cells.last().unwrap());
assert!(
@@ -586,7 +498,7 @@ fn apply_patch_events_emit_history_cells() {
#[test]
fn apply_patch_approval_sends_op_with_submission_id() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Simulate receiving an approval request with a distinct submission id and call id
let mut changes = HashMap::new();
changes.insert(
@@ -627,7 +539,7 @@ fn apply_patch_approval_sends_op_with_submission_id() {
#[test]
fn apply_patch_full_flow_integration_like() {
let (mut chat, mut rx, mut op_rx) = make_chatwidget_manual();
let (mut chat, rx, mut op_rx) = make_chatwidget_manual();
// 1) Backend requests approval
let mut changes = HashMap::new();
@@ -743,7 +655,7 @@ fn apply_patch_untrusted_shows_approval_modal() {
#[test]
fn apply_patch_request_shows_diff_summary() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Ensure we are in OnRequest so an approval is surfaced
chat.config.approval_policy = codex_core::protocol::AskForApproval::OnRequest;
@@ -768,7 +680,7 @@ fn apply_patch_request_shows_diff_summary() {
});
// Drain history insertions and verify the diff summary is present
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
assert!(
!cells.is_empty(),
"expected a history cell with the proposed patch summary"
@@ -783,14 +695,14 @@ fn apply_patch_request_shows_diff_summary() {
// Per-file summary line should include the file path and counts
assert!(
blob.contains("README.md"),
blob.contains("README.md (+2 -0)"),
"missing per-file diff summary: {blob:?}"
);
}
#[test]
fn plan_update_renders_history_cell() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
let update = UpdatePlanArgs {
explanation: Some("Adapting plan".to_string()),
plan: vec![
@@ -812,7 +724,7 @@ fn plan_update_renders_history_cell() {
id: "sub-1".into(),
msg: EventMsg::PlanUpdate(update),
});
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
assert!(!cells.is_empty(), "expected plan update cell to be sent");
let blob = lines_to_single_string(cells.last().unwrap());
assert!(
@@ -825,27 +737,8 @@ fn plan_update_renders_history_cell() {
}
#[test]
fn stream_error_is_rendered_to_history() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let msg = "stream error: stream disconnected before completion: idle timeout waiting for SSE; retrying 1/5 in 211ms…";
chat.handle_codex_event(Event {
id: "sub-1".into(),
msg: EventMsg::StreamError(StreamErrorEvent {
message: msg.to_string(),
}),
});
let cells = drain_insert_history(&mut rx);
assert!(!cells.is_empty(), "expected a history cell for StreamError");
let blob = lines_to_single_string(cells.last().unwrap());
assert!(blob.contains(""));
assert!(blob.contains("stream error:"));
assert!(blob.contains("idle timeout waiting for SSE"));
}
#[test]
fn headers_emitted_on_stream_begin_for_answer_and_not_for_reasoning() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
fn headers_emitted_on_stream_begin_for_answer_and_reasoning() {
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Answer: no header until a newline commit
chat.handle_codex_event(Event {
@@ -856,7 +749,7 @@ fn headers_emitted_on_stream_begin_for_answer_and_not_for_reasoning() {
});
let mut saw_codex_pre = false;
while let Ok(ev) = rx.try_recv() {
if let AppEvent::InsertHistoryLines(lines) = ev {
if let AppEvent::InsertHistory(lines) = ev {
let s = lines
.iter()
.flat_map(|l| l.spans.iter())
@@ -884,7 +777,7 @@ fn headers_emitted_on_stream_begin_for_answer_and_not_for_reasoning() {
chat.on_commit_tick();
let mut saw_codex_post = false;
while let Ok(ev) = rx.try_recv() {
if let AppEvent::InsertHistoryLines(lines) = ev {
if let AppEvent::InsertHistory(lines) = ev {
let s = lines
.iter()
.flat_map(|l| l.spans.iter())
@@ -902,8 +795,8 @@ fn headers_emitted_on_stream_begin_for_answer_and_not_for_reasoning() {
"expected 'codex' header to be emitted after first newline commit"
);
// Reasoning: do NOT emit a history header; status text is updated instead
let (mut chat2, mut rx2, _op_rx2) = make_chatwidget_manual();
// Reasoning: header immediately
let (mut chat2, rx2, _op_rx2) = make_chatwidget_manual();
chat2.handle_codex_event(Event {
id: "sub-b".into(),
msg: EventMsg::AgentReasoningDelta(AgentReasoningDeltaEvent {
@@ -912,7 +805,7 @@ fn headers_emitted_on_stream_begin_for_answer_and_not_for_reasoning() {
});
let mut saw_thinking = false;
while let Ok(ev) = rx2.try_recv() {
if let AppEvent::InsertHistoryLines(lines) = ev {
if let AppEvent::InsertHistory(lines) = ev {
let s = lines
.iter()
.flat_map(|l| l.spans.iter())
@@ -926,14 +819,14 @@ fn headers_emitted_on_stream_begin_for_answer_and_not_for_reasoning() {
}
}
assert!(
!saw_thinking,
"reasoning deltas should not emit history headers"
saw_thinking,
"expected 'thinking' header to be emitted at stream start"
);
}
#[test]
fn multiple_agent_messages_in_single_turn_emit_multiple_headers() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Begin turn
chat.handle_codex_event(Event {
@@ -965,7 +858,7 @@ fn multiple_agent_messages_in_single_turn_emit_multiple_headers() {
}),
});
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
let mut header_count = 0usize;
let mut combined = String::new();
for lines in &cells {
@@ -1001,7 +894,7 @@ fn multiple_agent_messages_in_single_turn_emit_multiple_headers() {
#[test]
fn final_reasoning_then_message_without_deltas_are_rendered() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// No deltas; only final reasoning followed by final message.
chat.handle_codex_event(Event {
@@ -1018,7 +911,7 @@ fn final_reasoning_then_message_without_deltas_are_rendered() {
});
// Drain history and snapshot the combined visible content.
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
let combined = cells
.iter()
.map(|lines| lines_to_single_string(lines))
@@ -1028,7 +921,7 @@ fn final_reasoning_then_message_without_deltas_are_rendered() {
#[test]
fn deltas_then_same_final_message_are_rendered_snapshot() {
let (mut chat, mut rx, _op_rx) = make_chatwidget_manual();
let (mut chat, rx, _op_rx) = make_chatwidget_manual();
// Stream some reasoning deltas first.
chat.handle_codex_event(Event {
@@ -1079,7 +972,7 @@ fn deltas_then_same_final_message_are_rendered_snapshot() {
// Snapshot the combined visible content to ensure we render as expected
// when deltas are followed by the identical final message.
let cells = drain_insert_history(&mut rx);
let cells = drain_insert_history(&rx);
let combined = cells
.iter()
.map(|lines| lines_to_single_string(lines))
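
These tests repeatedly drain whatever AppEvents are already queued without blocking. A generic form of that drain loop, for reference (the helper name is illustrative; `std::sync::mpsc::Receiver::try_recv` is the call the tests rely on, and tokio's `UnboundedReceiver` offers an equivalent `try_recv` taken by `&mut self`):

    use std::sync::mpsc::Receiver;

    fn drain_all<T>(rx: &Receiver<T>) -> Vec<T> {
        let mut out = Vec::new();
        // Pull everything currently queued; stop as soon as the channel is empty.
        while let Ok(item) = rx.try_recv() {
            out.push(item);
        }
        out
    }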

View File

@@ -245,7 +245,7 @@ where
/// Index of the current buffer in the previous array
current: usize,
/// Whether the cursor is currently hidden
pub hidden_cursor: bool,
hidden_cursor: bool,
/// Area of the viewport
pub viewport_area: Rect,
/// Last known size of the terminal. Used to detect if the internal buffers have to be resized.

View File

@@ -130,20 +130,17 @@ pub(crate) fn create_diff_summary(
for (idx, f) in files.iter().enumerate() {
let mut spans: Vec<RtSpan<'static>> = Vec::new();
spans.push(RtSpan::raw(f.display_path.clone()));
// Show per-file +/- counts only when there are multiple files
if file_count > 1 {
spans.push(RtSpan::raw(" ("));
spans.push(RtSpan::styled(
format!("+{}", f.added),
Style::default().fg(Color::Green),
));
spans.push(RtSpan::raw(" "));
spans.push(RtSpan::styled(
format!("-{}", f.removed),
Style::default().fg(Color::Red),
));
spans.push(RtSpan::raw(")"));
}
spans.push(RtSpan::raw(" ("));
spans.push(RtSpan::styled(
format!("+{}", f.added),
Style::default().fg(Color::Green),
));
spans.push(RtSpan::raw(" "));
spans.push(RtSpan::styled(
format!("-{}", f.removed),
Style::default().fg(Color::Red),
));
spans.push(RtSpan::raw(")"));
let mut line = RtLine::from(spans);
let prefix = if idx == 0 { "" } else { " " };
@@ -212,18 +209,7 @@ fn render_patch_details(changes: &HashMap<PathBuf, FileChange>) -> Vec<RtLine<'s
move_path: _,
} => {
if let Ok(patch) = diffy::Patch::from_str(unified_diff) {
let mut is_first_hunk = true;
for h in patch.hunks() {
// Render a simple separator between non-contiguous hunks
// instead of diff-style @@ headers.
if !is_first_hunk {
out.push(RtLine::from(vec![
RtSpan::raw(" "),
RtSpan::styled("", style_dim()),
]));
}
is_first_hunk = false;
let mut old_ln = h.old_range().start();
let mut new_ln = h.new_range().start();
for l in h.lines() {
@@ -287,7 +273,8 @@ fn push_wrapped_diff_line(
// ("+"/"-" for inserts/deletes, or a space for context lines) so alignment
// stays consistent across all diff lines.
let gap_after_ln = SPACES_AFTER_LINE_NUMBER.saturating_sub(ln_str.len());
let prefix_cols = indent.len() + ln_str.len() + gap_after_ln;
let first_prefix_cols = indent.len() + ln_str.len() + gap_after_ln;
let cont_prefix_cols = indent.len() + ln_str.len() + gap_after_ln;
let mut first = true;
let (sign_opt, line_style) = match kind {
@@ -296,14 +283,16 @@ fn push_wrapped_diff_line(
DiffLineType::Context => (None, None),
};
let mut lines: Vec<RtLine<'static>> = Vec::new();
loop {
while !remaining_text.is_empty() {
let prefix_cols = if first {
first_prefix_cols
} else {
cont_prefix_cols
};
// Fit the content for the current terminal row:
// compute how many columns are available after the prefix, then split
// at a UTF-8 character boundary so this row's chunk fits exactly.
let available_content_cols = term_cols
.saturating_sub(if first { prefix_cols + 1 } else { prefix_cols })
.max(1);
let available_content_cols = term_cols.saturating_sub(prefix_cols).max(1);
let split_at_byte_index = remaining_text
.char_indices()
.nth(available_content_cols)
@@ -349,9 +338,6 @@ fn push_wrapped_diff_line(
}
lines.push(line);
}
if remaining_text.is_empty() {
break;
}
}
lines
}
@@ -441,72 +427,4 @@ mod tests {
// Render into a small terminal to capture the visual layout
snapshot_lines("wrap_behavior_insert", lines, DEFAULT_WRAP_COLS + 10, 8);
}
#[test]
fn ui_snapshot_single_line_replacement_counts() {
// Reproduce: one deleted line replaced by one inserted line, no extra context
let original = "# Codex CLI (Rust Implementation)\n";
let modified = "# Codex CLI (Rust Implementation) banana\n";
let patch = diffy::create_patch(original, modified).to_string();
let mut changes: HashMap<PathBuf, FileChange> = HashMap::new();
changes.insert(
PathBuf::from("README.md"),
FileChange::Update {
unified_diff: patch,
move_path: None,
},
);
let lines =
create_diff_summary("proposed patch", &changes, PatchEventType::ApprovalRequest);
snapshot_lines("single_line_replacement_counts", lines, 80, 8);
}
#[test]
fn ui_snapshot_blank_context_line() {
// Ensure a hunk that includes a blank context line at the beginning is rendered visibly
let original = "\nY\n";
let modified = "\nY changed\n";
let patch = diffy::create_patch(original, modified).to_string();
let mut changes: HashMap<PathBuf, FileChange> = HashMap::new();
changes.insert(
PathBuf::from("example.txt"),
FileChange::Update {
unified_diff: patch,
move_path: None,
},
);
let lines =
create_diff_summary("proposed patch", &changes, PatchEventType::ApprovalRequest);
snapshot_lines("blank_context_line", lines, 80, 10);
}
#[test]
fn ui_snapshot_vertical_ellipsis_between_hunks() {
// Create a patch with two separate hunks to ensure we render the vertical ellipsis (⋮)
let original =
"line 1\nline 2\nline 3\nline 4\nline 5\nline 6\nline 7\nline 8\nline 9\nline 10\n";
let modified = "line 1\nline two changed\nline 3\nline 4\nline 5\nline 6\nline 7\nline 8\nline nine changed\nline 10\n";
let patch = diffy::create_patch(original, modified).to_string();
let mut changes: HashMap<PathBuf, FileChange> = HashMap::new();
changes.insert(
PathBuf::from("example.txt"),
FileChange::Update {
unified_diff: patch,
move_path: None,
},
);
let lines =
create_diff_summary("proposed patch", &changes, PatchEventType::ApprovalRequest);
// Height is large enough to show both hunks and the separator
snapshot_lines("vertical_ellipsis_between_hunks", lines, 80, 16);
}
}
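
The wrapping loop above splits the remaining diff text at a UTF-8 character boundary so that at most the available number of columns lands on each row. A reduced sketch of just that splitting step (it equates one column with one `char` and ignores wide glyphs, matching the simplification in the code above; the function name is illustrative):

    // Split `text` so the head holds at most `max_cols` characters and the split
    // point is always a valid UTF-8 boundary.
    fn split_for_width(text: &str, max_cols: usize) -> (&str, &str) {
        match text.char_indices().nth(max_cols) {
            Some((byte_idx, _)) => text.split_at(byte_idx),
            None => (text, ""),
        }
    }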

View File

@@ -1,7 +1,6 @@
use crate::diff_render::create_diff_summary;
use crate::exec_command::relativize_to_home;
use crate::exec_command::strip_bash_lc_and_escape;
use crate::markdown::append_markdown;
use crate::slash_command::SlashCommand;
use crate::text_formatting::format_and_truncate_tool_result;
use base64::Engine;
@@ -12,7 +11,6 @@ use codex_core::config::Config;
use codex_core::plan_tool::PlanItemArg;
use codex_core::plan_tool::StepStatus;
use codex_core::plan_tool::UpdatePlanArgs;
use codex_core::project_doc::discover_project_doc_paths;
use codex_core::protocol::FileChange;
use codex_core::protocol::McpInvocation;
use codex_core::protocol::SandboxPolicy;
@@ -41,7 +39,7 @@ use std::time::Instant;
use tracing::error;
use uuid::Uuid;
#[derive(Clone, Debug)]
#[derive(Clone)]
pub(crate) struct CommandOutput {
pub(crate) exit_code: i32,
pub(crate) stdout: String,
@@ -56,13 +54,9 @@ pub(crate) enum PatchEventType {
/// Represents an event to display in the conversation history. Returns its
/// `Vec<Line<'static>>` representation to make it easier to display in a
/// scrollable list.
pub(crate) trait HistoryCell: std::fmt::Debug + Send + Sync {
pub(crate) trait HistoryCell {
fn display_lines(&self) -> Vec<Line<'static>>;
fn transcript_lines(&self) -> Vec<Line<'static>> {
self.display_lines()
}
fn desired_height(&self, width: u16) -> u16 {
Paragraph::new(Text::from(self.display_lines()))
.wrap(Wrap { trim: false })
@@ -72,7 +66,6 @@ pub(crate) trait HistoryCell: std::fmt::Debug + Send + Sync {
}
}
#[derive(Debug)]
pub(crate) struct PlainHistoryCell {
lines: Vec<Line<'static>>,
}
@@ -83,22 +76,6 @@ impl HistoryCell for PlainHistoryCell {
}
}
#[derive(Debug)]
pub(crate) struct TranscriptOnlyHistoryCell {
lines: Vec<Line<'static>>,
}
impl HistoryCell for TranscriptOnlyHistoryCell {
fn display_lines(&self) -> Vec<Line<'static>> {
Vec::new()
}
fn transcript_lines(&self) -> Vec<Line<'static>> {
self.lines.clone()
}
}
#[derive(Debug)]
pub(crate) struct ExecCell {
pub(crate) command: Vec<String>,
pub(crate) parsed: Vec<ParsedCommand>,
@@ -124,7 +101,6 @@ impl WidgetRef for &ExecCell {
}
}
#[derive(Debug)]
struct CompletedMcpToolCallWithImageOutput {
_image: DynamicImage,
}
@@ -192,8 +168,7 @@ pub(crate) fn new_session_info(
Line::from("".dim()),
Line::from(format!(" /init - {}", SlashCommand::Init.description()).dim()),
Line::from(format!(" /status - {}", SlashCommand::Status.description()).dim()),
Line::from(format!(" /approvals - {}", SlashCommand::Approvals.description()).dim()),
Line::from(format!(" /model - {}", SlashCommand::Model.description()).dim()),
Line::from(format!(" /diff - {}", SlashCommand::Diff.description()).dim()),
Line::from("".dim()),
];
PlainHistoryCell { lines }
@@ -562,54 +537,6 @@ pub(crate) fn new_status_output(
sandbox_name.into(),
]));
// AGENTS.md files discovered via core's project_doc logic
let agents_list = {
match discover_project_doc_paths(config) {
Ok(paths) => {
let mut rels: Vec<String> = Vec::new();
for p in paths {
let display = if let Some(parent) = p.parent() {
if parent == config.cwd {
"AGENTS.md".to_string()
} else {
let mut cur = config.cwd.as_path();
let mut ups = 0usize;
let mut reached = false;
while let Some(c) = cur.parent() {
if cur == parent {
reached = true;
break;
}
cur = c;
ups += 1;
}
if reached {
format!("{}AGENTS.md", "../".repeat(ups))
} else if let Ok(stripped) = p.strip_prefix(&config.cwd) {
stripped.display().to_string()
} else {
p.display().to_string()
}
}
} else {
p.display().to_string()
};
rels.push(display);
}
rels
}
Err(_) => Vec::new(),
}
};
if agents_list.is_empty() {
lines.push(Line::from(" • AGENTS files: (none)"));
} else {
lines.push(Line::from(vec![
" • AGENTS files: ".into(),
agents_list.join(", ").into(),
]));
}
lines.push(Line::from(""));
// 👤 Account (only if ChatGPT tokens exist), shown under the first block
@@ -714,15 +641,6 @@ pub(crate) fn empty_mcp_output() -> PlainHistoryCell {
Line::from(vec!["🔌 ".into(), "MCP Tools".bold()]),
Line::from(""),
Line::from(" • No MCP servers configured.".italic()),
Line::from(vec![
" See the ".into(),
Span::styled(
"\u{1b}]8;;https://github.com/openai/codex/blob/main/codex-rs/config.md#mcp_servers\u{7}MCP docs\u{1b}]8;;\u{7}",
Style::default().add_modifier(Modifier::UNDERLINED),
),
" to configure them.".into(),
])
.style(Style::default().add_modifier(Modifier::DIM)),
Line::from(""),
];
@@ -800,12 +718,6 @@ pub(crate) fn new_error_event(message: String) -> PlainHistoryCell {
PlainHistoryCell { lines }
}
pub(crate) fn new_stream_error_event(message: String) -> PlainHistoryCell {
let lines: Vec<Line<'static>> =
vec![vec!["".magenta().bold(), message.dim()].into(), "".into()];
PlainHistoryCell { lines }
}
/// Render a user-friendly plan update styled like a checkbox todo list.
pub(crate) fn new_plan_update(update: UpdatePlanArgs) -> PlainHistoryCell {
let UpdatePlanArgs { explanation, plan } = update;
@@ -1008,17 +920,6 @@ pub(crate) fn new_patch_apply_success(stdout: String) -> PlainHistoryCell {
PlainHistoryCell { lines }
}
pub(crate) fn new_reasoning_block(
full_reasoning_buffer: String,
config: &Config,
) -> TranscriptOnlyHistoryCell {
let mut lines: Vec<Line<'static>> = Vec::new();
lines.push(Line::from("thinking".magenta().italic()));
append_markdown(&full_reasoning_buffer, &mut lines, config);
lines.push(Line::from(""));
TranscriptOnlyHistoryCell { lines }
}
fn output_lines(
output: Option<&CommandOutput>,
only_err: bool,
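
The dim "MCP docs" hyperlink removed above embeds an OSC 8 escape sequence inline. Factored out for reference (the helper name is illustrative; the escape bytes are exactly the ones in the removed line):

    // OSC 8 hyperlink: ESC ] 8 ; ; <url> BEL <text> ESC ] 8 ; ; BEL
    fn osc8_link(url: &str, text: &str) -> String {
        format!("\u{1b}]8;;{url}\u{7}{text}\u{1b}]8;;\u{7}")
    }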

View File

@@ -22,7 +22,7 @@ use textwrap::Options as TwOptions;
use textwrap::WordSplitter;
/// Insert `lines` above the viewport.
pub(crate) fn insert_history_lines(terminal: &mut tui::Terminal, lines: Vec<Line>) {
pub(crate) fn insert_history_lines(terminal: &mut tui::Tui, lines: Vec<Line>) {
let mut out = std::io::stdout();
insert_history_lines_to_writer(terminal, &mut out, lines);
}
@@ -38,8 +38,9 @@ pub fn insert_history_lines_to_writer<B, W>(
W: Write,
{
let screen_size = terminal.backend().size().unwrap_or(Size::new(0, 0));
let cursor_pos = terminal.get_cursor_position().ok();
let mut area = terminal.viewport_area;
let mut area = terminal.get_frame().area();
// Pre-wrap lines using word-aware wrapping so terminal scrollback sees the same
// formatting as the TUI. This avoids character-level hard wrapping by the terminal.
@@ -103,14 +104,9 @@ pub fn insert_history_lines_to_writer<B, W>(
queue!(writer, ResetScrollRegion).ok();
// Restore the cursor position to where it was before we started.
queue!(
writer,
MoveTo(
terminal.last_known_cursor_pos.x,
terminal.last_known_cursor_pos.y
)
)
.ok();
if let Some(cursor_pos) = cursor_pos {
queue!(writer, MoveTo(cursor_pos.x, cursor_pos.y)).ok();
}
}
#[derive(Debug, Clone, PartialEq, Eq)]
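
The hunk above stops relying on the terminal's `last_known_cursor_pos` and instead restores a position captured just before the history lines are written, and only when one was actually available. A standalone sketch of that restore step using the same crossterm calls (the wrapper function and its name are illustrative):

    use std::io::Write;
    use crossterm::cursor::MoveTo;
    use crossterm::queue;

    // Restore the cursor only if a position was captured before the history lines
    // were written; otherwise leave it wherever the writes placed it.
    fn restore_cursor<W: Write>(out: &mut W, saved: Option<(u16, u16)>) -> std::io::Result<()> {
        if let Some((x, y)) = saved {
            queue!(out, MoveTo(x, y))?;
        }
        out.flush()
    }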

View File

@@ -49,7 +49,6 @@ mod slash_command;
mod status_indicator_widget;
mod streaming;
mod text_formatting;
mod transcript_app;
mod tui;
mod user_approval_widget;
@@ -65,11 +64,6 @@ use color_eyre::owo_colors::OwoColorize;
pub use cli::Cli;
use crate::onboarding::TrustDirectorySelection;
use crate::onboarding::onboarding_screen::OnboardingScreenArgs;
use crate::onboarding::onboarding_screen::run_onboarding_app;
use crate::tui::Tui;
// (tests access modules directly within the crate)
pub async fn run_main(
@@ -220,17 +214,47 @@ pub async fn run_main(
let _ = tracing_subscriber::registry().with(file_layer).try_init();
#[allow(clippy::print_stderr)]
#[cfg(not(debug_assertions))]
if let Some(latest_version) = updates::get_upgrade_version(&config) {
let current_version = env!("CARGO_PKG_VERSION");
let exe = std::env::current_exe()?;
let managed_by_npm = std::env::var_os("CODEX_MANAGED_BY_NPM").is_some();
eprintln!(
"{} {current_version} -> {latest_version}.",
"✨⬆️ Update available!".bold().cyan()
);
if managed_by_npm {
let npm_cmd = "npm install -g @openai/codex@latest";
eprintln!("Run {} to update.", npm_cmd.cyan().on_black());
} else if cfg!(target_os = "macos")
&& (exe.starts_with("/opt/homebrew") || exe.starts_with("/usr/local"))
{
let brew_cmd = "brew upgrade codex";
eprintln!("Run {} to update.", brew_cmd.cyan().on_black());
} else {
eprintln!(
"See {} for the latest releases and installation options.",
"https://github.com/openai/codex/releases/latest"
.cyan()
.on_black()
);
}
eprintln!("");
}
run_ratatui_app(cli, config, should_show_trust_screen)
.await
.map_err(|err| std::io::Error::other(err.to_string()))
}
async fn run_ratatui_app(
fn run_ratatui_app(
cli: Cli,
config: Config,
should_show_trust_screen: bool,
) -> color_eyre::Result<codex_core::protocol::TokenUsage> {
let mut config = config;
color_eyre::install()?;
// Forward panic reports through tracing so they appear in the UI status
@@ -242,92 +266,23 @@ async fn run_ratatui_app(
tracing::error!("panic: {info}");
prev_hook(info);
}));
let mut terminal = tui::init()?;
let mut terminal = tui::init(&config)?;
terminal.clear()?;
let mut tui = Tui::new(terminal);
// Show update banner in terminal history (instead of stderr) so it is visible
// within the TUI scrollback. Building spans keeps styling consistent.
#[cfg(not(debug_assertions))]
if let Some(latest_version) = updates::get_upgrade_version(&config) {
use ratatui::style::Stylize as _;
use ratatui::text::Line;
use ratatui::text::Span;
let current_version = env!("CARGO_PKG_VERSION");
let exe = std::env::current_exe()?;
let managed_by_npm = std::env::var_os("CODEX_MANAGED_BY_NPM").is_some();
let mut lines: Vec<Line<'static>> = Vec::new();
lines.push(Line::from(vec![
"✨⬆️ Update available!".bold().cyan(),
Span::raw(" "),
Span::raw(format!("{current_version} -> {latest_version}.")),
]));
if managed_by_npm {
let npm_cmd = "npm install -g @openai/codex@latest";
lines.push(Line::from(vec![
Span::raw("Run "),
npm_cmd.cyan(),
Span::raw(" to update."),
]));
} else if cfg!(target_os = "macos")
&& (exe.starts_with("/opt/homebrew") || exe.starts_with("/usr/local"))
{
let brew_cmd = "brew upgrade codex";
lines.push(Line::from(vec![
Span::raw("Run "),
brew_cmd.cyan(),
Span::raw(" to update."),
]));
} else {
lines.push(Line::from(vec![
Span::raw("See "),
"https://github.com/openai/codex/releases/latest".cyan(),
Span::raw(" for the latest releases and installation options."),
]));
}
lines.push(Line::from(""));
tui.insert_history_lines(lines);
}
// Initialize high-fidelity session event logging if enabled.
session_log::maybe_init(&config);
let Cli { prompt, images, .. } = cli;
let mut app = App::new(config.clone(), prompt, images, should_show_trust_screen);
let login_status = get_login_status(&config);
let should_show_onboarding =
should_show_onboarding(login_status, &config, should_show_trust_screen);
if should_show_onboarding {
let directory_trust_decision = run_onboarding_app(
OnboardingScreenArgs {
codex_home: config.codex_home.clone(),
cwd: config.cwd.clone(),
show_login_screen: should_show_login_screen(login_status, &config),
show_trust_screen: should_show_trust_screen,
login_status,
preferred_auth_method: config.preferred_auth_method,
},
&mut tui,
)
.await?;
if let Some(TrustDirectorySelection::Trust) = directory_trust_decision {
config.approval_policy = AskForApproval::OnRequest;
config.sandbox_policy = SandboxPolicy::new_workspace_write_policy();
}
}
let app_result = App::run(&mut tui, config, prompt, images).await;
let app_result = app.run(&mut terminal);
let usage = app.token_usage();
restore();
// Mark the end of the recorded session.
session_log::log_session_end();
// ignore error when collecting usage; report underlying error instead
app_result
app_result.map(|_| usage)
}
#[expect(
@@ -401,80 +356,3 @@ fn determine_repo_trust_state(
Ok(true)
}
}
fn should_show_onboarding(
login_status: LoginStatus,
config: &Config,
show_trust_screen: bool,
) -> bool {
if show_trust_screen {
return true;
}
should_show_login_screen(login_status, config)
}
fn should_show_login_screen(login_status: LoginStatus, config: &Config) -> bool {
// Only show the login screen for providers that actually require OpenAI auth
// (OpenAI or equivalents). For OSS/other providers, skip login entirely.
if !config.model_provider.requires_openai_auth {
return false;
}
match login_status {
LoginStatus::NotAuthenticated => true,
LoginStatus::AuthMode(method) => method != config.preferred_auth_method,
}
}
#[cfg(test)]
mod tests {
use super::*;
fn make_config(preferred: AuthMode) -> Config {
let mut cfg = Config::load_from_base_config_with_overrides(
ConfigToml::default(),
ConfigOverrides::default(),
std::env::temp_dir(),
)
.expect("load default config");
cfg.preferred_auth_method = preferred;
cfg
}
#[test]
fn shows_login_when_not_authenticated() {
let cfg = make_config(AuthMode::ChatGPT);
assert!(should_show_login_screen(
LoginStatus::NotAuthenticated,
&cfg
));
}
#[test]
fn shows_login_when_api_key_but_prefers_chatgpt() {
let cfg = make_config(AuthMode::ChatGPT);
assert!(should_show_login_screen(
LoginStatus::AuthMode(AuthMode::ApiKey),
&cfg
))
}
#[test]
fn hides_login_when_api_key_and_prefers_api_key() {
let cfg = make_config(AuthMode::ApiKey);
assert!(!should_show_login_screen(
LoginStatus::AuthMode(AuthMode::ApiKey),
&cfg
))
}
#[test]
fn hides_login_when_chatgpt_and_prefers_chatgpt() {
let cfg = make_config(AuthMode::ChatGPT);
assert!(!should_show_login_screen(
LoginStatus::AuthMode(AuthMode::ChatGPT),
&cfg
))
}
}

View File

@@ -45,6 +45,22 @@ impl MarkdownStreamCollector {
self.buffer.push_str(delta);
}
/// Insert a paragraph/section separator if one is not already present at the
/// end of the buffer. Ensures the next content starts after a blank line.
pub fn insert_section_break(&mut self) {
if self.buffer.is_empty() {
return;
}
if self.buffer.ends_with("\n\n") {
return;
}
if self.buffer.ends_with('\n') {
self.buffer.push('\n');
} else {
self.buffer.push_str("\n\n");
}
}
/// Render the full buffer and return only the newly completed logical lines
/// since the last commit. When the buffer does not end with a newline, the
/// final rendered line is considered incomplete and is not emitted.
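The two rules above (normalize the buffer tail to a single blank line, and only treat newline-terminated lines as complete) can be illustrated with a standalone sketch; the free function below is a hypothetical stand-in operating on a bare `String`, not the crate's `MarkdownStreamCollector`:

```rust
/// Append a section break so the next content starts after exactly one blank line.
/// Mirrors the rule above: no-op on an empty buffer or one already ending in "\n\n".
fn insert_section_break(buffer: &mut String) {
    if buffer.is_empty() || buffer.ends_with("\n\n") {
        return;
    }
    if buffer.ends_with('\n') {
        buffer.push('\n');
    } else {
        buffer.push_str("\n\n");
    }
}

fn main() {
    let mut buf = String::from("first paragraph");
    insert_section_break(&mut buf);
    assert_eq!(buf, "first paragraph\n\n");
    // Calling it again is a no-op: at most one blank line separates sections.
    insert_section_break(&mut buf);
    assert_eq!(buf, "first paragraph\n\n");
}
```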

View File

@@ -1,5 +1,3 @@
#![allow(clippy::unwrap_used)]
use codex_login::CLIENT_ID;
use codex_login::ServerOptions;
use codex_login::ShutdownHandle;
@@ -20,19 +18,19 @@ use ratatui::widgets::WidgetRef;
use ratatui::widgets::Wrap;
use codex_login::AuthMode;
use std::sync::RwLock;
use crate::LoginStatus;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use crate::onboarding::onboarding_screen::KeyboardHandler;
use crate::onboarding::onboarding_screen::StepStateProvider;
use crate::shimmer::shimmer_spans;
use crate::tui::FrameRequester;
use std::path::PathBuf;
use std::sync::Arc;
use super::onboarding_screen::StepState;
// no additional imports
#[derive(Clone)]
#[derive(Debug)]
pub(crate) enum SignInState {
PickMode,
ChatGptContinueInBrowser(ContinueInBrowserState),
@@ -42,17 +40,18 @@ pub(crate) enum SignInState {
EnvVarFound,
}
#[derive(Clone)]
#[derive(Debug)]
/// Used to manage the lifecycle of SpawnedLogin and ensure it gets cleaned up.
pub(crate) struct ContinueInBrowserState {
auth_url: String,
shutdown_flag: Option<ShutdownHandle>,
shutdown_handle: Option<ShutdownHandle>,
_login_wait_handle: Option<tokio::task::JoinHandle<()>>,
}
impl Drop for ContinueInBrowserState {
fn drop(&mut self) {
if let Some(handle) = &self.shutdown_flag {
handle.shutdown();
if let Some(flag) = &self.shutdown_handle {
flag.shutdown();
}
}
}
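A minimal sketch of the Drop-guard pattern used here, assuming only that shutdown is a callable side effect (the closure below is a hypothetical stand-in for `ShutdownHandle::shutdown`, not the `codex_login` API):

```rust
// Illustrative guard: whatever handle it holds gets shut down when the state is dropped.
struct ShutdownGuard {
    shutdown: Option<Box<dyn Fn() + Send>>,
}

impl Drop for ShutdownGuard {
    fn drop(&mut self) {
        if let Some(shutdown) = &self.shutdown {
            shutdown();
        }
    }
}

fn main() {
    let guard = ShutdownGuard {
        shutdown: Some(Box::new(|| println!("shutting down login server"))),
    };
    drop(guard); // prints once; dropping the state cleans up the background server
}
```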
@@ -70,32 +69,20 @@ impl KeyboardHandler for AuthModeWidget {
self.start_chatgpt_login();
}
KeyCode::Char('2') => self.verify_api_key(),
KeyCode::Enter => {
let sign_in_state = { (*self.sign_in_state.read().unwrap()).clone() };
match sign_in_state {
SignInState::PickMode => match self.highlighted_mode {
AuthMode::ChatGPT => {
self.start_chatgpt_login();
}
AuthMode::ApiKey => {
self.verify_api_key();
}
},
SignInState::EnvVarMissing => {
*self.sign_in_state.write().unwrap() = SignInState::PickMode;
}
SignInState::ChatGptSuccessMessage => {
*self.sign_in_state.write().unwrap() = SignInState::ChatGptSuccess;
}
_ => {}
KeyCode::Enter => match self.sign_in_state {
SignInState::PickMode => match self.highlighted_mode {
AuthMode::ChatGPT => self.start_chatgpt_login(),
AuthMode::ApiKey => self.verify_api_key(),
},
SignInState::EnvVarMissing => self.sign_in_state = SignInState::PickMode,
SignInState::ChatGptSuccessMessage => {
self.sign_in_state = SignInState::ChatGptSuccess
}
}
_ => {}
},
KeyCode::Esc => {
tracing::info!("Esc pressed");
let sign_in_state = { (*self.sign_in_state.read().unwrap()).clone() };
if matches!(sign_in_state, SignInState::ChatGptContinueInBrowser(_)) {
*self.sign_in_state.write().unwrap() = SignInState::PickMode;
self.request_frame.schedule_frame();
if matches!(self.sign_in_state, SignInState::ChatGptContinueInBrowser(_)) {
self.sign_in_state = SignInState::PickMode;
}
}
_ => {}
@@ -103,12 +90,12 @@ impl KeyboardHandler for AuthModeWidget {
}
}
#[derive(Clone)]
#[derive(Debug)]
pub(crate) struct AuthModeWidget {
pub request_frame: FrameRequester,
pub event_tx: AppEventSender,
pub highlighted_mode: AuthMode,
pub error: Option<String>,
pub sign_in_state: Arc<RwLock<SignInState>>,
pub sign_in_state: SignInState,
pub codex_home: PathBuf,
pub login_status: LoginStatus,
pub preferred_auth_method: AuthMode,
@@ -228,13 +215,14 @@ impl AuthModeWidget {
fn render_continue_in_browser(&self, area: Rect, buf: &mut Buffer) {
let mut spans = vec![Span::from("> ")];
// Schedule a follow-up frame to keep the shimmer animation going.
self.request_frame
.schedule_frame_in(std::time::Duration::from_millis(100));
self.event_tx
.send(AppEvent::ScheduleFrameIn(std::time::Duration::from_millis(
100,
)));
spans.extend(shimmer_spans("Finish signing in via your browser"));
let mut lines = vec![Line::from(spans), Line::from("")];
let sign_in_state = self.sign_in_state.read().unwrap();
if let SignInState::ChatGptContinueInBrowser(state) = &*sign_in_state
if let SignInState::ChatGptContinueInBrowser(state) = &self.sign_in_state
&& !state.auth_url.is_empty()
{
lines.push(Line::from(" If the link doesn't open automatically, open the following link to authenticate:"));
@@ -327,45 +315,35 @@ impl AuthModeWidget {
// If we're already authenticated with ChatGPT, don't start a new login;
// just proceed to the success message flow.
if matches!(self.login_status, LoginStatus::AuthMode(AuthMode::ChatGPT)) {
*self.sign_in_state.write().unwrap() = SignInState::ChatGptSuccess;
self.request_frame.schedule_frame();
self.sign_in_state = SignInState::ChatGptSuccess;
self.event_tx.send(AppEvent::RequestRedraw);
return;
}
self.error = None;
let opts = ServerOptions::new(self.codex_home.clone(), CLIENT_ID.to_string());
match run_login_server(opts) {
let server = run_login_server(opts);
match server {
Ok(child) => {
let sign_in_state = self.sign_in_state.clone();
let request_frame = self.request_frame.clone();
tokio::spawn(async move {
let auth_url = child.auth_url.clone();
{
*sign_in_state.write().unwrap() =
SignInState::ChatGptContinueInBrowser(ContinueInBrowserState {
auth_url,
shutdown_flag: Some(child.cancel_handle()),
});
}
request_frame.schedule_frame();
let r = child.block_until_done().await;
match r {
Ok(()) => {
*sign_in_state.write().unwrap() = SignInState::ChatGptSuccessMessage;
request_frame.schedule_frame();
}
_ => {
*sign_in_state.write().unwrap() = SignInState::PickMode;
// self.error = Some(e.to_string());
request_frame.schedule_frame();
}
}
let auth_url = child.auth_url.clone();
let shutdown_handle = child.cancel_handle();
let event_tx = self.event_tx.clone();
let join_handle = tokio::spawn(async move {
spawn_completion_poller(child, event_tx).await;
});
self.sign_in_state =
SignInState::ChatGptContinueInBrowser(ContinueInBrowserState {
auth_url,
shutdown_handle: Some(shutdown_handle),
_login_wait_handle: Some(join_handle),
});
self.event_tx.send(AppEvent::RequestRedraw);
}
Err(e) => {
*self.sign_in_state.write().unwrap() = SignInState::PickMode;
self.sign_in_state = SignInState::PickMode;
self.error = Some(e.to_string());
self.request_frame.schedule_frame();
self.event_tx.send(AppEvent::RequestRedraw);
}
}
}
@@ -375,18 +353,33 @@ impl AuthModeWidget {
if matches!(self.login_status, LoginStatus::AuthMode(AuthMode::ApiKey)) {
// We already have an API key configured (e.g., from auth.json or env),
// so mark this step complete immediately.
*self.sign_in_state.write().unwrap() = SignInState::EnvVarFound;
self.sign_in_state = SignInState::EnvVarFound;
} else {
*self.sign_in_state.write().unwrap() = SignInState::EnvVarMissing;
self.sign_in_state = SignInState::EnvVarMissing;
}
self.request_frame.schedule_frame();
self.event_tx.send(AppEvent::RequestRedraw);
}
}
async fn spawn_completion_poller(
child: codex_login::LoginServer,
event_tx: AppEventSender,
) -> tokio::task::JoinHandle<()> {
tokio::spawn(async move {
if let Ok(()) = child.block_until_done().await {
event_tx.send(AppEvent::OnboardingAuthComplete(Ok(())));
} else {
event_tx.send(AppEvent::OnboardingAuthComplete(Err(
"login failed".to_string()
)));
}
})
}
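The poller above is an instance of a common shape: await a long-running future on a background task and forward the outcome over a channel. A hedged sketch with stand-in types (`Event` here is not the real `AppEvent`, and the inline future stands in for `block_until_done`):

```rust
use tokio::sync::mpsc::unbounded_channel;

#[derive(Debug)]
enum Event {
    AuthComplete(Result<(), String>),
}

#[tokio::main]
async fn main() {
    let (tx, mut rx) = unbounded_channel::<Event>();

    // Stand-in for `child.block_until_done()`: any future resolving to a Result.
    let login = async { Ok::<(), String>(()) };

    tokio::spawn(async move {
        let result = login.await;
        let _ = tx.send(Event::AuthComplete(result));
    });

    if let Some(Event::AuthComplete(result)) = rx.recv().await {
        println!("login finished: {result:?}");
    }
}
```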
impl StepStateProvider for AuthModeWidget {
fn get_step_state(&self) -> StepState {
let sign_in_state = self.sign_in_state.read().unwrap();
match &*sign_in_state {
match &self.sign_in_state {
SignInState::PickMode
| SignInState::EnvVarMissing
| SignInState::ChatGptContinueInBrowser(_)
@@ -398,8 +391,7 @@ impl StepStateProvider for AuthModeWidget {
impl WidgetRef for AuthModeWidget {
fn render_ref(&self, area: Rect, buf: &mut Buffer) {
let sign_in_state = self.sign_in_state.read().unwrap();
match &*sign_in_state {
match self.sign_in_state {
SignInState::PickMode => {
self.render_pick_mode(area, buf);
}

View File

@@ -0,0 +1,34 @@
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;
use ratatui::widgets::WidgetRef;
use crate::app::ChatWidgetArgs;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use crate::onboarding::onboarding_screen::StepStateProvider;
use super::onboarding_screen::StepState;
use std::sync::Arc;
use std::sync::Mutex;
/// This doesn't render anything explicitly but serves as a signal that we made it to the end and
/// we should continue to the chat.
pub(crate) struct ContinueToChatWidget {
pub event_tx: AppEventSender,
pub chat_widget_args: Arc<Mutex<ChatWidgetArgs>>,
}
impl StepStateProvider for ContinueToChatWidget {
fn get_step_state(&self) -> StepState {
StepState::Complete
}
}
impl WidgetRef for &ContinueToChatWidget {
fn render_ref(&self, _area: Rect, _buf: &mut Buffer) {
if let Ok(args) = self.chat_widget_args.lock() {
self.event_tx
.send(AppEvent::OnboardingComplete(args.clone()));
}
}
}

View File

@@ -1,5 +1,5 @@
mod auth;
mod continue_to_chat;
pub mod onboarding_screen;
mod trust_directory;
pub use trust_directory::TrustDirectorySelection;
mod welcome;

View File

@@ -1,7 +1,5 @@
use codex_core::util::is_inside_git_repo;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyEventKind;
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;
use ratatui::prelude::Widget;
@@ -11,24 +9,25 @@ use ratatui::widgets::WidgetRef;
use codex_login::AuthMode;
use crate::LoginStatus;
use crate::app::ChatWidgetArgs;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use crate::onboarding::auth::AuthModeWidget;
use crate::onboarding::auth::SignInState;
use crate::onboarding::continue_to_chat::ContinueToChatWidget;
use crate::onboarding::trust_directory::TrustDirectorySelection;
use crate::onboarding::trust_directory::TrustDirectoryWidget;
use crate::onboarding::welcome::WelcomeWidget;
use crate::tui::FrameRequester;
use crate::tui::Tui;
use crate::tui::TuiEvent;
use color_eyre::eyre::Result;
use std::path::PathBuf;
use std::sync::Arc;
use std::sync::RwLock;
use std::sync::Mutex;
#[allow(clippy::large_enum_variant)]
enum Step {
Welcome(WelcomeWidget),
Auth(AuthModeWidget),
TrustDirectory(TrustDirectoryWidget),
ContinueToChat(ContinueToChatWidget),
}
pub(crate) trait KeyboardHandler {
@@ -46,42 +45,43 @@ pub(crate) trait StepStateProvider {
}
pub(crate) struct OnboardingScreen {
request_frame: FrameRequester,
event_tx: AppEventSender,
steps: Vec<Step>,
is_done: bool,
}
pub(crate) struct OnboardingScreenArgs {
pub event_tx: AppEventSender,
pub chat_widget_args: ChatWidgetArgs,
pub codex_home: PathBuf,
pub cwd: PathBuf,
pub show_trust_screen: bool,
pub show_login_screen: bool,
pub login_status: LoginStatus,
pub preferred_auth_method: AuthMode,
}
impl OnboardingScreen {
pub(crate) fn new(tui: &mut Tui, args: OnboardingScreenArgs) -> Self {
pub(crate) fn new(args: OnboardingScreenArgs) -> Self {
let OnboardingScreenArgs {
event_tx,
chat_widget_args,
codex_home,
cwd,
show_trust_screen,
show_login_screen,
login_status,
preferred_auth_method,
} = args;
let mut steps: Vec<Step> = vec![Step::Welcome(WelcomeWidget {
is_logged_in: !matches!(login_status, LoginStatus::NotAuthenticated),
})];
if show_login_screen {
steps.push(Step::Auth(AuthModeWidget {
request_frame: tui.frame_requester(),
event_tx: event_tx.clone(),
highlighted_mode: AuthMode::ChatGPT,
error: None,
sign_in_state: Arc::new(RwLock::new(SignInState::PickMode)),
sign_in_state: SignInState::PickMode,
codex_home: codex_home.clone(),
login_status,
preferred_auth_method,
preferred_auth_method: chat_widget_args.config.preferred_auth_method,
}))
}
let is_git_repo = is_inside_git_repo(&cwd);
@@ -91,6 +91,9 @@ impl OnboardingScreen {
// Default to not trusting the directory if it's not a git repo.
TrustDirectorySelection::DontTrust
};
// Share ChatWidgetArgs between steps so changes in the TrustDirectory step
// are reflected when continuing to chat.
let shared_chat_args = Arc::new(Mutex::new(chat_widget_args));
if show_trust_screen {
steps.push(Step::TrustDirectory(TrustDirectoryWidget {
cwd,
@@ -99,13 +102,39 @@ impl OnboardingScreen {
selection: None,
highlighted,
error: None,
chat_widget_args: shared_chat_args.clone(),
}))
}
steps.push(Step::ContinueToChat(ContinueToChatWidget {
event_tx: event_tx.clone(),
chat_widget_args: shared_chat_args,
}));
// TODO: add git warning.
Self {
request_frame: tui.frame_requester(),
steps,
is_done: false,
Self { event_tx, steps }
}
pub(crate) fn on_auth_complete(&mut self, result: Result<(), String>) {
let current_step = self.current_step_mut();
if let Some(Step::Auth(state)) = current_step {
match result {
Ok(()) => {
state.sign_in_state = SignInState::ChatGptSuccessMessage;
self.event_tx.send(AppEvent::RequestRedraw);
let tx1 = self.event_tx.clone();
let tx2 = self.event_tx.clone();
std::thread::spawn(move || {
std::thread::sleep(std::time::Duration::from_millis(150));
tx1.send(AppEvent::RequestRedraw);
std::thread::sleep(std::time::Duration::from_millis(200));
tx2.send(AppEvent::RequestRedraw);
});
}
Err(e) => {
state.sign_in_state = SignInState::PickMode;
state.error = Some(e);
self.event_tx.send(AppEvent::RequestRedraw);
}
}
}
}
@@ -139,57 +168,19 @@ impl OnboardingScreen {
out
}
pub(crate) fn is_done(&self) -> bool {
self.is_done
|| !self
.steps
.iter()
.any(|step| matches!(step.get_step_state(), StepState::InProgress))
}
pub fn directory_trust_decision(&self) -> Option<TrustDirectorySelection> {
fn current_step_mut(&mut self) -> Option<&mut Step> {
self.steps
.iter()
.find_map(|step| {
if let Step::TrustDirectory(TrustDirectoryWidget { selection, .. }) = step {
Some(*selection)
} else {
None
}
})
.flatten()
.iter_mut()
.find(|step| matches!(step.get_step_state(), StepState::InProgress))
}
}
impl KeyboardHandler for OnboardingScreen {
fn handle_key_event(&mut self, key_event: KeyEvent) {
match key_event {
KeyEvent {
code: KeyCode::Char('d'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
}
| KeyEvent {
code: KeyCode::Char('c'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
}
| KeyEvent {
code: KeyCode::Char('q'),
kind: KeyEventKind::Press,
..
} => {
self.is_done = true;
}
_ => {
if let Some(active_step) = self.current_steps_mut().into_iter().last() {
active_step.handle_key_event(key_event);
}
}
};
self.request_frame.schedule_frame();
if let Some(active_step) = self.current_steps_mut().into_iter().last() {
active_step.handle_key_event(key_event);
}
self.event_tx.send(AppEvent::RequestRedraw);
}
}
@@ -255,7 +246,7 @@ impl WidgetRef for &OnboardingScreen {
impl KeyboardHandler for Step {
fn handle_key_event(&mut self, key_event: KeyEvent) {
match self {
Step::Welcome(_) => (),
Step::Welcome(_) | Step::ContinueToChat(_) => (),
Step::Auth(widget) => widget.handle_key_event(key_event),
Step::TrustDirectory(widget) => widget.handle_key_event(key_event),
}
@@ -268,6 +259,7 @@ impl StepStateProvider for Step {
Step::Welcome(w) => w.get_step_state(),
Step::Auth(w) => w.get_step_state(),
Step::TrustDirectory(w) => w.get_step_state(),
Step::ContinueToChat(w) => w.get_step_state(),
}
}
}
@@ -284,39 +276,9 @@ impl WidgetRef for Step {
Step::TrustDirectory(widget) => {
widget.render_ref(area, buf);
}
}
}
}
pub(crate) async fn run_onboarding_app(
args: OnboardingScreenArgs,
tui: &mut Tui,
) -> Result<Option<crate::onboarding::TrustDirectorySelection>> {
use tokio_stream::StreamExt;
let mut onboarding_screen = OnboardingScreen::new(tui, args);
tui.draw(u16::MAX, |frame| {
frame.render_widget_ref(&onboarding_screen, frame.area());
})?;
let tui_events = tui.event_stream();
tokio::pin!(tui_events);
while !onboarding_screen.is_done() {
if let Some(event) = tui_events.next().await {
match event {
TuiEvent::Key(key_event) => {
onboarding_screen.handle_key_event(key_event);
}
TuiEvent::Draw => {
let _ = tui.draw(u16::MAX, |frame| {
frame.render_widget_ref(&onboarding_screen, frame.area());
});
}
_ => {}
Step::ContinueToChat(widget) => {
widget.render_ref(area, buf);
}
}
}
Ok(onboarding_screen.directory_trust_decision())
}

View File

@@ -1,6 +1,8 @@
use std::path::PathBuf;
use codex_core::config::set_project_trusted;
use codex_core::protocol::AskForApproval;
use codex_core::protocol::SandboxPolicy;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use ratatui::buffer::Buffer;
@@ -20,6 +22,9 @@ use crate::onboarding::onboarding_screen::KeyboardHandler;
use crate::onboarding::onboarding_screen::StepStateProvider;
use super::onboarding_screen::StepState;
use crate::app::ChatWidgetArgs;
use std::sync::Arc;
use std::sync::Mutex;
pub(crate) struct TrustDirectoryWidget {
pub codex_home: PathBuf,
@@ -28,10 +33,11 @@ pub(crate) struct TrustDirectoryWidget {
pub selection: Option<TrustDirectorySelection>,
pub highlighted: TrustDirectorySelection,
pub error: Option<String>,
pub chat_widget_args: Arc<Mutex<ChatWidgetArgs>>,
}
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum TrustDirectorySelection {
pub(crate) enum TrustDirectorySelection {
Trust,
DontTrust,
}
@@ -150,6 +156,13 @@ impl TrustDirectoryWidget {
// self.error = Some("Failed to set project trusted".to_string());
}
// Update the in-memory chat config for this session to a more permissive
// policy suitable for a trusted workspace.
if let Ok(mut args) = self.chat_widget_args.lock() {
args.config.approval_policy = AskForApproval::OnRequest;
args.config.sandbox_policy = SandboxPolicy::new_workspace_write_policy();
}
self.selection = Some(TrustDirectorySelection::Trust);
}

View File

@@ -132,16 +132,35 @@ pub(crate) fn log_inbound_app_event(event: &AppEvent) {
AppEvent::CodexEvent(ev) => {
write_record("to_tui", "codex_event", ev);
}
AppEvent::NewSession => {
AppEvent::KeyEvent(k) => {
let value = json!({
"ts": now_ts(),
"dir": "to_tui",
"kind": "new_session",
"kind": "key_event",
"event": format!("{:?}", k),
});
LOGGER.write_json_line(value);
}
AppEvent::Paste(s) => {
let value = json!({
"ts": now_ts(),
"dir": "to_tui",
"kind": "paste",
"text": s,
});
LOGGER.write_json_line(value);
}
AppEvent::DispatchCommand(cmd) => {
let value = json!({
"ts": now_ts(),
"dir": "to_tui",
"kind": "slash_command",
"command": format!("{:?}", cmd),
});
LOGGER.write_json_line(value);
}
// Internal UI events; still log for fidelity, but avoid heavy payloads.
AppEvent::InsertHistoryLines(lines) => {
AppEvent::InsertHistory(lines) => {
let value = json!({
"ts": now_ts(),
"dir": "to_tui",
@@ -150,15 +169,6 @@ pub(crate) fn log_inbound_app_event(event: &AppEvent) {
});
LOGGER.write_json_line(value);
}
AppEvent::InsertHistoryCell(cell) => {
let value = json!({
"ts": now_ts(),
"dir": "to_tui",
"kind": "insert_history_cell",
"lines": cell.transcript_lines().len(),
});
LOGGER.write_json_line(value);
}
AppEvent::StartFileSearch(query) => {
let value = json!({
"ts": now_ts(),

View File

@@ -22,7 +22,7 @@ pub(crate) fn shimmer_spans(text: &str) -> Vec<Span<'static>> {
// Use time-based sweep synchronized to process start.
let padding = 10usize;
let period = chars.len() + padding * 2;
let sweep_seconds = 2.0f32;
let sweep_seconds = 2.5f32;
let pos_f =
(elapsed_since_start().as_secs_f32() % sweep_seconds) / sweep_seconds * (period as f32);
let pos = pos_f as usize;
@@ -63,11 +63,9 @@ pub(crate) fn shimmer_spans(text: &str) -> Vec<Span<'static>> {
}
fn color_for_level(level: u8) -> Style {
// Tune thresholds so the edges of the shimmer band appear dim
// in fallback mode (no true color support).
if level < 160 {
if level < 144 {
Style::default().add_modifier(Modifier::DIM)
} else if level < 224 {
} else if level < 208 {
Style::default()
} else {
Style::default().add_modifier(Modifier::BOLD)
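The shimmer position is a pure function of elapsed time: the band sweeps across `len + 2 * padding` cells once every `sweep_seconds`. A small sketch of that mapping, treating the constants shown above as assumptions:

```rust
use std::time::Duration;

/// Index of the shimmer band's center for a given elapsed time.
fn sweep_pos(elapsed: Duration, text_len: usize) -> usize {
    let padding = 10usize;
    let period = text_len + padding * 2;
    let sweep_seconds = 2.5f32; // one full pass every 2.5s
    let frac = (elapsed.as_secs_f32() % sweep_seconds) / sweep_seconds;
    (frac * period as f32) as usize
}

fn main() {
    let len = "Working".chars().count();
    // Halfway through a sweep the band sits mid-period.
    assert_eq!(sweep_pos(Duration::from_millis(1250), len), (len + 20) / 2);
    println!("pos at t=1.25s: {}", sweep_pos(Duration::from_millis(1250), len));
}
```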

View File

@@ -13,7 +13,6 @@ pub enum SlashCommand {
// DO NOT ALPHA-SORT! Enum order is presentation order in the popup, so
// more frequently used commands should be listed first.
Model,
Approvals,
New,
Init,
Compact,
@@ -38,8 +37,7 @@ impl SlashCommand {
SlashCommand::Diff => "show git diff (including untracked files)",
SlashCommand::Mention => "mention a file",
SlashCommand::Status => "show current session configuration and token usage",
SlashCommand::Model => "choose what model and reasoning effort to use",
SlashCommand::Approvals => "choose what Codex can do without approval",
SlashCommand::Model => "choose a model preset (model + reasoning effort)",
SlashCommand::Mcp => "list configured MCP tools",
SlashCommand::Logout => "log out of Codex",
#[cfg(debug_assertions)]

View File

@@ -3,7 +3,7 @@ source: tui/src/diff_render.rs
expression: terminal.backend()
---
"proposed patch to 1 file (+2 -0) "
" └ README.md "
" └ README.md (+2 -0) "
" 1 +first line "
" 2 +second line "
" "

View File

@@ -1,14 +0,0 @@
---
source: tui/src/diff_render.rs
expression: terminal.backend()
---
"proposed patch to 1 file (+1 -1) "
" └ example.txt "
" 1 "
" 2 -Y "
" 2 +Y changed "
" "
" "
" "
" "
" "

View File

@@ -1,12 +0,0 @@
---
source: tui/src/diff_render.rs
expression: terminal.backend()
---
"proposed patch to 1 file (+1 -1) "
" └ README.md "
" 1 -# Codex CLI (Rust Implementation) "
" 1 +# Codex CLI (Rust Implementation) banana "
" "
" "
" "
" "

View File

@@ -1,9 +1,10 @@
---
source: tui/src/diff_render.rs
assertion_line: 380
expression: terminal.backend()
---
"proposed patch to 1 file (+1 -1) "
" └ src/lib.rs → src/lib_new.rs "
" └ src/lib.rs → src/lib_new.rs (+1 -1) "
" 1 line one "
" 2 -line two "
" 2 +line two changed "

View File

@@ -1,20 +0,0 @@
---
source: tui/src/diff_render.rs
expression: terminal.backend()
---
"proposed patch to 1 file (+2 -2) "
" └ example.txt "
" 1 line 1 "
" 2 -line 2 "
" 2 +line two changed "
" 3 line 3 "
" 4 line 4 "
" 5 line 5 "
" ⋮ "
" 6 line 6 "
" 7 line 7 "
" 8 line 8 "
" 9 -line 9 "
" 9 +line nine changed "
" 10 line 10 "
" "

View File

@@ -1,9 +1,10 @@
---
source: tui/src/diff_render.rs
assertion_line: 380
expression: terminal.backend()
---
" 1 +this is a very long line that should wrap across multiple terminal co "
" lumns and continue "
" 1 +this is a very long line that should wrap across multiple terminal col "
" umns and continue "
" "
" "
" "

View File

@@ -19,7 +19,6 @@ use unicode_width::UnicodeWidthStr;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use crate::shimmer::shimmer_spans;
use crate::tui::FrameRequester;
// We render the live text using markdown so it visually matches the history
// cells. Before rendering we strip any ANSI escape sequences to avoid writing
@@ -30,8 +29,6 @@ pub(crate) struct StatusIndicatorWidget {
/// Latest text to display (truncated to the available width at render
/// time).
text: String,
/// Animated header text (defaults to "Working").
header: String,
/// Animation state: reveal target `text` progressively like a typewriter.
/// We compute the currently visible prefix length based on the current
@@ -42,21 +39,18 @@ pub(crate) struct StatusIndicatorWidget {
reveal_len_at_base: usize,
start_time: Instant,
app_event_tx: AppEventSender,
frame_requester: FrameRequester,
}
impl StatusIndicatorWidget {
pub(crate) fn new(app_event_tx: AppEventSender, frame_requester: FrameRequester) -> Self {
pub(crate) fn new(app_event_tx: AppEventSender) -> Self {
Self {
text: String::from("waiting for model"),
header: String::from("Working"),
last_target_len: 0,
base_frame: 0,
reveal_len_at_base: 0,
start_time: Instant::now(),
app_event_tx,
frame_requester,
}
}
@@ -98,13 +92,6 @@ impl StatusIndicatorWidget {
self.app_event_tx.send(AppEvent::CodexOp(Op::Interrupt));
}
/// Update the animated header label (left of the brackets).
pub(crate) fn update_header(&mut self, header: String) {
if self.header != header {
self.header = header;
}
}
/// Reset the animation and start revealing `text` from the beginning.
#[cfg(test)]
pub(crate) fn restart_with_text(&mut self, text: String) {
@@ -156,13 +143,13 @@ impl WidgetRef for StatusIndicatorWidget {
}
// Schedule next animation frame.
self.frame_requester
.schedule_frame_in(Duration::from_millis(32));
self.app_event_tx
.send(AppEvent::ScheduleFrameIn(Duration::from_millis(100)));
let idx = self.current_frame();
let elapsed = self.start_time.elapsed().as_secs();
let shown_now = self.current_shown_len(idx);
let status_prefix: String = self.text.chars().take(shown_now).collect();
let animated_spans = shimmer_spans(&self.header);
let animated_spans = shimmer_spans("Working");
// Plain rendering: no borders or padding so the live cell is visually indistinguishable from terminal scrollback.
let inner_width = area.width as usize;
@@ -226,13 +213,13 @@ mod tests {
use super::*;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
use tokio::sync::mpsc::unbounded_channel;
use std::sync::mpsc::channel;
#[test]
fn renders_without_left_border_or_padding() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut w = StatusIndicatorWidget::new(tx, crate::tui::FrameRequester::test_dummy());
let mut w = StatusIndicatorWidget::new(tx);
w.restart_with_text("Hello".to_string());
let area = ratatui::layout::Rect::new(0, 0, 30, 1);
@@ -248,9 +235,9 @@ mod tests {
#[test]
fn working_header_is_present_on_last_line() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut w = StatusIndicatorWidget::new(tx, crate::tui::FrameRequester::test_dummy());
let mut w = StatusIndicatorWidget::new(tx);
w.restart_with_text("Hi".to_string());
// Ensure some frames elapse so we get a stable state.
std::thread::sleep(std::time::Duration::from_millis(120));
@@ -269,9 +256,9 @@ mod tests {
#[test]
fn header_starts_at_expected_position() {
let (tx_raw, _rx) = unbounded_channel::<AppEvent>();
let (tx_raw, _rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let mut w = StatusIndicatorWidget::new(tx, crate::tui::FrameRequester::test_dummy());
let mut w = StatusIndicatorWidget::new(tx);
w.restart_with_text("Hello".to_string());
std::thread::sleep(std::time::Duration::from_millis(120));

View File

@@ -2,6 +2,7 @@ use codex_core::config::Config;
use ratatui::text::Line;
use super::HeaderEmitter;
use super::StreamKind;
use super::StreamState;
/// Sink for history insertions and animation control.
@@ -17,7 +18,7 @@ pub(crate) struct AppEventHistorySink(pub(crate) crate::app_event_sender::AppEve
impl HistorySink for AppEventHistorySink {
fn insert_history(&self, lines: Vec<Line<'static>>) {
self.0
.send(crate::app_event::AppEvent::InsertHistoryLines(lines))
.send(crate::app_event::AppEvent::InsertHistory(lines))
}
fn start_commit_animation(&self) {
self.0
@@ -35,8 +36,8 @@ type Lines = Vec<Line<'static>>;
pub(crate) struct StreamController {
config: Config,
header: HeaderEmitter,
state: StreamState,
active: bool,
states: [StreamState; 2],
current_stream: Option<StreamKind>,
finishing_after_drain: bool,
}
@@ -45,8 +46,8 @@ impl StreamController {
Self {
config,
header: HeaderEmitter::new(),
state: StreamState::new(),
active: false,
states: [StreamState::new(), StreamState::new()],
current_stream: None,
finishing_after_drain: false,
}
}
@@ -56,18 +57,29 @@ impl StreamController {
}
pub(crate) fn is_write_cycle_active(&self) -> bool {
self.active
self.current_stream.is_some()
}
pub(crate) fn clear_all(&mut self) {
self.state.clear();
self.active = false;
self.states.iter_mut().for_each(|s| s.clear());
self.current_stream = None;
self.finishing_after_drain = false;
// leave header state unchanged; caller decides when to reset
}
fn emit_header_if_needed(&mut self, out_lines: &mut Lines) -> bool {
self.header.maybe_emit(out_lines)
#[inline]
fn idx(kind: StreamKind) -> usize {
kind as usize
}
fn state(&self, kind: StreamKind) -> &StreamState {
&self.states[Self::idx(kind)]
}
fn state_mut(&mut self, kind: StreamKind) -> &mut StreamState {
&mut self.states[Self::idx(kind)]
}
fn emit_header_if_needed(&mut self, kind: StreamKind, out_lines: &mut Lines) -> bool {
self.header.maybe_emit(kind, out_lines)
}
#[inline]
@@ -81,23 +93,56 @@ impl StreamController {
}
}
/// Begin an answer stream. Does not emit header yet; it is emitted on first commit.
pub(crate) fn begin(&mut self, _sink: &impl HistorySink) {
// Starting a new stream cancels any pending finish-from-previous-stream animation.
if !self.active {
self.header.reset_for_stream();
/// Begin a stream, flushing previously completed lines from any other
/// active stream to maintain ordering.
pub(crate) fn begin(&mut self, kind: StreamKind, sink: &impl HistorySink) {
if let Some(current) = self.current_stream
&& current != kind
{
// Synchronously flush completed lines from previous stream.
let cfg = self.config.clone();
let prev_state = self.state_mut(current);
let newly_completed = prev_state.collector.commit_complete_lines(&cfg);
if !newly_completed.is_empty() {
prev_state.enqueue(newly_completed);
}
let step = prev_state.drain_all();
if !step.history.is_empty() {
let mut lines: Lines = Vec::new();
self.emit_header_if_needed(current, &mut lines);
lines.extend(step.history);
// Ensure at most one trailing blank after the flushed block.
Self::ensure_single_trailing_blank(&mut lines);
sink.insert_history(lines);
}
self.current_stream = None;
}
if self.current_stream != Some(kind) {
let prev = self.current_stream;
self.current_stream = Some(kind);
// Starting a new stream cancels any pending finish-from-previous-stream animation.
self.finishing_after_drain = false;
if prev.is_some() {
self.header.reset_for_stream(kind);
}
// Emit header immediately for reasoning; for answers, defer to first commit.
if matches!(kind, StreamKind::Reasoning) {
let mut header_lines = Vec::new();
if self.emit_header_if_needed(kind, &mut header_lines) {
sink.insert_history(header_lines);
}
}
}
self.finishing_after_drain = false;
self.active = true;
}
/// Push a delta; if it contains a newline, commit completed lines and start animation.
pub(crate) fn push_and_maybe_commit(&mut self, delta: &str, sink: &impl HistorySink) {
if !self.active {
let Some(kind) = self.current_stream else {
return;
}
};
let cfg = self.config.clone();
let state = &mut self.state;
let state = self.state_mut(kind);
// Record that at least one delta was received for this stream
if !delta.is_empty() {
state.has_seen_delta = true;
@@ -112,22 +157,42 @@ impl StreamController {
}
}
/// Insert a reasoning section break and commit any newly completed lines.
pub(crate) fn insert_reasoning_section_break(&mut self, sink: &impl HistorySink) {
if self.current_stream != Some(StreamKind::Reasoning) {
self.begin(StreamKind::Reasoning, sink);
}
let cfg = self.config.clone();
let state = self.state_mut(StreamKind::Reasoning);
state.collector.insert_section_break();
let newly_completed = state.collector.commit_complete_lines(&cfg);
if !newly_completed.is_empty() {
state.enqueue(newly_completed);
sink.start_commit_animation();
}
}
/// Finalize the active stream. If `flush_immediately` is true, drain and emit now.
pub(crate) fn finalize(&mut self, flush_immediately: bool, sink: &impl HistorySink) -> bool {
if !self.active {
pub(crate) fn finalize(
&mut self,
kind: StreamKind,
flush_immediately: bool,
sink: &impl HistorySink,
) -> bool {
if self.current_stream != Some(kind) {
return false;
}
let cfg = self.config.clone();
// Finalize collector first.
let remaining = {
let state = &mut self.state;
let state = self.state_mut(kind);
state.collector.finalize_and_drain(&cfg)
};
if flush_immediately {
// Collect all output first to avoid emitting headers when there is no content.
let mut out_lines: Lines = Vec::new();
{
let state = &mut self.state;
let state = self.state_mut(kind);
if !remaining.is_empty() {
state.enqueue(remaining);
}
@@ -136,28 +201,28 @@ impl StreamController {
}
if !out_lines.is_empty() {
let mut lines_with_header: Lines = Vec::new();
self.emit_header_if_needed(&mut lines_with_header);
self.emit_header_if_needed(kind, &mut lines_with_header);
lines_with_header.extend(out_lines);
Self::ensure_single_trailing_blank(&mut lines_with_header);
sink.insert_history(lines_with_header);
}
// Cleanup
self.state.clear();
// Allow a subsequent block in this turn to emit its header.
self.header.allow_reemit_in_turn();
self.state_mut(kind).clear();
// Allow a subsequent block of the same kind in this turn to emit its header.
self.header.allow_reemit_for_same_kind_in_turn(kind);
// Also clear the per-stream emitted flag so the header can render again.
self.header.reset_for_stream();
self.active = false;
self.header.reset_for_stream(kind);
self.current_stream = None;
self.finishing_after_drain = false;
true
} else {
if !remaining.is_empty() {
let state = &mut self.state;
let state = self.state_mut(kind);
state.enqueue(remaining);
}
// Spacer animated out
self.state.enqueue(vec![Line::from("")]);
self.state_mut(kind).enqueue(vec![Line::from("")]);
self.finishing_after_drain = true;
sink.start_commit_animation();
false
@@ -166,29 +231,32 @@ impl StreamController {
/// Step animation: commit at most one queued line and handle end-of-drain cleanup.
pub(crate) fn on_commit_tick(&mut self, sink: &impl HistorySink) -> bool {
if !self.active {
let Some(kind) = self.current_stream else {
return false;
}
let step = { self.state.step() };
};
let step = {
let state = self.state_mut(kind);
state.step()
};
if !step.history.is_empty() {
let mut lines: Lines = Vec::new();
self.emit_header_if_needed(&mut lines);
self.emit_header_if_needed(kind, &mut lines);
let mut out = lines;
out.extend(step.history);
sink.insert_history(out);
}
let is_idle = self.state.is_idle();
let is_idle = self.state(kind).is_idle();
if is_idle {
sink.stop_commit_animation();
if self.finishing_after_drain {
// Reset and notify
self.state.clear();
// Allow a subsequent block in this turn to emit its header.
self.header.allow_reemit_in_turn();
self.state_mut(kind).clear();
// Allow a subsequent block of the same kind in this turn to emit its header.
self.header.allow_reemit_for_same_kind_in_turn(kind);
// Also clear the per-stream emitted flag so the header can render again.
self.header.reset_for_stream();
self.active = false;
self.header.reset_for_stream(kind);
self.current_stream = None;
self.finishing_after_drain = false;
return true;
}
@@ -199,14 +267,24 @@ impl StreamController {
/// Apply a full final answer: replace queued content with only the remaining tail,
/// then finalize immediately and notify completion.
pub(crate) fn apply_final_answer(&mut self, message: &str, sink: &impl HistorySink) -> bool {
self.apply_full_final(message, sink)
self.apply_full_final(StreamKind::Answer, message, true, sink)
}
fn apply_full_final(&mut self, message: &str, sink: &impl HistorySink) -> bool {
self.begin(sink);
pub(crate) fn apply_final_reasoning(&mut self, message: &str, sink: &impl HistorySink) -> bool {
self.apply_full_final(StreamKind::Reasoning, message, false, sink)
}
fn apply_full_final(
&mut self,
kind: StreamKind,
message: &str,
immediate: bool,
sink: &impl HistorySink,
) -> bool {
self.begin(kind, sink);
{
let state = &mut self.state;
let state = self.state_mut(kind);
// Only inject the final full message if we have not seen any deltas for this stream.
// If deltas were received, rely on the collector's existing buffer to avoid duplication.
if !state.has_seen_delta && !message.is_empty() {
@@ -223,6 +301,7 @@ impl StreamController {
.replace_with_and_mark_committed(&msg, committed);
}
}
self.finalize(true, sink)
self.finalize(kind, immediate, sink)
}
}
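A trimmed sketch of the per-kind bookkeeping the controller relies on: one state slot per `StreamKind`, indexed by the enum discriminant, so answer and reasoning output never share a queue (simplified types, not the crate's):

```rust
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum StreamKind {
    Answer,
    Reasoning,
}

#[derive(Default)]
struct State {
    queued_lines: Vec<String>,
}

struct Controller {
    states: [State; 2],
    current: Option<StreamKind>,
}

impl Controller {
    fn state_mut(&mut self, kind: StreamKind) -> &mut State {
        // `as usize` maps Answer -> 0, Reasoning -> 1, matching the array layout.
        &mut self.states[kind as usize]
    }

    /// Switching kinds leaves the other stream's queue untouched, so each kind
    /// keeps its own partial output until it is flushed.
    fn push(&mut self, kind: StreamKind, line: &str) {
        self.current = Some(kind);
        self.state_mut(kind).queued_lines.push(line.to_string());
    }
}

fn main() {
    let mut c = Controller { states: [State::default(), State::default()], current: None };
    c.push(StreamKind::Reasoning, "thinking...");
    c.push(StreamKind::Answer, "hello");
    assert_eq!(c.current, Some(StreamKind::Answer));
    assert_eq!(c.states[StreamKind::Reasoning as usize].queued_lines.len(), 1);
    assert_eq!(c.states[StreamKind::Answer as usize].queued_lines.len(), 1);
}
```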

View File

@@ -2,6 +2,12 @@ use crate::markdown_stream::AnimatedLineStreamer;
use crate::markdown_stream::MarkdownStreamCollector;
pub(crate) mod controller;
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub(crate) enum StreamKind {
Answer,
Reasoning,
}
pub(crate) struct StreamState {
pub(crate) collector: MarkdownStreamCollector,
pub(crate) streamer: AnimatedLineStreamer,
@@ -36,44 +42,92 @@ impl StreamState {
}
pub(crate) struct HeaderEmitter {
emitted_this_turn: bool,
emitted_in_stream: bool,
reasoning_emitted_this_turn: bool,
answer_emitted_this_turn: bool,
reasoning_emitted_in_stream: bool,
answer_emitted_in_stream: bool,
}
impl HeaderEmitter {
pub(crate) fn new() -> Self {
Self {
emitted_this_turn: false,
emitted_in_stream: false,
reasoning_emitted_this_turn: false,
answer_emitted_this_turn: false,
reasoning_emitted_in_stream: false,
answer_emitted_in_stream: false,
}
}
pub(crate) fn reset_for_new_turn(&mut self) {
self.emitted_this_turn = false;
self.emitted_in_stream = false;
self.reasoning_emitted_this_turn = false;
self.answer_emitted_this_turn = false;
self.reasoning_emitted_in_stream = false;
self.answer_emitted_in_stream = false;
}
pub(crate) fn reset_for_stream(&mut self) {
self.emitted_in_stream = false;
}
/// Allow emitting the header again within the current turn after a finalize.
pub(crate) fn allow_reemit_in_turn(&mut self) {
self.emitted_this_turn = false;
}
pub(crate) fn maybe_emit(&mut self, out_lines: &mut Vec<ratatui::text::Line<'static>>) -> bool {
if !self.emitted_in_stream && !self.emitted_this_turn {
out_lines.push(render_header_line());
self.emitted_in_stream = true;
self.emitted_this_turn = true;
return true;
pub(crate) fn reset_for_stream(&mut self, kind: StreamKind) {
match kind {
StreamKind::Reasoning => self.reasoning_emitted_in_stream = false,
StreamKind::Answer => self.answer_emitted_in_stream = false,
}
}
pub(crate) fn has_emitted_for_stream(&self, kind: StreamKind) -> bool {
match kind {
StreamKind::Reasoning => self.reasoning_emitted_in_stream,
StreamKind::Answer => self.answer_emitted_in_stream,
}
}
/// Allow emitting the header again for the same kind within the current turn.
///
/// This is used when a stream (e.g., Answer) is finalized and a subsequent
/// block of the same kind is started within the same turn. Without this,
/// only the first block would render a header.
pub(crate) fn allow_reemit_for_same_kind_in_turn(&mut self, kind: StreamKind) {
match kind {
StreamKind::Reasoning => self.reasoning_emitted_this_turn = false,
StreamKind::Answer => self.answer_emitted_this_turn = false,
}
}
pub(crate) fn maybe_emit(
&mut self,
kind: StreamKind,
out_lines: &mut Vec<ratatui::text::Line<'static>>,
) -> bool {
let already_emitted_this_turn = match kind {
StreamKind::Reasoning => self.reasoning_emitted_this_turn,
StreamKind::Answer => self.answer_emitted_this_turn,
};
let already_emitted_in_stream = self.has_emitted_for_stream(kind);
if !already_emitted_in_stream && !already_emitted_this_turn {
out_lines.push(render_header_line(kind));
match kind {
StreamKind::Reasoning => {
self.reasoning_emitted_in_stream = true;
self.reasoning_emitted_this_turn = true;
// Reset opposite header so it may be emitted again this turn
self.answer_emitted_this_turn = false;
}
StreamKind::Answer => {
self.answer_emitted_in_stream = true;
self.answer_emitted_this_turn = true;
// Reset opposite header so it may be emitted again this turn
self.reasoning_emitted_this_turn = false;
}
}
true
} else {
false
}
false
}
}
fn render_header_line() -> ratatui::text::Line<'static> {
fn render_header_line(kind: StreamKind) -> ratatui::text::Line<'static> {
use ratatui::style::Stylize;
ratatui::text::Line::from("codex".magenta().bold())
match kind {
StreamKind::Reasoning => ratatui::text::Line::from("thinking".magenta().italic()),
StreamKind::Answer => ratatui::text::Line::from("codex".magenta().bold()),
}
}
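A hedged sketch of the emission rule those flags encode: within a turn each kind's header prints once, and emitting one kind re-arms the other so alternating blocks keep their headers (two booleans stand in for the real `HeaderEmitter`):

```rust
#[derive(Clone, Copy, PartialEq, Eq)]
enum Kind {
    Answer,
    Reasoning,
}

#[derive(Default)]
struct Headers {
    answer_this_turn: bool,
    reasoning_this_turn: bool,
}

impl Headers {
    /// Returns the header to print, if any, and re-arms the opposite kind.
    fn maybe_emit(&mut self, kind: Kind) -> Option<&'static str> {
        let (emitted, other, label) = match kind {
            Kind::Answer => (&mut self.answer_this_turn, &mut self.reasoning_this_turn, "codex"),
            Kind::Reasoning => (&mut self.reasoning_this_turn, &mut self.answer_this_turn, "thinking"),
        };
        if *emitted {
            return None;
        }
        *emitted = true;
        *other = false;
        Some(label)
    }
}

fn main() {
    let mut h = Headers::default();
    assert_eq!(h.maybe_emit(Kind::Reasoning), Some("thinking"));
    assert_eq!(h.maybe_emit(Kind::Reasoning), None); // same kind, same turn: suppressed
    assert_eq!(h.maybe_emit(Kind::Answer), Some("codex")); // switching kinds re-arms the other
    assert_eq!(h.maybe_emit(Kind::Reasoning), Some("thinking"));
}
```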

View File

@@ -1,209 +0,0 @@
use std::io::Result;
use crate::insert_history;
use crate::tui;
use crate::tui::TuiEvent;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyEventKind;
use ratatui::buffer::Buffer;
use ratatui::layout::Rect;
use ratatui::style::Color;
use ratatui::style::Style;
use ratatui::style::Styled;
use ratatui::style::Stylize;
use ratatui::text::Line;
use ratatui::text::Span;
use ratatui::widgets::Paragraph;
use ratatui::widgets::WidgetRef;
pub(crate) struct TranscriptApp {
pub(crate) transcript_lines: Vec<Line<'static>>,
pub(crate) scroll_offset: usize,
pub(crate) is_done: bool,
}
impl TranscriptApp {
pub(crate) fn new(transcript_lines: Vec<Line<'static>>) -> Self {
Self {
transcript_lines,
scroll_offset: 0,
is_done: false,
}
}
pub(crate) fn handle_event(&mut self, tui: &mut tui::Tui, event: TuiEvent) -> Result<()> {
match event {
TuiEvent::Key(key_event) => self.handle_key_event(tui, key_event),
TuiEvent::Draw => {
tui.draw(u16::MAX, |frame| {
self.render(frame.area(), frame.buffer);
})?;
}
_ => {}
}
Ok(())
}
pub(crate) fn insert_lines(&mut self, lines: Vec<Line<'static>>) {
self.transcript_lines.extend(lines);
}
fn handle_key_event(&mut self, tui: &mut tui::Tui, key_event: KeyEvent) {
match key_event {
KeyEvent {
code: KeyCode::Char('q'),
kind: KeyEventKind::Press,
..
}
| KeyEvent {
code: KeyCode::Char('t'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
}
| KeyEvent {
code: KeyCode::Char('c'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
} => {
self.is_done = true;
}
KeyEvent {
code: KeyCode::Up,
kind: KeyEventKind::Press | KeyEventKind::Repeat,
..
} => {
self.scroll_offset = self.scroll_offset.saturating_sub(1);
}
KeyEvent {
code: KeyCode::Down,
kind: KeyEventKind::Press | KeyEventKind::Repeat,
..
} => {
self.scroll_offset = self.scroll_offset.saturating_add(1);
}
KeyEvent {
code: KeyCode::PageUp,
kind: KeyEventKind::Press | KeyEventKind::Repeat,
..
} => {
let area = self.scroll_area(tui.terminal.viewport_area);
self.scroll_offset = self.scroll_offset.saturating_sub(area.height as usize);
}
KeyEvent {
code: KeyCode::PageDown,
kind: KeyEventKind::Press | KeyEventKind::Repeat,
..
} => {
let area = self.scroll_area(tui.terminal.viewport_area);
self.scroll_offset = self.scroll_offset.saturating_add(area.height as usize);
}
KeyEvent {
code: KeyCode::Home,
kind: KeyEventKind::Press | KeyEventKind::Repeat,
..
} => {
self.scroll_offset = 0;
}
KeyEvent {
code: KeyCode::End,
kind: KeyEventKind::Press | KeyEventKind::Repeat,
..
} => {
self.scroll_offset = usize::MAX;
}
_ => {}
}
}
fn scroll_area(&self, area: Rect) -> Rect {
let mut area = area;
// Reserve 1 line for the header and 4 lines for the bottom status section. This matches the chat composer.
area.y = area.y.saturating_add(1);
area.height = area.height.saturating_sub(5);
area
}
fn render(&mut self, area: Rect, buf: &mut Buffer) {
Span::from("/ ".repeat(area.width as usize / 2))
.dim()
.render_ref(area, buf);
Span::from("/ T R A N S C R I P T")
.dim()
.render_ref(area, buf);
// Main content area (excludes header and bottom status section)
let content_area = self.scroll_area(area);
let wrapped = insert_history::word_wrap_lines(&self.transcript_lines, content_area.width);
// Clamp scroll offset to valid range
self.scroll_offset = self
.scroll_offset
.min(wrapped.len().saturating_sub(content_area.height as usize));
let start = self.scroll_offset;
let end = (start + content_area.height as usize).min(wrapped.len());
let page = &wrapped[start..end];
Paragraph::new(page.to_vec()).render_ref(content_area, buf);
// Fill remaining visible lines (if any) with a leading '~' in the first column.
let visible = (end - start) as u16;
if content_area.height > visible {
let extra = content_area.height - visible;
for i in 0..extra {
let y = content_area.y.saturating_add(visible + i);
Span::from("~")
.dim()
.render_ref(Rect::new(content_area.x, y, 1, 1), buf);
}
}
// Bottom status section (4 lines): separator with % scrolled, then key hints (styled like chat composer)
let sep_y = content_area.bottom();
let sep_rect = Rect::new(area.x, sep_y, area.width, 1);
let hints_rect = Rect::new(area.x, sep_y + 1, area.width, 2);
// Separator line (dim)
Span::from("".repeat(sep_rect.width as usize))
.dim()
.render_ref(sep_rect, buf);
// Scroll percentage (0-100%) aligned near the right edge
let max_scroll = wrapped.len().saturating_sub(content_area.height as usize);
let percent: u8 = if max_scroll == 0 {
100
} else {
(((self.scroll_offset.min(max_scroll)) as f32 / max_scroll as f32) * 100.0).round()
as u8
};
let pct_text = format!(" {percent}% ");
let pct_w = pct_text.chars().count() as u16;
let pct_x = sep_rect.x + sep_rect.width - pct_w - 1;
Span::from(pct_text)
.dim()
.render_ref(Rect::new(pct_x, sep_rect.y, pct_w, 1), buf);
let key_hint_style = Style::default().fg(Color::Cyan);
let hints1 = vec![
" ".into(),
"".set_style(key_hint_style),
"/".into(),
"".set_style(key_hint_style),
" scroll ".into(),
"PgUp".set_style(key_hint_style),
"/".into(),
"PgDn".set_style(key_hint_style),
" page ".into(),
"Home".set_style(key_hint_style),
"/".into(),
"End".set_style(key_hint_style),
" jump".into(),
];
let hints2 = vec![" ".into(), "q".set_style(key_hint_style), " quit".into()];
Paragraph::new(vec![Line::from(hints1).dim(), Line::from(hints2).dim()])
.render_ref(hints_rect, buf);
}
}

View File

@@ -1,39 +1,28 @@
use std::io::Result;
use std::io::Stdout;
use std::io::stdout;
use std::pin::Pin;
use std::time::Duration;
use std::time::Instant;
use crossterm::SynchronizedUpdate;
use crossterm::cursor;
use codex_core::config::Config;
use crossterm::cursor::MoveTo;
use crossterm::event::DisableBracketedPaste;
use crossterm::event::EnableBracketedPaste;
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyEventKind;
use crossterm::event::KeyboardEnhancementFlags;
use crossterm::event::PopKeyboardEnhancementFlags;
use crossterm::event::PushKeyboardEnhancementFlags;
use crossterm::terminal::ScrollUp;
use ratatui::backend::Backend;
use crossterm::terminal::Clear;
use crossterm::terminal::ClearType;
use ratatui::backend::CrosstermBackend;
use ratatui::crossterm::execute;
use ratatui::crossterm::terminal::disable_raw_mode;
use ratatui::crossterm::terminal::enable_raw_mode;
use ratatui::layout::Offset;
use ratatui::text::Line;
use crate::custom_terminal;
use crate::custom_terminal::Terminal as CustomTerminal;
use tokio::select;
use tokio_stream::Stream;
use crate::custom_terminal::Terminal;
/// A type alias for the terminal type used in this application
pub type Terminal = CustomTerminal<CrosstermBackend<Stdout>>;
pub type Tui = Terminal<CrosstermBackend<Stdout>>;
pub fn set_modes() -> Result<()> {
/// Initialize the terminal (inline viewport; history stays in normal scrollback)
pub fn init(_config: &Config) -> Result<Tui> {
execute!(stdout(), EnableBracketedPaste)?;
enable_raw_mode()?;
@@ -51,37 +40,13 @@ pub fn set_modes() -> Result<()> {
| KeyboardEnhancementFlags::REPORT_ALTERNATE_KEYS
)
);
Ok(())
}
/// Restore the terminal to its original state.
/// Inverse of `set_modes`.
pub fn restore() -> Result<()> {
// Pop may fail on platforms that didn't support the push; ignore errors.
let _ = execute!(stdout(), PopKeyboardEnhancementFlags);
execute!(stdout(), DisableBracketedPaste)?;
disable_raw_mode()?;
let _ = execute!(stdout(), crossterm::cursor::Show);
Ok(())
}
/// Initialize the terminal (inline viewport; history stays in normal scrollback)
pub fn init() -> Result<Terminal> {
set_modes()?;
set_panic_hook();
// Instead of clearing the screen (which can drop scrollback in some terminals),
// scroll existing lines up until the cursor reaches the top, then start at (0, 0).
if let Ok((_x, y)) = cursor::position()
&& y > 0
{
execute!(stdout(), ScrollUp(y))?;
}
execute!(stdout(), MoveTo(0, 0))?;
// Clear screen and move cursor to top-left before drawing UI
execute!(stdout(), Clear(ClearType::All), MoveTo(0, 0))?;
let backend = CrosstermBackend::new(stdout());
let tui = CustomTerminal::with_options(backend)?;
let tui = Terminal::with_options(backend)?;
Ok(tui)
}
@@ -93,223 +58,11 @@ fn set_panic_hook() {
}));
}
#[derive(Debug)]
pub enum TuiEvent {
Key(KeyEvent),
Paste(String),
Draw,
#[cfg(unix)]
ResumeFromSuspend,
}
pub struct Tui {
frame_schedule_tx: tokio::sync::mpsc::UnboundedSender<Instant>,
draw_tx: tokio::sync::broadcast::Sender<()>,
pub(crate) terminal: Terminal,
pending_history_lines: Vec<Line<'static>>,
}
#[derive(Clone, Debug)]
pub struct FrameRequester {
frame_schedule_tx: tokio::sync::mpsc::UnboundedSender<Instant>,
}
impl FrameRequester {
pub fn schedule_frame(&self) {
let _ = self.frame_schedule_tx.send(Instant::now());
}
pub fn schedule_frame_in(&self, dur: Duration) {
let _ = self.frame_schedule_tx.send(Instant::now() + dur);
}
}
#[cfg(test)]
impl FrameRequester {
/// Create a no-op frame requester for tests.
pub(crate) fn test_dummy() -> Self {
let (tx, _rx) = tokio::sync::mpsc::unbounded_channel();
FrameRequester {
frame_schedule_tx: tx,
}
}
}
impl Tui {
pub fn new(terminal: Terminal) -> Self {
let (frame_schedule_tx, frame_schedule_rx) = tokio::sync::mpsc::unbounded_channel();
let (draw_tx, _) = tokio::sync::broadcast::channel(1);
// Spawn background scheduler to coalesce frame requests and emit draws at deadlines.
let draw_tx_clone = draw_tx.clone();
tokio::spawn(async move {
use tokio::select;
use tokio::time::Instant as TokioInstant;
use tokio::time::sleep_until;
let mut rx = frame_schedule_rx;
let mut next_deadline: Option<Instant> = None;
loop {
let target = next_deadline
.unwrap_or_else(|| Instant::now() + Duration::from_secs(60 * 60 * 24 * 365));
let sleep_fut = sleep_until(TokioInstant::from_std(target));
tokio::pin!(sleep_fut);
select! {
recv = rx.recv() => {
match recv {
Some(at) => {
if next_deadline.is_none_or(|cur| at < cur) {
next_deadline = Some(at);
}
if at <= Instant::now() {
next_deadline = None;
let _ = draw_tx_clone.send(());
}
}
None => break,
}
}
_ = &mut sleep_fut => {
if next_deadline.is_some() {
next_deadline = None;
let _ = draw_tx_clone.send(());
}
}
}
}
});
Self {
frame_schedule_tx,
draw_tx,
terminal,
pending_history_lines: vec![],
}
}
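The background task above coalesces every requested deadline into the single earliest pending wakeup. A pared-down sketch of the same loop, without the broadcast fan-out (demo-only timing values):

```rust
use std::time::{Duration, Instant};
use tokio::sync::mpsc::unbounded_channel;
use tokio::time::{sleep_until, Instant as TokioInstant};

#[tokio::main]
async fn main() {
    let (tx, mut rx) = unbounded_channel::<Instant>();

    // Two requests close together should produce a single draw at the earlier deadline.
    tx.send(Instant::now() + Duration::from_millis(20)).unwrap();
    tx.send(Instant::now() + Duration::from_millis(5)).unwrap();

    let mut next_deadline: Option<Instant> = None;
    loop {
        let target = next_deadline.unwrap_or_else(|| Instant::now() + Duration::from_secs(3600));
        let sleep = sleep_until(TokioInstant::from_std(target));
        tokio::pin!(sleep);
        tokio::select! {
            recv = rx.recv() => {
                match recv {
                    // Keep only the earliest requested deadline.
                    Some(at) => {
                        if next_deadline.is_none_or(|cur| at < cur) {
                            next_deadline = Some(at);
                        }
                    }
                    None => break,
                }
            }
            _ = &mut sleep => {
                if next_deadline.take().is_some() {
                    println!("draw frame"); // one coalesced draw for both requests
                    break; // demo only: stop after the first draw fires
                }
            }
        }
    }
}
```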
pub fn frame_requester(&self) -> FrameRequester {
FrameRequester {
frame_schedule_tx: self.frame_schedule_tx.clone(),
}
}
pub fn event_stream(&self) -> Pin<Box<dyn Stream<Item = TuiEvent> + Send + 'static>> {
use tokio_stream::StreamExt;
let mut crossterm_events = crossterm::event::EventStream::new();
let mut draw_rx = self.draw_tx.subscribe();
let event_stream = async_stream::stream! {
loop {
select! {
Some(Ok(event)) = crossterm_events.next() => {
match event {
crossterm::event::Event::Key(KeyEvent {
code: KeyCode::Char('z'),
modifiers: crossterm::event::KeyModifiers::CONTROL,
kind: KeyEventKind::Press,
..
}) => {
#[cfg(unix)]
{
let _ = Tui::suspend();
yield TuiEvent::ResumeFromSuspend;
yield TuiEvent::Draw;
}
}
crossterm::event::Event::Key(key_event) => {
yield TuiEvent::Key(key_event);
}
crossterm::event::Event::Resize(_, _) => {
yield TuiEvent::Draw;
}
crossterm::event::Event::Paste(pasted) => {
yield TuiEvent::Paste(pasted);
}
_ => {}
}
}
result = draw_rx.recv() => {
match result {
Ok(_) => {
yield TuiEvent::Draw;
}
Err(tokio::sync::broadcast::error::RecvError::Lagged(_)) => {
// We dropped one or more draw notifications; coalesce to a single draw.
yield TuiEvent::Draw;
}
Err(tokio::sync::broadcast::error::RecvError::Closed) => {
// Sender dropped; stop emitting draws from this source.
}
}
}
}
}
};
Box::pin(event_stream)
}
#[cfg(unix)]
fn suspend() -> Result<()> {
restore()?;
unsafe { libc::kill(0, libc::SIGTSTP) };
set_modes()?;
Ok(())
}
pub fn insert_history_lines(&mut self, lines: Vec<Line<'static>>) {
self.pending_history_lines.extend(lines);
self.frame_requester().schedule_frame();
}
pub fn draw(
&mut self,
height: u16,
draw_fn: impl FnOnce(&mut custom_terminal::Frame),
) -> Result<()> {
std::io::stdout().sync_update(|_| {
let terminal = &mut self.terminal;
let screen_size = terminal.size()?;
let last_known_screen_size = terminal.last_known_screen_size;
if screen_size != last_known_screen_size {
let cursor_pos = terminal.get_cursor_position()?;
let last_known_cursor_pos = terminal.last_known_cursor_pos;
if cursor_pos.y != last_known_cursor_pos.y {
let cursor_delta = cursor_pos.y as i32 - last_known_cursor_pos.y as i32;
let new_viewport_area = terminal.viewport_area.offset(Offset {
x: 0,
y: cursor_delta,
});
terminal.set_viewport_area(new_viewport_area);
terminal.clear()?;
}
}
let size = terminal.size()?;
let mut area = terminal.viewport_area;
area.height = height.min(size.height);
area.width = size.width;
if area.bottom() > size.height {
terminal
.backend_mut()
.scroll_region_up(0..area.top(), area.bottom() - size.height)?;
area.y = size.height - area.height;
}
if area != terminal.viewport_area {
terminal.clear()?;
terminal.set_viewport_area(area);
}
if !self.pending_history_lines.is_empty() {
crate::insert_history::insert_history_lines(
terminal,
self.pending_history_lines.clone(),
);
self.pending_history_lines.clear();
}
terminal.draw(|frame| {
draw_fn(frame);
})?;
Ok(())
})?
}
/// Restore the terminal to its original state
pub fn restore() -> Result<()> {
// Pop may fail on platforms that didn't support the push; ignore errors.
let _ = execute!(stdout(), PopKeyboardEnhancementFlags);
execute!(stdout(), DisableBracketedPaste)?;
disable_raw_mode()?;
Ok(())
}

View File

@@ -95,11 +95,11 @@ static PATCH_SELECT_OPTIONS: LazyLock<Vec<SelectOption>> = LazyLock::new(|| {
});
/// A modal prompting the user to approve or deny the pending request.
pub(crate) struct UserApprovalWidget {
pub(crate) struct UserApprovalWidget<'a> {
approval_request: ApprovalRequest,
app_event_tx: AppEventSender,
confirmation_prompt: Paragraph<'static>,
select_options: &'static Vec<SelectOption>,
confirmation_prompt: Paragraph<'a>,
select_options: &'a Vec<SelectOption>,
/// Currently selected index in *select* mode.
selected_option: usize,
@@ -137,7 +137,7 @@ fn to_command_display<'a>(
lines
}
impl UserApprovalWidget {
impl UserApprovalWidget<'_> {
pub(crate) fn new(approval_request: ApprovalRequest, app_event_tx: AppEventSender) -> Self {
let confirmation_prompt = match &approval_request {
ApprovalRequest::Exec {
@@ -328,7 +328,7 @@ impl UserApprovalWidget {
}
}
lines.push(Line::from(""));
self.app_event_tx.send(AppEvent::InsertHistoryLines(lines));
self.app_event_tx.send(AppEvent::InsertHistory(lines));
let op = match &self.approval_request {
ApprovalRequest::Exec { id, .. } => Op::ExecApproval {
@@ -356,7 +356,7 @@ impl UserApprovalWidget {
}
}
impl WidgetRef for &UserApprovalWidget {
impl WidgetRef for &UserApprovalWidget<'_> {
fn render_ref(&self, area: Rect, buf: &mut Buffer) {
let prompt_height = self.get_confirmation_prompt_height(area.width);
let [prompt_chunk, response_chunk] = Layout::default()
@@ -424,11 +424,11 @@ mod tests {
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
use tokio::sync::mpsc::unbounded_channel;
use std::sync::mpsc::channel;
#[test]
fn lowercase_shortcut_is_accepted() {
let (tx_raw, mut rx) = unbounded_channel::<AppEvent>();
let (tx_raw, rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let req = ApprovalRequest::Exec {
id: "1".to_string(),
@@ -438,10 +438,7 @@ mod tests {
let mut widget = UserApprovalWidget::new(req, tx);
widget.handle_key_event(KeyEvent::new(KeyCode::Char('y'), KeyModifiers::NONE));
assert!(widget.is_complete());
let mut events: Vec<AppEvent> = Vec::new();
while let Ok(ev) = rx.try_recv() {
events.push(ev);
}
let events: Vec<AppEvent> = rx.try_iter().collect();
assert!(events.iter().any(|e| matches!(
e,
AppEvent::CodexOp(Op::ExecApproval {
@@ -453,7 +450,7 @@ mod tests {
#[test]
fn uppercase_shortcut_is_accepted() {
let (tx_raw, mut rx) = unbounded_channel::<AppEvent>();
let (tx_raw, rx) = channel::<AppEvent>();
let tx = AppEventSender::new(tx_raw);
let req = ApprovalRequest::Exec {
id: "2".to_string(),
@@ -463,10 +460,7 @@ mod tests {
let mut widget = UserApprovalWidget::new(req, tx);
widget.handle_key_event(KeyEvent::new(KeyCode::Char('Y'), KeyModifiers::NONE));
assert!(widget.is_complete());
let mut events: Vec<AppEvent> = Vec::new();
while let Ok(ev) = rx.try_recv() {
events.push(ev);
}
let events: Vec<AppEvent> = rx.try_iter().collect();
assert!(events.iter().any(|e| matches!(
e,
AppEvent::CodexOp(Op::ExecApproval {

View File

@@ -1,9 +1,12 @@
To get started, describe a task or try one of these commands:
thinking
Investigating binary sizes
/init - create an AGENTS.md file with instructions for Codex
/status - show current session configuration and token usage
/approvals - choose what Codex can do without approval
/model - choose what model and reasoning effort to use
I need to check the codex-rs repository to explain why the project's binaries
are large. The user is likely seeking specifics about the setup: are Rust builds
static, what features are enabled, and is debug information included? It could
be due to static linking, included OpenSSL, or how panic handling is set up. I
should look into the Cargo.toml file to confirm features and profiles without
needing to edit any code. Let's get started on this!
codex
I'm going to scan the workspace and Cargo manifests to see build profiles and
@@ -34,6 +37,16 @@ dependencies that impact binary size. Then I'll summarize the main causes.
# ratatui = { path = "../../ratatui" }
ratatui = { git = "https://github.com/nornagon/ratatui", branch =
"nornagon-v0.29.0-patch" }
Optimizing Rust Release Profile
I'm reviewing the workspace's release profile, which has settings like lto=fat,
strip=symbols, and codegen-units=1 to reduce binary size. However, it seems that
"strip = symbols" only removes symbols, not debug information. In Rust 1.73+,
both "symbols" and "debuginfo" stripping options are available. I should check
whether using these results in a smaller binary. I'll also skim through crates
for heavy dependencies in Cargo.toml, including cli, core, exec, linux-sandbox,
tui, login, ollama, and mcp.
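For reference, the profile being described would sit in the workspace-level Cargo.toml roughly as follows; the exact values in the repository may differ, so treat this as a sketch of the settings named above:

```toml
# Sketch of a workspace release profile with the settings discussed above.
[profile.release]
lto = "fat"         # whole-program LTO: slower builds, deduplicates and shrinks code
codegen-units = 1   # a single codegen unit gives the optimizer the widest view
strip = "symbols"   # Cargo also accepts "none" and "debuginfo" for this key
```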
⚡ Running for d in ansi-escape apply-patch arg0 cli common core exec execpolicy
file-search linux-sandbox login mcp-client mcp-server mcp-types ollama tui; do
echo "--- $d/Cargo.toml"; sed -n '1,200p' $d/Cargo.toml; echo; done
@@ -168,6 +181,85 @@ echo "--- $d/Cargo.toml"; sed -n '1,200p' $d/Cargo.toml; echo; done
[dev-dependencies]
tempfile = "3"
thinking
Considering binary size impacts
I'm evaluating how binary size is influenced by various factors. There are many
binaries, such as codex and codex-exec, which include dependencies by default,
pulling in features like Tokio and hyper. For instance, reqwest includes
default features like TLS through rustls. If default features remain enabled,
it inflates the size with many additional components. I notice that for musl
targets, using vendored builds of OpenSSL could significantly increase binary
size. This is quite the intricate web of dependencies!
Analyzing musl and binary size
I'm considering the implications of building for musl with static linking.
In this case, OpenSSL might come into play through some crates, but reqwest
defaults to using rustls, although openssl-sys vendored could increase binary
size. It's interesting to think about whether they're targeting musl, especially
since it seems they might often be building for Mac and Linux with dynamic
linking instead.
I also realize there's a difference between debug and release builds — release
should be smaller due to optimizations, but if they're defaulting to debug
builds, that could lead to much larger binaries.
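For context, vendored OpenSSL usually enters a build through a feature flag such as the one below (purely illustrative; whether any crate in this workspace enables it would need checking):

```toml
# Illustrative: the "vendored" feature compiles OpenSSL from source and
# statically links it into every binary that depends on it.
openssl = { version = "0.10", features = ["vendored"] }
```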
Assessing crate impacts on size
I need to be precise when analyzing binary sizes from various crates. Some
crates, like ratatui-image, include heavy features such as libjpeg decoders,
which can really bulk things up. Then there are the tree-sitter crates that
contain large static grammar tables, adding significant size. The starlark crate
and tracing-subscriber also contribute to size increases.
With static linking per binary being standard in Rust, it leads to multiple
copies of the same dependencies unless cdylibs are used. That makes the compile
times longer, but using LTO helps reduce bloat somewhat through deduplication.
Evaluating binary size factors
Having multiple binaries indeed increases the overall size significantly.
I'm considering options like upgrading to "opt-level = 'z'" or using
"strip=debuginfo" for reducing size. It's interesting that while the profile
only sets symbols for stripping, not debuginfo, this could inflate sizes if
retained in release builds.
The distinction between "symbols" and "debuginfo" is critical; "symbols" keeps
debug information which may not align with the goal of minimizing size. I'd like
to verify with the Rust documentation, but my network access is restricted.
Explaining large binary sizes
I'm analyzing why binaries in this project are so large. The primary reasons
include monolithic static linking, where each binary includes the full
dependency graph without shared libraries. Heavy dependencies such as reqwest,
hyper, and TLS components also contribute significantly to size, along with
image decoders and other libraries.
Using MUSL targets means vendored OpenSSL can drastically increase the size due
to static linking. Furthermore, if the release profile keeps debug information
due to using "strip = symbols," that can lead to larger binaries as well. Its a
mix of several factors, including panic strategies and platform-specific linking
behavior.
Identifying binary size contributors
I'm considering why binaries are still large even without static linking. The
tracing-subscriber crate potentially adds weight because it includes regex,
though it can sometimes use a lighter version. References from files like "core/
Cargo.toml" show dependencies like reqwest and tokio, while the tui uses the
image crate.
To present this clearly, I'll format the final answer with two main sections:
"Main Causes" and "Build-Mode Notes." I can also include brief suggestions for
reducing size, but I want to stay focused on answering the user's question.
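If the regex-backed filtering were the cause, the usual trim is to avoid tracing-subscriber's env-filter feature; a hypothetical declaration (not necessarily what codex-rs uses) would be:

```toml
# Sketch: keep the fmt layer but leave out "env-filter", which pulls in
# regex-based matching machinery.
tracing-subscriber = { version = "0.3", default-features = false, features = ["fmt", "ansi"] }
```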
codex
Here's what's driving size in this workspace's binaries.
@@ -203,4 +295,4 @@ assertions—outputs are much larger than cargo build --release.
If you want, I can outline targeted trims (e.g., strip = "debuginfo", opt-level
= "z", panic abort, tighter tokio/reqwest features) and estimate impact per
binary.
binary.
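Written out as TOML, the trims suggested above would look roughly like the following; each one trades debuggability or panic behavior for size, so the actual savings would have to be measured per binary:

```toml
# Sketch of the suggested size-focused trims; whether they help would need measuring.
[profile.release]
opt-level = "z"     # optimize for size rather than speed
panic = "abort"     # remove unwinding machinery (panics terminate the process)
strip = "debuginfo" # one of the strip variants discussed earlier
```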