Compare commits

..

15 Commits

Author SHA1 Message Date
jif-oai
651d6852a2 feat: basic collab rendering 2026-01-12 15:27:59 +00:00
jif-oai
e3cf74885a feat: emit events around collab tools 2026-01-12 14:53:09 +00:00
jif-oai
9659583559 feat: add close tool implementation for collab (#9090)
Pretty straightforward. A known follow-up will be to drop it from the
`AgentControl`.
2026-01-12 13:21:46 +00:00
jif-oai
623707ab58 feat: add wait tool implementation for collab (#9088)
Add implementation for the `wait` tool.

For this we consider every status other than `PendingInit` and
`Running` to be terminal. The `wait` tool call returns either after a
given timeout or once the agent reaches a terminal status.

A few points to note:
* A channel is used rather than polling, since just looping on
`get_status()` could "miss" a terminal status (see the sketch below)
* The order of operations is important: we first subscribe and only then
check the last known status, to prevent race conditions
* If the channel gets dropped, we deliberately return an error
2026-01-12 12:16:24 +00:00
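As a rough sketch of the wait loop those bullets describe (simplified; `AgentStatus` and `is_final` below are stand-ins for the real types further down in this diff, and only the tokio `watch`/timeout plumbing is assumed):

```rust
use std::time::Duration;
use tokio::sync::watch;
use tokio::time::{Instant, timeout_at};

// Simplified stand-in: anything other than PendingInit/Running counts as terminal.
#[derive(Clone, Debug, PartialEq)]
enum AgentStatus {
    PendingInit,
    Running,
    Shutdown,
}

fn is_final(status: &AgentStatus) -> bool {
    !matches!(status, AgentStatus::PendingInit | AgentStatus::Running)
}

/// Returns the last observed status plus whether the deadline expired first.
async fn wait_for_terminal(
    mut rx: watch::Receiver<AgentStatus>,
    timeout_ms: u64,
) -> (AgentStatus, bool) {
    // Subscribe first (the receiver), then read the last known value, so a
    // terminal status published before the loop starts is not missed.
    let mut status = rx.borrow_and_update().clone();
    let deadline = Instant::now() + Duration::from_millis(timeout_ms);
    loop {
        if is_final(&status) {
            return (status, false);
        }
        match timeout_at(deadline, rx.changed()).await {
            // A new status was published; read it and re-check.
            Ok(Ok(())) => status = rx.borrow().clone(),
            // Sender dropped: the real tool reports an error in this case.
            Ok(Err(_)) => return (status, false),
            // Deadline hit before a terminal status arrived.
            Err(_) => return (status, true),
        }
    }
}
```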
jif-oai
86f81ca010 feat: testing harness for collab 1 (#8983) 2026-01-12 11:17:05 +00:00
zbarsky-openai
6a57d7980b fix: support remote arm64 builds, as well (#9018) 2026-01-10 18:41:08 -08:00
jif-oai
198289934f Revert "Delete announcement_tip.toml" (#9032)
Reverts openai/codex#9003
2026-01-10 07:30:14 -08:00
charley-oai
6709ad8975 Label attached images so agent can understand in-message labels (#8950)
Agent wouldn't "see" attached images and would instead try to use the
view_file tool:
<img width="1516" height="504" alt="image"
src="https://github.com/user-attachments/assets/68a705bb-f962-4fc1-9087-e932a6859b12"
/>

In this PR, we wrap image content items in XML tags with the name of
each image (for now just a numbered name like `[Image #1]`), so that the
model can understand inline image references by name. We also put the
image content items above the user message, which the model seems to
prefer (maybe it's more used to definitions appearing before references).

We also tweak the `view_file` tool description, which seemed to help a bit.
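
Concretely, the wrapped message ends up shaped roughly like the fixture used in the parsing tests later in this diff (a sketch; it assumes `ContentItem`, `ResponseItem`, and the tag helpers are exported from `codex_protocol::models`, as the imports in the diff suggest):

```rust
use codex_protocol::models::{ContentItem, ResponseItem, local_image_open_tag_text};

// The image content item is wrapped in a named open/close tag pair and placed
// above the user's text, so the model can resolve references like "Image #1".
fn labelled_image_message(image_url: String, user_text: String) -> ResponseItem {
    ResponseItem::Message {
        id: None,
        role: "user".to_string(),
        content: vec![
            ContentItem::InputText {
                text: local_image_open_tag_text(1), // label for "Image #1"
            },
            ContentItem::InputImage { image_url },
            ContentItem::InputText {
                text: "</image>".to_string(),
            },
            ContentItem::InputText { text: user_text },
        ],
    }
}
```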

Results on a simple eval set of images:

Before
<img width="980" height="310" alt="image"
src="https://github.com/user-attachments/assets/ba838651-2565-4684-a12e-81a36641bf86"
/>

After
<img width="918" height="322" alt="image"
src="https://github.com/user-attachments/assets/10a81951-7ee6-415e-a27e-e7a3fd0aee6f"
/>

```json
[
  {
    "id": "single_describe",
    "prompt": "Describe the attached image in one sentence.",
    "images": ["image_a.png"]
  },
  {
    "id": "single_color",
    "prompt": "What is the dominant color in the image? Answer with a single color word.",
    "images": ["image_b.png"]
  },
  {
    "id": "orientation_check",
    "prompt": "Is the image portrait or landscape? Answer in one sentence.",
    "images": ["image_c.png"]
  },
  {
    "id": "detail_request",
    "prompt": "Look closely at the image and call out any small details you notice.",
    "images": ["image_d.png"]
  },
  {
    "id": "two_images_compare",
    "prompt": "I attached two images. Are they the same or different? Briefly explain.",
    "images": ["image_a.png", "image_b.png"]
  },
  {
    "id": "two_images_captions",
    "prompt": "Provide a short caption for each image (Image 1, Image 2).",
    "images": ["image_c.png", "image_d.png"]
  },
  {
    "id": "multi_image_rank",
    "prompt": "Rank the attached images from most colorful to least colorful.",
    "images": ["image_a.png", "image_b.png", "image_c.png"]
  },
  {
    "id": "multi_image_choice",
    "prompt": "Which image looks more vibrant? Answer with 'Image 1' or 'Image 2'.",
    "images": ["image_b.png", "image_d.png"]
  }
]
```
2026-01-09 21:33:45 -08:00
Michael Bolin
cf515142b0 fix: include AGENTS.md as repo root marker for integration tests (#9010)
As explained in `codex-rs/core/BUILD.bazel`, including the repo's own
`AGENTS.md` is a hack to get some tests passing. We should fix this
properly, but I wanted to put a stake in the ground ASAP to get `just
bazel-remote-test` working and then add a job to `bazel.yml` to ensure
it keeps working.
2026-01-09 17:09:59 -08:00
Michael Bolin
74b2238931 fix: add .git to .bazelignore (#9008)
As noted in the comment, this was causing a problem for me locally
because Sapling backed up some files under `.git/sl` named `BUILD.bazel`
and so Bazel tried to parse them.

It's a bit surprising that Bazel does not ignore `.git` out of the box,
which would make considering it an opt-in rather than an opt-out.
2026-01-10 00:55:02 +00:00
gt-oai
cc0b5e8504 Add URL to responses error messages (#8984)
Put the URL in error messages to aid debugging when Codex is pointed at the
wrong endpoint.

<img width="759" height="164" alt="Screenshot 2026-01-09 at 16 32 49"
src="https://github.com/user-attachments/assets/77a0622c-955d-426d-86bb-c035210a4ecc"
/>
2026-01-10 00:53:47 +00:00
gt-oai
8e49a2c0d1 Add model provider info to /status if non-default (#8981)

Enterprises are running Codex and migrating between proxied / API key
auth and SIWC. If you accidentally run Codex with `OPENAI_BASE_URL=...`,
which is surprisingly easy to do, we don't tend to surface this anywhere
and it may lead to breakage. One suggestion was to include this
information in `/status`:

<img width="477" height="157" alt="Screenshot 2026-01-09 at 15 45 34"
src="https://github.com/user-attachments/assets/630ce68f-c856-4a2b-a004-7df2fbe5de93"
/>
2026-01-10 00:53:34 +00:00
Ahmed Ibrahim
af1ed2685e Refactor remote models tests to use TestCodex builder (#8940)
- add `with_model_provider` to the test codex builder
- replace the bespoke remote models harness with `TestCodex` in
`remote_models` tests
2026-01-09 15:11:56 -08:00
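A hypothetical sketch of what a migrated test setup might look like; only `TestCodex` and `with_model_provider` come from this change, and every other name here (`test_codex()`, the provider lookup, the build flow) is an illustrative assumption:

```rust
// Hypothetical usage sketch: names other than `with_model_provider` are assumptions.
#[tokio::test]
async fn remote_models_use_configured_provider() {
    let provider = built_in_model_providers()["openai"].clone();
    let test = test_codex()
        .with_model_provider(provider) // added by this change
        .build()
        .await
        .expect("build TestCodex");
    // ...drive the conversation against the mocked remote-models endpoint...
    let _ = test;
}
```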
pakrym-oai
1a0e2e612b Delete announcement_tip.toml (#9003) 2026-01-09 14:47:46 -08:00
pakrym-oai
acfd94f625 Add hierarchical agent prompt (#8996) 2026-01-09 13:47:37 -08:00
53 changed files with 2359 additions and 977 deletions

.bazelignore Normal file
View File

@@ -0,0 +1,3 @@
# Without this, Bazel will consider BUILD.bazel files in
# .git/sl/origbackups (which can be populated by Sapling SCM).
.git

View File

@@ -4,7 +4,7 @@ FROM ubuntu:24.04
# initial debugging, but we should publish to a more proper location.
#
# docker buildx create --use
# docker buildx build --platform linux/amd64 -f .github/workflows/Dockerfile.bazel -t mbolin491/codex-bazel:latest --push .
# docker buildx build --platform linux/amd64,linux/arm64 -f .github/workflows/Dockerfile.bazel -t mbolin491/codex-bazel:latest --push .
RUN apt-get update && \
apt-get install -y --no-install-recommends \

View File

@@ -2,14 +2,19 @@ common --remote_download_minimal
common --nobuild_runfile_links
common --keep_going
# Prefer to run the build actions entirely remotely so we can dial up the concurrency.
# Currently remote builds only work on Mac hosts, until we untangle the libc constraints mess on linux.
# We prefer to run the build actions entirely remotely so we can dial up the concurrency.
# We have platform-specific tests, so we want to execute the tests on all platforms using the strongest sandboxing available on each platform.
# On linux, we can do a full remote build/test, by targeting the right (x86/arm) runners, so we have coverage of both.
# Linux crossbuilds don't work until we untangle the libc constraint mess.
common:linux --config=remote
common:linux --strategy=remote
common:linux --platforms=//:rbe
# On mac, we can run all the build actions remotely but test actions locally.
common:macos --config=remote
common:macos --strategy=remote
# We have platform-specific tests, so execute the tests locally using the strongest sandboxing available on each platform.
common:macos --strategy=TestRunner=darwin-sandbox,local
# Note: linux-sandbox is stronger, but not available in GHA.
common:linux --strategy=TestRunner=processwrapper-sandbox,local
common:windows --strategy=TestRunner=local

View File

@@ -11,19 +11,9 @@ platform(
],
)
platform(
alias(
name = "rbe",
constraint_values = [
"@platforms//cpu:x86_64",
"@platforms//os:linux",
"@bazel_tools//tools/cpp:clang",
"@toolchains_llvm_bootstrapped//constraints/libc:gnu.2.28",
],
exec_properties = {
# Ubuntu-based image that includes git, python3, dotslash, and other
# tools that various integration tests need.
# Verify at https://hub.docker.com/layers/mbolin491/codex-bazel/latest/images/sha256:8c9ff94187ea7c08a31e9a81f5fe8046ea3972a6768983c955c4079fa30567fb
"container-image": "docker://docker.io/mbolin491/codex-bazel@sha256:8c9ff94187ea7c08a31e9a81f5fe8046ea3972a6768983c955c4079fa30567fb",
"OSFamily": "Linux",
},
actual = "@rbe_platform",
)
exports_files(["AGENTS.md"])

View File

@@ -120,3 +120,9 @@ crate.annotation(
deps = [":windows_import_lib"],
)
use_repo(crate, "crates")
rbe_platform_repository = use_repo_rule("//:rbe.bzl", "rbe_platform_repository")
rbe_platform_repository(
name = "rbe_platform",
)

View File

@@ -10,6 +10,7 @@ from_date = "2024-10-01"
to_date = "2024-10-15"
target_app = "cli"
# Test announcement only for local build version until 2026-01-10 excluded (past)
[[announcements]]
content = "This is a test announcement"
version_regex = "^0\\.0\\.0$"

View File

@@ -7,6 +7,7 @@ pub enum TransportError {
#[error("http {status}: {body:?}")]
Http {
status: StatusCode,
url: Option<String>,
headers: Option<HeaderMap>,
body: Option<String>,
},

View File

@@ -131,6 +131,7 @@ impl HttpTransport for ReqwestTransport {
);
}
let url = req.url.clone();
let builder = self.build(req)?;
let resp = builder.send().await.map_err(Self::map_error)?;
let status = resp.status();
@@ -140,6 +141,7 @@ impl HttpTransport for ReqwestTransport {
let body = String::from_utf8(bytes.to_vec()).ok();
return Err(TransportError::Http {
status,
url: Some(url),
headers: Some(headers),
body,
});
@@ -161,6 +163,7 @@ impl HttpTransport for ReqwestTransport {
);
}
let url = req.url.clone();
let builder = self.build(req)?;
let resp = builder.send().await.map_err(Self::map_error)?;
let status = resp.status();
@@ -169,6 +172,7 @@ impl HttpTransport for ReqwestTransport {
let body = resp.text().await.ok();
return Err(TransportError::Http {
status,
url: Some(url),
headers: Some(headers),
body,
});

View File

@@ -20,6 +20,15 @@ codex_rust_crate(
"//codex-rs/apply-patch:apply_patch_tool_instructions.md",
"prompt.md",
],
# This is a bit of a hack, but empirically, some of our integration tests
# are relying on the presence of this file as a repo root marker. When
# running tests locally, this "just works," but in remote execution,
# the working directory is different and so the file is not found unless it
# is explicitly added as test data.
#
# TODO(aibrahim): Update the tests so that `just bazel-remote-test` succeeds
# without this workaround.
test_data_extra = ["//:AGENTS.md"],
integration_deps_extra = ["//codex-rs/core/tests/common:common"],
test_tags = ["no-sandbox"],
extra_binaries = [

View File

@@ -0,0 +1,7 @@
Files called AGENTS.md commonly appear in many places inside a container - at "/", in "~", deep within git repositories, or in any other directory; their location is not limited to version-controlled folders.
Their purpose is to pass along human guidance to you, the agent. Such guidance can include coding standards, explanations of the project layout, steps for building or testing, and even wording that must accompany a GitHub pull-request description produced by the agent; all of it is to be followed.
Each AGENTS.md governs the entire directory that contains it and every child directory beneath that point. Whenever you change a file, you have to comply with every AGENTS.md whose scope covers that file. Naming conventions, stylistic rules and similar directives are restricted to the code that falls inside that scope unless the document explicitly states otherwise.
When two AGENTS.md files disagree, the one located deeper in the directory structure overrides the higher-level file, while instructions given directly in the prompt by the system, developer, or user outrank any AGENTS.md content.

View File

@@ -9,6 +9,7 @@ use codex_protocol::protocol::Op;
use codex_protocol::user_input::UserInput;
use std::sync::Arc;
use std::sync::Weak;
use tokio::sync::watch;
/// Control-plane handle for multi-agent operations.
/// `AgentControl` is held by each session (via `SessionServices`). It provides capability to
@@ -27,7 +28,6 @@ impl AgentControl {
Self { manager }
}
#[allow(dead_code)] // Used by upcoming multi-agent tooling.
/// Spawn a new agent thread and submit the initial prompt.
///
/// If `headless` is true, a background drain task is spawned to prevent unbounded event growth
@@ -50,7 +50,6 @@ impl AgentControl {
Ok(new_thread.thread_id)
}
#[allow(dead_code)] // Used by upcoming multi-agent tooling.
/// Send a `user` prompt to an existing agent thread.
pub(crate) async fn send_prompt(
&self,
@@ -69,7 +68,13 @@ impl AgentControl {
.await
}
#[allow(dead_code)] // Used by upcoming multi-agent tooling.
/// Submit a shutdown request to an existing agent thread.
pub(crate) async fn shutdown_agent(&self, agent_id: ThreadId) -> CodexResult<String> {
let state = self.upgrade()?;
state.send_op(agent_id, Op::Shutdown {}).await
}
#[allow(dead_code)] // Will be used for collab tools.
/// Fetch the last known status for `agent_id`, returning `NotFound` when unavailable.
pub(crate) async fn get_status(&self, agent_id: ThreadId) -> AgentStatus {
let Ok(state) = self.upgrade() else {
@@ -82,6 +87,16 @@ impl AgentControl {
thread.agent_status().await
}
/// Subscribe to status updates for `agent_id`, yielding the latest value and changes.
pub(crate) async fn subscribe_status(
&self,
agent_id: ThreadId,
) -> CodexResult<watch::Receiver<AgentStatus>> {
let state = self.upgrade()?;
let thread = state.get_thread(agent_id).await?;
Ok(thread.subscribe_status())
}
fn upgrade(&self) -> CodexResult<Arc<ThreadManagerState>> {
self.manager
.upgrade()
@@ -114,13 +129,63 @@ fn spawn_headless_drain(thread: Arc<CodexThread>) {
#[cfg(test)]
mod tests {
use super::*;
use crate::CodexAuth;
use crate::ThreadManager;
use crate::agent::agent_status_from_event;
use crate::config::Config;
use crate::config::ConfigBuilder;
use assert_matches::assert_matches;
use codex_protocol::protocol::ErrorEvent;
use codex_protocol::protocol::TurnAbortReason;
use codex_protocol::protocol::TurnAbortedEvent;
use codex_protocol::protocol::TurnCompleteEvent;
use codex_protocol::protocol::TurnStartedEvent;
use pretty_assertions::assert_eq;
use tempfile::TempDir;
async fn test_config() -> (TempDir, Config) {
let home = TempDir::new().expect("create temp dir");
let config = ConfigBuilder::default()
.codex_home(home.path().to_path_buf())
.build()
.await
.expect("load default test config");
(home, config)
}
struct AgentControlHarness {
_home: TempDir,
config: Config,
manager: ThreadManager,
control: AgentControl,
}
impl AgentControlHarness {
async fn new() -> Self {
let (home, config) = test_config().await;
let manager = ThreadManager::with_models_provider_and_home(
CodexAuth::from_api_key("dummy"),
config.model_provider.clone(),
config.codex_home.clone(),
);
let control = manager.agent_control();
Self {
_home: home,
config,
manager,
control,
}
}
async fn start_thread(&self) -> (ThreadId, Arc<CodexThread>) {
let new_thread = self
.manager
.start_thread(self.config.clone())
.await
.expect("start thread");
(new_thread.thread_id, new_thread.thread)
}
}
#[tokio::test]
async fn send_prompt_errors_when_manager_dropped() {
@@ -185,4 +250,135 @@ mod tests {
let status = agent_status_from_event(&EventMsg::ShutdownComplete);
assert_eq!(status, Some(AgentStatus::Shutdown));
}
#[tokio::test]
async fn spawn_agent_errors_when_manager_dropped() {
let control = AgentControl::default();
let (_home, config) = test_config().await;
let err = control
.spawn_agent(config, "hello".to_string(), false)
.await
.expect_err("spawn_agent should fail without a manager");
assert_eq!(
err.to_string(),
"unsupported operation: thread manager dropped"
);
}
#[tokio::test]
async fn send_prompt_errors_when_thread_missing() {
let harness = AgentControlHarness::new().await;
let thread_id = ThreadId::new();
let err = harness
.control
.send_prompt(thread_id, "hello".to_string())
.await
.expect_err("send_prompt should fail for missing thread");
assert_matches!(err, CodexErr::ThreadNotFound(id) if id == thread_id);
}
#[tokio::test]
async fn get_status_returns_not_found_for_missing_thread() {
let harness = AgentControlHarness::new().await;
let status = harness.control.get_status(ThreadId::new()).await;
assert_eq!(status, AgentStatus::NotFound);
}
#[tokio::test]
async fn get_status_returns_pending_init_for_new_thread() {
let harness = AgentControlHarness::new().await;
let (thread_id, _) = harness.start_thread().await;
let status = harness.control.get_status(thread_id).await;
assert_eq!(status, AgentStatus::PendingInit);
}
#[tokio::test]
async fn subscribe_status_errors_for_missing_thread() {
let harness = AgentControlHarness::new().await;
let thread_id = ThreadId::new();
let err = harness
.control
.subscribe_status(thread_id)
.await
.expect_err("subscribe_status should fail for missing thread");
assert_matches!(err, CodexErr::ThreadNotFound(id) if id == thread_id);
}
#[tokio::test]
async fn subscribe_status_updates_on_shutdown() {
let harness = AgentControlHarness::new().await;
let (thread_id, thread) = harness.start_thread().await;
let mut status_rx = harness
.control
.subscribe_status(thread_id)
.await
.expect("subscribe_status should succeed");
assert_eq!(status_rx.borrow().clone(), AgentStatus::PendingInit);
let _ = thread
.submit(Op::Shutdown {})
.await
.expect("shutdown should submit");
let _ = status_rx.changed().await;
assert_eq!(status_rx.borrow().clone(), AgentStatus::Shutdown);
}
#[tokio::test]
async fn send_prompt_submits_user_message() {
let harness = AgentControlHarness::new().await;
let (thread_id, _thread) = harness.start_thread().await;
let submission_id = harness
.control
.send_prompt(thread_id, "hello from tests".to_string())
.await
.expect("send_prompt should succeed");
assert!(!submission_id.is_empty());
let expected = (
thread_id,
Op::UserInput {
items: vec![UserInput::Text {
text: "hello from tests".to_string(),
}],
final_output_json_schema: None,
},
);
let captured = harness
.manager
.captured_ops()
.into_iter()
.find(|entry| *entry == expected);
assert_eq!(captured, Some(expected));
}
#[tokio::test]
async fn spawn_agent_creates_thread_and_sends_prompt() {
let harness = AgentControlHarness::new().await;
let thread_id = harness
.control
.spawn_agent(harness.config.clone(), "spawned".to_string(), false)
.await
.expect("spawn_agent should succeed");
let _thread = harness
.manager
.get_thread(thread_id)
.await
.expect("thread should be registered");
let expected = (
thread_id,
Op::UserInput {
items: vec![UserInput::Text {
text: "spawned".to_string(),
}],
final_output_json_schema: None,
},
);
let captured = harness
.manager
.captured_ops()
.into_iter()
.find(|entry| *entry == expected);
assert_eq!(captured, Some(expected));
}
}

View File

@@ -13,3 +13,7 @@ pub(crate) fn agent_status_from_event(msg: &EventMsg) -> Option<AgentStatus> {
_ => None,
}
}
pub(crate) fn is_final(status: &AgentStatus) -> bool {
!matches!(status, AgentStatus::PendingInit | AgentStatus::Running)
}

View File

@@ -25,11 +25,13 @@ pub(crate) fn map_api_error(err: ApiError) -> CodexErr {
ApiError::Api { status, message } => CodexErr::UnexpectedStatus(UnexpectedResponseError {
status,
body: message,
url: None,
request_id: None,
}),
ApiError::Transport(transport) => match transport {
TransportError::Http {
status,
url,
headers,
body,
} => {
@@ -71,6 +73,7 @@ pub(crate) fn map_api_error(err: ApiError) -> CodexErr {
CodexErr::UnexpectedStatus(UnexpectedResponseError {
status,
body: body_text,
url,
request_id: extract_request_id(headers.as_ref()),
})
}

View File

@@ -533,6 +533,7 @@ async fn handle_unauthorized(
fn map_unauthorized_status(status: StatusCode) -> CodexErr {
map_api_error(ApiError::Transport(TransportError::Http {
status,
url: None,
headers: None,
body: None,
}))

View File

@@ -164,6 +164,7 @@ use codex_protocol::protocol::InitialHistory;
use codex_protocol::user_input::UserInput;
use codex_utils_readiness::Readiness;
use codex_utils_readiness::ReadinessFlag;
use tokio::sync::watch;
/// The high-level interface to the Codex system.
/// It operates as a queue pair where you send submissions and receive events.
@@ -172,7 +173,7 @@ pub struct Codex {
pub(crate) tx_sub: Sender<Submission>,
pub(crate) rx_event: Receiver<Event>,
// Last known status of the agent.
pub(crate) agent_status: Arc<RwLock<AgentStatus>>,
pub(crate) agent_status: watch::Receiver<AgentStatus>,
}
/// Wrapper returned by [`Codex::spawn`] containing the spawned [`Codex`],
@@ -275,7 +276,7 @@ impl Codex {
// Generate a unique ID for the lifetime of this Codex session.
let session_source_clone = session_configuration.session_source.clone();
let agent_status = Arc::new(RwLock::new(AgentStatus::PendingInit));
let (agent_status_tx, agent_status_rx) = watch::channel(AgentStatus::PendingInit);
let session = Session::new(
session_configuration,
@@ -284,7 +285,7 @@ impl Codex {
models_manager.clone(),
exec_policy,
tx_event.clone(),
Arc::clone(&agent_status),
agent_status_tx.clone(),
conversation_history,
session_source_clone,
skills_manager,
@@ -303,7 +304,7 @@ impl Codex {
next_id: AtomicU64::new(0),
tx_sub,
rx_event,
agent_status,
agent_status: agent_status_rx,
};
#[allow(deprecated)]
@@ -345,8 +346,7 @@ impl Codex {
}
pub(crate) async fn agent_status(&self) -> AgentStatus {
let status = self.agent_status.read().await;
status.clone()
self.agent_status.borrow().clone()
}
}
@@ -354,9 +354,9 @@ impl Codex {
///
/// A session has at most 1 running task at a time, and can be interrupted by user input.
pub(crate) struct Session {
conversation_id: ThreadId,
pub(crate) conversation_id: ThreadId,
tx_event: Sender<Event>,
agent_status: Arc<RwLock<AgentStatus>>,
agent_status: watch::Sender<AgentStatus>,
state: Mutex<SessionState>,
/// The set of enabled features should be invariant for the lifetime of the
/// session.
@@ -557,7 +557,7 @@ impl Session {
models_manager: Arc<ModelsManager>,
exec_policy: ExecPolicyManager,
tx_event: Sender<Event>,
agent_status: Arc<RwLock<AgentStatus>>,
agent_status: watch::Sender<AgentStatus>,
initial_history: InitialHistory,
session_source: SessionSource,
skills_manager: Arc<SkillsManager>,
@@ -703,7 +703,7 @@ impl Session {
let sess = Arc::new(Session {
conversation_id,
tx_event: tx_event.clone(),
agent_status: Arc::clone(&agent_status),
agent_status,
state: Mutex::new(state),
features: config.features.clone(),
active_turn: Mutex::new(None),
@@ -1026,8 +1026,7 @@ impl Session {
pub(crate) async fn send_event_raw(&self, event: Event) {
// Record the last known agent status.
if let Some(status) = agent_status_from_event(&event.msg) {
let mut guard = self.agent_status.write().await;
*guard = status;
self.agent_status.send_replace(status);
}
// Persist the event into rollout (recorder filters as needed)
let rollout_items = vec![RolloutItem::EventMsg(event.msg.clone())];
@@ -1045,8 +1044,7 @@ impl Session {
pub(crate) async fn send_event_raw_flushed(&self, event: Event) {
// Record the last known agent status.
if let Some(status) = agent_status_from_event(&event.msg) {
let mut guard = self.agent_status.write().await;
*guard = status;
self.agent_status.send_replace(status);
}
self.persist_rollout_items(&[RolloutItem::EventMsg(event.msg.clone())])
.await;
@@ -1540,6 +1538,24 @@ impl Session {
}
}
/// Queues the items as pending input for the active turn; returns the input back if there was no task running to inject into.
pub async fn inject_response_items(
&self,
input: Vec<ResponseInputItem>,
) -> Result<(), Vec<ResponseInputItem>> {
let mut active = self.active_turn.lock().await;
match active.as_mut() {
Some(at) => {
let mut ts = at.turn_state.lock().await;
for item in input {
ts.push_pending_input(item);
}
Ok(())
}
None => Err(input),
}
}
pub async fn get_pending_input(&self) -> Vec<ResponseInputItem> {
let mut active = self.active_turn.lock().await;
match active.as_mut() {
@@ -3476,7 +3492,7 @@ mod tests {
));
let agent_control = AgentControl::default();
let exec_policy = ExecPolicyManager::default();
let agent_status = Arc::new(RwLock::new(AgentStatus::PendingInit));
let (agent_status_tx, _agent_status_rx) = watch::channel(AgentStatus::PendingInit);
let model = ModelsManager::get_model_offline(config.model.as_deref());
let session_configuration = SessionConfiguration {
provider: config.model_provider.clone(),
@@ -3539,7 +3555,7 @@ mod tests {
let session = Session {
conversation_id,
tx_event,
agent_status: Arc::clone(&agent_status),
agent_status: agent_status_tx,
state: Mutex::new(state),
features: config.features.clone(),
active_turn: Mutex::new(None),
@@ -3570,7 +3586,7 @@ mod tests {
));
let agent_control = AgentControl::default();
let exec_policy = ExecPolicyManager::default();
let agent_status = Arc::new(RwLock::new(AgentStatus::PendingInit));
let (agent_status_tx, _agent_status_rx) = watch::channel(AgentStatus::PendingInit);
let model = ModelsManager::get_model_offline(config.model.as_deref());
let session_configuration = SessionConfiguration {
provider: config.model_provider.clone(),
@@ -3633,7 +3649,7 @@ mod tests {
let session = Arc::new(Session {
conversation_id,
tx_event,
agent_status: Arc::clone(&agent_status),
agent_status: agent_status_tx,
state: Mutex::new(state),
features: config.features.clone(),
active_turn: Mutex::new(None),

View File

@@ -87,7 +87,7 @@ pub(crate) async fn run_codex_thread_interactive(
next_id: AtomicU64::new(0),
tx_sub: tx_ops,
rx_event: rx_sub,
agent_status: Arc::clone(&codex.agent_status),
agent_status: codex.agent_status.clone(),
})
}
@@ -129,7 +129,7 @@ pub(crate) async fn run_codex_thread_one_shot(
// Bridge events so we can observe completion and shut down automatically.
let (tx_bridge, rx_bridge) = async_channel::bounded(SUBMISSION_CHANNEL_CAPACITY);
let ops_tx = io.tx_sub.clone();
let agent_status = Arc::clone(&io.agent_status);
let agent_status = io.agent_status.clone();
let io_for_bridge = io;
tokio::spawn(async move {
while let Ok(event) = io_for_bridge.next_event().await {
@@ -363,20 +363,23 @@ mod tests {
use super::*;
use async_channel::bounded;
use codex_protocol::models::ResponseItem;
use codex_protocol::protocol::AgentStatus;
use codex_protocol::protocol::RawResponseItemEvent;
use codex_protocol::protocol::TurnAbortReason;
use codex_protocol::protocol::TurnAbortedEvent;
use pretty_assertions::assert_eq;
use tokio::sync::watch;
#[tokio::test]
async fn forward_events_cancelled_while_send_blocked_shuts_down_delegate() {
let (tx_events, rx_events) = bounded(1);
let (tx_sub, rx_sub) = bounded(SUBMISSION_CHANNEL_CAPACITY);
let (_agent_status_tx, agent_status) = watch::channel(AgentStatus::PendingInit);
let codex = Arc::new(Codex {
next_id: AtomicU64::new(0),
tx_sub,
rx_event: rx_events,
agent_status: Default::default(),
agent_status,
});
let (session, ctx, _rx_evt) = crate::codex::make_session_and_context_with_rx().await;

View File

@@ -5,6 +5,7 @@ use crate::protocol::Event;
use crate::protocol::Op;
use crate::protocol::Submission;
use std::path::PathBuf;
use tokio::sync::watch;
pub struct CodexThread {
codex: Codex,
@@ -38,6 +39,10 @@ impl CodexThread {
self.codex.agent_status().await
}
pub(crate) fn subscribe_status(&self) -> watch::Receiver<AgentStatus> {
self.codex.agent_status.clone()
}
pub fn rollout_path(&self) -> PathBuf {
self.rollout_path.clone()
}

View File

@@ -277,6 +277,7 @@ pub enum RefreshTokenFailedReason {
pub struct UnexpectedResponseError {
pub status: StatusCode,
pub body: String,
pub url: Option<String>,
pub request_id: Option<String>,
}
@@ -293,7 +294,11 @@ impl UnexpectedResponseError {
return None;
}
let mut message = format!("{CLOUDFLARE_BLOCKED_MESSAGE} (status {})", self.status);
let status = self.status;
let mut message = format!("{CLOUDFLARE_BLOCKED_MESSAGE} (status {status})");
if let Some(url) = &self.url {
message.push_str(&format!(", url: {url}"));
}
if let Some(id) = &self.request_id {
message.push_str(&format!(", request id: {id}"));
}
@@ -307,16 +312,16 @@ impl std::fmt::Display for UnexpectedResponseError {
if let Some(friendly) = self.friendly_message() {
write!(f, "{friendly}")
} else {
write!(
f,
"unexpected status {}: {}{}",
self.status,
self.body,
self.request_id
.as_ref()
.map(|id| format!(", request id: {id}"))
.unwrap_or_default()
)
let status = self.status;
let body = &self.body;
let mut message = format!("unexpected status {status}: {body}");
if let Some(url) = &self.url {
message.push_str(&format!(", url: {url}"));
}
if let Some(id) = &self.request_id {
message.push_str(&format!(", request id: {id}"));
}
write!(f, "{message}")
}
}
}
@@ -826,12 +831,16 @@ mod tests {
status: StatusCode::FORBIDDEN,
body: "<html><body>Cloudflare error: Sorry, you have been blocked</body></html>"
.to_string(),
url: Some("http://example.com/blocked".to_string()),
request_id: Some("ray-id".to_string()),
};
let status = StatusCode::FORBIDDEN.to_string();
let url = "http://example.com/blocked";
assert_eq!(
err.to_string(),
format!("{CLOUDFLARE_BLOCKED_MESSAGE} (status {status}), request id: ray-id")
format!(
"{CLOUDFLARE_BLOCKED_MESSAGE} (status {status}), url: {url}, request id: ray-id"
)
);
}
@@ -840,12 +849,14 @@ mod tests {
let err = UnexpectedResponseError {
status: StatusCode::FORBIDDEN,
body: "plain text error".to_string(),
url: Some("http://example.com/plain".to_string()),
request_id: None,
};
let status = StatusCode::FORBIDDEN.to_string();
let url = "http://example.com/plain";
assert_eq!(
err.to_string(),
format!("unexpected status {status}: plain text error")
format!("unexpected status {status}: plain text error, url: {url}")
);
}

View File

@@ -9,6 +9,10 @@ use codex_protocol::models::ReasoningItemContent;
use codex_protocol::models::ReasoningItemReasoningSummary;
use codex_protocol::models::ResponseItem;
use codex_protocol::models::WebSearchAction;
use codex_protocol::models::is_image_close_tag_text;
use codex_protocol::models::is_image_open_tag_text;
use codex_protocol::models::is_local_image_close_tag_text;
use codex_protocol::models::is_local_image_open_tag_text;
use codex_protocol::user_input::UserInput;
use tracing::warn;
use uuid::Uuid;
@@ -32,9 +36,17 @@ fn parse_user_message(message: &[ContentItem]) -> Option<UserMessageItem> {
let mut content: Vec<UserInput> = Vec::new();
for content_item in message.iter() {
for (idx, content_item) in message.iter().enumerate() {
match content_item {
ContentItem::InputText { text } => {
if (is_local_image_open_tag_text(text) || is_image_open_tag_text(text))
&& (matches!(message.get(idx + 1), Some(ContentItem::InputImage { .. })))
|| (idx > 0
&& (is_local_image_close_tag_text(text) || is_image_close_tag_text(text))
&& matches!(message.get(idx - 1), Some(ContentItem::InputImage { .. })))
{
continue;
}
if is_session_prefix(text) || is_user_shell_command_text(text) {
return None;
}
@@ -177,6 +189,80 @@ mod tests {
}
}
#[test]
fn skips_local_image_label_text() {
let image_url = "data:image/png;base64,abc".to_string();
let label = codex_protocol::models::local_image_open_tag_text(1);
let user_text = "Please review this image.".to_string();
let item = ResponseItem::Message {
id: None,
role: "user".to_string(),
content: vec![
ContentItem::InputText { text: label },
ContentItem::InputImage {
image_url: image_url.clone(),
},
ContentItem::InputText {
text: "</image>".to_string(),
},
ContentItem::InputText {
text: user_text.clone(),
},
],
};
let turn_item = parse_turn_item(&item).expect("expected user message turn item");
match turn_item {
TurnItem::UserMessage(user) => {
let expected_content = vec![
UserInput::Image { image_url },
UserInput::Text { text: user_text },
];
assert_eq!(user.content, expected_content);
}
other => panic!("expected TurnItem::UserMessage, got {other:?}"),
}
}
#[test]
fn skips_unnamed_image_label_text() {
let image_url = "data:image/png;base64,abc".to_string();
let label = codex_protocol::models::image_open_tag_text();
let user_text = "Please review this image.".to_string();
let item = ResponseItem::Message {
id: None,
role: "user".to_string(),
content: vec![
ContentItem::InputText { text: label },
ContentItem::InputImage {
image_url: image_url.clone(),
},
ContentItem::InputText {
text: codex_protocol::models::image_close_tag_text(),
},
ContentItem::InputText {
text: user_text.clone(),
},
],
};
let turn_item = parse_turn_item(&item).expect("expected user message turn item");
match turn_item {
TurnItem::UserMessage(user) => {
let expected_content = vec![
UserInput::Image { image_url },
UserInput::Text { text: user_text },
];
assert_eq!(user.content, expected_content);
}
other => panic!("expected TurnItem::UserMessage, got {other:?}"),
}
}
#[test]
fn skips_user_instructions_and_env() {
let items = vec![

View File

@@ -86,6 +86,8 @@ pub enum Feature {
RemoteModels,
/// Experimental shell snapshotting.
ShellSnapshot,
/// Append additional AGENTS.md guidance to user instructions.
HierarchicalAgents,
/// Experimental TUI v2 (viewport) implementation.
Tui2,
/// Enforce UTF8 output in Powershell.
@@ -352,6 +354,12 @@ pub const FEATURES: &[FeatureSpec] = &[
},
default_enabled: false,
},
FeatureSpec {
id: Feature::HierarchicalAgents,
key: "hierarchical_agents",
stage: Stage::Experimental,
default_enabled: false,
},
FeatureSpec {
id: Feature::ApplyPatchFreeform,
key: "apply_patch_freeform",

View File

@@ -5,30 +5,30 @@
//! We include the concatenation of all files found along the path from the
//! repository root to the current working directory as follows:
//!
//! 1. Determine the project root by walking upwards from the current working
//! directory until any marker in `project_root_markers` (default: `.git`)
//! is found. If no root is found (or markers are empty), only the current
//! working directory is considered.
//! 1. Determine the Git repository root by walking upwards from the current
//! working directory until a `.git` directory or file is found. If no Git
//! root is found, only the current working directory is considered.
//! 2. Collect every `AGENTS.md` found from the repository root down to the
//! current working directory (inclusive) and concatenate their contents in
//! that order.
//! 3. We do **not** walk past the detected project root.
//! 3. We do **not** walk past the Git root.
use crate::config::Config;
use crate::features::Feature;
use crate::skills::SkillMetadata;
use crate::skills::render_skills_section;
use dunce::canonicalize as normalize_path;
use std::io;
use std::path::PathBuf;
use tokio::io::AsyncReadExt;
use toml::Value as TomlValue;
use tracing::error;
pub(crate) const HIERARCHICAL_AGENTS_MESSAGE: &str =
include_str!("../hierarchical_agents_message.md");
/// Default filename scanned for project-level docs.
pub const DEFAULT_PROJECT_DOC_FILENAME: &str = "AGENTS.md";
/// Preferred local override for project-level docs.
pub const LOCAL_PROJECT_DOC_FILENAME: &str = "AGENTS.override.md";
const DEFAULT_PROJECT_ROOT_MARKERS: &[&str] = &[".git"];
/// When both `Config::instructions` and the project doc are present, they will
/// be concatenated with the following separator.
@@ -40,35 +40,46 @@ pub(crate) async fn get_user_instructions(
config: &Config,
skills: Option<&[SkillMetadata]>,
) -> Option<String> {
let skills_section = skills.and_then(render_skills_section);
let project_docs = read_project_docs(config).await;
let project_docs = match read_project_docs(config).await {
Ok(docs) => docs,
let mut output = String::new();
if let Some(instructions) = config.user_instructions.clone() {
output.push_str(&instructions);
}
match project_docs {
Ok(Some(docs)) => {
if !output.is_empty() {
output.push_str(PROJECT_DOC_SEPARATOR);
}
output.push_str(&docs);
}
Ok(None) => {}
Err(e) => {
error!("error trying to find project doc: {e:#}");
return config.user_instructions.clone();
}
};
let combined_project_docs = merge_project_docs_with_skills(project_docs, skills_section);
let mut parts: Vec<String> = Vec::new();
if let Some(instructions) = config.user_instructions.clone() {
parts.push(instructions);
}
if let Some(project_doc) = combined_project_docs {
if !parts.is_empty() {
parts.push(PROJECT_DOC_SEPARATOR.to_string());
let skills_section = skills.and_then(render_skills_section);
if let Some(skills_section) = skills_section {
if !output.is_empty() {
output.push_str("\n\n");
}
parts.push(project_doc);
output.push_str(&skills_section);
}
if parts.is_empty() {
None
if config.features.enabled(Feature::HierarchicalAgents) {
if !output.is_empty() {
output.push_str("\n\n");
}
output.push_str(HIERARCHICAL_AGENTS_MESSAGE);
}
if !output.is_empty() {
Some(output)
} else {
Some(parts.concat())
None
}
}
@@ -142,20 +153,43 @@ pub fn discover_project_doc_paths(config: &Config) -> std::io::Result<Vec<PathBu
dir = canon;
}
let markers = project_root_markers_from_config(config)?;
let search_dirs = match find_project_root(&dir, &markers)? {
Some(root) => {
let mut dirs = Vec::new();
for ancestor in dir.as_path().ancestors() {
dirs.push(ancestor.to_path_buf());
if ancestor == root {
break;
// Build chain from cwd upwards and detect git root.
let mut chain: Vec<PathBuf> = vec![dir.clone()];
let mut git_root: Option<PathBuf> = None;
let mut cursor = dir;
while let Some(parent) = cursor.parent() {
let git_marker = cursor.join(".git");
let git_exists = match std::fs::metadata(&git_marker) {
Ok(_) => true,
Err(e) if e.kind() == std::io::ErrorKind::NotFound => false,
Err(e) => return Err(e),
};
if git_exists {
git_root = Some(cursor.clone());
break;
}
chain.push(parent.to_path_buf());
cursor = parent.to_path_buf();
}
let search_dirs: Vec<PathBuf> = if let Some(root) = git_root {
let mut dirs: Vec<PathBuf> = Vec::new();
let mut saw_root = false;
for p in chain.iter().rev() {
if !saw_root {
if p == &root {
saw_root = true;
} else {
continue;
}
}
dirs.reverse();
dirs
dirs.push(p.clone());
}
None => vec![dir.clone()],
dirs
} else {
vec![config.cwd.clone()]
};
let mut found: Vec<PathBuf> = Vec::new();
@@ -181,60 +215,6 @@ pub fn discover_project_doc_paths(config: &Config) -> std::io::Result<Vec<PathBu
Ok(found)
}
fn project_root_markers_from_config(config: &Config) -> io::Result<Vec<String>> {
let merged = config.config_layer_stack.effective_config();
let Some(table) = merged.as_table() else {
return Ok(default_project_root_markers());
};
let Some(markers_value) = table.get("project_root_markers") else {
return Ok(default_project_root_markers());
};
let TomlValue::Array(entries) = markers_value else {
return Err(io::Error::new(
io::ErrorKind::InvalidData,
"project_root_markers must be an array of strings",
));
};
if entries.is_empty() {
return Ok(Vec::new());
}
let mut markers = Vec::new();
for entry in entries {
let Some(marker) = entry.as_str() else {
return Err(io::Error::new(
io::ErrorKind::InvalidData,
"project_root_markers must be an array of strings",
));
};
markers.push(marker.to_string());
}
Ok(markers)
}
fn default_project_root_markers() -> Vec<String> {
DEFAULT_PROJECT_ROOT_MARKERS
.iter()
.map(ToString::to_string)
.collect()
}
fn find_project_root(cwd: &std::path::Path, markers: &[String]) -> io::Result<Option<PathBuf>> {
if markers.is_empty() {
return Ok(None);
}
for ancestor in cwd.ancestors() {
for marker in markers {
let marker_path = ancestor.join(marker);
match std::fs::metadata(&marker_path) {
Ok(_) => return Ok(Some(ancestor.to_path_buf())),
Err(e) if e.kind() == io::ErrorKind::NotFound => {}
Err(e) => return Err(e),
}
}
}
Ok(None)
}
fn candidate_filenames<'a>(config: &'a Config) -> Vec<&'a str> {
let mut names: Vec<&'a str> =
Vec::with_capacity(2 + config.project_doc_fallback_filenames.len());
@@ -252,18 +232,6 @@ fn candidate_filenames<'a>(config: &'a Config) -> Vec<&'a str> {
names
}
fn merge_project_docs_with_skills(
project_doc: Option<String>,
skills_section: Option<String>,
) -> Option<String> {
match (project_doc, skills_section) {
(Some(doc), Some(skills)) => Some(format!("{doc}\n\n{skills}")),
(Some(doc), None) => Some(doc),
(None, Some(skills)) => Some(skills),
(None, None) => None,
}
}
#[cfg(test)]
mod tests {
use super::*;

View File

@@ -90,6 +90,7 @@ pub(crate) fn should_persist_event_msg(ev: &EventMsg) -> bool {
| EventMsg::AgentMessageContentDelta(_)
| EventMsg::ReasoningContentDelta(_)
| EventMsg::ReasoningRawContentDelta(_)
| EventMsg::SkillsUpdateAvailable => false,
| EventMsg::SkillsUpdateAvailable
| EventMsg::CollabInteraction(_) => false,
}
}

View File

@@ -56,6 +56,10 @@ pub(crate) struct ThreadManagerState {
models_manager: Arc<ModelsManager>,
skills_manager: Arc<SkillsManager>,
session_source: SessionSource,
#[cfg(any(test, feature = "test-support"))]
#[allow(dead_code)]
// Captures submitted ops for testing purposes.
ops_log: Arc<std::sync::Mutex<Vec<(ThreadId, Op)>>>,
}
impl ThreadManager {
@@ -74,6 +78,8 @@ impl ThreadManager {
skills_manager: Arc::new(SkillsManager::new(codex_home)),
auth_manager,
session_source,
#[cfg(any(test, feature = "test-support"))]
ops_log: Arc::new(std::sync::Mutex::new(Vec::new())),
}),
#[cfg(any(test, feature = "test-support"))]
_test_codex_home_guard: None,
@@ -111,6 +117,8 @@ impl ThreadManager {
skills_manager: Arc::new(SkillsManager::new(codex_home)),
auth_manager,
session_source: SessionSource::Exec,
#[cfg(any(test, feature = "test-support"))]
ops_log: Arc::new(std::sync::Mutex::new(Vec::new())),
}),
_test_codex_home_guard: None,
}
@@ -202,9 +210,19 @@ impl ThreadManager {
.await
}
fn agent_control(&self) -> AgentControl {
pub(crate) fn agent_control(&self) -> AgentControl {
AgentControl::new(Arc::downgrade(&self.state))
}
#[cfg(any(test, feature = "test-support"))]
#[allow(dead_code)]
pub(crate) fn captured_ops(&self) -> Vec<(ThreadId, Op)> {
self.state
.ops_log
.lock()
.map(|log| log.clone())
.unwrap_or_default()
}
}
impl ThreadManagerState {
@@ -217,7 +235,14 @@ impl ThreadManagerState {
}
pub(crate) async fn send_op(&self, thread_id: ThreadId, op: Op) -> CodexResult<String> {
self.get_thread(thread_id).await?.submit(op).await
let thread = self.get_thread(thread_id).await?;
#[cfg(any(test, feature = "test-support"))]
{
if let Ok(mut log) = self.ops_log.lock() {
log.push((thread_id, op.clone()));
}
}
thread.submit(op).await
}
#[allow(dead_code)] // Used by upcoming multi-agent tooling.

View File

@@ -1,3 +1,4 @@
use crate::agent::AgentStatus;
use crate::codex::TurnContext;
use crate::config::Config;
use crate::error::CodexErr;
@@ -10,30 +11,16 @@ use crate::tools::registry::ToolHandler;
use crate::tools::registry::ToolKind;
use async_trait::async_trait;
use codex_protocol::ThreadId;
use codex_protocol::protocol::CollabInteractionEvent;
use codex_protocol::protocol::EventMsg;
use serde::Deserialize;
use serde::Serialize;
pub struct CollabHandler;
pub(crate) const DEFAULT_WAIT_TIMEOUT_MS: i64 = 30_000;
pub(crate) const MAX_WAIT_TIMEOUT_MS: i64 = 300_000;
#[derive(Debug, Deserialize)]
struct SpawnAgentArgs {
message: String,
}
#[derive(Debug, Deserialize)]
struct SendInputArgs {
id: String,
message: String,
}
#[derive(Debug, Deserialize)]
struct WaitArgs {
id: String,
timeout_ms: Option<i64>,
}
#[derive(Debug, Deserialize)]
struct CloseAgentArgs {
id: String,
@@ -68,10 +55,10 @@ impl ToolHandler for CollabHandler {
};
match tool_name.as_str() {
"spawn_agent" => handle_spawn_agent(session, turn, arguments).await,
"send_input" => handle_send_input(session, arguments).await,
"wait" => handle_wait(arguments).await,
"close_agent" => handle_close_agent(arguments).await,
"spawn_agent" => spawn::handle(session, turn, arguments).await,
"send_input" => send_input::handle(session, turn, arguments).await,
"wait" => wait::handle(session, turn, arguments).await,
"close_agent" => close_agent::handle(session, turn, arguments).await,
other => Err(FunctionCallError::RespondToModel(format!(
"unsupported collab tool {other}"
))),
@@ -79,84 +66,309 @@ impl ToolHandler for CollabHandler {
}
}
async fn handle_spawn_agent(
session: std::sync::Arc<crate::codex::Session>,
turn: std::sync::Arc<TurnContext>,
arguments: String,
) -> Result<ToolOutput, FunctionCallError> {
let args: SpawnAgentArgs = parse_arguments(&arguments)?;
if args.message.trim().is_empty() {
return Err(FunctionCallError::RespondToModel(
"Empty message can't be send to an agent".to_string(),
));
}
let config = build_agent_spawn_config(turn.as_ref())?;
let result = session
.services
.agent_control
.spawn_agent(config, args.message, true)
.await
.map_err(|err| FunctionCallError::Fatal(err.to_string()))?;
mod spawn {
use super::*;
use crate::codex::Session;
use std::sync::Arc;
Ok(ToolOutput::Function {
content: format!("agent_id: {result}"),
success: Some(true),
content_items: None,
})
#[derive(Debug, Deserialize)]
struct SpawnAgentArgs {
message: String,
}
pub async fn handle(
session: Arc<Session>,
turn: Arc<TurnContext>,
arguments: String,
) -> Result<ToolOutput, FunctionCallError> {
let args: SpawnAgentArgs = parse_arguments(&arguments)?;
if args.message.trim().is_empty() {
return Err(FunctionCallError::RespondToModel(
"Empty message can't be send to an agent".to_string(),
));
}
let config = build_agent_spawn_config(turn.as_ref())?;
let result = session
.services
.agent_control
.spawn_agent(config, args.message.clone(), true)
.await
.map_err(|err| FunctionCallError::Fatal(err.to_string()))?;
emit_event(session, turn, args.message, result).await;
Ok(ToolOutput::Function {
content: format!("agent_id: {result}"),
success: Some(true),
content_items: None,
})
}
async fn emit_event(
session: Arc<Session>,
turn: Arc<TurnContext>,
prompt: String,
new_id: ThreadId,
) {
session
.send_event(
&turn,
EventMsg::CollabInteraction(CollabInteractionEvent::AgentSpawned {
sender_id: session.conversation_id,
new_id,
prompt,
}),
)
.await
}
}
async fn handle_send_input(
session: std::sync::Arc<crate::codex::Session>,
arguments: String,
) -> Result<ToolOutput, FunctionCallError> {
let args: SendInputArgs = parse_arguments(&arguments)?;
let agent_id = agent_id(&args.id)?;
if args.message.trim().is_empty() {
return Err(FunctionCallError::RespondToModel(
"Empty message can't be send to an agent".to_string(),
));
mod send_input {
use super::*;
use crate::codex::Session;
use std::sync::Arc;
#[derive(Debug, Deserialize)]
struct SendInputArgs {
id: String,
message: String,
}
let content = session
.services
.agent_control
.send_prompt(agent_id, args.message)
.await
.map_err(|err| match err {
CodexErr::ThreadNotFound(id) => {
FunctionCallError::RespondToModel(format!("agent with id {id} not found"))
pub async fn handle(
session: Arc<Session>,
turn: Arc<TurnContext>,
arguments: String,
) -> Result<ToolOutput, FunctionCallError> {
let args: SendInputArgs = parse_arguments(&arguments)?;
let agent_id = agent_id(&args.id)?;
if args.message.trim().is_empty() {
return Err(FunctionCallError::RespondToModel(
"Empty message can't be send to an agent".to_string(),
));
}
let content = session
.services
.agent_control
.send_prompt(agent_id, args.message.clone())
.await
.map_err(|err| match err {
CodexErr::ThreadNotFound(id) => {
FunctionCallError::RespondToModel(format!("agent with id {id} not found"))
}
err => FunctionCallError::Fatal(err.to_string()),
})?;
emit_event(session, turn, agent_id, args.message).await;
Ok(ToolOutput::Function {
content,
success: Some(true),
content_items: None,
})
}
async fn emit_event(
session: Arc<Session>,
turn: Arc<TurnContext>,
receiver_id: ThreadId,
prompt: String,
) {
session
.send_event(
&turn,
EventMsg::CollabInteraction(CollabInteractionEvent::AgentInteraction {
sender_id: session.conversation_id,
receiver_id,
prompt,
}),
)
.await
}
}
mod wait {
use super::*;
use crate::agent::status::is_final;
use crate::codex::Session;
use std::sync::Arc;
use std::time::Duration;
use tokio::time::Instant;
use tokio::time::timeout_at;
#[derive(Debug, Deserialize)]
struct WaitArgs {
id: String,
timeout_ms: Option<i64>,
}
#[derive(Debug, Serialize)]
struct WaitResult {
status: AgentStatus,
timed_out: bool,
}
pub async fn handle(
session: Arc<Session>,
turn: Arc<TurnContext>,
arguments: String,
) -> Result<ToolOutput, FunctionCallError> {
let args: WaitArgs = parse_arguments(&arguments)?;
let agent_id = agent_id(&args.id)?;
// Validate timeout.
let timeout_ms = args.timeout_ms.unwrap_or(DEFAULT_WAIT_TIMEOUT_MS);
let timeout_ms = match timeout_ms {
ms if ms <= 0 => {
return Err(FunctionCallError::RespondToModel(
"timeout_ms must be greater than zero".to_owned(),
));
}
err => FunctionCallError::Fatal(err.to_string()),
ms => ms.min(MAX_WAIT_TIMEOUT_MS),
};
let mut status_rx = session
.services
.agent_control
.subscribe_status(agent_id)
.await
.map_err(|err| match err {
CodexErr::ThreadNotFound(id) => {
FunctionCallError::RespondToModel(format!("agent with id {id} not found"))
}
err => FunctionCallError::Fatal(err.to_string()),
})?;
let waiting_id = format!("collab-waiting-{}", uuid::Uuid::new_v4());
session
.send_event(
&turn,
EventMsg::CollabInteraction(CollabInteractionEvent::WaitingBegin {
sender_id: session.conversation_id,
receiver_id: agent_id,
waiting_id: waiting_id.clone(),
}),
)
.await;
// Get last known status.
let mut status = status_rx.borrow_and_update().clone();
let deadline = Instant::now() + Duration::from_millis(timeout_ms as u64);
let timed_out = loop {
if is_final(&status) {
break false;
}
match timeout_at(deadline, status_rx.changed()).await {
Ok(Ok(())) => status = status_rx.borrow().clone(),
Ok(Err(_)) => {
let last_status = session.services.agent_control.get_status(agent_id).await;
if last_status != AgentStatus::NotFound {
// We deliberately keep the last known status if the agent gets dropped. This
// case is not supposed to happen.
status = last_status;
}
break false;
}
Err(_) => break true,
}
};
session
.send_event(
&turn,
EventMsg::CollabInteraction(CollabInteractionEvent::WaitingEnd {
sender_id: session.conversation_id,
receiver_id: agent_id,
waiting_id,
status: status.clone(),
}),
)
.await;
if matches!(status, AgentStatus::NotFound) {
return Err(FunctionCallError::RespondToModel(format!(
"agent with id {agent_id} not found"
)));
}
let result = WaitResult { status, timed_out };
let content = serde_json::to_string(&result).map_err(|err| {
FunctionCallError::Fatal(format!("failed to serialize wait result: {err}"))
})?;
Ok(ToolOutput::Function {
content,
success: Some(true),
content_items: None,
})
}
async fn handle_wait(arguments: String) -> Result<ToolOutput, FunctionCallError> {
let args: WaitArgs = parse_arguments(&arguments)?;
let _agent_id = agent_id(&args.id)?;
let timeout_ms = args.timeout_ms.unwrap_or(DEFAULT_WAIT_TIMEOUT_MS);
if timeout_ms <= 0 {
return Err(FunctionCallError::RespondToModel(
"timeout_ms must be greater than zero".to_string(),
));
Ok(ToolOutput::Function {
content,
success: Some(!result.timed_out),
content_items: None,
})
}
let _timeout_ms = timeout_ms.min(MAX_WAIT_TIMEOUT_MS);
// TODO(jif): implement agent wait once lifecycle tracking is wired up.
Err(FunctionCallError::Fatal("wait not implemented".to_string()))
}
async fn handle_close_agent(arguments: String) -> Result<ToolOutput, FunctionCallError> {
let args: CloseAgentArgs = parse_arguments(&arguments)?;
let _agent_id = agent_id(&args.id)?;
// TODO(jif): implement agent shutdown and return the final status.
Err(FunctionCallError::Fatal(
"close_agent not implemented".to_string(),
))
pub mod close_agent {
use super::*;
use crate::codex::Session;
use std::sync::Arc;
#[derive(Debug, Deserialize, Serialize)]
pub(super) struct CloseAgentResult {
pub(super) status: AgentStatus,
}
pub async fn handle(
session: Arc<Session>,
turn: Arc<TurnContext>,
arguments: String,
) -> Result<ToolOutput, FunctionCallError> {
let args: CloseAgentArgs = parse_arguments(&arguments)?;
let agent_id = agent_id(&args.id)?;
let status = session.services.agent_control.get_status(agent_id).await;
if !matches!(status, AgentStatus::Shutdown) {
let _ = session
.services
.agent_control
.shutdown_agent(agent_id)
.await
.map_err(|err| match err {
CodexErr::ThreadNotFound(id) => {
FunctionCallError::RespondToModel(format!("agent with id {id} not found"))
}
err => FunctionCallError::Fatal(err.to_string()),
})?;
}
emit_event(session, turn, agent_id, status.clone()).await;
let content = serde_json::to_string(&CloseAgentResult { status }).map_err(|err| {
FunctionCallError::Fatal(format!("failed to serialize close_agent result: {err}"))
})?;
Ok(ToolOutput::Function {
content,
success: Some(true),
content_items: None,
})
}
async fn emit_event(
session: Arc<Session>,
turn: Arc<TurnContext>,
receiver_id: ThreadId,
status: AgentStatus,
) {
session
.send_event(
&turn,
EventMsg::CollabInteraction(CollabInteractionEvent::Close {
sender_id: session.conversation_id,
receiver_id,
status,
}),
)
.await
}
}
fn agent_id(id: &str) -> Result<ThreadId, FunctionCallError> {
@@ -192,3 +404,384 @@ fn build_agent_spawn_config(turn: &TurnContext) -> Result<Config, FunctionCallEr
})?;
Ok(config)
}
#[cfg(test)]
mod tests {
use super::*;
use crate::CodexAuth;
use crate::ThreadManager;
use crate::built_in_model_providers;
use crate::codex::make_session_and_context;
use crate::config::types::ShellEnvironmentPolicy;
use crate::function_tool::FunctionCallError;
use crate::protocol::AskForApproval;
use crate::protocol::Op;
use crate::protocol::SandboxPolicy;
use crate::turn_diff_tracker::TurnDiffTracker;
use codex_protocol::ThreadId;
use pretty_assertions::assert_eq;
use serde_json::json;
use std::path::PathBuf;
use std::sync::Arc;
use std::time::Duration;
use tokio::sync::Mutex;
use tokio::time::timeout;
fn invocation(
session: Arc<crate::codex::Session>,
turn: Arc<TurnContext>,
tool_name: &str,
payload: ToolPayload,
) -> ToolInvocation {
ToolInvocation {
session,
turn,
tracker: Arc::new(Mutex::new(TurnDiffTracker::default())),
call_id: "call-1".to_string(),
tool_name: tool_name.to_string(),
payload,
}
}
fn function_payload(args: serde_json::Value) -> ToolPayload {
ToolPayload::Function {
arguments: args.to_string(),
}
}
fn thread_manager() -> ThreadManager {
ThreadManager::with_models_provider(
CodexAuth::from_api_key("dummy"),
built_in_model_providers()["openai"].clone(),
)
}
#[tokio::test]
async fn handler_rejects_non_function_payloads() {
let (session, turn) = make_session_and_context().await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"spawn_agent",
ToolPayload::Custom {
input: "hello".to_string(),
},
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("payload should be rejected");
};
assert_eq!(
err,
FunctionCallError::RespondToModel(
"collab handler received unsupported payload".to_string()
)
);
}
#[tokio::test]
async fn handler_rejects_unknown_tool() {
let (session, turn) = make_session_and_context().await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"unknown_tool",
function_payload(json!({})),
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("tool should be rejected");
};
assert_eq!(
err,
FunctionCallError::RespondToModel("unsupported collab tool unknown_tool".to_string())
);
}
#[tokio::test]
async fn spawn_agent_rejects_empty_message() {
let (session, turn) = make_session_and_context().await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"spawn_agent",
function_payload(json!({"message": " "})),
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("empty message should be rejected");
};
assert_eq!(
err,
FunctionCallError::RespondToModel(
"Empty message can't be send to an agent".to_string()
)
);
}
#[tokio::test]
async fn spawn_agent_errors_when_manager_dropped() {
let (session, turn) = make_session_and_context().await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"spawn_agent",
function_payload(json!({"message": "hello"})),
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("spawn should fail without a manager");
};
assert_eq!(
err,
FunctionCallError::Fatal("unsupported operation: thread manager dropped".to_string())
);
}
#[tokio::test]
async fn send_input_rejects_empty_message() {
let (session, turn) = make_session_and_context().await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"send_input",
function_payload(json!({"id": ThreadId::new().to_string(), "message": ""})),
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("empty message should be rejected");
};
assert_eq!(
err,
FunctionCallError::RespondToModel(
"Empty message can't be send to an agent".to_string()
)
);
}
#[tokio::test]
async fn send_input_rejects_invalid_id() {
let (session, turn) = make_session_and_context().await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"send_input",
function_payload(json!({"id": "not-a-uuid", "message": "hi"})),
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("invalid id should be rejected");
};
let FunctionCallError::RespondToModel(msg) = err else {
panic!("expected respond-to-model error");
};
assert!(msg.starts_with("invalid agent id not-a-uuid:"));
}
#[tokio::test]
async fn send_input_reports_missing_agent() {
let (mut session, turn) = make_session_and_context().await;
let manager = thread_manager();
session.services.agent_control = manager.agent_control();
let agent_id = ThreadId::new();
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"send_input",
function_payload(json!({"id": agent_id.to_string(), "message": "hi"})),
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("missing agent should be reported");
};
assert_eq!(
err,
FunctionCallError::RespondToModel(format!("agent with id {agent_id} not found"))
);
}
#[tokio::test]
async fn wait_rejects_non_positive_timeout() {
let (session, turn) = make_session_and_context().await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"wait",
function_payload(json!({"id": ThreadId::new().to_string(), "timeout_ms": 0})),
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("non-positive timeout should be rejected");
};
assert_eq!(
err,
FunctionCallError::RespondToModel("timeout_ms must be greater than zero".to_string())
);
}
#[tokio::test]
async fn wait_rejects_invalid_id() {
let (session, turn) = make_session_and_context().await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"wait",
function_payload(json!({"id": "invalid"})),
);
let Err(err) = CollabHandler.handle(invocation).await else {
panic!("invalid id should be rejected");
};
let FunctionCallError::RespondToModel(msg) = err else {
panic!("expected respond-to-model error");
};
assert!(msg.starts_with("invalid agent id invalid:"));
}
#[tokio::test]
async fn wait_times_out_when_status_is_not_final() {
let (mut session, turn) = make_session_and_context().await;
let manager = thread_manager();
session.services.agent_control = manager.agent_control();
let config = turn.client.config().as_ref().clone();
let thread = manager.start_thread(config).await.expect("start thread");
let agent_id = thread.thread_id;
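// No turn is ever submitted for the spawned agent, so it stays in PendingInit and the 10 ms wait elapses.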
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"wait",
function_payload(json!({"id": agent_id.to_string(), "timeout_ms": 10})),
);
let output = CollabHandler
.handle(invocation)
.await
.expect("wait should succeed");
let ToolOutput::Function {
content, success, ..
} = output
else {
panic!("expected function output");
};
assert_eq!(content, r#"{"status":"pending_init","timed_out":true}"#);
assert_eq!(success, Some(false));
let _ = thread
.thread
.submit(Op::Shutdown {})
.await
.expect("shutdown should submit");
}
#[tokio::test]
async fn wait_returns_final_status_without_timeout() {
let (mut session, turn) = make_session_and_context().await;
let manager = thread_manager();
session.services.agent_control = manager.agent_control();
let config = turn.client.config().as_ref().clone();
let thread = manager.start_thread(config).await.expect("start thread");
let agent_id = thread.thread_id;
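// Subscribe to status updates before submitting shutdown, then wait for the change so the
// terminal status is already in place when the wait tool runs.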
let mut status_rx = manager
.agent_control()
.subscribe_status(agent_id)
.await
.expect("subscribe should succeed");
let _ = thread
.thread
.submit(Op::Shutdown {})
.await
.expect("shutdown should submit");
let _ = timeout(Duration::from_secs(1), status_rx.changed())
.await
.expect("shutdown status should arrive");
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"wait",
function_payload(json!({"id": agent_id.to_string(), "timeout_ms": 1000})),
);
let output = CollabHandler
.handle(invocation)
.await
.expect("wait should succeed");
let ToolOutput::Function {
content, success, ..
} = output
else {
panic!("expected function output");
};
assert_eq!(content, r#"{"status":"shutdown","timed_out":false}"#);
assert_eq!(success, Some(true));
}
#[tokio::test]
async fn close_agent_submits_shutdown_and_returns_status() {
let (mut session, turn) = make_session_and_context().await;
let manager = thread_manager();
session.services.agent_control = manager.agent_control();
let config = turn.client.config().as_ref().clone();
let thread = manager.start_thread(config).await.expect("start thread");
let agent_id = thread.thread_id;
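// Capture the status up front: close_agent reports the last known status in its result.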
let status_before = manager.agent_control().get_status(agent_id).await;
let invocation = invocation(
Arc::new(session),
Arc::new(turn),
"close_agent",
function_payload(json!({"id": agent_id.to_string()})),
);
let output = CollabHandler
.handle(invocation)
.await
.expect("close_agent should succeed");
let ToolOutput::Function {
content, success, ..
} = output
else {
panic!("expected function output");
};
let result: close_agent::CloseAgentResult =
serde_json::from_str(&content).expect("close_agent result should be json");
assert_eq!(result.status, status_before);
assert_eq!(success, Some(true));
let ops = manager.captured_ops();
let submitted_shutdown = ops
.iter()
.any(|(id, op)| *id == agent_id && matches!(op, Op::Shutdown));
assert!(submitted_shutdown);
}
#[tokio::test]
async fn build_agent_spawn_config_uses_turn_context_values() {
let (_session, mut turn) = make_session_and_context().await;
turn.developer_instructions = Some("dev".to_string());
turn.base_instructions = Some("base".to_string());
turn.compact_prompt = Some("compact".to_string());
turn.user_instructions = Some("user".to_string());
turn.shell_environment_policy = ShellEnvironmentPolicy {
use_profile: true,
..ShellEnvironmentPolicy::default()
};
let temp_dir = tempfile::tempdir().expect("temp dir");
turn.cwd = temp_dir.path().to_path_buf();
turn.codex_linux_sandbox_exe = Some(PathBuf::from("/bin/echo"));
turn.approval_policy = AskForApproval::Never;
turn.sandbox_policy = SandboxPolicy::DangerFullAccess;
let config = build_agent_spawn_config(&turn).expect("spawn config");
let mut expected = (*turn.client.config()).clone();
expected.model = Some(turn.client.get_model());
expected.model_provider = turn.client.get_provider();
expected.model_reasoning_effort = turn.client.get_reasoning_effort();
expected.model_reasoning_summary = turn.client.get_reasoning_summary();
expected.developer_instructions = turn.developer_instructions.clone();
expected.base_instructions = turn.base_instructions.clone();
expected.compact_prompt = turn.compact_prompt.clone();
expected.user_instructions = turn.user_instructions.clone();
expected.shell_environment_policy = turn.shell_environment_policy.clone();
expected.codex_linux_sandbox_exe = turn.codex_linux_sandbox_exe.clone();
expected.cwd = turn.cwd.clone();
expected
.approval_policy
.set(turn.approval_policy)
.expect("approval policy set");
expected
.sandbox_policy
.set(turn.sandbox_policy)
.expect("sandbox policy set");
assert_eq!(config, expected);
}
}

View File

@@ -11,7 +11,9 @@ use crate::tools::context::ToolPayload;
use crate::tools::handlers::parse_arguments;
use crate::tools::registry::ToolHandler;
use crate::tools::registry::ToolKind;
use codex_protocol::user_input::UserInput;
use codex_protocol::models::ContentItem;
use codex_protocol::models::ResponseInputItem;
use codex_protocol::models::local_image_content_items_with_label_number;
pub struct ViewImageHandler;
@@ -63,8 +65,15 @@ impl ToolHandler for ViewImageHandler {
}
let event_path = abs_path.clone();
let content: Vec<ContentItem> =
local_image_content_items_with_label_number(&abs_path, None);
let input = ResponseInputItem::Message {
role: "user".to_string(),
content,
};
session
.inject_input(vec![UserInput::LocalImage { path: abs_path }])
.inject_response_items(vec![input])
.await
.map_err(|_| {
FunctionCallError::RespondToModel(

View File

@@ -8,6 +8,7 @@ use crate::tools::handlers::apply_patch::create_apply_patch_json_tool;
use crate::tools::handlers::collab::DEFAULT_WAIT_TIMEOUT_MS;
use crate::tools::handlers::collab::MAX_WAIT_TIMEOUT_MS;
use crate::tools::registry::ToolRegistryBuilder;
use codex_protocol::models::VIEW_IMAGE_TOOL_NAME;
use codex_protocol::openai_models::ApplyPatchToolType;
use codex_protocol::openai_models::ConfigShellToolType;
use codex_protocol::openai_models::ModelInfo;
@@ -412,10 +413,9 @@ fn create_view_image_tool() -> ToolSpec {
)]);
ToolSpec::Function(ResponsesApiTool {
name: "view_image".to_string(),
description:
"Attach a local image (by filesystem path) to the thread context for this turn."
.to_string(),
name: VIEW_IMAGE_TOOL_NAME.to_string(),
description: "View a local image from the filesystem (only use if given a full filepath by the user, and the image isn't already attached to the thread context within <image ...> tags)."
.to_string(),
strict: false,
parameters: JsonSchema::Object {
properties,

View File

@@ -5,7 +5,6 @@ use codex_utils_cargo_bin::find_resource;
use tempfile::TempDir;
use codex_core::CodexThread;
use codex_core::config::CONFIG_TOML_FILE;
use codex_core::config::Config;
use codex_core::config::ConfigBuilder;
use codex_core::config::ConfigOverrides;
@@ -19,23 +18,6 @@ pub mod streaming_sse;
pub mod test_codex;
pub mod test_codex_exec;
pub fn prepare_test_project_root(codex_home: &TempDir) -> (TempDir, PathBuf) {
const MARKER: &str = ".codex-test-root";
let config_path = codex_home.path().join(CONFIG_TOML_FILE);
let config_contents = format!("project_root_markers = [\"{MARKER}\"]\n");
std::fs::write(config_path, config_contents).expect("write config.toml");
let root = TempDir::new().expect("tempdir");
let root_path = root.path().to_path_buf();
std::fs::write(root_path.join(MARKER), "").expect("write project root marker");
std::fs::write(root_path.join("AGENTS.md"), "Test project instructions.\n")
.expect("write AGENTS.md");
let cwd = root_path.join("workspace");
std::fs::create_dir_all(&cwd).expect("create workspace");
(root, cwd)
}
#[track_caller]
pub fn assert_regex_match<'s>(pattern: &str, actual: &'s str) -> regex_lite::Captures<'s> {
let regex = Regex::new(pattern).unwrap_or_else(|err| {

View File

@@ -18,7 +18,6 @@ use codex_core::protocol::WarningEvent;
use codex_protocol::config_types::ReasoningSummary;
use codex_protocol::user_input::UserInput;
use core_test_support::load_default_config_for_test;
use core_test_support::prepare_test_project_root;
use core_test_support::responses::ev_local_shell_call;
use core_test_support::responses::ev_reasoning_item;
use core_test_support::skip_if_no_network;
@@ -140,11 +139,9 @@ async fn summarize_context_three_requests_and_instructions() {
// Build config pointing to the mock server and spawn Codex.
let model_provider = non_openai_model_provider(&server);
let home = TempDir::new().unwrap();
let (_project_root, project_cwd) = prepare_test_project_root(&home);
let mut config = load_default_config_for_test(&home).await;
config.model_provider = model_provider;
set_test_compact_prompt(&mut config);
config.cwd = project_cwd;
config.model_auto_compact_token_limit = Some(200_000);
let thread_manager = ThreadManager::with_models_provider(
CodexAuth::from_api_key("dummy"),
@@ -414,11 +411,9 @@ async fn manual_compact_emits_api_and_local_token_usage_events() {
let model_provider = non_openai_model_provider(&server);
let home = TempDir::new().unwrap();
let (_project_root, project_cwd) = prepare_test_project_root(&home);
let mut config = load_default_config_for_test(&home).await;
config.model_provider = model_provider;
set_test_compact_prompt(&mut config);
config.cwd = project_cwd;
let thread_manager = ThreadManager::with_models_provider(
CodexAuth::from_api_key("dummy"),
@@ -1034,11 +1029,9 @@ async fn auto_compact_runs_after_token_limit_hit() {
let model_provider = non_openai_model_provider(&server);
let home = TempDir::new().unwrap();
let (_project_root, project_cwd) = prepare_test_project_root(&home);
let mut config = load_default_config_for_test(&home).await;
config.model_provider = model_provider;
set_test_compact_prompt(&mut config);
config.cwd = project_cwd;
config.model_auto_compact_token_limit = Some(200_000);
let thread_manager = ThreadManager::with_models_provider(
CodexAuth::from_api_key("dummy"),
@@ -1365,11 +1358,9 @@ async fn auto_compact_persists_rollout_entries() {
let model_provider = non_openai_model_provider(&server);
let home = TempDir::new().unwrap();
let (_project_root, project_cwd) = prepare_test_project_root(&home);
let mut config = load_default_config_for_test(&home).await;
config.model_provider = model_provider;
set_test_compact_prompt(&mut config);
config.cwd = project_cwd;
config.model_auto_compact_token_limit = Some(200_000);
let thread_manager = ThreadManager::with_models_provider(
CodexAuth::from_api_key("dummy"),
@@ -1616,11 +1607,9 @@ async fn manual_compact_twice_preserves_latest_user_messages() {
let model_provider = non_openai_model_provider(&server);
let home = TempDir::new().unwrap();
let (_project_root, project_cwd) = prepare_test_project_root(&home);
let mut config = load_default_config_for_test(&home).await;
config.model_provider = model_provider;
set_test_compact_prompt(&mut config);
config.cwd = project_cwd;
let codex = ThreadManager::with_models_provider(
CodexAuth::from_api_key("dummy"),
config.model_provider.clone(),

View File

@@ -24,7 +24,6 @@ use codex_core::protocol::WarningEvent;
use codex_core::spawn::CODEX_SANDBOX_NETWORK_DISABLED_ENV_VAR;
use codex_protocol::user_input::UserInput;
use core_test_support::load_default_config_for_test;
use core_test_support::prepare_test_project_root;
use core_test_support::responses::ResponseMock;
use core_test_support::responses::ev_assistant_message;
use core_test_support::responses::ev_completed;
@@ -152,7 +151,7 @@ async fn compact_resume_and_fork_preserve_model_history_view() {
let request_log = mount_initial_flow(&server).await;
let expected_model = "gpt-5.1-codex";
// 2. Start a new conversation and drive it through the compact/resume/fork steps.
let (_home, _project_root, config, manager, base) =
let (_home, config, manager, base) =
start_test_conversation(&server, Some(expected_model)).await;
user_turn(&base, "hello world").await;
@@ -605,8 +604,7 @@ async fn compact_resume_after_second_compaction_preserves_history() {
request_log.extend(mount_second_compact_flow(&server).await);
// 2. Drive the conversation through compact -> resume -> fork -> compact -> resume.
let (_home, _project_root, config, manager, base) =
start_test_conversation(&server, None).await;
let (_home, config, manager, base) = start_test_conversation(&server, None).await;
user_turn(&base, "hello world").await;
compact_conversation(&base).await;
@@ -865,18 +863,16 @@ async fn mount_second_compact_flow(server: &MockServer) -> Vec<ResponseMock> {
async fn start_test_conversation(
server: &MockServer,
model: Option<&str>,
) -> (TempDir, TempDir, Config, ThreadManager, Arc<CodexThread>) {
) -> (TempDir, Config, ThreadManager, Arc<CodexThread>) {
let model_provider = ModelProviderInfo {
name: "Non-OpenAI Model provider".into(),
base_url: Some(format!("{}/v1", server.uri())),
..built_in_model_providers()["openai"].clone()
};
let home = TempDir::new().expect("create temp dir");
let (project_root, project_cwd) = prepare_test_project_root(&home);
let mut config = load_default_config_for_test(&home).await;
config.model_provider = model_provider;
config.compact_prompt = Some(SUMMARIZATION_PROMPT.to_string());
config.cwd = project_cwd;
if let Some(model) = model {
config.model = Some(model.to_string());
}
@@ -889,7 +885,7 @@ async fn start_test_conversation(
.await
.expect("create conversation");
(home, project_root, config, manager, thread)
(home, config, manager, thread)
}
async fn user_turn(conversation: &Arc<CodexThread>, text: &str) {

View File

@@ -0,0 +1,71 @@
use codex_core::features::Feature;
use core_test_support::load_sse_fixture_with_id;
use core_test_support::responses::mount_sse_once;
use core_test_support::responses::start_mock_server;
use core_test_support::test_codex::test_codex;
const HIERARCHICAL_AGENTS_SNIPPET: &str =
"Files called AGENTS.md commonly appear in many places inside a container";
fn sse_completed(id: &str) -> String {
load_sse_fixture_with_id("../fixtures/completed_template.json", id)
}
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn hierarchical_agents_appends_to_project_doc_in_user_instructions() {
let server = start_mock_server().await;
let resp_mock = mount_sse_once(&server, sse_completed("resp1")).await;
let mut builder = test_codex().with_config(|config| {
config.features.enable(Feature::HierarchicalAgents);
std::fs::write(config.cwd.join("AGENTS.md"), "be nice").expect("write AGENTS.md");
});
let test = builder.build(&server).await.expect("build test codex");
test.submit_turn("hello").await.expect("submit turn");
let request = resp_mock.single_request();
let user_messages = request.message_input_texts("user");
let instructions = user_messages
.iter()
.find(|text| text.starts_with("# AGENTS.md instructions for "))
.expect("instructions message");
assert!(
instructions.contains("be nice"),
"expected AGENTS.md text included: {instructions}"
);
let snippet_pos = instructions
.find(HIERARCHICAL_AGENTS_SNIPPET)
.expect("expected hierarchical agents snippet");
let base_pos = instructions
.find("be nice")
.expect("expected AGENTS.md text");
assert!(
snippet_pos > base_pos,
"expected hierarchical agents message appended after base instructions: {instructions}"
);
}
#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
async fn hierarchical_agents_emits_when_no_project_doc() {
let server = start_mock_server().await;
let resp_mock = mount_sse_once(&server, sse_completed("resp1")).await;
let mut builder = test_codex().with_config(|config| {
config.features.enable(Feature::HierarchicalAgents);
});
let test = builder.build(&server).await.expect("build test codex");
test.submit_turn("hello").await.expect("submit turn");
let request = resp_mock.single_request();
let user_messages = request.message_input_texts("user");
let instructions = user_messages
.iter()
.find(|text| text.starts_with("# AGENTS.md instructions for "))
.expect("instructions message");
assert!(
instructions.contains(HIERARCHICAL_AGENTS_SNIPPET),
"expected hierarchical agents message appended: {instructions}"
);
}

View File

@@ -30,6 +30,7 @@ mod exec;
mod exec_policy;
mod fork_thread;
mod grep_files;
mod hierarchical_agents;
mod items;
mod json_result;
mod list_dir;

View File

@@ -4,9 +4,7 @@ use std::sync::Arc;
use anyhow::Result;
use codex_core::CodexAuth;
use codex_core::CodexThread;
use codex_core::ModelProviderInfo;
use codex_core::ThreadManager;
use codex_core::built_in_model_providers;
use codex_core::config::Config;
use codex_core::error::CodexErr;
@@ -39,6 +37,8 @@ use core_test_support::responses::mount_sse_sequence;
use core_test_support::responses::sse;
use core_test_support::skip_if_no_network;
use core_test_support::skip_if_sandbox;
use core_test_support::test_codex::TestCodex;
use core_test_support::test_codex::test_codex;
use core_test_support::wait_for_event;
use core_test_support::wait_for_event_match;
use pretty_assertions::assert_eq;
@@ -98,19 +98,19 @@ async fn remote_models_remote_model_uses_unified_exec() -> Result<()> {
)
.await;
let harness = build_remote_models_harness(&server, |config| {
config.features.enable(Feature::RemoteModels);
config.model = Some("gpt-5.1".to_string());
})
.await?;
let RemoteModelsHarness {
let mut builder = test_codex()
.with_auth(CodexAuth::create_dummy_chatgpt_auth_for_testing())
.with_config(|config| {
config.features.enable(Feature::RemoteModels);
config.model = Some("gpt-5.1".to_string());
});
let TestCodex {
codex,
cwd,
config,
thread_manager,
..
} = harness;
} = builder.build(&server).await?;
let models_manager = thread_manager.get_models_manager();
let available_model =
@@ -214,16 +214,19 @@ async fn remote_models_truncation_policy_without_override_preserves_remote() ->
)
.await;
let harness = build_remote_models_harness(&server, |config| {
config.model = Some("gpt-5.1".to_string());
})
.await?;
let mut builder = test_codex()
.with_auth(CodexAuth::create_dummy_chatgpt_auth_for_testing())
.with_config(|config| {
config.features.enable(Feature::RemoteModels);
config.model = Some("gpt-5.1".to_string());
});
let test = builder.build(&server).await?;
let models_manager = harness.thread_manager.get_models_manager();
wait_for_model_available(&models_manager, slug, &harness.config).await;
let models_manager = test.thread_manager.get_models_manager();
wait_for_model_available(&models_manager, slug, &test.config).await;
let model_info = models_manager
.construct_model_info(slug, &harness.config)
.construct_model_info(slug, &test.config)
.await;
assert_eq!(
model_info.truncation_policy,
@@ -258,17 +261,20 @@ async fn remote_models_truncation_policy_with_tool_output_override() -> Result<(
)
.await;
let harness = build_remote_models_harness(&server, |config| {
config.model = Some("gpt-5.1".to_string());
config.tool_output_token_limit = Some(50);
})
.await?;
let mut builder = test_codex()
.with_auth(CodexAuth::create_dummy_chatgpt_auth_for_testing())
.with_config(|config| {
config.features.enable(Feature::RemoteModels);
config.model = Some("gpt-5.1".to_string());
config.tool_output_token_limit = Some(50);
});
let test = builder.build(&server).await?;
let models_manager = harness.thread_manager.get_models_manager();
wait_for_model_available(&models_manager, slug, &harness.config).await;
let models_manager = test.thread_manager.get_models_manager();
wait_for_model_available(&models_manager, slug, &test.config).await;
let model_info = models_manager
.construct_model_info(slug, &harness.config)
.construct_model_info(slug, &test.config)
.await;
assert_eq!(
model_info.truncation_policy,
@@ -335,19 +341,19 @@ async fn remote_models_apply_remote_base_instructions() -> Result<()> {
)
.await;
let harness = build_remote_models_harness(&server, |config| {
config.features.enable(Feature::RemoteModels);
config.model = Some("gpt-5.1".to_string());
})
.await?;
let RemoteModelsHarness {
let mut builder = test_codex()
.with_auth(CodexAuth::create_dummy_chatgpt_auth_for_testing())
.with_config(|config| {
config.features.enable(Feature::RemoteModels);
config.model = Some("gpt-5.1".to_string());
});
let TestCodex {
codex,
cwd,
config,
thread_manager,
..
} = harness;
} = builder.build(&server).await?;
let models_manager = thread_manager.get_models_manager();
wait_for_model_available(&models_manager, model, &config).await;
@@ -577,50 +583,6 @@ async fn wait_for_model_available(
}
}
struct RemoteModelsHarness {
codex: Arc<CodexThread>,
cwd: Arc<TempDir>,
config: Config,
thread_manager: Arc<ThreadManager>,
}
// todo(aibrahim): move this to with_model_provier in test_codex
async fn build_remote_models_harness<F>(
server: &MockServer,
mutate_config: F,
) -> Result<RemoteModelsHarness>
where
F: FnOnce(&mut Config),
{
let auth = CodexAuth::create_dummy_chatgpt_auth_for_testing();
let home = Arc::new(TempDir::new()?);
let cwd = Arc::new(TempDir::new()?);
let mut config = load_default_config_for_test(&home).await;
config.cwd = cwd.path().to_path_buf();
config.features.enable(Feature::RemoteModels);
let provider = ModelProviderInfo {
base_url: Some(format!("{}/v1", server.uri())),
..built_in_model_providers()["openai"].clone()
};
config.model_provider = provider.clone();
mutate_config(&mut config);
let thread_manager = ThreadManager::with_models_provider(auth, provider);
let thread_manager = Arc::new(thread_manager);
let new_conversation = thread_manager.start_thread(config.clone()).await?;
Ok(RemoteModelsHarness {
codex: new_conversation.thread,
cwd,
config,
thread_manager,
})
}
fn test_remote_model(slug: &str, visibility: ModelVisibility, priority: i32) -> ModelInfo {
test_remote_model_with_policy(
slug,

View File

@@ -216,6 +216,20 @@ async fn view_image_tool_attaches_local_image() -> anyhow::Result<()> {
let image_message =
find_image_message(&body).expect("pending input image message not included in request");
let content_items = image_message
.get("content")
.and_then(Value::as_array)
.expect("image message has content array");
assert_eq!(
content_items.len(),
1,
"view_image should inject only the image content item (no tag/label text)"
);
assert_eq!(
content_items[0].get("type").and_then(Value::as_str),
Some("input_image"),
"view_image should inject only an input_image content item"
);
let image_url = image_message
.get("content")
.and_then(Value::as_array)

View File

@@ -571,6 +571,9 @@ impl EventProcessor for EventProcessorWithHumanOutput {
EventMsg::ContextCompacted(_) => {
ts_msg!(self, "context compacted");
}
EventMsg::CollabInteraction(_) => {
// TODO(jif) handle collab tools.
}
EventMsg::ShutdownComplete => return CodexStatus::Shutdown,
EventMsg::WebSearchBegin(_)
| EventMsg::ExecApprovalRequest(_)

View File

@@ -306,6 +306,7 @@ async fn run_codex_tool_session_inner(
| EventMsg::ExitedReviewMode(_)
| EventMsg::ContextCompacted(_)
| EventMsg::ThreadRolledBack(_)
| EventMsg::CollabInteraction(_)
| EventMsg::DeprecationNotice(_) => {
// For now, we do not do anything extra for these
// events. Note that

View File

@@ -180,6 +180,48 @@ fn local_image_error_placeholder(
}
}
pub const VIEW_IMAGE_TOOL_NAME: &str = "view_image";
const IMAGE_OPEN_TAG: &str = "<image>";
const IMAGE_CLOSE_TAG: &str = "</image>";
const LOCAL_IMAGE_OPEN_TAG_PREFIX: &str = "<image name=";
const LOCAL_IMAGE_OPEN_TAG_SUFFIX: &str = ">";
const LOCAL_IMAGE_CLOSE_TAG: &str = IMAGE_CLOSE_TAG;
pub fn image_open_tag_text() -> String {
IMAGE_OPEN_TAG.to_string()
}
pub fn image_close_tag_text() -> String {
IMAGE_CLOSE_TAG.to_string()
}
pub fn local_image_label_text(label_number: usize) -> String {
format!("[Image #{label_number}]")
}
pub fn local_image_open_tag_text(label_number: usize) -> String {
let label = local_image_label_text(label_number);
format!("{LOCAL_IMAGE_OPEN_TAG_PREFIX}{label}{LOCAL_IMAGE_OPEN_TAG_SUFFIX}")
}
pub fn is_local_image_open_tag_text(text: &str) -> bool {
text.strip_prefix(LOCAL_IMAGE_OPEN_TAG_PREFIX)
.is_some_and(|rest| rest.ends_with(LOCAL_IMAGE_OPEN_TAG_SUFFIX))
}
pub fn is_local_image_close_tag_text(text: &str) -> bool {
is_image_close_tag_text(text)
}
pub fn is_image_open_tag_text(text: &str) -> bool {
text == IMAGE_OPEN_TAG
}
pub fn is_image_close_tag_text(text: &str) -> bool {
text == IMAGE_CLOSE_TAG
}
fn invalid_image_error_placeholder(
path: &std::path::Path,
error: impl std::fmt::Display,
@@ -203,6 +245,53 @@ fn unsupported_image_error_placeholder(path: &std::path::Path, mime: &str) -> Co
}
}
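/// Builds the content items for a local image. With a label number, the image is wrapped
/// between `<image name=[Image #N]>` and `</image>` text items so the model can resolve
/// inline references by name; without one, only the image item is emitted. On load failure
/// a single placeholder item describing the error is returned instead.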
pub fn local_image_content_items_with_label_number(
path: &std::path::Path,
label_number: Option<usize>,
) -> Vec<ContentItem> {
match load_and_resize_to_fit(path) {
Ok(image) => {
let mut items = Vec::with_capacity(3);
if let Some(label_number) = label_number {
items.push(ContentItem::InputText {
text: local_image_open_tag_text(label_number),
});
}
items.push(ContentItem::InputImage {
image_url: image.into_data_url(),
});
if label_number.is_some() {
items.push(ContentItem::InputText {
text: LOCAL_IMAGE_CLOSE_TAG.to_string(),
});
}
items
}
Err(err) => {
if matches!(&err, ImageProcessingError::Read { .. }) {
vec![local_image_error_placeholder(path, &err)]
} else if err.is_invalid_image() {
vec![invalid_image_error_placeholder(path, &err)]
} else {
let Some(mime_guess) = mime_guess::from_path(path).first() else {
return vec![local_image_error_placeholder(
path,
"unsupported MIME type (unknown)",
)];
};
let mime = mime_guess.essence_str().to_owned();
if !mime.starts_with("image/") {
return vec![local_image_error_placeholder(
path,
format!("unsupported MIME type `{mime}`"),
)];
}
vec![unsupported_image_error_placeholder(path, &mime)]
}
}
}
}
impl From<ResponseInputItem> for ResponseItem {
fn from(item: ResponseInputItem) -> Self {
match item {
@@ -296,41 +385,27 @@ pub enum ReasoningItemContent {
impl From<Vec<UserInput>> for ResponseInputItem {
fn from(items: Vec<UserInput>) -> Self {
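// Local images are labeled in attachment order; image_index supplies the 1-based [Image #N] number.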
let mut image_index = 0;
Self::Message {
role: "user".to_string(),
content: items
.into_iter()
.filter_map(|c| match c {
UserInput::Text { text } => Some(ContentItem::InputText { text }),
UserInput::Image { image_url } => Some(ContentItem::InputImage { image_url }),
UserInput::LocalImage { path } => match load_and_resize_to_fit(&path) {
Ok(image) => Some(ContentItem::InputImage {
image_url: image.into_data_url(),
}),
Err(err) => {
if matches!(&err, ImageProcessingError::Read { .. }) {
Some(local_image_error_placeholder(&path, &err))
} else if err.is_invalid_image() {
Some(invalid_image_error_placeholder(&path, &err))
} else {
let Some(mime_guess) = mime_guess::from_path(&path).first() else {
return Some(local_image_error_placeholder(
&path,
"unsupported MIME type (unknown)",
));
};
let mime = mime_guess.essence_str().to_owned();
if !mime.starts_with("image/") {
return Some(local_image_error_placeholder(
&path,
format!("unsupported MIME type `{mime}`"),
));
}
Some(unsupported_image_error_placeholder(&path, &mime))
}
}
},
UserInput::Skill { .. } => None, // Skill bodies are injected later in core
.flat_map(|c| match c {
UserInput::Text { text } => vec![ContentItem::InputText { text }],
UserInput::Image { image_url } => vec![
ContentItem::InputText {
text: image_open_tag_text(),
},
ContentItem::InputImage { image_url },
ContentItem::InputText {
text: image_close_tag_text(),
},
],
UserInput::LocalImage { path } => {
image_index += 1;
local_image_content_items_with_label_number(&path, Some(image_index))
}
UserInput::Skill { .. } => Vec::new(), // Skill bodies are injected later in core
})
.collect::<Vec<ContentItem>>(),
}
@@ -770,6 +845,33 @@ mod tests {
Ok(())
}
#[test]
fn wraps_image_user_input_with_tags() -> Result<()> {
let image_url = "data:image/png;base64,abc".to_string();
let item = ResponseInputItem::from(vec![UserInput::Image {
image_url: image_url.clone(),
}]);
match item {
ResponseInputItem::Message { content, .. } => {
let expected = vec![
ContentItem::InputText {
text: image_open_tag_text(),
},
ContentItem::InputImage { image_url },
ContentItem::InputText {
text: image_close_tag_text(),
},
];
assert_eq!(content, expected);
}
other => panic!("expected message response but got {other:?}"),
}
Ok(())
}
#[test]
fn local_image_read_error_adds_placeholder() -> Result<()> {
let dir = tempdir()?;

View File

@@ -683,6 +683,9 @@ pub enum EventMsg {
AgentMessageContentDelta(AgentMessageContentDeltaEvent),
ReasoningContentDelta(ReasoningContentDeltaEvent),
ReasoningRawContentDelta(ReasoningRawContentDeltaEvent),
/// Collab interaction.
CollabInteraction(CollabInteractionEvent),
}
/// Agent lifecycle status, derived from emitted events.
@@ -699,7 +702,7 @@ pub enum AgentStatus {
Completed(Option<String>),
/// Agent encountered an error.
Errored(String),
/// Agent has been shutdowned.
/// Agent has been shut down.
Shutdown,
/// Agent is not found.
NotFound,
@@ -1933,6 +1936,56 @@ pub enum TurnAbortReason {
ReviewEnded,
}
#[derive(Debug, Clone, Deserialize, Serialize, PartialEq, JsonSchema, TS)]
#[serde(rename_all = "snake_case")]
pub enum CollabInteractionEvent {
AgentSpawned {
/// Thread ID of the sender.
sender_id: ThreadId,
/// Thread ID of the newly spawned agent.
new_id: ThreadId,
/// Initial prompt sent to the agent. May be empty to avoid leaking chain-of-thought at
/// the start of the interaction.
prompt: String,
},
AgentInteraction {
/// Thread ID of the sender.
sender_id: ThreadId,
/// Thread ID of the receiver.
receiver_id: ThreadId,
/// Prompt sent from the sender to the receiver. May be empty to avoid leaking
/// chain-of-thought at the start of the interaction.
prompt: String,
},
WaitingBegin {
/// Thread ID of the sender.
sender_id: ThreadId,
/// Thread ID of the receiver.
receiver_id: ThreadId,
/// ID of the waiting call.
waiting_id: String,
},
WaitingEnd {
/// Thread ID of the sender.
sender_id: ThreadId,
/// Thread ID of the receiver.
receiver_id: ThreadId,
/// ID of the waiting call.
waiting_id: String,
/// Final status of the receiver agent reported to the sender agent.
status: AgentStatus,
},
Close {
/// Thread ID of the sender.
sender_id: ThreadId,
/// Thread ID of the receiver.
receiver_id: ThreadId,
/// Last known status of the receiver agent reported to the sender agent before
/// the close.
status: AgentStatus,
},
}
#[cfg(test)]
mod tests {
use super::*;

View File

@@ -46,6 +46,7 @@ use crate::style::user_message_style;
use codex_common::fuzzy_match::fuzzy_match;
use codex_protocol::custom_prompts::CustomPrompt;
use codex_protocol::custom_prompts::PROMPTS_CMD_PREFIX;
use codex_protocol::models::local_image_label_text;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
@@ -59,7 +60,7 @@ use codex_core::skills::model::SkillMetadata;
use codex_file_search::FileMatch;
use std::cell::RefCell;
use std::collections::HashMap;
use std::path::Path;
use std::collections::HashSet;
use std::path::PathBuf;
use std::time::Duration;
use std::time::Instant;
@@ -274,10 +275,12 @@ impl ChatComposer {
// normalize_pasted_path already handles Windows → WSL path conversion,
// so we can directly try to read the image dimensions.
match image::image_dimensions(&path_buf) {
Ok((w, h)) => {
Ok((width, height)) => {
tracing::info!("OK: {pasted}");
let format_label = pasted_image_format(&path_buf).label();
self.attach_image(path_buf, w, h, format_label);
tracing::debug!("image dimensions={}x{}", width, height);
let format = pasted_image_format(&path_buf);
tracing::debug!("attached image format={}", format.label());
self.attach_image(path_buf);
true
}
Err(err) => {
@@ -408,13 +411,9 @@ impl ChatComposer {
}
/// Attempt to start a burst by retro-capturing recent chars before the cursor.
pub fn attach_image(&mut self, path: PathBuf, width: u32, height: u32, _format_label: &str) {
let file_label = path
.file_name()
.map(|name| name.to_string_lossy().into_owned())
.unwrap_or_else(|| "image".to_string());
let base_placeholder = format!("{file_label} {width}x{height}");
let placeholder = self.next_image_placeholder(&base_placeholder);
pub fn attach_image(&mut self, path: PathBuf) {
let image_number = self.attached_images.len() + 1;
let placeholder = local_image_label_text(image_number);
// Insert as an element to match large paste placeholder behavior:
// styled distinctly and treated atomically for cursor/mutations.
self.textarea.insert_element(&placeholder);
@@ -477,22 +476,6 @@ impl ChatComposer {
}
}
fn next_image_placeholder(&mut self, base: &str) -> String {
let text = self.textarea.text();
let mut suffix = 1;
loop {
let placeholder = if suffix == 1 {
format!("[{base}]")
} else {
format!("[{base} #{suffix}]")
};
if !text.contains(&placeholder) {
return placeholder;
}
suffix += 1;
}
}
pub(crate) fn insert_str(&mut self, text: &str) {
self.textarea.insert_str(text);
self.sync_popups();
@@ -818,47 +801,43 @@ impl ChatComposer {
if is_image {
// Determine dimensions; if that fails fall back to normal path insertion.
let path_buf = PathBuf::from(&sel_path);
if let Ok((w, h)) = image::image_dimensions(&path_buf) {
// Remove the current @token (mirror logic from insert_selected_path without inserting text)
// using the flat text and byte-offset cursor API.
let cursor_offset = self.textarea.cursor();
let text = self.textarea.text();
// Clamp to a valid char boundary to avoid panics when slicing.
let safe_cursor = Self::clamp_to_char_boundary(text, cursor_offset);
let before_cursor = &text[..safe_cursor];
let after_cursor = &text[safe_cursor..];
match image::image_dimensions(&path_buf) {
Ok((width, height)) => {
tracing::debug!("selected image dimensions={}x{}", width, height);
// Remove the current @token (mirror logic from insert_selected_path without inserting text)
// using the flat text and byte-offset cursor API.
let cursor_offset = self.textarea.cursor();
let text = self.textarea.text();
// Clamp to a valid char boundary to avoid panics when slicing.
let safe_cursor = Self::clamp_to_char_boundary(text, cursor_offset);
let before_cursor = &text[..safe_cursor];
let after_cursor = &text[safe_cursor..];
// Determine token boundaries in the full text.
let start_idx = before_cursor
.char_indices()
.rfind(|(_, c)| c.is_whitespace())
.map(|(idx, c)| idx + c.len_utf8())
.unwrap_or(0);
let end_rel_idx = after_cursor
.char_indices()
.find(|(_, c)| c.is_whitespace())
.map(|(idx, _)| idx)
.unwrap_or(after_cursor.len());
let end_idx = safe_cursor + end_rel_idx;
// Determine token boundaries in the full text.
let start_idx = before_cursor
.char_indices()
.rfind(|(_, c)| c.is_whitespace())
.map(|(idx, c)| idx + c.len_utf8())
.unwrap_or(0);
let end_rel_idx = after_cursor
.char_indices()
.find(|(_, c)| c.is_whitespace())
.map(|(idx, _)| idx)
.unwrap_or(after_cursor.len());
let end_idx = safe_cursor + end_rel_idx;
self.textarea.replace_range(start_idx..end_idx, "");
self.textarea.set_cursor(start_idx);
self.textarea.replace_range(start_idx..end_idx, "");
self.textarea.set_cursor(start_idx);
let format_label = match Path::new(&sel_path)
.extension()
.and_then(|e| e.to_str())
.map(str::to_ascii_lowercase)
{
Some(ext) if ext == "png" => "PNG",
Some(ext) if ext == "jpg" || ext == "jpeg" => "JPEG",
_ => "IMG",
};
self.attach_image(path_buf, w, h, format_label);
// Add a trailing space to keep typing fluid.
self.textarea.insert_str(" ");
} else {
// Fallback to plain path insertion if metadata read fails.
self.insert_selected_path(&sel_path);
self.attach_image(path_buf);
// Add a trailing space to keep typing fluid.
self.textarea.insert_str(" ");
}
Err(err) => {
tracing::trace!("image dimensions lookup failed: {err}");
// Fallback to plain path insertion if metadata read fails.
self.insert_selected_path(&sel_path);
}
}
} else {
// Non-image: inserting file path.
@@ -1464,20 +1443,29 @@ impl ChatComposer {
}
}
// For non-char inputs (or after flushing), handle normally.
// Special handling for backspace on placeholders
if let KeyEvent {
code: KeyCode::Backspace,
..
} = input
&& self.try_remove_any_placeholder_at_cursor()
// Backspace at the start of an image placeholder should delete that placeholder (rather
// than deleting content before it). Do this without scanning the full text by consulting
// the textarea's element list.
if matches!(input.code, KeyCode::Backspace)
&& self.try_remove_image_element_at_cursor_start()
{
return (InputResult::None, true);
}
// Normal input handling
// For non-char inputs (or after flushing), handle normally.
// Track element removals so we can drop any corresponding placeholders without scanning
// the full text. (Placeholders are atomic elements; when deleted, the element disappears.)
let elements_before = if self.pending_pastes.is_empty() && self.attached_images.is_empty() {
None
} else {
Some(self.textarea.element_payloads())
};
self.textarea.input(input);
let text_after = self.textarea.text();
if let Some(elements_before) = elements_before {
self.reconcile_deleted_elements(elements_before);
}
// Update paste-burst heuristic for plain Char (no Ctrl/Alt) events.
let crossterm::event::KeyEvent {
@@ -1499,176 +1487,69 @@ impl ChatComposer {
}
}
// Check if any placeholders were removed and remove their corresponding pending pastes
self.pending_pastes
.retain(|(placeholder, _)| text_after.contains(placeholder));
// Keep attached images in proportion to how many matching placeholders exist in the text.
// This handles duplicate placeholders that share the same visible label.
if !self.attached_images.is_empty() {
let mut needed: HashMap<String, usize> = HashMap::new();
for img in &self.attached_images {
needed
.entry(img.placeholder.clone())
.or_insert_with(|| text_after.matches(&img.placeholder).count());
}
let mut used: HashMap<String, usize> = HashMap::new();
let mut kept: Vec<AttachedImage> = Vec::with_capacity(self.attached_images.len());
for img in self.attached_images.drain(..) {
let total_needed = *needed.get(&img.placeholder).unwrap_or(&0);
let used_count = used.entry(img.placeholder.clone()).or_insert(0);
if *used_count < total_needed {
kept.push(img);
*used_count += 1;
}
}
self.attached_images = kept;
}
(InputResult::None, true)
}
/// Attempts to remove an image or paste placeholder if the cursor is at the end of one.
/// Returns true if a placeholder was removed.
fn try_remove_any_placeholder_at_cursor(&mut self) -> bool {
// Clamp the cursor to a valid char boundary to avoid panics when slicing.
let text = self.textarea.text();
let p = Self::clamp_to_char_boundary(text, self.textarea.cursor());
// Try image placeholders first
let mut out: Option<(usize, String)> = None;
// Detect if the cursor is at the end of any image placeholder.
// If duplicates exist, remove the specific occurrence's mapping.
for (i, img) in self.attached_images.iter().enumerate() {
let ph = &img.placeholder;
if p < ph.len() {
continue;
}
let start = p - ph.len();
if text.get(start..p) != Some(ph.as_str()) {
continue;
}
// Count the number of occurrences of `ph` before `start`.
let mut occ_before = 0usize;
let mut search_pos = 0usize;
while search_pos < start {
let segment = match text.get(search_pos..start) {
Some(s) => s,
None => break,
};
if let Some(found) = segment.find(ph) {
occ_before += 1;
search_pos += found + ph.len();
} else {
break;
}
}
// Remove the occ_before-th attached image that shares this placeholder label.
out = if let Some((remove_idx, _)) = self
.attached_images
.iter()
.enumerate()
.filter(|(_, img2)| img2.placeholder == *ph)
.nth(occ_before)
{
Some((remove_idx, ph.clone()))
} else {
Some((i, ph.clone()))
};
break;
}
if let Some((idx, placeholder)) = out {
self.textarea.replace_range(p - placeholder.len()..p, "");
self.attached_images.remove(idx);
return true;
fn try_remove_image_element_at_cursor_start(&mut self) -> bool {
if self.attached_images.is_empty() {
return false;
}
// Also handle when the cursor is at the START of an image placeholder.
// let result = 'out: {
let out: Option<(usize, String)> = 'out: {
for (i, img) in self.attached_images.iter().enumerate() {
let ph = &img.placeholder;
if p + ph.len() > text.len() {
continue;
}
if text.get(p..p + ph.len()) != Some(ph.as_str()) {
continue;
}
// Count occurrences of `ph` before `p`.
let mut occ_before = 0usize;
let mut search_pos = 0usize;
while search_pos < p {
let segment = match text.get(search_pos..p) {
Some(s) => s,
None => break 'out None,
};
if let Some(found) = segment.find(ph) {
occ_before += 1;
search_pos += found + ph.len();
} else {
break 'out None;
}
}
if let Some((remove_idx, _)) = self
.attached_images
.iter()
.enumerate()
.filter(|(_, img2)| img2.placeholder == *ph)
.nth(occ_before)
{
break 'out Some((remove_idx, ph.clone()));
} else {
break 'out Some((i, ph.clone()));
}
}
None
let p = self.textarea.cursor();
let Some(payload) = self.textarea.element_payload_starting_at(p) else {
return false;
};
let Some(idx) = self
.attached_images
.iter()
.position(|img| img.placeholder == payload)
else {
return false;
};
if let Some((idx, placeholder)) = out {
self.textarea.replace_range(p..p + placeholder.len(), "");
self.attached_images.remove(idx);
return true;
self.textarea.replace_range(p..p + payload.len(), "");
self.attached_images.remove(idx);
self.relabel_attached_images_and_update_placeholders();
true
}
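/// Drops pending pastes and attached images whose placeholder elements disappeared in the
/// last edit, and relabels the remaining images when any image mapping was removed.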
fn reconcile_deleted_elements(&mut self, elements_before: Vec<String>) {
let elements_after: HashSet<String> =
self.textarea.element_payloads().into_iter().collect();
let mut removed_any_image = false;
for removed in elements_before
.into_iter()
.filter(|payload| !elements_after.contains(payload))
{
self.pending_pastes.retain(|(ph, _)| ph != &removed);
if let Some(idx) = self
.attached_images
.iter()
.position(|img| img.placeholder == removed)
{
self.attached_images.remove(idx);
removed_any_image = true;
}
}
// Then try pasted-content placeholders
if let Some(placeholder) = self.pending_pastes.iter().find_map(|(ph, _)| {
if p < ph.len() {
return None;
}
let start = p - ph.len();
if text.get(start..p) == Some(ph.as_str()) {
Some(ph.clone())
} else {
None
}
}) {
self.textarea.replace_range(p - placeholder.len()..p, "");
self.pending_pastes.retain(|(ph, _)| ph != &placeholder);
return true;
if removed_any_image {
self.relabel_attached_images_and_update_placeholders();
}
}
// Also handle when the cursor is at the START of a pasted-content placeholder.
if let Some(placeholder) = self.pending_pastes.iter().find_map(|(ph, _)| {
if p + ph.len() > text.len() {
return None;
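/// Renumbers the remaining attached images as [Image #1..=N] and renames the matching
/// textarea elements so the visible labels stay contiguous.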
fn relabel_attached_images_and_update_placeholders(&mut self) {
for idx in 0..self.attached_images.len() {
let expected = local_image_label_text(idx + 1);
let current = self.attached_images[idx].placeholder.clone();
if current == expected {
continue;
}
if text.get(p..p + ph.len()) == Some(ph.as_str()) {
Some(ph.clone())
} else {
None
}
}) {
self.textarea.replace_range(p..p + placeholder.len(), "");
self.pending_pastes.retain(|(ph, _)| ph != &placeholder);
return true;
self.attached_images[idx].placeholder = expected.clone();
let _renamed = self.textarea.replace_element_payload(&current, &expected);
}
false
}
fn handle_shortcut_overlay_key(&mut self, key_event: &KeyEvent) -> bool {
@@ -3477,12 +3358,12 @@ mod tests {
false,
);
let path = PathBuf::from("/tmp/image1.png");
composer.attach_image(path.clone(), 32, 16, "PNG");
composer.attach_image(path.clone());
composer.handle_paste(" hi".into());
let (result, _) =
composer.handle_key_event(KeyEvent::new(KeyCode::Enter, KeyModifiers::NONE));
match result {
InputResult::Submitted(text) => assert_eq!(text, "[image1.png 32x16] hi"),
InputResult::Submitted(text) => assert_eq!(text, "[Image #1] hi"),
_ => panic!("expected Submitted"),
}
let imgs = composer.take_recent_submission_images();
@@ -3501,11 +3382,11 @@ mod tests {
false,
);
let path = PathBuf::from("/tmp/image2.png");
composer.attach_image(path.clone(), 10, 5, "PNG");
composer.attach_image(path.clone());
let (result, _) =
composer.handle_key_event(KeyEvent::new(KeyCode::Enter, KeyModifiers::NONE));
match result {
InputResult::Submitted(text) => assert_eq!(text, "[image2.png 10x5]"),
InputResult::Submitted(text) => assert_eq!(text, "[Image #1]"),
_ => panic!("expected Submitted"),
}
let imgs = composer.take_recent_submission_images();
@@ -3526,21 +3407,15 @@ mod tests {
false,
);
let path = PathBuf::from("/tmp/image_dup.png");
composer.attach_image(path.clone(), 10, 5, "PNG");
composer.attach_image(path.clone());
composer.handle_paste(" ".into());
composer.attach_image(path, 10, 5, "PNG");
composer.attach_image(path);
let text = composer.textarea.text().to_string();
assert!(text.contains("[image_dup.png 10x5]"));
assert!(text.contains("[image_dup.png 10x5 #2]"));
assert_eq!(
composer.attached_images[0].placeholder,
"[image_dup.png 10x5]"
);
assert_eq!(
composer.attached_images[1].placeholder,
"[image_dup.png 10x5 #2]"
);
assert!(text.contains("[Image #1]"));
assert!(text.contains("[Image #2]"));
assert_eq!(composer.attached_images[0].placeholder, "[Image #1]");
assert_eq!(composer.attached_images[1].placeholder, "[Image #2]");
}
#[test]
@@ -3555,7 +3430,7 @@ mod tests {
false,
);
let path = PathBuf::from("/tmp/image3.png");
composer.attach_image(path.clone(), 20, 10, "PNG");
composer.attach_image(path.clone());
let placeholder = composer.attached_images[0].placeholder.clone();
// Case 1: backspace at end
@@ -3566,7 +3441,7 @@ mod tests {
// Re-add and test backspace in middle: should break the placeholder string
// and drop the image mapping (same as text placeholder behavior).
composer.attach_image(path, 20, 10, "PNG");
composer.attach_image(path);
let placeholder2 = composer.attached_images[0].placeholder.clone();
// Move cursor to roughly middle of placeholder
if let Some(start_pos) = composer.textarea.text().find(&placeholder2) {
@@ -3598,7 +3473,7 @@ mod tests {
// Insert an image placeholder at the start
let path = PathBuf::from("/tmp/image_multibyte.png");
composer.attach_image(path, 10, 5, "PNG");
composer.attach_image(path);
// Add multibyte text after the placeholder
composer.textarea.insert_str("日本語");
@@ -3607,16 +3482,11 @@ mod tests {
composer.handle_key_event(KeyEvent::new(KeyCode::Backspace, KeyModifiers::NONE));
assert_eq!(composer.attached_images.len(), 1);
assert!(
composer
.textarea
.text()
.starts_with("[image_multibyte.png 10x5]")
);
assert!(composer.textarea.text().starts_with("[Image #1]"));
}
#[test]
fn deleting_one_of_duplicate_image_placeholders_removes_matching_entry() {
fn deleting_one_of_duplicate_image_placeholders_removes_one_entry() {
let (tx, _rx) = unbounded_channel::<AppEvent>();
let sender = AppEventSender::new(tx);
let mut composer = ChatComposer::new(
@@ -3630,10 +3500,10 @@ mod tests {
let path1 = PathBuf::from("/tmp/image_dup1.png");
let path2 = PathBuf::from("/tmp/image_dup2.png");
composer.attach_image(path1, 10, 5, "PNG");
composer.attach_image(path1);
// separate placeholders with a space for clarity
composer.handle_paste(" ".into());
composer.attach_image(path2.clone(), 10, 5, "PNG");
composer.attach_image(path2.clone());
let placeholder1 = composer.attached_images[0].placeholder.clone();
let placeholder2 = composer.attached_images[1].placeholder.clone();
@@ -3647,25 +3517,66 @@ mod tests {
let new_text = composer.textarea.text().to_string();
assert_eq!(
0,
1,
new_text.matches(&placeholder1).count(),
"first placeholder removed"
"one placeholder remains after deletion"
);
assert_eq!(
0,
new_text.matches(&placeholder2).count(),
"second placeholder was relabeled"
);
assert_eq!(
1,
new_text.matches(&placeholder2).count(),
"second placeholder remains"
new_text.matches("[Image #1]").count(),
"remaining placeholder relabeled to #1"
);
assert_eq!(
vec![AttachedImage {
path: path2,
placeholder: "[image_dup2.png 10x5]".to_string()
placeholder: "[Image #1]".to_string()
}],
composer.attached_images,
"one image mapping remains"
);
}
#[test]
fn deleting_first_text_element_renumbers_following_text_element() {
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let sender = AppEventSender::new(tx);
let mut composer = ChatComposer::new(
true,
sender,
false,
"Ask Codex to do anything".to_string(),
false,
);
let path1 = PathBuf::from("/tmp/image_first.png");
let path2 = PathBuf::from("/tmp/image_second.png");
// Insert two adjacent atomic elements.
composer.attach_image(path1);
composer.attach_image(path2.clone());
assert_eq!(composer.textarea.text(), "[Image #1][Image #2]");
assert_eq!(composer.attached_images.len(), 2);
// Delete the first element using normal textarea editing (Delete at cursor start).
composer.textarea.set_cursor(0);
composer.handle_key_event(KeyEvent::new(KeyCode::Delete, KeyModifiers::NONE));
// Remaining image should be renumbered and the textarea element updated.
assert_eq!(composer.attached_images.len(), 1);
assert_eq!(composer.attached_images[0].path, path2);
assert_eq!(composer.attached_images[0].placeholder, "[Image #1]");
assert_eq!(composer.textarea.text(), "[Image #1]");
}
#[test]
fn pasting_filepath_attaches_image() {
let tmp = tempdir().expect("create TempDir");
@@ -3686,12 +3597,7 @@ mod tests {
let needs_redraw = composer.handle_paste(tmp_path.to_string_lossy().to_string());
assert!(needs_redraw);
assert!(
composer
.textarea
.text()
.starts_with("[codex_tui_test_paste_image.png 3x2] ")
);
assert!(composer.textarea.text().starts_with("[Image #1] "));
let imgs = composer.take_recent_submission_images();
assert_eq!(imgs, vec![tmp_path]);

View File

@@ -542,16 +542,9 @@ impl BottomPane {
self.request_redraw();
}
pub(crate) fn attach_image(
&mut self,
path: PathBuf,
width: u32,
height: u32,
format_label: &str,
) {
pub(crate) fn attach_image(&mut self, path: PathBuf) {
if self.view_stack.is_empty() {
self.composer
.attach_image(path, width, height, format_label);
self.composer.attach_image(path);
self.request_redraw();
}
}

View File

@@ -715,6 +715,88 @@ impl TextArea {
// ===== Text elements support =====
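/// Returns the payload text of every atomic element currently present in the textarea.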
pub fn element_payloads(&self) -> Vec<String> {
self.elements
.iter()
.filter_map(|e| self.text.get(e.range.clone()).map(str::to_string))
.collect()
}
pub fn element_payload_starting_at(&self, pos: usize) -> Option<String> {
let pos = pos.min(self.text.len());
let elem = self.elements.iter().find(|e| e.range.start == pos)?;
self.text.get(elem.range.clone()).map(str::to_string)
}
/// Renames a single text element in-place, keeping it atomic.
///
/// This is intended for cases where the element payload is an identifier (e.g. a placeholder)
/// that must be updated without converting the element back into normal text.
pub fn replace_element_payload(&mut self, old: &str, new: &str) -> bool {
let Some(idx) = self
.elements
.iter()
.position(|e| self.text.get(e.range.clone()) == Some(old))
else {
return false;
};
let range = self.elements[idx].range.clone();
let start = range.start;
let end = range.end;
if start > end || end > self.text.len() {
return false;
}
let removed_len = end - start;
let inserted_len = new.len();
let diff = inserted_len as isize - removed_len as isize;
self.text.replace_range(range, new);
self.wrap_cache.replace(None);
self.preferred_col = None;
// Update the modified element's range.
self.elements[idx].range = start..(start + inserted_len);
// Shift element ranges that occur after the replaced element.
if diff != 0 {
for (j, e) in self.elements.iter_mut().enumerate() {
if j == idx {
continue;
}
if e.range.end <= start {
continue;
}
if e.range.start >= end {
e.range.start = ((e.range.start as isize) + diff) as usize;
e.range.end = ((e.range.end as isize) + diff) as usize;
continue;
}
// Elements should not partially overlap each other; degrade gracefully by
// snapping anything intersecting the replaced range to the new bounds.
e.range.start = start.min(e.range.start);
e.range.end = (start + inserted_len).max(e.range.end.saturating_add_signed(diff));
}
}
// Update the cursor position to account for the edit.
self.cursor_pos = if self.cursor_pos < start {
self.cursor_pos
} else if self.cursor_pos <= end {
start + inserted_len
} else {
((self.cursor_pos as isize) + diff) as usize
};
self.cursor_pos = self.clamp_pos_to_nearest_boundary(self.cursor_pos);
// Keep element ordering deterministic.
self.elements.sort_by_key(|e| e.range.start);
true
}
pub fn insert_element(&mut self, text: &str) {
let start = self.clamp_pos_for_insertion(self.cursor_pos);
self.insert_str_at(start, text);

View File

@@ -24,6 +24,7 @@ use codex_core::protocol::AgentReasoningRawContentDeltaEvent;
use codex_core::protocol::AgentReasoningRawContentEvent;
use codex_core::protocol::ApplyPatchApprovalRequestEvent;
use codex_core::protocol::BackgroundEventEvent;
use codex_core::protocol::CollabInteractionEvent;
use codex_core::protocol::CreditsSnapshot;
use codex_core::protocol::DeprecationNoticeEvent;
use codex_core::protocol::ErrorEvent;
@@ -102,6 +103,7 @@ use crate::bottom_pane::SelectionViewParams;
use crate::bottom_pane::custom_prompt_view::CustomPromptView;
use crate::bottom_pane::popup_consts::standard_popup_hint_line;
use crate::clipboard_paste::paste_image_to_temp_png;
use crate::collab_event_cell;
use crate::diff_render::display_path_for;
use crate::exec_cell::CommandOutput;
use crate::exec_cell::ExecCell;
@@ -1073,6 +1075,11 @@ impl ChatWidget {
self.set_status_header(message);
}
fn on_collab_interaction(&mut self, event: CollabInteractionEvent) {
self.add_to_history(collab_event_cell::new_collab_interaction(event));
self.request_redraw();
}
fn on_undo_started(&mut self, event: UndoStartedEvent) {
self.bottom_pane.ensure_status_indicator();
self.bottom_pane.set_interrupt_hint_visible(false);
@@ -1593,12 +1600,13 @@ impl ChatWidget {
{
match paste_image_to_temp_png() {
Ok((path, info)) => {
self.attach_image(
path,
tracing::debug!(
"pasted image size={}x{} format={}",
info.width,
info.height,
info.encoded_format.label(),
info.encoded_format.label()
);
self.attach_image(path);
}
Err(err) => {
tracing::warn!("failed to paste image: {err}");
@@ -1651,18 +1659,9 @@ impl ChatWidget {
}
}
pub(crate) fn attach_image(
&mut self,
path: PathBuf,
width: u32,
height: u32,
format_label: &str,
) {
tracing::info!(
"attach_image path={path:?} width={width} height={height} format={format_label}",
);
self.bottom_pane
.attach_image(path, width, height, format_label);
pub(crate) fn attach_image(&mut self, path: PathBuf) {
tracing::info!("attach_image path={path:?}");
self.bottom_pane.attach_image(path);
self.request_redraw();
}
@@ -2012,14 +2011,14 @@ impl ChatWidget {
return;
}
if !text.is_empty() {
items.push(UserInput::Text { text: text.clone() });
}
for path in image_paths {
items.push(UserInput::LocalImage { path });
}
if !text.is_empty() {
items.push(UserInput::Text { text: text.clone() });
}
if let Some(skills) = self.bottom_pane.skills() {
let skill_mentions = find_skill_mentions(&text, skills);
for skill in skill_mentions {
@@ -2190,6 +2189,7 @@ impl ChatWidget {
}
EventMsg::ExitedReviewMode(review) => self.on_exited_review_mode(review),
EventMsg::ContextCompacted(_) => self.on_agent_message("Context compacted".to_owned()),
EventMsg::CollabInteraction(event) => self.on_collab_interaction(event),
EventMsg::ThreadRolledBack(_) => {}
EventMsg::RawResponseItem(_)
| EventMsg::ItemStarted(_)

View File

@@ -1114,8 +1114,8 @@ async fn ctrl_c_cleared_prompt_is_recoverable_via_history() {
chat.bottom_pane.insert_str("draft message ");
chat.bottom_pane
.attach_image(PathBuf::from("/tmp/preview.png"), 24, 42, "png");
let placeholder = "[preview.png 24x42]";
.attach_image(PathBuf::from("/tmp/preview.png"));
let placeholder = "[Image #1]";
assert!(
chat.bottom_pane.composer_text().ends_with(placeholder),
"expected placeholder {placeholder:?} in composer text"

View File

@@ -0,0 +1,140 @@
use codex_core::protocol::AgentStatus;
use codex_core::protocol::CollabInteractionEvent;
use ratatui::style::Stylize;
use ratatui::text::Line;
use crate::history_cell::HistoryCell;
use crate::text_formatting::truncate_text;
use crate::wrapping::RtOptions;
use crate::wrapping::word_wrap_lines;
const COLLAB_PROMPT_MAX_GRAPHEMES: usize = 120;
#[derive(Debug)]
pub(crate) struct CollabInteractionCell {
summary: Line<'static>,
detail: Option<Line<'static>>,
}
impl CollabInteractionCell {
fn new(summary: Line<'static>, detail: Option<Line<'static>>) -> Self {
Self { summary, detail }
}
}
impl HistoryCell for CollabInteractionCell {
fn display_lines(&self, width: u16) -> Vec<Line<'static>> {
let wrap_width = width.max(1) as usize;
let mut lines = word_wrap_lines(
std::iter::once(self.summary.clone()),
RtOptions::new(wrap_width)
.initial_indent("".dim().into())
.subsequent_indent(" ".into()),
);
if let Some(detail) = &self.detail {
let detail_lines = word_wrap_lines(
std::iter::once(detail.clone()),
RtOptions::new(wrap_width)
.initial_indent("".dim().into())
.subsequent_indent(" ".into()),
);
lines.extend(detail_lines);
}
lines
}
}
fn collab_status_label(status: &AgentStatus) -> String {
match status {
AgentStatus::PendingInit => "pending init".to_string(),
AgentStatus::Running => "running".to_string(),
AgentStatus::Completed(message) => format!("completed: {message:?}"),
AgentStatus::Errored(_) => "errored".to_string(),
AgentStatus::Shutdown => "shutdown".to_string(),
AgentStatus::NotFound => "not found".to_string(),
}
}
fn collab_detail_line(label: &str, message: &str) -> Option<Line<'static>> {
let trimmed = message.trim();
if trimmed.is_empty() {
return None;
}
let collapsed = trimmed
.lines()
.map(str::trim)
.filter(|line| !line.is_empty())
.collect::<Vec<_>>()
.join(" ");
let truncated = truncate_text(&collapsed, COLLAB_PROMPT_MAX_GRAPHEMES);
let label = format!("{label}: ");
Some(Line::from(vec![label.dim(), truncated.into()]))
}
pub(crate) fn new_collab_interaction(event: CollabInteractionEvent) -> CollabInteractionCell {
let (summary, detail) = match event {
CollabInteractionEvent::AgentSpawned { new_id, prompt, .. } => {
let summary = Line::from(vec![
"Spawned agent".bold(),
" ".into(),
new_id.to_string().dim(),
]);
let detail = collab_detail_line("Prompt", &prompt);
(summary, detail)
}
CollabInteractionEvent::AgentInteraction {
receiver_id,
prompt,
..
} => {
let summary = Line::from(vec![
"Sent to agent".bold(),
" ".into(),
receiver_id.to_string().dim(),
]);
let detail = collab_detail_line("Message", &prompt);
(summary, detail)
}
CollabInteractionEvent::WaitingBegin { receiver_id, .. } => {
let summary = Line::from(vec![
"Waiting on agent".bold(),
" ".into(),
receiver_id.to_string().dim(),
]);
(summary, None)
}
CollabInteractionEvent::WaitingEnd {
receiver_id,
status,
..
} => {
let summary = Line::from(vec![
"Wait ended for agent".bold(),
" ".into(),
receiver_id.to_string().dim(),
" · ".dim(),
collab_status_label(&status).dim(),
]);
(summary, None)
}
CollabInteractionEvent::Close {
receiver_id,
status,
..
} => {
let summary = Line::from(vec![
"Closed agent".bold(),
" ".into(),
receiver_id.to_string().dim(),
" · ".dim(),
collab_status_label(&status).dim(),
]);
(summary, None)
}
};
CollabInteractionCell::new(summary, detail)
}

View File
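For reference, here is a minimal standalone sketch of the collapsing step that `collab_detail_line` performs before truncation. `truncate_text` is a crate-internal helper, so it is omitted; the input string is made up for the example and only the standard library is used.

```rust
// Sketch of the whitespace collapsing in collab_detail_line (truncation omitted).
fn collapse_message(message: &str) -> Option<String> {
    let trimmed = message.trim();
    if trimmed.is_empty() {
        return None;
    }
    // Drop blank lines, trim the rest, and join everything onto one line.
    let collapsed = trimmed
        .lines()
        .map(str::trim)
        .filter(|line| !line.is_empty())
        .collect::<Vec<_>>()
        .join(" ");
    Some(collapsed)
}

fn main() {
    let prompt = "Summarize the design.\n\n  Keep it short.\n";
    assert_eq!(
        collapse_message(prompt).as_deref(),
        Some("Summarize the design. Keep it short.")
    );
    assert_eq!(collapse_message("   \n"), None);
}
```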

@@ -43,6 +43,7 @@ mod bottom_pane;
mod chatwidget;
mod cli;
mod clipboard_paste;
mod collab_event_cell;
mod color;
pub mod custom_terminal;
mod diff_render;

View File

@@ -17,6 +17,7 @@ use ratatui::prelude::*;
use ratatui::style::Stylize;
use std::collections::BTreeSet;
use std::path::PathBuf;
use url::Url;
use super::account::StatusAccountDisplay;
use super::format::FieldFormatter;
@@ -62,6 +63,7 @@ struct StatusHistoryCell {
approval: String,
sandbox: String,
agents_summary: String,
model_provider: Option<String>,
account: Option<StatusAccountDisplay>,
session_id: Option<String>,
token_usage: StatusTokenUsageData,
@@ -129,6 +131,7 @@ impl StatusHistoryCell {
}
};
let agents_summary = compose_agents_summary(config);
let model_provider = format_model_provider(config);
let account = compose_account_display(auth_manager, plan_type);
let session_id = session_id.as_ref().map(std::string::ToString::to_string);
let default_usage = TokenUsage::default();
@@ -157,6 +160,7 @@ impl StatusHistoryCell {
approval,
sandbox,
agents_summary,
model_provider,
account,
session_id,
token_usage,
@@ -338,6 +342,9 @@ impl HistoryCell for StatusHistoryCell {
.collect();
let mut seen: BTreeSet<String> = labels.iter().cloned().collect();
if self.model_provider.is_some() {
push_label(&mut labels, &mut seen, "Model provider");
}
if account_value.is_some() {
push_label(&mut labels, &mut seen, "Account");
}
@@ -381,6 +388,9 @@ impl HistoryCell for StatusHistoryCell {
let directory_value = format_directory_display(&self.directory, Some(value_width));
lines.push(formatter.line("Model", model_spans));
if let Some(model_provider) = self.model_provider.as_ref() {
lines.push(formatter.line("Model provider", vec![Span::from(model_provider.clone())]));
}
lines.push(formatter.line("Directory", vec![Span::from(directory_value)]));
lines.push(formatter.line("Approval", vec![Span::from(self.approval.clone())]));
lines.push(formatter.line("Sandbox", vec![Span::from(self.sandbox.clone())]));
@@ -416,3 +426,39 @@ impl HistoryCell for StatusHistoryCell {
with_border_with_inner_width(truncated_lines, inner_width)
}
}
fn format_model_provider(config: &Config) -> Option<String> {
let provider = &config.model_provider;
let name = provider.name.trim();
let provider_name = if name.is_empty() {
config.model_provider_id.as_str()
} else {
name
};
let base_url = provider.base_url.as_deref().and_then(sanitize_base_url);
let is_default_openai = provider.is_openai() && base_url.is_none();
if is_default_openai {
return None;
}
Some(match base_url {
Some(base_url) => format!("{provider_name} - {base_url}"),
None => provider_name.to_string(),
})
}
fn sanitize_base_url(raw: &str) -> Option<String> {
let trimmed = raw.trim();
if trimmed.is_empty() {
return None;
}
let Ok(mut url) = Url::parse(trimmed) else {
return None;
};
let _ = url.set_username("");
let _ = url.set_password(None);
url.set_query(None);
url.set_fragment(None);
Some(url.to_string().trim_end_matches('/').to_string()).filter(|value| !value.is_empty())
}

View File
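As a quick illustration of what `sanitize_base_url` strips before the provider is shown in `/status`, here is a hedged standalone sketch using the `url` crate directly; the host `llm.example.com`, the credentials, and the query string are made up for the example.

```rust
// Requires the `url` crate (already imported by the diffed file).
use url::Url;

fn main() {
    let mut url = Url::parse("https://user:secret@llm.example.com/v1/?key=abc#frag")
        .expect("valid URL");
    let _ = url.set_username("");
    let _ = url.set_password(None);
    url.set_query(None);
    url.set_fragment(None);
    // Credentials, query, and fragment are gone; the trailing slash is trimmed last.
    assert_eq!(
        url.to_string().trim_end_matches('/'),
        "https://llm.example.com/v1"
    );
}
```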

@@ -49,6 +49,7 @@ use crate::style::user_message_style;
use codex_common::fuzzy_match::fuzzy_match;
use codex_protocol::custom_prompts::CustomPrompt;
use codex_protocol::custom_prompts::PROMPTS_CMD_PREFIX;
use codex_protocol::models::local_image_label_text;
use crate::app_event::AppEvent;
use crate::app_event_sender::AppEventSender;
@@ -62,7 +63,7 @@ use codex_core::skills::model::SkillMetadata;
use codex_file_search::FileMatch;
use std::cell::RefCell;
use std::collections::HashMap;
use std::path::Path;
use std::collections::HashSet;
use std::path::PathBuf;
use std::time::Duration;
use std::time::Instant;
@@ -287,10 +288,12 @@ impl ChatComposer {
// normalize_pasted_path already handles Windows → WSL path conversion,
// so we can directly try to read the image dimensions.
match image::image_dimensions(&path_buf) {
Ok((w, h)) => {
Ok((width, height)) => {
tracing::info!("OK: {pasted}");
let format_label = pasted_image_format(&path_buf).label();
self.attach_image(path_buf, w, h, format_label);
tracing::debug!("image dimensions={}x{}", width, height);
let format = pasted_image_format(&path_buf);
tracing::debug!("attached image format={}", format.label());
self.attach_image(path_buf);
true
}
Err(err) => {
@@ -342,12 +345,9 @@ impl ChatComposer {
}
/// Attempt to start a burst by retro-capturing recent chars before the cursor.
pub fn attach_image(&mut self, path: PathBuf, width: u32, height: u32, _format_label: &str) {
let file_label = path
.file_name()
.map(|name| name.to_string_lossy().into_owned())
.unwrap_or_else(|| "image".to_string());
let placeholder = format!("[{file_label} {width}x{height}]");
pub fn attach_image(&mut self, path: PathBuf) {
let image_number = self.attached_images.len() + 1;
let placeholder = local_image_label_text(image_number);
// Insert as an element to match large paste placeholder behavior:
// styled distinctly and treated atomically for cursor/mutations.
self.textarea.insert_element(&placeholder);
@@ -735,47 +735,43 @@ impl ChatComposer {
if is_image {
// Determine dimensions; if that fails fall back to normal path insertion.
let path_buf = PathBuf::from(&sel_path);
if let Ok((w, h)) = image::image_dimensions(&path_buf) {
// Remove the current @token (mirror logic from insert_selected_path without inserting text)
// using the flat text and byte-offset cursor API.
let cursor_offset = self.textarea.cursor();
let text = self.textarea.text();
// Clamp to a valid char boundary to avoid panics when slicing.
let safe_cursor = Self::clamp_to_char_boundary(text, cursor_offset);
let before_cursor = &text[..safe_cursor];
let after_cursor = &text[safe_cursor..];
match image::image_dimensions(&path_buf) {
Ok((width, height)) => {
tracing::debug!("selected image dimensions={}x{}", width, height);
// Remove the current @token (mirror logic from insert_selected_path without inserting text)
// using the flat text and byte-offset cursor API.
let cursor_offset = self.textarea.cursor();
let text = self.textarea.text();
// Clamp to a valid char boundary to avoid panics when slicing.
let safe_cursor = Self::clamp_to_char_boundary(text, cursor_offset);
let before_cursor = &text[..safe_cursor];
let after_cursor = &text[safe_cursor..];
// Determine token boundaries in the full text.
let start_idx = before_cursor
.char_indices()
.rfind(|(_, c)| c.is_whitespace())
.map(|(idx, c)| idx + c.len_utf8())
.unwrap_or(0);
let end_rel_idx = after_cursor
.char_indices()
.find(|(_, c)| c.is_whitespace())
.map(|(idx, _)| idx)
.unwrap_or(after_cursor.len());
let end_idx = safe_cursor + end_rel_idx;
// Determine token boundaries in the full text.
let start_idx = before_cursor
.char_indices()
.rfind(|(_, c)| c.is_whitespace())
.map(|(idx, c)| idx + c.len_utf8())
.unwrap_or(0);
let end_rel_idx = after_cursor
.char_indices()
.find(|(_, c)| c.is_whitespace())
.map(|(idx, _)| idx)
.unwrap_or(after_cursor.len());
let end_idx = safe_cursor + end_rel_idx;
self.textarea.replace_range(start_idx..end_idx, "");
self.textarea.set_cursor(start_idx);
self.textarea.replace_range(start_idx..end_idx, "");
self.textarea.set_cursor(start_idx);
let format_label = match Path::new(&sel_path)
.extension()
.and_then(|e| e.to_str())
.map(str::to_ascii_lowercase)
{
Some(ext) if ext == "png" => "PNG",
Some(ext) if ext == "jpg" || ext == "jpeg" => "JPEG",
_ => "IMG",
};
self.attach_image(path_buf, w, h, format_label);
// Add a trailing space to keep typing fluid.
self.textarea.insert_str(" ");
} else {
// Fallback to plain path insertion if metadata read fails.
self.insert_selected_path(&sel_path);
self.attach_image(path_buf);
// Add a trailing space to keep typing fluid.
self.textarea.insert_str(" ");
}
Err(err) => {
tracing::trace!("image dimensions lookup failed: {err}");
// Fallback to plain path insertion if metadata read fails.
self.insert_selected_path(&sel_path);
}
}
} else {
// Non-image: inserting file path.
@@ -1381,20 +1377,28 @@ impl ChatComposer {
}
}
// For non-char inputs (or after flushing), handle normally.
// Special handling for backspace on placeholders
if let KeyEvent {
code: KeyCode::Backspace,
..
} = input
&& self.try_remove_any_placeholder_at_cursor()
// Backspace at the start of an image placeholder should delete that placeholder (rather
// than deleting content before it). Do this without scanning the full text by consulting
// the textarea's element list.
if matches!(input.code, KeyCode::Backspace)
&& self.try_remove_image_element_at_cursor_start()
{
return (InputResult::None, true);
}
// Normal input handling
// Track element removals so we can drop any corresponding placeholders without scanning
// the full text. (Placeholders are atomic elements; when deleted, the element disappears.)
let elements_before = if self.pending_pastes.is_empty() && self.attached_images.is_empty() {
None
} else {
Some(self.textarea.element_payloads())
};
self.textarea.input(input);
let text_after = self.textarea.text();
if let Some(elements_before) = elements_before {
self.reconcile_deleted_elements(elements_before);
}
// Update paste-burst heuristic for plain Char (no Ctrl/Alt) events.
let crossterm::event::KeyEvent {
@@ -1416,176 +1420,69 @@ impl ChatComposer {
}
}
// Check if any placeholders were removed and remove their corresponding pending pastes
self.pending_pastes
.retain(|(placeholder, _)| text_after.contains(placeholder));
// Keep attached images in proportion to how many matching placeholders exist in the text.
// This handles duplicate placeholders that share the same visible label.
if !self.attached_images.is_empty() {
let mut needed: HashMap<String, usize> = HashMap::new();
for img in &self.attached_images {
needed
.entry(img.placeholder.clone())
.or_insert_with(|| text_after.matches(&img.placeholder).count());
}
let mut used: HashMap<String, usize> = HashMap::new();
let mut kept: Vec<AttachedImage> = Vec::with_capacity(self.attached_images.len());
for img in self.attached_images.drain(..) {
let total_needed = *needed.get(&img.placeholder).unwrap_or(&0);
let used_count = used.entry(img.placeholder.clone()).or_insert(0);
if *used_count < total_needed {
kept.push(img);
*used_count += 1;
}
}
self.attached_images = kept;
}
(InputResult::None, true)
}
/// Attempts to remove an image or paste placeholder if the cursor is at the end of one.
/// Returns true if a placeholder was removed.
fn try_remove_any_placeholder_at_cursor(&mut self) -> bool {
// Clamp the cursor to a valid char boundary to avoid panics when slicing.
let text = self.textarea.text();
let p = Self::clamp_to_char_boundary(text, self.textarea.cursor());
// Try image placeholders first
let mut out: Option<(usize, String)> = None;
// Detect if the cursor is at the end of any image placeholder.
// If duplicates exist, remove the specific occurrence's mapping.
for (i, img) in self.attached_images.iter().enumerate() {
let ph = &img.placeholder;
if p < ph.len() {
continue;
}
let start = p - ph.len();
if text.get(start..p) != Some(ph.as_str()) {
continue;
}
// Count the number of occurrences of `ph` before `start`.
let mut occ_before = 0usize;
let mut search_pos = 0usize;
while search_pos < start {
let segment = match text.get(search_pos..start) {
Some(s) => s,
None => break,
};
if let Some(found) = segment.find(ph) {
occ_before += 1;
search_pos += found + ph.len();
} else {
break;
}
}
// Remove the occ_before-th attached image that shares this placeholder label.
out = if let Some((remove_idx, _)) = self
.attached_images
.iter()
.enumerate()
.filter(|(_, img2)| img2.placeholder == *ph)
.nth(occ_before)
{
Some((remove_idx, ph.clone()))
} else {
Some((i, ph.clone()))
};
break;
}
if let Some((idx, placeholder)) = out {
self.textarea.replace_range(p - placeholder.len()..p, "");
self.attached_images.remove(idx);
return true;
fn try_remove_image_element_at_cursor_start(&mut self) -> bool {
if self.attached_images.is_empty() {
return false;
}
// Also handle when the cursor is at the START of an image placeholder.
// let result = 'out: {
let out: Option<(usize, String)> = 'out: {
for (i, img) in self.attached_images.iter().enumerate() {
let ph = &img.placeholder;
if p + ph.len() > text.len() {
continue;
}
if text.get(p..p + ph.len()) != Some(ph.as_str()) {
continue;
}
// Count occurrences of `ph` before `p`.
let mut occ_before = 0usize;
let mut search_pos = 0usize;
while search_pos < p {
let segment = match text.get(search_pos..p) {
Some(s) => s,
None => break 'out None,
};
if let Some(found) = segment.find(ph) {
occ_before += 1;
search_pos += found + ph.len();
} else {
break 'out None;
}
}
if let Some((remove_idx, _)) = self
.attached_images
.iter()
.enumerate()
.filter(|(_, img2)| img2.placeholder == *ph)
.nth(occ_before)
{
break 'out Some((remove_idx, ph.clone()));
} else {
break 'out Some((i, ph.clone()));
}
}
None
let p = self.textarea.cursor();
let Some(payload) = self.textarea.element_payload_starting_at(p) else {
return false;
};
let Some(idx) = self
.attached_images
.iter()
.position(|img| img.placeholder == payload)
else {
return false;
};
if let Some((idx, placeholder)) = out {
self.textarea.replace_range(p..p + placeholder.len(), "");
self.attached_images.remove(idx);
return true;
self.textarea.replace_range(p..p + payload.len(), "");
self.attached_images.remove(idx);
self.relabel_attached_images_and_update_placeholders();
true
}
fn reconcile_deleted_elements(&mut self, elements_before: Vec<String>) {
let elements_after: HashSet<String> =
self.textarea.element_payloads().into_iter().collect();
let mut removed_any_image = false;
for removed in elements_before
.into_iter()
.filter(|payload| !elements_after.contains(payload))
{
self.pending_pastes.retain(|(ph, _)| ph != &removed);
if let Some(idx) = self
.attached_images
.iter()
.position(|img| img.placeholder == removed)
{
self.attached_images.remove(idx);
removed_any_image = true;
}
}
// Then try pasted-content placeholders
if let Some(placeholder) = self.pending_pastes.iter().find_map(|(ph, _)| {
if p < ph.len() {
return None;
}
let start = p - ph.len();
if text.get(start..p) == Some(ph.as_str()) {
Some(ph.clone())
} else {
None
}
}) {
self.textarea.replace_range(p - placeholder.len()..p, "");
self.pending_pastes.retain(|(ph, _)| ph != &placeholder);
return true;
if removed_any_image {
self.relabel_attached_images_and_update_placeholders();
}
}
// Also handle when the cursor is at the START of a pasted-content placeholder.
if let Some(placeholder) = self.pending_pastes.iter().find_map(|(ph, _)| {
if p + ph.len() > text.len() {
return None;
fn relabel_attached_images_and_update_placeholders(&mut self) {
for idx in 0..self.attached_images.len() {
let expected = local_image_label_text(idx + 1);
let current = self.attached_images[idx].placeholder.clone();
if current == expected {
continue;
}
if text.get(p..p + ph.len()) == Some(ph.as_str()) {
Some(ph.clone())
} else {
None
}
}) {
self.textarea.replace_range(p..p + placeholder.len(), "");
self.pending_pastes.retain(|(ph, _)| ph != &placeholder);
return true;
self.attached_images[idx].placeholder = expected.clone();
let _renamed = self.textarea.replace_element_payload(&current, &expected);
}
false
}
fn handle_shortcut_overlay_key(&mut self, key_event: &KeyEvent) -> bool {
@@ -3408,12 +3305,12 @@ mod tests {
false,
);
let path = PathBuf::from("/tmp/image1.png");
composer.attach_image(path.clone(), 32, 16, "PNG");
composer.attach_image(path.clone());
composer.handle_paste(" hi".into());
let (result, _) =
composer.handle_key_event(KeyEvent::new(KeyCode::Enter, KeyModifiers::NONE));
match result {
InputResult::Submitted(text) => assert_eq!(text, "[image1.png 32x16] hi"),
InputResult::Submitted(text) => assert_eq!(text, "[Image #1] hi"),
_ => panic!("expected Submitted"),
}
let imgs = composer.take_recent_submission_images();
@@ -3432,11 +3329,11 @@ mod tests {
false,
);
let path = PathBuf::from("/tmp/image2.png");
composer.attach_image(path.clone(), 10, 5, "PNG");
composer.attach_image(path.clone());
let (result, _) =
composer.handle_key_event(KeyEvent::new(KeyCode::Enter, KeyModifiers::NONE));
match result {
InputResult::Submitted(text) => assert_eq!(text, "[image2.png 10x5]"),
InputResult::Submitted(text) => assert_eq!(text, "[Image #1]"),
_ => panic!("expected Submitted"),
}
let imgs = composer.take_recent_submission_images();
@@ -3457,7 +3354,7 @@ mod tests {
false,
);
let path = PathBuf::from("/tmp/image3.png");
composer.attach_image(path.clone(), 20, 10, "PNG");
composer.attach_image(path.clone());
let placeholder = composer.attached_images[0].placeholder.clone();
// Case 1: backspace at end
@@ -3468,7 +3365,7 @@ mod tests {
// Re-add and test backspace in middle: should break the placeholder string
// and drop the image mapping (same as text placeholder behavior).
composer.attach_image(path, 20, 10, "PNG");
composer.attach_image(path);
let placeholder2 = composer.attached_images[0].placeholder.clone();
// Move cursor to roughly middle of placeholder
if let Some(start_pos) = composer.textarea.text().find(&placeholder2) {
@@ -3500,7 +3397,7 @@ mod tests {
// Insert an image placeholder at the start
let path = PathBuf::from("/tmp/image_multibyte.png");
composer.attach_image(path, 10, 5, "PNG");
composer.attach_image(path);
// Add multibyte text after the placeholder
composer.textarea.insert_str("日本語");
@@ -3509,12 +3406,7 @@ mod tests {
composer.handle_key_event(KeyEvent::new(KeyCode::Backspace, KeyModifiers::NONE));
assert_eq!(composer.attached_images.len(), 1);
assert!(
composer
.textarea
.text()
.starts_with("[image_multibyte.png 10x5]")
);
assert!(composer.textarea.text().starts_with("[Image #1]"));
}
#[test]
@@ -3532,10 +3424,10 @@ mod tests {
let path1 = PathBuf::from("/tmp/image_dup1.png");
let path2 = PathBuf::from("/tmp/image_dup2.png");
composer.attach_image(path1, 10, 5, "PNG");
composer.attach_image(path1);
// separate placeholders with a space for clarity
composer.handle_paste(" ".into());
composer.attach_image(path2.clone(), 10, 5, "PNG");
composer.attach_image(path2.clone());
let placeholder1 = composer.attached_images[0].placeholder.clone();
let placeholder2 = composer.attached_images[1].placeholder.clone();
@@ -3550,24 +3442,60 @@ mod tests {
let new_text = composer.textarea.text().to_string();
assert_eq!(
0,
new_text.matches(&placeholder1).count(),
"first placeholder removed"
new_text.matches(&placeholder2).count(),
"second placeholder was relabeled"
);
assert_eq!(
1,
new_text.matches(&placeholder2).count(),
"second placeholder remains"
new_text.matches("[Image #1]").count(),
"remaining placeholder relabeled to #1"
);
assert_eq!(
vec![AttachedImage {
path: path2,
placeholder: "[image_dup2.png 10x5]".to_string()
placeholder: "[Image #1]".to_string()
}],
composer.attached_images,
"one image mapping remains"
);
}
#[test]
fn deleting_first_text_element_renumbers_following_text_element() {
use crossterm::event::KeyCode;
use crossterm::event::KeyEvent;
use crossterm::event::KeyModifiers;
let (tx, _rx) = unbounded_channel::<AppEvent>();
let sender = AppEventSender::new(tx);
let mut composer = ChatComposer::new(
true,
sender,
false,
"Ask Codex to do anything".to_string(),
false,
);
let path1 = PathBuf::from("/tmp/image_first.png");
let path2 = PathBuf::from("/tmp/image_second.png");
// Insert two adjacent atomic elements.
composer.attach_image(path1);
composer.attach_image(path2.clone());
assert_eq!(composer.textarea.text(), "[Image #1][Image #2]");
assert_eq!(composer.attached_images.len(), 2);
// Delete the first element using normal textarea editing (Delete at cursor start).
composer.textarea.set_cursor(0);
composer.handle_key_event(KeyEvent::new(KeyCode::Delete, KeyModifiers::NONE));
// Remaining image should be renumbered and the textarea element updated.
assert_eq!(composer.attached_images.len(), 1);
assert_eq!(composer.attached_images[0].path, path2);
assert_eq!(composer.attached_images[0].placeholder, "[Image #1]");
assert_eq!(composer.textarea.text(), "[Image #1]");
}
#[test]
fn pasting_filepath_attaches_image() {
let tmp = tempdir().expect("create TempDir");
@@ -3588,12 +3516,7 @@ mod tests {
let needs_redraw = composer.handle_paste(tmp_path.to_string_lossy().to_string());
assert!(needs_redraw);
assert!(
composer
.textarea
.text()
.starts_with("[codex_tui_test_paste_image.png 3x2] ")
);
assert!(composer.textarea.text().starts_with("[Image #1] "));
let imgs = composer.take_recent_submission_images();
assert_eq!(imgs, vec![tmp_path]);

View File
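The reconcile-and-relabel flow above can be summarized with a small standalone sketch: compare element payloads before and after an edit, drop attachments whose placeholder disappeared, then renumber the survivors. The data below is hypothetical, and the `label` helper only mirrors what `local_image_label_text` is assumed to produce (`[Image #N]`).

```rust
use std::collections::HashSet;

fn label(n: usize) -> String {
    format!("[Image #{n}]") // assumed to mirror local_image_label_text
}

fn main() {
    // Element payloads captured before and after a textarea edit (hypothetical data).
    let before = vec![label(1), label(2), label(3)];
    let after: HashSet<String> = [label(1), label(3)].into_iter().collect();

    // Attached images keyed by their current placeholder.
    let mut attached: Vec<(String, &str)> = vec![
        (label(1), "/tmp/a.png"),
        (label(2), "/tmp/b.png"),
        (label(3), "/tmp/c.png"),
    ];

    // Drop attachments whose placeholder element disappeared in the edit.
    for removed in before.into_iter().filter(|p| !after.contains(p)) {
        attached.retain(|(placeholder, _)| placeholder != &removed);
    }

    // Renumber survivors so the labels stay dense: #1, #2, ...
    for (idx, (placeholder, _)) in attached.iter_mut().enumerate() {
        *placeholder = label(idx + 1);
    }

    assert_eq!(
        attached,
        vec![(label(1), "/tmp/a.png"), (label(2), "/tmp/c.png")]
    );
}
```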

@@ -529,16 +529,9 @@ impl BottomPane {
self.request_redraw();
}
pub(crate) fn attach_image(
&mut self,
path: PathBuf,
width: u32,
height: u32,
format_label: &str,
) {
pub(crate) fn attach_image(&mut self, path: PathBuf) {
if self.view_stack.is_empty() {
self.composer
.attach_image(path, width, height, format_label);
self.composer.attach_image(path);
self.request_redraw();
}
}

View File

@@ -715,6 +715,88 @@ impl TextArea {
// ===== Text elements support =====
pub fn element_payloads(&self) -> Vec<String> {
self.elements
.iter()
.filter_map(|e| self.text.get(e.range.clone()).map(str::to_string))
.collect()
}
pub fn element_payload_starting_at(&self, pos: usize) -> Option<String> {
let pos = pos.min(self.text.len());
let elem = self.elements.iter().find(|e| e.range.start == pos)?;
self.text.get(elem.range.clone()).map(str::to_string)
}
/// Renames a single text element in-place, keeping it atomic.
///
/// This is intended for cases where the element payload is an identifier (e.g. a placeholder)
/// that must be updated without converting the element back into normal text.
pub fn replace_element_payload(&mut self, old: &str, new: &str) -> bool {
let Some(idx) = self
.elements
.iter()
.position(|e| self.text.get(e.range.clone()) == Some(old))
else {
return false;
};
let range = self.elements[idx].range.clone();
let start = range.start;
let end = range.end;
if start > end || end > self.text.len() {
return false;
}
let removed_len = end - start;
let inserted_len = new.len();
let diff = inserted_len as isize - removed_len as isize;
self.text.replace_range(range, new);
self.wrap_cache.replace(None);
self.preferred_col = None;
// Update the modified element's range.
self.elements[idx].range = start..(start + inserted_len);
// Shift element ranges that occur after the replaced element.
if diff != 0 {
for (j, e) in self.elements.iter_mut().enumerate() {
if j == idx {
continue;
}
if e.range.end <= start {
continue;
}
if e.range.start >= end {
e.range.start = ((e.range.start as isize) + diff) as usize;
e.range.end = ((e.range.end as isize) + diff) as usize;
continue;
}
// Elements should not partially overlap each other; degrade gracefully by
// snapping anything intersecting the replaced range to the new bounds.
e.range.start = start.min(e.range.start);
e.range.end = (start + inserted_len).max(e.range.end.saturating_add_signed(diff));
}
}
// Update the cursor position to account for the edit.
self.cursor_pos = if self.cursor_pos < start {
self.cursor_pos
} else if self.cursor_pos <= end {
start + inserted_len
} else {
((self.cursor_pos as isize) + diff) as usize
};
self.cursor_pos = self.clamp_pos_to_nearest_boundary(self.cursor_pos);
// Keep element ordering deterministic.
self.elements.sort_by_key(|e| e.range.start);
true
}
pub fn insert_element(&mut self, text: &str) {
let start = self.clamp_pos_for_insertion(self.cursor_pos);
self.insert_str_at(start, text);

View File
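To make the range bookkeeping in `replace_element_payload` concrete, here is a self-contained sketch of the shift applied to later elements when a payload changes length. The placeholders, ranges, and the renumbering scenario are illustrative only, not the real `TextArea` type.

```rust
use std::ops::Range;

fn main() {
    let mut text = String::from("[Image #10][Image #11]");
    let mut elements: Vec<Range<usize>> = vec![0..11, 11..22];

    // After an earlier image was removed, the first placeholder is renumbered down,
    // which shortens its payload by one byte.
    let new = "[Image #9]";
    let range = elements[0].clone();
    let diff = new.len() as isize - (range.end - range.start) as isize;

    text.replace_range(range.clone(), new);
    elements[0] = range.start..(range.start + new.len());

    // Every element that starts after the edited one shifts by `diff`.
    for e in elements.iter_mut().skip(1) {
        e.start = (e.start as isize + diff) as usize;
        e.end = (e.end as isize + diff) as usize;
    }

    assert_eq!(text, "[Image #9][Image #11]");
    assert_eq!(elements, vec![0..10, 10..21]);
}
```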

@@ -1452,12 +1452,13 @@ impl ChatWidget {
{
match paste_image_to_temp_png() {
Ok((path, info)) => {
self.attach_image(
path,
tracing::debug!(
"pasted image size={}x{} format={}",
info.width,
info.height,
info.encoded_format.label(),
info.encoded_format.label()
);
self.attach_image(path);
}
Err(err) => {
tracing::warn!("failed to paste image: {err}");
@@ -1510,18 +1511,9 @@ impl ChatWidget {
}
}
pub(crate) fn attach_image(
&mut self,
path: PathBuf,
width: u32,
height: u32,
format_label: &str,
) {
tracing::info!(
"attach_image path={path:?} width={width} height={height} format={format_label}",
);
self.bottom_pane
.attach_image(path, width, height, format_label);
pub(crate) fn attach_image(&mut self, path: PathBuf) {
tracing::info!("attach_image path={path:?}");
self.bottom_pane.attach_image(path);
self.request_redraw();
}
@@ -1818,14 +1810,14 @@ impl ChatWidget {
return;
}
if !text.is_empty() {
items.push(UserInput::Text { text: text.clone() });
}
for path in image_paths {
items.push(UserInput::LocalImage { path });
}
if !text.is_empty() {
items.push(UserInput::Text { text: text.clone() });
}
if let Some(skills) = self.bottom_pane.skills() {
let skill_mentions = find_skill_mentions(&text, skills);
for skill in skill_mentions {
@@ -1996,6 +1988,9 @@ impl ChatWidget {
}
EventMsg::ExitedReviewMode(review) => self.on_exited_review_mode(review),
EventMsg::ContextCompacted(_) => self.on_agent_message("Context compacted".to_owned()),
EventMsg::CollabInteraction(_) => {
// TODO(jif) handle collab tools.
}
EventMsg::RawResponseItem(_)
| EventMsg::ThreadRolledBack(_)
| EventMsg::ItemStarted(_)

View File

@@ -1065,8 +1065,8 @@ async fn ctrl_c_cleared_prompt_is_recoverable_via_history() {
chat.bottom_pane.insert_str("draft message ");
chat.bottom_pane
.attach_image(PathBuf::from("/tmp/preview.png"), 24, 42, "png");
let placeholder = "[preview.png 24x42]";
.attach_image(PathBuf::from("/tmp/preview.png"));
let placeholder = "[Image #1]";
assert!(
chat.bottom_pane.composer_text().ends_with(placeholder),
"expected placeholder {placeholder:?} in composer text"

View File

@@ -17,6 +17,7 @@ use ratatui::prelude::*;
use ratatui::style::Stylize;
use std::collections::BTreeSet;
use std::path::PathBuf;
use url::Url;
use super::account::StatusAccountDisplay;
use super::format::FieldFormatter;
@@ -62,6 +63,7 @@ struct StatusHistoryCell {
approval: String,
sandbox: String,
agents_summary: String,
model_provider: Option<String>,
account: Option<StatusAccountDisplay>,
session_id: Option<String>,
token_usage: StatusTokenUsageData,
@@ -129,6 +131,7 @@ impl StatusHistoryCell {
}
};
let agents_summary = compose_agents_summary(config);
let model_provider = format_model_provider(config);
let account = compose_account_display(auth_manager, plan_type);
let session_id = session_id.as_ref().map(std::string::ToString::to_string);
let default_usage = TokenUsage::default();
@@ -157,6 +160,7 @@ impl StatusHistoryCell {
approval,
sandbox,
agents_summary,
model_provider,
account,
session_id,
token_usage,
@@ -338,6 +342,9 @@ impl HistoryCell for StatusHistoryCell {
.collect();
let mut seen: BTreeSet<String> = labels.iter().cloned().collect();
if self.model_provider.is_some() {
push_label(&mut labels, &mut seen, "Model provider");
}
if account_value.is_some() {
push_label(&mut labels, &mut seen, "Account");
}
@@ -380,6 +387,9 @@ impl HistoryCell for StatusHistoryCell {
let directory_value = format_directory_display(&self.directory, Some(value_width));
lines.push(formatter.line("Model", model_spans));
if let Some(model_provider) = self.model_provider.as_ref() {
lines.push(formatter.line("Model provider", vec![Span::from(model_provider.clone())]));
}
lines.push(formatter.line("Directory", vec![Span::from(directory_value)]));
lines.push(formatter.line("Approval", vec![Span::from(self.approval.clone())]));
lines.push(formatter.line("Sandbox", vec![Span::from(self.sandbox.clone())]));
@@ -415,3 +425,39 @@ impl HistoryCell for StatusHistoryCell {
with_border_with_inner_width(truncated_lines, inner_width)
}
}
fn format_model_provider(config: &Config) -> Option<String> {
let provider = &config.model_provider;
let name = provider.name.trim();
let provider_name = if name.is_empty() {
config.model_provider_id.as_str()
} else {
name
};
let base_url = provider.base_url.as_deref().and_then(sanitize_base_url);
let is_default_openai = provider.is_openai() && base_url.is_none();
if is_default_openai {
return None;
}
Some(match base_url {
Some(base_url) => format!("{provider_name} - {base_url}"),
None => provider_name.to_string(),
})
}
fn sanitize_base_url(raw: &str) -> Option<String> {
let trimmed = raw.trim();
if trimmed.is_empty() {
return None;
}
let Ok(mut url) = Url::parse(trimmed) else {
return None;
};
let _ = url.set_username("");
let _ = url.set_password(None);
url.set_query(None);
url.set_fragment(None);
Some(url.to_string().trim_end_matches('/').to_string()).filter(|value| !value.is_empty())
}

View File

@@ -1,3 +1,7 @@
# AGENTS.md
For information about AGENTS.md, see [this documentation](https://developers.openai.com/codex/guides/agents-md).
## Hierarchical agents message
When the `hierarchical_agents` feature flag is enabled (via `[features]` in `config.toml`), Codex appends additional guidance about AGENTS.md scope and precedence to the user instructions message and emits that message even when no AGENTS.md is present.
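A minimal config sketch, assuming the flag is spelled exactly `hierarchical_agents` and lives directly under a `[features]` table as described above:

```toml
# config.toml — enable the hierarchical agents guidance (assumed key name and location).
[features]
hierarchical_agents = true
```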

rbe.bzl (new file, 42 lines)
View File

@@ -0,0 +1,42 @@
def _rbe_platform_repo_impl(rctx):
arch = rctx.os.arch
if arch in ["x86_64", "amd64"]:
cpu = "x86_64"
exec_arch = "amd64"
image_sha = "8c9ff94187ea7c08a31e9a81f5fe8046ea3972a6768983c955c4079fa30567fb"
elif arch in ["aarch64", "arm64"]:
cpu = "aarch64"
exec_arch = "arm64"
image_sha = "ad9506086215fccfc66ed8d2be87847324be56790ae6a1964c241c28b77ef141"
else:
fail("Unsupported host arch for rbe platform: {}".format(arch))
rctx.file("BUILD.bazel", """\
platform(
name = "rbe_platform",
constraint_values = [
"@platforms//cpu:{cpu}",
"@platforms//os:linux",
"@bazel_tools//tools/cpp:clang",
"@toolchains_llvm_bootstrapped//constraints/libc:gnu.2.28",
],
exec_properties = {{
# Ubuntu-based image that includes git, python3, dotslash, and other
# tools that various integration tests need.
# Verify at https://hub.docker.com/layers/mbolin491/codex-bazel/latest/images/sha256:{image_sha}
"container-image": "docker://docker.io/mbolin491/codex-bazel@sha256:{image_sha}",
"Arch": "{arch}",
"OSFamily": "Linux",
}},
visibility = ["//visibility:public"],
)
""".format(
cpu = cpu,
arch = exec_arch,
image_sha = image_sha
))
rbe_platform_repository = repository_rule(
implementation = _rbe_platform_repo_impl,
doc = "Sets up a platform for remote builds with an Arch exec_property matching the host.",
)