Plan mode: stream proposed plans, emit plan items, and render in TUI (#9786)

## Summary
- Stream proposed plans in Plan Mode using `<proposed_plan>` tags parsed
in core, emitting plan deltas plus a plan `ThreadItem`, while stripping
tags from normal assistant output.
- Persist plan items and rebuild them on resume so proposed plans show
in thread history.
- Wire plan items/deltas through app-server protocol v2 and render a
dedicated proposed-plan view in the TUI, including the “Implement this
plan?” prompt only when a plan item is present.

## Changes

### Core (`codex-rs/core`)
- Added a generic, line-based tag parser that buffers each line until it
can rule out a tag prefix, and auto-closes unterminated tags on
`finish()`. `codex-rs/core/src/tagged_block_parser.rs`
- Refactored proposed plan parsing to wrap the generic parser.
`codex-rs/core/src/proposed_plan_parser.rs`
- In Plan Mode, stream assistant deltas as:
  - **Normal text** → `AgentMessageContentDelta`
  - **Plan text** → `PlanDelta` + `TurnItem::Plan` start/completion  
  (`codex-rs/core/src/codex.rs`)
- Final plan item content is derived from the completed assistant
message (authoritative), not necessarily the concatenated deltas.
- Strip `<proposed_plan>` blocks from assistant text in Plan Mode so
tags don’t appear in normal messages.
(`codex-rs/core/src/stream_events_utils.rs`)
- Persist `ItemCompleted` events for plan items only, so proposed plans
can be replayed from the rollout.
(`codex-rs/core/src/rollout/policy.rs`)
- Guard `update_plan` tool in Plan Mode with a clear error message.
(`codex-rs/core/src/tools/handlers/plan.rs`)
- Updated Plan Mode prompt to:  
  - keep `<proposed_plan>` out of non-final reasoning/preambles  
  - require exact tag formatting  
  - allow only one `<proposed_plan>` block per turn  
  (`codex-rs/core/templates/collaboration_mode/plan.md`)
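The line-based tag parsing above can be sketched as follows. This is an illustrative simplification, not the actual `tagged_block_parser.rs` API: names like `TaggedBlockParser` and `Segment` are assumptions, and the real parser additionally buffers partial lines arriving via streamed deltas until it can rule out a tag prefix.

```rust
/// A settled piece of output: either normal text or a tagged block body.
#[derive(Debug, PartialEq)]
enum Segment {
    Text(String),
    Block(String),
}

/// Minimal line-based parser for a single `<tag>…</tag>` block.
struct TaggedBlockParser {
    open: String,
    close: String,
    in_block: bool,
    block_buf: String,
}

impl TaggedBlockParser {
    fn new(tag: &str) -> Self {
        Self {
            open: format!("<{tag}>"),
            close: format!("</{tag}>"),
            in_block: false,
            block_buf: String::new(),
        }
    }

    /// Feed one complete line; returns a segment once it is settled.
    fn push_line(&mut self, line: &str) -> Option<Segment> {
        let trimmed = line.trim();
        if self.in_block {
            if trimmed == self.close {
                self.in_block = false;
                return Some(Segment::Block(std::mem::take(&mut self.block_buf)));
            }
            self.block_buf.push_str(line);
            self.block_buf.push('\n');
            None
        } else if trimmed == self.open {
            self.in_block = true;
            None
        } else {
            Some(Segment::Text(line.to_string()))
        }
    }

    /// Auto-close an unterminated block at end of message.
    fn finish(&mut self) -> Option<Segment> {
        if self.in_block {
            self.in_block = false;
            Some(Segment::Block(std::mem::take(&mut self.block_buf)))
        } else {
            None
        }
    }
}

fn main() {
    let mut p = TaggedBlockParser::new("proposed_plan");
    assert_eq!(p.push_line("plain text"), Some(Segment::Text("plain text".into())));
    assert_eq!(p.push_line("<proposed_plan>"), None);
    assert_eq!(p.push_line("- step 1"), None);
    // Missing </proposed_plan>: finish() auto-closes the block.
    assert_eq!(p.finish(), Some(Segment::Block("- step 1\n".into())));
}
```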

### Protocol / App-server protocol
- Added `TurnItem::Plan` and `PlanDeltaEvent` to core protocol items.
(`codex-rs/protocol/src/items.rs`, `codex-rs/protocol/src/protocol.rs`)
- Added v2 `ThreadItem::Plan` and `PlanDeltaNotification` with
EXPERIMENTAL markers and note that deltas may not match the final plan
item. (`codex-rs/app-server-protocol/src/protocol/v2.rs`)
- Added plan delta route in app-server protocol common mapping.
(`codex-rs/app-server-protocol/src/protocol/common.rs`)
- Rebuild plan items from persisted `ItemCompleted` events on resume.
(`codex-rs/app-server-protocol/src/protocol/thread_history.rs`)
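The rebuild-on-resume step can be sketched as a filter over persisted events. Types here are simplified stand-ins for the real protocol types, and the function name is an assumption:

```rust
// Simplified stand-ins for the persisted rollout event types.
enum RolloutEvent {
    ItemCompleted(TurnItem),
    Other,
}

enum TurnItem {
    Plan { text: String },
    AgentMessage { text: String },
}

/// Replay persisted events and keep only completed plan items, which are
/// the only `ItemCompleted` events persisted for rollout replay.
fn rebuild_plan_items(events: Vec<RolloutEvent>) -> Vec<String> {
    events
        .into_iter()
        .filter_map(|e| match e {
            RolloutEvent::ItemCompleted(TurnItem::Plan { text }) => Some(text),
            _ => None,
        })
        .collect()
}

fn main() {
    let events = vec![
        RolloutEvent::Other,
        RolloutEvent::ItemCompleted(TurnItem::AgentMessage { text: "hi".into() }),
        RolloutEvent::ItemCompleted(TurnItem::Plan { text: "- step 1".into() }),
    ];
    assert_eq!(rebuild_plan_items(events), vec!["- step 1".to_string()]);
}
```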

### App-server
- Forward plan deltas to v2 clients and map core plan items to v2 plan
items. (`codex-rs/app-server/src/bespoke_event_handling.rs`,
`codex-rs/app-server/src/codex_message_processor.rs`)
- Added v2 plan item tests.
(`codex-rs/app-server/tests/suite/v2/plan_item.rs`)

### TUI
- Added a dedicated proposed plan history cell with special background
and padding, and moved “• Proposed Plan” outside the highlighted block.
(`codex-rs/tui/src/history_cell.rs`, `codex-rs/tui/src/style.rs`)
- Only show “Implement this plan?” when a plan item exists.
(`codex-rs/tui/src/chatwidget.rs`,
`codex-rs/tui/src/chatwidget/tests.rs`)

<img width="831" height="847" alt="Screenshot 2026-01-29 at 7 06 24 PM"
src="https://github.com/user-attachments/assets/69794c8c-f96b-4d36-92ef-c1f5c3a8f286"
/>

### Docs / Misc
- Updated protocol docs to mention plan deltas.
(`codex-rs/docs/protocol_v1.md`)
- Minor plumbing updates in exec/debug clients to tolerate plan deltas.
(`codex-rs/debug-client/src/reader.rs`, `codex-rs/exec/...`)

## Tests
- Added core integration tests:
  - Plan mode strips plan from agent messages.
  - Missing `</proposed_plan>` closes at end-of-message.  
  (`codex-rs/core/tests/suite/items.rs`)
- Added unit tests for generic tag parser (prefix buffering, non-tag
lines, auto-close). (`codex-rs/core/src/tagged_block_parser.rs`)
- Existing app-server plan item tests in v2.
(`codex-rs/app-server/tests/suite/v2/plan_item.rs`)

## Notes / Behavior
- Plan output no longer appears in standard assistant text in Plan Mode;
it streams via `PlanDelta` and completes as a `TurnItem::Plan`.
- The final plan item content is authoritative and may diverge from
streamed deltas (documented as experimental).
- Reasoning summaries are not filtered; the prompt instructs the model
not to emit `<proposed_plan>` outside the final plan message.
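The delta-versus-authoritative behavior above suggests how a client should consume these events: stream deltas into a provisional view, then discard the buffer and adopt the completed item's text. A minimal sketch, with assumed names and fields rather than the actual wire format:

```rust
// Assumed, simplified shapes of the plan-related notifications.
enum Notification {
    PlanDelta { delta: String },
    ItemCompleted(ThreadItem),
}

enum ThreadItem {
    AgentMessage { text: String },
    // EXPERIMENTAL: final text may diverge from concatenated deltas.
    Plan { text: String },
}

/// Accumulate deltas provisionally; on completion, the item text wins.
fn handle(n: Notification, plan_buf: &mut String) -> Option<String> {
    match n {
        Notification::PlanDelta { delta } => {
            plan_buf.push_str(&delta); // provisional streamed view
            None
        }
        Notification::ItemCompleted(ThreadItem::Plan { text }) => {
            plan_buf.clear(); // streamed content is not authoritative
            Some(text)
        }
        Notification::ItemCompleted(ThreadItem::AgentMessage { text }) => Some(text),
    }
}

fn main() {
    let mut buf = String::new();
    assert_eq!(handle(Notification::PlanDelta { delta: "- a\n".into() }, &mut buf), None);
    assert_eq!(buf, "- a\n");
    let done = handle(
        Notification::ItemCompleted(ThreadItem::Plan { text: "- a\n- b\n".into() }),
        &mut buf,
    );
    assert_eq!(done, Some("- a\n- b\n".to_string()));
    assert!(buf.is_empty());
}
```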

## Codex Author
`codex fork 019bec2d-b09d-7450-b292-d7bcdddcdbfb`
Authored by Charley Cunningham, 2026-01-30 10:59:30 -08:00 (committed by GitHub).
Parent `40bf11bd52`, commit `ec4a2d07e4`.
36 changed files with 2021 additions and 42 deletions.


@@ -196,6 +196,7 @@ mod skills;
use self::skills::collect_tool_mentions;
use self::skills::find_app_mentions;
use self::skills::find_skill_mentions_with_tool_mentions;
use crate::streaming::controller::PlanStreamController;
use crate::streaming::controller::StreamController;
use chrono::Local;
@@ -485,6 +486,8 @@ pub(crate) struct ChatWidget {
rate_limit_poller: Option<JoinHandle<()>>,
// Stream lifecycle controller
stream_controller: Option<StreamController>,
// Stream lifecycle controller for proposed plan output.
plan_stream_controller: Option<PlanStreamController>,
running_commands: HashMap<String, RunningCommand>,
suppressed_exec_calls: HashSet<String>,
skills_all: Vec<ProtocolSkillMetadata>,
@@ -553,6 +556,12 @@ pub(crate) struct ChatWidget {
had_work_activity: bool,
// Whether the current turn emitted a plan update.
saw_plan_update_this_turn: bool,
// Whether the current turn emitted a proposed plan item.
saw_plan_item_this_turn: bool,
// Incremental buffer for streamed plan content.
plan_delta_buffer: String,
// True while a plan item is streaming.
plan_item_active: bool,
// Status-indicator elapsed seconds captured at the last emitted final-message separator.
//
// This lets the separator show per-chunk work time (since the previous separator) rather than
@@ -896,7 +905,7 @@ impl ChatWidget {
fn on_agent_message(&mut self, message: String) {
// If we have a stream_controller, then the final agent message is redundant and will be a
// duplicate of what has already been streamed.
if self.stream_controller.is_none() && !message.is_empty() {
self.handle_streaming_delta(message);
}
self.flush_answer_stream_with_separator();
@@ -908,6 +917,56 @@ impl ChatWidget {
self.handle_streaming_delta(delta);
}
fn on_plan_delta(&mut self, delta: String) {
if self.active_mode_kind() != ModeKind::Plan {
return;
}
if !self.plan_item_active {
self.plan_item_active = true;
self.plan_delta_buffer.clear();
}
self.plan_delta_buffer.push_str(&delta);
// Before streaming plan content, flush any active exec cell group.
self.flush_unified_exec_wait_streak();
self.flush_active_cell();
if self.plan_stream_controller.is_none() {
self.plan_stream_controller = Some(PlanStreamController::new(
self.last_rendered_width.get().map(|w| w.saturating_sub(4)),
));
}
if let Some(controller) = self.plan_stream_controller.as_mut()
&& controller.push(&delta)
{
self.app_event_tx.send(AppEvent::StartCommitAnimation);
}
self.request_redraw();
}
fn on_plan_item_completed(&mut self, text: String) {
let streamed_plan = self.plan_delta_buffer.trim().to_string();
let plan_text = if text.trim().is_empty() {
streamed_plan
} else {
text
};
self.plan_delta_buffer.clear();
self.plan_item_active = false;
self.saw_plan_item_this_turn = true;
if let Some(mut controller) = self.plan_stream_controller.take()
&& let Some(cell) = controller.finalize()
{
self.add_boxed_history(cell);
// TODO: Replace streamed output with the final plan item text if plan streaming is
// removed or if we need to reconcile mismatches between streamed and final content.
return;
}
if plan_text.is_empty() {
return;
}
self.add_to_history(history_cell::new_proposed_plan(plan_text));
}
fn on_agent_reasoning_delta(&mut self, delta: String) {
// For reasoning deltas, do not stream to history. Accumulate the
// current reasoning block and extract the first bold element
@@ -954,6 +1013,10 @@ impl ChatWidget {
fn on_task_started(&mut self) {
self.agent_turn_running = true;
self.saw_plan_update_this_turn = false;
self.saw_plan_item_this_turn = false;
self.plan_delta_buffer.clear();
self.plan_item_active = false;
self.plan_stream_controller = None;
self.bottom_pane.clear_quit_shortcut_hint();
self.quit_shortcut_expires_at = None;
self.quit_shortcut_key = None;
@@ -969,6 +1032,11 @@ impl ChatWidget {
fn on_task_complete(&mut self, last_agent_message: Option<String>, from_replay: bool) {
// If a stream is currently active, finalize it.
self.flush_answer_stream_with_separator();
if let Some(mut controller) = self.plan_stream_controller.take()
&& let Some(cell) = controller.finalize()
{
self.add_boxed_history(cell);
}
self.flush_unified_exec_wait_streak();
// Mark task stopped and request redraw now that all content is in history.
self.agent_turn_running = false;
@@ -981,7 +1049,7 @@ impl ChatWidget {
self.request_redraw();
if !from_replay && self.queued_user_messages.is_empty() {
self.maybe_prompt_plan_implementation();
}
// If there is a queued user message, send exactly one now to begin the next turn.
self.maybe_send_next_queued_input();
@@ -993,7 +1061,7 @@ impl ChatWidget {
self.maybe_show_pending_rate_limit_prompt();
}
fn maybe_prompt_plan_implementation(&mut self) {
if !self.collaboration_modes_enabled() {
return;
}
@@ -1003,8 +1071,7 @@ impl ChatWidget {
if self.active_mode_kind() != ModeKind::Plan {
return;
}
if !self.saw_plan_item_this_turn {
return;
}
if !self.bottom_pane.no_modal_or_popup_active() {
@@ -1749,15 +1816,28 @@ impl ChatWidget {
/// Periodic tick to commit at most one queued line to history with a small delay,
/// animating the output.
pub(crate) fn on_commit_tick(&mut self) {
let mut has_controller = false;
let mut all_idle = true;
if let Some(controller) = self.stream_controller.as_mut() {
has_controller = true;
let (cell, is_idle) = controller.on_commit_tick();
if let Some(cell) = cell {
self.bottom_pane.hide_status_indicator();
self.add_boxed_history(cell);
}
all_idle &= is_idle;
}
if let Some(controller) = self.plan_stream_controller.as_mut() {
has_controller = true;
let (cell, is_idle) = controller.on_commit_tick();
if let Some(cell) = cell {
self.bottom_pane.hide_status_indicator();
self.add_boxed_history(cell);
}
all_idle &= is_idle;
}
if has_controller && all_idle {
self.app_event_tx.send(AppEvent::StopCommitAnimation);
}
}
@@ -2160,6 +2240,7 @@ impl ChatWidget {
rate_limit_switch_prompt: RateLimitSwitchPromptState::default(),
rate_limit_poller: None,
stream_controller: None,
plan_stream_controller: None,
running_commands: HashMap::new(),
suppressed_exec_calls: HashSet::new(),
last_unified_wait: None,
@@ -2188,6 +2269,9 @@ impl ChatWidget {
needs_final_message_separator: false,
had_work_activity: false,
saw_plan_update_this_turn: false,
saw_plan_item_this_turn: false,
plan_delta_buffer: String::new(),
plan_item_active: false,
last_separator_elapsed_secs: None,
last_rendered_width: std::cell::Cell::new(None),
feedback,
@@ -2301,6 +2385,7 @@ impl ChatWidget {
rate_limit_switch_prompt: RateLimitSwitchPromptState::default(),
rate_limit_poller: None,
stream_controller: None,
plan_stream_controller: None,
running_commands: HashMap::new(),
suppressed_exec_calls: HashSet::new(),
last_unified_wait: None,
@@ -2319,6 +2404,9 @@ impl ChatWidget {
thread_name: None,
forked_from: None,
saw_plan_update_this_turn: false,
saw_plan_item_this_turn: false,
plan_delta_buffer: String::new(),
plan_item_active: false,
queued_user_messages: VecDeque::new(),
show_welcome_banner: is_first_run,
suppress_session_configured_redraw: false,
@@ -2431,6 +2519,7 @@ impl ChatWidget {
rate_limit_switch_prompt: RateLimitSwitchPromptState::default(),
rate_limit_poller: None,
stream_controller: None,
plan_stream_controller: None,
running_commands: HashMap::new(),
suppressed_exec_calls: HashSet::new(),
last_unified_wait: None,
@@ -2459,6 +2548,9 @@ impl ChatWidget {
needs_final_message_separator: false,
had_work_activity: false,
saw_plan_update_this_turn: false,
saw_plan_item_this_turn: false,
plan_delta_buffer: String::new(),
plan_item_active: false,
last_separator_elapsed_secs: None,
last_rendered_width: std::cell::Cell::new(None),
feedback,
@@ -3219,6 +3311,7 @@ impl ChatWidget {
match msg {
EventMsg::AgentMessageDelta(_)
| EventMsg::PlanDelta(_)
| EventMsg::AgentReasoningDelta(_)
| EventMsg::TerminalInteraction(_)
| EventMsg::ExecCommandOutputDelta(_) => {}
@@ -3234,6 +3327,7 @@ impl ChatWidget {
EventMsg::AgentMessageDelta(AgentMessageDeltaEvent { delta }) => {
self.on_agent_message_delta(delta)
}
EventMsg::PlanDelta(event) => self.on_plan_delta(event.delta),
EventMsg::AgentReasoningDelta(AgentReasoningDeltaEvent { delta })
| EventMsg::AgentReasoningRawContentDelta(AgentReasoningRawContentDeltaEvent {
delta,
@@ -3357,11 +3451,15 @@ impl ChatWidget {
EventMsg::ThreadRolledBack(_) => {}
EventMsg::RawResponseItem(_)
| EventMsg::ItemStarted(_)
| EventMsg::AgentMessageContentDelta(_)
| EventMsg::ReasoningContentDelta(_)
| EventMsg::ReasoningRawContentDelta(_)
| EventMsg::DynamicToolCallRequest(_) => {}
EventMsg::ItemCompleted(event) => {
if let codex_protocol::items::TurnItem::Plan(plan_item) = event.item {
self.on_plan_item_completed(plan_item.text);
}
}
}
}


@@ -809,6 +809,7 @@ async fn make_chatwidget_manual(
rate_limit_switch_prompt: RateLimitSwitchPromptState::default(),
rate_limit_poller: None,
stream_controller: None,
plan_stream_controller: None,
running_commands: HashMap::new(),
suppressed_exec_calls: HashSet::new(),
skills_all: Vec::new(),
@@ -840,6 +841,9 @@ async fn make_chatwidget_manual(
needs_final_message_separator: false,
had_work_activity: false,
saw_plan_update_this_turn: false,
saw_plan_item_this_turn: false,
plan_delta_buffer: String::new(),
plan_item_active: false,
last_separator_elapsed_secs: None,
last_rendered_width: std::cell::Cell::new(None),
feedback: codex_feedback::CodexFeedback::new(),
@@ -1277,7 +1281,7 @@ async fn plan_implementation_popup_skips_when_messages_queued() {
}
#[tokio::test]
async fn plan_implementation_popup_skips_without_proposed_plan() {
let (mut chat, _rx, _op_rx) = make_chatwidget_manual(Some("gpt-5")).await;
chat.set_feature_enabled(Feature::CollaborationModes, true);
let plan_mask =
@@ -1295,10 +1299,31 @@ async fn plan_implementation_popup_shows_on_plan_update_without_message() {
});
chat.on_task_complete(None, false);
let popup = render_bottom_popup(&chat, 80);
assert!(
!popup.contains(PLAN_IMPLEMENTATION_TITLE),
"expected no plan popup without proposed plan output, got {popup:?}"
);
}
#[tokio::test]
async fn plan_implementation_popup_shows_after_proposed_plan_output() {
let (mut chat, _rx, _op_rx) = make_chatwidget_manual(Some("gpt-5")).await;
chat.set_feature_enabled(Feature::CollaborationModes, true);
let plan_mask =
collaboration_modes::mask_for_kind(chat.models_manager.as_ref(), ModeKind::Plan)
.expect("expected plan collaboration mask");
chat.set_collaboration_mask(plan_mask);
chat.on_task_started();
chat.on_plan_delta("- Step 1\n- Step 2\n".to_string());
chat.on_plan_item_completed("- Step 1\n- Step 2\n".to_string());
chat.on_task_complete(None, false);
let popup = render_bottom_popup(&chat, 80);
assert!(
popup.contains(PLAN_IMPLEMENTATION_TITLE),
"expected plan popup after proposed plan output, got {popup:?}"
);
}
@@ -1957,6 +1982,7 @@ async fn unified_exec_wait_after_final_agent_message_snapshot() {
id: "turn-1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
@@ -1991,6 +2017,7 @@ async fn unified_exec_wait_before_streamed_agent_message_snapshot() {
id: "turn-1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
@@ -2779,6 +2806,7 @@ async fn interrupted_turn_error_message_snapshot() {
id: "task-1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
@@ -3793,6 +3821,7 @@ async fn interrupt_clears_unified_exec_wait_streak_snapshot() {
id: "turn-1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
@@ -3866,6 +3895,7 @@ async fn ui_snapshots_small_heights_task_running() {
id: "task-1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
chat.handle_codex_event(Event {
@@ -3897,6 +3927,7 @@ async fn status_widget_and_approval_modal_snapshot() {
id: "task-1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
// Provide a deterministic header for the status line.
@@ -3949,6 +3980,7 @@ async fn status_widget_active_snapshot() {
id: "task-1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
// Provide a deterministic header via a bold reasoning chunk.
@@ -3998,6 +4030,7 @@ async fn mcp_startup_complete_does_not_clear_running_task() {
id: "task-1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
@@ -4554,6 +4587,7 @@ async fn stream_recovery_restores_previous_status_header() {
id: "task".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
drain_insert_history(&mut rx);
@@ -4591,6 +4625,7 @@ async fn multiple_agent_messages_in_single_turn_emit_multiple_headers() {
id: "s1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
@@ -4785,6 +4820,7 @@ async fn chatwidget_exec_and_status_layout_vt100_snapshot() {
id: "t1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
chat.handle_codex_event(Event {
@@ -4832,6 +4868,7 @@ async fn chatwidget_markdown_code_blocks_vt100_snapshot() {
id: "t1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
// Build a vt100 visual from the history insertions only (no UI overlay)
@@ -4921,6 +4958,7 @@ async fn chatwidget_tall() {
id: "t1".into(),
msg: EventMsg::TurnStarted(TurnStartedEvent {
model_context_window: None,
collaboration_mode_kind: ModeKind::Custom,
}),
});
for i in 0..30 {


@@ -25,6 +25,7 @@ use crate::render::line_utils::line_to_static;
use crate::render::line_utils::prefix_lines;
use crate::render::line_utils::push_owned_lines;
use crate::render::renderable::Renderable;
use crate::style::proposed_plan_style;
use crate::style::user_message_style;
use crate::text_formatting::format_and_truncate_tool_result;
use crate::text_formatting::truncate_text;
@@ -1768,6 +1769,63 @@ pub(crate) fn new_plan_update(update: UpdatePlanArgs) -> PlanUpdateCell {
PlanUpdateCell { explanation, plan }
}
pub(crate) fn new_proposed_plan(plan_markdown: String) -> ProposedPlanCell {
ProposedPlanCell { plan_markdown }
}
pub(crate) fn new_proposed_plan_stream(
lines: Vec<Line<'static>>,
is_stream_continuation: bool,
) -> ProposedPlanStreamCell {
ProposedPlanStreamCell {
lines,
is_stream_continuation,
}
}
#[derive(Debug)]
pub(crate) struct ProposedPlanCell {
plan_markdown: String,
}
#[derive(Debug)]
pub(crate) struct ProposedPlanStreamCell {
lines: Vec<Line<'static>>,
is_stream_continuation: bool,
}
impl HistoryCell for ProposedPlanCell {
fn display_lines(&self, width: u16) -> Vec<Line<'static>> {
let mut lines: Vec<Line<'static>> = Vec::new();
lines.push(vec!["• ".dim(), "Proposed Plan".bold()].into());
lines.push(Line::from(" "));
let mut plan_lines: Vec<Line<'static>> = vec![Line::from(" ")];
let plan_style = proposed_plan_style();
let wrap_width = width.saturating_sub(4).max(1) as usize;
let mut body: Vec<Line<'static>> = Vec::new();
append_markdown(&self.plan_markdown, Some(wrap_width), &mut body);
if body.is_empty() {
body.push(Line::from("(empty)".dim().italic()));
}
plan_lines.extend(prefix_lines(body, " ".into(), " ".into()));
plan_lines.push(Line::from(" "));
lines.extend(plan_lines.into_iter().map(|line| line.style(plan_style)));
lines
}
}
impl HistoryCell for ProposedPlanStreamCell {
fn display_lines(&self, _width: u16) -> Vec<Line<'static>> {
self.lines.clone()
}
fn is_stream_continuation(&self) -> bool {
self.is_stream_continuation
}
}
#[derive(Debug)]
pub(crate) struct PlanUpdateCell {
explanation: Option<String>,


@@ -1,5 +1,8 @@
use crate::history_cell::HistoryCell;
use crate::history_cell::{self};
use crate::render::line_utils::prefix_lines;
use crate::style::proposed_plan_style;
use ratatui::prelude::Stylize;
use ratatui::text::Line;
use super::StreamState;
@@ -80,6 +83,106 @@ impl StreamController {
}
}
/// Controller that streams proposed plan markdown into a styled plan block.
pub(crate) struct PlanStreamController {
state: StreamState,
header_emitted: bool,
top_padding_emitted: bool,
}
impl PlanStreamController {
pub(crate) fn new(width: Option<usize>) -> Self {
Self {
state: StreamState::new(width),
header_emitted: false,
top_padding_emitted: false,
}
}
/// Push a delta; if it contains a newline, commit completed lines and start animation.
pub(crate) fn push(&mut self, delta: &str) -> bool {
let state = &mut self.state;
if !delta.is_empty() {
state.has_seen_delta = true;
}
state.collector.push_delta(delta);
if delta.contains('\n') {
let newly_completed = state.collector.commit_complete_lines();
if !newly_completed.is_empty() {
state.enqueue(newly_completed);
return true;
}
}
false
}
/// Finalize the active stream. Drain and emit now.
pub(crate) fn finalize(&mut self) -> Option<Box<dyn HistoryCell>> {
let remaining = {
let state = &mut self.state;
state.collector.finalize_and_drain()
};
let mut out_lines = Vec::new();
{
let state = &mut self.state;
if !remaining.is_empty() {
state.enqueue(remaining);
}
let step = state.drain_all();
out_lines.extend(step);
}
self.state.clear();
self.emit(out_lines, true)
}
/// Step animation: commit at most one queued line and handle end-of-drain cleanup.
pub(crate) fn on_commit_tick(&mut self) -> (Option<Box<dyn HistoryCell>>, bool) {
let step = self.state.step();
(self.emit(step, false), self.state.is_idle())
}
fn emit(
&mut self,
lines: Vec<Line<'static>>,
include_bottom_padding: bool,
) -> Option<Box<dyn HistoryCell>> {
if lines.is_empty() && !include_bottom_padding {
return None;
}
let mut out_lines: Vec<Line<'static>> = Vec::new();
let is_stream_continuation = self.header_emitted;
if !self.header_emitted {
out_lines.push(vec!["• ".dim(), "Proposed Plan".bold()].into());
out_lines.push(Line::from(" "));
self.header_emitted = true;
}
let mut plan_lines: Vec<Line<'static>> = Vec::new();
if !self.top_padding_emitted {
plan_lines.push(Line::from(" "));
self.top_padding_emitted = true;
}
plan_lines.extend(lines);
if include_bottom_padding {
plan_lines.push(Line::from(" "));
}
let plan_style = proposed_plan_style();
let plan_lines = prefix_lines(plan_lines, " ".into(), " ".into())
.into_iter()
.map(|line| line.style(plan_style))
.collect::<Vec<_>>();
out_lines.extend(plan_lines);
Some(Box::new(history_cell::new_proposed_plan_stream(
out_lines,
is_stream_continuation,
)))
}
}
#[cfg(test)]
mod tests {
use super::*;


@@ -9,6 +9,10 @@ pub fn user_message_style() -> Style {
user_message_style_for(default_bg())
}
pub fn proposed_plan_style() -> Style {
proposed_plan_style_for(default_bg())
}
/// Returns the style for a user-authored message using the provided terminal background.
pub fn user_message_style_for(terminal_bg: Option<(u8, u8, u8)>) -> Style {
match terminal_bg {
@@ -17,6 +21,13 @@ pub fn user_message_style_for(terminal_bg: Option<(u8, u8, u8)>) -> Style {
}
}
pub fn proposed_plan_style_for(terminal_bg: Option<(u8, u8, u8)>) -> Style {
match terminal_bg {
Some(bg) => Style::default().bg(proposed_plan_bg(bg)),
None => Style::default(),
}
}
#[allow(clippy::disallowed_methods)]
pub fn user_message_bg(terminal_bg: (u8, u8, u8)) -> Color {
let (top, alpha) = if is_light(terminal_bg) {
@@ -26,3 +37,13 @@ pub fn user_message_bg(terminal_bg: (u8, u8, u8)) -> Color {
};
best_color(blend(top, terminal_bg, alpha))
}
#[allow(clippy::disallowed_methods)]
pub fn proposed_plan_bg(terminal_bg: (u8, u8, u8)) -> Color {
let (top, alpha) = if is_light(terminal_bg) {
((0, 110, 150), 0.08)
} else {
((80, 170, 220), 0.2)
};
best_color(blend(top, terminal_bg, alpha))
}