In a [recent PR](https://github.com/openai/codex/pull/9182), I made some improvements to config error messages so that errors no longer leave app server clients in a dead state. This is a follow-on PR that makes these error messages more readable and actionable for both TUI and GUI users. For example, see #9668, where the user was understandably confused about the source of the problem and how to fix it.

The improved error message:

1. Clearly identifies the config file where the error was found (which is more important now that we support layered configs)
2. Provides the line and column number of the error
3. Displays the line where the error occurred and underlines the offending span

For example, if my `config.toml` includes the following:

```toml
[features]
collaboration_modes = "true"
```

Here's the current CLI error message:

```
Error loading config.toml: invalid type: string "true", expected a boolean in `features`
```

And here's the improved message:

```
Error loading config.toml: /Users/etraut/.codex/config.toml:43:23: invalid type: string "true", expected a boolean
   |
43 | collaboration_modes = "true"
   |                       ^^^^^^
```

The bulk of the new logic is contained in a new module, `config_loader/diagnostics.rs`, which is responsible for calculating the text range for a given TOML path (which is more involved than I would have expected).

In addition, this PR adds the file name and text range to the `ConfigWarningNotification` app server struct. This allows GUI clients to present the user with a better error message and an optional link to open the errant config file. This was a suggestion from @bolinfest when he reviewed my previous PR.
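For readers curious what the span calculation involves, here is a minimal, self-contained sketch of the general idea: parse the file with a span-preserving TOML parser, walk the dotted key path to the offending value, and render a rustc-style underline from the resulting byte range. This is illustrative only; the use of `toml_edit` and the helper names (`span_for_path`, `render_snippet`) are assumptions for the example, not the actual contents of `config_loader/diagnostics.rs`.

```rust
// Illustrative sketch (not the PR's actual code): locate the byte range of a
// value at a dotted key path using toml_edit's span-preserving parser, then
// render a rustc-style snippet with the offending value underlined.
use std::ops::Range;

use toml_edit::{ImDocument, Item};

/// Walk a dotted path such as "features.collaboration_modes" and return the
/// byte range of the value it points at, if spans are available.
fn span_for_path(doc: &ImDocument<&str>, path: &str) -> Option<Range<usize>> {
    let mut item: &Item = doc.as_item();
    for segment in path.split('.') {
        item = item.as_table_like()?.get(segment)?;
    }
    item.span()
}

/// Format "line:col" plus the source line with carets under `range`.
fn render_snippet(source: &str, range: &Range<usize>) -> String {
    let line_start = source[..range.start].rfind('\n').map_or(0, |i| i + 1);
    let line_end = source[range.start..]
        .find('\n')
        .map_or(source.len(), |i| range.start + i);
    let line_no = source[..range.start].matches('\n').count() + 1;
    let col_no = range.start - line_start + 1;

    let line = &source[line_start..line_end];
    let gutter = " ".repeat(line_no.to_string().len());
    let pad = " ".repeat(col_no - 1);
    let carets = "^".repeat(range.end.min(line_end) - range.start);

    format!("{line_no}:{col_no}\n{gutter} |\n{line_no} | {line}\n{gutter} | {pad}{carets}\n")
}

fn main() {
    let source = "[features]\ncollaboration_modes = \"true\"\n";
    let doc = ImDocument::parse(source).expect("valid TOML");
    if let Some(range) = span_for_path(&doc, "features.collaboration_modes") {
        print!("{}", render_snippet(source, &range));
    }
}
```

Against the two-line snippet above, this prints a `2:23` location with `"true"` underlined; handling cases such as array indices, dotted keys, and errors on keys rather than values is omitted here.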