This is an alternate PR that solves the same problem as <https://github.com/openai/codex/pull/8227>. In this PR, when Ollama is used via `--oss` (or via `model_provider = "ollama"`), we default to the Responses wire format. At runtime, we check the Ollama version, and if it is older than the version in which Responses support was added to Ollama, we print a warning. Because there is no way to configure the wire API for a built-in provider, we temporarily add a new `oss_provider`/`model_provider` called `"ollama-chat"` that forces the Chat format. Once the `"chat"` format is fully removed (see <https://github.com/openai/codex/discussions/7782>), `ollama-chat` can be removed as well.
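As an illustration of the two provider choices described above, here is a minimal sketch of a Codex `config.toml`. The `model_provider` key and the `"ollama"`/`"ollama-chat"` values come from this PR; the file path and the comments are assumptions, not a definitive configuration.

```toml
# ~/.codex/config.toml (assumed default location)

# Default path after this PR: the built-in Ollama provider
# now speaks the Responses wire format. A warning is printed
# at runtime if the installed Ollama predates Responses support.
model_provider = "ollama"

# Temporary escape hatch added by this PR: force the legacy
# Chat format for older Ollama versions. Scheduled for removal
# along with the "chat" format itself.
# model_provider = "ollama-chat"
```

---------

Co-authored-by: Eric Traut <etraut@openai.com>
Co-authored-by: Michael Bolin <mbolin@openai.com>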