mirror of https://github.com/openai/codex.git
synced 2026-02-04 07:53:43 +00:00
Commit 72d8bde988
50 .github/ISSUE_TEMPLATE/5-vs-code-extension.yml (vendored)
@@ -1,50 +0,0 @@
name: 🧑‍💻 VS Code Extension
description: Report an issue with the VS Code extension
labels:
  - extension
  - needs triage
body:
  - type: markdown
    attributes:
      value: |
        Before submitting a new issue, please search for existing issues to see if your issue has already been reported.
        If it has, please add a 👍 reaction (no need to leave a comment) to the existing issue instead of creating a new one.

  - type: input
    id: version
    attributes:
      label: What version of the VS Code extension are you using?
  - type: input
    id: ide
    attributes:
      label: Which IDE are you using?
      description: Like `VS Code`, `Cursor`, `Windsurf`, etc.
  - type: input
    id: platform
    attributes:
      label: What platform is your computer?
      description: |
        For MacOS and Linux: copy the output of `uname -mprs`
        For Windows: copy the output of `"$([Environment]::OSVersion | ForEach-Object VersionString) $(if ([Environment]::Is64BitOperatingSystem) { "x64" } else { "x86" })"` in the PowerShell console
  - type: textarea
    id: steps
    attributes:
      label: What steps can reproduce the bug?
      description: Explain the bug and provide a code snippet that can reproduce it.
    validations:
      required: true
  - type: textarea
    id: expected
    attributes:
      label: What is the expected behavior?
      description: If possible, please provide text instead of a screenshot.
  - type: textarea
    id: actual
    attributes:
      label: What do you see instead?
      description: If possible, please provide text instead of a screenshot.
  - type: textarea
    id: notes
    attributes:
      label: Additional information
      description: Is there anything else you think we should know?
1 .github/actions/codex/.gitignore (vendored, normal file)
@@ -0,0 +1 @@
/node_modules/
8 .github/actions/codex/.prettierrc.toml (vendored, normal file)
@@ -0,0 +1,8 @@
printWidth = 80
quoteProps = "consistent"
semi = true
tabWidth = 2
trailingComma = "all"

# Preserve existing behavior for markdown/text wrapping.
proseWrap = "preserve"
140 .github/actions/codex/README.md (vendored, normal file)
@@ -0,0 +1,140 @@
# openai/codex-action

`openai/codex-action` is a GitHub Action that facilitates the use of [Codex](https://github.com/openai/codex) on GitHub issues and pull requests. Using the action, you associate **labels** with prompts so that Codex runs with the appropriate prompt for the given context. Codex will respond by posting comments or creating PRs, whichever you specify!

Here is a sample workflow that uses `openai/codex-action`:

```yaml
name: Codex

on:
  issues:
    types: [opened, labeled]
  pull_request:
    branches: [main]
    types: [labeled]

jobs:
  codex:
    if: ... # optional, but can be effective in conserving CI resources
    runs-on: ubuntu-latest
    # TODO(mbolin): Need to verify if/when `write` is necessary.
    permissions:
      contents: write
      issues: write
      pull-requests: write
    steps:
      # By default, Codex runs with network disabled using --full-auto, so
      # perform any setup that requires network (such as installing
      # dependencies) before openai/codex-action.
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Run Codex
        uses: openai/codex-action@latest
        with:
          openai_api_key: ${{ secrets.CODEX_OPENAI_API_KEY }}
          github_token: ${{ secrets.GITHUB_TOKEN }}
```

See sample usage in [`codex.yml`](../../workflows/codex.yml).

## Triggering the Action

Using the sample workflow above, we have:

```yaml
on:
  issues:
    types: [opened, labeled]
  pull_request:
    branches: [main]
    types: [labeled]
```

which means our workflow will be triggered when any of the following events occur:

- a label is added to an issue
- a label is added to a pull request against the `main` branch

### Label-Based Triggers

To define a GitHub label that should trigger Codex, create a file named `.github/codex/labels/LABEL-NAME.md` in your repository, where `LABEL-NAME` is the name of the label. The content of the file is the prompt template to use when the label is added (see more on [Prompt Template Variables](#prompt-template-variables) below).

For example, if the file `.github/codex/labels/codex-review.md` exists, then:

- Adding the `codex-review` label will trigger the workflow containing the `openai/codex-action` GitHub Action.
- When `openai/codex-action` starts, it will replace the `codex-review` label with `codex-review-in-progress`.
- When `openai/codex-action` is finished, it will replace the `codex-review-in-progress` label with `codex-review-completed`.

If Codex sees that either `codex-review-in-progress` or `codex-review-completed` is already present, it will not perform the action.

As determined by the [default config](./src/default-label-config.ts), Codex will act on the following labels by default:

- Adding the `codex-review` label to a pull request will have Codex review the PR and post the review as a comment on the PR.
- Adding the `codex-triage` label to an issue will have Codex investigate the issue and report its findings as a comment.
- Adding the `codex-issue-fix` label to an issue will have Codex attempt to fix the issue and create a PR with the fix, if any.

## Action Inputs

The `openai/codex-action` GitHub Action takes the following inputs:

### `openai_api_key` (required)

Set your `OPENAI_API_KEY` as a [repository secret](https://docs.github.com/en/actions/security-for-github-actions/security-guides/using-secrets-in-github-actions). See **Secrets and variables** then **Actions** in the settings for your GitHub repo.

Note that the secret name does not have to be `OPENAI_API_KEY`. For example, you might want to name it `CODEX_OPENAI_API_KEY` and then configure it on `openai/codex-action` as follows:

```yaml
openai_api_key: ${{ secrets.CODEX_OPENAI_API_KEY }}
```

### `github_token` (required)

This is required so that Codex can post a comment or create a PR. Set this value on the action as follows:

```yaml
github_token: ${{ secrets.GITHUB_TOKEN }}
```

### `codex_args`

A whitespace-delimited list of arguments to pass to Codex. Defaults to `--full-auto`. For example, to override the default model to use `o3`:

```yaml
codex_args: "--full-auto --model o3"
```

For more complex configurations, use the `codex_home` input.

### `codex_home`

If set, the value to use for the `$CODEX_HOME` environment variable when running Codex. As explained [in the docs](https://github.com/openai/codex/tree/main/codex-rs#readme), this folder can contain the `config.toml` to configure Codex, custom instructions, and log files.

This should be a relative path within your repo.

## Prompt Template Variables

As shown above, `"prompt"` and `"promptPath"` are used to define prompt templates that will be populated and passed to Codex in response to certain events. All template variables are of the form `{CODEX_ACTION_...}`; the supported values are defined below.

### `CODEX_ACTION_ISSUE_TITLE`

If the action was triggered on a GitHub issue, this is the issue title.

Specifically, it is read as `.issue.title` from `$GITHUB_EVENT_PATH`.

### `CODEX_ACTION_ISSUE_BODY`

If the action was triggered on a GitHub issue, this is the issue body.

Specifically, it is read as `.issue.body` from `$GITHUB_EVENT_PATH`.

### `CODEX_ACTION_GITHUB_EVENT_PATH`

The value of the `$GITHUB_EVENT_PATH` environment variable, which is the path to the file that contains the JSON payload for the event that triggered the workflow. Codex can use `jq` to read only the fields of interest from this file.

### `CODEX_ACTION_PR_DIFF`

If the action was triggered on a pull request, this is the diff between the base and head commits of the PR. It is the output from `git diff`.

Note that the content of the diff could be quite large, so it is generally safer to point Codex at `CODEX_ACTION_GITHUB_EVENT_PATH` and let it decide how it wants to explore the change.
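To make the templating above concrete, here is a minimal sketch of the `{CODEX_ACTION_...}` substitution, assuming synchronous lookups from a plain map. The action's actual renderer (`src/prompt-template.ts`, later in this diff) resolves placeholders asynchronously and caches them; the `values` map here is purely illustrative:

```ts
// Minimal sketch of {CODEX_ACTION_...} placeholder substitution, assuming
// synchronous lookups. The real renderer (src/prompt-template.ts) is async
// and caches each placeholder; `values` is a hypothetical example map.
const VAR_REGEX = /\{(CODEX_ACTION_[A-Z0-9_]+)\}/g;

function renderSketch(
  template: string,
  values: Record<string, string>,
): string {
  // Unknown placeholders are left intact rather than replaced with "".
  return template.replace(VAR_REGEX, (match, name) => values[name] ?? match);
}

const rendered = renderSketch(
  "### {CODEX_ACTION_ISSUE_TITLE}\n\n{CODEX_ACTION_ISSUE_BODY}",
  {
    CODEX_ACTION_ISSUE_TITLE: "CLI crashes on startup",
    CODEX_ACTION_ISSUE_BODY: "Steps to reproduce: ...",
  },
);
console.log(rendered); // "### CLI crashes on startup\n\nSteps to reproduce: ..."
```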
125 .github/actions/codex/action.yml (vendored, normal file)
@@ -0,0 +1,125 @@
name: "Codex [reusable action]"
description: "A reusable action that runs a Codex model."

inputs:
  openai_api_key:
    description: "The value to use as the OPENAI_API_KEY environment variable when running Codex."
    required: true
  trigger_phrase:
    description: "Text to trigger Codex from a PR/issue body or comment."
    required: false
    default: ""
  github_token:
    description: "Token so Codex can comment on the PR or issue."
    required: true
  codex_args:
    description: "A whitespace-delimited list of arguments to pass to Codex. Due to limitations in YAML, arguments with spaces are not supported. For more complex configurations, use the `codex_home` input."
    required: false
    default: "--config hide_agent_reasoning=true --full-auto"
  codex_home:
    description: "Value to use as the CODEX_HOME environment variable when running Codex."
    required: false
  codex_release_tag:
    description: "The release tag of the Codex binary to run, e.g., 'rust-v0.3.0'. Defaults to the latest release."
    required: false
    default: ""

runs:
  using: "composite"
  steps:
    # Do this in Bash so we do not even bother to install Bun if the sender
    # does not have write access to the repo.
    - name: Verify user has write access to the repo.
      env:
        GH_TOKEN: ${{ github.token }}
      shell: bash
      run: |
        set -euo pipefail

        PERMISSION=$(gh api \
          "/repos/${GITHUB_REPOSITORY}/collaborators/${{ github.event.sender.login }}/permission" \
          | jq -r '.permission')

        if [[ "$PERMISSION" != "admin" && "$PERMISSION" != "write" ]]; then
          exit 1
        fi

    - name: Download Codex
      env:
        GH_TOKEN: ${{ github.token }}
      shell: bash
      run: |
        set -euo pipefail

        # Determine OS/arch and corresponding Codex artifact name.
        uname_s=$(uname -s)
        uname_m=$(uname -m)

        case "$uname_s" in
          Linux*) os="linux" ;;
          Darwin*) os="apple-darwin" ;;
          *) echo "Unsupported operating system: $uname_s"; exit 1 ;;
        esac

        case "$uname_m" in
          x86_64*) arch="x86_64" ;;
          arm64*|aarch64*) arch="aarch64" ;;
          *) echo "Unsupported architecture: $uname_m"; exit 1 ;;
        esac

        # Linux builds differentiate between musl and gnu.
        if [[ "$os" == "linux" ]]; then
          if [[ "$arch" == "x86_64" ]]; then
            triple="${arch}-unknown-linux-musl"
          else
            # Only other supported Linux build is aarch64 gnu.
            triple="${arch}-unknown-linux-gnu"
          fi
        else
          # macOS
          triple="${arch}-apple-darwin"
        fi

        # Note that if we start baking version numbers into the artifact name,
        # we will need to update this action.yml file to match.
        artifact="codex-${triple}.tar.gz"

        TAG_ARG="${{ inputs.codex_release_tag }}"
        # The usage is `gh release download [<tag>] [flags]`, so if TAG_ARG
        # is empty, we do not pass it so we can default to the latest release.
        gh release download ${TAG_ARG:+$TAG_ARG} --repo openai/codex \
          --pattern "$artifact" --output - \
          | tar xzO > /usr/local/bin/codex
        chmod +x /usr/local/bin/codex

        # Display Codex version to confirm binary integrity.
        codex --version

    - name: Install Bun
      uses: oven-sh/setup-bun@v2
      with:
        bun-version: 1.2.11

    - name: Install dependencies
      shell: bash
      run: |
        cd ${{ github.action_path }}
        bun install --production

    - name: Run Codex
      shell: bash
      run: bun run ${{ github.action_path }}/src/main.ts
      # Process args plus environment variables often have a max of 128 KiB,
      # so we should fit within that limit?
      env:
        INPUT_CODEX_ARGS: ${{ inputs.codex_args || '' }}
        INPUT_CODEX_HOME: ${{ inputs.codex_home || '' }}
        INPUT_TRIGGER_PHRASE: ${{ inputs.trigger_phrase || '' }}
        OPENAI_API_KEY: ${{ inputs.openai_api_key }}
        GITHUB_TOKEN: ${{ inputs.github_token }}
        GITHUB_EVENT_ACTION: ${{ github.event.action || '' }}
        GITHUB_EVENT_LABEL_NAME: ${{ github.event.label.name || '' }}
        GITHUB_EVENT_ISSUE_NUMBER: ${{ github.event.issue.number || '' }}
        GITHUB_EVENT_ISSUE_BODY: ${{ github.event.issue.body || '' }}
        GITHUB_EVENT_REVIEW_BODY: ${{ github.event.review.body || '' }}
        GITHUB_EVENT_COMMENT_BODY: ${{ github.event.comment.body || '' }}
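The `Run Codex` step above hands every action input and event field to `src/main.ts` through environment variables. A minimal sketch of how those variables look from the TypeScript side (the real code reads them through `EnvContext` rather than `process.env` directly; values are illustrative):

```ts
// Sketch: consuming the env vars set in the "Run Codex" step above.
// The action's real code goes through EnvContext (src/env-context.ts).
const codexArgs = (process.env["INPUT_CODEX_ARGS"] ?? "")
  .split(/\s+/)
  .filter(Boolean); // whitespace-delimited, per the codex_args input contract

const labelName = process.env["GITHUB_EVENT_LABEL_NAME"] ?? "";

console.log(`codex args: ${JSON.stringify(codexArgs)}`);
console.log(`triggering label: ${labelName || "(none)"}`);
```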
91 .github/actions/codex/bun.lock (vendored, normal file)
@@ -0,0 +1,91 @@
{
  "lockfileVersion": 1,
  "workspaces": {
    "": {
      "name": "codex-action",
      "dependencies": {
        "@actions/core": "^1.11.1",
        "@actions/github": "^6.0.1",
      },
      "devDependencies": {
        "@types/bun": "^1.2.20",
        "@types/node": "^24.3.0",
        "prettier": "^3.6.2",
        "typescript": "^5.9.2",
      },
    },
  },
  "packages": {
    "@actions/core": ["@actions/core@1.11.1", "", { "dependencies": { "@actions/exec": "^1.1.1", "@actions/http-client": "^2.0.1" } }, "sha512-hXJCSrkwfA46Vd9Z3q4cpEpHB1rL5NG04+/rbqW9d3+CSvtB1tYe8UTpAlixa1vj0m/ULglfEK2UKxMGxCxv5A=="],

    "@actions/exec": ["@actions/exec@1.1.1", "", { "dependencies": { "@actions/io": "^1.0.1" } }, "sha512-+sCcHHbVdk93a0XT19ECtO/gIXoxvdsgQLzb2fE2/5sIZmWQuluYyjPQtrtTHdU1YzTZ7bAPN4sITq2xi1679w=="],

    "@actions/github": ["@actions/github@6.0.1", "", { "dependencies": { "@actions/http-client": "^2.2.0", "@octokit/core": "^5.0.1", "@octokit/plugin-paginate-rest": "^9.2.2", "@octokit/plugin-rest-endpoint-methods": "^10.4.0", "@octokit/request": "^8.4.1", "@octokit/request-error": "^5.1.1", "undici": "^5.28.5" } }, "sha512-xbZVcaqD4XnQAe35qSQqskb3SqIAfRyLBrHMd/8TuL7hJSz2QtbDwnNM8zWx4zO5l2fnGtseNE3MbEvD7BxVMw=="],

    "@actions/http-client": ["@actions/http-client@2.2.3", "", { "dependencies": { "tunnel": "^0.0.6", "undici": "^5.25.4" } }, "sha512-mx8hyJi/hjFvbPokCg4uRd4ZX78t+YyRPtnKWwIl+RzNaVuFpQHfmlGVfsKEJN8LwTCvL+DfVgAM04XaHkm6bA=="],

    "@actions/io": ["@actions/io@1.1.3", "", {}, "sha512-wi9JjgKLYS7U/z8PPbco+PvTb/nRWjeoFlJ1Qer83k/3C5PHQi28hiVdeE2kHXmIL99mQFawx8qt/JPjZilJ8Q=="],

    "@fastify/busboy": ["@fastify/busboy@2.1.1", "", {}, "sha512-vBZP4NlzfOlerQTnba4aqZoMhE/a9HY7HRqoOPaETQcSQuWEIyZMHGfVu6w9wGtGK5fED5qRs2DteVCjOH60sA=="],

    "@octokit/auth-token": ["@octokit/auth-token@4.0.0", "", {}, "sha512-tY/msAuJo6ARbK6SPIxZrPBms3xPbfwBrulZe0Wtr/DIY9lje2HeV1uoebShn6mx7SjCHif6EjMvoREj+gZ+SA=="],

    "@octokit/core": ["@octokit/core@5.2.1", "", { "dependencies": { "@octokit/auth-token": "^4.0.0", "@octokit/graphql": "^7.1.0", "@octokit/request": "^8.4.1", "@octokit/request-error": "^5.1.1", "@octokit/types": "^13.0.0", "before-after-hook": "^2.2.0", "universal-user-agent": "^6.0.0" } }, "sha512-dKYCMuPO1bmrpuogcjQ8z7ICCH3FP6WmxpwC03yjzGfZhj9fTJg6+bS1+UAplekbN2C+M61UNllGOOoAfGCrdQ=="],

    "@octokit/endpoint": ["@octokit/endpoint@9.0.6", "", { "dependencies": { "@octokit/types": "^13.1.0", "universal-user-agent": "^6.0.0" } }, "sha512-H1fNTMA57HbkFESSt3Y9+FBICv+0jFceJFPWDePYlR/iMGrwM5ph+Dd4XRQs+8X+PUFURLQgX9ChPfhJ/1uNQw=="],

    "@octokit/graphql": ["@octokit/graphql@7.1.1", "", { "dependencies": { "@octokit/request": "^8.4.1", "@octokit/types": "^13.0.0", "universal-user-agent": "^6.0.0" } }, "sha512-3mkDltSfcDUoa176nlGoA32RGjeWjl3K7F/BwHwRMJUW/IteSa4bnSV8p2ThNkcIcZU2umkZWxwETSSCJf2Q7g=="],

    "@octokit/openapi-types": ["@octokit/openapi-types@24.2.0", "", {}, "sha512-9sIH3nSUttelJSXUrmGzl7QUBFul0/mB8HRYl3fOlgHbIWG+WnYDXU3v/2zMtAvuzZ/ed00Ei6on975FhBfzrg=="],

    "@octokit/plugin-paginate-rest": ["@octokit/plugin-paginate-rest@9.2.2", "", { "dependencies": { "@octokit/types": "^12.6.0" }, "peerDependencies": { "@octokit/core": "5" } }, "sha512-u3KYkGF7GcZnSD/3UP0S7K5XUFT2FkOQdcfXZGZQPGv3lm4F2Xbf71lvjldr8c1H3nNbF+33cLEkWYbokGWqiQ=="],

    "@octokit/plugin-rest-endpoint-methods": ["@octokit/plugin-rest-endpoint-methods@10.4.1", "", { "dependencies": { "@octokit/types": "^12.6.0" }, "peerDependencies": { "@octokit/core": "5" } }, "sha512-xV1b+ceKV9KytQe3zCVqjg+8GTGfDYwaT1ATU5isiUyVtlVAO3HNdzpS4sr4GBx4hxQ46s7ITtZrAsxG22+rVg=="],

    "@octokit/request": ["@octokit/request@8.4.1", "", { "dependencies": { "@octokit/endpoint": "^9.0.6", "@octokit/request-error": "^5.1.1", "@octokit/types": "^13.1.0", "universal-user-agent": "^6.0.0" } }, "sha512-qnB2+SY3hkCmBxZsR/MPCybNmbJe4KAlfWErXq+rBKkQJlbjdJeS85VI9r8UqeLYLvnAenU8Q1okM/0MBsAGXw=="],

    "@octokit/request-error": ["@octokit/request-error@5.1.1", "", { "dependencies": { "@octokit/types": "^13.1.0", "deprecation": "^2.0.0", "once": "^1.4.0" } }, "sha512-v9iyEQJH6ZntoENr9/yXxjuezh4My67CBSu9r6Ve/05Iu5gNgnisNWOsoJHTP6k0Rr0+HQIpnH+kyammu90q/g=="],

    "@octokit/types": ["@octokit/types@13.10.0", "", { "dependencies": { "@octokit/openapi-types": "^24.2.0" } }, "sha512-ifLaO34EbbPj0Xgro4G5lP5asESjwHracYJvVaPIyXMuiuXLlhic3S47cBdTb+jfODkTE5YtGCLt3Ay3+J97sA=="],

    "@types/bun": ["@types/bun@1.2.20", "", { "dependencies": { "bun-types": "1.2.20" } }, "sha512-dX3RGzQ8+KgmMw7CsW4xT5ITBSCrSbfHc36SNT31EOUg/LA9JWq0VDdEXDRSe1InVWpd2yLUM1FUF/kEOyTzYA=="],

    "@types/node": ["@types/node@24.3.0", "", { "dependencies": { "undici-types": "~7.10.0" } }, "sha512-aPTXCrfwnDLj4VvXrm+UUCQjNEvJgNA8s5F1cvwQU+3KNltTOkBm1j30uNLyqqPNe7gE3KFzImYoZEfLhp4Yow=="],

    "@types/react": ["@types/react@19.1.8", "", { "dependencies": { "csstype": "^3.0.2" } }, "sha512-AwAfQ2Wa5bCx9WP8nZL2uMZWod7J7/JSplxbTmBQ5ms6QpqNYm672H0Vu9ZVKVngQ+ii4R/byguVEUZQyeg44g=="],

    "before-after-hook": ["before-after-hook@2.2.3", "", {}, "sha512-NzUnlZexiaH/46WDhANlyR2bXRopNg4F/zuSA3OpZnllCUgRaOF2znDioDWrmbNVsuZk6l9pMquQB38cfBZwkQ=="],

    "bun-types": ["bun-types@1.2.20", "", { "dependencies": { "@types/node": "*" }, "peerDependencies": { "@types/react": "^19" } }, "sha512-pxTnQYOrKvdOwyiyd/7sMt9yFOenN004Y6O4lCcCUoKVej48FS5cvTw9geRaEcB9TsDZaJKAxPTVvi8tFsVuXA=="],

    "csstype": ["csstype@3.1.3", "", {}, "sha512-M1uQkMl8rQK/szD0LNhtqxIPLpimGm8sOBwU7lLnCpSbTyY3yeU1Vc7l4KT5zT4s/yOxHH5O7tIuuLOCnLADRw=="],

    "deprecation": ["deprecation@2.3.1", "", {}, "sha512-xmHIy4F3scKVwMsQ4WnVaS8bHOx0DmVwRywosKhaILI0ywMDWPtBSku2HNxRvF7jtwDRsoEwYQSfbxj8b7RlJQ=="],

    "once": ["once@1.4.0", "", { "dependencies": { "wrappy": "1" } }, "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w=="],

    "prettier": ["prettier@3.6.2", "", { "bin": { "prettier": "bin/prettier.cjs" } }, "sha512-I7AIg5boAr5R0FFtJ6rCfD+LFsWHp81dolrFD8S79U9tb8Az2nGrJncnMSnys+bpQJfRUzqs9hnA81OAA3hCuQ=="],

    "tunnel": ["tunnel@0.0.6", "", {}, "sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg=="],

    "typescript": ["typescript@5.9.2", "", { "bin": { "tsc": "bin/tsc", "tsserver": "bin/tsserver" } }, "sha512-CWBzXQrc/qOkhidw1OzBTQuYRbfyxDXJMVJ1XNwUHGROVmuaeiEm3OslpZ1RV96d7SKKjZKrSJu3+t/xlw3R9A=="],

    "undici": ["undici@5.29.0", "", { "dependencies": { "@fastify/busboy": "^2.0.0" } }, "sha512-raqeBD6NQK4SkWhQzeYKd1KmIG6dllBOTt55Rmkt4HtI9mwdWtJljnrXjAFUBLTSN67HWrOIZ3EPF4kjUw80Bg=="],

    "undici-types": ["undici-types@7.10.0", "", {}, "sha512-t5Fy/nfn+14LuOc2KNYg75vZqClpAiqscVvMygNnlsHBFpSXdJaYtXMcdNLpl/Qvc3P2cB3s6lOV51nqsFq4ag=="],

    "universal-user-agent": ["universal-user-agent@6.0.1", "", {}, "sha512-yCzhz6FN2wU1NiiQRogkTQszlQSlpWaw8SvVegAc+bDxbzHgh1vX8uIe8OYyMH6DwH+sdTJsgMl36+mSMdRJIQ=="],

    "wrappy": ["wrappy@1.0.2", "", {}, "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ=="],

    "@octokit/plugin-paginate-rest/@octokit/types": ["@octokit/types@12.6.0", "", { "dependencies": { "@octokit/openapi-types": "^20.0.0" } }, "sha512-1rhSOfRa6H9w4YwK0yrf5faDaDTb+yLyBUKOCV4xtCDB5VmIPqd/v9yr9o6SAzOAlRxMiRiCic6JVM1/kunVkw=="],

    "@octokit/plugin-rest-endpoint-methods/@octokit/types": ["@octokit/types@12.6.0", "", { "dependencies": { "@octokit/openapi-types": "^20.0.0" } }, "sha512-1rhSOfRa6H9w4YwK0yrf5faDaDTb+yLyBUKOCV4xtCDB5VmIPqd/v9yr9o6SAzOAlRxMiRiCic6JVM1/kunVkw=="],

    "bun-types/@types/node": ["@types/node@24.2.1", "", { "dependencies": { "undici-types": "~7.10.0" } }, "sha512-DRh5K+ka5eJic8CjH7td8QpYEV6Zo10gfRkjHCO3weqZHWDtAaSTFtl4+VMqOJ4N5jcuhZ9/l+yy8rVgw7BQeQ=="],

    "@octokit/plugin-paginate-rest/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@20.0.0", "", {}, "sha512-EtqRBEjp1dL/15V7WiX5LJMIxxkdiGJnabzYx5Apx4FkQIFgAfKumXeYAqqJCj1s+BMX4cPFIFC4OLCR6stlnA=="],

    "@octokit/plugin-rest-endpoint-methods/@octokit/types/@octokit/openapi-types": ["@octokit/openapi-types@20.0.0", "", {}, "sha512-EtqRBEjp1dL/15V7WiX5LJMIxxkdiGJnabzYx5Apx4FkQIFgAfKumXeYAqqJCj1s+BMX4cPFIFC4OLCR6stlnA=="],
  }
}
21 .github/actions/codex/package.json (vendored, normal file)
@@ -0,0 +1,21 @@
{
  "name": "codex-action",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "format": "prettier --check src",
    "format:fix": "prettier --write src",
    "test": "bun test",
    "typecheck": "tsc"
  },
  "dependencies": {
    "@actions/core": "^1.11.1",
    "@actions/github": "^6.0.1"
  },
  "devDependencies": {
    "@types/bun": "^1.2.20",
    "@types/node": "^24.3.0",
    "prettier": "^3.6.2",
    "typescript": "^5.9.2"
  }
}
85 .github/actions/codex/src/add-reaction.ts (vendored, normal file)
@@ -0,0 +1,85 @@
import * as github from "@actions/github";
import type { EnvContext } from "./env-context";

/**
 * Add an "eyes" reaction to the entity (issue, issue comment, or pull request
 * review comment) that triggered the current Codex invocation.
 *
 * The purpose is to provide immediate feedback to the user – similar to the
 * *-in-progress label flow – indicating that the bot has acknowledged the
 * request and is working on it.
 *
 * We attempt to add the reaction best suited for the current GitHub event:
 *
 *   • issues → POST /repos/{owner}/{repo}/issues/{issue_number}/reactions
 *   • issue_comment → POST /repos/{owner}/{repo}/issues/comments/{comment_id}/reactions
 *   • pull_request_review_comment → POST /repos/{owner}/{repo}/pulls/comments/{comment_id}/reactions
 *
 * If the specific target is unavailable (e.g. unexpected payload shape) we
 * silently skip instead of failing the whole action because the reaction is
 * merely cosmetic.
 */
export async function addEyesReaction(ctx: EnvContext): Promise<void> {
  const octokit = ctx.getOctokit();
  const { owner, repo } = github.context.repo;
  const eventName = github.context.eventName;

  try {
    switch (eventName) {
      case "issue_comment": {
        const commentId = (github.context.payload as any)?.comment?.id;
        if (commentId) {
          await octokit.rest.reactions.createForIssueComment({
            owner,
            repo,
            comment_id: commentId,
            content: "eyes",
          });
          return;
        }
        break;
      }
      case "pull_request_review_comment": {
        const commentId = (github.context.payload as any)?.comment?.id;
        if (commentId) {
          await octokit.rest.reactions.createForPullRequestReviewComment({
            owner,
            repo,
            comment_id: commentId,
            content: "eyes",
          });
          return;
        }
        break;
      }
      case "issues": {
        const issueNumber = github.context.issue.number;
        if (issueNumber) {
          await octokit.rest.reactions.createForIssue({
            owner,
            repo,
            issue_number: issueNumber,
            content: "eyes",
          });
          return;
        }
        break;
      }
      default: {
        // Fallback: try to react to the issue/PR if we have a number.
        const issueNumber = github.context.issue.number;
        if (issueNumber) {
          await octokit.rest.reactions.createForIssue({
            owner,
            repo,
            issue_number: issueNumber,
            content: "eyes",
          });
        }
      }
    }
  } catch (error) {
    // Do not fail the action if reaction creation fails – log and continue.
    console.warn(`Failed to add "eyes" reaction: ${error}`);
  }
}
53 .github/actions/codex/src/comment.ts (vendored, normal file)
@@ -0,0 +1,53 @@
import type { EnvContext } from "./env-context";
import { runCodex } from "./run-codex";
import { postComment } from "./post-comment";
import { addEyesReaction } from "./add-reaction";

/**
 * Handle `issue_comment` and `pull_request_review_comment` events once we know
 * the action is supported.
 */
export async function onComment(ctx: EnvContext): Promise<void> {
  const triggerPhrase = ctx.tryGet("INPUT_TRIGGER_PHRASE");
  if (!triggerPhrase) {
    console.warn("Empty trigger phrase: skipping.");
    return;
  }

  // Attempt to get the body of the comment from the environment. Depending on
  // the event type, either `GITHUB_EVENT_COMMENT_BODY` (issue & PR comments)
  // or `GITHUB_EVENT_REVIEW_BODY` (PR reviews) is set.
  const commentBody =
    ctx.tryGetNonEmpty("GITHUB_EVENT_COMMENT_BODY") ??
    ctx.tryGetNonEmpty("GITHUB_EVENT_REVIEW_BODY") ??
    ctx.tryGetNonEmpty("GITHUB_EVENT_ISSUE_BODY");

  if (!commentBody) {
    console.warn("Comment body not found in environment: skipping.");
    return;
  }

  // Check if the trigger phrase is present.
  if (!commentBody.includes(triggerPhrase)) {
    console.log(
      `Trigger phrase '${triggerPhrase}' not found: nothing to do for this comment.`,
    );
    return;
  }

  // Derive the prompt by removing the trigger phrase. Remove only the first
  // occurrence to keep any additional occurrences that might be meaningful.
  const prompt = commentBody.replace(triggerPhrase, "").trim();

  if (prompt.length === 0) {
    console.warn("Prompt is empty after removing trigger phrase: skipping");
    return;
  }

  // Provide immediate feedback that we are working on the request.
  await addEyesReaction(ctx);

  // Run Codex and post the response as a new comment.
  const lastMessage = await runCodex(prompt, ctx);
  await postComment(lastMessage, ctx);
}
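For example, with `trigger_phrase` set to `@codex`, a comment of `@codex please fix the flaky test` produces the prompt `please fix the flaky test`. A sketch of the same derivation in isolation (values illustrative):

```ts
// Mirrors onComment's prompt derivation: strip only the first occurrence of
// the trigger phrase, then trim surrounding whitespace.
const triggerPhrase = "@codex";
const commentBody = "@codex please fix the flaky test";

const prompt = commentBody.replace(triggerPhrase, "").trim();
console.log(prompt); // "please fix the flaky test"
```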
11 .github/actions/codex/src/config.ts (vendored, normal file)
@@ -0,0 +1,11 @@
import { readdirSync, statSync } from "fs";
import * as path from "path";

export interface Config {
  labels: Record<string, LabelConfig>;
}

export interface LabelConfig {
  /** Returns the prompt template. */
  getPromptTemplate(): string;
}
44 .github/actions/codex/src/default-label-config.ts (vendored, normal file)
@@ -0,0 +1,44 @@
import type { Config } from "./config";

export function getDefaultConfig(): Config {
  return {
    labels: {
      "codex-investigate-issue": {
        getPromptTemplate: () =>
          `
Troubleshoot whether the reported issue is valid.

Provide a concise and respectful comment summarizing the findings.

### {CODEX_ACTION_ISSUE_TITLE}

{CODEX_ACTION_ISSUE_BODY}
`.trim(),
      },
      "codex-code-review": {
        getPromptTemplate: () =>
          `
Review this PR and respond with a very concise final message, formatted in Markdown.

There should be a summary of the changes (1-2 sentences) and a few bullet points if necessary.

Then provide the **review** (1-2 sentences plus bullet points, friendly tone).

{CODEX_ACTION_GITHUB_EVENT_PATH} contains the JSON that triggered this GitHub workflow. It contains the \`base\` and \`head\` refs that define this PR. Both refs are available locally.
`.trim(),
      },
      "codex-attempt-fix": {
        getPromptTemplate: () =>
          `
Attempt to solve the reported issue.

If a code change is required, create a new branch, commit the fix, and open a pull-request that resolves the problem.

### {CODEX_ACTION_ISSUE_TITLE}

{CODEX_ACTION_ISSUE_BODY}
`.trim(),
      },
    },
  };
}
116 .github/actions/codex/src/env-context.ts (vendored, normal file)
@@ -0,0 +1,116 @@
/*
 * Centralised access to environment variables used by the Codex GitHub
 * Action.
 *
 * To enable proper unit-testing we avoid reading from `process.env` at module
 * initialisation time. Instead an `EnvContext` object is created (usually from
 * the real `process.env`) and passed around explicitly or – where that is not
 * yet practical – imported as the shared `defaultContext` singleton. Tests can
 * create their own context backed by a stubbed map of variables without having
 * to mutate global state.
 */

import { fail } from "./fail";
import * as github from "@actions/github";

export interface EnvContext {
  /**
   * Return the value for a given environment variable or terminate the action
   * via `fail` if it is missing / empty.
   */
  get(name: string): string;

  /**
   * Attempt to read an environment variable. Returns the value when present;
   * otherwise returns undefined (does not call `fail`).
   */
  tryGet(name: string): string | undefined;

  /**
   * Attempt to read an environment variable. Returns a non-empty string
   * value, or null if the variable is unset or the empty string.
   */
  tryGetNonEmpty(name: string): string | null;

  /**
   * Return a memoised Octokit instance authenticated via the token resolved
   * from the provided argument (when defined) or the environment variables
   * `GITHUB_TOKEN`/`GH_TOKEN`.
   *
   * Subsequent calls return the same cached instance to avoid spawning
   * multiple REST clients within a single action run.
   */
  getOctokit(token?: string): ReturnType<typeof github.getOctokit>;
}

/** Internal helper – *not* exported. */
function _getRequiredEnv(
  name: string,
  env: Record<string, string | undefined>,
): string | undefined {
  const value = env[name];

  // Avoid leaking secrets into logs while still logging non-secret variables.
  if (name.endsWith("KEY") || name.endsWith("TOKEN")) {
    if (value) {
      console.log(`value for ${name} was found`);
    }
  } else {
    console.log(`${name}=${value}`);
  }

  return value;
}

/** Create a context backed by the supplied environment map (defaults to `process.env`). */
export function createEnvContext(
  env: Record<string, string | undefined> = process.env,
): EnvContext {
  // Lazily instantiated Octokit client – shared across this context.
  let cachedOctokit: ReturnType<typeof github.getOctokit> | null = null;

  return {
    get(name: string): string {
      const value = _getRequiredEnv(name, env);
      if (value == null) {
        fail(`Missing required environment variable: ${name}`);
      }
      return value;
    },

    tryGet(name: string): string | undefined {
      return _getRequiredEnv(name, env);
    },

    tryGetNonEmpty(name: string): string | null {
      const value = _getRequiredEnv(name, env);
      return value == null || value === "" ? null : value;
    },

    getOctokit(token?: string) {
      if (cachedOctokit) {
        return cachedOctokit;
      }

      // Determine the token to authenticate with.
      const githubToken = token ?? env["GITHUB_TOKEN"] ?? env["GH_TOKEN"];

      if (!githubToken) {
        fail(
          "Unable to locate a GitHub token. `github_token` should have been set on the action.",
        );
      }

      cachedOctokit = github.getOctokit(githubToken!);
      return cachedOctokit;
    },
  };
}

/**
 * Shared context built from the actual `process.env`. Production code that is
 * not yet refactored to receive a context explicitly may import and use this
 * singleton. Tests should avoid the singleton and instead pass their own
 * context to the functions they exercise.
 */
export const defaultContext: EnvContext = createEnvContext();
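Because a context can be backed by any map, unit tests need not touch `process.env`. A sketch of such a test, assuming Bun's built-in `bun:test` runner (which the package's `test` script uses); the test file and names are illustrative:

```ts
// Hypothetical test exercising createEnvContext with a stubbed env map, so no
// global process.env mutation is needed.
import { describe, expect, test } from "bun:test";
import { createEnvContext } from "./env-context";

describe("createEnvContext", () => {
  test("tryGetNonEmpty distinguishes unset and empty variables", () => {
    const ctx = createEnvContext({ FOO: "bar", EMPTY: "" });
    expect(ctx.tryGetNonEmpty("FOO")).toBe("bar");
    expect(ctx.tryGetNonEmpty("EMPTY")).toBeNull();
    expect(ctx.tryGetNonEmpty("MISSING")).toBeNull();
  });
});
```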
4 .github/actions/codex/src/fail.ts (vendored, normal file)
@@ -0,0 +1,4 @@
export function fail(message: string): never {
  console.error(message);
  process.exit(1);
}
149 .github/actions/codex/src/git-helpers.ts (vendored, normal file)
@@ -0,0 +1,149 @@
import { spawnSync } from "child_process";
import * as github from "@actions/github";
import { EnvContext } from "./env-context";

function runGit(args: string[], silent = true): string {
  console.info(`Running git ${args.join(" ")}`);
  const res = spawnSync("git", args, {
    encoding: "utf8",
    stdio: silent ? ["ignore", "pipe", "pipe"] : "inherit",
  });
  if (res.error) {
    throw res.error;
  }
  if (res.status !== 0) {
    // Surface stderr in the error so the caller can handle it.
    throw new Error(
      `git ${args.join(" ")} failed with code ${res.status}: ${res.stderr}`,
    );
  }
  return res.stdout.trim();
}

function stageAllChanges() {
  runGit(["add", "-A"]);
}

function hasStagedChanges(): boolean {
  const res = spawnSync("git", ["diff", "--cached", "--quiet", "--exit-code"]);
  return res.status !== 0;
}

function ensureOnBranch(
  issueNumber: number,
  protectedBranches: string[],
  suggestedSlug?: string,
): string {
  let branch = "";
  try {
    branch = runGit(["symbolic-ref", "--short", "-q", "HEAD"]);
  } catch {
    branch = "";
  }

  // If detached HEAD or on a protected branch, create a new branch.
  if (!branch || protectedBranches.includes(branch)) {
    if (suggestedSlug) {
      const safeSlug = suggestedSlug
        .toLowerCase()
        .replace(/[^\w\s-]/g, "")
        .trim()
        .replace(/\s+/g, "-");
      branch = `codex-fix-${issueNumber}-${safeSlug}`;
    } else {
      branch = `codex-fix-${issueNumber}-${Date.now()}`;
    }
    runGit(["switch", "-c", branch]);
  }
  return branch;
}

function commitIfNeeded(issueNumber: number) {
  if (hasStagedChanges()) {
    runGit([
      "commit",
      "-m",
      `fix: automated fix for #${issueNumber} via Codex`,
    ]);
  }
}

function pushBranch(branch: string, githubToken: string, ctx: EnvContext) {
  const repoSlug = ctx.get("GITHUB_REPOSITORY"); // owner/repo
  const remoteUrl = `https://x-access-token:${githubToken}@github.com/${repoSlug}.git`;

  runGit(["push", "--force-with-lease", "-u", remoteUrl, `HEAD:${branch}`]);
}

/**
 * If this returns a string, it is the URL of the created PR.
 */
export async function maybePublishPRForIssue(
  issueNumber: number,
  lastMessage: string,
  ctx: EnvContext,
): Promise<string | undefined> {
  // Only proceed if a GITHUB_TOKEN is available.
  const githubToken =
    ctx.tryGetNonEmpty("GITHUB_TOKEN") ?? ctx.tryGetNonEmpty("GH_TOKEN");
  if (!githubToken) {
    console.warn("No GitHub token - skipping PR creation.");
    return undefined;
  }

  // Print `git status` for debugging.
  runGit(["status"]);

  // Stage any remaining changes so they can be committed and pushed.
  stageAllChanges();

  const octokit = ctx.getOctokit(githubToken);

  const { owner, repo } = github.context.repo;

  // Determine the default branch to treat as protected.
  let defaultBranch = "main";
  try {
    const repoInfo = await octokit.rest.repos.get({ owner, repo });
    defaultBranch = repoInfo.data.default_branch ?? "main";
  } catch (e) {
    console.warn(`Failed to get default branch, assuming 'main': ${e}`);
  }

  const sanitizedMessage = lastMessage.replace(/\u2022/g, "-");
  const [summaryLine] = sanitizedMessage.split(/\r?\n/);
  const branch = ensureOnBranch(
    issueNumber,
    [defaultBranch, "master"],
    summaryLine,
  );
  commitIfNeeded(issueNumber);
  pushBranch(branch, githubToken, ctx);

  // Try to find an existing PR for this branch.
  const headParam = `${owner}:${branch}`;
  const existing = await octokit.rest.pulls.list({
    owner,
    repo,
    head: headParam,
    state: "open",
  });
  if (existing.data.length > 0) {
    return existing.data[0].html_url;
  }

  // Determine the base branch (default to main).
  let baseBranch = "main";
  try {
    const repoInfo = await octokit.rest.repos.get({ owner, repo });
    baseBranch = repoInfo.data.default_branch ?? "main";
  } catch (e) {
    console.warn(`Failed to get default branch, assuming 'main': ${e}`);
  }

  const pr = await octokit.rest.pulls.create({
    owner,
    repo,
    title: summaryLine,
    head: branch,
    base: baseBranch,
    body: sanitizedMessage,
  });
  return pr.data.html_url;
}
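To make `ensureOnBranch`'s branch naming concrete, here is the same slug transformation applied to a sample summary line for issue #123 (the input value is illustrative):

```ts
// Worked example of the safeSlug transformation from ensureOnBranch.
const issueNumber = 123;
const suggestedSlug = "Fix: crash when config.toml is missing!";

const safeSlug = suggestedSlug
  .toLowerCase()
  .replace(/[^\w\s-]/g, "") // drop punctuation
  .trim()
  .replace(/\s+/g, "-"); // spaces -> hyphens

console.log(`codex-fix-${issueNumber}-${safeSlug}`);
// => codex-fix-123-fix-crash-when-configtoml-is-missing
```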
16 .github/actions/codex/src/git-user.ts (vendored, normal file)
@@ -0,0 +1,16 @@
export function setGitHubActionsUser(): void {
  const commands = [
    ["git", "config", "--global", "user.name", "github-actions[bot]"],
    [
      "git",
      "config",
      "--global",
      "user.email",
      "41898282+github-actions[bot]@users.noreply.github.com",
    ],
  ];

  for (const command of commands) {
    Bun.spawnSync(command);
  }
}
11 .github/actions/codex/src/github-workspace.ts (vendored, normal file)
@@ -0,0 +1,11 @@
import * as pathMod from "path";
import { EnvContext } from "./env-context";

export function resolveWorkspacePath(path: string, ctx: EnvContext): string {
  if (pathMod.isAbsolute(path)) {
    return path;
  } else {
    const workspace = ctx.get("GITHUB_WORKSPACE");
    return pathMod.join(workspace, path);
  }
}
56 .github/actions/codex/src/load-config.ts (vendored, normal file)
@@ -0,0 +1,56 @@
import type { Config, LabelConfig } from "./config";

import { getDefaultConfig } from "./default-label-config";
import { readFileSync, readdirSync, statSync } from "fs";
import * as path from "path";

/**
 * Build an in-memory configuration object by scanning the repository for
 * Markdown templates located in `.github/codex/labels`.
 *
 * Each `*.md` file in that directory represents a label that can trigger the
 * Codex GitHub Action. The filename **without** the extension is interpreted
 * as the label name, e.g. `codex-review.md` ➜ `codex-review`.
 *
 * For every such label we derive the corresponding `doneLabel` by appending
 * the suffix `-completed`.
 */
export function loadConfig(workspace: string): Config {
  const labelsDir = path.join(workspace, ".github", "codex", "labels");

  let entries: string[];
  try {
    entries = readdirSync(labelsDir);
  } catch {
    // If the directory is missing, return the default configuration.
    return getDefaultConfig();
  }

  const labels: Record<string, LabelConfig> = {};

  for (const entry of entries) {
    if (!entry.endsWith(".md")) {
      continue;
    }

    const fullPath = path.join(labelsDir, entry);

    if (!statSync(fullPath).isFile()) {
      continue;
    }

    const labelName = entry.slice(0, -3); // trim ".md"

    labels[labelName] = new FileLabelConfig(fullPath);
  }

  return { labels };
}

class FileLabelConfig implements LabelConfig {
  constructor(private readonly promptPath: string) {}

  getPromptTemplate(): string {
    return readFileSync(this.promptPath, "utf8");
  }
}
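A usage sketch for `loadConfig`: given a checkout whose (hypothetical) `.github/codex/labels/` directory contains `codex-review.md` and `codex-triage.md`, the loaded config exposes one `LabelConfig` per template; if the directory is absent, the defaults from `src/default-label-config.ts` apply:

```ts
// Sketch: loading the label config for the current checkout.
import { loadConfig } from "./load-config";

const workspace = process.env["GITHUB_WORKSPACE"] ?? process.cwd();
const config = loadConfig(workspace);

// With the hypothetical directory above, prints ["codex-review", "codex-triage"].
console.log(Object.keys(config.labels));
```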
80 .github/actions/codex/src/main.ts (vendored, executable file)
@@ -0,0 +1,80 @@
#!/usr/bin/env bun

import type { Config } from "./config";

import { defaultContext, EnvContext } from "./env-context";
import { loadConfig } from "./load-config";
import { setGitHubActionsUser } from "./git-user";
import { onLabeled } from "./process-label";
import { ensureBaseAndHeadCommitsForPRAreAvailable } from "./prompt-template";
import { performAdditionalValidation } from "./verify-inputs";
import { onComment } from "./comment";
import { onReview } from "./review";

async function main(): Promise<void> {
  const ctx: EnvContext = defaultContext;

  // Build the configuration dynamically by scanning `.github/codex/labels`.
  const GITHUB_WORKSPACE = ctx.get("GITHUB_WORKSPACE");
  const config: Config = loadConfig(GITHUB_WORKSPACE);

  // Optionally perform additional validation of prompt template files.
  performAdditionalValidation(config, GITHUB_WORKSPACE);

  const GITHUB_EVENT_NAME = ctx.get("GITHUB_EVENT_NAME");
  const GITHUB_EVENT_ACTION = ctx.get("GITHUB_EVENT_ACTION");

  // Set user.name and user.email to a bot before Codex runs, just in case it
  // creates a commit.
  setGitHubActionsUser();

  switch (GITHUB_EVENT_NAME) {
    case "issues": {
      if (GITHUB_EVENT_ACTION === "labeled") {
        await onLabeled(config, ctx);
        return;
      } else if (GITHUB_EVENT_ACTION === "opened") {
        await onComment(ctx);
        return;
      }
      break;
    }
    case "issue_comment": {
      if (GITHUB_EVENT_ACTION === "created") {
        await onComment(ctx);
        return;
      }
      break;
    }
    case "pull_request": {
      if (GITHUB_EVENT_ACTION === "labeled") {
        await ensureBaseAndHeadCommitsForPRAreAvailable(ctx);
        await onLabeled(config, ctx);
        return;
      }
      break;
    }
    case "pull_request_review": {
      await ensureBaseAndHeadCommitsForPRAreAvailable(ctx);
      if (GITHUB_EVENT_ACTION === "submitted") {
        await onReview(ctx);
        return;
      }
      break;
    }
    case "pull_request_review_comment": {
      await ensureBaseAndHeadCommitsForPRAreAvailable(ctx);
      if (GITHUB_EVENT_ACTION === "created") {
        await onComment(ctx);
        return;
      }
      break;
    }
  }

  console.warn(
    `Unsupported action '${GITHUB_EVENT_ACTION}' for event '${GITHUB_EVENT_NAME}'.`,
  );
}

main();
62 .github/actions/codex/src/post-comment.ts (vendored, normal file)
@@ -0,0 +1,62 @@
import { fail } from "./fail";
import * as github from "@actions/github";
import { EnvContext } from "./env-context";

/**
 * Post a comment to the issue / pull request currently in scope.
 *
 * Provide the environment context so that token lookup (inside getOctokit)
 * does not rely on global state.
 */
export async function postComment(
  commentBody: string,
  ctx: EnvContext,
): Promise<void> {
  // Append a footer with a link back to the workflow run, if available.
  const footer = buildWorkflowRunFooter(ctx);
  const bodyWithFooter = footer ? `${commentBody}${footer}` : commentBody;

  const octokit = ctx.getOctokit();
  console.info("Got Octokit instance for posting comment");
  const { owner, repo } = github.context.repo;
  const issueNumber = github.context.issue.number;

  if (!issueNumber) {
    console.warn(
      "No issue or pull_request number found in GitHub context; skipping comment creation.",
    );
    return;
  }

  try {
    console.info("Calling octokit.rest.issues.createComment()");
    await octokit.rest.issues.createComment({
      owner,
      repo,
      issue_number: issueNumber,
      body: bodyWithFooter,
    });
  } catch (error) {
    fail(`Failed to create comment via GitHub API: ${error}`);
  }
}

/**
 * Helper to build a Markdown fragment linking back to the workflow run that
 * generated the current comment. Returns `undefined` if required environment
 * variables are missing – e.g. when running outside of GitHub Actions – so we
 * can gracefully skip the footer in those cases.
 */
function buildWorkflowRunFooter(ctx: EnvContext): string | undefined {
  const serverUrl =
    ctx.tryGetNonEmpty("GITHUB_SERVER_URL") ?? "https://github.com";
  const repository = ctx.tryGetNonEmpty("GITHUB_REPOSITORY");
  const runId = ctx.tryGetNonEmpty("GITHUB_RUN_ID");

  if (!repository || !runId) {
    return undefined;
  }

  const url = `${serverUrl}/${repository}/actions/runs/${runId}`;
  return `\n\n---\n*[_View workflow run_](${url})*`;
}
226 .github/actions/codex/src/process-label.ts (vendored, normal file)
@@ -0,0 +1,226 @@
import { fail } from "./fail";
import { EnvContext } from "./env-context";
import { renderPromptTemplate } from "./prompt-template";

import { postComment } from "./post-comment";
import { runCodex } from "./run-codex";

import * as github from "@actions/github";
import { Config, LabelConfig } from "./config";
import { maybePublishPRForIssue } from "./git-helpers";

export async function onLabeled(
  config: Config,
  ctx: EnvContext,
): Promise<void> {
  const GITHUB_EVENT_LABEL_NAME = ctx.get("GITHUB_EVENT_LABEL_NAME");
  const labelConfig = config.labels[GITHUB_EVENT_LABEL_NAME] as
    | LabelConfig
    | undefined;
  if (!labelConfig) {
    fail(
      `Label \`${GITHUB_EVENT_LABEL_NAME}\` not found in config: ${JSON.stringify(config)}`,
    );
  }

  await processLabelConfig(ctx, GITHUB_EVENT_LABEL_NAME, labelConfig);
}

/**
 * Wrapper that handles `-in-progress` and `-completed` semantics around the
 * core lint/fix/review processing. It will:
 *
 * - Skip execution if the `-in-progress` or `-completed` label is already present.
 * - Mark the PR/issue as `-in-progress`.
 * - After successful execution, mark the PR/issue as `-completed`.
 */
async function processLabelConfig(
  ctx: EnvContext,
  label: string,
  labelConfig: LabelConfig,
): Promise<void> {
  const octokit = ctx.getOctokit();
  const { owner, repo, issueNumber, labelNames } =
    await getCurrentLabels(octokit);

  const inProgressLabel = `${label}-in-progress`;
  const completedLabel = `${label}-completed`;
  for (const markerLabel of [inProgressLabel, completedLabel]) {
    if (labelNames.includes(markerLabel)) {
      console.log(
        `Label '${markerLabel}' already present on issue/PR #${issueNumber}. Skipping Codex action.`,
      );

      // Clean up: remove the triggering label to avoid confusion and re-runs.
      await addAndRemoveLabels(octokit, {
        owner,
        repo,
        issueNumber,
        remove: markerLabel,
      });

      return;
    }
  }

  // Mark the PR/issue as in progress.
  await addAndRemoveLabels(octokit, {
    owner,
    repo,
    issueNumber,
    add: inProgressLabel,
    remove: label,
  });

  // Run the core Codex processing.
  await processLabel(ctx, label, labelConfig);

  // Mark the PR/issue as completed.
  await addAndRemoveLabels(octokit, {
    owner,
    repo,
    issueNumber,
    add: completedLabel,
    remove: inProgressLabel,
  });
}

async function processLabel(
  ctx: EnvContext,
  label: string,
  labelConfig: LabelConfig,
): Promise<void> {
  const template = labelConfig.getPromptTemplate();

  // If this is a review label, prepend explicit PR-diff scoping guidance to
  // reduce out-of-scope feedback. Do this before rendering so placeholders in
  // the guidance (e.g., {CODEX_ACTION_GITHUB_EVENT_PATH}) are substituted.
  const isReview = label.toLowerCase().includes("review");
  const reviewScopeGuidance = `
PR Diff Scope
- Only review changes between the PR's merge-base and head; do not comment on commits or files outside this range.
- Derive the base/head SHAs from the event JSON at {CODEX_ACTION_GITHUB_EVENT_PATH}, then compute and use the PR diff for all analysis and comments.

Commands to determine scope
- Resolve SHAs:
  - BASE_SHA=$(jq -r '.pull_request.base.sha // .pull_request.base.ref' "{CODEX_ACTION_GITHUB_EVENT_PATH}")
  - HEAD_SHA=$(jq -r '.pull_request.head.sha // .pull_request.head.ref' "{CODEX_ACTION_GITHUB_EVENT_PATH}")
  - BASE_SHA=$(git rev-parse "$BASE_SHA")
  - HEAD_SHA=$(git rev-parse "$HEAD_SHA")
- Prefer triple-dot (merge-base) semantics for PR diffs:
  - Changed commits: git log --oneline "$BASE_SHA...$HEAD_SHA"
  - Changed files: git diff --name-status "$BASE_SHA...$HEAD_SHA"
  - Review hunks: git diff -U0 "$BASE_SHA...$HEAD_SHA"

Review rules
- Anchor every comment to a file and hunk present in git diff "$BASE_SHA...$HEAD_SHA".
- If you mention context outside the diff, label it as "Follow-up (outside this PR scope)" and keep it brief (<=2 bullets).
- Do not critique commits or files not reachable in the PR range (merge-base(base, head) → head).
`.trim();

  const effectiveTemplate = isReview
    ? `${reviewScopeGuidance}\n\n${template}`
    : template;

  const populatedTemplate = await renderPromptTemplate(effectiveTemplate, ctx);

  // Always run Codex and post the resulting message as a comment.
  let commentBody = await runCodex(populatedTemplate, ctx);

  // Current heuristic: only try to create a PR if "attempt" or "fix" is in the
  // label name. (Yes, we plan to evolve this.)
  if (label.indexOf("fix") !== -1 || label.indexOf("attempt") !== -1) {
    console.info(`label ${label} indicates we should attempt to create a PR`);
    const prUrl = await maybeFixIssue(ctx, commentBody);
    if (prUrl) {
      commentBody += `\n\n---\nOpened pull request: ${prUrl}`;
    }
  } else {
    console.info(
      `label ${label} does not indicate we should attempt to create a PR`,
    );
  }

  await postComment(commentBody, ctx);
}

async function maybeFixIssue(
  ctx: EnvContext,
  lastMessage: string,
): Promise<string | undefined> {
  // Attempt to create a PR out of any changes Codex produced.
  const issueNumber = github.context.issue.number!; // exists for issues triggering this path
  try {
    return await maybePublishPRForIssue(issueNumber, lastMessage, ctx);
  } catch (e) {
    console.warn(`Failed to publish PR: ${e}`);
  }
}

async function getCurrentLabels(
  octokit: ReturnType<typeof github.getOctokit>,
): Promise<{
  owner: string;
  repo: string;
  issueNumber: number;
  labelNames: Array<string>;
}> {
  const { owner, repo } = github.context.repo;
  const issueNumber = github.context.issue.number;

  if (!issueNumber) {
    fail("No issue or pull_request number found in GitHub context.");
  }

  const { data: issueData } = await octokit.rest.issues.get({
    owner,
    repo,
    issue_number: issueNumber,
  });

  const labelNames =
    issueData.labels?.map((label: any) =>
      typeof label === "string" ? label : label.name,
    ) ?? [];

  return { owner, repo, issueNumber, labelNames };
}

async function addAndRemoveLabels(
  octokit: ReturnType<typeof github.getOctokit>,
  opts: {
    owner: string;
    repo: string;
    issueNumber: number;
    add?: string;
    remove?: string;
  },
): Promise<void> {
  const { owner, repo, issueNumber, add, remove } = opts;

  if (add) {
    try {
      await octokit.rest.issues.addLabels({
        owner,
        repo,
        issue_number: issueNumber,
        labels: [add],
      });
    } catch (error) {
      console.warn(`Failed to add label '${add}': ${error}`);
    }
  }

  if (remove) {
    try {
      await octokit.rest.issues.removeLabel({
        owner,
        repo,
        issue_number: issueNumber,
        name: remove,
      });
    } catch (error) {
      console.warn(`Failed to remove label '${remove}': ${error}`);
    }
  }
}
284
.github/actions/codex/src/prompt-template.ts
vendored
Normal file
@@ -0,0 +1,284 @@
/*
 * Utilities to render Codex prompt templates.
 *
 * A template is a Markdown (or plain-text) file that may contain one or more
 * placeholders of the form `{CODEX_ACTION_<NAME>}`. At runtime these
 * placeholders are substituted with dynamically generated content. Each
 * placeholder is resolved **exactly once** even if it appears multiple times
 * in the same template.
 */

import { readFile } from "fs/promises";

import { EnvContext } from "./env-context";

// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------

/**
 * Lazily caches parsed `$GITHUB_EVENT_PATH` contents keyed by the file path so
 * we only hit the filesystem once per unique event payload.
 */
const githubEventDataCache: Map<string, Promise<any>> = new Map();

function getGitHubEventData(ctx: EnvContext): Promise<any> {
  const eventPath = ctx.get("GITHUB_EVENT_PATH");
  let cached = githubEventDataCache.get(eventPath);
  if (!cached) {
    cached = readFile(eventPath, "utf8").then((raw) => JSON.parse(raw));
    githubEventDataCache.set(eventPath, cached);
  }
  return cached;
}

async function runCommand(args: Array<string>): Promise<string> {
  const result = Bun.spawnSync(args, {
    stdout: "pipe",
    stderr: "pipe",
  });

  if (result.success) {
    return result.stdout.toString();
  }

  console.error(`Error running ${JSON.stringify(args)}: ${result.stderr}`);
  return "";
}

// ---------------------------------------------------------------------------
// Public API
// ---------------------------------------------------------------------------

// Regex that captures the variable name without the surrounding { } braces.
const VAR_REGEX = /\{(CODEX_ACTION_[A-Z0-9_]+)\}/g;

// Cache individual placeholder values so each one is resolved at most once per
// process even if many templates reference it.
const placeholderCache: Map<string, Promise<string>> = new Map();

/**
 * Parse a template string, resolve all placeholders and return the rendered
 * result.
 */
export async function renderPromptTemplate(
  template: string,
  ctx: EnvContext,
): Promise<string> {
  // ---------------------------------------------------------------------
  // 1) Gather all *unique* placeholders present in the template.
  // ---------------------------------------------------------------------
  const variables = new Set<string>();
  for (const match of template.matchAll(VAR_REGEX)) {
    variables.add(match[1]);
  }

  // ---------------------------------------------------------------------
  // 2) Kick off (or reuse) async resolution for each variable.
  // ---------------------------------------------------------------------
  for (const variable of variables) {
    if (!placeholderCache.has(variable)) {
      placeholderCache.set(variable, resolveVariable(variable, ctx));
    }
  }

  // ---------------------------------------------------------------------
  // 3) Await completion so we can perform a simple synchronous replace below.
  // ---------------------------------------------------------------------
  const resolvedEntries: [string, string][] = [];
  for (const [key, promise] of placeholderCache.entries()) {
    resolvedEntries.push([key, await promise]);
  }
  const resolvedMap = new Map<string, string>(resolvedEntries);

  // ---------------------------------------------------------------------
  // 4) Replace each occurrence. We use replace with a callback to ensure
  //    correct substitution even if variable names overlap (they shouldn't,
  //    but better safe than sorry).
  // ---------------------------------------------------------------------
  return template.replace(VAR_REGEX, (_, varName: string) => {
    return resolvedMap.get(varName) ?? "";
  });
}

export async function ensureBaseAndHeadCommitsForPRAreAvailable(
  ctx: EnvContext,
): Promise<{ baseSha: string; headSha: string } | null> {
  const prShas = await getPrShas(ctx);
  if (prShas == null) {
    console.warn("Unable to resolve PR branches");
    return null;
  }

  const event = await getGitHubEventData(ctx);
  const pr = event.pull_request;
  if (!pr) {
    console.warn("event.pull_request is not defined - unexpected");
    return null;
  }

  const workspace = ctx.get("GITHUB_WORKSPACE");

  // Refs (branch names)
  const baseRef: string | undefined = pr.base?.ref;
  const headRef: string | undefined = pr.head?.ref;

  // Clone URLs
  const baseRemoteUrl: string | undefined = pr.base?.repo?.clone_url;
  const headRemoteUrl: string | undefined = pr.head?.repo?.clone_url;

  if (!baseRef || !headRef || !baseRemoteUrl || !headRemoteUrl) {
    console.warn(
      "Missing PR ref or remote URL information - cannot fetch commits",
    );
    return null;
  }

  // Ensure we have the base branch.
  await runCommand([
    "git",
    "-C",
    workspace,
    "fetch",
    "--no-tags",
    "origin",
    baseRef,
  ]);

  // Ensure we have the head branch.
  if (headRemoteUrl === baseRemoteUrl) {
    // Same repository – the commit is available from `origin`.
    await runCommand([
      "git",
      "-C",
      workspace,
      "fetch",
      "--no-tags",
      "origin",
      headRef,
    ]);
  } else {
    // Fork – make sure a `pr` remote exists that points at the fork. Attempting
    // to add a remote that already exists causes git to error, so we swallow
    // any non-zero exit codes from that specific command.
    await runCommand([
      "git",
      "-C",
      workspace,
      "remote",
      "add",
      "pr",
      headRemoteUrl,
    ]);

    // Whether adding succeeded or the remote already existed, attempt to fetch
    // the head ref from the `pr` remote.
    await runCommand([
      "git",
      "-C",
      workspace,
      "fetch",
      "--no-tags",
      "pr",
      headRef,
    ]);
  }

  return prShas;
}

// ---------------------------------------------------------------------------
// Internal helpers – still exported for use by other modules.
// ---------------------------------------------------------------------------

export async function resolvePrDiff(ctx: EnvContext): Promise<string> {
  const prShas = await ensureBaseAndHeadCommitsForPRAreAvailable(ctx);
  if (prShas == null) {
    console.warn("Unable to resolve PR branches");
    return "";
  }

  const workspace = ctx.get("GITHUB_WORKSPACE");
  const { baseSha, headSha } = prShas;
  return runCommand([
    "git",
    "-C",
    workspace,
    "diff",
    "--color=never",
    `${baseSha}..${headSha}`,
  ]);
}

// ---------------------------------------------------------------------------
// Placeholder resolution
// ---------------------------------------------------------------------------

async function resolveVariable(name: string, ctx: EnvContext): Promise<string> {
  switch (name) {
    case "CODEX_ACTION_ISSUE_TITLE": {
      const event = await getGitHubEventData(ctx);
      const issue = event.issue ?? event.pull_request;
      return issue?.title ?? "";
    }

    case "CODEX_ACTION_ISSUE_BODY": {
      const event = await getGitHubEventData(ctx);
      const issue = event.issue ?? event.pull_request;
      return issue?.body ?? "";
    }

    case "CODEX_ACTION_GITHUB_EVENT_PATH": {
      return ctx.get("GITHUB_EVENT_PATH");
    }

    case "CODEX_ACTION_BASE_REF": {
      const event = await getGitHubEventData(ctx);
      return event?.pull_request?.base?.ref ?? "";
    }

    case "CODEX_ACTION_HEAD_REF": {
      const event = await getGitHubEventData(ctx);
      return event?.pull_request?.head?.ref ?? "";
    }

    case "CODEX_ACTION_PR_DIFF": {
      return resolvePrDiff(ctx);
    }

    // -------------------------------------------------------------------
    // Add new template variables here.
    // -------------------------------------------------------------------

    default: {
      // Unknown variable – leave it blank to avoid leaking placeholders to the
      // final prompt. The alternative would be to `fail()` here, but silently
      // ignoring unknown placeholders is more forgiving and better matches the
      // behaviour of typical template engines.
      console.warn(`Unknown template variable: ${name}`);
      return "";
    }
  }
}

async function getPrShas(
  ctx: EnvContext,
): Promise<{ baseSha: string; headSha: string } | null> {
  const event = await getGitHubEventData(ctx);
  const pr = event.pull_request;
  if (!pr) {
    console.warn("event.pull_request is not defined");
    return null;
  }

  // Prefer explicit SHAs if available to avoid relying on local branch names.
  const baseSha: string | undefined = pr.base?.sha;
  const headSha: string | undefined = pr.head?.sha;

  if (!baseSha || !headSha) {
    console.warn("one of base or head is not defined on event.pull_request");
    return null;
  }

  return { baseSha, headSha };
}
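As a usage sketch (not part of the diff): given the module above, rendering a label's prompt template might look like the following. The template text and the pre-built `ctx` are hypothetical; `renderPromptTemplate` and the `CODEX_ACTION_*` placeholder names are the ones defined in prompt-template.ts.

```ts
import { renderPromptTemplate } from "./prompt-template";
import type { EnvContext } from "./env-context";

// Hypothetical template, e.g. the contents of a label's .md prompt file.
const template = `
Review the following pull request.

Title: {CODEX_ACTION_ISSUE_TITLE}

{CODEX_ACTION_ISSUE_BODY}

Diff:
{CODEX_ACTION_PR_DIFF}
`.trim();

// Assume `ctx` was built from process.env by the action entrypoint.
declare const ctx: EnvContext;

// Each unique {CODEX_ACTION_*} placeholder is resolved once (and cached),
// then substituted everywhere it appears in the template.
const prompt = await renderPromptTemplate(template, ctx);
console.log(prompt);
```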
42
.github/actions/codex/src/review.ts
vendored
Normal file
@@ -0,0 +1,42 @@
import type { EnvContext } from "./env-context";
import { runCodex } from "./run-codex";
import { postComment } from "./post-comment";
import { addEyesReaction } from "./add-reaction";

/**
 * Handle `pull_request_review` events. We treat the review body the same way
 * as a normal comment.
 */
export async function onReview(ctx: EnvContext): Promise<void> {
  const triggerPhrase = ctx.tryGet("INPUT_TRIGGER_PHRASE");
  if (!triggerPhrase) {
    console.warn("Empty trigger phrase: skipping.");
    return;
  }

  const reviewBody = ctx.tryGet("GITHUB_EVENT_REVIEW_BODY");

  if (!reviewBody) {
    console.warn("Review body not found in environment: skipping.");
    return;
  }

  if (!reviewBody.includes(triggerPhrase)) {
    console.log(
      `Trigger phrase '${triggerPhrase}' not found: nothing to do for this review.`,
    );
    return;
  }

  const prompt = reviewBody.replace(triggerPhrase, "").trim();

  if (prompt.length === 0) {
    console.warn("Prompt is empty after removing trigger phrase: skipping.");
    return;
  }

  await addEyesReaction(ctx);

  const lastMessage = await runCodex(prompt, ctx);
  await postComment(lastMessage, ctx);
}
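To make the trigger-phrase handling concrete (the phrase and body below are made-up values): `String.replace` with a string argument removes only the first occurrence of the phrase, and the remainder is trimmed before being handed to Codex.

```ts
// Hypothetical values for illustration.
const triggerPhrase = "@codex";
const reviewBody = "@codex please double-check the error handling here";

// Mirrors the logic in onReview above.
const prompt = reviewBody.replace(triggerPhrase, "").trim();
// prompt === "please double-check the error handling here"
```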
58
.github/actions/codex/src/run-codex.ts
vendored
Normal file
@@ -0,0 +1,58 @@
import { fail } from "./fail";
import { EnvContext } from "./env-context";
import { tmpdir } from "os";
import { join } from "node:path";
import { readFile, mkdtemp } from "fs/promises";
import { resolveWorkspacePath } from "./github-workspace";

/**
 * Runs the Codex CLI with the provided prompt and returns the output written
 * to the "last message" file.
 */
export async function runCodex(
  prompt: string,
  ctx: EnvContext,
): Promise<string> {
  const OPENAI_API_KEY = ctx.get("OPENAI_API_KEY");

  const tempDirPath = await mkdtemp(join(tmpdir(), "codex-"));
  const lastMessageOutput = join(tempDirPath, "codex-prompt.md");

  // Use the unified CLI and its `exec` subcommand instead of the old
  // standalone `codex-exec` binary.
  const args = ["/usr/local/bin/codex", "exec"];

  const inputCodexArgs = ctx.tryGet("INPUT_CODEX_ARGS")?.trim();
  if (inputCodexArgs) {
    args.push(...inputCodexArgs.split(/\s+/));
  }

  args.push("--output-last-message", lastMessageOutput, prompt);

  const env: Record<string, string> = { ...process.env, OPENAI_API_KEY };
  const INPUT_CODEX_HOME = ctx.tryGet("INPUT_CODEX_HOME");
  if (INPUT_CODEX_HOME) {
    env.CODEX_HOME = resolveWorkspacePath(INPUT_CODEX_HOME, ctx);
  }

  console.log(`Running Codex: ${JSON.stringify(args)}`);
  const result = Bun.spawnSync(args, {
    stdout: "inherit",
    stderr: "inherit",
    env,
  });

  if (!result.success) {
    fail(`Codex failed: see above for details.`);
  }

  // Read the output generated by Codex.
  let lastMessage: string;
  try {
    lastMessage = await readFile(lastMessageOutput, "utf8");
  } catch (err) {
    fail(`Failed to read Codex output at '${lastMessageOutput}': ${err}`);
  }

  return lastMessage;
}
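For reference, a sketch of the argv that `runCodex` ends up spawning, given a hypothetical `INPUT_CODEX_ARGS` value (the flag values and output path below are illustrative only). Note the naive whitespace split means quoted arguments containing spaces are not supported.

```ts
// Hypothetical action input, e.g. codex_args: "--full-auto -c model=o3"
const inputCodexArgs = "--full-auto -c model=o3";

const args = ["/usr/local/bin/codex", "exec"];
args.push(...inputCodexArgs.split(/\s+/));
args.push("--output-last-message", "/tmp/codex-XXXX/codex-prompt.md", "<prompt>");
// => ["/usr/local/bin/codex", "exec", "--full-auto", "-c", "model=o3",
//     "--output-last-message", "/tmp/codex-XXXX/codex-prompt.md", "<prompt>"]
```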
33
.github/actions/codex/src/verify-inputs.ts
vendored
Normal file
@@ -0,0 +1,33 @@
// Validate the inputs passed to the composite action.
// The script currently ensures that the provided configuration file exists and
// matches the expected schema.

import type { Config } from "./config";

import { existsSync } from "fs";
import * as path from "path";
import { fail } from "./fail";

export function performAdditionalValidation(config: Config, workspace: string) {
  // Additional validation: ensure referenced prompt files exist and are Markdown.
  for (const [label, details] of Object.entries(config.labels)) {
    // Determine which prompt key is present (the schema guarantees exactly one).
    const promptPathStr =
      (details as any).prompt ?? (details as any).promptPath;

    if (promptPathStr) {
      const promptPath = path.isAbsolute(promptPathStr)
        ? promptPathStr
        : path.join(workspace, promptPathStr);

      if (!existsSync(promptPath)) {
        fail(`Prompt file for label '${label}' not found: ${promptPath}`);
      }
      if (!promptPath.endsWith(".md")) {
        fail(
          `Prompt file for label '${label}' must be a .md file (got ${promptPathStr}).`,
        );
      }
    }
  }
}
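To illustrate what `performAdditionalValidation` checks, here is a hypothetical parsed config value (the exact `Config` shape lives in ./config.ts and is only assumed here, hence the cast): the label's prompt file must exist under the workspace and end in `.md`.

```ts
import { performAdditionalValidation } from "./verify-inputs";

// Hypothetical config, as if parsed from the action's config file.
const config = {
  labels: {
    "codex-review": { promptPath: ".github/codex/labels/codex-review.md" },
  },
} as any; // cast because the real `Config` type is defined in ./config.ts

// Fails the action if the prompt file is missing or is not a .md file.
performAdditionalValidation(config, process.env.GITHUB_WORKSPACE ?? ".");
```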
15
.github/actions/codex/tsconfig.json
vendored
Normal file
@@ -0,0 +1,15 @@
{
  "compilerOptions": {
    "lib": ["ESNext"],
    "target": "ESNext",
    "module": "ESNext",
    "moduleDetection": "force",
    "moduleResolution": "bundler",

    "noEmit": true,
    "strict": true,
    "skipLibCheck": true
  },

  "include": ["src"]
}
BIN
.github/codex-cli-login.png
vendored
Binary file not shown.
Before Width: | Height: | Size: 2.9 MiB After Width: | Height: | Size: 410 KiB
BIN
.github/codex-cli-splash.png
vendored
Binary file not shown.
Before Width: | Height: | Size: 3.1 MiB After Width: | Height: | Size: 412 KiB
4
.github/dotslash-config.json
vendored
@@ -21,10 +21,6 @@
    "windows-x86_64": {
      "regex": "^codex-x86_64-pc-windows-msvc\\.exe\\.zst$",
      "path": "codex.exe"
    },
    "windows-aarch64": {
      "regex": "^codex-aarch64-pc-windows-msvc\\.exe\\.zst$",
      "path": "codex.exe"
    }
  }
}
4
.github/pull_request_template.md
vendored
@@ -1,6 +1,6 @@
# External (non-OpenAI) Pull Request Requirements

Before opening this Pull Request, please read the dedicated "Contributing" markdown file or your PR may be closed:
https://github.com/openai/codex/blob/main/docs/contributing.md
Before opening this Pull Request, please read the "Contributing" section of the README or your PR may be closed:
https://github.com/openai/codex#contributing

If your PR conforms to our contribution guidelines, replace this text with a detailed and high quality description of your changes.
49
.github/workflows/ci.yml
vendored
@@ -14,39 +14,40 @@ jobs:
      - name: Checkout repository
        uses: actions/checkout@v5

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          run_install: false

      - name: Setup Node.js
        uses: actions/setup-node@v5
        uses: actions/setup-node@v4
        with:
          node-version: 22

      - name: Setup pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 10.8.1
          run_install: false

      - name: Get pnpm store directory
        id: pnpm-cache
        shell: bash
        run: |
          echo "store_path=$(pnpm store path --silent)" >> $GITHUB_OUTPUT

      - name: Setup pnpm cache
        uses: actions/cache@v4
        with:
          path: ${{ steps.pnpm-cache.outputs.store_path }}
          key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
          restore-keys: |
            ${{ runner.os }}-pnpm-store-

      - name: Install dependencies
        run: pnpm install --frozen-lockfile
        run: pnpm install

      # build_npm_package.py requires DotSlash when staging releases.
      - uses: facebook/install-dotslash@v2
      # Run all tasks using workspace filters

      - name: Stage npm package
      - name: Ensure staging a release works.
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          set -euo pipefail
          CODEX_VERSION=0.40.0
          PACK_OUTPUT="${RUNNER_TEMP}/codex-npm.tgz"
          python3 ./codex-cli/scripts/build_npm_package.py \
            --release-version "$CODEX_VERSION" \
            --pack-output "$PACK_OUTPUT"
          echo "PACK_OUTPUT=$PACK_OUTPUT" >> "$GITHUB_ENV"

      - name: Upload staged npm package artifact
        uses: actions/upload-artifact@v4
        with:
          name: codex-npm-staging
          path: ${{ env.PACK_OUTPUT }}
        run: ./codex-cli/scripts/stage_release.sh

      - name: Ensure root README.md contains only ASCII and certain Unicode code points
        run: ./scripts/asciicheck.py README.md
3
.github/workflows/codespell.yml
vendored
@@ -22,7 +22,6 @@ jobs:
      - name: Annotate locations with typos
        uses: codespell-project/codespell-problem-matcher@b80729f885d32f78a716c2f107b4db1025001c42 # v1
      - name: Codespell
        uses: codespell-project/actions-codespell@406322ec52dd7b488e48c1c4b82e2a8b3a1bf630 # v2.1
        uses: codespell-project/actions-codespell@406322ec52dd7b488e48c1c4b82e2a8b3a1bf630 # v2
        with:
          ignore_words_file: .codespellignore
          skip: frame*.txt
64
.github/workflows/codex.yml
vendored
Normal file
@@ -0,0 +1,64 @@
name: Codex

on:
  issues:
    types: [opened, labeled]
  pull_request:
    branches: [main]
    types: [labeled]

jobs:
  codex:
    # This `if` check provides complex filtering logic to avoid running Codex
    # on every PR. Admittedly, one thing this does not verify is whether the
    # sender has write access to the repo: that must be done as part of a
    # runtime step.
    #
    # Note the label values should match the ones in the .github/codex/labels
    # folder.
    if: |
      (github.event_name == 'issues' && (
        (github.event.action == 'labeled' && (github.event.label.name == 'codex-attempt' || github.event.label.name == 'codex-triage'))
      )) ||
      (github.event_name == 'pull_request' && github.event.action == 'labeled' && (github.event.label.name == 'codex-review' || github.event.label.name == 'codex-rust-review'))
    runs-on: ubuntu-latest
    permissions:
      contents: write # can push or create branches
      issues: write # for comments + labels on issues/PRs
      pull-requests: write # for PR comments/labels
    steps:
      # TODO: Consider adding an optional mode (--dry-run?) to actions/codex
      # that verifies whether Codex should actually be run for this event.
      # (For example, it may be rejected because the sender does not have
      # write access to the repo.) The benefit would be two-fold:
      # 1. As the first step of this job, it gives us a chance to add a reaction
      #    or comment to the PR/issue ASAP to "ack" the request.
      # 2. It saves resources by skipping the clone and setup steps below if
      #    Codex is not going to run.

      - name: Checkout repository
        uses: actions/checkout@v5

      - uses: dtolnay/rust-toolchain@1.89
        with:
          targets: x86_64-unknown-linux-gnu
          components: clippy

      - uses: actions/cache@v4
        with:
          path: |
            ~/.cargo/bin/
            ~/.cargo/registry/index/
            ~/.cargo/registry/cache/
            ~/.cargo/git/db/
            ${{ github.workspace }}/codex-rs/target/
          key: cargo-ubuntu-24.04-x86_64-unknown-linux-gnu-dev-${{ hashFiles('**/Cargo.lock') }}

      # Note it is possible that the `verify` step internal to Run Codex will
      # fail, in which case the work to setup the repo was worthless :(
      - name: Run Codex
        uses: ./.github/actions/codex
        with:
          openai_api_key: ${{ secrets.CODEX_OPENAI_API_KEY }}
          github_token: ${{ secrets.GITHUB_TOKEN }}
          codex_home: ./.github/codex/home
52
.github/workflows/rust-ci.yml
vendored
@@ -57,31 +57,11 @@ jobs:
        working-directory: codex-rs
    steps:
      - uses: actions/checkout@v5
      - uses: dtolnay/rust-toolchain@1.90
      - uses: dtolnay/rust-toolchain@1.89
        with:
          components: rustfmt
      - name: cargo fmt
        run: cargo fmt -- --config imports_granularity=Item --check
      - name: Verify codegen for mcp-types
        run: ./mcp-types/check_lib_rs.py

  cargo_shear:
    name: cargo shear
    runs-on: ubuntu-24.04
    needs: changed
    if: ${{ needs.changed.outputs.codex == 'true' || needs.changed.outputs.workflows == 'true' || github.event_name == 'push' }}
    defaults:
      run:
        working-directory: codex-rs
    steps:
      - uses: actions/checkout@v5
      - uses: dtolnay/rust-toolchain@1.90
      - uses: taiki-e/install-action@0c5db7f7f897c03b771660e91d065338615679f4 # v2
        with:
          tool: cargo-shear
          version: 1.5.1
      - name: cargo shear
        run: cargo shear

  # --- CI to validate on different os/targets --------------------------------
  lint_build_test:
@@ -120,30 +100,19 @@ jobs:
          - runner: windows-latest
            target: x86_64-pc-windows-msvc
            profile: dev
          - runner: windows-11-arm
            target: aarch64-pc-windows-msvc
            profile: dev

          # Also run representative release builds on Mac and Linux because
          # there could be release-only build errors we want to catch.
          # Hopefully this also pre-populates the build cache to speed up
          # releases.
          - runner: macos-14
            target: aarch64-apple-darwin
            profile: release
          - runner: ubuntu-24.04
            target: x86_64-unknown-linux-musl
            profile: release
          - runner: windows-latest
            target: x86_64-pc-windows-msvc
            profile: release
          - runner: windows-11-arm
            target: aarch64-pc-windows-msvc
            profile: release

    steps:
      - uses: actions/checkout@v5
      - uses: dtolnay/rust-toolchain@1.90
      - uses: dtolnay/rust-toolchain@1.89
        with:
          targets: ${{ matrix.target }}
          components: clippy
@@ -165,7 +134,7 @@ jobs:

      - name: cargo clippy
        id: clippy
        run: cargo clippy --target ${{ matrix.target }} --all-features --tests --profile ${{ matrix.profile }} -- -D warnings
        run: cargo clippy --target ${{ matrix.target }} --all-features --tests -- -D warnings

      # Running `cargo build` from the workspace root builds the workspace using
      # the union of all features from third-party crates. This can mask errors
@@ -180,17 +149,12 @@ jobs:
          find . -name Cargo.toml -mindepth 2 -maxdepth 2 -print0 \
            | xargs -0 -n1 -I{} bash -c 'cd "$(dirname "{}")" && cargo check --profile ${{ matrix.profile }}'

      - uses: taiki-e/install-action@0c5db7f7f897c03b771660e91d065338615679f4 # v2
        with:
          tool: nextest
          version: 0.9.103

      - name: tests
      - name: cargo test
        id: test
        # Tests take too long for release builds to run them on every PR.
        # `cargo test` takes too long for release builds to run them on every PR
        if: ${{ matrix.profile != 'release' }}
        continue-on-error: true
        run: cargo nextest run --all-features --no-fail-fast --target ${{ matrix.target }}
        run: cargo test --all-features --target ${{ matrix.target }} --profile ${{ matrix.profile }}
        env:
          RUST_BACKTRACE: 1

@@ -207,7 +171,7 @@ jobs:
  # --- Gatherer job that you mark as the ONLY required status -----------------
  results:
    name: CI results (required)
    needs: [changed, general, cargo_shear, lint_build_test]
    needs: [changed, general, lint_build_test]
    if: always()
    runs-on: ubuntu-24.04
    steps:
@@ -215,7 +179,6 @@ jobs:
        shell: bash
        run: |
          echo "general: ${{ needs.general.result }}"
          echo "shear : ${{ needs.cargo_shear.result }}"
          echo "matrix : ${{ needs.lint_build_test.result }}"

          # If nothing relevant changed (PR touching only root README, etc.),
@@ -227,5 +190,4 @@ jobs:

          # Otherwise require the jobs to have succeeded
          [[ '${{ needs.general.result }}' == 'success' ]] || { echo 'general failed'; exit 1; }
          [[ '${{ needs.cargo_shear.result }}' == 'success' ]] || { echo 'cargo_shear failed'; exit 1; }
          [[ '${{ needs.lint_build_test.result }}' == 'success' ]] || { echo 'matrix failed'; exit 1; }
122
.github/workflows/rust-release.yml
vendored
@@ -72,12 +72,10 @@ jobs:
            target: aarch64-unknown-linux-gnu
          - runner: windows-latest
            target: x86_64-pc-windows-msvc
          - runner: windows-11-arm
            target: aarch64-pc-windows-msvc

    steps:
      - uses: actions/checkout@v5
      - uses: dtolnay/rust-toolchain@1.90
      - uses: dtolnay/rust-toolchain@1.89
        with:
          targets: ${{ matrix.target }}

@@ -89,7 +87,7 @@ jobs:
            ~/.cargo/registry/cache/
            ~/.cargo/git/db/
            ${{ github.workspace }}/codex-rs/target/
          key: cargo-${{ matrix.runner }}-${{ matrix.target }}-release-${{ hashFiles('**/Cargo.lock') }}
          key: cargo-release-${{ matrix.runner }}-${{ matrix.target }}-release-${{ hashFiles('**/Cargo.lock') }}

      - if: ${{ matrix.target == 'x86_64-unknown-linux-musl' || matrix.target == 'aarch64-unknown-linux-musl'}}
        name: Install musl build tools
@@ -111,11 +109,6 @@ jobs:
            cp target/${{ matrix.target }}/release/codex "$dest/codex-${{ matrix.target }}"
          fi

      - if: ${{ matrix.runner == 'windows-11-arm' }}
        name: Install zstd
        shell: powershell
        run: choco install -y zstandard

      - name: Compress artifacts
        shell: bash
        run: |
@@ -167,14 +160,6 @@ jobs:
    needs: build
    name: release
    runs-on: ubuntu-latest
    permissions:
      contents: write
      actions: read
    outputs:
      version: ${{ steps.release_name.outputs.name }}
      tag: ${{ github.ref_name }}
      should_publish_npm: ${{ steps.npm_publish_settings.outputs.should_publish }}
      npm_tag: ${{ steps.npm_publish_settings.outputs.npm_tag }}

    steps:
      - name: Checkout repository
@@ -195,37 +180,21 @@ jobs:
          version="${GITHUB_REF_NAME#rust-v}"
          echo "name=${version}" >> $GITHUB_OUTPUT

      - name: Determine npm publish settings
        id: npm_publish_settings
        env:
          VERSION: ${{ steps.release_name.outputs.name }}
        run: |
          set -euo pipefail
          version="${VERSION}"

          if [[ "${version}" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
            echo "should_publish=true" >> "$GITHUB_OUTPUT"
            echo "npm_tag=" >> "$GITHUB_OUTPUT"
          elif [[ "${version}" =~ ^[0-9]+\.[0-9]+\.[0-9]+-alpha\.[0-9]+$ ]]; then
            echo "should_publish=true" >> "$GITHUB_OUTPUT"
            echo "npm_tag=alpha" >> "$GITHUB_OUTPUT"
          else
            echo "should_publish=false" >> "$GITHUB_OUTPUT"
            echo "npm_tag=" >> "$GITHUB_OUTPUT"
          fi

      # build_npm_package.py requires DotSlash when staging releases.
      - uses: facebook/install-dotslash@v2
      - name: Stage npm package
        env:
          GH_TOKEN: ${{ github.token }}
        run: |
          set -euo pipefail
          TMP_DIR="${RUNNER_TEMP}/npm-stage"
          ./codex-cli/scripts/build_npm_package.py \
          python3 codex-cli/scripts/stage_rust_release.py \
            --release-version "${{ steps.release_name.outputs.name }}" \
            --staging-dir "${TMP_DIR}" \
            --pack-output "${GITHUB_WORKSPACE}/dist/npm/codex-npm-${{ steps.release_name.outputs.name }}.tgz"
            --tmp "${TMP_DIR}"
          mkdir -p dist/npm
          # Produce an npm-ready tarball using `npm pack` and store it in dist/npm.
          # We then rename it to a stable name used by our publishing script.
          (cd "$TMP_DIR" && npm pack --pack-destination "${GITHUB_WORKSPACE}/dist/npm")
          mv "${GITHUB_WORKSPACE}"/dist/npm/*.tgz \
            "${GITHUB_WORKSPACE}/dist/npm/codex-npm-${{ steps.release_name.outputs.name }}.tgz"

      - name: Create GitHub Release
        uses: softprops/action-gh-release@v2
@@ -243,74 +212,3 @@ jobs:
        with:
          tag: ${{ github.ref_name }}
          config: .github/dotslash-config.json

  # Publish to npm using OIDC authentication.
  # July 31, 2025: https://github.blog/changelog/2025-07-31-npm-trusted-publishing-with-oidc-is-generally-available/
  # npm docs: https://docs.npmjs.com/trusted-publishers
  publish-npm:
    # Publish to npm for stable releases and alpha pre-releases with numeric suffixes.
    if: ${{ needs.release.outputs.should_publish_npm == 'true' }}
    name: publish-npm
    needs: release
    runs-on: ubuntu-latest
    permissions:
      id-token: write # Required for OIDC
      contents: read

    steps:
      - name: Setup Node.js
        uses: actions/setup-node@v5
        with:
          node-version: 22
          registry-url: "https://registry.npmjs.org"
          scope: "@openai"

      # Trusted publishing requires npm CLI version 11.5.1 or later.
      - name: Update npm
        run: npm install -g npm@latest

      - name: Download npm tarball from release
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          set -euo pipefail
          version="${{ needs.release.outputs.version }}"
          tag="${{ needs.release.outputs.tag }}"
          mkdir -p dist/npm
          gh release download "$tag" \
            --repo "${GITHUB_REPOSITORY}" \
            --pattern "codex-npm-${version}.tgz" \
            --dir dist/npm

      # No NODE_AUTH_TOKEN needed because we use OIDC.
      - name: Publish to npm
        env:
          VERSION: ${{ needs.release.outputs.version }}
          NPM_TAG: ${{ needs.release.outputs.npm_tag }}
        run: |
          set -euo pipefail
          tag_args=()
          if [[ -n "${NPM_TAG}" ]]; then
            tag_args+=(--tag "${NPM_TAG}")
          fi

          npm publish "${GITHUB_WORKSPACE}/dist/npm/codex-npm-${VERSION}.tgz" "${tag_args[@]}"

  update-branch:
    name: Update latest-alpha-cli branch
    permissions:
      contents: write
    needs: release
    runs-on: ubuntu-latest

    steps:
      - name: Update latest-alpha-cli branch
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          set -euo pipefail
          gh api \
            repos/${GITHUB_REPOSITORY}/git/refs/heads/latest-alpha-cli \
            -X PATCH \
            -f sha="${GITHUB_SHA}" \
            -F force=true
6
.vscode/extensions.json
vendored
@@ -1,11 +1,5 @@
{
  "recommendations": [
    "rust-lang.rust-analyzer",
    "tamasfe.even-better-toml",
    "vadimcn.vscode-lldb",

    // Useful if touching files in .github/workflows, though most
    // contributors will not be doing that?
    // "github.vscode-github-actions",
  ]
}
29
AGENTS.md
@@ -4,15 +4,13 @@ In the codex-rs folder where the rust code lives:
- Crate names are prefixed with `codex-`. For example, the `core` folder's crate is named `codex-core`
- When using format! and you can inline variables into {}, always do that.
- Install any commands the repo relies on (for example `just`, `rg`, or `cargo-insta`) if they aren't already available before running instructions here.
- Never add or modify any code related to `CODEX_SANDBOX_NETWORK_DISABLED_ENV_VAR` or `CODEX_SANDBOX_ENV_VAR`.
- You operate in a sandbox where `CODEX_SANDBOX_NETWORK_DISABLED=1` will be set whenever you use the `shell` tool. Any existing code that uses `CODEX_SANDBOX_NETWORK_DISABLED_ENV_VAR` was authored with this fact in mind. It is often used to early exit out of tests that the author knew you would not be able to run given your sandbox limitations.
- Similarly, when you spawn a process using Seatbelt (`/usr/bin/sandbox-exec`), `CODEX_SANDBOX=seatbelt` will be set on the child process. Integration tests that want to run Seatbelt themselves cannot be run under Seatbelt, so checks for `CODEX_SANDBOX=seatbelt` are also often used to early exit out of tests, as appropriate.

Run `just fmt` (in `codex-rs` directory) automatically after making Rust code changes; do not ask for approval to run it. Before finalizing a change to `codex-rs`, run `just fix -p <project>` (in `codex-rs` directory) to fix any linter issues in the code. Prefer scoping with `-p` to avoid slow workspace‑wide Clippy builds; only run `just fix` without `-p` if you changed shared crates. Additionally, run the tests:
Before finalizing a change to `codex-rs`, run `just fmt` (in `codex-rs` directory) to format the code and `just fix -p <project>` (in `codex-rs` directory) to fix any linter issues in the code. Additionally, run the tests:
1. Run the test for the specific project that was changed. For example, if changes were made in `codex-rs/tui`, run `cargo test -p codex-tui`.
2. Once those pass, if any changes were made in common, core, or protocol, run the complete test suite with `cargo test --all-features`.
When running interactively, ask the user before running `just fix` to finalize. `just fmt` does not require approval. project-specific or individual tests can be run without asking the user, but do ask the user before running the complete test suite.

## TUI style conventions

@@ -27,26 +25,7 @@ See `codex-rs/tui/styles.md`.
- Example: patch summary file lines
  - Desired: vec![" └ ".into(), "M".red(), " ".dim(), "tui/src/app.rs".dim()]

### TUI Styling (ratatui)
- Prefer Stylize helpers: use "text".dim(), .bold(), .cyan(), .italic(), .underlined() instead of manual Style where possible.
- Prefer simple conversions: use "text".into() for spans and vec![…].into() for lines; when inference is ambiguous (e.g., Paragraph::new/Cell::from), use Line::from(spans) or Span::from(text).
- Computed styles: if the Style is computed at runtime, using `Span::styled` is OK (`Span::from(text).set_style(style)` is also acceptable).
- Avoid hardcoded white: do not use `.white()`; prefer the default foreground (no color).
- Chaining: combine helpers by chaining for readability (e.g., url.cyan().underlined()).
- Single items: prefer "text".into(); use Line::from(text) or Span::from(text) only when the target type isn’t obvious from context, or when using .into() would require extra type annotations.
- Building lines: use vec![…].into() to construct a Line when the target type is obvious and no extra type annotations are needed; otherwise use Line::from(vec![…]).
- Avoid churn: don’t refactor between equivalent forms (Span::styled ↔ set_style, Line::from ↔ .into()) without a clear readability or functional gain; follow file‑local conventions and do not introduce type annotations solely to satisfy .into().
- Compactness: prefer the form that stays on one line after rustfmt; if only one of Line::from(vec![…]) or vec![…].into() avoids wrapping, choose that. If both wrap, pick the one with fewer wrapped lines.

### Text wrapping
- Always use textwrap::wrap to wrap plain strings.
- If you have a ratatui Line and you want to wrap it, use the helpers in tui/src/wrapping.rs, e.g. word_wrap_lines / word_wrap_line.
- If you need to indent wrapped lines, use the initial_indent / subsequent_indent options from RtOptions if you can, rather than writing custom logic.
- If you have a list of lines and you need to prefix them all with some prefix (optionally different on the first vs subsequent lines), use the `prefix_lines` helper from line_utils.

## Tests

### Snapshot tests
## Snapshot tests

This repo uses snapshot tests (via `insta`), especially in `codex-rs/tui`, to validate rendered output. When UI or text output changes intentionally, update the snapshots as follows:

@@ -61,7 +40,3 @@ This repo uses snapshot tests (via `insta`), especially in `codex-rs/tui`, to va

If you don’t have the tool:
- `cargo install cargo-insta`

### Test assertions

- Tests should use pretty_assertions::assert_eq for clearer diffs. Import this at the top of the test module if it isn't already.
212
CHANGELOG.md
@@ -1 +1,211 @@
The changelog can be found on the [releases page](https://github.com/openai/codex/releases)
# Changelog

You can install any of these versions: `npm install -g codex@version`

## `0.1.2505172129`

### 🪲 Bug Fixes

- Add node version check (#1007)
- Persist token after refresh (#1006)

## `0.1.2505171619`

- `codex --login` + `codex --free` (#998)

## `0.1.2505161800`

- Sign in with chatgpt credits (#974)
- Add support for OpenAI tool type, local_shell (#961)

## `0.1.2505161243`

- Sign in with chatgpt (#963)
- Session history viewer (#912)
- Apply patch issue when using different cwd (#942)
- Diff command for filenames with special characters (#954)

## `0.1.2505160811`

- `codex-mini-latest` (#951)

## `0.1.2505140839`

### 🪲 Bug Fixes

- Gpt-4.1 apply_patch handling (#930)
- Add support for fileOpener in config.json (#911)
- Patch in #366 and #367 for marked-terminal (#916)
- Remember to set lastIndex = 0 on shared RegExp (#918)
- Always load version from package.json at runtime (#909)
- Tweak the label for citations for better rendering (#919)
- Tighten up some logic around session timestamps and ids (#922)
- Change EventMsg enum so every variant takes a single struct (#925)
- Reasoning default to medium, show workdir when supplied (#931)
- Test_dev_null_write() was not using echo as intended (#923)

## `0.1.2504301751`

### 🚀 Features

- User config api key (#569)
- `@mention` files in codex (#701)
- Add `--reasoning` CLI flag (#314)
- Lower default retry wait time and increase number of tries (#720)
- Add common package registries domains to allowed-domains list (#414)

### 🪲 Bug Fixes

- Insufficient quota message (#758)
- Input keyboard shortcut opt+delete (#685)
- `/diff` should include untracked files (#686)
- Only allow running without sandbox if explicitly marked in safe container (#699)
- Tighten up check for /usr/bin/sandbox-exec (#710)
- Check if sandbox-exec is available (#696)
- Duplicate messages in quiet mode (#680)

## `0.1.2504251709`

### 🚀 Features

- Add openai model info configuration (#551)
- Added provider to run quiet mode function (#571)
- Create parent directories when creating new files (#552)
- Print bug report URL in terminal instead of opening browser (#510) (#528)
- Add support for custom provider configuration in the user config (#537)
- Add support for OpenAI-Organization and OpenAI-Project headers (#626)
- Add specific instructions for creating API keys in error msg (#581)
- Enhance toCodePoints to prevent potential unicode 14 errors (#615)
- More native keyboard navigation in multiline editor (#655)
- Display error on selection of invalid model (#594)

### 🪲 Bug Fixes

- Model selection (#643)
- Nits in apply patch (#640)
- Input keyboard shortcuts (#676)
- `apply_patch` unicode characters (#625)
- Don't clear turn input before retries (#611)
- More loosely match context for apply_patch (#610)
- Update bug report template - there is no --revision flag (#614)
- Remove outdated copy of text input and external editor feature (#670)
- Remove unreachable "disableResponseStorage" logic flow introduced in #543 (#573)
- Non-openai mode - fix for gemini content: null, fix 429 to throw before stream (#563)
- Only allow going up in history when not already in history if input is empty (#654)
- Do not grant "node" user sudo access when using run_in_container.sh (#627)
- Update scripts/build_container.sh to use pnpm instead of npm (#631)
- Update lint-staged config to use pnpm --filter (#582)
- Non-openai mode - don't default temp and top_p (#572)
- Fix error catching when checking for updates (#597)
- Close stdin when running an exec tool call (#636)

## `0.1.2504221401`

### 🚀 Features

- Show actionable errors when api keys are missing (#523)
- Add CLI `--version` flag (#492)

### 🪲 Bug Fixes

- Agent loop for ZDR (`disableResponseStorage`) (#543)
- Fix relative `workdir` check for `apply_patch` (#556)
- Minimal mid-stream #429 retry loop using existing back-off (#506)
- Inconsistent usage of base URL and API key (#507)
- Remove requirement for api key for ollama (#546)
- Support `[provider]_BASE_URL` (#542)

## `0.1.2504220136`

### 🚀 Features

- Add support for ZDR orgs (#481)
- Include fractional portion of chunk that exceeds stdout/stderr limit (#497)

## `0.1.2504211509`

### 🚀 Features

- Support multiple providers via Responses-Completion transformation (#247)
- Add user-defined safe commands configuration and approval logic #380 (#386)
- Allow switching approval modes when prompted to approve an edit/command (#400)
- Add support for `/diff` command autocomplete in TerminalChatInput (#431)
- Auto-open model selector if user selects deprecated model (#427)
- Read approvalMode from config file (#298)
- `/diff` command to view git diff (#426)
- Tab completions for file paths (#279)
- Add /command autocomplete (#317)
- Allow multi-line input (#438)

### 🪲 Bug Fixes

- `full-auto` support in quiet mode (#374)
- Enable shell option for child process execution (#391)
- Configure husky and lint-staged for pnpm monorepo (#384)
- Command pipe execution by improving shell detection (#437)
- Name of the file not matching the name of the component (#354)
- Allow proper exit from new Switch approval mode dialog (#453)
- Ensure /clear resets context and exclude system messages from approximateTokenUsed count (#443)
- `/clear` now clears terminal screen and resets context left indicator (#425)
- Correct fish completion function name in CLI script (#485)
- Auto-open model-selector when model is not found (#448)
- Remove unnecessary isLoggingEnabled() checks (#420)
- Improve test reliability for `raw-exec` (#434)
- Unintended tear down of agent loop (#483)
- Remove extraneous type casts (#462)

## `0.1.2504181820`

### 🚀 Features

- Add `/bug` report command (#312)
- Notify when a newer version is available (#333)

### 🪲 Bug Fixes

- Update context left display logic in TerminalChatInput component (#307)
- Improper spawn of sh on Windows Powershell (#318)
- `/bug` report command, thinking indicator (#381)
- Include pnpm lock file (#377)

## `0.1.2504172351`

### 🚀 Features

- Add Nix flake for reproducible development environments (#225)

### 🪲 Bug Fixes

- Handle invalid commands (#304)
- Raw-exec-process-group.test improve reliability and error handling (#280)
- Canonicalize the writeable paths used in seatbelt policy (#275)

## `0.1.2504172304`

### 🚀 Features

- Add shell completion subcommand (#138)
- Add command history persistence (#152)
- Shell command explanation option (#173)
- Support bun fallback runtime for codex CLI (#282)
- Add notifications for MacOS using Applescript (#160)
- Enhance image path detection in input processing (#189)
- `--config`/`-c` flag to open global instructions in nvim (#158)
- Update position of cursor when navigating input history with arrow keys to the end of the text (#255)

### 🪲 Bug Fixes

- Correct word deletion logic for trailing spaces (Ctrl+Backspace) (#131)
- Improve Windows compatibility for CLI commands and sandbox (#261)
- Correct typos in thinking texts (transcendent & parroting) (#108)
- Add empty vite config file to prevent resolving to parent (#273)
- Update regex to better match the retry error messages (#266)
- Add missing "as" in prompt prefix in agent loop (#186)
- Allow continuing after interrupting assistant (#178)
- Standardize filename to kebab-case 🐍➡️🥙 (#302)
- Small update to bug report template (#288)
- Duplicated message on model change (#276)
- Typos in prompts and comments (#195)
- Check workdir before spawn (#221)

<!-- generated - do not edit -->
688
README.md
@@ -2,31 +2,76 @@
|
||||
|
||||
<p align="center"><code>npm i -g @openai/codex</code><br />or <code>brew install codex</code></p>
|
||||
|
||||
<p align="center"><strong>Codex CLI</strong> is a coding agent from OpenAI that runs locally on your computer.
|
||||
</br>
|
||||
</br>If you want Codex in your code editor (VS Code, Cursor, Windsurf), <a href="https://developers.openai.com/codex/ide">install in your IDE</a>
|
||||
</br>If you are looking for the <em>cloud-based agent</em> from OpenAI, <strong>Codex Web</strong>, go to <a href="https://chatgpt.com/codex">chatgpt.com/codex</a></p>
|
||||
<p align="center"><strong>Codex CLI</strong> is a coding agent from OpenAI that runs locally on your computer.</br>If you are looking for the <em>cloud-based agent</em> from OpenAI, <strong>Codex Web</strong>, see <a href="https://chatgpt.com/codex">chatgpt.com/codex</a>.</p>
|
||||
|
||||
<p align="center">
|
||||
<img src="./.github/codex-cli-splash.png" alt="Codex CLI splash" width="80%" />
|
||||
<img src="./.github/codex-cli-splash.png" alt="Codex CLI splash" width="50%" />
|
||||
</p>
|
||||
|
||||
---
|
||||
|
||||
<details>
|
||||
<summary><strong>Table of contents</strong></summary>
|
||||
|
||||
<!-- Begin ToC -->
|
||||
|
||||
- [Quickstart](#quickstart)
|
||||
- [Installing and running Codex CLI](#installing-and-running-codex-cli)
|
||||
- [Using Codex with your ChatGPT plan](#using-codex-with-your-chatgpt-plan)
|
||||
- [Connecting on a "Headless" Machine](#connecting-on-a-headless-machine)
|
||||
- [Authenticate locally and copy your credentials to the "headless" machine](#authenticate-locally-and-copy-your-credentials-to-the-headless-machine)
|
||||
- [Connecting through VPS or remote](#connecting-through-vps-or-remote)
|
||||
- [Usage-based billing alternative: Use an OpenAI API key](#usage-based-billing-alternative-use-an-openai-api-key)
|
||||
- [Forcing a specific auth method (advanced)](#forcing-a-specific-auth-method-advanced)
|
||||
- [Choosing Codex's level of autonomy](#choosing-codexs-level-of-autonomy)
|
||||
- [**1. Read/write**](#1-readwrite)
|
||||
- [**2. Read-only**](#2-read-only)
|
||||
- [**3. Advanced configuration**](#3-advanced-configuration)
|
||||
- [Can I run without ANY approvals?](#can-i-run-without-any-approvals)
|
||||
- [Fine-tuning in `config.toml`](#fine-tuning-in-configtoml)
|
||||
- [Example prompts](#example-prompts)
|
||||
- [Running with a prompt as input](#running-with-a-prompt-as-input)
|
||||
- [Using Open Source Models](#using-open-source-models)
|
||||
- [Platform sandboxing details](#platform-sandboxing-details)
|
||||
- [Experimental technology disclaimer](#experimental-technology-disclaimer)
|
||||
- [System requirements](#system-requirements)
|
||||
- [CLI reference](#cli-reference)
|
||||
- [Memory & project docs](#memory--project-docs)
|
||||
- [Non-interactive / CI mode](#non-interactive--ci-mode)
|
||||
- [Model Context Protocol (MCP)](#model-context-protocol-mcp)
|
||||
- [Tracing / verbose logging](#tracing--verbose-logging)
|
||||
- [DotSlash](#dotslash)
|
||||
- [Configuration](#configuration)
|
||||
- [FAQ](#faq)
|
||||
- [Zero data retention (ZDR) usage](#zero-data-retention-zdr-usage)
|
||||
- [Codex open source fund](#codex-open-source-fund)
|
||||
- [Contributing](#contributing)
|
||||
- [Development workflow](#development-workflow)
|
||||
- [Writing high-impact code changes](#writing-high-impact-code-changes)
|
||||
- [Opening a pull request](#opening-a-pull-request)
|
||||
- [Review process](#review-process)
|
||||
- [Community values](#community-values)
|
||||
- [Getting help](#getting-help)
|
||||
- [Contributor license agreement (CLA)](#contributor-license-agreement-cla)
|
||||
- [Quick fixes](#quick-fixes)
|
||||
- [Releasing `codex`](#releasing-codex)
|
||||
- [Security & responsible AI](#security--responsible-ai)
|
||||
- [License](#license)
|
||||
|
||||
<!-- End ToC -->
|
||||
|
||||
</details>
|
||||
|
||||
---
|
||||
|
||||
## Quickstart
|
||||
|
||||
### Installing and running Codex CLI
|
||||
|
||||
Install globally with your preferred package manager. If you use npm:
|
||||
Install globally with your preferred package manager:
|
||||
|
||||
```shell
|
||||
npm install -g @openai/codex
|
||||
```
|
||||
|
||||
Alternatively, if you use Homebrew:
|
||||
|
||||
```shell
|
||||
brew install codex
|
||||
npm install -g @openai/codex # Alternatively: `brew install codex`
|
||||
```
|
||||
|
||||
Then simply run `codex` to get started:
|
||||
@@ -54,52 +99,607 @@ Each archive contains a single entry with the platform baked into the name (e.g.
|
||||
### Using Codex with your ChatGPT plan
|
||||
|
||||
<p align="center">
|
||||
<img src="./.github/codex-cli-login.png" alt="Codex CLI login" width="80%" />
|
||||
<img src="./.github/codex-cli-login.png" alt="Codex CLI login" width="50%" />
|
||||
</p>
|
||||
|
||||
Run `codex` and select **Sign in with ChatGPT**. We recommend signing into your ChatGPT account to use Codex as part of your Plus, Pro, Team, Edu, or Enterprise plan. [Learn more about what's included in your ChatGPT plan](https://help.openai.com/en/articles/11369540-codex-in-chatgpt).
|
||||
Run `codex` and select **Sign in with ChatGPT**. You'll need a Plus, Pro, or Team ChatGPT account, and will get access to our latest models, including `gpt-5`, at no extra cost to your plan. (Enterprise is coming soon.)
|
||||
|
||||
You can also use Codex with an API key, but this requires [additional setup](./docs/authentication.md#usage-based-billing-alternative-use-an-openai-api-key). If you previously used an API key for usage-based billing, see the [migration steps](./docs/authentication.md#migrating-from-usage-based-billing-api-key). If you're having trouble with login, please comment on [this issue](https://github.com/openai/codex/issues/1243).
|
||||
> Important: If you've used the Codex CLI before, follow these steps to migrate from usage-based billing with your API key:
|
||||
>
|
||||
> 1. Update the CLI and ensure `codex --version` is `0.20.0` or later
|
||||
> 2. Delete `~/.codex/auth.json` (this should be `C:\Users\USERNAME\.codex\auth.json` on Windows)
|
||||
> 3. Run `codex login` again
|
||||
|
||||
### Model Context Protocol (MCP)
|
||||
If you encounter problems with the login flow, please comment on [this issue](https://github.com/openai/codex/issues/1243).
|
||||
|
||||
Codex CLI supports [MCP servers](./docs/advanced.md#model-context-protocol-mcp). Enable by adding an `mcp_servers` section to your `~/.codex/config.toml`.
|
||||
### Connecting on a "Headless" Machine
|
||||
|
||||
Today, the login process entails running a server on `localhost:1455`. If you are on a "headless" server, such as a Docker container or are `ssh`'d into a remote machine, loading `localhost:1455` in the browser on your local machine will not automatically connect to the webserver running on the _headless_ machine, so you must use one of the following workarounds:
|
||||
|
||||
### Configuration
|
||||
#### Authenticate locally and copy your credentials to the "headless" machine
|
||||
|
||||
Codex CLI supports a rich set of configuration options, with preferences stored in `~/.codex/config.toml`. For full configuration options, see [Configuration](./docs/config.md).
|
||||
The easiest solution is likely to run through the `codex login` process on your local machine such that `localhost:1455` _is_ accessible in your web browser. When you complete the authentication process, an `auth.json` file should be available at `$CODEX_HOME/auth.json` (on Mac/Linux, `$CODEX_HOME` defaults to `~/.codex` whereas on Windows, it defaults to `%USERPROFILE%\.codex`).
|
||||
|
||||
Because the `auth.json` file is not tied to a specific host, once you complete the authentication flow locally, you can copy the `$CODEX_HOME/auth.json` file to the headless machine and then `codex` should "just work" on that machine. Note to copy a file to a Docker container, you can do:
|
||||
|
||||
```shell
# substitute MY_CONTAINER with the name or id of your Docker container:
CONTAINER_HOME=$(docker exec MY_CONTAINER printenv HOME)
docker exec MY_CONTAINER mkdir -p "$CONTAINER_HOME/.codex"
docker cp auth.json MY_CONTAINER:"$CONTAINER_HOME/.codex/auth.json"
```

whereas if you are `ssh`'d into a remote machine, you likely want to use [`scp`](https://en.wikipedia.org/wiki/Secure_copy_protocol):

```shell
ssh user@remote 'mkdir -p ~/.codex'
scp ~/.codex/auth.json user@remote:~/.codex/auth.json
```

or try this one-liner:

```shell
ssh user@remote 'mkdir -p ~/.codex && cat > ~/.codex/auth.json' < ~/.codex/auth.json
```

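To confirm the credentials landed where Codex expects them, you can check for the file on the remote host (a quick sanity check; `user@remote` is a placeholder):

```shell
# Verify the copied credentials file exists on the headless machine:
ssh user@remote 'ls -l ~/.codex/auth.json'
```
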
#### Connecting through VPS or remote

If you run Codex on a remote machine (VPS/server) without a local browser, the login helper starts a server on `localhost:1455` on the remote host. To complete login in your local browser, forward that port to your machine before starting the login flow:

```bash
# From your local machine
ssh -L 1455:localhost:1455 <user>@<remote-host>
```

Then, in that SSH session, run `codex` and select "Sign in with ChatGPT". When prompted, open the printed URL (it will be `http://localhost:1455/...`) in your local browser. The traffic will be tunneled to the remote server.

### Usage-based billing alternative: Use an OpenAI API key

If you prefer to pay as you go, you can still authenticate with your OpenAI API key by setting it as an environment variable:

```shell
export OPENAI_API_KEY="your-api-key-here"
```

Notes:

- This command only sets the key for your current terminal session, which is what we recommend. To set it for all future sessions, you can also add the `export` line to your shell's configuration file (e.g., `~/.zshrc`).
- If you have signed in with ChatGPT, Codex will default to using your ChatGPT credits. If you wish to use your API key, use the `/logout` command to clear your ChatGPT authentication.

#### Forcing a specific auth method (advanced)

You can explicitly choose which authentication method Codex should prefer when both are available.

- To always use your API key (even when ChatGPT auth exists), set:

```toml
# ~/.codex/config.toml
preferred_auth_method = "apikey"
```

Or override ad hoc via the CLI:

```bash
codex --config preferred_auth_method="apikey"
```

- To prefer ChatGPT auth (the default), set:

```toml
# ~/.codex/config.toml
preferred_auth_method = "chatgpt"
```

Notes:

- When `preferred_auth_method = "apikey"` and an API key is available, the login screen is skipped.
- When `preferred_auth_method = "chatgpt"` (the default), Codex prefers ChatGPT auth if present; if only an API key is present, it will use the API key. Certain account types may also require API-key mode.

### Choosing Codex's level of autonomy

We always recommend running Codex in its default sandbox, which gives you strong guardrails around what the agent can do. The default sandbox prevents it from editing files outside its workspace and from accessing the network.

When you launch Codex in a new folder, it detects whether the folder is version controlled and recommends one of two levels of autonomy:

#### **1. Read/write**

- Codex can run commands and write files in the workspace without approval.
- To write files in other folders, access the network, update git state, or perform other actions protected by the sandbox, Codex will need your permission.
- By default, the workspace includes the current directory, as well as temporary directories like `/tmp`. You can see which directories are in the workspace with the `/status` command. See the docs for how to customize this behavior.
- Advanced: You can manually specify this configuration by running `codex --sandbox workspace-write --ask-for-approval on-request`.
- This is the recommended default for version-controlled folders.

#### **2. Read-only**

- Codex can run read-only commands without approval.
- To edit files, access the network, or perform other actions protected by the sandbox, Codex will need your permission.
- Advanced: You can manually specify this configuration by running `codex --sandbox read-only --ask-for-approval on-request`.
- This is the recommended default for non-version-controlled folders.

#### **3. Advanced configuration**

Codex gives you fine-grained control over the sandbox with the `--sandbox` option, and over when it requests approval with the `--ask-for-approval` option. Run `codex help` for more on these options.

#### Can I run without ANY approvals?

Yes, run Codex non-interactively with `--ask-for-approval never`. This option works with all `--sandbox` options, so you still have full control over Codex's level of autonomy. It will make its best attempt with whatever constraints you provide. For example:

- Use `codex --ask-for-approval never --sandbox read-only` when you are running many agents to answer questions in parallel in the same workspace.
- Use `codex --ask-for-approval never --sandbox workspace-write` when you want the agent to non-interactively take time to produce the best outcome, with strong guardrails around its behavior.
- Use `codex --ask-for-approval never --sandbox danger-full-access` to dangerously give the agent full autonomy. Because this disables important safety mechanisms, we recommend against using this unless you are running Codex in an isolated environment.

#### Fine-tuning in `config.toml`

```toml
# Pick one of the two presets below.

# approval mode
approval_policy = "untrusted"
sandbox_mode = "read-only"

# full-auto mode
approval_policy = "on-request"
sandbox_mode = "workspace-write"

# Optional: allow network in workspace-write mode
[sandbox_workspace_write]
network_access = true
```

You can also save presets as **profiles**:

```toml
[profiles.full_auto]
approval_policy = "on-request"
sandbox_mode = "workspace-write"

[profiles.readonly_quiet]
approval_policy = "never"
sandbox_mode = "read-only"
```

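You can then activate a preset at launch; the profile name below matches the definition above, and the prompt is illustrative:

```shell
codex --profile full_auto "refactor the parser module"
```
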
### Example prompts

Below are a few bite-size examples you can copy-paste. Replace the text in quotes with your own task. See the [prompting guide](https://github.com/openai/codex/blob/main/codex-cli/examples/prompting_guide.md) for more tips and usage patterns.

| ✨  | What you type                                                                    | What happens                                                                |
| --- | -------------------------------------------------------------------------------- | --------------------------------------------------------------------------- |
| 1   | `codex "Refactor the Dashboard component to React Hooks"`                         | Codex rewrites the class component, runs `npm test`, and shows the diff.    |
| 2   | `codex "Generate SQL migrations for adding a users table"`                        | Infers your ORM, creates migration files, and runs them in a sandboxed DB.  |
| 3   | `codex "Write unit tests for utils/date.ts"`                                      | Generates tests, executes them, and iterates until they pass.               |
| 4   | `codex "Bulk-rename *.jpeg -> *.jpg with git mv"`                                 | Safely renames files and updates imports/usages.                            |
| 5   | `codex "Explain what this regex does: ^(?=.*[A-Z]).{8,}$"`                        | Outputs a step-by-step human explanation.                                   |
| 6   | `codex "Carefully review this repo, and propose 3 high impact well-scoped PRs"`   | Suggests impactful PRs in the current codebase.                             |
| 7   | `codex "Look for vulnerabilities and create a security review report"`            | Finds and explains security bugs.                                           |

## Running with a prompt as input

You can also run Codex CLI with a prompt as input:

```shell
codex "explain this codebase to me"
```

```shell
codex --full-auto "create the fanciest todo-list app"
```

That's it - Codex will scaffold a file, run it inside a sandbox, install any missing dependencies, and show you the live result. Approve the changes and they'll be committed to your working directory.

## Using Open Source Models

<details>
<summary><strong>Use <code>--profile</code> to use other models</strong></summary>

Codex also allows you to use other providers that support the OpenAI Chat Completions (or Responses) API.

To do so, you must first define custom [providers](./config.md#model_providers) in `~/.codex/config.toml`. For example, the provider for a standard Ollama setup would be defined as follows:

```toml
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
```

The `base_url` will have `/chat/completions` appended to it to build the full URL for the request.

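As a quick way to confirm what Codex will be calling, you can hit the resulting endpoint directly (a minimal sketch against the Ollama provider defined above; the model name is illustrative):

```shell
# Send a one-off request to the provider's chat completions endpoint:
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "mistral", "messages": [{"role": "user", "content": "Say hello"}]}'
```
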
For providers that also require an `Authorization` header of the form `Bearer SECRET`, an `env_key` can be specified, which indicates the environment variable to read to use as the value of `SECRET` when making a request:

```toml
[model_providers.openrouter]
name = "OpenRouter"
base_url = "https://openrouter.ai/api/v1"
env_key = "OPENROUTER_API_KEY"
```

Providers that speak the Responses API are also supported by adding `wire_api = "responses"` to the definition. Accessing OpenAI models via Azure is an example of such a provider, though it also requires specifying additional `query_params` that need to be appended to the request URL:

```toml
[model_providers.azure]
name = "Azure"
# Make sure you set the appropriate subdomain for this URL.
base_url = "https://YOUR_PROJECT_NAME.openai.azure.com/openai"
env_key = "AZURE_OPENAI_API_KEY" # Or "OPENAI_API_KEY", whichever you use.
# Newer versions appear to support the responses API, see https://github.com/openai/codex/pull/1321
query_params = { api-version = "2025-04-01-preview" }
wire_api = "responses"
```

Once you have defined a provider you wish to use, you can configure it as your default provider as follows:

```toml
model_provider = "azure"
```

> [!TIP]
> If you find yourself experimenting with a variety of models and providers, then you likely want to invest in defining a _profile_ for each configuration like so:

```toml
[profiles.o3]
model_provider = "azure"
model = "o3"

[profiles.mistral]
model_provider = "ollama"
model = "mistral"
```

This way, you can specify one command-line argument (e.g., `--profile o3`, `--profile mistral`) to override multiple settings together.

</details>

Codex can run fully locally against an OpenAI-compatible OSS host (like Ollama) using the `--oss` flag:

- Interactive UI:
  - `codex --oss`
- Non-interactive (programmatic) mode:
  - `echo "Refactor utils" | codex exec --oss`

Model selection when using `--oss`:

- If you omit `-m`/`--model`, Codex defaults to `-m gpt-oss:20b` and will verify it exists locally (downloading it if needed).
- To pick a different size, pass one of:
  - `-m "gpt-oss:20b"`
  - `-m "gpt-oss:120b"`

Point Codex at your own OSS host:

- By default, `--oss` talks to `http://localhost:11434/v1`.
- To use a different host, set one of these environment variables before running Codex:
  - `CODEX_OSS_BASE_URL`, for example:
    - `CODEX_OSS_BASE_URL="http://my-ollama.example.com:11434/v1" codex --oss -m gpt-oss:20b`
  - or `CODEX_OSS_PORT` (when the host is localhost):
    - `CODEX_OSS_PORT=11434 codex --oss`

Advanced: you can persist this in your config instead of environment variables by overriding the built-in `oss` provider in `~/.codex/config.toml`:

```toml
[model_providers.oss]
name = "Open Source"
base_url = "http://my-ollama.example.com:11434/v1"
```

---

### Docs & FAQ

- [**Getting started**](./docs/getting-started.md)
  - [CLI usage](./docs/getting-started.md#cli-usage)
  - [Running with a prompt as input](./docs/getting-started.md#running-with-a-prompt-as-input)
  - [Example prompts](./docs/getting-started.md#example-prompts)
  - [Memory with AGENTS.md](./docs/getting-started.md#memory-with-agentsmd)
  - [Configuration](./docs/config.md)
- [**Sandbox & approvals**](./docs/sandbox.md)
- [**Authentication**](./docs/authentication.md)
  - [Auth methods](./docs/authentication.md#forcing-a-specific-auth-method-advanced)
  - [Login on a "Headless" machine](./docs/authentication.md#connecting-on-a-headless-machine)
- [**Advanced**](./docs/advanced.md)
  - [Non-interactive / CI mode](./docs/advanced.md#non-interactive--ci-mode)
  - [Tracing / verbose logging](./docs/advanced.md#tracing--verbose-logging)
  - [Model Context Protocol (MCP)](./docs/advanced.md#model-context-protocol-mcp)
- [**Zero data retention (ZDR)**](./docs/zdr.md)
- [**Contributing**](./docs/contributing.md)
- [**Install & build**](./docs/install.md)
  - [System Requirements](./docs/install.md#system-requirements)
  - [DotSlash](./docs/install.md#dotslash)
  - [Build from source](./docs/install.md#build-from-source)
- [**FAQ**](./docs/faq.md)
- [**Open source fund**](./docs/open-source-fund.md)

### Platform sandboxing details

By default, Codex CLI runs code and shell commands inside a restricted sandbox to protect your system.

> [!IMPORTANT]
> Not all tool calls are sandboxed. Specifically, **trusted Model Context Protocol (MCP) tool calls** are executed outside of the sandbox.
> This is intentional: MCP tools are explicitly configured and trusted by you, and they often need to connect to **external applications or services** (e.g. issue trackers, databases, messaging systems).
> Running them outside the sandbox allows Codex to integrate with these external systems without being blocked by sandbox restrictions.

The mechanism Codex uses to implement the sandbox policy depends on your OS:

- **macOS 12+** uses **Apple Seatbelt** and runs commands using `sandbox-exec` with a profile (`-p`) that corresponds to the `--sandbox` that was specified.
- **Linux** uses a combination of Landlock/seccomp APIs to enforce the `sandbox` configuration.

Note that when running Linux in a containerized environment such as Docker, sandboxing may not work if the host/container configuration does not support the necessary Landlock/seccomp APIs. In such cases, we recommend configuring your Docker container so that it provides the sandbox guarantees you are looking for, and then running `codex` with `--sandbox danger-full-access` (or, more simply, the `--dangerously-bypass-approvals-and-sandbox` flag) within your container.

---

## Experimental technology disclaimer

Codex CLI is an experimental project under active development. It is not yet stable: it may contain bugs or incomplete features, and it may undergo breaking changes. We're building it in the open with the community and welcome:

- Bug reports
- Feature requests
- Pull requests
- Good vibes

Help us improve by filing issues or submitting PRs (see the sections below for how to contribute)!

---

## System requirements

| Requirement                 | Details                                                         |
| --------------------------- | --------------------------------------------------------------- |
| Operating systems           | macOS 12+, Ubuntu 20.04+/Debian 10+, or Windows 11 **via WSL2** |
| Git (optional, recommended) | 2.23+ for built-in PR helpers                                   |
| RAM                         | 4 GB minimum (8 GB recommended)                                 |

---

## CLI reference

| Command            | Purpose                            | Example                         |
| ------------------ | ---------------------------------- | ------------------------------- |
| `codex`            | Interactive TUI                    | `codex`                         |
| `codex "..."`      | Initial prompt for interactive TUI | `codex "fix lint errors"`       |
| `codex exec "..."` | Non-interactive "automation mode"  | `codex exec "explain utils.ts"` |

Key flags: `--model/-m`, `--ask-for-approval/-a`.

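For instance, combining both flags in one invocation (the model name and prompt here are illustrative):

```shell
codex --model o3 --ask-for-approval on-request "tighten error handling in src/"
```
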
---

## Memory & project docs

You can give Codex extra instructions and guidance using `AGENTS.md` files. Codex looks for `AGENTS.md` files in the following places and merges them top-down (a minimal example follows the list):

1. `~/.codex/AGENTS.md` - personal global guidance
2. `AGENTS.md` at repo root - shared project notes
3. `AGENTS.md` in the current working directory - sub-folder/feature specifics

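An `AGENTS.md` file is plain Markdown; as a minimal illustration (the notes below are hypothetical):

```shell
# Seed a repo-root AGENTS.md with shared notes for Codex:
cat > AGENTS.md <<'EOF'
# Project notes for Codex
- Run `cargo test` before proposing changes.
- Keep changes focused; one topic per PR.
EOF
```
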
---

## Non-interactive / CI mode

Run Codex headless in pipelines. Example GitHub Action step:

```yaml
- name: Update changelog via Codex
  run: |
    npm install -g @openai/codex
    export OPENAI_API_KEY="${{ secrets.OPENAI_KEY }}"
    codex exec --full-auto "update CHANGELOG for next release"
```

## Model Context Protocol (MCP)

The Codex CLI can be configured to leverage MCP servers by defining an [`mcp_servers`](./codex-rs/config.md#mcp_servers) section in `~/.codex/config.toml`. It is intended to mirror how tools such as Claude and Cursor define `mcpServers` in their respective JSON config files, though the Codex format is slightly different since it uses TOML rather than JSON, e.g.:

```toml
# IMPORTANT: the top-level key is `mcp_servers` rather than `mcpServers`.
[mcp_servers.server-name]
command = "npx"
args = ["-y", "mcp-server"]
env = { "API_KEY" = "value" }
```

> [!TIP]
> It is somewhat experimental, but the Codex CLI can also be run as an MCP _server_ via `codex mcp`. If you launch it with an MCP client such as `npx @modelcontextprotocol/inspector codex mcp` and send it a `tools/list` request, you will see that there is only one tool, `codex`, that accepts a grab-bag of inputs, including a catch-all `config` map for anything you might want to override. Feel free to play around with it and provide feedback via GitHub issues.

## Tracing / verbose logging

Because Codex is written in Rust, it honors the `RUST_LOG` environment variable to configure its logging behavior.

The TUI defaults to `RUST_LOG=codex_core=info,codex_tui=info` and log messages are written to `~/.codex/log/codex-tui.log`, so you can leave the following running in a separate terminal to monitor log messages as they are written:

```shell
tail -F ~/.codex/log/codex-tui.log
```

By comparison, the non-interactive mode (`codex exec`) defaults to `RUST_LOG=error`, but messages are printed inline, so there is no need to monitor a separate file.

See the Rust documentation on [`RUST_LOG`](https://docs.rs/env_logger/latest/env_logger/#enabling-logging) for more information on the configuration options.

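To temporarily increase verbosity, you can override the variable for a single run; the filter values below simply raise the defaults noted above to `debug`:

```shell
RUST_LOG=codex_core=debug,codex_tui=debug codex
```
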
---

### DotSlash

The GitHub Release also contains a [DotSlash](https://dotslash-cli.com/) file for the Codex CLI named `codex`. Using a DotSlash file lets you make a lightweight commit to source control to ensure all contributors use the same version of an executable, regardless of what platform they use for development.

</details>

<details>
<summary><strong>Build from source</strong></summary>

```bash
# Clone the repository and navigate to the root of the Cargo workspace.
git clone https://github.com/openai/codex.git
cd codex/codex-rs

# Install the Rust toolchain, if necessary.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
source "$HOME/.cargo/env"
rustup component add rustfmt
rustup component add clippy

# Build Codex.
cargo build

# Launch the TUI with a sample prompt.
cargo run --bin codex -- "explain this codebase to me"

# After making changes, ensure the code is clean.
cargo fmt -- --config imports_granularity=Item
cargo clippy --tests

# Run the tests.
cargo test
```

</details>

---

## Configuration

Codex supports a rich set of configuration options, documented in [`codex-rs/config.md`](./codex-rs/config.md).

By default, Codex loads its configuration from `~/.codex/config.toml`.

The `--config` flag can also be used to set or override ad-hoc config values for individual invocations of `codex`.

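For example (the key follows its TOML name, and the prompt is illustrative):

```shell
codex --config sandbox_mode="read-only" "audit the auth module"
```
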
---

## FAQ

<details>
<summary>OpenAI released a model called Codex in 2021 - is this related?</summary>

In 2021, OpenAI released Codex, an AI system designed to generate code from natural language prompts. That original Codex model was deprecated as of March 2023 and is separate from the CLI tool.

</details>

<details>
<summary>Which models are supported?</summary>

Any model available with the [Responses API](https://platform.openai.com/docs/api-reference/responses). The default is `o4-mini`, but pass `--model gpt-4.1` or set `model = "gpt-4.1"` in your config file to override.

</details>

<details>
<summary>Why does <code>o3</code> or <code>o4-mini</code> not work for me?</summary>

It's possible that your [API account needs to be verified](https://help.openai.com/en/articles/10910291-api-organization-verification) in order to start streaming responses and seeing chain-of-thought summaries from the API. If you're still running into issues, please let us know!

</details>

<details>
<summary>How do I stop Codex from editing my files?</summary>

Codex runs model-generated commands in a sandbox. If a proposed command or file change doesn't look right, you can simply type **n** to deny the command or give the model feedback.

</details>

<details>
<summary>Does it work on Windows?</summary>

Not directly. It requires [Windows Subsystem for Linux (WSL2)](https://learn.microsoft.com/en-us/windows/wsl/install) - Codex has been tested on macOS and Linux with Node 22.

</details>

---

## Zero data retention (ZDR) usage

Codex CLI **does** support OpenAI organizations with [Zero Data Retention (ZDR)](https://platform.openai.com/docs/guides/your-data#zero-data-retention) enabled. If your OpenAI organization has Zero Data Retention enabled and you still encounter errors such as:

```
OpenAI rejected the request. Error details: Status: 400, Code: unsupported_parameter, Type: invalid_request_error, Message: 400 Previous response cannot be used for this organization due to Zero Data Retention.
```

ensure you are running `codex` with `--config disable_response_storage=true`, or add this line to `~/.codex/config.toml` to avoid specifying the command-line option each time:

```toml
disable_response_storage = true
```

See [the configuration documentation on `disable_response_storage`](./codex-rs/config.md#disable_response_storage) for details.

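For a one-off run, that looks like the following (the prompt is illustrative):

```shell
codex --config disable_response_storage=true "review this module"
```
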
---

## Codex open source fund

We're excited to launch a **$1 million initiative** supporting open source projects that use Codex CLI and other OpenAI models.

- Grants are awarded up to **$25,000** in API credits.
- Applications are reviewed **on a rolling basis**.

**Interested? [Apply here](https://openai.com/form/codex-open-source-fund/).**

---

## Contributing

This project is under active development, and the code will likely change significantly.

**At the moment, we only plan to prioritize reviewing external contributions for bugs or security fixes.**

If you want to add a new feature or change the behavior of an existing one, please open an issue proposing the feature and get approval from an OpenAI team member before spending time building it.

**New contributions that don't go through this process may be closed** if they aren't aligned with our current roadmap or conflict with other priorities/upcoming features.

### Development workflow

- Create a _topic branch_ from `main` - e.g. `feat/interactive-prompt`.
- Keep your changes focused. Multiple unrelated fixes should be opened as separate PRs.
- Following the [development setup](#development-workflow) instructions above, ensure your change is free of lint warnings and test failures.

### Writing high-impact code changes

1. **Start with an issue.** Open a new one or comment on an existing discussion so we can agree on the solution before code is written.
2. **Add or update tests.** Every new feature or bug fix should come with test coverage that fails before your change and passes afterwards. 100% coverage is not required, but aim for meaningful assertions.
3. **Document behaviour.** If your change affects user-facing behaviour, update the README, inline help (`codex --help`), or relevant example projects.
4. **Keep commits atomic.** Each commit should compile and the tests should pass. This makes reviews and potential rollbacks easier.

### Opening a pull request

- Fill in the PR template (or include similar information) - **What? Why? How?**
- Run **all** checks locally (`cargo test && cargo clippy --tests && cargo fmt -- --config imports_granularity=Item`). CI failures that could have been caught locally slow down the process.
- Make sure your branch is up-to-date with `main` and that you have resolved merge conflicts.
- Mark the PR as **Ready for review** only when you believe it is in a merge-able state.

### Review process

1. One maintainer will be assigned as a primary reviewer.
2. If your PR adds a new feature that was not previously discussed and approved, we may choose to close your PR (see [Contributing](#contributing)).
3. We may ask for changes - please do not take this personally. We value the work, but we also value consistency and long-term maintainability.
4. When there is consensus that the PR meets the bar, a maintainer will squash-and-merge.

### Community values

- **Be kind and inclusive.** Treat others with respect; we follow the [Contributor Covenant](https://www.contributor-covenant.org/).
- **Assume good intent.** Written communication is hard - err on the side of generosity.
- **Teach & learn.** If you spot something confusing, open an issue or PR with improvements.

### Getting help

If you run into problems setting up the project, would like feedback on an idea, or just want to say _hi_ - please open a Discussion or jump into the relevant issue. We are happy to help.

Together we can make Codex CLI an incredible tool. **Happy hacking!** :rocket:

### Contributor license agreement (CLA)

All contributors **must** accept the CLA. The process is lightweight:

1. Open your pull request.
2. Paste the following comment (or reply `recheck` if you've signed before):

   ```text
   I have read the CLA Document and I hereby sign the CLA
   ```

3. The CLA-Assistant bot records your signature in the repo and marks the status check as passed.

No special Git commands, email attachments, or commit footers are required.

#### Quick fixes

| Scenario          | Command                                          |
| ----------------- | ------------------------------------------------ |
| Amend last commit | `git commit --amend -s --no-edit && git push -f` |

The **DCO check** blocks merges until every commit in the PR carries the footer (with a squash merge, this is just the one commit).

### Releasing `codex`

_For admins only._

Make sure you are on `main` and have no local changes. Then run:

```shell
VERSION=0.2.0  # Can also be 0.2.0-alpha.1 or any valid Rust version.
./codex-rs/scripts/create_github_release.sh "$VERSION"
```

This will make a local commit on top of `main` with `version` set to `$VERSION` in `codex-rs/Cargo.toml` (note that on `main`, we leave the version as `version = "0.0.0"`).

It will then push the commit using the tag `rust-v${VERSION}`, which in turn kicks off [the release workflow](.github/workflows/rust-release.yml). This creates a new GitHub Release named `$VERSION`.

If everything looks good in the generated GitHub Release, uncheck the **pre-release** box so it becomes the latest release.

Create a PR to update [`Formula/c/codex.rb`](https://github.com/Homebrew/homebrew-core/blob/main/Formula/c/codex.rb) on Homebrew.

---

## Security & responsible AI

Have you discovered a vulnerability or have concerns about model output? Please e-mail **security@openai.com** and we will respond promptly.

---

## License

This repository is licensed under the [Apache-2.0 License](LICENSE).

8 codex-cli/.gitignore vendored
@@ -1 +1,7 @@
/vendor/
# Added by ./scripts/install_native_deps.sh
/bin/codex-aarch64-apple-darwin
/bin/codex-aarch64-unknown-linux-musl
/bin/codex-linux-sandbox-arm64
/bin/codex-linux-sandbox-x64
/bin/codex-x86_64-apple-darwin
/bin/codex-x86_64-unknown-linux-musl

@@ -1,7 +1,6 @@
#!/usr/bin/env node
// Unified entry point for the Codex CLI.

import { existsSync } from "fs";
import path from "path";
import { fileURLToPath } from "url";

@@ -41,11 +40,10 @@ switch (platform) {
  case "win32":
    switch (arch) {
      case "x64":
        targetTriple = "x86_64-pc-windows-msvc";
        targetTriple = "x86_64-pc-windows-msvc.exe";
        break;
      case "arm64":
        targetTriple = "aarch64-pc-windows-msvc";
        break;
      // We do not build this today, fall through...
      default:
        break;
    }
@@ -58,10 +56,7 @@ if (!targetTriple) {
  throw new Error(`Unsupported platform: ${platform} (${arch})`);
}

const vendorRoot = path.join(__dirname, "..", "vendor");
const archRoot = path.join(vendorRoot, targetTriple);
const codexBinaryName = process.platform === "win32" ? "codex.exe" : "codex";
const binaryPath = path.join(archRoot, "codex", codexBinaryName);
const binaryPath = path.join(__dirname, "..", "bin", `codex-${targetTriple}`);

// Use an asynchronous spawn instead of spawnSync so that Node is able to
// respond to signals (e.g. Ctrl-C / SIGINT) while the native binary is
@@ -70,6 +65,23 @@ const binaryPath = path.join(archRoot, "codex", codexBinaryName);
// receives a fatal signal, both processes exit in a predictable manner.
const { spawn } = await import("child_process");

async function tryImport(moduleName) {
  try {
    // eslint-disable-next-line node/no-unsupported-features/es-syntax
    return await import(moduleName);
  } catch (err) {
    return null;
  }
}

async function resolveRgDir() {
  const ripgrep = await tryImport("@vscode/ripgrep");
  if (!ripgrep?.rgPath) {
    return null;
  }
  return path.dirname(ripgrep.rgPath);
}

function getUpdatedPath(newDirs) {
  const pathSep = process.platform === "win32" ? ";" : ":";
  const existingPath = process.env.PATH || "";
@@ -81,9 +93,9 @@ function getUpdatedPath(newDirs) {
}

const additionalDirs = [];
const pathDir = path.join(archRoot, "path");
if (existsSync(pathDir)) {
  additionalDirs.push(pathDir);
const rgDir = await resolveRgDir();
if (rgDir) {
  additionalDirs.push(rgDir);
}
const updatedPath = getUpdatedPath(additionalDirs);

@@ -1,79 +0,0 @@
#!/usr/bin/env dotslash

{
  "name": "rg",
  "platforms": {
    "macos-aarch64": {
      "size": 1787248,
      "hash": "blake3",
      "digest": "8d9942032585ea8ee805937634238d9aee7b210069f4703c88fbe568e26fb78a",
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-aarch64-apple-darwin/rg",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-aarch64-apple-darwin.tar.gz"
        }
      ]
    },
    "linux-aarch64": {
      "size": 2047405,
      "hash": "blake3",
      "digest": "0b670b8fa0a3df2762af2fc82cc4932f684ca4c02dbd1260d4f3133fd4b2a515",
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-aarch64-unknown-linux-gnu/rg",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-aarch64-unknown-linux-gnu.tar.gz"
        }
      ]
    },
    "macos-x86_64": {
      "size": 2082672,
      "hash": "blake3",
      "digest": "e9b862fc8da3127f92791f0ff6a799504154ca9d36c98bf3e60a81c6b1f7289e",
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-x86_64-apple-darwin/rg",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-x86_64-apple-darwin.tar.gz"
        }
      ]
    },
    "linux-x86_64": {
      "size": 2566310,
      "hash": "blake3",
      "digest": "f73cca4e54d78c31f832c7f6e2c0b4db8b04fa3eaa747915727d570893dbee76",
      "format": "tar.gz",
      "path": "ripgrep-14.1.1-x86_64-unknown-linux-musl/rg",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-x86_64-unknown-linux-musl.tar.gz"
        }
      ]
    },
    "windows-x86_64": {
      "size": 2058893,
      "hash": "blake3",
      "digest": "a8ce1a6fed4f8093ee997e57f33254e94b2cd18e26358b09db599c89882eadbd",
      "format": "zip",
      "path": "ripgrep-14.1.1-x86_64-pc-windows-msvc/rg.exe",
      "providers": [
        {
          "url": "https://github.com/BurntSushi/ripgrep/releases/download/14.1.1/ripgrep-14.1.1-x86_64-pc-windows-msvc.zip"
        }
      ]
    },
    "windows-aarch64": {
      "size": 1667740,
      "hash": "blake3",
      "digest": "47b971a8c4fca1d23a4e7c19bd4d88465ebc395598458133139406d3bf85f3fa",
      "format": "zip",
      "path": "rg.exe",
      "providers": [
        {
          "url": "https://github.com/microsoft/ripgrep-prebuilt/releases/download/v13.0.0-13/ripgrep-v13.0.0-13-aarch64-pc-windows-msvc.zip"
        }
      ]
    }
  }
}

101 codex-cli/package-lock.json generated
@@ -2,17 +2,118 @@
  "name": "@openai/codex",
  "version": "0.0.0-dev",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "@openai/codex",
      "version": "0.0.0-dev",
      "license": "Apache-2.0",
      "dependencies": {
        "@vscode/ripgrep": "^1.15.14"
      },
      "bin": {
        "codex": "bin/codex.js"
      },
      "engines": {
        "node": ">=20"
      }
    },
    "node_modules/@vscode/ripgrep": {
      "version": "1.15.14",
      "resolved": "https://registry.npmjs.org/@vscode/ripgrep/-/ripgrep-1.15.14.tgz",
      "integrity": "sha512-/G1UJPYlm+trBWQ6cMO3sv6b8D1+G16WaJH1/DSqw32JOVlzgZbLkDxRyzIpTpv30AcYGMkCf5tUqGlW6HbDWw==",
      "hasInstallScript": true,
      "license": "MIT",
      "dependencies": {
        "https-proxy-agent": "^7.0.2",
        "proxy-from-env": "^1.1.0",
        "yauzl": "^2.9.2"
      }
    },
    "node_modules/agent-base": {
      "version": "7.1.4",
      "resolved": "https://registry.npmjs.org/agent-base/-/agent-base-7.1.4.tgz",
      "integrity": "sha512-MnA+YT8fwfJPgBx3m60MNqakm30XOkyIoH1y6huTQvC0PwZG7ki8NacLBcrPbNoo8vEZy7Jpuk7+jMO+CUovTQ==",
      "license": "MIT",
      "engines": {
        "node": ">= 14"
      }
    },
    "node_modules/buffer-crc32": {
      "version": "0.2.13",
      "resolved": "https://registry.npmjs.org/buffer-crc32/-/buffer-crc32-0.2.13.tgz",
      "integrity": "sha512-VO9Ht/+p3SN7SKWqcrgEzjGbRSJYTx+Q1pTQC0wrWqHx0vpJraQ6GtHx8tvcg1rlK1byhU5gccxgOgj7B0TDkQ==",
      "license": "MIT",
      "engines": {
        "node": "*"
      }
    },
    "node_modules/debug": {
      "version": "4.4.1",
      "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.1.tgz",
      "integrity": "sha512-KcKCqiftBJcZr++7ykoDIEwSa3XWowTfNPo92BYxjXiyYEVrUQh2aLyhxBCwww+heortUFxEJYcRzosstTEBYQ==",
      "license": "MIT",
      "dependencies": {
        "ms": "^2.1.3"
      },
      "engines": {
        "node": ">=6.0"
      },
      "peerDependenciesMeta": {
        "supports-color": {
          "optional": true
        }
      }
    },
    "node_modules/fd-slicer": {
      "version": "1.1.0",
      "resolved": "https://registry.npmjs.org/fd-slicer/-/fd-slicer-1.1.0.tgz",
      "integrity": "sha512-cE1qsB/VwyQozZ+q1dGxR8LBYNZeofhEdUNGSMbQD3Gw2lAzX9Zb3uIU6Ebc/Fmyjo9AWWfnn0AUCHqtevs/8g==",
      "license": "MIT",
      "dependencies": {
        "pend": "~1.2.0"
      }
    },
    "node_modules/https-proxy-agent": {
      "version": "7.0.6",
      "resolved": "https://registry.npmjs.org/https-proxy-agent/-/https-proxy-agent-7.0.6.tgz",
      "integrity": "sha512-vK9P5/iUfdl95AI+JVyUuIcVtd4ofvtrOr3HNtM2yxC9bnMbEdp3x01OhQNnjb8IJYi38VlTE3mBXwcfvywuSw==",
      "license": "MIT",
      "dependencies": {
        "agent-base": "^7.1.2",
        "debug": "4"
      },
      "engines": {
        "node": ">= 14"
      }
    },
    "node_modules/ms": {
      "version": "2.1.3",
      "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
      "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
      "license": "MIT"
    },
    "node_modules/pend": {
      "version": "1.2.0",
      "resolved": "https://registry.npmjs.org/pend/-/pend-1.2.0.tgz",
      "integrity": "sha512-F3asv42UuXchdzt+xXqfW1OGlVBe+mxa2mqI0pg5yAHZPvFmY3Y6drSf/GQ1A86WgWEN9Kzh/WrgKa6iGcHXLg==",
      "license": "MIT"
    },
    "node_modules/proxy-from-env": {
      "version": "1.1.0",
      "resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
      "integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
      "license": "MIT"
    },
    "node_modules/yauzl": {
      "version": "2.10.0",
      "resolved": "https://registry.npmjs.org/yauzl/-/yauzl-2.10.0.tgz",
      "integrity": "sha512-p4a9I6X6nu6IhoGmBqAcbJy1mlC4j27vEPZX9F4L4/vZT3Lyq1VkFHw/V/PUcB9Buo+DG3iHkT0x3Qya58zc3g==",
      "license": "MIT",
      "dependencies": {
        "buffer-crc32": "~0.2.3",
        "fd-slicer": "~1.1.0"
      }
    }
  }
}

@@ -11,11 +11,16 @@
  },
  "files": [
    "bin",
    "vendor"
    "dist"
  ],
  "repository": {
    "type": "git",
    "url": "git+https://github.com/openai/codex.git",
    "directory": "codex-cli"
    "url": "git+https://github.com/openai/codex.git"
  },
  "dependencies": {
    "@vscode/ripgrep": "^1.15.14"
  },
  "devDependencies": {
    "prettier": "^3.3.3"
  }
}

@@ -5,7 +5,5 @@ Run the following:
To build the 0.2.x or later version of the npm module, which runs the Rust version of the CLI, build it as follows:

```bash
./codex-cli/scripts/build_npm_package.py --release-version 0.6.0
./codex-cli/scripts/stage_rust_release.py --release-version 0.6.0
```

Note this will create `./codex-cli/vendor/` as a side-effect.

@@ -1,269 +0,0 @@
#!/usr/bin/env python3
"""Stage and optionally package the @openai/codex npm module."""

import argparse
import json
import re
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path

SCRIPT_DIR = Path(__file__).resolve().parent
CODEX_CLI_ROOT = SCRIPT_DIR.parent
REPO_ROOT = CODEX_CLI_ROOT.parent
GITHUB_REPO = "openai/codex"

# The docs are not clear on what the expected value/format of
# workflow/workflowName is:
# https://cli.github.com/manual/gh_run_list
WORKFLOW_NAME = ".github/workflows/rust-release.yml"


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Build or stage the Codex CLI npm package.")
    parser.add_argument(
        "--version",
        help="Version number to write to package.json inside the staged package.",
    )
    parser.add_argument(
        "--release-version",
        help=(
            "Version to stage for npm release. When provided, the script also resolves the "
            "matching rust-release workflow unless --workflow-url is supplied."
        ),
    )
    parser.add_argument(
        "--workflow-url",
        help="Optional GitHub Actions workflow run URL used to download native binaries.",
    )
    parser.add_argument(
        "--staging-dir",
        type=Path,
        help=(
            "Directory to stage the package contents. Defaults to a new temporary directory "
            "if omitted. The directory must be empty when provided."
        ),
    )
    parser.add_argument(
        "--tmp",
        dest="staging_dir",
        type=Path,
        help=argparse.SUPPRESS,
    )
    parser.add_argument(
        "--pack-output",
        type=Path,
        help="Path where the generated npm tarball should be written.",
    )
    return parser.parse_args()


def main() -> int:
    args = parse_args()

    version = args.version
    release_version = args.release_version
    if release_version:
        if version and version != release_version:
            raise RuntimeError("--version and --release-version must match when both are provided.")
        version = release_version

    if not version:
        raise RuntimeError("Must specify --version or --release-version.")

    staging_dir, created_temp = prepare_staging_dir(args.staging_dir)

    try:
        stage_sources(staging_dir, version)

        workflow_url = args.workflow_url
        resolved_head_sha: str | None = None
        if not workflow_url:
            if release_version:
                workflow = resolve_release_workflow(version)
                workflow_url = workflow["url"]
                resolved_head_sha = workflow.get("headSha")
            else:
                workflow_url = resolve_latest_alpha_workflow_url()
        elif release_version:
            try:
                workflow = resolve_release_workflow(version)
                resolved_head_sha = workflow.get("headSha")
            except Exception:
                resolved_head_sha = None

        if release_version and resolved_head_sha:
            print(f"should `git checkout {resolved_head_sha}`")

        if not workflow_url:
            raise RuntimeError("Unable to determine workflow URL for native binaries.")

        install_native_binaries(staging_dir, workflow_url)

        if release_version:
            staging_dir_str = str(staging_dir)
            print(
                f"Staged version {version} for release in {staging_dir_str}\n\n"
                "Verify the CLI:\n"
                f"  node {staging_dir_str}/bin/codex.js --version\n"
                f"  node {staging_dir_str}/bin/codex.js --help\n\n"
            )
        else:
            print(f"Staged package in {staging_dir}")

        if args.pack_output is not None:
            output_path = run_npm_pack(staging_dir, args.pack_output)
            print(f"npm pack output written to {output_path}")
    finally:
        if created_temp:
            # Preserve the staging directory for further inspection.
            pass

    return 0


def prepare_staging_dir(staging_dir: Path | None) -> tuple[Path, bool]:
    if staging_dir is not None:
        staging_dir = staging_dir.resolve()
        staging_dir.mkdir(parents=True, exist_ok=True)
        if any(staging_dir.iterdir()):
            raise RuntimeError(f"Staging directory {staging_dir} is not empty.")
        return staging_dir, False

    temp_dir = Path(tempfile.mkdtemp(prefix="codex-npm-stage-"))
    return temp_dir, True


def stage_sources(staging_dir: Path, version: str) -> None:
    bin_dir = staging_dir / "bin"
    bin_dir.mkdir(parents=True, exist_ok=True)

    shutil.copy2(CODEX_CLI_ROOT / "bin" / "codex.js", bin_dir / "codex.js")
    rg_manifest = CODEX_CLI_ROOT / "bin" / "rg"
    if rg_manifest.exists():
        shutil.copy2(rg_manifest, bin_dir / "rg")

    readme_src = REPO_ROOT / "README.md"
    if readme_src.exists():
        shutil.copy2(readme_src, staging_dir / "README.md")

    with open(CODEX_CLI_ROOT / "package.json", "r", encoding="utf-8") as fh:
        package_json = json.load(fh)
    package_json["version"] = version

    with open(staging_dir / "package.json", "w", encoding="utf-8") as out:
        json.dump(package_json, out, indent=2)
        out.write("\n")


def install_native_binaries(staging_dir: Path, workflow_url: str | None) -> None:
    cmd = ["./scripts/install_native_deps.py"]
    if workflow_url:
        cmd.extend(["--workflow-url", workflow_url])
    cmd.append(str(staging_dir))
    subprocess.check_call(cmd, cwd=CODEX_CLI_ROOT)


def resolve_latest_alpha_workflow_url() -> str:
    version = determine_latest_alpha_version()
    workflow = resolve_release_workflow(version)
    return workflow["url"]


def determine_latest_alpha_version() -> str:
    releases = list_releases()
    best_key: tuple[int, int, int, int] | None = None
    best_version: str | None = None
    pattern = re.compile(r"^rust-v(\d+)\.(\d+)\.(\d+)-alpha\.(\d+)$")
    for release in releases:
        tag = release.get("tag_name", "")
        match = pattern.match(tag)
        if not match:
            continue
        key = tuple(int(match.group(i)) for i in range(1, 5))
        if best_key is None or key > best_key:
            best_key = key
            best_version = (
                f"{match.group(1)}.{match.group(2)}.{match.group(3)}-alpha.{match.group(4)}"
            )

    if best_version is None:
        raise RuntimeError("No alpha releases found when resolving workflow URL.")
    return best_version


def list_releases() -> list[dict]:
    stdout = subprocess.check_output(
        ["gh", "api", f"/repos/{GITHUB_REPO}/releases?per_page=100"],
        text=True,
    )
    try:
        releases = json.loads(stdout or "[]")
    except json.JSONDecodeError as exc:
        raise RuntimeError("Unable to parse releases JSON.") from exc
    if not isinstance(releases, list):
        raise RuntimeError("Unexpected response when listing releases.")
    return releases


def resolve_release_workflow(version: str) -> dict:
    stdout = subprocess.check_output(
        [
            "gh",
            "run",
            "list",
            "--branch",
            f"rust-v{version}",
            "--json",
            "workflowName,url,headSha",
            "--workflow",
            WORKFLOW_NAME,
            "--jq",
            "first(.[])",
        ],
        text=True,
    )
    workflow = json.loads(stdout or "[]")
    if not workflow:
        raise RuntimeError(f"Unable to find rust-release workflow for version {version}.")
    return workflow


def run_npm_pack(staging_dir: Path, output_path: Path) -> Path:
    output_path = output_path.resolve()
    output_path.parent.mkdir(parents=True, exist_ok=True)

    with tempfile.TemporaryDirectory(prefix="codex-npm-pack-") as pack_dir_str:
        pack_dir = Path(pack_dir_str)
        stdout = subprocess.check_output(
            ["npm", "pack", "--json", "--pack-destination", str(pack_dir)],
            cwd=staging_dir,
            text=True,
        )
        try:
            pack_output = json.loads(stdout)
        except json.JSONDecodeError as exc:
            raise RuntimeError("Failed to parse npm pack output.") from exc

        if not pack_output:
            raise RuntimeError("npm pack did not produce an output tarball.")

        tarball_name = pack_output[0].get("filename") or pack_output[0].get("name")
        if not tarball_name:
            raise RuntimeError("Unable to determine npm pack output filename.")

        tarball_path = pack_dir / tarball_name
        if not tarball_path.exists():
            raise RuntimeError(f"Expected npm pack output not found: {tarball_path}")

        shutil.move(str(tarball_path), output_path)

    return output_path


if __name__ == "__main__":
    import sys

    sys.exit(main())

@@ -1,318 +0,0 @@
#!/usr/bin/env python3
"""Install Codex native binaries (Rust CLI plus ripgrep helpers)."""

import argparse
import json
import os
import shutil
import subprocess
import tarfile
import tempfile
import zipfile
from concurrent.futures import ThreadPoolExecutor, as_completed
from pathlib import Path
from typing import Iterable, Sequence
from urllib.parse import urlparse
from urllib.request import urlopen

SCRIPT_DIR = Path(__file__).resolve().parent
CODEX_CLI_ROOT = SCRIPT_DIR.parent
DEFAULT_WORKFLOW_URL = "https://github.com/openai/codex/actions/runs/17952349351"  # rust-v0.40.0
VENDOR_DIR_NAME = "vendor"
RG_MANIFEST = CODEX_CLI_ROOT / "bin" / "rg"
CODEX_TARGETS = (
    "x86_64-unknown-linux-musl",
    "aarch64-unknown-linux-musl",
    "x86_64-apple-darwin",
    "aarch64-apple-darwin",
    "x86_64-pc-windows-msvc",
    "aarch64-pc-windows-msvc",
)

RG_TARGET_PLATFORM_PAIRS: list[tuple[str, str]] = [
    ("x86_64-unknown-linux-musl", "linux-x86_64"),
    ("aarch64-unknown-linux-musl", "linux-aarch64"),
    ("x86_64-apple-darwin", "macos-x86_64"),
    ("aarch64-apple-darwin", "macos-aarch64"),
    ("x86_64-pc-windows-msvc", "windows-x86_64"),
    ("aarch64-pc-windows-msvc", "windows-aarch64"),
]
RG_TARGET_TO_PLATFORM = {target: platform for target, platform in RG_TARGET_PLATFORM_PAIRS}
DEFAULT_RG_TARGETS = [target for target, _ in RG_TARGET_PLATFORM_PAIRS]


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Install native Codex binaries.")
    parser.add_argument(
        "--workflow-url",
        help=(
            "GitHub Actions workflow URL that produced the artifacts. Defaults to a "
            "known good run when omitted."
        ),
    )
    parser.add_argument(
        "root",
        nargs="?",
        type=Path,
        help=(
            "Directory containing package.json for the staged package. If omitted, the "
            "repository checkout is used."
        ),
    )
    return parser.parse_args()


def main() -> int:
    args = parse_args()

    codex_cli_root = (args.root or CODEX_CLI_ROOT).resolve()
    vendor_dir = codex_cli_root / VENDOR_DIR_NAME
    vendor_dir.mkdir(parents=True, exist_ok=True)

    workflow_url = (args.workflow_url or DEFAULT_WORKFLOW_URL).strip()
    if not workflow_url:
        workflow_url = DEFAULT_WORKFLOW_URL

    workflow_id = workflow_url.rstrip("/").split("/")[-1]

    with tempfile.TemporaryDirectory(prefix="codex-native-artifacts-") as artifacts_dir_str:
        artifacts_dir = Path(artifacts_dir_str)
        _download_artifacts(workflow_id, artifacts_dir)
        install_codex_binaries(artifacts_dir, vendor_dir, CODEX_TARGETS)

    fetch_rg(vendor_dir, DEFAULT_RG_TARGETS, manifest_path=RG_MANIFEST)

    print(f"Installed native dependencies into {vendor_dir}")
    return 0


def fetch_rg(
    vendor_dir: Path,
    targets: Sequence[str] | None = None,
    *,
    manifest_path: Path,
) -> list[Path]:
    """Download ripgrep binaries described by the DotSlash manifest."""

    if targets is None:
        targets = DEFAULT_RG_TARGETS

    if not manifest_path.exists():
        raise FileNotFoundError(f"DotSlash manifest not found: {manifest_path}")

    manifest = _load_manifest(manifest_path)
    platforms = manifest.get("platforms", {})

    vendor_dir.mkdir(parents=True, exist_ok=True)

    targets = list(targets)
    if not targets:
        return []

    task_configs: list[tuple[str, str, dict]] = []
    for target in targets:
        platform_key = RG_TARGET_TO_PLATFORM.get(target)
        if platform_key is None:
            raise ValueError(f"Unsupported ripgrep target '{target}'.")

        platform_info = platforms.get(platform_key)
        if platform_info is None:
            raise RuntimeError(f"Platform '{platform_key}' not found in manifest {manifest_path}.")

        task_configs.append((target, platform_key, platform_info))

    results: dict[str, Path] = {}
    max_workers = min(len(task_configs), max(1, (os.cpu_count() or 1)))

    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        future_map = {
            executor.submit(
                _fetch_single_rg,
                vendor_dir,
                target,
                platform_key,
                platform_info,
                manifest_path,
            ): target
            for target, platform_key, platform_info in task_configs
        }

        for future in as_completed(future_map):
            target = future_map[future]
            results[target] = future.result()

    return [results[target] for target in targets]


def _download_artifacts(workflow_id: str, dest_dir: Path) -> None:
    cmd = [
        "gh",
        "run",
        "download",
        "--dir",
        str(dest_dir),
        "--repo",
        "openai/codex",
        workflow_id,
    ]
    subprocess.check_call(cmd)


def install_codex_binaries(
    artifacts_dir: Path, vendor_dir: Path, targets: Iterable[str]
) -> list[Path]:
    targets = list(targets)
    if not targets:
        return []

    results: dict[str, Path] = {}
    max_workers = min(len(targets), max(1, (os.cpu_count() or 1)))

    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        future_map = {
            executor.submit(_install_single_codex_binary, artifacts_dir, vendor_dir, target): target
            for target in targets
        }

        for future in as_completed(future_map):
            target = future_map[future]
            results[target] = future.result()

    return [results[target] for target in targets]


def _install_single_codex_binary(artifacts_dir: Path, vendor_dir: Path, target: str) -> Path:
    artifact_subdir = artifacts_dir / target
    archive_name = _archive_name_for_target(target)
    archive_path = artifact_subdir / archive_name
    if not archive_path.exists():
        raise FileNotFoundError(f"Expected artifact not found: {archive_path}")

    dest_dir = vendor_dir / target / "codex"
    dest_dir.mkdir(parents=True, exist_ok=True)

    binary_name = "codex.exe" if "windows" in target else "codex"
    dest = dest_dir / binary_name
    dest.unlink(missing_ok=True)
    extract_archive(archive_path, "zst", None, dest)
    if "windows" not in target:
        dest.chmod(0o755)
    return dest


def _archive_name_for_target(target: str) -> str:
    if "windows" in target:
        return f"codex-{target}.exe.zst"
    return f"codex-{target}.zst"


def _fetch_single_rg(
    vendor_dir: Path,
    target: str,
    platform_key: str,
    platform_info: dict,
    manifest_path: Path,
) -> Path:
    providers = platform_info.get("providers", [])
    if not providers:
        raise RuntimeError(f"No providers listed for platform '{platform_key}' in {manifest_path}.")

    url = providers[0]["url"]
    archive_format = platform_info.get("format", "zst")
    archive_member = platform_info.get("path")

    dest_dir = vendor_dir / target / "path"
    dest_dir.mkdir(parents=True, exist_ok=True)

    is_windows = platform_key.startswith("win")
    binary_name = "rg.exe" if is_windows else "rg"
    dest = dest_dir / binary_name

    with tempfile.TemporaryDirectory() as tmp_dir_str:
        tmp_dir = Path(tmp_dir_str)
        archive_filename = os.path.basename(urlparse(url).path)
        download_path = tmp_dir / archive_filename
        _download_file(url, download_path)

        dest.unlink(missing_ok=True)
        extract_archive(download_path, archive_format, archive_member, dest)

    if not is_windows:
        dest.chmod(0o755)

    return dest


def _download_file(url: str, dest: Path) -> None:
    dest.parent.mkdir(parents=True, exist_ok=True)
    with urlopen(url) as response, open(dest, "wb") as out:
        shutil.copyfileobj(response, out)


def extract_archive(
    archive_path: Path,
    archive_format: str,
    archive_member: str | None,
    dest: Path,
) -> None:
    dest.parent.mkdir(parents=True, exist_ok=True)

    if archive_format == "zst":
        output_path = archive_path.parent / dest.name
        subprocess.check_call(
            ["zstd", "-f", "-d", str(archive_path), "-o", str(output_path)]
        )
        shutil.move(str(output_path), dest)
        return

    if archive_format == "tar.gz":
        if not archive_member:
            raise RuntimeError("Missing 'path' for tar.gz archive in DotSlash manifest.")
        with tarfile.open(archive_path, "r:gz") as tar:
            try:
                member = tar.getmember(archive_member)
            except KeyError as exc:
                raise RuntimeError(
                    f"Entry '{archive_member}' not found in archive {archive_path}."
                ) from exc
            tar.extract(member, path=archive_path.parent, filter="data")
        extracted = archive_path.parent / archive_member
        shutil.move(str(extracted), dest)
        return

    if archive_format == "zip":
        if not archive_member:
            raise RuntimeError("Missing 'path' for zip archive in DotSlash manifest.")
        with zipfile.ZipFile(archive_path) as archive:
            try:
                with archive.open(archive_member) as src, open(dest, "wb") as out:
                    shutil.copyfileobj(src, out)
            except KeyError as exc:
                raise RuntimeError(
                    f"Entry '{archive_member}' not found in archive {archive_path}."
                ) from exc
        return

    raise RuntimeError(f"Unsupported archive format '{archive_format}'.")


def _load_manifest(manifest_path: Path) -> dict:
|
||||
cmd = ["dotslash", "--", "parse", str(manifest_path)]
|
||||
stdout = subprocess.check_output(cmd, text=True)
|
||||
try:
|
||||
manifest = json.loads(stdout)
|
||||
except json.JSONDecodeError as exc:
|
||||
raise RuntimeError(f"Invalid DotSlash manifest output from {manifest_path}.") from exc
|
||||
|
||||
if not isinstance(manifest, dict):
|
||||
raise RuntimeError(
|
||||
f"Unexpected DotSlash manifest structure for {manifest_path}: {type(manifest)!r}"
|
||||
)
|
||||
|
||||
return manifest
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
import sys
|
||||
|
||||
sys.exit(main())
|
||||
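Taken together, the helpers above form a small pipeline: resolve the platform entry from the DotSlash manifest, download, extract, and mark executable. A minimal sketch of driving it from another script follows; the vendor path and target list here are illustrative, not part of the repo:

# Hypothetical driver for the helpers above; paths and targets are examples.
from pathlib import Path

vendor_dir = Path("vendor")
installed = fetch_rg(
    vendor_dir,
    ["x86_64-unknown-linux-musl", "aarch64-apple-darwin"],
    manifest_path=RG_MANIFEST,
)
for binary in installed:
    print(f"installed {binary}")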
91 codex-cli/scripts/install_native_deps.sh (Executable file)
@@ -0,0 +1,91 @@
#!/usr/bin/env bash

# Install native runtime dependencies for codex-cli.
#
# Usage
#   install_native_deps.sh [--workflow-url URL] [CODEX_CLI_ROOT]
#
# The optional CODEX_CLI_ROOT is the path that contains package.json. Omitting
# it installs the binaries into the repository's own bin/ folder to support
# local development.

set -euo pipefail

# ------------------
# Parse arguments
# ------------------

CODEX_CLI_ROOT=""

# Until we start publishing stable GitHub releases, we have to grab the binaries
# from the GitHub Action that created them. Update the URL below to point to the
# appropriate workflow run:
WORKFLOW_URL="https://github.com/openai/codex/actions/runs/16840150768" # rust-v0.20.0-alpha.2

while [[ $# -gt 0 ]]; do
  case "$1" in
    --workflow-url)
      shift || { echo "--workflow-url requires an argument"; exit 1; }
      if [ -n "$1" ]; then
        WORKFLOW_URL="$1"
      fi
      ;;
    *)
      if [[ -z "$CODEX_CLI_ROOT" ]]; then
        CODEX_CLI_ROOT="$1"
      else
        echo "Unexpected argument: $1" >&2
        exit 1
      fi
      ;;
  esac
  shift
done

# ----------------------------------------------------------------------------
# Determine where the binaries should be installed.
# ----------------------------------------------------------------------------

if [ -n "$CODEX_CLI_ROOT" ]; then
  # The caller supplied a release root directory.
  BIN_DIR="$CODEX_CLI_ROOT/bin"
else
  # No argument; fall back to the repo's own bin directory.
  # Resolve the path of this script, then walk up to the repo root.
  SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
  CODEX_CLI_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"
  BIN_DIR="$CODEX_CLI_ROOT/bin"
fi

# Make sure the destination directory exists.
mkdir -p "$BIN_DIR"

# ----------------------------------------------------------------------------
# Download and decompress the artifacts from the GitHub Actions workflow.
# ----------------------------------------------------------------------------

WORKFLOW_ID="${WORKFLOW_URL##*/}"

ARTIFACTS_DIR="$(mktemp -d)"
trap 'rm -rf "$ARTIFACTS_DIR"' EXIT

# NB: The GitHub CLI `gh` must be installed and authenticated.
gh run download --dir "$ARTIFACTS_DIR" --repo openai/codex "$WORKFLOW_ID"

# x64 Linux
zstd -d "$ARTIFACTS_DIR/x86_64-unknown-linux-musl/codex-x86_64-unknown-linux-musl.zst" \
     -o "$BIN_DIR/codex-x86_64-unknown-linux-musl"
# ARM64 Linux
zstd -d "$ARTIFACTS_DIR/aarch64-unknown-linux-musl/codex-aarch64-unknown-linux-musl.zst" \
     -o "$BIN_DIR/codex-aarch64-unknown-linux-musl"
# x64 macOS
zstd -d "$ARTIFACTS_DIR/x86_64-apple-darwin/codex-x86_64-apple-darwin.zst" \
     -o "$BIN_DIR/codex-x86_64-apple-darwin"
# ARM64 macOS
zstd -d "$ARTIFACTS_DIR/aarch64-apple-darwin/codex-aarch64-apple-darwin.zst" \
     -o "$BIN_DIR/codex-aarch64-apple-darwin"
# x64 Windows
zstd -d "$ARTIFACTS_DIR/x86_64-pc-windows-msvc/codex-x86_64-pc-windows-msvc.exe.zst" \
     -o "$BIN_DIR/codex-x86_64-pc-windows-msvc.exe"

echo "Installed native dependencies into $BIN_DIR"
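Note how the script derives the run id: `${WORKFLOW_URL##*/}` strips everything up to and including the last `/`. An equivalent one-liner in Python, shown only to make the parameter expansion concrete:

# Python equivalent of WORKFLOW_ID="${WORKFLOW_URL##*/}" (illustrative).
workflow_url = "https://github.com/openai/codex/actions/runs/16840150768"
workflow_id = workflow_url.rsplit("/", 1)[-1]
assert workflow_id == "16840150768"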
120 codex-cli/scripts/stage_release.sh (Executable file)
@@ -0,0 +1,120 @@
#!/usr/bin/env bash
# -----------------------------------------------------------------------------
# stage_release.sh
# -----------------------------------------------------------------------------
# Stages an npm release for @openai/codex.
#
# Usage:
#
#   --tmp <dir>          : Use <dir> instead of a freshly created temp directory.
#   --version <version>  : Version to release (defaults to a timestamp-based one).
#   --workflow-url <url> : Workflow run to download native binaries from.
#   -h|--help            : Print usage.
#
# -----------------------------------------------------------------------------

set -euo pipefail

# Helper - usage / flag parsing

usage() {
  cat <<EOF
Usage: $(basename "$0") [--tmp DIR] [--version VERSION] [--workflow-url URL]

Options
  --tmp DIR           Use DIR to stage the release (defaults to a fresh mktemp dir)
  --version           Specify the version to release (defaults to a timestamp-based version)
  --workflow-url URL  Workflow run to download native binaries from
  -h, --help          Show this help

Legacy positional argument: the first non-flag argument is still interpreted
as the temporary directory (for backwards compatibility) but is deprecated.
EOF
  exit "${1:-0}"
}

TMPDIR=""
# Default to a timestamp-based version (keep same scheme as before)
VERSION="$(printf '0.1.%d' "$(date +%y%m%d%H%M)")"
WORKFLOW_URL=""

# Manual flag parser - Bash getopts does not handle GNU long options well.
while [[ $# -gt 0 ]]; do
  case "$1" in
    --tmp)
      shift || { echo "--tmp requires an argument"; usage 1; }
      TMPDIR="$1"
      ;;
    --tmp=*)
      TMPDIR="${1#*=}"
      ;;
    --version)
      shift || { echo "--version requires an argument"; usage 1; }
      VERSION="$1"
      ;;
    --workflow-url)
      shift || { echo "--workflow-url requires an argument"; exit 1; }
      WORKFLOW_URL="$1"
      ;;
    -h|--help)
      usage 0
      ;;
    --*)
      echo "Unknown option: $1" >&2
      usage 1
      ;;
    *)
      echo "Unexpected extra argument: $1" >&2
      usage 1
      ;;
  esac
  shift
done

# Fallback when the caller did not specify a directory.
# If no directory was specified create a fresh temporary one.
if [[ -z "$TMPDIR" ]]; then
  TMPDIR="$(mktemp -d)"
fi

# Ensure the directory exists, then resolve to an absolute path.
mkdir -p "$TMPDIR"
TMPDIR="$(cd "$TMPDIR" && pwd)"

# Main build logic

echo "Staging release in $TMPDIR"

# The script lives in codex-cli/scripts/ - change into codex-cli root so that
# relative paths keep working.
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
CODEX_CLI_ROOT="$(cd "$SCRIPT_DIR/.." && pwd)"

pushd "$CODEX_CLI_ROOT" >/dev/null

# 1. Build the JS artifacts ---------------------------------------------------

# Paths inside the staged package
mkdir -p "$TMPDIR/bin"

cp -r bin/codex.js "$TMPDIR/bin/codex.js"
cp ../README.md "$TMPDIR" || true # README is one level up - ignore if missing

# Modify package.json - bump version and optionally add the native directory to
# the files array so that the binaries are published to npm.

jq --arg version "$VERSION" \
   '.version = $version' \
   package.json > "$TMPDIR/package.json"

# 2. Native runtime deps (sandbox plus optional Rust binaries)

./scripts/install_native_deps.sh --workflow-url "$WORKFLOW_URL" "$TMPDIR"

popd >/dev/null

echo "Staged version $VERSION for release in $TMPDIR"

echo "Verify the CLI:"
echo "  node ${TMPDIR}/bin/codex.js --version"
echo "  node ${TMPDIR}/bin/codex.js --help"

# Print final hint for convenience
echo "Next: cd \"$TMPDIR\" && npm publish"
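The default version string packs a timestamp into the patch component. A Python sketch of the same scheme, for reference:

# Mirrors VERSION="$(printf '0.1.%d' "$(date +%y%m%d%H%M)")" from the script above.
from datetime import datetime

version = f"0.1.{int(datetime.now().strftime('%y%m%d%H%M'))}"
print(version)  # e.g. 0.1.2508141230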
70 codex-cli/scripts/stage_rust_release.py (Executable file)
@@ -0,0 +1,70 @@
#!/usr/bin/env python3

import argparse
import json
import subprocess
import sys
from pathlib import Path


def main() -> int:
    parser = argparse.ArgumentParser(
        description="""Stage a release for the npm module.

Run this after the GitHub Release has been created and use
`--release-version` to specify the version to release.

Optionally pass `--tmp` to control the temporary staging directory that will be
forwarded to stage_release.sh.
"""
    )
    parser.add_argument(
        "--release-version", required=True, help="Version to release, e.g., 0.3.0"
    )
    parser.add_argument(
        "--tmp",
        help="Optional path to stage the npm package; forwarded to stage_release.sh",
    )
    args = parser.parse_args()
    version = args.release_version

    gh_run = subprocess.run(
        [
            "gh",
            "run",
            "list",
            "--branch",
            f"rust-v{version}",
            "--json",
            "workflowName,url,headSha",
            "--jq",
            'first(.[] | select(.workflowName == "rust-release"))',
        ],
        stdout=subprocess.PIPE,
        check=True,
    )
    gh_run.check_returncode()
    workflow = json.loads(gh_run.stdout)
    sha = workflow["headSha"]

    print(f"should `git checkout {sha}`")

    current_dir = Path(__file__).parent.resolve()
    cmd = [
        str(current_dir / "stage_release.sh"),
        "--version",
        version,
        "--workflow-url",
        workflow["url"],
    ]
    if args.tmp:
        cmd.extend(["--tmp", args.tmp])

    stage_release = subprocess.run(cmd)
    stage_release.check_returncode()

    return 0


if __name__ == "__main__":
    sys.exit(main())
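A hedged usage sketch for the staging flow above (the working directory is assumed to be the repo root; the flag is the one the script defines):

# Hypothetical invocation of stage_rust_release.py from another Python script.
import subprocess

subprocess.run(
    ["./codex-cli/scripts/stage_rust_release.py", "--release-version", "0.3.0"],
    check=True,
)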
2315 codex-rs/Cargo.lock (generated)
File diff suppressed because it is too large
@@ -1,19 +1,14 @@
[workspace]
members = [
    "backend-client",
    "ansi-escape",
    "apply-patch",
    "arg0",
    "codex-backend-openapi-models",
    "cloud-tasks",
    "cloud-tasks-client",
    "cli",
    "common",
    "core",
    "exec",
    "execpolicy",
    "file-search",
    "git-tooling",
    "linux-sandbox",
    "login",
    "mcp-client",
@@ -22,11 +17,7 @@ members = [
    "ollama",
    "protocol",
    "protocol-ts",
    "rmcp-client",
    "responses-api-proxy",
    "tui",
    "git-apply",
    "utils/readiness",
]
resolver = "2"

@@ -38,175 +29,13 @@ version = "0.0.0"
# edition.
edition = "2024"

[workspace.dependencies]
# Internal
codex-ansi-escape = { path = "ansi-escape" }
codex-apply-patch = { path = "apply-patch" }
codex-arg0 = { path = "arg0" }
codex-chatgpt = { path = "chatgpt" }
codex-common = { path = "common" }
codex-core = { path = "core" }
codex-exec = { path = "exec" }
codex-file-search = { path = "file-search" }
codex-git-tooling = { path = "git-tooling" }
codex-linux-sandbox = { path = "linux-sandbox" }
codex-login = { path = "login" }
codex-mcp-client = { path = "mcp-client" }
codex-mcp-server = { path = "mcp-server" }
codex-ollama = { path = "ollama" }
codex-protocol = { path = "protocol" }
codex-rmcp-client = { path = "rmcp-client" }
codex-protocol-ts = { path = "protocol-ts" }
codex-responses-api-proxy = { path = "responses-api-proxy" }
codex-tui = { path = "tui" }
codex-utils-readiness = { path = "utils/readiness" }
core_test_support = { path = "core/tests/common" }
mcp-types = { path = "mcp-types" }
mcp_test_support = { path = "mcp-server/tests/common" }

# External
allocative = "0.3.3"
ansi-to-tui = "7.0.0"
anyhow = "1"
arboard = "3"
askama = "0.12"
assert_cmd = "2"
async-channel = "2.3.1"
async-stream = "0.3.6"
async-trait = "0.1.89"
base64 = "0.22.1"
bytes = "1.10.1"
chrono = "0.4.42"
clap = "4"
clap_complete = "4"
color-eyre = "0.6.3"
crossterm = "0.28.1"
ctor = "0.5.0"
derive_more = "2"
diffy = "0.4.2"
dirs = "6"
dotenvy = "0.15.7"
env-flags = "0.1.1"
env_logger = "0.11.5"
eventsource-stream = "0.2.3"
escargot = "0.5"
futures = "0.3"
icu_decimal = "2.0.0"
icu_locale_core = "2.0.0"
ignore = "0.4.23"
image = { version = "^0.25.8", default-features = false }
indexmap = "2.6.0"
insta = "1.43.2"
itertools = "0.14.0"
landlock = "0.4.1"
lazy_static = "1"
libc = "0.2.175"
log = "0.4"
maplit = "1.0.2"
mime_guess = "2.0.5"
multimap = "0.10.0"
nucleo-matcher = "0.3.1"
openssl-sys = "*"
os_info = "3.12.0"
owo-colors = "4.2.0"
path-absolutize = "3.1.1"
path-clean = "1.0.1"
pathdiff = "0.2"
portable-pty = "0.9.0"
predicates = "3"
pretty_assertions = "1.4.1"
pulldown-cmark = "0.10"
rand = "0.9"
ratatui = "0.29.0"
regex-lite = "0.1.7"
reqwest = "0.12"
schemars = "0.8.22"
seccompiler = "0.5.0"
serde = "1"
serde_json = "1"
serde_with = "3.14"
sha1 = "0.10.6"
sha2 = "0.10"
shlex = "1.3.0"
similar = "2.7.0"
starlark = "0.13.0"
strum = "0.27.2"
strum_macros = "0.27.2"
supports-color = "3.0.2"
sys-locale = "0.3.2"
tempfile = "3.23.0"
textwrap = "0.16.2"
thiserror = "2.0.16"
time = "0.3"
tiny_http = "0.12"
tokio = "1"
tokio-stream = "0.1.17"
tokio-test = "0.4"
tokio-util = "0.7.16"
toml = "0.9.5"
toml_edit = "0.23.4"
tracing = "0.1.41"
tracing-appender = "0.2.3"
tracing-subscriber = "0.3.20"
tree-sitter = "0.25.9"
tree-sitter-bash = "0.25.0"
ts-rs = "11"
unicode-segmentation = "1.12.0"
unicode-width = "0.2"
url = "2"
urlencoding = "2.1"
uuid = "1"
vt100 = "0.16.2"
walkdir = "2.5.0"
webbrowser = "1.0"
which = "6"
wildmatch = "2.5.0"
wiremock = "0.6"
zeroize = "1.8.1"

[workspace.lints]
rust = {}

[workspace.lints.clippy]
expect_used = "deny"
identity_op = "deny"
manual_clamp = "deny"
manual_filter = "deny"
manual_find = "deny"
manual_flatten = "deny"
manual_map = "deny"
manual_memcpy = "deny"
manual_non_exhaustive = "deny"
manual_ok_or = "deny"
manual_range_contains = "deny"
manual_retain = "deny"
manual_strip = "deny"
manual_try_fold = "deny"
manual_unwrap_or = "deny"
needless_borrow = "deny"
needless_borrowed_reference = "deny"
needless_collect = "deny"
needless_late_init = "deny"
needless_option_as_deref = "deny"
needless_question_mark = "deny"
needless_update = "deny"
redundant_clone = "deny"
redundant_closure = "deny"
redundant_closure_for_method_calls = "deny"
redundant_static_lifetimes = "deny"
trivially_copy_pass_by_ref = "deny"
uninlined_format_args = "deny"
unnecessary_filter_map = "deny"
unnecessary_lazy_evaluations = "deny"
unnecessary_sort_by = "deny"
unnecessary_to_owned = "deny"
unwrap_used = "deny"

# cargo-shear cannot see the platform-specific openssl-sys usage, so we
# silence the false positive here instead of deleting a real dependency.
[workspace.metadata.cargo-shear]
ignored = ["openssl-sys", "codex-utils-readiness"]

[profile.release]
lto = "fat"
# Because we bundle some of these executables with the TypeScript CLI, we
@@ -19,11 +19,11 @@ While we are [working to close the gap between the TypeScript and Rust implement

### Config

Codex supports a rich set of configuration options. Note that the Rust CLI uses `config.toml` instead of `config.json`. See [`docs/config.md`](../docs/config.md) for details.
Codex supports a rich set of configuration options. Note that the Rust CLI uses `config.toml` instead of `config.json`. See [`config.md`](./config.md) for details.

### Model Context Protocol Support

Codex CLI functions as an MCP client that can connect to MCP servers on startup. See the [`mcp_servers`](../docs/config.md#mcp_servers) section in the configuration documentation for details.
Codex CLI functions as an MCP client that can connect to MCP servers on startup. See the [`mcp_servers`](./config.md#mcp_servers) section in the configuration documentation for details.

It is still experimental, but you can also launch Codex as an MCP _server_ by running `codex mcp`. Use the [`@modelcontextprotocol/inspector`](https://github.com/modelcontextprotocol/inspector) to try it out:

@@ -33,9 +33,9 @@ npx @modelcontextprotocol/inspector codex mcp

### Notifications

You can enable notifications by configuring a script that is run whenever the agent finishes a turn. The [notify documentation](../docs/config.md#notify) includes a detailed example that explains how to get desktop notifications via [terminal-notifier](https://github.com/julienXX/terminal-notifier) on macOS.
You can enable notifications by configuring a script that is run whenever the agent finishes a turn. The [notify documentation](./config.md#notify) includes a detailed example that explains how to get desktop notifications via [terminal-notifier](https://github.com/julienXX/terminal-notifier) on macOS.

### `codex exec` to run Codex programmatically/non-interactively
### `codex exec` to run Codex programmatially/non-interactively

To run Codex non-interactively, run `codex exec PROMPT` (you can also pass the prompt via `stdin`) and Codex will work on your task until it decides that it is done and exits. Output is printed to the terminal directly. You can set the `RUST_LOG` environment variable to see more about what's going on.
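As an aside, the non-interactive mode described in that README hunk is straightforward to script. A sketch, assuming `codex` is on PATH; the prompt text is just an example:

# Sketch: run `codex exec` non-interactively from Python.
import os
import subprocess

result = subprocess.run(
    ["codex", "exec", "summarize the diff in HEAD"],
    env={**os.environ, "RUST_LOG": "info"},  # RUST_LOG surfaces extra logging
    capture_output=True,
    text=True,
)
print(result.stdout)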
@@ -8,9 +8,9 @@ name = "codex_ansi_escape"
path = "src/lib.rs"

[dependencies]
ansi-to-tui = { workspace = true }
ratatui = { workspace = true, features = [
ansi-to-tui = "7.0.0"
ratatui = { version = "0.29.0", features = [
    "unstable-rendered-line-info",
    "unstable-widget-ref",
] }
tracing = { workspace = true, features = ["log"] }
tracing = { version = "0.1.41", features = ["log"] }
@@ -9,7 +9,7 @@ use ratatui::text::Text;
pub fn ansi_escape_line(s: &str) -> Line<'static> {
    let text = ansi_escape(s);
    match text.lines.as_slice() {
        [] => "".into(),
        [] => Line::from(""),
        [only] => only.clone(),
        [first, rest @ ..] => {
            tracing::warn!("ansi_escape_line: expected a single line, got {first:?} and {rest:?}");
@@ -15,13 +15,13 @@ path = "src/main.rs"
workspace = true

[dependencies]
anyhow = { workspace = true }
similar = { workspace = true }
thiserror = { workspace = true }
tree-sitter = { workspace = true }
tree-sitter-bash = { workspace = true }
anyhow = "1"
similar = "2.7.0"
thiserror = "2.0.12"
tree-sitter = "0.25.8"
tree-sitter-bash = "0.25.0"

[dev-dependencies]
assert_cmd = { workspace = true }
pretty_assertions = { workspace = true }
tempfile = { workspace = true }
assert_cmd = "2"
pretty_assertions = "1.4.1"
tempfile = "3.13.0"
@@ -6,7 +6,6 @@ use std::collections::HashMap;
use std::path::Path;
use std::path::PathBuf;
use std::str::Utf8Error;
use std::sync::LazyLock;

use anyhow::Context;
use anyhow::Result;
@@ -19,9 +18,6 @@ use similar::TextDiff;
use thiserror::Error;
use tree_sitter::LanguageError;
use tree_sitter::Parser;
use tree_sitter::Query;
use tree_sitter::QueryCursor;
use tree_sitter::StreamingIterator;
use tree_sitter_bash::LANGUAGE as BASH;

pub use standalone_executable::main;
@@ -40,11 +36,6 @@ pub enum ApplyPatchError {
    /// Error that occurs while computing replacements when applying patch chunks
    #[error("{0}")]
    ComputeReplacements(String),
    /// A raw patch body was provided without an explicit `apply_patch` invocation.
    #[error(
        "patch detected without explicit call to apply_patch. Rerun as [\"apply_patch\", \"<patch>\"]"
    )]
    ImplicitInvocation,
}

impl From<std::io::Error> for ApplyPatchError {
@@ -93,29 +84,26 @@ pub enum MaybeApplyPatch {
pub struct ApplyPatchArgs {
    pub patch: String,
    pub hunks: Vec<Hunk>,
    pub workdir: Option<String>,
}

pub fn maybe_parse_apply_patch(argv: &[String]) -> MaybeApplyPatch {
    match argv {
        // Direct invocation: apply_patch <patch>
        [cmd, body] if APPLY_PATCH_COMMANDS.contains(&cmd.as_str()) => match parse_patch(body) {
            Ok(source) => MaybeApplyPatch::Body(source),
            Err(e) => MaybeApplyPatch::PatchParseError(e),
        },
        // Bash heredoc form: (optional `cd <path> &&`) apply_patch <<'EOF' ...
        [bash, flag, script] if bash == "bash" && flag == "-lc" => {
            match extract_apply_patch_from_bash(script) {
                Ok((body, workdir)) => match parse_patch(&body) {
                    Ok(mut source) => {
                        source.workdir = workdir;
                        MaybeApplyPatch::Body(source)
                    }
        [bash, flag, script]
            if bash == "bash"
                && flag == "-lc"
                && APPLY_PATCH_COMMANDS
                    .iter()
                    .any(|cmd| script.trim_start().starts_with(cmd)) =>
        {
            match extract_heredoc_body_from_apply_patch_command(script) {
                Ok(body) => match parse_patch(&body) {
                    Ok(source) => MaybeApplyPatch::Body(source),
                    Err(e) => MaybeApplyPatch::PatchParseError(e),
                },
                Err(ExtractHeredocError::CommandDidNotStartWithApplyPatch) => {
                    MaybeApplyPatch::NotApplyPatch
                }
                Err(e) => MaybeApplyPatch::ShellParseError(e),
            }
        }
@@ -128,9 +116,7 @@ pub enum ApplyPatchFileChange {
    Add {
        content: String,
    },
    Delete {
        content: String,
    },
    Delete,
    Update {
        unified_diff: String,
        move_path: Option<PathBuf>,
@@ -214,63 +200,17 @@ impl ApplyPatchAction {
/// cwd must be an absolute path so that we can resolve relative paths in the
/// patch.
pub fn maybe_parse_apply_patch_verified(argv: &[String], cwd: &Path) -> MaybeApplyPatchVerified {
    // Detect a raw patch body passed directly as the command or as the body of a bash -lc
    // script. In these cases, report an explicit error rather than applying the patch.
    match argv {
        [body] => {
            if parse_patch(body).is_ok() {
                return MaybeApplyPatchVerified::CorrectnessError(
                    ApplyPatchError::ImplicitInvocation,
                );
            }
        }
        [bash, flag, script] if bash == "bash" && flag == "-lc" => {
            if parse_patch(script).is_ok() {
                return MaybeApplyPatchVerified::CorrectnessError(
                    ApplyPatchError::ImplicitInvocation,
                );
            }
        }
        _ => {}
    }

    match maybe_parse_apply_patch(argv) {
        MaybeApplyPatch::Body(ApplyPatchArgs {
            patch,
            hunks,
            workdir,
        }) => {
            let effective_cwd = workdir
                .as_ref()
                .map(|dir| {
                    let path = Path::new(dir);
                    if path.is_absolute() {
                        path.to_path_buf()
                    } else {
                        cwd.join(path)
                    }
                })
                .unwrap_or_else(|| cwd.to_path_buf());
        MaybeApplyPatch::Body(ApplyPatchArgs { patch, hunks }) => {
            let mut changes = HashMap::new();
            for hunk in hunks {
                let path = hunk.resolve_path(&effective_cwd);
                let path = hunk.resolve_path(cwd);
                match hunk {
                    Hunk::AddFile { contents, .. } => {
                        changes.insert(path, ApplyPatchFileChange::Add { content: contents });
                    }
                    Hunk::DeleteFile { .. } => {
                        let content = match std::fs::read_to_string(&path) {
                            Ok(content) => content,
                            Err(e) => {
                                return MaybeApplyPatchVerified::CorrectnessError(
                                    ApplyPatchError::IoError(IoError {
                                        context: format!("Failed to read {}", path.display()),
                                        source: e,
                                    }),
                                );
                            }
                        };
                        changes.insert(path, ApplyPatchFileChange::Delete { content });
                        changes.insert(path, ApplyPatchFileChange::Delete);
                    }
                    Hunk::UpdateFile {
                        move_path, chunks, ..
@@ -298,7 +238,7 @@ pub fn maybe_parse_apply_patch_verified(argv: &[String], cwd: &Path) -> MaybeApp
            MaybeApplyPatchVerified::Body(ApplyPatchAction {
                changes,
                patch,
                cwd: effective_cwd,
                cwd: cwd.to_path_buf(),
            })
        }
        MaybeApplyPatch::ShellParseError(e) => MaybeApplyPatchVerified::ShellParseError(e),
@@ -307,96 +247,33 @@ pub fn maybe_parse_apply_patch_verified(argv: &[String], cwd: &Path) -> MaybeApp
    }
}

/// Extract the heredoc body (and optional `cd` workdir) from a `bash -lc` script
/// that invokes the apply_patch tool using a heredoc.
/// Attempts to extract a heredoc_body object from a string bash command like:
/// Optimistically
///
/// Supported top-level forms (must be the only top-level statement):
/// - `apply_patch <<'EOF'\n...\nEOF`
/// - `cd <path> && apply_patch <<'EOF'\n...\nEOF`
/// ```bash
/// bash -lc 'apply_patch <<EOF\n***Begin Patch\n...EOF'
/// ```
///
/// Notes about matching:
/// - Parsed with Tree-sitter Bash and a strict query that uses anchors so the
///   heredoc-redirected statement is the only top-level statement.
/// - The connector between `cd` and `apply_patch` must be `&&` (not `|` or `||`).
/// - Exactly one positional `word` argument is allowed for `cd` (no flags, no quoted
///   strings, no second argument).
/// - The apply command is validated in-query via `#any-of?` to allow `apply_patch`
///   or `applypatch`.
/// - Preceding or trailing commands (e.g., `echo ...;` or `... && echo done`) do not match.
/// # Arguments
///
/// Returns `(heredoc_body, Some(path))` when the `cd` variant matches, or
/// `(heredoc_body, None)` for the direct form. Errors are returned if the script
/// cannot be parsed or does not match the allowed patterns.
fn extract_apply_patch_from_bash(
/// * `src` - A string slice that holds the full command
///
/// # Returns
///
/// This function returns a `Result` which is:
///
/// * `Ok(String)` - The heredoc body if the extraction is successful.
/// * `Err(anyhow::Error)` - An error if the extraction fails.
///
fn extract_heredoc_body_from_apply_patch_command(
    src: &str,
) -> std::result::Result<(String, Option<String>), ExtractHeredocError> {
    // This function uses a Tree-sitter query to recognize one of two
    // whole-script forms, each expressed as a single top-level statement:
    //
    // 1. apply_patch <<'EOF'\n...\nEOF
    // 2. cd <path> && apply_patch <<'EOF'\n...\nEOF
    //
    // Key ideas when reading the query:
    // - dots (`.`) between named nodes enforce adjacency among named children and
    //   anchor to the start/end of the expression.
    // - we match a single redirected_statement directly under program with leading
    //   and trailing anchors (`.`). This ensures it is the only top-level statement
    //   (so prefixes like `echo ...;` or suffixes like `... && echo done` do not match).
    //
    // Overall, we want to be conservative and only match the intended forms, as other
    // forms are likely to be model errors, or incorrectly interpreted by later code.
    //
    // If you're editing this query, it's helpful to start by creating a debugging binary
    // which will let you see the AST of an arbitrary bash script passed in, and optionally
    // also run an arbitrary query against the AST. This is useful for understanding
    // how tree-sitter parses the script and whether the query syntax is correct. Be sure
    // to test both positive and negative cases.
    static APPLY_PATCH_QUERY: LazyLock<Query> = LazyLock::new(|| {
        let language = BASH.into();
        #[expect(clippy::expect_used)]
        Query::new(
            &language,
            r#"
            (
              program
              . (redirected_statement
                  body: (command
                    name: (command_name (word) @apply_name) .)
                  (#any-of? @apply_name "apply_patch" "applypatch")
                  redirect: (heredoc_redirect
                    . (heredoc_start)
                    . (heredoc_body) @heredoc
                    . (heredoc_end)
                    .))
              .)

            (
              program
              . (redirected_statement
                  body: (list
                    . (command
                        name: (command_name (word) @cd_name) .
                        argument: [
                          (word) @cd_path
                          (string (string_content) @cd_path)
                          (raw_string) @cd_raw_string
                        ] .)
                    "&&"
                    . (command
                        name: (command_name (word) @apply_name))
                    .)
                  (#eq? @cd_name "cd")
                  (#any-of? @apply_name "apply_patch" "applypatch")
                  redirect: (heredoc_redirect
                    . (heredoc_start)
                    . (heredoc_body) @heredoc
                    . (heredoc_end)
                    .))
              .)
            "#,
        )
        .expect("valid bash query")
    });
) -> std::result::Result<String, ExtractHeredocError> {
    if !APPLY_PATCH_COMMANDS
        .iter()
        .any(|cmd| src.trim_start().starts_with(cmd))
    {
        return Err(ExtractHeredocError::CommandDidNotStartWithApplyPatch);
    }

    let lang = BASH.into();
    let mut parser = Parser::new();
@@ -408,55 +285,26 @@ fn extract_apply_patch_from_bash(
        .ok_or(ExtractHeredocError::FailedToParsePatchIntoAst)?;

    let bytes = src.as_bytes();
    let root = tree.root_node();
    let mut c = tree.root_node().walk();

    let mut cursor = QueryCursor::new();
    let mut matches = cursor.matches(&APPLY_PATCH_QUERY, root, bytes);
    while let Some(m) = matches.next() {
        let mut heredoc_text: Option<String> = None;
        let mut cd_path: Option<String> = None;
    loop {
        let node = c.node();
        if node.kind() == "heredoc_body" {
            let text = node
                .utf8_text(bytes)
                .map_err(ExtractHeredocError::HeredocNotUtf8)?;
            return Ok(text.trim_end_matches('\n').to_owned());
        }

        for capture in m.captures.iter() {
            let name = APPLY_PATCH_QUERY.capture_names()[capture.index as usize];
            match name {
                "heredoc" => {
                    let text = capture
                        .node
                        .utf8_text(bytes)
                        .map_err(ExtractHeredocError::HeredocNotUtf8)?
                        .trim_end_matches('\n')
                        .to_string();
                    heredoc_text = Some(text);
                }
                "cd_path" => {
                    let text = capture
                        .node
                        .utf8_text(bytes)
                        .map_err(ExtractHeredocError::HeredocNotUtf8)?
                        .to_string();
                    cd_path = Some(text);
                }
                "cd_raw_string" => {
                    let raw = capture
                        .node
                        .utf8_text(bytes)
                        .map_err(ExtractHeredocError::HeredocNotUtf8)?;
                    let trimmed = raw
                        .strip_prefix('\'')
                        .and_then(|s| s.strip_suffix('\''))
                        .unwrap_or(raw);
                    cd_path = Some(trimmed.to_string());
                }
                _ => {}
        if c.goto_first_child() {
            continue;
        }
        while !c.goto_next_sibling() {
            if !c.goto_parent() {
                return Err(ExtractHeredocError::FailedToFindHeredocBody);
            }
        }

        if let Some(heredoc) = heredoc_text {
            return Ok((heredoc, cd_path));
        }
    }

    Err(ExtractHeredocError::CommandDidNotStartWithApplyPatch)
}
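To make the heredoc handling concrete: a deliberately simplified Python sketch of the extraction idea. It uses a regex stand-in for the Tree-sitter query and AST walk shown in the diff above, so it is illustrative only and far more permissive than the real parser:

import re

# Regex stand-in for the heredoc extraction above (illustrative, not a bash parser).
HEREDOC_RE = re.compile(r"<<'?(?P<tag>\w+)'?\n(?P<body>.*)\n(?P=tag)\s*$", re.DOTALL)

def extract_heredoc_body(script: str) -> str | None:
    if not script.lstrip().startswith(("apply_patch", "applypatch")):
        return None
    match = HEREDOC_RE.search(script)
    return match.group("body") if match else None

script = "apply_patch <<'PATCH'\n*** Begin Patch\n*** End Patch\nPATCH"
assert extract_heredoc_body(script) == "*** Begin Patch\n*** End Patch"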
#[derive(Debug, PartialEq)]
@@ -648,18 +496,21 @@ fn derive_new_contents_from_chunks(
        }
    };

    let mut original_lines: Vec<String> = original_contents.split('\n').map(String::from).collect();
    let mut original_lines: Vec<String> = original_contents
        .split('\n')
        .map(|s| s.to_string())
        .collect();

    // Drop the trailing empty element that results from the final newline so
    // that line counts match the behaviour of standard `diff`.
    if original_lines.last().is_some_and(String::is_empty) {
    if original_lines.last().is_some_and(|s| s.is_empty()) {
        original_lines.pop();
    }

    let replacements = compute_replacements(&original_lines, path, chunks)?;
    let new_lines = apply_replacements(original_lines, &replacements);
    let mut new_lines = new_lines;
    if !new_lines.last().is_some_and(String::is_empty) {
    if !new_lines.last().is_some_and(|s| s.is_empty()) {
        new_lines.push(String::new());
    }
    let new_contents = new_lines.join("\n");
@@ -681,32 +532,51 @@ fn compute_replacements(
    let mut line_index: usize = 0;

    for chunk in chunks {
        // If a chunk has a `change_context`, we use seek_sequence to find it, then
        // adjust our `line_index` to continue from there.
        if let Some(ctx_line) = &chunk.change_context {
            if let Some(idx) = seek_sequence::seek_sequence(
                original_lines,
                std::slice::from_ref(ctx_line),
                line_index,
                false,
            ) {
                line_index = idx + 1;
            } else {
                return Err(ApplyPatchError::ComputeReplacements(format!(
                    "Failed to find context '{}' in {}",
                    ctx_line,
                    path.display()
                )));
        // If a chunk has context lines, we use seek_sequence to find each in order,
        // then adjust our `line_index` to continue from there.
        if !chunk.context_lines.is_empty() {
            let total = chunk.context_lines.len();
            for (i, ctx_line) in chunk.context_lines.iter().enumerate() {
                if let Some(idx) = seek_sequence::seek_sequence(
                    original_lines,
                    std::slice::from_ref(ctx_line),
                    line_index,
                    false,
                ) {
                    line_index = idx + 1;
                } else {
                    return Err(ApplyPatchError::ComputeReplacements(format!(
                        "Failed to find context {}/{}: '{}' in {}",
                        i + 1,
                        total,
                        ctx_line,
                        path.display()
                    )));
                }
            }
        }

        if chunk.old_lines.is_empty() {
            // Pure addition (no old lines). We'll add them at the end or just
            // before the final empty line if one exists.
            let insertion_idx = if original_lines.last().is_some_and(String::is_empty) {
                original_lines.len() - 1
            // Pure addition (no old lines).
            // Prefer to insert at the matched context anchor if one exists and
            // the hunk is not explicitly marked as end-of-file.
            let insertion_idx = if chunk.is_end_of_file {
                if original_lines.last().is_some_and(|s| s.is_empty()) {
                    original_lines.len() - 1
                } else {
                    original_lines.len()
                }
            } else if !chunk.context_lines.is_empty() {
                // Insert immediately after the last matched context line.
                line_index
            } else {
                original_lines.len()
                // No context provided: fall back to appending at the end (before
                // the trailing empty line if present).
                if original_lines.last().is_some_and(|s| s.is_empty()) {
                    original_lines.len() - 1
                } else {
                    original_lines.len()
                }
            };
            replacements.push((insertion_idx, 0, chunk.new_lines.clone()));
            continue;
@@ -729,11 +599,11 @@ fn compute_replacements(

        let mut new_slice: &[String] = &chunk.new_lines;

        if found.is_none() && pattern.last().is_some_and(String::is_empty) {
        if found.is_none() && pattern.last().is_some_and(|s| s.is_empty()) {
            // Retry without the trailing empty line which represents the final
            // newline in the file.
            pattern = &pattern[..pattern.len() - 1];
            if new_slice.last().is_some_and(String::is_empty) {
            if new_slice.last().is_some_and(|s| s.is_empty()) {
                new_slice = &new_slice[..new_slice.len() - 1];
            }

@@ -750,15 +620,13 @@ fn compute_replacements(
            line_index = start_idx + pattern.len();
        } else {
            return Err(ApplyPatchError::ComputeReplacements(format!(
                "Failed to find expected lines in {}:\n{}",
                path.display(),
                chunk.old_lines.join("\n"),
                "Failed to find expected lines {:?} in {}",
                chunk.old_lines,
                path.display()
            )));
        }
    }

    replacements.sort_by(|(lhs_idx, _, _), (rhs_idx, _, _)| lhs_idx.cmp(rhs_idx));

    Ok(replacements)
}
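The anchoring strategy above, seek each context line in turn and remember where the last one matched, is easy to state outside Rust. A compact Python sketch (exact-match only; the real `seek_sequence` is more tolerant):

# Sketch of sequential context anchoring: find each context line in order,
# then use the position after the last match as the insertion point.
def anchor_index(lines: list[str], context_lines: list[str]) -> int:
    idx = 0
    for ctx in context_lines:
        idx = lines.index(ctx, idx) + 1  # ValueError -> "failed to find context"
    return idx

lines = ["class BaseClass:", "    def method():", "line1", "line2"]
assert anchor_index(lines, ["class BaseClass:", "    def method():"]) == 2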
|
||||
|
||||
@@ -845,7 +713,6 @@ mod tests {
|
||||
use super::*;
|
||||
use pretty_assertions::assert_eq;
|
||||
use std::fs;
|
||||
use std::string::ToString;
|
||||
use tempfile::tempdir;
|
||||
|
||||
/// Helper to construct a patch with the given body.
|
||||
@@ -854,72 +721,7 @@ mod tests {
|
||||
}
|
||||
|
||||
fn strs_to_strings(strs: &[&str]) -> Vec<String> {
|
||||
strs.iter().map(ToString::to_string).collect()
|
||||
}
|
||||
|
||||
// Test helpers to reduce repetition when building bash -lc heredoc scripts
|
||||
fn args_bash(script: &str) -> Vec<String> {
|
||||
strs_to_strings(&["bash", "-lc", script])
|
||||
}
|
||||
|
||||
fn heredoc_script(prefix: &str) -> String {
|
||||
format!(
|
||||
"{prefix}apply_patch <<'PATCH'\n*** Begin Patch\n*** Add File: foo\n+hi\n*** End Patch\nPATCH"
|
||||
)
|
||||
}
|
||||
|
||||
fn heredoc_script_ps(prefix: &str, suffix: &str) -> String {
|
||||
format!(
|
||||
"{prefix}apply_patch <<'PATCH'\n*** Begin Patch\n*** Add File: foo\n+hi\n*** End Patch\nPATCH{suffix}"
|
||||
)
|
||||
}
|
||||
|
||||
fn expected_single_add() -> Vec<Hunk> {
|
||||
vec![Hunk::AddFile {
|
||||
path: PathBuf::from("foo"),
|
||||
contents: "hi\n".to_string(),
|
||||
}]
|
||||
}
|
||||
|
||||
fn assert_match(script: &str, expected_workdir: Option<&str>) {
|
||||
let args = args_bash(script);
|
||||
match maybe_parse_apply_patch(&args) {
|
||||
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, workdir, .. }) => {
|
||||
assert_eq!(workdir.as_deref(), expected_workdir);
|
||||
assert_eq!(hunks, expected_single_add());
|
||||
}
|
||||
result => panic!("expected MaybeApplyPatch::Body got {result:?}"),
|
||||
}
|
||||
}
|
||||
|
||||
fn assert_not_match(script: &str) {
|
||||
let args = args_bash(script);
|
||||
assert!(matches!(
|
||||
maybe_parse_apply_patch(&args),
|
||||
MaybeApplyPatch::NotApplyPatch
|
||||
));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_implicit_patch_single_arg_is_error() {
|
||||
let patch = "*** Begin Patch\n*** Add File: foo\n+hi\n*** End Patch".to_string();
|
||||
let args = vec![patch];
|
||||
let dir = tempdir().unwrap();
|
||||
assert!(matches!(
|
||||
maybe_parse_apply_patch_verified(&args, dir.path()),
|
||||
MaybeApplyPatchVerified::CorrectnessError(ApplyPatchError::ImplicitInvocation)
|
||||
));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_implicit_patch_bash_script_is_error() {
|
||||
let script = "*** Begin Patch\n*** Add File: foo\n+hi\n*** End Patch";
|
||||
let args = args_bash(script);
|
||||
let dir = tempdir().unwrap();
|
||||
assert!(matches!(
|
||||
maybe_parse_apply_patch_verified(&args, dir.path()),
|
||||
MaybeApplyPatchVerified::CorrectnessError(ApplyPatchError::ImplicitInvocation)
|
||||
));
|
||||
strs.iter().map(|s| s.to_string()).collect()
|
||||
}
|
||||
|
||||
#[test]
|
||||
@@ -934,7 +736,7 @@ mod tests {
|
||||
]);
|
||||
|
||||
match maybe_parse_apply_patch(&args) {
|
||||
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, .. }) => {
|
||||
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, patch: _ }) => {
|
||||
assert_eq!(
|
||||
hunks,
|
||||
vec![Hunk::AddFile {
|
||||
@@ -959,7 +761,7 @@ mod tests {
|
||||
]);
|
||||
|
||||
match maybe_parse_apply_patch(&args) {
|
||||
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, .. }) => {
|
||||
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, patch: _ }) => {
|
||||
assert_eq!(
|
||||
hunks,
|
||||
vec![Hunk::AddFile {
|
||||
@@ -974,7 +776,29 @@ mod tests {
|
||||
|
||||
#[test]
|
||||
fn test_heredoc() {
|
||||
assert_match(&heredoc_script(""), None);
|
||||
let args = strs_to_strings(&[
|
||||
"bash",
|
||||
"-lc",
|
||||
r#"apply_patch <<'PATCH'
|
||||
*** Begin Patch
|
||||
*** Add File: foo
|
||||
+hi
|
||||
*** End Patch
|
||||
PATCH"#,
|
||||
]);
|
||||
|
||||
match maybe_parse_apply_patch(&args) {
|
||||
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, patch: _ }) => {
|
||||
assert_eq!(
|
||||
hunks,
|
||||
vec![Hunk::AddFile {
|
||||
path: PathBuf::from("foo"),
|
||||
contents: "hi\n".to_string()
|
||||
}]
|
||||
);
|
||||
}
|
||||
result => panic!("expected MaybeApplyPatch::Body got {result:?}"),
|
||||
}
|
||||
}
|
||||
|
||||
#[test]
|
||||
@@ -991,8 +815,7 @@ PATCH"#,
|
||||
]);
|
||||
|
||||
match maybe_parse_apply_patch(&args) {
|
||||
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, workdir, .. }) => {
|
||||
assert_eq!(workdir, None);
|
||||
MaybeApplyPatch::Body(ApplyPatchArgs { hunks, patch: _ }) => {
|
||||
assert_eq!(
|
||||
hunks,
|
||||
vec![Hunk::AddFile {
|
||||
@@ -1005,69 +828,6 @@ PATCH"#,
|
||||
}
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_heredoc_with_leading_cd() {
|
||||
assert_match(&heredoc_script("cd foo && "), Some("foo"));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_cd_with_semicolon_is_ignored() {
|
||||
assert_not_match(&heredoc_script("cd foo; "));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_cd_or_apply_patch_is_ignored() {
|
||||
assert_not_match(&heredoc_script("cd bar || "));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_cd_pipe_apply_patch_is_ignored() {
|
||||
assert_not_match(&heredoc_script("cd bar | "));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_cd_single_quoted_path_with_spaces() {
|
||||
assert_match(&heredoc_script("cd 'foo bar' && "), Some("foo bar"));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_cd_double_quoted_path_with_spaces() {
|
||||
assert_match(&heredoc_script("cd \"foo bar\" && "), Some("foo bar"));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_echo_and_apply_patch_is_ignored() {
|
||||
assert_not_match(&heredoc_script("echo foo && "));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_apply_patch_with_arg_is_ignored() {
|
||||
let script = "apply_patch foo <<'PATCH'\n*** Begin Patch\n*** Add File: foo\n+hi\n*** End Patch\nPATCH";
|
||||
assert_not_match(script);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_double_cd_then_apply_patch_is_ignored() {
|
||||
assert_not_match(&heredoc_script("cd foo && cd bar && "));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_cd_two_args_is_ignored() {
|
||||
assert_not_match(&heredoc_script("cd foo bar && "));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_cd_then_apply_patch_then_extra_is_ignored() {
|
||||
let script = heredoc_script_ps("cd bar && ", " && echo done");
|
||||
assert_not_match(&script);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_echo_then_cd_and_apply_patch_is_ignored() {
|
||||
// Ensure preceding commands before the `cd && apply_patch <<...` sequence do not match.
|
||||
assert_not_match(&heredoc_script("echo foo; cd bar && "));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_add_file_hunk_creates_file_with_contents() {
|
||||
let dir = tempdir().unwrap();
|
||||
@@ -1265,33 +1025,6 @@ PATCH"#,
|
||||
assert_eq!(contents, "a\nB\nc\nd\nE\nf\ng\n");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_pure_addition_chunk_followed_by_removal() {
|
||||
let dir = tempdir().unwrap();
|
||||
let path = dir.path().join("panic.txt");
|
||||
fs::write(&path, "line1\nline2\nline3\n").unwrap();
|
||||
let patch = wrap_patch(&format!(
|
||||
r#"*** Update File: {}
|
||||
@@
|
||||
+after-context
|
||||
+second-line
|
||||
@@
|
||||
line1
|
||||
-line2
|
||||
-line3
|
||||
+line2-replacement"#,
|
||||
path.display()
|
||||
));
|
||||
let mut stdout = Vec::new();
|
||||
let mut stderr = Vec::new();
|
||||
apply_patch(&patch, &mut stdout, &mut stderr).unwrap();
|
||||
let contents = fs::read_to_string(path).unwrap();
|
||||
assert_eq!(
|
||||
contents,
|
||||
"line1\nline2-replacement\nafter-context\nsecond-line\n"
|
||||
);
|
||||
}
|
||||
|
||||
/// Ensure that patches authored with ASCII characters can update lines that
|
||||
/// contain typographic Unicode punctuation (e.g. EN DASH, NON-BREAKING
|
||||
/// HYPHEN). Historically `git apply` succeeds in such scenarios but our
|
||||
@@ -1556,6 +1289,57 @@ g
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_insert_addition_after_single_context_anchor() {
|
||||
let dir = tempdir().unwrap();
|
||||
let path = dir.path().join("single_ctx.txt");
|
||||
fs::write(&path, "class BaseClass:\n def method():\nline1\nline2\n").unwrap();
|
||||
|
||||
let patch = wrap_patch(&format!(
|
||||
r#"*** Update File: {}
|
||||
@@ class BaseClass:
|
||||
+INSERTED
|
||||
"#,
|
||||
path.display()
|
||||
));
|
||||
|
||||
let mut stdout = Vec::new();
|
||||
let mut stderr = Vec::new();
|
||||
apply_patch(&patch, &mut stdout, &mut stderr).unwrap();
|
||||
|
||||
let contents = fs::read_to_string(path).unwrap();
|
||||
assert_eq!(
|
||||
contents,
|
||||
"class BaseClass:\nINSERTED\n def method():\nline1\nline2\n"
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_insert_addition_after_multi_context_anchor() {
|
||||
let dir = tempdir().unwrap();
|
||||
let path = dir.path().join("multi_ctx.txt");
|
||||
fs::write(&path, "class BaseClass:\n def method():\nline1\nline2\n").unwrap();
|
||||
|
||||
let patch = wrap_patch(&format!(
|
||||
r#"*** Update File: {}
|
||||
@@ class BaseClass:
|
||||
@@ def method():
|
||||
+INSERTED
|
||||
"#,
|
||||
path.display()
|
||||
));
|
||||
|
||||
let mut stdout = Vec::new();
|
||||
let mut stderr = Vec::new();
|
||||
apply_patch(&patch, &mut stdout, &mut stderr).unwrap();
|
||||
|
||||
let contents = fs::read_to_string(path).unwrap();
|
||||
assert_eq!(
|
||||
contents,
|
||||
"class BaseClass:\n def method():\nINSERTED\nline1\nline2\n"
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn test_apply_patch_should_resolve_absolute_paths_in_cwd() {
|
||||
let session_dir = tempdir().unwrap();
|
||||
|
||||
@@ -69,7 +69,7 @@ pub enum Hunk {
|
||||
path: PathBuf,
|
||||
move_path: Option<PathBuf>,
|
||||
|
||||
/// Chunks should be in order, i.e. the `change_context` of one chunk
|
||||
/// Chunks should be in order, i.e. the first context line of one chunk
|
||||
/// should occur later in the file than the previous chunk.
|
||||
chunks: Vec<UpdateFileChunk>,
|
||||
},
|
||||
@@ -89,12 +89,13 @@ use Hunk::*;
|
||||
|
||||
#[derive(Debug, PartialEq, Clone)]
|
||||
pub struct UpdateFileChunk {
|
||||
/// A single line of context used to narrow down the position of the chunk
|
||||
/// (this is usually a class, method, or function definition.)
|
||||
pub change_context: Option<String>,
|
||||
/// Context lines used to narrow down the position of the chunk.
|
||||
/// Each entry is searched sequentially to progressively restrict the
|
||||
/// search to the desired region (e.g. class → method).
|
||||
pub context_lines: Vec<String>,
|
||||
|
||||
/// A contiguous block of lines that should be replaced with `new_lines`.
|
||||
/// `old_lines` must occur strictly after `change_context`.
|
||||
/// `old_lines` must occur strictly after the context.
|
||||
pub old_lines: Vec<String>,
|
||||
pub new_lines: Vec<String>,
|
||||
|
||||
@@ -175,11 +176,7 @@ fn parse_patch_text(patch: &str, mode: ParseMode) -> Result<ApplyPatchArgs, Pars
|
||||
remaining_lines = &remaining_lines[hunk_lines..]
|
||||
}
|
||||
let patch = lines.join("\n");
|
||||
Ok(ApplyPatchArgs {
|
||||
hunks,
|
||||
patch,
|
||||
workdir: None,
|
||||
})
|
||||
Ok(ApplyPatchArgs { hunks, patch })
|
||||
}
|
||||
|
||||
/// Checks the start and end lines of the patch text for `apply_patch`,
|
||||
@@ -348,32 +345,38 @@ fn parse_update_file_chunk(
|
||||
line_number,
|
||||
});
|
||||
}
|
||||
// If we see an explicit context marker @@ or @@ <context>, consume it; otherwise, optionally
|
||||
// allow treating the chunk as starting directly with diff lines.
|
||||
let (change_context, start_index) = if lines[0] == EMPTY_CHANGE_CONTEXT_MARKER {
|
||||
(None, 1)
|
||||
} else if let Some(context) = lines[0].strip_prefix(CHANGE_CONTEXT_MARKER) {
|
||||
(Some(context.to_string()), 1)
|
||||
} else {
|
||||
        if !allow_missing_context {
            return Err(InvalidHunkError {
                message: format!(
                    "Expected update hunk to start with a @@ context marker, got: '{}'",
                    lines[0]
                ),
                line_number,
            });
    let mut context_lines = Vec::new();
    let mut start_index = 0;
    let mut saw_context_marker = false;
    while start_index < lines.len() {
        if lines[start_index] == EMPTY_CHANGE_CONTEXT_MARKER {
            saw_context_marker = true;
            start_index += 1;
        } else if let Some(context) = lines[start_index].strip_prefix(CHANGE_CONTEXT_MARKER) {
            saw_context_marker = true;
            context_lines.push(context.to_string());
            start_index += 1;
        } else {
            break;
        }
        (None, 0)
    };
    }
    if !saw_context_marker && !allow_missing_context {
        return Err(InvalidHunkError {
            message: format!(
                "Expected update hunk to start with a @@ context marker, got: '{}'",
                lines[0]
            ),
            line_number,
        });
    }
    if start_index >= lines.len() {
        return Err(InvalidHunkError {
            message: "Update hunk does not contain any lines".to_string(),
            line_number: line_number + 1,
            line_number: line_number + start_index,
        });
    }
    let mut chunk = UpdateFileChunk {
        change_context,
        context_lines,
        old_lines: Vec::new(),
        new_lines: Vec::new(),
        is_end_of_file: false,
@@ -385,7 +388,7 @@ fn parse_update_file_chunk(
    if parsed_lines == 0 {
        return Err(InvalidHunkError {
            message: "Update hunk does not contain any lines".to_string(),
            line_number: line_number + 1,
            line_number: line_number + start_index,
        });
    }
    chunk.is_end_of_file = true;
@@ -415,7 +418,7 @@ fn parse_update_file_chunk(
            message: format!(
                "Unexpected line found in update hunk: '{line_contents}'. Every line should start with ' ' (context line), '+' (added line), or '-' (removed line)"
            ),
            line_number: line_number + 1,
            line_number: line_number + start_index,
        });
    }
    // Assume this is the start of the next hunk.
@@ -495,7 +498,7 @@ fn test_parse_patch() {
        path: PathBuf::from("path/update.py"),
        move_path: Some(PathBuf::from("path/update2.py")),
        chunks: vec![UpdateFileChunk {
            change_context: Some("def f():".to_string()),
            context_lines: vec!["def f():".to_string()],
            old_lines: vec![" pass".to_string()],
            new_lines: vec![" return 123".to_string()],
            is_end_of_file: false
@@ -522,7 +525,7 @@ fn test_parse_patch() {
        path: PathBuf::from("file.py"),
        move_path: None,
        chunks: vec![UpdateFileChunk {
            change_context: None,
            context_lines: Vec::new(),
            old_lines: vec![],
            new_lines: vec!["line".to_string()],
            is_end_of_file: false
@@ -552,7 +555,7 @@ fn test_parse_patch() {
        path: PathBuf::from("file2.py"),
        move_path: None,
        chunks: vec![UpdateFileChunk {
            change_context: None,
            context_lines: Vec::new(),
            old_lines: vec!["import foo".to_string()],
            new_lines: vec!["import foo".to_string(), "bar".to_string()],
            is_end_of_file: false,
@@ -572,7 +575,7 @@ fn test_parse_patch_lenient() {
        path: PathBuf::from("file2.py"),
        move_path: None,
        chunks: vec![UpdateFileChunk {
            change_context: None,
            context_lines: Vec::new(),
            old_lines: vec!["import foo".to_string()],
            new_lines: vec!["import foo".to_string(), "bar".to_string()],
            is_end_of_file: false,
@@ -590,8 +593,7 @@ fn test_parse_patch_lenient() {
        parse_patch_text(&patch_text_in_heredoc, ParseMode::Lenient),
        Ok(ApplyPatchArgs {
            hunks: expected_patch.clone(),
            patch: patch_text.to_string(),
            workdir: None,
            patch: patch_text.to_string()
        })
    );

@@ -604,8 +606,7 @@ fn test_parse_patch_lenient() {
        parse_patch_text(&patch_text_in_single_quoted_heredoc, ParseMode::Lenient),
        Ok(ApplyPatchArgs {
            hunks: expected_patch.clone(),
            patch: patch_text.to_string(),
            workdir: None,
            patch: patch_text.to_string()
        })
    );

@@ -617,9 +618,8 @@ fn test_parse_patch_lenient() {
    assert_eq!(
        parse_patch_text(&patch_text_in_double_quoted_heredoc, ParseMode::Lenient),
        Ok(ApplyPatchArgs {
            hunks: expected_patch,
            patch: patch_text.to_string(),
            workdir: None,
            hunks: expected_patch.clone(),
            patch: patch_text.to_string()
        })
    );

@@ -637,7 +637,7 @@ fn test_parse_patch_lenient() {
        "<<EOF\n*** Begin Patch\n*** Update File: file2.py\nEOF\n".to_string();
    assert_eq!(
        parse_patch_text(&patch_text_with_missing_closing_heredoc, ParseMode::Strict),
        Err(expected_error)
        Err(expected_error.clone())
    );
    assert_eq!(
        parse_patch_text(&patch_text_with_missing_closing_heredoc, ParseMode::Lenient),
@@ -708,7 +708,7 @@ fn test_update_file_chunk() {
        ),
        Ok((
            (UpdateFileChunk {
                change_context: Some("change_context".to_string()),
                context_lines: vec!["change_context".to_string()],
                old_lines: vec![
                    "".to_string(),
                    "context".to_string(),
@@ -730,7 +730,7 @@ fn test_update_file_chunk() {
        parse_update_file_chunk(&["@@", "+line", "*** End of File"], 123, false),
        Ok((
            (UpdateFileChunk {
                change_context: None,
                context_lines: Vec::new(),
                old_lines: vec![],
                new_lines: vec!["line".to_string()],
                is_end_of_file: true
@@ -738,4 +738,29 @@ fn test_update_file_chunk() {
            3
        ))
    );
    assert_eq!(
        parse_update_file_chunk(
            &[
                "@@ class BaseClass",
                "@@ def method()",
                " context",
                "-old",
                "+new",
            ],
            123,
            false
        ),
        Ok((
            (UpdateFileChunk {
                context_lines: vec![
                    "class BaseClass".to_string(),
                    " def method()".to_string()
                ],
                old_lines: vec!["context".to_string(), "old".to_string()],
                new_lines: vec!["context".to_string(), "new".to_string()],
                is_end_of_file: false
            }),
            5
        ))
    );
}

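The hunks above all exercise the `*** Begin Patch` envelope that `parse_patch_text` consumes. As a minimal sketch (the fixture strings are assumed, not copied from the test file; it assumes `EMPTY_CHANGE_CONTEXT_MARKER` is `"@@"` and `CHANGE_CONTEXT_MARKER` is `"@@ "`), a patch that would produce the `path/update.py` chunk asserted in `test_parse_patch` looks like:

```rust
// Hypothetical illustration of the patch envelope parsed above.
let patch = "*** Begin Patch\n\
             *** Update File: path/update.py\n\
             *** Move to: path/update2.py\n\
             @@ def f():\n\
             -    pass\n\
             +    return 123\n\
             *** End Patch";
// Expected (per test_parse_patch above): one UpdateFileChunk with
// change_context Some("def f():"), old lines containing "pass" and
// new lines containing "return 123".
let parsed = parse_patch_text(patch, ParseMode::Strict);
assert!(parsed.is_ok());
```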
@@ -112,10 +112,9 @@ pub(crate) fn seek_sequence(
#[cfg(test)]
mod tests {
    use super::seek_sequence;
    use std::string::ToString;

    fn to_vec(strings: &[&str]) -> Vec<String> {
        strings.iter().map(ToString::to_string).collect()
        strings.iter().map(|s| s.to_string()).collect()
    }

    #[test]

@@ -11,10 +11,10 @@ path = "src/lib.rs"
workspace = true

[dependencies]
anyhow = { workspace = true }
codex-apply-patch = { workspace = true }
codex-core = { workspace = true }
codex-linux-sandbox = { workspace = true }
dotenvy = { workspace = true }
tempfile = { workspace = true }
tokio = { workspace = true, features = ["rt-multi-thread"] }
anyhow = "1"
codex-apply-patch = { path = "../apply-patch" }
codex-core = { path = "../core" }
codex-linux-sandbox = { path = "../linux-sandbox" }
dotenvy = "0.15.7"
tempfile = "3"
tokio = { version = "1", features = ["rt-multi-thread"] }

@@ -21,7 +21,8 @@ const MISSPELLED_APPLY_PATCH_ARG0: &str = "applypatch";
/// `codex-linux-sandbox` we *directly* execute
/// [`codex_linux_sandbox::run_main`] (which never returns). Otherwise we:
///
/// 1. Load `.env` values from `~/.codex/.env` before creating any threads.
/// 1. Use [`dotenvy::from_path`] and [`dotenvy::dotenv`] to modify the
///    environment before creating any threads.
/// 2. Construct a Tokio multi-thread runtime.
/// 3. Derive the path to the current executable (so children can re-invoke the
///    sandbox) when running on Linux.
@@ -54,7 +55,7 @@ where

    let argv1 = args.next().unwrap_or_default();
    if argv1 == CODEX_APPLY_PATCH_ARG1 {
        let patch_arg = args.next().and_then(|s| s.to_str().map(str::to_owned));
        let patch_arg = args.next().and_then(|s| s.to_str().map(|s| s.to_owned()));
        let exit_code = match patch_arg {
            Some(patch_arg) => {
                let mut stdout = std::io::stdout();
@@ -105,7 +106,7 @@ where

const ILLEGAL_ENV_VAR_PREFIX: &str = "CODEX_";

/// Load env vars from ~/.codex/.env.
/// Load env vars from ~/.codex/.env and `$(pwd)/.env`.
///
/// Security: Do not allow `.env` files to create or modify any variables
/// with names starting with `CODEX_`.
@@ -115,6 +116,10 @@ fn load_dotenv() {
    {
        set_filtered(iter);
    }

    if let Ok(iter) = dotenvy::dotenv_iter() {
        set_filtered(iter);
    }
}

/// Helper to set vars from a dotenvy iterator while filtering out `CODEX_` keys.

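The helper itself falls outside this hunk; a minimal sketch of what `set_filtered` must do, given the `ILLEGAL_ENV_VAR_PREFIX` guard documented above (signature assumed, the real definition may differ):

```rust
// A sketch only, assuming dotenvy's iterator item type.
fn set_filtered<I>(iter: I)
where
    I: IntoIterator<Item = Result<(String, String), dotenvy::Error>>,
{
    for (key, value) in iter.into_iter().flatten() {
        // Security: never let a .env file create or modify CODEX_* variables.
        if !key.starts_with(ILLEGAL_ENV_VAR_PREFIX) {
            // set_var is unsafe in edition 2024; safe here because run_main
            // loads .env before any threads are created (see doc comment).
            unsafe { std::env::set_var(&key, &value) };
        }
    }
}
```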
@@ -1,16 +0,0 @@
[package]
name = "codex-backend-client"
version = "0.0.0"
edition = "2024"
publish = false

[lib]
path = "src/lib.rs"

[dependencies]
anyhow = "1"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
reqwest = { version = "0.12", default-features = false, features = ["json", "rustls-tls"] }
tokio = { version = "1", features = ["macros", "rt"] }
codex-backend-openapi-models = { path = "../codex-backend-openapi-models" }
@@ -1,242 +0,0 @@
use crate::types::CodeTaskDetailsResponse;
use crate::types::PaginatedListTaskListItem;
use crate::types::TurnAttemptsSiblingTurnsResponse;
use anyhow::Result;
use reqwest::header::AUTHORIZATION;
use reqwest::header::CONTENT_TYPE;
use reqwest::header::HeaderMap;
use reqwest::header::HeaderName;
use reqwest::header::HeaderValue;
use reqwest::header::USER_AGENT;
use serde::de::DeserializeOwned;

#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum PathStyle {
    CodexApi, // /api/codex/...
    ChatGptApi, // /wham/...
}

impl PathStyle {
    pub fn from_base_url(base_url: &str) -> Self {
        if base_url.contains("/backend-api") {
            PathStyle::ChatGptApi
        } else {
            PathStyle::CodexApi
        }
    }
}

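A quick illustration of the heuristic (URLs are placeholders for illustration):

```rust
// Base URLs containing "/backend-api" select the ChatGPT-style (/wham) paths;
// everything else falls back to the Codex API style.
assert_eq!(
    PathStyle::from_base_url("https://chatgpt.com/backend-api"),
    PathStyle::ChatGptApi
);
assert_eq!(
    PathStyle::from_base_url("https://example.com"),
    PathStyle::CodexApi
);
```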
#[derive(Clone, Debug)]
pub struct Client {
    base_url: String,
    http: reqwest::Client,
    bearer_token: Option<String>,
    user_agent: Option<HeaderValue>,
    chatgpt_account_id: Option<String>,
    path_style: PathStyle,
}

impl Client {
    pub fn new(base_url: impl Into<String>) -> Result<Self> {
        let mut base_url = base_url.into();
        // Normalize common ChatGPT hostnames to include /backend-api so we hit the WHAM paths.
        // Also trim trailing slashes for consistent URL building.
        while base_url.ends_with('/') {
            base_url.pop();
        }
        if (base_url.starts_with("https://chatgpt.com")
            || base_url.starts_with("https://chat.openai.com"))
            && !base_url.contains("/backend-api")
        {
            base_url = format!("{base_url}/backend-api");
        }
        let http = reqwest::Client::builder().build()?;
        let path_style = PathStyle::from_base_url(&base_url);
        Ok(Self {
            base_url,
            http,
            bearer_token: None,
            user_agent: None,
            chatgpt_account_id: None,
            path_style,
        })
    }

    pub fn with_bearer_token(mut self, token: impl Into<String>) -> Self {
        self.bearer_token = Some(token.into());
        self
    }

    pub fn with_user_agent(mut self, ua: impl Into<String>) -> Self {
        if let Ok(hv) = HeaderValue::from_str(&ua.into()) {
            self.user_agent = Some(hv);
        }
        self
    }

    pub fn with_chatgpt_account_id(mut self, account_id: impl Into<String>) -> Self {
        self.chatgpt_account_id = Some(account_id.into());
        self
    }

    pub fn with_path_style(mut self, style: PathStyle) -> Self {
        self.path_style = style;
        self
    }

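Taken together, the constructor and the `with_*` methods give a small consuming-builder API; a usage sketch (token and account id are placeholders):

```rust
// Hypothetical values throughout; Client::new returns anyhow::Result<Self>.
let client = Client::new("https://chatgpt.com")? // normalized to .../backend-api
    .with_bearer_token("<access-token>")
    .with_user_agent("codex-cli/0.0.0")
    .with_chatgpt_account_id("<account-id>");
```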
    fn headers(&self) -> HeaderMap {
        let mut h = HeaderMap::new();
        if let Some(ua) = &self.user_agent {
            h.insert(USER_AGENT, ua.clone());
        } else {
            h.insert(USER_AGENT, HeaderValue::from_static("codex-cli"));
        }
        if let Some(token) = &self.bearer_token {
            let value = format!("Bearer {token}");
            if let Ok(hv) = HeaderValue::from_str(&value) {
                h.insert(AUTHORIZATION, hv);
            }
        }
        if let Some(acc) = &self.chatgpt_account_id
            && let Ok(name) = HeaderName::from_bytes(b"ChatGPT-Account-Id")
            && let Ok(hv) = HeaderValue::from_str(acc)
        {
            h.insert(name, hv);
        }
        h
    }

    async fn exec_request(
        &self,
        req: reqwest::RequestBuilder,
        method: &str,
        url: &str,
    ) -> Result<(String, String)> {
        let res = req.send().await?;
        let status = res.status();
        let ct = res
            .headers()
            .get(CONTENT_TYPE)
            .and_then(|v| v.to_str().ok())
            .unwrap_or("")
            .to_string();
        let body = res.text().await.unwrap_or_default();
        if !status.is_success() {
            anyhow::bail!("{method} {url} failed: {status}; content-type={ct}; body={body}");
        }
        Ok((body, ct))
    }

    fn decode_json<T: DeserializeOwned>(&self, url: &str, ct: &str, body: &str) -> Result<T> {
        match serde_json::from_str::<T>(body) {
            Ok(v) => Ok(v),
            Err(e) => {
                anyhow::bail!("Decode error for {url}: {e}; content-type={ct}; body={body}");
            }
        }
    }

    pub async fn list_tasks(
        &self,
        limit: Option<i32>,
        task_filter: Option<&str>,
        environment_id: Option<&str>,
    ) -> Result<PaginatedListTaskListItem> {
        let url = match self.path_style {
            PathStyle::CodexApi => format!("{}/api/codex/tasks/list", self.base_url),
            PathStyle::ChatGptApi => format!("{}/wham/tasks/list", self.base_url),
        };
        let req = self.http.get(&url).headers(self.headers());
        let req = if let Some(lim) = limit {
            req.query(&[("limit", lim)])
        } else {
            req
        };
        let req = if let Some(tf) = task_filter {
            req.query(&[("task_filter", tf)])
        } else {
            req
        };
        let req = if let Some(id) = environment_id {
            req.query(&[("environment_id", id)])
        } else {
            req
        };
        let (body, ct) = self.exec_request(req, "GET", &url).await?;
        self.decode_json::<PaginatedListTaskListItem>(&url, &ct, &body)
    }

    pub async fn get_task_details(&self, task_id: &str) -> Result<CodeTaskDetailsResponse> {
        let (parsed, _body, _ct) = self.get_task_details_with_body(task_id).await?;
        Ok(parsed)
    }

    pub async fn get_task_details_with_body(
        &self,
        task_id: &str,
    ) -> Result<(CodeTaskDetailsResponse, String, String)> {
        let url = match self.path_style {
            PathStyle::CodexApi => format!("{}/api/codex/tasks/{}", self.base_url, task_id),
            PathStyle::ChatGptApi => format!("{}/wham/tasks/{}", self.base_url, task_id),
        };
        let req = self.http.get(&url).headers(self.headers());
        let (body, ct) = self.exec_request(req, "GET", &url).await?;
        let parsed: CodeTaskDetailsResponse = self.decode_json(&url, &ct, &body)?;
        Ok((parsed, body, ct))
    }

    pub async fn list_sibling_turns(
        &self,
        task_id: &str,
        turn_id: &str,
    ) -> Result<TurnAttemptsSiblingTurnsResponse> {
        let url = match self.path_style {
            PathStyle::CodexApi => format!(
                "{}/api/codex/tasks/{}/turns/{}/sibling_turns",
                self.base_url, task_id, turn_id
            ),
            PathStyle::ChatGptApi => format!(
                "{}/wham/tasks/{}/turns/{}/sibling_turns",
                self.base_url, task_id, turn_id
            ),
        };
        let req = self.http.get(&url).headers(self.headers());
        let (body, ct) = self.exec_request(req, "GET", &url).await?;
        self.decode_json::<TurnAttemptsSiblingTurnsResponse>(&url, &ct, &body)
    }

    /// Create a new task (user turn) by POSTing to the appropriate backend path
    /// based on `path_style`. Returns the created task id.
    pub async fn create_task(&self, request_body: serde_json::Value) -> Result<String> {
        let url = match self.path_style {
            PathStyle::CodexApi => format!("{}/api/codex/tasks", self.base_url),
            PathStyle::ChatGptApi => format!("{}/wham/tasks", self.base_url),
        };
        let req = self
            .http
            .post(&url)
            .headers(self.headers())
            .header(CONTENT_TYPE, HeaderValue::from_static("application/json"))
            .json(&request_body);
        let (body, ct) = self.exec_request(req, "POST", &url).await?;
        // Extract id from JSON: prefer `task.id`; fallback to top-level `id` when present.
        match serde_json::from_str::<serde_json::Value>(&body) {
            Ok(v) => {
                if let Some(id) = v
                    .get("task")
                    .and_then(|t| t.get("id"))
                    .and_then(|s| s.as_str())
                {
                    Ok(id.to_string())
                } else if let Some(id) = v.get("id").and_then(|s| s.as_str()) {
                    Ok(id.to_string())
                } else {
                    anyhow::bail!(
                        "POST {url} succeeded but no task id found; content-type={ct}; body={body}"
                    );
                }
            }
            Err(e) => anyhow::bail!("Decode error for {url}: {e}; content-type={ct}; body={body}"),
        }
    }
}

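A sketch of driving the client end to end; the request-body shape is a placeholder, since the backend's actual schema is not shown in this file:

```rust
// Hypothetical flow: create a task, then fetch its details.
let task_id = client
    .create_task(serde_json::json!({ "input": "placeholder request body" }))
    .await?;
let details = client.get_task_details(&task_id).await?;
```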
@@ -1,9 +0,0 @@
mod client;
pub mod types;

pub use client::Client;
pub use types::CodeTaskDetailsResponse;
pub use types::CodeTaskDetailsResponseExt;
pub use types::PaginatedListTaskListItem;
pub use types::TaskListItem;
pub use types::TurnAttemptsSiblingTurnsResponse;
@@ -1,141 +0,0 @@
pub use codex_backend_openapi_models::models::CodeTaskDetailsResponse;
pub use codex_backend_openapi_models::models::PaginatedListTaskListItem;
pub use codex_backend_openapi_models::models::TaskListItem;

use serde::Deserialize;
use serde_json::Value;

/// Extension helpers on generated types.
pub trait CodeTaskDetailsResponseExt {
    /// Attempt to extract a unified diff string from `current_diff_task_turn`.
    fn unified_diff(&self) -> Option<String>;
    /// Extract assistant text output messages (no diff) from current turns.
    fn assistant_text_messages(&self) -> Vec<String>;
    /// Extract the user's prompt text from the current user turn, when present.
    fn user_text_prompt(&self) -> Option<String>;
    /// Extract an assistant error message (if the turn failed and provided one).
    fn assistant_error_message(&self) -> Option<String>;
}
impl CodeTaskDetailsResponseExt for CodeTaskDetailsResponse {
    fn unified_diff(&self) -> Option<String> {
        // `current_diff_task_turn` is an object; look for `output_items`.
        // Prefer explicit diff turn; fallback to assistant turn if needed.
        let candidates: [&Option<std::collections::HashMap<String, Value>>; 2] =
            [&self.current_diff_task_turn, &self.current_assistant_turn];

        for map in candidates {
            let items = map
                .as_ref()
                .and_then(|m| m.get("output_items"))
                .and_then(|v| v.as_array());
            if let Some(items) = items {
                for item in items {
                    match item.get("type").and_then(Value::as_str) {
                        Some("output_diff") => {
                            if let Some(s) = item.get("diff").and_then(Value::as_str) {
                                return Some(s.to_string());
                            }
                        }
                        Some("pr") => {
                            if let Some(s) = item
                                .get("output_diff")
                                .and_then(|od| od.get("diff"))
                                .and_then(Value::as_str)
                            {
                                return Some(s.to_string());
                            }
                        }
                        _ => {}
                    }
                }
            }
        }
        None
    }
    fn assistant_text_messages(&self) -> Vec<String> {
        let mut out = Vec::new();
        let candidates: [&Option<std::collections::HashMap<String, Value>>; 2] =
            [&self.current_diff_task_turn, &self.current_assistant_turn];
        for map in candidates {
            let items = map
                .as_ref()
                .and_then(|m| m.get("output_items"))
                .and_then(|v| v.as_array());
            if let Some(items) = items {
                for item in items {
                    if item.get("type").and_then(Value::as_str) == Some("message")
                        && let Some(content) = item.get("content").and_then(Value::as_array)
                    {
                        for part in content {
                            if part.get("content_type").and_then(Value::as_str) == Some("text")
                                && let Some(txt) = part.get("text").and_then(Value::as_str)
                            {
                                out.push(txt.to_string());
                            }
                        }
                    }
                }
            }
        }
        out
    }

    fn user_text_prompt(&self) -> Option<String> {
        use serde_json::Value;
        let map = self.current_user_turn.as_ref()?;
        let items = map.get("input_items").and_then(Value::as_array)?;
        let mut parts: Vec<String> = Vec::new();
        for item in items {
            if item.get("type").and_then(Value::as_str) == Some("message") {
                // optional role filter (prefer user)
                let is_user = item
                    .get("role")
                    .and_then(Value::as_str)
                    .map(|r| r.eq_ignore_ascii_case("user"))
                    .unwrap_or(true);
                if !is_user {
                    continue;
                }
                if let Some(content) = item.get("content").and_then(Value::as_array) {
                    for c in content {
                        if c.get("content_type").and_then(Value::as_str) == Some("text")
                            && let Some(txt) = c.get("text").and_then(Value::as_str)
                        {
                            parts.push(txt.to_string());
                        }
                    }
                }
            }
        }
        if parts.is_empty() {
            None
        } else {
            Some(parts.join("\n\n"))
        }
    }

    fn assistant_error_message(&self) -> Option<String> {
        let map = self.current_assistant_turn.as_ref()?;
        let err = map.get("error")?.as_object()?;
        let message = err.get("message").and_then(Value::as_str).unwrap_or("");
        let code = err.get("code").and_then(Value::as_str).unwrap_or("");
        if message.is_empty() && code.is_empty() {
            None
        } else if message.is_empty() {
            Some(code.to_string())
        } else if code.is_empty() {
            Some(message.to_string())
        } else {
            Some(format!("{code}: {message}"))
        }
    }
}

// Removed unused helpers `single_file_paths` and `extract_file_paths_list` to reduce
// surface area; reintroduce as needed near call sites.

#[derive(Clone, Debug, Deserialize)]
pub struct TurnAttemptsSiblingTurnsResponse {
    #[serde(default)]
    pub sibling_turns: Vec<std::collections::HashMap<String, Value>>,
}
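A usage sketch for the extension trait (the task id and the `client` value are placeholders from the earlier sketches):

```rust
// Hypothetical: pull a unified diff out of a fetched task, falling back to
// any error the assistant turn reported.
let details = client.get_task_details("task_123").await?;
match details.unified_diff() {
    Some(diff) => println!("{diff}"),
    None => {
        if let Some(err) = details.assistant_error_message() {
            eprintln!("task failed: {err}");
        }
    }
}
```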
@@ -7,14 +7,15 @@ version = { workspace = true }
workspace = true

[dependencies]
anyhow = { workspace = true }
clap = { workspace = true, features = ["derive"] }
codex-common = { workspace = true, features = ["cli"] }
codex-core = { workspace = true }
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true }
tokio = { workspace = true, features = ["full"] }
codex-git-apply = { path = "../git-apply" }
anyhow = "1"
clap = { version = "4", features = ["derive"] }
codex-common = { path = "../common", features = ["cli"] }
codex-core = { path = "../core" }
codex-login = { path = "../login" }
reqwest = { version = "0.12", features = ["json", "stream"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
tokio = { version = "1", features = ["full"] }

[dev-dependencies]
tempfile = { workspace = true }
tempfile = "3"

@@ -56,24 +56,46 @@ pub async fn apply_diff_from_task(
}

async fn apply_diff(diff: &str, cwd: Option<PathBuf>) -> anyhow::Result<()> {
    let cwd = cwd.unwrap_or(std::env::current_dir().unwrap_or_else(|_| std::env::temp_dir()));
    let req = codex_git_apply::ApplyGitRequest {
        cwd,
        diff: diff.to_string(),
        revert: false,
        preflight: false,
    };
    let res = codex_git_apply::apply_git_patch(&req)?;
    if res.exit_code != 0 {
    let mut cmd = tokio::process::Command::new("git");
    if let Some(cwd) = cwd {
        cmd.current_dir(cwd);
    }
    let toplevel_output = cmd
        .args(vec!["rev-parse", "--show-toplevel"])
        .output()
        .await?;

    if !toplevel_output.status.success() {
        anyhow::bail!("apply must be run from a git repository.");
    }

    let repo_root = String::from_utf8(toplevel_output.stdout)?
        .trim()
        .to_string();

    let mut git_apply_cmd = tokio::process::Command::new("git")
        .args(vec!["apply", "--3way"])
        .current_dir(&repo_root)
        .stdin(std::process::Stdio::piped())
        .stdout(std::process::Stdio::piped())
        .stderr(std::process::Stdio::piped())
        .spawn()?;

    if let Some(mut stdin) = git_apply_cmd.stdin.take() {
        tokio::io::AsyncWriteExt::write_all(&mut stdin, diff.as_bytes()).await?;
        drop(stdin);
    }

    let output = git_apply_cmd.wait_with_output().await?;

    if !output.status.success() {
        anyhow::bail!(
            "Git apply failed (applied={}, skipped={}, conflicts={})\nstdout:\n{}\nstderr:\n{}",
            res.applied_paths.len(),
            res.skipped_paths.len(),
            res.conflicted_paths.len(),
            res.stdout,
            res.stderr
            "Git apply failed with status {}: {}",
            output.status,
            String::from_utf8_lossy(&output.stderr)
        );
    }

    println!("Successfully applied diff");
    Ok(())
}

@@ -1,5 +1,5 @@
use codex_core::config::Config;
use codex_core::default_client::create_client;
use codex_core::user_agent::get_codex_user_agent;

use crate::chatgpt_token::get_chatgpt_token_data;
use crate::chatgpt_token::init_chatgpt_token_from_auth;
@@ -16,7 +16,7 @@ pub(crate) async fn chatgpt_get_request<T: DeserializeOwned>(
    init_chatgpt_token_from_auth(&config.codex_home).await?;

    // Make direct HTTP request to ChatGPT backend API with the token
    let client = create_client();
    let client = reqwest::Client::new();
    let url = format!("{chatgpt_base_url}{path}");

    let token =
@@ -31,6 +31,7 @@ pub(crate) async fn chatgpt_get_request<T: DeserializeOwned>(
        .bearer_auth(&token.access_token)
        .header("chatgpt-account-id", account_id?)
        .header("Content-Type", "application/json")
        .header("User-Agent", get_codex_user_agent(None))
        .send()
        .await
        .context("Failed to send request")?;
@@ -44,6 +45,6 @@ pub(crate) async fn chatgpt_get_request<T: DeserializeOwned>(
    } else {
        let status = response.status();
        let body = response.text().await.unwrap_or_default();
        anyhow::bail!("Request failed with status {status}: {body}")
        anyhow::bail!("Request failed with status {}: {}", status, body)
    }
}

@@ -1,9 +1,10 @@
use codex_core::CodexAuth;
use codex_login::AuthMode;
use codex_login::CodexAuth;
use std::path::Path;
use std::sync::LazyLock;
use std::sync::RwLock;

use codex_core::token_data::TokenData;
use codex_login::TokenData;

static CHATGPT_TOKEN: LazyLock<RwLock<Option<TokenData>>> = LazyLock::new(|| RwLock::new(None));

@@ -19,7 +20,7 @@ pub fn set_chatgpt_token_data(value: TokenData) {

/// Initialize the ChatGPT token from auth.json file
pub async fn init_chatgpt_token_from_auth(codex_home: &Path) -> std::io::Result<()> {
    let auth = CodexAuth::from_codex_home(codex_home)?;
    let auth = CodexAuth::from_codex_home(codex_home, AuthMode::ChatGPT)?;
    if let Some(auth) = auth {
        let token_data = auth.get_token_data().await?;
        set_chatgpt_token_data(token_data);

@@ -15,46 +15,26 @@ path = "src/lib.rs"
workspace = true

[dependencies]
anyhow = { workspace = true }
clap = { workspace = true, features = ["derive"] }
clap_complete = { workspace = true }
codex-arg0 = { workspace = true }
codex-chatgpt = { workspace = true }
codex-common = { workspace = true, features = ["cli"] }
codex-core = { workspace = true }
codex-exec = { workspace = true }
codex-login = { workspace = true }
codex-mcp-server = { workspace = true }
codex-protocol = { workspace = true }
codex-protocol-ts = { workspace = true }
codex-responses-api-proxy = { workspace = true }
codex-tui = { workspace = true }
codex-cloud-tasks = { path = "../cloud-tasks" }
ctor = { workspace = true }
owo-colors = { workspace = true }
serde_json = { workspace = true }
supports-color = { workspace = true }
tokio = { workspace = true, features = [
anyhow = "1"
clap = { version = "4", features = ["derive"] }
clap_complete = "4"
codex-arg0 = { path = "../arg0" }
codex-chatgpt = { path = "../chatgpt" }
codex-common = { path = "../common", features = ["cli"] }
codex-core = { path = "../core" }
codex-exec = { path = "../exec" }
codex-login = { path = "../login" }
codex-mcp-server = { path = "../mcp-server" }
codex-protocol = { path = "../protocol" }
codex-tui = { path = "../tui" }
serde_json = "1"
tokio = { version = "1", features = [
    "io-std",
    "macros",
    "process",
    "rt-multi-thread",
    "signal",
] }
tracing = { workspace = true }
tracing-subscriber = { workspace = true }

[target.'cfg(target_os = "linux")'.dependencies]
libc = { workspace = true }

[target.'cfg(target_os = "android")'.dependencies]
libc = { workspace = true }

[target.'cfg(target_os = "macos")'.dependencies]
libc = { workspace = true }

[dev-dependencies]
assert_cmd = { workspace = true }
predicates = { workspace = true }
pretty_assertions = { workspace = true }
tempfile = { workspace = true }
tracing = "0.1.41"
tracing-subscriber = "0.3.19"
codex-protocol-ts = { path = "../protocol-ts" }

@@ -64,6 +64,7 @@ async fn run_command_under_sandbox(
    sandbox_type: SandboxType,
) -> anyhow::Result<()> {
    let sandbox_mode = create_sandbox_mode(full_auto);
    let cwd = std::env::current_dir()?;
    let config = Config::load_with_cli_overrides(
        config_overrides
            .parse_overrides()
@@ -74,29 +75,13 @@ async fn run_command_under_sandbox(
            ..Default::default()
        },
    )?;

    // In practice, this should be `std::env::current_dir()` because this CLI
    // does not support `--cwd`, but let's use the config value for consistency.
    let cwd = config.cwd.clone();
    // For now, we always use the same cwd for both the command and the
    // sandbox policy. In the future, we could add a CLI option to set them
    // separately.
    let sandbox_policy_cwd = cwd.clone();

    let stdio_policy = StdioPolicy::Inherit;
    let env = create_env(&config.shell_environment_policy);

    let mut child = match sandbox_type {
        SandboxType::Seatbelt => {
            spawn_command_under_seatbelt(
                command,
                cwd,
                &config.sandbox_policy,
                sandbox_policy_cwd.as_path(),
                stdio_policy,
                env,
            )
            .await?
            spawn_command_under_seatbelt(command, &config.sandbox_policy, cwd, stdio_policy, env)
                .await?
        }
        SandboxType::Landlock => {
            #[expect(clippy::expect_used)]
@@ -106,9 +91,8 @@ async fn run_command_under_sandbox(
            spawn_command_under_linux_sandbox(
                codex_linux_sandbox_exe,
                command,
                cwd,
                &config.sandbox_policy,
                sandbox_policy_cwd.as_path(),
                cwd,
                stdio_policy,
                env,
            )

@@ -1,13 +1,15 @@
use codex_common::CliConfigOverrides;
use codex_core::CodexAuth;
use codex_core::auth::CLIENT_ID;
use codex_core::auth::login_with_api_key;
use codex_core::auth::logout;
use codex_core::config::Config;
use codex_core::config::ConfigOverrides;
use codex_login::AuthMode;
use codex_login::CLIENT_ID;
use codex_login::CodexAuth;
use codex_login::OPENAI_API_KEY_ENV_VAR;
use codex_login::ServerOptions;
use codex_login::login_with_api_key;
use codex_login::logout;
use codex_login::run_login_server;
use codex_protocol::mcp_protocol::AuthMode;
use std::env;
use std::path::PathBuf;

pub async fn login_with_chatgpt(codex_home: PathBuf) -> std::io::Result<()> {
@@ -58,11 +60,19 @@ pub async fn run_login_with_api_key(
pub async fn run_login_status(cli_config_overrides: CliConfigOverrides) -> ! {
    let config = load_config_or_exit(cli_config_overrides);

    match CodexAuth::from_codex_home(&config.codex_home) {
    match CodexAuth::from_codex_home(&config.codex_home, config.preferred_auth_method) {
        Ok(Some(auth)) => match auth.mode {
            AuthMode::ApiKey => match auth.get_token().await {
                Ok(api_key) => {
                    eprintln!("Logged in using an API key - {}", safe_format_key(&api_key));

                    if let Ok(env_api_key) = env::var(OPENAI_API_KEY_ENV_VAR)
                        && env_api_key == api_key
                    {
                        eprintln!(
                            " API loaded from OPENAI_API_KEY environment variable or .env file"
                        );
                    }
                    std::process::exit(0);
                }
                Err(e) => {

@@ -1,4 +1,3 @@
use anyhow::Context;
use clap::CommandFactory;
use clap::Parser;
use clap_complete::Shell;
@@ -13,20 +12,11 @@ use codex_cli::login::run_login_with_api_key;
use codex_cli::login::run_login_with_chatgpt;
use codex_cli::login::run_logout;
use codex_cli::proto;
use codex_cloud_tasks::Cli as CloudTasksCli;
use codex_common::CliConfigOverrides;
use codex_exec::Cli as ExecCli;
use codex_responses_api_proxy::Args as ResponsesApiProxyArgs;
use codex_tui::AppExitInfo;
use codex_tui::Cli as TuiCli;
use owo_colors::OwoColorize;
use std::path::PathBuf;
use supports_color::Stream;

mod mcp_cmd;
mod pre_main_hardening;

use crate::mcp_cmd::McpCli;
use crate::proto::ProtoCli;

/// Codex CLI
@@ -66,8 +56,8 @@ enum Subcommand {
    /// Remove stored authentication credentials.
    Logout(LogoutCommand),

    /// [experimental] Run Codex as an MCP server and manage MCP servers.
    Mcp(McpCli),
    /// Experimental: run Codex as an MCP server.
    Mcp,

    /// Run the Protocol stream via stdin/stdout
    #[clap(visible_alias = "p")]
@@ -83,20 +73,9 @@ enum Subcommand {
    #[clap(visible_alias = "a")]
    Apply(ApplyCommand),

    /// Resume a previous interactive session (picker by default; use --last to continue the most recent).
    Resume(ResumeCommand),

    /// Internal: generate TypeScript protocol bindings.
    #[clap(hide = true)]
    GenerateTs(GenerateTsCommand),

    /// Browse and apply tasks from the cloud.
    #[clap(name = "cloud", alias = "cloud-tasks")]
    Cloud(CloudTasksCli),

    /// Internal: run the responses API proxy.
    #[clap(hide = true)]
    ResponsesApiProxy(ResponsesApiProxyArgs),
}

#[derive(Debug, Parser)]
@@ -106,21 +85,6 @@ struct CompletionCommand {
    shell: Shell,
}

#[derive(Debug, Parser)]
struct ResumeCommand {
    /// Conversation/session id (UUID). When provided, resumes this session.
    /// If omitted, use --last to pick the most recent recorded session.
    #[arg(value_name = "SESSION_ID")]
    session_id: Option<String>,

    /// Continue the most recent session without showing the picker.
    #[arg(long = "last", default_value_t = false, conflicts_with = "session_id")]
    last: bool,

    #[clap(flatten)]
    config_overrides: TuiCli,
}

#[derive(Debug, Parser)]
struct DebugArgs {
    #[command(subcommand)]
@@ -171,69 +135,6 @@ struct GenerateTsCommand {
    prettier: Option<PathBuf>,
}

fn format_exit_messages(exit_info: AppExitInfo, color_enabled: bool) -> Vec<String> {
    let AppExitInfo {
        token_usage,
        conversation_id,
    } = exit_info;

    if token_usage.is_zero() {
        return Vec::new();
    }

    let mut lines = vec![format!(
        "{}",
        codex_core::protocol::FinalOutput::from(token_usage)
    )];

    if let Some(session_id) = conversation_id {
        let resume_cmd = format!("codex resume {session_id}");
        let command = if color_enabled {
            resume_cmd.cyan().to_string()
        } else {
            resume_cmd
        };
        lines.push(format!("To continue this session, run {command}."));
    }

    lines
}

fn print_exit_messages(exit_info: AppExitInfo) {
    let color_enabled = supports_color::on(Stream::Stdout).is_some();
    for line in format_exit_messages(exit_info, color_enabled) {
        println!("{line}");
    }
}

pub(crate) const CODEX_SECURE_MODE_ENV_VAR: &str = "CODEX_SECURE_MODE";

/// As early as possible in the process lifecycle, apply hardening measures
/// if the CODEX_SECURE_MODE environment variable is set to "1".
#[ctor::ctor]
fn pre_main_hardening() {
    let secure_mode = match std::env::var(CODEX_SECURE_MODE_ENV_VAR) {
        Ok(value) => value,
        Err(_) => return,
    };

    if secure_mode == "1" {
        #[cfg(any(target_os = "linux", target_os = "android"))]
        crate::pre_main_hardening::pre_main_hardening_linux();

        #[cfg(target_os = "macos")]
        crate::pre_main_hardening::pre_main_hardening_macos();

        #[cfg(windows)]
        crate::pre_main_hardening::pre_main_hardening_windows();
    }

    // Always clear this env var so child processes don't inherit it.
    unsafe {
        std::env::remove_var(CODEX_SECURE_MODE_ENV_VAR);
    }
}

fn main() -> anyhow::Result<()> {
    arg0_dispatch_or_else(|codex_linux_sandbox_exe| async move {
        cli_main(codex_linux_sandbox_exe).await?;
@@ -242,52 +143,26 @@ fn main() -> anyhow::Result<()> {
}

async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()> {
    let MultitoolCli {
        config_overrides: root_config_overrides,
        mut interactive,
        subcommand,
    } = MultitoolCli::parse();
    let cli = MultitoolCli::parse();

    match subcommand {
    match cli.subcommand {
        None => {
            prepend_config_flags(
                &mut interactive.config_overrides,
                root_config_overrides.clone(),
            );
            let exit_info = codex_tui::run_main(interactive, codex_linux_sandbox_exe).await?;
            print_exit_messages(exit_info);
            let mut tui_cli = cli.interactive;
            prepend_config_flags(&mut tui_cli.config_overrides, cli.config_overrides);
            let usage = codex_tui::run_main(tui_cli, codex_linux_sandbox_exe).await?;
            if !usage.is_zero() {
                println!("{}", codex_core::protocol::FinalOutput::from(usage));
            }
        }
        Some(Subcommand::Exec(mut exec_cli)) => {
            prepend_config_flags(
                &mut exec_cli.config_overrides,
                root_config_overrides.clone(),
            );
            prepend_config_flags(&mut exec_cli.config_overrides, cli.config_overrides);
            codex_exec::run_main(exec_cli, codex_linux_sandbox_exe).await?;
        }
        Some(Subcommand::Mcp(mut mcp_cli)) => {
            // Propagate any root-level config overrides (e.g. `-c key=value`).
            prepend_config_flags(&mut mcp_cli.config_overrides, root_config_overrides.clone());
            mcp_cli.run(codex_linux_sandbox_exe).await?;
        }
        Some(Subcommand::Resume(ResumeCommand {
            session_id,
            last,
            config_overrides,
        })) => {
            interactive = finalize_resume_interactive(
                interactive,
                root_config_overrides.clone(),
                session_id,
                last,
                config_overrides,
            );
            codex_tui::run_main(interactive, codex_linux_sandbox_exe).await?;
        Some(Subcommand::Mcp) => {
            codex_mcp_server::run_main(codex_linux_sandbox_exe, cli.config_overrides).await?;
        }
        Some(Subcommand::Login(mut login_cli)) => {
            prepend_config_flags(
                &mut login_cli.config_overrides,
                root_config_overrides.clone(),
            );
            prepend_config_flags(&mut login_cli.config_overrides, cli.config_overrides);
            match login_cli.action {
                Some(LoginSubcommand::Status) => {
                    run_login_status(login_cli.config_overrides).await;
@@ -302,35 +177,19 @@ async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()
                }
            }
        Some(Subcommand::Logout(mut logout_cli)) => {
            prepend_config_flags(
                &mut logout_cli.config_overrides,
                root_config_overrides.clone(),
            );
            prepend_config_flags(&mut logout_cli.config_overrides, cli.config_overrides);
            run_logout(logout_cli.config_overrides).await;
        }
        Some(Subcommand::Proto(mut proto_cli)) => {
            prepend_config_flags(
                &mut proto_cli.config_overrides,
                root_config_overrides.clone(),
            );
            prepend_config_flags(&mut proto_cli.config_overrides, cli.config_overrides);
            proto::run_main(proto_cli).await?;
        }
        Some(Subcommand::Completion(completion_cli)) => {
            print_completion(completion_cli);
        }
        Some(Subcommand::Cloud(mut cloud_cli)) => {
            prepend_config_flags(
                &mut cloud_cli.config_overrides,
                root_config_overrides.clone(),
            );
            codex_cloud_tasks::run_main(cloud_cli, codex_linux_sandbox_exe).await?;
        }
        Some(Subcommand::Debug(debug_args)) => match debug_args.cmd {
            DebugCommand::Seatbelt(mut seatbelt_cli) => {
                prepend_config_flags(
                    &mut seatbelt_cli.config_overrides,
                    root_config_overrides.clone(),
                );
                prepend_config_flags(&mut seatbelt_cli.config_overrides, cli.config_overrides);
                codex_cli::debug_sandbox::run_command_under_seatbelt(
                    seatbelt_cli,
                    codex_linux_sandbox_exe,
@@ -338,10 +197,7 @@ async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()
                .await?;
            }
            DebugCommand::Landlock(mut landlock_cli) => {
                prepend_config_flags(
                    &mut landlock_cli.config_overrides,
                    root_config_overrides.clone(),
                );
                prepend_config_flags(&mut landlock_cli.config_overrides, cli.config_overrides);
                codex_cli::debug_sandbox::run_command_under_landlock(
                    landlock_cli,
                    codex_linux_sandbox_exe,
@@ -350,20 +206,12 @@ async fn cli_main(codex_linux_sandbox_exe: Option<PathBuf>) -> anyhow::Result<()
            }
        },
        Some(Subcommand::Apply(mut apply_cli)) => {
            prepend_config_flags(
                &mut apply_cli.config_overrides,
                root_config_overrides.clone(),
            );
            prepend_config_flags(&mut apply_cli.config_overrides, cli.config_overrides);
            run_apply_command(apply_cli, None).await?;
        }
        Some(Subcommand::GenerateTs(gen_cli)) => {
            codex_protocol_ts::generate_ts(&gen_cli.out_dir, gen_cli.prettier.as_deref())?;
        }
        Some(Subcommand::ResponsesApiProxy(args)) => {
            tokio::task::spawn_blocking(move || codex_responses_api_proxy::run_main(args))
                .await
                .context("responses-api-proxy blocking task panicked")??;
        }
    }

    Ok(())
@@ -380,256 +228,8 @@ fn prepend_config_flags(
        .splice(0..0, cli_config_overrides.raw_overrides);
}

/// Build the final `TuiCli` for a `codex resume` invocation.
fn finalize_resume_interactive(
    mut interactive: TuiCli,
    root_config_overrides: CliConfigOverrides,
    session_id: Option<String>,
    last: bool,
    resume_cli: TuiCli,
) -> TuiCli {
    // Start with the parsed interactive CLI so resume shares the same
    // configuration surface area as `codex` without additional flags.
    let resume_session_id = session_id;
    interactive.resume_picker = resume_session_id.is_none() && !last;
    interactive.resume_last = last;
    interactive.resume_session_id = resume_session_id;

    // Merge resume-scoped flags and overrides with highest precedence.
    merge_resume_cli_flags(&mut interactive, resume_cli);

    // Propagate any root-level config overrides (e.g. `-c key=value`).
    prepend_config_flags(&mut interactive.config_overrides, root_config_overrides);

    interactive
}

/// Merge flags provided to `codex resume` so they take precedence over any
/// root-level flags. Only overrides fields explicitly set on the resume-scoped
/// CLI. Also appends `-c key=value` overrides with highest precedence.
fn merge_resume_cli_flags(interactive: &mut TuiCli, resume_cli: TuiCli) {
    if let Some(model) = resume_cli.model {
        interactive.model = Some(model);
    }
    if resume_cli.oss {
        interactive.oss = true;
    }
    if let Some(profile) = resume_cli.config_profile {
        interactive.config_profile = Some(profile);
    }
    if let Some(sandbox) = resume_cli.sandbox_mode {
        interactive.sandbox_mode = Some(sandbox);
    }
    if let Some(approval) = resume_cli.approval_policy {
        interactive.approval_policy = Some(approval);
    }
    if resume_cli.full_auto {
        interactive.full_auto = true;
    }
    if resume_cli.dangerously_bypass_approvals_and_sandbox {
        interactive.dangerously_bypass_approvals_and_sandbox = true;
    }
    if let Some(cwd) = resume_cli.cwd {
        interactive.cwd = Some(cwd);
    }
    if resume_cli.web_search {
        interactive.web_search = true;
    }
    if !resume_cli.images.is_empty() {
        interactive.images = resume_cli.images;
    }
    if let Some(prompt) = resume_cli.prompt {
        interactive.prompt = Some(prompt);
    }

    interactive
        .config_overrides
        .raw_overrides
        .extend(resume_cli.config_overrides.raw_overrides);
}

fn print_completion(cmd: CompletionCommand) {
    let mut app = MultitoolCli::command();
    let name = "codex";
    generate(cmd.shell, &mut app, name, &mut std::io::stdout());
}

#[cfg(test)]
mod tests {
    use super::*;
    use codex_core::protocol::TokenUsage;
    use codex_protocol::mcp_protocol::ConversationId;

    fn finalize_from_args(args: &[&str]) -> TuiCli {
        let cli = MultitoolCli::try_parse_from(args).expect("parse");
        let MultitoolCli {
            interactive,
            config_overrides: root_overrides,
            subcommand,
        } = cli;

        let Subcommand::Resume(ResumeCommand {
            session_id,
            last,
            config_overrides: resume_cli,
        }) = subcommand.expect("resume present")
        else {
            unreachable!()
        };

        finalize_resume_interactive(interactive, root_overrides, session_id, last, resume_cli)
    }

    fn sample_exit_info(conversation: Option<&str>) -> AppExitInfo {
        let token_usage = TokenUsage {
            output_tokens: 2,
            total_tokens: 2,
            ..Default::default()
        };
        AppExitInfo {
            token_usage,
            conversation_id: conversation
                .map(ConversationId::from_string)
                .map(Result::unwrap),
        }
    }

    #[test]
    fn format_exit_messages_skips_zero_usage() {
        let exit_info = AppExitInfo {
            token_usage: TokenUsage::default(),
            conversation_id: None,
        };
        let lines = format_exit_messages(exit_info, false);
        assert!(lines.is_empty());
    }

    #[test]
    fn format_exit_messages_includes_resume_hint_without_color() {
        let exit_info = sample_exit_info(Some("123e4567-e89b-12d3-a456-426614174000"));
        let lines = format_exit_messages(exit_info, false);
        assert_eq!(
            lines,
            vec![
                "Token usage: total=2 input=0 output=2".to_string(),
                "To continue this session, run codex resume 123e4567-e89b-12d3-a456-426614174000."
                    .to_string(),
            ]
        );
    }

    #[test]
    fn format_exit_messages_applies_color_when_enabled() {
        let exit_info = sample_exit_info(Some("123e4567-e89b-12d3-a456-426614174000"));
        let lines = format_exit_messages(exit_info, true);
        assert_eq!(lines.len(), 2);
        assert!(lines[1].contains("\u{1b}[36m"));
    }

    #[test]
    fn resume_model_flag_applies_when_no_root_flags() {
        let interactive = finalize_from_args(["codex", "resume", "-m", "gpt-5-test"].as_ref());

        assert_eq!(interactive.model.as_deref(), Some("gpt-5-test"));
        assert!(interactive.resume_picker);
        assert!(!interactive.resume_last);
        assert_eq!(interactive.resume_session_id, None);
    }

    #[test]
    fn resume_picker_logic_none_and_not_last() {
        let interactive = finalize_from_args(["codex", "resume"].as_ref());
        assert!(interactive.resume_picker);
        assert!(!interactive.resume_last);
        assert_eq!(interactive.resume_session_id, None);
    }

    #[test]
    fn resume_picker_logic_last() {
        let interactive = finalize_from_args(["codex", "resume", "--last"].as_ref());
        assert!(!interactive.resume_picker);
        assert!(interactive.resume_last);
        assert_eq!(interactive.resume_session_id, None);
    }

    #[test]
    fn resume_picker_logic_with_session_id() {
        let interactive = finalize_from_args(["codex", "resume", "1234"].as_ref());
        assert!(!interactive.resume_picker);
        assert!(!interactive.resume_last);
        assert_eq!(interactive.resume_session_id.as_deref(), Some("1234"));
    }

    #[test]
    fn resume_merges_option_flags_and_full_auto() {
        let interactive = finalize_from_args(
            [
                "codex",
                "resume",
                "sid",
                "--oss",
                "--full-auto",
                "--search",
                "--sandbox",
                "workspace-write",
                "--ask-for-approval",
                "on-request",
                "-m",
                "gpt-5-test",
                "-p",
                "my-profile",
                "-C",
                "/tmp",
                "-i",
                "/tmp/a.png,/tmp/b.png",
            ]
            .as_ref(),
        );

        assert_eq!(interactive.model.as_deref(), Some("gpt-5-test"));
        assert!(interactive.oss);
        assert_eq!(interactive.config_profile.as_deref(), Some("my-profile"));
        assert!(matches!(
            interactive.sandbox_mode,
            Some(codex_common::SandboxModeCliArg::WorkspaceWrite)
        ));
        assert!(matches!(
            interactive.approval_policy,
            Some(codex_common::ApprovalModeCliArg::OnRequest)
        ));
        assert!(interactive.full_auto);
        assert_eq!(
            interactive.cwd.as_deref(),
            Some(std::path::Path::new("/tmp"))
        );
        assert!(interactive.web_search);
        let has_a = interactive
            .images
            .iter()
            .any(|p| p == std::path::Path::new("/tmp/a.png"));
        let has_b = interactive
            .images
            .iter()
            .any(|p| p == std::path::Path::new("/tmp/b.png"));
        assert!(has_a && has_b);
        assert!(!interactive.resume_picker);
        assert!(!interactive.resume_last);
        assert_eq!(interactive.resume_session_id.as_deref(), Some("sid"));
    }

    #[test]
    fn resume_merges_dangerously_bypass_flag() {
        let interactive = finalize_from_args(
            [
                "codex",
                "resume",
                "--dangerously-bypass-approvals-and-sandbox",
            ]
            .as_ref(),
        );
        assert!(interactive.dangerously_bypass_approvals_and_sandbox);
        assert!(interactive.resume_picker);
        assert!(!interactive.resume_last);
        assert_eq!(interactive.resume_session_id, None);
    }
}

@@ -1,384 +0,0 @@
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::path::PathBuf;

use anyhow::Context;
use anyhow::Result;
use anyhow::anyhow;
use anyhow::bail;
use codex_common::CliConfigOverrides;
use codex_core::config::Config;
use codex_core::config::ConfigOverrides;
use codex_core::config::find_codex_home;
use codex_core::config::load_global_mcp_servers;
use codex_core::config::write_global_mcp_servers;
use codex_core::config_types::McpServerConfig;

/// [experimental] Launch Codex as an MCP server or manage configured MCP servers.
///
/// Subcommands:
/// - `serve` — run the MCP server on stdio
/// - `list` — list configured servers (with `--json`)
/// - `get` — show a single server (with `--json`)
/// - `add` — add a server launcher entry to `~/.codex/config.toml`
/// - `remove` — delete a server entry
#[derive(Debug, clap::Parser)]
pub struct McpCli {
    #[clap(flatten)]
    pub config_overrides: CliConfigOverrides,

    #[command(subcommand)]
    pub cmd: Option<McpSubcommand>,
}

#[derive(Debug, clap::Subcommand)]
pub enum McpSubcommand {
    /// [experimental] Run the Codex MCP server (stdio transport).
    Serve,

    /// [experimental] List configured MCP servers.
    List(ListArgs),

    /// [experimental] Show details for a configured MCP server.
    Get(GetArgs),

    /// [experimental] Add a global MCP server entry.
    Add(AddArgs),

    /// [experimental] Remove a global MCP server entry.
    Remove(RemoveArgs),
}

#[derive(Debug, clap::Parser)]
pub struct ListArgs {
    /// Output the configured servers as JSON.
    #[arg(long)]
    pub json: bool,
}

#[derive(Debug, clap::Parser)]
pub struct GetArgs {
    /// Name of the MCP server to display.
    pub name: String,

    /// Output the server configuration as JSON.
    #[arg(long)]
    pub json: bool,
}

#[derive(Debug, clap::Parser)]
pub struct AddArgs {
    /// Name for the MCP server configuration.
    pub name: String,

    /// Environment variables to set when launching the server.
    #[arg(long, value_parser = parse_env_pair, value_name = "KEY=VALUE")]
    pub env: Vec<(String, String)>,

    /// Command to launch the MCP server.
    #[arg(trailing_var_arg = true, num_args = 1..)]
    pub command: Vec<String>,
}

#[derive(Debug, clap::Parser)]
pub struct RemoveArgs {
    /// Name of the MCP server configuration to remove.
    pub name: String,
}

impl McpCli {
    pub async fn run(self, codex_linux_sandbox_exe: Option<PathBuf>) -> Result<()> {
        let McpCli {
            config_overrides,
            cmd,
        } = self;
        let subcommand = cmd.unwrap_or(McpSubcommand::Serve);

        match subcommand {
            McpSubcommand::Serve => {
                codex_mcp_server::run_main(codex_linux_sandbox_exe, config_overrides).await?;
            }
            McpSubcommand::List(args) => {
                run_list(&config_overrides, args)?;
            }
            McpSubcommand::Get(args) => {
                run_get(&config_overrides, args)?;
            }
            McpSubcommand::Add(args) => {
                run_add(&config_overrides, args)?;
            }
            McpSubcommand::Remove(args) => {
                run_remove(&config_overrides, args)?;
            }
        }

        Ok(())
    }
}

fn run_add(config_overrides: &CliConfigOverrides, add_args: AddArgs) -> Result<()> {
    // Validate any provided overrides even though they are not currently applied.
    config_overrides.parse_overrides().map_err(|e| anyhow!(e))?;

    let AddArgs { name, env, command } = add_args;

    validate_server_name(&name)?;

    let mut command_parts = command.into_iter();
    let command_bin = command_parts
        .next()
        .ok_or_else(|| anyhow!("command is required"))?;
    let command_args: Vec<String> = command_parts.collect();

    let env_map = if env.is_empty() {
        None
    } else {
        let mut map = HashMap::new();
        for (key, value) in env {
            map.insert(key, value);
        }
        Some(map)
    };

    let codex_home = find_codex_home().context("failed to resolve CODEX_HOME")?;
    let mut servers = load_global_mcp_servers(&codex_home)
        .with_context(|| format!("failed to load MCP servers from {}", codex_home.display()))?;

    let new_entry = McpServerConfig {
        command: command_bin,
        args: command_args,
        env: env_map,
        startup_timeout_sec: None,
        tool_timeout_sec: None,
    };

    servers.insert(name.clone(), new_entry);

    write_global_mcp_servers(&codex_home, &servers)
        .with_context(|| format!("failed to write MCP servers to {}", codex_home.display()))?;

    println!("Added global MCP server '{name}'.");

    Ok(())
}
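`run_add` and `run_remove` lean on two helpers that fall outside this hunk; minimal sketches of what they plausibly do (signatures and validation rules assumed, the real definitions may differ):

```rust
// Sketches only; not the actual definitions from mcp_cmd.rs.
fn parse_env_pair(raw: &str) -> Result<(String, String), String> {
    // Split "KEY=VALUE" at the first '='; reject entries without one.
    raw.split_once('=')
        .map(|(k, v)| (k.to_string(), v.to_string()))
        .ok_or_else(|| format!("invalid KEY=VALUE pair: '{raw}'"))
}

fn validate_server_name(name: &str) -> anyhow::Result<()> {
    // Keep names usable as config.toml keys: non-empty alphanumerics, '-', '_'.
    if !name.is_empty()
        && name
            .chars()
            .all(|c| c.is_ascii_alphanumeric() || c == '-' || c == '_')
    {
        Ok(())
    } else {
        anyhow::bail!("invalid server name: '{name}'")
    }
}
```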
|
||||
fn run_remove(config_overrides: &CliConfigOverrides, remove_args: RemoveArgs) -> Result<()> {
|
||||
config_overrides.parse_overrides().map_err(|e| anyhow!(e))?;
|
||||
|
||||
let RemoveArgs { name } = remove_args;
|
||||
|
||||
validate_server_name(&name)?;
|
||||
|
||||
let codex_home = find_codex_home().context("failed to resolve CODEX_HOME")?;
|
||||
let mut servers = load_global_mcp_servers(&codex_home)
|
||||
.with_context(|| format!("failed to load MCP servers from {}", codex_home.display()))?;
|
||||
|
||||
let removed = servers.remove(&name).is_some();
|
||||
|
||||
if removed {
|
||||
write_global_mcp_servers(&codex_home, &servers)
|
||||
.with_context(|| format!("failed to write MCP servers to {}", codex_home.display()))?;
|
||||
}
|
||||
|
||||
if removed {
|
||||
println!("Removed global MCP server '{name}'.");
|
||||
} else {
|
||||
println!("No MCP server named '{name}' found.");
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
fn run_list(config_overrides: &CliConfigOverrides, list_args: ListArgs) -> Result<()> {
    let overrides = config_overrides.parse_overrides().map_err(|e| anyhow!(e))?;
    let config = Config::load_with_cli_overrides(overrides, ConfigOverrides::default())
        .context("failed to load configuration")?;

    let mut entries: Vec<_> = config.mcp_servers.iter().collect();
    entries.sort_by(|(a, _), (b, _)| a.cmp(b));

    if list_args.json {
        let json_entries: Vec<_> = entries
            .into_iter()
            .map(|(name, cfg)| {
                let env = cfg.env.as_ref().map(|env| {
                    env.iter()
                        .map(|(k, v)| (k.clone(), v.clone()))
                        .collect::<BTreeMap<_, _>>()
                });
                serde_json::json!({
                    "name": name,
                    "command": cfg.command,
                    "args": cfg.args,
                    "env": env,
                    "startup_timeout_sec": cfg
                        .startup_timeout_sec
                        .map(|timeout| timeout.as_secs_f64()),
                    "tool_timeout_sec": cfg
                        .tool_timeout_sec
                        .map(|timeout| timeout.as_secs_f64()),
                })
            })
            .collect();
        let output = serde_json::to_string_pretty(&json_entries)?;
        println!("{output}");
        return Ok(());
    }

    if entries.is_empty() {
        println!("No MCP servers configured yet. Try `codex mcp add my-tool -- my-command`.");
        return Ok(());
    }

    let mut rows: Vec<[String; 4]> = Vec::new();
    for (name, cfg) in entries {
        let args = if cfg.args.is_empty() {
            "-".to_string()
        } else {
            cfg.args.join(" ")
        };

        let env = match cfg.env.as_ref() {
            None => "-".to_string(),
            Some(map) if map.is_empty() => "-".to_string(),
            Some(map) => {
                let mut pairs: Vec<_> = map.iter().collect();
                pairs.sort_by(|(a, _), (b, _)| a.cmp(b));
                pairs
                    .into_iter()
                    .map(|(k, v)| format!("{k}={v}"))
                    .collect::<Vec<_>>()
                    .join(", ")
            }
        };

        rows.push([name.clone(), cfg.command.clone(), args, env]);
    }

    let mut widths = ["Name".len(), "Command".len(), "Args".len(), "Env".len()];
    for row in &rows {
        for (i, cell) in row.iter().enumerate() {
            widths[i] = widths[i].max(cell.len());
        }
    }

    println!(
        "{:<name_w$} {:<cmd_w$} {:<args_w$} {:<env_w$}",
        "Name",
        "Command",
        "Args",
        "Env",
        name_w = widths[0],
        cmd_w = widths[1],
        args_w = widths[2],
        env_w = widths[3],
    );

    for row in rows {
        println!(
            "{:<name_w$} {:<cmd_w$} {:<args_w$} {:<env_w$}",
            row[0],
            row[1],
            row[2],
            row[3],
            name_w = widths[0],
            cmd_w = widths[1],
            args_w = widths[2],
            env_w = widths[3],
        );
    }

    Ok(())
}

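/// Show the configuration for a single MCP server, optionally as JSON.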
fn run_get(config_overrides: &CliConfigOverrides, get_args: GetArgs) -> Result<()> {
    let overrides = config_overrides.parse_overrides().map_err(|e| anyhow!(e))?;
    let config = Config::load_with_cli_overrides(overrides, ConfigOverrides::default())
        .context("failed to load configuration")?;

    let Some(server) = config.mcp_servers.get(&get_args.name) else {
        bail!("No MCP server named '{name}' found.", name = get_args.name);
    };

    if get_args.json {
        let env = server.env.as_ref().map(|env| {
            env.iter()
                .map(|(k, v)| (k.clone(), v.clone()))
                .collect::<BTreeMap<_, _>>()
        });
        let output = serde_json::to_string_pretty(&serde_json::json!({
            "name": get_args.name,
            "command": server.command,
            "args": server.args,
            "env": env,
            "startup_timeout_sec": server
                .startup_timeout_sec
                .map(|timeout| timeout.as_secs_f64()),
            "tool_timeout_sec": server
                .tool_timeout_sec
                .map(|timeout| timeout.as_secs_f64()),
        }))?;
        println!("{output}");
        return Ok(());
    }

    println!("{}", get_args.name);
    println!("  command: {}", server.command);
    let args = if server.args.is_empty() {
        "-".to_string()
    } else {
        server.args.join(" ")
    };
    println!("  args: {args}");
    let env_display = match server.env.as_ref() {
        None => "-".to_string(),
        Some(map) if map.is_empty() => "-".to_string(),
        Some(map) => {
            let mut pairs: Vec<_> = map.iter().collect();
            pairs.sort_by(|(a, _), (b, _)| a.cmp(b));
            pairs
                .into_iter()
                .map(|(k, v)| format!("{k}={v}"))
                .collect::<Vec<_>>()
                .join(", ")
        }
    };
    println!("  env: {env_display}");
    if let Some(timeout) = server.startup_timeout_sec {
        println!("  startup_timeout_sec: {}", timeout.as_secs_f64());
    }
    if let Some(timeout) = server.tool_timeout_sec {
        println!("  tool_timeout_sec: {}", timeout.as_secs_f64());
    }
    println!("  remove: codex mcp remove {}", get_args.name);

    Ok(())
}

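/// Parse a `KEY=VALUE` pair passed via `--env` into its key and value parts.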
fn parse_env_pair(raw: &str) -> Result<(String, String), String> {
    let mut parts = raw.splitn(2, '=');
    let key = parts
        .next()
        .map(str::trim)
        .filter(|s| !s.is_empty())
        .ok_or_else(|| "environment entries must be in KEY=VALUE form".to_string())?;
    let value = parts
        .next()
        .map(str::to_string)
        .ok_or_else(|| "environment entries must be in KEY=VALUE form".to_string())?;

    Ok((key.to_string(), value))
}

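/// Server names are restricted to ASCII alphanumerics plus `-` and `_`.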
fn validate_server_name(name: &str) -> Result<()> {
    let is_valid = !name.is_empty()
        && name
            .chars()
            .all(|c| c.is_ascii_alphanumeric() || c == '-' || c == '_');

    if is_valid {
        Ok(())
    } else {
        bail!("invalid server name '{name}' (use letters, numbers, '-', '_')");
    }
}

@@ -1,98 +0,0 @@
#[cfg(any(target_os = "linux", target_os = "android"))]
const PRCTL_FAILED_EXIT_CODE: i32 = 5;

#[cfg(target_os = "macos")]
const PTRACE_DENY_ATTACH_FAILED_EXIT_CODE: i32 = 6;

#[cfg(any(target_os = "linux", target_os = "android", target_os = "macos"))]
const SET_RLIMIT_CORE_FAILED_EXIT_CODE: i32 = 7;

#[cfg(any(target_os = "linux", target_os = "android"))]
pub(crate) fn pre_main_hardening_linux() {
    // Disable ptrace attach / mark process non-dumpable.
    let ret_code = unsafe { libc::prctl(libc::PR_SET_DUMPABLE, 0, 0, 0, 0) };
    if ret_code != 0 {
        eprintln!(
            "ERROR: prctl(PR_SET_DUMPABLE, 0) failed: {}",
            std::io::Error::last_os_error()
        );
        std::process::exit(PRCTL_FAILED_EXIT_CODE);
    }

    // For "defense in depth," set the core file size limit to 0.
    set_core_file_size_limit_to_zero();

    // Official Codex releases are MUSL-linked, which means that variables such
    // as LD_PRELOAD are ignored anyway, but just to be sure, clear them here.
    let ld_keys: Vec<String> = std::env::vars()
        .filter_map(|(key, _)| {
            if key.starts_with("LD_") {
                Some(key)
            } else {
                None
            }
        })
        .collect();

    for key in ld_keys {
        unsafe {
            std::env::remove_var(key);
        }
    }
}

#[cfg(target_os = "macos")]
pub(crate) fn pre_main_hardening_macos() {
    // Prevent debuggers from attaching to this process.
    let ret_code = unsafe { libc::ptrace(libc::PT_DENY_ATTACH, 0, std::ptr::null_mut(), 0) };
    if ret_code == -1 {
        eprintln!(
            "ERROR: ptrace(PT_DENY_ATTACH) failed: {}",
            std::io::Error::last_os_error()
        );
        std::process::exit(PTRACE_DENY_ATTACH_FAILED_EXIT_CODE);
    }

    // Set the core file size limit to 0 to prevent core dumps.
    set_core_file_size_limit_to_zero();

    // Remove all DYLD_ environment variables, which can be used to subvert
    // library loading.
    let dyld_keys: Vec<String> = std::env::vars()
        .filter_map(|(key, _)| {
            if key.starts_with("DYLD_") {
                Some(key)
            } else {
                None
            }
        })
        .collect();

    for key in dyld_keys {
        unsafe {
            std::env::remove_var(key);
        }
    }
}

#[cfg(unix)]
fn set_core_file_size_limit_to_zero() {
    let rlim = libc::rlimit {
        rlim_cur: 0,
        rlim_max: 0,
    };

    let ret_code = unsafe { libc::setrlimit(libc::RLIMIT_CORE, &rlim) };
    if ret_code != 0 {
        eprintln!(
            "ERROR: setrlimit(RLIMIT_CORE) failed: {}",
            std::io::Error::last_os_error()
        );
        std::process::exit(SET_RLIMIT_CORE_FAILED_EXIT_CODE);
    }
}

#[cfg(windows)]
pub(crate) fn pre_main_hardening_windows() {
    // TODO(mbolin): Perform the appropriate configuration for Windows.
}

@@ -2,7 +2,6 @@ use std::io::IsTerminal;
 
 use clap::Parser;
 use codex_common::CliConfigOverrides;
-use codex_core::AuthManager;
 use codex_core::ConversationManager;
 use codex_core::NewConversation;
 use codex_core::config::Config;
@@ -10,6 +9,7 @@ use codex_core::config::ConfigOverrides;
 use codex_core::protocol::Event;
 use codex_core::protocol::EventMsg;
 use codex_core::protocol::Submission;
+use codex_login::AuthManager;
 use tokio::io::AsyncBufReadExt;
 use tokio::io::BufReader;
 use tracing::error;
@@ -37,8 +37,10 @@ pub async fn run_main(opts: ProtoCli) -> anyhow::Result<()> {
 
     let config = Config::load_with_cli_overrides(overrides_vec, ConfigOverrides::default())?;
     // Use conversation_manager API to start a conversation
-    let conversation_manager =
-        ConversationManager::new(AuthManager::shared(config.codex_home.clone()));
+    let conversation_manager = ConversationManager::new(AuthManager::shared(
+        config.codex_home.clone(),
+        config.preferred_auth_method,
+    ));
     let NewConversation {
         conversation_id: _,
         conversation,

@@ -1,86 +0,0 @@
use std::path::Path;

use anyhow::Result;
use codex_core::config::load_global_mcp_servers;
use predicates::str::contains;
use pretty_assertions::assert_eq;
use tempfile::TempDir;

fn codex_command(codex_home: &Path) -> Result<assert_cmd::Command> {
    let mut cmd = assert_cmd::Command::cargo_bin("codex")?;
    cmd.env("CODEX_HOME", codex_home);
    Ok(cmd)
}

#[test]
fn add_and_remove_server_updates_global_config() -> Result<()> {
    let codex_home = TempDir::new()?;

    let mut add_cmd = codex_command(codex_home.path())?;
    add_cmd
        .args(["mcp", "add", "docs", "--", "echo", "hello"])
        .assert()
        .success()
        .stdout(contains("Added global MCP server 'docs'."));

    let servers = load_global_mcp_servers(codex_home.path())?;
    assert_eq!(servers.len(), 1);
    let docs = servers.get("docs").expect("server should exist");
    assert_eq!(docs.command, "echo");
    assert_eq!(docs.args, vec!["hello".to_string()]);
    assert!(docs.env.is_none());

    let mut remove_cmd = codex_command(codex_home.path())?;
    remove_cmd
        .args(["mcp", "remove", "docs"])
        .assert()
        .success()
        .stdout(contains("Removed global MCP server 'docs'."));

    let servers = load_global_mcp_servers(codex_home.path())?;
    assert!(servers.is_empty());

    let mut remove_again_cmd = codex_command(codex_home.path())?;
    remove_again_cmd
        .args(["mcp", "remove", "docs"])
        .assert()
        .success()
        .stdout(contains("No MCP server named 'docs' found."));

    let servers = load_global_mcp_servers(codex_home.path())?;
    assert!(servers.is_empty());

    Ok(())
}

#[test]
fn add_with_env_preserves_key_order_and_values() -> Result<()> {
    let codex_home = TempDir::new()?;

    let mut add_cmd = codex_command(codex_home.path())?;
    add_cmd
        .args([
            "mcp",
            "add",
            "envy",
            "--env",
            "FOO=bar",
            "--env",
            "ALPHA=beta",
            "--",
            "python",
            "server.py",
        ])
        .assert()
        .success();

    let servers = load_global_mcp_servers(codex_home.path())?;
    let envy = servers.get("envy").expect("server should exist");
    let env = envy.env.as_ref().expect("env should be present");

    assert_eq!(env.len(), 2);
    assert_eq!(env.get("FOO"), Some(&"bar".to_string()));
    assert_eq!(env.get("ALPHA"), Some(&"beta".to_string()));

    Ok(())
}

@@ -1,106 +0,0 @@
use std::path::Path;

use anyhow::Result;
use predicates::str::contains;
use pretty_assertions::assert_eq;
use serde_json::Value as JsonValue;
use tempfile::TempDir;

fn codex_command(codex_home: &Path) -> Result<assert_cmd::Command> {
    let mut cmd = assert_cmd::Command::cargo_bin("codex")?;
    cmd.env("CODEX_HOME", codex_home);
    Ok(cmd)
}

#[test]
fn list_shows_empty_state() -> Result<()> {
    let codex_home = TempDir::new()?;

    let mut cmd = codex_command(codex_home.path())?;
    let output = cmd.args(["mcp", "list"]).output()?;
    assert!(output.status.success());
    let stdout = String::from_utf8(output.stdout)?;
    assert!(stdout.contains("No MCP servers configured yet."));

    Ok(())
}

#[test]
fn list_and_get_render_expected_output() -> Result<()> {
    let codex_home = TempDir::new()?;

    let mut add = codex_command(codex_home.path())?;
    add.args([
        "mcp",
        "add",
        "docs",
        "--env",
        "TOKEN=secret",
        "--",
        "docs-server",
        "--port",
        "4000",
    ])
    .assert()
    .success();

    let mut list_cmd = codex_command(codex_home.path())?;
    let list_output = list_cmd.args(["mcp", "list"]).output()?;
    assert!(list_output.status.success());
    let stdout = String::from_utf8(list_output.stdout)?;
    assert!(stdout.contains("Name"));
    assert!(stdout.contains("docs"));
    assert!(stdout.contains("docs-server"));
    assert!(stdout.contains("TOKEN=secret"));

    let mut list_json_cmd = codex_command(codex_home.path())?;
    let json_output = list_json_cmd.args(["mcp", "list", "--json"]).output()?;
    assert!(json_output.status.success());
    let stdout = String::from_utf8(json_output.stdout)?;
    let parsed: JsonValue = serde_json::from_str(&stdout)?;
    let array = parsed.as_array().expect("expected array");
    assert_eq!(array.len(), 1);
    let entry = &array[0];
    assert_eq!(entry.get("name"), Some(&JsonValue::String("docs".into())));
    assert_eq!(
        entry.get("command"),
        Some(&JsonValue::String("docs-server".into()))
    );

    let args = entry
        .get("args")
        .and_then(|v| v.as_array())
        .expect("args array");
    assert_eq!(
        args,
        &vec![
            JsonValue::String("--port".into()),
            JsonValue::String("4000".into())
        ]
    );

    let env = entry
        .get("env")
        .and_then(|v| v.as_object())
        .expect("env map");
    assert_eq!(env.get("TOKEN"), Some(&JsonValue::String("secret".into())));

    let mut get_cmd = codex_command(codex_home.path())?;
    let get_output = get_cmd.args(["mcp", "get", "docs"]).output()?;
    assert!(get_output.status.success());
    let stdout = String::from_utf8(get_output.stdout)?;
    assert!(stdout.contains("docs"));
    assert!(stdout.contains("command: docs-server"));
    assert!(stdout.contains("args: --port 4000"));
    assert!(stdout.contains("env: TOKEN=secret"));
    assert!(stdout.contains("remove: codex mcp remove docs"));

    let mut get_json_cmd = codex_command(codex_home.path())?;
    get_json_cmd
        .args(["mcp", "get", "docs", "--json"])
        .assert()
        .success()
        .stdout(contains("\"name\": \"docs\""));

    Ok(())
}

@@ -1,30 +0,0 @@
[package]
name = "codex-cloud-tasks-client"
version = { workspace = true }
edition = "2024"

[lib]
name = "codex_cloud_tasks_client"
path = "src/lib.rs"

[lints]
workspace = true

[features]
default = ["online"]
online = ["dep:reqwest", "dep:tokio", "dep:codex-backend-client"]
mock = []

[dependencies]
anyhow = "1"
async-trait = "0.1"
chrono = { version = "0.4", features = ["serde"] }
diffy = "0.4.2"
reqwest = { version = "0.12", features = ["json"], optional = true }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
thiserror = "2.0.12"
tokio = { version = "1", features = ["macros", "rt-multi-thread"], optional = true }
codex-backend-client = { path = "../backend-client", optional = true }
codex-git-apply = { path = "../git-apply" }
dirs = { workspace = true }

@@ -1,188 +0,0 @@
use chrono::DateTime;
use chrono::Utc;
use serde::Deserialize;
use serde::Serialize;

pub type Result<T> = std::result::Result<T, CloudTaskError>;

#[derive(Debug, thiserror::Error)]
pub enum CloudTaskError {
    #[error("unimplemented: {0}")]
    Unimplemented(&'static str),
    #[error("http error: {0}")]
    Http(String),
    #[error("io error: {0}")]
    Io(String),
    #[error("{0}")]
    Msg(String),
}

#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize)]
#[serde(transparent)]
pub struct TaskId(pub String);

#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "kebab-case")]
pub enum TaskStatus {
    Pending,
    Ready,
    Applied,
    Error,
}

#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize)]
pub struct TaskSummary {
    pub id: TaskId,
    pub title: String,
    pub status: TaskStatus,
    pub updated_at: DateTime<Utc>,
    /// Backend environment identifier (when available)
    pub environment_id: Option<String>,
    /// Human-friendly environment label (when available)
    pub environment_label: Option<String>,
    pub summary: DiffSummary,
    /// True when the backend reports this task as a code review.
    #[serde(default)]
    pub is_review: bool,
    /// Number of assistant attempts (best-of-N), when reported by the backend.
    #[serde(default)]
    pub attempt_total: Option<usize>,
}

#[derive(Clone, Copy, Debug, PartialEq, Eq, Default)]
pub enum AttemptStatus {
    Pending,
    InProgress,
    Completed,
    Failed,
    Cancelled,
    #[default]
    Unknown,
}

#[derive(Clone, Debug, PartialEq, Eq)]
pub struct TurnAttempt {
    pub turn_id: String,
    pub attempt_placement: Option<i64>,
    pub created_at: Option<DateTime<Utc>>,
    pub status: AttemptStatus,
    pub diff: Option<String>,
    pub messages: Vec<String>,
}

#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum ApplyStatus {
    Success,
    Partial,
    Error,
}

#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize)]
pub struct ApplyOutcome {
    pub applied: bool,
    pub status: ApplyStatus,
    pub message: String,
    #[serde(default)]
    pub skipped_paths: Vec<String>,
    #[serde(default)]
    pub conflict_paths: Vec<String>,
}

#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize)]
pub struct CreatedTask {
    pub id: TaskId,
}

#[derive(Clone, Debug, PartialEq, Eq)]
pub enum AttachmentKind {
    File,
    Image,
}

#[derive(Clone, Debug, PartialEq, Eq)]
pub struct AttachmentReference {
    pub sediment_id: String,
    pub asset_pointer: String,
    pub path: Option<String>,
    pub display_name: Option<String>,
    pub kind: AttachmentKind,
    pub size_bytes: Option<u64>,
    pub width: Option<u32>,
    pub height: Option<u32>,
}

#[derive(Clone, Debug, Default, PartialEq, Eq)]
pub struct FileServiceConfig {
    pub base_url: String,
    pub bearer_token: Option<String>,
    pub chatgpt_account_id: Option<String>,
    pub user_agent: Option<String>,
}

#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize, Default)]
pub struct DiffSummary {
    pub files_changed: usize,
    pub lines_added: usize,
    pub lines_removed: usize,
}

#[derive(Clone, Debug, PartialEq, Eq)]
pub struct TaskText {
    pub prompt: Option<String>,
    pub messages: Vec<String>,
    pub turn_id: Option<String>,
    pub sibling_turn_ids: Vec<String>,
    pub attempt_placement: Option<i64>,
    pub attempt_status: AttemptStatus,
}

impl Default for TaskText {
    fn default() -> Self {
        Self {
            prompt: None,
            messages: Vec::new(),
            turn_id: None,
            sibling_turn_ids: Vec::new(),
            attempt_placement: None,
            attempt_status: AttemptStatus::Unknown,
        }
    }
}

#[async_trait::async_trait]
pub trait CloudBackend: Send + Sync {
    async fn list_tasks(&self, env: Option<&str>) -> Result<Vec<TaskSummary>>;
    async fn get_task_diff(&self, id: TaskId) -> Result<Option<String>>;
    /// Return assistant output messages (no diff) when available.
    async fn get_task_messages(&self, id: TaskId) -> Result<Vec<String>>;
    /// Return the creating prompt and assistant messages (when available).
    async fn get_task_text(&self, id: TaskId) -> Result<TaskText>;
    /// Return any sibling attempts (best-of-N) for the given assistant turn.
    async fn list_sibling_attempts(
        &self,
        task: TaskId,
        turn_id: String,
    ) -> Result<Vec<TurnAttempt>>;
    /// Dry-run apply (preflight) that validates whether the patch would apply cleanly.
    /// Never modifies the working tree. When `diff_override` is supplied, the provided diff is
    /// used instead of re-fetching the task details so callers can apply alternate attempts.
    async fn apply_task_preflight(
        &self,
        id: TaskId,
        diff_override: Option<String>,
    ) -> Result<ApplyOutcome>;
    async fn apply_task(&self, id: TaskId, diff_override: Option<String>) -> Result<ApplyOutcome>;
    async fn create_task(
        &self,
        env_id: &str,
        prompt: &str,
        git_ref: &str,
        qa_mode: bool,
        attachments: &[AttachmentReference],
    ) -> Result<CreatedTask>;

    fn file_service_config(&self) -> Option<FileServiceConfig> {
        None
    }
}

@@ -1,849 +0,0 @@
use crate::ApplyOutcome;
use crate::ApplyStatus;
use crate::AttemptStatus;
use crate::CloudBackend;
use crate::CloudTaskError;
use crate::DiffSummary;
use crate::Result;
use crate::TaskId;
use crate::TaskStatus;
use crate::TaskSummary;
use crate::TurnAttempt;
use crate::api::TaskText;
use chrono::DateTime;
use chrono::Utc;

use serde_json::Value;
use std::cmp::Ordering;
use std::collections::HashMap;
use std::path::Path;
use std::path::PathBuf;

use codex_backend_client as backend;
use codex_backend_client::CodeTaskDetailsResponseExt;

#[derive(Clone)]
pub struct HttpClient {
    pub base_url: String,
    backend: backend::Client,
    bearer_token: Option<String>,
    chatgpt_account_id: Option<String>,
    user_agent: Option<String>,
}

impl HttpClient {
    pub fn new(base_url: impl Into<String>) -> anyhow::Result<Self> {
        let base_url = base_url.into();
        let backend = backend::Client::new(base_url.clone())?;
        Ok(Self {
            base_url,
            backend,
            bearer_token: None,
            chatgpt_account_id: None,
            user_agent: None,
        })
    }

    pub fn with_bearer_token(mut self, token: impl Into<String>) -> Self {
        let token = token.into();
        self.backend = self.backend.clone().with_bearer_token(token.clone());
        self.bearer_token = Some(token);
        self
    }

    pub fn with_user_agent(mut self, ua: impl Into<String>) -> Self {
        let ua = ua.into();
        self.backend = self.backend.clone().with_user_agent(ua.clone());
        self.user_agent = Some(ua);
        self
    }

    pub fn with_chatgpt_account_id(mut self, account_id: impl Into<String>) -> Self {
        let account_id = account_id.into();
        self.backend = self
            .backend
            .clone()
            .with_chatgpt_account_id(account_id.clone());
        self.chatgpt_account_id = Some(account_id);
        self
    }
}

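/// Heuristic check for whether a patch looks like a unified git diff.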
fn is_unified_diff(diff: &str) -> bool {
    let t = diff.trim_start();
    if t.starts_with("diff --git ") {
        return true;
    }
    let has_dash_headers = diff.contains("\n--- ") && diff.contains("\n+++ ");
    let has_hunk = diff.contains("\n@@ ") || diff.starts_with("@@ ");
    has_dash_headers && has_hunk
}

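/// Return the tail of `s`, capped at roughly `max` bytes, to keep log output bounded.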
fn tail(s: &str, max: usize) -> String {
    if s.len() <= max {
        s.to_string()
    } else {
        // Step forward to a char boundary so slicing cannot panic on
        // multi-byte UTF-8 sequences.
        let mut start = s.len() - max;
        while !s.is_char_boundary(start) {
            start += 1;
        }
        s[start..].to_string()
    }
}

#[async_trait::async_trait]
impl CloudBackend for HttpClient {
    async fn list_tasks(&self, env: Option<&str>) -> Result<Vec<TaskSummary>> {
        let resp = self
            .backend
            .list_tasks(Some(20), Some("current"), env)
            .await
            .map_err(|e| CloudTaskError::Http(format!("list_tasks failed: {e}")))?;

        let tasks: Vec<TaskSummary> = resp
            .items
            .into_iter()
            .map(map_task_list_item_to_summary)
            .collect();
        // Debug log for env filtering visibility
        append_error_log(&format!(
            "http.list_tasks: env={} items={}",
            env.unwrap_or("<all>"),
            tasks.len()
        ));
        Ok(tasks)
    }

    async fn get_task_diff(&self, id: TaskId) -> Result<Option<String>> {
        let id = id.0;
        let (details, body, ct) = self
            .backend
            .get_task_details_with_body(&id)
            .await
            .map_err(|e| CloudTaskError::Http(format!("get_task_details failed: {e}")))?;
        if let Some(diff) = details.unified_diff() {
            return Ok(Some(diff));
        }
        // No diff yet (pending or non-diff task).
        // Keep variables bound for potential future logging.
        let _ = (body, ct);
        Ok(None)
    }

    async fn get_task_messages(&self, id: TaskId) -> Result<Vec<String>> {
        let id = id.0;
        let (details, body, ct) = self
            .backend
            .get_task_details_with_body(&id)
            .await
            .map_err(|e| CloudTaskError::Http(format!("get_task_details failed: {e}")))?;
        let mut msgs = details.assistant_text_messages();
        if msgs.is_empty() {
            msgs.extend(extract_assistant_messages_from_body(&body));
        }
        if !msgs.is_empty() {
            return Ok(msgs);
        }
        if let Some(err) = details.assistant_error_message() {
            return Ok(vec![format!("Task failed: {err}")]);
        }
        // No assistant messages found; return a debuggable error with context for logging.
        let url = if self.base_url.contains("/backend-api") {
            format!("{}/wham/tasks/{}", self.base_url, id)
        } else {
            format!("{}/api/codex/tasks/{}", self.base_url, id)
        };
        Err(CloudTaskError::Http(format!(
            "No assistant text messages in response. GET {url}; content-type={ct}; body={body}"
        )))
    }

    async fn get_task_text(&self, id: TaskId) -> Result<TaskText> {
        let id = id.0;
        let (details, body, _ct) = self
            .backend
            .get_task_details_with_body(&id)
            .await
            .map_err(|e| CloudTaskError::Http(format!("get_task_details failed: {e}")))?;
        let prompt = details.user_text_prompt();
        let mut messages = details.assistant_text_messages();
        if messages.is_empty() {
            messages.extend(extract_assistant_messages_from_body(&body));
        }
        let turn_map = details.current_assistant_turn.as_ref();
        let turn_id = turn_map
            .and_then(|m| m.get("id"))
            .and_then(Value::as_str)
            .map(str::to_string);
        let sibling_turn_ids = turn_map
            .and_then(|m| m.get("sibling_turn_ids"))
            .and_then(Value::as_array)
            .map(|arr| {
                arr.iter()
                    .filter_map(Value::as_str)
                    .map(str::to_string)
                    .collect()
            })
            .unwrap_or_default();
        let attempt_placement = turn_map
            .and_then(|m| m.get("attempt_placement"))
            .and_then(Value::as_i64);
        let attempt_status = attempt_status_from_str(
            turn_map
                .and_then(|m| m.get("turn_status"))
                .and_then(Value::as_str),
        );
        Ok(TaskText {
            prompt,
            messages,
            turn_id,
            sibling_turn_ids,
            attempt_placement,
            attempt_status,
        })
    }

    async fn list_sibling_attempts(
        &self,
        task: TaskId,
        turn_id: String,
    ) -> Result<Vec<TurnAttempt>> {
        let resp = self
            .backend
            .list_sibling_turns(&task.0, &turn_id)
            .await
            .map_err(|e| CloudTaskError::Http(format!("list_sibling_turns failed: {e}")))?;

        let mut attempts: Vec<TurnAttempt> = resp
            .sibling_turns
            .iter()
            .filter_map(turn_attempt_from_map)
            .collect();
        attempts.sort_by(compare_attempts);
        Ok(attempts)
    }

    async fn apply_task(&self, id: TaskId, diff_override: Option<String>) -> Result<ApplyOutcome> {
        let id = id.0;
        self.apply_with_diff(id, diff_override, false).await
    }

    async fn apply_task_preflight(
        &self,
        id: TaskId,
        diff_override: Option<String>,
    ) -> Result<ApplyOutcome> {
        let id = id.0;
        self.apply_with_diff(id, diff_override, true).await
    }

    async fn create_task(
        &self,
        env_id: &str,
        prompt: &str,
        git_ref: &str,
        qa_mode: bool,
        attachments: &[crate::AttachmentReference],
    ) -> Result<crate::CreatedTask> {
        // Build request payload patterned after VSCode/newtask.rs
        let mut input_items: Vec<serde_json::Value> = Vec::new();
        input_items.push(serde_json::json!({
            "type": "message",
            "role": "user",
            "content": [{ "content_type": "text", "text": prompt }]
        }));

        for attachment in attachments {
            match attachment.kind {
                crate::AttachmentKind::Image => {
                    if let (Some(width), Some(height), Some(size_bytes)) =
                        (attachment.width, attachment.height, attachment.size_bytes)
                    {
                        input_items.push(serde_json::json!({
                            "type": "image_asset_pointer",
                            "asset_pointer": attachment.asset_pointer,
                            "width": width,
                            "height": height,
                            "size_bytes": size_bytes,
                        }));
                        continue;
                    }
                    // Fallback to container when metadata is missing
                }
                crate::AttachmentKind::File => {}
            }

            let default_path = attachment
                .path
                .clone()
                .or_else(|| attachment.display_name.clone())
                .unwrap_or_else(|| attachment.sediment_id.clone());

            let file_entry = serde_json::json!({
                "type": "file",
                "sediment_id": attachment.sediment_id,
                "path": default_path.clone(),
            });

            let mut container = serde_json::json!({
                "type": "container_file",
                "file_ids": [file_entry],
                "body": "",
            });
            container["path"] = serde_json::Value::String(default_path);
            input_items.push(container);
        }

        if let Ok(diff) = std::env::var("CODEX_STARTING_DIFF")
            && !diff.is_empty()
        {
            input_items.push(serde_json::json!({
                "type": "pre_apply_patch",
                "output_diff": { "diff": diff }
            }));
        }

        let request_body = serde_json::json!({
            "new_task": {
                "environment_id": env_id,
                "branch": git_ref,
                "run_environment_in_qa_mode": qa_mode,
            },
            "input_items": input_items,
        });

        // Use the underlying backend client to post with proper headers
        match self.backend.create_task(request_body).await {
            Ok(id) => {
                append_error_log(&format!(
                    "new_task: created id={id} env={} prompt_chars={} attachments={}",
                    env_id,
                    prompt.chars().count(),
                    attachments.len()
                ));
                Ok(crate::CreatedTask { id: TaskId(id) })
            }
            Err(e) => {
                append_error_log(&format!(
                    "new_task: create failed env={} prompt_chars={} attachments={}: {}",
                    env_id,
                    prompt.chars().count(),
                    attachments.len(),
                    e
                ));
                Err(CloudTaskError::Http(format!("create_task failed: {e}")))
            }
        }
    }

    fn file_service_config(&self) -> Option<crate::FileServiceConfig> {
        Some(crate::FileServiceConfig {
            base_url: self.base_url.clone(),
            bearer_token: self.bearer_token.clone(),
            chatgpt_account_id: self.chatgpt_account_id.clone(),
            user_agent: self.user_agent.clone(),
        })
    }
}

impl HttpClient {
    async fn apply_with_diff(
        &self,
        id: String,
        diff_override: Option<String>,
        preflight: bool,
    ) -> Result<ApplyOutcome> {
        let diff = match diff_override {
            Some(diff) => diff,
            None => {
                let details =
                    self.backend.get_task_details(&id).await.map_err(|e| {
                        CloudTaskError::Http(format!("get_task_details failed: {e}"))
                    })?;
                details.unified_diff().ok_or_else(|| {
                    CloudTaskError::Msg(format!("No diff available for task {id}"))
                })?
            }
        };

        if !is_unified_diff(&diff) {
            let summary = summarize_patch_for_logging(&diff);
            let mode = if preflight { "preflight" } else { "apply" };
            append_error_log(&format!(
                "apply_error: id={id} mode={mode} format=non-unified; {summary}"
            ));
            return Ok(ApplyOutcome {
                applied: false,
                status: ApplyStatus::Error,
                message: "Expected unified git diff; backend returned an incompatible format."
                    .to_string(),
                skipped_paths: Vec::new(),
                conflict_paths: Vec::new(),
            });
        }

        let req = codex_git_apply::ApplyGitRequest {
            cwd: std::env::current_dir().unwrap_or_else(|_| std::env::temp_dir()),
            diff: diff.clone(),
            revert: false,
            preflight,
        };
        let r = codex_git_apply::apply_git_patch(&req)
            .map_err(|e| CloudTaskError::Io(format!("git apply failed to run: {e}")))?;

        let status = if r.exit_code == 0 {
            ApplyStatus::Success
        } else if !r.applied_paths.is_empty() || !r.conflicted_paths.is_empty() {
            ApplyStatus::Partial
        } else {
            ApplyStatus::Error
        };
        let applied = matches!(status, ApplyStatus::Success) && !preflight;

        let message = if preflight {
            match status {
                ApplyStatus::Success => format!("Preflight passed for task {id} (applies cleanly)"),
                ApplyStatus::Partial => format!(
                    "Preflight: patch does not fully apply for task {id} (applied={}, skipped={}, conflicts={})",
                    r.applied_paths.len(),
                    r.skipped_paths.len(),
                    r.conflicted_paths.len()
                ),
                ApplyStatus::Error => format!(
                    "Preflight failed for task {id} (applied={}, skipped={}, conflicts={})",
                    r.applied_paths.len(),
                    r.skipped_paths.len(),
                    r.conflicted_paths.len()
                ),
            }
        } else {
            match status {
                ApplyStatus::Success => format!(
                    "Applied task {id} locally ({} files)",
                    r.applied_paths.len()
                ),
                ApplyStatus::Partial => format!(
                    "Apply partially succeeded for task {id} (applied={}, skipped={}, conflicts={})",
                    r.applied_paths.len(),
                    r.skipped_paths.len(),
                    r.conflicted_paths.len()
                ),
                ApplyStatus::Error => format!(
                    "Apply failed for task {id} (applied={}, skipped={}, conflicts={})",
                    r.applied_paths.len(),
                    r.skipped_paths.len(),
                    r.conflicted_paths.len()
                ),
            }
        };

        if matches!(status, ApplyStatus::Partial | ApplyStatus::Error)
            || (preflight && !matches!(status, ApplyStatus::Success))
        {
            let mut log = String::new();
            let summary = summarize_patch_for_logging(&diff);
            let mode = if preflight { "preflight" } else { "apply" };
            use std::fmt::Write as _;
            let _ = writeln!(
                &mut log,
                "apply_result: mode={} id={} status={:?} applied={} skipped={} conflicts={} cmd={}",
                mode,
                id,
                status,
                r.applied_paths.len(),
                r.skipped_paths.len(),
                r.conflicted_paths.len(),
                r.cmd_for_log
            );
            let _ = writeln!(
                &mut log,
                "stdout_tail=
{}
stderr_tail=
{}",
                tail(&r.stdout, 2000),
                tail(&r.stderr, 2000)
            );
            let _ = writeln!(&mut log, "{summary}");
            let _ = writeln!(
                &mut log,
                "----- PATCH BEGIN -----
{diff}
----- PATCH END -----"
            );
            append_error_log(&log);
        }

        Ok(ApplyOutcome {
            applied,
            status,
            message,
            skipped_paths: r.skipped_paths,
            conflict_paths: r.conflicted_paths,
        })
    }
}

/// Best-effort extraction of assistant text messages from a raw `get_task_details` body.
/// Falls back to worklog messages when structured turns are not present.
fn extract_assistant_messages_from_body(body: &str) -> Vec<String> {
    let mut msgs = Vec::new();
    if let Ok(full) = serde_json::from_str::<serde_json::Value>(body)
        && let Some(arr) = full
            .get("current_assistant_turn")
            .and_then(|v| v.get("worklog"))
            .and_then(|v| v.get("messages"))
            .and_then(|v| v.as_array())
    {
        for m in arr {
            let is_assistant = m
                .get("author")
                .and_then(|a| a.get("role"))
                .and_then(|r| r.as_str())
                == Some("assistant");
            if !is_assistant {
                continue;
            }
            if let Some(parts) = m
                .get("content")
                .and_then(|c| c.get("parts"))
                .and_then(|p| p.as_array())
            {
                for p in parts {
                    if let Some(s) = p.as_str() {
                        if !s.is_empty() {
                            msgs.push(s.to_string());
                        }
                        continue;
                    }
                    if let Some(obj) = p.as_object()
                        && obj.get("content_type").and_then(|t| t.as_str()) == Some("text")
                        && let Some(txt) = obj.get("text").and_then(|t| t.as_str())
                    {
                        msgs.push(txt.to_string());
                    }
                }
            }
        }
    }
    msgs
}

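/// Build a `TurnAttempt` from the raw JSON map describing a sibling turn.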
fn turn_attempt_from_map(turn: &HashMap<String, Value>) -> Option<TurnAttempt> {
    let turn_id = turn.get("id").and_then(Value::as_str)?.to_string();
    let attempt_placement = turn.get("attempt_placement").and_then(Value::as_i64);
    let created_at = parse_timestamp_value(turn.get("created_at"));
    let status = attempt_status_from_str(turn.get("turn_status").and_then(Value::as_str));
    let diff = extract_diff_from_turn(turn);
    let messages = extract_assistant_messages_from_turn(turn);
    Some(TurnAttempt {
        turn_id,
        attempt_placement,
        created_at,
        status,
        diff,
        messages,
    })
}

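/// Order attempts by placement, then creation time, then turn id.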
fn compare_attempts(a: &TurnAttempt, b: &TurnAttempt) -> Ordering {
    match (a.attempt_placement, b.attempt_placement) {
        (Some(lhs), Some(rhs)) => lhs.cmp(&rhs),
        (Some(_), None) => Ordering::Less,
        (None, Some(_)) => Ordering::Greater,
        (None, None) => match (a.created_at, b.created_at) {
            (Some(lhs), Some(rhs)) => lhs.cmp(&rhs),
            (Some(_), None) => Ordering::Less,
            (None, Some(_)) => Ordering::Greater,
            (None, None) => a.turn_id.cmp(&b.turn_id),
        },
    }
}

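/// Pull a unified diff out of a turn's output items, if one is present.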
fn extract_diff_from_turn(turn: &HashMap<String, Value>) -> Option<String> {
    let items = turn.get("output_items").and_then(Value::as_array)?;
    for item in items {
        match item.get("type").and_then(Value::as_str) {
            Some("output_diff") => {
                if let Some(diff) = item.get("diff").and_then(Value::as_str)
                    && !diff.is_empty()
                {
                    return Some(diff.to_string());
                }
            }
            Some("pr") => {
                if let Some(diff) = item
                    .get("output_diff")
                    .and_then(Value::as_object)
                    .and_then(|od| od.get("diff"))
                    .and_then(Value::as_str)
                    && !diff.is_empty()
                {
                    return Some(diff.to_string());
                }
            }
            _ => {}
        }
    }
    None
}

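/// Collect assistant text messages from a turn's output items, falling back to
/// any error code/message the turn carries.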
fn extract_assistant_messages_from_turn(turn: &HashMap<String, Value>) -> Vec<String> {
    let mut msgs = Vec::new();
    if let Some(items) = turn.get("output_items").and_then(Value::as_array) {
        for item in items {
            if item.get("type").and_then(Value::as_str) != Some("message") {
                continue;
            }
            if let Some(content) = item.get("content").and_then(Value::as_array) {
                for part in content {
                    if part.get("content_type").and_then(Value::as_str) == Some("text")
                        && let Some(txt) = part.get("text").and_then(Value::as_str)
                        && !txt.is_empty()
                    {
                        msgs.push(txt.to_string());
                    }
                }
            }
        }
    }
    if msgs.is_empty()
        && let Some(err) = turn.get("error").and_then(Value::as_object)
    {
        let message = err.get("message").and_then(Value::as_str).unwrap_or("");
        let code = err.get("code").and_then(Value::as_str).unwrap_or("");
        if !message.is_empty() || !code.is_empty() {
            let text = if !code.is_empty() && !message.is_empty() {
                format!("{code}: {message}")
            } else if !code.is_empty() {
                code.to_string()
            } else {
                message.to_string()
            };
            msgs.push(format!("Task failed: {text}"));
        }
    }
    msgs
}

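/// Convert a fractional Unix-epoch timestamp JSON value into a `DateTime<Utc>`.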
fn parse_timestamp_value(v: Option<&Value>) -> Option<DateTime<Utc>> {
    let raw = v?.as_f64()?;
    let secs = raw as i64;
    let nanos = ((raw - secs as f64) * 1_000_000_000.0) as u32;
    Some(DateTime::<Utc>::from(
        std::time::UNIX_EPOCH + std::time::Duration::new(secs.max(0) as u64, nanos),
    ))
}

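/// Map a backend turn-status string onto `AttemptStatus`; unrecognized values become `Unknown`.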
fn attempt_status_from_str(s: Option<&str>) -> AttemptStatus {
    match s.unwrap_or("") {
        "pending" => AttemptStatus::Pending,
        "in_progress" => AttemptStatus::InProgress,
        "completed" => AttemptStatus::Completed,
        "failed" => AttemptStatus::Failed,
        "cancelled" => AttemptStatus::Cancelled,
        _ => AttemptStatus::Unknown,
    }
}

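/// Convert a backend task list item into the UI-facing `TaskSummary`.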
fn map_task_list_item_to_summary(src: backend::TaskListItem) -> TaskSummary {
    fn env_label_from_status_display(v: Option<&HashMap<String, Value>>) -> Option<String> {
        let obj = v?;
        let raw = obj.get("environment_label")?;
        if let Some(s) = raw.as_str() {
            if s.trim().is_empty() {
                return None;
            }
            return Some(s.to_string());
        }
        if let Some(o) = raw.as_object() {
            // Best-effort support for rich shapes: { text: "..." } or { plain_text: "..." }
            if let Some(s) = o.get("text").and_then(Value::as_str)
                && !s.trim().is_empty()
            {
                return Some(s.to_string());
            }
            if let Some(s) = o.get("plain_text").and_then(Value::as_str)
                && !s.trim().is_empty()
            {
                return Some(s.to_string());
            }
            // Fallback: compact JSON for debugging
            if let Ok(s) = serde_json::to_string(o)
                && !s.is_empty()
            {
                return Some(s);
            }
        }
        None
    }

    // Best-effort parse of diff_stats (when present in latest_turn_status_display)
    fn diff_summary_from_status_display(v: Option<&HashMap<String, Value>>) -> DiffSummary {
        let mut out = DiffSummary::default();
        let Some(map) = v else { return out };
        let latest = map
            .get("latest_turn_status_display")
            .and_then(Value::as_object);
        let Some(latest) = latest else { return out };
        if let Some(ds) = latest.get("diff_stats").and_then(Value::as_object) {
            if let Some(n) = ds.get("files_modified").and_then(Value::as_i64) {
                out.files_changed = n.max(0) as usize;
            }
            if let Some(n) = ds.get("lines_added").and_then(Value::as_i64) {
                out.lines_added = n.max(0) as usize;
            }
            if let Some(n) = ds.get("lines_removed").and_then(Value::as_i64) {
                out.lines_removed = n.max(0) as usize;
            }
        }
        out
    }

    fn attempt_total_from_status_display(v: Option<&HashMap<String, Value>>) -> Option<usize> {
        let map = v?;
        let latest = map
            .get("latest_turn_status_display")
            .and_then(Value::as_object)?;
        let siblings = latest.get("sibling_turn_ids").and_then(Value::as_array)?;
        Some(siblings.len().saturating_add(1))
    }

    TaskSummary {
        id: TaskId(src.id),
        title: src.title,
        status: map_status(src.task_status_display.as_ref()),
        updated_at: parse_updated_at(src.updated_at.as_ref()),
        environment_id: None,
        environment_label: env_label_from_status_display(src.task_status_display.as_ref()),
        summary: diff_summary_from_status_display(src.task_status_display.as_ref()),
        is_review: src
            .pull_requests
            .as_ref()
            .is_some_and(|prs| !prs.is_empty()),
        attempt_total: attempt_total_from_status_display(src.task_status_display.as_ref()),
    }
}

fn map_status(v: Option<&HashMap<String, Value>>) -> TaskStatus {
    if let Some(val) = v {
        // Prefer nested latest_turn_status_display.turn_status when present.
        if let Some(turn) = val
            .get("latest_turn_status_display")
            .and_then(Value::as_object)
            && let Some(s) = turn.get("turn_status").and_then(Value::as_str)
        {
            return match s {
                "failed" => TaskStatus::Error,
                "completed" => TaskStatus::Ready,
                "in_progress" => TaskStatus::Pending,
                "pending" => TaskStatus::Pending,
                "cancelled" => TaskStatus::Error,
                _ => TaskStatus::Pending,
            };
        }
        // Legacy or alternative flat state.
        if let Some(state) = val.get("state").and_then(Value::as_str) {
            return match state {
                "pending" => TaskStatus::Pending,
                "ready" => TaskStatus::Ready,
                "applied" => TaskStatus::Applied,
                "error" => TaskStatus::Error,
                _ => TaskStatus::Pending,
            };
        }
    }
    TaskStatus::Pending
}

fn parse_updated_at(ts: Option<&f64>) -> DateTime<Utc> {
    if let Some(v) = ts {
        // Value is seconds since epoch with fractional part.
        let secs = *v as i64;
        let nanos = ((*v - secs as f64) * 1_000_000_000.0) as u32;
        return DateTime::<Utc>::from(
            std::time::UNIX_EPOCH + std::time::Duration::new(secs.max(0) as u64, nanos),
        );
    }
    Utc::now()
}

/// Return a compact one-line classification of the patch plus a short head snippet
/// to aid debugging when apply fails.
fn summarize_patch_for_logging(patch: &str) -> String {
    let trimmed = patch.trim_start();
    let kind = if trimmed.starts_with("*** Begin Patch") {
        "codex-patch"
    } else if trimmed.starts_with("diff --git ") || trimmed.contains("\n*** End Patch\n") {
        // In some cases providers nest a codex patch inside another format; detect both.
        "git-diff"
    } else if trimmed.starts_with("@@ ") || trimmed.contains("\n@@ ") {
        "unified-diff"
    } else {
        "unknown"
    };
    let lines = patch.lines().count();
    let chars = patch.len();
    let cwd = std::env::current_dir()
        .ok()
        .map(|p| p.display().to_string())
        .unwrap_or_else(|| "<unknown>".to_string());
    // Grab the first up-to-20 lines for context.
    let head: String = patch.lines().take(20).collect::<Vec<&str>>().join("\n");
    // Make sure we don't explode logs with huge content; truncate on a char
    // boundary so slicing cannot panic on multi-byte UTF-8.
    let head_trunc = if head.len() > 800 {
        let mut end = 800;
        while !head.is_char_boundary(end) {
            end -= 1;
        }
        format!("{}…", &head[..end])
    } else {
        head
    };
    format!(
        "patch_summary: kind={kind} lines={lines} chars={chars} cwd={cwd} ; head=\n{head_trunc}"
    )
}

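/// Append a timestamped line to the cloud-tasks log under `CODEX_HOME`,
/// falling back to a local `error.log` when that path is unavailable.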
fn append_error_log(message: &str) {
    let timestamp = Utc::now().to_rfc3339();

    if let Some(path) = log_file_path()
        && write_log_line(&path, &timestamp, message)
    {
        return;
    }

    let fallback = Path::new("error.log");
    let _ = write_log_line(fallback, &timestamp, message);
}

fn log_file_path() -> Option<PathBuf> {
    let mut codex_home = codex_home_dir()?;
    codex_home.push("log");
    std::fs::create_dir_all(&codex_home).ok()?;
    Some(codex_home.join("codex-cloud-tasks.log"))
}

fn codex_home_dir() -> Option<PathBuf> {
    if let Ok(val) = std::env::var("CODEX_HOME")
        && !val.is_empty()
    {
        let path = PathBuf::from(val);
        return path.canonicalize().ok().or(Some(path));
    }
    dirs::home_dir().map(|mut home| {
        home.push(".codex");
        home
    })
}

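/// Append a single `[timestamp] message` line to `path`, returning whether the write succeeded.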
fn write_log_line(path: &Path, timestamp: &str, message: &str) -> bool {
    let mut opts = std::fs::OpenOptions::new();
    opts.create(true).append(true);
    #[cfg(unix)]
    {
        use std::os::unix::fs::OpenOptionsExt;
        opts.mode(0o600);
    }

    match opts.open(path) {
        Ok(mut file) => {
            use std::io::Write as _;
            writeln!(file, "[{timestamp}] {message}").is_ok()
        }
        Err(_) => false,
    }
}

@@ -1,32 +0,0 @@
mod api;

pub use api::ApplyOutcome;
pub use api::ApplyStatus;
pub use api::AttachmentKind;
pub use api::AttachmentReference;
pub use api::AttemptStatus;
pub use api::CloudBackend;
pub use api::CloudTaskError;
pub use api::CreatedTask;
pub use api::DiffSummary;
pub use api::FileServiceConfig;
pub use api::Result;
pub use api::TaskId;
pub use api::TaskStatus;
pub use api::TaskSummary;
pub use api::TaskText;
pub use api::TurnAttempt;

#[cfg(feature = "mock")]
mod mock;

#[cfg(feature = "online")]
mod http;

#[cfg(feature = "mock")]
pub use mock::MockClient;

#[cfg(feature = "online")]
pub use http::HttpClient;

// Reusable apply engine now lives in the shared crate `codex-git-apply`.

@@ -1,180 +0,0 @@
use crate::ApplyOutcome;
use crate::AttemptStatus;
use crate::CloudBackend;
use crate::DiffSummary;
use crate::Result;
use crate::TaskId;
use crate::TaskStatus;
use crate::TaskSummary;
use crate::TurnAttempt;
use crate::api::TaskText;
use chrono::Utc;

#[derive(Clone, Default)]
pub struct MockClient;

#[async_trait::async_trait]
impl CloudBackend for MockClient {
    async fn list_tasks(&self, env: Option<&str>) -> Result<Vec<TaskSummary>> {
        // Slightly vary content by env to aid tests that rely on the mock
        let rows = match env {
            Some("env-A") => vec![("T-2000", "A: First", TaskStatus::Ready)],
            Some("env-B") => vec![
                ("T-3000", "B: One", TaskStatus::Ready),
                ("T-3001", "B: Two", TaskStatus::Pending),
            ],
            _ => vec![
                ("T-1000", "Update README formatting", TaskStatus::Ready),
                ("T-1001", "Fix clippy warnings in core", TaskStatus::Pending),
                ("T-1002", "Add contributing guide", TaskStatus::Ready),
            ],
        };
        let environment_id = env.map(str::to_string);
        let environment_label = match env {
            Some("env-A") => Some("Env A".to_string()),
            Some("env-B") => Some("Env B".to_string()),
            Some(other) => Some(other.to_string()),
            None => Some("Global".to_string()),
        };
        let mut out = Vec::new();
        for (id_str, title, status) in rows {
            let id = TaskId(id_str.to_string());
            let diff = mock_diff_for(&id);
            let (a, d) = count_from_unified(&diff);
            out.push(TaskSummary {
                id,
                title: title.to_string(),
                status,
                updated_at: Utc::now(),
                environment_id: environment_id.clone(),
                environment_label: environment_label.clone(),
                summary: DiffSummary {
                    files_changed: 1,
                    lines_added: a,
                    lines_removed: d,
                },
                is_review: false,
                attempt_total: Some(if id_str == "T-1000" { 2 } else { 1 }),
            });
        }
        Ok(out)
    }

    async fn get_task_diff(&self, id: TaskId) -> Result<Option<String>> {
        Ok(Some(mock_diff_for(&id)))
    }

    async fn get_task_messages(&self, _id: TaskId) -> Result<Vec<String>> {
        Ok(vec![
            "Mock assistant output: this task contains no diff.".to_string(),
        ])
    }

    async fn get_task_text(&self, _id: TaskId) -> Result<TaskText> {
        Ok(TaskText {
            prompt: Some("Why is there no diff?".to_string()),
            messages: vec!["Mock assistant output: this task contains no diff.".to_string()],
            turn_id: Some("mock-turn".to_string()),
            sibling_turn_ids: Vec::new(),
            attempt_placement: Some(0),
            attempt_status: AttemptStatus::Completed,
        })
    }

    async fn apply_task(&self, id: TaskId, _diff_override: Option<String>) -> Result<ApplyOutcome> {
        Ok(ApplyOutcome {
            applied: true,
            status: crate::ApplyStatus::Success,
            message: format!("Applied task {} locally (mock)", id.0),
            skipped_paths: Vec::new(),
            conflict_paths: Vec::new(),
        })
    }

    async fn apply_task_preflight(
        &self,
        id: TaskId,
        _diff_override: Option<String>,
    ) -> Result<ApplyOutcome> {
        Ok(ApplyOutcome {
            applied: false,
            status: crate::ApplyStatus::Success,
            message: format!("Preflight passed for task {} (mock)", id.0),
            skipped_paths: Vec::new(),
            conflict_paths: Vec::new(),
        })
    }

    async fn list_sibling_attempts(
        &self,
        task: TaskId,
        _turn_id: String,
    ) -> Result<Vec<TurnAttempt>> {
        if task.0 == "T-1000" {
            return Ok(vec![TurnAttempt {
                turn_id: "T-1000-attempt-2".to_string(),
                attempt_placement: Some(1),
                created_at: Some(Utc::now()),
                status: AttemptStatus::Completed,
                diff: Some(mock_diff_for(&task)),
                messages: vec!["Mock alternate attempt".to_string()],
            }]);
        }
        Ok(Vec::new())
    }

    async fn create_task(
        &self,
        env_id: &str,
        prompt: &str,
        git_ref: &str,
        qa_mode: bool,
        attachments: &[crate::AttachmentReference],
    ) -> Result<crate::CreatedTask> {
        let _ = (env_id, prompt, git_ref, qa_mode, attachments);
        let id = format!("task_local_{}", chrono::Utc::now().timestamp_millis());
        Ok(crate::CreatedTask { id: TaskId(id) })
    }
}

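/// Deterministic sample diffs keyed by task id, used by the mock backend.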
fn mock_diff_for(id: &TaskId) -> String {
    match id.0.as_str() {
        "T-1000" => {
            "diff --git a/README.md b/README.md\nindex 000000..111111 100644\n--- a/README.md\n+++ b/README.md\n@@ -1,2 +1,3 @@\n Intro\n-Hello\n+Hello, world!\n+Task: T-1000\n".to_string()
        }
        "T-1001" => {
            "diff --git a/core/src/lib.rs b/core/src/lib.rs\nindex 000000..111111 100644\n--- a/core/src/lib.rs\n+++ b/core/src/lib.rs\n@@ -1,2 +1,1 @@\n-use foo;\n use bar;\n".to_string()
        }
        _ => {
            "diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md\nindex 000000..111111 100644\n--- /dev/null\n+++ b/CONTRIBUTING.md\n@@ -0,0 +1,3 @@\n+## Contributing\n+Please open PRs.\n+Thanks!\n".to_string()
        }
    }
}

fn count_from_unified(diff: &str) -> (usize, usize) {
|
||||
if let Ok(patch) = diffy::Patch::from_str(diff) {
|
||||
patch
|
||||
.hunks()
|
||||
.iter()
|
||||
.flat_map(diffy::Hunk::lines)
|
||||
.fold((0, 0), |(a, d), l| match l {
|
||||
diffy::Line::Insert(_) => (a + 1, d),
|
||||
diffy::Line::Delete(_) => (a, d + 1),
|
||||
_ => (a, d),
|
||||
})
|
||||
} else {
|
||||
let mut a = 0;
|
||||
let mut d = 0;
|
||||
for l in diff.lines() {
|
||||
if l.starts_with("+++") || l.starts_with("---") || l.starts_with("@@") {
|
||||
continue;
|
||||
}
|
||||
match l.as_bytes().first() {
|
||||
Some(b'+') => a += 1,
|
||||
Some(b'-') => d += 1,
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
(a, d)
|
||||
}
|
||||
}
|
||||
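A quick sanity check of `count_from_unified` against one of the mock diffs above. This is a hypothetical test, not part of the deleted file: the `T-1000` mock diff inserts two lines and deletes one, so both the `diffy` path and the string-scanning fallback should report `(2, 1)`.

#[test]
fn count_from_unified_matches_mock_diff() {
    // The T-1000 mock diff has two `+` lines and one `-` line;
    // the `+++`/`---` file headers must not be counted.
    let (added, removed) = count_from_unified(&mock_diff_for(&TaskId("T-1000".to_string())));
    assert_eq!((added, removed), (2, 1));
}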
@@ -1,54 +0,0 @@
[package]
name = "codex-cloud-tasks"
version = { workspace = true }
edition = "2024"

[lib]
name = "codex_cloud_tasks"
path = "src/lib.rs"

[lints]
workspace = true

[dependencies]
anyhow = "1"
clap = { version = "4", features = ["derive"] }
codex-common = { path = "../common", features = ["cli"] }
tokio = { version = "1", features = ["fs", "macros", "rt-multi-thread"] }
tracing = { version = "0.1.41", features = ["log"] }
tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
codex-cloud-tasks-client = { path = "../cloud-tasks-client", features = ["mock", "online"] }
ratatui = { version = "0.29.0" }
crossterm = { version = "0.28.1", features = ["event-stream"] }
tokio-stream = "0.1.17"
chrono = { version = "0.4", features = ["serde"] }
codex-login = { path = "../login" }
codex-core = { path = "../core" }
codex-backend-client = { path = "../backend-client" }
throbber-widgets-tui = "0.8.0"
base64 = "0.22"
serde_json = "1"
reqwest = { version = "0.12", features = ["json"] }
serde = { version = "1", features = ["derive"] }
unicode-width = "0.1"
codex-tui = { path = "../tui" }
codex-file-search = { path = "../file-search" }
mime_guess = "2"
url = "2"
image = { workspace = true }

[dev-dependencies]
async-trait = "0.1"
tempfile = "3"

[[bin]]
name = "conncheck"
path = "src/bin/conncheck.rs"

[[bin]]
name = "newtask"
path = "src/bin/newtask.rs"

[[bin]]
name = "envcheck"
path = "src/bin/envcheck.rs"
@@ -1,474 +0,0 @@
use std::time::Duration;

// Environment filter data models for the TUI
#[derive(Clone, Debug, Default)]
pub struct EnvironmentRow {
    pub id: String,
    pub label: Option<String>,
    pub is_pinned: bool,
    pub repo_hints: Option<String>, // e.g., "openai/codex"
    pub default_branch: Option<String>,
}

#[derive(Clone, Debug, Default)]
pub struct EnvModalState {
    pub query: String,
    pub selected: usize,
}

#[derive(Clone, Debug, Copy, PartialEq, Eq)]
pub enum ApplyResultLevel {
    Success,
    Partial,
    Error,
}

#[derive(Clone, Debug)]
pub struct ApplyModalState {
    pub task_id: TaskId,
    pub title: String,
    pub result_message: Option<String>,
    pub result_level: Option<ApplyResultLevel>,
    pub skipped_paths: Vec<String>,
    pub conflict_paths: Vec<String>,
    pub diff_override: Option<String>,
}

use crate::scrollable_diff::ScrollableDiff;
use codex_cloud_tasks_client::CloudBackend;
use codex_cloud_tasks_client::TaskId;
use codex_cloud_tasks_client::TaskSummary;
use throbber_widgets_tui::ThrobberState;

#[derive(Default)]
pub struct App {
    pub tasks: Vec<TaskSummary>,
    pub selected: usize,
    pub status: String,
    pub diff_overlay: Option<DiffOverlay>,
    pub throbber: ThrobberState,
    pub refresh_inflight: bool,
    pub details_inflight: bool,
    // Environment filter state
    pub env_filter: Option<String>,
    pub env_modal: Option<EnvModalState>,
    pub apply_modal: Option<ApplyModalState>,
    pub environments: Vec<EnvironmentRow>,
    pub env_last_loaded: Option<std::time::Instant>,
    pub env_loading: bool,
    pub env_error: Option<String>,
    // New Task page
    pub new_task: Option<crate::new_task::NewTaskPage>,
    // Apply preflight spinner state
    pub apply_preflight_inflight: bool,
    // Apply action spinner state
    pub apply_inflight: bool,
    // Background enrichment coordination
    pub list_generation: u64,
    pub in_flight: std::collections::HashSet<String>,
    // Background enrichment caches were planned; currently unused.
}

impl App {
    pub fn new() -> Self {
        Self {
            tasks: Vec::new(),
            selected: 0,
            status: "Press r to refresh".to_string(),
            diff_overlay: None,
            throbber: ThrobberState::default(),
            refresh_inflight: false,
            details_inflight: false,
            env_filter: None,
            env_modal: None,
            apply_modal: None,
            environments: Vec::new(),
            env_last_loaded: None,
            env_loading: false,
            env_error: None,
            new_task: None,
            apply_preflight_inflight: false,
            apply_inflight: false,
            list_generation: 0,
            in_flight: std::collections::HashSet::new(),
        }
    }

    pub fn next(&mut self) {
        if self.tasks.is_empty() {
            return;
        }
        self.selected = (self.selected + 1).min(self.tasks.len().saturating_sub(1));
    }

    pub fn prev(&mut self) {
        if self.tasks.is_empty() {
            return;
        }
        if self.selected > 0 {
            self.selected -= 1;
        }
    }
}

pub async fn load_tasks(
    backend: &dyn CloudBackend,
    env: Option<&str>,
) -> anyhow::Result<Vec<TaskSummary>> {
    // In later milestones, add a small debounce, spinner, and error display.
    let tasks = tokio::time::timeout(Duration::from_secs(5), backend.list_tasks(env)).await??;
    // Hide review-only tasks from the main list.
    let filtered: Vec<TaskSummary> = tasks.into_iter().filter(|t| !t.is_review).collect();
    Ok(filtered)
}

pub struct DiffOverlay {
    pub title: String,
    pub task_id: TaskId,
    pub sd: ScrollableDiff,
    pub base_can_apply: bool,
    pub diff_lines: Vec<String>,
    pub text_lines: Vec<String>,
    pub prompt: Option<String>,
    pub attempts: Vec<AttemptView>,
    pub selected_attempt: usize,
    pub current_view: DetailView,
    pub base_turn_id: Option<String>,
    pub sibling_turn_ids: Vec<String>,
    pub attempt_total_hint: Option<usize>,
}

#[derive(Clone, Debug, Default)]
pub struct AttemptView {
    pub turn_id: Option<String>,
    pub status: codex_cloud_tasks_client::AttemptStatus,
    pub attempt_placement: Option<i64>,
    pub diff_lines: Vec<String>,
    pub text_lines: Vec<String>,
    pub prompt: Option<String>,
    pub diff_raw: Option<String>,
}

impl AttemptView {
    pub fn has_diff(&self) -> bool {
        !self.diff_lines.is_empty()
    }

    pub fn has_text(&self) -> bool {
        !self.text_lines.is_empty() || self.prompt.is_some()
    }
}

impl DiffOverlay {
    pub fn new(task_id: TaskId, title: String, attempt_total_hint: Option<usize>) -> Self {
        let mut sd = ScrollableDiff::new();
        sd.set_content(Vec::new());
        Self {
            title,
            task_id,
            sd,
            base_can_apply: false,
            diff_lines: Vec::new(),
            text_lines: Vec::new(),
            prompt: None,
            attempts: vec![AttemptView::default()],
            selected_attempt: 0,
            current_view: DetailView::Prompt,
            base_turn_id: None,
            sibling_turn_ids: Vec::new(),
            attempt_total_hint,
        }
    }

    pub fn current_attempt(&self) -> Option<&AttemptView> {
        self.attempts.get(self.selected_attempt)
    }

    pub fn base_attempt_mut(&mut self) -> &mut AttemptView {
        if self.attempts.is_empty() {
            self.attempts.push(AttemptView::default());
        }
        &mut self.attempts[0]
    }

    pub fn set_view(&mut self, view: DetailView) {
        self.current_view = view;
        self.apply_selection_to_fields();
    }

    pub fn expected_attempts(&self) -> Option<usize> {
        self.attempt_total_hint.or({
            if self.attempts.is_empty() {
                None
            } else {
                Some(self.attempts.len())
            }
        })
    }

    pub fn attempt_count(&self) -> usize {
        self.attempts.len()
    }

    pub fn attempt_display_total(&self) -> usize {
        self.expected_attempts()
            .unwrap_or_else(|| self.attempts.len().max(1))
    }

    pub fn step_attempt(&mut self, delta: isize) -> bool {
        let total = self.attempts.len();
        if total <= 1 {
            return false;
        }
        let total_isize = total as isize;
        let current = self.selected_attempt as isize;
        let mut next = current + delta;
        next = ((next % total_isize) + total_isize) % total_isize;
        let next = next as usize;
        self.selected_attempt = next;
        self.apply_selection_to_fields();
        true
    }

    pub fn current_can_apply(&self) -> bool {
        matches!(self.current_view, DetailView::Diff)
            && self
                .current_attempt()
                .and_then(|attempt| attempt.diff_raw.as_ref())
                .map(|diff| !diff.is_empty())
                .unwrap_or(false)
    }

    pub fn apply_selection_to_fields(&mut self) {
        let (diff_lines, text_lines, prompt) = if let Some(attempt) = self.current_attempt() {
            (
                attempt.diff_lines.clone(),
                attempt.text_lines.clone(),
                attempt.prompt.clone(),
            )
        } else {
            self.diff_lines.clear();
            self.text_lines.clear();
            self.prompt = None;
            self.sd.set_content(vec!["<loading attempt>".to_string()]);
            return;
        };

        self.diff_lines = diff_lines.clone();
        self.text_lines = text_lines.clone();
        self.prompt = prompt;

        match self.current_view {
            DetailView::Diff => {
                if diff_lines.is_empty() {
                    self.sd.set_content(vec!["<no diff available>".to_string()]);
                } else {
                    self.sd.set_content(diff_lines);
                }
            }
            DetailView::Prompt => {
                if text_lines.is_empty() {
                    self.sd.set_content(vec!["<no output>".to_string()]);
                } else {
                    self.sd.set_content(text_lines);
                }
            }
        }
    }
}

#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum DetailView {
    Diff,
    Prompt,
}

/// Internal app events delivered from background tasks.
/// These let the UI event loop remain responsive and keep the spinner animating.
#[derive(Debug)]
pub enum AppEvent {
    TasksLoaded {
        env: Option<String>,
        result: anyhow::Result<Vec<TaskSummary>>,
    },
    // Background diff summary events were planned; removed for now to keep code minimal.
    /// Autodetection of a likely environment id finished
    EnvironmentAutodetected(anyhow::Result<crate::env_detect::AutodetectSelection>),
    /// Background completion of environment list fetch
    EnvironmentsLoaded(anyhow::Result<Vec<EnvironmentRow>>),
    DetailsDiffLoaded {
        id: TaskId,
        title: String,
        diff: String,
    },
    DetailsMessagesLoaded {
        id: TaskId,
        title: String,
        messages: Vec<String>,
        prompt: Option<String>,
        turn_id: Option<String>,
        sibling_turn_ids: Vec<String>,
        attempt_placement: Option<i64>,
        attempt_status: codex_cloud_tasks_client::AttemptStatus,
    },
    DetailsFailed {
        id: TaskId,
        title: String,
        error: String,
    },
    AttemptsLoaded {
        id: TaskId,
        attempts: Vec<codex_cloud_tasks_client::TurnAttempt>,
    },
    /// Background completion of new task submission
    NewTaskSubmitted(Result<codex_cloud_tasks_client::CreatedTask, String>),
    /// Background completion of apply preflight when opening modal or on demand
    ApplyPreflightFinished {
        id: TaskId,
        title: String,
        message: String,
        level: ApplyResultLevel,
        skipped: Vec<String>,
        conflicts: Vec<String>,
    },
    /// Background completion of apply action (actual patch application)
    ApplyFinished {
        id: TaskId,
        result: std::result::Result<codex_cloud_tasks_client::ApplyOutcome, String>,
    },
}

// Convenience aliases; currently unused.
#[cfg(test)]
mod tests {
    use super::*;
    use chrono::Utc;

    struct FakeBackend {
        // maps env key to titles
        by_env: std::collections::HashMap<Option<String>, Vec<&'static str>>,
    }

    #[async_trait::async_trait]
    impl codex_cloud_tasks_client::CloudBackend for FakeBackend {
        async fn list_tasks(
            &self,
            env: Option<&str>,
        ) -> codex_cloud_tasks_client::Result<Vec<TaskSummary>> {
            let key = env.map(str::to_string);
            let titles = self
                .by_env
                .get(&key)
                .cloned()
                .unwrap_or_else(|| vec!["default-a", "default-b"]);
            let mut out = Vec::new();
            for (i, t) in titles.into_iter().enumerate() {
                out.push(TaskSummary {
                    id: TaskId(format!("T-{i}")),
                    title: t.to_string(),
                    status: codex_cloud_tasks_client::TaskStatus::Ready,
                    updated_at: Utc::now(),
                    environment_id: env.map(str::to_string),
                    environment_label: None,
                    summary: codex_cloud_tasks_client::DiffSummary::default(),
                    is_review: false,
                    attempt_total: Some(1),
                });
            }
            Ok(out)
        }

        async fn get_task_diff(
            &self,
            _id: TaskId,
        ) -> codex_cloud_tasks_client::Result<Option<String>> {
            Err(codex_cloud_tasks_client::CloudTaskError::Unimplemented(
                "not used in test",
            ))
        }

        async fn get_task_messages(
            &self,
            _id: TaskId,
        ) -> codex_cloud_tasks_client::Result<Vec<String>> {
            Ok(vec![])
        }

        async fn get_task_text(
            &self,
            _id: TaskId,
        ) -> codex_cloud_tasks_client::Result<codex_cloud_tasks_client::TaskText> {
            Ok(codex_cloud_tasks_client::TaskText {
                prompt: Some("Example prompt".to_string()),
                messages: Vec::new(),
                turn_id: Some("fake-turn".to_string()),
                sibling_turn_ids: Vec::new(),
                attempt_placement: Some(0),
                attempt_status: codex_cloud_tasks_client::AttemptStatus::Completed,
            })
        }

        async fn list_sibling_attempts(
            &self,
            _task: TaskId,
            _turn_id: String,
        ) -> codex_cloud_tasks_client::Result<Vec<codex_cloud_tasks_client::TurnAttempt>> {
            Ok(Vec::new())
        }

        async fn apply_task(
            &self,
            _id: TaskId,
            _diff_override: Option<String>,
        ) -> codex_cloud_tasks_client::Result<codex_cloud_tasks_client::ApplyOutcome> {
            Err(codex_cloud_tasks_client::CloudTaskError::Unimplemented(
                "not used in test",
            ))
        }

        async fn apply_task_preflight(
            &self,
            _id: TaskId,
            _diff_override: Option<String>,
        ) -> codex_cloud_tasks_client::Result<codex_cloud_tasks_client::ApplyOutcome> {
            Err(codex_cloud_tasks_client::CloudTaskError::Unimplemented(
                "not used in test",
            ))
        }

        async fn create_task(
            &self,
            _env_id: &str,
            _prompt: &str,
            _git_ref: &str,
            _qa_mode: bool,
            _attachments: &[codex_cloud_tasks_client::AttachmentReference],
        ) -> codex_cloud_tasks_client::Result<codex_cloud_tasks_client::CreatedTask> {
            Err(codex_cloud_tasks_client::CloudTaskError::Unimplemented(
                "not used in test",
            ))
        }
    }

    #[tokio::test]
    async fn load_tasks_uses_env_parameter() {
        // Arrange: env-specific task titles
        let mut by_env = std::collections::HashMap::new();
        by_env.insert(None, vec!["root-1", "root-2"]);
        by_env.insert(Some("env-A".to_string()), vec!["A-1"]);
        by_env.insert(Some("env-B".to_string()), vec!["B-1", "B-2", "B-3"]);
        let backend = FakeBackend { by_env };

        // Act + Assert
        let root = load_tasks(&backend, None).await.unwrap();
        assert_eq!(root.len(), 2);
        assert_eq!(root[0].title, "root-1");

        let a = load_tasks(&backend, Some("env-A")).await.unwrap();
        assert_eq!(a.len(), 1);
        assert_eq!(a[0].title, "A-1");

        let b = load_tasks(&backend, Some("env-B")).await.unwrap();
        assert_eq!(b.len(), 3);
        assert_eq!(b[2].title, "B-3");
    }
}
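`step_attempt` above relies on the standard Euclidean-mod trick `((next % total) + total) % total` to wrap the selection in both directions, since a plain `%` on a negative `isize` stays negative in Rust. A minimal illustration of the arithmetic (hypothetical, not one of the original tests):

#[test]
fn wrap_around_indexing() {
    // Same expression as step_attempt, isolated for clarity.
    fn wrap(current: isize, delta: isize, total: isize) -> usize {
        (((current + delta) % total + total) % total) as usize
    }
    assert_eq!(wrap(0, -1, 3), 2); // stepping back from the first attempt lands on the last
    assert_eq!(wrap(2, 1, 3), 0); // stepping forward from the last wraps to the first
}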
@@ -1,226 +0,0 @@
pub mod upload;

pub use upload::AttachmentAssetPointer;
pub use upload::AttachmentId;
pub use upload::AttachmentUploadError;
pub use upload::AttachmentUploadMode;
pub use upload::AttachmentUploadProgress;
pub use upload::AttachmentUploadState;
pub use upload::AttachmentUploadUpdate;
pub use upload::AttachmentUploader;
pub use upload::HttpConfig as AttachmentUploadHttpConfig;
pub use upload::pointer_id_from_value;

use serde::Deserialize;
use serde::Serialize;

const MAX_SUGGESTIONS: usize = 5;

/// The type of attachment included alongside a composer submission.
#[derive(Clone, Copy, Debug, PartialEq, Eq, Serialize, Deserialize)]
pub enum AttachmentKind {
    File,
    Image,
}

/// Metadata describing a file or asset attached via an `@` mention.
#[derive(Clone, Debug, PartialEq, Eq, Serialize, Deserialize)]
pub struct ComposerAttachment {
    pub kind: AttachmentKind,
    pub label: String,
    pub path: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub fs_path: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub start_line: Option<u32>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub end_line: Option<u32>,
    #[serde(skip, default)]
    pub id: AttachmentId,
    #[serde(skip_serializing, skip_deserializing)]
    pub upload: AttachmentUploadState,
}

impl ComposerAttachment {
    pub fn from_suggestion(id: AttachmentId, suggestion: &MentionSuggestion) -> Self {
        Self {
            kind: AttachmentKind::File,
            label: suggestion.label.clone(),
            path: suggestion.path.clone(),
            fs_path: suggestion.fs_path.clone(),
            start_line: suggestion.start_line,
            end_line: suggestion.end_line,
            id,
            upload: AttachmentUploadState::default(),
        }
    }
}

/// UI state for the active `@` mention query inside the composer.
#[derive(Clone, Debug, Default, PartialEq, Eq)]
pub struct MentionQueryState {
    pub current: Option<MentionToken>,
}

impl MentionQueryState {
    /// Returns true when the stored token changed.
    pub fn update_from(&mut self, token: Option<String>) -> bool {
        let next = token.map(MentionToken::from_query);
        if next != self.current {
            self.current = next;
            return true;
        }
        false
    }
}

/// Represents an `@` mention currently under the user's cursor.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct MentionToken {
    /// Query string without the leading `@`.
    pub query: String,
    /// Raw token including the `@` prefix.
    pub raw: String,
}

impl MentionToken {
    pub(crate) fn from_query(query: String) -> Self {
        let raw = format!("@{query}");
        Self { query, raw }
    }
}

/// A suggested file (or range within a file) that matches the active `@` token.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct MentionSuggestion {
    pub label: String,
    pub path: String,
    pub fs_path: Option<String>,
    pub start_line: Option<u32>,
    pub end_line: Option<u32>,
}

impl MentionSuggestion {
    pub fn new(label: impl Into<String>, path: impl Into<String>) -> Self {
        Self {
            label: label.into(),
            path: path.into(),
            fs_path: None,
            start_line: None,
            end_line: None,
        }
    }
}

/// Tracks suggestion list + selection for the mention picker overlay.
#[derive(Clone, Debug, Default, PartialEq, Eq)]
pub struct MentionPickerState {
    suggestions: Vec<MentionSuggestion>,
    selected: usize,
}

impl MentionPickerState {
    pub fn clear(&mut self) -> bool {
        if self.suggestions.is_empty() {
            return false;
        }
        self.suggestions.clear();
        self.selected = 0;
        true
    }

    pub fn move_selection(&mut self, delta: isize) {
        if self.suggestions.is_empty() {
            return;
        }
        let len = self.suggestions.len() as isize;
        let mut idx = self.selected as isize + delta;
        if idx < 0 {
            idx = len - 1;
        }
        if idx >= len {
            idx = 0;
        }
        self.selected = idx as usize;
    }

    pub fn selected_index(&self) -> usize {
        self.selected.min(self.suggestions.len().saturating_sub(1))
    }

    pub fn current(&self) -> Option<&MentionSuggestion> {
        self.suggestions.get(self.selected_index())
    }

    pub fn render_height(&self) -> u16 {
        let rows = self.suggestions.len().clamp(1, MAX_SUGGESTIONS) as u16;
        // Add borders + padding space.
        rows.saturating_add(2)
    }

    pub fn items(&self) -> &[MentionSuggestion] {
        &self.suggestions
    }

    pub fn set_suggestions(&mut self, suggestions: Vec<MentionSuggestion>) -> bool {
        let mut trimmed = suggestions;
        if trimmed.len() > MAX_SUGGESTIONS {
            trimmed.truncate(MAX_SUGGESTIONS);
        }
        if trimmed == self.suggestions {
            return false;
        }
        self.suggestions = trimmed;
        self.selected = 0;
        true
    }
}

#[cfg(test)]
mod tests {
    use super::AttachmentUploadState;
    use super::*;

    #[test]
    fn compose_attachment_from_suggestion_copies_fields() {
        let mut suggestion = MentionSuggestion::new("src/main.rs", "src/main.rs");
        suggestion.fs_path = Some("/repo/src/main.rs".to_string());
        suggestion.start_line = Some(10);
        suggestion.end_line = Some(20);
        let att = ComposerAttachment::from_suggestion(AttachmentId::new(42), &suggestion);
        assert_eq!(att.label, "src/main.rs");
        assert_eq!(att.path, "src/main.rs");
        assert_eq!(att.fs_path.as_deref(), Some("/repo/src/main.rs"));
        assert_eq!(att.start_line, Some(10));
        assert_eq!(att.end_line, Some(20));
        assert!(matches!(att.upload, AttachmentUploadState::NotStarted));
        assert_eq!(att.id.raw(), 42);
    }

    #[test]
    fn move_selection_wraps() {
        let _token = MentionToken::from_query("foo".to_string());
        let mut picker = MentionPickerState::default();
        assert!(picker.set_suggestions(vec![
            MentionSuggestion::new("src/foo.rs", "src/foo.rs"),
            MentionSuggestion::new("src/main.rs", "src/main.rs"),
        ]));
        picker.move_selection(1);
        assert_eq!(
            picker.selected_index(),
            1.min(picker.items().len().saturating_sub(1))
        );
        picker.move_selection(-1);
        assert_eq!(picker.selected_index(), 0);
    }

    #[test]
    fn refresh_none_clears_suggestions() {
        let _token = MentionToken::from_query("bar".to_string());
        let mut picker = MentionPickerState::default();
        assert!(
            picker.set_suggestions(vec![MentionSuggestion::new("docs/bar.md", "docs/bar.md")])
        );
        assert!(picker.clear());
        assert!(picker.items().is_empty());
    }
}
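For reference, this is how `MentionQueryState` and `MentionToken` compose as the user types: the state only reports a change when the token actually differs, which is what lets the composer debounce redraws. A hedged sketch of the call sequence (illustrative, not lifted from the TUI):

#[test]
fn mention_query_roundtrip() {
    let mut query = MentionQueryState::default();
    // Typing "@src" yields a token whose raw form keeps the prefix.
    assert!(query.update_from(Some("src".to_string())));
    assert_eq!(query.current.as_ref().map(|t| t.raw.as_str()), Some("@src"));
    // Re-submitting the same token reports no change.
    assert!(!query.update_from(Some("src".to_string())));
    // Clearing the token is a change again.
    assert!(query.update_from(None));
}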
@@ -1,605 +0,0 @@
use std::collections::HashMap;
use std::fmt;
use std::path::Path;
use std::path::PathBuf;
use std::sync::Arc;
use std::sync::atomic::AtomicBool;
use std::sync::atomic::Ordering;

use crate::util::append_error_log;
use chrono::Local;
use mime_guess::MimeGuess;
use reqwest::Client;
use serde::Deserialize;
use serde::Serialize;
use tokio::sync::mpsc;
use tokio::sync::mpsc::UnboundedReceiver;
use tokio::sync::mpsc::UnboundedSender;
use tracing::debug;
use tracing::warn;
use url::Url;

const UPLOAD_USE_CASE: &str = "codex";

/// Stable identifier assigned to each staged attachment.
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash, Default, Serialize, Deserialize)]
#[serde(transparent)]
pub struct AttachmentId(pub u64);

impl AttachmentId {
    pub const fn new(raw: u64) -> Self {
        Self(raw)
    }

    pub const fn raw(self) -> u64 {
        self.0
    }
}

/// Represents the lifecycle of an attachment upload initiated after an `@` mention.
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum AttachmentUploadState {
    NotStarted,
    Uploading(AttachmentUploadProgress),
    Uploaded(AttachmentUploadSuccess),
    Failed(AttachmentUploadError),
}

impl Default for AttachmentUploadState {
    fn default() -> Self {
        Self::NotStarted
    }
}

impl AttachmentUploadState {
    pub fn is_pending(&self) -> bool {
        matches!(self, Self::NotStarted | Self::Uploading(_))
    }

    pub fn is_uploaded(&self) -> bool {
        matches!(self, Self::Uploaded(_))
    }

    pub fn is_failed(&self) -> bool {
        matches!(self, Self::Failed(_))
    }
}

/// Progress for uploads where the total size is known.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct AttachmentUploadProgress {
    pub uploaded_bytes: u64,
    pub total_bytes: Option<u64>,
}

impl AttachmentUploadProgress {
    pub fn new(uploaded_bytes: u64, total_bytes: Option<u64>) -> Self {
        Self {
            uploaded_bytes,
            total_bytes,
        }
    }
}

/// Successful upload metadata containing the remote pointer.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct AttachmentUploadSuccess {
    pub asset_pointer: AttachmentAssetPointer,
    pub display_name: String,
}

impl AttachmentUploadSuccess {
    pub fn new(asset_pointer: AttachmentAssetPointer, display_name: impl Into<String>) -> Self {
        Self {
            asset_pointer,
            display_name: display_name.into(),
        }
    }
}

/// Describes the remote asset pointer returned by the file service.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct AttachmentAssetPointer {
    pub kind: AttachmentPointerKind,
    pub value: String,
}

impl AttachmentAssetPointer {
    pub fn new(kind: AttachmentPointerKind, value: impl Into<String>) -> Self {
        Self {
            kind,
            value: value.into(),
        }
    }
}

/// High-level pointer type so we can support both single file and container uploads.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum AttachmentPointerKind {
    File,
    Image,
    #[allow(dead_code)]
    Container,
}

impl fmt::Display for AttachmentPointerKind {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Self::File => write!(f, "file"),
            Self::Image => write!(f, "image"),
            Self::Container => write!(f, "container"),
        }
    }
}

/// Captures a user-visible error when uploading an attachment fails.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct AttachmentUploadError {
    pub message: String,
}

impl AttachmentUploadError {
    pub fn new(message: impl Into<String>) -> Self {
        Self {
            message: message.into(),
        }
    }
}

impl fmt::Display for AttachmentUploadError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "{}", self.message)
    }
}

/// Internal update emitted by the background uploader task.
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum AttachmentUploadUpdate {
    Started {
        id: AttachmentId,
        total_bytes: Option<u64>,
    },
    Finished {
        id: AttachmentId,
        result: Result<AttachmentUploadSuccess, AttachmentUploadError>,
    },
}

/// Configuration for attachment uploads.
#[derive(Clone, Debug)]
pub enum AttachmentUploadMode {
    Disabled,
    #[cfg_attr(not(test), allow(dead_code))]
    ImmediateSuccess,
    Http(HttpConfig),
}

#[derive(Clone, Debug)]
pub struct HttpConfig {
    pub base_url: String,
    pub bearer_token: Option<String>,
    pub chatgpt_account_id: Option<String>,
    pub user_agent: Option<String>,
}

impl HttpConfig {
    fn trimmed_base(&self) -> String {
        self.base_url.trim_end_matches('/').to_string()
    }
}

#[derive(Clone)]
enum AttachmentUploadBackend {
    Disabled,
    ImmediateSuccess,
    Http(Arc<AttachmentUploadHttp>),
}

#[derive(Clone)]
struct AttachmentUploadHttp {
    client: Client,
    base_url: String,
    bearer_token: Option<String>,
    chatgpt_account_id: Option<String>,
    user_agent: Option<String>,
}

impl AttachmentUploadHttp {
    fn apply_default_headers(&self, builder: reqwest::RequestBuilder) -> reqwest::RequestBuilder {
        let mut b = builder;
        if let Some(token) = &self.bearer_token {
            b = b.bearer_auth(token);
        }
        if let Some(acc) = &self.chatgpt_account_id {
            b = b.header("ChatGPT-Account-Id", acc);
        }
        if let Some(ua) = &self.user_agent {
            b = b.header(reqwest::header::USER_AGENT, ua.clone());
        }
        b
    }
}

/// Bookkeeping for in-flight attachment uploads, providing polling APIs for the UI thread.
pub struct AttachmentUploader {
    update_tx: UnboundedSender<AttachmentUploadUpdate>,
    update_rx: UnboundedReceiver<AttachmentUploadUpdate>,
    inflight: HashMap<AttachmentId, Arc<AtomicBool>>,
    backend: AttachmentUploadBackend,
}

impl AttachmentUploader {
    pub fn new(mode: AttachmentUploadMode) -> Self {
        let (tx, rx) = mpsc::unbounded_channel();
        let backend = match mode {
            AttachmentUploadMode::Disabled => AttachmentUploadBackend::Disabled,
            AttachmentUploadMode::ImmediateSuccess => AttachmentUploadBackend::ImmediateSuccess,
            AttachmentUploadMode::Http(cfg) => match Client::builder().build() {
                Ok(client) => AttachmentUploadBackend::Http(Arc::new(AttachmentUploadHttp {
                    client,
                    base_url: cfg.trimmed_base(),
                    bearer_token: cfg.bearer_token,
                    chatgpt_account_id: cfg.chatgpt_account_id,
                    user_agent: cfg.user_agent,
                })),
                Err(err) => {
                    warn!("attachment_upload.http_client_init_failed: {err}");
                    AttachmentUploadBackend::Disabled
                }
            },
        };
        Self {
            update_tx: tx,
            update_rx: rx,
            inflight: HashMap::new(),
            backend,
        }
    }

    pub fn start_upload(
        &mut self,
        id: AttachmentId,
        display_name: impl Into<String>,
        fs_path: PathBuf,
    ) -> Result<(), AttachmentUploadError> {
        if self.inflight.contains_key(&id) {
            return Err(AttachmentUploadError::new("upload already queued"));
        }
        if let AttachmentUploadBackend::Disabled = &self.backend {
            return Err(AttachmentUploadError::new(
                "file uploads are not available in this environment",
            ));
        }

        if !is_supported_image(&fs_path) {
            return Err(AttachmentUploadError::new(
                "only image files can be uploaded",
            ));
        }

        let cancel_token = Arc::new(AtomicBool::new(false));
        self.inflight.insert(id, cancel_token.clone());
        let tx = self.update_tx.clone();
        let backend = self.backend.clone();
        let path_clone = fs_path.clone();
        let label = display_name.into();
        tokio::spawn(async move {
            let metadata = tokio::fs::metadata(&fs_path).await.ok();
            let total_bytes = metadata.as_ref().map(std::fs::Metadata::len);
            let _ = tx.send(AttachmentUploadUpdate::Started { id, total_bytes });

            if cancel_token.load(Ordering::Relaxed) {
                let _ = tx.send(AttachmentUploadUpdate::Finished {
                    id,
                    result: Err(AttachmentUploadError::new("upload canceled")),
                });
                return;
            }

            let result = match backend {
                AttachmentUploadBackend::Disabled => Err(AttachmentUploadError::new(
                    "file uploads are not available in this environment",
                )),
                AttachmentUploadBackend::ImmediateSuccess => {
                    let pointer = AttachmentAssetPointer::new(
                        AttachmentPointerKind::File,
                        format!("file-service://mock-{}", id.raw()),
                    );
                    Ok(AttachmentUploadSuccess::new(pointer, label.clone()))
                }
                AttachmentUploadBackend::Http(http) => {
                    perform_http_upload(
                        http,
                        &path_clone,
                        &label,
                        total_bytes,
                        cancel_token.clone(),
                    )
                    .await
                }
            };

            let _ = tx.send(AttachmentUploadUpdate::Finished { id, result });
        });
        Ok(())
    }

    #[cfg_attr(not(test), allow(dead_code))]
    pub fn cancel_all(&mut self) {
        for cancel in self.inflight.values() {
            cancel.store(true, Ordering::Relaxed);
        }
    }

    pub fn poll(&mut self) -> Vec<AttachmentUploadUpdate> {
        let mut out = Vec::new();
        while let Ok(update) = self.update_rx.try_recv() {
            if let AttachmentUploadUpdate::Finished { id, .. } = &update {
                self.inflight.remove(id);
            }
            out.push(update);
        }
        out
    }
}

impl Default for AttachmentUploader {
    fn default() -> Self {
        Self::new(AttachmentUploadMode::Disabled)
    }
}

async fn perform_http_upload(
    http: Arc<AttachmentUploadHttp>,
    fs_path: &Path,
    display_label: &str,
    total_bytes: Option<u64>,
    cancel_token: Arc<AtomicBool>,
) -> Result<AttachmentUploadSuccess, AttachmentUploadError> {
    let file_bytes = tokio::fs::read(fs_path)
        .await
        .map_err(|e| AttachmentUploadError::new(format!("failed to read file: {e}")))?;

    if cancel_token.load(Ordering::Relaxed) {
        return Err(AttachmentUploadError::new("upload canceled"));
    }

    let file_name = fs_path
        .file_name()
        .and_then(|s| s.to_str())
        .map(std::string::ToString::to_string)
        .unwrap_or_else(|| display_label.to_string());

    let create_url = format!("{}/files", http.base_url);
    let body = CreateFileRequest {
        file_name: &file_name,
        file_size: total_bytes.unwrap_or(file_bytes.len() as u64),
        use_case: UPLOAD_USE_CASE,
        timezone_offset_min: (Local::now().offset().utc_minus_local() / 60),
        reset_rate_limits: false,
    };

    let create_resp = http
        .apply_default_headers(http.client.post(&create_url))
        .json(&body)
        .send()
        .await
        .map_err(|e| AttachmentUploadError::new(format!("file create failed: {e}")))?;
    if !create_resp.status().is_success() {
        let status = create_resp.status();
        let text = create_resp.text().await.unwrap_or_default();
        return Err(AttachmentUploadError::new(format!(
            "file create request failed status={status} body={text}"
        )));
    }
    let created: CreateFileResponse = create_resp
        .json()
        .await
        .map_err(|e| AttachmentUploadError::new(format!("decode file create response: {e}")))?;

    if cancel_token.load(Ordering::Relaxed) {
        return Err(AttachmentUploadError::new("upload canceled"));
    }

    let upload_url = resolve_upload_url(&created.upload_url)
        .ok_or_else(|| AttachmentUploadError::new("invalid upload url"))?;

    let mime = infer_image_mime(fs_path)
        .ok_or_else(|| AttachmentUploadError::new("only image files can be uploaded"))?;
    let mut azure_req = http.client.put(&upload_url);
    azure_req = azure_req
        .header("x-ms-blob-type", "BlockBlob")
        .header("x-ms-version", "2020-04-08");

    azure_req = azure_req
        .header(reqwest::header::CONTENT_TYPE, mime.as_str())
        .header("x-ms-blob-content-type", mime.as_str());

    let azure_resp = azure_req
        .body(file_bytes)
        .send()
        .await
        .map_err(|e| AttachmentUploadError::new(format!("blob upload failed: {e}")))?;

    if !(200..300).contains(&azure_resp.status().as_u16()) {
        let status = azure_resp.status();
        let text = azure_resp.text().await.unwrap_or_default();
        return Err(AttachmentUploadError::new(format!(
            "blob upload failed status={status} body={text}"
        )));
    }

    if cancel_token.load(Ordering::Relaxed) {
        return Err(AttachmentUploadError::new("upload canceled"));
    }

    // Finalization must succeed so the pointer can be used; surface any failure
    // to the caller after logging for easier debugging.
    if let Err(err) = finalize_upload(http.clone(), &created.file_id, &file_name).await {
        let reason = err.message.clone();
        warn!(
            "mention.attachment.upload.finalize_failed file_id={} reason={reason}",
            created.file_id
        );
        append_error_log(format!(
            "mention.attachment.upload.finalize_failed file_id={} reason={reason}",
            created.file_id
        ));
        return Err(err);
    }

    let pointer = asset_pointer_from_id(&created.file_id);
    debug!(
        "mention.attachment.upload.success file_id={} pointer={}",
        created.file_id, pointer
    );
    let pointer_kind = AttachmentPointerKind::Image;

    Ok(AttachmentUploadSuccess::new(
        AttachmentAssetPointer::new(pointer_kind, pointer),
        display_label,
    ))
}

fn asset_pointer_from_id(file_id: &str) -> String {
    if file_id.starts_with("file_") {
        format!("sediment://{file_id}")
    } else {
        format!("file-service://{file_id}")
    }
}

pub fn pointer_id_from_value(pointer: &str) -> Option<String> {
    pointer
        .strip_prefix("file-service://")
        .or_else(|| pointer.strip_prefix("sediment://"))
        .map(str::to_string)
        .or_else(|| (!pointer.is_empty()).then(|| pointer.to_string()))
}

async fn finalize_upload(
    http: Arc<AttachmentUploadHttp>,
    file_id: &str,
    file_name: &str,
) -> Result<(), AttachmentUploadError> {
    let finalize_url = format!("{}/files/process_upload_stream", http.base_url);
    let body = FinalizeUploadRequest {
        file_id,
        use_case: UPLOAD_USE_CASE,
        index_for_retrieval: false,
        file_name,
    };
    let finalize_resp = http
        .apply_default_headers(http.client.post(&finalize_url))
        .json(&body)
        .send()
        .await
        .map_err(|e| AttachmentUploadError::new(format!("finalize upload failed: {e}")))?;
    if !finalize_resp.status().is_success() {
        let status = finalize_resp.status();
        let text = finalize_resp.text().await.unwrap_or_default();
        return Err(AttachmentUploadError::new(format!(
            "finalize upload failed status={status} body={text}"
        )));
    }
    Ok(())
}

fn resolve_upload_url(url: &str) -> Option<String> {
    let parsed = Url::parse(url).ok()?;
    if !parsed.as_str().to_lowercase().contains("estuary") {
        return Some(parsed.into());
    }
    parsed
        .query_pairs()
        .find(|(k, _)| k == "upload_url")
        .map(|(_, v)| v.into_owned())
}

#[derive(Serialize)]
struct CreateFileRequest<'a> {
    file_name: &'a str,
    file_size: u64,
    use_case: &'a str,
    timezone_offset_min: i32,
    reset_rate_limits: bool,
}

#[derive(Serialize)]
struct FinalizeUploadRequest<'a> {
    file_id: &'a str,
    use_case: &'a str,
    index_for_retrieval: bool,
    file_name: &'a str,
}

#[derive(Deserialize)]
struct CreateFileResponse {
    file_id: String,
    upload_url: String,
}

fn is_supported_image(path: &Path) -> bool {
    infer_image_mime(path).is_some()
}

fn infer_image_mime(path: &Path) -> Option<String> {
    let guess = MimeGuess::from_path(path)
        .first_raw()
        .map(std::string::ToString::to_string);
    if let Some(m) = guess {
        if m.starts_with("image/") {
            return Some(m);
        }
    }

    let ext = path
        .extension()
        .and_then(|ext| ext.to_str())
        .map(|ext| ext.trim().to_ascii_lowercase())?;

    let mime = match ext.as_str() {
        "png" => "image/png",
        "jpg" | "jpeg" => "image/jpeg",
        "gif" => "image/gif",
        "webp" => "image/webp",
        "bmp" => "image/bmp",
        "svg" => "image/svg+xml",
        "heic" => "image/heic",
        "heif" => "image/heif",
        _ => return None,
    };

    Some(mime.to_string())
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::path::Path;

    #[test]
    fn infer_image_mime_accepts_common_extensions() {
        let cases = [
            ("foo.png", Some("image/png")),
            ("bar.JPG", Some("image/jpeg")),
            ("baz.jpeg", Some("image/jpeg")),
            ("img.gif", Some("image/gif")),
            ("slide.WEBP", Some("image/webp")),
            ("art.bmp", Some("image/bmp")),
            ("vector.svg", Some("image/svg+xml")),
            ("photo.heic", Some("image/heic")),
            ("photo.heif", Some("image/heif")),
        ];

        for (path, expected) in cases {
            let actual = infer_image_mime(Path::new(path));
            assert_eq!(actual.as_deref(), expected, "case {path}");
        }
    }

    #[test]
    fn infer_image_mime_rejects_unknown_extension() {
        assert!(infer_image_mime(Path::new("doc.txt")).is_none());
    }
}
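`asset_pointer_from_id` and `pointer_id_from_value` are meant to round-trip: ids beginning with `file_` get the `sediment://` scheme, everything else `file-service://`, and unknown schemes fall back to the raw value. A small hypothetical check of that contract (would live in the module's test block above):

#[test]
fn pointer_scheme_roundtrip() {
    assert_eq!(asset_pointer_from_id("file_abc"), "sediment://file_abc");
    assert_eq!(asset_pointer_from_id("xyz"), "file-service://xyz");
    assert_eq!(pointer_id_from_value("sediment://file_abc").as_deref(), Some("file_abc"));
    assert_eq!(pointer_id_from_value("file-service://xyz").as_deref(), Some("xyz"));
    // Unknown schemes fall back to the raw value; empty input yields None.
    assert_eq!(pointer_id_from_value("opaque").as_deref(), Some("opaque"));
    assert_eq!(pointer_id_from_value(""), None);
}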
@@ -1,106 +0,0 @@
use codex_backend_client::Client as BackendClient;
use codex_cloud_tasks::util::extract_chatgpt_account_id;
use codex_cloud_tasks::util::normalize_base_url;
use codex_cloud_tasks::util::set_user_agent_suffix;
use codex_core::config::find_codex_home;
use codex_core::default_client::get_codex_user_agent;
use codex_login::AuthManager;
use std::time::Duration;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Base URL (default to ChatGPT backend API) and normalize to canonical form
    let raw_base = std::env::var("CODEX_CLOUD_TASKS_BASE_URL")
        .unwrap_or_else(|_| "https://chatgpt.com/backend-api".to_string());
    let base_url = normalize_base_url(&raw_base);
    println!("base_url: {base_url}");
    let path_style = if base_url.contains("/backend-api") {
        "wham"
    } else {
        "codex-api"
    };
    println!("path_style: {path_style}");

    // Locate CODEX_HOME and try to load ChatGPT auth
    let codex_home = match find_codex_home() {
        Ok(p) => {
            println!("codex_home: {}", p.display());
            Some(p)
        }
        Err(e) => {
            println!("codex_home: <not found> ({e})");
            None
        }
    };

    // Build backend client with UA
    set_user_agent_suffix("codex_cloud_tasks_conncheck");
    let ua = get_codex_user_agent();
    let mut client = BackendClient::new(base_url.clone())?.with_user_agent(ua);

    // Attach bearer token if available from ChatGPT auth
    let mut have_auth = false;
    if let Some(home) = codex_home {
        let authm = AuthManager::new(home);
        if let Some(auth) = authm.auth() {
            match auth.get_token().await {
                Ok(token) if !token.is_empty() => {
                    have_auth = true;
                    println!("auth: ChatGPT token present ({} chars)", token.len());
                    // Add Authorization header
                    client = client.with_bearer_token(&token);

                    // Attempt to extract ChatGPT account id from the JWT and set header.
                    if let Some(account_id) = extract_chatgpt_account_id(&token) {
                        println!("auth: ChatGPT-Account-Id: {account_id}");
                        client = client.with_chatgpt_account_id(account_id);
                    } else if let Some(acc) = auth.get_account_id() {
                        // Fallback: some older auth.jsons persist account_id
                        println!("auth: ChatGPT-Account-Id (from auth.json): {acc}");
                        client = client.with_chatgpt_account_id(acc);
                    }
                }
                Ok(_) => {
                    println!("auth: ChatGPT token empty");
                }
                Err(e) => {
                    println!("auth: failed to load ChatGPT token: {e}");
                }
            }
        } else {
            println!("auth: no ChatGPT auth.json");
        }
    }

    if !have_auth {
        println!("note: Online endpoints typically require ChatGPT sign-in. Run: `codex login`");
    }

    // Attempt the /list call with a short timeout to avoid hanging
    match path_style {
        "wham" => println!("request: GET /wham/tasks/list?limit=5&task_filter=current"),
        _ => println!("request: GET /api/codex/tasks/list?limit=5&task_filter=current"),
    }
    let fut = client.list_tasks(Some(5), Some("current"), None);
    let res = tokio::time::timeout(Duration::from_secs(30), fut).await;
    match res {
        Err(_) => {
            println!("error: request timed out after 30s");
            std::process::exit(2);
        }
        Ok(Err(e)) => {
            // backend-client includes HTTP status and body in errors.
            println!("error: {e}");
            std::process::exit(1);
        }
        Ok(Ok(list)) => {
            println!("ok: received {} tasks", list.items.len());
            for item in list.items.iter().take(5) {
                println!("- {} — {}", item.id, item.title);
            }
            // Keep output concise; omit full JSON payload to stay readable.
        }
    }

    Ok(())
}
@@ -1,45 +0,0 @@
use codex_backend_client::Client as BackendClient;
use codex_cloud_tasks::util::set_user_agent_suffix;
use codex_core::config::find_codex_home;
use codex_core::default_client::get_codex_user_agent;
use codex_login::AuthManager;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let base_url = std::env::var("CODEX_CLOUD_TASKS_BASE_URL")
        .unwrap_or_else(|_| "https://chatgpt.com/backend-api".to_string());
    set_user_agent_suffix("codex_cloud_tasks_detailcheck");
    let ua = get_codex_user_agent();
    let mut client = BackendClient::new(base_url)?.with_user_agent(ua);

    if let Ok(home) = find_codex_home() {
        let am = AuthManager::new(home);
        if let Some(auth) = am.auth()
            && let Ok(tok) = auth.get_token().await
        {
            client = client.with_bearer_token(tok);
        }
    }

    let list = client.list_tasks(Some(5), Some("current"), None).await?;
    println!("items: {}", list.items.len());
    for item in list.items.iter().take(5) {
        println!("item: {} {}", item.id, item.title);
        let (details, body, ct) = client.get_task_details_with_body(&item.id).await?;
        let diff = codex_backend_client::CodeTaskDetailsResponseExt::unified_diff(&details);
        match diff {
            Some(d) => println!(
                "unified diff len={} sample=\n{}",
                d.len(),
                &d.lines().take(10).collect::<Vec<_>>().join("\n")
            ),
            None => {
                println!(
                    "no unified diff found; ct={ct}; body sample=\n{}",
                    &body.chars().take(5000).collect::<String>()
                );
            }
        }
    }
    Ok(())
}
@@ -1,136 +0,0 @@
use base64::Engine;
use clap::Parser;
use codex_cloud_tasks::util::set_user_agent_suffix;
use codex_core::config::find_codex_home;
use codex_core::default_client::get_codex_user_agent;
use codex_login::AuthManager;
use reqwest::header::AUTHORIZATION;
use reqwest::header::HeaderMap;
use reqwest::header::HeaderName;
use reqwest::header::HeaderValue;

#[derive(Debug, Parser)]
#[command(version, about = "Resolve Codex environment id (debug helper)")]
struct Args {
    /// Optional override for environment id; if present we just echo it.
    #[arg(long = "env-id")]
    environment_id: Option<String>,
    /// Optional label to select a matching environment (case-insensitive exact match).
    #[arg(long = "env-label")]
    environment_label: Option<String>,
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let args = Args::parse();

    // Base URL (default to ChatGPT backend API) with normalization
    let mut base_url = std::env::var("CODEX_CLOUD_TASKS_BASE_URL")
        .unwrap_or_else(|_| "https://chatgpt.com/backend-api".to_string());
    while base_url.ends_with('/') {
        base_url.pop();
    }
    if (base_url.starts_with("https://chatgpt.com")
        || base_url.starts_with("https://chat.openai.com"))
        && !base_url.contains("/backend-api")
    {
        base_url = format!("{base_url}/backend-api");
    }
    println!("base_url: {base_url}");
    println!(
        "path_style: {}",
        if base_url.contains("/backend-api") {
            "wham"
        } else {
            "codex-api"
        }
    );

    // Build headers: UA + ChatGPT auth if available
    set_user_agent_suffix("codex_cloud_tasks_envcheck");
    let ua = get_codex_user_agent();
    let mut headers = HeaderMap::new();
    headers.insert(
        reqwest::header::USER_AGENT,
        HeaderValue::from_str(&ua).unwrap_or(HeaderValue::from_static("codex-cli")),
    );

    // Locate CODEX_HOME and try to load ChatGPT auth
    if let Ok(home) = find_codex_home() {
        println!("codex_home: {}", home.display());
        let authm = AuthManager::new(home);
        if let Some(auth) = authm.auth() {
            match auth.get_token().await {
                Ok(token) if !token.is_empty() => {
                    println!("auth: ChatGPT token present ({} chars)", token.len());
                    let value = format!("Bearer {token}");
                    if let Ok(hv) = HeaderValue::from_str(&value) {
                        headers.insert(AUTHORIZATION, hv);
                    }
                    if let Some(account_id) = auth
                        .get_account_id()
                        .or_else(|| extract_chatgpt_account_id(&token))
                    {
                        println!("auth: ChatGPT-Account-Id: {account_id}");
                        if let Ok(name) = HeaderName::from_bytes(b"ChatGPT-Account-Id")
                            && let Ok(hv) = HeaderValue::from_str(&account_id)
                        {
                            headers.insert(name, hv);
                        }
                    }
                }
                Ok(_) => println!("auth: ChatGPT token empty"),
                Err(e) => println!("auth: failed to load ChatGPT token: {e}"),
            }
        } else {
            println!("auth: no ChatGPT auth.json");
        }
    } else {
        println!("codex_home: <not found>");
    }

    // If user supplied an environment id, just echo it and exit.
    if let Some(id) = args.environment_id {
        println!("env: provided env-id={id}");
        return Ok(());
    }

    // Auto-detect environment id using shared env_detect
    match codex_cloud_tasks::env_detect::autodetect_environment_id(
        &base_url,
        &headers,
        args.environment_label,
    )
    .await
    {
        Ok(sel) => {
            println!(
                "env: selected environment_id={} label={}",
                sel.id,
                sel.label.unwrap_or_else(|| "<none>".to_string())
            );
            Ok(())
        }
        Err(e) => {
            println!("env: failed: {e}");
            std::process::exit(2)
        }
    }
}

fn extract_chatgpt_account_id(token: &str) -> Option<String> {
    // JWT: header.payload.signature
    let mut parts = token.split('.');
    let (_h, payload_b64, _s) = match (parts.next(), parts.next(), parts.next()) {
        (Some(h), Some(p), Some(s)) if !h.is_empty() && !p.is_empty() && !s.is_empty() => (h, p, s),
        _ => return None,
    };
    let payload_bytes = base64::engine::general_purpose::URL_SAFE_NO_PAD
        .decode(payload_b64)
        .ok()?;
    let v: serde_json::Value = serde_json::from_slice(&payload_bytes).ok()?;
    v.get("https://api.openai.com/auth")
        .and_then(|auth| auth.get("chatgpt_account_id"))
        .and_then(|id| id.as_str())
        .map(str::to_string)
}
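`extract_chatgpt_account_id` reads the claim under the `https://api.openai.com/auth` key of the JWT's base64url payload. A sketch of the decode path with a hand-built token; the payload here is fabricated purely for illustration:

#[test]
fn extracts_account_id_from_jwt_payload() {
    use base64::Engine;
    // Build "header.payload.signature" around a minimal fabricated payload.
    let payload = serde_json::json!({
        "https://api.openai.com/auth": { "chatgpt_account_id": "acct-123" }
    });
    let b64 = base64::engine::general_purpose::URL_SAFE_NO_PAD.encode(payload.to_string());
    let token = format!("h.{b64}.s");
    assert_eq!(extract_chatgpt_account_id(&token).as_deref(), Some("acct-123"));
}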
@@ -1,206 +0,0 @@
use base64::Engine;
use clap::Parser;
use codex_cloud_tasks::util::set_user_agent_suffix;
use codex_core::config::find_codex_home;
use codex_core::default_client::get_codex_user_agent;
use codex_login::AuthManager;
use reqwest::header::AUTHORIZATION;
use reqwest::header::CONTENT_TYPE;
use reqwest::header::HeaderMap;
use reqwest::header::HeaderName;
use reqwest::header::HeaderValue;

#[derive(Debug, Parser)]
#[command(version, about = "Create a new Codex cloud task (debug helper)")]
struct Args {
    /// Optional override for environment id; if absent we auto-detect.
    #[arg(long = "env-id")]
    environment_id: Option<String>,
    /// Optional label match for environment selection (case-insensitive, exact match).
    #[arg(long = "env-label")]
    environment_label: Option<String>,
    /// Branch or ref to use (e.g., main)
    #[arg(long = "ref", default_value = "main")]
    git_ref: String,
    /// Run environment in QA (ask) mode
    #[arg(long = "qa-mode", default_value_t = false)]
    qa_mode: bool,
    /// Task prompt text
    #[arg(required = true)]
    prompt: Vec<String>,
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let args = Args::parse();
    let prompt = args.prompt.join(" ");

    // Base URL (default to ChatGPT backend API)
    let mut base_url = std::env::var("CODEX_CLOUD_TASKS_BASE_URL")
        .unwrap_or_else(|_| "https://chatgpt.com/backend-api".to_string());
    while base_url.ends_with('/') {
        base_url.pop();
    }
    if (base_url.starts_with("https://chatgpt.com")
        || base_url.starts_with("https://chat.openai.com"))
        && !base_url.contains("/backend-api")
    {
        base_url = format!("{base_url}/backend-api");
    }
    println!("base_url: {base_url}");
    let is_wham = base_url.contains("/backend-api");
    println!("path_style: {}", if is_wham { "wham" } else { "codex-api" });

    // Build headers: UA + ChatGPT auth if available
    set_user_agent_suffix("codex_cloud_tasks_newtask");
    let ua = get_codex_user_agent();
    let mut headers = HeaderMap::new();
    headers.insert(
        reqwest::header::USER_AGENT,
        HeaderValue::from_str(&ua).unwrap_or(HeaderValue::from_static("codex-cli")),
    );
    let mut have_auth = false;
    // Locate CODEX_HOME and try to load ChatGPT auth
    if let Ok(home) = find_codex_home() {
        let authm = AuthManager::new(home);
        if let Some(auth) = authm.auth() {
            match auth.get_token().await {
                Ok(token) if !token.is_empty() => {
                    have_auth = true;
                    println!("auth: ChatGPT token present ({} chars)", token.len());
                    let value = format!("Bearer {token}");
                    if let Ok(hv) = HeaderValue::from_str(&value) {
                        headers.insert(AUTHORIZATION, hv);
                    }
                    if let Some(account_id) = auth
                        .get_account_id()
                        .or_else(|| extract_chatgpt_account_id(&token))
                    {
                        println!("auth: ChatGPT-Account-Id: {account_id}");
                        if let Ok(name) = HeaderName::from_bytes(b"ChatGPT-Account-Id")
                            && let Ok(hv) = HeaderValue::from_str(&account_id)
                        {
                            headers.insert(name, hv);
                        }
                    }
                }
                Ok(_) => println!("auth: ChatGPT token empty"),
                Err(e) => println!("auth: failed to load ChatGPT token: {e}"),
            }
        } else {
            println!("auth: no ChatGPT auth.json");
        }
    }
    if !have_auth {
        println!("note: Online endpoints typically require ChatGPT sign-in. Run: `codex login`");
    }

    // Determine environment id: prefer flag, then by-repo lookup, then full list.
    let env_id = if let Some(id) = args.environment_id.clone() {
        println!("env: using provided env-id={id}");
        id
    } else {
        match codex_cloud_tasks::env_detect::autodetect_environment_id(
            &base_url,
            &headers,
            args.environment_label.clone(),
        )
        .await
        {
            Ok(sel) => sel.id,
            Err(e) => {
                println!("env: failed to auto-detect environment: {e}");
                std::process::exit(2);
            }
        }
    };
    println!("env: selected environment_id={env_id}");

    // Build request payload patterned after VSCode: POST /wham/tasks
    let url = if is_wham {
        format!("{base_url}/wham/tasks")
    } else {
        format!("{base_url}/api/codex/tasks")
    };
    println!(
        "request: POST {}",
        url.strip_prefix(&base_url).unwrap_or(&url)
    );

    // input_items
    let mut input_items: Vec<serde_json::Value> = Vec::new();
    input_items.push(serde_json::json!({
        "type": "message",
        "role": "user",
        "content": [{ "content_type": "text", "text": prompt }]
    }));

    // Optional: starting diff via env var for quick testing
    if let Ok(diff) = std::env::var("CODEX_STARTING_DIFF")
        && !diff.is_empty()
    {
        input_items.push(serde_json::json!({
            "type": "pre_apply_patch",
            "output_diff": { "diff": diff }
        }));
    }

    let request_body = serde_json::json!({
        "new_task": {
            "environment_id": env_id,
            "branch": args.git_ref,
            "run_environment_in_qa_mode": args.qa_mode,
        },
        "input_items": input_items,
    });

    let http = reqwest::Client::builder().build()?;
    let res = http
        .post(&url)
        .headers(headers)
        .header(CONTENT_TYPE, HeaderValue::from_static("application/json"))
        .json(&request_body)
        .send()
        .await?;

    let status = res.status();
    let ct = res
        .headers()
        .get(CONTENT_TYPE)
        .and_then(|v| v.to_str().ok())
        .unwrap_or("")
        .to_string();
    let body = res.text().await.unwrap_or_default();
    println!("status: {status}");
    println!("content-type: {ct}");
    match serde_json::from_str::<serde_json::Value>(&body) {
        Ok(v) => println!(
            "response (pretty JSON):\n{}",
            serde_json::to_string_pretty(&v).unwrap_or(body)
        ),
        Err(_) => println!("response (raw):\n{body}"),
    }

    if !status.is_success() {
        // Exit non-zero on failure
        std::process::exit(1);
    }
    Ok(())
}

fn extract_chatgpt_account_id(token: &str) -> Option<String> {
    // JWT: header.payload.signature
    let mut parts = token.split('.');
    let (_h, payload_b64, _s) = match (parts.next(), parts.next(), parts.next()) {
        (Some(h), Some(p), Some(s)) if !h.is_empty() && !p.is_empty() && !s.is_empty() => (h, p, s),
        _ => return None,
    };
    let payload_bytes = base64::engine::general_purpose::URL_SAFE_NO_PAD
        .decode(payload_b64)
        .ok()?;
    let v: serde_json::Value = serde_json::from_slice(&payload_bytes).ok()?;
    v.get("https://api.openai.com/auth")
        .and_then(|auth| auth.get("chatgpt_account_id"))
        .and_then(|id| id.as_str())
        .map(str::to_string)
}
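For reference, a minimal sketch of the body this helper assembles, with all values as hypothetical placeholders (`env_123` and the prompt text are made up; the shape simply mirrors the `serde_json::json!` calls above):

```rust
// Sketch only, not part of the diff: the payload shape sent to POST /wham/tasks.
let example = serde_json::json!({
    "new_task": {
        "environment_id": "env_123",         // from --env-id or auto-detection
        "branch": "main",                    // from --ref
        "run_environment_in_qa_mode": false, // from --qa-mode
    },
    "input_items": [{
        "type": "message",
        "role": "user",
        "content": [{ "content_type": "text", "text": "Fix the flaky test" }]
    }]
});
assert!(example.get("new_task").is_some());
```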
@@ -1,9 +0,0 @@
use clap::Parser;
use codex_common::CliConfigOverrides;

#[derive(Parser, Debug, Default)]
#[command(version)]
pub struct Cli {
    #[clap(skip)]
    pub config_overrides: CliConfigOverrides,
}
@@ -1,405 +0,0 @@
use reqwest::header::CONTENT_TYPE;
use reqwest::header::HeaderMap;
use std::collections::HashMap;
use tracing::info;
use tracing::warn;

#[derive(Debug, Clone, serde::Deserialize)]
struct CodeEnvironment {
    id: String,
    #[serde(default)]
    label: Option<String>,
    #[serde(default)]
    is_pinned: Option<bool>,
    #[serde(default)]
    task_count: Option<i64>,
    #[serde(default)]
    repo_map: Option<HashMap<String, GitRepository>>,
}

#[derive(Debug, Clone, serde::Deserialize)]
struct GitRepository {
    #[serde(default)]
    repository_full_name: Option<String>,
    #[serde(default)]
    default_branch: Option<String>,
}

#[derive(Debug, Clone)]
pub struct AutodetectSelection {
    pub id: String,
    pub label: Option<String>,
    pub default_branch: Option<String>,
}

fn clean_branch(branch: Option<&str>) -> Option<String> {
    branch
        .map(str::trim)
        .filter(|s| !s.is_empty())
        .map(std::string::ToString::to_string)
}

fn default_branch_from_env(env: &CodeEnvironment, repo_hint: Option<&str>) -> Option<String> {
    let repo_map = env.repo_map.as_ref()?;
    if let Some(hint) = repo_hint {
        if let Some(repo) = repo_map
            .values()
            .find(|repo| repo.repository_full_name.as_deref() == Some(hint))
            && let Some(branch) = clean_branch(repo.default_branch.as_deref())
        {
            return Some(branch);
        }
        if let Some(repo) = repo_map.get(hint)
            && let Some(branch) = clean_branch(repo.default_branch.as_deref())
        {
            return Some(branch);
        }
    }
    repo_map
        .values()
        .find_map(|repo| clean_branch(repo.default_branch.as_deref()))
}

fn merge_environment_row(
    map: &mut HashMap<String, crate::app::EnvironmentRow>,
    env: &CodeEnvironment,
    repo_hint: Option<&str>,
) {
    let default_branch = default_branch_from_env(env, repo_hint);
    let repo_hint_owned = repo_hint.map(str::to_string);
    let entry = map
        .entry(env.id.clone())
        .or_insert_with(|| crate::app::EnvironmentRow {
            id: env.id.clone(),
            label: env.label.clone(),
            is_pinned: env.is_pinned.unwrap_or(false),
            repo_hints: repo_hint_owned.clone(),
            default_branch: default_branch.clone(),
        });
    if entry.label.is_none() {
        entry.label = env.label.clone();
    }
    entry.is_pinned = entry.is_pinned || env.is_pinned.unwrap_or(false);
    if entry.repo_hints.is_none() {
        entry.repo_hints = repo_hint_owned;
    }
    if let Some(branch) = default_branch {
        entry.default_branch = Some(branch);
    }
}

pub async fn autodetect_environment_id(
    base_url: &str,
    headers: &HeaderMap,
    desired_label: Option<String>,
) -> anyhow::Result<AutodetectSelection> {
    // 1) Try repo-specific environments based on local git origins (GitHub only, like VSCode)
    let origins = get_git_origins();
    crate::append_error_log(format!("env: git origins: {origins:?}"));
    let mut by_repo_envs: Vec<CodeEnvironment> = Vec::new();
    for origin in &origins {
        if let Some((owner, repo)) = parse_owner_repo(origin) {
            let url = if base_url.contains("/backend-api") {
                format!(
                    "{}/wham/environments/by-repo/{}/{}/{}",
                    base_url, "github", owner, repo
                )
            } else {
                format!(
                    "{}/api/codex/environments/by-repo/{}/{}/{}",
                    base_url, "github", owner, repo
                )
            };
            crate::append_error_log(format!("env: GET {url}"));
            match get_json::<Vec<CodeEnvironment>>(&url, headers).await {
                Ok(mut list) => {
                    crate::append_error_log(format!(
                        "env: by-repo returned {} env(s) for {owner}/{repo}",
                        list.len(),
                    ));
                    by_repo_envs.append(&mut list);
                }
                Err(e) => crate::append_error_log(format!(
                    "env: by-repo fetch failed for {owner}/{repo}: {e}"
                )),
            }
        }
    }
    if let Some(env) = pick_environment_row(&by_repo_envs, desired_label.as_deref()) {
        return Ok(AutodetectSelection {
            id: env.id.clone(),
            label: env.label.as_deref().map(str::to_owned),
            default_branch: default_branch_from_env(&env, None),
        });
    }

    // 2) Fallback to the full list
    let list_url = if base_url.contains("/backend-api") {
        format!("{base_url}/wham/environments")
    } else {
        format!("{base_url}/api/codex/environments")
    };
    crate::append_error_log(format!("env: GET {list_url}"));
    // Fetch and log the full environments JSON for debugging
    let http = reqwest::Client::builder().build()?;
    let res = http.get(&list_url).headers(headers.clone()).send().await?;
    let status = res.status();
    let ct = res
        .headers()
        .get(CONTENT_TYPE)
        .and_then(|v| v.to_str().ok())
        .unwrap_or("")
        .to_string();
    let body = res.text().await.unwrap_or_default();
    crate::append_error_log(format!("env: status={status} content-type={ct}"));
    match serde_json::from_str::<serde_json::Value>(&body) {
        Ok(v) => {
            let pretty = serde_json::to_string_pretty(&v).unwrap_or(body.clone());
            crate::append_error_log(format!("env: /environments JSON (pretty):\n{pretty}"));
        }
        Err(_) => crate::append_error_log(format!("env: /environments (raw):\n{body}")),
    }
    if !status.is_success() {
        anyhow::bail!("GET {list_url} failed: {status}; content-type={ct}; body={body}");
    }
    let all_envs: Vec<CodeEnvironment> = serde_json::from_str(&body).map_err(|e| {
        anyhow::anyhow!("Decode error for {list_url}: {e}; content-type={ct}; body={body}")
    })?;
    if let Some(env) = pick_environment_row(&all_envs, desired_label.as_deref()) {
        return Ok(AutodetectSelection {
            id: env.id.clone(),
            label: env.label.as_deref().map(str::to_owned),
            default_branch: default_branch_from_env(&env, None),
        });
    }
    anyhow::bail!("no environments available")
}

fn pick_environment_row(
    envs: &[CodeEnvironment],
    desired_label: Option<&str>,
) -> Option<CodeEnvironment> {
    if envs.is_empty() {
        return None;
    }
    if let Some(label) = desired_label {
        let lc = label.to_lowercase();
        if let Some(e) = envs
            .iter()
            .find(|e| e.label.as_deref().unwrap_or("").to_lowercase() == lc)
        {
            crate::append_error_log(format!("env: matched by label: {label} -> {}", e.id));
            return Some(e.clone());
        }
    }
    if envs.len() == 1 {
        crate::append_error_log("env: single environment available; selecting it");
        return Some(envs[0].clone());
    }
    if let Some(e) = envs.iter().find(|e| e.is_pinned.unwrap_or(false)) {
        crate::append_error_log(format!("env: selecting pinned environment: {}", e.id));
        return Some(e.clone());
    }
    // Highest task_count as heuristic
    if let Some(e) = envs
        .iter()
        .max_by_key(|e| e.task_count.unwrap_or(0))
        .or_else(|| envs.first())
    {
        crate::append_error_log(format!("env: selecting by task_count/first: {}", e.id));
        return Some(e.clone());
    }
    None
}

async fn get_json<T: serde::de::DeserializeOwned>(
    url: &str,
    headers: &HeaderMap,
) -> anyhow::Result<T> {
    let http = reqwest::Client::builder().build()?;
    let res = http.get(url).headers(headers.clone()).send().await?;
    let status = res.status();
    let ct = res
        .headers()
        .get(CONTENT_TYPE)
        .and_then(|v| v.to_str().ok())
        .unwrap_or("")
        .to_string();
    let body = res.text().await.unwrap_or_default();
    crate::append_error_log(format!("env: status={status} content-type={ct}"));
    if !status.is_success() {
        anyhow::bail!("GET {url} failed: {status}; content-type={ct}; body={body}");
    }
    let parsed = serde_json::from_str::<T>(&body).map_err(|e| {
        anyhow::anyhow!("Decode error for {url}: {e}; content-type={ct}; body={body}")
    })?;
    Ok(parsed)
}

fn get_git_origins() -> Vec<String> {
    // Prefer: git config --get-regexp remote\..*\.url
    let out = std::process::Command::new("git")
        .args(["config", "--get-regexp", "remote\\..*\\.url"])
        .output();
    if let Ok(ok) = out
        && ok.status.success()
    {
        let s = String::from_utf8_lossy(&ok.stdout);
        let mut urls = Vec::new();
        for line in s.lines() {
            if let Some((_, url)) = line.split_once(' ') {
                urls.push(url.trim().to_string());
            }
        }
        if !urls.is_empty() {
            return uniq(urls);
        }
    }
    // Fallback: git remote -v
    let out = std::process::Command::new("git")
        .args(["remote", "-v"])
        .output();
    if let Ok(ok) = out
        && ok.status.success()
    {
        let s = String::from_utf8_lossy(&ok.stdout);
        let mut urls = Vec::new();
        for line in s.lines() {
            let parts: Vec<&str> = line.split_whitespace().collect();
            if parts.len() >= 2 {
                urls.push(parts[1].to_string());
            }
        }
        if !urls.is_empty() {
            return uniq(urls);
        }
    }
    Vec::new()
}

fn uniq(mut v: Vec<String>) -> Vec<String> {
    v.sort();
    v.dedup();
    v
}

fn parse_owner_repo(url: &str) -> Option<(String, String)> {
    // Normalize common prefixes and handle multiple SSH/HTTPS variants.
    let mut s = url.trim().to_string();
    // Drop protocol scheme for ssh URLs
    if let Some(rest) = s.strip_prefix("ssh://") {
        s = rest.to_string();
    }
    // Accept any user before @github.com (e.g., git@, org-123@)
    if let Some(idx) = s.find("@github.com:") {
        let rest = &s[idx + "@github.com:".len()..];
        let rest = rest.trim_start_matches('/').trim_end_matches(".git");
        let mut parts = rest.splitn(2, '/');
        let owner = parts.next()?.to_string();
        let repo = parts.next()?.to_string();
        crate::append_error_log(format!("env: parsed SSH GitHub origin => {owner}/{repo}"));
        return Some((owner, repo));
    }
    // HTTPS or git protocol
    for prefix in [
        "https://github.com/",
        "http://github.com/",
        "git://github.com/",
        "github.com/",
    ] {
        if let Some(rest) = s.strip_prefix(prefix) {
            let rest = rest.trim_start_matches('/').trim_end_matches(".git");
            let mut parts = rest.splitn(2, '/');
            let owner = parts.next()?.to_string();
            let repo = parts.next()?.to_string();
            crate::append_error_log(format!("env: parsed HTTP GitHub origin => {owner}/{repo}"));
            return Some((owner, repo));
        }
    }
    None
}

/// List environments for the current repo(s) with a fallback to the global list.
/// Returns a de-duplicated, sorted set suitable for the TUI modal.
pub async fn list_environments(
    base_url: &str,
    headers: &HeaderMap,
) -> anyhow::Result<Vec<crate::app::EnvironmentRow>> {
    let mut map: HashMap<String, crate::app::EnvironmentRow> = HashMap::new();

    // 1) By-repo lookup for each parsed GitHub origin
    let origins = get_git_origins();
    for origin in &origins {
        if let Some((owner, repo)) = parse_owner_repo(origin) {
            let url = if base_url.contains("/backend-api") {
                format!(
                    "{}/wham/environments/by-repo/{}/{}/{}",
                    base_url, "github", owner, repo
                )
            } else {
                format!(
                    "{}/api/codex/environments/by-repo/{}/{}/{}",
                    base_url, "github", owner, repo
                )
            };
            match get_json::<Vec<CodeEnvironment>>(&url, headers).await {
                Ok(list) => {
                    info!("env_tui: by-repo {}:{} -> {} envs", owner, repo, list.len());
                    for env in list {
                        let repo_hint = format!("{owner}/{repo}");
                        merge_environment_row(&mut map, &env, Some(repo_hint.as_str()));
                    }
                }
                Err(e) => {
                    warn!(
                        "env_tui: by-repo fetch failed for {}/{}: {}",
                        owner, repo, e
                    );
                }
            }
        }
    }

    // 2) Fallback to the full list; on error return what we have if any.
    let list_url = if base_url.contains("/backend-api") {
        format!("{base_url}/wham/environments")
    } else {
        format!("{base_url}/api/codex/environments")
    };
    match get_json::<Vec<CodeEnvironment>>(&list_url, headers).await {
        Ok(list) => {
            info!("env_tui: global list -> {} envs", list.len());
            for env in list {
                merge_environment_row(&mut map, &env, None);
            }
        }
        Err(e) => {
            if map.is_empty() {
                return Err(e);
            } else {
                warn!(
                    "env_tui: global list failed; using by-repo results only: {}",
                    e
                );
            }
        }
    }

    let mut rows: Vec<crate::app::EnvironmentRow> = map.into_values().collect();
    rows.sort_by(|a, b| {
        // pinned first
        let p = b.is_pinned.cmp(&a.is_pinned);
        if p != std::cmp::Ordering::Equal {
            return p;
        }
        // then label (ci), then id
        let al = a.label.as_deref().unwrap_or("").to_lowercase();
        let bl = b.label.as_deref().unwrap_or("").to_lowercase();
        let l = al.cmp(&bl);
        if l != std::cmp::Ordering::Equal {
            return l;
        }
        a.id.cmp(&b.id)
    });
    Ok(rows)
}
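Taken together, selection precedence in this module is: exact (case-insensitive) label match, then the only available environment, then a pinned one, then the highest `task_count`. A minimal caller sketch using the public API declared above; the base URL and the "staging" label are placeholder inputs:

```rust
// Sketch only: exercises autodetect_environment_id as declared above.
async fn pick_env() -> anyhow::Result<()> {
    let headers = reqwest::header::HeaderMap::new(); // real callers add auth headers
    let sel = autodetect_environment_id(
        "https://chatgpt.com/backend-api",
        &headers,
        Some("staging".to_string()),
    )
    .await?;
    println!(
        "picked {} ({:?}), default branch {:?}",
        sel.id, sel.label, sel.default_branch
    );
    Ok(())
}
```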
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -1,176 +0,0 @@
use unicode_width::UnicodeWidthChar;
use unicode_width::UnicodeWidthStr;

/// Scroll position and geometry for a vertical scroll view.
#[derive(Clone, Copy, Debug, Default)]
pub struct ScrollViewState {
    pub scroll: u16,
    pub viewport_h: u16,
    pub content_h: u16,
}

impl ScrollViewState {
    pub fn clamp(&mut self) {
        let max_scroll = self.content_h.saturating_sub(self.viewport_h);
        if self.scroll > max_scroll {
            self.scroll = max_scroll;
        }
    }
}

/// A simple, local scrollable view for diffs or message text.
///
/// Owns raw lines, caches wrapped lines for a given width, and maintains
/// a small scroll state that is clamped whenever geometry shrinks.
#[derive(Clone, Debug, Default)]
pub struct ScrollableDiff {
    raw: Vec<String>,
    wrapped: Vec<String>,
    wrapped_src_idx: Vec<usize>,
    wrap_cols: Option<u16>,
    pub state: ScrollViewState,
}

impl ScrollableDiff {
    pub fn new() -> Self {
        Self::default()
    }

    /// Replace the raw content lines. Does not rewrap immediately; call `set_width` next.
    pub fn set_content(&mut self, lines: Vec<String>) {
        self.raw = lines;
        self.wrapped.clear();
        self.wrapped_src_idx.clear();
        self.state.content_h = 0;
        // Force rewrap on next set_width even if width is unchanged
        self.wrap_cols = None;
    }

    /// Set the wrap width. If changed, rebuild wrapped lines and clamp scroll.
    pub fn set_width(&mut self, width: u16) {
        if self.wrap_cols == Some(width) {
            return;
        }
        self.wrap_cols = Some(width);
        self.rewrap(width);
        self.state.clamp();
    }

    /// Update viewport height and clamp scroll if needed.
    pub fn set_viewport(&mut self, height: u16) {
        self.state.viewport_h = height;
        self.state.clamp();
    }

    /// Return the cached wrapped lines. Call `set_width` first when area changes.
    pub fn wrapped_lines(&self) -> &[String] {
        &self.wrapped
    }

    pub fn wrapped_src_indices(&self) -> &[usize] {
        &self.wrapped_src_idx
    }

    pub fn raw_line_at(&self, idx: usize) -> &str {
        self.raw.get(idx).map(String::as_str).unwrap_or("")
    }

    /// Scroll by a signed delta; clamps to content.
    pub fn scroll_by(&mut self, delta: i16) {
        let s = self.state.scroll as i32 + delta as i32;
        self.state.scroll = s.clamp(0, self.max_scroll() as i32) as u16;
    }

    /// Page by a signed delta; typically viewport_h - 1.
    pub fn page_by(&mut self, delta: i16) {
        self.scroll_by(delta);
    }

    pub fn to_top(&mut self) {
        self.state.scroll = 0;
    }

    pub fn to_bottom(&mut self) {
        self.state.scroll = self.max_scroll();
    }

    /// Optional percent scrolled; None when not enough geometry is known.
    pub fn percent_scrolled(&self) -> Option<u8> {
        if self.state.content_h == 0 || self.state.viewport_h == 0 {
            return None;
        }
        if self.state.content_h <= self.state.viewport_h {
            return None;
        }
        let visible_bottom = self.state.scroll.saturating_add(self.state.viewport_h) as f32;
        let pct = (visible_bottom / self.state.content_h as f32 * 100.0).round();
        Some(pct.clamp(0.0, 100.0) as u8)
    }

    fn max_scroll(&self) -> u16 {
        self.state.content_h.saturating_sub(self.state.viewport_h)
    }

    fn rewrap(&mut self, width: u16) {
        if width == 0 {
            self.wrapped = self.raw.clone();
            self.state.content_h = self.wrapped.len() as u16;
            return;
        }
        let max_cols = width as usize;
        let mut out: Vec<String> = Vec::new();
        let mut out_idx: Vec<usize> = Vec::new();
        for (raw_idx, raw) in self.raw.iter().enumerate() {
            // Normalize tabs for width accounting (MVP: 4 spaces).
            let raw = raw.replace('\t', "    ");
            if raw.is_empty() {
                out.push(String::new());
                out_idx.push(raw_idx);
                continue;
            }
            let mut line = String::new();
            let mut line_cols = 0usize;
            let mut last_soft_idx: Option<usize> = None; // last whitespace or punctuation break
            for (_i, ch) in raw.char_indices() {
                if ch == '\n' {
                    out.push(std::mem::take(&mut line));
                    out_idx.push(raw_idx);
                    line_cols = 0;
                    last_soft_idx = None;
                    continue;
                }
                let w = UnicodeWidthChar::width(ch).unwrap_or(0);
                if line_cols.saturating_add(w) > max_cols {
                    if let Some(split) = last_soft_idx {
                        let (prefix, rest) = line.split_at(split);
                        out.push(prefix.trim_end().to_string());
                        out_idx.push(raw_idx);
                        line = rest.trim_start().to_string();
                        last_soft_idx = None;
                        // retry add current ch now that line may be shorter
                    } else if !line.is_empty() {
                        out.push(std::mem::take(&mut line));
                        out_idx.push(raw_idx);
                    }
                }
                if ch.is_whitespace()
                    || matches!(
                        ch,
                        ',' | ';' | '.' | ':' | ')' | ']' | '}' | '|' | '/' | '?' | '!' | '-' | '_'
                    )
                {
                    last_soft_idx = Some(line.len());
                }
                line.push(ch);
                line_cols = UnicodeWidthStr::width(line.as_str());
            }
            if !line.is_empty() {
                out.push(line);
                out_idx.push(raw_idx);
            }
        }
        self.wrapped = out;
        self.wrapped_src_idx = out_idx;
        self.state.content_h = self.wrapped.len() as u16;
    }
}
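To illustrate the call order the doc comments imply (content first, then width, then viewport), here is a hedged sketch with made-up 40x10 geometry; the line text and counts are arbitrary:

```rust
// Sketch only: drives ScrollableDiff with hypothetical geometry.
let mut view = ScrollableDiff::new();
view.set_content(vec!["a short line that fits in forty cols".to_string(); 30]);
view.set_width(40);    // rewraps and clamps scroll
view.set_viewport(10); // clamps again if the viewport shrank
view.scroll_by(5);     // clamped to content_h - viewport_h
let start = view.state.scroll as usize;
let visible = &view.wrapped_lines()[start..start + view.state.viewport_h as usize];
assert_eq!(visible.len(), 10);
```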
File diff suppressed because it is too large
@@ -1,207 +0,0 @@
use base64::Engine as _;
use chrono::Utc;
use reqwest::header::HeaderMap;
use std::path::Path;
use std::path::PathBuf;
use std::process::Command;

pub fn set_user_agent_suffix(suffix: &str) {
    if let Ok(mut guard) = codex_core::default_client::USER_AGENT_SUFFIX.lock() {
        guard.replace(suffix.to_string());
    }
}

pub fn append_error_log(message: impl AsRef<str>) {
    let message = message.as_ref();
    let timestamp = Utc::now().to_rfc3339();

    if let Some(path) = log_file_path()
        && write_log_line(&path, &timestamp, message)
    {
        return;
    }

    let fallback = Path::new("error.log");
    let _ = write_log_line(fallback, &timestamp, message);
}

/// Normalize the configured base URL to a canonical form used by the backend client.
/// - trims trailing '/'
/// - appends '/backend-api' for ChatGPT hosts when missing
pub fn normalize_base_url(input: &str) -> String {
    let mut base_url = input.to_string();
    while base_url.ends_with('/') {
        base_url.pop();
    }
    if (base_url.starts_with("https://chatgpt.com")
        || base_url.starts_with("https://chat.openai.com"))
        && !base_url.contains("/backend-api")
    {
        base_url = format!("{base_url}/backend-api");
    }
    base_url
}

fn log_file_path() -> Option<PathBuf> {
    let mut log_dir = codex_core::config::find_codex_home().ok()?;
    log_dir.push("log");
    std::fs::create_dir_all(&log_dir).ok()?;
    Some(log_dir.join("codex-cloud-tasks.log"))
}

fn write_log_line(path: &Path, timestamp: &str, message: &str) -> bool {
    let mut opts = std::fs::OpenOptions::new();
    opts.create(true).append(true);
    #[cfg(unix)]
    {
        use std::os::unix::fs::OpenOptionsExt;
        opts.mode(0o600);
    }

    match opts.open(path) {
        Ok(mut file) => {
            use std::io::Write as _;
            writeln!(file, "[{timestamp}] {message}").is_ok()
        }
        Err(_) => false,
    }
}

/// Extract the ChatGPT account id from a JWT token, when present.
pub fn extract_chatgpt_account_id(token: &str) -> Option<String> {
    let mut parts = token.split('.');
    let (_h, payload_b64, _s) = match (parts.next(), parts.next(), parts.next()) {
        (Some(h), Some(p), Some(s)) if !h.is_empty() && !p.is_empty() && !s.is_empty() => (h, p, s),
        _ => return None,
    };
    let payload_bytes = base64::engine::general_purpose::URL_SAFE_NO_PAD
        .decode(payload_b64)
        .ok()?;
    let v: serde_json::Value = serde_json::from_slice(&payload_bytes).ok()?;
    v.get("https://api.openai.com/auth")
        .and_then(|auth| auth.get("chatgpt_account_id"))
        .and_then(|id| id.as_str())
        .map(str::to_string)
}

pub fn switch_to_branch(branch: &str) -> Result<(), String> {
    let branch = branch.trim();
    if branch.is_empty() {
        return Err("default branch name is empty".to_string());
    }

    if let Ok(current) = current_branch()
        && current == branch
    {
        append_error_log(format!("git.switch: already on branch {branch}"));
        return Ok(());
    }

    append_error_log(format!("git.switch: switching to branch {branch}"));
    match ensure_success(&["checkout", branch]) {
        Ok(()) => Ok(()),
        Err(err) => {
            append_error_log(format!("git.switch: checkout {branch} failed: {err}"));
            if ensure_success(&["rev-parse", "--verify", branch]).is_ok() {
                return Err(err);
            }
            if let Err(fetch_err) = ensure_success(&["fetch", "origin", branch]) {
                append_error_log(format!(
                    "git.switch: fetch origin/{branch} failed: {fetch_err}"
                ));
                return Err(err);
            }
            let tracking = format!("origin/{branch}");
            ensure_success(&["checkout", "-b", branch, &tracking]).map_err(|create_err| {
                append_error_log(format!(
                    "git.switch: checkout -b {branch} {tracking} failed: {create_err}"
                ));
                create_err
            })
        }
    }
}

fn current_branch() -> Result<String, String> {
    let output = run_git(&["rev-parse", "--abbrev-ref", "HEAD"])?;
    if !output.status.success() {
        return Err(format!(
            "git rev-parse --abbrev-ref failed: {}",
            format_command_failure(output, &["rev-parse", "--abbrev-ref", "HEAD"])
        ));
    }
    Ok(String::from_utf8_lossy(&output.stdout).trim().to_string())
}

fn ensure_success(args: &[&str]) -> Result<(), String> {
    let output = run_git(args)?;
    if output.status.success() {
        return Ok(());
    }
    Err(format_command_failure(output, args))
}

fn run_git(args: &[&str]) -> Result<std::process::Output, String> {
    Command::new("git")
        .args(args)
        .output()
        .map_err(|e| format!("failed to launch git {}: {e}", join_args(args)))
}

fn format_command_failure(output: std::process::Output, args: &[&str]) -> String {
    let stdout = String::from_utf8_lossy(&output.stdout);
    let stderr = String::from_utf8_lossy(&output.stderr);
    format!(
        "git {} exited with status {}. stdout: {} stderr: {}",
        join_args(args),
        output
            .status
            .code()
            .map(|c| c.to_string())
            .unwrap_or_else(|| "<signal>".to_string()),
        stdout.trim(),
        stderr.trim()
    )
}

fn join_args(args: &[&str]) -> String {
    args.join(" ")
}

/// Build headers for ChatGPT-backed requests: `User-Agent`, optional `Authorization`,
/// and optional `ChatGPT-Account-Id`.
pub async fn build_chatgpt_headers() -> HeaderMap {
    use reqwest::header::AUTHORIZATION;
    use reqwest::header::HeaderName;
    use reqwest::header::HeaderValue;
    use reqwest::header::USER_AGENT;

    set_user_agent_suffix("codex_cloud_tasks_tui");
    let ua = codex_core::default_client::get_codex_user_agent();
    let mut headers = HeaderMap::new();
    headers.insert(
        USER_AGENT,
        HeaderValue::from_str(&ua).unwrap_or(HeaderValue::from_static("codex-cli")),
    );
    if let Ok(home) = codex_core::config::find_codex_home() {
        let am = codex_login::AuthManager::new(home);
        if let Some(auth) = am.auth()
            && let Ok(tok) = auth.get_token().await
            && !tok.is_empty()
        {
            let v = format!("Bearer {tok}");
            if let Ok(hv) = HeaderValue::from_str(&v) {
                headers.insert(AUTHORIZATION, hv);
            }
            if let Some(acc) = auth
                .get_account_id()
                .or_else(|| extract_chatgpt_account_id(&tok))
                && let Ok(name) = HeaderName::from_bytes(b"ChatGPT-Account-Id")
                && let Ok(hv) = HeaderValue::from_str(&acc)
            {
                headers.insert(name, hv);
            }
        }
    }
    headers
}
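Two of these helpers are easy to pin down with examples. A hedged, test-style sketch of `normalize_base_url` (inputs chosen to hit each branch) and of `extract_chatgpt_account_id` against a hand-built, unsigned token (the account id and the "sig" placeholder are made up):

```rust
#[test]
fn normalize_and_extract_examples() {
    // Trailing slashes are trimmed; ChatGPT hosts gain /backend-api when missing.
    assert_eq!(normalize_base_url("https://chatgpt.com/"), "https://chatgpt.com/backend-api");
    assert_eq!(normalize_base_url("https://example.com/api/"), "https://example.com/api");

    // Build a structurally valid (unsigned) JWT whose payload carries the claim.
    use base64::Engine as _;
    let b64 = |s: &[u8]| base64::engine::general_purpose::URL_SAFE_NO_PAD.encode(s);
    let payload = br#"{"https://api.openai.com/auth":{"chatgpt_account_id":"acct_123"}}"#;
    let token = format!("{}.{}.{}", b64(b"{}"), b64(payload), b64(b"sig"));
    assert_eq!(extract_chatgpt_account_id(&token), Some("acct_123".to_string()));
}
```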
@@ -1,22 +0,0 @@
use codex_cloud_tasks_client::CloudBackend;
use codex_cloud_tasks_client::MockClient;

#[tokio::test]
async fn mock_backend_varies_by_env() {
    let client = MockClient;

    let root = CloudBackend::list_tasks(&client, None).await.unwrap();
    assert!(root.iter().any(|t| t.title.contains("Update README")));

    let a = CloudBackend::list_tasks(&client, Some("env-A"))
        .await
        .unwrap();
    assert_eq!(a.len(), 1);
    assert_eq!(a[0].title, "A: First");

    let b = CloudBackend::list_tasks(&client, Some("env-B"))
        .await
        .unwrap();
    assert_eq!(b.len(), 2);
    assert!(b[0].title.starts_with("B: "));
}
@@ -1,18 +0,0 @@
[package]
name = "codex-backend-openapi-models"
version = { workspace = true }
edition = "2024"

[lib]
name = "codex_backend_openapi_models"
path = "src/lib.rs"

# Important: generated code often violates our workspace lints.
# Allow unwrap/expect in this crate so the workspace builds cleanly
# after models are regenerated.
# Lint overrides are applied in src/lib.rs via crate attributes

[dependencies]
serde = { version = "1", features = ["derive"] }
serde_json = "1"
uuid = { version = "1", features = ["serde"] }
Some files were not shown because too many files have changed in this diff