mirror of
https://github.com/openai/codex.git
synced 2026-04-24 14:45:27 +00:00
## Summary

This PR introduces support for Azure OpenAI as a provider within the Codex CLI. Users can now configure the tool to leverage their Azure OpenAI deployments by specifying `"azure"` as the provider in `config.json` and setting the corresponding `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_API_VERSION` environment variables. This functionality is added alongside the existing provider options (OpenAI, OpenRouter, etc.).

Related to #92

**Note:** This PR is currently in **Draft** status because tests on the `main` branch are failing. It will be marked as ready for review once the `main` branch is stable and tests are passing.

---

## What’s Changed

- **Configuration (`config.ts`, `providers.ts`, `README.md`):**
  - Added `"azure"` to the supported `providers` list in `providers.ts`, specifying its name, default base URL structure, and environment variable key (`AZURE_OPENAI_API_KEY`).
  - Defined the `AZURE_OPENAI_API_VERSION` environment variable in `config.ts` with a default value (`2025-03-01-preview`).
  - Updated `README.md` to:
    - Include "azure" in the list of providers.
    - Add a configuration section for Azure OpenAI, detailing the required environment variables (`AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_API_VERSION`) with examples.
- **Client Instantiation (`terminal-chat.tsx`, `singlepass-cli-app.tsx`, `agent-loop.ts`, `compact-summary.ts`, `model-utils.ts`):**
  - Modified various components and utility functions where the OpenAI client is initialized.
  - Added conditional logic to check if the configured `provider` is `"azure"`.
  - If the provider is Azure, the `AzureOpenAI` client from the `openai` package is instantiated, using the configured `baseURL`, `apiKey` (from `AZURE_OPENAI_API_KEY`), and `apiVersion` (from `AZURE_OPENAI_API_VERSION`).
  - Otherwise, the standard `OpenAI` client is instantiated as before.
- **Dependencies:**
  - Relies on the `openai` package's built-in support for `AzureOpenAI`.
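The provider dispatch described above can be sketched roughly as follows. This is an illustrative model only: the `ClientOptions` shape and the `clientOptionsFor` helper are hypothetical stand-ins invented for this sketch, not the PR's actual API, which constructs real `OpenAI`/`AzureOpenAI` instances.

```typescript
// Illustrative sketch only: models the provider dispatch without
// constructing real clients. `ClientOptions` and `clientOptionsFor`
// are hypothetical names, not the code introduced by this PR.
interface ClientOptions {
  kind: "openai" | "azure";
  baseURL?: string;
  apiKey?: string;
  apiVersion?: string;
}

function clientOptionsFor(
  provider: string,
  env: Record<string, string | undefined>,
  baseURL?: string,
): ClientOptions {
  if (provider === "azure") {
    return {
      kind: "azure",
      baseURL,
      apiKey: env["AZURE_OPENAI_API_KEY"],
      // Same fallback as the default defined in config.ts.
      apiVersion: env["AZURE_OPENAI_API_VERSION"] ?? "2025-03-01-preview",
    };
  }
  // All other providers keep the standard OpenAI client path.
  return { kind: "openai", baseURL, apiKey: env["OPENAI_API_KEY"] };
}
```

The key design point is that only the client construction branches on the provider; every downstream call site keeps using the same client interface.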
  - No *new* external dependencies were added specifically for this Azure implementation beyond the `openai` package itself.

---

## How to Test

*This has been tested locally and confirmed working with Azure OpenAI.*

1. **Configure `config.json`:** Ensure your `~/.codex/config.json` (or project-specific config) includes Azure and sets it as the active provider:

   ```json
   {
     "providers": {
       // ... other providers
       "azure": {
         "name": "AzureOpenAI",
         "baseURL": "https://YOUR_RESOURCE_NAME.openai.azure.com", // Replace with your Azure endpoint
         "envKey": "AZURE_OPENAI_API_KEY"
       }
     },
     "provider": "azure", // Set Azure as the active provider
     "model": "o4-mini" // Use your Azure deployment name here
     // ... other config settings
   }
   ```

2. **Set up Environment Variables:**

   ```bash
   # Set the API Key for your Azure OpenAI resource
   export AZURE_OPENAI_API_KEY="your-azure-api-key-here"

   # Set the API Version (Optional - defaults to `2025-03-01-preview` if not set)
   # Ensure this version is supported by your Azure deployment and endpoint
   export AZURE_OPENAI_API_VERSION="2025-03-01-preview"
   ```

3. **Get the Codex CLI by building from this PR branch:** Clone your fork, check out this branch (`feat/azure-openai`), navigate to `codex-cli`, and build:

   ```bash
   # cd /path/to/your/fork/codex
   git checkout feat/azure-openai # Or your branch name
   cd codex-cli
   corepack enable
   pnpm install
   pnpm build
   ```

4. **Invoke Codex:** Run the locally built CLI using `node` from the `codex-cli` directory:

   ```bash
   node ./dist/cli.js "Explain the purpose of this PR"
   ```

   *(Alternatively, if you ran `pnpm link` after building, you can use `codex "Explain the purpose of this PR"` from anywhere.)*

5. **Verify:** Confirm that the command executes successfully and interacts with your configured Azure OpenAI deployment.

---

## Tests

- [x] Tested locally against an Azure OpenAI deployment using API Key authentication. Basic commands and interactions confirmed working.
---

## Checklist

- [x] Added Azure provider details to configuration files (`providers.ts`, `config.ts`).
- [x] Implemented conditional `AzureOpenAI` client initialization based on provider setting.
- [x] Ensured `apiVersion` is passed correctly to the Azure client.
- [x] Updated `README.md` with Azure OpenAI setup instructions.
- [x] Manually tested core functionality against a live Azure OpenAI endpoint.
- [ ] Add/update automated tests for the Azure code path (pending `main` stability).

cc @theabhinavdas @nikodem-wrona @fouad-openai @tibo-openai (adjust as needed)

---

I have read the CLA Document and I hereby sign the CLA
71 lines
2.8 KiB
TypeScript
import type { AppConfig } from "./config.js";
import type { ResponseItem } from "openai/resources/responses/responses.mjs";

import { createOpenAIClient } from "./openai-client.js";

/**
 * Generate a condensed summary of the conversation items.
 * @param items The list of conversation items to summarize
 * @param model The model to use for generating the summary
 * @param flexMode Whether to use the flex-mode service tier
 * @param config The configuration object
 * @returns A concise structured summary string
 */
export async function generateCompactSummary(
  items: Array<ResponseItem>,
  model: string,
  flexMode = false,
  config: AppConfig,
): Promise<string> {
  const oai = createOpenAIClient(config);

  // Keep only user/assistant messages and flatten their text parts
  // into "role: text" lines.
  const conversationText = items
    .filter(
      (
        item,
      ): item is ResponseItem & { content: Array<unknown>; role: string } =>
        item.type === "message" &&
        (item.role === "user" || item.role === "assistant") &&
        Array.isArray(item.content),
    )
    .map((item) => {
      const text = item.content
        .filter(
          (part): part is { text: string } =>
            typeof part === "object" &&
            part != null &&
            "text" in part &&
            typeof (part as { text: unknown }).text === "string",
        )
        .map((part) => part.text)
        .join("");
      return `${item.role}: ${text}`;
    })
    .join("\n");

  const response = await oai.chat.completions.create({
    model,
    ...(flexMode ? { service_tier: "flex" } : {}),
    messages: [
      {
        role: "assistant",
        content:
          "You are an expert coding assistant. Your goal is to generate a concise, structured summary of the conversation below that captures all essential information needed to continue development after context replacement. Include tasks performed, code areas modified or reviewed, key decisions or assumptions, test results or errors, and outstanding tasks or next steps.",
      },
      {
        role: "user",
        content: `Here is the conversation so far:\n${conversationText}\n\nPlease summarize this conversation, covering:\n1. Tasks performed and outcomes\n2. Code files, modules, or functions modified or examined\n3. Important decisions or assumptions made\n4. Errors encountered and test or build results\n5. Remaining tasks, open questions, or next steps\nProvide the summary in a clear, concise format.`,
      },
    ],
  });

  return response.choices[0]?.message.content ?? "Unable to generate summary.";
}
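As a standalone illustration of the filter/serialize pass that `generateCompactSummary` performs before calling the model, here is the same logic applied to hand-built items. The `Item`/`Part` types and the `serializeConversation` helper below are simplified stand-ins for `ResponseItem`, invented for this sketch only.

```typescript
// Simplified stand-ins for ResponseItem, for illustration only.
type Part = { text?: unknown } | string | null;
type Item = { type: string; role?: string; content?: Array<Part> };

// Same filter/serialize pass as in generateCompactSummary:
// keep user/assistant messages, concatenate their string text parts,
// and emit one "role: text" line per message.
function serializeConversation(items: Array<Item>): string {
  return items
    .filter(
      (item): item is Item & { role: string; content: Array<Part> } =>
        item.type === "message" &&
        (item.role === "user" || item.role === "assistant") &&
        Array.isArray(item.content),
    )
    .map((item) => {
      const text = item.content
        .filter(
          (part): part is { text: string } =>
            typeof part === "object" &&
            part != null &&
            "text" in part &&
            typeof (part as { text: unknown }).text === "string",
        )
        .map((part) => part.text)
        .join("");
      return `${item.role}: ${text}`;
    })
    .join("\n");
}

// Example: non-message items (e.g. reasoning) are dropped.
// serializeConversation([
//   { type: "message", role: "user", content: [{ text: "hi" }] },
//   { type: "reasoning" },
//   { type: "message", role: "assistant", content: [{ text: "hello" }] },
// ]) === "user: hi\nassistant: hello"
```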