Compare commits


36 Commits

Author SHA1 Message Date
Sebastian Herrlinger  c03eaf89ef  update docs  2026-02-17 12:04:11 +01:00
Sebastian Herrlinger  f086f492f4  zod instead of manual  2026-02-17 11:01:56 +01:00
Sebastian Herrlinger  8ff026f1dd  error handling  2026-02-17 10:59:45 +01:00
Sebastian Herrlinger  6a22098815  no path mixup  2026-02-17 10:28:10 +01:00
Sebastian Herrlinger  f10bd62478  coverage and config load catch  2026-02-17 10:24:44 +01:00
Sebastian Herrlinger  3afa2bcb64  migrate all  2026-02-17 10:15:39 +01:00
Sebastian Herrlinger  6af7908806  init  2026-02-17 02:03:54 +01:00
Aiden Cline  e35a4131d0  core: keep message part order stable when files resolve asynchronously (#13915)  2026-02-16 18:45:11 -06:00
Goni Zahavy  0e669b6016  ci: use useblacksmith/stickydisk on linux runners only (#13909)  2026-02-16 18:27:04 -06:00
Goni Zahavy  9163611989  ci: fixed apt cache not working in publish.yml (#13897)  2026-02-16 17:31:38 -06:00
James Long  d93cefd47a  fix(website): fix site in safari 18 (#13894)  2026-02-16 17:39:28 -05:00
Aiden Cline  a580fb47d2  tweak: drop ids from attachments in tools, assign them in prompt.ts instead (#13890)  2026-02-16 14:59:57 -06:00
ImmuneFOMO  9d3c81a683  feat(acp): add opt-in flag for question tool (#13562)  2026-02-16 14:20:41 -06:00
  Co-authored-by: Aiden Cline <63023139+rekram1-node@users.noreply.github.com>
  Co-authored-by: Aiden Cline <aidenpcline@gmail.com>
Zhiyuan Zheng  86e545a23e  fix(opencode): ACP sessions never get LLM-generated titles (#13095)  2026-02-16 14:16:17 -06:00
Ariane Emory  b0afdf6ea4  feat(cli): add session delete command (#13571)  2026-02-16 14:15:34 -06:00
opencode  d8c25bfeb4  release: v1.2.6  2026-02-16 19:57:09 +00:00
Robert Schadek  160ba295a8  feat(opencode): add dfmt formatter support for D language files (#13867)  2026-02-16 13:14:35 -06:00
OpeOginni  16332a8583  fix(tui): make use of server dir path for file references in prompts (#13781)  2026-02-16 13:14:08 -06:00
Aiden Cline  ae6e85b2a4  ignore: rm random comment on opencode.jsonc  2026-02-16 13:09:39 -06:00
Dax  fdad823edc  feat(cli): add db migrate command for JSON to SQLite migration (#13874)  2026-02-16 19:05:21 +00:00
Ryan Vogel  5cc1d6097e  feat(cli): add --continue and --fork flags to attach command (#13879)  2026-02-16 13:45:00 -05:00
opencode-agent[bot]  8c1af9b445  chore: update nix node_modules hashes  2026-02-16 17:38:43 +00:00
Vladimir Glafirov  ef979ccfa8  fix: bump GitLab provider and auth plugin for mid-session token refresh (#13850)  2026-02-16 10:01:17 -06:00
Imanol Maiztegui  bb30e06855  fix (tui): Inaccurate tips (#13845)  2026-02-16 09:08:04 -05:00
Adam  b055f973df  chore: cleanup  2026-02-16 07:58:18 -06:00
Rafi Khardalian  45fa5e7199  fix(core): remove unnecessary per-message title LLM calls (#13804)  2026-02-16 06:04:20 -06:00
Chujiang  3ebf27aab9  fix(docs): correct critical translation errors in Russian zen page (#13830)  2026-02-16 06:02:48 -06:00
Aiden Cline  1d041c8861  fix: google vertex var priority (#13816)  2026-02-16 02:41:52 -06:00
opencode-agent[bot]  089ab9defa  chore: generate  2026-02-16 08:32:34 +00:00
Jhin Lee  f7708efa5b  feat: add openai-compatible endpoint support for google-vertex provider (#10303)  2026-02-16 02:31:48 -06:00
  Co-authored-by: BlueT - Matthew Lien - 練喆明 <BlueT@BlueT.org>
  Co-authored-by: Aiden Cline <63023139+rekram1-node@users.noreply.github.com>
  Co-authored-by: Aiden Cline <aidenpcline@gmail.com>
bnema  60807846a9  fix(desktop): normalize Linux Wayland/X11 backend and decoration policy (#13143)  2026-02-16 13:24:28 +08:00
  Co-authored-by: Brendan Allan <brendonovich@outlook.com>
dpuyosa  afd0716cbd  feat(opencode): Add Venice support in temperature, topP, topK and smallOption (#13553)  2026-02-15 22:24:24 -06:00
Brendan Allan  920255e8c6  desktop: use process-wrap instead of manual job object (#13431)  2026-02-16 04:14:24 +00:00
opencode-agent[bot]  21e0778002  chore: generate  2026-02-15 22:31:40 +00:00
Pan Kaixin  d9363da9ee  fix(website): correct zh-CN translation of proprietary terms in zen.mdx (#13734)  2026-02-15 16:30:47 -06:00
  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Salam Elbilig  9b23130ac4  feat(opencode): add cljfmt formatter support for Clojure files (#13426)  2026-02-15 15:21:57 -06:00
100 changed files with 2661 additions and 1002 deletions

View File

@@ -4,6 +4,7 @@ runs:
using: "composite"
steps:
- name: Mount Bun Cache
if: ${{ runner.os == 'Linux' }}
uses: useblacksmith/stickydisk@v1
with:
key: ${{ github.repository }}-bun-cache-${{ runner.os }}

View File

@@ -137,7 +137,7 @@ jobs:
if: contains(matrix.settings.host, 'ubuntu')
uses: actions/cache@v4
with:
path: /var/cache/apt/archives
path: ~/apt-cache
key: ${{ runner.os }}-${{ matrix.settings.target }}-apt-${{ hashFiles('.github/workflows/publish.yml') }}
restore-keys: |
${{ runner.os }}-${{ matrix.settings.target }}-apt-
@@ -145,8 +145,10 @@ jobs:
- name: install dependencies (ubuntu only)
if: contains(matrix.settings.host, 'ubuntu')
run: |
mkdir -p ~/apt-cache && chmod -R a+rw ~/apt-cache
sudo apt-get update
sudo apt-get install -y libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf
sudo apt-get install -y --no-install-recommends -o dir::cache::archives="$HOME/apt-cache" libwebkit2gtk-4.1-dev libappindicator3-dev librsvg2-dev patchelf
sudo chmod -R a+rw ~/apt-cache
- name: install Rust stable
uses: dtolnay/rust-toolchain@stable

View File

@@ -359,6 +359,7 @@ opencode serve --hostname 0.0.0.0 --port 4096
opencode serve [--port <number>] [--hostname <string>] [--cors <origin>]
opencode session [command]
opencode session list
opencode session delete <sessionID>
opencode stats
opencode uninstall
opencode upgrade
@@ -598,6 +599,7 @@ OPENCODE_EXPERIMENTAL_MARKDOWN
OPENCODE_EXPERIMENTAL_OUTPUT_TOKEN_MAX
OPENCODE_EXPERIMENTAL_OXFMT
OPENCODE_EXPERIMENTAL_PLAN_MODE
OPENCODE_ENABLE_QUESTION_TOOL
OPENCODE_FAKE_VCS
OPENCODE_GIT_BASH_PATH
OPENCODE_MODEL

View File

@@ -1,8 +1,5 @@
{
"$schema": "https://opencode.ai/config.json",
// "enterprise": {
// "url": "https://enterprise.dev.opencode.ai",
// },
"provider": {
"opencode": {
"options": {},

View File

@@ -23,7 +23,7 @@
},
"packages/app": {
"name": "@opencode-ai/app",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@kobalte/core": "catalog:",
"@opencode-ai/sdk": "workspace:*",
@@ -73,7 +73,7 @@
},
"packages/console/app": {
"name": "@opencode-ai/console-app",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@cloudflare/vite-plugin": "1.15.2",
"@ibm/plex": "6.4.1",
@@ -107,7 +107,7 @@
},
"packages/console/core": {
"name": "@opencode-ai/console-core",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@aws-sdk/client-sts": "3.782.0",
"@jsx-email/render": "1.1.1",
@@ -134,7 +134,7 @@
},
"packages/console/function": {
"name": "@opencode-ai/console-function",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@ai-sdk/anthropic": "2.0.0",
"@ai-sdk/openai": "2.0.2",
@@ -158,7 +158,7 @@
},
"packages/console/mail": {
"name": "@opencode-ai/console-mail",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@jsx-email/all": "2.2.3",
"@jsx-email/cli": "1.4.3",
@@ -182,7 +182,7 @@
},
"packages/desktop": {
"name": "@opencode-ai/desktop",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@opencode-ai/app": "workspace:*",
"@opencode-ai/ui": "workspace:*",
@@ -215,7 +215,7 @@
},
"packages/enterprise": {
"name": "@opencode-ai/enterprise",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@opencode-ai/ui": "workspace:*",
"@opencode-ai/util": "workspace:*",
@@ -244,7 +244,7 @@
},
"packages/function": {
"name": "@opencode-ai/function",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@octokit/auth-app": "8.0.1",
"@octokit/rest": "catalog:",
@@ -260,7 +260,7 @@
},
"packages/opencode": {
"name": "opencode",
"version": "1.2.5",
"version": "1.2.6",
"bin": {
"opencode": "./bin/opencode",
},
@@ -288,8 +288,8 @@
"@ai-sdk/vercel": "1.0.33",
"@ai-sdk/xai": "2.0.51",
"@clack/prompts": "1.0.0-alpha.1",
"@gitlab/gitlab-ai-provider": "3.5.0",
"@gitlab/opencode-gitlab-auth": "1.3.2",
"@gitlab/gitlab-ai-provider": "3.5.1",
"@gitlab/opencode-gitlab-auth": "1.3.3",
"@hono/standard-validator": "0.1.5",
"@hono/zod-validator": "catalog:",
"@modelcontextprotocol/sdk": "1.25.2",
@@ -369,7 +369,7 @@
},
"packages/plugin": {
"name": "@opencode-ai/plugin",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@opencode-ai/sdk": "workspace:*",
"zod": "catalog:",
@@ -389,7 +389,7 @@
},
"packages/sdk/js": {
"name": "@opencode-ai/sdk",
"version": "1.2.5",
"version": "1.2.6",
"devDependencies": {
"@hey-api/openapi-ts": "0.90.10",
"@tsconfig/node22": "catalog:",
@@ -400,7 +400,7 @@
},
"packages/slack": {
"name": "@opencode-ai/slack",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@opencode-ai/sdk": "workspace:*",
"@slack/bolt": "^3.17.1",
@@ -413,7 +413,7 @@
},
"packages/ui": {
"name": "@opencode-ai/ui",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@kobalte/core": "catalog:",
"@opencode-ai/sdk": "workspace:*",
@@ -455,7 +455,7 @@
},
"packages/util": {
"name": "@opencode-ai/util",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"zod": "catalog:",
},
@@ -466,7 +466,7 @@
},
"packages/web": {
"name": "@opencode-ai/web",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@astrojs/cloudflare": "12.6.3",
"@astrojs/markdown-remark": "6.3.1",
@@ -989,9 +989,9 @@
"@fontsource/inter": ["@fontsource/inter@5.2.8", "", {}, "sha512-P6r5WnJoKiNVV+zvW2xM13gNdFhAEpQ9dQJHt3naLvfg+LkF2ldgSLiF4T41lf1SQCM9QmkqPTn4TH568IRagg=="],
"@gitlab/gitlab-ai-provider": ["@gitlab/gitlab-ai-provider@3.5.0", "", { "dependencies": { "@anthropic-ai/sdk": "^0.71.0", "@anycable/core": "^0.9.2", "graphql-request": "^6.1.0", "isomorphic-ws": "^5.0.0", "openai": "^6.16.0", "socket.io-client": "^4.8.1", "vscode-jsonrpc": "^8.2.1", "zod": "^3.25.76" }, "peerDependencies": { "@ai-sdk/provider": ">=2.0.0", "@ai-sdk/provider-utils": ">=3.0.0" } }, "sha512-OoAwCz4fOci3h/2l+PRHMclclh3IaFq8w1es2wvBJ8ca7vtglKsBYT7dvmYpsXlu7pg9mopbjcexvmVCQEUTAQ=="],
"@gitlab/gitlab-ai-provider": ["@gitlab/gitlab-ai-provider@3.5.1", "", { "dependencies": { "@anthropic-ai/sdk": "^0.71.0", "@anycable/core": "^0.9.2", "graphql-request": "^6.1.0", "isomorphic-ws": "^5.0.0", "openai": "^6.16.0", "socket.io-client": "^4.8.1", "vscode-jsonrpc": "^8.2.1", "zod": "^3.25.76" }, "peerDependencies": { "@ai-sdk/provider": ">=2.0.0", "@ai-sdk/provider-utils": ">=3.0.0" } }, "sha512-I8+EGdUeKmGJSjAdFobHtqpxM9Fm00w0j7NJbtln/D/XQ1SKEGoZIuqJko4v0pV2mkhGUIs7qezljH/2kbXovA=="],
"@gitlab/opencode-gitlab-auth": ["@gitlab/opencode-gitlab-auth@1.3.2", "", { "dependencies": { "@fastify/rate-limit": "^10.2.0", "@opencode-ai/plugin": "*", "fastify": "^5.2.0", "open": "^10.0.0" } }, "sha512-pvGrC+aDVLY8bRCC/fZaG/Qihvt2r4by5xbTo5JTSz9O7yIcR6xG2d9Wkuu4bcXFz674z2C+i5bUk+J/RSdBpg=="],
"@gitlab/opencode-gitlab-auth": ["@gitlab/opencode-gitlab-auth@1.3.3", "", { "dependencies": { "@fastify/rate-limit": "^10.2.0", "@opencode-ai/plugin": "*", "fastify": "^5.2.0", "open": "^10.0.0" } }, "sha512-FT+KsCmAJjtqWr1YAq0MywGgL9kaLQ4apmsoowAXrPqHtoYf2i/nY10/A+L06kNj22EATeEDRpbB1NWXMto/SA=="],
"@graphql-typed-document-node/core": ["@graphql-typed-document-node/core@3.2.0", "", { "peerDependencies": { "graphql": "^0.8.0 || ^0.9.0 || ^0.10.0 || ^0.11.0 || ^0.12.0 || ^0.13.0 || ^14.0.0 || ^15.0.0 || ^16.0.0 || ^17.0.0" } }, "sha512-mB9oAsNCm9aM3/SOv4YtBMqZbYj10R7dkq8byBqxGY/ncFwhf2oQzMV+LCRlWoDSEBJ3COiR1yeDvMtsoOsuFQ=="],

View File

@@ -1,8 +1,8 @@
{
"nodeModules": {
"x86_64-linux": "sha256-5pgd2xuvIIkTbIOGIdK5MIXo6O9qRpvk1RKQZ1e1R+8=",
"aarch64-linux": "sha256-FZiHwihM4b82ipQ9XfW08X+sd5CvZhx/+pU/8X1zsns=",
"aarch64-darwin": "sha256-iZv0w1NthV53pY5uvuf3JlI14GeKmCu7WHwGSRdEQeM=",
"x86_64-darwin": "sha256-c3Zm3P1goFPgg3vNAZPMFOhHX/gyTmsCN/PKbGO/v0E="
"x86_64-linux": "sha256-C3WIEER2XgzO85wk2sp3BzQ6dknW026zslD8nKZjo2U=",
"aarch64-linux": "sha256-+tTJHZMZ/+8fAjI/1fUTuca8J2MZfB+5vhBoZ7jgqcE=",
"aarch64-darwin": "sha256-vS82puFGBBToxyIBa8Zi0KLKdJYr64T6HZL2rL32mH8=",
"x86_64-darwin": "sha256-Tr8JMTCxV6WVt3dXV7iq3PNCm2Cn+RXAbU9+o7pKKV0="
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/app",
"version": "1.2.5",
"version": "1.2.6",
"description": "",
"type": "module",
"exports": {

View File

@@ -23,7 +23,6 @@ import { useSync } from "@/context/sync"
import { useTerminal, type LocalPTY } from "@/context/terminal"
import { useLayout } from "@/context/layout"
import { checksum, base64Encode } from "@opencode-ai/util/encode"
import { findLast } from "@opencode-ai/util/array"
import { useDialog } from "@opencode-ai/ui/context/dialog"
import { DialogSelectFile } from "@/components/dialog-select-file"
import FileTree from "@/components/file-tree"
@@ -35,7 +34,6 @@ import { useSDK } from "@/context/sdk"
import { usePrompt } from "@/context/prompt"
import { useComments } from "@/context/comments"
import { ConstrainDragYAxis, getDraggableId } from "@/utils/solid-dnd"
import { usePermission } from "@/context/permission"
import { showToast } from "@opencode-ai/ui/toast"
import { SessionHeader, SessionContextTab, SortableTab, FileVisual, NewSessionView } from "@/components/session"
import { navMark, navParams } from "@/utils/perf"
@@ -101,7 +99,6 @@ export default function Page() {
const sdk = useSDK()
const prompt = usePrompt()
const comments = useComments()
const permission = usePermission()
const permRequest = createMemo(() => {
const sessionID = params.id
@@ -770,11 +767,6 @@ export default function Page() {
return lines.slice(0, 2).join("\n")
}
const addSelectionToContext = (path: string, selection: FileSelection) => {
const preview = selectionPreview(path, selection)
prompt.context.add({ type: "file", path, selection, preview })
}
const addCommentToContext = (input: {
file: string
selection: SelectedLineRange
@@ -913,31 +905,11 @@ export default function Page() {
const focusInput = () => inputRef?.focus()
useSessionCommands({
command,
dialog,
file,
language,
local,
permission,
prompt,
sdk,
sync,
terminal,
layout,
params,
navigate,
tabs,
view,
info,
status,
userMessages,
visibleUserMessages,
activeMessage,
showAllFiles,
navigateMessageByOffset,
setExpanded: (id, fn) => setStore("expanded", id, fn),
setActiveMessage,
addSelectionToContext,
focusInput,
})

View File

@@ -1,5 +1,5 @@
import { describe, expect, test } from "bun:test"
import { combineCommandSections, createOpenReviewFile, focusTerminalById, getTabReorderIndex } from "./helpers"
import { createOpenReviewFile, focusTerminalById, getTabReorderIndex } from "./helpers"
describe("createOpenReviewFile", () => {
test("opens and loads selected review file", () => {
@@ -46,20 +46,6 @@ describe("focusTerminalById", () => {
})
})
describe("combineCommandSections", () => {
test("keeps section order stable", () => {
const result = combineCommandSections([
[{ id: "a", title: "A" }],
[
{ id: "b", title: "B" },
{ id: "c", title: "C" },
],
])
expect(result.map((item) => item.id)).toEqual(["a", "b", "c"])
})
})
describe("getTabReorderIndex", () => {
test("returns target index for valid drag reorder", () => {
expect(getTabReorderIndex(["a", "b", "c"], "a", "c")).toBe(2)

View File

@@ -1,4 +1,3 @@
import type { CommandOption } from "@/context/command"
import { batch } from "solid-js"
export const focusTerminalById = (id: string) => {
@@ -36,10 +35,6 @@ export const createOpenReviewFile = (input: {
}
}
export const combineCommandSections = (sections: readonly (readonly CommandOption[])[]) => {
return sections.flatMap((section) => section)
}
export const getTabReorderIndex = (tabs: readonly string[], from: string, to: string) => {
const fromIndex = tabs.indexOf(from)
const toIndex = tabs.indexOf(to)

View File

@@ -19,35 +19,14 @@ import { showToast } from "@opencode-ai/ui/toast"
import { findLast } from "@opencode-ai/util/array"
import { extractPromptFromParts } from "@/utils/prompt"
import { UserMessage } from "@opencode-ai/sdk/v2"
import { combineCommandSections } from "@/pages/session/helpers"
import { canAddSelectionContext } from "@/pages/session/session-command-helpers"
export type SessionCommandContext = {
command: ReturnType<typeof useCommand>
dialog: ReturnType<typeof useDialog>
file: ReturnType<typeof useFile>
language: ReturnType<typeof useLanguage>
local: ReturnType<typeof useLocal>
permission: ReturnType<typeof usePermission>
prompt: ReturnType<typeof usePrompt>
sdk: ReturnType<typeof useSDK>
sync: ReturnType<typeof useSync>
terminal: ReturnType<typeof useTerminal>
layout: ReturnType<typeof useLayout>
params: ReturnType<typeof useParams>
navigate: ReturnType<typeof useNavigate>
tabs: () => ReturnType<ReturnType<typeof useLayout>["tabs"]>
view: () => ReturnType<ReturnType<typeof useLayout>["view"]>
info: () => { revert?: { messageID?: string }; share?: { url?: string } } | undefined
status: () => { type: string }
userMessages: () => UserMessage[]
visibleUserMessages: () => UserMessage[]
activeMessage: () => UserMessage | undefined
showAllFiles: () => void
navigateMessageByOffset: (offset: number) => void
setExpanded: (id: string, fn: (open: boolean | undefined) => boolean) => void
setActiveMessage: (message: UserMessage | undefined) => void
addSelectionToContext: (path: string, selection: FileSelection) => void
focusInput: () => void
}
@@ -58,45 +37,88 @@ const withCategory = (category: string) => {
})
}
export const useSessionCommands = (input: SessionCommandContext) => {
const sessionCommand = withCategory(input.language.t("command.category.session"))
const fileCommand = withCategory(input.language.t("command.category.file"))
const contextCommand = withCategory(input.language.t("command.category.context"))
const viewCommand = withCategory(input.language.t("command.category.view"))
const terminalCommand = withCategory(input.language.t("command.category.terminal"))
const modelCommand = withCategory(input.language.t("command.category.model"))
const mcpCommand = withCategory(input.language.t("command.category.mcp"))
const agentCommand = withCategory(input.language.t("command.category.agent"))
const permissionsCommand = withCategory(input.language.t("command.category.permissions"))
export const useSessionCommands = (args: SessionCommandContext) => {
const command = useCommand()
const dialog = useDialog()
const file = useFile()
const language = useLanguage()
const local = useLocal()
const permission = usePermission()
const prompt = usePrompt()
const sdk = useSDK()
const sync = useSync()
const terminal = useTerminal()
const layout = useLayout()
const params = useParams()
const navigate = useNavigate()
const sessionKey = createMemo(() => `${params.dir}${params.id ? "/" + params.id : ""}`)
const tabs = createMemo(() => layout.tabs(sessionKey))
const view = createMemo(() => layout.view(sessionKey))
const info = createMemo(() => (params.id ? sync.session.get(params.id) : undefined))
const idle = { type: "idle" as const }
const status = createMemo(() => sync.data.session_status[params.id ?? ""] ?? idle)
const messages = createMemo(() => (params.id ? (sync.data.message[params.id] ?? []) : []))
const userMessages = createMemo(() => messages().filter((m) => m.role === "user") as UserMessage[])
const visibleUserMessages = createMemo(() => {
const revert = info()?.revert?.messageID
if (!revert) return userMessages()
return userMessages().filter((m) => m.id < revert)
})
const selectionPreview = (path: string, selection: FileSelection) => {
const content = file.get(path)?.content?.content
if (!content) return undefined
const start = Math.max(1, Math.min(selection.startLine, selection.endLine))
const end = Math.max(selection.startLine, selection.endLine)
const lines = content.split("\n").slice(start - 1, end)
if (lines.length === 0) return undefined
return lines.slice(0, 2).join("\n")
}
const addSelectionToContext = (path: string, selection: FileSelection) => {
const preview = selectionPreview(path, selection)
prompt.context.add({ type: "file", path, selection, preview })
}
const sessionCommand = withCategory(language.t("command.category.session"))
const fileCommand = withCategory(language.t("command.category.file"))
const contextCommand = withCategory(language.t("command.category.context"))
const viewCommand = withCategory(language.t("command.category.view"))
const terminalCommand = withCategory(language.t("command.category.terminal"))
const modelCommand = withCategory(language.t("command.category.model"))
const mcpCommand = withCategory(language.t("command.category.mcp"))
const agentCommand = withCategory(language.t("command.category.agent"))
const permissionsCommand = withCategory(language.t("command.category.permissions"))
const sessionCommands = createMemo(() => [
sessionCommand({
id: "session.new",
title: input.language.t("command.session.new"),
title: language.t("command.session.new"),
keybind: "mod+shift+s",
slash: "new",
onSelect: () => input.navigate(`/${input.params.dir}/session`),
onSelect: () => navigate(`/${params.dir}/session`),
}),
])
const fileCommands = createMemo(() => [
fileCommand({
id: "file.open",
title: input.language.t("command.file.open"),
description: input.language.t("palette.search.placeholder"),
title: language.t("command.file.open"),
description: language.t("palette.search.placeholder"),
keybind: "mod+p",
slash: "open",
onSelect: () => input.dialog.show(() => <DialogSelectFile onOpenFile={input.showAllFiles} />),
onSelect: () => dialog.show(() => <DialogSelectFile onOpenFile={args.showAllFiles} />),
}),
fileCommand({
id: "tab.close",
title: input.language.t("command.tab.close"),
title: language.t("command.tab.close"),
keybind: "mod+w",
disabled: !input.tabs().active(),
disabled: !tabs().active(),
onSelect: () => {
const active = input.tabs().active()
const active = tabs().active()
if (!active) return
input.tabs().close(active)
tabs().close(active)
},
}),
])
@@ -104,30 +126,30 @@ export const useSessionCommands = (input: SessionCommandContext) => {
const contextCommands = createMemo(() => [
contextCommand({
id: "context.addSelection",
title: input.language.t("command.context.addSelection"),
description: input.language.t("command.context.addSelection.description"),
title: language.t("command.context.addSelection"),
description: language.t("command.context.addSelection.description"),
keybind: "mod+shift+l",
disabled: !canAddSelectionContext({
active: input.tabs().active(),
pathFromTab: input.file.pathFromTab,
selectedLines: input.file.selectedLines,
active: tabs().active(),
pathFromTab: file.pathFromTab,
selectedLines: file.selectedLines,
}),
onSelect: () => {
const active = input.tabs().active()
const active = tabs().active()
if (!active) return
const path = input.file.pathFromTab(active)
const path = file.pathFromTab(active)
if (!path) return
const range = input.file.selectedLines(path) as SelectedLineRange | null | undefined
const range = file.selectedLines(path) as SelectedLineRange | null | undefined
if (!range) {
showToast({
title: input.language.t("toast.context.noLineSelection.title"),
description: input.language.t("toast.context.noLineSelection.description"),
title: language.t("toast.context.noLineSelection.title"),
description: language.t("toast.context.noLineSelection.description"),
})
return
}
input.addSelectionToContext(path, selectionFromLines(range))
addSelectionToContext(path, selectionFromLines(range))
},
}),
])
@@ -135,50 +157,50 @@ export const useSessionCommands = (input: SessionCommandContext) => {
const viewCommands = createMemo(() => [
viewCommand({
id: "terminal.toggle",
title: input.language.t("command.terminal.toggle"),
title: language.t("command.terminal.toggle"),
keybind: "ctrl+`",
slash: "terminal",
onSelect: () => input.view().terminal.toggle(),
onSelect: () => view().terminal.toggle(),
}),
viewCommand({
id: "review.toggle",
title: input.language.t("command.review.toggle"),
title: language.t("command.review.toggle"),
keybind: "mod+shift+r",
onSelect: () => input.view().reviewPanel.toggle(),
onSelect: () => view().reviewPanel.toggle(),
}),
viewCommand({
id: "fileTree.toggle",
title: input.language.t("command.fileTree.toggle"),
title: language.t("command.fileTree.toggle"),
keybind: "mod+\\",
onSelect: () => input.layout.fileTree.toggle(),
onSelect: () => layout.fileTree.toggle(),
}),
viewCommand({
id: "input.focus",
title: input.language.t("command.input.focus"),
title: language.t("command.input.focus"),
keybind: "ctrl+l",
onSelect: () => input.focusInput(),
onSelect: () => args.focusInput(),
}),
terminalCommand({
id: "terminal.new",
title: input.language.t("command.terminal.new"),
description: input.language.t("command.terminal.new.description"),
title: language.t("command.terminal.new"),
description: language.t("command.terminal.new.description"),
keybind: "ctrl+alt+t",
onSelect: () => {
if (input.terminal.all().length > 0) input.terminal.new()
input.view().terminal.open()
if (terminal.all().length > 0) terminal.new()
view().terminal.open()
},
}),
viewCommand({
id: "steps.toggle",
title: input.language.t("command.steps.toggle"),
description: input.language.t("command.steps.toggle.description"),
title: language.t("command.steps.toggle"),
description: language.t("command.steps.toggle.description"),
keybind: "mod+e",
slash: "steps",
disabled: !input.params.id,
disabled: !params.id,
onSelect: () => {
const msg = input.activeMessage()
const msg = args.activeMessage()
if (!msg) return
input.setExpanded(msg.id, (open: boolean | undefined) => !open)
args.setExpanded(msg.id, (open: boolean | undefined) => !open)
},
}),
])
@@ -186,61 +208,61 @@ export const useSessionCommands = (input: SessionCommandContext) => {
const messageCommands = createMemo(() => [
sessionCommand({
id: "message.previous",
title: input.language.t("command.message.previous"),
description: input.language.t("command.message.previous.description"),
title: language.t("command.message.previous"),
description: language.t("command.message.previous.description"),
keybind: "mod+arrowup",
disabled: !input.params.id,
onSelect: () => input.navigateMessageByOffset(-1),
disabled: !params.id,
onSelect: () => args.navigateMessageByOffset(-1),
}),
sessionCommand({
id: "message.next",
title: input.language.t("command.message.next"),
description: input.language.t("command.message.next.description"),
title: language.t("command.message.next"),
description: language.t("command.message.next.description"),
keybind: "mod+arrowdown",
disabled: !input.params.id,
onSelect: () => input.navigateMessageByOffset(1),
disabled: !params.id,
onSelect: () => args.navigateMessageByOffset(1),
}),
])
const agentCommands = createMemo(() => [
modelCommand({
id: "model.choose",
title: input.language.t("command.model.choose"),
description: input.language.t("command.model.choose.description"),
title: language.t("command.model.choose"),
description: language.t("command.model.choose.description"),
keybind: "mod+'",
slash: "model",
onSelect: () => input.dialog.show(() => <DialogSelectModel />),
onSelect: () => dialog.show(() => <DialogSelectModel />),
}),
mcpCommand({
id: "mcp.toggle",
title: input.language.t("command.mcp.toggle"),
description: input.language.t("command.mcp.toggle.description"),
title: language.t("command.mcp.toggle"),
description: language.t("command.mcp.toggle.description"),
keybind: "mod+;",
slash: "mcp",
onSelect: () => input.dialog.show(() => <DialogSelectMcp />),
onSelect: () => dialog.show(() => <DialogSelectMcp />),
}),
agentCommand({
id: "agent.cycle",
title: input.language.t("command.agent.cycle"),
description: input.language.t("command.agent.cycle.description"),
title: language.t("command.agent.cycle"),
description: language.t("command.agent.cycle.description"),
keybind: "mod+.",
slash: "agent",
onSelect: () => input.local.agent.move(1),
onSelect: () => local.agent.move(1),
}),
agentCommand({
id: "agent.cycle.reverse",
title: input.language.t("command.agent.cycle.reverse"),
description: input.language.t("command.agent.cycle.reverse.description"),
title: language.t("command.agent.cycle.reverse"),
description: language.t("command.agent.cycle.reverse.description"),
keybind: "shift+mod+.",
onSelect: () => input.local.agent.move(-1),
onSelect: () => local.agent.move(-1),
}),
modelCommand({
id: "model.variant.cycle",
title: input.language.t("command.model.variant.cycle"),
description: input.language.t("command.model.variant.cycle.description"),
title: language.t("command.model.variant.cycle"),
description: language.t("command.model.variant.cycle.description"),
keybind: "shift+mod+d",
onSelect: () => {
input.local.model.variant.cycle()
local.model.variant.cycle()
},
}),
])
@@ -249,22 +271,22 @@ export const useSessionCommands = (input: SessionCommandContext) => {
permissionsCommand({
id: "permissions.autoaccept",
title:
input.params.id && input.permission.isAutoAccepting(input.params.id, input.sdk.directory)
? input.language.t("command.permissions.autoaccept.disable")
: input.language.t("command.permissions.autoaccept.enable"),
params.id && permission.isAutoAccepting(params.id, sdk.directory)
? language.t("command.permissions.autoaccept.disable")
: language.t("command.permissions.autoaccept.enable"),
keybind: "mod+shift+a",
disabled: !input.params.id || !input.permission.permissionsEnabled(),
disabled: !params.id || !permission.permissionsEnabled(),
onSelect: () => {
const sessionID = input.params.id
const sessionID = params.id
if (!sessionID) return
input.permission.toggleAutoAccept(sessionID, input.sdk.directory)
permission.toggleAutoAccept(sessionID, sdk.directory)
showToast({
title: input.permission.isAutoAccepting(sessionID, input.sdk.directory)
? input.language.t("toast.permissions.autoaccept.on.title")
: input.language.t("toast.permissions.autoaccept.off.title"),
description: input.permission.isAutoAccepting(sessionID, input.sdk.directory)
? input.language.t("toast.permissions.autoaccept.on.description")
: input.language.t("toast.permissions.autoaccept.off.description"),
title: permission.isAutoAccepting(sessionID, sdk.directory)
? language.t("toast.permissions.autoaccept.on.title")
: language.t("toast.permissions.autoaccept.off.title"),
description: permission.isAutoAccepting(sessionID, sdk.directory)
? language.t("toast.permissions.autoaccept.on.description")
: language.t("toast.permissions.autoaccept.off.description"),
})
},
}),
@@ -273,71 +295,71 @@ export const useSessionCommands = (input: SessionCommandContext) => {
const sessionActionCommands = createMemo(() => [
sessionCommand({
id: "session.undo",
title: input.language.t("command.session.undo"),
description: input.language.t("command.session.undo.description"),
title: language.t("command.session.undo"),
description: language.t("command.session.undo.description"),
slash: "undo",
disabled: !input.params.id || input.visibleUserMessages().length === 0,
disabled: !params.id || visibleUserMessages().length === 0,
onSelect: async () => {
const sessionID = input.params.id
const sessionID = params.id
if (!sessionID) return
if (input.status()?.type !== "idle") {
await input.sdk.client.session.abort({ sessionID }).catch(() => {})
if (status()?.type !== "idle") {
await sdk.client.session.abort({ sessionID }).catch(() => {})
}
const revert = input.info()?.revert?.messageID
const message = findLast(input.userMessages(), (x) => !revert || x.id < revert)
const revert = info()?.revert?.messageID
const message = findLast(userMessages(), (x) => !revert || x.id < revert)
if (!message) return
await input.sdk.client.session.revert({ sessionID, messageID: message.id })
const parts = input.sync.data.part[message.id]
await sdk.client.session.revert({ sessionID, messageID: message.id })
const parts = sync.data.part[message.id]
if (parts) {
const restored = extractPromptFromParts(parts, { directory: input.sdk.directory })
input.prompt.set(restored)
const restored = extractPromptFromParts(parts, { directory: sdk.directory })
prompt.set(restored)
}
const priorMessage = findLast(input.userMessages(), (x) => x.id < message.id)
input.setActiveMessage(priorMessage)
const priorMessage = findLast(userMessages(), (x) => x.id < message.id)
args.setActiveMessage(priorMessage)
},
}),
sessionCommand({
id: "session.redo",
title: input.language.t("command.session.redo"),
description: input.language.t("command.session.redo.description"),
title: language.t("command.session.redo"),
description: language.t("command.session.redo.description"),
slash: "redo",
disabled: !input.params.id || !input.info()?.revert?.messageID,
disabled: !params.id || !info()?.revert?.messageID,
onSelect: async () => {
const sessionID = input.params.id
const sessionID = params.id
if (!sessionID) return
const revertMessageID = input.info()?.revert?.messageID
const revertMessageID = info()?.revert?.messageID
if (!revertMessageID) return
const nextMessage = input.userMessages().find((x) => x.id > revertMessageID)
const nextMessage = userMessages().find((x) => x.id > revertMessageID)
if (!nextMessage) {
await input.sdk.client.session.unrevert({ sessionID })
input.prompt.reset()
const lastMsg = findLast(input.userMessages(), (x) => x.id >= revertMessageID)
input.setActiveMessage(lastMsg)
await sdk.client.session.unrevert({ sessionID })
prompt.reset()
const lastMsg = findLast(userMessages(), (x) => x.id >= revertMessageID)
args.setActiveMessage(lastMsg)
return
}
await input.sdk.client.session.revert({ sessionID, messageID: nextMessage.id })
const priorMsg = findLast(input.userMessages(), (x) => x.id < nextMessage.id)
input.setActiveMessage(priorMsg)
await sdk.client.session.revert({ sessionID, messageID: nextMessage.id })
const priorMsg = findLast(userMessages(), (x) => x.id < nextMessage.id)
args.setActiveMessage(priorMsg)
},
}),
sessionCommand({
id: "session.compact",
title: input.language.t("command.session.compact"),
description: input.language.t("command.session.compact.description"),
title: language.t("command.session.compact"),
description: language.t("command.session.compact.description"),
slash: "compact",
disabled: !input.params.id || input.visibleUserMessages().length === 0,
disabled: !params.id || visibleUserMessages().length === 0,
onSelect: async () => {
const sessionID = input.params.id
const sessionID = params.id
if (!sessionID) return
const model = input.local.model.current()
const model = local.model.current()
if (!model) {
showToast({
title: input.language.t("toast.model.none.title"),
description: input.language.t("toast.model.none.description"),
title: language.t("toast.model.none.title"),
description: language.t("toast.model.none.description"),
})
return
}
await input.sdk.client.session.summarize({
await sdk.client.session.summarize({
sessionID,
modelID: model.id,
providerID: model.provider.id,
@@ -346,29 +368,27 @@ export const useSessionCommands = (input: SessionCommandContext) => {
}),
sessionCommand({
id: "session.fork",
title: input.language.t("command.session.fork"),
description: input.language.t("command.session.fork.description"),
title: language.t("command.session.fork"),
description: language.t("command.session.fork.description"),
slash: "fork",
disabled: !input.params.id || input.visibleUserMessages().length === 0,
onSelect: () => input.dialog.show(() => <DialogFork />),
disabled: !params.id || visibleUserMessages().length === 0,
onSelect: () => dialog.show(() => <DialogFork />),
}),
])
const shareCommands = createMemo(() => {
if (input.sync.data.config.share === "disabled") return []
if (sync.data.config.share === "disabled") return []
return [
sessionCommand({
id: "session.share",
title: input.info()?.share?.url
? input.language.t("session.share.copy.copyLink")
: input.language.t("command.session.share"),
description: input.info()?.share?.url
? input.language.t("toast.session.share.success.description")
: input.language.t("command.session.share.description"),
title: info()?.share?.url ? language.t("session.share.copy.copyLink") : language.t("command.session.share"),
description: info()?.share?.url
? language.t("toast.session.share.success.description")
: language.t("command.session.share.description"),
slash: "share",
disabled: !input.params.id,
disabled: !params.id,
onSelect: async () => {
if (!input.params.id) return
if (!params.id) return
const write = (value: string) => {
const body = typeof document === "undefined" ? undefined : document.body
@@ -398,7 +418,7 @@ export const useSessionCommands = (input: SessionCommandContext) => {
const ok = await write(url)
if (!ok) {
showToast({
title: input.language.t("toast.session.share.copyFailed.title"),
title: language.t("toast.session.share.copyFailed.title"),
variant: "error",
})
return
@@ -406,27 +426,27 @@ export const useSessionCommands = (input: SessionCommandContext) => {
showToast({
title: existing
? input.language.t("session.share.copy.copied")
: input.language.t("toast.session.share.success.title"),
description: input.language.t("toast.session.share.success.description"),
? language.t("session.share.copy.copied")
: language.t("toast.session.share.success.title"),
description: language.t("toast.session.share.success.description"),
variant: "success",
})
}
const existing = input.info()?.share?.url
const existing = info()?.share?.url
if (existing) {
await copy(existing, true)
return
}
const url = await input.sdk.client.session
.share({ sessionID: input.params.id })
const url = await sdk.client.session
.share({ sessionID: params.id })
.then((res) => res.data?.share?.url)
.catch(() => undefined)
if (!url) {
showToast({
title: input.language.t("toast.session.share.failed.title"),
description: input.language.t("toast.session.share.failed.description"),
title: language.t("toast.session.share.failed.title"),
description: language.t("toast.session.share.failed.description"),
variant: "error",
})
return
@@ -437,25 +457,25 @@ export const useSessionCommands = (input: SessionCommandContext) => {
}),
sessionCommand({
id: "session.unshare",
title: input.language.t("command.session.unshare"),
description: input.language.t("command.session.unshare.description"),
title: language.t("command.session.unshare"),
description: language.t("command.session.unshare.description"),
slash: "unshare",
disabled: !input.params.id || !input.info()?.share?.url,
disabled: !params.id || !info()?.share?.url,
onSelect: async () => {
if (!input.params.id) return
await input.sdk.client.session
.unshare({ sessionID: input.params.id })
if (!params.id) return
await sdk.client.session
.unshare({ sessionID: params.id })
.then(() =>
showToast({
title: input.language.t("toast.session.unshare.success.title"),
description: input.language.t("toast.session.unshare.success.description"),
title: language.t("toast.session.unshare.success.title"),
description: language.t("toast.session.unshare.success.description"),
variant: "success",
}),
)
.catch(() =>
showToast({
title: input.language.t("toast.session.unshare.failed.title"),
description: input.language.t("toast.session.unshare.failed.description"),
title: language.t("toast.session.unshare.failed.title"),
description: language.t("toast.session.unshare.failed.description"),
variant: "error",
}),
)
@@ -464,8 +484,8 @@ export const useSessionCommands = (input: SessionCommandContext) => {
]
})
input.command.register("session", () =>
combineCommandSections([
command.register("session", () =>
[
sessionCommands(),
fileCommands(),
contextCommands(),
@@ -475,6 +495,6 @@ export const useSessionCommands = (input: SessionCommandContext) => {
permissionCommands(),
sessionActionCommands(),
shareCommands(),
]),
].flatMap((section) => section),
)
}

View File

@@ -1,13 +1,13 @@
{
"name": "@opencode-ai/console-app",
"version": "1.2.5",
"version": "1.2.6",
"type": "module",
"license": "MIT",
"scripts": {
"typecheck": "tsgo --noEmit",
"dev": "vite dev --host 0.0.0.0",
"dev:remote": "VITE_AUTH_URL=https://auth.dev.opencode.ai VITE_STRIPE_PUBLISHABLE_KEY=pk_test_51RtuLNE7fOCwHSD4mewwzFejyytjdGoSDK7CAvhbffwaZnPbNb2rwJICw6LTOXCmWO320fSNXvb5NzI08RZVkAxd00syfqrW7t bun sst shell --stage=dev bun dev",
"build": "./script/generate-sitemap.ts && vite build && ../../opencode/script/schema.ts ./.output/public/config.json",
"build": "./script/generate-sitemap.ts && vite build && ../../opencode/script/schema.ts ./.output/public/config.json ./.output/public/tui.json",
"start": "vite start"
},
"dependencies": {

View File

@@ -174,21 +174,6 @@ body {
}
}
input:-webkit-autofill,
input:-webkit-autofill:hover,
input:-webkit-autofill:focus,
input:-webkit-autofill:active {
transition: background-color 5000000s ease-in-out 0s;
}
input:-webkit-autofill {
-webkit-text-fill-color: var(--color-text-strong) !important;
}
input:-moz-autofill {
-moz-text-fill-color: var(--color-text-strong) !important;
}
[data-component="container"] {
max-width: 67.5rem;
margin: 0 auto;
@@ -1249,4 +1234,19 @@ body {
text-decoration: underline;
}
}
input:-webkit-autofill,
input:-webkit-autofill:hover,
input:-webkit-autofill:focus,
input:-webkit-autofill:active {
transition: background-color 5000000s ease-in-out 0s;
}
input:-webkit-autofill {
-webkit-text-fill-color: var(--color-text-strong) !important;
}
input:-moz-autofill {
-moz-text-fill-color: var(--color-text-strong) !important;
}
}

View File

@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/console-core",
"version": "1.2.5",
"version": "1.2.6",
"private": true,
"type": "module",
"license": "MIT",

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/console-function",
"version": "1.2.5",
"version": "1.2.6",
"$schema": "https://json.schemastore.org/package.json",
"private": true,
"type": "module",

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/console-mail",
"version": "1.2.5",
"version": "1.2.6",
"dependencies": {
"@jsx-email/all": "2.2.3",
"@jsx-email/cli": "1.4.3",

View File

@@ -1,7 +1,7 @@
{
"name": "@opencode-ai/desktop",
"private": true,
"version": "1.2.5",
"version": "1.2.6",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -2343,9 +2343,9 @@ dependencies = [
[[package]]
name = "libc"
version = "0.2.177"
version = "0.2.180"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2874a2af47a2325c2001a6e6fad9b16a53b802102b528163885171cf92b15976"
checksum = "bcc35a38544a891a5f7c865aca548a982ccb3b8650a5b06d0fd33a10283c56fc"
[[package]]
name = "libloading"
@@ -2663,6 +2663,18 @@ dependencies = [
"memoffset",
]
[[package]]
name = "nix"
version = "0.31.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "225e7cfe711e0ba79a68baeddb2982723e4235247aefce1482f2f16c27865b66"
dependencies = [
"bitflags 2.10.0",
"cfg-if",
"cfg_aliases",
"libc",
]
[[package]]
name = "nodrop"
version = "0.1.14"
@@ -3093,6 +3105,7 @@ dependencies = [
"listeners",
"objc2 0.6.3",
"objc2-web-kit",
"process-wrap",
"reqwest 0.12.24",
"semver",
"serde",
@@ -3123,7 +3136,6 @@ dependencies = [
"tracing-subscriber",
"uuid",
"webkit2gtk",
"windows 0.61.3",
]
[[package]]
@@ -3638,6 +3650,20 @@ dependencies = [
"unicode-ident",
]
[[package]]
name = "process-wrap"
version = "9.0.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ccd9713fe2c91c3c85ac388b31b89de339365d2c995146e630b5e0da9d06526a"
dependencies = [
"futures",
"indexmap 2.12.1",
"nix 0.31.1",
"tokio",
"tracing",
"windows 0.62.2",
]
[[package]]
name = "psl-types"
version = "2.0.11"
@@ -6460,11 +6486,23 @@ version = "0.61.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9babd3a767a4c1aef6900409f85f5d53ce2544ccdfaa86dad48c91782c6d6893"
dependencies = [
"windows-collections",
"windows-collections 0.2.0",
"windows-core 0.61.2",
"windows-future",
"windows-future 0.2.1",
"windows-link 0.1.3",
"windows-numerics",
"windows-numerics 0.2.0",
]
[[package]]
name = "windows"
version = "0.62.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "527fadee13e0c05939a6a05d5bd6eec6cd2e3dbd648b9f8e447c6518133d8580"
dependencies = [
"windows-collections 0.3.2",
"windows-core 0.62.2",
"windows-future 0.3.2",
"windows-numerics 0.3.1",
]
[[package]]
@@ -6476,6 +6514,15 @@ dependencies = [
"windows-core 0.61.2",
]
[[package]]
name = "windows-collections"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "23b2d95af1a8a14a3c7367e1ed4fc9c20e0a26e79551b1454d72583c97cc6610"
dependencies = [
"windows-core 0.62.2",
]
[[package]]
name = "windows-core"
version = "0.51.1"
@@ -6519,7 +6566,18 @@ checksum = "fc6a41e98427b19fe4b73c550f060b59fa592d7d686537eebf9385621bfbad8e"
dependencies = [
"windows-core 0.61.2",
"windows-link 0.1.3",
"windows-threading",
"windows-threading 0.1.0",
]
[[package]]
name = "windows-future"
version = "0.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e1d6f90251fe18a279739e78025bd6ddc52a7e22f921070ccdc67dde84c605cb"
dependencies = [
"windows-core 0.62.2",
"windows-link 0.2.1",
"windows-threading 0.2.1",
]
[[package]]
@@ -6566,6 +6624,16 @@ dependencies = [
"windows-link 0.1.3",
]
[[package]]
name = "windows-numerics"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6e2e40844ac143cdb44aead537bbf727de9b044e107a0f1220392177d15b0f26"
dependencies = [
"windows-core 0.62.2",
"windows-link 0.2.1",
]
[[package]]
name = "windows-registry"
version = "0.5.3"
@@ -6741,6 +6809,15 @@ dependencies = [
"windows-link 0.1.3",
]
[[package]]
name = "windows-threading"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3949bd5b99cafdf1c7ca86b43ca564028dfe27d66958f2470940f73d86d75b37"
dependencies = [
"windows-link 0.2.1",
]
[[package]]
name = "windows-version"
version = "0.1.7"

View File

@@ -34,7 +34,7 @@ tauri-plugin-single-instance = { version = "2", features = ["deep-link"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
tokio = "1.48.0"
tokio = { version = "1.48.0", features = ["process"] }
listeners = "0.3"
tauri-plugin-os = "2"
futures = "0.3.31"
@@ -52,6 +52,7 @@ tracing-subscriber = { version = "0.3", features = ["env-filter"] }
tracing-appender = "0.2"
chrono = "0.4"
tokio-stream = { version = "0.1.18", features = ["sync"] }
process-wrap = { version = "9.0.3", features = ["tokio1"] }
[target.'cfg(target_os = "linux")'.dependencies]
gtk = "0.18.2"
@@ -62,14 +63,6 @@ objc2 = "0.6"
objc2-web-kit = "0.3"
[target.'cfg(windows)'.dependencies]
windows = { version = "0.61", features = [
"Win32_Foundation",
"Win32_System_JobObjects",
"Win32_System_Threading",
"Win32_Security"
] }
[patch.crates-io]
specta = { git = "https://github.com/specta-rs/specta", rev = "591a5f3ddc78348abf4cbb541d599d65306d92b9" }
specta-typescript = { git = "https://github.com/specta-rs/specta", rev = "591a5f3ddc78348abf4cbb541d599d65306d92b9" }

View File

@@ -1,12 +1,19 @@
use futures::{FutureExt, Stream, StreamExt, future};
use process_wrap::tokio::CommandWrap;
#[cfg(unix)]
use process_wrap::tokio::ProcessGroup;
#[cfg(windows)]
use process_wrap::tokio::{JobObject, KillOnDrop};
#[cfg(unix)]
use std::os::unix::process::ExitStatusExt;
use std::{process::Stdio, time::Duration};
use tauri::{AppHandle, Manager, path::BaseDirectory};
use tauri_plugin_shell::{
ShellExt,
process::{CommandChild, CommandEvent, TerminatedPayload},
};
use tauri_plugin_store::StoreExt;
use tauri_specta::Event;
use tokio::sync::oneshot;
use tokio::io::{AsyncBufReadExt, BufReader};
use tokio::process::Command;
use tokio::sync::{mpsc, oneshot};
use tokio_stream::wrappers::ReceiverStream;
use tracing::Instrument;
use crate::constants::{SETTINGS_STORE, WSL_ENABLED_KEY};
@@ -25,6 +32,33 @@ pub struct Config {
pub server: Option<ServerConfig>,
}
#[derive(Clone, Debug)]
pub enum CommandEvent {
Stdout(Vec<u8>),
Stderr(Vec<u8>),
Error(String),
Terminated(TerminatedPayload),
}
#[derive(Clone, Copy, Debug)]
pub struct TerminatedPayload {
pub code: Option<i32>,
pub signal: Option<i32>,
}
#[derive(Clone, Debug)]
pub struct CommandChild {
kill: mpsc::Sender<()>,
}
impl CommandChild {
pub fn kill(&self) -> std::io::Result<()> {
self.kill
.try_send(())
.map_err(|e| std::io::Error::other(e.to_string()))
}
}
pub async fn get_config(app: &AppHandle) -> Option<Config> {
let (events, _) = spawn_command(app, "debug config", &[]).ok()?;
@@ -190,7 +224,7 @@ pub fn spawn_command(
app: &tauri::AppHandle,
args: &str,
extra_env: &[(&str, String)],
) -> Result<(impl Stream<Item = CommandEvent> + 'static, CommandChild), tauri_plugin_shell::Error> {
) -> Result<(impl Stream<Item = CommandEvent> + 'static, CommandChild), std::io::Error> {
let state_dir = app
.path()
.resolve("", BaseDirectory::AppLocalData)
@@ -217,7 +251,7 @@ pub fn spawn_command(
.map(|(key, value)| (key.to_string(), value.clone())),
);
let cmd = if cfg!(windows) {
let mut cmd = if cfg!(windows) {
if is_wsl_enabled(app) {
tracing::info!("WSL is enabled, spawning CLI server in WSL");
let version = app.package_info().version.to_string();
@@ -249,18 +283,16 @@ pub fn spawn_command(
script.push(format!("{} exec \"$BIN\" {}", env_prefix.join(" "), args));
app.shell()
.command("wsl")
.args(["-e", "bash", "-lc", &script.join("\n")])
let mut cmd = Command::new("wsl");
cmd.args(["-e", "bash", "-lc", &script.join("\n")]);
cmd
} else {
let mut cmd = app
.shell()
.sidecar("opencode-cli")
.unwrap()
.args(args.split_whitespace());
let sidecar = get_sidecar_path(app);
let mut cmd = Command::new(sidecar);
cmd.args(args.split_whitespace());
for (key, value) in envs {
cmd = cmd.env(key, value);
cmd.env(key, value);
}
cmd
@@ -269,26 +301,111 @@ pub fn spawn_command(
let sidecar = get_sidecar_path(app);
let shell = get_user_shell();
let cmd = if shell.ends_with("/nu") {
let line = if shell.ends_with("/nu") {
format!("^\"{}\" {}", sidecar.display(), args)
} else {
format!("\"{}\" {}", sidecar.display(), args)
};
let mut cmd = app.shell().command(&shell).args(["-il", "-c", &cmd]);
let mut cmd = Command::new(shell);
cmd.args(["-il", "-c", &line]);
for (key, value) in envs {
cmd = cmd.env(key, value);
cmd.env(key, value);
}
cmd
};
let (rx, child) = cmd.spawn()?;
let event_stream = tokio_stream::wrappers::ReceiverStream::new(rx);
cmd.stdin(Stdio::null())
.stdout(Stdio::piped())
.stderr(Stdio::piped());
let mut wrap = CommandWrap::from(cmd);
#[cfg(unix)]
{
wrap.wrap(ProcessGroup::leader());
}
#[cfg(windows)]
{
wrap.wrap(JobObject).wrap(KillOnDrop);
}
let mut child = wrap.spawn()?;
let stdout = child.stdout().take();
let stderr = child.stderr().take();
let (tx, rx) = mpsc::channel(256);
let (kill_tx, mut kill_rx) = mpsc::channel(1);
if let Some(stdout) = stdout {
let tx = tx.clone();
tokio::spawn(async move {
let mut lines = BufReader::new(stdout).lines();
while let Ok(Some(line)) = lines.next_line().await {
let _ = tx.send(CommandEvent::Stdout(line.into_bytes())).await;
}
});
}
if let Some(stderr) = stderr {
let tx = tx.clone();
tokio::spawn(async move {
let mut lines = BufReader::new(stderr).lines();
while let Ok(Some(line)) = lines.next_line().await {
let _ = tx.send(CommandEvent::Stderr(line.into_bytes())).await;
}
});
}
tokio::spawn(async move {
let status = loop {
match child.try_wait() {
Ok(Some(status)) => break Ok(status),
Ok(None) => {}
Err(err) => break Err(err),
}
tokio::select! {
_ = kill_rx.recv() => {
let _ = child.start_kill();
}
_ = tokio::time::sleep(Duration::from_millis(100)) => {}
}
};
match status {
Ok(status) => {
let payload = TerminatedPayload {
code: status.code(),
signal: signal_from_status(status),
};
let _ = tx.send(CommandEvent::Terminated(payload)).await;
}
Err(err) => {
let _ = tx.send(CommandEvent::Error(err.to_string())).await;
}
}
});
let event_stream = ReceiverStream::new(rx);
let event_stream = sqlite_migration::logs_middleware(app.clone(), event_stream);
Ok((event_stream, child))
Ok((event_stream, CommandChild { kill: kill_tx }))
}
fn signal_from_status(status: std::process::ExitStatus) -> Option<i32> {
#[cfg(unix)]
{
return status.signal();
}
#[cfg(not(unix))]
{
let _ = status;
None
}
}
pub fn serve(
@@ -340,7 +457,6 @@ pub fn serve(
let _ = tx.send(payload);
}
}
_ => {}
}
future::ready(())

View File

@@ -1,145 +0,0 @@
//! Windows Job Object for reliable child process cleanup.
//!
//! This module provides a wrapper around Windows Job Objects with the
//! `JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE` flag set. When the job object handle
//! is closed (including when the parent process exits or crashes), Windows
//! automatically terminates all processes assigned to the job.
//!
//! This is more reliable than manual cleanup because it works even if:
//! - The parent process crashes
//! - The parent is killed via Task Manager
//! - The RunEvent::Exit handler fails to run
use std::io::{Error, Result};
#[cfg(windows)]
use std::sync::Mutex;
use windows::Win32::Foundation::{CloseHandle, HANDLE};
use windows::Win32::System::JobObjects::{
AssignProcessToJobObject, CreateJobObjectW, JobObjectExtendedLimitInformation,
SetInformationJobObject, JOBOBJECT_EXTENDED_LIMIT_INFORMATION,
JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE,
};
use windows::Win32::System::Threading::{OpenProcess, PROCESS_SET_QUOTA, PROCESS_TERMINATE};
/// A Windows Job Object configured to kill all assigned processes when closed.
///
/// When this struct is dropped or when the owning process exits (even abnormally),
/// Windows will automatically terminate all processes that have been assigned to it.
pub struct JobObject(HANDLE);
// SAFETY: HANDLE is just a pointer-sized value, and Windows job objects
// can be safely accessed from multiple threads.
unsafe impl Send for JobObject {}
unsafe impl Sync for JobObject {}
impl JobObject {
/// Creates a new anonymous job object with `JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE` set.
///
/// When the last handle to this job is closed (including on process exit),
/// Windows will terminate all processes assigned to the job.
pub fn new() -> Result<Self> {
unsafe {
// Create an anonymous job object
let job = CreateJobObjectW(None, None).map_err(|e| Error::other(e.message()))?;
// Configure the job to kill all processes when the handle is closed
let mut info = JOBOBJECT_EXTENDED_LIMIT_INFORMATION::default();
info.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE;
SetInformationJobObject(
job,
JobObjectExtendedLimitInformation,
&info as *const _ as *const std::ffi::c_void,
std::mem::size_of::<JOBOBJECT_EXTENDED_LIMIT_INFORMATION>() as u32,
)
.map_err(|e| Error::other(e.message()))?;
Ok(Self(job))
}
}
/// Assigns a process to this job object by its process ID.
///
/// Once assigned, the process will be terminated when this job object is dropped
/// or when the owning process exits.
///
/// # Arguments
/// * `pid` - The process ID of the process to assign
pub fn assign_pid(&self, pid: u32) -> Result<()> {
unsafe {
// Open a handle to the process with the minimum required permissions
// PROCESS_SET_QUOTA and PROCESS_TERMINATE are required by AssignProcessToJobObject
let process = OpenProcess(PROCESS_SET_QUOTA | PROCESS_TERMINATE, false, pid)
.map_err(|e| Error::other(e.message()))?;
// Assign the process to the job
let result = AssignProcessToJobObject(self.0, process);
// Close our handle to the process - the job object maintains its own reference
let _ = CloseHandle(process);
result.map_err(|e| Error::other(e.message()))
}
}
}
impl Drop for JobObject {
fn drop(&mut self) {
unsafe {
// When this handle is closed and it's the last handle to the job,
// Windows will terminate all processes in the job due to KILL_ON_JOB_CLOSE
let _ = CloseHandle(self.0);
}
}
}
/// Holds the Windows Job Object that ensures child processes are killed when the app exits.
/// On Windows, when the job object handle is closed (including on crash), all assigned
/// processes are automatically terminated by the OS.
#[cfg(windows)]
pub struct JobObjectState {
job: Mutex<Option<JobObject>>,
error: Mutex<Option<String>>,
}
#[cfg(windows)]
impl JobObjectState {
pub fn new() -> Self {
match JobObject::new() {
Ok(job) => Self {
job: Mutex::new(Some(job)),
error: Mutex::new(None),
},
Err(e) => {
tracing::error!("Failed to create job object: {e}");
Self {
job: Mutex::new(None),
error: Mutex::new(Some(format!("Failed to create job object: {e}"))),
}
}
}
}
pub fn assign_pid(&self, pid: u32) {
if let Some(job) = self.job.lock().unwrap().as_ref() {
if let Err(e) = job.assign_pid(pid) {
tracing::error!(pid, "Failed to assign process to job object: {e}");
*self.error.lock().unwrap() =
Some(format!("Failed to assign process to job object: {e}"));
} else {
tracing::info!(pid, "Assigned process to job object for automatic cleanup");
}
}
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_job_object_creation() {
let job = JobObject::new();
assert!(job.is_ok(), "Failed to create job object: {:?}", job.err());
}
}
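
Note: the deleted module above documents the manual Win32 Job Object approach (a kill-on-job-close handle so child processes die with the parent). The cli.rs diff earlier in this compare replaces it with the process-wrap crate, pinned in Cargo.toml as process-wrap 9.0.3 with the tokio1 feature. Below is a minimal standalone sketch of that replacement pattern, using only the calls visible in the diff; it is not the repository's code, and "some-binary" is a hypothetical command.

// Sketch only: spawn a child via process-wrap so the OS cleans it up even if
// the parent dies, mirroring the calls used in the cli.rs diff above.
// Assumes process-wrap 9.0.3 ("tokio1" feature) and tokio with the "process",
// "rt", "macros", and "time" features enabled.
use process_wrap::tokio::CommandWrap;
#[cfg(unix)]
use process_wrap::tokio::ProcessGroup;
#[cfg(windows)]
use process_wrap::tokio::{JobObject, KillOnDrop};
use std::process::Stdio;
use tokio::process::Command;

#[tokio::main]
async fn main() -> std::io::Result<()> {
    // "some-binary" is a placeholder; the diff spawns the sidecar or a shell here.
    let mut cmd = Command::new("some-binary");
    cmd.arg("serve")
        .stdin(Stdio::null())
        .stdout(Stdio::piped())
        .stderr(Stdio::piped());

    let mut wrap = CommandWrap::from(cmd);
    #[cfg(unix)]
    {
        // Run the child as a process-group leader so the whole tree can be signalled.
        wrap.wrap(ProcessGroup::leader());
    }
    #[cfg(windows)]
    {
        // Job object plus kill-on-drop gives the same effect as the deleted
        // JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE wrapper, managed by the crate.
        wrap.wrap(JobObject).wrap(KillOnDrop);
    }

    let mut child = wrap.spawn()?;
    // Poll for exit the same way the cli.rs diff does.
    loop {
        if let Some(status) = child.try_wait()? {
            println!("child exited: {status:?}");
            return Ok(());
        }
        tokio::time::sleep(std::time::Duration::from_millis(100)).await;
    }
}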

View File

@@ -1,21 +1,20 @@
mod cli;
mod constants;
#[cfg(windows)]
mod job_object;
#[cfg(target_os = "linux")]
pub mod linux_display;
#[cfg(target_os = "linux")]
pub mod linux_windowing;
mod logging;
mod markdown;
mod server;
mod window_customizer;
mod windows;
use crate::cli::CommandChild;
use futures::{
FutureExt, TryFutureExt,
future::{self, Shared},
};
#[cfg(windows)]
use job_object::*;
use std::{
env,
net::TcpListener,
@@ -27,7 +26,6 @@ use std::{
use tauri::{AppHandle, Listener, Manager, RunEvent, State, ipc::Channel};
#[cfg(any(target_os = "linux", all(debug_assertions, windows)))]
use tauri_plugin_deep_link::DeepLinkExt;
use tauri_plugin_shell::process::CommandChild;
use tauri_specta::Event;
use tokio::{
sync::{oneshot, watch},
@@ -631,12 +629,6 @@ async fn initialize(app: AppHandle) {
tracing::info!("CLI health check OK");
#[cfg(windows)]
{
let job_state = app.state::<JobObjectState>();
job_state.assign_pid(child.pid());
}
app.state::<ServerState>().set_child(Some(child));
Ok(ServerReadyData { url, password })
@@ -710,9 +702,6 @@ fn setup_app(app: &tauri::AppHandle, init_rx: watch::Receiver<InitStep>) {
#[cfg(any(target_os = "linux", all(debug_assertions, windows)))]
app.deep_link().register_all().ok();
#[cfg(windows)]
app.manage(JobObjectState::new());
app.manage(InitState { current: init_rx });
}

View File

@@ -0,0 +1,475 @@
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Backend {
Auto,
Wayland,
X11,
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct BackendDecision {
pub backend: Backend,
pub note: String,
}
#[derive(Debug, Clone, Default)]
pub struct SessionEnv {
pub wayland_display: bool,
pub xdg_session_type: Option<String>,
pub display: bool,
pub xdg_current_desktop: Option<String>,
pub xdg_session_desktop: Option<String>,
pub desktop_session: Option<String>,
pub oc_allow_wayland: Option<String>,
pub oc_force_x11: Option<String>,
pub oc_force_wayland: Option<String>,
pub oc_linux_decorations: Option<String>,
pub oc_force_decorations: Option<String>,
pub oc_no_decorations: Option<String>,
pub i3_sock: bool,
}
impl SessionEnv {
pub fn capture() -> Self {
Self {
wayland_display: std::env::var_os("WAYLAND_DISPLAY").is_some(),
xdg_session_type: std::env::var("XDG_SESSION_TYPE").ok(),
display: std::env::var_os("DISPLAY").is_some(),
xdg_current_desktop: std::env::var("XDG_CURRENT_DESKTOP").ok(),
xdg_session_desktop: std::env::var("XDG_SESSION_DESKTOP").ok(),
desktop_session: std::env::var("DESKTOP_SESSION").ok(),
oc_allow_wayland: std::env::var("OC_ALLOW_WAYLAND").ok(),
oc_force_x11: std::env::var("OC_FORCE_X11").ok(),
oc_force_wayland: std::env::var("OC_FORCE_WAYLAND").ok(),
oc_linux_decorations: std::env::var("OC_LINUX_DECORATIONS").ok(),
oc_force_decorations: std::env::var("OC_FORCE_DECORATIONS").ok(),
oc_no_decorations: std::env::var("OC_NO_DECORATIONS").ok(),
i3_sock: std::env::var_os("I3SOCK").is_some(),
}
}
}
pub fn select_backend(env: &SessionEnv, prefer_wayland: bool) -> Option<BackendDecision> {
if is_truthy(env.oc_force_x11.as_deref()) {
return Some(BackendDecision {
backend: Backend::X11,
note: "Forcing X11 due to OC_FORCE_X11=1".into(),
});
}
if is_truthy(env.oc_force_wayland.as_deref()) {
return Some(BackendDecision {
backend: Backend::Wayland,
note: "Forcing native Wayland due to OC_FORCE_WAYLAND=1".into(),
});
}
if !is_wayland_session(env) {
return None;
}
if prefer_wayland {
return Some(BackendDecision {
backend: Backend::Wayland,
note: "Wayland session detected; forcing native Wayland from settings".into(),
});
}
if is_truthy(env.oc_allow_wayland.as_deref()) {
return Some(BackendDecision {
backend: Backend::Wayland,
note: "Wayland session detected; forcing native Wayland due to OC_ALLOW_WAYLAND=1"
.into(),
});
}
Some(BackendDecision {
backend: Backend::Auto,
note: "Wayland session detected; using native Wayland first with X11 fallback (auto backend). Set OC_FORCE_X11=1 to force X11."
.into(),
})
}
pub fn use_decorations(env: &SessionEnv) -> bool {
if let Some(mode) = decoration_override(env.oc_linux_decorations.as_deref()) {
return match mode {
DecorationOverride::Native => true,
DecorationOverride::None => false,
DecorationOverride::Auto => default_use_decorations(env),
};
}
if is_truthy(env.oc_force_decorations.as_deref()) {
return true;
}
if is_truthy(env.oc_no_decorations.as_deref()) {
return false;
}
default_use_decorations(env)
}
fn default_use_decorations(env: &SessionEnv) -> bool {
if is_known_tiling_session(env) {
return false;
}
if !is_wayland_session(env) {
return true;
}
is_full_desktop_session(env)
}
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum DecorationOverride {
Auto,
Native,
None,
}
fn decoration_override(value: Option<&str>) -> Option<DecorationOverride> {
let value = value?.trim().to_ascii_lowercase();
if matches!(value.as_str(), "auto") {
return Some(DecorationOverride::Auto);
}
if matches!(
value.as_str(),
"native" | "server" | "de" | "wayland" | "on" | "true" | "1"
) {
return Some(DecorationOverride::Native);
}
if matches!(
value.as_str(),
"none" | "off" | "false" | "0" | "client" | "csd"
) {
return Some(DecorationOverride::None);
}
None
}
fn is_truthy(value: Option<&str>) -> bool {
matches!(
value.map(|v| v.trim().to_ascii_lowercase()),
Some(v) if matches!(v.as_str(), "1" | "true" | "yes" | "on")
)
}
fn is_wayland_session(env: &SessionEnv) -> bool {
env.wayland_display
|| matches!(
env.xdg_session_type.as_deref(),
Some(value) if value.eq_ignore_ascii_case("wayland")
)
}
fn is_full_desktop_session(env: &SessionEnv) -> bool {
desktop_tokens(env).any(|value| {
matches!(
value.as_str(),
"gnome"
| "kde"
| "plasma"
| "xfce"
| "xfce4"
| "x-cinnamon"
| "cinnamon"
| "mate"
| "lxqt"
| "budgie"
| "pantheon"
| "deepin"
| "unity"
| "cosmic"
)
})
}
fn is_known_tiling_session(env: &SessionEnv) -> bool {
if env.i3_sock {
return true;
}
desktop_tokens(env).any(|value| {
matches!(
value.as_str(),
"niri"
| "sway"
| "swayfx"
| "hyprland"
| "river"
| "i3"
| "i3wm"
| "bspwm"
| "dwm"
| "qtile"
| "xmonad"
| "leftwm"
| "dwl"
| "awesome"
| "herbstluftwm"
| "spectrwm"
| "worm"
| "i3-gnome"
)
})
}
fn desktop_tokens<'a>(env: &'a SessionEnv) -> impl Iterator<Item = String> + 'a {
[
env.xdg_current_desktop.as_deref(),
env.xdg_session_desktop.as_deref(),
env.desktop_session.as_deref(),
]
.into_iter()
.flatten()
.flat_map(|desktop| desktop.split(':'))
.map(|value| value.trim().to_ascii_lowercase())
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn prefers_wayland_first_on_wayland_session() {
let env = SessionEnv {
wayland_display: true,
display: true,
..Default::default()
};
let decision = select_backend(&env, false).expect("missing decision");
assert_eq!(decision.backend, Backend::Auto);
}
#[test]
fn force_x11_override_wins() {
let env = SessionEnv {
wayland_display: true,
display: true,
oc_force_x11: Some("1".into()),
oc_allow_wayland: Some("1".into()),
oc_force_wayland: Some("1".into()),
..Default::default()
};
let decision = select_backend(&env, true).expect("missing decision");
assert_eq!(decision.backend, Backend::X11);
}
#[test]
fn prefer_wayland_forces_wayland_backend() {
let env = SessionEnv {
wayland_display: true,
display: true,
..Default::default()
};
let decision = select_backend(&env, true).expect("missing decision");
assert_eq!(decision.backend, Backend::Wayland);
}
#[test]
fn force_wayland_override_works_outside_wayland_session() {
let env = SessionEnv {
display: true,
oc_force_wayland: Some("1".into()),
..Default::default()
};
let decision = select_backend(&env, false).expect("missing decision");
assert_eq!(decision.backend, Backend::Wayland);
}
#[test]
fn allow_wayland_forces_wayland_backend() {
let env = SessionEnv {
wayland_display: true,
display: true,
oc_allow_wayland: Some("1".into()),
..Default::default()
};
let decision = select_backend(&env, false).expect("missing decision");
assert_eq!(decision.backend, Backend::Wayland);
}
#[test]
fn xdg_session_type_wayland_is_detected() {
let env = SessionEnv {
xdg_session_type: Some("wayland".into()),
..Default::default()
};
let decision = select_backend(&env, false).expect("missing decision");
assert_eq!(decision.backend, Backend::Auto);
}
#[test]
fn returns_none_when_not_wayland_and_no_overrides() {
let env = SessionEnv {
display: true,
xdg_current_desktop: Some("GNOME".into()),
..Default::default()
};
assert!(select_backend(&env, false).is_none());
}
#[test]
fn prefer_wayland_setting_does_not_override_x11_session() {
let env = SessionEnv {
display: true,
xdg_current_desktop: Some("GNOME".into()),
..Default::default()
};
assert!(select_backend(&env, true).is_none());
}
#[test]
fn disables_decorations_on_niri() {
let env = SessionEnv {
xdg_current_desktop: Some("niri".into()),
wayland_display: true,
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn keeps_decorations_on_gnome() {
let env = SessionEnv {
xdg_current_desktop: Some("GNOME".into()),
wayland_display: true,
..Default::default()
};
assert!(use_decorations(&env));
}
#[test]
fn disables_decorations_when_session_desktop_is_tiling() {
let env = SessionEnv {
xdg_session_desktop: Some("Hyprland".into()),
wayland_display: true,
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn disables_decorations_for_unknown_wayland_session() {
let env = SessionEnv {
xdg_current_desktop: Some("labwc".into()),
wayland_display: true,
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn disables_decorations_for_dwm_on_x11() {
let env = SessionEnv {
xdg_current_desktop: Some("dwm".into()),
display: true,
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn disables_decorations_for_i3_on_x11() {
let env = SessionEnv {
xdg_current_desktop: Some("i3".into()),
display: true,
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn disables_decorations_for_i3sock_without_xdg_tokens() {
let env = SessionEnv {
display: true,
i3_sock: true,
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn keeps_decorations_for_gnome_on_x11() {
let env = SessionEnv {
xdg_current_desktop: Some("GNOME".into()),
display: true,
..Default::default()
};
assert!(use_decorations(&env));
}
#[test]
fn no_decorations_override_wins() {
let env = SessionEnv {
xdg_current_desktop: Some("GNOME".into()),
oc_no_decorations: Some("1".into()),
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn linux_decorations_native_override_wins() {
let env = SessionEnv {
xdg_current_desktop: Some("niri".into()),
wayland_display: true,
oc_linux_decorations: Some("native".into()),
..Default::default()
};
assert!(use_decorations(&env));
}
#[test]
fn linux_decorations_none_override_wins() {
let env = SessionEnv {
xdg_current_desktop: Some("GNOME".into()),
wayland_display: true,
oc_linux_decorations: Some("none".into()),
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn linux_decorations_auto_uses_default_policy() {
let env = SessionEnv {
xdg_current_desktop: Some("sway".into()),
wayland_display: true,
oc_linux_decorations: Some("auto".into()),
..Default::default()
};
assert!(!use_decorations(&env));
}
#[test]
fn linux_decorations_override_beats_legacy_overrides() {
let env = SessionEnv {
xdg_current_desktop: Some("GNOME".into()),
wayland_display: true,
oc_linux_decorations: Some("none".into()),
oc_force_decorations: Some("1".into()),
..Default::default()
};
assert!(!use_decorations(&env));
}
}

View File

@@ -36,11 +36,7 @@ pub fn init(log_dir: &Path) -> WorkerGuard {
tracing_subscriber::registry()
.with(filter)
.with(fmt::layer().with_writer(std::io::stderr))
.with(
fmt::layer()
.with_writer(non_blocking)
.with_ansi(false),
)
.with(fmt::layer().with_writer(non_blocking).with_ansi(false))
.init();
guard
@@ -55,10 +51,7 @@ pub fn tail() -> String {
return String::new();
};
let lines: Vec<String> = BufReader::new(file)
.lines()
.map_while(Result::ok)
.collect();
let lines: Vec<String> = BufReader::new(file).lines().map_while(Result::ok).collect();
let start = lines.len().saturating_sub(TAIL_LINES);
lines[start..].join("\n")

View File

@@ -4,6 +4,7 @@
// borrowed from https://github.com/skyline69/balatro-mod-manager
#[cfg(target_os = "linux")]
fn configure_display_backend() -> Option<String> {
use opencode_lib::linux_windowing::{Backend, SessionEnv, select_backend};
use std::env;
let set_env_if_absent = |key: &str, value: &str| {
@@ -14,45 +15,28 @@ fn configure_display_backend() -> Option<String> {
}
};
let on_wayland = env::var_os("WAYLAND_DISPLAY").is_some()
|| matches!(
env::var("XDG_SESSION_TYPE"),
Ok(v) if v.eq_ignore_ascii_case("wayland")
);
if !on_wayland {
return None;
}
let session = SessionEnv::capture();
let prefer_wayland = opencode_lib::linux_display::read_wayland().unwrap_or(false);
let allow_wayland = prefer_wayland
|| matches!(
env::var("OC_ALLOW_WAYLAND"),
Ok(v) if matches!(v.to_ascii_lowercase().as_str(), "1" | "true" | "yes")
);
if allow_wayland {
if prefer_wayland {
return Some("Wayland session detected; using native Wayland from settings".into());
let decision = select_backend(&session, prefer_wayland)?;
match decision.backend {
Backend::X11 => {
set_env_if_absent("WINIT_UNIX_BACKEND", "x11");
set_env_if_absent("GDK_BACKEND", "x11");
set_env_if_absent("WEBKIT_DISABLE_DMABUF_RENDERER", "1");
}
Backend::Wayland => {
set_env_if_absent("WINIT_UNIX_BACKEND", "wayland");
set_env_if_absent("GDK_BACKEND", "wayland");
set_env_if_absent("WEBKIT_DISABLE_DMABUF_RENDERER", "1");
}
Backend::Auto => {
set_env_if_absent("GDK_BACKEND", "wayland,x11");
set_env_if_absent("WEBKIT_DISABLE_DMABUF_RENDERER", "1");
}
return Some("Wayland session detected; respecting OC_ALLOW_WAYLAND=1".into());
}
// Prefer XWayland when available to avoid Wayland protocol errors seen during startup.
if env::var_os("DISPLAY").is_some() {
set_env_if_absent("WINIT_UNIX_BACKEND", "x11");
set_env_if_absent("GDK_BACKEND", "x11");
set_env_if_absent("WEBKIT_DISABLE_DMABUF_RENDERER", "1");
return Some(
"Wayland session detected; forcing X11 backend to avoid compositor protocol errors. \
Set OC_ALLOW_WAYLAND=1 to keep native Wayland."
.into(),
);
}
set_env_if_absent("WEBKIT_DISABLE_DMABUF_RENDERER", "1");
Some(
"Wayland session detected without X11; leaving Wayland enabled (set WINIT_UNIX_BACKEND/GDK_BACKEND manually if needed)."
.into(),
)
Some(decision.note)
}
fn main() {

View File

@@ -2,12 +2,12 @@ use std::time::{Duration, Instant};
use tauri::AppHandle;
use tauri_plugin_dialog::{DialogExt, MessageDialogButtons, MessageDialogResult};
use tauri_plugin_shell::process::CommandChild;
use tauri_plugin_store::StoreExt;
use tokio::task::JoinHandle;
use crate::{
cli,
cli::CommandChild,
constants::{DEFAULT_SERVER_URL_KEY, SETTINGS_STORE, WSL_ENABLED_KEY},
};

View File

@@ -7,6 +7,22 @@ use tauri::{AppHandle, Manager, Runtime, WebviewUrl, WebviewWindow, WebviewWindo
use tauri_plugin_window_state::AppHandleExt;
use tokio::sync::mpsc;
#[cfg(target_os = "linux")]
use std::sync::OnceLock;
#[cfg(target_os = "linux")]
fn use_decorations() -> bool {
static DECORATIONS: OnceLock<bool> = OnceLock::new();
*DECORATIONS.get_or_init(|| {
crate::linux_windowing::use_decorations(&crate::linux_windowing::SessionEnv::capture())
})
}
#[cfg(not(target_os = "linux"))]
fn use_decorations() -> bool {
true
}
pub struct MainWindow(WebviewWindow);
impl Deref for MainWindow {
@@ -31,13 +47,13 @@ impl MainWindow {
.ok()
.map(|v| v.enabled)
.unwrap_or(false);
let decorations = use_decorations();
let window_builder = base_window_config(
WebviewWindowBuilder::new(app, Self::LABEL, WebviewUrl::App("/".into())),
app,
decorations,
)
.title("OpenCode")
.decorations(true)
.disable_drag_drop_handler()
.zoom_hotkeys_enabled(false)
.visible(true)
@@ -113,9 +129,12 @@ impl LoadingWindow {
pub const LABEL: &str = "loading";
pub fn create(app: &AppHandle) -> Result<Self, tauri::Error> {
let decorations = use_decorations();
let window_builder = base_window_config(
WebviewWindowBuilder::new(app, Self::LABEL, tauri::WebviewUrl::App("/loading".into())),
app,
decorations,
)
.center()
.resizable(false)
@@ -129,8 +148,9 @@ impl LoadingWindow {
fn base_window_config<'a, R: Runtime, M: Manager<R>>(
window_builder: WebviewWindowBuilder<'a, R, M>,
_app: &AppHandle,
decorations: bool,
) -> WebviewWindowBuilder<'a, R, M> {
let window_builder = window_builder.decorations(true);
let window_builder = window_builder.decorations(decorations);
#[cfg(windows)]
let window_builder = window_builder

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/enterprise",
"version": "1.2.5",
"version": "1.2.6",
"private": true,
"type": "module",
"license": "MIT",

View File

@@ -1,7 +1,7 @@
id = "opencode"
name = "OpenCode"
description = "The open source coding agent."
version = "1.2.5"
version = "1.2.6"
schema_version = 1
authors = ["Anomaly"]
repository = "https://github.com/anomalyco/opencode"
@@ -11,26 +11,26 @@ name = "OpenCode"
icon = "./icons/opencode.svg"
[agent_servers.opencode.targets.darwin-aarch64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.5/opencode-darwin-arm64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.6/opencode-darwin-arm64.zip"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.darwin-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.5/opencode-darwin-x64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.6/opencode-darwin-x64.zip"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.linux-aarch64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.5/opencode-linux-arm64.tar.gz"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.6/opencode-linux-arm64.tar.gz"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.linux-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.5/opencode-linux-x64.tar.gz"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.6/opencode-linux-x64.tar.gz"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.windows-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.5/opencode-windows-x64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.2.6/opencode-windows-x64.zip"
cmd = "./opencode.exe"
args = ["acp"]

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/function",
"version": "1.2.5",
"version": "1.2.6",
"$schema": "https://json.schemastore.org/package.json",
"private": true,
"type": "module",

View File

@@ -1,6 +1,6 @@
{
"$schema": "https://json.schemastore.org/package.json",
"version": "1.2.5",
"version": "1.2.6",
"name": "opencode",
"type": "module",
"license": "MIT",
@@ -74,8 +74,8 @@
"@ai-sdk/vercel": "1.0.33",
"@ai-sdk/xai": "2.0.51",
"@clack/prompts": "1.0.0-alpha.1",
"@gitlab/gitlab-ai-provider": "3.5.0",
"@gitlab/opencode-gitlab-auth": "1.3.2",
"@gitlab/gitlab-ai-provider": "3.5.1",
"@gitlab/opencode-gitlab-auth": "1.3.3",
"@hono/standard-validator": "0.1.5",
"@hono/zod-validator": "catalog:",
"@modelcontextprotocol/sdk": "1.25.2",

View File

@@ -2,46 +2,62 @@
import { z } from "zod"
import { Config } from "../src/config/config"
import { TuiConfig } from "../src/config/tui"
const file = process.argv[2]
console.log(file)
function generate(schema: z.ZodType) {
const result = z.toJSONSchema(schema, {
io: "input", // Generate input shape (treats optional().default() as not required)
/**
* We'll use the `default` values of the field as the only value in `examples`.
* This will ensure no docs are needed to be read, as the configuration is
* self-documenting.
*
* See https://json-schema.org/draft/2020-12/draft-bhutton-json-schema-validation-00#rfc.section.9.5
*/
override(ctx) {
const schema = ctx.jsonSchema
const result = z.toJSONSchema(Config.Info, {
io: "input", // Generate input shape (treats optional().default() as not required)
/**
* We'll use the `default` values of the field as the only value in `examples`.
* This will ensure no docs are needed to be read, as the configuration is
* self-documenting.
*
* See https://json-schema.org/draft/2020-12/draft-bhutton-json-schema-validation-00#rfc.section.9.5
*/
override(ctx) {
const schema = ctx.jsonSchema
// Preserve strictness: set additionalProperties: false for objects
if (schema && typeof schema === "object" && schema.type === "object" && schema.additionalProperties === undefined) {
schema.additionalProperties = false
}
// Add examples and default descriptions for string fields with defaults
if (schema && typeof schema === "object" && "type" in schema && schema.type === "string" && schema?.default) {
if (!schema.examples) {
schema.examples = [schema.default]
// Preserve strictness: set additionalProperties: false for objects
if (
schema &&
typeof schema === "object" &&
schema.type === "object" &&
schema.additionalProperties === undefined
) {
schema.additionalProperties = false
}
schema.description = [schema.description || "", `default: \`${schema.default}\``]
.filter(Boolean)
.join("\n\n")
.trim()
}
},
}) as Record<string, unknown> & {
allowComments?: boolean
allowTrailingCommas?: boolean
// Add examples and default descriptions for string fields with defaults
if (schema && typeof schema === "object" && "type" in schema && schema.type === "string" && schema?.default) {
if (!schema.examples) {
schema.examples = [schema.default]
}
schema.description = [schema.description || "", `default: \`${schema.default}\``]
.filter(Boolean)
.join("\n\n")
.trim()
}
},
}) as Record<string, unknown> & {
allowComments?: boolean
allowTrailingCommas?: boolean
}
// used for json lsps since config supports jsonc
result.allowComments = true
result.allowTrailingCommas = true
return result
}
// used for json lsps since config supports jsonc
result.allowComments = true
result.allowTrailingCommas = true
const configFile = process.argv[2]
const tuiFile = process.argv[3]
await Bun.write(file, JSON.stringify(result, null, 2))
console.log(configFile)
await Bun.write(configFile, JSON.stringify(generate(Config.Info), null, 2))
if (tuiFile) {
console.log(tuiFile)
await Bun.write(tuiFile, JSON.stringify(generate(TuiConfig.Info), null, 2))
}
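The script above now writes the config schema to `process.argv[2]` and, when a third path is given, the TUI schema as well. A hedged invocation sketch (the script location and output paths are placeholders, not taken from this change):

```typescript
import { $ } from "bun"

// Two positional arguments: the opencode config schema output, then an
// optional TUI schema output (only written when provided).
await $`bun run script/schema.ts schema/config.json schema/tui.json`
```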

View File

@@ -44,6 +44,16 @@ opencode acp
opencode acp --cwd /path/to/project
```
### Question Tool Opt-In
The ACP integration excludes the `QuestionTool` by default. To opt in, set the flag when starting the agent:
```bash
OPENCODE_ENABLE_QUESTION_TOOL=1 opencode acp
```
Enable this only for ACP clients that support interactive question prompts.
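For ACP clients that spawn the agent process themselves, the flag only needs to be present in the child environment. A minimal sketch, assuming a Node-style launcher (the launcher shape is illustrative, not part of the ACP API):

```typescript
import { spawn } from "node:child_process"

// Opt in to the question tool only when the client can render
// interactive question prompts.
const agent = spawn("opencode", ["acp"], {
  env: { ...process.env, OPENCODE_ENABLE_QUESTION_TOOL: "1" },
  stdio: ["pipe", "pipe", "inherit"],
})
```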
### Programmatic
```typescript

View File

@@ -21,7 +21,6 @@ export class ACPSessionManager {
const session = await this.sdk.session
.create(
{
title: `ACP Session ${crypto.randomUUID()}`,
directory: cwd,
},
{ throwOnError: true },

View File

@@ -4,6 +4,8 @@ import { Database } from "../../storage/db"
import { Database as BunDatabase } from "bun:sqlite"
import { UI } from "../ui"
import { cmd } from "./cmd"
import { JsonMigration } from "../../storage/json-migration"
import { EOL } from "os"
const QueryCommand = cmd({
command: "$0 [query]",
@@ -58,11 +60,59 @@ const PathCommand = cmd({
},
})
const MigrateCommand = cmd({
command: "migrate",
describe: "migrate JSON data to SQLite (merges with existing data)",
handler: async () => {
const sqlite = new BunDatabase(Database.Path)
const tty = process.stderr.isTTY
const width = 36
const orange = "\x1b[38;5;214m"
const muted = "\x1b[0;2m"
const reset = "\x1b[0m"
let last = -1
if (tty) process.stderr.write("\x1b[?25l")
try {
const stats = await JsonMigration.run(sqlite, {
progress: (event) => {
const percent = Math.floor((event.current / event.total) * 100)
if (percent === last) return
last = percent
if (tty) {
const fill = Math.round((percent / 100) * width)
const bar = `${"■".repeat(fill)}${"・".repeat(width - fill)}`
process.stderr.write(
`\r${orange}${bar} ${percent.toString().padStart(3)}%${reset} ${muted}${event.current}/${event.total}${reset} `,
)
} else {
process.stderr.write(`sqlite-migration:${percent}${EOL}`)
}
},
})
      if (tty) {
        process.stderr.write("\n")
        process.stderr.write("\x1b[?25h")
      } else {
        process.stderr.write(`sqlite-migration:done${EOL}`)
      }
UI.println(
`Migration complete: ${stats.projects} projects, ${stats.sessions} sessions, ${stats.messages} messages`,
)
if (stats.errors.length > 0) {
UI.println(`${stats.errors.length} errors occurred during migration`)
}
} catch (err) {
if (tty) process.stderr.write("\x1b[?25h")
UI.error(`Migration failed: ${err instanceof Error ? err.message : String(err)}`)
process.exit(1)
} finally {
sqlite.close()
}
},
})
export const DbCommand = cmd({
command: "db",
describe: "database tools",
builder: (yargs: Argv) => {
return yargs.command(QueryCommand).command(PathCommand).demandCommand()
return yargs.command(QueryCommand).command(PathCommand).command(MigrateCommand).demandCommand()
},
handler: () => {},
})
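When stderr is not a TTY, the handler above emits plain `sqlite-migration:<percent>` lines followed by `sqlite-migration:done`, so other tooling can track progress. A hedged wrapper sketch (only the output format comes from the code above; the wrapper itself is illustrative):

```typescript
import { spawn } from "node:child_process"

// Run the migration with stderr piped (so isTTY is false) and surface
// the plain progress protocol defined above.
const proc = spawn("opencode", ["db", "migrate"], { stdio: ["ignore", "inherit", "pipe"] })
proc.stderr?.setEncoding("utf8")
proc.stderr?.on("data", (chunk: string) => {
  for (const line of chunk.split(/\r?\n/)) {
    const match = line.match(/^sqlite-migration:(\d+|done)$/)
    if (match) console.log("migration progress:", match[1])
  }
})
proc.on("exit", (code) => console.log("migration exited with", code))
```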

View File

@@ -38,10 +38,34 @@ function pagerCmd(): string[] {
export const SessionCommand = cmd({
command: "session",
describe: "manage sessions",
builder: (yargs: Argv) => yargs.command(SessionListCommand).demandCommand(),
builder: (yargs: Argv) => yargs.command(SessionListCommand).command(SessionDeleteCommand).demandCommand(),
async handler() {},
})
export const SessionDeleteCommand = cmd({
command: "delete <sessionID>",
describe: "delete a session",
builder: (yargs: Argv) => {
return yargs.positional("sessionID", {
describe: "session ID to delete",
type: "string",
demandOption: true,
})
},
handler: async (args) => {
await bootstrap(process.cwd(), async () => {
try {
await Session.get(args.sessionID)
} catch {
UI.error(`Session not found: ${args.sessionID}`)
process.exit(1)
}
await Session.remove(args.sessionID)
UI.println(UI.Style.TEXT_SUCCESS_BOLD + `Session ${args.sessionID} deleted` + UI.Style.TEXT_NORMAL)
})
},
})
export const SessionListCommand = cmd({
command: "list",
describe: "list sessions",

View File

@@ -38,6 +38,8 @@ import { ArgsProvider, useArgs, type Args } from "./context/args"
import open from "open"
import { writeHeapSnapshot } from "v8"
import { PromptRefProvider, usePromptRef } from "./context/prompt"
import { TuiConfigProvider } from "./context/tui-config"
import { TuiConfig } from "@/config/tui"
async function getTerminalBackgroundColor(): Promise<"dark" | "light"> {
// can't set raw mode if not a TTY
@@ -104,6 +106,7 @@ import type { EventSource } from "./context/sdk"
export function tui(input: {
url: string
args: Args
config: TuiConfig.Info
directory?: string
fetch?: typeof fetch
headers?: RequestInit["headers"]
@@ -138,35 +141,37 @@ export function tui(input: {
<KVProvider>
<ToastProvider>
<RouteProvider>
<SDKProvider
url={input.url}
directory={input.directory}
fetch={input.fetch}
headers={input.headers}
events={input.events}
>
<SyncProvider>
<ThemeProvider mode={mode}>
<LocalProvider>
<KeybindProvider>
<PromptStashProvider>
<DialogProvider>
<CommandProvider>
<FrecencyProvider>
<PromptHistoryProvider>
<PromptRefProvider>
<App />
</PromptRefProvider>
</PromptHistoryProvider>
</FrecencyProvider>
</CommandProvider>
</DialogProvider>
</PromptStashProvider>
</KeybindProvider>
</LocalProvider>
</ThemeProvider>
</SyncProvider>
</SDKProvider>
<TuiConfigProvider config={input.config}>
<SDKProvider
url={input.url}
directory={input.directory}
fetch={input.fetch}
headers={input.headers}
events={input.events}
>
<SyncProvider>
<ThemeProvider mode={mode}>
<LocalProvider>
<KeybindProvider>
<PromptStashProvider>
<DialogProvider>
<CommandProvider>
<FrecencyProvider>
<PromptHistoryProvider>
<PromptRefProvider>
<App />
</PromptRefProvider>
</PromptHistoryProvider>
</FrecencyProvider>
</CommandProvider>
</DialogProvider>
</PromptStashProvider>
</KeybindProvider>
</LocalProvider>
</ThemeProvider>
</SyncProvider>
</SDKProvider>
</TuiConfigProvider>
</RouteProvider>
</ToastProvider>
</KVProvider>

View File

@@ -1,6 +1,10 @@
import { cmd } from "../cmd"
import { UI } from "@/cli/ui"
import { tui } from "./app"
import { win32DisableProcessedInput, win32InstallCtrlCGuard } from "./win32"
import { TuiConfig } from "@/config/tui"
import { Instance } from "@/project/instance"
import { existsSync } from "fs"
export const AttachCommand = cmd({
command: "attach <url>",
@@ -16,11 +20,20 @@ export const AttachCommand = cmd({
type: "string",
description: "directory to run in",
})
.option("continue", {
alias: ["c"],
describe: "continue the last session",
type: "boolean",
})
.option("session", {
alias: ["s"],
type: "string",
describe: "session id to continue",
})
.option("fork", {
type: "boolean",
describe: "fork the session when continuing (use with --continue or --session)",
})
.option("password", {
alias: ["p"],
type: "string",
@@ -31,6 +44,12 @@ export const AttachCommand = cmd({
try {
win32DisableProcessedInput()
if (args.fork && !args.continue && !args.session) {
UI.error("--fork requires --continue or --session")
process.exitCode = 1
return
}
const directory = (() => {
if (!args.dir) return undefined
try {
@@ -47,9 +66,18 @@ export const AttachCommand = cmd({
const auth = `Basic ${Buffer.from(`opencode:${password}`).toString("base64")}`
return { Authorization: auth }
})()
const config = await Instance.provide({
directory: directory && existsSync(directory) ? directory : process.cwd(),
fn: () => TuiConfig.get(),
})
await tui({
url: args.url,
args: { sessionID: args.session },
config,
args: {
continue: args.continue,
sessionID: args.session,
fork: args.fork,
},
directory,
headers,
})

View File

@@ -247,7 +247,8 @@ export function Autocomplete(props: {
const width = props.anchor().width - 4
options.push(
...sortedFiles.map((item): AutocompleteOption => {
const fullPath = `${process.cwd()}/${item}`
const baseDir = (sync.data.path.directory || process.cwd()).replace(/\/+$/, "")
const fullPath = `${baseDir}/${item}`
const urlObj = pathToFileURL(fullPath)
let filename = item
if (lineRange && !item.endsWith("/")) {

View File

@@ -2,7 +2,7 @@ import { createMemo, createSignal, For } from "solid-js"
import { DEFAULT_THEMES, useTheme } from "@tui/context/theme"
const themeCount = Object.keys(DEFAULT_THEMES).length
const themeTip = `Use {highlight}/theme{/highlight} or {highlight}Ctrl+X T{/highlight} to switch between ${themeCount} built-in themes`
const themeTip = `Use {highlight}/themes{/highlight} or {highlight}Ctrl+X T{/highlight} to switch between ${themeCount} built-in themes`
type TipPart = { text: string; highlight: boolean }
@@ -80,11 +80,11 @@ const TIPS = [
"Switch to {highlight}Plan{/highlight} agent to get suggestions without making actual changes",
"Use {highlight}@agent-name{/highlight} in prompts to invoke specialized subagents",
"Press {highlight}Ctrl+X Right/Left{/highlight} to cycle through parent and child sessions",
"Create {highlight}opencode.json{/highlight} in project root for project-specific settings",
"Place settings in {highlight}~/.config/opencode/opencode.json{/highlight} for global config",
"Create {highlight}opencode.json{/highlight} for server settings and {highlight}tui.json{/highlight} for TUI settings",
"Place TUI settings in {highlight}~/.config/opencode/tui.json{/highlight} for global config",
"Add {highlight}$schema{/highlight} to your config for autocomplete in your editor",
"Configure {highlight}model{/highlight} in config to set your default model",
"Override any keybind in config via the {highlight}keybinds{/highlight} section",
"Override any keybind in {highlight}tui.json{/highlight} via the {highlight}keybinds{/highlight} section",
"Set any keybind to {highlight}none{/highlight} to disable it completely",
"Configure local or remote MCP servers in the {highlight}mcp{/highlight} config section",
"OpenCode auto-handles OAuth for remote MCP servers requiring auth",
@@ -126,7 +126,7 @@ const TIPS = [
"Use {highlight}{file:path}{/highlight} to include file contents in config values",
"Use {highlight}instructions{/highlight} in config to load additional rules files",
"Set agent {highlight}temperature{/highlight} from 0.0 (focused) to 1.0 (creative)",
"Configure {highlight}maxSteps{/highlight} to limit agentic iterations per request",
"Configure {highlight}steps{/highlight} to limit agentic iterations per request",
'Set {highlight}"tools": {"bash": false}{/highlight} to disable specific tools',
'Set {highlight}"mcp_*": false{/highlight} to disable all tools from an MCP server',
"Override global tool settings per agent configuration",
@@ -140,14 +140,13 @@ const TIPS = [
"Press {highlight}Ctrl+X G{/highlight} or {highlight}/timeline{/highlight} to jump to specific messages",
"Press {highlight}Ctrl+X H{/highlight} to toggle code block visibility in messages",
"Press {highlight}Ctrl+X S{/highlight} or {highlight}/status{/highlight} to see system status info",
"Enable {highlight}tui.scroll_acceleration{/highlight} for smooth macOS-style scrolling",
"Enable {highlight}scroll_acceleration{/highlight} in {highlight}tui.json{/highlight} for smooth macOS-style scrolling",
"Toggle username display in chat via command palette ({highlight}Ctrl+P{/highlight})",
"Run {highlight}docker run -it --rm ghcr.io/anomalyco/opencode{/highlight} for containerized use",
"Use {highlight}/connect{/highlight} with OpenCode Zen for curated, tested models",
"Commit your project's {highlight}AGENTS.md{/highlight} file to Git for team sharing",
"Use {highlight}/review{/highlight} to review uncommitted changes, branches, or PRs",
"Run {highlight}/help{/highlight} or {highlight}Ctrl+X H{/highlight} to show the help dialog",
"Use {highlight}/details{/highlight} to toggle tool execution details visibility",
"Use {highlight}/rename{/highlight} to rename the current session",
"Press {highlight}Ctrl+Z{/highlight} to suspend the terminal and return to your shell",
]

View File

@@ -1,5 +1,4 @@
import { createMemo } from "solid-js"
import { useSync } from "@tui/context/sync"
import { Keybind } from "@/util/keybind"
import { pipe, mapValues } from "remeda"
import type { KeybindsConfig } from "@opencode-ai/sdk/v2"
@@ -7,14 +6,15 @@ import type { ParsedKey, Renderable } from "@opentui/core"
import { createStore } from "solid-js/store"
import { useKeyboard, useRenderer } from "@opentui/solid"
import { createSimpleContext } from "./helper"
import { useTuiConfig } from "./tui-config"
export const { use: useKeybind, provider: KeybindProvider } = createSimpleContext({
name: "Keybind",
init: () => {
const sync = useSync()
const keybinds = createMemo(() => {
const config = useTuiConfig()
const keybinds = createMemo<Record<string, Keybind.Info[]>>(() => {
return pipe(
sync.data.config.keybinds ?? {},
(config.keybinds ?? {}) as Record<string, string>,
mapValues((value) => Keybind.parse(value)),
)
})

View File

@@ -1,7 +1,6 @@
import { SyntaxStyle, RGBA, type TerminalColors } from "@opentui/core"
import path from "path"
import { createEffect, createMemo, onMount } from "solid-js"
import { useSync } from "@tui/context/sync"
import { createSimpleContext } from "./helper"
import aura from "./theme/aura.json" with { type: "json" }
import ayu from "./theme/ayu.json" with { type: "json" }
@@ -41,6 +40,7 @@ import { useRenderer } from "@opentui/solid"
import { createStore, produce } from "solid-js/store"
import { Global } from "@/global"
import { Filesystem } from "@/util/filesystem"
import { useTuiConfig } from "./tui-config"
type ThemeColors = {
primary: RGBA
@@ -279,17 +279,17 @@ function ansiToRgba(code: number): RGBA {
export const { use: useTheme, provider: ThemeProvider } = createSimpleContext({
name: "Theme",
init: (props: { mode: "dark" | "light" }) => {
const sync = useSync()
const config = useTuiConfig()
const kv = useKV()
const [store, setStore] = createStore({
themes: DEFAULT_THEMES,
mode: kv.get("theme_mode", props.mode),
active: (sync.data.config.theme ?? kv.get("theme", "opencode")) as string,
active: (config.theme ?? kv.get("theme", "opencode")) as string,
ready: false,
})
createEffect(() => {
const theme = sync.data.config.theme
const theme = config.theme
if (theme) setStore("active", theme)
})

View File

@@ -0,0 +1,9 @@
import { TuiConfig } from "@/config/tui"
import { createSimpleContext } from "./helper"
export const { use: useTuiConfig, provider: TuiConfigProvider } = createSimpleContext({
name: "TuiConfig",
init: (props: { config: TuiConfig.Info }) => {
return props.config
},
})
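A hedged sketch of consuming the provider, mirroring the `diff_style` handling that the session and permission views switch to in this change (the helper name and signature are illustrative):

```typescript
import { createMemo } from "solid-js"
import { useTuiConfig } from "./tui-config"

// Call inside a component mounted under TuiConfigProvider: pick a diff
// layout from the startup TUI config instead of sync.data.config.
function createDiffView(width: () => number) {
  const config = useTuiConfig()
  return createMemo(() => {
    if (config.diff_style === "stacked") return "unified"
    return width() > 120 ? "split" : "unified"
  })
}
```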

View File

@@ -78,6 +78,7 @@ import { QuestionPrompt } from "./question"
import { DialogExportOptions } from "../../ui/dialog-export-options"
import { formatTranscript } from "../../util/transcript"
import { UI } from "@/cli/ui.ts"
import { useTuiConfig } from "../../context/tui-config"
addDefaultParsers(parsers.parsers)
@@ -100,6 +101,7 @@ const context = createContext<{
showDetails: () => boolean
diffWrapMode: () => "word" | "none"
sync: ReturnType<typeof useSync>
tui: ReturnType<typeof useTuiConfig>
}>()
function use() {
@@ -112,6 +114,7 @@ export function Session() {
const route = useRouteData("session")
const { navigate } = useRoute()
const sync = useSync()
const tuiConfig = useTuiConfig()
const kv = useKV()
const { theme } = useTheme()
const promptRef = usePromptRef()
@@ -164,7 +167,7 @@ export function Session() {
const contentWidth = createMemo(() => dimensions().width - (sidebarVisible() ? 42 : 0) - 4)
const scrollAcceleration = createMemo(() => {
const tui = sync.data.config.tui
const tui = tuiConfig
if (tui?.scroll_acceleration?.enabled) {
return new MacOSScrollAccel()
}
@@ -968,6 +971,7 @@ export function Session() {
showDetails,
diffWrapMode,
sync,
tui: tuiConfig,
}}
>
<box flexDirection="row">
@@ -1912,7 +1916,7 @@ function Edit(props: ToolProps<typeof EditTool>) {
const { theme, syntax } = useTheme()
const view = createMemo(() => {
const diffStyle = ctx.sync.data.config.tui?.diff_style
const diffStyle = ctx.tui.diff_style
if (diffStyle === "stacked") return "unified"
// Default to "auto" behavior
return ctx.width > 120 ? "split" : "unified"
@@ -1983,7 +1987,7 @@ function ApplyPatch(props: ToolProps<typeof ApplyPatchTool>) {
const files = createMemo(() => props.metadata.files ?? [])
const view = createMemo(() => {
const diffStyle = ctx.sync.data.config.tui?.diff_style
const diffStyle = ctx.tui.diff_style
if (diffStyle === "stacked") return "unified"
return ctx.width > 120 ? "split" : "unified"
})

View File

@@ -15,6 +15,7 @@ import { Keybind } from "@/util/keybind"
import { Locale } from "@/util/locale"
import { Global } from "@/global"
import { useDialog } from "../../ui/dialog"
import { useTuiConfig } from "../../context/tui-config"
type PermissionStage = "permission" | "always" | "reject"
@@ -48,14 +49,14 @@ function EditBody(props: { request: PermissionRequest }) {
const themeState = useTheme()
const theme = themeState.theme
const syntax = themeState.syntax
const sync = useSync()
const config = useTuiConfig()
const dimensions = useTerminalDimensions()
const filepath = createMemo(() => (props.request.metadata?.filepath as string) ?? "")
const diff = createMemo(() => (props.request.metadata?.diff as string) ?? "")
const view = createMemo(() => {
const diffStyle = sync.data.config.tui?.diff_style
const diffStyle = config.diff_style
if (diffStyle === "stacked") return "unified"
return dimensions().width > 120 ? "split" : "unified"
})

View File

@@ -10,6 +10,8 @@ import { withNetworkOptions, resolveNetworkOptions } from "@/cli/network"
import type { Event } from "@opencode-ai/sdk/v2"
import type { EventSource } from "./context/sdk"
import { win32DisableProcessedInput, win32InstallCtrlCGuard } from "./win32"
import { TuiConfig } from "@/config/tui"
import { Instance } from "@/project/instance"
declare global {
const OPENCODE_WORKER_PATH: string
@@ -133,6 +135,10 @@ export const TuiThreadCommand = cmd({
if (!args.prompt) return piped
return piped ? piped + "\n" + args.prompt : args.prompt
})
const config = await Instance.provide({
directory: cwd,
fn: () => TuiConfig.get(),
})
// Check if server should be started (port or hostname explicitly set in CLI or config)
const networkOpts = await resolveNetworkOptions(args)
@@ -161,6 +167,8 @@ export const TuiThreadCommand = cmd({
const tuiPromise = tui({
url,
config,
directory: cwd,
fetch: customFetch,
events,
args: {

View File

@@ -3,7 +3,6 @@ import path from "path"
import { pathToFileURL } from "url"
import os from "os"
import z from "zod"
import { Filesystem } from "../util/filesystem"
import { ModelsDev } from "../provider/models"
import { mergeDeep, pipe, unique } from "remeda"
import { Global } from "../global"
@@ -32,6 +31,7 @@ import { PackageRegistry } from "@/bun/registry"
import { proxied } from "@/util/proxied"
import { iife } from "@/util/iife"
import { Control } from "@/control"
import { ConfigPaths } from "./paths"
export namespace Config {
const ModelId = z.string().meta({ $ref: "https://models.dev/model-schema.json#/$defs/Model" })
@@ -40,7 +40,7 @@ export namespace Config {
// Managed settings directory for enterprise deployments (highest priority, admin-controlled)
// These settings override all user and project settings
function getManagedConfigDir(): string {
function systemManagedConfigDir(): string {
switch (process.platform) {
case "darwin":
return "/Library/Application Support/opencode"
@@ -51,10 +51,14 @@ export namespace Config {
}
}
const managedConfigDir = process.env.OPENCODE_TEST_MANAGED_CONFIG_DIR || getManagedConfigDir()
export function managedConfigDir() {
return process.env.OPENCODE_TEST_MANAGED_CONFIG_DIR || systemManagedConfigDir()
}
const managedDir = managedConfigDir()
// Custom merge function that concatenates array fields instead of replacing them
function merge(target: Info, source: Info): Info {
function mergeConfigConcatArrays(target: Info, source: Info): Info {
const merged = mergeDeep(target, source)
if (target.plugin && source.plugin) {
merged.plugin = Array.from(new Set([...target.plugin, ...source.plugin]))
@@ -89,7 +93,10 @@ export namespace Config {
const remoteConfig = wellknown.config ?? {}
// Add $schema to prevent load() from trying to write back to a non-existent file
if (!remoteConfig.$schema) remoteConfig.$schema = "https://opencode.ai/config.json"
result = merge(result, await load(JSON.stringify(remoteConfig), `${key}/.well-known/opencode`))
result = mergeConfigConcatArrays(
result,
await load(JSON.stringify(remoteConfig), `${key}/.well-known/opencode`),
)
log.debug("loaded remote config from well-known", { url: key })
}
}
@@ -99,21 +106,18 @@ export namespace Config {
}
// Global user config overrides remote config.
result = merge(result, await global())
result = mergeConfigConcatArrays(result, await global())
// Custom config path overrides global config.
if (Flag.OPENCODE_CONFIG) {
result = merge(result, await loadFile(Flag.OPENCODE_CONFIG))
result = mergeConfigConcatArrays(result, await loadFile(Flag.OPENCODE_CONFIG))
log.debug("loaded custom config", { path: Flag.OPENCODE_CONFIG })
}
// Project config overrides global and remote config.
if (!Flag.OPENCODE_DISABLE_PROJECT_CONFIG) {
for (const file of ["opencode.jsonc", "opencode.json"]) {
const found = await Filesystem.findUp(file, Instance.directory, Instance.worktree)
for (const resolved of found.toReversed()) {
result = merge(result, await loadFile(resolved))
}
for (const file of await ConfigPaths.projectFiles("opencode", Instance.directory, Instance.worktree)) {
result = mergeConfigConcatArrays(result, await loadFile(file))
}
}
@@ -121,31 +125,10 @@ export namespace Config {
result.mode = result.mode || {}
result.plugin = result.plugin || []
const directories = [
Global.Path.config,
// Only scan project .opencode/ directories when project discovery is enabled
...(!Flag.OPENCODE_DISABLE_PROJECT_CONFIG
? await Array.fromAsync(
Filesystem.up({
targets: [".opencode"],
start: Instance.directory,
stop: Instance.worktree,
}),
)
: []),
// Always scan ~/.opencode/ (user home directory)
...(await Array.fromAsync(
Filesystem.up({
targets: [".opencode"],
start: Global.Path.home,
stop: Global.Path.home,
}),
)),
]
const directories = await ConfigPaths.directories(Instance.directory, Instance.worktree)
// .opencode directory config overrides (project and global) config sources.
if (Flag.OPENCODE_CONFIG_DIR) {
directories.push(Flag.OPENCODE_CONFIG_DIR)
log.debug("loading config from OPENCODE_CONFIG_DIR", { path: Flag.OPENCODE_CONFIG_DIR })
}
@@ -155,7 +138,7 @@ export namespace Config {
if (dir.endsWith(".opencode") || dir === Flag.OPENCODE_CONFIG_DIR) {
for (const file of ["opencode.jsonc", "opencode.json"]) {
log.debug(`loading config from ${path.join(dir, file)}`)
result = merge(result, await loadFile(path.join(dir, file)))
result = mergeConfigConcatArrays(result, await loadFile(path.join(dir, file)))
// to satisfy the type checker
result.agent ??= {}
result.mode ??= {}
@@ -178,7 +161,7 @@ export namespace Config {
// Inline config content overrides all non-managed config sources.
if (Flag.OPENCODE_CONFIG_CONTENT) {
result = merge(result, JSON.parse(Flag.OPENCODE_CONFIG_CONTENT))
result = mergeConfigConcatArrays(result, JSON.parse(Flag.OPENCODE_CONFIG_CONTENT))
log.debug("loaded custom config from OPENCODE_CONFIG_CONTENT")
}
@@ -186,9 +169,9 @@ export namespace Config {
// Kept separate from directories array to avoid write operations when installing plugins
// which would fail on system directories requiring elevated permissions
// This way it only loads config file and not skills/plugins/commands
if (existsSync(managedConfigDir)) {
if (existsSync(managedDir)) {
for (const file of ["opencode.jsonc", "opencode.json"]) {
result = merge(result, await loadFile(path.join(managedConfigDir, file)))
result = mergeConfigConcatArrays(result, await loadFile(path.join(managedDir, file)))
}
}
@@ -227,8 +210,6 @@ export namespace Config {
result.share = "auto"
}
if (!result.keybinds) result.keybinds = Info.shape.keybinds.parse({})
// Apply flag overrides for compaction settings
if (Flag.OPENCODE_DISABLE_AUTOCOMPACT) {
result.compaction = { ...result.compaction, auto: false }
@@ -290,7 +271,7 @@ export namespace Config {
}
}
async function needsInstall(dir: string) {
export async function needsInstall(dir: string) {
// Some config dirs may be read-only.
// Installing deps there will fail; skip installation in that case.
const writable = await isWritable(dir)
@@ -918,20 +899,6 @@ export namespace Config {
ref: "KeybindsConfig",
})
export const TUI = z.object({
scroll_speed: z.number().min(0.001).optional().describe("TUI scroll speed"),
scroll_acceleration: z
.object({
enabled: z.boolean().describe("Enable scroll acceleration"),
})
.optional()
.describe("Scroll acceleration settings"),
diff_style: z
.enum(["auto", "stacked"])
.optional()
.describe("Control diff rendering style: 'auto' adapts to terminal width, 'stacked' always shows single column"),
})
export const Server = z
.object({
port: z.number().int().positive().optional().describe("Port to listen on"),
@@ -1006,10 +973,7 @@ export namespace Config {
export const Info = z
.object({
$schema: z.string().optional().describe("JSON schema reference for configuration validation"),
theme: z.string().optional().describe("Theme name to use for the interface"),
keybinds: Keybinds.optional().describe("Custom keybind configurations"),
logLevel: Log.Level.optional().describe("Log level"),
tui: TUI.optional().describe("TUI specific settings"),
server: Server.optional().describe("Server configuration for opencode serve and web commands"),
command: z
.record(z.string(), Command)
@@ -1229,86 +1193,32 @@ export namespace Config {
return result
})
export const { readFile } = ConfigPaths
async function loadFile(filepath: string): Promise<Info> {
log.info("loading", { path: filepath })
let text = await Bun.file(filepath)
.text()
.catch((err) => {
if (err.code === "ENOENT") return
throw new JsonError({ path: filepath }, { cause: err })
})
const text = await readFile(filepath)
if (!text) return {}
return load(text, filepath)
}
async function load(text: string, configFilepath: string) {
const original = text
text = text.replace(/\{env:([^}]+)\}/g, (_, varName) => {
return process.env[varName] || ""
})
const data = await ConfigPaths.parseText(text, configFilepath)
const fileMatches = text.match(/\{file:[^}]+\}/g)
if (fileMatches) {
const configDir = path.dirname(configFilepath)
const lines = text.split("\n")
const normalized = (() => {
if (!data || typeof data !== "object" || Array.isArray(data)) return data
const copy = { ...(data as Record<string, unknown>) }
const hadLegacy = "theme" in copy || "keybinds" in copy || "tui" in copy
if (!hadLegacy) return copy
delete copy.theme
delete copy.keybinds
delete copy.tui
log.warn("tui keys in opencode config are deprecated; move them to tui.json", { path: configFilepath })
return copy
})()
for (const match of fileMatches) {
const lineIndex = lines.findIndex((line) => line.includes(match))
if (lineIndex !== -1 && lines[lineIndex].trim().startsWith("//")) {
continue // Skip if line is commented
}
let filePath = match.replace(/^\{file:/, "").replace(/\}$/, "")
if (filePath.startsWith("~/")) {
filePath = path.join(os.homedir(), filePath.slice(2))
}
const resolvedPath = path.isAbsolute(filePath) ? filePath : path.resolve(configDir, filePath)
const fileContent = (
await Bun.file(resolvedPath)
.text()
.catch((error) => {
const errMsg = `bad file reference: "${match}"`
if (error.code === "ENOENT") {
throw new InvalidError(
{
path: configFilepath,
message: errMsg + ` ${resolvedPath} does not exist`,
},
{ cause: error },
)
}
throw new InvalidError({ path: configFilepath, message: errMsg }, { cause: error })
})
).trim()
// escape newlines/quotes, strip outer quotes
text = text.replace(match, () => JSON.stringify(fileContent).slice(1, -1))
}
}
const errors: JsoncParseError[] = []
const data = parseJsonc(text, errors, { allowTrailingComma: true })
if (errors.length) {
const lines = text.split("\n")
const errorDetails = errors
.map((e) => {
const beforeOffset = text.substring(0, e.offset).split("\n")
const line = beforeOffset.length
const column = beforeOffset[beforeOffset.length - 1].length + 1
const problemLine = lines[line - 1]
const error = `${printParseErrorCode(e.error)} at line ${line}, column ${column}`
if (!problemLine) return error
return `${error}\n Line ${line}: ${problemLine}\n${"".padStart(column + 9)}^`
})
.join("\n")
throw new JsonError({
path: configFilepath,
message: `\n--- JSONC Input ---\n${text}\n--- Errors ---\n${errorDetails}\n--- End ---`,
})
}
const parsed = Info.safeParse(data)
const parsed = Info.safeParse(normalized)
if (parsed.success) {
if (!parsed.data.$schema) {
parsed.data.$schema = "https://opencode.ai/config.json"
@@ -1333,13 +1243,7 @@ export namespace Config {
issues: parsed.error.issues,
})
}
export const JsonError = NamedError.create(
"ConfigJsonError",
z.object({
path: z.string(),
message: z.string().optional(),
}),
)
export const { JsonError, InvalidError } = ConfigPaths
export const ConfigDirectoryTypoError = NamedError.create(
"ConfigDirectoryTypoError",
@@ -1350,15 +1254,6 @@ export namespace Config {
}),
)
export const InvalidError = NamedError.create(
"ConfigInvalidError",
z.object({
path: z.string(),
issues: z.custom<z.core.$ZodIssue[]>().optional(),
message: z.string().optional(),
}),
)
export async function get() {
return state().then((x) => x.config)
}

View File

@@ -0,0 +1,155 @@
import path from "path"
import { applyEdits, modify, parse as parseJsonc } from "jsonc-parser"
import { unique } from "remeda"
import z from "zod"
import { ConfigPaths } from "./paths"
import { TuiInfo, TuiOptions } from "./tui-schema"
import { Instance } from "@/project/instance"
import { Flag } from "@/flag/flag"
import { Log } from "@/util/log"
import { Global } from "@/global"
const log = Log.create({ service: "tui.migrate" })
const TUI_SCHEMA_URL = "https://opencode.ai/tui.json"
const LegacyTheme = TuiInfo.shape.theme.optional()
const LegacyRecord = z.record(z.string(), z.unknown()).optional()
const TuiLegacy = z
.object({
scroll_speed: TuiOptions.shape.scroll_speed.catch(undefined),
scroll_acceleration: TuiOptions.shape.scroll_acceleration.catch(undefined),
diff_style: TuiOptions.shape.diff_style.catch(undefined),
})
.strip()
interface MigrateInput {
directories: string[]
custom?: string
managed: string
}
/**
* Migrates tui-specific keys (theme, keybinds, tui) from opencode.json files
 * into dedicated tui.json files. Migration runs per directory and
 * skips any location where a tui.json already exists.
*/
export async function migrateTuiConfig(input: MigrateInput) {
const opencode = await opencodeFiles(input)
for (const file of opencode) {
const source = await Bun.file(file)
.text()
.catch((error) => {
log.warn("failed to read config for tui migration", { path: file, error })
return undefined
})
if (!source) continue
const data = parseJsonc(source)
if (!data || typeof data !== "object" || Array.isArray(data)) continue
const theme = LegacyTheme.safeParse("theme" in data ? data.theme : undefined)
const keybinds = LegacyRecord.safeParse("keybinds" in data ? data.keybinds : undefined)
const legacyTui = LegacyRecord.safeParse("tui" in data ? data.tui : undefined)
const extracted = {
theme: theme.success ? theme.data : undefined,
keybinds: keybinds.success ? keybinds.data : undefined,
tui: legacyTui.success ? legacyTui.data : undefined,
}
const tui = extracted.tui ? normalizeTui(extracted.tui) : undefined
if (extracted.theme === undefined && extracted.keybinds === undefined && !tui) continue
const target = path.join(path.dirname(file), "tui.json")
const targetExists = await Bun.file(target).exists()
if (targetExists) continue
const payload: Record<string, unknown> = {
$schema: TUI_SCHEMA_URL,
}
if (extracted.theme !== undefined) payload.theme = extracted.theme
if (extracted.keybinds !== undefined) payload.keybinds = extracted.keybinds
if (tui) Object.assign(payload, tui)
const wrote = await Bun.write(target, JSON.stringify(payload, null, 2))
.then(() => true)
.catch((error) => {
log.warn("failed to write tui migration target", { from: file, to: target, error })
return false
})
if (!wrote) continue
const stripped = await backupAndStripLegacy(file, source)
if (!stripped) {
log.warn("tui config migrated but source file was not stripped", { from: file, to: target })
continue
}
log.info("migrated tui config", { from: file, to: target })
}
}
function normalizeTui(data: Record<string, unknown>) {
const parsed = TuiLegacy.parse(data)
if (
parsed.scroll_speed === undefined &&
parsed.diff_style === undefined &&
parsed.scroll_acceleration === undefined
) {
return
}
return parsed
}
async function backupAndStripLegacy(file: string, source: string) {
const backup = file + ".tui-migration.bak"
const hasBackup = await Bun.file(backup).exists()
const backed = hasBackup
? true
: await Bun.write(backup, source)
.then(() => true)
.catch((error) => {
log.warn("failed to backup source config during tui migration", { path: file, backup, error })
return false
})
if (!backed) return false
const text = ["theme", "keybinds", "tui"].reduce((acc, key) => {
const edits = modify(acc, [key], undefined, {
formattingOptions: {
insertSpaces: true,
tabSize: 2,
},
})
if (!edits.length) return acc
return applyEdits(acc, edits)
}, source)
return Bun.write(file, text)
.then(() => {
log.info("stripped tui keys from server config", { path: file, backup })
return true
})
.catch((error) => {
log.warn("failed to strip legacy tui keys from server config", { path: file, backup, error })
return false
})
}
async function opencodeFiles(input: { directories: string[]; managed: string }) {
const project = Flag.OPENCODE_DISABLE_PROJECT_CONFIG
? []
: await ConfigPaths.projectFiles("opencode", Instance.directory, Instance.worktree)
const files = [...project, ...ConfigPaths.fileInDirectory(Global.Path.config, "opencode")]
for (const dir of unique(input.directories)) {
files.push(...ConfigPaths.fileInDirectory(dir, "opencode"))
}
if (Flag.OPENCODE_CONFIG) files.push(Flag.OPENCODE_CONFIG)
files.push(...ConfigPaths.fileInDirectory(input.managed, "opencode"))
const existing = await Promise.all(
unique(files).map(async (file) => {
const ok = await Bun.file(file).exists()
return ok ? file : undefined
}),
)
return existing.filter((file): file is string => !!file)
}
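A hedged sketch of wiring the migration up, assuming it is driven from config bootstrap with the same directory set the loader uses (the import paths and call site are assumptions, not shown in this change):

```typescript
import { migrateTuiConfig } from "@/config/tui-migration" // module path assumed
import { ConfigPaths } from "@/config/paths"
import { Config } from "@/config/config"
import { Instance } from "@/project/instance"
import { Flag } from "@/flag/flag"

// Run once before config/tui.json are read, so legacy keys land in tui.json.
await migrateTuiConfig({
  directories: await ConfigPaths.directories(Instance.directory, Instance.worktree),
  custom: Flag.OPENCODE_CONFIG,
  managed: Config.managedConfigDir(),
})
```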

View File

@@ -0,0 +1,154 @@
import path from "path"
import os from "os"
import z from "zod"
import { type ParseError as JsoncParseError, parse as parseJsonc, printParseErrorCode } from "jsonc-parser"
import { NamedError } from "@opencode-ai/util/error"
import { Filesystem } from "@/util/filesystem"
import { Flag } from "@/flag/flag"
import { Global } from "@/global"
export namespace ConfigPaths {
export async function projectFiles(name: string, directory: string, worktree: string) {
const files: string[] = []
for (const file of [`${name}.jsonc`, `${name}.json`]) {
const found = await Filesystem.findUp(file, directory, worktree)
for (const resolved of found.toReversed()) {
files.push(resolved)
}
}
return files
}
export async function directories(directory: string, worktree: string) {
return [
Global.Path.config,
...(!Flag.OPENCODE_DISABLE_PROJECT_CONFIG
? await Array.fromAsync(
Filesystem.up({
targets: [".opencode"],
start: directory,
stop: worktree,
}),
)
: []),
...(await Array.fromAsync(
Filesystem.up({
targets: [".opencode"],
start: Global.Path.home,
stop: Global.Path.home,
}),
)),
...(Flag.OPENCODE_CONFIG_DIR ? [Flag.OPENCODE_CONFIG_DIR] : []),
]
}
export function fileInDirectory(dir: string, name: string) {
return [path.join(dir, `${name}.jsonc`), path.join(dir, `${name}.json`)]
}
export const JsonError = NamedError.create(
"ConfigJsonError",
z.object({
path: z.string(),
message: z.string().optional(),
}),
)
export const InvalidError = NamedError.create(
"ConfigInvalidError",
z.object({
path: z.string(),
issues: z.custom<z.core.$ZodIssue[]>().optional(),
message: z.string().optional(),
}),
)
/** Read a config file, returning undefined for missing files and throwing JsonError for other failures. */
export async function readFile(filepath: string) {
return Bun.file(filepath)
.text()
.catch((err) => {
if (err.code === "ENOENT") return
throw new JsonError({ path: filepath }, { cause: err })
})
}
/** Apply {env:VAR} and {file:path} substitutions to config text. */
async function substitute(text: string, configFilepath: string, missing: "error" | "empty" = "error") {
text = text.replace(/\{env:([^}]+)\}/g, (_, varName) => {
return process.env[varName] || ""
})
const fileMatches = text.match(/\{file:[^}]+\}/g)
if (!fileMatches) return text
const configDir = path.dirname(configFilepath)
const lines = text.split("\n")
for (const match of fileMatches) {
const lineIndex = lines.findIndex((line) => line.includes(match))
if (lineIndex !== -1 && lines[lineIndex].trim().startsWith("//")) continue
let filePath = match.replace(/^\{file:/, "").replace(/\}$/, "")
if (filePath.startsWith("~/")) {
filePath = path.join(os.homedir(), filePath.slice(2))
}
const resolvedPath = path.isAbsolute(filePath) ? filePath : path.resolve(configDir, filePath)
const fileContent = (
await Bun.file(resolvedPath)
.text()
.catch((error) => {
if (missing === "empty") return ""
const errMsg = `bad file reference: "${match}"`
if (error.code === "ENOENT") {
throw new InvalidError(
{
path: configFilepath,
message: errMsg + ` ${resolvedPath} does not exist`,
},
{ cause: error },
)
}
throw new InvalidError({ path: configFilepath, message: errMsg }, { cause: error })
})
).trim()
text = text.replace(match, () => JSON.stringify(fileContent).slice(1, -1))
}
return text
}
/** Substitute and parse JSONC text, throwing JsonError on syntax errors. */
export async function parseText(text: string, configFilepath: string, missing: "error" | "empty" = "error") {
text = await substitute(text, configFilepath, missing)
const errors: JsoncParseError[] = []
const data = parseJsonc(text, errors, { allowTrailingComma: true })
if (errors.length) {
const lines = text.split("\n")
const errorDetails = errors
.map((e) => {
const beforeOffset = text.substring(0, e.offset).split("\n")
const line = beforeOffset.length
const column = beforeOffset[beforeOffset.length - 1].length + 1
const problemLine = lines[line - 1]
const error = `${printParseErrorCode(e.error)} at line ${line}, column ${column}`
if (!problemLine) return error
return `${error}\n Line ${line}: ${problemLine}\n${"".padStart(column + 9)}^`
})
.join("\n")
throw new JsonError({
path: configFilepath,
message: `\n--- JSONC Input ---\n${text}\n--- Errors ---\n${errorDetails}\n--- End ---`,
})
}
return data
}
}
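
As a rough illustration of the `{env:VAR}` step inside `substitute` above (a sketch; `MY_TOKEN` is just an example variable, and the `{file:...}` handling is omitted):

```ts
// Mirrors the {env:VAR} regex used by substitute(); unknown variables collapse to "".
function substituteEnv(text: string, env: Record<string, string | undefined> = process.env) {
  return text.replace(/\{env:([^}]+)\}/g, (_, name) => env[name] || "")
}

process.env["MY_TOKEN"] = "abc123" // illustrative variable name
console.log(substituteEnv(`{ "apiKey": "{env:MY_TOKEN}" }`))
// => { "apiKey": "abc123" }
```
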

View File

@@ -0,0 +1,25 @@
import z from "zod"
import { Config } from "./config"
export const TuiOptions = z.object({
scroll_speed: z.number().min(0.001).optional().describe("TUI scroll speed"),
scroll_acceleration: z
.object({
enabled: z.boolean().describe("Enable scroll acceleration"),
})
.optional()
.describe("Scroll acceleration settings"),
diff_style: z
.enum(["auto", "stacked"])
.optional()
.describe("Control diff rendering style: 'auto' adapts to terminal width, 'stacked' always shows single column"),
})
export const TuiInfo = z
.object({
$schema: z.string().optional(),
theme: z.string().optional(),
keybinds: Config.Keybinds.optional(),
})
.extend(TuiOptions.shape)
.strict()
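
A usage sketch of the schema above, assuming it is importable as `@/config/tui-schema`: because the object is `.strict()`, unknown keys fail validation, which is what lets the TUI config loader warn and fall back to an empty config rather than silently accepting typos.

```ts
import { TuiInfo } from "@/config/tui-schema" // assumed module path

const ok = TuiInfo.safeParse({ theme: "tokyonight", scroll_speed: 3 })
console.log(ok.success) // true

// Misspelled key: rejected because the schema is strict.
const bad = TuiInfo.safeParse({ theme: "tokyonight", scroll_speedd: 3 })
console.log(bad.success) // false
if (!bad.success) console.log(bad.error.issues.map((issue) => issue.code)) // [ "unrecognized_keys" ]
```
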

View File

@@ -0,0 +1,118 @@
import { existsSync } from "fs"
import z from "zod"
import { mergeDeep, unique } from "remeda"
import { Config } from "./config"
import { ConfigPaths } from "./paths"
import { migrateTuiConfig } from "./migrate-tui-config"
import { TuiInfo } from "./tui-schema"
import { Instance } from "@/project/instance"
import { Flag } from "@/flag/flag"
import { Log } from "@/util/log"
import { Global } from "@/global"
export namespace TuiConfig {
const log = Log.create({ service: "tui.config" })
export const Info = TuiInfo
export type Info = z.output<typeof Info>
function mergeInfo(target: Info, source: Info): Info {
return mergeDeep(target, source)
}
function customPath() {
return Flag.OPENCODE_TUI_CONFIG
}
const state = Instance.state(async () => {
let projectFiles = Flag.OPENCODE_DISABLE_PROJECT_CONFIG
? []
: await ConfigPaths.projectFiles("tui", Instance.directory, Instance.worktree)
const directories = await ConfigPaths.directories(Instance.directory, Instance.worktree)
const custom = customPath()
const managed = Config.managedConfigDir()
await migrateTuiConfig({ directories, custom, managed })
// Re-compute after migration since migrateTuiConfig may have created new tui.json files
projectFiles = Flag.OPENCODE_DISABLE_PROJECT_CONFIG
? []
: await ConfigPaths.projectFiles("tui", Instance.directory, Instance.worktree)
let result: Info = {}
for (const file of ConfigPaths.fileInDirectory(Global.Path.config, "tui")) {
result = mergeInfo(result, await loadFile(file))
}
if (custom) {
result = mergeInfo(result, await loadFile(custom))
log.debug("loaded custom tui config", { path: custom })
}
for (const file of projectFiles) {
result = mergeInfo(result, await loadFile(file))
}
for (const dir of unique(directories)) {
if (!dir.endsWith(".opencode") && dir !== Flag.OPENCODE_CONFIG_DIR) continue
for (const file of ConfigPaths.fileInDirectory(dir, "tui")) {
result = mergeInfo(result, await loadFile(file))
}
}
if (existsSync(managed)) {
for (const file of ConfigPaths.fileInDirectory(managed, "tui")) {
result = mergeInfo(result, await loadFile(file))
}
}
result.keybinds ??= Config.Keybinds.parse({})
return {
config: result,
}
})
export async function get() {
return state().then((x) => x.config)
}
async function loadFile(filepath: string): Promise<Info> {
const text = await ConfigPaths.readFile(filepath)
if (!text) return {}
return load(text, filepath).catch((error) => {
log.warn("failed to load tui config", { path: filepath, error })
return {}
})
}
async function load(text: string, configFilepath: string): Promise<Info> {
const data = await ConfigPaths.parseText(text, configFilepath, "empty")
if (!data || typeof data !== "object" || Array.isArray(data)) return {}
// Flatten a nested "tui" key so users who wrote `{ "tui": { ... } }` inside tui.json
// (mirroring the old opencode.json shape) still get their settings applied.
const normalized = (() => {
const copy = { ...(data as Record<string, unknown>) }
if (!("tui" in copy)) return copy
if (!copy.tui || typeof copy.tui !== "object" || Array.isArray(copy.tui)) {
delete copy.tui
return copy
}
const tui = copy.tui as Record<string, unknown>
delete copy.tui
return {
...tui,
...copy,
}
})()
const parsed = Info.safeParse(normalized)
if (!parsed.success) {
log.warn("invalid tui config", { path: configFilepath, issues: parsed.error.issues })
return {}
}
return parsed.data
}
}

View File

@@ -7,6 +7,7 @@ export namespace Flag {
export const OPENCODE_AUTO_SHARE = truthy("OPENCODE_AUTO_SHARE")
export const OPENCODE_GIT_BASH_PATH = process.env["OPENCODE_GIT_BASH_PATH"]
export const OPENCODE_CONFIG = process.env["OPENCODE_CONFIG"]
export declare const OPENCODE_TUI_CONFIG: string | undefined
export declare const OPENCODE_CONFIG_DIR: string | undefined
export const OPENCODE_CONFIG_CONTENT = process.env["OPENCODE_CONFIG_CONTENT"]
export const OPENCODE_DISABLE_AUTOUPDATE = truthy("OPENCODE_DISABLE_AUTOUPDATE")
@@ -30,6 +31,7 @@ export namespace Flag {
export declare const OPENCODE_CLIENT: string
export const OPENCODE_SERVER_PASSWORD = process.env["OPENCODE_SERVER_PASSWORD"]
export const OPENCODE_SERVER_USERNAME = process.env["OPENCODE_SERVER_USERNAME"]
export const OPENCODE_ENABLE_QUESTION_TOOL = truthy("OPENCODE_ENABLE_QUESTION_TOOL")
// Experimental
export const OPENCODE_EXPERIMENTAL = truthy("OPENCODE_EXPERIMENTAL")
@@ -73,6 +75,17 @@ Object.defineProperty(Flag, "OPENCODE_DISABLE_PROJECT_CONFIG", {
configurable: false,
})
// Dynamic getter for OPENCODE_TUI_CONFIG
// This must be evaluated at access time, not module load time,
// because tests and external tooling may set this env var at runtime
Object.defineProperty(Flag, "OPENCODE_TUI_CONFIG", {
get() {
return process.env["OPENCODE_TUI_CONFIG"]
},
enumerable: true,
configurable: false,
})
// Dynamic getter for OPENCODE_CONFIG_DIR
// This must be evaluated at access time, not module load time,
// because external tooling may set this env var at runtime

View File

@@ -364,3 +364,21 @@ export const ormolu: Info = {
return Bun.which("ormolu") !== null
},
}
export const cljfmt: Info = {
name: "cljfmt",
command: ["cljfmt", "fix", "--quiet", "$FILE"],
extensions: [".clj", ".cljs", ".cljc", ".edn"],
async enabled() {
return Bun.which("cljfmt") !== null
},
}
export const dfmt: Info = {
name: "dfmt",
command: ["dfmt", "-i", "$FILE"],
extensions: [".d"],
async enabled() {
return Bun.which("dfmt") !== null
},
}

View File

@@ -57,6 +57,30 @@ export namespace Provider {
return isGpt5OrLater(modelID) && !modelID.startsWith("gpt-5-mini")
}
function googleVertexVars(options: Record<string, any>) {
const project =
options["project"] ?? Env.get("GOOGLE_CLOUD_PROJECT") ?? Env.get("GCP_PROJECT") ?? Env.get("GCLOUD_PROJECT")
const location =
options["location"] ?? Env.get("GOOGLE_CLOUD_LOCATION") ?? Env.get("VERTEX_LOCATION") ?? "us-central1"
const endpoint = location === "global" ? "aiplatform.googleapis.com" : `${location}-aiplatform.googleapis.com`
return {
GOOGLE_VERTEX_PROJECT: project,
GOOGLE_VERTEX_LOCATION: location,
GOOGLE_VERTEX_ENDPOINT: endpoint,
}
}
function loadBaseURL(model: Model, options: Record<string, any>) {
const raw = options["baseURL"] ?? model.api.url
if (typeof raw !== "string") return raw
const vars = model.providerID === "google-vertex" ? googleVertexVars(options) : undefined
return raw.replace(/\$\{([^}]+)\}/g, (match, key) => {
const val = Env.get(String(key)) ?? vars?.[String(key) as keyof typeof vars]
return val ?? match
})
}
const BUNDLED_PROVIDERS: Record<string, (options: any) => SDK> = {
"@ai-sdk/amazon-bedrock": createAmazonBedrock,
"@ai-sdk/anthropic": createAnthropic,
@@ -353,9 +377,16 @@ export namespace Provider {
},
}
},
"google-vertex": async () => {
const project = Env.get("GOOGLE_CLOUD_PROJECT") ?? Env.get("GCP_PROJECT") ?? Env.get("GCLOUD_PROJECT")
const location = Env.get("GOOGLE_CLOUD_LOCATION") ?? Env.get("VERTEX_LOCATION") ?? "us-east5"
"google-vertex": async (provider) => {
const project =
provider.options?.project ??
Env.get("GOOGLE_CLOUD_PROJECT") ??
Env.get("GCP_PROJECT") ??
Env.get("GCLOUD_PROJECT")
const location =
provider.options?.location ?? Env.get("GOOGLE_CLOUD_LOCATION") ?? Env.get("VERTEX_LOCATION") ?? "us-central1"
const autoload = Boolean(project)
if (!autoload) return { autoload: false }
return {
@@ -363,6 +394,18 @@ export namespace Provider {
options: {
project,
location,
fetch: async (input: RequestInfo | URL, init?: RequestInit) => {
const { GoogleAuth } = await import(await BunProc.install("google-auth-library"))
const auth = new GoogleAuth()
const client = await auth.getApplicationDefault()
const credentials = await client.credential
const token = await credentials.getAccessToken()
const headers = new Headers(init?.headers)
headers.set("Authorization", `Bearer ${token.token}`)
return fetch(input, { ...init, headers })
},
},
async getModel(sdk: any, modelID: string) {
const id = String(modelID).trim()
@@ -994,11 +1037,16 @@ export namespace Provider {
const provider = s.providers[model.providerID]
const options = { ...provider.options }
if (model.providerID === "google-vertex" && !model.api.npm.includes("@ai-sdk/openai-compatible")) {
delete options.fetch
}
if (model.api.npm.includes("@ai-sdk/openai-compatible") && options["includeUsage"] !== false) {
options["includeUsage"] = true
}
if (!options["baseURL"]) options["baseURL"] = model.api.url
const baseURL = loadBaseURL(model, options)
if (baseURL !== undefined) options["baseURL"] = baseURL
if (options["apiKey"] === undefined && provider.key) options["apiKey"] = provider.key
if (model.headers)
options["headers"] = {

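As a minimal sketch of the `${VAR}` templating that `loadBaseURL` performs on a provider base URL (the template and values below are illustrative, not the provider's actual `model.api.url`):

```ts
// Replace ${NAME} placeholders, leaving unknown names untouched (mirroring loadBaseURL).
function expandBaseURL(raw: string, vars: Record<string, string | undefined>) {
  return raw.replace(/\$\{([^}]+)\}/g, (match, key) => vars[String(key)] ?? match)
}

console.log(
  expandBaseURL(
    "https://${GOOGLE_VERTEX_ENDPOINT}/v1/projects/${GOOGLE_VERTEX_PROJECT}/locations/${GOOGLE_VERTEX_LOCATION}",
    {
      GOOGLE_VERTEX_PROJECT: "my-project",
      GOOGLE_VERTEX_LOCATION: "us-central1",
      GOOGLE_VERTEX_ENDPOINT: "us-central1-aiplatform.googleapis.com",
    },
  ),
)
// => https://us-central1-aiplatform.googleapis.com/v1/projects/my-project/locations/us-central1
```
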
View File

@@ -298,8 +298,8 @@ export namespace ProviderTransform {
if (id.includes("glm-4.7")) return 1.0
if (id.includes("minimax-m2")) return 1.0
if (id.includes("kimi-k2")) {
// kimi-k2-thinking & kimi-k2.5 && kimi-k2p5
if (id.includes("thinking") || id.includes("k2.") || id.includes("k2p")) {
// kimi-k2-thinking & kimi-k2.5 && kimi-k2p5 && kimi-k2-5
if (["thinking", "k2.", "k2p", "k2-5"].some((s) => id.includes(s))) {
return 1.0
}
return 0.6
@@ -310,7 +310,7 @@ export namespace ProviderTransform {
export function topP(model: Provider.Model) {
const id = model.id.toLowerCase()
if (id.includes("qwen")) return 1
if (id.includes("minimax-m2") || id.includes("kimi-k2.5") || id.includes("kimi-k2p5") || id.includes("gemini")) {
if (["minimax-m2", "gemini", "kimi-k2.5", "kimi-k2p5", "kimi-k2-5"].some((s) => id.includes(s))) {
return 0.95
}
return undefined
@@ -319,7 +319,7 @@ export namespace ProviderTransform {
export function topK(model: Provider.Model) {
const id = model.id.toLowerCase()
if (id.includes("minimax-m2")) {
if (id.includes("m2.1")) return 40
if (["m2.", "m25", "m21"].some((s) => id.includes(s))) return 40
return 20
}
if (id.includes("gemini")) return 64
@@ -802,6 +802,11 @@ export namespace ProviderTransform {
}
return { reasoningEffort: "minimal" }
}
if (model.providerID === "venice") {
return { veniceParameters: { disableThinking: true } }
}
return {}
}

View File

@@ -445,6 +445,12 @@ export namespace SessionPrompt {
log.error("subtask execution failed", { error, agent: task.agent, description: task.description })
return undefined
})
const attachments = result?.attachments?.map((attachment) => ({
...attachment,
id: Identifier.ascending("part"),
sessionID,
messageID: assistantMessage.id,
}))
await Plugin.trigger(
"tool.execute.after",
{
@@ -467,7 +473,7 @@ export namespace SessionPrompt {
title: result.title,
metadata: result.metadata,
output: result.output,
attachments: result.attachments,
attachments,
time: {
...part.state.time,
end: Date.now(),
@@ -797,6 +803,15 @@ export namespace SessionPrompt {
},
)
const result = await item.execute(args, ctx)
const output = {
...result,
attachments: result.attachments?.map((attachment) => ({
...attachment,
id: Identifier.ascending("part"),
sessionID: ctx.sessionID,
messageID: input.processor.message.id,
})),
}
await Plugin.trigger(
"tool.execute.after",
{
@@ -805,9 +820,9 @@ export namespace SessionPrompt {
callID: ctx.callID,
args,
},
result,
output,
)
return result
return output
},
})
}
@@ -855,16 +870,13 @@ export namespace SessionPrompt {
)
const textParts: string[] = []
const attachments: MessageV2.FilePart[] = []
const attachments: Omit<MessageV2.FilePart, "id" | "sessionID" | "messageID">[] = []
for (const contentItem of result.content) {
if (contentItem.type === "text") {
textParts.push(contentItem.text)
} else if (contentItem.type === "image") {
attachments.push({
id: Identifier.ascending("part"),
sessionID: input.session.id,
messageID: input.processor.message.id,
type: "file",
mime: contentItem.mimeType,
url: `data:${contentItem.mimeType};base64,${contentItem.data}`,
@@ -876,9 +888,6 @@ export namespace SessionPrompt {
}
if (resource.blob) {
attachments.push({
id: Identifier.ascending("part"),
sessionID: input.session.id,
messageID: input.processor.message.id,
type: "file",
mime: resource.mimeType ?? "application/octet-stream",
url: `data:${resource.mimeType ?? "application/octet-stream"};base64,${resource.blob}`,
@@ -965,17 +974,22 @@ export namespace SessionPrompt {
}
using _ = defer(() => InstructionPrompt.clear(info.id))
type Draft<T> = T extends MessageV2.Part ? Omit<T, "id"> & { id?: string } : never
const assign = (part: Draft<MessageV2.Part>): MessageV2.Part => ({
...part,
id: part.id ?? Identifier.ascending("part"),
})
const parts = await Promise.all(
input.parts.map(async (part): Promise<MessageV2.Part[]> => {
input.parts.map(async (part): Promise<Draft<MessageV2.Part>[]> => {
if (part.type === "file") {
// before checking the protocol we check if this is an mcp resource because it needs special handling
if (part.source?.type === "resource") {
const { clientName, uri } = part.source
log.info("mcp resource", { clientName, uri, mime: part.mime })
const pieces: MessageV2.Part[] = [
const pieces: Draft<MessageV2.Part>[] = [
{
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -998,7 +1012,6 @@ export namespace SessionPrompt {
for (const content of contents) {
if ("text" in content && content.text) {
pieces.push({
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1009,7 +1022,6 @@ export namespace SessionPrompt {
// Handle binary content if needed
const mimeType = "mimeType" in content ? content.mimeType : part.mime
pieces.push({
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1021,7 +1033,6 @@ export namespace SessionPrompt {
pieces.push({
...part,
id: part.id ?? Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
})
@@ -1029,7 +1040,6 @@ export namespace SessionPrompt {
log.error("failed to read MCP resource", { error, clientName, uri })
const message = error instanceof Error ? error.message : String(error)
pieces.push({
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1046,7 +1056,6 @@ export namespace SessionPrompt {
if (part.mime === "text/plain") {
return [
{
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1054,7 +1063,6 @@ export namespace SessionPrompt {
text: `Called the Read tool with the following input: ${JSON.stringify({ filePath: part.filename })}`,
},
{
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1063,7 +1071,6 @@ export namespace SessionPrompt {
},
{
...part,
id: part.id ?? Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
},
@@ -1120,9 +1127,8 @@ export namespace SessionPrompt {
}
const args = { filePath: filepath, offset, limit }
const pieces: MessageV2.Part[] = [
const pieces: Draft<MessageV2.Part>[] = [
{
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1146,7 +1152,6 @@ export namespace SessionPrompt {
}
const result = await t.execute(args, readCtx)
pieces.push({
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1166,7 +1171,6 @@ export namespace SessionPrompt {
} else {
pieces.push({
...part,
id: part.id ?? Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
})
@@ -1182,7 +1186,6 @@ export namespace SessionPrompt {
}).toObject(),
})
pieces.push({
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1209,7 +1212,6 @@ export namespace SessionPrompt {
const result = await ReadTool.init().then((t) => t.execute(args, listCtx))
return [
{
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1217,7 +1219,6 @@ export namespace SessionPrompt {
text: `Called the Read tool with the following input: ${JSON.stringify(args)}`,
},
{
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1226,7 +1227,6 @@ export namespace SessionPrompt {
},
{
...part,
id: part.id ?? Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
},
@@ -1237,7 +1237,6 @@ export namespace SessionPrompt {
FileTime.read(input.sessionID, filepath)
return [
{
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1245,7 +1244,7 @@ export namespace SessionPrompt {
synthetic: true,
},
{
id: part.id ?? Identifier.ascending("part"),
id: part.id,
messageID: info.id,
sessionID: input.sessionID,
type: "file",
@@ -1264,13 +1263,11 @@ export namespace SessionPrompt {
const hint = perm.action === "deny" ? " . Invoked by user; guaranteed to exist." : ""
return [
{
id: Identifier.ascending("part"),
...part,
messageID: info.id,
sessionID: input.sessionID,
},
{
id: Identifier.ascending("part"),
messageID: info.id,
sessionID: input.sessionID,
type: "text",
@@ -1287,14 +1284,13 @@ export namespace SessionPrompt {
return [
{
id: Identifier.ascending("part"),
...part,
messageID: info.id,
sessionID: input.sessionID,
},
]
}),
).then((x) => x.flat())
).then((x) => x.flat().map(assign))
await Plugin.trigger(
"chat.message",

View File

@@ -1,5 +1,3 @@
import { Provider } from "@/provider/provider"
import { fn } from "@/util/fn"
import z from "zod"
import { Session } from "."
@@ -8,16 +6,10 @@ import { MessageV2 } from "./message-v2"
import { Identifier } from "@/id/id"
import { Snapshot } from "@/snapshot"
import { Log } from "@/util/log"
import { Storage } from "@/storage/storage"
import { Bus } from "@/bus"
import { LLM } from "./llm"
import { Agent } from "@/agent/agent"
export namespace SessionSummary {
const log = Log.create({ service: "session.summary" })
function unquoteGitPath(input: string) {
if (!input.startsWith('"')) return input
if (!input.endsWith('"')) return input
@@ -117,41 +109,6 @@ export namespace SessionSummary {
diffs,
}
await Session.updateMessage(userMsg)
const textPart = msgWithParts.parts.find((p) => p.type === "text" && !p.synthetic) as MessageV2.TextPart
if (textPart && !userMsg.summary?.title) {
const agent = await Agent.get("title")
if (!agent) return
const stream = await LLM.stream({
agent,
user: userMsg,
tools: {},
model: agent.model
? await Provider.getModel(agent.model.providerID, agent.model.modelID)
: ((await Provider.getSmallModel(userMsg.model.providerID)) ??
(await Provider.getModel(userMsg.model.providerID, userMsg.model.modelID))),
small: true,
messages: [
{
role: "user" as const,
content: `
The following is the text to summarize:
<text>
${textPart?.text ?? ""}
</text>
`,
},
],
abort: new AbortController().signal,
sessionID: userMsg.sessionID,
system: [],
retries: 3,
})
const result = await stream.text
log.info("title", { title: result })
userMsg.summary.title = result
await Session.updateMessage(userMsg)
}
}
export const diff = fn(

View File

@@ -77,6 +77,12 @@ export const BatchTool = Tool.define("batch", async () => {
})
const result = await tool.execute(validatedParams, { ...ctx, callID: partID })
const attachments = result.attachments?.map((attachment) => ({
...attachment,
id: Identifier.ascending("part"),
sessionID: ctx.sessionID,
messageID: ctx.messageID,
}))
await Session.updatePart({
id: partID,
@@ -91,7 +97,7 @@ export const BatchTool = Tool.define("batch", async () => {
output: result.output,
title: result.title,
metadata: result.metadata,
attachments: result.attachments,
attachments,
time: {
start: callStartTime,
end: Date.now(),

View File

@@ -6,7 +6,6 @@ import { LSP } from "../lsp"
import { FileTime } from "../file/time"
import DESCRIPTION from "./read.txt"
import { Instance } from "../project/instance"
import { Identifier } from "../id/id"
import { assertExternalDirectory } from "./external-directory"
import { InstructionPrompt } from "../session/instruction"
@@ -127,9 +126,6 @@ export const ReadTool = Tool.define("read", {
},
attachments: [
{
id: Identifier.ascending("part"),
sessionID: ctx.sessionID,
messageID: ctx.messageID,
type: "file",
mime,
url: `data:${mime};base64,${Buffer.from(await file.bytes()).toString("base64")}`,

View File

@@ -94,10 +94,11 @@ export namespace ToolRegistry {
async function all(): Promise<Tool.Info[]> {
const custom = await state().then((x) => x.custom)
const config = await Config.get()
const question = ["app", "cli", "desktop"].includes(Flag.OPENCODE_CLIENT) || Flag.OPENCODE_ENABLE_QUESTION_TOOL
return [
InvalidTool,
...(["app", "cli", "desktop"].includes(Flag.OPENCODE_CLIENT) ? [QuestionTool] : []),
...(question ? [QuestionTool] : []),
BashTool,
ReadTool,
GlobTool,

View File

@@ -36,7 +36,7 @@ export namespace Tool {
title: string
metadata: M
output: string
attachments?: MessageV2.FilePart[]
attachments?: Omit<MessageV2.FilePart, "id" | "sessionID" | "messageID">[]
}>
formatValidationError?(error: z.ZodError): string
}>
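
To illustrate the changed contract (a sketch; the attachment payload and IDs below are made up): tools now return attachments without identity fields, and the caller assigns `id`, `sessionID`, and `messageID` when the part is persisted, e.g. via `Identifier.ascending("part")`.

```ts
// Shape a tool returns after this change: no id / sessionID / messageID yet.
const toolAttachment = {
  type: "file" as const,
  mime: "image/png",
  url: "data:image/png;base64,AAAA", // placeholder payload
}

// The caller (prompt.ts, batch tool) fills in identity when persisting the part.
function assignIdentity<T extends object>(
  attachment: T,
  ctx: { sessionID: string; messageID: string },
  nextId: () => string, // e.g. () => Identifier.ascending("part")
) {
  return { ...attachment, id: nextId(), sessionID: ctx.sessionID, messageID: ctx.messageID }
}

const stored = assignIdentity(toolAttachment, { sessionID: "ses_1", messageID: "msg_1" }, () => "prt_1")
console.log(stored.id, stored.sessionID, stored.messageID) // prt_1 ses_1 msg_1
```
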

View File

@@ -3,7 +3,6 @@ import { Tool } from "./tool"
import TurndownService from "turndown"
import DESCRIPTION from "./webfetch.txt"
import { abortAfterAny } from "../util/abort"
import { Identifier } from "../id/id"
const MAX_RESPONSE_SIZE = 5 * 1024 * 1024 // 5MB
const DEFAULT_TIMEOUT = 30 * 1000 // 30 seconds
@@ -103,9 +102,6 @@ export const WebFetchTool = Tool.define("webfetch", {
metadata: {},
attachments: [
{
id: Identifier.ascending("part"),
sessionID: ctx.sessionID,
messageID: ctx.messageID,
type: "file",
mime,
url: `data:${mime};base64,${base64Content}`,

View File

@@ -55,6 +55,28 @@ test("loads JSON config file", async () => {
})
})
test("ignores legacy tui keys in opencode config", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await writeConfig(dir, {
$schema: "https://opencode.ai/config.json",
model: "test/model",
theme: "legacy",
tui: { scroll_speed: 4 },
})
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await Config.get()
expect(config.model).toBe("test/model")
expect((config as Record<string, unknown>).theme).toBeUndefined()
expect((config as Record<string, unknown>).tui).toBeUndefined()
},
})
})
test("loads JSONC config file", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
@@ -109,14 +131,14 @@ test("merges multiple config files with correct precedence", async () => {
test("handles environment variable substitution", async () => {
const originalEnv = process.env["TEST_VAR"]
process.env["TEST_VAR"] = "test_theme"
process.env["TEST_VAR"] = "test-user"
try {
await using tmp = await tmpdir({
init: async (dir) => {
await writeConfig(dir, {
$schema: "https://opencode.ai/config.json",
theme: "{env:TEST_VAR}",
username: "{env:TEST_VAR}",
})
},
})
@@ -124,7 +146,7 @@ test("handles environment variable substitution", async () => {
directory: tmp.path,
fn: async () => {
const config = await Config.get()
expect(config.theme).toBe("test_theme")
expect(config.username).toBe("test-user")
},
})
} finally {
@@ -147,7 +169,7 @@ test("preserves env variables when adding $schema to config", async () => {
await Bun.write(
path.join(dir, "opencode.json"),
JSON.stringify({
theme: "{env:PRESERVE_VAR}",
username: "{env:PRESERVE_VAR}",
}),
)
},
@@ -156,7 +178,7 @@ test("preserves env variables when adding $schema to config", async () => {
directory: tmp.path,
fn: async () => {
const config = await Config.get()
expect(config.theme).toBe("secret_value")
expect(config.username).toBe("secret_value")
// Read the file to verify the env variable was preserved
const content = await Bun.file(path.join(tmp.path, "opencode.json")).text()
@@ -177,10 +199,10 @@ test("preserves env variables when adding $schema to config", async () => {
test("handles file inclusion substitution", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(dir, "included.txt"), "test_theme")
await Bun.write(path.join(dir, "included.txt"), "test-user")
await writeConfig(dir, {
$schema: "https://opencode.ai/config.json",
theme: "{file:included.txt}",
username: "{file:included.txt}",
})
},
})
@@ -188,7 +210,7 @@ test("handles file inclusion substitution", async () => {
directory: tmp.path,
fn: async () => {
const config = await Config.get()
expect(config.theme).toBe("test_theme")
expect(config.username).toBe("test-user")
},
})
})
@@ -199,7 +221,7 @@ test("handles file inclusion with replacement tokens", async () => {
await Bun.write(path.join(dir, "included.md"), "const out = await Bun.$`echo hi`")
await writeConfig(dir, {
$schema: "https://opencode.ai/config.json",
theme: "{file:included.md}",
username: "{file:included.md}",
})
},
})
@@ -207,7 +229,7 @@ test("handles file inclusion with replacement tokens", async () => {
directory: tmp.path,
fn: async () => {
const config = await Config.get()
expect(config.theme).toBe("const out = await Bun.$`echo hi`")
expect(config.username).toBe("const out = await Bun.$`echo hi`")
},
})
})
@@ -1042,7 +1064,6 @@ test("managed settings override project settings", async () => {
$schema: "https://opencode.ai/config.json",
autoupdate: true,
disabled_providers: [],
theme: "dark",
})
},
})
@@ -1059,7 +1080,6 @@ test("managed settings override project settings", async () => {
const config = await Config.get()
expect(config.autoupdate).toBe(false)
expect(config.disabled_providers).toEqual(["openai"])
expect(config.theme).toBe("dark")
},
})
})

View File

@@ -0,0 +1,439 @@
import { afterEach, expect, test } from "bun:test"
import path from "path"
import fs from "fs/promises"
import { tmpdir } from "../fixture/fixture"
import { Instance } from "../../src/project/instance"
import { TuiConfig } from "../../src/config/tui"
import { Global } from "../../src/global"
const managedConfigDir = process.env.OPENCODE_TEST_MANAGED_CONFIG_DIR!
afterEach(async () => {
delete process.env.OPENCODE_CONFIG
delete process.env.OPENCODE_TUI_CONFIG
await fs.rm(path.join(Global.Path.config, "tui.json"), { force: true }).catch(() => {})
await fs.rm(path.join(Global.Path.config, "tui.jsonc"), { force: true }).catch(() => {})
await fs.rm(managedConfigDir, { force: true, recursive: true }).catch(() => {})
})
test("loads tui config with the same precedence order as server config paths", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(Global.Path.config, "tui.json"), JSON.stringify({ theme: "global" }, null, 2))
await Bun.write(path.join(dir, "tui.json"), JSON.stringify({ theme: "project" }, null, 2))
await fs.mkdir(path.join(dir, ".opencode"), { recursive: true })
await Bun.write(
path.join(dir, ".opencode", "tui.json"),
JSON.stringify({ theme: "local", diff_style: "stacked" }, null, 2),
)
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("local")
expect(config.diff_style).toBe("stacked")
},
})
})
test("migrates tui-specific keys from opencode.json when tui.json does not exist", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(
path.join(dir, "opencode.json"),
JSON.stringify(
{
theme: "migrated-theme",
tui: { scroll_speed: 5 },
keybinds: { app_exit: "ctrl+q" },
},
null,
2,
),
)
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("migrated-theme")
expect(config.scroll_speed).toBe(5)
expect(config.keybinds?.app_exit).toBe("ctrl+q")
const text = await Bun.file(path.join(tmp.path, "tui.json")).text()
expect(JSON.parse(text)).toMatchObject({
theme: "migrated-theme",
scroll_speed: 5,
})
const server = JSON.parse(await Bun.file(path.join(tmp.path, "opencode.json")).text())
expect(server.theme).toBeUndefined()
expect(server.keybinds).toBeUndefined()
expect(server.tui).toBeUndefined()
expect(await Bun.file(path.join(tmp.path, "opencode.json.tui-migration.bak")).exists()).toBe(true)
expect(await Bun.file(path.join(tmp.path, "tui.json")).exists()).toBe(true)
},
})
})
test("migrates project legacy tui keys even when global tui.json already exists", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(Global.Path.config, "tui.json"), JSON.stringify({ theme: "global" }, null, 2))
await Bun.write(
path.join(dir, "opencode.json"),
JSON.stringify(
{
theme: "project-migrated",
tui: { scroll_speed: 2 },
},
null,
2,
),
)
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("project-migrated")
expect(config.scroll_speed).toBe(2)
expect(await Bun.file(path.join(tmp.path, "tui.json")).exists()).toBe(true)
const server = JSON.parse(await Bun.file(path.join(tmp.path, "opencode.json")).text())
expect(server.theme).toBeUndefined()
expect(server.tui).toBeUndefined()
},
})
})
test("drops unknown legacy tui keys during migration", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(
path.join(dir, "opencode.json"),
JSON.stringify(
{
theme: "migrated-theme",
tui: { scroll_speed: 2, foo: 1 },
},
null,
2,
),
)
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("migrated-theme")
expect(config.scroll_speed).toBe(2)
const text = await Bun.file(path.join(tmp.path, "tui.json")).text()
const migrated = JSON.parse(text)
expect(migrated.scroll_speed).toBe(2)
expect(migrated.foo).toBeUndefined()
},
})
})
test("skips migration when tui.json already exists", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(dir, "opencode.json"), JSON.stringify({ theme: "legacy" }, null, 2))
await Bun.write(path.join(dir, "tui.json"), JSON.stringify({ diff_style: "stacked" }, null, 2))
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.diff_style).toBe("stacked")
expect(config.theme).toBeUndefined()
const server = JSON.parse(await Bun.file(path.join(tmp.path, "opencode.json")).text())
expect(server.theme).toBe("legacy")
expect(await Bun.file(path.join(tmp.path, "opencode.json.tui-migration.bak")).exists()).toBe(false)
},
})
})
test("continues loading tui config when legacy source cannot be stripped", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(dir, "opencode.json"), JSON.stringify({ theme: "readonly-theme" }, null, 2))
},
})
const source = path.join(tmp.path, "opencode.json")
await fs.chmod(source, 0o444)
try {
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("readonly-theme")
expect(await Bun.file(path.join(tmp.path, "tui.json")).exists()).toBe(true)
const server = JSON.parse(await Bun.file(source).text())
expect(server.theme).toBe("readonly-theme")
},
})
} finally {
await fs.chmod(source, 0o644)
}
})
test("migration backup preserves JSONC comments", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(
path.join(dir, "opencode.jsonc"),
`{
// top-level comment
"theme": "jsonc-theme",
"tui": {
// nested comment
"scroll_speed": 1.5
}
}`,
)
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
await TuiConfig.get()
const backup = await Bun.file(path.join(tmp.path, "opencode.jsonc.tui-migration.bak")).text()
expect(backup).toContain("// top-level comment")
expect(backup).toContain("// nested comment")
expect(backup).toContain('"theme": "jsonc-theme"')
expect(backup).toContain('"scroll_speed": 1.5')
},
})
})
test("migrates legacy tui keys across multiple opencode.json levels", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
const nested = path.join(dir, "apps", "client")
await fs.mkdir(nested, { recursive: true })
await Bun.write(path.join(dir, "opencode.json"), JSON.stringify({ theme: "root-theme" }, null, 2))
await Bun.write(path.join(nested, "opencode.json"), JSON.stringify({ theme: "nested-theme" }, null, 2))
},
})
await Instance.provide({
directory: path.join(tmp.path, "apps", "client"),
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("nested-theme")
expect(await Bun.file(path.join(tmp.path, "tui.json")).exists()).toBe(true)
expect(await Bun.file(path.join(tmp.path, "apps", "client", "tui.json")).exists()).toBe(true)
},
})
})
test("flattens nested tui key inside tui.json", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(
path.join(dir, "tui.json"),
JSON.stringify({
theme: "outer",
tui: { scroll_speed: 3, diff_style: "stacked" },
}),
)
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.scroll_speed).toBe(3)
expect(config.diff_style).toBe("stacked")
// top-level keys take precedence over nested tui keys
expect(config.theme).toBe("outer")
},
})
})
test("top-level keys in tui.json take precedence over nested tui key", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(
path.join(dir, "tui.json"),
JSON.stringify({
diff_style: "auto",
tui: { diff_style: "stacked", scroll_speed: 2 },
}),
)
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.diff_style).toBe("auto")
expect(config.scroll_speed).toBe(2)
},
})
})
test("OPENCODE_TUI_CONFIG takes precedence over project config", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(dir, "tui.json"), JSON.stringify({ theme: "project", diff_style: "auto" }))
const custom = path.join(dir, "custom-tui.json")
await Bun.write(custom, JSON.stringify({ theme: "custom", diff_style: "stacked" }))
process.env.OPENCODE_TUI_CONFIG = custom
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
// project tui.json overrides the custom path (higher precedence)
expect(config.theme).toBe("project")
// the project config also sets diff_style, so its value wins too
expect(config.diff_style).toBe("auto")
},
})
})
test("OPENCODE_TUI_CONFIG provides settings when no project config exists", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
const custom = path.join(dir, "custom-tui.json")
await Bun.write(custom, JSON.stringify({ theme: "from-env", diff_style: "stacked" }))
process.env.OPENCODE_TUI_CONFIG = custom
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("from-env")
expect(config.diff_style).toBe("stacked")
},
})
})
test("does not derive tui path from OPENCODE_CONFIG", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
const customDir = path.join(dir, "custom")
await fs.mkdir(customDir, { recursive: true })
await Bun.write(path.join(customDir, "opencode.json"), JSON.stringify({ model: "test/model" }))
await Bun.write(path.join(customDir, "tui.json"), JSON.stringify({ theme: "should-not-load" }))
process.env.OPENCODE_CONFIG = path.join(customDir, "opencode.json")
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBeUndefined()
},
})
})
test("applies env and file substitutions in tui.json", async () => {
const original = process.env.TUI_THEME_TEST
process.env.TUI_THEME_TEST = "env-theme"
try {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(dir, "keybind.txt"), "ctrl+q")
await Bun.write(
path.join(dir, "tui.json"),
JSON.stringify({
theme: "{env:TUI_THEME_TEST}",
keybinds: { app_exit: "{file:keybind.txt}" },
}),
)
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("env-theme")
expect(config.keybinds?.app_exit).toBe("ctrl+q")
},
})
} finally {
if (original === undefined) delete process.env.TUI_THEME_TEST
else process.env.TUI_THEME_TEST = original
}
})
test("loads managed tui config and gives it highest precedence", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(dir, "tui.json"), JSON.stringify({ theme: "project-theme" }, null, 2))
await fs.mkdir(managedConfigDir, { recursive: true })
await Bun.write(path.join(managedConfigDir, "tui.json"), JSON.stringify({ theme: "managed-theme" }, null, 2))
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("managed-theme")
},
})
})
test("loads .opencode/tui.json", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await fs.mkdir(path.join(dir, ".opencode"), { recursive: true })
await Bun.write(path.join(dir, ".opencode", "tui.json"), JSON.stringify({ diff_style: "stacked" }, null, 2))
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.diff_style).toBe("stacked")
},
})
})
test("gracefully falls back when tui.json has invalid JSON", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(path.join(dir, "tui.json"), "{ invalid json }")
await fs.mkdir(managedConfigDir, { recursive: true })
await Bun.write(path.join(managedConfigDir, "tui.json"), JSON.stringify({ theme: "managed-fallback" }, null, 2))
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const config = await TuiConfig.get()
expect(config.theme).toBe("managed-fallback")
expect(config.keybinds).toBeDefined()
},
})
})

View File

@@ -2127,3 +2127,94 @@ test("custom model with variants enabled and disabled", async () => {
},
})
})
test("Google Vertex: retains baseURL for custom proxy", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(
path.join(dir, "opencode.json"),
JSON.stringify({
$schema: "https://opencode.ai/config.json",
provider: {
"vertex-proxy": {
name: "Vertex Proxy",
npm: "@ai-sdk/google-vertex",
api: "https://my-proxy.com/v1",
env: ["GOOGLE_APPLICATION_CREDENTIALS"], // Mock env var requirement
models: {
"gemini-pro": {
name: "Gemini Pro",
tool_call: true,
},
},
options: {
project: "test-project",
location: "us-central1",
baseURL: "https://my-proxy.com/v1", // Should be retained
},
},
},
}),
)
},
})
await Instance.provide({
directory: tmp.path,
init: async () => {
Env.set("GOOGLE_APPLICATION_CREDENTIALS", "test-creds")
},
fn: async () => {
const providers = await Provider.list()
expect(providers["vertex-proxy"]).toBeDefined()
expect(providers["vertex-proxy"].options.baseURL).toBe("https://my-proxy.com/v1")
},
})
})
test("Google Vertex: supports OpenAI compatible models", async () => {
await using tmp = await tmpdir({
init: async (dir) => {
await Bun.write(
path.join(dir, "opencode.json"),
JSON.stringify({
$schema: "https://opencode.ai/config.json",
provider: {
"vertex-openai": {
name: "Vertex OpenAI",
npm: "@ai-sdk/google-vertex",
env: ["GOOGLE_APPLICATION_CREDENTIALS"],
models: {
"gpt-4": {
name: "GPT-4",
provider: {
npm: "@ai-sdk/openai-compatible",
api: "https://api.openai.com/v1",
},
},
},
options: {
project: "test-project",
location: "us-central1",
},
},
},
}),
)
},
})
await Instance.provide({
directory: tmp.path,
init: async () => {
Env.set("GOOGLE_APPLICATION_CREDENTIALS", "test-creds")
},
fn: async () => {
const providers = await Provider.list()
const model = providers["vertex-openai"].models["gpt-4"]
expect(model).toBeDefined()
expect(model.api.npm).toBe("@ai-sdk/openai-compatible")
},
})
})

View File

@@ -2,6 +2,7 @@ import path from "path"
import { describe, expect, test } from "bun:test"
import { Instance } from "../../src/project/instance"
import { Session } from "../../src/session"
import { MessageV2 } from "../../src/session/message-v2"
import { SessionPrompt } from "../../src/session/prompt"
import { tmpdir } from "../fixture/fixture"
@@ -50,4 +51,54 @@ describe("session.prompt missing file", () => {
},
})
})
test("keeps stored part order stable when file resolution is async", async () => {
await using tmp = await tmpdir({
git: true,
config: {
agent: {
build: {
model: "openai/gpt-5.2",
},
},
},
})
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
const missing = path.join(tmp.path, "still-missing.ts")
const msg = await SessionPrompt.prompt({
sessionID: session.id,
agent: "build",
noReply: true,
parts: [
{
type: "file",
mime: "text/plain",
url: `file://${missing}`,
filename: "still-missing.ts",
},
{ type: "text", text: "after-file" },
],
})
if (msg.info.role !== "user") throw new Error("expected user message")
const stored = await MessageV2.get({
sessionID: session.id,
messageID: msg.info.id,
})
const text = stored.parts.filter((part) => part.type === "text").map((part) => part.text)
expect(text[0]?.startsWith("Called the Read tool with the following input:")).toBe(true)
expect(text[1]?.includes("Read tool failed to read")).toBe(true)
expect(text[2]).toBe("after-file")
await Session.remove(session.id)
},
})
})
})

View File

@@ -349,6 +349,9 @@ describe("tool.read truncation", () => {
expect(result.metadata.truncated).toBe(false)
expect(result.attachments).toBeDefined()
expect(result.attachments?.length).toBe(1)
expect(result.attachments?.[0]).not.toHaveProperty("id")
expect(result.attachments?.[0]).not.toHaveProperty("sessionID")
expect(result.attachments?.[0]).not.toHaveProperty("messageID")
},
})
})
@@ -363,6 +366,9 @@ describe("tool.read truncation", () => {
expect(result.attachments).toBeDefined()
expect(result.attachments?.length).toBe(1)
expect(result.attachments?.[0].type).toBe("file")
expect(result.attachments?.[0]).not.toHaveProperty("id")
expect(result.attachments?.[0]).not.toHaveProperty("sessionID")
expect(result.attachments?.[0]).not.toHaveProperty("messageID")
},
})
})

View File

@@ -46,6 +46,9 @@ describe("tool.webfetch", () => {
expect(result.attachments?.[0].type).toBe("file")
expect(result.attachments?.[0].mime).toBe("image/png")
expect(result.attachments?.[0].url.startsWith("data:image/png;base64,")).toBe(true)
expect(result.attachments?.[0]).not.toHaveProperty("id")
expect(result.attachments?.[0]).not.toHaveProperty("sessionID")
expect(result.attachments?.[0]).not.toHaveProperty("messageID")
},
})
},

View File

@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/plugin",
"version": "1.2.5",
"version": "1.2.6",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/sdk",
"version": "1.2.5",
"version": "1.2.6",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/slack",
"version": "1.2.5",
"version": "1.2.6",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/ui",
"version": "1.2.5",
"version": "1.2.6",
"type": "module",
"license": "MIT",
"exports": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/util",
"version": "1.2.5",
"version": "1.2.6",
"private": true,
"type": "module",
"license": "MIT",

View File

@@ -2,7 +2,7 @@
"name": "@opencode-ai/web",
"type": "module",
"license": "MIT",
"version": "1.2.5",
"version": "1.2.6",
"scripts": {
"dev": "astro dev",
"dev:remote": "VITE_API_URL=https://api.opencode.ai astro dev",

View File

@@ -29,6 +29,7 @@ description: يستخدم OpenCode مُنسِّقات خاصة بكل لغة.
| htmlbeautifier | .erb, .html.erb | يتوفر أمر `htmlbeautifier` |
| air | .R | يتوفر أمر `air` |
| dart | .dart | يتوفر أمر `dart` |
| dfmt | .d | يتوفر أمر `dfmt` |
| ocamlformat | .ml, .mli | يتوفر أمر `ocamlformat` وملف إعداد `.ocamlformat` |
| terraform | .tf, .tfvars | يتوفر أمر `terraform` |
| gleam | .gleam | يتوفر أمر `gleam` |

View File

@@ -27,6 +27,7 @@ OpenCode dolazi sa nekoliko ugrađenih formatera za popularne jezike i okvire. I
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` komanda dostupna |
| air | .R | `air` komanda dostupna |
| dart | .dart | `dart` komanda dostupna |
| dfmt | .d | `dfmt` komanda dostupna |
| ocamlformat | .ml, .mli | `ocamlformat` komanda dostupna i `.ocamlformat` konfiguracioni fajl |
| terraform | .tf, .tfvars | `terraform` komanda dostupna |
| gleam | .gleam | `gleam` komanda dostupna |

View File

@@ -558,6 +558,7 @@ OpenCode can be configured using environment variables.
| `OPENCODE_AUTO_SHARE` | boolean | Automatically share sessions |
| `OPENCODE_GIT_BASH_PATH` | string | Path to Git Bash executable on Windows |
| `OPENCODE_CONFIG` | string | Path to config file |
| `OPENCODE_TUI_CONFIG` | string | Path to TUI config file |
| `OPENCODE_CONFIG_DIR` | string | Path to config directory |
| `OPENCODE_CONFIG_CONTENT` | string | Inline json config content |
| `OPENCODE_DISABLE_AUTOUPDATE` | boolean | Disable automatic update checks |

View File

@@ -14,10 +14,11 @@ OpenCode supports both **JSON** and **JSONC** (JSON with Comments) formats.
```jsonc title="opencode.jsonc"
{
"$schema": "https://opencode.ai/config.json",
// Theme configuration
"theme": "opencode",
"model": "anthropic/claude-sonnet-4-5",
"autoupdate": true,
"server": {
"port": 4096,
},
}
```
@@ -34,7 +35,7 @@ Configuration files are **merged together**, not replaced.
Configuration files are merged together, not replaced. Settings from the following config locations are combined. Later configs override earlier ones only for conflicting keys. Non-conflicting settings from all configs are preserved.
For example, if your global config sets `theme: "opencode"` and `autoupdate: true`, and your project config sets `model: "anthropic/claude-sonnet-4-5"`, the final configuration will include all three settings.
For example, if your global config sets `autoupdate: true` and your project config sets `model: "anthropic/claude-sonnet-4-5"`, the final configuration will include both settings.
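
A sketch of the merge semantics using the same deep-merge the loader applies (`mergeDeep` from remeda); the config values are illustrative:

```ts
import { mergeDeep } from "remeda"

// Illustrative global and project configs; the later (project) value wins on conflicts.
const globalConfig = { autoupdate: true, model: "anthropic/claude-sonnet-4-5" }
const projectConfig = { model: "openai/gpt-5.2", server: { port: 4096 } }

console.log(mergeDeep(globalConfig, projectConfig))
// { autoupdate: true, model: "openai/gpt-5.2", server: { port: 4096 } }
```
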
---
@@ -95,7 +96,9 @@ You can enable specific servers in your local config:
### Global
Place your global OpenCode config in `~/.config/opencode/opencode.json`. Use global config for user-wide preferences like themes, providers, or keybinds.
Place your global OpenCode config in `~/.config/opencode/opencode.json`. Use global config for user-wide server/runtime preferences like providers, models, and permissions.
For TUI-specific settings, use `~/.config/opencode/tui.json`.
Global config overrides remote organizational defaults.
@@ -105,6 +108,8 @@ Global config overrides remote organizational defaults.
Add `opencode.json` in your project root. Project config has the highest precedence among standard config files - it overrides both global and remote configs.
For project-specific TUI settings, add `tui.json` alongside it.
:::tip
Place project-specific config in the root of your project.
:::
@@ -146,7 +151,9 @@ The custom directory is loaded after the global config and `.opencode` directori
## Schema
The config file has a schema that's defined in [**`opencode.ai/config.json`**](https://opencode.ai/config.json).
The server/runtime config schema is defined in [**`opencode.ai/config.json`**](https://opencode.ai/config.json).
TUI config uses [**`opencode.ai/tui.json`**](https://opencode.ai/tui.json).
Your editor should be able to validate and autocomplete based on the schema.
@@ -154,28 +161,24 @@ Your editor should be able to validate and autocomplete based on the schema.
### TUI
You can configure TUI-specific settings through the `tui` option.
Use a dedicated `tui.json` (or `tui.jsonc`) file for TUI-specific settings.
```json title="opencode.json"
```json title="tui.json"
{
"$schema": "https://opencode.ai/config.json",
"tui": {
"scroll_speed": 3,
"scroll_acceleration": {
"enabled": true
},
"diff_style": "auto"
}
"$schema": "https://opencode.ai/tui.json",
"scroll_speed": 3,
"scroll_acceleration": {
"enabled": true
},
"diff_style": "auto"
}
```
Available options:
Use `OPENCODE_TUI_CONFIG` to point to a custom TUI config file.
- `scroll_acceleration.enabled` - Enable macOS-style scroll acceleration. **Takes precedence over `scroll_speed`.**
- `scroll_speed` - Custom scroll speed multiplier (default: `3`, minimum: `1`). Ignored if `scroll_acceleration.enabled` is `true`.
- `diff_style` - Control diff rendering. `"auto"` adapts to terminal width, `"stacked"` always shows single column.
Legacy `theme`, `keybinds`, and `tui` keys in `opencode.json` are deprecated and automatically migrated when possible.
[Learn more about using the TUI here](/docs/tui).
[Learn more about TUI configuration here](/docs/tui#configure).
---
@@ -301,12 +304,12 @@ Bearer tokens (`AWS_BEARER_TOKEN_BEDROCK` or `/connect`) take precedence over pr
### Themes
You can configure the theme you want to use in your OpenCode config through the `theme` option.
Set your UI theme in `tui.json`.
```json title="opencode.json"
```json title="tui.json"
{
"$schema": "https://opencode.ai/config.json",
"theme": ""
"$schema": "https://opencode.ai/tui.json",
"theme": "tokyonight"
}
```
@@ -406,11 +409,11 @@ You can also define commands using markdown files in `~/.config/opencode/command
### Keybinds
You can customize your keybinds through the `keybinds` option.
Customize keybinds in `tui.json`.
```json title="opencode.json"
```json title="tui.json"
{
"$schema": "https://opencode.ai/config.json",
"$schema": "https://opencode.ai/tui.json",
"keybinds": {}
}
```

View File

@@ -29,6 +29,7 @@ OpenCode leveres med flere indbyggede formatere til populære sprog og rammer. N
| htmlbeautifier | .erb,.html.erb | `htmlbeautifier` kommando tilgængelig |
| air | .R | `air` kommando tilgængelig |
| dart | .dart | `dart` kommando tilgængelig |
| dfmt | .d | `dfmt` kommando tilgængelig |
| ocamlformat | .ml,.mli | `ocamlformat` kommando tilgængelig og `.ocamlformat` config fil |
| terraform | .tf,.tfvars | `terraform` kommando tilgængelig |
| gleam | .gleam | `gleam` kommando tilgængelig |

View File

@@ -29,6 +29,7 @@ OpenCode verfügt über mehrere integrierte Formatierer für gängige Sprachen u
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier`-Befehl verfügbar |
| air | .R | `air`-Befehl verfügbar |
| dart | .dart | `dart`-Befehl verfügbar |
| dfmt | .d | `dfmt`-Befehl verfügbar |
| ocamlformat | .ml, .mli | `ocamlformat` Befehl verfügbar und `.ocamlformat` Konfigurationsdatei |
| terraform | .tf, .tfvars | `terraform`-Befehl verfügbar |
| gleam | .gleam | `gleam`-Befehl verfügbar |

View File

@@ -29,6 +29,7 @@ OpenCode viene con varios formateadores integrados para lenguajes y marcos popul
| htmlbeautifier | .erb, .html.erb | Comando `htmlbeautifier` disponible |
| air | .R | Comando `air` disponible |
| dart | .dart | Comando `dart` disponible |
| dfmt | .d | Comando `dfmt` disponible |
| ocamlformat | .ml, .mli | Comando `ocamlformat` disponible y archivo de configuración `.ocamlformat` |
| terraform | .tf, .tfvars | Comando `terraform` disponible |
| gleam | .gleam | Comando `gleam` disponible |

View File

@@ -13,30 +13,32 @@ OpenCode comes with several built-in formatters for popular languages and framew
| Formatter | Extensions | Requirements |
| -------------------- | -------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------- |
| gofmt | .go | `gofmt` command available |
| mix | .ex, .exs, .eex, .heex, .leex, .neex, .sface | `mix` command available |
| prettier | .js, .jsx, .ts, .tsx, .html, .css, .md, .json, .yaml, and [more](https://prettier.io/docs/en/index.html) | `prettier` dependency in `package.json` |
| air | .R | `air` command available |
| biome | .js, .jsx, .ts, .tsx, .html, .css, .md, .json, .yaml, and [more](https://biomejs.dev/) | `biome.json(c)` config file |
| zig | .zig, .zon | `zig` command available |
| cargofmt | .rs | `cargo fmt` command available |
| clang-format | .c, .cpp, .h, .hpp, .ino, and [more](https://clang.llvm.org/docs/ClangFormat.html) | `.clang-format` config file |
| cljfmt | .clj, .cljs, .cljc, .edn | `cljfmt` command available |
| dart | .dart | `dart` command available |
| dfmt | .d | `dfmt` command available |
| gleam | .gleam | `gleam` command available |
| gofmt | .go | `gofmt` command available |
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` command available |
| ktlint | .kt, .kts | `ktlint` command available |
| mix | .ex, .exs, .eex, .heex, .leex, .neex, .sface | `mix` command available |
| nixfmt | .nix | `nixfmt` command available |
| ocamlformat | .ml, .mli | `ocamlformat` command available and `.ocamlformat` config file |
| ormolu | .hs | `ormolu` command available |
| oxfmt (Experimental) | .js, .jsx, .ts, .tsx | `oxfmt` dependency in `package.json` and an [experimental env variable flag](/docs/cli/#experimental) |
| pint | .php | `laravel/pint` dependency in `composer.json` |
| prettier | .js, .jsx, .ts, .tsx, .html, .css, .md, .json, .yaml, and [more](https://prettier.io/docs/en/index.html) | `prettier` dependency in `package.json` |
| rubocop | .rb, .rake, .gemspec, .ru | `rubocop` command available |
| ruff | .py, .pyi | `ruff` command available with config |
| rustfmt | .rs | `rustfmt` command available |
| cargofmt | .rs | `cargo fmt` command available |
| uv | .py, .pyi | `uv` command available |
| rubocop | .rb, .rake, .gemspec, .ru | `rubocop` command available |
| standardrb | .rb, .rake, .gemspec, .ru | `standardrb` command available |
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` command available |
| air | .R | `air` command available |
| dart | .dart | `dart` command available |
| ocamlformat | .ml, .mli | `ocamlformat` command available and `.ocamlformat` config file |
| terraform | .tf, .tfvars | `terraform` command available |
| gleam | .gleam | `gleam` command available |
| nixfmt | .nix | `nixfmt` command available |
| shfmt | .sh, .bash | `shfmt` command available |
| pint | .php | `laravel/pint` dependency in `composer.json` |
| oxfmt (Experimental) | .js, .jsx, .ts, .tsx | `oxfmt` dependency in `package.json` and an [experimental env variable flag](/docs/cli/#experimental) |
| ormolu | .hs | `ormolu` command available |
| standardrb | .rb, .rake, .gemspec, .ru | `standardrb` command available |
| terraform | .tf, .tfvars | `terraform` command available |
| uv | .py, .pyi | `uv` command available |
| zig | .zig, .zon | `zig` command available |
So if your project has `prettier` in your `package.json`, OpenCode will automatically use it.
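For example, a minimal `package.json` along these lines is enough for the prettier formatter to be picked up. This is only a sketch: the package name and version are illustrative, and it assumes a `devDependencies` entry counts as having `prettier` in your `package.json`.
```json title="package.json"
{
  "name": "my-app",
  "private": true,
  "devDependencies": {
    "prettier": "^3.3.0"
  }
}
```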

View File

@@ -29,6 +29,7 @@ OpenCode est livré avec plusieurs formateurs intégrés pour les langages et fr
| htmlbeautifier | .erb, .html.erb | Commande `htmlbeautifier` disponible |
| air | .R | Commande `air` disponible |
| dart | .dart | Commande `dart` disponible |
| dfmt | .d | Commande `dfmt` disponible |
| ocamlformat | .ml, .mli | Commande `ocamlformat` disponible et fichier de configuration `.ocamlformat` |
| terraform | .tf, .tfvars | Commande `terraform` disponible |
| gleam | .gleam | Commande `gleam` disponible |

View File

@@ -29,6 +29,7 @@ OpenCode include diversi formattatori integrati per linguaggi e framework popola
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` command available |
| air | .R | `air` command available |
| dart | .dart | `dart` command available |
| dfmt | .d | `dfmt` command available |
| ocamlformat | .ml, .mli | `ocamlformat` command available and `.ocamlformat` config file |
| terraform | .tf, .tfvars | `terraform` command available |
| gleam | .gleam | `gleam` command available |

View File

@@ -29,6 +29,7 @@ OpenCode には、一般的な言語およびフレームワーク用のいく
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` command available |
| air | .R | `air` command available |
| dart | .dart | `dart` command available |
| dfmt | .d | `dfmt` command available |
| ocamlformat | .ml, .mli | `ocamlformat` command available and `.ocamlformat` config file |
| terraform | .tf, .tfvars | `terraform` command available |
| gleam | .gleam | `gleam` command available |

View File

@@ -3,11 +3,11 @@ title: Keybinds
description: Customize your keybinds.
---
OpenCode has a list of keybinds that you can customize through the OpenCode config.
OpenCode has a list of keybinds that you can customize through `tui.json`.
```json title="opencode.json"
```json title="tui.json"
{
"$schema": "https://opencode.ai/config.json",
"$schema": "https://opencode.ai/tui.json",
"keybinds": {
"leader": "ctrl+x",
"app_exit": "ctrl+c,ctrl+d,<leader>q",
@@ -117,11 +117,11 @@ You don't need to use a leader key for your keybinds but we recommend doing so.
## Disable keybind
You can disable a keybind by adding the key to your config with a value of "none".
You can disable a keybind by adding the key to `tui.json` with a value of "none".
```json title="opencode.json"
```json title="tui.json"
{
"$schema": "https://opencode.ai/config.json",
"$schema": "https://opencode.ai/tui.json",
"keybinds": {
"session_compact": "none"
}
}
```
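Putting the snippets above together, a complete `tui.json` might look like the following sketch; it uses only keys and values that appear in the examples on this page.
```json title="tui.json"
{
  "$schema": "https://opencode.ai/tui.json",
  "keybinds": {
    "leader": "ctrl+x",
    "app_exit": "ctrl+c,ctrl+d,<leader>q",
    "session_compact": "none"
  }
}
```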

View File

@@ -28,6 +28,7 @@ opencode는 인기있는 언어 및 프레임 워크에 대한 몇 가지 내장
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` 명령 사용 가능 |
| Air | .R | `air` 명령 사용 가능 |
| Dart | .dart | `dart` 명령 사용 가능 |
| dfmt | .d | `dfmt` 명령 사용 가능 |
| ocamlformat | .ml, .mli | `ocamlformat` 명령 사용 가능·`.ocamlformat` 설정 파일 |
| Terraform | .tf, .tfvars | `terraform` 명령 사용 가능 |
| gleam | .gleam | `gleam` 명령 사용 가능 |

View File

@@ -29,6 +29,7 @@ OpenCode kommer med flere innebygde formattere for populære språk og rammeverk
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` kommando tilgjengelig |
| air | .R | `air` kommando tilgjengelig |
| dart | .dart | `dart` kommando tilgjengelig |
| dfmt | .d | `dfmt` kommando tilgjengelig |
| ocamlformat | .ml, .mli | `ocamlformat` kommando tilgjengelig og `.ocamlformat` konfigurasjonsfil |
| terraform | .tf, .tfvars | `terraform` kommando tilgjengelig |
| gleam | .gleam | `gleam` kommando tilgjengelig |

View File

@@ -29,6 +29,7 @@ OpenCode zawiera kilka wbudowanych formaterów dla popularnych języków i frame
| htmlbeautifier | .erb, .html.erb | Dostępne polecenie `htmlbeautifier` |
| air | .R | Dostępne polecenie `air` |
| dart | .dart | Dostępne polecenie `dart` |
| dfmt | .d | Dostępne polecenie `dfmt` |
| ocamlformat | .ml, .mli | Dostępne polecenie `ocamlformat` i plik konfiguracyjny `.ocamlformat` |
| terraform | .tf, .tfvars | Dostępne polecenie `terraform` |
| gleam | .gleam | Dostępne polecenie `gleam` |

View File

@@ -29,6 +29,7 @@ O opencode vem com vários formatadores integrados para linguagens e frameworks
| htmlbeautifier | .erb, .html.erb | Comando `htmlbeautifier` disponível |
| air | .R | Comando `air` disponível |
| dart | .dart | Comando `dart` disponível |
| dfmt | .d | Comando `dfmt` disponível |
| ocamlformat | .ml, .mli | Comando `ocamlformat` disponível e arquivo de configuração `.ocamlformat` |
| terraform | .tf, .tfvars | Comando `terraform` disponível |
| gleam | .gleam | Comando `gleam` disponível |

View File

@@ -29,6 +29,7 @@ opencode поставляется с несколькими встроенным
| htmlbeautifier | .erb, .html.erb | Доступна команда `htmlbeautifier` |
| air | .R | Доступна команда `air` |
| dart | .dart | Доступна команда `dart` |
| dfmt | .d | Доступна команда `dfmt` |
| ocamlformat | .ml, .mli | Доступна команда `ocamlformat` и файл конфигурации `.ocamlformat`. |
| terraform | .tf, .tfvars | Доступна команда `terraform` |
| gleam | .gleam | Доступна команда `gleam` |

View File

@@ -1,29 +1,29 @@
---
title: Zen
description: Кураторский список моделей, предоставленный opencode.
description: Подобранный список моделей, предоставленный OpenCode.
---
import config from "../../../../config.mjs"
export const console = config.console
export const email = `mailto:${config.email}`
OpenCode Zen — это список протестированных и проверенных моделей, предоставленный командой opencode.
OpenCode Zen — это список протестированных и проверенных моделей, предоставленный командой OpenCode.
:::note
OpenCode Zen в настоящее время находится в стадии бета-тестирования.
:::
Zen работает как любой другой провайдер в opencode. Вы входите в OpenCode Zen и получаете
Zen работает как любой другой провайдер в OpenCode. Вы входите в OpenCode Zen и получаете
ваш ключ API. Это **совершенно необязательно**, и вам не обязательно использовать его для использования
Открытый код.
OpenCode.
---
## Предыстория
Существует большое количество моделей, но лишь некоторые из них.
эти модели хорошо работают в качестве агентов кодирования. Кроме того, большинство провайдеров
настроен совсем по-другому; так что вы получите совсем другую производительность и качество.
Существует большое количество моделей, но лишь некоторые из них
хорошо работают в качестве кодинг-агентов. Кроме того, большинство провайдеров
настроены совсем по-другому; так что вы получите совсем другую производительность и качество.
:::tip
Мы протестировали избранную группу моделей и поставщиков, которые хорошо работают с opencode.
@@ -36,10 +36,9 @@ Zen работает как любой другой провайдер в openco
1. Мы протестировали избранную группу моделей и поговорили с их командами о том, как
лучше всего запустить их.
2. Затем мы поработали с несколькими поставщиками услуг, чтобы убедиться, что они обслуживаются.
правильно.
3. Наконец, мы сравнили комбинацию модель/провайдер и пришли к выводу, что
со списком, который мы с удовольствием рекомендуем.
2. Затем мы поработали с несколькими поставщиками услуг, чтобы убедиться, что они обслуживаются правильно.
3. Наконец, мы сравнили комбинацию модель/провайдер и составили
список, который мы с удовольствием рекомендуем.
OpenCode Zen — это шлюз искусственного интеллекта, который дает вам доступ к этим моделям.
@@ -47,10 +46,10 @@ OpenCode Zen — это шлюз искусственного интеллект
## Как это работает
OpenCode Zen работает так же, как и любой другой поставщик opencode.
OpenCode Zen работает так же, как и любой другой поставщик OpenCode.
1. Вы входите в систему **<a href={console}>OpenCode Zen</a>**, добавляете свой платежный аккаунт.
подробности и скопируйте свой ключ API.
1. Вы входите в систему **<a href={console}>OpenCode Zen</a>**, добавляете платежные
данные и копируете свой ключ API.
2. Вы запускаете команду `/connect` в TUI, выбираете OpenCode Zen и вставляете свой ключ API.
3. Запустите `/models` в TUI, чтобы просмотреть список рекомендуемых нами моделей.
@@ -82,8 +81,10 @@ OpenCode Zen работает так же, как и любой другой п
| Claude Opus 4.1 | claude-opus-4-1 | `https://opencode.ai/zen/v1/messages` | `@ai-sdk/anthropic` |
| Gemini 3 Pro | gemini-3-pro | `https://opencode.ai/zen/v1/models/gemini-3-pro` | `@ai-sdk/google` |
| Gemini 3 Flash | gemini-3-flash | `https://opencode.ai/zen/v1/models/gemini-3-flash` | `@ai-sdk/google` |
| MiniMax M2.5 | minimax-m2.5 | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` |
| MiniMax M2.5 Free | minimax-m2.5-free | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` |
| MiniMax M2.1 | minimax-m2.1 | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` |
| MiniMax M2.1 Free | minimax-m2.1-free | `https://opencode.ai/zen/v1/messages` | `@ai-sdk/anthropic` |
| GLM 5 | glm-5 | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` |
| GLM 4.7 | glm-4.7 | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` |
| GLM 4.7 Free | glm-4.7-free | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` |
| GLM 4.6 | glm-4.6 | `https://opencode.ai/zen/v1/chat/completions` | `@ai-sdk/openai-compatible` |
@@ -117,11 +118,14 @@ https://opencode.ai/zen/v1/models
| Модель | Вход | Выход | Кэшированное чтение | Кэшированная запись |
| -------------------------------------- | --------- | --------- | ------------------- | ------------------- |
| Big Pickle | Бесплатно | Бесплатно | Бесплатно | - |
| MiniMax M2.1 Free | Бесплатно | Бесплатно | Бесплатно | - |
| MiniMax M2.5 Free | Бесплатно | Бесплатно | Бесплатно | - |
| MiniMax M2.5 | $0.30 | $1.20 | $0.06 | - |
| MiniMax M2.1 | $0.30 | $1.20 | $0.10 | - |
| GLM 4.7 Free | Бесплатно | Бесплатно | Бесплатно | - |
| GLM 5 | $1.00 | $3.20 | $0.20 | - |
| GLM 4.7 | $0.60 | $2.20 | $0.10 | - |
| GLM 4.6 | $0.60 | $2.20 | $0.10 | - |
| GLM 4.7 Free | Бесплатно | Бесплатно | Бесплатно | - |
| Kimi K2.5 Free | Бесплатно | Бесплатно | Бесплатно | - |
| Kimi K2.5 | $0.60 | $3.00 | $0.08 | - |
| Kimi K2 Thinking | $0.40 | $2.50 | - | - |
@@ -158,10 +162,9 @@ https://opencode.ai/zen/v1/models
Бесплатные модели:
- GLM 4.7 Free доступен на opencode в течение ограниченного времени. Команда использует это время для сбора отзывов и улучшения модели.
- Kimi K2.5 Free доступен на opencode в течение ограниченного времени. Команда использует это время для сбора отзывов и улучшения модели.
- MiniMax M2.1 Free доступен на opencode в течение ограниченного времени. Команда использует это время для сбора отзывов и улучшения модели.
- Big Pickle — это стелс-модель, которая доступна бесплатно на opencode в течение ограниченного времени. Команда использует это время для сбора отзывов и улучшения модели.
- Kimi K2.5 Free доступен на OpenCode в течение ограниченного времени. Команда использует это время для сбора отзывов и улучшения модели.
- MiniMax M2.5 Free доступен на OpenCode в течение ограниченного времени. Команда использует это время для сбора отзывов и улучшения модели.
- Big Pickle — это стелс-модель, которая доступна бесплатно на OpenCode в течение ограниченного времени. Команда использует это время для сбора отзывов и улучшения модели.
<a href={email}>Свяжитесь с нами</a>, если у вас есть вопросы.
@@ -171,7 +174,7 @@ https://opencode.ai/zen/v1/models
Если ваш баланс упадет ниже 5 долларов, Zen автоматически пополнит 20 долларов.
Вы можете изменить сумму автопополнения. Вы также можете полностью отключить автоматическую перезагрузку.
Вы можете изменить сумму автопополнения. Вы также можете полностью отключить автопополнение.
---
@@ -181,7 +184,7 @@ https://opencode.ai/zen/v1/models
член вашей команды.
Например, предположим, что вы установили ежемесячный лимит использования в размере 20 долларов США, Zen не будет использовать
более 20 долларов в месяц. Но если у вас включена автоматическая перезагрузка, Дзен может оказаться
более 20 долларов в месяц. Но если у вас включено автопополнение, Zen может
взимать с вас более 20 долларов США, если ваш баланс опускается ниже 5 долларов США.
---
@@ -191,9 +194,8 @@ https://opencode.ai/zen/v1/models
Все наши модели размещены в США. Наши поставщики придерживаются политики нулевого хранения и не используют ваши данные для обучения моделей, за следующими исключениями:
- Big Pickle: во время бесплатного периода собранные данные могут быть использованы для улучшения модели.
- GLM 4.7 Free: в течение бесплатного периода собранные данные могут использоваться для улучшения модели.
- Kimi K2.5 Free: в течение бесплатного периода собранные данные могут использоваться для улучшения модели.
- MiniMax M2.1 Free: в течение бесплатного периода собранные данные могут использоваться для улучшения модели.
- MiniMax M2.5 Free: в течение бесплатного периода собранные данные могут использоваться для улучшения модели.
- API OpenAI: запросы хранятся в течение 30 дней в соответствии с [Политикой данных OpenAI](https://platform.openai.com/docs/guides/your-data).
- API-интерфейсы Anthropic: запросы хранятся в течение 30 дней в соответствии с [Политикой данных Anthropic](https://docs.anthropic.com/en/docs/claude-code/data-usage).
@@ -201,15 +203,15 @@ https://opencode.ai/zen/v1/models
## Для команд
Дзен также отлично подходит для команд. Вы можете приглашать товарищей по команде, назначать роли, курировать
Zen также отлично подходит для команд. Вы можете приглашать товарищей по команде, назначать роли, выбирать
модели, которые использует ваша команда, и многое другое.
:::note
Рабочие пространства в настоящее время бесплатны для команд в рамках бета-тестирования.
:::
Управление вашим рабочим пространством в настоящее время бесплатно для команд в рамках бета-тестирования. Мы будем
скоро поделимся более подробной информацией о ценах.
Управление вашим рабочим пространством в настоящее время бесплатно для команд в рамках бета-тестирования. Мы вскоре
поделимся более подробной информацией о ценах.
---
@@ -233,7 +235,7 @@ https://opencode.ai/zen/v1/models
---
### Принесите свой ключ
### Использование собственных API-ключей
Вы можете использовать свои собственные ключи API OpenAI или Anthropic, сохраняя при этом доступ к другим моделям в Zen.
@@ -248,7 +250,7 @@ https://opencode.ai/zen/v1/models
Мы создали OpenCode Zen, чтобы:
1. **Сравните** лучшие модели/поставщики агентов кодирования.
2. Получите доступ к вариантам **самого высокого качества**, не снижая производительность и не обращаясь к более дешевым поставщикам.
3. Не допускайте **падения цен**, продавая по себестоимости; поэтому единственная надбавка предназначена для покрытия наших сборов за обработку.
4. Не допускайте **привязки**, позволяя использовать его с любым другим агентом кодирования. И всегда позволяйте вам использовать любого другого провайдера с opencode.
1. **Сравнить** лучшие модели/поставщики кодинг-агентов.
2. Получить доступ к вариантам **наивысшего качества**, не снижая производительность и не обращаясь к более дешевым поставщикам.
3. Передавать **снижение цен**, продавая по себестоимости; поэтому единственная наценка предназначена для покрытия наших комиссий за обработку.
4. Исключить **привязку**, позволяя использовать его с любым другим кодинг-агентом. И всегда позволяя вам использовать любого другого провайдера с OpenCode.

View File

@@ -29,6 +29,7 @@ OpenCode มาพร้อมกับฟอร์แมตเตอร์ใ
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` คำสั่งใช้ได้ |
| air | .r | `air` คำสั่งใช้ได้ |
| dart | .dart | `dart` คำสั่งใช้ได้ |
| dfmt | .d | `dfmt` คำสั่งใช้ได้ |
| ocamlformat | .ml, .mli | มีคำสั่ง `ocamlformat` และไฟล์ปรับแต่ง `.ocamlformat` |
| terraform | .tf, .tfvars | `terraform` คำสั่งใช้ได้ |
| gleam | .gleam | `gleam` คำสั่งใช้ได้ |

View File

@@ -61,11 +61,11 @@ The system theme is for users who:
## Using a theme
You can select a theme by bringing up the theme select with the `/theme` command. Or you can specify it in your [config](/docs/config).
You can select a theme by bringing up the theme select with the `/theme` command. Or you can specify it in `tui.json`.
```json title="opencode.json" {3}
```json title="tui.json" {3}
{
"$schema": "https://opencode.ai/config.json",
"$schema": "https://opencode.ai/tui.json",
"theme": "tokyonight"
}
```

View File

@@ -29,6 +29,7 @@ opencode, popüler diller ve çerçeveler için çeşitli yerleşik biçimlendir
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` komutu mevcut |
| air | .R | `air` komutu mevcut |
| dart | .dart | `dart` komutu mevcut |
| dfmt | .d | `dfmt` komutu mevcut |
| ocamlformat | .ml, .mli | `ocamlformat` komutu mevcut ve `.ocamlformat` yapılandırma dosyası |
| terraform | .tf, .tfvars | `terraform` komutu mevcut |
| gleam | .gleam | `gleam` komutu mevcut |

View File

@@ -355,24 +355,34 @@ Some editors need command-line arguments to run in blocking mode. The `--wait` f
## Configure
You can customize TUI behavior through your OpenCode config file.
You can customize TUI behavior through `tui.json` (or `tui.jsonc`).
```json title="opencode.json"
```json title="tui.json"
{
"$schema": "https://opencode.ai/config.json",
"tui": {
"scroll_speed": 3,
"scroll_acceleration": {
"enabled": true
}
}
"$schema": "https://opencode.ai/tui.json",
"theme": "opencode",
"keybinds": {
"leader": "ctrl+x"
},
"scroll_speed": 3,
"scroll_acceleration": {
"enabled": true
},
"diff_style": "auto"
}
```
This is separate from `opencode.json`, which configures server/runtime behavior.
### Options
- `scroll_acceleration` - Enable macOS-style scroll acceleration for smooth, natural scrolling. When enabled, scroll speed increases with rapid scrolling gestures and stays precise for slower movements. **This setting takes precedence over `scroll_speed` and overrides it when enabled.**
- `theme` - Sets your UI theme. [Learn more](/docs/themes).
- `keybinds` - Customizes keyboard shortcuts. [Learn more](/docs/keybinds).
- `scroll_acceleration.enabled` - Enable macOS-style scroll acceleration for smooth, natural scrolling. When enabled, scroll speed increases with rapid scrolling gestures and stays precise for slower movements. **This setting takes precedence over `scroll_speed` and overrides it when enabled.**
- `scroll_speed` - Controls how fast the TUI scrolls when using scroll commands (minimum: `1`). Defaults to `3`. **Note: This is ignored if `scroll_acceleration.enabled` is set to `true`.**
- `diff_style` - Controls diff rendering. `"auto"` adapts to terminal width, `"stacked"` always shows a single-column layout.
Use `OPENCODE_TUI_CONFIG` to load a custom TUI config path.
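For instance, a one-off run could point the TUI at an alternate file like this; the path shown is illustrative, not a required location.
```bash
# Launch the TUI with a custom config file for this run only.
OPENCODE_TUI_CONFIG="$HOME/.config/opencode/tui.jsonc" opencode
```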
---

View File

@@ -29,6 +29,7 @@ opencode 附带了多个适用于流行语言和框架的内置格式化程序
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` command available |
| air | .R | `air` command available |
| dart | .dart | `dart` command available |
| dfmt | .d | `dfmt` command available |
| ocamlformat | .ml, .mli | `ocamlformat` command available and `.ocamlformat` config file |
| terraform | .tf, .tfvars | `terraform` command available |
| gleam | .gleam | `gleam` command available |

View File

@@ -116,39 +116,39 @@ https://opencode.ai/zen/v1/models
| 模型 | 输入 | 输出 | 缓存读取 | 缓存写入 |
| ---------------------------------- | ---------- | ---------- | ---------- | ---------- |
| 大泡菜 | 免费 | 免费 | 免费 | - |
| MiniMax M2.1 免费 | 免费 | 免费 | 免费 | - |
| 迷你最大M2.1 | 0.30 美元 | 1.20 美元 | 0.10 美元 | - |
| GLM 4.7 免费 | 免费 | 免费 | 免费 | - |
| Big Pickle | 免费 | 免费 | 免费 | - |
| MiniMax M2.1 Free | 免费 | 免费 | 免费 | - |
| MiniMax M2.1 | 0.30 美元 | 1.20 美元 | 0.10 美元 | - |
| GLM 4.7 Free | 免费 | 免费 | 免费 | - |
| GLM 4.7 | 0.60 美元 | 2.20 美元 | 0.10 美元 | - |
| GLM 4.6 | 0.60 美元 | 2.20 美元 | 0.10 美元 | - |
| Kimi K2.5 免费 | 免费 | 免费 | 免费 | - |
| 作为K2.5 | 0.60 美元 | $3.00 | 0.08 美元 | - |
| Kimi K2 思考 | 0.40 美元 | 2.50 美元 | - | - |
| 作为K2 | 0.40 美元 | 2.50 美元 | - | - |
| Qwen3 编码器 480B | 0.45 美元 | 1.50 美元 | - | - |
| Claude Sonnet 4.5(≤ 200K Tokens | $3.00 | 15.00 美元 | 0.30 美元 | 3.75 美元 |
| 克劳德十四行诗 4.5> 200K 代币) | 6.00 美元 | 22.50 美元 | 0.60 美元 | 7.50 美元 |
| Claude Sonnet 4≤ 200K Tokens | $3.00 | 15.00 美元 | 0.30 美元 | 3.75 美元 |
| Kimi K2.5 Free | 免费 | 免费 | 免费 | - |
| Kimi K2.5 | 0.60 美元 | 3.00 美元 | 0.08 美元 | - |
| Kimi K2 Thinking | 0.40 美元 | 2.50 美元 | - | - |
| Kimi K2 | 0.40 美元 | 2.50 美元 | - | - |
| Qwen3 Coder 480B | 0.45 美元 | 1.50 美元 | - | - |
| Claude Sonnet 4.5(≤ 200K Tokens) | 3.00 美元 | 15.00 美元 | 0.30 美元 | 3.75 美元 |
| Claude Sonnet 4.5(> 200K Tokens) | 6.00 美元 | 22.50 美元 | 0.60 美元 | 7.50 美元 |
| Claude Sonnet 4(≤ 200K Tokens) | 3.00 美元 | 15.00 美元 | 0.30 美元 | 3.75 美元 |
| Claude Sonnet 4(> 200K Tokens) | 6.00 美元 | 22.50 美元 | 0.60 美元 | 7.50 美元 |
| Claude 俳句 4.5 | 1.00 美元 | 5.00 美元 | 0.10 美元 | 1.25 美元 |
| Claude 俳句 3.5 | 0.80 美元 | 4.00 美元 | 0.08 美元 | 1.00 美元 |
| 克劳德作品4.6(≤ 200K 代币) | 5.00 美元 | 25.00 美元 | 0.50 美元 | 6.25 美元 |
| Claude Haiku 4.5 | 1.00 美元 | 5.00 美元 | 0.10 美元 | 1.25 美元 |
| Claude Haiku 3.5 | 0.80 美元 | 4.00 美元 | 0.08 美元 | 1.00 美元 |
| Claude Opus 4.6(≤ 200K Tokens) | 5.00 美元 | 25.00 美元 | 0.50 美元 | 6.25 美元 |
| Claude Opus 4.6(> 200K Tokens) | 10.00 美元 | 37.50 美元 | 1.00 美元 | 12.50 美元 |
| Claude 工作 4.5 | 5.00 美元 | 25.00 美元 | 0.50 美元 | 6.25 美元 |
| Claude 工作 4.1 | 15.00 美元 | 75.00 美元 | 1.50 美元 | 18.75 美元 |
| Gemini 3 Pro≤20万代币) | 2.00 美元 | 12.00 美元 | 0.20 美元 | - |
| Gemini 3 Pro>20万代币) | 4.00 美元 | 18.00 美元 | 0.40 美元 | - |
| 双子座 3 闪光 | 0.50 美元 | $3.00 | 0.05 美元 | - |
| Claude Opus 4.5 | 5.00 美元 | 25.00 美元 | 0.50 美元 | 6.25 美元 |
| Claude Opus 4.1 | 15.00 美元 | 75.00 美元 | 1.50 美元 | 18.75 美元 |
| Gemini 3 Pro(≤ 20万 Tokens) | 2.00 美元 | 12.00 美元 | 0.20 美元 | - |
| Gemini 3 Pro(> 20万 Tokens) | 4.00 美元 | 18.00 美元 | 0.40 美元 | - |
| Gemini 3 Flash | 0.50 美元 | 3.00 美元 | 0.05 美元 | - |
| GPT 5.2 | 1.75 美元 | 14.00 美元 | 0.175 美元 | - |
| GPT 5.2 法典 | 1.75 美元 | 14.00 美元 | 0.175 美元 | - |
| GPT 5.2 Codex | 1.75 美元 | 14.00 美元 | 0.175 美元 | - |
| GPT 5.1 | 1.07 美元 | 8.50 美元 | 0.107 美元 | - |
| GPT 5.1 法典 | 1.07 美元 | 8.50 美元 | 0.107 美元 | - |
| GPT 5.1 法典最大 | 1.25 美元 | 10.00 美元 | 0.125 美元 | - |
| GPT 5.1 迷你版 | 0.25 美元 | 2.00 美元 | 0.025 美元 | - |
| GPT 5.1 Codex | 1.07 美元 | 8.50 美元 | 0.107 美元 | - |
| GPT 5.1 Codex Max | 1.25 美元 | 10.00 美元 | 0.125 美元 | - |
| GPT 5.1 Codex Mini | 0.25 美元 | 2.00 美元 | 0.025 美元 | - |
| GPT 5 | 1.07 美元 | 8.50 美元 | 0.107 美元 | - |
| GPT 5 法典 | 1.07 美元 | 8.50 美元 | 0.107 美元 | - |
| GPT 5 奈米 | 免费 | 免费 | 免费 | - |
| GPT 5 Codex | 1.07 美元 | 8.50 美元 | 0.107 美元 | - |
| GPT 5 Nano | 免费 | 免费 | 免费 | - |
您可能会在您的使用历史记录中注意到*Claude Haiku 3.5*。这是一个[低成本模型](/docs/config/#models),用于生成会话标题。
@@ -216,8 +216,8 @@ Zen 也非常适合团队使用。您可以邀请您可以邀请队友,分配
您可以邀请团队成员到您的工作区并分配角色:
- **管理员**管理模型、成员、API 密钥和设备
- **成员**仅管理自己的API 金?
- **管理员**管理模型、成员、API 密钥和计费/账单
- **成员**:仅管理自己的 API 密钥
管理员还可以为每个成员设置每月支出限额,以控制成本。

View File

@@ -29,6 +29,7 @@ opencode 附帶了多個適用於流行語言和框架的內建格式化程式
| htmlbeautifier | .erb, .html.erb | `htmlbeautifier` 指令可用 |
| air | .R | `air` 指令可用 |
| dart | .dart | `dart` 指令可用 |
| dfmt | .d | `dfmt` 指令可用 |
| ocamlformat | .ml, .mli | `ocamlformat` 指令可用,且存在 `.ocamlformat` 設定檔 |
| terraform | .tf, .tfvars | `terraform` 指令可用 |
| gleam | .gleam | `gleam` 指令可用 |

View File

@@ -2,7 +2,7 @@
"name": "opencode",
"displayName": "opencode",
"description": "opencode for VS Code",
"version": "1.2.5",
"version": "1.2.6",
"publisher": "sst-dev",
"repository": {
"type": "git",