Compare commits


17 Commits

Author SHA1 Message Date
Kit Langton
8d2385ad49 test: finish HTTP mock processor coverage 2026-03-31 20:36:18 -04:00
Kit Langton
7f6a5bb2c8 test: migrate processor tests to HTTP mock LLM server
Replace the custom TestLLM Effect service with the real LLM layer +
TestLLMServer HTTP mock for 9 of 10 processor tests. Tests now exercise
the full HTTP→SSE→AI SDK→processor pipeline.

- Export Provider.defaultLayer for test layer composition
- Add boot() helper for common service access (processor, session, provider)
- Extend TestLLMServer with usage support and httpError step type
- Tool abort test registers a real tool with hanging execute
- Reasoning test stays with in-process TestLLM (needs fine-grained events)
2026-03-31 19:59:56 -04:00
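The SSE framing such a mock server emits can be sketched roughly as follows. This is a hypothetical stand-in, not the actual TestLLMServer API (`Step` and `sseBody` are illustrative names): each queued text step becomes one OpenAI-compatible chat-completion chunk, closed by the `[DONE]` sentinel that `@ai-sdk/openai-compatible` clients expect.

```typescript
// Hypothetical sketch of the mock server's SSE framing (not the real
// TestLLMServer implementation). Text steps stream as chat-completion
// delta chunks; an httpError step would instead short-circuit the
// request with a non-200 response before any SSE is written.
type Step =
  | { type: "text"; value: string }
  | { type: "httpError"; status: number; body: unknown }

function sseBody(steps: Step[]): string {
  const events: string[] = []
  for (const step of steps) {
    if (step.type !== "text") continue
    const chunk = { choices: [{ delta: { content: step.value } }] }
    events.push(`data: ${JSON.stringify(chunk)}`)
  }
  // Terminate the stream so the AI SDK client stops reading.
  events.push("data: [DONE]")
  return events.join("\n\n") + "\n\n"
}
```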
Kit Langton
537cc32bf0 test: migrate processor tests to HTTP mock LLM server
Replace the custom TestLLM Effect service with the real LLM layer +
TestLLMServer HTTP mock. Tests now exercise the full HTTP→SSE→AI SDK
pipeline instead of injecting Effect streams directly.

- Extend TestLLMServer with usage support on text responses and
  httpError step type for non-200 responses
- Drop reasoning test (can't produce reasoning events via
  @ai-sdk/openai-compatible SSE)
- 9 tests pass, covering: text capture, token overflow, error retry,
  structured errors, context overflow, abort/interrupt cleanup
2026-03-31 19:21:27 -04:00
Kit Langton
82da702f64 refactor: tighten instance context helper fallbacks 2026-03-31 17:32:24 -04:00
Kit Langton
90469bbb7e refactor: simplify instance context helpers in prompt tests 2026-03-31 16:42:20 -04:00
Kit Langton
4ff0fbc043 fix: retry scoped tempdir cleanup on windows 2026-03-31 16:42:19 -04:00
Kit Langton
e24369eaf1 fix: break installation cycle in database context binding 2026-03-31 16:42:18 -04:00
Kit Langton
825f51c39f fix: restore instance context in deferred database callbacks 2026-03-31 16:42:18 -04:00
Kit Langton
191a747405 fix: propagate InstanceRef across static function boundaries
- makeRuntime.provide reads InstanceRef from current Effect fiber when
  ALS is unavailable, bridging static function calls (like Bus.publish)
  that create new fibers from inside Effect code
- Database.transaction preserves Instance ALS via Instance.bind on the
  bun:sqlite transaction callback (native fn loses ALS)
- Instance.restore helper for bridging Effect→sync code with ALS
- InstanceState.withALS bridges InstanceRef back to ALS for sync callers
- prompt.ts: InstructionPrompt.clear wrapped with withALS
- Remove ALL provideInstance(dir) wrappers from prompt-effect tests
2026-03-31 16:42:17 -04:00
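The `Instance.bind` pattern mentioned above — re-entering AsyncLocalStorage context inside a native callback that would otherwise drop it — reduces to something like this minimal stand-in (not the real Instance API):

```typescript
import { AsyncLocalStorage } from "node:async_hooks"

// Capture the ALS store at bind time and re-enter it whenever the
// wrapped function runs later, e.g. from a native transaction callback
// that does not propagate AsyncLocalStorage on its own.
const als = new AsyncLocalStorage<{ directory: string }>()

function bind<A extends unknown[], R>(fn: (...args: A) => R): (...args: A) => R {
  const store = als.getStore()
  return (...args) => (store === undefined ? fn(...args) : als.run(store, () => fn(...args)))
}
```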
Kit Langton
cc412f3014 refactor: migrate Instance ALS reads to InstanceRef in Effect services
Migrate 16 direct Instance.directory/worktree/project reads inside
Effect code to use InstanceState.directory/context helpers that check
the InstanceRef first and fall back to ALS.

- Export InstanceState.directory and InstanceState.context helpers
- bus/index.ts: GlobalBus.emit uses InstanceState.directory
- session/prompt.ts: 5 callsites migrated to InstanceState.context
- session/index.ts: 4 callsites migrated
- session/compaction.ts: 1 callsite migrated
- config/config.ts: 1 callsite migrated
- format/index.ts: 1 callsite migrated
- worktree/index.ts: 5 callsites migrated
- storage/db.ts: Database.effect preserves Instance ALS via Instance.bind
- test/lib/llm-server.ts: add wait/hold/fail SSE stream support
- Remove most provideInstance(dir) wrappers from prompt tests
  (5 remain due to Instance.state sync ALS dependency)
2026-03-31 16:42:16 -04:00
Kit Langton
bb039496d5 refactor: propagate Instance context through Effect fibers via InstanceRef
Add a ServiceMap.Reference that carries InstanceContext through the
Effect service graph so child fibers retain instance context even when
resumed by external I/O events outside the ALS boundary.

- Add InstanceRef to instance-state.ts; InstanceState.get/has/invalidate
  try the Reference first, fall back to ALS
- makeRuntime automatically captures ALS into InstanceRef at the boundary
- provideInstance (test fixture) sets InstanceRef for Effect.runPromiseWith
- Remove all redundant provideInstance(dir) wrappers from prompt tests
- Fix test/lib/effect.ts type params (drop unnecessary S/T generics)
2026-03-31 16:42:16 -04:00
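The lookup order this commit describes — try the fiber-carried reference first, then fall back to ALS — can be sketched as below. This is a simplified stand-in: the real code uses an Effect `ServiceMap.Reference` that travels with the fiber, not a module-scoped mutable variable.

```typescript
import { AsyncLocalStorage } from "node:async_hooks"

type InstanceContext = { directory: string }

const als = new AsyncLocalStorage<InstanceContext>()
// Stand-in for the fiber-local InstanceRef; in the real code this value
// is carried by the Effect fiber rather than living at module scope.
let instanceRef: InstanceContext | undefined

function currentContext(): InstanceContext | undefined {
  // The reference wins when set (e.g. a fiber resumed by external I/O
  // outside the ALS boundary); otherwise fall back to ALS.
  return instanceRef ?? als.getStore()
}
```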
Kit Langton
f2fa1a681d test: move more prompt cases to mock llm server
Migrate the next prompt-effect cases to the HTTP-backed mock server path, keep the shell handoff cases on short live timeouts, and leave the stream-failure case on the in-process fake until the server DSL matches it.
2026-03-31 16:42:15 -04:00
Kit Langton
6bd340492c test: infer mock server callback types 2026-03-31 16:42:15 -04:00
Kit Langton
21ec3207e7 test: extend mock llm server coverage
Add fixture support for tmpdir-backed mock server tests, extend the mock LLM server DSL for failure and hanging cases, and migrate the next prompt tests to the HTTP-backed path.
2026-03-31 16:42:14 -04:00
Kit Langton
123123b6c3 test: start moving prompt tests to mock llm server
Switch the basic assistant reply prompt-effect test to the HTTP-backed mock LLM server while keeping the more stream-sensitive cases on the in-process fake for now.
2026-03-31 16:42:14 -04:00
Kit Langton
6ea467b0ac test: add live effect helper mode
Update the shared effect test helper to support both test-clock and live execution by default, and switch the current opencode effect tests to the live path for real integration behavior.
2026-03-31 16:42:13 -04:00
Kit Langton
459fbc99a8 refactor(test): migrate llm-server to Effect HTTP platform
- Replace Bun.serve with Effect HTTP server using NodeHttpServer
- Add TestLLMServer service for mock LLM testing with SSE responses
- Update prompt-provider.test.ts to use testEffect pattern with provideTmpdirInstance
- Remove redundant test/fixture/effect.ts (using existing test/lib/effect.ts instead)
2026-03-31 16:42:13 -04:00
53 changed files with 1213 additions and 1609 deletions

.github/VOUCHED.td vendored

@@ -27,4 +27,3 @@ rekram1-node
-robinmordasiewicz
-spider-yamet clawdbot/llm psychosis, spam pinging the team
thdxr
-toastythebot

bun.lock

File diff suppressed because it is too large


@@ -1,8 +1,8 @@
{
"nodeModules": {
"x86_64-linux": "sha256-C7y5FMI1pGEgMw/vcPoBhK9tw5uGg1bk0gPXPUUVhgU=",
"aarch64-linux": "sha256-cUlQ9jp4WIaJkd4GRoHMWc+REG/OnnGCmsQUNmvg4is=",
"aarch64-darwin": "sha256-3GXmqG7yihJ91wS/jlW19qxGI62b1bFJnpGB4LcMlpY=",
"x86_64-darwin": "sha256-cUF0TfYg2nXnU80kWFpr9kNHlu9txiatIgrHTltgx4g="
"x86_64-linux": "sha256-UuVbB5lTRB4bIcaKMc8CLSbQW7m9EjXgxYvxp/uO7Co=",
"aarch64-linux": "sha256-8D7ReLRVb7NDd5PQTVxFhRLmlLbfjK007XgIhhpNKoE=",
"aarch64-darwin": "sha256-M+z7C/eXfVqwDiGiiwKo/LT/m4dvCjL1Pblsr1kxoyI=",
"x86_64-darwin": "sha256-RzZS6GMwYVDPK0W+K/mlebixNMs2+JRkMG9n8OFhd0c="
}
}


@@ -12,7 +12,6 @@
"dev:console": "ulimit -n 10240 2>/dev/null; bun run --cwd packages/console/app dev",
"dev:storybook": "bun --cwd packages/storybook storybook",
"typecheck": "bun turbo typecheck",
"postinstall": "bun run --cwd packages/opencode fix-node-pty",
"prepare": "husky",
"random": "echo 'Random script'",
"hello": "echo 'Hello World!'",
@@ -102,7 +101,6 @@
},
"trustedDependencies": [
"esbuild",
"node-pty",
"protobufjs",
"tree-sitter",
"tree-sitter-bash",


@@ -1,8 +1,5 @@
import { test as base, expect, type Page } from "@playwright/test"
import { ManagedRuntime } from "effect"
import type { E2EWindow } from "../src/testing/terminal"
import type { Item, Reply, Usage } from "../../opencode/test/lib/llm-server"
import { TestLLMServer } from "../../opencode/test/lib/llm-server"
import {
healthPhase,
cleanupSession,
@@ -16,24 +13,6 @@ import {
} from "./actions"
import { createSdk, dirSlug, getWorktree, sessionPath } from "./utils"
type LLMFixture = {
url: string
push: (...input: (Item | Reply)[]) => Promise<void>
text: (value: string, opts?: { usage?: Usage }) => Promise<void>
tool: (name: string, input: unknown) => Promise<void>
toolHang: (name: string, input: unknown) => Promise<void>
reason: (value: string, opts?: { text?: string; usage?: Usage }) => Promise<void>
fail: (message?: unknown) => Promise<void>
error: (status: number, body: unknown) => Promise<void>
hang: () => Promise<void>
hold: (value: string, wait: PromiseLike<unknown>) => Promise<void>
hits: () => Promise<Array<{ url: URL; body: Record<string, unknown> }>>
calls: () => Promise<number>
wait: (count: number) => Promise<void>
inputs: () => Promise<Record<string, unknown>[]>
pending: () => Promise<number>
}
export const settingsKey = "settings.v3"
const seedModel = (() => {
@@ -47,7 +26,6 @@ const seedModel = (() => {
})()
type TestFixtures = {
llm: LLMFixture
sdk: ReturnType<typeof createSdk>
gotoSession: (sessionID?: string) => Promise<void>
withProject: <T>(
@@ -58,11 +36,7 @@ type TestFixtures = {
trackSession: (sessionID: string, directory?: string) => void
trackDirectory: (directory: string) => void
}) => Promise<T>,
options?: {
extra?: string[]
model?: { providerID: string; modelID: string }
setup?: (directory: string) => Promise<void>
},
options?: { extra?: string[] },
) => Promise<T>
}
@@ -72,31 +46,6 @@ type WorkerFixtures = {
}
export const test = base.extend<TestFixtures, WorkerFixtures>({
llm: async ({}, use) => {
const rt = ManagedRuntime.make(TestLLMServer.layer)
try {
const svc = await rt.runPromise(TestLLMServer.asEffect())
await use({
url: svc.url,
push: (...input) => rt.runPromise(svc.push(...input)),
text: (value, opts) => rt.runPromise(svc.text(value, opts)),
tool: (name, input) => rt.runPromise(svc.tool(name, input)),
toolHang: (name, input) => rt.runPromise(svc.toolHang(name, input)),
reason: (value, opts) => rt.runPromise(svc.reason(value, opts)),
fail: (message) => rt.runPromise(svc.fail(message)),
error: (status, body) => rt.runPromise(svc.error(status, body)),
hang: () => rt.runPromise(svc.hang),
hold: (value, wait) => rt.runPromise(svc.hold(value, wait)),
hits: () => rt.runPromise(svc.hits),
calls: () => rt.runPromise(svc.calls),
wait: (count) => rt.runPromise(svc.wait(count)),
inputs: () => rt.runPromise(svc.inputs),
pending: () => rt.runPromise(svc.pending),
})
} finally {
await rt.dispose()
}
},
page: async ({ page }, use) => {
let boundary: string | undefined
setHealthPhase(page, "test")
@@ -150,8 +99,7 @@ export const test = base.extend<TestFixtures, WorkerFixtures>({
const root = await createTestProject()
const sessions = new Map<string, string>()
const dirs = new Set<string>()
await options?.setup?.(root)
await seedStorage(page, { directory: root, extra: options?.extra, model: options?.model })
await seedStorage(page, { directory: root, extra: options?.extra })
const gotoSession = async (sessionID?: string) => {
await page.goto(sessionPath(root, sessionID))
@@ -185,14 +133,7 @@ export const test = base.extend<TestFixtures, WorkerFixtures>({
},
})
async function seedStorage(
page: Page,
input: {
directory: string
extra?: string[]
model?: { providerID: string; modelID: string }
},
) {
async function seedStorage(page: Page, input: { directory: string; extra?: string[] }) {
await seedProjects(page, input)
await page.addInitScript((model: { providerID: string; modelID: string }) => {
const win = window as E2EWindow
@@ -217,7 +158,7 @@ async function seedStorage(
variant: {},
}),
)
}, input.model ?? seedModel)
}, seedModel)
}
export { expect }


@@ -1,44 +1,8 @@
import fs from "node:fs/promises"
import path from "node:path"
import { test, expect } from "../fixtures"
import { promptSelector } from "../selectors"
import { sessionIDFromUrl } from "../actions"
import { createSdk } from "../utils"
import { cleanupSession, sessionIDFromUrl, withSession } from "../actions"
async function config(dir: string, url: string) {
await fs.writeFile(
path.join(dir, "opencode.json"),
JSON.stringify({
$schema: "https://opencode.ai/config.json",
enabled_providers: ["e2e-llm"],
provider: {
"e2e-llm": {
name: "E2E LLM",
npm: "@ai-sdk/openai-compatible",
env: [],
models: {
"test-model": {
name: "Test Model",
tool_call: true,
limit: { context: 128000, output: 32000 },
},
},
options: {
apiKey: "test-key",
baseURL: url,
},
},
},
agent: {
build: {
model: "e2e-llm/test-model",
},
},
}),
)
}
test("can send a prompt and receive a reply", async ({ page, llm, withProject }) => {
test("can send a prompt and receive a reply", async ({ page, sdk, gotoSession }) => {
test.setTimeout(120_000)
const pageErrors: string[] = []
@@ -47,51 +11,42 @@ test("can send a prompt and receive a reply", async ({ page, llm, withProject })
}
page.on("pageerror", onPageError)
await gotoSession()
const token = `E2E_OK_${Date.now()}`
const prompt = page.locator(promptSelector)
await prompt.click()
await page.keyboard.type(`Reply with exactly: ${token}`)
await page.keyboard.press("Enter")
await expect(page).toHaveURL(/\/session\/[^/?#]+/, { timeout: 30_000 })
const sessionID = (() => {
const id = sessionIDFromUrl(page.url())
if (!id) throw new Error(`Failed to parse session id from url: ${page.url()}`)
return id
})()
try {
await withProject(
async (project) => {
const sdk = createSdk(project.directory)
const token = `E2E_OK_${Date.now()}`
await expect
.poll(
async () => {
const messages = await sdk.session.messages({ sessionID, limit: 50 }).then((r) => r.data ?? [])
return messages
.filter((m) => m.info.role === "assistant")
.flatMap((m) => m.parts)
.filter((p) => p.type === "text")
.map((p) => p.text)
.join("\n")
},
{ timeout: 90_000 },
)
await llm.text(token)
await project.gotoSession()
const prompt = page.locator(promptSelector)
await prompt.click()
await page.keyboard.type(`Reply with exactly: ${token}`)
await page.keyboard.press("Enter")
await expect(page).toHaveURL(/\/session\/[^/?#]+/, { timeout: 30_000 })
const sessionID = (() => {
const id = sessionIDFromUrl(page.url())
if (!id) throw new Error(`Failed to parse session id from url: ${page.url()}`)
return id
})()
project.trackSession(sessionID)
await expect
.poll(
async () => {
const messages = await sdk.session.messages({ sessionID, limit: 50 }).then((r) => r.data ?? [])
return messages
.filter((m) => m.info.role === "assistant")
.flatMap((m) => m.parts)
.filter((p) => p.type === "text")
.map((p) => p.text)
.join("\n")
},
{ timeout: 30_000 },
)
.toContain(token)
},
{
model: { providerID: "e2e-llm", modelID: "test-model" },
setup: (dir) => config(dir, llm.url),
},
)
.toContain(token)
} finally {
page.off("pageerror", onPageError)
await cleanupSession({ sdk, sessionID })
}
if (pageErrors.length > 0) {


@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/app",
"version": "1.3.13",
"version": "1.3.11",
"description": "",
"type": "module",
"exports": {


@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/console-app",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {


@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/console-core",
"version": "1.3.13",
"version": "1.3.11",
"private": true,
"type": "module",
"license": "MIT",


@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/console-function",
"version": "1.3.13",
"version": "1.3.11",
"$schema": "https://json.schemastore.org/package.json",
"private": true,
"type": "module",


@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/console-mail",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@jsx-email/all": "2.2.3",
"@jsx-email/cli": "1.4.3",


@@ -1,7 +1,7 @@
{
"name": "@opencode-ai/desktop-electron",
"private": true,
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"homepage": "https://opencode.ai",


@@ -1,7 +1,7 @@
{
"name": "@opencode-ai/desktop",
"private": true,
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {


@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/enterprise",
"version": "1.3.13",
"version": "1.3.11",
"private": true,
"type": "module",
"license": "MIT",


@@ -1,7 +1,7 @@
id = "opencode"
name = "OpenCode"
description = "The open source coding agent."
version = "1.3.13"
version = "1.3.11"
schema_version = 1
authors = ["Anomaly"]
repository = "https://github.com/anomalyco/opencode"
@@ -11,26 +11,26 @@ name = "OpenCode"
icon = "./icons/opencode.svg"
[agent_servers.opencode.targets.darwin-aarch64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-darwin-arm64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-darwin-arm64.zip"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.darwin-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-darwin-x64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-darwin-x64.zip"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.linux-aarch64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-linux-arm64.tar.gz"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-linux-arm64.tar.gz"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.linux-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-linux-x64.tar.gz"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-linux-x64.tar.gz"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.windows-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-windows-x64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-windows-x64.zip"
cmd = "./opencode.exe"
args = ["acp"]


@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/function",
"version": "1.3.13",
"version": "1.3.11",
"$schema": "https://json.schemastore.org/package.json",
"private": true,
"type": "module",


@@ -1,6 +1,6 @@
{
"$schema": "https://json.schemastore.org/package.json",
"version": "1.3.13",
"version": "1.3.11",
"name": "opencode",
"type": "module",
"license": "MIT",
@@ -10,7 +10,6 @@
"typecheck": "tsgo --noEmit",
"test": "bun test --timeout 30000",
"build": "bun run script/build.ts",
"fix-node-pty": "bun run script/fix-node-pty.ts",
"upgrade-opentui": "bun run script/upgrade-opentui.ts",
"dev": "bun run --conditions=browser ./src/index.ts",
"random": "echo 'Random script updated at $(date)' && echo 'Change queued successfully' && echo 'Another change made' && echo 'Yet another change' && echo 'One more change' && echo 'Final change' && echo 'Another final change' && echo 'Yet another final change'",
@@ -33,11 +32,6 @@
"bun": "./src/storage/db.bun.ts",
"node": "./src/storage/db.node.ts",
"default": "./src/storage/db.bun.ts"
},
"#pty": {
"bun": "./src/pty/pty.bun.ts",
"node": "./src/pty/pty.node.ts",
"default": "./src/pty/pty.bun.ts"
}
},
"devDependencies": {
@@ -97,13 +91,8 @@
"@aws-sdk/credential-providers": "3.993.0",
"@clack/prompts": "1.0.0-alpha.1",
"@effect/platform-node": "catalog:",
"@gitlab/gitlab-ai-provider": "3.6.0",
"@gitlab/opencode-gitlab-auth": "1.3.3",
"@hono/node-server": "1.19.11",
"@hono/node-ws": "1.3.0",
"@hono/standard-validator": "0.1.5",
"@hono/zod-validator": "catalog:",
"@lydell/node-pty": "1.2.0-beta.10",
"@modelcontextprotocol/sdk": "1.27.1",
"@octokit/graphql": "9.0.2",
"@octokit/rest": "catalog:",
@@ -113,8 +102,8 @@
"@opencode-ai/sdk": "workspace:*",
"@opencode-ai/util": "workspace:*",
"@openrouter/ai-sdk-provider": "2.3.3",
"@opentui/core": "0.1.95",
"@opentui/solid": "0.1.95",
"@opentui/core": "0.1.93",
"@opentui/solid": "0.1.93",
"@parcel/watcher": "2.5.1",
"@pierre/diffs": "catalog:",
"@solid-primitives/event-bus": "1.1.2",

packages/opencode/script/build-node.ts Executable file → Normal file

@@ -1,6 +1,5 @@
#!/usr/bin/env bun
import { $ } from "bun"
import fs from "fs"
import path from "path"
import { fileURLToPath } from "url"
@@ -41,14 +40,11 @@ const migrations = await Promise.all(
)
console.log(`Loaded ${migrations.length} migrations`)
await $`bun install --os="*" --cpu="*" @lydell/node-pty@1.2.0-beta.10`
await Bun.build({
target: "node",
entrypoints: ["./src/node.ts"],
outdir: "./dist",
format: "esm",
sourcemap: "linked",
external: ["jsonc-parser"],
define: {
OPENCODE_MIGRATIONS: JSON.stringify(migrations),


@@ -1,28 +0,0 @@
#!/usr/bin/env bun
import fs from "fs/promises"
import path from "path"
import { fileURLToPath } from "url"
const __filename = fileURLToPath(import.meta.url)
const __dirname = path.dirname(__filename)
const dir = path.resolve(__dirname, "..")
if (process.platform !== "win32") {
const root = path.join(dir, "node_modules", "node-pty", "prebuilds")
const dirs = await fs.readdir(root, { withFileTypes: true }).catch(() => [])
const files = dirs.filter((x) => x.isDirectory()).map((x) => path.join(root, x.name, "spawn-helper"))
const result = await Promise.all(
files.map(async (file) => {
const stat = await fs.stat(file).catch(() => undefined)
if (!stat) return
if ((stat.mode & 0o111) === 0o111) return
await fs.chmod(file, stat.mode | 0o755)
return file
}),
)
const fixed = result.filter(Boolean)
if (fixed.length) {
console.log(`fixed node-pty permissions for ${fixed.length} helper${fixed.length === 1 ? "" : "s"}`)
}
}


@@ -75,7 +75,6 @@ export namespace Agent {
const config = yield* Config.Service
const auth = yield* Auth.Service
const skill = yield* Skill.Service
const provider = yield* Provider.Service
const state = yield* InstanceState.make<State>(
Effect.fn("Agent.state")(function* (ctx) {
@@ -331,9 +330,9 @@ export namespace Agent {
model?: { providerID: ProviderID; modelID: ModelID }
}) {
const cfg = yield* config.get()
const model = input.model ?? (yield* provider.defaultModel())
const resolved = yield* provider.getModel(model.providerID, model.modelID)
const language = yield* provider.getLanguage(resolved)
const model = input.model ?? (yield* Effect.promise(() => Provider.defaultModel()))
const resolved = yield* Effect.promise(() => Provider.getModel(model.providerID, model.modelID))
const language = yield* Effect.promise(() => Provider.getLanguage(resolved))
const system = [PROMPT_GENERATE]
yield* Effect.promise(() =>
@@ -394,7 +393,6 @@ export namespace Agent {
)
export const defaultLayer = layer.pipe(
Layer.provide(Provider.defaultLayer),
Layer.provide(Auth.defaultLayer),
Layer.provide(Config.defaultLayer),
Layer.provide(Skill.defaultLayer),


@@ -23,7 +23,7 @@ export const AcpCommand = cmd({
process.env.OPENCODE_CLIENT = "acp"
await bootstrap(process.cwd(), async () => {
const opts = await resolveNetworkOptions(args)
const server = await Server.listen(opts)
const server = Server.listen(opts)
const sdk = createOpencodeClient({
baseUrl: `http://${server.hostname}:${server.port}`,


@@ -15,7 +15,7 @@ export const ServeCommand = cmd({
console.log("Warning: OPENCODE_SERVER_PASSWORD is not set; server is unsecured.")
}
const opts = await resolveNetworkOptions(args)
const server = await Server.listen(opts)
const server = Server.listen(opts)
console.log(`opencode server listening on http://${server.hostname}:${server.port}`)
await new Promise(() => {})


@@ -37,7 +37,7 @@ export const WebCommand = cmd({
UI.println(UI.Style.TEXT_WARNING_BOLD + "! " + "OPENCODE_SERVER_PASSWORD is not set; server is unsecured.")
}
const opts = await resolveNetworkOptions(args)
const server = await Server.listen(opts)
const server = Server.listen(opts)
UI.empty()
UI.println(UI.logo(" "))
UI.empty()


@@ -128,8 +128,7 @@ export namespace Plugin {
get serverUrl(): URL {
return Server.url ?? new URL("http://localhost:4096")
},
// @ts-expect-error
$: typeof Bun === "undefined" ? undefined : Bun.$,
$: Bun.$,
}
for (const plugin of INTERNAL_PLUGINS) {


@@ -3,7 +3,7 @@ import { Bus } from "@/bus"
import { InstanceState } from "@/effect/instance-state"
import { makeRuntime } from "@/effect/run-service"
import { Instance } from "@/project/instance"
import type { Proc } from "#pty"
import { type IPty } from "bun-pty"
import z from "zod"
import { Log } from "../util/log"
import { lazy } from "@opencode-ai/util/lazy"
@@ -26,11 +26,9 @@ export namespace Pty {
close: (code?: number, reason?: string) => void
}
const sock = (ws: Socket) => (ws.data && typeof ws.data === "object" ? ws.data : ws)
type Active = {
info: Info
process: Proc
process: IPty
buffer: string
bufferCursor: number
cursor: number
@@ -52,7 +50,10 @@ export namespace Pty {
return out
}
const pty = lazy(() => import("#pty"))
const pty = lazy(async () => {
const { spawn } = await import("bun-pty")
return spawn
})
export const Info = z
.object({
@@ -121,9 +122,9 @@ export namespace Pty {
try {
session.process.kill()
} catch {}
for (const [sub, ws] of session.subscribers.entries()) {
for (const [key, ws] of session.subscribers.entries()) {
try {
if (sock(ws) === sub) ws.close()
if (ws.data === key) ws.close()
} catch {}
}
session.subscribers.clear()
@@ -196,7 +197,7 @@ export namespace Pty {
}
log.info("creating session", { id, cmd: command, args, cwd })
const { spawn } = await pty()
const spawn = await pty()
const proc = spawn(command, args, {
name: "xterm-256color",
cwd,
@@ -225,19 +226,19 @@ export namespace Pty {
Instance.bind((chunk) => {
session.cursor += chunk.length
for (const [sub, ws] of session.subscribers.entries()) {
for (const [key, ws] of session.subscribers.entries()) {
if (ws.readyState !== 1) {
session.subscribers.delete(sub)
session.subscribers.delete(key)
continue
}
if (sock(ws) !== sub) {
session.subscribers.delete(sub)
if (ws.data !== key) {
session.subscribers.delete(key)
continue
}
try {
ws.send(chunk)
} catch {
session.subscribers.delete(sub)
session.subscribers.delete(key)
}
}
@@ -301,12 +302,15 @@ export namespace Pty {
}
log.info("client connected to session", { id })
const sub = sock(ws)
session.subscribers.delete(sub)
session.subscribers.set(sub, ws)
// Use ws.data as the unique key for this connection lifecycle.
// If ws.data is undefined, fallback to ws object.
const key = ws.data && typeof ws.data === "object" ? ws.data : ws
// Optionally cleanup if the key somehow exists
session.subscribers.delete(key)
session.subscribers.set(key, ws)
const cleanup = () => {
session.subscribers.delete(sub)
session.subscribers.delete(key)
}
const start = session.bufferCursor


@@ -1,26 +0,0 @@
import { spawn as create } from "bun-pty"
import type { Opts, Proc } from "./pty"
export type { Disp, Exit, Opts, Proc } from "./pty"
export function spawn(file: string, args: string[], opts: Opts): Proc {
const pty = create(file, args, opts)
return {
pid: pty.pid,
onData(listener) {
return pty.onData(listener)
},
onExit(listener) {
return pty.onExit(listener)
},
write(data) {
pty.write(data)
},
resize(cols, rows) {
pty.resize(cols, rows)
},
kill(signal) {
pty.kill(signal)
},
}
}


@@ -1,27 +0,0 @@
/** @ts-expect-error */
import * as pty from "@lydell/node-pty"
import type { Opts, Proc } from "./pty"
export type { Disp, Exit, Opts, Proc } from "./pty"
export function spawn(file: string, args: string[], opts: Opts): Proc {
const proc = pty.spawn(file, args, opts)
return {
pid: proc.pid,
onData(listener) {
return proc.onData(listener)
},
onExit(listener) {
return proc.onExit(listener)
},
write(data) {
proc.write(data)
},
resize(cols, rows) {
proc.resize(cols, rows)
},
kill(signal) {
proc.kill(signal)
},
}
}


@@ -1,25 +0,0 @@
export type Disp = {
dispose(): void
}
export type Exit = {
exitCode: number
signal?: number | string
}
export type Opts = {
name: string
cols?: number
rows?: number
cwd?: string
env?: Record<string, string>
}
export type Proc = {
pid: number
onData(listener: (data: string) => void): Disp
onExit(listener: (event: Exit) => void): Disp
write(data: string): void
resize(cols: number, rows: number): void
kill(signal?: string): void
}


@@ -1,7 +1,6 @@
import { describeRoute, resolver } from "hono-openapi"
import { Hono } from "hono"
import { proxy } from "hono/proxy"
import type { UpgradeWebSocket } from "hono/ws"
import z from "zod"
import { createHash } from "node:crypto"
import { Log } from "../util/log"
@@ -41,11 +40,11 @@ const DEFAULT_CSP =
const csp = (hash = "") =>
`default-src 'self'; script-src 'self' 'wasm-unsafe-eval'${hash ? ` 'sha256-${hash}'` : ""}; style-src 'self' 'unsafe-inline'; img-src 'self' data: https:; font-src 'self' data:; media-src 'self' data:; connect-src 'self' data:`
export const InstanceRoutes = (upgrade: UpgradeWebSocket, app: Hono = new Hono()) =>
app
export const InstanceRoutes = (app?: Hono) =>
(app ?? new Hono())
.onError(errorHandler(log))
.route("/project", ProjectRoutes())
.route("/pty", PtyRoutes(upgrade))
.route("/pty", PtyRoutes())
.route("/config", ConfigRoutes())
.route("/experimental", ExperimentalRoutes())
.route("/session", SessionRoutes())


@@ -1,5 +1,4 @@
import type { MiddlewareHandler } from "hono"
import type { UpgradeWebSocket } from "hono/ws"
import { getAdaptor } from "@/control-plane/adaptors"
import { WorkspaceID } from "@/control-plane/schema"
import { Workspace } from "@/control-plane/workspace"
@@ -25,78 +24,76 @@ function local(method: string, path: string) {
return false
}
export function WorkspaceRouterMiddleware(upgrade: UpgradeWebSocket): MiddlewareHandler {
const routes = lazy(() => InstanceRoutes(upgrade))
const routes = lazy(() => InstanceRoutes())
return async (c) => {
const raw = c.req.query("directory") || c.req.header("x-opencode-directory") || process.cwd()
const directory = Filesystem.resolve(
(() => {
try {
return decodeURIComponent(raw)
} catch {
return raw
}
})(),
)
export const WorkspaceRouterMiddleware: MiddlewareHandler = async (c) => {
const raw = c.req.query("directory") || c.req.header("x-opencode-directory") || process.cwd()
const directory = Filesystem.resolve(
(() => {
try {
return decodeURIComponent(raw)
} catch {
return raw
}
})(),
)
const url = new URL(c.req.url)
const workspaceParam = url.searchParams.get("workspace")
const url = new URL(c.req.url)
const workspaceParam = url.searchParams.get("workspace")
// TODO: If session is being routed, force it to lookup the
// project/workspace
// TODO: If session is being routed, force it to lookup the
// project/workspace
// If no workspace is provided we use the "project" workspace
if (!workspaceParam) {
return Instance.provide({
directory,
init: InstanceBootstrap,
async fn() {
return routes().fetch(c.req.raw, c.env)
},
})
}
const workspaceID = WorkspaceID.make(workspaceParam)
const workspace = await Workspace.get(workspaceID)
if (!workspace) {
return new Response(`Workspace not found: ${workspaceID}`, {
status: 500,
headers: {
"content-type": "text/plain; charset=utf-8",
},
})
}
// Handle local workspaces directly so we can pass env to `fetch`,
// necessary for websocket upgrades
if (workspace.type === "worktree") {
return Instance.provide({
directory: workspace.directory!,
init: InstanceBootstrap,
async fn() {
return routes().fetch(c.req.raw, c.env)
},
})
}
// Remote workspaces
if (local(c.req.method, url.pathname)) {
// No instance provided because we are serving cached data; there
// is no instance to work with
return routes().fetch(c.req.raw, c.env)
}
const adaptor = await getAdaptor(workspace.type)
const headers = new Headers(c.req.raw.headers)
headers.delete("x-opencode-workspace")
return adaptor.fetch(workspace, `${url.pathname}${url.search}`, {
method: c.req.method,
body: c.req.method === "GET" || c.req.method === "HEAD" ? undefined : await c.req.raw.arrayBuffer(),
signal: c.req.raw.signal,
headers,
})
}
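The handler above makes a four-way routing decision: default workspace, local worktree, locally servable cached path, or remote proxy. A condensed sketch of just the decision order, with the handlers stubbed as plain strings (the real code returns `Response`s via `Instance.provide`, `routes().fetch`, and `adaptor.fetch`):

```typescript
// Condensed sketch of the routing decision above; handler outcomes are
// modeled as strings instead of Responses.
type Workspace = { type: "worktree" | "remote"; directory?: string }

function routeDecision(
  workspaceParam: string | null,
  lookup: (id: string) => Workspace | undefined,
  isLocallyServablePath: boolean,
): string {
  if (!workspaceParam) return "local:project" // default "project" workspace
  const workspace = lookup(workspaceParam)
  if (!workspace) return "error:not-found" // 500 in the real handler
  // Local worktrees are handled in-process so env can be passed to fetch
  // (needed for websocket upgrades).
  if (workspace.type === "worktree") return `local:${workspace.directory}`
  if (isLocallyServablePath) return "cached" // cached data, no instance
  return "proxy:remote" // forward via the workspace adaptor
}
```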

View File

@@ -1,14 +1,15 @@
import { Hono, type MiddlewareHandler } from "hono"
import { Hono } from "hono"
import { describeRoute, validator, resolver } from "hono-openapi"
import type { UpgradeWebSocket } from "hono/ws"
import { upgradeWebSocket } from "hono/bun"
import z from "zod"
import { Pty } from "@/pty"
import { PtyID } from "@/pty/schema"
import { NotFoundError } from "../../storage/db"
import { errors } from "../error"
import { lazy } from "../../util/lazy"
export function PtyRoutes(upgradeWebSocket: UpgradeWebSocket) {
return new Hono()
export const PtyRoutes = lazy(() =>
new Hono()
.get(
"/",
describeRoute({
@@ -206,5 +207,5 @@ export function PtyRoutes(upgradeWebSocket: UpgradeWebSocket) {
},
}
}),
)
}
),
)
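The `PtyRoutes` change above wraps route construction in a `lazy` helper from `../../util/lazy`. A minimal sketch of what such a memoizing helper typically looks like (the real implementation may differ):

```typescript
// Hypothetical sketch of a memoizing `lazy` helper: the thunk runs once on
// first call, and every subsequent call returns the cached result.
function lazy<T>(thunk: () => T): () => T {
  let cached: { value: T } | undefined
  return () => {
    if (!cached) cached = { value: thunk() }
    return cached.value
  }
}
```

This defers building the Hono router (and its OpenAPI metadata) until the first request, while still sharing one router instance afterwards.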

View File

@@ -436,13 +436,13 @@ export const SessionRoutes = lazy(() =>
validator(
"param",
z.object({
sessionID: SessionSummary.DiffInput.shape.sessionID,
sessionID: SessionSummary.diff.schema.shape.sessionID,
}),
),
validator(
"query",
z.object({
messageID: SessionSummary.DiffInput.shape.messageID,
messageID: SessionSummary.diff.schema.shape.messageID,
}),
),
async (c) => {

View File

@@ -4,14 +4,12 @@ import { Hono } from "hono"
import { compress } from "hono/compress"
import { cors } from "hono/cors"
import { basicAuth } from "hono/basic-auth"
import type { UpgradeWebSocket } from "hono/ws"
import z from "zod"
import { Auth } from "../auth"
import { Flag } from "../flag/flag"
import { ProviderID } from "../provider/schema"
import { createAdaptorServer, type ServerType } from "@hono/node-server"
import { createNodeWebSocket } from "@hono/node-ws"
import { WorkspaceRouterMiddleware } from "./router"
import { websocket } from "hono/bun"
import { errors } from "./error"
import { GlobalRoutes } from "./routes/global"
import { MDNS } from "./mdns"
@@ -26,14 +24,8 @@ globalThis.AI_SDK_LOG_WARNINGS = false
initProjectors()
export namespace Server {
export type Listener = {
hostname: string
port: number
url: URL
stop: (close?: boolean) => Promise<void>
}
const log = Log.create({ service: "server" })
const zipped = compress()
const skipCompress = (path: string, method: string) => {
@@ -42,9 +34,10 @@ export namespace Server {
return false
}
export const Default = lazy(() => create({}).app)
export const Default = lazy(() => ControlPlaneRoutes())
export function ControlPlaneRoutes(upgrade: UpgradeWebSocket, app = new Hono(), opts?: { cors?: string[] }): Hono {
export const ControlPlaneRoutes = (opts?: { cors?: string[] }): Hono => {
const app = new Hono()
return app
.onError(errorHandler(log))
.use((c, next) => {
@@ -69,7 +62,9 @@ export namespace Server {
path: c.req.path,
})
await next()
if (!skip) timer.stop()
if (!skip) {
timer.stop()
}
})
.use(
cors({
@@ -86,8 +81,15 @@ export namespace Server {
)
return input
if (/^https:\/\/([a-z0-9-]+\.)*opencode\.ai$/.test(input)) return input
if (opts?.cors?.includes(input)) return input
// *.opencode.ai (https only, adjust if needed)
if (/^https:\/\/([a-z0-9-]+\.)*opencode\.ai$/.test(input)) {
return input
}
if (opts?.cors?.includes(input)) {
return input
}
return
},
}),
)
@@ -232,20 +234,11 @@ export namespace Server {
return c.json(true)
},
)
.use(WorkspaceRouterMiddleware(upgrade))
}
function create(opts: { cors?: string[] }) {
const app = new Hono()
const ws = createNodeWebSocket({ app })
return {
app: ControlPlaneRoutes(ws.upgradeWebSocket, app, opts),
ws,
}
.use(WorkspaceRouterMiddleware)
}
export function createApp(opts: { cors?: string[] }) {
return create(opts).app
return ControlPlaneRoutes(opts)
}
export async function openapi() {
@@ -253,8 +246,8 @@ export namespace Server {
// hono-openapi can see describeRoute metadata (`.route()` wraps
// handlers when the sub-app has a custom errorHandler, which
// strips the metadata symbol).
const { app, ws } = create({})
InstanceRoutes(ws.upgradeWebSocket, app)
const app = ControlPlaneRoutes()
InstanceRoutes(app)
const result = await generateSpecs(app, {
documentation: {
info: {
@@ -268,86 +261,52 @@ export namespace Server {
return result
}
/** @deprecated do not use this dumb shit */
export let url: URL
export async function listen(opts: {
export function listen(opts: {
port: number
hostname: string
mdns?: boolean
mdnsDomain?: string
cors?: string[]
}): Promise<Listener> {
const built = create(opts)
const start = (port: number) =>
new Promise<ServerType>((resolve, reject) => {
const server = createAdaptorServer({ fetch: built.app.fetch })
built.ws.injectWebSocket(server)
const fail = (err: Error) => {
cleanup()
reject(err)
}
const ready = () => {
cleanup()
resolve(server)
}
const cleanup = () => {
server.off("error", fail)
server.off("listening", ready)
}
server.once("error", fail)
server.once("listening", ready)
server.listen(port, opts.hostname)
})
const server = opts.port === 0 ? await start(4096).catch(() => start(0)) : await start(opts.port)
const addr = server.address()
if (!addr || typeof addr === "string") {
throw new Error(`Failed to resolve server address for port ${opts.port}`)
}) {
url = new URL(`http://${opts.hostname}:${opts.port}`)
const app = ControlPlaneRoutes({ cors: opts.cors })
const args = {
hostname: opts.hostname,
idleTimeout: 0,
fetch: app.fetch,
websocket: websocket,
} as const
const tryServe = (port: number) => {
try {
return Bun.serve({ ...args, port })
} catch {
return undefined
}
}
const server = opts.port === 0 ? (tryServe(4096) ?? tryServe(0)) : tryServe(opts.port)
if (!server) throw new Error(`Failed to start server on port ${opts.port}`)
const next = new URL("http://localhost")
next.hostname = opts.hostname
next.port = String(addr.port)
url = next
const mdns =
const shouldPublishMDNS =
opts.mdns &&
addr.port &&
server.port &&
opts.hostname !== "127.0.0.1" &&
opts.hostname !== "localhost" &&
opts.hostname !== "::1"
if (mdns) {
MDNS.publish(addr.port, opts.mdnsDomain)
if (shouldPublishMDNS) {
MDNS.publish(server.port!, opts.mdnsDomain)
} else if (opts.mdns) {
log.warn("mDNS enabled but hostname is loopback; skipping mDNS publish")
}
let closing: Promise<void> | undefined
return {
hostname: opts.hostname,
port: addr.port,
url: next,
stop(close?: boolean) {
closing ??= new Promise((resolve, reject) => {
if (mdns) MDNS.unpublish()
server.close((err) => {
if (err) {
reject(err)
return
}
resolve()
})
if (close) {
if ("closeAllConnections" in server && typeof server.closeAllConnections === "function") {
server.closeAllConnections()
}
if ("closeIdleConnections" in server && typeof server.closeIdleConnections === "function") {
server.closeIdleConnections()
}
}
})
return closing
},
const originalStop = server.stop.bind(server)
server.stop = async (closeActiveConnections?: boolean) => {
if (shouldPublishMDNS) MDNS.unpublish()
return originalStop(closeActiveConnections)
}
return server
}
}
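The `listen` rewrite above swaps the Node adaptor server for `Bun.serve` with a try-preferred-then-fallback port strategy. A runtime-agnostic sketch of that fallback pattern, with the bind step injected (the real code calls `Bun.serve`):

```typescript
// Sketch of the port-fallback strategy used above: when the caller asks for
// port 0, first try a preferred port (4096) and fall back to an OS-assigned
// port; otherwise bind exactly the requested port.
function serveWithFallback<S>(bind: (port: number) => S, requested: number, preferred = 4096): S {
  const tryBind = (port: number) => {
    try {
      return bind(port)
    } catch {
      return undefined
    }
  }
  const server = requested === 0 ? (tryBind(preferred) ?? tryBind(0)) : tryBind(requested)
  if (!server) throw new Error(`Failed to start server on port ${requested}`)
  return server
}
```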

View File

@@ -63,13 +63,7 @@ export namespace SessionCompaction {
export const layer: Layer.Layer<
Service,
never,
| Bus.Service
| Config.Service
| Session.Service
| Agent.Service
| Plugin.Service
| SessionProcessor.Service
| Provider.Service
Bus.Service | Config.Service | Session.Service | Agent.Service | Plugin.Service | SessionProcessor.Service
> = Layer.effect(
Service,
Effect.gen(function* () {
@@ -79,7 +73,6 @@ export namespace SessionCompaction {
const agents = yield* Agent.Service
const plugin = yield* Plugin.Service
const processors = yield* SessionProcessor.Service
const provider = yield* Provider.Service
const isOverflow = Effect.fn("SessionCompaction.isOverflow")(function* (input: {
tokens: MessageV2.Assistant["tokens"]
@@ -177,9 +170,11 @@ export namespace SessionCompaction {
}
const agent = yield* agents.get("compaction")
const model = agent.model
? yield* provider.getModel(agent.model.providerID, agent.model.modelID)
: yield* provider.getModel(userMessage.model.providerID, userMessage.model.modelID)
const model = yield* Effect.promise(() =>
agent.model
? Provider.getModel(agent.model.providerID, agent.model.modelID)
: Provider.getModel(userMessage.model.providerID, userMessage.model.modelID),
)
// Allow plugins to inject context or replace compaction prompt.
const compacting = yield* plugin.trigger(
"experimental.session.compacting",
@@ -382,7 +377,6 @@ When constructing the summary, try to stick to this template:
export const defaultLayer = Layer.unwrap(
Effect.sync(() =>
layer.pipe(
Layer.provide(Provider.defaultLayer),
Layer.provide(Session.defaultLayer),
Layer.provide(SessionProcessor.defaultLayer),
Layer.provide(Agent.defaultLayer),

View File

@@ -294,10 +294,12 @@ export namespace SessionProcessor {
}
ctx.snapshot = undefined
}
SessionSummary.summarize({
sessionID: ctx.sessionID,
messageID: ctx.assistantMessage.parentID,
})
yield* Effect.promise(() =>
SessionSummary.summarize({
sessionID: ctx.sessionID,
messageID: ctx.assistantMessage.parentID,
}),
).pipe(Effect.ignoreCause({ log: true, message: "session summary failed" }), Effect.forkDetach)
if (
!ctx.assistantMessage.summary &&
isOverflow({ cfg: yield* config.get(), tokens: usage.tokens, model: ctx.model })

View File

@@ -84,7 +84,6 @@ export namespace SessionPrompt {
const status = yield* SessionStatus.Service
const sessions = yield* Session.Service
const agents = yield* Agent.Service
const provider = yield* Provider.Service
const processor = yield* SessionProcessor.Service
const compaction = yield* SessionCompaction.Service
const plugin = yield* Plugin.Service
@@ -207,14 +206,14 @@ export namespace SessionPrompt {
const ag = yield* agents.get("title")
if (!ag) return
const mdl = ag.model
? yield* provider.getModel(ag.model.providerID, ag.model.modelID)
: ((yield* provider.getSmallModel(input.providerID)) ??
(yield* provider.getModel(input.providerID, input.modelID)))
const msgs = onlySubtasks
? [{ role: "user" as const, content: subtasks.map((p) => p.prompt).join("\n") }]
: yield* Effect.promise(() => MessageV2.toModelMessages(context, mdl))
const text = yield* Effect.promise(async (signal) => {
const mdl = ag.model
? await Provider.getModel(ag.model.providerID, ag.model.modelID)
: ((await Provider.getSmallModel(input.providerID)) ??
(await Provider.getModel(input.providerID, input.modelID)))
const msgs = onlySubtasks
? [{ role: "user" as const, content: subtasks.map((p) => p.prompt).join("\n") }]
: await MessageV2.toModelMessages(context, mdl)
const result = await LLM.stream({
agent: ag,
user: firstInfo,
@@ -933,35 +932,21 @@ NOTE: At any point in time through this workflow you should feel free to ask the
return { info: msg, parts: [part] }
})
const getModel = Effect.fn("SessionPrompt.getModel")(function* (
providerID: ProviderID,
modelID: ModelID,
sessionID: SessionID,
) {
const exit = yield* provider.getModel(providerID, modelID).pipe(Effect.exit)
if (Exit.isSuccess(exit)) return exit.value
const err = Cause.squash(exit.cause)
if (Provider.ModelNotFoundError.isInstance(err)) {
const hint = err.data.suggestions?.length ? ` Did you mean: ${err.data.suggestions.join(", ")}?` : ""
yield* bus.publish(Session.Event.Error, {
sessionID,
error: new NamedError.Unknown({
message: `Model not found: ${err.data.providerID}/${err.data.modelID}.${hint}`,
}).toObject(),
})
}
return yield* Effect.failCause(exit.cause)
})
const lastModel = Effect.fnUntraced(function* (sessionID: SessionID) {
const model = yield* Effect.promise(async () => {
for await (const item of MessageV2.stream(sessionID)) {
if (item.info.role === "user" && item.info.model) return item.info.model
}
})
if (model) return model
return yield* provider.defaultModel()
})
const getModel = (providerID: ProviderID, modelID: ModelID, sessionID: SessionID) =>
Effect.promise(() =>
Provider.getModel(providerID, modelID).catch((e) => {
if (Provider.ModelNotFoundError.isInstance(e)) {
const hint = e.data.suggestions?.length ? ` Did you mean: ${e.data.suggestions.join(", ")}?` : ""
Bus.publish(Session.Event.Error, {
sessionID,
error: new NamedError.Unknown({
message: `Model not found: ${e.data.providerID}/${e.data.modelID}.${hint}`,
}).toObject(),
})
}
throw e
}),
)
const createUserMessage = Effect.fn("SessionPrompt.createUserMessage")(function* (input: PromptInput) {
const agentName = input.agent || (yield* agents.defaultAgent())
@@ -975,12 +960,9 @@ NOTE: At any point in time through this workflow you should feel free to ask the
}
const model = input.model ?? ag.model ?? (yield* lastModel(input.sessionID))
const same = ag.model && model.providerID === ag.model.providerID && model.modelID === ag.model.modelID
const full =
!input.variant && ag.variant && same
? yield* provider
.getModel(model.providerID, model.modelID)
.pipe(Effect.catch(() => Effect.succeed(undefined)))
!input.variant && ag.variant
? yield* Effect.promise(() => Provider.getModel(model.providerID, model.modelID).catch(() => undefined))
: undefined
const variant = input.variant ?? (ag.variant && full?.variants?.[ag.variant] ? ag.variant : undefined)
@@ -1127,7 +1109,7 @@ NOTE: At any point in time through this workflow you should feel free to ask the
]
const read = yield* Effect.promise(() => ReadTool.init()).pipe(
Effect.flatMap((t) =>
provider.getModel(info.model.providerID, info.model.modelID).pipe(
Effect.promise(() => Provider.getModel(info.model.providerID, info.model.modelID)).pipe(
Effect.flatMap((mdl) =>
Effect.promise(() =>
t.execute(args, {
@@ -1729,7 +1711,6 @@ NOTE: At any point in time through this workflow you should feel free to ask the
Layer.provide(FileTime.defaultLayer),
Layer.provide(ToolRegistry.defaultLayer),
Layer.provide(Truncate.layer),
Layer.provide(Provider.defaultLayer),
Layer.provide(AppFileSystem.defaultLayer),
Layer.provide(Plugin.defaultLayer),
Layer.provide(Session.defaultLayer),
@@ -1875,6 +1856,15 @@ NOTE: At any point in time through this workflow you should feel free to ask the
return runPromise((svc) => svc.command(CommandInput.parse(input)))
}
const lastModel = Effect.fnUntraced(function* (sessionID: SessionID) {
return yield* Effect.promise(async () => {
for await (const item of MessageV2.stream(sessionID)) {
if (item.info.role === "user" && item.info.model) return item.info.model
}
return Provider.defaultModel()
})
})
/** @internal Exported for testing */
export function createStructuredOutputTool(input: {
schema: Record<string, any>
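The rewritten `getModel` above converts a `ModelNotFoundError` into a session error event with a "did you mean" hint before rethrowing. A minimal sketch of that hint formatting, with the error shape reduced to the fields the hint needs:

```typescript
// Sketch of the "did you mean" message construction used above when a model
// lookup fails; the error type here is a reduced stand-in.
type ModelNotFound = { providerID: string; modelID: string; suggestions?: string[] }

function modelNotFoundMessage(err: ModelNotFound): string {
  const hint = err.suggestions?.length ? ` Did you mean: ${err.suggestions.join(", ")}?` : ""
  return `Model not found: ${err.providerID}/${err.modelID}.${hint}`
}
```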

View File

@@ -1,14 +1,12 @@
import z from "zod"
import { Effect, Layer, ServiceMap } from "effect"
import { makeRuntime } from "@/effect/run-service"
import { Bus } from "../bus"
import { Snapshot } from "../snapshot"
import { Storage } from "@/storage/storage"
import { SyncEvent } from "../sync"
import { Log } from "../util/log"
import { Session } from "."
import { MessageV2 } from "./message-v2"
import { SessionID, MessageID, PartID } from "./schema"
import { Snapshot } from "../snapshot"
import { MessageV2 } from "./message-v2"
import { Session } from "."
import { Log } from "../util/log"
import { SyncEvent } from "../sync"
import { Storage } from "@/storage/storage"
import { Bus } from "../bus"
import { SessionPrompt } from "./prompt"
import { SessionSummary } from "./summary"
@@ -22,152 +20,116 @@ export namespace SessionRevert {
})
export type RevertInput = z.infer<typeof RevertInput>
export interface Interface {
readonly revert: (input: RevertInput) => Effect.Effect<Session.Info>
readonly unrevert: (input: { sessionID: SessionID }) => Effect.Effect<Session.Info>
readonly cleanup: (session: Session.Info) => Effect.Effect<void>
}
export class Service extends ServiceMap.Service<Service, Interface>()("@opencode/SessionRevert") {}
export const layer = Layer.effect(
Service,
Effect.gen(function* () {
const sessions = yield* Session.Service
const snap = yield* Snapshot.Service
const storage = yield* Storage.Service
const bus = yield* Bus.Service
const revert = Effect.fn("SessionRevert.revert")(function* (input: RevertInput) {
yield* Effect.promise(() => SessionPrompt.assertNotBusy(input.sessionID))
const all = yield* sessions.messages({ sessionID: input.sessionID })
let lastUser: MessageV2.User | undefined
const session = yield* sessions.get(input.sessionID)
let rev: Session.Info["revert"]
const patches: Snapshot.Patch[] = []
for (const msg of all) {
if (msg.info.role === "user") lastUser = msg.info
const remaining = []
for (const part of msg.parts) {
if (rev) {
if (part.type === "patch") patches.push(part)
continue
}
if (!rev) {
if ((msg.info.id === input.messageID && !input.partID) || part.id === input.partID) {
const partID = remaining.some((item) => ["text", "tool"].includes(item.type)) ? input.partID : undefined
rev = {
messageID: !partID && lastUser ? lastUser.id : msg.info.id,
partID,
}
}
remaining.push(part)
}
}
}
if (!rev) return session
rev.snapshot = session.revert?.snapshot ?? (yield* snap.track())
yield* snap.revert(patches)
if (rev.snapshot) rev.diff = yield* snap.diff(rev.snapshot as string)
const range = all.filter((msg) => msg.info.id >= rev!.messageID)
const diffs = yield* Effect.promise(() => SessionSummary.computeDiff({ messages: range }))
yield* storage.write(["session_diff", input.sessionID], diffs).pipe(Effect.ignore)
yield* bus.publish(Session.Event.Diff, { sessionID: input.sessionID, diff: diffs })
yield* sessions.setRevert({
sessionID: input.sessionID,
revert: rev,
summary: {
additions: diffs.reduce((sum, x) => sum + x.additions, 0),
deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
files: diffs.length,
},
})
return yield* sessions.get(input.sessionID)
})
const unrevert = Effect.fn("SessionRevert.unrevert")(function* (input: { sessionID: SessionID }) {
log.info("unreverting", input)
yield* Effect.promise(() => SessionPrompt.assertNotBusy(input.sessionID))
const session = yield* sessions.get(input.sessionID)
if (!session.revert) return session
if (session.revert.snapshot) yield* snap.restore(session.revert!.snapshot!)
yield* sessions.clearRevert(input.sessionID)
return yield* sessions.get(input.sessionID)
})
const cleanup = Effect.fn("SessionRevert.cleanup")(function* (session: Session.Info) {
if (!session.revert) return
const sessionID = session.id
const msgs = yield* sessions.messages({ sessionID })
const messageID = session.revert.messageID
const remove = [] as MessageV2.WithParts[]
let target: MessageV2.WithParts | undefined
for (const msg of msgs) {
if (msg.info.id < messageID) continue
if (msg.info.id > messageID) {
remove.push(msg)
continue
}
if (session.revert.partID) {
target = msg
continue
}
remove.push(msg)
}
for (const msg of remove) {
SyncEvent.run(MessageV2.Event.Removed, {
sessionID,
messageID: msg.info.id,
})
}
if (session.revert.partID && target) {
const partID = session.revert.partID
const idx = target.parts.findIndex((part) => part.id === partID)
if (idx >= 0) {
const removeParts = target.parts.slice(idx)
target.parts = target.parts.slice(0, idx)
for (const part of removeParts) {
SyncEvent.run(MessageV2.Event.PartRemoved, {
sessionID,
messageID: target.info.id,
partID: part.id,
})
}
}
}
yield* sessions.clearRevert(sessionID)
})
return Service.of({ revert, unrevert, cleanup })
}),
)
export const defaultLayer = Layer.unwrap(
Effect.sync(() =>
layer.pipe(
Layer.provide(Session.defaultLayer),
Layer.provide(Snapshot.defaultLayer),
Layer.provide(Storage.defaultLayer),
Layer.provide(Bus.layer),
),
),
)
const { runPromise } = makeRuntime(Service, defaultLayer)
export async function revert(input: RevertInput) {
return runPromise((svc) => svc.revert(input))
await SessionPrompt.assertNotBusy(input.sessionID)
const all = await Session.messages({ sessionID: input.sessionID })
let lastUser: MessageV2.User | undefined
const session = await Session.get(input.sessionID)
let revert: Session.Info["revert"]
const patches: Snapshot.Patch[] = []
for (const msg of all) {
if (msg.info.role === "user") lastUser = msg.info
const remaining = []
for (const part of msg.parts) {
if (revert) {
if (part.type === "patch") {
patches.push(part)
}
continue
}
if (!revert) {
if ((msg.info.id === input.messageID && !input.partID) || part.id === input.partID) {
// if no useful parts left in message, same as reverting whole message
const partID = remaining.some((item) => ["text", "tool"].includes(item.type)) ? input.partID : undefined
revert = {
messageID: !partID && lastUser ? lastUser.id : msg.info.id,
partID,
}
}
remaining.push(part)
}
}
}
if (revert) {
const session = await Session.get(input.sessionID)
revert.snapshot = session.revert?.snapshot ?? (await Snapshot.track())
await Snapshot.revert(patches)
if (revert.snapshot) revert.diff = await Snapshot.diff(revert.snapshot)
const rangeMessages = all.filter((msg) => msg.info.id >= revert!.messageID)
const diffs = await SessionSummary.computeDiff({ messages: rangeMessages })
await Storage.write(["session_diff", input.sessionID], diffs)
Bus.publish(Session.Event.Diff, {
sessionID: input.sessionID,
diff: diffs,
})
return Session.setRevert({
sessionID: input.sessionID,
revert,
summary: {
additions: diffs.reduce((sum, x) => sum + x.additions, 0),
deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
files: diffs.length,
},
})
}
return session
}
export async function unrevert(input: { sessionID: SessionID }) {
return runPromise((svc) => svc.unrevert(input))
log.info("unreverting", input)
await SessionPrompt.assertNotBusy(input.sessionID)
const session = await Session.get(input.sessionID)
if (!session.revert) return session
if (session.revert.snapshot) await Snapshot.restore(session.revert.snapshot)
return Session.clearRevert(input.sessionID)
}
export async function cleanup(session: Session.Info) {
return runPromise((svc) => svc.cleanup(session))
if (!session.revert) return
const sessionID = session.id
const msgs = await Session.messages({ sessionID })
const messageID = session.revert.messageID
const remove = [] as MessageV2.WithParts[]
let target: MessageV2.WithParts | undefined
for (const msg of msgs) {
if (msg.info.id < messageID) {
continue
}
if (msg.info.id > messageID) {
remove.push(msg)
continue
}
if (session.revert.partID) {
target = msg
continue
}
remove.push(msg)
}
for (const msg of remove) {
SyncEvent.run(MessageV2.Event.Removed, {
sessionID: sessionID,
messageID: msg.info.id,
})
}
if (session.revert.partID && target) {
const partID = session.revert.partID
const removeStart = target.parts.findIndex((part) => part.id === partID)
if (removeStart >= 0) {
const preserveParts = target.parts.slice(0, removeStart)
const removeParts = target.parts.slice(removeStart)
target.parts = preserveParts
for (const part of removeParts) {
SyncEvent.run(MessageV2.Event.PartRemoved, {
sessionID: sessionID,
messageID: target.info.id,
partID: part.id,
})
}
}
}
await Session.clearRevert(sessionID)
}
}
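The revert scan above walks messages in order, remembers the last user message, and picks the revert boundary at the first matching message or part. A simplified, storage-free sketch of that boundary selection (types reduced to the essentials; patch collection omitted):

```typescript
// Simplified sketch of the revert-boundary scan: walk messages in order,
// track the last user message, and stop at the first message/part match.
// If no useful parts precede the target part, revert the whole message,
// falling back to the last user message's id.
type Part = { id: string; type: string }
type Msg = { id: string; role: "user" | "assistant"; parts: Part[] }

function findRevert(all: Msg[], messageID: string, partID?: string) {
  let lastUser: Msg | undefined
  for (const msg of all) {
    if (msg.role === "user") lastUser = msg
    const remaining: Part[] = []
    for (const part of msg.parts) {
      if ((msg.id === messageID && !partID) || part.id === partID) {
        const keepPart = remaining.some((p) => ["text", "tool"].includes(p.type)) ? partID : undefined
        return { messageID: !keepPart && lastUser ? lastUser.id : msg.id, partID: keepPart }
      }
      remaining.push(part)
    }
  }
  return undefined
}
```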

View File

@@ -1,12 +1,14 @@
import { fn } from "@/util/fn"
import z from "zod"
import { Effect, Layer, ServiceMap } from "effect"
import { makeRuntime } from "@/effect/run-service"
import { Bus } from "@/bus"
import { Snapshot } from "@/snapshot"
import { Storage } from "@/storage/storage"
import { Session } from "."
import { MessageV2 } from "./message-v2"
import { SessionID, MessageID } from "./schema"
import { Snapshot } from "@/snapshot"
import { Storage } from "@/storage/storage"
import { Bus } from "@/bus"
import { NotFoundError } from "@/storage/db"
export namespace SessionSummary {
function unquoteGitPath(input: string) {
@@ -65,117 +67,103 @@ export namespace SessionSummary {
return Buffer.from(bytes).toString()
}
export interface Interface {
readonly summarize: (input: { sessionID: SessionID; messageID: MessageID }) => Effect.Effect<void>
readonly diff: (input: { sessionID: SessionID; messageID?: MessageID }) => Effect.Effect<Snapshot.FileDiff[]>
readonly computeDiff: (input: { messages: MessageV2.WithParts[] }) => Effect.Effect<Snapshot.FileDiff[]>
}
export class Service extends ServiceMap.Service<Service, Interface>()("@opencode/SessionSummary") {}
export const layer = Layer.effect(
Service,
Effect.gen(function* () {
const sessions = yield* Session.Service
const snapshot = yield* Snapshot.Service
const storage = yield* Storage.Service
const bus = yield* Bus.Service
const computeDiff = Effect.fn("SessionSummary.computeDiff")(function* (input: {
messages: MessageV2.WithParts[]
}) {
let from: string | undefined
let to: string | undefined
for (const item of input.messages) {
if (!from) {
for (const part of item.parts) {
if (part.type === "step-start" && part.snapshot) {
from = part.snapshot
break
}
}
}
for (const part of item.parts) {
if (part.type === "step-finish" && part.snapshot) to = part.snapshot
}
}
if (from && to) return yield* snapshot.diffFull(from, to)
return []
})
const summarize = Effect.fn("SessionSummary.summarize")(function* (input: {
sessionID: SessionID
messageID: MessageID
}) {
const all = yield* sessions.messages({ sessionID: input.sessionID })
if (!all.length) return
const diffs = yield* computeDiff({ messages: all })
yield* sessions.setSummary({
sessionID: input.sessionID,
summary: {
additions: diffs.reduce((sum, x) => sum + x.additions, 0),
deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
files: diffs.length,
},
})
yield* storage.write(["session_diff", input.sessionID], diffs).pipe(Effect.ignore)
yield* bus.publish(Session.Event.Diff, { sessionID: input.sessionID, diff: diffs })
const messages = all.filter(
(m) => m.info.id === input.messageID || (m.info.role === "assistant" && m.info.parentID === input.messageID),
)
const target = messages.find((m) => m.info.id === input.messageID)
if (!target || target.info.role !== "user") return
const msgDiffs = yield* computeDiff({ messages })
target.info.summary = { ...target.info.summary, diffs: msgDiffs }
yield* sessions.updateMessage(target.info)
})
const diff = Effect.fn("SessionSummary.diff")(function* (input: { sessionID: SessionID; messageID?: MessageID }) {
const diffs = yield* storage
.read<Snapshot.FileDiff[]>(["session_diff", input.sessionID])
.pipe(Effect.catch(() => Effect.succeed([] as Snapshot.FileDiff[])))
const next = diffs.map((item) => {
const file = unquoteGitPath(item.file)
if (file === item.file) return item
return { ...item, file }
})
const changed = next.some((item, i) => item.file !== diffs[i]?.file)
if (changed) yield* storage.write(["session_diff", input.sessionID], next).pipe(Effect.ignore)
return next
})
return Service.of({ summarize, diff, computeDiff })
export const summarize = fn(
z.object({
sessionID: SessionID.zod,
messageID: MessageID.zod,
}),
async (input) => {
await Session.messages({ sessionID: input.sessionID })
.then((all) =>
Promise.all([
summarizeSession({ sessionID: input.sessionID, messages: all }),
summarizeMessage({ messageID: input.messageID, messages: all }),
]),
)
.catch((err) => {
if (NotFoundError.isInstance(err)) return
throw err
})
},
)
export const defaultLayer = Layer.unwrap(
Effect.sync(() =>
layer.pipe(
Layer.provide(Session.defaultLayer),
Layer.provide(Snapshot.defaultLayer),
Layer.provide(Storage.defaultLayer),
Layer.provide(Bus.layer),
),
),
)
const { runPromise } = makeRuntime(Service, defaultLayer)
export const summarize = (input: { sessionID: SessionID; messageID: MessageID }) =>
void runPromise((svc) => svc.summarize(input)).catch(() => {})
export const DiffInput = z.object({
sessionID: SessionID.zod,
messageID: MessageID.zod.optional(),
})
export async function diff(input: z.infer<typeof DiffInput>) {
return runPromise((svc) => svc.diff(input))
async function summarizeSession(input: { sessionID: SessionID; messages: MessageV2.WithParts[] }) {
const diffs = await computeDiff({ messages: input.messages })
await Session.setSummary({
sessionID: input.sessionID,
summary: {
additions: diffs.reduce((sum, x) => sum + x.additions, 0),
deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
files: diffs.length,
},
})
await Storage.write(["session_diff", input.sessionID], diffs)
Bus.publish(Session.Event.Diff, {
sessionID: input.sessionID,
diff: diffs,
})
}
async function summarizeMessage(input: { messageID: string; messages: MessageV2.WithParts[] }) {
const messages = input.messages.filter(
(m) => m.info.id === input.messageID || (m.info.role === "assistant" && m.info.parentID === input.messageID),
)
const msgWithParts = messages.find((m) => m.info.id === input.messageID)
if (!msgWithParts || msgWithParts.info.role !== "user") return
const userMsg = msgWithParts.info
const diffs = await computeDiff({ messages })
userMsg.summary = {
...userMsg.summary,
diffs,
}
await Session.updateMessage(userMsg)
}
export const diff = fn(
z.object({
sessionID: SessionID.zod,
messageID: MessageID.zod.optional(),
}),
async (input) => {
const diffs = await Storage.read<Snapshot.FileDiff[]>(["session_diff", input.sessionID]).catch(() => [])
const next = diffs.map((item) => {
const file = unquoteGitPath(item.file)
if (file === item.file) return item
return {
...item,
file,
}
})
const changed = next.some((item, i) => item.file !== diffs[i]?.file)
if (changed) Storage.write(["session_diff", input.sessionID], next).catch(() => {})
return next
},
)
export async function computeDiff(input: { messages: MessageV2.WithParts[] }) {
return runPromise((svc) => svc.computeDiff(input))
let from: string | undefined
let to: string | undefined
// scan assistant messages to find earliest from and latest to
// snapshot
for (const item of input.messages) {
if (!from) {
for (const part of item.parts) {
if (part.type === "step-start" && part.snapshot) {
from = part.snapshot
break
}
}
}
for (const part of item.parts) {
if (part.type === "step-finish" && part.snapshot) {
to = part.snapshot
}
}
}
if (from && to) return Snapshot.diffFull(from, to)
return []
}
}
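The session summary written above (and in `summarizeSession` earlier) is a simple reduction over the per-file diffs. A sketch of that aggregation, with `FileDiff` reduced to the counted fields:

```typescript
// Sketch of the summary aggregation used above: totals over per-file diffs.
type FileDiff = { file: string; additions: number; deletions: number }

function summarizeDiffs(diffs: FileDiff[]) {
  return {
    additions: diffs.reduce((sum, x) => sum + x.additions, 0),
    deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
    files: diffs.length,
  }
}
```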

View File

@@ -1,47 +0,0 @@
/** @jsxImportSource @opentui/solid */
import { expect, test } from "bun:test"
import { createSlot, createSolidSlotRegistry, testRender, useRenderer } from "@opentui/solid"
import { onMount } from "solid-js"
type Slots = {
prompt: {}
}
test("replace slot mounts plugin content once", async () => {
let mounts = 0
const Probe = () => {
onMount(() => {
mounts += 1
})
return <box />
}
const App = () => {
const renderer = useRenderer()
const reg = createSolidSlotRegistry<Slots>(renderer, {})
const Slot = createSlot(reg)
reg.register({
id: "plugin",
slots: {
prompt() {
return <Probe />
},
},
})
return (
<box>
<Slot name="prompt" mode="replace">
<box />
</Slot>
</box>
)
}
await testRender(() => <App />)
expect(mounts).toBe(1)
})

View File

@@ -1,81 +0,0 @@
import { Effect, Layer } from "effect"
import { Provider } from "../../src/provider/provider"
import { ModelID, ProviderID } from "../../src/provider/schema"
export namespace ProviderTest {
export function model(override: Partial<Provider.Model> = {}): Provider.Model {
const id = override.id ?? ModelID.make("gpt-5.2")
const providerID = override.providerID ?? ProviderID.make("openai")
return {
id,
providerID,
name: "Test Model",
capabilities: {
toolcall: true,
attachment: false,
reasoning: false,
temperature: true,
interleaved: false,
input: { text: true, image: false, audio: false, video: false, pdf: false },
output: { text: true, image: false, audio: false, video: false, pdf: false },
},
api: { id, url: "https://example.com", npm: "@ai-sdk/openai" },
cost: { input: 0, output: 0, cache: { read: 0, write: 0 } },
limit: { context: 200_000, output: 10_000 },
status: "active",
options: {},
headers: {},
release_date: "2025-01-01",
...override,
}
}
export function info(override: Partial<Provider.Info> = {}, mdl = model()): Provider.Info {
const id = override.id ?? mdl.providerID
return {
id,
name: "Test Provider",
source: "config",
env: [],
options: {},
models: { [mdl.id]: mdl },
...override,
}
}
export function fake(override: Partial<Provider.Interface> & { model?: Provider.Model; info?: Provider.Info } = {}) {
const mdl = override.model ?? model()
const row = override.info ?? info({}, mdl)
return {
model: mdl,
info: row,
layer: Layer.succeed(
Provider.Service,
Provider.Service.of({
list: Effect.fn("TestProvider.list")(() => Effect.succeed({ [row.id]: row })),
getProvider: Effect.fn("TestProvider.getProvider")((providerID) => {
if (providerID === row.id) return Effect.succeed(row)
return Effect.die(new Error(`Unknown test provider: ${providerID}`))
}),
getModel: Effect.fn("TestProvider.getModel")((providerID, modelID) => {
if (providerID === row.id && modelID === mdl.id) return Effect.succeed(mdl)
return Effect.die(new Error(`Unknown test model: ${providerID}/${modelID}`))
}),
getLanguage: Effect.fn("TestProvider.getLanguage")(() =>
Effect.die(new Error("ProviderTest.getLanguage not configured")),
),
closest: Effect.fn("TestProvider.closest")((providerID) =>
Effect.succeed(providerID === row.id ? { providerID: row.id, modelID: mdl.id } : undefined),
),
getSmallModel: Effect.fn("TestProvider.getSmallModel")((providerID) =>
Effect.succeed(providerID === row.id ? mdl : undefined),
),
defaultModel: Effect.fn("TestProvider.defaultModel")(() =>
Effect.succeed({ providerID: row.id, modelID: mdl.id }),
),
...override,
}),
),
}
}
}
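`ProviderTest.model()` and `info()` above both use a defaults-plus-`Partial<T>` override pattern: a fully populated default object with `...override` spread last so callers only spell out the fields a given test cares about. A dependency-free sketch of the same idea (the `Model` shape here is a made-up stand-in, not the project's):

```typescript
// Defaults + Partial<T> override, as used by ProviderTest.model()/info() above.
// Model is a hypothetical illustration shape.
type Model = { id: string; name: string; context: number }

function model(override: Partial<Model> = {}): Model {
  return {
    id: "gpt-5.2",      // defaults come first...
    name: "Test Model",
    context: 200_000,
    ...override,        // ...so any override field wins
  }
}
```

This is what makes call sites like the `wide()` helper later in the diff so terse: `model({ context: 100_000 })` changes one field and inherits the rest.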

View File

@@ -64,7 +64,9 @@ describe("Format", () => {
),
)
it.live("service initializes without error", () => provideTmpdirInstance(() => Format.Service.use(() => Effect.void)))
it.live("service initializes without error", () =>
provideTmpdirInstance(() => Format.Service.use(() => Effect.void)),
)
it.live("status() initializes formatter state per directory", () =>
Effect.gen(function* () {

View File

@@ -1,4 +1,4 @@
import { afterEach, describe, expect, mock, test } from "bun:test"
import { afterEach, describe, expect, mock, spyOn, test } from "bun:test"
import { APICallError } from "ai"
import { Cause, Effect, Exit, Layer, ManagedRuntime } from "effect"
import * as Stream from "effect/Stream"
@@ -20,9 +20,9 @@ import { MessageID, PartID, SessionID } from "../../src/session/schema"
import { SessionStatus } from "../../src/session/status"
import { ModelID, ProviderID } from "../../src/provider/schema"
import type { Provider } from "../../src/provider/provider"
import * as ProviderModule from "../../src/provider/provider"
import * as SessionProcessorModule from "../../src/session/processor"
import { Snapshot } from "../../src/snapshot"
import { ProviderTest } from "../fake/provider"
Log.init({ print: false })
@@ -65,8 +65,6 @@ function createModel(opts: {
} as Provider.Model
}
const wide = () => ProviderTest.fake({ model: createModel({ context: 100_000, output: 32_000 }) })
async function user(sessionID: SessionID, text: string) {
const msg = await Session.updateMessage({
id: MessageID.ascending(),
@@ -164,11 +162,10 @@ function layer(result: "continue" | "compact") {
)
}
function runtime(result: "continue" | "compact", plugin = Plugin.defaultLayer, provider = ProviderTest.fake()) {
function runtime(result: "continue" | "compact", plugin = Plugin.defaultLayer) {
const bus = Bus.layer
return ManagedRuntime.make(
Layer.mergeAll(SessionCompaction.layer, bus).pipe(
Layer.provide(provider.layer),
Layer.provide(Session.defaultLayer),
Layer.provide(layer(result)),
Layer.provide(Agent.defaultLayer),
@@ -201,13 +198,12 @@ function llm() {
}
}
function liveRuntime(layer: Layer.Layer<LLM.Service>, provider = ProviderTest.fake()) {
function liveRuntime(layer: Layer.Layer<LLM.Service>) {
const bus = Bus.layer
const status = SessionStatus.layer.pipe(Layer.provide(bus))
const processor = SessionProcessorModule.SessionProcessor.layer
return ManagedRuntime.make(
Layer.mergeAll(SessionCompaction.layer.pipe(Layer.provide(processor)), processor, bus, status).pipe(
Layer.provide(provider.layer),
Layer.provide(Session.defaultLayer),
Layer.provide(Snapshot.defaultLayer),
Layer.provide(layer),
@@ -548,12 +544,14 @@ describe("session.compaction.process", () => {
await Instance.provide({
directory: tmp.path,
fn: async () => {
spyOn(ProviderModule.Provider, "getModel").mockResolvedValue(createModel({ context: 100_000, output: 32_000 }))
const session = await Session.create({})
const msg = await user(session.id, "hello")
const msgs = await Session.messages({ sessionID: session.id })
const done = defer()
let seen = false
const rt = runtime("continue", Plugin.defaultLayer, wide())
const rt = runtime("continue")
let unsub: (() => void) | undefined
try {
unsub = await rt.runPromise(
@@ -598,9 +596,11 @@ describe("session.compaction.process", () => {
await Instance.provide({
directory: tmp.path,
fn: async () => {
spyOn(ProviderModule.Provider, "getModel").mockResolvedValue(createModel({ context: 100_000, output: 32_000 }))
const session = await Session.create({})
const msg = await user(session.id, "hello")
const rt = runtime("compact", Plugin.defaultLayer, wide())
const rt = runtime("compact")
try {
const msgs = await Session.messages({ sessionID: session.id })
const result = await rt.runPromise(
@@ -636,9 +636,11 @@ describe("session.compaction.process", () => {
await Instance.provide({
directory: tmp.path,
fn: async () => {
spyOn(ProviderModule.Provider, "getModel").mockResolvedValue(createModel({ context: 100_000, output: 32_000 }))
const session = await Session.create({})
const msg = await user(session.id, "hello")
const rt = runtime("continue", Plugin.defaultLayer, wide())
const rt = runtime("continue")
try {
const msgs = await Session.messages({ sessionID: session.id })
const result = await rt.runPromise(
@@ -676,6 +678,8 @@ describe("session.compaction.process", () => {
await Instance.provide({
directory: tmp.path,
fn: async () => {
spyOn(ProviderModule.Provider, "getModel").mockResolvedValue(createModel({ context: 100_000, output: 32_000 }))
const session = await Session.create({})
await user(session.id, "root")
const replay = await user(session.id, "image")
@@ -689,7 +693,7 @@ describe("session.compaction.process", () => {
url: "https://example.com/cat.png",
})
const msg = await user(session.id, "current")
const rt = runtime("continue", Plugin.defaultLayer, wide())
const rt = runtime("continue")
try {
const msgs = await Session.messages({ sessionID: session.id })
const result = await rt.runPromise(
@@ -724,11 +728,13 @@ describe("session.compaction.process", () => {
await Instance.provide({
directory: tmp.path,
fn: async () => {
spyOn(ProviderModule.Provider, "getModel").mockResolvedValue(createModel({ context: 100_000, output: 32_000 }))
const session = await Session.create({})
await user(session.id, "earlier")
const msg = await user(session.id, "current")
const rt = runtime("continue", Plugin.defaultLayer, wide())
const rt = runtime("continue")
try {
const msgs = await Session.messages({ sessionID: session.id })
const result = await rt.runPromise(
@@ -784,11 +790,13 @@ describe("session.compaction.process", () => {
await Instance.provide({
directory: tmp.path,
fn: async () => {
spyOn(ProviderModule.Provider, "getModel").mockResolvedValue(createModel({ context: 100_000, output: 32_000 }))
const session = await Session.create({})
const msg = await user(session.id, "hello")
const msgs = await Session.messages({ sessionID: session.id })
const abort = new AbortController()
const rt = liveRuntime(stub.layer, wide())
const rt = liveRuntime(stub.layer)
let off: (() => void) | undefined
let run: Promise<"continue" | "stop"> | undefined
try {
@@ -858,11 +866,13 @@ describe("session.compaction.process", () => {
await Instance.provide({
directory: tmp.path,
fn: async () => {
spyOn(ProviderModule.Provider, "getModel").mockResolvedValue(createModel({ context: 100_000, output: 32_000 }))
const session = await Session.create({})
const msg = await user(session.id, "hello")
const msgs = await Session.messages({ sessionID: session.id })
const abort = new AbortController()
const rt = runtime("continue", plugin(ready), wide())
const rt = runtime("continue", plugin(ready))
let run: Promise<"continue" | "stop"> | undefined
try {
run = rt
@@ -960,9 +970,11 @@ describe("session.compaction.process", () => {
await Instance.provide({
directory: tmp.path,
fn: async () => {
spyOn(ProviderModule.Provider, "getModel").mockResolvedValue(createModel({ context: 100_000, output: 32_000 }))
const session = await Session.create({})
const msg = await user(session.id, "hello")
const rt = liveRuntime(stub.layer, wide())
const rt = liveRuntime(stub.layer)
try {
const msgs = await Session.messages({ sessionID: session.id })
await rt.runPromise(

View File

@@ -0,0 +1,247 @@
import { describe, expect, spyOn, test } from "bun:test"
import { Instance } from "../../src/project/instance"
import { Provider } from "../../src/provider/provider"
import { Session } from "../../src/session"
import { MessageV2 } from "../../src/session/message-v2"
import { SessionPrompt } from "../../src/session/prompt"
import { SessionStatus } from "../../src/session/status"
import { MessageID, PartID, SessionID } from "../../src/session/schema"
import { Log } from "../../src/util/log"
import { tmpdir } from "../fixture/fixture"
Log.init({ print: false })
function deferred() {
let resolve!: () => void
const promise = new Promise<void>((done) => {
resolve = done
})
return { promise, resolve }
}
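The `deferred()` helper above gives the tests two one-shot latches: `ready` (the worker signals it reached the blocking point) and `gate` (the test releases it after asserting on in-flight state). A self-contained run of that handshake:

```typescript
// Minimal run of the ready/gate handshake used by the tests above.
function deferred() {
  let resolve!: () => void
  const promise = new Promise<void>((done) => {
    resolve = done
  })
  return { promise, resolve }
}

async function demo(): Promise<string[]> {
  const order: string[] = []
  const ready = deferred()
  const gate = deferred()
  const worker = (async () => {
    order.push("blocked")
    ready.resolve()      // tell the test we reached the blocking point
    await gate.promise   // park until the test releases the gate
    order.push("resumed")
  })()
  await ready.promise
  order.push("asserted-while-busy") // the test inspects in-flight state here
  gate.resolve()
  await worker
  return order
}
```

The same shape appears below with `spyOn(Provider, "getModel")` standing in for the worker's blocking point.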
// Helper: seed a session with a user message + finished assistant message
// so loop() exits immediately without calling any LLM
async function seed(sessionID: SessionID) {
const userMsg: MessageV2.Info = {
id: MessageID.ascending(),
role: "user",
sessionID,
time: { created: Date.now() },
agent: "build",
model: { providerID: "openai" as any, modelID: "gpt-5.2" as any },
}
await Session.updateMessage(userMsg)
await Session.updatePart({
id: PartID.ascending(),
messageID: userMsg.id,
sessionID,
type: "text",
text: "hello",
})
const assistantMsg: MessageV2.Info = {
id: MessageID.ascending(),
role: "assistant",
parentID: userMsg.id,
sessionID,
mode: "build",
agent: "build",
cost: 0,
path: { cwd: "/tmp", root: "/tmp" },
tokens: { input: 0, output: 0, reasoning: 0, cache: { read: 0, write: 0 } },
modelID: "gpt-5.2" as any,
providerID: "openai" as any,
time: { created: Date.now(), completed: Date.now() },
finish: "stop",
}
await Session.updateMessage(assistantMsg)
await Session.updatePart({
id: PartID.ascending(),
messageID: assistantMsg.id,
sessionID,
type: "text",
text: "hi there",
})
return { userMsg, assistantMsg }
}
describe("session.prompt concurrency", () => {
test("loop returns assistant message and sets status to idle", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
await seed(session.id)
const result = await SessionPrompt.loop({ sessionID: session.id })
expect(result.info.role).toBe("assistant")
if (result.info.role === "assistant") expect(result.info.finish).toBe("stop")
const status = await SessionStatus.get(session.id)
expect(status.type).toBe("idle")
},
})
})
test("concurrent loop callers get the same result", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
await seed(session.id)
const [a, b] = await Promise.all([
SessionPrompt.loop({ sessionID: session.id }),
SessionPrompt.loop({ sessionID: session.id }),
])
expect(a.info.id).toBe(b.info.id)
expect(a.info.role).toBe("assistant")
},
})
})
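The concurrent-callers test above asserts that two simultaneous `loop()` calls resolve to the same message id, i.e. in-flight calls for one session coalesce. A dependency-free sketch of that coalescing (an assumed design for illustration, not the real `SessionPrompt` internals):

```typescript
// Per-key in-flight coalescing: concurrent callers share one promise.
// Assumed design, not the project's actual implementation.
const inflight = new Map<string, Promise<string>>()

async function loop(sessionID: string, work: () => Promise<string>): Promise<string> {
  const existing = inflight.get(sessionID)
  if (existing) return existing // join the run already in progress
  const run = work().finally(() => inflight.delete(sessionID))
  inflight.set(sessionID, run)
  return run
}
```

Two `Promise.all`'d calls for the same session then run `work` once and settle with the same value, which is exactly what the test checks via matching `info.id`s.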
test("assertNotBusy throws when loop is running", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
const userMsg: MessageV2.Info = {
id: MessageID.ascending(),
role: "user",
sessionID: session.id,
time: { created: Date.now() },
agent: "build",
model: { providerID: "openai" as any, modelID: "gpt-5.2" as any },
}
await Session.updateMessage(userMsg)
await Session.updatePart({
id: PartID.ascending(),
messageID: userMsg.id,
sessionID: session.id,
type: "text",
text: "hello",
})
const ready = deferred()
const gate = deferred()
const getModel = spyOn(Provider, "getModel").mockImplementation(async () => {
ready.resolve()
await gate.promise
throw new Error("test stop")
})
try {
const loopPromise = SessionPrompt.loop({ sessionID: session.id }).catch(() => undefined)
await ready.promise
await expect(SessionPrompt.assertNotBusy(session.id)).rejects.toBeInstanceOf(Session.BusyError)
gate.resolve()
await loopPromise
} finally {
gate.resolve()
getModel.mockRestore()
}
// After loop completes, assertNotBusy should succeed
await SessionPrompt.assertNotBusy(session.id)
},
})
})
test("cancel sets status to idle", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
// Seed only a user message — loop must call getModel to proceed
const userMsg: MessageV2.Info = {
id: MessageID.ascending(),
role: "user",
sessionID: session.id,
time: { created: Date.now() },
agent: "build",
model: { providerID: "openai" as any, modelID: "gpt-5.2" as any },
}
await Session.updateMessage(userMsg)
await Session.updatePart({
id: PartID.ascending(),
messageID: userMsg.id,
sessionID: session.id,
type: "text",
text: "hello",
})
// Also seed an assistant message so lastAssistant() fallback can find it
const assistantMsg: MessageV2.Info = {
id: MessageID.ascending(),
role: "assistant",
parentID: userMsg.id,
sessionID: session.id,
mode: "build",
agent: "build",
cost: 0,
path: { cwd: "/tmp", root: "/tmp" },
tokens: { input: 0, output: 0, reasoning: 0, cache: { read: 0, write: 0 } },
modelID: "gpt-5.2" as any,
providerID: "openai" as any,
time: { created: Date.now() },
}
await Session.updateMessage(assistantMsg)
await Session.updatePart({
id: PartID.ascending(),
messageID: assistantMsg.id,
sessionID: session.id,
type: "text",
text: "hi there",
})
const ready = deferred()
const gate = deferred()
const getModel = spyOn(Provider, "getModel").mockImplementation(async () => {
ready.resolve()
await gate.promise
throw new Error("test stop")
})
try {
// Start loop — it will block in getModel (assistant has no finish, so loop continues)
const loopPromise = SessionPrompt.loop({ sessionID: session.id })
await ready.promise
await SessionPrompt.cancel(session.id)
const status = await SessionStatus.get(session.id)
expect(status.type).toBe("idle")
// loop should resolve cleanly, not throw "All fibers interrupted"
const result = await loopPromise
expect(result.info.role).toBe("assistant")
expect(result.info.id).toBe(assistantMsg.id)
} finally {
gate.resolve()
getModel.mockRestore()
}
},
})
}, 10000)
test("cancel on idle session just sets idle", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
await SessionPrompt.cancel(session.id)
const status = await SessionStatus.get(session.id)
expect(status.type).toBe("idle")
},
})
})
})

View File

@@ -12,7 +12,6 @@ import { LSP } from "../../src/lsp"
import { MCP } from "../../src/mcp"
import { Permission } from "../../src/permission"
import { Plugin } from "../../src/plugin"
import { Provider as ProviderSvc } from "../../src/provider/provider"
import type { Provider } from "../../src/provider/provider"
import { ModelID, ProviderID } from "../../src/provider/schema"
import { Session } from "../../src/session"
@@ -152,7 +151,6 @@ function makeHttp() {
Permission.layer,
Plugin.defaultLayer,
Config.defaultLayer,
ProviderSvc.defaultLayer,
filetime,
lsp,
mcp,

View File

@@ -10,66 +10,9 @@ import { Instance } from "../../src/project/instance"
import { MessageID, PartID } from "../../src/session/schema"
import { tmpdir } from "../fixture/fixture"
const projectRoot = path.join(__dirname, "../..")
Log.init({ print: false })
function user(sessionID: string, agent = "default") {
return Session.updateMessage({
id: MessageID.ascending(),
role: "user" as const,
sessionID: sessionID as any,
agent,
model: { providerID: ProviderID.make("openai"), modelID: ModelID.make("gpt-4") },
time: { created: Date.now() },
})
}
function assistant(sessionID: string, parentID: string, dir: string) {
return Session.updateMessage({
id: MessageID.ascending(),
role: "assistant" as const,
sessionID: sessionID as any,
mode: "default",
agent: "default",
path: { cwd: dir, root: dir },
cost: 0,
tokens: { output: 0, input: 0, reasoning: 0, cache: { read: 0, write: 0 } },
modelID: ModelID.make("gpt-4"),
providerID: ProviderID.make("openai"),
parentID: parentID as any,
time: { created: Date.now() },
finish: "end_turn",
})
}
function text(sessionID: string, messageID: string, content: string) {
return Session.updatePart({
id: PartID.ascending(),
messageID: messageID as any,
sessionID: sessionID as any,
type: "text" as const,
text: content,
})
}
function tool(sessionID: string, messageID: string) {
return Session.updatePart({
id: PartID.ascending(),
messageID: messageID as any,
sessionID: sessionID as any,
type: "tool" as const,
tool: "bash",
callID: "call-1",
state: {
status: "completed" as const,
input: {},
output: "done",
title: "",
metadata: {},
time: { start: 0, end: 1 },
},
})
}
describe("revert + compact workflow", () => {
test("should properly handle compact command after revert", async () => {
await using tmp = await tmpdir({ git: true })
@@ -340,98 +283,4 @@ describe("revert + compact workflow", () => {
},
})
})
test("cleanup with partID removes parts from the revert point onward", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
const sid = session.id
const u1 = await user(sid)
const p1 = await text(sid, u1.id, "first part")
const p2 = await tool(sid, u1.id)
const p3 = await text(sid, u1.id, "third part")
// Set revert state pointing at a specific part
await Session.setRevert({
sessionID: sid,
revert: { messageID: u1.id, partID: p2.id },
summary: { additions: 0, deletions: 0, files: 0 },
})
const info = await Session.get(sid)
await SessionRevert.cleanup(info)
const msgs = await Session.messages({ sessionID: sid })
expect(msgs.length).toBe(1)
// Only the first part should remain (before the revert partID)
expect(msgs[0].parts.length).toBe(1)
expect(msgs[0].parts[0].id).toBe(p1.id)
const cleared = await Session.get(sid)
expect(cleared.revert).toBeUndefined()
},
})
})
test("cleanup removes messages after revert point but keeps earlier ones", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
const sid = session.id
const u1 = await user(sid)
await text(sid, u1.id, "hello")
const a1 = await assistant(sid, u1.id, tmp.path)
await text(sid, a1.id, "hi back")
const u2 = await user(sid)
await text(sid, u2.id, "second question")
const a2 = await assistant(sid, u2.id, tmp.path)
await text(sid, a2.id, "second answer")
// Revert from u2 onward
await Session.setRevert({
sessionID: sid,
revert: { messageID: u2.id },
summary: { additions: 0, deletions: 0, files: 0 },
})
const info = await Session.get(sid)
await SessionRevert.cleanup(info)
const msgs = await Session.messages({ sessionID: sid })
const ids = msgs.map((m) => m.info.id)
expect(ids).toContain(u1.id)
expect(ids).toContain(a1.id)
expect(ids).not.toContain(u2.id)
expect(ids).not.toContain(a2.id)
},
})
})
test("cleanup is a no-op when session has no revert state", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
const sid = session.id
const u1 = await user(sid)
await text(sid, u1.id, "hello")
const info = await Session.get(sid)
expect(info.revert).toBeUndefined()
await SessionRevert.cleanup(info)
const msgs = await Session.messages({ sessionID: sid })
expect(msgs.length).toBe(1)
},
})
})
})
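The three cleanup tests above pin down the revert semantics: messages from `revert.messageID` onward are dropped, and when `partID` is set, the revert message survives with only its parts before that id. Since the ids are ascending, a dependency-free sketch reduces to a lexicographic filter (semantics inferred from the assertions, not the real `SessionRevert` code):

```typescript
// Inferred from the assertions above: keep messages strictly before the
// revert point; with partID set, the revert message keeps only earlier
// parts. Not the project's actual implementation.
type Msg = { id: string; parts: { id: string }[] }

function cleanup(msgs: Msg[], revert: { messageID: string; partID?: string }): Msg[] {
  const kept: Msg[] = []
  for (const msg of msgs) {
    if (msg.id < revert.messageID) {
      kept.push(msg)
    } else if (msg.id === revert.messageID && revert.partID !== undefined) {
      const parts = msg.parts.filter((p) => p.id < revert.partID!)
      if (parts.length > 0) kept.push({ ...msg, parts })
    }
  }
  return kept
}
```

With ascending `MessageID`/`PartID` values, plain string comparison is enough to order "before" and "after" the revert point.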

View File

@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/plugin",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {
@@ -21,8 +21,8 @@
"zod": "catalog:"
},
"peerDependencies": {
"@opentui/core": ">=0.1.95",
"@opentui/solid": ">=0.1.95"
"@opentui/core": ">=0.1.93",
"@opentui/solid": ">=0.1.93"
},
"peerDependenciesMeta": {
"@opentui/core": {
@@ -33,8 +33,8 @@
}
},
"devDependencies": {
"@opentui/core": "0.1.95",
"@opentui/solid": "0.1.95",
"@opentui/core": "0.1.93",
"@opentui/solid": "0.1.93",
"@tsconfig/node22": "catalog:",
"@types/node": "catalog:",
"typescript": "catalog:",

View File

@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/sdk",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -7040,6 +7040,44 @@
},
"components": {
"schemas": {
"Event.installation.updated": {
"type": "object",
"properties": {
"type": {
"type": "string",
"const": "installation.updated"
},
"properties": {
"type": "object",
"properties": {
"version": {
"type": "string"
}
},
"required": ["version"]
}
},
"required": ["type", "properties"]
},
"Event.installation.update-available": {
"type": "object",
"properties": {
"type": {
"type": "string",
"const": "installation.update-available"
},
"properties": {
"type": "object",
"properties": {
"version": {
"type": "string"
}
},
"required": ["version"]
}
},
"required": ["type", "properties"]
},
"Project": {
"type": "object",
"properties": {
@@ -7116,44 +7154,6 @@
},
"required": ["type", "properties"]
},
"Event.installation.updated": {
"type": "object",
"properties": {
"type": {
"type": "string",
"const": "installation.updated"
},
"properties": {
"type": "object",
"properties": {
"version": {
"type": "string"
}
},
"required": ["version"]
}
},
"required": ["type", "properties"]
},
"Event.installation.update-available": {
"type": "object",
"properties": {
"type": {
"type": "string",
"const": "installation.update-available"
},
"properties": {
"type": "object",
"properties": {
"version": {
"type": "string"
}
},
"required": ["version"]
}
},
"required": ["type", "properties"]
},
"Event.server.instance.disposed": {
"type": "object",
"properties": {
@@ -9733,15 +9733,15 @@
},
"Event": {
"anyOf": [
{
"$ref": "#/components/schemas/Event.project.updated"
},
{
"$ref": "#/components/schemas/Event.installation.updated"
},
{
"$ref": "#/components/schemas/Event.installation.update-available"
},
{
"$ref": "#/components/schemas/Event.project.updated"
},
{
"$ref": "#/components/schemas/Event.server.instance.disposed"
},
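The `Event.installation.*` schemas being reordered above describe small tagged payloads, and the `Event` `anyOf` is effectively a discriminated union over the `type` constant. In TypeScript terms the two installation events amount to (shapes read off the schema by hand, not generated client code):

```typescript
// Shapes transcribed from the Event.installation.* schemas in the diff above.
type InstallationUpdated = {
  type: "installation.updated"
  properties: { version: string }
}

type InstallationUpdateAvailable = {
  type: "installation.update-available"
  properties: { version: string }
}

// The OpenAPI anyOf over "type" constants narrows like a discriminated union.
type InstallationEvent = InstallationUpdated | InstallationUpdateAvailable

function isUpdateAvailable(e: InstallationEvent): e is InstallationUpdateAvailable {
  return e.type === "installation.update-available"
}
```

Reordering entries inside `anyOf` does not change which payloads validate, which is why this hunk is pure churn from schema regeneration.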

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/slack",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/ui",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"exports": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/util",
"version": "1.3.13",
"version": "1.3.11",
"private": true,
"type": "module",
"license": "MIT",

View File

@@ -2,7 +2,7 @@
"name": "@opencode-ai/web",
"type": "module",
"license": "MIT",
"version": "1.3.13",
"version": "1.3.11",
"scripts": {
"dev": "astro dev",
"dev:remote": "VITE_API_URL=https://api.opencode.ai astro dev",

View File

@@ -2,7 +2,7 @@
"name": "opencode",
"displayName": "opencode",
"description": "opencode for VS Code",
"version": "1.3.13",
"version": "1.3.11",
"publisher": "sst-dev",
"repository": {
"type": "git",