Compare commits

..

17 Commits

Author SHA1 Message Date
Kit Langton
8d2385ad49 test: finish HTTP mock processor coverage 2026-03-31 20:36:18 -04:00
Kit Langton
7f6a5bb2c8 test: migrate processor tests to HTTP mock LLM server
Replace the custom TestLLM Effect service with the real LLM layer +
TestLLMServer HTTP mock for 9 of 10 processor tests. Tests now exercise
the full HTTP→SSE→AI SDK→processor pipeline.

- Export Provider.defaultLayer for test layer composition
- Add boot() helper for common service access (processor, session, provider)
- Extend TestLLMServer with usage support and httpError step type
- Tool abort test registers a real tool with hanging execute
- Reasoning test stays with in-process TestLLM (needs fine-grained events)
2026-03-31 19:59:56 -04:00
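The HTTP mock approach this commit describes can be sketched as a minimal OpenAI-compatible SSE endpoint. This is an illustrative stand-in, not the repo's actual TestLLMServer (names like `makeMockLLMServer` are hypothetical):

```typescript
import { createServer } from "node:http"

// Format one OpenAI-compatible streaming chunk as an SSE event
// (field names follow the chat-completions streaming format).
function sseChunk(text: string): string {
  const payload = {
    id: "chatcmpl-test",
    object: "chat.completion.chunk",
    choices: [{ index: 0, delta: { content: text }, finish_reason: null }],
  }
  return `data: ${JSON.stringify(payload)}\n\n`
}

// A throwaway mock endpoint: every request streams the scripted
// text pieces, then the [DONE] sentinel, then closes.
function makeMockLLMServer(script: string[]) {
  return createServer((_req, res) => {
    res.writeHead(200, {
      "content-type": "text/event-stream",
      "cache-control": "no-cache",
    })
    for (const piece of script) res.write(sseChunk(piece))
    res.write("data: [DONE]\n\n")
    res.end()
  })
}

// Usage: makeMockLLMServer(["Hello", " world"]).listen(0)
```

Because the mock speaks real HTTP and real SSE, tests pointed at it exercise the same transport path as production instead of injecting streams in-process.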
Kit Langton
537cc32bf0 test: migrate processor tests to HTTP mock LLM server
Replace the custom TestLLM Effect service with the real LLM layer +
TestLLMServer HTTP mock. Tests now exercise the full HTTP→SSE→AI SDK
pipeline instead of injecting Effect streams directly.

- Extend TestLLMServer with usage support on text responses and
  httpError step type for non-200 responses
- Drop reasoning test (can't produce reasoning events via
  @ai-sdk/openai-compatible SSE)
- 9 tests pass, covering: text capture, token overflow, error retry,
  structured errors, context overflow, abort/interrupt cleanup
2026-03-31 19:21:27 -04:00
Kit Langton
82da702f64 refactor: tighten instance context helper fallbacks 2026-03-31 17:32:24 -04:00
Kit Langton
90469bbb7e refactor: simplify instance context helpers in prompt tests 2026-03-31 16:42:20 -04:00
Kit Langton
4ff0fbc043 fix: retry scoped tempdir cleanup on windows 2026-03-31 16:42:19 -04:00
Kit Langton
e24369eaf1 fix: break installation cycle in database context binding 2026-03-31 16:42:18 -04:00
Kit Langton
825f51c39f fix: restore instance context in deferred database callbacks 2026-03-31 16:42:18 -04:00
Kit Langton
191a747405 fix: propagate InstanceRef across static function boundaries
- makeRuntime.provide reads InstanceRef from current Effect fiber when
  ALS is unavailable, bridging static function calls (like Bus.publish)
  that create new fibers from inside Effect code
- Database.transaction preserves Instance ALS via Instance.bind on the
  bun:sqlite transaction callback (native fn loses ALS)
- Instance.restore helper for bridging Effect→sync code with ALS
- InstanceState.withALS bridges InstanceRef back to ALS for sync callers
- prompt.ts: InstructionPrompt.clear wrapped with withALS
- Remove ALL provideInstance(dir) wrappers from prompt-effect tests
2026-03-31 16:42:17 -04:00
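The ALS-loss problem this commit works around — a native function invoking a callback outside the original async context — can be sketched with Node's stdlib. The `bind` helper here is illustrative of the `Instance.bind` idea, not the repo's API:

```typescript
import { AsyncLocalStorage } from "node:async_hooks"

const als = new AsyncLocalStorage<{ directory: string }>()

// Capture the caller's ALS store now, and re-enter it whenever the
// returned function is invoked later -- even if the invoker is a
// native function (like a bun:sqlite transaction wrapper) that runs
// the callback outside the original async context.
function bind<A extends unknown[], R>(fn: (...args: A) => R): (...args: A) => R {
  const store = als.getStore()
  if (store === undefined) return fn
  return (...args) => als.run(store, () => fn(...args))
}
```

Wrapping the transaction callback with a helper like this is what keeps `als.getStore()` non-undefined inside it, since the native caller never participates in async-context propagation.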
Kit Langton
cc412f3014 refactor: migrate Instance ALS reads to InstanceRef in Effect services
Migrate 16 direct Instance.directory/worktree/project reads inside
Effect code to use InstanceState.directory/context helpers that check
the InstanceRef first and fall back to ALS.

- Export InstanceState.directory and InstanceState.context helpers
- bus/index.ts: GlobalBus.emit uses InstanceState.directory
- session/prompt.ts: 5 callsites migrated to InstanceState.context
- session/index.ts: 4 callsites migrated
- session/compaction.ts: 1 callsite migrated
- config/config.ts: 1 callsite migrated
- format/index.ts: 1 callsite migrated
- worktree/index.ts: 5 callsites migrated
- storage/db.ts: Database.effect preserves Instance ALS via Instance.bind
- test/lib/llm-server.ts: add wait/hold/fail SSE stream support
- Remove most provideInstance(dir) wrappers from prompt tests
  (5 remain due to Instance.state sync ALS dependency)
2026-03-31 16:42:16 -04:00
Kit Langton
bb039496d5 refactor: propagate Instance context through Effect fibers via InstanceRef
Add a ServiceMap.Reference that carries InstanceContext through the
Effect service graph so child fibers retain instance context even when
resumed by external I/O events outside the ALS boundary.

- Add InstanceRef to instance-state.ts; InstanceState.get/has/invalidate
  try the Reference first, fall back to ALS
- makeRuntime automatically captures ALS into InstanceRef at the boundary
- provideInstance (test fixture) sets InstanceRef for Effect.runPromiseWith
- Remove all redundant provideInstance(dir) wrappers from prompt tests
- Fix test/lib/effect.ts type params (drop unnecessary S/T generics)
2026-03-31 16:42:16 -04:00
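The reference-first, ALS-fallback lookup order this commit introduces can be sketched in plain TypeScript. A module variable stands in for the runtime-carried value (Effect's ServiceMap.Reference plays that role in the real code); the helper names are illustrative:

```typescript
import { AsyncLocalStorage } from "node:async_hooks"

interface InstanceContext { directory: string }

const als = new AsyncLocalStorage<InstanceContext>()

// Stand-in for the runtime-carried reference; a module variable is
// enough to show the lookup order.
let instanceRef: InstanceContext | undefined

// Capture ALS into the reference once, at the runtime boundary,
// before control leaves the ALS-covered region.
function captureAtBoundary(): void {
  instanceRef = als.getStore() ?? instanceRef
}

// Reads try the reference first and fall back to ALS, so code resumed
// by external I/O events (where the ALS store is gone) still sees the
// instance context captured at the boundary.
function currentInstance(): InstanceContext | undefined {
  return instanceRef ?? als.getStore()
}
```

The design choice is that the capture happens exactly once at the boundary, so child fibers resumed later inherit the reference without every callsite needing an explicit wrapper.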
Kit Langton
f2fa1a681d test: move more prompt cases to mock llm server
Migrate the next prompt-effect cases to the HTTP-backed mock server path, keep the shell handoff cases on short live timeouts, and leave the stream-failure case on the in-process fake until the server DSL matches it.
2026-03-31 16:42:15 -04:00
Kit Langton
6bd340492c test: infer mock server callback types 2026-03-31 16:42:15 -04:00
Kit Langton
21ec3207e7 test: extend mock llm server coverage
Add fixture support for tmpdir-backed mock server tests, extend the mock LLM server DSL for failure and hanging cases, and migrate the next prompt tests to the HTTP-backed path.
2026-03-31 16:42:14 -04:00
Kit Langton
123123b6c3 test: start moving prompt tests to mock llm server
Switch the basic assistant reply prompt-effect test to the HTTP-backed mock LLM server while keeping the more stream-sensitive cases on the in-process fake for now.
2026-03-31 16:42:14 -04:00
Kit Langton
6ea467b0ac test: add live effect helper mode
Default the shared effect test helper to support both test-clock and live execution, and switch the current opencode effect tests to the live path for real integration behavior.
2026-03-31 16:42:13 -04:00
Kit Langton
459fbc99a8 refactor(test): migrate llm-server to Effect HTTP platform
- Replace Bun.serve with Effect HTTP server using NodeHttpServer
- Add TestLLMServer service for mock LLM testing with SSE responses
- Update prompt-provider.test.ts to use testEffect pattern with provideTmpdirInstance
- Remove redundant test/fixture/effect.ts (using existing test/lib/effect.ts instead)
2026-03-31 16:42:13 -04:00
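A step DSL like the one these commits extend (text responses with usage, httpError, wait/hold/fail) can be modeled as a tagged union that the endpoint interprets per request. The shapes below are illustrative assumptions, not the repo's actual TestLLMServer types:

```typescript
// Hypothetical step shapes; the real TestLLMServer DSL may differ.
type Step =
  | { type: "text"; content: string; usage?: { inputTokens: number; outputTokens: number } }
  | { type: "httpError"; status: number }

// Interpret one scripted step as an HTTP response: a text step becomes
// a 200 SSE body ending in the [DONE] sentinel, an httpError step
// becomes a bare non-200 response.
function respond(step: Step): { status: number; headers: Record<string, string>; body: string } {
  switch (step.type) {
    case "text": {
      const events = [
        `data: ${JSON.stringify({ choices: [{ delta: { content: step.content } }], usage: step.usage ?? null })}`,
        "data: [DONE]",
      ]
      return {
        status: 200,
        headers: { "content-type": "text/event-stream" },
        body: events.join("\n\n") + "\n\n",
      }
    }
    case "httpError":
      return { status: step.status, headers: {}, body: "" }
  }
}
```

Keeping the step list declarative is what lets one mock server cover both happy-path streaming tests and failure-injection tests (non-200 responses, mid-stream hangs) without per-test server code.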
32 changed files with 863 additions and 1094 deletions

.github/VOUCHED.td vendored
View File

@@ -27,4 +27,3 @@ rekram1-node
-robinmordasiewicz
-spider-yamet clawdbot/llm psychosis, spam pinging the team
thdxr
-toastythebot

View File

@@ -26,7 +26,7 @@
},
"packages/app": {
"name": "@opencode-ai/app",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@kobalte/core": "catalog:",
"@opencode-ai/sdk": "workspace:*",
@@ -79,7 +79,7 @@
},
"packages/console/app": {
"name": "@opencode-ai/console-app",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@cloudflare/vite-plugin": "1.15.2",
"@ibm/plex": "6.4.1",
@@ -113,7 +113,7 @@
},
"packages/console/core": {
"name": "@opencode-ai/console-core",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@aws-sdk/client-sts": "3.782.0",
"@jsx-email/render": "1.1.1",
@@ -140,7 +140,7 @@
},
"packages/console/function": {
"name": "@opencode-ai/console-function",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@ai-sdk/anthropic": "3.0.64",
"@ai-sdk/openai": "3.0.48",
@@ -164,7 +164,7 @@
},
"packages/console/mail": {
"name": "@opencode-ai/console-mail",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@jsx-email/all": "2.2.3",
"@jsx-email/cli": "1.4.3",
@@ -188,7 +188,7 @@
},
"packages/desktop": {
"name": "@opencode-ai/desktop",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@opencode-ai/app": "workspace:*",
"@opencode-ai/ui": "workspace:*",
@@ -221,7 +221,7 @@
},
"packages/desktop-electron": {
"name": "@opencode-ai/desktop-electron",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@opencode-ai/app": "workspace:*",
"@opencode-ai/ui": "workspace:*",
@@ -252,7 +252,7 @@
},
"packages/enterprise": {
"name": "@opencode-ai/enterprise",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@opencode-ai/ui": "workspace:*",
"@opencode-ai/util": "workspace:*",
@@ -281,7 +281,7 @@
},
"packages/function": {
"name": "@opencode-ai/function",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@octokit/auth-app": "8.0.1",
"@octokit/rest": "catalog:",
@@ -297,7 +297,7 @@
},
"packages/opencode": {
"name": "opencode",
"version": "1.3.13",
"version": "1.3.11",
"bin": {
"opencode": "./bin/opencode",
},
@@ -338,8 +338,8 @@
"@opencode-ai/sdk": "workspace:*",
"@opencode-ai/util": "workspace:*",
"@openrouter/ai-sdk-provider": "2.3.3",
"@opentui/core": "0.1.95",
"@opentui/solid": "0.1.95",
"@opentui/core": "0.1.93",
"@opentui/solid": "0.1.93",
"@parcel/watcher": "2.5.1",
"@pierre/diffs": "catalog:",
"@solid-primitives/event-bus": "1.1.2",
@@ -423,22 +423,22 @@
},
"packages/plugin": {
"name": "@opencode-ai/plugin",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@opencode-ai/sdk": "workspace:*",
"zod": "catalog:",
},
"devDependencies": {
"@opentui/core": "0.1.95",
"@opentui/solid": "0.1.95",
"@opentui/core": "0.1.93",
"@opentui/solid": "0.1.93",
"@tsconfig/node22": "catalog:",
"@types/node": "catalog:",
"@typescript/native-preview": "catalog:",
"typescript": "catalog:",
},
"peerDependencies": {
"@opentui/core": ">=0.1.95",
"@opentui/solid": ">=0.1.95",
"@opentui/core": ">=0.1.93",
"@opentui/solid": ">=0.1.93",
},
"optionalPeers": [
"@opentui/core",
@@ -457,7 +457,7 @@
},
"packages/sdk/js": {
"name": "@opencode-ai/sdk",
"version": "1.3.13",
"version": "1.3.11",
"devDependencies": {
"@hey-api/openapi-ts": "0.90.10",
"@tsconfig/node22": "catalog:",
@@ -468,7 +468,7 @@
},
"packages/slack": {
"name": "@opencode-ai/slack",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@opencode-ai/sdk": "workspace:*",
"@slack/bolt": "^3.17.1",
@@ -503,7 +503,7 @@
},
"packages/ui": {
"name": "@opencode-ai/ui",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@kobalte/core": "catalog:",
"@opencode-ai/sdk": "workspace:*",
@@ -550,7 +550,7 @@
},
"packages/util": {
"name": "@opencode-ai/util",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"zod": "catalog:",
},
@@ -561,7 +561,7 @@
},
"packages/web": {
"name": "@opencode-ai/web",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@astrojs/cloudflare": "12.6.3",
"@astrojs/markdown-remark": "6.3.1",
@@ -1461,21 +1461,21 @@
"@opentelemetry/api": ["@opentelemetry/api@1.9.0", "", {}, "sha512-3giAOQvZiH5F9bMlMiv8+GSPMeqg0dbaeo58/0SlA9sxSqZhnUtxzX9/2FzyhS9sWQf5S0GJE0AKBrFqjpeYcg=="],
"@opentui/core": ["@opentui/core@0.1.95", "", { "dependencies": { "bun-ffi-structs": "0.1.2", "diff": "8.0.2", "jimp": "1.6.0", "marked": "17.0.1", "yoga-layout": "3.2.1" }, "optionalDependencies": { "@dimforge/rapier2d-simd-compat": "^0.17.3", "@opentui/core-darwin-arm64": "0.1.95", "@opentui/core-darwin-x64": "0.1.95", "@opentui/core-linux-arm64": "0.1.95", "@opentui/core-linux-x64": "0.1.95", "@opentui/core-win32-arm64": "0.1.95", "@opentui/core-win32-x64": "0.1.95", "bun-webgpu": "0.1.5", "planck": "^1.4.2", "three": "0.177.0" }, "peerDependencies": { "web-tree-sitter": "0.25.10" } }, "sha512-Ha73I+PPSy6Jk8CTZgdGRHU+nnmrPAs7m6w0k6ge1/kWbcNcZB0lY67sWQMdoa6bSINQMNWg7SjbNCC9B/0exg=="],
"@opentui/core": ["@opentui/core@0.1.93", "", { "dependencies": { "bun-ffi-structs": "0.1.2", "diff": "8.0.2", "jimp": "1.6.0", "marked": "17.0.1", "yoga-layout": "3.2.1" }, "optionalDependencies": { "@dimforge/rapier2d-simd-compat": "^0.17.3", "@opentui/core-darwin-arm64": "0.1.93", "@opentui/core-darwin-x64": "0.1.93", "@opentui/core-linux-arm64": "0.1.93", "@opentui/core-linux-x64": "0.1.93", "@opentui/core-win32-arm64": "0.1.93", "@opentui/core-win32-x64": "0.1.93", "bun-webgpu": "0.1.5", "planck": "^1.4.2", "three": "0.177.0" }, "peerDependencies": { "web-tree-sitter": "0.25.10" } }, "sha512-HlTM16ZiBKN0mPBNMHSILkSrbzNku6Pg/ovIpVVkEPqLeWeSC2bfZS4Uhc0Ej1sckVVVoU9HKBJanfHvpP+pMg=="],
"@opentui/core-darwin-arm64": ["@opentui/core-darwin-arm64@0.1.95", "", { "os": "darwin", "cpu": "arm64" }, "sha512-92joqr0ucGaIBCl9uYhe5DwAPbgGMTaCsCeY8Yf3VQ72wjGbOTwnC1TvU5wC6bUmiyqfijCqMyuUnj83teIVVQ=="],
"@opentui/core-darwin-arm64": ["@opentui/core-darwin-arm64@0.1.93", "", { "os": "darwin", "cpu": "arm64" }, "sha512-4I2mwhXLqRNUv7tu88hA6cBGaGpLZXkAa8W0VqBiGDV+Tx337x4T+vbQ7G57OwKXT787oTrEOF9rOOrGLov6qw=="],
"@opentui/core-darwin-x64": ["@opentui/core-darwin-x64@0.1.95", "", { "os": "darwin", "cpu": "x64" }, "sha512-+TLL3Kp3x7DTWEAkCAYe+RjRhl58QndoeXMstZNS8GQyrjSpUuivzwidzAz0HZK9SbZJfvaxZmXsToAIdI2fag=="],
"@opentui/core-darwin-x64": ["@opentui/core-darwin-x64@0.1.93", "", { "os": "darwin", "cpu": "x64" }, "sha512-jvYMgcg47a5qLhSv1DnQiafEWBQ1UukGutmsYV1TvNuhWtuDXYLVy2AhKIHPzbB9JNrV0IpjbxUC8QnJaP3n8g=="],
"@opentui/core-linux-arm64": ["@opentui/core-linux-arm64@0.1.95", "", { "os": "linux", "cpu": "arm64" }, "sha512-dAYeRqh7P8o0xFZleDDR1Abt4gSvCISqw6syOrbH3dl7pMbVdGgzA5stM9jqMgdPUVE7Ngumo17C23ehkGv93A=="],
"@opentui/core-linux-arm64": ["@opentui/core-linux-arm64@0.1.93", "", { "os": "linux", "cpu": "arm64" }, "sha512-bvFqRcPftmg14iYmMc3d63XC9rhe4yF7pJRApH6klLBKp27WX/LU0iSO4mvyX7qhy65gcmyy4Sj9dl5jNJ+vlA=="],
"@opentui/core-linux-x64": ["@opentui/core-linux-x64@0.1.95", "", { "os": "linux", "cpu": "x64" }, "sha512-O54TCgK8E7j2NKrDXUOTZqO4sb8JjeAfnhrStxAMMEw4RFCGWx3p3wLesqR16uKfFFJFDyoh2OWZ698tO88EAA=="],
"@opentui/core-linux-x64": ["@opentui/core-linux-x64@0.1.93", "", { "os": "linux", "cpu": "x64" }, "sha512-/wJXhwtNxdcpshrRl1KouyGE54ODAHxRQgBHtnlM/F4bB8cjzOlq2Yc+5cv5DxRz4Q0nQZFCPefwpg2U6ZwNdA=="],
"@opentui/core-win32-arm64": ["@opentui/core-win32-arm64@0.1.95", "", { "os": "win32", "cpu": "arm64" }, "sha512-T1RlZ6U/95eYDN6rUm4SLOVA5LBR7iL3TcBroQhV/883bVczXIBPhriEXQayup5FsAemnQba1BzMNvy6128SUw=="],
"@opentui/core-win32-arm64": ["@opentui/core-win32-arm64@0.1.93", "", { "os": "win32", "cpu": "arm64" }, "sha512-g3PQobfM2yFPSzkBKRKFp8FgTG4ulWyJcU+GYXjyYmxQIT+ZbOU7UfR//ImRq3/FxUAfUC/MhC6WwjqccjEqBw=="],
"@opentui/core-win32-x64": ["@opentui/core-win32-x64@0.1.95", "", { "os": "win32", "cpu": "x64" }, "sha512-lH2FHO0HSP2xWT+ccoz0BkLYFsMm7e6OYOh63BUHHh5b7ispnzP4aTyxiaLWrfJwdL0M9rp5cLIY32bhBKF2oA=="],
"@opentui/core-win32-x64": ["@opentui/core-win32-x64@0.1.93", "", { "os": "win32", "cpu": "x64" }, "sha512-Spllte2W7q+WfB1zVHgHilVJNp+jpp77PkkxTWyMQNvT7vJNt9LABMNjGTGiJBBMkAuKvO0GgFNKxrda7tFKrQ=="],
"@opentui/solid": ["@opentui/solid@0.1.95", "", { "dependencies": { "@babel/core": "7.28.0", "@babel/preset-typescript": "7.27.1", "@opentui/core": "0.1.95", "babel-plugin-module-resolver": "5.0.2", "babel-preset-solid": "1.9.10", "entities": "7.0.1", "s-js": "^0.4.9" }, "peerDependencies": { "solid-js": "1.9.11" } }, "sha512-iotYCvULgDurLXv3vgOzTLnEOySHFOa/6cEDex76jBt+gkniOEh2cjxxIVt6lkfTsk6UNTk6yCdwNK3nca/j+Q=="],
"@opentui/solid": ["@opentui/solid@0.1.93", "", { "dependencies": { "@babel/core": "7.28.0", "@babel/preset-typescript": "7.27.1", "@opentui/core": "0.1.93", "babel-plugin-module-resolver": "5.0.2", "babel-preset-solid": "1.9.10", "entities": "7.0.1", "s-js": "^0.4.9" }, "peerDependencies": { "solid-js": "1.9.11" } }, "sha512-Qx+4qoLSjnRGoo/YY4sZJMyXj09Y5kaAMpVO+65Ax58MMj4TjABN4bOOiRT2KV7sKOMTjxiAgXAIaBuqBBJ0Qg=="],
"@oslojs/asn1": ["@oslojs/asn1@1.0.0", "", { "dependencies": { "@oslojs/binary": "1.0.0" } }, "sha512-zw/wn0sj0j0QKbIXfIlnEcTviaCzYOY3V5rAyjR6YtOByFtJiT574+8p9Wlach0lZH9fddD4yb9laEAIl4vXQA=="],

View File

@@ -1,8 +1,8 @@
{
"nodeModules": {
"x86_64-linux": "sha256-g6zIq2w5nM+UPMsOzqB5EbCi4a9xkarin27WwTHqzHo=",
"aarch64-linux": "sha256-ivRoB8jrPtMGaphkIQuZ4G5E83/iPjoMGQGgxIp6p8Q=",
"aarch64-darwin": "sha256-0oW0PQpcXsNOGA868BjGRAAJWKcZJ8GVVkQueJfz6ng=",
"x86_64-darwin": "sha256-7Jm198RsxBC9yewcTlV6HXLtT3+GRFThi+5vL/R8d18="
"x86_64-linux": "sha256-UuVbB5lTRB4bIcaKMc8CLSbQW7m9EjXgxYvxp/uO7Co=",
"aarch64-linux": "sha256-8D7ReLRVb7NDd5PQTVxFhRLmlLbfjK007XgIhhpNKoE=",
"aarch64-darwin": "sha256-M+z7C/eXfVqwDiGiiwKo/LT/m4dvCjL1Pblsr1kxoyI=",
"x86_64-darwin": "sha256-RzZS6GMwYVDPK0W+K/mlebixNMs2+JRkMG9n8OFhd0c="
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/app",
"version": "1.3.13",
"version": "1.3.11",
"description": "",
"type": "module",
"exports": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/console-app",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/console-core",
"version": "1.3.13",
"version": "1.3.11",
"private": true,
"type": "module",
"license": "MIT",

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/console-function",
"version": "1.3.13",
"version": "1.3.11",
"$schema": "https://json.schemastore.org/package.json",
"private": true,
"type": "module",

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/console-mail",
"version": "1.3.13",
"version": "1.3.11",
"dependencies": {
"@jsx-email/all": "2.2.3",
"@jsx-email/cli": "1.4.3",

View File

@@ -1,7 +1,7 @@
{
"name": "@opencode-ai/desktop-electron",
"private": true,
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"homepage": "https://opencode.ai",

View File

@@ -1,7 +1,7 @@
{
"name": "@opencode-ai/desktop",
"private": true,
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/enterprise",
"version": "1.3.13",
"version": "1.3.11",
"private": true,
"type": "module",
"license": "MIT",

View File

@@ -1,7 +1,7 @@
id = "opencode"
name = "OpenCode"
description = "The open source coding agent."
version = "1.3.13"
version = "1.3.11"
schema_version = 1
authors = ["Anomaly"]
repository = "https://github.com/anomalyco/opencode"
@@ -11,26 +11,26 @@ name = "OpenCode"
icon = "./icons/opencode.svg"
[agent_servers.opencode.targets.darwin-aarch64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-darwin-arm64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-darwin-arm64.zip"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.darwin-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-darwin-x64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-darwin-x64.zip"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.linux-aarch64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-linux-arm64.tar.gz"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-linux-arm64.tar.gz"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.linux-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-linux-x64.tar.gz"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-linux-x64.tar.gz"
cmd = "./opencode"
args = ["acp"]
[agent_servers.opencode.targets.windows-x86_64]
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.13/opencode-windows-x64.zip"
archive = "https://github.com/anomalyco/opencode/releases/download/v1.3.11/opencode-windows-x64.zip"
cmd = "./opencode.exe"
args = ["acp"]

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/function",
"version": "1.3.13",
"version": "1.3.11",
"$schema": "https://json.schemastore.org/package.json",
"private": true,
"type": "module",

View File

@@ -1,6 +1,6 @@
{
"$schema": "https://json.schemastore.org/package.json",
"version": "1.3.13",
"version": "1.3.11",
"name": "opencode",
"type": "module",
"license": "MIT",
@@ -102,8 +102,8 @@
"@opencode-ai/sdk": "workspace:*",
"@opencode-ai/util": "workspace:*",
"@openrouter/ai-sdk-provider": "2.3.3",
"@opentui/core": "0.1.95",
"@opentui/solid": "0.1.95",
"@opentui/core": "0.1.93",
"@opentui/solid": "0.1.93",
"@parcel/watcher": "2.5.1",
"@pierre/diffs": "catalog:",
"@solid-primitives/event-bus": "1.1.2",

View File

@@ -1541,10 +1541,9 @@ export namespace Provider {
}),
)
const { runPromise } = makeRuntime(
Service,
layer.pipe(Layer.provide(Config.defaultLayer), Layer.provide(Auth.defaultLayer)),
)
export const defaultLayer = layer.pipe(Layer.provide(Config.defaultLayer), Layer.provide(Auth.defaultLayer))
const { runPromise } = makeRuntime(Service, defaultLayer)
export async function list() {
return runPromise((svc) => svc.list())

View File

@@ -436,13 +436,13 @@ export const SessionRoutes = lazy(() =>
validator(
"param",
z.object({
sessionID: SessionSummary.DiffInput.shape.sessionID,
sessionID: SessionSummary.diff.schema.shape.sessionID,
}),
),
validator(
"query",
z.object({
messageID: SessionSummary.DiffInput.shape.messageID,
messageID: SessionSummary.diff.schema.shape.messageID,
}),
),
async (c) => {

View File

@@ -294,10 +294,12 @@ export namespace SessionProcessor {
}
ctx.snapshot = undefined
}
SessionSummary.summarize({
sessionID: ctx.sessionID,
messageID: ctx.assistantMessage.parentID,
})
yield* Effect.promise(() =>
SessionSummary.summarize({
sessionID: ctx.sessionID,
messageID: ctx.assistantMessage.parentID,
}),
).pipe(Effect.ignoreCause({ log: true, message: "session summary failed" }), Effect.forkDetach)
if (
!ctx.assistantMessage.summary &&
isOverflow({ cfg: yield* config.get(), tokens: usage.tokens, model: ctx.model })

View File

@@ -1,14 +1,12 @@
import z from "zod"
import { Effect, Layer, ServiceMap } from "effect"
import { makeRuntime } from "@/effect/run-service"
import { Bus } from "../bus"
import { Snapshot } from "../snapshot"
import { Storage } from "@/storage/storage"
import { SyncEvent } from "../sync"
import { Log } from "../util/log"
import { Session } from "."
import { MessageV2 } from "./message-v2"
import { SessionID, MessageID, PartID } from "./schema"
import { Snapshot } from "../snapshot"
import { MessageV2 } from "./message-v2"
import { Session } from "."
import { Log } from "../util/log"
import { SyncEvent } from "../sync"
import { Storage } from "@/storage/storage"
import { Bus } from "../bus"
import { SessionPrompt } from "./prompt"
import { SessionSummary } from "./summary"
@@ -22,152 +20,116 @@ export namespace SessionRevert {
})
export type RevertInput = z.infer<typeof RevertInput>
export interface Interface {
readonly revert: (input: RevertInput) => Effect.Effect<Session.Info>
readonly unrevert: (input: { sessionID: SessionID }) => Effect.Effect<Session.Info>
readonly cleanup: (session: Session.Info) => Effect.Effect<void>
}
export class Service extends ServiceMap.Service<Service, Interface>()("@opencode/SessionRevert") {}
export const layer = Layer.effect(
Service,
Effect.gen(function* () {
const sessions = yield* Session.Service
const snap = yield* Snapshot.Service
const storage = yield* Storage.Service
const bus = yield* Bus.Service
const revert = Effect.fn("SessionRevert.revert")(function* (input: RevertInput) {
yield* Effect.promise(() => SessionPrompt.assertNotBusy(input.sessionID))
const all = yield* sessions.messages({ sessionID: input.sessionID })
let lastUser: MessageV2.User | undefined
const session = yield* sessions.get(input.sessionID)
let rev: Session.Info["revert"]
const patches: Snapshot.Patch[] = []
for (const msg of all) {
if (msg.info.role === "user") lastUser = msg.info
const remaining = []
for (const part of msg.parts) {
if (rev) {
if (part.type === "patch") patches.push(part)
continue
}
if (!rev) {
if ((msg.info.id === input.messageID && !input.partID) || part.id === input.partID) {
const partID = remaining.some((item) => ["text", "tool"].includes(item.type)) ? input.partID : undefined
rev = {
messageID: !partID && lastUser ? lastUser.id : msg.info.id,
partID,
}
}
remaining.push(part)
}
}
}
if (!rev) return session
rev.snapshot = session.revert?.snapshot ?? (yield* snap.track())
yield* snap.revert(patches)
if (rev.snapshot) rev.diff = yield* snap.diff(rev.snapshot as string)
const range = all.filter((msg) => msg.info.id >= rev!.messageID)
const diffs = yield* Effect.promise(() => SessionSummary.computeDiff({ messages: range }))
yield* storage.write(["session_diff", input.sessionID], diffs).pipe(Effect.ignore)
yield* bus.publish(Session.Event.Diff, { sessionID: input.sessionID, diff: diffs })
yield* sessions.setRevert({
sessionID: input.sessionID,
revert: rev,
summary: {
additions: diffs.reduce((sum, x) => sum + x.additions, 0),
deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
files: diffs.length,
},
})
return yield* sessions.get(input.sessionID)
})
const unrevert = Effect.fn("SessionRevert.unrevert")(function* (input: { sessionID: SessionID }) {
log.info("unreverting", input)
yield* Effect.promise(() => SessionPrompt.assertNotBusy(input.sessionID))
const session = yield* sessions.get(input.sessionID)
if (!session.revert) return session
if (session.revert.snapshot) yield* snap.restore(session.revert!.snapshot!)
yield* sessions.clearRevert(input.sessionID)
return yield* sessions.get(input.sessionID)
})
const cleanup = Effect.fn("SessionRevert.cleanup")(function* (session: Session.Info) {
if (!session.revert) return
const sessionID = session.id
const msgs = yield* sessions.messages({ sessionID })
const messageID = session.revert.messageID
const remove = [] as MessageV2.WithParts[]
let target: MessageV2.WithParts | undefined
for (const msg of msgs) {
if (msg.info.id < messageID) continue
if (msg.info.id > messageID) {
remove.push(msg)
continue
}
if (session.revert.partID) {
target = msg
continue
}
remove.push(msg)
}
for (const msg of remove) {
SyncEvent.run(MessageV2.Event.Removed, {
sessionID,
messageID: msg.info.id,
})
}
if (session.revert.partID && target) {
const partID = session.revert.partID
const idx = target.parts.findIndex((part) => part.id === partID)
if (idx >= 0) {
const removeParts = target.parts.slice(idx)
target.parts = target.parts.slice(0, idx)
for (const part of removeParts) {
SyncEvent.run(MessageV2.Event.PartRemoved, {
sessionID,
messageID: target.info.id,
partID: part.id,
})
}
}
}
yield* sessions.clearRevert(sessionID)
})
return Service.of({ revert, unrevert, cleanup })
}),
)
export const defaultLayer = Layer.unwrap(
Effect.sync(() =>
layer.pipe(
Layer.provide(Session.defaultLayer),
Layer.provide(Snapshot.defaultLayer),
Layer.provide(Storage.defaultLayer),
Layer.provide(Bus.layer),
),
),
)
const { runPromise } = makeRuntime(Service, defaultLayer)
export async function revert(input: RevertInput) {
return runPromise((svc) => svc.revert(input))
await SessionPrompt.assertNotBusy(input.sessionID)
const all = await Session.messages({ sessionID: input.sessionID })
let lastUser: MessageV2.User | undefined
const session = await Session.get(input.sessionID)
let revert: Session.Info["revert"]
const patches: Snapshot.Patch[] = []
for (const msg of all) {
if (msg.info.role === "user") lastUser = msg.info
const remaining = []
for (const part of msg.parts) {
if (revert) {
if (part.type === "patch") {
patches.push(part)
}
continue
}
if (!revert) {
if ((msg.info.id === input.messageID && !input.partID) || part.id === input.partID) {
// if no useful parts left in message, same as reverting whole message
const partID = remaining.some((item) => ["text", "tool"].includes(item.type)) ? input.partID : undefined
revert = {
messageID: !partID && lastUser ? lastUser.id : msg.info.id,
partID,
}
}
remaining.push(part)
}
}
}
if (revert) {
const session = await Session.get(input.sessionID)
revert.snapshot = session.revert?.snapshot ?? (await Snapshot.track())
await Snapshot.revert(patches)
if (revert.snapshot) revert.diff = await Snapshot.diff(revert.snapshot)
const rangeMessages = all.filter((msg) => msg.info.id >= revert!.messageID)
const diffs = await SessionSummary.computeDiff({ messages: rangeMessages })
await Storage.write(["session_diff", input.sessionID], diffs)
Bus.publish(Session.Event.Diff, {
sessionID: input.sessionID,
diff: diffs,
})
return Session.setRevert({
sessionID: input.sessionID,
revert,
summary: {
additions: diffs.reduce((sum, x) => sum + x.additions, 0),
deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
files: diffs.length,
},
})
}
return session
}
export async function unrevert(input: { sessionID: SessionID }) {
return runPromise((svc) => svc.unrevert(input))
log.info("unreverting", input)
await SessionPrompt.assertNotBusy(input.sessionID)
const session = await Session.get(input.sessionID)
if (!session.revert) return session
if (session.revert.snapshot) await Snapshot.restore(session.revert.snapshot)
return Session.clearRevert(input.sessionID)
}
export async function cleanup(session: Session.Info) {
return runPromise((svc) => svc.cleanup(session))
if (!session.revert) return
const sessionID = session.id
const msgs = await Session.messages({ sessionID })
const messageID = session.revert.messageID
const remove = [] as MessageV2.WithParts[]
let target: MessageV2.WithParts | undefined
for (const msg of msgs) {
if (msg.info.id < messageID) {
continue
}
if (msg.info.id > messageID) {
remove.push(msg)
continue
}
if (session.revert.partID) {
target = msg
continue
}
remove.push(msg)
}
for (const msg of remove) {
SyncEvent.run(MessageV2.Event.Removed, {
sessionID: sessionID,
messageID: msg.info.id,
})
}
if (session.revert.partID && target) {
const partID = session.revert.partID
const removeStart = target.parts.findIndex((part) => part.id === partID)
if (removeStart >= 0) {
const preserveParts = target.parts.slice(0, removeStart)
const removeParts = target.parts.slice(removeStart)
target.parts = preserveParts
for (const part of removeParts) {
SyncEvent.run(MessageV2.Event.PartRemoved, {
sessionID: sessionID,
messageID: target.info.id,
partID: part.id,
})
}
}
}
await Session.clearRevert(sessionID)
}
}

View File

@@ -1,12 +1,14 @@
import { fn } from "@/util/fn"
import z from "zod"
import { Effect, Layer, ServiceMap } from "effect"
import { makeRuntime } from "@/effect/run-service"
import { Bus } from "@/bus"
import { Snapshot } from "@/snapshot"
import { Storage } from "@/storage/storage"
import { Session } from "."
import { MessageV2 } from "./message-v2"
import { SessionID, MessageID } from "./schema"
import { Snapshot } from "@/snapshot"
import { Storage } from "@/storage/storage"
import { Bus } from "@/bus"
import { NotFoundError } from "@/storage/db"
export namespace SessionSummary {
function unquoteGitPath(input: string) {
@@ -65,117 +67,103 @@ export namespace SessionSummary {
return Buffer.from(bytes).toString()
}
export interface Interface {
readonly summarize: (input: { sessionID: SessionID; messageID: MessageID }) => Effect.Effect<void>
readonly diff: (input: { sessionID: SessionID; messageID?: MessageID }) => Effect.Effect<Snapshot.FileDiff[]>
readonly computeDiff: (input: { messages: MessageV2.WithParts[] }) => Effect.Effect<Snapshot.FileDiff[]>
}
export class Service extends ServiceMap.Service<Service, Interface>()("@opencode/SessionSummary") {}
export const layer = Layer.effect(
Service,
Effect.gen(function* () {
const sessions = yield* Session.Service
const snapshot = yield* Snapshot.Service
const storage = yield* Storage.Service
const bus = yield* Bus.Service
const computeDiff = Effect.fn("SessionSummary.computeDiff")(function* (input: {
messages: MessageV2.WithParts[]
}) {
let from: string | undefined
let to: string | undefined
for (const item of input.messages) {
if (!from) {
for (const part of item.parts) {
if (part.type === "step-start" && part.snapshot) {
from = part.snapshot
break
}
}
}
for (const part of item.parts) {
if (part.type === "step-finish" && part.snapshot) to = part.snapshot
}
}
if (from && to) return yield* snapshot.diffFull(from, to)
return []
})
const summarize = Effect.fn("SessionSummary.summarize")(function* (input: {
sessionID: SessionID
messageID: MessageID
}) {
const all = yield* sessions.messages({ sessionID: input.sessionID })
if (!all.length) return
const diffs = yield* computeDiff({ messages: all })
yield* sessions.setSummary({
sessionID: input.sessionID,
summary: {
additions: diffs.reduce((sum, x) => sum + x.additions, 0),
deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
files: diffs.length,
},
})
yield* storage.write(["session_diff", input.sessionID], diffs).pipe(Effect.ignore)
yield* bus.publish(Session.Event.Diff, { sessionID: input.sessionID, diff: diffs })
const messages = all.filter(
(m) => m.info.id === input.messageID || (m.info.role === "assistant" && m.info.parentID === input.messageID),
)
const target = messages.find((m) => m.info.id === input.messageID)
if (!target || target.info.role !== "user") return
const msgDiffs = yield* computeDiff({ messages })
target.info.summary = { ...target.info.summary, diffs: msgDiffs }
yield* sessions.updateMessage(target.info)
})
const diff = Effect.fn("SessionSummary.diff")(function* (input: { sessionID: SessionID; messageID?: MessageID }) {
const diffs = yield* storage
.read<Snapshot.FileDiff[]>(["session_diff", input.sessionID])
.pipe(Effect.catch(() => Effect.succeed([] as Snapshot.FileDiff[])))
const next = diffs.map((item) => {
const file = unquoteGitPath(item.file)
if (file === item.file) return item
return { ...item, file }
})
const changed = next.some((item, i) => item.file !== diffs[i]?.file)
if (changed) yield* storage.write(["session_diff", input.sessionID], next).pipe(Effect.ignore)
return next
})
return Service.of({ summarize, diff, computeDiff })
export const summarize = fn(
z.object({
sessionID: SessionID.zod,
messageID: MessageID.zod,
}),
async (input) => {
await Session.messages({ sessionID: input.sessionID })
.then((all) =>
Promise.all([
summarizeSession({ sessionID: input.sessionID, messages: all }),
summarizeMessage({ messageID: input.messageID, messages: all }),
]),
)
.catch((err) => {
if (NotFoundError.isInstance(err)) return
throw err
})
},
)
export const defaultLayer = Layer.unwrap(
Effect.sync(() =>
layer.pipe(
Layer.provide(Session.defaultLayer),
Layer.provide(Snapshot.defaultLayer),
Layer.provide(Storage.defaultLayer),
Layer.provide(Bus.layer),
),
),
)
const { runPromise } = makeRuntime(Service, defaultLayer)
export const summarize = (input: { sessionID: SessionID; messageID: MessageID }) =>
void runPromise((svc) => svc.summarize(input)).catch(() => {})
export const DiffInput = z.object({
sessionID: SessionID.zod,
messageID: MessageID.zod.optional(),
})
export async function diff(input: z.infer<typeof DiffInput>) {
return runPromise((svc) => svc.diff(input))
async function summarizeSession(input: { sessionID: SessionID; messages: MessageV2.WithParts[] }) {
const diffs = await computeDiff({ messages: input.messages })
await Session.setSummary({
sessionID: input.sessionID,
summary: {
additions: diffs.reduce((sum, x) => sum + x.additions, 0),
deletions: diffs.reduce((sum, x) => sum + x.deletions, 0),
files: diffs.length,
},
})
await Storage.write(["session_diff", input.sessionID], diffs)
Bus.publish(Session.Event.Diff, {
sessionID: input.sessionID,
diff: diffs,
})
}
async function summarizeMessage(input: { messageID: string; messages: MessageV2.WithParts[] }) {
const messages = input.messages.filter(
(m) => m.info.id === input.messageID || (m.info.role === "assistant" && m.info.parentID === input.messageID),
)
const msgWithParts = messages.find((m) => m.info.id === input.messageID)
if (!msgWithParts || msgWithParts.info.role !== "user") return
const userMsg = msgWithParts.info
const diffs = await computeDiff({ messages })
userMsg.summary = {
...userMsg.summary,
diffs,
}
await Session.updateMessage(userMsg)
}
export const diff = fn(
z.object({
sessionID: SessionID.zod,
messageID: MessageID.zod.optional(),
}),
async (input) => {
const diffs = await Storage.read<Snapshot.FileDiff[]>(["session_diff", input.sessionID]).catch(() => [])
const next = diffs.map((item) => {
const file = unquoteGitPath(item.file)
if (file === item.file) return item
return {
...item,
file,
}
})
const changed = next.some((item, i) => item.file !== diffs[i]?.file)
if (changed) Storage.write(["session_diff", input.sessionID], next).catch(() => {})
return next
},
)
export async function computeDiff(input: { messages: MessageV2.WithParts[] }) {
return runPromise((svc) => svc.computeDiff(input))
let from: string | undefined
let to: string | undefined
// scan messages for the earliest step-start snapshot (`from`) and the latest step-finish snapshot (`to`)
for (const item of input.messages) {
if (!from) {
for (const part of item.parts) {
if (part.type === "step-start" && part.snapshot) {
from = part.snapshot
break
}
}
}
for (const part of item.parts) {
if (part.type === "step-finish" && part.snapshot) {
to = part.snapshot
}
}
}
if (from && to) return Snapshot.diffFull(from, to)
return []
}
}
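The diff computation above reduces to a single scan: the earliest `step-start` snapshot becomes the diff base and the latest `step-finish` snapshot the target. A minimal standalone sketch of that scan, with simplified part shapes standing in for `MessageV2.WithParts` (the types here are illustrative, not the real schema):

```typescript
// Hypothetical, simplified stand-ins for the real MessageV2 part types.
type Part =
  | { type: "step-start"; snapshot?: string }
  | { type: "step-finish"; snapshot?: string }
  | { type: "text"; text: string }
type Message = { parts: Part[] }

// Mirrors computeDiff's scan: the first step-start snapshot seen becomes
// `from`; every step-finish snapshot overwrites `to`, so the last one wins.
function snapshotRange(messages: Message[]): { from?: string; to?: string } {
  let from: string | undefined
  let to: string | undefined
  for (const message of messages) {
    for (const part of message.parts) {
      if (!from && part.type === "step-start" && part.snapshot) from = part.snapshot
      if (part.type === "step-finish" && part.snapshot) to = part.snapshot
    }
  }
  return { from, to }
}

const range = snapshotRange([
  { parts: [{ type: "step-start", snapshot: "a1" }] },
  { parts: [{ type: "step-finish", snapshot: "b2" }] },
  { parts: [{ type: "step-finish", snapshot: "c3" }] },
])
// earliest step-start is "a1"; latest step-finish is "c3"
```

A diff is only produced when both endpoints exist; otherwise the function returns an empty list, which is why a session with no snapshots yields a zero summary.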

View File

@@ -1,47 +0,0 @@
/** @jsxImportSource @opentui/solid */
import { expect, test } from "bun:test"
import { createSlot, createSolidSlotRegistry, testRender, useRenderer } from "@opentui/solid"
import { onMount } from "solid-js"
type Slots = {
prompt: {}
}
test("replace slot mounts plugin content once", async () => {
let mounts = 0
const Probe = () => {
onMount(() => {
mounts += 1
})
return <box />
}
const App = () => {
const renderer = useRenderer()
const reg = createSolidSlotRegistry<Slots>(renderer, {})
const Slot = createSlot(reg)
reg.register({
id: "plugin",
slots: {
prompt() {
return <Probe />
},
},
})
return (
<box>
<Slot name="prompt" mode="replace">
<box />
</Slot>
</box>
)
}
await testRender(() => <App />)
expect(mounts).toBe(1)
})

View File

@@ -64,7 +64,9 @@ describe("Format", () => {
),
)
it.live("service initializes without error", () => provideTmpdirInstance(() => Format.Service.use(() => Effect.void)))
it.live("service initializes without error", () =>
provideTmpdirInstance(() => Format.Service.use(() => Effect.void)),
)
it.live("status() initializes formatter state per directory", () =>
Effect.gen(function* () {

View File

@@ -1,31 +1,12 @@
import { NodeHttpServer } from "@effect/platform-node"
import { NodeHttpServer, NodeHttpServerRequest } from "@effect/platform-node"
import * as Http from "node:http"
import { Deferred, Effect, Layer, ServiceMap, Stream } from "effect"
import * as HttpServer from "effect/unstable/http/HttpServer"
import { HttpRouter, HttpServerRequest, HttpServerResponse } from "effect/unstable/http"
type Step =
| {
type: "text"
text: string
}
| {
type: "tool"
tool: string
input: unknown
}
| {
type: "fail"
message: string
}
| {
type: "hang"
}
| {
type: "hold"
text: string
wait: PromiseLike<unknown>
}
export type Usage = { input: number; output: number }
type Line = Record<string, unknown>
type Hit = {
url: URL
@@ -37,147 +18,293 @@ type Wait = {
ready: Deferred.Deferred<void>
}
function sse(lines: unknown[]) {
return HttpServerResponse.stream(
Stream.fromIterable([
[...lines.map((line) => `data: ${JSON.stringify(line)}`), "data: [DONE]"].join("\n\n") + "\n\n",
]).pipe(Stream.encodeText),
{ contentType: "text/event-stream" },
)
type Sse = {
type: "sse"
head: unknown[]
tail: unknown[]
wait?: PromiseLike<unknown>
hang?: boolean
error?: unknown
reset?: boolean
}
function text(step: Extract<Step, { type: "text" }>) {
return sse([
{
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [{ delta: { role: "assistant" } }],
},
{
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [{ delta: { content: step.text } }],
},
{
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [{ delta: {}, finish_reason: "stop" }],
},
])
type HttpError = {
type: "http-error"
status: number
body: unknown
}
function tool(step: Extract<Step, { type: "tool" }>, seq: number) {
const id = `call_${seq}`
const args = JSON.stringify(step.input)
return sse([
{
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [{ delta: { role: "assistant" } }],
},
{
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [
export type Item = Sse | HttpError
const done = Symbol("done")
function line(input: unknown) {
if (input === done) return "data: [DONE]\n\n"
return `data: ${JSON.stringify(input)}\n\n`
}
function tokens(input?: Usage) {
if (!input) return
return {
prompt_tokens: input.input,
completion_tokens: input.output,
total_tokens: input.input + input.output,
}
}
function chunk(input: { delta?: Record<string, unknown>; finish?: string; usage?: Usage }) {
return {
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [
{
delta: input.delta ?? {},
...(input.finish ? { finish_reason: input.finish } : {}),
},
],
...(input.usage ? { usage: tokens(input.usage) } : {}),
} satisfies Line
}
function role() {
return chunk({ delta: { role: "assistant" } })
}
function textLine(value: string) {
return chunk({ delta: { content: value } })
}
function reasonLine(value: string) {
return chunk({ delta: { reasoning_content: value } })
}
function finishLine(reason: string, usage?: Usage) {
return chunk({ finish: reason, usage })
}
function toolStartLine(id: string, name: string) {
return chunk({
delta: {
tool_calls: [
{
delta: {
tool_calls: [
{
index: 0,
id,
type: "function",
function: {
name: step.tool,
arguments: "",
},
},
],
index: 0,
id,
type: "function",
function: {
name,
arguments: "",
},
},
],
},
{
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [
})
}
function toolArgsLine(value: string) {
return chunk({
delta: {
tool_calls: [
{
delta: {
tool_calls: [
{
index: 0,
function: {
arguments: args,
},
},
],
index: 0,
function: {
arguments: value,
},
},
],
},
{
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [{ delta: {}, finish_reason: "tool_calls" }],
},
])
})
}
function fail(step: Extract<Step, { type: "fail" }>) {
return HttpServerResponse.stream(
Stream.fromIterable([
'data: {"id":"chatcmpl-test","object":"chat.completion.chunk","choices":[{"delta":{"role":"assistant"}}]}\n\n',
]).pipe(Stream.encodeText, Stream.concat(Stream.fail(new Error(step.message)))),
{ contentType: "text/event-stream" },
)
function bytes(input: Iterable<unknown>) {
return Stream.fromIterable([...input].map(line)).pipe(Stream.encodeText)
}
function hang() {
return HttpServerResponse.stream(
Stream.fromIterable([
'data: {"id":"chatcmpl-test","object":"chat.completion.chunk","choices":[{"delta":{"role":"assistant"}}]}\n\n',
]).pipe(Stream.encodeText, Stream.concat(Stream.never)),
{ contentType: "text/event-stream" },
)
function send(item: Sse) {
const head = bytes(item.head)
const tail = bytes([...item.tail, ...(item.hang || item.error ? [] : [done])])
const empty = Stream.fromIterable<Uint8Array>([])
const wait = item.wait
const body: Stream.Stream<Uint8Array, unknown> = wait
? Stream.concat(head, Stream.fromEffect(Effect.promise(() => wait)).pipe(Stream.flatMap(() => tail)))
: Stream.concat(head, tail)
let end: Stream.Stream<Uint8Array, unknown> = empty
if (item.error) end = Stream.concat(empty, Stream.fail(item.error))
else if (item.hang) end = Stream.concat(empty, Stream.never)
return HttpServerResponse.stream(Stream.concat(body, end), { contentType: "text/event-stream" })
}
function hold(step: Extract<Step, { type: "hold" }>) {
return HttpServerResponse.stream(
Stream.fromIterable([
'data: {"id":"chatcmpl-test","object":"chat.completion.chunk","choices":[{"delta":{"role":"assistant"}}]}\n\n',
]).pipe(
Stream.encodeText,
Stream.concat(
Stream.fromEffect(Effect.promise(() => step.wait)).pipe(
Stream.flatMap(() =>
Stream.fromIterable([
`data: ${JSON.stringify({
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [{ delta: { content: step.text } }],
})}\n\n`,
`data: ${JSON.stringify({
id: "chatcmpl-test",
object: "chat.completion.chunk",
choices: [{ delta: {}, finish_reason: "stop" }],
})}\n\n`,
"data: [DONE]\n\n",
]).pipe(Stream.encodeText),
),
),
),
),
{ contentType: "text/event-stream" },
)
const reset = Effect.fn("TestLLMServer.reset")(function* (item: Sse) {
const req = yield* HttpServerRequest.HttpServerRequest
const res = NodeHttpServerRequest.toServerResponse(req)
yield* Effect.sync(() => {
res.writeHead(200, { "content-type": "text/event-stream" })
for (const part of item.head) res.write(line(part))
for (const part of item.tail) res.write(line(part))
res.destroy(new Error("connection reset"))
})
yield* Effect.never
})
function fail(item: HttpError) {
return HttpServerResponse.text(JSON.stringify(item.body), {
status: item.status,
contentType: "application/json",
})
}
export class Reply {
#head: unknown[] = [role()]
#tail: unknown[] = []
#usage: Usage | undefined
#finish: string | undefined
#wait: PromiseLike<unknown> | undefined
#hang = false
#error: unknown
#reset = false
#seq = 0
#id() {
this.#seq += 1
return `call_${this.#seq}`
}
text(value: string) {
this.#tail = [...this.#tail, textLine(value)]
return this
}
reason(value: string) {
this.#tail = [...this.#tail, reasonLine(value)]
return this
}
usage(value: Usage) {
this.#usage = value
return this
}
wait(value: PromiseLike<unknown>) {
this.#wait = value
return this
}
stop() {
this.#finish = "stop"
this.#hang = false
this.#error = undefined
this.#reset = false
return this
}
toolCalls() {
this.#finish = "tool_calls"
this.#hang = false
this.#error = undefined
this.#reset = false
return this
}
tool(name: string, input: unknown) {
const id = this.#id()
const args = JSON.stringify(input)
this.#tail = [...this.#tail, toolStartLine(id, name), toolArgsLine(args)]
return this.toolCalls()
}
pendingTool(name: string, input: unknown) {
const id = this.#id()
const args = JSON.stringify(input)
const size = Math.max(1, Math.floor(args.length / 2))
this.#tail = [...this.#tail, toolStartLine(id, name), toolArgsLine(args.slice(0, size))]
return this
}
hang() {
this.#finish = undefined
this.#hang = true
this.#error = undefined
this.#reset = false
return this
}
streamError(error: unknown = "boom") {
this.#finish = undefined
this.#hang = false
this.#error = error
this.#reset = false
return this
}
reset() {
this.#finish = undefined
this.#hang = false
this.#error = undefined
this.#reset = true
return this
}
item(): Item {
return {
type: "sse",
head: this.#head,
tail: this.#finish ? [...this.#tail, finishLine(this.#finish, this.#usage)] : this.#tail,
wait: this.#wait,
hang: this.#hang,
error: this.#error,
reset: this.#reset,
}
}
}
export function reply() {
return new Reply()
}
export function httpError(status: number, body: unknown): Item {
return {
type: "http-error",
status,
body,
}
}
export function raw(input: {
chunks?: unknown[]
head?: unknown[]
tail?: unknown[]
wait?: PromiseLike<unknown>
hang?: boolean
error?: unknown
reset?: boolean
}): Item {
return {
type: "sse",
head: input.head ?? input.chunks ?? [],
tail: input.tail ?? [],
wait: input.wait,
hang: input.hang,
error: input.error,
reset: input.reset,
}
}
function item(input: Item | Reply) {
return input instanceof Reply ? input.item() : input
}
namespace TestLLMServer {
export interface Service {
readonly url: string
readonly text: (value: string) => Effect.Effect<void>
readonly tool: (tool: string, input: unknown) => Effect.Effect<void>
readonly fail: (message?: string) => Effect.Effect<void>
readonly push: (...input: (Item | Reply)[]) => Effect.Effect<void>
readonly text: (value: string, opts?: { usage?: Usage }) => Effect.Effect<void>
readonly tool: (name: string, input: unknown) => Effect.Effect<void>
readonly toolHang: (name: string, input: unknown) => Effect.Effect<void>
readonly reason: (value: string, opts?: { text?: string; usage?: Usage }) => Effect.Effect<void>
readonly fail: (message?: unknown) => Effect.Effect<void>
readonly error: (status: number, body: unknown) => Effect.Effect<void>
readonly hang: Effect.Effect<void>
readonly hold: (text: string, wait: PromiseLike<unknown>) => Effect.Effect<void>
readonly hold: (value: string, wait: PromiseLike<unknown>) => Effect.Effect<void>
readonly hits: Effect.Effect<Hit[]>
readonly calls: Effect.Effect<number>
readonly wait: (count: number) => Effect.Effect<void>
@@ -194,12 +321,11 @@ export class TestLLMServer extends ServiceMap.Service<TestLLMServer, TestLLMServ
const router = yield* HttpRouter.HttpRouter
let hits: Hit[] = []
let list: Step[] = []
let seq = 0
let list: Item[] = []
let waits: Wait[] = []
const push = (step: Step) => {
list = [...list, step]
const queue = (...input: (Item | Reply)[]) => {
list = [...list, ...input.map(item)]
}
const notify = Effect.fnUntraced(function* () {
@@ -210,11 +336,10 @@ export class TestLLMServer extends ServiceMap.Service<TestLLMServer, TestLLMServ
})
const pull = () => {
const step = list[0]
if (!step) return { step: undefined, seq }
seq += 1
const first = list[0]
if (!first) return
list = list.slice(1)
return { step, seq }
return first
}
yield* router.add(
@@ -223,21 +348,22 @@ export class TestLLMServer extends ServiceMap.Service<TestLLMServer, TestLLMServ
Effect.gen(function* () {
const req = yield* HttpServerRequest.HttpServerRequest
const next = pull()
if (!next.step) return HttpServerResponse.text("unexpected request", { status: 500 })
const json = yield* req.json.pipe(Effect.orElseSucceed(() => ({})))
if (!next) return HttpServerResponse.text("unexpected request", { status: 500 })
const body = yield* req.json.pipe(Effect.orElseSucceed(() => ({})))
hits = [
...hits,
{
url: new URL(req.originalUrl, "http://localhost"),
body: json && typeof json === "object" ? (json as Record<string, unknown>) : {},
body: body && typeof body === "object" ? (body as Record<string, unknown>) : {},
},
]
yield* notify()
if (next.step.type === "text") return text(next.step)
if (next.step.type === "tool") return tool(next.step, next.seq)
if (next.step.type === "fail") return fail(next.step)
if (next.step.type === "hang") return hang()
return hold(next.step)
if (next.type === "sse" && next.reset) {
yield* reset(next)
return HttpServerResponse.empty()
}
if (next.type === "sse") return send(next)
return fail(next)
}),
)
@@ -248,20 +374,37 @@ export class TestLLMServer extends ServiceMap.Service<TestLLMServer, TestLLMServ
server.address._tag === "TcpAddress"
? `http://127.0.0.1:${server.address.port}/v1`
: `unix://${server.address.path}/v1`,
text: Effect.fn("TestLLMServer.text")(function* (value: string) {
push({ type: "text", text: value })
push: Effect.fn("TestLLMServer.push")(function* (...input: (Item | Reply)[]) {
queue(...input)
}),
tool: Effect.fn("TestLLMServer.tool")(function* (tool: string, input: unknown) {
push({ type: "tool", tool, input })
text: Effect.fn("TestLLMServer.text")(function* (value: string, opts?: { usage?: Usage }) {
const out = reply().text(value)
if (opts?.usage) out.usage(opts.usage)
queue(out.stop().item())
}),
fail: Effect.fn("TestLLMServer.fail")(function* (message = "boom") {
push({ type: "fail", message })
tool: Effect.fn("TestLLMServer.tool")(function* (name: string, input: unknown) {
queue(reply().tool(name, input).item())
}),
toolHang: Effect.fn("TestLLMServer.toolHang")(function* (name: string, input: unknown) {
queue(reply().pendingTool(name, input).hang().item())
}),
reason: Effect.fn("TestLLMServer.reason")(function* (value: string, opts?: { text?: string; usage?: Usage }) {
const out = reply().reason(value)
if (opts?.text) out.text(opts.text)
if (opts?.usage) out.usage(opts.usage)
queue(out.stop().item())
}),
fail: Effect.fn("TestLLMServer.fail")(function* (message: unknown = "boom") {
queue(reply().streamError(message).item())
}),
error: Effect.fn("TestLLMServer.error")(function* (status: number, body: unknown) {
queue(httpError(status, body))
}),
hang: Effect.gen(function* () {
push({ type: "hang" })
queue(reply().hang().item())
}).pipe(Effect.withSpan("TestLLMServer.hang")),
hold: Effect.fn("TestLLMServer.hold")(function* (text: string, wait: PromiseLike<unknown>) {
push({ type: "hold", text, wait })
hold: Effect.fn("TestLLMServer.hold")(function* (value: string, wait: PromiseLike<unknown>) {
queue(reply().wait(wait).text(value).stop().item())
}),
hits: Effect.sync(() => [...hits]),
calls: Effect.sync(() => hits.length),
@@ -275,8 +418,5 @@ export class TestLLMServer extends ServiceMap.Service<TestLLMServer, TestLLMServ
pending: Effect.sync(() => list.length),
})
}),
).pipe(
Layer.provide(HttpRouter.layer), //
Layer.provide(NodeHttpServer.layer(() => Http.createServer(), { port: 0 })),
)
).pipe(Layer.provide(HttpRouter.layer), Layer.provide(NodeHttpServer.layer(() => Http.createServer(), { port: 0 })))
}
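Every queued item is framed as OpenAI-style `chat.completion.chunk` SSE lines, terminated by the literal `[DONE]` sentinel. A self-contained sketch of that framing, mirroring the `line`/`chunk` helpers in the file above (a sketch for illustration, not the exported API):

```typescript
// Sentinel marking end-of-stream, rendered as the literal "data: [DONE]" line.
const done = Symbol("done")

// One chat.completion.chunk payload; finish_reason is only present on the
// terminal chunk, matching the OpenAI streaming shape.
function chunk(delta: Record<string, unknown>, finish?: string) {
  return {
    id: "chatcmpl-test",
    object: "chat.completion.chunk",
    choices: [{ delta, ...(finish ? { finish_reason: finish } : {}) }],
  }
}

// Each payload becomes a single SSE "data:" line followed by a blank line.
function line(input: unknown): string {
  if (input === done) return "data: [DONE]\n\n"
  return `data: ${JSON.stringify(input)}\n\n`
}

// A minimal "text" reply: role chunk, content delta, stop chunk, sentinel.
const frames = [chunk({ role: "assistant" }), chunk({ content: "hello" }), chunk({}, "stop"), done].map(line)
```

The `Reply` builder composes exactly these pieces: `reply().text("hello").stop()` queues the same four frames, while `hang()`, `streamError()`, and `reset()` swap the sentinel for a never-ending, failing, or destroyed stream to exercise the client's error paths.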

View File

@@ -1,8 +1,6 @@
import { NodeFileSystem } from "@effect/platform-node"
import { expect } from "bun:test"
import { APICallError } from "ai"
import { Cause, Effect, Exit, Fiber, Layer, ServiceMap } from "effect"
import * as Stream from "effect/Stream"
import { Cause, Effect, Exit, Fiber, Layer } from "effect"
import path from "path"
import type { Agent } from "../../src/agent/agent"
import { Agent as AgentSvc } from "../../src/agent/agent"
@@ -10,7 +8,7 @@ import { Bus } from "../../src/bus"
import { Config } from "../../src/config/config"
import { Permission } from "../../src/permission"
import { Plugin } from "../../src/plugin"
import type { Provider } from "../../src/provider/provider"
import { Provider } from "../../src/provider/provider"
import { ModelID, ProviderID } from "../../src/provider/schema"
import { Session } from "../../src/session"
import { LLM } from "../../src/session/llm"
@@ -21,8 +19,9 @@ import { SessionStatus } from "../../src/session/status"
import { Snapshot } from "../../src/snapshot"
import { Log } from "../../src/util/log"
import * as CrossSpawnSpawner from "../../src/effect/cross-spawn-spawner"
import { provideTmpdirInstance } from "../fixture/fixture"
import { provideTmpdirServer } from "../fixture/fixture"
import { testEffect } from "../lib/effect"
import { reply, TestLLMServer } from "../lib/llm-server"
Log.init({ print: false })
@@ -31,118 +30,51 @@ const ref = {
modelID: ModelID.make("test-model"),
}
type Script = Stream.Stream<LLM.Event, unknown> | ((input: LLM.StreamInput) => Stream.Stream<LLM.Event, unknown>)
class TestLLM extends ServiceMap.Service<
TestLLM,
{
readonly push: (stream: Script) => Effect.Effect<void>
readonly reply: (...items: LLM.Event[]) => Effect.Effect<void>
readonly calls: Effect.Effect<number>
readonly inputs: Effect.Effect<LLM.StreamInput[]>
}
>()("@test/SessionProcessorLLM") {}
function stream(...items: LLM.Event[]) {
return Stream.make(...items)
const cfg = {
provider: {
test: {
name: "Test",
id: "test",
env: [],
npm: "@ai-sdk/openai-compatible",
models: {
"test-model": {
id: "test-model",
name: "Test Model",
attachment: false,
reasoning: false,
temperature: false,
tool_call: true,
release_date: "2025-01-01",
limit: { context: 100000, output: 10000 },
cost: { input: 0, output: 0 },
options: {},
},
},
options: {
apiKey: "test-key",
baseURL: "http://localhost:1/v1",
},
},
},
}
function usage(input = 1, output = 1, total = input + output) {
function providerCfg(url: string) {
return {
inputTokens: input,
outputTokens: output,
totalTokens: total,
inputTokenDetails: {
noCacheTokens: undefined,
cacheReadTokens: undefined,
cacheWriteTokens: undefined,
},
outputTokenDetails: {
textTokens: undefined,
reasoningTokens: undefined,
...cfg,
provider: {
...cfg.provider,
test: {
...cfg.provider.test,
options: {
...cfg.provider.test.options,
baseURL: url,
},
},
},
}
}
function start(): LLM.Event {
return { type: "start" }
}
function textStart(id = "t"): LLM.Event {
return { type: "text-start", id }
}
function textDelta(id: string, text: string): LLM.Event {
return { type: "text-delta", id, text }
}
function textEnd(id = "t"): LLM.Event {
return { type: "text-end", id }
}
function reasoningStart(id: string): LLM.Event {
return { type: "reasoning-start", id }
}
function reasoningDelta(id: string, text: string): LLM.Event {
return { type: "reasoning-delta", id, text }
}
function reasoningEnd(id: string): LLM.Event {
return { type: "reasoning-end", id }
}
function finishStep(): LLM.Event {
return {
type: "finish-step",
finishReason: "stop",
rawFinishReason: "stop",
response: { id: "res", modelId: "test-model", timestamp: new Date() },
providerMetadata: undefined,
usage: usage(),
}
}
function finish(): LLM.Event {
return { type: "finish", finishReason: "stop", rawFinishReason: "stop", totalUsage: usage() }
}
function toolInputStart(id: string, toolName: string): LLM.Event {
return { type: "tool-input-start", id, toolName }
}
function toolCall(toolCallId: string, toolName: string, input: unknown): LLM.Event {
return { type: "tool-call", toolCallId, toolName, input }
}
function fail<E>(err: E, ...items: LLM.Event[]) {
return stream(...items).pipe(Stream.concat(Stream.fail(err)))
}
function hang(_input: LLM.StreamInput, ...items: LLM.Event[]) {
return stream(...items).pipe(Stream.concat(Stream.fromEffect(Effect.never)))
}
function model(context: number): Provider.Model {
return {
id: "test-model",
providerID: "test",
name: "Test",
limit: { context, output: 10 },
cost: { input: 0, output: 0, cache: { read: 0, write: 0 } },
capabilities: {
toolcall: true,
attachment: false,
reasoning: false,
temperature: true,
input: { text: true, image: false, audio: false, video: false },
output: { text: true, image: false, audio: false, video: false },
},
api: { npm: "@ai-sdk/anthropic" },
options: {},
} as Provider.Model
}
function agent(): Agent.Info {
return {
name: "build",
@@ -211,43 +143,6 @@ const assistant = Effect.fn("TestSession.assistant")(function* (
return msg
})
const llm = Layer.unwrap(
Effect.gen(function* () {
const queue: Script[] = []
const inputs: LLM.StreamInput[] = []
let calls = 0
const push = Effect.fn("TestLLM.push")((item: Script) => {
queue.push(item)
return Effect.void
})
const reply = Effect.fn("TestLLM.reply")((...items: LLM.Event[]) => push(stream(...items)))
return Layer.mergeAll(
Layer.succeed(
LLM.Service,
LLM.Service.of({
stream: (input) => {
calls += 1
inputs.push(input)
const item = queue.shift() ?? Stream.empty
return typeof item === "function" ? item(input) : item
},
}),
),
Layer.succeed(
TestLLM,
TestLLM.of({
push,
reply,
calls: Effect.sync(() => calls),
inputs: Effect.sync(() => [...inputs]),
}),
),
)
}),
)
const status = SessionStatus.layer.pipe(Layer.provideMerge(Bus.layer))
const infra = Layer.mergeAll(NodeFileSystem.layer, CrossSpawnSpawner.defaultLayer)
const deps = Layer.mergeAll(
@@ -257,27 +152,37 @@ const deps = Layer.mergeAll(
Permission.layer,
Plugin.defaultLayer,
Config.defaultLayer,
LLM.defaultLayer,
Provider.defaultLayer,
status,
llm,
).pipe(Layer.provideMerge(infra))
const env = SessionProcessor.layer.pipe(Layer.provideMerge(deps))
const env = Layer.mergeAll(TestLLMServer.layer, SessionProcessor.layer.pipe(Layer.provideMerge(deps)))
const it = testEffect(env)
it.live("session.processor effect tests capture llm input cleanly", () => {
return provideTmpdirInstance(
(dir) =>
Effect.gen(function* () {
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const boot = Effect.fn("test.boot")(function* () {
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const provider = yield* Provider.Service
return { processors, session, provider }
})
yield* test.reply(start(), textStart(), textDelta("t", "hello"), textEnd(), finishStep(), finish())
// ---------------------------------------------------------------------------
// Tests
// ---------------------------------------------------------------------------
it.live("session.processor effect tests capture llm input cleanly", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const { processors, session, provider } = yield* boot()
yield* llm.text("hello")
const chat = yield* session.create({})
const parent = yield* user(chat.id, "hi")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
@@ -303,46 +208,29 @@ it.live("session.processor effect tests capture llm input cleanly", () => {
const value = yield* handle.process(input)
const parts = yield* Effect.promise(() => MessageV2.parts(msg.id))
const calls = yield* test.calls
const inputs = yield* test.inputs
const calls = yield* llm.calls
expect(value).toBe("continue")
expect(calls).toBe(1)
expect(inputs).toHaveLength(1)
expect(inputs[0].messages).toStrictEqual([{ role: "user", content: "hi" }])
expect(parts.some((part) => part.type === "text" && part.text === "hello")).toBe(true)
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests stop after token overflow requests compaction", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests stop after token overflow requests compaction", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const { processors, session, provider } = yield* boot()
yield* test.reply(
start(),
{
type: "finish-step",
finishReason: "stop",
rawFinishReason: "stop",
response: { id: "res", modelId: "test-model", timestamp: new Date() },
providerMetadata: undefined,
usage: usage(100, 0, 100),
},
textStart(),
textDelta("t", "after"),
textEnd(),
)
yield* llm.text("after", { usage: { input: 100, output: 0 } })
const chat = yield* session.create({})
const parent = yield* user(chat.id, "compact")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(20)
const base = yield* provider.getModel(ref.providerID, ref.modelID)
const mdl = { ...base, limit: { context: 20, output: 10 } }
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
@@ -369,51 +257,73 @@ it.live("session.processor effect tests stop after token overflow requests compa
const parts = yield* Effect.promise(() => MessageV2.parts(msg.id))
expect(value).toBe("compact")
expect(parts.some((part) => part.type === "text")).toBe(false)
expect(parts.some((part) => part.type === "text" && part.text === "after")).toBe(true)
expect(parts.some((part) => part.type === "step-finish")).toBe(true)
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests reset reasoning state across retries", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests capture reasoning from http mock", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const { processors, session, provider } = yield* boot()
yield* test.push(
fail(
new APICallError({
message: "boom",
url: "https://example.com/v1/chat/completions",
requestBodyValues: {},
statusCode: 503,
responseHeaders: { "retry-after-ms": "0" },
responseBody: '{"error":"boom"}',
isRetryable: true,
}),
start(),
reasoningStart("r"),
reasoningDelta("r", "one"),
),
)
yield* test.reply(
start(),
reasoningStart("r"),
reasoningDelta("r", "two"),
reasoningEnd("r"),
finishStep(),
finish(),
)
yield* llm.push(reply().reason("think").text("done").stop())
const chat = yield* session.create({})
const parent = yield* user(chat.id, "reason")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
model: mdl,
})
const value = yield* handle.process({
user: {
id: parent.id,
sessionID: chat.id,
role: "user",
time: parent.time,
agent: parent.agent,
model: { providerID: ref.providerID, modelID: ref.modelID },
} satisfies MessageV2.User,
sessionID: chat.id,
model: mdl,
agent: agent(),
system: [],
messages: [{ role: "user", content: "reason" }],
tools: {},
})
const parts = yield* Effect.promise(() => MessageV2.parts(msg.id))
const reasoning = parts.find((part): part is MessageV2.ReasoningPart => part.type === "reasoning")
const text = parts.find((part): part is MessageV2.TextPart => part.type === "text")
expect(value).toBe("continue")
expect(yield* llm.calls).toBe(1)
expect(reasoning?.text).toBe("think")
expect(text?.text).toBe("done")
}),
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests reset reasoning state across retries", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const { processors, session, provider } = yield* boot()
yield* llm.push(reply().reason("one").reset(), reply().reason("two").stop())
const chat = yield* session.create({})
const parent = yield* user(chat.id, "reason")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
@@ -441,28 +351,26 @@ it.live("session.processor effect tests reset reasoning state across retries", (
const reasoning = parts.filter((part): part is MessageV2.ReasoningPart => part.type === "reasoning")
expect(value).toBe("continue")
expect(yield* test.calls).toBe(2)
expect(yield* llm.calls).toBe(2)
expect(reasoning.some((part) => part.text === "two")).toBe(true)
expect(reasoning.some((part) => part.text === "onetwo")).toBe(false)
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests do not retry unknown json errors", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests do not retry unknown json errors", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const { processors, session, provider } = yield* boot()
yield* test.push(fail({ error: { message: "no_kv_space" } }, start()))
yield* llm.error(400, { error: { message: "no_kv_space" } })
const chat = yield* session.create({})
const parent = yield* user(chat.id, "json")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
@@ -487,29 +395,26 @@ it.live("session.processor effect tests do not retry unknown json errors", () =>
})
expect(value).toBe("stop")
expect(yield* test.calls).toBe(1)
expect(yield* test.inputs).toHaveLength(1)
expect(handle.message.error?.name).toBe("UnknownError")
expect(yield* llm.calls).toBe(1)
expect(handle.message.error?.name).toBe("APIError")
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests retry recognized structured json errors", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests retry recognized structured json errors", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const { processors, session, provider } = yield* boot()
yield* test.push(fail({ type: "error", error: { type: "too_many_requests" } }, start()))
yield* test.reply(start(), textStart(), textDelta("t", "after"), textEnd(), finishStep(), finish())
yield* llm.error(429, { type: "error", error: { type: "too_many_requests" } })
yield* llm.text("after")
const chat = yield* session.create({})
const parent = yield* user(chat.id, "retry json")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
@@ -536,43 +441,28 @@ it.live("session.processor effect tests retry recognized structured json errors"
const parts = yield* Effect.promise(() => MessageV2.parts(msg.id))
expect(value).toBe("continue")
expect(yield* test.calls).toBe(2)
expect(yield* llm.calls).toBe(2)
expect(parts.some((part) => part.type === "text" && part.text === "after")).toBe(true)
expect(handle.message.error).toBeUndefined()
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests publish retry status updates", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests publish retry status updates", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const { processors, session, provider } = yield* boot()
const bus = yield* Bus.Service
yield* test.push(
fail(
new APICallError({
message: "boom",
url: "https://example.com/v1/chat/completions",
requestBodyValues: {},
statusCode: 503,
responseHeaders: { "retry-after-ms": "0" },
responseBody: '{"error":"boom"}',
isRetryable: true,
}),
start(),
),
)
yield* test.reply(start(), finishStep(), finish())
yield* llm.error(503, { error: "boom" })
yield* llm.text("")
const chat = yield* session.create({})
const parent = yield* user(chat.id, "retry")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const states: number[] = []
const off = yield* bus.subscribeCallback(SessionStatus.Event.Status, (evt) => {
if (evt.properties.sessionID !== chat.id) return
@@ -604,27 +494,25 @@ it.live("session.processor effect tests publish retry status updates", () => {
off()
expect(value).toBe("continue")
expect(yield* test.calls).toBe(2)
expect(yield* llm.calls).toBe(2)
expect(states).toStrictEqual([1])
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests compact on structured context overflow", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests compact on structured context overflow", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const { processors, session, provider } = yield* boot()
yield* test.push(fail({ type: "error", error: { code: "context_length_exceeded" } }, start()))
yield* llm.error(400, { type: "error", error: { code: "context_length_exceeded" } })
const chat = yield* session.create({})
const parent = yield* user(chat.id, "compact json")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
@@ -649,32 +537,25 @@ it.live("session.processor effect tests compact on structured context overflow",
})
expect(value).toBe("compact")
expect(yield* test.calls).toBe(1)
expect(yield* llm.calls).toBe(1)
expect(handle.message.error).toBeUndefined()
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests mark pending tools as aborted on cleanup", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests mark pending tools as aborted on cleanup", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const ready = defer<void>()
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const { processors, session, provider } = yield* boot()
yield* test.push((input) =>
hang(input, start(), toolInputStart("tool-1", "bash"), toolCall("tool-1", "bash", { cmd: "pwd" })).pipe(
Stream.tap((event) => (event.type === "tool-call" ? Effect.sync(() => ready.resolve()) : Effect.void)),
),
)
yield* llm.toolHang("bash", { cmd: "pwd" })
const chat = yield* session.create({})
const parent = yield* user(chat.id, "tool abort")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
@@ -700,7 +581,15 @@ it.live("session.processor effect tests mark pending tools as aborted on cleanup
})
.pipe(Effect.forkChild)
yield* Effect.promise(() => ready.promise)
yield* llm.wait(1)
yield* Effect.promise(async () => {
const end = Date.now() + 500
while (Date.now() < end) {
const parts = await MessageV2.parts(msg.id)
if (parts.some((part) => part.type === "tool")) return
await Bun.sleep(10)
}
})
yield* Fiber.interrupt(run)
const exit = yield* Fiber.await(run)
@@ -708,45 +597,38 @@ it.live("session.processor effect tests mark pending tools as aborted on cleanup
yield* handle.abort()
}
const parts = yield* Effect.promise(() => MessageV2.parts(msg.id))
const tool = parts.find((part): part is MessageV2.ToolPart => part.type === "tool")
const call = parts.find((part): part is MessageV2.ToolPart => part.type === "tool")
expect(Exit.isFailure(exit)).toBe(true)
if (Exit.isFailure(exit)) {
expect(Cause.hasInterruptsOnly(exit.cause)).toBe(true)
}
expect(yield* test.calls).toBe(1)
expect(tool?.state.status).toBe("error")
if (tool?.state.status === "error") {
expect(tool.state.error).toBe("Tool execution aborted")
expect(tool.state.time.end).toBeDefined()
expect(yield* llm.calls).toBe(1)
expect(call?.state.status).toBe("error")
if (call?.state.status === "error") {
expect(call.state.error).toBe("Tool execution aborted")
expect(call.state.time.end).toBeDefined()
}
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests record aborted errors and idle state", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests record aborted errors and idle state", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const ready = defer<void>()
const seen = defer<void>()
const test = yield* TestLLM
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const { processors, session, provider } = yield* boot()
const bus = yield* Bus.Service
const status = yield* SessionStatus.Service
const sts = yield* SessionStatus.Service
yield* test.push((input) =>
hang(input, start()).pipe(
Stream.tap((event) => (event.type === "start" ? Effect.sync(() => ready.resolve()) : Effect.void)),
),
)
yield* llm.hang
const chat = yield* session.create({})
const parent = yield* user(chat.id, "abort")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const errs: string[] = []
const off = yield* bus.subscribeCallback(Session.Event.Error, (evt) => {
if (evt.properties.sessionID !== chat.id) return
@@ -779,7 +661,7 @@ it.live("session.processor effect tests record aborted errors and idle state", (
})
.pipe(Effect.forkChild)
yield* Effect.promise(() => ready.promise)
yield* llm.wait(1)
yield* Fiber.interrupt(run)
const exit = yield* Fiber.await(run)
@@ -788,7 +670,7 @@ it.live("session.processor effect tests record aborted errors and idle state", (
}
yield* Effect.promise(() => seen.promise)
const stored = yield* Effect.promise(() => MessageV2.get({ sessionID: chat.id, messageID: msg.id }))
const state = yield* status.get(chat.id)
const state = yield* sts.get(chat.id)
off()
expect(Exit.isFailure(exit)).toBe(true)
@@ -803,30 +685,23 @@ it.live("session.processor effect tests record aborted errors and idle state", (
expect(state).toMatchObject({ type: "idle" })
expect(errs).toContain("MessageAbortedError")
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)
it.live("session.processor effect tests mark interruptions aborted without manual abort", () => {
return provideTmpdirInstance(
(dir) =>
it.live("session.processor effect tests mark interruptions aborted without manual abort", () =>
provideTmpdirServer(
({ dir, llm }) =>
Effect.gen(function* () {
const ready = defer<void>()
const processors = yield* SessionProcessor.Service
const session = yield* Session.Service
const status = yield* SessionStatus.Service
const test = yield* TestLLM
const { processors, session, provider } = yield* boot()
const sts = yield* SessionStatus.Service
yield* test.push((input) =>
hang(input, start()).pipe(
Stream.tap((event) => (event.type === "start" ? Effect.sync(() => ready.resolve()) : Effect.void)),
),
)
yield* llm.hang
const chat = yield* session.create({})
const parent = yield* user(chat.id, "interrupt")
const msg = yield* assistant(chat.id, parent.id, path.resolve(dir))
const mdl = model(100)
const mdl = yield* provider.getModel(ref.providerID, ref.modelID)
const handle = yield* processors.create({
assistantMessage: msg,
sessionID: chat.id,
@@ -852,12 +727,12 @@ it.live("session.processor effect tests mark interruptions aborted without manua
})
.pipe(Effect.forkChild)
yield* Effect.promise(() => ready.promise)
yield* llm.wait(1)
yield* Fiber.interrupt(run)
const exit = yield* Fiber.await(run)
const stored = yield* Effect.promise(() => MessageV2.get({ sessionID: chat.id, messageID: msg.id }))
const state = yield* status.get(chat.id)
const state = yield* sts.get(chat.id)
expect(Exit.isFailure(exit)).toBe(true)
expect(handle.message.error?.name).toBe("MessageAbortedError")
@@ -867,6 +742,6 @@ it.live("session.processor effect tests mark interruptions aborted without manua
}
expect(state).toMatchObject({ type: "idle" })
}),
{ git: true },
)
})
{ git: true, config: (url) => providerCfg(url) },
),
)

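The migrated tests above script the mock LLM server with steps (`llm.text(…)`, `llm.error(status, body)`) and assert on `llm.calls` after each run. A minimal sketch of that step-queue idea, assuming a simplified in-memory mock — the names and shapes here are illustrative, not the repo's actual TestLLMServer API:

```typescript
// Hypothetical sketch of the step-queue pattern behind an HTTP mock LLM
// server: tests enqueue scripted responses, the handler dequeues one step
// per incoming completion request and counts calls.
type Step =
  | { kind: "text"; text: string }
  | { kind: "httpError"; status: number; body: unknown }

class MockLLMQueue {
  private steps: Step[] = []
  calls = 0

  // Enqueue a successful text completion.
  text(text: string) {
    this.steps.push({ kind: "text", text })
  }

  // Enqueue a non-200 response (e.g. 429 or 503) for retry tests.
  error(status: number, body: unknown) {
    this.steps.push({ kind: "httpError", status, body })
  }

  // One scripted step is consumed per request; running out is a test bug.
  handleRequest(): { status: number; body: unknown } {
    this.calls++
    const step = this.steps.shift()
    if (!step) return { status: 500, body: { error: "no scripted step" } }
    if (step.kind === "httpError") return { status: step.status, body: step.body }
    return { status: 200, body: { choices: [{ message: { content: step.text } }] } }
  }
}
```

A retry test in this style would enqueue `error(429, …)` followed by `text("after")` and expect `calls` to reach 2 once the processor retries — mirroring the `expect(yield* llm.calls).toBe(2)` assertions above.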
View File

@@ -10,66 +10,9 @@ import { Instance } from "../../src/project/instance"
import { MessageID, PartID } from "../../src/session/schema"
import { tmpdir } from "../fixture/fixture"
const projectRoot = path.join(__dirname, "../..")
Log.init({ print: false })
function user(sessionID: string, agent = "default") {
return Session.updateMessage({
id: MessageID.ascending(),
role: "user" as const,
sessionID: sessionID as any,
agent,
model: { providerID: ProviderID.make("openai"), modelID: ModelID.make("gpt-4") },
time: { created: Date.now() },
})
}
function assistant(sessionID: string, parentID: string, dir: string) {
return Session.updateMessage({
id: MessageID.ascending(),
role: "assistant" as const,
sessionID: sessionID as any,
mode: "default",
agent: "default",
path: { cwd: dir, root: dir },
cost: 0,
tokens: { output: 0, input: 0, reasoning: 0, cache: { read: 0, write: 0 } },
modelID: ModelID.make("gpt-4"),
providerID: ProviderID.make("openai"),
parentID: parentID as any,
time: { created: Date.now() },
finish: "end_turn",
})
}
function text(sessionID: string, messageID: string, content: string) {
return Session.updatePart({
id: PartID.ascending(),
messageID: messageID as any,
sessionID: sessionID as any,
type: "text" as const,
text: content,
})
}
function tool(sessionID: string, messageID: string) {
return Session.updatePart({
id: PartID.ascending(),
messageID: messageID as any,
sessionID: sessionID as any,
type: "tool" as const,
tool: "bash",
callID: "call-1",
state: {
status: "completed" as const,
input: {},
output: "done",
title: "",
metadata: {},
time: { start: 0, end: 1 },
},
})
}
describe("revert + compact workflow", () => {
test("should properly handle compact command after revert", async () => {
await using tmp = await tmpdir({ git: true })
@@ -340,98 +283,4 @@ describe("revert + compact workflow", () => {
},
})
})
test("cleanup with partID removes parts from the revert point onward", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
const sid = session.id
const u1 = await user(sid)
const p1 = await text(sid, u1.id, "first part")
const p2 = await tool(sid, u1.id)
const p3 = await text(sid, u1.id, "third part")
// Set revert state pointing at a specific part
await Session.setRevert({
sessionID: sid,
revert: { messageID: u1.id, partID: p2.id },
summary: { additions: 0, deletions: 0, files: 0 },
})
const info = await Session.get(sid)
await SessionRevert.cleanup(info)
const msgs = await Session.messages({ sessionID: sid })
expect(msgs.length).toBe(1)
// Only the first part should remain (before the revert partID)
expect(msgs[0].parts.length).toBe(1)
expect(msgs[0].parts[0].id).toBe(p1.id)
const cleared = await Session.get(sid)
expect(cleared.revert).toBeUndefined()
},
})
})
test("cleanup removes messages after revert point but keeps earlier ones", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
const sid = session.id
const u1 = await user(sid)
await text(sid, u1.id, "hello")
const a1 = await assistant(sid, u1.id, tmp.path)
await text(sid, a1.id, "hi back")
const u2 = await user(sid)
await text(sid, u2.id, "second question")
const a2 = await assistant(sid, u2.id, tmp.path)
await text(sid, a2.id, "second answer")
// Revert from u2 onward
await Session.setRevert({
sessionID: sid,
revert: { messageID: u2.id },
summary: { additions: 0, deletions: 0, files: 0 },
})
const info = await Session.get(sid)
await SessionRevert.cleanup(info)
const msgs = await Session.messages({ sessionID: sid })
const ids = msgs.map((m) => m.info.id)
expect(ids).toContain(u1.id)
expect(ids).toContain(a1.id)
expect(ids).not.toContain(u2.id)
expect(ids).not.toContain(a2.id)
},
})
})
test("cleanup is a no-op when session has no revert state", async () => {
await using tmp = await tmpdir({ git: true })
await Instance.provide({
directory: tmp.path,
fn: async () => {
const session = await Session.create({})
const sid = session.id
const u1 = await user(sid)
await text(sid, u1.id, "hello")
const info = await Session.get(sid)
expect(info.revert).toBeUndefined()
await SessionRevert.cleanup(info)
const msgs = await Session.messages({ sessionID: sid })
expect(msgs.length).toBe(1)
},
})
})
})
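The partID test above expects cleanup to keep only parts created before the revert part (`p1` survives, `p2` and `p3` are removed). A minimal sketch of that selection rule, assuming parts carry ascending IDs that sort lexicographically in creation order — a simplification for illustration, not the repo's actual `SessionRevert.cleanup` implementation:

```typescript
// Illustrative only: keep parts strictly before the revert partID.
// Assumes ascending IDs compare lexicographically in creation order.
interface Part {
  id: string
}

function partsAfterCleanup(parts: Part[], revertPartID: string): Part[] {
  // The revert part itself and everything after it are removed.
  return parts.filter((part) => part.id < revertPartID)
}
```

With three parts and a revert pointing at the second, only the first remains — matching the test's `msgs[0].parts.length` of 1.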

View File

@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/plugin",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {
@@ -21,8 +21,8 @@
"zod": "catalog:"
},
"peerDependencies": {
"@opentui/core": ">=0.1.95",
"@opentui/solid": ">=0.1.95"
"@opentui/core": ">=0.1.93",
"@opentui/solid": ">=0.1.93"
},
"peerDependenciesMeta": {
"@opentui/core": {
@@ -33,8 +33,8 @@
}
},
"devDependencies": {
"@opentui/core": "0.1.95",
"@opentui/solid": "0.1.95",
"@opentui/core": "0.1.93",
"@opentui/solid": "0.1.93",
"@tsconfig/node22": "catalog:",
"@types/node": "catalog:",
"typescript": "catalog:",

View File

@@ -1,7 +1,7 @@
{
"$schema": "https://json.schemastore.org/package.json",
"name": "@opencode-ai/sdk",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -7040,6 +7040,44 @@
},
"components": {
"schemas": {
"Event.installation.updated": {
"type": "object",
"properties": {
"type": {
"type": "string",
"const": "installation.updated"
},
"properties": {
"type": "object",
"properties": {
"version": {
"type": "string"
}
},
"required": ["version"]
}
},
"required": ["type", "properties"]
},
"Event.installation.update-available": {
"type": "object",
"properties": {
"type": {
"type": "string",
"const": "installation.update-available"
},
"properties": {
"type": "object",
"properties": {
"version": {
"type": "string"
}
},
"required": ["version"]
}
},
"required": ["type", "properties"]
},
"Project": {
"type": "object",
"properties": {
@@ -7116,44 +7154,6 @@
},
"required": ["type", "properties"]
},
"Event.installation.updated": {
"type": "object",
"properties": {
"type": {
"type": "string",
"const": "installation.updated"
},
"properties": {
"type": "object",
"properties": {
"version": {
"type": "string"
}
},
"required": ["version"]
}
},
"required": ["type", "properties"]
},
"Event.installation.update-available": {
"type": "object",
"properties": {
"type": {
"type": "string",
"const": "installation.update-available"
},
"properties": {
"type": "object",
"properties": {
"version": {
"type": "string"
}
},
"required": ["version"]
}
},
"required": ["type", "properties"]
},
"Event.server.instance.disposed": {
"type": "object",
"properties": {
@@ -9733,15 +9733,15 @@
},
"Event": {
"anyOf": [
{
"$ref": "#/components/schemas/Event.project.updated"
},
{
"$ref": "#/components/schemas/Event.installation.updated"
},
{
"$ref": "#/components/schemas/Event.installation.update-available"
},
{
"$ref": "#/components/schemas/Event.project.updated"
},
{
"$ref": "#/components/schemas/Event.server.instance.disposed"
},

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/slack",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"scripts": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/ui",
"version": "1.3.13",
"version": "1.3.11",
"type": "module",
"license": "MIT",
"exports": {

View File

@@ -1,6 +1,6 @@
{
"name": "@opencode-ai/util",
"version": "1.3.13",
"version": "1.3.11",
"private": true,
"type": "module",
"license": "MIT",

View File

@@ -2,7 +2,7 @@
"name": "@opencode-ai/web",
"type": "module",
"license": "MIT",
"version": "1.3.13",
"version": "1.3.11",
"scripts": {
"dev": "astro dev",
"dev:remote": "VITE_API_URL=https://api.opencode.ai astro dev",

View File

@@ -2,7 +2,7 @@
"name": "opencode",
"displayName": "opencode",
"description": "opencode for VS Code",
"version": "1.3.13",
"version": "1.3.11",
"publisher": "sst-dev",
"repository": {
"type": "git",