Compare commits

...

10 Commits

Author SHA1 Message Date
Aiden Cline
e6a3be31de merge dev 2025-11-21 01:44:45 -06:00
Ian Maurer
715265de4b Update .gitignore to remove '*.bun-build'
Remove '*.bun-build' from the .gitignore file
2025-11-12 11:33:59 -05:00
Ian Maurer
2048f32491 Merge branch 'dev' into fix/cli-clean-exit-on-model-errors 2025-11-12 11:30:36 -05:00
GitHub Action
2d7ba43a21 chore: format code 2025-11-12 16:14:12 +00:00
Ian Maurer
bd198d8550 fix(cli): robust suggestions for unknown provider and model-only input; ignore bun build artifacts 2025-11-12 11:13:34 -05:00
GitHub Action
cbffbcdd3d chore: format code 2025-11-12 15:43:30 +00:00
Ian Maurer
2be8b2269f feat(cli): suggest closest provider/model on not found ("Did you mean…")

Summary
- Add fuzzy suggestions to ProviderModelNotFoundError with up to 3 candidates
- Normalize punctuation (e.g., 4.5 vs 4-5) and case to better match common typos
- Support model-only input (no provider) by searching across all providers
- Enhance CLI error formatter to display suggestions when present

Implementation
- provider.ts: use fuzzysort; add normalization by stripping non-alphanumerics; search by key for robust matches
- provider.ts: when provider is unknown and model is empty, treat token as unqualified model and search across all providers' models; otherwise suggest provider matches
- error.ts: print "Did you mean: <provider/model>, …" when suggestions exist

Examples
1) Typo in model ID
   $ bun run ./src/index.ts run --model anthropic/claude-haiu-4-5 "hi"
   Error: Model not found: anthropic/claude-haiu-4-5
   Did you mean: anthropic/claude-haiku-4-5, anthropic/claude-haiku-4-5-20251001
   Try: `opencode models` to list available models
   Or check your config (opencode.json) provider/model names

2) Dot vs dash (punctuation normalization)
   $ bun run ./src/index.ts run --model anthropic/claude-haiku-4.5 "hi"
   Error: Model not found: anthropic/claude-haiku-4.5
   Did you mean: anthropic/claude-haiku-4-5, anthropic/claude-haiku-4-5-20251001
   Try: `opencode models` to list available models
   Or check your config (opencode.json) provider/model names

3) Missing provider (model-only input)
   $ bun run ./src/index.ts run --model big-pickle "hi"
   Error: Model not found: big-pickle/
   Did you mean: opencode/big-pickle

4) Correct model after suggestion
   $ bun run ./src/index.ts run --model opencode/big-pickle "hi"
   Hi! How can I help you with your opencode project today?

Notes
- Suggestions are hints only; behavior is unchanged (no auto-selection).
- This runs locally as part of the CLI error path; performance impact is negligible (small in-memory scans).
2025-11-12 10:42:18 -05:00
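The matching strategy this commit describes (strip punctuation and case, then rank candidates by similarity) can be sketched without any dependency. This is a minimal stand-in, not the actual patch: the real code tries fuzzysort first and only falls back to Levenshtein distance, and `suggest` here is a hypothetical helper name.

```typescript
// Strip non-alphanumerics so "4.5" and "4-5" normalize identically.
const normalize = (s: string): string => s.toLowerCase().replace(/[^a-z0-9]/g, "")

// Classic dynamic-programming edit distance, as used in the patch's fallback path.
function levenshtein(a: string, b: string): number {
  const m = a.length
  const n = b.length
  const dp = Array.from({ length: m + 1 }, () => new Array<number>(n + 1).fill(0))
  for (let i = 0; i <= m; i++) dp[i][0] = i
  for (let j = 0; j <= n; j++) dp[0][j] = j
  for (let i = 1; i <= m; i++)
    for (let j = 1; j <= n; j++)
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,
        dp[i][j - 1] + 1,
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1),
      )
  return dp[m][n]
}

// Hypothetical helper: rank candidates by normalized edit distance, keep the closest few.
function suggest(input: string, candidates: string[], limit = 3): string[] {
  return candidates
    .map((c) => ({ c, d: levenshtein(normalize(input), normalize(c)) }))
    .sort((x, y) => x.d - y.d)
    .slice(0, limit)
    .map((x) => x.c)
}

console.log(suggest("claude-haiku-4.5", ["claude-haiku-4-5", "claude-sonnet-4-5", "gpt-4o"])[0])
// → "claude-haiku-4-5" (distance 0 after normalization)
```

Because normalization runs on both the query and every candidate, dot-vs-dash typos collapse to exact matches before any distance computation happens.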
Ian Maurer
c1fa257a92 Merge branch 'dev' into fix/cli-clean-exit-on-model-errors 2025-11-11 17:18:23 -05:00
Ian Maurer
7fd81dd93e chore(cli): resolve merge conflict with dev in bootstrap; keep try/finally with explicit return 2025-11-11 17:16:47 -05:00
Ian Maurer
d554e7aaef fix(cli): always dispose instance on error to prevent hanging; add friendly ProviderModelNotFoundError/InitError messages

- Wrap bootstrap callback in try/finally to guarantee Instance.dispose()
- Format provider/model errors into actionable guidance (opencode models, config)

Repro: running opencode run --model typo/claude-haiku-4-5 prints stack and hangs until SIGINT due to lingering watchers.
Fix: disposing Instance tears down watchers/subscriptions, allowing the process to exit.

Notes: Prior attempt (#3083) explicitly exited; this approach addresses the root cause without forcing exit and improves UX for common misconfigurations.
2025-11-11 17:13:55 -05:00
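The disposal guarantee this commit relies on can be sketched as a generic try/finally wrapper. The names below (`withInstance`, the `run`/`dispose` shape) are hypothetical illustrations, not the actual bootstrap API:

```typescript
// try/finally guarantees dispose() runs on both the success and the throw path,
// so lingering watchers/subscriptions cannot keep the process alive.
async function withInstance<T>(create: () => { run(): Promise<T>; dispose(): void }): Promise<T> {
  const instance = create()
  try {
    return await instance.run() // `return await` keeps the rejection inside the try
  } finally {
    instance.dispose() // runs even when run() throws
  }
}

withInstance(() => ({
  run: async () => "done",
  dispose: () => console.log("disposed"),
})).then(console.log)
// prints "disposed" then "done"
```

Note the `return await`: without `await`, a rejected promise would escape the try block before `finally` could observe it, though `finally` would still run before the caller sees the rejection.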
2 changed files with 86 additions and 2 deletions

View File

@@ -1,11 +1,24 @@
import { ConfigMarkdown } from "@/config/markdown"
import { Config } from "../config/config"
import { MCP } from "../mcp"
import { Provider } from "../provider/provider"
import { UI } from "./ui"
export function FormatError(input: unknown) {
if (MCP.Failed.isInstance(input))
return `MCP server "${input.data.name}" failed. Note, opencode does not support MCP authentication yet.`
if (Provider.ModelNotFoundError.isInstance(input)) {
const { providerID, modelID, suggestions } = input.data
return [
`Model not found: ${providerID}/${modelID}`,
...(Array.isArray(suggestions) && suggestions.length ? ["Did you mean: " + suggestions.join(", ")] : []),
`Try: \`opencode models\` to list available models`,
`Or check your config (opencode.json) provider/model names`,
].join("\n")
}
if (Provider.InitError.isInstance(input)) {
return `Failed to initialize provider "${input.data.providerID}". Check credentials and configuration.`
}
if (Config.JsonError.isInstance(input)) {
return (
`Config file at ${input.data.path} is not valid JSON(C)` + (input.data.message ? `: ${input.data.message}` : "")

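The `FormatError` branch added above can be checked in isolation. A standalone sketch, with a hypothetical `formatModelNotFound` helper mirroring the added message format (the real code reads these fields off `Provider.ModelNotFoundError`):

```typescript
// Build the multi-line error message; the "Did you mean" line is included
// only when at least one suggestion is present, matching the diff above.
function formatModelNotFound(providerID: string, modelID: string, suggestions?: string[]): string {
  return [
    `Model not found: ${providerID}/${modelID}`,
    ...(suggestions && suggestions.length ? ["Did you mean: " + suggestions.join(", ")] : []),
    "Try: `opencode models` to list available models",
    "Or check your config (opencode.json) provider/model names",
  ].join("\n")
}

console.log(formatModelNotFound("anthropic", "claude-haiu-4-5", ["anthropic/claude-haiku-4-5"]))
```

Spreading a conditionally-empty array into the literal keeps the optional line out of the output entirely instead of leaving a blank line behind.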
View File

@@ -1,4 +1,5 @@
import z from "zod"
import fuzzysort from "fuzzysort"
import { Config } from "../config/config"
import { mergeDeep, sortBy } from "remeda"
import { NoSuchModelError, type LanguageModel, type Provider as SDK } from "ai"
@@ -597,9 +598,78 @@ export namespace Provider {
})
const provider = s.providers[providerID]
if (!provider) throw new ModelNotFoundError({ providerID, modelID })
if (!provider) {
let suggestions: string[] = []
const normalize = (str: string) => str.toLowerCase().replace(/[^a-z0-9]/g, "")
const levenshtein = (a: string, b: string) => {
const m = a.length,
n = b.length
const dp = Array.from({ length: m + 1 }, () => new Array<number>(n + 1).fill(0))
for (let i = 0; i <= m; i++) dp[i][0] = i
for (let j = 0; j <= n; j++) dp[0][j] = j
for (let i = 1; i <= m; i++) {
for (let j = 1; j <= n; j++) {
const cost = a[i - 1] === b[j - 1] ? 0 : 1
dp[i][j] = Math.min(dp[i - 1][j] + 1, dp[i][j - 1] + 1, dp[i - 1][j - 1] + cost)
}
}
return dp[m][n]
}
if (!modelID || modelID.trim() === "") {
// Treat single-token input as an unqualified model; search across all providers' models.
const q = normalize(providerID)
const entries: { combo: string; norm: string }[] = []
for (const [pid, prov] of Object.entries(s.providers)) {
for (const mid of Object.keys(prov.info.models)) {
entries.push({ combo: pid + "/" + mid, norm: normalize(mid) })
}
}
const byNorm = fuzzysort.go(q, entries as any, { limit: 5, key: "norm" }).map((r: any) => r.obj.combo)
const combos = entries.map((e) => e.combo)
const byRaw = fuzzysort.go(providerID, combos, { limit: 5 }).map((r) => r.target)
let merged = Array.from(new Set([...byNorm, ...byRaw]))
if (merged.length === 0) {
// fallback to edit distance on normalized mid
const scored = entries
.map((e) => ({ combo: e.combo, d: levenshtein(q, e.norm) }))
.sort((a, b) => a.d - b.d)
.slice(0, 3)
.map((x) => x.combo)
merged = scored
}
suggestions = merged.slice(0, 3)
} else {
const pcands = Object.keys(s.providers)
const corpus = pcands.map((raw) => ({ raw, norm: normalize(raw) }))
const q = normalize(providerID)
const hits = fuzzysort.go(q, corpus as any, { limit: 5, key: "norm" })
let ranked = hits.map((r: any) => r.obj.raw)
if (ranked.length === 0) {
ranked = pcands
.map((p) => ({ p, d: levenshtein(q, normalize(p)) }))
.sort((a, b) => a.d - b.d)
.slice(0, 3)
.map((x) => x.p)
}
const providerSuggestions = ranked.map((r) => r + "/" + modelID)
suggestions = providerSuggestions
}
throw new ModelNotFoundError({ providerID, modelID, suggestions })
}
const info = provider.info.models[modelID]
if (!info) throw new ModelNotFoundError({ providerID, modelID })
if (!info) {
const candidates = Object.keys(provider.info.models)
// Normalize punctuation differences like '-' vs '.' by stripping non-alphanumerics
const normalize = (s: string) => s.toLowerCase().replace(/[^a-z0-9]/g, "")
const corpus = candidates.map((raw) => ({ raw, norm: normalize(raw) }))
const query = normalize(modelID)
const results = fuzzysort.go(query, corpus as any, { limit: 5, key: "norm" })
const ranked = results.map((r) => ("obj" in r ? (r as any).obj.raw : (r as any).target)) as string[]
const fallback = fuzzysort.go(modelID, candidates, { limit: 5 }).map((r) => r.target)
const merged = Array.from(new Set([...ranked, ...fallback]))
const suggestions = merged.slice(0, 3).map((m) => providerID + "/" + m)
throw new ModelNotFoundError({ providerID, modelID, suggestions })
}
const sdk = await getSDK(provider.info, info)
try {
@@ -700,6 +770,7 @@ export namespace Provider {
z.object({
providerID: z.string(),
modelID: z.string(),
suggestions: z.array(z.string()).optional(),
}),
)
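The model-only path above depends on fuzzysort, but the cross-provider search idea itself is simple: flatten every provider's models into "provider/model" combos and rank them against the bare token. A dependency-free sketch with a hypothetical `suggestUnqualified` helper, using a deliberately crude score in place of the patch's fuzzy match plus Levenshtein fallback:

```typescript
// Assumed catalog shape for illustration: providerID -> list of model IDs.
type Catalog = Record<string, string[]>

function suggestUnqualified(token: string, catalog: Catalog, limit = 3): string[] {
  const norm = (s: string) => s.toLowerCase().replace(/[^a-z0-9]/g, "")
  const q = norm(token)
  const combos: { combo: string; score: number }[] = []
  for (const [pid, models] of Object.entries(catalog)) {
    for (const mid of models) {
      const n = norm(mid)
      // Crude stand-in score: exact normalized match, then containment, then length gap.
      const score = n === q ? 0 : n.includes(q) || q.includes(n) ? 1 : 2 + Math.abs(n.length - q.length)
      combos.push({ combo: `${pid}/${mid}`, score })
    }
  }
  return combos
    .sort((a, b) => a.score - b.score)
    .slice(0, limit)
    .map((c) => c.combo)
}

console.log(suggestUnqualified("big-pickle", { opencode: ["big-pickle", "grok-code"], openai: ["gpt-4o"] })[0])
// → "opencode/big-pickle"
```

This mirrors the diff's structure (build combos across all providers, score, take the top 3) while leaving the actual ranking quality to the real fuzzysort/Levenshtein implementation.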