Investigating SDK chat function error

/home/jm/data/code/opencode/packages/opencode · 30 messages · 82 parts · 68,715 tokens
Created Jan 21, 05:16:36 PM · Updated Jan 26, 01:41:22 AM · 1 sync
userJan 21, 05:16:36 PM

Can you investigate into what is causing this error here?

DEBUG 2026-01-21T17:13:37 +1ms service=network url=https://api.z.ai/api/coding/paas/v4/chat/completions method=POST network request
INFO 2026-01-21T17:13:37 +0ms service=bus type=session.updated publishing
INFO 2026-01-21T17:13:37 +0ms service=bus type=session.diff publishing
INFO 2026-01-21T17:13:37 +6ms service=session.processor process
INFO 2026-01-21T17:13:37 +0ms service=llm providerID=cloudflare-ai-gateway modelID=anthropic/claude-3.5-haiku sessionID=ses_41e73e73fffeLKAcPnMU1XvWSq small=false agent=build stream
INFO 2026-01-21T17:13:37 +0ms service=provider status=started providerID=cloudflare-ai-gateway getSDK
INFO 2026-01-21T17:13:37 +0ms service=provider providerID=cloudflare-ai-gateway pkg=@ai-sdk/openai-compatible using bundled provider
INFO 2026-01-21T17:13:37 +0ms service=provider status=completed duration=0 providerID=cloudflare-ai-gateway getSDK
ERROR 2026-01-21T17:13:37 +1ms service=session.processor error=sdk.chat is not a function. (In 'sdk.chat(modelID)', 'sdk.chat' is undefined) stack="TypeError: sdk.chat is not a function. (In 'sdk.chat(modelID)', 'sdk.chat' is undefined)\n at getModel (/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:374:22)\n at getModel (/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:373:34)\n at getLanguage (/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:949:38)\n at processTicksAndRejections (native:7:39)" process

assistantJan 21, 05:16:36 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/provider/provider.ts
Output
<file>
00331|         }
00332|         return undefined
00333|       })
00334|       const deploymentId = Env.get("AICORE_DEPLOYMENT_ID")
00335|       const resourceGroup = Env.get("AICORE_RESOURCE_GROUP")
00336| 
00337|       return {
00338|         autoload: !!envServiceKey,
00339|         options: envServiceKey ? { deploymentId, resourceGroup } : {},
00340|         async getModel(sdk: any, modelID: string) {
00341|           return sdk(modelID)
00342|         },
00343|       }
00344|     },
00345|     zenmux: async () => {
00346|       return {
00347|         autoload: false,
00348|         options: {
00349|           headers: {
00350|             "HTTP-Referer": "https://opencode.ai/",
00351|             "X-Title": "opencode",
00352|           },
00353|         },
00354|       }
00355|     },
00356|     "cloudflare-ai-gateway": async (input) => {
00357|       const accountId = Env.get("CLOUDFLARE_ACCOUNT_ID")
00358|       const gateway = Env.get("CLOUDFLARE_GATEWAY_ID")
00359| 
00360|       if (!accountId || !gateway) return { autoload: false }
00361| 
00362|       // Get API token from env or auth prompt
00363|       const apiToken = await (async () => {
00364|         const envToken = Env.get("CLOUDFLARE_API_TOKEN")
00365|         if (envToken) return envToken
00366|         const auth = await Auth.get(input.id)
00367|         if (auth?.type === "api") return auth.key
00368|         return undefined
00369|       })()
00370| 
00371|       return {
00372|         autoload: true,
00373|         async getModel(sdk: any, modelID: string, _options?: Record<string, any>) {
00374|           return sdk.chat(modelID)
00375|         },
00376|         options: {
00377|           baseURL: `https://gateway.ai.cloudflare.com/v1/${accountId}/${gateway}/compat`,
00378|           headers: {
00379|             // Cloudflare AI Gateway uses cf-aig-authorization for authenticated gateways
00380|             // This enables Unified Billing where Cloudflare handles upstream provider...
readcompleted
packages/opencode/src/provider/provider.ts
Output
<file>
00901|       const mod = await import(installedPath)
00902| 
00903|       const fn = mod[Object.keys(mod).find((key) => key.startsWith("create"))!]
00904|       const loaded = fn({
00905|         name: model.providerID,
00906|         ...options,
00907|       })
00908|       s.sdk.set(key, loaded)
00909|       return loaded as SDK
00910|     } catch (e) {
00911|       throw new InitError({ providerID: model.providerID }, { cause: e })
00912|     }
00913|   }
00914| 
00915|   export async function getProvider(providerID: string) {
00916|     return state().then((s) => s.providers[providerID])
00917|   }
00918| 
00919|   export async function getModel(providerID: string, modelID: string) {
00920|     const s = await state()
00921|     const provider = s.providers[providerID]
00922|     if (!provider) {
00923|       const availableProviders = Object.keys(s.providers)
00924|       const matches = fuzzysort.go(providerID, availableProviders, { limit: 3, threshold: -10000 })
00925|       const suggestions = matches.map((m) => m.target)
00926|       throw new ModelNotFoundError({ providerID, modelID, suggestions })
00927|     }
00928| 
00929|     const info = provider.models[modelID]
00930|     if (!info) {
00931|       const availableModels = Object.keys(provider.models)
00932|       const matches = fuzzysort.go(modelID, availableModels, { limit: 3, threshold: -10000 })
00933|       const suggestions = matches.map((m) => m.target)
00934|       throw new ModelNotFoundError({ providerID, modelID, suggestions })
00935|     }
00936|     return info
00937|   }
00938| 
00939|   export async function getLanguage(model: Model): Promise<LanguageModelV2> {
00940|     const s = await state()
00941|     const key = `${model.providerID}/${model.id}`
00942|     if (s.models.has(key)) return s.models.get(key)!
00943| 
00944|     const provider = s.providers[model.providerID]
00945|     const sdk = await getSDK(model)
00946| 
00947|     try {
00948|       const language = s.modelL...
9,954in / 94out
assistantJan 21, 05:16:41 PM
gpt-5.2-codex · github-copilot
grepcompleted
modelLoaders
Output
Found 5 matches
/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:
  Line 580:     const modelLoaders: {
  Line 763:         if (result.getModel) modelLoaders[providerID] = result.getModel
  Line 822:       modelLoaders,
  Line 948:       const language = s.modelLoaders[model.providerID]
  Line 949:         ? await s.modelLoaders[model.providerID](sdk, model.api.id, provider.options)
4,040in / 29out
assistantJan 21, 05:16:45 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/provider/provider.ts
Output
<file>
00561|   }
00562| 
00563|   const state = Instance.state(async () => {
00564|     using _ = log.time("state")
00565|     const config = await Config.get()
00566|     const modelsDev = await ModelsDev.get()
00567|     const database = mapValues(modelsDev, fromModelsDevProvider)
00568| 
00569|     const disabled = new Set(config.disabled_providers ?? [])
00570|     const enabled = config.enabled_providers ? new Set(config.enabled_providers) : null
00571| 
00572|     function isProviderAllowed(providerID: string): boolean {
00573|       if (enabled && !enabled.has(providerID)) return false
00574|       if (disabled.has(providerID)) return false
00575|       return true
00576|     }
00577| 
00578|     const providers: { [providerID: string]: Info } = {}
00579|     const languages = new Map<string, LanguageModelV2>()
00580|     const modelLoaders: {
00581|       [providerID: string]: CustomModelLoader
00582|     } = {}
00583|     const sdk = new Map<number, SDK>()
00584| 
00585|     log.info("init")
00586| 
00587|     const configProviders = Object.entries(config.provider ?? {})
00588| 
00589|     // Add GitHub Copilot Enterprise provider that inherits from GitHub Copilot
00590|     if (database["github-copilot"]) {
00591|       const githubCopilot = database["github-copilot"]
00592|       database["github-copilot-enterprise"] = {
00593|         ...githubCopilot,
00594|         id: "github-copilot-enterprise",
00595|         name: "GitHub Copilot Enterprise",
00596|         models: mapValues(githubCopilot.models, (model) => ({
00597|           ...model,
00598|           providerID: "github-copilot-enterprise",
00599|         })),
00600|       }
00601|     }
00602| 
00603|     function mergeProvider(providerID: string, provider: Partial<Info>) {
00604|       const existing = providers[providerID]
00605|       if (existing) {
00606|         // @ts-expect-error
00607|         providers[providerID] = mergeDeep(existing, provider)
00608|         return
00609|       }
0...
217in / 40out
assistantJan 21, 05:16:51 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/provider/provider.ts
Output
<file>
00721| 
00722|       // Special handling for github-copilot: also check for enterprise auth
00723|       if (providerID === "github-copilot" && !hasAuth) {
00724|         const enterpriseAuth = await Auth.get("github-copilot-enterprise")
00725|         if (enterpriseAuth) hasAuth = true
00726|       }
00727| 
00728|       if (!hasAuth) continue
00729|       if (!plugin.auth.loader) continue
00730| 
00731|       // Load for the main provider if auth exists
00732|       if (auth) {
00733|         const options = await plugin.auth.loader(() => Auth.get(providerID) as any, database[plugin.auth.provider])
00734|         mergeProvider(plugin.auth.provider, {
00735|           source: "custom",
00736|           options: options,
00737|         })
00738|       }
00739| 
00740|       // If this is github-copilot plugin, also register for github-copilot-enterprise if auth exists
00741|       if (providerID === "github-copilot") {
00742|         const enterpriseProviderID = "github-copilot-enterprise"
00743|         if (!disabled.has(enterpriseProviderID)) {
00744|           const enterpriseAuth = await Auth.get(enterpriseProviderID)
00745|           if (enterpriseAuth) {
00746|             const enterpriseOptions = await plugin.auth.loader(
00747|               () => Auth.get(enterpriseProviderID) as any,
00748|               database[enterpriseProviderID],
00749|             )
00750|             mergeProvider(enterpriseProviderID, {
00751|               source: "custom",
00752|               options: enterpriseOptions,
00753|             })
00754|           }
00755|         }
00756|       }
00757|     }
00758| 
00759|     for (const [providerID, fn] of Object.entries(CUSTOM_LOADERS)) {
00760|       if (disabled.has(providerID)) continue
00761|       const result = await fn(database[providerID])
00762|       if (result && (result.autoload || providers[providerID])) {
00763|         if (result.getModel) modelLoaders[providerID] = result.getModel
00764|         mergeProvi...
1,735in / 40out
assistantJan 21, 05:16:55 PM
gpt-5.2-codex · github-copilot
grepcompleted
openai-compatible
Output
Found 10 matches
/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:
  Line 25: import { createOpenAICompatible } from "@ai-sdk/openai-compatible"
  Line 27: import { createOpenaiCompatible as createGitHubCopilotOpenAICompatible } from "./sdk/openai-compatible/src"
  Line 49:     "@ai-sdk/openai-compatible": createOpenAICompatible,
  Line 839:       if (model.api.npm.includes("@ai-sdk/openai-compatible") && options["includeUsage"] !== false) {

/home/jm/data/code/opencode/packages/opencode/src/provider/sdk/openai-compatible/src/openai-compatible-provider.ts:
  Line 2: import { OpenAICompatibleChatLanguageModel } from "@ai-sdk/openai-compatible"
  Line 66:   const getHeaders = () => withUserAgentSuffix(headers, `ai-sdk/openai-compatible/${VERSION}`)
  Line 70:       provider: `${options.name ?? "openai-compatible"}.chat`,
  Line 79:       provider: `${options.name ?? "openai-compatible"}.responses`,

/home/jm/data/code/opencode/packages/opencode/src/provider/sdk/openai-compatible/src/index.ts:
  Line 1: export { createOpenaiCompatible, openaiCompatible } from "./openai-compatible-provider"
  Line 2: export type { OpenaiCompatibleProvider, OpenaiCompatibleProviderSettings } from "./openai-compatible-provider"
grepcompleted
sdk.chat
Output
Found 5 matches
/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:
  Line 121:           return sdk.chat(modelID)
  Line 133:           return sdk.chat(modelID)
  Line 143:             return sdk.chat(modelID)
  Line 157:             return sdk.chat(modelID)
  Line 374:           return sdk.chat(modelID)
837in / 65out
assistantJan 21, 05:17:01 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/provider/provider.ts
Output
<file>
00081|       }
00082|     },
00083|     async opencode(input) {
00084|       const hasKey = await (async () => {
00085|         const env = Env.all()
00086|         if (input.env.some((item) => env[item])) return true
00087|         if (await Auth.get(input.id)) return true
00088|         const config = await Config.get()
00089|         if (config.provider?.["opencode"]?.options?.apiKey) return true
00090|         return false
00091|       })()
00092| 
00093|       if (!hasKey) {
00094|         for (const [key, value] of Object.entries(input.models)) {
00095|           if (value.cost.input === 0) continue
00096|           delete input.models[key]
00097|         }
00098|       }
00099| 
00100|       return {
00101|         autoload: Object.keys(input.models).length > 0,
00102|         options: hasKey ? {} : { apiKey: "public" },
00103|       }
00104|     },
00105|     openai: async () => {
00106|       return {
00107|         autoload: false,
00108|         async getModel(sdk: any, modelID: string, _options?: Record<string, any>) {
00109|           return sdk.responses(modelID)
00110|         },
00111|         options: {},
00112|       }
00113|     },
00114|     "github-copilot": async () => {
00115|       return {
00116|         autoload: false,
00117|         async getModel(sdk: any, modelID: string, _options?: Record<string, any>) {
00118|           if (modelID.includes("codex")) {
00119|             return sdk.responses(modelID)
00120|           }
00121|           return sdk.chat(modelID)
00122|         },
00123|         options: {},
00124|       }
00125|     },
00126|     "github-copilot-enterprise": async () => {
00127|       return {
00128|         autoload: false,
00129|         async getModel(sdk: any, modelID: string, _options?: Record<string, any>) {
00130|           if (modelID.includes("codex")) {
00131|             return sdk.responses(modelID)
00132|           }
00133|           return sdk.chat(modelID)
00134|         },
00135|         options: ...
1,499in / 40out
assistantJan 21, 05:17:05 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/provider/sdk/openai-compatible/src/openai-compatible-provider.ts
Output
<file>
00002| import { OpenAICompatibleChatLanguageModel } from "@ai-sdk/openai-compatible"
00003| import { type FetchFunction, withoutTrailingSlash, withUserAgentSuffix } from "@ai-sdk/provider-utils"
00004| import { OpenAIResponsesLanguageModel } from "./responses/openai-responses-language-model"
00005| 
00006| // Import the version or define it
00007| const VERSION = "0.1.0"
00008| 
00009| export type OpenaiCompatibleModelId = string
00010| 
00011| export interface OpenaiCompatibleProviderSettings {
00012|   /**
00013|    * API key for authenticating requests.
00014|    */
00015|   apiKey?: string
00016| 
00017|   /**
00018|    * Base URL for the OpenAI Compatible API calls.
00019|    */
00020|   baseURL?: string
00021| 
00022|   /**
00023|    * Name of the provider.
00024|    */
00025|   name?: string
00026| 
00027|   /**
00028|    * Custom headers to include in the requests.
00029|    */
00030|   headers?: Record<string, string>
00031| 
00032|   /**
00033|    * Custom fetch implementation.
00034|    */
00035|   fetch?: FetchFunction
00036| }
00037| 
00038| export interface OpenaiCompatibleProvider {
00039|   (modelId: OpenaiCompatibleModelId): LanguageModelV2
00040|   chat(modelId: OpenaiCompatibleModelId): LanguageModelV2
00041|   responses(modelId: OpenaiCompatibleModelId): LanguageModelV2
00042|   languageModel(modelId: OpenaiCompatibleModelId): LanguageModelV2
00043| 
00044|   // embeddingModel(modelId: any): EmbeddingModelV2
00045| 
00046|   // imageModel(modelId: any): ImageModelV2
00047| }
00048| 
00049| /**
00050|  * Create an OpenAI Compatible provider instance.
00051|  */
00052| export function createOpenaiCompatible(options: OpenaiCompatibleProviderSettings = {}): OpenaiCompatibleProvider {
00053|   const baseURL = withoutTrailingSlash(options.baseURL ?? "https://api.openai.com/v1")
00054| 
00055|   if (!baseURL) {
00056|     throw new Error("baseURL is required")
00057|   }
00058| 
00059|   // Merge headers: defaults first, then user overrides
00060...
1,425in / 50out
assistantJan 21, 05:17:08 PM
gpt-5.2-codex · github-copilot
grepcompleted
cloudflare-ai-gateway
Output
Found 1 matches
/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:
  Line 356:     "cloudflare-ai-gateway": async (input) => {
1,233in / 222out / 192reasoning
assistantJan 21, 05:17:16 PM
gpt-5.2-codex · github-copilot
grepcompleted
cloudflare-ai-gateway
Output
Found 1 matches
/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:
  Line 356:     "cloudflare-ai-gateway": async (input) => {
grepcompleted
ai-gateway
Output
Found 2 matches
/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:
  Line 356:     "cloudflare-ai-gateway": async (input) => {

/home/jm/data/code/opencode/packages/opencode/src/cli/cmd/auth.ts:
  Line 349:           prompts.log.info("You can create an api key at https://vercel.link/ai-gateway-token")
grepcompleted
cloudflare-ai
Output
Found 1 matches
/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:
  Line 356:     "cloudflare-ai-gateway": async (input) => {
350in / 167out / 64reasoning
assistantJan 21, 05:17:24 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/cli/cmd/auth.ts
Output
<file>
00321|           provider = provider.replace(/^@ai-sdk\//, "")
00322|           if (prompts.isCancel(provider)) throw new UI.CancelledError()
00323| 
00324|           // Check if a plugin provides auth for this custom provider
00325|           const customPlugin = await Plugin.list().then((x) => x.find((x) => x.auth?.provider === provider))
00326|           if (customPlugin && customPlugin.auth) {
00327|             const handled = await handlePluginAuth({ auth: customPlugin.auth }, provider)
00328|             if (handled) return
00329|           }
00330| 
00331|           prompts.log.warn(
00332|             `This only stores a credential for ${provider} - you will need configure it in opencode.json, check the docs for examples.`,
00333|           )
00334|         }
00335| 
00336|         if (provider === "amazon-bedrock") {
00337|           prompts.log.info(
00338|             "Amazon bedrock can be configured with standard AWS environment variables like AWS_BEARER_TOKEN_BEDROCK, AWS_PROFILE or AWS_ACCESS_KEY_ID",
00339|           )
00340|           prompts.outro("Done")
00341|           return
00342|         }
00343| 
00344|         if (provider === "opencode") {
00345|           prompts.log.info("Create an api key at https://opencode.ai/auth")
00346|         }
00347| 
00348|         if (provider === "vercel") {
00349|           prompts.log.info("You can create an api key at https://vercel.link/ai-gateway-token")
00350|         }
00351| 
00352|         const key = await prompts.password({
00353|           message: "Enter your API key",
00354|           validate: (x) => (x && x.length > 0 ? undefined : "Required"),
00355|         })
00356|         if (prompts.isCancel(key)) throw new UI.CancelledError()
00357|         await Auth.set(provider, {
00358|           type: "api",
00359|           key,
00360|         })
00361| 
00362|         prompts.outro("Done")
00363|       },
00364|     })
00365|   },
00366| })
00367| 
00368| export const AuthLogoutCommand = ...
1,221in / 43out
assistantJan 21, 05:17:30 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/provider/provider.ts
Output
<file>
00002| import fuzzysort from "fuzzysort"
00003| import { Config } from "../config/config"
00004| import { mapValues, mergeDeep, sortBy } from "remeda"
00005| import { NoSuchModelError, type Provider as SDK } from "ai"
00006| import { Log } from "../util/log"
00007| import { BunProc } from "../bun"
00008| import { Plugin } from "../plugin"
00009| import { ModelsDev } from "./models"
00010| import { NamedError } from "@opencode-ai/util/error"
00011| import { Auth } from "../auth"
00012| import { Env } from "../env"
00013| import { Instance } from "../project/instance"
00014| import { Flag } from "../flag/flag"
00015| import { iife } from "@/util/iife"
00016| 
00017| // Direct imports for bundled providers
00018| import { createAmazonBedrock } from "@ai-sdk/amazon-bedrock"
00019| import { createAnthropic } from "@ai-sdk/anthropic"
00020| import { createAzure } from "@ai-sdk/azure"
00021| import { createGoogleGenerativeAI } from "@ai-sdk/google"
00022| import { createVertex } from "@ai-sdk/google-vertex"
00023| import { createVertexAnthropic } from "@ai-sdk/google-vertex/anthropic"
00024| import { createOpenAI } from "@ai-sdk/openai"
00025| import { createOpenAICompatible } from "@ai-sdk/openai-compatible"
00026| import { createOpenRouter, type LanguageModelV2 } from "@openrouter/ai-sdk-provider"
00027| import { createOpenaiCompatible as createGitHubCopilotOpenAICompatible } from "./sdk/openai-compatible/src"
00028| import { createXai } from "@ai-sdk/xai"
00029| import { createMistral } from "@ai-sdk/mistral"
00030| import { createGroq } from "@ai-sdk/groq"
00031| import { createDeepInfra } from "@ai-sdk/deepinfra"
00032| import { createCerebras } from "@ai-sdk/cerebras"
00033| import { createCohere } from "@ai-sdk/cohere"
00034| import { createGateway } from "@ai-sdk/gateway"
00035| import { createTogetherAI } from "@ai-sdk/togetherai"
00036| import { createPerplexity } from "@ai-sdk/perplexity"
00037| 
00038| export namespace Provider {
00039|   const log = Log...
788in / 106out / 64reasoning
assistantJan 21, 05:17:36 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/provider/models.ts
Output
<file>
00002| import { Log } from "../util/log"
00003| import path from "path"
00004| import z from "zod"
00005| import { data } from "./models-macro" with { type: "macro" }
00006| import { Installation } from "../installation"
00007| import { Flag } from "../flag/flag"
00008| 
00009| export namespace ModelsDev {
00010|   const log = Log.create({ service: "models.dev" })
00011|   const filepath = path.join(Global.Path.cache, "models.json")
00012| 
00013|   export const Model = z.object({
00014|     id: z.string(),
00015|     name: z.string(),
00016|     family: z.string().optional(),
00017|     release_date: z.string(),
00018|     attachment: z.boolean(),
00019|     reasoning: z.boolean(),
00020|     temperature: z.boolean(),
00021|     tool_call: z.boolean(),
00022|     interleaved: z
00023|       .union([
00024|         z.literal(true),
00025|         z
00026|           .object({
00027|             field: z.enum(["reasoning_content", "reasoning_details"]),
00028|           })
00029|           .strict(),
00030|       ])
00031|       .optional(),
00032|     cost: z
00033|       .object({
00034|         input: z.number(),
00035|         output: z.number(),
00036|         cache_read: z.number().optional(),
00037|         cache_write: z.number().optional(),
00038|         context_over_200k: z
00039|           .object({
00040|             input: z.number(),
00041|             output: z.number(),
00042|             cache_read: z.number().optional(),
00043|             cache_write: z.number().optional(),
00044|           })
00045|           .optional(),
00046|       })
00047|       .optional(),
00048|     limit: z.object({
00049|       context: z.number(),
00050|       output: z.number(),
00051|     }),
00052|     modalities: z
00053|       .object({
00054|         input: z.array(z.enum(["text", "audio", "image", "video", "pdf"])),
00055|         output: z.array(z.enum(["text", "audio", "image", "video", "pdf"])),
00056|       })
00057|       .optional(),
00058|     exper...
grepcompleted
cloudflare
Output
No files found
1,122in / 208out / 128reasoning
assistantJan 21, 05:17:46 PM
gpt-5.2-codex · github-copilot
grepcompleted
cloudflare
Output
No files found
1,250in / 30out
assistantJan 21, 05:17:50 PM
gpt-5.2-codex · github-copilot
readcompleted
packages/opencode/src/provider/models-macro.ts
Output
<file>
00002|   const path = Bun.env.MODELS_DEV_API_JSON
00003|   if (path) {
00004|     const file = Bun.file(path)
00005|     if (await file.exists()) {
00006|       return await file.text()
00007|     }
00008|   }
00009|   const json = await fetch("https://models.dev/api.json").then((x) => x.text())
00010|   return json
00011| }
00012| 

(End of file - total 12 lines)
</file>
137in / 42out
assistantJan 21, 05:17:56 PM
gpt-5.2-codex · github-copilot

Cause

  • cloudflare-ai-gateway’s custom loader calls sdk.chat(modelID) in src/provider/provider.ts:356, but the bundled provider for that ID is @ai-sdk/openai-compatible, which doesn’t expose chat in this build (it only exposes a callable provider and languageModel/responses). That’s why sdk.chat is undefined and getModel throws at src/provider/provider.ts:374.

Why it happens

  • BUNDLED_PROVIDERS wires @ai-sdk/openai-compatible via createOpenAICompatible (src/provider/provider.ts:41), and the current SDK interface in that package doesn’t include chat. The custom loader assumes chat exists.

Suggested fix

  • In src/provider/provider.ts:373, replace sdk.chat(modelID) with sdk.languageModel(modelID) (or simply sdk(modelID)) for cloudflare-ai-gateway. That matches the OpenAI-compatible provider surface and avoids the TypeError.
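The mismatch can be sketched with a stand-in provider. This is a minimal, hypothetical model of the provider surface (not the real SDK types): a callable factory that exposes languageModel() but no chat(), which is enough to reproduce the TypeError and show why switching the call fixes it:

```typescript
// Hypothetical stand-in for the bundled openai-compatible provider surface.
type LanguageModel = { modelId: string }

interface CompatProvider {
  (modelId: string): LanguageModel
  languageModel(modelId: string): LanguageModel
}

function createCompatProvider(): CompatProvider {
  const make = (modelId: string): LanguageModel => ({ modelId })
  // The provider is itself callable; languageModel is attached, chat is not.
  const provider = ((modelId: string) => make(modelId)) as CompatProvider
  provider.languageModel = make
  return provider
}

const sdk = createCompatProvider() as any

// sdk.chat is undefined, so calling it throws the TypeError seen in the log.
let threw = false
try {
  sdk.chat("anthropic/claude-3.5-haiku")
} catch {
  threw = true
}

// The fix: call a surface the provider actually exposes.
const model = sdk.languageModel("anthropic/claude-3.5-haiku")
```

Calling the provider directly (sdk(modelID)) would work for the same reason: the provider object itself is the callable entry point.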

Want me to patch that change?

307in / 874out / 640reasoning
userJan 21, 05:19:37 PM

Yes, go ahead

assistantJan 21, 05:19:37 PM
gpt-5.2-codex · github-copilot
editcompleted
packages/opencode/src/provider/provider.ts
677in / 112out
assistantJan 21, 05:19:45 PM
gpt-5.2-codex · github-copilot

Done — updated the Cloudflare AI Gateway loader to use sdk.languageModel().

Change

  • src/provider/provider.ts:373

Want me to run tests or validate a single provider call?

282in / 45out
userJan 21, 05:19:59 PM

OK

assistantJan 21, 05:19:59 PM
gpt-5.2-codex · github-copilot

Do you want me to run any tests or just leave it here?

78in / 20out
userJan 21, 05:20:22 PM

Run a test please, there are env vars set that are hopefully enough auth

assistantJan 21, 05:20:22 PM
gpt-5.2-codex · github-copilot
bashcompleted
Runs provider test suite
Output
bun test v1.3.6 (d530ed99)

test/provider/provider.test.ts:
400 |     init: async () => {
401 |       Env.set("ANTHROPIC_API_KEY", "test-api-key")
402 |     },
403 |     fn: async () => {
404 |       const model = await Provider.defaultModel()
405 |       expect(model.providerID).toBe("anthropic")
                                     ^
error: expect(received).toBe(expected)

Expected: "anthropic"
Received: "zai-coding-plan"

      at <anonymous> (/home/jm/data/code/opencode/packages/opencode/test/provider/provider.test.ts:405:32)
(fail) defaultModel respects config model setting [2.33ms]
921 |     const provider = s.providers[providerID]
922 |     if (!provider) {
923 |       const availableProviders = Object.keys(s.providers)
924 |       const matches = fuzzysort.go(providerID, availableProviders, { limit: 3, threshold: -10000 })
925 |       const suggestions = matches.map((m) => m.target)
926 |       throw new ModelNotFoundError({ providerID, modelID, suggestions })
                  ^
ProviderModelNotFoundError: ProviderModelNotFoundError
 data: {
  providerID: "zai-coding-plan",
  modelID: "glm-4.5-flash",
  suggestions: [],
},

      at getModel (/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:926:13)
(fail) getSmallModel returns appropriate small model [414.69ms]
921 |     const provider = s.providers[providerID]
922 |     if (!provider) {
923 |       const availableProviders = Object.keys(s.providers)
924 |       const matches = fuzzysort.go(providerID, availableProviders, { limit: 3, threshold: -10000 })
925 |       const suggestions = matches.map((m) => m.target)
926 |       throw new ModelNotFoundError({ providerID, modelID, suggestions })
                  ^
ProviderModelNotFoundError: ProviderModelNotFoundError
 data: {
  providerID: "zai-coding-plan",
  modelID: "glm-4.5-flash",
  suggestions: [],
},

      at getModel (/home/jm/data/code/opencode/packages/opencode/src/provider/provider.ts:926:13)
(fail) getSmallModel respects con...
118in / 48out
assistantJan 21, 05:20:51 PM
gpt-5.2-codex · github-copilot

Tests ran and failed due to config/env in this environment, not the change.

Test Result

  • bun test test/provider/provider.test.ts
  • Failures: 3 (defaultModel/getSmallModel): one expected anthropic but received zai-coding-plan; the others failed because the zai-coding-plan provider isn't registered in the test environment.

This looks like your env/config is selecting zai-coding-plan as default provider, which isn’t present in the test environment. The edit should be fine; the failures are unrelated to the Cloudflare AI Gateway fix.

Want me to rerun with a specific env override, or leave it here?

6,110in / 138out
userJan 21, 05:21:22 PM

That's alright :)

assistantJan 21, 05:21:22 PM
gpt-5.2-codex · github-copilot

Great — I’ll leave it there. If you want me to run any other tests or tweak provider defaults, just say the word.

114in / 31out
userJan 21, 05:26:54 PM

What could be the problem here?
stream error
ERROR 2026-01-21T17:25:53 +0ms service=session.processor error=Bad Request stack="AI_APICallError: Bad Request\n at (/home/jm/data/code/opencode/node_modules/.bun/@ai-sdk+provider-utils@3.0.17+d6123d32214422cb/node_modules/@ai-sdk/provider-utils/dist/index.mjs:903:18)\n at processTicksAndRejections (native:7:39)" process

assistantJan 21, 05:26:54 PM
gpt-5.2-codex · github-copilot

I need a bit more context to pinpoint it. That error is a generic 400 from the AI SDK. Can you share the surrounding logs showing:

  • Provider/model being used (the service=llm line with providerID/modelID)
  • The request target URL (the service=network line)
  • The response body if it’s logged (often in debug or provider logs)

With that, I can tell whether it’s an auth header issue, bad model ID, invalid params, or a format mismatch.

31,754in / 116out
userJan 21, 05:27:27 PM

It says stream error and there is some stack here:
uccess":false,"result":[],"messages":[],"error":[{"code":2001,"message":"Please configure AI Gateway in the Cloudflare dashboard"}]}","isRetryable":false}} stream error
ERROR 2026-01-21T17:25:53 +0ms service=session.processor error=Bad Request stack="AI_APICallError: Bad Request\n at (/home/jm/data/code/opencode/node_modules/.bun/@ai-sdk+provider-utils@3.0.17+d6123d32214422cb/node_modules/@ai-sdk/provider-utils/dist/index.mjs:903:18)\n at processTicksAndRejections (native:7:39)" process
INFO 2026-01-21T17:25:53 +0ms service=bus type=session.error publishing
INFO 2026-01-21T17:25:53 +1ms service=bus type=message.updated publishing
INFO 2026-01-21T17:25:53 +0ms service=session.compaction pruning
INFO 2026-01-21T17:25:53 +0ms service=session.prompt sessionID=ses_41e73e73fffeLKAcPnMU1XvWSq cancel
INFO 2026-01-21T17:25:53 +0ms service=bus type=session.status publishing
INFO 2026-01-21T17:25:53 +0ms service=bus type=session.idle publishing
INFO 2026-01-21T17:25:53 +1ms service=session.compaction pruned=0 total=0 found
ERROR 2026-01-21T17:25:53 +604ms service=default e=NotFoundError rejection

assistantJan 21, 05:27:27 PM
gpt-5.2-codex · github-copilot

That response body is explicit: Cloudflare AI Gateway isn’t configured for the account/gateway ID you’re hitting.

"code":2001,"message":"Please configure AI Gateway in the Cloudflare dashboard"

So the 400 is coming from Cloudflare, not opencode. Fixes:

  • Ensure CLOUDFLARE_ACCOUNT_ID + CLOUDFLARE_GATEWAY_ID are correct.
  • In the Cloudflare dashboard, create/configure an AI Gateway for that ID and add at least one upstream provider (Anthropic/OpenAI/etc).
  • If the gateway is authenticated, make sure CLOUDFLARE_API_TOKEN is set and has AI Gateway access.
  • Double‑check the base URL matches https://gateway.ai.cloudflare.com/v1/{accountId}/{gateway}/compat.

Once the gateway is configured, the stream error should disappear.
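For reference, a minimal sketch (hypothetical helper name) of how the loader derives the gateway base URL from those env vars, mirroring the provider.ts snippet read earlier; if either ID is missing the loader disables autoload, and if the IDs don't match a gateway configured in the dashboard, Cloudflare answers with error 2001:

```typescript
// Mirrors the cloudflare-ai-gateway loader: both IDs are required,
// and the /compat suffix selects the OpenAI-compatible endpoint.
function gatewayBaseURL(env: Record<string, string | undefined>): string | undefined {
  const accountId = env["CLOUDFLARE_ACCOUNT_ID"]
  const gateway = env["CLOUDFLARE_GATEWAY_ID"]
  if (!accountId || !gateway) return undefined // loader returns { autoload: false } here
  return `https://gateway.ai.cloudflare.com/v1/${accountId}/${gateway}/compat`
}

const url = gatewayBaseURL({
  CLOUDFLARE_ACCOUNT_ID: "acc123",
  CLOUDFLARE_GATEWAY_ID: "my-gateway",
})
```

Comparing the URL this produces against the gateway listed in the Cloudflare dashboard is a quick way to spot an ID mismatch.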

655in / 252out / 64reasoning