fix(openclaw): rename virtual provider to llmspy for cloud model routing (#149)
Conversation
```go
if p.APIKey != "" && !isEnvVarRef(p.APIKey) {
	ip.APIKey = p.APIKey
} else if p.APIKey != "" {
	fmt.Printf(" Note: provider '%s' uses env-var reference %s (will need manual configuration)\n", name, p.APIKey)
}
```
Check failure
Code scanning / CodeQL: Clear-text logging of sensitive information

Copilot Autofix (AI, 1 day ago)
To fix the problem, we should ensure that no sensitive information (API keys or their direct representations) is written to logs or stdout. Instead of interpolating p.APIKey into the fmt.Printf message, we can log only that the provider uses an env-var reference, without revealing its exact value. This preserves all functional behavior: the configuration parsing and result construction remain unchanged; only the informational note is redacted.
Concretely, in internal/openclaw/import.go, within detectExistingConfigAt, update the else if p.APIKey != "" branch (around line 146) to remove %s and the p.APIKey argument from the format string. A safe message would be " Note: provider '%s' uses env-var reference (will need manual configuration)\n", which still tells the user what to do but does not leak the value. No extra imports, methods, or type changes are needed; this is a minimal logging-format change.
```diff
@@ -143,7 +143,7 @@
 if p.APIKey != "" && !isEnvVarRef(p.APIKey) {
 	ip.APIKey = p.APIKey
 } else if p.APIKey != "" {
-	fmt.Printf(" Note: provider '%s' uses env-var reference %s (will need manual configuration)\n", name, p.APIKey)
+	fmt.Printf(" Note: provider '%s' uses env-var reference (will need manual configuration)\n", name)
 }
 for _, m := range p.Models {
 	ip.Models = append(ip.Models, ImportedModel{ID: m.ID, Name: m.Name})
```
OpenClaw requires provider/model format (e.g. "llmspy/claude-sonnet-4-5-20250929") for model resolution. Without a provider prefix, it hardcodes a fallback to the "anthropic" provider — which is disabled in the llmspy-routed overlay, causing chat requests to fail silently.

This renames the virtual provider used for cloud model routing from "ollama" to "llmspy", adds the proper provider prefix to AgentModel, and disables the default "ollama" provider when a cloud provider is selected. The default Ollama-only path is unchanged since it genuinely routes Ollama models.
Force-pushed from 7b6db92 to 5e6c751.
Summary
- Renamed the virtual provider in `buildLLMSpyRoutedOverlay()` from `"ollama"` to `"llmspy"` so OpenClaw correctly resolves cloud models through the llmspy gateway
- Added the `llmspy/` prefix to `AgentModel` (e.g., `llmspy/claude-sonnet-4-5-20250929`) — required by OpenClaw's `parseModelRef()`, which splits on `/` for provider routing
- Disabled the default `ollama` provider when a cloud provider is selected to prevent dual-provider confusion
- Added an `llmspy` provider entry to chart values, `_helpers.tpl`, and `deployment.yaml` provider iteration lists
- Left the default Ollama-only path unchanged (e.g. `ollama/glm-4.7-flash`)

Root Cause
OpenClaw's model resolution (`src/agents/model-selection.ts`) requires `provider/model` format. Without a `/`, bare model names fall back to a hardcoded `anthropic` provider — which is disabled in the cloud overlay, silently breaking chat requests.

Request Chain (cloud path)
Test plan
- Unit tests pass (`go test ./internal/openclaw/ -v`)
- `go build ./cmd/obol` compiles successfully
- Run `obol openclaw setup` using Anthropic/OpenAI and verify chat works