@johnnyasantoss
Last active April 16, 2026 18:17
Generates the configuration to load and use models from LM Studio in OpenCode.

lmstudio-models.sh
#!/usr/bin/env bash
set -euo pipefail

usage() {
  cat << 'EOF'
Usage: lmstudio-models.sh [--dry-run] [--reset] [-h]

Options:
  --dry-run    Preview changes without modifying opencode.json
  --reset      Replace the models field with the API response (don't merge)
  -h, --help   Show this help message
EOF
}

log() { echo "[$0] $*" >&2; }

# Parse flags
DRY_RUN=false
RESET=false
for arg in "$@"; do
  case "$arg" in
    --dry-run) DRY_RUN=true ;;
    --reset) RESET=true ;;
    -h|--help) usage; exit 0 ;;
  esac
done

log "Fetching models from LM Studio API..."
API_RESPONSE=$(curl -sf http://localhost:1234/api/v1/models) \
  || { log "ERROR: Failed to fetch from LM Studio API"; exit 1; }
log "Processing $(echo "$API_RESPONSE" | jq '.models | length') models..."

# Build the provider JSON: keep only LLMs, derive a display name of the form
# "<name>@<format>-<quantization>", and sort the entries by key.
PROVIDER=$(echo "$API_RESPONSE" | jq '{
  provider: {
    lmstudio: {
      npm: "@ai-sdk/openai-compatible",
      name: "LM Studio",
      options: { baseURL: "http://localhost:1234/v1" },
      models: (.models
        | map(select(.type == "llm"))
        | map({
            key: .key,
            value: {
              name: ((.display_name | sub("\\s*\\(.*\\)$"; ""))
                     + "@" + (.format // "unknown")
                     + "-" + (.quantization.name // "unknown"))
            }
          })
        | sort_by(.key)
        | from_entries)
    }
  }
}')
MODELS=$(echo "$PROVIDER" | jq '.provider.lmstudio.models')
MODEL_COUNT=$(echo "$MODELS" | jq 'length')

if $RESET; then
  log "Reset mode: Replacing models with $MODEL_COUNT models from API"
  if $DRY_RUN; then
    log "[DRY-RUN] Would update opencode.json"
    jq --argjson models "$MODELS" '.provider.lmstudio.models = $models' opencode.json
  else
    jq --argjson models "$MODELS" '.provider.lmstudio.models = $models' opencode.json > opencode.json.new
    mv opencode.json.new opencode.json
    log "✓ Updated opencode.json with $MODEL_COUNT models"
  fi
else
  log "Merge mode: Merging $MODEL_COUNT models with existing config"
  if $DRY_RUN; then
    log "[DRY-RUN] Would merge with opencode.json"
    node scripts/deep-merge.js opencode.json <(echo "$PROVIDER")
  else
    # Only write the intermediate file when we actually apply the merge,
    # so --dry-run leaves the working directory untouched.
    echo "$PROVIDER" > lmstudio-provider.json
    node scripts/deep-merge.js opencode.json lmstudio-provider.json > opencode.json.new
    mv opencode.json.new opencode.json
    log "✓ Merged into opencode.json"
  fi
fi
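For reference, the jq filter above produces a provider block shaped like the following. The model key and name here are hypothetical examples, not real API output:

```json
{
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio",
      "options": { "baseURL": "http://localhost:1234/v1" },
      "models": {
        "qwen2.5-7b-instruct": {
          "name": "Qwen2.5 7B Instruct@gguf-Q4_K_M"
        }
      }
    }
  }
}
```

In merge mode this fragment is combined with the existing opencode.json, so providers and models already configured there are preserved.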
scripts/deep-merge.js

// Recursively merges `source` into `target`: plain objects are merged
// key-by-key; arrays and scalar values from `source` replace the target's.
const deepMerge = (target, source) => {
  const result = { ...target };
  for (const key of Object.keys(source)) {
    if (
      source[key] instanceof Object &&
      !Array.isArray(source[key]) &&
      key in target &&
      target[key] instanceof Object &&
      !Array.isArray(target[key])
    ) {
      result[key] = deepMerge(target[key], source[key]);
    } else {
      result[key] = source[key];
    }
  }
  return result;
};

// Usage: node deep-merge.js <base.json> <patch.json>
// Reads both files, merges the patch into the base, and prints the result.
const main = JSON.parse(require('fs').readFileSync(process.argv[2], 'utf8'));
const patch = JSON.parse(require('fs').readFileSync(process.argv[3], 'utf8'));
const merged = deepMerge(main, patch);
console.log(JSON.stringify(merged, null, 2));
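To illustrate the merge semantics, here is a small self-contained sketch using the same algorithm as deep-merge.js, with made-up provider data: nested objects are merged key-by-key, while values present in the patch replace the target's values wholesale.

```javascript
// Same merge algorithm as deep-merge.js, inlined so the example runs standalone.
const deepMerge = (target, source) => {
  const result = { ...target };
  for (const key of Object.keys(source)) {
    if (
      source[key] instanceof Object &&
      !Array.isArray(source[key]) &&
      key in target &&
      target[key] instanceof Object &&
      !Array.isArray(target[key])
    ) {
      // Both sides are plain objects: recurse and merge.
      result[key] = deepMerge(target[key], source[key]);
    } else {
      // Otherwise the patch value wins.
      result[key] = source[key];
    }
  }
  return result;
};

// Hypothetical existing opencode.json with two providers configured.
const existing = {
  provider: {
    lmstudio: { name: "LM Studio", models: { "old-model": { name: "Old" } } },
    openai: { name: "OpenAI" },
  },
};

// Hypothetical patch, shaped like the PROVIDER JSON the bash script builds.
const patch = {
  provider: {
    lmstudio: { models: { "new-model": { name: "New" } } },
  },
};

const merged = deepMerge(existing, patch);

// The "openai" provider and the existing "old-model" entry both survive,
// and "new-model" is added alongside them.
console.log(JSON.stringify(merged.provider.lmstudio.models));
```

Note that because objects are merged rather than replaced, merge mode never removes models that LM Studio no longer reports; that is what the script's `--reset` flag is for.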