@iam-veeramalla
Last active April 18, 2026 00:16
Claude Code integration with Ollama to use local models

Run Claude with the power of Local LLMs using Ollama

Install Ollama

curl -fsSL https://ollama.com/install.sh | sh
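Before pulling any models, it is worth confirming the install script actually put the binary on your `PATH`. A minimal check (this assumes the script above completed without errors):

```shell
# Confirm the ollama binary is reachable before pulling models.
if command -v ollama >/dev/null 2>&1; then
  echo "ollama found: $(ollama --version 2>/dev/null | head -n 1)"
else
  echo "ollama not found on PATH - re-run the install script or open a new shell"
fi
```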

Pull the Model

ollama pull glm-4.7-flash # or gpt-oss:20b (for better performance)
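Once the pull finishes, you can smoke-test the model through Ollama's local HTTP API, which listens on `http://localhost:11434` by default. The model name below matches the pull command above; swap in `gpt-oss:20b` if you pulled that instead:

```shell
# Ask the local model for a short reply via Ollama's /api/generate endpoint.
payload='{"model": "glm-4.7-flash", "prompt": "Reply with one word.", "stream": false}'
curl -s http://localhost:11434/api/generate -d "$payload" \
  || echo "could not reach the local Ollama server - is it running?"
```

A JSON response with a `response` field means the model is loaded and answering.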

Install Claude

curl -fsSL https://claude.ai/install.sh | bash
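As with Ollama, a quick check that the `claude` CLI landed on your `PATH` saves confusion later (open a new shell first if it was just installed):

```shell
# Confirm the claude CLI is available before launching it.
if command -v claude >/dev/null 2>&1; then
  claude --version
else
  echo "claude not found on PATH - open a new shell or re-run the install script"
fi
```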

Run Claude with Ollama

ollama launch claude --model glm-4.7-flash # or --model gpt-oss:20b
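Ollama also exposes an OpenAI-compatible endpoint at `/v1/chat/completions`, which is useful for checking that tools expecting that API shape can talk to the local model. A sketch, assuming the Ollama server is running locally with the model pulled above:

```shell
# Probe Ollama's OpenAI-compatible chat endpoint with a one-line message.
body='{"model": "glm-4.7-flash", "messages": [{"role": "user", "content": "ping"}]}'
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "$body" \
  || echo "could not reach the local Ollama server - is it running?"
```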

@Hemanth26080

I have 8 GB of RAM in my laptop. Which model would run well on it? If anyone knows, kindly help me.

What I did was install Claude Code and configure an OpenRouter API key, without any local model installation. I started using the free models on OpenRouter. I hope this information is helpful. @iam-veeramalla @aneeshksoft @ShubhmPatil @chinthakindi-saikumar @Kalyani-192006
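The OpenRouter route described above can be sanity-checked by calling its OpenAI-compatible API directly. This is a hedged sketch, not part of the original guide: `OPENROUTER_API_KEY` must be set in your environment, and `YOUR_MODEL_ID` is a placeholder for a model id you pick from OpenRouter's catalog:

```shell
# Verify an OpenRouter key by sending a one-line chat request.
# YOUR_MODEL_ID is a placeholder - substitute a real model id.
req='{"model": "YOUR_MODEL_ID", "messages": [{"role": "user", "content": "ping"}]}'
curl -s https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer ${OPENROUTER_API_KEY:-missing-key}" \
  -H "Content-Type: application/json" \
  -d "$req" \
  || echo "request failed - check your network and API key"
```

A JSON chat completion means the key works; an error body usually means the key or model id is wrong.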

@AnshumanBharatiya


Which is better: OpenRouter or Ollama?
