@tzkmx · Created September 9, 2025
Ollama local tests
REM move gemma-3-1b-it-f16.gguf gemma-3-1b-it-f16
ollama create gemma-3-1b-it-f16 -f Modelfile
# Modelfile: point FROM at the local GGUF weights file
FROM ./gemma-3b-it-Q4_K_M.gguf
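For reference, a Modelfile can also pin runtime defaults alongside the weights. A minimal sketch, assuming the GGUF file sits next to the Modelfile; the temperature, context size, and system prompt below are illustrative placeholders, not tuned values:

# Example Modelfile with optional runtime defaults (values are placeholders)
FROM ./gemma-3b-it-Q4_K_M.gguf
PARAMETER temperature 0.7
PARAMETER num_ctx 4096
SYSTEM You are a concise local assistant.

After editing, rebuild with ollama create gemma-3-1b-it-f16 -f Modelfile; ollama list should then show the model.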
# ollama-chain.ps1
<#
.SYNOPSIS
Chain multiple prompts or files into one request to a local Ollama model.
.DESCRIPTION
Concatenates multiple inputs (text or files) and sends them as a single prompt.
Supports timing and model selection.
.PARAMETER Inputs
Array of strings or file paths to be combined.
.PARAMETER Model
The model name (default: gemma:2b).
.PARAMETER Time
Switch to measure execution time.
.EXAMPLE
.\ollama-chain.ps1 intro.txt chapter1.txt -Time
.EXAMPLE
.\ollama-chain.ps1 "Summarize this:" .\doc.txt -Model "gemma-local"
#>
param(
    [Parameter(Mandatory = $true, ValueFromRemainingArguments = $true)]
    [string[]]$Inputs,            # Can be file paths or raw text
    [string]$Model = "gemma:2b",
    [switch]$Time
)

# Build the combined prompt: file contents for existing paths, literal text otherwise
$PromptParts = foreach ($item in $Inputs) {
    if (Test-Path $item) {
        Get-Content $item -Raw
    }
    else {
        $item
    }
}
$FullPrompt = ($PromptParts -join "`n`n").Trim()

if ($Time) {
    # Out-Host keeps the response visible; Measure-Command would otherwise discard it
    $result = Measure-Command {
        ollama run $Model $FullPrompt | Out-Host
    }
    Write-Host "`nExecution Time: $($result.TotalSeconds) seconds" -ForegroundColor Yellow
}
else {
    ollama run $Model $FullPrompt
}
# ollama-cli.ps1
<#
.SYNOPSIS
Run a single prompt against a local Ollama model.
.DESCRIPTION
Pass a prompt string or file to a local model using Ollama.
Supports timing and model selection.
.PARAMETER Prompt
The prompt text or path to a file containing the prompt.
.PARAMETER Model
The model name (default: gemma-3-1b-it-f16).
.PARAMETER Time
Switch to measure execution time.
.EXAMPLE
.\ollama-cli.ps1 -Prompt "Tell me a joke" -Time
.EXAMPLE
.\ollama-cli.ps1 -Prompt .\prompt.txt -Model "gemma-local"
#>
param(
    [Parameter(Mandatory = $true)]
    [string]$Prompt,
    [string]$Model = "gemma-3-1b-it-f16",   # Lighter default "gemma:2b" also works
    [switch]$Time                           # Add -Time to measure execution time
)

# If the prompt is a path to an existing file, use that file's contents
if (Test-Path $Prompt) {
    $Prompt = Get-Content $Prompt -Raw
}

# Function to run Ollama and capture output
function Invoke-Ollama {
    param($Prompt, $Model)
    # Run Ollama with the given prompt
    ollama run $Model $Prompt
}

if ($Time) {
    # Measure execution time; Out-Host keeps the response visible inside Measure-Command
    $result = Measure-Command {
        Invoke-Ollama -Prompt $Prompt -Model $Model | Out-Host
    }
    Write-Host "`nExecution Time: $($result.TotalSeconds) seconds" -ForegroundColor Yellow
}
else {
    # Just run without timing
    Invoke-Ollama -Prompt $Prompt -Model $Model | Out-Host
}
# One-time setup: allow local scripts and put the Scripts folder on the user PATH
Set-ExecutionPolicy -Scope CurrentUser Unrestricted -Force

$scriptPath = "$env:USERPROFILE\Scripts"
$currentPath = [Environment]::GetEnvironmentVariable("PATH", "User")
if ($currentPath -notlike "*$scriptPath*") {
    [Environment]::SetEnvironmentVariable("PATH", "$currentPath;$scriptPath", "User")
    Write-Host "✅ Added $scriptPath to user PATH."
} else {
    Write-Host "ℹ️ $scriptPath is already in PATH."
}
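Writing the user-level PATH only affects shells started afterwards. To use the scripts in the current session as well, the in-process copy of PATH can be extended too; this one-liner lasts only for the session:

# Make the Scripts folder visible to the current session immediately
$env:PATH += ";$scriptPath"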
function ollama-p {
    param(
        [Parameter(Position = 0, Mandatory = $true)]
        [string]$PromptOrFile,
        [string]$Model = "gemma-3-1b-it-f16",   # Default model; "gemma:2b" suggested
        [switch]$Time
    )
    # If the parameter is a file path, read its contents
    if (Test-Path $PromptOrFile) {
        $Prompt = Get-Content $PromptOrFile -Raw
    }
    else {
        $Prompt = $PromptOrFile
    }
    if ($Time) {
        # Out-Host keeps the response visible; Measure-Command would otherwise discard it
        $result = Measure-Command {
            ollama run $Model $Prompt | Out-Host
        }
        Write-Host "`nExecution Time: $($result.TotalSeconds) seconds" -ForegroundColor Yellow
    }
    else {
        ollama run $Model $Prompt
    }
}
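To have ollama-p available in every new shell, the function can be loaded from the PowerShell profile. A minimal sketch, assuming the function was saved as ollama-p.ps1 in the Scripts folder from the setup step; that file name and location are assumptions, so adjust them to your layout:

# Append a dot-source line to the profile (ollama-p.ps1 path is assumed)
Add-Content $PROFILE "`n. `"$env:USERPROFILE\Scripts\ollama-p.ps1`""

Dot-sourcing runs the file in the current scope, so the function stays defined for the whole session.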

🧠 Offline Ollama CLI Tools for GGUF Models

This gist includes:

  • ollama-cli.ps1: Run a single prompt with a local model (like gemini-cli -p)
  • ollama-chain.ps1: Chain multiple files or prompts into one request
  • PowerShell profile function ollama-p for quick terminal use
  • Instructions to install .gguf models manually into Ollama

🔧 Setup

  1. Install Ollama
  2. Create a Modelfile and build your model:
    ollama create gemma-local -f Modelfile

  3. Add the ollama-p function to your PowerShell profile ($PROFILE)

🧪 Usage

ollama-p "Write a haiku" -Time
ollama-p .\prompt.txt -Model "gemma-local"
.\ollama-chain.ps1 intro.txt chapter1.txt -Time

Use Get-Help .\ollama-cli.ps1 or Get-Help .\ollama-chain.ps1 for inline help.
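For instance, just the examples section can be pulled up with the standard help switch:

Get-Help .\ollama-chain.ps1 -Examples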
