A beautiful home for your local AI models
Use Ollama on iOS and macOS without a terminal.
No IP addresses, no VPNs, no setup.
How it works
A tiny menu bar companion that advertises your Mac on your local network and lets you manage your local models.
Your Mac appears by name, automatically. No IP addresses. No Tailscale (yet). Tap once to connect.
Full streaming chat with every model on your Mac. Switch models on the fly. Conversations stay on your devices.
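Under the hood, that chat runs over Ollama's local HTTP API, which streams replies as newline-delimited JSON. As a rough sketch (the chunk shape follows Ollama's documented /api/chat streaming format; the `assemble_stream` helper and the sample chunks are illustrative, not part of the app), a client pieces a reply together like this:

```python
import json

def assemble_stream(ndjson_lines):
    """Assemble a full reply from an Ollama /api/chat NDJSON stream.

    Each line is one JSON object; partial text arrives in
    message.content until a final object with "done": true.
    """
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Simulated chunks in the shape Ollama streams over HTTP:
sample = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": true}',
]
print(assemble_stream(sample))  # -> Hello!
```

Because each chunk carries a self-contained fragment, the UI can render text the moment it arrives rather than waiting for the final object.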
Features
Uses Bonjour to find your Mac the same way AirDrop does. Same Wi-Fi, instant connection.
Responses stream live from Ollama to your phone. No waiting for the full reply.
Browse, download, and delete Ollama models from your iPhone. No Terminal needed.
Everything stays between your iPhone and your Mac. No cloud, no tokens, no data leaving your home.
Each model family has its own visual world: Mesa for Llama, Alpine for Mistral, Grove for CodeLlama.
No subscription. No API key. No credits. You already own the Mac doing the inference.
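The model management described above maps onto Ollama's local REST API: GET /api/tags lists installed models, POST /api/pull downloads one, and DELETE /api/delete removes one. A minimal sketch, assuming Ollama's default port 11434 and the "model" request field used in recent Ollama API docs (the `build_request` helper itself is hypothetical):

```python
import json
from urllib import request

OLLAMA = "http://localhost:11434"  # Ollama's default local port

def build_request(action, model=None):
    """Map a management action onto Ollama's REST endpoints.

    list   -> GET    /api/tags
    pull   -> POST   /api/pull   {"model": name}
    delete -> DELETE /api/delete {"model": name}
    """
    if action == "list":
        return request.Request(f"{OLLAMA}/api/tags", method="GET")
    body = json.dumps({"model": model}).encode()
    if action == "pull":
        return request.Request(f"{OLLAMA}/api/pull", data=body, method="POST")
    if action == "delete":
        return request.Request(f"{OLLAMA}/api/delete", data=body, method="DELETE")
    raise ValueError(f"unknown action: {action}")

req = build_request("pull", "llama3")
print(req.full_url, req.method)  # -> http://localhost:11434/api/pull POST
```

Everything goes over localhost-style LAN HTTP, which is why no cloud account or API key is involved.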
Requirements
Pasture is currently in private beta. Join the waitlist and you'll be among the first to get access.
Or, if you already have access: