Run LLMs locally with one command. Llama, Mistral, Gemma, and more. Zero cloud dependency.
This artifact was automatically scanned for malicious patterns, credential access, code-execution risks, and source authenticity; all checks passed.
curl -fsSL https://ollama.com/install.sh | sh