LLM


#tech #llm #ai #selfhosting

Prompt Engineering

Context Engineering

Local inference and model libraries

How to download Hugging Face models via Ollama

via Use Ollama with any GGUF Model on Hugging Face Hub

ollama run hf.co/{username}/{repository}:{quantization}
# It also works with pull

Interface

→ Ollama and LM Studio provide an interface for running models locally
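Besides their UI, both tools expose a local HTTP API; Ollama listens on localhost:11434 by default. A minimal sketch of preparing a request for Ollama's `/api/generate` endpoint (the model name is an assumption, and actually sending the request requires a running Ollama server):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Prepare a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

req = build_generate_request("llama3.2", "Say hello in one word.")  # model name assumed
# To actually run it (needs Ollama running locally):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```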

GGUF sources

Hugging Face is, as of this writing, the largest model repository.
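A quick way to see which quantizations a GGUF repo offers is to filter its file listing for `.gguf` files; the filtering is plain string matching, and the filenames below are illustrative (with `huggingface_hub` installed, a real listing could come from `list_repo_files(repo_id)`):

```python
def gguf_files(repo_files: list[str]) -> list[str]:
    """Keep only GGUF weight files from a repo's file listing."""
    return sorted(f for f in repo_files if f.lower().endswith(".gguf"))

# Illustrative listing; in practice pass the output of
# huggingface_hub.list_repo_files(repo_id) here.
files = ["README.md", "model-Q4_K_M.gguf", "model-Q8_0.gguf", "config.json"]
print(gguf_files(files))
# → ['model-Q4_K_M.gguf', 'model-Q8_0.gguf']
```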

Here are some notable LLM model publishers:
