LLM

Prompt Engineering

Context Engineering

Local inference and model libraries

How to download huggingface models via Ollama

via the Hugging Face guide "Use Ollama with any GGUF Model on Hugging Face Hub"

ollama run hf.co/{username}/{repository}:{quantization}
# The same syntax works with `ollama pull`
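The reference string above can be assembled from its parts. A minimal sketch (the helper name `hf_ref` and the example repo/quantization are illustrative, not from Ollama itself; if the quantization tag is omitted, Ollama falls back to a default scheme such as Q4_K_M when the repo provides it):

```shell
# Build an Ollama model reference for a GGUF repo on the Hugging Face Hub.
# hf_ref <username> <repository> [quantization]
hf_ref() {
  local user="$1" repo="$2" quant="$3"
  if [ -n "$quant" ]; then
    echo "hf.co/${user}/${repo}:${quant}"
  else
    # No tag: Ollama chooses a default quantization from the repo
    echo "hf.co/${user}/${repo}"
  fi
}

# Example (hypothetical invocation):
hf_ref bartowski Llama-3.2-1B-Instruct-GGUF Q4_K_M
```

The resulting string is what you pass to `ollama run` or `ollama pull`.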

Interface

→ Ollama and LM Studio provide an interface

  • OpenWebUI → this project is used by a number of LLM startups, notably z.ai and Kimi

GGUF sources

Hugging Face is, as of this writing, the largest model repository.

Here are some notable LLM model publishers: