You can run an LLM locally with Ollama.
Ollama supports the models listed at ollama.com/library

Pull a model:
ollama pull llama3.2

Run the model:
ollama run llama3.2
Source: https://github.com/ollama/ollama
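Once the server is running you can also query the model from code. A minimal Python sketch, assuming the default Ollama server on localhost port 11434 and its /api/generate endpoint; the prompt is just an example:

import requests  # third-party HTTP library: pip install requests

# Ask the local Ollama server for a completion.
# "stream": False returns one JSON object instead of a stream of chunks.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",               # the model pulled above
        "prompt": "Why is the sky blue?",  # example prompt
        "stream": False,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])  # the generated text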
Page to compare different LLMs:

https://arena.lmsys.org/