Introducing breakthroughs in LLMs.
Created 3 months ago
Get up and running with Llama 2 and other large language models locally (GitHub: jmorganca/ollama).
Run the Mixtral 8x7B mixture-of-experts (MoE) model in MLX on Apple silicon.
Phi-2 is now available in the Azure model catalog.
Gemini is our most capable and general model, built to be multimodal and optimized for three different sizes: Ultra, Pro, and Nano.