Introducing recent breakthroughs in large language models (LLMs).
Ollama (GitHub: jmorganca/ollama): get up and running with Llama 2 and other large language models locally.
Run the Mixtral 8x7B mixture-of-experts (MoE) model in MLX on Apple silicon.
Phi-2 is now available in the Azure model catalog.
Mistral 7B: the best 7B model to date, released under the Apache 2.0 license.
Gemini is Google's most capable and general model: natively multimodal, able to reason across text, images, audio, video, and code, and optimized in three different sizes (Ultra, Pro, and Nano).