ndr_ 6 days ago

If you're on macOS, check out the MLX community: https://huggingface.co/mlx-community - Ollama not needed, potentially more efficient(?). The Gemma 2 27B model was broken as of a few days ago, but a fix has already been committed and should land in the pip package soon. Gemma 2 9B works out of the box.
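For anyone who wants to try it, here's a rough sketch using the mlx-lm pip package. The repo name below is just an example from the mlx-community org (pick whatever quantization you want), and the exact generate() arguments may differ slightly between mlx-lm versions:

  # pip install mlx-lm
  from mlx_lm import load, generate

  # Example repo name from mlx-community; swap in the quantization you prefer
  model, tokenizer = load("mlx-community/gemma-2-9b-it-4bit")

  prompt = "Explain unified memory on Apple Silicon in two sentences."
  text = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
  print(text)

There's also a CLI entry point (python -m mlx_lm.generate --model ... --prompt ...) if you just want to poke at a model without writing any code.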

wkat4242 6 days ago

This wasn't submitted as a link, so you can't click on it, which doesn't really follow the HN convention.

PS: Personally I prefer ollama + openwebui <3 for its extensibility, though I like the playful approach here.