Run this on your Mac with Outlier — a free macOS app for local MLX inference.

gemma-3-27b-it (MLX 4-bit)

MLX 4-bit conversion of google/gemma-3-27b-it. License and base-model fields inherit from the original — see YAML frontmatter above.

Load with mlx-lm

```shell
pip install mlx-lm
python -m mlx_lm.generate --model Outlier-Ai/gemma-3-27b-it-MLX-4bit --prompt "Hello" --max-tokens 256
```
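For programmatic use, mlx-lm also exposes a Python API. A minimal sketch (runs only on Apple Silicon, and downloads the weights from the Hub on first use; the chat-template call assumes the tokenizer ships one, which Gemma instruction-tuned models do):

```python
from mlx_lm import load, generate

# Fetches and loads the 4-bit MLX weights (Apple Silicon required).
model, tokenizer = load("Outlier-Ai/gemma-3-27b-it-MLX-4bit")

# gemma-3-27b-it is instruction-tuned, so wrap the message in its chat template.
prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Hello"}],
    add_generation_prompt=True,
    tokenize=False,
)

text = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(text)
```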

What is Outlier?

A free macOS app that runs MLX models locally — no cloud, no API keys, no usage caps.

outlier.host

Other Outlier conversions

License

Inherits from upstream (`gemma`). See the base model card.
