Who is your favourite quant publisher and why?

Reddit r/LocalLLaMA News

Summary

A user shares their preference for Unsloth quantized models due to fast releases and low perplexity, compares them with Apex MoE quants, and asks the community for their favorite quant publisher.

Hey everyone, I’ve been a big fan of **Unsloth** for several reasons:

* They publish models ASAP after release.
* They usually offer the lowest PPL (perplexity).
* Their website has tons of helpful tutorials and documentation.

Recently, I stumbled upon this Reddit thread suggesting I try an **Apex MoE quant** by *Mudler* instead:

👉 [https://www.reddit.com/r/LocalLLaMA/comments/1t3n6jo/apex\_moe\_quants\_update\_25\_new\_models\_since\_the/](https://www.reddit.com/r/LocalLLaMA/comments/1t3n6jo/apex_moe_quants_update_25_new_models_since_the/)

So I decided to test it myself. I ran **Qwen3.5 122B IQuality**, which is roughly the same size as Qwen3.5 122B Q4\_K\_XL. So far, I haven’t noticed a difference in output quality between the two on real-world tasks, so I ran a GSM8K benchmark, and Unsloth came out slightly ahead. So now I’m asking you: who is your favourite quant publisher, and why?
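Since PPL is the metric being compared here, a quick sketch of what it actually measures may help: perplexity is the exponential of the negative mean per-token log-likelihood over some reference text. The token log-probs below are made-up illustrative numbers, not real measurements of either quant:

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the negative mean per-token log-likelihood.

    Lower is better: it means the model assigned higher probability
    on average to the reference tokens.
    """
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# Hypothetical per-token log-probs for the same text under two quants
# (illustrative numbers only, not real measurements).
q4_k_xl = [-1.2, -0.8, -2.1, -0.5, -1.0]
apex_iq = [-1.3, -0.9, -2.0, -0.6, -1.1]

print(f"Q4_K_XL PPL: {perplexity(q4_k_xl):.3f}")
print(f"Apex IQ PPL: {perplexity(apex_iq):.3f}")
```

In practice tools like llama.cpp report this number computed over a corpus such as wikitext, which is why small PPL gaps between two quants of the same model often don’t translate into noticeable quality differences on real tasks.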