@jun_song: In few weeks, everyone with 128gb Mac will have uncensored Opus-4.6 locally. It will be Minimax-M3.0-JANGTQ-CRACK by @d…
Summary
The tweet claims that an uncensored version of Opus 4.6, released as Minimax-M3.0-JANGTQ-CRACK by @dealignai, will soon run locally on 128GB Macs, with the open-source community working to fit it into 24GB of VRAM.
Cached at: 05/13/26, 10:18 AM
In a few weeks, everyone with a 128GB Mac will have uncensored Opus-4.6 locally.
It will be Minimax-M3.0-JANGTQ-CRACK by @dealignai
The open-source community is working hard on fitting them into 24GB VRAM.
The future of Local LLM is so bright.
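As a rough illustration of the "24GB VRAM" goal (my own back-of-the-envelope arithmetic, not from the tweet): a quantized model's weight footprint is roughly parameters × bits-per-weight ÷ 8, ignoring KV cache and activation overhead. The parameter counts and bit widths below are illustrative assumptions, not specs of any named model.

```python
def quantized_weight_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight-memory footprint in GB for a quantized model.

    Ignores KV cache and activation overhead, which add several GB more
    in practice, especially at long context lengths.
    """
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A hypothetical 230B-parameter model at an average of ~2.6 bits/weight
# lands near 74 GB of weights -- the ballpark of large mixed-bit quants.
print(round(quantized_weight_gb(230, 2.6), 1))

# Conversely, a model that must fit weights into 24 GB of VRAM at 4-bit
# quantization can have at most about 48B parameters.
print(quantized_weight_gb(48, 4))
```

This is why aggressive mixed-bit quantization matters for the 24GB-VRAM target: at higher precision the same parameter budget would not fit.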
Similar Articles
Given how good Qwen has become, is it time to grab a 128GB M5 Max?
User considers upgrading to 128GB M5 Max to run improved Qwen 27B models locally, noting near-Opus-4.5-level performance.
JANGQ-AI/MiniMax-M2.7-JANGTQ_K : mixed-bit quant of MiniMax M2.7 - 74 GB on disk
Release of a mixed-bit quantized version of the MiniMax M2.7 model, optimized to 74 GB for efficient local inference on Apple Silicon devices.
@remilouf: Following @julien_c’s tweet I bought a MacBook Pro with 128B unified memory, and started running Qwen3.6 as my daily dr…
The author shares their experience running the Qwen3.6 model on a MacBook Pro with 128GB of unified memory, praising Apple's hardware efficiency for local AI inference.
@PandaTalk8: These test results are stunning. The original poster tested the DS4 inference engine written in C by @antirez, and local deployment seems incredibly fast. The good news is that only 128GB of RAM is needed to run a local model equivalent to GPT-4o. The bad news is that you need a MacBook Pro with 128GB of RAM.
This article reports on tests of the DS4 inference engine written in C by @antirez, noting its impressive speed when running a GPT-4o-equivalent model on a MacBook Pro with 128GB of RAM.
@garrytan: Downloading now... 1M token context window with supposedly usable coding agent capability all on a 128GB Macbook Pro is
Garry Tan highlights a model with a 1M token context window and coding agent capabilities running locally on a 128GB MacBook Pro, expressing excitement about the milestone.