#community-discussion

Who is your favourite quant publisher and why?

Reddit r/LocalLLaMA · 17h ago

A user shares their preference for Unsloth quantized models due to fast releases and low perplexity, compares them with Apex MoE quants, and asks the community for their favorite quant publisher.


@HoodyLiu: What do you do while waiting for Claude Code to modify your code?

X AI KOLs Following · 3d ago

This post asks the community what they do while waiting for Claude Code to modify their codebase, highlighting the latency of AI coding assistants.


Stop letting LLMs edit your .bib [D]

Reddit r/MachineLearning · 2026-05-06

The post criticizes the reliance on Large Language Models for generating bibliographic entries, highlighting issues with hallucinated citations and incorrect author lists in academic papers.


Choosing a Mac Mini for local LLMs — what would YOU actually buy?

Reddit r/LocalLLaMA · 2026-04-21

A community discussion seeking advice on which Mac Mini configuration (M4, M2 Pro, or M1 Max) to buy for running local LLMs with Ollama and coding assistants; the decision is complicated by rumored M5 releases and current supply shortages.


About to build a 6× Arc B70 LLM rig, want to talk to someone experienced first

Reddit r/LocalLLaMA · 2026-04-20

A user seeks experienced guidance on building a 6× Intel Arc B70 LLM inference rig, particularly for Llama models and vLLM deployment, offering compensation for consultation.
