Tag: #apache-2-0

Cards List

@QuixiAI: What a beautiful release! Perfectly sized, FAST, and Apache 2.0. And American, for those customers who are picky about …

X AI KOLs Following · 4d ago

User @QuixiAI praises a newly released AI model, highlighting its size, speed, Apache 2.0 license, and American origin.


Granite 4.1 LLMs: How They’re Built

Hugging Face Blog · 2026-04-29

This article details the technical architecture and training pipeline of IBM's Granite 4.1 LLMs, covering the pre-training, supervised fine-tuning (SFT), and reinforcement learning (RL) stages. It highlights that the 8B dense model outperforms larger MoE counterparts and notes the release under the Apache 2.0 license.


poolside/Laguna-XS.2

Hugging Face Models Trending · 2026-04-23

Poolside releases Laguna XS.2, a 33B-parameter MoE model with 3B activated parameters, designed for agentic coding and local deployment on Macs with 36GB of RAM.


ibm-granite/granite-4.1-8b · Hugging Face

Reddit r/LocalLLaMA · 2026-04-21

IBM releases Granite-4.1-8B, an Apache 2.0 licensed 8B-parameter long-context instruct model with enhanced tool-calling and multilingual support.

