@BraceSproul: Chat LangChain has been revamped and re-open sourced. We've been working on a few improvements for a while now, and are very excited to finally open source them again! Want to see how a production Q&A agent that handles nearly 2T tokens a week is built? Check out the repo here: https://github.com/langchain-ai/chat-langchain


Summary

Chat LangChain has been revamped and re-open sourced as a production-ready documentation assistant agent built with LangGraph, capable of handling nearly 2 trillion tokens per week.



langchain-ai/chat-langchain

Source: https://github.com/langchain-ai/chat-langchain

Chat LangChain

A simple documentation assistant built with LangGraph.


Overview

This is a documentation assistant agent that helps answer questions about LangChain, LangGraph, and LangSmith. It demonstrates how to build a production-ready agent using:

  • LangGraph - For agent orchestration and state management
  • LangChain Agents - For agent creation with middleware support
  • Guardrails - To keep conversations on-topic

The repo also includes a Next.js frontend in frontend/ for the public chat UI.
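The middleware idea can be sketched in plain Python. This is a conceptual illustration only, with hypothetical names; the repo's actual middleware classes live in src/middleware/ and plug into LangChain's agent machinery rather than wrapping bare functions:

```python
from typing import Callable

# A "middleware" wraps a handler and can short-circuit or retry it.
Handler = Callable[[str], str]

def guardrails_middleware(
    handler: Handler,
    topics: tuple[str, ...] = ("langchain", "langgraph", "langsmith"),
) -> Handler:
    """Reject queries that mention none of the allowed topics."""
    def wrapped(query: str) -> str:
        if not any(t in query.lower() for t in topics):
            return "Sorry, I can only answer LangChain-related questions."
        return handler(query)
    return wrapped

def retry_middleware(handler: Handler, attempts: int = 3) -> Handler:
    """Re-run the handler a few times before giving up."""
    def wrapped(query: str) -> str:
        last_err: Exception | None = None
        for _ in range(attempts):
            try:
                return handler(query)
            except Exception as err:  # e.g. a transient model/API error
                last_err = err
        raise last_err
    return wrapped

def answer(query: str) -> str:
    # Stand-in for the real agent call (model + tools).
    return f"Answering: {query}"

# Middleware composes by nesting: guardrails run first, retries innermost.
agent = guardrails_middleware(retry_middleware(answer))
print(agent("How do checkpointers work in LangGraph?"))
print(agent("What's a good pizza recipe?"))
```

The nesting order matters: putting guardrails outermost means off-topic queries never touch the model at all.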

Features

  • Documentation Search - Searches official LangChain docs
  • Support KB - Searches the Pylon knowledge base for known issues
  • Link Validation - Verifies URLs before including them in responses
  • Guardrails - Filters off-topic queries
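Link validation can be approximated with the standard library. This is a sketch, not the repo's implementation (the actual tool lives in src/tools/link_check_tools.py):

```python
from urllib.parse import urlparse
from urllib.request import Request, urlopen

def is_well_formed(url: str) -> bool:
    """Cheap structural check before any network call."""
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and bool(parts.netloc)

def link_is_live(url: str, timeout: float = 5.0) -> bool:
    """HEAD the URL and treat any status below 400 as valid."""
    if not is_well_formed(url):
        return False
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except OSError:
        return False
```

Checking structure first avoids wasting a network round-trip on URLs an LLM has mangled outright.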

Quick Start

Prerequisites

  • Python 3.11+
  • uv (recommended) or pip

Installation

# Clone the repository
git clone https://github.com/langchain-ai/chat-langchain.git
cd chat-langchain

# Install dependencies with uv
uv sync

# Or with pip
pip install -e . "langgraph-cli[inmem]"

Configuration

# Copy environment template
cp .env.example .env

# Edit .env with your API keys

Required Environment Variables

Variable           Description
ANTHROPIC_API_KEY  Anthropic API key (or use another provider)
MINTLIFY_API_URL   Mintlify API base URL for docs search (e.g. https://api-dsc.mintlify.com/v1/search/docs.langchain.com)
MINTLIFY_API_KEY   Mintlify API key for docs search
PYLON_API_KEY      Pylon API key for support KB
PYLON_KB_ID        Pylon knowledge base ID for support articles
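A small startup check for the variables above can fail fast with a clear message. This helper is illustrative and not part of the repo:

```python
import os

REQUIRED_VARS = [
    "ANTHROPIC_API_KEY",
    "MINTLIFY_API_URL",
    "MINTLIFY_API_KEY",
    "PYLON_API_KEY",
    "PYLON_KB_ID",
]

def missing_vars(env=os.environ) -> list[str]:
    """Return the required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
```

Accepting the environment as a parameter keeps the check testable without mutating os.environ.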

Running Locally

Backend

# Start LangGraph development server
uv run langgraph dev

# Or with pip
langgraph dev

Open LangGraph Studio: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024

Frontend

cd frontend
npm ci
npm run dev:local

The frontend expects the LangGraph server at http://127.0.0.1:2024 by default. If you want trace sharing from the UI, set LANGSMITH_API_KEY in frontend/.env.local.

Project Structure

├── src/
│   ├── agent/
│   │   ├── docs_graph.py      # Main docs agent
│   │   └── config.py          # Model configuration
│   ├── tools/
│   │   ├── docs_tools.py      # Documentation search
│   │   ├── pylon_tools.py     # Support KB tools
│   │   └── link_check_tools.py # URL validation
│   ├── prompts/
│   │   └── docs_agent_prompt.py
│   └── middleware/
│       ├── guardrails_middleware.py
│       └── retry_middleware.py
├── frontend/                  # Next.js public chat UI
├── langgraph.json             # LangGraph configuration
└── pyproject.toml             # Python project config

How It Works

The agent uses a docs-first research strategy:

  1. Guardrails Check - Validates the query is LangChain-related
  2. Documentation Search - Searches official docs via Mintlify
  3. Knowledge Base - Searches Pylon for known issues/solutions
  4. Link Validation - Verifies any URLs before including them
  5. Response Generation - Synthesizes a helpful answer
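The five steps above can be sketched as a sequential pipeline. Every name here is illustrative: the stubs stand in for the real tools (Mintlify search, Pylon KB, link checking), and the real guardrail is model-driven rather than a keyword match:

```python
def passes_guardrails(query: str) -> bool:
    # Step 1: keep only LangChain-related queries.
    return any(t in query.lower() for t in ("langchain", "langgraph", "langsmith"))

def search_docs(query: str) -> list[str]:
    # Step 2: stand-in for the Mintlify docs search tool.
    return [f"doc snippet about {query!r}"]

def search_kb(query: str) -> list[str]:
    # Step 3: stand-in for the Pylon knowledge-base search tool.
    return []

def validate_links(urls: list[str]) -> list[str]:
    # Step 4: keep only URLs that check out (stubbed as pass-through).
    return urls

def answer(query: str) -> str:
    if not passes_guardrails(query):
        return "I can only help with LangChain, LangGraph, and LangSmith questions."
    context = search_docs(query) + search_kb(query)
    links = validate_links(["https://docs.langchain.com"])
    # Step 5: synthesize a response from the retrieved context (stubbed).
    return f"Based on {len(context)} source(s), see {links[0]}"
```

In the actual agent the ordering is a policy encoded in the prompt and graph, not a fixed function: the LLM decides when to call each tool, but docs search is instructed to come first.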

Deployment

LangGraph Cloud

  1. Push to GitHub
  2. Connect repository in LangSmith
  3. Configure environment variables
  4. Deploy

Resources

License

MIT
