Local manga translator with built-in LLM, written in Rust with llama.cpp integration

Reddit r/LocalLLaMA Tools

Summary

Koharu is an open-source Rust-based manga/image translator that combines object detection, visual LLM OCR, layout analysis, and inpainting, with llama.cpp integration supporting Gemma 4 and Qwen3.5 models.

Hi LocalLLaMA, I created a post a few weeks ago, but since then this project has become more reliable and easier to use. This is a manga translator that can also be used to translate any image. It uses a combination of object detection, visual LLM-based OCR, layout analysis, and fine-tuned inpainting models. I believe it is the most performant and easy-to-use pipeline for manga translation.

For the LLM part, I have integrated llama.cpp into this application; it supports the Gemma 4 family and the Qwen3.5 family, and also includes uncensored and fine-tuned models. It also supports OpenAI-compatible APIs, so you can use LM Studio, OpenRouter, etc.

I think the demo video explains the workflow well: basically you just click a button and it will run the pipeline for you. You can also proofread and edit the result, changing the font, size, color, etc. It's a mini Photoshop editor. For anyone interested, it's fully open-source: [https://github.com/mayocream/koharu](https://github.com/mayocream/koharu)
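The post describes the pipeline as a fixed sequence of stages (detection, OCR, layout, inpainting, translation). A minimal Rust sketch of that flow is shown below; the stage names and `run_pipeline` function are illustrative assumptions, not Koharu's actual API.

```rust
// Hypothetical sketch of a manga-translation pipeline like the one the post
// describes. Stage names are assumptions for illustration, not Koharu's code.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Stage {
    Detect,    // object detection: locate speech bubbles
    Ocr,       // visual LLM-based OCR: read the original text
    Layout,    // layout analysis: determine reading order
    Inpaint,   // inpainting model: erase the original text
    Translate, // LLM translation via llama.cpp or an OpenAI-compatible API
    Typeset,   // render the translated text back into the bubble
}

// Run the stages in order, returning a log of what was applied.
fn run_pipeline(stages: &[Stage]) -> Vec<String> {
    stages.iter().map(|s| format!("{s:?}")).collect()
}

fn main() {
    let order = [
        Stage::Detect,
        Stage::Ocr,
        Stage::Layout,
        Stage::Inpaint,
        Stage::Translate,
        Stage::Typeset,
    ];
    let log = run_pipeline(&order);
    println!("{}", log.join(" -> "));
}
```

The "one button" workflow from the post corresponds to calling `run_pipeline` with the full stage list; the proofreading/editing step would then operate on the typeset output.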

Similar Articles

GGML and llama.cpp join HF to ensure the long-term progress of Local AI

Hugging Face Blog

GGML and llama.cpp have joined Hugging Face to ensure long-term sustainability of local AI development. Georgi Gerganov's team will maintain full autonomy over the projects while receiving resources to scale community support and improve integration between llama.cpp inference and transformers model definitions.

llama.cpp is the linux of llm

Reddit r/LocalLLaMA

The article draws a parallel between llama.cpp and Linux, positioning the open-source library as foundational infrastructure for running large language models.