Liquid Audio - Speech-to-Speech audio models by Liquid AI
A hackable library for running and fine-tuning modern transformer models on commodity and alternative GPUs, powered by tinygrad.
ComfyUI nodes for Batch Image Captioning + various Vision-Language (VL) models, including InternVL3.5, Xiaomi MiMo-VL, LiquidAI LFM2-VL, Kwai Keye-VL, AIDC-AI Ovis2.5 and Ovis-U1. Models: Ovis2.5-2B, Ovis2.5-9B, Keye-VL-8B-Preview, MiMo-VL-7B-RL-GGUF, LFM2-VL-450M, LFM2-VL-1.6B, Ovis-U1-3B, InternVL3_5-1B/2B/4B/8B/14B/38B
Advanced PDF/Document Translator with interactive comparison. Built on IBM Docling.
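For context on the Docling side of that pipeline, here is a minimal sketch of converting a PDF into structured text before translation, assuming the docling package's documented DocumentConverter quickstart API (the file name is a placeholder):

    from docling.document_converter import DocumentConverter

    converter = DocumentConverter()
    result = converter.convert("paper.pdf")          # local path or URL to the source document
    markdown = result.document.export_to_markdown()  # structured text a translator can work on
    print(markdown[:500])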
LFM2-powered research agent | Automated Related Work & citations | Semantic Scholar + arXiv
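As an illustration of the citation lookup such an agent performs, a hedged sketch against the public Semantic Scholar Graph API; the query and fields below are placeholders, not the repo's actual calls:

    import requests

    resp = requests.get(
        "https://api.semanticscholar.org/graph/v1/paper/search",
        params={"query": "liquid foundation models", "fields": "title,year,externalIds", "limit": 5},
        timeout=30,
    )
    resp.raise_for_status()
    for paper in resp.json().get("data", []):
        print(paper.get("year"), "-", paper.get("title"))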
🎯 The world's first fully internationalized LFM2-350M chat interface with native support for 8 languages!
Fine-tuning LFM2-1.2B for Korean-English bidirectional translation. GRPO+COMET & SFT Training, outperforming 4B models.
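A hedged sketch of the SFT stage only (the GRPO+COMET reward stage is omitted), using TRL's SFTTrainer; the toy dataset, checkpoint name, and hyperparameters are placeholders, not the repo's actual training setup:

    from datasets import Dataset
    from trl import SFTConfig, SFTTrainer

    # Toy chat-formatted translation pair; the real project trains on a full Korean-English corpus.
    pairs = [{"messages": [
        {"role": "user", "content": "Translate to Korean: Good morning."},
        {"role": "assistant", "content": "좋은 아침입니다."},
    ]}]
    dataset = Dataset.from_list(pairs)

    trainer = SFTTrainer(
        model="LiquidAI/LFM2-1.2B",  # assumed base checkpoint name
        train_dataset=dataset,
        args=SFTConfig(output_dir="lfm2-ko-en-sft", max_steps=10),
    )
    trainer.train()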
Anywhere-LFM is a lightweight desktop application designed to make it easy to run LiquidAI/LFM2 models (350M to 1.2B) locally. It lets anyone chat with a capable generative AI without needing a command line or complex configuration.
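For a sense of what such an app wraps, a minimal sketch of one local LFM2 chat turn with Hugging Face transformers, assuming the LiquidAI/LFM2-350M checkpoint and its standard chat template:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "LiquidAI/LFM2-350M"  # assumed published checkpoint name
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    messages = [{"role": "user", "content": "Summarize what LFM2 models are."}]
    inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
    outputs = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))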
Minimal Android app for running local inference with LLM models
Blazing fast NPCs with Spatio-Temporal awareness inside Unreal Engine
Fine-tuning LiquidAI/LFM2-VL-1.6B in Colab (LoRA/4-bit) + dataset template + probe test.
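In the spirit of that notebook, a hedged sketch of a LoRA + 4-bit setup using transformers' BitsAndBytesConfig and peft; the model class and target module names are assumptions that may differ from the repo's actual configuration:

    import torch
    from transformers import AutoModelForImageTextToText, BitsAndBytesConfig
    from peft import LoraConfig, get_peft_model

    bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)
    model = AutoModelForImageTextToText.from_pretrained(
        "LiquidAI/LFM2-VL-1.6B", quantization_config=bnb, device_map="auto"
    )

    lora = LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"], lora_dropout=0.05)
    model = get_peft_model(model, lora)
    model.print_trainable_parameters()  # only the LoRA adapters remain trainable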
C# example for LM Studio integration: complete API coverage (chat, vision, embeddings, model management), tool support on the dev branch, plus events, async, disposal, and error handling. Ready-to-run Windows Forms application with extensively documented code.
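The repo itself is C#, but LM Studio's local server speaks an OpenAI-compatible API, so the same chat call can be sketched in Python; the endpoint and model identifier below are assumptions based on LM Studio's defaults:

    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # default local server
    reply = client.chat.completions.create(
        model="lfm2-1.2b",  # placeholder: use the identifier LM Studio shows for the loaded model
        messages=[{"role": "user", "content": "Hello from the lfm2 topic page!"}],
    )
    print(reply.choices[0].message.content)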
Llama.cpp Inference + voice-ui-kit
LFM2 350M Math, a large language model that runs in web browsers via 🤗 Transformers.js and ONNX Runtime Web.