skylarmb/wikid

wikid 📚

An offline research assistant powered by vLLM and ZIM files. Because sometimes you want to ask Wikipedia questions without the internet.

What is this?

A fun side project that lets you chat with an AI that can search through offline knowledge bases (Wikipedia, Stack Exchange, etc.) stored as ZIM files. Built with vLLM for fast local inference and 100% vibe-coded with Claude.

Quick Start

# Install dependencies
uv sync

# Start the vLLM server (in one terminal)
uv run wikid-server

# Chat with your offline assistant (in another terminal)
uv run wikid-chat
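
Under the hood, the chat client presumably talks to the vLLM server over vLLM's OpenAI-compatible HTTP API. A minimal stdlib-only sketch of that round trip is below — the port (`8000`, vLLM's default) and endpoint URL are assumptions about how `wikid-server` is configured, not part of this repo's documented API:

```python
import json
import urllib.request

# Assumed: wikid-server exposes vLLM's OpenAI-compatible API on the default port.
API_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_payload(messages, model="Hermes-2-Pro-Llama-3-8B-AWQ", stream=False):
    """Build a chat-completion request body for an OpenAI-compatible server."""
    return {"model": model, "messages": messages, "stream": stream}

def ask(question: str) -> str:
    """Send one question to the local server and return the assistant's reply."""
    payload = build_chat_payload([{"role": "user", "content": question}])
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With the server running, `ask("How do I enable a systemd service?")` returns the model's answer as plain text; set `stream=True` and read the response line by line to get streaming output instead.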

What's in the box?

  • 🤖 Local AI assistant using Hermes-2-Pro-Llama-3-8B-AWQ (sized for 16 GB of VRAM; swapping in another model is easy)
  • 📁 Offline knowledge search through ZIM files
  • 💬 Interactive chat with streaming responses
  • 🎨 Markdown rendering in your terminal
  • 🔧 Tool calling for research tasks
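
The tool-calling piece boils down to mapping the function-call JSON the model emits onto registered Python callables. This is an illustrative sketch, not the project's actual code — `search_knowledge_base` is a hypothetical stand-in for a real ZIM search tool:

```python
import json

# Registry mapping tool names to Python callables.
TOOLS = {}

def tool(fn):
    """Decorator: register a function so the model can call it by name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def search_knowledge_base(query: str, limit: int = 3) -> list:
    """Hypothetical stand-in for a ZIM search; returns fake titles."""
    return [f"Article about {query} #{i}" for i in range(1, limit + 1)]

def dispatch(tool_call_json: str) -> str:
    """Run one tool call emitted by the model.

    Accepts OpenAI-style JSON: {"name": ..., "arguments": ...}.
    """
    call = json.loads(tool_call_json)
    args = call["arguments"]
    if isinstance(args, str):
        # OpenAI-style calls encode arguments as a nested JSON string.
        args = json.loads(args)
    result = TOOLS[call["name"]](**args)
    # Tool results go back to the model as a JSON string.
    return json.dumps(result)
```

The chat loop would append each `dispatch(...)` result to the conversation as a tool-role message and let the model continue from there.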

Current Status

The basic infrastructure is done; ZIM file integration is up next. See PROJECT.md for the full roadmap.
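
One plausible shape for that integration is `python-libzim` (an assumption — the project may pick something else): open the archive, fetch an entry by path, and strip the article's HTML down to text for the model. The libzim calls below follow that library's reader API; the HTML-to-text helper is plain stdlib:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from an HTML article, skipping scripts and styles."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def html_to_text(html: str) -> str:
    """Flatten article HTML into whitespace-normalized plain text."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(" ".join(parser.parts).split())

def read_article(zim_path: str, entry_path: str) -> str:
    """Fetch one article from a ZIM archive (requires `pip install libzim`)."""
    from libzim.reader import Archive  # third-party dependency
    archive = Archive(zim_path)
    entry = archive.get_entry_by_path(entry_path)
    html = bytes(entry.get_item().content).decode("utf-8")
    return html_to_text(html)
```

Hooked up to the tool-calling layer, `read_article` would be the function the model invokes when it wants the body of a Wikipedia page.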

Why?

The goal was to experiment with function calling / tool use on local models, and to build a reliable AI assistant that can provide factual information even in a completely off-grid / offline / zombie-apocalypse scenario. Because when the internet goes down, you still need to know how to configure systemd services! 🧟‍♂️

About

Local inference + offline ZIM files (Wikipedia, Stack Exchange, etc.) = an apocalypse-proof AI research agent
