Ollama Chat

A modern Streamlit-based chat interface for interacting with Ollama LLMs.

Features

  • Clean and modern chat interface
  • Support for multiple Ollama models
  • Adjustable temperature settings
  • Custom system prompts
  • Chat history
  • Responsive design

Prerequisites

  • Python 3 and pip
  • Ollama installed and running locally (see https://ollama.com)
  • At least one model pulled, e.g. ollama pull llama3

Installation

  1. Clone this repository:

     git clone <your-repo-url>
     cd <your-repo-directory>

  2. Install the required dependencies:

     pip install -r requirements.txt

  3. Make sure Ollama is running on your system (default: http://localhost:11434).
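
You can check that Ollama is reachable by listing the locally installed models through its REST API (GET /api/tags); a running instance returns a JSON list of models:

```
curl http://localhost:11434/api/tags
```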

Usage

  1. Start the Streamlit app:

     streamlit run app.py

  2. Open your web browser and navigate to the URL shown in the terminal (typically http://localhost:8501).

  3. In the sidebar:

     • Select your preferred Ollama model
     • Adjust the temperature setting
     • Customize the system prompt if desired

  4. Start chatting with the AI! (A sketch of how these controls fit together is shown below.)
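
The sidebar controls and the preserved chat history roughly follow Streamlit's standard chat pattern. The sketch below is illustrative rather than the app's exact code: the ollama_client module, the OllamaClient.chat method, and the hard-coded model names are assumptions (in the real app, the model list would come from your Ollama instance).

```python
import streamlit as st

# Hypothetical client wrapper; see the Configuration section for a sketch of it.
from ollama_client import OllamaClient  # assumed module and class name

client = OllamaClient()  # defaults to http://localhost:11434

# --- Sidebar controls ---
model = st.sidebar.selectbox("Model", ["llama3", "mistral"])  # example names
temperature = st.sidebar.slider("Temperature", 0.0, 2.0, 0.7)
system_prompt = st.sidebar.text_area("System prompt", "You are a helpful assistant.")

# --- Chat history preserved in session state across Streamlit reruns ---
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the existing conversation so it survives each rerun
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# --- Handle a new user message ---
if prompt := st.chat_input("Say something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # client.chat is an assumed method; see the Configuration section
    reply = client.chat(
        model=model,
        messages=[{"role": "system", "content": system_prompt},
                  *st.session_state.messages],
        temperature=temperature,
    )
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)
```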

Configuration

The application connects to Ollama at http://localhost:11434 by default. If your Ollama instance is running on a different host or port, you can modify the base_url parameter in the OllamaClient class initialization.
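
For reference, a minimal client along those lines might look like the following. This is a sketch against Ollama's REST API (POST /api/chat), not the repository's actual implementation; only the class name and the default base_url come from the paragraph above, and the chat method signature is an assumption.

```python
import requests

class OllamaClient:
    def __init__(self, base_url: str = "http://localhost:11434"):
        # Point this at your Ollama instance if it is not on the default host/port.
        self.base_url = base_url.rstrip("/")

    def chat(self, model: str, messages: list[dict], temperature: float = 0.7) -> str:
        # POST /api/chat is Ollama's chat endpoint; "options" carries
        # sampling parameters such as temperature.
        resp = requests.post(
            f"{self.base_url}/api/chat",
            json={
                "model": model,
                "messages": messages,
                "stream": False,
                "options": {"temperature": temperature},
            },
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["message"]["content"]
```

With a client like this, pointing the app at a remote instance is a one-line change, e.g. OllamaClient(base_url="http://192.168.1.10:11434").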

License

MIT License
