A modern Streamlit-based chat interface for interacting with Ollama LLMs.
- Clean and modern chat interface
- Support for multiple Ollama models
- Adjustable temperature settings
- Custom system prompts
- Chat history
- Responsive design
- Python 3.8 or higher
- Ollama installed and running locally (https://ollama.ai/)
- Clone this repository:
  git clone <your-repo-url>
  cd <your-repo-directory>

- Install the required dependencies:

  pip install -r requirements.txt

- Make sure Ollama is running on your system (default: http://localhost:11434)
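If you want to confirm that Ollama is reachable before launching the app, a quick sanity check along these lines works (this script is not part of the repo; it assumes the default address and uses Ollama's `/api/tags` endpoint, which lists locally pulled models):

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama address

try:
    # /api/tags lists the models that have been pulled locally
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is running. Available models:", models or "none pulled yet")
except requests.exceptions.RequestException as exc:
    print(f"Could not reach Ollama at {OLLAMA_URL}: {exc}")
```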
- Start the Streamlit app:

  streamlit run app.py

- Open your web browser and navigate to the URL shown in the terminal (typically http://localhost:8501).
- In the sidebar (a sketch of these controls appears after this list):
  - Select your preferred Ollama model
  - Adjust the temperature setting
  - Customize the system prompt if desired
- Start chatting with the AI!
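The app's own sidebar code is not shown here, but the settings above map naturally onto standard Streamlit sidebar widgets. A minimal sketch, with example model names and defaults (not necessarily the ones the app uses), might look like this:

```python
import streamlit as st

with st.sidebar:
    # Model names below are examples; the app lists whatever models
    # your local Ollama instance has pulled.
    model = st.selectbox("Model", ["llama2", "mistral", "codellama"])

    # Higher temperature -> more varied replies; lower -> more deterministic.
    temperature = st.slider("Temperature", min_value=0.0, max_value=1.0,
                            value=0.7, step=0.05)

    # The system prompt steers the assistant's overall behaviour.
    system_prompt = st.text_area("System prompt",
                                 value="You are a helpful assistant.")
```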
The application connects to Ollama at http://localhost:11434 by default. If your Ollama instance is running on a different host or port, you can modify the base_url parameter in the OllamaClient class initialization.
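For example, if Ollama is served from another machine, the client could be constructed along these lines. Treat this as a sketch: the exact constructor signature and import path depend on how OllamaClient is defined in this repo, and the host/port shown are placeholders.

```python
# Sketch: assumes OllamaClient accepts a base_url keyword argument, as noted above.
# Import OllamaClient from wherever it is defined in this repo (e.g. app.py).
client = OllamaClient(base_url="http://192.168.1.50:11434")  # example remote host/port
```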
MIT License