LLM Chat
A chat application built with C++, Qt 6 and Ollama
Overview
A modern desktop chat application built with Qt 6 and C++ for interacting with Large Language Models through Ollama. The application provides a sleek Material Design interface and robust features for AI-powered conversations.
Key Features
- 🎨 Modern Material Design UI - Clean, intuitive interface with dark mode support
- 💬 Chat Interface - Multiple conversation threads with message history
- 🔄 Real-time Streaming - Live streaming of AI responses
- 🌐 Ollama Integration - Seamless connection to local Ollama server
- ⚡ High Performance - Built with C++ and Qt for optimal speed
- 🛠️ Customizable Settings - Configurable options (see the settings sketch after this list), including:
  - Server URL
  - Model selection
  - System prompts
  - Keyboard shortcuts
  - UI preferences
- 🖥️ Cross-platform - Supports Windows and Linux
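These options are persisted between sessions (see "Settings persistence" under Key Components). As a minimal sketch, they could be stored with Qt's QSettings; the group, key names, and defaults below are illustrative assumptions, not the application's actual schema.

```cpp
// Hypothetical sketch: persisting configurable options with QSettings.
// Group names, key names, and defaults are illustrative, not the app's real schema.
#include <QCoreApplication>
#include <QSettings>
#include <QString>

struct ChatSettings
{
    QString serverUrl;
    QString model;
    QString systemPrompt;

    // Write the current values to the platform's native settings store
    // (registry on Windows, an INI-style file on Linux).
    void save() const
    {
        QSettings settings;
        settings.beginGroup("ollama");
        settings.setValue("serverUrl", serverUrl);
        settings.setValue("model", model);
        settings.setValue("systemPrompt", systemPrompt);
        settings.endGroup();
    }

    // Read previously stored values, falling back to sensible defaults.
    static ChatSettings load()
    {
        QSettings settings;
        settings.beginGroup("ollama");
        ChatSettings s;
        s.serverUrl = settings.value("serverUrl", "http://localhost:11434").toString();
        s.model = settings.value("model", "llama3").toString();      // default model is an assumption
        s.systemPrompt = settings.value("systemPrompt").toString();
        settings.endGroup();
        return s;
    }
};

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    // The organization/application names decide where QSettings stores its data.
    QCoreApplication::setOrganizationName("ExampleOrg");   // placeholder values
    QCoreApplication::setApplicationName("LLMChat");

    ChatSettings settings = ChatSettings::load();
    settings.systemPrompt = "You are a helpful assistant.";
    settings.save();
    return 0;
}
```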
Technical Details
Architecture
- Frontend: Qt Quick/QML for modern UI components
- Backend: C++ core built on the Qt framework (exposed to QML as sketched after this list)
- Build System: CMake with CPM for dependency management
- Testing: Comprehensive unit tests using Catch2
- CI/CD: GitHub Actions for automated builds and testing
- Code Quality:
  - SonarCloud integration
  - Clang-tidy static analysis
  - CodeQL security scanning
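To illustrate the frontend/backend split, a C++ backend type can be registered with the QML engine and then instantiated from Qt Quick. The class name, module URI, and QML entry point below are hypothetical; the project's actual types and module layout will differ.

```cpp
// Hypothetical sketch: exposing a C++ backend type to the Qt Quick front end.
// "ChatBackend", the "LLMChat" module URI, and the QML file path are illustrative names.
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QObject>
#include <QString>
#include <QtQml>

class ChatBackend : public QObject
{
    Q_OBJECT
    Q_PROPERTY(QString model READ model WRITE setModel NOTIFY modelChanged)

public:
    explicit ChatBackend(QObject *parent = nullptr) : QObject(parent) {}

    QString model() const { return m_model; }
    void setModel(const QString &model)
    {
        if (model == m_model)
            return;
        m_model = model;
        emit modelChanged();
    }

    // Callable from QML, e.g. onClicked: backend.sendMessage(input.text)
    Q_INVOKABLE void sendMessage(const QString &text) { emit messageSent(text); }

signals:
    void modelChanged();
    void messageSent(const QString &text);

private:
    QString m_model{"llama3"};  // default model name is an assumption
};

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    QQmlApplicationEngine engine;

    // Register the backend so QML can instantiate it after "import LLMChat 1.0".
    qmlRegisterType<ChatBackend>("LLMChat", 1, 0, "ChatBackend");

    // Entry point is illustrative; the real project loads its own QML module.
    engine.load(QUrl(QStringLiteral("qrc:/Main.qml")));

    return app.exec();
}

#include "main.moc"  // needed because the QObject subclass lives in this .cpp (assumes CMake AUTOMOC)
```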
Development Environment
The project includes a complete development container setup with:
- GCC 14 / Clang 18
- Qt 6.8.0
- CMake 3.27+
- Code analysis tools
- Formatting tools
Key Components
- Chat backend with Ollama API integration (see the streaming sketch after this list)
- Thread management system
- Real-time message streaming
- Settings persistence
- Custom QML components
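To give a concrete flavour of the Ollama integration and real-time streaming, here is a minimal, self-contained sketch that posts a chat request to a local Ollama server and prints the response fragments as they arrive. It assumes Ollama's default endpoint (http://localhost:11434/api/chat) and a locally pulled model named "llama3"; it is not the project's actual backend code.

```cpp
// Hypothetical sketch: streaming a chat completion from a local Ollama server.
// Endpoint and model name are assumptions; adapt to your local setup.
#include <QCoreApplication>
#include <QJsonArray>
#include <QJsonDocument>
#include <QJsonObject>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <cstdio>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    QNetworkAccessManager manager;

    // Ollama's /api/chat endpoint streams newline-delimited JSON by default.
    QNetworkRequest request(QUrl("http://localhost:11434/api/chat"));
    request.setHeader(QNetworkRequest::ContentTypeHeader, "application/json");

    const QJsonObject body{
        {"model", "llama3"},  // assumes this model has been pulled locally
        {"messages", QJsonArray{QJsonObject{{"role", "user"}, {"content", "Hello!"}}}},
        {"stream", true}
    };

    QNetworkReply *reply = manager.post(request, QJsonDocument(body).toJson());

    // Each readyRead delivers one or more NDJSON lines; print the "content"
    // fragment of each chunk as it arrives.
    QObject::connect(reply, &QNetworkReply::readyRead, [reply]() {
        const QList<QByteArray> lines = reply->readAll().split('\n');
        for (const QByteArray &line : lines) {
            if (line.trimmed().isEmpty())
                continue;
            const QJsonObject chunk = QJsonDocument::fromJson(line).object();
            const QString fragment = chunk["message"].toObject()["content"].toString();
            fputs(fragment.toUtf8().constData(), stdout);
        }
    });

    QObject::connect(reply, &QNetworkReply::finished, &app, [reply, &app]() {
        reply->deleteLater();
        app.quit();
    });

    return app.exec();
}
```

A production implementation would buffer NDJSON lines that are split across readyRead calls and deliver the fragments to the UI rather than stdout.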
Development Practices
- Modern C++23 standards
- Comprehensive error handling (one possible C++23 pattern is sketched after this list)
- Memory safety focus
- Extensive documentation
- Automated testing
- CI/CD pipeline integration
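As one example of what C++23-style error handling can look like (not necessarily the exact pattern used in this codebase), recoverable failures can be propagated with std::expected instead of exceptions. The error type and function below are illustrative only.

```cpp
// Hypothetical sketch of error handling with C++23 std::expected.
// ApiError and extractContent are illustrative, not part of the project's API.
#include <expected>
#include <print>
#include <string>

enum class ApiError { NetworkFailure, InvalidJson, ModelNotFound };

std::expected<std::string, ApiError> extractContent(const std::string &chunk)
{
    if (chunk.empty())
        return std::unexpected(ApiError::InvalidJson);
    return chunk;  // real code would parse the JSON payload here
}

int main()
{
    const auto result = extractContent("");
    if (!result)
        std::println("request failed with error code {}", static_cast<int>(result.error()));
    else
        std::println("content: {}", *result);
}
```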
Project Structure
The project follows a clean, modular architecture:
- core
- chat
- quick
- qml
Build & Development
Prerequisites
- Qt 6.8.0+
- CMake 3.27+
- C++23-compatible compiler
- Ollama server
Quick Start
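With the prerequisites in place, the usual CMake workflow should apply: configure the project with CMake (third-party dependencies are fetched via CPM), build, and launch the resulting executable against a running local Ollama server. The exact targets and options are defined in the top-level CMakeLists.txt and exercised by the GitHub Actions workflows, which are the authoritative reference for the build commands.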
License
This project is licensed under the MIT License. See the LICENSE file for details.