RLAMA
A powerful document question-answering tool that connects to your local Ollama models. Create, manage, and interact with RAG systems for all your document needs.
Available for macOS, Linux, and Windows
Key Features
Everything you need to build powerful document-based question-answering systems
Simple Setup
Create and configure RAG systems with just a few commands and minimal setup.
Multiple Document Formats
Support for PDFs, Markdown, Text files, and more with intelligent parsing.
Offline First
100% local processing with no data sent to external servers.
Intelligent Chunking
Smart document segmentation for optimal context retrieval.
Interactive Sessions
Query your documents with natural language in an interactive terminal.
Document Watching
Automatically update your RAG systems when documents change.
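As a quick illustration of document watching, here is a minimal sketch; it uses the watch commands listed in the Command Reference further down, with placeholder RAG and directory names, and the argument order and minute-based interval are assumptions rather than authoritative syntax.

    # Re-index ./docs for the RAG named "my-docs" every 60 minutes (argument order assumed)
    rlama watch my-docs ./docs 60

    # Trigger an immediate check of the watched directory for new files
    rlama check-watched my-docs

    # Turn watching off again
    rlama watch-off my-docs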
RLAMA in Action
See how simple it is to create and use RAG systems with our intuitive CLI
Create a RAG System
Index a folder of documents to create a new RAG system. Supports multiple file formats and embedding models.
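A minimal sketch of that workflow is shown below. The model name, RAG name, and folder path are placeholders, and the positional argument order is an assumption; check rlama --help for the authoritative syntax.

    # Create a RAG named "docs-rag" from ./documents using a local Ollama model (placeholder names)
    rlama rag llama3 docs-rag ./documents

    # Open an interactive question-answering session against it
    rlama run docs-rag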
Popular Use Cases
Discover how RLAMA can be used in various scenarios
Technical Documentation
Query your project documentation, manuals, and specifications with ease.
Private Knowledge Base
Create secure RAG systems for sensitive documents with full privacy.
Research Assistant
Query research papers, textbooks, and study materials for faster learning.
Command Reference
A complete reference of all available commands to master RLAMA; typical invocations are sketched after the list
Create a new RAG system from documents
Start an interactive session with a RAG system
List all available RAG systems
Delete a RAG system
Set up directory watching for a RAG system
Disable directory watching for a RAG system
Check a RAG's watched directory for new files
Start API server
Update RLAMA to the latest version
Display RLAMA version
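The descriptions above correspond to invocations along the following lines. The command names come from the reference itself; the positional arguments, the port flag, and the version flag form are assumptions sketched for illustration, not authoritative syntax.

    rlama rag [model] [rag-name] [folder-path]       # Create a new RAG system from documents
    rlama run [rag-name]                             # Start an interactive session with a RAG system
    rlama list                                       # List all available RAG systems
    rlama delete [rag-name]                          # Delete a RAG system
    rlama watch [rag-name] [directory] [interval]    # Set up directory watching for a RAG system
    rlama watch-off [rag-name]                       # Disable directory watching for a RAG system
    rlama check-watched [rag-name]                   # Check a RAG's watched directory for new files
    rlama api --port [port]                          # Start the API server (port flag assumed)
    rlama update                                     # Update RLAMA to the latest version
    rlama --version                                  # Display the RLAMA version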
Common Issues & Solutions
Quick fixes for the most frequently encountered problems
Supported File Formats
RLAMA supports a wide variety of document formats to meet all your needs