IntraMind is a local intelligence system that transforms an organization's internal documents — PDFs, Word files, reports, and even scanned papers — into a smart, searchable knowledge network.
It runs securely inside the company's own network (LAN), ensuring full data privacy while delivering AI-powered answers and insights in real time.
- Runs inside your local servers; no data ever leaves the network.
- Delivers results instantly, even when completely offline.
- Combines powerful document retrieval with AI-driven reasoning.
- A single solution for colleges, corporate offices, R&D labs, and government departments.
Pipeline: Documents → Ingestion → Embeddings → Vector DB → Query → LLM → Response
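To make the flow concrete, here is a minimal Python sketch of how these stages could fit together, assuming the components named elsewhere on this page (the all-MiniLM-L6-v2 sentence-transformers model for embeddings, ChromaDB as the vector store, and the ollama client for Llama 3). It illustrates the shape of the pipeline, not IntraMind's actual implementation.

```python
# Illustrative pipeline sketch: ingestion -> embeddings -> vector DB -> query -> LLM.
# Assumed components (not IntraMind's actual code): sentence-transformers,
# chromadb, and the ollama Python client, all running locally.
import chromadb
import ollama
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")
client = chromadb.PersistentClient(path="./vector_db")
collection = client.get_or_create_collection("documents")

def ingest(doc_id: str, text: str) -> None:
    # Ingestion: embed the document text and store it in the local vector DB.
    embedding = embedder.encode(text).tolist()
    collection.add(ids=[doc_id], embeddings=[embedding], documents=[text])

def answer(query: str) -> str:
    # Query: embed the question, retrieve the closest documents, and let the
    # locally served LLM compose a grounded response.
    query_embedding = embedder.encode(query).tolist()
    results = collection.query(query_embeddings=[query_embedding], n_results=3)
    context = "\n\n".join(results["documents"][0])
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    reply = ollama.chat(model="llama3", messages=[{"role": "user", "content": prompt}])
    return reply["message"]["content"]
```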
- Supports PDF, DOCX, images, and scanned documents with OCR (see the extraction sketch after this list)
- Vector embeddings and semantic search for accurate context retrieval
- All data stored securely on your own infrastructure
- Runs local models such as Llama 3 and Mistral via Ollama
- Zero external API calls; complete data sovereignty
- Deploys on your internal network infrastructure
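As an example of the OCR-backed ingestion mentioned above, the sketch below pulls text from digital PDFs and falls back to local OCR for scanned images. The library choices (pypdf, Pillow, pytesseract) and the sample file path are assumptions for illustration; IntraMind's own extraction code may differ.

```python
# Sketch of local text extraction with an OCR fallback for scanned documents.
# Assumed libraries (illustrative only): pypdf, Pillow, pytesseract.
from pathlib import Path

import pytesseract
from PIL import Image
from pypdf import PdfReader

def extract_text(path: Path) -> str:
    if path.suffix.lower() == ".pdf":
        # Digital PDFs: read the embedded text layer page by page.
        reader = PdfReader(str(path))
        return "\n".join(page.extract_text() or "" for page in reader.pages)
    # Scanned pages and images: run Tesseract OCR locally,
    # so nothing leaves the machine.
    return pytesseract.image_to_string(Image.open(path))

if __name__ == "__main__":
    # Hypothetical file under the configured documents path.
    print(extract_text(Path("./documents/sample.pdf"))[:500])
```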
```bash
# Clone the repository
$ git clone https://github.com/crux-ecosystem/IntraMind-Showcase.git
$ cd IntraMind-Showcase

# Install dependencies
$ pip install -r requirements.txt

# Start IntraMind
$ python main.py
```
```yaml
# IntraMind Configuration
model:
  embedding: "all-MiniLM-L6-v2"
  llm: "llama3"

storage:
  vector_db: "chromadb"
  documents_path: "./documents"

server:
  host: "0.0.0.0"
  port: 8080
  lan_only: true
```
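Assuming these settings live in a YAML file (the name config.yaml below is illustrative, and PyYAML is used only for the example), the configuration could be read like this:

```python
# Minimal sketch of loading the configuration shown above.
# Assumptions: the settings are stored in config.yaml and PyYAML is installed.
import yaml

with open("config.yaml", encoding="utf-8") as fh:
    config = yaml.safe_load(fh)

print(config["model"]["llm"])        # llama3
print(config["server"]["lan_only"])  # True
```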
- Colleges & Universities: Search through research papers, course materials, and administrative documents instantly.
- Corporate Offices: Access company policies, reports, and internal documentation with AI-powered search.
- R&D Labs: Query technical documents, research papers, and experimental data securely.
- Government Departments: Secure document management and retrieval for sensitive information.