Local LLM Infrastructure: Self-Hosted AI on Mac Mini
Production-grade self-hosted AI infrastructure on a Mac Mini M4 Pro. Complete automation with Ollama, Open WebUI, and an MCP bridge. A deep dive into LLM deployment, RAG implementation, and enterprise-ready AI systems.
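To make the RAG piece concrete, here is a minimal retrieval sketch against a local Ollama instance. It assumes Ollama is running on its default port with an embedding model pulled locally (the `nomic-embed-text` name is an illustrative choice, not the project's confirmed model), and it ranks documents by cosine similarity before they would be injected into a prompt.

```python
import json
import math
import urllib.request

OLLAMA_URL = "http://localhost:11434"      # default local Ollama port
EMBED_MODEL = "nomic-embed-text"           # assumed embedding model; swap in your own

def embed(text: str) -> list[float]:
    """Fetch an embedding vector from Ollama's /api/embeddings endpoint."""
    payload = json.dumps({"model": EMBED_MODEL, "prompt": text}).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query: str, docs: list[str], k: int = 3) -> list[str]:
    """Return the k documents most similar to the query embedding."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]
```

In a fuller pipeline the document embeddings would be computed once and cached in a vector store rather than re-embedded per query; this sketch keeps everything in-process to show the retrieval step alone.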
Local LLM Setup: Self-Hosted AI Infrastructure
Self-hosted AI infrastructure providing private, offline inference. No cloud dependencies, no subscriptions, complete data sovereignty. Built on Ollama, Qwen 2.5:14B, Open WebUI, and RAG integration.
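As a quick illustration of what the stack looks like from code, the sketch below sends a prompt to the local Ollama server via its `/api/generate` endpoint. It assumes the default port and that the `qwen2.5:14b` model has already been pulled; the `ask` helper name is ours, not part of the setup itself.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

def ask(prompt: str, model: str = "qwen2.5:14b") -> str:
    """Send a prompt to the local model and return the full, non-streamed response."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    # Everything stays on the machine: no API keys, no outbound traffic.
    print(ask("Summarize the benefits of running LLMs locally."))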
AI Memory System: Building Personal AI Workflow Automation
Personal knowledge management infrastructure that maintains context across AI conversations. Solves the "context loss" problem where each session starts fresh. A single JSONL ledger shared by Claude, ChatGPT, and local LLMs.
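A minimal sketch of how such a JSONL ledger could work is shown below: one JSON record per line, appended during a session and read back at the start of the next. The file path and record fields are illustrative assumptions, not the project's actual schema.

```python
import json
import time
from pathlib import Path

LEDGER = Path("memory/ledger.jsonl")  # assumed location; one JSON record per line

def remember(source: str, topic: str, note: str) -> None:
    """Append a memory entry so a later session can pick up where this one left off."""
    LEDGER.parent.mkdir(parents=True, exist_ok=True)
    entry = {"ts": time.time(), "source": source, "topic": topic, "note": note}
    with LEDGER.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry, ensure_ascii=False) + "\n")

def recall(topic: str, limit: int = 5) -> list[dict]:
    """Return the most recent entries matching a topic, ready to inject into a new session."""
    if not LEDGER.exists():
        return []
    lines = LEDGER.read_text(encoding="utf-8").splitlines()
    entries = [json.loads(line) for line in lines if line.strip()]
    matches = [e for e in entries if topic.lower() in e["topic"].lower()]
    return matches[-limit:]

# Example: log a decision from a Claude session, then recall it from a local LLM session.
remember("claude", "infra", "Chose qwen2.5:14b as the default local model.")
print(recall("infra"))
```

Because every client only needs to append and read plain text, the same ledger works for Claude, ChatGPT exports, and local models without any shared database or service.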