EasyLocalLLM is a Unity library that makes it easy to use a local LLM (Ollama). With just a few lines of code, you can build offline AI chatbots and natural in-game NPC conversations.

## Overview

EasyLocalLLM is a lightweight Unity library that makes local LLM integration simple and practical. With as little as a few lines of code, you can add offline AI chat and immersive NPC dialogue powered by Ollama.

Included samples:
- SimpleChat (basic chat UI)
- LateralThinkingQuiz (gameplay-oriented example)
- QuickStartTest (minimal functionality check)

Key features:
- Easy setup with minimal boilerplate
- Real-time streaming responses (partial output support)
- Flexible configuration via OllamaConfig
- Automatic retry with exponential backoff for network issues
- Multi-session conversation management
- Per-session system prompts for roles and character behavior
- Tools / Function Calling to trigger game-side features from the LLM
- Encrypted chat history with persistence support

Requirements:
- Unity 2021.3 or later (recommended)
- Windows 10/11
- An Ollama server (an installation/setup guide is included in this asset package)
- The mistral model installed in Ollama (covered in the included setup guide)
- GPU recommended (CPU is supported but slower)

Limitations:
- Unity-only (depends on UnityWebRequest)
- Windows-focused support
- Task-based APIs are coroutine-backed internally

GitHub Copilot was used during coding and documentation. It was also used to help create the store asset images (those images are not part of the asset itself).
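This description does not show EasyLocalLLM's C# API, but the features above (per-session system prompts, conversation history, streaming) map onto Ollama's standard `POST /api/chat` endpoint, which such a wrapper talks to over HTTP on `localhost:11434`. The sketch below, in Python for brevity, assembles that request body; the function name `build_chat_request` is illustrative, not part of the library.

```python
import json

def build_chat_request(model, system_prompt, history, user_message, stream=True):
    """Assemble an Ollama /api/chat request body.

    The session's system prompt goes first (this is how a per-session
    role or character is enforced), followed by the stored conversation
    history and the new user turn. With stream=True, Ollama replies
    with newline-delimited JSON chunks for real-time partial output.
    """
    messages = [{"role": "system", "content": system_prompt}]
    messages += history
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages, "stream": stream}

body = build_chat_request(
    model="mistral",
    system_prompt="You are a gruff blacksmith NPC. Stay in character.",
    history=[],
    user_message="Can you repair my sword?",
)
print(json.dumps(body, indent=2))
```

Each chat session keeps its own `history` list and `system_prompt`, which is all "multi-session conversation management" requires at the protocol level: the server is stateless, so the client resends the relevant history with every request.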

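The "automatic retry with exponential backoff" listed above is a standard technique for flaky local-server connections (e.g. when the Ollama process is still starting). The library's internal implementation is not shown here; this is a minimal Python sketch of the general pattern, with a doubling delay between attempts.

```python
import time

def retry_with_backoff(request_fn, max_attempts=4, base_delay=1.0):
    """Call request_fn, retrying failures with exponentially growing delays.

    Waits base_delay * 2**attempt seconds between tries (1s, 2s, 4s, ...).
    The final failure is re-raised so the caller can surface the error.
    """
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Demo: a fake request that fails twice, then succeeds on the third call.
calls = {"n": 0}
def flaky_request():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("server not reachable")
    return "ok"

result = retry_with_backoff(flaky_request, base_delay=0.01)  # → "ok" after 3 calls
```

Capping `max_attempts` matters in a game context: an unreachable server should fail fast enough to fall back to canned dialogue rather than stalling the frame loop.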


