Offline large language model (LLM) chat for mobile games.
The LLM Inference plugin provides simplified access to offline large language model chat capabilities for use in mobile games.
It includes samples and guides for building your own purpose-driven AI chatbot for in-game characters and general chat, along with guides on using the extensive and growing list of open-source LLM models and LoRAs. The plugin is simple to use and suitable for beginners to large language models.
The plugin implements a GameInstance subsystem that exposes configuration options for an on-device LLM inference runtime.
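As a rough illustration of how a GameInstance subsystem like this is typically reached from game code, the sketch below shows the standard Unreal access pattern. The subsystem class name (`ULLMInferenceSubsystem`) and its methods (`LoadModel`, `SendPrompt`) are placeholders, not the plugin's actual API; consult the plugin documentation for the real names and signatures.

```cpp
// Hypothetical usage sketch — type and method names are assumptions for
// illustration only, not the plugin's confirmed API.
#include "Engine/GameInstance.h"

void AMyNPC::StartChat()
{
    // GameInstance subsystems are retrieved through the owning UGameInstance.
    if (UGameInstance* GameInstance = GetGameInstance())
    {
        // "ULLMInferenceSubsystem" is a placeholder class name.
        if (ULLMInferenceSubsystem* LLM =
                GameInstance->GetSubsystem<ULLMInferenceSubsystem>())
        {
            // Configure the on-device model, then send a prompt (hypothetical calls).
            LLM->LoadModel(TEXT("Models/chat-model.gguf"));
            LLM->SendPrompt(TEXT("Greet the player in character."));
        }
    }
}
```

The same subsystem would also be reachable from Blueprints via the standard Get Game Instance Subsystem node, which is how the plugin's included Blueprints would typically consume it.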
Features:
Code Modules:
Number of Blueprints: 7
Number of C++ Classes: 5
Network Replicated: NO
Supported Development Platforms: Windows
Supported Target Build Platforms: Android
Documentation: Docs
Example Project:
Important/Additional Notes: