
Unray Plugin for RL

Unray - Code Plugins - Nov 10, 2023
3.83 out of 5 stars (6 ratings)

Training Tool for Multiagent Scenarios with Reinforcement Learning in Unreal Engine

  • Supported Platforms
  • Supported Engine Versions
    5.3 - 5.4
  • Download Type
    Engine Plugin
    This product contains a code plugin, complete with pre-built binaries and all of its source code, that integrates with Unreal Engine. It can be installed to an engine version of your choice and enabled on a per-project basis.

Discover how Unray can power your game and simulation development with reinforcement learning in Unreal Engine:

1. Interactive Game Development: Use Unray to create complex game environments with multiple agents that learn and adapt as they play.

2. Realistic Environment Simulations: Create realistic simulations to train agents in environments that mimic real-world situations.

3. Research in Artificial Intelligence: Employ Unray as a research platform to experiment with different reinforcement learning algorithms in multi-agent environments.


- Uses powerful RLlib technology for effective training.

- Parallelizes training across workers using Ray.

- Supports a variety of algorithms built into the RLlib library, including PPO, QMIX, and DQN.

- Facilitates the creation of multi-agent environments.
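
Multi-agent environments in RLlib follow a dict-keyed convention: observations, rewards, and done flags are all keyed by agent ID, with a special "__all__" key signaling episode termination. A minimal stdlib-only sketch of that convention follows; the agent names and dynamics are invented for illustration and are not taken from the Unray plugin itself:

```python
# Hypothetical sketch of the dict-keyed multi-agent convention used by
# RLlib's MultiAgentEnv. Agent IDs and dynamics are invented here, not
# part of the actual Unray API.
class TwoAgentTagEnv:
    """Two agents on a 1-D line; the chaser is rewarded for closing the gap."""

    AGENTS = ("chaser", "runner")

    def __init__(self, size=10, max_steps=50):
        self.size = size
        self.max_steps = max_steps

    def reset(self):
        self.steps = 0
        self.pos = {"chaser": 0, "runner": self.size - 1}
        return self._obs()

    def _obs(self):
        # Each agent observes both positions.
        return {a: (self.pos["chaser"], self.pos["runner"]) for a in self.AGENTS}

    def step(self, actions):
        # actions: {agent_id: -1 | 0 | +1}, one entry per agent.
        self.steps += 1
        for agent, move in actions.items():
            self.pos[agent] = min(self.size - 1, max(0, self.pos[agent] + move))
        gap = abs(self.pos["chaser"] - self.pos["runner"])
        rewards = {"chaser": -gap, "runner": gap}  # zero-sum pursuit reward
        done = gap == 0 or self.steps >= self.max_steps
        dones = {a: done for a in self.AGENTS}
        dones["__all__"] = done  # RLlib's episode-termination convention
        return self._obs(), rewards, dones, {}
```

An environment shaped like this is what a multi-agent training loop consumes: each step, the trainer supplies one action per agent and receives per-agent observations and rewards back.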

Demo Video:

Discord Server:

Technical Details


  • Train single- and multi-agent environments with reinforcement learning
  • Parallel training
  • Create and configure agents for RL training
  • Create environments for RL training
  • Run inference on trained models
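
The listing does not show the inference workflow, but querying a trained RLlib policy typically means restoring it from a checkpoint and asking for one action per observation. The stub below stands in for a real checkpointed policy (the method name mirrors RLlib's `compute_single_action`, but the rule inside is a toy assumption) to illustrate the rollout loop:

```python
# Hypothetical inference loop: StubPolicy stands in for a trained RLlib
# policy, which would normally be restored from a checkpoint instead.
class StubPolicy:
    """Maps an observation to an action; a real policy would run a network."""

    def compute_single_action(self, obs):
        # Toy rule for illustration: always step toward the origin.
        return -1 if obs > 0 else 1

def run_episode(policy, start_obs, steps=5):
    """Roll the policy forward, recording (observation, action) pairs."""
    obs, trajectory = start_obs, []
    for _ in range(steps):
        action = policy.compute_single_action(obs)
        obs += action  # stand-in for the environment's real transition
        trajectory.append((obs, action))
    return trajectory

trajectory = run_episode(StubPolicy(), start_obs=3)
```

In a real setup, the environment transition would come from the running Unreal simulation rather than the in-place arithmetic used here.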

Code Modules:

  •  Name: Unray. Type: Runtime.

Number of Blueprints: 9

Number of C++ Classes: 1

Network Replicated: No

Supported Development Platforms: Windows


Important/Additional Notes: The Unray plugin is complemented by a Python API counterpart that uses RLlib, so it is necessary to install that package and run training from a Python environment.