In experiments demonstrating multi-agent planning and learning, it is often difficult for observers to gain an intuitive understanding of the behaviors and decisions underlying the agents' actions. We have developed a new technique for real-time visualization of planning and learning algorithms: filtered motion capture data is combined with a ground-projection system that casts primary and meta-information about the agents onto the experimental space in real time.
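The core pipeline of such a system can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the authors' implementation: it assumes a fixed ground-plane homography `H` (which in practice would come from a projector-camera calibration step) and uses a simple exponential filter to stand in for the motion-capture smoothing.

```python
import numpy as np

# Hypothetical calibration: a 3x3 homography mapping ground-plane (x, y)
# positions in the motion-capture frame (metres) to projector pixel
# coordinates. A real system would estimate this via calibration.
H = np.array([[100.0,    0.0, 640.0],
              [  0.0, -100.0, 360.0],
              [  0.0,    0.0,   1.0]])

ALPHA = 0.3  # smoothing coefficient (assumed value) for mocap jitter


def filter_position(prev, raw, alpha=ALPHA):
    """Exponentially smooth a raw motion-capture position estimate."""
    return alpha * raw + (1.0 - alpha) * prev


def project_to_ground(pos_xy, homography=H):
    """Map a ground-plane position (metres) to projector pixels."""
    p = homography @ np.array([pos_xy[0], pos_xy[1], 1.0])
    return p[:2] / p[2]  # dehomogenize


# Example: smooth a noisy reading, then compute where the projector
# should draw this agent's marker on the floor.
prev = np.array([1.0, 2.0])     # last filtered position (m)
raw = np.array([1.1, 2.05])     # new raw mocap reading (m)
smoothed = filter_position(prev, raw)
pixel = project_to_ground(smoothed)
```

In a full system this projection step would run once per agent per frame, with the resulting pixel coordinates used to render position markers and overlaid meta-information (e.g., health state or planned trajectories) through the ground projector.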
Our work, Measurable Augmented Reality for Prototyping Cyber-Physical Systems (MAR-CPS), allows spectators to observe the hardware while simultaneously gaining an intuitive understanding of the decisions made by the underlying algorithms. The system is tested in several scenarios, such as multi-agent intruder monitoring and forest-fire fighting using quadcopters. In each case, the visualization improves understanding of vehicle behavior through meta-data such as vehicle position, health state, and the viability of future actions.
Additional features include interactivity between virtual and physical platforms operating in the same domain, as well as human-robot interaction via motion-capture-based input devices.