Simulating pHRI Scenarios

HRI experiments are conducted to observe the dynamics between humans and robots. Because physical robots pose risks to human participants, scenarios and experimental procedures take significant time to develop before they can be executed.

By utilizing virtual environments and haptic devices, scenarios that were previously infeasible or impractical can be assembled rapidly and used in experiments. We use the Unity3D game engine to build the virtual environment and the Novint Falcon haptic device as the input and output device for human participants. The following video (Figure 1) demonstrates the views of the two participants in the game.

Figure 1: Human-Human Cooperation

Two participants work together to move a wooden block from the center to the green goal at the bottom right. They cannot communicate with each other except through the forces they feel from their haptic devices and the shared camera view. One participant can be replaced with an autonomous agent so that the remaining participant's behavior can be observed while they cooperate without knowing they are not working with another human.

Figure 2: Virtual Environment

The scene can be modified rapidly to test different scenarios, and different scenes can be saved and loaded to minimize the time between experiments. Objects, layouts, lighting, and textures can all be tailored to the scenario, and different numbers of participants and agents can be included to observe their behavior.
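As a minimal sketch of swapping scenarios between trials, a saved scene can be loaded through Unity's SceneManager; the class and scene names here are placeholders, not part of the project's actual code:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class ExperimentLoader : MonoBehaviour
{
    // Called by the experimenter (e.g., from a UI button) between trials.
    // The scene must be added to the project's build settings beforehand.
    public void LoadScenario(string sceneName)
    {
        SceneManager.LoadScene(sceneName); // replaces the current layout
    }
}
```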

Figure 3: High-Level System Architecture

High-Level System Architecture

There are three distinct parts of the system:

  1. Simulation program
  2. Participant interface
  3. Autonomous agent

Simulation Program

The simulation is run using the Unity game engine. Actuation data from participants and autonomous agents is received via the subscriber and applied to the object of interest within the simulated environment. Reaction forces from the object are read and sent back to the participants and autonomous agents as feedback.
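A sketch of the simulation side is below, assuming NetMQ as the C# ZeroMQ binding. The topic names, endpoints, message format, and the virtual-spring coupling constant are illustrative assumptions, not the project's actual values:

```csharp
using NetMQ;
using NetMQ.Sockets;
using UnityEngine;

public class ManipulatedObject : MonoBehaviour
{
    SubscriberSocket sub;  // actuation data in (end-effector positions)
    PublisherSocket pub;   // reaction forces out
    Rigidbody body;
    Vector3 proxy;         // latest end-effector position received
    Vector3 impulseSum;    // contact impulses accumulated this physics step

    const float K = 200f;  // virtual spring stiffness, N/m (placeholder)

    void Start()
    {
        body = GetComponent<Rigidbody>();
        sub = new SubscriberSocket();
        sub.Connect("tcp://localhost:5556"); // assumed participant endpoint
        sub.Subscribe("falcon");
        pub = new PublisherSocket();
        pub.Bind("tcp://*:5557");            // assumed feedback endpoint
    }

    void OnCollisionStay(Collision c) => impulseSum += c.impulse;

    void FixedUpdate()
    {
        // Drain pending messages without blocking the physics step.
        while (sub.TryReceiveFrameString(out _) &&
               sub.TryReceiveFrameString(out string payload))
        {
            string[] p = payload.Split(' ');
            proxy = new Vector3(float.Parse(p[0]), float.Parse(p[1]),
                                float.Parse(p[2]));
        }

        // Pull the object toward the end-effector proxy with a virtual spring.
        body.AddForce(K * (proxy - body.position));

        // Publish the contact forces as feedback, then reset the accumulator.
        Vector3 reaction = impulseSum / Time.fixedDeltaTime;
        pub.SendMoreFrame("reaction")
           .SendFrame($"{reaction.x} {reaction.y} {reaction.z}");
        impulseSum = Vector3.zero;
    }
}
```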

Participant Interface

Participants interact with a monitor and a haptic feedback device while data passes to and from the simulation program via the publisher and subscriber. The publisher publishes the position of the haptic device's end effector, and the subscriber receives data from the simulation program to set the forces on the haptic device and to update what the participant sees in the scene.
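A sketch of the participant-side loop follows, again assuming NetMQ. `FalconDevice` is a hypothetical stand-in for a wrapper around libnifalcon, and the endpoints and topics are assumptions matching the simulation sketch above:

```csharp
using System;
using System.Diagnostics;
using NetMQ;
using NetMQ.Sockets;

class ParticipantInterface
{
    static void Main()
    {
        var device = new FalconDevice();      // hypothetical libnifalcon wrapper
        using var pub = new PublisherSocket();
        pub.Bind("tcp://*:5556");             // positions out (assumed endpoint)
        using var sub = new SubscriberSocket();
        sub.Connect("tcp://localhost:5557");  // forces in (assumed endpoint)
        sub.Subscribe("reaction");

        var clock = Stopwatch.StartNew();
        double nextPublish = 0;
        while (true)
        {
            // Publish at most every 1/30 s; the device loop can run faster.
            if (clock.Elapsed.TotalSeconds >= nextPublish)
            {
                var (x, y, z) = device.GetPosition();
                pub.SendMoreFrame("falcon").SendFrame($"{x} {y} {z}");
                nextPublish = clock.Elapsed.TotalSeconds + 1.0 / 30.0;
            }
            // Apply the most recent reaction force, if any has arrived.
            while (sub.TryReceiveFrameString(out _) &&
                   sub.TryReceiveFrameString(out string payload))
            {
                var p = payload.Split(' ');
                device.SetForce(double.Parse(p[0]), double.Parse(p[1]),
                                double.Parse(p[2]));
            }
        }
    }
}
```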

Autonomous Agent

Autonomous agents publish and subscribe to the same kinds of data as participants. They are not yet implemented, but are expected to be highly customizable while remaining drop-in replacements for participant interfaces, as sketched below.
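Since agents are not yet implemented, the following is purely illustrative: an agent is a drop-in replacement as long as it publishes positions on the same topic a participant interface would. The endpoint, topic, and trivial goal-seeking policy are all assumptions:

```csharp
using NetMQ;
using NetMQ.Sockets;

class SimpleAgent
{
    static void Main()
    {
        using var pub = new PublisherSocket();
        pub.Bind("tcp://*:5558");                 // assumed endpoint
        double x = 0, y = 0, gx = 0.5, gy = -0.5; // current and goal positions

        while (true)
        {
            // Trivial policy: step the commanded position toward the goal.
            x += 0.01 * (gx - x);
            y += 0.01 * (gy - y);
            pub.SendMoreFrame("falcon").SendFrame($"{x} {y} 0");
            System.Threading.Thread.Sleep(33);    // ~30 Hz, matching the cap
        }
    }
}
```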

Communication

This project utilizes ZeroMQ to pass data between the different parts of the system. ZeroMQ was chosen because it has implementations in both C# and C++ and because it can be used at any level of networking, from a single local machine to computers outside the local network. The publishing rate of every publisher is capped at 30 Hz so that subscribers can keep up with incoming messages; otherwise unprocessed messages would queue up and add latency.
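The 30 Hz cap can be enforced with a simple timer gate on each publisher. A minimal sketch, assuming NetMQ; the class is a hypothetical helper, not part of the project's code:

```csharp
using System.Diagnostics;
using NetMQ;
using NetMQ.Sockets;

class ThrottledPublisher
{
    readonly PublisherSocket socket;
    readonly double period;
    readonly Stopwatch clock = Stopwatch.StartNew();
    double nextSend;

    public ThrottledPublisher(string endpoint, double rateHz = 30.0)
    {
        socket = new PublisherSocket();
        socket.Bind(endpoint);
        period = 1.0 / rateHz;
    }

    // Returns false (and drops the message) when called faster than the cap,
    // so subscribers never see a backlog of stale messages.
    public bool TryPublish(string topic, string payload)
    {
        if (clock.Elapsed.TotalSeconds < nextSend) return false;
        nextSend = clock.Elapsed.TotalSeconds + period;
        socket.SendMoreFrame(topic).SendFrame(payload);
        return true;
    }
}
```

Dropping over-rate messages rather than queuing them is the design choice here: for a real-time haptic loop, the latest state matters more than a complete history.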

Haptic Device Control

The Novint Falcon haptic device is used as the input device and force feedback device for the system. The open-source libnifalcon driver operates the device, and a modified fork of it can be found below in Source Code and Assets. The position and velocity of the end effector are used to calculate the force acting in the simulation environment using Equation 1 from [1]. The negative of that force is applied to the end effector so that the participant feels the reaction forces from the environment. The haptic device runs at up to 1000 Hz, depending on the capabilities of the host computer.
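As a sketch of this force pair, the snippet below assumes Equation 1 of [1] has the common spring-damper (impedance) form F = K(x_p - x_o) + B(v_p - v_o); the exact form and gains should be taken from the paper, and the constants here are placeholders:

```csharp
using System.Numerics;

static class HapticCoupling
{
    const float K = 200f; // spring stiffness, N/m  (placeholder value)
    const float B = 5f;   // damping,          N*s/m (placeholder value)

    // Force the haptic proxy applies to the simulated object.
    public static Vector3 ObjectForce(Vector3 proxyPos, Vector3 proxyVel,
                                      Vector3 objectPos, Vector3 objectVel)
        => K * (proxyPos - objectPos) + B * (proxyVel - objectVel);

    // Equal and opposite force rendered on the Falcon end effector,
    // so the participant feels the environment pushing back.
    public static Vector3 DeviceForce(Vector3 proxyPos, Vector3 proxyVel,
                                      Vector3 objectPos, Vector3 objectVel)
        => -ObjectForce(proxyPos, proxyVel, objectPos, objectVel);
}
```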

Source Code and Assets

A script to download, compile, and install everything can be found here.

A fork of the GitHub repository for the Novint Falcon haptic device can be found at https://github.com/yzgarrard/libnifalcon.

All the Unity3D files can be downloaded here. Loading the Unity project requires a Unity account and a personal license.

Future Work

This project is in its early prototyping stages. To make it generally usable, the project will need several features:

  • Portable software so that anyone can download and use it with minimal coding.
  • Networking capabilities so that multiple users can interact in the same environment while being on separate networks.
  • Ability to integrate autonomous agents.
  • Integration with a virtual reality (VR) headset for more immersive games.

Additional Literature

[1] C. E. Madan, A. Kucukyilmaz, T. M. Sezgin, and C. Basdogan, “Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation,” IEEE Transactions on Haptics, vol. 8, no. 1, pp. 54–66, Jan. 2015.

Contact Information

Yizhuang Garrard and Wenlong Zhang

Email: ygarrar1@asu.edu; wenlong.zhang@asu.edu