Virtual-Reality Driven Robotic Control

This Virtual-Reality game maps your movement onto a real robot predator in pursuit of a mouse. The purpose of this project is to elicit planning behaviors from mice as they evade a hostile predator in the form of a robot.

GitHub Repository

Robotic Control

Players teleoperate a skid-steer robot inside a hexagonal arena using their own movement, whilst pursuing a live mouse.

Networking

Messages are exchanged in a Client-Server relationship, allowing full-duplex communication between the Virtual-Reality Game and the Habitat.

Virtual-Reality Game

The game is a virtual-reality recreation of the hexagonal arena, filled with hexagonal obstacles that the robot must navigate in order to capture its prey.

Robotic Control

How Does the Robot Move?

Several systems developed by the MacIver and Dombeck laboratories at Northwestern come together to get the robot moving! They are part of the labs' cellworld platform for studying how the brain makes plans when evading a serious threat (Lai et al. 2024, Cell Reports, in press; cellworld website). Learn more below about the skid-steer robot, how it decides where to move using path planning, and how destinations are converted into wheel commands for the motors that drive it! The images and videos below are from the forthcoming publication: Lai et al. (2024). A robot-rodent interaction arena with adjustable spatial complexity for ethologically relevant behavioral studies. Cell Reports, in press.

The Skid Steer Robot

The robot, depicted above, is a skid-steer robot. It uses a drive train to spin two sets of wheels, one on each side, and turns by driving the two sides at different speeds.
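
As a rough illustration of that idea (not the robot's actual control code), here is a minimal sketch of differential-drive kinematics. The wheel radius and track width below are placeholder values, not the real robot's parameters.

```python
# Minimal skid-steer kinematics sketch (not the actual robot firmware).
# wheel_radius and track_width are hypothetical placeholder values.

def body_to_wheel_speeds(v, omega, wheel_radius=0.05, track_width=0.30):
    """Convert a body velocity (v m/s forward, omega rad/s turn) into
    left/right wheel angular speeds (rad/s) for a skid-steer base."""
    v_left = v - omega * track_width / 2.0   # linear speed of the left side
    v_right = v + omega * track_width / 2.0  # linear speed of the right side
    return v_left / wheel_radius, v_right / wheel_radius

# Example: drive forward at 0.3 m/s while turning left at 0.5 rad/s
left, right = body_to_wheel_speeds(0.3, 0.5)
```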

Path Planning

The path planner receives the given destination and navigates towards the closest visible cell, passing wheel commands to the motors to move the robot towards the destination.
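
A simplified sketch of the "closest visible cell" idea is below. The real planner in the cellworld software is more sophisticated; here occlusions are approximated as circles and visibility is a straight-line check, with all radii and coordinates being illustrative.

```python
# Simplified "closest visible cell" sketch: a cell counts as visible if the
# straight line from the robot to the cell does not pass through any occlusion
# (occlusions are approximated as circles here).
import math

def segment_hits_circle(p, q, center, radius):
    """True if segment p->q passes within `radius` of `center`."""
    px, py = p; qx, qy = q; cx, cy = center
    dx, dy = qx - px, qy - py
    length_sq = dx * dx + dy * dy or 1e-12
    # Project the circle center onto the segment and clamp to [0, 1]
    t = max(0.0, min(1.0, ((cx - px) * dx + (cy - py) * dy) / length_sq))
    nearest = (px + t * dx, py + t * dy)
    return math.dist(nearest, center) <= radius

def closest_visible_cell(robot, destination, cells, occlusions, occ_radius=0.05):
    """Pick the cell nearest the destination that the robot can 'see'."""
    visible = [c for c in cells
               if not any(segment_hits_circle(robot, c, o, occ_radius) for o in occlusions)]
    return min(visible, key=lambda c: math.dist(c, destination), default=None)
```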

Destinations to Wheel Commands

Destinations are given to the robot by the Virtual-Reality game. When the server receives a destination, it passes it on to the path planner, and the robot then moves according to the wheel commands the path planner produces.
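
To make that chain concrete, here is a hypothetical sketch of turning a destination into wheel speeds with a simple proportional heading controller. The gains, wheel radius, and track width are illustrative, not the values used in the lab's planner.

```python
# Hypothetical destination-to-wheel-command sketch: a proportional controller
# that turns toward the destination and drives faster when farther away.
import math

def destination_to_wheel_speeds(pose, destination,
                                k_lin=0.5, k_ang=2.0,
                                wheel_radius=0.05, track_width=0.30):
    """pose = (x, y, heading in rad); destination = (x, y).
    Returns (left, right) wheel angular speeds in rad/s."""
    x, y, heading = pose
    dx, dy = destination[0] - x, destination[1] - y
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - heading
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))  # wrap to [-pi, pi]
    v = k_lin * distance           # forward speed grows with distance
    omega = k_ang * heading_error  # turn rate grows with heading error
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left / wheel_radius, v_right / wheel_radius
```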

Networking

Client-Server Communication

The networking magic that drives the feedback loop between the Virtual-Reality Game and the Habitat.

The Client

The Virtual-Reality Game acts as a client to the server running in the "Habitat" - the hexagonal arena in which the robot resides. As a client, the game can send requests and messages to the server, and receive the messages it broadcasts to its subscribers.
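
To illustrate the idea (not the actual messaging library or protocol used in the project), a minimal client might look like the sketch below; the host, port, and message fields are placeholders.

```python
# Minimal client sketch using plain TCP + newline-delimited JSON.
# The address, port, and message format are hypothetical placeholders.
import json
import socket

HABITAT_HOST = "192.168.1.100"  # hypothetical address of the Habitat server
HABITAT_PORT = 4500             # hypothetical port

def send_destination(sock, x, y):
    """Ask the server to route the robot to canonical coordinates (x, y)."""
    message = {"command": "set_destination", "body": {"x": x, "y": y}}
    sock.sendall((json.dumps(message) + "\n").encode())

def receive_update(stream):
    """Read one broadcast message (e.g. a tracking update) from the server."""
    line = stream.readline()
    return json.loads(line) if line else None

with socket.create_connection((HABITAT_HOST, HABITAT_PORT)) as sock:
    stream = sock.makefile("r")
    send_destination(sock, 0.5, 0.5)
    update = receive_update(stream)
```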

The Server

Running in the Habitat is the server, which has several functions. First, it tracks the locations of the robot and the mouse. Second, it sends the robot the destinations fed in by the virtual-reality player. The server also broadcasts messages to its subscribed clients; these broadcasts drive the avatars of the mouse and the real robot in the game.
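
A minimal sketch of the broadcast side is below, purely for illustration; the real server's implementation and message fields differ.

```python
# Illustrative broadcaster: keep a list of subscribed client sockets and push
# tracking updates to all of them. Message fields are placeholders.
import json

class TrackingBroadcaster:
    def __init__(self):
        self.subscribers = []  # sockets of clients that asked for updates

    def subscribe(self, client_socket):
        self.subscribers.append(client_socket)

    def broadcast(self, agent, x, y, heading):
        """Send one agent's tracked pose (canonical coordinates) to every client."""
        update = {"update": "tracking", "agent": agent,
                  "x": x, "y": y, "heading": heading}
        payload = (json.dumps(update) + "\n").encode()
        for client in list(self.subscribers):
            try:
                client.sendall(payload)
            except OSError:
                self.subscribers.remove(client)  # drop clients that disconnected
```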

Feedback Communication

Because of this client-server relationship, as the player moves and interacts with the virtual-reality environment, so too does the robot! As the player traverses the world, tracking information from the Habitat is sent back into the game, updating the location of the mouse and real robot avatars in real-time. 

Virtual-Reality Game

Converting the Habitat to a Virtual Environment

The Virtual Habitat

The Virtual Habitat is a 20-meter-long recreation of the Habitat. As with the original Habitat, you can add in Occlusions - hexagonal obstacles that occlude the view of the mouse. The Virtual Habitat is designed for a player to run in an open field. Because humans can outrun the robot, we scaled the world up so that the player's faster movement maps onto the robot's slower pace while still feeling like being inside the actual Habitat.

Translating Coordinates

Because the Virtual Habitat is a larger, scaled-up environment, there needs to be a way to convert Virtual-Reality coordinates to Canonical coordinates - the coordinate system of the Habitat. Inside the game is a function that converts Virtual-Reality coordinates to Canonical coordinates and vice versa.
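
A minimal sketch of what that conversion could look like is below, assuming Canonical coordinates normalized to the unit Habitat and the 20-meter Virtual Habitat described above; the actual scale factors and offsets used in the game may differ.

```python
# Coordinate translation sketch. Assumes Canonical coordinates in [0, 1] and a
# 20-meter Virtual Habitat; the game's real conversion may use other constants.

VR_SIZE = 20.0  # Virtual Habitat width in meters

def vr_to_canonical(x_vr, y_vr, vr_size=VR_SIZE):
    """Map a Virtual-Reality position (meters) to Canonical Habitat coordinates."""
    return x_vr / vr_size, y_vr / vr_size

def canonical_to_vr(x_c, y_c, vr_size=VR_SIZE):
    """Map Canonical Habitat coordinates back to Virtual-Reality meters."""
    return x_c * vr_size, y_c * vr_size
```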

Robot and Mouse Avatars

In the game are the robot and mouse avatars. These representations move according to tracking information obtained from the server. As these avatars update, the player can make an informed decision as to how to pursue the mouse!
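
For illustration, the flow is roughly: receive a tracking update, convert the Canonical pose to Virtual-Reality coordinates, and move the matching avatar. The sketch below uses hypothetical names; the real game does this inside the engine.

```python
# Illustrative avatar update driven by a tracking broadcast; names and message
# fields are hypothetical.
from dataclasses import dataclass

@dataclass
class Avatar:
    position: tuple = (0.0, 0.0)  # Virtual-Reality coordinates in meters

def on_tracking_update(update, avatars, canonical_to_vr):
    """update: a dict like {"agent": "mouse", "x": 0.4, "y": 0.7} in Canonical coords."""
    avatar = avatars.get(update["agent"])  # e.g. "mouse" or "robot"
    if avatar is not None:
        avatar.position = canonical_to_vr(update["x"], update["y"])

# Example usage with the 20-meter scaling from the conversion sketch above
avatars = {"mouse": Avatar(), "robot": Avatar()}
on_tracking_update({"agent": "mouse", "x": 0.4, "y": 0.7}, avatars,
                   lambda x, y: (x * 20.0, y * 20.0))
```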

Step-Up Module

Going Fully Mobile

Because this project requires a freely moving agent in a large field, we needed to create something that would allow the player to be completely untethered. HP currently does not provide a way to power the Virtual-Reality Headset from the Virtual-Reality Backpack, so I created one! This Step-Up Module takes the 12V output from the Virtual-Reality Backpack and steps it up to 19.5V. The stepped-up output connects to the headset, providing it with an untethered supply of power.

Virtual-Reality Backpack

Depicted above is the HP VR Backpack G2. It's essentially a laptop with a dedicated graphics card, and is where the Virtual-Reality Game runs. At the very top is the 12V DC output, which is first stepped up in voltage and then used to power the Virtual-Reality Headset. A set of batteries powers the backpack when it is fastened to the harness; these typically last around 30 minutes.

Virtual-Reality Headset

The Virtual-Reality Headset is the HP Reverb G2. The headset connects directly to the backpack, but does not come with a way to draw power from it! Because of this, I created the Step-Up Module, which converts the backpack's 12V DC output into 19.5V to power the headset. This allows the user to be completely untethered from any outlet.

Step-Up Module

The Step-Up Module sits on the backpack, connecting the backpack's 12V DC output to the headset's 19.5V DC input. The module was designed around a DC-DC converter, which sits inside a simple project box that also hosts two barrel connectors. These allow each cable to be replaced individually and adjusted on the fly.
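
As a back-of-the-envelope sanity check for a module like this, you can estimate how much current the 12V rail must supply for a given headset load; the load and efficiency numbers below are hypothetical placeholders, not measurements of the actual hardware.

```python
# Rough power-budget check for a 12V-to-19.5V step-up module.
# output_current and efficiency are hypothetical placeholder values.

def required_input_current(output_voltage=19.5, output_current=1.5,
                           input_voltage=12.0, efficiency=0.90):
    """Input current (A) the converter draws from the 12V rail for a given load."""
    output_power = output_voltage * output_current  # watts delivered to the headset
    input_power = output_power / efficiency         # account for converter losses
    return input_power / input_voltage

# e.g. a hypothetical 1.5 A load at 19.5 V draws roughly 2.7 A from the 12 V rail
print(round(required_input_current(), 2))
```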

Future Work

There are two things I'd like to add into the game to make it easier on the players. The first is a mini-map which displays the location of the real robot to the player at all times. When the mouse is in view, it will be displayed on the mini-map as well!

The second is adding an incentive for the player to stay close to the real robot. Currently, players tunnel-vision on the mouse as soon as it is spotted, often losing track of where the real robot is. To fix this, I came up with the idea of making the world dark and adding a light source only to the real robot. This gives the player an incentive to move more slowly through the arena while waiting for the real robot to catch up to their location.

Acknowledgements

This project would not be possible without the help of everyone in the MacIver and Dombeck laboratories. A big thank you to Angel Germán Espinosa Coarasa, Gabrielle Wink, Alex Lai, Chris Angeloni, Felix Alexander Maldonado, Selim Chalyshkan, Joe Reed,  Joshua Chi, and of course Dr. MacIver and Dr. Dombeck! I would also like to thank everyone in the Master's in Robotics program for an incredible year full of invaluable experiences.