Roboy Open Topics

The Roboy Student Team works on developing the tendon-driven humanoid Robody Roboy 3.0 and its next generations. A Robody is a remote robotic body, controlled at all times by a human operator using VR gear.

All code, results, and scientific findings produced by the Roboy Student Team are open-sourced and, where possible, published. Topics of interest cover biologically inspired actuation, simulation, sensing, control, telepresence, and XR, and draw on a wide range of skills in mechanics, electronics, and software engineering.

High-level reasoning approaches, typical of autonomous robots, are out of scope.

Here we collect ideas for potential Roboy theses, lab courses, and similar student projects.

Each topic below is listed with its field, priority, partner (if any), description, and references.


Control

Topic: MuJoCo simulation of the tendon-driven musculo-skeletal robot Roboy
Priority: HIGH / ONGOING
Description: Research and apply state-of-the-art methods to narrow the hardware-simulation gap, with the goal of open-sourcing the simulation model for the community and for research.
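
As a starting point, here is a minimal sketch of a tendon-driven model in MuJoCo's Python bindings. The model (one hinge joint pulled by one spatial tendon) and all names and numbers are illustrative, not Roboy's actual model; note that in MuJoCo a tendon can only pull, so contraction corresponds to a negative actuator force.

    # Minimal MuJoCo sketch: one hinge joint driven by a single spatial tendon.
    # All names and numbers are illustrative, not taken from Roboy's model.
    import mujoco

    XML = """
    <mujoco>
      <worldbody>
        <site name="anchor" pos="0 0 1"/>
        <body name="arm" pos="0 0 0.5">
          <joint name="elbow" type="hinge" axis="0 1 0"/>
          <geom type="capsule" fromto="0 0 0 0.3 0 0" size="0.02"/>
          <site name="insertion" pos="0.3 0 0"/>
        </body>
      </worldbody>
      <tendon>
        <spatial name="muscle">
          <site site="anchor"/>
          <site site="insertion"/>
        </spatial>
      </tendon>
      <actuator>
        <motor name="muscle_motor" tendon="muscle"
               ctrllimited="true" ctrlrange="-50 0"/>
      </actuator>
    </mujoco>
    """

    model = mujoco.MjModel.from_xml_string(XML)
    data = mujoco.MjData(model)

    data.ctrl[0] = -20.0  # tendons can only pull: negative force = contraction
    for _ in range(1000):
        mujoco.mj_step(model, data)

    print("tendon length:", data.ten_length[0])
    print("elbow angle  :", data.qpos[0])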

 

Topic: Force control of a tendon-driven musculo-skeletal robotic arm
Priority: HIGH / ONGOING
Partners: MAE, Chinese University of Hong Kong; Valero Lab, University of Southern California
Description: Implement kinematic control for Roboy 3.0 (simulation & hardware). Both model-based and model-free methods will be taken into account. A torque-to-tension sketch follows below.
Possible collab:
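
A classical building block for force control of tendon-driven arms is distributing desired joint torques over nonnegative tendon tensions. The sketch below illustrates the idea with nonnegative least squares; the moment-arm matrix and torques are made-up numbers for a hypothetical 2-DoF, 4-tendon arm, not Roboy's parameters.

    # Hedged sketch: distribute desired joint torques over tendon tensions.
    # For a tendon-driven arm, tau = -R^T f, where R[i, j] = d(length_i)/d(q_j)
    # is the moment-arm matrix and f >= 0 because tendons can only pull.
    import numpy as np
    from scipy.optimize import nnls

    R = np.array([[ 0.02, -0.01],
                  [-0.02,  0.01],
                  [ 0.00,  0.015],
                  [ 0.00, -0.015]])   # moment arms [m], assumed
    tau_des = np.array([0.5, -0.2])   # desired joint torques [Nm], assumed

    # Solve min ||(-R^T) f - tau_des||^2 subject to f >= 0.
    f, residual = nnls(-R.T, tau_des)
    print("tendon tensions [N]:", f, " residual:", residual)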

Topic: Wrist control
Priority: HIGH
Partner: TUM FAR
Description: Implement kinematic control and operator mapping for the new, stronger wrist of Roboy 3.0.

Sensing

Topic: Soft tactile sensors for Roboy
Priority: MEDIUM
Partner: University of Leeds
Description: In the scope of this project, the team will develop tactile sensors (aka skin) for Roboy based on advancements from the University of Leeds, who will co-supervise the project. Topics cover:
  • sensor modification
  • sensor manufacturing
  • data read-out and interpretation (see the read-out sketch below)
Partner's paper: Adjustable Compliance Soft Sensor via an Elastically Inflatable Fluidic Dome, https://eprints.whiterose.ac.uk/168371/1/SITS_Optimization.pdf
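
For the data read-out part, a minimal sketch of acquiring raw values over a serial link and applying a linear calibration might look as follows. The port name, the one-ASCII-integer-per-line protocol, and the calibration constants are all assumptions, not the Leeds sensor's actual interface.

    # Hedged sketch: read raw ADC values from a tactile sensor over serial and
    # map them to an estimated normal force. Protocol and calibration assumed.
    import serial

    GAIN = 0.012    # N per ADC count, hypothetical calibration
    OFFSET = 512    # ADC reading at zero load, hypothetical

    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1.0) as port:
        for _ in range(100):
            line = port.readline().strip()
            if not line:
                continue
            raw = int(line)                          # one integer per line
            force = max(0.0, (raw - OFFSET) * GAIN)  # clamp below zero load
            print(f"raw={raw:4d}  force={force:.2f} N")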

Topic: "Manual" Robody control
Priority: HIGH
Description:
  • Explore control of the Robody using haptic gloves
  • Map the hand's state to the Robody's hand (see the sketch below)
  • Explore ways to switch between operating the Robody and navigating the menu
  • Make use of the gloves' haptic feedback
Reference: https://www.bhaptics.com/shop/tactglove
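
A minimal sketch of the hand-state mapping step, assuming the glove reports normalized per-finger flexion (0 = open, 1 = closed) and the Robody hand takes joint-angle targets; the joint limits are made up and would really come from the hand's calibration.

    # Hedged sketch: map normalized glove finger flexion onto robot hand
    # joint targets. Joint limits below are assumptions, not Roboy's values.
    import numpy as np

    JOINT_MIN = np.deg2rad([0, 0, 0, 0, 0])       # open hand, assumed
    JOINT_MAX = np.deg2rad([90, 95, 95, 95, 60])  # closed hand, assumed

    def glove_to_hand(flexion: np.ndarray) -> np.ndarray:
        """Linearly interpolate glove flexion [0, 1] to joint angles [rad]."""
        flexion = np.clip(flexion, 0.0, 1.0)
        return JOINT_MIN + flexion * (JOINT_MAX - JOINT_MIN)

    print(glove_to_hand(np.array([0.1, 0.8, 0.8, 0.7, 0.2])))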

Topic: Facial expression transfer and mapping between the operator and the avatar during telepresence
Priority: HIGH (completed)
Description: This project covers:
  • recognition of the operator's emotions while wearing a VR headset (currently Meta Quest Pro)
  • design of new animations for the existing Roboy face (Unity project)
  • facial expression operator-avatar mapping (sketched below)
Possible HW addition: https://www.youtube.com/watch?v=4_EIqmIWn_Y
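
One possible shape of the operator-avatar mapping step: combining raw face-tracking blendshape weights into a few avatar expression intensities. The blendshape names below are ARKit-style and purely illustrative; the actual Quest Pro outputs and Roboy Face animation parameters would differ.

    # Hedged sketch: remap face-tracking blendshape weights onto avatar
    # expression intensities. All names and weights are illustrative only.
    AVATAR_MAP = {
        "smile":    [("mouthSmileLeft", 0.5), ("mouthSmileRight", 0.5)],
        "surprise": [("browInnerUp", 0.6), ("jawOpen", 0.4)],
        "blink":    [("eyeBlinkLeft", 0.5), ("eyeBlinkRight", 0.5)],
    }

    def map_expressions(blendshapes: dict) -> dict:
        """Combine raw blendshape weights into avatar expression intensities."""
        out = {}
        for expr, parts in AVATAR_MAP.items():
            out[expr] = sum(w * blendshapes.get(name, 0.0) for name, w in parts)
        return out

    print(map_expressions({"mouthSmileLeft": 0.9, "mouthSmileRight": 0.7}))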

Topic: Extra-human sensing for Roboy Avatar: infrared, ultrasound, magnetic field
Priority: HIGH / ONGOING
Description: Telepresence opens a wide range of possibilities to extend human sensing capabilities. For example, adding an infrared camera to the avatar and fusing its image with the RGB camera stream already gives the operator extra-human sensing, which would be invaluable in situations such as search and rescue operations.
Detailed description:
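
A minimal sketch of the RGB/infrared fusion idea using OpenCV: colorize a (here synthetic) thermal frame and alpha-blend it over the RGB frame. In a real pipeline the IR image would first have to be registered to the RGB view, and the fixed blend weight is an assumption.

    # Hedged sketch: overlay a colorized IR frame on an RGB frame.
    # Both frames are synthetic stand-ins for the avatar camera streams.
    import cv2
    import numpy as np

    rgb = np.full((480, 640, 3), 80, np.uint8)            # stand-in RGB frame
    ir = np.random.randint(0, 255, (120, 160), np.uint8)  # stand-in IR frame

    ir_up = cv2.resize(ir, (rgb.shape[1], rgb.shape[0]))  # match resolutions
    ir_color = cv2.applyColorMap(ir_up, cv2.COLORMAP_JET) # thermal-style LUT
    fused = cv2.addWeighted(rgb, 0.6, ir_color, 0.4, 0.0) # fixed alpha blend
    cv2.imwrite("fused.png", fused)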

 

Topic: Contactless vitals monitoring
Priority: HIGH (completed)
Description:
  • Use the cameras to find out whether a person is standing, sitting, or lying on the floor
  • Use other sensors (e.g. a 60 GHz radar) to establish the person's vital data, e.g. body temperature, breathing rate, heart rate, blood pressure, … (a breathing-rate sketch follows below)
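
A minimal sketch of the breathing-rate case, assuming the radar yields a chest-displacement signal: pick the dominant frequency in a plausible breathing band. The signal here is synthetic; a real pipeline would add filtering, motion rejection, and tracking.

    # Hedged sketch: estimate breathing rate from a chest-displacement signal,
    # as a 60 GHz radar might provide. The signal is synthetic (0.25 Hz + noise).
    import numpy as np

    fs = 20.0                        # sample rate [Hz], assumed
    t = np.arange(0, 60, 1 / fs)     # one minute of data
    signal = np.sin(2 * np.pi * 0.25 * t) + 0.2 * np.random.randn(t.size)

    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    band = (freqs > 0.1) & (freqs < 0.5)   # plausible breathing band [Hz]
    rate_hz = freqs[band][np.argmax(spectrum[band])]
    print(f"breathing rate: {rate_hz * 60:.1f} breaths/min")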

 

VR/AR

Topic: Immersive vision stream for telepresence using a ToF / 360° camera
Priority: HIGH / ONGOING
Partner: Infineon
Description: Currently the vision data in telepresence comes from a stereo camera. The field of view it provides, however, is rather limited. We therefore seek to augment the RGB camera stream with a long-range (9 m) time-of-flight sensor and/or a 360° camera to improve the operator's situational awareness.
Detailed description:
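
A minimal sketch of the first step toward fusing ToF data with the RGB stream: back-projecting a depth image into a 3D point cloud with a pinhole model. The intrinsics and the synthetic depth frame are placeholders, not the actual sensor's parameters.

    # Hedged sketch: back-project a ToF depth image into a 3D point cloud.
    # Intrinsics and the synthetic depth frame are placeholder values.
    import numpy as np

    fx = fy = 200.0          # focal lengths [px], assumed
    cx, cy = 80.0, 60.0      # principal point [px], assumed
    depth = np.full((120, 160), 4.5)   # stand-in depth frame [m], 9 m range

    v, u = np.indices(depth.shape)     # pixel row/column grids
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    print(points.shape, points[0])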

Topic: Telepresence through Nvidia's CloudXR & Omniverse
Priority: LOW
Partner: Nvidia (expert support by the partner)
Description: This project aims to explore the potential of Nvidia's latest tools for telepresence, as a data transmission and cloud processing layer. These tools allow computationally expensive tasks, e.g. video stream super-resolution, detailed rendering, and simulation, to be moved to the cloud.
References: Develop on NVIDIA Omniverse Platform; NVIDIA CloudXR

Topic: AR Roboy
Priority: MEDIUM
Description: Develop an AR overlay of the Roboy Avatar.
Potential HW: https://www.nreal.ai/light

Topic: Neural style transfer during telepresence
Priority: LOW
Description: This artistic project will let the operator see reality in a new way, through Roboy eyes inspired by Vincent van Gogh or perhaps The Simpsons. Here we aim to apply real-time, time-coherent neural style transfer to the avatar camera stream.
Reference: https://blog.unity.com/technology/real-time-style-transfer-in-unity-using-deep-neural-networks
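
A per-frame sketch using OpenCV's dnn module with a pretrained fast-neural-style Torch model (e.g. the publicly available "mosaic" network; the file path is an assumption). This is per frame only; making the result time-coherent across frames is the actual research question.

    # Hedged sketch: stylize a single frame with a pretrained Torch model.
    # Mean values follow the fast-neural-style preprocessing convention.
    import cv2
    import numpy as np

    net = cv2.dnn.readNetFromTorch("mosaic.t7")   # pretrained style network
    frame = cv2.imread("camera_frame.png")        # stand-in for a stream frame

    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1.0, (w, h),
                                 (103.939, 116.779, 123.680), swapRB=False)
    net.setInput(blob)
    out = net.forward()[0]                        # shape (3, h, w)
    out += np.array([103.939, 116.779, 123.680]).reshape(3, 1, 1)
    stylized = np.clip(out.transpose(1, 2, 0), 0, 255).astype(np.uint8)
    cv2.imwrite("stylized.png", stylized)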

Mechatronics

Topic: Standing Roboy Avatar
Priority: completed
Description: Currently Roboy 3.0 sits on an electric wheelchair. The goal of this project is to make Roboy stand by rigidly mounting him on a self-balancing platform (e.g. a Segway).

Topic: Next-generation muscle unit
Priority: MEDIUM
Description: Design a new series elastic actuator (4th generation).

Topic: Series elastic actuators: artificial muscle profiling
Priority: completed
Description: Roboy's actuators are series elastic actuators developed fully in-house by our team. In the scope of this project, extensive testing and profiling will be done (see the frequency-response sketch below), including:
  • frequency response (Bode plots)
  • maximum performance range
  • control loop update frequency on:
    • the motor driver board (FPGA)
    • the central motor control unit (FPGA)
    • the ROS layer (C++)
  • control accuracy
  • durability tests
For example, see https://docs.hebi.us/hardware.html#x-series-actuators
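
The frequency-response measurement could follow the usual chirp-excitation recipe sketched below. Here a first-order lag stands in for logged command/response data from the real actuator; all constants are illustrative.

    # Hedged sketch: estimate an actuator's frequency response from a chirp
    # excitation. A simulated first-order lag stands in for real logged data.
    import numpy as np
    from scipy.signal import chirp, lfilter

    fs = 1000.0                                # sample rate [Hz], assumed
    t = np.arange(0, 10, 1 / fs)
    u = chirp(t, f0=0.1, f1=50, t1=t[-1])      # excitation sweep 0.1-50 Hz

    tau = 0.05                                 # stand-in time constant [s]
    a = np.exp(-1 / (fs * tau))
    y = lfilter([1 - a], [1, -a], u)           # stand-in measured response

    U, Y = np.fft.rfft(u), np.fft.rfft(y)
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    mag_db = 20 * np.log10(np.abs(Y / U) + 1e-12)   # Bode magnitude
    print("gain at 10 Hz: %.1f dB" % mag_db[np.argmin(np.abs(freqs - 10))])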

Control

Topic: Cooperative control of the mobile platform for Roboy Avatar
Priority: LOW
Description: While navigating the Roboy Avatar through telepresence, it is often challenging to drive through narrow spaces or around sharp corners. The operator's field of view is limited compared to human eyes, and distances to objects are distorted. In addition, network latencies or outages might cause unsafe situations while driving the avatar. A controller that assists the operator with obstacle avoidance and ensures safety is therefore desired. This project encompasses integrating new sensors into the mobile platform and implementing assisted-driving control algorithms.
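
A minimal sketch of the assisted-driving idea: scale the operator's commanded velocity by the distance to the nearest obstacle so the platform slows and stops before a collision. The thresholds and the range-scan format are assumptions; a real controller would also handle driving direction and network latency.

    # Hedged sketch: limit forward velocity based on a range scan.
    import numpy as np

    STOP_DIST = 0.4    # [m] full stop below this range, assumed
    SLOW_DIST = 1.5    # [m] start slowing below this range, assumed

    def assist(cmd_vel: float, ranges: np.ndarray) -> float:
        """Scale the commanded velocity by proximity to the nearest obstacle."""
        nearest = float(np.min(ranges))
        if nearest <= STOP_DIST:
            return 0.0
        scale = min(1.0, (nearest - STOP_DIST) / (SLOW_DIST - STOP_DIST))
        return cmd_vel * scale

    print(assist(0.8, np.array([2.1, 0.9, 1.7])))  # slows to ~0.36 m/s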



Topic: Assisted grasping
Priority: LOW
Description: Due to occlusion of vision and the robot hand's limited dexterity during telepresence, it is challenging for the operator to manipulate objects effectively. This project aims to address this by adding new sensing modalities to the hand and writing cooperative controllers that assist the operator during grasping and in-hand manipulation.
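
A minimal sketch of a cooperative grasping primitive, assuming the hand reports a tactile force and accepts incremental close/open commands (both hypothetical interfaces): close until contact, then hold a target force.

    # Hedged sketch: proportional grip controller on a tactile force reading.
    # Gains, targets, and the command interface are all assumptions.
    def grip_step(measured_force: float, target_force: float = 2.0,
                  kp: float = 0.05) -> float:
        """Return an incremental finger-close command from the force error."""
        error = target_force - measured_force
        return kp * error          # positive = close further, negative = open

    pos = 0.0
    for f in [0.0, 0.0, 0.5, 1.5, 2.2, 2.0]:   # fake tactile readings [N]
        pos += grip_step(f)
        print(f"force={f:.1f} N  finger command={pos:.3f}")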

 

Topic: ROS2 motor control
Priority: MEDIUM
Description: Migrate the current motor control stack from ROS1 to ROS2 for its real-time capabilities.
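
A minimal rclpy sketch of the kind of building block the migration would produce: a ROS2 node publishing a motor setpoint on a timer. The topic name and message type are placeholders for the actual Roboy motor-command interface.

    # Hedged sketch: minimal ROS2 node publishing a motor setpoint at 100 Hz.
    # Topic name and message type are placeholders, not Roboy's interface.
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import Float64

    class MotorCommandNode(Node):
        def __init__(self):
            super().__init__("motor_command_node")
            self.pub = self.create_publisher(Float64,
                                             "/roboy/motor0/setpoint", 10)
            self.timer = self.create_timer(0.01, self.tick)  # 100 Hz loop

        def tick(self):
            msg = Float64()
            msg.data = 0.5  # placeholder setpoint
            self.pub.publish(msg)

    def main():
        rclpy.init()
        rclpy.spin(MotorCommandNode())

    if __name__ == "__main__":
        main()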