
Here we collect ideas for potential Roboy thesis, lab course, and similar projects.

Each entry below lists its field (section heading), priority/status, partner (if any), description, and references.

Control

Mujoco simulation of a tendon-driven musculo-skeletal robot Roboy

Status: HIGH / ONGOING

Research and apply state-of-the-art methods to narrow the hardware-simulation gap, with the goal of open-sourcing the simulation model for the community and for research.
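
As a starting point, here is a minimal sketch of a single tendon-driven joint in MuJoCo, loaded with the official Python bindings. The geometry, site placement, and force range are illustrative assumptions, not the actual Roboy model:

    import mujoco

    # One hinge joint, one spatial tendon routed through two sites,
    # driven by a motor that can only pull (ctrl <= 0 shortens the tendon).
    XML = """
    <mujoco>
      <worldbody>
        <site name="anchor" pos="0.05 0 1.0"/>
        <body name="forearm" pos="0 0 1.0">
          <joint name="elbow" type="hinge" axis="0 1 0"/>
          <geom type="capsule" fromto="0 0 0  0 0 -0.3" size="0.03"/>
          <site name="insertion" pos="0.05 0 -0.1"/>
        </body>
      </worldbody>
      <tendon>
        <spatial name="flexor">
          <site site="anchor"/>
          <site site="insertion"/>
        </spatial>
      </tendon>
      <actuator>
        <motor name="flexor_motor" tendon="flexor" ctrlrange="-100 0"/>
      </actuator>
    </mujoco>
    """

    model = mujoco.MjModel.from_xml_string(XML)
    data = mujoco.MjData(model)
    for _ in range(500):
        data.ctrl[0] = -30.0          # negative tendon force pulls the joint
        mujoco.mj_step(model, data)
    print("elbow angle [rad]:", data.qpos[0])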

Force control of a tendon-driven musculo-skeletal robotic arm

Status: HIGH / ONGOING
Partners: MAE, Chinese University of Hong Kong; Valero Lab, University of Southern California

Implement kinematic control for Roboy 3.0 (simulation & hardware). Both model-based and model-free methods will be considered; a control sketch follows below.

Possible collab: see the attached file GuidedResearch_USCxRoboy.pdf.
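
For the model-free end of the spectrum, a minimal sketch of joint-space PD control mapped onto an antagonistic tendon pair could look like the following. The moment arm, gains, and pretension are assumed values for illustration:

    def tendon_pd(q, qd, q_ref, kp=8.0, kd=0.5, pretension=5.0, r=0.02):
        """One joint, two antagonistic tendons. Tendons can only pull,
        so the signed PD torque is split between flexor and extensor,
        and a constant pretension keeps both cables taut.
        r is an assumed constant moment arm in meters."""
        tau = kp * (q_ref - q) - kd * qd          # desired joint torque
        f_flexor = pretension + max(tau, 0.0) / r
        f_extensor = pretension + max(-tau, 0.0) / r
        return f_flexor, f_extensor               # tendon force setpoints [N]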

Wrist control

Status: HIGH
Partner: TUM FAR

Implement kinematic control and operator mapping for the new, stronger wrist of Roboy 3.0.
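
One possible starting point for the operator mapping is to convert the tracked wrist orientation into clamped joint targets; the axis convention and joint limits below are placeholders:

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    WRIST_LIMITS = np.radians([60.0, 40.0, 90.0])  # assumed pitch/yaw/roll limits

    def map_wrist(operator_quat):
        """operator_quat: (x, y, z, w) wrist orientation from the tracking
        system. Returns clamped joint-angle targets for the robot wrist."""
        angles = R.from_quat(operator_quat).as_euler("xyz")
        return np.clip(angles, -WRIST_LIMITS, WRIST_LIMITS)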

Sensing

Soft tactile sensors for Roboy

Status: MEDIUM
Partner: University of Leeds

In the scope of this project, the team will develop tactile sensors (aka skin) for Roboy based on advancements from the University of Leeds, who will co-supervise the project. Topics cover:

  • sensor modification

  • sensor manufacturing

  • data read-out and interpretation (see the sketch after this list)

Partner’s papers:

https://www.mdpi.com/1424-8220/21/6/1970

https://eprints.whiterose.ac.uk/168371/1/SITS_Optimization.pdf
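
For the read-out part, a minimal sketch, assuming the sensor streams one comma-separated line of raw ADC values per frame over USB serial (grid size, port, and contact threshold are all assumptions):

    import numpy as np
    import serial  # pyserial

    ROWS, COLS = 4, 4          # assumed taxel grid size
    CONTACT_THRESHOLD = 512    # assumed raw ADC level indicating contact

    def read_frame(port):
        """Parse one frame of raw taxel values into a ROWS x COLS array."""
        line = port.readline().decode(errors="ignore").strip()
        vals = np.array([int(v) for v in line.split(",")], dtype=float)
        return vals.reshape(ROWS, COLS)

    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
        frame = read_frame(port)
        print("contact map:\n", frame > CONTACT_THRESHOLD)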

“Manual” Robody Control

Status: HIGH

  • Explore control of the Robody using haptic gloves

  • Map the operator’s hand state to the Robody’s hand (see the sketch after this list)

  • Explore ways to switch between operating the Robody and navigating the menu

  • Make use of the glove’s haptic feedback

https://www.bhaptics.com/shop/tactglove
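
A minimal sketch of the state mapping and haptic feedback loop, assuming the glove SDK provides per-finger flexion in [0, 1] and the robot hand reports per-finger contact forces (all names and scalings are assumptions):

    def glove_to_hand(flexion, contact_force, servo_range=(0, 1000)):
        """flexion: {finger: value in [0, 1]} from the glove SDK.
        contact_force: {finger: force in N} from the robot hand.
        Returns servo targets for the hand and normalized vibration
        intensities for the glove's haptic feedback."""
        lo, hi = servo_range
        targets = {f: int(lo + v * (hi - lo)) for f, v in flexion.items()}
        haptics = {f: min(contact_force.get(f, 0.0) / 5.0, 1.0) for f in flexion}
        return targets, haptics

    # Example tick:
    targets, haptics = glove_to_hand({"index": 0.8, "thumb": 0.3}, {"index": 2.5})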

Facial expression transfer and mapping between the operator and the avatar during telepresence

Status: HIGH / COMPLETED

This project covers:

  • recognition of the operator’s emotions while wearing a VR headset (currently Meta Quest Pro)

  • design of new animations for the existing Roboy face (Unity project)

  • operator-to-avatar facial expression mapping (see the sketch after this list)

Possible HW addition: https://www.youtube.com/watch?v=4_EIqmIWn_Y
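
One possible mapping strategy is to reduce the headset's face-tracking blendshape weights to a small set of Roboy face animations; the blendshape names and threshold here are assumptions, not actual Quest Pro SDK identifiers:

    # Rules reducing blendshape weights to discrete avatar animations.
    EXPRESSION_RULES = {
        "smile": lambda w: 0.5 * (w.get("mouthSmileLeft", 0) + w.get("mouthSmileRight", 0)),
        "surprise": lambda w: w.get("browRaise", 0),
        "frown": lambda w: w.get("browLower", 0),
    }

    def map_expression(weights, threshold=0.6):
        """Pick the strongest expression above threshold, else 'neutral'."""
        scores = {name: rule(weights) for name, rule in EXPRESSION_RULES.items()}
        best = max(scores, key=scores.get)
        return best if scores[best] >= threshold else "neutral"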

Extra-human sensing for Roboy Avatar: infrared, ultrasound, magnetic field

Status: HIGH / ONGOING

Telepresence opens a wide range of possibilities to extend human sensing capabilities. For example, adding an infrared camera to the avatar and fusing its image with the RGB camera stream already gives the operator extra-human sensing, which would be invaluable in situations like search-and-rescue operations. A fusion sketch follows below.

Detailed description: see the attached file infrared.pdf.
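
For the infrared case, a minimal OpenCV sketch of the image fusion, assuming the two cameras are already extrinsically aligned (otherwise the thermal frame must be warped into the RGB frame first):

    import cv2

    def fuse(rgb, thermal, alpha=0.6):
        """Overlay a thermal frame on the RGB stream as a false-color blend."""
        thermal = cv2.resize(thermal, (rgb.shape[1], rgb.shape[0]))
        thermal8 = cv2.normalize(thermal, None, 0, 255, cv2.NORM_MINMAX)
        colored = cv2.applyColorMap(thermal8.astype("uint8"), cv2.COLORMAP_JET)
        return cv2.addWeighted(rgb, alpha, colored, 1.0 - alpha, 0)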

Contactless vitals monitoring

Status: HIGH / COMPLETED

  • Use the cameras to determine whether a person is standing/sitting or lying on the floor (see the sketch after this list)

  • Use other sensors (e.g. a 60 GHz radar) to establish the person’s vital data, e.g. body temperature, breathing rate, heart rate, blood pressure, …
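
For the first task, a rough posture heuristic over 2D pose keypoints (from any off-the-shelf pose estimator) might look like the following; the aspect-ratio threshold is an assumption:

    def posture(keypoints):
        """keypoints: list of (x, y) pixel coordinates of one detected person.
        A lying person's keypoint bounding box is wider than tall."""
        xs = [x for x, _ in keypoints]
        ys = [y for _, y in keypoints]
        width, height = max(xs) - min(xs), max(ys) - min(ys)
        return "lying" if width > 1.3 * height else "standing_or_sitting"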

VR/AR

Immersive vision stream for telepresence using ToF / 360 deg camera

Status: HIGH / ONGOING
Partner: Infineon

Currently the vision data in telepresence comes from a stereo camera. The field of view it provides, however, is rather limited. Therefore, we seek to augment the RGB camera stream with a long-range (9 m) time-of-flight sensor and/or a 360 degree camera to improve the operator’s situational awareness. A back-projection sketch follows below.

Detailed description: see the attached file ToF.pdf.
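
A minimal sketch of back-projecting the ToF depth image into a point cloud for rendering on the operator's side, assuming pinhole intrinsics fx, fy, cx, cy:

    import numpy as np

    def depth_to_points(depth, fx, fy, cx, cy):
        """depth: H x W array in meters. Returns an (H*W) x 3 point cloud
        in the camera frame."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        return np.stack([x, y, depth], axis=-1).reshape(-1, 3)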

Telepresence through Nvidia’s CloudXR & Omniverse

Status: LOW
Partner: Nvidia (expert support by the partner)

This project aims to explore the potential of Nvidia’s latest tools for telepresence, as a data transmission and cloud processing layer. These tools would allow moving computationally expensive tasks to the cloud, e.g. video stream super-resolution, detailed rendering and simulation, etc.

https://developer.nvidia.com/nvidia-omniverse-platform

https://www.nvidia.com/en-us/design-visualization/solutions/cloud-xr/

AR Roboy

Status: MEDIUM

Develop an AR overlay of the Roboy Avatar.

Potential HW: https://www.nreal.ai/light

Neural style transfer during telepresence

Status: LOW

This artistic project will allow the operator to see reality in a new way, through Roboy’s eyes, inspired by Vincent van Gogh or maybe The Simpsons. Here we aim to apply real-time, time-coherent neural style transfer to the avatar camera stream. A per-frame sketch follows below.

https://blog.unity.com/technology/real-time-style-transfer-in-unity-using-deep-neural-networks
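
A minimal sketch of running a feed-forward stylization network on the camera stream; the TorchScript model file is a placeholder, and any fast style-transfer network with NCHW float input in [0, 1] would fit:

    import cv2
    import torch

    net = torch.jit.load("style_net.pt").eval()  # placeholder model file

    cap = cv2.VideoCapture(0)
    with torch.no_grad():
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # BGR uint8 -> RGB float NCHW in [0, 1]
            x = torch.from_numpy(frame[:, :, ::-1].copy()).permute(2, 0, 1)
            x = x.unsqueeze(0).float() / 255.0
            y = net(x).clamp(0, 1).squeeze(0).permute(1, 2, 0).numpy()
            cv2.imshow("stylized", (y[:, :, ::-1] * 255).astype("uint8"))
            if cv2.waitKey(1) == 27:  # Esc to quit
                break
    cap.release()

Note that this only covers the per-frame path; temporal coherence between frames would need additional machinery, e.g. optical-flow-based losses at training time.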

Mechatronics

Standing Roboy Avatar

Status: COMPLETED

Currently Roboy 3.0 sits on an electric wheelchair. The goal of this project is to make Roboy stand by rigidly mounting him on a self-balancing platform (e.g. a Segway).

Next generation muscle unit

Status: MEDIUM

Design a new series elastic actuator (4th generation).

Series elastic actuators: artificial muscle profiling

Status: COMPLETED

Roboy’s actuators are series elastic actuators developed fully in-house by our team. In the scope of this project, extensive testing and profiling will be done, including:

  • frequency response (Bode plots; see the sketch after this list)

  • maximum performance range

  • control loop update frequency on:

    • motor driver board (FPGA)

    • central motor control unit (FPGA)

    • ROS layer (C++)

  • control accuracy

  • durability tests

For example, see https://docs.hebi.us/hardware.html#x-series-actuators
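
For the frequency-response item, a sketch of estimating a Bode plot from chirp excitation data with scipy; the sampling rate is an assumption, and a simulated second-order plant stands in for the real measured response so the script runs stand-alone:

    import numpy as np
    from scipy import signal

    fs = 1000.0                                    # assumed sampling rate [Hz]
    t = np.arange(0, 10, 1 / fs)
    u = signal.chirp(t, f0=0.1, t1=t[-1], f1=50)   # excitation, e.g. force setpoint
    b, a = signal.butter(2, 20, btype="low", fs=fs)
    y = signal.lfilter(b, a, u)                    # stand-in for measured response

    f, Puy = signal.csd(u, y, fs=fs, nperseg=2048)  # cross-spectrum
    _, Puu = signal.welch(u, fs=fs, nperseg=2048)   # input auto-spectrum
    H = Puy / Puu                                   # H1 estimate of the FRF
    mag_db = 20 * np.log10(np.abs(H))
    phase_deg = np.degrees(np.unwrap(np.angle(H)))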

Control

Cooperative control of the mobile platform for Roboy Avatar

Status: LOW

While navigating the Roboy Avatar through telepresence, it is often challenging to drive through narrow spaces or around sharp corners. The operator’s field of view is limited compared to human eyes, and distances to objects are distorted. In addition, network latencies or outages can cause unsafe situations while driving the avatar. Therefore, a controller that assists the operator with obstacle avoidance and ensures safety is desired. This project encompasses integrating new sensors into the mobile platform and implementing assisted-driving control algorithms; a sketch of the assistance layer follows below.
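
A minimal sketch of such an assistance layer: scale the operator's velocity command by the nearest obstacle distance from a range sensor (the distance thresholds are assumptions):

    import numpy as np

    def assist(cmd_lin, cmd_ang, ranges, stop_dist=0.4, slow_dist=1.2):
        """cmd_lin/cmd_ang: operator velocity command. ranges: forward-facing
        lidar distances in meters. Full stop below stop_dist, linear
        slowdown below slow_dist, turning in place always allowed."""
        d = float(np.min(ranges))
        if d <= stop_dist:
            scale = 0.0
        elif d < slow_dist:
            scale = (d - stop_dist) / (slow_dist - stop_dist)
        else:
            scale = 1.0
        return cmd_lin * scale, cmd_ang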


Assisted grasping

Status: LOW

Due to occluded vision and the limited dexterity of the robot hand during telepresence, it is challenging for the operator to manipulate objects effectively. This project aims to address this by adding new sensing modalities to the hand and writing cooperative controllers that assist the operator during grasping and in-hand manipulation.
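
One simple cooperative behavior, as a sketch: keep closing each finger until its tactile reading reaches a target contact force (names, units, and the force target are assumptions):

    def grasp_step(finger_pos, tactile, step=0.01, force_target=1.5):
        """One control tick: finger_pos maps finger -> normalized closure,
        tactile maps finger -> measured contact force [N]. Fingers stop
        closing once their contact force reaches force_target."""
        return {
            f: min(pos + step, 1.0) if tactile.get(f, 0.0) < force_target else pos
            for f, pos in finger_pos.items()
        }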

ROS2 motor control

Status: MEDIUM

Migrate the current motor control stack from ROS1 to ROS2 for its real-time capabilities.
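
A minimal rclpy sketch of the shape such a node could take; the topic name, message type, and motor count are placeholders, not the actual Roboy interfaces:

    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import Float64MultiArray

    class MotorCommander(Node):
        def __init__(self):
            super().__init__("motor_commander")
            # Placeholder topic; the real stack would define its own messages.
            self.pub = self.create_publisher(
                Float64MultiArray, "/roboy/motor_setpoints", 10)
            self.timer = self.create_timer(0.01, self.tick)  # 100 Hz

        def tick(self):
            msg = Float64MultiArray()
            msg.data = [0.0] * 8  # one setpoint per motor (count assumed)
            self.pub.publish(msg)

    def main():
        rclpy.init()
        rclpy.spin(MotorCommander())

    if __name__ == "__main__":
        main()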