Roboy Open Topics
The Roboy Student Team works on developing the tendon-driven humanoid robody Roboy 3.0 and its next generations. A robody is a remote robotic body, controlled at all times by a human operator using VR gear.
All code, results & scientific findings produced by the Roboy Student Team are open-sourced and, if possible, published. Topics of interest cover biologically-inspired actuation, simulation, sensing, control, telepresence and XR, where a wide range of skills in mechanics, electronics & software engineering can be applied.
High-level reasoning approaches, typical for autonomous robots, are out of scope.
Here we collect ideas for potential Roboy theses, lab courses, and other student projects.

| Field | Priority | Partner | Description | References |
|---|---|---|---|---|
| Control | HIGH / ONGOING | | MuJoCo simulation of a tendon-driven musculo-skeletal robot Roboy. Research and apply state-of-the-art methods to narrow the hardware-simulation gap, with the goal of open-sourcing the simulation model for the community and for research. A minimal MuJoCo tendon example follows the table. | |
| | HIGH / ONGOING | MAE, Chinese University of Hong Kong; University of Southern California | Force control of a tendon-driven musculo-skeletal robotic arm. Implement kinematic control for Roboy 3.0 (simulation and hardware); both model-based and model-free methods will be considered. A tendon force-distribution sketch follows the table. | Possible collab: |
| | HIGH | TUM FAR | Wrist control. Implement kinematic control and operator mapping for the new, stronger wrist of Roboy 3.0. | |
| Sensing | MEDIUM | University of Leeds | Soft tactile sensors for Roboy. In the scope of this project the team will develop tactile sensors (aka skin) for Roboy based on advancements from the University of Leeds, who will be co-supervising the project. Topics cover: | Partner’s papers: “Adjustable Compliance Soft Sensor via an Elastically Inflatable Fluidic Dome”, https://eprints.whiterose.ac.uk/168371/1/SITS_Optimization.pdf |
| | HIGH | | “Manual” Robody Control | |
| | HIGH / COMPLETED | | Facial expression transfer and mapping between the operator and the avatar during telepresence. This project covers: | Possible HW addition: https://www.youtube.com/watch?v=4_EIqmIWn_Y |
| | HIGH / ONGOING | | Extra-human sensing for the Roboy Avatar: infrared, ultrasound, magnetic field. Telepresence opens a wide range of possibilities to extend human sensing capabilities. For example, adding an infrared camera to the avatar and fusing its image with the RGB camera stream already gives the operator extra-human sensing, which would be invaluable in situations like search-and-rescue operations. An IR/RGB fusion sketch follows the table. | Detailed description: |
| | HIGH / COMPLETED | | Contactless vitals monitoring. A contactless heart-rate (rPPG) sketch follows the table. | |
| VR/AR | HIGH / ONGOING | Infineon | Immersive vision stream for telepresence using a ToF / 360° camera | Detailed description: |
| | LOW | Nvidia | Telepresence through Nvidia’s CloudXR & Omniverse. This project explores the potential of Nvidia’s latest tools for telepresence as a data-transmission and cloud-processing layer. These tools would allow moving computationally expensive tasks to the cloud, e.g. video-stream super-resolution, detailed rendering, and simulation. Expert support by the partner. | |
| | MEDIUM | | AR Roboy. Develop an AR overlay of the Roboy Avatar. | Potential HW: https://www.nreal.ai/light |
| | LOW | | Neural style transfer during telepresence | https://blog.unity.com/technology/real-time-style-transfer-in-unity-using-deep-neural-networks |
| Mechatronics | COMPLETED | | Standing Roboy Avatar. Roboy 3.0 currently sits on an electric wheelchair; the goal of this project is to make Roboy stand by rigidly mounting him on a self-balancing platform (e.g. a Segway). | |
| | MEDIUM | | Next-generation muscle unit. Design a new series elastic actuator (4th generation). A sketch of the SEA force-control principle follows the table. | |
| | COMPLETED | | Series elastic actuators: artificial muscle profiling | For example, see https://docs.hebi.us/hardware.html#x-series-actuators |
| Control | LOW | | Cooperative control of the mobile platform for the Roboy Avatar. While navigating the Roboy Avatar through telepresence, it is often challenging to drive through narrow spaces or around sharp corners: the operator’s field of view is limited compared to human eyes, and distances to objects are distorted. In addition, network latencies or outages can cause unsafe situations while driving the avatar. A controller that assists the operator with obstacle avoidance and ensures safety is therefore desired. This project encompasses integrating the new sensors into the mobile platform and implementing assisted-driving control algorithms; a minimal shared-control sketch follows the table. | |
| | LOW | | Assisted grasping. Due to occluded vision and low robot-hand dexterity during telepresence, it is challenging for the operator to manipulate objects effectively. This project addresses the issue by adding new sensing modalities to the hand and writing cooperative controllers that assist the operator during grasping and in-hand manipulation. | |
| | MEDIUM | | ROS2 motor control. Migrate the current motor control stack from ROS1 to ROS2 for its real-time capabilities. A minimal rclpy node sketch follows the table. | |
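
Below are small, illustrative sketches for some of the topics above; all constants, file names, and interfaces in them are placeholders rather than the team’s actual code. First, a minimal MuJoCo model with a single spatial tendon and a tendon actuator, loaded through the official `mujoco` Python bindings; the geometry and forces are invented for illustration.

```python
import mujoco

# Single hinge "elbow" with one spatial tendon routed from an anchor above
# the joint to the tip of the link; all dimensions are invented.
MJCF = """
<mujoco>
  <worldbody>
    <site name="anchor" pos="0 0 1.2"/>
    <body name="arm" pos="0 0 1">
      <joint name="elbow" type="hinge" axis="0 1 0"/>
      <geom type="capsule" fromto="0 0 0  0.3 0 0" size="0.02"/>
      <site name="insertion" pos="0.3 0 0"/>
    </body>
  </worldbody>
  <tendon>
    <spatial name="muscle_tendon">
      <site site="anchor"/>
      <site site="insertion"/>
    </spatial>
  </tendon>
  <actuator>
    <motor name="pull" tendon="muscle_tendon" ctrlrange="-50 0"/>
  </actuator>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(MJCF)
data = mujoco.MjData(model)
data.ctrl[0] = -20.0  # negative force shortens the tendon, i.e. pulls
for _ in range(1000):
    mujoco.mj_step(model, data)
print("elbow angle [rad]:", data.qpos[0])
```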
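For the force-control topic, one common model-based step is tendon force distribution: given desired joint torques τ and a moment-arm matrix L, find nonnegative tendon tensions f with τ ≈ −Lᵀf, since tendons can only pull. A sketch using nonnegative least squares, with made-up moment arms:

```python
import numpy as np
from scipy.optimize import nnls

# Moment arms [m]: rows = tendons, columns = joints (illustrative numbers).
L = np.array([[ 0.02,  0.00],
              [-0.02,  0.01],
              [ 0.00, -0.015],
              [ 0.01,  0.015]])

tau_desired = np.array([0.5, -0.3])  # desired joint torques [Nm]

# Tendons can only pull, so tensions must be nonnegative:
#   minimize || (-L^T) f - tau ||  subject to  f >= 0
f, residual = nnls(-L.T, tau_desired)
print("tendon tensions [N]:", f)
print("achieved torque [Nm]:", -L.T @ f)
```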
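The IR/RGB fusion mentioned in the extra-human sensing topic can be prototyped as a simple alpha blend, assuming the two streams are already spatially registered (real cameras need calibration and warping); the file names and blend weights are placeholders:

```python
import cv2

rgb = cv2.imread("rgb_frame.png")                      # 3-channel visible image
ir = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)  # single-channel thermal image

ir = cv2.resize(ir, (rgb.shape[1], rgb.shape[0]))      # match resolutions
ir_color = cv2.applyColorMap(ir, cv2.COLORMAP_JET)     # map temperature to color

# Alpha-blend the thermal overlay onto the visible stream for the operator.
fused = cv2.addWeighted(rgb, 0.7, ir_color, 0.3, 0.0)
cv2.imwrite("fused_frame.png", fused)
```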
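Contactless vitals monitoring is commonly approached with remote photoplethysmography (rPPG): average the green channel over a face region, band-pass around plausible pulse frequencies, and read the heart rate off the spectral peak. A sketch of that pipeline, assuming face crops are already extracted and with no claim of clinical validity:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames, fps):
    """frames: list of HxWx3 RGB face crops sampled at `fps`."""
    # 1. Spatially average the green channel per frame -> 1-D pulse signal.
    signal = np.array([f[:, :, 1].mean() for f in frames], dtype=float)
    signal -= signal.mean()

    # 2. Band-pass 0.7-4 Hz (42-240 bpm) to suppress motion and lighting drift.
    b, a = butter(3, [0.7, 4.0], btype="bandpass", fs=fps)
    filtered = filtfilt(b, a, signal)

    # 3. Dominant frequency of the filtered signal -> beats per minute.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    return freqs[np.argmax(spectrum)] * 60.0
```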
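The series elastic actuator topics rest on one principle: the force transmitted by the actuator is inferred from the deflection of its series spring, which a force loop can then servo. A hedged sketch with illustrative constants and a made-up motor interface:

```python
SPRING_STIFFNESS = 2000.0  # [N/m], assumed series-spring constant

def estimated_force(motor_side_pos, output_side_pos):
    """Tendon force inferred from spring deflection (Hooke's law)."""
    return SPRING_STIFFNESS * (motor_side_pos - output_side_pos)

def pi_force_step(force_setpoint, force_measured, integral, dt,
                  kp=0.01, ki=0.1):
    """One PI update; returns (motor velocity command, updated integral)."""
    error = force_setpoint - force_measured
    integral += error * dt
    return kp * error + ki * integral, integral
```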
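For cooperative control of the mobile platform, one simple assisted-driving policy scales the operator’s forward velocity command by the distance to the nearest obstacle seen by a range sensor; the thresholds below are illustrative:

```python
import numpy as np

STOP_DIST = 0.4  # [m] hard stop below this distance (illustrative)
SLOW_DIST = 1.5  # [m] start scaling down below this distance

def assisted_velocity(operator_cmd, lidar_ranges):
    """Scale the operator's forward velocity command [m/s] by proximity
    to the nearest obstacle in `lidar_ranges` [m]."""
    if operator_cmd <= 0.0:
        return operator_cmd  # this sketch only assists forward driving
    nearest = float(np.min(lidar_ranges))
    if nearest <= STOP_DIST:
        return 0.0  # too close: refuse to drive forward
    if nearest >= SLOW_DIST:
        return operator_cmd  # free space: pass the command through
    # Linear ramp between the two thresholds.
    return operator_cmd * (nearest - STOP_DIST) / (SLOW_DIST - STOP_DIST)
```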
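For the ROS2 migration, the basic building block is an `rclpy` node with publishers and timers in place of ROS1’s `rospy` loop. A minimal node of this kind, with a placeholder topic and message type standing in for Roboy’s own message definitions:

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float64

class MotorCommandNode(Node):
    def __init__(self):
        super().__init__("motor_command_node")
        self.pub = self.create_publisher(Float64, "motor_setpoint", 10)
        self.timer = self.create_timer(0.01, self.tick)  # 100 Hz control tick

    def tick(self):
        msg = Float64()
        msg.data = 0.0  # placeholder setpoint
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = MotorCommandNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()
```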