Joint Control and Inverse Kinematics

Follow the instructions on the Control page to get and build the required software components. To work with hardware, start roboy plexus from Motor Control and the joint state estimators from DRAFT Robot State Monitoring.

Naming: joints, links, body parts

Config files provide an overview of the names used in the robot. For Roboy 3.0 upper body, see here.
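For quick reference, the names used elsewhere on this page include body parts such as shoulder_left, shoulder_right and head, joints such as shoulder_left_axis1, and links/frames such as hand_left and torso. The link tree of a model can also be printed straight from its URDF, e.g. (a sketch; the path to the robots package depends on your checkout):

check_urdf <path-to-robots-package>/upper_body/model.urdf   # prints the robot name and link tree (check_urdf ships with the urdfdom tools)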

Initialization procedure

Before sending any joint or IK commands to the robot, all tendons must be tight and the joints should be in their 0 position. The initialization routine can be performed for each body part individually (shoulder_left, shoulder_right, head). A body part will not react to any commands until it has been initialized. To initialize shoulder_left, for instance:

  1. Launch kindyn as described in the Quick start

  2. Call the /init_pose service for shoulder_left (or use the INIT buttons in the node-red dashboard, as described in Quick start)

    rosservice call /roboy/brain/init_pose "strings:
    - 'shoulder_left'"
  3. Hold the shoulder in its 0 position (see photo) and wait for the service to finish execution (5 seconds by default) - the tendons should start moving and get shorter.

  4. If the tendons are still loose after the first service call, repeat steps 2 and 3.

    1. In case the tendons are very long, increase the init timeout duration (in seconds) by running:

      rosparam set /timeout 10


  5. If all tendons are tight, the initialization of shoulder_left is done.
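Putting the steps together, a typical initialization of shoulder_left looks roughly like this (the timeout override is only needed for very loose tendons; the /roboy/brain namespace assumes robot_model brain, as in the Quick start below):

# optional: give the tendons more time to wind up
rosparam set /timeout 10
# start the initialization, then hold the shoulder in its 0 position until the call returns
rosservice call /roboy/brain/init_pose "strings:
- 'shoulder_left'"
# if the tendons are still loose, repeat the service call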

If communication with a motor drops, i.e. an issue with that motor is detected, the body part associated with it automatically becomes uninitialized and will not respond to incoming joint targets until it is initialized again.

Visualization in RVIZ

Assuming the roboy3 ROS workspace is built and kindyn is launched, RVIZ can display the current and target robot state, as well as tendon information and interactive markers as IK targets.

For the first time setup:

  1. Open RVIZ

  2. In the top menu: Panels → Add New Panel → CardsflowRviz → OK

  3. In the Displays window:

    1. Fixed frame → world

    2. Add → Marker

  4. Adjust the visibility of meshes, collisions, targets and tendons in the CardsflowRviz panel.

To test, send a joint target through the Node-Red dashboard (or publish one directly, as shown below) and watch the robot state update in RVIZ.
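The same test can be done from the command line by publishing to the joint targets topic (the topic name assumes robot_model brain, matching the Quick start below):

rostopic pub /roboy/brain/control/joint_targets sensor_msgs/JointState "header:
  seq: 0
  stamp: {secs: 0, nsecs: 0}
  frame_id: ''
name: ['shoulder_left_axis1']
position: [-0.2]
velocity: [0]
effort: [0]"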

robot.launch explained

Each robot has its own executable inside the kindyn repository. For example, the upper body of Roboy 3.0 is managed by the kindyn/src/robots/upper_body.cpp source file. Its current functionality includes the following (a quick way to inspect the resulting ROS interfaces is sketched after the list):

  • calculate inverse kinematics

  • calculate tendon length for a given joint position

  • send commands to the muscle units (M3 and M2)

  • visualize robot state (current and target)

  • visualize tendons

  • receive joint states from external publishers
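Once robot.launch is running, these interfaces can be inspected with standard ROS tools (a sketch; the /roboy/brain namespace assumes robot_model brain):

rosnode list                                   # the robot executable should show up here
rostopic list | grep /roboy/brain/control      # e.g. /roboy/brain/control/joint_targets
rosservice list | grep /roboy/brain            # e.g. /roboy/brain/init_pose, /roboy/brain/control/execute_ik
rostopic echo -n 1 /roboy/brain/control/joint_targets   # print one incoming joint target message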

Arguments overview

robot_name (string, default: "msj_platform")

  • name of the robot to launch

  • a robot_name.cpp source and a robot_name executable must exist

  • a robot_name folder must exist in the robots package (including model.urdf, cardsflow.xml and a config folder)

robot_model (string, default: pinky)

  • model of a given robot

  • for example, there are 2 versions of upper_body: brain and pinky

simulated (bool, default: false)

  • if true, hardware commands will not be sent (simulation only); if false, commands are sent to the hardware

urdf_file_name (string, default: model.urdf)

  • name of the URDF file located in robots/upper_body

external_robot_state (bool, default: false)

  • if true, subscribes to /external_joint_state and uses the joint positions published there - see DRAFT Robot State Monitoring

  • if false, uses the internal calculation of joint positions
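For example, the same launch file can start the pinky model in simulation, driven by an external joint state publisher (a sketch based on the arguments above):

# requires an external joint state publisher (see DRAFT Robot State Monitoring)
roslaunch kindyn robot.launch robot_name:=upper_body robot_model:=pinky simulated:=true external_robot_state:=true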

ROS parameters overview

cardsflow_xml (default: "$(arg model_name)/cardsflow.xml")

  • the path to the cardsflow.xml file that contains the tendon routing description (see example here)

pwm (default: 5.0)

  • PWM value used by the init_pose service

shoulder_left/motor_ids (default: defined in config/motor_config.yaml, see example here)

  • a list of motor ids that belong to a given body part

Kp_dl, Ki_dl (default: 0, 0; recommended: Kp_dl=[1.5..2.0], Ki_dl=[0.1..1.0])

  • P and I gains used to improve the tracking of joint position targets (see implementation here)

timeout (default: 5)

  • seconds to wait during the initialization of a body part
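These parameters can be inspected and changed at runtime with rosparam, for example (parameter names as listed above; the values are only illustrations):

rosparam get /timeout                     # init duration in seconds (default 5)
rosparam set /timeout 10                  # give loose tendons more time during init
rosparam get /shoulder_left/motor_ids     # motor ids assigned to the left shoulder
rosparam set /Kp_dl 1.5                   # proportional gain for joint target tracking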

Quick start

Hardware

# start shoulders, elbows, wrists and head control
roslaunch kindyn robot.launch robot_name:=upper_body simulated:=false robot_model:=brain

# visualize the robot
rviz &

# initialize shoulder_left
rosservice call /roboy/brain/init_pose "strings:
- 'shoulder_left'"
# wait for all tendons on the left shoulder and elbow to be tight

# publish a joint position target command
rostopic pub /roboy/brain/control/joint_targets sensor_msgs/JointState "header:
  seq: 0
  stamp: {secs: 0, nsecs: 0}
  frame_id: ''
name: ['shoulder_left_axis1']
position: [-0.2]
velocity: [0]
effort: [0]"

# send an IK target and execute it
rosservice call /roboy/brain/control/execute_ik "endeffector: 'hand_left'
type: 0
target_frame: 'torso'
pose:
  position:
    x: 0.0
    y: 0.0
    z: 0.0
  orientation:
    x: 0.0
    y: 0.0
    z: 0.0
    w: 0.0"

Simulation
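A minimal way to run the same pipeline without hardware (a sketch based on the simulated argument described above):

# start the upper body in simulation - no commands are sent to the hardware
roslaunch kindyn robot.launch robot_name:=upper_body simulated:=true robot_model:=brain
# visualize the robot
rviz &
# publish joint targets or IK targets as in the hardware quick start above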

Head control

There’s a simple way to control Roboy’s head with a joystick, including:

  • 3DOF head movement

  • facial expressions control

  • mouth control

  • eyes control

Buttons and their mappings can be changed in the head_joystick_control2.py script.
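To try it out, the script can be run directly once a joystick is connected (a sketch; the package name is a placeholder, and depending on the setup the standard ROS joy driver may also need to be running):

# optional, if the script expects sensor_msgs/Joy messages on /joy
rosrun joy joy_node
# run the head control script; replace <package> with the package that ships it
rosrun <package> head_joystick_control2.py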