Skeleton Tracking and IK Based Teleoperation of the Baxter Robot

This ROS package enables convenient teleoperation of the Baxter robot using skeleton tracking information from the NITE skeleton tracking library.

[demo GIF]

The following packages are prerequisites, in addition to the Baxter SDK:

The following answers are great resources for setting up skeleton tracking:

To pair your Wiimote with your computer, follow this tutorial.

After enabling Baxter, start the program with: roslaunch baxter_skeletonteleop baxter_skeletonteleop.launch

Optionally, if you have your Wiimote paired, use the following launch command: roslaunch baxter_skeletonteleop baxter_skeletonteleop.launch wiimote:=true

And, if you want to use my (now demoted) old approach which relied solely on Baxter's IK Service, type: roslaunch baxter_skeletonteleop baxter_skeletonteleop.launch old_method:=true

The default launch uses new_teleop.py and new_limb_mover.py, whereas the old method uses teleop.py, limb_mover.py and ik_solver.py. You should soon see an RViz window with the skeleton tracker's output overlaid on the depth video. Stand in front of your OpenNI-compliant depth sensor (such as a Kinect or Asus Xtion), and bring your hands to the start position (both hands near the torso) to begin teleoperating the robot. If multiple people move in and out of the depth sensor's field of view, Baxter will track whichever user is most central.
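For illustration, here is a minimal sketch (not the package's actual code) of how the "both hands near the torso" start gesture could be detected from the tf frames published by an openni_tracker-style skeleton tracker. The frame names, user id, and distance threshold are assumptions to adjust for your tracker:

```python
#!/usr/bin/env python
# Illustrative sketch: detect the "both hands near the torso" start gesture
# from skeleton-tracker tf frames. Assumes openni_tracker-style frame names
# for user 1; the distance threshold is a placeholder.
import rospy
import tf
import numpy as np

START_DIST = 0.25  # hypothetical threshold [m] for "hand near torso"

def hands_near_torso(listener, user_id=1):
    try:
        torso = 'torso_%d' % user_id
        dists = []
        for hand in ('left_hand_%d' % user_id, 'right_hand_%d' % user_id):
            (trans, _) = listener.lookupTransform(torso, hand, rospy.Time(0))
            dists.append(np.linalg.norm(trans))
        return all(d < START_DIST for d in dists)
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        return False

if __name__ == '__main__':
    rospy.init_node('start_gesture_check')
    listener = tf.TransformListener()
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        if hands_near_torso(listener):
            rospy.loginfo('Start gesture detected -- teleoperation can begin')
        rate.sleep()
```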

If you're holding your paired Wiimote, press the B button to close Baxter's gripper (currently only one arm is supported for gripping) and the A button to open it. If Baxter grabs an object, the Wiimote will rumble momentarily for user feedback.
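As a rough illustration of this interaction (not the package's actual code), the sketch below reads the ROS wiimote node's Joy messages to open and close one of Baxter's grippers and publishes to its feedback topic to trigger a brief rumble. The A/B button indices are guesses; check rostopic echo /joy on your setup:

```python
#!/usr/bin/env python
# Illustrative sketch: open/close one Baxter gripper from Wiimote buttons and
# rumble briefly on a successful grasp. Assumes the ROS wiimote node, which
# publishes sensor_msgs/Joy on /joy and accepts feedback on /joy/set_feedback.
import rospy
import baxter_interface
from sensor_msgs.msg import Joy, JoyFeedback, JoyFeedbackArray

BTN_A, BTN_B = 2, 3  # hypothetical button indices -- verify on your Wiimote

class WiimoteGripper(object):
    def __init__(self, side='right'):
        self.gripper = baxter_interface.Gripper(side)
        self.feedback_pub = rospy.Publisher('/joy/set_feedback',
                                            JoyFeedbackArray, queue_size=1)
        rospy.Subscriber('/joy', Joy, self.joy_cb)

    def rumble(self, intensity=1.0):
        fb = JoyFeedback(type=JoyFeedback.TYPE_RUMBLE, id=0, intensity=intensity)
        self.feedback_pub.publish(JoyFeedbackArray(array=[fb]))

    def joy_cb(self, msg):
        if msg.buttons[BTN_B]:           # B: close the gripper
            self.gripper.close()
            if self.gripper.gripping():  # object grasped -> brief rumble
                self.rumble(1.0)
                rospy.sleep(0.3)
                self.rumble(0.0)
        elif msg.buttons[BTN_A]:         # A: open the gripper
            self.gripper.open()

if __name__ == '__main__':
    rospy.init_node('wiimote_gripper_sketch')
    WiimoteGripper('right')
    rospy.spin()
```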

The basic flow of processing is as follows: skeleton tracking --> mapping of the human's hand position to the robot's end-effector position --> custom iterative numerical inverse kinematics --> proportional joint-space velocity control

The robot's joint speeds are intentionally capped at 30% of maximum as a safety precaution. The current scaling factors for mapping the human's hand position to Baxter's end-effector position work well for most people in my experiments, but you may need to tweak them if that isn't the case. This will also be necessary if your depth sensor's coordinate axes differ from mine (I'm using an Asus Xtion Pro Live). The scaling factors can be estimated by noting Baxter's endpoint state at the outward limits of its reach in the positive and negative X, Y, Z directions, for example with rostopic echo /robot/limb/<left or right>/endpoint_state -c in a terminal, and comparing that with the human user's arm measurements. The relevant lines of code to change these factors are:
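The links to those lines are in the repository. Purely as a hypothetical illustration of what such a mapping and the 30% speed cap might look like, consider the sketch below; all scale, offset, gain, and speed values are placeholders to tune, not the package's actual numbers:

```python
# Illustrative sketch: map the tracked hand position to a Baxter end-effector
# target with per-axis scale/offset factors, and cap proportional joint-space
# velocity commands at 30% of a nominal maximum joint speed.
import numpy as np
import baxter_interface

SCALE  = np.array([1.4, 1.4, 1.2])   # per-axis human-hand -> Baxter scaling (tune!)
OFFSET = np.array([0.3, -0.6, 0.2])  # end-effector offset at the start pose (tune!)
KP = 2.0                             # proportional gain in joint space
MAX_JOINT_SPEED = 1.0                # nominal max joint speed [rad/s] (placeholder)
SPEED_CAP = 0.3 * MAX_JOINT_SPEED    # 30% safety cap

def hand_to_endpoint(hand_xyz):
    """Target end-effector position from the tracked hand position."""
    return OFFSET + SCALE * np.asarray(hand_xyz)

def velocity_command(limb, q_desired):
    """Proportional joint velocities toward q_desired, clipped to the cap."""
    q_current = limb.joint_angles()
    cmd = {}
    for joint, q_d in q_desired.items():
        v = KP * (q_d - q_current[joint])
        cmd[joint] = float(np.clip(v, -SPEED_CAP, SPEED_CAP))
    return cmd

# usage:
#   limb = baxter_interface.Limb('right')
#   limb.set_joint_velocities(velocity_command(limb, q_desired))
```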

The inverse kinematics routine implemented here runs multivariate optimization of a squared error objective function over the domain of forward kinematics solutions (joint configurations). Only the first 6 joint angles are considered, since the last joint angle (wrist roll) doesn't really matter here. The joint angle 6-vectors are bounded by their physical limits. The objective function consists of two costs:

  • Square of the distance between the end-effector position arrived at via forward kinematics and the target end-effector position determined from skeleton tracking. This cost is the more important of the two, so it is multiplied by a higher weight.
  • Square of the L2 norm of the difference between the current actual joint configuration and the joint configuration being evaluated by the optimizer (the guess). This is of secondary importance, so it has a lower weight.

The two weights are global variables that can be changed at the beginning of new_limb_mover.py. Additionally, to aid the optimizer, I provide it with the Jacobian of the above objective function.
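A minimal sketch of this kind of weighted IK optimization, assuming SciPy's minimize with a placeholder forward-kinematics routine and illustrative weights and joint bounds (this is not the package's actual implementation, which also supplies an analytic Jacobian of the cost; the sketch falls back on SciPy's numerical gradient):

```python
# Illustrative sketch of the weighted IK optimization described above.
# forward_kinematics(q) is a placeholder for whatever FK routine you use
# (e.g. a KDL chain for Baxter's arm); weights and bounds are illustrative.
import numpy as np
from scipy.optimize import minimize

W_POS = 100.0  # weight on end-effector position error (primary cost)
W_JNT = 1.0    # weight on deviation from the current joint angles (secondary)

# Approximate physical limits for Baxter's first six joints (s0, s1, e0, e1, w0, w1)
JOINT_BOUNDS = [(-1.70, 1.70), (-2.14, 1.04), (-3.05, 3.05),
                (-0.05, 2.61), (-3.05, 3.05), (-1.57, 2.09)]

def objective(q, q_current, target_xyz, fk):
    pos_err = fk(q) - target_xyz   # FK position vs. skeleton-derived target
    jnt_err = q - q_current        # stay close to the current configuration
    return W_POS * np.dot(pos_err, pos_err) + W_JNT * np.dot(jnt_err, jnt_err)

def solve_ik(q_current, target_xyz, fk):
    """Return a bounded 6-vector of joint angles minimizing the weighted cost."""
    res = minimize(objective, x0=q_current,
                   args=(q_current, target_xyz, fk),
                   method='SLSQP', bounds=JOINT_BOUNDS)
    return res.x if res.success else q_current
```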
