Experiment Replication

We've provided hyperparameter files and models for replicating our results in the RoboNet paper. You can load the models with the code provided in RoboNet. However, you will need to install the latest version of the visual foresight codebase in order to perform control experiments on the robot.
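
Assuming the downloaded archives contain standard TensorFlow checkpoints (the RoboNet codebase is TensorFlow-based; the directory path below is a placeholder), one quick way to sanity-check a download before wiring it into an experiment file is to list the checkpoint's variables:

# Sketch: inspect a downloaded model before pointing an experiment file at it.
# Assumes a TF1-style checkpoint; 'path/to/downloaded_model' is a placeholder.
import tensorflow as tf

ckpt = tf.train.latest_checkpoint('path/to/downloaded_model')
reader = tf.train.NewCheckpointReader(ckpt)
for name, shape in sorted(reader.get_variable_to_shape_map().items()):
    print(name, shape)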

Visual Foresight

Sawyer

Before running Sawyer experiments, make sure to start the required ROS nodes for the cameras, gripper (if applicable), and Sawyer impedance control by running the following commands:

# in a new intera terminal
roslaunch foresight_rospkg start_cameras.launch   # run cameras

# in a new intera terminal
roslaunch foresight_rospkg start_gripper.launch   # start gripper node

# in a new intera terminal
roscd foresight_rospkg/launch
rosrun foresight_rospkg send_urdf_fragment.py     # (optional) stop after Sawyer recognizes the gripper
./start_impedance                                 # start Sawyer impedance control
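
Once these nodes are up, it can help to confirm that the camera streams are live before starting a benchmark. A minimal sketch using rospy, assuming the cameras publish sensor_msgs/Image on topics like /camera0/image_raw (the real names depend on start_cameras.launch; check rostopic list):

# Block until one frame arrives on each camera topic, then print its size.
# Topic names are assumptions; substitute the ones your launch file uses.
import rospy
from sensor_msgs.msg import Image

rospy.init_node('camera_check')
for topic in ['/camera0/image_raw', '/camera1/image_raw']:
    msg = rospy.wait_for_message(topic, Image, timeout=10.0)
    print(topic, msg.width, msg.height)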

View Generalization

For the view generalization experiments, we evaluate two models: one trained on Sawyer data with only the front viewpoint, and another trained on all Sawyer data (including multiple viewpoints). The experiment files and models for both setups can be found at:

Only front view: https://drive.google.com/open?id=18D4xqMCd6ypRdXXMimQVmSazhL4modOg

All views: https://drive.google.com/open?id=1Jl8bAxB7OQ5EPu6r4Tq9DQVkPp3n5XUr

To run benchmarks for these experiments, modify the model_path hyperparameter in the experiment file to point to the location of the downloaded model on your local machine. Then execute:

rosrun foresight_rospkg run_robot.py <robot_name> <path to experiment file> --benchmark
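
The exact layout of the experiment file depends on your version of the visual foresight codebase; only the model_path key is prescribed here, and the surrounding structure below is an assumption for illustration:

# Hypothetical excerpt of an experiment file: point model_path at your local
# copy of the downloaded model. Only this key needs to change.
config = {
    'model_path': '/home/user/models/front_view_only/',  # <-- edit this path
    # ... remaining hyperparameters unchanged ...
}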

Franka

The experiment files and models for each of the Franka experiments can be found at:

Franka (N=8000): https://drive.google.com/open?id=1mi5vBRGYoDy9oCODR78k0pnzFN6BPuff

Franka (N=400): https://drive.google.com/open?id=1AFg2wSko1cvIu84WZlHLMjh8xJ9GD71O

RoboNet Pretrain + Franka Finetune (N=400): https://drive.google.com/open?id=1mt8CmFO6Jalfek5T4Scd7EL-tPSoV5RK

Inverse Model: https://drive.google.com/open?id=1OKFFkbsx3qHlmorYJybSRTyBh0eJnnW0

The code can be run by executing:

rosrun foresight_rospkg run_robot.py franka <path to experiment file> --benchmark

NOTE: The Franka code assumes that a separate robot control HTTP server is running, which accepts command requests over a REST API. This is because the Franka requires real-time control, so the robot control server and the RoboNet video prediction/planning code must run on separate machines. The control server depends on the robot configuration and the type of underlying controller, so implementing such a server is a prerequisite for using this code with the Franka.
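
No reference server is included, and its endpoints and payload format are entirely determined by your setup. Purely as an illustrative sketch (Flask, with a hypothetical /cmd endpoint and execute_command helper), the planning machine would POST commands and the real-time machine would translate them into controller calls:

# Illustrative skeleton of an HTTP control server for the real-time machine.
# The endpoint name, payload format, and execute_command() are all hypothetical;
# the actual API contract is whatever your Franka configuration expects.
from flask import Flask, request, jsonify

app = Flask(__name__)

def execute_command(cmd):
    # Placeholder: translate the request into real-time controller calls here.
    return {'status': 'ok'}

@app.route('/cmd', methods=['POST'])
def handle_cmd():
    return jsonify(execute_command(request.get_json()))

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)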

Kuka

Running the Kuka experiments requires a communication setup with the arm that is compatible with the MoveIt ROS interface. In our experiments, we used the following ROS-FRI wrapper, which has some dependencies on Kuka infrastructure for which a custom solution is currently required: https://github.com/tdinesh/iiwa_hw_fri/tree/devel_velocity_control

The experiments also require the cameras to stream over ROS topics, which will be picked up by the visual foresight package. Any package that publishes camera streams on ROS topics will work with the existing experiment file provided below (kuka_scratch.py), as long as the topics are specified in the camera_topics hyperparameter; an illustrative sketch follows below. We used the usb_cam package: http://wiki.ros.org/usb_cam. The launch file (kuka_camera.launch) for this package used in our experiments is available in the drive provided below.
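
For reference, a hypothetical camera_topics entry in kuka_scratch.py might look like the following; the topic names assume usb_cam defaults, and the exact value type the hyperparameter expects depends on your visual foresight version:

# Hypothetical excerpt from kuka_scratch.py: list whatever topics your camera
# package actually publishes (usb_cam-style names are assumed here).
env_params = {
    'camera_topics': ['/usb_cam0/image_raw', '/usb_cam1/image_raw'],
}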

With the specific MoveIt wrapper and camera stream libraries we used, the following commands set up the Kuka experiments.

# in a new terminal - launch the FRI driver
roslaunch iiwa_hw_fri fri_driver.launch
# in a new terminal - launch the MoveIt wrapper
roslaunch iiwa_hw_fri moveit_grl.launch
# in a new terminal - launch the camera driver
roslaunch usb_cam kuka_camera.launch

In general, the following commands will set up the Kuka experiments for arbitrary choices of MoveIt wrapper and camera stream libraries.

# in a new terminal - launch the FRI driver
roslaunch <fri_package> <fri_driver.launch>
# in a new terminal - launch the MoveIt wrapper
roslaunch <fri_package> <move_it_fri_wrapper.launch>
# in a new terminal - launch the camera driver
roslaunch <camera launch package> <camera.launch>

The MoveIt ROS interface brings up the robot's current configuration in RViz. If the system is ready to run control experiments, the RViz model of the robot should display the same configuration as the physical robot.
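
You can also confirm the MoveIt connection programmatically. A minimal sketch using the standard moveit_commander Python bindings (the node name is arbitrary):

# Print the robot state MoveIt currently sees; if it matches the physical arm,
# the communication setup is working.
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node('kuka_state_check')
robot = moveit_commander.RobotCommander()
print(robot.get_current_state())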

A template experiment file and the trained models can be found at:

https://drive.google.com/drive/folders/1j6D_PRw071P9UN9qg9NutKut2cq1cNk8?usp=sharing

The included models correspond to the experiments in the paper as follows:

Kuka (N=400): kuka_400.zip

Kuka (N=1800): kuka_1800.zip

RoboNet Pretrain + Kuka Finetune (N=400): kuka_finetune.zip

To run benchmarks for these experiments, modify the model_path hyperparameter in the experiment file (kuka_scratch.py) to point to the location of the downloaded model on your local machine. Then execute:

rosrun foresight_rospkg run_robot.py kuka <path to experiment file> --benchmark

Baxter

The experiment files and models for each of the Baxter experiments can be found at:

Baxter (N=300): https://drive.google.com/open?id=1ABqUZ3oI43ncdTdu1RDo8laXQJ-scxH4

Pre-train on Sawyer, finetune on Baxter data (N=300): https://drive.google.com/open?id=1QulG0gfXc4TM1DC3HvhJdSNMZWyFoZNJ

Pretrain on RoboNet without Baxter data, finetune on Baxter data (N=300): https://drive.google.com/open?id=1qgc9FWsos2Vnxgw-7xQg_lzUHc0LgDBG

The code can be run by executing:

rosrun foresight_rospkg run_robot.py baxter <path to experiment file> --benchmark

Inverse Model

For the inverse model experiments, we evaluate the same model, trained on both Sawyer and Franka data, on both robots.

Sawyer

The experiment file and model for the Sawyer experiments can be found at https://drive.google.com/drive/folders/1pmzgIX8604Z-v-Go7Nbe5K4hH1_884g4?usp=sharing

To run on the robot, modify the model_path hyperparameter in the Python experiment file provided above to point at the location of the model on your machine. Then run:

rosrun foresight_rospkg run_robot.py <robot_name> <path to experiment file> --benchmark
