Tracking 3 AR tags with a Standard Monocular Webcam, a Logitech C615
This post documents how I got it working on my machine. It should be mostly complete, but I will admit that I have probably left out some things I thought were self-explanatory. If you have problems or suggested changes, please post them in the comments.
[Image: an AR tag]
Prerequisites
1) Installed ROS Kinetic. I am using a virtual machine as detailed HERE.
2) Set up a catkin workspace (I'll assume it's called catkin_ws).
3) Know some basic ROS. If you don't you can likely Google your questions.
Setup
Install package ar_track_alvar
1) Open a terminal in catkin_ws/src
2) In the terminal type:
git clone -b kinetic-devel https://github.com/ros-perception/ar_track_alvar.git
cd ..
catkin_make
Install package video_stream_opencv
1) Open a terminal
2) In the terminal type:
sudo apt-get update
sudo apt-get install ros-kinetic-video-stream-opencv
Create our custom package
1) Open a terminal in catkin_ws/src
2) In the terminal type:
catkin_create_pkg ar_tag_demo std_msgs rospy
Install package image_pipeline
This is likely already installed. You can check with rospack list. If it is not, simply enter into a terminal:
sudo apt-get install ros-kinetic-image-pipeline
Then run another catkin_make.
Write Launch Files
camera.launch
In your custom package "ar_tag_demo", create a new folder called "launch". Inside it, create a file called camera.launch and copy the code below into it. It is a modified version of the camera.launch file from video_stream_opencv. Note that video_stream_provider may have to be changed to 1 if you are using an external camera. If you are using a virtual machine like I am, you will need to enable the webcam under Devices > Webcam in the VirtualBox menu. If you have issues with this, install the VirtualBox extension pack as discussed in my previous post.

<launch>
   <arg name="camera_name" default="camera" />
   <!-- video_stream_provider can be a number as a video device or a url of a video stream -->
   <arg name="video_stream_provider" default="0" />
   <!-- frames per second to query the camera for -->
   <arg name="fps" default="10" />
   <!-- frame_id for the camera -->
   <arg name="frame_id" default="camera_link" />
   <!-- By default, calibrations are stored to file://${ROS_HOME}/camera_info/${NAME}.yaml
        To use your own fill this arg with the corresponding url, e.g.:
        "file:///$(find your_camera_package)/config/your_camera.yaml" -->
   <arg name="camera_info_url" default="" />
   <!-- flip the image horizontally (mirror it) -->
   <arg name="flip_horizontal" default="false" />
   <!-- flip the image vertically -->
   <arg name="flip_vertical" default="false" />
   <!-- force width and height, 0 means no forcing -->
   <arg name="width" default="0"/>
   <arg name="height" default="0"/>
   <!-- if show a image_view window subscribed to the generated stream -->
   <arg name="visualize" default="true"/>

   <!-- images will be published at /camera_name/image with the image transports plugins (e.g.: compressed) installed -->
   <group ns="$(arg camera_name)">
      <node pkg="video_stream_opencv" type="video_stream" name="$(arg camera_name)_stream" output="screen">
         <remap from="camera" to="image_raw" />
         <param name="camera_name" type="string" value="$(arg camera_name)" />
         <param name="video_stream_provider" type="string" value="$(arg video_stream_provider)" />
         <param name="fps" type="int" value="$(arg fps)" />
         <param name="frame_id" type="string" value="$(arg frame_id)" />
         <param name="camera_info_url" type="string" value="$(arg camera_info_url)" />
         <param name="flip_horizontal" type="bool" value="$(arg flip_horizontal)" />
         <param name="flip_vertical" type="bool" value="$(arg flip_vertical)" />
         <param name="width" type="int" value="$(arg width)" />
         <param name="height" type="int" value="$(arg height)" />
      </node>

      <node if="$(arg visualize)" name="$(arg camera_name)_image_view" pkg="image_view" type="image_view">
         <remap from="image" to="image_raw" />
      </node>
   </group>
</launch>
track.launch
Next we create the launch file that does the tracking. Again, this is a modified launch file from the ar_track_alvar package. Create a file called track.launch in your launch folder and copy the following code into it. Note that you will need to set the marker size: the length in centimeters of one side of the black part of an AR tag. For example, a tag whose black square measures 6.9 cm per side gives the default value of 6.9 below.
<launch>
   <arg name="marker_size" default="6.9" />
   <arg name="max_new_marker_error" default="0.08" />
   <arg name="max_track_error" default="0.2" />
   <arg name="cam_image_topic" default="/camera/image_raw" />
   <arg name="cam_info_topic" default="/camera/camera_info" />
   <arg name="output_frame" default="/camera_link" />

   <node name="ar_track_alvar" pkg="ar_track_alvar" type="individualMarkersNoKinect" respawn="false" output="screen">
      <param name="marker_size" type="double" value="$(arg marker_size)" />
      <param name="max_new_marker_error" type="double" value="$(arg max_new_marker_error)" />
      <param name="max_track_error" type="double" value="$(arg max_track_error)" />
      <param name="output_frame" type="string" value="$(arg output_frame)" />

      <remap from="camera_image" to="$(arg cam_image_topic)" />
      <remap from="camera_info" to="$(arg cam_info_topic)" />
   </node>
</launch>
main.launch
Because this is a demo, you probably want to launch everything with a single command. This launch file simply calls the other two.
<launch>
   <include file="$(find ar_tag_demo)/launch/camera.launch" />
   <include file="$(find ar_tag_demo)/launch/track.launch" />
</launch>
Running the files
Camera Calibration
You will want to calibrate the camera using the cameracalibrator.py node from the camera_calibration package (part of image_pipeline). You can follow the instructions found on the wiki for monocular camera calibration: http://wiki.ros.org/camera_calibration/Tutorials/MonocularCalibration
Here are the pertinent parts:
1) Install the calibration dependencies:

rosdep install camera_calibration

2) Start the camera, then run the calibrator in a second terminal:

roslaunch ar_tag_demo camera.launch
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.0245 image:=/camera/image_raw camera:=/camera
Note that the grid size (8x6) and square size (0.0245) above are for the checkerboard as printed on my printer; you may have to adjust them. The square size is in meters.
3) Complete the calibration by moving the checkerboard around the camera's field of view and rotating it in all directions.
4) When you are done, click commit to automatically save the camera calibration data. The camera node will pull that calibration file (stored under ${ROS_HOME}/camera_info/ by default, as noted in camera.launch) the next time you launch it; the sketch after step 5 shows a quick way to verify this.
5) Press Ctrl+C in all terminal windows to stop the camera and calibration nodes.
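If you want to confirm that your calibration is actually being loaded and published, here is a minimal rospy sketch. The file name calibration_check.py is my own invention, not part of any package above; the /camera/camera_info topic is the same one track.launch subscribes to.

#!/usr/bin/env python
# calibration_check.py -- hypothetical helper, not part of the packages above.
# Grabs one CameraInfo message and prints the intrinsic matrix K so you can
# confirm the camera node loaded your calibration (K is all zeros otherwise).
import rospy
from sensor_msgs.msg import CameraInfo

if __name__ == '__main__':
    rospy.init_node('calibration_check')
    info = rospy.wait_for_message('/camera/camera_info', CameraInfo)
    print('K = %s' % (info.K,))

With camera.launch running, execute it with python calibration_check.py in another terminal.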
Run the demo
In a terminal, type the following command:
roslaunch ar_tag_demo main.launch
This should bring up the camera and the tracking node. Feel free to rostopic echo /ar_pose_marker to see the raw data, but RViz is probably more impressive. Launch RViz (type rviz into a terminal) and add TF to the displays list on the left. Show the camera a marker, then set the fixed frame to "camera_link" (the frame_id we set in camera.launch). You should now see something like this!
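If you would rather use the marker poses in code than just echo or visualize them, here is a minimal rospy subscriber sketch. The file name marker_listener.py is my own invention; it assumes the default ar_pose_marker topic published by individualMarkersNoKinect and the AlvarMarkers message from the ar_track_alvar_msgs package.

#!/usr/bin/env python
# marker_listener.py -- a minimal sketch, not part of the original demo.
# Prints the ID and position of every AR tag ar_track_alvar currently sees.
import rospy
from ar_track_alvar_msgs.msg import AlvarMarkers

def callback(msg):
    # msg.markers holds one AlvarMarker per tag in view; pose is a PoseStamped
    for marker in msg.markers:
        p = marker.pose.pose.position
        rospy.loginfo('marker %d at x=%.3f y=%.3f z=%.3f',
                      marker.id, p.x, p.y, p.z)

if __name__ == '__main__':
    rospy.init_node('marker_listener')
    rospy.Subscriber('ar_pose_marker', AlvarMarkers, callback)
    rospy.spin()

Save it as ar_tag_demo/scripts/marker_listener.py, make it executable (chmod +x marker_listener.py), and run it with rosrun ar_tag_demo marker_listener.py while main.launch is up.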
Show off your AR tag demo with pride! Don't tell anyone that Scott Niekum (the package maintainer) did all the hard work for you.
I hope this was helpful to someone. If it was, comment below and let me know. If you run into any problems or see anything that should be changed, comment below for that as well.
Until next time,
Matthew