
Friday, October 6, 2017

Visual Object Recognition in ROS Using Keras with TensorFlow

I've recently gotten interested in machine learning and all of the tools that come along with it. This post documents a method of doing object recognition in ROS using Keras. I don't want to turn this post into a "what is machine learning and how does it work" piece, so I am going to assume you are familiar with machine learning in general and with the Robot Operating System (ROS). Instead, I'm going to present a specific set of instructions on how to get a specific (but very useful) machine learning algorithm working on a ROS platform.

The end result: object recognition in ROS on a live webcam (~2 Hz)

When I was looking around the ROS wiki, I was a bit surprised that there was no object recognition package readily available. I decided I wanted one; that is, I wanted a package that would take a raw camera image and tell me what was in the picture. While I have no doubt that there are many obscure ways of doing this, the most common approach these days (to my knowledge) is machine learning, specifically convolutional neural networks (CNNs). In fact, object recognition is often used as an example of what machine learning is all about. This is where this project picks up.

There are many tutorials on getting CNNs working on various platforms, but I am going to use Keras with the TensorFlow backend. The idea is this: there are plenty of tutorials on getting object recognition working with this package. Pick one (I used THIS one, but the Keras documentation is more general). The tutorial code is plain Python, and ROS accepts Python code via rospy, so all we need to do is wrap that code in a ROS package. I will be the first to admit that I am not an expert in ROS or machine learning, so use these instructions at your own risk. However, this did work for me.

Step 1: Install TensorFlow

I am installing TensorFlow on my virtualized Ubuntu 16.04 machine, as set up in this post. I will tell you that this works surprisingly well, but I am giving it 12 GB of RAM and 3 cores of an i7. The point is, if you have Windows, this will work for you too!

Install TensorFlow using the Linux install instructions; I used the CPU-only virtualenv method. This is probably not the best way to do this, as I imagine ROS has a proper way to handle external dependencies. Feel free to comment below if you know what it is. I figured that, worst case, I could activate the virtualenv in my launch file, which will work for prototyping. When deciding which version of Python to use, note that I used 2.7, as this is the version recommended for ROS Kinetic. Be sure to validate the install before proceeding.


Step 2: Install Keras

Next you want to install Keras. The important note here is that you want to install it in the same virtualenv environment as TensorFlow. Do this by activating the environment before you install, like you did in the TensorFlow directions (source ~/tensorflow/bin/activate). The TensorFlow backend is the default, so you are ok there. However, you will need h5py; install it with "pip install h5py".

Step 3: Build your ROS package

First, we need to create a package. Call it what you want, but note the dependencies.


catkin_create_pkg object_recognition rospy std_msgs cv_bridge sensor_msgs

Next, create a new file called classify.py and make sure it is marked as executable (chmod +x classify.py). Copy the code below into the file.


#!/usr/bin/env python
import rospy
import cv2
import roslib
import numpy as np
from std_msgs.msg import String
from std_msgs.msg import Float32
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError

import tensorflow as tf
from keras.preprocessing import image
from keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions

# import model and  implement fix found here.
# https://github.com/fchollet/keras/issues/2397
model = ResNet50(weights='imagenet')
model._make_predict_function()
graph = tf.get_default_graph()
target_size = (224, 224)

rospy.init_node('classify', anonymous=True)
#These should be combined into a single message
pub = rospy.Publisher('object_detected', String, queue_size = 1)
pub1 = rospy.Publisher('object_detected_probability', Float32, queue_size = 1)
bridge = CvBridge()

msg_string = String()
msg_float = Float32()



def callback(image_msg):
    #First convert the image to OpenCV image 
    cv_image = bridge.imgmsg_to_cv2(image_msg, desired_encoding="passthrough")
    cv_image = cv2.resize(cv_image, target_size)  # resize image
    np_image = np.asarray(cv_image)               # read as np array
    np_image = np.expand_dims(np_image, axis=0)   # Add another dimension for tensorflow
    np_image = np_image.astype(float)  # preprocess needs float64 and img is uint8
    np_image = preprocess_input(np_image)         # Normalize the data
    
    global graph                                  # This is a workaround for asynchronous execution
    with graph.as_default():
       preds = model.predict(np_image)            # Classify the image
       # decode returns a list  of tuples [(class,description,probability),(class, descrip ...
       pred_string = decode_predictions(preds, top=1)[0]   # Decode top 1 predictions
       msg_string.data = pred_string[0][1]
       msg_float.data = float(pred_string[0][2])
       pub.publish(msg_string)
       pub1.publish(msg_float)      

rospy.Subscriber("camera/image_raw", Image, callback, queue_size = 1, buff_size = 16777216)



rospy.spin()  # Keeps the node alive; spin() blocks until shutdown

At this point you can obviously go straight to running the code if you wish, but I'll step through each chunk and explain it.

Load Dependencies

#!/usr/bin/env python
import rospy
import cv2
import roslib
import numpy as np
from std_msgs.msg import String
from std_msgs.msg import Float32
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError

import tensorflow as tf
from keras.preprocessing import image
from keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions

This section just imports the dependencies. You can see we have some from Python, some from ROS, and some from Keras. If you are not too familiar with rospy, note that the comment on the first line (the shebang) always has to be there. Don't put anything else on the first line, or else ROS won't know this is a Python script.

Load Keras Model

# import model and  implement fix found here.
# https://github.com/fchollet/keras/issues/2397
model = ResNet50(weights='imagenet')
model._make_predict_function()
graph = tf.get_default_graph()
target_size = (224, 224)

This section is where we import our machine learning model. I am using the ResNet50 model, frankly, because that is what the tutorial linked above used, but there are many others included if you look HERE. You can see that this ResNet model was trained using ImageNet, but you could obviously insert your own model or weights here as well. Also note the fix that has been implemented, as noted in the comment.

Start ROS Node

rospy.init_node('classify', anonymous=True)
#These should be combined into a single message
pub = rospy.Publisher('object_detected', String, queue_size = 1)
pub1 = rospy.Publisher('object_detected_probability', Float32, queue_size = 1)
bridge = CvBridge()

msg_string = String()
msg_float = Float32()

This starts all of the ROS stuff. We initialize the node and start two publishers. Now, I am aware that this is bad practice; I should really create a ROS message to house this data. However, at the moment I don't have a specific application for this, so I will leave that to the user. I am just publishing two different messages: one for the name of the most likely object and one for the corresponding probability.
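
For what it's worth, the combined message would only take a two-line definition. A hypothetical msg/ObjectDetection.msg sketch (the filename is my own choice; you would also need to add message generation to the package's CMakeLists.txt and package.xml):

```
string object          # most likely object name
float32 probability    # corresponding probability
```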


Run Model Inside callback

def callback(image_msg):
    #First convert the image to OpenCV image 
    cv_image = bridge.imgmsg_to_cv2(image_msg, desired_encoding="passthrough")
    cv_image = cv2.resize(cv_image, target_size)  # resize image
    np_image = np.asarray(cv_image)               # read as np array
    np_image = np.expand_dims(np_image, axis=0)   # Add another dimension for tensorflow
    np_image = np_image.astype(float)  # preprocess needs float64 and img is uint8
    np_image = preprocess_input(np_image)         # Normalize the data
    
    global graph                                  # This is a workaround for asynchronous execution
    with graph.as_default():
       preds = model.predict(np_image)            # Classify the image
       # decode returns a list  of tuples [(class,description,probability),(class, descrip ...
       pred_string = decode_predictions(preds, top=1)[0]   # Decode top 1 predictions
       msg_string.data = pred_string[0][1]
       msg_float.data = float(pred_string[0][2])
       pub.publish(msg_string)
       pub1.publish(msg_float)      

rospy.Subscriber("camera/image_raw", Image, callback, queue_size = 1, buff_size = 16777216)

rospy.spin()  # Keeps the node alive; spin() blocks until shutdown

Here is the heart of the code. I tried to comment it pretty well, but here is the workflow.

  1. The callback function fires when a new image is available. 
  2. Use cv_bridge to convert the image from a ROS image type to an OpenCV image type.
  3. Resize the image to the shape required by ResNet50, 224 x 224. 
  4. Read the OpenCV image in as a NumPy array.
  5. Expand the array into the size needed for TensorFlow.
  6. Convert the data from uint8 to float64.
  7. Normalize the data.
  8. Run the model and classify the image.
  9. Decode the prediction and convert the results to appropriate data types.
  10. Publish the prediction.
It's also worth noting the large buffer size on the subscriber. This was done per the recommendation HERE.
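
To make steps 9 and 10 concrete, here is a small pure-Python sketch of how the decoded prediction gets picked apart. The tuple values below are made-up example data, not real model output:

```python
# decode_predictions(preds, top=1) returns one list per image, each holding
# (class_id, description, probability) tuples sorted by probability.
preds_decoded = [[('n02123045', 'tabby', 0.4165)]]  # made-up example data

top_prediction = preds_decoded[0][0]    # first image, top-1 tuple
name = top_prediction[1]                # description -> String message
probability = float(top_prediction[2])  # probability -> Float32 message

print(name, probability)
```

In the node, these two values are what end up in msg_string.data and msg_float.data before publishing.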

Step 4: Run the Code!

Now the fun part. Start your webcam via your favorite method; we just need the camera/image_raw topic, which is pretty standard. If you need help with that, see my other post on AR Tags for instructions.

Now we need to launch our node. It's important that we do that in our virtualenv, so source the environment again if you haven't already (source ~/tensorflow/bin/activate). Then just rosrun your node.


rosrun object_recognition classify.py

Now you should be able to rostopic echo /object_detected and /object_detected_probability to see what your webcam is seeing. On my virtual machine this runs at about 2 Hz, but I imagine that could be increased on a native Ubuntu install. Here are some examples! It does OK. It didn't recognize a pack of playing cards, so I am guessing that is not in the ImageNet training set. I am still fairly impressed with it.



So that's it; you can now implement an object recognition package in ROS! Comment below if you use this in a project. I'd be particularly interested if someone uses their own model or does some transfer learning with this one to suit their specific application. If you have any other questions or comments, feel free to post those as well.

-Matthew

Thursday, March 5, 2015

Serial Port Communication with GNU Octave in Windows

This is not so much a finished post as it is a place to record progress. Use any information found on this page at your own risk.

Introduction

I have been using GNU Octave in place of MATLAB on my laptop for a while now. It is free and serves my purposes well. One place MATLAB does have it beat though is in its ability to communicate with outside hardware through a serial port. I recently needed this functionality for Octave. This is how I made it work. My configuration:
  • Windows 7 - 64 bit
  • GNU Octave 3.8.2-5 using MXE installer
  • Instrument Control Package 0.2.1

Walkthrough

Install Octave

If you found this post I will assume you are probably running Windows. There is a convenient unofficial installer for Windows HERE. At the time of this writing I am running 3.8.2-5. Anything greater than 3.8.0 has the nice MATLAB-style GUI.

Install Instrument Control Package

The equivalent of MATLAB toolboxes are packages in Octave. You need the instrument-control package to access the serial ports. There are two ways to install it.

1) Install it from Octave Forge. Assuming you have an internet connection, open Octave and type "pkg install -forge instrument-control" in the command window. This will fetch and install the newest version of the package.

2) Download it from HERE. Assuming you did a standard install, move it to the folder "C:/Octave/Octave-3.8.2/src", where you will find all the other packages that were included with the installer. Now open Octave, make that folder your working directory, and type "pkg install instrument-control-0.2.1.tar.gz" in the command window. Obviously you may need to change the name of the package if you download a newer version.

Both options will take a while. One of my first mistakes was thinking I had crashed my computer: I wasn't sure the install would work on Windows, so when it just sat there for a minute I thought it was hung. Just give it some time. Mine took a couple of minutes.

Load Instrument Control Package

You only have to install the package once, but you need to load it every time you open Octave (you can also set it to auto-load; Google it).

Type "pkg list" to see all your installed packages. If you don't see instrument-control, you need to go back to the last step. Any package with an * by it is loaded.

To load the package type "pkg load instrument-control". Now load the list of packages again to see if it worked.

Use the Package

Now the part you have been waiting for. It is important to note that at the time of this writing the instrument control package is not a drop-in replacement for the serial capabilities of MATLAB. Here are some helpful links to illustrate this. It is fairly obvious that the function names are different or missing in Octave.
For my initial test I used an Arduino with a jumper between Rx and Tx, which essentially mirrored anything I sent back to me. To simplify things, go to the Device Manager and change the serial port number to something between COM1 and COM8; above that, additional work is needed. (Device Manager > your port > Port Settings > Advanced > COM Port Number.)

My Additions
To better serve my needs I added a few files to make the package more MATLAB compatible. Just make sure they are in your path somewhere if you want to use them.

srl_fwrite: Download HERE. Similar to the MATLAB fwrite. The regular srl_write only accepts chars and uint8s; I made this function to simplify sending other variable types. It accepts three inputs:
  • Serial Object
  • Data to be sent
  • Data Type - int8, uint8, int16, uint16, int32, uint32, int64, or uint64
srl_fread: Download HERE. Similar to the MATLAB fread. Reads the serial port and returns the data type specified. It takes three inputs:
  • Serial Object
  • Number of values to be returned (e.g. for 3 uint64s, enter 3, not 24)
  • Data Type - int8, uint8, int16, uint16, int32, uint32, int64, or uint64
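
The helpers above are Octave functions, but the underlying job (turning typed values into raw bytes and back) can be sketched with Python's standard struct module. The big-endian '>H' format here is my assumption, chosen to match the byte ordering that the sample output further down suggests:

```python
import struct

# Pack a uint16 into two bytes before writing to the serial port,
# the way an srl_fwrite-style helper must do internally.
value = 300
packed = struct.pack('>H', value)   # big-endian: high byte first
print(list(packed))                 # [1, 44]

# Reading is the inverse: reassemble received bytes into the value.
unpacked = struct.unpack('>H', packed)[0]
print(unpacked)                     # 300
```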

Test Script
Test Script: This script was taken and modified from the wiki linked above. It opens a serial port, sends a couple of values, and then attempts to read them back as the serial device mirrors them. A "correct" output should look something like this:

Serial: Supported
s1 = 0x444
int8 = 200
intdata =

    0  142    1   44


That is all I have at the moment. I hope this tutorial was useful to someone out there. I plan to do another post on the way I am actually using this capability in the future as a more in depth example. 

-Matthew

Saturday, November 16, 2013

iRobot Create: Arduino Control

Introduction

This is the fourth section of the iRobot Create tutorial. If you have not completed the first sections, I would recommend that you go back and do so by following the links below.

Sections

Reference Documents

These documents should be referenced for details on interfacing with the Create
  • iRobot Create Open Interface Manual (OIM)- This manual provides detailed information on the serial interface with the Create. It details the implementation of the opcode system used to control the various systems as well as the necessary measures that must be taken to receive sensor data from the Create. Information regarding sensor packet size, connector pinouts, and command details can be found here.
  • Roomba Class Reference Guide (CRG)- This document provides support for the Arduino "Roomba" library. Here details can be found regarding the functions included in that library.
  • iRobot Create User Manual - This manual provides an introduction to the basic functions of the Create and an overview of the basic onboard functionality

Necessary Hardware

Necessary Software


Arduino Control

Arduino control can be implemented using the Roomba library. This library handles all the background serial commands allowing the user to program the Create's functions using the Arduino IDE. If the Roomba library is not installed download it HERE. Unzip and install the library then restart the Arduino IDE (See How to Install a Library).

Arduino Basics

This tutorial assumes basic knowledge of the Arduino IDE. If instructions are unclear or problems arise concerning the Arduino system, refer to THIS page and my previous posts (the ones labeled Arduino).


Wiring

Connecting the Arduino Mega to the Create is simple. In general, connections should be made according to the chart below. See my post, iRobot Create - Arduino interface cable.
Note that Serial1 (TX1/RX1) should be used on the Arduino Mega. The serial port output TXD from the Roomba/Create is too weak to drive the RX serial port (Serial0) input of an Arduino properly. This is because of the USB-serial converter on the Arduino: it also tries to drive the RX serial port input via a pullup resistor, but the Roomba does not have enough drive to pull RX down below about 2.5 volts, which is insufficient to be reliably detected as a TTL serial input of 0. Using Serial1 also still allows the Arduino Serial Monitor to be used for debugging purposes. Note that Serial2 or Serial3 could be used instead if selected in software.
Arduino Mega Pin    Create Cargo Bay Pin
TX1 (pin 18)        RXD (pin 1)
RX1 (pin 19)        TXD (pin 2)
GND                 GND (pin 14)

[Figure: Cargo Bay pinout]
[Figure: Arduino Mega pinout]

In this tutorial, connections will be simplified using a custom interface shield and ribbon cable. Connect the Arduino as shown below. See THIS post for details on the custom interface shield and cable.
[Figures: iRobot Create Arduino wiring]
Note:
  • The direction of the ribbon cable is important. It must be connected as shown.
  • The connection of the TX/RX cable is important. Connect it exactly as shown.
    • White: TX1 pin 18
    • Black: RX1 pin 19
  • When using the cargo bay connector, ensure that the mini-DIN connector (the one used with the Create serial cable) is unplugged.

The recommended input voltage for an Arduino is 7-12V, center positive. Verify battery voltage before connecting. 


Checking Connections: TestSuite

This example is included in the Roomba library (see "Necessary Software"). It allows for a quick assessment of Arduino-Create communication.
1) Open TestSuite.pde - In the Arduino IDE: File > Examples > Roomba > TestSuite
2) Connect the USB cable to the Arduino. Install the driver if not already done (How to Install Arduino Drivers)
3) Upload Program
  • Tools > Board > Arduino Mega 2560
  • Tools > Serial Port > [COM port]
  • File > Upload 
4) Open Serial Monitor - Set baud to 9600
5) Restart Arduino Mega by pressing restart button

A message indicating 0 errors should be displayed in the Serial Monitor and the Create should play an audible melody. If errors are reported, check the items listed below. Proceeding to other examples will be futile until these errors are eliminated.
  • TX/RX cable: White -> pin 18 , Black -> pin 19
  • Orientation of ribbon cable
  • Serial monitor baud rate
  • Arduino Driver Installed
  • Correct COM port selected

Controlling Drive Output

This example shows the basics of controlling the Create's movements. There are two basic functions that can be used to control the Create's drive motors, and the Roomba library provides support for both. Details regarding usage of the two functions can be found in the Roomba Class Reference Guide. Information on maximums, minimums, and special cases can be found in the Open Interface Manual.
  • drive(int16_t velocity, int16_t radius) - Velocity is in mm/s. Radius is in mm. Special values can be found in the CRG.
  • driveDirect(int16_t leftVelocity, int16_t rightVelocity) - Velocity is in mm/s.
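
To get a feel for what the radius argument implies, the per-wheel speeds can be estimated with basic differential-drive geometry. This is just an illustrative Python sketch for ordinary radii (special radius codes such as driving straight or spinning in place are handled separately; see the CRG). The ~258 mm wheel base is my assumption, not a value taken from the CRG or OIM:

```python
WHEEL_BASE_MM = 258.0  # assumed distance between the Create's drive wheels

def wheel_speeds(velocity, radius):
    """Approximate per-wheel speeds (mm/s) implied by drive(velocity, radius).

    The robot's center moves at `velocity` along a circle of `radius`,
    so each wheel traces a circle offset by half the wheel base.
    """
    left = velocity * (radius - WHEEL_BASE_MM / 2.0) / radius
    right = velocity * (radius + WHEEL_BASE_MM / 2.0) / radius
    return left, right

# A large positive radius approaches straight-line driving:
# both wheels end up near 200 mm/s, inner (left) wheel slightly slower.
print(wheel_speeds(200, 10000))
```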

1) Turn Create to OFF.
2) Open Roomba_Drive_Test.ino - In the Arduino IDE: File > Examples > Roomba > TTU Examples > Roomba_Drive_Test.ino or copy sketch from link at the bottom of the section.
3) Upload it to the Arduino Mega.
WARNING: If the Create is on, the sketch will start as soon as the upload is complete. Before uploading, turn the Create off. It is important to note that the Create may be ON even if the power LED is off. The power LED turns off when the Create is put in safe or full mode as well. Cycling power until the power LED is lit and then goes off will ensure that the Create is truly OFF. Regardless, it is a good practice to ensure that adequate space is available in case of accidental movement.
4) Disconnect USB and connect Arduino external power.
5) Place Create on large, flat surface (e.g. the floor).
6) Power up the Create.
7) Restart Arduino.
The Create should cycle through a series of movements using the two methods of control as defined below. 
  • driveDirect
    • Drive straight
    • Spin CounterClockwise
    • Spin Clockwise
    • Stop
  • drive
    • Turn Left
    • Turn Right
    • Drive Straight
    • Spin Clockwise
    • Spin CounterClockwise
    • Stop


Reading Sensor Data

Sensor data can be read in two different ways. While both methods are described in the OIM, this example will only cover the getSensors() approach. The Create automatically updates its sensor data every 15 ms, and the user can choose how often to read those values. Calling getSensors more frequently will cause no harm, but readings taken within the same 15 ms window will be redundant.
To read a sensor, the following information is needed
  • Sensor packet ID - the number associated with the sensor value the user is trying to read. Packets 0-6 are associated with groups of sensor values. Packet 6 is associated with all sensor values available 
  • Size of sensor packet (in bytes) - the number of bytes returned when the user calls a sensor packet ID 
  • Variable type returned - the way the bytes received must be interpreted. 
Example 1: Packet 7 returns one byte. However, it must be interpreted as individual bits. A value of 3 means that bits 0 and 1 are 1s, and therefore the Left and Right Bumpers are both triggered.
Example 2: Packet 28 returns 2 bytes. They must be interpreted as one unsigned integer value. As in the "Drive Forward 20cm" example, a value of 1 and 44 would mean that the Left Cliff Sensor is reading 300.
All of this information can be found in the Open Interface Manual beginning on pg. 17. 
Useful Arduino functions for interpreting sensor values
  • bitRead - Reads an individual bit in a byte
  • Bitshift - Shifts the bits in a variable in either direction. Useful for high_byte, low_byte composition
  • BitShiftCombine - Function included in the example (defined at the bottom). Uses bit shifting to combine two bytes into a 16-bit int. Note that the int may be signed or unsigned depending on the receiving variable type. 

getSensors(uint8_t packetID, uint8_t* destination, uint8_t length)
  • packetID - number of packet to read
  • destination - an array with at least "length" entries. Note that arrays are 0-indexed, i.e. the first value in an array of 52 entries is array[0] and the last entry is array[51].
  • length - number of bytes associated with packetID being used.
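
The two worked examples above translate directly into a few lines of bit arithmetic. Here is a Python sketch of the same logic (Python rather than an Arduino sketch so it can be run anywhere; the operations mirror bitRead and BitShiftCombine):

```python
# Example 1: packet 7 returns one byte whose individual bits are flags.
bumps = 3                       # sample sensor value
bump_bit0 = (bumps >> 0) & 1    # read bit 0, like bitRead(bumps, 0)
bump_bit1 = (bumps >> 1) & 1    # read bit 1, like bitRead(bumps, 1)
print(bump_bit0, bump_bit1)     # both bumper bits set

# Example 2: packet 28 returns two bytes forming one unsigned 16-bit value.
high_byte, low_byte = 1, 44     # sample bytes from the Create
cliff_left = (high_byte << 8) | low_byte
print(cliff_left)               # 1 * 256 + 44 = 300
```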

Process for running sketch:
1) Open Full_Sensor_Test.ino - In the Arduino IDE: File > Examples > Roomba > TTU Examples > Full_Sensor_Test
2) Upload sketch to Arduino Mega
3) Open Arduino Serial Monitor - Set baud to 57600
4) Power ON Create
5) Restart the Arduino
Data from sensor packet 6 (all sensor data) should be displayed in the Serial Monitor. For more information regarding the nature of the sensor data, see the Open Interface Manual.


Basic Object Avoidance

This example demonstrates the use of the Create's sensors to navigate around obstacles. When executed, the Create should drive forward. When it bumps into an object, it should back up and turn away from the object. Sensor data is read using getSensors and motor control is implemented using driveDirect.


1) Turn Create to OFF.
2) Open Basic_Object_Avoidance.ino - In the Arduino IDE: File > Examples > Roomba > TTU Examples > Basic_Object_Avoidance
3) Upload sketch to the Arduino Mega.
WARNING: If the Create is on, the sketch will start as soon as the upload is complete. Before uploading, turn the Create off. It is a good practice to ensure that adequate space is available in case of accidental movement.
4) Disconnect USB and connect Arduino external power.
5) Place Create on large, flat surface (e.g. the floor).
6) Power up the Create.
7) Restart Arduino. 

iRobot Create: MATLAB Control

Introduction

This is the third section of the iRobot Create tutorial. If you have not completed the first and second sections, I would recommend that you go back and do so by following the links below; they provide more insight into how the toolbox actually works. This section covers control of the iRobot Create via MATLAB. If you do not have access to MATLAB, feel free to skip this section. While it may be possible to use GNU Octave (a free MATLAB-compatible program), I know very little about that (Update: See my post on Octave serial communication HERE. More details on an Octave package coming soon).

Sections

Reference Documents

These documents should be referenced for details on interfacing with the Create
  • MATLAB Toolbox Documentation - This document provides details on the various functions included in the MATLAB toolbox. More information can be found in comments in the functions themselves.
  • iRobot Create Open Interface Manual (OIM)- This manual provides detailed information on the serial interface with the Create. It details the implementation of the opcode system used to control the various systems as well as the necessary measures that must be taken to receive sensor data from the Create. Information regarding sensor packet size, connector pinouts, and command details can be found here.
  • iRobot Create User Manual - This manual provides an introduction to the basic functions of the Create and an overview of the basic onboard functionality

    Necessary Hardware

    Necessary Software


    MATLAB Control

    Another convenient way to control the Create is with a MATLAB toolbox. The toolbox to be used in this lab (developed by Joel Esposito at the US Naval Academy) allows the user to control the Create from any computer via a serial tether. This lab will explore the basic functions of this toolbox. For more information, see the comments in the functions themselves or read the MTIC Documentation.

    Before beginning, download the MATLAB toolbox HERE and unzip it into a folder in your MATLAB directory (in Windows this will be "C:\Program Files\MATLAB"). Next, connect the Create to the computer using either the iRobot serial cable and a serial extension or a Bluetooth serial link. Open the Device Manager and make note of the COM port associated with the Create. Open MATLAB and proceed.

    ExampleButtonBeep

    This example demonstrates the use of the Create's Advance and Play buttons. When in Full or Safe Mode, these buttons may be read as digital inputs. The function ButtonSensorRoomba returns 1 for a depressed button and 0 for a button that is not depressed.
    1) Open ExampleButtonBeep.m
    2) Set serial port to COM port associated with Create. 
    Example: For a Create connected to COM 8. Change RoombaInit() to RoombaInit(8) 
    3) Click the green run icon in the center of the MATLAB toolbar.
    4) Press combinations of the Play and Advance buttons to hear different patterns of beeps

    ExampleDrive

    This example explores two of the ways the user can control the Create's wheels, SetFwdVelRadiusRoomba and SetDriveWheelsCreate. See comments in the scripts for details on each function. For more information, see comments in the functions themselves or read the MTIC Documentation.
    1) Open ExampleDrive.m
    2) Set serial port to COM port associated with Create. 
    Example: For a Create connected to COM 8. Change RoombaInit() to RoombaInit(8) 
    3) Place Create on large, flat surface (e.g. the floor).
    4) If not already done, turn Create to ON.
    5) Click the green run icon in the center of the MATLAB toolbar. 

    Observe how each function moves the Create. Note the distance measurements displayed in the MATLAB Command Window; they are the distance readings from the Create's wheel encoders. Note that the distance is taken as the average of the two wheels, so if one wheel traveled 1 m and the other wheel traveled -1 m, the distance is still 0.
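
    The averaging is easy to check numerically. A tiny Python illustration of the bookkeeping (my own sketch, not the toolbox's actual code):

```python
def create_distance(left_mm, right_mm):
    """Distance reading as the average of the two wheel distances."""
    return (left_mm + right_mm) / 2.0

print(create_distance(1000, 1000))   # driving straight: 1000.0
print(create_distance(1000, -1000))  # spinning in place: 0.0
```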

    Explore how different inputs affect the Create's movements and the sensor readings.

    ExampleDrive2

    This example demonstrates two of the ways to control the Create's wheels using feedback from the Create's wheel encoders, travelDist and turnAngle.
    Note: travelDist and turnAngle use scripting from the Create Open Interface. This means that these functions are blocking. The Create waits and will not accept any new commands (e.g. requests for sensor data or commands to STOP!!) until it has traveled the desired distance. For this reason, use of these functions should be limited to small distances.

    1) Open ExampleDrive2.m
    2) Set serial port to COM port associated with Create. 
    Example: For a Create connected to COM 8. Change RoombaInit() to RoombaInit(8) 
    3) Place Create on large, flat surface (e.g. the floor).
    4) If not already done, turn Create to ON.
    5) Click the green run icon in the center of the MATLAB toolbar. 
    Observe how each function moves the Create.
    Explore how different inputs affect the Create's movements.

    ExampleSensorRead

    This example demonstrates the various methods of reading the Create's sensors. For details on interpreting the sensor readings, see the Create Open Interface Manual. 
    Note that not all sensor read functions are included in this example. See the MTIC documentation for information on other individual sensor read functions. 
    1) Open ExampleSensorRead.m
    2) Set serial port to COM port associated with Create. 
    Example: For a Create connected to COM 8. Change RoombaInit() to RoombaInit(8) 
    3) If not already done, turn Create to ON.
    4) Click the green run icon in the center of the MATLAB toolbar. 
    Observe the values printed to the MATLAB Command Window from the different functions. Note the format of each reading. Using the Create Open Interface Manual and the MTIC documentation, interpret each value displayed and consider how that value might be useful. For example:
    CliffRgt = 0. What does that mean?
    Wall = 1. What does that mean?
    pCharge = 66.4563. What does that mean? 

    ExampleBasicObjectAvoidance

    This example demonstrates the implementation of basic object avoidance using the Create's bump sensors. 
    The Create drives forward until it encounters an object and turns away from it. Pressing either button (Advance or Play) while the script is running stops the Create. 
    Note: Set Serial Port to port connected to Create
    Note: ctrl + c stops execution of MATLAB code 
    1) Open ExampleBasicObjectAvoidance.m
    2) Set the serial port to the COM port associated with the Create. 
    Example: for a Create connected to COM 8, change RoombaInit() to RoombaInit(8). 
    3) Place Create on large, flat surface (e.g. the floor).
    4) If not already done, turn Create to ON.
    Note: To stop the Create's movements, press one of its buttons or simply lift one of its wheels off the ground. If the Create becomes disconnected from the computer, it will continue to follow the last command it was given.
    5) Click the green run icon in the center of the MATLAB toolbar. 
    Observe the behavior of the Create. Make changes to the example script to improve its functionality. What limitations do you observe in the system?
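
    The bump-and-turn behavior in this example can be sketched as the loop below. This is an illustrative reconstruction, not the toolbox's actual script; the MTIC function names and signatures are assumptions to verify against the MTIC documentation.

    ```matlab
    % Basic bump-and-turn avoidance sketch (MTIC function names assumed --
    % verify against the MTIC documentation before running).
    serPort = RoombaInit(8);

    tStart = tic;
    while toc(tStart) < 60                          % run for one minute
        SetFwdVelRadiusRoomba(serPort, 0.2, inf);   % drive straight at 0.2 m/s
        [BumpRight, BumpLeft, ~, ~, ~, BumpFront] = ...
            BumpsWheelDropsSensorsRoomba(serPort);
        if BumpRight || BumpFront
            travelDist(serPort, 0.2, -0.1);         % back up 0.1 m
            turnAngle(serPort, 0.2, 90);            % turn left, away from object
        elseif BumpLeft
            travelDist(serPort, 0.2, -0.1);
            turnAngle(serPort, 0.2, -90);           % turn right, away from object
        end
    end
    SetDriveWheelsCreate(serPort, 0, 0);            % stop both wheels
    ```

    Note that a loop like this only reacts after contact; consider how the cliff or wall sensors could be used to turn away before a collision.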

    ExampleKeyboardControl

    This example demonstrates keyboard control of the iRobot Create. 
    Note: Exiting keyboard control by any method other than pressing 'q' (e.g. clicking the red x on the window) will likely crash MATLAB. 
    1) Open ExampleKeyboardControl.m
    2) Set the serial port to the COM port associated with the Create. 
    Example: for a Create connected to COM 8, change RobotHardKeyBoard() to RobotHardKeyBoard(8) and RoombaInit() to RoombaInit(8). 
    3) Place Create on large, flat surface (e.g. the floor).
    4) If not already done, turn Create to ON.
    5) Click the green run icon in the center of the MATLAB toolbar.
    6) Use keyboard commands displayed on screen to control Create.
    7) Press 'q' to end keyboard control.

    iRobot Create: Console App GUI Control

    Introduction

    This is the second section of the iRobot Create tutorial. If you have not completed the first section, I would recommend that you go back and do so by following the link below.

    Sections

    Reference Documents

    These documents should be referenced for details on interfacing with the Create.
    • iRobot Create Open Interface Manual (OIM)- This manual provides detailed information on the serial interface with the Create. It details the implementation of the opcode system used to control the various systems as well as the necessary measures that must be taken to receive sensor data from the Create. Information regarding sensor packet size, connector pinouts, and command details can be found here.
    • iRobot Create User Manual - This manual provides an introduction to the basic functions of the Create and an overview of the basic onboard functionality.

    Necessary Hardware

    Necessary Software


    GUI Control

    One convenient way to get started with controlling the iRobot Create is with a desktop application. One such application (downloadable HERE) provides a convenient GUI that masks the underlying serial commands. Follow the instructions below to get started.
    1) Connect the Create as described above. Use of a serial extension is recommended.
    2) Run Create.1.0.1.2.exe (close RealTerm if open)
    3) Power on Create
    4) Select "COM Open"
    Use the following settings
    • COM Port: Default
    • Baud: 57600
    • Data Bits: 8
    • Stop Bits: 1
    • Parity: None
    • Flow Control:
      • RTS: Disabled
      • DTR: Disabled
    Note that "COM PORT OPENED" should be displayed in the bottom left corner of the window. If the program fails to connect, check that RealTerm is closed and no other program is using the COM port. If problems persist, open the device manager and ensure that the Create is using COM 1.
    Create COMopen.jpg 

    5) Select "Safe Mode." The lights on the Create should go out, and the picture of the Create should change to that shown below. Note that if any of the wheel-drop sensors is triggered, the Create will return to Passive mode. Also note that Full mode should not be used, as it disables safety-critical sensor functions.
    Create GUIsafemode.jpg 

    6) Check "Automatic Refresh" next to Get Sensor Data. Try triggering the Create's sensors. The green icons should now turn red when the respective sensor is triggered.
    Create GUIautomaticupdate.jpg 

    7) Place the Create on a large, flat surface and explore the various methods of control described below. Note that in an emergency, picking up the Create will cease all output. If the Create becomes unplugged, it will continue to follow the last command it received. For this reason it is good practice to have one team member "spot" the Create while another controls it. A robot traveling at 400 mm/s can be surprisingly difficult to catch. 
    1) Directional buttons- These buttons allow the user to control the Create at a safe speed by clicking the appropriate button. Note that the Create will follow the last command given (such as drive forward) until the user inputs a new one (such as stop). 
    2) Manual speed input- This option allows the user to input the desired speed of the Create (mm/s) and select a direction. Again, the Create will follow the last command given until it receives a new one. 
    3) Graphical control- Note the graph to the right of the window. It depicts the iRobot Create, with the red line pointing toward the front of the Create. Clicking a point on the graph causes the Create to navigate to that point using its wheel-encoder data. Ensure that sufficient cable length is available before sending any commands. Each grid mark is 10 inches. 

    8) View the Log- Selecting the "Log" tab at the top of the window allows the user to view the serial commands sent to the Create. Compare the serial commands sent for driving forward with those used in the "Getting Started: Using OI Commands" section. Reference the Open Interface Manual for information on each command.
    Create GUIlog.jpg
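
    The bytes in the log follow the Open Interface opcode scheme from the manual: START is opcode 128, Safe mode is 131, and DRIVE is opcode 137 followed by velocity and radius as signed 16-bit values, high byte first. A sketch of what sending these raw bytes from MATLAB might look like is below (the COM port name and the use of MATLAB's serial/fwrite interface here are illustrative; confirm the opcodes and byte order against the Open Interface Manual).

    ```matlab
    % Raw Open Interface bytes, as they appear in the GUI's log
    % (opcodes per the Create Open Interface Manual; port name and
    % serial-object usage are illustrative).
    s = serial('COM1', 'BaudRate', 57600);
    fopen(s);

    fwrite(s, 128);                 % START: begin the Open Interface
    fwrite(s, 131);                 % SAFE: enter Safe mode

    % DRIVE (opcode 137): [vel high][vel low][radius high][radius low].
    % 200 mm/s straight ahead; the special radius 0x8000 means "straight".
    vel = typecast(int16(200), 'uint16');
    fwrite(s, [137, ...
        bitshift(vel, -8), ...      % velocity high byte
        bitand(vel, 255), ...       % velocity low byte
        128, 0]);                   % radius 0x8000 = drive straight

    fwrite(s, [137, 0, 0, 0, 0]);   % velocity 0: stop
    fclose(s);
    ```

    Comparing a sequence like this with the GUI's log for "drive forward" should show the same opcode and two's-complement byte pattern.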



    Goto: Arduino Control