<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://old.hacdc.org/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Andrewtron3000</id>
	<title>HacDC Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://old.hacdc.org/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Andrewtron3000"/>
	<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php/Special:Contributions/Andrewtron3000"/>
	<updated>2026-05-07T12:08:14Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.3</generator>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_1&amp;diff=7977</id>
		<title>Robotics Class 2011/Assignment 1</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_1&amp;diff=7977"/>
		<updated>2012-09-08T00:50:58Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:RobotClass]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Note: This assignment assumes you have already installed ROS as well as the supplemental ROS package named [http://www.ros.org/wiki/irobot_create_2_1 irobot_create_2_1], an excellent ROS package that serves as a ROS driver for the iRobot Create robot series.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Write a ROS node that subscribes to the &amp;quot;sensorPacket&amp;quot; topic (a message of type irobot_create_2_1/SensorPacket) and uses values from within this message to represent the emotional state of the robot.  You will then publish the robot&#039;s emotional state as a message of type robot_emotions/EmotionalState on the topic named &amp;quot;robot_emotions&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
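Here is a minimal sketch of what such a node could look like.  This is &#039;&#039;not&#039;&#039; the example solution from the robot_emotions package; the field names used on EmotionalState (and the battery threshold) are made-up placeholders, so check the real fields with the &#039;&#039;&#039;rosmsg show&#039;&#039;&#039; commands given later in this assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
import roslib
roslib.load_manifest(&amp;quot;irobot_create_2_1&amp;quot;)
roslib.load_manifest(&amp;quot;robot_emotions&amp;quot;)
import rospy
from irobot_create_2_1.msg import SensorPacket
from robot_emotions.msg import EmotionalState

pub = rospy.Publisher(&amp;quot;robot_emotions&amp;quot;, EmotionalState)

def sensor_callback(packet):
    # Hypothetical mapping from sensor values to an emotion; the field
    # names on both messages are assumptions, so verify them with rosmsg show.
    state = EmotionalState()
    state.mood = &amp;quot;content&amp;quot; if packet.voltage &gt; 15000 else &amp;quot;grumpy&amp;quot;
    pub.publish(state)

if __name__ == &amp;quot;__main__&amp;quot;:
    rospy.init_node(&amp;quot;my_emotion_generator&amp;quot;)
    rospy.Subscriber(&amp;quot;sensorPacket&amp;quot;, SensorPacket, sensor_callback)
    rospy.spin()
&lt;/pre&gt;
&lt;br /&gt;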
In order to start publishing the &amp;quot;sensorPacket&amp;quot; at home without a robot, you will need to check out the irobot_sensor_simulator ROS package, available from the HacDC ROS repository (several of you already installed it):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobot_sensor_simulator&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake irobot_sensor_simulator&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Once the make is done, you can start up the simulator by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscore&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun irobot_sensor_simulator sensor_simulator.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this node will not produce any output.  It will just sit there.  If you do a &amp;quot;rostopic list&amp;quot; at this point, you will see a sensorPacket topic, and you can display it using &amp;quot;rostopic echo sensorPacket&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
In order to start modifying the values to test your emotion generator, you will need to run the ROS [http://www.ros.org/wiki/dynamic_reconfigure dynamic_reconfigure] reconfiguration GUI interface.  This will provide the GUI interface illustrated in class that allows you to modify values in the sensorPacket.  You can start the reconfigure GUI by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun dynamic_reconfigure reconfigure_gui&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You will need to select irobot_sensor_simulator from the drop-down box, and you should then be presented with boxes and sliders that let you change the values of the sensor packet in real time.&lt;br /&gt;
&lt;br /&gt;
Note that these sliders allow you to configure the values of the sensors in an arbitrary fashion.  However, of course, there is a nominal state of the robot defined by having a fully charged battery and being properly oriented with its base firmly on the floor.&lt;br /&gt;
&lt;br /&gt;
A powerful feature of ROS is the ability to create archives, or &amp;quot;bag files&amp;quot; that contain sensor data that can be played back.  A bag file containing messages of type irobot_create_2_1/SensorPacket has been created and can be downloaded [http://hacdc-ros-pkg.googlecode.com/files/2011-06-08-20-32-02.bag here].  After downloading this bag file, it can be played back using the [http://www.ros.org/wiki/rosbag rosbag] package, which should be installed by default:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosbag play 2011-06-08-20-32-02.bag&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This bag file was recorded with a fully charged battery and with the robot on the floor.  At a point in the middle, the wall sensor was triggered with a piece of paper (the wall sensor indicates when the Roomba detects a wall to its right side, which is useful for wall-following).  Near the end of the bag file, the robot was lifted off the floor, causing the states of the wheel drop and cliff sensors to change.&lt;br /&gt;
&lt;br /&gt;
In order to obtain the definition of the robot_emotions/EmotionalState message, you will need to check out the robot_emotions ROS package from the HacDC ROS repository:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/robot_emotions&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake robot_emotions&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can explore the structure of the two above message types by using rosmsg show in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmsg show irobot_create_2_1/SensorPacket&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmsg show robot_emotions/EmotionalState&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment Hint:&lt;br /&gt;
The robot_emotions ROS package contains, in addition to the definition of the robot_emotions/EmotionalState message, a complete (but very simple) solution to the homework.  Feel free to investigate it if you need help.  If you want to try running the emotion generator provided within the robot_emotions package, just type (after doing the rosmake robot_emotions above):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun robot_emotions emotion_generator.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
After typing this, a &amp;quot;rostopic list&amp;quot; command will show a new topic called robot_emotions, just as the assignment asked for.&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=7976</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=7976"/>
		<updated>2012-09-08T00:47:21Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:RobotClass]]&lt;br /&gt;
&lt;br /&gt;
The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 grid squares (meters) forward and one grid square (meter) left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  Assignment 3 included instructions on how to check out the floating_faces package from the HacDC ROS repository.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack so that the robot can navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics, and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
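To see roughly how such a state machine is put together, here is a minimal sketch.  It is &#039;&#039;not&#039;&#039; the actual smach_guard.py; the state bodies are stubs and only mirror the two state names above.  The IntrospectionServer at the bottom is what lets smach_viewer display the machine.&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
import roslib
roslib.load_manifest(&amp;quot;smach_ros&amp;quot;)
import rospy, smach, smach_ros

class GotoHall1(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=[&amp;quot;arrived&amp;quot;])
    def execute(self, userdata):
        # a real guard would send a move_base navigation goal here
        return &amp;quot;arrived&amp;quot;

class InspectArtifact1(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=[&amp;quot;done&amp;quot;])
    def execute(self, userdata):
        # a real guard would watch /face_coords here for a while
        return &amp;quot;done&amp;quot;

rospy.init_node(&amp;quot;guard_sketch&amp;quot;)
sm = smach.StateMachine(outcomes=[&amp;quot;finished&amp;quot;])
with sm:
    smach.StateMachine.add(&amp;quot;GOTO_HALL_1&amp;quot;, GotoHall1(),
                           transitions={&amp;quot;arrived&amp;quot;: &amp;quot;INSPECT_ARTIFACT_1&amp;quot;})
    smach.StateMachine.add(&amp;quot;INSPECT_ARTIFACT_1&amp;quot;, InspectArtifact1(),
                           transitions={&amp;quot;done&amp;quot;: &amp;quot;finished&amp;quot;})
sis = smach_ros.IntrospectionServer(&amp;quot;guard_sketch&amp;quot;, sm, &amp;quot;/SM_ROOT&amp;quot;)
sis.start()
sm.execute()
&lt;/pre&gt;
&lt;br /&gt;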
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
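One way to turn that observation into code is sketched below: record the x coordinates arriving on &amp;quot;/face_coords&amp;quot; for a while and flag the artifact if they wander by more than a few pixels.  The watch time and pixel threshold are placeholders you would tune.&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
import roslib
roslib.load_manifest(&amp;quot;rospy&amp;quot;)
roslib.load_manifest(&amp;quot;geometry_msgs&amp;quot;)
import rospy
from geometry_msgs.msg import PointStamped

samples = []

def face_callback(msg):
    samples.append(msg.point.x)

def artifact_is_haunted(watch_time=10.0, pixel_threshold=5.0):
    # Watch /face_coords for watch_time seconds; a stationary artifact
    # should keep the detected face at (nearly) the same x coordinate.
    del samples[:]
    rospy.sleep(watch_time)
    return len(samples) &gt; 1 and (max(samples) - min(samples)) &gt; pixel_threshold

rospy.init_node(&amp;quot;ghost_watcher_sketch&amp;quot;)
rospy.Subscriber(&amp;quot;/face_coords&amp;quot;, PointStamped, face_callback)
print artifact_is_haunted()
&lt;/pre&gt;
&lt;br /&gt;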
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below specifies each artifact&#039;s body_name and also its position in the warehouse.  The body_names will be convenient when you want to rotate artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force in relation to the object specified by the particular &#039;&#039;body_name&#039;&#039;.  Thus to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that, to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine (smach_guard.py in the museum_guard ROS package obtained above) that have not only a target position but also a target attitude, specified as a quaternion.  The orientation in the navigation goal specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; being forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation, we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to end up facing to the left of its original orientation in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw).  This ends up pointing us left.  It is easy to convert between the notion of a yaw and a quaternion.  One way is to [http://www.wolframalpha.com/input/?i=euler+angles&amp;amp;a=*C.euler+angles-_*Formula.dflt-&amp;amp;a=*FP.EulerRotation.EAS-_e321&amp;amp;f3=Pi+%2F+2+radians&amp;amp;x=0&amp;amp;y=0&amp;amp;f=EulerRotation.th1_Pi+%2F+2+radians&amp;amp;f4=0&amp;amp;f=EulerRotation.th2_0&amp;amp;f5=0&amp;amp;f=EulerRotation.th3_0 type &amp;quot;euler angles&amp;quot; into Wolfram Alpha].  Doing this gives you a nice conversion utility.  There are a couple of important things to remember when using this conversion utility.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;beta_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;beta_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;beta_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;beta_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
&lt;br /&gt;
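If you would rather compute the quaternion in code, the standard tf library provides &#039;&#039;&#039;quaternion_from_euler&#039;&#039;&#039;, which returns (x, y, z, w).  The sketch below also shows one way to drop the result into a navigation goal, assuming you send goals to move_base with the standard actionlib client; the goal frame and coordinates are placeholders.&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
import roslib
roslib.load_manifest(&amp;quot;actionlib&amp;quot;)
roslib.load_manifest(&amp;quot;tf&amp;quot;)
roslib.load_manifest(&amp;quot;move_base_msgs&amp;quot;)
import rospy, actionlib, math
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from tf.transformations import quaternion_from_euler

rospy.init_node(&amp;quot;nav_goal_sketch&amp;quot;)
client = actionlib.SimpleActionClient(&amp;quot;move_base&amp;quot;, MoveBaseAction)
client.wait_for_server()

# quaternion for a pi/2 radian yaw about z; returned as (x, y, z, w)
qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, math.pi / 2.0)

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = &amp;quot;map&amp;quot;   # assumed fixed frame
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0            # placeholder goal position
goal.target_pose.pose.position.y = 0.0
goal.target_pose.pose.orientation.x = qx
goal.target_pose.pose.orientation.y = qy
goal.target_pose.pose.orientation.z = qz
goal.target_pose.pose.orientation.w = qw
client.send_goal(goal)
client.wait_for_result()
&lt;/pre&gt;
&lt;br /&gt;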
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_3&amp;diff=7975</id>
		<title>Robotics Class 2011/Assignment 3</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_3&amp;diff=7975"/>
		<updated>2012-09-08T00:45:32Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:RobotClass]]&lt;br /&gt;
&lt;br /&gt;
Write a ROS node that subscribes to the &amp;quot;/face_coords&amp;quot; topic generated in the previous face detector homework assignment and uses the information provided in that topic to move the robot base by publishing messages on the &amp;quot;/cmd_vel&amp;quot; topic.  The goal of the assignment is to build a face tracker that attempts to move the robot to keep faces centered in the camera frame.  The &amp;quot;/cmd_vel&amp;quot; topic is of type &amp;quot;geometry_msgs/Twist&amp;quot;, and you can learn more about it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmsg show geometry_msgs/Twist&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
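That command should print the two vectors that make up the message:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
geometry_msgs/Vector3 linear
  float64 x
  float64 y
  float64 z
geometry_msgs/Vector3 angular
  float64 x
  float64 y
  float64 z
&lt;/pre&gt;
&lt;br /&gt;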
For the simulated robot you will use for this assignment, it is important to know its coordinate system.  X is positive going forward, Y is positive going to the left, and Z is positive going up.  This coordinate system is a right handed coordinate system.  We spent a significant amount of time in class going over the importance of keeping track of the coordinate system of the robot, and always making sure you follow the right-hand rule for dealing with the robot coordinate frame.&lt;br /&gt;
&lt;br /&gt;
The robot we are using is a differential drive robot: it has two powered wheels, so it can only move forward and backward and rotate in place.  Thus, for this assignment, when you publish the &amp;quot;/cmd_vel&amp;quot; topic, you need only worry about populating the &#039;&#039;linear.x&#039;&#039; (forward and backward motion) and &#039;&#039;angular.z&#039;&#039; (turning left and right in place) components of the geometry_msgs/Twist structure.  Following the coordinate system above, moving the robot forward would require you to publish a geometry_msgs/Twist message with a positive &#039;&#039;linear.x&#039;&#039; component.  Rotating the robot in place to the left would require you to publish a geometry_msgs/Twist message with a positive value in the &#039;&#039;angular.z&#039;&#039; component.  No fields other than &#039;&#039;linear.x&#039;&#039; and &#039;&#039;angular.z&#039;&#039; need be populated in the geometry_msgs/Twist message.  Remember that you must publish the geometry_msgs/Twist message on the &amp;quot;/cmd_vel&amp;quot; topic.  The robot is listening for messages on this topic and will move accordingly.&lt;br /&gt;
&lt;br /&gt;
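Here is a minimal sketch of publishing one such command; the velocity values are only illustrative:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
import roslib
roslib.load_manifest(&amp;quot;rospy&amp;quot;)
roslib.load_manifest(&amp;quot;geometry_msgs&amp;quot;)
import rospy
from geometry_msgs.msg import Twist

rospy.init_node(&amp;quot;cmd_vel_sketch&amp;quot;)
pub = rospy.Publisher(&amp;quot;/cmd_vel&amp;quot;, Twist)
rospy.sleep(1.0)         # give the new connection a moment to come up

twist = Twist()
twist.linear.x = 0.1     # creep forward (meters per second)
twist.angular.z = 0.2    # while turning to the left (radians per second)
pub.publish(twist)
&lt;/pre&gt;
&lt;br /&gt;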
This assignment is more complicated than the first two, and requires several nodes to be running.  First, you must install gazebo and be able to run the simple gazebo empty world launch script.  The simple command to start gazebo is:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds empty_world.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
It may help to follow the ROS instructions [http://www.ros.org/wiki/simulator_gazebo/Tutorials/StartingGazebo here] in order to install and get gazebo running.&lt;br /&gt;
&lt;br /&gt;
The next step will be to make sure you have the &#039;&#039;gazebo_erratic_plugins&#039;&#039;.  These are extensions to gazebo to support differential drive robots (like the iRobot Create) that have two driven wheels.  If you installed ROS using Synaptic, you can search Synaptic for &amp;quot;erratic&amp;quot; and you should see a package named &#039;&#039;&#039;ros-electric-erratic-robot&#039;&#039;&#039;.  You will want to install this package through Synaptic.  If you compiled from source, you will want to check out, rosdep install, and rosmake the [http://www.ros.org/wiki/erratic_robot erratic_robot] package.&lt;br /&gt;
&lt;br /&gt;
Once that is complete, you can proceed to checking out the HacDC robot simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once that completes, you can start the robot in the simulation (make sure you have done the &#039;&#039;&#039;roslaunch gazebo_worlds...&#039;&#039;&#039; step above before doing this):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should then see the robot get inserted into the world.  At this point, the robot is up and running in the simulation and you can do a &#039;&#039;rostopic list&#039;&#039; to see a variety of message topics.  The robot simulator has a camera being simulated that has characteristics similar to the camera on the actual robot.  You can subscribe to the simulated robot&#039;s camera stream the same way you have done in the past:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Of course this world is not terribly interesting as it is completely empty.  You can add some excitement by using the &#039;&#039;&#039;floating_faces&#039;&#039;&#039; package available in the HacDC ROS repository:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/floating_faces&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake floating_faces&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once that is built, you can then launch the floating faces into the world:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once the faces are in the simulation, it would be useful to be able to manually drive the robot around before trying to control it with a controller.  You can do that by checking out the [http://www.ros.org/wiki/teleop_twist_keyboard teleop_twist_keyboard] ROS package and building it.  Once it is built, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun teleop_twist_keyboard teleop_twist_keyboard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can follow the on-screen instructions on how to use it, but it is advisable to slow the robot commands down by pushing the &amp;quot;z&amp;quot; key a few times, so that the &amp;quot;speed&amp;quot; is around 0.2.  This has been found to be a reasonable maximum speed for linear motion with this particular robot.  You can move forward by pressing &amp;quot;i&amp;quot; and turn left by pressing &amp;quot;j&amp;quot;, but these commands are all listed in the on-screen instructions when you run the teleop_twist_keyboard node.  At this point, if you are still subscribed to the &#039;&#039;/stereo/left/image_rect&#039;&#039; image stream, you should be able to drive around, see what the robot sees, and see the faces on the cubes.&lt;br /&gt;
&lt;br /&gt;
Now, you could start up your face detection node from the previous example.  Since the simulated robot is publishing the image topic named &amp;quot;/stereo/left/image_rect&amp;quot;, your face detection system should work on the faces in the simulated world.&lt;br /&gt;
&lt;br /&gt;
In class we also went over PID control.  There is an excellent article on PID [http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf here] that is recommended reading.  Note that for this particular problem, the error term should be thought of as the number of pixels between the center of the detected face and a vertical line running down the center of the image frame.  Since the images from the camera are 352 pixels wide, the center of the image is at 176.  Then the x coordinate of the &amp;quot;/face_coords&amp;quot; messages can be subtracted from 176 to find the error term that is fed into the controller of your choice to output a commanded velocity that should be fed into the &#039;&#039;angular.z&#039;&#039; field of the geometry_msgs/Twist message.  Note for this homework, it is recommended to start by only rotating the robot in place to keep an image in the center of the frame (i.e. only populate the &#039;&#039;angular.z&#039;&#039; of the geometry_msgs/Twist message).  Once you get that working you can think about moving the robot forward and backwards to keep the face at a constant scale, but this will also require modifications to the &amp;quot;/face_coords&amp;quot; message since &amp;quot;/face_coords&amp;quot; currently only provides the center of the located face, not the size of the bounding rectangle.  Extra credit is assigned for getting these modifications into your solution.&lt;br /&gt;
&lt;br /&gt;
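A proportional-only controller is enough to see the idea working.  Here is a minimal sketch; the gain is a made-up starting value you would tune, and &amp;quot;/face_coords&amp;quot; is assumed to be the PointStamped message from the previous assignment.&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
import roslib
roslib.load_manifest(&amp;quot;rospy&amp;quot;)
roslib.load_manifest(&amp;quot;geometry_msgs&amp;quot;)
import rospy
from geometry_msgs.msg import Twist, PointStamped

IMAGE_CENTER_X = 176.0   # images from the camera are 352 pixels wide
KP = 0.005               # proportional gain; a placeholder to tune

def face_callback(face):
    error = IMAGE_CENTER_X - face.point.x    # pixels from the image center
    twist = Twist()
    twist.angular.z = KP * error             # turn so the face re-centers
    cmd_pub.publish(twist)

rospy.init_node(&amp;quot;face_follow_sketch&amp;quot;)
cmd_pub = rospy.Publisher(&amp;quot;/cmd_vel&amp;quot;, Twist)
rospy.Subscriber(&amp;quot;/face_coords&amp;quot;, PointStamped, face_callback)
rospy.spin()
&lt;/pre&gt;
&lt;br /&gt;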
There is a complete solution to the homework assignment if you are interested in studying it.  It can be checked out from the HacDC ROS repository.  There are two packages: the first is a generic PID control library, and the second is the controller itself.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/pid_control&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake pid_control&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_follow&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake face_follow&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
To run the face tracker, you can launch it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_follow follow.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Don&#039;t forget that for it to work you need to start a face detector first, either your own from the last homework or the example solution from the last assignment.&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_2&amp;diff=7974</id>
		<title>Robotics Class 2011/Assignment 2</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_2&amp;diff=7974"/>
		<updated>2012-09-08T00:44:20Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:RobotClass]]&lt;br /&gt;
&lt;br /&gt;
Write a ROS node that subscribes to the image topic &amp;quot;/stereo/left/image_rect&amp;quot; (which has message type sensor_msgs/Image) and publishes two topics.  The first, named &amp;quot;/face_view&amp;quot;, is an image topic that draws a rectangle around any face that is seen.  The second, named &amp;quot;/face_coords&amp;quot;, is a PointStamped message that has point.x and point.y set to the center of the identified face if there is one face.  If there is more than one face, the behavior can be implementation dependent.&lt;br /&gt;
&lt;br /&gt;
In class we discussed some basic properties of a mathematical model of the pinhole camera, which in turn provided insights into the idea of camera calibration and image rectification.  Camera calibration is discussed in detail in the [http://opencv.willowgarage.com/documentation/camera_calibration_and_3d_reconstruction.html OpenCV Camera Calibration Section].&lt;br /&gt;
&lt;br /&gt;
To start the assignment, you can learn about the structure of the PointStamped message by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmsg show PointStamped&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
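For geometry_msgs/PointStamped, the output is a header plus a 3D point:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
std_msgs/Header header
  uint32 seq
  time stamp
  string frame_id
geometry_msgs/Point point
  float64 x
  float64 y
  float64 z
&lt;/pre&gt;
&lt;br /&gt;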
Remember when working at home to comment out the two lines in your .bashrc file that contain ROS_IP and ROS_MASTER_URI definitions for working with the robot at the HacDC space.  Once you have commented out those two lines (and have restarted your shell or sourced your .bashrc), make sure also to start your own local roscore (the ROS master node) by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscore&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The bag file distributed in class contains the image data that should be used to test your face detector.  You can download the bag file [http://hacdc-ros-pkg.googlecode.com/files/2011-06-18-12-38-55.bag here].  The bag file can be played back by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosbag play -l 2011-06-18-12-38-55.bag&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that the additional &amp;quot;-l&amp;quot; argument allows the bag file to be looped indefinitely.&lt;br /&gt;
&lt;br /&gt;
Once the bag file has begun playing, you can verify the image stream by viewing the raw images from the bag file by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The image viewing program is part of the [http://www.ros.org/wiki/image_view image_view] ROS package (which is, in turn, part of the ROS [http://www.ros.org/wiki/image_pipeline image_pipeline]).&lt;br /&gt;
&lt;br /&gt;
Once the image stream is verified to be working, you can begin developing your face detection system.  &lt;br /&gt;
&lt;br /&gt;
There is a complete example of the homework in the HacDC ROS repository.  If you would like to refer to it, you can check it out via:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that the homework solution uses [http://www.ros.org/wiki/cv_bridge cv_bridge] to map between ROS image messages and OpenCV image arrays, and it makes use of the OpenCV image blurring function &#039;&#039;&#039;cv.Smooth&#039;&#039;&#039; and face detection function &#039;&#039;&#039;cv.HaarDetectObjects&#039;&#039;&#039;.&lt;br /&gt;
&lt;br /&gt;
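If you want a feel for the overall shape of such a node before looking at the example, here is a minimal sketch of just the &amp;quot;/face_coords&amp;quot; half of the assignment (it omits drawing rectangles and publishing &amp;quot;/face_view&amp;quot;).  It uses the same old-style cv and cv_bridge calls mentioned above; the detector parameters are only reasonable defaults, and the &amp;quot;classifier&amp;quot; parameter is the Haar cascade filename described below.&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
import roslib
roslib.load_manifest(&amp;quot;rospy&amp;quot;)
roslib.load_manifest(&amp;quot;cv_bridge&amp;quot;)
roslib.load_manifest(&amp;quot;sensor_msgs&amp;quot;)
roslib.load_manifest(&amp;quot;geometry_msgs&amp;quot;)
import rospy, cv
from cv_bridge import CvBridge
from sensor_msgs.msg import Image
from geometry_msgs.msg import PointStamped

def image_callback(msg):
    img = bridge.imgmsg_to_cv(msg, &amp;quot;mono8&amp;quot;)
    faces = cv.HaarDetectObjects(img, cascade, cv.CreateMemStorage(0),
                                 1.2, 2, cv.CV_HAAR_DO_CANNY_PRUNING, (30, 30))
    if faces:
        (x, y, w, h), neighbors = faces[0]   # just use the first detection
        p = PointStamped()
        p.header = msg.header
        p.point.x = x + w / 2.0              # center of the bounding box
        p.point.y = y + h / 2.0
        coords_pub.publish(p)

rospy.init_node(&amp;quot;face_detector_sketch&amp;quot;)
bridge = CvBridge()
cascade = cv.Load(rospy.get_param(&amp;quot;~classifier&amp;quot;))   # Haar cascade xml file
coords_pub = rospy.Publisher(&amp;quot;/face_coords&amp;quot;, PointStamped)
rospy.Subscriber(&amp;quot;/stereo/left/image_rect&amp;quot;, Image, image_callback)
rospy.spin()
&lt;/pre&gt;
&lt;br /&gt;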
In order to actually run the demonstration face detector, you will need to use the &#039;&#039;&#039;roslaunch&#039;&#039;&#039; tool.  When you check out the above face_detection module from the HacDC repository, you will find inside the package a directory named &#039;&#039;&#039;launch&#039;&#039;&#039;.  Inside this directory is a ROS launch file designed to start up the face detection module.  The [http://www.ros.org/wiki/roslaunch roslaunch] system is useful for starting up a large number of nodes on your robot, instead of manually starting each node in a separate window using &#039;&#039;&#039;rosrun...&#039;&#039;&#039;.  Inside the (XML) launch file we also define a &amp;quot;private&amp;quot; parameter that is loaded to the ROS [http://www.ros.org/wiki/Parameter%20Server Parameter Server].  As mentioned in class, this parameter is named &amp;quot;classifier&amp;quot; and it contains the filename to the Haar Cascade used by the OpenCV face detection algorithm.&lt;br /&gt;
&lt;br /&gt;
In order to start the face detection example node, you may type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
As the assignment requests, the example face_detector node outputs two topics, one called &amp;quot;/face_view&amp;quot; and one called &amp;quot;/face_coords&amp;quot;.  You can view the &amp;quot;/face_view&amp;quot; image topic like you would any other image topic (using image_view):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/face_view&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
If you wish to view the &amp;quot;/face_coords&amp;quot; topic, you may use the simple rostopic echo command covered previously:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that the &amp;quot;/face_coords&amp;quot; topic is only transmitted (in the face detection example node at least) when faces are detected.&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_1&amp;diff=7973</id>
		<title>Robotics Class 2011/Assignment 1</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_1&amp;diff=7973"/>
		<updated>2012-09-08T00:43:26Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Category:RobotClass]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Note: This assignment assumes you have already installed ROS as well as the supplemental ROS package named [http://www.ros.org/wiki/irobot_create_2_1 irobot_create_2_1], an excellent ROS package that serves as a ROS driver for the iRobot Create robot series.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Write a ROS node that subscribes to the &amp;quot;sensorPacket&amp;quot; topic (a message of type irobot_create_2_1/SensorPacket) and uses values from within this message to represent the emotional state of the robot.  You will then publish the robot&#039;s emotional state as a message of type robot_emotions/EmotionalState on the topic named &amp;quot;robot_emotions&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
In order to start publishing the &amp;quot;sensorPacket&amp;quot; at home without a robot, you will need to check out the irobot_sensor_simulator ROS package, available from the HacDC ROS repository (several of you already installed it):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobot_sensor_simulator&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake irobot_sensor_simulator&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Once the make is done, you can start up the simulator by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun irobot_sensor_simulator sensor_simulator.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this node will not produce any output.  It will just sit there.  If you do a &amp;quot;rostopic list&amp;quot; at this point, you will see a sensorPacket topic, and you can display it using &amp;quot;rostopic echo sensorPacket&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
In order to start modifying the values to test your emotion generator, you will need to run the ROS [http://www.ros.org/wiki/dynamic_reconfigure dynamic_reconfigure] reconfiguration GUI interface.  This will provide the GUI interface illustrated in class that allows you to modify values in the sensorPacket.  You can start the reconfigure GUI by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun dynamic_reconfigure reconfigure_gui&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You will need to select irobot_sensor_simulator from the drop-down box, and you should then be presented with boxes and sliders that let you change the values of the sensor packet in real time.&lt;br /&gt;
&lt;br /&gt;
Note that these sliders allow you to configure the values of the sensors in an arbitrary fashion.  However, of course, there is a nominal state of the robot defined by having a fully charged battery and being properly oriented with its base firmly on the floor.&lt;br /&gt;
&lt;br /&gt;
A powerful feature of ROS is the ability to create archives, or &amp;quot;bag files&amp;quot; that contain sensor data that can be played back.  A bag file containing messages of type irobot_create_2_1/SensorPacket has been created and can be downloaded [http://hacdc-ros-pkg.googlecode.com/files/2011-06-08-20-32-02.bag here].  After downloading this bag file, it can be played back using the [http://www.ros.org/wiki/rosbag rosbag] package, which should be installed by default:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosbag play 2011-06-08-20-32-02.bag&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This bag file was recorded with a fully charged battery and with the robot on the floor.  At a point in the middle, the wall sensor was triggered with a piece of paper (the wall sensor indicates when the Roomba detects a wall to its right side, which is useful for wall-following).  Near the end of the bag file, the robot was lifted off the floor, causing the states of the wheel drop and cliff sensors to change.&lt;br /&gt;
&lt;br /&gt;
In order to obtain the definition of the robot_emotions/EmotionalState message, you will need to check out the robot_emotions ROS package from the HacDC ROS repository:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/robot_emotions&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmake robot_emotions&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can explore the structure of the two above message types by using rosmsg show in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmsg show irobot_create_2_1/SensorPacket&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmsg show robot_emotions/EmotionalState&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment Hint:&lt;br /&gt;
The robot_emotions ROS package contains, in addition to the definition of the robot_emotions/EmotionalState message, a complete (but very simple) solution to the homework.  Feel free to investigate it if you need help.  If you want to try running the emotion generator provided within the robot_emotions package, just type (after doing the rosmake robot_emotions above):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun robot_emotions emotion_generator.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
After typing this, a &amp;quot;rostopic list&amp;quot; command will show a new topic called robot_emotions, just as the assignment asked for.&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=7340</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=7340"/>
		<updated>2012-04-17T04:06:10Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the summer of 2011 Robotics Class at http://hacdc.org.  There is a HacDC robotics [https://groups.google.com/a/hacdc.org/group/Robotics/topics?lnk mailing list].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Image: Hacdc_robotics_2011_floating_faces.png | 800 px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic Linux knowledge (operating from the Unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native [http://www.ros.org/wiki/electric/Installation/Ubuntu Electric ROS installation] on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason we suggest that if you are trying to do the homeworks on your own that you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition, or even on an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The class is now over, but the homeworks are freely viewable and should be self-explanatory.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment continues the face detection work of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of paranormal activity. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
&#039;&#039;&#039;Field Trip!&#039;&#039;&#039;  Members of the 2011 HacDC robotics class went to visit [http://www.seas.gwu.edu/~drum/ Dr. Evan Drumwright] and the [http://www.willowgarage.com/pages/pr2/overview PR2] at GWU.&lt;br /&gt;
&lt;br /&gt;
[[Image: Hacdc-gw-trip.JPG | 500 px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Also, we&#039;ll have open question time where questions can be asked on any previous homework assignments and questions about how the homeworks were set up, for example the robot description for the simulation.  Anything goes!&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=6575</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=6575"/>
		<updated>2012-04-01T17:40:16Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 grid squares (meters) forward and one grid square (meter) left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  Assignment 3 included instructions on how to check out the floating_faces package from the HacDC ROS repository.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack so that the robot can navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics, and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which shows the same imagery as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot; but also puts boxes around recognized faces.  Also, when a face is detected, the face detector publishes on a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that messages are published &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
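&lt;br /&gt;
As a rough illustration of that idea, here is a small, hypothetical sketch of the &amp;quot;is this artifact moving?&amp;quot; check.  It assumes &amp;quot;/face_coords&amp;quot; carries the face center as a geometry_msgs/Point, which may not match the message type your own detector uses; the node name and thresholds are also made up for illustration:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Hypothetical sketch: watch /face_coords for a while and decide whether
# the detected face is drifting (i.e. the artifact is rotating).
import rospy
from geometry_msgs.msg import Point   # assumption about the /face_coords message type

WATCH_TIME = 10.0      # seconds to stare at the artifact
MOVE_THRESHOLD = 20.0  # pixels of drift that we will call paranormal

xs = []

def face_cb(msg):
    xs.append(msg.x)

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;ghost_check_sketch&#039;)
    rospy.Subscriber(&#039;/face_coords&#039;, Point, face_cb)
    rospy.sleep(WATCH_TIME)
    if xs and (max(xs) - min(xs)) &gt; MOVE_THRESHOLD:
        rospy.loginfo(&#039;Paranormal activity detected!&#039;)
    elif xs:
        rospy.loginfo(&#039;Artifact appears to be holding still... for now.&#039;)
    else:
        rospy.loginfo(&#039;No face detected at all.&#039;)
&lt;/pre&gt;&lt;br /&gt;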
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only rotate the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below specifies each artifact&#039;s body_name and also its position in the warehouse.  The body_names will be convenient when you want to rotate the artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force in relation to the object specified with the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that, to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine (smach_guard.py in the museum_guard ROS package obtained above) that have not only a target position, but also a target attitude, specified as a quaternion.  The orientation field of the navigation goal specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; being forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation, we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to end up facing to the left of its original orientation in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw).  This ends up pointing us left.  It is easy to convert between the notion of a yaw and a quaternion.  One way is to [http://www.wolframalpha.com/input/?i=euler+angles&amp;amp;a=*C.euler+angles-_*Formula.dflt-&amp;amp;a=*FP.EulerRotation.EAS-_e321&amp;amp;f3=Pi+%2F+2+radians&amp;amp;x=0&amp;amp;y=0&amp;amp;f=EulerRotation.th1_Pi+%2F+2+radians&amp;amp;f4=0&amp;amp;f=EulerRotation.th2_0&amp;amp;f5=0&amp;amp;f=EulerRotation.th3_0 type &amp;quot;euler angles&amp;quot; into Wolfram Alpha].  Doing this gives you a nice conversion utility.  There are a couple of important things to remember when using this conversion utility.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations so that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;beta_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;beta_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;beta_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;beta_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
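&lt;br /&gt;
If you prefer to compute the quaternion in code instead of in Wolfram Alpha, the tf library can do the conversion for you.  Below is a minimal, hypothetical sketch that converts a yaw into a quaternion and sends a navigation goal through the move_base action interface.  It assumes the standard &amp;quot;move_base&amp;quot; action name and a &amp;quot;/map&amp;quot; goal frame, and the goal coordinates are placeholders that you would replace with a pose in front of the artifact you want to inspect:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Hypothetical sketch: build a quaternion from a yaw angle and send it
# as part of a move_base navigation goal.
import math
import rospy
import actionlib
from tf.transformations import quaternion_from_euler
from geometry_msgs.msg import Quaternion
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;send_goal_sketch&#039;)
    client = actionlib.SimpleActionClient(&#039;move_base&#039;, MoveBaseAction)
    client.wait_for_server()

    # quaternion_from_euler(roll, pitch, yaw) returns (x, y, z, w);
    # a yaw of pi/2 gives approximately (0, 0, 0.707, 0.707), matching the values above.
    q = quaternion_from_euler(0.0, 0.0, math.pi / 2.0)

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = &#039;/map&#039;
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0   # placeholder goal position
    goal.target_pose.pose.position.y = 0.5   # placeholder goal position
    goal.target_pose.pose.orientation = Quaternion(*q)

    client.send_goal(goal)
    client.wait_for_result()
&lt;/pre&gt;&lt;br /&gt;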
&lt;br /&gt;
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_3&amp;diff=6574</id>
		<title>Robotics Class 2011/Assignment 3</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_3&amp;diff=6574"/>
		<updated>2012-04-01T17:34:47Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Write a ROS node that subscribes to the &amp;quot;/face_coords&amp;quot; topic that is generated from the previous face detector homework assignment and use the information provided in that topic to move the robot base by publishing messages on the &amp;quot;/cmd_vel&amp;quot; topic.  The goal of the assignment is to build a face tracker that attempts to move the robot to keep faces centered in the camera frame.  The &amp;quot;/cmd_vel&amp;quot; topic is of type &amp;quot;geometry_msgs/Twist&amp;quot;, and you can learn more about it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosmsg show geometry_msgs/Twist&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
For the simulated robot you will use for this assignment, it is important to know its coordinate system.  X is positive going forward, Y is positive going to the left, and Z is positive going up.  This is a right-handed coordinate system.  We spent a significant amount of time in class going over the importance of keeping track of the robot&#039;s coordinate system and always making sure you follow the right-hand rule when dealing with the robot coordinate frame.&lt;br /&gt;
&lt;br /&gt;
Because the robot we are using is a differential drive robot with two powered wheels, it can only move forward and backward and rotate in place.  Thus, for this assignment, when you publish on the &amp;quot;/cmd_vel&amp;quot; topic, you need only worry about populating the &#039;&#039;linear.x&#039;&#039; (forward and backward motion) and &#039;&#039;angular.z&#039;&#039; (turning left and right in place) components of the geometry_msgs/Twist structure.  Following the coordinate system above, moving the robot forward would require you to publish a geometry_msgs/Twist message with a positive &#039;&#039;linear.x&#039;&#039; component.  Rotating the robot in place to the left would require you to publish a geometry_msgs/Twist message with a positive value in the &#039;&#039;angular.z&#039;&#039; component.  No other fields need be populated in the geometry_msgs/Twist message other than &#039;&#039;linear.x&#039;&#039; and &#039;&#039;angular.z&#039;&#039;.  Remember that you must publish the geometry_msgs/Twist message on the &amp;quot;/cmd_vel&amp;quot; topic.  The robot is listening for messages on this topic and will move accordingly.&lt;br /&gt;
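&lt;br /&gt;
As a concrete illustration, here is a minimal sketch of a node that publishes such a message; the 0.1 m/s and 0.2 rad/s values are just gentle example speeds, not requirements:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Minimal sketch: drive the robot slowly forward while turning left by
# publishing geometry_msgs/Twist messages on /cmd_vel.
import rospy
from geometry_msgs.msg import Twist

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;cmd_vel_sketch&#039;)
    pub = rospy.Publisher(&#039;/cmd_vel&#039;, Twist)
    rate = rospy.Rate(10)  # publish at 10 Hz
    while not rospy.is_shutdown():
        msg = Twist()
        msg.linear.x = 0.1    # positive x: move forward
        msg.angular.z = 0.2   # positive z: rotate in place to the left
        pub.publish(msg)
        rate.sleep()
&lt;/pre&gt;&lt;br /&gt;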
&lt;br /&gt;
This assignment is more complicated than the first two, and requires several nodes to be running.  First, you must install gazebo and be able to run the simple gazebo empty world launch script.  The simple command to start gazebo is:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds empty_world.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
It may help to follow the ROS instructions [http://www.ros.org/wiki/simulator_gazebo/Tutorials/StartingGazebo here] in order to install and get gazebo running.&lt;br /&gt;
&lt;br /&gt;
The next step will be to make sure you have the &#039;&#039;gazebo_erratic_plugins&#039;&#039;.  These are extensions to gazebo to support differential drive robots (like the iRobot Create) that have two driven wheels.  If you installed ROS using Synaptic, you can search Synaptic for &amp;quot;erratic&amp;quot; and you should see a package named &#039;&#039;&#039;ros-electric-erratic-robot&#039;&#039;&#039;.  You will want to install this package through Synaptic.  If you compiled from source, you will want to check out, rosdep install, and rosmake the [http://www.ros.org/wiki/erratic_robot erratic_robot] package.&lt;br /&gt;
&lt;br /&gt;
Once that is complete, you can proceed to checking out the HacDC robot simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once that completes, you can start the robot in the simulation (make sure you have done the &#039;&#039;&#039;roslaunch gazebo_worlds...&#039;&#039;&#039; step above before doing this):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should then see the robot get inserted into the world.  At this point, the robot is up and running in the simulation and you can do a &#039;&#039;rostopic list&#039;&#039; to see a variety of message topics.  The simulation includes a camera with characteristics similar to the camera on the actual robot.  You can subscribe to the simulated robot&#039;s camera stream the same way you have done in the past:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Of course this world is not terribly interesting as it is completely empty.  You can add some excitement by using the &#039;&#039;&#039;floating_faces&#039;&#039;&#039; package available in the HacDC ROS repository:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/floating_faces&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake floating_faces&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once that is built, you can then launch the floating faces into the world:&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once the faces are in the simulation, it would be useful to be able to manually drive the robot around before trying to control it with your own control node.  You can do that by checking out the [http://www.ros.org/wiki/teleop_twist_keyboard teleop_twist_keyboard] ROS package and building it.  Once it is built, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun teleop_twist_keyboard teleop_twist_keyboard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can follow the onscreen instructions on how to use it, but it is advisable to slow the robot commands down by pushing the &amp;quot;z&amp;quot; key a few times, so that the &amp;quot;speed&amp;quot; is around 0.2.  This has been found to be a reasonable maximum speed for linear motion with this particular robot.  You can move forward by pressing &amp;quot;i&amp;quot; and turn left by pressing &amp;quot;j&amp;quot;, but these are all in the on-screen instructions when you run the teleop_twist_keyboard node.  At this point, if you are still subscribed to the &#039;&#039;/stereo/left/image_rect&#039;&#039; image stream, you should be able to drive around and see what the robot sees, including the faces on the cubes.&lt;br /&gt;
&lt;br /&gt;
Now, you could start up your face detection node from the previous assignment.  Since the simulated robot is publishing the image topic named &amp;quot;/stereo/left/image_rect&amp;quot;, your face detection system should work on the faces in the simulated world.&lt;br /&gt;
&lt;br /&gt;
In class we also went over PID control.  There is an excellent article on PID [http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf here] that is recommended reading.  Note that for this particular problem, the error term should be thought of as the number of pixels between the center of the detected face and a vertical line running down the center of the image frame.  Since the images from the camera are 352 pixels wide, the center of the image is at 176.  The x coordinate of the &amp;quot;/face_coords&amp;quot; messages can then be subtracted from 176 to find the error term.  Feed this error into the controller of your choice to produce a commanded velocity, which goes into the &#039;&#039;angular.z&#039;&#039; field of the geometry_msgs/Twist message.  Note that for this homework, it is recommended to start by only rotating the robot in place to keep the face in the center of the frame (i.e. only populate the &#039;&#039;angular.z&#039;&#039; of the geometry_msgs/Twist message).  Once you get that working you can think about moving the robot forward and backward to keep the face at a constant scale, but this will also require modifications to the &amp;quot;/face_coords&amp;quot; message since &amp;quot;/face_coords&amp;quot; currently only provides the center of the located face, not the size of the bounding rectangle.  Extra credit is assigned for getting these modifications into your solution.&lt;br /&gt;
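&lt;br /&gt;
To make the control loop concrete, here is a minimal, hypothetical proportional-only sketch of such a controller.  It assumes &amp;quot;/face_coords&amp;quot; carries the face center as a geometry_msgs/Point (adjust the import and callback if your Assignment 2 node publishes a different type), and the gain value is made up; for the full worked solution see the pid_control and face_follow packages below:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Hypothetical proportional-only face follower: turn the robot in place
# so that the detected face stays near the center column of the image.
import rospy
from geometry_msgs.msg import Twist, Point   # Point is an assumption for /face_coords

IMAGE_WIDTH = 352.0   # camera images are 352 pixels wide
KP = 0.005            # made-up proportional gain (rad/s per pixel of error)

class FaceFollowSketch(object):
    def __init__(self):
        self.pub = rospy.Publisher(&#039;/cmd_vel&#039;, Twist)
        rospy.Subscriber(&#039;/face_coords&#039;, Point, self.face_cb)

    def face_cb(self, msg):
        error = (IMAGE_WIDTH / 2.0) - msg.x   # pixels the face is left (+) or right (-) of center
        cmd = Twist()
        cmd.angular.z = KP * error            # turn toward the face
        self.pub.publish(cmd)

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;face_follow_sketch&#039;)
    FaceFollowSketch()
    rospy.spin()
&lt;/pre&gt;&lt;br /&gt;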
&lt;br /&gt;
There is a complete solution to the homework assignment if you are interested in studying it.  It can be checked out from the HacDC ROS repository.  There are two packages: the first is a generic PID control library and the second is the controller itself.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/pid_control&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake pid_control&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_follow&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_follow&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
To run the face tracker, you can launch it as follows:&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_follow follow.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Don&#039;t forget that for it to work you need to start a face detector first, either your own from the last homework or the example solution from the last assignment.&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=6573</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=6573"/>
		<updated>2012-04-01T17:27:10Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the summer of 2011 Robotics Class at http://hacdc.org.  There is a HacDC robotics [https://groups.google.com/a/hacdc.org/group/Robotics/topics?lnk mailing list].&lt;br /&gt;
&lt;br /&gt;
[[Image: Hacdc_robotics_2011_floating_faces.png | 800 px]]&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native [http://www.ros.org/wiki/electric/Installation/Ubuntu Electric ROS installation] on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason we suggest that if you are trying to do the homeworks on your own that you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition, or even on an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The class is now over, but the homeworks are freely viewable and should be self explanatory.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of paranormal activity. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
&#039;&#039;&#039;Field Trip!&#039;&#039;&#039;  Members of the 2011 HacDC robotics class went to visit [http://www.seas.gwu.edu/~drum/ Dr. Evan Drumwright] and the [http://www.willowgarage.com/pages/pr2/overview PR2] at GWU.&lt;br /&gt;
&lt;br /&gt;
[[Image: Hacdc-gw-trip.JPG | 500 px]]&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Also, we&#039;ll have open question time where questions can be asked on any previous homework assignments and questions about how the homeworks were set up, for example the robot description for the simulation.  Anything goes!&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=File:Hacdc_robotics_2011_floating_faces.png&amp;diff=6572</id>
		<title>File:Hacdc robotics 2011 floating faces.png</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=File:Hacdc_robotics_2011_floating_faces.png&amp;diff=6572"/>
		<updated>2012-04-01T17:23:54Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: Illustration of Gazebo running a simulated irobotron as well as texture mapped face cubes for experimenting with OpenCV face tracking within Gazebo.  These packages are freely available from the HacDC ROS repository.  The simulated robot is contained with&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Illustration of Gazebo running a simulated irobotron as well as texture mapped face cubes for experimenting with OpenCV face tracking within Gazebo.  These packages are freely available from the HacDC ROS repository.  The simulated robot is contained within the irobotron_description ROS package, while the texture mapped face cubes are contained within the floating_faces ROS package.&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=6571</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=6571"/>
		<updated>2012-04-01T17:18:16Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the summer of 2011 Robotics Class at http://hacdc.org.  There is a HacDC robotics [https://groups.google.com/a/hacdc.org/group/Robotics/topics?lnk mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native [http://www.ros.org/wiki/electric/Installation/Ubuntu Electric ROS installation] on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason we suggest that if you are trying to do the homeworks on your own that you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition, or even on an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The class is now over, but the homeworks are freely viewable and should be self explanatory.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of paranormal activity. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
&#039;&#039;&#039;Field Trip!&#039;&#039;&#039;  Members of the 2011 HacDC robotics class went to visit [http://www.seas.gwu.edu/~drum/ Dr. Evan Drumwright] and the [http://www.willowgarage.com/pages/pr2/overview PR2] at GWU.&lt;br /&gt;
&lt;br /&gt;
[[Image: Hacdc-gw-trip.JPG | 500 px]]&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Also, we&#039;ll have open question time where questions can be asked on any previous homework assignments and questions about how the homeworks were set up, for example the robot description for the simulation.  Anything goes!&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5570</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5570"/>
		<updated>2011-08-26T03:29:24Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the summer of 2011 Robotics Class at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native [http://www.ros.org/wiki/diamondback/Installation/Ubuntu Diamondback ROS installation] on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason we suggest that if you are trying to do the homeworks on your own that you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition, or even on an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The class is now over, but the homeworks are freely viewable and should be self explanatory.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of paranormal activity. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
&#039;&#039;&#039;Field Trip!&#039;&#039;&#039;  Members of the 2011 HacDC robotics class went to visit [http://www.seas.gwu.edu/~drum/ Dr. Evan Drumwright] and the [http://www.willowgarage.com/pages/pr2/overview PR2] at GWU.&lt;br /&gt;
&lt;br /&gt;
[[Image: Hacdc-gw-trip.JPG | 500 px]]&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Also, we&#039;ll have open question time where questions can be asked on any previous homework assignments and questions about how the homeworks were set up, for example the robot description for the simulation.  Anything goes!&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Welcome_to_the_HacDC_Wiki&amp;diff=5569</id>
		<title>Welcome to the HacDC Wiki</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Welcome_to_the_HacDC_Wiki&amp;diff=5569"/>
		<updated>2011-08-26T03:27:43Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{TOCright}}&amp;lt;span style=&amp;quot;font-size:large; line-height:1.5em; color:#222&amp;quot;&amp;gt;Welcome to HacDC. We are a hacker space located in Washington, DC. HacDC members improve the world by creatively rethinking technology. We break, build, and occasionally abuse technology in the pursuit of greater knowledge about how it works and repurposing it to build new things. Our shop is located in the [http://en.wikipedia.org/wiki/Columbia_Heights%2C_Washington%2C_D.C. Columbia Heights] neighborhood of DC.&amp;lt;/span&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If you&#039;re new here, you should start by looking at our public web site at [http://www.hacdc.org/ hacdc.org]. Our Wiki is mostly for internal organization, but you&#039;re welcome to look through to get a sense of what we work on (and consider joining yourself!).  We invite you to subscribe to our [http://www.hacdc.org/mailman/listinfo/announce announcement] (weekly e-mail) and [http://www.hacdc.org/mailman/listinfo/blabber blabber] (high traffic) mailing lists.&amp;lt;hr&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;float:left; width:48%;&amp;quot;&amp;gt;&lt;br /&gt;
=== Cool Stuff Going On Right Now ===&lt;br /&gt;
&lt;br /&gt;
* [[Electron Tube Class]]&lt;br /&gt;
* [[Byzantium]]&lt;br /&gt;
* [[3D Printing]]&lt;br /&gt;
* &#039;&#039;&#039;Microcontroller Mondays&#039;&#039;&#039; (BYO project to work on, get help on, or show off...or just come to check us out!)&lt;br /&gt;
&lt;br /&gt;
=== Top Links ===&lt;br /&gt;
Themes and threads that span across the other categories in the Wiki.  This is also where we keep things that don&#039;t easily fit in other categories:&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;[[New Space]]&#039;&#039;&#039;&lt;br /&gt;
* [[GettingHere]]&lt;br /&gt;
* [[:Category:Meeting Minutes | Meeting Minutes]]&lt;br /&gt;
* [[:Category:Meeting Agendas | Meeting Agendas]]&lt;br /&gt;
* [[:Category:Classes | Classes]]&lt;br /&gt;
* [[:Category:Hacker Travel | Hacker Travel]]&lt;br /&gt;
&lt;br /&gt;
=== [[:Category:Projects]] ===&lt;br /&gt;
Where our projects collaborate and document their research and progress.  They are generally classified as:&lt;br /&gt;
* [[:Category:Ongoing Projects | Ongoing Projects]]&lt;br /&gt;
* [[:Category:Proposed Projects | Proposed Projects]]&lt;br /&gt;
* [[:Category:Abandoned Projects | Abandoned Projects]]&lt;br /&gt;
&lt;br /&gt;
=== [[:Category:In the Space | Records]] ===&lt;br /&gt;
Entries relating to the space, including layout, events, and classes we hold&lt;br /&gt;
* [[:SpaceSearch | Space Search: the search for a new space]]&lt;br /&gt;
* [[:Category:Space Configuration | Configuration]]&lt;br /&gt;
* [[:Category:Event Planning | Event Planning]]&lt;br /&gt;
* [[:Category:Classes | Classes]]&lt;br /&gt;
* [[Wishlist]]&lt;br /&gt;
* [[:Category:What_I_Stole | What I Stole]]&lt;br /&gt;
* [[Human_Resources | Workspace Access]]&lt;br /&gt;
* [[Central_Services | HacDC Leadership]]&lt;br /&gt;
* [[Inventory | Inventory]]&lt;br /&gt;
* [[Procurement | Procurement]]&lt;br /&gt;
* [[Suppliers | Suppliers]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div style=&amp;quot;float:right; width:48%;&amp;quot;&amp;gt; &amp;lt;!-- This width adds to the margin above to equal 100 %--&amp;gt;&lt;br /&gt;
=== Events ===&lt;br /&gt;
&amp;lt;googlecalendar&amp;gt;?showTitle=0&amp;amp;amp;showPrint=0&amp;amp;amp;showCalendars=0&amp;amp;amp;mode=AGENDA&amp;amp;amp;height=250&amp;amp;amp;wkst=1&amp;amp;amp;bgcolor=%23FFFFFF&amp;amp;amp;src=c0jnbtagjrjs0h1o00jqvauduflv24ca%40import.calendar.google.com&amp;amp;amp;color=%2328754E&amp;amp;amp;ctz=America%2FNew_York&amp;quot; style=&amp;quot; border-width:0 &amp;quot; width=&amp;quot;500&amp;quot; height=&amp;quot;250&amp;quot; frameborder=&amp;quot;0&amp;quot; scrolling=&amp;quot;no&amp;quot;&amp;gt;&amp;lt;/googlecalendar&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A [http://hacdc.org/ics iCal feed] is also available, for enjoying our events from your favorite calendaring software.&lt;br /&gt;
&lt;br /&gt;
=== Past Cool Stuff That Will Happen Again ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
* Max/MSP Workshop&lt;br /&gt;
* [[Quantified Self]]&lt;br /&gt;
* [[Intro to Programming]]&lt;br /&gt;
* [[AVR Microcontroller Class 2011]]&lt;br /&gt;
* [[Great Global Hackerspace Challenge]]&lt;br /&gt;
* [[Linux Class]]&lt;br /&gt;
* [[TECS | The Elements of Computing Systems: Building a Modern Computer from First Principles]]&lt;br /&gt;
* [[Bike Maintenance Class]]&lt;br /&gt;
* [[HacDC Spaceblimp 5]]&lt;br /&gt;
* [[Robotics Class 2011]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== [[:Category:Community]] ===&lt;br /&gt;
Pages on people in our community :&lt;br /&gt;
* [[:Category:Members | Member User Pages]] &lt;br /&gt;
* [[:Category:Friends | Friends of HacDC]]&lt;br /&gt;
* [[:Category:Heroes | Heroes and people who inspire us]]&lt;br /&gt;
&lt;br /&gt;
=== [[:Category:Policy]] ===&lt;br /&gt;
Our Policy Manual is divided into the following subcategories:&lt;br /&gt;
* [[Articles_of_Incorporation|Articles of Incorporation]]&lt;br /&gt;
* [[Bylaws]]&lt;br /&gt;
* [[MIBS_Simplified_Rules_of_Coordinated_Consensus_through_Chaos | Meeting Rules]]&lt;br /&gt;
* [[Privacy_Policy | Privacy Policy]]&lt;br /&gt;
* [[Resource_Use_Policy | Resource Use Policy]]&lt;br /&gt;
* [[Resource_Disposal | Resource Disposal Policy]]&lt;br /&gt;
* [[:Category:Communications Policy | Communications Policy]]&lt;br /&gt;
* [[Licensing_Policy | Licensing Policy]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/div&amp;gt;&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=File:Hacdc-gw-trip.JPG&amp;diff=5542</id>
		<title>File:Hacdc-gw-trip.JPG</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=File:Hacdc-gw-trip.JPG&amp;diff=5542"/>
		<updated>2011-08-09T12:25:26Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: uploaded a new version of &amp;amp;quot;File:Hacdc-gw-trip.JPG&amp;amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Members of the 2011 HacDC robotics class visit Dr. Evan Drumwright and the Willow Garage PR2 at GWU.&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5541</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5541"/>
		<updated>2011-08-09T11:52:53Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate up front with a $150 donation because I may be unable to teach all six planned classes due to schedule conflicts or work load (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native [http://www.ros.org/wiki/diamondback/Installation/Ubuntu Diamondback ROS installation] on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason we suggest that if you are trying to do the homeworks on your own that you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition, or even on an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of paranormal activity. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
&#039;&#039;&#039;Field Trip!&#039;&#039;&#039;  Members of the 2011 HacDC robotics class went to visit [http://www.seas.gwu.edu/~drum/ Dr. Evan Drumwright] and the [http://www.willowgarage.com/pages/pr2/overview PR2] at GWU.&lt;br /&gt;
&lt;br /&gt;
[[Image: Hacdc-gw-trip.JPG | 500 px]]&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Also, we&#039;ll have open question time where questions can be asked on any previous homework assignments and questions about how the homeworks were set up, for example the robot description for the simulation.  Anything goes!&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=File:Hacdc-gw-trip.JPG&amp;diff=5539</id>
		<title>File:Hacdc-gw-trip.JPG</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=File:Hacdc-gw-trip.JPG&amp;diff=5539"/>
		<updated>2011-08-09T03:38:01Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Members of the 2011 HacDC robotics class visit Dr. Evan Drumwright and the Willow Garage PR2 at GWU.&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5537</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5537"/>
		<updated>2011-08-09T02:16:46Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate up front with a $150 donation because I may be unable to teach all six planned classes due to schedule conflicts or work load (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native [http://www.ros.org/wiki/diamondback/Installation/Ubuntu Diamondback ROS installation] on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason we suggest that if you are trying to do the homeworks on your own that you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition, or even on an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of paranormal activity. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
&#039;&#039;&#039;Field Trip!&#039;&#039;&#039;  The 2011 class went to visit [http://www.seas.gwu.edu/~drum/ Dr. Evan Drumwright] and the [http://www.willowgarage.com/pages/pr2/overview PR2] at GWU.&lt;br /&gt;
&lt;br /&gt;
[[Image: Hacdc-gw-trip.JPG | 500 px]]&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Also, we&#039;ll have open question time where questions can be asked on any previous homework assignments and questions about how the homeworks were set up, for example the robot description for the simulation.  Anything goes!&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=File:Hacdc-gw-trip.JPG&amp;diff=5536</id>
		<title>File:Hacdc-gw-trip.JPG</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=File:Hacdc-gw-trip.JPG&amp;diff=5536"/>
		<updated>2011-08-09T02:04:36Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: HacDC Visits Dr. Evan Drumwright and the Willow Garage PR2 at GWU.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;HacDC Visits Dr. Evan Drumwright and the Willow Garage PR2 at GWU.&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5477</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5477"/>
		<updated>2011-07-18T01:27:52Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 grid squares (meters) forward and one grid square (meter) left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  Assignment 3 included instructions on how to check out the floating_faces package from the HacDC ROS repository.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool for looking at TF frames, navigation stack topics, and much more.  You can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Even better, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which shows the same imagery as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot; but also draws boxes around recognized faces.  In addition, when a face is detected, the face detector publishes a message on the &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (note, however, that messages appear on this topic &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
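As an illustration of this idea (this node is &#039;&#039;not&#039;&#039; part of the provided packages), a minimal sketch of a watcher for the &amp;quot;/face_coords&amp;quot; topic might look like the following.  It assumes the message carries the face position as x and y image coordinates in a geometry_msgs/Point; check the actual message type with &amp;quot;rostopic info /face_coords&amp;quot;.  The node name, window size, and threshold are illustrative only:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Illustrative sketch only: flags movement of a detected face while the
# robot is parked.  Assumes /face_coords is a geometry_msgs/Point with the
# face position in image coordinates (verify with: rostopic info /face_coords).
import rospy
from geometry_msgs.msg import Point

positions = []      # recently observed face positions
WINDOW = 20         # number of samples to compare
THRESHOLD = 5.0     # pixels of drift we treat as paranormal

def face_cb(msg):
    positions.append((msg.x, msg.y))
    if len(positions) &lt; WINDOW:
        return
    xs = [p[0] for p in positions[-WINDOW:]]
    ys = [p[1] for p in positions[-WINDOW:]]
    # A face that drifts while the robot sits still means the artifact
    # is moving on its own.
    if (max(xs) - min(xs)) &gt; THRESHOLD or (max(ys) - min(ys)) &gt; THRESHOLD:
        rospy.logwarn(&#039;Paranormal activity suspected!&#039;)

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;ghost_watcher&#039;)
    rospy.Subscriber(&#039;/face_coords&#039;, Point, face_cb)
    rospy.spin()
&lt;/pre&gt;
&lt;br /&gt;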
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight total artifacts.  The list below specifies each artifact&#039;s body_name and also its position in the warehouse.  The body_names will be convenient when you want to rotate the artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force in relation to the object specified with the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command, as shown in the example after the list.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
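For example, to start the second artifact rotating instead of the first, you would use the same incantation with only the body_name changed:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;david_model::david_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;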
Note that to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine (smach_guard.py in the museum_guard ROS package obtained above) that have not only a target position, but also a target attitude, specified as a quaternion.  The orientation in the navigation goal specifies the robot&#039;s ending attitude.  Since the robot&#039;s coordinate system has &#039;&#039;x&#039;&#039; forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to be facing to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw), which ends up pointing us left.  It is easy to convert between a yaw and a quaternion.  One way is to [http://www.wolframalpha.com/input/?i=euler+angles&amp;amp;a=*C.euler+angles-_*Formula.dflt-&amp;amp;a=*FP.EulerRotation.EAS-_e321&amp;amp;f3=Pi+%2F+2+radians&amp;amp;x=0&amp;amp;y=0&amp;amp;f=EulerRotation.th1_Pi+%2F+2+radians&amp;amp;f4=0&amp;amp;f=EulerRotation.th2_0&amp;amp;f5=0&amp;amp;f=EulerRotation.th3_0 type &amp;quot;euler angles&amp;quot; into Wolfram Alpha].  Doing this gives you a nice conversion utility.  There are a couple of important things to remember when using it.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
&lt;br /&gt;
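If you prefer to compute the quaternion in code, here is a minimal sketch of sending a single navigation goal with a yaw.  This is &#039;&#039;not&#039;&#039; the provided smach_guard.py, just an illustration; the goal coordinates and the &#039;map&#039; frame below are placeholders, and it assumes the standard tf and move_base Python interfaces:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Illustrative sketch: send one move_base goal whose orientation is built
# from a yaw angle.  The goal position below is a placeholder.
import math
import rospy
import actionlib
from tf.transformations import quaternion_from_euler
from geometry_msgs.msg import Quaternion
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;goal_example&#039;)
    client = actionlib.SimpleActionClient(&#039;move_base&#039;, MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = &#039;map&#039;
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 2.0    # placeholder goal position (m)
    goal.target_pose.pose.position.y = 0.0
    # quaternion_from_euler(roll, pitch, yaw) returns (x, y, z, w);
    # a pi/2 yaw gives (0, 0, 0.707, 0.707) as in the example above.
    q = quaternion_from_euler(0.0, 0.0, math.pi / 2)
    goal.target_pose.pose.orientation = Quaternion(*q)

    client.send_goal(goal)
    client.wait_for_result()
&lt;/pre&gt;
&lt;br /&gt;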
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5476</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5476"/>
		<updated>2011-07-18T01:25:03Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 grid squares (meters) forward and one grid square (meter) left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool for looking at TF frames, navigation stack topics, and much more.  You can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight total artifacts.  The list below specifies each artifact&#039;s body_name and also its position in the warehouse.  The body_names will be convenient when you want to rotate the artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force in relation to the object specified with the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine (smach_guard.py in the museum_guard ROS package obtained above) that have not only a target position, but also a target attitude, specified as a quaternion.  The orientation in the navigation goal specifies the robot&#039;s ending attitude.  Since the robot&#039;s coordinate system has &#039;&#039;x&#039;&#039; forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to be facing to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw), which ends up pointing us left.  It is easy to convert between a yaw and a quaternion.  One way is to [http://www.wolframalpha.com/input/?i=euler+angles&amp;amp;a=*C.euler+angles-_*Formula.dflt-&amp;amp;a=*FP.EulerRotation.EAS-_e321&amp;amp;f3=Pi+%2F+2+radians&amp;amp;x=0&amp;amp;y=0&amp;amp;f=EulerRotation.th1_Pi+%2F+2+radians&amp;amp;f4=0&amp;amp;f=EulerRotation.th2_0&amp;amp;f5=0&amp;amp;f=EulerRotation.th3_0 type &amp;quot;euler angles&amp;quot; into Wolfram Alpha].  Doing this gives you a nice conversion utility.  There are a couple of important things to remember when using it.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
&lt;br /&gt;
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5475</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5475"/>
		<updated>2011-07-16T23:05:19Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 grid squares (meters) forward and one grid square (meter) left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool for looking at TF frames, navigation stack topics, and much more.  You can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight total artifacts.  The list below specifies each artifact&#039;s body_name and also its position in the warehouse.  The body_names will be convenient when you want to rotate the artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force in relation to the object specified with the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine (smach_guard.py in the museum_guard ROS package obtained above) that have not only a target position, but also a target attitude, specified as a quaternion.  The orientation in the navigation goal specifies the robot&#039;s ending attitude.  Since the robot&#039;s coordinate system has &#039;&#039;x&#039;&#039; forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to be facing to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw), which ends up pointing us left.  It is easy to convert between a yaw and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha].  Doing this gives you a nice conversion utility.  There are a couple of important things to remember when using it.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
&lt;br /&gt;
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5474</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5474"/>
		<updated>2011-07-16T23:03:05Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 grid squares (meters) forward and one grid square (meter) left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool for looking at TF frames, navigation stack topics, and much more.  You can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight total artifacts.  The list below specifies each artifact&#039;s body_name and also its position in the warehouse.  The body_names will be convenient when you want to rotate the artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force in relation to the object specified with the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine that have not only a target position, but also a target attitude, specified as a quaternion.  The orientation in the navigation goal specifies the robot&#039;s ending attitude.  Since the robot&#039;s coordinate system has &#039;&#039;x&#039;&#039; forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to be facing to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw), which ends up pointing us left.  It is easy to convert between a yaw and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha].  Doing this gives you a nice conversion utility.  There are a couple of important things to remember when using it.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
&lt;br /&gt;
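If you would rather do the yaw-to-quaternion conversion in code, here is a minimal sketch, assuming the standard ROS &#039;&#039;&#039;tf&#039;&#039;&#039; package is available on your installation.  The printed values are the ones you would paste into the navigation goal:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Minimal sketch: convert a yaw angle into the (x, y, z, w) quaternion
# expected by the navigation stack.  Assumes the standard ROS tf package.
import math
from tf.transformations import quaternion_from_euler

yaw = math.pi / 2.0  # face to the left of the original orientation
x, y, z, w = quaternion_from_euler(0.0, 0.0, yaw)  # args are (roll, pitch, yaw)
print x, y, z, w     # approximately 0.0 0.0 0.707 0.707
&lt;/pre&gt;
&lt;br /&gt;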
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5473</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5473"/>
		<updated>2011-07-16T23:01:34Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo, so that we start in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 grid squares (meters) forward and one grid square (meter) left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next, you can start the ROS navigation stack to allow the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to start seeing what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
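To turn this into an automated check, your node can subscribe to &amp;quot;/face_coords&amp;quot; and flag an artifact as suspicious whenever the reported coordinates keep changing while the robot is sitting still.  The sketch below is &#039;&#039;not&#039;&#039; the provided solution; it assumes, purely for illustration, that the topic carries a geometry_msgs/Point message, so check the real message type with &amp;quot;rostopic type /face_coords&amp;quot; and adjust the subscriber accordingly:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Sketch of a paranormal-activity check: watch /face_coords and report when
# the detected face keeps moving between detections.  The message type used
# here (geometry_msgs/Point) is an assumption made for illustration only.
import rospy
from geometry_msgs.msg import Point

last = [None]  # most recent face coordinate seen so far

def callback(msg):
    if last[0] is not None:
        motion = abs(msg.x - last[0].x) + abs(msg.y - last[0].y)
        if motion &gt; 5.0:  # pixels of motion between detections (tunable)
            rospy.loginfo(&#039;Face is moving -- possible paranormal activity!&#039;)
    last[0] = msg

rospy.init_node(&#039;ghost_watcher&#039;)
rospy.Subscriber(&#039;/face_coords&#039;, Point, callback)
rospy.spin()
&lt;/pre&gt;
&lt;br /&gt;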
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only rotate the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above; in this case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below gives each artifact&#039;s body_name and its position in the warehouse.  The body_names will be convenient when you want to rotate the artifacts with the above incantation, and the positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force in relation to the object specified with the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get a different artifact to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that, to make the assignment slightly harder, some of the artifacts are on different walls than the first one.  This requires you to specify navigation goals in your smach state machine that have not only a target position but also a target attitude, specified as a quaternion.  The navigation goal includes an orientation that specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to have the robot face to the left of its original orientation in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw), which ends up pointing us left.  It is easy to convert between a yaw angle and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha], which gives you a nice conversion utility.  There are a couple of important things to remember when using it.  First, make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
&lt;br /&gt;
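If you are building this goal in Python, here is a hedged sketch of what it can look like, using the ROS navigation stack&#039;s standard move_base action.  The frame name and the coordinates (taken from the eric_model entry in the list above) are illustrative; in practice you will want the robot to stop a short distance in front of the artifact rather than on top of it, so treat the exact numbers as placeholders:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Sketch of a navigation goal that carries both a position and an orientation,
# suitable for wrapping in a smach_ros SimpleActionState.  The frame name and
# coordinates are illustrative assumptions; adjust them for your own setup.
import smach_ros
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = &#039;/map&#039;
goal.target_pose.pose.position.x = 1.6       # eric_model position from the list
goal.target_pose.pose.position.y = 4.0
goal.target_pose.pose.orientation.z = 0.707  # quaternion for a (pi/2) yaw
goal.target_pose.pose.orientation.w = 0.707

# Inside the state machine, a goal like this can become a navigation state:
#   smach_ros.SimpleActionState(&#039;move_base&#039;, MoveBaseAction, goal=goal)
&lt;/pre&gt;
&lt;br /&gt;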
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5472</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5472"/>
		<updated>2011-07-16T22:41:30Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo, so that we start in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next, you can start the ROS navigation stack to allow the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to start seeing what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only rotate the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above; in this case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below gives each artifact&#039;s body_name and its position in the warehouse.  The body_names will be convenient when you want to rotate the artifacts with the above incantation, and the positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force in relation to the object specified with the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get a different artifact to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that, to make the assignment slightly harder, some of the artifacts are on different walls than the first one.  This requires you to specify navigation goals in your smach state machine that have not only a target position but also a target attitude, specified as a quaternion.  The navigation goal includes an orientation that specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to have the robot face to the left of its original orientation in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw), which ends up pointing us left.  It is easy to convert between a yaw angle and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha], which gives you a nice conversion utility.  There are a couple of important things to remember when using it.  First, make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
&lt;br /&gt;
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5471</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5471"/>
		<updated>2011-07-16T22:39:55Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native [http://www.ros.org/wiki/diamondback/Installation/Ubuntu Diamondback ROS installation] on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason, we suggest that if you are trying to do the homework on your own, you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition or even onto an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of paranormal activity. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5470</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5470"/>
		<updated>2011-07-16T22:38:45Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native [http://www.ros.org/wiki/diamondback/Installation/Ubuntu Diamondback ROS installation] on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason, we suggest that if you are trying to do the homework on your own, you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition or even onto an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of REDACTED. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5469</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5469"/>
		<updated>2011-07-16T22:37:53Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
A native ROS installation on your computer.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason, we suggest that if you are trying to do the homework on your own, you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition or even onto an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of REDACTED. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5468</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5468"/>
		<updated>2011-07-16T22:37:01Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
Laptop with [http://www.virtualbox.org/ VirtualBox] installed, capable of running a 30GB disk image that requires 1 GB of RAM, or alternatively a native ROS installation on your computer.  A VirtualBox disk image will be distributed during class that has a complete Diamondback ROS installation running on Ubuntu 10.04 LTS.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason, we suggest that if you are trying to do the homework on your own, you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition or even onto an external USB disk.&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work of the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of REDACTED. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5467</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5467"/>
		<updated>2011-07-16T22:36:35Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
Laptop with [http://www.virtualbox.org/ VirtualBox] installed, capable of running a 30GB disk image that requires 1 GB of RAM, or alternatively a native ROS installation on your computer.  A VirtualBox disk image will be distributed during class that has a complete Diamondback ROS installation running on Ubuntu 10.04 LTS.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;VirtualBox note: It appears that trying to use the gazebo simulator with VirtualBox is problematic.  For this reason, we suggest that if you are trying to do the homework on your own, you use a native installation of Ubuntu on your machine.  There are numerous ways of installing Ubuntu, including onto a separate partition or even onto an external USB disk.&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work from the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of REDACTED. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5466</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5466"/>
		<updated>2011-07-16T22:33:24Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment-cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment-cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
Laptop with [http://www.virtualbox.org/ VirtualBox] installed, capable of running a 30GB disk image that requires 1 GB of RAM, or alternatively a native ROS installation on your computer.  A VirtualBox disk image will be distributed during class that has a complete Diamondback ROS installation running on Ubuntu 10.04 LTS. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work from the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of REDACTED. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, W. Burgard, D. Fox, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5465</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5465"/>
		<updated>2011-07-16T22:32:36Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment-cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment-cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
Laptop with [http://www.virtualbox.org/ VirtualBox] installed, capable of running a 30GB disk image that requires 1 GB of RAM, or alternatively a native ROS installation on your computer.  A VirtualBox disk image will be distributed during class that has a complete Diamondback ROS installation running on Ubuntu 10.04 LTS. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work from the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of REDACTED. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
A standard text on probabilistic robotics:&lt;br /&gt;
[http://www.probabilistic-robotics.org/ S. Thrun, Probabilistic Robotics]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5464</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5464"/>
		<updated>2011-07-16T22:18:00Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment-cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment-cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
Laptop with [http://www.virtualbox.org/ VirtualBox] installed, capable of running a 30GB disk image that requires 1 GB of RAM, or alternatively a native ROS installation on your computer.  A VirtualBox disk image will be distributed during class that has a complete Diamondback ROS installation running on Ubuntu 10.04 LTS. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work from the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of REDACTED. [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5463</id>
		<title>Robotics Class 2011</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011&amp;diff=5463"/>
		<updated>2011-07-16T22:17:42Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is the class page for the 2011 Robotics Class going on over the summer at http://hacdc.org.  There is a HacDC robotics [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list].&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;WORK IN PROGRESS&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course cost:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The 2011 Robotics Class will require a $25 donation per class.  Do not bring cash to class; instead, bring a printed (or electronic) copy of a Paypal eCheck payment-cleared notice that proves you have donated to HacDC for the interval leading up to the next class.  If you are a member, bring your monthly Paypal payment-cleared email.  [http://www.hacdc.org/donate Donations can be made through the normal HacDC donation method].  You will not be admitted to the class without proof of an up-to-date donation.  Also, do not donate $150 up front, because I may be unable to teach all six planned classes due to schedule conflicts or workload (at my day job).  I am hoping to be able to teach six classes, however.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course prerequisites:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Basic knowledge of python programming&lt;br /&gt;
&lt;br /&gt;
Basic linux knowledge (operating from the unix command line).  Ubuntu 10.04 LTS will be used for this class.&lt;br /&gt;
&lt;br /&gt;
Laptop with [http://www.virtualbox.org/ VirtualBox] installed, capable of running a 30GB disk image that requires 1 GB of RAM, or alternatively a native ROS installation on your computer.  A VirtualBox disk image will be distributed during class that has a complete Diamondback ROS installation running on Ubuntu 10.04 LTS. &lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Registering:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;Unfortunately the class is full&#039;&#039;.  However, you can still subscribe to the [http://hacdc.org/cgi-bin/mailman/listinfo/robotics mailing list] and follow along with the assignments.&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course syllabus:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Class 1 (June 4, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to [http://www.ros.org ROS] and the robot we will be using for the class.  Assignment will involve interpreting robot sensor state and developing a robot &amp;quot;mood metric&amp;quot;.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_1 More information regarding Assignment 1 &amp;quot;Endowing a robot with emotions&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 2 (June 18, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to computer vision and face detection using [http://www.ros.org/wiki/vision_opencv vision_opencv].  Assignment will involve processing image data and using the OpenCV toolkit to do face detection (the routine is already provided in OpenCV).  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_2 More information regarding Assignment 2 &amp;quot;Detecting Faces&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 3 (July 2, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to moving the robot base.  The assignment will continue the face detection work from the previous class: we will move the robot to track a face, both rotating the base to keep a constant face position and moving the base forward and backward to keep a constant face scale.  [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_3 More information regarding Assignment 3 &amp;quot;Tracking CLUs on the Grid&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 4 (July 16, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Using the [http://www.ros.org/wiki/smach smach] executive to write robot state machines.  Assignment will involve writing a state machine that allows the robot to [http://www.ros.org/wiki/navigation navigate] through the secret HacDC warehouse and investigate the recent claims of REDACTED [http://wiki.hacdc.org/index.php/Robotics_Class_2011/Assignment_4 More information regarding Assignment 4 &amp;quot;Paranormal Activities in the HacDC Antiquities Warehouse&amp;quot;]&lt;br /&gt;
&lt;br /&gt;
Class 5 (August 6, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Introduction to stereo vision and point clouds.  Visualizing 3D point cloud data using [http://www.ros.org/wiki/rviz rviz].  Using [http://www.ros.org/wiki/tf tf] to transform point clouds into the robot base frame.  Assignment will involve trying to find the floor in the point cloud.&lt;br /&gt;
&lt;br /&gt;
Class 6 (August 20, 2011 &#039;&#039;&#039;10:30am - 12:30pm&#039;&#039;&#039;)&lt;br /&gt;
Undecided&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&#039;&#039;Course Links:&#039;&#039;&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
The iRobot Open Interface Document will be useful for the class:&lt;br /&gt;
[http://www.irobot.com/hrd_right_rail/create_rr/create_fam/createFam_rr_manuals.html iRobot Create Manuals]&lt;br /&gt;
The HacDC ROS repository will also be useful:&lt;br /&gt;
[http://code.google.com/p/hacdc-ros-pkg/source/checkout HacDC ROS Repository]&lt;br /&gt;
An excellent article on PID control:&lt;br /&gt;
[http://www.eetimes.com/ContentEETimes/Documents/Embedded.com/2000/f-wescot.pdf T. Wescott, &amp;quot;PID Without a PhD&amp;quot;]&lt;br /&gt;
An excellent book on planning algorithms:&lt;br /&gt;
[http://planning.cs.uiuc.edu S. LaValle, Planning Algorithms]&lt;br /&gt;
&lt;br /&gt;
[[Category:Classes]]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5462</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5462"/>
		<updated>2011-07-16T22:15:01Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  You can look at the launch file, named move_base.launch, for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics, and all kinds of other stuff.  You can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
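&lt;br /&gt;
If you are curious how such a two-state machine is wired together before digging into smach_guard.py, here is a minimal sketch in the same spirit.  This is &#039;&#039;not&#039;&#039; the actual museum_guard code; the state bodies are placeholders, and a real guard would send navigation goals and watch the face detector:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Minimal two-state smach sketch (placeholder logic only).
import rospy
import smach

class GotoHall1(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=[&#039;arrived&#039;, &#039;failed&#039;])
    def execute(self, userdata):
        rospy.loginfo(&#039;Navigating to hall 1...&#039;)   # real code: send a navigation goal here
        return &#039;arrived&#039;

class InspectArtifact1(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=[&#039;done&#039;])
    def execute(self, userdata):
        rospy.loginfo(&#039;Inspecting artifact 1...&#039;)  # real code: monitor /face_coords here
        return &#039;done&#039;

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;guard_sketch&#039;)
    sm = smach.StateMachine(outcomes=[&#039;finished&#039;, &#039;aborted&#039;])
    with sm:
        smach.StateMachine.add(&#039;GOTO_HALL_1&#039;, GotoHall1(),
                               transitions={&#039;arrived&#039;: &#039;INSPECT_ARTIFACT_1&#039;,
                                            &#039;failed&#039;: &#039;aborted&#039;})
        smach.StateMachine.add(&#039;INSPECT_ARTIFACT_1&#039;, InspectArtifact1(),
                               transitions={&#039;done&#039;: &#039;finished&#039;})
    sm.execute()
&lt;/pre&gt;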
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
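&lt;br /&gt;
As a starting point, here is a rough sketch of such a watcher.  Note that the message type of &amp;quot;/face_coords&amp;quot; is an assumption here; check it with &#039;&#039;&#039;rostopic type /face_coords&#039;&#039;&#039; and adjust the import and field names accordingly:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Sketch: flag paranormal activity when the detected face position keeps changing.
# ASSUMPTION: /face_coords carries a geometry_msgs/Point; verify with rostopic type.
import rospy
from geometry_msgs.msg import Point

last = [None]   # previous face position

def face_cb(msg):
    if last[0] is not None:
        dx = msg.x - last[0].x
        dy = msg.y - last[0].y
        # If the face has drifted more than a few pixels, the artifact is probably rotating.
        if abs(dx) &gt; 5 or abs(dy) &gt; 5:
            rospy.logwarn(&#039;Paranormal activity suspected: face moved (%.1f, %.1f)&#039;, dx, dy)
    last[0] = msg

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;ghost_watch&#039;)
    rospy.Subscriber(&#039;/face_coords&#039;, Point, face_cb)
    rospy.spin()
&lt;/pre&gt;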
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo modifies the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above; in this case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below specifies each artifact&#039;s body_name and its position in the warehouse.  The body_names will be convenient when you want to rotate artifacts with the above incantation, and the positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below; the &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force relative to the object named by the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command (see the example after the list below).  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
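&lt;br /&gt;
For example, following the note above, to make the &#039;&#039;eric&#039;&#039; artifact rotate instead of the first one, issue the same incantation with only the &#039;&#039;body_name&#039;&#039; changed:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;eric_model::eric_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;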
&lt;br /&gt;
Note that, to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine that have not only a target position, but also a target attitude, specified as a quaternion.  The navigation goal includes an orientation that specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; being forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation, we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to face to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw), which ends up pointing us left.  It is easy to convert between a yaw and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha], which gives you a nice conversion utility.  There are a couple of important things to remember when using this conversion utility.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations so that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You can then take these parameters and copy them into your navigation goal, and hopefully the robot should do what you command!&lt;br /&gt;
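&lt;br /&gt;
If you would rather compute the quaternion in code than in Wolfram Alpha, the tf package provides a helper.  Here is a small sketch that converts a yaw into a quaternion and publishes a single navigation goal; it assumes the usual navigation stack conventions (goal topic &amp;quot;/move_base_simple/goal&amp;quot; and goals expressed in the map frame), so adjust those names to match our launch files if they differ:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Sketch: turn a yaw angle into a quaternion and send it as a navigation goal.
# ASSUMPTIONS: goal topic /move_base_simple/goal and frame_id /map (standard nav stack defaults).
import math
import rospy
from geometry_msgs.msg import PoseStamped
from tf.transformations import quaternion_from_euler

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;send_goal_sketch&#039;)
    pub = rospy.Publisher(&#039;/move_base_simple/goal&#039;, PoseStamped)
    rospy.sleep(1.0)                          # let the publisher connect

    yaw = math.pi / 2.0                       # face left, as in the example above
    qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, yaw)   # gives (0, 0, 0.707, 0.707)

    goal = PoseStamped()
    goal.header.frame_id = &#039;/map&#039;
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = 2.0                # pick a point in front of the artifact you want to inspect
    goal.pose.position.y = 0.0
    goal.pose.orientation.x = qx
    goal.pose.orientation.y = qy
    goal.pose.orientation.z = qz
    goal.pose.orientation.w = qw
    pub.publish(goal)
&lt;/pre&gt;
&lt;br /&gt;
The same four orientation numbers are what you would place in the goal message inside your smach state machine.&lt;br /&gt;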
&lt;br /&gt;
Good luck on the assignment and happy ghost hunting!&lt;br /&gt;
&lt;br /&gt;
[http://wiki.hacdc.org/index.php/Robotics_Class_2011 Back to Robotics Class 2011]&lt;br /&gt;
&lt;br /&gt;
[http://www.youtube.com/watch?v=Qka5HX-R-cQ Secret bonus video (slyt)]&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5461</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5461"/>
		<updated>2011-07-16T22:10:58Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  You can look at the launch file, named move_base.launch, for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics, and all kinds of other stuff.  You can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following (however, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo modifies the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above; in this case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below specifies each artifact&#039;s body_name and its position in the warehouse.  The body_names will be convenient when you want to rotate artifacts with the above incantation, and the positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below; the &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force relative to the object named by the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that, to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine that have not only a target position, but also a target attitude, specified as a quaternion.  The navigation goal includes an orientation that specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; being forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation, we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to face to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw), which ends up pointing us left.  It is easy to convert between a yaw and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha], which gives you a nice conversion utility.  There are a couple of important things to remember when using this conversion utility.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations so that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5460</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5460"/>
		<updated>2011-07-16T22:08:51Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  You can look at the launch file, named move_base.launch, for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
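The prototype museum_guard node described below already sends navigation goals for you, but it helps to see the bare mechanics once.  Here is a minimal sketch (not part of any class package) of a rospy node that sends a single goal to the move_base action server; the position, the package name in load_manifest, and the /map frame name are assumptions for illustration only:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Sketch only: send one navigation goal to move_base from Python.
# Assumes the map frame is named /map and that the package this file
# lives in (museum_guard here, as a stand-in) depends on rospy,
# actionlib, tf and move_base_msgs.
import roslib; roslib.load_manifest(&#039;museum_guard&#039;)
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from tf.transformations import quaternion_from_euler

rospy.init_node(&#039;goal_sender_sketch&#039;)
client = actionlib.SimpleActionClient(&#039;move_base&#039;, MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = &#039;/map&#039;
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 3.9      # placeholder position
goal.target_pose.pose.position.y = -0.14
# quaternion_from_euler(roll, pitch, yaw) returns (x, y, z, w);
# a pi/2 yaw points the robot to the left, as discussed further down.
qx, qy, qz, qw = quaternion_from_euler(0.0, 0.0, 1.5708)
goal.target_pose.pose.orientation.x = qx
goal.target_pose.pose.orientation.y = qy
goal.target_pose.pose.orientation.z = qz
goal.target_pose.pose.orientation.w = qw

client.send_goal(goal)
client.wait_for_result()
&lt;/pre&gt;
&lt;br /&gt;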
Once the navigation stack is up and running, you can start rviz to start seeing what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
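A state machine like the one smach_viewer is showing can be assembled along the lines sketched here.  This is only a sketch, not the actual smach_guard.py: the goal coordinates are placeholders, the inspection state does nothing useful yet, and the package name in load_manifest is a stand-in.  The IntrospectionServer at the bottom is the piece that lets smach_viewer draw the states:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Rough sketch of the shape a museum-guard state machine can take;
# this is NOT the real smach_guard.py.
import roslib; roslib.load_manifest(&#039;museum_guard&#039;)
import rospy
import smach
import smach_ros
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

class InspectArtifact(smach.State):
    # Hypothetical state: park in front of the artifact and decide
    # whether anything spooky is happening (your /face_coords logic
    # goes here).
    def __init__(self):
        smach.State.__init__(self, outcomes=[&#039;quiet&#039;, &#039;haunted&#039;])
    def execute(self, userdata):
        rospy.sleep(10.0)
        return &#039;quiet&#039;

rospy.init_node(&#039;guard_sketch&#039;)

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = &#039;/map&#039;
goal.target_pose.pose.position.x = 3.9     # placeholder
goal.target_pose.pose.position.y = -0.14
goal.target_pose.pose.orientation.w = 1.0  # see the quaternion notes below

sm = smach.StateMachine(outcomes=[&#039;patrol_done&#039;])
with sm:
    smach.StateMachine.add(&#039;GOTO_HALL_1&#039;,
        smach_ros.SimpleActionState(&#039;move_base&#039;, MoveBaseAction, goal=goal),
        transitions={&#039;succeeded&#039;: &#039;INSPECT_ARTIFACT_1&#039;,
                     &#039;aborted&#039;: &#039;patrol_done&#039;,
                     &#039;preempted&#039;: &#039;patrol_done&#039;})
    smach.StateMachine.add(&#039;INSPECT_ARTIFACT_1&#039;, InspectArtifact(),
        transitions={&#039;quiet&#039;: &#039;patrol_done&#039;, &#039;haunted&#039;: &#039;patrol_done&#039;})

# The introspection server is what lets smach_viewer draw the machine.
sis = smach_ros.IntrospectionServer(&#039;guard_sketch&#039;, sm, &#039;/SM_ROOT&#039;)
sis.start()
sm.execute()
sis.stop()
&lt;/pre&gt;
&lt;br /&gt;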
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following.  However, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
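One way to turn that idea into code is sketched here.  This is a sketch under stated assumptions, not the assignment solution: it pretends, purely for illustration, that /face_coords can be read as a geometry_msgs/Point (substitute whatever type &amp;quot;rostopic type /face_coords&amp;quot; reports for your face_detection node), and the watch time and drift threshold are invented numbers:&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Sketch of the haunted-or-not check; NOT the assignment solution.
# The Point message type here is an assumption; use whatever
# rostopic type /face_coords actually reports.
import roslib; roslib.load_manifest(&#039;museum_guard&#039;)
import rospy
from geometry_msgs.msg import Point

observations = []

def face_cb(msg):
    observations.append((msg.x, msg.y))

rospy.init_node(&#039;ghost_watcher_sketch&#039;)
rospy.Subscriber(&#039;/face_coords&#039;, Point, face_cb)

rospy.sleep(15.0)          # stare at the artifact for a while

if len(observations) &amp;lt; 2:
    print &#039;no face seen; nothing to report&#039;
else:
    xs = [x for (x, y) in observations]
    ys = [y for (x, y) in observations]
    wiggle = 10.0          # pixels of drift we blame on the ghost
    if (max(xs) - min(xs)) &amp;gt; wiggle or (max(ys) - min(ys)) &amp;gt; wiggle:
        print &#039;face coordinates are drifting; paranormal activity suspected!&#039;
    else:
        print &#039;face is holding still; all quiet here&#039;
&lt;/pre&gt;
&lt;br /&gt;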
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below gives each artifact&#039;s body_name and its position in the warehouse.  The body_names will be convenient when you want to rotate artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force relative to the object specified by the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get a different artifact to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine that have not only a target position, but also a target attitude, specified as a quaternion.  The navigation goal includes an orientation that specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; being forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation, we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to be facing to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw).  This ends up pointing us left.  It is easy to convert between the notion of a yaw and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha].  Doing this gives you a nice conversion utility.  There are a couple of important things to remember when using this conversion utility.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5459</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5459"/>
		<updated>2011-07-16T22:07:14Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  You can look at the launch file, named move_base.launch, for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to start seeing what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following.  However, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below gives each artifact&#039;s body_name and its position in the warehouse.  The body_names will be convenient when you want to rotate artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force relative to the object specified by the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get a different artifact to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine that have not only a target position, but also a target attitude, specified as a quaternion.  The navigation goal includes an orientation that specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; being forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation, we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to be facing to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw).  This ends up pointing us left.  It is easy to convert between the notion of a yaw and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha].  Doing this gives you a nice conversion utility.  There are a couple of important things to remember when using this conversion utility.  The first is to make sure you select the &amp;quot;Euler rotation sequence&amp;quot; and enter your rotations such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5458</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5458"/>
		<updated>2011-07-16T22:05:56Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  You can look at the launch file, named move_base.launch, for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to start seeing what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following.  However, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below gives each artifact&#039;s body_name and its position in the warehouse.  The body_names will be convenient when you want to rotate artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force relative to the object specified by the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get a different artifact to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine that have not only a target position, but also a target attitude, specified as a quaternion.  The navigation goal includes an orientation that specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; being forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation, we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to be facing to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a (pi/2) radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw).  This ends up pointing us left.  It is easy to convert between the notion of a yaw and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha].  Doing this gives you a nice conversion utility.  The important thing to remember when using this conversion is to make sure you select the rotation sequence such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;q_0&#039;&#039;&#039; is &#039;&#039;&#039;w&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_1&#039;&#039;&#039; is &#039;&#039;&#039;x&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_2&#039;&#039;&#039; is &#039;&#039;&#039;y&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;q_3&#039;&#039;&#039; is &#039;&#039;&#039;z&#039;&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5457</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5457"/>
		<updated>2011-07-16T22:04:17Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next you can start the ROS navigation stack, which allows the robot to navigate autonomously throughout the warehouse.  A launch file has been written that customizes the ROS navigation stack for our particular robot.  You can look at the launch file, named move_base.launch, for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to start seeing what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which is the same as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot;, but it also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following.  However, note that the topic is generated &#039;&#039;only&#039;&#039; when a face is recognized:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that this will only modify the first artifact.  Gazebo applies the wrench to the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below gives each artifact&#039;s body_name and its position in the warehouse.  The body_names will be convenient when you want to rotate artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench force relative to the object specified by the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get a different artifact to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Note that to make the assignment slightly harder, some of the artifacts are on different walls than the first artifact.  This requires you to specify navigation goals in your smach state machine that have not only a target position, but also a target attitude, specified as a quaternion.  The navigation goal includes an orientation that specifies the robot&#039;s ending attitude.  Based on the robot&#039;s coordinate system of &#039;&#039;x&#039;&#039; being forward, &#039;&#039;y&#039;&#039; to the left, and &#039;&#039;z&#039;&#039; up, when we want to end in a different orientation, we must specify an ending rotation about &#039;&#039;z&#039;&#039;, called &#039;&#039;yaw&#039;&#039;.  For example, to get the quaternion for the robot to be facing to the left of its original position in the map, we would provide a quaternion to the navigation stack of the form:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;x&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;y&#039;&#039;&#039; = 0.0&lt;br /&gt;
&#039;&#039;&#039;z&#039;&#039;&#039; = 0.707&lt;br /&gt;
&#039;&#039;&#039;w&#039;&#039;&#039; = 0.707&lt;br /&gt;
&lt;br /&gt;
This quaternion represents a &amp;lt;math&amp;gt;\frac{\pi}{2}&amp;lt;/math&amp;gt; radian rotation about the &#039;&#039;z&#039;&#039; axis (a yaw).  This ends up pointing us left.  It is easy to convert between the notion of a yaw and a quaternion.  One way is to type &amp;quot;euler angles&amp;quot; into [http://www.wolframalpha.com/ Wolfram Alpha].  Doing this gives you a nice conversion utility.  The important thing to remember when using this conversion is to make sure you select the rotation sequence such that you only request a yaw rotation.  Also, when the computation is finished, the desired quaternion parameters are in the box named &amp;quot;Euler parameters (quaternions)&amp;quot;, with the convention that the parameters are specified as &amp;lt;math&amp;gt;q_0 = w, q_1 = x, q_2 = y, q_3 = z&amp;lt;/math&amp;gt;.&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5456</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5456"/>
		<updated>2011-07-16T21:33:43Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts into the simulation.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next, you can start the ROS navigation stack so that the robot can navigate autonomously throughout the warehouse.  A launch file named move_base.launch has been written that customizes the ROS navigation stack for our particular robot; you can look at it for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
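As an aside, if you want to send the robot somewhere yourself, here is a minimal sketch (not one of the provided packages) of a node that sends a single navigation goal through the standard move_base action interface.  The frame name &amp;quot;map&amp;quot; and the goal coordinates (taken from the artifact list at the end of this assignment) are assumptions, so adjust them for your setup:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/usr/bin/env python&lt;br /&gt;
# Minimal sketch: send one navigation goal to the move_base action server.&lt;br /&gt;
import rospy&lt;br /&gt;
import actionlib&lt;br /&gt;
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal&lt;br /&gt;
&lt;br /&gt;
rospy.init_node(&#039;goto_artifact&#039;)&lt;br /&gt;
client = actionlib.SimpleActionClient(&#039;move_base&#039;, MoveBaseAction)&lt;br /&gt;
client.wait_for_server()&lt;br /&gt;
&lt;br /&gt;
goal = MoveBaseGoal()&lt;br /&gt;
goal.target_pose.header.frame_id = &#039;map&#039;     # assumed fixed frame name&lt;br /&gt;
goal.target_pose.header.stamp = rospy.Time.now()&lt;br /&gt;
goal.target_pose.pose.position.x = 3.9       # alan_model x, from the list at the end&lt;br /&gt;
goal.target_pose.pose.position.y = -0.14     # alan_model y, from the list at the end&lt;br /&gt;
goal.target_pose.pose.orientation.w = 1.0    # identity orientation; use a yaw quaternion to face the artifact&lt;br /&gt;
client.send_goal(goal)&lt;br /&gt;
client.wait_for_result()&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;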
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics, and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
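If you are curious how such a state machine is typically put together, here is an illustrative smach sketch in the same spirit.  Note that this is &#039;&#039;not&#039;&#039; the provided smach_guard.py; the frame name and goal coordinates below are assumptions:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/usr/bin/env python&lt;br /&gt;
# Illustrative two-state smach machine; NOT the provided smach_guard.py.&lt;br /&gt;
import rospy&lt;br /&gt;
import smach&lt;br /&gt;
import smach_ros&lt;br /&gt;
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal&lt;br /&gt;
&lt;br /&gt;
class InspectArtifact(smach.State):&lt;br /&gt;
    def __init__(self):&lt;br /&gt;
        smach.State.__init__(self, outcomes=[&#039;done&#039;])&lt;br /&gt;
    def execute(self, userdata):&lt;br /&gt;
        rospy.sleep(10.0)   # this is where you would watch /face_coords&lt;br /&gt;
        return &#039;done&#039;&lt;br /&gt;
&lt;br /&gt;
def hall_goal():&lt;br /&gt;
    goal = MoveBaseGoal()&lt;br /&gt;
    goal.target_pose.header.frame_id = &#039;map&#039;   # assumed fixed frame name&lt;br /&gt;
    goal.target_pose.pose.position.x = 3.9     # assumed goal near the first artifact&lt;br /&gt;
    goal.target_pose.pose.position.y = -0.14&lt;br /&gt;
    goal.target_pose.pose.orientation.w = 1.0&lt;br /&gt;
    return goal&lt;br /&gt;
&lt;br /&gt;
rospy.init_node(&#039;guard_sketch&#039;)&lt;br /&gt;
sm = smach.StateMachine(outcomes=[&#039;finished&#039;])&lt;br /&gt;
with sm:&lt;br /&gt;
    smach.StateMachine.add(&#039;GOTO_HALL_1&#039;,&lt;br /&gt;
                           smach_ros.SimpleActionState(&#039;move_base&#039;, MoveBaseAction, goal=hall_goal()),&lt;br /&gt;
                           transitions={&#039;succeeded&#039;: &#039;INSPECT_ARTIFACT_1&#039;,&lt;br /&gt;
                                        &#039;preempted&#039;: &#039;finished&#039;,&lt;br /&gt;
                                        &#039;aborted&#039;: &#039;finished&#039;})&lt;br /&gt;
    smach.StateMachine.add(&#039;INSPECT_ARTIFACT_1&#039;, InspectArtifact(),&lt;br /&gt;
                           transitions={&#039;done&#039;: &#039;finished&#039;})&lt;br /&gt;
sis = smach_ros.IntrospectionServer(&#039;guard_sketch&#039;, sm, &#039;/SM_ROOT&#039;)   # lets smach_viewer see the machine&lt;br /&gt;
sis.start()&lt;br /&gt;
sm.execute()&lt;br /&gt;
sis.stop()&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;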
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which shows the same imagery as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot; but also puts boxes around recognized faces.  Also, when a face is detected, the face detector puts out a &amp;quot;/face_coords&amp;quot; topic that describes the point in the image plane of the face being tracked.  Remember that you can monitor the &amp;quot;/face_coords&amp;quot; topic by simply opening a terminal and typing the following, but note that messages appear on this topic &#039;&#039;only&#039;&#039; when a face is recognized:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rostopic echo /face_coords&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
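For instance, here is a minimal sketch of a node that sits still, watches the face coordinates for a while, and then decides whether the tracked face moved.  It assumes &amp;quot;/face_coords&amp;quot; carries a geometry_msgs/Point; check the real message type with &#039;&#039;&#039;rostopic type /face_coords&#039;&#039;&#039; and adjust the import and fields accordingly:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/usr/bin/env python&lt;br /&gt;
# Sketch of a paranormal activity monitor: park in front of an artifact,&lt;br /&gt;
# watch /face_coords for a while, and report whether the face moved.&lt;br /&gt;
# ASSUMPTION: /face_coords is a geometry_msgs/Point; adjust if the&lt;br /&gt;
# face_detection package publishes a different message type.&lt;br /&gt;
import rospy&lt;br /&gt;
from geometry_msgs.msg import Point&lt;br /&gt;
&lt;br /&gt;
coords = []&lt;br /&gt;
&lt;br /&gt;
def face_cb(msg):&lt;br /&gt;
    coords.append((msg.x, msg.y))&lt;br /&gt;
&lt;br /&gt;
def moved(points, tolerance=5.0):&lt;br /&gt;
    # Suspicious if the face coordinates wander by more than tolerance&lt;br /&gt;
    # pixels while the robot is sitting still.&lt;br /&gt;
    if len(points) &amp;lt; 2:&lt;br /&gt;
        return False&lt;br /&gt;
    xs = [p[0] for p in points]&lt;br /&gt;
    ys = [p[1] for p in points]&lt;br /&gt;
    return (max(xs) - min(xs)) &amp;gt; tolerance or (max(ys) - min(ys)) &amp;gt; tolerance&lt;br /&gt;
&lt;br /&gt;
rospy.init_node(&#039;ghost_monitor&#039;)&lt;br /&gt;
rospy.Subscriber(&#039;/face_coords&#039;, Point, face_cb)&lt;br /&gt;
rospy.sleep(15.0)   # watch the artifact for a while&lt;br /&gt;
if moved(coords):&lt;br /&gt;
    rospy.loginfo(&#039;Paranormal activity suspected!&#039;)&lt;br /&gt;
else:&lt;br /&gt;
    rospy.loginfo(&#039;Artifact appears to be at rest.&#039;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;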
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/apply_body_wrench &#039;{reference_point: {x: -2, y: 0, z: 0}, body_name: &amp;quot;alan_model::alan_link&amp;quot;, wrench: { torque: { x: 0, y: 0 , z: 0.1 } }, duration: 1000000000 }&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
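If you want an artifact to stop spinning during testing, gazebo should also provide a service for clearing applied wrenches; verify the exact service name with &#039;&#039;&#039;rosservice list&#039;&#039;&#039;.  For example:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosservice call gazebo/clear_body_wrenches &#039;{body_name: &amp;quot;alan_model::alan_link&amp;quot;}&#039; &#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;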
Note that this will only modify the first artifact.  Gazebo modifies the artifact whose &amp;quot;body_name&amp;quot; is specified in the incantation above.  In the above case, the body name is &amp;quot;alan_model::alan_link&amp;quot;.  However, there are eight artifacts in total.  The list below specifies each artifact&#039;s body_name and also its position in the warehouse.  The body_names will be convenient when you want to rotate artifacts with the above incantation.  The positions will be useful when encoding navigation goals.  All dimensions are in meters.  Note that in the incantation above you do not have to change the &amp;quot;reference_point&amp;quot; based on the positions below.  The &amp;quot;reference_point&amp;quot; specifies where gazebo should apply the wrench in relation to the object specified with the particular &#039;&#039;body_name&#039;&#039;.  Thus, to get different artifacts to rotate, you need only change the &#039;&#039;body_name&#039;&#039; argument in the above command.  Here are the body names and positions:&lt;br /&gt;
&lt;br /&gt;
body_name (x, y)&lt;br /&gt;
&#039;&#039;&#039;alan_model::alan_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, -0.14)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david_model::david_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 1.2)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;david2_model::david2_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.8, 2.3)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;eric_model::eric_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(1.6, 4.0)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;erica_model::erica_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.7, 5.1)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;jeff_model::jeff_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(4.0, 6.4)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;john_model::john_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(3.9, 8.7)&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;phil_model::phil_link&#039;&#039;&#039; is at position &#039;&#039;&#039;(2.5, 9.6)&#039;&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5450</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5450"/>
		<updated>2011-07-16T21:05:41Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure of the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next, start the ROS navigation stack so the robot can navigate autonomously throughout the warehouse.  A launch file named move_base.launch has been written that customizes the ROS navigation stack for our particular robot; you can look at it for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
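As a quick sanity check that the navigation stack is responding, you can send it a single goal from a small Python node using the standard move_base action interface.  This is only a sketch; the &#039;&#039;/map&#039;&#039; frame and the coordinates below are placeholders, so pick a point you can actually see in the warehouse map.&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Sketch only: send one navigation goal to the move_base action server.
# The frame and coordinates below are placeholders -- pick a reachable point.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;send_one_goal&#039;)
    client = actionlib.SimpleActionClient(&#039;move_base&#039;, MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = &#039;/map&#039;   # assumed fixed frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 2.0      # placeholder coordinates
    goal.target_pose.pose.position.y = 0.0
    goal.target_pose.pose.orientation.w = 1.0   # face straight ahead

    client.send_goal(goal)
    client.wait_for_result()
    rospy.loginfo(&#039;move_base finished with state %d&#039;, client.get_state())
&lt;/pre&gt;
&lt;br /&gt;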
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;br /&gt;
&lt;br /&gt;
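If you want to see how such a pair of states can be wired up yourself, here is a minimal two-state sketch written in the same spirit as smach_guard.py (it is &#039;&#039;not&#039;&#039; the actual file).  The goal coordinates, dwell time, and frame are placeholders you would adjust for the warehouse map.&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
#!/usr/bin/env python
# Sketch only: a two-state smach machine in the spirit of smach_guard.py.
# GOTO_HALL_1 drives to a placeholder pose via move_base; INSPECT_ARTIFACT_1
# simply waits while the face detector does its work.
import rospy
import smach
import smach_ros
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

class InspectArtifact(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=[&#039;done&#039;])
    def execute(self, userdata):
        rospy.sleep(15.0)           # dwell time is a placeholder
        return &#039;done&#039;

def make_goal(x, y):
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = &#039;/map&#039;   # assumed fixed frame
    goal.target_pose.pose.position.x = x        # placeholder coordinates
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0
    return goal

if __name__ == &#039;__main__&#039;:
    rospy.init_node(&#039;guard_sketch&#039;)
    sm = smach.StateMachine(outcomes=[&#039;patrol_complete&#039;])
    with sm:
        smach.StateMachine.add(&#039;GOTO_HALL_1&#039;,
            smach_ros.SimpleActionState(&#039;move_base&#039;, MoveBaseAction,
                                        goal=make_goal(2.0, 0.0)),
            transitions={&#039;succeeded&#039;: &#039;INSPECT_ARTIFACT_1&#039;,
                         &#039;aborted&#039;: &#039;patrol_complete&#039;,
                         &#039;preempted&#039;: &#039;patrol_complete&#039;})
        smach.StateMachine.add(&#039;INSPECT_ARTIFACT_1&#039;, InspectArtifact(),
                               transitions={&#039;done&#039;: &#039;patrol_complete&#039;})
    # lets smach_viewer display the machine while it runs
    sis = smach_ros.IntrospectionServer(&#039;guard_sketch&#039;, sm, &#039;/SM_ROOT&#039;)
    sis.start()
    sm.execute()
    sis.stop()
&lt;/pre&gt;
&lt;br /&gt;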
Once that is working, remember that you can view the imagery coming from the robot&#039;s mast camera with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun image_view image_view image:=/stereo/left/image_rect&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, even better than this, you can run the face_detector covered in the last class.  If you did not check it out last class, you can either use your own face_detector, or use the one provided as an example.  To get the one provided, simply do:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/face_detection&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake face_detection&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Assignment 2 discusses face_detection in more detail.&lt;br /&gt;
&lt;br /&gt;
Once the face_detector is available, you can start it as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch face_detection face_detector.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This will open a new image_view for the &amp;quot;/face_view&amp;quot; topic, which shows the same imagery as the normal image topic &amp;quot;/stereo/left/image_rect&amp;quot; but with boxes drawn around recognized faces.  Also, when a face is detected, the face detector publishes a &amp;quot;/face_coords&amp;quot; topic giving the image-plane coordinates of the face being tracked.&lt;br /&gt;
&lt;br /&gt;
This will be most important when trying to uncover any paranormal activity.  For example, when the ghost is rotating an artifact, the face detector will likely detect a face, but the &amp;quot;/face_coords&amp;quot; topic will undoubtedly be changing as the artifact rotates.  Since it is assumed that the artifacts never move without paranormal assistance, if the robot sits long enough in front of each artifact, it should be able to closely monitor the &amp;quot;/face_coords&amp;quot; topic to determine whether there is paranormal activity in the area.&lt;br /&gt;
&lt;br /&gt;
While the security guards would like the mystery uncovered as soon as possible, they have given us two weeks of development time for building the system.  For testing, there is a specific gazebo incantation, not unlike an ancient curse, that will cause an artifact to begin rotating.  To test your algorithm you&#039;ll most likely want to rotate a variety of artifacts to see how the robot responds.  To start, here is an example of how to get the first artifact to rotate (very slowly):&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5449</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5449"/>
		<updated>2011-07-16T20:55:08Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you need to add the museum artifacts.  You can insert them into the simulation with the following command:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch floating_faces faces_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
These artifacts are in the same locations as the real artifacts in the secret HacDC antiquities warehouse.&lt;br /&gt;
&lt;br /&gt;
Next, start the ROS navigation stack so the robot can navigate autonomously throughout the warehouse.  A launch file named move_base.launch has been written that customizes the ROS navigation stack for our particular robot; you can look at it for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack topics and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related topics.&lt;br /&gt;
&lt;br /&gt;
After you are comfortable with rviz, you can obtain and start the simple prototype museum guard example that is provided here:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/museum_guard&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake museum_guard&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once museum_guard is made, you can start it up by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun museum_guard smach_guard.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
One of the powerful features of smach is the smach_viewer utility that you can start as follows:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun smach_viewer smach_viewer.py&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You should see the two states illustrated in class, namely &amp;quot;GOTO_HALL_1&amp;quot; and &amp;quot;INSPECT_ARTIFACT_1&amp;quot; in the Smach Viewer window.&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5448</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5448"/>
		<updated>2011-07-16T20:48:24Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you can then start the ROS navigation stack.  A launch file named move_base.launch has been written that customizes the ROS navigation stack for our particular robot; you can look at it for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;br /&gt;
&lt;br /&gt;
Once the navigation stack is up and running, you can start rviz to see what the robot sees.  Rviz is a powerful visualization tool useful for looking at TF frames, navigation stack products and all kinds of other stuff.  Anyway, you can start it by typing:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;rosrun rviz rviz&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once rviz is up and running, follow [http://www.ros.org/wiki/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack this tutorial] to set it up to visualize all kinds of navigation stack related products.&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5447</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5447"/>
		<updated>2011-07-16T20:45:54Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  We do this to start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once you have successfully rosmake&#039;ed the irobotron_description, you can then insert the robot into the simulation:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description create_mobile_base_in_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
This command will insert the robot into the starting location in the map.  It may be hard to see the robot, but if you use the mouse to navigate in the gazebo window, you will find the robot 10 meters forward and one meter left of the initial gazebo starting coordinates.&lt;br /&gt;
&lt;br /&gt;
Once you have the robot in the simulation, you can then start the ROS navigation stack.  A launch file named move_base.launch has been written that customizes the ROS navigation stack for our particular robot; you can look at it for more info.  To start the navigation stack, type:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch irobotron_description move_base.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
You may see a few warnings when the navigation stack starts.  But if you do a rostopic list, you should see numerous topics in the &amp;quot;/move_base/&amp;quot; namespace.  These are topics related to the navigation stack.&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
	<entry>
		<id>https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5446</id>
		<title>Robotics Class 2011/Assignment 4</title>
		<link rel="alternate" type="text/html" href="https://old.hacdc.org/index.php?title=Robotics_Class_2011/Assignment_4&amp;diff=5446"/>
		<updated>2011-07-16T20:39:36Z</updated>

		<summary type="html">&lt;p&gt;Andrewtron3000: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The rumor is that something bad is going on in the secret underground HacDC antiquities warehouse.  The security guards have written a memo describing recent sightings of artifacts in the warehouse &amp;quot;moving by themselves&amp;quot;.  They have come to us, the robotics experts, to see if we can&#039;t provide them with a robot that can make the rounds in the warehouse keeping an eye out for mysterious paranormal activities.&lt;br /&gt;
&lt;br /&gt;
The last assignment illustrated the commands required to start up gazebo and also to check out the irobotron_description ROS package from the HacDC repository.  However, we will make a slight adjustment to the command line for starting up gazebo.  The reason is that we will start up in a simulated world that happens to have the same rooms and structure as the secret underground HacDC antiquities warehouse.  You can start gazebo in the following way:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roslaunch gazebo_worlds simple_office2.launch&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
Once gazebo is up and running, you next need to insert the robot into the simulation.  If you have not checked out the irobotron_description ROS package, you can do so with the following commands:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;cd&#039;&#039;&#039; to where you store your downloaded ROS packages&lt;br /&gt;
&#039;&#039;&#039;svn co http://hacdc-ros-pkg.googlecode.com/svn/trunk/irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
However, if you have already checked it out, follow these modified instructions to update the package as recent changes have been made to it:&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;roscd irobotron_description&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;svn update&#039;&#039;&#039;&lt;br /&gt;
&#039;&#039;&#039;rosmake irobotron_description&#039;&#039;&#039;&lt;/div&gt;</summary>
		<author><name>Andrewtron3000</name></author>
	</entry>
</feed>