This journal is maintained by Winlab summer interns Vamsi and Aniket, who are working on robot mobility. Unless otherwise noted, the entries are typed up by me (Aniket) and I refer to myself in first person for simplicity.
Week 3
6/18/07 – Day 11
Today was full of reading: reading the API, reading the User's Guide, reading sample code, and reading the tutorials. We started using the behavior composer for the first time, and found it has some pretty nifty features. The essential command is behave. A behavior is something the robot does constantly, as compared to a task, which is generally a one-time assignment. For example, a task might be to move across a room, whereas a behavior is to repeatedly check for obstacles and change course to avoid them.
We still cannot figure out how to compile our own code or generate makefiles. Instead, we have been writing small code snippets using the same filenames as those in the sample code (e.g., simple.cpp) and using the provided makefiles with our fingers crossed.
Right now I believe the most important things for us to understand are the last three tutorials in the navigation section: the tutorials about using vSlam and creating landmarks. Understanding these will be, in my opinion, the critical step toward successfully using the robots in wifi experiments.
Unfortunately (though not surprisingly) the code resists compiling. Navigation Tutorial #4 is the introduction to vSlam and consists of four programs, only the first of which actually works. This first program causes the robot to move in a "bow-tie" pattern, but does not demonstrate any of the map-building capabilities of the robot. The remaining programs fail to compile, often with this error:
error: extra qualification 'Evolution::OccupancyGrid::' on member 'distance_to_obstacle'
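A note on this particular error: GCC 4.1 and later reject a member that is declared inside its own class body with a redundant class-name qualification, something older compilers accepted silently, so the sample code probably just predates the stricter compiler (I believe -fpermissive downgrades it to a warning). A minimal illustration of the pattern and its fix; the signature here is invented, and only the class and member names come from the error message:

    // Rejected by GCC 4.1+: the member is qualified with its own class
    // name inside the class definition (namespace Evolution omitted).
    class OccupancyGrid {
    public:
        double OccupancyGrid::distance_to_obstacle(double x, double y);
    };

    // Accepted: simply delete the extra qualification in the header.
    class OccupancyGrid {
    public:
        double distance_to_obstacle(double x, double y);
    };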
I sincerely hope we can overcome our compiler issues shortly. In the meantime, we have Python to work in.
A complication we keep running into is that we do not have a joystick here at Winlab to operate these robots, and while a joystick is not strictly required, much of the sample code assumes you have one. It is hard for us to study the sample code when we cannot see it in action. Additionally, a joystick is required to run the provided odometry calibration scripts (something we probably should have done by now). We plan to bring in some of our home gaming equipment tomorrow; Vamsi has a USB joystick and I have a USB Xbox controller. Hopefully at least one of them will work and let us explore the sample code better.
6/19/07 - Day 12
We now feel like we really understand IR sensor calibration. An experiment on Bender shows that some careful manipulation of resource-config.xml makes a tremendous difference in the robot's obstacle avoidance capabilities. Calibration is as follows: first, configure the (x,y,z) coordinates and the roll, pitch, and yaw for each sensor (remember the origin is located on the floor directly between the back wheels). Then, in the section labeled <dimensions> under <Shape id="body…>, set (x,y,z) to the center of the robot with respect to the origin, and (lx,ly,lz) to the side lengths of an imaginary box that encapsulates the robot.
Once these parameters are set, obstacle avoidance works very well. It is important to make sure the sensors are in fact pointing in the directions specified under (roll,pitch,yaw).
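For reference, the fragment in question looks something like the sketch below. The numbers are placeholders, and I'm only confident about the attribute names quoted above, so treat the surrounding structure as approximate rather than authoritative:

    <Shape id="body">
      <!-- (x,y,z): center of the robot relative to the origin between the
           back wheels; (lx,ly,lz): side lengths of the enclosing box -->
      <dimensions x="0.15" y="0.00" z="0.20"
                  lx="0.40" ly="0.35" lz="0.45"/>
    </Shape>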
We played around with joysticks for a little while today. Between us, we have one Destroyer Tilt and one MS Xbox 360 controller. We were able to get the Tilt working but not the Xbox controller. Oddly, however, different applications reacted differently to being joystick-controlled. Some programs, such as the odometry calibration script, were reluctant to respond to left and right motions on the joystick. We hope to iron out the details of using the joystick soon, as it will be very useful for calibration. Ivan has given the go-ahead to purchase one for Winlab.
An exciting end to the day: we began writing a Python script which will hopefully cause the robot to seek out the area of best internet connectivity. So far, the script is able to run iwconfig and extract the signal strength. It also has structures in place to keep track of three (x,y) coordinates and the signal strength at each of them. The functions written so far are: GoToXY(x,y) [moves the robot], scanning_thread() [returns signal strength], BigReading() [calls scanning_thread several times and returns the average], Say() [calls flite for text-to-speech], Sort() [sorts the three points in memory in order: best, good, worst], and PrintPoints() [prints the best/good/worst points to the screen]. Python scripting is proving fast and easy, and we can probably have this working before lunch tomorrow.
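Since the measurement half of the script has no ERSP dependency, here is a sketch of roughly what those functions look like (written against current Python rather than the interpreter bundled with ERSP; wlan0 and the exact iwconfig output format are assumptions about our setup):

    import re
    import subprocess

    def scanning_thread(iface="wlan0"):
        """Run iwconfig and return the reported signal level in dBm."""
        out = subprocess.run(["iwconfig", iface],
                             capture_output=True, text=True).stdout
        m = re.search(r"Signal level[=:]\s*(-?\d+)", out)
        return float(m.group(1)) if m else None

    def BigReading(n=5):
        """Average several scans to smooth out random fluctuation."""
        vals = [v for v in (scanning_thread() for _ in range(n)) if v is not None]
        return sum(vals) / len(vals) if vals else None

    def Say(text):
        """Speak the text aloud by handing it to flite."""
        subprocess.run(["flite", "-t", text])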
6/20/07 - Day 13
We accomplished a few things today. In particular, we've written two Python scripts which perform some wifi-related tasks.
The first, the one I started coding yesterday, is an implementation of the Nelder-Mead algorithm for finding minima. The robot wanders around the floor of Orbit (x,y coordinates) and surveys the signal strength using iwconfig. Based on the input it receives from iwconfig, it attempts to move to the point of strongest wifi connectivity. At each point it surveys, the robot speaks the signal strength, a task accomplished by feeding the output to flite. The algorithm may not be quite appropriate because of the random fluctuations in connectivity; if the robot gets a burst of strong connectivity that is due solely to natural variation, it will remember that point forever. One solution might be to program the robot to 'forget' good points after a few subsequent scans.
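The skeleton of the search loop looks roughly like the sketch below. It is simplified (reflection and shrinking only, no expansion step), leans on the GoToXY() and BigReading() helpers from yesterday, and stops once the spread between the best and worst vertices falls under a threshold, so it is an outline rather than our exact script:

    def nelder_mead_search(simplex, tol=1.0):
        """simplex: list of three non-collinear (x, y) starting points.
        Drives the robot in shrinking triangles toward the strongest signal."""
        scores = {}
        for p in simplex:
            GoToXY(*p)
            scores[p] = BigReading()
        while True:
            # Order the vertices best, good, worst by signal strength.
            simplex.sort(key=lambda p: scores[p], reverse=True)
            best, good, worst = simplex
            if scores[best] - scores[worst] < tol:
                return best   # spread under tol dBm: call it close enough
            # Reflect the worst vertex through the midpoint of the other two.
            mx = (best[0] + good[0]) / 2.0
            my = (best[1] + good[1]) / 2.0
            refl = (2 * mx - worst[0], 2 * my - worst[1])
            GoToXY(*refl)
            scores[refl] = BigReading()
            if scores[refl] > scores[worst]:
                simplex = [best, good, refl]
            else:
                # Reflection failed: shrink the triangle toward the best vertex.
                simplex = [best,
                           ((best[0] + good[0]) / 2.0, (best[1] + good[1]) / 2.0),
                           ((best[0] + worst[0]) / 2.0, (best[1] + worst[1]) / 2.0)]
                for p in simplex[1:]:
                    if p not in scores:
                        GoToXY(*p)
                        scores[p] = BigReading()

Seeded with any three non-collinear starting points, this walks the triangle across the floor toward the access point.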
The second performs a related task. It wanders aimlessly around the room, recording the signal strength from iwconfig at regular intervals. The real accomplishment here was getting two tasks (wandering and surveying signal strength) to run simultaneously. The first step is to turn a Python function into an ERSP-recognized task. The important Python command to do so is:

    someTask = ersp.task.registerTask("idstring", function_name)

The purpose of "idstring" is elusive, but as long as it's a string the software seems to accept it. Once this assignment is completed, someTask can be passed into a Parallel object:

    p = ersp.task.Parallel()
    p.addTask(someTask)
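Putting the pieces together, the skeleton of the wander-and-survey script is roughly the following. Only registerTask, Parallel, and addTask are calls we have actually verified; the task bodies and whichever call finally starts the parallel set are elided:

    import ersp.task

    def wander():
        pass  # drive the robot around the room (ERSP motion calls go here)

    def survey():
        pass  # record the iwconfig signal strength at regular intervals

    wanderTask = ersp.task.registerTask("wander", wander)
    surveyTask = ersp.task.registerTask("survey", survey)

    p = ersp.task.Parallel()
    p.addTask(wanderTask)
    p.addTask(surveyTask)
    # ...then start p (check the ERSP task API for the exact call)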
We've been issued a challenge. A week from today there is to be a race: one of the nodes in Orbit will become an access point, and each robot will attempt a time trial to locate the node as quickly as possible. Robot #1 will probably use the Nelder-Mead script completed today, but we have flexibility regarding the algorithms for the other two.
Another short-term goal: have obstacle-avoidance behavior run in the background while we are running our own Python scripts. In general, the Python MoveTo commands have timeout values, so if the robot is trying to reach an unreachable location (i.e., inside a pillar) it should give up after n seconds. This will affect our scanning algorithms.
6/21/07 - Day 14
Today has been one of our less fruitful days. Much time was spent working on algorithms for our challenge. I believe I've perfected the Nelder-Mead algorithm, except for the condition at which the robot decides it's time to stop searching. Under ideal conditions, the algorithm would cause the robot to move in constantly shrinking triangles around the center of the access point, but realistically, at some point the robot has to stop and decide it is close enough. As of now, that condition is that the difference in signal strength between the best and worst points is less than 1.0 dBm. Whether or not this is a good metric is something we might not know until we can make a trial run.
A second algorithm I'm working on computes the approximate gradient of the signal strength and moves in the opposite direction. The complete method has not been decided yet, but essentially the robot will move first in a triangle, taking measurements at each vertex. It will then interpolate the vertices and signal strengths to find the equation of a plane, from which the gradient can be computed. It will then move from the centroid of the interpolated triangle to a position in the direction opposite the gradient. A new triangle will be formed using the new point and the two (geometrically) nearest previously surveyed points, and the process will repeat. I don't yet know how well this will work, but I intend to try it out.
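The plane-fitting step itself is just linear algebra: given three surveyed points (x, y, s), where s is the measured signal strength, solve for the plane s = ax + by + c through them, and (a, b) is the approximate gradient. A standalone sketch with no robot calls involved (the helper name is mine):

    def plane_gradient(p1, p2, p3):
        """Each point is (x, y, s) with s the measured signal strength.
        Fits the plane s = a*x + b*y + c through the three points and
        returns the gradient (a, b), computed by Cramer's rule."""
        (x1, y1, s1), (x2, y2, s2), (x3, y3, s3) = p1, p2, p3
        # det == 0 means the three points are collinear: pick a new triangle.
        det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
        a = ((s2 - s1) * (y3 - y1) - (s3 - s1) * (y2 - y1)) / det
        b = ((x2 - x1) * (s3 - s1) - (x3 - x1) * (s2 - s1)) / det
        return a, b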
An important objective is to get obstacle avoidance working. One idea is as follows: if we could somehow create a behavior network that uses the IR sensors to avoid obstacles, we could call that behavior from within Python using os.system("behave obstacle_avoidance.xml &"). Another hopeful tactic might be to find a way to call a Python script (or make a general system call) from a behavior network. The fear is that if we run both a behavior network and a Python script at once, one will lock the other out of the RCM service.
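If the two services do turn out to coexist, the Python side could be as simple as the sketch below. Here obstacle_avoidance.xml is a behavior network we have yet to write, run_search() is a placeholder for whichever search script is active, and I'm using subprocess rather than os.system so the behavior can be cleanly killed afterward:

    import subprocess

    def run_search():
        pass  # placeholder for the Nelder-Mead or gradient search

    # Launch the obstacle-avoidance behavior network in the background.
    avoid = subprocess.Popen(["behave", "obstacle_avoidance.xml"])
    try:
        run_search()
    finally:
        avoid.terminate()  # stop the behavior network when we're done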
Once obstacle avoidance is working, an added challenge will be getting the robot to correct its (x,y) position when it cannot reach the locations requested by our algorithms. For example, if an algorithm tells the robot to move to the point (50,50), and that point is inside a pillar, the robot will move toward the pillar and hover around it until its MoveTo call times out (we can set the timeout interval). At that point, hopefully the robot will be able to tell us what its position is, because the algorithm will not know.
The C++ code continues to be a headache. We are now officially convinced that many of the upper-level API functions really demand that we code in C++, which is disheartening, as we've made very little progress getting the code to compile. I dispatched a support request to Evolution but have not heard back yet. We really ought to get a user ID and password for their support forums.
6/22/07 - Day 15
We've signed up for some time on the grid. Monday morning we'll come in a little early and test out some of our algorithms. We learned today how to image nodes on the grid and turn them on. The procedure is pretty simple:
    $ ssh uid@console.grid.orbit-lab.org
    $ imageNodes x1..x2,y1..y2 baseline.ndz
    $ wget -O - -q 'http://cmc:5012/cmc/on?x=x1&y=y1'
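For example (coordinates hypothetical), imaging the four nodes in the square from (1,1) to (2,2) and then powering on node (1,1) would be:

    $ imageNodes 1..2,1..2 baseline.ndz
    $ wget -O - -q 'http://cmc:5012/cmc/on?x=1&y=1'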