Today, magically, the teleop applet was working with camera support on Bender. When we clicked a destination for the robot, it moved in generally the right direction, but overshot the target and spent a long time colliding with the wall until we switched off the battery. This is disappointing, especially considering that the camera is supposed to be calibrated.

== 6/14/07 - Day 9 ==

As it stands, only Bender is fully operational. Artoo's chassis is looking good, but for some reason the robot still refuses to make landmarks. When in explore mode in Navtool, it moves in a peculiar pattern that made us wonder whether something was wrong with the left motor, making it sluggish; the robot tends to travel in a curved path. However, using a Python script (triangles.py), we found that the robot is quite capable of moving in a straight line. I suspect that the lack of landmarks is a result of this erratic movement rather than any disconnection between the software and the camera; I remember reading somewhere in the guide that it is difficult for the robot to create landmarks while moving along curved paths. (This is by no means certain.)

The thing keeping the Terminator from working for now is that the camera is unmounted. We need some epoxy to mount the camera, and then we will test its navigation system.

We learned how to calibrate the IR sensors today. The task involves measuring the position of each sensor with respect to the robot's coordinate system, which is defined as follows: the origin is located on the floor directly between the two back wheels and directly below their common axis. Positive X is the forward direction of the robot, positive Y is leftwards, and positive Z is up. Roll, pitch, and yaw are angles defined as rotations about the X, Y, and Z axes respectively. To calibrate the IR sensors, all we need to do is measure these six quantities (position and rotation) for each sensor and declare them in the file ''resources-config.xml''.

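As a sanity check on the convention, I sketched a small standalone program (our own throwaway code, not part of ERSP) showing how a reading taken in a sensor's own frame would be carried into the robot frame using those six numbers. The Rz(yaw) * Ry(pitch) * Rx(roll) rotation order is the usual one, but whether ERSP composes the angles in exactly that order is an assumption we still need to verify.

{{{
#!cpp
#include <cmath>
#include <cstdio>

struct SensorPose {
    double x, y, z;          // position of the sensor in the robot frame (metres)
    double roll, pitch, yaw; // rotations about the robot's X, Y, Z axes (radians)
};

// Map a point measured in the sensor's own frame into the robot frame.
// Rotation order Rz(yaw) * Ry(pitch) * Rx(roll) is the common convention;
// whether ERSP applies the angles in exactly this order is an assumption.
void sensorToRobot(const SensorPose& p, double sx, double sy, double sz,
                   double& rx, double& ry, double& rz)
{
    double cr = std::cos(p.roll),  sr  = std::sin(p.roll);
    double cp = std::cos(p.pitch), sp  = std::sin(p.pitch);
    double cy = std::cos(p.yaw),   syw = std::sin(p.yaw);

    rx = p.x + cy * cp * sx + (cy * sp * sr - syw * cr) * sy + (cy * sp * cr + syw * sr) * sz;
    ry = p.y + syw * cp * sx + (syw * sp * sr + cy * cr) * sy + (syw * sp * cr - cy * sr) * sz;
    rz = p.z - sp * sx + cp * sr * sy + cp * cr * sz;
}

int main()
{
    const double pi = 3.14159265358979;
    // Made-up numbers: a sensor 20 cm forward of the origin, 10 cm up,
    // yawed 30 degrees to the left (positive yaw rotates +X toward +Y).
    SensorPose ir = {0.20, 0.0, 0.10, 0.0, 0.0, 30.0 * pi / 180.0};
    double rx, ry, rz;
    sensorToRobot(ir, 0.50, 0.0, 0.0, rx, ry, rz);  // an object 0.5 m along the sensor's X axis
    std::printf("object at (%.3f, %.3f, %.3f) in the robot frame\n", rx, ry, rz);
    return 0;
}
}}}
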
We turned our full attention to the Tutorials Guide for the first time today. The first tutorial, 01-simple, is a program that makes the robot move a given distance forward, take a picture, and display the picture momentarily. We got the code to make the robot move and snap a picture, but the code creating the GUI picture display refused to compile. The compiler complains that the type ''!IImageWindow'' is undefined, although the !API indicates that it is a class in the Evolution namespace. We will investigate this more tomorrow.

It took quite a while before I felt like I understood this first bit of code. The programming is very high level, and many objects need to be created for apparently elusive reasons. For example, one method takes two arguments. Before these arguments can be passed to the method, they must be stored in a !TaskArgs object. To complicate things further, one of the arguments is a !DoubleArray. A !DoubleArray is an array of doubles, but stored in an object defined by Evolution. Rather than simply creating an array of doubles, we need to create an array of doubles and pass it as an argument to a new !DoubleArray object. This instance of !DoubleArray is then passed to !TaskArgs, which in turn gets passed to a new !TaskContextPtr, a sort of smart pointer containing the data needed to execute a task. To execute the method, we pass the result of the !TaskContextPtr's get() method to a task functor's run() method. The whole process seems a bit convoluted, but we're hopeful that we'll be able to accept it and use it.

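To keep the chain of wrappers straight, I put together a toy mock-up of the layering using stand-in classes of my own. Only the class names and the get()/run() hand-off mirror the real Evolution code; none of the constructors or signatures below should be taken as the actual !API.

{{{
#!cpp
// Toy mock-up of the call sequence described above, using stand-in classes.
// This is NOT the real ERSP API; it only mirrors the layering:
// double[] -> DoubleArray -> TaskArgs -> TaskContextPtr -> functor.run().
#include <cstdio>
#include <memory>
#include <vector>

struct DoubleArray {                       // Evolution-style wrapper around a plain double[]
    std::vector<double> values;
    DoubleArray(const double* data, int n) : values(data, data + n) {}
};

struct TaskArgs {                          // the bundle of arguments handed to a task
    DoubleArray array;
    explicit TaskArgs(const DoubleArray& a) : array(a) {}
};

struct TaskContext {                       // everything needed to execute the task
    TaskArgs args;
    explicit TaskContext(const TaskArgs& a) : args(a) {}
};
using TaskContextPtr = std::shared_ptr<TaskContext>;  // the "smart pointer" layer

struct MoveTaskFunctor {                   // stand-in for a task functor
    void run(TaskContext* context) {
        std::printf("run task: arg0=%.2f, arg1=%.2f\n",
                    context->args.array.values[0],
                    context->args.array.values[1]);
    }
};

int main() {
    double raw[2] = {100.0, 20.0};                        // plain array of doubles...
    DoubleArray double_array(raw, 2);                     // ...wrapped in a DoubleArray
    TaskArgs task_args(double_array);                     // ...stored in a TaskArgs
    TaskContextPtr context(new TaskContext(task_args));   // ...held by a TaskContextPtr
    MoveTaskFunctor move;
    move.run(context.get());                              // get() hands the raw pointer to run()
    return 0;
}
}}}

Writing it out this way made it clearer that each layer only adds packaging, not behavior, which makes the real !API feel less mysterious.
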
I read ahead a bit in the Tutorials Guide. Among the many features Evolution provides is an object which handles the processing of multiple tasks simultaneously. This will certainly come in handy when we write our own code.

We've gotten a few suggestions for code we should write, and I think we should get started on it soon. First off, I would very much like to see obstacle avoidance working with our code, as I feel it is in many ways fundamental if the robots are ever to be useful in Winlab. Once this is figured out, we should (on Ivan's suggestion) write some code that makes the robots move about essentially at random, except for obstacle avoidance. Additionally, I would like to make some robust and easy-to-use !MoveToCoordinate-type commands which, through SSH or whatever other means, will cause the robots to plan a smart path and travel to a desired location. With these goals accomplished reliably, we'll be able to start using the robots to study wireless communication.