Week 4
6/25/07 - Day 16
We set up a node to act as an access point, then tested our algorithms to see if they could find it. Unfortunately, they were not completely successful.
Problems with Vamsi's code:
- Initially, the code raised the error "local variable referenced before assignment." This has to do with local versus global variables: the parallel class may need to pass the necessary variables explicitly for things to work properly. As a temporary fix, all variables are now assigned inside the methods that need them, so the error no longer occurs (a minimal illustration of the problem follows this list).
- Ultimately, the robot roams around in a roughly 10 in x 10 in square near the correct location. Due to fluctuations in signal strength, this method does not get as close as we would like.
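
To illustrate the scoping issue (with hypothetical names, not the actual robot code): assigning to a name anywhere inside a Python function makes that name local to the function, so reading it before the assignment raises the "referenced before assignment" error even when a global of the same name exists. Passing the value in as a parameter, as the parallel class could do, sidesteps the problem.

# Hypothetical sketch of the scoping problem, not Vamsi's actual code.
signal_strength = -60          # module-level (global) value

def broken_update(delta):
    # Because of the assignment below, Python treats signal_strength as local
    # to this function; calling this raises UnboundLocalError
    # ("local variable 'signal_strength' referenced before assignment").
    signal_strength = signal_strength + delta
    return signal_strength

def fixed_update(signal_strength, delta):
    # Passing the value in explicitly avoids the local/global ambiguity.
    return signal_strength + delta

print(fixed_update(signal_strength, 5))   # prints -55
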
6/26/07 - Day 17
Today was spent tweaking our algorithms for the presentation tomorrow. We got a little bit of time on the grid, which we used to calibrate the distance the robot chooses to travel based on its measurements.
Updates have been slow lately because most of our time has been spent perfecting our algorithms. The presentation is tomorrow; we will describe the final methods in full detail then.
I'm not sure if I've done this before, but I will describe the challenge again. One of the nodes in Orbit is made into an access point. The robots connect to that access point and then, based solely on the strength of the signal they are receiving, attempt to travel towards it. For now we are neglecting obstacles, so avoidance is not an issue. Coding is done in Python.
One method I have ruled out is the Nelder-Mead algorithm, which is a popular choice in optimization but ill-suited for our particular purpose. The reason is that the method does not take into account the random fluctuations of our signal. One of the characteristics of the Nelder-Mead algorithm is that it perpetually remembers the "best" point it has ever found. Due to random fluctuations in Orbit, the "best" point located is often ranked best because of circumstance rather than proximity to the access point. I've attempted a variation of Nelder-Mead which "forgets" its best point after a number of iterations, but I was never able to make the method work reliably.
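
A small illustration of the problem (a made-up noisy objective, not real Orbit measurements or our robot code): minimizing with scipy's standard Nelder-Mead routine on a noisy function reports a "best" value that is usually optimistic, because the remembered best point is often just a sample where the noise happened to be low.

import numpy as np
from scipy.optimize import fmin

np.random.seed(0)
ap = np.array([3.0, 4.0])              # hypothetical access-point location

def noisy_distance(pos):
    # True distance to the AP plus Gaussian noise, standing in for a
    # noisy signal-strength-derived measurement.
    return np.linalg.norm(pos - ap) + np.random.normal(scale=0.5)

# fmin is scipy's Nelder-Mead simplex minimizer.
xopt, fopt, _, _, _ = fmin(noisy_distance, [0.0, 0.0], full_output=True, disp=False)
print("reported best value:", fopt)
print("true distance at that point:", np.linalg.norm(xopt - ap))
# fopt is the lowest noisy sample ever seen, so it tends to understate the
# true distance at xopt: the optimizer kept a lucky low-noise reading.
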
One thing that has become clear is that many measurements must be made to smooth out the noise.
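
In practice that just means taking several readings at each position and using their mean, so that a single unlucky sample carries less weight. A minimal sketch with hypothetical numbers, not real Orbit data:

import numpy as np

def sample_rssi():
    # Stand-in for reading signal strength from the wireless driver:
    # a "true" value of -60 dBm plus a few dB of measurement noise.
    return -60.0 + np.random.normal(scale=3.0)

# One reading can easily be several dB off; averaging n readings shrinks
# the error roughly by a factor of sqrt(n).
single = sample_rssi()
averaged = np.mean([sample_rssi() for _ in range(30)])
print("single reading:", single)
print("mean of 30 readings:", averaged)
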
Another somewhat unusual characteristic of the problem (unusual compared to most optimization problems)… TBC
6/27/07 - Day 18
Today was spent mostly on changing our existing code to improve our algorithms. As mentioned before, although our programs eventually reach a "good" distance from the access point, there are always fluctuations in signal strength that cause the program to behave inconsistently.
Ivan suggested using a statistical method to fit the data so that our programs will function more efficiently. The following week is going to be spent working on that.
6/28/07 - Day 19
Today was an educational day. We have been reviewing our statistics and reading up on signal processing. We ran an experiment and collected some data to use in working out our statistical algorithms. The experiment was as follows: we set up an access point out in the hallway and had a robot drive straight past it, never stopping but collecting signal strength and position data the whole way through. Two trials were run - in one the source was right at the beginning of the robot's path, and in the other the source was midway between the robot's start and end points.
We found an interesting fact in MadWifi's documentation - the scale we use to measure signal strength (dBm) was specifically chosen to be linear with distance. We went ahead and performed a least-squares regression, fitting a line to the data from the first test. The line fit well, but something is wrong with my code that calculates the RMS error - I will debug that.
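
For reference, here is a minimal version of that fit (the file name and column layout are hypothetical; our actual log format may differ). It fits the line with numpy's polyfit and computes the RMS of the residuals directly, which also makes a handy check against the buggy RMS calculation.

import numpy as np

# Hypothetical log: one row per sample, columns = distance (m), signal (dBm).
data = np.loadtxt('trial1.txt')
distance, signal = data[:, 0], data[:, 1]

# Least-squares line fit: signal ~ slope * distance + intercept.
slope, intercept = np.polyfit(distance, signal, 1)
predicted = np.polyval([slope, intercept], distance)

# RMS of the residuals tells us how well the line describes the data.
rms = np.sqrt(np.mean((signal - predicted) ** 2))
print("slope (dBm/m):", slope, "intercept (dBm):", intercept, "RMS (dBm):", rms)
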
Next, looking at the second test, we found a very distinct U-shaped pattern in signal strength versus distance. We estimated the location of the minimum by eye and fit two least-squares lines, one on each side. Rather than eyeball it, I thought we should write code that searches along the distance axis for the split point giving the lowest overall sum of squared residuals (summing the residuals from the two line fits). It takes a fair amount of computation (approx. 15 s) to evaluate the total residual for every possible split location, but in the end we see another distinct U-shaped curve with a minimum right about at the location of the access point.
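
A sketch of that search under the same assumptions (hypothetical data arrays, samples already ordered by distance, brute force over every sample index as the candidate split): fit one line to the points left of the split and another to the points on the right, and keep the split whose combined sum of squared residuals is smallest.

import numpy as np

def sse_of_line(x, y):
    # Sum of squared residuals of a least-squares line fit to (x, y).
    coeffs = np.polyfit(x, y, 1)
    residuals = y - np.polyval(coeffs, x)
    return np.sum(residuals ** 2)

def best_split(distance, signal):
    # Assumes samples are ordered by distance. Try every interior sample as
    # the split between the two lines and return the distance whose two-line
    # fit has the lowest total sum of squared residuals.
    best_d, best_sse = None, np.inf
    for i in range(2, len(distance) - 2):          # need >= 2 points per side
        total = sse_of_line(distance[:i], signal[:i]) + \
                sse_of_line(distance[i:], signal[i:])
        if total < best_sse:
            best_d, best_sse = distance[i], total
    return best_d

# Usage with the hypothetical trial-2 log:
# data = np.loadtxt('trial2.txt'); print(best_split(data[:, 0], data[:, 1]))
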
We downloaded the packages scipy, numpy, and ipython today. I believe they will prove very useful. Up until now, the numerical analysis has been performed in Octave or MATLAB, but if we can port it over to Python we can write robot scripts which perform numerical analysis in real time and respond to it.
I am currently reading about the Fourier transform in an attempt to understand filters. I understand scipy contains some built-in functions for filters and the FFT. Hopefully before long we'll at least be able to implement a filter, if not design an appropriate one of our own.
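
As a starting point, here is a minimal low-pass smoothing sketch using scipy's signal module. The noisy trace is synthetic and the cutoff is an arbitrary placeholder rather than a tuned value for our data.

import numpy as np
from scipy import signal

# Synthetic stand-in for a noisy signal-strength trace sampled along the path.
x = np.linspace(0, 10, 500)
noisy = -60 + 2 * np.sin(x) + np.random.normal(scale=1.5, size=x.size)

# 4th-order Butterworth low-pass filter; 0.05 is the cutoff as a fraction of
# the Nyquist frequency, chosen arbitrarily for illustration.
b, a = signal.butter(4, 0.05)
smoothed = signal.filtfilt(b, a, noisy)

print("std before filtering:", noisy.std())
print("std after filtering:", smoothed.std())
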
6/29/07 - Day 20
Scipy contains nice linear regression tools. The most valuable functions we've discovered so far are polyfit and polyval, which fit a polynomial to given data and evaluate the fitted curve, and leastsq, which minimizes a residuals function so you can curve-fit any shape you can write residuals for. An example found online shows how to use leastsq to fit a plane.
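
Here is a sketch of that idea, assuming hypothetical (x, y, z) sample points rather than the exact example we found online: define a residuals function for the plane z = a*x + b*y + c and let leastsq find the coefficients.

import numpy as np
from scipy.optimize import leastsq

# Hypothetical sample points scattered around the plane z = 2x - y + 3.
np.random.seed(1)
x = np.random.uniform(0, 10, 50)
y = np.random.uniform(0, 10, 50)
z = 2 * x - y + 3 + np.random.normal(scale=0.2, size=50)

def residuals(params, x, y, z):
    # Difference between the measured z and the plane a*x + b*y + c.
    a, b, c = params
    return z - (a * x + b * y + c)

# leastsq minimizes the sum of squared residuals, starting from a rough guess.
(a, b, c), _ = leastsq(residuals, x0=[1.0, 1.0, 0.0], args=(x, y, z))
print("fitted plane: z = %.2f*x + %.2f*y + %.2f" % (a, b, c))
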