Other teams are using flat table-tops or LEGO(R) base plates for their lunar surfaces, as is the custom in FLL and MoonBots (up to now, heh heh). We decided to propose a realistic landscape covered with sand, occasional rocks, and a bumpy High Ridge. We feel this will really help the public "connect" with the mission during our live demonstration.
However, once you move to a sandy, rocky surface, you find that you cannot trust your motors the way you can on a flat table-top. You tell a wheel to make four rotations, and you might find the robot only traveled the distance of three and a half rotations, or worse.
Our strategy for dealing with this is ultrasonic steering. We won't trust where our task items are supposed to be on the landscape; we will find them in real time. Besides making the mission achievable, this has a bonus feature: it makes the robot tolerant of errors. If a task item is a few inches away from where we humans thought it would be when we programmed the robot, the sensor-based steering corrects for that automatically!
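Here's a minimal sketch of the idea. (We're showing it in leJOS NXJ Java just for illustration; the sensor port, motor letters, and distances below are placeholders, not our actual program.) Instead of telling a wheel "go four rotations," the robot drives until the sensor says the item is within reach:

    import lejos.nxt.Motor;
    import lejos.nxt.SensorPort;
    import lejos.nxt.UltrasonicSensor;

    public class ApproachItem {
        // Stop when the item is this close (centimeters) -- illustrative value.
        static final int GRAB_DISTANCE_CM = 8;

        public static void main(String[] args) {
            UltrasonicSensor eye = new UltrasonicSensor(SensorPort.S1);
            Motor.B.setSpeed(200);
            Motor.C.setSpeed(200);
            Motor.B.forward();
            Motor.C.forward();
            // Drive by what the sensor says, not by counting rotations:
            // on sand, encoder counts overstate the distance traveled.
            while (eye.getDistance() > GRAB_DISTANCE_CM) {
                Thread.yield();
            }
            Motor.B.stop();
            Motor.C.stop();
        }
    }

The point is that no number in this program depends on the robot's wheels gripping the sand perfectly.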
To implement ultrasonic steering, we concluded we need two ultrasonic sensors: just as you can close your eyes and use your two ears to locate where a sound is coming from, the robot can use its two "ears" to locate a task item. We think this is the first time binaural steering ("binaural" is like "binocular," but for sound instead of light) has been successfully used in a LEGO robot competition.
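To give a flavor of how two readings turn into steering (again sketched in leJOS NXJ, with illustrative ports, motor letters, and gain values): whichever "ear" reports the shorter distance is the side the item is on, so the robot speeds up the opposite wheel to turn toward it, and keeps correcting until the item is within grabbing range:

    import lejos.nxt.Motor;
    import lejos.nxt.SensorPort;
    import lejos.nxt.UltrasonicSensor;

    public class BinauralSteer {
        static final int BASE_SPEED = 250;      // degrees/second
        static final int GAIN = 10;             // speed change per cm of difference
        static final int GRAB_DISTANCE_CM = 8;

        public static void main(String[] args) {
            UltrasonicSensor leftEar  = new UltrasonicSensor(SensorPort.S1);
            UltrasonicSensor rightEar = new UltrasonicSensor(SensorPort.S2);
            Motor.B.forward();  // left wheel
            Motor.C.forward();  // right wheel
            while (true) {
                // getDistance() is in cm; 255 means "no echo heard."
                int left = leftEar.getDistance();
                int right = rightEar.getDistance();
                if (Math.min(left, right) <= GRAB_DISTANCE_CM) break;
                // Positive error: the right "ear" is closer, so the item is
                // to the right. Speed up the left wheel to turn toward it.
                int error = left - right;
                Motor.B.setSpeed(clamp(BASE_SPEED + GAIN * error));
                Motor.C.setSpeed(clamp(BASE_SPEED - GAIN * error));
            }
            Motor.B.stop();
            Motor.C.stop();
        }

        static int clamp(int speed) {
            return Math.max(50, Math.min(500, speed));
        }
    }

One wrinkle with two ultrasonic sensors is that they can hear each other's pings, so in practice the readings may need to be taken alternately rather than both at once.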
Remember: our live webcast from the Museum of Flight is on Sunday, November 11th, 2012, at 2:30pm (Pacific).