- Archival Video
- Photos
- Source Code
- About Ultrasonic Steering
Archival Video
Our webcast on 2012-11-11 did not record. However, we simultaneously recorded with a handheld camera. We prefaced the live demo footage with some practice session footage from earlier, to help show parts that didn't work right in the live demo. See our final judging video.
Photos
Detailed pictures of our landscape and robot are here.
Source Code
We have uploaded all of our source code and are making it Open Source. We hope future teams will be able to advance our Ultrasonic Steering, and make a cool Mission Monitor or remote control application like the ones we enjoyed making.
* All source code (both the NXT-G programs for Inspiration and the monitoring and remote control apps)
* Just the USsteering.rbt My Block
About Ultrasonic Steering
Some people have asked about our Ultrasonic Steering (US), so here is how we did it...
As we experimented with the idea, we found that US only works well in certain situations. First, the sensors have a "cone of view" of only about 30 degrees, so the target item has to be roughly in front of the robot. US is thus for closing in on an object; the robot has to be pointed in the object's general direction first.
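To put numbers on "roughly in front": here is a quick sketch in Python (the 30-degree cone is our rough observation, not a published spec) showing how far off the robot's centerline a target can sit and still be inside the cone.

    import math

    CONE_DEG = 30  # our rough "cone of view" estimate, total angle

    # Half-width of the cone at a few ranges: how far off the robot's
    # centerline a target can be and still be detected.
    for range_cm in (30, 60, 100):
        half_width = range_cm * math.tan(math.radians(CONE_DEG / 2))
        print(f"at {range_cm} cm, target within ~{half_width:.0f} cm of centerline")

At 60 cm out, for example, the target can only be about 16 cm to either side of where the robot is pointing.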
Second, the two ultrasonic sensors have to be spread far apart: the wider the spacing, the bigger the difference between the two readings for the same aiming error. Inspiration's sensors are 30 cm apart, which meant that in order to fit into base at the mission start, we needed to have them spring-loaded with rubber bands.
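To see why the wide spacing helps: for a distant target, the difference between the two readings is roughly the sensor spacing times the sine of the angle to the target. A back-of-the-envelope sketch (assuming the NXT ultrasonic sensor's roughly 1 cm resolution):

    import math

    ANGLE_DEG = 5  # example: target sits 5 degrees off the robot's heading

    # Far-field approximation: reading difference ~ spacing * sin(angle)
    for spacing_cm in (10, 30):
        diff = spacing_cm * math.sin(math.radians(ANGLE_DEG))
        print(f"{spacing_cm} cm apart -> ~{diff:.1f} cm difference in readings")

At 10 cm apart, a 5-degree aiming error produces under 1 cm of difference, which vanishes into the sensor's resolution; at 30 cm apart it is a usable 2 to 3 cm signal.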
Third, the two sensors will interfere with each other if you are not careful. A signal from the left sensor that bounces off an object and hits the right sensor will give wacky values to both sensors. The problem is easier to see in this diagram:
On the left, the robot's sensors are confused because the signal bounces off the target at an angle. On the right, the signal tends to bounce directly back when it hits a round object, so each sensor receives its own signal back!
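If a round target is not an option, another common way to limit this kind of crosstalk is to ping the sensors one at a time, pausing long enough for each echo to fade before the other sensor fires. A rough Python sketch, with hypothetical ping_left_cm/ping_right_cm helpers standing in for the real sensor calls:

    import time

    # Hypothetical helpers standing in for the real sensor interface.
    def ping_left_cm():
        return 40  # stub value so this sketch runs on its own

    def ping_right_cm():
        return 35  # stub value so this sketch runs on its own

    def read_both_alternating():
        # Sound covers the sensor's ~250 cm maximum range and back in
        # about 15 ms, so a short pause lets each echo die out before
        # the other sensor fires.
        left = ping_left_cm()
        time.sleep(0.05)
        right = ping_right_cm()
        time.sleep(0.05)
        return left, right

    print(read_both_alternating())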
This is the My Block that does the actual steering:
Every time this My Block is called, it compares the values from the left and right ultrasonic sensors, then adjusts the motors to steer slightly toward whichever sensor reports the closer object. Below, the My Block is placed within a loop that repeats until the touch sensor is triggered.
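In conventional code, the same logic might look like the Python sketch below. The helper functions and the gain value are hypothetical stand-ins for the NXT sensor and motor calls; the actual behavior is in the NXT-G My Block linked above.

    BASE_POWER = 50  # assumed cruising power for both motors
    GAIN = 2         # assumed steering gain: power change per cm of difference

    # Hypothetical stand-ins for the real NXT sensor and motor calls.
    def left_distance_cm():
        return 40  # stub value so the sketch runs

    def right_distance_cm():
        return 35  # stub value so the sketch runs

    def touch_sensor_pressed():
        return True  # stub: pretend the touch sensor has fired

    def drive(left_power, right_power):
        print(f"motors: left={left_power}, right={right_power}")

    def us_steering():
        # One steering step: veer toward whichever side reports
        # the closer object.
        diff = left_distance_cm() - right_distance_cm()  # > 0: target nearer the right sensor
        drive(BASE_POWER + GAIN * diff, BASE_POWER - GAIN * diff)

    # As in the NXT-G program: repeat the steering step in a loop
    # until the touch sensor is triggered.
    while not touch_sensor_pressed():
        us_steering()
    drive(0, 0)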