Kybernetes is a test platform for autonomous navigation that I developed with the Robotics Society of UC Merced. Until recently I was the only person on the project, but our growing membership means Kybernetes will serve as an important resource for teaching new members electronics, sensor systems, localization, and mapping.
Please ignore the outburst of “ran out of power” in the video; I was jumping to conclusions after two hours of sleep and countless cups of coffee.
The Robotics Society placed second in the Robomagellan competition, getting within seven feet of the goal cone. We didn’t manage to complete the course for two major reasons: power problems and the vision system. The power problem reinforced an important lesson: electric motors generate a serious amount of electrical noise in single-supply systems. Thanks to the filtering in the switching regulator feeding the electronics, we avoided the classic issue where “gunning it” resets the computers. The major issue at the competition was that any time the motors generated a spike (speeding up or slowing down rapidly), the servos would jerk to one of their extremes before accepting control input again. The problem is evident at the beginning of the video, where the robot makes a sudden right turn as it starts. We had only been testing on paved areas on campus and the short, dry grass in the UCM quad. The tall, wet grass at Robogames was the first place we observed the issue, and we spent the next hour or so patching the servo power straight to the battery instead of through the main motor ESC’s battery eliminator circuit. We hadn’t done this earlier because the battery’s voltage exceeds the servos’ rated value (8.4 V battery, 7.2 V servos), but for the duration of the competition, at least, we didn’t blow up the servos.
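Our actual fix was on the hardware side, but a complementary software-side mitigation for this kind of glitch is to slew-limit the servo commands so a single bad update can’t snap the servo to an extreme. This is a hypothetical sketch, not code from Kybernetes; the function name, pulse-width units, and step size are all illustrative:

```python
def slew_limit(target_us, last_us, max_step_us=20):
    """Clamp the change in servo pulse width (microseconds) to at most
    max_step_us per control update, so a transient spike in the commanded
    value moves the servo gradually instead of jerking it to an extreme."""
    delta = target_us - last_us
    if delta > max_step_us:
        return last_us + max_step_us
    if delta < -max_step_us:
        return last_us - max_step_us
    return target_us
```

Called once per control loop iteration with the previous output as `last_us`, this turns a sudden commanded jump from 1500 µs to 2000 µs into a ramp of 20 µs steps. It wouldn’t have helped with noise injected directly on the servo power rail, which is why rewiring the power was the right fix here.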
Solving the power issue consumed the time I had allotted to calibrating our vision system for the conditions of the Robomagellan course; we fixed it right before we were called up for our first run of the day. I hadn’t even entered the course coordinates yet, so I ran the heading-lock test, hoping that Kybernetes would at least follow a straight course and put some distance between the starting point and the goal cone. Since the robot would not stop automatically (it was simply following whatever heading it was pointing at when the program started), I ended the run with our failsafe remote. We stopped around 40 meters from the goal cone, about half the distance to it from the starting cone.
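A heading-lock behavior like the one described above boils down to steering proportionally to the heading error, taking care to wrap the error so the robot always turns the short way around the compass. This is a minimal sketch under that assumption, not Kybernetes’ actual controller; the gain, limits, and sign convention are illustrative:

```python
def heading_error(target_deg, current_deg):
    """Smallest signed difference between two compass headings,
    normalized into (-180, 180] degrees."""
    err = (target_deg - current_deg) % 360.0
    if err > 180.0:
        err -= 360.0
    return err

def steering_command(target_deg, current_deg, kp=0.02, limit=1.0):
    """Proportional steering output clamped to [-limit, limit].
    Positive output means turn right (sign convention assumed)."""
    cmd = kp * heading_error(target_deg, current_deg)
    return max(-limit, min(limit, cmd))
```

The wrap-around step matters: at a current heading of 350° and a target of 10°, the error should be +20° (a slight right turn), not −340°.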
After walking and plotting a course for Kybernetes during the lull between rounds, I loaded the coordinates for our next run. When our time came, I put Kybernetes in the starting position, started the navigation program, and prayed. Camp called for the start of the run and off Kybernetes went, following the course I had plotted perfectly. It drove to the point where it had previously stopped, made a hard right toward the goal cone, and opened the throttle (well, the electronic equivalent of it, anyway). It’s hard to see in the video, but it hit the transition between the grass and the pavement pretty hard, tilting the front up about 45 degrees and sending it about four inches into the air. It landed on its wheels and continued all the way up to the cone, stopping seven feet short of it. Honestly, I was a little sad it didn’t continue, but since vision was disabled, it was instructed to stop once its predicted distance to the target fell within the error radius of the GPS.
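That stopping condition comes down to a great-circle distance check against the GPS error radius. Here is a hedged sketch of one common way to compute it, using the haversine formula; the error radius value is illustrative, and this is not Kybernetes’ actual implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points,
    using a mean Earth radius of 6371 km."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

def reached_goal(pos, goal, gps_error_m=3.0):
    """True once the computed distance to the goal falls inside the
    GPS error radius, i.e. we can no longer trust GPS to get closer."""
    return haversine_m(pos[0], pos[1], goal[0], goal[1]) <= gps_error_m
```

Stopping inside the error radius is the sensible behavior when vision is disabled: beyond that point, GPS alone can’t distinguish being seven feet from the cone from touching it, which is exactly where a camera is supposed to take over.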
In the end, there were things that could have been done better, but I went home that day knowing that Kybernetes had done everything it was instructed to do flawlessly. If you want to see the software that controls Kybernetes, check out the “kybernetes” repository on my GitHub.