Sunday, 18 October 2015

Autobots, roll out - Autonomous operation

I've been working on Optimus Pi for over three months now and I've finally gotten around to working on the autonomous challenges. For last week's Egham Raspberry Jam I had mounted the VL6180 range sensor on the front of the chassis so it made sense to start with the Proximity Alert challenge, which is probably the easiest of the autonomous challenges.

The first thing to do was to better integrate the VL6180 code into the software library (as opposed to hacking it in like I did last weekend). As there are going to be various sensors connected to the robot I created a 'Sensor' base class that contains all the operations common to every sensor (e.g. checking it's connected, turning it on and off), which will then be extended by each sensor type.
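As a rough sketch of what such a base class might look like (in Python here; the method names and the `enabled` flag are my own guesses, not the actual library API):

```python
import abc


class Sensor(abc.ABC):
    """Base class holding the operations common to every sensor."""

    def __init__(self):
        self._enabled = False

    @abc.abstractmethod
    def is_connected(self):
        """Return True if the sensor responds on the bus."""

    def enable(self):
        """Turn the sensor on; subclasses extend this to start hardware."""
        self._enabled = True

    def disable(self):
        """Turn the sensor off again."""
        self._enabled = False

    @property
    def enabled(self):
        return self._enabled
```

Each concrete sensor (like the VL6180 below) then only has to implement the hardware-specific parts.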

With the base APIs decided upon I created the VL6180 variant and copied in the code from my example, tidying it up a little and extending the public API on the Sensor class with a 'range' call to actually read in the value. I followed my normal approach of getting it working first, then making it 'nice'. So the first implementation made direct calls to the sensor; once I was happy that the sensor was operating as expected I moved the code into a background thread, allowing the stored range to be constantly updated whilst the sensor is enabled, hopefully without blocking the main thread.
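The background-thread pattern can be sketched like this. Everything here is an assumption for illustration: the real code talks to the VL6180 over I2C, so `_read_sensor` is just a stand-in, and the poll interval is a guess:

```python
import random
import threading
import time


class VL6180:
    """Range sensor wrapper; a background thread keeps the last reading fresh."""

    POLL_INTERVAL = 0.05  # seconds between reads - an assumed rate, not measured

    def __init__(self):
        self._range_mm = None
        self._lock = threading.Lock()
        self._running = False
        self._thread = None

    def _read_sensor(self):
        # Stand-in for the real single-shot I2C range read (0-255 mm).
        return random.randint(0, 255)

    def _poll(self):
        # Runs on the background thread while the sensor is enabled.
        while self._running:
            value = self._read_sensor()
            with self._lock:
                self._range_mm = value
            time.sleep(self.POLL_INTERVAL)

    def enable(self):
        self._running = True
        self._thread = threading.Thread(target=self._poll, daemon=True)
        self._thread.start()

    def disable(self):
        self._running = False
        if self._thread:
            self._thread.join()
            self._thread = None

    def range(self):
        """Return the most recent reading without blocking the caller."""
        with self._lock:
            return self._range_mm
```

The main thread just calls `range()` whenever it wants the latest value, and never waits on the I2C bus itself.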

Up until now the 'manual' control code has been in the 'main' function, so that had to be moved out. Back in my pre-planning blog post I have a white-board picture where the 'Brains' object has multiple 'automated' modules hanging off it, which the robot will switch between for the various challenges. As there needs to be a manual control module as well, I decided to name these modules a 'ThoughtProcess', of which only one will be active at a time (Optimus Pi is a very single-minded robot).

Now the ultimate aim is that the 'Brain' will have a list of supported ThoughtProcesses that can be dynamically switched between for each challenge, but for now I'm just looking at getting things working (worst case I can manually launch a specific binary for each challenge on the day). So once again the 'main' function is just hard-coded to launch the activity I want.
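A minimal sketch of that Brain/ThoughtProcess arrangement might look as follows (again in Python for illustration; the class and method names beyond 'Brain' and 'ThoughtProcess' are my own invention):

```python
class ThoughtProcess:
    """One mode of operation for the robot; only one is ever active."""

    def __init__(self):
        self.active = False

    def start(self):
        self.active = True

    def stop(self):
        self.active = False


class Brain:
    """Owns the registered ThoughtProcesses and switches between them."""

    def __init__(self):
        self._processes = {}
        self._current = None

    def register(self, name, process):
        self._processes[name] = process

    def select(self, name):
        # Stop whatever was running before starting the new process,
        # so exactly one ThoughtProcess is active at a time.
        if self._current is not None:
            self._current.stop()
        self._current = self._processes[name]
        self._current.start()
```

For now the 'main' function would just register and select one process directly; the dynamic switching can come later.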

With the 'ThoughtProcess' and 'Sensor' classes in place it only took a few minutes' work to create a simple 'ThoughtProcess_ProximityAlert' class that just drives the robot forwards, lowering the speed as it gets closer to the target until it finally stops. A few test runs to adjust the speed to run okay on the carpet and we end up with...
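The speed-ramping behaviour boils down to one small function. The distances and speeds below are made-up placeholders (the ones on the robot were tuned by hand on the carpet), but the shape is the same: full speed far away, a linear ramp down as the target approaches, and a dead stop inside the threshold:

```python
def proximity_speed(range_mm, stop_mm=100, full_mm=400, max_speed=1.0):
    """Map a range reading to a forward speed for the Proximity Alert run.

    stop_mm, full_mm and max_speed are illustrative values only.
    """
    if range_mm is None or range_mm <= stop_mm:
        return 0.0            # no reading yet, or close enough: stop
    if range_mm >= full_mm:
        return max_speed      # far away: drive at full speed
    # In between: scale speed linearly with remaining distance.
    return max_speed * (range_mm - stop_mm) / (full_mm - stop_mm)
```

The ThoughtProcess loop would then just read the sensor, feed the range through this function, and pass the result to the motors.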


Obviously this is just a very simple script with plenty of room for improvement, but it's already better than last year's entry, which didn't end up competing in any of the autonomous challenges. Things that could be improved include adding a calibration step so the robot can work out its minimum speed on the floor, using the SenseHAT to help keep the robot moving in a straight line, adding encoders to the motors to measure distance travelled, etc. Not to mention straightening the range sensor on the front of the robot (it's a little crooked). But for now I'm leaving it as is and moving on to the next autonomous challenge (line following perhaps?)

Leo
