Thursday, 29 October 2015

I'm spinning around - Line following

The second autonomous challenge I decided to focus on was the Line Follower challenge, using the Pololu QTR-8RC reflectance sensor I posted about earlier. As the sensor needs to be very close to the ground, it's not something I can leave mounted all the time (the obstacle course would not be kind to it), so I needed it to be easy to add and remove. Previously I had the QTR-8RC connected directly to the Raspberry Pi using 11 jumper leads, but that number of cables doesn't lend itself to quick attaching and detaching.

To try and make the sensor a little more self-contained I decided to use an Arduino as a buffer device, with the sensor connected to the Arduino via the 11 jumper leads and the Arduino connecting to the Raspberry Pi over I2C, in a similar manner to the motor driver.
The QTR-8RC sensor connected to an Arduino.
As I'm already successfully using an Adafruit Metro 328 for controlling the motors I decided to get a second one, but this time going with the 'Mini' variant to save space. Getting the QTR-8RC sensor up and running was fairly trivial, as it's just a case of grabbing the QTRSensors library written by Pololu and following the instructions. For communicating between the Arduino and the Raspberry Pi I copied the I2C code from my MotorDriver script.
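For anyone curious, the Arduino side boils down to something like the sketch below. I should stress this is a rough reconstruction rather than the actual code: the pin assignments, the slave address and the way the readings are packed into the I2C reply are all just illustrative choices.

```cpp
// Hypothetical sketch: read the QTR-8RC with Pololu's QTRSensors library and
// serve the latest readings to the Raspberry Pi as an I2C slave.
// Pin choices, the 0x12 address and the reply format are assumptions.
#include <Wire.h>
#include <QTRSensors.h>

#define NUM_SENSORS  8
#define TIMEOUT      2500   // microseconds to wait for the sensor line to decay
#define EMITTER_PIN  2      // LEDON pin on the QTR-8RC

QTRSensorsRC qtr((unsigned char[]) {3, 4, 5, 6, 7, 8, 9, 10},
                 NUM_SENSORS, TIMEOUT, EMITTER_PIN);
unsigned int sensorValues[NUM_SENSORS];

void onRequest()
{
  // Send the eight readings back as 16 bytes, high byte first.
  for (int i = 0; i < NUM_SENSORS; i++) {
    Wire.write((byte)(sensorValues[i] >> 8));
    Wire.write((byte)(sensorValues[i] & 0xFF));
  }
}

void setup()
{
  Wire.begin(0x12);          // join the bus as a slave
  Wire.onRequest(onRequest);
}

void loop()
{
  qtr.read(sensorValues);    // keep refreshing the readings
}
```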

For testing my Line Following capabilities I, obviously, needed to attach the sensor to the robot and, after digging around in my various boxes of supplies, discovered a couple of large bolts that would hold the sensor in the correct position. A few holes drilled later, plus a couple of blobs of blu-tac, and the sensor and Arduino were successfully mounted.


As the circuit is currently wired up on a breadboard I've actually mounted everything on the rear of the chassis, where there was still plenty of room. Obviously it's more traditional to have line-following sensors on the front of the robot, but it only takes a few tweaks in the code to make it drive backwards, and there's still plenty of time to move it before PiWars. In theory the best place for the sensor would be in the middle of the robot, as that is the point it turns around, but I don't currently have a good way of mounting it there.

With the hardware sorted it was time to turn my attention to the software. The first task was to create a new 'Sensor' class for talking to the QTR-8RC. As it is now controlled by the Arduino over I2C this was fairly trivial (SensorQTR8RC.h): I could just copy an already implemented I2C sensor and tweak the code.

Next was to add some actual logic connecting the sensor readings to the motors. Again this was mostly a case of copying an existing 'ThoughtProcess' and tweaking it to initialize the correct sensor and sit in a loop trying to keep the line in the centre of the sensor. So we have ourselves a working ThoughtProcess_LineFollower module, yes? Well, it turns out it wasn't quite that easy...
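In outline, the loop I was aiming for is a simple proportional correction, something along these lines (a sketch only: readLinePosition() and setMotorSpeeds() stand in for the real Sensor and MotorDriver calls, and the numbers are purely illustrative):

```cpp
// Minimal sketch of the intended control loop, not the real ThoughtProcess code.
void followLine(volatile bool &running)
{
    const int   centre    = 3500;   // middle of an eight-sensor array scaled 0-7000
    const int   baseSpeed = 40;     // percentage of full speed
    const float gain      = 0.01f;

    while (running) {
        int position = readLinePosition();    // 0 (far left) .. 7000 (far right)
        int error    = position - centre;     // signed distance from the centre
        int turn     = static_cast<int>(error * gain);

        // Steer back towards the line by speeding up one side and slowing the other.
        setMotorSpeeds(baseSpeed + turn, baseSpeed - turn);
    }
}
```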

Other than the usual coding issues (e.g. copying and pasting a one-second sleep, turning left when I should be turning right) a major problem was the Arduino resetting itself after a couple of seconds, causing the loss of calibration data and making the robot just see 'white' and drive in a straight line. My initial thought was that it was related to the known issue with how the I2C hardware in the Raspberry Pi deals with clock stretching (badly), so I spent a few hours switching over to the pigpio library's bit-banged I2C implementation, which correctly handles clock stretching.
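pigpio's software I2C runs over any two spare GPIO pins and takes its transactions as a small command sequence. A minimal read looks roughly like this (the GPIO numbers, device address and transfer size are just example values, not what's wired on Optimus Pi):

```cpp
// Rough sketch of reading a device over pigpio's bit-banged I2C, which
// honours clock stretching. Pins, address and byte counts are illustrative.
#include <pigpio.h>

int readSensor(char *buf)
{
    const unsigned SDA = 23, SCL = 24;   // spare GPIOs, not the hardware I2C bus

    if (gpioInitialise() < 0) return -1;
    if (bbI2COpen(SDA, SCL, 100000) != 0) return -1;

    // Command sequence: set address, start, write one command byte,
    // repeated start, read 16 bytes, stop, end.
    char cmd[] = {0x04, 0x12,
                  0x02, 0x07, 0x01, 0x01,
                  0x02, 0x06, 0x10,
                  0x03, 0x00};
    int count = bbI2CZip(SDA, cmd, sizeof(cmd), buf, 16);

    bbI2CClose(SDA);
    return count;   // bytes read, or a negative pigpio error code
}
```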

I moved the QTR-8RC sensor over to the newly created 'I2CExternal' class and started running some tests. Initial results were not encouraging, with the Arduino resetting after 20-30 requests, leaving me in the same situation as before. Whilst running further tests I noticed that the Arduino was now resetting whenever I checked if it existed, which seemed to be due to how I'd changed the 'I2C::exists()' API to work on both the 'internal' and 'external' I2C buses.

This, happily, was a 100% reproducible test case, so I started to review the I2C handling code running on the Arduino, using the time-honoured approach of sticking an early 'return' statement in the receiveData call to determine at what point the crash was occurring. I fairly quickly tracked the issue to the 'for' loop that attempts to determine which command was being requested: I was using 'sizeof(supportedI2Ccmd)', which returns the total size of the array in bytes, instead of 'sizeof(supportedI2Ccmd) / sizeof(supportedI2Ccmd[0])', which returns the actual number of entries in the array. That meant I was reading far off the end of the array and accessing memory I didn't 'own', apparently annoying the Arduino enough to restart.
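To illustrate with a cut-down example (the command table itself is made up; only the two sizeof expressions are taken from the real code):

```cpp
#include <stdint.h>

// Hypothetical command table with multi-byte entries.
struct I2CCommand { uint8_t id; uint8_t length; };
I2CCommand supportedI2Ccmd[] = { {1, 2}, {2, 16} };

void receiveData()
{
    // Buggy loop: sizeof(supportedI2Ccmd) is the array size in bytes (4 here),
    // not the number of entries (2), so it walks off the end of the array
    // into memory the sketch doesn't own.
    for (unsigned i = 0; i < sizeof(supportedI2Ccmd); i++) { /* ... */ }

    // Fixed loop: divide by the size of one entry to get the element count.
    for (unsigned i = 0; i < sizeof(supportedI2Ccmd) / sizeof(supportedI2Ccmd[0]); i++) { /* ... */ }
}
```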

So with that change made (and ported across to the Motor Driver Arduino code) I placed my robot back onto my test track (last year's course scaled down by 50%) and ended up with...



Now the robot doesn't get close to making its way around the course, but it is definitely trying to track and follow the line, which is a good starting point and something I can build and improve on over the next few weeks.

So with that in a 'working' state I'll be moving on to the next autonomous challenge, the Straight-line speed test.

Leo

Sunday, 18 October 2015

Autobots, roll out - Autonomous operation

I've been working on Optimus Pi for over three months now and I've finally gotten around to the autonomous challenges. For last week's Egham Raspberry Jam I had mounted the VL6180 range sensor on the front of the chassis, so it made sense to start with the Proximity Alert challenge, which is probably the easiest of the autonomous challenges.

The first thing to do was to better integrate the VL6180 code into the software library (as opposed to hacking it in like I did last weekend). As there are going to be various sensors connected to the robot I created a 'Sensor' base class that contains all the common operations supported by a sensor (e.g. checking it's connected, turning it on and off), which will then be extended by each sensor type.
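The base class itself is tiny, roughly along these lines (a sketch only; the real Sensor.h may name the operations differently):

```cpp
// Rough shape of the Sensor base class described above.
class Sensor {
public:
    virtual ~Sensor() {}

    virtual bool exists() = 0;   // is the sensor connected and responding?
    virtual bool enable() = 0;   // power up / start taking readings
    virtual void disable() = 0;  // stop taking readings
};
```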

With the base APIs decided upon I created the VL6180 variant and copied in the code from my example, tidying it up a little and extending the public API on the Sensor class with a 'range' call to actually read in the value. I followed my normal approach of getting it working first, then making it 'nice'. So the first implementation made direct calls to the sensor; then, once I was happy that it was operating as expected, I moved the code into a background thread, allowing the stored range to be constantly updated whilst the sensor is enabled, hopefully without blocking the main thread.
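The pattern looks roughly like this: enable() starts a worker thread that keeps polling the sensor and publishes the latest reading through an atomic, so range() never blocks the caller. This is a sketch rather than the real class; readRangeFromHardware() stands in for the actual register access and the 50ms poll interval is a guess.

```cpp
#include <atomic>
#include <chrono>
#include <cstdint>
#include <thread>

class SensorVL6180 : public Sensor {
public:
    bool exists() override;                 // e.g. check the model ID register

    bool enable() override {
        running = true;
        worker = std::thread([this] {
            while (running) {
                lastRange = readRangeFromHardware();
                std::this_thread::sleep_for(std::chrono::milliseconds(50));
            }
        });
        return true;
    }

    void disable() override {
        running = false;
        if (worker.joinable()) worker.join();
    }

    uint8_t range() const { return lastRange; }   // last reading, in millimetres

private:
    uint8_t readRangeFromHardware();              // the actual I2C access

    std::thread worker;
    std::atomic<bool> running{false};
    std::atomic<uint8_t> lastRange{255};
};
```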

Up until now the 'manual' control code has been in the 'main' function, so that had to be moved out. Back in my pre-planning blog posting I have a white-board picture where the 'Brains' object has multiple 'automated' modules hanging off it, which the robot will switch between for the various challenges. As there needs to be a manual control module as well, I decided to call each of these modules a 'ThoughtProcess', of which only one will be active at a time (Optimus Pi is a very single-minded robot).

Now the ultimate aim is that the 'Brain' will have a list of supported ThoughtProcesses that can be dynamically switched between for each challenge, but for now I'm just looking at getting things working (worst case, I can manually launch a specific binary for each challenge on the day). So once again the 'main' function is just hard-coded to launch the activity I want.

With the 'ThoughtProcess' and 'Sensor' classes in place it only took a few minutes' work to create a simple 'ThoughtProcess_ProximityAlert' class that just drives the robot forwards, lowering the speed as it gets closer to the target until it finally stops. A few test runs to adjust the speed to run okay on the carpet and we end up with...
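In outline the behaviour is something like the following; the distances, speeds and method names here are illustrative assumptions rather than the real class.

```cpp
// Very rough outline of ThoughtProcess_ProximityAlert's main loop.
void ThoughtProcess_ProximityAlert::run()
{
    rangeSensor->enable();

    while (active) {
        int range = rangeSensor->range();        // millimetres to the target

        if (range <= 100) {                      // close enough, stop
            motors->setSpeed(0, 0);
            break;
        }

        // Scale the speed down as the target approaches, keeping a minimum
        // that will still move the robot on carpet.
        int speed = std::max(15, std::min(50, range / 10));
        motors->setSpeed(speed, speed);

        std::this_thread::sleep_for(std::chrono::milliseconds(20));
    }

    rangeSensor->disable();
}
```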


Obviously this is just a very simple script with plenty of room for improvement, but it's already better than last year's entry, which didn't end up competing in any of the autonomous challenges. Things that could be improved include adding a calibration step so the robot can work out its minimum speed on the floor, using the SenseHAT to help keep the robot moving in a straight line, adding encoders to the motors to measure distance travelled, etc. Not to mention straightening the range sensor on the front of the robot (it's a little crooked). But for now I'm leaving it as is and moving on to the next autonomous challenge (line following, perhaps?).

Leo

Monday, 12 October 2015

It's show time! - 9th Egham Raspberry Jam

This weekend I took Optimus Pi for its first public outing at the 9th Egham Raspberry Jam. Overall it was a successful event, but not without a few problems...

I've been to most of the previous Egham Raspberry Jams, so I know that the event attracts a mix of people, ranging from those who have never used a Raspberry Pi before to those that have used it extensively. With that in mind I was fully expecting to hear the question 'PiWars? What's PiWars?', so, as well as taking along my robot, I went prepared with details of the PiWars website, the list of challenges and printouts of some of the more complicated challenges that visitors could read through.

My stand! Complete with robot, skittles and line following sensor demo.
As I'd gotten my range and line-following sensors working the previous weekend I wanted to be able to show them off. I mounted the range sensor on the front of the robot, quickly hacked the standalone test code into my main 'PiWars' code, and set it up so that (in theory) if there was an obstacle within 150mm of the front of the robot it would stop and refuse to drive forwards (of course, if you were travelling fast enough you could still crash). Originally I was querying the range sensor every time I processed input from the joystick, but this seemed to make the controls respond a little sluggishly (as they had to keep waiting for the sensor to respond), so on Sunday morning (a few hours before I had to leave) I quickly moved that out to a background thread that updated a global, atomic variable, something I completed much faster in C++ than I would have managed in Python. For anyone interested, these changes can be found over at my GitHub account on the EghamRaspberryJam branch.
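The hack itself amounts to little more than this (names and values illustrative, not the actual EghamRaspberryJam branch code):

```cpp
// A background thread keeps the global atomic up to date; the joystick
// handling refuses forward drive whenever anything is within 150mm.
#include <atomic>

std::atomic<int> g_rangeMm{255};   // updated by the sensor polling thread

void processJoystick(int left, int right)
{
    // Only block forward motion, so the robot can still reverse away.
    if (g_rangeMm < 150 && (left > 0 || right > 0)) {
        left = 0;
        right = 0;
    }
    setMotorSpeeds(left, right);   // hypothetical motor driver call
}
```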

Lots of wires and the range sensor attached on the front.
As both the range sensor and the Arduino motor driver communicate with the Raspberry Pi via I2C I needed a way to plug both devices in together, so I quickly soldered up a breakout board to allow both of them to be connected to the Raspberry Pi's I2C pins. I also added a logic level shifter to convert from the Raspberry Pi's 3.3V to the Arduino's 5V which, whilst not technically required in this setup, will allow me to add additional 5V sensors at a later date.

At the Jam the code worked as expected for about 10 minutes, after which the range sensor no longer seemed to respond, causing the motors to stop. So some investigation and fixes will need to be done later. The rest of the robot worked happily for the next hour or so... and then one of the wheels fell off... I blame the kid driving it at the time :)
A three wheeled robot!
The other sensor I got up and running last weekend was the QTR-8RC module that I'll be using for the line-following challenge. Now, I haven't yet worked out how to mount it on the robot, so I needed a different way to show it off. The module itself comprises eight individual sensors, which is conveniently also the width of the LED matrix on the Sense HAT. By taking some code from the example 'snake' program that comes with the Sense HAT I was able to set things up so that as you move the line sensor across a white page the lights on the Sense HAT go out to match the black electrical tape on the page.



The final prop I brought along was a wooden skittle set. It spent the first half of the Jam up on the table, but later I moved it onto the floor and some of the kids tried knocking the pins over with my three-wheeled robot. The best result was knocking over 2 pins (followed by a strike when they drove the robot into them), but most of the time the ball was rolling slowly enough that it just stopped when it reached the pins. I know this would have been slowed down a bit by the carpet tiles and the ball should move faster across hardboard, but it definitely looks like I need a little extra something to get the ball moving. Of course it was shortly after this that another wheel fell off!
Down to two wheels!
Over the three hours of the event I spoke to a lot of people about my robot, PiWars, and how the challenges encourage people to learn about robotics, sensors and writing code to connect them together. Several people said they may grab tickets to attend, others talked about how they could use this to encourage the children they teach, and one or two even talked about entering Pi Wars 2016 (if it happens, of course).

This Jam was extra special as there was a Show and Tell competition taking place, with prizes graciously donated by various Raspberry Pi-related companies for both under-16 and over-16 categories.
The 16 and over prizes, with the list of donors.
Each person attending the event was given two tokens to vote for their preferred under-16 and over-16 exhibits, and by the end of the event I had collected 7 on my table. Unfortunately that wasn't enough to win the prize, which went, I believe, to a robot arm with laser targeting (a table I didn't manage to get to myself). The under-16 prize went to a nice 3D-printed, articulated robot hand!
Albert talking through the under 16 prizes.
So it was another good Egham Raspberry Jam, which has highlighted a few issues with my robot that I can now work on improving, and hopefully I sent a few people away with some new ideas on what to do with their Raspberry Pis.


Tuesday, 6 October 2015

Stay on target - Sensors!

With my robot up and moving again under manual control I can turn my attention back to the autonomous challenges, something I didn't take part in during last year's competition.

With the release of the SenseHAT, as mentioned in a previous posting, I can just connect that up to the Raspberry Pi to get access to an accelerometer and gyroscope, which will, hopefully, help keep the robot moving in a straight line during the three-point turn and straight-line speed challenges. But for the proximity alert and line-following challenges I need something extra, and so for these I have chosen the VL6180 ToF Range Finder and Pololu QTR-8RC Reflectance Sensor Array modules.

The VL6180 sensor.
The VL6180 sensor communicates via I2C and supports an input voltage of 3.3V-5V so, after soldering on the pins, it can be connected directly to the Raspberry Pi. The module is provided with numerous resources, including the schematics, connection guide, Arduino library and, inside the Application Note, an almost complete C example of how to drive the sensor. So with a couple of tweaks to re-implement the I2C read/write functions using the pigpio APIs I have a test program (VL6180Test.cpp) that outputs the range every second.
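For reference, the register access maps onto pigpio something like this. The 0x29 address and 16-bit register indices come from the VL6180 documentation, but the rest is a sketch rather than a copy of VL6180Test.cpp.

```cpp
#include <pigpio.h>
#include <cstdint>

// Read a single byte from a 16-bit VL6180 register index.
static int readRegister(int handle, uint16_t reg, uint8_t *value)
{
    char index[2] = { (char)(reg >> 8), (char)(reg & 0xFF) };
    if (i2cWriteDevice(handle, index, 2) != 0) return -1;        // send register index
    return (i2cReadDevice(handle, (char *)value, 1) == 1) ? 0 : -1;
}

int main()
{
    if (gpioInitialise() < 0) return 1;

    int handle = i2cOpen(1, 0x29, 0);     // bus 1, VL6180 default address
    uint8_t id = 0;
    readRegister(handle, 0x0000, &id);    // IDENTIFICATION__MODEL_ID, should read 0xB4
    // ... the init sequence and range reads from the Application Note go here ...

    i2cClose(handle);
    gpioTerminate();
    return 0;
}
```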

Reading through the documentation supplied with sensors is always useful: in this case I found out that the sensor becomes unreliable when the distance drops below 10mm, something that wasn't explicitly noted in the description of the sensor. So when I mount this on my robot I need to make sure the sensor sits at least 10mm behind the front-most part of the robot, so the robot can get as close to the wall as possible without the reading dropping into that unreliable range.

QTR-8RC sensor (Front)
QTR-8RC sensor (back)
The QTR-8RC module contains IR LEDs to light up the surface being measured, plus eight individual sensors which, as there is a jumper to select 3.3V input, can be connected directly to the Raspberry Pi's GPIO pins. Again this module comes with instructions on how to connect it up, an Arduino library and various other documents, including the six steps required to take a reading from the sensor, so I set to work writing a program to implement them.
The sensor connected via an Adafruit T-cobbler.
As this module was going to be more tricky to get up and running I decided to concentrate on getting a reading from one sensor first, then extending the code to read from all eight. The first problem I ran into was not getting a good reading from the sensor, which initially looked to be caused by the IR LEDs not being turned on. Having checked the code to make sure I was enabling the correct pin, I decided to connect the LED_ON pin directly to the 3.3V pin to validate that the hardware was working as expected.

Using the camera on my mobile phone I could see that the IR LEDs weren't coming on, so it was out with the multimeter to check the connections. The Raspberry Pi was definitely outputting 3.3V, but the sensor had a strange reading of 1.2V across its VIN and GND pins... Something wasn't right there and, after a bit more poking around, I realized I hadn't actually connected up the ground on the breadboard! Luckily it was a harmless mistake and quickly rectified, with the IR LEDs immediately lighting up, so it was back to the code.

I still didn't seem to be getting valid results from the sensor, so it was time to read through the documentation again to see if I had missed anything. Aha! The recommended sensing distance was a mere 3mm, and my piece of test paper was much further away than that. Moving it closer, I started to see better results: my code was working!

From that point it was just a case of extending the code to support all 8 sensors and tidying it up a bit, and soon I had a new test program that outputs the state of all 8 sensors every second (QTR-8RCTest.cpp). For some reason the first result always comes out as 'fully white', but subsequent results are accurate, as tested by sticking some black tape on a white envelope and sliding it back and forth in front of the sensor.
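A single sensor read, following the six steps from the documentation, looks roughly like this with pigpio (a sketch assuming gpioInitialise() has already been called; the timeout and pin handling are example values rather than what's in QTR-8RCTest.cpp):

```cpp
#include <pigpio.h>

// Returns the decay time in microseconds: short = reflective (white surface),
// long = dark (black tape), up to the timeout if nothing reflects.
unsigned readOneSensor(unsigned sensorPin, unsigned ledOnPin)
{
    const unsigned timeout = 3000;          // microseconds

    gpioSetMode(ledOnPin, PI_OUTPUT);
    gpioWrite(ledOnPin, 1);                 // 1. turn on the IR LEDs

    gpioSetMode(sensorPin, PI_OUTPUT);      // 2. drive the sensor line high
    gpioWrite(sensorPin, 1);
    gpioDelay(10);                          // 3. wait at least 10us to charge

    gpioSetMode(sensorPin, PI_INPUT);       // 4. release the line

    unsigned start = gpioTick();            // 5. time how long the line takes to decay
    while (gpioRead(sensorPin) == 1 && (gpioTick() - start) < timeout)
        ;

    gpioWrite(ledOnPin, 0);                 // 6. turn the IR LEDs back off
    return gpioTick() - start;
}
```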

Now that I've got both sensor modules tested and working, the next task is to decide how to attach them to the robot. My initial plan was to simply mount them on the front, but since the range finder needs to be offset by at least 10mm, and the line-following sensor has to be very close to the ground, I'll have to come up with something slightly different.

Of course the more difficult task is to get the robot to actually use the sensors to drive itself around, something that I'm sure will keep me busy for the next week or two.

Leo


Saturday, 3 October 2015

It's a long way to - Python to C++

Well, that took a while... It's been almost three weeks since my last post, and in that time I've been slowly working on replicating the functionality of my 'MotorTest' Python script in a C++ program. Some of that time was spent learning about new features and rewriting the code to use them, and some of it was spent with an annoyingly long, drawn-out cold that meant some days I just didn't want to be on the computer.

The main reasons I gave for doing this were learning more about C++ and improving the performance of the resulting program. So, given that functionally my robot is exactly the same as it was last time, was it worth the effort?

Overall... yes. Whilst it's taken a while to complete, I've learnt a fair bit about C++11 and the newer classes and APIs available in it (even if I didn't end up using them all in my current code). A number of pieces of functionality I've used before (such as threads and mutexes) are now built into the standard C++11 library, along with a few additions. I've come across a few new libraries and features (using the evdev library for input and eventfd for thread message passing). I've also read various articles about thread safety, and hopefully my code is thread-safe...
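The eventfd idiom, for anyone who hasn't come across it, is pleasantly small. A toy example (not my actual code): one thread blocks on the descriptor, another writes to it to wake it up.

```cpp
#include <sys/eventfd.h>
#include <unistd.h>
#include <cstdint>
#include <iostream>
#include <thread>

int main()
{
    int efd = eventfd(0, 0);                 // counter starts at zero

    std::thread worker([efd] {
        uint64_t value = 0;
        read(efd, &value, sizeof(value));    // blocks until the counter is non-zero
        std::cout << "woken with value " << value << std::endl;
    });

    uint64_t one = 1;
    write(efd, &one, sizeof(one));           // signal the worker thread

    worker.join();
    close(efd);
    return 0;
}
```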

One of the fun things I learned was that the default C++ compiler in Raspbian doesn't actually support C++11... My initial development was on an Ubuntu machine, so I was happily using C++11-specific features until I had enough working to try running it on the Raspberry Pi... Luckily I'm not the first person to want to use C++11 on Raspbian, and I quickly found instructions for installing the C++ compiler from the Jessie repository, which is newer and supports the necessary features.

The Raspberry Pi A+ I have installed on the robot is a little slow at compiling C++, so for the most part I've been developing and testing the code on a Raspberry Pi 2B, only switching over to the A+ once I had it working. Of course this 'slowness' is one of the reasons I wanted to switch to C++, so how is the performance?

Running the 'MotorTest' Python script and monitoring it with 'top' I can see that it uses ~75% CPU time and 10% memory. Running the 'OptimusPi' binary I can see that it uses ~5% CPU time and 1.5% memory: a significant improvement that leaves me plenty of CPU for processing the input from sensors and other activities.

Speaking of which, some additional sensors did turn up last month, and I've been trying to ignore them until I got my robot moving again. With just a week until the Egham Raspberry Jam I'll have to see how many of them I can get up and running!

New stuff!

Of course there's still plenty of coding to do. I've taken a number of shortcuts to get manual control working (the input device hard-coded, the main loop just feeding input into the motors) which I'll need to tidy up and extend to reach my original target: being able to select different autonomous programs to run, auto-detecting the input devices, etc. The current state of my code can of course be seen by following the 'Github' link on the right.

My original target for this stage.
What I actually have!

Leo