Friday 11 December 2015

And the results are in - How did we do?

Last Saturday Pi Wars 2015 took place at the University of Cambridge Computer Laboratory, so I got up early (after a late night), packed up my things and headed on up, arriving about 9:20ish. Due to the large number of competitors we were split into two different 'workrooms' where we could set up our robots both before, and between, challenges.

Workroom A, everyone busy working on robots!
Optimus Pi powered up fine after being transported up from London so, after unpacking a few more items, I had a quick look around before the doors opened to check out the challenges. Most of them followed the same pattern: a red base with the course markings laid out in electrical tape, with the two challenges standing out the most being the line following and obstacle courses.

The new and improved obstacle course.

A big change from last year's black tape on white paper course!
With the courses reviewed I took the opportunity to visit the shops, picking up the latest CamJam EduKit, a Scroll pHAT and a replacement SenseHAT (I managed to damage one of mine!), before heading back up to the workroom to wait for my first challenge, doing a little tweaking and getting a few photos taken.

Optimus Pi and me.

Proximity Alert

First up was the proximity alert challenge, something that had gone reasonably okay in my testing at home, although I was a little concerned about the white painted wall, as 'white' surfaces had given me some odd results in testing.

On the first run the sensor decided not to work, and not only did Optimus Pi hit the wall, it proceeded to drive straight up the wall and flip over backwards. Unfortunately I had forgotten to start the camera, so there's no footage of this embarrassing attempt.

I turned the sensor 'off and on again' to reset it and did a quick practice run to make sure all was working well before attempt two. This went better (i.e. Optimus Pi actually slowed down) but it ended up approaching at an angle, so one wheel touched the wall. Another fault!

The final attempt didn't go much better, again approaching the wall at an angle so that one wheel touched before Optimus Pi stopped. A triple fault! Not the best of starts... Still, onwards.

Line following

I did not have much confidence in this event as I'd spent most of Friday evening trying to get my line following sensor working; the Arduino that was actually driving the sensor had suddenly started restarting and losing its calibration data whenever I read from it. Unable to track down the problem, I just changed the code to return the raw sensor readings and do some basic processing on the Raspberry Pi.

So with some trepidation I put Optimus Pi on the easiest-looking part of the course and we were off... for about three inches, before drifting to the left and stopping... I knew I had a slightly dodgy connection on one of the motors, so I gave it a prod in the right place and we were off again, going a few more inches before slowly curving off to the right and into the wall... A quick rescue and off to the next corner, where Optimus Pi again drifted off to the right and started doing circles trying to find the line, narrowly missing a cone in the process. A final rescue and off again, until once more it lost the line and went off course. At this point I admitted defeat, failing the second challenge in a row... This was not looking good.

Three point turn

The next challenge was the three point turn. Now, originally I had planned on using the various sensors in the Sense HAT to determine where I was, how far I'd turned, etc., but it had turned out to be too unreliable, with the first turn usually ending up at more than 90 degrees even though the rest were generally okay.

So during the previous week I had put together a variant that just used dead reckoning, running the motors at set powers for set times. With this program selected I put Optimus Pi in the starting box and set it off a-trundling. The first attempt wasn't too bad: Optimus Pi went a little too far forwards, didn't quite cross the left marker and drifted a bit on the way back, driving straight into a cone.

For the second attempt I positioned Optimus Pi towards the left of the starting box and set it off again; it crossed the correct lines but still ended up on the edge of the course.

For the final attempt I positioned Optimus Pi in the bottom left of the starting box; it still went slightly off the far end of the course but held a much straighter line on the return trip, ending up in almost exactly the same position it started in. An almost perfect run!
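For anyone wondering what 'dead reckoning' means in code, it really is as basic as it sounds. A minimal sketch (setMotorPower() stands in for my actual motor driver interface, and the powers and timings are purely illustrative):

#include <chrono>
#include <iostream>
#include <thread>

// Stand-in for the real motor driver call (I2C to the Arduino).
void setMotorPower(float left, float right) {
    std::cout << "Motors: " << left << " / " << right << "\n";
}

// Run the motors at set powers for a set time, then stop.
void driveFor(float left, float right, int milliseconds) {
    setMotorPower(left, right);
    std::this_thread::sleep_for(std::chrono::milliseconds(milliseconds));
    setMotorPower(0.0f, 0.0f);
}

int main() {
    driveFor( 0.5f,  0.5f, 2000); // Forwards to the turning area
    driveFor(-0.5f,  0.5f, 700);  // Spin left, across the left marker
    driveFor( 0.5f,  0.5f, 1000); // Forwards over the line
    // ...and so on for the remaining legs, before turning back
    // towards the start box and trundling home.
    return 0;
}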

Straight line speed test

This was another challenge where I had planned on using the Sense HAT to keep Optimus Pi moving in a straight line but, after being unable to get this working reliably over one metre (let alone the 7+ metres of the course), I ended up just driving down the course manually.

All three runs were fairly smooth, nothing ultra fast, but I was happy that I'd been consistent across them.

Pi Noon - Round 1

The last event before lunch was Pi Noon. I'd managed to catch one of the earlier rounds to find out what the event was about. Basically each robot had an inverted 'L' shaped metal wire attached to the front, with a balloon attached to the top and a pin on the front. The idea being that you had to protect your balloon whilst trying to pop your opponent's balloon, a task which required more skill and control than the sumo challenge of last year.

For this round I was drawn against Yasmin Bey's robot The Bomb Squad. I was feeling fairly confident as I'd noted The Bomb Squad having some control issues earlier during the three point turn. After a slight delay we were off, I angled Optimus Pi to keep my balloon away from the pin and started circling around the arena when The Bomb Squad suddenly turned, giving me a clear view of the balloon and I darted in and 'pop' I had won! Things were beginning to look up.



Lunch

I had over two hours until the next event and took this time to grab some food at the cafe, charge up Optimus Pi's batteries and watch a few of the other challenges. As my next event was the obstacle course I went and watched one of the other competitors going around it.

Obstacle course

I had brought along some larger wheels for the obstacle course, expecting similar hazards to last year, but I was unsure how they would deal with the marbles, and I hadn't had much practice with them since re-building Optimus Pi around the larger project box. So I decided to stick with my small wheels.

I immediately ran into problems on the double hump at the start of the course, taking several attempts to get up the first hump before beaching Optimus Pi on the top, all four wheels spinning freely in the air. One rescue later and it was on with the rest of the course: I approached the marbles at a slow and steady speed and barely noticed they were there. A quick run up the slope and the spinning circle of doom was before me. I waited for a gap, dashed forwards and onto the spinning area. My first attempt to exit failed so I waited for the circle to spin me around and tried again, this time just squeezing out.

With that out of the way the rest was easy, heading down the slope, up and over the see-saw and across the line! A reasonable time, I thought, but I had one penalty to add.

Skittles

The last of the challenges was Skittles. I'd run out of time to build anything fancy so I just settled for pushing the ball. We were allowed two practice runs; with my first I knocked over three pins, and with the second five pins. So, feeling fairly confident at this point, I moved onto the real attempts.

In attempt 1, with two balls, I only managed to knock down a single pin! In attempt 2 I knocked over four pins, and in attempt 3 I knocked over three, giving me a total of eight! So with six balls I had managed to match the number of pins downed during practice...

Pi Noon - Round 2

For the second round I was up against KEITH Evolution, a tracked, tank-like robot with many lights on the front. This round went on for much longer as we jostled for position, with a few bumps, and at one point, as my attention was focussed on the balloons and pins, I accidentally drove up one of its tracks!

A bit more back and forth and 'pop!' I had won! Optimus Pi was through to the quarter finals.

Pi Noon - Quarter finals

This round it was Optimus Pi vs D.O.G. (Dreally Orsome Grobot), with plenty more bumps, a couple of head-to-head collisions and 'Pop!' I was through to the semi-finals!

Pi Noon - Semi finals

Next up was Revenge of PyroBot, a rematch from last year's Pi Wars when we'd had a David vs. Goliath moment: my large, six-wheeled Pirate Ship vs the tiny PyroBot. This time Revenge of PyroBot was the larger and faster of the two robots, so I was not feeling hugely confident.

PiWars 2014 Sumo - BiggerTrak vs PyroBot

Revenge of PyroBot zoomed around the arena, I turned to meet it, we collided, I backed up, Revenge leaped forwards in front of me, the balloon brushing across my pin and 'Pop!' Somewhat surprisingly, Optimus Pi was through to the finals!

Pi Noon - Finals

It was finals time, Optimus Pi vs Triangula. Triangula was the first robot I had come up against that could move in any direction, thanks to its special wheels and a lot of maths.

We moved across the arena, bumped, backed off, circled around so we ended up in the opposite corner, came back towards the middle and 'Pop!' my balloon went, Triangula was triumphant!

Optimus Pi was runner-up, my best result of the day, and further than I had gotten in the Pi Wars 2014 sumo battles.

Awards

As expected I didn't win any of the other challenges of the day, but I got the runner-up prize for Pi Noon, and every team was presented with a Pi Zero! Which, as I was a team of one, meant it was all mine!
Prizes!
Whilst it's a bit hard to compare, the prizes felt bigger this year, with my Pi Noon runner-up prize containing a 4tronix Initio robot kit and Gecko, a Pimoroni Explorer pHAT, an AverageMan ProtoCam kit, a RyanTek debug clip, a MyPiFi LED board, a few other small items, a copy of 'Practical Raspberry Pi Projects' (which, rather amusingly, has one of my earliest RPi projects on the front cover: my Raspberry Pi controlled BigTrak!) and of course the Pi Zero mentioned earlier.

The final results have now been published on the PiWars.org website and Optimus Pi finished 7th overall in the A4-and-under category, somewhat higher than I was expecting after flunking the first two challenges!

So all in all a good day out, some nice prizes, and plenty of soldering for me to do over the next week or so.

With all the awards handed out it was back to the Obstacle course for a final, all robots, photo shoot.

So many robots!

Next year?

Will there be a Pi Wars 2016? Well, we don't yet know, but that didn't stop me starting to think of a new robot design to address some of the issues I had this year (mostly around the unreliable line sensor), and the following morning I came up with a cunning idea for the proximity alert challenge (which is probably not against the rules...).

Final items

Before I can draw a line under this year's Pi Wars there are a few more things to do: I have some more code to finish pushing up to my GitHub page (the final changes made before Pi Wars), a list of hardware to put together, and some final tweaks to, and the release of, the 3D printed parts I used.

I'd also like to have another go at the line following sensor, to see if I can track down the issue before moving on.

One other item I enabled at the last minute was an onboard camera, which was used to record most of the challenges, as well as the prize-giving at the end. So the day, as seen from the P.O.V. of Optimus Pi, is below for your viewing pleasure (it mostly contains shoes, lots of shoes).




Leo

Tuesday 1 December 2015

Tea, Earl Grey, Hot - Replicating components

In my last blog post I listed a number of items I wanted to complete over the weekend and, well, a few issues cropped up that took a chunk of time. My Sense HAT suddenly decided it didn't like my OLED display: both worked fine when connected by themselves to the RPi, but when both were connected the I2C bus started reporting nonsense. Luckily I have a spare which doesn't have that problem, although I've ordered another OLED just in case that suddenly needs replacing too!

Most of my attention was spent on getting the line follower sensor re-mounted on the robot, as I no longer have the plywood base to connect it to, so I needed to create a new mount to hold it in place. Those of you who follow my Google+ account will be aware that I purchased a second-hand 3D printer, and it's now time to create custom parts from nothing (well, some filament).

A RepRapPro Mendel Mono

Now a 3D printer is a long way from a Star Trek Replicator: for a start it's not voice controlled, designing new components is a bit more complicated than pressing a few buttons, and it's far from instant at actually creating your new component. They also require a bit more maintenance: Sunday morning the PLA decided it didn't want to stick to the base, and Monday evening the gear that feeds in the filament decided to unscrew itself. Luckily each issue only took an hour or so to get sorted (once Pi Wars is complete I'll be spending more time getting the 3D printer set up, including printing out some spare parts for itself!).

I've been using Autodesk Fusion 360 to design my components (it's free for hobbyists) and after a bit of practice I can put together a basic (i.e. rectangular) shape in a few minutes. Once the shape is complete I use the '3D print' option to export a .stl file, which I then import into another program to 'slice' it into G-Code ready to be fed into the printer (I'm currently using the RepRapPro Slicer software). It's recommended that the resulting G-Code is copied onto the SD card built into the printer for maximum reliability, which is another step to complete. (Once I have more free time I'm sure I can streamline this a little.)


To actually control the 3D printer I have a Raspberry Pi connected to it via USB, set up to run VNC. Connecting to this over WiFi I can launch the control software, select the file from the printer's SD card and set it off printing. Depending on the size and complexity of the print this can take a few minutes, or a few hours, so I mostly alternated between printing the line follower mount and the 'wire' mount for Pi Noon, tweaking the design of one whilst the other was printing.

The 'wire' mount took about three iterations, the first being too long, the second not strong enough, with the final one needing just a little bit of tweaking to match up the screw holes (for this I used a drill to widen the holes, as that was much quicker than printing off another component).

Final design is on the right.

The Pi Noon attachment mounted. I didn't have a coat hanger to test it with, so used a screw instead.
The line follower mount is a little more complicated as it needs to hold the sensor securely in place, as well as allow it to be mounted to the robot itself. To avoid trying to find a stand-off or two to hold the sensor at the correct distance above the ground I realized I could simply 'print' arms to hold the sensor in place. Simple, right?

Well, maybe not quite that simple: having items at different heights that 'float' in the air doesn't work unless they're supported. Luckily the RepRapPro Slicer software will automatically add supports to the model, but they can be a little tricky to trim off afterwards. My first attempt had the screw mounts on the outside of the sensor, but this turned out to make the sensor a little too wide to easily connect to the robot.

Attempt one

On the second attempt I flipped them over to be above the sensor, making it much more compact, but requiring careful trimming to remove the 'support' material.

Action shot!
Attempt two

Due to some of the issues mentioned above with the 3D printer the line follower mount didn't finish until late last night, so I've not had a chance to connect it to the robot yet and, with it being work's Christmas party tonight, it won't get done until tomorrow. However, whilst it was printing I did move the Arduino onto a Perma-Proto PCB ready to be connected up (I may just use blu-tac for this, my robot has a lot of screw holes in it already!).

So with less than four days left to go things are sorta coming together, with a bit of a question mark around the Skittles challenge!

Leo


Saturday 28 November 2015

It's the final countdown! - One week to go

This time next week I'll be in Cambridge participating in PiWars (assuming no more magic smoke escapes from my robot) and I have a busy weekend planned to try and pull everything together. I've been busy soldering this morning and will be doing more this afternoon, after visiting Maplins to get supplies!

The main focus today is to get my line sensor mounted on the newer chassis, as the previous method is no longer viable. This is going to mean moving the Arduino that controls it off the breadboard and onto something more permanent. I did consider moving the sensor onto the same Arduino that drives the motors, but I just don't have enough pins spare, and it would be a little fiddly to connect up.

After that I need to work out a mechanism for holding the 'wire' for the Pi Noon challenge, which is slightly trickier now that the motors and wheels are mounted forwards of the chassis, and then tweaking the code for the autonomous challenge.

Hopefully that will then leave tomorrow for sorting out a device to help with the Skittles challenge. I have some plans but have not had time to actually build anything! Worst case I'll just have to give the ball a good push.

There are a few other things I want to work on, but as they are extras they can wait until later... As for other deadlines, well, the 'blogging' challenge closes today (in about 30 minutes!) to ensure there's enough time for all the blogs to be judged before Saturday. Obviously that doesn't mean I'll stop blogging: I should have at least one post before the event that (hopefully) shows OptimusPi fully functional, and I'll have a post-PiWars update as well.

So, with the clock ticking, it's back to the soldering!

Leo

Thursday 26 November 2015

Soldiering on - Soldering on

One of the issues I've had with previous robots (and other contraptions) is managing to plug cables in the wrong way round, something that's especially easy to do when using jumper leads, as they are typically not joined together and can get mixed up in oh so many ways (not to mention the fact there are 40 pins on the Raspberry Pi to actually connect them to).

To try and avoid this happening I try to add dedicated connectors for each component, preferably of the kind that can only be plugged in one way. So last night I grabbed an Adafruit Perma-Proto Pi HAT board out of the box it's been living in for the past few months and began to prod it with a hot stick (well, a soldering iron).


With such small components it can be difficult to rectify an error, so it's best to take some time to work out what needs connecting up, and where. For OptimusPi pretty much everything connects via I2C, with some components using the regular I2C pins and the rest using GPIO pins 5 and 6. However I also need to have the SenseHAT connected, and it needs to be on top so I can get at the joystick and use the LED matrix. Due to the way the pins connect up on the Adafruit board, that left just the one side I could easily bring connections out to.

After moving things around, trying out different connections and seeing how things looked when connected to the RPi itself, I decided to have the 'external' I2C connector in the bottom right corner (above the AV connector) and the OLED in the middle (covering the HDMI port, but I don't plan on using that), which leaves room in the bottom left corner for future expansion (maybe my RTC will end up there).

Constantly checking the connections to make sure I didn't wire things up backwards, I added the necessary wires to connect the GPIO pins to the correct locations on the lower half of the board and, before plugging it in, used a multimeter to double-check everything was connected as expected.


As I didn't have too many right-angled connectors in my box of bits I had to modify one of the 'straight up' connectors by lowering the pins (heating them with the soldering iron and pulling them through the plastic) and bending them with a pair of pliers, so adding the I2C connector took a little longer than it needed to. However everything was finally connected, so it was time to apply the power.


A couple of checks using 'i2cdetect' to make sure the OLED and SenseHAT were being correctly detected, then I fired up the OptimusPi binary and was happy to see the OLED working. Of course, with it mounted on its side, I have to go and tweak the code so the joystick works as expected. But where does that other wire go?

Well, one of the other components I picked up recently was the Grove I2C hub, which I've mounted underneath the Raspberry Pi (I may move this as, whilst it's nicely hidden, it's a bit of a pain to reach). This replaces the home-made I2C 'expander' I'd been using previously, with the added benefit of ensuring you can't connect the cables backwards.


With the Raspberry Pi all connected up I turned my attention to the Arduino controlling the motors. This time it was an Adafruit ProtoShield I soldered up, and so far I've only soldered in the I2C connection. I actually need to go back and revisit this with a proper connector (not sure why I didn't... it was getting late!) but for now this makes it a little easier to wire everything up.


I have a few other items I want to connect up to the Arduino, if I have time that is, but for now I need to make sure all the basics are working again before adding extras.

As a reminder of how little time there is left my PiWars timetable turned up today, and about the only challenge I'm ready for is the Obstacle course!

Leo

Tuesday 24 November 2015

'Tis but a scratch! - Up and running again

Well, after carefully checking all the components everything seems to be working, apart from possibly the UBEC that was powering the Raspberry Pi. There's no sign of burnt cables, or discolouration on the Arduino or motor driver, and after re-applying power OptimusPi happily trundles around.


There's still the issue of the motors not being mounted level, and I need to swap in a replacement UBEC (I'm currently using an Adafruit PowerBoost board to power the Raspberry Pi), but at least I don't have to order any replacement parts!

Speaking of which, the NFC reader I ordered back in September has finally arrived... Unfortunately it's probably a little too late for me to do anything useful with it.


So with this minor panic over it's back to the more general 'running out of time' panic! I've got soldering to do over the next few evenings, the motor mounts to sort out, the line following sensor to connect to this new chassis, code to improve and various other things!

Leo

Puff the magic dragon - uh-oh I see smoke...

So this blog post was supposed to be about how I decided I needed a bit more room inside my robot, how I switched to a larger box, mounted the motors on it, added cabling so the motors can be easily swapped out, added a power switch and generally tidied things up a bit.

So many wires
Unfortunately, when I was setting OptimusPi up last night to take photos, I suddenly noticed smoke escaping from the inside of the box... Possibly the mythical magic smoke that makes electronics work?

Quickly cutting the power (yay for the new power switch) I popped the lid off and removed the battery to check for damage. An initial glance didn't show any burnt wiring or other damage, which implies the damage was done to the Arduino which is hidden beneath the motor driver.

I've checked the Raspberry Pi and the OLED, which both seem to be working happily, but it was getting late and I didn't have time to check the state of the Arduino, so that's something I'll have to do tonight. Hopefully it's just the Arduino that got charred, and not the motor driver, as I have a spare Arduino I can replace it with (although not a nice, shiny, black Adafruit one). But with only 11 days to go until PiWars I have a lot of work to do.

But back to the updated robot. With the motors mounted to the base I, technically, don't need the plywood base I was using previously, however without it the robot doesn't look all that pretty. I did have plans for the top, but I'm not sure I'll have time to get anything extra done now.


One other problem I ran into is that I managed to end up with motors that aren't mounted level, so the robot has a bit of a wobble to it. It's fixable (it just takes more time) but I need a working robot first; here's hoping nothing is too badly damaged.

Leo

Saturday 14 November 2015

Job's done - Code complete-ish

Today, at noon, was the deadline for the Code Quality challenge and therefore an obvious target to have my code functionally complete (The competitors are still allowed to change code up to the main PiWars event itself). So how did I do?

To keep track of the code I submitted today I created a tag in Git called PiWars2015-CodeChallenge so I can refer back to it later (if needed), and the stats of that code snapshot are:

http://cloc.sourceforge.net v 1.60  T=0.17 s (297.2 files/s, 29810.7 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
C++                             20            583            594           1700
C/C++ Header                    21            258            422            564
Arduino Sketch                   6            131            221            340
Python                           2             44             55            162
CMake                            2             12              9             21
-------------------------------------------------------------------------------
SUM:                            51           1028           1301           2787
-------------------------------------------------------------------------------

So that's 51 files created and almost 2800 lines of code written... But is it complete?

In the past few weeks I've added 'ThoughtProcesses' for the StraightLine and ThreePointTurn challenges. These both make use of the Sense HAT's array of sensors to determine the current heading of the robot, using it to try and maintain driving in a straight line, and to perform the turns for the three point turn. As can be seen in the following videos it doesn't always work quite right... I've tried re-calibrating the sensors a couple of times, but it looks like I need to try once more, not to mention changing the 'dead-reckoning' times to something a little more intelligent.


So, in theory, I can now enter all the autonomous challenges, although not with a huge chance of success as the code currently stands.
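For anyone curious, stripped of the class scaffolding the heading-based correction is essentially a proportional adjustment like this (a sketch: readHeadingDegrees() stands in for the Sense HAT/IMU code, and the gain is made up):

#include <iostream>

// Stand-ins for the real IMU and motor driver calls.
float readHeadingDegrees() { return 2.5f; } // Pretend we've drifted right
void setMotorPower(float left, float right) {
    std::cout << "Motors: " << left << " / " << right << "\n";
}

// Wrap an angle difference into the -180 to 180 range.
float wrapDegrees(float angle) {
    while (angle > 180.0f)  angle -= 360.0f;
    while (angle < -180.0f) angle += 360.0f;
    return angle;
}

// Nudge the motor powers to pull the robot back onto targetHeading.
void holdHeading(float targetHeading, float basePower) {
    const float gain = 0.01f; // Proportional gain, found by trial and error
    float error = wrapDegrees(targetHeading - readHeadingDegrees());
    setMotorPower(basePower + gain * error, basePower - gain * error);
}

int main() {
    holdHeading(0.0f, 0.5f); // In the real code this runs in a loop
    return 0;
}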

During testing I've been running one challenge at a time, editing the code and rebuilding it when switching between them. Obviously this is something I'd rather not have to do on the day, so I need a way of selecting between the various tasks. I had previously purchased an OLED display, but had been avoiding playing with it until I had the other functionality out of the way. Most of the example code for driving the OLED was for the Arduino, so I spent some time hunting for a library that I could use from C/C++.

After wasting a bit of time searching for the wrong driver (SSD1305 instead of SSD1306) I eventually selected the ArduiPi_OLED library to drive the display. The library does come with a few disadvantages: I now have to run my binary with 'sudo', the binary contains two libraries that try to access the GPIO pins and I2C bus (so far no obvious conflicts) and I can't have all my devices hanging off the same bus (which means more cables on the robot). However it has the advantage of getting things up and running now, which is useful when you have a deadline!

With the display working I added a 'Menu' class, updated the main PiWars class to create and display a menu, and updated all the ThoughtProcesses so they can be run in a background thread without blocking the main thread. To navigate the menus I'm making use of the joystick on the Sense HAT, as that will always be connected (the PS3 joystick may not be) and is simple to use. The results of all this work can be seen below.


With the menu system I can select all the supported functionality, as well as shut down the robot correctly, without yanking the power out and risking file system corruption.
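The shape of that background-thread change looks roughly like the following. This is a cut-down sketch rather than my actual classes: the real ThoughtProcesses talk to the sensors and motors, and the menu calls start() and stop() in response to the joystick.

#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

// A cut-down ThoughtProcess: run() loops on a background thread
// until the menu asks it to stop.
class ThoughtProcess {
public:
    void start() {
        running = true;
        worker = std::thread([this] { run(); });
    }
    void stop() {
        running = false;
        if (worker.joinable()) worker.join();
    }
private:
    void run() {
        while (running) {
            // Real processes read sensors and drive motors here
            std::cout << "thinking...\n";
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }
    }
    std::atomic<bool> running{false};
    std::thread worker;
};

int main() {
    ThoughtProcess process;
    process.start(); // Main thread stays free for the menu and joystick
    std::this_thread::sleep_for(std::chrono::seconds(1));
    process.stop();
    return 0;
}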

So with three weeks to go, what's left to do? There are a couple of nice-to-have items that are just not going to make it (my RFID reader has been on back order for two months now; it may arrive before December but there's no time to add it) so I'll be spending the rest of the time tweaking my code, practising driving and making the robot pretty, not necessarily in that order!

Leo

Thursday 29 October 2015

I'm spinning around - Line following

The second autonomous challenge I decided to focus on was the Line Follower challenge, using the Pololu QTR-8RC reflectance sensor I posted about earlier. As the sensor needs to be very close to the ground, it's not something I can leave mounted all the time (the obstacle course would not be kind to it) so I needed it to be easy to add and remove. Previously I had the QTR-8RC connected directly to the Raspberry Pi using 11 jumper leads, but that number of cables doesn't lend itself to quick attaching and/or detaching.

To try and make the sensor a little more self-contained I decided to use an Arduino as a buffer device, with the sensor connected to the Arduino via the 11 jumper leads, and the Arduino connecting to the Raspberry Pi over I2C, in a similar manner to the motor driver.
The QTR-8RC sensor connected to an Arduino.
As I'm already successfully using an Adafruit Metro 328 for controlling the motors I decided to get a second one, this time going with the 'Mini' variant to save space. Getting the QTR-8RC sensor up and running was fairly trivial, as it's just a case of grabbing the QTRSensors library written by Pololu and following the instructions. For communicating between the Arduino and the Raspberry Pi I copied the I2C code from my MotorDriver script.
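The Arduino side ends up pleasantly small. Below is a simplified sketch rather than my exact code: the sensor pins, the LEDON pin and the 0x12 slave address are placeholders, and the calibration handling is left out.

// Read the QTR-8RC via Pololu's QTRSensors library and serve the
// readings to the Raspberry Pi as an I2C slave.
#include <Wire.h>
#include <QTRSensors.h>

#define NUM_SENSORS 8
#define I2C_ADDRESS 0x12 // Placeholder slave address

QTRSensorsRC qtrrc((unsigned char[]) {2, 3, 4, 5, 6, 7, 8, 9},
                   NUM_SENSORS, 2500 /* timeout, us */, 10 /* LEDON pin */);
unsigned int sensorValues[NUM_SENSORS];

// Send the latest readings when the Raspberry Pi asks for them.
void sendReadings() {
  for (int i = 0; i < NUM_SENSORS; i++) {
    Wire.write(sensorValues[i] >> 8);   // High byte
    Wire.write(sensorValues[i] & 0xFF); // Low byte
  }
}

void setup() {
  Wire.begin(I2C_ADDRESS);      // Join the bus as a slave
  Wire.onRequest(sendReadings);
}

void loop() {
  qtrrc.read(sensorValues);     // Raw decay times, one per sensor
}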

For testing my line following capabilities I, obviously, needed to attach the sensor to the robot and, after digging around in my various boxes of supplies, discovered a couple of large bolts that would hold the sensor in the correct position. A few drilled holes later, plus a couple of blobs of blu-tac, and the sensor and Arduino were successfully mounted.


As the circuit is currently wired up on a breadboard I've actually mounted everything on the rear of the chassis, where there was still plenty of room. Obviously it's more traditional to have line following sensors on the front of the robot, but it only takes a few tweaks in the code to make it drive backwards, and there's still plenty of time to move it before PiWars. In theory the best place for the sensor would be in the middle of the robot, as that is the point around which it turns, but I don't currently have a good way of mounting it there.

With the hardware sorted it was time to turn my attention to the software. The first task was to create a new 'Sensor' class for talking to the QTR-8RC sensor. As this was now controlled by the Arduino via I2C, it was a fairly trivial task (SensorQTR8RC.h), as I could just copy an already implemented I2C sensor and tweak the code.

Next was to add some actual logic connecting the sensor readings to the motors. Again this was mostly a case of copying an existing 'ThoughtProcess' and tweaking it to initialize the correct sensor and sit in a loop trying to keep the 'line' in the centre of the sensor, and we have ourselves a working ThoughtProcess_LineFollower module, yes? Well, it turns out it wasn't quite that easy...

Other than the usual coding issues (e.g. copy and pasting a one second sleep, turning left when I should be turning right), a major problem was the Arduino resetting itself after a couple of seconds, causing the loss of calibration data and making the robot just see 'white' and drive in a straight line. My initial thought was that it was related to the known issue with how the I2C hardware in the Raspberry Pi deals with clock stretching (badly), so I spent a few hours switching over to the pigpio library's bit-banged I2C implementation, which correctly handles clock stretching.
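For anyone wanting to try the same, using the bit-banged bus from C++ looks roughly like this (a sketch: the GPIO numbers and the 0x12 device address are placeholders, and the 'zip' command string follows the format described in the pigpio documentation):

#include <pigpio.h>
#include <cstdio>

#define SDA_GPIO 5   // Placeholder: whichever GPIO carries SDA
#define SCL_GPIO 6   // Placeholder: whichever GPIO carries SCL

int main() {
    if (gpioInitialise() < 0) return 1;

    // Open a software I2C bus at 100kHz on a pair of spare GPIOs.
    if (bbI2COpen(SDA_GPIO, SCL_GPIO, 100000) != 0) return 1;

    // Read one byte from register 0x00 of a device at address 0x12.
    // The 'zip' commands are: set address, start, write one byte
    // (the register), repeated start, read one byte, stop, end.
    char cmd[] = {4, 0x12, 2, 7, 1, 0x00, 2, 6, 1, 3, 0};
    char reply[1];
    int count = bbI2CZip(SDA_GPIO, cmd, sizeof(cmd), reply, sizeof(reply));
    if (count >= 1) printf("Register 0x00 = 0x%02x\n", reply[0]);

    bbI2CClose(SDA_GPIO);
    gpioTerminate();
    return 0;
}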

I moved the QTR-8RC sensor over to the newly created 'I2CExternal' class and started running some tests. Initial results were not encouraging, with the Arduino resetting after 20-30 requests, leaving me in the same situation as before. Whilst running further tests I noticed that the Arduino was now resetting whenever I checked if it existed, which seemed to be down to how I'd changed the 'I2C::exists()' API to work on both the 'internal' and 'external' I2C buses.

This, happily, was a 100% reproducible test case, so I started to review the I2C handling code running on the Arduino. Using the time-honoured approach of sticking an early 'return' statement in the receiveData call to determine at what point the crash was occurring, I fairly quickly tracked the issue down to the 'for' loop that attempts to determine which command was being requested: I was using 'sizeof(supportedI2Ccmd)', which returns the total size of the array in bytes, instead of 'sizeof(supportedI2Ccmd) / sizeof(supportedI2Ccmd[0])', which returns the actual number of entries in the array. This meant I was reading way off the end of the array and accessing memory that I didn't 'own', apparently annoying the Arduino enough to make it restart.
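In other words, the classic C array-length mistake. A tiny illustration:

#include <cstdio>

int supportedI2Ccmd[] = { 1, 2, 3, 4 };

int main() {
    // Wrong: sizeof() gives the array size in BYTES (16 here), so a
    // loop bounded by it runs way off the end of the array.
    printf("bytes:   %zu\n", sizeof(supportedI2Ccmd));

    // Right: divide by the size of one entry to get the element count.
    printf("entries: %zu\n", sizeof(supportedI2Ccmd) / sizeof(supportedI2Ccmd[0]));
    return 0;
}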

So with that change made (and ported across to the Motor Driver Arduino code) I placed my robot back onto my test track (last year's course scaled down by 50%) and ended up with...



Now the robot doesn't get close to making its way around the course, but it is definitely trying to track and follow the line, which is a good starting point and something I can build and improve on over the next few weeks.

So with that in a 'working' state I'll be moving onto the next autonomous challenge, the Straight-line speed test.

Leo

Sunday 18 October 2015

Autobots, roll out - Autonomous operation

I've been working on Optimus Pi for over three months now and I've finally gotten around to working on the autonomous challenges. For last week's Egham Raspberry Jam I had mounted the VL6180 range sensor on the front of the chassis, so it made sense to start with the Proximity Alert challenge, which is probably the easiest of the autonomous challenges.

The first thing to do was to better integrate the VL6180 code into the software library (as opposed to hacking it in like I did last weekend). As there are going to be various sensors connected to the robot I created a 'Sensor' base class that contains all the common operations supported by a sensor (e.g. checking it's connected, turning it on and off), which is then extended for each sensor type.

With the base APIs decided upon I created the VL6180 variant and copied in the code from my example, tidying it up a little and extending the public API on the Sensor class with a 'range' call to actually read in the value. I followed my normal approach of getting it working first, then making it 'nice'. So the first implementation made direct calls to the sensor; then, once I was happy that the sensor was operating as expected, I moved the code into a background thread, allowing the stored range to be constantly updated whilst the sensor is enabled, hopefully without blocking the main thread.
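The thread itself is simple enough with C++11. A rough sketch of the idea, with readRangeFromSensor() standing in for the actual VL6180 code:

#include <atomic>
#include <chrono>
#include <functional>
#include <iostream>
#include <thread>

// Latest range in mm, shared between threads. Being atomic, the main
// thread can read it at any time without locking.
std::atomic<int> g_rangeMm{255};

// Stand-in for the actual VL6180 read over I2C.
int readRangeFromSensor() { return 100; }

// Background loop: keep the stored range fresh whilst enabled.
void rangeLoop(std::atomic<bool>& enabled) {
    while (enabled) {
        g_rangeMm = readRangeFromSensor();
        std::this_thread::sleep_for(std::chrono::milliseconds(20));
    }
}

int main() {
    std::atomic<bool> enabled{true};
    std::thread sensorThread(rangeLoop, std::ref(enabled));
    std::cout << "Range: " << g_rangeMm << "mm\n"; // Reads never block
    std::this_thread::sleep_for(std::chrono::seconds(1));
    enabled = false;
    sensorThread.join();
    return 0;
}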

Up until now the 'manual' control code has been in the 'main' function, so that had to be moved out. Back in my pre-planning blog post I have a whiteboard picture where the 'Brains' object has multiple 'automated' modules hanging off it, which the robot will switch between for the various challenges. As there needs to be a manual control module as well, I decided to name these modules 'ThoughtProcesses', of which only one will be active at a time (Optimus Pi is a very single-minded robot).

Now the ultimate aim is for the 'Brain' to have a list of supported ThoughtProcesses that can be dynamically switched between for each challenge, but for now I'm just looking at getting things working (worst case I can manually launch a specific binary for each challenge on the day). So once again the 'main' function is just hard coded to launch the activity I want.

With the 'ThoughtProcess' and 'Sensor' classes in place it only took a few minutes' work to create a simple 'ThoughtProcess_ProximityAlert' class that just drives the robot forwards, lowering the speed as it gets closer to the target until it finally stops. A few test runs to adjust the speed to run okay on the carpet and we end up with...


Obviously this is just a very simple script with plenty of room for improvement, but it's already better than last year's entry, which didn't end up competing in any of the autonomous challenges. Things that could be improved include adding a calibration step so the robot can work out its minimum speed on the floor, using the SenseHAT to help keep the robot moving in a straight line, adding encoders to the motors to measure distance travelled, etc. Not to mention straightening the range sensor on the front of the robot (it's a little crooked), but for now I'm leaving it as-is and moving onto the next autonomous challenge (line following perhaps?).
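Before signing off, here's roughly what that ThoughtProcess_ProximityAlert loop boils down to; a sketch only, with made-up thresholds and stand-ins for the range sensor and motor calls.

#include <chrono>
#include <iostream>
#include <thread>

// Stand-ins for the shared range value and the motor driver.
int currentRangeMm() { static int range = 250; return range -= 5; } // Fake approach
void setMotorPower(float left, float right) {
    std::cout << "Power: " << left << "\n";
}

int main() {
    const int stopMm = 20;   // Close enough: stop
    const int slowMm = 150;  // Start slowing down from here
    const float maxPower = 0.5f, minPower = 0.15f;

    while (true) {
        int range = currentRangeMm();
        if (range <= stopMm) break;

        // Scale the power down linearly as the wall approaches.
        float power = maxPower;
        if (range < slowMm) {
            power = minPower + (maxPower - minPower)
                             * float(range - stopMm) / float(slowMm - stopMm);
        }
        setMotorPower(power, power);
        std::this_thread::sleep_for(std::chrono::milliseconds(20));
    }
    setMotorPower(0.0f, 0.0f); // Stop just short of the wall
    return 0;
}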

Leo

Monday 12 October 2015

It's show time! - 9th Egham Raspberry Jam

This weekend I took Optimus Pi on its first public outing at the 9th Egham Raspberry Jam. Overall it was a successful event, but not without a few problems...

I've been to most of the previous Egham Raspberry Jams, so I know that it attracts a mix of people, ranging from those who have never used a Raspberry Pi before to those who have used one extensively. With that in mind I was fully expecting to hear the question 'PiWars? What's PiWars?' so, as well as taking along my robot, I went prepared with details about the PiWars website, the list of challenges and print-outs of some of the more complicated challenges for visitors to read through.

My stand! Complete with robot, skittles and line following sensor demo.
As I'd gotten my range and line following sensors working the previous weekend I wanted to be able to show them off. The range sensor I mounted on the front of the robot, quickly hacking its standalone test code into my main 'PiWars' code and setting things up so that (in theory) if there was an obstacle within 150mm of the front of the robot then it would stop and refuse to drive forwards (of course, if you were travelling fast enough you could still crash). Originally I was just querying the range sensor every time I processed input from the joystick, but this seemed to be causing the controls to respond a little sluggishly (as they had to keep waiting for the sensor to respond), so Sunday morning (a few hours before I had to leave) I quickly moved that out to a background thread that updated a global, atomic variable. Something I completed much faster in C++ than I would have managed in Python. For anyone interested, these changes can be found over at my GitHub account on the EghamRaspberryJam branch.

Lots of wires and the range sensor attached on the front.
As both the range sensor and the Arduino motor driver communicate with the Raspberry Pi via I2C I needed a way to plug both devices in together, so I quickly soldered up a break-out board to allow both of them to be connected to the Raspberry Pi's I2C pins. I also added a logic level shifter to convert from the Raspberry Pi's 3.3V to the Arduino's 5V which, whilst not technically required in this setup, will allow me to add additional 5V sensors at a later date.

At the Jam the code worked as expected for about 10 minutes, after which the range sensor no longer seemed to respond, causing the motors to stop. So some investigation and fixes will need to be done later. The rest of the robot worked happily for the next hour or so... and then one of the wheels fell off... I blame the kid driving it at the time :)
A three wheeled robot!
The other sensor I got up and running last weekend was the QTR-8RC module that I'll be using for the line following challenge. I haven't yet worked out how to mount it on the robot, so I needed a different way to show it off. The module itself comprises eight individual sensors, which conveniently is also the width of the LED matrix on the Sense HAT. By taking some code from the example 'snake' program that comes with the Sense HAT I was able to set things up so that, as you move the line sensor across a white page, the lights on the Sense HAT go out to match the black electrical tape on the page.



The final prop I brought along was a wooden skittle set. This spent the first half of the Jam up on the table, but later I moved it onto the floor and some of the kids tried knocking the pins over with my three wheeled robot. The best result was knocking over two pins (followed by a strike when they drove the robot into them) but most of the time the ball was rolling slowly enough that it just stopped when it reached the pins. I know this would have been slowed down a bit by the carpet tiles, and the ball should move faster across hardboard, but it definitely looks like I need a little extra something to get the ball moving. Of course it was shortly after this that another wheel fell off!
Down to two wheels!
Over the three hours of the event I spoke to a lot of people about my robot and PiWars, and how the challenges encourage people to learn about robotics, sensors and writing code to connect them together, with several people saying they may grab tickets to attend, others discussing how they could use this to encourage the children they teach, and one or two even talking about entering Pi Wars 2016 (if it happens, of course).

This Jam was extra special as there was a Show and Tell competition taking place, with prizes graciously donated by various Raspberry Pi related companies for both an under-16 and an over-16 category.
The 16-and-over prizes, with the list of donors.
Each person attending the event was given two tokens to vote for their preferred under-16 and over-16 exhibits, and by the end of the event I had collected seven on my table. Unfortunately that wasn't enough to win the prize, which went, I believe, to a robot arm with laser targeting (a table I didn't manage to get to myself). The under-16 prize went to a nice 3D printed, articulated robot hand!
Albert talking through the under-16 prizes.
So it was another good Egham Raspberry Jam, which has highlighted a few issues with my robot that I can now work on improving, and hopefully I sent a few people away with a few more ideas on what to do with their Raspberry Pis.


Tuesday 6 October 2015

Stay on target - Sensors!

With my robot up and moving again under manual control I can turn my attention back to the autonomous challenges, something I didn't take part in during last year's competition.

With the release of the SenseHAT, as mentioned in a previous post, I can just connect that up to the Raspberry Pi to get access to an accelerometer and gyroscope which will, hopefully, help keep the robot moving in a straight line during the three point turn and straight-line speed challenges. But for the proximity alert and line-following challenges I need something extra, and for these I have chosen the VL6180 ToF range finder and the Pololu QTR-8RC reflectance sensor array.

The VL6180 sensor.
The VL6180 sensor communicates via I2C and supports an input voltage of 3.3V-5V so, after soldering on the pins, it can be connected directly to the Raspberry Pi. The module is provided with numerous resources, including the schematics, a connection guide, an Arduino library and, inside the Application Note, an almost complete C example of how to drive the sensor. So with a couple of tweaks to re-implement the I2C read/write functions using the pigpio APIs I have a test program (VL6180Test.cpp) that outputs the range every second.
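Stripped of error handling, the heart of that test program looks something like this. A sketch only: the register addresses are the ones listed in ST's documentation (worth double-checking if you follow along), and the mandatory power-up tuning writes the Application Note calls for are omitted for brevity.

#include <pigpio.h>
#include <cstdio>

// The VL6180 uses 16-bit register indexes, so reads and writes go
// through i2cWriteDevice/i2cReadDevice rather than the byte helpers.
int writeReg8(int handle, unsigned reg, unsigned value) {
    char buf[3] = {(char)(reg >> 8), (char)(reg & 0xFF), (char)value};
    return i2cWriteDevice(handle, buf, 3);
}

int readReg8(int handle, unsigned reg) {
    char addr[2] = {(char)(reg >> 8), (char)(reg & 0xFF)};
    char value = 0;
    i2cWriteDevice(handle, addr, 2);
    i2cReadDevice(handle, &value, 1);
    return (unsigned char)value;
}

int main() {
    if (gpioInitialise() < 0) return 1;
    int handle = i2cOpen(1, 0x29, 0); // Bus 1, default VL6180 address

    // NOTE: the power-up tuning sequence from the Application Note
    // should be performed here; omitted for brevity.
    for (int i = 0; i < 10; i++) {
        writeReg8(handle, 0x018, 0x01);                   // SYSRANGE__START, single-shot
        while ((readReg8(handle, 0x04F) & 0x07) != 0x04)  // Wait for 'new sample ready'
            gpioDelay(1000);
        printf("Range: %dmm\n", readReg8(handle, 0x062)); // RESULT__RANGE_VAL
        writeReg8(handle, 0x015, 0x07);                   // Clear the interrupt
        gpioDelay(1000000);                               // One reading per second
    }
    i2cClose(handle);
    gpioTerminate();
    return 0;
}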

Reading through the documentation supplied with sensors is always useful as, in this case, I found out that the sensor becomes unreliable when the distance drops below 10mm, something that wasn't explicitly noted in the description of the sensor. So when I mount this on my robot I need to make sure there's at least a 10mm gap between the sensor and the front-most part of the robot, to ensure it can get as close to the wall as possible.

QTR-8RC sensor (Front)
QTR-8RC sensor (back)
The QTR-8RC module contains IR LEDs to light up the surface being measured, and eight individual sensors which, as there is a jumper to select 3.3V input, can be connected directly to the Raspberry Pi's GPIO pins. Again this module comes with instructions on how to connect it up, an Arduino library and various other documents, including the six steps required to take a reading from the sensor, and I set to work writing a program to implement these steps.
The sensor connected via an Adafruit T-cobbler.
As this module was going to be trickier to get up and running I decided to concentrate on getting a reading from one sensor first, and then extend the code to read from all eight. The first problem I ran into was not getting a good reading from the sensor, which initially looked to be caused by the IR LEDs not being turned on. Having checked the code to make sure I was enabling the correct pin, I decided to connect the LED_ON pin directly to the 3.3V pin to validate that the hardware was working as expected.

Using the camera on my mobile phone I could see that the IR LEDs weren't coming on, so it was out with the multimeter to check the connections. The Raspberry Pi was definitely outputting 3.3V, but the sensor had a strange reading of 1.2V across its VIN and GND pins... Something wasn't right there and, after a bit more poking around, I realized I hadn't actually connected up the ground on the breadboard! Luckily a harmless mistake, and quickly rectified, with the IR LEDs immediately lighting up. So it was back to the code.

I still didn't seem to be getting valid results from the sensor so it was time to read through the documentation again to see if I had missed anything. Aha! The recommended sensing distance was a mere 3mm, and my piece of test paper was much further away than that. Moving it closer, I started to see better results: my code was working!

From that point it was just a case of extending the code to support all eight sensors and tidying it up a bit, and soon I had a new test program that would output the state of the eight sensors every second (QTR-8RCTest.cpp). For some reason the first result always comes out as 'fully white', but the subsequent results are accurate, as tested by sticking some black tape on a white envelope and sliding it back and forth in front of the sensor.
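For reference, a single RC read boils down to charging the sensor line and timing its decay, as per those six steps. A sketch using pigpio, with placeholder GPIO numbers:

#include <pigpio.h>
#include <cstdio>

#define LED_ON_GPIO 18   // Placeholder GPIO driving the LEDON pin
#define SENSOR_GPIO 17   // Placeholder GPIO for one sensor channel
#define TIMEOUT_US  2500 // Treat anything slower than this as 'black'

// One RC read: charge the line, release it, time the decay.
unsigned readSensorDecay() {
    gpioSetMode(SENSOR_GPIO, PI_OUTPUT); // Drive the line high to
    gpioWrite(SENSOR_GPIO, 1);           // charge the capacitor
    gpioDelay(12);                       // Hold for >10us
    gpioSetMode(SENSOR_GPIO, PI_INPUT);  // Release the line

    uint32_t start = gpioTick();
    while (gpioRead(SENSOR_GPIO) == 1) {            // Time how long it
        if (gpioTick() - start > TIMEOUT_US) break; // stays high: short
    }                                               // = white, long = black
    return gpioTick() - start;
}

int main() {
    if (gpioInitialise() < 0) return 1;
    gpioSetMode(LED_ON_GPIO, PI_OUTPUT);
    gpioWrite(LED_ON_GPIO, 1);           // Turn on the IR LEDs
    printf("Decay time: %uus\n", readSensorDecay());
    gpioTerminate();
    return 0;
}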

Now that I've tested and gotten both sensor modules working, the next task is to decide how to attach them to the robot. My initial plan was to simply mount them on the front, but now that the range finder needs to be offset by at least 10mm, and the line following sensor has to be very close to the ground, I'll have to come up with something slightly different.

Of course the more difficult task is to get the robot to actually use the sensors to drive itself around, something that I'm sure will keep me busy for the next week or two.

Leo


Saturday 3 October 2015

It's a long way to - Python to C++

Well, that took a while... It's been almost three weeks since my last post, and in that time I've been slowly working on replicating the functionality of my 'MotorTest' Python script in a C++ program. Some of that time was spent learning about new features and rewriting the code to use them, and some of it was spent with an annoyingly long, drawn-out cold that meant some days I just didn't want to be on the computer.

The main reasons I gave for doing this were learning more about C++ and improving the performance of the resulting program. So, given that functionally my robot is exactly the same as it was last time, was it worth the effort?

Overall... yes. Whilst it's taken a while to complete, I've learnt a fair bit about C++11 and the newer classes and APIs available in it (even if I didn't end up using them all in my current code). A number of pieces of functionality I've used before (such as threads and mutexes) are now built into the standard C++11 libraries, along with a few additions. I've come across a few new libraries and features (using the evdev library for input and eventfd for thread message passing). I've also read various articles about thread safety, and hopefully my code is thread-safe...

One of the fun things I learned was that the default C++ compiler in Raspbian doesn't actually support C++11... My initial development was on an Ubuntu machine, so I was happily using these C++11-specific features until I had enough working to try running it on the Raspberry Pi... Luckily I'm not the first person wanting to use C++11 on Raspbian and I quickly found instructions to install the C++ compiler from the Jessie repository, which is newer and supports the necessary features.

The Raspberry Pi A+ I have installed on the robot is a little slow at compiling C++, so for the most part I've been developing and testing the code on a Raspberry Pi 2B, only switching over to the A+ when I had it working. Of course this 'slowness' is one of the reasons I wanted to switch to C++, so how is the performance?

Running the 'MotorTest' Python script and monitoring it with 'top' I can see that it uses ~75% CPU time and 10% memory. Running the 'OptimusPi' binary I can see that it uses ~5% CPU time and 1.5% memory: a significant improvement that leaves me plenty of CPU for processing the input from sensors and other activities.

Speaking of which, some additional sensors turned up last month, and I've been trying to ignore them until I got my robot moving again. With just a week until the Egham Raspberry Jam I'll have to see how many of them I can get up and running!

New stuff!

Of course there's still plenty of coding to do. I've taken a number of shortcuts to get manual control working (the input device is hard coded, the main loop just feeds input into the motors) which I'll need to tidy up and extend to reach my original target: being able to select different autonomous programs to run, auto-detecting the input devices etc.

My original target for this stage.
What I actually have!
The current state of my code can of course be seen by following the 'Github' link on the right.

Leo

Monday 14 September 2015

To make or not to - CMake

It's been almost three weeks since I said I would start coding, but with the distraction of the not-quite-working motor driver and its replacement I've done little more than a couple of test scripts based on last year's entry. So with a working, if basic, robot up and running it's time to turn my attention back to the software.

Most of my Raspberry Pi projects end up being a Python script with a mixture of my own code and the Python example that came with the hardware/tutorial that I've been following. The reasons for this are plentiful: Python tends to be the default language on the Raspberry Pi, it comes pre-installed with a large number of supporting libraries and it allows you to get up and running quickly. However it's not the language I'm most familiar with, and I'm not convinced it's the best use of resources (CPU and RAM) when running on a Model A+.

So for this project I've decided to go with a C++ solution. It's a language that I have some experience of, but I've never done a pure C++ project from scratch, so there's plenty for me to learn here. C++ (and C) is a compiled language, so before you can test any changes you've made the program first needs to be built. With a simple, one file program the compiler can be manually run each time a change is made :-
pi@raspberrypi ~ $ g++ helloworld.cpp
pi@raspberrypi ~ $ ./a.out
Hello world!

Easy enough, but with a multi-file project this can quickly get cumbersome, especially when the files have dependencies on each other.
pi@raspberrypi ~ $ g++ helloworld.cpp goodbyeworld.cpp cruelworld.cpp crazyworld.cpp
g++: error: cruelworld.cpp: No such file or directory

My usual solution to this would be to write a Makefile, specifying all the files I needed to build, how to build them, what options were needed to build them, what I wanted the resulting binary to be called, and so on and so forth. i.e. a lot of work! Now, back at the turn of the century I spent quite some time reading man pages about make and gmake, putting together a complicated, yet elegant, series of Makefiles that would allow the building of a project made of thousands of files for a wide range of target platforms. The system, once in place, worked so well that it only needed the occasional minor tweak and, as it was no longer actively required, the original knowledge needed to build such a system faded into the mists of time.

Now, spending a few hours re-reading man pages is all that would be needed to pull that information back out of the long term storage areas of my memory, but surely there must be an easier way, and the way I decided to follow was 'CMake'.

CMake describes itself as a cross-platform, open-source build system. One that, for the purposes of this project, will happily write the Makefiles for you and, with a quick 'apt-get install cmake', is readily available for use on the Raspberry Pi of your choice.

Following along with the tutorial on the website we quickly learn that a CMakeLists.txt file is required to describe what files make up our project; a simple one may look like the following.
project (Test)

add_executable(Test helloworld.cpp)
And then you just need to run CMake, and out pop the Makefiles.
pi@raspberrypi ~$ cmake .
-- The C compiler identification is GNU 4.6.3
-- The CXX compiler identification is GNU 4.6.3
-- Check for working C compiler: /usr/bin/gcc
-- Check for working C compiler: /usr/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Configuring done
-- Generating done
-- Build files have been written to: /home/pi
pi@raspberrypi ~$
CMake itself only needs to be re-run when you make changes to the CMakeLists.txt file; from this point on, whenever you need to compile your project you can run 'make' in the normal way.
pi@raspberrypi ~$ make
Scanning dependencies of target Test
[100%] Building CXX object CMakeFiles/Test.dir/helloworld.cpp.o
Linking CXX executable Test
[100%] Built target Test
pi@raspberrypi ~$ ./Test
Hello world!
Then when you need to add more files it's a simple case of adding them to the CMakeLists.txt file,
project (Test)

add_executable(Test helloworld.cpp goodbyeworld.cpp cruelworld.cpp crazyworld.cpp)
Re-running cmake and finally make.
pi@raspberrypi ~$ cmake .
-- Configuring done
-- Generating done
-- Build files have been written to: /home/pi
pi@raspberrypi ~$ make
Scanning dependencies of target Test
[ 25%] Building CXX object CMakeFiles/Test.dir/goodbyeworld.cpp.o
[ 50%] Building CXX object CMakeFiles/Test.dir/cruelworld.cpp.o
[ 75%] Building CXX object CMakeFiles/Test.dir/crazyworld.cpp.o
Linking CXX executable Test
[100%] Built target Test
pi@raspberrypi ~$
and you can see that only the new files needed to be built (at least at this point).

Of course this is only a simple demonstration; as you add more files and libraries to the project the CMakeLists.txt file will grow, but in a much simpler way than if you had to edit the Makefile itself. In this last case the 100-byte CMakeLists.txt file generated (amongst other files) a 6836-byte Makefile, saving quite a bit of typing.

With the build system sorted I just now need to write some code for it to build!

Leo