This robot platform is for my own experiments in robotics electronics & software, building on what I learned from Dusty & the NIARC project. The overall aim is to provide a platform for localisation & mapping at very low cost, at the expense of added effort.
Structural parts: 3d printed in blue PETG
Drive: 2x high-torque stepper motors, each driven by a TI DRV8825 driver which offers comparatively high performance for the cost. Big fat squishy tires since hard wheels didn’t perform well before. A dedicated stepper cooling fan will keep things running nicely.
Sensors: 6x HC-SR04 ultrasonic sensors, GY-85 9dof inertial measurement unit.
Electrical & Brains: TBC! The system will run off two 3S lithium batteries in series (6S, roughly 24V when charged). I’m aiming to use a separate microcontroller for each subsystem, with CAN communication between them.
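As a sketch of how the inter-system CAN messages might look – the frame ID, field layout and units here are all assumptions, not a finalised protocol – each subsystem could broadcast its state as a fixed 8-byte payload:

```python
import struct

# Hypothetical CAN message: an ultrasonic subsystem broadcasting one reading.
# The ID, field layout and units are placeholders, not a finalised protocol.
RANGE_FRAME_ID = 0x120  # arbitrary 11-bit CAN identifier (assumed)

def pack_range_frame(sensor_id, range_mm):
    """Pack a sensor index and range (mm) into an 8-byte CAN data field."""
    # Little-endian: uint16 sensor id, uint16 range, uint32 reserved padding.
    return struct.pack('<HHI', sensor_id, range_mm, 0)

def unpack_range_frame(data):
    """Inverse of pack_range_frame: recover (sensor_id, range_mm)."""
    sensor_id, range_mm, _reserved = struct.unpack('<HHI', data)
    return sensor_id, range_mm
```

Keeping every message to a fixed-size layout like this fits within CAN’s 8-byte data limit and makes parsing on the receiving microcontroller trivial.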
Last year, two friends and I entered the 2014 NI Autonomous Robotics Competition on behalf of UWA. The theme was agriculture: the robots had to navigate a course to a seed pickup point, then plant those seeds in specified locations before returning to the starting point. The robot had to avoid obstacles and (being a competition) had to be fast. We wanted a design that was cheap and easy to manufacture while still being competitive, and this was the outcome:
The 8 x 3.75m world was built from astroturf and cinderblocks, and the “seeds” were 100mm cubes made of stressball foam. The course looked roughly like the picture below, with obstacles placed randomly in the first region, and the “planting rows” in the second region. In order to gain points the robot had to traverse the entire course within 2 minutes, planting all the seeds inside the bounds of the planting rows while avoiding all obstacles and walls.
We were funding the robot ourselves, and with no prior parts to reuse we had to be frugal – so most of our design decisions were based on getting value for money regardless of the work involved. Fortunately the most expensive component was provided by NI – the onboard processor was an NI myRIO, which features an ARM Cortex processor (running a real-time OS) integrated with an FPGA, Wi-Fi and plenty of configurable I/O ports. Both the RTOS and FPGA had to be programmed in LabVIEW (C++ is possible, but setting up the development environment is complicated), so as the team programmer I got a lot of practice using LabVIEW.
Early in the design phase we settled on a layered chassis structure to make the robot easier to update. We used wood and standard fastenings from Bunnings for the same reason, and to reduce cost.
We decided that a two-wheeled differential drive would be simpler to program and require fewer parts (hence less risk of breakage) than other drive configurations like tracks or omni-drive. A quick analysis of the torque & speed needed to run the course gave us our motor requirements, and after considering a few options we settled on stepper motors – not only would they provide the necessary power, but their stepping operation would give us automatic odometry, eliminating the need for wheel rotation sensors.
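Step-count odometry for a differential drive works out like this – a minimal dead-reckoning sketch, where the microstepping, wheel radius and wheel base values are placeholder numbers rather than our robot’s actual parameters:

```python
import math

STEPS_PER_REV = 200 * 32  # 200 full steps/rev, 1/32 microstepping (assumed)
WHEEL_RADIUS = 0.04       # metres (assumed)
WHEEL_BASE = 0.20         # distance between the two wheels, metres (assumed)

def update_pose(x, y, theta, left_steps, right_steps):
    """Dead-reckon a new pose (x, y, heading) from the step counts
    accumulated by each motor since the last update."""
    dl = 2 * math.pi * WHEEL_RADIUS * left_steps / STEPS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * right_steps / STEPS_PER_REV
    d = (dl + dr) / 2                # distance travelled by the midpoint
    dtheta = (dr - dl) / WHEEL_BASE  # change in heading
    # Advance along the average heading over the interval.
    x += d * math.cos(theta + dtheta / 2)
    y += d * math.sin(theta + dtheta / 2)
    return x, y, theta + dtheta
```

The catch, of course, is that steppers only tell you what the motor did, not what the wheel did – any slip (as we later found on astroturf) goes straight into odometry error.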
As the robot was not allowed to touch any walls or obstacles, it had to be fitted with some kind of proximity or range sensors. We considered every reasonable option – IR proximity sensors, self-made LED/LDR sensors, LIDAR and SONAR – and were about to go ahead with an array of IR distance sensors when we found the HC-SR04 ultrasonic range sensor online for $2 each. These are time-of-flight sensors without much control circuitry built in, so the delay between the transmit and echo pulses has to be timed by the main robot controller: we used the FPGA for this, as the timings were on the order of microseconds – too fast for the RTOS to be reliable.
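The conversion from echo pulse width to range is simple: the pulse covers the round trip, so distance is half of speed-of-sound times pulse time. A minimal sketch, assuming ~343 m/s (air at roughly room temperature):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed)

def echo_to_range_m(pulse_us):
    """Convert an HC-SR04 echo pulse width (microseconds) to range in metres.

    The echo pulse spans the out-and-back flight time, so the one-way
    distance is half the acoustic path length.
    """
    return pulse_us * 1e-6 * SPEED_OF_SOUND / 2.0
```

A ~5.8 ms pulse therefore corresponds to about 1 m of range – and since a few microseconds of timing error translates directly into millimetres of range error, measuring the pulse on the FPGA rather than the RTOS made sense.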
The first prototype used a two-layer chassis, with three ultrasonic sensors, wheels made of plywood and rubber bands, and custom designed motor brackets 3d-printed from ABS. We wanted to be able to recycle the parts after the competition, so we kept wires long and used removable mounts.
These wheels had good traction in general, but accurate odometry was difficult on thick carpet or astroturf. We saw two solutions: to use wide tires and spread the load so that the ground didn’t get as deformed, or to use very thin wheels to penetrate through the grass to the base layer. For the competition we decided to go with the thin option, so I modelled some and 3d printed them in ABS.
These thin wheels worked very well on carpet and on the astroturf sample that we’d been given, but in the final competition the wheels didn’t penetrate far enough to the base layer so we lost a lot of traction.
One requirement of the competition was that the robot have an emergency stop button which was easily accessible and would cut power to all moving parts. Rather than use a toggle switch we opted to develop a latching circuit which would provide the emergency stop functionality as well as cutting power in case of a fault – I’ll write about this in another post.
We had two main designs for the seed dispenser. The first was a Ferris-wheel-type design which loaded the seeds into a circular magazine and dispensed them by placing them gently on the ground, the aim being to prevent them rolling out of the designated zone. This design placed seeds very well, but it was very heavy, slowing the robot down substantially and taking weight off the driving wheels. Despite being heavy, it was not very structurally sound and was mostly held together with duct tape. You can see it in our video for Milestone 4:
The squeaking noise comes from the poor, sad roller ball that sits between the drive wheels and the dispenser – at this point we realised there were major balance issues but we did not have the time to redevelop from scratch. Making the dispenser stronger would have meant even more weight so we redesigned it completely and went for a square tube-shaped hopper made of balsa:
This layout was a bit less delicate in placing the seeds, but it was stronger than the wheel design and was a tiny bit lighter. The final competition was looming and so most of our work from here on was more or less a hack to just keep it running – exemplified by the mechanism one of my teammates built to dispense the seeds from the hopper, which was a Scotch yoke cobbled together using stationery around his house, and worked perfectly:
This also fit very nicely with our goal of keeping things cheap and easy to manufacture.
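For the curious, a Scotch yoke converts crank rotation directly into simple harmonic motion of the slider – the displacement is just r·sin(θ). A quick sketch (the crank radius here is a made-up number, not a measurement of the actual mechanism):

```python
import math

def yoke_displacement(crank_radius_m, crank_angle_rad):
    """Slider position of a Scotch yoke: purely sinusoidal in crank angle."""
    return crank_radius_m * math.sin(crank_angle_rad)

# One push-pull cycle per crank revolution; full stroke = 2 * crank_radius.
```

That pure sinusoidal stroke is what makes the mechanism so forgiving to build badly: there are no linkages to bind, just a pin sliding in a slot.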
The robot ended up struggling in the competition and was knocked out in round 1. The round included 3 runs of the competition course: on the first run the robot didn’t move as it couldn’t get enough grip, and on the second and third runs it was too slow and got lost – but it didn’t hit any obstacles! So we were not docked any points and finished the round with zero points. In the end we placed equal 20th with a few other teams who had similar troubles. We stuck with our initial project goal of keeping the robot cheap (final parts cost was ~$200, compared to many teams who used a $1000 LIDAR sensor) and easy to manufacture, and made it to the final competition while meeting all project milestones. Overall this was very rewarding and the design/build/compete process was a lot of fun!
We needed a motor bracket for the NIARC robot, so we had a look at what was currently around. Googling gave a few options in the $10-50 range, which was way too much! So I modelled one of the brackets in Inventor and had it printed in PLA:
This did the job of holding the motor, but it wasn’t very stiff (being made from PLA), which is bad news for accurate odometry, and it took up a lot of vertical space as the fasteners sat between the motor and the bracket. So I redesigned it with the fasteners on either side of the motor and larger side supports:
This design also used the motor body itself as support structure to minimise flex, when mounted in the upright orientation. The cut-out in the base was to reduce material and allow room for the motor wires if needed. We had two of these printed in fancy red ABS:
These were very stiff and compact, and worked fantastically! You can see them in place here, in an early version of the robot: