
Pi Wars 2019 Part 7 – Interfacing

Due to the limited wifi access on the day, I want to make my code as easy to interface with as possible, without needing an external device (such as a laptop). With this in mind, I decided to purchase a Pimoroni GFX HAT to try and build a simple, easy-to-use menu for my robot. Although the screen has to be functional, I also wanted to make sure it looked good. Because of this, I wanted to make icons that could identify each challenge, as well as some simple commands such as shutdown, reboot and settings.

Icons

Because of the size of the screen, I limited my icons to around 128×40 pixels, leaving room for a status bar at the top and relevant text along the bottom. I started off by making image files and using the Python PIL library to convert them to black and white pixels for the screen. I soon found this to be a very slow process: the user could rapidly press the next button while the code was still struggling to load an image, causing a great deal of lag. After a little debugging, I traced the speed issue to the conversion of the PNG into a black and white image object in PIL. To resolve this, I decided to pre-define the pixels of each image in a series of icon functions. These can then be stored in a separate module and called from the main menu program when required. I started off by making a black and white PNG image, then used a small Python script to generate the required coordinates. These are appended to an array of coordinates, which can then be drawn to the screen. The script saved me the hours I would otherwise have spent manually recording the coordinates of every black pixel in each image.

from PIL import Image

im = Image.open("settings.png") # Image in question
pix = im.load() # Loading pixel access object
xmax, ymax = im.size # Width and height of the image

SettingsFile = open("export.txt", "w") # Creating file of appends to be copied into the icon program
for x in range(xmax): # Loop through each x coordinate of the image
    for y in range(ymax): # Loop through each y coordinate of the x column
        if pix[x, y] in ((0, 0, 0, 255), (0, 0, 0)): # Check if the pixel is black (RGBA or RGB)
            SettingsFile.write("cords.append((" + str(x) + "," + str(y) + "))\n") # Add the array append to the temp file
SettingsFile.close() # Close the written file
print("Done")
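For illustration, an icon function built from the generated output might look like the sketch below. The function name and coordinates here are made up; the real body is simply the `cords.append(...)` lines pasted straight out of export.txt, and the menu then draws the returned coordinates onto the display buffer.

```python
# Hypothetical icon function of the sort the generator produces
def SettingsIcon():
    cords = []
    cords.append((10, 4))  # example coordinates only; the real lines
    cords.append((11, 4))  # come from the generator script above
    cords.append((12, 4))
    return cords
```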

I did end up spending a little time on some of the icons, with my favourites below being the Spirit of Curiosity, the Hubble Telescope Nebula Challenge and blastoff.

 

 

Building A Setting Class

One thing I wanted was the ability to persist various settings between boots. To do this, I created a text file and a settings class which can be loaded and saved within the program. I plan to revisit this in the future, as I feel there is probably a better way to do it; however, for the time being it does the job. Initially, I created the base class which loads the settings from a text file. An example text file looks like:

255
255
255
[255, 89, 136]
[36, 169, 255]
[198, 255, 142]
[42, 205, 255]

To load this I created the class with the below code. It loads the lines of the text file and sets the values to the appropriate property of the class:

class Settings:
    def __init__(self):
        SettingsFile = open("Settings.txt", "r") # Open the settings file for reading

        r = SettingsFile.readline() # Reading through each line to set the appropriate value
        g = SettingsFile.readline()
        b = SettingsFile.readline()
        nebRed = SettingsFile.readline()
        nebBlu = SettingsFile.readline()
        nebYel = SettingsFile.readline()
        nebGre = SettingsFile.readline()
        SettingsFile.close() # Close the file once read

        self.blR = r.replace('\n', '') # Clean values to make them usable, removing trailing newlines
        self.blG = g.replace('\n', '')
        self.blB = b.replace('\n', '')
        self.nebRed = nebRed.replace('\n', '')
        self.nebBlue = nebBlu.replace('\n', '')
        self.nebYellow = nebYel.replace('\n', '')
        self.nebGreen = nebGre.replace('\n', '')

I then added a save function to the class, so that the settings could be saved on the fly. This allows any values saved to the settings class to be reloaded on the next boot.

def saveSettings(self):
    SettingsFile = open("Settings.txt", "w") # Open file to write
    SettingsFile.write(str(self.blR) + '\n' + str(self.blG) + '\n' + str(self.blB) + '\n' + str(self.nebRed) + '\n' + str(self.nebBlue) + '\n' + str(self.nebYellow) + '\n' + str(self.nebGreen)) # Save values separated by new lines
    SettingsFile.close() # Close the file with the new values written
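The same load/save pattern can be condensed into a standalone sketch like the one below. This is an illustration rather than my production class: the `FIELDS` list simply mirrors the seven lines of the example file, and the field order defines the file layout for both reading and writing.

```python
class Settings:
    FIELDS = ["blR", "blG", "blB", "nebRed", "nebBlue", "nebYellow", "nebGreen"]

    def __init__(self, path="Settings.txt"):
        self.path = path
        with open(path) as f:  # one line per field, in FIELDS order
            for name in self.FIELDS:
                setattr(self, name, f.readline().rstrip("\n"))

    def saveSettings(self):
        with open(self.path, "w") as f:  # write the fields back, newline-separated
            f.write("\n".join(str(getattr(self, n)) for n in self.FIELDS))
```

Loading, changing a value and saving then round-trips cleanly between boots.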

Display Location

I have mounted the Pi on a stand on the upper chassis to make it easily accessible on the day; this does, however, mean that the display is upside down. To handle this, as part of the process of displaying the image on the screen, I added an additional step to rotate the image 180°. This means there is very limited impact on the code, and if it ever needs to change, I only need to alter it in one place.

def gfxDisplay(image):
    image = mdlPil.rotateImage(image, 180) # Rotate the image 180°
    for x in range(128): # Go through each x row
        for y in range(64): # go through each y row for each x row
            pixel = image.getpixel((x, y)) # Identify whether pixel needs to be on or off
            lcd.set_pixel(x, y, pixel) # Set GFX pixel to value
    lcd.show() # Update display

Status Monitoring On The Go

In addition to accessing programs and features, it is also useful to be able to see the status of both the controller and the motor controllers, to check they are connected. This should help with on-the-go troubleshooting on the day. Initially, I want to see if the controller is connected. To do this, I can query the Pi's Bluetooth controller to see if the MAC address is connected. Depending on the status, I can then show a pre-defined icon in the upper segment of the display:

import subprocess as sp # Needed for querying hcitool

def ControllerCheck(image):
    stdoutdata = sp.getoutput("hcitool con") # Check for connected devices

    if "00:06:F7:13:66:8F" in stdoutdata.split(): # Check if the PS3 controller's MAC address is in the connected devices
        image = mdlPil.Controller(True, image) # Show connected icon
    else:
        image = mdlPil.Controller(False, image) # Show disconnected icon

    return image # Return the status bar for the menu

To check if the ThunderBorgs are connected, I use their library to query each board to see if the chip has been found. This is then passed to the icon library, which returns an image for the status bar. This can be seen below:

def TBCheck(image):
    TB1 = ThunderBorg.ThunderBorg() # Loading class
    TB1.i2cAddress = 10 # Setting address
    TB1.Init() # Initialising chip
    image = mdlPil.TB(TB1.foundChip, image, 1) # Passing whether the chip was found to the icon module

    TB2 = ThunderBorg.ThunderBorg() # Loading class
    TB2.i2cAddress = 11 # Setting address
    TB2.Init() # Initialising chip
    image = mdlPil.TB(TB2.foundChip, image, 2) # Passing whether the chip was found to the icon module
    return image # Returning the status bar to the menu

Once all the items are put together, they enable a menu as seen below, with the status bar at the top and the icon in the middle:

What's left to do?

To finish off, I need to work on a way to easily execute the challenge programs while still being able to stop them from the menu. I think the best way to do this will be threading: loading each challenge program onto a thread and stopping the thread when needed.
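A minimal sketch of that idea is below. The names are mine, and one caveat worth noting: Python threads cannot be killed from outside, so the usual pattern is a shared `Event` flag that the challenge loop checks and the menu sets.

```python
import threading
import time

stop_event = threading.Event()  # hypothetical flag owned by the menu

def challenge_program():
    # Each pass of the challenge loop checks the flag, so the menu can
    # request a stop cooperatively rather than killing the thread.
    while not stop_event.is_set():
        time.sleep(0.01)  # placeholder for one step of the challenge

t = threading.Thread(target=challenge_program)
t.start()          # menu launches the challenge
stop_event.set()   # menu button pressed: ask the thread to stop
t.join()           # wait for the loop to notice and exit
```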

Overall, I feel I have created an easy-to-read but also visually appealing menu, using the GFX HAT to its full capabilities. I hope it will prove very useful during the competition!


Pi Wars 2019 Part 6 – Starting to get a Sense for it!

Choosing The Best Camera

Now that I have a stable chassis and the movement code sorted, my next step is to look at the best way to interface with sensors. I want the primary sense of my robot to be image recognition, so the first port of call was to look for a solid and stable camera platform. From my initial research, I found two main candidates: the Pixy Cam 2, a standalone camera board with an integrated microprocessor, or the official Raspberry Pi Camera Module using OpenCV. To decide which route to go down, I weighed up the pros and cons of the cameras as seen below:

From this, although the price was a little higher, I decided to opt for the Pixy Cam 2. One of the main things that drew me towards it in the end was the line-following API, where the camera automatically identifies a line and returns its location and coordinates in the frame. This should make my goal of a mostly image-processing-driven robot a lot easier!

Setting it up

Setting up the camera was fairly simple for Python 2; however, for Python 3 I had to make a few modifications.

From the git "", I initially changed the setup.py file for Python 3. As with most instances, I just had to change a few prints to include brackets; the changes are highlighted below:

pixy2/src/host/libpixyusb2_examples/python_demos/setup.py

#!/usr/bin/env python

from distutils.core import setup, Extension

pixy_module = Extension('_pixy',
  include_dirs = ['/usr/include/libusb-1.0',
  '/usr/local/include/libusb-1.0',
  '../../../common/inc',
  '../../../host/libpixyusb2/include/',
  '../../../host/arduino/libraries/Pixy2'],
  libraries = ['pthread',
  'usb-1.0'],
  sources =   ['pixy_wrap.cxx',
  '../../../common/src/chirp.cpp',
  '../../../host/libpixyusb2_examples/python_demos/pixy_python_interface.cpp',
  '../../../host/libpixyusb2/src/usblink.cpp',
  '../../../host/libpixyusb2/src/util.cpp',
  '../../../host/libpixyusb2/src/libpixyusb2.cpp'])

import os
print("dir = ")
print(os.path.dirname(os.path.realpath(__file__)))

setup (name = 'pixy',
  version = '0.1',
  author = 'Charmed Labs, LLC',
  description = """libpixyusb2 module""",
  ext_modules = [pixy_module],
  py_modules = ["pixy"],
  )

Then, I needed to change the install bash script to run the setup script in a Python 3 environment as opposed to Python 2. To do this, I edited the following file with the highlighted changes below:

pixy2/scripts/build_python_demos.sh

#!/bin/bash

function WHITE_TEXT {
  printf "\033[1;37m"
}
function NORMAL_TEXT {
  printf "\033[0m"
}
function GREEN_TEXT {
  printf "\033[1;32m"
}
function RED_TEXT {
  printf "\033[1;31m"
}

WHITE_TEXT
echo "########################################################################################"
echo "# Building Python (SWIG) Demos...                                                      #"
echo "########################################################################################"
NORMAL_TEXT

uname -a

TARGET_BUILD_FOLDER=../build

mkdir $TARGET_BUILD_FOLDER
mkdir $TARGET_BUILD_FOLDER/python_demos

cd ../src/host/libpixyusb2_examples/python_demos

swig -c++ -python pixy.i
python3 setup.py build_ext --inplace -D__LINUX__

if [ -f ../../../../build/python_demos/_pixy.so ]; then
  rm ../../../../build/python_demos/_pixy.so
fi

cp * ../../../../build/python_demos

if [ -f ../../../../build/python_demos/_pixy.so ]; then
  GREEN_TEXT
  printf "SUCCESS "
else
  RED_TEXT
  printf "FAILURE "
fi
echo ""

The Plan Both Past & Future…

Once this was all done, I was able to interface with the camera. The first program I wanted to master was the line-following application. This code is still very much a work in progress at the moment, but I feel the main basis is there. Once completed, I want it to:

  1. Get the angle of the line
  2. Get the position of the line in the frame
  3. Adjust accordingly using back wheel
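The three steps above could be combined into a single proportional steering correction, sketched below. The frame width and both gains are assumptions of mine rather than verified Pixy2 values, and would need tuning on the robot.

```python
FRAME_W = 78                  # assumed line-tracking frame width in pixels
K_ANGLE, K_OFFSET = 0.5, 0.8  # made-up proportional gains

def steer(line_angle, line_x):
    # Normalise the line's horizontal position to -1 (left) .. 1 (right),
    # then blend line angle and offset into one back-wheel correction.
    offset = (line_x - FRAME_W / 2) / (FRAME_W / 2)
    return K_ANGLE * line_angle + K_OFFSET * offset * 45  # degrees of correction
```

A centred, vertical line produces zero correction; a line drifting right steers the back wheel to compensate.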

Using the Pixy camera's PixyMon software, I am able to get it to identify and track colours. Using this part of the API should help me with the Nebula challenge, as I will be able to train the camera on a particular colour; once it can identify the colour, I will use a Time of Flight distance sensor to tell the robot how close to the target it is. Similarly, I will use the same logic for the Canyons of Mars maze challenge, using the camera to train on and track the alien targets, rotating when the sensor senses the bot is close to the wall at the desired angle.

Once I've got all the above working and coded, this should enable me to complete all of the autonomous courses, primarily using image recognition with the backup of a distance sensor. If I had more time, I could use the apparent size of targets in the frame to estimate sizes and distances; however, with the time left, I don't think this will be possible.


Pi Wars 2019 Part 5 – Quick Maths

Once the Chassis was complete, my next task was to initially focus on a stable driving system and then look at a way of using inputs to move the robot.

Initially, I wanted to find a stable and easy-to-use Python 3 library to handle input from a controller. My first port of call was Pygame, having used it for the majority of my GCSE Computer Science coursework. After looking into it further, however, I found the best option seemed to be the Approximate Engineering input library. This takes the input from various different controllers and puts the values into a class with universal button mappings. This means that if for some reason I needed to change my controller, I could easily do so with very minimal code changes.

I started off by following their guide on pairing a PS3 DualShock controller, which was very informative, pairing with no hassle at all! A link to the guide can be found here.

Once the controller was connected, I set about learning how the library works, which turned out to be very simple. Using the example code on the documentation site, I was easily able to read data from the inputs on the controller. The basis of the code is an infinite loop checking whether a controller is connected. Once connected, it binds the controller and, while connected, returns a class that can be read for values.

from time import sleep # Needed for the retry delay below
from approxeng.input.selectbinder import ControllerResource

while True:
    try:
        with ControllerResource() as joystick:
            print('Found a joystick and connected')
            while joystick.connected:
                # Do stuff with your joystick here!
                # ....
                # ....
        # Joystick disconnected...
        print('Connection to joystick lost')
    except IOError:
        # No joystick found, wait for a bit before trying again
        print('Unable to find any joysticks')
        sleep(1.0)

Once I had this working stably, I moved on to the maths of the general movement. There are a few libraries available for this, such as this one; however, I really wanted to try and make my own, this being the main feature of my bot. I started off by sketching out the logic. The basic fundamental I wanted to work to was that the joystick controls both direction and speed. Using the axes of the joystick, I would be able to get both an angle and a percentage of speed. An example of this can be seen below:

 

I started off by trying to translate the X & Y positions of the joystick to an angle. Using trigonometry, I can take the X & Y and convert them into a triangle to work out the angle. To make things easier, I converted all coordinates into positive values and then added the respective offsets afterwards to find the exact angle.

In coding terms:

def CalcAngle(x, y):
    if abs(x) < 0.2 and abs(y) < 0.2: # If the axis is less than 20% on both X and Y, stop the robot. Stops glitching
        motorSpeed(0, 0) # Set motor speed to 0 on all motors to stop movement
        return None # No angle inside the dead zone
    else:
        # Checking for exact angles
        if x > 0.0 and y == 0.0:
            angle = 90
        elif x == 0.0 and y < 0.0:
            angle = 180
        elif x < 0.0 and y == 0.0:
            angle = 270
        elif x == 0.0 and y > 0.0:
            angle = 0
        else:
            # Working out the angle from the coordinates using the inverse TAN rule. abs() converts negative values to positive
            angle = math.degrees(math.atan(abs(y) / abs(x)))

            # Adding appropriate values to the angle depending on the quadrant to give a 360° direction
            if x > 0.0 and y > 0.0:
                angle = 90 - angle
            elif x > 0.0 and y < 0.0:
                angle = 90 + angle
            elif x < 0.0 and y < 0.0:
                angle = 270 - angle
            elif x < 0.0 and y > 0.0:
                angle = 270 + angle

        return angle # Return the value of the angle

Working out the speed is simply done using Pythagoras, a² + b² = c², which in my case gives speed = √(x² + y²).

This can then be converted into Python using the following line of code:

speed = min(math.sqrt((x * x) + (y * y)), 1)

Sometimes the value supplied was a little over 1, so adding the min() function simply means that, if this is the case, it caps the value at 100%. This can then be combined with the motor control software to calculate the speed of each motor to get the desired movement.
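As a quick check of the formula (wrapped in a helper function whose name is mine), a 3-4-5 style deflection of x = 0.6, y = 0.8 lands exactly on full speed, and a corner press of (1, 1) gets capped:

```python
import math

def calc_speed(x, y):
    # Hypotenuse of the joystick deflection, capped at full speed
    return min(math.sqrt((x * x) + (y * y)), 1)

calc_speed(0.6, 0.8)  # → 1.0 exactly (3-4-5 triangle)
calc_speed(1.0, 1.0)  # √2 ≈ 1.41, capped → 1.0
calc_speed(0.3, 0.4)  # → 0.5, i.e. half speed
```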

The movement of the bot can be split into six different zones, in which a pair of wheels work together in both directions. These are separated by 60° and are shown in the diagram below. (Beautiful, I know!)

I then used the angle from above to calculate the speed of each motor, adding in a graduation between the different positions to enable a smooth drive.

def motorSpeed(angle, speed):
    global TB1, TB2
    val = 0.0166666666667 # 1/60, to calculate the graduation
    m1 = 0 # Initially setting motors to 0
    m2 = 0 # Initially setting motors to 0
    m3 = 0 # Initially setting motors to 0
    if angle >= -1 and angle < 60: # Between 0° and 59°
        dif = angle # Working out the multiplier for gradual decrease in speed
        m1 = 1
        m2 = -1 + (dif * val)
        m3 = 0 - (dif * val)
    elif angle >= 60 and angle < 120:# Between 60° and 119°
        m1 = 1
        m2 = 0 
        m3 = -1
    elif angle >= 120 and angle < 180: # Between 120° and 179°
        dif = angle - 120 # Working out the multiplier for gradual decrease in speed
        m1 = 0 - (dif * val)
        m2 = 1
        m3 = -1 + (dif * val)
    elif angle >= 180 and angle < 240: # Between 180° and 239°
        dif = angle - 180 # Working out the multiplier for gradual decrease in speed
        m1 = -1
        m2 = 1 - (dif * val)
        m3 = 0 + (dif * val)
    elif angle >= 240 and angle < 300:# Between 240° and 299°
        m1 = -1
        m2 = 0
        m3 = 1
    elif angle >= 300 and angle <= 360: # Between 300° and 360°
        dif = angle - 300 # Working out the multiplier for gradual decrease in speed
        m1 = 0 + (dif * val)
        m2 = -1
        m3 = 1 - (dif * val)

    m1 = m1 * speed # Multiplying the drive for angle by the speed (value is between 0 and 1)
    m2 = m2 * speed # Multiplying the drive for angle by the speed (value is between 0 and 1)
    m3 = m3 * speed # Multiplying the drive for angle by the speed (value is between 0 and 1)

    TB1.SetMotor1(m1) # Setting speed for motors
    TB1.SetMotor2(m2) # Setting speed for motors
    TB2.SetMotor1(m3) # Setting speed for motors
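For comparison, the textbook closed form for a three-wheel "kiwi" drive replaces the six zones with one trigonometric expression per wheel: each wheel turns at the component of the desired travel direction that lies along its rolling axis. The wheel bearings and sign conventions below are assumptions and would not line up with my motor numbering without adjustment; it's a sketch of the idea, not my drive code.

```python
import math

WHEEL_ANGLES = [0.0, 120.0, 240.0]  # assumed wheel bearings, degrees

def kiwi_speeds(angle, speed):
    # Project the desired direction onto each wheel's drive axis
    theta = math.radians(angle)
    return [speed * math.sin(theta - math.radians(a)) for a in WHEEL_ANGLES]
```

Driving straight ahead (angle 0) leaves the wheel at bearing 0° stationary while the other two counter-rotate equally, and the three speeds always sum to zero for pure translation.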

Once all the above is put together, it can be added to the controller function to get a smooth direct drive on 3 wheels 🙂

while True:
    if stopThread == True:
        break
    try:
        with ControllerResource() as joystick:
            print('Found a joystick and connected')
            while joystick.connected:
                Drive(round(joystick.rx, 2), round(joystick.ry, 2))
        print('Connection to joystick lost')
    except IOError:
        # No joystick found, wait for a bit before trying again
        print('Unable to find any joysticks')
        sleep(1.0)

Changes and modifications

After testing, I have made one modification to the code. Previously the angle was calculated to the nearest degree, which after testing caused the robot to become quite jumpy. To resolve this, I added an extra line of code to round the angle to the nearest 5°.

angle = int(5 * round(float(angle) / 5))
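Wrapped as a tiny helper (the function name is mine), the rounding behaves like this:

```python
def snap_angle(angle):
    # Round to the nearest multiple of 5 degrees
    return int(5 * round(float(angle) / 5))

snap_angle(87)  # → 85
snap_angle(88)  # → 90
snap_angle(90)  # → 90 (multiples of 5 are unchanged)
```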

Pi Wars 2019 Part 4 – Another Tool For The Arsenal

Whilst developing my robot, I have hit various snags and limitations with my 3D printer, with its smaller build plate and its restriction to own-brand PLA filament. I had been thinking about getting an upgrade for some time; however, during my usual daily browse of Facebook, an advert caught me by surprise. Aldi were selling a 3D printer!

I'm used to picking up some groceries from Aldi; however, a 3D printer did seem a little extreme for the weekly shop! I sat down and read various reviews and write-ups. It's all very confusing, as there seem to be loads of different brands for the same printer; however, it appears to boil down to a re-branded Wanhao Duplicator i3 Plus. After watching countless review videos, I decided to bite the bullet and pick one up.

It arrived within a week and was very easy to set up. There were a couple of issues with the Z axis being a little out on one side, but a quick adjustment and several bed adjustments later, it seemed to be producing perfect prints. My main gripe with my other printer (an XYZ DaVinci Jr) was that you can only use their own-brand filament, unless you want to mod the printer and install custom firmware, something I wasn't too keen on. Because of this, I ordered a range of different materials in the hope I could find an alternative to laser cutting my chassis, a big cost saver, as I would be able to do it in-house rather than externally.

I was initially drawn to ABS. Reading various sources, it seemed to have greater strength and rigidity than standard PLA. I spent a good 6 hours trying to profile the material, aiming for a perfect first layer; however, after countless attempts trying various techniques, such as using hairspray and glue on the build plate, I was unable to get one. I turned to the lovely community of Twitter to try and help me get to the bottom of the problem.

 

The general consensus seemed to be that it wasn't worth messing around with ABS, so I did just that and ordered myself a reel of SUNLU PLA+ filament in space grey: a sort of hybrid between PLA and ABS, with increased strength. After a couple of runs trying to profile it, I finally seemed to get the perfect first layer, and let it print the top part of the chassis. All seemed to be going well until I noticed the extruder was progressing further up the Z axis, but the model wasn't. The extruder was jammed.

I was a bit lost at this point. Because my other printer was aimed at the entry-level audience, I have had very limited experience with troubleshooting failures; however, after googling and reading various articles, I started by using a 0.4mm rod to try and clean the nozzle. This didn't seem to work, so I then followed a guide on how to disassemble and reassemble the extruder. Doing this turned out to be fairly simple: removing 2 bolts allowed the fan, heat sink and stepper motor to be separated, which gave me access to the entry of the heating element. It was quite obvious at that point to see what had happened…

The cause seemed to be a heat issue, so I increased the extruder temperature by 10° to try and fix it.

I set the print going again from scratch, and after a nerve-wracking 8 hours, the build had finished to what I thought was a very good standard.

The final settings I used were:

  • Bed temperature – 60°C
  • Extruder temperature – 215°C
  • Initial layer speed – 30mm/s
  • Print speed – 60mm/s

I used the same settings to print the bottom chassis part, and after assembly I finally decommissioned the cardboard chassis, although I have to give it credit: it had done very well and lasted longer than I ever thought it would!


Pi Wars 2019 Part 3 – Progress is on fire, literally!

Progress seemed to be moving very fast, but as with any project, I soon hit a rather large and scary brick wall! When powering the Pi on, I suddenly got a rather large plume of smoke and a strong smell of burning plastic. I quickly sprang into action, tearing the circuit apart to try and stop it developing, which thankfully was successful. Cardboard isn't the most fire-retardant chassis material, so I can only imagine what could have happened. Once everything had cooled down, I assessed the damage to find that all the hardware was ruined.

  • The jumper wires seemed to have melted
  • The connectors on the ThunderBorg had also melted
  • The Pi was no longer powering on

I posted on the PiBorg forum, where they confirmed everything was to be deemed unsafe to use. To reduce further delay, I bit the bullet and purchased 2 new motor controllers and a Pi 3B+.

Whilst this hardware was on the way, I decided to evaluate my prototype and look at what went wrong, as well as what else could have gone wrong. From this, I deduced the following:

  1. All the hardware needed to be properly mounted; loose hardware could short, causing another fire
  2. The battery needed added protection; having it balancing on the other hardware was a disaster waiting to happen
  3. The Pi needed to be covered or properly mounted to reduce the risk of static damage

As it happened, I had actually arranged 3 days off to work on progress; however, I didn't have a bot to work on. The new hardware wasn't due to arrive until at least day 2, so I decided to focus this time on fixing the above issues. I am lucky enough to have a 3D printer in my arsenal, so I loaded up SketchUp and set about fixing the problems.

1 & 3 – Mounting Hardware & the Pi

First of all, I wanted to solve the mounting issues. To do this, I decided to start by adding a new layer to my cardboard contraption of a chassis. I added 40mm standoffs and bolted a second layer to the existing chassis. This made a nice new and spacious platform for more components. I wanted to add a user interface to the robot, to avoid having to walk around with a laptop on the day, so I decided the Pi was the best thing to add to this layer, as it is where the screen will be mounted in the future. The next problem I wanted to solve was how to protect the Pi, to reduce the risk of static damage and components being shorted. To do this, I created a stand angled at 45° to create a steady incline, which will be ideal when I add a menu in the future.

 

I used all the official Pi dimensions to get the correct positions for the stand-offs, so luckily everything printed correctly first time. My printer (an XYZ DaVinci Jr) struggled a little with the overhangs; however, it was fine for an initial test. After mounting it all up, it seemed to do the trick. It held the Pi at the correct angle, as well as enabling it to be effectively secured to the chassis.

2 – Battery protection

The next key aspect of the robot I wanted to focus on was battery protection. For this, I wanted to design a tightly fitting shroud that could protect the battery but also allow it to be easily swapped out, with all the cables easily accessible. Again, I got my head down in SketchUp and came up with the holder below. The battery fits very tightly into the shroud, with a sliding hatch to secure it in place while still allowing the cables to be accessed easily. In future revisions, I hope to alter it to hold the voltage indicator.

By this time, the new parts had arrived, so I swiftly re-assembled the chassis with the improvements and tested.


Pi Wars 2019 Part 2 – Testing, Testing & More Testing

One thing I am usually guilty of is limited testing. Due to the large scale of the project in hand, I made sure to thoroughly test each aspect of my robot before committing it to the final design.

Chassis

The initial priority for me was to get a working chassis. Being the main hub of the robot, I wanted to make sure this was the best it could be, fitting all the components on before proceeding further with the robot design. To achieve omni-directional travel, I needed to make sure the chassis was completely circular, with equal distances between the motors. To achieve an initial design that I could easily change and edit, I decided to use 3mm card. I managed to source some off-cuts from work, the same card used in hardback book covers. I was a little sceptical at first about how it would handle the stress from the motors; however, I was pleasantly surprised. Bolting the motors to it, it seemed to fare well, with limited flex.

Once I had the initial base of the robot, I then turned my attention to movement.

Motors & Power

One of the things I was most worried about was power. I've seen so many horror stories of LiPos catching fire, specifically this one at the Raspberry Pi birthday bash. Because of this, I wanted to test every stage. Initially, I wanted to assemble the motor circuits externally to make sure everything worked, before putting more time into the development of the chassis and component placement. Initial tests seemed to go well.

 

All the voltages seemed to be good, and the Pi seemed to be stably powered.

Mounting it all up

Once I had checked all the current components, I wanted to add them to my test chassis to make sure the thing actually moves! I ordered some cheap nylon nuts & bolts off Amazon and used a screwdriver to make some holes in the chassis… very glorified, I know! To get optimum movement and avoid unwanted turning, I needed to make sure the wheels were exactly the same distance apart. Because I have 3 points of movement, the angle between the wheels can be calculated as 360° / 3 = 120°. I used a protractor to mark this out on my chassis and used nylon bolts to secure the motors. I also used the same nylon bolts to secure the wheels to the motor hubs.
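The 120° spacing can also be turned into drill-hole coordinates with a little trigonometry. This is just a sketch of the kind of marking-out calculation involved (the radius and function name are mine):

```python
import math

def motor_positions(radius, n=3):
    # (x, y) mounting points for n motors evenly spaced around a circle
    return [(radius * math.cos(math.radians(360 / n * i)),
             radius * math.sin(math.radians(360 / n * i)))
            for i in range(n)]
```

For a chassis of radius 100mm, this places one motor at (100, 0) and the others 120° around at (-50, ±86.6).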

I then used the standoffs supplied with the ThunderBorg to mount the motor controllers between 2 motors. Once these were all hooked up, it was time to test.

 

What I Learnt

From the initial movement testing, I learnt I needed better hardware to mount the wheels. After a couple of rotations, the wheels were becoming loose and falling off. Being nylon, the bolts were also really flexing, causing the robot to make random turns and veer off path. Because of this testing, I learnt:

  1. The nylon mounting hardware needed to be replaced with a metal alternative. The wheels may also need to be adapted to suit a better fitting mechanism.
  2. I need a better method of mounting the hardware; there isn't enough surface area. It may be better to add a second layer.