Friday, 5 February 2016

LED Night Lamp

A bit more of an artistic project: something I had a play around with to make use of a 10W LED and driver combo I got off eBay. The files are here: http://www.thingiverse.com/thing:1318807

The bottom piece was designed in OpenSCAD, the top piece - the "box" - was designed in Blender because it was originally meant to have complex patterns on the inside which would only be visible when lit.

Finished product:

The 10W LED is still surprisingly bright when covered. The colour output of the LED is white, but the plastic box unfortunately yellows the light a lot.


Make sure you fit the hidden captive M3 nuts on the back side of those screw holes. This can be done by poking a long M3 screw through the hole, threading a nut on and then pulling the nut down into the nut trap. If the nut trap is undersized, heat the nut up with a soldering iron then quickly pull it into the nut trap.
Small 12V switch-mode PSU on the left with a ~0.8A current limiter integrated, which is important for powering LEDs: being non-ohmic, they will burn up on an unlimited supply. The quality of the PSU isn't bad either - there's filtering and proper separation between high and low voltage. I didn't inspect the transformer's windings, but I hope it's of the same quality inside as the rest of the board. The LED "chip" is on the right.

The makeshift bending process for the dual aluminium heat sink/outer casing. Had to keep that protective wrap on till the end so it didn't scratch! Those edges where the wrapping is curling up did get scratched though - turns out toothpaste is a scarily effective (and still pretty coarse) abrasive if you don't have anything else to polish with.

Oops... something went wrong, but this wasn't totally unexpected given the tooling. More important than the total height of the base or the slightly mis-matched bend radii was getting the sides level, so I cut down the left side.
Here it is drilled and re-sized through much chiselling and sanding. The amount I had to take off was a bit small for a hacksaw and too large to sand - and don't even think about grinding aluminium!

Make the aluminium outer first and see how it comes out then modify the printed base to suit the errors - that's why it's parametric. In this case the radius has been made larger on one side and the overall height reduced. Some gaps are visible, and the aluminium isn't flat in the middle so viewing distance, angle and favourable lighting are late additions to the BOM.

All wired up. The PSU is hot glued to the base because it has no mounting holes anyway. Since the device runs on 240V mains and has exposed metal, a three-pronged plug and lead with earth are crucial - even more crucial is connecting the earth lead to the metal casing: note the bottom-right screw pillar, which was made slightly shorter to allow room for a connection. I'm not an electrician, nor versed in electrical codes, so wiring is your own responsibility - and most LED drivers from China aren't certified either. I don't leave this light running unattended for long periods of time.

Wednesday, 8 July 2015

RoboCup Soccer Robot Pt. 2 - OpenCV (Computer Vision)

So far, I have covered the mechanical side of my school team's miniature soccer robots, which we made last year, but I have not gone over the computer software that ran on them (or was intended to).

Most competitors at the level we competed at based the control of their robots around some combination of Arduinos, ultrasonic sensors, infra-red sensors (the soccer ball emits IR light), digital compasses and colour sensors. The Arduino is programmed to process all the data inputs and move the robot accordingly to find the ball, shoot goals and defend. The ultrasonic sensors can be used to find the location of the robot on the field by pinging the raised walls around it, provided the robot is properly aligned (which is where the compass comes in). Having ultrasonic sensors pointing forwards, backwards and to the sides gets around the problem of other robots blocking the sensors: if the total distance measured doesn't add up to the length or width of the field, discard the readings, or perhaps figure out which sensor is being blocked (a sketch of this check follows below). Colour sensors are used as another source of positioning data, because the fields are either marked with black lines on a green background, or divided into zones of slightly different shades of green. Unfortunately, location and orientation finding cannot be based on something like an optical mouse or rotary encoder, because the robots are often picked up and repositioned by referees when rules are broken or a goal is scored. Any calibration only gets to be done once: at the start of the match (so the compass can be set to point towards the opposition's side of the field).
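Here is a minimal sketch of that front/back consistency check in Arduino-style C++. It is an illustration only - the field length, sensor-to-sensor distance and tolerance are placeholder values, not numbers from any real robot:

// Placeholder dimensions - check your own field and robot.
const int FIELD_LENGTH_MM = 1820; // wall-to-wall distance being pinged
const int SENSOR_GAP_MM   = 220;  // front sensor to back sensor
const int TOLERANCE_MM    = 60;   // slack for ultrasonic noise

// True if the front and back readings together span the field.
// If another robot blocks a sensor, the sum comes up short and the
// pair of readings should be discarded.
bool frontBackPlausible(int frontMm, int backMm)
{
    int total = frontMm + backMm + SENSOR_GAP_MM;
    return abs(total - FIELD_LENGTH_MM) <= TOLERANCE_MM;
}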

Our team agreed that we could take a more adventurous approach and use a camera and a computer vision library running our own software on an ARM processor (I convinced the team to let me bite off more than I could chew as the person who would have to code the computer vision and tactics module). Webcams are cheap and plentiful, there are lots of small and powerful ARM boards on the market and there is a comprehensive open-source computer vision library available: OpenCV. All the ingredients for this project exist. Before getting into the details - this system failed because the execution of the project was lacking, though I gained a lot of useful programming, embedded ARM Linux and computer vision experience. Looking back on it now, the traditional approach with an Arduino is no walk in the park either. Fingers crossed the Arduino approach is effective, as we are working on it for this year's competition.

We used OpenCV to identify the field and calculate our robot's position from perspective, knowing the actual size of the field and what could be seen from our position. It was also used to recognise goals and the ball. I also created a module to automatically tune the HSV colours that the robot recognises as the field and goals, because we couldn't know how they would appear under the lighting on the day. We were hoping to one day extend the CV component to recognising enemy and friendly robots, though our progress was a long way off that. None of the computer vision techniques used up to the point we finished at involved machine learning, and the methods I came up with were crude and primitive compared with some of what is going on in the field. Note also that all the following Windows screenshots are from testing and development on my computer; the code was later ported to the Odroid, with optimisations in a few places and the GUI disabled.


Above: this was a trial of the field identification/dimensioning. It's not an actual field, but the code is easily adapted, as in both cases the "field" is square and there is a clear colour difference between the field and its surroundings. First an edge detection was done, then the Hough straight-line transform was applied. This of course still let straight lines in the background through, so each line segment had the colour of the image directly to each side of it checked: one side had to be the field colour, and the other had to be the background colour. Then lines of similar gradient and x/y intercept were merged and extended, and the intersections between the lines were the corners of the field.
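A rough sketch of what the first two stages of that pipeline look like in OpenCV's C++ API (2.4-era) - the Canny and Hough thresholds here are placeholders, not the values I actually used:

#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>

using namespace cv;

// Edge detection followed by the probabilistic Hough transform.
// Each returned Vec4i is a line segment (x1, y1, x2, y2); the colour
// check on either side of each segment would happen after this.
std::vector<Vec4i> findCandidateLines(const Mat& frame)
{
    Mat grey, edges;
    cvtColor(frame, grey, CV_BGR2GRAY);
    Canny(grey, edges, 50, 150);                    // placeholder thresholds
    std::vector<Vec4i> lines;
    HoughLinesP(edges, lines, 1, CV_PI / 180, 80, 30, 10);
    return lines;
}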

Above we see the thresholding of the field colour, sort of like green screening. As seen before, identifying the colour of the field is an important part of identifying the field itself; however, hard-coding the shade of green doesn't work, as the camera sometimes does funky colour correction, and the lighting in two places is rarely the same. So what has happened in the screenshot above is auto-tuning: I hard coded a really wide HSV green colour range into the software, and it narrowed the range down by itself. The key is that when the colour range is well chosen, the colour mask will have very few blobs - just iterate through, changing the HSV ranges, and pick the range which gives the fewest blobs. Note that the image used was found on the internet and isn't quite the same as the environment our robot had to operate in; our robot and testing field were not yet ready at the time I wrote the OpenCV code.
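The auto-tuning loop, boiled down to its essentials: sweep one bound of the HSV range, threshold, count blobs, keep the range with the fewest. A simplified sketch - the real code varied more than one bound, and every number below is a placeholder:

#include <climits>
#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>

using namespace cv;

// Count connected blobs in a binary mask via contours.
static int countBlobs(const Mat& mask)
{
    std::vector<std::vector<Point> > contours;
    Mat tmp = mask.clone();                   // findContours modifies its input
    findContours(tmp, contours, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);
    return (int)contours.size();
}

// Sweep the lower hue bound of a wide green range; keep the bound
// whose mask has the fewest (but more than zero) blobs.
Scalar tuneLowerGreen(const Mat& hsvFrame)
{
    Scalar best(35, 40, 40);
    int fewest = INT_MAX;
    for (int h = 35; h <= 75; h += 5) {
        Mat mask;
        inRange(hsvFrame, Scalar(h, 40, 40), Scalar(85, 255, 255), mask);
        int blobs = countBlobs(mask);
        if (blobs > 0 && blobs < fewest) { fewest = blobs; best = Scalar(h, 40, 40); }
    }
    return best;
}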

Above is the ball detection process. I noted that the ball gives off a few things which are easily seen by a camera and rarely found together anywhere else: white dots from specular reflections and the centres of LEDs, red regions from the red LEDs, and purple bits from the IR LEDs - though 920nm infrared light is not visible to the human eye, it renders as faint purple on cameras that don't filter it out properly! In the top left image is the input; top right is looking for purple (didn't work so well - probably needed better HSV ranges); bottom left is looking for red; and bottom right is looking for white. Next, each of those coloured blobs (visualised so you can see what the software found) was put onto one of three black and white masks and massively grown in size. The red and purple masks were then combined/unioned, as the purple only showed up at close range, and the intersection of that result with the white mask was taken; this whole process removed the false points seen in the last two images. From there, the centre of the ball could be easily estimated and, with more difficulty, its distance determined from its apparent size. There were two other screenshots - one of the combined mask with the estimated centre of the ball, and another of the ball being identified a couple of metres away - but I've since lost those images and changed the code, so the visualisations aren't easy to recreate.
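The mask combination step, condensed into a few OpenCV calls. This assumes the white, red and purple masks have already been thresholded out of an HSV frame, and the 41-pixel kernel is a placeholder for whatever growth radius actually worked:

#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>

using namespace cv;

// ball = grown(white) AND (grown(red) OR grown(purple))
Mat combineBallMasks(const Mat& whiteMask, const Mat& redMask, const Mat& purpleMask)
{
    Mat kernel = getStructuringElement(MORPH_ELLIPSE, Size(41, 41));
    Mat white, red, purple;
    dilate(whiteMask,  white,  kernel);   // grow each blob so that nearby
    dilate(redMask,    red,    kernel);   // detections end up overlapping
    dilate(purpleMask, purple, kernel);
    Mat redOrPurple, ball;
    bitwise_or(red, purple, redOrPurple); // purple only shows at close range
    bitwise_and(white, redOrPurple, ball);
    return ball;
}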

Seen above is the goal detection. The goals had blue backings, and sometimes one side would have yellow and the other blue - the standards in these competitions don't seem too widely spread or accepted. As a side note, the field in this image is the same as the one in our competition; we didn't have goals at our school test field, so this image is again off the internet. To identify the goals, the robot would first have been moved up close to one, so it covered most of the field of view, and the blue or yellow colour would have been auto-tuned; shown above are the steps after this. The whole image is thresholded for the blue colour, but we can clearly see erroneous points in the mask. To get rid of these, I got the code to check the colour at 10 points along the bottom of each blob, a few pixels below it: if the blob was indeed a goal, there would be the green colour of the field underneath. Then the centre of the goal blob was taken; in the top right window you can see it rendered. If this technique hadn't worked, plan B would have been to use the squareness of the goal to identify it.
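The "green underneath" test is simple to express in code. A sketch, assuming fieldMask is the auto-tuned green threshold mask and blob is a candidate's bounding box; the sample count, offset and pass threshold are all guesses at sensible values:

#include <opencv2/core/core.hpp>

using namespace cv;

// Sample 10 points just below the blob; require most to be field-coloured.
bool blobSitsOnField(const Rect& blob, const Mat& fieldMask)
{
    int y = blob.y + blob.height + 5;            // a few pixels below the blob
    if (y >= fieldMask.rows) return false;
    int greenHits = 0;
    for (int i = 0; i < 10; ++i) {
        int x = blob.x + blob.width * i / 9;     // spread along the base
        if (x < fieldMask.cols && fieldMask.at<uchar>(y, x) > 0)
            ++greenHits;
    }
    return greenHits >= 7;                       // pass threshold is a guess
}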

After getting these techniques nutted out, I moved them to the Odroid and got them running a lot faster. For instance, for whatever reason, making the white pixels in masks much larger (morphological dilation) - as in 30-70 pixels extra round the edge - took seconds per frame. From memory, in the one place this happened - the ball-finding code - instead of performing a dilation on the red, purple and white masks, I simply used a blob detector to find the white parts of the mask and then put a big circle on the centre of every identified blob. Hacked, but it worked blindingly fast. That was also why it was so easy to show the little circles in the visualisation a few images above: I just made the radii of the circles small, changed their colour and drew them onto copies of the original image.
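The circle-stamping trick looks something like this (using findContours as a stand-in for the blob detector I used at the time):

#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>

using namespace cv;

// Cheap substitute for a huge dilation: find each blob's centre and
// stamp a filled circle of the desired radius onto a fresh mask.
Mat stampCircles(const Mat& mask, int radius)
{
    std::vector<std::vector<Point> > contours;
    Mat tmp = mask.clone();
    findContours(tmp, contours, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);
    Mat out = Mat::zeros(mask.size(), CV_8UC1);
    for (size_t i = 0; i < contours.size(); ++i) {
        Rect r = boundingRect(contours[i]);
        circle(out, Point(r.x + r.width / 2, r.y + r.height / 2),
               radius, Scalar(255), -1);         // -1 = filled
    }
    return out;
}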

So, for the competition, with only time enough for a bodged solution, I began piecing together the monolithic program which had to do the computer vision and communicate data over serial to the Arduino, on which the rest of my team had written movement code (harder than it sounds for omniwheel robots with no rotary encoders on the wheels) and some simple tactics. The tactics, had they been more complicated, should have resided on the Odroid, but there was no time for that! The field detection and identification code was useless at this point, as there was no code to go with it which could determine the robot's position from the perspective and apparent size of the sides of the field. So that left colour auto-tuning, ball detection, goal detection and serial communication. I got these all integrated the night before the competition and on the way there. I also removed all the GUI stuff, and never got the chance to test with the robot because a team member had it. So in the hour before the competition at the venue, we got to test the whole system together. It didn't work, of course: the software was crashing - something in the goal detection - so that module was commented out. Then it turned out that even though the auto-tuning and ball detection were working, sending data to the Arduino over UART wasn't: the Arduino was receiving garbled strings, even though we could read from it alright. We never got that part working. In Linux, writing to the serial port should be as easy as writing to and reading from a file after you've changed some settings on the port, in this case /dev/ttyACM0. I even got it working on my Linux laptop, but the same code didn't play nicely on the Odroid. I think I didn't get the configuration of the serial port right - as in, none of the settings I thought I was applying were actually being applied. I should have used the Boost serial communications library.
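For the record, here is the kind of termios setup I now believe was missing - a minimal sketch, not the code we ran, and the baud rate is an assumption (it has to match whatever the Arduino uses in Serial.begin()):

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

// Open a serial device (e.g. "/dev/ttyACM0") in raw 8N1 mode.
// Returns the file descriptor, or -1 on error.
int openSerial(const char* dev)
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0) return -1;
    struct termios tty;
    if (tcgetattr(fd, &tty) != 0) { close(fd); return -1; }
    cfmakeraw(&tty);                    // no echo, no line buffering, 8-bit
    cfsetispeed(&tty, B9600);           // assumed baud rate
    cfsetospeed(&tty, B9600);
    tty.c_cflag &= ~(PARENB | CSTOPB);  // no parity, one stop bit (8N1)
    tty.c_cflag |= (CLOCAL | CREAD);
    // The crucial part we likely skipped: check the settings actually applied.
    if (tcsetattr(fd, TCSANOW, &tty) != 0) { close(fd); return -1; }
    return fd;
}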

And that, folks, was where the project was left after doing abysmally in the competition. OpenCV and the Odroid have been abandoned, though writing this piece makes me think we got close to something working - if we had ironed the bugs out... but who knows. I learnt a lot, though little to do with time management, and the Odroid, its Arduino shield and Linux make an awesome little platform. Maybe this blog will have some more computer vision projects in the future!

Saturday, 14 March 2015

MHS RoboCup Soccer Robot - Pt 1: Mechanical

The RoboCup Junior is a robotics competition aimed at secondary schools in Australia and around the world. Last year, I competed in it in a team with five other students from my school (Melbourne High School). The gist of the soccer competition: there are two teams in each match, and each team has two robots on the field, which is about the size of a large tabletop and is covered in green felt with lines marked on it. The aim is to shoot goals. The constraints: the robots have to be autonomous - they can communicate with each other but cannot be remote controlled in any way - they can't damage other robots, and they have to fit within a certain size. How your team handles tactics and its choice of motors, wheels, sensors, etc. is up to you.

A lot of work went into our team's robot, and it fits the technical and engineering nature of this blog, so I have finally got around to writing up a post about my involvement in the project. I'll separate the posts into the mechanical and software sides of things; this part covers the mechanical design. The GitHub repo for all the code/CAD models is at: https://github.com/BillyWoods/RCJ_Soccer_Robot

That model above is our solution to the problem. The robot has three omni-wheels set 120 degrees apart, which lets it turn on the spot and move in any direction regardless of where the front is pointing (the maths behind driving the wheels is sketched below). All the red parts on the model are electronics and the green parts are 3D printed. I modeled/coded the whole robot in OpenSCAD; it's got a pretty hefty code base considering how simple it may look (about 4000 lines at the moment), due to the detail and how parametric it is - most of which turned out to be overkill. Still a good learning experience, however, and having the model killed a few problems before the robot was even built and let us change the electronics layout very quickly when we tested a few different boards.
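As an aside, the kinematics that make three omni-wheels at 120 degrees work are compact. A sketch for illustration - the wheel angles and sign conventions here are my own assumptions, not lifted from our code:

#include <cmath>

// vx, vy: desired body velocity; w: desired spin; R: distance from the
// robot's centre to each wheel. out[i] is the rim speed for wheel i,
// which sits at angle a[i] from the body's x-axis and rolls tangentially.
void wheelSpeeds(double vx, double vy, double w, double R, double out[3])
{
    const double a[3] = { M_PI / 2,
                          M_PI / 2 + 2 * M_PI / 3,
                          M_PI / 2 + 4 * M_PI / 3 };
    for (int i = 0; i < 3; ++i)
        out[i] = -std::sin(a[i]) * vx + std::cos(a[i]) * vy + R * w;
}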

One advantage of modeling every little detail can be seen above: drilling/cutting templates can be produced automatically for each layer of the robot. Simply export as a .dxf file, add the cross hairs for the holes manually and print at 1:1 scale. If I ever need to make a large batch of these robots, the plates can also be made on a CNC from the .dxf files.


Seen above - the tricks of the trade. Stick the template onto whatever sheet material you're making the robot out of, centre punch all the holes, then cut out the circular plates with a jigsaw. The jigsaw was extremely rough on the thin 1.2mm aluminium sheet when even a small length stuck out over the side of the bench. The "custom jig" on the right was my solution: having all that thick MDF supporting the aluminium as it was cut made the job less dangerous, less noisy and much more fun.

Here's the bottom plate; it has a curved section cut out at the front to trap the soccer ball. The DiBond it's made of looks much nicer than aluminium.

Next step, print out the motor mounts on the 3D printer:

It is very satisfying watching what started as a design on a screen get translated into something tangible.


Skip a month or so and a few missing photos and here's an operational robot:
 

The robot in these photos is controlled by an Odroid U3 and an Arduino on top of it as a shield. The Odroid is a single board computer based on the same ARM chip as the Samsung Galaxy S3. It is like a more powerful Raspberry Pi. We used it so I could code a computer vision program to make use of the camera on the robot. The Arduino shield on top of the Odroid is an Arduino Uno in all but shape. It uses an Atmega328, the Arduino bootloader and has all the same pins available. It communicates with the Odroid via serial/UART. The Arduino IDE runs on the Odroid and sketches are uploaded via it.

The wheels used on the robot were not straightforward to attach to the motors. They have a 9mm bore which only goes halfway through the wheel hub. To attach them, I machined couplings out of steel on my lathe and tapped a hole for an M3 grub screw in them. These couplings would then press fit into the wheel hub and the grub screw would secure them on the motor's shaft.

This worked; however, the drill bit I used to bore out the hole for the motor's shaft was slightly bent, so the hole it drilled came out oversized and the wheels were wobbly. Not wanting to spend time making more, and keeping in mind that I had to make multiples of all these parts for other groups in the school's robotics club, I looked for a 3D printed alternative. The first thing I tried was almost an exact copy of the press-fit metal couplings, just in plastic. These didn't work so well: the printed parts did not have good enough tolerances for a snug press fit, and the plastic couldn't hold a thread well enough for the grub screw that secures the motor shaft.

The second attempt was much less naïve. The design made use of the spokes in the wheel to allow more torque to be transferred. There was also room for a captive M3 nut for the grub screw. This design worked very well once it was glued and pressed into the wheel.

Here's a bonus shot of the soccer ball and the fenders I designed and printed for the robot. These were an afterthought when we realized the bottom plate was too low and thin and would just chip the ball into the air.

In terms of improvements to the robot's design, I do have concerns about the motors wearing down, in particular the gearboxes. Though they have metal gears inside, we've noticed an increase in backlash after some use. The motors' output shafts run in bushings rather than ball bearings, and those shafts carry the entire weight of the robot. Stepper motors would be the best alternative: NEMA 17s are cheap and powerful, the same goes for their drivers, and they are very tough in terms of wear. Steppers are also designed for much more precise position control than DC motors, and the omnidirectional drive system needs precise control of the relative speeds of the wheels. Despite these concerns, in the end the mechanical side turned out to be the most reliable aspect of the robot. Its success, or lack of it, came down, as far as we can tell, to running out of time to get the sensors, electronics and software working well. The good part is that all this work can be used and built upon in this year's coming competition.

Saturday, 1 November 2014

Compiling OpenCV with MinGW for Windows (and setup in CodeBlocks IDE)

For a robotics project, which I may document on this blog at a later stage, I needed OpenCV working on my computer so I could play around with algorithms and ideas. Later, the code was ported - without any hassle, really - onto a robot with an ARM processor doing some computer vision, which I built with a school robotics team. I discovered that getting the pre-compiled OpenCV binaries to work with my preferred compiler and IDE was never going to work - the only option was to compile the source code into binaries and headers that MinGW64 would be able to work with. Luckily this isn't as hard as it sounds.

Compiling OpenCV yourself can be useful if you want to use a compiler or computer architecture for your projects which does not have pre-compiled binaries officially released for it. Compiling yourself also lets you choose what you want and don't want included. Unless you are cross-compiling, the only way to make things work reliably is to compile your programs with the exact same compiler that you compiled OpenCV with.

I wrote this guide after a lot of research on Google and trial and error in the first step to trying to get into some computer vision. Disclaimer: I accept no responsibility for anything that may come of following this guide or things that may not work.

--Download the following or newer versions--:

OpenCV 2.4.3 for Windows:
http://sourceforge.net/projects/opencvlibrary/files/opencv-win/2.4.3/OpenCV-2.4.3.exe/download

MinGW compiler:                  
http://sourceforge.net/projects/mingwbuilds

CMake 2.8.12 installer for Windows:
http://www.cmake.org/cmake/resources/software.html

CodeBlocks IDE with no compiler: 
http://sourceforge.net/projects/codeblocks/files/Binaries/13.12/Windows/codeblocks-13.12-setup.exe/download


--Setup the programs--:


    1.) The OpenCV file you downloaded is a self-extracting executable. Extract it to whatever location you want; I would recommend C:\opencv because it is easy to remember and get to. You will be entering this path A LOT.

    2.) The MinGW file you downloaded is not actually MinGW, but a program that allows you to choose very precisely what version to download and install. Run it and pick your own settings; I chose these, but it doesn't really matter:
        -Version:               4.8.1
        -Architecture:       x64 (assuming your machine is 64 bit, if not pick x86)
        -Threads:              Posix
        -Exception:           Seh
        -Build revision:     5
After getting MinGW installed (I would recommend putting it in C:\MinGW64), you will have to set up Windows' PATH variable so Windows can find MinGW's executables. This is so that you can call any of MinGW's modules, e.g. g++ (the C++ compiler), from anywhere on the command prompt, which saves you having to navigate to the MinGW directory every time. It is also crucial that MinGW is included in the PATH so that other programs which rely on it can find it by themselves. The path which you add to PATH will look something like this: "C:\MinGW64\mingw64\bin" (no quotation marks, of course).

The PATH can be edited from the command line (google for it), or using a Windows tool with a GUI: just search for "path" in the start menu on Windows 7 and 8 and a program called "Edit the system environment variables" will come up. Start the program as an administrator (right click on its icon - it's in the menu) and click "Environment Variables". Under the list "System variables", find "Path" and double click it to edit it. All you have to do is add the location of MinGW's bin (short for binary) directory to the end of the list. Make sure you put a semicolon before (if required) and after it to separate it from the other paths.

To check that MinGW is set up properly, go to the command prompt and type "g++". You should get a message along the lines of "g++: fatal error: no input files; compilation terminated." Though it's an error, it is actually a good one: it confirms MinGW is in the PATH, as the error is coming from g++ rather than Windows.

    3.) CMake's installer will set CMake up properly for you.
   
    4.) Install CodeBlocks; once again, its installer will do everything for you. While it's installing, it should find the MinGW installation we did earlier. If it didn't find MinGW, or you installed CodeBlocks before MinGW, don't worry - we'll just have to tell it where MinGW is.

Even if CodeBlocks did find MinGW by itself, I would still go through the following steps to check that everything is set up properly.

Start CodeBlocks, then head to the "Settings" drop-down menu at the top, click "Compiler...", then the "Toolchain Executables" tab. First set "Compiler's installation directory" to where MinGW was installed (the same path we added to PATH, minus the \bin folder on the end). As for the other settings:

        -C compiler:                           gcc.exe
        -C++ compiler:                      g++.exe
        -linker for dynamic libs:        g++.exe
        -linker for static libs:             ar.exe
        -Debugger:                             GDB/CDB debugger
        -Resource compiler:               windres.exe
        -Make program:                     mingw32-make.exe (yes, I know we installed a 64bit MinGW)

If for some reason these executables don't exist in your installation, they are most likely there but named something slightly different: go into MinGW's \bin directory and look for something similarly named. Now CodeBlocks should be ready to go. Create a hello world project and see if it all works.


--Compile OpenCV for MinGW--:


    1.) Start cmake-gui.exe; it should be under "C:\Program Files (x86)\cmake-2.8.12.2-win32-x86\bin", or you could just search for it.

For the text field "Where is the source code:" in the CMake GUI, enter the directory where OpenCV was extracted (should be "C:\opencv") - don't enter any of the deeper folders which look like they may have some code. The correct directory is the one with, among other things, a "CMakeLists.txt" file in it. This is the file that tells CMake what user-selectable options there are for our library and how to create compilation instructions for the compiler. CMake WILL NOT compile anything - it just configures things ready for compilation.

Now, for "where to build the binaries" I would recommend making another directory called "opencv_mingw" under "C:\" and using that. Once the directories are sorted, click the "Configure" button. In the window that pops up, select "MinGW Makefiles" in the dropdown, select the radio button "Use default native compilers" and click "Finish". A progress bar will appear along with some text in the output; this may last for a minute or so - all CMake is doing is testing the compiler we installed (MinGW), which it found because it was included in the PATH. As for all the checkboxes that then appear in CMake, the defaults will be fine, but the point of compiling yourself is that you get to choose, so if there is anything from other guides that you want to enable, do it. Now click "Generate".
   
    2.) Now we leave CMake and go to the command prompt. Navigate to "C:\opencv_mingw", or whatever you named it - just make sure it's the directory you put down for "where to build the binaries" in CMake. You can navigate there using the command "cd <directory>", which stands for change directory. The command won't appear to do anything, but if you look to the left of the flashing cursor in the prompt, you will see that it now shows the name of the directory (if it went successfully). Now type in "mingw32-make"; the command prompt will light up for about 30 minutes, depending on your PC's power. When that is done, type in "mingw32-make install". DON'T miss this second step because you see some newly created files and think you are done.

    3.) Open up CodeBlocks and create a new "console application"; you will see it appear in the directory tree on the left of the CodeBlocks window. Right click on your project (it should have a logo next to it with 4 coloured blocks) and select "Build options". Go to the tab "Search directories", select the second-tier tab "Compiler" and add the directory "C:\opencv_mingw\install\include"; now select the second-tier tab "Linker" and add the directory "C:\opencv_mingw\install\lib". Now go to the first-tier tab "Linker settings": under the currently-empty list "Link libraries", add all the files in "C:\opencv_mingw\install\lib" - the directory will contain about 20 files, all of which end in ".dll.a". To add them all quickly, just click the "Add" button, then the button with the three dots in the window that pops up (right next to where you'd normally type a path), navigate to "C:\opencv_mingw\install\lib" and shift-click to select all the files. If you are asked whether you want to use relative paths, click no; this won't affect operation either way, but relative paths are messy.

YOU WILL HAVE TO DO THE STEPS ABOVE FOR EVERY NEW CODEBLOCKS PROJECT YOU CREATE WHICH YOU WANT TO USE OPENCV IN. There is a way around this: do the previous steps in the global rather than project settings, via the menu bar under "Settings" then "Compiler...". Note that if you do this, it will apply to every project you open in CodeBlocks.

    4.) CodeBlocks, OpenCV and MinGW are now all set up to work, but if you try to compile a program which includes the OpenCV libraries, it will compile yet crash when you run it. This is because Windows can't find the OpenCV DLLs we created when we compiled OpenCV (in simple terms, DLLs are libraries written by humans in C++ etc. and compiled into machine-readable code, which programs load at runtime). So what we are going to do now is add the OpenCV DLLs to Windows' PATH environment variable, like we did with MinGW: add the directory "C:\opencv_mingw\install\bin" to PATH.


    5.) Test everything by compiling and running a test program. Here is a test program which should run if everything is working fine:


#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
 
using namespace cv;
 
int main()
{
    Mat image;                                          // new blank image
    image = imread("test.png", 0);                      // read the file (0 = load as greyscale)
    namedWindow("Display window", CV_WINDOW_AUTOSIZE);  // create a window for display
    imshow("Display window", image);                    // show our image inside it
    waitKey(0);                                         // wait for a keystroke in the window
    return 0;
}



If it compiles but shows no image, there are a few common reasons why, and you should keep the following in mind when using CodeBlocks:

    -You need to create a .png image called "test.png" and put it under the folder "<my_project>\bin\Debug", this is where the executable file will be put whenever you compile your own programs.
    -The "run" button in CodeBlocks (looks like a green play button) seems to run programs but never sees any files which you put in the same directory as the executable (and refer to in software with relative paths). So instead of using the run button in CodeBlocks, navigate to "<my_project>\bin\Debug" in Windows Explorer and run the executable yourself.

--Credits and Resources--:

http://kevinhughes.ca/tutorials/opencv-install-on-windows-with-codeblocks-and-mingw/
http://zahidhasan.wordpress.com/2013/02/16/how-to-install-opencv-on-windows-7-64bit-using-mingw-64-and-codeblocks/

Friday, 31 January 2014

DIY CNC Linear Rail Design

For my CNC machine I decided that I would try to make my own linear rail system using 608 skate bearings and flat steel. This was to try to save money, and I must say that my design, which made the best of the equipment I had on hand, failed miserably. Luckily I decided to try making a prototype with a 1m length and half a carriage before diving in, buying metres and metres of steel and wasting my time building any further.

Here's a drawing of the rail with a carriage riding on it:

This is one side of that carriage made and put together. To get adjustability, and thus make up for the lack of tolerance (I had to do this by hand), I drilled the holes a bit large on purpose so that I could (in theory) get the bearings pressing tightly against the linear rail and then just tighten up the bolt:
These are the four pieces of SHS cut and drilled; two of them have cuts which allow access to bolts which have to be tightened. It turns out that there was not adequate room to get at the bolts with this design anyway:
The linear rail I put together. It is made of two pieces of steel bolted together: 50mm SHS with 3mm walls, and steel flat which was 5mm thick by 75mm wide, if I remember correctly.
For the prototype, I only used construction-grade steel, and this showed in the surface finish; in the final design I planned to replace the steel flat with precision-ground steel flat and use the structural SHS only for support and rigidity.
Another problem with the overly large holes, which I had hoped would give adjustability, was that the nut had less flat area to sit on and so tended to skew off at an angle and not be all that rigid; this was exacerbated by the square tube's very thin walls. I also couldn't get a washer into the tight space to help the nut. This led me to trying to mill slots for adjustability. Since I don't have a milling machine or a milling attachment for my lathe, I just clamped the piece to the tool post, which was very finicky and not very rigid with all the milling force on the clamp rather than the tool post. It sure did mill fairly well though.

In the end I decided against this method of making home-made linear rails, even if I could have got the easy adjustability and rigidity which were the major challenges with the tools I have. The final reasons I decided against this design: the carriages were too bulky; making them was way too time-consuming given the number I would have to make in total; and the precision-ground steel flat for the bearings to run on would cost a lot, negating any savings of going homemade. Bottom line: you're going to get what you pay for, and your labour is not worth wasting when the products out of China are so cheap.

Instead, I've been considering 16mm supported linear rail, mounted on a frame made of 50mm SHS or larger with the thickest walls I can get, and the linear bearings to go with it. Lucky for me, it turns out that the supplier of the linear rail is a short drive away.

Monday, 14 October 2013

Sanguinololu Heated Bed Connector Failure

Recently, an unpleasant surprise greeted me when I went through my usual pre-print checks on my printer to make sure there weren't any loose bolts, obvious shorts or loose connections. At first I thought it was just a bit of dust on the heated bed connection, but on closer inspection it was obviously burnt. It hadn't been like that when I started my last print or I would have seen it; in hindsight, that connector must have gotten very hot. At this point I still thought it was only a small burn and that I could still get a print in, but luckily, having time, I didn't risk it and tried to pull the connector out - it was fused in place and took a lot of nerve-wracking work with pliers, in the small space around fragile components, to remove.

Upon removal it was obvious that this connector had had its day, so I cut it off and was thankful that I had some screw terminals on hand to replace the Molex connectors, which were under-rated for the current; the pins on the board were also covered in charred plastic residue. It was very lucky that I decided to replace the connector, because I discovered that the wall in the plug between +12V and ground had been burnt away, so a short could have happened very easily and blown the bed MOSFET, which would have been a real pain.

I then soldered in the screw terminals. Desoldering the old Molex pins was a pain with only solderwick available, as I could never quite get all the solder out of the holes, so a solder pump/sucker is on my list of things to buy. It was only possible to get the connector off by pulling the plastic sheathing off the actual pins (risky, because it puts a lot of stress on the board) so they could be pulled out individually while the solder was molten. Afterwards there was still solder left in the holes to remove. I've found that the best way to tackle this with only solderwick is to put as much solder into the hole as you can, so there is a large bulge coming out of it, then add lots of liquid or gel flux on top of the little dome and try to wick the solder out. I found that you have to leave the wick and iron there for at least 7 seconds to be sure of clearing the hole; some holes took me a few tries.

The screw terminals that I used were very hard to find. It turns out that 2.54mm pitch screw terminals are hard to come by (do not get 5mm or 5.08mm pitch by accident, as these are by far more common) and that 4-pin ones are even rarer than 2- or 3-pin ones. I couldn't find them on element14 or similar suppliers, and a search through eBay's own search engine turned up nothing, but Google directed me to these ones on eBay with the exact same search query I had put into eBay's search engine - eBay's search engine is broken. So far this is the only source of 2.54mm pitch 4-pin screw terminals that I have come across. They appear to be rated (according to some writing on them) for 6A up to 150V per pin, with RU certification, though I would take this with a pinch of salt given the lack of branding and the supply source. Fingers crossed that these will work for me. Also of interest: nophead has his own Sanguinololu modifications which I think would leave a lot more margin for safely carrying the bed's current, but a ring terminal is needed and the modification makes it a bit harder to get a heat sink on the MOSFET.