Robotic Camera Platform - the story so far, where to from here?


As posted here, I am planning to use the Parallella as the main processor for my robotic camera platform. This is something I started thinking about 3 or so years back. It is just something to do with my hobbies, combining my love of photography and robotics - it need not be inherently useful in and of itself, but who knows where it will lead.

Back in May 2011, I played around with controlling my DSLR from my PC (running Linux), and also found a library I was able to adapt and run on an 8-bit AVR, along with a USB host controller I could use to connect to the camera. I hooked up a character-based LCD to show the current aperture/focal length and exposure settings, and used a serial connection from a PC to drive the thing. The initial aim was to use a keypad/LCD when a laptop was not connected.

This idea evolved as embedded Linux devices became available and cheap, so I moved towards the idea of an onboard computer, adding an Android UI (via Bluetooth) for control.

Since then, I have built a prototype servo/stepper actuator which takes the weight of the DSLR and lens and allows movement in two axes (yaw/pitch - roll can be achieved by rotating the image, after all). Around the same time I played around with using an accelerometer and PID control for maintaining balance, and intend to use this for managing both balance and movement.
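
For reference, the balance loop is just a plain PID controller fed by the accelerometer's tilt estimate. Below is a minimal sketch of the idea in C; the gains, loop rate, and tilt readings are placeholder values for illustration, not numbers from the actual rig.

```c
/* Minimal one-axis PID balance sketch. Gains and readings are placeholders. */
#include <stdio.h>

typedef struct {
    double kp, ki, kd;   /* tuning gains (hypothetical values below) */
    double integral;     /* accumulated error */
    double prev_error;   /* error from the previous step */
} pid_ctrl;

/* One control step: error is (target tilt - measured tilt) in degrees,
 * dt is the loop period in seconds; returns a motor drive command. */
static double pid_step(pid_ctrl *pid, double error, double dt)
{
    pid->integral += error * dt;
    double derivative = (error - pid->prev_error) / dt;
    pid->prev_error = error;
    return pid->kp * error + pid->ki * pid->integral + pid->kd * derivative;
}

int main(void)
{
    pid_ctrl pitch = { .kp = 2.0, .ki = 0.5, .kd = 0.1 };
    double target = 0.0;    /* hold the platform level */
    double measured = 5.0;  /* pretend the accelerometer reads 5 degrees of tilt */
    double dt = 0.02;       /* 50 Hz control loop */

    for (int i = 0; i < 5; i++) {
        double command = pid_step(&pitch, target - measured, dt);
        printf("step %d: tilt=%.2f deg, command=%.3f\n", i, measured, command);
        measured -= command * dt;  /* stand-in for the real motor/IMU response */
    }
    return 0;
}
```

The real loop will read the tilt from the accelerometer and send the command to the AVR motor driver instead of the simulated update above.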

Now that I have improved my abilities to cobble old bits of junk together, will (soon) have access to a 3D printer, have a Parallella on my desk, and have a Porcupine daughterboard freshly arrived (generously donated by Adapteva), I plan to revive this project and share my progress.

The first step is to resurrect the mechanical enclosure, develop a bit of software around libgphoto2 (since I will take the Linux approach rather than the AVR-driven one), and drive everything from the Parallella, which will be the new brains of the operation. The AVR will still be used as the motor driver to keep the logic level conversions to a minimum. I will also hook an accelerometer into the system and use it for balancing. Once this crude system is working, I'll provide an update.
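
As a starting point for the libgphoto2 side, a minimal capture test looks roughly like the sketch below. It simply autodetects the first camera on USB and triggers one capture; it's untested against my actual setup and skips any real error handling or image download.

```c
/* Minimal libgphoto2 capture test: detect the first camera, trigger one shot. */
#include <stdio.h>
#include <gphoto2/gphoto2.h>

int main(void)
{
    GPContext *context = gp_context_new();
    Camera *camera;
    gp_camera_new(&camera);

    /* gp_camera_init() autodetects the first USB-connected camera. */
    int ret = gp_camera_init(camera, context);
    if (ret < GP_OK) {
        fprintf(stderr, "No camera found: %s\n", gp_result_as_string(ret));
        gp_camera_free(camera);
        return 1;
    }

    /* Trigger a capture; the camera reports where it stored the image. */
    CameraFilePath path;
    ret = gp_camera_capture(camera, GP_CAPTURE_IMAGE, &path, context);
    if (ret == GP_OK)
        printf("Captured %s/%s on the camera\n", path.folder, path.name);
    else
        fprintf(stderr, "Capture failed: %s\n", gp_result_as_string(ret));

    gp_camera_exit(camera, context);
    gp_camera_free(camera);
    gp_context_unref(context);
    return ret == GP_OK ? 0 : 1;
}
```

On the Parallella (or any Linux box) this should build with something like gcc capture.c $(pkg-config --cflags --libs libgphoto2). From there the plan is to expose the aperture/shutter settings the old AVR version showed on the LCD.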

The second step will be to improve the mechanical aspects of the system through the use of 3D printed parts which are easy to affix motors to. Many of the bits cobbled together from old computers and whatever was lying around will be replaced with more useful parts.

Thirdly, I want to get around to developing the Android UI, as well as using the accelerometer in the phone to control where the camera faces (I have prototyped this using a 3-servo arm I threw together a while back).

Fourthly, after some experimentation, I should have a better idea of what sort of motors / drive mechanisms I want to use (and work out where the trade-offs between speed, accuracy, holding torque, and vibration level when locking the shaft need to be). I'm putting this off until after a discovery/learning phase (and the accelerometer control work), as I am not a mechatronics/mechanical engineer.

Then we'll see where it goes from there.

Tagged as camera control
