The [Parts List] is now a GitHub Wiki page:
This is my setup log for my second Raspberry Pi. It is a little older, but I think some might find the walk-through useful.
This could be used to create a robot that is easy to program and control remotely via VNC and SSH.
This is more of a “log” of my work than instructions, so a lot of it is cut and paste from the sources listed.
Please let me know if you run into questions or problems and I will make updates.
You can also run the Propeller SimpleIDE to control the Propeller board.
Following Propeller’s instructions:
#!/bin/sh
### BEGIN INIT INFO
# Provides: vncserver
# Required-Start: $remote_fs $syslog
# Required-Stop: $remote_fs $syslog
# Default-Start: 2 3 4 5
# Default-Stop: 0 1 6
# Short-Description: Start VNC Server at boot time
# Description: Start VNC Server at boot time.
### END INIT INFO
USER=pi
eval cd ~$USER
case "$1" in
  start)
    su -c 'vncserver :1 -geometry 1024x768' $USER
    echo "Starting vnc server for $USER";;
  stop)
    pkill Xtightvnc
    echo "vnc server stopped";;
  *)
    echo "usage: /etc/init.d/vncserver (start|stop)"
    exit 1;;
esac
Then make the script executable and register it to run at boot with: sudo chmod 755 /etc/init.d/vncserver and sudo update-rc.d vncserver defaults
pi@raspberrypi ~ $ cat /etc/network/interfaces
auto lo
iface lo inet loopback
iface eth0 inet dhcp
allow-hotplug wlan0
iface wlan0 inet manual
wpa-roam /etc/wpa_supplicant/wpa_supplicant.conf
iface default inet dhcp
I have been slow to update the site here, but for those of you that are just looking for code, I am uploading everything to my GitHub site here:
I made a video today to document my Parallax ArloBot running Robot Operating System:
I have been very busy over the past few months building up a robot to make use of the Robot Operating System (ROS).
However, I’ve already been planning my robot version for a while, and ROS is more of an add-on, albeit a huge one, than a starting point. While the TurtleBot has a lot of features that I may never be able to come up with myself, it is missing a couple of key ones:
What I’ve been looking at is the Arlo Robot from Parallax: http://www.parallax.com/product/arlo-robotic-platform-system
This is by no means a “pull it out of the box and go” platform, but it is well within my skills. No designing circuits, no laser cutting, no hacking together of bits of metal and plywood. It can be built by anyone who can put together a radio controlled car kit, and their site is basically designed around teaching kids how to program their controller board, so the learning curve is well assisted.
Fortunately the Arlo Robot is also similar in shape and layout to the TurtleBot. Once I had built code for the Propeller-based Activity Board to talk to ROS, I could actually run a lot of the ROS TurtleBot code with very little modification!
Here is a list of my progress so far:
I am doing all of this on a little ActivityBot, because it uses the exact same controller as the ArloBot. Right now I am just waiting on parts from Parallax to get my ArloBot together so that I can start running ROS with it.
I have not made SLAM or “gmapping” work well yet because the odometry from the ActivityBot is very poor: its little rubber-clad wheels slip very easily and very unevenly on all floor surfaces. I expect the ArloBot to have much less wheel slip. I will also be experimenting with adding a gyro to the Propeller Activity Board to help verify rotation angles.
Since a photo is worth a thousand words, here is a picture of where I am at the moment:
Here you can see my rudimentary ArloBot URDF with a depth registered 3D image from the ASUS Xtion Live and the simulated red laser scan line. Also notice the small line of white dots in front of the robot. That is a simulated LaserScan built from the PING sensor on the front of the ActivityBot. I plan to place PING (Ultrasonic) and InfraRed sensors around the ArloBot and use simulated LaserScans from them to help the robot navigate around low or close obstacles that the 3D Camera misses.
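The simulated-LaserScan idea can be sketched roughly as follows. This is not my actual ROS node (that is on GitHub); it is a hedged illustration in plain Java, with made-up names, of how a single PING distance reading might be dropped into a laser-scan-style array of ranges so the navigation stack can treat the sonar like a very narrow laser:

```java
import java.util.Arrays;

// Sketch (not the real ROS code) of turning one ultrasonic distance
// reading into a LaserScan-style ranges[] array. A real
// sensor_msgs/LaserScan also carries angle_min, angle_increment,
// timestamps, etc.; here we only fill the ranges.
public class FakeLaserScan {
    // Build a scan of `beams` rays where every ray reports "no reading"
    // (infinity) except a few rays around the sensor's bearing, which
    // report the measured distance in meters.
    static float[] fromPing(int beams, int centerBeam, int spread, float meters) {
        float[] ranges = new float[beams];
        Arrays.fill(ranges, Float.POSITIVE_INFINITY);
        for (int i = Math.max(0, centerBeam - spread);
                 i <= Math.min(beams - 1, centerBeam + spread); i++) {
            ranges[i] = meters;
        }
        return ranges;
    }

    public static void main(String[] args) {
        // A forward-facing PING seeing something 0.4 m away, rendered
        // as a short line of dots in the middle of a 10-beam scan.
        float[] scan = fromPing(10, 5, 1, 0.4f);
        System.out.println(Arrays.toString(scan));
    }
}
```

That small cluster of finite ranges in an otherwise empty scan is what shows up as the line of white dots in front of the robot.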
I have a lot of documentation on my process and my code is all online, so if anyone is interested in more about this please let me know and I will post some more background as well as on going work.
Thanks to BrickLink I found I could buy used UltraSonic sensors for less than $10 each! So I bought two, and now my little friend shouldn’t run into walls as much.
He should also be able to follow a wall on one side while avoiding obstacles ahead.
I tried putting all three up top, like the picture in my first post, but besides making the robot top heavy, I found they missed a lot of obstacles. Putting all three down low keeps them far steadier, lowers the center of gravity, and lets them see lower obstacles.
Because the sensors are not “adjustable” and only provide a basic “distance to nearest object” response, their position on the vehicle determines what you see and what you miss. The lower I mount them, the lower an obstacle they can detect. Too low and they see the ground, too high and they miss things. Just right and they see what I cannot drive over!
Now the fun part is programming. leJOS provides great functions for dealing with UltraSonic sensors, but there is a catch with multiple sensors:
The way they work is that they send out a ping and listen for the sound to bounce back. If two ping at once, neither can tell whether the returning sound is its own echo or another sensor’s.
By default they run “continuously” and you just poll a sensor for the latest reading, but with more than one, you need to make sure only one sends a ping at a time. So now I get to write code to cycle them:
Turn on sensor 1, ping, record distance measured, turn off.
Turn on sensor 2, ping, record distance measured, turn off.
Turn on sensor 3, ping, record distance measured, turn off.
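The cycling steps above can be sketched in plain Java like this. The Sonar interface here is a stand-in I made up for the leJOS UltrasonicSensor class (which has similar ping(), getDistance(), and off() methods), and the sensors in the demo are simulated rather than real hardware, so treat this as a sketch of the idea rather than finished robot code:

```java
import java.util.Arrays;

// Sketch of cycling multiple ultrasonic sensors so that only one ping
// is ever in flight at a time.
public class SonarCycler {
    // Stand-in for the leJOS UltrasonicSensor API (assumed names).
    interface Sonar {
        void ping();        // send a single ping
        int getDistance();  // distance (cm) measured by the last ping
        void off();         // ensure the sensor is not pinging continuously
    }

    // Poll each sensor in turn: ping, wait for the echo, read, move on.
    // Because no two sensors ping at once, none can hear another's echo.
    static int[] readAll(Sonar[] sensors) throws InterruptedException {
        int[] distances = new int[sensors.length];
        for (int i = 0; i < sensors.length; i++) {
            sensors[i].ping();
            Thread.sleep(30); // ~30 ms is plenty at ultrasonic sensor range
            distances[i] = sensors[i].getDistance();
            sensors[i].off();
        }
        return distances;
    }

    // Three fake sensors with canned readings, just to exercise the loop.
    static int[] demo() throws InterruptedException {
        int[] canned = {25, 90, 255};
        Sonar[] fakes = new Sonar[canned.length];
        for (int i = 0; i < canned.length; i++) {
            final int d = canned[i];
            fakes[i] = new Sonar() {
                public void ping() {}
                public int getDistance() { return d; }
                public void off() {}
            };
        }
        return readAll(fakes);
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(Arrays.toString(demo()));
    }
}
```

On the real robot the fakes would be replaced with the actual leJOS sensor objects on their ports; the round-robin loop itself stays the same.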
Then I have to do something with all of this data, but that comes later . . .
This blog is for me to share some of the experiments I do on my own time.
I will throw up some pictures and descriptions over the next few days of some of my past projects and then put updates about what I’m currently doing.
I am a Unix System Admin by trade.
I am working on a degree in Web Development.
I am teaching myself Java.
And I like to play with computers. 🙂
“He’s just this guy, you know?”
Computers have been my hobby and passion since childhood, and I love experimenting with them. My goal here is to share some of my projects for those who are interested and who may want to try some of these things themselves.
I love explaining things and teaching, so if you want more direction on how to make something work do comment and I will elaborate.
I have a full time job, but if you think I’m brilliant and want to pay me good money to experiment with computers for your company then feel free to contact me. 🙂
If you want a little background:
I am a Full Stack Web Developer.
I was a Unix System Admin for ten years.
And I like to play with computers.
In short, I am a jack of all (computer) trades, and master of none.
If you are wondering about the name see Of Ekpyrotic and Froods