I bought a new plate from Parallax to expand my Arlobot.
I’ve been experimenting. 🙂
I’ve been overcommitted lately between work, education, and family, so the poor robot has been neglected.
I’m taking some time out for a needed break and hopefully starting the new year with more free time for hobbies.
For now I’m focusing on the important stuff.
(I guess the signal from so long ago and far away is having trouble getting through.)
I have a list of things to try next year though, so I’ll be back soon I hope.
I just set up a Google Group mailing list at https://groups.google.com/forum/#!forum/ros-for-arlobot
My hope is that with this we can:
So you got your robot built, installed everything, and the calibration worked, but now that you actually want to run the robot it does nothing.
The first thing to do is turn it on and use the `~/catkin_ws/src/Metatron/scripts/direct2PropSerialTest.sh` script, along with the recommended commands from `~/catkin_ws/src/Metatron/scripts/direct2PropSerialTestLines.txt`, to observe and test how everything works without ROS.
However, most people seem to either be doing this already, or be well past it, because you guys are really smart! 🙂
There is a hidden problem though. It is discussed in the calibration instructions, so I was assuming everyone knew about it, but I now realize it isn’t well explained.
Three separate people have contacted me with this issue, so I want to post it here on the blog.
The motor controllers have a strange quirk. Their “normal” mode does not work with the Propeller board! However, within the first few seconds after turning them on they listen for a certain kind of signal. If they get it, they flip to a mode that does work with the Propeller.
What does this mean? If you turn on the motors, then start everything up, the robot won’t do anything.
The solution: wait until the Propeller board is fully running and the first arlo_drive command has been sent to it, and only then turn on the motors.
If you have PING sensors installed, you can know when it is time to turn the motors on by when the PING sensors start flashing.
The main thing is just to make sure the Propeller board is “initialized” before you turn on the motors. That means after the ROS arlobot program is already started, or after you have sent “i 0” to it via the `direct2PropSerialTest.sh` script.
Another solution? Get a USB Controlled Relay and wire it between the motor outputs on the Arlo Power Distribution board and the motor controllers. This way the code will actually turn on the motors at the right time. This is how mine is set up, and why I forget about this problem.
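To make the required power-on ordering concrete, here is a minimal sketch of the idea. The `FakePropeller` and `FakeRelay` classes are stand-ins I made up for illustration (the real setup talks to the Propeller board over a serial port and switches the motor controllers through the USB relay); only the ordering matters.

```python
# Sketch of the required power-on order: initialize the Propeller
# board first, then energize the motor controllers.
# FakePropeller and FakeRelay are hypothetical stand-ins for the
# real serial link and USB-controlled relay.

class FakePropeller:
    """Stand-in for the Propeller board's serial interface."""
    def __init__(self):
        self.initialized = False

    def send(self, command):
        # "i 0" is the init command used with direct2PropSerialTest.sh
        if command == "i 0":
            self.initialized = True

class FakeRelay:
    """Stand-in for the relay between the power board and motor controllers."""
    def __init__(self):
        self.motors_on = False

    def energize(self):
        self.motors_on = True

def start_robot(propeller, relay):
    propeller.send("i 0")  # bring the Propeller board up first
    if not propeller.initialized:
        raise RuntimeError("Propeller not ready; do not power the motors yet")
    relay.energize()       # only now give the motor controllers power
    return relay.motors_on

propeller, relay = FakePropeller(), FakeRelay()
print(start_robot(propeller, relay))  # True: motors powered after init
```

If the motors were powered before the init command, the controllers would have already left their listening window, which is exactly the failure described above.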
Yesterday, after updating to the linux-image-3.13.0-65-generic kernel, my Python serial programs stopped working:
$ miniterm.py /dev/ttyUSB0 115200
--- Miniterm on /dev/ttyUSB0: 115200,8,N,1 ---
--- Quit: Ctrl+] | Menu: Ctrl+T | Help: Ctrl+T followed by Ctrl+H ---
--- exit ---
Exception in thread Thread-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 810, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 763, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/usr/bin/miniterm.py", line 220, in reader
    data = character(self.serial.read(1))
  File "/usr/lib/python2.7/dist-packages/serial/serialposix.py", line 460, in read
    raise SerialException('device reports readiness to read but returned no data (device disconnected?)')
SerialException: device reports readiness to read but returned no data (device disconnected?)
To resolve the issue temporarily, reboot, and at the “GNU GRUB” menu select “Advanced options for Ubuntu”, then move down to select the “Ubuntu, with Linux 3.13.0-63-generic” kernel option (NOT the recovery mode one). It should boot into the old kernel and work fine again. Reverting to the 3.13.0-63 kernel allows miniterm.py and other Python serial-based programs to work normally.
To fix the issue permanently: Remove the new kernel with:
sudo apt-get remove linux-image-3.13.0-65-generic
which removes the new kernel and takes it out of GRUB.
There is a bug report open for this:
I also opened my own, but it is probably a duplicate of the above.
Hopefully the next kernel patch won’t break it again.
Someone asked where I’ve been. What have TwoFlower and I been doing?
I have been quiet here because the robot has gone from a fever of building to slow steady background work.
My focus right now is to build an internal “behavior” system using Behavior3JS to allow the robot to “act” more independently.
The first goal is simply to have it intelligently go from “power on” to freely moving about the room. This may sound simple, but there are a lot of start-up tasks that need to be coded properly so that things are safe and sane.
Then the robot needs to know what room he is in,
load a map for that room,
localize himself within that room if his starting point isn’t point 0 on the map,
know if any doors in the room are open which might lead to dangerous stairs,
and unplug himself.
My github repo kind of shows my work days and progress https://github.com/chrisl8/Metatron/commits/master, but I realize it is also cluttered because I commit too often.
So here is a brief timeline of what I’ve been laboring away at:
June 16th – This is when I built the web based control panel that runs both on board and can be remotely accessed. This lets me control important parameters of the robot remotely and get him started remotely without having to run lots of scripts:
June 17th – I built a setup script to allow you to install everything from one command. This is documented on the front of my Arlobot repository now in the readme.
July 2nd – I added roslibjs to my Node based control panel so that ROS functions can be directly polled and controlled.
July 5th – I added the ability to set “waypoints” using the web interface. This means you can drive the robot to a position on the map, set a waypoint, and then in the future you can tell it to go to that waypoint again and it will return to the same spot on the map.
Now places like “Kitchen” and “LivingRoom” can easily be set and recalled.
July 7th – Added QR code reading. Now the robot can use a QR code taped to the wall to determine what room it is in as soon as it is started and automatically load the correct map for the room without any human intervention.
August 17th – Added the function for the robot to use a waypoint called “initial”, if found for the map, to set its initial “2D pose estimate” so if your normal starting point isn’t the same as point 0 on the map, the robot can automatically localize itself on startup.
Somewhere in there I also tweaked the “stop if a door is open” code and I installed a Moteino based door monitor on my basement door to alert the robot.
Next I am working on the self unplugging routine. This isn’t super fancy. I just anchor the cord to the wall and have the robot back up, but once this works the robot can start up and go without human interaction!
So that is what I’ve been doing! 🙂
It excites me a lot when someone reaches out and asks what I’m up to and why no recent posts. So thank you!
Someone pointed out that they were not sure what the terms in my domain name meant. While it is probably very fun and safe to Google either term, I will explain both briefly here for reference.
“Ekpyrotic” is a term that was coined in 2001 by Paul Steinhardt, from Ancient Greek Ekpyrosis (ekpúrōsis) “conflagration, ekpyrosis”, referring in Stoic philosophy to the destruction and recreation of the world in fire. The word was coined specifically to name the “ekpyrotic universe” or “ekpyrotic scenario” theories. https://en.wiktionary.org/wiki/ekpyrotic
Paul Steinhardt explains the theory succinctly here: http://wwwphy.princeton.edu/~steinh/npr/
However, I think Brian Greene explains it more elegantly in his book “The Fabric of the Cosmos” in his discussion of The Universe on a Brane.
Ultimately “ekpyrotic” isn’t anything deep and mysterious, just a fun word with roots in science that sparks my imagination.
“Frood” means “really amazingly together guy”, from Douglas Adams’ Hitchhiker’s Guide to the Galaxy:
“Hence a phrase that has passed into hitchhiking slang, as in “Hey, you sass that hoopy Ford Prefect? There’s a frood who really knows where his towel is.” (Sass: know, be aware of, meet, have sex with; hoopy: really together guy; frood: really amazingly together guy.)”
Adams, Douglas (2010-09-29). The Ultimate Hitchhiker’s Guide to the Galaxy (p. 21). Random House, Inc.. Kindle Edition.
This is a little random, and off of the usual topic, but it is nerd stuff, so I’m posting it here.
Our electric company recently installed a “smart” meter that allows us to see our energy usage in fifteen minute increments on their web site. I find this fascinating and fun in itself. However, I also find the way it reveals daily life interesting.
Last night our youngest woke us up in the middle of the night when she threw up. We had to get up, turn on lights, and use the vacuum cleaner (“shop vac”) to clean up.
Can you tell when this happened?
Most nights the energy usage is nearly flat during our entire night’s sleep. Obviously this shows our sleep patterns. Now add a “blip” on any night, and suddenly you can tell when something happened in our house. You can just imagine how data like this from thousands and millions of homes could show patterns of illness, and surely other events, around the country!
I’m sure this makes some people afraid of “big brother” watching them, but it only fascinates me. 🙂
Thanks to BrickLink I found I could buy used UltraSonic sensors for less than $10 each! So I bought two, and now my little friend shouldn’t run into walls as much.
He should also be able to follow a wall on one side while avoiding obstacles ahead.
I tried putting all three up top, like the picture in my first post, but besides making the robot top heavy, I found I was unable to see a lot of obstacles. Putting all three down low makes them far steadier, keeps the robot less top heavy, and lets them see lower obstacles.
Because the sensors are not “adjustable” and only provide a basic “distance to nearest object” response, their position on the vehicle determines what you see and what you miss. So the lower I mount them, the lower the objects they can see. Too low and they see the ground, too high and they miss things. Just right and they see what the robot cannot drive over!
Now the fun part is programming. leJOS provides great functions for dealing with UltraSonic sensors, but there is a catch with multiple sensors:
The way they work is that they send out a ping and listen for the sound to bounce back. If two are going at once, they cannot tell if the return sound is their own or another sensor’s.
By default they run “continuously” and you just poll a sensor for the latest reading, but with more than one, you need to make sure only one sends a ping at a time. So now I get to write code to cycle them:
Turn on sensor 1, ping, record distance measured, turn off.
Turn on sensor 2, ping, record distance measured, turn off.
Turn on sensor 3, ping, record distance measured, turn off.
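The cycle above can be sketched like this. The real code would use leJOS’s Java UltrasonicSensor objects on the NXT; since those need actual hardware, this Python sketch uses a made-up `PingSensor` stand-in and only shows the one-ping-at-a-time logic.

```python
# Round-robin polling sketch for multiple ultrasonic sensors.
# PingSensor is a hypothetical stand-in for the real leJOS sensor class;
# only the cycling pattern is the point here.
import time

class PingSensor:
    """Stand-in for a single-ping ultrasonic sensor."""
    def __init__(self, canned_distance_cm):
        self._distance = canned_distance_cm

    def ping(self):
        # On real hardware this would fire one ultrasonic ping;
        # the stand-in does nothing.
        pass

    def get_distance(self):
        # Distance in cm measured by the last ping.
        return self._distance

def read_all(sensors, echo_wait=0.05):
    """Ping each sensor in turn so only one ping is in the air at a time."""
    readings = []
    for sensor in sensors:
        sensor.ping()           # only this sensor transmits right now
        time.sleep(echo_wait)   # give the echo time to return
        readings.append(sensor.get_distance())
    return readings

sensors = [PingSensor(30), PingSensor(120), PingSensor(75)]
print(read_all(sensors))  # one clean reading per sensor, no cross-talk
```

The short wait between pings is the important part: it keeps one sensor’s echo from being mistaken for another’s.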
Then I have to do something with all of this data, but that comes later . . .