Dawn Robotics Forum Support and community forums for Dawn Robotics Ltd 2015-04-27T15:56:33+01:00 http://forum.dawnrobotics.co.uk/feed.php?f=12&t=1352 2015-04-27T15:56:33+01:00 2015-04-27T15:56:33+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1756#p1756 <![CDATA[Re: RB Reads Signs – continued]]>
Thanks for the directions. I've seen some examples of stationary cameras acting as an indoor GPS. As with using markers (far easier), it would imply preparing the room before exploration. Remote control, like the RPi Camera Bot, could also fill in the missing info, but I think it will be hard work to combine this with the readings. Andrew Davison (I didn't know his work; I noticed he has published a lot of interesting material around this topic and I will dig into it much deeper) already mentioned the prerequisites: 2D movements, known controls, modelled dynamics, well-calibrated sensors, simple environments with unambiguous landmarks. So I'd better start with your suggestion to calibrate the sensors as much as possible and do the tests in a very similar environment (especially floor & light conditions). After all, I'm not the one preparing for a PhD :D

Another topic I'm exploring is communication with the bot. I've got more than enough computer power to filter and process massive amounts of data, but sending it using Tornado and Jsocks imposes its own limits. For example: reading the sensors attached to the Mini Driver seems to be limited to a good 67 times per second (it should be around 100 according to the Arduino sketch). Starting the camera streamer reduces that to 24 times per second (-64%). Displaying it with OpenCV brings it further down to 18 times (-73%). This restricts the bot's agility in a serious manner. (Note: I was surprised by the performance degradation caused by print statements in Python. I haven't worked out the logic behind that yet. The picture shows the diagrams of my tests. Appending the readings to a list seems to be far more appropriate.)

Image
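The print-versus-append effect can be reproduced with a small sketch like this (the loop count and the `time_it` helper are made up for illustration):

```python
import time

def time_it(label, action, n=10000):
    """Time n calls of action and return the elapsed seconds."""
    start = time.perf_counter()
    for i in range(n):
        action(i)
    elapsed = time.perf_counter() - start
    print("%s: %.0f calls/sec" % (label, n / max(elapsed, 1e-9)))
    return elapsed

readings = []
slow = time_it("print", lambda i: print(i))              # console I/O every call
fast = time_it("append", lambda i: readings.append(i))   # in-memory only
```

Each print call pays for formatting plus (buffered) console I/O, whereas appending to a list is a single in-memory operation, which is why logging to a list and dumping it afterwards keeps the sensor loop fast.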

Anyway: lots of exploration and learning to consume for the coming weeks! Thanks again and I’ll surely keep in touch!

Regards,
Bastian

Statistics: Posted by Bastian — Mon Apr 27, 2015 3:56 pm


]]>
2015-04-27T00:18:49+01:00 2015-04-27T00:18:49+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1755#p1755 <![CDATA[Re: RB Reads Signs – continued]]>
Yeah, I'll post a link to my thesis once it's done. Someone might read it. :)

Mapping, or SLAM, is really tough, and it's pretty much impossible to know if you've gone in a really straight line unless you've got some kind of external positioning system such as GPS - and it's hard to get a signal indoors. :)

In theory you could use the camera to look at the environment around the robot to see how it's moved, although that kind of general-purpose computer vision is very hard and very much an area of active research. Probably the easiest way to proceed, if you don't mind modifying the environment, is to put up AR markers (like your signs) and use a library like ArUco to spot them. You can use those sightings to estimate your position, and integrate the position measurements with a Kalman filter to filter out noise.

For the encoders, you try to tune them as well as possible so that the robot goes in a straight line (although a perfectly straight line is usually impossible due to things like wheel slip), and then you try to characterise their inaccuracy as noise, i.e. drive forward 1m a few times, and ask: what's the mean and covariance of the position and orientation you end up with? Taking this statistical model of your encoders, you can then take the commands you send to the motors, estimate how the robot is likely to have moved, and feed that into your Kalman filter. Hope that makes some kind of sense...
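That characterisation step can be sketched like this (the end poses below are made-up sample numbers; in practice you would record them from repeated 1m test drives):

```python
import numpy as np

# End pose (x, y, heading) the robot actually reached after each
# nominal 1m straight drive. Illustrative values, not real measurements.
end_poses = np.array([
    [1.02,  0.03,  0.05],
    [0.98, -0.01, -0.02],
    [1.01,  0.04,  0.03],
    [0.97, -0.02, -0.04],
    [1.00,  0.01,  0.01],
])

mean_pose = end_poses.mean(axis=0)           # expected outcome of a "1m" command
cov_pose = np.cov(end_poses, rowvar=False)   # 3x3 covariance of the motion noise

# mean_pose and cov_pose then feed the prediction step of a Kalman filter
# as the motion model and its process noise.
```

The covariance matrix is what tells the filter how much to distrust the odometry relative to the marker sightings.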

If you haven't seen it already, then the work of Andrew Davison is good to read in this area. It's getting quite old now, but his PhD thesis is a really good introduction to the SLAM problem, and he's done a lot of interesting work on mapping with cameras and robots since then.

Regards

Alan

Statistics: Posted by Alan — Mon Apr 27, 2015 12:18 am


]]>
2015-04-26T11:21:15+01:00 2015-04-26T11:21:15+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1753#p1753 <![CDATA[Re: RB Reads Signs – continued]]> Thank you for sharing. I find balancing time, energy and money to be the universal, everlasting and greatest challenge in life. :) I do believe you’re on track to cope with that: I think further developing Dawn’s business model will be successful. Your RPi Camera Bot is a good appliance for hobbyists to start with.

A lot of appealing research at BRL! I think I understand how your MOET project led you to your thesis study. (Made the Pi read a digital display and then realised: 'Now what?') Maybe you could send me a link once it's published?

I'm trying to get my head around applying a PID in an unknown situation. E.g. when a bot is exploring and mapping an unknown space/room, how can I be sure that it has moved in a truly straight line? I've tried some things, but for now I got stuck operating them (timing constraints, buffers and latency phenomena). So first I'll concentrate on experimenting with your classes.

Thanks a lot for taking the time to answer and all the best while finishing your research.

Regards,
Bas

Statistics: Posted by Bastian — Sun Apr 26, 2015 11:21 am


]]>
2015-04-25T01:06:20+01:00 2015-04-25T01:06:20+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1751#p1751 <![CDATA[Re: RB Reads Signs – continued]]>
Sorry for the delayed reply.

My PhD is entitled 'Autonomous Model Building Using Vision and Manipulation' and it's taking me an absolute age to complete. Basically, the aim is to program robotic systems so that they can build internal models of themselves (such as a kinematic model which describes how they move) and of objects around them, without having those models provided by the people who built them. The idea is that this would give robotic systems increased resilience, leaving them able to cope if their bodies change or if they encounter unfamiliar objects. You can see my rather out-of-date PhD webpage here, which gives a bit of background. I've got the feeling whilst doing my PhD that an academic career isn't really for me, which is why I started Dawn Robotics, hopefully as a way of making a living whilst making robots. I spent a lot of last year working on Dawn Robotics when I really should have been doing PhD stuff and am now having to push hard to complete. Slowly getting there though. :roll:

That's very impressive, hearing about your sons. Sounds like you've got all the skills for a cool tech start-up in the family. :)

My understanding of PID loops is not the best, and it looks like tuning them successfully can require quite a bit of experience. But from speaking with other roboticists it seems as if they're often a good first thing to try (they've been in use for a very long time) and are often more appropriate than more complex control strategies. I've found the loop tuning section of the Wikipedia page on PID controllers to be very helpful, and another good strategy I've found is to plot the quantity you're trying to control over time so that you can see things like overshoot etc and judge how well your PID gains are working. If you look in the robot_controller.py file of the experimental branch you should see that it has a DUMP_MOVEMENT_TO_CSV variable that you can use to produce graphs from.
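As a rough illustration of that plotting strategy, a minimal PID loop that logs the controlled quantity each step might look like this (the gains and the toy one-dimensional plant are illustrative, not values from the robot code):

```python
class PID:
    """Minimal PID controller; the gains used below are illustrative, not tuned."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.last_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.last_error is None else (error - self.last_error) / dt
        self.last_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy 1-D plant: velocity proportional to controller output. Logging the
# controlled quantity each step gives the overshoot/settling plot described
# above (same spirit as the DUMP_MOVEMENT_TO_CSV output in robot_controller.py).
pid = PID(kp=0.8, ki=0.1, kd=0.05)
position, target, dt = 0.0, 100.0, 0.02
log = []
for step in range(200):
    output = pid.update(target - position, dt)
    position += output * dt
    log.append((step * dt, position))
```

Plotting `log` (e.g. with matplotlib) makes overshoot, oscillation and steady-state error immediately visible, which is far easier than judging gains from the robot's behaviour alone.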

I tried a few things with the encoder code that I wrote for our robot kit, but the thing that seems to work best is to focus on position control for the wheels (as opposed to trying to directly control the velocity), e.g. driving forward to a position of, say, 1000 encoder ticks. I was able to control speed, and then smoothly accelerate and decelerate, by varying the amount that the target position changed each time step. It's also worthwhile googling 'micromouse' and 'encoders', as micromouse competitors have a lot of experience of using encoders to generate controlled movement.
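The accelerate/cruise/decelerate idea of varying the target each time step can be sketched as follows (`ramp_target` is a hypothetical helper and the tick counts are illustrative):

```python
def ramp_target(total_ticks, max_step, accel_step):
    """Generate per-timestep encoder target positions that accelerate,
    cruise and decelerate by varying how far the target advances each step.
    Hypothetical helper, for illustration only."""
    targets = []
    pos, step = 0.0, 0.0
    while pos < total_ticks:
        remaining = total_ticks - pos
        # Decelerate once the 'braking distance' at the current step size
        # reaches the remaining distance; otherwise accelerate up to max_step.
        if step * step / (2.0 * accel_step) >= remaining:
            step = max(step - accel_step, accel_step)
        else:
            step = min(step + accel_step, max_step)
        pos = min(pos + step, total_ticks)
        targets.append(pos)
    return targets

# Drive to 1000 encoder ticks, advancing at most 20 ticks per timestep,
# changing the advance rate by 2 ticks/step each timestep.
targets = ramp_target(1000, max_step=20, accel_step=2)
```

Feeding each entry of `targets` to the position PID in successive time steps produces the smooth trapezoidal speed profile described above.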

The encoders I was talking about are ones I'm planning to sell once I've finished my PhD. They're Hall-effect sensors that measure the rotation of a magnetic disc which attaches to the extended shaft of the motors that come with the 2WD kit. I'm hoping to sell them for about £12.50ish per pair, and all the hardware is done. It's just that, as I said, I've had to put them aside to focus on completing my PhD, as otherwise I get far too easily distracted. :)

Regards

Alan

Statistics: Posted by Alan — Sat Apr 25, 2015 1:06 am


]]>
2015-04-22T20:49:11+01:00 2015-04-22T20:49:11+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1746#p1746 <![CDATA[Re: RB Reads Signs – continued]]>
Google fixed the original video for me as well, so both links work (just in case anyone is looking for the same info I’ve been scavenging the last months).

Is it inappropriate to ask about the subject of your thesis? Just out of curiosity. (My job is building and consolidating datacentres and my team members are all MScs in very different topics; some of them have also started on a PhD. My eldest son just graduated at Delft University on fluid and aerodynamics; my youngest is preparing his master's thesis on vision-controlled vehicles within automotive engineering.)

Concerning the scripting, I'm trying to find a method to move directly to a focused point. Up till now I used the encoders in the same way I used the focused centre of my signs: measure the difference between left and right and send an adjustment to the motors until the destination point is reached. For differential driving this results, after a lot of tuning, in a nicely curved move within a certain set of ranges. When (if) I'm able to use a PID, I will be able to use the method for another topic on my idea list (explore an unknown space/room, detect walls and objects by vision control, collect sensor readings - like gases -, take pictures and map the space/room using compass and encoder data). Another desire is building a balancing robot, for which a Kalman filter can be combined with a PID (but I only got as far as installing pykalman).
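The left/right adjustment described above can be sketched as a simple proportional correction (the function name and gain are made up for illustration):

```python
def adjust_motors(base_speed, left_ticks, right_ticks, gain=0.5):
    """Proportional correction on the encoder difference: slow the wheel
    that is ahead, speed up the one behind. Name and gain are illustrative."""
    error = left_ticks - right_ticks          # > 0 means the left wheel is ahead
    correction = gain * error
    return base_speed - correction, base_speed + correction

left_speed, right_speed = adjust_motors(100, 52, 50)   # left wheel slightly ahead
```

This is effectively the P term of a PID on the tick difference; adding I and D terms is what removes the steady drift and damps the oscillation that pure proportional correction leaves behind.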

Up till now I've been struggling to use a PID, for several reasons. I still don't really understand the logic behind it (I left school many years ago, daily duties leave only a few scarce hours at night, and finding meaningful info through Google is extremely time-consuming). Secondly, my knowledge of Python still lags behind where classes (and especially multiprocessing classes) are concerned. So I understand your script handler and differential drive controller classes are meant to move in a straight line towards a predefined destination point. For me a great subject to dig into: I'm still convinced it is possible to have the Dagu frame moving in a straight line on a tiled surface (and I'm certainly keeping it for some other ideas as well). But for now I'm just puzzled by the matter of PIDs. So, if you've got some docs or links to point me to, I would be much obliged.

You mention you sell encoders, but I could only find them in the Rover kit. On my RB2 I'm using rather simple single-output optical encoders (I had to start somewhere). Do you sell the encoders as a component, or only as part of a full kit?

Regards,
Bastian

Statistics: Posted by Bastian — Wed Apr 22, 2015 8:49 pm


]]>
2015-04-22T17:16:32+01:00 2015-04-22T17:16:32+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1745#p1745 <![CDATA[Re: RB Reads Signs – continued]]>
New video works for me and looks good. :)

Yes, I have some encoders ready to go for the 2WD chassis that we sell, and I've done most of the work on the encoder software, although it needs to be cleaned up a lot. The main thing I need to do is produce some assembly instructions, but I've been holding off for the last 2-3 months while I struggle to finish the write-up of my PhD. Hopefully I'll be finished in the next couple of weeks though, at which point I'll have time to focus full time on Dawn Robotics.

If you have any questions about what our encoder code is 'trying' to do then I'd be more than happy to answer if it helps with your code.

Regards

Alan

Statistics: Posted by Alan — Wed Apr 22, 2015 5:16 pm


]]>
2015-04-21T21:51:35+01:00 2015-04-21T21:51:35+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1742#p1742 <![CDATA[Re: RB Reads Signs – continued]]>
https://youtu.be/igyVvmr4hXI

It's a simple video taken with my BlackBerry, but I think it illustrates the script well.

I'm currently wrestling with a PID controller for the encoders in Python, but obviously I have to gain some more understanding of the principles first. I noticed on Bitbucket that you started creating classes for an encoder PID. Are you taking that up in the near future?

Statistics: Posted by Bastian — Tue Apr 21, 2015 9:51 pm


]]>
2015-04-21T15:03:53+01:00 2015-04-21T15:03:53+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1740#p1740 <![CDATA[Re: RB Reads Signs – continued]]>
That looks really good. Your new 4WD build looks reassuringly solid, and when I have some time I'm going to have to download your sign reading script and give it a go.

Unable to look at your video at the moment. I get a message saying 'This video contains content from SME, who has blocked it in your country on copyright grounds.' which I assume is down to the video's music?

Regards

Alan

Statistics: Posted by Alan — Tue Apr 21, 2015 3:03 pm


]]>
2015-04-20T20:31:29+01:00 2015-04-20T20:31:29+01:00 http://forum.dawnrobotics.co.uk/viewtopic.php?t=1352&p=1736#p1736 <![CDATA[RB Reads Signs – continued]]> RB1 (blog 'Robot Following Signs') has evolved into RB2. First of all, I switched to a 4-wheel chassis to get more control over the movements. The Dagu Magician frame is OK, but running on a tiled surface the rear castor caused a lot of unexpected swirling. Adding weight solved that, but then the hard tyres started to slip when adding torque. RB2 has an aluminium frame (DFRobot), 4 motors (installed as 2x2) and softer wheels. It doesn't have the funny looks of the Dagu, but the ugly bastard runs like clockwork. The rest stayed unchanged: RPi B+, Dagu Mini Driver and the camera and websockets classes from Dawn Robotics.

Image

The script has evolved as well, and RB2 now operates reliably at an acceptable speed.

Video:
https://youtu.be/7bVeIi_Izqg

The major differences in this script are:
* Colour tracking alone is now used for detection and moving. It's 50x faster than the full routine. I used OpenCV's bounding box to get a more accurate centroid (contours sometimes capture only part of the sign). The bounding box also yields the width of the sign, which is used to keep focus. The difference is shown in the picture by red lines (contours) and a green rectangle.
* A range routine was added, using a constant value applied to the width. Range isn't needed anymore to adjust position and direction, but it can be used to keep distance (so, more fun than functional). :D
* A heads-up display of the centre coordinates and range was added. Also just for fun (but who knows).
* A time-out routine was added, because Python's time.sleep is only reliable at very small intervals and I needed an accurate time-out to enable exact turns.
* The grabbed image is used as a global variable (saves a lot of typing and a small bit of memory).
* Readings while moving are throttled with time-outs, because the routine produces more than a hundred readings in a couple of seconds, overloading the web server and the Pi's memory.
* After reading the sign, the script waits for the latest image using the max-time variable.

Image

* Finally, the full detection routine is used to compare the sign with the reference images. This routine detects the white inner rectangle, shown in the picture as a blue rectangle.

Image
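The bounding-box/centroid/range idea above can be sketched without OpenCV as follows (`sign_geometry` and `RANGE_CONSTANT` are hypothetical names, and the usual pinhole-camera assumption that range is inversely proportional to apparent width is used; the real constant would come from calibrating against a known sign at a known distance):

```python
# Hypothetical calibration constant: apparent width in pixels of a known
# sign observed at a known distance, multiplied by that distance.
RANGE_CONSTANT = 4000.0

def sign_geometry(contour):
    """Bounding box, centroid and approximate range for a detected contour,
    given as a list of (x, y) points. The box matches what
    cv2.boundingRect would return for the same points."""
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    x, y = min(xs), min(ys)
    w, h = max(xs) - x, max(ys) - y
    centroid = (x + w / 2.0, y + h / 2.0)   # steering target
    distance = RANGE_CONSTANT / w           # range ~ 1 / apparent width
    return centroid, w, distance

centroid, width, dist = sign_geometry([(10, 20), (90, 22), (88, 80), (12, 78)])
```

The box centroid is more stable than a raw contour centroid precisely because it spans the full extent of the tracked colour blob even when the contour is fragmented.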

If you're interested, more details are well commented in the script itself, which can be found at:

https://bitbucket.org/RoboBasics/raspberry-robo-cars/src/1434877c12f39efc2c9b2ff99172ad605236914f/Scripts/reading_signs.py?at=master

(I changed from GitHub as well, for the ease of using SourceTree.)

The script can easily be extended with all kinds of routines. I will be working on logging through a digital compass and the encoders. (I noticed that Alan has been working on some preliminary classes for an encoder PID; I'm curious about the result. :idea:)

Have Fun!

Statistics: Posted by Bastian — Mon Apr 20, 2015 8:31 pm


]]>