It runs remotely on a Windows 8 computer. The script is fairly straightforward, to keep the coding challenges low (for me), so I had to take some shortcuts in the solution. I describe the reasons in the Google blog.
One of the two major challenges was of course the timing. I think the sample_time is on the edge of workable; using py_websockets_bot I couldn't get it much lower than 0.05 s anyway.
The timer produces odd latency peaks on several computers, even in plain Python without your classes, and I'm not sure whether that's down to Python itself. I'd like to recode the script in C/C++ at some point to find out.
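Those peaks are easy to quantify before reaching for C/C++. A minimal sketch (my own helper, not part of py_websockets_bot) that sleeps at the loop's sample time and records how late each wake-up is:

```python
import time

def measure_jitter(sample_time=0.05, samples=200):
    """Sleep repeatedly for sample_time and record how late each
    wake-up arrives.  Returns (worst overshoot, average overshoot)
    in seconds; large worst-case values are the 'weird peaks'."""
    overshoots = []
    last = time.time()
    for _ in range(samples):
        time.sleep(sample_time)
        now = time.time()
        overshoots.append((now - last) - sample_time)
        last = now
    return max(overshoots), sum(overshoots) / len(overshoots)
```

If the worst-case overshoot is a sizeable fraction of the 0.05 s sample time, the control loop will feel it regardless of which language the maths is written in.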
The other phenomenon was the difference in performance between the left and right motors. I could get one side producing a perfect graph, but the other side fell short of the maximum PWM, giving the balancing loop too much weight in the controlled part. In the blog I mention almost everything I tried, but I'm still guessing at a solution.
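One software-side workaround for mismatched motors is a per-side gain trim applied just before the speeds go to the driver. This is a hypothetical helper with made-up gain and PWM-range values, not something from py_websockets_bot; the gains would be tuned from straight-line driving tests:

```python
def trimmed_speeds(left, right, left_gain=1.0, right_gain=0.92, max_pwm=255):
    """Scale each side by its own gain so both motors top out at the
    same effective speed, then clamp to the PWM range.  The 0.92 here
    is a placeholder: tune it until the robot drives straight."""
    left = max(-max_pwm, min(max_pwm, left * left_gain))
    right = max(-max_pwm, min(max_pwm, right * right_gain))
    return left, right
```

It only masks the hardware asymmetry rather than explaining it, but it keeps the weaker side from saturating the balancing loop first.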
Anyhow, the outcome is satisfactory as an intermediate result. I've just ordered the Grove compass, and I'm curious whether I can add it while still using py_websockets_bot.
The epic after that will be trying out the improved Dagu Mini Driver. I'm thinking of building the control loop, including the compass, on the Arduino side; the memory should be sufficient now, I think.
That leaves the Raspberry Pi (probably also a Model 2) for localisation and mapping, incorporating vision control. (I might need your help implementing the new Dagu board in py_websockets_bot.)
And of course I'll try better-quality encoders as well then. (You'll have them launched in your store by then, I hope.) ;-)
So still a lot of fun ahead, and I'll be back when I have something I can share.
Cheers,
Bastian

Statistics: Posted by Bastian — Wed Jun 17, 2015 7:55 pm