Sunday, April 6, 2014

I am excited about the Pixy vision sensor (CMUcam5) that started on Kickstarter a while back. The project began shipping in mid-March 2014; I was an early backer and received my 'award' last week.

Out of the box it comes with decent work-in-progress documentation, with enough of the basics to start playing with it. Given this is a Kickstarter effort and not a full commercial product, I don't expect it to be flawless or perfectly documented. However, the group really did a nice job with the packaging and the quick start info!

I was able to get the hello_world pan & tilt rig running immediately with no microcontroller. Next I hooked up an Arduino to demo the microcontroller SPI interface option. I am rusty on C/C++, so there were a few things that easily confused me. I am now up and running in SimpleIDE, though... I did try some of the Arduino libraries under Propeller C; no go. I need to spend a lot more time on Prop C first!

Not having much experience writing an interface to a sensor from scratch, I struggled with the basics here. However, using Spin and the Propeller Tool, I was able to get the Propeller producing Pixy SPI data output similar to the Hello_World.ino example.

My attempt was a learning exercise on a couple of different levels... I couldn't get the SPI_Spin object to work correctly, so I ended up using parts of Parallax Semiconductor Application Note AN012 to make it work. I also leveraged a USBee logic analyzer to measure and replicate the Arduino communication. Again, my rusty C/C++ skills made getting through the classes, template classes, etc. slow going, and this time I ended up reinventing the wheel... Success is in what I learned along the way.
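For reference, here is a rough Arduino-side sketch of the kind of SPI traffic I was replicating on the Propeller. The frame layout (a 0xaa55 sync word followed by checksum, signature, x, y, width and height as 16-bit, MSB-first words) is my reading of the Pixy serial protocol docs, so treat the details as assumptions and verify against the official documentation; the real hello_world example goes through the Pixy Arduino library instead of raw SPI reads.

#include <SPI.h>

const uint16_t PIXY_SYNC = 0xaa55;        // start-of-block word, per my reading of the Pixy docs

uint16_t readWord() {
  // Pixy sends 16-bit words most-significant byte first
  uint16_t w = SPI.transfer(0x00);
  w <<= 8;
  w |= SPI.transfer(0x00);
  return w;
}

void setup() {
  Serial.begin(9600);
  SPI.begin();                            // Pixy wired as an SPI slave (no slave-select used here)
  SPI.setClockDivider(SPI_CLOCK_DIV16);   // keep the clock slow to be safe
}

void loop() {
  if (readWord() != PIXY_SYNC)            // hunt for the sync word
    return;

  // The six words after the sync describe one detected object block
  uint16_t checksum  = readWord();
  uint16_t signature = readWord();
  uint16_t x         = readWord();
  uint16_t y         = readWord();
  uint16_t width     = readWord();
  uint16_t height    = readWord();

  // Sanity check: the checksum should equal the sum of the five data words
  if (checksum == (uint16_t)(signature + x + y + width + height)) {
    Serial.print("sig ");  Serial.print(signature);
    Serial.print(" x ");   Serial.print(x);
    Serial.print(" y ");   Serial.print(y);
    Serial.print(" w ");   Serial.print(width);
    Serial.print(" h ");   Serial.println(height);
  }
}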

In the picture: the TV running the Propeller tv_text.spin object; PixyMon, the PC-based monitor/configuration tool for Pixy; and the USBee Suite running the logic analyzer. The data on the TV is the SPI data output from the Pixy. The green box is the object Pixy is 'trained' to report on.

NEXT STEPS: Do something with the data besides looking at it. The Zumo or the Sting Ray will likely be the target. Perhaps phase 3 of the Table Top challenge...

(Thanks to FIRST FRC Team 701 - Robovikes for the green ball, works great for tracking testing)

Sunday, November 17, 2013

New 'lab/workshop' space heat problem solved - Thanks Nick L.!

Thank you, Nick.

As the weather has been cooling off (not cold, given this isn't Iowa), I am finding my new work space to be less than comfortable; the space heater I already had doesn't cut it. Poking around the usual shopping web sites, I found a lot of conflicting information on how to solve this one.

Today, while at the home store, I had one item in hand and was waiting in line to check out. I changed my mind and went back to the heaters to look again. The guy behind me in line noticed and asked what I was trying to warm up. I briefly shared my research, and then he offered up the dual heater pictured to the left.

It turned out we had similar interests, so it was good to get to know him a little when I came over to pick up the unit. What a surprising afternoon!

Friday, November 15, 2013

It's Robot Season! Volunteers wanted, sign up now!

The FIRST (For Inspiration and Recognition of Science and Technology) 2013-2014 season is in full swing. I have committed to be the Sacramento Regional FRC Lead Robot Inspector for the second year. Unfortunately, due to work conflicts, I am not able to attend the Central Valley Regional this year.

Events going on right now are FLL (FIRST LEGO League) and FTC (FIRST Tech Challenge). Both are very exciting, and FLL covers the younger age range. There are lots of YouTube videos to check out - again, very exciting events!!

I have posted about the big high school robots in the past; that season starts in January 2014. However, many FRC teams are getting started now, using the off-season for team and skill building. I have been hanging out a little with Team 3257, the VorTECHS.

FIRST (For Inspiration and Recognition of Science and Technology) inspires young people to be science and technology leaders, by engaging them in exciting mentor-based programs that build science, engineering and technology skills, that inspire innovation, and that foster well-rounded life capabilities including self-confidence, communication and leadership.

Saturday, November 9, 2013

Zumo Update - 3 more non-contact object sensors added

Where does the time go? Phase 2 of the Sac Area Robotics table top challenge is completed!

To accomplish this I added more sensors to the Zumo platform, with the intent of using each one for a single purpose. Not great design, but it met the goal and increased my knowledge of different kinds of sensors - so it's a WIN. Also, I didn't want to modify the blade front of the bot.

Challenge 2 was to find a block, push it off the table, and not fall off yourself (building on the prior challenge). The neat part was seeing how others accomplished this. For example, my solution was brute force, another's was Cartesian based (he shared his code; I am still digesting it...), and another used a Lego Mindstorms.

Sensors added: Maxbotix MB1010 (sonar), Sharp GP2Y0D810 (IR), 2x QTI (IR). With the exception of laser or radar, I ended up with several of the common non-contact object detection devices.

Why so many sensors? The problem was time to get it working vs. the brute force strategy vs. the minimum-distance reporting limitations of each sensor. I didn't buy all of these sensors at once; it was more or less sensor creep - as I found the limitation of one sensor, I added another to compensate.

The Maxbotix provides sonar object detection with a reported minimum measurement of 6 inches, while providing distance readings from 6 to 200+ inches. I could find the block and track it provided I was more than 6 inches from it; closer than 6 inches, the block appeared to be missed (due in part to the angle I mounted it - above the block and angled down rather than straight on). If I had encoders and knew I was lined up straight on, I could find it and then drive to it. However, within those last 6 inches of travel, in testing the bot would veer left or right (depending on battery voltage) and miss the block.
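For reference, reading the MB1010 is simple; its PW output is spec'd at roughly 147 microseconds per inch. My read was actually done from the Propeller in Spin, but an Arduino-flavored sketch of the same idea (the pin number is just for illustration) looks something like this:

// Minimal range read from the MB1010's PW output, roughly 147 us per inch
// per the LV-MaxSonar datasheet.  Pin 7 is an arbitrary choice for this sketch.
const int SONAR_PW_PIN = 7;

void setup() {
  Serial.begin(9600);
  pinMode(SONAR_PW_PIN, INPUT);
}

void loop() {
  unsigned long pulseUs = pulseIn(SONAR_PW_PIN, HIGH);  // width of the range pulse
  unsigned long inches  = pulseUs / 147;                // scale to inches
  Serial.println(inches);       // anything closer than ~6 in still reads as about 6
  delay(50);
}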

The Sharp sensors had an effective range down to ~0.7 inches, reporting only true or false - no distance. When the Maxbotix reported 6 inches, the bot began tracking with the Sharp, adjusting the motors to keep the object in sight. Again, due to the mounting angle, the object detection couldn't tell me whether the bot had made contact with the object yet.

The QTI sensors were downward facing, just above the block. Once the bot touched the block, the QTI sensors would detect it. I considered contact sensors, but I didn't want to modify the front blade on the bot, so the non-contact QTIs worked well; also, at that point I already had QTIs on hand and didn't have a suitable switch to use as a whisker contact.

The end result was a series of Spin methods that invoked each sensor depending on how far the block was from the bot. This worked with about 85% (WAG) reliability. Issues came up when objects not on the table were detected - i.e., people.
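The hand-off logic boiled down to something like the sketch below. My actual code was a set of Spin methods on the Propeller, so every name here (readSonarInches, sharpSeesBlock, qtiSeesBlock, and the motor routines) is a hypothetical stand-in, and the 30-inch search threshold is made up for illustration.

// Use whichever sensor can still "see" the block at the current distance.
// Placeholder bodies are included so the sketch compiles on its own; on the
// bot these wrap the real pin reads and motor drivers.
unsigned int readSonarInches() { return 24; }    // Maxbotix MB1010, only valid at >= ~6 in
bool sharpSeesBlock()          { return false; } // Sharp GP2Y0D810: true/false at close range
bool qtiSeesBlock()            { return false; } // downward QTI pair: true at contact
void driveToward()  { /* motors forward, steering on whichever sensor is active */ }
void pushForward()  { /* keep pushing until the table-edge line is reached */ }
void searchTurn()   { /* rotate in place to sweep the sonar */ }

// One pass of the strategy, called repeatedly from the main loop.
void blockHuntStep() {
  if (qtiSeesBlock()) {
    pushForward();                     // touching the block: shove it off the edge
  } else if (sharpSeesBlock()) {
    driveToward();                     // inside the sonar's 6 in blind zone: steer on the IR detector
  } else if (readSonarInches() < 30) {
    driveToward();                     // sonar has a target: close the distance
  } else {
    searchTurn();                      // nothing in sight: keep scanning
  }
}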

I now have a variety of non-contact object detection sensors in my parts bin to play with in the future. Given that I found limitations while attempting to apply them, my future designs will perhaps take that into consideration...!?!


Thursday, November 7, 2013

VorTECHS Visit Parallax World Headquarters


Propeller ActivityBot vs. FRC Ultimate Ascent Bot
Parallax hosted a robotics team for an open house and demo of COOL STUFF! 

Established in 2011, the VorTECHS, Team 3257, is made up of students from three local (Rocklin/Roseville/Lincoln, CA) high schools.

Tonight was all about getting students and parents to start thinking about the exciting build season starting the first week of 2014. It's the "off season" for FRC (FIRST Robotics Competition). Generally teams are engaged in skill building and getting excited about planning for the upcoming intense 6 week build period. 

Learning solutions such as the Parallax ActivityBot, programmed in the C language, are excellent tools for teaching concepts that will later apply to the official FRC competition robot. Teams must be ready to build/program/solve challenges before the season starts - there isn't enough time to build these skills while designing and building a complex robot on a 6-week schedule!

Ken Gracey kicked off the evening by showing off the ActivityBot and talking about how exposure to technology and programs like FIRST will shape the students' futures and change their lives in unexpected, exciting ways. He had a few pictures from recent international Parallax trips, along with some of the latest Parallax products. The students and parents took a tour of the Parallax facility, even watching a CNC mill grind away on a part. (I jump at any opportunity to take the tour; I always see something new.) Finally, a newly built ELEV-Super was on hand as well for a night-time flight demo - always a crowd pleaser! Parallax has a great facility to show off and always knows how to present the stuff that hooks people on technology for life! Plus it's always exciting to hear about new products in the pipeline...

As always, Ken and the Parallax associates, thank you for being open and sharing the exciting work you all do!!

The rest of the evening included a presentation by the team's student president, who shared facts, pictures, and a prior-year video about the team. Several other impromptu student presentations followed, and things finally wrapped up with some of the business aspects covered by Mr. Toy, the team's teacher.

FIRST teams like this one are successful because of the countless hours put in by Students, Teachers, Mentors, Parents and community participants like Parallax.

Wednesday, July 31, 2013

Zumo - Reflectance Sensor modification (remapping)

This sensor is pretty slick. Pololu designed it to work out of the box with the Zumo Arduino shield. However, for my project I need access to the sensor IO lines without going through any of the Arduino analog (A) pins.

The sensor is not working post-modification. My concern is whether this is a multi-layer board, such that I may have inadvertently cut a trace that runs under a trace on the top layer. I suspect not, given the proximity of the vias.

When powering up the board, the red LEDs no longer light, and I have looked at the IR LEDs through two different cameras - seeing no IR light.

The enable terminal is in place; however, the jumper is removed. The red LEDs were functioning prior to my cutting the traces for remapping. In the picture I have soldered a resistor onto the IO line for each sensor.

I have been checking pin voltages and tracing everything. Going to post a question on the Pololu forum next.
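One more check I plan to try before posting: reading a single remapped channel directly. If I understand the Pololu QTR-style design correctly, each channel is read by charging the sensor node and timing its decay, so a quick Arduino-flavored test would look roughly like the sketch below. The pin numbers are placeholders for wherever the remapped lines actually land, and the timing constants are assumptions on my part.

const int LEDON_PIN  = 2;    // emitter enable line (jumper removed, so drive it directly)
const int SENSOR_PIN = 4;    // one remapped sensor channel

void setup() {
  Serial.begin(9600);
  pinMode(LEDON_PIN, OUTPUT);
  digitalWrite(LEDON_PIN, HIGH);         // IR emitters on -- the red LEDs should light too
}

void loop() {
  // Charge the sensor node...
  pinMode(SENSOR_PIN, OUTPUT);
  digitalWrite(SENSOR_PIN, HIGH);
  delayMicroseconds(10);

  // ...then float it and time the decay; a shorter time means more reflected IR.
  pinMode(SENSOR_PIN, INPUT);
  unsigned long start = micros();
  while (digitalRead(SENSOR_PIN) == HIGH && (micros() - start) < 3000) {
    // wait for the node to discharge (3 ms timeout)
  }
  Serial.println(micros() - start);      // pegged at the timeout = no IR getting back
  delay(100);
}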

Saturday, July 27, 2013

Windows 7 forcing legacy filename format - short filename NAS pain

I struggled for several hours trying to figure out why, all of a sudden, Windows 7 was not allowing me to use long file names on a NAS (Apple Time Capsule). I noticed it shortly after setting up a Windows 7 library.

In order to create a library from a NAS, the device must be indexed, so I enabled the Sync tool to create the index. This caused any new file or folder to be limited to the DOS 8.3 naming convention.

I attempted to resolve it based on a number of searches that were all over the board. The solution was to turn off and clear any offline files that were set to be sync'ed with the NAS. After a reboot, things were back to normal.

NOTE: I didn't have Apple Bonjour running at first, assuming something with SMB was causing the issue. When I set up the Apple AirPort Utility the problem appeared to resolve; however, it did NOT. What happened was that my SMB mapping to the NAS was set up to sync, while the new Bonjour mappings were not. I figured out the relationship to Windows 7 Sync when I created a library using the Bonjour mapping.

This is frustrating!! I have not done any additional research, so I don't know if this is a Sync issue, a Time Capsule issue, or something else. For now I have fully disabled the Windows sync features. I would like to use this feature, but after this ordeal I am burned out on it... Looking for advice on how to resolve it.