
Off-The-Shelf Hacker: What’s Your Vision?

18 Jul 2018 12:43pm

You have some interesting options if you’re putting “vision” into your next physical computing or robotic project. Exactly how much “vision” do you need? What do you expect your creation to actually see? How do you plan to use the data streaming in from your machine vision device?

This week I’ll cover several off-the-shelf vision solutions you might consider. They range from simple to very complex. Some are nothing more than analog sensors that hook to an input pin on a microcontroller. Others send comprehensive data over the serial line to your robot’s brain.

Start with Simple

The simplest vision sensor is a photocell, which measures the amount of light falling on its front surface. A photocell doesn't have much resolution and it certainly can't distinguish a human from an automobile. It would, however, definitely notice that your automobile has pulled into the driveway at night with its headlights on. Maybe that's all you need for your project. Hook up the photocell between an analog pin on your Arduino and the 5-volt line, then use a 10K ohm resistor from the analog pin to ground. Read the pin value with an Arduino or Python program and take some action. Adafruit has regular old photocells for $0.95.
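Here's a minimal Arduino sketch for that voltage-divider hookup. The pin choice and the brightness threshold are my assumptions, so tune them for your own photocell and lighting.

// Photocell between A0 and 5V, 10K ohm resistor from A0 to ground.
// The threshold is an assumption -- calibrate for your own setup.
const int SENSOR_PIN = A0;
const int HEADLIGHT_THRESHOLD = 600;  // 0-1023 from the 10-bit ADC

void setup() {
  Serial.begin(9600);
}

void loop() {
  int level = analogRead(SENSOR_PIN);  // brighter light = higher reading
  if (level > HEADLIGHT_THRESHOLD) {
    Serial.println("Bright light detected -- maybe headlights in the driveway");
  }
  delay(250);
}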

Line-following sensors fall under the photocell category. They use an infrared LED to shine light onto a line, then read the reflected value with a photocell. Since the LED and sensor sit in close proximity to the line, it's easy to tell when you are tracking and when you are not. Karlsson Robotics has a little line-following module for $2.95. The vision sensor itself is very limited and you have to do all the "video scene" analysis in your Arduino or Python programs.
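That analysis can be as simple as a threshold test. In this sketch the analog pin and threshold are assumptions, and whether a dark line reads high or low depends on your particular module, so check its datasheet.

// Line-follower module output on A1; reflected IR typically drops over a dark line.
// LINE_THRESHOLD is an assumption -- calibrate over your actual line and floor.
const int LINE_PIN = A1;
const int LINE_THRESHOLD = 500;

void setup() {
  Serial.begin(9600);
}

void loop() {
  bool onLine = analogRead(LINE_PIN) < LINE_THRESHOLD;
  Serial.println(onLine ? "on line" : "off line");
  delay(100);
}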

Get a Little More Sophisticated

The new Pixy 2 machine vision sensor is the next step up from the photocell, although it's a pretty big jump. Readers may recall my using the original Pixy sensor in the Steampunk Eyeball project. I could train the eyeball to track a certain color; then, if I wore a shirt with that color, it would faithfully track me as I moved around in the Pixy's field of view. I liked the Pixy because not only did it have dual (pan and tilt) servo controllers built into the board, it also had a pan-tilt program built into the firmware. You could just hook up your servos, teach it a color, then run the demo. I leveraged those capabilities to build a working two-axis robotic system with no programming.

The new Pixy is physically smaller, so putting it in a robotic skull or "eyeball" should be pretty straightforward, actually easier than with the original model. There is also a line-following program that can interpret a handful of road signs and line intersections. You use little road signs to give the Pixy 2 instructions. For example, you might place a turn-left sign just before an intersection of two lines. The Pixy 2 will then read the sign and, at the next intersection, follow the line to the left. The demo video shows some of these capabilities.

One big problem with machine vision sensors is lighting. Without enough light on your subject, or when trying to interpret an object that is backlit, most sensors I've worked with have trouble. The Pixy 2 now has several LEDs built into the board that augment the ambient lighting. That should certainly help with line following. I don't know if it will help with color blob recognition in a regular-sized room, though. As soon as I get my hands on a Pixy 2, I'll let you know if the new LEDs help with image recognition in real-world situations.

Keep in mind that the Pixy and Pixy 2 are both firmware-based devices with their vision algorithms built in. The website has information on how you might build your own algorithms and install them on the Pixy, although I don't have any experience with that particular feat. Location data is also streamed over the serial line to your host computer. So, if you don't want to use the onboard servo controllers, you could just send the Pixy data stream to your Arduino and have it move servos or control other physical devices. You can pick up the original Pixy (new) for about $70 on Amazon. Similarly, Amazon has the Pixy 2 for around $60.
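As a rough sketch of reading that data stream, here's what it might look like with the Pixy2 Arduino library's color connected components interface. It assumes you've already taught the camera a color signature; you'd swap the Serial prints for servo or motor commands.

// Reads color-block data from a Pixy 2 using the Pixy2 Arduino library (SPI by default).
// Assumes a color signature has been taught to the camera beforehand.
#include <Pixy2.h>

Pixy2 pixy;

void setup() {
  Serial.begin(115200);
  pixy.init();
}

void loop() {
  pixy.ccc.getBlocks();            // fetch the latest detected color blocks
  if (pixy.ccc.numBlocks > 0) {
    // Report the largest block's location in the camera's field of view
    Serial.print("x: ");
    Serial.print(pixy.ccc.blocks[0].m_x);
    Serial.print("  y: ");
    Serial.println(pixy.ccc.blocks[0].m_y);
  }
}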

The Smart-Vision Sensor

The most complex and capable vision device in my off-the-shelf inventory is the JeVois machine vision sensor. This little beast has a quad-core ARM processor running at around 1.3 GHz, runs a streamlined version of Linux and is roughly the size of four Lego blocks stuck together.

Since it runs Linux, you can write and compile your own artificial intelligence programs and upload them to the device. It also has a serial line that outputs location data for use by an Arduino or Raspberry Pi. If you aren't into training neural networks, you can use the sample programs instead. Examples include color blob recognition, "drive between the lines" capabilities and several neural networks that recognize about 1,000 common everyday items. I sat in the front window of a coffee shop and the JeVois camera correctly identified vehicles as "cars" as they sped by at 30 MPH. Location data, relative to the camera's field of view, simultaneously streamed out the serial line, while the video feed (via USB) superimposed analytical data on the camera's first-person view. The augmented-reality video stream can be viewed with any conventional webcam viewer, like Webcamoid or guvcview. The JeVois is quite a piece of equipment for a paltry $50.
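On the Arduino side, consuming that location data is mostly string parsing. This sketch assumes a board with a second hardware serial port (a Mega, say) and the "Normal" two-dimensional message format of "N2 id x y w h" — verify the exact format your JeVois module emits against the JeVois documentation before relying on it.

// Parses JeVois serial messages on an Arduino with a second hardware port (e.g. a Mega).
// Assumes "Normal" 2D messages of the form: N2 id x y w h -- check the JeVois docs.
void setup() {
  Serial.begin(115200);    // USB console for debugging
  Serial1.begin(115200);   // JeVois micro-serial port
}

void loop() {
  if (Serial1.available()) {
    String msg = Serial1.readStringUntil('\n');
    if (msg.startsWith("N2")) {
      char id[32];
      int x, y, w, h;
      if (sscanf(msg.c_str(), "N2 %31s %d %d %d %d", id, &x, &y, &w, &h) == 5) {
        Serial.print(id);
        Serial.print(" at x=");
        Serial.print(x);
        Serial.print(" y=");
        Serial.println(y);
      }
    }
  }
}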

The JeVois is pretty much the state of the art right now, in my opinion. Scene lighting is still a critical consideration, no matter what you are trying to see.

Over-the-Horizon Thoughts

Machine vision is coming along rather nicely. The Pixy was one of the first devices people could buy at a reasonable price that gave basic shape and color recognition, and now there's a new and improved version. The JeVois is pretty much a platform you use to bring artificial intelligence into your vision project. It's a fascinating device with an awful lot of science and human behavior inside.

We are just at the beginning of the new packaged-sensor revolution. As we've seen, devices like the Pixy and the JeVois are only going to get better. I'm eager to see the creative ways people use these gadgets to do new and wondrous things with physical computing.

