My latest project, Hedley, the robotic skull, has a JeVois smart machine vision sensor in his right eye socket. An ArUco marker on my lapel keeps me centered in his field of view by actuating a pan servo. He also tracks me using the “salient” algorithm, which is influenced by movement and light sources. He works great on the desk or on top of his tripod at a conference.
I’ve wanted to explore the JeVois sensor’s “road navigation” algorithm for quite a while, and wedging Hedley up on the dash of the car didn’t seem like a great idea. I figured it would be much easier to get familiar with some of the other JeVois algorithms on something more manageable. It also made sense to observe the behavior of the “road navigation” algorithm before the planned integration of the JeVois sensor into the Elegoo wheeled robot platform. We covered the Elegoo robot in a past article.
Sadly, Hedley had to lose an eye temporarily.
One good thing about being an off-the-shelf hacker is that since we fabricate our own gadgets, we’re also the experts at mixing and matching the various parts to create new functionality. Don’t worry, the sensor was installed with a removable mount, so making Hedley “see” again is trivial.
What we need is a portable JeVois smart sensor testing device.
Ginning up a Handheld Smart Vision Test Rig
It just so happens that the Steampunk Conference Badge has enough horsepower to display the augmented-reality video feed from the JeVois, through the luvcview program. You may recall that the badge has a Raspberry Pi 2 Model B single-board computer, a 3.5-inch color LCD touchscreen and an ultrasonic sensor attached to an Arduino Pro Mini. It runs on a 5-volt cell phone power brick and is basically a little desktop on my chest. I ran my tech-talk slides from the badge last July at OSCON. luvcview runs great on the tiny screen if the resolution stays below about 600×400. I rounded out the “chest-top” package with a little Rii thumb keyboard/mousepad.
Check out my five-minute mods to the badge.
I used parts of a “third hand” to attach the JeVois sensor to the top of the badge with a thick rubber band.
Additional rubber bands secured the big li-po power brick to the back of the badge. The JeVois needs quite a bit of current, so I also pirated the USB power outlet board I built for the portable Steampunk monitor, along with the big li-po battery. The board converts 12 volts from the battery to 5-volt USB power. The Pi plugged into one of the sockets, while the JeVois cable plugged into one of the USB ports on the Pi. The JeVois cable’s auxiliary pigtail then connected to the outlet board. The cables sort of stick out all over the place. Ah well, it’s not meant as a finished project.
It might be interesting to build a detachable bracket for the JeVois sensor on the top of the badge and then have some kind of LED interface that lights up when a “human” is in front of the badge. What else could we do with a JeVois attached to the conference badge?
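As a starting point for that LED idea, here is a minimal Python sketch of the decision logic. Everything in it is hypothetical: I’m assuming the JeVois is configured to send detection lines in its “N2 &lt;category&gt; &lt;x&gt; &lt;y&gt; &lt;w&gt; &lt;h&gt;” style over serial, which depends entirely on which module and serial style you set up, and the category name “person” is a placeholder.

```python
# Hypothetical sketch: decide whether to light an LED based on a
# JeVois serial detection line. The "N2 <category> <x> <y> <w> <h>"
# format is an assumption -- check your module's serial style/output
# configuration before relying on it.

def person_detected(line):
    """Return True if a serial line looks like a 'person' detection."""
    parts = line.strip().split()
    return len(parts) >= 2 and parts[0] == "N2" and parts[1] == "person"

if __name__ == "__main__":
    # On the badge, you would read lines from the JeVois serial-over-USB
    # port and toggle a GPIO pin (e.g. with RPi.GPIO) from this result.
    for raw in ["N2 person 0 0 300 500", "N2 cup 10 10 50 50"]:
        print(raw, "->", person_detected(raw))
```

On the real badge, the loop would read from the serial port and drive a GPIO pin instead of printing.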
Machine Vision Road Navigation
I originally hung the badge from the rearview mirror, set the JeVois sensor on the dash using the “third hand” with its base and drove slowly down my residential street. The HDMI output on the Pi connected to the 10.1-inch steampunk portable monitor on the seat. I used the 12-volt power socket on the dash to power the USB power outlet board. In addition to 5-volt USB power, the outlet board has a dedicated wired connector for the monitor.
The JeVois tracked the road pretty well up to about 25 miles per hour. Needless to say, I didn’t venture very far or very fast with the initial makeshift setup. After returning to the driveway, I reattached the sensor and the battery to the badge, making a handheld device for testing. Of course, now I’ll need an assistant to drive while I note the algorithm’s behavior at various speeds and road conditions. Always be safe and observe the rules of the road.
The JeVois sensor “road navigation” algorithm tracks the edges of a road with white lines and marks the vanishing point on the horizon with a green dot. A purple bar shows the center-line of the road. Augmented-reality data is also displayed across the bottom of the viewing window. Normally, location data also streams over the serial line so a robot car or other device can make decisions on where to steer. Take a look at this paper about the development of the algorithm by the JeVois creator, Dr. Itti, and his associates. Note that the paper was from 2012. Yikes!
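To give a feel for what a robot controller on the other end of that serial line might do, here is a minimal Python parsing sketch. It assumes the module sends one-dimensional tracking messages of the form “T1 &lt;x&gt;”, with x in the JeVois -1000..1000 standardized coordinate range; the actual format depends on how the module’s serial output is configured, so verify before wiring this to a motor.

```python
# Hedged sketch: turn JeVois serial lines into a steering offset.
# Assumes "T1 <x>" messages with x in -1000..1000 (JeVois standardized
# coordinates); check your module's actual serial output format.

def parse_t1(line):
    """Return the x coordinate from a 'T1 <x>' line, or None."""
    parts = line.strip().split()
    if len(parts) == 2 and parts[0] == "T1":
        try:
            return int(parts[1])
        except ValueError:
            return None
    return None

def steering_offset(x, max_coord=1000):
    """Normalize x to the range -1.0 (hard left) .. 1.0 (hard right)."""
    return max(-1.0, min(1.0, x / max_coord))

if __name__ == "__main__":
    # On a real robot, lines would come from the JeVois serial-over-USB
    # port (e.g. via pyserial at 115200 baud) instead of this list.
    for raw in ["T1 250", "T1 -1000", "garbage"]:
        x = parse_t1(raw)
        if x is not None:
            print(raw, "->", steering_offset(x))
```

The normalized offset could then feed a proportional steering loop on something like the Elegoo car.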
Running the road navigation algorithm on the handheld conference badge/test rig was pretty straightforward. I booted the badge up in LCD mode and when the desktop came up, I started a terminal and ran luvcview from the command line. luvcview is available in Raspbian Linux. The thumb keyboard/mousepad works a lot better than lugging the full-sized Logitech keyboard around.
rob% luvcview -f YUYV -s 320x256
After luvcview was running, I just stuffed the keyboard in my pocket. The JeVois sensor points toward the back of the badge, while the video feed appears on the front-mounted LCD screen. I took photos of the screen with my cell phone, because I ran out of hands to run the keyboard and do a screenshot.
Notice that the shadows on the right side skew the vanishing point a little. Good readings require that the horizon index row of dots align with the actual horizon.
The algorithm even worked later in the evening in a Target parking lot on some parking space lines. I haven’t played around with gain, contrast or other sensor settings yet. I’m sure there is room for optimum detection tweaking.
Over the next few weeks, I’ll “live” with the portable badge test rig, evaluating the “road navigation” and other algorithms embedded in the JeVois machine vision sensor. At some point, I’d like to attach the sensor to the Elegoo robot car and have it follow me around using either ArUco markers or road navigation.
Naturally, putting the JeVois on the conference badge has opened up a brand new line of consciousness into uses for that sort of off-the-shelf parts combination. Hey, what if we also add text-to-speech or a servo that points to the vanishing point (center of the road)? Could that be useful?
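That servo-pointer idea could start as nothing more than a mapping from the vanishing point’s x coordinate to a servo angle. A minimal sketch, assuming the JeVois -1000..1000 standardized x range and a common 0-180 degree hobby servo; the sweep limits here are placeholders you would tune to your mount.

```python
# Hedged sketch: map a JeVois vanishing-point x coordinate (assumed
# -1000..1000 standardized range) onto a hobby servo angle.
# min_angle/max_angle are placeholder sweep limits for illustration.

def vanishing_point_to_angle(x, min_angle=30, max_angle=150):
    """Map x in -1000..1000 to a servo angle, clamped to the sweep."""
    x = max(-1000, min(1000, x))          # clamp to the assumed range
    span = max_angle - min_angle
    return min_angle + (x + 1000) * span / 2000.0

if __name__ == "__main__":
    print(vanishing_point_to_angle(0))      # road center -> mid sweep
    print(vanishing_point_to_angle(-1000))  # far left -> minimum angle
```

On the badge, the resulting angle would go to a PWM-driven servo, much like the pan servo already tracking the ArUco marker in Hedley’s eye socket.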
And, that’s why I’m not worried about the robot revolution taking over the world. Hedley and his ilk are a long way from making the creative, inventive, one-off or “ah-ha” connections humans find easy.