
Off-The-Shelf Hacker: Could Your Project Use Some Artificial Intelligence?

14 Mar 2018 6:00am, by

My last project, Hedley the robotic skull, was a fun endeavor. He can turn his head and follow me as I walk around in his field of view. The JeVois smart vision camera, in the right eye socket, sends X-Y coordinates of my location to an Arduino in Hedley's noggin. The Arduino then directs a servo to rotate left and right. I used the "saliency" vision module in the JeVois camera for tracking control. It works pretty well.

In some ways, I’ve become stuck in the mindset of just using the JeVois for robotic sight. Certainly, as off-the-shelf hackers, we should always be on the lookout for extracurricular ideas to expand our interests and knowledge. While moving a robotic head is old hat, artificial intelligence (AI) built into the camera sensor is the new and awesome magic dust.

Surely, the magic dust along with its crazy new hardware can be used in other areas of physical computing. And there’s an even bigger question. How do we all remain relevant in this bold and possibly threatening world filled with AI?

By being off-the-shelf, free-associating hackers.

Tangential Deviation

AI just isn’t equipped to “invent” new ideas, in my opinion. I don’t see how it can ever model the leap from “existing” to “may be possible.” Lots of experts say that creativity is strictly human behavior. I agree.

Remember the PIR sensor used with the ESP8266 project? The device crudely reports when a warm body, like a human (or a stray cat), moves through its field of view. Floodlights turn on when someone enters my backyard, thanks to that practically ancient, 20-year-old PIR technology.

Why not use the JeVois and some of its AI algorithms in place of a PIR sensor?

Surely you appreciated my leap? Exactly my point.

It might make sense because the JeVois represents a new breed of smart sensor built on an ARM chip and Linux. The device is a wholly self-contained physical computing system that appears, to a host Linux notebook, like a standard USB webcam. The compiled vision algorithms reside on the micro-SD card in the JeVois sensor, and you select among them by switching resolutions in a camera viewing application like guvcview. Another machine can also interact with the JeVois over a serial line; that's how Hedley's Arduino works with the sensor, with text messages streaming across the serial interface in both directions. Over that same link you can change which algorithm to use, alter sensor parameters and thresholds, and perform a whole host of other functions.
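To get a taste of that serial conversation, here's a minimal sketch that listens to the JeVois from a Linux notebook. It assumes the pyserial library, that the camera enumerates as /dev/ttyACM0 (yours may differ), and that the running module emits simple "T2 x y" style tracking lines; the exact message format depends on the module and its serial-style settings, so treat the parsing as illustrative.

# Minimal sketch: read tracking messages from the JeVois over its serial-over-USB
# port. The device path and the "T2 x y" message style are assumptions; adjust
# them to match your setup and the module you have loaded.
import serial

jevois = serial.Serial('/dev/ttyACM0', 115200, timeout=1)

while True:
    line = jevois.readline().decode('utf-8', errors='ignore').strip()
    if not line:
        continue                      # nothing reported this cycle
    parts = line.split()
    if parts[0] == 'T2' and len(parts) >= 3:
        x, y = int(parts[1]), int(parts[2])
        print('target at', x, y)      # hand these off to a servo, MQTT, etc.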

And, at $50 each, outfitting even a constellation of sensors is practically affordable.

An Example

We could hook the JeVois sensor to an ESP8266 and send sensor status messages, via MQTT, to a central security server in my office. Then, we’d use some as-yet-unknown AI program to analyze the results over the last three days to figure out if there might be a creeper planning to break into my house.
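On the office end, the "central security server" could start out as nothing more than a script that subscribes to the sensor topic and logs whatever arrives for later analysis. Here's a rough sketch assuming the paho-mqtt Python library; the broker hostname, topic name and log file are made-up placeholders for illustration.

# Rough sketch of the office-side listener: subscribe to status messages the
# ESP8266 relays from the JeVois and append them to a log for later analysis.
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    client.subscribe('backyard/jevois/status')   # hypothetical topic name

def on_message(client, userdata, msg):
    with open('sensor-log.txt', 'a') as log:
        log.write(msg.topic + ' ' + msg.payload.decode() + '\n')

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect('security-server.local', 1883)    # hypothetical broker address
client.loop_forever()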

An interesting JeVois sensor module appropriate for this job might be the Surprise Recorder. This algorithm waits around silently for something to change in its field of view. When a bird, a human or an alien craft flies into sight, the JeVois records a few seconds of video on the micro-SD card.
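To get a feel for the idea without the JeVois hardware, here's a crude desktop analog of that "wait quietly until something changes" behavior. This is not the Surprise Recorder module's actual code, just simple OpenCV frame differencing on whatever webcam Python can open, with a threshold you'd have to tune for your scene and lighting.

# Crude analog of the Surprise Recorder idea: flag the moment the scene changes
# by more than a threshold. The JeVois module would save a short video clip here.
import cv2

cap = cv2.VideoCapture(0)             # 0 = first webcam on the system
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

THRESHOLD = 500000                    # arbitrary starting point; tune to taste

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)
    if int(diff.sum()) > THRESHOLD:
        print('surprise! something moved')
    prev_gray = gray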

Obviously, we'd want to either ship the short videos off for analysis, so we could distinguish between a human malcontent and a squirrel looking for acorns, or crunch the data locally and draw a conclusion on the spot. I'm not sure yet how you'd do that over ESP8266 networking. Keep in mind that a PIR sensor may or may not pick up a drone flying at window level in the side yard. A vision-based sensor might.

Come to think of it, a module that might be suited for detecting drone intruders, as well as some goofball looking for trouble, is the Optical Flow algorithm. This module detects the direction of movement of objects in front of the camera. It runs at 177×288 resolution in gray-scale. The view window is split into two panes, one above the other: the top pane shows horizontal movement, while the bottom one tracks vertical movement. Passing your hand in front of the camera from right to left produces a moving black image, while moving your hand from left to right creates a moving white image. Similarly, moving your hand from the bottom to the top registers a white image.
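For comparison, here's a rough desktop analog of that behavior, again not the JeVois module itself: OpenCV's dense Farneback optical flow, boiled down to "mostly horizontal" or "mostly vertical" motion so it mirrors the module's two-pane split. The 0.5-pixel cutoff is just a guess to ignore noise.

# Rough optical-flow direction detector using OpenCV's Farneback algorithm.
import cv2

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    dx = float(flow[..., 0].mean())   # average horizontal motion
    dy = float(flow[..., 1].mean())   # average vertical motion
    if abs(dx) > abs(dy) and abs(dx) > 0.5:
        print('moving right' if dx > 0 else 'moving left')
    elif abs(dy) > 0.5:
        print('moving down' if dy > 0 else 'moving up')
    prev_gray = gray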

What will we do with all this data? How long until our smart sensors can tell one individual from another? What if our smart sensor can see in ultraviolet? Or, infra-red? Does it make sense to combine conventional sensors like ultra-sonic rangefinders, Lidar, radar or PIR sensors with a JeVois type camera to enhance our situational awareness? What about integrating sound? Or, pressure sensors in the ground?

We don't know yet; the smart sensor tech is just that new.

The New Calculus of Leaping Into AI

I'm really vigilant when sitting in a lonely parking lot at 9:30 PM, waiting for my wife while she shops at Target. Boredom sets in after about 15 minutes of walking around the store, so I usually retreat to the car to read. Getting engrossed in one of my favorite night-time conspiracy-theory radio programs leaves me vulnerable, too.

Maybe I need a smart sensor that sits on top of the car and watches over me while I focus on my copy of "Never Eat Alone" or "Creativity: The Psychology of Discovery and Invention."

Police officers might benefit from such a device as well.

Figuring out what to do with the data from all these new smart sensors, how to integrate them into useful products, and how to take appropriate action on the data they produce is the new job for contemporary physical computing off-the-shelf hackers.

Just remember, we builders will always have that very important intangible edge over our creations, no matter how smart they get.

Be free and leap often.

