I’ve wanted to start a new project and came across a plastic skull head at Target the other day. It’s about 8 inches tall and 6 inches around. The jaw is movable and it’s hollow inside. A robotic skull would fit perfectly into my ever-expanding program of one-off steampunk physical computing prototypes.
Building a robotic skull is also a great excuse to start working with the new JeVois machine vision system, which we summarized a couple of weeks ago.
Today, I’ll lay out my thoughts for this skull. There’s a lot we can do with that space between the ears, so to speak. Certainly there will have to be interesting movements, along with some ghoulish effects like pulsing LED lighting. I think it will be fun to use the skull in upcoming conference tech talks. Not to mention Halloween.
Evolving into the Future
Readers may recall The Steampunk Eyeball. Its purpose was to demo current off-the-shelf tech by tracking me as I moved around in its field of view. While the Eyeball’s Pixy camera color-blob recognition was revolutionary a couple of years ago, it had serious limitations. Lighting was critical, which was hard to deal with in everyday conference session situations. And, although some of my robot club friends had the device, it never really took off in the larger industry community. That meant there weren’t very many firmware updates or sets of new capabilities.
Will the JeVois device be better? I don’t know. In my line of work, I have to give it a shot.
The good news is that the camera feeds a quad-core ARM nano-Linux machine that is dedicated to image processing. My preliminary testing shows that in addition to basic color blob identification, the camera system is capable of recognizing objects, humans, QR codes and AR markers.
For example, it identified our Maltese, Sequin, as a toy poodle. I’d say that’s close enough for our purposes.
It can also detect movement in its field of view and outputs relevant information over a serial port, separate from the USB stream. The combination of data streams will be used by other processors and devices to do cool things.
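On the Pi side, consuming that serial stream mostly means parsing short text messages. Here’s a minimal sketch of what that could look like. The “T2 x y” message shape follows the JeVois terse 2D-target style, but the exact format depends on which vision module is loaded, so treat this parser as an assumption to check against the camera’s documentation; actually reading the port would use something like pyserial, which isn’t shown.

```python
# Sketch: parse a JeVois-style serial tracking message into a dict.
# The "T2 x y" format is assumed from the JeVois terse 2D message style;
# verify against the module you actually run on the camera.

def parse_jevois(line):
    """Parse a terse 2D target message like 'T2 276 -43' into x/y coordinates."""
    parts = line.strip().split()
    if len(parts) == 3 and parts[0] == "T2":
        return {"x": int(parts[1]), "y": int(parts[2])}
    return None  # quietly ignore messages we don't understand

print(parse_jevois("T2 276 -43"))   # {'x': 276, 'y': -43}
print(parse_jevois("OK"))           # None
```

Once messages are dicts like this, the rest of the skull’s brain can act on target coordinates without caring how the camera sent them.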
What Should the Skull Do?
The skull will perform the same basic role as the Eyeball. Namely:
- Attract attention and entertain people during live performances.
- Act as a real-world demo for the physical computing stack topics I cover.
- Motivate audience members to want to build their own physical computing gadgets.
Additional possibilities include:
- Interact with the audience (and me) through gestures, movement, talking (moving its jaw) and so on.
- Reliably track me as I move around the room.
- “Listen” for commands and take appropriate actions.
- Put on its own little show, as it “comes alive.”
- Control other, as-yet-undefined gadgets that I might use during a show.
And, of course, as carnival barkers everywhere want:
- Make me look good.
That’s just the start. Off-The-Shelf Hackers know that new use cases and behaviors will certainly present themselves as we get into the flow of the project. This is great because once you have a working hardware and software platform, many machine behaviors can be changed through software and firmware mods.
One thing I do early in a project is to make lists. I just dump all the thoughts and possibilities into a text document, so I don’t forget stuff.
For example, here’s a list of available tech we might incorporate into the skull, based on a few past topics we’ve covered on Off-The-Shelf Hacker columns:
- Head panning (right and left) with servos.
- Head tilt (up and down) with servos.
- Movable jaw with a servo.
- Elevate skull out of a box using a motor/driver board.
- LED eyes.
- Pan/tilt eyeballs with a servo.
- Raspberry Pi for a brain.
- Speech synthesis using Python on the Pi.
- Alexa enabled using the Pi.
- Arduino for analog inputs and PWM outputs.
- Network enabled either with built-in WiFi or an auxiliary ESP8266.
- MQTT enabled using the Pi.
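Tying the last item to the rest of the list: MQTT gives every subsystem on that list a common way to talk. A minimal sketch of how the Pi might shape skull commands for the broker follows; the `skull/...` topic naming and JSON payload are my own conventions, not from the project, and the actual publish call (commented out) would use the paho-mqtt client against a running broker.

```python
# Sketch: build an MQTT topic and payload for a skull subsystem command.
# Topic layout ("skull/<subsystem>/<action>") is an assumed convention.
import json

def make_skull_message(subsystem, action, value):
    """Return a (topic, JSON payload) pair for a skull command."""
    topic = f"skull/{subsystem}/{action}"
    payload = json.dumps({"value": value})
    return topic, payload

topic, payload = make_skull_message("jaw", "open", 25)
print(topic)     # skull/jaw/open
print(payload)   # {"value": 25}

# Publishing would look roughly like this (requires paho-mqtt and a broker):
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.connect("localhost", 1883)
#   client.publish(topic, payload)
```

The appeal of this layout is that the Arduino, the ESP8266 and any future gadgets can all subscribe to the same topics without knowing about each other.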
My wife made fun of me as I carried the JeVois camera and the skull around for a couple of days. It’s part of my creative process. Use what works to get the ideas flowing, make connections, write the thoughts down and then go see if it works.
One Small Step for Robot-Kind
I’ve already modded the skull.
We’ll need to start putting things inside, so I cut the top of the skull off with a Dremel and hinged it at the back. I also secured the front edge back in place with a couple of small leather straps and some brass screws.
The camera will be mounted in the right eye socket, making it easily adjustable and removable.
The jaw will need some kind of pivot actuated by a servo. Maybe I’ll graft a brass bracket onto the upper part of the jawbone that attaches to the pivot rod. That would be pretty Steampunk.
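Driving that jaw servo boils down to mapping an opening angle to a pulse width. Here’s a small sketch of that mapping; the 0–30 degree jaw range and the 1000–2000 microsecond endpoints are assumptions that would need calibrating against the real servo and linkage, and on the Pi a library such as pigpio would actually emit the pulses.

```python
# Sketch: map a jaw-open angle to a hobby-servo pulse width (microseconds).
# Range endpoints below are assumed; calibrate on the actual hardware.

JAW_MIN_DEG, JAW_MAX_DEG = 0, 30          # closed .. fully open (assumed)
PULSE_MIN_US, PULSE_MAX_US = 1000, 2000   # typical hobby-servo endpoints

def jaw_pulse_us(angle_deg):
    """Linearly map a jaw angle in degrees to a servo pulse width."""
    angle_deg = max(JAW_MIN_DEG, min(JAW_MAX_DEG, angle_deg))  # clamp to range
    span = (angle_deg - JAW_MIN_DEG) / (JAW_MAX_DEG - JAW_MIN_DEG)
    return round(PULSE_MIN_US + span * (PULSE_MAX_US - PULSE_MIN_US))

print(jaw_pulse_us(0))    # 1000 (jaw closed)
print(jaw_pulse_us(15))   # 1500 (half open)
print(jaw_pulse_us(45))   # 2000 (clamped to fully open)
```

Keeping the angle-to-pulse math in one small function makes it easy to re-tune once the brass bracket and pivot rod are in place.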
Plans for panning the skull left and right are in the works. Ball bearings will likely find their way into the design, much like in the Eyeball.
On tap for today is mounting the external antenna for an ESP8266 and building a bracket to mount the camera in the eye socket. Once I get some hardware in or on the skull, other parts of the project will start to fall into place.
Dream up your ideas, then go build it in the shop. Cut, mod, test, rinse and repeat.
That’s what we do as Off-The-Shelf Hackers.