As the author of the Off-The-Shelf Hacker column, part of my job is to try seemingly goofy ideas and pass along the experience to readers. Not only does this experiential prototyping help keep you from going down unproductive rabbit holes, it just might jump-start your creativity from a different perspective. In the rough and tumble world of inventing new products and services, a steady stream of fresh ideas and combinations is mandatory.
That said, for my ongoing project, Hedley the robotic skull, I want to explore the idea of robotic telepresence.
Why Robotic Skull Telepresence?
How exciting would a Raspberry Pi board, simply wired to a JeVois machine vision sensor and a servo and sitting on a table, be to the average teacher, passerby or child? Maybe not much. Ho, hum… meh!
Put those off-the-shelf parts into some kind of context, like a plastic Halloween skull that responds to its environment, and people actually start thinking it has a personality. A friend, upon seeing the Steampunk Eyeball track my daughter around the room, excitedly exclaimed, “It likes Margaret.”
So, hook Hedley up to a network and he may really get interesting.
You could order your robotic minion to do your bidding from across the room. He can feed data to other physical computing devices. We’ll keep that one in mind for future articles. He might also tap into information up in the cloud or leverage the services of high-powered artificial intelligence servers (think Amazon Alexa). And, if Hedley doesn’t happen to be hooked up to a monitor, you can still execute commands remotely over the network.
In other words, networking the skull opens up lots of crazy possibilities for control, entertainment, automation… and personality.
Start with the Basics
Linux has a bunch of built-in tools that make networking easy. Both Raspbian (on the Pi 3) and Xubuntu (on my old war-horse ASUS notebook) have networking, wired and wireless, enabled by default. Click on the little radiation symbol in the tool tray, choose an access point and enter the wireless password to get connected to your network on both machines.
One command I use all the time is ssh (short for secure shell). It’s a Linux command-line program that establishes a secure, encrypted connection to a remote Linux machine. The following line, from my Linux notebook, is a typical ssh command that I use to log into the skull’s remote Raspberry Pi 3.
rob% ssh -X pi@192.168.1.101
Here, “pi” is the remote user on the machine at the local network IP address of 192.168.1.101. In this case, that address corresponds to the Raspberry Pi 3 board in the skull. Obviously, the Pi needs to be up and running and connected to the network. We’re assuming that the WiFi radio in the Pi 3 is also linked to the address of 192.168.1.101. You could also plug an Ethernet cable into the Pi, which links that interface to its own associated IP address. Once you are logged into the Pi remotely from the Linux notebook, you can use ifconfig to find out which networking interface is hooked up to which IP address. I happened to already know that the WiFi radio links to the .101 address.
The topmost entry in the list is usually for the wired interface, which has an IP address of 192.168.1.110. Wireless interfaces usually start with “wlan.” Since there is only one WiFi radio on this Pi 3, the interface name is wlan0. It has the address of 192.168.1.101. You can see that data has flowed over these interfaces by the RX and TX packet lines.
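If you only want one address out of all that ifconfig output, a little awk does the trick. The canned sample below is an assumption, built to match the wlan0 address described above; on the Pi you would pipe the real command instead, as shown in the comment.

```shell
# Pull just the IPv4 address out of ifconfig-style output for one interface.
# On the actual Pi, you'd run:  ifconfig wlan0 | awk '/inet /{print $2}'
# The sample text here stands in for that output so the pipeline is visible.
ifconfig_sample='wlan0: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 192.168.1.101  netmask 255.255.255.0  broadcast 192.168.1.255'

# Match the line containing "inet " and print its second field, the address.
echo "$ifconfig_sample" | awk '/inet /{print $2}'
# Prints: 192.168.1.101
```

The same one-liner works for eth0 or any other interface; just change the interface name handed to ifconfig.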
You might have also noticed that mysterious “-X” option.
This part is really cool. The -X option allows you to sit on your notebook and actually see the user interface of a command or application that runs on the remote Raspberry Pi.
For example, once I’m logged into the Pi from my notebook, I can start up any command-line program and it will spit out any resultant text into the terminal. This is particularly useful for working with “headless” machines. Headless simply means that there’s no monitor connected. You can do a lot of things with a headless machine.
With the -X option, most any remote graphical application will display its interface on your local machine.
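A quick way to confirm that the forwarding actually took is to look at the DISPLAY environment variable once you’re logged in. When -X works, the remote sshd points DISPLAY at a proxy display, typically something like “localhost:10.0”; the exact display number is an assumption and varies per session.

```shell
# Run this in the remote ssh session to see where X windows will go.
# With ssh -X active, sshd sets DISPLAY to a forwarded proxy display.
case "$DISPLAY" in
  localhost:*) echo "X11 forwarding is active on $DISPLAY" ;;
  "")          echo "no display at all -- plain headless session" ;;
  *)           echo "local display: $DISPLAY" ;;
esac
```

If you land in the “no display at all” branch, graphical programs will refuse to start; check that you remembered -X on the notebook side.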
Although not that practical, I can easily run LibreOffice on the Pi and watch the user interface while sitting at my notebook. I’ve also run the Arduino IDE remotely and can even program those boards without being next to the Pi. Hedley right now has a Pi 3, the JeVois sensor (Linux on an ARM processor) and an Arduino. Being able to change the firmware on the Arduino remotely will certainly be useful for process development as the skull project progresses.
Back to Telepresence
Another thing I’ve run remotely is the guvcview program to control the JeVois smart machine vision sensor. This week I positioned Hedley on my front porch table, facing my front yard. I wanted to see what the JeVois saw while I was sitting in my office.
Getting it going amounted to powering up the systems on the skull, then logging into the Pi from the notebook. Typing “guvcview” at the command line brought the various windows up on the notebook. After a short time, the video from the JeVois camera came up as well.
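The two steps can also be collapsed into one line from the notebook, using the pi user and .101 address from earlier; the session ends when the guvcview window is closed.

```shell
# Log into the Pi with X11 forwarding and start guvcview in one shot.
# The guvcview windows appear on the notebook's desktop.
ssh -X pi@192.168.1.101 guvcview
```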
Now I have to warn you, using guvcview remotely definitely won’t set any speed records. There is considerable latency and lag in the images due to the amount of video data being sent over the network.
Where you might easily get 60 frames per second of actual video with a monitor hooked up to the Pi 3 on the skull, you’ll get five frames per second, or even less than one, over ssh, depending on the video resolution and network bandwidth. Check out this screenshot of guvcview running on the Pi 3 hooked up to the monitor. We’re managing almost 47 frames per second.
In contrast, here’s guvcview running remotely (over ssh with the -X option) on the Linux notebook. Note that I was using the wired Ethernet connection for both machines. It managed an awe-inspiring three-plus frames per second.
It’s a bit clunky, although it does work. Keep in mind that the JeVois is still running at 60 or 100 frames per second. It’s just the “display” that is very slow when using guvcview over an ssh -X connection.
The point of the whole exercise, as I develop the skull, is being able to verify the kinds of things the JeVois is analyzing and recognizing. In this particular case, it’s a development/diagnostic function, not necessarily a real-time human user interface job. There will also be times when the skull won’t have a monitor or projector hooked up and I’ll need to have full control from my notebook or even the Pi-powered steampunk conference badge.
How’s that for some convoluted, goofy ideas? I call it development.
Is Hedley’s Telepresence Personality Useful?
OK, so running guvcview remotely on the skull is mildly interesting. The value of this exploration might not be readily apparent. This is the pioneering time for physical computing technology. Techies are struggling to find practical problems for all these cool off-the-shelf technical solutions. Consultants are gazing into their crystal balls and professing the awesome future of the Internet of Things, artificial intelligence, neural networks and such. A few brave companies are hammering away on actually producing interesting, maybe even practical new gadgets. Nobody really has a handle on it all just yet.
That’s OK because readers can choose to get in on the fun and create their own future. It’s fairly inexpensive and just takes dedication to learning, a bit of courage to step into the unknown and the follow-through to explore crazy ideas. Who knows where it will lead.
So, now I’m off to see if using VLC will speed up the video a bit when streaming output from the JeVois sensor, through the Pi 3, over the network, to my remote Linux notebook.
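A rough sketch of what that VLC experiment might look like, with untested assumptions throughout: that the JeVois shows up on the Pi as /dev/video0, that port 8554 is free, and that the stream name “jevois” is just a label I made up. cvlc is VLC’s no-GUI command-line front end.

```shell
# On the Pi: grab the camera and serve it as an RTSP stream on port 8554.
# /dev/video0 and the port number are assumptions; adjust to your setup.
cvlc v4l2:///dev/video0 --sout '#rtp{sdp=rtsp://:8554/jevois}'

# On the notebook: play the stream from the Pi's WiFi address.
vlc rtsp://192.168.1.101:8554/jevois
```

Whether this beats the 3 fps of ssh -X remains to be seen; adding a transcode stage to the --sout chain is another knob to turn if the raw stream chokes the network.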
Is that useful? Could be. We’ll find out as Hedley’s telepresence and personality grows.