As robots of all sizes and stripes become more prevalent in our factories and offices, it becomes vital that humans find an easy and intuitive way to control them, without having to undergo special training to do so. While we may someday be able to control robots telepathically using non-invasive brain-computer interfaces, those solutions still require preparatory training (in addition to donning awkward-looking swim-cap headgear).
A more likely candidate for intuitive robot control is the ordinary smartphone or tablet, equipped with an easy-to-use application. That's exactly what Jared Alan Frank, a mechanical engineering Ph.D. candidate at New York University's Tandon School of Engineering, has developed: an app that uses augmented reality (AR) to let users tell robots where to go and what to do. Watch how it's done in this video from IEEE Spectrum:
Manipulating Virtual Objects
Like other augmented reality applications, Frank's app uses the camera on a smart device to "capture" a scene. It then overlays markers on designated "virtual objects" that can then be spatially manipulated within the app using gestures. Taps, swipes, and finger-drawn lines on the smart device's screen translate into corresponding movements or actions by the robots in the real world.
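To illustrate the general idea (this is a sketch, not Frank's actual implementation), a finger-drawn line arrives as a dense stream of touch samples, which might be thinned into a short list of waypoints for a robot to follow. The function name and down-sampling strategy here are assumptions for illustration:

```python
# Hypothetical sketch: turning a finger-drawn stroke on the touchscreen
# into a short list of waypoints a robot could follow. The name and the
# even-spacing strategy are illustrative, not taken from Frank's app.

def stroke_to_waypoints(stroke, max_waypoints=5):
    """Pick evenly spaced points from a dense touch stroke.

    stroke: list of (x, y) screen coordinates sampled as the finger moves.
    Returns up to max_waypoints points, always including both endpoints.
    """
    if len(stroke) <= max_waypoints:
        return list(stroke)
    step = (len(stroke) - 1) / (max_waypoints - 1)
    return [stroke[round(i * step)] for i in range(max_waypoints)]

# A swipe recorded as nine touch samples becomes five waypoints:
stroke = [(0, 0), (10, 5), (20, 10), (30, 20), (40, 30),
          (50, 45), (60, 60), (70, 80), (80, 100)]
print(stroke_to_waypoints(stroke))
# → [(0, 0), (20, 10), (40, 30), (60, 60), (80, 100)]
```

A real app would also smooth the stroke and reject jitter, but the core idea is the same: the gesture is reduced to a path the robot controller can consume.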
Frank used Apple's software development platform, Xcode, to create a virtual grid with a coordinate system. Virtual objects defined by the user are placed at these virtual coordinates, and visual tags called fiducial markers are placed on whatever the user wants to control within this virtual space, whether that's a robot or another item that needs to be moved. The smart device's built-in sensors — its accelerometers, gyroscopes, and magnetometers — are also brought into play when establishing the virtual scene.
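As a simplified sketch of how the virtual grid might relate to what the camera sees: a real AR pipeline would estimate a full homography from the detected fiducial markers, but assuming an idealized straight-down camera view, mapping a tapped pixel into grid coordinates reduces to a scale. Everything here (names, the linear mapping) is an assumption for illustration:

```python
# Hypothetical sketch of mapping a tapped screen pixel into the app's
# virtual coordinate grid. A real system would compute a homography
# from the fiducial markers; this assumes an idealized top-down view
# where the mapping is a simple rescaling.

def pixel_to_grid(px, py, image_size, grid_size):
    """Convert a screen pixel (px, py) to virtual grid coordinates.

    image_size: (width, height) of the camera image in pixels.
    grid_size:  (cols, rows) of the virtual coordinate grid.
    """
    width, height = image_size
    cols, rows = grid_size
    return (px * cols / width, py * rows / height)

# Tapping the center of a 640x480 camera image on a 10x10 grid:
print(pixel_to_grid(320, 240, (640, 480), (10, 10)))  # → (5.0, 5.0)
```

The fiducial markers are what anchor this mapping in practice: by detecting a tag of known size in the image, the app can recover where each robot sits in the grid and how the camera is oriented relative to the scene.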
With the virtual stage set up, the scene is captured using the device's camera. The user can then issue commands from the smart device by manipulating the scene's virtual objects. These instructions are relayed over WiFi to the robots, each equipped with a Raspberry Pi board as the primary controller that processes the commands.
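The article doesn't describe the wire format, but a command relayed over WiFi to a Raspberry Pi controller could be as simple as a small JSON message. The field names and structure below are purely illustrative assumptions:

```python
import json

# Hypothetical sketch of the kind of message the app might send over
# WiFi to a Raspberry Pi-equipped robot. The field names and message
# shape are assumptions; the article only says commands are relayed
# via WiFi to a Raspberry Pi controller.

def make_move_command(robot_id, waypoints):
    """Serialize a 'move' command as a JSON string for one robot."""
    return json.dumps({
        "robot": robot_id,
        "action": "move",
        "waypoints": waypoints,
    })

def parse_command(message):
    """What the Raspberry Pi side might do on receipt."""
    cmd = json.loads(message)
    return cmd["robot"], cmd["action"], cmd["waypoints"]

msg = make_move_command("robot-1", [[5.0, 5.0], [8.0, 2.0]])
print(parse_command(msg))  # → ('robot-1', 'move', [[5.0, 5.0], [8.0, 2.0]])
```

In a deployment, each message would travel over a plain TCP or UDP socket on the local network, and the Pi would turn the waypoint list into motor commands.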
The main advantage of this system is that it doesn’t need special equipment. “Unlike the methods that are conventionally used to interact with sophisticated teams of robots, our approach does not require purchasing or installing any additional hardware or software and does not require the interaction to be in a traditional laboratory environment,” Frank told Next Reality. “This is because our app relieves the dependence on laboratory-grade and industrial-grade equipment and instead leverages the capabilities of the mobile device in tracking and controlling the robots.”
This means that one could potentially just take out a smart device with the app installed, snap a scene, and conveniently begin controlling a group of robots connected to the system. A tool like this would be far more mobile than conventional setups and could have huge implications for integrating robots into everyday life, and into many industries.
"Enabling everyday people to control a small swarm of robots is of great practical value since the list of applications in which people and groups of robots may need to interact is expected to grow steadily (e.g., in education and training, construction and manufacturing, and recreation)," explained Frank. The aim now is to further refine the app while keeping it easy and intuitive to use, so that it can be tested on construction sites and on the factory floor.
It makes sense to take this approach, which would democratize access to robots and simplify their use. After all, we already depend on our smart devices for a variety of tasks — navigating the streets, finding a good restaurant, or scanning QR codes to get more information on something — and the list goes on. Now, imagine being able to command robots with yet another easy-to-use smartphone app. It's an appealing idea, no doubt.