
Drones Make a Three-Dimensional UI for Programmable Matter

Researchers from the Human Media Lab of Queen's University in Canada are developing a platform that uses a swarm of cube-shaped nanocopters that act as a lattice of "interactive, touchable 3D graphics voxels."
Nov 1st, 2018 11:00am by

Computer user interfaces have evolved tremendously over the last few decades, progressing from the humble punch card through early command-line interfaces to today’s easy-to-use graphical user interfaces. There are augmented reality (AR) and virtual reality (VR) interfaces as well, but by and large, most interfaces available today are two-dimensional and aren’t necessarily all that interactive or intuitive. But what if there were a way to create an interactive, three-dimensional user interface that users could control with gestures, using bits of programmable matter that can be haptically manipulated in real time and in real space?

This intriguing idea isn’t new, and researchers from the Human Media Lab of Queen’s University in Canada are developing a platform that uses a swarm of cube-shaped nanocopters that act as a lattice of “interactive, touchable 3D graphics voxels.” Dubbed GridDrones, the system allows direct user interaction through a set of hand movements, permitting people to physically sculpt three-dimensional forms with these drone “voxels” — the three-dimensional equivalent of pixels. The idea is to give users the ability to create unsupported structures like arches, NURBS (non-uniform rational B-splines) or even 3D animations that can be wrangled around freely in three dimensions, rather than on a flat screen. Watch this demonstration of the research:

Self-Levitating Voxels

This new study builds upon BitDrones, the lab’s previous research into interactive 3D displays. To tackle some of the limitations of that earlier work, the GridDrones system uses smaller drones and a better communication system, while also employing a lattice-like model that can deform and undergo spatial transformations in three dimensions.

“Unlike 3D printing materials, GridDrones do not require structural support as each element self-levitates to overcome gravity. Unlike 3D prints, the system is bi-directional: you can change the ‘print’ simply by picking up the pixels and re-arranging them,” said Human Media Lab director, study co-author and professor Roel Vertegaal. “This is an important first step towards robotic systems that render graphics as physical reality, rather than as just light. This means users will have a fully immersive experience without a head-mounted display, one that provides haptics for free.”

The team used 15 nano quadcopters that were able to maintain relative position to one another through the use of a Vicon motion capture system. Each nanocopter represents a physical building block that can be interacted with, through three different kinds of input: uni-manual touch to select single voxels; bi-manual touch to select groups of voxels and to rotate or transform the grid; and gestural inputs, such as the “point” gesture which sends out a “3D ray” that intersects with voxels.
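To make that last input concrete, here is a rough sketch, in Python, of how a “point” gesture might cast a 3D ray and select the drone voxels it passes near. The class, function names and distance threshold are our own illustrative assumptions, not code from the GridDrones system.

```python
# Hypothetical sketch of "point" gesture selection: the pointing hand defines
# a 3D ray, and any drone voxel whose center lies close enough to that ray is
# selected. Names and thresholds are illustrative, not from GridDrones.
from dataclasses import dataclass
import numpy as np

@dataclass
class DroneVoxel:
    drone_id: int
    position: np.ndarray  # world-space center from motion capture, in metres
    selected: bool = False

def select_by_ray(voxels, ray_origin, ray_direction, max_offset=0.08):
    """Mark every voxel whose center lies within max_offset metres of the ray."""
    d = ray_direction / np.linalg.norm(ray_direction)
    for v in voxels:
        to_voxel = v.position - ray_origin
        t = float(np.dot(to_voxel, d))       # distance along the ray
        if t < 0:
            continue                         # voxel is behind the pointing hand
        closest = ray_origin + t * d         # nearest point on the ray
        if np.linalg.norm(v.position - closest) <= max_offset:
            v.selected = True
    return [v for v in voxels if v.selected]
```

In the real system the ray would be derived from the tracked pointing hand; here it is simply passed in as an origin and a direction.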

GridDrones creating a catenary archway with a) a 2 x 7 flat grid of drones; b) user employs “point” gesture to select two ‘keystones’ for the arch, setting some topological parameters on the app; c) user moves keystones up to d) create an arch.

Besides these inputs, the system supports a number of conventional inputs we’re already familiar with, like double-clicking and lassoing. In addition, the team developed a smartphone app that lets users easily change the topological relationship between drone-voxels using a touch slider; for instance, setting the vertical distance between voxels at a certain percentage so that when one voxel is moved, the rest automatically reposition themselves to reflect that percentage setting.
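As a rough illustration of that slider behavior, the sketch below (reusing the DroneVoxel class from the earlier example) couples follower voxels to a user-moved voxel through a per-voxel vertical ratio. Again, the names and structure are assumptions for illustration, not the app’s actual implementation.

```python
# Hypothetical sketch of slider-defined vertical coupling: each coupled voxel
# follows a user-moved "leader" voxel by a fixed fraction of its vertical
# displacement. The ratios mapping (drone_id -> fraction) is illustrative.
def apply_vertical_coupling(voxels, leader_id, delta_z, ratios):
    """Raise the leader by delta_z and each coupled voxel by its slider ratio."""
    for v in voxels:
        if v.drone_id == leader_id:
            v.position[2] += delta_z
        elif v.drone_id in ratios:
            v.position[2] += ratios[v.drone_id] * delta_z  # e.g. 0.5 = 50%
```

With a ratio of 0.5 for the voxels next to the leader, raising the leader by 10 cm would lift its neighbors by 5 cm, producing the kind of curvilinear deformation shown in the figure below.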

Grid transformations. Top: Point controls in which one voxel is moved with a) linear relationship between voxels; b) curvilinear relationship. Middle: With a line of 3 voxels selected, translation in the z dimension resulting in c) faceted relationship; d) complex curvature. Bottom: Arbitrary voxels selected and translated in the z dimension resulting in e) faceted relationship; f) complex curvature.

A “Real Reality Interface”

While the current study is considered a “low-resolution” version that only uses 15 drones, according to the researchers, the GridDrones system can be easily scaled up to include many more drones.

“Future versions of the system will feature billions of drones that are so small that they will be able to cling together to create physical structures that are not discernible from real [3D] prints,” explained study co-author and professor Tim Merritt from Denmark’s Aalborg University. “This technology has the potential to ultimately displace virtual reality. The real advantage is that it is situated in the user’s real reality. That’s why we call it a ‘real reality interface.'”

Such “real reality interfaces” could offer full-scale prototyping capabilities to engineers, designers and architects, as well as interactive educational tools for people of all ages. These three-dimensional interfaces could be further extended by incorporating brain-computer interfaces (BCIs), allowing users to manipulate physical voxels with their brain waves. It’s a provocative idea that could bring user interfaces out into our everyday lives, making controlling them as easy as making a gesture, or thinking a thought.
