
Off-The-Shelf Hacker: Virtual to Physical with Alexa Using 3 Internet Services

Jul 15th, 2017 3:00am

I picked up an Amazon Echo Dot a few weeks ago. You can get the time, a weather report or listen to internet radio stations simply by asking Alexa, the Echo Dot’s built-in virtual assistant. While that’s all quite interesting, I secretly wanted to learn how Alexa could control physical stuff.

Earlier this month I wrote about connecting Alexa through a data hub, which I call the MuG: the Mosquitto MQTT broker application running as a server, under glass. The glass is decorative more than anything, an ode to my interest in the steampunk movement, which mixes old and new ways of looking at the world. The MuG connects a virtual space with the physical world, with voice commands serving as the bridge.

The MuG is a CHIP computer configured as a standalone WiFi access point, with the Mosquitto MQTT broker application running as a server. A decorative glass ticker-tape-styled dome houses the device, while an articulated brass-tubing bracket holds a tri-color LED that dramatically lights up the front of the CHIP circuit board. Mosquitto subscription client and Adafruit general-purpose input/output (GPIO) library calls in Python drive the CHIP’s pins connected to the LED. Sitting on my desk, the MuG is a wireless visual interface to my physical computing stack.

Getting Alexa to blink the MuG’s LED is today’s topic. You might say that this is the Alexa “Hello, World” exercise, in the fine tradition of C and Unix.

Pulling the Parts Together

Successfully getting Alexa to change the MuG’s LED color has a lot of moving parts. Some are virtual, some are local hardware and some happen in software. It’s definitely a physical computing stack.

Along with the Dot and the MuG, we’ll use three internet services to get the job done. We’ll need an Amazon account, an If-This-Then-That (IFTTT) account and an Adafruit IO account. Amazon handles Alexa, Adafruit IO serves as the MQTT broker and IFTTT acts as the intermediary. This is a first-pass prototype process that actually works; it will mature over time.

If your Dot is operational, you’ll have already created an Amazon account and talked to Alexa. If you need them, the sign-up pages are at amazon.com, ifttt.com and io.adafruit.com.

Let’s start with the Adafruit IO configuration.

Connect to the Adafruit MQTT Broker

Adafruit kindly offers both REST and MQTT APIs to help customers develop their projects. We’ll use the MQTT API, which is the front end to a cloud-based MQTT broker. It’s currently in beta, so use it at your own discretion; it seemed pretty stable and responsive for the Alexa-to-MuG prototype process.

Make sure to record your Adafruit IO values (user name, AIO key and so on) in your project notebook for future reference and easy access.

Log into your Adafruit IO account and click the “Feeds” menu item. Click the “Create a New Feed” item on the “Actions” drop-down menu, enter a feed name and description, then click the “Create” button. My feed was called “alexadata.”

Adafruit IO Feed Page

Switch over to a Linux terminal and use a mosquitto_pub command to test your Adafruit IO “alexadata” feed, substituting your own values as needed. I ran it from a terminal on my Linux notebook to see if I could update the feed.
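A minimal mosquitto_pub invocation along these lines should work, assuming the standard Adafruit IO topic layout (username/feeds/feedname); YOUR_AIO_USERNAME and YOUR_AIO_KEY are placeholders for the values from your Adafruit IO settings page:

```shell
# Publish a test value to the "alexadata" feed on the Adafruit IO broker
mosquitto_pub -h io.adafruit.com -p 1883 \
  -u YOUR_AIO_USERNAME -P YOUR_AIO_KEY \
  -t "YOUR_AIO_USERNAME/feeds/alexadata" \
  -m "b-on"
```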


The feed value should change when you hit “reload” on the “Feeds” page.

The Adafruit IO overview page explains lots of details about the site.

Connect Alexa to IFTTT

Next, while still on the Adafruit IO site, go to the “Settings” page. Click “Connected to IFTTT.com” under the “Connected Accounts” setting. This takes you to the Adafruit IFTTT connection page, where we can set up the trigger that changes the Adafruit data feed value. Use your IFTTT login information to get on the site.

Adafruit IO - IFTTT Connection Page

Once there, click the “My Applets” button at the top of the page. Click the “New Applet” button on the right.

IFTTT New Applet Page

On the applet create page, hit the blue “+ this” text in the middle of the page. Search for and then select “Amazon Alexa”. On the “Choose trigger” page, select the “Say a specific phrase” box. Fill in the “What phrase” text; mine was “blue on”. Saying “Alexa, trigger blue on” will then start the chain of actions in the Alexa-to-MuG process. You’ll be returned to the IFTTT main page and should now see blue “+ that” text.

Click the “+ that” text. Search for and click the Adafruit IO box. Click the “Send data to Adafruit IO” box. Add the feed name you created earlier into the “Feed name” slot. Now enter a feed value that the MuG’s Python program will recognize, so it can take the appropriate action with the CHIP’s GPIO pin. The value I used was “b-on.” Be sure to finish up with the “Create action” button. The next page will summarize the applet in a little box.

The way all this works is that every time I bellow “Alexa, trigger blue on” to the Dot, it kicks off an IFTTT job that sends the value “b-on” to the Adafruit IO MQTT broker, which then publishes the value. The Python program on the MuG picks up the value through its subscription to the Adafruit IO broker. Of course, the Python program needs to be running on the MuG before the Alexa command will work.

The Python script (named subalexa2.py) on the MuG is started with the following command line.
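Something like this should do it, assuming the script sits in the current directory; sudo is typically needed for GPIO access on the CHIP, and the trailing ampersand keeps the script running in the background:

```shell
# Start the MQTT subscription script in the background
sudo python subalexa2.py &
```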


I added some GPIO code to an MQTT subscription script from the techtutorialsx.com site to read messages and control the MuG’s tri-color LED.
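A sketch of the script might look like the following. It assumes the paho-mqtt client library and the CHIP_IO GPIO library; the pin names (XIO-P0/P1/P2), the feed name and the account values are placeholders, so substitute your own. The imports are guarded so the command-mapping logic can be exercised off-device:

```python
import time

try:
    import paho.mqtt.client as mqtt
    import CHIP_IO.GPIO as GPIO
except ImportError:  # lets the command logic run off-device for testing
    mqtt = GPIO = None

AIO_USER = "YOUR_AIO_USERNAME"   # placeholder -- your Adafruit IO user name
AIO_KEY = "YOUR_AIO_KEY"         # placeholder -- your AIO key
FEED = AIO_USER + "/feeds/alexadata"

# Hypothetical CHIP pin assignments for the tri-color LED
PINS = {"blue": "XIO-P0", "red": "XIO-P1", "green": "XIO-P2"}

# Map feed values sent by IFTTT to (color, on/off) actions
COMMANDS = {"b-on": ("blue", True), "b-off": ("blue", False),
            "r-on": ("red", True), "r-off": ("red", False),
            "g-on": ("green", True), "g-off": ("green", False)}

def wow_sequence(cycles=10):
    """Flash order for the 'wow' show: blue, red, green, repeated."""
    return ["blue", "red", "green"] * cycles

def set_led(color, state):
    GPIO.output(PINS[color], GPIO.HIGH if state else GPIO.LOW)

def on_message(client, userdata, msg):
    value = msg.payload.decode()
    if value == "wow":
        for color in wow_sequence():
            set_led(color, True)
            time.sleep(0.2)
            set_led(color, False)
    elif value in COMMANDS:
        set_led(*COMMANDS[value])

if __name__ == "__main__" and mqtt and GPIO:
    for pin in PINS.values():
        GPIO.setup(pin, GPIO.OUT)
    client = mqtt.Client()
    client.username_pw_set(AIO_USER, AIO_KEY)
    client.connect("io.adafruit.com", 1883)
    client.subscribe(FEED)
    client.on_message = on_message
    client.loop_forever()  # block, dispatching each incoming feed value
```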


Notice the “wow” function. This was a bit of razzle-dazzle I put together: it flashes a cycle of blue, red and green 10 times in a row, with a 0.2-second delay, whenever a “wow” message arrives from the Adafruit IO broker.

I admit the process is a bit clunky, in that you need a separate IFTTT trigger for each action phrase, like “blue on” or “show wow”.

My wife thought it was pretty cool when I said “Alexa, trigger show wow” and the MuG blinked through its 10 blue, red, green cycles.

Going Further

I’ve just touched on the basics of the Alexa-voice-command-to-MuG process. We could easily send different values that move a servo or turn on a motor.

Amazon Web Services might be another area to explore, since AWS IoT includes an MQTT message broker. Going that route might eliminate the IFTTT step. There are other MQTT services, both free and paid, on the internet that might satisfy our data messaging needs.

Are voice commands the wave of the future? Who knows.

I’m off to install Alexa on my fifth generation steampunk conference badge.
