Off-The-Shelf Hacker: The Great Robot Servo Dilemma

How to control multiple servos in a dynamically moving robot.
Oct 12th, 2019 6:00am by Dr. Torq

My project, Hedley the Robotic Skull, has a servo that pans his head back and forth. I’ll soon add one that makes the old noggin tilt forward and back. Wouldn’t Hedley be a lot more interesting if he could roll his left eye and perhaps raise and lower an eyebrow? All the experts opine that robots should be appropriately expressive in order to work harmoniously next to their human overlords.

I’m discovering that one big technical challenge is controlling “X” number of servos in a robot.

Pan and tilt for the head and one eyeball adds up to four servos. Remember, Hedley has the JeVois smart vision sensor in his right eye socket. Another servo is needed for a simple up/down eyebrow movement. Now we’re up to five. Clearly, an Arduino or a Raspberry Pi on its own isn’t going to effectively orchestrate even five servos. What if we want to add more eyebrows or another eye?

This week I’ll prototype using a 16-channel servo/PWM board. The device sits between your collection of robotic servos and your robot’s microcontroller brain, communicating over the I2C bus. All the servo timing is handled by the board, so the microcontroller just has to tell the servos where and when to move. Scripting our “robot song and dance act” is shaping up to be a complicated process as well. Today we’ll get the board working and gain a little insight into what we’ll need to consider in a robot control application.

Connecting to the NodeMCU

The I2C bus is built into most modern microcontrollers, including the Arduino and the Raspberry Pi. We explored using the I2C bus for the BME280 atmospheric sensor story last week. It uses two signal lines, SDA for data and SCL for the clock, plus power and ground, so four wires in all. The servo controller board talks over I2C.

I recently picked up a three-pack of PCA9685 16-channel, 12-bit servo/PWM controller boards from Banggood for about $12.

In previous stories, I’ve used a NodeMCU board as an “Arduino.” At about $4 per board, why use anything else, especially when a fairly simple firmware upgrade gets you a WiFi connection? Connecting the NodeMCU to the servo board was easy, and I used a breadboard to get the job done quickly. The servo board came with a male header at one end, so I simply plugged it vertically into the breadboard and ran jumpers between the two boards. Quick hookup and proof of principle is the name of the game when prototyping. The connections, from the NodeMCU to the servo board, are: 3.3 volts to VCC, GND to GND, D1 to SCL and D2 to SDA.
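Before loading any servo code, it’s worth confirming that the NodeMCU can actually see the PCA9685 on the bus. Here’s a minimal, hedged I2C scanner sketch for that check; it assumes the ESP8266 core’s default I2C pins (D2 as SDA, D1 as SCL, matching the wiring above) and the board’s factory default address of 0x40, with no address jumpers soldered.

```cpp
// Minimal I2C scanner: lists every device that answers on the bus.
// Assumes the ESP8266 core's default pins (D2 = SDA, D1 = SCL).
#include <Wire.h>

void setup() {
  Serial.begin(115200);
  Wire.begin();   // defaults to D2/D1 on the NodeMCU
}

void loop() {
  for (byte addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {
      Serial.print("Found I2C device at 0x");
      Serial.println(addr, HEX);   // expect 0x40 for a stock PCA9685
    }
  }
  delay(5000);    // rescan every five seconds
}
```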

You can’t run a servo board from the 5-volt pin on the NodeMCU. The servos will pull too much current and cause erratic behavior. The best way is to connect a 5-volt wall wart that delivers at least 1 Amp. I had an old Sanyo cell phone charger rated at 950 mA. Close enough. The inner conductor of the cable went to the positive screw terminal on the servo board, while the braided conductor attached to the negative terminal. The pan and jaw servo leads were mated to servo headers 0 and 1, respectively. After uploading the firmware to the NodeMCU board, the USB cable was plugged into a standard 5-volt wall wart.

Breadboard with NodeMCU and the servo controller

Ginning up the Test Code

I copied the example program from the SunFounder site and made a few modifications. Take a look at the PCA9685 page for an interesting refresher on how servos work with pulse width modulation (PWM).

Be sure to install the Adafruit PWM servo driver library in the Arduino IDE before trying to compile and run your servo program. I used version 2.0.0.


We start with the usual variables, initializations and library references. The setup then starts serial communications and makes the library calls that open the PWM channels.
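I won’t reproduce the SunFounder listing verbatim; the sketch below is a hedged reconstruction of that opening section using the Adafruit PWM servo driver library. The pulse-width constants (roughly 650 to 2350 microseconds at a 50 Hz refresh rate) are typical hobby-servo values, not necessarily the exact numbers from the original example, so tune them for your own servos.

```cpp
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

// PCA9685 at its factory default I2C address of 0x40
Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();

// Typical analog hobby-servo timing; adjust for your servos
#define MIN_PULSE_WIDTH 650    // microseconds at 0 degrees
#define MAX_PULSE_WIDTH 2350   // microseconds at 180 degrees
#define FREQUENCY 50           // standard 50 Hz servo refresh rate

void setup() {
  Serial.begin(115200);        // serial comms for debugging
  pwm.begin();                 // wake up the PCA9685 over I2C
  pwm.setPWMFreq(FREQUENCY);   // set the board's PWM output frequency
}
```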

Next, in the main loop, the pwm.setPWM() function moves the designated servo to the desired angular position, with the pulseWidth() function converting angles into pulse widths. A half-second delay separates each move, and the whole sequence repeats. Servo 0 is the pan servo and 1 is the jaw.

In this case, Hedley moves his head back and forth then opens and closes his jaw for each cycle of the main loop.
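Continuing the setup sketch above (and reusing its constants and pwm object), here is a hedged version of the angle conversion and the main loop. The angles are illustrative placeholders rather than Hedley’s actual positions; channel 0 drives the pan servo and channel 1 the jaw, as described.

```cpp
// Convert a servo angle (0-180 degrees) into the 12-bit tick count
// that setPWM() expects: 4096 ticks per frame at the chosen frequency.
int pulseWidth(int angle) {
  int pulse_us = map(angle, 0, 180, MIN_PULSE_WIDTH, MAX_PULSE_WIDTH);
  return int(float(pulse_us) / 1000000.0 * FREQUENCY * 4096.0);
}

void loop() {
  pwm.setPWM(0, 0, pulseWidth(60));    // pan the head one way
  delay(500);
  pwm.setPWM(0, 0, pulseWidth(120));   // pan it back
  delay(500);
  pwm.setPWM(1, 0, pulseWidth(95));    // open the jaw
  delay(500);
  pwm.setPWM(1, 0, pulseWidth(115));   // close the jaw
  delay(500);
}
```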

Keep in mind that you should probably start with just a few loose servos attached to the servo controller board. Get everything working right, then figure out the starting and stopping positions for the actual robot servos. For example, I went back to past lab notes and found that Hedley had jaw opening and closing angles of 95 and 120 degrees. When I hooked the servo board up to Hedley’s servos, I used 90 and 115 initially to make sure I didn’t crash the servos into their mechanical limits and possibly strip out the internal gears. Always be ready to pull the power plug the first time you run a servo program.
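One simple way to bake that caution into the code itself is to clamp every requested angle to a per-servo safe range before converting it to a pulse width. This is just an illustrative sketch that reuses the pwm object and pulseWidth() helper from the code above; the limits shown are examples, not Hedley’s real numbers.

```cpp
// Illustrative safety limits for the jaw servo; set these from your own
// lab notes so the horn can never hit a mechanical stop.
const int JAW_MIN_ANGLE = 90;
const int JAW_MAX_ANGLE = 115;

// Clamp the requested angle, then hand it to the PCA9685 (channel 1).
void moveJaw(int angle) {
  angle = constrain(angle, JAW_MIN_ANGLE, JAW_MAX_ANGLE);
  pwm.setPWM(1, 0, pulseWidth(angle));
}
```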

After plugging the controller board into the actual robot servos and verifying I wasn’t going to break anything, it was a simple matter to inch my way up to a lightly closed jaw and an open position that didn’t hit Hedley’s neck bone.

What’s Next

Now that we know the board works and the servos move, it’s time to figure out how to easily configure a “show” so that Hedley moves his head, opens his mouth and speaks so it all looks realistic.

I might be able to build a Processing program that follows a “script” so I can interact with Hedley during a tech talk. I’m not yet sure how to run the audio analysis routines (the ones that sync the jaw to the audio) while also moving the pan/tilt servos.

I’m not even sure if the audio analysis routines will run through the servo control board. Experimenting in that area will probably be next on the agenda.

Regardless, building robotic and physical computing systems is challenging and fun. I’m glad I can share my discoveries, triumphs and occasional missteps with readers.

Catch Dr. Torq’s Off-The-Shelf Hacker column, each Saturday, only on The New Stack! Contact him for consultation, speaking appearances and commissioned projects at doc@drtorq.com or 407-718-3274.
