Getting Started with Go and InfluxDB
Conventional databases such as PostgreSQL or MongoDB are great at safekeeping the state of your system in a tabular or document format, but what about time-dependent data: system metrics, IoT device measurements or application state changes?
For those, you need a more suitable type of database, one designed to better manage semi-structured data with a time dimension.
InfluxDB is a high-performance data store explicitly written for time-series data. InfluxData provides not only the database, but also tools to ingest, transform and visualize your data. For instance, Telegraf offers more than 200 plugins to ingest data. However, if you want to integrate InfluxDB directly into your backend application, you need to use the dedicated client library.
This tutorial will walk you through how to use the InfluxDB Go client library, create a connection to the database and store and query data from it.
Getting Started with InfluxDB
You are about to add InfluxDB to your application stack. At the end of this tutorial, you will have a codebase illustrating how to interface a Go application with InfluxDB. But first, let’s create some context for this demo.
You are designing a new smart thermostat IoT product. You receive frequent temperature measurements from your IoT sensors, and let’s assume you store those measurements in an InfluxDB database. Your users can also adjust the temperature of their smart thermostat using your application. Every time a user changes the thermostat, you update the state of the thermostat in your conventional database.
Furthermore, you wish to keep a history of all the thermostat temperature settings, alongside the temperature measurement. Temperature settings and measurements together enable you to analyze user behaviors. With that data in hand, you can later make your smart thermostat even smarter by predicting changes before the user even acts.
Requirements for This Tutorial
This tutorial is OS agnostic and assumes that you have Go 1.16+ and Docker installed.
I selected the Docker installation, as it’s best suited for continuous integration. However, InfluxDB supports many platforms (Linux, macOS, Windows, Docker, Kubernetes).
Starting a Local Database
To start the setup, you need to define a docker-compose.yml file that describes the configuration of your InfluxDB container, then run docker-compose up to get the database started. For test purposes, define your environment variables in a file called test_influxdb.env.
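As a sketch, a minimal docker-compose.yml for the official InfluxDB 2.x image, together with a test_influxdb.env file, might look like the following. Every credential value below is an illustrative placeholder, not a recommendation:

```yaml
version: "3"
services:
  influxdb:
    image: influxdb:2.0
    ports:
      - "8086:8086"
    environment:
      # the DOCKER_INFLUXDB_INIT_* variables bootstrap the database on first start
      - DOCKER_INFLUXDB_INIT_MODE=setup
      - DOCKER_INFLUXDB_INIT_USERNAME=${INFLUXDB_USERNAME}
      - DOCKER_INFLUXDB_INIT_PASSWORD=${INFLUXDB_PASSWORD}
      - DOCKER_INFLUXDB_INIT_ORG=${INFLUXDB_ORG}
      - DOCKER_INFLUXDB_INIT_BUCKET=${INFLUXDB_BUCKET}
      - DOCKER_INFLUXDB_INIT_ADMIN_TOKEN=${INFLUXDB_TOKEN}
```

```
# test_influxdb.env -- placeholder values for local testing only
INFLUXDB_USERNAME=admin
INFLUXDB_PASSWORD=admin1234
INFLUXDB_ORG=my-org
INFLUXDB_BUCKET=my-bucket
INFLUXDB_TOKEN=my-secret-token
INFLUXDB_URL=http://localhost:8086
```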
You need to add the flag --env-file to your docker-compose command to force Docker to take that file into account:

```shell
docker-compose --env-file test_influxdb.env up
```
InfluxDB comes with an easy-to-use UI. Go to http://localhost:8086 and have a look.
Getting Familiar with InfluxDB
InfluxDB is not just a time-series database. It’s the core element of the InfluxData ecosystem, which includes:

- A UI. The web interface offers administrative screens and development tools such as the Query Builder and data visualization.
- A command-line interface (CLI), influx. The influx CLI can administer the database, load data from CSV, insert data and query it. It’s a good companion when developing and debugging applications.
Getting Started with the Go Client Library
Your database is ready, so let’s write some Go. First, initialize your Go module:

```shell
go mod init github.com/xNok/Getting-Started-with-Go-and-InfluxDB
```

Then add influxdb-client-go as a dependency to your project:

```shell
go get github.com/influxdata/influxdb-client-go/v2
```
Making a Connection
Let’s create a function that helps you connect to the database. You already defined a token in your test_influxdb.env file with the variable INFLUXDB_TOKEN, and you will be using this token for test purposes. You can also create a new token via the UI. Your instance of InfluxDB should still be running.
Go back to the UI to generate a new authentication token:
- Click Data.
- In Client Libraries, select Go.
This section lets you create an auth token and provides some code snippets to get started with the Go library.
The function you’ll aim to create is connectToInfluxDB, which returns a ready-to-use influxdb2.Client.
Next, create the test function. When you call connectToInfluxDB, you get a successful connection to the database, and you can validate that by calling the Health method of influxdb2.Client. As you can see, I used godotenv.Load("../test_influxdb.env") to fetch the credentials you defined for your InfluxDB in Docker. (You will need to add godotenv as a dependency to your project.)
When it comes to creating a connection, call the influxdb2 client initialization constructor. Including reading the credentials from environment variables and validating the connection, connectToInfluxDB looks like this:
If the test passes, you are ready to implement some features using InfluxDB. However, you are not yet ready for production.
Enabling SSL/TLS encryption is strongly recommended for the production environment. You won’t need it in this tutorial, since you’re using a local Docker environment.
In your application, you’ll need to pass the certificates to your InfluxDB client.
First step: Model your data. Your requirement is to send changes in “thermostat settings” to InfluxDB. A setting contains an identifier for the user and the desired average and maximum temperature in the room.
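A possible model, with the user as a tag (so it can be indexed) and the temperatures as fields; the field names avg and max are an assumption for this sketch:

```go
package main

// ThermostatSetting represents one change of a thermostat's settings.
type ThermostatSetting struct {
	user string  // stored as a tag, so records can be indexed per user
	avg  float64 // desired average temperature, stored as a field
	max  float64 // maximum temperature, stored as a field
}
```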
Second step: Write a test function. Trying each of the possible ways to insert data will help you find the one that suits you best.
The InfluxDB Go client library offers three ways to insert data:
- Line protocol uses a text-based format to write data points.
- Data point with a constructor uses maps to populate the data.
- Data point with fluent style uses a builder pattern.
Here is a generic test function made for that purpose:
You will need a small helper function, init_testDB, to initialize the connection and clean the database before each test.
At last, you’re ready to try each type of data insertion.
The line protocol is straightforward to use and acts a bit like an SQL query. A record in InfluxDB is composed of three elements: measurementName, fields and tags. These key concepts of InfluxDB are:

- measurementName, which refers to a dataset
- fields, which are key/value pairs
- tags, which are also key/value pairs, but act as indexes for your record
The point data approach is lengthy to write, but also provides more structure. It’s convenient when data parameters are already in the desired format.
Alternatively, you can use the builder NewPointWithMeasurement to construct the point step by step, which is easy to read.
Which insertion method is best for you? Don’t forget to update the tests to validate your implementations.
Note that the InfluxDB client uses batching to send data to the database. By default, no data is sent until the batch size is reached (5,000 points), a trade-off between the load on the database and the availability of the data. A smaller batch size means more frequent writes, thus potentially affecting the performance of the database. On the other hand, waiting for the batch size to be reached means that the data is still in memory in your application and not yet in the database.
You can adjust the batch size when calling the initialization constructor for the client. You can also force the client to send the data immediately by calling Flush(), as you saw in the previous examples.
However, based on my experience with time-series databases, don’t use the Flush method everywhere. Even if it seems reasonable to write the data instantly to the database, doing so can significantly affect performance. Instead, tune the batch size option.
Blocking vs. Non-blocking
While the default behavior of InfluxDB is to use asynchronous calls and batches (i.e., non-blocking I/O), you have the option to write points synchronously. This option is recommended for infrequent writes that need to be immediately committed to the database.
InfluxDB uses a query language called Flux. Flux uses a functional approach to select, filter and aggregate data. It’s effortless to read and understand once you get the basics of it. Additionally, InfluxDB provides a powerful query builder to design your query based on the data ingested.
To complete this tutorial, run your integration test to add data points to the database. Next, use the query builder to create a query that isolates the data you need. This is the query built using the QueryBuilder:
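For this tutorial's data, the generated Flux might look like the following (the bucket name, time range and user filter are assumptions):

```
from(bucket: "my-bucket")
  |> range(start: -1h)
  |> filter(fn: (r) => r["_measurement"] == "thermostat")
  |> filter(fn: (r) => r["user"] == "foo")
```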
Now all that’s left to do is implement a function that queries your data using the Go client. There are two ways to query data.

The first is to use QueryTableResult. You will notice that putting the data back into the ThermostatSetting structure requires a bit of work: even though you send the content of ThermostatSetting as one data point, the fields avg and max come out as two separate records.
The second option is QueryRaw(), which returns an unparsed result string.
Finally, you’ll need to update your test function and see if it works as expected.
If you completed this tutorial, you now have a thoroughly tested application using InfluxDB. You’ve used four different ways to insert data (three styles of non-blocking insertion plus one blocking insertion), and you’re aware that you need to enable SSL/TLS certificates before going to production.
You have the opportunity to insert data and visualize it in the InfluxDB UI, from which you can quickly build your queries and then use the data in your application. In sum, you can insert data and retrieve it in the same data structure to use in your application. Your future smart thermostats company is definitely on the right track.
If you’re interested in going further with InfluxDB, read the official documentation to get more familiar with the Go client library.