Structuring the time series workflow

Scoping

Let's assume the data streams are readily available from Device Management. In other words, there are some devices connected to Device Management that provide the following, in addition to the standard LwM2M objects:

  • Product ID (String): "Cheese," "Apples," "Beer," "Bread" and "Milk."
  • Product count (Integer).

The same product with the same product ID may be displayed on multiple shelves, so we need to capture this in our final metric through an aggregation stage. We want to keep track of the product count for each product ID over time, so we will store these readings in a time series database. To do this, we need a custom proxy application that takes the data streams from Device Management and pushes them into the time series database. We also need to present these readings to the analysts in a meaningful way through some form of visualization. In this tutorial, we use InfluxDB for storing the time series values and Grafana for visualization.
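To make the aggregation concrete, each reading can be stored as a time series point in which the product ID is a tag and the count is a field; tagging by product ID lets the database later group counts across shelves. A minimal sketch in Python, assuming a measurement named `product_count` (the names here are illustrative, not part of the Device Management API):

```python
def to_point(device_id, product_id, count):
    """Shape one shelf reading as a time series point.

    product_id is stored as a tag so counts for the same product can
    later be summed across all shelves (devices) that display it;
    the count itself is the field value.
    """
    return {
        "measurement": "product_count",  # illustrative name
        "tags": {"device_id": device_id, "product_id": product_id},
        "fields": {"count": count},
    }

# Two shelves displaying the same product produce two points that
# share the product_id tag and can be aggregated at query time.
points = [to_point("shelf-1", "Milk", 12), to_point("shelf-2", "Milk", 7)]
```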

Capturing the structure in microservices

Microservice architectures are modular applications whose subcomponents are built as loosely coupled, reusable services. Given a standard interface, the same description of, say, an SQL database microservice can be reused in almost any application. Microservice architectures also offer straightforward deployment and controllable scaling of individual services.

Docker is one of many solutions for building microservice-based applications. In our example service, we partition the work into four microservices:

  • linux_client: An example client that generates data streams.
  • app: A proxy web application that receives data from Device Management and forwards the data to InfluxDB.
  • influxdb: An instance of the InfluxDB time series database.
  • grafana: A data visualization engine.

Each microservice is described by a Dockerfile, which specifies how to build and run the service and which ports it exposes. This application has two Dockerfiles, one for the proxy web application and one for a local Linux client, found in webapp/Dockerfile and mbed-cloud-client-example/Dockerfile, respectively. Docker Compose captures the overall application structure in a docker-compose.yml file, which designates which microservices the application needs and how they connect to each other.
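As an illustration of what such a Dockerfile might contain, the proxy web application's could look roughly like this. This is a sketch only; the actual webapp/Dockerfile in the repository may use a different base image, dependency step, and entry point (`app.py` here is a hypothetical name):

```dockerfile
# Sketch of a Dockerfile for the proxy web application (webapp/).
FROM python:3-alpine

# Matches the bind mount target used in the compose file.
WORKDIR /usr/src/app

# Install dependencies first so they are cached across rebuilds.
COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . .

# The compose file maps host port 3000 to this container port.
EXPOSE 8080
CMD ["python", "app.py"]
```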

For our example system, we use the following compose script:

version: "3"

services:
    linux_client:
        build: mbed-cloud-client-example

    app:
        build: webapp
        ports:
            - "3000:8080"
        volumes:
            - ./webapp:/usr/src/app
        links:
            - influxdb
            - linux_client
        environment:
            - ENV=development

    # We can refer to the influxdb url with http://influxdb:8086
    influxdb:
        image: influxdb:1.4-alpine
        expose:
            - "8086"

    grafana:
        image: grafana/grafana
        ports:
            - "3001:3000"
        links:
            - influxdb

For InfluxDB and Grafana, we do not need to write Dockerfiles: both images are public on Docker Hub, and Docker pulls them automatically on build. The service dependencies are expressed through the links keyword: the app service depends on both the influxdb and linux_client services, and the grafana service reads its data from the influxdb service. Finally, the grafana service is published on host port 3001 (mapped to the container's port 3000), and the proxy application on host port 3000 (mapped to the container's port 8080).

Writing a proxy sampling application based on subscriptions

For an introduction to building web applications around Device Management, see integrating web applications. In pseudocode, the proxy application subscribes to the product count resource on every connected device and forwards each notification to the time series database:

# Subscribe to all devices at runtime
def initialize():
    for device in connected_devices():
        if has_product_count_resource(device):
            add_resource_subscription(device, PRODUCT_COUNT_PATH, product_count_callback)

# Push the current value and the product ID to the database
def product_count_callback(device_id, path, current_value):
    TimeSeriesDatabase.push((device_id, current_value, get_product_id(device_id)))