# Amazon DynamoDB Persistence

This service allows you to persist item state updates in the Amazon DynamoDB database. Query functionality is also fully supported.


Features:

  • Writing/reading item states to/from the DynamoDB NoSQL database
  • Configurable database table names
  • Automatic table creation
# Disclaimer

This service is provided "AS IS", and the user takes full responsibility for any charges incurred or damage to data stored in Amazon AWS.


# Prerequisites

You must first set up an Amazon account as described below.

It is recommended that users familiarize themselves with AWS pricing before using this service. Please note that Amazon might charge for querying and storing data in DynamoDB. See the Amazon DynamoDB pricing pages for more details, and note the possible Free Tier benefits.

# Setting Up an Amazon Account

  • Sign up for Amazon AWS.
  • Select the AWS region in the AWS console. Note the region identifier in the URL (e.g. https://eu-west-1.console.aws.amazon.com/console/home?region=eu-west-1 means that the region ID is eu-west-1).
  • Create a user for openHAB in IAM
    • Open Services -> IAM -> Users -> Create new Users. Enter openhab as the user name, keep Generate an access key for each user checked, and finally click Create.
    • Show User Security Credentials and record the keys displayed.
  • Configure the user policy to grant access to DynamoDB
    • Open Services -> IAM -> Policies
    • Check AmazonDynamoDBFullAccess and click Policy actions -> Attach
    • Check the user created in step 2 and click Attach policy

# Configuration

This service can be configured in the file services/dynamodb.cfg.

# Basic Configuration

| Property  | Default | Required | Description |
|-----------|---------|----------|-------------|
| accessKey |         | Yes      | Access key as shown in Setting Up an Amazon Account. |
| secretKey |         | Yes      | Secret key as shown in Setting Up an Amazon Account. |
| region    |         | Yes      | AWS region ID as described in Setting Up an Amazon Account. The region needs to match the region that was used to create the user. |

# Configuration Using Credentials File

Alternatively, instead of specifying accessKey and secretKey directly, you can point the service to an AWS credentials profile file.

| Property           | Default | Required | Description |
|--------------------|---------|----------|-------------|
| profilesConfigFile |         | Yes      | Path to the credentials file, for example /etc/openhab2/aws_creds. Please note that the user that runs openHAB must have appropriate read rights to the credentials file. For more details on the Amazon credentials file format, see the Amazon documentation. |
| profile            |         | Yes      | Name of the profile to use. |
| region             |         | Yes      | AWS region ID as described in Step 2 in Setting Up an Amazon Account. The region needs to match the region that was used to create the user. |

Example of service configuration file (services/dynamodb.cfg):
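The values below are illustrative placeholders only (the access/secret keys are Amazon's documented example keys, and eu-west-1 is an arbitrary region):

```
accessKey=AKIAIOSFODNN7EXAMPLE
secretKey=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
region=eu-west-1
```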


Example of credentials file (/etc/openhab2/aws_creds):
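A sketch in the standard AWS credentials file format, again using Amazon's documented example keys; the [default] profile name is an assumption and must match the profile property when that is configured:

```
[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```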


# Advanced Configuration

In addition to the configuration properties above, the following are also available:

| Property                   | Default  | Required | Description |
|----------------------------|----------|----------|-------------|
| readCapacityUnits          | 1        | No       | Read capacity for the created tables. |
| writeCapacityUnits         | 1        | No       | Write capacity for the created tables. |
| tablePrefix                | openhab- | No       | Table prefix used in the name of created tables. |
| bufferCommitIntervalMillis | 1000     | No       | Interval to commit (write) buffered data, in milliseconds. |
| bufferSize                 | 1000     | No       | Internal buffer size, in datapoints, used to batch writes to DynamoDB every bufferCommitIntervalMillis. |

Typically you should not need to modify parameters related to buffering.

Refer to the Amazon documentation on provisioned throughput for details on read/write capacity.
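For illustration, a services/dynamodb.cfg fragment that simply spells out the defaults from the table above:

```
readCapacityUnits=1
writeCapacityUnits=1
tablePrefix=openhab-
bufferCommitIntervalMillis=1000
bufferSize=1000
```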

All item- and event-related configuration is done in the file persistence/dynamodb.persist.
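As a sketch, a persistence/dynamodb.persist file using a hypothetical Temperature item and the standard openHAB persistence syntax:

```
Strategies {
    everyHour : "0 0 * * * ?"
    default = everyChange
}

Items {
    Temperature : strategy = everyChange, everyHour
}
```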

# Details

# Table Creation

When an item is persisted via this service, a table is created (if necessary). Currently, the service will create at most two tables for different item types. The tables will be named <tablePrefix><item-type>, where the <item-type> is either bigdecimal (numeric items) or string (string and complex items).

Each table will have three columns: itemname (item name), timeutc (in ISO 8601 format with millisecond accuracy), and itemstate (either a number or string representing item state).
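To make the naming and timestamp conventions concrete, here is a small Python sketch (not part of the service; table_name and time_utc are illustrative helpers) that reproduces the table name for the default openhab- prefix and the ISO 8601 millisecond timestamp stored in the timeutc column:

```python
from datetime import datetime, timezone

TABLE_PREFIX = "openhab-"  # default tablePrefix

def table_name(item_type: str) -> str:
    # Numeric items are stored in the "bigdecimal" table,
    # string and complex items in the "string" table.
    suffix = "bigdecimal" if item_type == "number" else "string"
    return TABLE_PREFIX + suffix

def time_utc(ts: datetime) -> str:
    # ISO 8601 in UTC with millisecond accuracy.
    utc = ts.astimezone(timezone.utc)
    return utc.strftime("%Y-%m-%dT%H:%M:%S.") + f"{utc.microsecond // 1000:03d}Z"

print(table_name("number"))  # openhab-bigdecimal
print(time_utc(datetime(2024, 1, 2, 3, 4, 5, 678000, tzinfo=timezone.utc)))
```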

# Buffering

By default, the service is asynchronous which means that data is not written immediately to DynamoDB but instead buffered in-memory. The size of the buffer, in terms of datapoints, can be configured with bufferSize. Every bufferCommitIntervalMillis the whole buffer of data is flushed to DynamoDB.

It is recommended to keep buffering enabled, since the synchronous behaviour (writing data immediately) might have an adverse impact on the whole system when many items are persisted at the same time. Buffering can be disabled by setting bufferSize to zero.

The defaults should be suitable in many use cases.
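The buffering behaviour described above can be sketched in Python as follows. This is a simplified illustration, not the service's actual implementation; the background timer that would call commit() every bufferCommitIntervalMillis is omitted:

```python
import threading

class WriteBuffer:
    """Size-based batching sketch; flush is a callback that would
    perform the actual DynamoDB batch write."""

    def __init__(self, flush, buffer_size=1000):
        self._flush = flush
        self._size = buffer_size
        self._points = []
        self._lock = threading.Lock()

    def add(self, datapoint):
        if self._size == 0:
            # bufferSize 0 disables buffering: write synchronously.
            self._flush([datapoint])
            return
        with self._lock:
            self._points.append(datapoint)
            full = len(self._points) >= self._size
        if full:
            self.commit()

    def commit(self):
        # In the real service this also runs periodically,
        # every bufferCommitIntervalMillis.
        with self._lock:
            batch, self._points = self._points, []
        if batch:
            self._flush(batch)
```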

# Caveats

When the tables are created, the read/write capacity is configured according to the service configuration. However, the service does not modify the capacity of existing tables. As a workaround, you can modify the read/write capacity of existing tables using the Amazon console.

# Developer Notes

# Updating Amazon SDK

  1. Clean lib/*
  2. Update the SDK version in scripts/fetch_sdk_pom.xml. You can use the Maven online repository browser to find the latest version available online.
  3. Run scripts/fetch_sdk.sh
  4. Copy scripts/target/site/dependencies.html and scripts/target/dependency/*.jar to lib/
  5. Generate build.properties entries: `ls lib/*.jar | python -c "import sys; print(' ' + ',\\\\\\n '.join(map(str.strip, sys.stdin.readlines())))"`
  6. Generate META-INF/MANIFEST.MF Bundle-ClassPath entries: `ls lib/*.jar | python -c "import sys; print(' ' + ',\\n '.join(map(str.strip, sys.stdin.readlines())))"`
  7. Generate .classpath entries: `ls lib/*.jar | python -c "import sys;pre='<classpathentry exported=\"true\" kind=\"lib\" path=\"';post='\"/>'; print('\\t' + pre + (post + '\\n\\t' + pre).join(map(str.strip, sys.stdin.readlines())) + post)"`

After these changes, it's good practice to run the integration tests (against live AWS DynamoDB) in the org.openhab.persistence.dynamodb.test bundle. See the README.md in the test bundle for more information on how to execute the tests.

# Running Integration Tests

To run integration tests, one needs to provide AWS credentials.

Eclipse instructions

  1. Run all tests (in the package org.openhab.persistence.dynamodb.internal) as JUnit Tests
  2. Edit the run configuration and open the Arguments tab
  3. In VM arguments, provide the credentials for AWS

The tests will create tables with the prefix dynamodb-integration-tests-. Note that when the tests start, all data is removed from those tables!